WorldWideScience

Sample records for multiple baseline analysis

  1. Improving Graduate Students' Graphing Skills of Multiple Baseline Designs with Microsoft[R] Excel 2007

    Science.gov (United States)

    Lo, Ya-yu; Starling, A. Leyf Peirce

    2009-01-01

    This study examined the effects of a graphing task analysis using the Microsoft[R] Office Excel 2007 program on the single-subject multiple baseline graphing skills of three university graduate students. Using a multiple probe across participants design, the study demonstrated a functional relationship between the number of correct graphing…

  2. Oscillation Baselining and Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2017-03-27

    PNNL developed a new tool for oscillation analysis and baselining. The tool was developed under the DOE Grid Modernization Laboratory Consortium (GMLC) project GM0072, “Suite of open-source applications and models for advanced synchrophasor analysis,” and is based on the open platform for PMU analysis. The Oscillation Baselining and Analysis Tool (OBAT) performs oscillation analysis and identifies modes of oscillation (frequency, damping, energy, and shape). The tool also performs oscillation event baselining (finding correlations between oscillation characteristics and system operating conditions).
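
    OBAT's internals are not described in the record; as a minimal, hypothetical illustration of what "identifying a mode's frequency and damping" can mean, the sketch below estimates both from successive peaks of a synthetic damped sinusoid using the classic logarithmic-decrement method (the synthetic signal and all parameter values are invented, not OBAT's).

```python
import math

# Hypothetical illustration (not OBAT itself): estimate the frequency and
# damping ratio of a single oscillatory mode from successive signal peaks,
# using the classic logarithmic-decrement method.

def damped_mode(t, f=0.7, zeta=0.05, amp=1.0):
    """Synthetic measurement: one damped sinusoidal mode."""
    wn = 2 * math.pi * f
    wd = wn * math.sqrt(1 - zeta ** 2)
    return amp * math.exp(-zeta * wn * t) * math.cos(wd * t)

def estimate_mode(samples, dt):
    """Find positive local maxima, then derive frequency and damping ratio."""
    peaks = [(i * dt, x) for i, x in enumerate(samples)
             if 0 < i < len(samples) - 1
             and samples[i - 1] < x > samples[i + 1] and x > 0]
    (t1, a1), (t2, a2) = peaks[0], peaks[1]
    period = t2 - t1                       # damped period between peaks
    delta = math.log(a1 / a2)              # logarithmic decrement
    zeta = delta / math.sqrt(4 * math.pi ** 2 + delta ** 2)
    freq = 1.0 / period                    # damped frequency in Hz
    return freq, zeta

dt = 0.01
signal = [damped_mode(i * dt) for i in range(1000)]
freq, zeta = estimate_mode(signal, dt)
```

    Real synchrophasor tools fit multiple simultaneous modes to noisy measured data (e.g., with Prony or subspace methods); this single-mode, noise-free sketch only shows the quantities being estimated.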

  3. Multilevel Analysis of Multiple-Baseline Data Evaluating Precision Teaching as an Intervention for Improving Fluency in Foundational Reading Skills for At-Risk Readers

    Science.gov (United States)

    Brosnan, Julie; Moeyaert, Mariola; Brooks Newsome, Kendra; Healy, Olive; Heyvaert, Mieke; Onghena, Patrick; Van den Noortgate, Wim

    2018-01-01

    In this article, multiple-baseline across-participants designs were used to evaluate the impact of a precision teaching (PT) program, within a Tier 2 Response to Intervention framework, targeting fluency in foundational reading skills with at-risk kindergarten readers. Thirteen multiple-baseline design experiments that included participation from…

  4. Cumulative Effects of Concussion History on Baseline Computerized Neurocognitive Test Scores: Systematic Review and Meta-analysis.

    Science.gov (United States)

    Alsalaheen, Bara; Stockdale, Kayla; Pechumer, Dana; Giessing, Alexander; He, Xuming; Broglio, Steven P

    It is unclear whether individuals with a history of single or multiple clinically recovered concussions exhibit worse cognitive performance on baseline testing compared with individuals with no concussion history. To analyze the effects of concussion history on baseline neurocognitive performance using a computerized neurocognitive test. PubMed, CINAHL, and PsycINFO were searched in November 2015. The search was supplemented by a hand search of references. Studies were included if participants completed the Immediate Post-concussion Assessment and Cognitive Test (ImPACT) at baseline (ie, preseason) and if performance was stratified by previous history of single or multiple concussions. Systematic review and meta-analysis. Level 2. Sample size, participant demographics, and performance on verbal memory, visual memory, visual-motor processing speed, and reaction time were extracted from each study. A random-effects pooled meta-analysis revealed that, with the exception of worsened visual memory for those with 1 previous concussion (Hedges g = 0.10), no differences were observed between participants with 1 or multiple previous concussions and participants with none. With the exception of decreased visual memory based on a history of 1 concussion, a history of 1 or multiple concussions was not associated with worse baseline cognitive performance.

  5. Network meta-analysis of disconnected networks: How dangerous are random baseline treatment effects?

    Science.gov (United States)

    Béliveau, Audrey; Goring, Sarah; Platt, Robert W; Gustafson, Paul

    2017-12-01

    In network meta-analysis, the use of fixed baseline treatment effects (a priori independent) in a contrast-based approach is regularly preferred to the use of random baseline treatment effects (a priori dependent). That is because there is often no need to model baseline treatment effects, which carries a risk of model misspecification. However, in disconnected networks, fixed baseline treatment effects do not work (unless extra assumptions are made), as there is not enough information in the data to update the prior distribution on the contrasts between disconnected treatments. In this paper, we investigate to what extent the use of random baseline treatment effects is dangerous in disconnected networks. We take 2 publicly available datasets of connected networks and disconnect them in multiple ways. We then compare the results of treatment comparisons obtained from a Bayesian contrast-based analysis of each disconnected network using random normally distributed and exchangeable baseline treatment effects to those obtained from a Bayesian contrast-based analysis of their initial connected network using fixed baseline treatment effects. For the 2 datasets considered, we found that the use of random baseline treatment effects in disconnected networks was appropriate. Because those datasets were not cherry-picked, there should be other disconnected networks that would benefit from being analyzed using random baseline treatment effects. However, there is also a risk that the normality and exchangeability assumptions are inappropriate for other datasets, although we did not observe this in our case study. We provide code so that other datasets can be investigated. Copyright © 2017 John Wiley & Sons, Ltd.

  6. Mass hierarchy sensitivity of medium baseline reactor neutrino experiments with multiple detectors

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Hong-Xin, E-mail: hxwang@iphy.me [Department of Physics, Nanjing University, Nanjing 210093 (China); Zhan, Liang; Li, Yu-Feng; Cao, Guo-Fu [Institute of High Energy Physics, Chinese Academy of Sciences, Beijing 100049 (China); Chen, Shen-Jian [Department of Physics, Nanjing University, Nanjing 210093 (China)

    2017-05-15

    We report on the neutrino mass hierarchy (MH) determination of medium-baseline reactor neutrino experiments with multiple detectors, where the sensitivity to the MH can be significantly improved by adding a near detector. The impact of the baseline and target mass of the near detector on the combined MH sensitivity is then studied thoroughly. The optimal baseline and target mass of the near detector are ∼12.5 km and ∼4 kton, respectively, for a far detector with a target mass of 20 kton and a baseline of 52.5 km. As typical examples of future medium-baseline reactor neutrino experiments, the optimal location and target mass of the near detector are selected for the specific configurations of JUNO and RENO-50. Finally, we discuss the distinct effects of the reactor antineutrino energy spectrum uncertainty for single-detector and double-detector setups, which indicate that the spectrum uncertainty can be well constrained in the presence of the near detector.

  7. Mass hierarchy sensitivity of medium baseline reactor neutrino experiments with multiple detectors

    Directory of Open Access Journals (Sweden)

    Hong-Xin Wang

    2017-05-01

    Full Text Available We report on the neutrino mass hierarchy (MH) determination of medium-baseline reactor neutrino experiments with multiple detectors, where the sensitivity to the MH can be significantly improved by adding a near detector. The impact of the baseline and target mass of the near detector on the combined MH sensitivity is then studied thoroughly. The optimal baseline and target mass of the near detector are ∼12.5 km and ∼4 kton, respectively, for a far detector with a target mass of 20 kton and a baseline of 52.5 km. As typical examples of future medium-baseline reactor neutrino experiments, the optimal location and target mass of the near detector are selected for the specific configurations of JUNO and RENO-50. Finally, we discuss the distinct effects of the reactor antineutrino energy spectrum uncertainty for single-detector and double-detector setups, which indicate that the spectrum uncertainty can be well constrained in the presence of the near detector.

  8. Schema therapy for personality disorders in older adults : A multiple-baseline study

    NARCIS (Netherlands)

    Videler, A.C.; van Alphen, S.P.J.; Van Royen, R.J.J.; van der Feltz-Cornelis, C.M.; Rossi, G.; Arntz, A.

    2018-01-01

    No studies have yet been conducted on the effectiveness of treating personality disorders in later life. This study is a first test of the effectiveness of schema therapy for personality disorders in older adults, using a multiple-baseline design with eight cluster C personality disorder patients…

  9. Single versus Multiple Trial Vectors in Classical Differential Evolution for Optimizing the Quantization Table in JPEG Baseline Algorithm

    Directory of Open Access Journals (Sweden)

    B Vinoth Kumar

    2017-07-01

    Full Text Available The quantization table is responsible for the compression/quality trade-off in the baseline Joint Photographic Experts Group (JPEG) algorithm, and generating it can therefore be viewed as an optimization problem. In the literature, Classical Differential Evolution (CDE) has been found to be a promising algorithm for generating the optimal quantization table. However, the searching capability of CDE can be limited by the generation of a single trial vector per iteration, which in turn reduces the convergence speed. This paper studies the performance of CDE when multiple trial vectors are employed in a single iteration. An extensive performance analysis compares CDE with and without multiple trial vectors in terms of the optimization process, accuracy, convergence speed, and reliability. The analysis reveals that multiple trial vectors improve the convergence speed of CDE, which is confirmed using a statistical hypothesis test (t-test).
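
    The paper's exact variant and its JPEG fitness function are not reproduced in the record; the sketch below only illustrates the core idea on a toy sphere objective, assuming a standard DE/rand/1/bin scheme in which each target competes against the best of several trial vectors rather than a single one.

```python
import random

# Hypothetical sketch (toy sphere objective, not the JPEG quantization-table
# fitness): classical DE generates ONE trial vector per target; here each
# target competes against the best of several trial vectors, which is the
# modification the abstract describes.

def sphere(x):
    return sum(v * v for v in x)

def de_step(pop, f=0.5, cr=0.9, n_trials=3):
    """One generation of DE/rand/1/bin with multiple trial vectors per target."""
    dim = len(pop[0])
    new_pop = []
    for i, target in enumerate(pop):
        best = target
        for _ in range(n_trials):
            # mutation: three distinct vectors other than the target
            a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
            j_rand = random.randrange(dim)
            # binomial crossover between mutant and target
            trial = [a[k] + f * (b[k] - c[k])
                     if random.random() < cr or k == j_rand else target[k]
                     for k in range(dim)]
            if sphere(trial) < sphere(best):   # greedy selection
                best = trial
        new_pop.append(best)
    return new_pop

random.seed(0)
pop = [[random.uniform(-5, 5) for _ in range(4)] for _ in range(20)]
before = min(sphere(p) for p in pop)
for _ in range(50):
    pop = de_step(pop)
after = min(sphere(p) for p in pop)
```

    Because selection is greedy, the best fitness is non-increasing; the extra trial vectors simply give each target more chances to improve per generation, at the cost of more fitness evaluations.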

  10. Information architecture. Volume 2, Part 1: Baseline analysis summary

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-01

    The Department of Energy (DOE) Information Architecture, Volume 2, Baseline Analysis, is a collaborative and logical next-step effort in the processes required to produce a Departmentwide information architecture. The baseline analysis serves a diverse audience of program management and technical personnel and provides an organized way to examine the Department's existing or de facto information architecture. A companion document to Volume 1, The Foundations, it furnishes the rationale for establishing a Departmentwide information architecture. This volume, consisting of the Baseline Analysis Summary (part 1), Baseline Analysis (part 2), and Reference Data (part 3), is of interest to readers who wish to understand how the Department's current information architecture technologies are employed. The analysis identifies how and where current technologies support business areas, programs, sites, and corporate systems.

  11. Assessment of Multiple Intrauterine Gestations from Ovarian Stimulation (AMIGOS) Trial: Baseline Characteristics

    Science.gov (United States)

    Diamond, Michael P.; Legro, Richard S.; Coutifaris, Christos; Alvero, Ruben; Robinson, Randal D.; Casson, Peter; Christman, Gregory M.; Ager, Joel; Huang, Hao; Hansen, Karl R.; Baker, Valerie; Usadi, Rebecca; Seungdamrong, Aimee; Bates, G. Wright; Rosen, R. Mitchell; Haisonleder, Daniell; Krawetz, Stephen A.; Barnhart, Kurt; Trussell, J.C.; Jin, Yufeng; Santoro, Nanette; Eisenberg, Esther; Zhang, Heping

    2015-01-01

    Objective: To identify baseline characteristics of women with unexplained infertility and to determine whether treatment with an aromatase inhibitor results in a lower rate of multiple gestations than current standard ovulation induction medications. Design: Randomized, prospective clinical trial. Patients: 900 couples with unexplained infertility. Setting: Multicenter, university-based clinical practices. Interventions: Ovarian stimulation with gonadotropins, clomiphene citrate, or letrozole in conjunction with intrauterine insemination; collection of baseline demographics, blood samples, and ultrasonographic assessments. Main Outcome Measures: Demographic, laboratory, imaging, and survey characteristics. Results: Demographic characteristics of women receiving clomiphene citrate, letrozole, or gonadotropins for ovarian stimulation were very consistent. Their mean age was 32.2 ± 4.4 years and mean infertility duration was 34.7 ± 25.7 months, with 59% reporting primary infertility. More than one third of the women were current or past smokers. The mean BMI was 27 and the mean AMH level was 2.6; only 11 women (1.3%) had antral follicle counts of less than 5. Similar observations were identified for hormonal profiles, ultrasound characterization of the ovaries, semen parameters, and quality-of-life assessments in both male and female partners. Conclusion: The cause of infertility in the couples recruited to this treatment trial is elusive, as the women were regularly ovulating and had evidence of good ovarian reserve by basal FSH, AMH levels, and antral follicle counts, and the male partners had normal semen parameters. The three treatment subgroups have common baseline characteristics, thereby providing comparable patient populations for testing the hypothesis that use of letrozole for ovarian stimulation can reduce the rate of multiple gestations from that observed with gonadotropin and clomiphene citrate treatment. PMID:25707331

  12. Acceptance and Commitment Therapy for Self-Stigma around Sexual Orientation: A Multiple Baseline Evaluation

    Science.gov (United States)

    Yadavaia, James E.; Hayes, Steven C.

    2012-01-01

    This study evaluated the effectiveness of 6 to 10 sessions of Acceptance and Commitment Therapy (ACT) for self-stigma around sexual orientation linked to same-sex attraction (what has generally been referred to as internalized homophobia; IH) in a concurrent multiple-baseline across-participants design. Three men and 2 women showed sizeable…

  13. A Systematic Review and Meta-Analysis of Baseline OHIP-EDENT Scores.

    Science.gov (United States)

    Duale, J M J; Patel, Y A; Wu, J; Hyde, T P

    2018-03-01

    OHIP-EDENT is widely used in the literature to assess oral-health-related quality of life (OHRQoL) in edentulous patients. However, the normal mean and variance of baseline OHIP-EDENT scores have not been reported; knowledge of them would facilitate critical appraisal of studies. An established figure for baseline OHIP-EDENT, obtained from a meta-analysis, would simplify comparisons between studies and quantify variation in the initial OHRQoL of trial participants. The aim of this study is to quantify a normal baseline value for pre-operative OHIP-EDENT scores through a systematic review and meta-analysis of the available literature. A systematic literature review was carried out; 83 papers that included OHIP-EDENT values were identified. After screening and eligibility assessment, 7 papers were selected and included in the meta-analysis. A meta-analysis of the 7 papers using a random-effects model yielded a mean baseline OHIP-EDENT score of 28.63, with a 95% confidence interval from 21.93 to 35.34. A pre-operative baseline OHIP-EDENT score has thus been established by meta-analysis of published papers. This will facilitate comparison of the initial OHRQoL of one study population with that found elsewhere in the published literature. Copyright © 2018 Dennis Barber Ltd.
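
    The record does not give the per-study data, so the sketch below uses invented study means and standard errors purely to illustrate how a random-effects pooled mean and 95% CI of the kind reported here can be computed, assuming the common DerSimonian-Laird estimator of between-study variance.

```python
import math

# Hypothetical sketch of a DerSimonian-Laird random-effects pooled mean;
# the study means and standard errors below are invented for illustration,
# NOT the paper's data.

means = [25.1, 31.4, 27.8, 33.0, 26.5, 29.9, 30.2]   # per-study means
ses   = [2.0,  2.5,  1.8,  3.1,  2.2,  2.7,  2.4]    # per-study standard errors

w = [1 / se ** 2 for se in ses]                       # fixed-effect weights
fixed = sum(wi * m for wi, m in zip(w, means)) / sum(w)

# DerSimonian-Laird estimate of the between-study variance tau^2
q = sum(wi * (m - fixed) ** 2 for wi, m in zip(w, means))
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - (len(means) - 1)) / c)

w_star = [1 / (se ** 2 + tau2) for se in ses]         # random-effects weights
pooled = sum(wi * m for wi, m in zip(w_star, means)) / sum(w_star)
se_pooled = math.sqrt(1 / sum(w_star))
ci = (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)
```

    The random-effects CI is wider than a fixed-effect CI whenever tau² > 0, which is why pooled baselines like the 28.63 (21.93–35.34) figure carry fairly broad intervals.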

  14. Build It, But Will They Come? A Geoscience Cyberinfrastructure Baseline Analysis

    Directory of Open Access Journals (Sweden)

    Joel Cutcher-Gershenfeld

    2016-07-01

    Full Text Available Understanding the earth as a system requires integrating many forms of data from multiple fields. Builders and funders of the cyberinfrastructure designed to enable open data sharing in the geosciences risk a key failure mode: what if geoscientists do not use the cyberinfrastructure to share, discover, and reuse data? In this study, we report a baseline assessment of engagement with the NSF EarthCube initiative, an open cyberinfrastructure effort for the geosciences. We find that scientists perceive the need for cross-disciplinary engagement and engage where there is organizational or institutional support. However, we also find a possible imbalance in involvement between the cyber and geoscience communities at the outset, with the former showing more interest than the latter. This analysis highlights the importance of examining fields and disciplines as stakeholders in investments in the cyberinfrastructure supporting science.

  15. Change Analysis in Structural Laser Scanning Point Clouds: The Baseline Method.

    Science.gov (United States)

    Shen, Yueqian; Lindenbergh, Roderik; Wang, Jinhu

    2016-12-24

    A method is introduced for detecting changes from point clouds that avoids registration. For many applications, changes are detected between two scans of the same scene obtained at different times. Traditionally, these scans are aligned to a common coordinate system, with the disadvantage that the registration step introduces additional errors; registration also requires stable targets or features. To avoid these issues, we propose a change detection method based on so-called baselines. Baselines connect feature points within one scan; to analyze changes, baselines connecting corresponding points in two scans are compared. As feature points, either targets or virtual points corresponding to some reconstructable feature in the scene are used. The new method is demonstrated on two scans sampling a masonry laboratory building before and after seismic testing that resulted in damage on the order of several centimeters. The centres of the bricks of the laboratory building are automatically extracted to serve as virtual points. Baselines connecting virtual points and/or target points are extracted and compared with respect to a suitable structural coordinate system. Changes detected from the baseline analysis are compared to a traditional cloud-to-cloud change analysis, demonstrating the potential of the new method for structural analysis.
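
    The core trick is that a baseline's length is invariant to the unknown rigid motion between the two scans' coordinate frames, so lengths can be compared without any registration. A minimal sketch of that idea, with invented coordinates (not the paper's data or implementation):

```python
import math

# Minimal sketch of the baseline idea: instead of registering two scans,
# compare the lengths of "baselines" (distances between the same pair of
# feature points) measured independently in each scan's own frame.

def dist(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def baseline_changes(scan1, scan2, tol=0.01):
    """Return (pair, length change) for every baseline change exceeding tol.

    scan1/scan2 map feature-point labels to coordinates in each scan's
    OWN coordinate system; baseline lengths are invariant to the unknown
    rigid motion between frames, so no registration is needed."""
    labels = sorted(set(scan1) & set(scan2))
    changes = []
    for i, a in enumerate(labels):
        for b in labels[i + 1:]:
            d = dist(scan2[a], scan2[b]) - dist(scan1[a], scan1[b])
            if abs(d) > tol:
                changes.append(((a, b), d))
    return changes

# Epoch 2 is rigidly rotated/translated relative to epoch 1; point "C"
# additionally moved ~3 cm (simulated damage).
scan1 = {"A": (0.0, 0.0, 0.0), "B": (2.0, 0.0, 0.0), "C": (0.0, 1.5, 0.0)}
scan2 = {"A": (5.0, 5.0, 0.0), "B": (5.0, 7.0, 0.0), "C": (3.53, 5.0, 0.0)}
flagged = baseline_changes(scan1, scan2, tol=0.01)
```

    Only baselines touching the displaced point are flagged; the A–B baseline, whose endpoints moved only by the rigid motion, is unchanged.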

  16. Change Analysis in Structural Laser Scanning Point Clouds: The Baseline Method

    Directory of Open Access Journals (Sweden)

    Yueqian Shen

    2016-12-01

    Full Text Available A method is introduced for detecting changes from point clouds that avoids registration. For many applications, changes are detected between two scans of the same scene obtained at different times. Traditionally, these scans are aligned to a common coordinate system, with the disadvantage that the registration step introduces additional errors; registration also requires stable targets or features. To avoid these issues, we propose a change detection method based on so-called baselines. Baselines connect feature points within one scan; to analyze changes, baselines connecting corresponding points in two scans are compared. As feature points, either targets or virtual points corresponding to some reconstructable feature in the scene are used. The new method is demonstrated on two scans sampling a masonry laboratory building before and after seismic testing that resulted in damage on the order of several centimeters. The centres of the bricks of the laboratory building are automatically extracted to serve as virtual points. Baselines connecting virtual points and/or target points are extracted and compared with respect to a suitable structural coordinate system. Changes detected from the baseline analysis are compared to a traditional cloud-to-cloud change analysis, demonstrating the potential of the new method for structural analysis.

  17. Comparing Visual and Statistical Analysis of Multiple Baseline Design Graphs.

    Science.gov (United States)

    Wolfe, Katie; Dickenson, Tammiee S; Miller, Bridget; McGrath, Kathleen V

    2018-04-01

    A growing number of statistical analyses are being developed for single-case research. One important factor in evaluating these methods is the extent to which each corresponds to visual analysis. Few studies have compared statistical and visual analysis, and information about more recently developed statistics is scarce. Therefore, our purpose was to evaluate the agreement between visual analysis and four statistical analyses: improvement rate difference (IRD); Tau-U; the Hedges, Pustejovsky, and Shadish (HPS) effect size; and the between-case standardized mean difference (BC-SMD). Results indicate that IRD and BC-SMD had the strongest overall agreement with visual analysis. Although Tau-U had strong agreement with visual analysis on raw values, it had poorer agreement when those values were dichotomized to represent the presence or absence of a functional relation. Overall, visual analysis appeared to be more conservative than statistical analysis, but further research is needed to evaluate the nature of these disagreements.
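
    To make one of the compared statistics concrete, here is a sketch of the basic A-vs-B Tau, the non-overlap core of Tau-U (shown here without the baseline-trend correction the full Tau-U adds); the phase data are invented for illustration.

```python
# Illustrative sketch of the non-overlap Tau underlying Tau-U: every
# baseline point is paired with every intervention point, and Tau is the
# net proportion of pairs showing improvement. Data below are invented.

def tau_ab(baseline, intervention):
    """Tau = (improving pairs - deteriorating pairs) / all pairs."""
    pos = neg = 0
    for a in baseline:
        for b in intervention:
            if b > a:
                pos += 1          # improvement (assuming higher = better)
            elif b < a:
                neg += 1          # deterioration; ties count as neither
    return (pos - neg) / (len(baseline) * len(intervention))

baseline_phase = [2, 3, 2, 4, 3]          # hypothetical session scores
intervention_phase = [5, 6, 4, 7, 6, 8]

tau = tau_ab(baseline_phase, intervention_phase)   # ranges from -1 to 1
```

    A Tau near 1 indicates nearly complete non-overlap between phases; the article's point is that such raw values can still disagree with visual judgments about whether a functional relation is present.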

  18. Performance Analysis for Airborne Interferometric SAR Affected by Flexible Baseline Oscillation

    Directory of Open Access Journals (Sweden)

    Liu Zhong-sheng

    2014-04-01

    Full Text Available The airborne interferometric SAR platform suffers from instability factors such as air turbulence and mechanical vibration during flight. These factors cause oscillation of the flexible baseline, which significantly degrades the performance of the interferometric SAR system. This study is concerned with that baseline oscillation. First, the slant-range error under baseline oscillation conditions is formulated. Then, the SAR complex image signal and the dual-channel correlation coefficient are modeled based on the first-order, second-order, and generic slant-range error. Subsequently, the impact of the baseline oscillation on the imaging and interferometric performance of the SAR system is analyzed. Finally, simulations of the echo data are used to validate the theoretical analysis of the baseline oscillation in airborne interferometric SAR.

  19. Delayed P100-Like Latencies in Multiple Sclerosis: A Preliminary Investigation Using Visual Evoked Spread Spectrum Analysis

    Science.gov (United States)

    Kiiski, Hanni S. M.; Ní Riada, Sinéad; Lalor, Edmund C.; Gonçalves, Nuno R.; Nolan, Hugh; Whelan, Robert; Lonergan, Róisín; Kelly, Siobhán; O'Brien, Marie Claire; Kinsella, Katie; Bramham, Jessica; Burke, Teresa; Ó Donnchadha, Seán; Hutchinson, Michael; Tubridy, Niall; Reilly, Richard B.

    2016-01-01

    Conduction along the optic nerve is often slowed in multiple sclerosis (MS). This is typically assessed by measuring the latency of the P100 component of the Visual Evoked Potential (VEP) using electroencephalography. The Visual Evoked Spread Spectrum Analysis (VESPA) method, which involves modulating the contrast of a continuous visual stimulus over time, can produce a visually evoked response analogous to the P100 but with a higher signal-to-noise ratio and potentially higher sensitivity to individual differences in comparison to the VEP. The main objective of the study was to conduct a preliminary investigation into the utility of the VESPA method for probing and monitoring visual dysfunction in multiple sclerosis. The latencies and amplitudes of the P100-like VESPA component were compared between healthy controls and multiple sclerosis patients, and between multiple sclerosis subgroups. The P100-like VESPA component activations were examined at baseline and over a 3-year period. The study included 43 multiple sclerosis patients (23 relapsing-remitting MS, 20 secondary-progressive MS) and 42 healthy controls who completed the VESPA at baseline. The follow-up sessions were conducted 12 months after baseline with 24 MS patients (15 relapsing-remitting MS, 9 secondary-progressive MS) and 23 controls, and again at 24 months post-baseline with 19 MS patients (13 relapsing-remitting MS, 6 secondary-progressive MS) and 14 controls. The results showed P100-like VESPA latencies to be delayed in multiple sclerosis compared to healthy controls over the 24-month period. Secondary-progressive MS patients had the most pronounced delay in P100-like VESPA latency relative to relapsing-remitting MS patients and controls. There were no longitudinal P100-like VESPA response differences. These findings suggest that the VESPA method is a reproducible electrophysiological method that may have potential utility in the assessment of visual dysfunction in multiple sclerosis. PMID:26726800

  20. Delayed P100-Like Latencies in Multiple Sclerosis: A Preliminary Investigation Using Visual Evoked Spread Spectrum Analysis.

    Directory of Open Access Journals (Sweden)

    Hanni S M Kiiski

    Full Text Available Conduction along the optic nerve is often slowed in multiple sclerosis (MS). This is typically assessed by measuring the latency of the P100 component of the Visual Evoked Potential (VEP) using electroencephalography. The Visual Evoked Spread Spectrum Analysis (VESPA) method, which involves modulating the contrast of a continuous visual stimulus over time, can produce a visually evoked response analogous to the P100 but with a higher signal-to-noise ratio and potentially higher sensitivity to individual differences in comparison to the VEP. The main objective of the study was to conduct a preliminary investigation into the utility of the VESPA method for probing and monitoring visual dysfunction in multiple sclerosis. The latencies and amplitudes of the P100-like VESPA component were compared between healthy controls and multiple sclerosis patients, and between multiple sclerosis subgroups. The P100-like VESPA component activations were examined at baseline and over a 3-year period. The study included 43 multiple sclerosis patients (23 relapsing-remitting MS, 20 secondary-progressive MS) and 42 healthy controls who completed the VESPA at baseline. The follow-up sessions were conducted 12 months after baseline with 24 MS patients (15 relapsing-remitting MS, 9 secondary-progressive MS) and 23 controls, and again at 24 months post-baseline with 19 MS patients (13 relapsing-remitting MS, 6 secondary-progressive MS) and 14 controls. The results showed P100-like VESPA latencies to be delayed in multiple sclerosis compared to healthy controls over the 24-month period. Secondary-progressive MS patients had the most pronounced delay in P100-like VESPA latency relative to relapsing-remitting MS patients and controls. There were no longitudinal P100-like VESPA response differences. These findings suggest that the VESPA method is a reproducible electrophysiological method that may have potential utility in the assessment of visual dysfunction in multiple sclerosis.

  1. Developing RESRAD-BASELINE for environmental baseline risk assessment

    International Nuclear Information System (INIS)

    Cheng, Jing-Jy.

    1995-01-01

    RESRAD-BASELINE is a computer code developed at Argonne National Laboratory for the US Department of Energy (DOE) to perform both radiological and chemical risk assessments. The code implements the baseline risk assessment guidance of the US Environmental Protection Agency (EPA 1989). It calculates (1) radiation doses and cancer risks from exposure to radioactive materials, and (2) hazard indexes and cancer risks from exposure to noncarcinogenic and carcinogenic chemicals, respectively. The user can enter measured or predicted environmental media concentrations through the graphic interface and can simulate different exposure scenarios by selecting the appropriate pathways and modifying the exposure parameters. The database used by RESRAD-BASELINE includes dose conversion factors and slope factors for radionuclides and toxicity information and properties for chemicals; the user can modify the database for use in the calculation. Sensitivity analysis can be performed while running the code to examine the influence of the input parameters. Use of RESRAD-BASELINE for risk analysis is easy, fast, and cost-saving. Furthermore, it ensures consistency in methodology for both radiological and chemical risk analyses.

  2. Modular risk analysis for assessing multiple waste sites

    International Nuclear Information System (INIS)

    Whelan, G.; Buck, J.W.; Nazarali, A.

    1994-06-01

    Human-health impacts, especially to the surrounding public, are extremely difficult to assess at installations that contain multiple waste sites and a variety of mixed-waste constituents (e.g., organic, inorganic, and radioactive). These assessments must address different constituents, multiple waste sites, multiple release patterns, different transport pathways (i.e., groundwater, surface water, air, and overland soil), different receptor types and locations, various times of interest, population distributions, land-use patterns, baseline assessments, a variety of exposure scenarios, etc. Although the process is complex, two of the most important difficulties to overcome are (1) establishing an approach that allows the source-term, transport, or exposure component to be modified as an individual module without re-evaluating the entire installation-wide assessment (i.e., all modules simultaneously), and (2) displaying and communicating the results in an understandable and usable manner to interested parties. An integrated, physics-based, compartmentalized approach, coupled to a Geographical Information System (GIS), captures the regional health impacts associated with multiple waste sites (e.g., hundreds to thousands of waste sites) at locations within and surrounding the installation. A modular, GIS-based approach overcomes difficulties in (1) analyzing a wide variety of scenarios for multiple waste sites and (2) communicating results from a complex human-health-impact analysis by capturing the essence of the assessment in a relatively elegant manner, so the meaning of the results can be quickly conveyed to all who review them.

  3. Contingency management adapted for African-American adolescents with obesity enhances youth weight loss with caregiver participation: a multiple baseline pilot study.

    Science.gov (United States)

    Hartlieb, Kathryn Brogan; Naar, Sylvie; Ledgerwood, David M; Templin, Thomas N; Ellis, Deborah A; Donohue, Bradley; Cunningham, Phillippe B

    2015-12-07

    Contingency management (CM) interventions, which use operant conditioning principles to encourage completion of target behavioral goals, may be useful for improving adherence to behavioral skills training (BST). Research to date has yet to explore CM for weight loss in minority adolescents. The objective was to examine the effects of CM on adolescent weight loss when added to BST. The study utilized an innovative experimental design that builds upon multiple baseline approaches, as recommended by the National Institutes of Health. Participants were six obese African-American youth and their primary caregivers living in Detroit, Michigan, USA. Adolescents received between 4 and 12 weeks of BST during a baseline period and subsequently received CM targeting weight loss; the outcome measure was youth weight. Linear mixed effects modeling was used in the analysis. CM did not directly affect adolescent weight loss above that of BST (p=0.053). However, when caregivers were involved in CM session treatment, contingency management had a positive effect on adolescent weight loss, with an estimated weight loss due to CM of 0.66 kg/week when caregivers also attended. These findings support contingency management for minority youth weight loss. Lessons learned from contingency management program implementation are also discussed in order to inform practice.

  4. Addendum to the 2015 Eastern Interconnect Baselining and Analysis Report

    Energy Technology Data Exchange (ETDEWEB)

    Amidan, Brett G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Follum, James D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-06-30

    This report serves as an addendum to the 2015 Eastern Interconnect Baselining and Analysis Report (Amidan, Follum, and Freeman, 2015). The addendum investigates: (1) the impact of shorter record lengths and of adding a daily regularization term to the date/time models for angle-pair measurements; (2) further development of a method to monitor trends in phase-angle pairs; (3) the effect of changing the length of time used to establish a baseline when calculating atypical events; and (4) a comparison between quantitatively discovered atypical events and actual events.

  5. Prolonged-release fampridine treatment improved subject-reported impact of multiple sclerosis: Item-level analysis of the MSIS-29.

    Science.gov (United States)

    Gasperini, Claudio; Hupperts, Raymond; Lycke, Jan; Short, Christine; McNeill, Manjit; Zhong, John; Mehta, Lahar R

    2016-11-15

    Prolonged-release (PR) fampridine is approved to treat walking impairment in persons with multiple sclerosis (MS); however, treatment benefits may extend beyond walking. MOBILE was a phase 2, 24-week, double-blind, placebo-controlled exploratory study to assess the impact of 10 mg PR-fampridine twice daily versus placebo on several subject-assessed measures. This analysis evaluated the physical and psychological health outcomes of subjects with progressing or relapsing MS from individual items of the Multiple Sclerosis Impact Scale (MSIS-29). PR-fampridine treatment (n=68) resulted in greater improvements from baseline in the MSIS-29 physical (PHYS) and psychological (PSYCH) impact subscales, with differences of 89% and 148% in mean score reduction from baseline (n=64) at week 24 versus placebo, respectively. MSIS-29 item analysis showed that a higher percentage of PR-fampridine subjects had mean improvements in 16/20 PHYS and 6/9 PSYCH items versus placebo after 24 weeks. Post hoc analysis of the 12-item Multiple Sclerosis Walking Scale (MSWS-12) improver population (≥8-point mean improvement) demonstrated differences in mean reductions from baseline of 97% and 111% in PR-fampridine MSIS-29 PHYS and PSYCH subscales versus the overall placebo group over 24 weeks. A higher percentage of MSWS-12 improvers treated with PR-fampridine showed mean improvements in 20/20 PHYS and 8/9 PSYCH items versus placebo at 24 weeks. In conclusion, PR-fampridine resulted in physical and psychological benefits versus placebo, sustained over 24 weeks. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  6. Neutron Multiplicity Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Frame, Katherine Chiyoko [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-06-28

    Neutron multiplicity measurements are widely used for nondestructive assay (NDA) of special nuclear material (SNM). When combined with isotopic composition information, neutron multiplicity analysis can be used to estimate the spontaneous fission rate and leakage multiplication of SNM, and from these the total mass of fissile material can also be determined. This presentation provides an overview of the technique.

  7. Item response theory analysis of the mechanics baseline test

    Science.gov (United States)

    Cardamone, Caroline N.; Abbott, Jonathan E.; Rayyan, Saif; Seaton, Daniel T.; Pawl, Andrew; Pritchard, David E.

    2012-02-01

    Item response theory is useful in both the development and evaluation of assessments and in computing standardized measures of student performance. In item response theory, individual parameters (difficulty, discrimination) for each item or question are fit by item response models. These parameters provide a means for evaluating a test and offer a better measure of student skill than a raw test score, because each skill calculation considers not only the number of questions answered correctly, but the individual properties of all questions answered. Here, we present the results from an analysis of the Mechanics Baseline Test given at MIT during 2005-2010. Using the item parameters, we identify questions on the Mechanics Baseline Test that are not effective in discriminating between MIT students of different abilities. We show that a limited subset of the highest quality questions on the Mechanics Baseline Test returns accurate measures of student skill. We compare student skills as determined by item response theory to the more traditional measurement of the raw score and show that a comparable measure of learning gain can be computed.
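
    The skill-estimation idea in this record can be sketched with a two-parameter logistic (2PL) item response model. This is a hedged illustration: the item parameters below are invented, not the fitted Mechanics Baseline Test values, and ability is estimated by a simple grid-search maximum likelihood rather than a production IRT fit.

```python
import math

def p_correct(theta, a, b):
    """2PL item response function: P(correct | ability theta),
    with discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def estimate_theta(responses, items):
    """Maximum-likelihood ability estimate over a coarse grid of theta values."""
    grid = [i / 100.0 for i in range(-400, 401)]          # theta in [-4, 4]
    def loglik(theta):
        total = 0.0
        for r, (a, b) in zip(responses, items):
            p = p_correct(theta, a, b)
            total += math.log(p) if r else math.log(1.0 - p)
        return total
    return max(grid, key=loglik)

# Hypothetical item parameters (a, b) -- invented for illustration.
items = [(1.5, -1.0), (1.2, 0.0), (1.8, 1.0), (0.4, 0.5)]
print(estimate_theta([1, 1, 1, 0], items))   # stronger response pattern
print(estimate_theta([1, 0, 0, 0], items))   # weaker response pattern
```

    Unlike a raw score, the estimate weighs each item by its discrimination and difficulty, which is why a well-chosen subset of high-quality items can recover nearly the same skill measure.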

  8. Mouse Chromosome 4 Is Associated with the Baseline and Allergic IgE Phenotypes

    Directory of Open Access Journals (Sweden)

    Cynthia Kanagaratham

    2017-08-01

    Regulation of IgE concentration in the blood is a complex trait, with high concentrations associated with parasitic infections as well as allergic diseases. A/J strain mice have significantly higher plasma concentrations of IgE, both at baseline and after ovalbumin antigen exposure, when compared to C57BL/6J strain mice. Our objective was to determine the genomic regions associated with this difference in phenotype. To achieve this, we used a panel of recombinant congenic strains (RCS) derived from A/J and C57BL/6J strains. We measured IgE in the RCS panel at baseline and following allergen exposure. Using marker-by-marker analysis of the RCS genotype and phenotype data, we identified multiple regions associated with the IgE phenotype. A single region was identified to be associated with baseline IgE level, while multiple regions were associated with the phenotype after allergen exposure. The most significant region was found on Chromosome 4, from 81.46 to 86.17 Mbp. Chromosome 4 substitution strain mice had significantly higher concentrations of IgE than their background parental strain mice, C57BL/6J. Our data present multiple candidate regions associated with plasma IgE concentration at baseline and following allergen exposure, with the most significant one located on Chromosome 4.

  9. Analysis of Seasonal Signal in GPS Short-Baseline Time Series

    Science.gov (United States)

    Wang, Kaihua; Jiang, Weiping; Chen, Hua; An, Xiangdong; Zhou, Xiaohui; Yuan, Peng; Chen, Qusen

    2018-04-01

    Proper modeling of seasonal signals and their quantitative analysis are of interest in geoscience applications that are based on position time series of permanent GPS stations. Seasonal signals in GPS short-baseline time series … In this paper, to better understand the seasonal signal in GPS short-baseline time series, we adopted and processed six different short-baselines with data spans that vary from 2 to 14 years and baseline lengths that vary from 6 to 1100 m. To avoid seasonal signals that are overwhelmed by noise, each of the station pairs was chosen with significant differences in height (> 5 m) or monument type. For comparison, we also processed an approximately zero baseline with a distance of … The band-pass-filtered (BP) noise is valid for approximately 40% of the baseline components, and another 20% of the components can be best modeled by a combination of a first-order Gauss-Markov (FOGM) process plus white noise (WN). The TEM displacements are then modeled by considering the monument height of the building structure beneath the GPS antenna. The median contributions of TEM to the annual amplitude in the vertical direction are 84% and 46% with and without additional parts of the monument, respectively. Obvious annual signals with amplitudes > 0.4 mm in the horizontal direction are observed in five short-baselines, and the amplitudes exceed 1 mm in four of them. These horizontal seasonal signals are likely related to the propagation of daily/sub-daily TEM displacement or other signals related to the site environment. Mismodeling of the tropospheric delay may also introduce spurious seasonal signals with annual amplitudes of 5 and 2 mm, respectively, for two short-baselines with elevation differences greater than 100 m. The results suggest that the monument height of the additional part of a typical GPS station should be considered when estimating the TEM displacement and that the tropospheric delay should be modeled cautiously, especially for station pairs with …

  10. Conditional analysis of mixed Poisson processes with baseline counts: implications for trial design and analysis.

    Science.gov (United States)

    Cook, Richard J; Wei, Wei

    2003-07-01

    The design of clinical trials is typically based on marginal comparisons of a primary response under two or more treatments. The considerable gains in efficiency afforded by models conditional on one or more baseline responses has been extensively studied for Gaussian models. The purpose of this article is to present methods for the design and analysis of clinical trials in which the response is a count or a point process, and a corresponding baseline count is available prior to randomization. The methods are based on a conditional negative binomial model for the response given the baseline count and can be used to examine the effect of introducing selection criteria on power and sample size requirements. We show that designs based on this approach are more efficient than those proposed by McMahon et al. (1994).
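
    The efficiency gain from conditioning on a baseline count can be illustrated with a small simulation. This is a hedged sketch, not the authors' model: a gamma frailty shared by each subject's baseline and follow-up Poisson counts produces negative binomial overdispersion, and a crude conditional estimator (within-subject log ratios) is compared against a marginal one. All rates, effect sizes, and sample sizes are invented.

```python
import math, random, statistics

random.seed(2003)

def poisson(lam):
    """Knuth's Poisson sampler (adequate for small rates)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

def one_trial(n=100, effect=0.4, mu=5.0):
    """Simulate one two-arm trial; return marginal and conditional
    estimates of the log rate ratio (true value = `effect`)."""
    post_mean, cond_mean = [], []
    for trt in (0, 1):
        posts, ratios = [], []
        for _ in range(n):
            u = random.gammavariate(1.0, 1.0)          # subject frailty, mean 1
            base = poisson(u * mu)                     # pre-randomization count
            post = poisson(u * mu * math.exp(effect * trt))
            posts.append(post)
            ratios.append(math.log((post + 0.5) / (base + 0.5)))
        post_mean.append(sum(posts) / n)
        cond_mean.append(sum(ratios) / n)
    marginal = math.log(post_mean[1] / post_mean[0])   # ignores the baseline count
    conditional = cond_mean[1] - cond_mean[0]          # conditions on it
    return marginal, conditional

marg, cond = zip(*(one_trial() for _ in range(200)))
print("SD marginal:", round(statistics.stdev(marg), 3),
      " SD conditional:", round(statistics.stdev(cond), 3))
```

    Because the frailty cancels inside the within-subject ratio, the conditional estimator has a visibly smaller sampling SD, which is the source of the efficiency gain the abstract describes.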

  11. Functional Multiple-Set Canonical Correlation Analysis

    Science.gov (United States)

    Hwang, Heungsun; Jung, Kwanghee; Takane, Yoshio; Woodward, Todd S.

    2012-01-01

    We propose functional multiple-set canonical correlation analysis for exploring associations among multiple sets of functions. The proposed method includes functional canonical correlation analysis as a special case when only two sets of functions are considered. As in classical multiple-set canonical correlation analysis, computationally, the…

  12. Project management with dynamic scheduling baseline scheduling, risk analysis and project control

    CERN Document Server

    Vanhoucke, Mario

    2013-01-01

    The topic of this book is known as dynamic scheduling, a term used to refer to three dimensions of project management and scheduling: the construction of a baseline schedule, the analysis of a project schedule's risk, and project control during project progress. This dynamic scheduling point of view implicitly assumes that the usability of a project's baseline schedule is rather limited and that it only acts as a point of reference in the project life cycle.

  13. Preliminary Report: Analysis of the baseline study on the prevalence of Salmonella in laying hen flocks of Gallus gallus

    DEFF Research Database (Denmark)

    Hald, Tine

    This is a preliminary report on the analysis of the Community-wide baseline study to estimate the prevalence of Salmonella in laying hen flocks. It is being published pending the full analysis of the entire dataset from the baseline study. The report contains the elements necessary for the establishment …

  14. Use of Multiple Imputation Method to Improve Estimation of Missing Baseline Serum Creatinine in Acute Kidney Injury Research

    Science.gov (United States)

    Peterson, Josh F.; Eden, Svetlana K.; Moons, Karel G.; Ikizler, T. Alp; Matheny, Michael E.

    2013-01-01

    Background and objectives: Baseline creatinine (BCr) is frequently missing in AKI studies. Common surrogate estimates can misclassify AKI and adversely affect the study of related outcomes. This study examined whether multiple imputation improved accuracy of estimating missing BCr beyond current recommendations to apply an assumed estimated GFR (eGFR) of 75 ml/min per 1.73 m² (eGFR 75). Design, setting, participants, and measurements: From 41,114 unique adult admissions (13,003 with and 28,111 without BCr data) at Vanderbilt University Hospital between 2006 and 2008, a propensity score model was developed to predict the likelihood of missing BCr. Propensity scoring identified 6502 patients with the highest likelihood of missing BCr among the 13,003 patients with known BCr, to simulate a "missing" data scenario while preserving the actual reference BCr. Within this cohort (n=6502), the ability of various multiple-imputation approaches to estimate BCr and classify AKI was compared with that of eGFR 75. Results: All multiple-imputation methods except the basic one more closely approximated actual BCr than did eGFR 75. Total AKI misclassification was lower with multiple imputation (full multiple imputation + serum creatinine) (9.0%) than with eGFR 75 (12.3%; P…) …serum creatinine (15.3%) versus eGFR 75 (40.5%; P<0.001). Multiple imputation improved specificity and positive predictive value for detecting AKI at the expense of modestly decreasing sensitivity relative to eGFR 75. Conclusions: Multiple imputation can improve accuracy in estimating missing BCr and reduce misclassification of AKI beyond currently proposed methods. PMID:23037980

  15. Analysis of baseline gene expression levels from ...

    Science.gov (United States)

    The use of gene expression profiling to predict chemical mode of action would be enhanced by better characterization of variance due to individual, environmental, and technical factors. Meta-analysis of microarray data from untreated or vehicle-treated animals within the control arm of toxicogenomics studies has yielded useful information on baseline fluctuations in gene expression. A dataset of control animal microarray expression data was assembled by a working group of the Health and Environmental Sciences Institute's Technical Committee on the Application of Genomics in Mechanism Based Risk Assessment in order to provide a public resource for assessments of variability in baseline gene expression. Data from over 500 Affymetrix microarrays from control rat liver and kidney were collected from 16 different institutions. Thirty-five biological and technical factors were obtained for each animal, describing a wide range of study characteristics, and a subset were evaluated in detail for their contribution to total variability using multivariate statistical and graphical techniques. The study factors that emerged as key sources of variability included gender, organ section, strain, and fasting state. These and other study factors were identified as key descriptors that should be included in the minimal information about a toxicogenomics study needed for interpretation of results by an independent source. Genes that are the most and least variable, gender-selective …

  16. Bias, precision and statistical power of analysis of covariance in the analysis of randomized trials with baseline imbalance: a simulation study

    Science.gov (United States)

    2014-01-01

    Background Analysis of variance (ANOVA), change-score analysis (CSA) and analysis of covariance (ANCOVA) respond differently to baseline imbalance in randomized controlled trials. However, no empirical studies appear to have quantified the differential bias and precision of estimates derived from these methods of analysis, and their relative statistical power, in relation to combinations of levels of key trial characteristics. This simulation study therefore examined the relative bias, precision and statistical power of these three analyses using simulated trial data. Methods 126 hypothetical trial scenarios were evaluated (126 000 datasets), each with continuous data simulated by using a combination of levels of: treatment effect; pretest-posttest correlation; direction and magnitude of baseline imbalance. The bias, precision and power of each method of analysis were calculated for each scenario. Results Compared to the unbiased estimates produced by ANCOVA, both ANOVA and CSA are subject to bias, in relation to pretest-posttest correlation and the direction of baseline imbalance. Additionally, ANOVA and CSA are less precise than ANCOVA, especially when pretest-posttest correlation ≥ 0.3. When groups are balanced at baseline, ANCOVA is at least as powerful as the other analyses. The apparently greater power of ANOVA and CSA at certain imbalances is achieved at the cost of a biased treatment effect. Conclusions Across a range of correlations between pre- and post-treatment scores and at varying levels and direction of baseline imbalance, ANCOVA remains the optimum statistical method for the analysis of continuous outcomes in RCTs, in terms of bias, precision and statistical power. PMID:24712304
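
    The qualitative result (ANCOVA unbiased, while ANOVA and CSA are biased when baseline imbalance is correlated with the outcome) can be reproduced with a minimal simulation. This sketch uses invented parameter values (tau = 0.5, rho = 0.6, imbalance delta = 0.3) and is not the authors' 126-scenario design.

```python
import math, random, statistics

random.seed(7)

def simulate(n=200, tau=0.5, rho=0.6, delta=0.3):
    """One trial: group 1 starts `delta` higher at baseline, and post-test
    depends on pre-test with correlation `rho` plus treatment effect `tau`."""
    data = []
    for g in (0, 1):
        for _ in range(n):
            pre = random.gauss(delta * g, 1.0)
            post = rho * pre + math.sqrt(1 - rho ** 2) * random.gauss(0, 1) + tau * g
            data.append((g, pre, post))
    return data

def estimates(data):
    """ANOVA, change-score (CSA), and ANCOVA treatment-effect estimates."""
    groups = {0: [], 1: []}
    for g, p, y in data:
        groups[g].append((p, y))
    mean = lambda xs: sum(xs) / len(xs)
    stats = {g: (mean([p for p, _ in v]), mean([y for _, y in v]))
             for g, v in groups.items()}
    anova = stats[1][1] - stats[0][1]
    csa = (stats[1][1] - stats[1][0]) - (stats[0][1] - stats[0][0])
    num = den = 0.0                       # pooled within-group slope for ANCOVA
    for g, v in groups.items():
        pm, ym = stats[g]
        for p, y in v:
            num += (p - pm) * (y - ym)
            den += (p - pm) ** 2
    ancova = anova - (num / den) * (stats[1][0] - stats[0][0])
    return anova, csa, ancova

reps = [estimates(simulate()) for _ in range(300)]
bias = [statistics.mean(col) - 0.5 for col in zip(*reps)]   # true effect is 0.5
print("bias (ANOVA, CSA, ANCOVA):", [round(b, 3) for b in bias])
```

    Analytically, ANOVA's bias is rho*delta and CSA's is (rho-1)*delta, so the two methods err in opposite directions while the covariate adjustment removes the imbalance.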

  17. Idaho National Engineering Laboratory (INEL) Environmental Restoration Program (ERP), Baseline Safety Analysis File (BSAF). Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    1994-06-20

    This document was prepared to take the place of a Safety Evaluation Report, since the Baseline Safety Analysis File (BSAF) and associated Baseline Technical Safety Requirements (TSR) File do not meet the requirements of complete safety analysis documentation. Its purpose is to present in summary form the background of how the BSAF and Baseline TSR originated and a description of the process by which they were produced and approved for use in the Environmental Restoration Program. The BSAF is a facility safety reference document for INEL environmental restoration activities, including environmental remediation of inactive waste sites and decontamination and decommissioning (D&D) of surplus facilities. The BSAF contains safety bases common to environmental restoration activities and guidelines for performing and documenting safety analysis. The common safety bases can be incorporated by reference into the safety analysis documentation prepared for individual environmental restoration activities, with justification and any necessary revisions. The safety analysis guidelines in the BSAF provide an accepted method for hazard analysis; analysis of normal, abnormal, and accident conditions; human factors analysis; and derivation of TSRs. The BSAF safety bases and guidelines are graded for environmental restoration activities.

  18. Idaho National Engineering Laboratory (INEL) Environmental Restoration Program (ERP), Baseline Safety Analysis File (BSAF). Revision 1

    International Nuclear Information System (INIS)

    1994-01-01

    This document was prepared to take the place of a Safety Evaluation Report, since the Baseline Safety Analysis File (BSAF) and associated Baseline Technical Safety Requirements (TSR) File do not meet the requirements of complete safety analysis documentation. Its purpose is to present in summary form the background of how the BSAF and Baseline TSR originated and a description of the process by which they were produced and approved for use in the Environmental Restoration Program. The BSAF is a facility safety reference document for INEL environmental restoration activities, including environmental remediation of inactive waste sites and decontamination and decommissioning (D&D) of surplus facilities. The BSAF contains safety bases common to environmental restoration activities and guidelines for performing and documenting safety analysis. The common safety bases can be incorporated by reference into the safety analysis documentation prepared for individual environmental restoration activities, with justification and any necessary revisions. The safety analysis guidelines in the BSAF provide an accepted method for hazard analysis; analysis of normal, abnormal, and accident conditions; human factors analysis; and derivation of TSRs. The BSAF safety bases and guidelines are graded for environmental restoration activities.

  19. Multilevel models for multiple-baseline data: modeling across-participant variation in autocorrelation and residual variance.

    Science.gov (United States)

    Baek, Eun Kyeng; Ferron, John M

    2013-03-01

    Multilevel models (MLM) have been used as a method for analyzing multiple-baseline single-case data. However, some concerns can be raised because the models that have been used assume that the Level-1 error covariance matrix is the same for all participants. The purpose of this study was to extend the application of MLM of single-case data in order to accommodate across-participant variation in the Level-1 residual variance and autocorrelation. This more general model was then used in the analysis of single-case data sets to illustrate the method, to estimate the degree to which the autocorrelation and residual variances differed across participants, and to examine whether inferences about treatment effects were sensitive to whether or not the Level-1 error covariance matrix was allowed to vary across participants. The results from the analyses of five published studies showed that when the Level-1 error covariance matrix was allowed to vary across participants, some relatively large differences in autocorrelation estimates and error variance estimates emerged. The changes in modeling the variance structure did not change the conclusions about which fixed effects were statistically significant in most of the studies, but there was one exception. The fit indices did not consistently support selecting either the more complex covariance structure, which allowed the covariance parameters to vary across participants, or the simpler covariance structure. Given the uncertainty in model specification that may arise when modeling single-case data, researchers should consider conducting sensitivity analyses to examine the degree to which their conclusions are sensitive to modeling choices.
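
    The quantity this record allows to vary across participants, the Level-1 autocorrelation and residual variance, can be illustrated with a small AR(1) simulation. The participant parameters below are invented, and estimation here is a simple lag-1 sample autocorrelation per participant rather than the full multilevel model.

```python
import random, statistics

random.seed(11)

def ar1_series(n, phi, sigma):
    """Level-1 residuals following an AR(1) process: e_t = phi * e_{t-1} + noise."""
    e, out = 0.0, []
    for _ in range(n):
        e = phi * e + random.gauss(0.0, sigma)
        out.append(e)
    return out

def lag1_autocorr(x):
    """Sample lag-1 autocorrelation of a series."""
    m = statistics.mean(x)
    num = sum((a - m) * (b - m) for a, b in zip(x, x[1:]))
    return num / sum((a - m) ** 2 for a in x)

# Three hypothetical participants with different Level-1 error processes:
params = [(0.1, 1.0), (0.5, 1.0), (0.8, 2.0)]   # (autocorrelation, residual SD)
ests = []
for phi, sigma in params:
    est = lag1_autocorr(ar1_series(500, phi, sigma))
    ests.append((phi, est))
    print(f"true phi = {phi:.1f}, estimated = {est:.2f}")
```

    A model that forces one common phi and sigma onto all three of these participants would misstate the uncertainty for each, which is the motivation for the participant-specific covariance structure.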

  20. Imagery Rescripting for Body Dysmorphic Disorder: A Multiple-Baseline Single-Case Experimental Design.

    Science.gov (United States)

    Willson, Rob; Veale, David; Freeston, Mark

    2016-03-01

    Individuals with body dysmorphic disorder (BDD) often experience negative distorted images of their appearance, and research suggests these may be linked to memories of adverse events such as bullying or teasing. This study evaluates imagery rescripting (ImR) as an intervention for BDD. In this article, we present a multiple-baseline single-case experimental design testing imagery rescripting as a brief, stand-alone intervention, with six individuals with BDD that related to aversive memories. The impact of the intervention was assessed by self-reported daily measures of symptom severity (preoccupation with appearance, appearance-related checking behaviors, appearance-related distress, and strength of belief that their main problem is their appearance) and standardized clinician ratings of BDD severity (Yale-Brown Obsessive Compulsive Scale modified for BDD). Four out of six of the participants responded positively to the intervention, with clinically meaningful improvement in symptomatology. Overall response was rapid; improvements began within the first week post-ImR intervention. From a small sample it is cautiously concluded that imagery rescripting may show promise as a module in cognitive-behavioral therapy for BDD, and is worthy of further investigation. Copyright © 2015. Published by Elsevier Ltd.

  1. Meta-Analysis of the Relation of Baseline Right Ventricular Function to Response to Cardiac Resynchronization Therapy.

    Science.gov (United States)

    Sharma, Abhishek; Bax, Jerome J; Vallakati, Ajay; Goel, Sunny; Lavie, Carl J; Kassotis, John; Mukherjee, Debabrata; Einstein, Andrew; Warrier, Nikhil; Lazar, Jason M

    2016-04-15

    Right ventricular (RV) dysfunction has been associated with adverse clinical outcomes in patients with heart failure (HF). Cardiac resynchronization therapy (CRT) improves left ventricular (LV) size and function in patients with markedly abnormal electrocardiogram QRS duration. However, relation of baseline RV function with response to CRT has not been well described. In this study, we aim to investigate the relation of baseline RV function with response to CRT as assessed by change in LV ejection fraction (EF). A systematic search of studies published from 1966 to May 31, 2015 was conducted using PubMed, CINAHL, Cochrane CENTRAL, and the Web of Science databases. Studies were included if they have reported (1) parameters of baseline RV function (tricuspid annular plane systolic excursion [TAPSE] or RVEF or RV basal strain or RV fractional area change [FAC]) and (2) LVEF before and after CRT. Random-effects metaregression was used to evaluate the effect of baseline RV function parameters and change in LVEF. Sixteen studies (n = 1,764) were selected for final analysis. Random-effects metaregression analysis showed no significant association between the magnitude of the difference in EF before and after CRT with baseline TAPSE (β = 0.005, p = 0.989); baseline RVEF (β = 0.270, p = 0.493); baseline RVFAC (β = -0.367, p = 0.06); baseline basal strain (β = -0.342, p = 0.462) after a mean follow-up period of 10.5 months. In conclusion, baseline RV function as assessed by TAPSE, FAC, basal strain, or RVEF does not determine response to CRT as assessed by change in LVEF. Copyright © 2016 Elsevier Inc. All rights reserved.

  2. Multiple factor analysis by example using R

    CERN Document Server

    Pagès, Jérôme

    2014-01-01

    Multiple factor analysis (MFA) enables users to analyze tables of individuals and variables in which the variables are structured into quantitative, qualitative, or mixed groups. Written by the co-developer of this methodology, Multiple Factor Analysis by Example Using R brings together the theoretical and methodological aspects of MFA. It also includes examples of applications and details of how to implement MFA using an R package (FactoMineR). The first two chapters cover the basic factorial analysis methods of principal component analysis (PCA) and multiple correspondence analysis (MCA). …

  3. A novel baseline correction method using convex optimization framework in laser-induced breakdown spectroscopy quantitative analysis

    Science.gov (United States)

    Yi, Cancan; Lv, Yong; Xiao, Han; Ke, Ke; Yu, Xun

    2017-12-01

    For the laser-induced breakdown spectroscopy (LIBS) quantitative analysis technique, baseline correction is an essential part of LIBS data preprocessing. Baseline drift is widely observed, generated by fluctuation of laser energy, inhomogeneity of sample surfaces, and background noise, and it has aroused the interest of many researchers. Most prevalent algorithms need to preset key parameters, such as a suitable spline function and the fitting order, and thus lack adaptability. Based on the characteristics of LIBS, namely the sparsity of spectral peaks and the low-pass-filtered feature of the baseline, a novel baseline correction and spectral data denoising method is studied in this paper. The improved technique utilizes a convex optimization scheme to form a non-parametric baseline correction model. Meanwhile, an asymmetric penalty function is used to enhance the signal-to-noise ratio (SNR) of the LIBS signal and improve reconstruction precision. Furthermore, an efficient iterative algorithm is applied to the optimization process, so as to ensure convergence. To validate the proposed method, the concentration analysis of Chromium (Cr), Manganese (Mn), and Nickel (Ni) contained in 23 certified high alloy steel samples is assessed by using quantitative models with Partial Least Squares (PLS) and Support Vector Machine (SVM). Because it requires no prior knowledge of sample composition and no mathematical hypothesis, the method proposed in this paper has better accuracy in quantitative analysis compared with other methods, and fully reflects its adaptive ability.
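
    The two assumptions the record relies on (sparse peaks, low-pass baseline) can be illustrated with a crude rolling-minimum baseline estimate on a synthetic spectrum. This is a hedged stand-in, not the paper's convex-optimization model, and all signal parameters are invented.

```python
import math, random

random.seed(42)

def rolling_min_baseline(y, hw=25):
    """Crude baseline estimate: rolling minimum (peaks are sparse, so the
    local minimum tracks the baseline) followed by a moving average
    (the baseline is low-pass, so smoothing is harmless)."""
    n = len(y)
    lo = [min(y[max(0, i - hw):i + hw + 1]) for i in range(n)]
    return [sum(lo[max(0, i - hw):i + hw + 1]) / len(lo[max(0, i - hw):i + hw + 1])
            for i in range(n)]

# Synthetic "spectrum": linear baseline drift + two sparse peaks + small noise.
n = 400
spectrum = [2.0 + 0.003 * i
            + 10.0 * math.exp(-((i - 100) / 4.0) ** 2 / 2)
            + 6.0 * math.exp(-((i - 250) / 4.0) ** 2 / 2)
            + random.gauss(0, 0.02) for i in range(n)]
baseline = rolling_min_baseline(spectrum)
corrected = [s - b for s, b in zip(spectrum, baseline)]
print("peak heights after correction:",
      round(corrected[100], 2), round(corrected[250], 2))
```

    The convex-optimization formulation in the record replaces this heuristic with a penalized objective and an asymmetric penalty, but the exploited structure is the same.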

  4. Using multiple schedules during functional communication training to promote rapid transfer of treatment effects.

    Science.gov (United States)

    Fisher, Wayne W; Greer, Brian D; Fuhrman, Ashley M; Querim, Angie C

    2015-12-01

    Multiple schedules with signaled periods of reinforcement and extinction have been used to thin reinforcement schedules during functional communication training (FCT) to make the intervention more practical for parents and teachers. We evaluated whether these signals would also facilitate rapid transfer of treatment effects across settings and therapists. With 2 children, we conducted FCT in the context of mixed (baseline) and multiple (treatment) schedules introduced across settings or therapists using a multiple baseline design. Results indicated that when the multiple schedules were introduced, the functional communication response came under rapid discriminative control, and problem behavior remained at near-zero rates. We extended these findings with another individual by using a more traditional baseline in which problem behavior produced reinforcement. Results replicated those of the previous participants and showed rapid reductions in problem behavior when multiple schedules were implemented across settings. © Society for the Experimental Analysis of Behavior.

  5. [Modular risk analysis for assessing multiple waste sites]: Proceedings

    International Nuclear Information System (INIS)

    Whelan, G.

    1994-01-01

    This document contains proceedings from the Integrated Planning Workshop from Strategic Planning to Baselining and Other Objectives. Topics discussed include: stakeholder involvement; regulations; future site use planning; site integration and baseline methods; risk analysis in decision making; land uses; and economics in decision making. Individual records have been processed separately for the database

  6. Modeling Rabbit Responses to Single and Multiple Aerosol ...

    Science.gov (United States)

    Survival models are developed here to predict response and time-to-response for mortality in rabbits following exposures to single or multiple aerosol doses of Bacillus anthracis spores. Hazard function models were developed for a multiple dose dataset to predict the probability of death through specifying dose-response functions and the time between exposure and the time-to-death (TTD). Among the models developed, the best-fitting survival model (baseline model) has an exponential dose-response model with a Weibull TTD distribution. Alternative models assessed employ different underlying dose-response functions and use the assumption that, in a multiple dose scenario, earlier doses affect the hazard functions of each subsequent dose. In addition, published mechanistic models are analyzed and compared with models developed in this paper. None of the alternative models that were assessed provided a statistically significant improvement in fit over the baseline model. The general approach utilizes simple empirical data analysis to develop parsimonious models with limited reliance on mechanistic assumptions. The baseline model predicts TTDs consistent with reported results from three independent high-dose rabbit datasets. More accurate survival models depend upon future development of dose-response datasets specifically designed to assess potential multiple dose effects on response and time-to-response. The process used in this paper to develop …
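
    The baseline model's structure (an exponential dose-response for the probability of death, with a Weibull distribution for time-to-death) can be sketched directly. The parameter values below are invented for illustration and are not the fitted rabbit-data estimates.

```python
import math, random

random.seed(5)

# Illustrative (not fitted) parameters for the baseline model structure:
K = 2e-6                  # slope of the exponential dose-response, per spore
SHAPE, SCALE = 2.0, 4.0   # Weibull time-to-death: shape, and scale in days

def p_death(dose):
    """Exponential dose-response: P(death) = 1 - exp(-K * dose)."""
    return 1.0 - math.exp(-K * dose)

def sample_outcome(dose):
    """Return None for survival, otherwise a Weibull-distributed time-to-death."""
    if random.random() < p_death(dose):
        return random.weibullvariate(SCALE, SHAPE)
    return None

ld50 = math.log(2) / K    # dose giving 50% predicted mortality
print(f"LD50 = {ld50:.0f} spores; P(death) at 10x LD50 = {p_death(10 * ld50):.3f}")
```

    The alternative models in the record swap in different dose-response functions or let earlier doses modify the hazard of later ones, but this two-part response/TTD decomposition is the common skeleton.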

  7. Analysis of baseline, average, and longitudinally measured blood pressure data using linear mixed models.

    Science.gov (United States)

    Hossain, Ahmed; Beyene, Joseph

    2014-01-01

    This article compares baseline, average, and longitudinal data analysis methods for identifying genetic variants in a genome-wide association study using the Genetic Analysis Workshop 18 data. We apply methods that include (a) linear mixed models with baseline measures, (b) random intercept linear mixed models with mean measures outcome, and (c) random intercept linear mixed models with longitudinal measurements. In the linear mixed models, covariates are included as fixed effects, whereas relatedness among individuals is incorporated as the variance-covariance structure of the random effect for the individuals. The overall strategy of applying linear mixed models to decorrelate the data is based on Aulchenko et al.'s GRAMMAR. By analyzing systolic and diastolic blood pressure, which are used separately as outcomes, we compare the 3 methods in identifying a known genetic variant that is associated with blood pressure from chromosome 3 and simulated phenotype data. We also analyze the real phenotype data to illustrate the methods. We conclude that the linear mixed model with longitudinal measurements of diastolic blood pressure is the most accurate at identifying the known single-nucleotide polymorphism among the methods, but linear mixed models with baseline measures perform best with systolic blood pressure as the outcome.

  8. Baseline drift effect on the performance of neutron and γ ray discrimination using frequency gradient analysis

    International Nuclear Information System (INIS)

    Liu Guofu; Luo Xiaoliang; Yang Jun; Lin Cunbao; Hu Qingqing; Peng Jinxian

    2013-01-01

    Frequency gradient analysis (FGA) effectively discriminates neutrons and γ rays by examining the frequency-domain features of the photomultiplier tube anode signal. This approach is insensitive to noise but is inevitably affected by baseline drift, as are other pulse shape discrimination methods. The baseline drift effect is attributed to factors such as power line fluctuation, dark current, noise disturbances, hum, and pulse tail in front-end electronics. This effect needs to be elucidated and quantified before the baseline shift can be estimated and removed from the captured signal. Therefore, the effect of baseline shift on the discrimination performance of neutrons and γ rays with organic scintillation detectors using FGA is investigated in this paper. The relationship between the baseline shift and discrimination parameters of FGA is derived and verified by an experimental system consisting of an americium-beryllium source, a BC501A liquid scintillator detector, and a 5 GSample/s 8-bit oscilloscope. The theoretical and experimental results both show that the estimation of the baseline shift is necessary, and the removal of baseline drift from the pulse shapes can improve the discrimination performance of FGA. (authors)
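
The baseline-shift sensitivity quantified above can be demonstrated directly: a constant offset added to a digitized pulse changes only the zero-frequency (DC) component of its spectrum, which any FGA-style parameter built from that component inherits. The parameter form below (difference of the first two spectral magnitudes) is an assumed illustration, not the exact definition from the paper.

```python
import numpy as np

def fga_parameter(pulse):
    """An FGA-style discrimination parameter (assumed form): the gradient
    between the magnitudes of the zero-frequency and first non-zero
    frequency components of the pulse spectrum."""
    spec = np.abs(np.fft.rfft(pulse))
    return spec[0] - spec[1]

t = np.arange(256)
pulse = np.exp(-t / 20.0)      # idealized scintillation pulse tail
shifted = pulse + 0.05         # the same pulse riding on a baseline shift

# A constant shift of c over N samples adds exactly N*c to the DC bin
# (the sample sum) and leaves every other frequency bin untouched, so a
# DC-dependent discrimination parameter is biased until the baseline
# shift is estimated and removed.
```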

  9. A novel variable baseline visibility detection system and its measurement method

    Science.gov (United States)

    Li, Meng; Jiang, Li-hui; Xiong, Xing-long; Zhang, Guizhong; Yao, JianQuan

    2017-10-01

    As an important meteorological observation instrument, the visibility meter helps ensure the safety of traffic operation. However, due to optical system contamination and sampling error, the accuracy and stability of such equipment are difficult to maintain in low-visibility environments. To address this problem, a novel measurement instrument was designed based upon multiple baselines; it essentially acts as an atmospheric transmission meter with a movable optical receiver, applying a weighted least squares method to process the signal. Theoretical analysis and experiments in a real atmospheric environment support this technique.
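
A minimal sketch of the measurement principle, assuming Beer-Lambert attenuation over each baseline length and the standard Koschmieder relation for visibility; the function name and all numbers are synthetic.

```python
import numpy as np

def extinction_wls(baselines_m, intensities, weights=None):
    """Weighted least squares fit of ln I = ln I0 - eps * L over several
    baseline lengths L, returning the extinction coefficient eps (1/m)."""
    L = np.asarray(baselines_m, float)
    y = np.log(np.asarray(intensities, float))
    w = np.ones_like(L) if weights is None else np.asarray(weights, float)
    A = np.column_stack([np.ones_like(L), -L]) * np.sqrt(w)[:, None]
    coef, *_ = np.linalg.lstsq(A, y * np.sqrt(w), rcond=None)
    return coef[1]                                # coef[0] is ln I0

# Koschmieder relation (5 % contrast threshold): visibility ~ 3.912 / eps
L = np.array([50.0, 100.0, 150.0, 200.0])         # receiver positions (m)
I = 0.8 * np.exp(-0.002 * L)                      # synthetic received intensities
eps = extinction_wls(L, I)
```

Moving the receiver gives several (L, I) pairs from one transmitter, which is what lets the weighted fit average out per-sample error.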

  10. Eye movement desensitisation and reprocessing therapy for posttraumatic stress disorder in a child and an adolescent with mild to borderline intellectual disability: A multiple baseline across subjects study

    NARCIS (Netherlands)

    Mevissen, E.H.M.; Didden, H.C.M.; Korzilius, H.P.L.M.; Jongh, A. de

    2017-01-01

    BACKGROUND: This study explored the effectiveness of eye movement desensitisation and reprocessing (EMDR) therapy for post-traumatic stress disorder (PTSD) in persons with mild to borderline intellectual disability (MBID) using a multiple baseline across subjects design. METHODS: One child and one…

  11. Eye movement desensitisation and reprocessing therapy for posttraumatic stress disorder in a child and an adolescent with mild to borderline intellectual disability : A multiple baseline across subjects study

    NARCIS (Netherlands)

    Mevissen, L.; Didden, R.; Korzilius, H.; de Jongh, A.

    2017-01-01

    Background: This study explored the effectiveness of eye movement desensitisation and reprocessing (EMDR) therapy for post-traumatic stress disorder (PTSD) in persons with mild to borderline intellectual disability (MBID) using a multiple baseline across subjects design. Methods: One child and one…

  12. Baseline correction combined partial least squares algorithm and its application in on-line Fourier transform infrared quantitative analysis.

    Science.gov (United States)

    Peng, Jiangtao; Peng, Silong; Xie, Qiong; Wei, Jiping

    2011-04-01

    In order to eliminate the lower order polynomial interferences, a new quantitative calibration algorithm "Baseline Correction Combined Partial Least Squares (BCC-PLS)", which combines baseline correction and conventional PLS, is proposed. By embedding baseline correction constraints into PLS weights selection, the proposed calibration algorithm overcomes the uncertainty in baseline correction and can meet the requirement of on-line attenuated total reflectance Fourier transform infrared (ATR-FTIR) quantitative analysis. The effectiveness of the algorithm is evaluated by the analysis of glucose and marzipan ATR-FTIR spectra. BCC-PLS algorithm shows improved prediction performance over PLS. The root mean square error of cross-validation (RMSECV) on marzipan spectra for the prediction of the moisture is found to be 0.53%, w/w (range 7-19%). The sugar content is predicted with a RMSECV of 2.04%, w/w (range 33-68%). Copyright © 2011 Elsevier B.V. All rights reserved.
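
BCC-PLS embeds the polynomial-baseline constraint directly into the PLS weight selection; for contrast, the conventional two-step approach it improves upon first strips a fitted low-order polynomial and then calibrates. A sketch of that correction step only (illustrative, with synthetic data):

```python
import numpy as np

def remove_polynomial_baseline(spectrum, order=2):
    """Fit a low-order polynomial across the whole spectrum and subtract
    it, removing the lower-order polynomial interference."""
    x = np.arange(spectrum.size, dtype=float)
    coeffs = np.polyfit(x, spectrum, order)
    return spectrum - np.polyval(coeffs, x)

x = np.arange(100, dtype=float)
pure_baseline = 0.01 * x**2 - 0.5 * x + 3.0    # quadratic drift, no peaks
corrected = remove_polynomial_baseline(pure_baseline, order=2)
```

The uncertainty the abstract mentions arises because, on a real spectrum, the polynomial fit partially absorbs analyte peaks; folding the constraint into the PLS weights avoids that separate fitting step.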

  13. Baseline response rates affect resistance to change.

    Science.gov (United States)

    Kuroda, Toshikazu; Cook, James E; Lattal, Kennon A

    2018-01-01

    The effect of response rates on resistance to change, measured as resistance to extinction, was examined in two experiments. In Experiment 1, responding in transition from a variable-ratio schedule and its yoked-interval counterpart to extinction was compared with pigeons. Following training on a multiple variable-ratio yoked-interval schedule of reinforcement, in which response rates were higher in the former component, reinforcement was removed from both components during a single extended extinction session. Resistance to extinction in the yoked-interval component was always either greater or equal to that in the variable-ratio component. In Experiment 2, resistance to extinction was compared for two groups of rats that exhibited either high or low response rates when maintained on identical variable-interval schedules. Resistance to extinction was greater for the lower-response-rate group. These results suggest that baseline response rate can contribute to resistance to change. Such effects, however, can only be revealed when baseline response rate and reinforcement rate are disentangled (Experiments 1 and 2) from the more usual circumstance where the two covary. Furthermore, they are more cleanly revealed when the programmed contingencies controlling high and low response rates are identical, as in Experiment 2. © 2017 Society for the Experimental Analysis of Behavior.

  14. An Examination of Fluoxetine for the Treatment of Selective Mutism Using a Nonconcurrent Multiple-Baseline Single-Case Design Across 5 Cases.

    Science.gov (United States)

    Barterian, Justin A; Sanchez, Joel M; Magen, Jed; Siroky, Allison K; Mash, Brittany L; Carlson, John S

    2018-01-01

    This study examined the utility of fluoxetine in the treatment of 5 children, aged 5 to 14 years, diagnosed with selective mutism who also demonstrated symptoms of social anxiety. A nonconcurrent, randomized, multiple-baseline, single-case design with a single-blind placebo-controlled procedure was used. Parents and the study psychiatrist completed multiple methods of assessment including Direct Behavior Ratings and questionnaires. Treatment outcomes were evaluated by calculating effect sizes for each participant as an individual and for the participants as a group. Information regarding adverse effects with an emphasis on behavioral disinhibition and ratings of parental acceptance of the intervention was gathered. All 5 children experienced improvement in social anxiety, responsive speech, and spontaneous speech with medium to large effect sizes; however, children still met criteria for selective mutism at the end of the study. Adverse events were minimal, with only 2 children experiencing brief occurrences of minor behavioral disinhibition. Parents found the treatment highly acceptable.

  15. Predicting Baseline for Analysis of Electricity Pricing

    Energy Technology Data Exchange (ETDEWEB)

    Kim, T. [Ulsan National Inst. of Science and Technology (Korea, Republic of); Lee, D. [Ulsan National Inst. of Science and Technology (Korea, Republic of); Choi, J. [Ulsan National Inst. of Science and Technology (Korea, Republic of); Spurlock, A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sim, A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Todd, A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wu, K. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-05-03

    To understand the impact of a new pricing structure on residential electricity demand, we need a baseline model that captures every factor other than the new price. The standard baseline is a randomized control group; however, a good control group is hard to design. This motivates us to develop data-driven approaches. We explored many techniques and designed a strategy, named LTAP, that can predict hourly usage years ahead. The key challenge in this process is that the daily cycle of electricity demand peaks a few hours after the temperature reaches its peak. Existing methods rely on lagged variables of recent past usage to enforce this daily cycle, and therefore have trouble making predictions years ahead. LTAP avoids this trouble by assuming the daily usage profile is determined by temperature and other factors. In a comparison against a well-designed control group, LTAP is found to produce accurate predictions.
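
The LTAP premise (drive the prediction from temperature and the daily profile rather than from lagged usage) can be sketched as a plain regression. The data, coefficients, and function names here are simulated and hypothetical.

```python
import numpy as np

rng = np.random.default_rng(7)
hours = np.tile(np.arange(24), 30)                         # 30 days, hourly
temps = (20.0 + 8.0 * np.sin(2 * np.pi * (hours - 15) / 24)
         + rng.normal(0.0, 1.0, hours.size))
hour_effect = np.sin(2 * np.pi * (np.arange(24) - 18) / 24)  # evening peak
usage = 0.5 * temps + 2.0 * hour_effect[hours] + rng.normal(0.0, 0.2, hours.size)

def fit_ltap_like(temps, hours, usage):
    """Regress usage on temperature plus hour-of-day indicators. No lagged
    usage enters, so the fitted model can predict years ahead from a
    temperature forecast alone (the LTAP premise)."""
    X = np.column_stack([temps, np.eye(24)[hours]])
    beta, *_ = np.linalg.lstsq(X, usage, rcond=None)
    return beta

def predict_ltap_like(beta, temps, hours):
    return np.column_stack([temps, np.eye(24)[hours]]) @ beta
```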

  16. Eye Movement Desensitisation and Reprocessing Therapy for Posttraumatic Stress Disorder in a Child and an Adolescent with Mild to Borderline Intellectual Disability: A Multiple Baseline across Subjects Study

    Science.gov (United States)

    Mevissen, Liesbeth; Didden, Robert; Korzilius, Hubert; de Jongh, Ad

    2017-01-01

    Background: This study explored the effectiveness of eye movement desensitisation and reprocessing (EMDR) therapy for post-traumatic stress disorder (PTSD) in persons with mild to borderline intellectual disability (MBID) using a multiple baseline across subjects design. Methods: One child and one adolescent with MBID, who met diagnostic criteria…

  17. MARBLE (Multiple Antenna Radio-interferometry for Baseline Length Evaluation): Development of a Compact VLBI System for Calibrating GNSS and Electronic Distance Measurement Devices

    Science.gov (United States)

    Ichikawa, R.; Ishii, A.; Takiguchi, H.; Kimura, M.; Sekido, M.; Takefuji, K.; Ujihara, H.; Hanado, Y.; Koyama, Y.; Kondo, T.; Kurihara, S.; Kokado, K.; Kawabata, R.; Nozawa, K.; Mukai, Y.; Kuroda, J.; Ishihara, M.; Matsuzaka, S.

    2012-12-01

    We are developing a compact VLBI system with a 1.6-m diameter aperture dish in order to provide reference baseline lengths for calibration. The reference baselines are used to validate surveying instruments such as GPS and EDM and are maintained by the Geospatial Information Authority of Japan (GSI). The compact VLBI system will be installed at both ends of the reference baseline. Since the system is not sensitive enough to detect fringes between the two small dishes, we have designed a new observation concept that includes one large dish station. We can detect two group delays, one between each compact VLBI system and the large dish station, based on conventional VLBI measurement. The group delay between the two compact dishes can then be calculated indirectly using a simple equation. We named the idea "Multiple Antenna Radio-interferometry for Baseline Length Evaluation", or the MARBLE system. The compact VLBI system is easily transportable and consists of the compact dish, a new wide-band front-end system, azimuth and elevation drive units, an IF down-converter unit, an antenna control unit (ACU), a counterweight, and a monument pillar. Each drive unit is equipped with a zero-backlash harmonic drive gearing component. The monument pillar is designed to mount typical geodetic GNSS antennas easily, with an offset to the GNSS antenna reference point. The location of the azimuth-elevation crossing point of the VLBI system is precisely determined with an uncertainty of less than 0.2 mm. We have carried out seven geodetic VLBI experiments on the Kashima-Tsukuba baseline (about 54 km) using the two prototypes of the compact VLBI system between December 2009 and December 2010. The average baseline length over the experiments is 54184874.0 mm, with a repeatability of 2.4 mm. The results agree well with those obtained by GPS measurements. In addition, we are now planning to use the compact VLBI system for precise time and frequency comparison between separated locations.
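
Under the natural assumption, the "simple equation" mentioned above is just the difference of the two measured delays against the shared large dish; this sketch uses made-up delay values and is not the paper's exact formulation.

```python
def indirect_group_delay(tau_large_a, tau_large_b):
    """Group delay between compact dishes A and B, obtained by differencing
    each dish's conventional VLBI group delay measured against the shared
    large dish; delay terms common to the large station cancel."""
    return tau_large_a - tau_large_b

tau_ab = indirect_group_delay(1.80e-7, 1.52e-7)   # seconds, hypothetical values
```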

  18. Idaho National Engineering Laboratory (INEL) Environmental Restoration (ER) Program Baseline Safety Analysis File (BSAF)

    International Nuclear Information System (INIS)

    1995-09-01

    The Baseline Safety Analysis File (BSAF) is a facility safety reference document for Idaho National Engineering Laboratory (INEL) environmental restoration activities. The BSAF contains information and guidance for safety analysis documentation required by the U.S. Department of Energy (DOE) for environmental restoration (ER) activities, including: characterization of potentially contaminated sites; remedial investigations to identify, and remedial actions to clean up, existing and potential releases from inactive waste sites; and decontamination and dismantlement of surplus facilities. The information is INEL-specific and is in the format required by DOE-EM-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports. An author of safety analysis documentation need only write information concerning that activity and refer to the BSAF for further information, or copy applicable chapters and sections. The information and guidance provided are suitable for: nuclear facilities (DOE Order 5480.23, Nuclear Safety Analysis Reports) with hazards that meet the Category 3 threshold (DOE-STD-1027-92, Hazard Categorization and Accident Analysis Techniques for Compliance with DOE Order 5480.23, Nuclear Safety Analysis Reports); radiological facilities (DOE-EM-STD-5502-94, Hazard Baseline Documentation); and nonnuclear facilities (DOE-EM-STD-5502-94) that are classified as "low" hazard facilities (DOE Order 5481.1B, Safety Analysis and Review System). Additionally, the BSAF could be used as an information source for Health and Safety Plans and for Safety Analysis Reports (SARs) for nuclear facilities with hazards equal to or greater than the Category 2 thresholds, or for nonnuclear facilities with "moderate" or "high" hazard classifications.

  19. Idaho National Engineering Laboratory (INEL) Environmental Restoration (ER) Program Baseline Safety Analysis File (BSAF)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-09-01

    The Baseline Safety Analysis File (BSAF) is a facility safety reference document for Idaho National Engineering Laboratory (INEL) environmental restoration activities. The BSAF contains information and guidance for safety analysis documentation required by the U.S. Department of Energy (DOE) for environmental restoration (ER) activities, including: characterization of potentially contaminated sites; remedial investigations to identify, and remedial actions to clean up, existing and potential releases from inactive waste sites; and decontamination and dismantlement of surplus facilities. The information is INEL-specific and is in the format required by DOE-EM-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Safety Analysis Reports. An author of safety analysis documentation need only write information concerning that activity and refer to the BSAF for further information, or copy applicable chapters and sections. The information and guidance provided are suitable for: nuclear facilities (DOE Order 5480.23, Nuclear Safety Analysis Reports) with hazards that meet the Category 3 threshold (DOE-STD-1027-92, Hazard Categorization and Accident Analysis Techniques for Compliance with DOE Order 5480.23, Nuclear Safety Analysis Reports); radiological facilities (DOE-EM-STD-5502-94, Hazard Baseline Documentation); and nonnuclear facilities (DOE-EM-STD-5502-94) that are classified as "low" hazard facilities (DOE Order 5481.1B, Safety Analysis and Review System). Additionally, the BSAF could be used as an information source for Health and Safety Plans and for Safety Analysis Reports (SARs) for nuclear facilities with hazards equal to or greater than the Category 2 thresholds, or for nonnuclear facilities with "moderate" or "high" hazard classifications.

  20. Three-dimensional thermal analysis of a baseline spent fuel repository

    International Nuclear Information System (INIS)

    Altenbach, T.J.; Lowry, W.E.

    1980-01-01

    A three-dimensional thermal analysis has been performed using finite difference techniques to determine the near-field response of a baseline spent fuel repository in a deep geologic salt medium. A baseline design incorporates previous thermal modeling experience and OWI recommendations for areal thermal loading in specifying the waste form properties, package details, and emplacement configuration. The base case in this thermal analysis considers one 10-year-old PWR spent fuel assembly emplaced to yield a 36 kW/acre (8.9 W/m²) loading. A unit cell model in an infinite array is used to simplify the problem and provide upper-bound temperatures. Boundary conditions are imposed which allow simulations to 1000 years. Variations studied include a comparison of ventilated and unventilated storage room conditions, emplacement packages with and without air gaps surrounding the canister, and room cool-down scenarios with ventilation following an unventilated state for retrieval purposes. At this low power level, ventilating the emplacement room has an immediate cooling influence on the canister and effectively maintains the emplacement room floor near the temperature of the ventilating air. The annular gap separating the canister and sleeve causes the peak temperature of the canister surface to rise by 10 °F (5.6 °C) over that from a no-gap case assuming perfect thermal contact. It was also shown that the time required for the emplacement room to cool down to 100 °F (38 °C) from an unventilated state ranged from 2 weeks to 6 months when ventilation was initiated after 5 to 50 years, respectively. As the work was performed for the Nuclear Regulatory Commission, these results provide a significant addition to the regulatory data base for spent fuel performance in a geologic repository

  1. Stripe-PZT Sensor-Based Baseline-Free Crack Diagnosis in a Structure with a Welded Stiffener

    Directory of Open Access Journals (Sweden)

    Yun-Kyu An

    2016-09-01

    Full Text Available This paper proposes a stripe-PZT sensor-based baseline-free crack diagnosis technique in the heat affected zone (HAZ of a structure with a welded stiffener. The proposed technique enables one to identify and localize a crack in the HAZ using only currently measured stripe-PZT sensor data. The use of the stripe-PZT sensor makes it possible to significantly improve the applicability to real structures and minimize man-made errors associated with the installation process by embedding multiple piezoelectric sensors onto a printed circuit board. Moreover, a new frequency-wavenumber analysis-based baseline-free crack diagnosis algorithm minimizes false alarms caused by environmental variations by avoiding simple comparison with the baseline data accumulated from the pristine condition of a target structure. The proposed technique is numerically as well as experimentally validated using a plate-like structure with a welded stiffener, revealing that it successfully identifies and localizes a crack in the HAZ.

  2. Stripe-PZT Sensor-Based Baseline-Free Crack Diagnosis in a Structure with a Welded Stiffener.

    Science.gov (United States)

    An, Yun-Kyu; Shen, Zhiqi; Wu, Zhishen

    2016-09-16

    This paper proposes a stripe-PZT sensor-based baseline-free crack diagnosis technique in the heat affected zone (HAZ) of a structure with a welded stiffener. The proposed technique enables one to identify and localize a crack in the HAZ using only currently measured stripe-PZT sensor data. The use of the stripe-PZT sensor makes it possible to significantly improve the applicability to real structures and minimize man-made errors associated with the installation process by embedding multiple piezoelectric sensors onto a printed circuit board. Moreover, a new frequency-wavenumber analysis-based baseline-free crack diagnosis algorithm minimizes false alarms caused by environmental variations by avoiding simple comparison with the baseline data accumulated from the pristine condition of a target structure. The proposed technique is numerically as well as experimentally validated using a plate-like structure with a welded stiffener, revealing that it successfully identifies and localizes a crack in the HAZ.

  3. Simultaneous Two-Way Clustering of Multiple Correspondence Analysis

    Science.gov (United States)

    Hwang, Heungsun; Dillon, William R.

    2010-01-01

    A 2-way clustering approach to multiple correspondence analysis is proposed to account for cluster-level heterogeneity of both respondents and variable categories in multivariate categorical data. Specifically, in the proposed method, multiple correspondence analysis is combined with k-means in a unified framework in which "k"-means is…

  4. Prediction of diabetes based on baseline metabolic characteristics in individuals at high risk.

    Science.gov (United States)

    Defronzo, Ralph A; Tripathy, Devjit; Schwenke, Dawn C; Banerji, Maryann; Bray, George A; Buchanan, Thomas A; Clement, Stephen C; Henry, Robert R; Kitabchi, Abbas E; Mudaliar, Sunder; Ratner, Robert E; Stentz, Frankie B; Musi, Nicolas; Reaven, Peter D; Gastaldelli, Amalia

    2013-11-01

    Individuals with impaired glucose tolerance (IGT) are at high risk for developing type 2 diabetes mellitus (T2DM). We examined which characteristics at baseline predicted the development of T2DM versus maintenance of IGT or conversion to normal glucose tolerance. We studied 228 subjects at high risk with IGT who received treatment with placebo in ACT NOW and who underwent baseline anthropometric measures and oral glucose tolerance test (OGTT) at baseline and after a mean follow-up of 2.4 years. In a univariate analysis, 45 of 228 (19.7%) IGT individuals developed diabetes. After adjusting for age, sex, and center, increased fasting plasma glucose, 2-h plasma glucose, G0-120 during OGTT, HbA1c, adipocyte insulin resistance index, ln fasting plasma insulin, and ln I0-120, as well as family history of diabetes and presence of metabolic syndrome, were associated with increased risk of diabetes. At baseline, higher insulin secretion (ln [I0-120/G0-120]) during the OGTT was associated with decreased risk of diabetes. Higher β-cell function (insulin secretion/insulin resistance or disposition index; ln [I0-120/G0-120 × Matsuda index of insulin sensitivity]; odds ratio 0.11; P < 0.0001) was the variable most closely associated with reduced risk of diabetes. In a stepwise multiple-variable analysis, only HbA1c and β-cell function (ln insulin secretion/insulin resistance index) predicted the development of diabetes (r = 0.49; P < 0.0001).

  5. Variable precision rough set for multiple decision attribute analysis

    Institute of Scientific and Technical Information of China (English)

    Lai; Kin; Keung

    2008-01-01

    A variable precision rough set (VPRS) model is used to solve the multi-attribute decision analysis (MADA) problem with multiple conflicting decision attributes and multiple condition attributes. By introducing confidence measures and a β-reduct, the VPRS model can rationally solve the conflicting decision analysis problem with multiple decision attributes and multiple condition attributes. For illustration, a medical diagnosis example is utilized to show the feasibility of the VPRS model in solving the MADA...

  6. Waste management project technical baseline description

    International Nuclear Information System (INIS)

    Sederburg, J.P.

    1997-01-01

    A systems engineering approach has been taken to describe the technical baseline under which the Waste Management Project is currently operating. The document contains a mission analysis, function analysis, requirement analysis, interface definitions, alternative analysis, system definition, documentation requirements, implementation definitions, and discussion of uncertainties facing the Project

  7. Proceedings of the workshop on multiple prompt gamma-ray analysis

    International Nuclear Information System (INIS)

    Ebihara, Mitsuru; Hatsukawa, Yuichi; Oshima, Masumi

    2006-10-01

    The workshop on 'Multiple Prompt Gamma-ray Analysis' was held on March 8, 2006 at Tokai. It is based on a project, 'Developments of real time, non-destructive ultra sensitive elemental analysis using multiple gamma-ray detections and prompt gamma ray analysis and its application to real samples', one of the High Priority Cooperative Research Programs performed by the Japan Atomic Energy Agency and the University of Tokyo. In this workshop, the latest results of the Multiple Prompt Gamma-ray Analysis (MPGA) study were presented, together with those of Neutron Activation Analysis with Multiple Gamma-ray Detection (NAAMG). Nine of the presented papers are indexed individually. (J.P.N.)

  8. Digital baseline estimation method for multi-channel pulse height analyzing

    International Nuclear Information System (INIS)

    Xiao Wuyun; Wei Yixiang; Ai Xianyun

    2005-01-01

    The basic features of digital baseline estimation for multi-channel pulse height analysis are introduced. The weight function of the minimum-noise baseline filter is derived with functional variational calculus. The frequency response of this filter is also derived via Fourier transformation, and the influence of its parameters on the amplitude-frequency response is discussed. With MATLAB, the noise voltage signal from the charge-sensitive preamplifier is simulated, and the processing effect of minimum-noise digital baseline estimation is verified. According to these results, the digital baseline estimation method estimates the baseline optimally and is well suited to digital multi-channel pulse height analysis. (authors)
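
A minimal numerical illustration of the averaging idea behind the minimum-noise baseline filter: for white noise a flat weight function (a plain mean over pre-trigger samples) is the optimal linear estimate, with variance falling as 1/N. The signal values below are simulated, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
baseline_true = 0.37                      # volts, hypothetical DC offset
# pre-trigger samples: baseline plus white preamplifier noise
pre_trigger = baseline_true + rng.normal(0.0, 0.05, 128)

def estimate_baseline(samples):
    """Flat-weight baseline estimate (a plain mean over the window). For
    white noise this is the minimum-variance linear estimator."""
    return float(np.mean(samples))

# subtracting the estimate centers the record before pulse height analysis
corrected = pre_trigger - estimate_baseline(pre_trigger)
```

For correlated (non-white) noise, the variationally derived weight function in the paper departs from this flat window.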

  9. Idiopathic Pulmonary Fibrosis: Data-driven Textural Analysis of Extent of Fibrosis at Baseline and 15-Month Follow-up.

    Science.gov (United States)

    Humphries, Stephen M; Yagihashi, Kunihiro; Huckleberry, Jason; Rho, Byung-Hak; Schroeder, Joyce D; Strand, Matthew; Schwarz, Marvin I; Flaherty, Kevin R; Kazerooni, Ella A; van Beek, Edwin J R; Lynch, David A

    2017-10-01

    Purpose To evaluate associations between pulmonary function and both quantitative analysis and visual assessment of thin-section computed tomography (CT) images at baseline and at 15-month follow-up in subjects with idiopathic pulmonary fibrosis (IPF). Materials and Methods This retrospective analysis of preexisting anonymized data, collected prospectively between 2007 and 2013 in a HIPAA-compliant study, was exempt from additional institutional review board approval. The extent of lung fibrosis at baseline inspiratory chest CT in 280 subjects enrolled in the IPF Network was evaluated. Visual analysis was performed by using a semiquantitative scoring system. Computer-based quantitative analysis included CT histogram-based measurements and a data-driven textural analysis (DTA). Follow-up CT images in 72 of these subjects were also analyzed. Univariate comparisons were performed by using Spearman rank correlation. Multivariate and longitudinal analyses were performed by using a linear mixed model approach, in which models were compared by using asymptotic χ2 tests. Results At baseline, all CT-derived measures showed moderate significant correlation with pulmonary function. At follow-up CT, changes in DTA scores showed significant correlation with changes in both forced vital capacity percentage predicted (ρ = -0.41) and other measures of pulmonary function. Conclusion Data-driven textural analysis of the extent of fibrosis at CT yields an index of severity that correlates with visual assessment and functional change in subjects with IPF. © RSNA, 2017.

  10. Causal Mediation Analysis of Survival Outcome with Multiple Mediators.

    Science.gov (United States)

    Huang, Yen-Tsung; Yang, Hwai-I

    2017-05-01

    Mediation analyses have been a popular approach to investigate the effect of an exposure on an outcome through a mediator. Mediation models with multiple mediators have been proposed for continuous and dichotomous outcomes. However, development of multimediator models for survival outcomes is still limited. We present methods for multimediator analyses using three survival models: Aalen additive hazard models, Cox proportional hazard models, and semiparametric probit models. Effects through mediators can be characterized by path-specific effects, for which definitions and identifiability assumptions are provided. We derive closed-form expressions for path-specific effects for the three models, which are intuitively interpreted using a causal diagram. Mediation analyses using Cox models under the rare-outcome assumption and Aalen additive hazard models consider effects on log hazard ratio and hazard difference, respectively; analyses using semiparametric probit models consider effects on difference in transformed survival time and survival probability. The three models were applied to a hepatitis study where we investigated effects of hepatitis C on liver cancer incidence mediated through baseline and/or follow-up hepatitis B viral load. The three methods show consistent results on respective effect scales, which suggest an adverse estimated effect of hepatitis C on liver cancer not mediated through hepatitis B, and a protective estimated effect mediated through the baseline (and possibly follow-up) hepatitis B viral load. Causal mediation analyses of survival outcome with multiple mediators are developed for additive hazard, proportional hazard, and probit models, with utility demonstrated in a hepatitis study.

  11. Wind power projects in the CDM: Methodologies and tools for baselines, carbon financing and sustainability analysis

    International Nuclear Information System (INIS)

    Ringius, L.; Grohnheit, P.E.; Nielsen, L.H.; Olivier, A.L.; Painuly, J.; Villavicencio, A.

    2002-12-01

    The report is intended to be a guidance document for project developers, investors, lenders, and CDM host countries involved in wind power projects in the CDM. The report explores in particular those issues that are important in CDM project assessment and development, that is, baseline development, carbon financing, and environmental sustainability. It does not deal in detail with those issues that are routinely covered in a standard wind power project assessment. The report tests, compares, and recommends methodologies for and approaches to baseline development. To present the application and implications of the various methodologies and approaches in a concrete context, Africa's largest wind farm, namely the 60 MW wind farm located at Zafarana, Egypt, is examined as a hypothetical CDM wind power project. The report shows that for the present case example there is a difference of about 25% between the lowest (0.5496 tCO2/MWh) and the highest emission rate (0.6868 tCO2/MWh) estimated in accordance with the three standardized approaches to baseline development under the Marrakesh Accords. This difference in emission factors comes about partly as a result of including hydroelectric power in the baseline scenario. Hydroelectric resources constitute around 21% of the generation capacity in Egypt, and, if hydropower is excluded, the difference between the lowest and the highest baseline is reduced to 18%. Furthermore, since the two variations of the 'historical' baseline option examined result in the highest and the lowest baselines, disregarding this baseline option altogether reduces the difference between the lowest and the highest to 16%. The ES3 model, which the Systems Analysis Department at Risoe National Laboratory has developed, makes it possible for this report to explore the project-specific approach to baseline development in some detail. Based on quite disaggregated data on the Egyptian electricity system, including the wind power production…

  12. Association between magnetic resonance imaging patterns and baseline disease features in multiple myeloma: analyzing surrogates of tumour mass and biology

    Energy Technology Data Exchange (ETDEWEB)

    Mai, Elias K.; Merz, Maximilian; Shah, Sofia; Hillengass, Michaela; Wagner, Barbara; Hose, Dirk; Raab, M.S. [University Hospital Heidelberg, Department of Internal Medicine V, Heidelberg (Germany); Hielscher, Thomas [German Cancer Research Center, Division of Biostatistics, Heidelberg (Germany); Kloth, Jost K.; Weber, Marc-Andre [University Hospital of Heidelberg, Clinic of Diagnostic and Interventional Radiology, Heidelberg (Germany); Jauch, Anna [University Hospital of Heidelberg, Institute of Human Genetics, Heidelberg (Germany); Delorme, Stefan [German Cancer Research Center, Department of Radiology, Heidelberg (Germany); Goldschmidt, Hartmut [University Hospital Heidelberg, Department of Internal Medicine V, Heidelberg (Germany); National Center for Tumor Diseases (NCT) Heidelberg, Heidelberg (Germany); Hillengass, Jens [University Hospital Heidelberg, Department of Internal Medicine V, Heidelberg (Germany); German Cancer Research Center, Department of Radiology, Heidelberg (Germany)

    2016-11-15

    To assess associations between bone marrow infiltration patterns and localization in magnetic resonance imaging (MRI) and baseline clinical/prognostic parameters in multiple myeloma (MM). We compared baseline MM parameters, MRI patterns and localization of focal lesions to the mineralized bone in 206 newly diagnosed MM patients. A high tumour mass (represented by International Staging System stage III) was significantly associated with severe diffuse infiltration (p = 0.015) and a higher number of focal lesions (p = 0.006). Elevated creatinine (p = 0.003), anaemia (p < 0.001) and high LDH (p = 0.001) correlated with severe diffuse infiltration. A salt and pepper diffuse pattern had a favourable prognosis. A higher degree of destruction of mineralized bone (assessed by X-ray or computed tomography) was associated with an increasing number of focal lesions on MRI (p < 0.001). Adverse cytogenetics (del17p/gain1q21/t(4;14)) were associated with diffuse infiltration (p = 0.008). The presence of intraosseous focal lesions exceeding the mineralized bone had a borderline significant impact on prognosis. Diffuse bone marrow infiltration on MRI correlates with adverse cytogenetics, lowered haemoglobin values and high tumour burden in newly diagnosed MM whereas an increasing number of focal lesions correlates with a higher degree of bone destruction. Focal lesions exceeding the cortical bone did not adversely affect the prognosis. (orig.)

  13. 40 CFR 74.20 - Data for baseline and alternative baseline.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 16 2010-07-01 2010-07-01 false Data for baseline and alternative baseline. 74.20 Section 74.20 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... baseline and alternative baseline. (a) Acceptable data. (1) The designated representative of a combustion...

  14. 324 Building Baseline Radiological Characterization

    Energy Technology Data Exchange (ETDEWEB)

    R.J. Reeder, J.C. Cooper

    2010-06-24

    This report documents the analysis of radiological data collected as part of the characterization study performed in 1998. The study was performed to create a baseline of the radiological conditions in the 324 Building.

  15. Improved Small Baseline processing by means of CAESAR eigen-interferograms decomposition

    Science.gov (United States)

    Verde, Simona; Reale, Diego; Pauciullo, Antonio; Fornaro, Gianfranco

    2018-05-01

    The Component extrAction and sElection SAR (CAESAR) method is a recently proposed approach for the selection and filtering of scattering mechanisms in the multibaseline interferometric SAR framework. Its strength lies in the possibility of selecting and extracting multiple dominant scattering mechanisms, even when they interfere in the same pixel, from the stage of interferogram generation onwards, and of carrying out decorrelation noise phase filtering. Up to now, CAESAR has been validated in the framework of SAR tomography for the model-based detection of Persistent Scatterers (PSs). In this paper we investigate the effectiveness of using CAESAR eigen-interferograms in classical multi-baseline DInSAR processing, based on the Small BAseline Subset (SBAS) strategy, typically adopted to extract large-scale distributed deformation and the atmospheric phase screen. Such components are also exploited for the calibration of the full-resolution data for PS or tomographic analysis. Using COSMO-SkyMed (CSK) SAR data, it is demonstrated that dominant scattering component filtering effectively improves the monitoring of distributed, spatially decorrelated areas (e.g. bare soil, rocks) and brings to light man-made structures with dominant backscattering characteristics embedded in highly temporally decorrelated scenarios, such as isolated asphalt roads and blocks of buildings in non-urban areas. Moreover, it is shown that, thanks to the CAESAR separation of multiple scattering components, the layover mitigation in low-topography eigen-interferograms relieves Phase Unwrapping (PhU) errors in urban areas due to abrupt height variations.

  16. Deep convolutional neural network based antenna selection in multiple-input multiple-output system

    Science.gov (United States)

    Cai, Jiaxin; Li, Yan; Hu, Ying

    2018-03-01

    Antenna selection in wireless communication systems has attracted increasing attention due to the challenge of keeping a balance between communication performance and computational complexity in large-scale Multiple-Input Multiple-Output antenna systems. Recently, deep learning based methods have achieved promising performance for large-scale data processing and analysis in many application fields. This paper is the first attempt to introduce the deep learning technique into the field of Multiple-Input Multiple-Output antenna selection in wireless communications. First, the label of the attenuation-coefficient channel matrix is generated by minimizing the key performance indicator of the training antenna systems. Then, a deep convolutional neural network that explicitly exploits the massive latent cues of the attenuation coefficients is learned on the training antenna systems. Finally, we use the trained deep convolutional neural network to classify the channel matrix labels of test antennas and select the optimal antenna subset. Simulation results demonstrate that our method achieves better performance than state-of-the-art baselines for data-driven wireless antenna selection.
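The labeling step described above (choosing the subset that optimizes a key performance indicator) can be sketched without any deep-learning machinery. The sketch below uses Shannon capacity as the indicator and exhaustively scores every antenna subset; all sizes and parameter values are illustrative assumptions, not the paper's settings:

```python
import numpy as np
from itertools import combinations

def label_antenna_subset(H, snr=10.0, k=2):
    """Return the k-antenna subset of H maximizing Shannon capacity.

    H is an (n_rx, n_tx) channel matrix of attenuation coefficients.
    Exhaustive search over subsets sketches how training labels could
    be generated; sizes and SNR here are illustrative assumptions.
    """
    n_rx, n_tx = H.shape
    best_subset, best_cap = None, -np.inf
    for subset in combinations(range(n_tx), k):
        Hs = H[:, list(subset)]
        # Capacity of the reduced system: log2 det(I + (snr/k) Hs Hs^H)
        cap = np.log2(np.linalg.det(np.eye(n_rx) + (snr / k) * Hs @ Hs.conj().T).real)
        if cap > best_cap:
            best_subset, best_cap = subset, cap
    return best_subset

# A random Rayleigh-fading channel: 4 receive antennas, 6 candidates.
rng = np.random.default_rng(0)
H = (rng.normal(size=(4, 6)) + 1j * rng.normal(size=(4, 6))) / np.sqrt(2)
print(label_antenna_subset(H))
```

A CNN would then be trained to predict these subset labels directly from H, avoiding the exhaustive search at selection time.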

  17. Solid Waste Program technical baseline description

    Energy Technology Data Exchange (ETDEWEB)

    Carlson, A.B.

    1994-07-01

    The system engineering approach has been taken to describe the technical baseline under which the Solid Waste Program is currently operating. The document contains a mission analysis, function analysis, system definition, documentation requirements, facility and project bases, and uncertainties facing the program.

  18. Parkinson’s Disease Severity at 3 Years Can Be Predicted from Non-Motor Symptoms at Baseline

    Directory of Open Access Journals (Sweden)

    Alba Ayala

    2017-10-01

    Full Text Available Objective: The aim of this study is to present a predictive model of Parkinson's disease (PD) global severity, measured with the Clinical Impression of Severity Index for Parkinson's Disease (CISI-PD). Methods: This is an observational, longitudinal study with annual follow-up assessments over 3 years (four time points). A multilevel analysis and multiple imputation techniques were performed to generate a predictive model that estimates changes in the CISI-PD at 1, 2, and 3 years. Results: The clinical state of patients (CISI-PD) significantly worsened over the 3-year follow-up. However, this change was of small magnitude (effect size: 0.44). The following baseline variables were significant predictors of the global severity change: baseline global severity of disease, levodopa equivalent dose, depression and anxiety symptoms, autonomic dysfunction, and cognitive state. The goodness-of-fit of the model was adequate, and the sensitivity analysis showed that the data imputation method applied was suitable. Conclusion: Disease progression depends more on the individual's baseline characteristics than on the 3-year time period. Results may contribute to a better understanding of the evolution of PD, including the non-motor manifestations of the disease.

  19. Stability analysis of geomagnetic baseline data obtained at Cheongyang observatory in Korea

    Science.gov (United States)

    Amran, Shakirah M.; Kim, Wan-Seop; Cho, Heh Ree; Park, Po Gyu

    2017-07-01

    The stability of baselines produced by Cheongyang (CYG) observatory from the period of 2014 to 2016 is analysed. Step heights of higher than 5 nT were found in H and Z components in 2014 and 2015 due to magnetic noise in the absolute-measurement hut. In addition, a periodic modulation behaviour observed in the H and Z baseline curves was related to annual temperature variation of about 20 °C in the fluxgate magnetometer hut. Improvement in data quality was evidenced by a small dispersion between successive measurements from June 2015 to the end of 2016. Moreover, the baseline was also improved by correcting the discontinuity in the H and Z baselines.

  20. Stability analysis of geomagnetic baseline data obtained at Cheongyang observatory in Korea

    Directory of Open Access Journals (Sweden)

    S. M. Amran

    2017-07-01

    Full Text Available The stability of baselines produced by Cheongyang (CYG) observatory from the period of 2014 to 2016 is analysed. Step heights of higher than 5 nT were found in H and Z components in 2014 and 2015 due to magnetic noise in the absolute-measurement hut. In addition, a periodic modulation behaviour observed in the H and Z baseline curves was related to annual temperature variation of about 20 °C in the fluxgate magnetometer hut. Improvement in data quality was evidenced by a small dispersion between successive measurements from June 2015 to the end of 2016. Moreover, the baseline was also improved by correcting the discontinuity in the H and Z baselines.

  1. Baseline development, economic risk, and schedule risk: An integrated approach

    International Nuclear Information System (INIS)

    Tonkinson, J.A.

    1994-01-01

    The economic and schedule risks of Environmental Restoration (ER) projects are commonly analyzed toward the end of the baseline development process. Risk analysis is usually performed as the final element of the scheduling or estimating processes for the purpose of establishing cost and schedule contingency. However, there is an opportunity for earlier assessment of risks, during development of the technical scope and Work Breakdown Structure (WBS). Integrating the processes of risk management and baselining provides for early incorporation of feedback regarding schedule and cost risk into the proposed scope of work. Much of the information necessary to perform risk analysis becomes available during development of the technical baseline, as the scope of work and WBS are being defined. The analysis of risk can actually be initiated early on during development of the technical baseline and continue throughout development of the complete project baseline. Indeed, best business practices suggest that information crucial to the success of a project be analyzed and incorporated into project planning as soon as it is available and usable.

  2. A long baseline global stereo matching based upon short baseline estimation

    Science.gov (United States)

    Li, Jing; Zhao, Hong; Li, Zigang; Gu, Feifei; Zhao, Zixin; Ma, Yueyang; Fang, Meiqi

    2018-05-01

    In global stereo vision, balancing matching efficiency and computing accuracy seems to be impossible because the two goals contradict each other. In the case of a long baseline, this contradiction becomes more prominent. To solve this problem, this paper proposes a novel idea to improve both the efficiency and accuracy of global stereo matching for a long baseline. First, reference images located between the long-baseline image pair are chosen to form new image pairs with short baselines. The relationship between the disparities of pixels in image pairs with different baselines is revealed by considering the quantization error, so that the disparity search range under the long baseline can be reduced by guidance of the short baseline to gain matching efficiency. This idea is then integrated into graph cuts (GCs) to form a multi-step GC algorithm based on short-baseline estimation, by which the disparity map under the long baseline can be calculated iteratively on the basis of the previous matching. Furthermore, image information from pixels that are non-occluded under the short baseline but occluded under the long baseline can be employed to improve the matching accuracy. Although the time complexity of the proposed method depends on the locations of the chosen reference images, it is usually much lower for long-baseline stereo matching than that of the traditional GC algorithm. Finally, the validity of the proposed method is examined by experiments on benchmark datasets. The results show that the proposed method is superior to the traditional GC method in terms of efficiency and accuracy, and thus is suitable for long-baseline stereo matching.
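The core idea, that a short-baseline disparity estimate bounds the long-baseline search range, can be sketched for rectified parallel cameras, where disparity scales linearly with baseline (d = f·B/Z). The function below is a hypothetical illustration only; the paper's multi-step graph-cut refinement is not shown:

```python
def long_baseline_search_range(d_short, b_short, b_long, margin_px=1.0):
    """Predict a reduced disparity search window for the long-baseline pair.

    For rectified parallel cameras d = f*B/Z, so a short-baseline
    disparity d_short predicts d_long ≈ d_short * b_long / b_short.
    The margin absorbs the quantization error of d_short.
    """
    ratio = b_long / b_short
    d_pred = d_short * ratio
    # A ±1-pixel uncertainty in d_short maps to ±ratio pixels at the long baseline.
    return int(d_pred - ratio * margin_px), int(d_pred + ratio * margin_px) + 1

# Illustrative numbers: baselines of 30 and 120 (arbitrary units).
lo, hi = long_baseline_search_range(d_short=12, b_short=30.0, b_long=120.0)
print(lo, hi)  # a 9-pixel window instead of the full disparity range
```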

  3. 1999 Baseline Sampling and Analysis Sampling Locations, Geographic NAD83, LOSCO (2004) [BSA_1999_sample_locations_LOSCO_2004

    Data.gov (United States)

    Louisiana Geographic Information Center — The monitor point data set was produced as a part of the Baseline Sampling and Analysis program coordinated by the Louisiana Oil Spill Coordinator's Office. This...

  4. 1997 Baseline Sampling and Analysis Sample Locations, Geographic NAD83, LOSCO (2004) [BSA_1997_sample_locations_LOSCO_2004

    Data.gov (United States)

    Louisiana Geographic Information Center — The monitor point data set was produced as a part of the Baseline Sampling and Analysis (BSA) program coordinated by the Louisiana Oil Spill Coordinator's Office....

  5. 1998 Baseline Sampling and Analysis Sampling Locations, Geographic NAD83, LOSCO (2004) [BSA_1998_sample_locations_LOSCO_2004

    Data.gov (United States)

    Louisiana Geographic Information Center — The monitor point data set was produced as a part of the Baseline Sampling and Analysis program coordinated by the Louisiana Oil Spill Coordinator's Office. This...

  6. Seismic analysis of piping systems subjected to multiple support excitations

    International Nuclear Information System (INIS)

    Sundararajan, C.; Vaish, A.K.; Slagis, G.C.

    1981-01-01

    The paper presents the results of a comparative study between the multiple response spectrum method and the time-history method for the seismic analysis of nuclear piping systems subjected to different excitations at different supports or support groups. First, the necessary equations for the above analysis procedures are derived. Then, three actual nuclear piping systems subjected to single and multiple excitations are analyzed by the different methods, and extensive comparisons of the results (stresses) are made. Based on the results, it is concluded that the multiple response spectrum analysis gives acceptable results as compared to the 'exact', but much more costly, time-history analysis. 6 refs.

  7. Long baseline neutrino oscillation experiments

    International Nuclear Information System (INIS)

    Gallagher, H.

    2006-01-01

    In this paper I will review briefly the experimental results which established the existence of neutrino mixing, the current generation of long baseline accelerator experiments, and the prospects for the future. In particular I will focus on the recent analysis of the MINOS experiment. (author)

  8. Failure analysis of multiple delaminated composite plates due

    Indian Academy of Sciences (India)

    The present work aims at the first ply failure analysis of laminated composite plates with arbitrarily located multiple delaminations subjected to transverse static load as well as impact. The theoretical formulation is based on a simple multiple delamination model. Conventional first order shear deformation is assumed using ...

  9. Item analysis of ADAS-Cog: effect of baseline cognitive impairment in a clinical AD trial.

    Science.gov (United States)

    Sevigny, Jeffrey J; Peng, Yahong; Liu, Lian; Lines, Christopher R

    2010-03-01

    We explored the association of Alzheimer's disease (AD) Assessment Scale (ADAS-Cog) item scores with AD severity using cross-sectional and longitudinal data from the same study. Post hoc analyses were performed using placebo data from a 12-month trial of patients with mild-to-moderate AD (N = 281 randomized, N = 209 completed). Baseline distributions of ADAS-Cog item scores by Mini-Mental State Examination (MMSE) score and Clinical Dementia Rating (CDR) sum of boxes score (measures of dementia severity) were estimated using local and nonparametric regressions. Mixed-effect models were used to characterize ADAS-Cog item score changes over time by dementia severity (MMSE: mild = 21-26, moderate = 14-20; global CDR: mild = 0.5-1, moderate = 2). In the cross-sectional analysis of baseline ADAS-Cog item scores, orientation was the most sensitive item to differentiate patients across levels of cognitive impairment. Several items showed a ceiling effect, particularly in milder AD. In the longitudinal analysis of change scores over 12 months, orientation was the only item with noticeable decline (8%-10%) in mild AD. Most items showed modest declines (5%-20%) in moderate AD.

  10. Probing neutrino oscillations jointly in long and very long baseline experiments

    International Nuclear Information System (INIS)

    Wang, Y.F.; Whisnant, K.; Young Binglin; Xiong Zhaohua; Yang Jinmin

    2002-01-01

    We examine the prospects of making a joint analysis of neutrino oscillations at two baselines with neutrino superbeams. Assuming narrow band superbeams and a 100 kiloton water Cherenkov calorimeter, we calculate the event rates and sensitivities to the matter effect, the signs of the neutrino mass differences, the CP phase, and the mixing angle θ 13 . Taking into account all possible experimental errors under general consideration, we explore the optimum cases of a narrow band beam to measure the matter effect and the CP violation effect at all baselines up to 3000 km. We then focus on two specific baselines, a long baseline of 300 km and a very long baseline of 2100 km, and analyze their joint capabilities. We find that the joint analysis can offer extra leverage to resolve some of the ambiguities that are associated with the measurement at a single baseline
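For intuition about how the two baselines probe the oscillation differently, the standard two-flavour vacuum approximation can be evaluated at both distances. The parameter values below are illustrative assumptions, not the paper's, and the matter effects central to its analysis are ignored:

```python
import math

def osc_prob(L_km, E_GeV, sin2_2theta=0.1, dm2_eV2=2.5e-3):
    """Two-flavour vacuum oscillation probability (sketch only).

    P = sin^2(2θ) · sin^2(1.267 · Δm²[eV²] · L[km] / E[GeV]).
    Mixing angle and mass splitting are illustrative assumptions.
    """
    return sin2_2theta * math.sin(1.267 * dm2_eV2 * L_km / E_GeV) ** 2

# Compare the appearance probability at the two baselines for a 1 GeV beam.
for L in (300, 2100):
    print(L, round(osc_prob(L, 1.0), 4))
```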

  11. Multiple fMRI system-level baseline connectivity is disrupted in patients with consciousness alterations.

    Science.gov (United States)

    Demertzi, Athena; Gómez, Francisco; Crone, Julia Sophia; Vanhaudenhuyse, Audrey; Tshibanda, Luaba; Noirhomme, Quentin; Thonnard, Marie; Charland-Verville, Vanessa; Kirsch, Murielle; Laureys, Steven; Soddu, Andrea

    2014-03-01

    In healthy conditions, group-level fMRI resting state analyses identify ten resting state networks (RSNs) of cognitive relevance. Here, we aim to assess the ten-network model in severely brain-injured patients suffering from disorders of consciousness and to identify those networks which will be most relevant to discriminate between patients and healthy subjects. 300 fMRI volumes were obtained in 27 healthy controls and 53 patients in minimally conscious state (MCS), vegetative state/unresponsive wakefulness syndrome (VS/UWS) and coma. Independent component analysis (ICA) reduced data dimensionality. The ten networks were identified by means of a multiple template-matching procedure and were tested on neuronality properties (neuronal vs non-neuronal) in a data-driven way. Univariate analyses detected between-group differences in networks' neuronal properties and estimated voxel-wise functional connectivity in the networks, which were significantly less identifiable in patients. A nearest-neighbor "clinical" classifier was used to determine the networks with high between-group discriminative accuracy. Healthy controls were characterized by more neuronal components compared to patients in VS/UWS and in coma. Compared to healthy controls, fewer patients in MCS and VS/UWS showed components of neuronal origin for the left executive control network, default mode network (DMN), auditory, and right executive control network. The "clinical" classifier indicated the DMN and auditory network with the highest accuracy (85.3%) in discriminating patients from healthy subjects. FMRI multiple-network resting state connectivity is disrupted in severely brain-injured patients suffering from disorders of consciousness. When performing ICA, multiple-network testing and control for neuronal properties of the identified RSNs can advance fMRI system-level characterization. Automatic data-driven patient classification is the first step towards future single-subject objective diagnostics.

  12. Measuring happiness in individuals with profound multiple disabilities.

    Science.gov (United States)

    Darling, Joseph A; Circo, Deborah K

    2015-12-01

    This quantitative study assessed whether presentation of preferred items and activities during multiple periods of the day (and over multiple days) increased indices of happiness (over time/sustained) in individuals with PMD. A multiple baseline design across participants was utilized to measure changes in indices of happiness of the participants. Participants were recruited from an adult day activity program specializing in providing assistance to individuals with disabilities. For Mary, baseline indices of happiness were 26.67% of intervals, increasing 6.76% during intervention to 33.43%. For Caleb, baseline indices of happiness were 20.84% of intervals, increasing 6.34% during intervention to 27.18%. For Mark, baseline indices of happiness were 40.00% of intervals, increasing 12.75% during intervention to 52.75%. Overall interobserver agreement was 82.8%, with interobserver agreement observations occurring during 63.04% of the observations. The results of the investigation demonstrated that presenting preferred items and activities increased the indices of happiness compared to baseline rates of indices of happiness. Results may have been more robust if the participants were assessed for overall responsiveness patterns prior to the initiation of measurement of indices of happiness. Copyright © 2015 Elsevier Ltd. All rights reserved.
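The interval percentages and interobserver agreement (IOA) figures reported above follow from simple interval-recording arithmetic. A minimal sketch, using hypothetical observation data rather than the study's records:

```python
def percent_intervals(observations):
    """Percentage of observation intervals scored as 'happy' (1) vs not (0)."""
    return 100 * sum(observations) / len(observations)

def interobserver_agreement(obs_a, obs_b):
    """Interval-by-interval IOA: agreements / total intervals * 100."""
    agree = sum(a == b for a, b in zip(obs_a, obs_b))
    return 100 * agree / len(obs_a)

# Hypothetical 10-interval session scored by two observers.
primary     = [1, 0, 1, 1, 0, 0, 1, 0, 0, 1]
reliability = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1]
print(percent_intervals(primary))                     # 50.0
print(interobserver_agreement(primary, reliability))  # 80.0
```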

  13. Analysis of factors affecting baseline SF-36 Mental Component Summary in Adult Spinal Deformity and its impact on surgical outcomes.

    Science.gov (United States)

    Mmopelwa, Tiro; Ayhan, Selim; Yuksel, Selcen; Nabiyev, Vugar; Niyazi, Asli; Pellise, Ferran; Alanay, Ahmet; Sanchez Perez Grueso, Francisco Javier; Kleinstuck, Frank; Obeid, Ibrahim; Acaroglu, Emre

    2018-03-01

    To identify the factors that affect the SF-36 mental component summary (MCS) in patients with adult spinal deformity (ASD) at the time of presentation, and to analyse the effect of SF-36 MCS on clinical outcomes in surgically treated patients. Prospectively collected data from a multicentric ASD database was analysed for baseline parameters. Then, the same database was analysed for surgically treated patients with a minimum of 1-year follow-up to see the effect of baseline SF-36 MCS on treatment results. A clinically useful SF-36 MCS cut-off was determined by ROC curve analysis. A total of 229 patients with the baseline parameters were analysed. A strong correlation between SF-36 MCS and SRS-22, ODI, gender, and diagnosis was found (p < 0.05). The factors affecting baseline SF-36 MCS in an ASD population are other HRQOL parameters such as SRS-22 and ODI, as well as baseline thoracic kyphosis and gender. This study has also demonstrated that baseline SF-36 MCS does not necessarily have any effect on the treatment results of surgery as assessed by SRS-22 or ODI. Level III, prognostic study. Copyright © 2018 Turkish Association of Orthopaedics and Traumatology. Production and hosting by Elsevier B.V. All rights reserved.

  14. Multiplicative calculus in biomedical image analysis

    NARCIS (Netherlands)

    Florack, L.M.J.; Assen, van H.C.

    2011-01-01

    We advocate the use of an alternative calculus in biomedical image analysis, known as multiplicative (a.k.a. non-Newtonian) calculus. It provides a natural framework in problems in which positive images or positive definite matrix fields and positivity preserving operators are of interest. Indeed,

  15. Large-baseline InSAR for precise topographic mapping: a framework for TanDEM-X large-baseline data

    Directory of Open Access Journals (Sweden)

    M. Pinheiro

    2017-09-01

    Full Text Available The global Digital Elevation Model (DEM) resulting from the TanDEM-X mission provides information about the world's topography with outstanding precision. In fact, performance analyses carried out with the already available data have shown that the global product is well within the requirements of 10 m absolute vertical accuracy and 2 m relative vertical accuracy for flat to moderate terrain. The mission's science phase took place from October 2014 to December 2015. During this phase, bistatic acquisitions with across-track separations between the two satellites of up to 3.6 km at the equator were commanded. Since the relative vertical accuracy of InSAR-derived elevation models is, in principle, inversely proportional to the system baseline, the TanDEM-X science phase opened the door to the generation of elevation models with improved quality with respect to the standard product. However, the interferometric processing of the large-baseline data is troublesome due to the increased volume decorrelation and the very high frequency of the phase variations. Hence, in order to fully profit from the increased baseline, sophisticated algorithms for the interferometric processing, and in particular for the phase unwrapping, have to be considered. This paper proposes a novel dual-baseline region-growing framework for the phase unwrapping of large-baseline interferograms. Results from two experiments with data from the TanDEM-X science phase are discussed, corroborating the expected increased level of detail of the large-baseline DEMs.
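The inverse relationship between baseline and relative vertical accuracy mentioned above is captured by the height of ambiguity, the height difference corresponding to one 2π interferometric fringe. A sketch using the single-pass bistatic form h_amb = λ·r·sin(θ)/B⊥, with rough illustrative numbers, not actual TanDEM-X system parameters:

```python
import math

def height_of_ambiguity(wavelength_m, slant_range_m, incidence_deg, b_perp_m):
    """Single-pass bistatic height of ambiguity: h = λ·r·sin(θ) / B⊥."""
    return (wavelength_m * slant_range_m
            * math.sin(math.radians(incidence_deg)) / b_perp_m)

# A tenfold larger perpendicular baseline shrinks the height of ambiguity
# tenfold: finer height sensitivity, but denser fringes to unwrap.
# X-band wavelength ~3.1 cm; range and incidence angle are assumptions.
for b_perp in (300.0, 3000.0):
    print(b_perp, round(height_of_ambiguity(0.031, 600e3, 35.0, b_perp), 1))
```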

  16. Approaches to Data Analysis of Multiple-Choice Questions

    Science.gov (United States)

    Ding, Lin; Beichner, Robert

    2009-01-01

    This paper introduces five commonly used approaches to analyzing multiple-choice test data. They are classical test theory, factor analysis, cluster analysis, item response theory, and model analysis. Brief descriptions of the goals and algorithms of these approaches are provided, together with examples illustrating their applications in physics…

  17. RAWS II: A MULTIPLE REGRESSION ANALYSIS PROGRAM,

    Science.gov (United States)

    This memorandum gives instructions for the use and operation of a revised version of RAWS, a multiple regression analysis program. The program...of preprocessed data, the directed retention of variables, listing of the matrix of the normal equations and its inverse, and the bypassing of the regression analysis to provide the input variable statistics only. (Author)
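The computation such a program performs, forming the matrix of the normal equations and its inverse, is the classical least-squares solution. A minimal sketch with synthetic data (in modern practice a solver is preferred over an explicit inverse):

```python
import numpy as np

# Multiple regression via the normal equations: form X'X, invert it,
# and solve for the coefficients. Data below is synthetic.
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(20), rng.normal(size=(20, 2))])  # intercept + 2 predictors
beta_true = np.array([2.0, 0.5, -1.0])
y = X @ beta_true + 0.01 * rng.normal(size=20)

XtX = X.T @ X                       # matrix of the normal equations
beta_hat = np.linalg.inv(XtX) @ X.T @ y
print(np.round(beta_hat, 2))        # close to [2.0, 0.5, -1.0]
```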

  18. Environmental Modeling, A goal of the Baseline Sampling and Analysis program is to determine baseline levels of select priority pollutants and petroleum markers in areas with high probability for oil spills., Published in 1999, 1:24000 (1in=2000ft) scale, Louisiana State University (LSU).

    Data.gov (United States)

    NSGIC Education | GIS Inventory — Environmental Modeling dataset current as of 1999. A goal of the Baseline Sampling and Analysis program is to determine baseline levels of select priority pollutants...

  19. Integrative Analysis of Prognosis Data on Multiple Cancer Subtypes

    Science.gov (United States)

    Liu, Jin; Huang, Jian; Zhang, Yawei; Lan, Qing; Rothman, Nathaniel; Zheng, Tongzhang; Ma, Shuangge

    2014-01-01

    Summary In cancer research, profiling studies have been extensively conducted, searching for genes/SNPs associated with prognosis. Cancer is diverse. Examining the similarity and difference in the genetic basis of multiple subtypes of the same cancer can lead to a better understanding of their connections and distinctions. Classic meta-analysis methods analyze each subtype separately and then compare analysis results across subtypes. Integrative analysis methods, in contrast, analyze the raw data on multiple subtypes simultaneously and can outperform meta-analysis methods. In this study, prognosis data on multiple subtypes of the same cancer are analyzed. An AFT (accelerated failure time) model is adopted to describe survival. The genetic basis of multiple subtypes is described using the heterogeneity model, which allows a gene/SNP to be associated with prognosis of some subtypes but not others. A compound penalization method is developed to identify genes that contain important SNPs associated with prognosis. The proposed method has an intuitive formulation and is realized using an iterative algorithm. Asymptotic properties are rigorously established. Simulation shows that the proposed method has satisfactory performance and outperforms a penalization-based meta-analysis method and a regularized thresholding method. An NHL (non-Hodgkin lymphoma) prognosis study with SNP measurements is analyzed. Genes associated with the three major subtypes, namely DLBCL, FL, and CLL/SLL, are identified. The proposed method identifies genes that are different from alternatives and have important implications and satisfactory prediction performance. PMID:24766212

  20. Baseline Tumor Size Is an Independent Prognostic Factor for Overall Survival in Patients With Melanoma Treated With Pembrolizumab.

    Science.gov (United States)

    Joseph, Richard W; Elassaiss-Schaap, Jeroen; Kefford, Richard F; Hwu, Wen-Jen; Wolchok, Jedd D; Joshua, Anthony Michael; Ribas, Antoni; Hodi, F Stephen; Hamid, Omid; Robert, Caroline; Daud, Adil I; Dronca, Roxana S; Hersey, Peter; Weber, Jeffrey S; Patnaik, Amita; de Alwis, Dinesh P; Perrone, Andrea M; Zhang, Jin; Kang, Soonmo Peter; Ebbinghaus, Scot W; Anderson, Keaven M; Gangadhar, Tara

    2018-04-23

    To assess the association of baseline tumor size (BTS) with other baseline clinical factors and outcomes in pembrolizumab-treated patients with advanced melanoma in KEYNOTE-001 (NCT01295827). BTS was quantified by adding the sum of the longest dimensions of all measurable baseline target lesions. BTS as a dichotomous and continuous variable was evaluated with other baseline factors using logistic regression for objective response rate (ORR) and Cox regression for overall survival (OS). Nominal P values with no multiplicity adjustment describe the strength of observed associations. Per central review by RECIST v1.1, 583 of 655 patients had baseline measurable disease and were included in this post hoc analysis. Median BTS was 10.2 cm (range, 1-89.5). Larger median BTS was associated with Eastern Cooperative Oncology Group performance status 1, elevated lactate dehydrogenase (LDH), stage M1c disease, and liver metastases (with or without any other sites) (all P ≤ 0.001). In univariate analyses, BTS below the median was associated with higher ORR (44% vs 23%; P < 0.001). In multivariate analyses, BTS below the median remained an independent prognostic marker of OS (P < 0.001). BTS below the median and PD-L1-positive tumors were independently associated with higher ORR and longer OS. BTS is associated with many other baseline clinical factors but is also independently prognostic of survival in pembrolizumab-treated patients with advanced melanoma. Copyright ©2018, American Association for Cancer Research.

  1. Wide-Baseline Stereo-Based Obstacle Mapping for Unmanned Surface Vehicles

    Science.gov (United States)

    Mou, Xiaozheng; Wang, Han

    2018-01-01

    This paper proposes a wide-baseline stereo-based static obstacle mapping approach for unmanned surface vehicles (USVs). The proposed approach eliminates the complicated calibration work and the bulky rig of our previous binocular stereo system, and raises the ranging ability from 500 to 1000 m with an even larger baseline obtained from the motion of the USV. By integrating a monocular camera with GPS and compass information, the world locations of the detected static obstacles are reconstructed while the USV is traveling, and an obstacle map is then built. To achieve more accurate and robust performance, multiple pairs of frames are leveraged to synthesize the final reconstruction results in a weighting model. Experimental results based on our own dataset demonstrate the high efficiency of our system. To the best of our knowledge, we are the first to address the task of wide-baseline stereo-based obstacle mapping in a maritime environment. PMID:29617293
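
    The ranging claim can be sanity-checked with the standard pinhole-stereo relation. A minimal sketch follows; the focal length, baseline, and disparity values are illustrative, not taken from the paper, which uses its own calibration and weighting model:

```python
def depth_from_disparity(f_px, baseline_m, disparity_px):
    # pinhole stereo: Z = f * B / d
    return f_px * baseline_m / disparity_px

def range_error(f_px, baseline_m, z_m, disp_err_px=1.0):
    # linearized error: |dZ| ~ Z^2 / (f * B) * |dd|,
    # so doubling the baseline halves the range error at a given distance
    return z_m ** 2 / (f_px * baseline_m) * disp_err_px
```

    This is why obtaining an even larger baseline from vehicle motion directly extends the usable ranging distance.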

  2. MALDI-TOF Baseline Drift Removal Using Stochastic Bernstein Approximation

    Directory of Open Access Journals (Sweden)

    Howard Daniel

    2006-01-01

    Full Text Available Stochastic Bernstein (SB) approximation can tackle the problem of baseline drift correction of instrumentation data. This is demonstrated for spectral data: matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF) data. Two SB schemes for removing the baseline drift are presented: iterative and direct. Following an explanation of the origin of the MALDI-TOF baseline drift, which sheds light on the inherent difficulty of its removal by chemical means, SB baseline drift removal is illustrated for both proteomics and genomics MALDI-TOF data sets. SB is an elegant signal processing approach that yields a numerically straightforward baseline removal method, as it includes a free parameter that can be optimized for different baseline drift removal applications. Therefore, research that determines putative biomarkers from the spectral data might benefit from a sensitivity analysis to the underlying spectral measurement, made possible by varying the SB free parameter, which can be tuned manually or with evolutionary computation.
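
    The iterative scheme can be sketched with an ordinary (deterministic) Bernstein approximation; the stochastic variant and the paper's exact parameterization are not reproduced here, and the control-point count `n_ctrl` stands in for the free parameter:

```python
from math import comb

def bernstein(ys, x):
    # Bernstein approximation of control samples ys on [0, 1], evaluated at x
    n = len(ys) - 1
    return sum(y * comb(n, k) * x**k * (1 - x)**(n - k) for k, y in enumerate(ys))

def estimate_baseline(spectrum, n_ctrl=8, n_iter=20):
    # iterative drift estimation: fit a low-order Bernstein curve, then clamp
    # the working signal to it so peaks stop pulling the next fit upward
    m = len(spectrum)
    work = list(spectrum)
    for _ in range(n_iter):
        ctrl = [work[round(k * (m - 1) / n_ctrl)] for k in range(n_ctrl + 1)]
        est = [bernstein(ctrl, i / (m - 1)) for i in range(m)]
        work = [min(w, e) for w, e in zip(work, est)]
    return est
```

    Subtracting the returned estimate from the spectrum removes the drift; varying `n_ctrl` plays the role of the free parameter discussed above.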

  3. The Baselines Project: Establishing Reference Environmental Conditions for Marine Habitats in the Gulf of Mexico using Forecast Models and Satellite Data

    Science.gov (United States)

    Jolliff, J. K.; Gould, R. W.; deRada, S.; Teague, W. J.; Wijesekera, H. W.

    2012-12-01

    We provide an overview of the NASA-funded project, "High-Resolution Subsurface Physical and Optical Property Fields in the Gulf of Mexico: Establishing Baselines and Assessment Tools for Resource Managers." Data assimilative models, analysis fields, and multiple satellite data streams were used to construct temperature and photon flux climatologies for the Flower Garden Banks National Marine Sanctuary (FGBNMS) and similar habitats in the northwestern Gulf of Mexico where geologic features provide a platform for unique coral reef ecosystems. Comparison metrics of the products to in situ data collected during complementary projects are also examined. Similarly, high-resolution satellite data streams and advanced processing techniques were used to establish baseline suspended sediment load and turbidity conditions in selected northern Gulf of Mexico estuaries. The results demonstrate the feasibility of blending models and data into accessible web-based analysis products for resource managers, policy makers, and the public.

  4. Parent-administered computer-assisted tutoring targeting letter-sound knowledge: Evaluation via multiple-baseline across three preschool students.

    Science.gov (United States)

    DuBois, Matthew R; Volpe, Robert J; Burns, Matthew K; Hoffman, Jessica A

    2016-12-01

    Knowledge of letter sounds has been identified as a primary objective of preschool instruction and intervention. Despite this designation, large disparities exist in the number of letter sounds children know at school entry. Enhancing caregivers' ability to teach their preschool-aged children letter sounds may represent an effective practice for reducing this variability and ensuring that more children are prepared to experience early school success. This study used a non-concurrent multiple-baseline-across-participants design to evaluate the effectiveness of caregivers (N=3) delivering a computer-assisted tutoring program (Tutoring Buddy) targeting letter sound knowledge to their preschool-aged children. Visual analyses and effect size estimates derived from Percentage of All Non-Overlapping Data (PAND) statistics indicated consistent results for letter sound acquisition, as 6 weeks of intervention yielded large effects for letter sound knowledge (LSK) across all three children. Large effect sizes were also found for letter sound fluency (LSF) and nonsense word fluency (NWF) for two children. All three caregivers rated the intervention as highly usable and were able to administer it with high levels of fidelity. Taken together, the results of the present study found Tutoring Buddy to be an effective, simple, and usable way for the caregivers to support their children's literacy development. Copyright © 2016 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.

  5. Trace element analysis of environmental samples by multiple prompt gamma-ray analysis method

    International Nuclear Information System (INIS)

    Oshima, Masumi; Matsuo, Motoyuki; Shozugawa, Katsumi

    2011-01-01

    The multiple γ-ray detection method has been proved to be a high-resolution and high-sensitivity method in application to nuclide quantification. The neutron prompt γ-ray analysis method is successfully extended by combining it with the γ-ray detection method, which is called Multiple prompt γ-ray analysis, MPGA. In this review we show the principle of this method and its characteristics. Several examples of its application to environmental samples, especially river sediments in the urban area and sea sediment samples are also described. (author)

  6. A simple technique investigating baseline heterogeneity helped to eliminate potential bias in meta-analyses.

    Science.gov (United States)

    Hicks, Amy; Fairhurst, Caroline; Torgerson, David J

    2018-03-01

    To perform a worked example of an approach that can be used to identify and remove potentially biased trials from meta-analyses via the analysis of baseline variables. True randomisation produces treatment groups that differ only by chance; therefore, a meta-analysis of a baseline measurement should produce no overall difference and zero heterogeneity. A meta-analysis from the British Medical Journal, known to contain significant heterogeneity and imbalance in baseline age, was chosen. Meta-analyses of baseline variables were performed and trials systematically removed, starting with those with the largest t-statistic, until the I² measure of heterogeneity became 0%; the outcome meta-analysis was then repeated with only the remaining trials as a sensitivity check. We argue that heterogeneity in a meta-analysis of baseline variables should not exist, and therefore removing trials which contribute to heterogeneity from a meta-analysis will produce a more valid result. In our example, none of the overall outcomes changed when studies contributing to heterogeneity were removed. We recommend routine use of this technique, using age and a second baseline variable predictive of outcome for the particular study chosen, to help eliminate potential bias in meta-analyses. Copyright © 2017 Elsevier Inc. All rights reserved.
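
    The pruning procedure described above can be sketched as follows (effect sizes are illustrative; I² is computed in the usual (Q − df)/Q form from Cochran's Q):

```python
def i_squared(effects, ses):
    # Higgins' I^2 from Cochran's Q for inverse-variance weighted effects
    w = [1.0 / s**2 for s in ses]
    pooled = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - pooled)**2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    return max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0

def prune_to_zero_heterogeneity(trials):
    # trials: (name, baseline mean difference, SE); repeatedly drop the trial
    # with the largest |t| = |diff / SE| until the baseline meta-analysis
    # shows I^2 = 0%
    trials = list(trials)
    while len(trials) > 2 and i_squared([t[1] for t in trials],
                                        [t[2] for t in trials]) > 0:
        trials.remove(max(trials, key=lambda t: abs(t[1] / t[2])))
    return trials
```

    With a single grossly imbalanced trial among otherwise balanced ones, only that trial is removed and the remainder shows zero baseline heterogeneity.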

  7. Approaches to data analysis of multiple-choice questions

    OpenAIRE

    Lin Ding; Robert Beichner

    2009-01-01

    This paper introduces five commonly used approaches to analyzing multiple-choice test data. They are classical test theory, factor analysis, cluster analysis, item response theory, and model analysis. Brief descriptions of the goals and algorithms of these approaches are provided, together with examples illustrating their applications in physics education research. We minimize mathematics, instead placing emphasis on data interpretation using these approaches.

  8. Meta-analysis of prediction model performance across multiple studies: Which scale helps ensure between-study normality for the C-statistic and calibration measures?

    Science.gov (United States)

    Snell, Kym Ie; Ensor, Joie; Debray, Thomas Pa; Moons, Karel Gm; Riley, Richard D

    2017-01-01

    If individual participant data are available from multiple studies or clusters, then a prediction model can be externally validated multiple times. This allows the model's discrimination and calibration performance to be examined across different settings. Random-effects meta-analysis can then be used to quantify overall (average) performance and heterogeneity in performance. This typically assumes a normal distribution of 'true' performance across studies. We conducted a simulation study to examine this normality assumption for various performance measures relating to a logistic regression prediction model. We simulated data across multiple studies with varying degrees of variability in baseline risk or predictor effects and then evaluated the shape of the between-study distribution in the C-statistic, calibration slope, calibration-in-the-large, and E/O statistic, and possible transformations thereof. We found that a normal between-study distribution was usually reasonable for the calibration slope and calibration-in-the-large; however, the distributions of the C-statistic and E/O were often skewed across studies, particularly in settings with large variability in the predictor effects. Normality was vastly improved when using the logit transformation for the C-statistic and the log transformation for E/O, and therefore we recommend these scales to be used for meta-analysis. An illustrated example is given using a random-effects meta-analysis of the performance of QRISK2 across 25 general practices.
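
    The recommended transformations can be sketched as follows. The pooling here is a plain unweighted mean on the transformed scale, standing in for a full inverse-variance random-effects fit:

```python
import math

def logit(p):
    return math.log(p / (1.0 - p))

def inv_logit(x):
    return 1.0 / (1.0 + math.exp(-x))

def pooled_c_statistic(cs):
    # pool C-statistics on the logit scale, then back-transform
    z = [logit(c) for c in cs]
    return inv_logit(sum(z) / len(z))

def pooled_e_over_o(eos):
    # pool E/O on the log scale, then back-transform (a geometric mean)
    return math.exp(sum(math.log(e) for e in eos) / len(eos))
```

    Working on these scales keeps the between-study distribution closer to normal, which is what the random-effects model assumes.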

  9. Can persons with a history of multiple addiction treatment episodes benefit from technology delivered behavior therapy? A moderating role of treatment history at baseline.

    Science.gov (United States)

    Kim, Sunny Jung; Marsch, Lisa A; Acosta, Michelle C; Guarino, Honoria; Aponte-Melendez, Yesenia

    2016-03-01

    A growing line of research has shown positive treatment outcomes from technology-based therapy for substance use disorders (SUDs). However, little is known about the effectiveness of technology-based SUD interventions for persons who already had numerous prior SUD treatments. We conducted a secondary analysis on a 12-month trial with patients (N=160) entering methadone maintenance treatment (MMT). Patients were randomly assigned to either standard MMT treatment or a model in which half of standard counseling sessions were replaced with a computer-based intervention, called Therapeutic Education System (standard+TES). Four treatment history factors at baseline, the number of lifetime SUD treatment episodes, detoxification episodes, and inpatient/outpatient treatment episodes were categorized into three levels based on their tertile points, and analyzed as moderators. Dependent variables were urine toxicology results for opioid and cocaine abstinence for 52-weeks. The standard+TES condition produced significantly better opioid abstinence than standard treatment for participants with 1) a moderate or high frequency of lifetime SUD treatment episodes, and 2) those with all three levels (low, moderate and high) of detoxification and inpatient/outpatient treatment episodes, pshistory, pstechnology-based behavioral therapy as part of treatment can be more effective than MMT alone, even among patients with a history of multiple addiction treatment episodes. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Forecasting Sensorimotor Adaptability from Baseline Inter-Trial Correlations

    Science.gov (United States)

    Beaton, K. H.; Bloomberg, J. J.

    2014-01-01

    One of the greatest challenges surrounding adaptation to the spaceflight environment is the large variability in symptoms, and corresponding functional impairments, from one crewmember to the next. This renders preflight training and countermeasure development difficult, as a "one-size-fits-all" approach is inappropriate. Therefore, it would be highly advantageous to know ahead of time which crewmembers might have more difficulty adjusting to the novel g-levels inherent to spaceflight. Such knowledge could guide individually customized countermeasures, which would enable more efficient use of crew time, both preflight and inflight, and provide better outcomes. The primary goal of this project is to look for a baseline performance metric that can forecast sensorimotor adaptability without exposure to an adaptive stimulus. We propose a novel hypothesis that considers baseline inter-trial correlations, the trial-to-trial fluctuations in motor performance, as a predictor of individual sensorimotor adaptive capabilities. To date, a strong relationship has been found between baseline inter-trial correlations and adaptability in two oculomotor systems. For this project, we will explore an analogous predictive mechanism in the locomotion system. METHODS: Baseline Inter-trial Correlations: Inter-trial correlations specify the relationships among repeated trials of a given task that transpire as a consequence of correcting for previous performance errors over multiple timescales. We can quantify the strength of inter-trial correlations by measuring the decay of the autocorrelation function (ACF), which describes how rapidly information from past trials is "forgotten." Processes whose ACFs decay more slowly exhibit longer-term inter-trial correlations (longer memory processes), while processes whose ACFs decay more rapidly exhibit shorter-term inter-trial correlations (shorter memory processes). Longer-term correlations reflect low-frequency activity, which is more easily
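
    The ACF-decay measure described above can be sketched as follows; the 1/e cutoff is one illustrative way to summarize how quickly the ACF decays, not necessarily the project's exact metric:

```python
import math

def autocorr(xs, lag):
    # sample autocorrelation of a trial-to-trial performance series
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs)
    return sum((xs[i] - mean) * (xs[i + lag] - mean) for i in range(n - lag)) / var

def acf_decay_lag(xs, threshold=1 / math.e):
    # first lag at which the ACF drops below 1/e: a rough "memory length";
    # slower decay = longer-term inter-trial correlations (longer memory)
    for lag in range(1, len(xs)):
        if autocorr(xs, lag) < threshold:
            return lag
    return len(xs)
```

    A slowly drifting series keeps a long memory; an alternating series forgets immediately.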

  11. An Automated Baseline Correction Method Based on Iterative Morphological Operations.

    Science.gov (United States)

    Chen, Yunliang; Dai, Liankui

    2018-05-01

    Raman spectra usually suffer from baseline drift caused by fluorescence or other reasons. Therefore, baseline correction is a necessary and crucial step that must be performed before subsequent processing and analysis of Raman spectra. An automated baseline correction method based on iterative morphological operations is proposed in this work. The method first adaptively determines the structuring element and then gradually removes the spectral peaks during iteration to obtain an estimated baseline. Experiments on simulated data and real-world Raman data show that the proposed method is accurate, fast, and flexible for handling different kinds of baselines in various practical situations. The comparison of the proposed method with some state-of-the-art baseline correction methods demonstrates its advantages over the existing methods in terms of accuracy, adaptability, and flexibility. Although only Raman spectra are investigated in this paper, the proposed method holds promise for the baseline correction of other analytical instrument signals, such as IR spectra and chromatograms.
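
    The core of such a method can be sketched with a plain greyscale opening; the paper's adaptive structuring-element selection is replaced here by a fixed width `w`:

```python
def erode(xs, w):
    half = w // 2
    return [min(xs[max(0, i - half):i + half + 1]) for i in range(len(xs))]

def dilate(xs, w):
    half = w // 2
    return [max(xs[max(0, i - half):i + half + 1]) for i in range(len(xs))]

def morphological_baseline(spectrum, w=5, n_iter=10):
    # iterative opening (erosion then dilation): peaks narrower than the
    # structuring element are flattened, leaving the slowly varying baseline
    base = list(spectrum)
    for _ in range(n_iter):
        opened = dilate(erode(base, w), w)
        base = [min(b, o) for b, o in zip(base, opened)]
    return base
```

    Subtracting the result from the spectrum leaves the peaks on a flat baseline.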

  12. Multiple-Group Analysis Using the sem Package in the R System

    Science.gov (United States)

    Evermann, Joerg

    2010-01-01

    Multiple-group analysis in covariance-based structural equation modeling (SEM) is an important technique to ensure the invariance of latent construct measurements and the validity of theoretical models across different subpopulations. However, not all SEM software packages provide multiple-group analysis capabilities. The sem package for the R…

  13. Baseline Body Mass Index and the Efficacy of Hypoglycemic Treatment in Type 2 Diabetes: A Meta-Analysis

    Science.gov (United States)

    Cai, Xiaoling; Yang, Wenjia; Gao, Xueying; Zhou, Lingli; Han, Xueyao; Ji, Linong

    2016-01-01

    Aim The aim of this study is to compare the effects of hypoglycemic treatments in groups of patients categorized according to mean baseline body mass index (BMI). Methods Studies were identified by a literature search; all were double-blind, placebo-controlled randomized trials in type 2 diabetes patients with a study length of ≥12 weeks, with efficacy evaluated by changes in HbA1c from baseline in each group. The electronic search was first conducted in January 2015 and repeated in June 2015. Results 227 studies were included. Treatment with sulfonylureas compared with placebo resulted in a significantly greater change in HbA1c levels in overweight patients (weighted mean difference (WMD), −1.39%) than in obese patients (WMD, −0.77%) (p<0.05). Treatment with alpha glucosidase inhibitors in normal weight patients was associated with an HbA1c change (WMD, −0.94%) comparable to that in overweight (WMD, −0.72%) and obese patients (WMD, −0.56%) (p>0.05). Treatment with thiazolidinediones in normal weight patients was associated with an HbA1c change (WMD, −1.04%) comparable to that in overweight (WMD, −1.02%) and obese patients (WMD, −0.88%) (p>0.05). Treatment with DPP-4 inhibitors in normal weight patients was associated with an HbA1c change (WMD, −0.93%) comparable to that in overweight (WMD, −0.66%) and obese patients (WMD, −0.61%) (p>0.05). In total, across the seven hypoglycemic agents, regression analysis indicated that mean baseline BMI was not associated with mean HbA1c changes from baseline. Conclusion For each kind of hypoglycemic therapy in type 2 diabetes, baseline BMI was not associated with the efficacy of HbA1c changes from baseline. PMID:27935975

  14. Analysis of multiple scattering effects in optical Doppler tomography

    DEFF Research Database (Denmark)

    Yura, H.T.; Thrane, L.; Andersen, Peter E.

    2005-01-01

    Optical Doppler tomography (ODT) combines Doppler velocimetry and optical coherence tomography (OCT) to obtain high-resolution cross-sectional imaging of particle flow velocity in scattering media such as the human retina and skin. Here, we present the results of a theoretical analysis of ODT where multiple scattering effects are included. The purpose of this analysis is to determine how multiple scattering affects the estimation of the depth-resolved localized flow velocity. Depth-resolved velocity estimates are obtained directly from the corresponding mean or standard deviation of the observed

  15. Approaches to data analysis of multiple-choice questions

    Directory of Open Access Journals (Sweden)

    Lin Ding

    2009-09-01

    Full Text Available This paper introduces five commonly used approaches to analyzing multiple-choice test data. They are classical test theory, factor analysis, cluster analysis, item response theory, and model analysis. Brief descriptions of the goals and algorithms of these approaches are provided, together with examples illustrating their applications in physics education research. We minimize mathematics, instead placing emphasis on data interpretation using these approaches.

  16. Multiple scattering problems in heavy ion elastic recoil detection analysis

    International Nuclear Information System (INIS)

    Johnston, P.N.; El Bouanani, M.; Stannard, W.B.; Bubb, I.F.; Cohen, D.D.; Dytlewski, N.; Siegele, R.

    1998-01-01

    A number of groups use Heavy Ion Elastic Recoil Detection Analysis (HIERDA) to study materials science problems. Nevertheless, there is no standard methodology for the analysis of HIERDA spectra. To overcome this deficiency we have been establishing codes for 2-dimensional data analysis. A major problem involves the effects of multiple and plural scattering which are very significant, even for quite thin (∼100 nm) layers of the very heavy elements. To examine the effects of multiple scattering we have made comparisons between the small-angle model of Sigmund et al. and TRIM calculations. (authors)

  17. Analysis of (n, 2n) multiplication in lead

    International Nuclear Information System (INIS)

    Segev, M.

    1984-01-01

    Lead is being considered as a possible amplifier of neutrons for fusion blankets. A simple one-group model of neutron multiplication in Pb is presented. Given the 14 MeV neutron cross section on Pb, the model predicts the multiplication. Given measured multiplications, the model enables the determination of the (n, 2n) and transport cross sections. Required for the model are P, the collision probability for source neutrons in the Pb body, and W, an average collision probability for non-virgin, non-degraded neutrons. In simple geometries, such as a source in the center of a spherical shell, P and an approximate W can be expressed analytically in terms of shell dimensions and the Pb transport cross section. The model was applied to Takahashi's measured multiplications in Pb shells in order to understand the apparent very high multiplicative power of Pb. The results of the analysis are not consistent with basic energy-balance and cross section magnitude constraints in neutron interaction theory. (author)

  18. The multiple imputation method: a case study involving secondary data analysis.

    Science.gov (United States)

    Walani, Salimah R; Cleland, Charles M

    2015-05-01

    To illustrate with the example of a secondary data analysis study the use of the multiple imputation method to replace missing data. Most large public datasets have missing data, which need to be handled by researchers conducting secondary data analysis studies. Multiple imputation is a technique widely used to replace missing values while preserving the sample size and sampling variability of the data. The 2004 National Sample Survey of Registered Nurses. The authors created a model to impute missing values using the chained equation method. They used imputation diagnostics procedures and conducted regression analysis of imputed data to determine the differences between the log hourly wages of internationally educated and US-educated registered nurses. The authors used multiple imputation procedures to replace missing values in a large dataset with 29,059 observations. Five multiple imputed datasets were created. Imputation diagnostics using time series and density plots showed that imputation was successful. The authors also present an example of the use of multiple imputed datasets to conduct regression analysis to answer a substantive research question. Multiple imputation is a powerful technique for imputing missing values in large datasets while preserving the sample size and variance of the data. Even though the chained equation method involves complex statistical computations, recent innovations in software and computation have made it possible for researchers to conduct this technique on large datasets. The authors recommend nurse researchers use multiple imputation methods for handling missing data to improve the statistical power and external validity of their studies.
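
    A toy version of the idea, with a single incomplete variable imputed from a regression plus a residual draw, and the per-dataset means pooled across m imputations (the survey's actual chained-equations model involves many variables and full Rubin's-rules variance pooling):

```python
import random

def fit_line(xs, ys):
    # ordinary least squares for y = a + b*x
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def impute_once(xs, ys, rng):
    # regress observed y on x, then fill each missing y with prediction + noise
    obs = [(x, y) for x, y in zip(xs, ys) if y is not None]
    a, b = fit_line([x for x, _ in obs], [y for _, y in obs])
    resid = [y - (a + b * x) for x, y in obs]
    sd = (sum(r * r for r in resid) / max(1, len(resid) - 2)) ** 0.5
    return [y if y is not None else a + b * x + rng.gauss(0, sd)
            for x, y in zip(xs, ys)]

def multiple_imputation_mean(xs, ys, m=5, seed=0):
    # pool the mean of y over m imputed datasets (Rubin-style point estimate)
    rng = random.Random(seed)
    means = [sum(imp) / len(imp)
             for imp in (impute_once(xs, ys, rng) for _ in range(m))]
    return sum(means) / m
```

    Drawing a residual for each imputation, rather than plugging in the bare prediction, is what preserves the sampling variability that single imputation destroys.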

  19. Localizing gravitational wave sources with single-baseline atom interferometers

    Science.gov (United States)

    Graham, Peter W.; Jung, Sunghoon

    2018-02-01

    Localizing sources on the sky is crucial for realizing the full potential of gravitational waves for astronomy, astrophysics, and cosmology. We show that the midfrequency band, roughly 0.03 to 10 Hz, has significant potential for angular localization. The angular location is measured through the changing Doppler shift as the detector orbits the Sun. This band maximizes the effect since these are the highest frequencies in which sources live for several months. Atom interferometer detectors can observe in the midfrequency band, and even with just a single baseline they can exploit this effect for sensitive angular localization. The single-baseline orbits around the Earth and the Sun, causing it to reorient and change position significantly during the lifetime of the source, and making it similar to having multiple baselines/detectors. For example, atomic detectors could predict the location of upcoming black hole or neutron star merger events with sufficient accuracy to allow optical and other electromagnetic telescopes to observe these events simultaneously. Thus, midband atomic detectors are complementary to other gravitational wave detectors and will help complete the observation of a broad range of the gravitational spectrum.
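
    The size of the effect is easy to estimate: the annual Doppler modulation has fractional amplitude v_orbit/c ≈ 1e-4. A back-of-envelope sketch (the detector's actual motion combines Earth and Sun orbits; only the heliocentric term is shown, and the phase convention is illustrative):

```python
import math

C = 299_792_458.0    # speed of light, m/s
V_ORBIT = 29_780.0   # Earth's mean orbital speed, m/s

def doppler_shift(f0_hz, t_yr, source_lon_rad):
    # observed shift ~ f0 * (v/c) * cos(orbital phase - source longitude);
    # the phase of this annual modulation encodes the source's sky position
    return f0_hz * (V_ORBIT / C) * math.cos(2 * math.pi * t_yr - source_lon_rad)
```

    At 1 Hz the peak shift is about 0.1 mHz, which a months-long observation in the 0.03-10 Hz band can resolve, and which is why long-lived midband sources localize well.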

  20. Accounting for baseline differences and measurement error in the analysis of change over time.

    Science.gov (United States)

    Braun, Julia; Held, Leonhard; Ledergerber, Bruno

    2014-01-15

    If change over time is compared in several groups, it is important to take into account baseline values so that the comparison is carried out under the same preconditions. As the observed baseline measurements are distorted by measurement error, it may not be sufficient to include them as covariate. By fitting a longitudinal mixed-effects model to all data including the baseline observations and subsequently calculating the expected change conditional on the underlying baseline value, a solution to this problem has been provided recently so that groups with the same baseline characteristics can be compared. In this article, we present an extended approach where a broader set of models can be used. Specifically, it is possible to include any desired set of interactions between the time variable and the other covariates, and also, time-dependent covariates can be included. Additionally, we extend the method to adjust for baseline measurement error of other time-varying covariates. We apply the methodology to data from the Swiss HIV Cohort Study to address the question if a joint infection with HIV-1 and hepatitis C virus leads to a slower increase of CD4 lymphocyte counts over time after the start of antiretroviral therapy. Copyright © 2013 John Wiley & Sons, Ltd.

  1. A comparison of the effects of visual deprivation and regular body weight support treadmill training on improving over-ground walking of stroke patients: a multiple baseline single subject design.

    Science.gov (United States)

    Kim, Jeong-Soo; Kang, Sun-Young; Jeon, Hye-Seon

    2015-01-01

    The body-weight-support treadmill (BWST) is commonly used for gait rehabilitation, but other forms of BWST are in development, such as visual-deprivation BWST (VDBWST). In this study, we compare the effect of VDBWST training and conventional BWST training on spatiotemporal gait parameters for three individuals who had hemiparetic strokes. We used a single-subject experimental design, alternating multiple baselines across the individuals. We recruited three individuals with hemiparesis from stroke; two on the left side and one on the right. For the main outcome measures we assessed spatiotemporal gait parameters using GAITRite, including: gait velocity; cadence; step time of the affected side (STA); step time of the non-affected side (STN); step length of the affected side (SLA); step length of the non-affected side (SLN); step-time asymmetry (ST-asymmetry); and step-length asymmetry (SL-asymmetry). Gait velocity, cadence, SLA, and SLN increased from baseline after both interventions, but STA, ST-asymmetry, and SL-asymmetry decreased from the baseline after the interventions. The VDBWST was significantly more effective than the BWST for increasing gait velocity and cadence and for decreasing ST-asymmetry. VDBWST is more effective than BWST for improving gait performance during the rehabilitation for ground walking.
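
    The asymmetry outcomes above are ratios of the two sides' step parameters; a sketch using one common convention (the paper's exact formula is not stated in the abstract):

```python
def asymmetry_index(affected, non_affected):
    # illustrative convention: |a - n| / (a + n);
    # 0 means perfect left-right symmetry, larger values mean worse asymmetry
    return abs(affected - non_affected) / (affected + non_affected)
```

    For example, equal step times on both sides give an index of 0, while a 0.8 s versus 0.4 s split gives 1/3; a fall in this index after training corresponds to the reduced ST-asymmetry reported above.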

  2. The influence of coping styles on long-term employment in multiple sclerosis: A prospective study.

    Science.gov (United States)

    Grytten, Nina; Skår, Anne Br; Aarseth, Jan Harald; Assmus, Jorg; Farbu, Elisabeth; Lode, Kirsten; Nyland, Harald I; Smedal, Tori; Myhr, Kjell Morten

    2017-06-01

    The aim was to investigate predictive values of coping styles, clinical factors, and demographic factors on time to unemployment in patients diagnosed with multiple sclerosis (MS) during 1998-2002 in Norway. All patients (N = 108) diagnosed with MS 1998-2002 in Hordaland and Rogaland counties, Western Norway, were invited to participate in the long-term follow-up study in 2002. Baseline recordings included disability scoring (Expanded Disability Status Scale (EDSS)), fatigue (Fatigue Severity Scale (FSS)), depression (Beck Depression Inventory (BDI)), and a questionnaire assessing coping (the Dispositional Coping Styles Scale (COPE)). Logistic regression analysis was used to identify factors associated with being unemployed at baseline, and Cox regression analysis to identify factors at baseline associated with time to unemployment during follow-up. In all, 41 (44%) were employed at baseline. After 13 years of follow-up in 2015, with a mean disease duration of 22 years, 16 (17%) were still employed. Median time from baseline to unemployment was 6 years (±5). Older age at diagnosis, female gender, and depression were associated with being unemployed at baseline. Female gender, long disease duration, and denial as an avoidant coping strategy at baseline predicted shorter time to unemployment. Avoidant coping style, female gender, and longer disease duration were associated with shorter time to unemployment. These factors should be considered when advising patients on MS and future employment.

  3. Baseline cerebral oximetry values depend on non-modifiable patient characteristics.

    Science.gov (United States)

    Valencia, Lucía; Rodríguez-Pérez, Aurelio; Ojeda, Nazario; Santana, Romen Yone; Morales, Laura; Padrón, Oto

    2015-12-01

    The aim of the present study was to evaluate baseline regional cerebral oxygen saturation (rSO2) values and identify factors influencing preoperative rSO2 in elective minor surgery. Observational post-hoc analysis of data for the patient sample (n=50) of a previously conducted clinical trial in patients undergoing tumourectomy for breast cancer or inguinal hernia repair. Exclusion criteria included pre-existing cerebrovascular diseases, anaemia, and low baseline pulse oximetry. Baseline rSO2 values were recorded while the patient breathed room air, using the INVOS 5100C monitor™ (Covidien, Dublin, Ireland). Thirty-seven women (72%) and 13 men (28%), 48 ± 13 years of age, were enrolled in this study. Baseline rSO2 was 62.01 ± 10.38%. Baseline rSO2 differed significantly between men (67.6 ± 11.2%) and women (60 ± 9.4%) (P=0.023). There were also differences in baseline rSO2 across ASA physical status (ASA I: 67.6 ± 10.7%, ASA II: 61.6 ± 8.4%, ASA III: 55.8 ± 13.9%; P=0.045). Baseline rSO2 correlated positively with body weight (r=0.347, P=0.014) and height (r=0.345, P=0.014). We also found significant differences in baseline rSO2 between patients with and without chronic renal failure (P=0.005). No differences were found in any other studied variables. Non-modifiable patient characteristics (ASA physical status, sex, chronic renal failure, body weight and height) influence baseline rSO2. Copyright © 2015 Société française d'anesthésie et de réanimation (Sfar). Published by Elsevier Masson SAS. All rights reserved.

  4. Using Correspondence Analysis in Multiple Case Studies

    NARCIS (Netherlands)

    Kienstra, Natascha; van der Heijden, Peter G.M.

    2015-01-01

    In qualitative research of multiple case studies, Miles and Huberman proposed to summarize the separate cases in a so-called meta-matrix that consists of cases by variables. Yin discusses cross-case synthesis to study this matrix. We propose correspondence analysis (CA) as a useful tool to study

  5. Using correspondence analysis in multiple case studies

    NARCIS (Netherlands)

    Kienstra, N.H.H.; van der Heijden, P.G.M.

    2015-01-01

    In qualitative research of multiple case studies, Miles and Huberman proposed to summarize the separate cases in a so-called meta-matrix that consists of cases by variables. Yin discusses cross-case synthesis to study this matrix. We propose correspondence analysis (CA) as a useful tool to study

  6. Wind power projects in the CDM: Methodologies and tools for baselines, carbon financing and sustainability analysis [CDM = Clean Development Mechanism]

    Energy Technology Data Exchange (ETDEWEB)

    Ringius, L.; Grohnheit, P.E.; Nielsen, L.H.; Olivier, A.L.; Painuly, J.; Villavicencio, A.

    2002-12-01

    The report is intended to be a guidance document for project developers, investors, lenders, and CDM host countries involved in wind power projects in the CDM. The report explores in particular those issues that are important in CDM project assessment and development, that is, baseline development, carbon financing, and environmental sustainability. It does not deal in detail with those issues that are routinely covered in a standard wind power project assessment. The report tests, compares, and recommends methodologies for and approaches to baseline development. To present the application and implications of the various methodologies and approaches in a concrete context, Africa's largest wind farm, the 60 MW wind farm located in Zafarana, Egypt, is examined as a hypothetical CDM wind power project. The report shows that for the present case example there is a difference of about 25% between the lowest (0.5496 tCO{sub 2}/MWh) and the highest emission rate (0.6868 tCO{sub 2}/MWh) estimated in accordance with the three standardized approaches to baseline development under the Marrakesh Accord. This difference in emission factors comes about partly as a result of including hydroelectric power in the baseline scenario. Hydroelectric resources constitute around 21% of the generation capacity in Egypt and, if hydropower is excluded, the difference between the lowest and the highest baseline is reduced to 18%. Furthermore, since the two variations of the 'historical' baseline option examined result in the highest and the lowest baselines, disregarding this baseline option altogether reduces the difference between the lowest and the highest to 16%. The ES3 model, which the Systems Analysis Department at Risoe National Laboratory has developed, makes it possible for this report to explore the project-specific approach to baseline development in some detail. Based on quite disaggregated data on the Egyptian electricity system, including the wind
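The ~25% spread between the lowest and highest baseline emission factors quoted above can be verified directly; the capacity factor below is a hypothetical assumption used only to show how the choice of baseline translates into credited emission reductions for a 60 MW farm.

```python
# Emission factors from the report (tCO2/MWh)
low_ef, high_ef = 0.5496, 0.6868

# Relative spread between the lowest and highest standardized baselines
spread = (high_ef - low_ef) / low_ef   # ~0.25, i.e. about 25%

# Hypothetical annual generation for a 60 MW farm, assuming a 35%
# capacity factor (an assumption, not a figure from the report)
annual_mwh = 60 * 8760 * 0.35
credits_low = low_ef * annual_mwh      # credited tCO2/yr, lowest baseline
credits_high = high_ef * annual_mwh    # credited tCO2/yr, highest baseline
print(round(spread * 100, 1), round(credits_low), round(credits_high))
```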

  7. IMPROVING FUNCTIONAL INDEPENDENCE OF PATIENTS WITH MULTIPLE SCLEROSIS BY PHYSICAL THERAPY AND OCCUPATIONAL THERAPY

    Directory of Open Access Journals (Sweden)

    Ana-Maria Ticărat

    2011-06-01

    Full Text Available Introduction. Patients with multiple sclerosis can have a normal life despite their real or possible disability and its progressive nature. Scope. Patients who follow physical therapy and occupational therapy will have an increased quality of life and greater functional independence. Methods. The randomized study was made on 7 patients with multiple sclerosis from the Oradea Day Centre, 3 times/week, ages between 35 and 55 years, functional level between mild and severe. Assessment and rehabilitation methods: inspection, BARTHEL Index. The Frenkel method, breathing exercises, weights exercises, gait exercises, writing exercises and games were used in the rehabilitation process. Group therapies: sociotherapy, art therapy, music therapy. Results analysis consisted of the comparison of baseline and final means. Results. Analyzing baseline and final means for the Barthel Index for each function separately showed a mild improvement of functional independence for almost all assessed functions, of at least 1-1.5 points. Conclusions. Persons with multiple sclerosis who follow physical therapy and occupational therapy present better functional independence after the treatment.

  8. Baseline and changes in serum uric acid independently predict 11-year incidence of metabolic syndrome among community-dwelling women.

    Science.gov (United States)

    Kawamoto, R; Ninomiya, D; Kasai, Y; Senzaki, K; Kusunoki, T; Ohtsuka, N; Kumagi, T

    2018-02-19

    Metabolic syndrome (MetS) is associated with an increased risk of major cardiovascular events. In women, increased serum uric acid (SUA) levels are associated with MetS and its components. However, whether baseline and changes in SUA predict incidence of MetS and its components remains unclear. The subjects comprised 407 women aged 71 ± 8 years from a rural village. We identified participants who had undergone a similar examination 11 years earlier, and examined the relationship between baseline and changes in SUA and MetS based on the modified criteria of the National Cholesterol Education Program's Adult Treatment Panel (NCEP-ATP) III report. Of these subjects, 83 (20.4%) women at baseline and 190 (46.7%) women at follow-up had MetS. Multiple linear regression analysis was performed to evaluate the contribution of each confounding factor for MetS; both baseline and changes in SUA, as well as history of cardiovascular disease, low-density lipoprotein cholesterol, and estimated glomerular filtration rate (eGFR), were independently and significantly associated with the number of MetS components during the 11-year follow-up. The adjusted odds ratios (ORs) (95% confidence interval) for incident MetS across tertiles of baseline SUA and changes in SUA were 1.00, 1.47 (0.82-2.65), and 3.11 (1.66-5.83), and 1.00, 1.88 (1.03-3.40), and 2.49 (1.38-4.47), respectively. In addition, the combined effect of increased baseline and changes in SUA was also a significant and independent determinant of the accumulation of MetS components (F = 20.29, p baseline MetS. These results suggest that combined assessment of baseline and changes in SUA levels provides increased information for incident MetS, independent of other confounding factors, in community-dwelling women.

  9. Analysis and prediction of Multiple-Site Damage (MSD) fatigue crack growth

    Science.gov (United States)

    Dawicke, D. S.; Newman, J. C., Jr.

    1992-08-01

    A technique was developed to calculate the stress intensity factor for multiple interacting cracks. The analysis was verified through comparison with accepted methods of calculating stress intensity factors. The technique was incorporated into a fatigue crack growth prediction model and used to predict the fatigue crack growth life for multiple-site damage (MSD). The analysis was verified through comparison with experiments conducted on uniaxially loaded flat panels with multiple cracks. Configurations with nearly equal and unequal crack distributions were examined. The fatigue crack growth predictions agreed within 20 percent of the experimental lives for all crack configurations considered.

  10. Analysis and prediction of Multiple-Site Damage (MSD) fatigue crack growth

    Science.gov (United States)

    Dawicke, D. S.; Newman, J. C., Jr.

    1992-01-01

    A technique was developed to calculate the stress intensity factor for multiple interacting cracks. The analysis was verified through comparison with accepted methods of calculating stress intensity factors. The technique was incorporated into a fatigue crack growth prediction model and used to predict the fatigue crack growth life for multiple-site damage (MSD). The analysis was verified through comparison with experiments conducted on uniaxially loaded flat panels with multiple cracks. Configurations with nearly equal and unequal crack distributions were examined. The fatigue crack growth predictions agreed within 20 percent of the experimental lives for all crack configurations considered.

  11. A novel baseline-correction method for standard addition based derivative spectra and its application to quantitative analysis of benzo(a)pyrene in vegetable oil samples.

    Science.gov (United States)

    Li, Na; Li, Xiu-Ying; Zou, Zhe-Xiang; Lin, Li-Rong; Li, Yao-Qun

    2011-07-07

    In the present work, a baseline-correction method based on peak-to-derivative baseline measurement was proposed for the elimination of complex matrix interference that was mainly caused by unknown components and/or background in the analysis of derivative spectra. This novel method was applicable particularly when the matrix interfering components showed a broad spectral band, which was common in practical analysis. The derivative baseline was established by connecting two crossing points of the spectral curves obtained with a standard addition method (SAM). The applicability and reliability of the proposed method was demonstrated through both theoretical simulation and practical application. Firstly, Gaussian bands were used to simulate 'interfering' and 'analyte' bands to investigate the effect of different parameters of interfering band on the derivative baseline. This simulation analysis verified that the accuracy of the proposed method was remarkably better than other conventional methods such as peak-to-zero, tangent, and peak-to-peak measurements. Then the above proposed baseline-correction method was applied to the determination of benzo(a)pyrene (BaP) in vegetable oil samples by second-derivative synchronous fluorescence spectroscopy. The satisfactory results were obtained by using this new method to analyze a certified reference material (coconut oil, BCR(®)-458) with a relative error of -3.2% from the certified BaP concentration. Potentially, the proposed method can be applied to various types of derivative spectra in different fields such as UV-visible absorption spectroscopy, fluorescence spectroscopy and infrared spectroscopy.
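Derivative spectroscopy discriminates a narrow analyte band from a broad interfering band because the peak second-derivative amplitude of a Gaussian scales as height/σ², so broad bands are strongly suppressed. A minimal numpy sketch on simulated bands (band positions and widths are illustrative; this is not the record's SAM-based baseline construction):

```python
import numpy as np

x = np.linspace(0, 100, 4001)
narrow = np.exp(-(x - 50)**2 / (2 * 2.0**2))        # analyte-like band, sigma = 2
broad = 0.5 * np.exp(-(x - 55)**2 / (2 * 20.0**2))  # interferent-like band, sigma = 20

def second_derivative(y, x):
    # numerical second derivative of a sampled spectrum
    return np.gradient(np.gradient(y, x), x)

d2_narrow = second_derivative(narrow, x)
d2_broad = second_derivative(broad, x)

# Peak |d2| of a Gaussian ~ height / sigma^2, so the broad band is
# suppressed by roughly (20/2)^2 relative to its height ratio.
ratio = np.abs(d2_narrow).max() / np.abs(d2_broad).max()
print(ratio)
```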

  12. Association analysis of multiple traits by an approach of combining ...

    Indian Academy of Sciences (India)

    Lili Chen

    diseases. Joint analysis of multiple traits can increase statistical power of association analysis and uncover the underlying genetic ... genthaler and Thilly 2007), the combined multivariate and ... Because of using reverse regression model, our.

  13. Analysis of dynamic multiplicity fluctuations at PHOBOS

    Science.gov (United States)

    Chai, Zhengwei; PHOBOS Collaboration; Back, B. B.; Baker, M. D.; Ballintijn, M.; Barton, D. S.; Betts, R. R.; Bickley, A. A.; Bindel, R.; Budzanowski, A.; Busza, W.; Carroll, A.; Chai, Z.; Decowski, M. P.; García, E.; George, N.; Gulbrandsen, K.; Gushue, S.; Halliwell, C.; Hamblen, J.; Heintzelman, G. A.; Henderson, C.; Hofman, D. J.; Hollis, R. S.; Holynski, R.; Holzman, B.; Iordanova, A.; Johnson, E.; Kane, J. L.; Katzy, J.; Khan, N.; Kucewicz, W.; Kulinich, P.; Kuo, C. M.; Lin, W. T.; Manly, S.; McLeod, D.; Mignerey, A. C.; Nouicer, R.; Olszewski, A.; Pak, R.; Park, I. C.; Pernegger, H.; Reed, C.; Remsberg, L. P.; Reuter, M.; Roland, C.; Roland, G.; Rosenberg, L.; Sagerer, J.; Sarin, P.; Sawicki, P.; Skulski, W.; Steinberg, P.; Stephans, G. S. F.; Sukhanov, A.; Tang, J. L.; Trzupek, A.; Vale, C.; van Nieuwenhuizen, G. J.; Verdier, R.; Wolfs, F. L. H.; Wosiek, B.; Wozniak, K.; Wuosmaa, A. H.; Wyslouch, B.

    2005-01-01

    This paper presents the analysis of the dynamic fluctuations in the inclusive charged particle multiplicity measured by PHOBOS for Au+Au collisions at √sNN = 200 GeV within the pseudo-rapidity range of -3 < η < 3. First the definition of the fluctuation observables used in this analysis is presented, together with a discussion of their physics meaning. Then the procedure for the extraction of dynamic fluctuations is described. Some preliminary results are included to illustrate the correlation features of the fluctuation observable. New dynamic fluctuation results will be available in a later publication.

  14. Baseline energy forecasts and analysis of alternative strategies for airline fuel conservation

    Energy Technology Data Exchange (ETDEWEB)

    1976-07-01

    To evaluate the impact of fuel conservation strategies, baseline forecasts of airline activity and energy consumption to 1990 were developed. Alternative policy options to reduce fuel consumption were identified and analyzed for three baseline levels of aviation activity within the framework of an aviation activity/energy consumption model. By combining the identified policy options, a strategy was developed to provide incentives for airline fuel conservation. Strategies and policy options were evaluated in terms of their impact on airline fuel conservation and the functioning of the airline industry as well as the associated social, environmental, and economic costs. (GRA)

  15. Diversity Performance Analysis on Multiple HAP Networks

    Science.gov (United States)

    Dong, Feihong; Li, Min; Gong, Xiangwu; Li, Hongjun; Gao, Fengyue

    2015-01-01

    One of the main design challenges in wireless sensor networks (WSNs) is achieving a high-data-rate transmission for individual sensor devices. The high altitude platform (HAP) is an important communication relay platform for WSNs and next-generation wireless networks. Multiple-input multiple-output (MIMO) techniques provide the diversity and multiplexing gain, which can improve the network performance effectively. In this paper, a virtual MIMO (V-MIMO) model is proposed by networking multiple HAPs with the concept of multiple assets in view (MAV). In a shadowed Rician fading channel, the diversity performance is investigated. The probability density function (PDF) and cumulative distribution function (CDF) of the received signal-to-noise ratio (SNR) are derived. In addition, the average symbol error rate (ASER) with BPSK and QPSK is given for the V-MIMO model. The system capacity is studied for both perfect channel state information (CSI) and unknown CSI individually. The ergodic capacity with various SNR and Rician factors for different network configurations is also analyzed. The simulation results validate the effectiveness of the performance analysis. It is shown that the performance of the HAPs network in WSNs can be significantly improved by utilizing the MAV to achieve overlapping coverage, with the help of the V-MIMO techniques. PMID:26134102
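The diversity gain described above can be illustrated by Monte Carlo: BPSK error rates with maximal-ratio combining over i.i.d. Rician fading branches fall sharply as branches are added. This is a simplified sketch (no shadowing, arbitrary K factor and SNR), not the record's analytical PDF/CDF/ASER derivations.

```python
import numpy as np

rng = np.random.default_rng(0)

def bpsk_ber_mrc(n_branches, snr_db, k_factor=3.0, n_trials=200_000):
    """Monte Carlo BER of BPSK with maximal-ratio combining over
    i.i.d. Rician fading branches (simplified: no shadowing)."""
    snr = 10 ** (snr_db / 10)
    los = np.sqrt(k_factor / (k_factor + 1))          # line-of-sight component
    sigma = np.sqrt(1 / (2 * (k_factor + 1)))         # scattered component scale
    h = los + sigma * (rng.standard_normal((n_trials, n_branches))
                       + 1j * rng.standard_normal((n_trials, n_branches)))
    gain = np.sum(np.abs(h) ** 2, axis=1)             # MRC post-combining gain
    # combined decision statistic for transmitted +1: gain + real Gaussian noise
    noise = rng.standard_normal(n_trials) * np.sqrt(gain / (2 * snr))
    return np.mean(gain + noise < 0)

ber_1 = bpsk_ber_mrc(1, snr_db=5)
ber_4 = bpsk_ber_mrc(4, snr_db=5)
print(ber_1, ber_4)
```

With four branches the deep fades that dominate single-branch errors become vanishingly rare, which is the diversity effect the V-MIMO HAP network exploits.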

  16. Diversity Performance Analysis on Multiple HAP Networks

    Directory of Open Access Journals (Sweden)

    Feihong Dong

    2015-06-01

    Full Text Available One of the main design challenges in wireless sensor networks (WSNs) is achieving a high-data-rate transmission for individual sensor devices. The high altitude platform (HAP) is an important communication relay platform for WSNs and next-generation wireless networks. Multiple-input multiple-output (MIMO) techniques provide the diversity and multiplexing gain, which can improve the network performance effectively. In this paper, a virtual MIMO (V-MIMO) model is proposed by networking multiple HAPs with the concept of multiple assets in view (MAV). In a shadowed Rician fading channel, the diversity performance is investigated. The probability density function (PDF) and cumulative distribution function (CDF) of the received signal-to-noise ratio (SNR) are derived. In addition, the average symbol error rate (ASER) with BPSK and QPSK is given for the V-MIMO model. The system capacity is studied for both perfect channel state information (CSI) and unknown CSI individually. The ergodic capacity with various SNR and Rician factors for different network configurations is also analyzed. The simulation results validate the effectiveness of the performance analysis. It is shown that the performance of the HAPs network in WSNs can be significantly improved by utilizing the MAV to achieve overlapping coverage, with the help of the V-MIMO techniques.

  17. Toward Baseline Software Anomalies in NASA Missions

    Science.gov (United States)

    Layman, Lucas; Zelkowitz, Marvin; Basili, Victor; Nikora, Allen P.

    2012-01-01

    In this fast abstract, we provide preliminary findings from an analysis of 14,500 spacecraft anomalies from unmanned NASA missions. We provide some baselines for the distributions of software vs. non-software anomalies in spaceflight systems, the risk ratings of software anomalies, and the corrective actions associated with software anomalies.

  18. Multivariant design and multiple criteria analysis of building refurbishments

    Energy Technology Data Exchange (ETDEWEB)

    Kaklauskas, A.; Zavadskas, E. K.; Raslanas, S. [Faculty of Civil Engineering, Vilnius Gediminas Technical University, Vilnius (Lithuania)

    2005-07-01

    In order to design and realize an efficient building refurbishment, it is necessary to carry out an exhaustive investigation of all solutions that form it. The efficiency level of the considered building's refurbishment depends on a great many factors, including: cost of refurbishment, annual fuel economy after refurbishment, tentative pay-back time, harmfulness to health of the materials used, aesthetics, maintenance properties, functionality, comfort, sound insulation and longevity, etc. Solutions of an alternative character allow for a more rational and realistic assessment of economic, ecological, legislative, climatic, social and political conditions and traditions, and for better satisfaction of customer requirements. They also enable one to cut down on refurbishment costs. In carrying out the multivariant design and multiple criteria analysis of a building refurbishment, much data was processed and evaluated. Feasible alternatives could number as many as 100,000. How to perform a multivariant design and multiple criteria analysis of these alternatives based on such an enormous amount of information became the problem. Methods of multivariant design and multiple criteria analysis of a building refurbishment were developed by the authors to solve this problem. In order to demonstrate the developed method, a practical example is presented in this paper. (author)

  19. Early antihypertensive treatment and clinical outcomes in acute ischemic stroke: subgroup analysis by baseline blood pressure.

    Science.gov (United States)

    He, William J; Zhong, Chongke; Xu, Tan; Wang, Dali; Sun, Yingxian; Bu, Xiaoqing; Chen, Chung-Shiuan; Wang, Jinchao; Ju, Zhong; Li, Qunwei; Zhang, Jintao; Geng, Deqin; Zhang, Jianhui; Li, Dong; Li, Yongqiu; Yuan, Xiaodong; Zhang, Yonghong; Kelly, Tanika N

    2018-06-01

    We studied the effect of early antihypertensive treatment on death, major disability, and vascular events among patients with acute ischemic stroke according to their baseline SBP. We randomly assigned 4071 acute ischemic stroke patients with SBP between 140 and less than 220 mmHg to receive antihypertensive treatment or to discontinue all antihypertensive medications during hospitalization. A composite primary outcome of death and major disability and secondary outcomes were compared between the treatment and control groups stratified by baseline SBP levels of less than 160, 160-179, and at least 180 mmHg. At 24 h after randomization, differences in SBP reductions were 8.8, 8.6 and 7.8 mmHg between the antihypertensive treatment and control groups among patients with baseline SBP less than 160, 160-179, and at least 180 mmHg, respectively. The effect of treatment on death differed across baseline SBP subgroups (P = 0.02): odds ratio (95% CI) of 2.42 (0.74-7.89) in patients with baseline SBP less than 160 mmHg and 0.34 (0.11-1.09) in those with baseline SBP at least 180 mmHg. At the 3-month follow-up, the primary and secondary clinical outcomes were not significantly different between the treatment and control groups by baseline SBP levels. Early antihypertensive treatment had a neutral effect on clinical outcomes among acute ischemic stroke patients with various baseline SBP levels. Future clinical trials are warranted to test BP-lowering effects in acute ischemic stroke patients by baseline SBP levels. ClinicalTrials.gov Identifier: NCT01840072.

  20. Leveraging probabilistic peak detection to estimate baseline drift in complex chromatographic samples

    NARCIS (Netherlands)

    Lopatka, M.; Barcaru, A.; Sjerps, M.J.; Vivó-Truyols, G.

    2016-01-01

    Accurate analysis of chromatographic data often requires the removal of baseline drift. A frequently employed strategy strives to determine asymmetric weights in order to fit a baseline model by regression. Unfortunately, chromatograms characterized by a very high peak saturation pose a significant
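The asymmetric-weights regression strategy mentioned above is commonly implemented as asymmetric least squares smoothing (Eilers-style): points above the current fit get small weight, points below get large weight, so peaks are ignored while the smooth drift is tracked. A minimal sketch on synthetic data (this is the conventional baseline, not the record's probabilistic peak-detection method):

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

def asls_baseline(y, lam=1e5, p=0.01, n_iter=10):
    """Asymmetric least squares baseline: iteratively reweighted
    smoothing with a second-difference roughness penalty."""
    n = len(y)
    D = sparse.diags([1.0, -2.0, 1.0], [0, -1, -2], shape=(n, n - 2))
    w = np.ones(n)
    z = y
    for _ in range(n_iter):
        W = sparse.spdiags(w, 0, n, n)
        z = spsolve((W + lam * D @ D.T).tocsc(), w * y)
        w = p * (y > z) + (1 - p) * (y <= z)   # asymmetric weights
    return z

# Synthetic chromatogram: slow linear drift plus two narrow peaks
x = np.linspace(0, 1, 1000)
drift = 2 + 3 * x
peaks = 10 * np.exp(-(x - 0.3)**2 / 2e-4) + 6 * np.exp(-(x - 0.7)**2 / 2e-4)
signal = drift + peaks
baseline = asls_baseline(signal)
```

Subtracting `baseline` from `signal` then leaves the peaks on a flat background; the `lam` and `p` values here are conventional starting points, not tuned constants.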

  1. Multiplication factor versus regression analysis in stature estimation from hand and foot dimensions.

    Science.gov (United States)

    Krishan, Kewal; Kanchan, Tanuj; Sharma, Abhilasha

    2012-05-01

    Estimation of stature is an important parameter in identification of human remains in forensic examinations. The present study aimed to compare the reliability and accuracy of stature estimation, and to demonstrate the variability between estimated and actual stature, using the multiplication factor and regression analysis methods. The study is based on a sample of 246 subjects (123 males and 123 females) from North India aged between 17 and 20 years. Four anthropometric measurements (hand length, hand breadth, foot length and foot breadth), taken on the left side of each subject, were included in the study. Stature was measured using standard anthropometric techniques. Multiplication factors were calculated and linear regression models were derived for estimation of stature from hand and foot dimensions. The derived multiplication factors and regression formulae were applied to the hand and foot measurements in the study sample. The estimated stature from the multiplication factors and regression analysis was compared with the actual stature to find the error in estimated stature. The results indicate that the range of error in estimation of stature from the regression analysis method is less than that of the multiplication factor method, confirming that regression analysis is better than multiplication factor analysis in stature estimation. Copyright © 2012 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
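The study's conclusion has a simple least-squares explanation: the multiplication factor is a linear fit constrained through the origin, while regression fits both slope and intercept, so its residual spread can be no larger. A sketch on simulated anthropometric data (the foot-length distribution and the stature relation are hypothetical, not the study's measurements):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 246
foot_length = rng.normal(24.5, 1.4, n)                      # cm, hypothetical
stature = 6.0 * foot_length + 20.0 + rng.normal(0, 4.0, n)  # hypothetical relation

# Multiplication factor method: stature ≈ MF * foot length
mf = np.mean(stature / foot_length)
err_mf = stature - mf * foot_length

# Linear regression method: stature ≈ a + b * foot length
b, a = np.polyfit(foot_length, stature, 1)
err_reg = stature - (a + b * foot_length)

print(err_mf.std(), err_reg.std())
```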

  2. Adjustment of Measurements with Multiplicative Errors: Error Analysis, Estimates of the Variance of Unit Weight, and Effect on Volume Estimation from LiDAR-Type Digital Elevation Models

    Directory of Open Access Journals (Sweden)

    Yun Shi

    2014-01-01

    Full Text Available Modern observation technology has verified that measurement errors can be proportional to the true values of measurements such as GPS, VLBI baselines and LiDAR. Observational models of this type are called multiplicative error models. This paper is to extend the work of Xu and Shimada published in 2000 on multiplicative error models to analytical error analysis of quantities of practical interest and estimates of the variance of unit weight. We analytically derive the variance-covariance matrices of the three least squares (LS adjustments, the adjusted measurements and the corrections of measurements in multiplicative error models. For quality evaluation, we construct five estimators for the variance of unit weight in association of the three LS adjustment methods. Although LiDAR measurements are contaminated with multiplicative random errors, LiDAR-based digital elevation models (DEM have been constructed as if they were of additive random errors. We will simulate a model landslide, which is assumed to be surveyed with LiDAR, and investigate the effect of LiDAR-type multiplicative error measurements on DEM construction and its effect on the estimate of landslide mass volume from the constructed DEM.
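The defining feature of a multiplicative error model, y = x(1 + ε), is that the absolute error standard deviation grows in proportion to the true value. A minimal simulation showing this (the 2% relative error level is an assumption for illustration; this is not the paper's LS estimators or DEM experiment):

```python
import numpy as np

rng = np.random.default_rng(1)
true_values = np.linspace(10.0, 1000.0, 50)   # e.g. baseline lengths or LiDAR ranges
rel_sd = 0.02                                  # 2% multiplicative error (assumption)

# y = x * (1 + eps): the absolute error scales with the true value
obs = true_values * (1.0 + rng.normal(0.0, rel_sd, size=(20_000, 50)))
empirical_sd = obs.std(axis=0)

# The empirical SD grows ~linearly with the true value: sd ≈ rel_sd * x
slope = np.polyfit(true_values, empirical_sd, 1)[0]
print(slope)
```

This heteroscedasticity is exactly why treating such measurements as if they carried additive (constant-variance) errors, as in conventional DEM construction, biases downstream quantities like volume estimates.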

  3. Combining information from multiple flood projections in a hierarchical Bayesian framework

    Science.gov (United States)

    Le Vine, Nataliya

    2016-04-01

    This study demonstrates, in the context of flood frequency analysis, the potential of a recently proposed hierarchical Bayesian approach to combine information from multiple models. The approach explicitly accommodates shared multimodel discrepancy as well as the probabilistic nature of the flood estimates, and treats the available models as a sample from a hypothetical complete (but unobserved) set of models. The methodology is applied to flood estimates from multiple hydrological projections (the Future Flows Hydrology data set) for 135 catchments in the UK. The advantages of the approach are shown to be: (1) to ensure adequate "baseline" with which to compare future changes; (2) to reduce flood estimate uncertainty; (3) to maximize use of statistical information in circumstances where multiple weak predictions individually lack power, but collectively provide meaningful information; (4) to diminish the importance of model consistency when model biases are large; and (5) to explicitly consider the influence of the (model performance) stationarity assumption. Moreover, the analysis indicates that reducing shared model discrepancy is the key to further reduction of uncertainty in the flood frequency analysis. The findings are of value regarding how conclusions about changing exposure to flooding are drawn, and to flood frequency change attribution studies.
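The uncertainty-reduction benefit of pooling multiple projections can be seen even in the simplest precision-weighted combination, which assumes independent, unbiased models; the hierarchical Bayesian approach described above additionally models shared multimodel discrepancy, relaxing both assumptions. All numbers below are hypothetical.

```python
import numpy as np

# Hypothetical 100-year flood estimates (m^3/s) and their standard
# deviations from four projections; values are illustrative only.
means = np.array([120.0, 135.0, 128.0, 150.0])
sds = np.array([15.0, 20.0, 10.0, 30.0])

# Precision-weighted combination: weight each model by 1/variance
w = 1.0 / sds**2
pooled_mean = np.sum(w * means) / np.sum(w)
pooled_sd = np.sqrt(1.0 / np.sum(w))
print(pooled_mean, pooled_sd)
```

The pooled standard deviation is smaller than that of the best single model, which is point (2) of the abstract; when models share a common bias, the hierarchical model prevents this pooled uncertainty from shrinking unrealistically.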

  4. Changes of deep gray matter magnetic susceptibility over 2 years in multiple sclerosis and healthy control brain

    Directory of Open Access Journals (Sweden)

    Jesper Hagemeier

    Full Text Available In multiple sclerosis, pathological changes of both tissue iron and myelin occur, yet these factors have not been characterized in a longitudinal fashion using the novel iron- and myelin-sensitive quantitative susceptibility mapping (QSM) MRI technique. We investigated disease-relevant tissue changes associated with myelin loss and iron accumulation in multiple sclerosis deep gray matter (DGM) over two years. One hundred twenty (120) multiple sclerosis patients and 40 age- and sex-matched healthy controls were included in this prospective study. Written informed consent and local IRB approval were obtained from all participants. Clinical testing and QSM were performed both at baseline and at follow-up. Brain magnetic susceptibility was measured in major DGM structures. Temporal (baseline vs. follow-up) and cross-sectional (multiple sclerosis vs. controls) differences were studied using mixed factorial ANOVA and appropriate t-tests. At either time-point, multiple sclerosis patients had significantly higher susceptibility in the caudate and globus pallidus and lower susceptibility in the thalamus. Over two years, susceptibility increased significantly in the caudate of both controls and multiple sclerosis patients. Inverse thalamic findings among MS patients suggest a multi-phase pathology explained by simultaneous myelin loss and/or iron accumulation followed by iron depletion and/or calcium deposition at later stages. Keywords: Quantitative susceptibility mapping, QSM, Iron, Multiple sclerosis, Longitudinal study

  5. Analysis of Genome-Wide Association Studies with Multiple Outcomes Using Penalization

    Science.gov (United States)

    Liu, Jin; Huang, Jian; Ma, Shuangge

    2012-01-01

    Genome-wide association studies have been extensively conducted, searching for markers for biologically meaningful outcomes and phenotypes. Penalization methods have been adopted in the analysis of the joint effects of a large number of SNPs (single nucleotide polymorphisms) and in marker identification. This study is partly motivated by the analysis of the heterogeneous stock mice dataset, in which multiple correlated phenotypes and a large number of SNPs are available. Existing penalization methods designed to analyze a single response variable cannot accommodate the correlation among multiple response variables. With multiple response variables sharing the same set of markers, joint modeling is first employed to accommodate the correlation. The group Lasso approach is adopted to select markers associated with all the outcome variables. An efficient computational algorithm is developed. Simulation study and analysis of the heterogeneous stock mice dataset show that the proposed method can outperform existing penalization methods. PMID:23272092
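The group-Lasso idea of selecting a marker jointly for all outcomes is implemented in scikit-learn as `MultiTaskLasso`, whose L2,1 penalty zeroes entire SNP rows across all phenotypes at once. This sketch uses simulated genotypes and is in the spirit of the approach described, not the authors' implementation; all dimensions and the `alpha` value are illustrative.

```python
import numpy as np
from sklearn.linear_model import MultiTaskLasso

rng = np.random.default_rng(0)
n, p, k = 200, 50, 3                  # samples, SNPs, correlated phenotypes
X = rng.normal(size=(n, p))

# First 5 SNPs affect all three phenotypes; the rest are noise
B = np.zeros((p, k))
B[:5] = rng.uniform(0.5, 1.5, size=(5, k))
Y = X @ B + 0.3 * rng.normal(size=(n, k))

# The L2,1 penalty selects or drops each SNP for all outcomes jointly
model = MultiTaskLasso(alpha=0.1).fit(X, Y)
selected = np.flatnonzero(np.abs(model.coef_).sum(axis=0) > 1e-8)
print(selected)
```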

  6. Multiplicity Analysis during Photon Interrogation of Fissionable Material

    International Nuclear Information System (INIS)

    Clarke, Shaun D.; Pozzi, Sara A.; Padovani, Enrico; Downar, Thomas J.

    2007-01-01

    Simulation of multiplicity distributions with the Monte Carlo method is difficult because each history is treated individually. In order to accurately model the multiplicity distribution, the intensity and time width of the interrogation pulse must be incorporated into the calculation. This behavior dictates how many photons arrive at the target essentially simultaneously. In order to model the pulse width correctly, a Monte Carlo code system consisting of modified versions of the codes MCNPX and MCNP-PoliMi has been developed in conjunction with a post-processing algorithm to operate on the MCNP-PoliMi output file. The purpose of this subroutine is to assemble the interactions into groups corresponding to the number of interactions which would occur during a given pulse. The resulting multiplicity distributions appear more realistic and capture the higher-order multiplets which are a product of multiple reactions occurring during a single accelerator pulse. Plans are underway to gather relevant experimental data to verify and validate the methodology developed and presented here. This capability will enable the simulation of a large number of materials and detector geometries. Analysis of this information will determine the feasibility of using multiplicity distributions as an identification tool for special nuclear material.
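The post-processing idea described above, regrouping independently simulated histories into accelerator pulses, can be sketched as follows. The Poisson pulse intensity and the per-photon detection probability are assumptions for illustration only, not parameters of the MCNP-PoliMi setup.

```python
import numpy as np

rng = np.random.default_rng(3)

# Each Monte Carlo history is simulated independently, but detections
# occurring within the same accelerator pulse must be grouped.
n_pulses = 100_000
photons_per_pulse = rng.poisson(2.0, size=n_pulses)   # assumed pulse intensity

# Per-pulse multiplicity: each interrogating photon independently yields
# a detection with probability 0.3 (assumption)
detections = rng.binomial(photons_per_pulse, 0.3)
multiplicity, counts = np.unique(detections, return_counts=True)
dist = counts / n_pulses
print(dict(zip(multiplicity.tolist(), dist.tolist())))
```

Treating each history alone would give only multiplicities 0 and 1; grouping by pulse produces the higher-order multiplets the abstract refers to.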

  7. Program Baseline Change Control Procedure

    International Nuclear Information System (INIS)

    1993-02-01

    This procedure establishes the responsibilities and process for approving initial issues of and changes to the technical, cost, and schedule baselines, and selected management documents developed by the Office of Civilian Radioactive Waste Management (OCRWM) for the Civilian Radioactive Waste Management System. This procedure implements the OCRWM Baseline Management Plan and DOE Order 4700.1, Chg 1. It streamlines the change control process to enhance integration, accountability, and traceability of Level 0 and Level 1 decisions through standardized Baseline Change Proposal (BCP) forms to be used by the Level 0, 1, 2, and 3 Baseline Change Control Boards (BCCBs) and to be tracked in the OCRWM-wide Configuration Information System (CIS) database. This procedure applies to all technical, cost, and schedule baselines controlled by the Energy System Acquisition Advisory Board (ESAAB) BCCB (Level 0) and the OCRWM Program Baseline Change Control Board (PBCCB) (Level 1). All baseline BCPs initiated by Level 2 or lower BCCBs that require approval from the ESAAB or PBCCB shall be processed in accordance with this procedure. This procedure also applies to all Program-level management documents controlled by the OCRWM PBCCB

  8. CHOOSING A HEALTH INSTITUTION WITH MULTIPLE CORRESPONDENCE ANALYSIS AND CLUSTER ANALYSIS IN A POPULATION BASED STUDY

    Directory of Open Access Journals (Sweden)

    ASLI SUNER

    2013-06-01

    Full Text Available Multiple correspondence analysis is a method that makes it easy to interpret the categorical variables given in contingency tables, showing the similarities, associations and divergences among these variables via graphics in a lower-dimensional space. Clustering methods help to classify grouped data according to their similarities and to obtain useful summarized data from them. In this study, the interpretation of multiple correspondence analysis is supported by cluster analysis; factors affecting the health institution referred to, such as age, disease group and health insurance, are examined, and the aim is to compare the results of the two methods.
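One common formulation of multiple correspondence analysis is correspondence analysis applied to the indicator (one-hot) matrix of the categorical variables, computed via an SVD of standardized residuals. The toy data below has a perfect age-institution association, so the first axis places co-occurring categories on the same side; it is illustrative only, not the study's patient data.

```python
import numpy as np
import pandas as pd

# Toy categorical data with a perfect association: young -> clinic, old -> hospital
df = pd.DataFrame({
    "age":  ["young", "old", "old", "young", "old", "young"],
    "inst": ["clinic", "hospital", "hospital", "clinic", "hospital", "clinic"],
})

Z = pd.get_dummies(df).to_numpy(float)          # indicator matrix (MCA input)
P = Z / Z.sum()                                  # correspondence matrix
r, c = P.sum(axis=1), P.sum(axis=0)              # row and column masses
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))   # standardized residuals
U, s, Vt = np.linalg.svd(S, full_matrices=False)

# Principal coordinates of the categories on the first axis
cols = pd.get_dummies(df).columns
coord1 = (Vt[0] * s[0]) / np.sqrt(c)
scores = dict(zip(cols, coord1))
print(scores)
```

Categories that always co-occur ("age_young" and "inst_clinic") receive identical coordinates, which is the graphical association reading the abstract describes.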

  9. Baseline Vascular Cognitive Impairment Predicts the Course of Apathetic Symptoms After Stroke: The CASPER Study.

    Science.gov (United States)

    Douven, Elles; Köhler, Sebastian; Schievink, Syenna H J; van Oostenbrugge, Robert J; Staals, Julie; Verhey, Frans R J; Aalten, Pauline

    2018-03-01

    To examine the influence of vascular cognitive impairment (VCI) on the course of poststroke depression (PSD) and poststroke apathy (PSA). Included were 250 stroke patients who underwent neuropsychological and neuropsychiatric assessment 3 months after stroke (baseline) and at a 6- and 12-month follow-up after baseline. Linear mixed models tested the influence of VCI in at least one cognitive domain (any VCI) or multidomain VCI (VCI in multiple cognitive domains) at baseline and domain-specific VCI at baseline on levels of depression and apathy over time, with random effects for intercept and slope. Almost half of the patients showed any VCI at baseline, and any VCI was associated with increasing apathy levels from baseline to the 12-month follow-up. Patients with multidomain VCI had higher apathy scores at the 6- and 12-month follow-up compared with patients with VCI in a single cognitive domain. Domain-specific analyses showed that impaired executive function and slowed information processing speed went together with increasing apathy levels from baseline to 6- and 12-month follow-up. None of the cognitive variables predicted the course of depressive symptoms. Baseline VCI is associated with increasing apathy levels from baseline to the chronic stroke phase, whereas no association was found between baseline VCI and the course of depressive symptoms. Health professionals should be aware that apathy might be absent early after stroke but may evolve over time in patients with VCI.

  10. General Nature of Multicollinearity in Multiple Regression Analysis.

    Science.gov (United States)

    Liu, Richard

    1981-01-01

    Discusses multiple regression, a very popular statistical technique in the field of education. One of the basic assumptions in regression analysis requires that independent variables in the equation should not be highly correlated. The problem of multicollinearity and some of the solutions to it are discussed. (Author)

  11. Modified Truncated Multiplicity Analysis to Improve Verification of Uranium Fuel Cycle Materials

    International Nuclear Information System (INIS)

    LaFleur, A.; Miller, K.; Swinhoe, M.; Belian, A.; Croft, S.

    2015-01-01

    Accurate verification of 235U enrichment and mass in UF6 storage cylinders and the UO2F2 holdup contained in the process equipment is needed to improve international safeguards and nuclear material accountancy at uranium enrichment plants. Small UF6 cylinders (1.5'' and 5'' diameter) are used to store the full range of enrichments from depleted to highly-enriched UF6. For independent verification of these materials, it is essential that the 235U mass and enrichment measurements do not rely on facility operator declarations. Furthermore, in order to be deployed by IAEA inspectors to detect undeclared activities (e.g., during complementary access), it is also imperative that the measurement technique is quick, portable, and sensitive to a broad range of 235U masses. Truncated multiplicity analysis is a technique that reduces the variance in the measured count rates by only considering moments 1, 2, and 3 of the multiplicity distribution. This is especially important for reducing the uncertainty in the measured doubles and triples rates in environments with a high cosmic ray background relative to the uranium signal strength. However, we believe that the existing truncated multiplicity analysis throws away too much useful data by truncating the distribution after the third moment. This paper describes a modified truncated multiplicity analysis method that determines the optimal moment to truncate the multiplicity distribution based on the measured data. Experimental measurements of small UF6 cylinders and UO2F2 working reference materials were performed at Los Alamos National Laboratory (LANL). The data were analyzed using traditional and modified truncated multiplicity analysis to determine the optimal moment to truncate the multiplicity distribution to minimize the uncertainty in the measured count rates. The results from this analysis directly support nuclear safeguards at enrichment plants and provide a more accurate verification method for UF6
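
    The quantities involved can be illustrated with a short sketch: the reduced factorial moments of a measured multiplicity distribution underlie the singles, doubles, and triples rates, and truncating the distribution at some maximum multiplicity changes those moments. The Python code below is a generic illustration with an invented histogram, not LANL's analysis code.

```python
from math import comb
import numpy as np

def factorial_moments(p, k_max=3, n_trunc=None):
    """Reduced factorial moments m_k = sum_n C(n, k) P(n) of a measured
    multiplicity distribution P(n), optionally truncated at n_trunc.
    Illustrative sketch of the moments behind singles/doubles/triples."""
    p = np.asarray(p, dtype=float)
    if n_trunc is not None:
        p = p[: n_trunc + 1].copy()
        p /= p.sum()          # renormalize the truncated distribution
    return [sum(comb(n, k) * pn for n, pn in enumerate(p))
            for k in range(1, k_max + 1)]

# Toy multiplicity histogram (probabilities for n = 0..5 counts per gate).
P = [0.55, 0.25, 0.12, 0.05, 0.02, 0.01]
full = factorial_moments(P)
trunc = factorial_moments(P, n_trunc=3)
```

    Comparing `full` and `trunc` for different cutoffs mimics the paper's question of where to truncate so that the variance reduction outweighs the information lost in the discarded tail.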

  12. Program reference schedule baseline

    International Nuclear Information System (INIS)

    1986-07-01

    This Program Reference Schedule Baseline (PRSB) provides the baseline Program-level milestones and associated schedules for the Civilian Radioactive Waste Management Program. It integrates all Program-level schedule-related activities. This schedule baseline will be used by the Director, Office of Civilian Radioactive Waste Management (OCRWM), and his staff to monitor compliance with Program objectives. Chapter 1 includes brief discussions concerning the relationship of the PRSB to the Program Reference Cost Baseline (PRCB), the Mission Plan, the Project Decision Schedule, the Total System Life Cycle Cost report, the Program Management Information System report, the Program Milestone Review, annual budget preparation, and system element plans. Chapter 2 includes the identification of all Level 0, or Program-level, milestones, while Chapter 3 presents and discusses the critical path schedules that correspond to those Level 0 milestones

  13. Mastering the Multiplication Facts

    Science.gov (United States)

    D'Ettorre, Jenna

    2009-01-01

    The purpose of this paper is to share the results of a six-week research project (after baseline data was collected) that focused on three different strategies (flashcards, interactive games, and music) and their effectiveness in helping fifth grade students memorize the basic multiplication facts. Many teachers face a serious problem when their…

  14. Seismic response analysis of structural system subjected to multiple support excitation

    International Nuclear Information System (INIS)

    Wu, R.W.; Hussain, F.A.; Liu, L.K.

    1978-01-01

    In the seismic analysis of a multiply supported structural system subjected to nonuniform excitations at each support point, the single response spectrum, the time history, and the multiple response spectrum methods are the three commonly employed methods. In the present paper the three methods are developed and evaluated, and the limitations and advantages of each method are assessed. A numerical example has been carried out for a typical piping system. Considerably smaller responses have been predicted by the time history method than by the single response spectrum method. This is mainly because the phase and amplitude relations between the support excitations are faithfully retained in the time history method. The multiple response spectrum prediction has been observed to compare favourably with the time history prediction. Based on the present evaluation, the multiple response spectrum method is the most efficient method for seismic response analysis of structural systems subjected to multiple support excitation. (Auth.)

  15. CryoSat SAR/SARin Level1b products: assessment of BaselineC and improvements towards BaselineD

    Science.gov (United States)

    Scagliola, Michele; Fornari, Marco; Bouffard, Jerome; Parrinello, Tommaso

    2017-04-01

    CryoSat was launched on 8 April 2010 and is the first European ice mission dedicated to monitoring precise changes in the thickness of polar ice sheets and floating sea ice. CryoSat carries an innovative radar altimeter called the Synthetic Aperture Interferometric Radar Altimeter (SIRAL), which transmits pulses at a high pulse repetition frequency, making the received echoes phase coherent and suitable for azimuth processing. This makes it possible to reach a significantly improved along-track resolution with respect to traditional pulse-width-limited altimeters. CryoSat is the first altimetry mission operating in SAR mode, and continuous improvements in the Level 1 Instrument Processing Facility (IPF1) are being identified, tested and validated in order to improve the quality of the Level 1b products. The current IPF, Baseline C, was released into operation in April 2015, and the second CryoSat reprocessing campaign was jointly initiated, taking advantage of the upgrades implemented in the IPF1 processing chain as well as of some specific configurations for the calibration corrections. In particular, the CryoSat Level 1b Baseline C products generated in the framework of the second reprocessing campaign include refined information concerning the mispointing angles and the calibration corrections. This poster will detail the evolutions that are currently planned for the CryoSat Baseline D SAR/SARin Level 1b products and the corresponding quality improvements that are expected.

  16. Effectiveness of an intervention in increasing the provision of preventive care by community mental health services: a non-randomized, multiple baseline implementation trial.

    Science.gov (United States)

    Bartlem, Kate M; Bowman, Jenny; Freund, Megan; Wye, Paula M; Barker, Daniel; McElwaine, Kathleen M; Wolfenden, Luke; Campbell, Elizabeth M; McElduff, Patrick; Gillham, Karen; Wiggers, John

    2016-04-02

    Relative to the general population, people with a mental illness are more likely to have modifiable chronic disease health risk behaviours. Care to reduce such risks is not routinely provided by community mental health clinicians. This study aimed to determine the effectiveness of an intervention in increasing the provision of preventive care by such clinicians addressing four chronic disease risk behaviours. A multiple baseline trial was undertaken in two groups of community mental health services in New South Wales, Australia (2011-2014). A 12-month practice change intervention was sequentially implemented in each group. Outcome data were collected continuously via telephone interviews with a random sample of clients over a 3-year period, from 6 months pre-intervention in the first group, to 6 months post intervention in the second group. Outcomes were client-reported receipt of assessment, advice and referral for tobacco smoking, harmful alcohol consumption, inadequate fruit and/or vegetable consumption and inadequate physical activity and for the four behaviours combined. Logistic regression analyses examined change in client-reported receipt of care. There was an increase in assessment for all risks combined following the intervention (18 to 29 %; OR 3.55, p = 0.002: n = 805 at baseline, 982 at follow-up). No significant change in assessment, advice or referral for each individual risk was found. The intervention had a limited effect on increasing the provision of preventive care. Further research is required to determine how to increase the provision of preventive care in community mental health services. Australian and New Zealand Clinical Trials Registry ACTRN12613000693729.

  17. On the feasibility of routine baseline improvement in processing of geomagnetic observatory data

    Science.gov (United States)

    Soloviev, Anatoly; Lesur, Vincent; Kudin, Dmitry

    2018-02-01

    We propose a new approach to the calculation of regular baselines at magnetic observatories. The proposed approach is based on the simultaneous analysis of the irregular absolute observations and the continuous deltaF time series widely used for estimating data quality. The systematic deltaF analysis makes it possible to take into account all available information about the operation of observatory instruments (i.e., continuous records of the field variations and its modulus) in the intervals between the times of absolute observations, whereas the traditional baseline calculation considers only spot values. To establish a connection with the observed spot baseline values, we introduce a function for approximate evaluation of the intermediate baseline values. An important feature of the algorithm is its quantitative estimation of the resulting data precision, and thus its identification of problematic fragments in the raw data. We analyze the robustness of the algorithm using synthetic data sets. We also compare baselines and definitive data derived by the proposed algorithm with those derived by the traditional approach using Saint Petersburg observatory data, recorded in 2015 and accepted by INTERMAGNET. It is shown that the proposed method can substantially improve the resulting data quality when the baseline data are not good enough. The obtained results prove that the baseline variability in time might be quite rapid.
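
    Two ingredients of the approach can be sketched in a few lines of Python: the traditional piecewise-linear baseline between spot absolute observations, and the deltaF series (scalar field measurement minus the modulus of the vector measurement) used as the quality-control signal between them. Function names and values here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def interpolate_baseline(t_abs, b_abs, t_var):
    """Traditional piecewise-linear baseline: spot baseline values b_abs,
    measured at absolute-observation times t_abs, interpolated to the
    continuous variometer time stamps t_var."""
    return np.interp(t_var, t_abs, b_abs)

def delta_f(f_scalar, x, y, z):
    """deltaF = scalar magnetometer reading minus the modulus of the
    vector (variometer) reading; ideally near zero, so its drift flags
    baseline problems between absolute observations."""
    return f_scalar - np.sqrt(x**2 + y**2 + z**2)
```

    The paper's contribution is, in effect, to use the continuously available deltaF residual to constrain the baseline between the spot values instead of relying on interpolation alone.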

  18. Mission control of multiple unmanned aerial vehicles: a workload analysis.

    Science.gov (United States)

    Dixon, Stephen R; Wickens, Christopher D; Chang, Dervon

    2005-01-01

    With unmanned aerial vehicles (UAVs), 36 licensed pilots flew both single-UAV and dual-UAV simulated military missions. Pilots were required to navigate each UAV through a series of mission legs in one of the following three conditions: a baseline condition, an auditory autoalert condition, and an autopilot condition. Pilots were responsible for (a) mission completion, (b) target search, and (c) systems monitoring. Results revealed that both the autoalert and the autopilot automation improved overall performance by reducing task interference and alleviating workload. The autoalert system benefited performance both in the automated task and mission completion task, whereas the autopilot system benefited performance in the automated task, the mission completion task, and the target search task. Practical implications for the study include the suggestion that reliable automation can help alleviate task interference and reduce workload, thereby allowing pilots to better handle concurrent tasks during single- and multiple-UAV flight control.

  19. HARMONIC ANALYSIS OF SVPWM INVERTER USING MULTIPLE-PULSES METHOD

    Directory of Open Access Journals (Sweden)

    Mehmet YUMURTACI

    2009-01-01

    Full Text Available The Space Vector Modulation (SVM) technique is a popular and important PWM technique for three-phase voltage source inverters in induction motor control. In this study, harmonic analysis of Space Vector PWM (SVPWM) is investigated using the multiple-pulses method. The multiple-pulses method calculates the Fourier coefficients of the individual positive and negative pulses of the output PWM waveform and adds them together using the principle of superposition to calculate the Fourier coefficients of the whole PWM output signal. Harmonic magnitudes can be calculated directly by this method without linearization, look-up tables, or Bessel functions. In this study, the results obtained in the application of SVPWM for different values of the variable parameters are compared with the results obtained with the multiple-pulses method.
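
    The principle of the multiple-pulses method, computing each rectangular pulse's Fourier contribution analytically and summing the contributions by superposition, can be sketched in Python as follows. The two-pulse train below is invented for illustration and is not an actual SVPWM switching pattern.

```python
import numpy as np

def pulse_train_harmonics(pulses, period, k_max=15):
    """Complex Fourier coefficients c_k of a waveform built from
    rectangular pulses, computed pulse by pulse and summed by
    superposition. Each pulse is (amplitude, t_start, t_end)."""
    w = 2 * np.pi / period
    k = np.arange(1, k_max + 1)
    c = np.zeros(k_max, dtype=complex)
    for A, t1, t2 in pulses:
        # One pulse's contribution: (A/T) * integral of exp(-i k w t) over [t1, t2]
        c += (A / period) * (np.exp(-1j * k * w * t1)
                             - np.exp(-1j * k * w * t2)) / (1j * k * w)
    return c  # harmonic magnitude of order k is 2*abs(c[k-1])

# Simple two-pulse example over one period T = 1.
pulses = [(1.0, 0.1, 0.4), (-1.0, 0.6, 0.9)]
c = pulse_train_harmonics(pulses, 1.0)
```

    A quick sanity check for such a routine is that the analytically summed coefficients agree with direct numerical integration of the sampled waveform.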

  20. Evaluation of dairy effluent management options using multiple criteria analysis.

    Science.gov (United States)

    Hajkowicz, Stefan A; Wheeler, Sarah A

    2008-04-01

    This article describes how options for managing dairy effluent on the Lower Murray River in South Australia were evaluated using multiple criteria analysis (MCA). Multiple criteria analysis is a framework for combining multiple environmental, social, and economic objectives in policy decisions. At the time of the study, dairy irrigation in the region was based on flood irrigation, which involved returning effluent to the river. The returned water contained nutrients, salts, and microbial contaminants, leading to environmental, human health, and tourism impacts. In this study, MCA was used to evaluate 11 options against 6 criteria for managing dairy effluent problems. Of the 11 options, the MCA model selected partial rehabilitation of dairy paddocks with the conversion of remaining land to other agriculture. Soon after, the South Australian Government adopted this course of action and is now providing incentives for dairy farmers in the region to upgrade irrigation infrastructure and/or enter alternative industries.
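
    One common MCA scheme is a weighted sum of normalized criterion scores. The sketch below uses invented options, criteria, weights, and scores purely to illustrate the mechanics; it does not reproduce the study's model or data.

```python
def rank_options(scores, weights):
    """Weighted-sum multiple criteria analysis: normalize each (benefit)
    criterion's scores to 0-1 by the criterion maximum, weight, and sum.
    All names and numbers here are hypothetical."""
    criteria = list(weights)
    max_by_crit = {c: max(s[c] for s in scores.values()) for c in criteria}
    totals = {
        opt: sum(weights[c] * s[c] / max_by_crit[c] for c in criteria)
        for opt, s in scores.items()
    }
    # Best option first.
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

options = {
    "full rehabilitation":    {"environment": 9, "cost": 3, "social": 6},
    "partial rehabilitation": {"environment": 7, "cost": 7, "social": 7},
    "no action":              {"environment": 2, "cost": 9, "social": 4},
}
ranking = rank_options(options, {"environment": 0.5, "cost": 0.3, "social": 0.2})
```

    Real MCA exercises add sensitivity analysis on the weights, since the ranking can hinge on how the competing objectives are traded off.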

  1. Multiple sclerosis and employment: Associations of psychological factors and work instability.

    Science.gov (United States)

    Wicks, Charlotte Rose; Ward, Karl; Stroud, Amanda; Tennant, Alan; Ford, Helen L

    2016-10-12

    People with multiple sclerosis often stop working earlier than expected. Psychological factors may have an impact on job retention, and their investigation may inform interventions to help people stay in work. The aim was to investigate the associations between psychological factors and work instability in people with multiple sclerosis, using a multi-method, two-phase study. Focus groups were held to identify key themes. Questionnaire packs using validated scales of the key themes were completed at baseline and at 8-month follow-up. Four key psychological themes emerged. Of the 208 study subjects, 57.2% reported a medium/high risk of job loss, with marginal changes at 8 months. Some psychological variables fluctuated significantly; e.g., depression fell from 24.6% to 14.5%. Work instability was strongly correlated with anxiety and depression (χ2). Baseline anxiety predicted later work instability, and baseline depression levels also predicted later work instability (Hosmer-Lemeshow test 0.899; Nagelkerke R-square 0.579). Psychological factors fluctuated over the 8-month follow-up period. Some psychological variables, including anxiety and depression, were significantly associated with, and predictive of, work instability. Longitudinal analysis should further identify how these psychological attributes impact on work instability and potential job loss in the longer term.

  2. Baseline effects on carbon footprints of biofuels: The case of wood

    International Nuclear Information System (INIS)

    Johnson, Eric; Tschudi, Daniel

    2012-01-01

    As biofuel usage has boomed over the past decade, so has research and regulatory interest in its carbon accounting. This paper examines one aspect of that carbon accounting: the baseline, i.e. the reference case against which other conditions or changes can be compared. A literature search and analysis identified four baseline types: no baseline; reference point; marginal fossil fuel; and biomass opportunity cost. The fourth one, biomass opportunity cost, is defined in more detail, because this is not done elsewhere in the literature. The four baselines are then applied to the carbon footprint of a wood-fired power plant. The footprint of the resulting wood-fired electricity varies dramatically, according to the type of baseline. Baseline type is also found to be the footprint's most significant sensitivity. Other significant sensitivities are: efficiency of the power plant; the growth (or re-growth) rate of the forest that supplies the wood; and the residue fraction of the wood. Length of the policy horizon is also an important factor in determining the footprint. The paper concludes that because of their significance and variability, baseline choices should be made very explicit in biofuel carbon footprints. - Highlights: ► Four baseline types for biofuel footprinting are identified. ► One type, ‘biomass opportunity cost’, is defined mathematically and graphically. ► Choice of baseline can dramatically affect the footprint result. ► The ‘no baseline’ approach is not acceptable. ► Choice between the other three baselines depends on the question being addressed.

  3. Baseline effects on carbon footprints of biofuels: The case of wood

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Eric, E-mail: johnsonatlantic@gmail.com [Atlantic Consulting, 8136 Gattikon (Switzerland); Tschudi, Daniel [ETH, Berghaldenstrasse 46, 8800 Thalwil (Switzerland)

    2012-11-15

    As biofuel usage has boomed over the past decade, so has research and regulatory interest in its carbon accounting. This paper examines one aspect of that carbon accounting: the baseline, i.e. the reference case against which other conditions or changes can be compared. A literature search and analysis identified four baseline types: no baseline; reference point; marginal fossil fuel; and biomass opportunity cost. The fourth one, biomass opportunity cost, is defined in more detail, because this is not done elsewhere in the literature. The four baselines are then applied to the carbon footprint of a wood-fired power plant. The footprint of the resulting wood-fired electricity varies dramatically, according to the type of baseline. Baseline type is also found to be the footprint's most significant sensitivity. Other significant sensitivities are: efficiency of the power plant; the growth (or re-growth) rate of the forest that supplies the wood; and the residue fraction of the wood. Length of the policy horizon is also an important factor in determining the footprint. The paper concludes that because of their significance and variability, baseline choices should be made very explicit in biofuel carbon footprints. - Highlights: ► Four baseline types for biofuel footprinting are identified. ► One type, 'biomass opportunity cost', is defined mathematically and graphically. ► Choice of baseline can dramatically affect the footprint result. ► The 'no baseline' approach is not acceptable. ► Choice between the other three baselines depends on the question being addressed.

  4. Analysis of Product Sampling for New Product Diffusion Incorporating Multiple-Unit Ownership

    Directory of Open Access Journals (Sweden)

    Zhineng Hu

    2014-01-01

    Full Text Available Multiple-unit ownership of nondurable products is an important component of sales in many product categories. Based on the Bass model, this paper develops a new model that treats multiple-unit adoption as a diffusion process under the influence of product sampling. The analysis aims to determine the optimal dynamic sampling effort for a firm; the results demonstrate that experience sampling can accelerate the diffusion process and that the best time to send free samples is just before the product is launched. Multiple-unit purchasing behavior can increase sales and thus profit for a firm, but it requires more samples to make the product better known. The local sensitivity analysis shows that an increase in either the external or the internal coefficients has a negative influence on the sampling level, while the internal influence on subsequent multiple-unit adoptions has little significant influence on the sampling. Using logistic regression along with linear regression, the global sensitivity analysis examines the interaction of all factors, showing that the external influence and the multiple-unit purchase rate are the two most important factors influencing the sampling level and the net present value of the new product, and presents a two-stage method to determine the sampling level.
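
    The underlying Bass dynamics can be sketched in discrete time. The code below is a plain Bass model with invented parameters; the paper's sampling-effort and multiple-unit extensions are not modeled here.

```python
def bass_adoptions(p, q, m, periods):
    """Discrete-time Bass diffusion: new adopters each period are driven
    by external influence p (advertising) and internal influence q
    (word of mouth), out of a market potential m. A plain Bass model,
    used only to illustrate the diffusion process the paper extends."""
    cumulative, path = 0.0, []
    for _ in range(periods):
        new = (p + q * cumulative / m) * (m - cumulative)
        cumulative += new
        path.append(new)
    return path

# Hypothetical coefficients: p = 0.03, q = 0.38, market of 1000 units.
sales = bass_adoptions(p=0.03, q=0.38, m=1000.0, periods=20)
```

    With q substantially larger than p, per-period adoptions rise to an interior peak and then decline, which is the S-shaped cumulative curve sampling is meant to accelerate.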

  5. Single-visit or multiple-visit root canal treatment: systematic review, meta-analysis and trial sequential analysis.

    Science.gov (United States)

    Schwendicke, Falk; Göstemeyer, Gerd

    2017-02-01

    Single-visit root canal treatment has some advantages over conventional multiple-visit treatment, but might increase the risk of complications. We systematically evaluated the risk of complications after single-visit or multiple-visit root canal treatment using meta-analysis and trial sequential analysis. Controlled trials comparing single-visit versus multiple-visit root canal treatment of permanent teeth were included. Trials needed to assess the risk of long-term complications (pain, infection, new/persisting/increasing periapical lesions ≥1 year after treatment), short-term pain, or flare-up (acute exacerbation after initiation or continuation of root canal treatment). Electronic databases (PubMed, EMBASE, Cochrane Central) were screened, random-effects meta-analyses were performed, and trial sequential analysis was used to control for the risk of random errors. Evidence was graded according to GRADE. 29 trials (4341 patients) were included, all but 6 showing high risk of bias. Based on 10 trials (1257 teeth), the risk of complications was not significantly different in single-visit versus multiple-visit treatment (risk ratio (RR) 1.00 (95% CI 0.75 to 1.35); weak evidence). Based on 20 studies (3008 teeth), the risk of pain did not significantly differ between treatments (RR 0.99 (95% CI 0.76 to 1.30); moderate evidence). The risk of flare-up was recorded by 8 studies (1110 teeth) and was significantly higher after single-visit versus multiple-visit treatment (RR 2.13 (95% CI 1.16 to 3.89); very weak evidence). Trial sequential analysis revealed that firm evidence for benefit, harm, or futility was not reached for any of the outcomes. There is insufficient evidence to rule out whether important differences between both strategies exist. Dentists can provide root canal treatment in 1 or multiple visits. Given the possibly increased risk of flare-ups, multiple-visit treatment might be preferred for certain teeth (e.g., those with periapical lesions).
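
    The pooled effect measures reported here are risk ratios with 95% confidence intervals, conventionally computed on the log scale. A generic Python sketch with invented counts (not the trial data) follows.

```python
import math

def risk_ratio(events_a, n_a, events_b, n_b):
    """Risk ratio with a 95% CI via the usual log-normal approximation,
    the per-study input to a random-effects meta-analysis.
    The example counts below are hypothetical."""
    rr = (events_a / n_a) / (events_b / n_b)
    # Standard error of log(RR).
    se = math.sqrt(1/events_a - 1/n_a + 1/events_b - 1/n_b)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Hypothetical study: 20/100 flare-ups (single visit) vs 10/100 (multiple visits).
rr, lo, hi = risk_ratio(20, 100, 10, 100)
```

    A CI that crosses 1 (as in the hypothetical example) means the study alone cannot distinguish the two treatments, which is why pooling and trial sequential analysis matter.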

  6. Evaluation of Clinical Gait Analysis parameters in patients affected by Multiple Sclerosis: Analysis of kinematics.

    Science.gov (United States)

    Severini, Giacomo; Manca, Mario; Ferraresi, Giovanni; Caniatti, Luisa Maria; Cosma, Michela; Baldasso, Francesco; Straudi, Sofia; Morelli, Monica; Basaglia, Nino

    2017-06-01

    Clinical Gait Analysis is commonly used to evaluate specific gait characteristics of patients affected by Multiple Sclerosis. The aim of this report is to present a retrospective cross-sectional analysis of the changes in Clinical Gait Analysis parameters in patients affected by Multiple Sclerosis. In this study a sample of 51 patients with different levels of disability (Expanded Disability Status Scale 2-6.5) was analyzed. We extracted a set of 52 parameters from the Clinical Gait Analysis of each patient and used statistical analysis and linear regression to assess differences among several groups of subjects stratified according to the Expanded Disability Status Scale and the 6-Minute Walking Test. The impact of assistive devices (e.g. canes and crutches) on the kinematics was also assessed in a subsample of patients. Subjects showed decreased range of motion at the hip, knee and ankle that translated into increased pelvic tilt and hiking. Comparison between the two stratifications showed that gait speed during the 6-Minute Walking Test is better at discriminating patients' kinematics than the Expanded Disability Status Scale. Assistive devices were shown not to significantly affect gait kinematics or the Clinical Gait Analysis parameters analyzed. We were able to characterize disability-related trends in gait kinematics. The results presented in this report provide a small atlas of the changes in gait characteristics associated with different disability levels in the Multiple Sclerosis population. This information could be used to effectively track the progression of MS and the effect of different therapies.

  7. Effects of Baseline Selection on Magnetocardiography: P-Q and T-P Intervals

    International Nuclear Information System (INIS)

    Lim, Hyun Kyoon; Kwon, Hyuk Chan; Kim, Tae En; Lee, Yong Ho; Kim, Jin Mok; Kim, In Seon; Kim, Ki Woong; Park, Yong Ki

    2007-01-01

    The baseline selection is the first and an important step in analyzing magnetocardiography (MCG) parameters. There is no difficulty in selecting the baseline between the P- and Q-wave peaks (P-Q interval) of MCG waves recorded from healthy subjects, because the P-Q intervals of healthy subjects do not vary much. However, patients with ischemic heart disease often show an unstable P-Q interval, which does not seem appropriate for the baseline. In this case, the T-P interval is alternatively recommended for the baseline. However, there has been no study on the difference made by the baseline selection. In this study, we examined the effect of different baseline selections. MCG data were analyzed from twenty healthy subjects and twenty-one patients whose baselines were alternatively selected in the T-P interval because of their inappropriate P-Q interval. A paired t-test was used to compare the two sets of data. Fifteen parameters derived from the R-wave peak, the T-wave peak, and the period T max/3 ∼ T max were compared for the different baseline selections. As a result, most parameters did not show significant differences (p>0.05), with only a few exceptions. Therefore, there will be no significant differences if either of the two intervals is selected for the MCG baseline. However, for consistent analysis, the P-Q interval is strongly recommended for the baseline correction.
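
    The comparison described, the same parameter computed under two baseline choices and tested with a paired t-test, can be sketched in Python. The measurement values below are invented and stand in for a parameter computed per subject with a P-Q versus a T-P baseline.

```python
import math

def paired_t(x, y):
    """Paired t statistic for two measurement sets from the same
    subjects (e.g., a parameter computed with P-Q vs T-P baselines).
    Compare the result against the t distribution with n-1 df."""
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    mean = sum(d) / n
    var = sum((di - mean) ** 2 for di in d) / (n - 1)   # sample variance
    return mean / math.sqrt(var / n)

# Hypothetical per-subject values of one MCG parameter under each baseline.
pq = [1.02, 0.98, 1.05, 1.01, 0.99, 1.03]
tp = [1.00, 0.97, 1.06, 1.00, 0.98, 1.04]
t = paired_t(pq, tp)
```

    A |t| well below the critical value for n-1 degrees of freedom corresponds to the paper's finding of no significant difference (p>0.05) for most parameters.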

  8. Research and analyze of physical health using multiple regression analysis

    Directory of Open Access Journals (Sweden)

    T. S. Kyi

    2014-01-01

    Full Text Available This paper presents research that aims to create a mathematical model of "healthy people" using the method of regression analysis. The factors are the physical parameters of the person (such as heart rate, lung capacity, blood pressure, breath holding, weight-height coefficient, flexibility of the spine, muscles of the shoulder belt, abdominal muscles, squatting, etc.), and the response variable is an indicator of physical working capacity. After performing multiple regression analysis, we obtained useful multiple regression models that can predict the physical performance of boys aged fourteen to seventeen. This paper presents the development of a regression model for sixteen-year-old boys and an analysis of the results.
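
    A multiple regression model of this kind can be sketched with ordinary least squares. The predictors, coefficients, and data below are synthetic stand-ins, not the paper's physical measurements.

```python
import numpy as np

def fit_multiple_regression(X, y):
    """Ordinary least squares for y = b0 + b1*x1 + ... + bk*xk.
    Returns the coefficient vector [b0, b1, ..., bk]."""
    A = np.column_stack([np.ones(len(X)), X])   # prepend intercept column
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

# Synthetic data: three predictors standing in for physical parameters
# (e.g., heart rate, lung capacity, flexibility), response = capacity index.
rng = np.random.default_rng(1)
X = rng.normal(size=(60, 3))
y = 2.0 + 1.5 * X[:, 0] - 0.5 * X[:, 1] + 0.1 * rng.normal(size=60)
b = fit_multiple_regression(X, y)
```

    The recovered coefficients approximate the generating values (2.0, 1.5, -0.5, 0), which is the sense in which such a fitted model "predicts" the response from the measured factors.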

  9. Dimensions of cultural consumption among tourists : Multiple correspondence analysis

    NARCIS (Netherlands)

    Richards, G.W.; van der Ark, L.A.

    2013-01-01

    The cultural tourism market has diversified and fragmented into many different niches. Previous attempts to segment cultural tourists have been largely unidimensional, failing to capture the complexity of cultural production and consumption. We employ multiple correspondence analysis to visualize

  10. DairyBISS Baseline report

    NARCIS (Netherlands)

    Buizer, N.N.; Berhanu, Tinsae; Murutse, Girmay; Vugt, van S.M.

    2015-01-01

    This baseline report of the Dairy Business Information Service and Support (DairyBISS) project presents the findings of a baseline survey among 103 commercial farms and 31 firms and advisors working in the dairy value chain. Additional results from the survey among commercial dairy farms are

  11. Baseline rationing

    DEFF Research Database (Denmark)

    Hougaard, Jens Leth; Moreno-Ternero, Juan D.; Østerdal, Lars Peter Raahave

    The standard problem of adjudicating conflicting claims describes a situation in which a given amount of a divisible good has to be allocated among agents who hold claims against it exceeding the available amount. This paper considers more general rationing problems in which, in addition to claims...... to international protocols for the reduction of greenhouse emissions, or water distribution in drought periods. We define a family of allocation methods for such general rationing problems - called baseline rationing rules - and provide an axiomatic characterization for it. Any baseline rationing rule within...... the family is associated with a standard rule and we show that if the latter obeys some properties reflecting principles of impartiality, priority and solidarity, the former obeys them too....
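
    A sketch may clarify the construction. The code below implements the standard proportional rule and one natural baselines-first composition; this composition is an assumption made for illustration, and the paper's axiomatically characterized family is defined more generally.

```python
def proportional(claims, amount):
    """Standard proportional rule for a claims problem."""
    total = sum(claims)
    return [amount * c / total for c in claims] if total > 0 else [0.0] * len(claims)

def baseline_rationing(claims, baselines, amount, rule=proportional):
    """Illustrative baselines-first scheme: truncate baselines by claims,
    satisfy them when the endowment allows, then ration the residual
    claims with the standard rule. A sketch, not the paper's definition."""
    t = [min(b, c) for b, c in zip(baselines, claims)]
    if amount <= sum(t):
        return rule(t, amount)            # ration the truncated baselines themselves
    residual = rule([c - ti for c, ti in zip(claims, t)], amount - sum(t))
    return [ti + ri for ti, ri in zip(t, residual)]

# Hypothetical problem: claims 60/30/10, a common baseline of 20, endowment 70.
x = baseline_rationing(claims=[60.0, 30.0, 10.0],
                       baselines=[20.0, 20.0, 20.0], amount=70.0)
```

    In the example, every agent first secures its truncated baseline (20, 20, 10) and the remaining 20 units are split in proportion to unmet claims, illustrating how baselines encode a priority floor before the standard rule applies.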

  12. Baseline Evaluations to Support Control Room Modernization at Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald L.; Joe, Jeffrey C.

    2015-02-01

    For any major control room modernization activity at a commercial nuclear power plant (NPP) in the U.S., a utility should carefully follow the four phases prescribed by the U.S. Nuclear Regulatory Commission in NUREG-0711, Human Factors Engineering Program Review Model. These four phases include Planning and Analysis, Design, Verification and Validation, and Implementation and Operation. While NUREG-0711 is a useful guideline, it is written primarily from the perspective of regulatory review, and it therefore does not provide a nuanced account of many of the steps the utility might undertake as part of control room modernization. The guideline is largely summative—intended to catalog final products—rather than formative—intended to guide the overall modernization process. In this paper, we highlight two crucial formative sub-elements of the Planning and Analysis phase specific to control room modernization that are not covered in NUREG-0711. These two sub-elements are the usability and ergonomics baseline evaluations. A baseline evaluation entails evaluating the system as-built and currently in use. The usability baseline evaluation provides key insights into operator performance using the control system currently in place. The ergonomics baseline evaluation identifies possible deficiencies in the physical configuration of the control system. Both baseline evaluations feed into the design of the replacement system and subsequent summative benchmarking activities that help ensure that control room modernization represents a successful evolution of the control system.

  13. Baseline Evaluations to Support Control Room Modernization at Nuclear Power Plants

    International Nuclear Information System (INIS)

    Boring, Ronald L.; Joe, Jeffrey C.

    2015-01-01

    For any major control room modernization activity at a commercial nuclear power plant (NPP) in the U.S., a utility should carefully follow the four phases prescribed by the U.S. Nuclear Regulatory Commission in NUREG-0711, Human Factors Engineering Program Review Model. These four phases include Planning and Analysis, Design, Verification and Validation, and Implementation and Operation. While NUREG-0711 is a useful guideline, it is written primarily from the perspective of regulatory review, and it therefore does not provide a nuanced account of many of the steps the utility might undertake as part of control room modernization. The guideline is largely summative–intended to catalog final products–rather than formative–intended to guide the overall modernization process. In this paper, we highlight two crucial formative sub-elements of the Planning and Analysis phase specific to control room modernization that are not covered in NUREG-0711. These two sub-elements are the usability and ergonomics baseline evaluations. A baseline evaluation entails evaluating the system as-built and currently in use. The usability baseline evaluation provides key insights into operator performance using the control system currently in place. The ergonomics baseline evaluation identifies possible deficiencies in the physical configuration of the control system. Both baseline evaluations feed into the design of the replacement system and subsequent summative benchmarking activities that help ensure that control room modernization represents a successful evolution of the control system.

  14. Baseline predictors of persistence to first disease-modifying treatment in multiple sclerosis.

    Science.gov (United States)

    Zettl, U K; Schreiber, H; Bauer-Steinhusen, U; Glaser, T; Hechenbichler, K; Hecker, M

    2017-08-01

    Patients with multiple sclerosis (MS) require lifelong therapy. However, success of disease-modifying therapies is dependent on patients' persistence and adherence to treatment schedules. In the setting of a large multicenter observational study, we aimed at assessing multiple parameters for their predictive power with respect to discontinuation of therapy. We analyzed 13 parameters to predict discontinuation of interferon beta-1b treatment during a 2-year follow-up period based on data from 395 patients with MS who were treatment-naïve at study onset. Besides clinical characteristics, patient-related psychosocial outcomes were assessed as well. Among patients without clinically relevant fatigue, males showed a higher persistence rate than females (80.3% vs 64.7%). Clinically relevant fatigue scores decreased the persistence rate in men and especially in women (71.4% and 51.2%). Besides gender and fatigue, univariable and multivariable analyses revealed further factors associated with interferon beta-1b therapy discontinuation, namely lower quality of life, depressiveness, and higher relapse rate before therapy initiation, while higher education, living without a partner, and higher age improved persistence. Patients with higher grades of fatigue and depressiveness are at higher risk to prematurely discontinue MS treatment; especially, women suffering from fatigue have an increased discontinuation rate. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  15. Corrective action baseline report for underground storage tank 2331-U Building 9201-1

    International Nuclear Information System (INIS)

    1994-01-01

    The purpose of this report is to provide baseline geochemical and hydrogeologic data relative to corrective action for underground storage tank (UST) 2331-U at the Building 9201-1 Site. Progress in support of the Building 9201-1 Site has included monitoring well installation and baseline groundwater sampling and analysis. This document represents the baseline report for corrective action at the Building 9201-1 site and is organized into three sections. Section 1 presents introductory information relative to the site, including the regulatory initiative, site description, and progress to date. Section 2 includes the summary of additional monitoring well installation activities and the results of baseline groundwater sampling. Section 3 presents the baseline hydrogeology and planned zone of influence for groundwater remediation

  16. Multiple predictor smoothing methods for sensitivity analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Helton, Jon Craig; Storlie, Curtis B.

    2006-08-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (1) locally weighted regression (LOESS), (2) additive models, (3) projection pursuit regression, and (4) recursive partitioning regression. The indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present.
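Of the four techniques listed, locally weighted regression is the simplest to illustrate. Below is a minimal numpy-only LOESS-style smoother (tricube weights, local linear fits), with the variance explained by the smooth trend used as a crude sensitivity indicator; this is a sketch of the idea, not the authors' implementation:

```python
import numpy as np

def loess_1d(x, y, span=0.2):
    """Minimal LOESS-style smoother: local linear fits with tricube weights.
    A sketch of the technique only, not the report's procedures."""
    n = len(x)
    k = max(2, int(span * n))              # neighbourhood size
    fitted = np.empty(n)
    for i in range(n):
        d = np.abs(x - x[i])
        idx = np.argsort(d)[:k]            # k nearest neighbours
        w = (1 - (d[idx] / d[idx].max()) ** 3) ** 3   # tricube weights
        sw = np.sqrt(w)                    # weighted least squares via sqrt-weights
        X = np.column_stack([np.ones(k), x[idx]])
        beta, *_ = np.linalg.lstsq(sw[:, None] * X, sw * y[idx], rcond=None)
        fitted[i] = beta[0] + beta[1] * x[i]
    return fitted

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 1, 200))
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, 200)
smooth = loess_1d(x, y)
# Fraction of output variance captured by the smooth trend in x: a crude
# nonparametric sensitivity indicator for this single input
explained = 1 - np.var(y - smooth) / np.var(y)
print(round(explained, 2))
```

Because the local fits are nonparametric, the nonlinear sine relationship is captured, which a global linear regression would largely miss; this is the point the abstract makes about nonlinear input-output relationships.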

  17. Multiple predictor smoothing methods for sensitivity analysis

    International Nuclear Information System (INIS)

    Helton, Jon Craig; Storlie, Curtis B.

    2006-01-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (1) locally weighted regression (LOESS), (2) additive models, (3) projection pursuit regression, and (4) recursive partitioning regression. The indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present

  18. Baseline restoration using current conveyors

    International Nuclear Information System (INIS)

    Morgado, A.M.L.S.; Simoes, J.B.; Correia, C.M.

    1996-01-01

    Good performance of high-resolution nuclear spectrometry systems at high pulse rates demands restoration of the baseline between pulses, in order to remove rate-dependent baseline shifts. This restoration is performed by circuits named baseline restorers (BLRs), which also remove low-frequency noise such as power supply hum and detector microphonics. This paper presents simple circuits for baseline restoration based on a commercial current conveyor (CCII01). Tests were performed on two circuits with periodic trapezoidal-shaped pulses in order to measure the baseline restoration for several pulse rates and restorer duty cycles. For the current conveyor based Robinson restorer, the peak shift was less than 10 mV for duty cycles up to 60%, at high pulse rates. Duty cycles up to 80% were also tested, with a maximum peak shift of 21 mV. The peak shift for the current conveyor based Grubic restorer was also measured; the maximum value found was 30 mV at 82% duty cycle. Keeping the duty cycle below 60% greatly improves the restorer performance. The ability of both baseline restorer architectures to reject low-frequency modulation was also measured, with good results for both circuits

  19. Large short-baseline νμ disappearance

    International Nuclear Information System (INIS)

    Giunti, Carlo; Laveder, Marco

    2011-01-01

    We analyze the LSND, KARMEN, and MiniBooNE data on short-baseline νμ→νe oscillations and the data on short-baseline νe disappearance obtained in the Bugey-3 and CHOOZ reactor experiments in the framework of 3+1 antineutrino mixing, taking into account the MINOS observation of long-baseline νμ disappearance and the KamLAND observation of very-long-baseline νe disappearance. We show that the fit of the data implies that the short-baseline disappearance of νμ is relatively large. We obtain a prediction of an effective amplitude sin²2θμμ ≳ 0.1 for short-baseline νμ disappearance generated by 0.2 eV² ≲ Δm² ≲ 2 eV², which could be measured in future experiments.
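The effective amplitude quoted above enters the standard two-flavour survival probability. A quick numerical illustration (the parameter values are examples, not the paper's fit results):

```python
import math

def survival_prob(sin2_2theta, dm2_ev2, L_km, E_gev):
    """Effective two-flavour muon-neutrino survival probability,
    P(numu -> numu) = 1 - sin^2(2*theta) * sin^2(1.27 * dm2 * L / E),
    with dm2 in eV^2, L in km, E in GeV. Standard oscillation formula;
    the numbers below are illustrative only."""
    return 1 - sin2_2theta * math.sin(1.27 * dm2_ev2 * L_km / E_gev) ** 2

# Short-baseline example: dm2 ~ 1 eV^2, L = 0.5 km, E = 1 GeV
p = survival_prob(0.1, 1.0, 0.5, 1.0)
print(p)
```

With an amplitude of 0.1, the muon-neutrino deficit at such a baseline is at most a few percent, which is why precise future experiments are needed to measure it.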

  20. IEA Wind Task 26: Offshore Wind Farm Baseline Documentation

    Energy Technology Data Exchange (ETDEWEB)

    Smart, Gavin [Offshore Renewable Energy Catapult, Blyth, Northumberland (United Kingdom); Smith, Aaron [National Renewable Energy Lab. (NREL), Golden, CO (United States); Warner, Ethan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Sperstad, Iver Bakken [SINTEF Energy Research, Trondheim (Norway); Prinsen, Bob [Ecofys, Utrecht (Netherlands). TKI Wind Op Zee; Lacal-Arantegui, Roberto [European Commission Joint Research Centre (JRC), Brussels (Belgium)

    2016-06-02

    This document has been produced to provide the definition and rationale for the Baseline Offshore Wind Farm established within IEA Wind Task 26--Cost of Wind Energy. The Baseline has been developed to provide a common starting point for country comparisons and sensitivity analysis on key offshore wind cost and value drivers. The baseline project reflects an approximate average of the characteristics of projects installed between 2012 and 2014, with the project life assumed to be 20 years. The baseline wind farm is located 40 kilometres (km) from construction and operations and maintenance (O&M) ports and from export cable landfall. The wind farm consists of 100 4-megawatt (MW) wind turbines mounted on monopile foundations in an average water depth of 25 metres (m), connected by 33-kilovolt (kV) inter-array cables. The arrays are connected to a single offshore substation (33kV/220kV) mounted on a jacket foundation, with the substation connected via a single 220kV export cable to an onshore substation, 10km from landfall. The wind farm employs a port-based O&M strategy using crew-transfer vessels.

  1. TAPIR--Finnish national geochemical baseline database.

    Science.gov (United States)

    Jarva, Jaana; Tarvainen, Timo; Reinikainen, Jussi; Eklund, Mikael

    2010-09-15

    In Finland, a Government Decree on the Assessment of Soil Contamination and Remediation Needs has generated a need for reliable and readily accessible data on geochemical baseline concentrations in Finnish soils. According to the Decree, baseline concentrations, referring both to the natural geological background concentrations and the diffuse anthropogenic input of substances, shall be taken into account in the soil contamination assessment process. This baseline information is provided in a national geochemical baseline database, TAPIR, that is publicly available via the Internet. Geochemical provinces with elevated baseline concentrations were delineated to provide regional geochemical baseline values. The nationwide geochemical datasets were used to divide Finland into geochemical provinces. Several metals (Co, Cr, Cu, Ni, V, and Zn) showed anomalous concentrations in seven regions that were defined as metal provinces. Arsenic did not follow a similar distribution to any other elements, and four arsenic provinces were separately determined. Nationwide geochemical datasets were not available for some other important elements such as Cd and Pb. Although these elements are included in the TAPIR system, their distribution does not necessarily follow the ones pre-defined for metal and arsenic provinces. Regional geochemical baseline values, presented as upper limit of geochemical variation within the region, can be used as trigger values to assess potential soil contamination. Baseline values have also been used to determine upper and lower guideline values that must be taken into account as a tool in basic risk assessment. If regional geochemical baseline values are available, the national guideline values prescribed in the Decree based on ecological risks can be modified accordingly. The national geochemical baseline database provides scientifically sound, easily accessible and generally accepted information on the baseline values, and it can be used in various

  2. Relationship between visual field progression and baseline refraction in primary open-angle glaucoma.

    Science.gov (United States)

    Naito, Tomoko; Yoshikawa, Keiji; Mizoue, Shiro; Nanno, Mami; Kimura, Tairo; Suzumura, Hirotaka; Umeda, Yuzo; Shiraga, Fumio

    2016-01-01

    To analyze the relationship between visual field (VF) progression and baseline refraction in Japanese patients with primary open-angle glaucoma (POAG) including normal-tension glaucoma. In this retrospective study, the subjects were patients with POAG who had undergone VF tests at least ten times with a Humphrey Field Analyzer (Swedish interactive thresholding algorithm standard, Central 30-2 program). VF progression was defined as a significantly negative value of mean deviation (MD) slope at the final VF test. Multivariate logistic regression models were applied to detect an association between MD slope deterioration and baseline refraction. A total of 156 eyes of 156 patients were included in this analysis. Significant deterioration of MD slope was observed in 70 eyes of 70 patients (44.9%), whereas no significant deterioration was evident in 86 eyes of 86 patients (55.1%). The eyes with VF progression had significantly higher baseline refraction compared to those without apparent VF progression (-1.9±3.8 diopter [D] vs -3.5±3.4 D, P=0.0048) (mean ± standard deviation). When subject eyes were classified into four groups by the level of baseline refraction applying spherical equivalent (SE): no myopia (SE > -1D), mild myopia (-1D ≥ SE > -3D), moderate myopia (-3D ≥ SE > -6D), and severe myopia (-6D ≥ SE), the Cochran-Armitage trend analysis showed a decreasing trend in the proportion of MD slope deterioration with increasing severity of myopia (P=0.0002). The multivariate analysis revealed that baseline refraction (P=0.0108, odds ratio [OR]: 1.13, 95% confidence interval [CI]: 1.03-1.25) and intraocular pressure reduction rate (P=0.0150, OR: 0.97, 95% CI: 0.94-0.99) had a significant association with MD slope deterioration. In the current analysis of Japanese patients with POAG, baseline refraction was a factor significantly associated with MD slope deterioration as well as intraocular pressure reduction rate. When baseline refraction was classified into

  3. Baseline atmospheric program Australia 1993

    International Nuclear Information System (INIS)

    Francey, R.J.; Dick, A.L.; Derek, N.

    1996-01-01

    This publication reports activities, program summaries and data from the Cape Grim Baseline Air Pollution Station in Tasmania, during the calendar year 1993. These activities represent Australia's main contribution to the Background Air Pollution Monitoring Network (BAPMoN), part of the World Meteorological Organization's Global Atmosphere Watch (GAW). The report includes 5 research reports covering trace gas sampling, ozone and radon interdependence, analysis of atmospheric dimethylsulfide and carbon-disulfide, sampling of trace gas composition of the troposphere, and sulfur aerosol/CCN relationship in marine air. Summaries of program reports for the calendar year 1993 are also included. Tabs., figs., refs

  4. Market segmentation for multiple option healthcare delivery systems--an application of cluster analysis.

    Science.gov (United States)

    Jarboe, G R; Gates, R H; McDaniel, C D

    1990-01-01

    Healthcare providers of multiple option plans may be confronted with special market segmentation problems. This study demonstrates how cluster analysis may be used for discovering distinct patterns of preference for multiple option plans. The availability of metric, as opposed to categorical or ordinal, data provides the ability to use sophisticated analysis techniques which may be superior to frequency distributions and cross-tabulations in revealing preference patterns.
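A minimal illustration of cluster-based segmentation on metric preference data, using plain k-means on synthetic segments (the study's actual variables and clustering procedure are not reproduced here):

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain k-means (Lloyd's algorithm); a generic illustration of
    cluster-based segmentation, not the study's procedure."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest center
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        # Recompute centers as cluster means
        centers = np.array([X[labels == j].mean(0) for j in range(k)])
    return labels, centers

# Two synthetic "preference" segments on metric plan-rating scales
rng = np.random.default_rng(2)
seg_a = rng.normal([2, 8], 0.5, (50, 2))   # favours one option's features
seg_b = rng.normal([8, 2], 0.5, (50, 2))   # favours the other option's features
X = np.vstack([seg_a, seg_b])
labels, centers = kmeans(X, 2)
print(centers.round(1))
```

With metric ratings, distances between respondents are meaningful, which is exactly the property the abstract cites as enabling such techniques over cross-tabulation.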

  5. Decomposition for emission baseline setting in China's electricity sector

    International Nuclear Information System (INIS)

    Steenhof, Paul A.

    2007-01-01

    Decomposition analysis is used to generate carbon dioxide emission baselines in China's electricity sector to the year 2020. This is undertaken from the vantage point of the final consumer of electricity, and therefore considers factors influencing electricity demand, efficiency of generation, sources of energy used for generation purposes, and the effectiveness of transmission and distribution. It is found that since 1980, gains in efficiency of generation have been the most important factor affecting change in the emission intensity of electricity generated. Based upon known energy and economic policy, efficiency gains will continue to contribute to reductions in the emission intensity of electricity generated, however, fuel shifts to natural gas and increases in nuclear generation will further these trends into the future. The analysis confirms other sources in the literature that decomposition is an appropriate technique available for baseline construction, thereby suitable for the emerging carbon market and its related mechanisms
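The decomposition idea can be sketched with an additive LMDI-style two-factor example, in which the change in emissions splits exactly into a demand effect and an intensity effect (the figures below are invented, not the paper's data):

```python
import math

def log_mean(a, b):
    """Logarithmic mean, the weighting function used in additive LMDI."""
    return a if a == b else (a - b) / (math.log(a) - math.log(b))

# Hypothetical figures: emissions C = E * (C/E), i.e. electricity demand
# times the emission intensity of generation (not data from the study).
E0, E1 = 1000.0, 1800.0          # TWh demand, base vs target year
I0, I1 = 0.9, 0.7                # tCO2/MWh intensity
C0, C1 = E0 * I0, E1 * I1

L = log_mean(C1, C0)
demand_effect = L * math.log(E1 / E0)     # contribution of demand growth
intensity_effect = L * math.log(I1 / I0)  # contribution of efficiency / fuel mix
print(round(demand_effect, 1), round(intensity_effect, 1), round(C1 - C0, 1))
```

The two effects sum exactly to the total emission change, with no residual; this exactness is a key reason LMDI-type decomposition is attractive for baseline construction.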

  6. Clinical outcomes of linezolid and vancomycin in patients with nosocomial pneumonia caused by methicillin-resistant Staphylococcus aureus stratified by baseline renal function: a retrospective, cohort analysis.

    Science.gov (United States)

    Liu, Ping; Capitano, Blair; Stein, Amy; El-Solh, Ali A

    2017-05-22

    The primary objective of this study is to assess whether baseline renal function impacts treatment outcomes of linezolid and vancomycin (with a dose-optimized regimen) for methicillin-resistant Staphylococcus aureus (MRSA) pneumonia. We conducted a retrospective cohort analysis of data generated from a prospective, randomized, controlled clinical trial (NCT 00084266). The analysis included 405 patients with culture-proven MRSA pneumonia. Baseline renal function was stratified based on creatinine clearance. Clinical and microbiological success rates and presence of nephrotoxicity were assessed at the end of treatment (EOT) and end of study (EOS). Multivariate logistic regression analyses of baseline patient characteristics, including treatment, were performed to identify independent predictors of efficacy. Vancomycin concentrations were analyzed using a nonlinear mixed-effects modeling approach. The relationships between vancomycin exposures, pharmacokinetic-pharmacodynamic index (trough concentration, area under the curve over a 24-h interval [AUC0-24], and AUC0-24/MIC) and efficacy/nephrotoxicity were assessed in MRSA pneumonia patients using univariate logistic regression or Cox proportional hazards regression analysis approach. After controlling for use of vasoactive agents, choice of antibiotic therapy and bacteremia, baseline renal function was not correlated with clinical and microbiological successes in MRSA pneumonia at either end of treatment or at end of study for both treatment groups. No positive association was identified between vancomycin exposures and efficacy in these patients. Higher vancomycin exposures were correlated with an increased risk of nephrotoxicity (e.g., hazards ratio [95% confidence interval] for a 5 μg/ml increase in trough concentration: 1.42 [1.10, 1.82]). In non-dialysis patients, baseline renal function did not impact the differences in efficacy or nephrotoxicity with treatment of linezolid versus vancomycin in MRSA

  7. MULGRES: a computer program for stepwise multiple regression analysis

    Science.gov (United States)

    A. Jeff Martin

    1971-01-01

    MULGRES is a computer program source deck that is designed for multiple regression analysis employing the technique of stepwise deletion in the search for most significant variables. The features of the program, along with inputs and outputs, are briefly described, with a note on machine compatibility.
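MULGRES itself is a Fortran-era source deck, but the stepwise-deletion idea translates directly. The sketch below repeatedly drops the least significant predictor (smallest partial F) until all survivors pass a threshold; it is a generic illustration of the technique, not the MULGRES code:

```python
import numpy as np

def sse(X, y):
    """Residual sum of squares of an OLS fit."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return float(r @ r)

def backward_eliminate(X, y, f_to_remove=4.0):
    """Stepwise deletion: repeatedly drop the predictor with the smallest
    partial F statistic until every survivor exceeds the threshold.
    A generic sketch of backward elimination, not MULGRES itself."""
    cols = list(range(1, X.shape[1]))          # column 0 is the intercept
    while cols:
        full = sse(X[:, [0] + cols], y)
        df = len(y) - len(cols) - 1
        # Partial F for each predictor: SSE increase when it is removed
        fstats = []
        for c in cols:
            reduced = [0] + [j for j in cols if j != c]
            fstats.append((sse(X[:, reduced], y) - full) / (full / df))
        worst = int(np.argmin(fstats))
        if fstats[worst] >= f_to_remove:
            break
        cols.pop(worst)
    return cols

rng = np.random.default_rng(3)
n = 100
X = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])
# Only predictors 1 and 2 matter; predictor 3 is pure noise
y = 2 * X[:, 1] - 1.5 * X[:, 2] + rng.normal(0, 0.5, n)
kept = backward_eliminate(X, y)
print(kept)
```

The strongly predictive variables survive the deletion loop, while uninformative ones are candidates for removal, mirroring the "search for most significant variables" the abstract describes.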

  8. Young Adult and Usual Adult Body Mass Index and Multiple Myeloma Risk: A Pooled Analysis in the International Multiple Myeloma Consortium (IMMC).

    Science.gov (United States)

    Birmann, Brenda M; Andreotti, Gabriella; De Roos, Anneclaire J; Camp, Nicola J; Chiu, Brian C H; Spinelli, John J; Becker, Nikolaus; Benhaim-Luzon, Véronique; Bhatti, Parveen; Boffetta, Paolo; Brennan, Paul; Brown, Elizabeth E; Cocco, Pierluigi; Costas, Laura; Cozen, Wendy; de Sanjosé, Silvia; Foretová, Lenka; Giles, Graham G; Maynadié, Marc; Moysich, Kirsten; Nieters, Alexandra; Staines, Anthony; Tricot, Guido; Weisenburger, Dennis; Zhang, Yawei; Baris, Dalsu; Purdue, Mark P

    2017-06-01

    Background: Multiple myeloma risk increases with higher adult body mass index (BMI). Emerging evidence also supports an association of young adult BMI with multiple myeloma. We undertook a pooled analysis of eight case-control studies to further evaluate anthropometric multiple myeloma risk factors, including young adult BMI. Methods: We conducted multivariable logistic regression analysis of usual adult anthropometric measures of 2,318 multiple myeloma cases and 9,609 controls, and of young adult BMI (age 25 or 30 years) for 1,164 cases and 3,629 controls. Results: In the pooled sample, multiple myeloma risk was positively associated with usual adult BMI; risk increased 9% per 5 kg/m² increase in BMI [OR, 1.09; 95% confidence interval (CI), 1.04-1.14; P = 0.007]. We observed significant heterogeneity by study design (P = 0.04), noting the BMI-multiple myeloma association only for population-based studies (Ptrend = 0.0003). Young adult BMI was also positively associated with multiple myeloma (per 5 kg/m²; OR, 1.2; 95% CI, 1.1-1.3; P = 0.0002). Furthermore, we observed strong evidence of interaction between younger and usual adult BMI (Pinteraction adult BMI may increase multiple myeloma risk and suggest that healthy BMI maintenance throughout life may confer an added benefit of multiple myeloma prevention. Cancer Epidemiol Biomarkers Prev; 26(6); 876-85. ©2017 AACR. ©2017 American Association for Cancer Research.

  9. Measurement of Long Baseline Neutrino Oscillations and Improvements from Deep Learning

    Energy Technology Data Exchange (ETDEWEB)

    Psihas, Fernanda [Indiana U.

    2018-01-01

    NOvA is a long-baseline neutrino oscillation experiment which measures the oscillation of muon neutrinos from the NuMI beam at Fermilab after they travel through the Earth for 810 km. In this dissertation I describe the operations and monitoring of the detectors which make it possible to record over 98% of the delivered neutrino beam. I also present reconstruction and identification techniques using deep convolutional neural networks (CNNs), which are applicable to multiple analyses. Lastly, I detail the oscillation analyses in the $\

  10. Establishing baseline water quality for household wells within the Marcellus Shale gas region, Susquehanna County, Pennsylvania, U.S.A

    International Nuclear Information System (INIS)

    Rhodes, Amy L.; Horton, Nicholas J.

    2015-01-01

    Highlights: • Laws do not specify how baseline tests are conducted prior to hydraulic fracturing. • Study estimates variability of groundwater chemistry for repeated measurements. • Water chemistry varies more geographically than at a single, household well. • A single, certified test can characterize baseline geochemistry of groundwater. • Multiple measurements better estimate upper limits of regional baseline values. - Abstract: Flowback fluids associated with hydraulic fracturing shale gas extraction are a potential source of contamination for shallow aquifers. In the Marcellus Shale region of northeastern Pennsylvania, certified water tests have been used to establish baseline water chemistry of private drinking water wells. This study investigates whether a single, certified multiparameter water test is sufficient for establishing baseline water chemistry from which possible future contamination by flowback waters could be reliably recognized. We analyzed the water chemistry (major and minor inorganic elements and stable isotopic composition) of multiple samples collected from lake, spring, and well water from 35 houses around Fiddle Lake, Susquehanna County, PA that were collected over approximately a two-year period. Statistical models estimated variance of results within and between households and tested for significant differences between means of our repeated measurements and prior certified water tests. Overall, groundwater chemistry varies more spatially due to heterogeneity of minerals within the bedrock aquifer and due to varying inputs of road salt runoff from paved roads than it does temporally at a single location. For wells located within road salt-runoff zones, Na + and Cl − concentrations, although elevated, are generally consistent through repeated measurements. High acid neutralizing capacity (ANC) and base cation concentrations in well water sourced from mineral weathering reactions, and a uniform stable isotopic composition for

  11. Could baseline establishment be counterproductive for emissions reduction? Insights from Vietnam’s building sector

    DEFF Research Database (Denmark)

    Henrysson, Maryna; Lütken, Søren; Puig, Daniel

    2017-01-01

    Mitigation Actions (NAMAs) to illustrate institutional dynamics, nationally and transnationally, as well as to question whether demands for baseline setting achieve the ideal trade-off between actual GHG emissions reduction and institutionalized demands for accountability. The analysis reveals that......, it argues for the abolition of baselines in favour of adequate monitoring and evaluation, from the perspective that requirement for deviation from fictitious baselines is unproductive and only serves an international techno-managerial discourse....

  12. Dependence of optimum baseline setting on scatter fraction and detector response function

    International Nuclear Information System (INIS)

    Atkins, F.B.; Beck, R.N.; Hoffer, P.B.; Palmer, D.

    1977-01-01

    A theoretical and experimental investigation has been undertaken to determine the dependence of an optimum baseline setting on the amount of scattered radiation recorded in a spectrum, and on the energy resolution of the detector. In particular, baseline settings were established for clinical examinations which differed greatly in the amount of scattered radiation, namely, liver and brain scans, for which individual variations were found to produce only minimal fluctuations in the optimum baseline settings. This analysis resulted in an optimum baseline setting of 125.0 keV for brain scans and 127.2 keV for liver scans for the scintillation camera used in these studies. The criterion that was used is based on statistical considerations of the measurement of an unscattered component in the presence of a background due to scattered photons. The limitations of such a criterion are discussed, and phantom images are presented to illustrate these effects at various baseline settings. (author)

  13. A comparison of baseline methodologies for 'Reducing Emissions from Deforestation and Degradation'

    Directory of Open Access Journals (Sweden)

    Kok Kasper

    2009-07-01

    Full Text Available Abstract Background A mechanism for emission reductions from deforestation and degradation (REDD is very likely to be included in a future climate agreement. The choice of REDD baseline methodologies will crucially influence the environmental and economic effectiveness of the climate regime. We compare three different historical baseline methods and one innovative dynamic model baseline approach to appraise their applicability under a future REDD policy framework using a weighted multi-criteria analysis. Results The results show that each baseline method has its specific strengths and weaknesses. Although the dynamic model allows for the best environmental and for comparatively good economic performance, its high demand for data and technical capacity limit the current applicability in many developing countries. Conclusion The adoption of a multi-tier approach will allow countries to select the baseline method best suiting their specific capabilities and data availability while simultaneously ensuring scientific transparency, environmental effectiveness and broad political support.
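The weighted multi-criteria appraisal can be sketched as a simple weighted-sum scoring exercise. The criteria, weights, and scores below are invented for illustration; the paper's actual criteria set and weighting differ:

```python
# Weighted multi-criteria scoring sketch; all numbers are illustrative
# assumptions, not the paper's appraisal data.
methods = ["historical-average", "historical-trend", "historical-adjusted", "dynamic-model"]
criteria = {  # weight, then a 0-10 score per method
    "environmental effectiveness": (0.4, [5, 6, 6, 9]),
    "economic performance":        (0.3, [6, 6, 7, 8]),
    "data/capacity demands":       (0.3, [9, 8, 7, 3]),  # higher = less demanding
}

totals = [
    sum(w * scores[i] for w, scores in criteria.values())
    for i in range(len(methods))
]
best = methods[max(range(len(methods)), key=totals.__getitem__)]
print(dict(zip(methods, [round(t, 2) for t in totals])), best)
```

Note how the scoring reproduces the abstract's trade-off: the dynamic model scores highest overall despite a poor data-and-capacity score, which is why a multi-tier approach that lets countries pick a feasible method is attractive.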

  14. Effects of Functional Mobility Skills Training for Adults with Severe Multiple Disabilities

    Science.gov (United States)

    Whinnery, Stacie B.; Whinnery, Keith W.

    2011-01-01

    This study investigated the effects of a functional mobility program on the functional standing and walking skills of five adults with developmental disabilities. The Mobility Opportunities Via Education (MOVE) Curriculum was implemented using a multiple-baseline across subjects design. Repeated measures were taken during baseline, intervention…

  15. Theoretical analysis of moiré fringe multiplication under a scanning electron microscope

    International Nuclear Information System (INIS)

    Li, Yanjie; Xie, Huimin; Chen, Pengwan; Zhang, Qingming

    2011-01-01

    In this study, theoretical analysis and experimental verification of fringe multiplication under a scanning electron microscope (SEM) are presented. Fringe multiplication can be realized by enhancing the magnification or the number of scanning lines under the SEM. A universal expression of the pitch of moiré fringes is deduced. To apply this method to deformation measurement, the calculation formulas of strain and displacement are derived. Compared to natural moiré, the displacement sensitivity is increased by fringe multiplication while the strain sensitivity may be retained or enhanced depending on the number of scanning lines used. The moiré patterns are formed by the interference of a 2000 lines mm⁻¹ grating with the scanning lines of SEM, and the measured parameters of moiré fringes from experimental results agree well with theoretical analysis
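For orientation, the fringe spacing for two parallel gratings follows the standard moiré relation below; this is the textbook formula, and the paper's universal expression for the SEM case may differ in notation:

```latex
% Spacing d of moire fringes formed by a specimen grating of pitch p_s
% and the SEM scanning lines of pitch p_r (parallel-grating case):
d = \frac{p_s \, p_r}{\lvert p_s - p_r \rvert}
```

The closer the scanning-line pitch is to the grating pitch, the wider the fringes, which is why changing the magnification or the number of scanning lines tunes the moiré pattern.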

  16. Two-locus linkage analysis in multiple sclerosis (MS)

    Energy Technology Data Exchange (ETDEWEB)

    Tienari, P.J. (National Public Health Institute, Helsinki (Finland); Univ. of Helsinki (Finland)); Terwilliger, J.D.; Ott, J. (Columbia Univ., New York (United States)); Palo, J. (Univ. of Helsinki (Finland)); Peltonen, L. (National Public Health Institute, Helsinki (Finland))

    1994-01-15

    One of the major challenges in genetic linkage analyses is the study of complex diseases. The authors demonstrate here the use of two-locus linkage analysis in multiple sclerosis (MS), a multifactorial disease with a complex mode of inheritance. In a set of Finnish multiplex families, they have previously found evidence for linkage between MS susceptibility and two independent loci, the myelin basic protein gene (MBP) on chromosome 18 and the HLA complex on chromosome 6. This set of families provides a unique opportunity to perform linkage analysis conditional on two loci contributing to the disease. In the two-trait-locus/two-marker-locus analysis, the presence of another disease locus is parametrized, and the analysis treats information from unaffected family members more appropriately than a single-disease-locus analysis does. As exemplified here in MS, two-locus analysis can be a powerful method for investigating susceptibility loci in complex traits; it is best suited for the analysis of specific candidate genes, or for situations in which preliminary evidence for linkage already exists or is suggested. 41 refs., 6 tabs.

  17. mma: An R Package for Mediation Analysis with Multiple Mediators

    Directory of Open Access Journals (Sweden)

    Qingzhao Yu

    2017-04-01

    Full Text Available Mediation refers to the effect transmitted by mediators that intervene in the relationship between an exposure and a response variable. Mediation analysis has been broadly studied in many fields. However, it remains a challenge for researchers to consider complicated associations among variables and to differentiate individual effects from multiple mediators. [1] proposed general definitions of mediation effects that are adaptable to all different types of response (categorical or continuous), exposure, or mediation variables. With these definitions, multiple mediators of different types can be considered simultaneously, and the indirect effects carried by individual mediators can be separated from the total effect. Moreover, the derived mediation analysis can be performed with general predictive models. That is, the relationships among variables can be modeled using not only generalized linear models but also nonparametric models such as Multiple Additive Regression Trees. Therefore, more complicated variable transformations and interactions can be considered in analyzing the mediation effects. The proposed method is realized by the R package 'mma'. We illustrate in this paper the proposed method and how to use 'mma' to estimate mediation effects and make inferences.
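    The decomposition that 'mma' generalizes can be illustrated with the simplest linear case: the classic difference-in-coefficients estimate of an indirect effect. The sketch below uses simulated data and plain least squares, not the package's actual algorithm; all coefficients are made up for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 5000
    x = rng.normal(size=n)                      # exposure
    m = 0.5 * x + rng.normal(size=n)            # mediator
    y = 0.3 * x + 0.4 * m + rng.normal(size=n)  # response

    # Total effect of x on y: slope of y regressed on x alone.
    total = np.polyfit(x, y, 1)[0]

    # Direct effect: coefficient of x with the mediator m in the model.
    X = np.column_stack([np.ones(n), x, m])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    direct = beta[1]

    # Indirect (mediated) effect: what is transmitted through m.
    indirect = total - direct   # true value here is 0.5 * 0.4 = 0.2
    ```

    With nonlinear or tree-based models this difference no longer reduces to a product of coefficients, which is why the general definitions described in the abstract are needed.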

  18. Functional Testing Protocols for Commercial Building Efficiency Baseline Modeling Software

    Energy Technology Data Exchange (ETDEWEB)

    Jump, David; Price, Phillip N.; Granderson, Jessica; Sohn, Michael

    2013-09-06

    This document describes procedures for testing and validating the accuracy of proprietary baseline energy modeling software in predicting energy use over a period of interest, such as a month or a year. The procedures are designed according to the methodology used for public-domain baselining software in another LBNL report that was (like the present report) prepared for Pacific Gas and Electric Company: "Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing Protocols" (referred to here as the "Model Analysis Report"). The test procedure focuses on the quality of the software's predictions rather than on the specific algorithms used to predict energy use. In this way the software vendor is not required to divulge or share proprietary information about how their software works, while stakeholders can still assess its performance.

  19. Sparse Group Penalized Integrative Analysis of Multiple Cancer Prognosis Datasets

    Science.gov (United States)

    Liu, Jin; Huang, Jian; Xie, Yang; Ma, Shuangge

    2014-01-01

    SUMMARY In cancer research, high-throughput profiling studies have been extensively conducted, searching for markers associated with prognosis. Because of the “large d, small n” characteristic, results generated from the analysis of a single dataset can be unsatisfactory. Recent studies have shown that integrative analysis, which simultaneously analyzes multiple datasets, can be more effective than single-dataset analysis and classic meta-analysis. In most existing integrative analyses, the homogeneity model has been assumed, which postulates that different datasets share the same set of markers. Several approaches have been designed to reinforce this assumption. In practice, different datasets may differ in terms of patient selection criteria, profiling techniques, and many other aspects. Such differences may make the homogeneity model too restrictive. In this study, we assume the heterogeneity model, under which different datasets are allowed to have different sets of markers. With multiple cancer prognosis datasets, we adopt the AFT (accelerated failure time) model to describe survival; it may have the lowest computational cost among popular semiparametric survival models. For marker selection, we adopt a sparse group MCP (minimax concave penalty) approach, which has an intuitive formulation and can be computed using an effective group coordinate descent algorithm. A simulation study shows that it outperforms existing approaches under both the homogeneity and heterogeneity models. Data analysis further demonstrates the merit of the heterogeneity model and the proposed approach. PMID:23938111
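    The scalar building block of the marker-selection step, the minimax concave penalty, has a simple closed form. The sketch below is only the standard single-coefficient MCP; the sparse group version composes it at the group and within-group levels, which is not shown here.

    ```python
    import numpy as np

    def mcp_penalty(t, lam=1.0, gamma=3.0):
        """Minimax concave penalty for a coefficient t: rises with
        slope lam near zero (lasso-like) and flattens to the constant
        gamma * lam**2 / 2 beyond |t| = gamma * lam, so large signals
        are not over-shrunk."""
        a = np.abs(t)
        return np.where(a <= gamma * lam,
                        lam * a - a ** 2 / (2.0 * gamma),
                        gamma * lam ** 2 / 2.0)
    ```

    The flat region beyond |t| = gamma * lam is what distinguishes MCP from the lasso, whose penalty keeps growing and therefore biases large coefficients toward zero.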

  20. Functional analysis screening for multiple topographies of problem behavior.

    Science.gov (United States)

    Bell, Marlesha C; Fahmie, Tara A

    2018-04-23

    The current study evaluated a screening procedure for multiple topographies of problem behavior in the context of an ongoing functional analysis. Experimenters analyzed the function of a topography of primary concern while collecting data on topographies of secondary concern. We used visual analysis to predict the function of secondary topographies and a subsequent functional analysis to test those predictions. Results showed that a general function was accurately predicted for five of six (83%) secondary topographies. A specific function was predicted and supported for a subset of these topographies. The experimenters discuss the implication of these results for clinicians who have limited time for functional assessment. © 2018 Society for the Experimental Analysis of Behavior.

  1. 40 CFR 1042.825 - Baseline determination.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Baseline determination. 1042.825... Provisions for Remanufactured Marine Engines § 1042.825 Baseline determination. (a) For the purpose of this... not valid. (f) Use good engineering judgment for all aspects of the baseline determination. We may...

  2. Baselines for carbon emissions in the Indian and Chinese power sectors: Implications for international carbon trading

    International Nuclear Information System (INIS)

    Zhang Chi; Shukla, P.R.; Victor, David G.; Heller, Thomas C.; Biswas, Debashish; Nag, Tirthankar

    2006-01-01

    The study examines the dynamics of carbon emissions baselines for electricity generation in Indian states and Chinese provinces against the backdrop of ongoing electricity sector reforms in these countries. Two Indian states (Gujarat and Andhra Pradesh) and three Chinese provinces (Guangdong, Liaoning and Hubei) have been chosen for detailed analysis to bring out regional variations that are not captured in aggregate country studies. The study finds that fuel mix is the main driver behind the trends exhibited by the carbon baselines in these five cases. The cases confirm that opportunities exist in the Indian and Chinese electricity sectors to lower carbon intensity, mainly through the substitution of other fuels for coal and, to a lesser extent, the adoption of more efficient and advanced coal-fired generation technology. Overall, the findings suggest that the electricity sectors in India and China are becoming friendlier to the global environment. Disaggregated, detailed, and careful industry analysis is essential to establishing a power sector carbon emissions baseline as a reference for CDM crediting. However, considering all the difficulties associated with the baseline issue, our case studies demonstrate that there is merit in examining alternative approaches that rely on more aggregated baselines.

  3. Baseline and cognition activated brain SPECT imaging in depression

    International Nuclear Information System (INIS)

    Zhao Jinhua; Lin Xiangtong; Jiang Kaida; Liu Yongchang; Xu Lianqin

    1998-01-01

    Purpose: To evaluate regional cerebral blood flow (rCBF) abnormalities through semiquantitative analysis of baseline and cognition-activated rCBF imaging in unmedicated depressed patients. Methods: 27 depressed patients not medicated with antidepressants were enrolled. The diagnosis (depression of moderate degree with somatization) was confirmed by ICD-10 criteria. 15 age-matched normal controls were studied under identical conditions. Baseline and cognition-activated 99mTc-ECD SPECT were performed on 21 of the 27 patients with depression and 13 of the 15 normal controls; baseline 99mTc-ECD SPECT alone was performed on the remaining 6 patients and 2 controls. Cognitive activation was achieved with the Wisconsin Card Sorting Test (WCST). 1110 MBq of 99mTc-ECD was administered by intravenous bolus injection 5 minutes after the onset of the WCST. Semiquantitative analysis was conducted on the 7th, 8th, 9th, 10th and 11th slices of the transaxial imaging. rCBF ratios for each ROI were calculated as the average tissue activity in the region divided by the maximum activity in the cerebellum. Results: 1) Baseline rCBF of the left frontal lobe (0.720) and left temporal lobe (0.720) was significantly decreased in depressed patients compared with that of the control subjects. 2) Activated rCBF of the left frontal lobe (0.719), left temporal lobe (0.690) and left parietal lobe (0.701) was significantly decreased compared with that of the controls. Conclusions: 1) Hypoperfusion of the left frontal and left temporal cortices was identified in patients with depression. 2) This hypoperfusion may be the cause of the cognition disorder and depressed mood in patients with depression. 3) Cognition-activated brain perfusion imaging is helpful for making a more accurate diagnosis of depression.

  4. Baseline micronuclei frequency in children: estimates from meta- and pooled analyses

    DEFF Research Database (Denmark)

    Neri, Monica; Ceppi, Marcello; Knudsen, Lisbeth E

    2005-01-01

    the statistical power of studies and to assess the quality of data. In this article, we provide estimates of the baseline frequency of MN in children, conducting a meta-analysis of MN frequency reported by field studies in children and a pooled analysis of individual data [available from published studies...

  5. High titers of both rheumatoid factor and anti-CCP antibodies at baseline in patients with rheumatoid arthritis are associated with increased circulating baseline TNF level, low drug levels, and reduced clinical responses: a post hoc analysis of the RISING study.

    Science.gov (United States)

    Takeuchi, Tsutomu; Miyasaka, Nobuyuki; Inui, Takashi; Yano, Toshiro; Yoshinari, Toru; Abe, Tohru; Koike, Takao

    2017-09-02

    Although both rheumatoid factor (RF) and anticyclic citrullinated peptide antibodies (anti-CCP) are useful for diagnosing rheumatoid arthritis (RA), the impact of these autoantibodies on the efficacy of tumor necrosis factor (TNF) inhibitors has been controversial. The aim of this post hoc analysis of a randomized double-blind study (the RISING study) was to investigate the influences of RF and anti-CCP on the clinical response to infliximab in patients with RA. Methotrexate-refractory patients with RA received 3 mg/kg of infliximab from weeks 0 to 6 and then 3, 6, or 10 mg/kg every 8 weeks from weeks 14 to 46. In this post hoc analysis, patients were stratified into three classes on the basis of baseline RF/anti-CCP titers: "low/low-C" (RF < 55 IU/ml, anti-CCP < 42 U/ml), "high/high-C" (RF ≥ 160 IU/ml, anti-CCP ≥ 100 U/ml), and "middle-C" (neither low/low-C nor high/high-C). Baseline plasma TNF level, serum infliximab level, and disease activity were compared between the three classes. Baseline RF and anti-CCP titers showed significant correlations with baseline TNF and infliximab levels in weeks 2-14. Comparison of the three classes showed that baseline TNF level was lowest in the low/low-C group and highest in the high/high-C group (median 0.73 versus 1.15 pg/ml), that infliximab levels at week 14 were highest in the low/low-C group and lowest in the high/high-C group (median 1.0 versus 0.1 μg/ml), and that Disease Activity Score in 28 joints based on C-reactive protein at week 14 was lowest in the low/low-C group and highest in the high/high-C group (median 3.17 versus 3.82). A similar correlation was observed at week 54 in the 3 mg/kg dosing group, but not in the 6 or 10 mg/kg group. Significant decreases in both RF and anti-CCP were observed during infliximab treatment. RF/anti-CCP titers correlated with TNF level. This might explain the association of RF/anti-CCP with infliximab level and clinical response in patients with RA

  6. Baseline Quality of Life and Risk of Stroke in the ALLHAT Study (Antihypertensive and Lipid-Lowering Treatment to Prevent Heart Attack Trial).

    Science.gov (United States)

    Shams, Tanzila; Auchus, Alexander P; Oparil, Suzanne; Wright, Clinton B; Wright, Jackson; Furlan, Anthony J; Sila, Cathy A; Davis, Barry R; Pressel, Sara; Yamal, Jose-Miguel; Einhorn, Paula T; Lerner, Alan J

    2017-11-01

    The visual analogue scale is a self-reported, validated tool to measure quality of life (QoL). Our purpose was to determine whether baseline QoL predicted strokes in the ALLHAT study (Antihypertensive and Lipid-Lowering Treatment to Prevent Heart Attack Trial) and to evaluate determinants of poststroke change in QoL. In the ALLHAT study, among the 33 357 patients randomized to treatment arms, 1525 experienced strokes; 1202 (79%) strokes were nonfatal. This study cohort includes 32 318 (97%) subjects who completed the baseline visual analogue scale QoL estimate. QoL was measured on a visual analogue scale and adjusted using a Torrance transformation (transformed QoL [TQoL]). Kaplan-Meier curves and adjusted proportional hazards analyses were used to estimate the effect of TQoL on the risk of stroke, on a continuous scale (0-1) and by quartiles (≤0.81, >0.81 to ≤0.89, >0.89 to ≤0.95, >0.95). We analyzed the change from baseline to first poststroke TQoL using adjusted linear regression. After adjusting for multiple stroke risk factors, the hazard ratio for stroke events for baseline TQoL was 0.93 (95% confidence interval, 0.89-0.98) per 0.1-unit increase. The lowest baseline TQoL quartile had a 20% increased stroke risk (hazard ratio = 1.20 [95% confidence interval, 1.00-1.44]) compared with the reference highest TQoL quartile. Poststroke TQoL change was significant within all treatment groups (P ≤ 0.001). Multivariate regression analysis revealed that baseline TQoL was the strongest predictor of poststroke TQoL, with similar results for the untransformed QoL. The lowest baseline TQoL quartile had a 20% higher stroke risk than the highest quartile. Baseline TQoL was the only factor that predicted poststroke change in TQoL. URL: http://www.clinicaltrials.gov. Unique identifier: NCT00000542. © 2017 American Heart Association, Inc.

  7. 33 CFR 2.20 - Territorial sea baseline.

    Science.gov (United States)

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Territorial sea baseline. 2.20... JURISDICTION Jurisdictional Terms § 2.20 Territorial sea baseline. Territorial sea baseline means the line.... Normally, the territorial sea baseline is the mean low water line along the coast of the United States...

  8. A P-value model for theoretical power analysis and its applications in multiple testing procedures

    Directory of Open Access Journals (Sweden)

    Fengqing Zhang

    2016-10-01

    Full Text Available Abstract Background Power analysis is a critical aspect of designing experiments to detect an effect of a given size. When multiple hypotheses are tested simultaneously, multiplicity adjustments to p-values should be taken into account in power analysis. There are a limited number of studies on power analysis in multiple testing procedures. For some methods, the theoretical analysis is difficult and extensive numerical simulations are often needed, while other methods oversimplify the information under the alternative hypothesis. To this end, this paper aims to develop a new statistical model for power analysis in multiple testing procedures. Methods We propose a step-function-based p-value model under the alternative hypothesis, which is simple enough to perform power analysis without simulations, but not so simple that it loses the information from the alternative hypothesis. The first step is to transform the distributions of different test statistics (e.g., t, chi-square or F) to the distributions of the corresponding p-values. We then use a step function to approximate each p-value distribution by matching the mean and variance. Lastly, the step-function-based p-value model can be used for theoretical power analysis. Results The proposed model is applied to problems in multiple testing procedures. We first show how the most powerful critical constants can be chosen using the step-function-based p-value model. Our model is then applied to the field of multiple testing procedures to explain the assumption of monotonicity of the critical constants. Lastly, we apply our model to a behavioral weight loss and maintenance study to select the optimal critical constants. Conclusions The proposed model is easy to implement and preserves the information from the alternative hypothesis.
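    The first step described in the Methods, transforming a test statistic's distribution under the alternative into a p-value distribution, can be sketched by simulation. This toy example uses a one-sided z test and a Bonferroni adjustment rather than the paper's step-function approximation; the effect size delta, alpha, and the number of tests are made-up values.

    ```python
    import math
    import random

    random.seed(0)

    def norm_sf(z):
        """Upper-tail probability of a standard normal variate."""
        return 0.5 * math.erfc(z / math.sqrt(2.0))

    # Under the alternative, a one-sided z statistic is N(delta, 1) and
    # its p-value is norm_sf(Z).  Simulate that p-value distribution and
    # estimate power under a Bonferroni adjustment for m_tests hypotheses.
    delta, alpha, m_tests, n_sim = 3.0, 0.05, 10, 200_000
    threshold = alpha / m_tests          # Bonferroni critical constant

    rejections = sum(
        norm_sf(random.gauss(delta, 1.0)) <= threshold for _ in range(n_sim)
    )
    power = rejections / n_sim
    ```

    For these values the theoretical power is norm_sf(z_crit - delta) with z_crit ≈ 2.576, about 0.66; the step-function model in the abstract is designed to obtain such numbers without the simulation loop.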

  9. Program Baseline Change Control Board charter

    International Nuclear Information System (INIS)

    1993-02-01

    The purpose of this Charter is to establish the Program Baseline Change Control Board (PBCCB) for the Office of Civilian Radioactive Waste Management (OCRWM) Program, and to describe its organization, responsibilities, and basic methods of operation. Guidance for implementing this Charter is provided by the OCRWM Baseline Management Plan (BMP) and OCRWM Program Baseline Change Control Procedure

  10. Analysis of Multiple Genomic Sequence Alignments: A Web Resource, Online Tools, and Lessons Learned From Analysis of Mammalian SCL Loci

    Science.gov (United States)

    Chapman, Michael A.; Donaldson, Ian J.; Gilbert, James; Grafham, Darren; Rogers, Jane; Green, Anthony R.; Göttgens, Berthold

    2004-01-01

    Comparative analysis of genomic sequences is becoming a standard technique for studying gene regulation. However, only a limited number of tools are currently available for the analysis of multiple genomic sequences. An extensive data set for the testing and training of such tools is provided by the SCL gene locus. Here we have expanded the data set to eight vertebrate species by sequencing the dog SCL locus and by annotating the dog and rat SCL loci. To provide a resource for the bioinformatics community, all SCL sequences and functional annotations, comprising a collation of the extensive experimental evidence pertaining to SCL regulation, have been made available via a Web server. A Web interface to new tools specifically designed for the display and analysis of multiple sequence alignments was also implemented. The unique SCL data set and new sequence comparison tools allowed us to perform a rigorous examination of the true benefits of multiple sequence comparisons. We demonstrate that multiple sequence alignments are, overall, superior to pairwise alignments for identification of mammalian regulatory regions. In the search for individual transcription factor binding sites, multiple alignments markedly increase the signal-to-noise ratio compared to pairwise alignments. PMID:14718377
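    The signal-to-noise argument can be seen in miniature: a column conserved across many aligned species is stronger evidence of regulatory constraint than agreement between a single pair, which can hold by chance. A toy sketch with hypothetical, gap-free aligned sequences:

    ```python
    # Four toy aligned sequences (made up for illustration).
    seqs = ["ACGTACGT",
            "ACGAACGT",
            "ACGTTCGT",
            "ACGTACGA"]

    # Multiple alignment: a column passes only if identical in every sequence.
    conserved = [i for i in range(len(seqs[0]))
                 if len({s[i] for s in seqs}) == 1]

    # Pairwise alignment (first two sequences only): more columns pass,
    # i.e. the same filter admits more noise.
    pairwise = [i for i in range(len(seqs[0]))
                if seqs[0][i] == seqs[1][i]]
    ```

    Here the pairwise filter passes seven of eight columns while the multiple-sequence filter passes five, illustrating why adding species raises the signal-to-noise ratio when searching for transcription factor binding sites.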

  11. The relationship, structure and profiles of schizophrenia measurements: a post-hoc analysis of the baseline measures from a randomized clinical trial

    Directory of Open Access Journals (Sweden)

    Chen Lei

    2011-12-01

    Full Text Available Background To fully assess the various dimensions affected by schizophrenia, clinical trials often include multiple scales measuring various symptom profiles, cognition, quality of life, subjective well-being, and functional impairment. In this exploratory study, we characterized the relationships among six clinical, functional, cognitive, and quality-of-life measures, identifying a parsimonious set of measurements. Methods We used baseline data from a randomized, multicenter study of patients diagnosed with schizophrenia, schizoaffective disorder, or schizophreniform disorder who were experiencing an acute symptom exacerbation (n = 628) to examine the relationship among several outcome measures. These measures included the Positive and Negative Syndrome Scale (PANSS), Montgomery-Asberg Depression Rating Scale (MADRS), Brief Assessment of Cognition in Schizophrenia Symbol Coding Test, Subjective Well-being Under Neuroleptics Scale Short Form (SWN-K), Schizophrenia Objective Functioning Instrument (SOFI), and Quality of Life Scale (QLS). Three analytic approaches were used: (1) path analysis; (2) factor analysis; and (3) categorical latent variable analysis. In the optimal path model, the SWN-K was selected as the final outcome, while the SOFI mediated the effect of the exogenous variables (PANSS, MADRS) on the QLS. Results The overall model explained 47% of the variance in QLS and 17% of the variance in SOFI, but only 15% in SWN-K. Factor analysis suggested four factors: "Functioning," "Daily Living," "Depression," and "Psychopathology." A strong positive correlation was observed between the SOFI and QLS (r = 0.669), and both the QLS and SOFI loaded on the "Functioning" factor, suggesting redundancy between these scales. The measurement profiles from the categorical latent variable analysis showed significant variation in functioning and quality of life despite similar levels of psychopathology.
Conclusions Researchers should consider collecting PANSS, SOFI, and

  12. Using Module Analysis for Multiple Choice Responses: A New Method Applied to Force Concept Inventory Data

    Science.gov (United States)

    Brewe, Eric; Bruun, Jesper; Bearden, Ian G.

    2016-01-01

    We describe "Module Analysis for Multiple Choice Responses" (MAMCR), a new methodology for carrying out network analysis on responses to multiple choice assessments. This method is used to identify modules of non-normative responses which can then be interpreted as an alternative to factor analysis. MAMCR allows us to identify conceptual…

  13. Updated global 3+1 analysis of short-baseline neutrino oscillations

    Science.gov (United States)

    Gariazzo, S.; Giunti, C.; Laveder, M.; Li, Y. F.

    2017-06-01

    We present the results of an updated fit of short-baseline neutrino oscillation data in the framework of 3+1 active-sterile neutrino mixing. We first consider ν_e and ν̄_e disappearance in the light of the Gallium and reactor anomalies. We discuss the implications of the recent measurement of the reactor ν̄_e spectrum in the NEOS experiment, which shifts the allowed regions of the parameter space towards smaller values of |U_e4|². The β-decay constraints of the Mainz and Troitsk experiments allow us to limit the oscillation length between about 2 cm and 7 m at 3σ for neutrinos with an energy of 1 MeV. The corresponding oscillations can be discovered in a model-independent way in ongoing reactor and source experiments by measuring ν_e and ν̄_e disappearance as a function of distance. We then consider the global fit of the data on short-baseline ν_μ → ν_e (and ν̄_μ → ν̄_e) transitions in the light of the LSND anomaly, taking into account the constraints from ν_e and ν_μ disappearance experiments (and the corresponding antineutrino channels), including the recent data of the MINOS and IceCube experiments. The combination of the NEOS constraints on |U_e4|² and the MINOS and IceCube constraints on |U_μ4|² leads to an unacceptable appearance-disappearance tension which becomes tolerable only in a pragmatic fit which neglects the MiniBooNE low-energy anomaly. The minimization of the global χ² in the space of the four mixing parameters Δm²₄₁, |U_e4|², |U_μ4|², and |U_τ4|² leads to three allowed regions with narrow Δm²₄₁ widths at Δm²₄₁ ≈ 1.7 (best fit), 1.3 (at 2σ), and 2.4 (at 3σ) eV². The effective amplitude of short-baseline ν_μ → ν_e oscillations is limited to 0.00048 ≲ sin²2ϑ_eμ ≲ 0.0020 at 3σ. The restrictions of the allowed regions of the mixing parameters with respect to our previous global fits are mainly due to the NEOS constraints. We present a comparison of the
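    In such 3+1 fits, the short-baseline appearance amplitude is built from the mixing-matrix elements as sin²2ϑ_eμ = 4|U_e4|²|U_μ4|², and the oscillation probability follows the standard effective two-flavor formula. A small sketch; the mixing values plugged in at the end are made up for illustration (chosen to land inside the quoted 3σ band):

    ```python
    import math

    def sin2_2theta_emu(Ue4_sq, Umu4_sq):
        """Effective 3+1 appearance amplitude: 4 |U_e4|^2 |U_mu4|^2."""
        return 4.0 * Ue4_sq * Umu4_sq

    def p_appearance(amplitude, dm41_sq_eV2, L_over_E_m_per_MeV):
        """Effective two-flavor short-baseline nu_mu -> nu_e probability;
        the 1.27 factor absorbs hbar*c and the units (L/E in m/MeV,
        equivalently km/GeV, with dm^2 in eV^2)."""
        phase = 1.27 * dm41_sq_eV2 * L_over_E_m_per_MeV
        return amplitude * math.sin(phase) ** 2

    # Hypothetical mixing values: amplitude = 4 * 0.012 * 0.015 = 0.00072.
    amp = sin2_2theta_emu(0.012, 0.015)
    ```

    The probability reaches the full amplitude at the first oscillation maximum, where 1.27 Δm²₄₁ L/E = π/2; for Δm²₄₁ ≈ 1.7 eV² and E = 1 MeV that is under a metre of baseline, which is why reactor and source experiments at very short distances probe this region.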

  14. A New Approach to Estimate Forest Parameters Using Dual-Baseline Pol-InSAR Data

    Science.gov (United States)

    Bai, L.; Hong, W.; Cao, F.; Zhou, Y.

    2009-04-01

    In POL-InSAR applications using the ESPRIT technique, it is assumed that stable scattering centres exist in the forest. However, the observations of forest severely suffer from volume and temporal decorrelation: the forest scatterers are not as stable as assumed, and the obtained interferometric information is not as accurate as expected. Besides, ESPRIT techniques cannot identify which interferometric phases correspond to the ground and the canopy, and they provide multiple estimates of the height between two scattering centres because of phase-unwrapping ambiguity. Therefore, estimation errors are introduced into the forest height results. To suppress these two types of error, we use dual-baseline POL-InSAR data to estimate forest height. Dual-baseline coherence optimization is applied to obtain interferometric information for stable scattering centres in the forest. From the interferometric phases for different baselines, the estimation errors caused by phase unwrapping are resolved, and other estimation errors can be suppressed as well. Experiments are performed on ESAR L-band POL-InSAR data. Experimental results show that the proposed method provides more accurate forest heights than the ESPRIT technique.
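    The phase-unwrapping ambiguity, and how a second baseline resolves it, can be sketched numerically: each baseline yields a ladder of candidate heights spaced 2π/k_z apart, and only the true height appears on both ladders. All wavenumber and height values below are hypothetical.

    ```python
    import math

    def height_candidates(dphi, kz, n_max=3):
        """Heights consistent with a wrapped phase difference dphi (rad):
        one candidate per 2*pi phase-unwrapping branch, for vertical
        wavenumber kz (rad/m)."""
        return [(dphi + 2.0 * math.pi * n) / kz for n in range(n_max)]

    # Made-up vertical wavenumbers for the two baselines and a true
    # canopy-over-ground height of 18 m.
    kz1, kz2, h_true = 0.10, 0.17, 18.0
    dphi1 = (h_true * kz1) % (2.0 * math.pi)
    dphi2 = (h_true * kz2) % (2.0 * math.pi)

    # Both baselines must agree on the same physical height, which
    # singles out one branch and resolves the ambiguity.
    _, h_est = min(
        (abs(h1 - h2), 0.5 * (h1 + h2))
        for h1 in height_candidates(dphi1, kz1)
        for h2 in height_candidates(dphi2, kz2)
    )
    ```

    With a single baseline, every candidate on the ladder is equally valid; the second baseline's different 2π/k_z spacing is what breaks the tie.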

  15. Mobile Robots Path Planning Using the Overall Conflict Resolution and Time Baseline Coordination

    Directory of Open Access Journals (Sweden)

    Yong Ma

    2014-01-01

    Full Text Available This paper aims at resolving the path planning problem in a time-varying environment based on the idea of overall conflict resolution and the algorithm of time baseline coordination. The basic task of the introduced path planning algorithms is the automatic generation of the shortest paths from the defined start poses to their end poses, under numerous constraints, for multiple mobile robots. Building on this, overall conflict resolution takes into account, within the polynomial-based paths, all the constraints together, including smoothness, motion boundaries, kinematic constraints, obstacle avoidance, and safety constraints among robots. A time baseline coordination algorithm is then proposed to process the formulated problem. A key strength is that substantial time can be saved with our approach. Numerical simulations verify the effectiveness of our approach.

  16. Baseline factors that influence ASAS 20 response in patients with ankylosing spondylitis treated with etanercept.

    Science.gov (United States)

    Davis, John C; Van der Heijde, Désirée M F M; Dougados, Maxime; Braun, Jurgen; Cush, John J; Clegg, Daniel O; Inman, Robert D; de Vries, Todd; Tsuji, Wayne H

    2005-09-01

    To examine the baseline demographic and disease characteristics that might influence improvement as measured by the Assessment in Ankylosing Spondylitis Response Criteria (ASAS 20) in patients with ankylosing spondylitis (AS). A multicenter Phase 3 study was performed to compare the safety and efficacy of 24 weeks of etanercept 25 mg subcutaneous injection twice weekly (n = 138) and placebo (n = 139) in patients with AS. The ASAS 20 was measured at multiple time points. Using a significance level of 0.05, a repeated measures logistic regression model was used to determine which baseline factors influenced response in the etanercept-treated patients during the 24-week double blind portion of the trial. The following baseline factors were used in the model: demographic and disease severity variables, concomitant medications, extra-articular manifestations, and HLA-B27 status. The predictive capability of the model was then tested on the patients receiving placebo after they had received open-label etanercept treatment. Baseline factors that were significant predictors of an ASAS 20 response in etanercept-treated patients were C-reactive protein (CRP), back pain score, and Bath Ankylosing Spondylitis Functional Index (BASFI) score. Although clinical response to etanercept was seen at all levels of baseline disease activity, responses were consistently more likely with higher CRP levels or back pain scores and less likely with increased BASFI scores at baseline. Higher CRP values and back pain scores and lower BASFI scores at baseline were significant predictors of a higher ASAS 20 response in patients with AS receiving etanercept but predictive value was of insufficient magnitude to determine treatment in individual patients.

  17. A neutron multiplicity analysis method for uranium samples with liquid scintillators

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Hao, E-mail: zhouhao_ciae@126.com [China Institute of Atomic Energy, P.O.BOX 275-8, Beijing 102413 (China)]; Lin, Hongtao [Xi'an Research Institute of High-tech, Xi'an, Shaanxi 710025 (China)]; Liu, Guorong; Li, Jinghuai; Liang, Qinglei; Zhao, Yonggang [China Institute of Atomic Energy, P.O.BOX 275-8, Beijing 102413 (China)]

    2015-10-11

    A new neutron multiplicity analysis method for uranium samples with liquid scintillators is introduced. An active well-type fast neutron multiplicity counter has been built, which consists of four BC501A liquid scintillators, an n/γ discrimination module MPD-4, a multi-stop time-to-digital converter MCS6A, and two Am–Li sources. A mathematical model is built to symbolize the detection processes of fission neutrons. Based on this model, equations of the form R = F*P*Q*T can be derived, where F indicates the fission rate induced by the interrogation sources, P indicates the transfer matrix determined by the multiplication process, Q indicates the transfer matrix determined by the detection efficiency, and T indicates the transfer matrix determined by the signal recording process and crosstalk in the counter. Unknown parameters of the item are determined from the solutions of these equations. A ²⁵²Cf source and some low-enriched uranium items have been measured. The feasibility of the method is proven by its application to the data analysis of the experiments.
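    With known transfer matrices, solving R = F*P*Q*T for the unknown source term reduces to inverting a linear model. A minimal sketch with made-up 3×3 matrices; the real counter's matrices come from the multiplication, efficiency, and crosstalk physics described above.

    ```python
    import numpy as np

    # Illustrative (made-up) transfer matrices for a three-bin
    # multiplicity model: P for multiplication, Q for detection
    # efficiency, T for signal recording and crosstalk.
    P = np.array([[0.70, 0.20, 0.10],
                  [0.00, 0.60, 0.40],
                  [0.00, 0.00, 1.00]])
    Q = np.diag([0.30, 0.25, 0.20])
    T = np.array([[1.00, 0.05, 0.00],
                  [0.00, 1.00, 0.05],
                  [0.00, 0.00, 1.00]])

    F_true = np.array([100.0, 40.0, 5.0])   # unknown fission terms
    R = P @ Q @ T @ F_true                  # measured multiplicity rates

    # Invert the linear model R = (P Q T) F to recover the unknowns.
    F_est = np.linalg.solve(P @ Q @ T, R)
    ```

    In practice R carries counting noise, so a least-squares solve with measured uncertainties would replace the exact inversion shown here.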

  18. Establishing a store baseline during interim storage of waste packages and a review of potential technologies for base-lining

    Energy Technology Data Exchange (ETDEWEB)

    McTeer, Jennifer; Morris, Jenny; Wickham, Stephen [Galson Sciences Ltd. Oakham, Rutland (United Kingdom); Bolton, Gary [National Nuclear Laboratory Risley, Warrington (United Kingdom); McKinney, James; Morris, Darrell [Nuclear Decommissioning Authority Moor Row, Cumbria (United Kingdom); Angus, Mike [National Nuclear Laboratory Risley, Warrington (United Kingdom); Cann, Gavin; Binks, Tracey [National Nuclear Laboratory Sellafield (United Kingdom)

    2013-07-01

    Interim storage is an essential component of the waste management lifecycle, providing a safe, secure environment for waste packages awaiting final disposal. In order to be able to monitor and detect change or degradation of the waste packages, storage building or equipment, it is necessary to know the original condition of these components (the 'waste storage system'). This paper presents an approach to establishing the baseline for a waste-storage system, and provides guidance on the selection and implementation of potential base-lining technologies. The approach comprises two stages: assessment of base-lining needs and definition of the base-lining approach. During the assessment of base-lining needs, a review of available monitoring data and store/package records should be undertaken (if the store is operational). Evolutionary processes (affecting safety functions), and the corresponding indicators that can be measured to provide a baseline for the waste-storage system, should then be identified so that the most suitable indicators can be selected for base-lining. In defining the approach, opportunities to collect data, and constraints on doing so, are identified before selecting the techniques for base-lining and developing a base-lining plan. Base-lining data may be used to establish that the state of the packages is consistent with the waste acceptance criteria for the storage facility and to support the interpretation of monitoring and inspection data collected during store operations. Opportunities and constraints are identified for different store and package types. Technologies that could potentially be used to measure baseline indicators are also reviewed. (authors)

  19. Factors influencing changes in health related quality of life of caregivers of persons with multiple chronic conditions.

    Science.gov (United States)

    Duggleby, Wendy; Williams, Allison; Ghosh, Sunita; Moquin, Heather; Ploeg, Jenny; Markle-Reid, Maureen; Peacock, Shelley

    2016-05-27

    The majority of care for older adults with multiple chronic conditions (MCC) is provided by family caregivers (including friends). Although caregivers have reported positive benefits to caregiving, they also experience decreases in their physical and mental health. As there is a critical need for supportive interventions for this population, it is important to know what influences the health of family caregivers of persons with MCC. This research examined relationships among the changes from baseline to 6 months in health related quality of life (SF12v2) of family caregivers caring for older adults with multiple chronic conditions and the following factors: a) demographic variables, b) gender identity [Bem Sex Role Inventory (BSRI)], c) changes in general self-efficacy [General Self Efficacy Scale (GSES)] from baseline to 6 months, and d) changes in caregiver burden [Zarit Burden Inventory (ZBI)] from baseline to 6 months. Specific hypotheses were based on a conceptual framework generated from a literature review. This is a secondary analysis of a study of 194 family caregivers who were recruited from two Canadian provinces, Alberta and Ontario. Data were collected in person, by telephone, by Skype or by mail at two time periods spaced 6 months apart. The sample size for this secondary analysis was n = 185, as 9 participants had dropped out of the study at 6 months. Changes in the scores between the two time periods were calculated for the SF12v2 physical component score (PCS) and mental component score (MCS) and the other main variables. Generalized linear modeling was then used to determine factors associated with changes in HRQL. Participants who had significantly positive increases in their MCS (baseline to 6 months) reported lower burden (ZBI, p gender identity (which incorporates assertive and instrumental approaches to caregiving), and confidence in the ability to deal with difficult situations was positively related to improvement in mental health for caregivers of

  20. Interventional Effects for Mediation Analysis with Multiple Mediators.

    Science.gov (United States)

    Vansteelandt, Stijn; Daniel, Rhian M

    2017-03-01

    The mediation formula for the identification of natural (in)direct effects has facilitated mediation analyses that better respect the nature of the data, with greater consideration of the need for confounding control. The default assumptions on which it relies are strong, however. In particular, they are known to be violated when confounders of the mediator-outcome association are affected by the exposure. This complicates extensions of counterfactual-based mediation analysis to settings that involve repeatedly measured mediators, or multiple correlated mediators. VanderWeele, Vansteelandt, and Robins introduced so-called interventional (in)direct effects. These can be identified under much weaker conditions than natural (in)direct effects, but have the drawback of not adding up to the total effect. In this article, we adapt their proposal to achieve an exact decomposition of the total effect, and extend it to the multiple mediator setting. Interestingly, the proposed effects capture the path-specific effects of an exposure on an outcome that are mediated by distinct mediators, even when, as is often the case, the structural dependence between the multiple mediators is unknown, for instance when the direction of the causal effects between the mediators is unknown or there may be unmeasured common causes of the mediators.

  1. Multiple Criteria and Multiple Periods Performance Analysis: The Comparison of North African Railways

    Science.gov (United States)

    Sabri, Karim; Colson, Gérard E.; Mbangala, Augustin M.

    2008-10-01

    Multi-period differences in the technical and financial performances of five North African railways are analysed over the period 1990-2004. A first approach is based on the Malmquist DEA TFP index for measuring total factor productivity change, decomposed into technical efficiency change and technological change. A multiple criteria analysis is also performed using the PROMETHEE II method and the software ARGOS. These methods provide complementary, detailed information: Malmquist discriminates between technological and managerial progress, while PROMETHEE captures two dimensions of performance, service to the community and enterprise performance, which are often in conflict.

  2. Analysis of the thermal balance characteristics for multiple-connected piezoelectric transformers.

    Science.gov (United States)

    Park, Joung-Hu; Cho, Bo-Hyung; Choi, Sung-Jin; Lee, Sang-Min

    2009-08-01

    Because the amount of power that a piezoelectric transformer (PT) can handle is limited, multiple connections of PTs are necessary to improve the power capacity of PT applications. In such connections, thermal imbalance between the PTs should be prevented to avoid thermal runaway of each PT. The thermal balance of multiple-connected PTs is dominated by the electrothermal characteristics of the individual PTs. In this paper, the thermal balance of both parallel-parallel and parallel-series connections is analyzed in terms of electrical model parameters. For quantitative analysis, the thermal-balance effects are estimated by simulating the mechanical loss ratio between the PTs. The analysis shows that, with PTs of similar characteristics, the parallel-series connection has better thermal balance due to the reduced mechanical loss of the higher-temperature PT. For experimental verification of the analysis, a hardware-prototype test of a Cs-Lp type 40 W adapter system with radial-vibration mode PTs has been performed.

  3. Multiple Imputation of a Randomly Censored Covariate Improves Logistic Regression Analysis.

    Science.gov (United States)

    Atem, Folefac D; Qian, Jing; Maye, Jacqueline E; Johnson, Keith A; Betensky, Rebecca A

    2016-01-01

    Randomly censored covariates arise frequently in epidemiologic studies. The most commonly used methods, including complete case analysis and single imputation or substitution, suffer from inefficiency and bias: they make strong parametric assumptions or handle limit-of-detection censoring only. We employ multiple imputation, in conjunction with semi-parametric modeling of the censored covariate, to overcome these shortcomings and to facilitate robust estimation. We develop a multiple imputation approach for randomly censored covariates within the framework of a logistic regression model. We use the non-parametric estimate of the covariate distribution or the semiparametric Cox model estimate in the presence of additional covariates in the model. We evaluate this procedure in simulations, and compare its operating characteristics to those of the complete case analysis and a survival regression approach. We apply the procedures to an Alzheimer's study of the association between amyloid positivity and maternal age of onset of dementia. Multiple imputation achieves lower standard errors and higher power than the complete case approach under heavy and moderate censoring and is comparable under light censoring. The survival regression approach achieves the highest power among all procedures, but does not produce interpretable estimates of association. Multiple imputation offers a favorable alternative to complete case analysis and ad hoc substitution methods in the presence of randomly censored covariates within the framework of logistic regression.
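A minimal sketch of the general idea, not the authors' implementation: draw each censored covariate value from the tail of a Kaplan-Meier estimate, refit the logistic model on each completed dataset, and average the coefficients. The data are simulated and the pooling is simplified relative to Rubin's rules.

```python
import numpy as np

rng = np.random.default_rng(0)

def km_survival(t, event):
    """Kaplan-Meier survival estimate; returns unique event times and S(t)."""
    order = np.argsort(t)
    t, event = t[order], event[order]
    uniq = np.unique(t[event == 1])
    S, surv = 1.0, []
    for u in uniq:
        at_risk = np.sum(t >= u)
        d = np.sum((t == u) & (event == 1))
        S *= 1.0 - d / at_risk
        surv.append(S)
    return uniq, np.array(surv)

def impute_once(x, observed, times, surv, rng):
    """Draw each censored value from the KM-estimated distribution beyond it."""
    x = x.copy()
    s_prev = np.concatenate(([1.0], surv[:-1]))
    jump = s_prev - surv                  # KM probability mass at each event time
    for i in np.where(~observed)[0]:
        tail = times > x[i]
        if not tail.any():                # censored beyond the last event: keep
            continue
        p = jump[tail] / jump[tail].sum()
        x[i] = rng.choice(times[tail], p=p)
    return x

def logit_fit(xcol, y, iters=50):
    """Plain Newton-Raphson logistic regression (intercept + slope)."""
    Z = np.column_stack([np.ones_like(xcol), xcol])
    b = np.zeros(2)
    for _ in range(iters):
        p = 1 / (1 + np.exp(-Z @ b))
        W = p * (1 - p)
        b += np.linalg.solve(Z.T * W @ Z, Z.T @ (y - p))
    return b

# Simulated data: covariate x_true, random censoring at c, binary outcome y.
n = 500
x_true = rng.exponential(1.0, n)
c = rng.exponential(2.0, n)
observed = x_true <= c
x = np.where(observed, x_true, c)
y = (rng.random(n) < 1 / (1 + np.exp(-(0.5 * x_true - 0.5)))).astype(float)

times, surv = km_survival(x, observed.astype(int))
# Multiple imputation: fit the model on several completed datasets and pool.
coefs = [logit_fit(impute_once(x, observed, times, surv, rng), y)
         for _ in range(10)]
b_pooled = np.mean(coefs, axis=0)
print(np.round(b_pooled, 3))
```

Proper pooling would also combine within- and between-imputation variances (Rubin's rules); only the point estimate is averaged here for brevity.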

  4. Causal Mediation Analysis for the Cox Proportional Hazards Model with a Smooth Baseline Hazard Estimator.

    Science.gov (United States)

    Wang, Wei; Albert, Jeffrey M

    2017-08-01

    An important problem within the social, behavioral, and health sciences is how to partition an exposure effect (e.g. treatment or risk factor) among specific pathway effects and to quantify the importance of each pathway. Mediation analysis based on the potential outcomes framework is an important tool to address this problem and we consider the estimation of mediation effects for the proportional hazards model in this paper. We give precise definitions of the total effect, natural indirect effect, and natural direct effect in terms of the survival probability, hazard function, and restricted mean survival time within the standard two-stage mediation framework. To estimate the mediation effects on different scales, we propose a mediation formula approach in which simple parametric models (fractional polynomials or restricted cubic splines) are utilized to approximate the baseline log cumulative hazard function. Simulation study results demonstrate low bias of the mediation effect estimators and close-to-nominal coverage probability of the confidence intervals for a wide range of complex hazard shapes. We apply this method to the Jackson Heart Study data and conduct sensitivity analysis to assess the impact on the mediation effects inference when the no unmeasured mediator-outcome confounding assumption is violated.

  5. Multifractal detrended fluctuation analysis of analog random multiplicative processes

    Energy Technology Data Exchange (ETDEWEB)

    Silva, L.B.M.; Vermelho, M.V.D. [Instituto de Fisica, Universidade Federal de Alagoas, Maceio - AL, 57072-970 (Brazil); Lyra, M.L. [Instituto de Fisica, Universidade Federal de Alagoas, Maceio - AL, 57072-970 (Brazil)], E-mail: marcelo@if.ufal.br; Viswanathan, G.M. [Instituto de Fisica, Universidade Federal de Alagoas, Maceio - AL, 57072-970 (Brazil)

    2009-09-15

    We investigate non-Gaussian statistical properties of stationary stochastic signals generated by an analog circuit that simulates a random multiplicative process with weak additive noise. The random noises originate from thermal shot noise and avalanche processes, while the multiplicative process is generated by a fully analog circuit. The resulting signal describes stochastic time series of current interest in several areas such as turbulence, finance, biology and the environment, which exhibit power-law distributions. Specifically, we study the correlation properties of the signal by employing a detrended fluctuation analysis and explore its multifractal nature. The singularity spectrum is obtained and analyzed as a function of the control circuit parameter that tunes the asymptotic power-law form of the probability distribution function.
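The second-moment (q = 2) core of detrended fluctuation analysis can be sketched as below; the multifractal version generalizes the same fluctuation function to other moments q. The white-noise test signal is illustrative only, and its scaling exponent should come out near 0.5.

```python
import numpy as np

def dfa_fluctuation(signal, scales, order=1):
    """Detrended fluctuation analysis: fluctuation F(s) per window size s."""
    profile = np.cumsum(signal - np.mean(signal))   # integrated profile
    F = []
    for s in scales:
        n_win = len(profile) // s
        ms = []
        for w in range(n_win):
            seg = profile[w * s:(w + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, order)        # local polynomial trend
            ms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(ms)))
    return np.array(F)

rng = np.random.default_rng(1)
white = rng.standard_normal(2 ** 12)
scales = np.array([16, 32, 64, 128, 256])
F = dfa_fluctuation(white, scales)
# Scaling exponent alpha from the log-log slope; ~0.5 for uncorrelated noise.
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
print(round(alpha, 2))
```

In the multifractal variant (MF-DFA), the mean of the squared residuals is raised to q/2 before averaging, yielding a q-dependent exponent h(q) from which the singularity spectrum follows.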

  6. Delta Healthy Sprouts: Participants' Diet and Food Environment at Baseline

    Science.gov (United States)

    Local food environments influence the nutrition and health of area residents. This baseline analysis focuses on the food environments of women who participated in the Delta Healthy Sprouts project, a randomized, controlled, comparative trial designed to test the efficacy of two Maternal, Infant, an...

  7. What is different about workers' compensation patients? Socioeconomic predictors of baseline disability status among patients with lumbar radiculopathy.

    Science.gov (United States)

    Atlas, Steven J; Tosteson, Tor D; Hanscom, Brett; Blood, Emily A; Pransky, Glenn S; Abdu, William A; Andersson, Gunnar B; Weinstein, James N

    2007-08-15

    Combined analysis of 2 prospective clinical studies. To identify socioeconomic characteristics associated with workers' compensation in patients with an intervertebral disc herniation (IDH) or spinal stenosis (SpS). Few studies have compared socioeconomic differences between those receiving or not receiving workers' compensation with the same underlying clinical conditions. Patients were identified from the Spine Patient Outcomes Research Trial (SPORT) and the National Spine Network (NSN) practice-based outcomes study. Patients with IDH and SpS within NSN were identified satisfying SPORT eligibility criteria. Information on disability and work status at baseline evaluation was used to categorize patients into 3 groups: workers' compensation, other disability compensation, or work-eligible controls. Enrollment rates of patients with disability in a clinical efficacy trial (SPORT) and practice-based network (NSN) were compared. Independent socioeconomic predictors of baseline workers' compensation status were identified in multivariate logistic regression models controlling for clinical condition, study cohort, and initial treatment designation. Among 3759 eligible patients (1480 in SPORT and 2279 in NSN), 564 (15%) were receiving workers' compensation, 317 (8%) were receiving other disability compensation, and 2878 (77%) were controls. Patients receiving workers' compensation were less common in SPORT than NSN (9.2% vs. 18.8%, P socioeconomic characteristics significantly differed according to baseline workers' compensation status. In multiple logistic regression analyses, gender, educational level, work characteristics, legal action, and expectations about ability to work without surgery were independently associated with receiving workers' compensation. Clinical trials involving conditions commonly seen in patients with workers' compensation may need special efforts to ensure adequate representation. Socioeconomic characteristics markedly differed between patients

  8. Methodological issues underlying multiple decrement life table analysis.

    Science.gov (United States)

    Mode, C J; Avery, R C; Littman, G S; Potter, R G

    1977-02-01

    In this paper, the actuarial method of multiple decrement life table analysis of censored, longitudinal data is examined. The discussion is organized in terms of the first segment of usage of an intrauterine device. Weaknesses of the actuarial approach are pointed out, and an alternative approach, based on the classical model of competing risks, is proposed. Finally, the actuarial and the alternative method of analyzing censored data are compared, using data from the Taichung Medical Study on Intrauterine Devices.
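A hypothetical numerical illustration of the actuarial multiple decrement life table discussed above (the figures are invented, not from the Taichung study): two competing causes of IUD discontinuation, with censored cases credited half an interval of exposure, yield interval decrement probabilities and a continuation curve. The paper's point is that this actuarial convention can bias cause-specific probabilities, motivating the competing-risks alternative.

```python
import numpy as np

# Hypothetical interval data for the first segment of IUD use: numbers entering
# each interval, discontinuations from two competing causes, and censored cases.
intervals = ["0-3", "3-6", "6-9", "9-12"]           # months
entering  = np.array([1000, 880, 790, 720])
cause_a   = np.array([60, 45, 30, 25])              # e.g. expulsion
cause_b   = np.array([40, 30, 25, 20])              # e.g. removal
censored  = np.array([20, 15, 15, 10])

# Actuarial convention: censored cases count as exposed for half the interval.
exposure = entering - censored / 2.0
q_a = cause_a / exposure                            # cause-specific probabilities
q_b = cause_b / exposure
q_all = q_a + q_b
p_continue = np.cumprod(1.0 - q_all)                # continuation probabilities
for iv, p in zip(intervals, p_continue):
    print(f"through {iv} months: {p:.3f}")
```

The competing-risks formulation would instead model cause-specific hazards directly rather than splitting exposure by convention.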

  9. A fully automated algorithm of baseline correction based on wavelet feature points and segment interpolation

    Science.gov (United States)

    Qian, Fang; Wu, Yihui; Hao, Peng

    2017-11-01

    Baseline correction is a very important part of pre-processing. A baseline in the spectral signal induces uneven amplitude shifts across different wavenumbers and degrades subsequent analysis, so these shifts should be compensated before further processing. Many algorithms are used to remove baselines; however, fully automated baseline correction is more convenient in practical applications. A fully automated algorithm based on wavelet feature points and segment interpolation (AWFPSI) is proposed. This algorithm finds feature points through continuous wavelet transformation and estimates the baseline through segment interpolation. AWFPSI is compared with three commonly used fully automated and semi-automated algorithms on simulated, visible, and Raman spectra. The results show that AWFPSI gives better accuracy and has the advantage of easy use.
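The segment-interpolation half of such an algorithm is easy to sketch. In the toy example below, the wavelet feature-point detection is replaced by a crude local-minimum scan (an assumption for illustration only); the baseline estimate is then linear interpolation through the anchor points, subtracted from the signal.

```python
import numpy as np

# Synthetic spectrum: two Gaussian peaks riding on a slowly varying baseline.
x = np.linspace(0, 1000, 1001)
peaks = 80 * np.exp(-(x - 300) ** 2 / 200) + 50 * np.exp(-(x - 700) ** 2 / 400)
baseline_true = 20 + 0.03 * x + 10 * np.sin(x / 250)
signal = peaks + baseline_true

# Stand-in for the wavelet feature-point step: pick anchor points in
# peak-free regions via a coarse local-minimum scan over 100-sample windows.
anchors = [0]
for start in range(0, len(x) - 100, 100):
    seg = slice(start, start + 100)
    anchors.append(start + int(np.argmin(signal[seg])))
anchors.append(len(x) - 1)
anchors = np.unique(anchors)

# Segment interpolation through the anchor points estimates the baseline.
baseline_est = np.interp(x, x[anchors], signal[anchors])
corrected = signal - baseline_est

err = float(np.max(np.abs(baseline_est - baseline_true)))
print(round(err, 1))
```

The published method differs in how it finds the feature points (continuous wavelet transform rather than a minimum scan), but the interpolation-and-subtract step is the same shape.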

  10. Energy Consumption Analysis for Concrete Residences—A Baseline Study in Taiwan

    Directory of Open Access Journals (Sweden)

    Kuo-Liang Lin

    2017-02-01

    Estimating building energy consumption is difficult because it involves complex interactions among uncertain weather conditions, occupant behaviors, and building characteristics. To facilitate estimation, this study employs a benchmarking methodology to obtain energy baselines for sample buildings. Utilizing a scientific simulation tool, this study develops energy consumption baselines for two typical concrete residences in Taiwan, subsequently allowing a simplified energy consumption prediction process at an early design stage of building development. Using weather data from three metropolitan cities as testbeds, the annual energy consumption of two types of modern residences is determined through a series of simulation sessions with different building settings. The impacts of key building characteristics, including building insulation, air tightness, orientation, location, and residence type, are carefully investigated. Sample utility bills are then collected to validate the simulated results, yielding three adjustment parameters for normalization, 'number of residents', 'total floor area', and 'air conditioning comfort level', to account for occupant behaviors in different living conditions. The study results not only provide valuable benchmarking data serving as references for performance evaluation of different energy-saving strategies, but also show how effective extended building insulation, enhanced air tightness, and prudent selection of residence location and orientation can be for successful implementation of building sustainability in tropical and subtropical regions.
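One plausible form of such a normalization (the functional form, reference values, and numbers below are assumptions for illustration, not taken from the study) scales a measured bill by the three adjustment parameters before comparing it against the simulated baseline:

```python
# Hypothetical normalization of a measured utility bill against a simulated
# baseline, using the three adjustment parameters named in the abstract.
def normalized_eui(annual_kwh, floor_area_m2, residents, comfort_factor,
                   ref_residents=4, ref_comfort=1.0):
    """Energy use intensity (kWh/m2/yr) scaled to reference occupancy/comfort."""
    occupancy_scale = ref_residents / residents
    comfort_scale = ref_comfort / comfort_factor
    return annual_kwh / floor_area_m2 * occupancy_scale * comfort_scale

measured = normalized_eui(9000, 120, 3, 1.1)    # a sample household
baseline = normalized_eui(8200, 120, 4, 1.0)    # simulated reference residence
print(round(measured / baseline, 2))
```

A ratio above 1 would flag the household as consuming more than its simulated baseline after adjusting for occupancy and comfort level.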

  11. Hazard Baseline Downgrade Effluent Treatment Facility

    International Nuclear Information System (INIS)

    Blanchard, A.

    1998-01-01

    This Hazard Baseline Downgrade reviews the Effluent Treatment Facility (ETF) in accordance with Department of Energy Order 5480.23, the WSRC11Q Facility Safety Document Manual, DOE-STD-1027-92, and DOE-EM-STD-5502-94. It provides a baseline grouping based on the chemical and radiological hazards associated with the facility. The determination of the baseline grouping for ETF will aid in establishing the appropriate set of standards for the facility.

  12. Analysis of multiple spurious actuations and associated circuits in Cofrentes

    International Nuclear Information System (INIS)

    Molina, J. J.; Celaya, M. A.

    2015-01-01

    The article describes the process followed by the Cofrentes Nuclear Power Plant (CNC) to conduct the analysis of multiple spurious actuations in compliance with regulatory standard IS-30 rev. 1 and CSN Safety Guide 1.19, based on the recommendations of NEI 00-01, Guidance for Post-Fire Safe Shutdown Circuit Analysis, and NUREG/CR-6850, Fire PRA Methodology for Nuclear Power Facilities. (Author)

  13. Atmospheric pressure loading parameters from very long baseline interferometry observations

    Science.gov (United States)

    Macmillan, D. S.; Gipson, John M.

    1994-01-01

    Atmospheric mass loading produces a primarily vertical displacement of the Earth's crust. This displacement is correlated with surface pressure and is large enough to be detected by very long baseline interferometry (VLBI) measurements. Using the measured surface pressure at VLBI stations, we have estimated the atmospheric loading term for each station location directly from VLBI data acquired from 1979 to 1992. Our estimates of the vertical sensitivity to change in pressure range from 0 to -0.6 mm/mbar depending on the station. These estimates agree with inverted barometer model calculations (Manabe et al., 1991; van Dam and Herring, 1994) of the vertical displacement sensitivity computed by convolving actual pressure distributions with loading Green's functions. The pressure sensitivity tends to be smaller for stations near the coast, which is consistent with the inverted barometer hypothesis. Applying this estimated pressure loading correction in standard VLBI geodetic analysis improves the repeatability of estimated lengths of 25 out of 37 baselines that were measured at least 50 times. In a root-sum-square (rss) sense, the improvement generally increases with baseline length at a rate of about 0.3 to 0.6 ppb depending on whether the baseline stations are close to the coast. For the 5998-km baseline from Westford, Massachusetts, to Wettzell, Germany, the rss improvement is about 3.6 mm out of 11.0 mm. The average rss reduction of the vertical scatter for inland stations ranges from 2.7 to 5.4 mm.
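The per-station correction implied by these sensitivities is a one-line calculation. The admittance values below are hypothetical placeholders within the 0 to -0.6 mm/mbar range quoted in the abstract, with a smaller magnitude assigned to the coastal station in line with the inverted barometer behavior described.

```python
# Hypothetical per-station pressure admittances (mm/mbar); station names and
# values are illustrative, not the paper's estimates.
admittance = {"WESTFORD": -0.45, "WETTZELL": -0.35, "COASTAL_X": -0.10}
p_ref = 1013.25                       # reference surface pressure, mbar

def loading_correction(station, p_obs):
    """Vertical crustal displacement (mm) attributed to pressure loading."""
    return admittance[station] * (p_obs - p_ref)

# A 20 mbar high-pressure anomaly depresses the site by several millimetres.
dz = loading_correction("WETTZELL", 1033.25)
print(round(dz, 3))   # → -7.0
```

In the analysis itself this term is estimated from, and then applied to, the VLBI station heights rather than taken from fixed coefficients.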

  14. Retinal layer segmentation in multiple sclerosis: a systematic review and meta-analysis.

    Science.gov (United States)

    Petzold, Axel; Balcer, Laura J; Calabresi, Peter A; Costello, Fiona; Frohman, Teresa C; Frohman, Elliot M; Martinez-Lapiscina, Elena H; Green, Ari J; Kardon, Randy; Outteryck, Olivier; Paul, Friedemann; Schippling, Sven; Vermersch, Patrik; Villoslada, Pablo; Balk, Lisanne J

    2017-10-01

    Structural retinal imaging biomarkers are important for early recognition and monitoring of inflammation and neurodegeneration in multiple sclerosis. With the introduction of spectral domain optical coherence tomography (SD-OCT), supervised automated segmentation of individual retinal layers is possible. We aimed to investigate which retinal layers show atrophy associated with neurodegeneration in multiple sclerosis when measured with SD-OCT. In this systematic review and meta-analysis, we searched for studies in which SD-OCT was used to look at the retina in people with multiple sclerosis with or without optic neuritis in PubMed, Web of Science, and Google Scholar between Nov 22, 1991, and April 19, 2016. Data were taken from cross-sectional cohorts and from one timepoint from longitudinal studies (at least 3 months after onset in studies of optic neuritis). We classified data on eyes into healthy controls, multiple-sclerosis-associated optic neuritis (MSON), and multiple sclerosis without optic neuritis (MSNON). We assessed thickness of the retinal layers and we rated individual layer segmentation performance by random effects meta-analysis for MSON eyes versus control eyes, MSNON eyes versus control eyes, and MSNON eyes versus MSON eyes. We excluded relevant sources of bias by funnel plots. Of 25 497 records identified, 110 articles were eligible and 40 reported data (in total 5776 eyes from patients with multiple sclerosis [1667 MSON eyes and 4109 MSNON eyes] and 1697 eyes from healthy controls) that met published OCT quality control criteria and were suitable for meta-analysis. Compared with control eyes, the peripapillary retinal nerve fibre layer (RNFL) showed thinning in MSON eyes (mean difference -20·10 μm, 95% CI -22·76 to -17·44; pmultiple sclerosis and control eyes were found in the peripapillary RNFL and macular GCIPL. Inflammatory disease activity might be captured by the INL. Because of the consistency, robustness, and large effect size, we

  15. Predictors of postoperative outcomes of cubital tunnel syndrome treatments using multiple logistic regression analysis.

    Science.gov (United States)

    Suzuki, Taku; Iwamoto, Takuji; Shizu, Kanae; Suzuki, Katsuji; Yamada, Harumoto; Sato, Kazuki

    2017-05-01

    This retrospective study was designed to investigate prognostic factors for postoperative outcomes of cubital tunnel syndrome (CubTS) using multiple logistic regression analysis with a large number of patients. Eighty-three patients with CubTS who underwent surgery were enrolled. The following potential prognostic factors for disease severity were selected according to previous reports: sex, age, type of surgery, disease duration, body mass index, cervical lesion, presence of diabetes mellitus, Workers' Compensation status, preoperative severity, and preoperative electrodiagnostic testing. Postoperative severity of disease was assessed 2 years after surgery by Messina's criteria, an outcome measure specific to CubTS. Bivariate analysis was performed to select candidate prognostic factors for the multiple regression analyses. Multiple logistic regression analysis was then conducted to identify associations between postoperative severity and the selected prognostic factors. Both bivariate and multiple regression analyses revealed only preoperative severity as an independent risk factor for poor prognosis, while other factors did not show any significant association. Although conflicting results exist regarding the prognosis of CubTS, this study supports evidence from previous studies and concludes that early surgical intervention portends the most favorable prognosis. Copyright © 2017 The Japanese Orthopaedic Association. Published by Elsevier B.V. All rights reserved.

  16. Baseline-dependent averaging in radio interferometry

    Science.gov (United States)

    Wijnholds, S. J.; Willis, A. G.; Salvini, S.

    2018-05-01

    This paper presents a detailed analysis of the applicability and benefits of baseline-dependent averaging (BDA) in modern radio interferometers and in particular the Square Kilometre Array. We demonstrate that BDA does not affect the information content of the data other than a well-defined decorrelation loss for which closed form expressions are readily available. We verify these theoretical findings using simulations. We therefore conclude that BDA can be used reliably in modern radio interferometry allowing a reduction of visibility data volume (and hence processing costs for handling visibility data) by more than 80 per cent.
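The "well-defined decorrelation loss" has a textbook closed form: the amplitude loss from time averaging is approximately a sinc of the fringe phase swept during the averaging interval. The sketch below uses that standard approximation (worst-case east-west geometry assumed; the paper's exact expressions are not reproduced here) to pick a per-baseline averaging time.

```python
import numpy as np

OMEGA_E = 7.292e-5            # Earth rotation rate, rad/s
C = 299792458.0               # speed of light, m/s

def decorrelation(baseline_m, t_avg_s, freq_hz):
    """Worst-case time-smearing amplitude factor: sinc of the fringe phase
    swept during one averaging interval (textbook approximation)."""
    lam = C / freq_hz
    fringe_rate = OMEGA_E * baseline_m / lam      # max fringes per second
    return np.sinc(fringe_rate * t_avg_s)         # np.sinc(x) = sin(pi x)/(pi x)

def max_averaging_time(baseline_m, freq_hz, max_loss=0.01):
    """Longest averaging time that keeps decorrelation loss below max_loss."""
    t = 0.1
    while decorrelation(baseline_m, 2 * t, freq_hz) >= 1 - max_loss:
        t *= 2
    return t

# Short baselines tolerate much longer averaging than long ones,
# which is the visibility-data-volume saving that BDA exploits.
for b in (100.0, 1000.0, 10000.0, 100000.0):
    print(b, max_averaging_time(b, 150e6))
```

Doubling the averaging time per baseline until the loss budget is hit mirrors how BDA schemes quantize averaging intervals in practice.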

  17. Profile of NASA software engineering: Lessons learned from building the baseline

    Science.gov (United States)

    Hall, Dana; Mcgarry, Frank

    1993-01-01

    It is critically important in any improvement activity to first understand the organization's current status, strengths, and weaknesses and, only after that understanding is achieved, examine and implement promising improvements. This fundamental rule is certainly true for an organization seeking to further its software viability and effectiveness. This paper addresses the role of the organizational process baseline in a software improvement effort and the lessons we learned assembling such an understanding for NASA overall and for the NASA Goddard Space Flight Center in particular. We discuss important, core data that must be captured and contrast that with our experience in actually finding such information. Our baselining efforts have evolved into a set of data gathering, analysis, and crosschecking techniques and information presentation formats that may prove useful to others seeking to establish similar baselines for their organization.

  18. Data analysis and pattern recognition in multiple databases

    CERN Document Server

    Adhikari, Animesh; Pedrycz, Witold

    2014-01-01

    Pattern recognition in data is a well-known classical problem that falls under the ambit of data analysis. As we need to handle different data, the nature of the patterns, their recognition and the types of data analyses are bound to change. As data-collection channels have recently multiplied and diversified, many real-world data mining tasks can easily acquire multiple databases from various sources. In these cases, data mining becomes more challenging for several essential reasons. We may encounter sensitive data originating from different sources that cannot be amalgamated. Even if we are allowed to place different data together, we are certainly not able to analyse them when local identities of patterns are required to be retained. Thus, pattern recognition in multiple databases gives rise to a suite of new, challenging problems different from those encountered before. Association rule mining, global pattern discovery, and mining patterns of select items provide different...

  19. Development of Regional Climate Mitigation Baseline for a Dominant Agro-Ecological Zone of Karnataka, India

    Energy Technology Data Exchange (ETDEWEB)

    Sudha, P.; Shubhashree, D.; Khan, H.; Hedge, G.T.; Murthy, I.K.; Shreedhara, V.; Ravindranath, N.H.

    2007-06-01

    Setting a baseline for carbon stock changes in forest and land use sector mitigation projects is an essential step for assessing additionality of the project. There are two approaches for setting baselines, namely project-specific and regional baselines. This paper presents the methodology adopted for estimating the land available for mitigation, for developing a regional baseline, the transaction cost involved, and a comparison of project-specific and regional baselines. The study showed that it is possible to estimate the potential land and its suitability for afforestation and reforestation mitigation projects, using existing maps and data, in the dry zone of Karnataka, southern India. The study adopted a three-step approach for developing a regional baseline, namely: i) identification of likely baseline options for land use, ii) estimation of baseline rates of land-use change, and iii) quantification of the baseline carbon profile over time. The analysis showed that carbon stock estimates made for wastelands and fallow lands for the project-specific as well as the regional baseline are comparable. The ratio of wasteland carbon stocks of a project to the regional baseline is 1.02, and that of fallow lands in the project to the regional baseline is 0.97. The cost of conducting field studies for determination of a regional baseline is about a quarter of the cost of developing a project-specific baseline on a per-hectare basis. The study has shown the reliability, feasibility and cost-effectiveness of adopting regional baselines for forestry sector mitigation projects.

  20. Steady State Analysis of Stochastic Systems with Multiple Time Delays

    Science.gov (United States)

    Xu, W.; Sun, C. Y.; Zhang, H. Q.

    In this paper, attention is focused on the steady state analysis of a class of nonlinear dynamic systems with multi-delayed feedbacks driven by multiplicative correlated Gaussian white noises. The Fokker-Planck equations for the delayed variables are first derived by Novikov's theorem. Then, under a small-delay assumption, the approximate stationary solutions are obtained by the probability density approach. As a special case, the effects of multi-delay feedbacks and of correlated additive and multiplicative Gaussian white noises on the response of a bistable system are considered. The analytical results are shown to be in good agreement with Monte Carlo simulation results.

  1. Deep sequencing analysis of HIV-1 reverse transcriptase at baseline and time of failure in patients receiving rilpivirine in the phase III studies ECHO and THRIVE.

    Science.gov (United States)

    Van Eygen, Veerle; Thys, Kim; Van Hove, Carl; Rimsky, Laurence T; De Meyer, Sandra; Aerssens, Jeroen; Picchio, Gaston; Vingerhoets, Johan

    2016-05-01

    Minority variants (1.0-25.0%) were evaluated by deep sequencing (DS) at baseline and virological failure (VF) in a selection of antiretroviral treatment-naïve, HIV-1-infected patients from the rilpivirine ECHO/THRIVE phase III studies. Linkage between frequently emerging resistance-associated mutations (RAMs) was determined. DS (Illumina®) and population sequencing (PS) results were available at baseline for 47 VFs and time of failure for 48 VFs; and at baseline for 49 responders matched for baseline characteristics. Minority mutations were accurately detected at frequencies down to 1.2% of the HIV-1 quasispecies. No baseline minority rilpivirine RAMs were detected in VFs; one responder carried 1.9% F227C. Baseline minority mutations associated with resistance to other non-nucleoside reverse transcriptase inhibitors (NNRTIs) were detected in 8/47 VFs (17.0%) and 7/49 responders (14.3%). Baseline minority nucleoside/nucleotide reverse transcriptase inhibitor (NRTI) RAMs M184V and L210W were each detected in one VF (none in responders). At failure, two patients without NNRTI RAMs by PS carried minority rilpivirine RAMs K101E and/or E138K; and five additional patients carried other minority NNRTI RAMs V90I, V106I, V179I, V189I, and Y188H. Overall at failure, minority NNRTI RAMs and NRTI RAMs were found in 29/48 (60.4%) and 16/48 VFs (33.3%), respectively. Linkage analysis showed that E138K and K101E were usually not observed on the same viral genome. In conclusion, baseline minority rilpivirine RAMs and other NNRTI/NRTI RAMs were uncommon in the rilpivirine arm of the ECHO and THRIVE studies. DS at failure showed emerging NNRTI resistant minority variants in seven rilpivirine VFs who had no detectable NNRTI RAMs by PS. © 2015 Wiley Periodicals, Inc.

  2. Pharmacogenetic meta-analysis of baseline risk factors, pharmacodynamic, efficacy and tolerability endpoints from two large global cardiovascular outcomes trials for darapladib.

    Directory of Open Access Journals (Sweden)

    Astrid Yeo

    Full Text Available Darapladib, a lipoprotein-associated phospholipase A2 (Lp-PLA2) inhibitor, failed to demonstrate efficacy for the primary endpoints in two large phase III cardiovascular outcomes trials, one in stable coronary heart disease patients (STABILITY) and one in acute coronary syndrome (SOLID-TIMI 52). No major safety signals were observed but tolerability issues of diarrhea and odor were common (up to 13%). We hypothesized that genetic variants associated with Lp-PLA2 activity may influence efficacy and tolerability and therefore performed a comprehensive pharmacogenetic analysis of both trials. We genotyped patients within the STABILITY and SOLID-TIMI 52 trials who provided a DNA sample and consent (n = 13,577 and 10,404 respectively, representing 86% and 82% of the trial participants) using genome-wide arrays with exome content and performed imputation using a 1000 Genomes reference panel. We investigated baseline and change from baseline in Lp-PLA2 activity, two efficacy endpoints (major coronary events and myocardial infarction) as well as tolerability parameters at genome-wide and candidate gene level using a meta-analytic approach. We replicated associations of published loci on baseline Lp-PLA2 activity (APOE, CELSR2, LPA, PLA2G7, LDLR and SCARB1) and identified three novel loci (TOMM5, FRMD5 and LPL) using the GWAS-significance threshold P≤5E-08. Review of the PLA2G7 gene (encoding Lp-PLA2) within these datasets identified V279F null allele carriers as well as three other rare exonic null alleles within various ethnic groups; however, none of these variants nor any other loci associated with Lp-PLA2 activity at baseline were associated with any of the drug response endpoints. 
The analysis of darapladib efficacy endpoints, despite low power, identified six low frequency loci with main genotype effect (though with borderline imputation scores and one common locus (minor allele frequency 0.24 with genotype by treatment interaction effect passing the GWAS

  3. 3D fluid-structure modelling and vibration analysis for fault diagnosis of Francis turbine using multiple ANN and multiple ANFIS

    Science.gov (United States)

    Saeed, R. A.; Galybin, A. N.; Popov, V.

    2013-01-01

    This paper discusses condition monitoring and fault diagnosis in Francis turbines based on the integration of numerical modelling with several different artificial intelligence (AI) techniques. In this study, a numerical approach for fluid-structure (turbine runner) analysis is presented. The results of the numerical analysis provide frequency response function (FRF) data sets along the x-, y- and z-directions under different operating loads and for different positions and sizes of faults in the structure. To extract features and reduce the dimensionality of the obtained FRF data, principal component analysis (PCA) has been applied. Subsequently, the extracted features are formulated and fed into multiple artificial neural networks (ANN) and multiple adaptive neuro-fuzzy inference systems (ANFIS) in order to identify the size and position of the damage in the runner and estimate the turbine operating conditions. The results demonstrate the effectiveness of this approach, which provides satisfactory accuracy even when the input data are corrupted with a certain level of noise.
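
The PCA-plus-ANN stage can be sketched as follows. This is a hedged, numpy-only illustration on synthetic stand-ins for the FRF data sets: the sample counts, class labels, network size and training schedule are all assumptions (the paper trains multiple ANN/ANFIS models on FRFs from the fluid-structure simulation).

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 300, 200, 3                     # samples, FRF points, fault classes (assumed)
y = rng.integers(0, k, n)
X = rng.standard_normal((n, d)) + 2.0 * y[:, None]   # class-dependent fault signature

# PCA via SVD on centered data; keep 5 components and standardize them.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:5].T
Z = Z / Z.std(axis=0)

# One-hidden-layer network trained by full-batch gradient descent
# on softmax cross-entropy (a minimal stand-in for the paper's ANNs).
W1 = 0.1 * rng.standard_normal((5, 16)); b1 = np.zeros(16)
W2 = 0.1 * rng.standard_normal((16, k)); b2 = np.zeros(k)
Y = np.eye(k)[y]                          # one-hot targets
for _ in range(1000):
    H = np.tanh(Z @ W1 + b1)
    logits = H @ W2 + b2
    P = np.exp(logits - logits.max(axis=1, keepdims=True))
    P /= P.sum(axis=1, keepdims=True)
    G = (P - Y) / n                       # gradient of mean cross-entropy w.r.t. logits
    dW2 = H.T @ G; db2 = G.sum(axis=0)
    dH = G @ W2.T * (1 - H**2)            # backprop through tanh
    dW1 = Z.T @ dH; db1 = dH.sum(axis=0)
    lr = 0.5
    W1 -= lr * dW1; b1 -= lr * db1; W2 -= lr * dW2; b2 -= lr * db2

acc = (P.argmax(axis=1) == y).mean()      # training accuracy on the synthetic faults
```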

  4. Real-world Clinical Outcomes Among Patients With Type 2 Diabetes Receiving Canagliflozin at a Specialty Diabetes Clinic: Subgroup Analysis by Baseline HbA1c and Age.

    Science.gov (United States)

    Johnson, June Felice; Parsa, Rahul; Bailey, Robert A

    2017-06-01

    Canagliflozin, a sodium glucose co-transporter 2 inhibitor developed for the treatment of type 2 diabetes mellitus (T2DM), has demonstrated effectiveness in patients with T2DM receiving care at a specialty diabetes clinic. We report the outcomes in these patients in subgroups classified by baseline hemoglobin A 1c (HbA 1c ) and age. This subgroup analysis was based on a review of data from the electronic health records of adults with T2DM who were prescribed canagliflozin at a specialty diabetes clinic and who returned for ≥1 follow-up office visit. Mean changes from baseline to the first and second follow-up office visits in HbA 1c , body weight, and systolic and diastolic blood pressure (BP) were calculated in each subgroup classified by baseline HbA 1c (≥7.0%, ≥8.0%, and >9.0%) and age (baseline HbA 1c ≥7.0%, ≥8.0%, and >9.0%, respectively; 396 and 66 patients were aged baseline HbA 1c and age experienced clinically and statistically significant reductions from baseline in HbA 1c , body weight, and systolic BP that were sustained over 2 office visits; diastolic BP was also reduced across baseline HbA 1c and age subgroups. Greater reductions in HbA 1c were seen among the canagliflozin-treated patients with higher baseline HbA 1c and among younger versus older patients. These findings from clinical practice demonstrate real-world effectiveness of canagliflozin in lowering HbA 1c , body weight, and systolic BP among patients with T2DM, regardless of baseline HbA 1c levels or age. Copyright © 2017 Elsevier HS Journals, Inc. All rights reserved.

  5. Linear MALDI-ToF simultaneous spectrum deconvolution and baseline removal.

    Science.gov (United States)

    Picaud, Vincent; Giovannelli, Jean-Francois; Truntzer, Caroline; Charrier, Jean-Philippe; Giremus, Audrey; Grangeat, Pierre; Mercier, Catherine

    2018-04-05

    Thanks to a reasonable cost and a simple sample preparation procedure, linear MALDI-ToF spectrometry is a growing technology for clinical microbiology. With appropriate spectrum databases, this technology can be used for early identification of pathogens in body fluids. However, due to the low resolution of linear MALDI-ToF instruments, robust and accurate peak picking remains a challenging task. In this context we propose a new peak extraction algorithm for raw spectra. With this method the spectrum baseline and spectrum peaks are processed jointly. The approach relies on an additive model consisting of a smooth baseline plus a sparse peak list convolved with a known peak shape. The model is then fitted under a Gaussian noise assumption. The proposed method is well suited to processing low resolution spectra with a strong baseline and unresolved peaks. We developed a new peak deconvolution procedure. The paper describes the derivation of the method and discusses some of its interpretations. The algorithm is then described in pseudo-code form, where the required optimization procedure is detailed. For synthetic data the method is compared to a more conventional approach. The new method reduces artifacts caused by the usual two-step procedure of baseline removal followed by peak extraction. Finally, some results on real linear MALDI-ToF spectra are provided. We introduced a new method for peak picking, where peak deconvolution and baseline computation are performed jointly. On simulated data we showed that this global approach performs better than a classical one where baseline and peaks are processed sequentially. A dedicated experiment was conducted on real spectra: a collection of spectra of spiked proteins was acquired and then analyzed. Better performance of the proposed method, in terms of accuracy and reproducibility, was observed and validated by an extended statistical analysis.

  6. 10 CFR 850.20 - Baseline beryllium inventory.

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Baseline beryllium inventory. 850.20 Section 850.20 Energy... Baseline beryllium inventory. (a) The responsible employer must develop a baseline inventory of the... inventory, the responsible employer must: (1) Review current and historical records; (2) Interview workers...

  7. 40 CFR 80.92 - Baseline auditor requirements.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 16 2010-07-01 2010-07-01 false Baseline auditor requirements. 80.92... (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Anti-Dumping § 80.92 Baseline auditor requirements. (a... determination methodology, resulting baseline fuel parameter, volume and emissions values verified by an auditor...

  8. Treatment with the first TNF inhibitor in rheumatoid arthritis patients in the Hellenic Registry of Biologic Therapies improves quality of life especially in young patients with better baseline functional status.

    Science.gov (United States)

    Boubouchairopoulou, Nadia; Flouri, Irini; Drosos, Alexandros A; Boki, Kyriaki; Settas, Loukas; Zisopoulos, Dimitrios; Skopouli, Fotini N; Papadopoulos, Ioannis; Iliopoulos, Alexios; Kyriopoulos, John; Boumpas, Dimitrios T; Athanasakis, Konstantinos; Sidiropoulos, Prodromos

    2016-01-01

    To assess, in daily practice, the effect of treatment with a first tumour necrosis factor-α inhibitor (TNFi) on quality of life (QoL) and disease activity in patients with rheumatoid arthritis (RA), and to identify possible baseline predictors of gains in QoL. Patients followed prospectively by the Hellenic Registry of Biologic Therapies were analysed. Demographics were recorded at baseline, while RA-related characteristics were recorded at baseline and every 6 months. Paired t-tests were used to detect divergences between patient-reported (Health Assessment Questionnaire (HAQ), EuroQol (EQ-5D)) and clinical tools (Disease Activity Score-28 joints (DAS28)). Clinical versus self-reported outcomes were examined via cross-tabulation analysis. Multiple regression analysis was performed to identify baseline predictors of improvements in QALYs. We analysed 255 patients (age (mean±SD) 57.1±13.0 years, disease duration 9.2±9.1 years, prior non-biologic disease-modifying anti-rheumatic drugs 2.3±1.2). Baseline EQ-5D, HAQ and DAS28 were 0.36 (0.28), 1.01 (0.72) and 5.9 (1.3), respectively, and all improved significantly after 12 months (0.77 (0.35), 0.50 (0.66) and 3.9 (1.5), respectively; p<0.05 for all). 90% of patients who improved from high to a lower DAS28 status (low-remission or moderate) had clinically important improvement in QoL (phi-coefficient=0.531, p<0.05). Independent predictors of gains in QoL were lower baseline HAQ, lower VAS global and younger age (adjusted R2=0.27). In daily practice, TNFi improve both disease activity and QoL over the first 12 months of therapy. Younger patients starting with lower HAQ and VAS global are more likely to benefit.

  9. The prognostic utility of baseline alpha-fetoprotein for hepatocellular carcinoma patients.

    Science.gov (United States)

    Silva, Jack P; Gorman, Richard A; Berger, Nicholas G; Tsai, Susan; Christians, Kathleen K; Clarke, Callisia N; Mogal, Harveshp; Gamblin, T Clark

    2017-12-01

    Alpha-fetoprotein (AFP) has a valuable role in postoperative surveillance for hepatocellular carcinoma (HCC) recurrence. The utility of pretreatment or baseline AFP remains controversial. The present study hypothesized that elevated baseline AFP levels are associated with worse overall survival in HCC patients. Adult HCC patients were identified using the National Cancer Database (2004-2013). Patients were stratified according to baseline AFP measurements into the following groups: Negative (2000). The primary outcome was overall survival (OS), which was analyzed by log-rank test and graphed using the Kaplan-Meier method. Multivariate regression modeling was used to determine hazard ratios (HR) for OS. Of 41 107 patients identified, 15 809 (33.6%) were Negative. Median overall survival was highest in the Negative group, followed by Borderline, Elevated, and Highly Elevated (28.7 vs 18.9 vs 8.8 vs 3.2 months; P < 0.001). On multivariate analysis, overall survival hazard ratios for the Borderline, Elevated, and Highly Elevated groups were 1.18 (P = 0.267), 1.94 (P < 0.001), and 1.77 (P = 0.007), respectively (reference Negative). Baseline AFP independently predicted overall survival in HCC patients regardless of treatment plan. A baseline AFP value is a simple and effective method to assist in estimating expected survival for HCC patients. © 2017 Wiley Periodicals, Inc.
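
The Kaplan-Meier curves behind the overall-survival comparison above can be hand-rolled in a few lines. This is a hedged sketch on toy data: the survival times and event flags are illustrative assumptions, ties between deaths and censorings at the same time are handled naively, and a real analysis would use a vetted survival package.

```python
import numpy as np

def kaplan_meier(times, events):
    """Product-limit estimator: times in months, events 1=death, 0=censored."""
    order = np.argsort(times)
    t = np.asarray(times, dtype=float)[order]
    e = np.asarray(events)[order]
    n = len(t)
    s, curve = 1.0, []
    for i in range(n):
        if e[i]:                       # a death is observed at t[i]
            s *= 1.0 - 1.0 / (n - i)   # n - i subjects still at risk
        curve.append((t[i], s))        # survival probability just after t[i]
    return curve

# Example: the curve drops only at death times, not at censoring times.
curve = kaplan_meier([5, 8, 12, 20, 25], [1, 0, 1, 1, 0])
```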

  10. An integrated probabilistic risk analysis decision support methodology for systems with multiple state variables

    International Nuclear Information System (INIS)

    Sen, P.; Tan, John K.G.; Spencer, David

    1999-01-01

    Probabilistic risk analysis (PRA) methods have been proven to be valuable in risk and reliability analysis. However, a weak link seems to exist between methods for analysing risks and those for making rational decisions. The integrated decision support system (IDSS) methodology presented in this paper attempts to address this issue in a practical manner. It consists of three phases: a PRA phase, a risk sensitivity analysis (SA) phase and an optimisation phase, which are implemented through an integrated computer software system. In the risk analysis phase the problem is analysed by the Boolean representation method (BRM), a PRA method that can deal with systems with multiple state variables and feedback loops. In the second phase the results obtained from the BRM are utilised directly to perform importance and risk SA. In the third phase, the problem is formulated as a multiple objective decision making problem in the form of multiple objective reliability optimisation. An industrial example is included. The resultant solutions of a five objective reliability optimisation are presented, on the basis of which rational decision making can be explored.

  11. Integrated planning: A baseline development perspective

    International Nuclear Information System (INIS)

    Clauss, L.; Chang, D.

    1994-01-01

    The FEMP Baseline establishes the basis for integrating environmental activity technical requirements with their cost and schedule elements. The result is a path forward to successfully achieving the FERMCO mission. Specific to cost management, the FEMP Baseline has been incorporated into the FERMCO Project Control System (PCS) to provide a time-phased budget plan against which contractor performance is measured with an earned value management system. The result is the Performance Measurement Baseline (PMB), an important tool for keeping costs under control.

  12. Programmatic Baseline Summary for Phase 1 Privatization for the Tank Farm contractor

    International Nuclear Information System (INIS)

    DIEDIKER, J.A.

    2000-01-01

    The document describes the systematic integrated baseline planning process and provides a summary of the Tank Farm Contractor scope, schedule and cost analysis developed in support of the Phase 1 privatization mission.

  13. Programmatic Baseline Summary for Phase 1 Privatization for the Tank Farm contractor

    Energy Technology Data Exchange (ETDEWEB)

    DIEDIKER, J.A.

    2000-04-22

    The document describes the systematic integrated baseline planning process and provides a summary of the Tank Farm Contractor scope, schedule and cost analysis developed in support of the Phase 1 privatization mission.

  14. Beyond total treatment effects in randomised controlled trials: Baseline measurement of intermediate outcomes needed to reduce confounding in mediation investigations.

    Science.gov (United States)

    Landau, Sabine; Emsley, Richard; Dunn, Graham

    2018-06-01

    Random allocation avoids confounding bias when estimating the average treatment effect. For continuous outcomes measured at post-treatment as well as prior to randomisation (baseline), analyses based on (A) post-treatment outcome alone, (B) change scores over the treatment phase or (C) conditioning on baseline values (analysis of covariance) provide unbiased estimators of the average treatment effect. The decision to include baseline values of the clinical outcome in the analysis is based on precision arguments, with analysis of covariance known to be most precise. Investigators increasingly carry out explanatory analyses to decompose total treatment effects into components that are mediated by an intermediate continuous outcome and a non-mediated part. Traditional mediation analysis might be performed based on (A) post-treatment values of the intermediate and clinical outcomes alone, (B) respective change scores or (C) conditioning on baseline measures of both intermediate and clinical outcomes. Using causal diagrams and Monte Carlo simulation, we investigated the performance of the three competing mediation approaches. We considered a data generating model that included three possible confounding processes involving baseline variables: The first two processes modelled baseline measures of the clinical variable or the intermediate variable as common causes of post-treatment measures of these two variables. The third process allowed the two baseline variables themselves to be correlated due to past common causes. We compared the analysis models implied by the competing mediation approaches with this data generating model to hypothesise likely biases in estimators, and tested these in a simulation study. We applied the methods to a randomised trial of pragmatic rehabilitation in patients with chronic fatigue syndrome, which examined the role of limiting activities as a mediator. 
Estimates of causal mediation effects derived by approach (A) will be biased if one of
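
The confounding processes described above can be illustrated with a small simulation. This is a hedged sketch: the coefficients, sample size and data-generating model are illustrative assumptions, not the trial's. Baseline mediator (M0) and baseline outcome (Y0) share a past common cause U, so a mediation regression using post-treatment values only (approach A) is confounded, while conditioning on both baselines (approach C) recovers the true mediator coefficient.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
U = rng.standard_normal(n)                   # past common cause of the baselines
M0 = U + rng.standard_normal(n)              # baseline intermediate outcome
Y0 = U + rng.standard_normal(n)              # baseline clinical outcome
R = rng.integers(0, 2, n).astype(float)      # randomized treatment allocation
M1 = 0.5 * M0 + 1.0 * R + rng.standard_normal(n)             # post-treatment mediator
Y1 = 0.5 * Y0 + 0.8 * M1 + 0.3 * R + rng.standard_normal(n)  # post-treatment outcome

def ols(y, cols):
    """Least-squares coefficients for y on an intercept plus the given columns."""
    X = np.column_stack([np.ones(n)] + cols)
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Approach A: post-treatment values only. The omitted Y0 is correlated
# with M1 through U and M0, so the M1 coefficient is biased upward.
b_A = ols(Y1, [R, M1])[2]
# Approach C: condition on both baseline measures -> recovers the true 0.8.
b_C = ols(Y1, [R, M1, M0, Y0])[2]
```

Randomisation protects the total effect of R in both fits; it is the mediator coefficient, and hence the mediated/non-mediated decomposition, that approach A gets wrong.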

  15. Physics with a very long neutrino factory baseline

    International Nuclear Information System (INIS)

    Gandhi, Raj; Winter, Walter

    2007-01-01

    We discuss the neutrino oscillation physics of a very long neutrino factory baseline over a broad range of lengths (between 6000 km and 9000 km), centered on the 'magic baseline' (∼7500 km) where correlations with the leptonic CP phase are suppressed by matter effects. Since the magic baseline depends only on the density, we study the impact of matter density profile effects and density uncertainties over this range, and the impact of detector locations off the optimal baseline. We find that the optimal constant density describing the physics over this entire baseline range is about 5% higher than the average matter density. This implies that the magic baseline is significantly shorter than previously inferred. However, while a single detector optimization requires fine-tuning of the (very long) baseline length, its combination with a near detector at a shorter baseline is much less sensitive to the far detector location and to uncertainties in the matter density. In addition, we point out different applications of this baseline which go beyond its excellent correlation and degeneracy resolution potential. We demonstrate that such a long baseline assists in the improvement of the θ13 precision and in the resolution of the octant degeneracy. Moreover, we show that the neutrino data from such a baseline could be used to extract the matter density along the profile up to 0.24% at 1σ for large sin²2θ13, providing a useful discriminator between different geophysical models.
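
The "magic baseline" condition quoted above follows from the standard series expansion of the matter-affected appearance probability; a brief sketch of the standard result, with ρ̄ the constant matter density and electron fraction Ye ≈ 0.5 assumed:

```latex
% The CP-phase terms in the expanded P(\nu_e \to \nu_\mu) are
% proportional to \sin(\hat{A}\Delta), so they vanish at the first
% nontrivial zero of that factor:
\sin\bigl(\hat{A}\Delta\bigr) = 0,
\qquad
\hat{A}\Delta \equiv \frac{\sqrt{2}\,G_F n_e L}{2}
\;\Longrightarrow\;
\sqrt{2}\,G_F n_e L_{\text{magic}} = 2\pi
\;\Longrightarrow\;
L_{\text{magic}} \approx \frac{32\,726\ \text{km}}{\bar\rho\,[\mathrm{g/cm^3}]}
```

For an average density of about 4.3 g/cm³ along such a chord this gives roughly 7600 km, consistent with the ∼7500 km figure above; the abstract's point is that once profile effects fix the effective constant density, this length shifts accordingly.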

  16. Investigation of key parameters for the development of reliable ITER baseline operation scenarios using CORSICA

    Science.gov (United States)

    Kim, S. H.; Casper, T. A.; Snipes, J. A.

    2018-05-01

    ITER will demonstrate the feasibility of burning plasma operation by operating DT plasmas in the ELMy H-mode regime with a high fusion power gain Q ~ 10. The 15 MA ITER baseline operation scenario has been studied using CORSICA, focusing on the entry to burn, flat-top burning plasma operation and exit from burn. Burning plasma operation for about 400 s of the current flat-top was achieved in H-mode within the various engineering constraints imposed by the poloidal field coil and power supply systems. The target fusion gain (Q ~ 10) was achievable in the 15 MA ITER baseline operation with a moderate amount of total auxiliary heating power (~50 MW). It has been observed that the tungsten (W) concentration needs to be maintained at a low level (nW/ne up to the order of 1.0 × 10^-5) to avoid radiative collapse and uncontrolled early termination of the discharge. The dynamic evolution of the density can modify the H-mode access unless the applied auxiliary heating power is significantly higher than the H-mode threshold power. Several qualitative sensitivity studies have been performed to provide guidance for further optimizing the plasma operation and performance. Increasing the density profile peaking factor was quite effective in increasing the alpha particle self-heating power and the fusion power multiplication factor. Varying the combination of auxiliary heating power has shown that the fusion power multiplication factor can be reduced along with an increase in the total auxiliary heating power. As the 15 MA ITER baseline operation scenario requires the full capacity of the coil and power supply systems, the operation window for H-mode access and shape modification was narrow. The updated ITER baseline operation scenarios developed in this work will become a basis for further optimization studies, along with improvements in the understanding of burning plasma physics.
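
As a quick sanity check on the figures quoted above (Q ≈ 10 with ~50 MW of auxiliary heating), the standard definitions give the following; the 1/5 alpha-power fraction is the usual DT value (alphas carry 3.5 MeV of the 17.6 MeV released per reaction), and the numbers are round figures from the abstract, not CORSICA output.

```python
# Q = P_fusion / P_aux; in DT plasmas alpha particles carry ~1/5 of the
# fusion power (the remaining ~4/5 leaves with the neutrons).
P_aux = 50.0                  # MW, total auxiliary heating (from the abstract)
Q = 10.0                      # target fusion gain (from the abstract)
P_fusion = Q * P_aux          # total fusion power
P_alpha = P_fusion / 5.0      # alpha self-heating power
P_heating = P_alpha + P_aux   # total plasma heating during flat-top burn
```

So the quoted operating point corresponds to roughly 500 MW of fusion power, with alpha self-heating (~100 MW) already dominating the 50 MW of auxiliary power, which is why the density peaking and heating-mix sensitivities matter for the multiplication factor.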

  17. Efficacy of a foodlet-based multiple micronutrient supplement for preventing growth faltering, anemia, and micronutrient deficiency of infants: the four country IRIS trial pooled data analysis.

    Science.gov (United States)

    Smuts, Cornelius M; Lombard, Carl J; Benadé, A J Spinnler; Dhansay, Muhammad A; Berger, Jacques; Hop, Le Thi; López de Romaña, Guillermo; Untoro, Juliawati; Karyadi, Elvina; Erhardt, Jürgen; Gross, Rainer

    2005-03-01

    Diets of infants across the world are commonly deficient in multiple micronutrients during the period of growth faltering and dietary transition from milk to solid foods. A randomized placebo controlled trial was carried out in Indonesia, Peru, South Africa, and Vietnam, using a common protocol to investigate whether improving status for multiple micronutrients prevented growth faltering and anemia during infancy. The results of the pooled data analysis of the 4 countries for growth, anemia, and micronutrient status are reported. A total of 1134 infants were randomized to 4 treatment groups: 283 received a daily placebo (P), 283 received a weekly multiple micronutrient supplement (WMM), 280 received a daily multiple micronutrient (DMM) supplement, and 288 received daily iron (DI) supplements. The DMM group had a significantly greater weight gain, growing at an average rate of 207 g/mo compared with 192 g/mo for the WMM group, and 186 g/mo for the DI and P groups. There were no differences in height gain. DMM was also the most effective treatment for controlling anemia and iron deficiency, besides improving zinc, retinol, tocopherol, and riboflavin status. DI supplementation alone increased zinc deficiency. The prevalence of multiple micronutrient deficiencies at baseline was high, with anemia affecting the majority, and was not fully controlled even after 6 mo of supplementation. These positive results indicate the need for larger effectiveness trials to examine how to deliver supplements at the program scale and to estimate cost benefits. Consideration should also be given to increasing the dosages of micronutrients being delivered in the foodlets.

  18. Application of range-test in multiple linear regression analysis in ...

    African Journals Online (AJOL)

    Application of range-test in multiple linear regression analysis in the presence of outliers is studied in this paper. First, the plot of the explanatory variables (i.e. Administration, Social/Commercial, Economic services and Transfer) on the dependent variable (i.e. GDP) was done to identify the statistical trend over the years.

  19. Effect of walking on sand on gait kinematics in individuals with multiple sclerosis.

    Science.gov (United States)

    van den Berg, Maayken E L; Barr, Christopher J; McLoughlin, James V; Crotty, Maria

    2017-08-01

    Walking in the real world involves negotiating challenging or uneven surfaces, including sand. This can be challenging for people with Multiple Sclerosis (PWMS) due to motor deficits affecting the lower extremities. The study objective was to characterise kinematic gait adaptations made by PWMS when walking on sand and describe any immediate post-adaptation effects. 17 PWMS (mean age 51.4 ± 5.5, Disease Steps 2.4 ± 1.0) and 14 age- and gender-matched healthy adults (HA) took part in a case-control study. 3D gait analysis was conducted using an eight-camera Vicon motion capture system. Each participant completed walking trials over level ground (baseline), sand (gait adaptation response), and again level ground (post-adaptation). Spatiotemporal data and kinematic data for the hip, knee and ankle were recorded. At baseline PWMS showed significantly less total lower limb flexion (pgait pattern to near baseline levels, in a manner similar to but with values not equalling HA. Further work is required to determine whether this mode of walking has potential to act as a gait retraining strategy to increase flexion of the lower limb. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Empirical comparison of four baseline covariate adjustment methods in analysis of continuous outcomes in randomized controlled trials

    Directory of Open Access Journals (Sweden)

    Zhang S

    2014-07-01

    Full Text Available Shiyuan Zhang,1 James Paul,2 Manyat Nantha-Aree,2 Norman Buckley,2 Uswa Shahzad,2 Ji Cheng,2 Justin DeBeer,5 Mitchell Winemaker,5 David Wismer,5 Dinshaw Punthakee,5 Victoria Avram,5 Lehana Thabane1–4 1Department of Clinical Epidemiology and Biostatistics, 2Department of Anesthesia, McMaster University, Hamilton, ON, Canada; 3Biostatistics Unit/Centre for Evaluation of Medicines, St Joseph's Healthcare - Hamilton, Hamilton, ON, Canada; 4Population Health Research Institute, Hamilton Health Science/McMaster University, 5Department of Surgery, Division of Orthopaedics, McMaster University, Hamilton, ON, Canada. Background: Although seemingly straightforward, the statistical comparison of a continuous variable in a randomized controlled trial that has both a pre- and posttreatment score presents an interesting challenge for trialists. We present here an empirical application of four statistical methods (posttreatment scores with analysis of variance, analysis of covariance, change in scores, and percent change in scores), using data from a randomized controlled trial of postoperative pain in patients following total joint arthroplasty (the Morphine COnsumption in Joint Replacement Patients, With and Without GaBapentin Treatment, a RandomIzed ControlLEd Study [MOBILE] trials). Methods: Analysis of covariance (ANCOVA) was used to adjust for baseline measures and to provide an unbiased estimate of the mean group difference of the 1-year postoperative knee flexion scores in knee arthroplasty patients. Robustness tests were done by comparing ANCOVA with three comparative methods: the posttreatment scores, change in scores, and percentage change from baseline. Results: All four methods showed a similar direction of effect; however, ANCOVA (-3.9; 95% confidence interval [CI]: -9.5, 1.6; P=0.15) and the posttreatment score method (-4.3; 95% CI: -9.8, 1.2; P=0.12) provided the highest precision of estimate compared with the change score (-3.0; 95% CI: -9.9, 3.8; P=0

  1. Higher baseline viral diversity correlates with lower HBsAg decline following PEGylated interferon-alpha therapy in patients with HBeAg-positive chronic hepatitis B.

    Science.gov (United States)

    Li, Hu; Zhang, Li; Ren, Hong; Hu, Peng

    2018-01-01

    Viral diversity seems to predict treatment outcomes in certain viral infections. The aim of this study was to evaluate the association between baseline intra-patient viral diversity and hepatitis B surface antigen (HBsAg) decline following PEGylated interferon-alpha (Peg-IFN-α) therapy. Twenty-six HBeAg-positive patients who were treated with Peg-IFN-α were enrolled. Nested polymerase chain reaction (PCR), cloning, and sequencing of the hepatitis B virus S gene were performed on baseline samples, and normalized Shannon entropy (Sn) was calculated as a measure of small hepatitis B surface protein (SHBs) diversity. Multiple regression analysis was used to estimate the association between baseline Sn and HBsAg decline. Of the 26 patients enrolled in the study, 65.4% were male and 61.5% were infected with hepatitis B virus genotype B. The median HBsAg level at baseline was 4.5 log10 IU/mL (interquartile range: 4.1-4.9) and declined to 3.0 log10 IU/mL (interquartile range: 1.7-3.9) after 48 weeks of Peg-IFN-α treatment. In models adjusted for baseline alanine aminotransferase (ALT) and HBsAg, the adjusted coefficients (95% CI) for ΔHBsAg and relative percentage HBsAg decrease were -1.3 (-2.5, -0.2) log10 IU/mL for higher SHBs diversity (Sn≥0.58) patients and -26.4% (-50.2%, -2.5%) for lower diversity (Sn<0.58) patients. Baseline intra-patient SHBs diversity was inversely associated with HBsAg decline in HBeAg-positive chronic hepatitis B (CHB) patients receiving Peg-IFN-α monotherapy. Also, more sequence variations within the "a" determinant upstream flanking region and the first loop of the "a" determinant were the main sources of the higher SHBs diversity.
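
The normalized Shannon entropy used above as the diversity measure can be computed from clone counts. This is a hedged sketch: Sn = H / ln N (entropy of the distinct-sequence frequencies, normalized by its maximum for N clones) is the usual definition, and the toy clone list is an assumption for illustration.

```python
import math
from collections import Counter

def normalized_shannon_entropy(clones):
    """Sn = -sum(p_i * ln p_i) / ln N over distinct sequences among N clones."""
    n = len(clones)
    if n <= 1:
        return 0.0
    counts = Counter(clones)
    h = -sum((c / n) * math.log(c / n) for c in counts.values())
    return h / math.log(n)

# 20 clones: one dominant variant plus a few minors -> moderate diversity.
clones = ["A"] * 14 + ["B"] * 3 + ["C"] * 2 + ["D"]
sn = normalized_shannon_entropy(clones)   # 0 = clonal, 1 = all clones unique
```

On this toy sample Sn is about 0.31, which would fall in the study's "lower diversity" stratum under the 0.58 cut-point quoted above.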

  2. Tools for Closure Project and Contract Management: Development of the Rocky Flats Integrated Closure Project Baseline

    International Nuclear Information System (INIS)

    Gelles, C. M.; Sheppard, F. R.

    2002-01-01

This paper details the development of the Rocky Flats Integrated Closure Project Baseline - an innovative project management effort undertaken to ensure proactive management of the Rocky Flats Closure Contract in support of the Department's goal for achieving the safe closure of the Rocky Flats Environmental Technology Site (RFETS) in December 2006. The accelerated closure of RFETS is one of the most prominent projects within the Department of Energy (DOE) Environmental Management program. As the first major former weapons plant to be remediated and closed, it is a first-of-kind effort requiring the resolution of multiple complex technical and institutional challenges. Most significantly, the closure of RFETS is dependent upon the shipment of all special nuclear material and wastes to other DOE sites. The Department is actively working to strengthen project management across programs, and there is increasing external interest in this progress. The development of the Rocky Flats Integrated Closure Project Baseline represents a groundbreaking and cooperative effort to formalize the management of such a complex project across multiple sites and organizations. It is original in both scope and process; however, it provides a useful precedent for other ongoing project management efforts within the Environmental Management program.

  3. Mediation Analysis with Multiple Mediators

    OpenAIRE

    VanderWeele, T.J.; Vansteelandt, S.

    2014-01-01

Recent advances in the causal inference literature on mediation have extended traditional approaches to direct and indirect effects to settings that allow for interactions and non-linearities. In this paper, these approaches from causal inference are further extended to settings in which multiple mediators may be of interest. Two analytic approaches, one based on regression and one based on weighting, are proposed to estimate the effect mediated through multiple mediators and the effects throu...

  4. Visualization-based analysis of multiple response survey data

    Science.gov (United States)

    Timofeeva, Anastasiia

    2017-11-01

During a survey, respondents are often allowed to tick more than one answer option for a question. Analyzing and visualizing such data is difficult because multiple response variables must be processed together. With standard representations such as pie and bar charts, information about the association between different answer options is lost. The author proposes a visualization approach for multiple response variables based on Venn diagrams. For a more informative representation with a large number of overlapping groups, it is suggested to use similarity and association matrices. Some aggregate indicators of dissimilarity (similarity) are proposed, based on the determinant of the similarity matrix and the maximum eigenvalue of the association matrix. The application of the proposed approaches is illustrated by the example of the analysis of advertising sources. Intersection of sets indicates that the same consumer audience is covered by several advertising sources; this information is very important for the allocation of the advertising budget. The differences between target groups in advertising sources are of interest, and to identify such differences hypotheses of homogeneity and independence are tested. Recent approaches to the problem are briefly reviewed and compared. An alternative procedure is suggested: it is based on partitioning the consumer audience into pairwise disjoint subsets and includes hypothesis testing of the difference between population proportions. It turned out to be more suitable for the real problem being solved.
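The similarity and association matrices mentioned above can be illustrated on binary multiple-response data, where each respondent may tick several options. This sketch (hypothetical data, not the author's implementation) builds a co-selection count matrix and a Jaccard-type similarity matrix:

```python
import numpy as np

# Rows = respondents, columns = answer options (1 = option ticked).
# A hypothetical multiple-response question with 4 options.
X = np.array([
    [1, 1, 0, 0],
    [1, 0, 1, 0],
    [1, 1, 0, 0],
    [0, 0, 1, 1],
    [1, 1, 1, 0],
])

# Joint selection counts; the diagonal holds marginal counts per option.
co_selection = X.T @ X
union = (co_selection.diagonal()[:, None]
         + co_selection.diagonal()[None, :]
         - co_selection)
# Jaccard-type similarity: shared audience / combined audience.
jaccard = np.where(union > 0, co_selection / union, 0.0)

print(co_selection)
print(np.round(jaccard, 2))
```

A high off-diagonal similarity indicates two options (e.g., two advertising sources) reaching largely the same audience, which is exactly the overlap a Venn diagram would show for a small number of options.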

  5. Automatic differentiation for design sensitivity analysis of structural systems using multiple processors

    Science.gov (United States)

    Nguyen, Duc T.; Storaasli, Olaf O.; Qin, Jiangning; Qamar, Ramzi

    1994-01-01

    An automatic differentiation tool (ADIFOR) is incorporated into a finite element based structural analysis program for shape and non-shape design sensitivity analysis of structural systems. The entire analysis and sensitivity procedures are parallelized and vectorized for high performance computation. Small scale examples to verify the accuracy of the proposed program and a medium scale example to demonstrate the parallel vector performance on multiple CRAY C90 processors are included.

  6. Mode Shape Analysis of Multiple Cracked Functionally Graded Timoshenko Beams

    Directory of Open Access Journals (Sweden)

    Tran Van Lien

Full Text Available Abstract The present paper addresses free vibration of multiple cracked Timoshenko beams made of Functionally Graded Material (FGM). Cracks are modeled by rotational springs whose stiffness is calculated from the crack depth, and material properties vary according to a power law throughout the beam thickness. Governing equations for free vibration of the beam are formulated taking into account the actual position of the neutral plane. The obtained frequency equation and mode shapes are used to analyze the dependence of the mode shapes on the material and crack parameters. Numerical results validate the usefulness of the proposed theory and show that mode shapes are a good indication for detecting multiple cracks in Timoshenko FGM beams.

  7. Multiple Pregnancies in CKD Patients: An Explosive Mix

    Science.gov (United States)

    Arduino, Silvana; Attini, Rossella; Parisi, Silvia; Fassio, Federica; Biolcati, Marlisa; Pagano, Arianna; Bossotti, Carlotta; Vasario, Elena; Borgarello, Valentina; Daidola, Germana; Ferraresi, Martina; Gaglioti, Pietro; Todros, Tullia

    2013-01-01

    Summary Background and objectives CKD and multiple pregnancies bear important risks for pregnancy outcomes. The aim of the study was to define the risk for adverse pregnancy-related outcomes in multiple pregnancies in CKD patients in comparison with a control group of “low-risk” multiple pregnancies. Design, setting, participants, & measurements The study was performed in the Maternal Hospital of the University of Turin, Italy. Of 314 pregnancies referred in CKD (2000–2011), 20 were multiple (15 twin deliveries). Control groups consisted of 379 low-risk multiple pregnancies (314 twin deliveries) and 19 (15 twin deliveries) cases with hypertension-collagen diseases. Baseline data and outcomes were compared by univariate and logistic regression analyses. Results The prevalence of multiple pregnancies was relatively high in the CKD population (6.4%); all referred cases were in early CKD stages (I-II); both creatinine (0.68 to 0.79 mg/dl; P=0.010) and proteinuria (0.81 to 3.42 g/d; P=0.041) significantly increased from referral to delivery. No significant difference in demographic data at baseline was found between cases and low-risk controls. CKD was associated with higher risk of adverse pregnancy outcomes versus low-risk twin pregnancies. Statistical significance was reached for preterm delivery (<34 weeks: 60% vs 26.4%; P=0.005; <32 weeks: 53.3% vs 12.7%; P<0.001), small for gestational age babies (28.6% vs 8.1%; P<0.001), need for Neonatal Intensive Care Unit (60% vs 12.7%; P<0.001), weight discordance between twins (40% vs 17.8%; P=0.032), and neonatal and perinatal mortality (6.6% vs 0.8%; P=0.032). Conclusion This study suggests that maternal-fetal risks are increased in multiple pregnancies in the early CKD stages. PMID:23124785

  8. Monitoring urban greenness dynamics using multiple endmember spectral mixture analysis.

    Directory of Open Access Journals (Sweden)

    Muye Gan

    Full Text Available Urban greenness is increasingly recognized as an essential constituent of the urban environment and can provide a range of services and enhance residents' quality of life. Understanding the pattern of urban greenness and exploring its spatiotemporal dynamics would contribute valuable information for urban planning. In this paper, we investigated the pattern of urban greenness in Hangzhou, China, over the past two decades using time series Landsat-5 TM data obtained in 1990, 2002, and 2010. Multiple endmember spectral mixture analysis was used to derive vegetation cover fractions at the subpixel level. An RGB-vegetation fraction model, change intensity analysis and the concentric technique were integrated to reveal the detailed, spatial characteristics and the overall pattern of change in the vegetation cover fraction. Our results demonstrated the ability of multiple endmember spectral mixture analysis to accurately model the vegetation cover fraction in pixels despite the complex spectral confusion of different land cover types. The integration of multiple techniques revealed various changing patterns in urban greenness in this region. The overall vegetation cover has exhibited a drastic decrease over the past two decades, while no significant change occurred in the scenic spots that were studied. Meanwhile, a remarkable recovery of greenness was observed in the existing urban area. The increasing coverage of small green patches has played a vital role in the recovery of urban greenness. These changing patterns were more obvious during the period from 2002 to 2010 than from 1990 to 2002, and they revealed the combined effects of rapid urbanization and greening policies. This work demonstrates the usefulness of time series of vegetation cover fractions for conducting accurate and in-depth studies of the long-term trajectories of urban greenness to obtain meaningful information for sustainable urban development.
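Multiple endmember spectral mixture analysis repeatedly solves a constrained linear unmixing problem while varying the candidate endmember set. A minimal sketch of the core unmixing step, with hypothetical endmember spectra and non-negativity plus a sum-to-one constraint imposed via a heavily weighted extra equation, might look like:

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical endmember spectra (columns: vegetation, impervious, soil)
# observed over 5 spectral bands.
E = np.array([
    [0.05, 0.30, 0.20],
    [0.08, 0.32, 0.25],
    [0.45, 0.35, 0.30],
    [0.50, 0.40, 0.45],
    [0.30, 0.45, 0.50],
])

true_f = np.array([0.6, 0.1, 0.3])   # ground-truth fractions for a test pixel
pixel = E @ true_f                   # noiseless mixed spectrum

# Append a heavily weighted row of ones so the solution sums to ~1,
# while nnls enforces non-negative fractions.
w = 1000.0
A = np.vstack([E, w * np.ones((1, 3))])
b = np.append(pixel, w)
fractions, rnorm = nnls(A, b)
print(np.round(fractions, 3))        # ≈ [0.6, 0.1, 0.3]
```

MESMA extends this step by trying multiple endmember combinations per pixel and keeping the best-fitting model, which is what lets it cope with the spectral confusion between land cover types noted in the abstract.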

  9. Leveraging probabilistic peak detection to estimate baseline drift in complex chromatographic samples.

    Science.gov (United States)

    Lopatka, Martin; Barcaru, Andrei; Sjerps, Marjan J; Vivó-Truyols, Gabriel

    2016-01-29

Accurate analysis of chromatographic data often requires the removal of baseline drift. A frequently employed strategy strives to determine asymmetric weights in order to fit a baseline model by regression. Unfortunately, chromatograms characterized by a very high peak saturation pose a significant challenge to such algorithms, and a low signal-to-noise ratio (s/n) makes the problem harder still. In this work we leverage a probabilistic peak detection algorithm. A posterior probability of being affected by a peak is computed for each point in the chromatogram, leading to a set of weights that allow non-iterative calculation of a baseline estimate. For extremely saturated chromatograms, the peak-weighted (PW) method demonstrates notable improvement compared to the other methods examined. However, in chromatograms characterized by low noise and well-resolved peaks, the asymmetric least squares (ALS) and the more sophisticated Mixture Model (MM) approaches achieve superior results in significantly less time. We evaluate the performance of these three baseline correction methods over a range of chromatographic conditions to demonstrate the cases in which each method is most appropriate. Copyright © 2016 Elsevier B.V. All rights reserved.
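The asymmetric least squares (ALS) method used as a comparison baseline above is compact enough to sketch. The following is a standard Eilers-style implementation (parameter values are illustrative, not those of the paper): points above the current baseline estimate receive a small weight p, and a second-difference penalty lam enforces smoothness.

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

def als_baseline(y, lam=1e5, p=0.01, n_iter=10):
    """Asymmetric least squares baseline: iteratively reweighted
    penalized regression with a smoothness penalty on the 2nd difference."""
    n = len(y)
    D = sparse.diags([1, -2, 1], [0, 1, 2], shape=(n - 2, n))
    P = lam * (D.T @ D)
    w = np.ones(n)
    for _ in range(n_iter):
        W = sparse.diags(w)
        z = spsolve((W + P).tocsc(), w * y)
        # Points above the baseline (likely peaks) get weight p << 1.
        w = np.where(y > z, p, 1 - p)
    return z

# Synthetic chromatogram: linear drift plus two Gaussian peaks.
x = np.linspace(0, 1, 500)
drift = 0.5 * x + 0.2
peaks = np.exp(-((x - 0.3) / 0.01) ** 2) + 0.8 * np.exp(-((x - 0.7) / 0.015) ** 2)
y = drift + peaks
baseline = als_baseline(y)
corrected = y - baseline
```

Typical heuristics for this family of methods put p around 0.001-0.1 and lam anywhere from 1e2 to 1e9 depending on sampling density; as the abstract notes, heavy peak saturation is exactly the regime where this weighting scheme starts to fail.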

  10. [A factor analysis method for contingency table data with unlimited multiple choice questions].

    Science.gov (United States)

    Toyoda, Hideki; Haiden, Reina; Kubo, Saori; Ikehara, Kazuya; Isobe, Yurie

    2016-02-01

The purpose of this study is to propose a method of factor analysis for analyzing contingency tables developed from the data of unlimited multiple-choice questions. This method assumes that the element of each cell of the contingency table has a binomial distribution, and a factor analysis model is applied to the logit of the selection probability. A scree plot and WAIC are used to decide the number of factors, and the standardized residual, i.e., the standardized difference between the sample and expected proportions, is used to select items. The proposed method was applied to real product-impression research data on advertised chips and energy drinks. The results showed that this method can be used in conjunction with the conventional factor analysis model and that the extracted factors were fully interpretable, suggesting the usefulness of the proposed method in psychological studies using unlimited multiple-choice questions.

  11. Cinteny: flexible analysis and visualization of synteny and genome rearrangements in multiple organisms

    Directory of Open Access Journals (Sweden)

    Meller Jaroslaw

    2007-03-01

Full Text Available Abstract Background Identifying syntenic regions, i.e., blocks of genes or other markers with evolutionarily conserved order, and quantifying evolutionary relatedness between genomes in terms of chromosomal rearrangements is one of the central goals in comparative genomics. However, the analysis of synteny and the resulting assessment of genome rearrangements are sensitive to the choice of a number of arbitrary parameters that affect the detection of synteny blocks. In particular, the choice of a set of markers and the effect of different aggregation strategies, which enable coarse graining of synteny blocks and exclusion of micro-rearrangements, need to be assessed. Therefore, existing tools and resources that facilitate identification, visualization and analysis of synteny need to be further improved to provide a flexible platform for such analysis, especially in the context of multiple genomes. Results We present a new tool, Cinteny, for fast identification and analysis of synteny with different sets of markers and various levels of coarse graining of syntenic blocks. Using the Hannenhalli-Pevzner approach and its extensions, Cinteny also enables interactive determination of evolutionary relationships between genomes in terms of the number of rearrangements (the reversal distance). In particular, Cinteny provides: (i) integration of synteny browsing with assessment of evolutionary distances for multiple genomes; (ii) flexibility to adjust the parameters and re-compute the results on-the-fly; (iii) the ability to work with user-provided data, such as orthologous genes, sequence tags or other conserved markers. In addition, Cinteny provides many annotated mammalian, invertebrate and fungal genomes that are pre-loaded and available for analysis at http://cinteny.cchmc.org. Conclusion Cinteny allows one to automatically compare multiple genomes and perform sensitivity analysis for synteny block detection and for the subsequent computation of reversal distances.

  12. Two-Way Regularized Fuzzy Clustering of Multiple Correspondence Analysis.

    Science.gov (United States)

    Kim, Sunmee; Choi, Ji Yeh; Hwang, Heungsun

    2017-01-01

    Multiple correspondence analysis (MCA) is a useful tool for investigating the interrelationships among dummy-coded categorical variables. MCA has been combined with clustering methods to examine whether there exist heterogeneous subclusters of a population, which exhibit cluster-level heterogeneity. These combined approaches aim to classify either observations only (one-way clustering of MCA) or both observations and variable categories (two-way clustering of MCA). The latter approach is favored because its solutions are easier to interpret by providing explicitly which subgroup of observations is associated with which subset of variable categories. Nonetheless, the two-way approach has been built on hard classification that assumes observations and/or variable categories to belong to only one cluster. To relax this assumption, we propose two-way fuzzy clustering of MCA. Specifically, we combine MCA with fuzzy k-means simultaneously to classify a subgroup of observations and a subset of variable categories into a common cluster, while allowing both observations and variable categories to belong partially to multiple clusters. Importantly, we adopt regularized fuzzy k-means, thereby enabling us to decide the degree of fuzziness in cluster memberships automatically. We evaluate the performance of the proposed approach through the analysis of simulated and real data, in comparison with existing two-way clustering approaches.
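The fuzzy clustering ingredient of the proposed approach can be illustrated in isolation. This sketch implements entropy-regularized fuzzy k-means on plain coordinates; it omits the MCA step entirely, and the data, initialization, and regularization weight lam are assumptions for illustration:

```python
import numpy as np

def fuzzy_kmeans(X, k, lam=1.0, n_iter=50):
    """Entropy-regularized fuzzy k-means sketch: lam controls fuzziness
    (small lam -> near-hard assignments, large lam -> fuzzier memberships)."""
    # Simple deterministic init: k points spread across the data order.
    idx = np.linspace(0, len(X) - 1, k).astype(int)
    centers = X[idx].copy()
    for _ in range(n_iter):
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        # Softmax-style memberships (shifted by the row minimum for stability).
        u = np.exp(-(d2 - d2.min(axis=1, keepdims=True)) / lam)
        u /= u.sum(axis=1, keepdims=True)        # memberships sum to 1
        centers = (u.T @ X) / u.sum(axis=0)[:, None]
    return u, centers

# Two well-separated hypothetical clusters.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.2, (30, 2)), rng.normal(3, 0.2, (30, 2))])
u, centers = fuzzy_kmeans(X, k=2, lam=0.5)
```

Each row of u gives a point's partial memberships across clusters, which is the relaxation of hard classification the paper argues for; in the full method the same idea is applied jointly to observations and variable categories in the MCA space.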

  13. A baseline for the multivariate comparison of resting state networks

    Directory of Open Access Journals (Sweden)

    Elena A Allen

    2011-02-01

Full Text Available As the size of functional and structural MRI datasets expands, it becomes increasingly important to establish a baseline from which diagnostic relevance may be determined, a processing strategy that efficiently prepares data for analysis, and a statistical approach that identifies important effects in a manner that is both robust and reproducible. In this paper, we introduce a multivariate analytic approach that optimizes sensitivity and reduces unnecessary testing. We demonstrate the utility of this mega-analytic approach by identifying the effects of age and gender on the resting state networks of 603 healthy adolescents and adults (mean age: 23.4 years, range: 12 to 71 years). Data were collected on the same scanner, preprocessed using an automated analysis pipeline based in SPM, and studied using group independent component analysis. Resting state networks were identified and evaluated in terms of three primary outcome measures: time course spectral power, spatial map intensity, and functional network connectivity. Results revealed robust effects of age on all three outcome measures, largely indicating decreases in network coherence and connectivity with increasing age. Gender effects were of smaller magnitude but suggested stronger intra-network connectivity in females and more inter-network connectivity in males, particularly with regard to sensorimotor networks. These findings, along with the analysis approach and statistical framework described here, provide a useful baseline for future investigations of brain networks in health and disease.

  14. A Baseline for the Multivariate Comparison of Resting-State Networks

    Science.gov (United States)

    Allen, Elena A.; Erhardt, Erik B.; Damaraju, Eswar; Gruner, William; Segall, Judith M.; Silva, Rogers F.; Havlicek, Martin; Rachakonda, Srinivas; Fries, Jill; Kalyanam, Ravi; Michael, Andrew M.; Caprihan, Arvind; Turner, Jessica A.; Eichele, Tom; Adelsheim, Steven; Bryan, Angela D.; Bustillo, Juan; Clark, Vincent P.; Feldstein Ewing, Sarah W.; Filbey, Francesca; Ford, Corey C.; Hutchison, Kent; Jung, Rex E.; Kiehl, Kent A.; Kodituwakku, Piyadasa; Komesu, Yuko M.; Mayer, Andrew R.; Pearlson, Godfrey D.; Phillips, John P.; Sadek, Joseph R.; Stevens, Michael; Teuscher, Ursina; Thoma, Robert J.; Calhoun, Vince D.

    2011-01-01

    As the size of functional and structural MRI datasets expands, it becomes increasingly important to establish a baseline from which diagnostic relevance may be determined, a processing strategy that efficiently prepares data for analysis, and a statistical approach that identifies important effects in a manner that is both robust and reproducible. In this paper, we introduce a multivariate analytic approach that optimizes sensitivity and reduces unnecessary testing. We demonstrate the utility of this mega-analytic approach by identifying the effects of age and gender on the resting-state networks (RSNs) of 603 healthy adolescents and adults (mean age: 23.4 years, range: 12–71 years). Data were collected on the same scanner, preprocessed using an automated analysis pipeline based in SPM, and studied using group independent component analysis. RSNs were identified and evaluated in terms of three primary outcome measures: time course spectral power, spatial map intensity, and functional network connectivity. Results revealed robust effects of age on all three outcome measures, largely indicating decreases in network coherence and connectivity with increasing age. Gender effects were of smaller magnitude but suggested stronger intra-network connectivity in females and more inter-network connectivity in males, particularly with regard to sensorimotor networks. These findings, along with the analysis approach and statistical framework described here, provide a useful baseline for future investigations of brain networks in health and disease. PMID:21442040

  15. Continuous subcutaneous insulin infusion versus multiple daily injections: the impact of baseline A1c

    NARCIS (Netherlands)

    Retnakaran, Ravi; Hochman, Jackie; DeVries, J. Hans; Hanaire-Broutin, Helene; Heine, Robert J.; Melki, Vincent; Zinman, Bernard

    2004-01-01

    Rapid-acting insulin analogs (insulin lispro and insulin aspart) have emerged as the meal insulin of choice in both multiple daily insulin injection (MDII) therapy and continuous subcutaneous insulin infusion (CSII) for type 1 diabetes. Thus, a comparison of efficacy between CSII and MDII should be

  16. Improving Baseline Model Assumptions: Evaluating the Impacts of Typical Methodological Approaches in Watershed Models

    Science.gov (United States)

    Muenich, R. L.; Kalcic, M. M.; Teshager, A. D.; Long, C. M.; Wang, Y. C.; Scavia, D.

    2017-12-01

Thanks to the availability of open-source software, online tutorials, and advanced software capabilities, watershed modeling has expanded its user base and applications significantly in the past thirty years. Even complicated models like the Soil and Water Assessment Tool (SWAT) are being used and documented in hundreds of peer-reviewed publications each year, and likely applied even more often in practice. These models can help improve our understanding of present, past, and future conditions, or analyze important "what-if" management scenarios. However, baseline data and methods are often adopted and applied without rigorous testing. In multiple collaborative projects, we have evaluated the influence of some of these common approaches on model results. Specifically, we examined impacts of baseline data and assumptions involved in manure application, combined sewer overflows, and climate data incorporation across multiple watersheds in the Western Lake Erie Basin. In these efforts, we seek to understand the impact of using typical modeling data and assumptions, versus using improved data and enhanced assumptions, on model outcomes and thus ultimately, study conclusions. We provide guidance for modelers as they adopt and apply data and models for their specific study region. While it is difficult to quantitatively assess the full uncertainty surrounding model input data and assumptions, recognizing the impacts of model input choices is important when considering actions at both the field and watershed scales.

  17. Efficient surrogate models for reliability analysis of systems with multiple failure modes

    International Nuclear Information System (INIS)

    Bichon, Barron J.; McFarland, John M.; Mahadevan, Sankaran

    2011-01-01

    Despite many advances in the field of computational reliability analysis, the efficient estimation of the reliability of a system with multiple failure modes remains a persistent challenge. Various sampling and analytical methods are available, but they typically require accepting a tradeoff between accuracy and computational efficiency. In this work, a surrogate-based approach is presented that simultaneously addresses the issues of accuracy, efficiency, and unimportant failure modes. The method is based on the creation of Gaussian process surrogate models that are required to be locally accurate only in the regions of the component limit states that contribute to system failure. This approach to constructing surrogate models is demonstrated to be both an efficient and accurate method for system-level reliability analysis. - Highlights: → Extends efficient global reliability analysis to systems with multiple failure modes. → Constructs locally accurate Gaussian process models of each response. → Highly efficient and accurate method for assessing system reliability. → Effectiveness is demonstrated on several test problems from the literature.

  18. Structural model analysis of multiple quantitative traits.

    Directory of Open Access Journals (Sweden)

    Renhua Li

    2006-07-01

    Full Text Available We introduce a method for the analysis of multilocus, multitrait genetic data that provides an intuitive and precise characterization of genetic architecture. We show that it is possible to infer the magnitude and direction of causal relationships among multiple correlated phenotypes and illustrate the technique using body composition and bone density data from mouse intercross populations. Using these techniques we are able to distinguish genetic loci that affect adiposity from those that affect overall body size and thus reveal a shortcoming of standardized measures such as body mass index that are widely used in obesity research. The identification of causal networks sheds light on the nature of genetic heterogeneity and pleiotropy in complex genetic systems.

  19. Failure analysis of high strength pipeline with single and multiple corrosions

    International Nuclear Information System (INIS)

    Chen, Yanfei; Zhang, Hong; Zhang, Juan; Li, Xin; Zhou, Jing

    2015-01-01

Highlights: • We study failure of high strength pipelines with single corrosion. • We give regression equations for failure pressure prediction. • We propose an assessment procedure for pipelines with multiple corrosions. - Abstract: Corrosion compromises the safe operation of oil and gas pipelines, so accurate determination of failure pressure is important for residual strength assessment and corrosion allowance design of onshore and offshore pipelines. This paper investigates the failure pressure of high strength pipelines with single and multiple corrosion defects using nonlinear finite element analysis. On the basis of the developed regression equations for failure pressure prediction of high strength pipelines with a single corrosion defect, the paper proposes an assessment procedure for predicting the failure pressure of high strength pipelines with multiple corrosion defects. Furthermore, failure pressures predicted by the proposed solutions are compared with experimental results and various assessment methods available in the literature, demonstrating their accuracy and versatility.

  20. The TDAQ Baseline Architecture

    CERN Multimedia

    Wickens, F J

    The Trigger-DAQ community is currently busy preparing material for the DAQ, HLT and DCS TDR. Over the last few weeks a very important step has been a series of meetings to complete agreement on the baseline architecture. An overview of the architecture indicating some of the main parameters is shown in figure 1. As reported at the ATLAS Plenary during the February ATLAS week, the main area where the baseline had not yet been agreed was around the Read-Out System (ROS) and details in the DataFlow. The agreed architecture has: Read-Out Links (ROLs) from the RODs using S-Link; Read-Out Buffers (ROB) sited near the RODs, mounted in a chassis - today assumed to be a PC, using PCI bus at least for configuration, control and monitoring. The baseline assumes data aggregation, in the ROB and/or at the output (which could either be over a bus or in the network). Optimization of the data aggregation will be made in the coming months, but the current model has each ROB card receiving input from 4 ROLs, and 3 such c...

  1. Multiple Sclerosis Increases Fracture Risk: A Meta-Analysis

    Directory of Open Access Journals (Sweden)

    Guixian Dong

    2015-01-01

Full Text Available Purpose. The association between multiple sclerosis (MS) and fracture risk has been reported, but results of previous studies remain controversial and ambiguous. To assess the association between MS and fracture risk, a meta-analysis was performed. Method. Based on comprehensive searches of PubMed, Embase, and Web of Science, we identified outcome data from all articles estimating the association between MS and fracture risk. The pooled risk ratios (RRs) with 95% confidence intervals (CIs) were calculated. Results. A significant association between MS and fracture risk was found. This result remained statistically significant when the adjusted RRs were combined. Subgroup analysis stratified by the site of fracture suggested significant associations between MS and tibia, femur, hip, pelvis, vertebrae, and humerus fracture risk. In the subgroup analysis by gender, female MS patients had increased fracture risk. When stratified by history of drug use, use of antidepressants, hypnotics/anxiolytics, anticonvulsants, and glucocorticoids increased fracture risk in MS patients. Conclusions. This meta-analysis demonstrated that MS was significantly associated with fracture risk.
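The pooling step of such a meta-analysis is typically fixed-effect inverse-variance weighting of log risk ratios. A minimal sketch with hypothetical study-level estimates (not the values from this paper):

```python
import numpy as np

def pool_rr(rrs, ci_lows, ci_highs):
    """Fixed-effect inverse-variance pooling of risk ratios on the log scale.
    Study standard errors are back-calculated from the reported 95% CIs."""
    log_rr = np.log(rrs)
    se = (np.log(ci_highs) - np.log(ci_lows)) / (2 * 1.96)
    w = 1 / se ** 2                       # inverse-variance weights
    pooled = np.sum(w * log_rr) / np.sum(w)
    pooled_se = np.sqrt(1 / np.sum(w))
    ci = np.exp([pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se])
    return np.exp(pooled), ci

# Hypothetical study-level estimates (RR with 95% CI bounds).
rr, ci = pool_rr(np.array([1.8, 2.2, 1.5]),
                 np.array([1.2, 1.4, 1.1]),
                 np.array([2.7, 3.5, 2.1]))
print(round(float(rr), 2), np.round(ci, 2))
```

A random-effects model (e.g., DerSimonian-Laird) would add a between-study variance component to the weights; the fixed-effect version above is the simplest member of the family.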

  2. The Research of Multiple Attenuation Based on Feedback Iteration and Independent Component Analysis

    Science.gov (United States)

    Xu, X.; Tong, S.; Wang, L.

    2017-12-01

How to suppress multiples is a difficult problem in seismic data processing. The traditional technology for multiple attenuation is based on the principle of minimum output energy of the seismic signal; this criterion rests on second-order statistics and cannot attenuate multiples when the primaries and multiples are non-orthogonal. To solve this problem, we combine a feedback iteration method based on the wave equation with an improved independent component analysis (ICA) based on higher-order statistics. We first use the iterative feedback method to predict the free-surface multiples of each order. Then, to match the predicted multiples to the true multiples in amplitude and phase, we design an expanded pseudo-multichannel matching filter that yields a more accurate matching result. Finally, we apply an improved FastICA algorithm, based on a maximum non-Gaussianity criterion for the output signal, to the matched multiples and obtain a better separation of the primaries and the multiples. The advantage of our method is that no prior information is needed to predict the multiples, and the separation is improved. The method has been applied to several synthetic datasets generated by a finite-difference modeling technique and to the Sigsbee2B model multiple data; the primaries and multiples are non-orthogonal in these models. The experiments show that after three to four iterations we obtain accurate multiple predictions. Using our matching method and FastICA adaptive multiple subtraction, we can not only effectively preserve the energy of the effective waves in the seismic records but also effectively suppress the free-surface multiples, especially those related to the middle and deep areas.

  3. Rasch analysis of the Multiple Sclerosis Impact Scale (MSIS-29)

    Directory of Open Access Journals (Sweden)

    Misajon Rose

    2009-06-01

Full Text Available Abstract Background Multiple Sclerosis (MS) is a degenerative neurological disease that causes impairments, including spasticity, pain, fatigue, and bladder dysfunction, which negatively impact on quality of life. The Multiple Sclerosis Impact Scale (MSIS-29) is a disease-specific health-related quality of life (HRQoL) instrument, developed using the patient's perspective on disease impact. It consists of two subscales assessing the physical (MSIS-29-PHYS) and psychological (MSIS-29-PSYCH) impact of MS. Although previous studies have found support for the psychometric properties of the MSIS-29 using traditional methods of scale evaluation, the scale has not been subjected to a detailed Rasch analysis. Therefore, the objective of this study was to use Rasch analysis to assess the internal validity of the scale, and its response format, item fit, targeting, internal consistency and dimensionality. Methods Ninety-two persons with definite MS residing in the community were recruited from a tertiary hospital database. Patients completed the MSIS-29 as part of a larger study. Rasch analysis was undertaken to assess the psychometric properties of the MSIS-29. Results Rasch analysis showed overall support for the psychometric properties of the two MSIS-29 subscales; however, it was necessary to reduce the response format of the MSIS-29-PHYS to a 3-point response scale. Both subscales were unidimensional, had good internal consistency, and were free from item bias for sex and age. Dimensionality testing indicated it was not appropriate to combine the two subscales to form a total MSIS score. Conclusion In this first study to use Rasch analysis to fully assess the psychometric properties of the MSIS-29, support was found for the two subscales but not for the use of the total scale. Further use of Rasch analysis on the MSIS-29 in larger and broader samples is recommended to confirm these findings.

  4. Rasch analysis of the Multiple Sclerosis Impact Scale (MSIS-29)

    Science.gov (United States)

    Ramp, Melina; Khan, Fary; Misajon, Rose Anne; Pallant, Julie F

    2009-01-01

    Background Multiple Sclerosis (MS) is a degenerative neurological disease that causes impairments, including spasticity, pain, fatigue, and bladder dysfunction, which negatively impact on quality of life. The Multiple Sclerosis Impact Scale (MSIS-29) is a disease-specific health-related quality of life (HRQoL) instrument, developed using the patient's perspective on disease impact. It consists of two subscales assessing the physical (MSIS-29-PHYS) and psychological (MSIS-29-PSYCH) impact of MS. Although previous studies have found support for the psychometric properties of the MSIS-29 using traditional methods of scale evaluation, the scale has not been subjected to a detailed Rasch analysis. Therefore, the objective of this study was to use Rasch analysis to assess the internal validity of the scale, and its response format, item fit, targeting, internal consistency and dimensionality. Methods Ninety-two persons with definite MS residing in the community were recruited from a tertiary hospital database. Patients completed the MSIS-29 as part of a larger study. Rasch analysis was undertaken to assess the psychometric properties of the MSIS-29. Results Rasch analysis showed overall support for the psychometric properties of the two MSIS-29 subscales, however it was necessary to reduce the response format of the MSIS-29-PHYS to a 3-point response scale. Both subscales were unidimensional, had good internal consistency, and were free from item bias for sex and age. Dimensionality testing indicated it was not appropriate to combine the two subscales to form a total MSIS score. Conclusion In this first study to use Rasch analysis to fully assess the psychometric properties of the MSIS-29 support was found for the two subscales but not for the use of the total scale. Further use of Rasch analysis on the MSIS-29 in larger and broader samples is recommended to confirm these findings. PMID:19545445

  5. Climate change and watershed mercury export: a multiple projection and model analysis.

    Science.gov (United States)

    Golden, Heather E; Knightes, Christopher D; Conrads, Paul A; Feaster, Toby D; Davis, Gary M; Benedict, Stephen T; Bradley, Paul M

    2013-09-01

    Future shifts in climatic conditions may impact watershed mercury (Hg) dynamics and transport. An ensemble of watershed models was applied in the present study to simulate and evaluate the responses of hydrological and total Hg (THg) fluxes from the landscape to the watershed outlet and in-stream THg concentrations to contrasting climate change projections for a watershed in the southeastern coastal plain of the United States. Simulations were conducted under stationary atmospheric deposition and land cover conditions to explicitly evaluate the effect of projected precipitation and temperature on watershed Hg export (i.e., the flux of Hg at the watershed outlet). Based on downscaled inputs from 2 global circulation models that capture extremes of projected wet (Community Climate System Model, Ver 3 [CCSM3]) and dry (ECHAM4/HOPE-G [ECHO]) conditions for this region, watershed model simulation results suggest a decrease of approximately 19% in ensemble-averaged mean annual watershed THg fluxes using the ECHO climate-change model and an increase of approximately 5% in THg fluxes with the CCSM3 model. Ensemble-averaged mean annual ECHO in-stream THg concentrations increased 20%, while those of CCSM3 decreased by 9% between the baseline and projected simulation periods. Watershed model simulation results using both climate change models suggest that monthly watershed THg fluxes increase during the summer, when projected flow is higher than baseline conditions. The present study's multiple watershed model approach underscores the uncertainty associated with climate change response projections and their use in climate change management decisions. Thus, single-model predictions can be misleading, particularly in developmental stages of watershed Hg modeling. Copyright © 2013 SETAC.

  6. Climate change and watershed mercury export: a multiple projection and model analysis

    Science.gov (United States)

    Golden, Heather E.; Knightes, Christopher D.; Conrads, Paul; Feaster, Toby D.; Davis, Gary M.; Benedict, Stephen T.; Bradley, Paul M.

    2013-01-01

    Future shifts in climatic conditions may impact watershed mercury (Hg) dynamics and transport. An ensemble of watershed models was applied in the present study to simulate and evaluate the responses of hydrological and total Hg (THg) fluxes from the landscape to the watershed outlet and in-stream THg concentrations to contrasting climate change projections for a watershed in the southeastern coastal plain of the United States. Simulations were conducted under stationary atmospheric deposition and land cover conditions to explicitly evaluate the effect of projected precipitation and temperature on watershed Hg export (i.e., the flux of Hg at the watershed outlet). Based on downscaled inputs from 2 global circulation models that capture extremes of projected wet (Community Climate System Model, Ver 3 [CCSM3]) and dry (ECHAM4/HOPE-G [ECHO]) conditions for this region, watershed model simulation results suggest a decrease of approximately 19% in ensemble-averaged mean annual watershed THg fluxes using the ECHO climate-change model and an increase of approximately 5% in THg fluxes with the CCSM3 model. Ensemble-averaged mean annual ECHO in-stream THg concentrations increased 20%, while those of CCSM3 decreased by 9% between the baseline and projected simulation periods. Watershed model simulation results using both climate change models suggest that monthly watershed THg fluxes increase during the summer, when projected flow is higher than baseline conditions. The present study's multiple watershed model approach underscores the uncertainty associated with climate change response projections and their use in climate change management decisions. Thus, single-model predictions can be misleading, particularly in developmental stages of watershed Hg modeling.

  7. Non-destructive assay of fissile materials by detection and multiplicity analysis of spontaneous neutrons

    International Nuclear Information System (INIS)

    Prosdocimi, A.

    1979-01-01

    A method for determining the absolute reaction rate of nuclear events giving rise to neutron emission, according to their neutron multiplicity, is proposed. A typical application is the measurement of the (α, n) and spontaneous fission rates in a fissile material sample, particularly of Pu oxide composition. An analysis of random and correlated neutron pulses is carried out on the basis of sequential order, without requiring any time interval analysis; the primary nuclear events are then sorted according to their neutron multiplicity. Suitable theoretical relationships enable derivation of the absolute (α, n) and SF reaction rates when the physical parameters of the neutron detector and the multiplicity spectrum of pulses are known. A typical device is described and the results of experiments leading to Pu-239 and Pu-240 assay are given.

  8. Autonomous and controlled motivation for eating disorders treatment: baseline predictors and relationship to treatment outcome.

    Science.gov (United States)

    Carter, Jacqueline C; Kelly, Allison C

    2015-03-01

    This study aimed to identify baseline predictors of autonomous and controlled motivation for treatment (ACMT) in a transdiagnostic eating disorder sample, and to examine whether ACMT at baseline predicted change in eating disorder psychopathology during treatment. Participants were 97 individuals who met DSM-IV-TR criteria for an eating disorder and were admitted to a specialized intensive treatment programme. Self-report measures of eating disorder psychopathology, ACMT, and various psychosocial variables were completed at the start of treatment. A subset of these measures was completed again after 3, 6, 9, and 12 weeks of treatment. Multiple regression analyses showed that baseline autonomous motivation was higher among patients who reported more self-compassion and more received social support, whereas the only baseline predictor of controlled motivation was shame. Multilevel modelling revealed that higher baseline autonomous motivation predicted faster decreases in global eating disorder psychopathology, whereas the level of controlled motivation at baseline did not. The current findings suggest that developing interventions designed to foster autonomous motivation specifically and employing autonomy supportive strategies may be important to improving eating disorders treatment outcome. The findings of this study suggest that developing motivational interventions that focus specifically on enhancing autonomous motivation for change may be important for promoting eating disorder recovery. Our results lend support for the use of autonomy supportive strategies to strengthen personally meaningful reasons to achieve freely chosen change goals in order to enhance treatment for eating disorders. One study limitation is that there were no follow-up assessments beyond the 12-week study and we therefore do not know whether the relationships that we observed persisted after treatment. 
Another limitation is that this was a correlational study and it is therefore important

  9. THE US LONG BASELINE NEUTRINO EXPERIMENT STUDY.

    Energy Technology Data Exchange (ETDEWEB)

    BISHAI,M.

    2007-08-06

    The US Long Baseline Neutrino Experiment Study was commissioned jointly by Brookhaven National Laboratory (BNL) and Fermi National Accelerator Laboratory (FNAL) to investigate the potential for future U.S. based long baseline neutrino oscillation experiments using MW class conventional neutrino beams that can be produced at FNAL. The experimental baselines are based on two possible detector locations: (1) off-axis to the existing FNAL NuMI beamline at baselines of 700 to 810 km and (2) NSF's proposed future Deep Underground Science and Engineering Laboratory (DUSEL) at baselines greater than 1000 km. Two detector technologies are considered: a megaton class Water Cherenkov detector deployed deep underground at a DUSEL site, or a 100 kT Liquid Argon Time-Projection Chamber (TPC) deployed on the surface at any of the proposed sites. The physics sensitivities of the proposed experiments are summarized. We find that conventional horn focused wide-band neutrino beam options from FNAL aimed at a massive detector with a baseline of > 1000 km have the best sensitivity to CP violation and the neutrino mass hierarchy for values of the mixing angle θ13 down to 2°.

  10. Hearing the voices of service user researchers in collaborative qualitative data analysis: the case for multiple coding.

    Science.gov (United States)

    Sweeney, Angela; Greenwood, Kathryn E; Williams, Sally; Wykes, Til; Rose, Diana S

    2013-12-01

    Health research is frequently conducted in multi-disciplinary teams, with these teams increasingly including service user researchers. Whilst it is common for service user researchers to be involved in data collection, most typically interviewing other service users, it is less common for service user researchers to be involved in data analysis and interpretation. This means that a unique and significant perspective on the data is absent. This study aims to use an empirical report of a study on Cognitive Behavioural Therapy for psychosis (CBTp) to demonstrate the value of multiple coding in enabling service users' voices to be heard in team-based qualitative data analysis. The CBTp study employed multiple coding to analyse service users' discussions of CBTp from the perspectives of a service user researcher, a clinical researcher and a psychology assistant. Multiple coding was selected to enable multiple perspectives to analyse and interpret the data, to understand and explore differences and to build multi-disciplinary consensus. Multiple coding enabled the team to understand where our views were commensurate and incommensurate and to discuss and debate differences. Through the process of multiple coding, we were able to build strong consensus about the data from multiple perspectives, including that of the service user researcher. Multiple coding is an important method for understanding and exploring multiple perspectives on data and building team consensus. This can be contrasted with inter-rater reliability, which is appropriate only in limited circumstances. We conclude that multiple coding is an appropriate and important means of hearing service users' voices in qualitative data analysis. © 2012 John Wiley & Sons Ltd.

  11. The effect of a motivational intervention on weight loss is moderated by level of baseline controlled motivation

    Directory of Open Access Journals (Sweden)

    Tate Deborah F

    2010-01-01

    Full Text Available Abstract Background Clinic-based behavioral weight loss programs are effective in producing significant weight loss. A one-size-fits-all approach is often taken with these programs. It may be beneficial to tailor programs based on participants' baseline characteristics. Type and level of motivation may be an important factor to consider. Previous research has found that, in general, higher levels of controlled motivation are detrimental to behavior change while higher levels of autonomous motivation improve the likelihood of behavior modification. Methods This study assessed the outcomes of two internet behavioral weight loss interventions and assessed the effect of baseline motivation levels on program success. Eighty females (M (SD) age 48.7 (10.6) years; BMI 32.0 (3.7) kg/m2; 91% Caucasian) were randomized to one of two groups, a standard group or a motivation-enhanced group. Both received a 16-week internet behavioral weight loss program and attended an initial and a four-week group session. Weight and motivation were measured at baseline, four and 16 weeks. Hierarchical regression analysis was conducted to test for moderation. Results There was significant weight loss at 16 weeks in both groups, with no difference between groups (p = 0.57; standard group 3.4 (3.6) kg; motivation-enhanced group 3.9 (3.4) kg). Further analysis was conducted to examine predictors of weight loss. Baseline controlled motivation level was negatively correlated with weight loss in the entire sample (r = -0.30; p = 0.01). Statistical analysis revealed an interaction between study group assignment and baseline level of controlled motivation. Weight loss was not predicted by baseline level of controlled motivation in the motivation-enhanced group, but was significantly predicted by controlled motivation in the standard group. Baseline autonomous motivation did not predict weight change in either group. 
Conclusions This research found that, in participants with high levels of baseline controlled motivation
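    The moderation pattern this record describes (an intervention effect that depends on a baseline characteristic) is conventionally tested with a regression containing a group-by-baseline interaction term. Below is a minimal sketch on simulated data; the effect sizes are hypothetical, loosely patterned on the reported result that controlled motivation predicted weight loss only in the standard group.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 80

# Hypothetical data mimicking the design: group (0 = standard, 1 = enhanced),
# standardized baseline controlled motivation, and 16-week weight loss in kg.
group = rng.integers(0, 2, n)
controlled = rng.normal(0.0, 1.0, n)
# Simulated pattern loosely matching the reported result: controlled motivation
# reduces weight loss only in the standard group.
weight_loss = 3.5 - 1.2 * controlled * (group == 0) + rng.normal(0.0, 1.0, n)

# Moderated regression: the group-by-motivation interaction term is the test.
X = np.column_stack([np.ones(n), group, controlled, group * controlled])
beta, *_ = np.linalg.lstsq(X, weight_loss, rcond=None)
print(dict(zip(["intercept", "group", "controlled", "interaction"], beta.round(2))))
```

    A nonzero interaction coefficient is what licenses the claim that baseline controlled motivation moderated the intervention effect; in the study itself this was tested via hierarchical regression.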

  12. Supervised Cross-Modal Factor Analysis for Multiple Modal Data Classification

    KAUST Repository

    Wang, Jingbin

    2015-10-09

    In this paper we study the problem of learning from multiple modal data for the purpose of document classification. In this problem, each document is composed of two different modals of data, i.e., an image and a text. Cross-modal factor analysis (CFA) has been proposed to project the two different modals of data to a shared data space, so that the classification of an image or a text can be performed directly in this space. A disadvantage of CFA is that it ignores the supervision information. In this paper, we improve CFA by incorporating the supervision information to represent and classify both image and text modals of documents. We project both image and text data to a shared data space by factor analysis, and then train a class label predictor in the shared space to use the class label information. The factor analysis parameter and the predictor parameter are learned jointly by solving a single objective function. With this objective function, we minimize the distance between the projections of the image and text of the same document, and the classification error of the projection measured by a hinge loss function. The objective function is optimized by an alternating optimization strategy in an iterative algorithm. Experiments on two different multiple modal document data sets show the advantage of the proposed algorithm over other CFA methods.

  13. Limitations in Using Multiple Imputation to Harmonize Individual Participant Data for Meta-Analysis.

    Science.gov (United States)

    Siddique, Juned; de Chavez, Peter J; Howe, George; Cruden, Gracelyn; Brown, C Hendricks

    2018-02-01

    Individual participant data (IPD) meta-analysis is a meta-analysis in which the individual-level data for each study are obtained and used for synthesis. A common challenge in IPD meta-analysis is when variables of interest are measured differently in different studies. The term harmonization has been coined to describe the procedure of placing variables on the same scale in order to permit pooling of data from a large number of studies. Using data from an IPD meta-analysis of 19 adolescent depression trials, we describe a multiple imputation approach for harmonizing 10 depression measures across the 19 trials by treating those depression measures that were not used in a study as missing data. We then apply diagnostics to address the fit of our imputation model. Even after reducing the scale of our application, we were still unable to produce accurate imputations of the missing values. We describe those features of the data that made it difficult to harmonize the depression measures and provide some guidelines for using multiple imputation for harmonization in IPD meta-analysis.
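    The harmonization idea, treating scales a trial did not administer as missing data, can be illustrated with a toy two-scale example. This is a hedged sketch with invented data, and a single regression imputation stands in for proper multiple imputation (which would add residual draws and pool analyses over several imputed data sets); the scale relationships and numbers are hypothetical, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: scales A and B both measure depression and are correlated
# through a shared latent severity; a "bridge" study measures both scales, while
# a second trial administered only scale A (so its scale-B scores are missing).
latent = rng.normal(0.0, 1.0, 300)
a = 10 + 4 * latent + rng.normal(0.0, 1.0, 300)  # scale A scores
b = 20 + 6 * latent + rng.normal(0.0, 2.0, 300)  # scale B scores

bridge = slice(0, 100)          # participants measured on both scales
b_observed = b.copy()
b_observed[100:] = np.nan       # scale B treated as missing elsewhere

# Single regression imputation (a simplified stand-in for multiple imputation):
# fit B ~ A on the bridge sample, then fill in the missing scale-B values.
slope, intercept = np.polyfit(a[bridge], b[bridge], 1)
imputed = slope * a[100:] + intercept
b_harmonized = b_observed.copy()
b_harmonized[100:] = imputed
```

    The paper's caution applies directly here: when the scales share too little variance, or no bridge sample measures both, imputations of this kind cannot be made accurate, which is exactly the failure the authors report.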

  14. Pipeline integrity: ILI baseline data for QRA

    Energy Technology Data Exchange (ETDEWEB)

    Porter, Todd R. [Tuboscope Pipeline Services, Houston, TX (United States)]. E-mail: tporter@varco.com; Silva, Jose Augusto Pereira da [Pipeway Engenharia, Rio de Janeiro, RJ (Brazil)]. E-mail: guto@pipeway.com; Marr, James [MARR and Associates, Calgary, AB (Canada)]. E-mail: jmarr@marr-associates.com

    2003-07-01

    The initial phase of a pipeline integrity management program (IMP) is conducting a baseline assessment of the pipeline system and segments as part of Quantitative Risk Assessment (QRA). This gives the operator's integrity team the opportunity to identify critical areas and deficiencies in the protection, maintenance, and mitigation strategies. As a part of data gathering and integration of a wide variety of sources, in-line inspection (ILI) data is a key element. In order to move forward in the integrity program development and execution, the baseline geometry of the pipeline must be determined with accuracy and confidence. From this, all subsequent analysis and conclusions will be derived. Tuboscope Pipeline Services (TPS), in conjunction with Pipeway Engenharia of Brazil, operate ILI inertial navigation system (INS) and Caliper geometry tools, to address this integrity requirement. This INS and Caliper ILI tool data provides pipeline trajectory at centimeter level resolution and sub-metre 3D position accuracy along with internal geometry - ovality, dents, misalignment, and wrinkle/buckle characterization. Global strain can be derived from precise INS curvature measurements and departure from the initial pipeline state. Accurate pipeline elevation profile data is essential in the identification of sag/overbend sections for fluid dynamic and hydrostatic calculations. This data, along with pipeline construction, operations, direct assessment and maintenance data is integrated in LinaViewPRO™, a pipeline data management system for decision support functions, and subsequent QRA operations. This technology provides the baseline for an informed, accurate and confident integrity management program. This paper/presentation will detail these aspects of an effective IMP, and experience will be presented, showing the benefits for liquid and gas pipeline systems. (author)

  15. Instantiating the multiple levels of analysis perspective in a program of study on externalizing behavior

    Science.gov (United States)

    Beauchaine, Theodore P.; Gatzke-Kopp, Lisa M.

    2014-01-01

    During the last quarter century, developmental psychopathology has become increasingly inclusive and now spans disciplines ranging from psychiatric genetics to primary prevention. As a result, developmental psychopathologists have extended traditional diathesis–stress and transactional models to include causal processes at and across all relevant levels of analysis. Such research is embodied in what is known as the multiple levels of analysis perspective. We describe how multiple levels of analysis research has informed our current thinking about antisocial and borderline personality development among trait impulsive and therefore vulnerable individuals. Our approach extends the multiple levels of analysis perspective beyond simple Biology × Environment interactions by evaluating impulsivity across physiological systems (genetic, autonomic, hormonal, neural), psychological constructs (social, affective, motivational), developmental epochs (preschool, middle childhood, adolescence, adulthood), sexes (male, female), and methods of inquiry (self-report, informant report, treatment outcome, cardiovascular, electrophysiological, neuroimaging). By conducting our research using any and all available methods across these levels of analysis, we have arrived at a developmental model of trait impulsivity that we believe confers a greater understanding of this highly heritable trait and captures at least some heterogeneity in key behavioral outcomes, including delinquency and suicide. PMID:22781868

  16. Do leukocyte telomere length dynamics depend on baseline telomere length? An analysis that corrects for ‘regression to the mean’

    International Nuclear Information System (INIS)

    Verhulst, Simon; Aviv, Abraham; Benetos, Athanase; Berenson, Gerald S.; Kark, Jeremy D.

    2013-01-01

    Leukocyte telomere length (LTL) shortens with age. Longitudinal studies have reported accelerated LTL attrition when baseline LTL is longer. However, the dependency of LTL attrition on baseline LTL might stem from a statistical artifact known as regression to the mean (RTM). To our knowledge no published study of LTL dynamics (LTL and its attrition rate) has corrected for this phenomenon. We illustrate the RTM effect using replicate LTL measurements, and show, using simulated data, how the RTM effect increases with a rise in stochastic measurement variation (representing LTL measurement error), resulting in spurious increasingly elevated dependencies of attrition on baseline values. In addition, we re-analyzed longitudinal LTL data collected from four study populations to test the hypothesis that LTL attrition depends on baseline LTL. We observed that the rate of LTL attrition was proportional to baseline LTL, but correction for the RTM effect reduced the slope of the relationship by 57% when measurement error was low (coefficient of variation ~2%). A modest but statistically significant effect remained however, indicating that high baseline LTL is associated with higher LTL attrition even when correcting for the RTM effect. Baseline LTL explained 1.3% of the variation in LTL attrition, but this effect, which differed significantly between the study samples, appeared to be primarily attributable to the association in men (3.7%).
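    The regression-to-the-mean artifact the authors correct for is easy to reproduce in simulation: give everyone a true attrition rate that is statistically independent of true baseline LTL, add measurement error, and a regression of measured attrition on measured baseline still yields a positive slope. A small sketch (all numbers hypothetical):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2000

# Hypothetical numbers: true baseline LTL (kb), a true attrition rate that is
# independent of true baseline, and additive measurement error on each visit.
true_baseline = rng.normal(7.0, 0.7, n)
true_attrition = rng.normal(0.3, 0.05, n)
err = 0.4  # measurement error SD; the RTM artifact grows with this value

measured_baseline = true_baseline + rng.normal(0.0, err, n)
measured_followup = true_baseline - true_attrition + rng.normal(0.0, err, n)
measured_attrition = measured_baseline - measured_followup

# Regressing measured attrition on measured baseline yields a clearly positive
# slope even though the true quantities are unrelated: regression to the mean.
rtm_slope = np.polyfit(measured_baseline, measured_attrition, 1)[0]
true_slope = np.polyfit(true_baseline, true_attrition, 1)[0]
```

    The inflated slope arises because the same error term appears in both the measured baseline and the measured attrition, which is why the artifact grows with measurement error, exactly the pattern the authors report.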

  17. Multiple Regression Analysis of Unconfined Compression Strength of Mine Tailings Matrices

    Directory of Open Access Journals (Sweden)

    Mahmood Ali A.

    2017-01-01

    Full Text Available As part of a novel approach to the sustainable development of mine tailings, experimental and numerical analysis is carried out on newly formulated tailings matrices. Several physical characterization tests are carried out, including the unconfined compression strength test, to ascertain the integrity of these matrices when subjected to loading. The current paper attempts a multiple regression analysis of the unconfined compressive strength test results of these matrices to investigate the most pertinent factors affecting their strength. Results of this analysis showed that the suggested equation is reasonably applicable to the range of binder combinations used.
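    Ordinary least squares with several physical predictors is the workhorse behind an analysis like this. A sketch on fabricated data; the predictors, units, and coefficients below are invented for illustration and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 40

# Invented predictors and effect sizes: binder content (%), curing time (days),
# and water-to-solids ratio, driving unconfined compressive strength (kPa).
binder = rng.uniform(2.0, 8.0, n)
curing = rng.uniform(7.0, 90.0, n)
w_s = rng.uniform(0.2, 0.5, n)
ucs = 50 + 30 * binder + 1.5 * curing - 120 * w_s + rng.normal(0.0, 10.0, n)

# Multiple regression by ordinary least squares.
X = np.column_stack([np.ones(n), binder, curing, w_s])
beta, *_ = np.linalg.lstsq(X, ucs, rcond=None)

# R^2: share of the strength variation explained by the three factors.
pred = X @ beta
r2 = 1 - np.sum((ucs - pred) ** 2) / np.sum((ucs - ucs.mean()) ** 2)
```

    Comparing the fitted coefficients (and their significance) is how one identifies the most pertinent factors affecting strength.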

  18. 49 CFR 192.921 - How is the baseline assessment to be conducted?

    Science.gov (United States)

    2010-10-01

    ...) PIPELINE AND HAZARDOUS MATERIALS SAFETY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) PIPELINE SAFETY TRANSPORTATION OF NATURAL AND OTHER GAS BY PIPELINE: MINIMUM FEDERAL SAFETY STANDARDS Gas... the covered pipeline segments for the baseline assessment according to a risk analysis that considers...

  19. Exploration of machine learning techniques in predicting multiple sclerosis disease course

    OpenAIRE

    Zhao, Yijun; Healy, Brian C.; Rotstein, Dalia; Guttmann, Charles R. G.; Bakshi, Rohit; Weiner, Howard L.; Brodley, Carla E.; Chitnis, Tanuja

    2017-01-01

    Objective To explore the value of machine learning methods for predicting multiple sclerosis disease course. Methods 1693 CLIMB study patients were classified as increased EDSS ≥ 1.5 (worsening) or not (non-worsening) at up to five years after baseline visit. Support vector machines (SVM) were used to build the classifier, and compared to logistic regression (LR) using demographic, clinical and MRI data obtained at years one and two to predict EDSS at five years follow-up. Results Baseline data...

  20. Robustness Analysis of Real Network Topologies Under Multiple Failure Scenarios

    DEFF Research Database (Denmark)

    Manzano, M.; Marzo, J. L.; Calle, E.

    2012-01-01

    Nowadays the ubiquity of telecommunication networks, which underpin and fulfill key aspects of modern day living, is taken for granted. Significant large-scale failures have occurred in the last years affecting telecommunication networks. Traditionally, network robustness analysis has been focused on topological characteristics. Recent approaches also consider the services supported by such networks. In this paper we carry out a robustness analysis of five real backbone telecommunication networks under defined multiple failure scenarios, taking into account the consequences of the loss of established connections. Results show which networks are more robust in response to a specific type of failure.

  1. Symptomatic treatment in multiple sclerosis-interim analysis of a nationwide registry.

    Science.gov (United States)

    Skierlo, S; Rommer, P S; Zettl, U K

    2017-04-01

    To analyze symptomatic treatment in patients with multiple sclerosis (MS). Multiple sclerosis is a chronic inflammatory disease of the central nervous system; with accumulating disability, symptoms like spasticity, voiding disorders, depression, and pain might occur. The nationwide German MS registry was initiated in 2001 under the guidance of the German MS society (Deutsche MS Gesellschaft). This study was performed as an interim analysis to lay the foundation for future work on this topic. A subcohort of 5113 patients was assessed for this interim analysis. The mean age of the patients was 45.3 years; mean EDSS was 4.2. More than two-thirds of the enrolled patients were females (70.9%). The most frequent symptoms were fatigue (60%), followed by spasticity (52.5%) and voiding disorders (51.7%). The likelihood of treatment was highest for epileptic disorders (68.8%), spasticity (68.5%), pain (60.7%), and depression (58.9%). Multivariate regression analysis showed that retirement was the strongest factor predictive for antispastic treatment (β=.061, P=.005). Almost all patients in this analysis suffer from symptoms due to advanced MS. Treatment for the various symptoms differed tremendously. The likelihood of treatment correlated with the availability of effective therapeutic agents. Clinicians should put more awareness on MS symptoms. Symptomatic treatment may improve quality of life. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  2. Association Between Baseline LDL-C Level and Total and Cardiovascular Mortality After LDL-C Lowering: A Systematic Review and Meta-analysis.

    Science.gov (United States)

    Navarese, Eliano P; Robinson, Jennifer G; Kowalewski, Mariusz; Kolodziejczak, Michalina; Andreotti, Felicita; Bliden, Kevin; Tantry, Udaya; Kubica, Jacek; Raggi, Paolo; Gurbel, Paul A

    2018-04-17

    Effects on specific fatal and nonfatal end points appear to vary for low-density lipoprotein cholesterol (LDL-C)-lowering drug trials. To evaluate whether baseline LDL-C level is associated with total and cardiovascular mortality risk reductions. Electronic databases (Cochrane, MEDLINE, EMBASE, TCTMD, ClinicalTrials.gov, major congress proceedings) were searched through February 2, 2018, to identify randomized clinical trials of statins, ezetimibe, and PCSK9-inhibiting monoclonal antibodies. Two investigators abstracted data and appraised risks of bias. Intervention groups were categorized as "more intensive" (more potent pharmacologic intervention) or "less intensive" (less potent, placebo, or control group). The coprimary end points were total mortality and cardiovascular mortality. Random-effects meta-regression and meta-analyses evaluated associations between baseline LDL-C level and reductions in mortality end points and secondary end points including major adverse cardiac events (MACE). In 34 trials, 136 299 patients received more intensive and 133 989 received less intensive LDL-C lowering. All-cause mortality was lower for more vs less intensive therapy (7.08% vs 7.70%; rate ratio [RR], 0.92 [95% CI, 0.88 to 0.96]), but varied by baseline LDL-C level. Meta-regression showed more intensive LDL-C lowering was associated with greater reductions in all-cause mortality with higher baseline LDL-C levels (change in RRs per 40-mg/dL increase in baseline LDL-C, 0.91 [95% CI, 0.86 to 0.96]; P = .001; absolute risk difference [ARD], -1.05 incident cases per 1000 person-years [95% CI, -1.59 to -0.51]), but only when baseline LDL-C levels were 100 mg/dL or greater (P baseline LDL-C level. Meta-regression showed more intensive LDL-C lowering was associated with a greater reduction in cardiovascular mortality with higher baseline LDL-C levels (change in RRs per 40-mg/dL increase in baseline LDL-C, 0.86 [95% CI, 0.80 to 0.94]; P baseline LDL-C levels were 100

  3. Cost-effectiveness analysis of treating transplant-eligible multiple myeloma patients in Macedonia

    Directory of Open Access Journals (Sweden)

    Qerimi V

    2018-06-01

    Full Text Available Vjollca Qerimi,1,2 Aleksandra Kapedanovska Nestorovska,1 Zoran Sterjev,1 Sonja Genadieva-Stavric,3 Ljubica Suturkova1 1Faculty of Pharmacy, Ss. Cyril and Methodius University in Skopje, Skopje, Macedonia; 2Institute of Public Health, Medical Decision Making and Health Technology Assessment, Department of Public Health, Health Services Research and Health Technology Assessment, UMIT – University for Health Sciences, Medical Informatics and Technology, Hall in Tirol, Austria; 3Medical Faculty, University Hematology Clinic, Skopje, Macedonia Purpose: A decision-analytic model was developed to study the impact of the induction regimens vincristine, adriamycin, dexamethasone (VAD); thalidomide, dexamethasone (TD); and bortezomib, dexamethasone (BorD), followed by autologous stem cell transplantation (ASCT), for treating multiple myeloma (MM) patients in Macedonia. Additionally, a cost-effectiveness analysis (CEA) of treatment sequences to predict health effects and costs of different treatment sequences was performed. Methods: Model strategies were based on a previously published study for treating patients with MM in Macedonia. The data on disease progression and treatment effectiveness were obtained from the published reports of randomized clinical trials (GIMEMA M-B02005, IFM 2005-01). Utility parameters were extracted from the literature. To compare treatment combinations, a decision tree model was developed. Additionally, a cost analysis for one-time per-protocol costs was performed from a Macedonian national health care perspective. The incremental cost-effectiveness ratios (ICERs) per quality-adjusted life year (QALY) gained for 1-, 10-, and 20-year time horizons were determined. Costs and health outcomes were discounted to evaluate the effects of time in the model. Results: The one-time costs of BorD (EUR 5,656) were higher compared to VAD (EUR 303) and TD (EUR 329), increasing the overall costs for BorD. Thus, the BorD combination dominated in the baseline

  4. Mediation analysis with multiple versions of the mediator.

    Science.gov (United States)

    Vanderweele, Tyler J

    2012-05-01

    The causal inference literature has provided definitions of direct and indirect effects based on counterfactuals that generalize the approach found in the social science literature. However, these definitions presuppose well-defined hypothetical interventions on the mediator. In many settings, there may be multiple ways to fix the mediator to a particular value, and these various hypothetical interventions may have very different implications for the outcome of interest. In this paper, we consider mediation analysis when multiple versions of the mediator are present. Specifically, we consider the problem of attempting to decompose a total effect of an exposure on an outcome into the portion through the intermediate and the portion through other pathways. We consider the setting in which there are multiple versions of the mediator but the investigator has access only to data on the particular measurement, not information on which version of the mediator may have brought that value about. We show that the quantity that is estimated as a natural indirect effect using only the available data does indeed have an interpretation as a particular type of mediated effect; however, the quantity estimated as a natural direct effect, in fact, captures both a true direct effect and an effect of the exposure on the outcome mediated through the effect of the version of the mediator that is not captured by the mediator measurement. The results are illustrated using 2 examples from the literature, one in which the versions of the mediator are unknown and another in which the mediator itself has been dichotomized.
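    For reference, the standard counterfactual decomposition that this work generalizes (written for a binary exposure with counterfactual outcome Y(a) and mediator M(a)) is:

    ```latex
    \begin{aligned}
    \text{TE}  &= E[Y(1) - Y(0)] \\
    \text{NDE} &= E[Y(1, M(0)) - Y(0, M(0))] \\
    \text{NIE} &= E[Y(1, M(1)) - Y(1, M(0))] \\
    \text{TE}  &= \text{NDE} + \text{NIE}
    \end{aligned}
    ```

    The paper's point is that when multiple versions of the mediator can produce the same measured value, the quantity estimated as the NDE also absorbs effects transmitted through the unmeasured version.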

  5. Long Baseline Observatory (LBO)

    Data.gov (United States)

    Federal Laboratory Consortium — The Long Baseline Observatory (LBO) comprises ten radio telescopes spanning 5,351 miles. It's the world's largest, sharpest, dedicated telescope array. With an eye...

  6. Dynamic Bus Travel Time Prediction Models on Road with Multiple Bus Routes.

    Science.gov (United States)

    Bai, Cong; Peng, Zhong-Ren; Lu, Qing-Chang; Sun, Jian

    2015-01-01

    Accurate and real-time travel time information for buses can help passengers better plan their trips and minimize waiting times. A dynamic travel time prediction model for buses addressing the cases on road with multiple bus routes is proposed in this paper, based on support vector machines (SVMs) and Kalman filtering-based algorithm. In the proposed model, the well-trained SVM model predicts the baseline bus travel times from the historical bus trip data; the Kalman filtering-based dynamic algorithm can adjust bus travel times with the latest bus operation information and the estimated baseline travel times. The performance of the proposed dynamic model is validated with the real-world data on road with multiple bus routes in Shenzhen, China. The results show that the proposed dynamic model is feasible and applicable for bus travel time prediction and has the best prediction performance among all the five models proposed in the study in terms of prediction accuracy on road with multiple bus routes.
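    The two-stage scheme can be sketched as follows. The scalar filter, the noise variances, and the fixed baseline value are illustrative assumptions, not the paper's implementation (which trains an SVM to supply the baseline travel time):

    ```python
    # Minimal sketch: a scalar Kalman filter that adjusts a baseline travel-time
    # prediction (a fixed number standing in for the SVM output) using the
    # latest observed travel times. All parameters are illustrative.

    def kalman_adjust(baseline, observations, q=4.0, r=9.0):
        """Blend a baseline prediction with noisy observations.
        q: process noise variance, r: observation noise variance."""
        x, p = baseline, 25.0        # initial state estimate and variance
        for z in observations:
            p = p + q                # predict step (state assumed persistent)
            k = p / (p + r)          # Kalman gain
            x = x + k * (z - x)      # update with the new observation
            p = (1 - k) * p
        return x

    baseline = 600.0                  # seconds, e.g. from a trained SVM regressor
    recent = [640.0, 655.0, 650.0]    # latest observed link travel times
    print(round(kalman_adjust(baseline, recent), 1))
    ```

    The filtered estimate always lands between the baseline and the recent observations, which is the intuition behind using the latest bus operation information to correct the historical model.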


  8. Low baseline levels of NK cells may predict a positive response to ipilimumab in melanoma therapy.

    Science.gov (United States)

    Tietze, Julia K; Angelova, Daniela; Heppt, Markus V; Ruzicka, Thomas; Berking, Carola

    2017-07-01

    The introduction of immune checkpoint blockade (ICB) has been a breakthrough in the therapy of metastatic melanoma. The influence of ICB on T-cell populations has been studied extensively, but little is known about its effect on NK cells. In this study, we analysed the relative and absolute amounts of NK cells and of the subpopulations of CD56dim and CD56bright NK cells among the peripheral blood mononuclear cells (PBMCs) of 32 patients with metastatic melanoma before and under treatment with ipilimumab or pembrolizumab by flow cytometry. In 15 (47%) patients, an abnormally low amount of NK cells was found at baseline. Analysis of the subpopulations also showed low or normal baseline levels for CD56dim NK cells, whereas the baseline levels of CD56bright NK cells were either normal or abnormally high. The relative and absolute amounts of NK cells and of CD56dim and CD56bright NK cell subpopulations in patients with a normal baseline did not change under treatment. However, patients with a low baseline of NK cells and CD56dim NK cells showed a significant increase in these immune cell subsets, but the amounts remained lower than the normal baseline. The amount of CD56bright NK cells was unaffected by treatment. The baseline levels of NK cells correlated with the number of metastatic organs. Their proportion increased, whereas the expression of NKG2D decreased significantly, when more than one organ was affected by metastases. Low baseline levels of NK cells and CD56dim NK cells, as well as normal baseline levels of CD56bright NK cells, correlated significantly with a positive response to ipilimumab but not to pembrolizumab. Survival curves of patients with low amounts of CD56dim NK cells treated with ipilimumab showed a trend toward longer survival. Normal baseline levels of CD56bright NK cells were significantly correlated with longer survival as compared to patients with high baseline levels. In conclusion, analysis of the amounts of total NK cells

  9. Magnetic resonance imaging perfusion is associated with disease severity and activity in multiple sclerosis

    Energy Technology Data Exchange (ETDEWEB)

    Sowa, Piotr [Oslo University Hospital, Department of Radiology and Nuclear Medicine, Oslo (Norway); University of Oslo, Institute of Clinical Medicine, Faculty of Medicine, Oslo (Norway); Owren Nygaard, Gro [Oslo University Hospital, Department of Neurology, Oslo (Norway); Bjoernerud, Atle [Intervention Center, Oslo University Hospital, Oslo (Norway); University of Oslo, Department of Physics, Oslo (Norway); Gulowsen Celius, Elisabeth [Oslo University Hospital, Department of Neurology, Oslo (Norway); University of Oslo, Institute of Health and Society, Faculty of Medicine, Oslo (Norway); Flinstad Harbo, Hanne [University of Oslo, Institute of Clinical Medicine, Faculty of Medicine, Oslo (Norway); Oslo University Hospital, Department of Neurology, Oslo (Norway); Kristiansen Beyer, Mona [Oslo University Hospital, Department of Radiology and Nuclear Medicine, Oslo (Norway); Oslo and Akershus University College of Applied Sciences, Department of Life Sciences and Health, Oslo (Norway)

    2017-07-15

    The utility of perfusion-weighted imaging in multiple sclerosis (MS) is not well investigated. The purpose of this study was to compare baseline normalized perfusion measures in subgroups of newly diagnosed MS patients. We wanted to test the hypothesis that this method can differentiate between groups defined according to disease severity and disease activity at 1 year follow-up. Baseline magnetic resonance imaging (MRI) including a dynamic susceptibility contrast perfusion sequence was performed on a 1.5-T scanner in 66 patients newly diagnosed with relapsing-remitting MS. From the baseline MRI, cerebral blood flow (CBF), cerebral blood volume (CBV), and mean transit time (MTT) maps were generated. Normalized (n) perfusion values were calculated by dividing each perfusion parameter obtained in white matter lesions by the same parameter obtained in normal-appearing white matter. Neurological examination was performed at baseline and at follow-up approximately 1 year later to establish the multiple sclerosis severity score (MSSS) and evidence of disease activity (EDA). Baseline normalized mean transit time (nMTT) was lower in patients with MSSS >3.79 (p = 0.016), in patients with EDA (p = 0.041), and in patients with both MSSS >3.79 and EDA (p = 0.032) at 1-year follow-up. Baseline normalized cerebral blood flow and normalized cerebral blood volume did not differ between these groups. Lower baseline nMTT was associated with higher disease severity and with presence of disease activity 1 year later in newly diagnosed MS patients. Further longitudinal studies are needed to confirm whether baseline-normalized perfusion measures can differentiate between disease severity and disease activity subgroups over time. (orig.)

  10. Magnetic resonance imaging perfusion is associated with disease severity and activity in multiple sclerosis

    International Nuclear Information System (INIS)

    Sowa, Piotr; Owren Nygaard, Gro; Bjoernerud, Atle; Gulowsen Celius, Elisabeth; Flinstad Harbo, Hanne; Kristiansen Beyer, Mona

    2017-01-01

    The utility of perfusion-weighted imaging in multiple sclerosis (MS) is not well investigated. The purpose of this study was to compare baseline normalized perfusion measures in subgroups of newly diagnosed MS patients. We wanted to test the hypothesis that this method can differentiate between groups defined according to disease severity and disease activity at 1 year follow-up. Baseline magnetic resonance imaging (MRI) including a dynamic susceptibility contrast perfusion sequence was performed on a 1.5-T scanner in 66 patients newly diagnosed with relapsing-remitting MS. From the baseline MRI, cerebral blood flow (CBF), cerebral blood volume (CBV), and mean transit time (MTT) maps were generated. Normalized (n) perfusion values were calculated by dividing each perfusion parameter obtained in white matter lesions by the same parameter obtained in normal-appearing white matter. Neurological examination was performed at baseline and at follow-up approximately 1 year later to establish the multiple sclerosis severity score (MSSS) and evidence of disease activity (EDA). Baseline normalized mean transit time (nMTT) was lower in patients with MSSS >3.79 (p = 0.016), in patients with EDA (p = 0.041), and in patients with both MSSS >3.79 and EDA (p = 0.032) at 1-year follow-up. Baseline normalized cerebral blood flow and normalized cerebral blood volume did not differ between these groups. Lower baseline nMTT was associated with higher disease severity and with presence of disease activity 1 year later in newly diagnosed MS patients. Further longitudinal studies are needed to confirm whether baseline-normalized perfusion measures can differentiate between disease severity and disease activity subgroups over time. (orig.)

  11. Causal mediation analysis with multiple mediators.

    Science.gov (United States)

    Daniel, R M; De Stavola, B L; Cousens, S N; Vansteelandt, S

    2015-03-01

    In diverse fields of empirical research-including many in the biological sciences-attempts are made to decompose the effect of an exposure on an outcome into its effects via a number of different pathways. For example, we may wish to separate the effect of heavy alcohol consumption on systolic blood pressure (SBP) into effects via body mass index (BMI), via gamma-glutamyl transpeptidase (GGT), and via other pathways. Much progress has been made, mainly due to contributions from the field of causal inference, in understanding the precise nature of statistical estimands that capture such intuitive effects, the assumptions under which they can be identified, and statistical methods for doing so. These contributions have focused almost entirely on settings with a single mediator, or a set of mediators considered en bloc; in many applications, however, researchers attempt a much more ambitious decomposition into numerous path-specific effects through many mediators. In this article, we give counterfactual definitions of such path-specific estimands in settings with multiple mediators, when earlier mediators may affect later ones, showing that there are many ways in which decomposition can be done. We discuss the strong assumptions under which the effects are identified, suggesting a sensitivity analysis approach when a particular subset of the assumptions cannot be justified. These ideas are illustrated using data on alcohol consumption, SBP, BMI, and GGT from the Izhevsk Family Study. We aim to bridge the gap from "single mediator theory" to "multiple mediator practice," highlighting the ambitious nature of this endeavor and giving practical suggestions on how to proceed. © 2014 The Authors Biometrics published by Wiley Periodicals, Inc. on behalf of International Biometric Society.

  12. Hanford Site technical baseline database. Revision 1

    International Nuclear Information System (INIS)

    Porter, P.E.

    1995-01-01

    This report lists the Hanford specific files (Table 1) that make up the Hanford Site Technical Baseline Database. Table 2 includes the delta files that delineate the differences between this revision and revision 0 of the Hanford Site Technical Baseline Database. This information is being managed and maintained on the Hanford RDD-100 System, which uses the capabilities of RDD-100, a systems engineering software system of Ascent Logic Corporation (ALC). This revision of the Hanford Site Technical Baseline Database uses RDD-100 version 3.0.2.2 (see Table 3). Directories reflect those controlled by the Hanford RDD-100 System Administrator. Table 4 provides information regarding the platform. A cassette tape containing the Hanford Site Technical Baseline Database is available

  13. Simultaneous Treatment of Missing Data and Measurement Error in HIV Research Using Multiple Overimputation.

    Science.gov (United States)

    Schomaker, Michael; Hogger, Sara; Johnson, Leigh F; Hoffmann, Christopher J; Bärnighausen, Till; Heumann, Christian

    2015-09-01

    Both CD4 count and viral load in HIV-infected persons are measured with error. There is no clear guidance on how to deal with this measurement error in the presence of missing data. We used multiple overimputation, a method recently developed in the political sciences, to account for both measurement error and missing data in CD4 count and viral load measurements from four South African cohorts of a Southern African HIV cohort collaboration. Our knowledge about the measurement error of ln CD4 and log10 viral load is part of an imputation model that imputes both missing and mismeasured data. In an illustrative example, we estimate the association of CD4 count and viral load with the hazard of death among patients on highly active antiretroviral therapy by means of a Cox model. Simulation studies evaluate the extent to which multiple overimputation is able to reduce bias in survival analyses. Multiple overimputation emphasizes more strongly the influence of having high baseline CD4 counts compared to both a complete case analysis and multiple imputation (hazard ratio for >200 cells/mm³ vs. <25 cells/mm³: 0.21 [95% confidence interval: 0.18, 0.24] vs. 0.38 [0.29, 0.48], and 0.29 [0.25, 0.34], respectively). Similar results are obtained when varying assumptions about measurement error, when using p-splines, and when evaluating time-updated CD4 count in a longitudinal analysis. The estimates of the association with viral load are slightly more attenuated when using multiple imputation instead of multiple overimputation. Our simulation studies suggest that multiple overimputation is able to reduce bias and mean squared error in survival analyses. Multiple overimputation, which can be used with existing software, offers a convenient approach to account for both missing and mismeasured data in HIV research.
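    The core idea can be sketched in a few lines: treat mismeasured cells as draws centred on their observed values with the assumed error variance, treat missing cells as ordinary imputations, analyse each completed dataset, and pool with Rubin's rules. Everything below (the data, the error variance, the mean as the target estimate) is a simplified assumption, not the authors' implementation:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def overimpute(x, meas_sd, m=20):
        """Return m completed copies of x (np.nan marks missing)."""
        copies = []
        obs = x[~np.isnan(x)]
        for _ in range(m):
            xi = x.copy()
            miss = np.isnan(xi)
            xi[miss] = rng.normal(obs.mean(), obs.std(), miss.sum())  # missing
            xi[~miss] = rng.normal(xi[~miss], meas_sd)                # mismeasured
            copies.append(xi)
        return copies

    def pool_means(copies):
        """Rubin's rules for a scalar estimate (here: the mean)."""
        ests = [c.mean() for c in copies]
        within = np.mean([c.var(ddof=1) / len(c) for c in copies])
        between = np.var(ests, ddof=1)
        total_var = within + (1 + 1 / len(copies)) * between
        return np.mean(ests), np.sqrt(total_var)

    # Simulated ln CD4-like values with two missing entries
    x = np.array([5.0, 5.5, np.nan, 6.0, 5.2, np.nan, 5.8])
    est, se = pool_means(overimpute(x, meas_sd=0.2))
    print(round(est, 2), round(se, 2))
    ```

    The pooled standard error combines within-imputation and between-imputation variance, which is how the extra uncertainty from mismeasurement and missingness is carried into the final estimate.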

  14. The effects of single instance, multiple instance, and general case training on generalized vending machine use by moderately and severely handicapped students.

    OpenAIRE

    Sprague, J R; Horner, R H

    1984-01-01

    This report provides an experimental analysis of generalized vending machine use by six moderately or severely retarded high school students. Dependent variables were training trials to criterion and performance on 10 nontrained "generalization" vending machines. A multiple-baseline design across subjects was used to compare three strategies for teaching generalized vending machine use. Training occurred with (a) a single vending machine, (b) three similar machines, or (c) three machines that...

  15. Coffee and Green Tea Consumption and Subsequent Risk of Malignant Lymphoma and Multiple Myeloma in Japan: The Japan Public Health Center-based Prospective Study.

    Science.gov (United States)

    Ugai, Tomotaka; Matsuo, Keitaro; Sawada, Norie; Iwasaki, Motoki; Yamaji, Taiki; Shimazu, Taichi; Sasazuki, Shizuka; Inoue, Manami; Kanda, Yoshinobu; Tsugane, Shoichiro

    2017-08-01

    Background: The aim of this study was to investigate the association of coffee and green tea consumption and the risk of malignant lymphoma and multiple myeloma in a large-scale population-based cohort study in Japan. Methods: In this analysis, a total of 95,807 Japanese subjects (45,937 men and 49,870 women; ages 40-69 years at baseline) of the Japan Public Health Center-based Prospective Study who completed a questionnaire about their coffee and green tea consumption were followed up until December 31, 2012, for an average of 18 years. HRs and 95% confidence intervals were estimated using a Cox regression model adjusted for potential confounders as a measure of association between the risk of malignant lymphoma and multiple myeloma associated with coffee and green tea consumption at baseline. Results: During the follow-up period, a total of 411 malignant lymphoma cases and 138 multiple myeloma cases were identified. Overall, our findings showed no significant association between coffee or green tea consumption and the risk of malignant lymphoma or multiple myeloma for both sexes. Conclusions: In this study, we observed no significant association between coffee or green tea consumption and the risk of malignant lymphoma or multiple myeloma. Impact: Our results do not support an association between coffee or green tea consumption and the risk of malignant lymphoma or multiple myeloma. Cancer Epidemiol Biomarkers Prev; 26(8); 1352-6. ©2017 AACR . ©2017 American Association for Cancer Research.
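    The study reports adjusted Cox-model hazard ratios; as a much cruder illustration of the same kind of cohort comparison, an incidence rate ratio with a log-scale confidence interval can be computed directly from event counts and person-years (all numbers below are simulated, not taken from the cohort):

    ```python
    import numpy as np

    # Crude incidence-rate-ratio sketch for an exposed vs. reference group.
    # The real analysis used a Cox regression adjusted for confounders.
    def rate_ratio(events_exp, pyears_exp, events_ref, pyears_ref):
        rr = (events_exp / pyears_exp) / (events_ref / pyears_ref)
        se = np.sqrt(1 / events_exp + 1 / events_ref)   # approx. SE of log(RR)
        lo, hi = rr * np.exp(-1.96 * se), rr * np.exp(1.96 * se)
        return rr, lo, hi

    rr, lo, hi = rate_ratio(events_exp=120, pyears_exp=500000,
                            events_ref=100, pyears_ref=400000)
    print(f"IRR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
    ```

    A confidence interval that straddles 1, as here, corresponds to the "no significant association" conclusion reported in the abstract.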

  16. Inherent Risk Factors for Nosocomial Infection in the Long Stay Critically Ill Child Without Known Baseline Immunocompromise: A Post Hoc Analysis of the CRISIS Trial.

    Science.gov (United States)

    Carcillo, Joseph A; Dean, J Michael; Holubkov, Richard; Berger, John; Meert, Kathleen L; Anand, Kanwaljeet J S; Zimmerman, Jerry; Newth, Christopher J; Harrison, Rick; Burr, Jeri; Willson, Douglas F; Nicholson, Carol; Bell, Michael J; Berg, Robert A; Shanley, Thomas P; Heidemann, Sabrina M; Dalton, Heidi; Jenkins, Tammara L; Doctor, Allan; Webster, Angie

    2016-11-01

    Nosocomial infection remains an important health problem in long stay (>3 days) pediatric intensive care unit (PICU) patients. Admission risk factors related to the development of nosocomial infection in long stay immune competent patients in particular are not known. A post hoc analysis of the previously published Critical Illness Stress induced Immune Suppression (CRISIS) prevention trial database was performed to identify baseline risk factors for nosocomial infection. Because there was no difference between treatment arms of that study in nosocomial infection in the population without known baseline immunocompromise, both arms were combined and the cohort that developed nosocomial infection was compared with the cohort that did not. There were 254 long stay PICU patients without known baseline immunocompromise. Ninety (35%) developed nosocomial infection, and 164 (65%) did not. Admission characteristics associated with increased nosocomial infection risk were increased age, higher Pediatric Risk of Mortality version III score, the diagnoses of trauma or cardiac arrest, and lymphopenia (P < 0.05); whereas trauma tended to be related to nosocomial infection development (P = 0.07). These data suggest that increasing age, cardiac arrest and lymphopenia predispose long stay PICU patients without known baseline immunocompromise to nosocomial infection. These findings may inform pre-hoc stratification randomization strategies for prospective studies designed to prevent nosocomial infection in this population.

  17. Studying the physics potential of long-baseline experiments in terms of new sensitivity parameters

    International Nuclear Information System (INIS)

    Singh, Mandip

    2016-01-01

    We investigate physics opportunities to constrain the leptonic CP-violation phase δ_CP through numerical analysis of working neutrino oscillation probability parameters, in the context of long-baseline experiments. Numerical analysis of two parameters, the "transition probability δ_CP phase sensitivity parameter (A^M)" and the "CP-violation probability δ_CP phase sensitivity parameter (A^CP)," as functions of beam energy and/or baseline has been carried out. It is an elegant technique to broadly analyze different experiments to constrain the δ_CP phase and also to investigate the mass hierarchy in the leptonic sector. Positive and negative values of the parameter A^CP, corresponding to either hierarchy in specific beam energy ranges, could be a very promising way to explore the mass hierarchy and the δ_CP phase. The keys to more robust bounds on the δ_CP phase are improvements of the involved detection techniques to explore lower energies and relatively long baseline regions with better experimental accuracy.

  18. Missing data treatments matter: an analysis of multiple imputation for anterior cervical discectomy and fusion procedures.

    Science.gov (United States)

    Ondeck, Nathaniel T; Fu, Michael C; Skrip, Laura A; McLynn, Ryan P; Cui, Jonathan J; Basques, Bryce A; Albert, Todd J; Grauer, Jonathan N

    2018-04-09

    The presence of missing data is a limitation of large datasets, including the National Surgical Quality Improvement Program (NSQIP). In addressing this issue, most studies use complete case analysis, which excludes cases with missing data, thus potentially introducing selection bias. Multiple imputation, a statistically rigorous approach that approximates missing data and preserves sample size, may be an improvement over complete case analysis. The present study aims to evaluate the impact of using multiple imputation in comparison with complete case analysis for assessing the associations between preoperative laboratory values and adverse outcomes following anterior cervical discectomy and fusion (ACDF) procedures. This is a retrospective review of prospectively collected data. Patients undergoing one-level ACDF were identified in NSQIP 2012-2015. Perioperative adverse outcome variables assessed included the occurrence of any adverse event, severe adverse events, and hospital readmission. Missing preoperative albumin and hematocrit values were handled using complete case analysis and multiple imputation. These preoperative laboratory levels were then tested for associations with 30-day postoperative outcomes using logistic regression. A total of 11,999 patients were included. Of this cohort, 63.5% of patients had missing preoperative albumin and 9.9% had missing preoperative hematocrit. When using complete case analysis, only 4,311 patients were studied. The removed patients were significantly younger, healthier, of a common body mass index, and male. Logistic regression analysis failed to identify either preoperative hypoalbuminemia or preoperative anemia as significantly associated with adverse outcomes. When employing multiple imputation, all 11,999 patients were included. Preoperative hypoalbuminemia was significantly associated with the occurrence of any adverse event and severe adverse events. Preoperative anemia was significantly associated with the
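    The contrast between the two approaches can be sketched on simulated data. The variables stand in for albumin and hematocrit, a single chained-equations imputation pass stands in for full multiple imputation, and nothing below reflects actual NSQIP fields:

    ```python
    import numpy as np
    from sklearn.experimental import enable_iterative_imputer  # noqa: F401
    from sklearn.impute import IterativeImputer
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    n = 2000
    albumin = rng.normal(4.0, 0.5, n)
    hematocrit = rng.normal(40, 4, n) + 2 * (albumin - 4)   # correlated labs
    logit = -1.0 - 1.5 * (albumin - 4)                      # low albumin -> risk
    outcome = rng.random(n) < 1 / (1 + np.exp(-logit))

    X = np.column_stack([albumin, hematocrit])
    X_missing = X.copy()
    X_missing[rng.random(n) < 0.6, 0] = np.nan              # 60% missing "albumin"

    # Complete-case analysis: drop every row with a missing value
    cc = ~np.isnan(X_missing).any(axis=1)
    model_cc = LogisticRegression().fit(X_missing[cc], outcome[cc])

    # Imputation: fill missing values from the correlated covariate, keep all rows
    X_imp = IterativeImputer(random_state=0).fit_transform(X_missing)
    model_imp = LogisticRegression().fit(X_imp, outcome)

    print(cc.sum(), len(X_imp))  # sample sizes actually analysed
    ```

    The sample-size gap in the last line is the crux of the abstract: complete-case analysis discards the majority of the cohort, and if missingness is related to patient characteristics the retained subset is no longer representative.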

  19. Price competition and equilibrium analysis in multiple hybrid channel supply chain

    Science.gov (United States)

    Kuang, Guihua; Wang, Aihu; Sha, Jin

    2017-06-01

    The amazing boom of the Internet and the logistics industry prompts more and more enterprises to sell commodities through multiple channels. Such market conditions make the participants of a multiple hybrid channel supply chain compete with each other in the traditional and direct channels at the same time. This paper builds a two-echelon supply chain model with a single manufacturer and a single retailer, each of whom can choose a different channel or channel combination for their own sales; it then discusses the price competition and calculates the equilibrium price under different sales channel selection combinations. Our analysis shows that, no matter whether the manufacturer and retailer choose the same or different channels to compete in, an equilibrium price does not necessarily exist in the multiple hybrid channel supply chain, and a wholesale price change is not always able to coordinate the supply chain completely. We also present the sufficient and necessary conditions for the existence of the equilibrium price and the coordinating wholesale price.
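    As a toy illustration of computing an equilibrium price by iterating best responses, consider a linear differentiated-demand duopoly; the demand form and all numbers are assumptions for illustration, not the paper's model:

    ```python
    # Two price-setting players (e.g. traditional vs. direct channel) with
    # linear differentiated demand q_i = a - b*p_i + c*p_j and unit cost.
    a, b, c, cost = 100.0, 2.0, 0.5, 10.0

    def best_response(p_other):
        # maximize (p - cost) * (a - b*p + c*p_other); first-order condition
        return (a + b * cost + c * p_other) / (2 * b)

    # iterate best responses until they settle at the Nash equilibrium
    p1 = p2 = cost
    for _ in range(100):
        p1, p2 = best_response(p2), best_response(p1)

    print(round(p1, 3), round(p2, 3))
    ```

    Because the best-response slope c/(2b) is below 1 here, the iteration contracts to a unique equilibrium; when that condition fails, the fixed point need not exist, which mirrors the paper's observation that an equilibrium price does not always exist.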

  20. Developing protocols for geochemical baseline studies: An example from the Coles Hill uranium deposit, Virginia, USA

    International Nuclear Information System (INIS)

    Levitan, Denise M.; Schreiber, Madeline E.; Seal, Robert R.; Bodnar, Robert J.; Aylor, Joseph G.

    2014-01-01

    Highlights: • We outline protocols for baseline geochemical surveys of stream sediments and water. • Regression on order statistics was used to handle non-detect data. • U concentrations in stream water near this unmined ore were below regulatory standards. • Concentrations of major and trace elements were correlated with stream discharge. • Methods can be applied to other extraction activities, including hydraulic fracturing. - Abstract: In this study, we determined baseline geochemical conditions in stream sediments and surface waters surrounding an undeveloped uranium deposit. Emphasis was placed on study design, including site selection to encompass geological variability and temporal sampling to encompass hydrological and climatic variability, in addition to statistical methods for baseline data analysis. The concentrations of most elements in stream sediments were above analytical detection limits, making them amenable to standard statistical analysis. In contrast, some trace elements in surface water had concentrations that were below the respective detection limits, making statistical analysis more challenging. We describe and compare statistical methods appropriate for concentrations that are below detection limits (non-detect data) and conclude that regression on order statistics provided the most rigorous analysis of our results, particularly for trace elements. Elevated concentrations of U and deposit-associated elements (e.g. Ba, Pb, and V) were observed in stream sediments and surface waters downstream of the deposit, but concentrations were below regulatory guidelines for the protection of aquatic ecosystems and for drinking water. Analysis of temporal trends indicated that concentrations of major and trace elements were most strongly related to stream discharge. These findings highlight the need for sampling protocols that will identify and evaluate the temporal and spatial variations in a thorough baseline study
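    Regression on order statistics can be sketched as follows for a single detection limit: regress the logs of the detected values on normal quantiles of their plotting positions, then impute the non-detects from the fitted line. This is simplified relative to full ROS (e.g. as used in environmental statistics), which handles multiple detection limits; the data are illustrative:

    ```python
    import numpy as np
    from scipy import stats

    def simple_ros(detected, n_censored):
        """Impute left-censored (non-detect) values, assuming lognormal data
        and a single detection limit below min(detected)."""
        n = len(detected) + n_censored
        order = np.sort(np.log(detected))
        # plotting positions for the detected (largest) observations
        ranks = np.arange(n_censored + 1, n + 1)
        pp = (ranks - 0.375) / (n + 0.25)          # Blom plotting positions
        q = stats.norm.ppf(pp)
        slope, intercept, *_ = stats.linregress(q, order)
        # impute the censored observations from the fitted line
        q_cens = stats.norm.ppf((np.arange(1, n_censored + 1) - 0.375) / (n + 0.25))
        imputed = np.exp(intercept + slope * q_cens)
        return np.concatenate([imputed, np.sort(detected)])

    detected = [0.8, 1.1, 1.5, 2.3, 3.0, 4.2]   # values above detection limit
    full = simple_ros(detected, n_censored=4)    # 4 non-detects
    print(round(full.mean(), 3))
    ```

    Unlike substituting half the detection limit for every non-detect, this approach lets summary statistics reflect the shape of the detected part of the distribution, which is why it is preferred for rigorous baseline analysis of trace elements.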

  1. The impact of sterile neutrinos on CP measurements at long baselines

    International Nuclear Information System (INIS)

    Gandhi, Raj; Kayser, Boris; Masud, Mehedi; Prakash, Suprabh

    2015-01-01

    With the Deep Underground Neutrino Experiment (DUNE) as an example, we show that the presence of even one sterile neutrino of mass ∼1 eV can significantly impact the measurements of CP violation in long baseline experiments. Using a probability level analysis and neutrino-antineutrino asymmetry calculations, we discuss the large magnitude of these effects, and show how they translate into significant event rate deviations at DUNE. Our results demonstrate that measurements which, when interpreted in the context of the standard three family paradigm, indicate CP conservation at long baselines may, in fact, hide large CP violation if there is a sterile state. Similarly, any data indicating the violation of CP cannot be properly interpreted within the standard paradigm unless the presence of sterile states of mass O(1 eV) can be conclusively ruled out. Our work underscores the need for a parallel and linked short baseline oscillation program and a highly capable near detector for DUNE, in order that its highly anticipated results on CP violation in the lepton sector may be correctly interpreted.

  2. 75 FR 66748 - Notice of Baseline Filings

    Science.gov (United States)

    2010-10-29

    ...- 000] Notice of Baseline Filings October 22, 2010. ONEOK Gas Transportation, L.L.C Docket No. PR11-68... above submitted a revised baseline filing of their Statement of Operating Conditions for services...

  3. Estimate the contribution of incubation parameters influence egg hatchability using multiple linear regression analysis.

    Science.gov (United States)

    Khalil, Mohamed H; Shebl, Mostafa K; Kosba, Mohamed A; El-Sabrout, Karim; Zaki, Nesma

    2016-08-01

    This research was conducted to determine the parameters most affecting the hatchability of indigenous and improved local chickens' eggs. Five parameters were studied (fertility, early and late embryonic mortalities, shape index, egg weight, and egg weight loss) in four strains, namely Fayoumi, Alexandria, Matrouh, and Montazah. Multiple linear regression was performed on the studied parameters to determine the most influential one on hatchability. The results showed significant differences in commercial and scientific hatchability among strains. The Alexandria strain had the highest significant commercial hatchability (80.70%). Highly significant differences in hatching chick weight among strains were also observed. Using multiple linear regression analysis, fertility made the greatest percent contribution (71.31%) to hatchability, and the lowest percent contributions were made by shape index and egg weight loss. Prediction of hatchability using multiple regression analysis could be a good tool to improve hatchability percentage in chickens.
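    One common way to apportion percent contributions among predictors, sketched here on simulated data (not the hatchability data): the product of each standardized regression coefficient with its zero-order correlation sums to R², so each product's share can be read as a percent contribution.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n = 200
    fertility = rng.normal(90, 5, n)          # simulated % fertility
    egg_weight = rng.normal(55, 3, n)         # simulated egg weight (g)
    hatchability = 0.8 * fertility + 0.1 * egg_weight + rng.normal(0, 3, n)

    def contributions(X, y):
        """Percent contribution of each predictor via beta * r decomposition."""
        Xs = (X - X.mean(0)) / X.std(0)
        ys = (y - y.mean()) / y.std()
        beta, *_ = np.linalg.lstsq(Xs, ys, rcond=None)  # standardized coefficients
        r = np.array([np.corrcoef(Xs[:, j], ys)[0, 1] for j in range(X.shape[1])])
        parts = beta * r                                 # parts sum to R^2
        return 100 * parts / parts.sum()

    X = np.column_stack([fertility, egg_weight])
    pct = contributions(X, hatchability)
    print(np.round(pct, 1))  # percent contribution of fertility, egg weight
    ```

    With predictors built this way, the fertility term dominates, mirroring the kind of result the abstract reports; when predictors are strongly collinear, the individual shares become harder to interpret.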

  4. Design and baseline data from the Gratitude Research in Acute Coronary Events (GRACE) study.

    Science.gov (United States)

    Huffman, Jeff C; Beale, Eleanor E; Beach, Scott R; Celano, Christopher M; Belcher, Arianna M; Moore, Shannon V; Suarez, Laura; Gandhi, Parul U; Motiwala, Shweta R; Gaggin, Hanna; Januzzi, James L

    2015-09-01

    Positive psychological constructs, especially optimism, have been linked with superior cardiovascular health. However, there has been minimal study of positive constructs in patients with acute coronary syndrome (ACS), despite the prevalence and importance of this condition. Furthermore, few studies have examined multiple positive psychological constructs and multiple cardiac-related outcomes within the same cohort to determine specifically which positive construct may affect a particular cardiac outcome. The Gratitude Research in Acute Coronary Events (GRACE) study examines the association between optimism/gratitude 2 weeks post-ACS and subsequent clinical outcomes. The primary outcome measure is physical activity at 6 months, measured via accelerometer, and key secondary outcome measures include levels of prognostic biomarkers and rates of nonelective cardiac rehospitalization at 6 months. These relationships will be analyzed using multivariable linear regression, controlling for sociodemographic, medical, and negative psychological factors; associations between baseline positive constructs and subsequent rehospitalizations will be assessed via Cox regression. Overall, 164 participants enrolled and completed the baseline 2-week assessment; the cohort had a mean age of 61.5 ± 10.5 years and was 84% men; this was the first ACS for 58% of participants. The GRACE study will determine whether optimism and gratitude are prospectively and independently associated with physical activity and other critical outcomes in the 6 months following an ACS. If these constructs are associated with superior outcomes, this may highlight the importance of these constructs as independent prognostic factors post-ACS. Copyright © 2015 Elsevier Inc. All rights reserved.

  5. The Danish National Multiple Myeloma Registry

    DEFF Research Database (Denmark)

    Gimsing, Peter; Holmström, Morten Orebo; Klausen, Tobias Wirenfelt

    2016-01-01

    AIM: The Danish National Multiple Myeloma Registry (DMMR) is a population-based clinical quality database established in January 2005. The primary aim of the database is to ensure that diagnosis and treatment of plasma cell dyscrasia are of uniform quality throughout the country. Another aim...... diagnosed patients with multiple myeloma (MM), smoldering MM, solitary plasmacytomas, and plasma cell leukemia in Denmark are registered annually; ~350 patients. Amyloid light-chain amyloidosis, POEMS syndrome (polyneuropathy, organomegaly, endocrinopathy, monoclonal gammopathy, and skin changes syndrome......), monoclonal gammopathy of undetermined significance and monoclonal gammopathy of undetermined significance with polyneuropathy have been registered since 2014. MAIN VARIABLES: The main registered variables at diagnosis are patient demographics, baseline disease characteristics, myeloma-defining events...

  6. Links between Bloom's Taxonomy and Gardener's Multiple Intelligences: The Issue of Textbook Analysis

    Science.gov (United States)

    Tabari, Mahmoud Abdi; Tabari, Iman Abdi

    2015-01-01

    The major thrust of this research was to investigate the cognitive aspect of the high school textbooks and interchange series, due to their extensive use, through content analysis based on Bloom's taxonomy and Gardner's Multiple Intelligences (MI). This study embraced two perspectives in a grid in order to broaden and deepen the analysis by…

  7. Baseline and postoperative levels of C-reactive protein and interleukins as inflammatory predictors of atrial fibrillation following cardiac surgery: a systematic review and meta-analysis.

    Science.gov (United States)

    Weymann, Alexander; Popov, Aron-Frederik; Sabashnikov, Anton; Ali-Hasan-Al-Saegh, Sadeq; Ryazanov, Mikhail; Tse, Gary; Mirhosseini, Seyed Jalil; Liu, Tong; Lotfaliani, Mohammadreza; Sedaghat, Meghdad; Baker, William L; Ghanei, Azam; Yavuz, Senol; Zeriouh, Mohamed; Izadpanah, Payman; Dehghan, Hamidreza; Testa, Luca; Nikfard, Maryam; Sá, Michel Pompeu Barros de Oliveira; Mashhour, Ahmed; Nombela-Franco, Luis; Rezaeisadrabadi, Mohammad; D'Ascenzo, Fabrizio; Zhigalov, Konstantin; Benedetto, Umberto; Aminolsharieh Najafi, Soroosh; Szczechowicz, Marcin; Roever, Leonardo; Meng, Lei; Gong, Mengqi; Deshmukh, Abhishek J; Palmerini, Tullio; Linde, Cecilia; Filipiak, Krzysztof J; Stone, Gregg W; Biondi-Zoccai, Giuseppe; Calkins, Hugh

    2018-01-01

    Postoperative atrial fibrillation (POAF) is a leading arrhythmia with high incidence and serious clinical implications after cardiac surgery. Cardiac surgery is associated with systemic inflammatory response including increase in cytokines and activation of endothelial and leukocyte responses. This systematic review and meta-analysis aimed to determine the strength of evidence for evaluating the association of inflammatory markers, such as C-reactive protein (CRP) and interleukins (IL), with POAF following isolated coronary artery bypass grafting (CABG), isolated valvular surgery, or a combination of these procedures. We conducted a meta-analysis of studies evaluating measured baseline (from one week before surgical procedures) and postoperative levels (until one week after surgical procedures) of inflammatory markers in patients with POAF. A comprehensive search was performed in electronic medical databases (Medline/PubMed, Web of Science, Embase, Science Direct, and Google Scholar) from their inception through May 2017 to identify relevant studies. A comprehensive subgroup analysis was performed to explore potential sources of heterogeneity. A literature search of all major databases retrieved 1014 studies. After screening, 42 studies were analysed including a total of 8398 patients. Pooled analysis showed baseline levels of CRP (standard mean difference [SMD] 0.457 mg/L, p < 0.001), baseline levels of IL-6 (SMD 0.398 pg/mL, p < 0.001), postoperative levels of CRP (SMD 0.576 mg/L, p < 0.001), postoperative levels of IL-6 (SMD 1.66 pg/mL, p < 0.001), postoperative levels of IL-8 (SMD 0.839 pg/mL, p < 0.001), and postoperative levels of IL-10 (SMD 0.590 pg/mL, p < 0.001) to be relevant inflammatory parameters significantly associated with POAF. Perioperative inflammation is proposed to be involved in the pathogenesis of POAF. Therefore, perioperative assessment of CRP, IL-6, IL-8, and IL-10 can help clinicians in terms of predicting and monitoring for POAF.

  8. 2016 Annual Technology Baseline (ATB) - Webinar Presentation

    Energy Technology Data Exchange (ETDEWEB)

    Cole, Wesley; Kurup, Parthiv; Hand, Maureen; Feldman, David; Sigrin, Benjamin; Lantz, Eric; Stehly, Tyler; Augustine, Chad; Turchi, Craig; Porro, Gian; O' Connor, Patrick; Waldoch, Connor

    2016-09-13

    This deck was presented for the 2016 Annual Technology Baseline Webinar. The presentation describes the Annual Technology Baseline, which is a compilation of current and future cost and performance data for electricity generation technologies.

  9. Baseline energy forecasts and analysis of alternative strategies for airline fuel conservation

    Energy Technology Data Exchange (ETDEWEB)

    1976-01-01

    The objectives of this study were to identify measures to reduce airline fuel consumption and to evaluate the impact of these alternatives on fuel consumption through 1990. To evaluate the impact of fuel conservation strategies, baseline forecasts of airline activity and energy consumption to 1990 were developed. Alternative policy options to reduce fuel consumption were identified and analyzed for three baseline levels of aviation activity within the framework of an aviation activity/energy consumption model. By combining the identified policy options, a strategy was developed to provide incentives for airline fuel conservation. Strategies and policy options were evaluated in terms of their impact on airline fuel conservation and the functioning of the airline industry as well as the associated social, environmental, and economic costs. The need for strategies to conserve airline fuel is based on air transportation's dependence upon petroleum; the current lack of alternative energy sources; the potential for disruption of air service due to crises in fuel availability such as experienced during the OPEC oil embargo; and the overall national goal of energy independence through energy conservation in all consuming sectors. The transition from the current situation to that described by strategies and policy options may require difficult adjustments by the airline industry in the short term. In the long term, however, conservation strategies can enhance the health of the airline industry as well as its fuel efficiency.

  10. Mediation Analysis with Multiple Mediators.

    Science.gov (United States)

    VanderWeele, T J; Vansteelandt, S

    2014-01-01

    Recent advances in the causal inference literature on mediation have extended traditional approaches to direct and indirect effects to settings that allow for interactions and non-linearities. In this paper, these approaches from causal inference are further extended to settings in which multiple mediators may be of interest. Two analytic approaches, one based on regression and one based on weighting, are proposed to estimate the effect mediated through multiple mediators and the effects through other pathways. The approaches proposed here accommodate exposure-mediator interactions and, to a certain extent, mediator-mediator interactions as well. The methods handle binary or continuous mediators and binary, continuous or count outcomes. When the mediators affect one another, the strategy of trying to assess direct and indirect effects one mediator at a time will in general fail; the approach given in this paper can still be used. A characterization is moreover given as to when the sum of the mediated effects for multiple mediators considered separately will be equal to the mediated effect of all of the mediators considered jointly. The approach proposed in this paper is robust to unmeasured common causes of two or more mediators.
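    The regression-based approach described above can be illustrated with a minimal product-of-coefficients sketch on synthetic data. This is not the authors' estimator: it assumes no interactions, linear models, and two mediators that do not affect one another (the special case where, as the abstract notes, the joint mediated effect equals the sum of the per-mediator effects). All variable names and coefficients are invented.

    ```python
    import numpy as np

    # Synthetic data: exposure X, two mediators M1, M2, outcome Y.
    # True indirect effects: a1*b1 = 0.5*0.4 = 0.20 and a2*b2 = 0.3*0.6 = 0.18;
    # true direct effect of X on Y is 0.2.
    rng = np.random.default_rng(0)
    n = 20000
    X = rng.normal(size=n)
    M1 = 0.5 * X + 0.1 * rng.normal(size=n)
    M2 = 0.3 * X + 0.1 * rng.normal(size=n)
    Y = 0.2 * X + 0.4 * M1 + 0.6 * M2 + 0.1 * rng.normal(size=n)

    def ols(y, *cols):
        """Least-squares fit with intercept; returns [intercept, slopes...]."""
        A = np.column_stack([np.ones(len(y))] + list(cols))
        return np.linalg.lstsq(A, y, rcond=None)[0]

    a1 = ols(M1, X)[1]                     # exposure -> M1 path
    a2 = ols(M2, X)[1]                     # exposure -> M2 path
    _, direct, b1, b2 = ols(Y, X, M1, M2)  # outcome model with both mediators
    joint_indirect = a1 * b1 + a2 * b2     # effect through both mediators jointly
    ```

    When the mediators are causally ordered, this one-at-a-time decomposition fails, which is exactly the situation the paper's joint approach is designed to handle.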

  11. An improved multiple linear regression and data analysis computer program package

    Science.gov (United States)

    Sidik, S. M.

    1972-01-01

    NEWRAP, an improved version of a previous multiple linear regression program called RAPIER, CREDUC, and CRSPLT, allows for a complete regression analysis including cross plots of the independent and dependent variables, correlation coefficients, regression coefficients, analysis of variance tables, t-statistics and their probability levels, rejection of independent variables, plots of residuals against the independent and dependent variables, and a canonical reduction of quadratic response functions useful in optimum seeking experimentation. A major improvement over RAPIER is that all regression calculations are done in double precision arithmetic.
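    NEWRAP itself is a historical Fortran package, but its core outputs (regression coefficients, residuals, and t-statistics used to reject independent variables) can be sketched in a few lines of numpy. This is an illustrative reconstruction, not the program's actual algorithm; note that numpy's default float64 arithmetic corresponds to the double precision improvement the abstract highlights.

    ```python
    import numpy as np

    # Synthetic response with known slopes (1.5, -2.0) and one irrelevant predictor.
    rng = np.random.default_rng(1)
    n = 200
    X = rng.normal(size=(n, 3))
    y = X @ np.array([1.5, -2.0, 0.0]) + rng.normal(scale=0.5, size=n)

    A = np.column_stack([np.ones(n), X])         # design matrix with intercept
    beta = np.linalg.lstsq(A, y, rcond=None)[0]  # float64 (double precision) throughout
    resid = y - A @ beta                         # residuals, for residual plots
    dof = n - A.shape[1]
    sigma2 = resid @ resid / dof                 # residual variance (ANOVA mean square)
    cov = sigma2 * np.linalg.inv(A.T @ A)        # coefficient covariance matrix
    t_stats = beta / np.sqrt(np.diag(cov))       # t-statistics per coefficient
    ```

    Predictors whose |t| falls below a chosen threshold would be candidates for the "rejection of independent variables" step the abstract describes.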

  12. Network meta-analysis of multiple outcome measures accounting for borrowing of information across outcomes.

    Science.gov (United States)

    Achana, Felix A; Cooper, Nicola J; Bujkiewicz, Sylwia; Hubbard, Stephanie J; Kendrick, Denise; Jones, David R; Sutton, Alex J

    2014-07-21

    Network meta-analysis (NMA) enables simultaneous comparison of multiple treatments while preserving randomisation. When summarising evidence to inform an economic evaluation, it is important that the analysis accurately reflects the dependency structure within the data, as correlations between outcomes may have implication for estimating the net benefit associated with treatment. A multivariate NMA offers a framework for evaluating multiple treatments across multiple outcome measures while accounting for the correlation structure between outcomes. The standard NMA model is extended to multiple outcome settings in two stages. In the first stage, information is borrowed across outcomes as well across studies through modelling the within-study and between-study correlation structure. In the second stage, we make use of the additional assumption that intervention effects are exchangeable between outcomes to predict effect estimates for all outcomes, including effect estimates on outcomes where evidence is either sparse or the treatment had not been considered by any one of the studies included in the analysis. We apply the methods to binary outcome data from a systematic review evaluating the effectiveness of nine home safety interventions on uptake of three poisoning prevention practices (safe storage of medicines, safe storage of other household products, and possession of poison centre control telephone number) in households with children. Analyses are conducted in WinBUGS using Markov Chain Monte Carlo (MCMC) simulations. Univariate and the first stage multivariate models produced broadly similar point estimates of intervention effects but the uncertainty around the multivariate estimates varied depending on the prior distribution specified for the between-study covariance structure. The second stage multivariate analyses produced more precise effect estimates while enabling intervention effects to be predicted for all outcomes, including intervention effects on
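    The multivariate NMA above is fitted by MCMC in WinBUGS and cannot be reproduced in a few lines, but the basic idea of borrowing strength across studies can be shown with a far simpler univariate analogue: a random-effects meta-analysis using the DerSimonian-Laird estimator. The data below are synthetic and the method is a deliberate simplification of what the paper does.

    ```python
    import numpy as np

    # Synthetic study-level effects: true mean 0.5, between-study SD 0.2,
    # within-study SE 0.1 for each of 50 studies.
    rng = np.random.default_rng(3)
    k, tau, se = 50, 0.2, 0.1
    theta_i = 0.5 + tau * rng.normal(size=k)   # study-specific true effects
    y = theta_i + se * rng.normal(size=k)      # observed study effects
    v = np.full(k, se ** 2)                    # within-study variances

    w = 1 / v                                  # fixed-effect (inverse-variance) weights
    ybar = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - ybar) ** 2)            # Cochran's Q heterogeneity statistic
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2_hat = max(0.0, (Q - (k - 1)) / c)     # DerSimonian-Laird tau^2 estimate

    w_re = 1 / (v + tau2_hat)                  # random-effects weights
    pooled = np.sum(w_re * y) / np.sum(w_re)   # pooled effect across studies
    pooled_se = np.sqrt(1 / np.sum(w_re))
    ```

    The multivariate model in the paper extends this by additionally modelling within- and between-study correlations across outcomes, so information is borrowed across outcomes as well as across studies.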

  13. Detailed Load Analysis of the baseline 5MW DeepWind Concept

    DEFF Research Database (Denmark)

    Verelst, David Robert; Aagaard Madsen, Helge; Kragh, Knud Abildgaard

    This report presents an overview of the design of the DeepWind vertical axis floating wind turbine. One could present this as the "final design", however, it is hoped that more design iterations will follow in the future, but under the umbrella of new and different projects. The state of the design...... that is reported here will be called version 2.2.0. The numbering system has just been introduced at the present design version, but the first 5MW design called the "baseline design" [1] was developed in 2011 and this will therefore be called version 1.0.0. In this report, the design loads of the DeepWind 5 MW...

  14. Neutron-multiplication measurement instrument

    Energy Technology Data Exchange (ETDEWEB)

    Nixon, K.V.; Dowdy, E.J.; France, S.W.; Millegan, D.R.; Robba, A.A.

    1982-01-01

    The Advanced Nuclear Technology Group of the Los Alamos National Laboratory is now using intelligent data-acquisition and analysis instrumentation for determining the multiplication of nuclear material. Earlier instrumentation, such as the large NIM-crate systems, depended on house power and required additional computation to determine multiplication or to estimate error. The portable, battery-powered multiplication measurement unit, with advanced computational power, acquires data, calculates multiplication, and completes error analysis automatically. Thus, the multiplication is determined easily and an available error estimate enables the user to judge the significance of results.

  15. Neutron multiplication measurement instrument

    International Nuclear Information System (INIS)

    Nixon, K.V.; Dowdy, E.J.; France, S.W.; Millegan, D.R.; Robba, A.A.

    1983-01-01

    The Advanced Nuclear Technology Group of the Los Alamos National Laboratory is now using intelligent data-acquisition and analysis instrumentation for determining the multiplication of nuclear material. Earlier instrumentation, such as the large NIM-crate systems, depended on house power and required additional computation to determine multiplication or to estimate error. The portable, battery-powered multiplication measurement unit, with advanced computational power, acquires data, calculates multiplication, and completes error analysis automatically. Thus, the multiplication is determined easily and an available error estimate enables the user to judge the significance of results

  16. Neutron-multiplication measurement instrument

    International Nuclear Information System (INIS)

    Nixon, K.V.; Dowdy, E.J.; France, S.W.; Millegan, D.R.; Robba, A.A.

    1982-01-01

    The Advanced Nuclear Technology Group of the Los Alamos National Laboratory is now using intelligent data-acquisition and analysis instrumentation for determining the multiplication of nuclear material. Earlier instrumentation, such as the large NIM-crate systems, depended on house power and required additional computation to determine multiplication or to estimate error. The portable, battery-powered multiplication measurement unit, with advanced computational power, acquires data, calculates multiplication, and completes error analysis automatically. Thus, the multiplication is determined easily and an available error estimate enables the user to judge the significance of results

  17. Using Functional Analysis Methodology to Evaluate Effects of an Atypical Antipsychotic on Severe Problem Behavior

    Science.gov (United States)

    Danov, Stacy E.; Tervo, Raymond; Meyers, Stephanie; Symons, Frank J.

    2012-01-01

    The atypical antipsychotic medication aripiprazole was evaluated using a randomized AB multiple baseline, double-blind, placebo-controlled design for the treatment of severe problem behavior with 4 children with intellectual and developmental disabilities. Functional analysis (FA) was conducted concurrent with the medication evaluation to…

  18. MULTIPLE OBJECTS

    Directory of Open Access Journals (Sweden)

    A. A. Bosov

    2015-04-01

    Full Text Available Purpose. The development of complicated techniques of production and management processes, information systems, computer science, and applied objects of systems theory requires improvement of mathematical methods and new approaches for research into application systems. The variety and diversity of subject systems make it necessary to develop a model that generalizes classical sets and their development, sets of sets. Multiple objects, unlike sets, are constructed by multiple structures and represented by structure and content. The aim of the work is the analysis of multiple structures generating multiple objects and the further development of operations on these objects in application systems. Methodology. To achieve the objectives of the research, the structure of a multiple object is represented as a constructive trio consisting of a medium, a signature, and axiomatics. A multiple object is determined by structure and content, and is represented by a hybrid superposition composed of sets, multisets, ordered sets (lists), and heterogeneous sets (sequences, tuples). Findings. In this paper we study the properties and characteristics of the components of hybrid multiple objects of complex systems, propose assessments of their complexity, and show the rules of internal and external operations on objects of implementation. We introduce a relation of arbitrary order over multiple objects and define the description of functions and mappings on objects of multiple structures. Originality. In this paper we consider the development of multiple structures generating multiple objects. Practical value. The transition from abstract to subject multiple structures requires transformation of the system and multiple objects. Transformation involves three successive stages: specification (binding to the domain), interpretation (multiple sites), and particularization (goals). The proposed approach describes systems based on hybrid sets.

  19. Ca analysis: an Excel based program for the analysis of intracellular calcium transients including multiple, simultaneous regression analysis.

    Science.gov (United States)

    Greensmith, David J

    2014-01-01

    Here I present an Excel based program for the analysis of intracellular Ca transients recorded using fluorescent indicators. The program can perform all the necessary steps which convert recorded raw voltage changes into meaningful physiological information. The program performs two fundamental processes. (1) It can prepare the raw signal by several methods. (2) It can then be used to analyze the prepared data to provide information such as absolute intracellular Ca levels. Also, the rates of change of Ca can be measured using multiple, simultaneous regression analysis. I demonstrate that this program performs equally well as commercially available software, but has numerous advantages, namely creating a simplified, self-contained analysis workflow. Copyright © 2013 The Author. Published by Elsevier Ireland Ltd. All rights reserved.
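    The program itself is Excel/VBA, but one of the analysis steps it describes (estimating a rate of change of Ca by regression) can be sketched in Python. The sketch below recovers the decay time constant of a noise-free synthetic transient by linearizing the decay; the signal parameters are invented, and a real recording would additionally require noise handling and baseline estimation, which the program's signal-preparation step provides.

    ```python
    import numpy as np

    # Synthetic Ca transient decay: baseline 0.1, amplitude 0.9, tau = 0.15 s.
    t = np.linspace(0.0, 1.0, 500)
    baseline, amp, tau = 0.1, 0.9, 0.15
    ca = baseline + amp * np.exp(-t / tau)

    # Linearize the decay and recover 1/tau by simple linear regression:
    # log(ca - baseline) = log(amp) - t/tau
    z = np.log(ca - baseline)
    slope, intercept = np.polyfit(t, z, 1)
    tau_hat = -1.0 / slope
    amp_hat = np.exp(intercept)
    ```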

  20. Multiple predictor smoothing methods for sensitivity analysis: Example results

    International Nuclear Information System (INIS)

    Storlie, Curtis B.; Helton, Jon C.

    2008-01-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described in the first part of this presentation: (i) locally weighted regression (LOESS), (ii) additive models, (iii) projection pursuit regression, and (iv) recursive partitioning regression. In this, the second and concluding part of the presentation, the indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present
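    The first of the four techniques listed, LOESS, can be sketched in a few lines: a local linear fit with tricube weights at each evaluation point. The example below is a minimal illustration on synthetic data, not the WIPP analysis; it shows the abstract's central point, that a smoother captures a nonlinear input-output relation (here y = sin(x)) far better than a single linear regression does.

    ```python
    import numpy as np

    def loess(x, y, frac=0.3):
        """Basic LOESS: local linear fit with tricube weights at each point."""
        n = len(x)
        k = max(int(frac * n), 2)
        fitted = np.empty(n)
        for i in range(n):
            d = np.abs(x - x[i])
            idx = np.argsort(d)[:k]                        # k nearest neighbours
            w = (1 - (d[idx] / d[idx].max()) ** 3) ** 3    # tricube weights
            sw = np.sqrt(w)                                # weighted least squares
            A = np.column_stack([np.ones(k), x[idx]])
            beta = np.linalg.lstsq(A * sw[:, None], y[idx] * sw, rcond=None)[0]
            fitted[i] = beta[0] + beta[1] * x[i]
        return fitted

    # Nonlinear input-output relation: y = sin(x) + noise.
    rng = np.random.default_rng(4)
    x = np.linspace(0.0, 2 * np.pi, 200)
    y = np.sin(x) + 0.1 * rng.normal(size=x.size)

    smooth = loess(x, y, frac=0.2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    r2_loess = 1 - np.sum((y - smooth) ** 2) / ss_tot
    slope, intercept = np.polyfit(x, y, 1)                 # plain linear regression
    r2_linear = 1 - np.sum((y - (slope * x + intercept)) ** 2) / ss_tot
    ```

    In a sensitivity analysis, the gain in explained variance when moving from the linear fit to the smoother is one way to flag inputs with nonlinear influence on the model output.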

  1. Analysis of underlying and multiple-cause mortality data.

    Science.gov (United States)

    Moussa, M A; El Sayed, A M; Sugathan, T N; Khogali, M M; Verma, D

    1992-01-01

    "A variety of life table models were used for the analysis of the (1984-86) Kuwaiti cause-specific mortality data. These models comprised total mortality, multiple-decrement, cause-elimination, cause-delay and disease dependency. The models were illustrated by application to a set of four chronic diseases: hypertensive, ischaemic heart, cerebrovascular and diabetes mellitus. The life table methods quantify the relative weights of different diseases as hazards to mortality after adjustment for other causes. They can also evaluate the extent of dependency between underlying cause of death and other causes mentioned on [the] death certificate using an extended underlying-cause model." (SUMMARY IN FRE AND ITA) excerpt
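    The cause-elimination idea in the abstract can be illustrated with a toy multiple-decrement computation: under competing constant hazards, remove one cause and recompute life expectancy. The hazards below are invented round numbers, not the Kuwaiti data, and the constant-hazard assumption is a deliberate simplification of a real life table.

    ```python
    import numpy as np

    # Invented constant annual hazards per cause (not the study's data).
    hazards = {"ischaemic_heart": 0.010, "cerebrovascular": 0.005,
               "diabetes": 0.003, "other": 0.020}
    ages = np.arange(0, 110)

    def remaining_life_expectancy(h):
        """Discrete-time life expectancy under total hazard sum(h.values())."""
        total = sum(h.values())
        survival = np.exp(-total * ages)       # S(t) under competing hazards
        return survival.sum()                  # approximates the integral of S(t)

    e_all = remaining_life_expectancy(hazards)

    # Cause-elimination: drop one cause and recompute.
    without_ihd = {k: v for k, v in hazards.items() if k != "ischaemic_heart"}
    e_no_ihd = remaining_life_expectancy(without_ihd)
    gain = e_no_ihd - e_all                    # years gained if IHD were eliminated
    ```

    The life table models in the paper go further, adjusting each cause for the others and for dependency between the underlying cause and other causes on the death certificate.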

  2. The impact of GPS receiver modifications and ionospheric activity on Swarm baseline determination

    Science.gov (United States)

    Mao, X.; Visser, P. N. A. M.; van den IJssel, J.

    2018-05-01

    The European Space Agency (ESA) Swarm mission is a satellite constellation launched on 22 November 2013 aiming at observing the Earth geomagnetic field and its temporal variations. The three identical satellites are equipped with high-precision dual-frequency Global Positioning System (GPS) receivers, which make the constellation an ideal test bed for baseline determination. From October 2014 to August 2016, a number of GPS receiver modifications and a new GPS Receiver Independent Exchange Format (RINEX) converter were implemented. Moreover, the on-board GPS receiver performance has been influenced by the ionospheric scintillations. The impact of these factors is assessed for baseline determination of the pendulum formation flying Swarm-A and -C satellites. In total 30 months of data - from 15 July 2014 to the end of 2016 - is analyzed. The assessment includes analysis of observation residuals, success rate of GPS carrier phase ambiguity fixing, a consistency check between the so-called kinematic and reduced-dynamic baseline solution, and validations of orbits by comparing with Satellite Laser Ranging (SLR) observations. External baseline solutions from The German Space Operations Center (GSOC) and Astronomisches Institut - Universität Bern (AIUB) are also included in the comparison. Results indicate that the GPS receiver modifications and RINEX converter changes are effective to improve the baseline determination. This research eventually shows a consistency level of 9.3/4.9/3.0 mm between kinematic and reduced-dynamic baselines in the radial/along-track/cross-track directions. On average 98.3% of the epochs have kinematic solutions. Consistency between TU Delft and external reduced-dynamic baseline solutions is at the 1 mm level in all directions.

  3. Convergence Analysis for the Multiplicative Schwarz Preconditioned Inexact Newton Algorithm

    KAUST Repository

    Liu, Lulu

    2016-10-26

    The multiplicative Schwarz preconditioned inexact Newton (MSPIN) algorithm, based on decomposition by field type rather than by subdomain, was recently introduced to improve the convergence of systems with unbalanced nonlinearities. This paper provides a convergence analysis of the MSPIN algorithm. Under reasonable assumptions, it is shown that MSPIN is locally convergent, and desired superlinear or even quadratic convergence can be obtained when the forcing terms are picked suitably.

  4. Convergence Analysis for the Multiplicative Schwarz Preconditioned Inexact Newton Algorithm

    KAUST Repository

    Liu, Lulu; Keyes, David E.

    2016-01-01

    The multiplicative Schwarz preconditioned inexact Newton (MSPIN) algorithm, based on decomposition by field type rather than by subdomain, was recently introduced to improve the convergence of systems with unbalanced nonlinearities. This paper provides a convergence analysis of the MSPIN algorithm. Under reasonable assumptions, it is shown that MSPIN is locally convergent, and desired superlinear or even quadratic convergence can be obtained when the forcing terms are picked suitably.

  5. mma: An R Package for Mediation Analysis with Multiple Mediators

    OpenAIRE

    Qingzhao Yu; Bin Li

    2017-01-01

    Mediation refers to the effect transmitted by mediators that intervene in the relationship between an exposure and a response variable. Mediation analysis has been broadly studied in many fields. However, it remains a challenge for researchers to consider complicated associations among variables and to differentiate individual effects from multiple mediators. [1] proposed general definitions of mediation effects that were adaptable to all different types of response (categorical or continuous...

  6. Associations between baseline allergens and polysensitization

    DEFF Research Database (Denmark)

    Carlsen, B.C.; Menne, T.; Johansen, J.D.

    2008-01-01

    Background: Identification of patients at risk of developing polysensitization is not possible at present. An association between weak sensitizers and polysensitization has been hypothesized. Objectives: To examine associations of 21 allergens in the European baseline series to polysensitization....... Patients/Methods: From a database-based study with 14 998 patients patch tested with the European baseline series between 1985 and 2005, a group of 759 (5.1%) patients were polysensitized. Odds ratios were calculated to determine the relative contribution of each allergen to polysensitization. Results...... denominator for the association between the allergens and the polysensitization was apparent, and any association, whether positive or negative, was relatively low. Based on these results, sensitization to specific baseline allergens cannot be used as risk indicators for polysensitization. Publication date: 2008...

  7. Associations between baseline allergens and polysensitization

    DEFF Research Database (Denmark)

    Carlsen, Berit Christina; Menné, Torkil; Johansen, Jeanne Duus

    2008-01-01

    BACKGROUND: Identification of patients at risk of developing polysensitization is not possible at present. An association between weak sensitizers and polysensitization has been hypothesized. OBJECTIVES: To examine associations of 21 allergens in the European baseline series to polysensitization....... PATIENTS/METHODS: From a database-based study with 14 998 patients patch tested with the European baseline series between 1985 and 2005, a group of 759 (5.1%) patients were polysensitized. Odds ratios were calculated to determine the relative contribution of each allergen to polysensitization. RESULTS...... denominator for the association between the allergens and the polysensitization was apparent, and any association, whether positive or negative, was relatively low. Based on these results, sensitization to specific baseline allergens cannot be used as risk indicators for polysensitization....

  8. Using multiple-accumulator CMACs to improve efficiency of the X part of an input-buffered FX correlator

    Science.gov (United States)

    Lapshev, Stepan; Hasan, S. M. Rezaul

    2017-04-01

    This paper presents the approach of using complex multiplier-accumulators (CMACs) with multiple accumulators to reduce the total number of memory operations in an input-buffered architecture for the X part of an FX correlator. A processing unit of this architecture uses an array of CMACs that are reused for different groups of baselines. The disadvantage of processing correlations in this way is that each input data sample has to be read multiple times from the memory because each input signal is used in many of these baseline groups. While a one-accumulator CMAC cannot switch to a different baseline until it is finished integrating the current one, a multiple-accumulator CMAC can. Thus, the array of multiple-accumulator CMACs can switch between processing different baselines that share some input signals at any moment to reuse the current data in the processing buffers. In this way significant reductions in the number of memory read operations are achieved with only a few accumulators per CMAC. For example, for a large number of input signals three-accumulator CMACs reduce the total number of memory operations by more than a third. Simulated energy measurements of four VLSI designs in a high-performance 28 nm CMOS technology are presented in this paper to demonstrate that using multiple accumulators can also lead to reduced power dissipation of the processing array. Using three accumulators as opposed to one has been found to reduce the overall energy of 8-bit CMACs by 1.4% through the reduction of the switching activity within their circuits, which is in addition to a more than 30% reduction in the memory.

  9. TWRS technical baseline database manager definition document

    International Nuclear Information System (INIS)

    Acree, C.D.

    1997-01-01

    This document serves as a guide for using the TWRS Technical Baseline Database Management Systems Engineering (SE) support tool in performing SE activities for the Tank Waste Remediation System (TWRS). This document will provide a consistent interpretation of the relationships between the TWRS Technical Baseline Database Management software and the present TWRS SE practices. The Database Manager currently utilized is the RDD-1000 System manufactured by the Ascent Logic Corporation. In other documents, the term RDD-1000 may be used interchangeably with TWRS Technical Baseline Database Manager

  10. Links between Bloom's Taxonomy and Gardener's Multiple Intelligences: The issue of Textbook Analysis

    Directory of Open Access Journals (Sweden)

    Mahmoud Abdi Tabari

    2015-02-01

    Full Text Available The major thrust of this research was to investigate the cognitive aspect of the high school textbooks and interchange series, due to their extensive use, through content analysis based on Bloom's taxonomy and Gardner's Multiple Intelligences (MI). This study embraced two perspectives in a grid in order to broaden and deepen the analysis by determining the numbers and the types of intelligences with respect to their learning objectives tapped in the textbooks and comparing them. Through codification of Bloom's learning objectives and Gardner's MI, the results showed that there was a significant difference between the numbers of intelligences with respect to their learning objectives in the textbooks. However, the interchange series enjoyed a large number of the spatial and the interpersonal intelligences across eight levels of learning objectives, whereas they had the least number of the intrapersonal, the musical, and the bodily-kinesthetic intelligences across the knowledge, understanding, and application levels. Keywords: learning objectives, multiple intelligences, textbook analysis

  11. Fat Metaplasia on Sacroiliac Joint Magnetic Resonance Imaging at Baseline Is Associated with Spinal Radiographic Progression in Patients with Axial Spondyloarthritis.

    Directory of Open Access Journals (Sweden)

    Kwi Young Kang

    Full Text Available To study the relationship between inflammatory and structural lesions in the sacroiliac joints (SIJs) on MRI and spinal progression observed on conventional radiographs in patients with axial spondyloarthritis (axSpA). One hundred and ten patients who fulfilled the ASAS axSpA criteria were enrolled. All underwent SIJ MRI at baseline and lumbar spine radiographs at baseline and after 2 years. Inflammatory and structural lesions on SIJ MRI were scored using the SPondyloArthritis Research Consortium of Canada (SPARCC) method. Spinal radiographs were scored using the Stoke AS Spinal Score (SASSS). Multivariate logistic regression analysis was performed to identify predictors of spinal progression. Among the 110 patients, 25 (23%) showed significant radiographic progression (change of SASSS ≥ 2) over 2 years. There was no change in the SASSS over 2 years according to the type of inflammatory lesion. Patients with fat metaplasia or ankyloses on baseline MRI showed a significantly higher SASSS at 2 years than those without (p<0.001). According to univariate logistic regression analysis, age at diagnosis, HLA-B27 positivity, the presence of fat metaplasia, erosion, and ankyloses on SIJ MRI, increased baseline CRP levels, and the presence of syndesmophytes at baseline were associated with spinal progression over 2 years. Multivariate analysis identified syndesmophytes and severe fat metaplasia on baseline SIJ MRI as predictive of spinal radiographic progression (OR 14.74 and 5.66, respectively). Inflammatory lesions in the SIJs on baseline MRI were not associated with spinal radiographic progression. However, fat metaplasia at baseline was significantly associated with spinal progression after 2 years.
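    The multivariate logistic regression step in this record can be sketched with a minimal Newton (iteratively reweighted least squares) fit on synthetic data. The predictors, coefficients, and cohort below are invented for illustration; they are not the study's data, and the fitted odds ratios are not the paper's estimates.

    ```python
    import numpy as np

    # Synthetic cohort: two binary baseline predictors of radiographic progression.
    # Coefficients are invented for illustration only.
    rng = np.random.default_rng(5)
    n = 5000
    syndesmophytes = rng.binomial(1, 0.3, n)
    fat_metaplasia = rng.binomial(1, 0.4, n)
    X = np.column_stack([np.ones(n), syndesmophytes, fat_metaplasia])
    beta_true = np.array([-2.0, 1.5, 0.8])
    p_true = 1 / (1 + np.exp(-X @ beta_true))
    progressed = rng.binomial(1, p_true)       # binary outcome

    def fit_logistic(X, y, iters=25):
        """Logistic regression via iteratively reweighted least squares (Newton)."""
        beta = np.zeros(X.shape[1])
        for _ in range(iters):
            p = 1 / (1 + np.exp(-X @ beta))
            W = p * (1 - p)                    # IRLS weights
            H = X.T @ (X * W[:, None])         # Hessian of the log-likelihood
            beta = beta + np.linalg.solve(H, X.T @ (y - p))
        return beta

    beta_hat = fit_logistic(X, progressed)
    odds_ratios = np.exp(beta_hat[1:])         # adjusted ORs for the two predictors
    ```

    Exponentiating the fitted coefficients yields the adjusted odds ratios reported in studies of this kind.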

  12. A Girl With Multiple Disabilities Increases Object Manipulation and Reduces Hand Mouthing Through a Microswitch-Based Program

    NARCIS (Netherlands)

    Lancioni, G.E.; Singh, N.N.; O'Reilly, M.F.; Sigafoos, J.; Didden, H.C.M.; Oliva, D.; Cingolani, E.

    2008-01-01

    The study was an effort to help a girl with multiple disabilities increase object manipulation responses and reduce hand mouthing, carried out according to an ABAB sequence (in which A represented baseline phases; B, treatment phases) and including a 3-month follow-up. During the baseline phases, a

  13. Business-as-Unusual: Existing policies in energy model baselines

    International Nuclear Information System (INIS)

    Strachan, Neil

    2011-01-01

    Baselines are generally accepted as a key input assumption in long-term energy modelling, but energy models have traditionally been poor at identifying baseline assumptions. Notably, transparency on the current policy content of model baselines is now especially critical, as long-term climate mitigation policies have been underway for a number of years. This paper argues that the range of existing energy and emissions policies is an integral part of any long-term baseline, and hence already represents a 'with-policy' baseline, termed here a Business-as-Unusual (BAuU). Crucially, existing energy policies are not a sunk effort: as the impacts of existing policy initiatives are targeted at future years, they may be revised through iterative policy making, and their quantitative effectiveness requires ex-post verification. To assess the long-term role of existing policies in energy modelling, currently identified UK policies are explicitly stripped out of the UK MARKAL Elastic Demand (MED) optimisation energy system model, to generate a BAuU (with-policy) and a REF (without-policy) baseline. In terms of long-term mitigation costs, policy-baseline assumptions are comparable to another key exogenous modelling assumption - that of global fossil fuel prices. Therefore, best practice in energy modelling would be to have both a no-policy reference baseline and a current-policy reference baseline (BAuU). At a minimum, energy modelling studies should include a transparent assessment of the current policy contained within the baseline. Clearly identifying and comparing policy-baseline assumptions is required for cost-effective and objective policy making; otherwise energy models will underestimate the true cost of long-term emissions reductions.

  14. Energy and emission scenarios for China in the 21st century - exploration of baseline development and mitigation options

    International Nuclear Information System (INIS)

    Vuuren, Detlef van; Zhou Fengqi; Vries, Bert de; Jiang Kejun; Graveland, Cor; Li Yun

    2003-01-01

    In this paper, we have used the simulation model IMAGE/TIMER to develop a set of energy and emission scenarios for China between 1995 and 2100, based on the global baseline scenarios published by the IPCC. The purpose of the study was to explore possible baseline developments and available options to mitigate emissions. The two main baseline scenarios of the study differ, among other things, in the openness of the Chinese economy and in economic growth, but both indicate a rapid growth in carbon emissions (2.0% and 2.6% per year in the 2000-2050 period). The baseline scenario analysis also shows that an orientation towards environmental sustainability can not only reduce other environmental pressures but also lower carbon emissions. In the mitigation analysis, a large number of options have been evaluated in terms of impacts on investments, user costs, fuel import costs and emissions. It is found that a large potential exists to mitigate carbon emissions in China, among others in the form of energy efficiency improvement (with large co-benefits) and measures in the electricity sector. Combining all options considered, it appears to be possible to reduce emissions compared to the baseline scenarios by 50%

  15. Analysis of Altered Baseline Brain Activity in Drug-Naive Adult Patients with Social Anxiety Disorder Using Resting-State Functional MRI

    OpenAIRE

    Qiu, Changjian; Feng, Yuan; Meng, Yajing; Liao, Wei; Huang, Xiaoqi; Lui, Su; Zhu, Chunyan; Chen, Huafu; Gong, Qiyong; Zhang, Wei

    2015-01-01

    Objective We hypothesize that the amplitude of low-frequency fluctuations (ALFF) is involved in the altered regional baseline brain function in social anxiety disorder (SAD). The aim of the study was to analyze the altered baseline brain activity in drug-naive adult patients with SAD. Methods We investigated spontaneous and baseline brain activities by obtaining the resting-state functional magnetic resonance imaging data of 20 drug-naïve adult SAD patients and 19 healthy controls. Voxels wer...

  16. Pakistan, Sindh Province - Baseline Indicators System : Baseline Procurement Performance Assessment Report

    OpenAIRE

    World Bank

    2009-01-01

    This document provides an assessment of the public procurement system in Sindh province using the baseline indicators system developed by the Development Assistance Committee of the Organization for Economic Cooperation and Development (OECD-DAC). For this assessment, interviews and discussions were held with stakeholders from the public and private sectors as well as civil society. Developing...

  17. Multiple cell CPV nickel-hydrogen battery

    Science.gov (United States)

    Jones, Ken R.; Zagrodnik, Jeffrey P.

    1991-01-01

    Johnson Controls, Inc. has developed a multiple cell CPV nickel-hydrogen battery that offers significant weight, volume, and cost advantages for aerospace applications. The baseline design was successfully demonstrated through the testing of a 26-cell prototype, which completed over 7000 low earth orbit cycles at 44 percent depth of discharge. Prototype designs using both nominal 5 and 10 inch diameter vessels are currently being developed for a variety of customers and applications.

  18. Life Support Baseline Values and Assumptions Document

    Science.gov (United States)

    Anderson, Molly S.; Ewert, Michael K.; Keener, John F.

    2018-01-01

    The Baseline Values and Assumptions Document (BVAD) provides analysts, modelers, and other life support researchers with a common set of values and assumptions which can be used as a baseline in their studies. This baseline, in turn, provides a common point of origin from which many studies in the community may depart, making research results easier to compare and providing researchers with reasonable values to assume for areas outside their experience. This document identifies many specific physical quantities that define life support systems, serving as a general reference for spacecraft life support system technology developers.

  19. Video Modeling for Children and Adolescents with Autism Spectrum Disorder: A Meta-Analysis

    Science.gov (United States)

    Thompson, Teresa Lynn

    2014-01-01

    The objective of this research was to conduct a meta-analysis to examine existing research studies on video modeling as an effective teaching tool for children and adolescents diagnosed with Autism Spectrum Disorder (ASD). Study eligibility criteria included (a) single case research design using multiple baselines, alternating treatment designs,…

  20. Zinc status in HIV infected Ugandan children aged 1-5 years: a cross sectional baseline survey

    OpenAIRE

    Ndeezi, Grace; Tumwine, James K.; Bolann, Bjørn J.; Ndugwa, Christopher M.; Tylleskär, Thorkild

    2010-01-01

    Abstract Background Low concentrations of serum zinc have been reported in HIV infected adults and are associated with disease progression and an increased risk of death. Few studies have been conducted in HIV infected children in Africa. We determined serum zinc levels and factors associated with zinc deficiency in HIV infected Ugandan children. Methods We measured the baseline zinc status of 247 children aged 1-5 years enrolled in a randomised trial for multiple micronutrient supplementatio...

  1. Superresolution Imaging Using Resonant Multiples and Plane-wave Migration Velocity Analysis

    KAUST Repository

    Guo, Bowen

    2017-08-28

    Seismic imaging is a technique that uses seismic echoes to map and detect underground geological structures. The conventional seismic image has a resolution limit of λ/2, where λ is the wavelength associated with the seismic waves propagating in the subsurface. To exceed this resolution limit, this thesis develops a new imaging method using resonant multiples, which produces superresolution images with twice or more the spatial resolution of the conventional primary reflection image. A resonant multiple is defined as a seismic reflection that revisits the same subsurface location along coincident reflection raypaths. This reverberated raypath is the reason for superresolution imaging, because it increases the differences in reflection times associated with subtle changes in the spatial location of the reflector. For the practical implementation of superresolution imaging, I develop a post-stack migration technique that first enhances the signal-to-noise ratios (SNRs) of resonant multiples by a moveout-correction stacking method, and then migrates the post-stacked resonant multiples with the associated Kirchhoff or wave-equation migration formula. I show with synthetic and field data examples that the first-order resonant multiple image has about twice the spatial resolution of the primary reflection image. Besides resolution, a correct estimate of the subsurface velocity is crucial for determining the correct depth of reflectors. Towards this goal, wave-equation migration velocity analysis (WEMVA) is an image-domain method which inverts for the velocity model that maximizes the similarity of common image gathers (CIGs). Conventional WEMVA based on subsurface-offset, angle-domain or time-lag CIGs requires significant computational and memory resources because it computes higher-dimensional migration images in the extended image domain. To mitigate this problem, I present a new WEMVA method using plane-wave CIGs. Plane-wave CIGs reduce the

  2. Psychosocial job quality, mental health, and subjective wellbeing: a cross-sectional analysis of the baseline wave of the Australian Longitudinal Study on Male Health.

    Science.gov (United States)

    LaMontagne, Anthony D; Milner, Allison; Krnjacki, Lauren; Schlichthorst, Marisa; Kavanagh, Anne; Page, Kathryn; Pirkis, Jane

    2016-10-31

    Employment status and working conditions are strong determinants of male health, and are therefore an important focus in the Australian Longitudinal Study on Male Health (Ten to Men). In this paper, we describe key work variables included in Ten to Men, and present analyses relating psychosocial job quality to mental health and subjective wellbeing at baseline. A national sample of males aged 10 to 55 years residing in private dwellings was drawn using a stratified multi-stage cluster random sample design. Data were collected between October 2013 and July 2014 for a cohort of 15,988 males, representing a response fraction of 35%. This analysis was restricted to working-age participants aged 18-55 years (n = 13,456). Work-related measures included employment status and, for those who were employed, a number of working conditions, including an ordinal scale of psychosocial job quality (presence of low job control, high demand and complexity, high job insecurity, and low fairness of pay) and working-time-related stressors such as long working hours and night shift work. Associations between psychosocial job quality and two outcome measures, mental ill-health and subjective wellbeing, were assessed using multiple linear regression. The majority of participants aged 18-55 years were employed at baseline (85.6%), with 8.4% unemployed and looking for work, and 6.1% not in the labour force. Among employed participants, there was a high prevalence of long working hours (49.9% reported working more than 40 h/week) and night shift work (23.4%). The prevalence of exposure to 0, 1, 2, and 3+ job stressors was 36%, 37%, 20%, and 7% of working respondents, respectively. There was a dose-response relationship between psychosocial job quality and each of the two outcome measures of mental health and subjective wellbeing after adjusting for potential confounders, with higher-magnitude associations between psychosocial job quality and subjective wellbeing

  3. FAQs about Baseline Testing among Young Athletes

    Science.gov (United States)

    ... a similar exam conducted by a health care professional during the season if an athlete has a suspected concussion. Baseline testing generally takes place during the pre-season—ideally prior to the first practice. It is important to note that some baseline ...

  4. 75 FR 74706 - Notice of Baseline Filings

    Science.gov (United States)

    2010-12-01

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Notice of Baseline Filings November 24, 2010. Centana Intrastate Pipeline, LLC. Docket No. PR10-84-001. Centana Intrastate Pipeline, LLC... applicants listed above submitted a revised baseline filing of their Statement of Operating Conditions for...

  5. Flexible Mediation Analysis With Multiple Mediators.

    Science.gov (United States)

    Steen, Johan; Loeys, Tom; Moerkerke, Beatrijs; Vansteelandt, Stijn

    2017-07-15

    The advent of counterfactual-based mediation analysis has triggered enormous progress on how, and under what assumptions, one may disentangle path-specific effects upon combining arbitrary (possibly nonlinear) models for mediator and outcome. However, current developments have largely focused on single mediators because required identification assumptions prohibit simple extensions to settings with multiple mediators that may depend on one another. In this article, we propose a procedure for obtaining fine-grained decompositions that may still be recovered from observed data in such complex settings. We first show that existing analytical approaches target specific instances of a more general set of decompositions and may therefore fail to provide a comprehensive assessment of the processes that underpin cause-effect relationships between exposure and outcome. We then outline conditions for obtaining the remaining set of decompositions. Because the number of targeted decompositions increases rapidly with the number of mediators, we introduce natural effects models along with estimation methods that allow for flexible and parsimonious modeling. Our procedure can easily be implemented using off-the-shelf software and is illustrated using a reanalysis of the World Health Organization's Large Analysis and Review of European Housing and Health Status (WHO-LARES) study on the effect of mold exposure on mental health (2002-2003). © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  6. Baseline prediction of combination therapy outcome in hepatitis C virus 1b infected patients by discriminant analysis using viral and host factors.

    Science.gov (United States)

    Saludes, Verónica; Bracho, Maria Alma; Valero, Oliver; Ardèvol, Mercè; Planas, Ramón; González-Candelas, Fernando; Ausina, Vicente; Martró, Elisa

    2010-11-30

    Current treatment of chronic hepatitis C virus (HCV) infection has limited efficacy (especially among genotype 1 infected patients), is costly, and involves severe side effects. Thus, predicting non-response is of major interest for both patient wellbeing and health care expense. At present, treatment cannot be individualized on the basis of any baseline predictor of response. We aimed to identify pre-treatment clinical and virological parameters associated with treatment failure, as well as to assess whether therapy outcome could be predicted at baseline. Forty-three HCV subtype 1b (HCV-1b) chronically infected patients treated with pegylated-interferon alpha plus ribavirin were retrospectively studied (21 responders and 22 non-responders). Host factors (gender, age, weight, transaminase levels, fibrosis stage, and source of infection) and viral-related factors (viral load, and genetic variability in the E1-E2 and Core regions) were assessed. Logistic regression and discriminant analyses were used to develop predictive models. A "leave-one-out" cross-validation method was used to assess the reliability of the discriminant models. Lower alanine transaminase levels (ALT, p=0.009), a higher number of quasispecies variants in the E1-E2 region (number of haplotypes, nHap_E1-E2) (p=0.003), and the absence of both amino acid arginine at position 70 and leucine at position 91 in the Core region (p=0.039) were significantly associated with treatment failure. Therapy outcome was most accurately predicted by discriminant analysis (90.5% sensitivity and 95.5% specificity; 85.7% sensitivity and 81.8% specificity after cross-validation); the most significant variables included in the predictive model were the Core amino acid pattern, the nHap_E1-E2, and gamma-glutamyl transferase and ALT levels. Discriminant analysis has been shown to be a useful tool to predict treatment outcome using baseline HCV genetic variability and host characteristics. The discriminant models obtained in this
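
    The combination of discriminant analysis with "leave-one-out" cross-validation described above can be sketched as follows. The feature values, their separation between groups, and the use of scikit-learn are illustrative assumptions, not the study's actual patient data or software:

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import LeaveOneOut, cross_val_predict

    rng = np.random.default_rng(0)
    # Hypothetical baseline predictors per patient: [ALT, nHap_E1-E2, GGT]
    X_resp = rng.normal([40.0, 3.0, 30.0], [10.0, 1.0, 8.0], size=(21, 3))     # responders
    X_nonresp = rng.normal([70.0, 6.0, 55.0], [10.0, 1.0, 8.0], size=(22, 3))  # non-responders
    X = np.vstack([X_resp, X_nonresp])
    y = np.array([1] * 21 + [0] * 22)  # 1 = responder, 0 = non-responder

    # Leave-one-out cross-validated class predictions
    pred = cross_val_predict(LinearDiscriminantAnalysis(), X, y, cv=LeaveOneOut())
    sens = (pred[y == 1] == 1).mean()  # responders correctly identified
    spec = (pred[y == 0] == 0).mean()  # non-responders correctly identified
    print(f"LOO sensitivity = {sens:.2f}, specificity = {spec:.2f}")
    ```

    Leave-one-out is a natural choice for a cohort of only 43 patients, since every patient serves once as the held-out test case while the other 42 train the discriminant function.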

  7. The Multiplicative Zak Transform, Dimension Reduction, and Wavelet Analysis of LIDAR Data

    Science.gov (United States)

    2010-01-01

    systems is likely to fail. Auslander, Eichmann, Gertner, and Tolimieri defined a multiplicative Zak transform [1], mimicking the construction of the Gabor...L. Auslander, G. Eichmann, I. Gertner and R. Tolimieri, “Time-Frequency Analysis and Synthesis of Non-Stationary Signals,” Proc. Soc. Photo-Opt. In

  8. Smoking and worsening disability in multiple sclerosis: A meta-analysis.

    Science.gov (United States)

    Heydarpour, P; Manouchehrinia, A; Beiki, O; Mousavi, S E; Abdolalizadeh, A; Moradi-Lakeh, M; Sahraian, M A

    2018-03-15

    Multiple sclerosis (MS) is a chronic demyelinating disorder affecting young adults. Environmental factors and lifestyle behaviors are pivotal in MS pathophysiology, and smoking has been considered an important risk factor in MS. Various recent studies have measured the role of smoking in worsening disability in patients with MS, so we systematically assessed the effect of smoking on the evolution of disability in this study. We queried MEDLINE, EMBASE and the Cochrane Library with the following keywords: "Multiple Sclerosis, Smoking, Tobacco Use, Disability" on December 1st, 2016. Original articles were included when smoking history was mentioned and disability was measured via the Expanded Disability Status Scale (EDSS) or the Multiple Sclerosis Severity Score (MSSS). Studies with insufficient outcome data, non-human studies, or articles in languages other than English were excluded. Through the literature review, after duplicate removal, 268 articles were retrieved. A total of 56 articles were screened and 15 articles were assessed for eligibility; finally, eleven articles were included in this systematic review and meta-analysis. Ever smoking was significantly associated with increased EDSS (standardized mean difference (SMD) = 0.15, 95% CI = 0.01-0.28), but had no significant association with the risk of reaching EDSS 4 (HR = 1.24, 95% CI = 0.89-1.72) or EDSS 6 (HR = 1.17, 95% CI = 0.88-1.57). Smoking had no effect on MSSS (SMD = 0.14, 95% CI = -0.04-0.32) or T2 lesion volume (SMD = 0.07, 95% CI = -0.08-0.22). This meta-analysis showed that smoking increased EDSS; insignificant findings were possibly due to the small number of studies, significant differences in methodologies, and variations in the reporting of disability outcomes. © 2018 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
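
    Summary estimates like SMD = 0.15 (95% CI 0.01-0.28) are typically obtained by inverse-variance pooling under a random-effects model. A minimal sketch using the DerSimonian-Laird estimator follows; the five study values are hypothetical stand-ins, not the eleven studies pooled above:

    ```python
    import numpy as np

    # Hypothetical per-study standardized mean differences and standard errors
    smd = np.array([0.10, 0.25, 0.05, 0.30, 0.12])
    se = np.array([0.08, 0.10, 0.07, 0.12, 0.09])

    # Fixed-effect (inverse-variance) pooling, used inside the DL estimator
    w = 1.0 / se**2
    pooled_fe = np.sum(w * smd) / np.sum(w)

    # DerSimonian-Laird between-study variance tau^2 from Cochran's Q
    Q = np.sum(w * (smd - pooled_fe) ** 2)
    df = len(smd) - 1
    C = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (Q - df) / C)

    # Random-effects pooled estimate and 95% confidence interval
    w_re = 1.0 / (se**2 + tau2)
    pooled = np.sum(w_re * smd) / np.sum(w_re)
    se_pooled = np.sqrt(1.0 / np.sum(w_re))
    lo, hi = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
    print(f"pooled SMD = {pooled:.3f} (95% CI {lo:.3f} to {hi:.3f}), tau^2 = {tau2:.4f}")
    ```

    When the studies are homogeneous (Q ≤ df), tau² collapses to zero and the random-effects result coincides with the fixed-effect one.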

  9. Geochemical baseline studies of soil in Finland

    Science.gov (United States)

    Pihlaja, Jouni

    2017-04-01

    Soil element concentrations vary considerably by region in Finland. This is mostly caused by the different bedrock types, which are reflected in soil qualities. The Geological Survey of Finland (GTK) is carrying out geochemical baseline studies in Finland. In the current phase, the research focuses on urban areas and mine environments. The information can, for example, be used to determine the need for soil remediation, to assess environmental impacts, or to measure the natural state of soil in industrial areas or mine districts. The field work is done by taking soil samples, typically at a depth of 0-10 cm. Sampling sites are chosen to represent the areas most vulnerable to human exposure from potentially toxic soil element contents: playgrounds, day-care centers, schools, parks and residential areas. In the mine districts, the samples are taken from areas outside those affected by airborne dust. Element contents of the soil samples are then analyzed with ICP-AES and ICP-MS, and Hg with CV-AAS. The results of the geochemical baseline studies are published in the Finnish national geochemical baseline database (TAPIR). The geochemical baseline map service is free for all users via an internet browser. Through this map service it is possible to calculate regional soil baseline values using geochemical data stored in the map service database. Baseline data for 17 elements in total are provided in the map service, which can be viewed on the GTK's web pages (http://gtkdata.gtk.fi/Tapir/indexEN.html).
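
    Regional baseline values of the kind served by TAPIR are commonly summarized as distribution statistics of the sampled concentrations. A toy sketch follows; the nickel values and the choice of median plus 95th percentile are assumptions for illustration, not GTK's exact procedure:

    ```python
    import numpy as np

    # Hypothetical Ni concentrations (ppm) from 0-10 cm soil samples in one region
    ni_ppm = np.array([8, 12, 15, 9, 22, 30, 14, 11, 18, 25], dtype=float)

    baseline_median = np.median(ni_ppm)          # central tendency of the region
    baseline_upper = np.percentile(ni_ppm, 95)   # upper baseline (assumed P95 convention)
    print(f"median = {baseline_median:.1f} ppm, upper baseline (P95) = {baseline_upper:.1f} ppm")
    ```

    A measured concentration above the upper baseline would then flag a site for closer inspection rather than prove contamination, since the baseline only describes the regional distribution.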

  10. Effect of Baseline Nutritional Status on Long-term Multivitamin Use and Cardiovascular Disease Risk

    Science.gov (United States)

    Rautiainen, Susanne; Gaziano, J. Michael; Christen, William G.; Bubes, Vadim; Kotler, Gregory; Glynn, Robert J.; Manson, JoAnn E.; Buring, Julie E.

    2017-01-01

    , and between multivitamin use and vitamin B12 intake on CVD mortality and total mortality. However, there were inconsistent patterns in hazard ratios across tertiles of each dietary factor that are likely explained by multiple testing. Conclusions and Relevance The results suggest that baseline nutritional status does not influence the effect of randomized long-term multivitamin use on major CVD events. Future studies are needed to investigate the role of baseline nutritional biomarkers on the effect of multivitamin use on CVD and other outcomes. Trial Registration clinicaltrials.gov Identifier: NCT00270647 PMID:28384735

  11. Improving the accuracy of energy baseline models for commercial buildings with occupancy data

    International Nuclear Information System (INIS)

    Liang, Xin; Hong, Tianzhen; Shen, Geoffrey Qiping

    2016-01-01

    Highlights: • We evaluated several baseline models predicting energy use in buildings. • Including occupancy data improved accuracy of baseline model prediction. • Occupancy is highly correlated with energy use in buildings. • This simple approach can be used in decision makings of energy retrofit projects. - Abstract: More than 80% of energy is consumed during operation phase of a building’s life cycle, so energy efficiency retrofit for existing buildings is considered a promising way to reduce energy use in buildings. The investment strategies of retrofit depend on the ability to quantify energy savings by “measurement and verification” (M&V), which compares actual energy consumption to how much energy would have been used without retrofit (called the “baseline” of energy use). Although numerous models exist for predicting baseline of energy use, a critical limitation is that occupancy has not been included as a variable. However, occupancy rate is essential for energy consumption and was emphasized by previous studies. This study develops a new baseline model which is built upon the Lawrence Berkeley National Laboratory (LBNL) model but includes the use of building occupancy data. The study also proposes metrics to quantify the accuracy of prediction and the impacts of variables. However, the results show that including occupancy data does not significantly improve the accuracy of the baseline model, especially for HVAC load. The reasons are discussed further. In addition, sensitivity analysis is conducted to show the influence of parameters in baseline models. The results from this study can help us understand the influence of occupancy on energy use, improve energy baseline prediction by including the occupancy factor, reduce risks of M&V and facilitate investment strategies of energy efficiency retrofit.
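
    The comparison at the heart of this study, a baseline model scored with and without an occupancy regressor, can be sketched with ordinary least squares on synthetic hourly data. The linear model form, variable names, and CV(RMSE) accuracy metric below are assumptions for illustration, not the paper's LBNL-based model:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 24 * 30                                        # one month of hourly data
    hour = np.arange(n) % 24
    temp = 20.0 + 8.0 * np.sin(2 * np.pi * np.arange(n) / 24)
    occ = ((hour >= 8) & (hour < 18)).astype(float)    # assumed office occupancy schedule
    energy = 50.0 + 2.0 * temp + 30.0 * occ + rng.normal(0, 5, n)

    def cvrmse(X, y):
        """Fit OLS and return the coefficient of variation of the RMSE."""
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        return np.sqrt(np.mean(resid**2)) / y.mean()

    X_temp = np.column_stack([np.ones(n), temp])        # temperature-only baseline
    X_temp_occ = np.column_stack([np.ones(n), temp, occ])  # with occupancy regressor
    print(f"CV(RMSE), temperature only:        {cvrmse(X_temp, energy):.3f}")
    print(f"CV(RMSE), temperature + occupancy: {cvrmse(X_temp_occ, energy):.3f}")
    ```

    In this synthetic setting occupancy drives a large share of the load, so adding it clearly lowers CV(RMSE); the paper's finding is that on their real building data the improvement was not significant, especially for HVAC load.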

  12. Analysis of the quantitative dermatoglyphics of the digito-palmar complex in patients with multiple sclerosis.

    Science.gov (United States)

    Supe, S; Milicić, J; Pavićević, R

    1997-06-01

    Recent studies on the etiopathogenesis of multiple sclerosis (MS) all point out that there is a polygenetic predisposition for this illness. The so-called "MS trait" determines the reactivity of the immunological system to ecological factors. The development of glyphological science and the study of the characteristics of the digito-palmar dermatoglyphic complex (which are established to be polygenetically determined characteristics) enable a better insight into genetic development during early embryogenesis. The aim of this study was to estimate certain differences in the dermatoglyphics of digito-palmar complexes between a group with multiple sclerosis and comparable, phenotypically healthy groups of both sexes. This study is based on the analysis of 18 quantitative characteristics of the digito-palmar complex in 125 patients with multiple sclerosis (41 males and 84 females) in comparison to a group of 400 phenotypically healthy subjects (200 males and 200 females). The analysis showed a statistically significant decrease in the number of digital and palmar ridges, as well as lower values of atd angles, in the MS group of both sexes. The main discriminators were the characteristic palmar dermatoglyphics, and the discriminant analysis correctly classified over 80% of the examinees, exceeding statistical significance. The results of this study suggest a possible discrimination of patients with MS from the phenotypically healthy population through analysis of dermatoglyphic status, and therefore the possibility that multiple sclerosis is a genetically predisposed disease.

  13. Precision Search for Muon Antineutrino Disappearance Oscillations Using a Dual Baseline Technique

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, Gary Li [Columbia Univ., New York, NY (United States)

    2013-01-01

    A search for short baseline muon antineutrino disappearance with the SciBooNE and MiniBooNE experiments at Fermi National Accelerator Laboratory in Batavia, Illinois is presented. Short baseline muon antineutrino disappearance measurements help constrain sterile neutrino models. The two detectors observe muon antineutrinos from the same beam, therefore the combined analysis of their data sets serves to partially constrain some of the flux and cross section uncertainties. A likelihood ratio method was used to set a 90% confidence level upper limit on muon antineutrino disappearance that dramatically improves upon prior sterile neutrino oscillation limits in the Δm² = 0.1-100 eV² region.

  14. Logistics Operations Management Center: Maintenance Support Baseline (LOMC-MSB)

    Science.gov (United States)

    Kurrus, R.; Stump, F.

    1995-01-01

    The Logistics Operations Management Center Maintenance Support Baseline is defined. A historical record of systems, applied to and deleted from, designs in support of future management and/or technical analysis is provided. All Flight elements, Ground Support Equipment, Facility Systems and Equipment and Test Support Equipment for which LOMC has responsibilities at Kennedy Space Center and other locations are listed. International Space Station Alpha Program documentation is supplemented. The responsibility of the Space Station Launch Site Support Office is established.

  15. Linear growth retardation in children under five years of age: a baseline study

    Directory of Open Access Journals (Sweden)

    Anete Rissin

    2011-10-01

    Full Text Available The scope of this study was to describe the prevalence of, and analyze factors associated with, linear growth retardation in children. The baseline study analyzed 2,040 children under the age of five, testing possible associations between growth retardation (height/age index < -2 Z-scores) and variables in six hierarchical blocks: socio-economic, residence, sanitary, maternal, biological and healthcare access. Multivariate analysis was performed using Poisson regression with the robust standard error option, obtaining adjusted prevalence ratios with 95% CIs and the respective significance probability values. Among non-binary variables, there was a positive association with roof type and number of inhabitants per room, and a negative association with per capita income, mother's schooling and birth weight. The adjusted analysis also indicated water supply, visits from the community health agent, birth delivery location, hospitalization for diarrhea or for pneumonia, and birth weight as significant variables. Several risk factors were identified for linear growth retardation, pointing to the multi-causal nature of the problem and highlighting the need for control measures by the various hierarchical government agents.

  16. Intrafractional baseline drift during free breathing breast cancer radiation therapy.

    Science.gov (United States)

    Jensen, Christer Andre; Acosta Roa, Ana María; Lund, Jo-Åsmund; Frengen, Jomar

    2017-06-01

    Intrafraction motion in breast cancer radiation therapy (BCRT) has not yet been thoroughly described in the literature. It has been observed that baseline drift occurs as part of the intrafraction motion. This study aims to measure baseline drift and its incidence in free-breathing BCRT patients using an in-house developed laser system for tracking the position of the sternum. Baseline drift was monitored in 20 right-sided breast cancer patients receiving free-breathing 3D-conformal RT using the laser system, which measures one-dimensional distance in the AP direction. A total of 357 patient respiratory traces from treatment sessions were logged and analysed. Baseline drift was compared to the patient positioning error measured from in-field portal imaging. The mean overall baseline drift at the end of treatment sessions was -1.3 mm for the patient population. Relatively small baseline drift was observed during the first fraction; however, it was clearly detected already at the second fraction. Over 90% of the baseline drift occurs during the first 3 min of each treatment session. The baseline drift rate for the population was -0.5 ± 0.2 mm/min in the posterior direction during the first minute after localization. Only 4% of the treatment sessions had a baseline drift of 5 mm or larger at 5 min, all towards the posterior direction. Mean baseline drift in the posterior direction in free-breathing BCRT was observed in 18 of 20 patients over all treatment sessions. This study shows that there is a substantial baseline drift in free-breathing BCRT patients. No clear baseline drift was observed during the first treatment session; however, baseline drift was markedly present during the rest of the sessions. Intrafraction motion due to baseline drift should be accounted for in margin calculations.
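
    A baseline drift rate such as the reported -0.5 mm/min can be estimated from a position trace by averaging out the breathing oscillation and fitting a line to what remains. The sketch below assumes a sampling rate, a breathing period, and a synthetic trace; it is not the in-house laser system's actual algorithm:

    ```python
    import numpy as np

    fs = 20                                         # samples per second (assumed)
    t = np.arange(0, 60, 1 / fs)                    # first minute of a session
    breathing = 2.0 * np.sin(2 * np.pi * t / 4.0)   # 4 s breathing cycle, 2 mm amplitude
    trace = breathing + (-0.5) * t / 60.0           # add a -0.5 mm/min baseline drift

    # Moving average over exactly one breathing cycle removes the oscillation
    win = fs * 4
    baseline = np.convolve(trace, np.ones(win) / win, mode="same")

    # Fit a line to the interior (the edges of the moving average are distorted)
    valid = slice(win, -win)
    slope_per_s, _ = np.polyfit(t[valid], baseline[valid], 1)
    print(f"estimated drift rate: {slope_per_s * 60:.2f} mm/min")
    ```

    Because the averaging window spans one full breathing cycle, the sinusoid cancels exactly and the fitted slope recovers the injected drift rate.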

  17. Analysis of γ spectra in airborne radioactivity measurements using multiple linear regressions

    International Nuclear Information System (INIS)

    Bao Min; Shi Quanlin; Zhang Jiamei

    2004-01-01

    This paper describes the calculation of the net peak counts of the nuclide ¹³⁷Cs at 662 keV in γ spectra from airborne radioactivity measurements using multiple linear regression. A mathematical model is established by analysing every factor that contributes to the Cs peak counts in the spectra, and a multiple linear regression function is formulated. The calculation adopts stepwise regression, and insignificant factors are eliminated by an F test. The regression results and their uncertainty are calculated using least squares estimation, from which the Cs net peak counts and their uncertainty are obtained. Analysis results for an experimental spectrum are presented. The influence of energy shift and energy resolution on the results is discussed. In comparison with the spectrum stripping method, the multiple linear regression method requires no stripping ratios; the result depends only on the counts in the Cs peak, and the calculated uncertainty is reduced. (authors)
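    The least squares step can be sketched as follows; the two-component model (flat background plus a known peak shape) and all numbers are simplifying assumptions for illustration, not the authors' full regression model:

    ```python
    # Minimal sketch of the idea: fit observed channel counts y as a linear
    # combination of component shapes (a flat background plus a known peak
    # shape) by ordinary least squares, then read the 137Cs net peak counts
    # off the fitted peak term.

    def lstsq(X, y):
        """Solve the normal equations (X'X) b = X'y by Gaussian elimination."""
        p = len(X[0])
        A = [[sum(X[k][i] * X[k][j] for k in range(len(X))) for j in range(p)]
             for i in range(p)]
        b = [sum(X[k][i] * y[k] for k in range(len(X))) for i in range(p)]
        for i in range(p):                  # forward elimination with pivoting
            piv = max(range(i, p), key=lambda r: abs(A[r][i]))
            A[i], A[piv] = A[piv], A[i]
            b[i], b[piv] = b[piv], b[i]
            for r in range(i + 1, p):
                f = A[r][i] / A[i][i]
                A[r] = [A[r][c] - f * A[i][c] for c in range(p)]
                b[r] -= f * b[i]
        beta = [0.0] * p
        for i in range(p - 1, -1, -1):
            beta[i] = (b[i] - sum(A[i][c] * beta[c]
                                  for c in range(i + 1, p))) / A[i][i]
        return beta

    # Hypothetical 9-channel window around 662 keV: flat background of 50
    # counts plus a triangular peak of known shape scaled by 200.
    shape = [0.0, 0.1, 0.3, 0.6, 1.0, 0.6, 0.3, 0.1, 0.0]
    y = [50 + 200 * s for s in shape]
    X = [[1.0, s] for s in shape]           # columns: background, peak shape
    bg, peak_scale = lstsq(X, y)
    net_counts = peak_scale * sum(shape)    # net 137Cs counts in the window
    ```

    In the real method, further regressors (scattering contributions, interfering nuclides) would be added and pruned by the stepwise F test.
    
    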

  18. Extracting Baseline Electricity Usage Using Gradient Tree Boosting

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Taehoon [Ulsan Nat. Inst. of Sci. & Tech., Ulsan (South Korea); Lee, Dongeun [Ulsan Nat. Inst. of Sci. & Tech., Ulsan (South Korea); Choi, Jaesik [Ulsan Nat. Inst. of Sci. & Tech., Ulsan (South Korea); Spurlock, Anna [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sim, Alex [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Todd, Annika [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wu, Kesheng [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-05-05

    To understand how specific interventions affect a process observed over time, we need to control for the other factors that influence outcomes. Such a model that captures all factors other than the one of interest is generally known as a baseline. In our study of how different pricing schemes affect residential electricity consumption, the baseline would need to capture the impact of outdoor temperature along with many other factors. In this work, we examine a number of different data mining techniques and demonstrate Gradient Tree Boosting (GTB) to be an effective method for building the baseline. We train GTB on data prior to the introduction of the new pricing schemes, and apply the known temperatures following their introduction to predict electricity usage with the expected temperature correction. Our experiments and analyses show that the baseline models generated by GTB capture the core characteristics over the two years with the new pricing schemes. In contrast to the majority of regression-based techniques, which fail to capture the lag between the peak of daily temperature and the peak of electricity usage, the GTB-generated baselines correctly capture the delay between the temperature peak and the electricity peak. Furthermore, subtracting this temperature-adjusted baseline from the observed electricity usage, we find that the resulting values are more amenable to interpretation, which demonstrates that the temperature-adjusted baseline is indeed effective.
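    The train-on-pre-period, predict-post-period workflow can be sketched with a miniature gradient tree boosting regressor built from decision stumps; the data, single feature (temperature), and hyperparameters below are invented, and the authors' models are far richer:

    ```python
    # Hedged sketch of the baselining idea with a toy gradient tree boosting
    # regressor (depth-1 stumps, squared loss), trained on a hypothetical
    # pre-pricing period and used to predict usage at new temperatures.

    def fit_stump(x, residuals):
        """Best single-threshold stump minimizing squared error on residuals."""
        best = None
        order = sorted(range(len(x)), key=lambda i: x[i])
        for k in range(1, len(x)):
            thr = (x[order[k - 1]] + x[order[k]]) / 2.0
            left = [residuals[i] for i in range(len(x)) if x[i] <= thr]
            right = [residuals[i] for i in range(len(x)) if x[i] > thr]
            if not left or not right:
                continue
            lm, rm = sum(left) / len(left), sum(right) / len(right)
            sse = sum((residuals[i] - (lm if x[i] <= thr else rm)) ** 2
                      for i in range(len(x)))
            if best is None or sse < best[0]:
                best = (sse, thr, lm, rm)
        return best[1:]

    def gtb_fit(x, y, rounds=50, lr=0.3):
        """Boost stumps on residuals; return (initial constant, stump list)."""
        f0 = sum(y) / len(y)
        pred = [f0] * len(y)
        stumps = []
        for _ in range(rounds):
            thr, lm, rm = fit_stump(x, [y[i] - pred[i] for i in range(len(y))])
            stumps.append((thr, lm, rm))
            pred = [pred[i] + lr * (lm if x[i] <= thr else rm)
                    for i in range(len(y))]
        return f0, stumps

    def gtb_predict(model, x_new, lr=0.3):   # lr must match gtb_fit
        f0, stumps = model
        return [f0 + lr * sum(lm if xv <= thr else rm for thr, lm, rm in stumps)
                for xv in x_new]

    # Hypothetical pre-pricing period: usage rises with outdoor temperature.
    temps = [10, 12, 15, 18, 20, 24, 27, 30, 33, 35]
    usage = [5, 5, 6, 7, 8, 11, 14, 18, 22, 25]     # kWh, made-up numbers
    model = gtb_fit(temps, usage)
    baseline = gtb_predict(model, [16, 31])         # baseline usage at 16/31 C
    ```

    Subtracting such a prediction from post-intervention observations gives the temperature-adjusted residual discussed above.
    
    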

  19. Assessment of local GNSS baselines at co-location sites

    Science.gov (United States)

    Herrera Pinzón, Iván; Rothacher, Markus

    2018-01-01

    As one of the major contributors to the realisation of the International Terrestrial Reference System (ITRS), the Global Navigation Satellite Systems (GNSS) are prone to suffer from irregularities and discontinuities in their time series. While often associated with hardware/software changes and the influence of the local environment, these discrepancies constitute a major threat to ITRS realisations. Co-located GNSS at fundamental sites, with two or more available instruments, provide the opportunity to mitigate their influence while improving the accuracy of estimated positions, by examining data breaks, local biases, deformations and time-dependent variations, and by comparing GNSS baselines with existing local tie measurements. Using co-located GNSS data from a subset of sites of the International GNSS Service network, this paper discusses a global multi-year analysis with the aim of delivering homogeneous time series of coordinates to analyse system-specific error sources in the local baselines. Results based on the comparison of different GNSS-based solutions with the local survey ties show discrepancies of up to 10 mm, despite GNSS coordinate repeatabilities at the sub-mm level. The discrepancies are especially large for the solutions using the ionosphere-free linear combination and estimating tropospheric zenith delays, thus corresponding to the processing strategy used for global solutions. Snow on the antennas causes further problems and seasonal variations of the station coordinates. These results demonstrate the need for a permanent high-quality monitoring of the effects present in the short GNSS baselines at fundamental sites.
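    The baseline-versus-local-tie comparison at a co-location site reduces to differencing two vectors; all coordinates below are invented for illustration:

    ```python
    # Illustrative sketch: the comparison described above boils down to
    # differencing a GNSS-derived local baseline vector against the
    # terrestrial survey tie between the two co-located antennas.
    import math

    def baseline(p1, p2):
        """Baseline vector p2 - p1 between two antenna reference points (m)."""
        return tuple(b - a for a, b in zip(p1, p2))

    def discrepancy_mm(v_gnss, v_tie):
        """Length of the difference of two baseline vectors, in mm."""
        return 1000.0 * math.sqrt(sum((g - t) ** 2
                                      for g, t in zip(v_gnss, v_tie)))

    # Hypothetical local-frame coordinates of two co-located antennas.
    ant_a = (0.000, 0.000, 0.000)
    ant_b_gnss = (12.3456, -4.5678, 1.2345)   # from GNSS processing
    ant_b_tie = (12.3421, -4.5665, 1.2296)    # from the local survey tie
    d = discrepancy_mm(baseline(ant_a, ant_b_gnss),
                       baseline(ant_a, ant_b_tie))
    ```

    A few-mm value of `d`, as here, is the size of discrepancy the abstract reports despite sub-mm coordinate repeatability.
    
    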

  20. Baseline characteristics predict risk of progression and response to combined medical therapy for benign prostatic hyperplasia (BPH).

    Science.gov (United States)

    Kozminski, Michael A; Wei, John T; Nelson, Jason; Kent, David M

    2015-02-01

    To better risk stratify patients, using baseline characteristics, to help optimise decision-making for men with moderate-to-severe lower urinary tract symptoms (LUTS) secondary to benign prostatic hyperplasia (BPH), through a secondary analysis of the Medical Therapy of Prostatic Symptoms (MTOPS) trial. After a review of the literature, we identified potential baseline risk factors for BPH progression. Using bivariate tests in a secondary analysis of MTOPS data, we determined which variables retained prognostic significance. We then used these factors in Cox proportional hazards modelling to: i) more comprehensively risk stratify the study population based on pre-treatment parameters and ii) determine which risk strata stood to benefit most from medical intervention. In all, 3047 men were followed in MTOPS for a mean of 4.5 years. We found varying risks of progression across quartiles. Baseline BPH Impact Index score, post-void residual urine volume, serum prostate-specific antigen (PSA) level, age, American Urological Association Symptom Index score, and maximum urinary flow rate were found to correlate significantly with overall BPH progression in multivariable analysis. Using baseline factors permits estimation of individual patient risk for clinical progression and of the benefits of medical therapy. A novel clinical decision tool based on these analyses will allow clinicians to weigh patient-specific benefits against the possible risks of adverse effects for a given patient. © 2014 The Authors. BJU International © 2014 BJU International.

  1. Statin-related aminotransferase elevation according to baseline aminotransferases level in real practice in Korea.

    Science.gov (United States)

    Kim, H-S; Lee, S H; Kim, H; Lee, S-H; Cho, J H; Lee, H; Yim, H W; Kim, S-H; Choi, I-Y; Yoon, K-H; Kim, J H

    2016-06-01

    A higher rate of statin-related hepatotoxicity has been reported for Koreans than for Westerners. Moreover, statin-related aminotransferase elevation in patients who show borderline levels of aspartate transaminase (AST) and alanine transaminase (ALT) (≤×3 of the upper normal limit (UNL)) at baseline has not been fully investigated. Post-statin changes in AST/ALT levels during the first year for 21 233 Korean outpatients at two large academic teaching hospitals from January 2009 to December 2013 were analysed using electronic health record data. The date of the first statin prescription was set as baseline. We also performed a comparative analysis of statin-related AST/ALT elevations according to the type of statin, followed by an analysis of clinical risk factors. The progression rate to abnormal AST/ALT values (>×3 of UNL) was significantly higher (2·4-16% vs. 0·3-1·7%) for patients with borderline AST/ALT values at baseline (>×1 but ≤×3 of UNL) compared with normal AST/ALT values at baseline. Those with normal baseline AST/ALT did not show significantly different progression rates between different statin medications (P = 0·801). However, among patients with borderline baseline AST/ALT, those taking pitavastatin (HR = 0·76, P = 0·657) were the least likely to develop abnormal AST/ALT, whereas those taking fluvastatin (HR = 2·96, P = 0·029) were the most likely, compared with atorvastatin. Given the small sample sizes and the observational nature of our study, these findings need further study. It is advisable to regularly monitor AST/ALT levels even in patients with AST/ALT increases >×1 of UNL. Future studies should aim to determine the possible risk factors for each specific statin type by analysing various confounding variables. © 2016 The Authors. Journal of Clinical Pharmacy and Therapeutics Published by John Wiley & Sons Ltd.

  2. 324 Building Baseline Radiological Characterization

    International Nuclear Information System (INIS)

    Reeder, R.J.; Cooper, J.C.

    2010-01-01

    This report documents the analysis of radiological data collected as part of the characterization study performed in 1998. The study was performed to create a baseline of the radiological conditions in the 324 Building. A total of 85 technical (100 square centimeter (cm²)) smears were collected from the Room 147 hoods, the Shielded Materials Facility (SMF), and the Radiochemical Engineering Cells (REC). Exposure rate readings (window open and window closed) were taken at a distance of 2.5 centimeters (cm) and 30 cm from the surface of each smear. Gross beta-gamma and alpha counts of each smear were also performed. The smear samples were analyzed by gamma energy analysis (GEA). Alpha energy analysis (AEA) and strontium-90 analysis were also performed on selected smears. GEA results for one or more samples reported the presence of manganese-54, cobalt-60, silver-108m, antimony-125, cesium-134, cesium-137, europium-154, europium-155, and americium-241. AEA results reported the presence of plutonium-239/240, plutonium-238/americium-241, curium-243/244, curium-242, and americium-243. Tables 5 through 9 present a summary by location of the estimated maximum removable and total contamination levels in the Room 147 hoods, the SMF, and the REC. The smear sample survey data and laboratory analytical results are presented in tabular form by sample in Appendix A. The Appendix A tables combine survey data documented in radiological survey reports found in Appendix B and laboratory analytical results reported in the 324 Building Physical and Radiological Characterization Study (Berk, Hill, and Landsman 1998), supplemented by the laboratory analytical results found in Appendix C.

  3. 75 FR 57268 - Notice of Baseline Filings

    Science.gov (United States)

    2010-09-20

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR10-103-000; Docket No. PR10-104-000; Docket No. PR10-105- 000 (Not Consolidated)] Notice of Baseline Filings September 13..., 2010, and September 10, 2010, respectively the applicants listed above submitted their baseline filing...

  4. Analysis of antenna position measurements and weather station network data during the ALMA long baseline campaign of 2015

    Science.gov (United States)

    Hunter, Todd R.; Lucas, Robert; Broguière, Dominique; Fomalont, Ed B.; Dent, William R. F.; Phillips, Neil; Rabanus, David; Vlahakis, Catherine

    2016-07-01

    In a radio interferometer, the geometrical antenna positions are determined from measurements of the observed delay to each antenna from observations across the sky of many point sources whose positions are known to high accuracy. The determination of accurate antenna positions relies on accurate calibration of the dry and wet delay of the atmosphere above each antenna. For the Atacama Large Millimeter/Submillimeter Array (ALMA), with baseline lengths up to 15 kilometers, the geography of the site forces the height above mean sea level of the more distant antenna pads to be significantly lower than the central array. Thus, both the ground level meteorological values and the total water column can be quite different between antennas in the extended configurations. During 2015, a network of six additional weather stations was installed to monitor pressure, temperature, relative humidity and wind velocity, in order to test whether inclusion of these parameters could improve the repeatability of antenna position determinations in these configurations. We present an analysis of the data obtained during the ALMA Long Baseline Campaign of October through November 2015. The repeatability of antenna position measurements typically degrades as a function of antenna distance. Also, the scatter is more than three times worse in the vertical direction than in the local tangent plane, suggesting that a systematic effect is limiting the measurements. So far we have explored correcting the delay model for deviations from hydrostatic equilibrium in the measured air pressure and separating the partial pressure of water from the total pressure using water vapor radiometer (WVR) data. Correcting for these combined effects still does not provide a good match to the residual position errors in the vertical direction. One hypothesis is that the current model of water vapor may be too simple to fully remove the day-to-day variations in the wet delay. We describe possible new avenues of
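    One ingredient of the delay model discussed above is the pressure-dependent hydrostatic term. As a hedged illustration (the standard Saastamoinen/Davis formulation, not necessarily the exact model used in the ALMA software), the zenith hydrostatic delay can be computed from surface pressure:

    ```python
    # Zenith hydrostatic (dry) delay from surface pressure, latitude, and
    # height, following the widely used Saastamoinen/Davis model.
    import math

    def zenith_hydrostatic_delay_m(pressure_hpa, lat_deg, height_km):
        """Saastamoinen zenith hydrostatic delay in meters."""
        denom = (1.0 - 0.00266 * math.cos(2.0 * math.radians(lat_deg))
                 - 0.00028 * height_km)
        return 0.0022768 * pressure_hpa / denom

    # ALMA sits near latitude -23 deg at roughly 5 km altitude, where the
    # surface pressure is around 550 hPa (values here are only indicative).
    zhd = zenith_hydrostatic_delay_m(550.0, -23.0, 5.0)
    ```

    Even at the site's low pressure, the dry delay is over a meter of path length, which is why accurate per-antenna pressure readings from the weather-station network matter for position determination.
    
    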

  5. 75 FR 65010 - Notice of Baseline Filings

    Science.gov (United States)

    2010-10-21

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR11-1-000; Docket No. PR11-2-000; Docket No. PR11-3-000] Notice of Baseline Filings October 14, 2010. Cranberry Pipeline Docket..., 2010, respectively the applicants listed above submitted their baseline filing of its Statement of...

  6. Combining MCDA and risk analysis: dealing with scaling issues in the multiplicative AHP

    DEFF Research Database (Denmark)

    Barfod, Michael Bruhn; van den Honert, Rob; Salling, Kim Bang

    This paper proposes a new decision support system (DSS) for applying risk analysis and stochastic simulation to the multiplicative AHP in order to deal with issues concerning the progression factors. The multiplicative AHP makes use of direct rating on a logarithmic scale, and for this purpose the progression factor 2 is used for calculating scores of alternatives and √2 for calculating criteria weights when transforming the verbal judgments stemming from pairwise comparisons. However, depending on the decision context, the decision-makers' aversion towards risk, etc., it is most likely…
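    The scoring rule the abstract refers to can be sketched as follows; the gradation matrix is invented, and this is a minimal reading of the multiplicative AHP with a fixed progression factor, not the proposed DSS itself:

    ```python
    # Minimal multiplicative-AHP scoring sketch: verbal pairwise gradations
    # delta[j][k] are turned into ratio estimates via a progression factor
    # (2 for alternatives, sqrt(2) for criteria weights), and scores follow
    # from normalized row-wise geometric means.

    def mahp_scores(gradations, factor=2.0):
        """Normalized scores from a skew-symmetric gradation matrix."""
        n = len(gradations)
        scores = []
        for row in gradations:
            gm = 1.0
            for d in row:
                gm *= factor ** d           # ratio estimate for one comparison
            scores.append(gm ** (1.0 / n))  # geometric mean of the row
        total = sum(scores)
        return [s / total for s in scores]

    # Three alternatives; delta[j][k] > 0 means j is preferred over k.
    delta = [[0, 1, 2],
             [-1, 0, 1],
             [-2, -1, 0]]
    scores = mahp_scores(delta, factor=2.0)
    ```

    The DSS described above replaces the fixed factor with a stochastic one, simulating over a range of progression factors to reflect the decision-makers' risk attitude.
    
    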

  7. Comparing cluster-level dynamic treatment regimens using sequential, multiple assignment, randomized trials: Regression estimation and sample size considerations.

    Science.gov (United States)

    NeCamp, Timothy; Kilbourne, Amy; Almirall, Daniel

    2017-08-01

    Cluster-level dynamic treatment regimens can be used to guide sequential treatment decision-making at the cluster level in order to improve outcomes at the individual or patient-level. In a cluster-level dynamic treatment regimen, the treatment is potentially adapted and re-adapted over time based on changes in the cluster that could be impacted by prior intervention, including aggregate measures of the individuals or patients that compose it. Cluster-randomized sequential multiple assignment randomized trials can be used to answer multiple open questions preventing scientists from developing high-quality cluster-level dynamic treatment regimens. In a cluster-randomized sequential multiple assignment randomized trial, sequential randomizations occur at the cluster level and outcomes are observed at the individual level. This manuscript makes two contributions to the design and analysis of cluster-randomized sequential multiple assignment randomized trials. First, a weighted least squares regression approach is proposed for comparing the mean of a patient-level outcome between the cluster-level dynamic treatment regimens embedded in a sequential multiple assignment randomized trial. The regression approach facilitates the use of baseline covariates which is often critical in the analysis of cluster-level trials. Second, sample size calculators are derived for two common cluster-randomized sequential multiple assignment randomized trial designs for use when the primary aim is a between-dynamic treatment regimen comparison of the mean of a continuous patient-level outcome. The methods are motivated by the Adaptive Implementation of Effective Programs Trial which is, to our knowledge, the first-ever cluster-randomized sequential multiple assignment randomized trial in psychiatry.
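    The weighted least squares comparison can be sketched in a simplified form; the design matrix, weights, and data below are invented, and the trial's actual estimator uses regimen-specific weighting and cluster-level variance corrections not shown here:

    ```python
    # Sketch of the weighted least squares idea: regress a patient-level
    # outcome y on a regimen indicator z and a baseline covariate x, with
    # per-observation weights w, via the weighted normal equations
    # (X'WX) b = X'Wy solved by Gaussian elimination.

    def wls(X, y, w):
        p = len(X[0])
        A = [[sum(w[k] * X[k][i] * X[k][j] for k in range(len(X)))
              for j in range(p)] for i in range(p)]
        b = [sum(w[k] * X[k][i] * y[k] for k in range(len(X)))
             for i in range(p)]
        for i in range(p):                  # elimination with partial pivoting
            piv = max(range(i, p), key=lambda r: abs(A[r][i]))
            A[i], A[piv] = A[piv], A[i]
            b[i], b[piv] = b[piv], b[i]
            for r in range(i + 1, p):
                f = A[r][i] / A[i][i]
                A[r] = [A[r][c] - f * A[i][c] for c in range(p)]
                b[r] -= f * b[i]
        beta = [0.0] * p
        for i in range(p - 1, -1, -1):
            beta[i] = (b[i] - sum(A[i][c] * beta[c]
                                  for c in range(i + 1, p))) / A[i][i]
        return beta

    # Hypothetical data: z = 1 under regimen A, 0 under regimen B; x is a
    # baseline covariate; equal weights here for simplicity.
    X = [[1, 1, 2.0], [1, 1, 3.0], [1, 1, 4.0],
         [1, 0, 2.0], [1, 0, 3.0], [1, 0, 4.0]]
    y = [5.1, 6.0, 7.2, 4.0, 5.1, 5.9]
    w = [1.0] * len(y)
    intercept, regimen_effect, covariate_slope = wls(X, y, w)
    ```

    The coefficient on the regimen indicator is the adjusted between-regimen difference in mean outcome, which is the comparison of interest in the abstract.
    
    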

  8. THE 2014 ALMA LONG BASELINE CAMPAIGN: AN OVERVIEW

    Energy Technology Data Exchange (ETDEWEB)

    Partnership, ALMA [Astrophysics Research Institute, Liverpool John Moores University, IC2, Liverpool Science Park, 146 Brownlow Hill, Liverpool L3 5RF (United Kingdom); Fomalont, E. B.; Vlahakis, C.; Corder, S.; Remijan, A.; Barkats, D.; Dent, W. R. F.; Phillips, N.; Cox, P.; Hales, A. S. [Joint ALMA Observatory, Alonso de Córdova 3107, Vitacura, Santiago (Chile); Lucas, R. [Institut de Planétologie et d’Astrophysique de Grenoble (UMR 5274), BP 53, F-38041 Grenoble Cedex 9 (France); Hunter, T. R.; Brogan, C. L.; Amestica, R.; Cotton, W. [National Radio Astronomy Observatory, 520 Edgemont Road, Charlottesville, VA 22903 (United States); Asaki, Y. [National Astronomical Observatory of Japan, 2-21-1 Osawa, Mitaka, Tokyo 181-8588 (Japan); Matsushita, S. [Institute of Astronomy and Astrophysics, Academia Sinica, P.O. Box 23-141, Taipei 106, Taiwan (China); Hills, R. E. [Astrophysics Group, Cavendish Laboratory, JJ Thomson Avenue, Cambridge CB3 0HE (United Kingdom); Richards, A. M. S. [Jodrell Bank Centre for Astrophysics, School of Physics and Astronomy, University of Manchester, Oxford Road, Manchester M13 9PL (United Kingdom); Broguiere, D., E-mail: efomalon@nrao.edu [Institut de Radioastronomie Millime´trique (IRAM), 300 rue de la Piscine, Domaine Universitaire, F-38406 Saint Martin d’Hères (France); and others

    2015-07-20

    A major goal of the Atacama Large Millimeter/submillimeter Array (ALMA) is to make accurate images with resolutions of tens of milliarcseconds, which at submillimeter (submm) wavelengths requires baselines up to ∼15 km. To develop and test this capability, a Long Baseline Campaign (LBC) was carried out from 2014 September to late November, culminating in end-to-end observations, calibrations, and imaging of selected Science Verification (SV) targets. This paper presents an overview of the campaign and its main results, including an investigation of the short-term coherence properties and systematic phase errors over the long baselines at the ALMA site, a summary of the SV targets and observations, and recommendations for science observing strategies at long baselines. Deep ALMA images of the quasar 3C 138 at 97 and 241 GHz are also compared to VLA 43 GHz results, demonstrating an agreement at a level of a few percent. As a result of the extensive program of LBC testing, the highly successful SV imaging at long baselines achieved angular resolutions as fine as 19 mas at ∼350 GHz. Observing with ALMA on baselines of up to 15 km is now possible, and opens up new parameter space for submm astronomy.

  9. THE 2014 ALMA LONG BASELINE CAMPAIGN: AN OVERVIEW

    International Nuclear Information System (INIS)

    Partnership, ALMA; Fomalont, E. B.; Vlahakis, C.; Corder, S.; Remijan, A.; Barkats, D.; Dent, W. R. F.; Phillips, N.; Cox, P.; Hales, A. S.; Lucas, R.; Hunter, T. R.; Brogan, C. L.; Amestica, R.; Cotton, W.; Asaki, Y.; Matsushita, S.; Hills, R. E.; Richards, A. M. S.; Broguiere, D.

    2015-01-01

    A major goal of the Atacama Large Millimeter/submillimeter Array (ALMA) is to make accurate images with resolutions of tens of milliarcseconds, which at submillimeter (submm) wavelengths requires baselines up to ∼15 km. To develop and test this capability, a Long Baseline Campaign (LBC) was carried out from 2014 September to late November, culminating in end-to-end observations, calibrations, and imaging of selected Science Verification (SV) targets. This paper presents an overview of the campaign and its main results, including an investigation of the short-term coherence properties and systematic phase errors over the long baselines at the ALMA site, a summary of the SV targets and observations, and recommendations for science observing strategies at long baselines. Deep ALMA images of the quasar 3C 138 at 97 and 241 GHz are also compared to VLA 43 GHz results, demonstrating an agreement at a level of a few percent. As a result of the extensive program of LBC testing, the highly successful SV imaging at long baselines achieved angular resolutions as fine as 19 mas at ∼350 GHz. Observing with ALMA on baselines of up to 15 km is now possible, and opens up new parameter space for submm astronomy

  10. 76 FR 8725 - Notice of Baseline Filings

    Science.gov (United States)

    2011-02-15

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Notice of Baseline Filings Enstor Grama Ridge Storage and Docket No. PR10-97-002. Transportation, L.L.C.. EasTrans, LLC Docket No. PR10-30-001... revised baseline filing of their Statement of Operating Conditions for services provided under section 311...

  11. 76 FR 5797 - Notice of Baseline Filings

    Science.gov (United States)

    2011-02-02

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR10-114-001; Docket No. PR10-129-001; Docket No. PR10-131- 001; Docket No. PR10-68-002 Not Consolidated] Notice of Baseline... applicants listed above submitted a revised baseline filing of their Statement of Operating Conditions for...

  12. Detection of multiple AE signal by triaxial hodogram analysis; Sanjiku hodogram ho ni yoru taju acoustic emission no kenshutsu

    Energy Technology Data Exchange (ETDEWEB)

    Nagano, K; Yamashita, T [Muroran Institute of Technology, Hokkaido (Japan)

    1997-05-27

    In order to evaluate the dynamic behavior of underground cracks, detection and analysis of multiple acoustic emission (AE) events were attempted. A multiple AE is a phenomenon in which several AE signals, generated by underground cracks that develop within an extremely short time interval, are superimposed and observed as a single AE event. Consider a multiple AE signal consisting of two AE signals, where the second P-wave arrives before the first S-wave. When the first P-wave arrives, linear three-dimensional particle motion is observed, but the motion is then randomized by scattering and sensor characteristics. When the second P-wave arrives, linear particle motion is observed again, but it is superimposed on the existing signal, so the multiple AE event has a poor S/N ratio. The detection scheme declares a multiple AE event when three conditions are met: a condition on the coincidence of the time intervals of maxima in a scalogram analysis, a condition on the P-wave vibration direction, and a condition on the linearity of the particle motion. Seventy AE signals observed in the Kakkonda geothermal field were analyzed, and AE signals satisfying the multiple AE conditions were detected. However, further development of an analysis method with higher time resolution is required. 4 refs., 4 figs.
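    The linear-particle-motion condition can be illustrated with a standard rectilinearity measure from polarization analysis; this is our own simplification with invented data, not the authors' detector:

    ```python
    # Hodogram linearity sketch: eigen-analyse the 3-D covariance of a
    # particle-motion window; motion is near-linear when the largest
    # eigenvalue l1 dominates, e.g. L = 1 - (l2 + l3) / (2 * l1) close to 1.

    def covariance3(xs, ys, zs):
        n = len(xs)
        mx, my, mz = sum(xs) / n, sum(ys) / n, sum(zs) / n
        d = [[x - mx for x in xs], [y - my for y in ys], [z - mz for z in zs]]
        return [[sum(d[i][k] * d[j][k] for k in range(n)) / n
                 for j in range(3)] for i in range(3)]

    def largest_eigenvalue(C, iters=200):
        v = [1.0, 1.0, 1.0]
        for _ in range(iters):              # power iteration
            w = [sum(C[i][j] * v[j] for j in range(3)) for i in range(3)]
            norm = max(abs(c) for c in w) or 1.0
            v = [c / norm for c in w]
        Cv = [sum(C[i][j] * v[j] for j in range(3)) for i in range(3)]
        return sum(Cv[i] * v[i] for i in range(3)) / sum(c * c for c in v)

    def rectilinearity(xs, ys, zs):
        C = covariance3(xs, ys, zs)
        l1 = largest_eigenvalue(C)
        trace = C[0][0] + C[1][1] + C[2][2]  # trace = l1 + l2 + l3
        return 1.0 - (trace - l1) / (2.0 * l1)

    # Hypothetical window: motion along a fixed direction plus small noise.
    t = [i / 10.0 for i in range(50)]
    xs = [1.0 * u for u in t]
    ys = [0.5 * u for u in t]
    zs = [0.2 * u + 0.01 * ((i % 5) - 2) for i, u in enumerate(t)]
    L = rectilinearity(xs, ys, zs)           # near 1 for linear motion
    ```

    In a detector along the lines described, a window would pass the linearity condition when `L` exceeds some threshold, alongside the scalogram-timing and P-wave-direction conditions.
    
    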

  13. Office of Geologic Repositories program baseline procedures notebook (OGR/B-1)

    International Nuclear Information System (INIS)

    1986-06-01

    Baseline management is typically applied to aid in the internal control of a program by providing consistent programmatic direction, control, and surveillance to an evolving system development. This fundamental concept of internal program control involves the establishment of a baseline to serve as a point of departure for consistent technical program coordination and to control subsequent changes from that baseline. The existence of a program-authorized baseline ensures that all participants are working to the same ground rules. Baseline management also ensures that, once the baseline is defined, changes are assessed and approved by a process which ensures adequate consideration of overall program impact. Baseline management also includes the consideration of exemptions from the baseline. The process of baseline management continues through all the phases of an evolving system development program. As the Program proceeds, there will be a progressive increase in the data contained in the baseline documentation. Baseline management has been selected as a management technique to aid in the internal control of the Office of Geologic Repositories (OGR) program. Specifically, an OGR Program Baseline, including technical and programmatic requirements, is used for program control of the four Mined Geologic Disposal System field projects, i.e., Basalt Waste Isolation Project, Nevada Nuclear Waste Storage Investigation, Salt Repository Project and Crystalline Repository Project. This OGR Program Baseline Procedures Notebook provides a description of the baseline management concept, establishes the OGR Program baseline itself, and provides procedures to be followed for controlling changes to that baseline. The notebook has a controlled distribution and will be updated as required

  14. Baseline predictors of sputum culture conversion in pulmonary tuberculosis: importance of cavities, smoking, time to detection and W-Beijing genotype.

    Directory of Open Access Journals (Sweden)

    Marianne E Visser

    Time to detection (TTD) on automated liquid mycobacterial cultures is an emerging biomarker of tuberculosis outcomes. The M. tuberculosis W-Beijing genotype is spreading globally, indicating a selective advantage. There is a paucity of data on the association between baseline TTD and W-Beijing genotype and tuberculosis outcomes. To assess baseline predictors of failure of sputum culture conversion, within the first 2 months of antitubercular therapy, in participants with pulmonary tuberculosis. Between May 2005 and August 2008 we conducted a prospective cohort study of time to sputum culture conversion in ambulatory participants with first episodes of smear and culture positive pulmonary tuberculosis attending two primary care clinics in Cape Town, South Africa. Rifampicin resistance (diagnosed on phenotypic susceptibility testing) was an exclusion criterion. Sputum was collected weekly for 8 weeks for mycobacterial culture on liquid media (BACTEC MGIT 960). Due to missing data, multiple imputation was performed. Time to sputum culture conversion was analysed using a Cox-proportional hazards model. Bayesian model averaging determined the posterior effect probability for each variable. 113 participants were enrolled (30.1% female, 10.5% HIV-infected, 44.2% W-Beijing genotype, and 89% cavities). On Kaplan Meier analysis 50.4% of participants underwent sputum culture conversion by 8 weeks. The following baseline factors were associated with slower sputum culture conversion: TTD (adjusted hazard ratio (aHR) = 1.11, 95% CI 1.02; 1.2), lung cavities (aHR = 0.13, 95% CI 0.02; 0.95), ever smoking (aHR = 0.32, 95% CI 0.1; 1.02) and the W-Beijing genotype (aHR = 0.51, 95% CI 0.25; 1.07). On Bayesian model averaging, posterior probability effects were strong for TTD, lung cavitation and smoking and moderate for W-Beijing genotype. We found that baseline TTD, smoking, cavities and W-Beijing genotype were associated with delayed 2 month sputum culture

  15. Baseline predictors of sputum culture conversion in pulmonary tuberculosis: importance of cavities, smoking, time to detection and W-Beijing genotype.

    Science.gov (United States)

    Visser, Marianne E; Stead, Michael C; Walzl, Gerhard; Warren, Rob; Schomaker, Michael; Grewal, Harleen M S; Swart, Elizabeth C; Maartens, Gary

    2012-01-01

    Time to detection (TTD) on automated liquid mycobacterial cultures is an emerging biomarker of tuberculosis outcomes. The M. tuberculosis W-Beijing genotype is spreading globally, indicating a selective advantage. There is a paucity of data on the association between baseline TTD and W-Beijing genotype and tuberculosis outcomes. To assess baseline predictors of failure of sputum culture conversion, within the first 2 months of antitubercular therapy, in participants with pulmonary tuberculosis. Between May 2005 and August 2008 we conducted a prospective cohort study of time to sputum culture conversion in ambulatory participants with first episodes of smear and culture positive pulmonary tuberculosis attending two primary care clinics in Cape Town, South Africa. Rifampicin resistance (diagnosed on phenotypic susceptibility testing) was an exclusion criterion. Sputum was collected weekly for 8 weeks for mycobacterial culture on liquid media (BACTEC MGIT 960). Due to missing data, multiple imputation was performed. Time to sputum culture conversion was analysed using a Cox-proportional hazards model. Bayesian model averaging determined the posterior effect probability for each variable. 113 participants were enrolled (30.1% female, 10.5% HIV-infected, 44.2% W-Beijing genotype, and 89% cavities). On Kaplan Meier analysis 50.4% of participants underwent sputum culture conversion by 8 weeks. The following baseline factors were associated with slower sputum culture conversion: TTD (adjusted hazard ratio (aHR) = 1.11, 95% CI 1.02; 1.2), lung cavities (aHR = 0.13, 95% CI 0.02; 0.95), ever smoking (aHR = 0.32, 95% CI 0.1; 1.02) and the W-Beijing genotype (aHR = 0.51, 95% CI 0.25; 1.07). On Bayesian model averaging, posterior probability effects were strong for TTD, lung cavitation and smoking and moderate for W-Beijing genotype. We found that baseline TTD, smoking, cavities and W-Beijing genotype were associated with delayed 2 month sputum culture. Larger

  16. Accelerated Best Basis Inventory Baselining Task

    International Nuclear Information System (INIS)

    SASAKI, L.M.

    2001-01-01

    The baselining effort was recently proposed to bring the Best-Basis Inventory (BBI) and Question No.8 of the Tank Interpretive Report (TIR) for all 177 tanks to the current standards and protocols and to prepare a TIR Question No.8 if one is not already available. This plan outlines the objectives and methodology of the accelerated BBI baselining task. BBI baselining meetings held during December 2000 resulted in a revised BBI methodology and an initial set of BBI creation rules to be used in the baselining effort. The objectives of the BBI baselining effort are to: (1) Provide inventories that are consistent with the revised BBI methodology and new BBI creation rules. (2) Split the total tank waste in each tank into six waste phases, as appropriate (Supernatant, saltcake solids, saltcake liquid, sludge solids, sludge liquid, and retained gas). In some tanks, the solids and liquid portions of the sludge and/or saltcake may be combined into a single sludge or saltcake phase. (3) Identify sampling events that are to be used for calculating the BBIs. (4) Update waste volumes for subsequent reconciliation with the Hanlon (2001) waste tank summary. (5) Implement new waste type templates. (6) Include any sample data that might have been unintentionally omitted in the previous BBI and remove any sample data that should not have been included. Sample data to be used in the BBI must be available on TWINS. (7) Ensure that an inventory value for each standard BBI analyte is provided for each waste component. Sample based inventories for supplemental BBI analytes will be included when available. (8) Provide new means and confidence interval reports if one is not already available and include uncertainties in reporting inventory values

  17. AN INTERFEROMETRIC AND SPECTROSCOPIC ANALYSIS OF THE MULTIPLE STAR SYSTEM HD 193322

    International Nuclear Information System (INIS)

    Ten Brummelaar, Theo A.; Farrington, Christopher D.; Schaefer, Gail H.

    2011-01-01

    The star HD 193322 is a remarkable multiple system of massive stars that lies at the heart of the cluster Collinder 419. Here we report on new spectroscopic observations and radial velocities of the narrow-lined component Ab1, which we use to determine its orbital motion around a close companion Ab2 (P = 312 days) and around a distant third star Aa (P = 35 years). We have also obtained long-baseline interferometry of the target in the K' band with the CHARA Array, which we use in two ways. First, we combine published speckle interferometric measurements with CHARA separated fringe packet measurements to improve the visual orbit for the wide Aa,Ab binary. Second, we use measurements of the fringe packet from Aa to calibrate the visibility of the fringes of the Ab1,Ab2 binary, and we analyze these fringe visibilities to determine the visual orbit of the close system. The two most massive stars, Aa and Ab1, have masses of approximately 21 and 23 M☉, respectively, and their spectral line broadening indicates that they represent extremes of fast and slow projected rotational velocity, respectively.

  18. A mathematical analysis of multiple-target SELEX.

    Science.gov (United States)

    Seo, Yeon-Jung; Chen, Shiliang; Nilsen-Hamilton, Marit; Levine, Howard A

    2010-10-01

    SELEX (Systematic Evolution of Ligands by Exponential Enrichment) is a procedure by which a mixture of nucleic acids can be fractionated with the goal of identifying those with specific biochemical activities. One combines the mixture with a specific target molecule and allows binding reactions to take place. The target-NA complex is separated from the unbound NA by mechanical means (such as by filtration), the NA is eluted from the complex, amplified by PCR (polymerase chain reaction), and the process repeated. After several rounds, one should be left with the nucleic acids that best bind to the target. The problem was first formulated mathematically in Irvine et al. (J. Mol. Biol. 222:739-761, 1991). In Levine and Nilsen-Hamilton (Comput. Biol. Chem. 31:11-25, 2007), a mathematical analysis of the process was given. In Vant-Hull et al. (J. Mol. Biol. 278:579-597, 1998), multiple-target SELEX was considered. It was assumed that each target has a single nucleic acid binding site that permits occupation by no more than one nucleic acid. Here, we revisit Vant-Hull et al. (J. Mol. Biol. 278:579-597, 1998) using the same assumptions. The iteration scheme is shown to be convergent and a simplified algorithm is given. Our interest here is in the behavior of the multiple-target SELEX process as a discrete "time" dynamical system. Our goal is to characterize the limiting states and their dependence on the initial distribution of nucleic acid and target fraction components. (In multiple-target SELEX, we regard the target component fractions, but not their concentrations, as fixed, and the initial pool of nucleic acids as a variable starting condition.) Given N nucleic acids and a target consisting of M subtarget component species, there is an M × N matrix of affinities, the (i,j) entry corresponding to the affinity of the jth nucleic acid for the ith subtarget. We give a structure condition on this matrix that is equivalent to the following
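    The round-by-round iteration described above can be caricatured in code. The sketch below is a deliberately simplified low-saturation model (capture proportional to affinity, followed by a renormalizing "PCR" step), not the paper's full mass-action treatment; the affinity matrix and pool fractions are invented:

```python
def selex_round(pool, affinity, target_frac):
    """One idealized round of multiple-target SELEX.

    pool:        fractions of each nucleic acid (NA) species, summing to 1
    affinity:    affinity[i][j] = affinity of NA j for subtarget i
    target_frac: fraction of each subtarget species, summing to 1
    In this low-saturation caricature, the amount of NA j retained is
    proportional to pool[j] times its average affinity over the subtargets;
    amplification then renormalizes the pool to sum to 1.
    """
    score = [sum(f * row[j] for f, row in zip(target_frac, affinity))
             for j in range(len(pool))]
    captured = [p * s for p, s in zip(pool, score)]
    total = sum(captured)
    return [c / total for c in captured]

# Two subtargets, three nucleic acids: NA 2 binds both subtargets well.
K = [[1.0, 0.2, 2.0],
     [0.1, 1.5, 2.5]]
pool = [1 / 3, 1 / 3, 1 / 3]
for _ in range(25):
    pool = selex_round(pool, K, [0.5, 0.5])
print([round(p, 3) for p in pool])  # pool concentrates on NA 2
```

    In this simplified dynamical system the limiting state is the NA with the largest target-averaged affinity, which is the flavor of limiting behavior the abstract sets out to characterize rigorously.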

  19. Integrated Baseline System (IBS) Version 1.03: Utilities guide

    Energy Technology Data Exchange (ETDEWEB)

    Burford, M.J.; Downing, T.R.; Pottier, M.C.; Schrank, E.E.; Williams, J.R.

    1993-01-01

    The Integrated Baseline System (IBS) is an emergency management planning and analysis tool that was developed under the direction of the Federal Emergency Management Agency (FEMA). This Utilities Guide explains how to operate utility programs that are supplied as a part of the IBS. These utility programs are chiefly for managing and manipulating various kinds of IBS data and system administration files. Many of the utilities are for creating, editing, converting, or displaying map data and other data that are related to geographic location.

  20. Integrated Baseline System (IBS) Version 2.0: Utilities Guide

    Energy Technology Data Exchange (ETDEWEB)

    Burford, M.J.; Downing, T.R.; Williams, J.R. [Pacific Northwest Lab., Richland, WA (United States); Bower, J.C. [Bower Software Services, Kennewick, WA (United States)

    1994-03-01

    The Integrated Baseline System (IBS) is an emergency management planning and analysis tool being developed under the direction of the US Army Nuclear and Chemical Agency. This Utilities Guide explains how you can use the IBS utility programs to manage and manipulate various kinds of IBS data. These programs include utilities for creating, editing, and displaying maps and other data that are referenced to geographic location. The intended audience for this document is chiefly data managers, but also system managers and some emergency management planners and analysts.

  1. Multiple predictor smoothing methods for sensitivity analysis: Description of techniques

    International Nuclear Information System (INIS)

    Storlie, Curtis B.; Helton, Jon C.

    2008-01-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (i) locally weighted regression (LOESS), (ii) additive models, (iii) projection pursuit regression, and (iv) recursive partitioning regression. Then, in the second and concluding part of this presentation, the indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present
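    As an illustration of the nonparametric regression machinery behind these procedures, the following is a minimal locally weighted linear regression (the core idea of LOESS) with tricube weights. It is a sketch for intuition under invented data, not the authors' implementation:

```python
import math

def loess_point(x0, xs, ys, frac=0.5):
    """Locally weighted linear regression (the LOESS building block) at x0.

    Fits y = a + b*x by weighted least squares, with tricube weights over
    roughly the nearest `frac` proportion of the data, and returns the
    fitted value at x0.
    """
    n = len(xs)
    k = max(2, int(round(frac * n)))
    # bandwidth = distance to the k-th nearest neighbour of x0
    dists = sorted(abs(x - x0) for x in xs)
    h = dists[k - 1] or 1e-12
    w = [max(0.0, 1 - (abs(x - x0) / h) ** 3) ** 3 for x in xs]
    # closed-form weighted least squares for a line
    sw = sum(w)
    swx = sum(wi * x for wi, x in zip(w, xs))
    swy = sum(wi * y for wi, y in zip(w, ys))
    swxx = sum(wi * x * x for wi, x in zip(w, xs))
    swxy = sum(wi * x * y for wi, x, y in zip(w, xs, ys))
    denom = sw * swxx - swx ** 2
    b = (sw * swxy - swx * swy) / denom
    a = (swy - b * swx) / sw
    return a + b * x0

xs = [i / 10 for i in range(21)]   # 0.0 .. 2.0
ys = [x ** 2 for x in xs]          # smooth nonlinear response
v = loess_point(1.0, xs, ys)
print(round(v, 3))
```

    Because the fit is local, the smoother tracks the nonlinear input-output relationship that a global linear regression (and hence a linear-regression-based sensitivity analysis) would miss.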

  2. The influence of baseline marijuana use on treatment of cocaine dependence: application of an informative-priors Bayesian approach.

    Directory of Open Access Journals (Sweden)

    Charles Green

    2012-10-01

    Background: Marijuana use is prevalent among patients with cocaine dependence and often non-exclusionary in clinical trials of potential cocaine medications. The dual focus of this study was to (1) examine the moderating effect of baseline marijuana use on response to treatment with levodopa/carbidopa for cocaine dependence; and (2) apply an informative-priors Bayesian approach for estimating the probability of a subgroup-by-treatment interaction effect. Method: A secondary data analysis of two previously published, double-blind, randomized controlled trials provided samples for the historical dataset (Study 1: N = 64 complete observations) and current dataset (Study 2: N = 113 complete observations). Negative binomial regression evaluated Treatment Effectiveness Scores (TES) as a function of medication condition (levodopa/carbidopa vs placebo), baseline marijuana use (days in past 30), and their interaction. Results: Bayesian analysis indicated that there was a 96% chance that baseline marijuana use predicts differential response to treatment with levodopa/carbidopa. Simple effects indicated that among participants receiving levodopa/carbidopa the probability that baseline marijuana use confers harm, in terms of reducing TES, was 0.981; whereas the probability that marijuana confers harm within the placebo condition was 0.163. For every additional day of marijuana use reported at baseline, participants in the levodopa/carbidopa condition demonstrated a 5.4% decrease in TES, while participants in the placebo condition demonstrated a 4.9% increase in TES. Conclusion: The potential moderating effect of marijuana on cocaine treatment response should be considered in future trial designs. Applying Bayesian subgroup analysis proved informative in characterizing this patient-treatment interaction effect.
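    The informative-priors idea (borrowing strength from the historical Study 1 when analyzing Study 2) can be sketched with a conjugate normal-normal update on an interaction coefficient. All numbers below are hypothetical, and the actual analysis used negative binomial regression rather than this simplification:

```python
import math

def posterior_normal(prior_mean, prior_sd, est, se):
    """Conjugate normal-normal update: an informative prior from a
    historical study combined (precision-weighted) with the current
    study's estimate and standard error."""
    w0, w1 = 1 / prior_sd ** 2, 1 / se ** 2
    post_var = 1 / (w0 + w1)
    post_mean = post_var * (w0 * prior_mean + w1 * est)
    return post_mean, math.sqrt(post_var)

def prob_below_zero(mean, sd):
    """P(coefficient < 0) under a normal posterior, i.e. the posterior
    probability that the subgroup-by-treatment interaction is harmful."""
    return 0.5 * (1 + math.erf((0 - mean) / (sd * math.sqrt(2))))

# Hypothetical estimates of the marijuana-by-medication interaction:
# Study 1 (historical, the prior) and Study 2 (current, the likelihood).
m, s = posterior_normal(prior_mean=-0.04, prior_sd=0.03,
                        est=-0.05, se=0.025)
p_harm = prob_below_zero(m, s)
print(round(p_harm, 3))
```

    The posterior standard deviation is smaller than either input, which is the whole point of the informative-priors approach: the historical trial sharpens the inference about the interaction in the current trial.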

  3. 75 FR 70732 - Notice of Baseline Filings

    Science.gov (United States)

    2010-11-18

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR11-71-000; Docket No. PR11-72-000; Docket No. PR11-73- 000] Notice of Baseline Filings November 10, 2010. Docket No. PR11-71-000..., 2010, the applicants listed above submitted their baseline filing of their Statement of Operating...

  4. Physics Potential of Long-Baseline Experiments

    Directory of Open Access Journals (Sweden)

    Sanjib Kumar Agarwalla

    2014-01-01

    The discovery of neutrino mixing and oscillations over the past decade provides firm evidence for new physics beyond the Standard Model. Recently, θ13 has been determined to be moderately large, quite close to its previous upper bound. This represents a significant milestone in establishing the three-flavor oscillation picture of neutrinos. It has opened up exciting prospects for current and future long-baseline neutrino oscillation experiments towards addressing the remaining fundamental questions, in particular the type of the neutrino mass hierarchy and the possible presence of a CP-violating phase. Another recent and crucial development is the indication of non-maximal 2-3 mixing angle, causing the octant ambiguity of θ23. In this paper, I will review the phenomenology of long-baseline neutrino oscillations with a special emphasis on sub-leading three-flavor effects, which will play a crucial role in resolving these unknowns. First, I will give a brief description of neutrino oscillation phenomenon. Then, I will discuss our present global understanding of the neutrino mass-mixing parameters and will identify the major unknowns in this sector. After that, I will present the physics reach of current generation long-baseline experiments. Finally, I will conclude with a discussion on the physics capabilities of accelerator-driven possible future long-baseline precision oscillation facilities.
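    The oscillation phenomenon reviewed above is often summarized by the standard two-flavor vacuum formula P = sin²(2θ) sin²(1.267 Δm² L/E). The sketch below evaluates it with roughly atmospheric-sector parameters; it omits the three-flavor and matter effects that the review emphasizes:

```python
import math

def osc_prob(theta, dm2_eV2, L_km, E_GeV):
    """Two-flavor vacuum oscillation probability,
    P = sin^2(2*theta) * sin^2(1.267 * dm2 * L / E),
    with dm2 in eV^2, baseline L in km, energy E in GeV
    (the standard engineering units for the 1.267 prefactor)."""
    return (math.sin(2 * theta) ** 2 *
            math.sin(1.267 * dm2_eV2 * L_km / E_GeV) ** 2)

# Roughly atmospheric-sector numbers: near-maximal mixing,
# dm2 ~ 2.4e-3 eV^2, a 295 km baseline, 0.6 GeV neutrinos.
p = osc_prob(theta=math.pi / 4, dm2_eV2=2.4e-3, L_km=295, E_GeV=0.6)
print(round(p, 3))
```

    The L/E dependence in the formula is what makes the choice of baseline length central to an experiment's sensitivity, as the abstract's discussion of long-baseline facilities reflects.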

  5. Active neutron multiplicity analysis and Monte Carlo calculations

    International Nuclear Information System (INIS)

    Krick, M.S.; Ensslin, N.; Langner, D.G.; Miller, M.C.; Siebelist, R.; Stewart, J.E.; Ceo, R.N.; May, P.K.; Collins, L.L. Jr

    1994-01-01

    Active neutron multiplicity measurements of high-enrichment uranium metal and oxide samples have been made at Los Alamos and Y-12. The data from the measurements of standards at Los Alamos were analyzed to obtain values for neutron multiplication and source-sample coupling. These results are compared to equivalent results obtained from Monte Carlo calculations. An approximate relationship between coupling and multiplication is derived and used to correct doubles rates for multiplication and coupling. The utility of singles counting for uranium samples is also examined

  6. Precise baseline determination for the TanDEM-X mission

    Science.gov (United States)

    Koenig, Rolf; Moon, Yongjin; Neumayer, Hans; Wermuth, Martin; Montenbruck, Oliver; Jäggi, Adrian

    The TanDEM-X mission will strive to generate a precise global Digital Elevation Model (DEM) by way of bi-static SAR in a close formation of the TerraSAR-X satellite, already launched on June 15, 2007, and the TanDEM-X satellite to be launched in May 2010. Both satellites carry the Tracking, Occultation and Ranging (TOR) payload supplied by the GFZ German Research Centre for Geosciences. The TOR consists of a high-precision dual-frequency GPS receiver, called Integrated GPS Occultation Receiver (IGOR), and a Laser retro-reflector (LRR) for precise orbit determination (POD) and atmospheric sounding. The IGOR is of vital importance for the TanDEM-X mission objectives, as the millimeter-level determination of the baseline or distance between the two spacecraft is needed to derive meter-level accurate DEMs. Within the TanDEM-X ground segment, GFZ is responsible for the operational provision of precise baselines. For this GFZ uses two software chains, first its Earth Parameter and Orbit System (EPOS) software and second the BERNESE software, for backup purposes and quality control. In a concerted effort, the German Aerospace Center (DLR) also generates precise baselines independently with a dedicated Kalman filter approach realized in its FRNS software. Using GRACE as an example, the generation of baselines with millimeter accuracy from on-board GPS data can be validated directly by comparing them to the intersatellite K-band range measurements. The K-band ranges are accurate down to the micrometer level and therefore may be considered as truth. Both TanDEM-X baseline providers are able to generate GRACE baselines with sub-millimeter accuracy. By merging the independent baselines by GFZ and DLR, the accuracy can even be increased. The K-band validation, however, covers solely the along-track component, as the K-band data measure just the distance between the two GRACE satellites. In addition, they exhibit an unknown bias which must be modelled in the comparison, so the

  7. Advanced three-dimensional thermal modeling of a baseline spent fuel repository

    International Nuclear Information System (INIS)

    Altenbach, T.J.; Lowry, W.E.

    1980-01-01

    A three-dimensional thermal analysis using finite difference techniques was performed to determine the near-field response of a baseline spent fuel repository in a deep geologic salt medium. A baseline design incorporates previous thermal modeling experience and OWI recommendations for areal thermal loading in specifying the waste form properties, package details, and emplacement configuration. The base case in this thermal analysis considers one 10-year-old PWR spent fuel assembly emplaced to yield a 36 kW/acre (8.9 W/m²) loading. A unit cell model in an infinite array is used to simplify the problem and provide upper-bound temperatures. Boundary conditions are imposed which allow simulations to 1000 years. Variations studied include a comparison of ventilated and unventilated storage room conditions, emplacement packages with and without air gaps surrounding the canister, and room cool-down scenarios with ventilation following an unventilated state for retrieval purposes. It was found that at this low power level, ventilating the emplacement room has an immediate cooling influence on the canister and effectively maintains the emplacement room floor near the temperature of the ventilating air
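    The study used three-dimensional finite differences; as a one-dimensional sketch of the same technique, the explicit (FTCS) update below diffuses heat from a hot canister wall into a salt column. Geometry, thermal diffusivity, and temperatures are illustrative placeholders, not the baseline repository values:

```python
def ftcs_step(T, alpha, dx, dt):
    """One explicit (FTCS) finite-difference step of 1-D heat conduction,
    dT/dt = alpha * d2T/dx2, with fixed-temperature boundary nodes.
    The explicit scheme is stable only when alpha*dt/dx**2 <= 0.5."""
    r = alpha * dt / dx ** 2
    assert r <= 0.5, "explicit scheme unstable for this dt/dx"
    return ([T[0]] +
            [T[i] + r * (T[i + 1] - 2 * T[i] + T[i - 1])
             for i in range(1, len(T) - 1)] +
            [T[-1]])

# Hot canister wall (200 C) at one end of a salt column held at 40 C:
# 10 nodes spanning roughly 1 m, dx = 0.1 m, alpha ~ 3e-6 m^2/s.
T = [200.0] + [40.0] * 9
for _ in range(500):                 # 500 steps of 1000 s each
    T = ftcs_step(T, alpha=3e-6, dx=0.1, dt=1000.0)
print([round(t, 1) for t in T])
```

    The interior nodes warm monotonically toward the linear steady-state profile between the two fixed boundaries; a production analysis adds the third dimension, temperature-dependent salt properties, and the ventilation boundary conditions described in the abstract.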

  8. Baseline Response Levels Are a Nuisance in Infant Contingency Learning

    Science.gov (United States)

    Millar, W. S.; Weir, Catherine

    2015-01-01

    The impact of differences in level of baseline responding on contingency learning in the first year was examined by considering the response acquisition of infants classified into baseline response quartiles. Whereas the three lower baseline groups showed the predicted increment in responding to a contingency, the highest baseline responders did…

  9. Revealing Pathway Dynamics in Heart Diseases by Analyzing Multiple Differential Networks.

    Directory of Open Access Journals (Sweden)

    Xiaoke Ma

    2015-06-01

    Development of heart diseases is driven by dynamic changes in both the activity and connectivity of gene pathways. Understanding these dynamic events is critical for understanding pathogenic mechanisms and development of effective treatment. Currently, there is a lack of computational methods that enable analysis of multiple gene networks, each of which exhibits differential activity compared to the network of the baseline/healthy condition. We describe the iMDM algorithm to identify both unique and shared gene modules across multiple differential co-expression networks, termed M-DMs (multiple differential modules). We applied iMDM to a time-course RNA-Seq dataset generated from a murine heart failure model in two genotypes. We showed that iMDM achieves higher accuracy in inferring gene modules compared to using single or multiple co-expression networks. We found that condition-specific M-DMs exhibit differential activities, mediate different biological processes, and are enriched for genes with known cardiovascular phenotypes. By analyzing M-DMs that are present in multiple conditions, we revealed dynamic changes in pathway activity and connectivity across heart failure conditions. We further showed that module dynamics were correlated with the dynamics of disease phenotypes during the development of heart failure. Thus, pathway dynamics is a powerful measure for understanding pathogenesis. iMDM provides a principled way to dissect the dynamics of gene pathways and its relationship to the dynamics of disease phenotype. With the exponential growth of omics data, our method can aid in generating systems-level insights into disease progression.

  10. IPCC Socio-Economic Baseline Dataset

    Data.gov (United States)

    National Aeronautics and Space Administration — The Intergovernmental Panel on Climate Change (IPCC) Socio-Economic Baseline Dataset consists of population, human development, economic, water resources, land...

  11. Analysis of multiple spurious actuations and associated circuits in Cofrentes; Analisis de espurios multiples y circuitos asociados en C.N. Cofrentes

    Energy Technology Data Exchange (ETDEWEB)

    Molina, J. J.; Celaya, M. A.

    2015-07-01

    The article describes the process followed by the Cofrentes Nuclear Power Plant (CNC) to conduct the analysis of multiple spurious actuations in compliance with regulatory standards IS-30 Rev. 1 and CSN Safety Guide 1.19, based on the recommendations of NEI 00-01, Guidance for Post-Fire Safe Shutdown Circuit Analysis, and NUREG/CR-6850, Fire PRA Methodology for Nuclear Power Facilities. (Author)

  12. Causal mediation analysis with multiple mediators in the presence of treatment noncompliance.

    Science.gov (United States)

    Park, Soojin; Kürüm, Esra

    2018-05-20

    Randomized experiments are often complicated because of treatment noncompliance. This challenge prevents researchers from identifying the mediated portion of the intention-to-treat (ITT) effect, which is the effect of the assigned treatment that is attributed to a mediator. One solution suggests identifying the mediated ITT effect on the basis of the average causal mediation effect among compliers when there is a single mediator. However, considering the complex nature of the mediating mechanisms, it is natural to assume that there are multiple variables that mediate through the causal path. Motivated by an empirical analysis of a data set collected in a randomized interventional study, we develop a method to estimate the mediated portion of the ITT effect when both multiple dependent mediators and treatment noncompliance exist. This enables researchers to make an informed decision on how to strengthen the intervention effect by identifying relevant mediators despite treatment noncompliance. We propose a nonparametric estimation procedure and provide a sensitivity analysis for key assumptions. We conduct a Monte Carlo simulation study to assess the finite sample performance of the proposed approach. The proposed method is illustrated by an empirical analysis of JOBS II data, in which a job training intervention was used to prevent mental health deterioration among unemployed individuals. Copyright © 2018 John Wiley & Sons, Ltd.

  13. Baseline methodologies for clean development mechanism projects

    Energy Technology Data Exchange (ETDEWEB)

    Lee, M.K. (ed.); Shrestha, R.M.; Sharma, S.; Timilsina, G.R.; Kumar, S.

    2005-11-15

    The Kyoto Protocol and the Clean Development Mechanism (CDM) came into force on 16th February 2005 with its ratification by Russia. The increasing momentum of this process is reflected in more than 100 projects having been submitted to the CDM Executive Board (CDM-EB) for approval of the baselines and monitoring methodologies, which is the first step in developing and implementing CDM projects. A CDM project should result in a net decrease of GHG emissions below any level that would have resulted from other activities implemented in the absence of that CDM project. The 'baseline' defines the GHG emissions of activities that would have been implemented in the absence of a CDM project. The baseline methodology is the process/algorithm for establishing that baseline. The baseline, along with the baseline methodology, are thus the most critical elements of any CDM project towards meeting the important criteria of CDM, which are that a CDM project should result in 'real, measurable, and long term benefits related to the mitigation of climate change'. This guidebook is produced within the framework of the United Nations Environment Programme (UNEP) facilitated 'Capacity Development for the Clean Development Mechanism (CD4CDM)' project. This document is published as part of the project's effort to develop guidebooks that cover important issues such as project finance, sustainability impacts, legal framework and institutional framework. These materials are aimed at helping stakeholders better understand the CDM and are believed to eventually contribute to maximizing the effect of the CDM in achieving the ultimate goal of the UNFCCC and its Kyoto Protocol. This Guidebook should be read in conjunction with the information provided in the two other guidebooks entitled 'Clean Development Mechanism: Introduction to the CDM' and 'CDM Information and Guidebook' developed under the CD4CDM project. (BA)

  14. Baseline methodologies for clean development mechanism projects

    International Nuclear Information System (INIS)

    Lee, M.K.; Shrestha, R.M.; Sharma, S.; Timilsina, G.R.; Kumar, S.

    2005-11-01

    The Kyoto Protocol and the Clean Development Mechanism (CDM) came into force on 16th February 2005 with its ratification by Russia. The increasing momentum of this process is reflected in more than 100 projects having been submitted to the CDM Executive Board (CDM-EB) for approval of the baselines and monitoring methodologies, which is the first step in developing and implementing CDM projects. A CDM project should result in a net decrease of GHG emissions below any level that would have resulted from other activities implemented in the absence of that CDM project. The 'baseline' defines the GHG emissions of activities that would have been implemented in the absence of a CDM project. The baseline methodology is the process/algorithm for establishing that baseline. The baseline, along with the baseline methodology, are thus the most critical elements of any CDM project towards meeting the important criteria of CDM, which are that a CDM project should result in 'real, measurable, and long term benefits related to the mitigation of climate change'. This guidebook is produced within the framework of the United Nations Environment Programme (UNEP) facilitated 'Capacity Development for the Clean Development Mechanism (CD4CDM)' project. This document is published as part of the project's effort to develop guidebooks that cover important issues such as project finance, sustainability impacts, legal framework and institutional framework. These materials are aimed at helping stakeholders better understand the CDM and are believed to eventually contribute to maximizing the effect of the CDM in achieving the ultimate goal of the UNFCCC and its Kyoto Protocol. This Guidebook should be read in conjunction with the information provided in the two other guidebooks entitled 'Clean Development Mechanism: Introduction to the CDM' and 'CDM Information and Guidebook' developed under the CD4CDM project. (BA)

  15. Baseline budgeting for continuous improvement.

    Science.gov (United States)

    Kilty, G L

    1999-05-01

    This article is designed to introduce the techniques used to convert traditionally maintained department budgets to baseline budgets. This entails identifying key activities, evaluating them for value added, and implementing continuous improvement opportunities. Baseline Budgeting for Continuous Improvement was created as a result of a newly appointed company president's request to implement zero-based budgeting. The president was frustrated with the mind-set of the organization, namely, "Next year's budget should be 10 to 15 percent more than this year's spending." Zero-based budgeting was not the answer, but combining the principles of activity-based costing and the Just-in-Time philosophy of eliminating waste and continuous improvement did provide a solution to the problem.

  16. Possible Lingering Effects of Multiple Past Concussions

    Directory of Open Access Journals (Sweden)

    Grant L. Iverson

    2012-01-01

    Background. The literature on lingering or “cumulative” effects of multiple concussions is mixed. The purpose of this study was to examine whether athletes with a history of three or more concussions perform more poorly on neuropsychological testing or report more subjective symptoms during a baseline, preseason evaluation. Hypothesis. Athletes reporting three or more past concussions would perform more poorly on preseason neurocognitive testing. Study Design. Case-control study. Methods. An archival database including 786 male athletes who underwent preseason testing with a computerized battery (ImPACT) was used to select the participants. Twenty-six athletes, between the ages of 17 and 22 with a history of three or more concussions, were identified. Athletes with no history of concussion were matched, in a case-control fashion, on age, education, self-reported ADHD, school, sport, and, when possible, playing position and self-reported academic problems. Results. The two groups were compared on the four neuropsychological composite scores from ImPACT using multivariate analysis of variance followed by univariate ANOVAs. MANOVA revealed no overall significant effect. Exploratory ANOVAs were conducted using Verbal Memory, Visual Memory, Reaction Time, Processing Speed, and Postconcussion Scale composite scores as dependent variables. There was a significant effect for only the Verbal Memory composite. Conclusions. Although inconclusive, the results suggest that some athletes with multiple concussions could have lingering memory deficits.

  17. Prediction of hearing outcomes by multiple regression analysis in patients with idiopathic sudden sensorineural hearing loss.

    Science.gov (United States)

    Suzuki, Hideaki; Tabata, Takahisa; Koizumi, Hiroki; Hohchi, Nobusuke; Takeuchi, Shoko; Kitamura, Takuro; Fujino, Yoshihisa; Ohbuchi, Toyoaki

    2014-12-01

    This study aimed to create a multiple regression model for predicting hearing outcomes of idiopathic sudden sensorineural hearing loss (ISSNHL). The participants were 205 consecutive patients (205 ears) with ISSNHL (hearing level ≥ 40 dB, interval between onset and treatment ≤ 30 days). They received systemic steroid administration combined with intratympanic steroid injection. Data were examined by simple and multiple regression analyses. Three hearing indices (percentage hearing improvement, hearing gain, and posttreatment hearing level [HLpost]) and 7 prognostic factors (age, days from onset to treatment, initial hearing level, initial hearing level at low frequencies, initial hearing level at high frequencies, presence of vertigo, and contralateral hearing level) were included in the multiple regression analysis as dependent and explanatory variables, respectively. In the simple regression analysis, the percentage hearing improvement, hearing gain, and HLpost showed significant correlation with 2, 5, and 6 of the 7 prognostic factors, respectively. The multiple correlation coefficients were 0.396, 0.503, and 0.714 for the percentage hearing improvement, hearing gain, and HLpost, respectively. Predicted values of HLpost calculated by the multiple regression equation were reliable with 70% probability using a 40-dB-wide prediction interval. Prediction of HLpost by the multiple regression model may be useful for estimating the hearing prognosis of ISSNHL. © The Author(s) 2014.
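    A multiple regression model of the kind described can be fitted by ordinary least squares via the normal equations. The sketch below uses invented toy data (generated from known coefficients so the fit can be checked), not the study's 205-patient dataset; the two predictors stand in for prognostic factors such as initial hearing level and age:

```python
def fit_linear(X, y):
    """Ordinary least squares via the normal equations (X'X)b = X'y,
    solved with Gaussian elimination. Rows of X are observations; an
    intercept column of 1s is prepended, so coef[0] is the intercept."""
    rows = [[1.0] + list(r) for r in X]
    p = len(rows[0])
    A = [[sum(r[i] * r[j] for r in rows) for j in range(p)] for i in range(p)]
    b = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(p)]
    # forward elimination with partial pivoting
    for k in range(p):
        piv = max(range(k, p), key=lambda i: abs(A[i][k]))
        A[k], A[piv] = A[piv], A[k]
        b[k], b[piv] = b[piv], b[k]
        for i in range(k + 1, p):
            f = A[i][k] / A[k][k]
            for j in range(k, p):
                A[i][j] -= f * A[k][j]
            b[i] -= f * b[k]
    # back substitution
    coef = [0.0] * p
    for i in reversed(range(p)):
        coef[i] = (b[i] - sum(A[i][j] * coef[j]
                              for j in range(i + 1, p))) / A[i][i]
    return coef

# Toy data: y = 2 + 0.5*x1 + 0.1*x2 exactly, so OLS recovers the
# generating coefficients [2.0, 0.5, 0.1].
X = [[60, 40], [70, 55], [80, 60], [90, 70], [55, 35], [75, 50]]
y = [2 + 0.5 * a + 0.1 * b for a, b in X]
coef = fit_linear(X, y)
print([round(c, 3) for c in coef])
```

    With real data the fit is not exact, and the residual spread is what produces the prediction interval reported in the abstract.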

  18. Poor Baseline Pulmonary Function May Not Increase the Risk of Radiation-Induced Lung Toxicity

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Jingbo [Department of Radiation Oncology, University of Michigan/Ann Arbor Veterans Health System, Ann Arbor, Michigan (United States); Department of Radiation Oncology, Cancer Hospital, Chinese Academic Medical Sciences and Peking Union Medical College, Beijing (China); Cao, Jianzhong [Department of Radiation Oncology, Cancer Hospital, Chinese Academic Medical Sciences and Peking Union Medical College, Beijing (China); Yuan, Shuanghu [Department of Radiation Oncology, University of Michigan/Ann Arbor Veterans Health System, Ann Arbor, Michigan (United States); Ji, Wei [Department of Radiation Oncology, Cancer Hospital, Chinese Academic Medical Sciences and Peking Union Medical College, Beijing (China); Arenberg, Douglas [Department of Internal Medicine, University of Michigan/Ann Arbor Veterans Health System, Ann Arbor, Michigan (United States); Dai, Jianrong [Department of Radiation Oncology, Cancer Hospital, Chinese Academic Medical Sciences and Peking Union Medical College, Beijing (China); Stanton, Paul; Tatro, Daniel; Ten Haken, Randall K. [Department of Radiation Oncology, University of Michigan/Ann Arbor Veterans Health System, Ann Arbor, Michigan (United States); Wang, Luhua, E-mail: wlhwq@yahoo.com [Department of Radiation Oncology, Cancer Hospital, Chinese Academic Medical Sciences and Peking Union Medical College, Beijing (China); Kong, Feng-Ming, E-mail: fengkong@med.umich.edu [Department of Radiation Oncology, University of Michigan/Ann Arbor Veterans Health System, Ann Arbor, Michigan (United States)

    2013-03-01

    Purpose: Poor pulmonary function (PF) is often considered a contraindication to definitive radiation therapy for lung cancer. This study investigated whether baseline PF was associated with radiation-induced lung toxicity (RILT) in patients with non-small cell lung cancer (NSCLC) receiving conformal radiation therapy (CRT). Methods and Materials: NSCLC patients treated with CRT and tested for PF at baseline were eligible. Baseline predicted values of forced expiratory volume in 1 sec (FEV1), forced vital capacity (FVC), and diffusion capacity of lung for carbon monoxide (DLCO) were analyzed. Additional factors, including age, gender, smoking status, Karnofsky performance status, coexisting chronic obstructive pulmonary disease (COPD), tumor location, histology, concurrent chemotherapy, radiation dose, and mean lung dose (MLD), were evaluated for RILT. The primary endpoint was symptomatic RILT (SRILT), including grade ≥2 radiation pneumonitis and fibrosis. Results: There was a total of 260 patients, and SRILT occurred in 58 (22.3%) of them. Mean FEV1 values for SRILT and non-SRILT patients were 71.7% and 65.9% (P=.077). Under univariate analysis, risk of SRILT increased with MLD (P=.008), the absence of COPD (P=.047), and FEV1 (P=.077). Age (dichotomized at 65 years) and MLD were significantly associated with SRILT in multivariate analysis. The addition of FEV1 and age to the MLD-based model slightly improved the predictability of SRILT (area under the curve increased from 0.63 to 0.70, P=.088). Conclusions: Poor baseline PF does not increase the risk of SRILT, and combining FEV1, age, and MLD may improve the predictive ability.

  19. Direct integration multiple collision integral transport analysis method for high energy fusion neutronics

    International Nuclear Information System (INIS)

    Koch, K.R.

    1985-01-01

    A new analysis method specially suited to the inherent difficulties of fusion neutronics was developed to provide detailed studies of fusion neutron transport physics. These studies should provide a better understanding of the limitations and accuracies of typical fusion neutronics calculations. The new analysis method is based on direct integration of the integral form of the neutron transport equation and employs a continuous energy formulation with exact treatment of the energy-angle kinematics of the scattering process. In addition, the overall solution is analyzed in terms of uncollided, once-collided, and multi-collided solution components based on a multiple collision treatment. Furthermore, the numerical evaluations of integrals use quadrature schemes that are based on the actual dependencies exhibited in the integrands. The new DITRAN computer code was developed on the Cyber 205 vector supercomputer to implement this direct integration multiple-collision fusion neutronics analysis. Three representative fusion reactor models were devised and the solutions to these problems were studied to provide suitable choices for the numerical quadrature orders as well as the discretized solution grid, and to understand the limitations of the new analysis method. As further verification, and as a first step in assessing the accuracy of existing fusion-neutronics calculations, solutions obtained using the new analysis method were compared to typical multigroup discrete ordinates calculations.

  20. Safe and sensible preprocessing and baseline correction of pupil-size data.

    Science.gov (United States)

    Mathôt, Sebastiaan; Fabius, Jasper; Van Heusden, Elle; Van der Stigchel, Stefan

    2018-02-01

    Measurement of pupil size (pupillometry) has recently gained renewed interest from psychologists, but there is little agreement on how pupil-size data is best analyzed. Here we focus on one aspect of pupillometric analyses: baseline correction, i.e., analyzing changes in pupil size relative to a baseline period. Baseline correction is useful in experiments that investigate the effect of some experimental manipulation on pupil size. In such experiments, baseline correction improves statistical power by taking into account random fluctuations in pupil size over time. However, we show that baseline correction can also distort data if unrealistically small pupil sizes are recorded during the baseline period, which can easily occur due to eye blinks, data loss, or other distortions. Divisive baseline correction (corrected pupil size = pupil size/baseline) is affected more strongly by such distortions than subtractive baseline correction (corrected pupil size = pupil size - baseline). We discuss the role of baseline correction as a part of preprocessing of pupillometric data, and make five recommendations: (1) before baseline correction, perform data preprocessing to mark missing and invalid data, but assume that some distortions will remain in the data; (2) use subtractive baseline correction; (3) visually compare your corrected and uncorrected data; (4) be wary of pupil-size effects that emerge faster than the latency of the pupillary response allows (within ±220 ms after the manipulation that induces the effect); and (5) remove trials on which baseline pupil size is unrealistically small (indicative of blinks and other distortions).
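    The two correction rules quoted in the abstract (corrected = pupil size - baseline vs. corrected = pupil size/baseline) and the blink-distortion point can be made concrete with a small sketch; the traces below are hypothetical values in arbitrary units.

    ```python
    import numpy as np

    def baseline_correct(trace, baseline_window, method="subtractive"):
        """Baseline-correct a single trial's pupil trace (arbitrary units)."""
        baseline = np.nanmedian(trace[baseline_window])
        if method == "subtractive":
            return trace - baseline        # corrected = pupil size - baseline
        if method == "divisive":
            return trace / baseline        # corrected = pupil size / baseline
        raise ValueError(f"unknown method: {method}")

    # A blink recorded as an unrealistically small pupil during the baseline period
    # distorts divisive correction far more than subtractive correction:
    clean = np.array([1000.0, 1000.0, 1100.0, 1200.0])
    blinked = np.array([100.0, 100.0, 1100.0, 1200.0])   # distorted baseline samples
    sub_clean = baseline_correct(clean, slice(0, 2))               # [0, 0, 100, 200]
    div_blinked = baseline_correct(blinked, slice(0, 2), "divisive")  # inflated 11x-12x
    ```

    The divisive trace reports an eleven-fold pupil "dilation" purely because of the distorted baseline, which is why the authors recommend subtractive correction and removal of trials with unrealistically small baseline pupil size.
    
    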

  1. Performance Measurement Baseline Change Request

    Data.gov (United States)

    Social Security Administration — The Performance Measurement Baseline Change Request template is used to document changes to scope, cost, schedule, or operational performance metrics for SSA's Major...

  2. It's Deja Vu All over Again: Using Multiple-Spell Discrete-Time Survival Analysis.

    Science.gov (United States)

    Willett, John B.; Singer, Judith D.

    1995-01-01

    The multiple-spell discrete-time survival analysis method is introduced and illustrated using longitudinal data on exit from and reentry into the teaching profession. The method is applicable to many educational problems involving the sequential occurrence of disparate events or episodes. (SLD)

  3. The effect of antacid on salivary pH in patients with and without dental erosion after multiple acid challenges.

    Science.gov (United States)

    Dhuhair, Sarah; Dennison, Joseph B; Yaman, Peter; Neiva, Gisele F

    2015-04-01

    To evaluate the effect of an antacid swish on salivary pH values and to monitor pH changes in subjects with and without dental erosion after multiple acid challenge tests. 20 subjects with tooth erosion were matched in age and gender with 20 healthy controls according to specific inclusion/exclusion criteria. Baseline measures of salivary pH, buffering capacity and salivary flow rate were taken using the Saliva Check System. Subjects swished with Diet Pepsi three times at 10-minute intervals. Changes in pH were monitored using a digital pH meter at 0-, 5-, and 10-minute intervals and then every 5 minutes after the third swish until the pH returned to its baseline value or 45 minutes elapsed. The swishing regimen was repeated on a second visit, followed by swishing with a sugar-free liquid antacid (Mylanta Supreme). Recovery times were also recorded. Data were analyzed using independent t-tests, repeated measures ANOVA, and Fisher's exact test (α = 0.05). Baseline buffering capacity and flow rate were not significantly different between groups (P= 0.542; P= 0.2831, respectively). Baseline salivary pH values were similar between groups (P= 0.721). No significant differences in salivary pH values were found between erosion and non-erosion groups in response to multiple acid challenges (P= 0.695) or antacid neutralization (P= 0.861). Analysis of salivary pH recovery time revealed no significant differences between groups after acid challenges (P= 0.091) or after the use of antacid (P= 0.118). There was a highly significant difference in the survival curves of the two groups on Day 2, with the non-erosion group recovering significantly faster than the erosion group (P= 0.0086).

  4. Neophyte experiences of football (soccer) match analysis: a multiple case study approach.

    Science.gov (United States)

    McKenna, Mark; Cowan, Daryl Thomas; Stevenson, David; Baker, Julien Steven

    2018-03-05

    Performance analysis is extensively used in sport, but its pedagogical application is little understood. Given its expanding role across football, this study explored the experiences of neophyte performance analysts. Experiences of six analysis interns, across three professional football clubs, were investigated as multiple cases of new match analysis. Each intern was interviewed after their first season, with archival data providing background information. Four themes emerged from qualitative analysis: (1) "building of relationships" was important, along with trust and role clarity; (2) "establishing an analysis system" was difficult due to tacit coach knowledge, but analysis was established; (3) the quality of the "feedback process" hinged on coaching styles, with balance of feedback and athlete engagement considered essential; (4) "establishing effect" was complex with no statistical effects reported; yet enhanced relationships, role clarity, and improved performances were reported. Other emic accounts are required to further understand occupational culture within performance analysis.

  5. Automatic visual tracking and social behaviour analysis with multiple mice.

    Directory of Open Access Journals (Sweden)

    Luca Giancardo

    Full Text Available Social interactions are made of complex behavioural actions that might be found in all mammalians, including humans and rodents. Recently, mouse models are increasingly being used in preclinical research to understand the biological basis of social-related pathologies or abnormalities. However, reliable and flexible automatic systems able to precisely quantify social behavioural interactions of multiple mice are still missing. Here, we present a system built on two components: a module able to accurately track the position of multiple interacting mice from videos, regardless of their fur colour or light settings, and a module that automatically characterises social and non-social behaviours. The behavioural analysis is obtained by deriving a new set of specialised spatio-temporal features from the tracker output. These features are further employed by a learning-by-example classifier, which predicts, for each frame and for each mouse in the cage, one of the behaviours learnt from the examples given by the experimenters. The system is validated on an extensive set of experimental trials involving multiple mice in an open arena. In a first evaluation we compare the classifier output with the independent evaluation of two human graders, obtaining comparable results. Then, we show the applicability of our technique to multiple mice settings, using up to four interacting mice. The system is also compared with a solution recently proposed in the literature that, similarly to us, addresses the problem with a learning-by-examples approach. Finally, we further validated our automatic system to differentiate between C57B/6J (a commonly used reference inbred strain) and BTBR T+tf/J (a mouse model for autism spectrum disorders). Overall, these data demonstrate the validity and effectiveness of this new machine learning system in the detection of social and non-social behaviours in multiple (>2) interacting mice, and its versatility to deal with different

  6. Association of baseline bleeding pattern on amenorrhea with levonorgestrel intrauterine system use.

    Science.gov (United States)

    Mejia, Manuela; McNicholas, Colleen; Madden, Tessa; Peipert, Jeffrey F

    2016-11-01

    This study aims to evaluate the effect of baseline bleeding patterns on rates of amenorrhea reported at 12 months in levonorgestrel (LNG) 52 mg intrauterine system (IUS) users. We also assessed the effect of baseline bleeding patterns at 3 and 6 months postinsertion. In this secondary analysis of the Contraceptive CHOICE Project, we included participants who had an LNG-IUS inserted within 1 month of enrollment and continued use for 12 months. Using 12-month telephone survey data, we defined amenorrhea at 12 months of use as no bleeding or spotting during the previous 6 months. We used chi-square and multivariable logistic regression to assess the association of baseline bleeding pattern with amenorrhea while controlling for confounding variables. Of 1802 continuous 12-month LNG-IUS users, amenorrhea was reported by 4.9%, 14.8% and 15.4% of participants at 3, 6 and 12 months, respectively. Participants with light baseline bleeding or short duration of flow reported higher rates of amenorrhea at 3 and 6 months postinsertion. Participants who reported heavy baseline bleeding were less likely to report amenorrhea at 12 months than those who reported moderate bleeding (adjusted OR, 0.36; 95% CI, 0.16-0.69). Women with heavier menstrual bleeding are less likely than women with moderate flow to report amenorrhea following 12 months of LNG-IUS use. Baseline heavy menstrual flow reduces the likelihood of amenorrhea with LNG-IUS use, information that could impact contraceptive counseling. Anticipatory counseling can improve method satisfaction and continuation, an important strategy to continue to reduce unintended pregnancy and abortion rates. Copyright © 2016 Elsevier Inc. All rights reserved.

  7. An Updated Meta-Analysis of Risk of Multiple Sclerosis following Infectious Mononucleosis

    Science.gov (United States)

    Handel, Adam E.; Williamson, Alexander J.; Disanto, Giulio; Handunnetthi, Lahiru; Giovannoni, Gavin; Ramagopalan, Sreeram V.

    2010-01-01

    Background Multiple sclerosis (MS) appears to develop in genetically susceptible individuals as a result of environmental exposures. Epstein-Barr virus (EBV) infection is an almost universal finding among individuals with MS. Symptomatic EBV infection as manifested by infectious mononucleosis (IM) has been shown in a previous meta-analysis to be associated with the risk of MS, however a number of much larger studies have since been published. Methods/Principal Findings We performed a Medline search to identify articles published since the original meta-analysis investigating MS risk following IM. A total of 18 articles were included in this study, including 19390 MS patients and 16007 controls. We calculated the relative risk of MS following IM using a generic inverse variance with random effects model. This showed that the risk of MS was strongly associated with IM (relative risk (RR) 2.17; 95% confidence interval 1.97–2.39; p<10(-54)). Conclusions Our results establish firmly that a history of infectious mononucleosis significantly increases the risk of multiple sclerosis. Future work should focus on the mechanism of this association and interaction with other risk factors. PMID:20824132
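    The pooling method named in the abstract, generic inverse variance with a random-effects model, can be sketched as follows (DerSimonian-Laird estimate of between-study variance); the per-study log(RR) values and standard errors below are hypothetical, not the 18 included studies.

    ```python
    import math

    def random_effects_rr(log_rr, se):
        """Pool per-study log relative risks: inverse-variance weights with a
        DerSimonian-Laird estimate of between-study variance (tau^2)."""
        w = [1.0 / s**2 for s in se]                     # fixed-effect weights
        fixed = sum(wi * y for wi, y in zip(w, log_rr)) / sum(w)
        q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_rr))  # heterogeneity Q
        c = sum(w) - sum(wi**2 for wi in w) / sum(w)
        tau2 = max(0.0, (q - (len(log_rr) - 1)) / c)
        w_re = [1.0 / (s**2 + tau2) for s in se]         # random-effects weights
        pooled = sum(wi * y for wi, y in zip(w_re, log_rr)) / sum(w_re)
        half = 1.96 * math.sqrt(1.0 / sum(w_re))
        return math.exp(pooled), (math.exp(pooled - half), math.exp(pooled + half))

    # Hypothetical per-study log(RR) and standard errors (not the paper's data):
    rr, ci = random_effects_rr([0.69, 0.84, 0.74], [0.10, 0.15, 0.08])
    ```

    With homogeneous studies like these, tau^2 collapses to zero and the result reduces to the fixed-effect inverse-variance pool; heterogeneous inputs would widen the interval.
    
    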

  8. An updated meta-analysis of risk of multiple sclerosis following infectious mononucleosis.

    Directory of Open Access Journals (Sweden)

    Adam E Handel

    2010-09-01

    Full Text Available Multiple sclerosis (MS) appears to develop in genetically susceptible individuals as a result of environmental exposures. Epstein-Barr virus (EBV) infection is an almost universal finding among individuals with MS. Symptomatic EBV infection as manifested by infectious mononucleosis (IM) has been shown in a previous meta-analysis to be associated with the risk of MS, however a number of much larger studies have since been published. We performed a Medline search to identify articles published since the original meta-analysis investigating MS risk following IM. A total of 18 articles were included in this study, including 19390 MS patients and 16007 controls. We calculated the relative risk of MS following IM using a generic inverse variance with random effects model. This showed that the risk of MS was strongly associated with IM (relative risk (RR) 2.17; 95% confidence interval 1.97-2.39; p<10(-54)). Our results establish firmly that a history of infectious mononucleosis significantly increases the risk of multiple sclerosis. Future work should focus on the mechanism of this association and interaction with other risk factors.

  9. Safe and tolerable one-hour pamidronate infusion for multiple myeloma patients

    Directory of Open Access Journals (Sweden)

    Dimitrios Chantzichristos

    2008-09-01

    Full Text Available Dimitrios Chantzichristos, Andréasson Björn, Johansson Peter, Department of Internal Medicine, Uddevalla Hospital, Uddevalla, Sweden. Background: Once a month, patients with multiple myeloma received an infusion of bisphosphonates, principally to reduce osteoclastic bone resorption. The recommended infusion time for pamidronate is 2 hours in the US and 4 hours in Europe because of its potential nephrotoxicity. From 2003, a 90 mg infusion of pamidronate was given over 1 hour to patients with no pre-existing renal impairment in the Daily Care Unit at Uddevalla Hospital. Method: Retrospective analysis of renal deterioration, serum calcium, and adverse effects in patients with multiple myeloma treated with a 1-hour pamidronate 90 mg infusion from January 2003 to April 2007. Results: Seventy-nine patients provided valuable data. A total of 846 infusions were given and the median number of infusions per patient was 11. Significant creatinine elevation was seen in 7 patients (8.9%), after 19 infusions (2.2%). Renal deterioration occurred in 5 of these 7 patients and was related to progression of the myeloma or opportunistic infections. The prevalence of infusion-related events was 0.8% and the mean total S-Ca was 0.05 mmol/L lower than baseline. Conclusion: Few events of renal deterioration, hypocalcemia, or other adverse effects resulted from a 1-hour pamidronate 90 mg infusion in multiple myeloma patients with no pre-existing renal impairment. Keywords: bisphosphonates, pamidronate, multiple myeloma, infusion time

  10. Better informing decision making with multiple outcomes cost-effectiveness analysis under uncertainty in cost-disutility space.

    Science.gov (United States)

    McCaffrey, Nikki; Agar, Meera; Harlum, Janeane; Karnon, Jonathon; Currow, David; Eckermann, Simon

    2015-01-01

    Comparing multiple, diverse outcomes with cost-effectiveness analysis (CEA) is important, yet challenging in areas like palliative care where domains are unamenable to integration with survival. Generic multi-attribute utility values exclude important domains and non-health outcomes, while partial analyses-where outcomes are considered separately, with their joint relationship under uncertainty ignored-lead to incorrect inference regarding preferred strategies. The objective of this paper is to consider whether such decision making can be better informed with alternative presentation and summary measures, extending methods previously shown to have advantages in multiple strategy comparison. Multiple outcomes CEA of a home-based palliative care model (PEACH) relative to usual care is undertaken in cost-disutility (CDU) space and compared with analysis on the cost-effectiveness plane. Summary measures developed for comparing strategies across potential threshold values for multiple outcomes include: expected net loss (ENL) planes quantifying differences in expected net benefit; the ENL contour identifying preferred strategies minimising ENL and their expected value of perfect information; and cost-effectiveness acceptability planes showing the probability of strategies minimising ENL. Conventional analysis suggests PEACH is cost-effective when the threshold value per additional day at home (λ1) exceeds $1,068, or dominated by usual care when only the proportion of home deaths is considered. In contrast, neither alternative dominates in CDU space, where cost and outcomes are jointly considered, with the optimal strategy depending on threshold values. For example, PEACH minimises ENL when λ1=$2,000 and λ2=$2,000 (the threshold value for dying at home), with a 51.6% chance of PEACH being cost-effective. Comparison in CDU space and associated summary measures have distinct advantages over multiple domain comparisons, aiding transparent and robust joint comparison of costs and multiple
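    The net-benefit logic behind ENL can be sketched for a two-strategy, two-outcome comparison under assumed threshold values. The incremental cost and outcome distributions below are hypothetical (they do not reproduce the paper's 51.6% figure), and ENL is computed against the per-draw best strategy, so the minimiser's ENL equals its expected value of perfect information.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 5000  # simulated joint draws of incremental cost and the two outcomes

    # Hypothetical incremental (PEACH minus usual care) distributions:
    d_cost = rng.normal(1500.0, 600.0, n)   # incremental cost ($)
    d_days = rng.normal(1.2, 0.8, n)        # additional days at home
    d_home = rng.normal(0.10, 0.08, n)      # additional probability of home death

    def enl(lam_days, lam_home):
        """Expected net loss of each strategy at thresholds (lambda1, lambda2)."""
        nb = lam_days * d_days + lam_home * d_home - d_cost  # incremental NB of PEACH
        best = np.maximum(nb, 0.0)          # per-draw best of PEACH vs usual care
        enl_peach = (best - nb).mean()      # loss from always choosing PEACH
        enl_usual = best.mean()             # loss from always choosing usual care
        return enl_peach, enl_usual, (nb > 0).mean()  # last: P(PEACH cost-effective)

    enl_p, enl_u, p_peach = enl(2000.0, 2000.0)
    ```

    Sweeping `lam_days` and `lam_home` over a grid of threshold values traces out the ENL planes and contour the abstract describes.
    
    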

  11. Biosensors and multiple mycotoxin analysis

    NARCIS (Netherlands)

    Gaag, B. van der; Spath, S.; Dietrich, H.; Stigter, E.; Boonzaaijer, G.; Osenbruggen, T. van; Koopal, K.

    2003-01-01

    An immunochemical biosensor assay for the detection of multiple mycotoxins in a sample is described. The inhibition assay is designed to measure four different mycotoxins in a single measurement, following extraction, sample clean-up and incubation with an appropriate cocktail of anti-mycotoxin

  12. Security and reliability analysis of diversity combining techniques in SIMO mixed RF/FSO with multiple users

    KAUST Repository

    Abd El-Malek, Ahmed H.; Salhab, Anas M.; Zummo, Salam A.; Alouini, Mohamed-Slim

    2016-01-01

    In this paper, we investigate the impact of different diversity combining techniques on the security and reliability analysis of a single-input-multiple-output (SIMO) mixed radio frequency (RF)/free space optical (FSO) relay network with opportunistic multiuser scheduling. In this model, the user with the best channel among multiple users communicates with a multiple-antenna relay node over an RF link, and then the relay node employs the amplify-and-forward (AF) protocol to retransmit the user data to the destination over an FSO link. Moreover, the authorized transmission is assumed to be attacked by a single passive RF eavesdropper equipped with multiple antennas. Therefore, the security-reliability trade-off of the system is investigated. Closed-form expressions for the system outage probability and the system intercept probability are derived. Then, the newly derived expressions are simplified to their asymptotic formulas in the high signal-to-noise ratio (SNR) region. Numerical results are presented to validate the exact and asymptotic results and to illustrate the impact of various system parameters on system performance. © 2016 IEEE.

  13. Security and reliability analysis of diversity combining techniques in SIMO mixed RF/FSO with multiple users

    KAUST Repository

    Abd El-Malek, Ahmed H.

    2016-07-26

    In this paper, we investigate the impact of different diversity combining techniques on the security and reliability analysis of a single-input-multiple-output (SIMO) mixed radio frequency (RF)/free space optical (FSO) relay network with opportunistic multiuser scheduling. In this model, the user with the best channel among multiple users communicates with a multiple-antenna relay node over an RF link, and then the relay node employs the amplify-and-forward (AF) protocol to retransmit the user data to the destination over an FSO link. Moreover, the authorized transmission is assumed to be attacked by a single passive RF eavesdropper equipped with multiple antennas. Therefore, the security-reliability trade-off of the system is investigated. Closed-form expressions for the system outage probability and the system intercept probability are derived. Then, the newly derived expressions are simplified to their asymptotic formulas in the high signal-to-noise ratio (SNR) region. Numerical results are presented to validate the exact and asymptotic results and to illustrate the impact of various system parameters on system performance. © 2016 IEEE.

  14. Progressive decline of decision-making performances during multiple sclerosis.

    Science.gov (United States)

    Simioni, Samanta; Ruffieux, Christiane; Kleeberg, Joerg; Bruggimann, Laure; du Pasquier, Renaud A; Annoni, Jean-Marie; Schluep, Myriam

    2009-03-01

    The purpose of this study was to evaluate longitudinally, using the Iowa Gambling Task (IGT), the dynamics of decision-making capacity at a two-year interval (median: 2.1 years) in a group of patients with multiple sclerosis (MS) (n = 70) and minor neurological disability [Expanded Disability Status Scale (EDSS)]. Cognition (attention), behavior, handicap, and perceived health status were also investigated. Standardized change scores [(score at retest - score at baseline)/standard deviation of baseline score] were computed. Results showed that IGT performances decreased from baseline to retest (from 0.3, SD = 0.4 to 0.1, SD = 0.3, p = .005). MS patients who worsened in the IGT were more likely to show a decreased perceived health status and emotional well-being (SEP-59; p = .05 for both). Relapsing rate, disability progression, and cognitive and behavioral changes were not associated with decreased IGT performances. In conclusion, decline in decision making can appear as an isolated deficit in MS.

  15. Multiple murder and criminal careers: a latent class analysis of multiple homicide offenders.

    Science.gov (United States)

    Vaughn, Michael G; DeLisi, Matt; Beaver, Kevin M; Howard, Matthew O

    2009-01-10

    To construct an empirically rigorous typology of multiple homicide offenders (MHOs). The current study conducted latent class analysis of the official records of 160 MHOs sampled from eight states to evaluate their criminal careers. A 3-class solution best fit the data (-2LL=-1123.61, Bayesian Information Criterion (BIC)=2648.15, df=81, L(2)=1179.77). Class 1 (n=64, class assignment probability=.999) was the low-offending group marked by little criminal record and delayed arrest onset. Class 2 (n=51, class assignment probability=.957) was the severe group that represents the most violent and habitual criminals. Class 3 (n=45, class assignment probability=.959) was the moderate group whose offending careers were similar to Class 2. A sustained criminal career with involvement in versatile forms of crime was observed for two of three classes of MHOs. Linkages to extant typologies and recommendations for additional research that incorporates clinical constructs are proffered.
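    Selecting among 2-, 3- and 4-class solutions by Bayesian Information Criterion, as in the abstract, reduces to penalising each model's log-likelihood by its parameter count. The log-likelihoods and parameter counts below are illustrative stand-ins, not the study's fitted values.

    ```python
    import math

    def bic(log_lik, n_params, n_obs):
        """Bayesian Information Criterion; the lowest-BIC class solution is preferred."""
        return -2.0 * log_lik + n_params * math.log(n_obs)

    # Hypothetical log-likelihoods and parameter counts for 2-, 3- and 4-class
    # latent class solutions fit to n=160 offenders:
    fits = {2: (-1200.0, 54), 3: (-1123.6, 81), 4: (-1118.0, 108)}
    scores = {k: bic(ll, p, 160) for k, (ll, p) in fits.items()}
    best = min(scores, key=scores.get)  # the 3-class solution wins here
    ```

    The 4-class model fits slightly better in raw likelihood but pays a larger complexity penalty, which is exactly the trade-off BIC formalises.
    
    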

  16. Efficacy of a smoking quit line in the military: Baseline design and analysis

    Science.gov (United States)

    Richey, Phyllis A.; Klesges, Robert C.; Talcott, Gerald W.; DeBon, Margaret; Womack, Catherine; Thomas, Fridtjof; Hryshko-Mullen, Ann

    2013-01-01

    Thirty percent of all military personnel smoke cigarettes. Because of the negative health consequences and their impact on physical fitness, overall health, and military readiness, the Department of Defense has identified the reduction of tobacco use as a priority of US military forces. This study aims to evaluate the one-year efficacy of a proactive versus reactive smoking quit line in the US military with adjunctive nicotine replacement therapy (NRT) in both groups. This paper reports on the baseline variables of the first 1000 participants randomized, the design, and the proposed analysis of the randomized two-arm clinical trial "Efficacy of a Tobacco Quit Line in the Military". Participants are adult smokers who are Armed Forces Active Duty personnel, retirees, Reservists, National Guard members and family-member healthcare beneficiaries. All participants are randomized to either the Counselor Initiated (proactive) group, receiving 6 counseling sessions in addition to an 8-week supply of NRT, or the Self-Paced (reactive) group, in which they may call the quit line themselves to receive the same counseling sessions, in addition to a 2-week supply of NRT. The primary outcome measure of the study is self-reported smoking abstinence at 1-year follow-up. Results from this study will be the first to provide evidence for the efficacy of an intensive Counselor Initiated quit line with provided NRT in military personnel and could lead to dissemination throughout the US Air Force, the armed forces population as a whole and ultimately to civilian personnel who do not have ready access to preventive health services. PMID:22561390

  17. Analytical multiple scattering correction to the Mie theory: Application to the analysis of the lidar signal

    Science.gov (United States)

    Flesia, C.; Schwendimann, P.

    1992-01-01

    The contribution of multiple scattering to the lidar signal depends on the optical depth tau. Therefore, lidar analysis based on the assumption that multiple scattering can be neglected is limited to cases characterized by low values of the optical depth (tau less than or equal to 0.1), and hence excludes scattering from most clouds. Moreover, all inversion methods relating the lidar signal to number densities and particle sizes must be modified, since multiple scattering affects the direct analysis. The essential requirements of a realistic model for lidar measurements which includes multiple scattering and which can be applied to practical situations are as follows. (1) What is required is not only a correction term or a rough approximation describing the results of a certain experiment, but a general theory of multiple scattering tying together the relevant physical parameters we seek to measure. (2) An analytical generalization of the lidar equation is required which can be applied in the case of a realistic aerosol. A purely analytical formulation is important in order to avoid the convergence and stability problems which, in the case of a numerical approach, are due to the large number of events that have to be taken into account in the presence of large optical depth and/or strong experimental noise.

  18. Optical design of multi-multiple expander structure of laser gas analysis and measurement device

    Science.gov (United States)

    Fu, Xiang; Wei, Biao

    2018-03-01

    The installation and debugging of the optical path structure pose difficult technical problems in the application of distributed laser gas analysis and measurement of carbon monoxide. Based on three-component beam-expansion theory, a multi-multiple expander structure with expansion ratios of 4, 5, 6 and 7 is adopted in the absorption chamber to enhance the adaptability of the gas analysis and measurement device to its installation environment. According to the basic theory of aberrations, the multi-multiple beam expander structure is optimally designed. Using image-quality evaluation methods, the differences in image quality at the different magnifications are analyzed. The results show that the optical quality of the system with the expanded beam structure is best when the expansion ratio is 5-7.

  19. Weight change by baseline BMI from three-year observational data: findings from the Worldwide Schizophrenia Outpatient Health Outcomes Database.

    Science.gov (United States)

    Bushe, Chris J; Slooff, Cees J; Haddad, Peter M; Karagianis, Jamie L

    2013-04-01

    The aim was to explore weight and body mass index (BMI) changes by baseline BMI in patients completing three years of monotherapy with various first- and second-generation antipsychotics in a large cohort in a post hoc analysis of three-year observational data. Data were analyzed by antipsychotic and by three baseline BMI bands: underweight/normal weight (BMI < 25 kg/m²), overweight (BMI 25-30 kg/m²) and obese (BMI > 30 kg/m²). Baseline BMI was associated with subsequent weight change irrespective of the antipsychotic given. Specifically, a smaller proportion of patients gained ≥7% baseline bodyweight, and a greater proportion of patients lost ≥7% baseline bodyweight, with increasing baseline BMI. For olanzapine (the antipsychotic associated with the highest mean weight gain in the total drug cohort), the percentage of patients gaining ≥7% baseline weight was 45% (95% CI: 43-48) in the underweight/normal weight BMI cohort and 20% (95% CI: 15-27) in the obese BMI cohort; 7% (95% CI: 6-8) of the underweight/normal cohort and 19% (95% CI: 13-27) of the obese cohort lost ≥7% baseline weight. BMI has an association with the likelihood of weight gain or loss and should be considered in analyses of antipsychotic weight change.
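    A minimal sketch of the two classifications the analysis relies on, assuming the conventional BMI cut-points (the abstract explicitly anchors the obese band at BMI > 30 kg/m²) and the ≥7% baseline-bodyweight threshold:

    ```python
    def bmi_band(bmi):
        """Baseline BMI band in kg/m^2 (conventional cut-points assumed)."""
        if bmi < 25.0:
            return "underweight/normal"
        if bmi < 30.0:
            return "overweight"
        return "obese"

    def weight_change_category(baseline_kg, final_kg, threshold=0.07):
        """Classify >=7% gain / >=7% loss relative to baseline bodyweight."""
        change = (final_kg - baseline_kg) / baseline_kg
        if change >= threshold:
            return "gained >=7%"
        if change <= -threshold:
            return "lost >=7%"
        return "stable"
    ```

    Cross-tabulating these two labels per patient, by antipsychotic, reproduces the structure of the percentages reported in the abstract.
    
    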

  20. Emergency Response Capability Baseline Needs Assessment Compliance Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Sharry, John A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-09-16

    This document is the second of a two-part analysis of the Emergency Response Capabilities of Lawrence Livermore National Laboratory. The first part, the 2013 Baseline Needs Assessment Requirements Document, established the minimum performance criteria necessary to meet mandatory requirements. This second part analyzes the performance of the Lawrence Livermore National Laboratory Emergency Management Department against the contents of the Requirements Document. The document was prepared based on an extensive review of information contained in the 2009 BNA, the 2012 BNA document, a review of Emergency Planning Hazards Assessments, a review of building construction, occupancy, fire protection features, dispatch records, LLNL alarm system records, fire department training records, and fire department policies and procedures.

  1. Acute alcohol effects on set-shifting and its moderation by baseline individual differences: a latent variable analysis.

    Science.gov (United States)

    Korucuoglu, Ozlem; Sher, Kenneth J; Wood, Phillip K; Saults, John Scott; Altamirano, Lee; Miyake, Akira; Bartholow, Bruce D

    2017-03-01

    To compare the acute effects of alcohol on set-shifting task performance (relative to sober baseline performance) during ascending and descending limb breath alcohol concentration (BrAC), as well as possible moderation of these effects by baseline individual differences. Shifting performance was tested during an initial baseline and a subsequent drinking session, during which participants were assigned randomly to one of three beverage conditions (alcohol, placebo or control) and one of two BrAC limb conditions [ascending and descending (A/D) or descending-only (D-only)]. A human experimental laboratory on the University of Missouri campus in Columbia, MO, USA. A total of 222 moderate-drinking adults (ages 21-30 years) recruited from Columbia, MO and tested between 2010 and 2013. The outcome measure was performance on set-shifting tasks under the different beverage and limb conditions. Shifting performance assessed at baseline was a key moderator. Although performance improved across sessions, this improvement was reduced in the alcohol compared with no-alcohol groups (post-drink latent mean comparison across groups, all Ps ≤ 0.05), and this effect was more pronounced in individuals with lower pre-drink performance (comparison of pre- to post-drink path coefficients across groups, all Ps ≤ 0.05). In the alcohol group, performance was better on descending compared with ascending limb (P ≤ 0.001), but descending limb performance did not differ across the A/D and D-only groups. Practising tasks before drinking moderates the acute effects of alcohol on the ability to switch between tasks. Greater impairment in shifting ability on descending compared with ascending breath alcohol concentration is not related to task practice. © 2016 Society for the Study of Addiction.

  2. Comparison of the phenolic composition of fruit juices by single step gradient HPLC analysis of multiple components versus multiple chromatographic runs optimised for individual families.

    Science.gov (United States)

    Bremner, P D; Blacklock, C J; Paganga, G; Mullen, W; Rice-Evans, C A; Crozier, A

    2000-06-01

    After minimal sample preparation, two different HPLC methodologies, one based on a single gradient reversed-phase HPLC step, the other on multiple HPLC runs each optimised for specific components, were used to investigate the composition of flavonoids and phenolic acids in apple and tomato juices. The principal components in apple juice were identified as chlorogenic acid, phloridzin, caffeic acid and p-coumaric acid. Tomato juice was found to contain chlorogenic acid, caffeic acid, p-coumaric acid, naringenin and rutin. The quantitative estimates of the levels of these compounds, obtained with the two HPLC procedures, were very similar, demonstrating that either method can be used to analyse accurately the phenolic components of apple and tomato juices. Chlorogenic acid in tomato juice was the only component not fully resolved in the single run study and the multiple run analysis prior to enzyme treatment. The single run system of analysis is recommended for the initial investigation of plant phenolics and the multiple run approach for analyses where chromatographic resolution requires improvement.

  3. Confidence ellipses: A variation based on parametric bootstrapping applicable on Multiple Factor Analysis results for rapid graphical evaluation

    DEFF Research Database (Denmark)

    Dehlholm, Christian; Brockhoff, Per B.; Bredie, Wender L. P.

    2012-01-01

    A new way of parametric bootstrapping allows similar construction of confidence ellipses applicable on all results from Multiple Factor Analysis obtained from the FactoMineR package in the statistical program R. With this procedure, a similar approach will be applied to Multiple Factor Analysis r...... in different studies performed on the same set of products. In addition, the graphical display of confidence ellipses eases interpretation and communication of results....
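
    The record describes parametric bootstrapping of Multiple Factor Analysis results. As a rough illustration of the general idea only (not the FactoMineR implementation), the sketch below resamples from a normal distribution fitted to hypothetical 2-D scores and summarises the bootstrap means with an approximate 95% covariance ellipse:

```python
import math
import random

def bootstrap_ellipse(xs, ys, n_boot=500, seed=1):
    """Parametric bootstrap of a 2-D mean: resample from normals fitted to
    the observed scores, then summarise the bootstrap means with a covariance
    ellipse (eigen-decomposition of the 2x2 covariance matrix)."""
    rng = random.Random(seed)
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / (n - 1))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys) / (n - 1))
    boot = []
    for _ in range(n_boot):
        bx = [rng.gauss(mx, sx) for _ in range(n)]
        by = [rng.gauss(my, sy) for _ in range(n)]
        boot.append((sum(bx) / n, sum(by) / n))
    bmx = sum(b[0] for b in boot) / n_boot
    bmy = sum(b[1] for b in boot) / n_boot
    cxx = sum((b[0] - bmx) ** 2 for b in boot) / (n_boot - 1)
    cyy = sum((b[1] - bmy) ** 2 for b in boot) / (n_boot - 1)
    cxy = sum((b[0] - bmx) * (b[1] - bmy) for b in boot) / (n_boot - 1)
    # Eigenvalues of [[cxx, cxy], [cxy, cyy]] give the squared ellipse axes.
    tr, det = cxx + cyy, cxx * cyy - cxy ** 2
    disc = math.sqrt(max(tr * tr / 4 - det, 0.0))
    l1, l2 = tr / 2 + disc, tr / 2 - disc
    k = math.sqrt(5.991)  # chi-square(2 d.f.) quantile for ~95% coverage
    return (bmx, bmy), (k * math.sqrt(l1), k * math.sqrt(max(l2, 0.0)))
```

    The center and semi-axes returned could then be drawn on the MFA product map; overlapping ellipses for two products suggest they are not discriminated by the panel.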

  4. Wind power projects in the CDM: Methodologies and tools for baselines, carbon financing and substainability analysis

    DEFF Research Database (Denmark)

    Ringius, L.; Grohnheit, Poul Erik; Nielsen, Lars Henrik

    2002-01-01

    The report is intended to be a guidance document for project developers, investors, lenders, and CDM host countries involved in wind power projects in the CDM. The report explores in particular those issues that are important in CDM project assessment and development - that is, baseline development, carbon financing, and environmental sustainability. It does not deal in detail with those issues that are routinely covered in a standard wind power project assessment. The report tests, compares, and recommends methodologies for and approaches to baseline development. To present the application and implications of the various methodologies and approaches in a concrete context, Africa's largest wind farm - namely the 60 MW wind farm located in Zafarana, Egypt - is examined as a hypothetical CDM wind power project. The report shows that for the present case example there is a difference of about 25% between......

  5. Rationing with baselines

    DEFF Research Database (Denmark)

    Hougaard, Jens Leth; Moreno-Ternero, Juan D.; Østerdal, Lars Peter Raahave

    2013-01-01

    We introduce a new operator for general rationing problems in which, besides conflicting claims, individual baselines play an important role in the rationing process. The operator builds onto ideas of composition, which are not only frequent in rationing, but also in related problems...... such as bargaining, choice, and queuing. We characterize the operator and show how it preserves some standard axioms in the literature on rationing. We also relate it to recent contributions in such literature....

  6. A Monte Carlo evaluation of analytical multiple scattering corrections for unpolarised neutron scattering and polarisation analysis data

    International Nuclear Information System (INIS)

    Mayers, J.; Cywinski, R.

    1985-03-01

    Some of the approximations commonly used for the analytical estimation of multiple scattering corrections to thermal neutron elastic scattering data from cylindrical and plane slab samples have been tested using a Monte Carlo program. It is shown that the approximations are accurate for a wide range of sample geometries and scattering cross-sections. Neutron polarisation analysis provides the most stringent test of multiple scattering calculations as multiply scattered neutrons may be redistributed not only geometrically but also between the spin flip and non spin flip scattering channels. A very simple analytical technique for correcting for multiple scattering in neutron polarisation analysis has been tested using the Monte Carlo program and has been shown to work remarkably well in most circumstances. (author)
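
    The kind of Monte Carlo check described above can be illustrated with a deliberately simplified slab simulation. The sketch below (isotropic scattering, slab thickness in units of mean free paths, no absorption or spin-flip channels; not the authors' program) estimates the fraction of scattered neutrons that interacted more than once before escaping:

```python
import math
import random

def multiple_scatter_fraction(thickness_mfp=0.5, n=20000, seed=3):
    """Toy Monte Carlo: neutrons enter a slab, free paths are sampled from an
    exponential distribution, and scattering redirects the flight isotropically.
    Returns the fraction of scattered neutrons that scattered two or more times."""
    rng = random.Random(seed)
    multi = scattered = 0
    for _ in range(n):
        depth, mu, n_scat = 0.0, 1.0, 0  # mu = cosine of angle to slab normal
        while True:
            step = -math.log(rng.random())  # exponential free path (in mfp)
            depth += mu * step
            if depth < 0.0 or depth > thickness_mfp:
                break  # neutron escaped through either face
            n_scat += 1
            mu = rng.uniform(-1.0, 1.0)  # isotropic re-direction
        if n_scat >= 1:
            scattered += 1
        if n_scat >= 2:
            multi += 1
    return multi / scattered
```

    Running this for a range of thicknesses shows the multiple-scattering fraction growing with sample size, which is the quantity the analytical corrections being tested must approximate.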

  7. Inflatable penile prosthesis implant length with baseline characteristic correlations: preliminary analysis of the PROPPER study.

    Science.gov (United States)

    Bennett, Nelson; Henry, Gerard; Karpman, Edward; Brant, William; Jones, LeRoy; Khera, Mohit; Kohler, Tobias; Christine, Brian; Rhee, Eugene; Kansas, Bryan; Bella, Anthony J

    2017-12-01

    "Prospective Registry of Outcomes with Penile Prosthesis for Erectile Restoration" (PROPPER) is a large, multi-institutional, prospective clinical study to collect, analyze, and report real-world outcomes for men implanted with penile prosthetic devices. We prospectively correlated co-morbid conditions and demographic data with implanted penile prosthesis size to enable clinicians to better predict implanted penis size following penile implantation. We present many new data points for the first time in the literature and postulate that radical prostatectomy (RP) is negatively correlated with penile corporal length. Patient demographics, medical history, baseline characteristics and surgical details were compiled prospectively. Pearson correlation coefficients were generated for the correlations between demographics, etiology of ED, duration of ED, co-morbid conditions, pre-operative penile length (flaccid and stretched) and length of the implanted penile prosthesis. Multivariate analysis was performed to define predictors of implanted prosthesis length. From June 2011 to June 2017, 1,135 men underwent primary implantation of a penile prosthesis at a total of 11 study sites. Malleable (Spectra), 2-piece Ambicor, and 3-piece AMS 700 CX/LGX devices were included in the analysis. The most common patient comorbidities were CV disease (26.1%), DM (11.1%), and PD (12.4%). Primary etiology of ED: RP (27.4%), DM (20.3%), CVD (18.0%), PD (10.3%), priapism (1.4%), and others (22.6%). Mean duration of ED was 6.2±4.1 years. Implant length was weakly negatively correlated with White/Caucasian ethnicity (r=-0.18). Prosthesis length is negatively correlated with some ethnic groups, prostatectomy, and incontinence. Positive correlates include CV disease, preoperative stretched penile length, and flaccid penile length.

  8. Mercury baseline levels in Flemish soils (Belgium)

    International Nuclear Information System (INIS)

    Tack, Filip M.G.; Vanhaesebroeck, Thomas; Verloo, Marc G.; Van Rompaey, Kurt; Ranst, Eric van

    2005-01-01

    It is important to establish contaminant levels that are normally present in soils to provide baseline data for pollution studies. Mercury is a toxic element of concern. This study was aimed at assessing baseline mercury levels in soils in Flanders. In a previous study, mercury contents in soils in Oost-Vlaanderen were found to be significantly above levels reported elsewhere. For the current study, observations were extended over two more provinces, West-Vlaanderen and Antwerpen. Ranges of soil Hg contents were distinctly higher in the province Oost-Vlaanderen (interquartile range from 0.09 to 0.43 mg/kg) than in the other provinces (interquartile ranges from 0.07 to 0.13 and 0.07 to 0.15 mg/kg for West-Vlaanderen and Antwerpen, respectively). The standard threshold method was applied to separate soils containing baseline levels of Hg from the data. Baseline concentrations for Hg were characterised by a median of 0.10 mg Hg/kg dry soil, an interquartile range from 0.07 to 0.14 mg/kg and a 90% percentile value of 0.30 mg/kg. The influence of soil properties such as clay and organic carbon contents, and pH on baseline Hg concentrations was not important. Maps of the spatial distribution of Hg levels showed that the province Oost-Vlaanderen exhibited zones with systematically higher Hg soil contents. This may be related to the former presence of many small-scale industries employing mercury in that region. - Increased mercury levels may reflect human activity
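
    The "standard threshold method" mentioned above separates baseline values from anomalous ones. Below is a minimal sketch of one common variant (iterative trimming of values above mean + 2·stdev until the data stabilise); the paper's exact procedure may differ, and the data here are synthetic:

```python
import statistics

def baseline_stats(values, k=2.0):
    """Iteratively trim values above mean + k*stdev; the retained data are
    taken as the geochemical baseline. A sketch of one common 'standard
    threshold method' variant, not necessarily the paper's exact procedure."""
    data = sorted(values)
    while len(data) >= 4:
        thr = statistics.mean(data) + k * statistics.stdev(data)
        kept = [v for v in data if v <= thr]
        if len(kept) == len(data):
            break  # no value exceeds the threshold any more
        data = kept
    q = statistics.quantiles(data, n=4)  # quartile cut points
    return {"median": statistics.median(data),
            "iqr": (q[0], q[2]),
            "p90": statistics.quantiles(data, n=10)[8],  # 90th percentile
            "n_baseline": len(data)}

# Synthetic Hg contents (mg/kg): a baseline cluster plus two anomalies.
result = baseline_stats([0.05, 0.07, 0.08, 0.10, 0.10, 0.12,
                         0.14, 0.15, 0.20, 0.30, 2.0, 5.0])
```

    On this toy data the anomalous values (2.0 and 5.0 mg/kg) are excluded, and the summary statistics describe only the retained baseline population.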

  9. THE ANALYSIS OF THE RELATION COUNTRY RISK – MULTIPLE VALUE

    Directory of Open Access Journals (Sweden)

    Cristina Nicolescu

    2016-12-01

    Full Text Available Financial theory states that high expected growth, low risk in the company's sector and low interest rates push multiples higher. In this respect, the goal of this empirical work is to examine the country risk-multiple value relation for companies from emerging and frontier markets such as the Central and East European ones. Specific control variables were included in the model as proxies for growth opportunities, profitability, capital structure, and asset utilization. The analysis uses panel data for a sample of Central and East European countries over the period 2010-2015, together with other financial variables. The results partially support financial theory, mainly the significance of country risk and the debt ratio, and reject the growth opportunities hypothesis.

  10. Combining analysis of variance and three‐way factor analysis methods for studying additive and multiplicative effects in sensory panel data

    DEFF Research Database (Denmark)

    Romano, Rosaria; Næs, Tormod; Brockhoff, Per Bruun

    2015-01-01

    Data from descriptive sensory analysis are essentially three‐way data with assessors, samples and attributes as the three ways in the data set. Because of this, there are several ways that the data can be analysed. The paper focuses on the analysis of sensory characteristics of products while...... in the use of the scale with reference to the existing structure of relationships between sensory descriptors. The multivariate assessor model will be tested on a data set from milk. Relations between the proposed model and other multiplicative models like parallel factor analysis and analysis of variance...

  11. Tank waste remediation system technical baseline summary description

    International Nuclear Information System (INIS)

    Raymond, R.E.

    1998-01-01

    This document is one of the tools used to develop and control the mission work as depicted in the included figure. This Technical Baseline Summary Description document is the top-level tool for management of the Technical Baseline for waste storage operations

  12. Baseline Plasma C-Reactive Protein Concentrations and Motor Prognosis in Parkinson Disease.

    Directory of Open Access Journals (Sweden)

    Atsushi Umemura

    Full Text Available C-reactive protein (CRP), a blood inflammatory biomarker, is associated with the development of Alzheimer disease. In animal models of Parkinson disease (PD), systemic inflammatory stimuli can promote neuroinflammation and accelerate dopaminergic neurodegeneration. However, the association between long-term systemic inflammations and neurodegeneration has not been assessed in PD patients. To investigate the longitudinal effects of baseline CRP concentrations on motor prognosis in PD. Retrospective analysis of 375 patients (mean age, 69.3 years; mean PD duration, 6.6 years). Plasma concentrations of high-sensitivity CRP were measured in the absence of infections, and the Unified Parkinson's Disease Rating Scale Part III (UPDRS-III) scores were measured at five follow-up intervals (Days 1-90, 91-270, 271-450, 451-630, and 631-900). Change of UPDRS-III scores from baseline to each of the five follow-up periods. Change in UPDRS-III scores was significantly greater in PD patients with CRP concentrations ≥0.7 mg/L than in those with CRP concentrations <0.7 mg/L, as determined by a generalized estimation equation model (P = 0.021) for the entire follow-up period and by a generalized regression model (P = 0.030) for the last follow-up interval (Days 631-900). The regression coefficients of baseline CRP for the two periods were 1.41 (95% confidence interval [CI] 0.21-2.61) and 2.62 (95% CI 0.25-4.98), respectively, after adjusting for sex, age, baseline UPDRS-III score, dementia, and incremental L-dopa equivalent dose. Baseline plasma CRP levels were associated with motor deterioration and predicted motor prognosis in patients with PD. These associations were independent of sex, age, PD severity, dementia, and anti-Parkinsonian agents, suggesting that subclinical systemic inflammations could accelerate neurodegeneration in PD.

  13. Carbon tetrachloride ERA soil-gas baseline monitoring

    International Nuclear Information System (INIS)

    Fancher, J.D.

    1994-01-01

    From December 1991 through December 1993, Westinghouse Hanford Company performed routine baseline monitoring of selected wells and soil-gas points twice weekly in the 200 West Area of the Hanford Site. This work supported the Carbon Tetrachloride Expedited Response Action (ERA) and provided a solid baseline of volatile organic compound (VOC) concentrations in wells and in the subsurface at the ERA site. As site remediation continues, comparisons to this baseline can be one means of measuring the success of carbon tetrachloride vapor extraction. This report contains observations of the patterns and trends associated with data obtained during soil-gas monitoring at the 200 West Area. Monitoring performed since late 1991 includes monitoring soil-gas probes and wellheads for VOCs. This report reflects monitoring data collected from December 1991 through December 1993

  14. Rheological and droplet size analysis of W/O/W multiple emulsions containing low concentrations of polymeric emulsifiers

    Directory of Open Access Journals (Sweden)

    DRAGANA D. VASILJEVIĆ

    2009-07-01

    Full Text Available Multiple emulsions are complex dispersion systems which have many potential applications in pharmaceutics, cosmetics and the food industry. In practice, however, significant problems may arise because of their thermodynamic instability. In this study, W/O/W multiple emulsion systems containing low concentration levels of lipophilic polymeric primary emulsifiers cetyl dimethicone copolyol and PEG–30 dipolyhydroxystearate were evaluated. The concentrations of the primary emulsifiers were set at 1.6 and 2.4 % w/w in the final emulsions. Rheological and droplet size analysis of the investigated samples showed that the type and concentration of the primary lipophilic polymeric emulsifier markedly affected the characteristics of the multiple emulsions. The multiple emulsion prepared with 2.4 % w/w PEG–30 dipolyhydroxystearate as the primary emulsifier exhibited the highest apparent viscosity, yield stress and elastic modulus values, as well as the smallest droplet size. Furthermore, these parameters remained relatively constant over the study period, confirming the high stability of the investigated sample. The results obtained indicate that the changes observed in the investigated samples over time could be attributed to the swelling/breakdown mechanism of the multiple droplets. Such changes could be adequately monitored by rheological and droplet size analysis.

  15. Analysis of cadmium in food by multiple prompt γ-ray spectroscopy

    International Nuclear Information System (INIS)

    Toh, Y.; Oshima, M.; Koizumi, M.; Osa, A.; Kimura, A.; Goto, J.; Hatsukawa, Y.

    2006-01-01

    The Cd concentration in food is a public concern related to the human health. In order to remove Cd-polluted food, the development and validation of a rapid and sensitive method of Cd analysis is required. By applying the multiple γ-ray detection method to prompt γ-ray analysis (PGA), the influence from nuclei which emit only one prompt γ-ray at a time at every neutron capture reaction can be reduced, therefore the quantification limit of Cd is improved significantly. The limit of Cd contained in rice in the case of MPGA was evaluated, and under our proposed experimental conditions, it may be possible to quantify Cd content in rice to within 0.2 ppm in 10 min

  16. DYNAMIC ANALYSIS OF A CRIMPING DEVICE WITH MULTIPLE CAMS USING MSC ADAMS II

    Directory of Open Access Journals (Sweden)

    Gheorghe Popescu

    2012-05-01

    Full Text Available In the present paper, the author presents the results of a dynamic analysis, performed with MSC ADAMS, of a crimping device mechanism with 12 tightening cams, designed and used in the technological process of assembling indigenous electrical detonators. In this sense, the mechanism with multiple cams is considered a mechanical system and is treated as an assembly of rigid bodies connected by mechanical joints and elastic elements. For modelling and simulating the mechanism with multiple cams using the ADAMS program, the author went through the following stages: construction of the model, its testing and simulation, validation, refinement, parametrization, and optimization of the model.

  17. VERY LONG BASELINE ARRAY IMAGING OF PARSEC-SCALE RADIO EMISSIONS IN NEARBY RADIO-QUIET NARROW-LINE SEYFERT 1 GALAXIES

    Energy Technology Data Exchange (ETDEWEB)

    Doi, Akihiro [The Institute of Space and Astronautical Science, Japan Aerospace Exploration Agency, 3-1-1 Yoshinodai, Chuou-ku, Sagamihara, Kanagawa 252-5210 (Japan); Asada, Keiichi; Inoue, Makoto [Academia Sinica Institute of Astronomy and Astrophysics, P.O. Box 23-141, Taipei 10617, Taiwan (China); Fujisawa, Kenta [The Research Institute of Time Studies, Yamaguchi University, 1677-1 Yoshida, Yamaguchi, Yamaguchi 753-8511 (Japan); Nagai, Hiroshi; Hagiwara, Yoshiaki [National Astronomical Observatory, 2-21-1 Osawa, Mitaka, Tokyo 181-8588 (Japan); Wajima, Kiyoaki, E-mail: akihiro.doi@vsop.isas.jaxa.jp [Shanghai Astronomical Observatory, Chinese Academy of Sciences, Shanghai 200030 (China)

    2013-03-01

    We conducted Very Long Baseline Array (VLBA) observations of seven nearby narrow-line Seyfert 1 (NLS1) galaxies at 1.7 GHz (λ18 cm) with milliarcsecond resolution. This is the first systematic very long baseline interferometry study focusing on the central parsec-scale regions of radio-quiet NLS1s. Five of the seven were detected at a brightness temperature of ≳5 × 10⁶ K and contain radio cores with high brightness temperatures of >6 × 10⁷ K, indicating a nonthermal process driven by jet-producing central engines as in radio-loud NLS1s and other active galactic nucleus classes. VLBA images of MRK 1239, MRK 705, and MRK 766 exhibit parsec-scale jets with clear linear structures. A large portion of the radio power comes from diffuse emission components that are distributed within the nuclear regions (≲300 pc), which is a common characteristic throughout the observed NLS1s. Jet kinetic powers limited by the Eddington limit may be insufficient to allow the jets to escape to kiloparsec scales for these radio-quiet NLS1s with low-mass black holes of ≲10⁷ M☉.

  18. Psychosocial job quality, mental health, and subjective wellbeing: a cross-sectional analysis of the baseline wave of the Australian Longitudinal Study on Male Health

    Directory of Open Access Journals (Sweden)

    Anthony D. LaMontagne

    2016-10-01

    Full Text Available Abstract Background Employment status and working conditions are strong determinants of male health, and are therefore an important focus in the Australian Longitudinal Study on Male Health (Ten to Men). In this paper, we describe key work variables included in Ten to Men, and present analyses relating psychosocial job quality to mental health and subjective wellbeing at baseline. Methods A national sample of males aged 10 to 55 years residing in private dwellings was drawn using a stratified multi-stage cluster random sample design. Data were collected between October 2013 and July 2014 for a cohort of 15,988 males, representing a response fraction of 35%. This analysis was restricted to 18-55 year old working-age participants (n = 13,456). Work-related measures included employment status and, for those who were employed, a number of working conditions including an ordinal scale of psychosocial job quality (presence of low job control, high demand and complexity, high job insecurity, and low fairness of pay), and working time-related stressors such as long working hours and night shift work. Associations between psychosocial job quality and two outcome measures, mental ill-health and subjective wellbeing, were assessed using multiple linear regression. Results The majority of participants aged 18-55 years were employed at baseline (85.6%), with 8.4% unemployed and looking for work, and 6.1% not in the labour force. Among employed participants, there was a high prevalence of long working hours (49.9% reported working more than 40 h/week) and night shift work (23.4%). Psychosocial job quality (exposure to 0/1/2/3+ job stressors) prevalence was 36%/37%/20%/7% of the working respondents. There was a dose-response relationship between psychosocial job quality and each of the two outcome measures of mental health and subjective wellbeing after adjusting for potential confounders, with higher magnitude associations

  19. Relationship of Baseline Hemoglobin Level with Serum Ferritin, Postphlebotomy Hemoglobin Changes, and Phlebotomy Requirements among HFE C282Y Homozygotes

    Directory of Open Access Journals (Sweden)

    Seyed Ali Mousavi

    2015-01-01

    Full Text Available Objectives. We aimed to examine whether baseline hemoglobin levels in C282Y-homozygous patients are related to the degree of serum ferritin (SF) elevation and whether patients with different baseline hemoglobin have different phlebotomy requirements. Methods. A total of 196 patients (124 males and 72 females) who had undergone therapeutic phlebotomy and had SF and both pre- and posttreatment hemoglobin values were included in the study. Results. Bivariate correlation analysis suggested that baseline SF explains approximately 6 to 7% of the variation in baseline hemoglobin. The results also showed that males who had higher (≥150 g/L) baseline hemoglobin levels had a significantly greater reduction in their posttreatment hemoglobin despite requiring fewer phlebotomies to achieve iron depletion than those who had lower (<150 g/L) baseline hemoglobin, regardless of whether baseline SF was below or above 1000 µg/L. There were no significant differences between hemoglobin subgroups regarding baseline and treatment characteristics, except for transferrin saturation between male subgroups with SF above 1000 µg/L. Similar differences were observed when females with higher (≥138 g/L) baseline hemoglobin were compared with those with lower (<138 g/L) baseline hemoglobin. Conclusion. Dividing C282Y-homozygous patients into just two subgroups according to the degree of baseline SF elevation may obscure important subgroup variations.

  20. Relationship of Baseline Hemoglobin Level with Serum Ferritin, Postphlebotomy Hemoglobin Changes, and Phlebotomy Requirements among HFE C282Y Homozygotes

    Science.gov (United States)

    Mousavi, Seyed Ali; Mahmood, Faiza; Aandahl, Astrid; Knutsen, Teresa Risopatron; Llohn, Abid Hussain

    2015-01-01

    Objectives. We aimed to examine whether baseline hemoglobin levels in C282Y-homozygous patients are related to the degree of serum ferritin (SF) elevation and whether patients with different baseline hemoglobin have different phlebotomy requirements. Methods. A total of 196 patients (124 males and 72 females) who had undergone therapeutic phlebotomy and had SF and both pre- and posttreatment hemoglobin values were included in the study. Results. Bivariate correlation analysis suggested that baseline SF explains approximately 6 to 7% of the variation in baseline hemoglobin. The results also showed that males who had higher (≥150 g/L) baseline hemoglobin levels had a significantly greater reduction in their posttreatment hemoglobin despite requiring fewer phlebotomies to achieve iron depletion than those who had lower (<150 g/L) baseline hemoglobin, regardless of whether baseline SF was below or above 1000 µg/L. There were no significant differences between hemoglobin subgroups regarding baseline and treatment characteristics, except for transferrin saturation between male subgroups with SF above 1000 µg/L. Similar differences were observed when females with higher (≥138 g/L) baseline hemoglobin were compared with those with lower (<138 g/L) baseline hemoglobin. Conclusion. Dividing C282Y-homozygous patients into just two subgroups according to the degree of baseline SF elevation may obscure important subgroup variations. PMID:26380265
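
    The "6 to 7% of the variation" figure above is the r² from a bivariate Pearson correlation. For reference, a self-contained sketch of that computation, using purely hypothetical ferritin/hemoglobin pairs (these are not the study's data):

```python
import math

def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient; r**2 is the share of variance
    in one variable 'explained' by the other (the ~6-7% figure in the abstract)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical serum ferritin (µg/L) and hemoglobin (g/L) pairs, illustration only:
sf = [300, 800, 1200, 500, 1500, 700, 950, 400]
hb = [138, 145, 152, 141, 155, 147, 149, 140]
r = pearson_r(sf, hb)
variance_explained = r ** 2
```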

  1. Competing failure analysis in phased-mission systems with multiple functional dependence groups

    International Nuclear Information System (INIS)

    Wang, Chaonan; Xing, Liudong; Peng, Rui; Pan, Zhusheng

    2017-01-01

    A phased-mission system (PMS) involves multiple, consecutive, non-overlapping phases of operation. The system structure function and component failure behavior in a PMS can change from phase to phase, posing big challenges to the system reliability analysis. Further complicating the problem is the functional dependence (FDEP) behavior where the failure of certain component(s) causes other component(s) to become unusable or inaccessible or isolated. Previous studies have shown that FDEP can cause competitions between failure propagation and failure isolation in the time domain. While such competing failure effects have been well addressed in single-phase systems, only little work has focused on PMSs with a restrictive assumption that a single FDEP group exists in one phase of the mission. Many practical systems (e.g., computer systems and networks), however may involve multiple FDEP groups during the mission. Moreover, different FDEP groups can be dependent due to sharing some common components; they may appear in a single phase or multiple phases. This paper makes new contributions by modeling and analyzing reliability of PMSs subject to multiple FDEP groups through a Markov chain-based methodology. Propagated failures with both global and selective effects are considered. Four case studies are presented to demonstrate application of the proposed method. - Highlights: • Reliability of phased-mission systems subject to competing failure propagation and isolation effects is modeled. • Multiple independent or dependent functional dependence groups are considered. • Propagated failures with global effects and selective effects are studied. • Four case studies demonstrate generality and application of the proposed Markov-based method.
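
    The Markov chain-based methodology evaluates mission reliability by propagating a state probability distribution through per-phase transition matrices. A toy three-state sketch follows; the states, failure probabilities, and two-phase mission are all hypothetical, and the paper's full model with multiple FDEP groups is far richer:

```python
def mission_reliability(phase_matrices, up_states, start=0):
    """Discrete-state sketch of PMS reliability: multiply the state
    distribution through each phase's transition matrix, then sum the
    probability mass remaining in the 'up' states.
    States here: 0 = system up; 1 = trigger component failed, dependent
    component isolated (system down); 2 = other failure (system down)."""
    dist = [0.0] * len(phase_matrices[0])
    dist[start] = 1.0
    for P in phase_matrices:
        dist = [sum(dist[i] * P[i][j] for i in range(len(dist)))
                for j in range(len(dist))]
    return sum(dist[s] for s in up_states)

# Per-phase failure behavior differs, as in a PMS (hypothetical numbers):
phase1 = [[0.95, 0.03, 0.02], [0, 1, 0], [0, 0, 1]]
phase2 = [[0.90, 0.06, 0.04], [0, 1, 0], [0, 0, 1]]
R = mission_reliability([phase1, phase2], up_states={0})
```

    Because the failed states are absorbing, the mission reliability here reduces to the product of the per-phase survival probabilities; the competing propagation/isolation effects in the paper arise when transitions between failure states depend on which failure occurred first.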

  2. Sensitivity analysis for the effects of multiple unmeasured confounders.

    Science.gov (United States)

    Groenwold, Rolf H H; Sterne, Jonathan A C; Lawlor, Debbie A; Moons, Karel G M; Hoes, Arno W; Tilling, Kate

    2016-09-01

    Observational studies are prone to (unmeasured) confounding. Sensitivity analysis of unmeasured confounding typically focuses on a single unmeasured confounder. The purpose of this study was to assess the impact of multiple (possibly weak) unmeasured confounders. Simulation studies were performed based on parameters estimated from the British Women's Heart and Health Study, including 28 measured confounders and assuming no effect of ascorbic acid intake on mortality. In addition, 25, 50, or 100 unmeasured confounders were simulated, with various mutual correlations and correlations with measured confounders. The correlated unmeasured confounders did not need to be strongly associated with exposure and outcome to substantially bias the exposure-outcome association of interest, provided that there are sufficiently many unmeasured confounders. Correlations between unmeasured confounders, in addition to the strength of their relationship with exposure and outcome, are key drivers of the magnitude of unmeasured confounding and should be considered in sensitivity analyses. However, if the unmeasured confounders are correlated with measured confounders, the bias yielded by unmeasured confounders is partly removed through adjustment for the measured confounders. Discussions of the potential impact of unmeasured confounding in observational studies, and sensitivity analyses to examine this, should focus on the potential for the joint effect of multiple unmeasured confounders to bias results. Copyright © 2016 Elsevier Inc. All rights reserved.
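
    The core finding, that many individually weak but mutually correlated unmeasured confounders can badly bias an exposure-outcome association, is easy to reproduce in a toy simulation. The sketch below is not the study's simulation code; all parameter values are hypothetical. Fifty confounders share a latent factor, each weakly drives both exposure and outcome, and despite no true effect a strong spurious correlation appears:

```python
import math
import random

def simulate_bias(n=4000, n_conf=50, rho_load=0.7, beta=0.1, seed=7):
    """No true exposure effect: exposure X and outcome Y share only weakly
    associated, mutually correlated unmeasured confounders U_1..U_k.
    Returns the (biased) Pearson correlation between X and Y."""
    rng = random.Random(seed)
    resid = math.sqrt(1 - rho_load ** 2)  # keeps each U_j at unit variance
    xs, ys = [], []
    for _ in range(n):
        latent = rng.gauss(0, 1)  # shared factor inducing U-U correlation
        s = sum(rho_load * latent + resid * rng.gauss(0, 1)
                for _ in range(n_conf))
        xs.append(beta * s + rng.gauss(0, 1))  # exposure: driven by confounders
        ys.append(beta * s + rng.gauss(0, 1))  # outcome: confounders only
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

biased_r = simulate_bias()
```

    With uncorrelated confounders (rho_load near 0) the same coefficients yield a much smaller spurious correlation, illustrating the paper's point that the mutual correlations are a key driver of the bias.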

  3. Low-dose budesonide treatment reduces severe asthma-related events in patients with infrequent asthma symptoms at baseline

    DEFF Research Database (Denmark)

    Reddel, H. K.; Busse, W. W.; Pedersen, Søren

    2015-01-01

    symptoms, evidence is lacking for the benefit of ICS and safety of bronchodilator-only treatment. We investigated asthma outcomes by baseline symptom frequency in a post-hoc analysis of the multinational inhaled Steroid Treatment As Regular Therapy in early asthma (START) study. METHODS: Patients aged 4......-66 years with recent-onset mild asthma (11 years] or 200 µg [patients aged 2 symptom days/week; further divided into 0-1, >1-2 symptom days/week). RESULTS: Overall, 7138 patients were included (budesonide, n=3577; placebo, n=3561). At baseline, symptom frequency was 0-1 symptom days/week for 2184 (30...... even in patients with the lowest baseline asthma symptom frequency (0-1 days/week). (Figure Presented)....

  4. First Grade Baseline Evaluation

    Science.gov (United States)

    Center for Innovation in Assessment (NJ1), 2013

    2013-01-01

    The First Grade Baseline Evaluation is an optional tool that can be used at the beginning of the school year to help teachers get to know the reading and language skills of each student. The evaluation is composed of seven screenings. Teachers may use the entire evaluation or choose to use those individual screenings that they find most beneficial…

  5. Sensitivity of amounts and distribution of tropical forest carbon credits depending on baseline rules

    International Nuclear Information System (INIS)

    Griscom, Bronson; Shoch, David; Stanley, Bill; Cortez, Rane; Virgilio, Nicole

    2009-01-01

    One of the largest sources of global greenhouse gas emissions can be addressed through conservation of tropical forests by channeling funds to developing countries at a cost-savings for developed countries. However, questions remain to be resolved in negotiating a system for including reduced emissions from deforestation and forest degradation (REDD) in a post-Kyoto climate treaty. The approach to determine national baselines, or reference levels, for quantifying REDD has emerged as central to negotiations over a REDD mechanism in a post-Kyoto policy framework. The baseline approach is critical to the success of a REDD mechanism because it affects the quantity, credibility, and equity of credits generated from efforts to reduce forest carbon emissions. We compared outcomes of seven proposed baseline approaches as a function of country circumstances, using a retrospective analysis of FAO-FRA data on forest carbon emissions from deforestation. Depending upon the baseline approach used, the total credited emissions avoided ranged over two orders of magnitude for the same quantity of actual emissions reductions. There was also a wide range in the relative distribution of credits generated among the five country types we identified. Outcomes were especially variable for countries with high remaining forest and low rates of deforestation (HFLD). We suggest that the most credible approaches measure emissions avoided with respect to a business-as-usual baseline scenario linked to historic emissions data, and allow limited adjustments based on forest carbon stocks.
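
    A baseline approach converts historic emissions into a reference level against which credits are computed, so the rule chosen directly scales the credited reductions. A hedged sketch of a historical-average rule with an optional negotiated adjustment factor (all numbers illustrative, not from the FAO-FRA analysis):

```python
def credited_reductions(historic_emissions, actual, adjustment=1.0):
    """Credits under a historical-average baseline: the reference level is
    the mean of past emissions, optionally scaled by a negotiated adjustment
    factor (e.g. for HFLD countries); credits accrue only when actual
    emissions fall below the reference level."""
    baseline = adjustment * sum(historic_emissions) / len(historic_emissions)
    return max(baseline - actual, 0.0)

# Same actual emissions, different credits depending on the baseline rule:
past = [100.0, 110.0, 120.0]  # hypothetical MtCO2/yr over a reference period
credits_plain = credited_reductions(past, actual=90.0)
credits_adjusted = credited_reductions(past, actual=90.0, adjustment=1.2)
```

    Here the plain historical average credits 20 units while a 1.2x adjustment credits 42, more than doubling the award for identical actual emissions, which is the sensitivity the paper quantifies across seven proposed approaches.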

  6. Esophageal acid exposure decreases intraluminal baseline impedance levels

    NARCIS (Netherlands)

    Kessing, Boudewijn F.; Bredenoord, Albert J.; Weijenborg, Pim W.; Hemmink, Gerrit J. M.; Loots, Clara M.; Smout, A. J. P. M.

    2011-01-01

    Intraluminal baseline impedance levels are determined by the conductivity of the esophageal wall and can be decreased in gastroesophageal reflux disease (GERD) patients. The aim of this study was to investigate the baseline impedance in GERD patients, on and off proton pump inhibitor (PPI), and in

  7. [Construction and analysis of a monitoring system with remote real-time multiple physiological parameters based on cloud computing].

    Science.gov (United States)

    Zhu, Lingyun; Li, Lianjie; Meng, Chunyan

    2014-12-01

    There have been problems in existing multiple physiological parameter real-time monitoring systems, such as insufficient server capacity for physiological data storage and analysis (so that data consistency cannot be guaranteed), poor real-time performance, and other issues caused by the growing scale of data. We therefore proposed a new solution, based on cloud computing, in which multiple physiological parameters are stored and processed on a clustered back end. Through our studies, a batch-processing approach for longitudinal analysis of patients' historical data was introduced. The process included the resource virtualization of the IaaS layer for the cloud platform, the construction of the real-time computing platform of the PaaS layer, the reception and analysis of the data stream of the SaaS layer, and the bottleneck problem of multi-parameter data transmission. The result was real-time transmission, storage and analysis of large amounts of physiological information. The simulation test results showed that the remote multiple physiological parameter monitoring system based on a cloud platform had obvious advantages in processing time and load balancing over the traditional server model. This architecture solves the problems of long turnaround time, poor real-time analysis performance and lack of extensibility that exist in traditional remote medical services, and provides technical support for a "wearable wireless sensor plus mobile wireless transmission plus cloud computing service" mode moving towards home health monitoring with multiple physiological parameter wireless monitoring.

  8. Value of the free light chain analysis in the clinical evaluation of response in multiple myeloma patients receiving anti-myeloma therapy

    DEFF Research Database (Denmark)

    Toftmann Hansen, Charlotte; Pedersen, Per T.; Jensen, Bo Amdi

    Value of the free light chain analysis in the clinical evaluation of response in multiple myeloma patients receiving anti-myeloma therapy.

  9. Limits in feature-based attention to multiple colors.

    Science.gov (United States)

    Liu, Taosheng; Jigo, Michael

    2017-11-01

    Attention to a feature enhances the sensory representation of that feature. Although much has been learned about the properties of attentional modulation when attending to a single feature, the effectiveness of attending to multiple features is not well understood. We investigated this question in a series of experiments using a color-detection task while varying the number of attended colors in a cueing paradigm. Observers were shown either a single cue, two cues, or no cue (baseline) before detecting a coherent color target. We measured detection threshold by varying the coherence level of the target. Compared to the baseline condition, we found consistent facilitation of detection performance in the one-cue and two-cue conditions, but performance in the two-cue condition was lower than that in the one-cue condition. In the final experiment, we presented a 50% valid cue to emulate the situation in which observers were only able to attend a single color in the two-cue condition, and found equivalent detection thresholds with the standard two-cue condition. These results indicate a limit in attending to two colors and further imply that observers could effectively attend a single color at a time. Such a limit is likely due to an inability to maintain multiple active attentional templates for colors.

  10. Distributed state machine supervision for long-baseline gravitational-wave detectors

    International Nuclear Information System (INIS)

    Rollins, Jameson Graef

    2016-01-01

    The Laser Interferometer Gravitational-wave Observatory (LIGO) consists of two identical yet independent, widely separated, long-baseline gravitational-wave detectors. Each Advanced LIGO detector consists of complex optical-mechanical systems isolated from the ground by multiple layers of active seismic isolation, all controlled by hundreds of fast, digital, feedback control systems. This article describes a novel state machine-based automation platform developed to handle the automation and supervisory control challenges of these detectors. The platform, called Guardian, consists of distributed, independent, state machine automaton nodes organized hierarchically for full detector control. User code is written in standard Python and the platform is designed to facilitate the fast-paced development process associated with commissioning the complicated Advanced LIGO instruments. While developed specifically for the Advanced LIGO detectors, Guardian is a generic state machine automation platform that is useful for experimental control at all levels, from simple table-top setups to large-scale multi-million dollar facilities.
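    As the abstract notes, Guardian user code is written in standard Python. A minimal, hypothetical sketch of a state-machine automaton node follows; the class and method names are invented for illustration and are not the actual Guardian API:

```python
# Minimal state-machine automaton node in plain Python.
# Names are invented for illustration, not the actual Guardian API.
class Node:
    def __init__(self, name, initial):
        self.name = name
        self.state = initial
        self.transitions = {}  # (state, event) -> new state

    def on(self, state, event, new_state):
        """Register a transition for an event arriving in a given state."""
        self.transitions[(state, event)] = new_state

    def handle(self, event):
        """Apply the event if a transition exists; otherwise keep the state."""
        self.state = self.transitions.get((self.state, event), self.state)
        return self.state

# A toy "lock acquisition" node, loosely mimicking detector subsystem control.
node = Node("CAVITY", initial="DOWN")
node.on("DOWN", "acquire", "LOCKING")
node.on("LOCKING", "locked", "LOCKED")
node.on("LOCKED", "lost", "DOWN")

node.handle("acquire")
print(node.handle("locked"))  # LOCKED
```

In a hierarchical arrangement, a supervisory node would issue events like "acquire" to subordinate nodes and monitor their reported states, which is the organizing idea behind the platform described above.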

  11. Distributed state machine supervision for long-baseline gravitational-wave detectors

    Energy Technology Data Exchange (ETDEWEB)

    Rollins, Jameson Graef, E-mail: jameson.rollins@ligo.org [LIGO Laboratory, California Institute of Technology, Pasadena, California 91125 (United States)

    2016-09-15

    The Laser Interferometer Gravitational-wave Observatory (LIGO) consists of two identical yet independent, widely separated, long-baseline gravitational-wave detectors. Each Advanced LIGO detector consists of complex optical-mechanical systems isolated from the ground by multiple layers of active seismic isolation, all controlled by hundreds of fast, digital, feedback control systems. This article describes a novel state machine-based automation platform developed to handle the automation and supervisory control challenges of these detectors. The platform, called Guardian, consists of distributed, independent, state machine automaton nodes organized hierarchically for full detector control. User code is written in standard Python and the platform is designed to facilitate the fast-paced development process associated with commissioning the complicated Advanced LIGO instruments. While developed specifically for the Advanced LIGO detectors, Guardian is a generic state machine automation platform that is useful for experimental control at all levels, from simple table-top setups to large-scale multi-million dollar facilities.

  12. Measuring cognitive change with ImPACT: the aggregate baseline approach.

    Science.gov (United States)

    Bruce, Jared M; Echemendia, Ruben J; Meeuwisse, Willem; Hutchison, Michael G; Aubry, Mark; Comper, Paul

    2017-11-01

    The Immediate Post-Concussion Assessment and Cognitive Test (ImPACT) is commonly used to assess baseline and post-injury cognition among athletes in North America. Despite this, several studies have questioned the reliability of ImPACT when given at intervals employed in clinical practice. Poor test-retest reliability reduces test sensitivity to cognitive decline, increasing the likelihood that concussed athletes will be returned to play prematurely. We recently showed that the reliability of ImPACT can be increased when using a new composite structure and the aggregate of two baselines to predict subsequent performance. The purpose of the present study was to confirm our previous findings and determine whether the addition of a third baseline would further increase the test-retest reliability of ImPACT. Data from 97 English speaking professional hockey players who had received at least 4 ImPACT baseline evaluations were extracted from a National Hockey League Concussion Program database. Linear regression was used to determine whether each of the first three testing sessions accounted for unique variance in the fourth testing session. Results confirmed that the aggregate baseline approach improves the psychometric properties of ImPACT, with most indices demonstrating adequate or better test-retest reliability for clinical use. The aggregate baseline approach provides a modest clinical benefit when recent baselines are available - and a more substantial benefit when compared to approaches that obtain baseline measures only once during the course of a multi-year playing career. Pending confirmation in diverse samples, neuropsychologists are encouraged to use the aggregate baseline approach to best quantify cognitive change following sports concussion.
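    The aggregate baseline idea (predicting a later testing session from several earlier baselines rather than the most recent one alone) can be sketched with synthetic data. The simulation below is illustrative only; it does not use the study's data or composite structure:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: a stable "true" cognitive score per athlete plus
# session-to-session noise, mimicking four repeated baseline evaluations.
true_score = rng.normal(100, 10, size=200)
sessions = true_score[:, None] + rng.normal(0, 5, size=(200, 4))
X123, y4 = sessions[:, :3], sessions[:, 3]

def r2(X, y):
    """R^2 of an ordinary least-squares fit (with intercept) of y on X."""
    A = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    return 1 - resid.var() / y.var()

# Aggregating baselines: predicting session 4 from all three prior sessions
# explains more variance than using the most recent session alone.
print(r2(X123, y4) > r2(sessions[:, [2]], y4))  # True (in this simulation)
```

Averaging over several baselines suppresses session-specific noise, which is why the aggregate approach improves test-retest reliability.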

  13. Work Domain Analysis of a Predecessor Sodium-cooled Reactor as Baseline for AdvSMR Operational Concepts

    Energy Technology Data Exchange (ETDEWEB)

    Ronald Farris; David Gertman; Jacques Hugo

    2014-03-01

    provide sophisticated operational information visualization, coupled with adaptive automation schemes and operator support systems to reduce complexity. These all have to be mapped at some point to human performance requirements. The EBR-II results will be used as a baseline that will be extrapolated in the extended Cognitive Work Analysis phase to the analysis of a selected advanced sodium-cooled SMR design as a way to establish non-conventional operational concepts. The Work Domain Analysis results achieved during this phase have not only established an organizing and analytical framework for describing existing sociotechnical systems, but have also indicated that the method is particularly suited to the analysis of prospective and immature designs. The results of the EBR-II Work Domain Analysis have indicated that the methodology is scientifically sound and generalizable to any operating environment.

  14. The effectiveness of the PRISMA integrated service delivery network: preliminary report on methods and baseline data

    Directory of Open Access Journals (Sweden)

    Réjean Hébert

    2008-02-01

    Purpose: The PRISMA study analyzes an innovative coordination-type integrated service delivery (ISD) system developed to improve continuity and increase the effectiveness and efficiency of services, especially for older and disabled populations. The objective of the PRISMA study is to evaluate the effectiveness of this system in improving the health, empowerment and satisfaction of frail older people and modifying their health and social services utilization, without increasing the burden on informal caregivers. The objective of this paper is to present the methodology and give baseline data on the study participants. Methods: A quasi-experimental study with pre-test, multiple post-tests, and a comparison group was used to evaluate the impact of the PRISMA ISD. Elders at risk of functional decline (501 experimental, 419 control) participated in the study. Results: At entry, the two groups were comparable for most variables. Over the first year, when the implementation rate was low (32%), participants from the control group used fewer services than those from the experimental group. After the first year, no significant statistical difference was observed for functional decline and changes in the other outcome variables. Conclusion: This first year must be considered a baseline year, showing the situation without significant implementation of PRISMA ISD systems. Results for the following years will have to be examined with consideration of these baseline results.

  15. Integrated Baseline System (IBS) Version 2.0: Models guide

    Energy Technology Data Exchange (ETDEWEB)

    1994-03-01

    The Integrated Baseline System (IBS) is an emergency management planning and analysis tool being developed under the direction of the US Army Nuclear and Chemical Agency. This Models Guide summarizes the IBS use of several computer models for predicting the results of emergency situations. These include models for predicting dispersion/doses of airborne contaminants, traffic evacuation, explosion effects, heat radiation from a fire, and siren sound transmission. The guide references additional technical documentation on the models when such documentation is available from other sources. The audience for this manual is chiefly emergency management planners and analysts, but also data managers and system managers.

  16. Common pitfalls in statistical analysis: The perils of multiple testing

    Science.gov (United States)

    Ranganathan, Priya; Pramesh, C. S.; Buyse, Marc

    2016-01-01

    Multiple testing refers to situations where a dataset is subjected to statistical testing multiple times - either at multiple time-points or through multiple subgroups or for multiple end-points. This amplifies the probability of a false-positive finding. In this article, we look at the consequences of multiple testing and explore various methods to deal with this issue. PMID:27141478
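    The inflation of the false-positive probability described above follows directly from the arithmetic of independent tests. A minimal sketch (assuming independent tests, which real end-points rarely satisfy exactly):

```python
# Family-wise error rate (FWER) for n independent tests at per-test level alpha,
# and the Bonferroni-corrected per-test threshold. Illustrative sketch only.

def fwer(n_tests, alpha=0.05):
    """Probability of at least one false positive across independent tests."""
    return 1 - (1 - alpha) ** n_tests

def bonferroni(n_tests, alpha=0.05):
    """Per-test significance threshold that keeps the FWER at or below alpha."""
    return alpha / n_tests

# Testing 14 end-points at alpha = 0.05 gives a better-than-even chance
# of at least one spurious "significant" result.
print(round(fwer(14), 3))        # 0.512
print(round(bonferroni(14), 5))  # 0.00357
```

Bonferroni is the simplest of the corrections the article surveys; less conservative procedures (e.g. Holm, Benjamini-Hochberg) trade strict FWER control for power.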

  17. Study on the calibration and optimization of double theodolites baseline

    Science.gov (United States)

    Ma, Jing-yi; Ni, Jin-ping; Wu, Zhi-chao

    2018-01-01

    The baseline of a double-theodolite measurement system serves as the scale benchmark of the system and affects its accuracy; this paper therefore presents a method for calibrating and optimizing the double-theodolite baseline. The two theodolites measure a reference ruler of known length, and the baseline is then solved in reverse from the baseline formula. Analysis based on the law of error propagation shows that the baseline error function is an important index of system accuracy, and that the position, posture and other properties of the reference ruler affect the baseline error. An optimization model is established with the baseline error function as the objective function, and the position and posture of the reference ruler are optimized. The simulation results show that the height of the reference ruler has no effect on the baseline error; the effect of posture is not uniform; and when the reference ruler is placed at x = 500 mm and y = 1000 mm in the measurement space, the baseline error is smallest. The experimental results are consistent with the theoretical analyses in the measurement space. The study of reference-ruler placement presented here thus provides a reference for improving the accuracy of double-theodolite measurement systems.
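    The calibration idea (measure a reference ruler of known length, then solve the baseline in reverse) can be sketched in a simplified 2D setting. The geometry below is an illustrative reconstruction, not the paper's actual formulation: every triangulated distance scales linearly with the assumed baseline, so one known length fixes the true baseline.

```python
import math

def intersect(b, alpha, beta):
    """Forward intersection: stations at (0,0) and (b,0); alpha and beta are
    the interior triangle angles at each station (radians)."""
    d_a = b * math.sin(beta) / math.sin(alpha + beta)  # law of sines
    return (d_a * math.cos(alpha), d_a * math.sin(alpha))

def calibrate(ruler_len, obs, b_guess=1.0):
    """obs = [(alpha1, beta1), (alpha2, beta2)] for the ruler's two endpoints.
    Rescale an assumed baseline so the measured ruler matches its known length."""
    p1, p2 = (intersect(b_guess, a, b) for a, b in obs)
    measured = math.dist(p1, p2)
    return b_guess * ruler_len / measured

# Synthetic check: a 2 m ruler observed from a true 5 m baseline.
true_b = 5.0
p1, p2 = (1.0, 4.0), (3.0, 4.0)  # ruler endpoints, 2 m apart

def angles(p):
    a = math.atan2(p[1], p[0])           # interior angle at station A
    b = math.atan2(p[1], true_b - p[0])  # interior angle at station B
    return a, b

print(round(calibrate(2.0, [angles(p1), angles(p2)]), 6))  # 5.0
```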

  18. STATUS OF THE US LONG BASELINE NEUTRINO EXPERIMENT STUDY.

    Energy Technology Data Exchange (ETDEWEB)

    BISHAI,M.

    2006-09-21

    The US Long Baseline Neutrino Experiment Study was commissioned jointly by Brookhaven National Laboratory and Fermi National Accelerator Laboratory to investigate the potential for future U.S. based long baseline neutrino oscillation experiments beyond the currently planned program. The Study focused on MW-class conventional neutrino beams that can be produced at Fermilab or BNL. The experimental baselines are based on two possible detector locations: (1) off-axis to the existing Fermilab NuMI beamline at baselines of 700 to 810 km, and (2) NSF's proposed future Deep Underground Science and Engineering Laboratory (DUSEL) at baselines greater than 1000 km. Two detector technologies are considered: a megaton-class water Cherenkov detector deployed deep underground at a DUSEL site, or a 100 kT liquid argon time-projection chamber (TPC) deployed on the surface at any of the proposed sites. The physics sensitivities of the proposed experiments are summarized. We find that conventional horn-focused wide-band neutrino beam options from Fermilab or BNL aimed at a massive detector with a baseline of >1000 km have the best sensitivity to CP violation and the neutrino mass hierarchy for values of the mixing angle θ13 down to 2.2°.

  19. A public finance analysis of multiple reserve requirements

    OpenAIRE

    Espinosa-Vega, Marco; Russell, Steven

    1998-01-01

    This paper analyzes multiple reserve requirements of the type that have been imposed by a number of developing countries. We show that previous theoretical work on this topic has not succeeded in providing a social welfare rationale for the existence of multiple reserve requirements: in the basic reserve requirements model, any allocation that can be supported by a multiple-reserves regime can also be supported by a single-bond reserve requirement. We go on to present extended versions of the...

  20. Cladribine tablets for relapsing-remitting multiple sclerosis

    DEFF Research Database (Denmark)

    Rammohan, Kottil; Giovannoni, Gavin; Comi, Giancarlo

    2012-01-01

    BACKGROUND: In the phase III CLARITY study, treatment with cladribine tablets at cumulative doses of 3.5 or 5.25 mg/kg over 96 weeks led to significant reductions in annualized relapse rates (ARR) versus placebo in patients with relapsing-remitting multiple sclerosis. Further post hoc analyses of CLARITY study data were conducted to determine the efficacy of cladribine tablets across patient subgroups stratified by baseline characteristics. METHODS: Relapse rates over the 96-week CLARITY study were analyzed in cohorts stratified by demographics; disease duration; treatment history and disease activity at baseline. RESULTS: In the intent-to-treat population (n=437, 433 and 456 in the placebo, cladribine 3.5 and 5.25 mg/kg groups, respectively), treatment with cladribine tablets 3.5 and 5.25 mg/kg led to consistent improvements in ARR versus placebo in patients stratified by gender; age (≤40

  1. Better Informing Decision Making with Multiple Outcomes Cost-Effectiveness Analysis under Uncertainty in Cost-Disutility Space

    Science.gov (United States)

    McCaffrey, Nikki; Agar, Meera; Harlum, Janeane; Karnon, Jonathon; Currow, David; Eckermann, Simon

    2015-01-01

    Introduction: Comparing multiple, diverse outcomes with cost-effectiveness analysis (CEA) is important, yet challenging in areas like palliative care, where domains are unamenable to integration with survival. Generic multi-attribute utility values exclude important domains and non-health outcomes, while partial analyses (where outcomes are considered separately, with their joint relationship under uncertainty ignored) lead to incorrect inference regarding preferred strategies. Objective: The objective of this paper is to consider whether such decision making can be better informed with alternative presentation and summary measures, extending methods previously shown to have advantages in multiple strategy comparison. Methods: Multiple outcomes CEA of a home-based palliative care model (PEACH) relative to usual care is undertaken in cost-disutility (CDU) space and compared with analysis on the cost-effectiveness plane. Summary measures developed for comparing strategies across potential threshold values for multiple outcomes include: expected net loss (ENL) planes quantifying differences in expected net benefit; the ENL contour identifying preferred strategies minimising ENL and their expected value of perfect information; and cost-effectiveness acceptability planes showing the probability of strategies minimising ENL. Results: Conventional analysis suggests PEACH is cost-effective when the threshold value per additional day at home (λ1) exceeds $1,068, or is dominated by usual care when only the proportion of home deaths is considered. In contrast, neither alternative dominates in CDU space, where cost and outcomes are jointly considered, with the optimal strategy depending on threshold values. For example, PEACH minimises ENL when λ1 = $2,000 and λ2 = $2,000 (the threshold value for dying at home), with a 51.6% chance of PEACH being cost-effective. Conclusion: Comparison in CDU space and associated summary measures have distinct advantages over multiple domain comparisons, aiding
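    The expected net loss (ENL) summary can be sketched with a small Monte Carlo example. All sampling distributions and threshold values below are invented for illustration; the study's actual PEACH data are not used.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000  # Monte Carlo samples of uncertain incremental costs and outcomes

# Invented distributions, incremental to usual care (whose increments are 0)
cost = {"PEACH": rng.normal(1500, 400, n), "usual care": np.zeros(n)}
days_home = {"PEACH": rng.normal(1.2, 0.5, n), "usual care": np.zeros(n)}
home_death = {"PEACH": rng.normal(0.10, 0.05, n), "usual care": np.zeros(n)}

def enl(lam1, lam2):
    """Expected net loss of each strategy at threshold values lam1 (per extra
    day at home) and lam2 (per home death). Lower is better; the minimum over
    strategies equals the expected value of perfect information (EVPI)."""
    nb = {s: lam1 * days_home[s] + lam2 * home_death[s] - cost[s] for s in cost}
    best = np.maximum.reduce(list(nb.values()))
    return {s: float(np.mean(best - nb[s])) for s in nb}

losses = enl(2000, 2000)
print(min(losses, key=losses.get))  # preferred strategy at these thresholds
```

Sweeping `lam1` and `lam2` over a grid of values traces out the ENL planes and contour described in the abstract.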

  2. High baseline left ventricular end systolic volume may identify patients at risk of chemotherapy-induced cardiotoxicity

    International Nuclear Information System (INIS)

    Atiar Rahman; Alex Gedevanishvili; Seham Ali; Elma G Briscoe; Vani Vijaykumar

    2004-01-01

    Introduction and Methods: Use of chemotherapeutic drugs in the treatment of cancer may lead to serious cardiotoxicity and to post-treatment heart failure. Various strategies have been developed to minimize the risk of cardiotoxicity, including keeping the total dosage given to each patient below a certain 'threshold' value and monitoring the patient's cardiac function by means of the 'Multiple Gated Acquisition' (MUGA) scan using technetium-99m. However, even with all these precautions some patients still develop cardiotoxicity, and it is not well known which factors predict deterioration of cardiac function in patients with optimized chemotherapeutic dosages. In this retrospective study we sought to evaluate the predictive value of seven variables (age, sex, baseline LV ejection fraction, LV end diastolic [LVEDV] and end systolic volumes [LVESV], peak diastolic filling rate, and preexisting malignancies requiring chemotherapy) in 172 patients (breast carcinoma 86, lymphoma 62, leukemias and others 24) undergoing chemotherapy from 1995 until 2000. There was no cut-off for left ventricular ejection fraction prior to chemotherapy. However, patients were excluded from analysis if they had significant cardiac arrhythmias or received doses higher than considered safe for cardiotoxicity at the beginning of the study. Significant cardiotoxicity was defined as a drop in post-chemotherapy LVEF by >15%. Results: Logistic regression models were used to predict the probability of developing cardiotoxicity as a function of the seven prognostic covariates. The mean age of all patients was 51±13 years. Significant cardiac toxicity was noted in 10 percent of patients. The overall risk estimate for subsequent heart failure after chemotherapy, however, climbed to 18 percent in patients with a presenting LVESV >50 mL. Using a multivariate logistic regression model, older age was noted to be a weak risk factor for cardiac toxicity (confidence interval 0.8-1.2), whereas a high baseline LVESV (>50 mL) appeared to
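    The study's modelling approach (logistic regression mapping prognostic covariates to a probability of cardiotoxicity) can be sketched as follows. The coefficients below are invented for illustration; they are not the study's fitted values.

```python
import math

# Hedged sketch of a logistic risk model. Coefficients are invented:
# older age and a larger end-systolic volume raise the modelled risk,
# a higher baseline ejection fraction lowers it.
def cardiotoxicity_prob(age, lvef, lvesv_ml):
    """Probability of cardiotoxicity from a (hypothetical) logistic model."""
    z = -4.0 + 0.02 * age - 0.03 * lvef + 0.04 * lvesv_ml
    return 1 / (1 + math.exp(-z))

low_risk = cardiotoxicity_prob(age=45, lvef=60, lvesv_ml=35)
high_risk = cardiotoxicity_prob(age=65, lvef=50, lvesv_ml=60)
print(high_risk > low_risk)  # True: larger LVESV and older age increase risk
```

In the actual study such a model is fitted to observed outcomes, and covariates whose confidence intervals exclude no effect (unlike age's 0.8-1.2 interval here) are read as predictors.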

  3. The effectiveness of music as a mnemonic device on recognition memory for people with multiple sclerosis.

    Science.gov (United States)

    Moore, Kimberly Sena; Peterson, David A; O'Shea, Geoffrey; McIntosh, Gerald C; Thaut, Michael H

    2008-01-01

    Research shows that people with multiple sclerosis exhibit learning and memory difficulties and that music can be used successfully as a mnemonic device to aid in learning and memory. However, there is currently no research investigating the effectiveness of music mnemonics as a compensatory learning strategy for people with multiple sclerosis. Participants with clinically definitive multiple sclerosis (N = 38) were given a verbal learning and memory test. Results from a recognition memory task were analyzed that compared learning through music (n = 20) versus learning through speech (n = 18). Preliminary baseline neuropsychological data were collected that measured executive functioning skills, learning and memory abilities, sustained attention, and level of disability. An independent samples t test showed no significant difference between groups on baseline neuropsychological functioning or on recognition task measures. Correlation analyses suggest that music mnemonics may facilitate learning for people who are less impaired by the disease. Implications for future research are discussed.

  4. Placental baseline conditions modulate the hyperoxic BOLD-MRI response.

    Science.gov (United States)

    Sinding, Marianne; Peters, David A; Poulsen, Sofie S; Frøkjær, Jens B; Christiansen, Ole B; Petersen, Astrid; Uldbjerg, Niels; Sørensen, Anne

    2018-01-01

    Human pregnancies complicated by placental dysfunction may be characterized by a high hyperoxic blood oxygen level-dependent (BOLD) MRI response. The pathophysiology behind this phenomenon remains to be established. The aim of this study was to evaluate whether it is associated with altered placental baseline conditions, including a lower oxygenation and altered tissue morphology, as estimated by the placental transverse relaxation time (T2*). We included 49 normal pregnancies (controls) and 13 pregnancies complicated by placental dysfunction (cases), defined by a low birth weight. We measured the relative ΔBOLD response ((hyperoxic BOLD - baseline BOLD)/baseline BOLD) from a dynamic single-echo gradient-recalled echo (GRE) MRI sequence and the absolute ΔT2* (hyperoxic T2* - baseline T2*) from breath-hold multi-echo GRE sequences. In the control group, the relative ΔBOLD response increased during gestation from 5% in gestational week 20 to 20% in week 40. In the case group, the relative ΔBOLD response was significantly higher (mean Z-score 4.94; 95% CI 2.41, 7.47). The absolute ΔT2*, however, did not differ between controls and cases (p = 0.37), whereas the baseline T2* was lower among cases (mean Z-score -3.13; 95% CI -3.94, -2.32). Furthermore, we demonstrated a strong negative linear correlation between the Log 10 ΔBOLD response and the baseline T2* (r = -0.88). These findings suggest that the high hyperoxic BOLD response is explained by altered baseline conditions, as the absolute increase in placental oxygenation (ΔT2*) does not differ between groups. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Sensitivity Analysis of Multiple Informant Models When Data Are Not Missing at Random

    Science.gov (United States)

    Blozis, Shelley A.; Ge, Xiaojia; Xu, Shu; Natsuaki, Misaki N.; Shaw, Daniel S.; Neiderhiser, Jenae M.; Scaramella, Laura V.; Leve, Leslie D.; Reiss, David

    2013-01-01

    Missing data are common in studies that rely on multiple informant data to evaluate relationships among variables for distinguishable individuals clustered within groups. Estimation of structural equation models using raw data allows for incomplete data, and so all groups can be retained for analysis even if only 1 member of a group contributes…

  6. Stepwise Analysis of Differential Item Functioning Based on Multiple-Group Partial Credit Model.

    Science.gov (United States)

    Muraki, Eiji

    1999-01-01

    Extended an Item Response Theory (IRT) method for detection of differential item functioning to the partial credit model and applied the method to simulated data using a stepwise procedure. Then applied the stepwise DIF analysis based on the multiple-group partial credit model to writing trend data from the National Assessment of Educational…

  7. Revised CDM baseline study on fuel use and manure management at household level

    Energy Technology Data Exchange (ETDEWEB)

    Buysman, E.; Bryan, S.; Pino, M.

    2010-05-15

    This report presents the revised version of the original CDM baseline study conducted in 2006. The original study was conducted under the authority of the National Biogas Program (NBP) to study the potential GHG mitigation resulting from the adoption of domestic biodigesters. In the beginning of June 2006, a survey of 300 randomly selected households with the technical potential for a biodigester was conducted in the NBP's six targeted provinces (Kampong Cham, Svay Rieng, Prey Veng, Kampong Speu, Takeo and Kandal) in southeast Cambodia. The revised baseline study includes two additional provinces, Kampot and Kampong Chhnang. The survey showed that a significant proportion of the households have no access to basic sanitation and often have health problems. They consume mainly wood as cooking fuel, and the majority use inefficient cooking stoves. The main lighting fuel is kerosene. The GHG emissions were calculated for each type of Animal Waste Management System (AWMS) and for baseline fuel consumption. The main methodologies used are the GS-VER biodigester methodology and the IPCC 2006 guidelines, applied to estimate ex ante the baseline emissions, project emissions and emission reductions. The GHG emission from wood burning is only considered when the wood originates from a non-renewable source; the NRB analysis determined a non-renewable biomass share of 70.7% for both collected and purchased wood. Total GHG emission is calculated by combining AWMS and wood-fuel emissions. The annual baseline and project emissions were estimated at 5.38 tCO2eq and 0.46 tCO2eq per average household, respectively; the emission reduction (ER) is therefore 4.92 tCO2eq/household/year.
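    The emission-reduction arithmetic reported above is simply baseline minus project emissions, per household per year:

```python
# The study's reported figures: ER = baseline emissions - project emissions.
baseline_tco2e = 5.38  # average household baseline (tCO2eq per year)
project_tco2e = 0.46   # with a domestic biodigester (tCO2eq per year)

emission_reduction = baseline_tco2e - project_tco2e
print(f"{emission_reduction:.2f} tCO2eq/household/year")  # 4.92 tCO2eq/household/year
```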

  8. Descriptive Analysis of a Baseline Concussion Battery Among U.S. Service Academy Members: Results from the Concussion Assessment, Research, and Education (CARE) Consortium.

    Science.gov (United States)

    O'Connor, Kathryn L; Dain Allred, C; Cameron, Kenneth L; Campbell, Darren E; D'Lauro, Christopher J; Houston, Megan N; Johnson, Brian R; Kelly, Tim F; McGinty, Gerald; O'Donnell, Patrick G; Peck, Karen Y; Svoboda, Steven J; Pasquina, Paul; McAllister, Thomas; McCrea, Michael; Broglio, Steven P

    2018-03-28

    The prevalence and possible long-term consequences of concussion remain an increasing concern to the U.S. military, particularly as it pertains to maintaining a medically ready force. Baseline testing is being used in both the civilian and military domains to assess concussion injury and recovery. Accurate interpretation of these baseline assessments requires one to consider other influencing factors not related to concussion. To date, there is limited understanding, especially within the military, of what factors influence normative test performance. Given the significant physical and mental demands placed on service academy members (SAMs), and their relatively high risk for concussion, it is important to describe the demographics and normative profile of SAMs. Furthermore, the absence of available baseline normative data on female and non-varsity SAMs makes interpretation of post-injury assessments challenging. Understanding how individuals perform at baseline, given their unique individual characteristics (e.g., concussion history, sex, competition level), will inform post-concussion assessment and management. Thus, the primary aim of this manuscript is to characterize the SAM population and determine normative values on a concussion baseline testing battery. All data were collected as part of the Concussion Assessment, Research and Education (CARE) Consortium. The baseline test battery included a post-concussion symptom checklist (from the Sport Concussion Assessment Tool; SCAT), a psychological health screening inventory (Brief Symptom Inventory; BSI-18), a neurocognitive evaluation (ImPACT), the Balance Error Scoring System (BESS), and the Standardized Assessment of Concussion (SAC). Linear regression models were used to examine differences across sexes, competition levels, and varsity contact levels while controlling for academy, freshman status, race, and previous concussion. Zero-inflated negative binomial models estimated symptom scores due to the high frequency of zero scores.
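    The motivation for a zero-inflated count model (many respondents report no symptoms at all, beyond what a single count process predicts) can be sketched with a quick simulation. The mixing proportion and rate below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Mixing a "no symptoms at all" group with an ordinary count process
# produces far more zeros than a plain Poisson with the same mean predicts.
n = 100_000
structural_zero = rng.random(n) < 0.6  # 60% report no symptoms at all
counts = np.where(structural_zero, 0, rng.poisson(3.0, n))

observed_zero_rate = np.mean(counts == 0)
poisson_zero_rate = np.exp(-counts.mean())  # zero rate a plain Poisson would expect
print(observed_zero_rate > poisson_zero_rate)  # True: excess zeros
```

A zero-inflated negative binomial model fits the two components (the structural-zero probability and the count distribution) jointly, which is why it suits symptom-score data like the SCAT checklist.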

  9. IsoGeneGUI : Multiple approaches for dose-response analysis of microarray data using R

    NARCIS (Netherlands)

    Otava, Martin; Sengupta, Rudradev; Shkedy, Ziv; Lin, Dan; Pramana, Setia; Verbeke, Tobias; Haldermans, Philippe; Hothorn, Ludwig A.; Gerhard, Daniel; Kuiper, Rebecca M.; Klinglmueller, Florian; Kasim, Adetayo

    2017-01-01

    The analysis of transcriptomic experiments with ordered covariates, such as dose-response data, has become a central topic in bioinformatics, in particular in omics studies. Consequently, multiple R packages on CRAN and Bioconductor are designed to analyse microarray data from various perspectives

  10. Epileptic seizures due to multiple cerebral cavernomatosis

    Directory of Open Access Journals (Sweden)

    Spasić Mirjana

    2007-01-01

    Background. Cavernous angiomas are angiographically occult vascular malformations that are present in 0.4−0.9% of people and represent around 5% of all cerebrovascular malformations. They can be single or multiple, and sporadic or familial. Multiple lesions are more frequent in familial cavernomatosis, and 10−30% of cases are associated with familial clustering. Case report. We present the case of a 43-year-old man admitted to the Emergency Department due to an unprovoked seizure that occurred while he was fully awake and engaged in everyday activities. Neurological examination showed no focal signs. A 32-channel standard digital EEG showed no significant changes of normal baseline activity. After sleep deprivation, EEG showed multifocal, bilateral and asymmetric polyspike and sharp-wave activity. Hyperventilation induced generalized epileptiform discharges. MRI demonstrated multiple small cavernous angiomas. Neuropsychological testing demonstrated delayed-memory impairment. Neurosurgical treatment was not recommended, and therapy with valproate 1,250 mg/day had excellent efficacy with no significant adverse effects. Conclusion. This patient, a rare case of multiple cavernomatosis, highlights the importance of neuroradiological examination in adult patients with a first epileptic seizure but no focal neurological signs.

  11. Cerebral atrophy as outcome measure in short-term phase 2 clinical trials in multiple sclerosis

    Energy Technology Data Exchange (ETDEWEB)

    Elskamp, I.J. van den; Boden, B.; Barkhof, F. [VU University Medical Center, Department of Radiology, MS Center Amsterdam, Amsterdam (Netherlands); Dattola, V. [VU University Medical Center, Department of Radiology, MS Center Amsterdam, Amsterdam (Netherlands); University of Messina, Department of Neurosciences, Psychiatric and Anaesthesiological Sciences, Messina (Italy); Knol, D.L. [VU University Medical Center, Department of Epidemiology and Biostatistics, Amsterdam (Netherlands); Filippi, M. [Scientific Institute and University Ospedale San Raffaele, Neuroimaging Research Unit, Milan (Italy); Kappos, L. [University Hospital, University of Basel, Department of Neurology, Basel (Switzerland); Fazekas, F. [Medical University of Graz, Department of Neurology, Graz (Austria); Wagner, K. [Bayer-Schering Pharma, Berlin (Germany); Pohl, C. [Bayer-Schering Pharma, Berlin (Germany); University Hospital Bonn, Department of Neurology, Bonn (Germany); Sandbrink, R. [Bayer-Schering Pharma, Berlin (Germany); Heinrich-Heine-University Dusseldorf, Department of Neurology, Dusseldorf (Germany); Polman, C.H. [VU University Medical Center, Department of Neurology, MS Center Amsterdam, Amsterdam (Netherlands); Uitdehaag, B.M.J. [VU University Medical Center, Department of Epidemiology and Biostatistics, Amsterdam (Netherlands); VU University Medical Center, Department of Neurology, MS Center Amsterdam, Amsterdam (Netherlands)

    2010-10-15

    Cerebral atrophy is a compound measure of the neurodegenerative component of multiple sclerosis (MS) and a conceivable outcome measure for clinical trials monitoring the effect of neuroprotective agents. In this study, we evaluate the rate of cerebral atrophy over a 6-month period, investigate the predictive and explanatory value of other magnetic resonance imaging (MRI) measures in relation to cerebral atrophy, and determine sample sizes for future short-term clinical trials using cerebral atrophy as the primary outcome measure. One hundred thirty-five relapsing-remitting multiple sclerosis patients underwent monthly MRI scans for 6 months, from which the percentage brain volume change (PBVC) and the number and volume of gadolinium (Gd)-enhancing lesions, T2 lesions, and persistent black holes (PBHs) were determined. By means of multiple linear regression analysis, the relationship between focal MRI variables and PBVC was assessed. Sample size calculations were performed for all patients and for subgroups selected for enhancement or a high T2 lesion load at baseline. Significant atrophy occurred over 6 months (PBVC = -0.33%, SE = 0.061, p < 0.0001). The number of baseline T2 lesions (p = 0.024), the on-study Gd-enhancing lesion volume (p = 0.044), and the number of on-study PBHs (p = 0.003) were associated with an increased rate of atrophy. For a 50% decrease in the rate of atrophy, the sample size calculations showed that approximately 283 patients per arm are required in an unselected population and 185 patients per arm in a selected population. Within a 6-month period, significant atrophy can be detected, and the on-study association between PBVC and PBHs emphasizes axonal loss as a driving mechanism. Application as a primary outcome measure in short-term clinical trials with a feasible sample size requires a potent drug to obtain sufficient power. (orig.)
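
    The per-arm sample sizes reported above follow from a standard two-sample power calculation. The sketch below roughly reproduces the unselected-population figure under assumed inputs (a between-subject SD of PBVC backed out of the reported SE of 0.061 with n = 135, a 50% treatment effect on the -0.33% atrophy rate, two-sided α = 0.05, 80% power); the authors' exact assumptions may differ, so the result lands near, not exactly at, the reported 283 per arm.

```python
import math
from statistics import NormalDist

def n_per_arm(delta, sd, alpha=0.05, power=0.80):
    """Normal-approximation sample size per arm for a two-sample
    comparison of means with effect size delta and common SD sd."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    return math.ceil(2 * (z_a + z_b) ** 2 * (sd / delta) ** 2)

sd_pbvc = 0.061 * math.sqrt(135)   # SD implied by SE = 0.061 at n = 135
effect = 0.5 * 0.33                # a 50% reduction of the -0.33% rate
n = n_per_arm(effect, sd_pbvc)     # ~290 per arm under these assumptions
```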

  12. Positioning performance improvements with European multiple-frequency satellite navigation - Galileo

    Science.gov (United States)

    Ji, Shengyue

    2008-10-01

    The rapid development of the Global Positioning System (GPS) has demonstrated the advantages of satellite-based navigation. In the near future, a number of Global Navigation Satellite Systems (GNSS) will be available, i.e. modernized GPS, Galileo, a restored GLONASS, BeiDou, and many other regional GNSS augmentation systems. Undoubtedly, the new GNSS will significantly improve navigation performance over current GPS, with better satellite coverage and multiple satellite signal bands. In this dissertation, the positioning performance improvement of the new GNSS has been investigated through both theoretical analysis and numerical study. First, the navigation performance of the new GNSS has been analyzed, particularly for urban applications. The study demonstrates that Receiver Autonomous Integrity Monitoring (RAIM) performance can be significantly improved with multiple satellite constellations, although the position accuracy improvement is limited. Based on a three-dimensional urban building model of Hong Kong streets, it is found that positioning availability is still very low in high-rise urban areas, even with three GNSS. On the other hand, the discontinuity of navigation solutions is significantly reduced with the combined constellations. Therefore, it is possible to use low-cost dead-reckoning (DR) systems to bridge the gaps in GNSS positioning with high accuracy. Second, the ambiguity resolution performance has been investigated with Galileo multiple-frequency signals. The ambiguity resolution performance of three different algorithms is compared: CAR, ILS, and an improved CAR method (a new method proposed in this study). For short baselines, with four-frequency Galileo data, reliable single-epoch ambiguity resolution is highly likely when the carrier phase noise level is reasonably low (i.e. less than 6 mm). For long baselines (up to 800 km), the integer ambiguity can be determined within 1 min on average. Ambiguity
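
    Multi-frequency ambiguity resolution benefits from linear combinations of carrier phases with long effective wavelengths. As a simple illustration (not the CAR/ILS algorithms evaluated in the dissertation), the sketch below computes the wide-lane wavelength for the Galileo E1/E5a pair from the published center frequencies:

```python
C = 299_792_458.0          # speed of light, m/s

f_e1 = 1575.42e6           # Galileo E1 center frequency, Hz
f_e5a = 1176.45e6          # Galileo E5a center frequency, Hz

lambda_e1 = C / f_e1                # single-frequency carrier, ~0.19 m
lambda_wl = C / (f_e1 - f_e5a)      # wide-lane wavelength, ~0.75 m
```

    The roughly fourfold-longer wide-lane wavelength makes the integer search far more tolerant of phase noise and residual errors, which is one reason multi-frequency signals shorten the time to a reliable fix.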

  13. Outlier Ranking via Subspace Analysis in Multiple Views of the Data

    DEFF Research Database (Denmark)

    Muller, Emmanuel; Assent, Ira; Iglesias, Patricia

    2012-01-01

    We propose Outrank, a novel outlier ranking concept. Outrank exploits subspace analysis to determine the degree of outlierness. It considers different subsets of the attributes as individual outlier properties. It compares clustered regions in arbitrary subspaces and derives an outlierness score for each object. Its principled integration of multiple views into an outlierness measure uncovers outliers that are not detectable in the full attribute space. Our experimental evaluation demonstrates that Outrank successfully determines a high-quality outlier ranking and outperforms state-of-the-art outlierness measures.
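
    The idea of treating attribute subsets as separate views and aggregating them into one score can be shown with a toy example. The sketch below is not the Outrank algorithm (which scores objects via subspace clusterings); it merely averages a per-subspace deviation score over all 2-D attribute pairs to produce a ranking:

```python
import itertools
import statistics

def subspace_outlier_scores(data):
    """Toy multi-view outlierness: average squared z-score of each
    object over every 2-D attribute subspace. Illustrative only."""
    dims = len(data[0])
    scores = [0.0] * len(data)
    pairs = list(itertools.combinations(range(dims), 2))
    for i, j in pairs:
        for d in (i, j):
            col = [row[d] for row in data]
            mu, sd = statistics.fmean(col), statistics.pstdev(col)
            if sd == 0:
                continue  # constant attribute carries no outlier signal
            for k, row in enumerate(data):
                scores[k] += ((row[d] - mu) / sd) ** 2
    return [s / len(pairs) for s in scores]

data = [(1.0, 2.0, 0.0), (1.1, 2.1, 0.1), (0.9, 1.9, -0.1), (5.0, 9.0, 3.0)]
scores = subspace_outlier_scores(data)   # last object ranks highest
```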

  14. Transforming Parent-Child Interaction in Family Routines: Longitudinal Analysis with Families of Children with Developmental Disabilities.

    Science.gov (United States)

    Lucyshyn, Joseph M; Fossett, Brenda; Bakeman, Roger; Cheremshynski, Christy; Miller, Lynn; Lohrmann, Sharon; Binnendyk, Lauren; Khan, Sophia; Chinn, Stephen; Kwon, Samantha; Irvin, Larry K

    2015-12-01

    The efficacy and consequential validity of an ecological approach to behavioral intervention with families of children with developmental disabilities were examined. The approach aimed to transform coercive parent-child interaction in family routines into constructive interaction. Ten families participated, including 10 mothers and fathers and 10 children aged 3-8 years with developmental disabilities. Thirty-six family routines were selected (2 to 4 per family). Dependent measures included child problem behavior, routine steps completed, and coercive and constructive parent-child interaction. For each family, a single-case, multiple-baseline design was employed with three phases: baseline, intervention, and follow-up. Visual analysis evaluated the functional relation between intervention and improvements in child behavior and routine participation. Nonparametric tests across families evaluated the statistical significance of these improvements. Sequential analyses within families and univariate analyses across families examined changes from baseline to intervention in the percentage and odds ratio of coercive and constructive parent-child interaction. Multiple-baseline results documented functional or basic effects for 8 of 10 families. Nonparametric tests showed these changes to be significant. Follow-up showed durability at 11 to 24 months postintervention. Sequential analyses documented the transformation of coercive into constructive processes for 9 of 10 families. Univariate analyses across families showed significant improvements in 2- and 4-step coercive and constructive processes but not in odds ratios. Results offer evidence of the efficacy of the approach and the consequential validity of the ecological unit of analysis: parent-child interaction in family routines. Future studies should improve efficiency and outcomes for families experiencing family-systems challenges.
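
    Sequential-analysis outcomes of this kind are commonly summarized by odds ratios computed from 2×2 contingency tables of behavior transitions. A minimal sketch with hypothetical counts (not data from the study):

```python
def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 sequential-contingency table:
        a: coercive child behavior -> coercive parent response
        b: coercive child behavior -> constructive parent response
        c: non-coercive child behavior -> coercive parent response
        d: non-coercive child behavior -> constructive parent response"""
    return (a * d) / (b * c)

# Hypothetical transition counts for one family's baseline sessions
or_baseline = odds_ratio(40, 10, 12, 60)      # strong coercive coupling
# Hypothetical counts after intervention
or_intervention = odds_ratio(8, 30, 10, 80)   # much weaker coupling
```

    A large drop in the odds ratio from baseline to intervention is one quantitative signature of coercive processes being replaced by constructive ones.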

  15. Multiple Improvements of Multiple Imputation Likelihood Ratio Tests

    OpenAIRE

    Chan, Kin Wai; Meng, Xiao-Li

    2017-01-01

    Multiple imputation (MI) inference handles missing data by first properly imputing the missing values $m$ times, and then combining the $m$ analysis results from applying a complete-data procedure to each of the completed datasets. However, the existing method for combining likelihood ratio tests has multiple defects: (i) the combined test statistic can be negative in practice when the reference null distribution is a standard $F$ distribution; (ii) it is not invariant to re-parametrization; ...
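
    The standard route for combining MI point estimates (as distinct from the likelihood-ratio combining rule critiqued above) is Rubin's rules: average the m estimates and combine within- and between-imputation variance. A minimal sketch:

```python
import statistics

def rubin_combine(estimates, variances):
    """Rubin's rules for m imputed-data analyses: returns the pooled
    estimate, its total variance, and the relative increase in
    variance due to missingness."""
    m = len(estimates)
    q_bar = statistics.fmean(estimates)      # pooled point estimate
    u_bar = statistics.fmean(variances)      # within-imputation variance
    b = statistics.variance(estimates)       # between-imputation variance
    t = u_bar + (1 + 1 / m) * b              # total variance
    r = (1 + 1 / m) * b / u_bar              # relative variance increase
    return q_bar, t, r

# Four imputed-data analyses of the same parameter
q, t, r = rubin_combine([1.0, 1.2, 0.9, 1.1], [0.04, 0.05, 0.04, 0.05])
```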

  16. Ca analysis: An Excel based program for the analysis of intracellular calcium transients including multiple, simultaneous regression analysis☆

    Science.gov (United States)

    Greensmith, David J.

    2014-01-01

    Here I present an Excel-based program for the analysis of intracellular Ca transients recorded using fluorescent indicators. The program can perform all the steps needed to convert recorded raw voltage changes into meaningful physiological information. The program performs two fundamental processes. (1) It can prepare the raw signal by several methods. (2) It can then analyze the prepared data to provide information such as absolute intracellular Ca levels. In addition, the rates of change of Ca can be measured using multiple, simultaneous regression analysis. I demonstrate that this program performs as well as commercially available software but has numerous advantages, namely a simplified, self-contained analysis workflow. PMID:24125908
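
    Rates of change of Ca are often quantified by fitting a monoexponential decay to the declining phase of a transient; a log-linear least-squares fit is one simple way to do this. The sketch below is illustrative and not the program's exact regression routine:

```python
import math

def fit_decay_rate(times, values):
    """Least-squares slope of ln(F) versus t: the rate constant k of a
    monoexponential decay F(t) = F0 * exp(-k * t)."""
    logs = [math.log(v) for v in values]
    n = len(times)
    t_mean = sum(times) / n
    y_mean = sum(logs) / n
    num = sum((t - t_mean) * (y - y_mean) for t, y in zip(times, logs))
    den = sum((t - t_mean) ** 2 for t in times)
    return -num / den

# Noise-free synthetic transient decay: F0 = 2.0, k = 3.0 s^-1
ts = [i * 0.01 for i in range(50)]
fs = [2.0 * math.exp(-3.0 * t) for t in ts]
k = fit_decay_rate(ts, fs)   # recovers k = 3.0
```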

  17. Baseline restoration technique based on symmetrical zero-area trapezoidal pulse shaper

    Energy Technology Data Exchange (ETDEWEB)

    Zeng, Guoqiang, E-mail: 24829500@qq.com [Key Laboratory of Applied Nuclear Techniques in Geosciences Sichuan, Chengdu University of Technology, Chengdu 610059 (China); Yang, Jian, E-mail: 22105653@qq.com [Key Laboratory of Applied Nuclear Techniques in Geosciences Sichuan, Chengdu University of Technology, Chengdu 610059 (China); Hu, Tianyu; Ge, Liangquan [Key Laboratory of Applied Nuclear Techniques in Geosciences Sichuan, Chengdu University of Technology, Chengdu 610059 (China); Ouyang, Xiaoping [Northwest Institute of Nuclear Technology, Xi’an 710024,China (China); Zhang, Qingxian; Gu, Yi [Key Laboratory of Applied Nuclear Techniques in Geosciences Sichuan, Chengdu University of Technology, Chengdu 610059 (China)

    2017-06-21

    Since the baseline of a unipolar pulse shaper has a direct-current (DC) offset and drift, an additional baseline estimator is needed to obtain baseline values in real time. Bipolar zero-area (BZA) pulse shapers can be used for baseline restoration, but they cannot restrain baseline drift because of their asymmetrical shape. In this study, three trapezoids are synthesized into a symmetrical zero-area (SZA) shape, which can remove the DC offset and restrain the baseline drift. This baseline restoration technique can be easily implemented in digital pulse processing (DPP) systems based on a recursive algorithm. To validate the approach, the characteristic X-rays of iron were detected using a Si-PIN diode detector. Compared with a traditional trapezoidal pulse shaper, the SZA trapezoidal pulse shaper improved the energy resolution from 237 eV to 216 eV for the 6.403 keV Kα peak.
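
    The key property here is that a zero-area shaping kernel rejects a constant (DC) baseline, and a symmetric zero-area kernel additionally rejects linear drift. The sketch below demonstrates this with a crude stand-in kernel (not the paper's three-trapezoid recursion, whose coefficients are not given in the abstract):

```python
def convolve_valid(signal, kernel):
    """'valid'-mode convolution (no zero padding at the edges)."""
    n, m = len(signal), len(kernel)
    return [sum(signal[i + j] * kernel[m - 1 - j] for j in range(m))
            for i in range(n - m + 1)]

# A symmetric zero-area kernel: negative side lobes around a positive core
# (a toy stand-in for the three synthesized trapezoids of the SZA shaper).
kernel = [-1, -1, 2, 2, -1, -1]
assert sum(kernel) == 0          # zero area -> DC rejection

# Input 1: DC offset plus slow linear drift, no pulse -> output is zero
baseline = [5.0 + 0.3 * n for n in range(40)]
out = convolve_valid(baseline, kernel)

# Input 2: the same baseline plus a step-like event -> output is nonzero
pulse = [b + (10.0 if n >= 20 else 0.0) for n, b in enumerate(baseline)]
out_pulse = convolve_valid(pulse, kernel)
```

    Because the kernel is symmetric about its center and sums to zero, both the offset and the drift cancel exactly, while genuine pulse transitions still produce a response that can be measured against a stable zero baseline.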

  18. A Fully Customized Baseline Removal Framework for Spectroscopic Applications.

    Science.gov (United States)

    Giguere, Stephen; Boucher, Thomas; Carey, C J; Mahadevan, Sridhar; Dyar, M Darby

    2017-07-01

    The task of proper baseline or continuum removal is common to nearly all types of spectroscopy. Its goal is to remove any portion of a signal that is irrelevant to features of interest while preserving any predictive information. Despite the importance of baseline removal, median or guessed default parameters are commonly employed, often using commercially available software supplied with instruments. Several published baseline removal algorithms have been shown to be useful for particular spectroscopic applications, but their generalizability is ambiguous. The new Custom Baseline Removal (Custom BLR) method presented here generalizes the problem of baseline removal by combining operations from previously proposed methods to synthesize new correction algorithms. It creates novel methods for each technique, application, and training set, discovering new algorithms that maximize the predictive accuracy of the resulting spectroscopic models. In most cases, these learned methods either match or improve on the performance of the best alternative. Examples of these advantages are shown for three different scenarios: quantification of components in near-infrared spectra of corn, quantification from laser-induced breakdown spectroscopy data of rocks, and classification/matching of minerals using Raman spectroscopy. Software to implement this optimization is available from the authors. By removing subjectivity from this commonly encountered task, Custom BLR is a significant step toward completely automatic and general baseline removal in spectroscopic and other applications.
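
    As a minimal illustration of the kind of correction primitive such a framework composes (not the Custom BLR method itself), the sketch below estimates a baseline with a rolling minimum plus smoothing and subtracts it from a synthetic spectrum:

```python
def rolling_min_baseline(spectrum, half_width):
    """Estimate the baseline as a sliding-window minimum, then smooth
    it with a moving average over the same window."""
    n = len(spectrum)
    mins = [min(spectrum[max(0, i - half_width):i + half_width + 1])
            for i in range(n)]
    base = []
    for i in range(n):
        window = mins[max(0, i - half_width):i + half_width + 1]
        base.append(sum(window) / len(window))
    return base

# Synthetic spectrum: linear baseline plus one triangular peak at 45..55
spec = [0.02 * i for i in range(100)]
for i in range(45, 56):
    spec[i] += 5.0 * (1 - abs(i - 50) / 5)

base = rolling_min_baseline(spec, 10)
corrected = [s - b for s, b in zip(spec, base)]
```

    After subtraction, the off-peak region sits near zero while the peak at channel 50 is preserved, which is the behavior any baseline-removal step is graded on before it is chained with others.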

  19. Interactive Effects of Dopamine Baseline Levels and Cycle Phase on Executive Functions: The Role of Progesterone

    Directory of Open Access Journals (Sweden)

    Esmeralda Hidalgo-Lopez

    2017-07-01

    Estradiol and progesterone levels vary across the menstrual cycle and have multiple neuroactive effects, including on the dopaminergic system. Dopamine relates to executive functions in an “inverted U-shaped” manner, and its levels are increased by estradiol. Accordingly, dopamine-dependent changes in executive functions along the menstrual cycle have previously been studied in the pre-ovulatory phase, when estradiol levels peak. Specifically, it has been demonstrated that working memory is enhanced during the pre-ovulatory phase in women with low dopamine baseline levels but impaired in women with high dopamine baseline levels. However, the role of progesterone, which peaks in the luteal cycle phase, has not previously been taken into account. Therefore, the main goals of the present study were to extend these findings (i) to the luteal cycle phase and (ii) to other executive functions. Furthermore, the usefulness of the eye blink rate (EBR) as an indicator of dopamine baseline levels in menstrual cycle research was explored. Thirty-six naturally cycling women were tested during three cycle phases (menses: low sex hormones; pre-ovulatory: high estradiol; luteal: high progesterone and estradiol). During each session, women performed a verbal N-back task, as a measure of working memory, and a single-trial version of the Stroop task, as a measure of response inhibition and cognitive flexibility. Hormone levels were assessed from saliva samples, and spontaneous eye blink rate was recorded during menses. In the N-back task, women were faster during the luteal phase the higher their progesterone levels, irrespective of their dopamine baseline levels. In the Stroop task, we found a dopamine-cycle interaction, which was also driven by the luteal phase and progesterone levels. For women with higher EBR, performance decreased during the luteal phase, whereas for women with lower EBR, performance improved during the luteal phase. These findings suggest an important

  20. High Baseline Postconcussion Symptom Scores and Concussion Outcomes in Athletes.

    Science.gov (United States)

    Custer, Aimee; Sufrinko, Alicia; Elbin, R J; Covassin, Tracey; Collins, Micky; Kontos, Anthony

    2016-02-01

    Some healthy athletes report high levels of baseline concussion symptoms, which may be attributable to several factors (eg, illness, personality, somaticizing). However, the role of baseline symptoms in outcomes after sport-related concussion (SRC) has not been empirically examined. To determine if athletes with high symptom scores at baseline performed worse than athletes without baseline symptoms on neurocognitive testing after SRC. Cohort study. High school and collegiate athletic programs. A total of 670 high school and collegiate athletes participated in the study. Participants were divided into groups with either no baseline symptoms (Postconcussion Symptom Scale [PCSS] score = 0, n = 247) or a high level of baseline symptoms (PCSS score > 18 [top 10% of sample], n = 68). Participants were evaluated at baseline and 2 to 7 days after SRC with the Immediate Post-concussion Assessment and Cognitive Test and PCSS. Outcome measures were Immediate Post-concussion Assessment and Cognitive Test composite scores (verbal memory, visual memory, visual motor processing speed, and reaction time) and total symptom score on the PCSS. The groups were compared using repeated-measures analyses of variance with Bonferroni correction to assess interactions between group and time for symptoms and neurocognitive impairment. The no-symptoms group represented 38% of the original sample, whereas the high-symptoms group represented 11% of the sample. The high-symptoms group experienced a larger decline from preinjury to postinjury than the no-symptoms group in verbal (P = .03) and visual memory (P = .05). However, total concussion-symptom scores increased from preinjury to postinjury for the no-symptoms group (P = .001) but remained stable for the high-symptoms group. Reported baseline symptoms may help identify athletes at risk for worse outcomes after SRC. Clinicians should examine baseline symptom levels to better identify patients for earlier referral and treatment for their