WorldWideScience

Sample records for baseline parameter update

  1. Model parameter updating using Bayesian networks

    International Nuclear Information System (INIS)

    Treml, C.A.; Ross, Timothy J.

    2004-01-01

    This paper outlines a model parameter updating technique for a new method of model validation using a modified model reference adaptive control (MRAC) framework with Bayesian Networks (BNs). The model parameter updating within this method is generic in the sense that the model/simulation to be validated is treated as a black box. It must have updateable parameters to which its outputs are sensitive, and those outputs must have metrics that can be compared to those of the model reference, i.e., experimental data. Furthermore, no assumptions are made about the statistics of the model parameter uncertainty; only upper and lower bounds need to be specified. This method is designed for situations where a model is not intended to predict a complete point-by-point time domain description of the item/system behavior; rather, there are specific points, features, or events of interest that need to be predicted. These specific points are compared to the model reference derived from actual experimental data. The logic for updating the model parameters to match the model reference is formed via a BN. The nodes of this BN consist of updateable model input parameters and the specific output values or features of interest. Each time the model is executed, the input/output pairs are used to adapt the conditional probabilities of the BN. Each iteration further refines the inferred model parameters to produce the desired model output. After parameter updating is complete and model inputs are inferred, reliabilities for the model output are supplied. Finally, this method is applied to a simulation of a resonance control cooling system for a prototype coupled cavity linac. The results are compared to experimental data.
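    The core loop of this record's technique (run the black-box model, record input/output pairs, adapt conditional probabilities, then infer the input most likely to produce a desired output) can be sketched in miniature. The following toy Python is a hedged illustration with discretized bins and hypothetical names; it is not the paper's BN implementation.

```python
from collections import defaultdict

class DiscreteCPTUpdater:
    """Toy conditional-probability updater: counts (input bin, output bin)
    pairs from repeated model runs and infers which input bin most often
    produces a desired output bin. Hypothetical sketch, not the paper's BN."""

    def __init__(self):
        self.counts = defaultdict(lambda: defaultdict(int))

    def observe(self, input_bin, output_bin):
        self.counts[input_bin][output_bin] += 1

    def p_output_given_input(self, input_bin, output_bin):
        total = sum(self.counts[input_bin].values())
        return self.counts[input_bin][output_bin] / total if total else 0.0

    def infer_input(self, desired_output_bin):
        # Pick the input bin with the highest conditional probability
        # of producing the desired output (ties broken arbitrarily).
        best, best_p = None, -1.0
        for ib in self.counts:
            p = self.p_output_given_input(ib, desired_output_bin)
            if p > best_p:
                best, best_p = ib, p
        return best

# Each "model execution" yields an (input bin, output bin) pair.
updater = DiscreteCPTUpdater()
for ib, ob in [(0, 0), (0, 0), (0, 1), (1, 1), (1, 1), (1, 1)]:
    updater.observe(ib, ob)
print(updater.infer_input(1))  # 1
```

    Each additional batch of input/output pairs further sharpens the conditional probabilities, mirroring the iterative refinement the abstract describes.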

  2. Updating parameters of the chicken processing line model

    DEFF Research Database (Denmark)

    Kurowicka, Dorota; Nauta, Maarten; Jozwiak, Katarzyna

    2010-01-01

    A mathematical model of chicken processing that quantitatively describes the transmission of Campylobacter on chicken carcasses from slaughter to chicken meat product has been developed in Nauta et al. (2005). This model was quantified with expert judgment. Recent availability of data allows...... updating parameters of the model to better describe processes observed in slaughterhouses. We propose Bayesian updating as a suitable technique to update expert judgment with microbiological data. Berrang and Dickens’s data are used to demonstrate performance of this method in updating parameters...... of the chicken processing line model....
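    Bayesian updating of an expert-elicited parameter with microbiological count data, as this record proposes, is simplest to see in the conjugate Beta-Binomial case. The sketch below is a minimal hedged example with invented numbers; the actual chicken processing line model of Nauta et al. (2005) is far richer.

```python
# Conjugate Beta-Binomial update: an expert prior Beta(a, b) on a
# transfer probability, updated with k positives out of n trials.
# All numbers here are hypothetical, for illustration only.
def beta_update(a, b, k, n):
    """Return the posterior Beta parameters after observing k/n."""
    return a + k, b + (n - k)

a0, b0 = 2.0, 8.0        # hypothetical expert prior, mean 0.2
k, n = 30, 100           # hypothetical slaughterhouse data
a1, b1 = beta_update(a0, b0, k, n)
posterior_mean = a1 / (a1 + b1)
print(round(posterior_mean, 3))  # 0.291
```

    The data pull the expert's prior mean of 0.2 towards the observed frequency of 0.3, which is the qualitative behavior the abstract describes for updating expert judgment with plant data.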

  3. Updated global 3+1 analysis of short-baseline neutrino oscillations

    Science.gov (United States)

    Gariazzo, S.; Giunti, C.; Laveder, M.; Li, Y. F.

    2017-06-01

    We present the results of an updated fit of short-baseline neutrino oscillation data in the framework of 3+1 active-sterile neutrino mixing. We first consider ν_e and ν̄_e disappearance in the light of the Gallium and reactor anomalies. We discuss the implications of the recent measurement of the reactor ν̄_e spectrum in the NEOS experiment, which shifts the allowed regions of the parameter space towards smaller values of |U_e4|². The β-decay constraints of the Mainz and Troitsk experiments allow us to limit the oscillation length between about 2 cm and 7 m at 3σ for neutrinos with an energy of 1 MeV. The corresponding oscillations can be discovered in a model-independent way in ongoing reactor and source experiments by measuring ν_e and ν̄_e disappearance as a function of distance. We then consider the global fit of the data on short-baseline ν_μ → ν_e (and the corresponding antineutrino) transitions in the light of the LSND anomaly, taking into account the constraints from ν_e and ν_μ disappearance experiments, including the recent data of the MINOS and IceCube experiments. The combination of the NEOS constraints on |U_e4|² and the MINOS and IceCube constraints on |U_μ4|² leads to an unacceptable appearance-disappearance tension which becomes tolerable only in a pragmatic fit which neglects the MiniBooNE low-energy anomaly. The minimization of the global χ² in the space of the four mixing parameters Δm²₄₁, |U_e4|², |U_μ4|², and |U_τ4|² leads to three allowed regions with narrow Δm²₄₁ widths at Δm²₄₁ ≈ 1.7 (best fit), 1.3 (at 2σ), and 2.4 (at 3σ) eV². The effective amplitude of short-baseline ν_μ → ν_e oscillations is limited to 0.00048 ≲ sin²2ϑ_eμ ≲ 0.0020 at 3σ. The restrictions of the allowed regions of the mixing parameters with respect to our previous global fits are mainly due to the NEOS constraints. We present a comparison of the

  4. [Establishing IAQ Metrics and Baseline Measures.] "Indoor Air Quality Tools for Schools" Update #20

    Science.gov (United States)

    US Environmental Protection Agency, 2009

    2009-01-01

    This issue of "Indoor Air Quality Tools for Schools" Update ("IAQ TfS" Update) contains the following items: (1) News and Events; (2) IAQ Profile: Establishing Your Baseline for Long-Term Success (Feature Article); (3) Insight into Excellence: Belleville Township High School District #201, 2009 Leadership Award Winner; and (4) Have Your Questions…

  5. Sandia National Laboratories/New Mexico Environmental Baseline update--Revision 1.0

    International Nuclear Information System (INIS)

    1996-07-01

    This report provides a baseline update supplying the background information necessary for personnel to prepare clear and concise NEPA documentation. The environment of the Sandia National Laboratories is described in this document, including the ecology, meteorology, climatology, seismology, emissions, cultural resources and land use, visual resources, noise pollution, transportation, and socioeconomics.

  6. Sandia National Laboratories/New Mexico Environmental Baseline update--Revision 1.0

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-07-01

    This report provides a baseline update supplying the background information necessary for personnel to prepare clear and concise NEPA documentation. The environment of the Sandia National Laboratories is described in this document, including the ecology, meteorology, climatology, seismology, emissions, cultural resources and land use, visual resources, noise pollution, transportation, and socioeconomics.

  7. Tank waste remediation system retrieval and disposal mission initial updated baseline summary

    International Nuclear Information System (INIS)

    Swita, W.R.

    1998-01-01

    This document provides a summary of the proposed Tank Waste Remediation System Retrieval and Disposal Mission Initial Updated Baseline (scope, schedule, and cost) developed to demonstrate the Tank Waste Remediation System contractor's Readiness-to-Proceed in support of the Phase 1B mission

  8. Vegetation Parameter Extraction Using Dual Baseline Polarimetric SAR Interferometry Data

    Science.gov (United States)

    Zhang, H.; Wang, C.; Chen, X.; Tang, Y.

    2009-04-01

    For vegetation parameter inversion, single-baseline polarimetric SAR interferometry (POLinSAR) techniques, such as the three-stage method and the ESPRIT algorithm, are limited by observations with a minimum ground-to-volume amplitude ratio, which affects the estimation of the effective phase center of the vegetation canopy or the surface and thus results in underestimated vegetation height. To remove this limitation of the single-baseline inversion techniques to some extent, a second POLinSAR baseline is added for vegetation parameter estimation in this paper, and a dual-baseline POLinSAR technique for vegetation parameter extraction is investigated and improved to reduce the bias of the vegetation parameter estimates. Finally, simulated and real data are used to validate this dual-baseline technique.

  9. Tank waste remediation system retrieval and disposal mission initial updated baseline summary

    International Nuclear Information System (INIS)

    Swita, W.R.

    1998-01-01

    This document provides a summary of the Tank Waste Remediation System (TWRS) Retrieval and Disposal Mission Initial Updated Baseline (scope, schedule, and cost), developed to demonstrate Readiness-to-Proceed (RTP) in support of the TWRS Phase 1B mission. This Updated Baseline is the proposed TWRS plan to execute and measure the mission work scope. This document and other supporting data demonstrate that the TWRS Project Hanford Management Contract (PHMC) team is prepared to fully support Phase 1B by executing the following scope, schedule, and cost baseline activities: Deliver the specified initial low-activity waste (LAW) and high-level waste (HLW) feed batches in a consistent, safe, and reliable manner to support private contractors' operations starting in June 2002; Deliver specified subsequent LAW and HLW feed batches during Phase 1B in a consistent, safe, and reliable manner; Provide for the interim storage of immobilized HLW (IHLW) products and the disposal of immobilized LAW (ILAW) products generated by the private contractors; Provide for disposal of byproduct wastes generated by the private contractors; and Provide the infrastructure to support construction and operations of the private contractors' facilities

  10. Updated Abraham solvation parameters for polychlorinated biphenyls

    NARCIS (Netherlands)

    van Noort, P.C.M.; Haftka, J.J.H.; Parsons, J.R.

    2010-01-01

    This study shows that the recently published polychlorinated biphenyl (PCB) Abraham solvation parameters predict PCB air−n-hexadecane and n-octanol−water partition coefficients very poorly, especially for highly ortho-chlorinated congeners. Therefore, an updated set of PCB solvation parameters was

  11. Updated Abraham solvation parameters for polychlorinated biphenyls

    NARCIS (Netherlands)

    Noort, van P.C.M.; Haftka, J.J.H.; Parsons, J.R.

    2010-01-01

    This study shows that the recently published polychlorinated biphenyl (PCB) Abraham solvation parameters predict PCB air-n-hexadecane and n-octanol-water partition coefficients very poorly, especially for highly ortho-chlorinated congeners. Therefore, an updated set of PCB solvation parameters was

  12. Updated climatological model predictions of ionospheric and HF propagation parameters

    International Nuclear Information System (INIS)

    Reilly, M.H.; Rhoads, F.J.; Goodman, J.M.; Singh, M.

    1991-01-01

    The prediction performances of several climatological models, including the ionospheric conductivity and electron density model, RADAR C, and the Ionospheric Communications Analysis and Predictions Program, are evaluated for different regions and sunspot number inputs. Particular attention is given to the near-real-time (NRT) predictions associated with single-station updates. It is shown that a dramatic improvement can be obtained by using single-station ionospheric data to update the driving parameters of an ionospheric model for NRT predictions of foF2 and other ionospheric and HF circuit parameters. For middle latitudes, the improvement extends out thousands of kilometers from the update point to points of comparable corrected geomagnetic latitude. 10 refs

  13. Updated baseline for a staged Compact Linear Collider

    CERN Document Server

    Boland, M J; Giansiracusa, P J; Lucas, T G; Rassool, R P; Balazs, C; Charles, T K; Afanaciev, K; Emeliantchik, I; Ignatenko, A; Makarenko, V; Shumeiko, N; Patapenka, A; Zhuk, I; Abusleme Hoffman, A C; Diaz Gutierrez, M A; Gonzalez, M Vogel; Chi, Y; He, X; Pei, G; Pei, S; Shu, G; Wang, X; Zhang, J; Zhao, F; Zhou, Z; Chen, H; Gao, Y; Huang, W; Kuang, Y P; Li, B; Li, Y; Shao, J; Shi, J; Tang, C; Wu, X; Ma, L; Han, Y; Fang, W; Gu, Q; Huang, D; Huang, X; Tan, J; Wang, Z; Zhao, Z; Laštovička, T; Uggerhoj, U; Wistisen, T N; Aabloo, A; Eimre, K; Kuppart, K; Vigonski, S; Zadin, V; Aicheler, M; Baibuz, E; Brücken, E; Djurabekova, F; Eerola, P; Garcia, F; Haeggström, E; Huitu, K; Jansson, V; Karimaki, V; Kassamakov, I; Kyritsakis, A; Lehti, S; Meriläinen, A; Montonen, R; Niinikoski, T; Nordlund, K; Österberg, K; Parekh, M; Törnqvist, N A; Väinölä, J; Veske, M; Farabolini, W; Mollard, A; Napoly, O; Peauger, F; Plouin, J; Bambade, P; Chaikovska, I; Chehab, R; Davier, M; Kaabi, W; Kou, E; LeDiberder, F; Pöschl, R; Zerwas, D; Aimard, B; Balik, G; Baud, J-P; Blaising, J-J; Brunetti, L; Chefdeville, M; Drancourt, C; Geoffroy, N; Jacquemier, J; Jeremie, A; Karyotakis, Y; Nappa, J M; Vilalte, S; Vouters, G; Bernard, A; Peric, I; Gabriel, M; Simon, F; Szalay, M; van der Kolk, N; Alexopoulos, T; Gazis, E N; Gazis, N; Ikarios, E; Kostopoulos, V; Kourkoulis, S; Gupta, P D; Shrivastava, P; Arfaei, H; Dayyani, M K; Ghasem, H; Hajari, S S; Shaker, H; Ashkenazy, Y; Abramowicz, H; Benhammou, Y; Borysov, O; Kananov, S; Levy, A; Levy, I; Rosenblat, O; D'Auria, G; Di Mitri, S; Abe, T; Aryshev, A; Higo, T; Makida, Y; Matsumoto, S; Shidara, T; Takatomi, T; Takubo, Y; Tauchi, T; Toge, N; Ueno, K; Urakawa, J; Yamamoto, A; Yamanaka, M; Raboanary, R; Hart, R; van der Graaf, H; Eigen, G; Zalieckas, J; Adli, E; Lillestøl, R; Malina, L; Pfingstner, J; Sjobak, K N; Ahmed, W; Asghar, M I; Hoorani, H; Bugiel, S; Dasgupta, R; Firlej, M; Fiutowski, T A; Idzik, M; Kopec, M; Kuczynska, M; Moron, J; 
Swientek, K P; Daniluk, W; Krupa, B; Kucharczyk, M; Lesiak, T; Moszczynski, A; Pawlik, B; Sopicki, P; Wojtoń, T; Zawiejski, L; Kalinowski, J; Krawczyk, M; Żarnecki, A F; Firu, E; Ghenescu, V; Neagu, A T; Preda, T; Zgura, I-S; Aloev, A; Azaryan, N; Budagov, J; Chizhov, M; Filippova, M; Glagolev, V; Gongadze, A; Grigoryan, S; Gudkov, D; Karjavine, V; Lyablin, M; Olyunin, A; Samochkine, A; Sapronov, A; Shirkov, G; Soldatov, V; Solodko, A; Solodko, E; Trubnikov, G; Tyapkin, I; Uzhinsky, V; Vorozhtov, A; Levichev, E; Mezentsev, N; Piminov, P; Shatilov, D; Vobly, P; Zolotarev, K; Bozovic-Jelisavcic, I; Kacarevic, G; Lukic, S; Milutinovic-Dumbelovic, G; Pandurovic, M; Iriso, U; Perez, F; Pont, M; Trenado, J; Aguilar-Benitez, M; Calero, J; Garcia-Tabares, L; Gavela, D; Gutierrez, J L; Lopez, D; Toral, F; Moya, D; Ruiz-Jimeno, A; Vila, I; Argyropoulos, T; Blanch Gutierrez, C; Boronat, M; Esperante, D; Faus-Golfe, A; Fuster, J; Fuster Martinez, N; Galindo Muñoz, N; García, I; Giner Navarro, J; Ros, E; Vos, M; Brenner, R; Ekelöf, T; Jacewicz, M; Ögren, J; Olvegård, M; Ruber, R; Ziemann, V; Aguglia, D; Alipour Tehrani, N; Aloev, A; Andersson, A; Andrianala, F; Antoniou, F; Artoos, K; Atieh, S; Ballabriga Sune, R; Barnes, M J; Barranco Garcia, J; Bartosik, H; Belver-Aguilar, C; Benot Morell, A; Bett, D R; Bettoni, S; Blanchot, G; Blanco Garcia, O; Bonnin, X A; Brunner, O; Burkhardt, H; Calatroni, S; Campbell, M; Catalan Lasheras, N; Cerqueira Bastos, M; Cherif, A; Chevallay, E; Constance, B; Corsini, R; Cure, B; Curt, S; Dalena, B; Dannheim, D; De Michele, G; De Oliveira, L; Deelen, N; Delahaye, J P; Dobers, T; Doebert, S; Draper, M; Duarte Ramos, F; Dubrovskiy, A; Elsener, K; Esberg, J; Esposito, M; Fedosseev, V; Ferracin, P; Fiergolski, A; Foraz, K; Fowler, A; Friebel, F; Fuchs, J-F; Fuentes Rojas, C A; Gaddi, A; Garcia Fajardo, L; Garcia Morales, H; Garion, C; Gatignon, L; Gayde, J-C; Gerwig, H; Goldblatt, A N; Grefe, C; Grudiev, A; Guillot-Vignot, F G; Gutt-Mostowy, M L; 
Hauschild, M; Hessler, C; Holma, J K; Holzer, E; Hourican, M; Hynds, D; Inntjore Levinsen, Y; Jeanneret, B; Jensen, E; Jonker, M; Kastriotou, M; Kemppinen, J M K; Kieffer, R B; Klempt, W; Kononenko, O; Korsback, A; Koukovini Platia, E; Kovermann, J W; Kozsar, C-I; Kremastiotis, I; Kulis, S; Latina, A; Leaux, F; Lebrun, P; Lefevre, T; Linssen, L; Llopart Cudie, X; Maier, A A; Mainaud Durand, H; Manosperti, E; Marelli, C; Marin Lacoma, E; Martin, R; Mazzoni, S; Mcmonagle, G; Mete, O; Mether, L M; Modena, M; Münker, R M; Muranaka, T; Nebot Del Busto, E; Nikiforou, N; Nisbet, D; Nonglaton, J-M; Nuiry, F X; Nürnberg, A; Olvegard, M; Osborne, J; Papadopoulou, S; Papaphilippou, Y; Passarelli, A; Patecki, M; Pazdera, L; Pellegrini, D; Pepitone, K; Perez, F; Perez Codina, E; Perez Fontenla, A; Persson, T H B; Petrič, M; Pitters, F; Pittet, S; Plassard, F; Rajamak, R; Redford, S; Renier, Y; Rey, S F; Riddone, G; Rinolfi, L; Rodriguez Castro, E; Roloff, P; Rossi, C; Rude, V; Rumolo, G; Sailer, A; Santin, E; Schlatter, D; Schmickler, H; Schulte, D; Shipman, N; Sicking, E; Simoniello, R; Skowronski, P K; Sobrino Mompean, P; Soby, L; Sosin, M P; Sroka, S; Stapnes, S; Sterbini, G; Ström, R; Syratchev, I; Tecker, F; Thonet, P A; Timeo, L; Timko, H; Tomas Garcia, R; Valerio, P; Vamvakas, A L; Vivoli, A; Weber, M A; Wegner, R; Wendt, M; Woolley, B; Wuensch, W; Uythoven, J; Zha, H; Zisopoulos, P; Benoit, M; Vicente Barreto Pinto, M; Bopp, M; Braun, H H; Csatari Divall, M; Dehler, M; Garvey, T; Raguin, J Y; Rivkin, L; Zennaro, R; Aksoy, A; Nergiz, Z; Pilicer, E; Tapan, I; Yavas, O; Baturin, V; Kholodov, R; Lebedynskyi, S; Miroshnichenko, V; Mordyk, S; Profatilova, I; Storizhko, V; Watson, N; Winter, A; Goldstein, J; Green, S; Marshall, J S; Thomson, M A; Xu, B; Gillespie, W A; Pan, R; Tyrk, M A; Protopopescu, D; Robson, A; Apsimon, R; Bailey, I; Burt, G; Constable, D; Dexter, A; Karimian, S; Lingwood, C; Buckland, M D; Casse, G; Vossebeld, J; Bosco, A; Karataev, P; Kruchinin, K; 
Lekomtsev, K; Nevay, L; Snuverink, J; Yamakawa, E; Boisvert, V; Boogert, S; Boorman, G; Gibson, S; Lyapin, A; Shields, W; Teixeira-Dias, P; West, S; Jones, R; Joshi, N; Bodenstein, R; Burrows, P N; Christian, G B; Gamba, D; Perry, C; Roberts, J; Clarke, J A; Collomb, N A; Jamison, S P; Shepherd, B J A; Walsh, D; Demarteau, M; Repond, J; Weerts, H; Xia, L; Wells, J D; Adolphsen, C; Barklow, T; Breidenbach, M; Graf, N; Hewett, J; Markiewicz, T; McCormick, D; Moffeit, K; Nosochkov, Y; Oriunno, M; Phinney, N; Rizzo, T; Tantawi, S; Wang, F; Wang, J; White, G; Woodley, M

    2016-01-01

    The Compact Linear Collider (CLIC) is a multi-TeV high-luminosity linear e+e- collider under development. For an optimal exploitation of its physics potential, CLIC is foreseen to be built and operated in a staged approach with three centre-of-mass energy stages ranging from a few hundred GeV up to 3 TeV. The first stage will focus on precision Standard Model physics, in particular Higgs and top-quark measurements. Subsequent stages will focus on measurements of rare Higgs processes, as well as searches for new physics processes and precision measurements of new states, e.g. states previously discovered at LHC or at CLIC itself. In the 2012 CLIC Conceptual Design Report, a fully optimised 3 TeV collider was presented, while the proposed lower energy stages were not studied to the same level of detail. This report presents an updated baseline staging scenario for CLIC. The scenario is the result of a comprehensive study addressing the performance, cost and power of the CLIC accelerator complex as a function of...

  14. Cassini Spacecraft Uncertainty Analysis Data and Methodology Review and Update/Volume 1: Updated Parameter Uncertainty Models for the Consequence Analysis

    Energy Technology Data Exchange (ETDEWEB)

    WHEELER, TIMOTHY A.; WYSS, GREGORY D.; HARPER, FREDERICK T.

    2000-11-01

    Uncertainty distributions for specific parameters of the Cassini General Purpose Heat Source Radioisotope Thermoelectric Generator (GPHS-RTG) Final Safety Analysis Report consequence risk analysis were revised and updated. The revisions and updates were done for all consequence parameters for which relevant information exists from the joint project on Probabilistic Accident Consequence Uncertainty Analysis by the United States Nuclear Regulatory Commission and the Commission of European Communities.

  15. The dynamics of integrate-and-fire: mean versus variance modulations and dependence on baseline parameters.

    Science.gov (United States)

    Pressley, Joanna; Troyer, Todd W

    2011-05-01

    The leaky integrate-and-fire (LIF) is the simplest neuron model that captures the essential properties of neuronal signaling. Yet common intuitions are inadequate to explain basic properties of LIF responses to sinusoidal modulations of the input. Here we examine responses to low and moderate frequency modulations of both the mean and variance of the input current and quantify how these responses depend on baseline parameters. Across parameters, responses to modulations in the mean current are low pass, approaching zero in the limit of high frequencies. For very low baseline firing rates, the response cutoff frequency matches that expected from membrane integration. However, the cutoff shows a rapid, supralinear increase with firing rate, with a steeper increase in the case of lower noise. For modulations of the input variance, the gain at high frequency remains finite. Here, we show that the low-frequency responses depend strongly on baseline parameters and derive an analytic condition specifying the parameters at which responses switch from being dominated by low versus high frequencies. Additionally, we show that the resonant responses for variance modulations have properties not expected for common oscillatory resonances: they peak at frequencies higher than the baseline firing rate and persist when oscillatory spiking is disrupted by high noise. Finally, the responses to mean and variance modulations are shown to have a complementary dependence on baseline parameters at higher frequencies, resulting in responses to modulations of Poisson input rates that are independent of baseline input statistics.
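    The baseline-parameter dependence discussed in this record rests on simulating an LIF neuron driven by a mean current plus noise. The following is a minimal hedged sketch of such a simulation (Euler-Maruyama, illustrative parameter values, not those of the paper), showing only the simplest effect: the baseline firing rate grows with the baseline mean drive.

```python
import math
import random

def lif_rate(mean_drive, noise_sd, tau=0.02, v_th=1.0, v_reset=0.0,
             dt=1e-4, t_end=5.0, seed=0):
    """Euler-Maruyama simulation of a leaky integrate-and-fire neuron
    driven by a constant mean current plus white noise; returns the
    mean firing rate in Hz. Parameter values are illustrative only."""
    rng = random.Random(seed)
    v, spikes = v_reset, 0
    sqrt_dt = math.sqrt(dt)
    for _ in range(int(t_end / dt)):
        v += dt * (mean_drive - v) / tau + noise_sd * sqrt_dt * rng.gauss(0, 1)
        if v >= v_th:          # threshold crossing: spike and reset
            v = v_reset
            spikes += 1
    return spikes / t_end

# A higher baseline mean drive yields a higher baseline firing rate.
low = lif_rate(mean_drive=1.1, noise_sd=0.5)
high = lif_rate(mean_drive=1.5, noise_sd=0.5)
print(low < high)  # True
```

    Measuring the response to sinusoidal modulations of `mean_drive` or `noise_sd`, as the paper does, would amount to sweeping a modulation frequency over this same integration loop and extracting the firing-rate modulation amplitude.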

  16. Studying the physics potential of long-baseline experiments in terms of new sensitivity parameters

    International Nuclear Information System (INIS)

    Singh, Mandip

    2016-01-01

    We investigate physics opportunities to constrain the leptonic CP-violating phase δ_CP through numerical analysis of working neutrino oscillation probability parameters, in the context of long-baseline experiments. Numerical analysis of two parameters, the "transition probability δ_CP phase sensitivity parameter (A^M)" and the "CP-violation probability δ_CP phase sensitivity parameter (A^CP)," as functions of beam energy and/or baseline has been carried out. It is an elegant technique to broadly analyze different experiments to constrain the δ_CP phase and also to investigate the mass hierarchy in the leptonic sector. Positive and negative values of the parameter A^CP, corresponding to either hierarchy in specific beam energy ranges, could be a very promising way to explore the mass hierarchy and the δ_CP phase. The keys to more robust bounds on the δ_CP phase are improvements of the involved detection techniques to explore lower energies and relatively long baseline regions with better experimental accuracy.
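    For background on the baseline/energy dependence this record scans, the standard two-flavor vacuum oscillation probability is easy to evaluate. This sketch uses the textbook formula with illustrative long-baseline numbers; the paper's A^M and A^CP sensitivity parameters (which require the full three-flavor probability with δ_CP) are not reproduced here.

```python
import math

def p_transition(sin2_2theta, dm2_ev2, L_km, E_GeV):
    """Standard two-flavor vacuum oscillation probability:
    P = sin^2(2θ) · sin^2(1.267 · Δm²[eV²] · L[km] / E[GeV])."""
    return sin2_2theta * math.sin(1.267 * dm2_ev2 * L_km / E_GeV) ** 2

# The first oscillation maximum occurs where 1.267·Δm²·L/E = π/2,
# i.e. where the probability reaches its full amplitude sin^2(2θ).
dm2, L = 2.5e-3, 1300.0            # illustrative Δm² (eV²) and baseline (km)
E_max = 1.267 * dm2 * L / (math.pi / 2)
print(round(p_transition(0.085, dm2, L, E_max), 3))  # 0.085
```

    Scanning such probabilities over E and L for different δ_CP values and both mass orderings is the kind of numerical exercise from which sensitivity parameters like those in this record are built.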

  17. Industrial Hazardous Waste Management In Egypt-the baseline study: An Updated review

    International Nuclear Information System (INIS)

    Farida M, S.

    1999-01-01

    Increased industrialization over the past decades in Egypt has resulted in an increased and uncontrolled generation of industrial hazardous waste. This was not accompanied by any concerted effort to control these wastes. Consequently, no system for handling or disposing of industrial wastes in general, and industrial hazardous wastes in particular, exists. In 1993, a baseline report was formulated to assess the overall problem of industrial hazardous waste management in Egypt. Consequently, recommendations for priority actions were identified and the main components of a national hazardous waste system under the provisions of Law 4/1994 were presented. This paper provides an updated review of this report in light of the proposed technical, legal and institutional guidelines to help in the realization of such a needed waste management system in Egypt.

  18. Aircraft engine sensor fault diagnostics using an on-line OBEM update method.

    Directory of Open Access Journals (Sweden)

    Xiaofeng Liu

    This paper proposes a method to update the on-line health reference baseline of the On-Board Engine Model (OBEM) to maintain the effectiveness of an in-flight aircraft sensor Fault Detection and Isolation (FDI) system, in which a Hybrid Kalman Filter (HKF) is incorporated. A large health-condition mismatch between the engine and the OBEM, generated by rapid in-flight engine degradation, can corrupt the performance of the FDI. Therefore, it is necessary to update the OBEM on line when rapid degradation occurs, but the FDI system will lose estimation accuracy if the estimation and the update run simultaneously. To solve this problem, the health reference baseline for a nonlinear OBEM is updated using the proposed channel-controller method. Simulations based on a turbojet engine Linear Parameter-Varying (LPV) model demonstrate the effectiveness of the proposed FDI system in the presence of substantial degradation, and show that the channel controller can ensure that the update process finishes without interference from a single sensor fault.
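    The idea of tracking a drifting health baseline from noisy sensor data can be illustrated with the simplest possible filter. The following scalar Kalman filter is a hedged toy (random-walk health model, invented noise levels); the paper's Hybrid Kalman Filter and nonlinear OBEM are far richer.

```python
import random

def scalar_kf(zs, q=1e-4, r=0.04, x0=0.0, p0=1.0):
    """Minimal scalar Kalman filter tracking a slowly drifting health
    parameter from noisy sensor readings. Illustrative sketch only:
    q is the assumed random-walk process variance, r the measurement
    variance."""
    x, p = x0, p0
    for z in zs:
        p += q                    # predict step (random-walk model)
        k = p / (p + r)           # Kalman gain
        x += k * (z - x)          # measurement update
        p *= (1 - k)
    return x

rng = random.Random(7)
truth = 0.95                       # hypothetical degraded efficiency level
zs = [truth + rng.gauss(0, 0.2) for _ in range(200)]
est = scalar_kf(zs)
print(round(est, 2))               # converges near the true value
```

    An on-line baseline update in the spirit of this record would periodically replace the reference value inside the engine model with such a filtered estimate once a persistent shift is detected.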

  19. A PSO Driven Intelligent Model Updating and Parameter Identification Scheme for Cable-Damper System

    Directory of Open Access Journals (Sweden)

    Danhui Dan

    2015-01-01

    The precise measurement of cable force is very important for monitoring and evaluating the operational status of cable structures such as cable-stayed bridges. Cables are often fitted with lateral dampers to reduce vibration, which complicates the precise measurement of the cable force and other cable parameters. This paper suggests a cable model updating scheme driven by the particle swarm optimization (PSO) algorithm. By first establishing a finite element model that accounts for static geometric nonlinearity and the stress-stiffening effect, an automatic finite element model updating procedure powered by the PSO algorithm is proposed, with the aim of precisely identifying the cable force and relevant parameters of the cable-damper system. Both numerical case studies and full-scale cable tests indicated that, after two rounds of the updating process, the algorithm can accurately identify the cable force, moment of inertia, and damping coefficient of the cable-damper system.
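    A PSO-driven identification loop of the kind this record describes can be sketched with a generic global-best PSO minimizing the mismatch between measured and model-predicted frequencies. The code below is a hedged illustration: it identifies the tension of an idealized taut string (no damper, no bending stiffness) from one "measured" fundamental frequency, and is not the paper's finite-element updating scheme.

```python
import random

def pso_minimize(loss, bounds, n_particles=30, iters=200, seed=1):
    """Minimal global-best particle swarm optimizer (inertia form)."""
    rng = random.Random(seed)
    lo, hi = bounds
    xs = [rng.uniform(lo, hi) for _ in range(n_particles)]
    vs = [0.0] * n_particles
    pbest, pbest_f = xs[:], [loss(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g], pbest_f[g]
    w, c1, c2 = 0.7, 1.5, 1.5      # inertia and acceleration constants
    for _ in range(iters):
        for i in range(n_particles):
            vs[i] = (w * vs[i]
                     + c1 * rng.random() * (pbest[i] - xs[i])
                     + c2 * rng.random() * (gbest - xs[i]))
            xs[i] = min(hi, max(lo, xs[i] + vs[i]))   # keep in bounds
            f = loss(xs[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = xs[i], f
                if f < gbest_f:
                    gbest, gbest_f = xs[i], f
    return gbest

# Identify a taut-string tension from a "measured" fundamental frequency:
# f1 = sqrt(T / mu) / (2 * L). All values hypothetical.
L_len, mu, T_true = 100.0, 50.0, 4.0e6    # m, kg/m, N
f_meas = (T_true / mu) ** 0.5 / (2 * L_len)
T_est = pso_minimize(lambda T: ((T / mu) ** 0.5 / (2 * L_len) - f_meas) ** 2,
                     bounds=(1e6, 1e7))
print(f"estimated tension: {T_est:.3g} N")
```

    In the paper's setting the loss would instead compare measured modal data against a finite element model of the cable-damper system, with tension, moment of inertia, and damping coefficient as the swarm's search dimensions.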

  20. DUAL STATE-PARAMETER UPDATING SCHEME ON A CONCEPTUAL HYDROLOGIC MODEL USING SEQUENTIAL MONTE CARLO FILTERS

    Science.gov (United States)

    Noh, Seong Jin; Tachikawa, Yasuto; Shiiba, Michiharu; Kim, Sunmin

    Applications of data assimilation techniques have been widely used to improve the predictability of hydrologic modeling. Among various data assimilation techniques, sequential Monte Carlo (SMC) filters, known as "particle filters," provide the capability to handle non-linear and non-Gaussian state-space models. This paper proposes a dual state-parameter updating scheme (DUS) based on SMC methods to estimate both the state and parameter variables of a hydrologic model. We introduce a kernel smoothing method for robust estimation of uncertain model parameters in the DUS. The applicability of the dual updating scheme is illustrated using an implementation of the storage function model on a middle-sized Japanese catchment. We also compare the performance of DUS combined with various SMC methods, such as SIR, ASIR and RPF.
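    The dual state-parameter idea (augmenting each particle with the uncertain parameter so that resampling updates both at once) can be shown on a toy state-space model. This is a hedged sketch with an invented scalar AR(1) model and simple parameter jitter in place of the paper's kernel smoothing; the storage function model itself is not reproduced.

```python
import math
import random

def dual_sir_filter(ys, n=2000, seed=0):
    """Toy dual state-parameter SIR particle filter: each particle carries
    (state x, parameter a) for the model x_t = a*x_{t-1} + w_t,
    y_t = x_t + v_t, with w, v ~ N(0, 0.3^2). Illustrative only."""
    rng = random.Random(seed)
    parts = [(rng.gauss(0.0, 1.0), rng.uniform(0.0, 1.0)) for _ in range(n)]
    for y in ys:
        # propagate the state; jitter the parameter to avoid degeneracy
        parts = [(a * x + rng.gauss(0, 0.3), a + rng.gauss(0, 0.01))
                 for x, a in parts]
        # weight by the observation likelihood, then resample (SIR step)
        ws = [math.exp(-0.5 * ((y - x) / 0.3) ** 2) for x, _ in parts]
        parts = rng.choices(parts, weights=ws, k=n)
    return sum(a for _, a in parts) / n   # posterior mean of the parameter

# Synthetic observations generated with a_true = 0.8
gen = random.Random(42)
a_true, x, ys = 0.8, 0.0, []
for _ in range(60):
    x = a_true * x + gen.gauss(0, 0.3)
    ys.append(x + gen.gauss(0, 0.3))
a_hat = dual_sir_filter(ys)
print(round(a_hat, 2))  # pulled from the uniform prior towards 0.8
```

    The kernel smoothing method the paper introduces replaces the crude parameter jitter above with a shrinkage step that preserves the parameter posterior's mean and variance across resampling.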

  21. Likelihood updating of random process load and resistance parameters by monitoring

    DEFF Research Database (Denmark)

    Friis-Hansen, Peter; Ditlevsen, Ove Dalager

    2003-01-01

    Spectral parameters for a stationary Gaussian process are most often estimated by Fourier transformation of a realization followed by some smoothing procedure. This smoothing is often a weighted least-squares fitting of some prespecified parametric form of the spectrum. In this paper it is shown that maximum likelihood estimation is a rational alternative to an arbitrary weighting for least-squares fitting. The derived likelihood function gets singularities if the spectrum is prescribed with zero values at some frequencies. This is often the case for models of technically relevant processes...... The likelihood function, even though it is of complicated mathematical form, allows an approximate Bayesian updating and control of the time development of the parameters. Some of these parameters can be structural parameters that by too much change reveal progressing damage or other malfunctioning. Thus current process......

  22. Reliability updating based on monitoring of structural response parameters

    International Nuclear Information System (INIS)

    Leira, B.J.

    2016-01-01

    Short- and long-term aspects of measuring structural response parameters are addressed. Two specific examples of such measurements are considered for the purpose of illustration and in order to focus the discussion. These examples are taken from the petroleum industry (monitoring of riser response) and from the shipping industry (monitoring of ice-induced strains in a ship hull). Similarities and differences between the two cases are elaborated with respect to the most relevant mechanical limit states. Furthermore, the main concerns related to reliability levels within a short-term versus a long-term time horizon are highlighted. Quantifying the economic benefits of applying monitoring systems is also addressed. - Highlights: • Two examples of structural response monitoring are described. • Application of measurements is discussed in relation to updating of load and structural parameters. • Quantification of the value of response monitoring is made for both of the examples.

  23. The use of Bayesian networks for nanoparticle risk forecasting: model formulation and baseline evaluation.

    Science.gov (United States)

    Money, Eric S; Reckhow, Kenneth H; Wiesner, Mark R

    2012-06-01

    We describe the use of Bayesian networks as a tool for nanomaterial risk forecasting and develop a baseline probabilistic model that incorporates nanoparticle specific characteristics and environmental parameters, along with elements of exposure potential, hazard, and risk related to nanomaterials. The baseline model, FINE (Forecasting the Impacts of Nanomaterials in the Environment), was developed using expert elicitation techniques. The Bayesian nature of FINE allows for updating as new data become available, a critical feature for forecasting risk in the context of nanomaterials. The specific case of silver nanoparticles (AgNPs) in aquatic environments is presented here (FINE(AgNP)). The results of this study show that Bayesian networks provide a robust method for formally incorporating expert judgments into a probabilistic measure of exposure and risk to nanoparticles, particularly when other knowledge bases may be lacking. The model is easily adapted and updated as additional experimental data and other information on nanoparticle behavior in the environment become available. The baseline model suggests that, within the bounds of uncertainty as currently quantified, nanosilver may pose the greatest potential risk as these particles accumulate in aquatic sediments. Copyright © 2012 Elsevier B.V. All rights reserved.

  24. Fusion strategies for selecting multiple tuning parameters for multivariate calibration and other penalty based processes: A model updating application for pharmaceutical analysis

    Energy Technology Data Exchange (ETDEWEB)

    Tencate, Alister J. [Department of Chemistry, Idaho State University, Pocatello, ID 83209 (United States); Kalivas, John H., E-mail: kalijohn@isu.edu [Department of Chemistry, Idaho State University, Pocatello, ID 83209 (United States); White, Alexander J. [Department of Physics and Optical Engineering, Rose-Hulman Institute of Technology, Terre Haute, IN 47803 (United States)

    2016-05-19

New multivariate calibration methods and other processes are being developed that require selection of multiple tuning parameter (penalty) values to form the final model. With one or more tuning parameters, using only one measure of model quality to select the final tuning parameter values is not sufficient, and optimization of several model quality measures is challenging. Thus, three fusion ranking methods are investigated for simultaneous assessment of multiple measures of model quality when selecting tuning parameter values. One is a supervised learning fusion rule named sum of ranking differences (SRD). The other two are non-supervised learning processes based on the sum and median operations. The effect of the number of models evaluated on the three fusion rules is also assessed using three procedures. One procedure uses all models from all possible combinations of the tuning parameters. To reduce the number of models evaluated, an iterative process (only applicable to SRD) is applied, and thresholding a model quality measure before applying the fusion rules is also used. A near infrared pharmaceutical data set requiring model updating is used to evaluate the three fusion rules. In this case, calibration under the primary conditions is for the active pharmaceutical ingredient (API) of tablets produced in a laboratory, and the secondary conditions for calibration updating are for tablets produced in the full batch setting. Two model updating processes requiring selection of two unique tuning parameter values are studied: one based on Tikhonov regularization (TR) and the other a variation of partial least squares (PLS). The three fusion methods are shown to provide equivalent and acceptable results, allowing automatic selection of the tuning parameter values. The best tuning parameter values are selected when the model quality measures used with the fusion rules are for the small secondary sample set used to form the updated models. 
In this model updating situation, evaluation of
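The two non-supervised fusion rules named in this record (sum and median of ranks) can be sketched as follows. The candidate tuning-parameter models and their quality values are invented for illustration, and SRD itself is not implemented here.

```python
# Hedged sketch of rank fusion for tuning-parameter selection: rank each
# model-quality measure separately, then fuse ranks by sum or median.
import numpy as np

# Rows: candidate tuning-parameter combinations; columns: model-quality
# measures (e.g. RMSE, a model norm, a bias measure), smaller = better.
# All values are invented.
quality = np.array([
    [0.12, 3.1, 0.40],
    [0.10, 4.0, 0.35],
    [0.25, 1.2, 0.60],
    [0.11, 2.5, 0.33],
])

# Rank within each measure (0 = best), then fuse across measures.
ranks = quality.argsort(axis=0).argsort(axis=0)
sum_fused = ranks.sum(axis=1)
median_fused = np.median(ranks, axis=1)

best_by_sum = int(sum_fused.argmin())       # lowest total rank wins
best_by_median = int(median_fused.argmin())  # lowest median rank wins
print(best_by_sum, best_by_median)
```

Note that the two rules can disagree, as they do on this toy data; the record reports that on the real NIR data set the fusion methods gave equivalent selections.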

  6. Updating the U.S. Life Cycle GHG Petroleum Baseline to 2014 with Projections to 2040 Using Open-Source Engineering-Based Models.

    Science.gov (United States)

    Cooney, Gregory; Jamieson, Matthew; Marriott, Joe; Bergerson, Joule; Brandt, Adam; Skone, Timothy J

    2017-01-17

The National Energy Technology Laboratory produced a well-to-wheels (WTW) life cycle greenhouse gas analysis of petroleum-based fuels consumed in the U.S. in 2005, known as the NETL 2005 Petroleum Baseline. This study uses a set of engineering-based, open-source models combined with publicly available data to calculate baseline results for 2014. The increase between the 2005 baseline and the 2014 results presented here (e.g., 92.4 vs 96.2 g CO₂e/MJ gasoline, +4.1%) is due to changes both in the modeling platform and in the U.S. petroleum sector. An updated result for 2005 was calculated to minimize the effect of the change in modeling platform; on that basis, emissions for gasoline in 2014 were about 2% lower than in 2005 (98.1 vs 96.2 g CO₂e/MJ gasoline). The same methods were used to forecast emissions from fuels out to 2040, indicating maximum changes from the 2014 gasoline result of between +2.1% and -1.4%. The changing baseline values lead to potential compliance challenges with frameworks such as the Energy Independence and Security Act (EISA) Section 526, which states that Federal agencies should not purchase alternative fuels unless their life cycle GHG emissions are less than those of conventionally produced, petroleum-derived fuels.

  7. Homogeneous Gaussian Profile P+-Type Emitters: Updated Parameters and Metal-Grid Optimization

    Directory of Open Access Journals (Sweden)

    M. Cid

    2002-10-01

Full Text Available P+-type emitters were optimized while keeping the base parameters constant, and updated internal parameters were considered. The surface recombination velocity was treated as variable with the surface doping level. Passivated homogeneous emitters were found to have low emitter recombination density and high collection efficiency. A complete p+nn+ structure was analyzed, taking into account optimized shadowing and metal-contacted factors for laboratory cells as a function of the surface doping level and the emitter thickness. The base parameters were kept constant to make the emitter characteristics evident. The most efficient P+-type passivated homogeneous emitters provide efficiencies around 21% for a wide range of emitter sheet resistivity (50-500 Ω/□) with surface doping levels Ns = 1×10^19 cm^-3 and 5×10^19 cm^-3. The output electrical parameters were evaluated considering the recently proposed value n_i = 9.65×10^9 cm^-3. A non-significant increase of 0.1% in the efficiency was obtained, validating all the conclusions obtained in this work considering n_i = 1×10^10 cm^-3.

  8. Developing RESRAD-BASELINE for environmental baseline risk assessment

    International Nuclear Information System (INIS)

    Cheng, Jing-Jy.

    1995-01-01

RESRAD-BASELINE is a computer code developed at Argonne National Laboratory for the US Department of Energy (DOE) to perform both radiological and chemical risk assessments. The code implements the baseline risk assessment guidance of the US Environmental Protection Agency (EPA 1989). It calculates (1) radiation doses and cancer risks from exposure to radioactive materials, and (2) hazard indexes and cancer risks from exposure to noncarcinogenic and carcinogenic chemicals, respectively. The user can enter measured or predicted environmental media concentrations through the graphic interface and can simulate different exposure scenarios by selecting the appropriate pathways and modifying the exposure parameters. The database used by RESRAD-BASELINE includes dose conversion factors and slope factors for radionuclides, and toxicity information and properties for chemicals. The user can modify the database for use in the calculation. Sensitivity analysis can be performed while running the code to examine the influence of the input parameters. Use of RESRAD-BASELINE for risk analysis is easy, fast, and cost-saving. Furthermore, it ensures consistency in methodology for both radiological and chemical risk analyses

  9. Investigation of key parameters for the development of reliable ITER baseline operation scenarios using CORSICA

    Science.gov (United States)

    Kim, S. H.; Casper, T. A.; Snipes, J. A.

    2018-05-01

ITER will demonstrate the feasibility of burning plasma operation by operating DT plasmas in the ELMy H-mode regime with a high fusion power gain Q ~ 10. The 15 MA ITER baseline operation scenario has been studied using CORSICA, focusing on the entry to burn, flat-top burning plasma operation, and exit from burn. Burning plasma operation for about 400 s of the current flat-top was achieved in H-mode within the various engineering constraints imposed by the poloidal field coil and power supply systems. The target fusion gain (Q ~ 10) was achievable in the 15 MA ITER baseline operation with a moderate amount of total auxiliary heating power (~50 MW). It was observed that the tungsten (W) concentration needs to be maintained at a low level (n_W/n_e up to the order of 1.0 × 10^-5) to avoid radiative collapse and uncontrolled early termination of the discharge. The dynamic evolution of the density can modify H-mode access unless the applied auxiliary heating power is significantly higher than the H-mode threshold power. Several qualitative sensitivity studies have been performed to provide guidance for further optimizing the plasma operation and performance. Increasing the density profile peaking factor was quite effective in increasing the alpha particle self-heating power and the fusion power multiplication factor. Varying the combination of auxiliary heating powers showed that the fusion power multiplication factor can be reduced as the total auxiliary heating power increases. As the 15 MA ITER baseline operation scenario requires the full capacity of the coil and power supply systems, the operation window for H-mode access and shape modification was narrow. The updated ITER baseline operation scenarios developed in this work will become a basis for the further optimization studies necessary as understanding of burning plasma physics improves.

  10. Glass Dissolution Parameters: Update for Entsorgungsnachweis

    International Nuclear Information System (INIS)

    Curti, E.

    2003-11-01

This document provides updated long-term corrosion rates for borosilicate glasses used in Switzerland as a matrix for high-level radioactive waste. The new rates are based on long-term leaching experiments conducted at PSI and are corroborated by recent investigations. The asymptotic rates have been determined through weighted linear regressions of the normalised mass losses, directly calculated from B and Li concentrations in the leaching solutions. Special attention was given to the determination of the analytical uncertainty of the mass losses. The sensitivity of the corrosion rates to analytical uncertainties and to other criteria (e.g. the choice of data points for the regressions) was also studied. A major finding was that the uncertainty of the corrosion rate mainly depends on the uncertainty of the specific glass surface area. The reference rates proposed for safety assessment calculations are 1.5 mg m⁻² d⁻¹ for BNFL glasses and 0.2 mg m⁻² d⁻¹ for Cogema glasses. The relevance of the proposed corrosion rates for repository conditions is shown based on the analysis of processes and parameters currently known to affect the long-term kinetics of silicate glasses. Specifically, recent studies indicate that potentially detrimental effects, notably the removal of silica from solution through adsorption on clay minerals, are transitory and will not affect the long-term corrosion rate of the Swiss reference glasses. Iron corrosion products are also known to bind silica, but present data are not sufficient to quantify their influence on the long-term rate. (author)
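The rate-extraction step this record describes, a weighted linear regression of normalised mass loss against time with the slope taken as the corrosion rate, can be sketched as follows. The data points and uncertainties are invented for illustration, not the PSI measurements.

```python
# Hedged sketch: long-term corrosion rate as the slope of a weighted
# least-squares fit of normalised mass loss vs. leaching time.
import numpy as np

t = np.array([100.0, 300.0, 600.0, 1000.0])   # leaching time, days (invented)
nl = np.array([0.9, 1.2, 1.65, 2.25])          # normalised mass loss, g/m^2 (invented)
sigma = np.array([0.10, 0.10, 0.15, 0.20])     # analytical uncertainty per point

# Weighted least squares with weights 1/sigma^2 on the linear model
# nl = intercept + slope * t.
w = 1.0 / sigma**2
W = np.diag(w)
X = np.vstack([np.ones_like(t), t]).T
intercept, slope = np.linalg.solve(X.T @ W @ X, X.T @ W @ nl)

rate_mg = slope * 1000.0                       # g m^-2 d^-1 -> mg m^-2 d^-1
print(f"long-term rate ~ {rate_mg:.2f} mg m^-2 d^-1")
```

Propagating the per-point uncertainties through the same normal equations would give the rate uncertainty the record discusses; that step is omitted here for brevity.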

  11. Finite element model updating of multi-span steel-arch-steel-girder bridges based on ambient vibrations

    Science.gov (United States)

    Hou, Tsung-Chin; Gao, Wei-Yuan; Chang, Chia-Sheng; Zhu, Guan-Rong; Su, Yu-Min

    2017-04-01

The three-span steel-arch-steel-girder Jiaxian Bridge was newly constructed in 2010 to replace the former bridge that had been destroyed by Typhoon Sinlaku (2008, Taiwan). It was designed and built to continue the domestic service requirement, as well as to improve the tourism business of the Kaohsiung city government, Taiwan. This study aimed at establishing the baseline model of the Jiaxian Bridge for hazardous scenario simulation such as typhoons, floods, and earthquakes. The necessity of these precautionary works was attributed to the inherent vulnerability of the site: near-fault and river-crossing conditions. The uncalibrated baseline bridge model was built with structural finite elements in accordance with the blueprints. Ambient vibration measurements were performed repeatedly to acquire the elastic dynamic characteristics of the bridge structure. Two frequency-domain system identification algorithms were employed to extract the measured operational modal parameters. Modal shapes, frequencies, and modal assurance criteria (MAC) were configured as the fitting targets so as to calibrate/update the structural parameters of the baseline model. It has been recognized that different types of structural parameters contribute distinguishably to the fitting targets, as this study similarly explored. For steel-arch-steel-girder bridges, and this case in particular, joint rigidity of the steel components was found to be dominant, while material properties and section geometries were relatively minor. The updated model was capable of providing more rational elastic responses of the bridge superstructure under normal service conditions as well as hazardous scenarios, and can be used to manage the health condition of the bridge structure.
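The MAC used in this record as a fitting target between measured and model mode shapes has a standard closed form, MAC = |φₐᵀφᵦ|² / ((φₐᵀφₐ)(φᵦᵀφᵦ)). A minimal sketch with invented mode-shape vectors:

```python
# Hedged sketch: modal assurance criterion (MAC) between a measured and a
# candidate finite-element mode shape. Vectors are invented.
import numpy as np

def mac(phi_a, phi_b):
    """MAC = |phi_a . phi_b|^2 / ((phi_a . phi_a)(phi_b . phi_b));
    1.0 means perfectly correlated shapes, 0.0 means orthogonal."""
    return np.abs(phi_a @ phi_b) ** 2 / ((phi_a @ phi_a) * (phi_b @ phi_b))

measured = np.array([0.0, 0.31, 0.59, 0.81, 0.95, 1.0])  # identified shape
model    = np.array([0.0, 0.30, 0.58, 0.80, 0.95, 1.0])  # candidate FE shape
print(f"MAC = {mac(measured, model):.4f}")
```

In model updating, MAC values for paired modes (together with frequency errors) form the objective that the structural parameters, here joint rigidity above all, are tuned to maximize.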

  12. Association of baseline vitamin D levels with clinical parameters and treatment outcomes in chronic hepatitis B.

    Science.gov (United States)

    Chan, Henry Lik-Yuen; Elkhashab, Magdy; Trinh, Huy; Tak, Won Young; Ma, Xiaoli; Chuang, Wan-Long; Kim, Yoon Jun; Martins, Eduardo B; Lin, Lanjia; Dinh, Phillip; Charuworn, Prista; Foster, Graham R; Marcellin, Patrick

    2015-11-01

The relationship between vitamin D levels and chronic hepatitis B (CHB) infection and treatment outcomes is poorly elucidated. We measured pre-treatment serum vitamin D (25-hydroxyvitamin D3; 25[OH]D3) levels and determined their association with clinical parameters and treatment outcomes in active CHB patients without advanced liver disease enrolled in a global clinical trial. Patients were randomly assigned to either 48 weeks of tenofovir disoproxil fumarate (TDF) plus peginterferon alfa-2a (PegIFN), TDF plus PegIFN for 16 weeks followed by TDF for 32 weeks, PegIFN for 48 weeks, or TDF for 120 weeks. Univariate and multivariate analyses were conducted to determine associations between vitamin D, baseline factors, and week 48 clinical outcome. Of 737 patients, 35% had insufficient vitamin D levels. In univariate analysis, lower vitamin D levels were significantly associated with the following baseline parameters: younger age, lower uric acid levels, HBeAg-positive status, lower calcium levels, blood draw in winter or autumn, and HBV genotype D. On multivariate analysis, only HBV genotype, season of blood draw, calcium level, and age retained their association. A high baseline level of vitamin D was associated with low HBV DNA, normal ALT, and HBsAg at week 48 independent of treatment group, but the association, with the exception of ALT, became statistically insignificant after adjusting for age, gender, HBeAg, and HBV genotype. Abnormally low vitamin D levels are highly prevalent among untreated, active CHB patients. Baseline vitamin D levels are not associated with treatment outcomes, but were associated with normal ALT. Copyright © 2015 European Association for the Study of the Liver. Published by Elsevier B.V. All rights reserved.

  13. Baseline values of immunologic parameters in the lizard Salvator merianae (Teiidae, Squamata)

    Science.gov (United States)

    Mestre, Ana Paula; Amavet, Patricia Susana; Siroski, Pablo Ariel

    2017-01-01

The genus Salvator is widely distributed throughout South America. In Argentina, the most abundant and widely distributed species is Salvator merianae. Particularly in Santa Fe province, the area occupied by populations of these lizards overlaps with areas where agriculture has expanded. With the aim of establishing baseline values for four widely used immunologic biomarkers, 36 tegu lizards were evaluated, taking into account different age classes and both sexes. Total leukocyte counts did not differ between age classes. Among the leukocyte counts, eosinophil levels were higher in neonates than in juveniles and adults; nevertheless, heterophils were the most prevalent leukocyte in the peripheral blood in all age classes. Lymphocyte, monocyte, heterophil, azurophil, and basophil levels did not differ with age. Natural antibody titres were higher in adults than in neonate and juvenile lizards. Lastly, complement system activity was lower in neonates than in juveniles and adults. Statistical analysis within each age group showed that gender was not a factor in the outcomes. Based on the results, we concluded that S. merianae demonstrated age-related (but not gender-related) differences in the immune parameters analyzed. Having established baseline values for these four widely used immunologic biomarkers, ongoing studies will seek to optimize the use of the S. merianae model in future research. PMID:28652981

  16. Neoadjuvant chemoradiotherapy of rectal carcinoma. Baseline hematologic parameters influencing outcomes

    Energy Technology Data Exchange (ETDEWEB)

    Hodek, Miroslav; Sirak, Igor; Paluska, Petr; Kopecky, Jindrich; Petera, Jiri; Vosmik, Milan [University Hospital in Hradec Kralove, Department of Oncology and Radiotherapy, Hradec Kralove (Czech Republic); Ferko, Alexander; Oerhalmi, Julius [University Hospital in Hradec Kralove, Department of Surgery, Hradec Kralove (Czech Republic); Hovorkova, Eva; Hadzi Nikolov, Dimitar [University Hospital in Hradec Kralove, Fingerland Department of Pathology, Hradec Kralove (Czech Republic)

    2016-09-15

The link between the blood count and a systemic inflammatory response (SIR) is indisputable and well described. Pretreatment hematological parameters may predict the overall clinical outcomes in many types of cancer. Thus, this study aims to systematically evaluate the relationship between baseline blood count levels and treatment response in rectal cancer patients treated with neoadjuvant chemoradiotherapy. From 2009 to 2015, 173 patients with locally advanced rectal cancer were retrospectively enrolled in the study and analyzed. The baseline blood count was recorded in all patients 1 week before chemoradiation. Tumor response was evaluated through pathologic findings. Blood count levels, including RBC (red blood cells), Hb (hemoglobin), PLT (platelet count), neutrophil count, WBC (white blood cells), NLR (neutrophil-to-lymphocyte ratio), and PLR (platelet-to-lymphocyte ratio), were analyzed in relation to tumor downstaging, pCR (pathologic complete response), OS (overall survival), and DFS (disease-free survival). Hb levels were associated with response in logistic regression analysis: pCR (p = 0.05; OR 1.04, 95% CI 1.00-1.07); T downstaging (p = 0.006; OR 1.03, 95% CI 1.01-1.05); N downstaging (p = 0.09; OR 1.02, 95% CI 1.00-1.04); T or N downstaging (p = 0.007; OR 1.04, 95% CI 1.01-1.07); T and N downstaging (p = 0.02; OR 1.02, 95% CI 1.00-1.04). Hb and RBC were the most significant parameters influencing OS; PLT was a negative prognostic factor for OS and DFS (p = 0.008 for OS); an NLR value of 2.8 was associated with the greatest significance for OS (p = 0.03) and primary tumor downstaging (p = 0.02). Knowledge of pretreatment hematological parameters appears to be an important prognostic factor in patients with rectal carcinoma. (orig.)
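The two derived ratios analysed in this record are simple quotients of absolute counts. A minimal sketch with invented counts, using the 2.8 NLR cut-off reported above:

```python
# Hedged sketch: neutrophil-to-lymphocyte and platelet-to-lymphocyte
# ratios from absolute blood counts. Counts below are invented.
def ratios(neutrophils, lymphocytes, platelets):
    """Return (NLR, PLR) from absolute counts in 10^9 cells/L."""
    return neutrophils / lymphocytes, platelets / lymphocytes

nlr, plr = ratios(neutrophils=4.2, lymphocytes=1.4, platelets=250.0)
high_risk = nlr > 2.8   # cut-off associated with OS in this record
print(f"NLR = {nlr:.2f}, PLR = {plr:.1f}, above cut-off: {high_risk}")
```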

  17. Neutrino Oscillation Parameter Sensitivity in Future Long-Baseline Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Bass, Matthew [Colorado State Univ., Fort Collins, CO (United States)

    2014-01-01

The study of neutrino interactions and propagation has produced evidence for physics beyond the standard model and promises to continue to shed light on rare phenomena. Since the discovery of neutrino oscillations in the late 1990s there have been rapid advances in establishing the three-flavor paradigm of neutrino oscillations. The 2012 discovery of a large value for the last unmeasured mixing angle has opened the way for future experiments to search for charge-parity symmetry violation in the lepton sector. This thesis presents an analysis of the future sensitivity to neutrino oscillations in the three-flavor paradigm for the T2K, NOvA, LBNE, and T2HK experiments. The theory of the three-flavor paradigm is explained, and the methods that use these theoretical predictions to design long-baseline neutrino experiments are described. The sensitivity to the oscillation parameters for each experiment is presented, with a particular focus on the search for CP violation and the measurement of the neutrino mass hierarchy. The variations of these sensitivities with statistical considerations and experimental design optimizations taken into account are explored. The effects of systematic uncertainties in the neutrino flux, interaction, and detection predictions are also considered by incorporating more advanced simulation inputs from the LBNE experiment.

  18. Damage Identification of Bridge Based on Chebyshev Polynomial Fitting and Fuzzy Logic without Considering Baseline Model Parameters

    Directory of Open Access Journals (Sweden)

    Yu-Bo Jiao

    2015-01-01

Full Text Available The paper presents an effective approach for damage identification of bridges based on Chebyshev polynomial fitting and fuzzy logic systems without considering baseline model data. The modal curvature of a damaged bridge can be obtained through central difference approximation based on the displacement modal shape. From the modal curvature of the damaged structure, Chebyshev polynomial fitting is applied to acquire the curvature of the undamaged one without considering baseline parameters. Therefore, the modal curvature difference can be derived and used for damage localization. Subsequently, the normalized modal curvature difference is treated as the input variable of fuzzy logic systems for damage condition assessment. Numerical simulation on a simply supported bridge was carried out to demonstrate the feasibility of the proposed method.
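The two numerical steps in this record, central-difference modal curvature from a displacement mode shape and a smooth Chebyshev polynomial fit standing in for the undamaged baseline, can be sketched as follows. The mode shape and damage perturbation are invented, and the fuzzy-logic assessment stage is not implemented.

```python
# Hedged sketch: damage localization from modal curvature difference.
import numpy as np
from numpy.polynomial import chebyshev

x = np.linspace(0.0, 1.0, 21)        # measurement points along the span
phi = np.sin(np.pi * x)              # first bending mode shape (invented)
phi[10] *= 0.97                      # small local "damage" perturbation at midspan

# Central difference approximation of modal curvature at interior points.
h = x[1] - x[0]
curv = (phi[2:] - 2 * phi[1:-1] + phi[:-2]) / h**2

# A low-order Chebyshev fit smooths out the local anomaly, approximating
# the undamaged curvature; the residual localizes the damage.
coefs = chebyshev.chebfit(x[1:-1], curv, deg=4)
curv_fit = chebyshev.chebval(x[1:-1], coefs)
diff = np.abs(curv - curv_fit)

print("suspected damage near x =", x[1:-1][diff.argmax()])
```

The central-difference spike at the perturbed point dominates the residual, which is what makes the difference usable as a damage index without any baseline model.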

  19. Atmospheric pressure loading parameters from very long baseline interferometry observations

    Science.gov (United States)

    Macmillan, D. S.; Gipson, John M.

    1994-01-01

    Atmospheric mass loading produces a primarily vertical displacement of the Earth's crust. This displacement is correlated with surface pressure and is large enough to be detected by very long baseline interferometry (VLBI) measurements. Using the measured surface pressure at VLBI stations, we have estimated the atmospheric loading term for each station location directly from VLBI data acquired from 1979 to 1992. Our estimates of the vertical sensitivity to change in pressure range from 0 to -0.6 mm/mbar depending on the station. These estimates agree with inverted barometer model calculations (Manabe et al., 1991; vanDam and Herring, 1994) of the vertical displacement sensitivity computed by convolving actual pressure distributions with loading Green's functions. The pressure sensitivity tends to be smaller for stations near the coast, which is consistent with the inverted barometer hypothesis. Applying this estimated pressure loading correction in standard VLBI geodetic analysis improves the repeatability of estimated lengths of 25 out of 37 baselines that were measured at least 50 times. In a root-sum-square (rss) sense, the improvement generally increases with baseline length at a rate of about 0.3 to 0.6 ppb depending on whether the baseline stations are close to the coast. For the 5998-km baseline from Westford, Massachusetts, to Wettzell, Germany, the rss improvement is about 3.6 mm out of 11.0 mm. The average rss reduction of the vertical scatter for inland stations ranges from 2.7 to 5.4 mm.
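The per-station estimate described in this record is, at its core, a regression of vertical position against local surface pressure. A minimal sketch with synthetic data (the -0.4 mm/mbar sensitivity and noise levels are invented, though they sit inside the 0 to -0.6 mm/mbar range reported above):

```python
# Hedged sketch: estimating a station's vertical pressure-loading
# sensitivity (mm/mbar) from noisy position and pressure series.
import numpy as np

rng = np.random.default_rng(0)
n = 200
pressure = 1013.0 + rng.normal(0.0, 8.0, n)            # surface pressure, mbar
true_sensitivity = -0.4                                 # mm/mbar (invented)
up = true_sensitivity * (pressure - 1013.0) + rng.normal(0.0, 3.0, n)  # vertical, mm

# Ordinary least-squares slope: cov(pressure, up) / var(pressure).
p0 = pressure - pressure.mean()
slope = (p0 @ (up - up.mean())) / (p0 @ p0)
print(f"estimated sensitivity ~ {slope:.2f} mm/mbar")
```

In the actual VLBI analysis this term is estimated jointly with the other geodetic parameters rather than by a standalone regression, but the slope interpretation is the same.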

  20. Baseline Anthropometric and Metabolic Parameters Correlate with Weight Loss in Women 1-Year After Laparoscopic Roux-En-Y Gastric Bypass.

    Science.gov (United States)

    Sans, Arnaud; Bailly, Laurent; Anty, Rodolphe; Sielezenef, Igor; Gugenheim, Jean; Tran, Albert; Gual, Philippe; Iannelli, Antonio

    2017-11-01

In this study, we explored, in a prospective cohort of morbidly obese women undergoing laparoscopic Roux-en-Y gastric bypass (LRYGBP), correlations between baseline anthropometrics, metabolic parameters, resting energy expenditure (REE), body composition, and 1-year % excess body mass index loss (%EBMIL). We also investigated risk factors for insufficient %EBMIL. One hundred three consecutive women were prospectively evaluated at baseline (age 40.6 ± 11.2, weight 113.9 ± 15.3 kg, BMI 43.3 ± 4.9 kg/m²) and 1 year after LRYGBP. Weight, excess weight, brachial circumference, waist circumference, fat mass (FM) and fat-free mass (FFM) (measured with bioelectrical impedance analysis), REE, inflammation, insulin resistance, and lipid disturbances were determined before and 1 year after LRYGBP. At 1 year, mean weight loss was 39.8 ± 11.7 kg and mean EBMIL was 15.2 ± 4.2 kg/m². Mean %EBMIL was 86% ± 21% (range 30-146%). Baseline brachial circumference, waist circumference and triceps skinfold thickness decreased significantly at 1 year (P baseline body composition parameters, only preoperative FM was negatively correlated with %EBMIL (r = -0.23; p = 0.02). One year after surgery FM change was negatively correlated with %EBMIL (r = -0.49; P baseline blood glucose level (OR = 1.77; CI 95%: [1.3-2.4]) was the only predictive factor of EBMIL Baseline glucose level may be helpful in identifying poor responders to LRYGBP. NCT02820285 ( https://clinicaltrials.gov/ct2/show/NCT02820285?term=Characterization+of+Immune+Semaphorin+in+Non-Alcoholic+Fatty+Liver+Disease+and+NASH&rank=1 ).

  1. Challenges while Updating Planning Parameters of an ERP System and How a Simulation-Based Support System Can Support Material Planners

    Directory of Open Access Journals (Sweden)

    Ulrike Stumvoll

    2016-01-01

Full Text Available In an Enterprise Resource Planning (ERP) system, production planning is influenced by a variety of parameters. Previous investigations show that setting parameter values is highly relevant to a company's target system. Parameter settings should be checked and adjusted by material planners, e.g., after a change in environmental factors. In practice, updating the parameters is difficult for several reasons. This paper presents a simulation-based decision support system that helps material planners in all stages of the decision-making process, and presents the system prototype's user interface along with the results of applying the system to a case study.

  2. 40 CFR 80.92 - Baseline auditor requirements.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 16 2010-07-01 2010-07-01 false Baseline auditor requirements. 80.92... (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Anti-Dumping § 80.92 Baseline auditor requirements. (a... determination methodology, resulting baseline fuel parameter, volume and emissions values verified by an auditor...

  3. EML Chester - 1982. Annual report of the Regional Baseline Station at Chester, New Jersey

    International Nuclear Information System (INIS)

    Volchok, H.L.

    1982-11-01

    The Environmental Measurements Laboratory (EML) has maintained a regional baseline station at Chester, New Jersey since 1976. The site provides EML with a remote, rural facility for carrying out regional baseline research and for testing field equipment. This report updates the various programs underway at the Chester site. Separate abstracts have been prepared for the included papers.

  4. Retrieving CO concentrations from FT-IR spectra with nonmodeled interferences and fluctuating baselines using PCR model parameters

    DEFF Research Database (Denmark)

    Bak, J.

    2001-01-01

    It is demonstrated that good predictions of gas concentrations based on measured spectra can be made even if these spectra contain totally overlapping spectral features from nonidentified and non-modeled interfering compounds and fluctuating baselines. The prediction program (CONTOUR) is based solely on principal component regression (PCR) model parameters. CONTOUR consists of two smaller algorithms. The first of these is used to calculate pure component spectra based on the PCR model parameters at different concentrations. In the second algorithm, the calculated pure component spectra (…) remains. The assumptions are that the background and analytical signals must be additive and that no accidental match between these signals takes place. The best results are obtained with the use of spectra with a high selectivity. The use of the program is demonstrated by applying simple single (…)

  5. U-10Mo Baseline Fuel Fabrication Process Description

    Energy Technology Data Exchange (ETDEWEB)

    Hubbard, Lance R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Arendt, Christina L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dye, Daniel F. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Clayton, Christopher K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lerchen, Megan E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lombardo, Nicholas J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lavender, Curt A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Zacher, Alan H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2017-09-27

    This document provides a description of the U.S. High Power Research Reactor (USHPRR) low-enriched uranium (LEU) fuel fabrication process. This document is intended to be used in conjunction with the baseline process flow diagram (PFD) presented in Appendix A. The baseline PFD is used to document the fabrication process, communicate gaps in technology or manufacturing capabilities, convey alternatives under consideration, and as the basis for a dynamic simulation model of the fabrication process. The simulation model allows for the assessment of production rates, costs, and manufacturing requirements (manpower, fabrication space, numbers and types of equipment, etc.) throughout the lifecycle of the USHPRR program. This document, along with the accompanying PFD, is updated regularly.

  6. Identification of material parameters for plasticity models: A comparative study on the finite element model updating and the virtual fields method

    Science.gov (United States)

    Martins, J. M. P.; Thuillier, S.; Andrade-Campos, A.

    2018-05-01

    The identification of material parameters, for a given constitutive model, can be seen as the first step before any practical application. In recent years, the field of material parameter identification has received an important boost with the development of full-field measurement techniques, such as Digital Image Correlation. These techniques enable the use of heterogeneous displacement/strain fields, which contain more information than the classical homogeneous tests. Consequently, different techniques have been developed to extract material parameters from full-field measurements. In this study, two of these techniques are addressed: the Finite Element Model Updating (FEMU) and the Virtual Fields Method (VFM). The main idea behind FEMU is to update the parameters of a constitutive model implemented in a finite element model until both numerical and experimental results match, whereas VFM makes use of the Principle of Virtual Work and does not require any finite element simulation. Though both techniques have proved their feasibility for linear and non-linear constitutive models, it is rather difficult to rank their robustness in plasticity. The purpose of this work is to perform a comparative study in the case of elasto-plastic models. Details concerning the implementation of each strategy are presented. Moreover, a dedicated code for VFM within a large strain framework is developed. The reconstruction of the stress field is performed through a user subroutine. A heterogeneous tensile test is considered to compare the FEMU and VFM strategies.
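    A minimal FEMU loop of the kind compared in this record can be sketched on a one-degree-of-freedom spring-mass model: iterate on the stiffness parameter until the computed natural frequency matches the "measured" one. This is a toy sketch, not the authors' implementation; the model, values, and names are hypothetical.

    ```python
    import math

    def natural_freq_hz(k, m):
        """Model output: natural frequency of a 1-DOF spring-mass system."""
        return math.sqrt(k / m) / (2 * math.pi)

    def femu_update(k0, m, f_measured, iters=20):
        """Iteratively update stiffness k until the computed frequency matches
        the measured one, using a finite-difference sensitivity (FEMU idea)."""
        k = k0
        for _ in range(iters):
            f = natural_freq_hz(k, m)
            dk = 1e-6 * k
            s = (natural_freq_hz(k + dk, m) - f) / dk   # df/dk sensitivity
            k += (f_measured - f) / s                   # Newton-style correction
        return k

    m = 100.0                             # hypothetical mass [kg]
    k_true = 4.0e4                        # "true" stiffness we pretend not to know
    f_meas = natural_freq_hz(k_true, m)   # stands in for experimental data
    k_identified = femu_update(k0=1.0e4, m=m, f_measured=f_meas)
    ```

    VFM, by contrast, would avoid the repeated model evaluations by working directly with full-field data through the principle of virtual work.
    
    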

  7. Updated fit to three neutrino mixing: exploring the accelerator-reactor complementarity

    International Nuclear Information System (INIS)

    Esteban, Ivan; Gonzalez-Garcia, M.C.; Maltoni, Michele; Martinez-Soler, Ivan; Schwetz, Thomas

    2017-01-01

    We perform a combined fit to global neutrino oscillation data available as of fall 2016 in the scenario of three-neutrino oscillations and present updated allowed ranges of the six oscillation parameters. We discuss the differences arising between the consistent combination of the data samples from accelerator and reactor experiments compared to partial combinations. We quantify the confidence in the determination of the less precisely known parameters θ23, δCP, and the neutrino mass ordering by performing a Monte Carlo study of the long baseline accelerator and reactor data. We find that the sensitivity to the mass ordering and the θ23 octant is below 1σ. Maximal θ23 mixing is allowed at slightly more than 90% CL. The best fit for the CP violating phase is around 270°, CP conservation is allowed at slightly above 1σ, and values of δCP ≃ 90° are disfavored at around 99% CL for normal ordering and higher CL for inverted ordering.
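    For orientation only: in the two-flavor vacuum limit, the oscillation probabilities underlying such fits reduce to the textbook formula P = 1 − sin²(2θ)·sin²(1.267 Δm²[eV²] L[km] / E[GeV]). The sketch below uses illustrative, roughly T2K-like numbers and is not the paper's three-flavor global-fit machinery.

    ```python
    import math

    def p_survival(sin2_2theta, dm2_ev2, L_km, E_GeV):
        """Two-flavor vacuum survival probability:
        P = 1 - sin^2(2θ) · sin^2(1.267 · Δm²[eV²] · L[km] / E[GeV])."""
        phase = 1.267 * dm2_ev2 * L_km / E_GeV
        return 1.0 - sin2_2theta * math.sin(phase) ** 2

    # Illustrative values: maximal mixing, |Δm²| ≈ 2.5e-3 eV², L = 295 km, E = 0.6 GeV.
    p = p_survival(1.0, 2.5e-3, 295.0, 0.6)
    ```

    With these numbers the phase sits close to the first oscillation maximum, so the survival probability is nearly zero.
    
    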

  8. Estimating Propensity Parameters Using Google PageRank and Genetic Algorithms.

    Science.gov (United States)

    Murrugarra, David; Miller, Jacob; Mueller, Alex N

    2016-01-01

    Stochastic Boolean networks, or more generally, stochastic discrete networks, are an important class of computational models for molecular interaction networks. The stochasticity stems from the updating schedule. Standard updating schedules include the synchronous update, where all the nodes are updated at the same time, and the asynchronous update, where a random node is updated at each time step. The former produces a deterministic dynamics while the latter a stochastic dynamics. A more general stochastic setting considers propensity parameters for updating each node. Stochastic Discrete Dynamical Systems (SDDS) are a modeling framework that considers two propensity parameters for updating each node and uses one when the update has a positive impact on the variable, that is, when the update causes the variable to increase its value, and uses the other when the update has a negative impact, that is, when the update causes it to decrease its value. This framework offers additional features for simulations but also adds a complexity in parameter estimation of the propensities. This paper presents a method for estimating the propensity parameters for SDDS. The method is based on adding noise to the system using the Google PageRank approach to make the system ergodic and thus guaranteeing the existence of a stationary distribution. Then with the use of a genetic algorithm, the propensity parameters are estimated. Approximation techniques that make the search algorithms efficient are also presented, and Matlab/Octave code to test the algorithms is available at http://www.ms.uky.edu/~dmu228/GeneticAlg/Code.html.
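    The SDDS update rule described above can be sketched as follows. This is a minimal illustration, not the paper's code; the toy two-node network and the propensity values are hypothetical.

    ```python
    import random

    def sdds_step(state, functions, prop_up, prop_down, rng):
        """One update of a Stochastic Discrete Dynamical System: each node i
        moves toward f_i(state) with probability prop_up[i] if the move would
        increase its value, or prop_down[i] if it would decrease it."""
        nxt = list(state)
        for i, f in enumerate(functions):
            target = f(state)
            if target > state[i] and rng.random() < prop_up[i]:
                nxt[i] = target
            elif target < state[i] and rng.random() < prop_down[i]:
                nxt[i] = target
        return tuple(nxt)

    # Toy 2-node Boolean network (hypothetical): x0 copies x1, x1 negates x0.
    fs = [lambda s: s[1], lambda s: 1 - s[0]]
    rng = random.Random(0)
    state = (0, 0)
    for _ in range(100):
        state = sdds_step(state, fs, prop_up=[0.9, 0.5], prop_down=[0.9, 0.5], rng=rng)
    ```

    Setting all propensities to 1 recovers the synchronous schedule; intermediate values give the stochastic dynamics whose parameters the paper estimates.
    
    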

  9. Effects of Baseline Selection on Magnetocardiography: P-Q and T-P Intervals

    International Nuclear Information System (INIS)

    Lim, Hyun Kyoon; Kwon, Hyuk Chan; Kim, Tae En; Lee, Yong Ho; Kim, Jin Mok; Kim, In Seon; Kim, Ki Woong; Park, Yong Ki

    2007-01-01

    The baseline selection is the first and an important step in analyzing magnetocardiography (MCG) parameters. There is no difficulty in selecting the baseline between the P- and Q-wave peaks (P-Q interval) of MCG waves recorded from healthy subjects, because the P-Q intervals of healthy subjects do not vary much. However, patients with ischemic heart disease often show an unstable P-Q interval, which does not seem appropriate for the baseline. In this case, the T-P interval is recommended as an alternative baseline. However, there has been no study on the difference made by the baseline selection. In this study, we examined the effect of different baseline selections. MCG data were analyzed from twenty healthy subjects and twenty-one patients whose baselines were alternatively selected in the T-P interval because of an inappropriate P-Q interval. A paired t-test was used to compare the two sets of data. Fifteen parameters derived from the R-wave peak, the T-wave peak, and the period Tmax/3 ∼ Tmax were compared for the different baseline selections. As a result, most parameters did not show significant differences (p > 0.05), except for a few parameters. Therefore, there will be no significant differences if either of the two intervals is selected for the MCG baseline. However, for consistent analysis, the P-Q interval is strongly recommended for the baseline correction.

  10. SRP Baseline Hydrogeologic Investigation, Phase 3

    Energy Technology Data Exchange (ETDEWEB)

    Bledsoe, H.W.

    1988-08-01

    The SRP Baseline Hydrogeologic Investigation was implemented for the purpose of updating and improving the knowledge and understanding of the hydrogeologic systems underlying the SRP site. Phase III, which is discussed in this report, includes the drilling of 7 deep coreholes (sites P-24 through P-30) and the installation of 53 observation wells ranging in depth from approximately 50 ft to more than 970 ft below the ground surface. In addition to the collection of geologic cores for lithologic and stratigraphic study, samples were also collected for the determination of physical characteristics of the sediments and for the identification of microorganisms.

  11. Updating systematic reviews: an international survey.

    Directory of Open Access Journals (Sweden)

    Chantelle Garritty

    a formal written policy for updating SRs. This research marks the first baseline data available on updating from an organizational perspective.

  12. Key Update Assistant for Resource-Constrained Networks

    DEFF Research Database (Denmark)

    Yuksel, Ender; Nielson, Hanne Riis; Nielson, Flemming

    2012-01-01

    developed a push-button solution - powered by stochastic model checking - that network designers can easily benefit from, and it paves the way for consumers to set up key update related security parameters. Key Update Assistant, as we named it, runs necessary model checking operations and determines...

  13. CSTT Update: Fuel Quality Analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Brosha, Eric L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Lujan, Roger W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Mukundan, Rangachary [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rockward, Tommy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Romero, Christopher J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Williams, Stefan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Wilson, Mahlon S. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-02-06

    These are slides from a presentation. The following topics are covered: project background (scope and approach), developing the prototype (timeline), update on intellectual property, analyzer comparisons (improving humidification, stabilizing the baseline, applying clean-up strategy, impact of ionomer content and improving clean-up), proposed operating mode, considerations for testing in real-world conditions (Gen 1 analyzer electronics development, testing partner identified, field trial planning), summary, and future work.

  14. Estimating Propensity Parameters using Google PageRank and Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    David Murrugarra

    2016-11-01

    Full Text Available Stochastic Boolean networks, or more generally, stochastic discrete networks, are an important class of computational models for molecular interaction networks. The stochasticity stems from the updating schedule. Standard updating schedules include the synchronous update, where all the nodes are updated at the same time, and the asynchronous update, where a random node is updated at each time step. The former produces a deterministic dynamics while the latter a stochastic dynamics. A more general stochastic setting considers propensity parameters for updating each node. Stochastic Discrete Dynamical Systems (SDDS) are a modeling framework that considers two propensity parameters for updating each node and uses one when the update has a positive impact on the variable, that is, when the update causes the variable to increase its value, and uses the other when the update has a negative impact, that is, when the update causes it to decrease its value. This framework offers additional features for simulations but also adds a complexity in parameter estimation of the propensities. This paper presents a method for estimating the propensity parameters for SDDS. The method is based on adding noise to the system using the Google PageRank approach to make the system ergodic and thus guaranteeing the existence of a stationary distribution. Then with the use of a genetic algorithm, the propensity parameters are estimated. Approximation techniques that make the search algorithms efficient are also presented, and Matlab/Octave code to test the algorithms is available at http://www.ms.uky.edu/~dmu228/GeneticAlg/Code.html.

  15. MALDI-TOF Baseline Drift Removal Using Stochastic Bernstein Approximation

    Directory of Open Access Journals (Sweden)

    Howard Daniel

    2006-01-01

    Full Text Available Stochastic Bernstein (SB) approximation can tackle the problem of baseline drift correction of instrumentation data. This is demonstrated for spectral data: matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF) data. Two SB schemes for removing the baseline drift are presented: iterative and direct. Following an explanation of the origin of the MALDI-TOF baseline drift, which sheds light on the inherent difficulty of its removal by chemical means, SB baseline drift removal is illustrated for both proteomics and genomics MALDI-TOF data sets. SB is an elegant signal processing method that yields a numerically straightforward baseline shift removal, as it includes a free parameter that can be optimized for different baseline drift removal applications. Therefore, research that determines putative biomarkers from the spectral data might benefit from a sensitivity analysis of the underlying spectral measurement that is made possible by varying the SB free parameter, which can be tuned manually or with evolutionary computation.
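    The idea of using a low-order Bernstein approximation as a smooth baseline estimate can be sketched as below. This is a deterministic toy variant, not the stochastic scheme of the paper, and the synthetic "spectrum" is invented for illustration.

    ```python
    import math

    def bernstein_smooth(y, degree):
        """Low-degree Bernstein approximation of a sampled signal, used here as
        a smooth baseline estimate: B(x) = sum_k y(k/deg) C(deg,k) x^k (1-x)^(deg-k)."""
        n = len(y) - 1
        out = []
        for j in range(len(y)):
            x = j / n
            s = 0.0
            for k in range(degree + 1):
                yk = y[round(k / degree * n)]   # sample the signal at k/degree of its span
                s += yk * math.comb(degree, k) * x**k * (1 - x) ** (degree - k)
            out.append(s)
        return out

    # Synthetic spectrum: slow quadratic drift plus one sharp peak (illustrative).
    y = [0.5 * (j / 99) ** 2 + (2.0 if j == 40 else 0.0) for j in range(100)]
    baseline = bernstein_smooth(y, degree=4)
    corrected = [yi - bi for yi, bi in zip(y, baseline)]
    ```

    The low degree plays the role of the free smoothing parameter: it follows the slow drift while leaving the sharp peak largely intact after subtraction.
    
    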

  16. FY2016 Update on ILAW Glass Testing for Disposal at IDF

    Energy Technology Data Exchange (ETDEWEB)

    Brown, E. E. [Hanford Site (HNF), Richland, WA (United States); Swanberg, D. J. [Hanford Site (HNF), Richland, WA (United States); Muller, Isabelle S. [The Catholic Univ. of America, Washington, DC (United States); Pegg, Ian L. [The Catholic Univ. of America, Washington, DC (United States)

    2017-04-12

    This status report provides a FY2016 update on work performed to collect information on the corrosion behavior of LAW glasses to support the IDF PA. In addition to the development of the baseline operating envelope for the WTP, since 2003, VSL has developed a wide range of LAW formulations that achieve considerably higher waste loadings than the WTP baseline formulations.

  17. The use of autecological and environmental parameters for establishing the status of lichen vegetation in a baseline study for a long-term monitoring survey

    International Nuclear Information System (INIS)

    Gombert, S.; Asta, J.; Seaward, M.R.D.

    2005-01-01

    In 1997 the ecological characteristics of the epiphytic species (83 lichens and two algae) of an urban area (Grenoble, France) were determined. Seven autecological indices were used to characterize the lichen ecology: illumination index, humidity index, pH of bark, nutrient status of substratum, ecological index of IAP and frequency. Six clusters (A1-A6) were defined using cluster analysis and principal component analysis. Seven environmental parameters characterizing the stations and the lichen releves were also used: elevation, parameters of artificiality (urbanization, traffic and local land use), IAP, and the percentage of nitrophytic and acidophytic species. Six clusters (B1-B6) were defined using cluster analysis and canonical correspondence analysis. Four clusters (C1-C4) were finally defined using an empirical integrated method combining the autecological and environmental parameters. This final clustering which established the status of the lichen vegetation in 1997 can be reliably used as a baseline study to effectively monitor environmental changes in this urban area. - Ecological clustering which establishes the status of lichen vegetation can be reliably used as a baseline study to monitor environmental changes

  18. Office of Geologic Repositories program baseline procedures notebook (OGR/B-1)

    International Nuclear Information System (INIS)

    1986-06-01

    Baseline management is typically applied to aid in the internal control of a program by providing consistent programmatic direction, control, and surveillance to an evolving system development. This fundamental concept of internal program control involves the establishment of a baseline to serve as a point of departure for consistent technical program coordination and to control subsequent changes from that baseline. The existence of a program-authorized baseline ensures that all participants are working to the same ground rules. Baseline management also ensures that, once the baseline is defined, changes are assessed and approved by a process which ensures adequate consideration of overall program impact. Baseline management also includes the consideration of exemptions from the baseline. The process of baseline management continues through all the phases of an evolving system development program. As the program proceeds, there will be a progressive increase in the data contained in the baseline documentation. Baseline management has been selected as a management technique to aid in the internal control of the Office of Geologic Repositories (OGR) program. Specifically, an OGR Program Baseline, including technical and programmatic requirements, is used for program control of the four Mined Geologic Disposal System field projects, i.e., the Basalt Waste Isolation Project, the Nevada Nuclear Waste Storage Investigations, the Salt Repository Project, and the Crystalline Repository Project. This OGR Program Baseline Procedures Notebook provides a description of the baseline management concept, establishes the OGR Program baseline itself, and provides procedures to be followed for controlling changes to that baseline. The notebook has a controlled distribution and will be updated as required.

  19. Baseline projections of transportation energy consumption by mode: 1981 update

    Energy Technology Data Exchange (ETDEWEB)

    Millar, M; Bunch, J; Vyas, A; Kaplan, M; Knorr, R; Mendiratta, V; Saricks, C

    1982-04-01

    A comprehensive set of activity and energy-demand projections for each of the major transportation modes and submodes is presented. Projections are developed for a business-as-usual scenario, which provides a benchmark for assessing the effects of potential conservation strategies. This baseline scenario assumes a continuation of present trends, including fuel-efficiency improvements likely to result from current efforts of vehicle manufacturers. Because of anticipated changes in fuel efficiency, fuel price, modal shifts, and a lower-than-historic rate of economic growth, projected growth rates in transportation activity and energy consumption depart from historic patterns. The text discusses the factors responsible for this departure, documents the assumptions and methodologies used to develop the modal projections, and compares the projections with other efforts.

  20. Updated fit to three neutrino mixing: exploring the accelerator-reactor complementarity

    Energy Technology Data Exchange (ETDEWEB)

    Esteban, Ivan [Departament de Fisíca Quàntica i Astrofísica and Institut de Ciencies del Cosmos,Universitat de Barcelona, Diagonal 647, E-08028 Barcelona (Spain); Gonzalez-Garcia, M.C. [Departament de Fisíca Quàntica i Astrofísica and Institut de Ciencies del Cosmos,Universitat de Barcelona, Diagonal 647, E-08028 Barcelona (Spain); Institució Catalana de Recerca i Estudis Avançats (ICREA),Pg. Lluis Companys 23, 08010 Barcelona (Spain); C.N. Yang Institute for Theoretical Physics, State University of New York at Stony Brook,Stony Brook, NY 11794-3840 (United States); Maltoni, Michele; Martinez-Soler, Ivan [Instituto de Física Teórica UAM/CSIC, Universidad Autónoma de Madrid,Calle de Nicolás Cabrera 13-15, Cantoblanco, E-28049 Madrid (Spain); Schwetz, Thomas [Institut für Kernphysik, Karlsruher Institut für Technologie (KIT),D-76021 Karlsruhe (Germany)

    2017-01-20

    We perform a combined fit to global neutrino oscillation data available as of fall 2016 in the scenario of three-neutrino oscillations and present updated allowed ranges of the six oscillation parameters. We discuss the differences arising between the consistent combination of the data samples from accelerator and reactor experiments compared to partial combinations. We quantify the confidence in the determination of the less precisely known parameters θ23, δCP, and the neutrino mass ordering by performing a Monte Carlo study of the long baseline accelerator and reactor data. We find that the sensitivity to the mass ordering and the θ23 octant is below 1σ. Maximal θ23 mixing is allowed at slightly more than 90% CL. The best fit for the CP violating phase is around 270°, CP conservation is allowed at slightly above 1σ, and values of δCP ≃ 90° are disfavored at around 99% CL for normal ordering and higher CL for inverted ordering.

  1. FEM Updating of the Heritage Court Building Structure

    DEFF Research Database (Denmark)

    Ventura, C. E.; Brincker, Rune; Dascotte, E.

    2001-01-01

    This paper describes results of a model updating study conducted on a 15-storey reinforced concrete shear core building. The output-only modal identification results obtained from ambient vibration measurements of the building were used to update a finite element model of the structure. The starting model of the structure was developed from the information provided in the design documentation of the building. Different parameters of the model were then modified using an automated procedure to improve the correlation between measured and calculated modal parameters. Careful attention (…)

  2. Accelerated Best Basis Inventory Baselining Task

    International Nuclear Information System (INIS)

    SASAKI, L.M.

    2001-01-01

    The baselining effort was recently proposed to bring the Best-Basis Inventory (BBI) and Question No. 8 of the Tank Interpretive Report (TIR) for all 177 tanks to the current standards and protocols and to prepare a TIR Question No. 8 if one is not already available. This plan outlines the objectives and methodology of the accelerated BBI baselining task. BBI baselining meetings held during December 2000 resulted in a revised BBI methodology and an initial set of BBI creation rules to be used in the baselining effort. The objectives of the BBI baselining effort are to: (1) Provide inventories that are consistent with the revised BBI methodology and new BBI creation rules. (2) Split the total tank waste in each tank into six waste phases, as appropriate (supernatant, saltcake solids, saltcake liquid, sludge solids, sludge liquid, and retained gas). In some tanks, the solids and liquid portions of the sludge and/or saltcake may be combined into a single sludge or saltcake phase. (3) Identify sampling events that are to be used for calculating the BBIs. (4) Update waste volumes for subsequent reconciliation with the Hanlon (2001) waste tank summary. (5) Implement new waste type templates. (6) Include any sample data that might have been unintentionally omitted in the previous BBI and remove any sample data that should not have been included. Sample data to be used in the BBI must be available on TWINS. (7) Ensure that an inventory value for each standard BBI analyte is provided for each waste component. Sample-based inventories for supplemental BBI analytes will be included when available. (8) Provide new means and confidence interval reports if they are not already available, and include uncertainties in reported inventory values.

  3. A review on model updating of joint structure for dynamic analysis purpose

    Directory of Open Access Journals (Sweden)

    Zahari S.N.

    2016-01-01

    Full Text Available Structural joints provide connections between structural elements (beams, plates, etc.) in order to construct a whole assembled structure. There are many types of structural joints, such as bolted joints, riveted joints, and welded joints. Joint structures contribute significantly to structural stiffness and to the dynamic behaviour of structures; hence, the main objectives of this paper are to review methods of model updating for joint structures and to discuss guidelines for performing model updating for dynamic analysis purposes. This review paper first outlines some of the existing finite element modelling work on joint structures. Experimental modal analysis is the next step, used to obtain modal parameters (natural frequency and mode shape) with which to validate and reduce the discrepancy between the experimental results and their simulation counterparts. Model updating is then carried out to minimize the differences between the two sets of results. There are two methods of model updating: the direct method and the iterative method. Sensitivity analysis can be employed, using SOL200 in NASTRAN, by selecting suitable updating parameters to avoid ill-conditioning problems. It is best to consider both geometrical and material properties in the updating procedure rather than choosing only a number of geometrical properties alone. The iterative method is the preferred model updating procedure because the physical meaning of the updated parameters is guaranteed, although it requires more computational effort than the direct method.

  4. Nonsynchronous updating in the multiverse of cellular automata.

    Science.gov (United States)

    Reia, Sandro M; Kinouchi, Osame

    2015-04-01

    In this paper we study updating effects on the cellular automata rule space. We consider a subset of 6144 order-3 automata from the space of 262144 two-dimensional outer-totalistic rules. We compare synchronous to asynchronous and sequential updating. Focusing on two automata, we discuss how changes of update scheme destroy typical structures of these rules. In addition, we show that the first-order phase transition in the multiverse of synchronous cellular automata, revealed with the use of a recently introduced control parameter, seems to be robust not only to changes in update scheme but also to different initial densities.
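    The synchronous-versus-asynchronous distinction studied here can be sketched with an outer-totalistic rule; the sketch below uses Conway's Life as a stand-in, and the grid and still-life example are purely illustrative.

    ```python
    import random

    def life_rule(alive, n_alive):
        """Conway's Life as one example of an outer-totalistic rule."""
        return 1 if (n_alive == 3 or (alive and n_alive == 2)) else 0

    def neighbors(grid, i, j):
        size = len(grid)
        return sum(grid[(i + di) % size][(j + dj) % size]
                   for di in (-1, 0, 1) for dj in (-1, 0, 1)
                   if (di, dj) != (0, 0))

    def step_synchronous(grid):
        """All cells read the same old configuration, then update together."""
        size = len(grid)
        return [[life_rule(grid[i][j], neighbors(grid, i, j)) for j in range(size)]
                for i in range(size)]

    def step_asynchronous(grid, rng):
        """One randomly chosen cell is updated in place per micro-step."""
        size = len(grid)
        i, j = rng.randrange(size), rng.randrange(size)
        grid[i][j] = life_rule(grid[i][j], neighbors(grid, i, j))
        return grid

    # A 2x2 block is a still life, so it survives either update scheme.
    g = [[0] * 5 for _ in range(5)]
    for i, j in [(1, 1), (1, 2), (2, 1), (2, 2)]:
        g[i][j] = 1
    g2 = step_synchronous(g)
    g_async = step_asynchronous([row[:] for row in g2], random.Random(0))
    ```

    Structures that are not still lifes generally behave very differently under the two schemes, which is the effect the paper quantifies across the rule space.
    
    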

  6. The EURISOL Beta-beam Facility: Parameter and Intensity Values, Version 2

    CERN Document Server

    Benedikt, M; Lindroos, M; Fabich, A

    An initial “bottom-up” analysis of ion intensities along the accelerator chain is revised to take into account more recent simulations of the stacking of 18Ne ions in the decay ring and beneficial trends in output flux as functions of certain machine parameters. In addition, space charge detuning at injection in the PS has led to a rethink of the top energy of the RCS, while that at injection in the SPS has had an impact on the number of bunches per batch delivered by the PS. We present transverse emittance values (which enter the space charge tune shift calculations) together with an updated list of intensities for both ion species under consideration in the baseline scenario.

  7. A proposal to create an extension to the European baseline series.

    Science.gov (United States)

    Wilkinson, Mark; Gallo, Rosella; Goossens, An; Johansen, Jeanne D; Rustemeyer, Thomas; Sánchez-Pérez, Javier; Schuttelaar, Marie L; Uter, Wolfgang

    2018-02-01

    The current European baseline series consists of 30 allergens, and was last updated in 2015. To use data from the European Surveillance System on Contact Allergies (ESSCA) to propose an extension to the European baseline series in response to changes in environmental exposures. Data from departmental and national extensions to the baseline series, together with some temporary additions from departments contributing to the ESSCA, were collated during 2013-2014. In total, 31689 patients were patch tested in 46 European departments. Many departments and national groups already consider the current European baseline series to be a suboptimal screen, and use their own extensions to it. The haptens tested are heterogeneous, although there are some consistent themes. Potential haptens to include in an extension to the European baseline series comprise sodium metabisulfite, formaldehyde-releasing preservatives, additional markers of fragrance allergy, propolis, Compositae mix, and 2-hydroxyethyl methacrylate. In combination with other published work from the ESSCA, changes to the current European baseline series are proposed for discussion. As well as addition of the allergens listed above, it is suggested that primin and clioquinol should be deleted from the series, owing to reduced environmental exposure. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  8. A general framework for updating belief distributions.

    Science.gov (United States)

    Bissiri, P G; Holmes, C C; Walker, S G

    2016-11-01

    We propose a framework for general Bayesian inference. We argue that a valid update of a prior belief distribution to a posterior can be made for parameters which are connected to observations through a loss function rather than the traditional likelihood function, which is recovered as a special case. Modern application areas make it increasingly challenging for Bayesians to attempt to model the true data-generating mechanism. For instance, when the object of interest is low dimensional, such as a mean or median, it is cumbersome to have to achieve this via a complete model for the whole data distribution. More importantly, there are settings where the parameter of interest does not directly index a family of density functions and thus the Bayesian approach to learning about such parameters is currently regarded as problematic. Our framework uses loss functions to connect information in the data to functionals of interest. The updating of beliefs then follows from a decision theoretic approach involving cumulative loss functions. Importantly, the procedure coincides with Bayesian updating when a true likelihood is known yet provides coherent subjective inference in much more general settings. Connections to other inference frameworks are highlighted.
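    The loss-based update described in this record, posterior ∝ prior × exp(−w · cumulative loss), often called a Gibbs posterior, can be sketched on a grid for the median, using absolute-error loss in place of a likelihood. All values below are illustrative.

    ```python
    import math

    def gibbs_posterior(grid, prior, data, loss, w=1.0):
        """General Bayesian update: posterior(θ) ∝ prior(θ) · exp(-w · Σ loss(θ, x)).
        With negative log-likelihood as the loss and w = 1, this reduces to
        standard Bayesian updating."""
        scores = [math.exp(-w * sum(loss(t, x) for x in data)) * p
                  for t, p in zip(grid, prior)]
        z = sum(scores)
        return [s / z for s in scores]

    # Target: the median, connected to the data through absolute-error loss;
    # no full model of the data-generating mechanism is needed.
    theta_grid = [i / 10 for i in range(-50, 51)]
    flat_prior = [1 / len(theta_grid)] * len(theta_grid)
    data = [0.9, 1.1, 1.0, 4.0]   # one outlier
    post = gibbs_posterior(theta_grid, flat_prior, data,
                           loss=lambda t, x: abs(t - x))
    theta_hat = theta_grid[post.index(max(post))]
    ```

    The posterior mode sits at the sample median and is insensitive to the outlier, illustrating why targeting a functional directly can be preferable to modeling the whole data distribution.
    
    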

  9. The full potential of the baseline SASE undulators of the European XFEL

    International Nuclear Information System (INIS)

    Agapov, Ilya; Geloni, Gianluca; Feng, Guangyao; Kocharyan, Vitali; Saldin, Evgeni; Serkez, Svitozar; Zagorodnov, Igor

    2014-04-01

The output SASE characteristics of the baseline European XFEL, recently used in the TDRs of scientific instruments and X-ray optics, have been previously optimized assuming uniform undulators, without considering the potential of undulator tapering in the SASE regime. Here we demonstrate that the performance of European XFEL sources can be significantly improved without additional hardware. The procedure consists simply of optimizing the undulator gap configuration for each X-ray beamline. Here we provide a comprehensive description of the soft X-ray photon beam properties as a function of wavelength and bunch charge. Based on nominal parameters for the electron beam, we demonstrate that undulator tapering allows one to achieve up to a tenfold increase in peak power and photon spectral density in the conventional SASE regime. We illustrate this fact for the SASE3 beamline. The FEL code Genesis has been extensively used for these studies. Based on these findings we suggest that the requirements for the SASE3 instruments (SCS, SQS) and for the SASE3 beam transport system be updated.

  10. Prediction-error variance in Bayesian model updating: a comparative study

    Science.gov (United States)

    Asadollahi, Parisa; Li, Jian; Huang, Yong

    2017-04-01

In Bayesian model updating, the likelihood function is commonly formulated by stochastic embedding, in which the maximum-information-entropy probability model of the prediction errors plays an important role: a Gaussian distribution constrained by its first two moments. The selection of prediction error variances can be formulated as a model class selection problem, which automatically involves a trade-off between the average data-fit of the model class and the information it extracts from the data. The treatment of the prediction error variances is therefore critical for robust updating of the structural model, especially in the presence of modeling errors. To date, three ways of handling prediction error variances have been seen in the literature: 1) setting constant values empirically, 2) estimating them based on the goodness-of-fit of the measured data, and 3) updating them as uncertain parameters by applying Bayes' Theorem at the model class level. In this paper, the effect of these different strategies on model updating performance is investigated explicitly. A six-story shear building model with six uncertain stiffness parameters is employed as an illustrative example. Transitional Markov Chain Monte Carlo is used to draw samples of the posterior probability density function of the structural model parameters as well as the uncertain prediction variances. Different levels of modeling uncertainty and complexity are represented by three FE models: a true model, a model with more complexity, and a model with modeling error. Bayesian updating is performed for the three FE models considering the three aforementioned treatments of the prediction error variances. The effect of the number of measurements on the model updating performance is also examined in the study. The results are compared based on model class assessment and indicate that updating the prediction error variances as uncertain parameters at the model
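As a minimal sketch of treatment (3), the prediction-error variance can itself be carried as an uncertain parameter and updated jointly with the model parameter. The toy scalar model, synthetic data, and flat priors below are assumptions for illustration only (and brute-force grid evaluation stands in for Transitional MCMC):

```python
import numpy as np

# Joint grid posterior over a model parameter theta and the prediction-error
# variance sigma^2, with a Gaussian prediction-error likelihood.

rng = np.random.default_rng(6)
theta_true, sigma_true = 3.0, 0.4
y = theta_true + sigma_true * rng.standard_normal(30)   # synthetic measurements

theta = np.linspace(0.0, 6.0, 301)
sig2 = np.linspace(0.01, 1.0, 200)
T, S2 = np.meshgrid(theta, sig2, indexing="ij")

# Log-likelihood of all measurements: residuals ~ N(0, sigma^2)
resid2 = np.sum((y[:, None, None] - T[None]) ** 2, axis=0)
loglik = -0.5 * y.size * np.log(2 * np.pi * S2) - resid2 / (2 * S2)

post = np.exp(loglik - loglik.max())        # flat priors; unnormalized
post /= post.sum()

theta_marg = post.sum(axis=1)               # marginal over sigma^2
theta_map = theta[np.argmax(theta_marg)]
```

Marginalizing over sigma^2 rather than fixing it empirically is what distinguishes treatment (3) from treatments (1) and (2).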

  11. Optimal updating magnitude in adaptive flat-distribution sampling.

    Science.gov (United States)

    Zhang, Cheng; Drake, Justin A; Ma, Jianpeng; Pettitt, B Montgomery

    2017-11-07

    We present a study on the optimization of the updating magnitude for a class of free energy methods based on flat-distribution sampling, including the Wang-Landau (WL) algorithm and metadynamics. These methods rely on adaptive construction of a bias potential that offsets the potential of mean force by histogram-based updates. The convergence of the bias potential can be improved by decreasing the updating magnitude with an optimal schedule. We show that while the asymptotically optimal schedule for the single-bin updating scheme (commonly used in the WL algorithm) is given by the known inverse-time formula, that for the Gaussian updating scheme (commonly used in metadynamics) is often more complex. We further show that the single-bin updating scheme is optimal for very long simulations, and it can be generalized to a class of bandpass updating schemes that are similarly optimal. These bandpass updating schemes target only a few long-range distribution modes and their optimal schedule is also given by the inverse-time formula. Constructed from orthogonal polynomials, the bandpass updating schemes generalize the WL and Langfeld-Lucini-Rago algorithms as an automatic parameter tuning scheme for umbrella sampling.
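A minimal sketch of single-bin flat-distribution sampling with the inverse-time updating schedule follows; the discrete toy potential, step count, and schedule switch are illustrative assumptions, not the systems studied in the paper.

```python
import numpy as np

# Wang-Landau-style flat-distribution sampling: the bias potential is built
# by single-bin histogram updates whose magnitude f follows the
# asymptotically optimal inverse-time schedule f = n_bins / t.

rng = np.random.default_rng(1)
n = 8
pmf = np.array([0.0, 1.0, 2.5, 4.0, 2.5, 1.0, 0.5, 3.0])  # toy potential of mean force

total = 200_000
moves = rng.choice((-1, 1), size=total)     # proposed ring moves
unif = rng.random(total)                    # acceptance draws

bias = np.zeros(n)                          # adaptive bias, offsets the pmf
f = 1.0                                     # initial updating magnitude
state = 0
for t in range(1, total + 1):
    # Metropolis move on the biased weights exp(-(pmf + bias))
    prop = (state + moves[t - 1]) % n
    d = (pmf[prop] + bias[prop]) - (pmf[state] + bias[state])
    if d <= 0 or unif[t - 1] < np.exp(-d):
        state = prop
    f = min(f, n / t)                       # inverse-time schedule kicks in once n/t < f
    bias[state] += f                        # single-bin update discourages revisits

est = bias.max() - bias                     # recovered pmf, up to an additive constant
```

At convergence the biased distribution is flat, so the bias reproduces the potential of mean force up to a constant.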

  12. Item response theory analysis of the mechanics baseline test

    Science.gov (United States)

    Cardamone, Caroline N.; Abbott, Jonathan E.; Rayyan, Saif; Seaton, Daniel T.; Pawl, Andrew; Pritchard, David E.

    2012-02-01

Item response theory is useful in both the development and evaluation of assessments and in computing standardized measures of student performance. In item response theory, individual parameters (difficulty, discrimination) for each item or question are fit by item response models. These parameters provide a means for evaluating a test and offer a better measure of student skill than a raw test score, because each skill calculation considers not only the number of questions answered correctly, but also the individual properties of all questions answered. Here, we present the results from an analysis of the Mechanics Baseline Test given at MIT during 2005-2010. Using the item parameters, we identify questions on the Mechanics Baseline Test that are not effective in discriminating between MIT students of different abilities. We show that a limited subset of the highest quality questions on the Mechanics Baseline Test returns accurate measures of student skill. We compare student skills as determined by item response theory to the more traditional measurement of the raw score and show that a comparable measure of learning gain can be computed.
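The difficulty/discrimination parameterization can be sketched with the two-parameter logistic (2PL) model; the parameter values below are illustrative, not fitted MBT values.

```python
import numpy as np

# Two-parameter logistic (2PL) item response model: probability that a
# student of skill theta answers an item correctly, given the item's
# difficulty b and discrimination a.

def p_correct(theta, a, b):
    """2PL item characteristic curve."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information: discriminating items are most informative near theta = b."""
    p = p_correct(theta, a, b)
    return a ** 2 * p * (1.0 - p)

theta = np.linspace(-3.0, 3.0, 601)
good_item = item_information(theta, a=2.0, b=0.0)   # sharply discriminating
poor_item = item_information(theta, a=0.4, b=0.0)   # barely discriminates

# A poorly discriminating item contributes little information at any skill
# level, which is the kind of criterion used to flag weak test questions.
```

Summing item information over the answered questions gives the precision of the skill estimate, which is why a small subset of high-quality items can measure skill accurately.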

  13. Non-linear Bayesian update of PCE coefficients

    KAUST Repository

    Litvinenko, Alexander

    2014-01-06

Given: a physical system modeled by a PDE or ODE with uncertain coefficient q(ω), and a measurement operator Y(u(q), q), where u(q, ω) is the uncertain solution. Aim: to identify q(ω). The mapping from parameters to observations is usually not invertible, hence this inverse identification problem is generally ill-posed. To identify q(ω) we derived a non-linear Bayesian update from the variational problem associated with conditional expectation. To reduce the cost of the Bayesian update we offer a functional approximation, e.g. a polynomial chaos expansion (PCE). New: we apply the Bayesian update to the PCE coefficients of the random coefficient q(ω) (not to the probability density function of q).

  14. Non-linear Bayesian update of PCE coefficients

    KAUST Repository

    Litvinenko, Alexander; Matthies, Hermann G.; Pojonk, Oliver; Rosic, Bojana V.; Zander, Elmar

    2014-01-01

Given: a physical system modeled by a PDE or ODE with uncertain coefficient q(ω), and a measurement operator Y(u(q), q), where u(q, ω) is the uncertain solution. Aim: to identify q(ω). The mapping from parameters to observations is usually not invertible, hence this inverse identification problem is generally ill-posed. To identify q(ω) we derived a non-linear Bayesian update from the variational problem associated with conditional expectation. To reduce the cost of the Bayesian update we offer a functional approximation, e.g. a polynomial chaos expansion (PCE). New: we apply the Bayesian update to the PCE coefficients of the random coefficient q(ω) (not to the probability density function of q).

  15. Geodesy by radio interferometry - Determinations of baseline vector, earth rotation, and solid earth tide parameters with the Mark I very long baseline radio interferometry system

    Science.gov (United States)

    Ryan, J. W.; Clark, T. A.; Coates, R. J.; Ma, C.; Wildes, W. T.

    1986-01-01

    Thirty-seven very long baseline radio interferometry experiments performed between 1972 and 1978 are analyzed, and estimates of baseline vectors between six sites, five in the continental United States and one in Europe, are derived. No evidence of significant changes in baseline length is found. For example, with a statistical level of confidence of approximately 85 percent, upper bounds on such changes within the United States ranged from a low of 10 mm/yr for the 850 km baseline between Westford, Massachusetts, and Green Bank, West Virginia, to a high of 90 mm/yr for the nearly 4000 km baseline between Westford and Goldstone, California. Estimates for universal time and for the x component of the position of the earth's pole are obtained. For the last 15 experiments, the only ones employing wideband receivers, the root-mean-square differences between the derived values and the corresponding ones published by the Bureau International de l'Heure are 0.0012 s and 0.018 arc sec, respectively. The average value obtained for the radial Love number for the solid earth is 0.62 ± 0.02 (estimated standard error).

  16. DGEMP-OE (2008) Energy Baseline Scenario. Synthesis report

    International Nuclear Information System (INIS)

    2008-01-01

    A 'Business as usual' or 'Baseline' scenario of energy trends to 2020-2030 is produced by France every four years, as requested by the International Energy Agency in order to update the global scenarios published in its World Energy Outlook. Since the most recent scenario of this type was drawn up in 2003-2004, the time has come to renew the effort for the IEA's next in-depth review of French energy policy. Specifically, the DGEMP seeks to predict the future of France's energy situation assuming that no policies or new measures are taken affecting (i.e. improving or deteriorating) the situation other than those already in place or adopted as of 1 January 2008 (in other words, before measures such as those stemming from the Grenelle Environment Forum). On the other hand, it is assumed that change in the energy system is guided by 'conventional wisdom', according to which political options and behaviours by economic units are expected to be 'reasonable'. As a result, even should its projections prove inappropriate, this cannot be considered a 'worst-case' scenario. Indeed, beyond the IEA, this scenario can be used to establish an MEA (Multilateral Environment Agreement) scenario (based on existing measures) for national communications submitted under the U.N. Climate Convention. The scenarios by the 'Energy' Commission, part of the Centre d'Analyse Strategique (CAS), could have been used, particularly since the consultant who worked with the CAS to develop its scenarios was also commissioned by the DGEMP. However, several considerations argued in favour of proceeding separately: - The CAS scenarios drew on the DGEMP's 2004 baseline scenario, even though certain parameters were updated (in particular energy prices). - Moreover, the concept underpinning the DGEMP baseline scenario is that it should, to every extent possible, remain constant over time to secure continued consensus on this 'reference' at national level. - Finally, the MEDEE energy demand model applied in

  17. Estimation of beam material random field properties via sensitivity-based model updating using experimental frequency response functions

    Science.gov (United States)

    Machado, M. R.; Adhikari, S.; Dos Santos, J. M. C.; Arruda, J. R. F.

    2018-03-01

    Structural parameter estimation is affected not only by measurement noise but also by unknown uncertainties which are present in the system. Deterministic structural model updating methods minimise the difference between experimentally measured data and computational prediction. Sensitivity-based methods are very efficient in solving structural model updating problems. Material and geometrical parameters of the structure such as Poisson's ratio, Young's modulus, mass density, modal damping, etc. are usually considered deterministic and homogeneous. In this paper, the distributed and non-homogeneous characteristics of these parameters are considered in the model updating. The parameters are taken as spatially correlated random fields and are expanded in a spectral Karhunen-Loève (KL) decomposition. Using the KL expansion, the spectral dynamic stiffness matrix of the beam is expanded as a series in terms of discretized parameters, which can be estimated using sensitivity-based model updating techniques. Numerical and experimental tests involving a beam with distributed bending rigidity and mass density are used to verify the proposed method. This extension of standard model updating procedures can enhance the dynamic description of structural dynamic models.
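The KL discretization step can be sketched as follows; the exponential covariance kernel, correlation length, and nominal field value are assumed for illustration, not the values identified in the paper.

```python
import numpy as np

# Truncated Karhunen-Loeve (KL) expansion of a 1D Gaussian random field,
# standing in for a spatially varying beam property (e.g. bending rigidity).

rng = np.random.default_rng(2)
x = np.linspace(0.0, 1.0, 200)             # points along the beam
ell, sigma = 0.2, 0.1                      # correlation length, std deviation
C = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / ell)

# Discrete KL modes = eigenpairs of the covariance matrix, sorted by energy.
eigval, eigvec = np.linalg.eigh(C)
order = np.argsort(eigval)[::-1]
eigval, eigvec = eigval[order], eigvec[:, order]

m = 10                                     # number of retained KL terms
xi = rng.standard_normal(m)                # random coefficients of the expansion
mean_field = 1.0                           # nominal (homogeneous) value
field = mean_field + eigvec[:, :m] @ (np.sqrt(eigval[:m]) * xi)

# In the updating procedure, the xi coefficients (not the full field) are
# the discretized parameters estimated from the measured FRFs.
```

The truncation works because the leading eigenvalues carry most of the field's variance, so a handful of xi coefficients parameterize the non-homogeneous property.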

  18. Pseudodynamic systems approach based on a quadratic approximation of update equations for diffuse optical tomography.

    Science.gov (United States)

    Biswas, Samir Kumar; Kanhirodan, Rajan; Vasu, Ram Mohan; Roy, Debasish

    2011-08-01

    We explore a pseudodynamic form of the quadratic parameter update equation for diffuse optical tomographic reconstruction from noisy data. A few explicit and implicit strategies for obtaining the parameter updates via a semianalytical integration of the pseudodynamic equations are proposed. Despite the ill-posedness of the inverse problem associated with diffuse optical tomography, adoption of the quadratic update scheme combined with the pseudotime integration appears to yield not only faster convergence but also a muted sensitivity to the regularization parameters, which include the pseudotime step size for integration. These observations are validated through reconstructions with both numerically generated and experimentally acquired data.

  19. 2016 Annual Technology Baseline (ATB)

    Energy Technology Data Exchange (ETDEWEB)

    Cole, Wesley; Kurup, Parthiv; Hand, Maureen; Feldman, David; Sigrin, Benjamin; Lantz, Eric; Stehly, Tyler; Augustine, Chad; Turchi, Craig; O'Connor, Patrick; Waldoch, Connor

    2016-09-01

    Consistent cost and performance data for various electricity generation technologies can be difficult to find and may change frequently for certain technologies. With the Annual Technology Baseline (ATB), the National Renewable Energy Laboratory provides an organized and centralized dataset that was reviewed by internal and external experts. It uses the best information from the Department of Energy laboratory's renewable energy analysts and Energy Information Administration information for conventional technologies. The ATB will be updated annually in order to provide an up-to-date repository of current and future cost and performance data. Going forward, we plan to revise and refine the values using the best available information. The ATB includes both a presentation with notes (PDF) and an associated Excel workbook. The ATB includes the following electricity generation technologies: land-based wind; offshore wind; utility-scale solar PV; concentrating solar power; geothermal power; hydropower plants (upgrades to existing facilities, powering non-powered dams, and new stream-reach development); conventional coal; coal with carbon capture and sequestration; integrated gasification combined cycle coal; natural gas combustion turbines; natural gas combined cycle; conventional biopower; and nuclear.

  20. The TDAQ Baseline Architecture

    CERN Multimedia

    Wickens, F J

    The Trigger-DAQ community is currently busy preparing material for the DAQ, HLT and DCS TDR. Over the last few weeks a very important step has been a series of meetings to complete agreement on the baseline architecture. An overview of the architecture indicating some of the main parameters is shown in figure 1. As reported at the ATLAS Plenary during the February ATLAS week, the main area where the baseline had not yet been agreed was around the Read-Out System (ROS) and details in the DataFlow. The agreed architecture has: Read-Out Links (ROLs) from the RODs using S-Link; Read-Out Buffers (ROB) sited near the RODs, mounted in a chassis - today assumed to be a PC, using PCI bus at least for configuration, control and monitoring. The baseline assumes data aggregation, in the ROB and/or at the output (which could either be over a bus or in the network). Optimization of the data aggregation will be made in the coming months, but the current model has each ROB card receiving input from 4 ROLs, and 3 such c...

  1. Digital baseline estimation method for multi-channel pulse height analyzing

    International Nuclear Information System (INIS)

    Xiao Wuyun; Wei Yixiang; Ai Xianyun

    2005-01-01

    The basic features of digital baseline estimation for multi-channel pulse height analysis are introduced. The weight function of the minimum-noise baseline filter is derived using the calculus of variations. The frequency response of this filter is also derived via the Fourier transform, and the influence of its parameters on the amplitude-frequency response characteristics is discussed. With MATLAB software, the noise voltage signal from the charge-sensitive preamplifier is simulated, and the processing effect of minimum-noise digital baseline estimation is verified. According to the results of this research, the digital baseline estimation method can estimate the baseline optimally and is well suited to digital multi-channel pulse height analysis. (authors)

  2. Damage detection in cylindrical metallic specimens by means of statistical baseline models and updated daily temperature profiles

    Science.gov (United States)

    Villamizar-Mejia, Rodolfo; Mujica-Delgado, Luis-Eduardo; Ruiz-Ordóñez, Magda-Liliana; Camacho-Navarro, Jhonatan; Moreno-Beltrán, Gustavo

    2017-05-01

    In previous works, damage detection of metallic specimens exposed to temperature changes has been achieved by using a statistical baseline model based on Principal Component Analysis (PCA) and the piezodiagnostics principle, taking temperature effects into account by augmenting the baseline model or by using several baseline models according to the current temperature. In this paper a new approach is presented, where damage detection is based on a new index that combines the Q and T2 statistical indices with current temperature measurements. Experimental tests were performed on a carbon-steel pipe of 1 m length and 1.5 inch diameter, instrumented with piezodevices acting as actuators or sensors. A PCA baseline model was obtained at a temperature of 21 °C, and the T2 and Q statistical indices were then computed for a 24 h temperature profile. Damage was simulated by adding mass at different points of the pipe between sensor and actuator. By using the combined index, the temperature contribution can be separated, and damaged cases can be differentiated graphically from undamaged ones more clearly.
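The Q and T2 indices from a PCA baseline model can be sketched as below; the baseline data are synthetic, and the combination with temperature measurements described in the paper is omitted.

```python
import numpy as np

# Q (squared prediction error) and T^2 statistics from a PCA baseline model,
# the standard indices used for statistical damage detection.

rng = np.random.default_rng(3)
baseline = rng.standard_normal((100, 6)) @ rng.standard_normal((6, 6))

mu, sd = baseline.mean(0), baseline.std(0)
Xb = (baseline - mu) / sd                      # standardized baseline data

# PCA via SVD of the baseline matrix; keep the first k principal components.
U, s, Vt = np.linalg.svd(Xb, full_matrices=False)
k = 3
P = Vt[:k].T                                   # loading matrix (6 x k)
lam = (s[:k] ** 2) / (Xb.shape[0] - 1)         # retained eigenvalues

def indices(sample):
    """T^2 (distance within the model subspace) and Q (residual) for one observation."""
    z = (sample - mu) / sd
    t = P.T @ z                                # scores
    t2 = float(np.sum(t ** 2 / lam))
    q = float(np.sum((z - P @ t) ** 2))
    return t2, q

t2_ok, q_ok = indices(baseline[0])             # an in-model (undamaged) sample
```

Damage shows up as index values exceeding thresholds fitted on the baseline; the paper's contribution is folding the temperature reading into this decision.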

  3. Aqua/Aura Updated Inclination Adjust Maneuver Performance Prediction Model

    Science.gov (United States)

    Boone, Spencer

    2017-01-01

    This presentation will discuss the updated Inclination Adjust Maneuver (IAM) performance prediction model that was developed for Aqua and Aura following the 2017 IAM series. This updated model uses statistical regression methods to identify potential long-term trends in maneuver parameters, yielding improved predictions when re-planning past maneuvers. The presentation has been reviewed and approved by Eric Moyer, ESMO Deputy Project Manager.

  4. Updated fracture incidence rates for the US version of FRAX (registered trademark)

    Science.gov (United States)

    Evaluation of results produced by the US version of FRAX (trademarked) indicates that this tool overestimates the likelihood of major osteoporotic fracture. In an attempt to correct this, we updated underlying baseline fracture rates for the model. We used US hospital discharge data from 2006 to ca...

  5. Collider baseline parameters: Milestone M1.5

    CERN Document Server

    Schulte, Daniel

    2016-01-01

    The deliverable D1.1 provided a preliminary specification of the layout and target operation parameters for the FCC-hh hadron collider concept. It serves as the basis for the studies in all work packages. This milestone summarises the outcome of the first studies of this design. The goal of the FCC hadron collider is to provide proton-proton collisions at a centre-of-mass energy of 100 TeV. The machine is compatible with ion beam operation. Assuming a nominal dipole field of 16 T, such a machine is based on a perimeter of 100 km. The machine is designed to accommodate two main proton experiments that are operated simultaneously. The machine delivers a peak luminosity of 5-30 x 10^34 cm^-2 s^-1. The layout allows for two additional special-purpose experiments.

  6. Indoor air quality in the Karns research houses: baseline measurements and impact of indoor environmental parameters on formaldehyde concentrations

    International Nuclear Information System (INIS)

    Matthews, T.G.; Fung, K.W.; Tromberg, B.J.; Hawthorne, A.R.

    1985-12-01

    Baseline indoor air quality measurements, a nine-month radon study, and an environmental parameters study examining the impact of indoor temperature (T) and relative humidity (RH) levels on formaldehyde (CH2O) concentrations have been performed in three unoccupied research homes located in Karns, Tennessee. Inter-house comparison measurements of (1) CH2O concentration, (2) CH2O emission rates from primary CH2O emission sources, (3) radon and radon daughter concentrations, and (4) air exchange rates indicate that the three homes are similar. The results of the nine-month radon study indicate indoor concentrations consistently below the EPA recommended level of 4 pCi/L. Evidence was found that crawl-space concentrations may be reduced using heat pump systems whose outdoor units circulate fresh air through the crawl-space. The modeled results of the environmental parameters study indicate approximately fourfold increases in CH2O concentrations, from 0.07 to 0.27 ppm, for seasonal T and RH conditions of 20 °C, 30% RH and 29 °C, 80% RH, respectively. Evaluation of these environmental parameters study data with steady-state CH2O concentration models, developed from laboratory studies of the environmental dependence of CH2O emissions from particleboard underlayment, indicates good correlation between the laboratory and field studies.

  7. A New Approach to Estimate Forest Parameters Using Dual-Baseline Pol-InSAR Data

    Science.gov (United States)

    Bai, L.; Hong, W.; Cao, F.; Zhou, Y.

    2009-04-01

    In POL-InSAR applications using the ESPRIT technique, it is assumed that stable scattering centres exist in the forest. However, forest observations suffer severely from volume and temporal decorrelation: the forest scatterers are not as stable as assumed, and the obtained interferometric information is less accurate than expected. Besides, the ESPRIT technique cannot identify which interferometric phases correspond to the ground and to the canopy, and it provides multiple estimates for the height between two scattering centres due to phase unwrapping. Estimation errors are therefore introduced into the forest height results. To suppress both types of error, we use dual-baseline POL-InSAR data to estimate forest height. Dual-baseline coherence optimization is applied to obtain interferometric information for stable scattering centres in the forest. From the interferometric phases for the different baselines, estimation errors caused by phase unwrapping are resolved, and other estimation errors can be suppressed as well. Experiments were performed on ESAR L-band POL-InSAR data; the results show that the proposed method provides more accurate forest heights than the ESPRIT technique.

  8. Lawrence Livermore National Laboratory Emergency Response Capability Baseline Needs Assessment Requirement Document

    Energy Technology Data Exchange (ETDEWEB)

    Sharry, J A

    2009-12-30

    This revision of the LLNL Fire Protection Baseline Needs Assessment (BNA) was prepared by John A. Sharry, LLNL Fire Marshal and LLNL Division Leader for Fire Protection, and reviewed by Martin Gresho, Sandia/CA Fire Marshal. The document follows and expands upon the format and contents of the DOE Model Fire Protection Baseline Capabilities Assessment document contained on the DOE Fire Protection Web Site, but only addresses emergency response. The original LLNL BNA was created on April 23, 1997 as a means of collecting all requirements concerning emergency response capabilities at LLNL (including response to emergencies at Sandia/CA) into one BNA document. The original BNA documented the basis for emergency response, emergency personnel staffing, and emergency response equipment over the years. The BNA has been updated and reissued five times since: in 1998, 1999, 2000, 2002, and 2004. A significant format change was made in the 2004 update of the BNA in that it was 'zero based': starting with the requirement documents, the 2004 BNA evaluated the requirements and determined minimum needs without regard to previous evaluations. This 2010 update maintains the same basic format and requirements as the 2004 BNA. In this 2010 BNA, as in the previous BNA, the document has been intentionally divided into two separate documents - the needs assessment (1) and the compliance assessment (2). The needs assessment will be referred to as the BNA and the compliance assessment will be referred to as the BNA Compliance Assessment. The primary driver for separation is that the needs assessment identifies the detailed applicable regulations (primarily NFPA Standards) for emergency response capabilities based on the hazards present at LLNL and Sandia/CA and the geographical location of the facilities. The needs assessment also identifies areas where the modification of the requirements in the applicable NFPA standards is appropriate, due to the improved fire protection

  9. Network meta-analysis of disconnected networks: How dangerous are random baseline treatment effects?

    Science.gov (United States)

    Béliveau, Audrey; Goring, Sarah; Platt, Robert W; Gustafson, Paul

    2017-12-01

    In network meta-analysis, the use of fixed baseline treatment effects (a priori independent) in a contrast-based approach is regularly preferred to the use of random baseline treatment effects (a priori dependent). That is because, often, there is not a need to model baseline treatment effects, which carry the risk of model misspecification. However, in disconnected networks, fixed baseline treatment effects do not work (unless extra assumptions are made), as there is not enough information in the data to update the prior distribution on the contrasts between disconnected treatments. In this paper, we investigate to what extent the use of random baseline treatment effects is dangerous in disconnected networks. We take 2 publicly available datasets of connected networks and disconnect them in multiple ways. We then compare the results of treatment comparisons obtained from a Bayesian contrast-based analysis of each disconnected network using random normally distributed and exchangeable baseline treatment effects to those obtained from a Bayesian contrast-based analysis of their initial connected network using fixed baseline treatment effects. For the 2 datasets considered, we found that the use of random baseline treatment effects in disconnected networks was appropriate. Because those datasets were not cherry-picked, there should be other disconnected networks that would benefit from being analyzed using random baseline treatment effects. However, there is also a risk for the normality and exchangeability assumption to be inappropriate in other datasets even though we have not observed this situation in our case study. We provide code, so other datasets can be investigated. Copyright © 2017 John Wiley & Sons, Ltd.

  10. Single-Shell Tank (SST) Retrieval Sequence Fiscal Year 2000 Update

    International Nuclear Information System (INIS)

    GARFIELD, J.S.

    2000-01-01

    This document describes the baseline single-shell tank (SST) waste retrieval sequence for the River Protection Project (RPP) updated for Fiscal Year 2000. The SST retrieval sequence identifies the proposed retrieval order (sequence), the tank selection and prioritization rationale, and planned retrieval dates for Hanford SSTs. In addition, the tank selection criteria and reference retrieval method for this sequence are discussed

  11. Co-operation and Phase Behavior under the Mixed Updating Rules

    International Nuclear Information System (INIS)

    Zhang Wen; Li Yao-Sheng; Xu Chen

    2015-01-01

    We present a model by considering two updating rules when agents play the prisoner's dilemma on a square lattice. Agents can update their strategies by referencing one of their neighbors of higher payoff under the imitation updating rule, or be directly replaced by one of their neighbors according to the death-birth updating rule. The frequency of co-operation depends on the probability q with which the imitation updating (rather than the death-birth updating) occurs, and on the game parameter b. The death-birth updating rule favors co-operation while the imitation updating rule favors defection on the lattice, although both rules suppress co-operation in the well-mixed population. Therefore a totally co-operative state may emerge when the death-birth updating is involved in the evolution and b is relatively small. We also obtain a phase diagram on the q-b plane. There are three phases on the plane: two pure phases (a totally co-operative state and a totally defective state) and a mixed phase in which both strategies coexist. Based on the pair approximation, we theoretically analyze the phase behavior and obtain quantitative agreement with the simulation results. (paper)
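A minimal simulation sketch of the mixed-updating lattice game follows. The lattice size, b, q, step count, and weak-dilemma payoff convention (C-C pays 1, D exploiting C pays b, all else 0) are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

# Prisoner's dilemma on a periodic square lattice with two updating rules,
# mixed with probability q: imitation of a higher-payoff neighbour vs.
# death-birth replacement by a random neighbour's strategy.

rng = np.random.default_rng(4)
L, b, q, steps = 20, 1.05, 0.5, 200
strat = rng.integers(0, 2, size=(L, L))      # 1 = co-operator, 0 = defector

def payoff(s):
    """Total payoff of each site against its 4 von Neumann neighbours."""
    p = np.zeros_like(s, dtype=float)
    for shift in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        n = np.roll(s, shift, axis=(0, 1))
        # co-operators earn 1 per co-operating neighbour; defectors earn b
        p += np.where(s == 1, n * 1.0, n * b)
    return p

for _ in range(steps):
    p = payoff(strat)
    i, j = rng.integers(L), rng.integers(L)                    # random focal site
    di, dj = ((1, 0), (-1, 0), (0, 1), (0, -1))[rng.integers(4)]
    ni, nj = (i + di) % L, (j + dj) % L                        # random neighbour
    if rng.random() < q:
        # imitation: copy the neighbour only if it earns a higher payoff
        if p[ni, nj] > p[i, j]:
            strat[i, j] = strat[ni, nj]
    else:
        # death-birth: the site is replaced by the neighbour's strategy
        strat[i, j] = strat[ni, nj]

coop_frequency = strat.mean()
```

Sweeping q and b in such a simulation is how the q-b phase diagram described above would be mapped out numerically.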

  12. Task completion report for update DTPCON

    Energy Technology Data Exchange (ETDEWEB)

    Steinke, R.G.

    1997-09-10

    Update DTPCON provides TRAC-P Version 5.4.28 with a Namelist variable IDTPC option to input 21 timestep-control parameter constants and 3 switch control variables. The 21 timestep-control parameter constants are a defined part of the internal timestep-size control criteria. The 3 switch control variables prevent a timestep reduction when the outer iteration fails to converge, when the timestep size needs to be reduced below DTMIN, and when a backup evaluation is to be performed.

  13. Numerical model updating technique for structures using firefly algorithm

    Science.gov (United States)

    Sai Kubair, K.; Mohan, S. C.

    2018-03-01

    Numerical model updating is a technique for calibrating numerical models of structures in civil, mechanical, automotive, marine, aerospace engineering, etc., so that they closely match experimental data obtained from real or prototype test structures. The present work involves the development of a numerical model using MATLAB as a computational tool, with mathematical equations that define the experimental model. The firefly algorithm is used as an optimization tool in this study. In this updating process a response parameter of the structure has to be chosen, which helps to correlate the numerical model developed with the experimental results obtained. The variables for the updating can be either material or geometrical properties of the model, or both. In this study, to verify the proposed technique, a cantilever beam is analyzed for its tip deflection and a space frame is analyzed for its natural frequencies. Both models are updated with their respective response values obtained from experimental results. The numerical results after updating show that a close relationship can be brought between the experimental and numerical models.
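The firefly-based updating loop can be sketched as below. The two-parameter toy model, the algorithm constants (beta0, gamma, alpha), and the parameter bounds are all illustrative assumptions, not the cantilever/space-frame setup of the paper.

```python
import numpy as np

# Firefly-algorithm sketch for model updating: minimise the discrepancy
# between 'experimental' responses and those predicted by candidate
# model parameters.  Brighter (lower-objective) fireflies attract dimmer ones.

rng = np.random.default_rng(5)

def model_response(params):
    """Hypothetical stand-in numerical model: two responses from two parameters."""
    E, rho = params
    return np.array([np.sqrt(E / rho), E * rho])

target = model_response(np.array([4.0, 1.0]))   # plays the role of measured data

def objective(params):
    """Discrepancy between predicted and 'experimental' responses."""
    return float(np.sum((model_response(params) - target) ** 2))

n, dims, iters = 15, 2, 200
beta0, gamma, alpha = 1.0, 1.0, 0.05
pop = rng.uniform(0.5, 8.0, size=(n, dims))     # candidate parameter sets
for _ in range(iters):
    light = np.array([objective(p) for p in pop])
    for i in range(n):
        for j in range(n):
            if light[j] < light[i]:             # firefly j is brighter than i
                r2 = np.sum((pop[i] - pop[j]) ** 2)
                beta = beta0 * np.exp(-gamma * r2)   # attractiveness decays with distance
                step = beta * (pop[j] - pop[i]) + alpha * rng.normal(size=dims)
                pop[i] = np.clip(pop[i] + step, 0.1, 10.0)  # keep parameters physical
    alpha *= 0.98                               # gradually damp the random walk

best = pop[np.argmin([objective(p) for p in pop])]
```

In an actual updating study, model_response would be the FE model's chosen response (tip deflection or natural frequencies) and the parameters would be the material or geometrical properties being updated.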

  14. Fast model updating coupling Bayesian inference and PGD model reduction

    Science.gov (United States)

    Rubio, Paul-Baptiste; Louf, François; Chamoin, Ludovic

    2018-04-01

    The paper focuses on a coupled Bayesian-Proper Generalized Decomposition (PGD) approach for the real-time identification and updating of numerical models. The purpose is to use the most general case of Bayesian inference theory in order to address inverse problems and to deal with different sources of uncertainties (measurement and model errors, stochastic parameters). In order to do so with a reasonable CPU cost, the idea is to replace the direct model called for Monte-Carlo sampling by a PGD reduced model, and in some cases directly compute the probability density functions from the obtained analytical formulation. This procedure is first applied to a welding control example with the updating of a deterministic parameter. In the second application, the identification of a stochastic parameter is studied through a glued assembly example.
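A minimal sketch of the core idea, replacing the direct model with a cheap reduced model inside the inference loop: here a polynomial fit stands in for the PGD reduced model, and the posterior of a single deterministic parameter is evaluated on a grid. All model functions and numbers are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical expensive forward model (stand-in for a direct FE solve).
def forward(q):
    return np.sin(q) + 0.5 * q

# Offline stage: fit a cheap surrogate (stand-in for a PGD reduced model).
q_train = np.linspace(0.0, 3.0, 25)
surrogate = np.poly1d(np.polyfit(q_train, forward(q_train), deg=6))

# Online stage: Bayesian update of q from one noisy measurement, evaluating
# only the surrogate (never the direct model) at every candidate point.
q_true, sigma = 1.2, 0.05
y_obs = forward(q_true) + rng.normal(0.0, sigma)

q_grid = np.linspace(0.0, 3.0, 2001)
log_post = -0.5 * ((y_obs - surrogate(q_grid)) / sigma) ** 2  # flat prior
q_map = q_grid[np.argmax(log_post)]                           # MAP estimate
```

The expensive solver is called only 25 times offline; the online identification touches it zero times, which is what makes the real-time setting plausible.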

  15. LMFBR plant parameters 1991

    International Nuclear Information System (INIS)

    1991-03-01

    The document has been prepared on the basis of information provided by the members of the IAEA International Working Group on Fast Reactors (IWGFR). It contains updated parameters of 27 experimental, prototype and commercial size liquid metal fast breeder reactors (LMFBRs). Most of the reactors are currently in operation, under construction or in an advanced planning stage. Parameters of the Clinch River Breeder Reactor (USA), PEC (Italy), RAPSODIE (France), DFR (UK) and EFFBR (USA) are included in the report because of their important role in the development of LMFBR technology from first LMFBRs to the prototype size fast reactors. Two more reactors appeared in the list: European Fast Reactor (EFR) and PRISM (USA). Parameters of these reactors included in this publication are based on the data from the papers presented at the 23rd Annual Meeting of the IWGFR. All in all more than four hundred corrections and additions have been made to update the document. The report is intended for specialists and institutions in industrialized and developing countries who are responsible for the design and operation of liquid metal fast breeder reactors

  16. The Inertia Weight Updating Strategies in Particle Swarm Optimisation Based on the Beta Distribution

    Directory of Open Access Journals (Sweden)

    Petr Maca

    2015-01-01

The presented paper deals with the comparison of selected random updating strategies for the inertia weight in particle swarm optimisation. Six versions of particle swarm optimisation were analysed on 28 benchmark functions prepared for the Special Session on Real-Parameter Single Objective Optimisation at CEC2013. The random components of the tested inertia weights were generated from the Beta distribution with different values of its shape parameters. The best analysed PSO version is the multiswarm PSO, which combines two strategies of updating the inertia weight: the first is driven by temporally varying shape parameters, while the second is based on random control of the shape parameters of the Beta distribution.
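The core ingredient, an inertia weight redrawn each iteration from a Beta distribution, can be sketched as follows; the rescaling interval [0.4, 0.9], the acceleration coefficients, and the sphere benchmark are illustrative choices, not the paper's exact setup:

```python
import numpy as np

rng = np.random.default_rng(2)

def sphere(x):                      # simple benchmark objective
    return np.sum(x**2, axis=-1)

def pso_beta_inertia(dim=5, n=30, iters=200, a=2.0, b=2.0):
    """PSO where each iteration's inertia weight is drawn from Beta(a, b)
    and rescaled to [0.4, 0.9] (rescaling bounds are illustrative)."""
    x = rng.uniform(-5, 5, (n, dim))
    v = np.zeros((n, dim))
    pbest, pval = x.copy(), sphere(x)
    g = pbest[np.argmin(pval)]                  # global best position
    for _ in range(iters):
        w = 0.4 + 0.5 * rng.beta(a, b)          # random inertia weight
        r1, r2 = rng.random((2, n, dim))
        v = w * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x = x + v
        val = sphere(x)
        better = val < pval
        pbest[better], pval[better] = x[better], val[better]
        g = pbest[np.argmin(pval)]
    return g, pval.min()

g_best, f_best = pso_beta_inertia()
```

Varying the shape parameters (a, b) shifts the distribution of w between exploratory (skewed high) and exploitative (skewed low) regimes, which is the knob the paper studies.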

  17. Development of a new risk score for incident type 2 diabetes using updated diagnostic criteria in middle-aged and older chinese.

    Directory of Open Access Journals (Sweden)

    Xingwang Ye

Type 2 diabetes mellitus (T2DM) has reached epidemic proportions among adults in China. However, no simple score has been created for predicting incident T2DM diagnosed by the updated criteria, which include hemoglobin A1c (HbA1c) ≥ 6.5%, in Chinese populations. In a 6-year follow-up cohort in Beijing and Shanghai, China, we recruited a total of 2529 adults aged 50-70 years in 2005 and followed them up in 2011. Fasting plasma glucose (FPG), HbA1c, and C-reactive protein (CRP) were measured, and incident diabetes was identified by the recently updated criteria. Of the 1912 participants without T2DM at baseline, 924 were identified as having T2DM at follow-up, and most of them (72.4%) were diagnosed using the HbA1c criterion. Baseline body mass index, FPG, HbA1c, CRP, hypertension, and female gender were all significantly associated with incident T2DM. Based upon these risk factors, a simple score was developed with an estimated area under the receiver operating characteristic curve of 0.714 (95% confidence interval: 0.691, 0.737), which performed better than most existing risk score models developed for eastern Asian populations. This simple, newly constructed score of six parameters may be useful in predicting T2DM in middle-aged and older Chinese.
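A sketch of the evaluation step: a point-based score summed over risk factors, assessed by the area under the ROC curve computed via the Mann-Whitney statistic. The cohort, the six factors, and the effect sizes below are entirely synthetic; only the AUC computation itself is the standard procedure:

```python
import numpy as np

rng = np.random.default_rng(3)

def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney U statistic."""
    pos, neg = scores[labels == 1], scores[labels == 0]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()   # ties count as half
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# Synthetic cohort: a point-based score over six hypothetical risk factors,
# with higher factor burden giving higher incidence probability.
n = 2000
factors = rng.integers(0, 3, size=(n, 6))         # 0/1/2 points per factor
score = factors.sum(axis=1)
p_incident = 1 / (1 + np.exp(-(score - 6) * 0.6)) # illustrative effect size
incident = (rng.random(n) < p_incident).astype(int)

score_auc = auc(score, incident)
```

An AUC of 0.5 is chance-level discrimination; the reported 0.714 means the score ranks a random incident case above a random non-case about 71% of the time.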

  18. Updating the 2001 National Land Cover Database land cover classification to 2006 by using Landsat imagery change detection methods

    Science.gov (United States)

    Xian, George; Homer, Collin G.; Fry, Joyce

    2009-01-01

    The recent release of the U.S. Geological Survey (USGS) National Land Cover Database (NLCD) 2001, which represents the nation's land cover status based on a nominal date of 2001, is widely used as a baseline for national land cover conditions. To enable the updating of this land cover information in a consistent and continuous manner, a prototype method was developed to update land cover by an individual Landsat path and row. This method updates NLCD 2001 to a nominal date of 2006 by using both Landsat imagery and data from NLCD 2001 as the baseline. Pairs of Landsat scenes in the same season in 2001 and 2006 were acquired according to satellite paths and rows and normalized to allow calculation of change vectors between the two dates. Conservative thresholds based on Anderson Level I land cover classes were used to segregate the change vectors and determine areas of change and no-change. Once change areas had been identified, land cover classifications at the full NLCD resolution for 2006 areas of change were completed by sampling from NLCD 2001 in unchanged areas. Methods were developed and tested across five Landsat path/row study sites that contain several metropolitan areas including Seattle, Washington; San Diego, California; Sioux Falls, South Dakota; Jackson, Mississippi; and Manchester, New Hampshire. Results from the five study areas show that the vast majority of land cover change was captured and updated with overall land cover classification accuracies of 78.32%, 87.5%, 88.57%, 78.36%, and 83.33% for these areas. The method optimizes mapping efficiency and has the potential to provide users a flexible method to generate updated land cover at national and regional scales by using NLCD 2001 as the baseline.
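The change-vector step can be sketched as follows: compute the per-pixel spectral change magnitude between the two normalized dates, threshold it, and relabel only the flagged pixels while carrying the baseline classes forward. The scene sizes, threshold value, and class codes are toy assumptions, not NLCD values:

```python
import numpy as np

rng = np.random.default_rng(4)

# Two co-registered, normalized multiband scenes (rows x cols x bands);
# synthetic stand-ins for a 2001 and 2006 Landsat pair.
t2001 = rng.random((50, 50, 6))
t2006 = t2001.copy()
t2006[10:20, 10:20] += 0.4            # a patch of genuine land cover change

# Change vector magnitude per pixel, thresholded conservatively.
cv = np.linalg.norm(t2006 - t2001, axis=2)
threshold = 0.5                        # illustrative per-class threshold
changed = cv > threshold

# Updated map: relabel only changed pixels, keep baseline labels elsewhere.
baseline = np.zeros((50, 50), dtype=int)       # NLCD 2001 class codes (toy)
updated = np.where(changed, 99, baseline)      # 99 = "to be reclassified"
```

Keeping unchanged pixels at their 2001 labels is what makes the update consistent with the baseline, which is the design goal the abstract emphasizes.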

  19. Efficient Multiplicative Updates for Support Vector Machines

    DEFF Research Database (Denmark)

    Potluru, Vamsi K.; Plis, Sergie N; Mørup, Morten

    2009-01-01

The dual formulation of the support vector machine (SVM) objective function is an instance of a nonnegative quadratic programming problem. We reformulate the SVM objective function as a matrix factorization problem, which establishes a connection with the regularized nonnegative matrix factorization (NMF) problem. This allows us to derive a novel multiplicative algorithm for solving hard and soft margin SVM. The algorithm follows as a natural extension of the updates for NMF and semi-NMF. No additional parameter setting, such as choosing a learning rate, is required. Exploiting the connection between the SVM and NMF formulations, we show how NMF algorithms can be applied to the SVM problem. The multiplicative updates that we derive for the SVM problem also represent novel updates for semi-NMF. Further, this unified view yields algorithmic insights in both directions: we demonstrate that the Kernel Adatron…
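As background, the multiplicative update for a generic nonnegative quadratic program (the Sha-Saul-Lee scheme that this line of work extends to SVMs) can be sketched as follows; the example matrix and vector are chosen so the constrained minimizer is known in closed form:

```python
import numpy as np

def nqp_multiplicative(A, b, x0, iters=2000):
    """Multiplicative updates for min 0.5*x'Ax + b'x subject to x >= 0
    (a sketch of the Sha-Saul-Lee scheme underlying the SVM updates).
    No learning rate is needed; positivity is preserved automatically."""
    Ap = np.maximum(A, 0)          # split A = Ap - Am, both nonnegative
    Am = np.maximum(-A, 0)
    x = x0.astype(float).copy()
    for _ in range(iters):
        a = Ap @ x
        c = Am @ x
        x *= (-b + np.sqrt(b**2 + 4 * a * c)) / (2 * a + 1e-12)
    return x

# Tiny example: for this A and b the unconstrained minimizer A^{-1}(-b)
# is already nonnegative, so it is also the NQP solution: [2/7, 6/7].
A = np.array([[2.0, 0.5], [0.5, 1.0]])
b = np.array([-1.0, -1.0])
x = nqp_multiplicative(A, b, np.ones(2))
```

At a fixed point the update factor equals 1, which recovers the KKT conditions of the NQP; this is the mechanism that lets the same recipe drive both NMF and the SVM dual.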

  20. THE 2014 ALMA LONG BASELINE CAMPAIGN: AN OVERVIEW

    Energy Technology Data Exchange (ETDEWEB)

    Partnership, ALMA [Astrophysics Research Institute, Liverpool John Moores University, IC2, Liverpool Science Park, 146 Brownlow Hill, Liverpool L3 5RF (United Kingdom); Fomalont, E. B.; Vlahakis, C.; Corder, S.; Remijan, A.; Barkats, D.; Dent, W. R. F.; Phillips, N.; Cox, P.; Hales, A. S. [Joint ALMA Observatory, Alonso de Córdova 3107, Vitacura, Santiago (Chile); Lucas, R. [Institut de Planétologie et d’Astrophysique de Grenoble (UMR 5274), BP 53, F-38041 Grenoble Cedex 9 (France); Hunter, T. R.; Brogan, C. L.; Amestica, R.; Cotton, W. [National Radio Astronomy Observatory, 520 Edgemont Road, Charlottesville, VA 22903 (United States); Asaki, Y. [National Astronomical Observatory of Japan, 2-21-1 Osawa, Mitaka, Tokyo 181-8588 (Japan); Matsushita, S. [Institute of Astronomy and Astrophysics, Academia Sinica, P.O. Box 23-141, Taipei 106, Taiwan (China); Hills, R. E. [Astrophysics Group, Cavendish Laboratory, JJ Thomson Avenue, Cambridge CB3 0HE (United Kingdom); Richards, A. M. S. [Jodrell Bank Centre for Astrophysics, School of Physics and Astronomy, University of Manchester, Oxford Road, Manchester M13 9PL (United Kingdom); Broguiere, D., E-mail: efomalon@nrao.edu [Institut de Radioastronomie Millime´trique (IRAM), 300 rue de la Piscine, Domaine Universitaire, F-38406 Saint Martin d’Hères (France); and others

    2015-07-20

    A major goal of the Atacama Large Millimeter/submillimeter Array (ALMA) is to make accurate images with resolutions of tens of milliarcseconds, which at submillimeter (submm) wavelengths requires baselines up to ∼15 km. To develop and test this capability, a Long Baseline Campaign (LBC) was carried out from 2014 September to late November, culminating in end-to-end observations, calibrations, and imaging of selected Science Verification (SV) targets. This paper presents an overview of the campaign and its main results, including an investigation of the short-term coherence properties and systematic phase errors over the long baselines at the ALMA site, a summary of the SV targets and observations, and recommendations for science observing strategies at long baselines. Deep ALMA images of the quasar 3C 138 at 97 and 241 GHz are also compared to VLA 43 GHz results, demonstrating an agreement at a level of a few percent. As a result of the extensive program of LBC testing, the highly successful SV imaging at long baselines achieved angular resolutions as fine as 19 mas at ∼350 GHz. Observing with ALMA on baselines of up to 15 km is now possible, and opens up new parameter space for submm astronomy.

  1. THE 2014 ALMA LONG BASELINE CAMPAIGN: AN OVERVIEW

    International Nuclear Information System (INIS)

    Partnership, ALMA; Fomalont, E. B.; Vlahakis, C.; Corder, S.; Remijan, A.; Barkats, D.; Dent, W. R. F.; Phillips, N.; Cox, P.; Hales, A. S.; Lucas, R.; Hunter, T. R.; Brogan, C. L.; Amestica, R.; Cotton, W.; Asaki, Y.; Matsushita, S.; Hills, R. E.; Richards, A. M. S.; Broguiere, D.

    2015-01-01

    A major goal of the Atacama Large Millimeter/submillimeter Array (ALMA) is to make accurate images with resolutions of tens of milliarcseconds, which at submillimeter (submm) wavelengths requires baselines up to ∼15 km. To develop and test this capability, a Long Baseline Campaign (LBC) was carried out from 2014 September to late November, culminating in end-to-end observations, calibrations, and imaging of selected Science Verification (SV) targets. This paper presents an overview of the campaign and its main results, including an investigation of the short-term coherence properties and systematic phase errors over the long baselines at the ALMA site, a summary of the SV targets and observations, and recommendations for science observing strategies at long baselines. Deep ALMA images of the quasar 3C 138 at 97 and 241 GHz are also compared to VLA 43 GHz results, demonstrating an agreement at a level of a few percent. As a result of the extensive program of LBC testing, the highly successful SV imaging at long baselines achieved angular resolutions as fine as 19 mas at ∼350 GHz. Observing with ALMA on baselines of up to 15 km is now possible, and opens up new parameter space for submm astronomy

  2. Super-NOvA a long-baseline neutrino experiment with two off-axis detectors

    CERN Document Server

    Requejo, O M; Pascoli, S; Requejo, Olga Mena; Palomares-Ruiz, Sergio; Pascoli, Silvia

    2005-01-01

Establishing the neutrino mass hierarchy is one of the fundamental questions that will have to be addressed in the near future. Its determination could be obtained with long-baseline experiments, but it typically suffers from degeneracies with other neutrino parameters. We consider here the NOvA experiment configuration and propose to place a second off-axis detector with a shorter baseline such that, by exploiting matter effects, the type of neutrino mass hierarchy could be determined with only the neutrino run. We show that the determination of this parameter is free of degeneracies, provided the ratio L/E, where L is the baseline and E is the neutrino energy, is the same for both detectors.
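In two-flavor vacuum oscillations the probability depends on L and E only through the ratio L/E, which is why matching that ratio at the two detectors cancels the vacuum part of the degeneracy; the hierarchy information then comes from matter effects, which differ with baseline. A sketch using the standard vacuum formula, with illustrative detector numbers (not the proposal's exact configuration):

```python
import numpy as np

def p_osc(L_km, E_GeV, sin2_2theta=0.09, dm2_eV2=2.5e-3):
    """Two-flavor vacuum oscillation probability; the 1.267 factor is the
    standard unit conversion for L in km, E in GeV, dm^2 in eV^2."""
    return sin2_2theta * np.sin(1.267 * dm2_eV2 * L_km / E_GeV) ** 2

p_far = p_osc(810.0, 2.0)    # NOvA-like far detector
p_near = p_osc(405.0, 1.0)   # hypothetical second detector at the same L/E
```

With equal L/E the two vacuum probabilities are identical, so any difference between the measured rates at the two sites is attributable to the matter effect, whose sign tracks the hierarchy.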

  3. On Using Exponential Parameter Estimators with an Adaptive Controller

    Science.gov (United States)

    Patre, Parag; Joshi, Suresh M.

    2011-01-01

    Typical adaptive controllers are restricted to using a specific update law to generate parameter estimates. This paper investigates the possibility of using any exponential parameter estimator with an adaptive controller such that the system tracks a desired trajectory. The goal is to provide flexibility in choosing any update law suitable for a given application. The development relies on a previously developed concept of controller/update law modularity in the adaptive control literature, and the use of a converse Lyapunov-like theorem. Stability analysis is presented to derive gain conditions under which this is possible, and inferences are made about the tracking error performance. The development is based on a class of Euler-Lagrange systems that are used to model various engineering systems including space robots and manipulators.
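One concrete member of the class of update laws the paper allows is the plain gradient estimator. The sketch below runs it on a noise-free linear regression with a persistently exciting regressor; this is an illustrative setting of the update-law idea, not the paper's Euler-Lagrange systems:

```python
import numpy as np

rng = np.random.default_rng(5)

# Gradient parameter update law (one member of the admissible class):
# theta_hat <- theta_hat + gamma * phi * (y - phi . theta_hat)
theta_true = np.array([2.0, -1.0, 0.5])   # unknown plant parameters
theta_hat = np.zeros(3)                   # initial estimate
gamma = 0.05                              # adaptation gain

for t in range(5000):
    phi = rng.normal(size=3)              # persistently exciting regressor
    y = phi @ theta_true                  # measured output (noise-free here)
    theta_hat += gamma * phi * (y - phi @ theta_hat)
```

Under persistent excitation this law converges exponentially, which is exactly the property the paper's modularity argument requires of whatever estimator is plugged into the controller.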

  4. LLL DBASE glossary and parameter definitions, Part 1

    International Nuclear Information System (INIS)

    Rohrer, R.F.

    1975-01-01

    This report lists, defines, and updates parameters in DBASE, an LLL test effects data bank in which data is stored from experiments performed at NTS and other test sites. Parameters are listed by subject and by number. Part 2 of this report presents the same information for classified parameters

  5. Updated neutron spectrum characterization of SNL baseline reactor environments

    International Nuclear Information System (INIS)

    Griffin, P.J.; Kelly, J.G.; Vehar, D.W.

    1994-04-01

    This document provides SAND-II and MANIPULATE output listings from calculations used to derive the new spectrum-averaged integral parameters that were reported in volume 1. When used in conjunction with volume 1, this document provides an audit trail for the neutron radiation field characterization and supports current quality assurance initiatives. This document provides detailed information on the neutron spectrum characteristics of the primary Sandia National Laboratories' (SNL) reactor environments. The information in this volume is not intended for the casual user of the SNL reactor facilities. This detailed characterization of the neutron and gamma environments at the Sandia Pulsed Reactor (SPR) and the Annular Core Research Reactor (ACRR) is provided to aid the users who wish to convert the information given in the Radiation Metrology Laboratory (RML) dosimetry reports into other (non-silicon) measures of neutron damage. The spectra provided in these appendices can be used as a source term for Monte Carlo radiation transport calculations to study the impact of experimenter's test package on the neutron environment

  6. Comparison of two optimization algorithms for fuzzy finite element model updating for damage detection in a wind turbine blade

    Science.gov (United States)

    Turnbull, Heather; Omenzetter, Piotr

    2018-03-01

Difficulties associated with current health monitoring and inspection practices, combined with the harsh, often remote, operational environments of wind turbines, highlight the requirement for a non-destructive evaluation system capable of remotely monitoring the current structural state of turbine blades. This research adopted a physics-based structural health monitoring methodology through calibration of a finite element model using inverse techniques. A 2.36m blade from a 5kW turbine was used as an experimental specimen, with operational modal analysis techniques utilised to obtain the modal properties of the system. Modelling the experimental responses as fuzzy numbers using the sub-level technique, uncertainty in the response parameters was propagated back through the model and into the updating parameters. Initially, experimental responses of the blade were obtained, with a numerical model of the blade created and updated. Deterministic updating was carried out through formulation and minimisation of a deterministic objective function using both the firefly algorithm and the virus optimisation algorithm. Uncertainty in the experimental responses was modelled using triangular membership functions, allowing membership functions of the updating parameters (Young's modulus and shear modulus) to be obtained. The firefly algorithm and virus optimisation algorithm were again utilised, this time in the solution of fuzzy objective functions. This enabled the uncertainty associated with the updating parameters to be quantified. Varying damage location and severity was simulated experimentally through the addition of small masses to the structure, intended to cause a structural alteration. A damaged model was created, modelling four variable-magnitude nonstructural masses at predefined points, and updated to provide a deterministic damage prediction and information on the parameters' uncertainty via fuzzy updating.

  7. Physics Potential of Long-Baseline Experiments

    Directory of Open Access Journals (Sweden)

    Sanjib Kumar Agarwalla

    2014-01-01

The discovery of neutrino mixing and oscillations over the past decade provides firm evidence for new physics beyond the Standard Model. Recently, θ13 has been determined to be moderately large, quite close to its previous upper bound. This represents a significant milestone in establishing the three-flavor oscillation picture of neutrinos. It has opened up exciting prospects for current and future long-baseline neutrino oscillation experiments towards addressing the remaining fundamental questions, in particular the type of the neutrino mass hierarchy and the possible presence of a CP-violating phase. Another recent and crucial development is the indication of a non-maximal 2-3 mixing angle, causing the octant ambiguity of θ23. In this paper, I will review the phenomenology of long-baseline neutrino oscillations with a special emphasis on sub-leading three-flavor effects, which will play a crucial role in resolving these unknowns. First, I will give a brief description of the neutrino oscillation phenomenon. Then, I will discuss our present global understanding of the neutrino mass-mixing parameters and will identify the major unknowns in this sector. After that, I will present the physics reach of current generation long-baseline experiments. Finally, I will conclude with a discussion on the physics capabilities of accelerator-driven possible future long-baseline precision oscillation facilities.

  8. The structure of executive functions in children: a closer examination of inhibition, shifting, and updating.

    Science.gov (United States)

    van der Ven, Sanne H G; Kroesbergen, Evelyn H; Boom, Jan; Leseman, Paul P M

    2013-03-01

    An increasing number of studies has investigated the latent factor structure of executive functions. Some studies found a three-factor structure of inhibition, shifting, and updating, but others could not replicate this finding. We assumed that the task choices and scoring methods might be responsible for these contradictory findings. Therefore, we selected tasks in which input modality was varied, controlled for baseline speed, and used both speed and accuracy scores, in order to investigate whether a three factor model with inhibition, shifting, and updating could still be replicated. In a group of 211 children, who were tested at the beginning of grade 1, at approximately 6 years of age, and again after 18 months, the best fitting model was not the three-factor model, but instead consisted of an updating factor and a combined inhibition and shifting factor, besides two baseline speed factors (verbal and motor). We argue that these results might indicate that the structural organization of executive functions might be different in children than in adults, but that there might also be an alternative explanation: the distinction in executive functions might not accurately represent cognitive structures but instead be a methodological artefact. © 2012 The British Psychological Society.

  9. 40 CFR 74.20 - Data for baseline and alternative baseline.

    Science.gov (United States)

    2010-07-01

Title 40 (Protection of Environment), 2010-07-01 edition, § 74.20 Data for baseline and alternative baseline. ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... (a) Acceptable data. (1) The designated representative of a combustion...

  10. Minimum mean square error estimation and approximation of the Bayesian update

    KAUST Repository

    Litvinenko, Alexander; Matthies, Hermann G.; Zander, Elmar

    2015-01-01

Given: a physical system modeled by a PDE or ODE with uncertain coefficient q(w), and a measurement operator Y(u(q); q), where u(q; w) is the uncertain solution. Aim: to identify q(w). The mapping from parameters to observations is usually not invertible, hence this inverse identification problem is generally ill-posed. To identify q(w) we derive a non-linear Bayesian update from the variational problem associated with the conditional expectation. To reduce the cost of the Bayesian update we offer a functional approximation, e.g., a polynomial chaos expansion (PCE). New: we derive linear, quadratic, etc., approximations of the full Bayesian update.
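The linear approximation of the Bayesian update amounts to a Kalman-like gain applied to a prior ensemble. The sketch below estimates that gain from samples for a scalar parameter with an assumed linear measurement operator; the operator, noise level, and "truth" are illustrative:

```python
import numpy as np

rng = np.random.default_rng(6)

# Sample-based linear MMSE update: q_post = q + K * (y_obs - Y(q)), with
# the gain K estimated from prior samples (a sketch of the linear
# approximation of the full Bayesian update).
def Y(q):                                    # assumed measurement operator
    return 2.0 * q + 1.0

q_prior = rng.normal(1.0, 0.5, 10000)        # prior ensemble of q(w)
sigma = 0.1                                  # measurement noise std
y_pred = Y(q_prior) + rng.normal(0.0, sigma, q_prior.size)

K = np.cov(q_prior, y_pred)[0, 1] / np.var(y_pred)   # linear MMSE gain

y_obs = Y(1.4)                               # measurement from "truth" q = 1.4
q_post = q_prior + K * (y_obs - y_pred)      # updated ensemble
```

For this linear-Gaussian setup the sample update reproduces the exact conjugate posterior; for nonlinear Y it is precisely the linear (first-order) approximation the abstract refers to.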

  11. Minimum mean square error estimation and approximation of the Bayesian update

    KAUST Repository

    Litvinenko, Alexander

    2015-01-07

Given: a physical system modeled by a PDE or ODE with uncertain coefficient q(w), and a measurement operator Y(u(q); q), where u(q; w) is the uncertain solution. Aim: to identify q(w). The mapping from parameters to observations is usually not invertible, hence this inverse identification problem is generally ill-posed. To identify q(w) we derive a non-linear Bayesian update from the variational problem associated with the conditional expectation. To reduce the cost of the Bayesian update we offer a functional approximation, e.g., a polynomial chaos expansion (PCE). New: we derive linear, quadratic, etc., approximations of the full Bayesian update.

  12. A baseline-free procedure for transformation models under interval censorship.

    Science.gov (United States)

    Gu, Ming Gao; Sun, Liuquan; Zuo, Guoxin

    2005-12-01

An important property of the Cox regression model is that the estimation of the regression parameters using the partial likelihood procedure does not depend on its baseline survival function. We call such a procedure baseline-free. Using marginal likelihood, we show that a baseline-free procedure can be derived for a class of general transformation models under the interval censoring framework. The baseline-free procedure yields a simplified and stable computational algorithm for some complicated and important semiparametric models, such as frailty models and heteroscedastic hazard/rank regression models, where the estimation procedures available so far involve estimation of the infinite-dimensional baseline function. A detailed computational algorithm using Markov chain Monte Carlo stochastic approximation is presented. The proposed procedure is demonstrated through extensive simulation studies, showing the validity of asymptotic consistency and normality. We also illustrate the procedure with a real data set from a study of breast cancer. A heuristic argument showing that the score function is a mean-zero martingale is provided.

  13. Combining Multi-Source Remotely Sensed Data and a Process-Based Model for Forest Aboveground Biomass Updating.

    Science.gov (United States)

    Lu, Xiaoman; Zheng, Guang; Miller, Colton; Alvarado, Ernesto

    2017-09-08

    Monitoring and understanding the spatio-temporal variations of forest aboveground biomass (AGB) is a key basis to quantitatively assess the carbon sequestration capacity of a forest ecosystem. To map and update forest AGB in the Greater Khingan Mountains (GKM) of China, this work proposes a physical-based approach. Based on the baseline forest AGB from Landsat Enhanced Thematic Mapper Plus (ETM+) images in 2008, we dynamically updated the annual forest AGB from 2009 to 2012 by adding the annual AGB increment (ABI) obtained from the simulated daily and annual net primary productivity (NPP) using the Boreal Ecosystem Productivity Simulator (BEPS) model. The 2012 result was validated by both field- and aerial laser scanning (ALS)-based AGBs. The predicted forest AGB for 2012 estimated from the process-based model can explain 31% ( n = 35, p forest AGBs, respectively. However, due to the saturation of optical remote sensing-based spectral signals and contribution of understory vegetation, the BEPS-based AGB tended to underestimate/overestimate the AGB for dense/sparse forests. Generally, our results showed that the remotely sensed forest AGB estimates could serve as the initial carbon pool to parameterize the process-based model for NPP simulation, and the combination of the baseline forest AGB and BEPS model could effectively update the spatiotemporal distribution of forest AGB.
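The updating scheme reduces to adding the simulated annual biomass increment (ABI) to the baseline map year by year; the sketch below uses toy map values and a constant hypothetical ABI in place of the BEPS-simulated NPP-derived layers:

```python
import numpy as np

# Sketch of the update: AGB(t) = AGB(2008 baseline) + sum of annual ABI.
agb_2008 = np.array([[120.0, 95.0],
                     [60.0, 140.0]])                 # Mg/ha, toy 2x2 map

# Hypothetical NPP-derived ABI layers for 2009-2012 (Mg/ha/yr).
abi = {yr: np.full((2, 2), 2.5) for yr in range(2009, 2013)}

agb = agb_2008.copy()
history = {2008: agb_2008}
for yr in range(2009, 2013):
    agb = agb + abi[yr]        # add that year's simulated increment
    history[yr] = agb          # annual updated AGB map
```

The baseline map seeds the carbon pool, and each year's process-model output advances it, which is the coupling the abstract describes between the remote-sensing baseline and BEPS.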

  14. Documentation of the dynamic parameter, water-use, stream and lake flow routing, and two summary output modules and updates to surface-depression storage simulation and initial conditions specification options with the Precipitation-Runoff Modeling System (PRMS)

    Science.gov (United States)

    Regan, R. Steve; LaFontaine, Jacob H.

    2017-10-05

    This report documents seven enhancements to the U.S. Geological Survey (USGS) Precipitation-Runoff Modeling System (PRMS) hydrologic simulation code: two time-series input options, two new output options, and three updates of existing capabilities. The enhancements are (1) new dynamic parameter module, (2) new water-use module, (3) new Hydrologic Response Unit (HRU) summary output module, (4) new basin variables summary output module, (5) new stream and lake flow routing module, (6) update to surface-depression storage and flow simulation, and (7) update to the initial-conditions specification. This report relies heavily upon U.S. Geological Survey Techniques and Methods, book 6, chapter B7, which documents PRMS version 4 (PRMS-IV). A brief description of PRMS is included in this report.

  15. Nonlinear undulator tapering in conventional SASE regime at baseline electron beam parameters as a way to optimize the radiation characteristics of the European XFEL

    Energy Technology Data Exchange (ETDEWEB)

    Serkez, Svitozar; Kocharyan, Vitali; Saldin, Evgeni; Zagorodnov, Igor [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Geloni, Gianluca [European XFEL GmbH, Hamburg (Germany)

    2013-09-15

    We demonstrate that the output radiation characteristics of the European XFEL sources at nominal operation point can be easily made significantly better than what is currently reported in the TDRs of scientific instruments and X-ray optics. In fact, the output SASE characteristics of the baseline European XFEL have been previously optimized assuming uniform undulators at a nominal operating point of 5 kA peak current, without considering the potential of undulator tapering in the SASE regime. In order to illustrate this point, we analyze the case of an electron bunch with nominal parameters. Based on start-to-end simulations, we demonstrate that nonlinear undulator tapering allows one to achieve up to a tenfold increase in peak power and photon spectral density in the conventional SASE regime, without modification to the baseline design. The FEL code Genesis has been extensively used for these studies. In order to increase our confidence in simulation results, we cross-checked outcomes by reproducing simulations in the deep nonlinear SASE regime with tapered undulator using the code ALICE.

  16. Updated measurement of the tau lifetime at SLD

    International Nuclear Information System (INIS)

    1996-01-01

    We present an updated measurement of the tau lifetime at SLD. 4316 τ-pair events, selected from a 150k Z 0 data sample, are analyzed using three techniques: decay length, impact parameter, and impact parameter difference methods. The measurement benefits from the small and stable interaction region at the SLC and the precision CCD pixel vertex detector of the SLD. The combined result is: τ τ = 288.1 ± 6.1(stat) ± 3.3(syst) fs
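Combining several measurement techniques into one lifetime is, in the simplest uncorrelated case, an inverse-variance weighted average; the per-method values below are hypothetical placeholders, not SLD's actual intermediate results, and the real combination must also treat correlated systematics:

```python
import numpy as np

# Hypothetical per-method lifetimes (fs) and statistical errors.
values = np.array([289.0, 287.5, 288.0])
errors = np.array([9.0, 11.0, 10.0])

w = 1.0 / errors**2                           # inverse-variance weights
combined = np.sum(w * values) / np.sum(w)     # weighted mean
combined_err = 1.0 / np.sqrt(np.sum(w))       # error of the weighted mean
```

The combined uncertainty is necessarily smaller than the best single method's, which is why the paper quotes one joint result from the three analyses.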

  17. Clinical practice parameters for hemodynamic support of pediatric and neonatal septic shock: 2007 update from the American College of Critical Care Medicine

    Science.gov (United States)

    Brierley, Joe; Carcillo, Joseph A.; Choong, Karen; Cornell, Tim; DeCaen, Allan; Deymann, Andreas; Doctor, Allan; Davis, Alan; Duff, John; Dugas, Marc-Andre; Duncan, Alan; Evans, Barry; Feldman, Jonathan; Felmet, Kathryn; Fisher, Gene; Frankel, Lorry; Jeffries, Howard; Greenwald, Bruce; Gutierrez, Juan; Hall, Mark; Han, Yong Y.; Hanson, James; Hazelzet, Jan; Hernan, Lynn; Kiff, Jane; Kissoon, Niranjan; Kon, Alexander; Irazusta, Jose; Lin, John; Lorts, Angie; Mariscalco, Michelle; Mehta, Renuka; Nadel, Simon; Nguyen, Trung; Nicholson, Carol; Peters, Mark; Okhuysen-Cawley, Regina; Poulton, Tom; Relves, Monica; Rodriguez, Agustin; Rozenfeld, Ranna; Schnitzler, Eduardo; Shanley, Tom; Skache, Sara; Skippen, Peter; Torres, Adalberto; von Dessauer, Bettina; Weingarten, Jacki; Yeh, Timothy; Zaritsky, Arno; Stojadinovic, Bonnie; Zimmerman, Jerry; Zuckerberg, Aaron

    2013-01-01

    Background The Institute of Medicine calls for the use of clinical guidelines and practice parameters to promote “best practices” and to improve patient outcomes. Objective 2007 update of the 2002 American College of Critical Care Medicine Clinical Guidelines for Hemodynamic Support of Neonates and Children with Septic Shock. Participants Society of Critical Care Medicine members with special interest in neonatal and pediatric septic shock were identified from general solicitation at the Society of Critical Care Medicine Educational and Scientific Symposia (2001–2006). Methods The Pubmed/MEDLINE literature database (1966–2006) was searched using the keywords and phrases: sepsis, septicemia, septic shock, endotoxemia, persistent pulmonary hypertension, nitric oxide, extracorporeal membrane oxygenation (ECMO), and American College of Critical Care Medicine guidelines. Best practice centers that reported best outcomes were identified and their practices examined as models of care. Using a modified Delphi method, 30 experts graded new literature. Over 30 additional experts then reviewed the updated recommendations. The document was subsequently modified until there was greater than 90% expert consensus. Results The 2002 guidelines were widely disseminated, translated into Spanish and Portuguese, and incorporated into Society of Critical Care Medicine and AHA sanctioned recommendations. Centers that implemented the 2002 guidelines reported best practice outcomes (hospital mortality 1%–3% in previously healthy, and 7%– 10% in chronically ill children). Early use of 2002 guidelines was associated with improved outcome in the community hospital emergency department (number needed to treat = 3.3) and tertiary pediatric intensive care setting (number needed to treat = 3.6); every hour that went by without guideline adherence was associated with a 1.4-fold increased mortality risk. The updated 2007 guidelines continue to recognize an increased likelihood that

  18. Future Circular Collider Study FCC-he Baseline Parameters

    CERN Document Server

    Bruning, Oliver; Klein, Max; Pellegrini, Dario; Schulte, Daniel; Zimmermann, Frank

    2017-01-01

    Initial considerations are presented on the FCC-he, the electron-hadron collider configuration within the Future Circular Collider study. This note considers arguments for the choice of the electron beam energy based on physics, ep scattering kinematics and cost. The default configuration for the electron accelerator, as for the LHeC, is chosen to be a multi-turn energy recovery linac external to the proton beam tunnel. The main accelerator parameters of the FCC-he are discussed, assuming the concurrent operation of ep with the 100 TeV cms energy pp collider. These are compared with the LHeC design concept, for increased performance as for a Higgs facility using the HL-LHC, and also the high energy HE-LHC ep collider configuration. Initial estimates are also provided for the luminosity performance of electron-ion colliders for the 60 GeV electron ERL when combined with the LHC, the HE-LHC and the FCC ion beams.

  19. Baseline hematology and serum biochemistry results for Indian leopards (Panthera pardus fusca)

    Directory of Open Access Journals (Sweden)

    Arun Attur Shanmugam

    2017-07-01

    Aim: The aim of the study was to establish the baseline hematology and serum biochemistry values for Indian leopards (Panthera pardus fusca), and to assess the possible variations in these parameters based on age and gender. Materials and Methods: Hemato-biochemical test reports from a total of 83 healthy leopards, carried out as part of routine health evaluation in Bannerghatta Biological Park and Manikdoh Leopard Rescue Center, were used to establish baseline hematology and serum biochemistry parameters for the subspecies. The hematological parameters considered for the analysis included hemoglobin (Hb), packed cell volume, total erythrocyte count (TEC), total leukocyte count (TLC), mean corpuscular volume (MCV), mean corpuscular Hb (MCH), and MCH concentration. The serum biochemistry parameters considered included total protein (TP), albumin, globulin, aspartate aminotransferase, alanine aminotransferase (ALT), blood urea nitrogen, creatinine, triglycerides, calcium, and phosphorus. Results: Even though a few differences were observed in hematologic and biochemistry values between male and female Indian leopards, the differences were not statistically significant. Effects of age, however, were evident in relation to many hematologic and biochemical parameters. Sub-adults had significantly greater values for Hb, TEC, and TLC compared to the adult and geriatric groups, whereas they had significantly lower MCV and MCH compared to the adult and geriatric groups. Among serum biochemistry parameters, the sub-adult age group was observed to have significantly lower values for TP and ALT than adult and geriatric leopards. Conclusion: The study provides a comprehensive analysis of hematologic and biochemical parameters for Indian leopards. Baselines established here will permit better captive management of the subspecies, serve as a guide to assess the health and physiological status of the free ranging leopards, and may contribute valuable information for making

  20. Baseline hematology and serum biochemistry results for Indian leopards (Panthera pardus fusca)

    Science.gov (United States)

    Shanmugam, Arun Attur; Muliya, Sanath Krishna; Deshmukh, Ajay; Suresh, Sujay; Nath, Anukul; Kalaignan, Pa; Venkataravanappa, Manjunath; Jose, Lyju

    2017-01-01

    Aim: The aim of the study was to establish the baseline hematology and serum biochemistry values for Indian leopards (Panthera pardus fusca), and to assess the possible variations in these parameters based on age and gender. Materials and Methods: Hemato-biochemical test reports from a total of 83 healthy leopards, carried out as part of routine health evaluation in Bannerghatta Biological Park and Manikdoh Leopard Rescue Center, were used to establish baseline hematology and serum biochemistry parameters for the subspecies. The hematological parameters considered for the analysis included hemoglobin (Hb), packed cell volume, total erythrocyte count (TEC), total leukocyte count (TLC), mean corpuscular volume (MCV), mean corpuscular Hb (MCH), and MCH concentration. The serum biochemistry parameters considered included total protein (TP), albumin, globulin, aspartate aminotransferase, alanine aminotransferase (ALT), blood urea nitrogen, creatinine, triglycerides, calcium, and phosphorus. Results: Even though a few differences were observed in hematologic and biochemistry values between male and female Indian leopards, the differences were not statistically significant. Effects of age, however, were evident in relation to many hematologic and biochemical parameters. Sub-adults had significantly greater values for Hb, TEC, and TLC compared to the adult and geriatric groups, whereas they had significantly lower MCV and MCH compared to the adult and geriatric groups. Among serum biochemistry parameters, the sub-adult age group was observed to have significantly lower values for TP and ALT than adult and geriatric leopards. Conclusion: The study provides a comprehensive analysis of hematologic and biochemical parameters for Indian leopards. Baselines established here will permit better captive management of the subspecies, serve as a guide to assess the health and physiological status of the free ranging leopards, and may contribute valuable information for making effective
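Baselines of this kind are conventionally summarized as parametric 95% reference intervals per age or sex group; the abstract does not state the exact statistical procedure used, so the following is a minimal sketch of the standard construction, not the authors' code:

```python
import math

def reference_interval(values):
    """Parametric 95% reference interval: mean +/- 1.96 sample SD,
    the usual form for baseline hematology/biochemistry tables."""
    n = len(values)
    mean = sum(values) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / (n - 1))
    return mean - 1.96 * sd, mean + 1.96 * sd
```

Applied per group (e.g. sub-adult vs. adult Hb values), the interval bounds are what a clinician would compare an individual animal's result against.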

  1. An Improved Cognitive Model of the Iowa and Soochow Gambling Tasks With Regard to Model Fitting Performance and Tests of Parameter Consistency

    Directory of Open Access Journals (Sweden)

    Junyi eDai

    2015-03-01

    The Iowa Gambling Task (IGT) and the Soochow Gambling Task (SGT) are two experience-based risky decision-making tasks for examining decision-making deficits in clinical populations. Several cognitive models, including the expectancy-valence learning model (EVL) and the prospect valence learning model (PVL), have been developed to disentangle the motivational, cognitive, and response processes underlying the explicit choices in these tasks. The purpose of the current study was to develop an improved model that can fit empirical data better than the EVL and PVL models and, in addition, produce more consistent parameter estimates across the IGT and SGT. Twenty-six opiate users (mean age 34.23; SD 8.79) and 27 control participants (mean age 35; SD 10.44) completed both tasks. Eighteen cognitive models varying in evaluation, updating, and choice rules were fit to individual data and their performances were compared to that of a statistical baseline model to find a best fitting model. The results showed that the model combining the prospect utility function treating gains and losses separately, the decay-reinforcement updating rule, and the trial-independent choice rule performed the best in both tasks. Furthermore, the winning model produced more consistent individual parameter estimates across the two tasks than any of the other models.
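The winning model pairs a prospect-type utility, a decay-reinforcement updating rule, and a trial-independent (softmax) choice rule. A minimal sketch of that combination follows; the parameter values (alpha, lam, decay, theta) are illustrative defaults, not the fitted per-participant values, which the abstract does not report:

```python
import math

def utility(x, alpha=0.5, lam=1.5):
    """Prospect-type utility treating gains and losses separately:
    diminishing sensitivity (alpha) and loss aversion (lam)."""
    return x ** alpha if x >= 0 else -lam * ((-x) ** alpha)

def decay_update(expectancies, chosen, outcome, decay=0.8):
    """Decay-reinforcement rule: every deck expectancy decays each trial;
    the chosen deck additionally absorbs the utility of its outcome."""
    new = [decay * e for e in expectancies]
    new[chosen] += utility(outcome)
    return new

def choice_probs(expectancies, theta=0.2):
    """Trial-independent choice rule: softmax with constant sensitivity."""
    weights = [math.exp(theta * e) for e in expectancies]
    z = sum(weights)
    return [w / z for w in weights]
```

Iterating `decay_update` and `choice_probs` over a participant's trial sequence yields the choice likelihoods that model fitting maximizes.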

  2. IAHR List of Sea Parameters

    DEFF Research Database (Denmark)

    Frigaard, Peter; Helm-Petersen, J; Klopman, G.

    1997-01-01

    A Working Group on multidirectional waves formed by the International Association for Hydraulic Research has proposed an update of the IAHR List of Sea State Parameters from 1986 in the part concerning directional waves. In particular, wave-structure interaction with reflection of the waves has been treated.

  3. Land and Water Use Characteristics and Human Health Input Parameters for use in Environmental Dosimetry and Risk Assessments at the Savannah River Site. 2016 Update

    Energy Technology Data Exchange (ETDEWEB)

    Jannik, G. Tim [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Hartman, Larry [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Stagich, Brooke [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2016-09-26

    Operations at the Savannah River Site (SRS) result in releases of small amounts of radioactive materials to the atmosphere and to the Savannah River. For regulatory compliance purposes, potential offsite radiological doses are estimated annually using computer models that follow U.S. Nuclear Regulatory Commission (NRC) regulatory guides. Within the regulatory guides, default values are provided for many of the dose model parameters, but the use of applicant site-specific values is encouraged. Detailed surveys of land-use and water-use parameters were conducted in 1991 and 2010. They are being updated in this report. These parameters include local characteristics of meat, milk and vegetable production; river recreational activities; and meat, milk and vegetable consumption rates, as well as other human usage parameters required in the SRS dosimetry models. In addition, the preferred elemental bioaccumulation factors and transfer factors (to be used in human health exposure calculations at SRS) are documented. The intent of this report is to establish a standardized source for these parameters that is up to date with existing data, and that is maintained via review of future-issued national references (to evaluate the need for changes as new information is released). These reviews will continue to be added to this document by revision.

  4. Land and Water Use Characteristics and Human Health Input Parameters for use in Environmental Dosimetry and Risk Assessments at the Savannah River Site 2017 Update

    Energy Technology Data Exchange (ETDEWEB)

    Jannik, T. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Stagich, B. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-05-25

    Operations at the Savannah River Site (SRS) result in releases of relatively small amounts of radioactive materials to the atmosphere and to the Savannah River. For regulatory compliance purposes, potential offsite radiological doses are estimated annually using computer models that follow U.S. Nuclear Regulatory Commission (NRC) regulatory guides. Within the regulatory guides, default values are provided for many of the dose model parameters, but the use of site-specific values is encouraged. Detailed surveys of land-use and water-use parameters were conducted in 1991, 2008, 2010, and 2016 and are being concurred with or updated in this report. These parameters include local characteristics of meat, milk, and vegetable production; river recreational activities; and meat, milk, and vegetable consumption rates, as well as other human usage parameters required in the SRS dosimetry models. In addition, the preferred elemental bioaccumulation factors and transfer factors (to be used in human health exposure calculations at SRS) are documented. The intent of this report is to establish a standardized source for these parameters that is up to date with existing data, and that is maintained via review of future-issued national references (to evaluate the need for changes as new information is released). These reviews will continue to be added to this document by revision.

  5. Frequency Domain Multi-parameter Full Waveform Inversion for Acoustic VTI Media

    KAUST Repository

    Djebbi, Ramzi; Alkhalifah, Tariq Ali

    2017-01-01

    Multi-parameter full waveform inversion (FWI) for transversely isotropic (TI) media with vertical axis of symmetry (VTI) suffers from the trade-off between the parameters. The trade-off results in the leakage of one parameter's update into the other

  6. JPSS CGS Tools For Rapid Algorithm Updates

    Science.gov (United States)

    Smith, D. C.; Grant, K. D.

    2011-12-01

    Northrop Grumman has developed tools and processes to enable changes to be evaluated, tested, and moved into the operational baseline in a rapid and efficient manner. This presentation will provide an overview of the tools available to the Cal/Val teams to ensure rapid and accurate assessment of algorithm changes, along with the processes in place to ensure baseline integrity. [1] K. Grant and G. Route, "JPSS CGS Tools for Rapid Algorithm Updates," NOAA 2011 Satellite Direct Readout Conference, Miami FL, Poster, Apr 2011. [2] K. Grant, G. Route and B. Reed, "JPSS CGS Tools for Rapid Algorithm Updates," AMS 91st Annual Meeting, Seattle WA, Poster, Jan 2011. [3] K. Grant, G. Route and B. Reed, "JPSS CGS Tools for Rapid Algorithm Updates," AGU 2010 Fall Meeting, San Francisco CA, Oral Presentation, Dec 2010.

  7. Design parameters and source terms: Volume 1, Design parameters: Revision 0

    International Nuclear Information System (INIS)

    1987-10-01

    The Design Parameters and Source Terms Document was prepared in accordance with DOE request and to provide data for the environmental impact study to be performed in the future for the Deaf Smith County, Texas site for a nuclear waste repository in salt. This document updates a previous unpublished report by Stearns Catalytic Corporation (SCC), entitled ''Design Parameters and Source Terms for a Two-Phase Repository in Salt,'' 1985, to the level of the Site Characterization Plan - Conceptual Design Report. The previous unpublished SCC study identifies the data needs for the Environmental Assessment effort for seven possible Salt Repository sites.

  8. Demographic and transportation parameters in RADTRAN

    International Nuclear Information System (INIS)

    Brogan, J.D.; Cashwell, J.W.; Neuhauser, K.S.

    1989-01-01

    Recent efforts at Sandia National Laboratories have focused not only on modification of the RADTRAN transportation risk analysis code but also on updating the default parameters for population, land use, and roadway characteristics used by the code. Changes to the code have been discussed earlier in this Conference. This paper summarizes the results of a review of transportation and demographic parameters, performed to complement recent model modifications

  9. A long baseline global stereo matching based upon short baseline estimation

    Science.gov (United States)

    Li, Jing; Zhao, Hong; Li, Zigang; Gu, Feifei; Zhao, Zixin; Ma, Yueyang; Fang, Meiqi

    2018-05-01

    In global stereo vision, balancing the matching efficiency and computing accuracy seems to be impossible because they contradict each other. In the case of a long baseline, this contradiction becomes more prominent. In order to solve this difficult problem, this paper proposes a novel idea to improve both the efficiency and accuracy in global stereo matching for a long baseline. In this way, the reference images located between the long baseline image pairs are firstly chosen to form the new image pairs with short baselines. The relationship between the disparities of pixels in the image pairs with different baselines is revealed by considering the quantized error so that the disparity search range under the long baseline can be reduced by guidance of the short baseline to gain matching efficiency. Then, the novel idea is integrated into the graph cuts (GCs) to form a multi-step GC algorithm based on the short baseline estimation, by which the disparity map under the long baseline can be calculated iteratively on the basis of the previous matching. Furthermore, the image information from the pixels that are non-occluded under the short baseline but are occluded for the long baseline can be employed to improve the matching accuracy. Although the time complexity of the proposed method depends on the locations of the chosen reference images, it is usually much lower for a long baseline stereo matching than when using the traditional GC algorithm. Finally, the validity of the proposed method is examined by experiments based on benchmark datasets. The results show that the proposed method is superior to the traditional GC method in terms of efficiency and accuracy, and thus it is suitable for long baseline stereo matching.
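The core relation the paper exploits can be sketched directly: a disparity estimated under the short baseline, scaled by the baseline ratio and widened by the (scaled) quantization error, bounds the disparity search under the long baseline. The function name and the half-pixel error bound below are illustrative assumptions, not taken from the paper:

```python
import math

def long_baseline_search_range(d_short, baseline_ratio, quant_error=0.5):
    """Reduce the long-baseline disparity search to the interval implied
    by a short-baseline estimate: scale the disparity by B_long/B_short
    and widen it by the scaled quantization error of the short estimate."""
    center = d_short * baseline_ratio
    half_width = quant_error * baseline_ratio
    return int(math.floor(center - half_width)), int(math.ceil(center + half_width))
```

For example, a short-baseline disparity of 10 px with a baseline ratio of 4 restricts the long-baseline search to a 5-pixel window instead of the full disparity range, which is where the efficiency gain in the multi-step graph-cuts scheme comes from.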

  10. A Kriging Model Based Finite Element Model Updating Method for Damage Detection

    Directory of Open Access Journals (Sweden)

    Xiuming Yang

    2017-10-01

    Model updating is an effective means of damage identification, and surrogate modeling has attracted considerable attention for saving computational cost in finite element (FE) model updating, especially for large-scale structures. In this context, a surrogate model of frequency is normally constructed for damage identification, while the frequency response function (FRF) is rarely used as it usually changes dramatically with updating parameters. This paper presents a new surrogate-model-based model updating method taking advantage of the measured FRFs. The Frequency Domain Assurance Criterion (FDAC) is used to build the objective function, whose nonlinear response surface is constructed by the Kriging model. Then, the efficient global optimization (EGO) algorithm is introduced to get the model updating results. The proposed method has good accuracy and robustness, which have been verified by a numerical simulation of a cantilever and experimental test data of a laboratory three-story structure.
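The FDAC at the heart of the objective function correlates a measured and a predicted FRF vector line by line; a minimal sketch of that criterion follows (the Kriging surrogate and EGO loop are omitted, and the objective form shown is an illustrative assumption):

```python
def fdac(h_meas, h_pred):
    """Frequency Domain Assurance Criterion between a measured and a
    predicted complex FRF vector at one frequency line; 1.0 means the
    two response shapes are perfectly correlated."""
    num = abs(sum(a.conjugate() * b for a, b in zip(h_meas, h_pred))) ** 2
    den = sum(abs(a) ** 2 for a in h_meas) * sum(abs(b) ** 2 for b in h_pred)
    return num / den

def fdac_objective(frf_meas, frf_pred):
    """One possible updating objective: 1 - average FDAC over frequency
    lines, driven toward zero as the FE model matches the measurements."""
    vals = [fdac(m, p) for m, p in zip(frf_meas, frf_pred)]
    return 1.0 - sum(vals) / len(vals)
```

Because FDAC is insensitive to an overall scale factor, it tolerates amplitude mismatch while still penalizing shape differences, which is why it suits FRF-based updating.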

  11. Automated finite element updating using strain data for the lifetime reliability assessment of bridges

    International Nuclear Information System (INIS)

    Okasha, Nader M.; Frangopol, Dan M.; Orcesi, André D.

    2012-01-01

    The importance of improving the understanding of the performance of structures over their lifetime under uncertainty with information obtained from structural health monitoring (SHM) has been widely recognized. However, frameworks that efficiently integrate monitoring data into the life-cycle management of structures are yet to be developed. The objective of this paper is to propose and illustrate an approach for updating the lifetime reliability of aging bridges using monitored strain data obtained from crawl tests. It is proposed to use automated finite element model updating techniques as a tool for updating the resistance parameters of the structure. In this paper, the results from crawl tests are used to update the finite element model and, in turn, update the lifetime reliability. The original and updated lifetime reliabilities are computed using advanced computational tools. The approach is illustrated on an existing bridge.

  12. Lower baseline performance but greater plasticity of working memory for carriers of the val allele of the COMT Val¹⁵⁸Met polymorphism.

    Science.gov (United States)

    Bellander, Martin; Bäckman, Lars; Liu, Tian; Schjeide, Brit-Maren M; Bertram, Lars; Schmiedek, Florian; Lindenberger, Ulman; Lövdén, Martin

    2015-03-01

    Little is known about genetic contributions to individual differences in cognitive plasticity. Given that the neurotransmitter dopamine is critical for cognition and associated with cognitive plasticity, we investigated the effects of 3 polymorphisms of dopamine-related genes (LMX1A, DRD2, COMT) on baseline performance and plasticity of working memory (WM), perceptual speed, and reasoning. One hundred one younger and 103 older adults underwent approximately 100 days of cognitive training, and extensive testing before and after training. We analyzed the baseline and posttest data using latent change score models. For working memory, carriers of the val allele of the COMT polymorphism had lower baseline performance and larger performance gains from training than carriers of the met allele. There were no significant effects of the other genes, and no effects in the other cognitive domains. We relate this result to available evidence indicating that met carriers perform better than val carriers in WM tasks taxing maintenance, whereas val carriers perform better at updating tasks. We suggest that val carriers may show larger training gains because updating operations carry greater potential for plasticity than maintenance operations. PsycINFO Database Record (c) 2015 APA, all rights reserved.

  13. A Fully Customized Baseline Removal Framework for Spectroscopic Applications.

    Science.gov (United States)

    Giguere, Stephen; Boucher, Thomas; Carey, C J; Mahadevan, Sridhar; Dyar, M Darby

    2017-07-01

    The task of proper baseline or continuum removal is common to nearly all types of spectroscopy. Its goal is to remove any portion of a signal that is irrelevant to features of interest while preserving any predictive information. Despite the importance of baseline removal, median or guessed default parameters are commonly employed, often using commercially available software supplied with instruments. Several published baseline removal algorithms have been shown to be useful for particular spectroscopic applications but their generalizability is ambiguous. The new Custom Baseline Removal (Custom BLR) method presented here generalizes the problem of baseline removal by combining operations from previously proposed methods to synthesize new correction algorithms. It creates novel methods for each technique, application, and training set, discovering new algorithms that maximize the predictive accuracy of the resulting spectroscopic models. In most cases, these learned methods either match or improve on the performance of the best alternative. Examples of these advantages are shown for three different scenarios: quantification of components in near-infrared spectra of corn and laser-induced breakdown spectroscopy data of rocks, and classification/matching of minerals using Raman spectroscopy. Software to implement this optimization is available from the authors. By removing subjectivity from this commonly encountered task, Custom BLR is a significant step toward completely automatic and general baseline removal in spectroscopic and other applications.
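One of the primitive operations a framework like this composes is iterative polynomial baseline fitting with clipping (fit, clip the spectrum down to the fit so peaks stop biasing it, refit). The sketch below uses a linear fit for brevity; it illustrates the building block, not the Custom BLR algorithm itself:

```python
def linear_baseline_removal(y, iters=30):
    """Iteratively fit a least-squares line to the spectrum, clip the
    working copy down to the fit so peaks stop pulling the baseline up,
    then subtract the converged baseline from the original signal."""
    n = len(y)
    x = list(range(n))
    work = list(y)
    for _ in range(iters):
        mx = sum(x) / n
        my = sum(work) / n
        sxx = sum((xi - mx) ** 2 for xi in x)
        sxy = sum((xi - mx) * (wi - my) for xi, wi in zip(x, work))
        slope = sxy / sxx
        intercept = my - slope * mx
        fit = [intercept + slope * xi for xi in x]
        work = [min(wi, fi) for wi, fi in zip(work, fit)]
    return [yi - fi for yi, fi in zip(y, fit)]
```

On a sloped baseline with a single peak, the clipping step quickly converges the fit to the baseline, so the corrected signal retains the peak and little else.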

  14. Finite element model updating of natural fibre reinforced composite structure in structural dynamics

    Directory of Open Access Journals (Sweden)

    Sani M.S.M.

    2016-01-01

    Model updating is a process of making adjustments to certain parameters of a finite element model in order to reduce the discrepancy between analytical predictions of the finite element (FE) model and experimental results. Finite element model updating is considered an important field of study, as practical application of the finite element method often shows discrepancy with test results. The aim of this research is to perform a model updating procedure on a composite structure, and to improve the presumed geometrical and material properties of the tested composite structure in the finite element prediction. The composite structure concerned in this study is a plate of kenaf fiber reinforced with epoxy. Modal properties (natural frequency, mode shapes, and damping ratio) of the kenaf fiber structure will be determined using both experimental modal analysis (EMA) and finite element analysis (FEA). In EMA, modal testing will be carried out using an impact hammer test, while normal mode analysis in FEA will be carried out using MSC Nastran/Patran software. Correlation of the data will be carried out before optimizing the data from FEA. Several parameters will be considered and selected for the model updating procedure.

  15. Estimating the Cold War mortgage: The 1995 baseline environmental management report. Volume 1

    International Nuclear Information System (INIS)

    1995-03-01

    This is the first annual report on the activities and potential costs required to address the waste, contamination, and surplus nuclear facilities that are the responsibility of the Department of Energy's Environmental Management program. The Department's Office of Environmental Management, established in 1989, manages one of the largest environmental programs in the world--with more than 130 sites and facilities in over 30 States and territories. The primary focus of the program is to reduce health and safety risks from radioactive waste and contamination resulting from the production, development, and testing of nuclear weapons. The program also is responsible for the environmental legacy from, and ongoing waste management for, nuclear energy research and development, and basic science research. In an attempt to better oversee this effort, Congress required the Secretary of Energy to submit a Baseline Environmental Management Report with annual updates. The 1995 Baseline Environmental Management Report provides life-cycle cost estimates, tentative schedules, and projected activities necessary to complete the Environmental Management program.

  16. Updated Covariance Processing Capabilities in the AMPX Code System

    International Nuclear Information System (INIS)

    Wiarda, Dorothea; Dunn, Michael E.

    2007-01-01

    A concerted effort is in progress within the nuclear data community to provide new cross-section covariance data evaluations to support sensitivity/uncertainty analyses of fissionable systems. The objective of this work is to update processing capabilities of the AMPX library to process the latest Evaluated Nuclear Data File (ENDF)/B formats to generate covariance data libraries for radiation transport software such as SCALE. The module PUFF-IV was updated to allow processing of new ENDF covariance formats in the resolved resonance region. In the resolved resonance region, covariance matrices are given in terms of resonance parameters, which need to be processed into covariance matrices with respect to the group-averaged cross-section data. The parameter covariance matrix can be quite large if the evaluation has many resonances. The PUFF-IV code has recently been used to process an evaluation of 235U, which was prepared in collaboration between Oak Ridge National Laboratory and Los Alamos National Laboratory.
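Processing a resonance-parameter covariance into a group cross-section covariance is the standard "sandwich rule" C_sigma = S C_p S^T, where S holds the sensitivities of the group cross sections to the resonance parameters. A minimal sketch of that propagation (matrix names and sizes illustrative, not from PUFF-IV):

```python
def sandwich(S, Cp):
    """Propagate a parameter covariance Cp (p x p) through a sensitivity
    matrix S (m x p) to the output covariance S Cp S^T (m x m)."""
    m, p = len(S), len(S[0])
    # First product: S Cp, shape m x p
    SCp = [[sum(S[i][k] * Cp[k][j] for k in range(p)) for j in range(p)]
           for i in range(m)]
    # Second product: (S Cp) S^T, shape m x m
    return [[sum(SCp[i][k] * S[j][k] for k in range(p)) for j in range(m)]
            for i in range(m)]
```

The abstract's point about matrix size follows directly: for an evaluation like 235U with many resonances, p is large, so forming and handling C_p dominates the processing cost.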

  17. Recursive Matrix Inverse Update On An Optical Processor

    Science.gov (United States)

    Casasent, David P.; Baranoski, Edward J.

    1988-02-01

    A high accuracy optical linear algebraic processor (OLAP) using the digital multiplication by analog convolution (DMAC) algorithm is described for use in an efficient matrix inverse update algorithm with speed and accuracy advantages. The solution of the parameters in the algorithm is addressed and the advantages of optical over digital linear algebraic processors are advanced.
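The classic recursive rank-one inverse update (Sherman-Morrison) is the kind of algorithm meant here: given A^{-1}, the inverse of A + u v^T is obtained from matrix-vector products alone, exactly the operations an optical processor evaluates efficiently. A plain-Python sketch (the abstract does not specify which update formula the OLAP implements, so this is an illustrative assumption):

```python
def sherman_morrison(Ainv, u, v):
    """Rank-one inverse update: given A^{-1}, return (A + u v^T)^{-1}
    without re-inverting, via (A^{-1} u)(v^T A^{-1}) / (1 + v^T A^{-1} u)."""
    n = len(Ainv)
    Au = [sum(Ainv[i][k] * u[k] for k in range(n)) for i in range(n)]  # A^{-1} u
    vA = [sum(v[k] * Ainv[k][j] for k in range(n)) for j in range(n)]  # v^T A^{-1}
    denom = 1.0 + sum(v[k] * Au[k] for k in range(n))
    return [[Ainv[i][j] - Au[i] * vA[j] / denom for j in range(n)]
            for i in range(n)]
```

Each update costs O(n^2) instead of the O(n^3) of a fresh inversion, which is the speed advantage the recursive formulation buys.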

  18. Updating the 2001 National Land Cover Database Impervious Surface Products to 2006 using Landsat imagery change detection methods

    Science.gov (United States)

    Xian, George; Homer, Collin G.

    2010-01-01

    A prototype method was developed to update the U.S. Geological Survey (USGS) National Land Cover Database (NLCD) 2001 to a nominal date of 2006. NLCD 2001 is widely used as a baseline for national land cover and impervious cover conditions. To enable the updating of this database in an optimal manner, methods are designed to be accomplished by individual Landsat scene. Using conservative change thresholds based on land cover classes, areas of change and no-change were segregated from change vectors calculated from normalized Landsat scenes from 2001 and 2006. By sampling from NLCD 2001 impervious surface in unchanged areas, impervious surface predictions were estimated for changed areas within an urban extent defined by a companion land cover classification. Methods were developed and tested for national application across six study sites containing a variety of urban impervious surface. Results show the vast majority of impervious surface change associated with urban development was captured, with overall RMSE from 6.86 to 13.12% for these areas. Changes of urban development density were also evaluated by characterizing the categories of change by percentile for impervious surface. This prototype method provides a relatively low cost, flexible approach to generate updated impervious surface using NLCD 2001 as the baseline.
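The change/no-change segregation step amounts to thresholding the per-pixel spectral change-vector magnitude between the two normalized dates. A minimal sketch follows; the single fixed threshold is an illustrative simplification, since the method described uses conservative thresholds that vary by land cover class:

```python
import math

def change_mask(date1, date2, threshold):
    """Flag pixels whose spectral change-vector magnitude between two
    normalized image dates (each pixel a tuple of band values)
    exceeds a threshold."""
    mask = []
    for p1, p2 in zip(date1, date2):
        magnitude = math.sqrt(sum((a - b) ** 2 for a, b in zip(p1, p2)))
        mask.append(magnitude > threshold)
    return mask
```

Pixels flagged False keep their NLCD 2001 impervious value; pixels flagged True fall into the changed areas where new impervious estimates are predicted.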

  19. Finite element modelling and updating of friction stir welding (FSW joint for vibration analysis

    Directory of Open Access Journals (Sweden)

    Zahari Siti Norazila

    2017-01-01

    Friction stir welding (FSW) of aluminium alloys is widely used in automotive and aerospace applications due to its advanced and lightweight properties. The behaviour of FSW joints plays a significant role in the dynamic characteristics of a structure, and because of their complexities and uncertainties the representation of an accurate finite element model of these joints remains a research issue. In this paper, various finite element (FE) modelling techniques for predicting the dynamic properties of sheet metal joined by friction stir welding are presented. Nine sets of flat plates of two aluminium alloy series, AA7075 and AA6061, joined by FSW were fabricated using various welding parameters. To find the optimum set of FSW plates, an FE model using an equivalence technique was developed and validated against experimental modal analysis (EMA) of the nine specimen sets and finite element analysis (FEA). Three types of joint modelling were examined in this study: rigid body element Type 2 (RBE2), bar element (CBAR), and spot weld element connector (CWELD). The CBAR element was chosen to represent the FSW joints because it accurately predicts the mode shapes and, unlike the other weld models, contains an updating parameter for weld modelling. Model updating was performed to improve the correlation between EMA and FEA; before updating, a sensitivity analysis was carried out to select the most sensitive updating parameters. After model updating, the total error of the natural frequencies of the CBAR model improved significantly. The CBAR element was therefore selected as the most reliable element in FE to represent an FSW weld joint.

  20. Dynamic model updating based on strain mode shape and natural frequency using hybrid pattern search technique

    Science.gov (United States)

    Guo, Ning; Yang, Zhichun; Wang, Le; Ouyang, Yan; Zhang, Xinping

    2018-05-01

    Aiming at providing a precise dynamic structural finite element (FE) model for dynamic strength evaluation in addition to dynamic analysis, a dynamic FE model updating method is presented to correct the uncertain parameters of the FE model of a structure using strain mode shapes and natural frequencies. The strain mode shape, which is sensitive to local changes in structure, is used instead of the displacement mode for enhancing model updating. The coordinate strain modal assurance criterion is developed to evaluate the correlation level at each coordinate over the experimental and the analytical strain mode shapes. Moreover, the natural frequencies which provide the global information of the structure are used to guarantee the accuracy of modal properties of the global model. Then, the weighted summation of the natural frequency residual and the coordinate strain modal assurance criterion residual is used as the objective function in the proposed dynamic FE model updating procedure. The hybrid genetic/pattern-search optimization algorithm is adopted to perform the dynamic FE model updating procedure. Numerical simulation and a model updating experiment for a clamped-clamped beam are performed to validate the feasibility and effectiveness of the present method. The results show that the proposed method can be used to update the uncertain parameters with good robustness. And the updated dynamic FE model of the beam structure, which can correctly predict both the natural frequencies and the local dynamic strains, is reliable for the following dynamic analysis and dynamic strength evaluation.
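The objective function described, a weighted sum of natural-frequency residuals and strain-mode-shape assurance residuals, can be sketched as follows. The weights and residual forms here are illustrative assumptions; the paper's coordinate-wise criterion is simplified to a whole-vector modal assurance criterion:

```python
def mac(phi_a, phi_e):
    """Modal assurance criterion between analytical and experimental
    (strain) mode shape vectors; 1.0 means identical shapes."""
    num = sum(a * b for a, b in zip(phi_a, phi_e)) ** 2
    return num / (sum(a * a for a in phi_a) * sum(b * b for b in phi_e))

def updating_objective(freqs_fe, freqs_test, shapes_fe, shapes_test,
                       w_freq=1.0, w_shape=1.0):
    """Weighted sum of relative frequency residuals and (1 - strain MAC)
    residuals, the quantity the hybrid genetic/pattern-search optimizer
    would minimize over the uncertain FE parameters."""
    r_freq = sum(abs(fa - fe) / fe for fa, fe in zip(freqs_fe, freqs_test))
    r_shape = sum(1.0 - mac(pa, pe) for pa, pe in zip(shapes_fe, shapes_test))
    return w_freq * r_freq + w_shape * r_shape
```

A perfect match drives the objective to zero; the frequency term anchors the global modal properties while the strain-MAC term penalizes local shape mismatch.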

  1. Analysis of factors affecting baseline SF-36 Mental Component Summary in Adult Spinal Deformity and its impact on surgical outcomes.

    Science.gov (United States)

    Mmopelwa, Tiro; Ayhan, Selim; Yuksel, Selcen; Nabiyev, Vugar; Niyazi, Asli; Pellise, Ferran; Alanay, Ahmet; Sanchez Perez Grueso, Francisco Javier; Kleinstuck, Frank; Obeid, Ibrahim; Acaroglu, Emre

    2018-03-01

    To identify the factors that affect the SF-36 mental component summary (MCS) in patients with adult spinal deformity (ASD) at the time of presentation, and to analyse the effect of SF-36 MCS on clinical outcomes in surgically treated patients. Prospectively collected data from a multicentric ASD database were analysed for baseline parameters. Then, the same database was analysed for surgically treated patients with a minimum of 1-year follow-up to see the effect of baseline SF-36 MCS on treatment results. A clinically useful SF-36 MCS threshold was determined by ROC curve analysis. A total of 229 patients with the baseline parameters were analysed. Strong correlations between SF-36 MCS and SRS-22, ODI, gender, and diagnosis were found. The determinants of baseline SF-36 MCS in an ASD population are other HRQOL parameters such as SRS-22 and ODI as well as baseline thoracic kyphosis and gender. This study has also demonstrated that baseline SF-36 MCS does not necessarily have any effect on the treatment results of surgery as assessed by SRS-22 or ODI. Level III, prognostic study. Copyright © 2018 Turkish Association of Orthopaedics and Traumatology. Production and hosting by Elsevier B.V. All rights reserved.
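
The ROC-based choice of a clinically useful score cutoff can be illustrated with a small numpy sketch using Youden's J statistic; the score distributions, group sizes, and labels below are invented for illustration and are not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical SF-36 MCS scores: non-responders (label 1) score lower on
# average than responders (label 0).
scores = np.concatenate([rng.normal(38, 8, 120), rng.normal(50, 8, 109)])
labels = np.concatenate([np.ones(120), np.zeros(109)])

def youden_cutoff(scores, labels):
    # Sweep candidate thresholds; "positive" means score at or below threshold.
    best_j, best_t = -1.0, None
    for t in np.unique(scores):
        pred = scores <= t
        tp = np.sum(pred & (labels == 1))
        fn = np.sum(~pred & (labels == 1))
        tn = np.sum(~pred & (labels == 0))
        fp = np.sum(pred & (labels == 0))
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        j = sens + spec - 1.0          # Youden's J = sensitivity + specificity - 1
        if j > best_j:
            best_j, best_t = j, t
    return best_t, best_j

cutoff, j = youden_cutoff(scores, labels)
```

The cutoff maximizing J is the point on the ROC curve farthest above the chance diagonal, one common definition of a "clinically useful" threshold.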

  2. Update of the Dutch manual for costing studies in health care.

    Directory of Open Access Journals (Sweden)

    Tim A Kanters

    Full Text Available Dutch health economic guidelines include a costing manual, which describes the preferred research methodology for costing studies and reference prices to ensure high-quality studies and comparability between study outcomes. This paper describes the most important revisions of the costing manual compared to the previous version. An online survey was sent out to potential users of the costing manual to identify topics for improvement. The costing manual was aligned with contemporary health economic guidelines. All methodology sections and parameter values needed for costing studies, particularly reference prices, were updated. An expert panel of health economists was consulted several times during the review process. The revised manual was reviewed by two members of the expert panel and by reviewers of the Dutch Health Care Institute. The majority of survey respondents were satisfied with the content and usability of the existing costing manual. Respondents recommended updating reference prices and adding some commonly needed reference prices. Cost categories were adjusted to the international standard: (1) costs within the health care sector; (2) patient and family costs; and (3) costs in other sectors. Reference prices were updated to reflect 2014 values. The methodology chapter was rewritten to match the requirements of the costing manual and the preferences of the users. Reference prices for nursing days on specific wards, for diagnostic procedures and for nurse practitioners were added. The usability of the costing manual was increased and parameter values were updated. The costing manual became integrated in the new health economic guidelines.

  3. Finite element model updating of a small steel frame using neural networks

    International Nuclear Information System (INIS)

    Zapico, J L; González, M P; Alonso, R; González-Buelga, A

    2008-01-01

    This paper presents an experimental and analytical dynamic study of a small-scale steel frame. The experimental model was physically built and dynamically tested on a shaking table in a series of different configurations obtained from the original one by changing the mass and by causing structural damage. Finite element modelling and parameterization with physical meaning were iteratively tried for the original undamaged configuration. The finite element model is updated through a neural network, with the natural frequencies of the model as the network input. The updating process is made more accurate and robust by using a regressive procedure, which constitutes an original contribution of this work. A novel simplified analytical model has been developed to evaluate the reduction in bending stiffness of the elements due to damage. The experimental results of the remaining configurations have been used to validate both the updated finite element model and the analytical one. The statistical properties of the identified modal data are evaluated. From these, the statistical properties and a confidence interval for the estimated model parameters are obtained by using the Latin Hypercube sampling technique. The results obtained are successful: the updated model accurately reproduces the low modes identified experimentally for all configurations, and the statistical study of the transmission of errors yields a narrow confidence interval for all the identified parameters.
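
The Latin Hypercube propagation step mentioned above can be sketched as follows; the frequency statistics and the toy frequency-to-stiffness map are assumptions for illustration (the paper's frame model is not reproduced):

```python
import numpy as np
from scipy.stats import norm, qmc

# Hypothetical scatter of two experimentally identified natural frequencies (Hz).
f_mean = np.array([12.0, 31.0])
f_sd = np.array([0.10, 0.25])

# Latin Hypercube sample in [0,1)^2, mapped to Gaussian frequency scatter.
sampler = qmc.LatinHypercube(d=2, seed=0)
u = sampler.random(n=1000)
f_samples = norm.ppf(u, loc=f_mean, scale=f_sd)

# Toy identified parameter: modal stiffness k = (2*pi*f)^2 for unit modal mass,
# standing in for an updated physical parameter of the FE model.
k_samples = (2.0 * np.pi * f_samples[:, 0]) ** 2
lo, hi = np.percentile(k_samples, [2.5, 97.5])   # 95% confidence interval
```

Stratifying each input dimension in this way covers the parameter scatter more evenly than plain Monte Carlo for the same number of model evaluations, which is why it suits repeated FE runs.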

  4. Stack Characterization in CryoSat Level1b SAR/SARin Baseline C

    Science.gov (United States)

    Scagliola, Michele; Fornari, Marco; Di Giacinto, Andrea; Bouffard, Jerome; Féménias, Pierre; Parrinello, Tommaso

    2015-04-01

    CryoSat was launched on 8 April 2010 and is the first European ice mission dedicated to monitoring precise changes in the thickness of polar ice sheets and floating sea ice. CryoSat is the first altimetry mission operating in SAR mode, and it carries an innovative radar altimeter called the Synthetic Aperture Interferometric Altimeter (SIRAL), which transmits pulses at a high pulse repetition frequency, thus making the received echoes phase coherent and suitable for azimuth processing. The current CryoSat IPF (Instrument Processing Facility), Baseline B, was released into operations in February 2012. After more than two years of development, the release into operations of Baseline C is expected in the first half of 2015. It is worth recalling here that the CryoSat SAR/SARin IPF1 generates 20 Hz waveforms at an approximately equally spaced set of ground locations on the Earth's surface, i.e. surface samples, and that a surface sample gathers a collection of single-look echoes coming from the processed bursts during the time of visibility. Thus, for a given surface sample, the stack can be defined as the collection of all the single-look echoes pointing to the current surface sample, after applying all the necessary range corrections. The L1B product contains the power average of all the single-look echoes in the stack: the multi-looked L1B waveform. This reduces the data volume, while removing some information contained in the single looks that is useful for characterizing the surface and modelling the L1B waveform. To recover such information, a set of parameters has been added to the L1B product: the stack characterization, or beam behaviour, parameters. The stack characterization, already included in previous Baselines, has been reviewed and expanded in Baseline C. This poster describes all the stack characterization parameters, detailing what they represent and how they are computed. In detail, these parameters can be summarized as: - Stack

  5. Earth matter effects at very long baselines and the neutrino mass hierarchy

    International Nuclear Information System (INIS)

    Gandhi, Raj; Ghoshal, Pomita; Goswami, Srubabati; Mehta, Poonam; Sankar, S. Uma

    2006-01-01

    We study matter effects which arise in the muon neutrino oscillation and survival probabilities relevant to atmospheric neutrino and very long baseline (>4000 km) beam experiments. The interrelations between the three probabilities P_μe, P_μτ, and P_μμ are examined. It is shown that large and observable sensitivity to the neutrino mass hierarchy can be present in P_μμ and P_μτ. We emphasize that at baselines >7000 km, matter effects in P_μτ are important under certain conditions and can be large. The muon survival rates in experiments with very long baselines thus depend on matter effects in both P_μτ and P_μe. We also indicate where these effects provide sensitivity to θ_13 and identify ranges of energies and baselines where this sensitivity is maximum. The effect of parameter degeneracies in the three probabilities at these baselines and energies is studied in detail, and large parts of the parameter space are identified which are free from these degeneracies. In the second part of the paper, we focus on using the matter effects studied in the first part as a means of determining the mass hierarchy via atmospheric neutrinos. Realistic event rate calculations are performed for a charge-discriminating 100 kT iron calorimeter which demonstrate the possibility of realizing this very important goal in neutrino physics. It is shown that for atmospheric neutrinos, a careful selection of energy and baseline ranges is necessary in order to obtain a statistically significant signal, and that the effects are largest in bins where matter effects in both P_μe and P_μτ combine constructively. Under these conditions, up to a 4σ signal for matter effects is possible (for Δ_31 > 0) within a time scale appreciably shorter than the one anticipated for neutrino factories

  6. Baseline values from the electrocardiograms of children and adolescents with ADHD

    Directory of Open Access Journals (Sweden)

    Zhang Shuyu

    2007-09-01

    Full Text Available Abstract Background An important issue in pediatric pharmacology is the determination of whether medications affect cardiac rhythm parameters, in particular the QT interval, which is a surrogate marker for the risk of adverse cardiac events and sudden death. To evaluate changes while on medication, it is useful to have a comparison of age-appropriate values while off medication. The present meta-analysis provides baseline ECG values (i.e., off medication) from approximately 6000 children and adolescents with attention-deficit/hyperactivity disorder (ADHD). Methods Subjects were aged 6–18 years and participated in global trials within the atomoxetine registration program. Patients were administered a 12-lead ECG at study screening and cardiac rhythm parameters were recorded. Baseline QT intervals were corrected for heart rate using 3 different methods: Bazett's, Fridericia's, and a population data-derived formula. Results ECG data were obtained from 5289 North American and 641 non-North American children and adolescents. Means and percentiles are presented for each ECG measure and QTc interval based on pubertal status as defined by age and sex. Prior treatment history with stimulants and racial origin (Caucasian) were each associated with significantly longer mean QTc values. Conclusion Baseline ECG and QTc data from almost 6000 children and adolescents presenting with ADHD are provided to contribute to the knowledge base regarding mean values for pediatric cardiac parameters. Consistent with other studies of the QT interval in children and adolescents, the Bazett correction formula appears to overestimate the prevalence of prolonged QTc in the pediatric population.
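
The Bazett and Fridericia corrections named in the abstract are standard formulas, QTc = QT/RR^(1/2) and QTc = QT/RR^(1/3) respectively, with RR in seconds; a minimal sketch (the study's population data-derived formula is not reproduced, and the example numbers are illustrative):

```python
# QT interval heart-rate corrections. RR is the beat-to-beat interval in
# seconds (60 / heart rate in bpm); QT and QTc are in milliseconds.
def qtc_bazett(qt_ms, rr_s):
    return qt_ms / (rr_s ** 0.5)

def qtc_fridericia(qt_ms, rr_s):
    return qt_ms / (rr_s ** (1.0 / 3.0))

# Example: QT = 360 ms at a heart rate of 100 bpm (RR = 0.6 s).
rr = 60.0 / 100.0
b = qtc_bazett(360.0, rr)       # Bazett correction
f = qtc_fridericia(360.0, rr)   # Fridericia correction
```

At high heart rates (RR < 1 s) Bazett yields the larger QTc of the two, which is consistent with the abstract's observation that it overestimates the prevalence of prolonged QTc in children.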

  7. Frequency Domain Multi-parameter Full Waveform Inversion for Acoustic VTI Media

    KAUST Repository

    Djebbi, Ramzi

    2017-05-26

    Multi-parameter full waveform inversion (FWI) for transversely isotropic (TI) media with a vertical axis of symmetry (VTI) suffers from trade-offs between the parameters. The trade-off results in the leakage of one parameter's update into the other during the inversion, affecting the accuracy and convergence of the inversion. Sensitivity analyses suggested a parameterisation using the horizontal velocity vh, epsilon and eta to reduce the trade-off for surface-recorded seismic data. We test the (vh, epsilon, eta) parameterisation for acoustic VTI media using a scattering integral (SI) based inversion. The data are modeled in the frequency domain and the model is updated using a preconditioned conjugate gradient method. We applied the method to the VTI Marmousi II model; in the inversion, we keep the eta parameter fixed at the initial background model and invert simultaneously for both vh and epsilon. The results show the suitability of the parameterisation for multi-parameter acoustic VTI inversion as well as the accuracy of the inversion approach.

  8. GRI baseline projection of U.S. energy supply and demand to 2010. 1992 edition

    International Nuclear Information System (INIS)

    Holtberg, P.D.; Woods, T.J.; Lihn, M.L.; Koklauner, A.B.

    1992-04-01

    The annual GRI baseline projection is the result of a complex modeling effort that seeks to achieve an internally consistent energy supply and demand outlook across all energy sources and end-use demand sectors. This year's projection includes the adoption of a new petroleum refinery methodology, the incorporation of a new approach to determining electric utility generating capacity heat rates, an extensive update of both the residential and commercial databases and methodologies, and the continued update of the GRI Hydrocarbon Model. The report presents a series of summary tables, sectoral breakdowns of energy demand, and natural gas supply and price trends. The appendices include a discussion of the methodology and assumptions used to prepare the 1992 edition of the projection, an analysis of the potential for higher levels of gas demand, a description of industrial and commercial cogeneration, a description of the independent power producer projection, a comparison of the 1992 edition of the projection with previous GRI projections, and a discussion of additional data used in developing the projection.

  9. An Updated Point Design for Heavy Ion Fusion

    International Nuclear Information System (INIS)

    Yu, S.S.; Meier, W.R.; Abbott, R.B.; Barnard, J.J.; Brown, T.; Callahan, D.A.; Heitzenroeder, P.; Latkowski, J.F.; Logan, B.G.; Pemberton, S.J.; Peterson, P.F.; Rose, D.V.; Sabbi, G.L.; Sharp, W.M.; Welch, D.R.

    2002-01-01

    An updated, self-consistent point design for a heavy ion fusion (HIF) power plant based on an induction linac driver, indirect-drive targets, and a thick liquid wall chamber has been completed. Conservative parameters were selected to allow each design area to meet its functional requirements in a robust manner; this design is therefore referred to as the Robust Point Design (RPD-2002). This paper provides a top-level summary of the major characteristics and design parameters for the target, driver, final focus magnet layout and shielding, chamber, beam propagation to the target, and the overall power plant.

  10. WIMS library up-date project: first stage results

    International Nuclear Information System (INIS)

    Prati, A.; Claro, L.H.

    1990-01-01

    The following benchmarks, TRX1, TRX2, BAPL-UO2-1, BAPL-UO2-2 and BAPL-UO2-3, have been calculated with the WIMSD/4 code as a contribution of CTA/IEAv to the first stage of the WIMS Library Update Project, coordinated by the International Atomic Energy Agency. The card-image input for each benchmark is attached and the major input options/parameters are commented on. The version of the WIMSD/4 code and the multigroup cross-section library used to run the benchmarks are specified. Results for the major integral parameters are presented and discussed. (author)

  11. Monitoring of Physiological Parameters to Predict Exacerbations of Chronic Obstructive Pulmonary Disease (COPD: A Systematic Review

    Directory of Open Access Journals (Sweden)

    Ahmed M. Al Rajeh

    2016-11-01

    Full Text Available Introduction: The value of monitoring physiological parameters to predict chronic obstructive pulmonary disease (COPD) exacerbations is controversial. A few studies have suggested benefit from domiciliary monitoring of vital signs and/or lung function, but there is no existing systematic review. Objectives: To conduct a systematic review of the effectiveness of monitoring physiological parameters to predict COPD exacerbation. Methods: An electronic systematic search compliant with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines was conducted. The search was updated to April 6, 2016. Five databases were examined: Medical Literature Analysis and Retrieval System Online (MEDLINE), Excerpta Medica dataBASE (Embase), Allied and Complementary Medicine Database (AMED), Cumulative Index of Nursing and Allied Health Literature (CINAHL) and the Cochrane clinical trials database. Results: Sixteen articles met the pre-specified inclusion criteria. Fifteen of these articles reported positive results in predicting COPD exacerbation via monitoring of physiological parameters. Nine studies showed a reduction in peripheral oxygen saturation (SpO2) prior to exacerbation onset. Three studies for peak flow, and two studies for respiratory rate, reported a significant variation prior to or at exacerbation onset. A particular challenge is accounting for baseline heterogeneity in parameters between patients. Conclusion: There is currently insufficient information on how physiological parameters vary prior to exacerbation to support routine domiciliary monitoring for the prediction of exacerbations in COPD. However, the method remains promising.

  12. Identification of limiting climatic and geographical variables for the distribution of the tortoise Chelonoidis chilensis (Testudinidae: a baseline for conservation actions

    Directory of Open Access Journals (Sweden)

    Alejandro Ruete

    2015-10-01

    Full Text Available Background. Just as for most other tortoise species, the once common Chaco tortoise, Chelonoidis chilensis (Testudinidae), is under constant threat across its distribution in Argentina, Bolivia and Paraguay. Despite an initial qualitative description of the species' distribution and subsequent individual reports of new locations for the species, there is no description of the species' distribution in probabilistic terms. With this work we aim to produce an updated predictive distribution map for C. chilensis to serve as a baseline management tool for directed strategic conservation planning. Methods. We fitted a spatially expanded logistic regression model within the Bayesian framework that accounts for uncertainty from presence-only data and generated pseudo-absence data in the parameter estimates. We contrast the results with reported data for the national networks of protected areas to assess the inclusion of the species in area-based conservation strategies. Results. We obtained maps with predictions of the occurrence of the species and reported the model's uncertainty spatially. The model suggests that potentially suitable habitats for the species are continuous across Argentina, western Paraguay and southern Bolivia, considering the variables, the scale and the resolution used. The main limiting variables were temperature-related variables and precipitation in the reproductive period. Discussion. Given the alarmingly low density and coverage of protected areas over the distribution area of C. chilensis, the map produced provides a baseline to identify areas where directed strategic conservation management actions would be more efficient for this and other associated species.

  13. A Bayesian consistent dual ensemble Kalman filter for state-parameter estimation in subsurface hydrology

    KAUST Repository

    Ait-El-Fquih, Boujemaa; El Gharamti, Mohamad; Hoteit, Ibrahim

    2016-01-01

    Ensemble Kalman filtering (EnKF) is an efficient approach to addressing uncertainties in subsurface groundwater models. The EnKF sequentially integrates field data into simulation models to obtain a better characterization of the model's state and parameters. These are generally estimated following joint and dual filtering strategies, in which, at each assimilation cycle, a forecast step by the model is followed by an update step with incoming observations. The joint EnKF directly updates the augmented state-parameter vector, whereas the dual EnKF empirically employs two separate filters, first estimating the parameters and then estimating the state based on the updated parameters. To develop a Bayesian consistent dual approach and improve the state-parameter estimates and their consistency, we propose in this paper a one-step-ahead (OSA) smoothing formulation of the state-parameter Bayesian filtering problem from which we derive a new dual-type EnKF, the dual EnKF_OSA. Compared with the standard dual EnKF, it imposes a new update step to the state, which is shown to enhance the performance of the dual approach with almost no increase in the computational cost. Numerical experiments are conducted with a two-dimensional (2-D) synthetic groundwater aquifer model to investigate the performance and robustness of the proposed dual EnKF_OSA, and to evaluate its results against those of the joint and dual EnKFs. The proposed scheme is able to successfully recover both the hydraulic head and the aquifer conductivity, providing further reliable estimates of their uncertainties. Furthermore, it is found to be more robust to different assimilation settings, such as the spatial and temporal distribution of the observations, and the level of noise in the data. Based on our experimental setups, it yields up to 25% more accurate state and parameter estimations than the joint and dual approaches.
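
The dual filtering strategy described above (parameter update first, then a state update with the re-forecast from the updated parameters) can be sketched for a toy scalar AR(1) model; the model, noise levels, and ensemble size are illustrative and are not the paper's groundwater setup:

```python
import numpy as np

rng = np.random.default_rng(1)
a_true, q, r = 0.9, 0.05, 0.1      # AR(1) coefficient, model and obs noise variances
T, N = 300, 200                    # assimilation cycles, ensemble size

# Simulate a "true" trajectory and noisy observations y_k = x_k + v_k.
x, ys = 1.0, []
for _ in range(T):
    x = a_true * x + rng.normal(0.0, np.sqrt(q))
    ys.append(x + rng.normal(0.0, np.sqrt(r)))

def enkf_update(ens, pred_obs, y):
    # Stochastic EnKF update of an ensemble given its predicted observations.
    c = np.cov(ens, pred_obs)
    gain = c[0, 1] / (c[1, 1] + r)
    y_pert = y + rng.normal(0.0, np.sqrt(r), ens.size)   # perturbed observations
    return ens + gain * (y_pert - pred_obs)

a_ens = rng.uniform(0.2, 1.4, N)   # parameter ensemble (unknown AR coefficient)
x_ens = rng.normal(1.0, 1.0, N)    # state ensemble

for y in ys:
    x_fc = a_ens * x_ens + rng.normal(0.0, np.sqrt(q), N)  # forecast
    a_ens = enkf_update(a_ens, x_fc, y)                    # 1) parameter update
    x_fc = a_ens * x_ens + rng.normal(0.0, np.sqrt(q), N)  # re-forecast, updated params
    x_ens = enkf_update(x_fc, x_fc, y)                     # 2) state update

a_hat = float(a_ens.mean())
```

A joint EnKF would instead stack (x, a) into one augmented vector and apply a single update per cycle; the dual scheme's extra re-forecast is the step the paper's OSA formulation puts on a consistent Bayesian footing.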

  14. A Bayesian consistent dual ensemble Kalman filter for state-parameter estimation in subsurface hydrology

    KAUST Repository

    Ait-El-Fquih, Boujemaa

    2016-08-12

    Ensemble Kalman filtering (EnKF) is an efficient approach to addressing uncertainties in subsurface groundwater models. The EnKF sequentially integrates field data into simulation models to obtain a better characterization of the model's state and parameters. These are generally estimated following joint and dual filtering strategies, in which, at each assimilation cycle, a forecast step by the model is followed by an update step with incoming observations. The joint EnKF directly updates the augmented state-parameter vector, whereas the dual EnKF empirically employs two separate filters, first estimating the parameters and then estimating the state based on the updated parameters. To develop a Bayesian consistent dual approach and improve the state-parameter estimates and their consistency, we propose in this paper a one-step-ahead (OSA) smoothing formulation of the state-parameter Bayesian filtering problem from which we derive a new dual-type EnKF, the dual EnKF_OSA. Compared with the standard dual EnKF, it imposes a new update step to the state, which is shown to enhance the performance of the dual approach with almost no increase in the computational cost. Numerical experiments are conducted with a two-dimensional (2-D) synthetic groundwater aquifer model to investigate the performance and robustness of the proposed dual EnKF_OSA, and to evaluate its results against those of the joint and dual EnKFs. The proposed scheme is able to successfully recover both the hydraulic head and the aquifer conductivity, providing further reliable estimates of their uncertainties. Furthermore, it is found to be more robust to different assimilation settings, such as the spatial and temporal distribution of the observations, and the level of noise in the data. Based on our experimental setups, it yields up to 25% more accurate state and parameter estimations than the joint and dual approaches.

  15. Sequential updating of a new dynamic pharmacokinetic model for caffeine in premature neonates.

    Science.gov (United States)

    Micallef, Sandrine; Amzal, Billy; Bach, Véronique; Chardon, Karen; Tourneux, Pierre; Bois, Frédéric Y

    2007-01-01

    Caffeine treatment is widely used in nursing care to reduce the risk of apnoea in premature neonates. To check the therapeutic efficacy of the treatment against apnoea, the caffeine concentration in blood is an important indicator. The present study aimed to build a pharmacokinetic model as the basis for a medical decision support tool. In the proposed model, time dependence of physiological parameters is introduced to describe the rapid growth of neonates. To take into account the large variability in the population, the pharmacokinetic model is embedded in a population structure. The whole model is inferred within a Bayesian framework. To update caffeine concentration predictions as data on an incoming patient are collected, we propose a fast method that can be used in a medical context. This involves the sequential updating of model parameters (at individual and population levels) via a stochastic particle algorithm. Our model provides better predictions than those obtained with previously published models. We show, through an example, that sequential updating improves predictions of caffeine concentration in blood (reduced bias and shorter credibility intervals). The updating of the pharmacokinetic model using body mass and caffeine concentration data is studied; it shows how informative caffeine concentration data are in contrast to body mass data. This study provides the methodological basis to predict the caffeine concentration in blood after a given treatment, if data are collected on the treated neonate.
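
The sequential particle-updating idea can be sketched with a hypothetical one-compartment model: particles representing an individual's pharmacokinetic parameters are reweighted by each incoming concentration measurement and resampled. All doses, parameter values, priors and sampling times below are invented for illustration and are not the paper's neonatal model:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical one-compartment model after an IV dose D (mg), with clearance
# CL (L/h) and distribution volume V (L); concentration in mg/L.
def conc(t, cl, v, dose=10.0):
    return dose / v * np.exp(-cl / v * t)

# "True" individual and noisy measurements at a few sampling times (h).
cl_true, v_true, sigma = 0.01, 0.8, 0.3
t_obs = np.array([6.0, 24.0, 48.0, 96.0])
y_obs = conc(t_obs, cl_true, v_true) + rng.normal(0, sigma, t_obs.size)

# Particle approximation of an assumed population prior over (CL, V).
n = 5000
cl_p = rng.lognormal(np.log(0.012), 0.5, n)
v_p = rng.lognormal(np.log(1.0), 0.3, n)
w = np.full(n, 1.0 / n)

for t, y in zip(t_obs, y_obs):
    # Sequential importance update: reweight particles by the likelihood of
    # the new measurement, then resample to avoid weight degeneracy.
    lik = np.exp(-0.5 * ((y - conc(t, cl_p, v_p)) / sigma) ** 2)
    w = w * lik
    w = w / w.sum()
    idx = rng.choice(n, n, p=w)
    cl_p, v_p = cl_p[idx], v_p[idx]
    w = np.full(n, 1.0 / n)

cl_hat, v_hat = cl_p.mean(), v_p.mean()
```

Each new blood sample sharpens the particle cloud around the individual's parameters, which is what shrinks the bias and credibility intervals of subsequent concentration predictions.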

  16. Meta-Analysis of the Relation of Baseline Right Ventricular Function to Response to Cardiac Resynchronization Therapy.

    Science.gov (United States)

    Sharma, Abhishek; Bax, Jerome J; Vallakati, Ajay; Goel, Sunny; Lavie, Carl J; Kassotis, John; Mukherjee, Debabrata; Einstein, Andrew; Warrier, Nikhil; Lazar, Jason M

    2016-04-15

    Right ventricular (RV) dysfunction has been associated with adverse clinical outcomes in patients with heart failure (HF). Cardiac resynchronization therapy (CRT) improves left ventricular (LV) size and function in patients with markedly abnormal electrocardiographic QRS duration. However, the relation of baseline RV function to response to CRT has not been well described. In this study, we investigate the relation of baseline RV function to response to CRT as assessed by change in LV ejection fraction (EF). A systematic search of studies published from 1966 to May 31, 2015 was conducted using PubMed, CINAHL, Cochrane CENTRAL, and the Web of Science databases. Studies were included if they reported (1) parameters of baseline RV function (tricuspid annular plane systolic excursion [TAPSE], RVEF, RV basal strain, or RV fractional area change [FAC]) and (2) LVEF before and after CRT. Random-effects metaregression was used to evaluate the effect of baseline RV function parameters on the change in LVEF. Sixteen studies (n = 1,764) were selected for the final analysis. Random-effects metaregression analysis showed no significant association between the magnitude of the difference in EF before and after CRT and baseline TAPSE (β = 0.005, p = 0.989), baseline RVEF (β = 0.270, p = 0.493), baseline RVFAC (β = -0.367, p = 0.06), or baseline basal strain (β = -0.342, p = 0.462) after a mean follow-up period of 10.5 months. In conclusion, baseline RV function as assessed by TAPSE, FAC, basal strain, or RVEF does not determine response to CRT as assessed by change in LVEF. Copyright © 2016 Elsevier Inc. All rights reserved.
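
Meta-regression of an effect size on a study-level covariate, as used above, reduces in the fixed-effect case to inverse-variance weighted least squares; a minimal numpy sketch, with study-level numbers invented so that, as in the meta-analysis, the slope is not significant:

```python
import numpy as np

# Hypothetical study-level data: baseline TAPSE (mm, covariate) and the mean
# change in LVEF after CRT (effect size) with its sampling variance per study.
tapse = np.array([14.0, 16.5, 17.0, 18.2, 19.0, 20.5, 21.0, 22.4])
d_lvef = np.array([6.1, 5.8, 6.5, 5.9, 6.3, 6.0, 6.2, 5.7])
var = np.array([0.8, 0.5, 0.6, 0.4, 0.7, 0.5, 0.9, 0.6])

# Fixed-effect meta-regression by weighted least squares with inverse-variance
# weights: d_lvef ~ beta0 + beta1 * tapse.
w = 1.0 / var
X = np.column_stack([np.ones_like(tapse), tapse])
WX = X * w[:, None]
beta = np.linalg.solve(X.T @ WX, WX.T @ d_lvef)

# Slope standard error and z-statistic from the weighted normal equations.
cov_beta = np.linalg.inv(X.T @ WX)
se1 = np.sqrt(cov_beta[1, 1])
z = beta[1] / se1
```

A random-effects meta-regression, as in the paper, additionally estimates a between-study variance tau^2 and adds it to each study's variance before weighting; the weighted fit itself is unchanged.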

  17. Design parameters and source terms: Volume 1, Design parameters: Revision 0

    International Nuclear Information System (INIS)

    1987-09-01

    The Design Parameters and Source Terms Document was prepared in accordance with a DOE request and to provide data for the environmental impact study to be performed in the future for the Deaf Smith County, Texas site for a nuclear waste repository in salt. This document updates a previous unpublished report to the level of the Site Characterization Plan - Conceptual Design Report (SCP-CDR). The previous unpublished SCC Study identified the data needs for the Environmental Assessment effort for seven possible salt repository sites.

  18. Updated Status of the Global Electroweak Fit and Constraints on New Physics

    CERN Document Server

    Baak, M; Haller, J; Hoecker, A; Kennedy, D; Moenig, K; Schott, M; Stelzer, J

    2012-01-01

    We present an update of the Standard Model fit to electroweak precision data. We include the newest experimental results on the top quark mass, the W mass and width, and the Higgs boson mass bounds from LEP, Tevatron and the LHC. We also include a new determination of the electromagnetic coupling strength at the Z pole. We find for the Higgs boson mass (96 +31/-24) GeV and (120 +12/-5) GeV when not including and including the direct Higgs searches, respectively. From the latter fit we indirectly determine the W mass to be (80.362 ± 0.013) GeV. We exploit the data to determine experimental constraints on the oblique vacuum polarisation parameters, and confront these with predictions from the Standard Model (SM) and selected SM extensions. By fitting the oblique parameters to the electroweak data we derive allowed regions in the BSM parameter spaces. We revisit and consistently update these constraints for a fourth fermion generation, two Higgs doublet, inert Higgs and littlest Higgs models, models with lar...

  19. Updated Status of the Global Electroweak Fit and Constraints on New Physics

    CERN Document Server

    Baak, Max; Haller, Johannes; Hoecker, Andreas; Ludwig, Doerthe; Moenig, Klaus; Schott, Matthias; Stelzer, Joerg

    2011-01-01

    We present an update of the Standard Model fit to electroweak precision data. We include the newest experimental results on the top quark mass, the W mass and width, and the Higgs boson mass bounds from LEP, Tevatron and the LHC. We also include a new determination of the electromagnetic coupling strength at the Z pole. We find for the Higgs boson mass (96 +31/-24) GeV and (120 +12/-5) GeV when not including and including the direct Higgs searches, respectively. From the latter fit we indirectly determine the W mass to be (80.359 +0.017/-0.010) GeV. We exploit the data to determine experimental constraints on the oblique vacuum polarisation parameters, and confront these with predictions from the Standard Model (SM) and selected SM extensions. By fitting the oblique parameters to the electroweak data we derive allowed regions in the BSM parameter spaces. We revisit and consistently update these constraints for a fourth family, two Higgs doublet, inert Higgs and littlest Higgs models, models with large,...

  20. analysis and correlation of stability parameters in malting barley

    African Journals Online (AJOL)

    Administrator

    food, feed, medicinal purposes and malt of alcoholic beverages. Stability parameters are ... and animal food, health benefits and malting and brewing in many ..... targeted for environmental conditions which are ... Market Update. Available at ...

  1. Efficient Ensemble State-Parameters Estimation Techniques in Ocean Ecosystem Models: Application to the North Atlantic

    Science.gov (United States)

    El Gharamti, M.; Bethke, I.; Tjiputra, J.; Bertino, L.

    2016-02-01

    Given the recent strong international focus on developing new data assimilation systems for biological models, we present in this comparative study the application of newly developed state-parameter estimation tools to an ocean ecosystem model. It is well known that the available physical models are still too simple compared to the complexity of ocean biology. Furthermore, various biological parameters remain poorly known, and wrong specifications of such parameters can lead to large model errors. The standard joint state-parameter augmentation technique using the ensemble Kalman filter (stochastic EnKF) has been extensively tested in many geophysical applications. Some of these assimilation studies reported that jointly updating the state and the parameters might introduce significant inconsistency, especially for strongly nonlinear models. This is usually the case for ecosystem models, particularly during the period of the spring bloom. A better handling of the estimation problem is often achieved by separating the update of the state and the parameters using the so-called dual EnKF. The dual filter is computationally more expensive than the joint EnKF but is expected to perform more accurately. Using a similar separation strategy, we propose a new EnKF estimation algorithm in which we apply a one-step-ahead smoothing to the state. The new state-parameter estimation scheme is derived in a consistent Bayesian filtering framework and results in separate update steps for the state and the parameters. Unlike the classical filtering path, the new scheme starts with an update step, and a model propagation step is performed afterwards. We test the performance of the new smoothing-based schemes against the standard EnKF in a one-dimensional configuration of the Norwegian Earth System Model (NorESM) in the North Atlantic. We use nutrient profiles (down to 2000 m depth) and surface partial CO2 measurements from the Mike weather station (66° N, 2° E) to estimate

  2. A Fatigue Crack Size Evaluation Method Based on Lamb Wave Simulation and Limited Experimental Data

    Directory of Open Access Journals (Sweden)

    Jingjing He

    2017-09-01

    Full Text Available This paper presents a systematic and general method for Lamb wave-based crack size quantification using finite element simulations and Bayesian updating. The method consists of the construction of a baseline quantification model using finite element simulation data and Bayesian updating with limited Lamb wave data from the target structure. The baseline model correlates two proposed damage-sensitive features, namely the normalized amplitude and phase change, with the crack length through a response surface model. The two damage-sensitive features are extracted from the first received S0 mode wave package. The parameters of the baseline model are estimated using finite element simulation data. To account for uncertainties in numerical modeling, geometry, material and manufacturing between the baseline model and the target model, a Bayesian method is employed to update the baseline model with a few measurements acquired from the actual target structure. A rigorous validation is made using in-situ fatigue testing and Lamb wave data from coupon specimens and realistic lap-joint components. The effectiveness and accuracy of the proposed method are demonstrated under different loading and damage conditions.
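
    The Bayesian-updating step can be illustrated with a simple grid-based posterior over crack length. The linear "response surface" below and its coefficients are hypothetical stand-ins for the paper's simulation-calibrated model; only the update mechanics (prior × likelihood, then normalize) follow the general method.

```python
import math

# Hypothetical baseline response surface from simulations:
# normalized amplitude decreasing with crack length a (in mm).
def baseline_amplitude(a_mm):
    return 1.0 - 0.025 * a_mm

def bayes_update(prior, grid, measured, sigma):
    """Grid-based Bayesian update of the crack-length distribution
    given one measured feature value with Gaussian noise sigma."""
    post = []
    for p, a in zip(prior, grid):
        resid = measured - baseline_amplitude(a)
        like = math.exp(-0.5 * (resid / sigma) ** 2)
        post.append(p * like)
    z = sum(post)
    return [p / z for p in post]

grid = [0.1 * i for i in range(201)]           # crack lengths 0..20 mm
prior = [1.0 / len(grid)] * len(grid)          # flat prior before measurement
post = bayes_update(prior, grid, measured=0.80, sigma=0.02)
a_map = grid[post.index(max(post))]
print(a_map)   # posterior mode, near (1 - 0.80)/0.025 = 8 mm
```

    With several measurements from the target structure, the same update is applied sequentially, each posterior serving as the next prior.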

  3. Precise baseline determination for the TanDEM-X mission

    Science.gov (United States)

    Koenig, Rolf; Moon, Yongjin; Neumayer, Hans; Wermuth, Martin; Montenbruck, Oliver; Jäggi, Adrian

    The TanDEM-X mission will strive to generate a global precise Digital Elevation Model (DEM) by way of bi-static SAR in a close formation of the TerraSAR-X satellite, already launched on June 15, 2007, and the TanDEM-X satellite to be launched in May 2010. Both satellites carry the Tracking, Occultation and Ranging (TOR) payload supplied by the GFZ German Research Centre for Geosciences. The TOR consists of a high-precision dual-frequency GPS receiver, called Integrated GPS Occultation Receiver (IGOR), and a laser retro-reflector (LRR) for precise orbit determination (POD) and atmospheric sounding. The IGOR is of vital importance for the TanDEM-X mission objectives, as millimeter-level determination of the baseline, or distance between the two spacecraft, is needed to derive meter-level accurate DEMs. Within the TanDEM-X ground segment, GFZ is responsible for the operational provision of precise baselines. For this GFZ uses two software chains, first its Earth Parameter and Orbit System (EPOS) software and second the BERNESE software, for backup purposes and quality control. In a concerted effort, the German Aerospace Center (DLR) also generates precise baselines independently with a dedicated Kalman filter approach realized in its FRNS software. By the example of GRACE, the generation of baselines with millimeter accuracy from on-board GPS data can be validated directly by comparing them to the intersatellite K-band range measurements. The K-band ranges are accurate down to the micrometer level and therefore may be considered as truth. Both TanDEM-X baseline providers are able to generate GRACE baselines with sub-millimeter accuracy. By merging the independent GFZ and DLR baselines, the accuracy can be increased even further. The K-band validation however covers solely the along-track component, as the K-band data measure just the distance between the two GRACE satellites. In addition, they exhibit an unknown bias which must be modelled in the comparison, so the
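
    The merging of the two independent baseline solutions can be sketched with inverse-variance weighting, a standard way to combine independent estimates so that the merged uncertainty is smaller than either input. The numbers below are invented for illustration; the abstract does not state the exact merging algorithm used.

```python
def merge_estimates(x1, s1, x2, s2):
    """Inverse-variance weighted combination of two independent
    estimates (x, 1-sigma); the merged sigma is never larger than
    either input sigma."""
    w1, w2 = 1.0 / s1 ** 2, 1.0 / s2 ** 2
    x = (w1 * x1 + w2 * x2) / (w1 + w2)
    s = (1.0 / (w1 + w2)) ** 0.5
    return x, s

# Two hypothetical independent baseline-component solutions (mm).
x, s = merge_estimates(512.30, 1.0, 512.10, 1.0)
print(x, s)   # merged value near 512.2 mm with sigma ~0.71 mm
```

    With equal input sigmas the merged value is the plain average and the uncertainty shrinks by a factor of sqrt(2), which is the sense in which merging the GFZ and DLR baselines "increases" accuracy.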

  4. Retrospective forecast of ETAS model with daily parameters estimate

    Science.gov (United States)

    Falcone, Giuseppe; Murru, Maura; Console, Rodolfo; Marzocchi, Warner; Zhuang, Jiancang

    2016-04-01

    We present a retrospective ETAS (Epidemic Type Aftershock Sequence) model based on the daily updating of free parameters during the background, learning and test phases of a seismic sequence. The idea was born after the 2011 Tohoku-Oki earthquake. The CSEP (Collaboratory for the Study of Earthquake Predictability) Center in Japan provided an appropriate testing benchmark for the five 1-day submitted models. Of all the models, only one was able to successfully predict the number of events that really happened. This result was verified using both the real-time and the revised catalogs. The main cause of the failure was the underestimation of the number of forecast events, due to model parameters being kept fixed during the test. Moreover, the absence in the learning catalog of an event of magnitude comparable to the mainshock (M9.0), which drastically changed the seismicity in the area, made the learning parameters unsuitable for describing the real seismicity. As an example of this methodological development, we show the evolution of the model parameters during the last two strong seismic sequences in Italy: the 2009 L'Aquila and the 2012 Reggio Emilia episodes. The performance of the model with daily updated parameters is compared with that of the same model with parameters kept fixed during the test period.
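
    The quantity that the daily re-estimated parameters feed into is the ETAS conditional intensity: a background rate plus an Omori-Utsu aftershock contribution from every past event, scaled by a magnitude-dependent productivity. The sketch below uses the common 10^(alpha(M-Mc)) productivity form with invented, unfitted parameter values purely for illustration.

```python
def etas_intensity(t, events, mu, K, alpha, c, p, m_c):
    """Conditional intensity lambda(t) of an ETAS model: background
    rate mu plus Omori-Utsu contributions from all events (t_i, M_i)
    that occurred before time t."""
    rate = mu
    for t_i, m_i in events:
        if t_i < t:
            rate += K * 10 ** (alpha * (m_i - m_c)) * (t - t_i + c) ** (-p)
    return rate

# Illustrative (not fitted) catalog and parameters; times in days.
events = [(0.0, 6.0), (0.5, 4.5), (1.2, 5.0)]
params = dict(mu=0.2, K=0.05, alpha=0.8, c=0.01, p=1.1, m_c=3.0)
for t in (1.0, 2.0, 10.0):
    print(round(etas_intensity(t, events, **params), 3))  # decaying rate
```

    Daily updating, as in the paper, means refitting (mu, K, alpha, c, p) to the catalog available each day before issuing the next 1-day forecast, so the intensity tracks the evolving sequence instead of relying on parameters frozen at the start of the test.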

  5. PROPOSAL OF A TABLE TO CLASSIFY THE RELIABILITY OF BASELINES OBTAINED BY GNSS TECHNOLOGY

    Directory of Open Access Journals (Sweden)

    Paulo Cesar Lima Segantine

    Full Text Available The correct processing of GNSS measurements, as well as a correct interpretation of the results, are fundamental factors in analyzing the quality of land surveying works. In that sense, it is important to keep in mind that, although the statistical data provided by the majority of commercial software packages used for GNSS data processing describe the credibility of the work, they do not provide consistent information about the reliability of the processed coordinates. Based on that assumption, this paper proposes a table to classify the reliability of baselines obtained through GNSS data processing. As input data, GNSS measurements were performed during the years 2006 and 2008, considering different seasons of the year, geometric configurations of RBMC stations and baseline lengths. As demonstrated in this paper, parameters such as baseline length, ambiguity solution, PDOP value and the precision of the horizontal and vertical coordinates can be used as reliability parameters. The proposed classification meets the requirements of Brazilian Law No. 10.267/2001 of the National Institute of Colonization and Agrarian Reform (INCRA)
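
    A classification table of this kind reduces, operationally, to a rule that maps the quality parameters of a processed baseline to a reliability class. The sketch below uses the parameters the paper names, but the class labels and every threshold are invented here for illustration; they are not the paper's table.

```python
def classify_baseline(length_km, ambiguities_fixed, pdop, h_prec_m, v_prec_m):
    """Toy reliability classification of a processed GNSS baseline,
    loosely inspired by the parameters the paper proposes. The class
    thresholds below are hypothetical, not taken from the paper."""
    if not ambiguities_fixed:
        return "C"                     # float solution: lowest class
    if length_km <= 20 and pdop <= 4 and h_prec_m <= 0.01 and v_prec_m <= 0.02:
        return "A"                     # short baseline, strong geometry
    if pdop <= 6 and h_prec_m <= 0.05 and v_prec_m <= 0.10:
        return "B"
    return "C"

print(classify_baseline(12.4, True, 2.1, 0.006, 0.012))   # prints A
print(classify_baseline(85.0, True, 5.3, 0.03, 0.06))     # prints B
print(classify_baseline(40.0, False, 3.0, 0.02, 0.04))    # prints C
```

    The design point is that the rule is transparent: a surveyor can see exactly which parameter demoted a baseline, which the paper argues commercial processing reports do not make clear.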

  6. Update of the Nuclear Criticality Slide Rule for the Emergency Response to a Nuclear Criticality Accident

    Science.gov (United States)

    Duluc, Matthieu; Bardelay, Aurélie; Celik, Cihangir; Heinrichs, Dave; Hopper, Calvin; Jones, Richard; Kim, Soon; Miller, Thomas; Troisne, Marc; Wilson, Chris

    2017-09-01

    AWE (UK), IRSN (France), LLNL (USA) and ORNL (USA) began a long-term collaborative effort in 2015 to update the nuclear criticality Slide Rule for the emergency response to a nuclear criticality accident. This document, published almost 20 years ago, gives order-of-magnitude estimates of key parameters, such as the number of fissions and doses (neutron and gamma), useful for emergency response teams and public authorities. This paper presents, first, the motivation and long-term objectives for this update, then an overview of the initial configurations for the updated calculations and preliminary results obtained with modern 3D codes.

  7. Update of the Nuclear Criticality Slide Rule for the Emergency Response to a Nuclear Criticality Accident

    Directory of Open Access Journals (Sweden)

    Duluc Matthieu

    2017-01-01

    Full Text Available AWE (UK), IRSN (France), LLNL (USA) and ORNL (USA) began a long-term collaborative effort in 2015 to update the nuclear criticality Slide Rule for the emergency response to a nuclear criticality accident. This document, published almost 20 years ago, gives order-of-magnitude estimates of key parameters, such as the number of fissions and doses (neutron and gamma), useful for emergency response teams and public authorities. This paper presents, first, the motivation and long-term objectives for this update, then an overview of the initial configurations for the updated calculations and preliminary results obtained with modern 3D codes.

  8. Nonlinear model updating applied to the IMAC XXXII Round Robin benchmark system

    Science.gov (United States)

    Kurt, Mehmet; Moore, Keegan J.; Eriten, Melih; McFarland, D. Michael; Bergman, Lawrence A.; Vakakis, Alexander F.

    2017-05-01

    We consider the application of a new nonlinear model updating strategy to a computational benchmark system. The approach relies on analyzing system response time series in the frequency-energy domain by constructing both Hamiltonian and forced-and-damped frequency-energy plots (FEPs). The system parameters are then characterized and updated by matching the backbone branches of the FEPs with the frequency-energy wavelet transforms of experimental and/or computational time series. The main advantage of this method is that no nonlinearity model is assumed a priori, and the system model is updated based solely on simulated and/or experimentally measured time series. By matching the frequency-energy plots of the benchmark system and its reduced-order model, we show that we are able to retrieve the global strongly nonlinear dynamics in the frequency and energy ranges of interest, identify bifurcations, characterize local nonlinearities, and accurately reconstruct time series. We apply the proposed methodology to a benchmark problem, which was posed to the system identification community prior to the IMAC XXXII (2014) and XXXIII (2015) Conferences as a "Round Robin Exercise on Nonlinear System Identification". We show that we are able to identify the parameters of the nonlinear element in the problem with a priori knowledge about its position.

  9. Updated determination of αs from hadronic τ decays

    International Nuclear Information System (INIS)

    Boito, Diogo; Golterman, Maarten; Jamin, Matthias; Mahdavi, Andisheh; Maltman, Kim; Osborne, James; Peris, Santiago

    2013-01-01

    Recently we introduced a new framework to extract α_s from the non-strange hadronic τ-decay spectral functions, taking into account duality violations and treating nonperturbative effects self-consistently. Using an updated version of the OPAL 1998 data, we obtain α_s^FO(m_τ^2) = 0.325 ± 0.018 and α_s^CI(m_τ^2) = 0.347 ± 0.025 in fixed-order and contour-improved perturbation theory, respectively. The results from our multi-parameter nonlinear fits were corroborated by a Markov-Chain Monte Carlo investigation of the posterior probability distributions of the model parameters

  10. A Bottom-Up Geospatial Data Update Mechanism for Spatial Data Infrastructure Updating

    Science.gov (United States)

    Tian, W.; Zhu, X.; Liu, Y.

    2012-08-01

    Currently, the top-down spatial data update mechanism has made great progress and is widely applied in many spatial data infrastructures (SDIs). However, this mechanism still has some issues. For example, the update schedule is tied to the professional agency's project cycle, which is usually too long for end-users; moving data from collection to publication costs the professional agency too much time and effort; and the geospatial information often lacks sufficient attribute detail. Finding an effective way to deal with these problems has therefore become important. Emerging Internet technology, 3S techniques and the spread of geographic information knowledge among the public have promoted the rapid development of volunteered geospatial information (VGI). VGI is a current research "hotspot", attracting many researchers to study its data quality and credibility, accuracy, sustainability, social benefit, applications and so on. In addition, a few scholars have paid attention to the value of VGI for supporting SDI updating. On that basis, this paper presents a bottom-up update mechanism from VGI to SDI, which includes matching homonymous elements between VGI and SDI vector data, change detection, SDI spatial database updating, and publication of new data products to end-users. The feasibility of the proposed updating cycle is then discussed in depth: it can detect changed elements in time and shorten the update period, provide more accurate geometry and attribute data for the spatial data infrastructure, and support update propagation.
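
    The core of the proposed cycle, matching VGI features against the SDI store and detecting changes, can be sketched with plain dictionaries keyed by a shared identifier. All feature names and attributes below are illustrative, and deletions are omitted for brevity.

```python
# Minimal sketch of the bottom-up update cycle described in the paper:
# match VGI features to SDI features by identifier, detect changes,
# and apply them to the SDI store.

def detect_changes(sdi, vgi):
    """Compare matched features and classify the VGI side as
    added, modified, or unchanged relative to the SDI store."""
    added = {k: v for k, v in vgi.items() if k not in sdi}
    modified = {k: v for k, v in vgi.items() if k in sdi and sdi[k] != v}
    unchanged = [k for k in vgi if k in sdi and sdi[k] == vgi[k]]
    return added, modified, unchanged

sdi = {"road_1": {"geom": [(0, 0), (1, 0)], "name": "Main St"},
       "road_2": {"geom": [(1, 0), (1, 1)], "name": "Oak Ave"}}
vgi = {"road_2": {"geom": [(1, 0), (1, 2)], "name": "Oak Ave"},   # geometry edited
       "road_3": {"geom": [(2, 2), (3, 3)], "name": "New Rd"}}    # newly mapped

added, modified, unchanged = detect_changes(sdi, vgi)
sdi.update(added)
sdi.update(modified)          # propagate detected changes into the SDI store
print(sorted(added), sorted(modified), len(sdi))  # ['road_3'] ['road_2'] 3
```

    In a real pipeline, matching homonymous elements would use geometric similarity rather than a shared key, and the modified features would pass quality checks before publication; this sketch only shows where change detection sits in the cycle.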

  11. Baseline recommendations for greenhouse gas mitigation projects in the electric power sector

    Energy Technology Data Exchange (ETDEWEB)

    Kartha, Sivan; Lazarus, Michael [Stockholm Environment Institute/Tellus Institute, Boston, MA (United States); Bosi, Martina [International Energy Agency, Paris, 75 (France)

    2004-03-01

    The success of the Clean Development Mechanism (CDM) and other credit-based emission trading regimes depends on effective methodologies for quantifying a project's emissions reductions. The key methodological challenge lies in estimating a project's counterfactual emission baseline while balancing the need for accuracy, transparency, and practicality. Baseline standardisation (e.g. of methodology, parameters and/or emission rates) can be a means to achieve these goals. This paper compares specific options for developing standardised baselines for the electricity sector - a natural starting point for baseline standardisation given the magnitude of the emissions reduction opportunities. The authors review fundamental assumptions that baseline studies have made with respect to estimating the generation sources avoided by CDM or other emission-reducing projects. Typically, studies have assumed that such projects affect either the operation of existing power plants (the operating margin) or the construction of new generation facilities (the build margin). The authors show that both effects are important to consider and thus recommend a combined margin approach for most projects, based on grid-specific data. They propose a three-category framework, according to projects' relative scale and environmental risk. (Author)
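
    The recommended combined margin is simply a weighted average of the operating-margin and build-margin emission factors. The sketch below uses equal weights, a common default in CDM practice, and invented grid numbers, not data from the paper.

```python
def combined_margin(ef_om, ef_bm, w_om=0.5, w_bm=0.5):
    """Combined-margin grid emission factor: a weighted average of the
    operating margin (existing plants displaced) and the build margin
    (new construction avoided)."""
    return w_om * ef_om + w_bm * ef_bm

# Illustrative grid emission factors (tCO2/MWh), not from the paper.
ef = combined_margin(ef_om=0.85, ef_bm=0.60)
print(ef)                    # combined factor, midway between OM and BM

reductions = 50_000 * ef     # a hypothetical 50 GWh/yr project's reductions
print(reductions)            # estimated tCO2 avoided per year
```

    The weights are the policy lever: shifting weight toward the build margin is appropriate for projects expected mainly to displace new capacity, which is why the paper ties the choice to project scale and risk.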

  12. Working Memory Updating Latency Reflects the Cost of Switching between Maintenance and Updating Modes of Operation

    Science.gov (United States)

    Kessler, Yoav; Oberauer, Klaus

    2014-01-01

    Updating and maintenance of information are 2 conflicting demands on working memory (WM). We examined the time required to update WM (updating latency) as a function of the sequence of updated and not-updated items within a list. Participants held a list of items in WM and updated a variable subset of them in each trial. Four experiments that vary…

  13. FRMAC Updates

    International Nuclear Information System (INIS)

    Mueller, P.

    1995-01-01

    This talk describes updates in the following FRMAC publications concerning radiation emergencies: Monitoring and Analysis Manual; Evaluation and Assessment Manual; Handshake Series (biannual), including exercises participated in; Environmental Data and Instrument Transmission System (EDITS); Plume in a Box, with all radiological data stored on a hand-held computer; and courses given

  14. MOVES sensitivity analysis update : Transportation Research Board Summer Meeting 2012 : ADC-20 Air Quality Committee

    Science.gov (United States)

    2012-01-01

    OVERVIEW OF PRESENTATION : Evaluation Parameters : EPA's Sensitivity Analysis : Comparison to Baseline Case : MOVES Sensitivity Run Specification : MOVES Sensitivity Input Parameters : Results : Uses of Study

  15. Circuit realization, chaos synchronization and estimation of parameters of a hyperchaotic system with unknown parameters

    Directory of Open Access Journals (Sweden)

    A. Elsonbaty

    2014-10-01

    Full Text Available In this article, the adaptive chaos synchronization technique is implemented by an electronic circuit and applied to the hyperchaotic system proposed by Chen et al. We consider the more realistic and practical case where all the parameters of the master system are unknown. We propose and implement an electronic circuit that performs the estimation of the unknown parameters and the updating of the parameters of the slave system automatically, and hence achieves synchronization. To the best of our knowledge, this is the first attempt to implement a circuit that estimates the values of the unknown parameters of a chaotic system and achieves synchronization. The proposed circuit has a variety of suitable real applications related to chaos encryption and cryptography. The outputs of the implemented circuits and numerical simulation results are shown to demonstrate the performance of the synchronized system and the proposed circuit.
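
    The adaptive idea, a slave that estimates an unknown master parameter on-line while synchronizing, can be shown in its simplest scalar form. The paper's circuit implements a hyperchaotic Chen-type system; the driven scalar system and all gains below are a deliberately simplified illustration of the same update-law structure.

```python
import math

a_true = 2.0            # unknown master parameter to be estimated
gamma, k, dt = 5.0, 5.0, 1e-3

x, y, a_hat, t = 0.0, 0.0, 0.0, 0.0
for _ in range(int(50 / dt)):           # Euler integration over 50 s
    phi = math.sin(t)                   # persistently exciting regressor
    e = x - y                           # synchronization error
    x += dt * a_true * phi              # master dynamics
    y += dt * (a_hat * phi + k * e)     # slave with feedback control
    a_hat += dt * gamma * e * phi       # adaptive parameter-update law
    t += dt

print(a_hat)   # estimate driven toward the true value 2.0
```

    A Lyapunov function V = e^2/2 + (a_true - a_hat)^2/(2*gamma) gives dV/dt = -k e^2 <= 0, so the error decays, and the persistently exciting regressor sin(t) forces the parameter estimate itself to converge; the circuit in the paper realizes the same update law with analog components.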

  16. Forest Resources of the United States, 2012: a technical document supporting the Forest Service 2010 update of the RPA Assessment

    Science.gov (United States)

    Sonja N. Oswalt; W. Brad Smith; Patrick D. Miles; Scott A. Pugh

    2014-01-01

    Forest resource statistics from the 2010 Resources Planning Act (RPA) Assessment were updated to provide current information on the Nation's forests as a baseline for the 2015 national assessment. Resource tables present estimates of forest area, volume, mortality, growth, removals, and timber products output in various ways, such as by ownership, region, or State...

  17. Sensitivity Analysis of the Influence of Structural Parameters on Dynamic Behaviour of Highly Redundant Cable-Stayed Bridges

    Directory of Open Access Journals (Sweden)

    B. Asgari

    2013-01-01

    Full Text Available The model tuning through sensitivity analysis is a prominent procedure to assess the structural behavior and dynamic characteristics of cable-stayed bridges. Most of the previous sensitivity-based model tuning methods are automatic iterative processes; however, the results of recent studies show that the most reasonable results are achievable by applying the manual methods to update the analytical model of cable-stayed bridges. This paper presents a model updating algorithm for highly redundant cable-stayed bridges that can be used as an iterative manual procedure. The updating parameters are selected through the sensitivity analysis which helps to better understand the structural behavior of the bridge. The finite element model of Tatara Bridge is considered for the numerical studies. The results of the simulations indicate the efficiency and applicability of the presented manual tuning method for updating the finite element model of cable-stayed bridges. The new aspects regarding effective material and structural parameters and model tuning procedure presented in this paper will be useful for analyzing and model updating of cable-stayed bridges.
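
    The sensitivity analysis used to select updating parameters can be illustrated on a toy structure: compute each natural frequency, perturb one stiffness at a time, and form finite-difference sensitivities. The 2-DOF spring-mass chain below is a generic stand-in, not the Tatara Bridge model, and all values are invented.

```python
import math

def natural_frequencies(k1, k2, m1, m2):
    """Natural frequencies (Hz) of a 2-DOF spring-mass chain
    (ground-k1-m1-k2-m2): solve det(K - w^2 M) = 0, a quadratic
    in lambda = w^2."""
    a = m1 * m2
    b = -(m1 * k2 + m2 * (k1 + k2))
    c = k1 * k2
    disc = math.sqrt(b * b - 4 * a * c)
    lams = sorted([(-b - disc) / (2 * a), (-b + disc) / (2 * a)])
    return [math.sqrt(lam) / (2 * math.pi) for lam in lams]

def sensitivities(k1, k2, m1, m2, rel_step=1e-6):
    """Finite-difference sensitivity of each frequency to each
    stiffness: the information used to pick updating parameters."""
    f0 = natural_frequencies(k1, k2, m1, m2)
    out = {}
    for name, (dk1, dk2) in {"k1": (k1 * rel_step, 0.0),
                             "k2": (0.0, k2 * rel_step)}.items():
        f1 = natural_frequencies(k1 + dk1, k2 + dk2, m1, m2)
        step = dk1 or dk2
        out[name] = [(fb - fa) / step for fa, fb in zip(f0, f1)]
    return out

sens = sensitivities(k1=2.0e6, k2=1.0e6, m1=1000.0, m2=800.0)
for p, s in sens.items():
    print(p, [f"{v:.2e}" for v in s])   # df_i/dk_j > 0: stiffer -> higher frequency
```

    Parameters to which the measured modes are most sensitive are the ones worth adjusting manually; insensitive parameters are left alone, which is the reasoning behind the sensitivity-guided manual tuning the paper advocates.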

  18. Updated Simulation Studies of Damage Limit of LHC Tertiary Collimators

    CERN Document Server

    AUTHOR|(CDS)2085459; Bertarelli, Alessandro; Bruce, Roderik; Carra, Federico; Cerutti, Francesco; Gradassi, Paolo; Lechner, Anton; Redaelli, Stefano; Skordis, Eleftherios

    2015-01-01

    The tertiary collimators (TCTs) in the LHC, installed in front of the experiments, intercept in standard operation a fraction of about 10^-3 of the halo particles. However, they risk being hit by high-intensity primary beams in case of an asynchronous beam dump. TCT damage thresholds were initially inferred from the results of destructive tests on a TCT jaw, supported by numerical simulations, assuming simplified impact scenarios with one single bunch hitting the jaw with a given impact parameter. In this paper, more realistic failure conditions, including a train of bunches and taking into account the full collimation hierarchy, are used to derive updated damage limits. The results are used to update the margins in the collimation hierarchy and could thus potentially have an influence on LHC performance.

  19. Memory updating and mental arithmetic

    Directory of Open Access Journals (Sweden)

    Cheng-Ching Han

    2016-02-01

    Full Text Available Is domain-general memory updating ability predictive of calculation skills or are such skills better predicted by the capacity for updating specifically numerical information? Here, we used multidigit mental multiplication (MMM) as a measure for calculating skill as this operation requires the accurate maintenance and updating of information in addition to skills needed for arithmetic more generally. In Experiment 1, we found that only individual differences with regard to a task updating numerical information following addition (MUcalc) could predict the performance of MMM, perhaps owing to common elements between the task and MMM. In Experiment 2, new updating tasks were designed to clarify this: a spatial updating task with no numbers, a numerical task with no calculation, and a word task. The results showed that both MUcalc and the spatial task were able to predict the performance of MMM but only with the more difficult problems, while other updating tasks did not predict performance. It is concluded that relevant processes involved in updating the contents of working memory support mental arithmetic in adults.

  20. Base-line studies for DAE establishments

    International Nuclear Information System (INIS)

    Puranik, V.D.

    2012-01-01

    to ensure that the seasonal variations in parameters are considered. The data is generated for an area covering at least a 30 km radius around the proposed location of the facility, in a manner such that very dense data is generated close to the location and it becomes gradually sparser for the distant areas. Baseline data is generated with the help of local universities and institutions under constant interaction and supervision of the departmental personnel. The work is carried out as per the set protocols laid down by the department for such purposes. The protocols conform to international practices for carrying out such work. The studies include measurement of concentrations of naturally occurring and man-made radionuclides and also heavy toxic metals and other pollutants in various environmental matrices such as air, sub-soil water, surface water, soil, sediment, biota and locally consumed food items including meat, fish, milk, eggs, vegetables and cereals. Studies on the density and variety of flora and fauna in the region are carried out. Health status and demographic status are recorded in detail. Hydrogeological studies are carried out to establish ground water movement at the location. Based on the data so generated, a Remote Sensing and Geographic Information System is prepared to collate the data. For coastal locations, studies of the nearby marine environment are also carried out. The baseline data is a valuable record of the environmental status of a location prevailing before the start of the departmental activity. Its importance is twofold: firstly, it cannot be generated after the start of the activity at the given location; and secondly, it is the most authentic data set which can be referred to later to assess the environmental impact of the facility by evaluating the changes in the environmental parameters, if any. (author)

  1. Finite element model updating of a prestressed concrete box girder bridge using subproblem approximation

    Science.gov (United States)

    Chen, G. W.; Omenzetter, P.

    2016-04-01

    This paper presents the implementation of an updating procedure for the finite element model (FEM) of a prestressed concrete continuous box-girder highway off-ramp bridge. Ambient vibration testing was conducted to excite the bridge, assisted by linear chirp sweeps induced by two small electrodynamic shakers deployed to enhance the excitation levels, since the bridge was closed to traffic. The data-driven stochastic subspace identification method was executed to recover the modal properties from measurement data. An initial FEM was developed and the correlation between the experimental modal results and their analytical counterparts was studied. Modelling of the pier and abutment bearings was carefully adjusted to reflect the real operational conditions of the bridge. The subproblem approximation method was subsequently utilized to automatically update the FEM. For this purpose, the influences of bearing stiffness, and of the mass density and Young's modulus of materials, were examined as uncertain parameters using sensitivity analysis. The updating objective function was defined as a summation of squared relative errors of natural frequencies between the FEM and experimentation. All the identified modes were used as the target responses with the purpose of imposing more constraints on the optimization process and decreasing the number of potentially feasible combinations of parameter changes. The updated FEM of the bridge was able to produce sufficient improvements in natural frequencies in most modes of interest, and can serve for a more precise dynamic response prediction or future investigation of the bridge health.
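
    The updating objective described here, a sum of squared relative natural-frequency errors, is easy to write down and minimize. In the sketch below a single hypothetical stiffness scale factor shifts all model frequencies as sqrt(s); the frequencies and the parameterization are invented stand-ins for the bridge FEM, and a coarse grid search stands in for the subproblem approximation method.

```python
import math

def objective(f_fem, f_exp):
    """Sum of squared relative natural-frequency errors between the
    model and the measurements, as in the paper's updating objective."""
    return sum(((a - b) / b) ** 2 for a, b in zip(f_fem, f_exp))

# Toy stand-in for the FEM: one stiffness scale factor s shifts all
# model frequencies as f_i(s) = f_i(1) * sqrt(s) (hypothetical).
base = [1.10, 3.05, 5.40]          # model frequencies at s = 1 (Hz)
f_exp = [1.18, 3.22, 5.71]         # "measured" frequencies (Hz)

best_s = min((objective([f * math.sqrt(s / 100) for f in base], f_exp), s / 100)
             for s in range(80, 141))[1]
print(best_s)   # stiffness scale that best reconciles model and test
```

    Using all identified modes in the sum, as the paper does, constrains the optimizer: a scale factor that fixes one frequency but worsens the others is penalized, shrinking the set of feasible parameter combinations.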

  2. Annual Technology Baseline (Including Supporting Data); NREL (National Renewable Energy Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    Blair, Nate; Cory, Karlynn; Hand, Maureen; Parkhill, Linda; Speer, Bethany; Stehly, Tyler; Feldman, David; Lantz, Eric; Augustine, Chad; Turchi, Craig; O'Connor, Patrick

    2015-07-08

    Consistent cost and performance data for various electricity generation technologies can be difficult to find and may change frequently for certain technologies. With the Annual Technology Baseline (ATB), the National Renewable Energy Laboratory provides an organized and centralized dataset that was reviewed by internal and external experts. It uses the best information from the Department of Energy laboratory's renewable energy analysts and Energy Information Administration information for conventional technologies. The ATB will be updated annually in order to provide an up-to-date repository of current and future cost and performance data. Going forward, we plan to revise and refine the values using the best available information. The ATB includes both a presentation with notes (PDF) and an associated Excel workbook. The ATB covers the following electricity generation technologies: land-based wind; offshore wind; utility-scale solar PV; concentrating solar power; geothermal power; hydropower plants (upgrades to existing facilities, powering non-powered dams, and new stream-reach development); conventional coal; coal with carbon capture and sequestration; integrated gasification combined cycle coal; natural gas combustion turbines; natural gas combined cycle; conventional biopower; and nuclear.

  3. An Iterative Ensemble Kalman Filter with One-Step-Ahead Smoothing for State-Parameters Estimation of Contaminant Transport Models

    KAUST Repository

    Gharamti, M. E.

    2015-05-11

    The ensemble Kalman filter (EnKF) is a popular method for state-parameter estimation of subsurface flow and transport models based on field measurements. The common filtering procedure is to directly update the state and parameters as one single vector, which is known as the Joint-EnKF. In this study, we follow the one-step-ahead smoothing formulation of the filtering problem to derive a new joint-based EnKF which involves a smoothing step of the state between two successive analysis steps. The new state-parameter estimation scheme is derived in a consistent Bayesian filtering framework and results in separate update steps for the state and the parameters. This new algorithm bears a strong resemblance to the Dual-EnKF, but unlike the latter, which first propagates the state with the model and then updates it with the new observation, the proposed scheme starts with an update step, followed by a model integration step. We exploit this new formulation of the joint filtering problem and propose an efficient model-integration-free iterative procedure on the update step of the parameters only, for further improved performance. Numerical experiments are conducted with a two-dimensional synthetic subsurface transport model simulating the migration of a contaminant plume in a heterogeneous aquifer domain. Contaminant concentration data are assimilated to estimate both the contaminant state and the hydraulic conductivity field. Assimilation runs are performed under imperfect modeling conditions and various observational scenarios. Simulation results suggest that the proposed scheme efficiently recovers both the contaminant state and the aquifer conductivity, providing more accurate estimates than the standard Joint and Dual EnKFs in all tested scenarios. Iterating on the update step of the new scheme further enhances the proposed filter's behavior. In terms of computational cost, the new Joint-EnKF is almost equivalent to that of the Dual-EnKF, but requires twice more model

  4. FEM Updating of Tall Buildings using Ambient Vibration Data

    DEFF Research Database (Denmark)

    Ventura, C. E.; Lord, J. F.; Turek, M.

    2005-01-01

    Ambient vibration testing is the most economical non-destructive testing method to acquire vibration data from large civil engineering structures. The purpose of this paper is to demonstrate how ambient vibration modal identification techniques can be effectively used with model updating tools to develop reliable finite element models of large civil engineering structures. A fifteen-story and a forty-eight-story reinforced concrete building are used as case studies for this purpose. The dynamic characteristics of interest for this study were the first few lateral and torsional natural frequencies... the information provided in the design documentation of the building. Different parameters of the model were then modified using an automated procedure to improve the correlation between measured and calculated modal parameters. Careful attention was placed on the selection of the parameters to be modified...

  5. Stellar Parameters for Trappist-1

    Science.gov (United States)

    Van Grootel, Valérie; Fernandes, Catarina S.; Gillon, Michael; Jehin, Emmanuel; Manfroid, Jean; Scuflaire, Richard; Burgasser, Adam J.; Barkaoui, Khalid; Benkhaldoun, Zouhair; Burdanov, Artem; Delrez, Laetitia; Demory, Brice-Olivier; de Wit, Julien; Queloz, Didier; Triaud, Amaury H. M. J.

    2018-01-01

    TRAPPIST-1 is an ultracool dwarf star transited by seven Earth-sized planets, for which thorough characterization of atmospheric properties, surface conditions encompassing habitability, and internal compositions is possible with current and next-generation telescopes. Accurate modeling of the star is essential to achieve this goal. We aim to obtain updated stellar parameters for TRAPPIST-1 based on new measurements and evolutionary models, compared to those used in the discovery studies. We present a new measurement of the parallax of TRAPPIST-1, 82.4 ± 0.8 mas, based on 188 epochs of observations with the TRAPPIST and Liverpool Telescopes from 2013 to 2016. This revised parallax yields an updated luminosity of L* = (5.22 ± 0.19) × 10^-4 L_sun, which is very close to the previous estimate but almost two times more precise. We next present an updated estimate for the TRAPPIST-1 stellar mass, based on two approaches: mass from stellar evolution modeling, and empirical mass derived from dynamical masses of equivalently classified ultracool dwarfs in astrometric binaries. We combine them using a Monte Carlo approach to derive a semi-empirical estimate for the mass of TRAPPIST-1. We also derive an estimate for the radius by combining this mass with the stellar density inferred from transits, as well as an estimate for the effective temperature from our revised luminosity and radius. Our final results are M* = 0.089 ± 0.006 M_sun, R* = 0.121 ± 0.003 R_sun, and T_eff = 2516 ± 41 K. Considering the degree to which the TRAPPIST-1 system will be scrutinized in coming years, these revised and more precise stellar parameters should be considered when assessing the properties of TRAPPIST-1 planets.
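
    The radius step of the abstract is a one-line computation in solar units: since rho*/rho_sun = (M*/M_sun)/(R*/R_sun)^3, the radius follows as R* = (M*/rho*)^(1/3). The density value below is illustrative, back-computed from the published mass and radius rather than quoted from the paper.

```python
# Combine the stellar mass with the density inferred from the planets'
# transits to get the stellar radius, all in solar units.
m_star = 0.089            # stellar mass (M_sun), from the abstract
rho_star = 50.2           # assumed transit-derived density (rho_sun)

r_star = (m_star / rho_star) ** (1.0 / 3.0)
print(round(r_star, 3))   # ~0.121 R_sun, consistent with the reported radius
```

    The cube-root dependence is why the transit density, which is measured very precisely for a multi-planet system, pins down the radius so well once the mass is known.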

  6. NPOESS Tools for Rapid Algorithm Updates

    Science.gov (United States)

    Route, G.; Grant, K. D.; Hughes, B.; Reed, B.

    2009-12-01

    The National Oceanic and Atmospheric Administration (NOAA), Department of Defense (DoD), and National Aeronautics and Space Administration (NASA) are jointly acquiring the next-generation weather and environmental satellite system: the National Polar-orbiting Operational Environmental Satellite System (NPOESS). NPOESS replaces the current Polar-orbiting Operational Environmental Satellites (POES) managed by NOAA and the Defense Meteorological Satellite Program (DMSP) managed by the DoD. The NPOESS satellites carry a suite of sensors that collect meteorological, oceanographic, climatological, and solar-geophysical observations of the earth, atmosphere, and space. The ground data processing segment for NPOESS is the Interface Data Processing Segment (IDPS), developed by Raytheon Intelligence and Information Systems. The IDPS processes both NPP (NPOESS Preparatory Project) and NPOESS satellite data to provide environmental data products to NOAA and DoD processing centers operated by the United States government. The Northrop Grumman Aerospace Systems Algorithms and Data Products (A&DP) organization is responsible for the algorithms that produce the Environmental Data Records (EDRs), including their quality aspects. As the Calibration and Validation activities move forward following both the NPP launch and subsequent NPOESS launches, rapid algorithm updates may be required. Raytheon and Northrop Grumman have developed tools and processes to enable changes to be evaluated, tested, and moved into the operational baseline in a rapid and efficient manner. This presentation will provide an overview of the tools available to the Cal/Val teams to ensure rapid and accurate assessment of algorithm changes, along with the processes in place to ensure baseline integrity.

  7. Updating of states in operational hydrological models

    Science.gov (United States)

    Bruland, O.; Kolberg, S.; Engeland, K.; Gragne, A. S.; Liston, G.; Sand, K.; Tøfte, L.; Alfredsen, K.

    2012-04-01

    Operationally, the main purpose of hydrological models is to provide runoff forecasts. The quality of the model state and the accuracy of the weather forecast, together with the model quality, define the runoff forecast quality. Input and model errors accumulate over time and may leave the model in a poor state. Usually model states can be related to observable conditions in the catchment. Updating these states, knowing their relation to observable catchment conditions, directly influences the forecast quality. Norway is internationally at the forefront of hydropower scheduling on both short and long terms. The inflow forecasts are fundamental to this scheduling. Their quality directly influences the producers' profit as they optimize hydropower production to market demand while minimizing spill of water and maximizing available hydraulic head. The quality of the inflow forecasts strongly depends on the quality of the models applied and the quality of the information they use. In this project the focus has been on improving the quality of the model states upon which the forecast is based. Runoff and snow storage are two observable quantities that reflect the model state and are used in this project for updating. Generally the methods used can be divided into three groups: the first re-estimates the forcing data in the updating period; the second alters the weights in the forecast ensemble; and the third directly changes the model states. The uncertainty related to the forcing data through the updating period is due both to uncertainty in the actual observations and to how well the gauging stations represent the catchment with respect to temperatures and precipitation. The project looks at methodologies that automatically re-estimate the forcing data and tests the result against the observed response. Model uncertainty is reflected in a joint distribution of model parameters estimated using the DREAM algorithm.
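    The third group of methods, directly changing model states, can be sketched as a simple nudge of a state variable toward an observed equivalent. The function name, the relaxation weight, and the snow-water-equivalent example are illustrative assumptions, not the project's actual scheme.

```python
# A minimal sketch of direct state updating: relax a model state toward an
# observed equivalent. weight = 0 keeps the model value, weight = 1 adopts
# the observation outright; intermediate values blend the two.

def nudge_state(modelled: float, observed: float, weight: float = 0.5) -> float:
    """Relax a model state toward an observation."""
    if not 0.0 <= weight <= 1.0:
        raise ValueError("weight must lie in [0, 1]")
    return modelled + weight * (observed - modelled)

# Example: snow water equivalent (mm) that drifted high after accumulated
# precipitation errors; a snow-storage observation pulls it back halfway.
swe_model, swe_obs = 220.0, 180.0
print(nudge_state(swe_model, swe_obs, weight=0.5))  # 200.0
```

    The weight plays the role of a trust ratio between model and observation; a full data-assimilation scheme would derive it from the respective error variances rather than fix it by hand.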

  8. Baseline Year for the Baseline Year Test (in 40 CFR 93.119)

    Science.gov (United States)

    EPA OTAQ's State and Local Transportation Resources are for air quality and transportation government and community leaders. Updates on the adequacy of state implementation plans (SIPs), and transportation conformity regulations and guidance

  9. Program Baseline Change Control Procedure

    International Nuclear Information System (INIS)

    1993-02-01

    This procedure establishes the responsibilities and process for approving initial issues of, and changes to, the technical, cost, and schedule baselines and selected management documents developed by the Office of Civilian Radioactive Waste Management (OCRWM) for the Civilian Radioactive Waste Management System. This procedure implements the OCRWM Baseline Management Plan and DOE Order 4700.1, Chg 1. It streamlines the change control process to enhance integration, accountability, and traceability of Level 0 and Level 1 decisions through standardized Baseline Change Proposal (BCP) forms to be used by the Level 0, 1, 2, and 3 Baseline Change Control Boards (BCCBs) and to be tracked in the OCRWM-wide Configuration Information System (CIS) database. This procedure applies to all technical, cost, and schedule baselines controlled by the Energy System Acquisition Advisory Board (ESAAB) BCCB (Level 0) and the OCRWM Program Baseline Change Control Board (PBCCB) (Level 1). All baseline BCPs initiated by Level 2 or lower BCCBs that require approval from the ESAAB or PBCCB shall be processed in accordance with this procedure. This procedure also applies to all Program-level management documents controlled by the OCRWM PBCCB

  10. A sampling strategy to establish existing plant configuration baselines

    International Nuclear Information System (INIS)

    Buchanan, L.P.

    1995-01-01

    The Department of Energy's Gaseous Diffusion Plants (DOEGDP) are undergoing a Safety Analysis Update Program. As part of this program, critical existing structures are being reevaluated for Natural Phenomena Hazards (NPH) based on the recommendations of UCRL-15910. The Department of Energy has specified that current plant configurations be used in the performance of these reevaluations. This paper presents the process and results of a walkdown program implemented at DOEGDP to establish the current configuration baseline for these existing critical structures for use in subsequent NPH evaluations. These structures are classified as moderate hazard facilities and were constructed in the early 1950s. The process involved a statistical sampling strategy to determine the validity of critical design information as represented on the original design drawings, such as member sizes, orientation, connection details and anchorage. A floor load inventory of the dead load of the equipment, both permanently attached and spare, was also performed, along with a walkthrough inspection of the overall structure to identify any other significant anomalies

  11. How update schemes influence crowd simulations

    International Nuclear Information System (INIS)

    Seitz, Michael J; Köster, Gerta

    2014-01-01

    Time discretization is a key modeling aspect of dynamic computer simulations. In current pedestrian motion models based on discrete events, e.g. cellular automata and the Optimal Steps Model, fixed-order sequential updates and shuffle updates are prevalent. We propose to use event-driven updates that process events in the order they occur, and thus better match natural movement. In addition, we present a parallel update with collision detection and resolution for situations where computational speed is crucial. Two simulation studies serve to demonstrate the practical impact of the choice of update scheme. Not only do density-speed relations differ, but there is a statistically significant effect on evacuation times. Fixed-order sequential and random shuffle updates with a short update period come close to event-driven updates. The parallel update scheme overestimates evacuation times. All schemes can be employed for arbitrary simulation models with discrete events, such as car traffic or animal behavior. (paper)
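    An event-driven update scheme of the kind proposed above can be sketched with a priority queue: each agent is processed at the simulated time of its next step, in the order events occur, rather than in a fixed or shuffled agent order. The per-agent step durations below are an illustrative stand-in for a locomotion model.

```python
import heapq

# A minimal event-driven update loop: events are (time, agent) pairs kept
# in a min-heap, so the next event processed is always the earliest one.

def simulate(step_durations, horizon):
    """step_durations: time between steps for each agent; returns the event log."""
    events = [(dt, agent) for agent, dt in enumerate(step_durations)]
    heapq.heapify(events)
    log = []
    while events:
        t, agent = heapq.heappop(events)
        if t > horizon:
            break
        log.append((t, agent))
        # here the agent would take its step; then schedule its next event
        heapq.heappush(events, (t + step_durations[agent], agent))
    return log

log = simulate([0.5, 0.3], horizon=1.0)
print(log)  # the faster agent 1 steps first and interleaves with agent 0
```

    In a sequential or shuffled scheme both agents would be updated once per global time step regardless of their natural step frequencies; the event queue lets faster pedestrians move more often, which is the stated motivation for event-driven updates.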

  12. The planning of decommissioning activities within nuclear facilities - Generating a Baseline Decommissioning Plan

    International Nuclear Information System (INIS)

    Meek, N.C.; Ingram, S.; Page, J.

    2003-01-01

    BNFL Environmental Services has developed planning tools to meet the emerging need for nuclear liabilities management and decommissioning engineering both in the UK and globally. It can provide a comprehensive baseline planning service primarily aimed at nuclear power stations and nuclear plant. The paper develops the following issues: Decommissioning planning; The baseline decommissioning plan;The process; Work package; Compiling the information; Deliverables summary; Customer Benefits; - Planning tool for nuclear liability life-cycle management; - Robust and reliable plans based upon 'real' experience; - Advanced financial planning; - Ascertaining risk; - Strategy and business planning. The following Deliverables are mentioned:1. Site Work Breakdown Structure; 2. Development of site implementation strategy from the high level decommissioning strategy; 3. An end point definition for the site; 4. Buildings, operational systems and plant surveys; 5. A schedule of condition for the site; 6. Development of technical approach for decommissioning for each work package; 7. Cost estimate to WBS level 5 for each work package; 8. Estimate of decommissioning waste arisings for each work package; 9. Preparation of complete decommissioning programme in planning software to suit client; 10. Risk modelling of work package and overall project levels; 11. Roll up of costs into an overall cost model; 12. Cash flow, waste profiling and resource profiling against the decommissioning programme; 13. Preparation and issue of Final Report. Finally The BDP process is represented by a flowchart listing the following stages: [Power Station project assigned] → [Review project and conduct Characterisation review of power station] → [Identify work packages] → [Set up WBS to level 3] → [Assign work packages] → [Update WBS to level 4] →[Develop cost model] → [Develop logic network] → [Develop risk management procedure] ] → [Develop project strategy document]→ [Work package

  13. Updated users' guide for SAMMY: Multilevel R-matrix fits to neutron data using Bayes' equation

    International Nuclear Information System (INIS)

    Larson, N.M.

    1989-06-01

    In 1980 the multilevel multichannel R-matrix code SAMMY was released for use in analysis of neutron data at the Oak Ridge Electron Linear Accelerator. Since that time, SAMMY has undergone significant modifications: user-friendly options have been incorporated to streamline common operations and to protect a run from common user errors; the Reich-Moore formalism has been extended to include an optional logarithmic parameterization of the external R-matrix, for which any or all parameters may be varied; the ability to vary sample thickness, effective temperature, matching radius, and/or resolution-broadening parameters has been incorporated; to avoid loss of information (i.e., computer round-off errors) between runs, the ''covariance file'' now includes precise values for all variables; and unused but correlated variables may be included in the analysis. Because of these and earlier changes, the 1980 SAMMY manual is now hopelessly obsolete. This report is intended to be complete documentation for the current version of SAMMY. Its publication in looseleaf form will permit updates to the manual to be made concurrently with updates to the code itself, thus eliminating most of the time lag between update and documentation. 28 refs., 54 tabs
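    The linearized Bayes update behind fits of this kind is the generalized least-squares form: prior parameters P with covariance M are updated against data D with covariance V through the sensitivity matrix G = dT/dP of the theoretical values T. The sketch below shows that update on a one-parameter toy model; the numbers are illustrative, not nuclear data, and this is not SAMMY's actual implementation.

```python
import numpy as np

# Linearized Bayes / generalized least-squares update:
#   P' = P + M G^T (V + G M G^T)^-1 (D - T)
#   M' = M - M G^T (V + G M G^T)^-1 G M

def bayes_update(P, M, G, V, D, T):
    K = M @ G.T @ np.linalg.inv(V + G @ M @ G.T)   # gain matrix
    return P + K @ (D - T), M - K @ G @ M

# One parameter, two precise measurements of a linear model T = G @ P.
P = np.array([1.0])
M = np.array([[1.0]])                 # broad prior
G = np.array([[1.0], [1.0]])          # sensitivities
V = 0.01 * np.eye(2)                  # data covariance
D = np.array([2.0, 2.2])
P_new, M_new = bayes_update(P, M, G, V, D, G @ P)
print(P_new, M_new)  # posterior pulled toward the data, variance reduced
```

    Because the data are far more precise than the prior here, the posterior lands essentially at the data mean with a correspondingly small posterior variance; retaining the full updated covariance between runs is exactly what the "covariance file" mentioned above enables.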

  14. Updated user's guide for SAMMY: multilevel R-matrix fits to neutron data using Bayes' equation

    International Nuclear Information System (INIS)

    Larson, N.M.

    1996-01-01

    In 1980 the multilevel multichannel R-matrix code SAMMY was released for use in analysis of neutron data at the Oak Ridge Electron Linear Accelerator. Since that time, SAMMY has undergone significant modifications: (1) user-friendly options have been incorporated to streamline common operations and to protect a run from common user errors; (2) the Reich-Moore formalism has been extended to include an optional logarithmic parameterization of the external R-matrix, for which any or all parameters may be varied; (3) the ability to vary sample thickness, effective temperature, matching radius, and/or resolution-broadening parameters has been incorporated; (4) to avoid loss of information (i.e., computer round-off errors) between runs, the ''covariance file'' now includes precise values for all variables; (5) unused but correlated variables may be included in the analysis. Because of these and earlier changes, the 1980 SAMMY manual is now hopelessly obsolete. This report is intended to be complete documentation for the current version of SAMMY. Its publication in looseleaf form will permit updates to the manual to be made concurrently with updates to the code itself, thus eliminating most of the time lag between update and documentation

  15. Imitate or innovate: Competition of strategy updating attitudes in spatial social dilemma games

    Science.gov (United States)

    Danku, Zsuzsa; Wang, Zhen; Szolnoki, Attila

    2018-01-01

    Evolution is based on the assumption that competing players update their strategies to increase their individual payoffs. However, while the applied updating method can differ, most previous works proposed uniform models where players use an identical way to revise their strategies. In this work we explore how an imitation-based (learning) attitude and an innovation-based (myopic best-response) attitude compete for space in a complex model where both attitudes are available. In the absence of an additional cost, the best-response trait practically dominates the whole snowdrift game parameter space, which is in agreement with the average payoff difference of the basic models. When an additional cost is involved, the imitation attitude can gradually invade the whole parameter space, but this transition happens in a highly nontrivial way. However, the role of the competing attitudes is reversed in the stag-hunt parameter space, where imitation is more successful in general. Interestingly, a four-state solution can be observed for the latter game, which is a consequence of an emerging cyclic dominance between the possible states. These phenomena can be understood by analyzing the microscopic invasion processes, which reveals the unequal propagation velocities of strategies and attitudes.
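    The two updating attitudes can be sketched for a two-strategy game as follows. The payoff values, the Fermi noise level K, and the function names are illustrative assumptions; the Fermi imitation rule is the standard choice in this literature but is not claimed to be this paper's exact microscopic rule.

```python
import math
import random

random.seed(1)

# Toy payoff matrix A[(focal, opponent)] with snowdrift ordering T > R > S > P.
A = {("C", "C"): 1.0, ("C", "D"): 0.3,
     ("D", "C"): 1.2, ("D", "D"): 0.1}

def payoff(s, neighbours):
    return sum(A[(s, n)] for n in neighbours)

def imitate(s, s_model, neighbours, K=0.1):
    """Learning attitude: copy a model player's strategy with Fermi probability."""
    diff = payoff(s, neighbours) - payoff(s_model, neighbours)
    p = 1.0 / (1.0 + math.exp(diff / K))
    return s_model if random.random() < p else s

def best_response(s, neighbours):
    """Innovative (myopic) attitude: adopt whichever strategy pays best now."""
    return max(("C", "D"), key=lambda t: payoff(t, neighbours))

nbrs = ["C", "C", "D", "C"]
print(best_response("C", nbrs))  # 'D': defecting against these neighbours pays more
```

    The key structural difference is visible in the signatures: imitation needs a model player to copy from, while the best response needs only the focal player's own neighbourhood, which is why the two attitudes spread through a lattice at different velocities.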

  16. Program reference schedule baseline

    International Nuclear Information System (INIS)

    1986-07-01

    This Program Reference Schedule Baseline (PRSB) provides the baseline Program-level milestones and associated schedules for the Civilian Radioactive Waste Management Program. It integrates all Program-level schedule-related activities. This schedule baseline will be used by the Director, Office of Civilian Radioactive Waste Management (OCRWM), and his staff to monitor compliance with Program objectives. Chapter 1 includes brief discussions concerning the relationship of the PRSB to the Program Reference Cost Baseline (PRCB), the Mission Plan, the Project Decision Schedule, the Total System Life Cycle Cost report, the Program Management Information System report, the Program Milestone Review, annual budget preparation, and system element plans. Chapter 2 includes the identification of all Level 0, or Program-level, milestones, while Chapter 3 presents and discusses the critical path schedules that correspond to those Level 0 milestones

  17. Bayesian Parameter Estimation via Filtering and Functional Approximations

    KAUST Repository

    Matthies, Hermann G.

    2016-11-25

    The inverse problem of determining parameters in a model by comparing some output of the model with observations is addressed. This is a description of what has to be done to use the Gauss-Markov-Kalman filter for the Bayesian estimation and updating of parameters in a computational model. This filter acts on random variables, and while its Monte Carlo variant, the Ensemble Kalman Filter (EnKF), is fairly straightforward, we subsequently sketch its implementation with the help of functional representations.
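    The Monte Carlo variant mentioned above can be sketched in a few lines: each ensemble member is a parameter sample, and the analysis step uses ensemble covariances in place of exact ones. The scalar forward model, the noise levels, and the observed value are illustrative assumptions, not from the talk.

```python
import numpy as np

# Minimal Ensemble Kalman Filter (EnKF) sketch for a scalar parameter theta
# with a linear toy observation operator y = 2*theta + noise.

rng = np.random.default_rng(42)
N = 5000
theta = rng.normal(0.0, 1.0, N)          # prior parameter ensemble

def forward(t):
    return 2.0 * t                        # stand-in for the computational model

R = 0.1                                   # observation-noise variance
y_obs = 3.0                               # the (illustrative) measurement

y = forward(theta) + rng.normal(0.0, np.sqrt(R), N)  # perturbed predictions
C_ty = np.cov(theta, y)[0, 1]             # ensemble cross-covariance cov(theta, y)
C_yy = y.var(ddof=1)                      # ensemble variance of predicted obs
K = C_ty / C_yy                           # Kalman gain
theta_post = theta + K * (y_obs - y)      # EnKF analysis (update) step

print(theta_post.mean())  # near the exact linear-Gaussian posterior mean 60/41
```

    For this linear-Gaussian toy the exact posterior mean is (2/0.1 · 3) / (4/0.1 + 1) = 60/41 ≈ 1.463, so the ensemble mean can be checked directly; the functional (spectral) variant sketched in the talk replaces the sample ensemble with a polynomial chaos representation of theta.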

  18. Bayesian Parameter Estimation via Filtering and Functional Approximations

    KAUST Repository

    Matthies, Hermann G.; Litvinenko, Alexander; Rosic, Bojana V.; Zander, Elmar

    2016-01-01

    The inverse problem of determining parameters in a model by comparing some output of the model with observations is addressed. This is a description of what has to be done to use the Gauss-Markov-Kalman filter for the Bayesian estimation and updating of parameters in a computational model. This filter acts on random variables, and while its Monte Carlo variant, the Ensemble Kalman Filter (EnKF), is fairly straightforward, we subsequently sketch its implementation with the help of functional representations.

  19. The Cardassian expansion revisited: constraints from updated Hubble parameter measurements and type Ia supernova data

    Science.gov (United States)

    Magaña, Juan; Amante, Mario H.; Garcia-Aspeitia, Miguel A.; Motta, V.

    2018-05-01

    Motivated by an updated compilation of observational Hubble data (OHD) consisting of 51 points at redshifts z ≥ 0.07, we constrain the original Cardassian (OC) and modified polytropic Cardassian (MPC) models, also employing type Ia supernova (SN Ia) data from the compressed and full joint-light-analysis (JLA) samples (Betoule et al.). We also perform a joint analysis using the combination of OHD plus compressed JLA. Our results show that the OC and MPC models are in agreement with standard cosmology and naturally introduce a cosmological-constant-like extra term in the canonical Friedmann equation, with the capability of accelerating the Universe without dark energy.

  20. SCoPE: an efficient method of Cosmological Parameter Estimation

    International Nuclear Information System (INIS)

    Das, Santanu; Souradeep, Tarun

    2014-01-01

    The Markov Chain Monte Carlo (MCMC) sampler is widely used for cosmological parameter estimation from CMB and other data. However, due to the intrinsically serial nature of the MCMC sampler, convergence is often very slow. Here we present a fast and independently written Monte Carlo method for cosmological parameter estimation, named the Slick Cosmological Parameter Estimator (SCoPE), that employs delayed rejection to increase the acceptance rate of a chain, and pre-fetching that helps an individual chain to run on parallel CPUs. An inter-chain covariance update is also incorporated to prevent clustering of the chains, allowing faster and better mixing. We use an adaptive method for covariance calculation to calculate and update the covariance automatically as the chains progress. Our analysis shows that the acceptance probability of each step in SCoPE is more than 95% and that the convergence of the chains is faster. Using SCoPE, we carry out cosmological parameter estimations for different cosmological models using WMAP-9 and Planck results. One of the current research interests in cosmology is quantifying the nature of dark energy. We analyze the cosmological parameters for two illustrative, commonly used parameterisations of dark energy models. We also assess whether the primordial helium fraction in the universe can be constrained by the present CMB data from WMAP-9 and Planck. The results from our MCMC analysis on the one hand help us to understand the workability of SCoPE better, and on the other hand provide a completely independent estimation of cosmological parameters from WMAP-9 and Planck data
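    The covariance-adaptation idea can be sketched with a toy adaptive random-walk Metropolis sampler. SCoPE's distinctive features (delayed rejection, pre-fetching, inter-chain updates) are not reproduced here, and the 2-D Gaussian target is a stand-in for a cosmological likelihood; the 2.4²/d proposal scaling is the standard adaptive-Metropolis choice.

```python
import numpy as np

# Adaptive random-walk Metropolis: the proposal covariance is periodically
# re-estimated from the chain itself, so the sampler learns the target's
# scales and correlations as it runs.

rng = np.random.default_rng(7)

def log_post(x):
    # Toy target: independent Gaussians with variances 4 and 1.
    return -0.5 * (x[0] ** 2 / 4.0 + x[1] ** 2)

x = np.zeros(2)
cov = np.eye(2) * 0.1                          # deliberately poor initial proposal
chain = []
for step in range(20_000):
    prop = rng.multivariate_normal(x, cov)
    if np.log(rng.random()) < log_post(prop) - log_post(x):
        x = prop
    chain.append(x.copy())
    if step >= 1000 and step % 500 == 0:       # periodic covariance update
        samples = np.array(chain[step // 2:])  # use the second half of the run
        cov = 2.4 ** 2 / 2 * (np.cov(samples.T) + 1e-6 * np.eye(2))

chain = np.array(chain[10_000:])               # discard burn-in
print(chain.var(axis=0))  # approaches the target variances [4, 1]
```

    Updating the proposal from the chain is what prevents the slow mixing described above when parameter scales are badly guessed; SCoPE's inter-chain version shares this estimate across parallel chains instead.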

  1. Updated status of the global electroweak fit and constraints on new physics

    Energy Technology Data Exchange (ETDEWEB)

    Baak, M.; Hoecker, A.; Schott, M. [CERN, Geneva (Switzerland); Goebel, M.; Ludwig, D. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Hamburg Univ. (Germany). Inst. fuer Experimentalphysik; Haller, J. [Hamburg Univ. (Germany). Inst. fuer Experimentalphysik; Goettingen Univ. (Germany). II. Physikalisches Inst.; Moenig, K. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Stelzer, J. [Michigan State Univ., East Lansing, MI (United States). Dept. of Physics and Astronomy

    2011-07-15

    We present an update of the Standard Model fit to electroweak precision data. We include the newest experimental results on the top quark mass, the W mass and width, and the Higgs boson mass bounds from LEP, the Tevatron and the LHC. We also include a new determination of the electromagnetic coupling strength at the Z pole. We find for the Higgs boson mass 96 (+31/-24) GeV and 120 (+12/-5) GeV when not including and including the direct Higgs searches, respectively. From the latter fit we indirectly determine the W mass to be (80.362 ± 0.013) GeV. We exploit the data to determine experimental constraints on the oblique vacuum polarisation parameters, and confront these with predictions from the Standard Model (SM) and selected SM extensions. By fitting the oblique parameters to the electroweak data we derive allowed regions in the BSM parameter spaces. We revisit and consistently update these constraints for a fourth fermion generation, two-Higgs-doublet, inert-Higgs and littlest-Higgs models, models with large, universal or warped extra dimensions, and technicolour. In most of the models studied a heavy Higgs boson can be made compatible with the electroweak precision data. (orig.)

  2. CryoSat SAR/SARin Level1b products: assessment of BaselineC and improvements towards BaselineD

    Science.gov (United States)

    Scagliola, Michele; Fornari, Marco; Bouffard, Jerome; Parrinello, Tommaso

    2017-04-01

    CryoSat was launched on the 8th April 2010 and is the first European ice mission dedicated to the monitoring of precise changes in the thickness of polar ice sheets and floating sea ice. CryoSat carries an innovative radar altimeter called the Synthetic Aperture Interferometric Altimeter (SIRAL), which transmits pulses at a high pulse repetition frequency, thus making the received echoes phase coherent and suitable for azimuth processing. This makes it possible to reach a significantly improved along-track resolution with respect to traditional pulse-width-limited altimeters. CryoSat is the first altimetry mission operating in SAR mode, and continuous improvements in the Level1 Instrument Processing Facility (IPF1) are being identified, tested and validated in order to improve the quality of the Level1b products. The current IPF, Baseline C, was released into operation in April 2015, and the second CryoSat reprocessing campaign was jointly initiated, taking benefit of the upgrades implemented in the IPF1 processing chain but also of some specific configurations for the calibration corrections. In particular, the CryoSat Level1b BaselineC products generated in the framework of the second reprocessing campaign include refined information concerning the mispointing angles and the calibration corrections. This poster will detail the evolutions currently planned for the CryoSat BaselineD SAR/SARin Level1b products and the corresponding quality improvements that are expected.

  3. Baseline 18F-FDG PET image-derived parameters for therapy response prediction in oesophageal cancer

    International Nuclear Information System (INIS)

    Hatt, Mathieu; Visvikis, Dimitris; Cheze-le Rest, Catherine; Pradier, Olivier

    2011-01-01

    The objectives of this study were to investigate the predictive value of tumour measurements on 2-deoxy-2-[18F]fluoro-D-glucose (18F-FDG) positron emission tomography (PET) pretreatment scans regarding therapy response in oesophageal cancer and to evaluate the impact of tumour delineation strategies. Fifty patients with oesophageal cancer treated with concomitant radiochemotherapy between 2004 and 2008 were retrospectively considered and classified as complete, partial or non-responders (including stable and progressive disease) according to the Response Evaluation Criteria in Solid Tumors (RECIST). The classification of partial and complete responders was confirmed by biopsy. Tumours were delineated on the 18F-FDG pretreatment scan using an adaptive threshold and the automatic fuzzy locally adaptive Bayesian (FLAB) methodologies. Several parameters were then extracted: maximum and peak standardized uptake value (SUV), tumour longitudinal length (TL) and volume (TV), SUVmean, and total lesion glycolysis (TLG = TV × SUVmean). The correlation between each parameter and response was investigated using Kruskal-Wallis tests, and receiver-operating-characteristic methodology was used to assess the performance of the parameters in differentiating patients. Whereas commonly used parameters such as SUV measurements were not significant predictive factors of the response, parameters related to tumour functional spatial extent (TL, TV, TLG) allowed significant differentiation of all three groups of patients, independently of the delineation strategy, and could identify complete and non-responders with sensitivity above 75% and specificity above 85%. A systematic although not statistically significant trend was observed regarding the hierarchy of the delineation methodologies and the parameters considered, with slightly higher predictive value obtained with FLAB over adaptive thresholding, and TLG over TV and TL. TLG is a promising predictive factor of concomitant

  4. Development of JAEA sorption database (JAEA-SDB). Update of data evaluation functions and sorption/QA data

    International Nuclear Information System (INIS)

    Tachi, Yukio; Suyama, Tadahiro; Ochs, Michael; Ganter, Charlotte

    2011-03-01

    Sorption and diffusion of radionuclides in buffer materials (bentonite) and rocks are key processes in the safe geological disposal of radioactive waste, because migration of radionuclides in this barrier is expected to be diffusion-controlled and retarded by sorption processes. It is therefore necessary to understand the sorption and diffusion processes and to develop databases compiling reliable data and mechanistic/predictive models, so that reliable parameters can be set under the variety of geochemical conditions relevant to performance assessment (PA). For this purpose, the Japan Atomic Energy Agency (JAEA) has developed databases of sorption and diffusion parameters in buffer materials (bentonite) and rocks. These sorption and diffusion databases (SDB/DDB) were first developed as an important basis for the H12 PA of high-level radioactive waste disposal, and have been provided through the Web. JAEA has been continuing to improve and update the SDB/DDB in view of potential future data needs, focusing on assuring the desired quality level and testing the usefulness of the databases for possible applications to PA-related parameter setting. The present report focuses on the development and updating of the sorption database (JAEA-SDB) as a basis of an integrated approach for PA-related Kd setting. This includes an overview of the database structure, contents and functions, including an additional data evaluation function focusing on multi-parameter dependence, the operating method, and PA-related applications of the web-based JAEA-SDB. Kd data and their QA results are updated, reflecting our recent activities on Kd setting and mechanistic model development. As a result, 4,250 Kd data from 32 references are added, bringing the total number of Kd values in the JAEA-SDB to about 28,540. The QA/classified Kd data amount to about 39% of all Kd data in the JAEA-SDB. The updated JAEA-SDB is expected to make it possible to obtain a quick overview of the available data, and to have suitable access to the respective data

  5. CryoSat Level1b SAR/SARin BaselineC: Product Format and Algorithm Improvements

    Science.gov (United States)

    Scagliola, Michele; Fornari, Marco; Di Giacinto, Andrea; Bouffard, Jerome; Féménias, Pierre; Parrinello, Tommaso

    2015-04-01

    CryoSat was launched on the 8th April 2010 and is the first European ice mission dedicated to the monitoring of precise changes in the thickness of polar ice sheets and floating sea ice. CryoSat carries an innovative radar altimeter called the Synthetic Aperture Interferometric Altimeter (SIRAL), which transmits pulses at a high pulse repetition frequency, thus making the received echoes phase coherent and suitable for azimuth processing. This makes it possible to reach a significantly improved along-track resolution with respect to traditional pulse-width-limited altimeters. CryoSat is the first altimetry mission operating in SAR mode, and continuous improvements in the Level1 Instrument Processing Facility (IPF1) are being identified, tested and validated in order to improve the quality of the Level1b products. The current IPF, Baseline B, was released into operation in February 2012. A reprocessing campaign followed, in order to reprocess the data acquired since July 2010. After more than two years of development, the release into operations of Baseline C is expected in the first half of 2015. BaselineC Level1b products will be distributed in an updated format, including for example the attitude information (roll, pitch and yaw) and, for SAR/SARin, a waveform length doubled with respect to Baseline B. Moreover, various algorithm improvements have been identified: • a datation bias of about -0.5195 ms will be corrected (SAR/SARin); • a range bias of about 0.6730 m will be corrected (SAR/SARin); • a roll bias of 0.1062 deg and a pitch bias of 0.0520 deg will be corrected; • surface sample stack weighting will filter out the single-look echoes acquired at the highest look angles, resulting in a sharpening of the 20 Hz waveforms. With the operational release of BaselineC, the second CryoSat reprocessing campaign will be initiated, taking benefit of the upgrades implemented in the IPF1 processing chain but also at the IPF2 level. The reprocessing campaign will cover the full CryoSat mission starting on 16th July 2010.

  6. Surrogate based approaches to parameter inference in ocean models

    KAUST Repository

    Knio, Omar

    2016-01-06

    This talk discusses the inference of physical parameters using model surrogates. Attention is focused on the use of sampling schemes to build suitable representations of the dependence of the model response on uncertain input data. Non-intrusive spectral projections and regularized regressions are used for this purpose. A Bayesian inference formalism is then applied to update the uncertain inputs based on available measurements or observations. To perform the update, we consider two alternative approaches, based on the application of Markov Chain Monte Carlo methods or of adjoint-based optimization techniques. We outline the implementation of these techniques to infer dependence of wind drag, bottom drag, and internal mixing coefficients.
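    The surrogate-then-update workflow outlined above can be sketched as follows. The "costly" model, the drag-like parameter, and the grid-based posterior evaluation (standing in for MCMC or adjoint-based optimization) are illustrative assumptions, and the polynomial regression is one simple instance of the regularized-regression surrogates mentioned.

```python
import numpy as np

# 1) Build an inexpensive surrogate from a handful of "expensive" model runs.

def costly_model(c_drag):
    # Stand-in for an ocean-model response to a drag-like coefficient.
    return np.sin(c_drag) + 0.5 * c_drag

train_x = np.linspace(0.0, 2.0, 8)          # design of 8 model runs
train_y = costly_model(train_x)
coeffs = np.polyfit(train_x, train_y, deg=3)  # degree-3 regression surrogate
surrogate = lambda c: np.polyval(coeffs, c)

# 2) Bayesian update on a parameter grid, evaluating only the surrogate.
obs, sigma = costly_model(1.2) + 0.01, 0.05   # synthetic noisy observation
grid = np.linspace(0.0, 2.0, 2001)
log_like = -0.5 * ((surrogate(grid) - obs) / sigma) ** 2
post = np.exp(log_like - log_like.max())      # flat prior on the grid
post /= post.sum()

print(grid[np.argmax(post)])  # close to the true parameter value 1.2
```

    The point of the construction is cost: the 2001 posterior evaluations touch only the cheap polynomial, while the expensive model is run just 8 times to build it.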

  8. The Abiotic Depletion Potential: Background, Updates, and Future

    Directory of Open Access Journals (Sweden)

    Lauran van Oers

    2016-03-01

    Depletion of abiotic resources is a much disputed impact category in life cycle assessment (LCA). The reason is that the problem can be defined in different ways. Furthermore, within a specified problem definition, many choices can still be made regarding which parameters to include in the characterization model and which data to use. This article gives an overview of the problem definition and the choices that have been made when defining the abiotic depletion potentials (ADPs) for a characterization model for abiotic resource depletion in LCA. Updates of the ADPs since 2002 are also briefly discussed. Finally, some possible new developments of the impact category of abiotic resource depletion are suggested, such as redefining the depletion problem as a dilution problem. This means taking the reserves in the environment and the economy into account in the reserve parameter, and using leakage from the economy, instead of extraction rate, as a dilution parameter.
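    The CML-style characterization the article discusses scores each resource by extraction rate over squared reserve, normalized to the reference resource antimony. A minimal sketch, with placeholder reserve and extraction figures rather than authoritative data:

```python
# Sketch of the CML-style abiotic depletion potential: extraction rate
# over squared ultimate reserve, normalized to the reference resource
# antimony (Sb). All reserve/extraction figures below are placeholders.
def adp(extraction_rate, reserve, ref_rate, ref_reserve):
    """ADP_i = (DR_i / R_i**2) / (DR_ref / R_ref**2), in kg Sb-eq per kg."""
    return (extraction_rate / reserve ** 2) / (ref_rate / ref_reserve ** 2)

sb_rate, sb_reserve = 1.0, 100.0             # hypothetical reference (Sb) values
print(adp(2.0, 50.0, sb_rate, sb_reserve))   # this resource depletes 8x faster than Sb
```

    The squared reserve in the denominator is why the choice of reserve parameter (ultimate reserve, economic reserve, or, as the article suggests, reserves in both environment and economy) changes the ranking so strongly.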

  9. Baseline drift effect on the performance of neutron and γ ray discrimination using frequency gradient analysis

    International Nuclear Information System (INIS)

    Liu Guofu; Luo Xiaoliang; Yang Jun; Lin Cunbao; Hu Qingqing; Peng Jinxian

    2013-01-01

    Frequency gradient analysis (FGA) effectively discriminates neutrons and γ rays by examining the frequency-domain features of the photomultiplier tube anode signal. This approach is insensitive to noise but, like other pulse shape discrimination methods, is inevitably affected by baseline drift. The baseline drift effect is attributed to factors such as power line fluctuation, dark current, noise disturbances, hum, and pulse tails in the front-end electronics. This effect needs to be elucidated and quantified before the baseline shift can be estimated and removed from the captured signal. Therefore, the effect of baseline shift on the discrimination performance of neutrons and γ rays with organic scintillation detectors using FGA is investigated in this paper. The relationship between the baseline shift and the discrimination parameters of FGA is derived and verified with an experimental system consisting of an americium-beryllium source, a BC501A liquid scintillator detector, and a 5 GSample/s 8-bit oscilloscope. The theoretical and experimental results both show that estimation of the baseline shift is necessary, and that removing the baseline drift from the pulse shapes improves the discrimination performance of FGA. (authors)
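    The FGA statistic can be sketched as the gradient of the pulse's amplitude spectrum between zero frequency and the first nonzero frequency bin. The double-exponential pulse shapes and decay constants below are illustrative assumptions, not the calibrated BC501A response:

```python
import numpy as np

# Sketch of the FGA discrimination statistic: the gradient of the pulse's
# amplitude spectrum between the zero-frequency bin and the first nonzero
# frequency bin. Pulse shapes and decay constants are illustrative.
def fga_gradient(pulse, fs):
    spectrum = np.abs(np.fft.rfft(pulse))
    f1 = fs / len(pulse)                 # first nonzero frequency bin (Hz)
    return (spectrum[0] - spectrum[1]) / f1

fs = 5e9                                 # 5 GSample/s, as in the experiment
t = np.arange(0, 400e-9, 1 / fs)
gamma_like = np.exp(-t / 5e-9) - np.exp(-t / 1e-9)      # fast component only
neutron_like = gamma_like + 0.3 * np.exp(-t / 100e-9)   # extra slow tail

# The slow scintillation tail raises the DC component, so neutron-like
# pulses show a larger gradient than gamma-like pulses.
assert fga_gradient(neutron_like, fs) > fga_gradient(gamma_like, fs)

# A constant baseline shift b inflates spectrum[0] by b * len(pulse) while
# leaving spectrum[1] unchanged, biasing the gradient -- the effect the
# paper quantifies and removes.
```
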

  10. Update of CERN exchange network

    CERN Multimedia

    2003-01-01

    An update of the CERN exchange network will be done next April. Disturbances or even interruptions of telephony services may occur from 4th to 24th April during evenings from 18:30 to 00:00, but will not exceed 4 consecutive hours (see tentative planning below). CERN divisions are invited to avoid any change requests (set-ups, moves or removals) of telephones and fax machines from 4th to 25th April. Everything will be done to minimize potential inconveniences which may occur during this update. There will be no loss of telephone functionalities. CERN GSM portable phones won't be affected by this change. Should you need more details, please send us your questions by email to Standard.Telephone@cern.ch.
    Date      Change type                                    Affected areas
    April 11  Update of switch in LHC 4                      LHC 4 Point
    April 14  Update of switch in LHC 5                      LHC 5 Point
    April 15  Update of switches in LHC 3 and LHC 2 Points   LHC 3 and LHC 2
    April 22  Update of switch N4                            Meyrin Ouest
    April 23  Update of switch N6                            Prévessin Site
    Ap...

  11. BMI in relation to sperm count: an updated systematic review and collaborative meta-analysis

    NARCIS (Netherlands)

    Sermondade, N.; Faure, C.; Fezeu, L.; Shayeb, A. G.; Bonde, J. P.; Jensen, T. K.; van Wely, M.; Cao, J.; Martini, A. C.; Eskandar, M.; Chavarro, J. E.; Koloszar, S.; Twigt, J. M.; Ramlau-Hansen, C. H.; Borges, E.; Lotti, F.; Steegers-Theunissen, R. P. M.; Zorn, B.; Polotsky, A. J.; La Vignera, S.; Eskenazi, B.; Tremellen, K.; Magnusdottir, E. V.; Fejes, I.; Hercberg, S.; Lévy, R.; Czernichow, S.

    2013-01-01

    BACKGROUND: The global obesity epidemic has paralleled a decrease in semen quality. Yet, the association between obesity and sperm parameters remains controversial. The purpose of this report was to update the evidence on the association between BMI and sperm count through a systematic review with

  12. LLL K Division nuclear test effects and geologic data base: glossary and parameter definitions (U)

    International Nuclear Information System (INIS)

    Howard, N.W.

    1979-01-01

    The report lists, defines, and updates parameters in DBASE, an LLL test effects data bank storing data from experiments performed at NTS and other test sites. Parameters are listed by subject and by number. Part 2 of this report presents the same information for parameters for which some of the data may be classified.

  13. Is more lordosis associated with improved outcomes in cervical laminectomy and fusion when baseline alignment is lordotic?

    Science.gov (United States)

    Sielatycki, John A; Armaghani, Sheyan; Silverberg, Arnold; McGirt, Matthew J; Devin, Clinton J; O'Neill, Kevin

    2016-08-01

    In cervical spondylotic myelopathy (CSM), cervical sagittal alignment (CSA) is associated with disease severity. Increased kyphosis and C2-C7 sagittal vertical axis (SVA) correlate with worse myelopathy and poor outcomes. However, when alignment is lordotic, it is unknown whether these associations persist. The study aimed to investigate the associations between CSA parameters and patient-reported outcomes (PROs) following posterior decompression and fusion for CSM when baseline lordosis is maintained. This is an analysis of a prospective surgical cohort at a single academic institution. The sample includes adult patients undergoing primary cervical laminectomy and fusion for CSM over a 3-year period. The PROs included EuroQol-5D, Short-Form-12 (SF-12) physical composite (PCS) and mental composite scales (MCS), Neck Disability Index, and the modified Japanese Orthopaedic Association scores. Radiographic CSA parameters measured included C1-C2 Cobb, C2-C7 Cobb, C1-C7 Cobb, C2-C7 SVA, C1-C7 SVA, and T1 slope. The PROs were recorded at baseline and at 3 and 12 months postoperatively. The CSA parameters were measured on standing radiographs in the neutral position at baseline and 3 months. Wilcoxon rank test was used to test for changes in PROs and CSA parameters, and Pearson correlation coefficients were calculated for CSA parameters and PROs preoperatively and at 12 months. No external sources of funding were used for this work. There were 45 patients included with an average age of 63 years who underwent posterior decompression and fusion of 3.7±1.3 levels. Significant improvements were found in all PROs except SF-12 MCS (p=.06). Small but statistically significant changes were found in C2-C7 Cobb (mean change: +3.6°; p=.03) and C2-C7 SVA (mean change: +3 mm; p=.01). At baseline, only C2-C7 SVA associated with worse SF-12 PCS scores (r=-0.34, p=.02). Postoperatively, there were no associations found between PROs and any CSA parameters. Similarly, no CSA

  14. Decentralized Consistent Updates in SDN

    KAUST Repository

    Nguyen, Thanh Dang

    2017-04-10

    We present ez-Segway, a decentralized mechanism to consistently and quickly update the network state while preventing forwarding anomalies (loops and blackholes) and avoiding link congestion. In our design, the centralized SDN controller only pre-computes information needed by the switches during the update execution. This information is distributed to the switches, which use partial knowledge and direct message passing to efficiently realize the update. This separation of concerns has the key benefit of improving update performance as the communication and computation bottlenecks at the controller are removed. Our evaluations via network emulations and large-scale simulations demonstrate the efficiency of ez-Segway, which compared to a centralized approach, improves network update times by up to 45% and 57% at the median and the 99th percentile, respectively. A deployment of a system prototype in a real OpenFlow switch and an implementation in P4 demonstrate the feasibility and low overhead of implementing simple network update functionality within switches.

  15. Baseliner: An open-source, interactive tool for processing sap flux data from thermal dissipation probes

    Directory of Open Access Journals (Sweden)

    A. Christopher Oishi

    2016-01-01

    Estimating transpiration from woody plants using thermal dissipation sap flux sensors requires careful data processing. Currently, researchers accomplish this using spreadsheets, or by personally writing scripts for statistical software programs (e.g., R, SAS). We developed the Baseliner software to help establish a standardized protocol for processing sap flux data. Baseliner enables users to QA/QC and process data using a combination of automated steps, visualization, and manual editing. Data processing requires establishing a zero-flow reference value, or “baseline”, which varies among sensors and with time. Since no set of algorithms currently exists to reliably QA/QC data and estimate the zero-flow baseline, Baseliner provides a graphical user interface to allow visual inspection and manipulation of the data. Data are first automatically processed using a set of user-defined parameters. The user can then view the data for additional manual QA/QC and baseline identification using mouse and keyboard commands. The open-source software allows for user customization of the data processing algorithms as improved methods are developed.
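    The zero-flow baseline step that Baseliner helps automate can be sketched with the widely used Granier calibration, assuming the nightly maximum temperature difference is the baseline. The temperature-difference trace below is synthetic:

```python
import numpy as np

# Minimal sketch of the zero-flow baseline step, assuming the widely used
# Granier calibration. The temperature-difference trace (dT, in deg C) is
# synthetic, with its maximum taken as the zero-flow baseline.
def sap_flux_density(dT, dT_max):
    """Granier: K = (dTmax - dT) / dT;  Fd = 119e-6 * K**1.231 (m3 m-2 s-1)."""
    K = (dT_max - dT) / dT
    return 119e-6 * K ** 1.231

dT = np.array([10.0, 9.2, 8.1, 7.5, 8.4, 9.8])   # sensor readings over one day
dT_max = dT.max()                                # zero-flow baseline value
flux = sap_flux_density(dT, dT_max)
print(flux)   # zero at the baseline reading, positive when sap is flowing
```

    Because dT_max drifts among sensors and over time, picking it automatically and then letting the user inspect and correct it is exactly the workflow the abstract describes.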

  16. Parameterized Analysis of Paging and List Update Algorithms

    DEFF Research Database (Denmark)

    Dorrigiv, Reza; Ehmsen, Martin R.; López-Ortiz, Alejandro

    2015-01-01

    We introduce the working set model and express the performance of well known algorithms in terms of this parameter. This explicitly introduces parameterized-style analysis to online algorithms. The idea is that rather than normalizing the performance of an online algorithm by an (optimal) offline algorithm, we explicitly express the behavior of the algorithm in terms of two more natural parameters: the size of the cache and Denning’s working set measure. This technique creates a performance hierarchy of paging algorithms which better reflects their experimentally observed relative strengths. It also reflects the intuition that a larger cache leads to a better performance. We also apply the parameterized analysis framework to list update and show that certain randomized algorithms which are superior to MTF in the classical model are not so in the parameterized case, which matches experimental results.
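    Denning's working set measure that the parameterization relies on is simple to compute: the number of distinct pages among the last k requests. The request sequence here is illustrative:

```python
# Sketch of Denning's working set measure used by the parameterized
# analysis: the working set at time t with window k is the set of distinct
# pages among the last k requests. The request sequence is illustrative.
def working_set_size(requests, t, k):
    window = requests[max(0, t - k + 1): t + 1]
    return len(set(window))

requests = [1, 2, 1, 3, 2, 4, 1, 2]
print(working_set_size(requests, t=7, k=4))   # pages {2, 4, 1} -> 3
```

    Expressing an algorithm's cost in terms of this quantity and the cache size, rather than against an optimal offline adversary, is what produces the performance hierarchy the abstract describes.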

  17. DairyBISS Baseline report

    NARCIS (Netherlands)

    Buizer, N.N.; Berhanu, Tinsae; Murutse, Girmay; Vugt, van S.M.

    2015-01-01

    This baseline report of the Dairy Business Information Service and Support (DairyBISS) project presents the findings of a baseline survey among 103 commercial farms and 31 firms and advisors working in the dairy value chain. Additional results from the survey among commercial dairy farms are

  18. Baseline rationing

    DEFF Research Database (Denmark)

    Hougaard, Jens Leth; Moreno-Ternero, Juan D.; Østerdal, Lars Peter Raahave

    The standard problem of adjudicating conflicting claims describes a situation in which a given amount of a divisible good has to be allocated among agents who hold claims against it exceeding the available amount. This paper considers more general rationing problems in which, in addition to claims, ... to international protocols for the reduction of greenhouse emissions, or water distribution in drought periods. We define a family of allocation methods for such general rationing problems - called baseline rationing rules - and provide an axiomatic characterization for it. Any baseline rationing rule within the family is associated with a standard rule, and we show that if the latter obeys some properties reflecting principles of impartiality, priority and solidarity, the former obeys them too.

  19. A method to establish seismic noise baselines for automated station assessment

    Science.gov (United States)

    McNamara, D.E.; Hutt, C.R.; Gee, L.S.; Benz, H.M.; Buland, R.P.

    2009-01-01

    We present a method for quantifying station noise baselines and characterizing the spectral shape of out-of-nominal noise sources. Our intent is to automate this method in order to ensure that only the highest-quality data are used in rapid earthquake products at NEIC. In addition, the station noise baselines provide a valuable tool to support the quality control of GSN and ANSS backbone data and metadata. The procedures addressed here are currently in development at the NEIC, and work is underway to understand how quickly changes from nominal can be observed and used within the NEIC processing framework. The spectral methods and software used to compute station baselines and described herein (PQLX) can be useful to both permanent and portable seismic station operators. Applications include: general seismic station and data quality control (QC), evaluation of instrument responses, assessment of near real-time communication system performance, characterization of site cultural noise conditions, and evaluation of sensor vault design, as well as assessment of gross network capabilities (McNamara et al. 2005). Future PQLX development plans include incorporating station baselines for automated QC methods and automating station status report generation and notification based on user-defined QC parameters. The PQLX software is available through the USGS (http://earthquake.usgs.gov/research/software/pqlx.php) and IRIS (http://www.iris.edu/software/pqlx/).
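    The baseline idea can be sketched by accumulating many power spectral density (PSD) estimates and characterizing the station by per-frequency percentiles. A plain windowed periodogram stands in here for the PDF-based PSD method PQLX actually implements, and white noise stands in for months of recorded segments:

```python
import numpy as np

# Sketch of a station noise baseline: accumulate many PSD estimates for a
# channel and characterize the station by low/high percentiles per
# frequency bin. A simple windowed periodogram and synthetic white noise
# stand in for the real method and data.
rng = np.random.default_rng(1)
fs, n = 40.0, 4096                       # sample rate (Hz), segment length

def psd_db(segment, fs):
    spec = np.abs(np.fft.rfft(segment * np.hanning(len(segment)))) ** 2
    return 10 * np.log10(spec / (fs * len(segment)) + 1e-30)

psds = np.array([psd_db(rng.standard_normal(n), fs) for _ in range(200)])
baseline_lo = np.percentile(psds, 5, axis=0)    # quiet-condition baseline
baseline_hi = np.percentile(psds, 95, axis=0)   # upper noise envelope

# Flag a new segment whose PSD departs from the baseline band:
new = psd_db(3.0 * rng.standard_normal(n), fs)  # roughly 9.5 dB louder
out_of_nominal = np.mean(new > baseline_hi) > 0.5
print(out_of_nominal)
```

    Comparing each incoming segment against the percentile band is the kind of automated out-of-nominal check the abstract envisions for QC and station status reporting.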

  20. Quickly updatable hologram images with high performance photorefractive polymer composites

    Science.gov (United States)

    Tsutsumi, Naoto; Kinashi, Kenji; Nonomura, Asato; Sakai, Wataru

    2012-02-01

    We present here quickly updatable hologram images using a high performance photorefractive (PR) polymer composite based on poly(N-vinyl carbazole) (PVCz), one of the pioneering photoconductive polymers. PVCz/7-DCST/CzEPA/TNF (44/35/20/1 by wt) gives a high diffraction efficiency of 68% at E = 45 V/μm with fast response speed. The response speed of the optical diffraction is the key parameter for a real-time 3D holographic display. The key to obtaining quickly updatable hologram images is to keep the glass transition temperature low enough to enhance chromophore orientation. An object image of a reflected coin surface, recorded with a reference beam at 532 nm (green) in the PR polymer composite, is simultaneously reconstructed using a red probe beam at 642 nm. Instead of the coin object, a computer-generated object image displayed on a spatial light modulator (SLM) was also used as the hologram object. The object beam reflected from the SLM interfered with the reference beam on the PR polymer composite to record a hologram, which was simultaneously reconstructed by the red probe beam. A movie produced on a computer was recorded as a real-time hologram in the PR polymer composite and simultaneously, clearly reconstructed at video rate.

  1. Updating Recursive XML Views of Relations

    DEFF Research Database (Denmark)

    Choi, Byron; Cong, Gao; Fan, Wenfei

    2009-01-01

    This paper investigates the view update problem for XML views published from relational data. We consider XML views defined in terms of mappings directed by possibly recursive DTDs compressed into DAGs and stored in relations. We provide new techniques to efficiently support XML view updates specified in terms of XPath expressions with recursion and complex filters. The interaction between XPath recursion and DAG compression of XML views makes the analysis of the XML view update problem rather intriguing. Furthermore, many issues are still open even for relational view updates, and need to be explored. In response to these, on the XML side, we revise the notion of side effects and update semantics based on the semantics of XML views, and present efficient algorithms to translate XML updates to relational view updates. On the relational side, we propose a mild condition on SPJ views, and show...

  2. Dissociating Working Memory Updating and Automatic Updating: The Reference-Back Paradigm

    Science.gov (United States)

    Rac-Lubashevsky, Rachel; Kessler, Yoav

    2016-01-01

    Working memory (WM) updating is a controlled process through which relevant information in the environment is selected to enter the gate to WM and substitute its contents. We suggest that there is also an automatic form of updating, which influences performance in many tasks and is primarily manifested in reaction time sequential effects. The goal…

  3. Baseline Physiologic and Psychosocial Characteristics of Transgender Youth Seeking Care for Gender Dysphoria.

    Science.gov (United States)

    Olson, Johanna; Schrager, Sheree M; Belzer, Marvin; Simons, Lisa K; Clark, Leslie F

    2015-10-01

    The purpose of this study was to describe baseline characteristics of participants in a prospective observational study of transgender youth (aged 12-24 years) seeking care for gender dysphoria at a large, urban transgender youth clinic. Eligible participants presented consecutively for care between February 2011 and June 2013 and completed a computer-assisted survey at their initial study visit. Physiologic data were abstracted from medical charts. Data were analyzed by descriptive statistics, with limited comparisons between transmasculine and transfeminine participants. A total of 101 youth were evaluated for physiologic parameters; 96 completed surveys assessing psychosocial parameters. About half (50.5%) of the youth were assigned a male sex at birth. Baseline physiologic values were within normal ranges for assigned sex at birth. Youth recognized gender incongruence at a mean age of 8.3 years (standard deviation = 4.5), yet disclosed to their family much later (mean = 17.1; standard deviation = 4.2). Gender dysphoria was high among all participants. Thirty-five percent of the participants reported depression symptoms in the clinical range. More than half of the youth reported having thought about suicide at least once in their lifetime, and nearly a third had made at least one attempt. Transgender youth are aware of the incongruence between their internal gender identity and their assigned sex at early ages. The prevalence of depression and suicidality demonstrates that these youth may benefit from timely and appropriate intervention. Evaluation of these youth over time will help determine the impact of medical intervention and mental health therapy. Copyright © 2015 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.

  4. SOTER-based soil parameter estimates for Jordan (ver. 1.0)

    NARCIS (Netherlands)

    Batjes, N.H.

    2013-01-01

    This harmonized set of soil parameter estimates has been developed using an updated 1:500 000 scale Soil and Terrain (SOTER) Database for Jordan. The associated soil analytical data were derived from soil survey reports. These sources seldom hold all the physical and chemical attributes ideally

  5. Baseline restoration using current conveyors

    International Nuclear Information System (INIS)

    Morgado, A.M.L.S.; Simoes, J.B.; Correia, C.M.

    1996-01-01

    Good performance of high resolution nuclear spectrometry systems at high pulse rates demands restoration of the baseline between pulses, in order to remove rate-dependent baseline shifts. This restoration is performed by circuits named baseline restorers (BLRs), which also remove low frequency noise such as power supply hum and detector microphonics. This paper presents simple circuits for baseline restoration based on a commercial current conveyor (CCII01). Two circuits were tested with periodic trapezoidal-shaped pulses in order to measure the baseline restoration for several pulse rates and restorer duty cycles. For the current conveyor based Robinson restorer, the peak shift was less than 10 mV for duty cycles up to 60%, at high pulse rates. Duty cycles up to 80% were also tested, with a maximum peak shift of 21 mV. The peak shift for the current conveyor based Grubic restorer was also measured; the maximum value found was 30 mV at an 82% duty cycle. Keeping the duty cycle below 60% greatly improves the restorer performance. The ability of both baseline restorer architectures to reject low frequency modulation was also measured, with good results for both circuits

  6. Prediction of the Pharmacokinetic Parameters of Triptolide in Rats Based on Endogenous Molecules in Pre-Dose Baseline Serum

    Science.gov (United States)

    Aa, Jiye; Zheng, Tian; Shi, Jian; Li, Mengjie; Wang, Xinwen; Zhao, Chunyan; Xiao, Wenjing; Yu, Xiaoyi; Sun, Runbin; Gu, Rongrong; Zhou, Jun; Wu, Liang; Hao, Gang; Zhu, Xuanxuan; Wang, Guangji

    2012-01-01

    Background: Individual variances usually affect drug metabolism and disposition, and hence result in either ineffectiveness or toxicity of a drug. In addition to genetic polymorphism, multiple confounding lifestyle factors, such as dietary preferences, contribute partially to individual variances. However, the difficulty of quantifying individual diversity greatly challenges the realization of individualized drug therapy. This study aims at quantitatively evaluating the association between individual variances and pharmacokinetics. Methodology/Principal Findings: Molecules in pre-dose baseline serum were profiled using gas chromatography mass spectrometry to represent the individual variances of model rats provided with high fat diets (HFD), routine chows, and calorie restricted (CR) chows. Triptolide and its metabolites were determined using high performance liquid chromatography mass spectrometry. Metabonomic and pharmacokinetic data revealed that rats treated with the varied diets had distinctly different metabolic patterns and showed differential Cmax values, AUC, and drug metabolism after oral administration of triptolide. Rats on fatty chows had the lowest Cmax and AUC values and the highest percentage of triptolide metabolic transformation, while rats on CR chows had the highest Cmax and AUC values and the lowest percentage of triptolide transformation. Multivariate linear regression revealed that, in baseline serum, the concentrations of creatinine and of glutamic acid, which is the precursor of GSH, were linearly negatively correlated with Cmax and AUC values. The glutamic acid and creatinine in baseline serum were suggested as potential markers to represent individual diversity and as predictors of the disposal and pharmacokinetics of triptolide. Conclusions/Significance: These results highlight the robust potential of metabonomics in characterizing individual variances and identifying relevant markers that have the potential to facilitate
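    The regression step linking baseline serum molecules to pharmacokinetic parameters can be sketched with ordinary least squares. All concentrations and coefficients below are synthetic, chosen only to mirror the reported negative associations:

```python
import numpy as np

# Synthetic sketch of the regression step: ordinary least squares of AUC
# on baseline creatinine and glutamic acid. All values and coefficients
# are made up to mirror the reported negative associations.
rng = np.random.default_rng(3)
n = 30
creatinine = rng.uniform(30, 60, n)        # hypothetical baseline levels
glutamic_acid = rng.uniform(50, 120, n)
auc = 500 - 2.0 * creatinine - 1.5 * glutamic_acid + rng.normal(0, 5, n)

X = np.column_stack([np.ones(n), creatinine, glutamic_acid])
coef, *_ = np.linalg.lstsq(X, auc, rcond=None)
print(coef[1] < 0 and coef[2] < 0)         # negative association with AUC
```
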

  7. Fuzzy cross-model cross-mode method and its application to update the finite element model of structures

    International Nuclear Information System (INIS)

    Liu Yang; Xu Dejian; Li Yan; Duan Zhongdong

    2011-01-01

    As a novel updating technique, the cross-model cross-mode (CMCM) method possesses high efficiency and the capability of flexibly selecting updating parameters. However, the success of this method depends on the accuracy of the measured modal shapes, which are usually inaccurate since many kinds of measurement noise are inevitable. Furthermore, complete testing modal shapes are required by the CMCM method, so calculation errors may be introduced into the measured modal shapes by applying modal expansion or model reduction techniques. Therefore, this algorithm faces a challenge in updating the finite element (FE) models of practical complex structures. In this study, the fuzzy CMCM method is proposed in order to weaken the effect of errors in the measured modal shapes on the updated results. Two simulated examples are then used to compare the performance of the fuzzy CMCM method with the CMCM method. The test results show that the proposed method is more promising for updating the FE model of practical structures than the CMCM method.

  8. Large short-baseline νμ disappearance

    International Nuclear Information System (INIS)

    Giunti, Carlo; Laveder, Marco

    2011-01-01

    We analyze the LSND, KARMEN, and MiniBooNE data on short-baseline ν_μ→ν_e oscillations and the data on short-baseline ν_e disappearance obtained in the Bugey-3 and CHOOZ reactor experiments in the framework of 3+1 antineutrino mixing, taking into account the MINOS observation of long-baseline ν_μ disappearance and the KamLAND observation of very-long-baseline ν_e disappearance. We show that the fit of the data implies that the short-baseline disappearance of ν_μ is relatively large. We obtain a prediction of an effective amplitude sin²2θ_μμ ≳ 0.1 for short-baseline ν_μ disappearance generated by 0.2 eV² ≲ Δm² ≲ 2 eV², which could be measured in future experiments.
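    The effective two-neutrino form behind such short-baseline fits is P(ν_μ→ν_μ) = 1 − sin²2θ_μμ·sin²(1.27·Δm²·L/E), with Δm² in eV², L in km, and E in GeV. The parameter values below are illustrative picks from the ranges quoted in the abstract:

```python
import numpy as np

# Effective two-neutrino survival probability used in short-baseline fits:
# P(numu -> numu) = 1 - sin^2(2 theta_mumu) * sin^2(1.27 * dm2 * L / E),
# with dm2 in eV^2, L in km, E in GeV. Parameter values are illustrative.
def p_mumu(L_km, E_GeV, sin2_2theta, dm2_eV2):
    return 1.0 - sin2_2theta * np.sin(1.27 * dm2_eV2 * L_km / E_GeV) ** 2

# ~10% disappearance amplitude, dm2 = 1 eV^2, a 0.5 km baseline, 1 GeV:
print(round(p_mumu(0.5, 1.0, 0.10, 1.0), 3))   # -> 0.965
```

    The quoted Δm² range of 0.2-2 eV² is what makes baselines of order a kilometer (at GeV energies) the sensitive regime for future ν_μ disappearance searches.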

  9. Adaptation Computing Parameters of Pan-Tilt-Zoom Cameras for Traffic Monitoring

    Directory of Open Access Journals (Sweden)

    Ya Lin WU

    2014-01-01

    Closed-circuit television (CCTV) cameras have been widely used in recent years for traffic monitoring and surveillance applications. CCTV cameras can be used to automatically extract real-time traffic parameters using image processing and tracking technologies. In particular, pan-tilt-zoom (PTZ) cameras provide flexible view selection as well as a wider observation range, which allows traffic parameters to be calculated accurately. Therefore, calibrating the parameters of PTZ cameras plays an important role in vision-based traffic applications. However, in the specific traffic scenario of locating the license plate of an illegally parked car, the parameters of the PTZ camera have to be updated according to the position and distance of the car. The proposed traffic monitoring system uses an ordinary webcam and a PTZ camera. The vanishing point of the traffic lane lines is obtained in the pixel-based coordinate system from the fixed webcam. The parameters of the PTZ camera are initialized from the monitoring distance, the specific objective, and the vanishing point. The pixel coordinates of the illegally parked car are then used to update the PTZ camera parameters, obtain the real-world position of the car, and compute its distance. The results show that the error between the measured distance and the real distance is only 0.2064 m.
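    The vanishing-point step can be sketched in homogeneous coordinates: each lane line is the cross product of two image points, and the intersection of the two lines is the cross product of the lines. The pixel coordinates below are synthetic:

```python
import numpy as np

# Sketch of the vanishing-point computation in homogeneous coordinates:
# a lane line is the cross product of two image points, and the vanishing
# point is the cross product of the two lines. Pixel values are synthetic.
def vanishing_point(line1_pts, line2_pts):
    def line(p, q):
        return np.cross([*p, 1.0], [*q, 1.0])   # homogeneous line through p, q
    vp = np.cross(line(*line1_pts), line(*line2_pts))
    return vp[:2] / vp[2]                       # back to pixel coordinates

# Two lane lines converging at (320, 100) in a 640x480 frame:
left = ((100, 480), (265, 195))
right = ((540, 480), (375, 195))
print(vanishing_point(left, right))             # -> [320. 100.]
```
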

  10. A sow replacement model using Bayesian updating in a three-level hierarchic Markov process. II. Optimization model

    DEFF Research Database (Denmark)

    Kristensen, Anders Ringgaard; Søllested, Thomas Algot

    2004-01-01

    Recent methodological improvements in replacement models, comprising multi-level hierarchic Markov processes and Bayesian updating, have hardly been implemented in any replacement model, and the aim of this study is to present a sow replacement model that really uses these methodological improvements. The biological model of the replacement model is described in a previous paper; in this paper the optimization model is described. The model is developed as a prototype for use under practical conditions. The application of the model is demonstrated using data from two commercial Danish sow herds. It is concluded that the Bayesian updating technique and the hierarchical structure decrease the size of the state space dramatically. Since parameter estimates vary considerably among herds, it is concluded that decision support concerning sow replacement only makes sense with parameters
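    The keep-versus-replace structure of such a replacement model can be sketched as a small Markov decision process solved by value iteration. The states, rewards, and transition probabilities below are toy values, not the paper's herd parameters:

```python
import numpy as np

# Toy sketch of the keep-vs-replace structure of a hierarchic Markov
# replacement model, solved by value iteration. States (e.g. parity
# classes), rewards, and transitions are illustrative values only.
rewards = np.array([10.0, 8.0, 5.0])   # net return per cycle in each state
P_keep = np.array([[0.0, 1.0, 0.0],    # keeping a sow ages her: i -> i+1
                   [0.0, 0.0, 1.0],
                   [0.0, 0.0, 1.0]])
replace_cost, discount = 3.0, 0.9

V = np.zeros(3)
for _ in range(200):                   # value iteration to convergence
    keep = rewards + discount * (P_keep @ V)
    replace = rewards[0] - replace_cost + discount * V[0]  # restart young
    V = np.maximum(keep, replace)

policy = np.where(keep >= replace, "keep", "replace")
print(policy.tolist())                 # -> ['keep', 'replace', 'replace']
```

    In the paper's hierarchic formulation, the Bayesian updating refines the state-specific parameters (e.g. litter-size potential) as observations accumulate, which is what shrinks the state space relative to a flat Markov model.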

  11. Ambient modal testing of a double-arch dam: the experimental campaign and model updating

    International Nuclear Information System (INIS)

    García-Palacios, Jaime H.; Soria, José M.; Díaz, Iván M.; Tirado-Andrés, Francisco

    2016-01-01

    A finite element model updating of a double-curvature arch dam (La Tajera, Spain) is carried out using the modal parameters obtained from an operational modal analysis. That is, the system modal dampings, natural frequencies, and mode shapes have been identified using output-only identification techniques under environmental loads (wind, vehicles). A finite element model of the dam-reservoir-foundation system was initially created. A testing campaign was then carried out at the most significant test points, using high-sensitivity accelerometers wirelessly synchronized. Afterwards, the initial model was updated using a Monte Carlo based approach in order to match the recorded dynamic behaviour. The updated model may be used within a structural health monitoring system for damage detection or, for instance, for the analysis of the seismic response of the coupled arch dam-reservoir-foundation system. (paper)

  12. Finite element model updating of concrete structures based on imprecise probability

    Science.gov (United States)

    Biswal, S.; Ramaswamy, A.

    2017-09-01

    Imprecise probability based methods are developed in this study for parameter estimation in finite element model updating for concrete structures, when the measurements are imprecisely defined. Bayesian analysis using the Metropolis-Hastings algorithm for parameter estimation is generalized to incorporate the imprecision present in the prior distribution, in the likelihood function, and in the measured responses. Three different cases are considered: (i) imprecision is present in the prior distribution and in the measurements only; (ii) imprecision is present in the parameters of the finite element model and in the measurements only; and (iii) imprecision is present in the prior distribution, in the parameters of the finite element model, and in the measurements. Procedures are also developed for integrating the imprecision in the parameters of the finite element model in the finite element software Abaqus. The proposed methods are then verified against reinforced concrete beams and prestressed concrete beams tested in our laboratory as part of this study.
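    A minimal sketch of the imprecise-probability flavor of such updating, assuming a toy linear model with a conjugate normal-normal update in place of the paper's Metropolis-Hastings machinery: represent the imprecise prior by a set of priors and report bounds on the posterior estimate:

```python
import numpy as np

# Sketch of imprecise-prior updating with a toy linear "model"
# y = g * theta + noise standing in for a finite element response, and a
# conjugate normal-normal update instead of Metropolis-Hastings. The
# imprecise prior is represented by a *set* of prior means.
rng = np.random.default_rng(2)

def posterior_mean(prior_mu, prior_sd, y, g, noise_sd):
    prec = 1 / prior_sd ** 2 + len(y) * g ** 2 / noise_sd ** 2
    return (prior_mu / prior_sd ** 2 + g * np.sum(y) / noise_sd ** 2) / prec

true_theta = 1.5                               # e.g., a stiffness parameter
y = 2.0 * true_theta + 0.05 * rng.standard_normal(20)
bounds = [posterior_mean(mu, 0.5, y, 2.0, 0.05) for mu in (1.0, 2.0)]
print(min(bounds), max(bounds))                # interval posterior estimate
```

    With informative data the interval collapses toward a point; with sparse or imprecise measurements it widens, which is the behavior the imprecise-probability formulation is designed to expose.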

  13. TAPIR--Finnish national geochemical baseline database.

    Science.gov (United States)

    Jarva, Jaana; Tarvainen, Timo; Reinikainen, Jussi; Eklund, Mikael

    2010-09-15

    In Finland, a Government Decree on the Assessment of Soil Contamination and Remediation Needs has generated a need for reliable and readily accessible data on geochemical baseline concentrations in Finnish soils. According to the Decree, baseline concentrations, referring both to the natural geological background concentrations and the diffuse anthropogenic input of substances, shall be taken into account in the soil contamination assessment process. This baseline information is provided in a national geochemical baseline database, TAPIR, that is publicly available via the Internet. Geochemical provinces with elevated baseline concentrations were delineated to provide regional geochemical baseline values. The nationwide geochemical datasets were used to divide Finland into geochemical provinces. Several metals (Co, Cr, Cu, Ni, V, and Zn) showed anomalous concentrations in seven regions that were defined as metal provinces. Arsenic did not follow a similar distribution to any other elements, and four arsenic provinces were separately determined. Nationwide geochemical datasets were not available for some other important elements such as Cd and Pb. Although these elements are included in the TAPIR system, their distribution does not necessarily follow the ones pre-defined for metal and arsenic provinces. Regional geochemical baseline values, presented as upper limit of geochemical variation within the region, can be used as trigger values to assess potential soil contamination. Baseline values have also been used to determine upper and lower guideline values that must be taken into account as a tool in basic risk assessment. If regional geochemical baseline values are available, the national guideline values prescribed in the Decree based on ecological risks can be modified accordingly. The national geochemical baseline database provides scientifically sound, easily accessible and generally accepted information on the baseline values, and it can be used in various

  14. Allergy-immunology practice parameters and strength of recommendation data: an evolutionary perspective.

    Science.gov (United States)

    Park, Matthew H; Banks, Taylor A; Nelson, Michael R

    2016-03-01

    The practice parameters for allergy and immunology (A/I) are a valuable tool guiding practitioners' clinical practice. The A/I practice parameters have evolved over time in the context of evidence-based medicine milestones. To identify evolutionary trends in the character, scope, and evidence underlying recommendations in the A/I practice parameters. Practice parameters that guided A/I from 1995 through 2014 were analyzed. Statements and recommendations with strength of recommendation categories A and B were considered to have a basis in evidence from controlled trials. Forty-three publications and updates covering 25 unique topics were identified. There was great variability in the number of recommendations made and in the proportion of statements with controlled trial evidence. The mean number of recommendations made per practice parameter decreased significantly, from 95.8 to 38.3. There was also a trend toward a higher proportion of recommendations based on controlled trial evidence in practice parameters with fewer recommendations: a mean of 30.7% in practice parameters with at least 100 recommendations, compared with 48.3% in those with 30 to 100 recommendations and 51.0% in those with fewer than 30. The A/I practice parameters have evolved significantly over time. Encouragingly, greater controlled trial evidence is associated with updated practice parameters and a recent trend toward more narrowly focused topics. These findings should bolster confidence in the utility of the A/I practice parameters in assisting practitioners to navigate the uncertainty that is intrinsic to medicine and to make informed decisions with patients. Published by Elsevier Inc.

  15. Retrievals of ethane from ground-based high-resolution FTIR solar observations with updated line parameters: determination of the optimum strategy for the Jungfraujoch station.

    Science.gov (United States)

    Bader, W.; Perrin, A.; Jacquemart, D.; Sudo, K.; Yashiro, H.; Gauss, M.; Demoulin, P.; Servais, C.; Mahieu, E.

    2012-04-01

    Ethane (C2H6) is the most abundant Non-Methane HydroCarbon (NMHC) in the Earth's atmosphere, with a lifetime of approximately 2 months. C2H6 has both anthropogenic and natural emission sources such as biomass burning, natural gas loss and biofuel consumption. Oxidation by the hydroxyl radical is by far the major C2H6 sink as the seasonally changing OH concentration controls the strong modulation of the ethane abundance throughout the year. Ethane lowers Cl atom concentrations in the lower stratosphere and is a major source of peroxyacetyl nitrate (PAN) and carbon monoxide (by reaction with OH). Involved in the formation of tropospheric ozone and in the destruction of atmospheric methane through changes in OH, C2H6 is a non-direct greenhouse gas with a net-global warming potential (100-yr horizon) of 5.5. The retrieval of ethane from ground-based infrared (IR) spectra is challenging. Indeed, the fitting of the ethane features is complicated by numerous interferences by strong water vapor, ozone and methane absorptions. Moreover, ethane has a complicated spectrum with many interacting vibrational modes and the current state of ethane parameters in HITRAN (e.g. : Rothman et al., 2009, see http://www.hitran.com) was rather unsatisfactory in the 3 μm region. In fact, PQ branches outside the 2973-3001 cm-1 range are not included in HITRAN, and most P and R structures are missing. New ethane absorption cross sections recorded at the Molecular Spectroscopy Facility of the Rutherford Appleton Laboratory (Harrison et al., 2010) are used in our retrievals. They were calibrated in intensity by using reference low-resolution spectra from the Pacific Northwest National Laboratory (PNNL) IR database. Pseudoline parameters fitted to these ethane spectra have been combined with HITRAN 2004 line parameters (including all the 2006 updates) for all other species encompassed in the selected microwindows. 
Also, the improvement brought by the update of the line positions and intensities

  16. An ESME update, v. 8.05

    International Nuclear Information System (INIS)

    MacLachlan, J.A.

    1993-01-01

    The program ESME for modeling the longitudinal degree of freedom of beam dynamics in proton synchrotrons is described in the ''User's Guide to ESME v. 8.0'' released 8 March 1993. This note updates the User's Guide to the state of the code at 15 June 1993. To simplify moving the code to the various UNIX machines at Fermilab and sharing it with other laboratories, it has been decided to promote the HIGZ graphics version. A few new graphics parameters are described which have been introduced in consequence. This note also corrects minor errors and omissions in the User's Guide and reports minor program enhancements. No errors producing wrong results have been found

  17. Online Support Vector Regression with Varying Parameters for Time-Dependent Data

    International Nuclear Information System (INIS)

    Omitaomu, Olufemi A.; Jeong, Myong K.; Badiru, Adedeji B.

    2011-01-01

    Support vector regression (SVR) is a machine learning technique that continues to receive interest in several domains including manufacturing, engineering, and medicine. In order to extend its application to problems in which datasets arrive constantly and in which batch processing of the datasets is infeasible or expensive, an accurate online support vector regression (AOSVR) technique was proposed. The AOSVR technique efficiently updates a trained SVR function whenever a sample is added to or removed from the training set without retraining the entire training data. However, the AOSVR technique assumes that the new samples and the training samples are of the same characteristics; hence, the same value of SVR parameters is used for training and prediction. This assumption is not applicable to data samples that are inherently noisy and non-stationary such as sensor data. As a result, we propose Accurate On-line Support Vector Regression with Varying Parameters (AOSVR-VP) that uses varying SVR parameters rather than fixed SVR parameters, and hence accounts for the variability that may exist in the samples. To accomplish this objective, we also propose a generalized weight function to automatically update the weights of SVR parameters in on-line monitoring applications. The proposed function allows for lower and upper bounds for SVR parameters. We tested our proposed approach and compared results with the conventional AOSVR approach using two benchmark time series data and sensor data from nuclear power plant. The results show that using varying SVR parameters is more applicable to time dependent data.
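
    A bounded, recency-based weight function in the spirit of the varying-parameters idea above can be sketched as follows. The exponential form, the decay rate, and the parameter bounds are assumptions for illustration, not the authors' published weight function.

```python
import math

def varying_svr_param(t_sample, t_now, p_min, p_max, decay=0.1):
    """Map a sample's age to an SVR parameter value (e.g. the penalty C)
    inside stated lower/upper bounds: recent samples get a value near
    p_max, and old samples decay toward p_min but never below it."""
    weight = math.exp(-decay * (t_now - t_sample))  # recency weight in (0, 1]
    return p_min + (p_max - p_min) * weight

# The newest sample sits at the upper bound; a 30-step-old sample is
# pulled toward the lower bound.
print(varying_svr_param(100, 100, 1.0, 100.0))
print(varying_svr_param(70, 100, 1.0, 100.0))
```

    In an online monitoring loop, each new sample would be trained with a parameter value computed this way, so noisier, older observations exert less influence on the current regression function.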

  18. Prediction of the pharmacokinetic parameters of triptolide in rats based on endogenous molecules in pre-dose baseline serum.

    Directory of Open Access Journals (Sweden)

    Linsheng Liu

    Full Text Available BACKGROUND: Individual variances usually affect drug metabolism and disposition, and hence result in either ineffectiveness or toxicity of a drug. In addition to genetic polymorphism, the multiple confounding factors of lifestyles, such as dietary preferences, contribute partially to individual variances. However, the difficulty of quantifying individual diversity greatly challenges the realization of individualized drug therapy. This study aims at quantitative evaluating the association between individual variances and the pharmacokinetics. METHODOLOGY/PRINCIPAL FINDINGS: Molecules in pre-dose baseline serum were profiled using gas chromatography mass spectrometry to represent the individual variances of the model rats provided with high fat diets (HFD, routine chows and calorie restricted (CR chows. Triptolide and its metabolites were determined using high performance liquid chromatography mass spectrometry. Metabonomic and pharmacokinetic data revealed that rats treated with the varied diets had distinctly different metabolic patterns and showed differential C(max values, AUC and drug metabolism after oral administration of triptolide. Rats with fatty chows had the lowest C(max and AUC values and the highest percentage of triptolide metabolic transformation, while rats with CR chows had the highest C(max and AUC values and the least percentage of triptolide transformation. Multivariate linear regression revealed that in baseline serum, the concentrations of creatinine and glutamic acid, which is the precursor of GSH, were linearly negatively correlated to C(max and AUC values. The glutamic acid and creatinine in baseline serum were suggested as the potential markers to represent individual diversity and as predictors of the disposal and pharmacokinetics of triptolide. CONCLUSIONS/SIGNIFICANCE: These results highlight the robust potential of metabonomics in characterizing individual variances and identifying relevant markers that have the
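
    The multivariate linear regression step reported above can be sketched with an ordinary least-squares fit. The serum values, AUC numbers, and the exact linear relationship below are synthetic stand-ins constructed so that both markers correlate negatively with exposure, as the study reports; they are not the study's data.

```python
import numpy as np

# Synthetic baseline serum levels (glutamic acid, creatinine) and AUC
# values for five hypothetical rats.
glu = np.array([1.0, 1.5, 2.0, 2.5, 3.0])
cre = np.array([2.0, 2.2, 3.1, 3.3, 4.2])
auc = 13.0 - 2.0 * glu - 1.0 * cre   # negative dependence on both markers

A = np.column_stack([glu, cre, np.ones_like(glu)])  # design matrix + intercept
coef, *_ = np.linalg.lstsq(A, auc, rcond=None)
print(coef)  # both slopes negative: higher baseline markers, lower AUC
```

    A fit like this is what lets pre-dose serum concentrations act as predictors of a subject's pharmacokinetic parameters before any drug is given.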

  19. Quantifying Update Effects in Citizen-Oriented Software

    Directory of Open Access Journals (Sweden)

    Ion Ivan

    2009-02-01

    Full Text Available Defining citizen-oriented software. Detailing technical issues regarding update process in this kind of software. Presenting different effects triggered by types of update. Building model for update costs estimation, including producer-side and consumer-side effects. Analyzing model applicability on INVMAT – large scale matrix inversion software. Proposing a model for update effects estimation. Specifying ways for softening effects of inaccurate updates.

  20. Determination of the Performance Parameters of a Spectrophotometer: An Advanced Experiment.

    Science.gov (United States)

    Cope, Virgil W.

    1978-01-01

    Describes an advanced analytical chemistry laboratory experiment developed for the determination of the performance parameters of a spectrophotometer. Among the parameters are baseline linearity with wavelength, wavelength accuracy and repeatability, stray light, noise level, and pen response time. (HM)

  1. Development of JAEA sorption database (JAEA-SDB). Update of sorption/QA data in FY2015

    International Nuclear Information System (INIS)

    Tachi, Yukio; Suyama, Tadahiro

    2016-03-01

    Sorption and diffusion of radionuclides in buffer materials (bentonite) and rocks are key processes in the safe geological disposal of radioactive waste, because migration of radionuclides in these barrier materials is expected to be diffusion-controlled and retarded by sorption. It is therefore necessary to understand the sorption and diffusion processes and to develop databases compiling reliable data and mechanistic/predictive models, so that reliable parameters can be set under the variety of geochemical conditions relevant to performance assessment (PA). For this purpose, the Japan Atomic Energy Agency (JAEA) has developed databases of sorption and diffusion parameters in bentonites and rocks. These sorption and diffusion databases (SDB/DDB) were first developed as an important basis for the H12 PA of high-level radioactive waste disposal, and have been provided through the Web. JAEA has continued to improve and update the SDB/DDB in view of potential future data needs, focusing on assuring the desired quality level and testing the usefulness of the databases for possible applications to PA-related parameter setting. The present report focuses on improving and updating the sorption database (JAEA-SDB) as a basis of an integrated approach for PA-related Kd setting and mechanistic sorption model development. This includes an overview of the database structure, contents and functions, including an additional data evaluation function focusing on statistical data evaluation and grouping of data related to potential perturbations. Kd data and their QA results are updated by focusing our recent activities on Kd setting and mechanistic model development. As a result, 11,206 Kd data from 83 references were added, and the total number of Kd values in the JAEA-SDB reached about 58,000. The QA/classified Kd data reached about 60% of all Kd data in JAEA-SDB. The updated JAEA-SDB is expected to make it possible to obtain a quick overview of the available data, and to
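
    The statistical evaluation and grouping of Kd data described above can be sketched as follows. The records, media names, and the choice of the geometric mean are illustrative assumptions, not the actual JAEA-SDB schema.

```python
import math
from collections import defaultdict

# Hypothetical (element, medium, Kd in m3/kg) entries, mimicking the
# kind of statistical screening and grouping applied to sorption data.
records = [
    ("Cs", "bentonite", 0.05), ("Cs", "bentonite", 0.2),
    ("Cs", "bentonite", 0.1),  ("Cs", "granite", 0.01),
    ("Cs", "granite", 0.04),
]

groups = defaultdict(list)
for element, medium, kd in records:
    groups[(element, medium)].append(kd)

for key in sorted(groups):
    kds = groups[key]
    # Kd spans orders of magnitude, so summarize with the geometric mean
    gmean = math.exp(sum(map(math.log, kds)) / len(kds))
    print(key, round(gmean, 4), (min(kds), max(kds)))
```

    Grouping by element and medium, then summarizing each group, is what allows a quick overview of the available data and a defensible Kd choice under given geochemical conditions.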

  2. An iterative stochastic ensemble method for parameter estimation of subsurface flow models

    International Nuclear Information System (INIS)

    Elsheikh, Ahmed H.; Wheeler, Mary F.; Hoteit, Ibrahim

    2013-01-01

    Parameter estimation for subsurface flow models is an essential step for maximizing the value of numerical simulations for future prediction and the development of effective control strategies. We propose the iterative stochastic ensemble method (ISEM) as a general method for parameter estimation based on stochastic estimation of gradients using an ensemble of directional derivatives. ISEM eliminates the need for adjoint coding and deals with the numerical simulator as a blackbox. The proposed method employs directional derivatives within a Gauss–Newton iteration. The update equation in ISEM resembles the update step in ensemble Kalman filter, however the inverse of the output covariance matrix in ISEM is regularized using standard truncated singular value decomposition or Tikhonov regularization. We also investigate the performance of a set of shrinkage based covariance estimators within ISEM. The proposed method is successfully applied on several nonlinear parameter estimation problems for subsurface flow models. The efficiency of the proposed algorithm is demonstrated by the small size of utilized ensembles and in terms of error convergence rates
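
    The core ISEM update can be illustrated on a toy black-box problem: approximate the Jacobian from an ensemble of directional derivatives, then take a Gauss-Newton step with truncated-SVD regularization. The two-parameter affine "simulator", ensemble size, and truncation tolerance are assumptions for this sketch, not the paper's test cases.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(m):
    """Toy black-box simulator: affine residuals with root at (1.0, -0.5)."""
    return np.array([m[0] - 1.0, 2.0 * (m[1] + 0.5)])

def isem_step(m, n_ens=8, eps=1e-3, tol=1e-8):
    """One ISEM-style iteration, treating forward() as a black box."""
    d0 = forward(m)
    dirs = rng.standard_normal((n_ens, m.size))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    # Directional derivatives (f(m + eps*u) - f(m)) / eps for each u
    D = np.stack([(forward(m + eps * u) - d0) / eps for u in dirs])
    Jt, *_ = np.linalg.lstsq(dirs, D, rcond=None)  # fit J^T from the ensemble
    U, s, Vt = np.linalg.svd(Jt.T, full_matrices=False)
    s_inv = np.where(s > tol, 1.0 / s, 0.0)        # truncate small singular values
    step = Vt.T @ (s_inv * (U.T @ d0))             # regularized Gauss-Newton step
    return m - step

m = np.array([5.0, 5.0])
for _ in range(3):
    m = isem_step(m)
print(m)  # converges toward the true parameters (1.0, -0.5)
```

    Because no adjoint of forward() is needed, the same loop applies unchanged to any simulator that maps parameters to observable outputs.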

  3. An iterative stochastic ensemble method for parameter estimation of subsurface flow models

    KAUST Repository

    Elsheikh, Ahmed H.

    2013-06-01

    Parameter estimation for subsurface flow models is an essential step for maximizing the value of numerical simulations for future prediction and the development of effective control strategies. We propose the iterative stochastic ensemble method (ISEM) as a general method for parameter estimation based on stochastic estimation of gradients using an ensemble of directional derivatives. ISEM eliminates the need for adjoint coding and deals with the numerical simulator as a blackbox. The proposed method employs directional derivatives within a Gauss-Newton iteration. The update equation in ISEM resembles the update step in ensemble Kalman filter, however the inverse of the output covariance matrix in ISEM is regularized using standard truncated singular value decomposition or Tikhonov regularization. We also investigate the performance of a set of shrinkage based covariance estimators within ISEM. The proposed method is successfully applied on several nonlinear parameter estimation problems for subsurface flow models. The efficiency of the proposed algorithm is demonstrated by the small size of utilized ensembles and in terms of error convergence rates. © 2013 Elsevier Inc.

  4. Are Forecast Updates Progressive?

    NARCIS (Netherlands)

    C-L. Chang (Chia-Lin); Ph.H.B.F. Franses (Philip Hans); M.J. McAleer (Michael)

    2010-01-01

    textabstractMacro-economic forecasts typically involve both a model component, which is replicable, as well as intuition, which is non-replicable. Intuition is expert knowledge possessed by a forecaster. If forecast updates are progressive, forecast updates should become more accurate, on average,

  5. Update on the Direct Detection of Supersymmetric Dark Matter

    CERN Document Server

    Ellis, John; Olive, Keith A.; Santoso, Yudi; Spanos, Vassilis C.

    2005-01-01

    We compare updated predictions for the elastic scattering of supersymmetric neutralino dark matter with the improved experimental upper limit recently published by CDMS II. We take into account the possibility that the π-nucleon Σ term may be somewhat larger than was previously considered plausible, as may be supported by the masses of exotic baryons reported recently. We also incorporate the new central value of m_t, which indirectly affects constraints on the supersymmetric parameter space, for example via calculations of the relic density. Even if a large value of Σ is assumed, the CDMS II data currently exclude only small parts of the parameter space in the constrained MSSM (CMSSM) with universal soft supersymmetry-breaking Higgs, squark and slepton masses. None of the previously-proposed CMSSM benchmark scenarios is excluded for any value of Σ, and the CDMS II data do not impinge on the domains of the CMSSM parameter space favoured at the 90 % confidence level in a recent likelihood anal...

  6. Baseline {sup 18}F-FDG PET image-derived parameters for therapy response prediction in oesophageal cancer

    Energy Technology Data Exchange (ETDEWEB)

    Hatt, Mathieu; Visvikis, Dimitris; Cheze-le Rest, Catherine [CHU Morvan, LaTIM, INSERM U650, Brest (France); Pradier, Olivier [CHU Morvan, LaTIM, INSERM U650, Brest (France); CHU Morvan, Department of Radiotherapy, Brest (France)

    2011-09-15

    The objectives of this study were to investigate the predictive value of tumour measurements on 2-deoxy-2-[{sup 18}F]fluoro-D-glucose ({sup 18}F-FDG) positron emission tomography (PET) pretreatment scan regarding therapy response in oesophageal cancer and to evaluate the impact of tumour delineation strategies. Fifty patients with oesophageal cancer treated with concomitant radiochemotherapy between 2004 and 2008 were retrospectively considered and classified as complete, partial or non-responders (including stable and progressive disease) according to Response Evaluation Criteria in Solid Tumors (RECIST). The classification of partial and complete responders was confirmed by biopsy. Tumours were delineated on the {sup 18}F-FDG pretreatment scan using an adaptive threshold and the automatic fuzzy locally adaptive Bayesian (FLAB) methodologies. Several parameters were then extracted: maximum and peak standardized uptake value (SUV), tumour longitudinal length (TL) and volume (TV), SUV{sub mean}, and total lesion glycolysis (TLG = TV x SUV{sub mean}). The correlation between each parameter and response was investigated using Kruskal-Wallis tests, and receiver-operating characteristic methodology was used to assess performance of the parameters to differentiate patients. Whereas commonly used parameters such as SUV measurements were not significant predictive factors of the response, parameters related to tumour functional spatial extent (TL, TV, TLG) allowed significant differentiation of all three groups of patients, independently of the delineation strategy, and could identify complete and non-responders with sensitivity above 75% and specificity above 85%. A systematic although not statistically significant trend was observed regarding the hierarchy of the delineation methodologies and the parameters considered, with slightly higher predictive value obtained with FLAB over adaptive thresholding, and TLG over TV and TL. TLG is a promising predictive factor of
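
    The TLG definition above (TLG = TV x SUVmean) and the receiver-operating-characteristic assessment can be sketched as follows. The volumes, SUVmean values, and the responder/non-responder split are invented for illustration, not the study's patient data.

```python
def tlg(tumour_volume_ml, suv_mean):
    """Total lesion glycolysis, as defined in the abstract: TLG = TV x SUVmean."""
    return tumour_volume_ml * suv_mean

def roc_auc(pos, neg):
    """Rank-based AUC (Mann-Whitney U / (n_pos * n_neg)): the probability
    that a randomly chosen positive case scores above a negative one."""
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical TLG values: non-responders tend to have larger tumours
non_resp = [tlg(40, 7.0), tlg(55, 6.5), tlg(30, 8.0)]
resp = [tlg(10, 5.0), tlg(8, 6.0), tlg(15, 4.5)]
print(roc_auc(non_resp, resp))
```

    Sweeping a threshold over such a parameter and reading sensitivity/specificity off the ROC curve is how cut-offs like "sensitivity above 75% and specificity above 85%" are obtained.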

  7. JLab High Efficiency Klystron Baseline Design for 12 GeV Upgrade

    International Nuclear Information System (INIS)

    Hovater, J.; Delayen, Jean; Harwood, Leigh; Nelson, Richard; Wang, Haipeng

    2003-01-01

    A computer design of a 13.5 kW, 1497 MHz, CW, 55% efficiency, 0.8 microperveance, ∼40 dB gain, 5-cavity klystron has been developed for the JLab 12 GeV Upgrade project. The design uses TRICOMP codes to simulate the gun, mod-anode section, solenoid focus channel and beam dump. The klystron tube was designed initially with the JPNDISK (1D) code and then optimized with the MASK (2D) code for the baseline parameters. All of these codes have been benchmarked against JLab's 5 kW operational klystrons. The details of the design parameters and the MAFIA (3D) simulations of the cavity couplings, tuners, and window are also presented.

  8. Updating of working memory: lingering bindings.

    Science.gov (United States)

    Oberauer, Klaus; Vockenberg, Kerstin

    2009-05-01

    Three experiments investigated proactive interference and proactive facilitation in a memory-updating paradigm. Participants remembered several letters or spatial patterns, distinguished by their spatial positions, and updated them with new stimuli up to 20 times per trial. Self-paced updating times were shorter when an item previously remembered and then replaced reappeared in the same location than when it reappeared in a different location. This effect demonstrates residual memory for no-longer-relevant bindings of items to locations. The effect increased with the number of items to be remembered. With one exception, updating times did not increase, and recall of final values did not decrease, over successive updating steps, thus providing little evidence for proactive interference building up cumulatively.

  9. LLNL Containment Program nuclear test effects and geologic data base: glossary and parameter definitions

    International Nuclear Information System (INIS)

    Howard, N.W.

    1983-01-01

    This report lists, defines, and updates Parameters in DBASE, an LLNL test effects data bank in which data are stored from experiments performed at NTS and other test sites. Parameters are listed by subject and by number. Part 2 of this report presents the same information for parameters for which some of the data may be classified; it was issued in 1979 and is not being reissued at this time as it is essentially unchanged

  10. How do we update faces? Effects of gaze direction and facial expressions on working memory updating

    Directory of Open Access Journals (Sweden)

    Caterina eArtuso

    2012-09-01

    Full Text Available The aim of the study was to investigate how the biological binding between different facial dimensions, and their social and communicative relevance, may impact updating processes in working memory (WM). We focused on WM updating because it plays a key role in ongoing processing. Gaze direction and facial expression are crucial and changeable components of face processing. Direct gaze enhances the processing of approach-oriented facial emotional expressions (e.g., joy), while averted gaze enhances the processing of avoidance-oriented facial emotional expressions (e.g., fear). Thus, the way in which these two facial dimensions are combined communicates to the observer important behavioral and social information. Updating of these two facial dimensions and their bindings has not been investigated before, despite the fact that they provide a piece of social information essential for building and maintaining an internal ongoing representation of our social environment. In Experiment 1 we created a task in which the binding between gaze direction and facial expression was manipulated: high binding conditions (e.g., joy-direct gaze) were compared to low binding conditions (e.g., joy-averted gaze). Participants had to study and update continuously a number of faces, displaying different bindings between the two dimensions. In Experiment 2 we tested whether updating was affected by the social and communicative value of the facial dimension binding; to this end, we manipulated bindings between eye and hair color, two less communicative facial dimensions. Two new results emerged. First, faster response times were found in updating combinations of facial dimensions highly bound together. Second, our data showed that the ease of the ongoing updating processing varied depending on the communicative meaning of the binding that had to be updated. The results are discussed with reference to the role of WM updating in social cognition and appraisal processes.

  11. How do we update faces? Effects of gaze direction and facial expressions on working memory updating.

    Science.gov (United States)

    Artuso, Caterina; Palladino, Paola; Ricciardelli, Paola

    2012-01-01

    The aim of the study was to investigate how the biological binding between different facial dimensions, and their social and communicative relevance, may impact updating processes in working memory (WM). We focused on WM updating because it plays a key role in ongoing processing. Gaze direction and facial expression are crucial and changeable components of face processing. Direct gaze enhances the processing of approach-oriented facial emotional expressions (e.g., joy), while averted gaze enhances the processing of avoidance-oriented facial emotional expressions (e.g., fear). Thus, the way in which these two facial dimensions are combined communicates to the observer important behavioral and social information. Updating of these two facial dimensions and their bindings has not been investigated before, despite the fact that they provide a piece of social information essential for building and maintaining an internal ongoing representation of our social environment. In Experiment 1 we created a task in which the binding between gaze direction and facial expression was manipulated: high binding conditions (e.g., joy-direct gaze) were compared to low binding conditions (e.g., joy-averted gaze). Participants had to study and update continuously a number of faces, displaying different bindings between the two dimensions. In Experiment 2 we tested whether updating was affected by the social and communicative value of the facial dimension binding; to this end, we manipulated bindings between eye and hair color, two less communicative facial dimensions. Two new results emerged. First, faster response times were found in updating combinations of facial dimensions highly bound together. Second, our data showed that the ease of the ongoing updating processing varied depending on the communicative meaning of the binding that had to be updated. The results are discussed with reference to the role of WM updating in social cognition and appraisal processes.

  12. Analyzing of singlet fermionic dark matter via the updated direct detection data

    Energy Technology Data Exchange (ETDEWEB)

    Ettefaghi, M.M.; Moazzemi, R. [University of Qom, Department of Physics, Qom (Iran, Islamic Republic of)

    2017-05-15

    We revisit the parameter space of singlet fermionic cold dark matter model in order to determine the role of the mixing angle between the standard model Higgs and a new singlet one. Furthermore, we restudy the direct detection constraints with the updated and new experimental data. As an important conclusion, this model is completely excluded by recent XENON100, PandaX II and LUX data. (orig.)

  13. 49 CFR 1002.3 - Updating user fees.

    Science.gov (United States)

    2010-10-01

    ... updating fees. Each fee shall be updated by updating the cost components comprising the fee. Cost... direct labor costs are direct labor costs determined by the cost study set forth in Revision of Fees For... by total office costs for the Offices directly associated with user fee activity. Actual updating of...

  14. 40 CFR 1042.825 - Baseline determination.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Baseline determination. 1042.825... Provisions for Remanufactured Marine Engines § 1042.825 Baseline determination. (a) For the purpose of this... not valid. (f) Use good engineering judgment for all aspects of the baseline determination. We may...

  15. Improving the accuracy of energy baseline models for commercial buildings with occupancy data

    International Nuclear Information System (INIS)

    Liang, Xin; Hong, Tianzhen; Shen, Geoffrey Qiping

    2016-01-01

    Highlights: • We evaluated several baseline models predicting energy use in buildings. • Including occupancy data improved accuracy of baseline model prediction. • Occupancy is highly correlated with energy use in buildings. • This simple approach can be used in decision makings of energy retrofit projects. - Abstract: More than 80% of energy is consumed during operation phase of a building’s life cycle, so energy efficiency retrofit for existing buildings is considered a promising way to reduce energy use in buildings. The investment strategies of retrofit depend on the ability to quantify energy savings by “measurement and verification” (M&V), which compares actual energy consumption to how much energy would have been used without retrofit (called the “baseline” of energy use). Although numerous models exist for predicting baseline of energy use, a critical limitation is that occupancy has not been included as a variable. However, occupancy rate is essential for energy consumption and was emphasized by previous studies. This study develops a new baseline model which is built upon the Lawrence Berkeley National Laboratory (LBNL) model but includes the use of building occupancy data. The study also proposes metrics to quantify the accuracy of prediction and the impacts of variables. However, the results show that including occupancy data does not significantly improve the accuracy of the baseline model, especially for HVAC load. The reasons are discussed further. In addition, sensitivity analysis is conducted to show the influence of parameters in baseline models. The results from this study can help us understand the influence of occupancy on energy use, improve energy baseline prediction by including the occupancy factor, reduce risks of M&V and facilitate investment strategies of energy efficiency retrofit.
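
    The baseline-model comparison described above can be sketched with a small regression and the CV(RMSE) accuracy metric. The temperature/occupancy/energy numbers and the linear model form are synthetic assumptions; note that in this toy data occupancy removes all residual error by construction, whereas the study found the improvement was not significant on real buildings.

```python
import numpy as np

# Hypothetical hourly records: outdoor temperature (°C), occupancy
# fraction, and metered energy (kWh).
temp = np.array([10., 15., 20., 25., 30., 12., 18., 28.])
occ = np.array([0.2, 0.9, 0.8, 0.3, 0.9, 0.1, 0.7, 0.6])
kwh = 2.0 * temp + 30.0 * occ + 5.0   # synthetic ground truth

def fit_predict(X, y):
    """Ordinary least-squares fit with intercept, returning predictions."""
    A = np.column_stack([X, np.ones(len(y))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return A @ coef

def cv_rmse(y, yhat):
    """Coefficient of variation of RMSE, a standard M&V accuracy metric."""
    return np.sqrt(np.mean((y - yhat) ** 2)) / np.mean(y)

base = cv_rmse(kwh, fit_predict(temp[:, None], kwh))          # temperature only
with_occ = cv_rmse(kwh, fit_predict(np.column_stack([temp, occ]), kwh))
print(base, with_occ)
```

    Comparing the two CV(RMSE) values is exactly the kind of metric-based check the study uses to decide whether adding occupancy is worth the extra data collection.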

  16. Assessment of Multiple Intrauterine Gestations from Ovarian Stimulation (AMIGOS) Trial: Baseline Characteristics

    Science.gov (United States)

    Diamond, Michael P.; Legro, Richard S.; Coutifaris, Christos; Alvero, Ruben; Robinson, Randal D.; Casson, Peter; Christman, Gregory M.; Ager, Joel; Huang, Hao; Hansen, Karl R.; Baker, Valerie; Usadi, Rebecca; Seungdamrong, Aimee; Bates, G. Wright; Rosen, R. Mitchell; Haisonleder, Daniell; Krawetz, Stephen A.; Barnhart, Kurt; Trussell, J.C.; Jin, Yufeng; Santoro, Nanette; Eisenberg, Esther; Zhang, Heping

    2015-01-01

    Objective: To identify baseline characteristics of women with unexplained infertility to determine whether treatment with an aromatase inhibitor will result in a lower rate of multiple gestations than current standard ovulation induction medications. Design: Randomized, prospective clinical trial. Patients: 900 couples with unexplained infertility. Interventions: Ovarian stimulation with gonadotropins, clomiphene citrate, or letrozole in conjunction with intrauterine insemination; collection of baseline demographics, blood samples, and ultrasonographic assessments. Setting: Multicenter, university-based clinical practices. Main Outcome Measures: Demographic, laboratory, imaging, and survey characteristics. Results: Demographic characteristics of women receiving clomiphene citrate, letrozole, or gonadotropins for ovarian stimulation were very consistent. Their mean age was 32.2 ± 4.4 years and infertility duration was 34.7 ± 25.7 months, with 59% primary infertility. More than one third of the women were current or past smokers. The mean BMI was 27 and mean AMH level was 2.6; only 11 women (1.3%) had antral follicle counts of less than 5. Similar observations were identified for hormonal profiles, ultrasound characterization of the ovaries, semen parameters, and quality of life assessments in both male and female partners. Conclusion: The cause of infertility in the couples recruited to this treatment trial is elusive, as the women were regularly ovulating and had evidence of good ovarian reserve by basal FSH, AMH levels, and antral follicle counts; the male partners had normal semen parameters. The three treatment subgroups have common baseline characteristics, thereby providing comparable patient populations for testing the hypothesis that use of letrozole for ovarian stimulation can reduce the rates of multiples from that observed with gonadotropin and clomiphene citrate treatment. PMID:25707331

  17. Oscillation Baselining and Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2017-03-27

    PNNL developed a new tool for oscillation analysis and baselining. This tool has been developed under a new DOE Grid Modernization Laboratory Consortium (GMLC) project (GM0072, “Suite of open-source applications and models for advanced synchrophasor analysis”) and is based on the open platform for PMU analysis. The Oscillation Baselining and Analysis Tool (OBAT) performs oscillation analysis and identifies the modes of oscillation (frequency, damping, energy, and shape). The tool also performs oscillation event baselining (finding correlations between oscillation characteristics and system operating conditions).
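
    As a rough illustration of the mode-identification step (frequency and damping from a ringdown record), here is a minimal Python sketch. It is not the OBAT implementation; the method (FFT peak plus a log-decrement fit to the envelope), the signal, and the 30 frames/s PMU reporting rate are illustrative assumptions:

```python
import numpy as np

def identify_mode(signal, fs):
    """Estimate the dominant oscillation frequency (Hz) and damping ratio
    from a ringdown record. Frequency: FFT peak. Damping: straight-line
    fit to the log of successive positive peaks (log decrement)."""
    n = len(signal)
    spec = np.abs(np.fft.rfft(signal * np.hanning(n)))
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    f_hz = freqs[np.argmax(spec[1:]) + 1]           # skip the DC bin
    # local positive maxima approximate samples of the decaying envelope
    peaks = [i for i in range(1, n - 1)
             if signal[i] > signal[i - 1] and signal[i] > signal[i + 1]
             and signal[i] > 0]
    t_pk = np.array(peaks) / fs
    slope, _ = np.polyfit(t_pk, np.log(np.asarray(signal)[peaks]), 1)
    sigma = -slope                                   # decay rate (1/s)
    zeta = sigma / np.sqrt(sigma**2 + (2 * np.pi * f_hz)**2)
    return f_hz, zeta

# synthetic 0.5 Hz inter-area mode with exponential decay
fs = 30.0                                # PMU reporting rate (frames/s)
t = np.arange(0, 20, 1 / fs)
y = np.exp(-0.15 * t) * np.cos(2 * np.pi * 0.5 * t)
f_est, zeta_est = identify_mode(y, fs)
```

    Real ambient PMU data would need windowing, detrending, and more robust estimators (e.g., Prony or subspace methods), which is where a dedicated tool earns its keep.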

  18. Reference Physiological Ranges for Serum Biochemical Parameters ...

    African Journals Online (AJOL)

    drugs includes measurement of changes in physiological parameters of subjects from known established baseline ... Methods: After informed consent, blood and urine samples were collected from a total of 576 ... a major public health problem in Cameroon with a .... sample collection, processing, storage and handling.

  19. The Sensitivity of the Input Impedance Parameters of Track Circuits to Changes in the Parameters of the Track

    Directory of Open Access Journals (Sweden)

    Lubomir Ivanek

    2017-01-01

    Full Text Available This paper deals with the sensitivity of the input impedance of an open track circuit in the event that the parameters of the track are changed. Weather conditions and the state of pollution are the most common reasons for parameter changes. The results were obtained from the measured values of the parameters R (resistance), G (conductance), L (inductance), and C (capacitance) of a rail superstructure depending on the frequency. Measurements were performed on a railway siding in Orlova. The results are used to design a predictor of occupancy of a track section; in particular, we were interested in the frequencies of 75 and 275 Hz for this purpose. Many parameter values of track substructures have already been published in the literature. At first, we had planned to use the parameter values from these sources when designing the predictor; deviations between them, however, are large and often differ by three orders of magnitude (see Tab. 8). From this perspective, this article presents data that have been updated using modern measurement devices and computer technology. Above all, it presents the transmission (cascade) matrix used to determine the parameters.
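
    To illustrate how such a cascade (ABCD) matrix yields the input impedance of a track section, here is a hedged Python sketch based on standard uniform transmission-line theory; the per-kilometre R, L, G, C values below are invented placeholders, not the measured Orlova data:

```python
import cmath

def track_input_impedance(R, L, G, C, length, f, open_end=True):
    """Input impedance of a track section modelled as a uniform
    transmission line via its cascade (ABCD) matrix.
    R [ohm/km], L [H/km], G [S/km], C [F/km], length [km], f [Hz]."""
    w = 2 * cmath.pi * f
    z = R + 1j * w * L            # series impedance per km
    y = G + 1j * w * C            # shunt admittance per km
    gamma = cmath.sqrt(z * y)     # propagation constant
    z0 = cmath.sqrt(z / y)        # characteristic impedance
    A = cmath.cosh(gamma * length)
    B = z0 * cmath.sinh(gamma * length)
    Cm = cmath.sinh(gamma * length) / z0
    D = A
    if open_end:                  # unoccupied, open-ended track circuit
        return A / Cm             # limit of (A*ZL + B)/(Cm*ZL + D), ZL -> inf
    return B / D                  # short-circuited end (axle shunt)

# sensitivity of |Zin| at 75 Hz to ballast conductance G (dry vs wet ballast;
# all per-km values are illustrative assumptions)
z_dry = track_input_impedance(0.2, 1.5e-3, 0.1, 1.2e-8, 1.0, 75.0)
z_wet = track_input_impedance(0.2, 1.5e-3, 1.0, 1.2e-8, 1.0, 75.0)
```

    Raising the ballast conductance (wet or polluted ballast) markedly lowers the magnitude of the input impedance, which is exactly the sensitivity an occupancy predictor must tolerate.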

  20. Forecasting Sensorimotor Adaptability from Baseline Inter-Trial Correlations

    Science.gov (United States)

    Beaton, K. H.; Bloomberg, J. J.

    2014-01-01

    measured in the frequency domain. Therefore, we use the power spectrum (PS), which is the Fourier transform of the ACF, to describe our inter-trial correlations. The decay of the PS yields a straight line on a log-log frequency plot, which we quantify by Beta = -(slope of PS on log-log axes). Hence, Beta is a measure of the strength of inter-trial correlations in the baseline data. Larger Beta values are indicative of longer inter-trial correlations. Experimental Approach: We will begin by performing a retrospective analysis of treadmill-gait adaptation data previously collected by Dr. Bloomberg and colleagues. Specifically, we will quantify the strength of inter-trial correlations in the baseline step cadence and heart rate data and compare it to the locomotor adaptability performance results already described by these investigators. Incorporating these datasets will also allow us to explore the applicability of (and potential limitations surrounding) the use of Beta in forecasting physiological performance. We will also perform a new experiment, in which Beta will be derived from baseline data collected during over-ground (non-treadmill) walking, which will enable us to consider locomotor performance, through the parameter Beta, under the most functionally relevant, natural gait condition. This experiment will incorporate two baseline and five post-training over-ground locomotion tests to explore the consistency and potential adaptability of the Beta values themselves. HYPOTHESES: We hypothesize that the strength of baseline inter-trial correlations of step cadence and heart rate will relate to locomotor adaptability. Specifically, we anticipate that individuals who show weaker longer-term inter-trial correlations in baseline step cadence data will be the better adaptors, as step cadence can be modified in real-time (i.e., online corrections are an inherent property of the locomotor system; analogous to results observed in the VOR).
Conversely, because heart rate is not
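
    The Beta metric described above (the negated log-log slope of the power spectrum) can be sketched in a few lines of Python; this is an illustrative reimplementation, not the authors' code, and the restriction of the fit to the low-frequency scaling regime is an assumption:

```python
import numpy as np

def beta_from_series(x):
    """Beta = -(slope of the power spectrum on log-log axes).
    White (uncorrelated) series give Beta near 0; strongly correlated
    series such as a random walk give Beta near 2."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    ps = np.abs(np.fft.rfft(x)) ** 2
    f = np.fft.rfftfreq(len(x))
    keep = (f > 0) & (f < 0.1)       # fit the low-frequency scaling regime
    slope, _ = np.polyfit(np.log(f[keep]), np.log(ps[keep]), 1)
    return -slope

rng = np.random.default_rng(0)
white = rng.standard_normal(4096)    # uncorrelated inter-trial series
walk = np.cumsum(white)              # strongly correlated series
b_white = beta_from_series(white)
b_walk = beta_from_series(walk)
```

    For short trial series (tens of trials rather than thousands), periodogram estimates of Beta are noisy, and detrended fluctuation analysis is a common alternative.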

  1. Grading of Parameters for Urban Tree Inventories by City Officials, Arborists, and Academics Using the Delphi Method

    Science.gov (United States)

    Östberg, Johan; Delshammar, Tim; Wiström, Björn; Nielsen, Anders Busse

    2013-03-01

    Tree inventories are expensive to conduct and update, so the value of every inventory carried out must be maximized. However, increasing the number of constituent parameters increases the cost of performing and updating the inventory, illustrating the need for careful parameter selection. This article reports the results of a systematic expert rating of tree inventory parameters, aiming to quantify the relative importance of each parameter. Using the Delphi method, panels comprising city officials, arborists, and academics rated a total of 148 parameters. By total mean score, the top-ranking parameters, which can serve as a guide for decision-making at the practical level and for standardization of tree inventories, were: Scientific name of the tree species and genera, Vitality, Coordinates, Hazard class, and Identification number. The study also examined whether the different responsibilities and usage of urban tree databases among organizations and people engaged in urban tree inventories affected their prioritization. The results revealed noticeable dissimilarities in the ranking of parameters between the panels, underlining the need for collaboration between the research community and those commissioning, administrating, and conducting inventories. Only by applying such a transdisciplinary approach to parameter selection can urban tree inventories be strengthened and made more relevant.

  2. Impact of line parameter database, continuum absorption, full grind configuration, and L1B update on GOSAT TIR methane retrieval

    Science.gov (United States)

    Yamada, A.; Saitoh, N.; Nonogaki, R.; Imasu, R.; Shiomi, K.; Kuze, A.

    2016-12-01

    The thermal infrared (TIR) band of the Thermal and Near-infrared Sensor for Carbon Observation Fourier Transform Spectrometer (TANSO-FTS) onboard the Greenhouse Gases Observing Satellite (GOSAT) observes the CH4 profile in the wavenumber range from 1210 cm-1 to 1360 cm-1, which includes the CH4 ν4 band. The current retrieval algorithm (V1.0) uses LBLRTM V12.1 with the AER V3.1 line database to calculate optical depth. LBLRTM V12.1 includes the MT_CKD 2.5.2 model to calculate continuum absorption. The continuum absorption has large uncertainty, especially in the temperature-dependent coefficient, between the BPS model and the MT_CKD model in the wavenumber region of 1210-1250 cm-1 (Paynter and Ramaswamy, 2014). The purpose of this study is to assess the impact on CH4 retrieval of the line parameter databases and of the uncertainty of the continuum absorption. We used the AER V1.0 database, the HITRAN2004 database, the HITRAN2008 database, the AER V3.2 database, and the HITRAN2012 database (Rothman et al., 2005, 2009, and 2013; Clough et al., 2005). The AER V1.0 database is based on HITRAN2000. The CH4 line parameters of the AER V3.1 and V3.2 databases are developed from HITRAN2008, including updates until May 2009, with line-mixing parameters. We compared the retrieved CH4 with the HIPPO CH4 observations (Wofsy et al., 2012). The difference for AER V3.2 was the smallest, at 24.1 ± 45.9 ppbv. The differences for AER V1.0, HITRAN2004, HITRAN2008, and HITRAN2012 were 35.6 ± 46.5 ppbv, 37.6 ± 46.3 ppbv, 32.1 ± 46.1 ppbv, and 35.2 ± 46.0 ppbv, respectively. Comparing the AER V3.2 case to the HITRAN2008 case, the line-coupling effect reduced the difference by 8.0 ppbv. Median values of the residual difference from HITRAN2008 to AER V1.0, HITRAN2004, AER V3.2, and HITRAN2012 were 0.6 K, 0.1 K, -0.08 K, and 0.08 K, respectively, while median values of the transmittance difference were less than 0.0003 and the transmittance differences have small wavenumber dependence. We also discuss the retrieval error from the uncertainty of the continuum absorption, the test of full grid

  3. Two updating methods for dissipative models with non symmetric matrices

    International Nuclear Information System (INIS)

    Billet, L.; Moine, P.; Aubry, D.

    1997-01-01

    In this paper, the feasibility of extending two updating methods to rotating machinery models is considered; the particularity of rotating machinery models is their use of non-symmetric stiffness and damping matrices. It is shown that the two methods described here, the inverse eigensensitivity method and the error in constitutive relation method, can be adapted to such models given some modifications. As far as the inverse sensitivity method is concerned, an error function based on the difference between calculated and measured right-hand eigenmode shapes, and between calculated and measured eigenvalues, is used. Concerning the error in constitutive relation method, the equation which defines the error has to be modified because the stiffness matrix is not positive definite. The advantage of this modification is that, in some cases, it is possible to focus the updating process on some specific model parameters. Both methods were validated on a simple test model consisting of a two-bearing and disc rotor system. (author)
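
    The eigensensitivity-updating idea can be sketched numerically on a toy system: perturb a model parameter, measure how the eigenvalues move, and take a least-squares step toward the measured eigenvalues. This sketch uses a symmetric two-DOF chain for brevity (the paper's point is precisely the extension to non-symmetric matrices), and all values are illustrative:

```python
import numpy as np

def eigvals_model(k):
    """Eigenvalues of a 2-DOF spring-mass chain with unit masses and
    springs [k, 1.0]; k is the parameter to be updated (illustrative)."""
    K = np.array([[k + 1.0, -1.0],
                  [-1.0, 1.0]])
    return np.sort(np.linalg.eigvalsh(K))    # mass matrix = identity

# "measured" eigenvalues generated from a reference stiffness k* = 2.0
lam_meas = eigvals_model(2.0)

k = 1.0                                      # initial (wrong) model value
for _ in range(20):
    lam = eigvals_model(k)
    d = 1e-6                                 # finite-difference step
    S = (eigvals_model(k + d) - lam) / d     # eigenvalue sensitivities d(lam)/dk
    r = lam_meas - lam                       # eigenvalue residual
    k += float(S @ r) / float(S @ S)         # Gauss-Newton least-squares update
```

    For non-symmetric matrices the same loop would use left and right eigenvectors of a generalized eigenproblem, which is the adaptation the paper addresses.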

  4. Process industry energy retrofits: the importance of emission baselines for greenhouse gas reductions

    International Nuclear Information System (INIS)

    Aadahl, Anders; Harvey, Simon; Berntsson, Thore

    2004-01-01

    Fuel combustion for heat and/or electric power production is often the largest contributor to greenhouse gas (GHG) emissions from an industrial process plant. Economically feasible options to reduce these emissions include fuel switching and retrofitting the plant's energy system. Process integration methods and tools can be used to evaluate potential retrofit measures. For assessing the GHG emissions reduction potential of the measures considered, it is also necessary to define appropriate GHG emission baselines. This paper presents a systematic GHG emission calculation method for retrofit situations including improved heat exchange, integration of combined heat and power (CHP) units, and combinations of both. The proposed method is applied to five different industrial processes in order to compare the impact of process-specific parameters and energy-market-specific parameters. The results of the applied study reveal that, for potential GHG emission reductions, electricity grid emissions are significantly more important than differences between individual processes. Based on the results of the study, it is suggested that a conservative emission baseline is most appropriate for sustainable investment decisions. Even so, new industrial CHP in the Northern European energy market could play a significant role in the common effort to decrease GHG emissions.
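
    To see why the grid emission baseline dominates the sign of the result, consider a back-of-envelope account of a CHP retrofit. This is not the paper's method; every number below is invented for illustration:

```python
def chp_ghg_balance(heat_mwh, fuel_ef, boiler_eff,
                    chp_fuel_per_mwh_heat, elec_per_mwh_heat, grid_ef):
    """Annual GHG change (tCO2) from replacing boiler heat with CHP:
    extra on-site fuel emissions minus the credit for displaced grid power.
    A negative result means a net emission reduction."""
    boiler_fuel = heat_mwh / boiler_eff              # MWh fuel, boiler case
    chp_fuel = heat_mwh * chp_fuel_per_mwh_heat      # MWh fuel, CHP case
    onsite = (chp_fuel - boiler_fuel) * fuel_ef      # extra on-site tCO2
    credit = heat_mwh * elec_per_mwh_heat * grid_ef  # displaced grid tCO2
    return onsite - credit

heat = 100_000  # MWh heat per year (illustrative plant)
# same plant, two emission baselines for the displaced grid electricity:
# a coal-margin grid (~0.8 tCO2/MWh) vs a low-carbon grid (~0.1 tCO2/MWh)
balance_coal_margin = chp_ghg_balance(heat, 0.2, 0.9, 1.5, 0.45, 0.8)
balance_low_carbon = chp_ghg_balance(heat, 0.2, 0.9, 1.5, 0.45, 0.1)
```

    With identical process-side numbers, the retrofit is a large net reduction against a coal-margin baseline and a net increase against a low-carbon baseline, which is why baseline choice, not process detail, drives the conclusion.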

  5. An updated nuclear criticality slide rule

    International Nuclear Information System (INIS)

    Hopper, C.M.; Broadhead, B.L.

    1998-04-01

    This Volume 2 contains the functional version of the updated nuclear criticality slide rule (more accurately, sliding graphs) that is referenced in An Updated Nuclear Criticality Slide Rule: Technical Basis, NUREG/CR-6504, Vol. 1 (ORNL/TM-13322/V1). This functional slide rule provides a readily usable "in-hand" method for estimating pertinent nuclear criticality accident information from sliding graphs, thereby permitting (1) the rapid estimation of pertinent criticality accident information without laborious or sophisticated calculations in a nuclear criticality emergency situation, (2) the appraisal of potential fission yields and external personnel radiation exposures for facility safety analyses, and (3) a technical basis for emergency preparedness and training programs at nonreactor nuclear facilities. The slide rule permits the estimation of neutron and gamma dose rates and integrated doses based upon estimated fission yields, distance from the fission source, and time after the criticality accident for five different critical systems. Another sliding graph permits the estimation of critical solution fission yields based upon fissile material concentration, critical vessel geometry, and solution addition rate. Another graph provides neutron and gamma dose-reduction factors for water, steel, and concrete. Graphs from historic documents are provided as references for estimating critical parameters of various fissile material systems. Conversion factors for various English and metric units are provided for quick reference.

  6. NPP/NPOESS Tools for Rapid Algorithm Updates

    Science.gov (United States)

    Route, G.; Grant, K. D.; Hughes, R.

    2010-12-01

    The National Oceanic and Atmospheric Administration (NOAA), Department of Defense (DoD), and National Aeronautics and Space Administration (NASA) are jointly acquiring the next-generation weather and environmental satellite system; the National Polar-orbiting Operational Environmental Satellite System (NPOESS). NPOESS replaces the current Polar-orbiting Operational Environmental Satellites (POES) managed by NOAA and the Defense Meteorological Satellite Program (DMSP) managed by the DoD. The NPOESS Preparatory Project (NPP) and NPOESS satellites will carry a suite of sensors that collect meteorological, oceanographic, climatological, and solar-geophysical observations of the earth, atmosphere, and space. The ground data processing segment for NPOESS is the Interface Data Processing Segment (IDPS), developed by Raytheon Intelligence and Information Systems. The IDPS processes both NPP and NPOESS satellite data to provide environmental data products to NOAA and DoD processing centers operated by the United States government. The Northrop Grumman Aerospace Systems (NGAS) Algorithms and Data Products (A&DP) organization is responsible for the algorithms that produce the Environmental Data Records (EDRs), including their quality aspects. As the Calibration and Validation (Cal/Val) activities move forward following both the NPP launch and subsequent NPOESS launches, rapid algorithm updates may be required. Raytheon and Northrop Grumman have developed tools and processes to enable changes to be evaluated, tested, and moved into the operational baseline in a rapid and efficient manner. This presentation will provide an overview of the tools available to the Cal/Val teams to ensure rapid and accurate assessment of algorithm changes, along with the processes in place to ensure baseline integrity.

  7. 33 CFR 2.20 - Territorial sea baseline.

    Science.gov (United States)

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Territorial sea baseline. 2.20... JURISDICTION Jurisdictional Terms § 2.20 Territorial sea baseline. Territorial sea baseline means the line.... Normally, the territorial sea baseline is the mean low water line along the coast of the United States...

  8. Concepts of incremental updating and versioning

    CSIR Research Space (South Africa)

    Cooper, Antony K

    2004-07-01

    Full Text Available of the work undertaken recently by the Working Group (WG). The WG was voted to become a Commission by the General Assembly held at the 21st ICC in Durban, South Africa. The basic problem being addressed by the Commission is that a user compiles their data base... or election). Historically, updates have been provided in bulk, with the new data set replacing the old one. Users could: ignore the update (if it is not significant enough), manually (and selectively) update their data base, or accept the whole update...

  9. Toxicity-adjusted dose (TAD) administration of chemotherapy: Effect of baseline and nadir neutrophil count in patients with breast, ovarian, and lung cancer?

    DEFF Research Database (Denmark)

    Carus, Andreas; Donskov, Frede; Gebski, Val

    2011-01-01

    Background: In some solid cancers a survival benefit has been observed for patients who had chemotherapy-induced neutropenia. The prognostic impact of baseline and nadir blood neutrophils was assessed in the present study. Methods: Data on patients with breast cancer st.I-IV, ovarian cancer st.......Survival data were updated 2010. Results: A total of 819 patients were identified, comprising 507 patients with breast cancer, 118 patients with ovarian cancer, 115 patients with NSCLC and 79 patients with SCLC. Median survival for ovarian cancer patients obtaining nadir neutropenia below 2.0 × 10^9/l was 56...... months. In contrast, median survival for ovarian cancer patients who had nadir neutropenia above 2.0 was 27 months. In a multivariate analysis, adjusting for well-known prognostic features, nadir neutropenia below 2.0 was statistically significant (HR 1.73; p=0.03). In patients with NSCLC, baseline

  10. Program Baseline Change Control Board charter

    International Nuclear Information System (INIS)

    1993-02-01

    The purpose of this Charter is to establish the Program Baseline Change Control Board (PBCCB) for the Office of Civilian Radioactive Waste Management (OCRWM) Program, and to describe its organization, responsibilities, and basic methods of operation. Guidance for implementing this Charter is provided by the OCRWM Baseline Management Plan (BMP) and the OCRWM Program Baseline Change Control Procedure.

  11. CO2 line-mixing database and software update and its tests in the 2.1 μm and 4.3 μm regions

    International Nuclear Information System (INIS)

    Lamouroux, J.; Régalia, L.; Thomas, X.; Vander Auwera, J.; Gamache, R.R.; Hartmann, J.-M.

    2015-01-01

    An update of the former version of the database and software for the calculation of CO2-air absorption coefficients taking line mixing into account [Lamouroux et al. J Quant Spectrosc Radiat Transf 2010;111:2321] is described. In this new edition, the data sets were constructed using parameters from the 2012 version of the HITRAN database and recent measurements of line-shape parameters. Among other improvements, speed-dependent profiles can now be used if line mixing is treated within the first-order approximation. This new package is tested using laboratory spectra measured in the 2.1 μm and 4.3 μm spectral regions for various pressures, temperatures and CO2 concentration conditions. Despite improvements at 4.3 μm at room temperature, the conclusions on the quality of this update are more ambiguous at low temperature and in the 2.1 μm region. Further tests using laboratory and atmospheric spectra are thus required for the evaluation of the performances of this updated package. - Highlights: • High resolution infrared spectroscopy. • CO2 in air. • Updated tools. • Line mixing database and software
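
    The first-order line-mixing treatment mentioned above is commonly written in the Rosenkranz form, where each Lorentzian line acquires an asymmetric term proportional to a mixing coefficient Y. Here is a minimal sketch of that profile; the line positions, intensities, widths, and Y values below are invented for illustration, not HITRAN data:

```python
import numpy as np

def absorption_first_order(nu, lines):
    """Absorption coefficient with Rosenkranz first-order line mixing:
    each line contributes (S/pi) * (gamma + Y*(nu - nu0)) / ((nu - nu0)^2 + gamma^2).
    With Y = 0 this reduces to a plain Lorentzian."""
    alpha = np.zeros_like(nu)
    for nu0, S, gamma, Y in lines:
        dnu = nu - nu0
        alpha += (S / np.pi) * (gamma + Y * dnu) / (dnu**2 + gamma**2)
    return alpha

nu = np.linspace(2300.0, 2302.0, 2001)   # wavenumber grid (cm-1), illustrative
no_mix = absorption_first_order(nu, [(2300.8, 1.0, 0.07, 0.0),
                                     (2301.2, 1.0, 0.07, 0.0)])
mixed = absorption_first_order(nu, [(2300.8, 1.0, 0.07, 0.05),
                                    (2301.2, 1.0, 0.07, -0.05)])
```

    The mixing term redistributes intensity between the two lines while approximately conserving the integrated absorption, which is the qualitative signature of line mixing in band heads.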

  12. Updating Geospatial Data from Large Scale Data Sources

    Science.gov (United States)

    Zhao, R.; Chen, J.; Wang, D.; Shang, Y.; Wang, Z.; Li, X.; Ai, T.

    2011-08-01

    In the past decades, many geospatial databases have been established at national, regional and municipal levels over the world. Nowadays, it is widely recognized that keeping these established geospatial databases up to date is critical for their value, so more and more effort has been devoted to their continuous updating. Currently, there exist two main types of methods for geospatial database updating: directly updating with remote sensing images or field surveying materials, and indirectly updating with other updated data, such as newly updated larger-scale data. The former method is fundamental, because the update data sources of both methods ultimately derive from field surveying and remote sensing. The latter method is often more economical and faster than the former. Therefore, after the larger-scale database is updated, the smaller-scale database should be updated correspondingly in order to keep the consistency of the multi-scale geospatial database. In this situation, it is very reasonable to apply map generalization technology to the process of geospatial database updating. The latter is recognized as one of the most promising methods of geospatial database updating, especially in a collaborative updating environment in which databases at different scales are produced and maintained separately by organizations at different levels, as in China. This paper is focused on applying digital map generalization to the updating of geospatial databases from larger scales in a collaborative updating environment for SDI. The requirements for the application of map generalization to spatial database updating are analyzed first. A brief review of geospatial data updating based on digital map generalization is then given.
    Based on the requirements analysis and review, we analyze the key factors for implementing the updating of geospatial data from large scale, including technical

  13. Establishing a store baseline during interim storage of waste packages and a review of potential technologies for base-lining

    Energy Technology Data Exchange (ETDEWEB)

    McTeer, Jennifer; Morris, Jenny; Wickham, Stephen [Galson Sciences Ltd. Oakham, Rutland (United Kingdom); Bolton, Gary [National Nuclear Laboratory Risley, Warrington (United Kingdom); McKinney, James; Morris, Darrell [Nuclear Decommissioning Authority Moor Row, Cumbria (United Kingdom); Angus, Mike [National Nuclear Laboratory Risley, Warrington (United Kingdom); Cann, Gavin; Binks, Tracey [National Nuclear Laboratory Sellafield (United Kingdom)

    2013-07-01

    Interim storage is an essential component of the waste management lifecycle, providing a safe, secure environment for waste packages awaiting final disposal. In order to be able to monitor and detect change or degradation of the waste packages, storage building or equipment, it is necessary to know the original condition of these components (the 'waste storage system'). This paper presents an approach to establishing the baseline for a waste-storage system, and provides guidance on the selection and implementation of potential base-lining technologies. The approach is made up of two stages: assessment of base-lining needs, and definition of the base-lining approach. During the assessment of base-lining needs, a review of available monitoring data and store/package records should be undertaken (if the store is operational). Evolutionary processes (affecting safety functions), and their corresponding indicators, that can be measured to provide a baseline for the waste-storage system should then be identified, so that the most suitable indicators can be selected for base-lining. In defining the approach, opportunities to collect data and constraints are identified before selecting the techniques for base-lining and developing a base-lining plan. Base-lining data may be used to establish that the state of the packages is consistent with the waste acceptance criteria for the storage facility and to support the interpretation of monitoring and inspection data collected during store operations. Opportunities and constraints are identified for different store and package types. Technologies that could potentially be used to measure baseline indicators are also reviewed. (authors)

  14. Seismic source characterization for the 2014 update of the U.S. National Seismic Hazard Model

    Science.gov (United States)

    Moschetti, Morgan P.; Powers, Peter; Petersen, Mark D.; Boyd, Oliver; Chen, Rui; Field, Edward H.; Frankel, Arthur; Haller, Kathleen; Harmsen, Stephen; Mueller, Charles S.; Wheeler, Russell; Zeng, Yuehua

    2015-01-01

    We present the updated seismic source characterization (SSC) for the 2014 update of the National Seismic Hazard Model (NSHM) for the conterminous United States. Construction of the seismic source models employs the methodology that was developed for the 1996 NSHM but includes new and updated data, data types, source models, and source parameters that reflect the current state of knowledge of earthquake occurrence and state of practice for seismic hazard analyses. We review the SSC parameterization and describe the methods used to estimate earthquake rates, magnitudes, locations, and geometries for all seismic source models, with an emphasis on new source model components. We highlight the effects that two new model components—incorporation of slip rates from combined geodetic-geologic inversions and the incorporation of adaptively smoothed seismicity models—have on probabilistic ground motions, because these sources span multiple regions of the conterminous United States and provide important additional epistemic uncertainty for the 2014 NSHM.

  15. Resistant Hypertension, Time-Updated Blood Pressure Values and Renal Outcome in Type 2 Diabetes Mellitus.

    Science.gov (United States)

    Viazzi, Francesca; Piscitelli, Pamela; Ceriello, Antonio; Fioretto, Paola; Giorda, Carlo; Guida, Pietro; Russo, Giuseppina; De Cosmo, Salvatore; Pontremoli, Roberto

    2017-09-22

    Apparent treatment resistant hypertension (aTRH) is highly prevalent in patients with type 2 diabetes mellitus (T2D) and entails worse cardiovascular prognosis. The impact of aTRH and long-term achievement of recommended blood pressure (BP) values on renal outcome remains largely unknown. We assessed the role of aTRH and BP on the development of chronic kidney disease in patients with T2D and hypertension in real-life clinical practice. Clinical records from a total of 29 923 patients with T2D and hypertension, with normal baseline estimated glomerular filtration rate and regular visits during a 4-year follow-up, were retrieved and analyzed. The association between time-updated BP control (ie, 75% of visits with BP hypertension. BP control is not associated with a more-favorable renal outcome in aTRH. The relationship between time-updated BP and renal function seems to be J-shaped, with optimal systolic BP values between 120 and 140 mm Hg. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.

  16. Can baseline ultrasound results help to predict failure to achieve DAS28 remission after 1 year of tight control treatment in early RA patients?

    Science.gov (United States)

    Ten Cate, D F; Jacobs, J W G; Swen, W A A; Hazes, J M W; de Jager, M H; Basoski, N M; Haagsma, C J; Luime, J J; Gerards, A H

    2018-01-30

    At present, there are no prognostic parameters unequivocally predicting treatment failure in early rheumatoid arthritis (RA) patients. We investigated whether baseline ultrasonography (US) findings of joints, when added to baseline clinical, laboratory, and radiographical data, could improve prediction of failure to achieve Disease Activity Score assessing 28 joints (DAS28) remission (baseline. Clinical, laboratory, and radiographical parameters were recorded. Primary analysis was the prediction by logistic regression of the absence of DAS28 remission 12 months after diagnosis and start of therapy. Of 194 patients included, 174 were used for the analysis, with complete data available for 159. In a multivariate model with baseline DAS28 (odds ratio (OR) 1.6, 95% confidence interval (CI) 1.2-2.2), the presence of rheumatoid factor (OR 2.3, 95% CI 1.1-5.1), and type of monitoring strategy (OR 0.2, 95% CI 0.05-0.85), the addition of baseline US results for joints (OR 0.96, 95% CI 0.89-1.04) did not significantly improve the prediction of failure to achieve DAS28 remission (likelihood ratio test, 1.04; p = 0.31). In an early RA population, adding baseline ultrasonography of the hands, wrists, and feet to commonly available baseline characteristics did not improve prediction of failure to achieve DAS28 remission at 12 months. Clinicaltrials.gov, NCT01752309 . Registered on 19 December 2012.
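
    The core statistical step reported above (adding a candidate baseline predictor to a nested logistic model and judging it with a likelihood-ratio test) can be sketched in plain numpy. The data below are simulated, with effect sizes and predictors that are illustrative assumptions, not the trial's:

```python
import numpy as np

def fit_logit(X, y, iters=50):
    """Logistic regression by Newton-Raphson; returns (beta, log-likelihood).
    A minimal stand-in for the multivariate models in the abstract."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        W = p * (1 - p)
        H = X.T @ (X * W[:, None]) + 1e-9 * np.eye(X.shape[1])
        beta += np.linalg.solve(H, X.T @ (y - p))
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    ll = np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))
    return beta, ll

rng = np.random.default_rng(1)
n = 800
base = rng.standard_normal((n, 2))    # e.g. baseline DAS28 and RF status
noise = rng.standard_normal((n, 1))   # candidate predictor with no signal
X0 = np.hstack([np.ones((n, 1)), base])         # nested (baseline) model
X1 = np.hstack([X0, noise])                     # model with added predictor
eta = 0.5 + base @ np.array([0.8, -0.6])        # true linear predictor
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-eta))).astype(float)

_, ll0 = fit_logit(X0, y)
_, ll1 = fit_logit(X1, y)
lrt = 2 * (ll1 - ll0)   # ~ chi2(1) under H0: the added predictor is useless
```

    A small LRT statistic (here, one that sits comfortably inside the chi-squared(1) null) mirrors the paper's finding that adding baseline ultrasound did not significantly improve prediction.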

  17. Kalman filter data assimilation: targeting observations and parameter estimation.

    Science.gov (United States)

    Bellsky, Thomas; Kostelich, Eric J; Mahalov, Alex

    2014-06-01

    This paper studies the effect of targeted observations on state and parameter estimates determined with Kalman filter data assimilation (DA) techniques. We first provide an analytical result demonstrating that targeting observations within the Kalman filter for a linear model can significantly reduce state estimation error as opposed to fixed or randomly located observations. We next conduct observing system simulation experiments for a chaotic model of meteorological interest, where we demonstrate that the local ensemble transform Kalman filter (LETKF) with targeted observations based on largest ensemble variance is skillful in providing more accurate state estimates than the LETKF with randomly located observations. Additionally, we find that a hybrid ensemble Kalman filter parameter estimation method accurately updates model parameters within the targeted observation context to further improve state estimation.

  18. Kalman filter data assimilation: Targeting observations and parameter estimation

    International Nuclear Information System (INIS)

    Bellsky, Thomas; Kostelich, Eric J.; Mahalov, Alex

    2014-01-01

    This paper studies the effect of targeted observations on state and parameter estimates determined with Kalman filter data assimilation (DA) techniques. We first provide an analytical result demonstrating that targeting observations within the Kalman filter for a linear model can significantly reduce state estimation error as opposed to fixed or randomly located observations. We next conduct observing system simulation experiments for a chaotic model of meteorological interest, where we demonstrate that the local ensemble transform Kalman filter (LETKF) with targeted observations based on largest ensemble variance is skillful in providing more accurate state estimates than the LETKF with randomly located observations. Additionally, we find that a hybrid ensemble Kalman filter parameter estimation method accurately updates model parameters within the targeted observation context to further improve state estimation
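
    The parameter-updating idea common to both records above can be illustrated with an augmented-state (extended) Kalman filter on a scalar toy model: the unknown model parameter is appended to the state vector and updated by the same filter that estimates the state. This is a sketch of the general approach under invented settings, not the LETKF or the hybrid ensemble method of the paper:

```python
import numpy as np

def ekf_augmented(ys, q, r, a0):
    """Joint state/parameter estimation with an EKF on the augmented state
    z = [x, a] for the scalar model x[t+1] = a*x[t] + w, observing x + v."""
    z = np.array([0.0, a0])            # initial state and parameter guess
    P = np.eye(2)
    H = np.array([[1.0, 0.0]])         # we observe x only
    Q = np.diag([q, 1e-5])             # tiny noise on 'a' keeps it adaptable
    for y in ys:
        F = np.array([[z[1], z[0]],    # Jacobian of the map [a*x, a]
                      [0.0, 1.0]])
        z = np.array([z[1] * z[0], z[1]])       # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + r                     # innovation covariance
        K = P @ H.T / S                         # Kalman gain
        z = z + (K * (y - z[0])).ravel()        # update state AND parameter
        P = (np.eye(2) - K @ H) @ P
    return z                                    # [x_est, a_est]

rng = np.random.default_rng(2)
a_true, q, r = 0.9, 0.05, 0.01
x, ys = 1.0, []
for _ in range(2000):
    x = a_true * x + rng.normal(0.0, np.sqrt(q))
    ys.append(x + rng.normal(0.0, np.sqrt(r)))
x_est, a_est = ekf_augmented(ys, q, r, a0=0.5)
```

    Ensemble methods such as the LETKF replace the explicit Jacobian and covariance propagation with an ensemble of model runs, but the augmentation trick for parameters is the same.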

  19. SWCD: a sliding window and self-regulated learning-based background updating method for change detection in videos

    Science.gov (United States)

    Işık, Şahin; Özkan, Kemal; Günal, Serkan; Gerek, Ömer Nezih

    2018-03-01

    Change detection via background subtraction remains an unresolved issue and attracts research interest due to challenges encountered on static and dynamic scenes. The key challenge is how to update dynamically changing backgrounds from frames with an adaptive and self-regulated feedback mechanism. In order to achieve this, we present an effective change detection algorithm for pixelwise changes. A sliding window approach combined with dynamic control of update parameters is introduced for updating background frames, which we call sliding window-based change detection. Comprehensive experiments on related test videos show that the integrated algorithm yields good objective and subjective performance by overcoming illumination variations, camera jitter, and intermittent object motion. It is argued that the obtained method is a fair alternative in most types of foreground extraction scenarios, unlike case-specific methods, which normally fail outside the scenarios they were designed for.
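
    The windowed background-update idea can be illustrated with a median-based sketch: the background model at each step is the median of the last few frames, and pixels deviating beyond a threshold are flagged as changed. This is a deliberate simplification for illustration, not the SWCD algorithm, and the window size and threshold are invented:

```python
import numpy as np

def detect_changes(frames, win=10, thresh=30.0):
    """Pixelwise change masks via a sliding-window background model:
    background = median of the last `win` frames; a pixel is foreground
    when it deviates from that median by more than `thresh`."""
    frames = np.asarray(frames, dtype=float)
    masks = []
    for t in range(win, len(frames)):
        bg = np.median(frames[t - win:t], axis=0)   # windowed background
        masks.append(np.abs(frames[t] - bg) > thresh)
    return masks

# static 32x32 scene with a bright 4x4 object entering at frame 15
rng = np.random.default_rng(3)
frames = [100.0 + rng.normal(0.0, 2.0, (32, 32)) for _ in range(20)]
for t in range(15, 20):
    frames[t][8:12, 8:12] += 120.0       # the "intermittent object"
masks = detect_changes(frames)
```

    A median window resists brief intrusions better than a running mean; the self-regulated feedback in the paper goes further by adapting `win` and `thresh` per pixel.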

  20. Energy reconstruction in the long-baseline neutrino experiment.

    Science.gov (United States)

    Mosel, U; Lalakulich, O; Gallmeister, K

    2014-04-18

    The Long-Baseline Neutrino Experiment aims at measuring fundamental physical parameters to high precision and exploring physics beyond the standard model. Nuclear targets introduce complications towards that aim. We investigate the uncertainties in the energy reconstruction, based on quasielastic scattering relations, due to nuclear effects. The reconstructed event distributions as a function of energy tend to be smeared out and shifted by several 100 MeV in their oscillatory structure if standard event selection is used. We show that a more restrictive experimental event selection offers the possibility to reach the accuracy needed for a determination of the mass ordering and the CP-violating phase. Quasielastic-based energy reconstruction could thus be a viable alternative to the calorimetric reconstruction also at higher energies.

  1. Implications of the top quark mass measurement for the CKM parameters x_s and CP asymmetries

    CERN Document Server

    Ali, A

    1995-01-01

    Motivated by the recent determination of the top quark mass by the CDF collaboration, m_t = 174 ± 10 +13/-12 GeV, we review and update constraints on the parameters of the quark flavour mixing matrix V_CKM in the standard model. In performing these fits, we use inputs from the measurements of |ε|, the CP-violating parameter in K decays; x_d = Δm/Γ, the mixing parameter in B_d-B̄_d mixing; the present measurements of the matrix elements |V_cb| and |V_ub|; and the B-hadron lifetimes. The CDF value for m_t considerably reduces the CKM-parameter space previously allowed. An interesting result of our analysis is that the present data can be used to restrict the coupling constant product f_{B_d}√(B_{B_d}) to the range 110-270 MeV, in comfortable agreement with existing theoretical estimates of this quantity. We use the updated CKM matrix to predict the B_s-B̄_s mixing ratio x_s, as well as the quantities sin 2α, sin 2β and sin²γ, which characterize CP-violating...

  2. Hazard Baseline Downgrade Effluent Treatment Facility

    International Nuclear Information System (INIS)

    Blanchard, A.

    1998-01-01

    This Hazard Baseline Downgrade reviews the Effluent Treatment Facility (ETF) in accordance with Department of Energy Order 5480.23, the WSRC11Q Facility Safety Document Manual, DOE-STD-1027-92, and DOE-EM-STD-5502-94. It provides a baseline grouping based on the chemical and radiological hazards associated with the facility. The determination of the baseline grouping for ETF will aid in establishing the appropriate set of standards for the facility.

  3. Minimal-Learning-Parameter Technique Based Adaptive Neural Sliding Mode Control of MEMS Gyroscope

    Directory of Open Access Journals (Sweden)

    Bin Xu

    2017-01-01

    This paper investigates an adaptive neural sliding mode controller for MEMS gyroscopes based on the minimal-learning-parameter technique. To handle system uncertainty in the dynamics, a neural network is employed for approximation. The minimal-learning-parameter technique is used to decrease the number of update parameters, greatly reducing the computational burden. Sliding mode control is designed to cancel the effect of time-varying disturbance. Closed-loop stability is established via a Lyapunov approach. Simulation results are presented to demonstrate the effectiveness of the method.
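    The computational saving can be illustrated with a toy version of such an adaptive law: instead of updating every neural-network weight, a single scalar bound on the weight norm is adapted. The specific law, gains, and signals below are assumptions for illustration, not the paper's controller:

```python
import numpy as np

def mlp_update(theta_hat, s, phi, gamma=2.0, sigma=0.1, dt=0.001):
    """One Euler step of a scalar minimal-learning-parameter adaptive law.

    Only theta_hat (a bound on the unknown weight norm) is adapted, rather
    than the full weight vector; gamma (adaptation gain) and sigma
    (leakage term) are illustrative values.
    """
    theta_dot = gamma * (np.linalg.norm(s) * np.linalg.norm(phi) - sigma * theta_hat)
    return theta_hat + dt * theta_dot

# Drive the law with a constant sliding surface and RBF activation vector:
theta = 0.0
s = np.array([0.1, -0.05])            # sliding-mode surface (toy values)
phi = np.ones(10) / np.sqrt(10.0)     # normalized basis-function activations
for _ in range(1000):
    theta = mlp_update(theta, s, phi)
# theta rises toward its equilibrium ||s|| * ||phi|| / sigma
```

    One scalar integration per step replaces an update of the full weight matrix, which is the computational point of the technique.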

  4. Updated clinical guidelines experience major reporting limitations

    Directory of Open Access Journals (Sweden)

    Robin W.M. Vernooij

    2017-10-01

    Abstract Background The Checklist for the Reporting of Updated Guidelines (CheckUp) was recently developed. However, so far, no systematic assessment of the reporting of updated clinical guidelines (CGs) exists. We aimed to examine (1) the completeness of reporting of the updating process in CGs and (2) the inter-observer reliability of CheckUp. Methods We conducted a systematic assessment of the reporting of the updating process in a sample of updated CGs using CheckUp. We performed a systematic search to identify updated CGs published in 2015, developed by a professional society, reporting a systematic review of the evidence, and containing at least one recommendation. Three reviewers independently assessed the CGs with CheckUp (16 items). We calculated the median score per item, per domain, and overall, converting scores to a 10-point scale. Multiple linear regression analyses were used to identify differences according to country, type of organisation, scope, and health topic of updated CGs. We calculated the intraclass coefficient (ICC) and 95% confidence interval (95% CI) for domains and overall score. Results We included in total 60 updated CGs. The median domain score on a 10-point scale was 5.8 for presentation (range 1.7 to 10), 8.3 for editorial independence (range 3.3 to 10), and 5.7 for methodology (range 0 to 10). The median overall score on a 10-point scale was 6.3 (range 3.1 to 10). Presentation and justification items at recommendation level (reported by 27% and 38% of the CGs, respectively) and the methods used for the external review and for implementing changes in practice (both reported by 38% of the CGs) were particularly poorly reported. CGs developed by a European or international institution obtained a statistically significantly higher overall score than those from North American or Asian institutions (p = 0.014). Finally, the agreement among the reviewers on the overall score was excellent (ICC 0.88, 95% CI 0.75 to 0.95). Conclusions The

  5. Circular Updates

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Circular Updates are periodic sequentially numbered instructions to debriefing staff and observers informing them of changes or additions to scientific and specimen...

  6. Important update of CERN Mail Services

    CERN Multimedia

    IT Department

    2009-01-01

    The CERN Mail Services are evolving. In the course of June and July 2009, all CERN mailboxes will be updated with a new infrastructure for hosting mailboxes, running Exchange 2007. This update is taking place in order to provide the capacity upgrade for the constantly growing volume of CERN mailboxes. It is also the opportunity to provide a number of improvements to CERN mailboxes: new and improved Outlook Web Access (the web interface used to access your mailbox from a web browser, also known as "webmail"), new features in the Out-of-Office auto-reply assistant, easier spam management... The update will preserve the mailbox configuration and no specific action is required by users. During the next weeks, each mailbox will be individually notified of the upcoming update the day before it takes place. We invite all users to carefully read this notification as it will contain the latest information for this update. The mailbox will be unavailable for a short time during the ni...

  7. The Updating of Geospatial Base Data

    Science.gov (United States)

    Alrajhi, Muhamad N.; Konecny, Gottfried

    2018-04-01

    Topographic mapping issues concern area coverage at different scales and map age. The age of a map is determined by the system of updating. The United Nations Committee of Experts on Global Geospatial Information Management (UN-GGIM) has attempted to track global map coverage at various scale ranges, which has greatly improved in recent decades. However, the poor state of updating of base maps is still a global problem. In Saudi Arabia, large-scale mapping is carried out for all urban, suburban and rural areas by aerial surveys. Updating is carried out by remapping every 5 to 10 years. Due to rapid urban development this is not satisfactory, but faster update methods are foreseen through the use of high-resolution satellite imagery and improved object-oriented geodatabase structures, which will permit various survey technologies to be used to update the photogrammetrically established geodatabases. The long-term goal is to create a geodata infrastructure such as exists in Great Britain or Germany.

  8. PHENIX CDR update: An experiment to be performed at the Brookhaven National Laboratory relativistic heavy ion collider. Revision

    International Nuclear Information System (INIS)

    1994-11-01

    The PHENIX Conceptual Design Report Update (CDR Update) is intended for use together with the Conceptual Design Report (CDR). The CDR Update is a companion document to the CDR and describes the collaboration's progress since the CDR was submitted in January 1993; it therefore concentrates on changes, refinements, and decisions made over the past year. Together, these documents define the baseline PHENIX detector that the collaboration intends to build for operation at RHIC startup. In this chapter the current status of the detector and its motivation are briefly described. In Chapters 2 and 3 the detector and the physics performance are more fully developed. In Chapters 4 through 13 the details of the present design status, the technology choices, and the construction costs and schedules are presented. The physics goals of the PHENIX collaboration have remained exactly as described in the CDR. Primary among these is the detection of a new phase of matter, the quark-gluon plasma (QGP), and the measurement of its properties. The PHENIX experiment will measure many of the best potential QGP signatures to see whether any or all of these physics variables show simultaneous anomalies due to the formation of the QGP.

  9. Updating optical pseudoinverse associative memories.

    Science.gov (United States)

    Telfer, B; Casasent, D

    1989-07-01

    Selected algorithms for adding to and deleting from optical pseudoinverse associative memories are presented and compared. New realizations of pseudoinverse updating methods using vector inner product matrix bordering and reduced-dimensionality Karhunen-Loeve approximations (which have been used for updating optical filters) are described in the context of associative memories. Greville's theorem is reviewed and compared with the Widrow-Hoff algorithm. Kohonen's gradient projection method is expressed in a different form suitable for optical implementation. The data matrix memory is also discussed for comparison purposes. Memory size, speed and ease of updating, and key vector requirements are the comparison criteria used.
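    Of the updating schemes compared, the Widrow-Hoff (LMS) rule is the simplest to sketch. The snippet below is an illustrative toy with an assumed learning rate and iteration count, not the optical implementation the paper describes:

```python
import numpy as np

def widrow_hoff_store(M, x, y, eta=0.5, n_iter=200):
    """Add the association x -> y to memory matrix M by LMS iteration.

    With a normalized key vector, the recall error for this pair shrinks
    by a factor (1 - eta) on each pass.
    """
    x = x / np.linalg.norm(x)                 # normalized key vector
    for _ in range(n_iter):
        M = M + eta * np.outer(y - M @ x, x)  # correct the recall for this pair
    return M

# Store one key/recollection pair in an initially empty 2x3 memory
x = np.array([1.0, 2.0, 2.0])
y = np.array([3.0, -1.0])
M = widrow_hoff_store(np.zeros((2, 3)), x, y)
recalled = M @ (x / np.linalg.norm(x))        # recall with the stored key
```

    Greville's theorem achieves the same addition in a single exact pseudoinverse update rather than by iteration, which is part of the comparison the paper draws.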

  10. Estimation of trust metrics for MANET using QoS parameter and source routing algorithms

    CSIR Research Space (South Africa)

    Umuhoza, D

    2007-08-01

    ... parameters. Probabilities of transit time variation, of deleted, multiplied and inserted packets, and of processing delays are used to estimate and update trust. Functions which facilitate this are provided and evaluated. It has been shown that only two end nodes need...

  11. Update on hidden sectors with dark forces and dark matter

    Energy Technology Data Exchange (ETDEWEB)

    Andreas, Sarah

    2012-11-15

    Recently there has been much interest in hidden sectors, especially in the context of dark matter and ''dark forces'', since they are a common feature of beyond standard model scenarios like string theory and SUSY and additionally exhibit interesting phenomenological aspects. Various laboratory experiments place limits on the so-called hidden photon and continuously further probe and constrain the parameter space; an updated overview is presented here. Furthermore, for several hidden sector models with light dark matter we study the viability with respect to the relic abundance and direct detection experiments.

  12. Updated precision measurement of the average lifetime of B hadrons

    CERN Document Server

    Abreu, P; Adye, T; Agasi, E; Ajinenko, I; Aleksan, Roy; Alekseev, G D; Alemany, R; Allport, P P; Almehed, S; Amaldi, Ugo; Amato, S; Andreazza, A; Andrieux, M L; Antilogus, P; Apel, W D; Arnoud, Y; Åsman, B; Augustin, J E; Augustinus, A; Baillon, Paul; Bambade, P; Barate, R; Barbi, M S; Barbiellini, Guido; Bardin, Dimitri Yuri; Baroncelli, A; Bärring, O; Barrio, J A; Bartl, Walter; Bates, M J; Battaglia, Marco; Baubillier, M; Baudot, J; Becks, K H; Begalli, M; Beillière, P; Belokopytov, Yu A; Benvenuti, Alberto C; Berggren, M; Bertrand, D; Bianchi, F; Bigi, M; Bilenky, S M; Billoir, P; Bloch, D; Blume, M; Blyth, S; Bolognese, T; Bonesini, M; Bonivento, W; Booth, P S L; Borisov, G; Bosio, C; Bosworth, S; Botner, O; Boudinov, E; Bouquet, B; Bourdarios, C; Bowcock, T J V; Bozzo, M; Branchini, P; Brand, K D; Brenke, T; Brenner, R A; Bricman, C; Brillault, L; Brown, R C A; Brückman, P; Brunet, J M; Bugge, L; Buran, T; Burgsmüller, T; Buschmann, P; Buys, A; Cabrera, S; Caccia, M; Calvi, M; Camacho-Rozas, A J; Camporesi, T; Canale, V; Canepa, M; Cankocak, K; Cao, F; Carena, F; Carroll, L; Caso, Carlo; Castillo-Gimenez, M V; Cattai, A; Cavallo, F R; Cerrito, L; Chabaud, V; Charpentier, P; Chaussard, L; Chauveau, J; Checchia, P; Chelkov, G A; Chen, M; Chierici, R; Chliapnikov, P V; Chochula, P; Chorowicz, V; Chudoba, J; Cindro, V; Collins, P; Contreras, J L; Contri, R; Cortina, E; Cosme, G; Cossutti, F; Crawley, H B; Crennell, D J; Crosetti, G; Cuevas-Maestro, J; Czellar, S; Dahl-Jensen, Erik; Dahm, J; D'Almagne, B; Dam, M; Damgaard, G; Dauncey, P D; Davenport, Martyn; Da Silva, W; Defoix, C; Deghorain, A; Della Ricca, G; Delpierre, P A; Demaria, N; De Angelis, A; de Boer, Wim; De Brabandere, S; De Clercq, C; La Vaissière, C de; De Lotto, B; De Min, A; De Paula, L S; De Saint-Jean, C; Dijkstra, H; Di Ciaccio, Lucia; Djama, F; Dolbeau, J; Dönszelmann, M; Doroba, K; Dracos, M; Drees, J; Drees, K A; Dris, M; Dufour, Y; Edsall, D M; Ehret, R; Eigen, G; Ekelöf, T J C; 
Ekspong, Gösta; Elsing, M; Engel, J P; Ershaidat, N; Erzen, B; Espirito-Santo, M C; Falk, E; Fassouliotis, D; Feindt, Michael; Fenyuk, A; Ferrer, A; Filippas-Tassos, A; Firestone, A; Fischer, P A; Föth, H; Fokitis, E; Fontanelli, F; Formenti, F; Franek, B J; Frenkiel, P; Fries, D E C; Frodesen, A G; Frühwirth, R; Fulda-Quenzer, F; Fuster, J A; Galloni, A; Gamba, D; Gandelman, M; García, C; García, J; Gaspar, C; Gasparini, U; Gavillet, P; Gazis, E N; Gelé, D; Gerber, J P; Gibbs, M; Gokieli, R; Golob, B; Gopal, Gian P; Gorn, L; Górski, M; Guz, Yu; Gracco, Valerio; Graziani, E; Grosdidier, G; Grzelak, K; Gumenyuk, S A; Gunnarsson, P; Günther, M; Guy, J; Hahn, F; Hahn, S; Hajduk, Z; Hallgren, A; Hamacher, K; Hao, W; Harris, F J; Hedberg, V; Henriques, R P; Hernández, J J; Herquet, P; Herr, H; Hessing, T L; Higón, E; Hilke, Hans Jürgen; Hill, T S; Holmgren, S O; Holt, P J; Holthuizen, D J; Hoorelbeke, S; Houlden, M A; Hrubec, Josef; Huet, K; Hultqvist, K; Jackson, J N; Jacobsson, R; Jalocha, P; Janik, R; Jarlskog, C; Jarlskog, G; Jarry, P; Jean-Marie, B; Johansson, E K; Jönsson, L B; Jönsson, P E; Joram, Christian; Juillot, P; Kaiser, M; Kapusta, F; Karafasoulis, K; Karlsson, M; Karvelas, E; Katsanevas, S; Katsoufis, E C; Keränen, R; Khokhlov, Yu A; Khomenko, B A; Khovanskii, N N; King, B J; Kjaer, N J; Klein, H; Klovning, A; Kluit, P M; Köne, B; Kokkinias, P; Koratzinos, M; Korcyl, K; Kourkoumelis, C; Kuznetsov, O; Kramer, P H; Krammer, Manfred; Kreuter, C; Kronkvist, I J; Krumshtein, Z; Krupinski, W; Kubinec, P; Kucewicz, W; Kurvinen, K L; Lacasta, C; Laktineh, I; Lamblot, S; Lamsa, J; Lanceri, L; Lane, D W; Langefeld, P; Last, I; Laugier, J P; Lauhakangas, R; Leder, Gerhard; Ledroit, F; Lefébure, V; Legan, C K; Leitner, R; Lemoigne, Y; Lemonne, J; Lenzen, Georg; Lepeltier, V; Lesiak, T; Liko, D; Lindner, R; Lipniacka, A; Lippi, I; Lörstad, B; Loken, J G; López, J M; Loukas, D; Lutz, P; Lyons, L; MacNaughton, J N; Maehlum, G; Maio, A; Malychev, V; Mandl, F; Marco, J; 
Marco, R P; Maréchal, B; Margoni, M; Marin, J C; Mariotti, C; Markou, A; Maron, T; Martínez-Rivero, C; Martínez-Vidal, F; Martí i García, S; Masik, J; Matorras, F; Matteuzzi, C; Matthiae, Giorgio; Mazzucato, M; McCubbin, M L; McKay, R; McNulty, R; Medbo, J; Merk, M; Meroni, C; Meyer, S; Meyer, W T; Michelotto, M; Migliore, E; Mirabito, L; Mitaroff, Winfried A; Mjörnmark, U; Moa, T; Møller, R; Mönig, K; Monge, M R; Morettini, P; Müller, H; Mundim, L M; Murray, W J; Muryn, B; Myatt, Gerald; Naraghi, F; Navarria, Francesco Luigi; Navas, S; Nawrocki, K; Negri, P; Neumann, W; Nicolaidou, R; Nielsen, B S; Nieuwenhuizen, M; Nikolaenko, V; Niss, P; Nomerotski, A; Normand, Ainsley; Novák, M; Oberschulte-Beckmann, W; Obraztsov, V F; Olshevskii, A G; Onofre, A; Orava, Risto; Österberg, K; Ouraou, A; Paganini, P; Paganoni, M; Pagès, P; Palka, H; Papadopoulou, T D; Papageorgiou, K; Pape, L; Parkes, C; Parodi, F; Passeri, A; Pegoraro, M; Peralta, L; Pernegger, H; Pernicka, Manfred; Perrotta, A; Petridou, C; Petrolini, A; Petrovykh, M; Phillips, H T; Piana, G; Pierre, F; Pimenta, M; Pindo, M; Plaszczynski, S; Podobrin, O; Pol, M E; Polok, G; Poropat, P; Pozdnyakov, V; Prest, M; Privitera, P; Pukhaeva, N; Pullia, Antonio; Radojicic, D; Ragazzi, S; Rahmani, H; Ratoff, P N; Read, A L; Reale, M; Rebecchi, P; Redaelli, N G; Regler, Meinhard; Reid, D; Renton, P B; Resvanis, L K; Richard, F; Richardson, J; Rídky, J; Rinaudo, G; Ripp, I; Romero, A; Roncagliolo, I; Ronchese, P; Ronjin, V M; Roos, L; Rosenberg, E I; Rosso, E; Roudeau, Patrick; Rovelli, T; Rückstuhl, W; Ruhlmann-Kleider, V; Ruiz, A; Rybicki, K; Saarikko, H; Sacquin, Yu; Sadovskii, A; Sajot, G; Salt, J; Sánchez, J; Sannino, M; Schimmelpfennig, M; Schneider, H; Schwickerath, U; Schyns, M A E; Sciolla, G; Scuri, F; Seager, P; Sedykh, Yu; Segar, A M; Seitz, A; Sekulin, R L; Shellard, R C; Siccama, I; Siegrist, P; Simonetti, S; Simonetto, F; Sissakian, A N; Sitár, B; Skaali, T B; Smadja, G; Smirnov, N; Smirnova, O G; Smith, G R; 
Solovyanov, O; Sosnowski, R; Souza-Santos, D; Spassoff, Tz; Spiriti, E; Sponholz, P; Squarcia, S; Stanescu, C; Stapnes, Steinar; Stavitski, I; Stichelbaut, F; Stocchi, A; Strauss, J; Strub, R; Stugu, B; Szczekowski, M; Szeptycka, M; Tabarelli de Fatis, T; Tavernet, J P; Chikilev, O G; Tilquin, A; Timmermans, J; Tkatchev, L G; Todorov, T; Toet, D Z; Tomaradze, A G; Tomé, B; Tonazzo, A; Tortora, L; Tranströmer, G; Treille, D; Trischuk, W; Tristram, G; Trombini, A; Troncon, C; Tsirou, A L; Turluer, M L; Tyapkin, I A; Tyndel, M; Tzamarias, S; Überschär, B; Ullaland, O; Uvarov, V; Valenti, G; Vallazza, E; Van der Velde, C; van Apeldoorn, G W; van Dam, P; Van Doninck, W K; Van Eldik, J; Vassilopoulos, N; Vegni, G; Ventura, L; Venus, W A; Verbeure, F; Verlato, M; Vertogradov, L S; Vilanova, D; Vincent, P; Vitale, L; Vlasov, E; Vodopyanov, A S; Vrba, V; Wahlen, H; Walck, C; Weierstall, M; Weilhammer, Peter; Weiser, C; Wetherell, Alan M; Wicke, D; Wickens, J H; Wielers, M; Wilkinson, G R; Williams, W S C; Winter, M; Witek, M; Woschnagg, K; Yip, K; Yushchenko, O P; Zach, F; Zaitsev, A; Zalewska-Bak, A; Zalewski, Piotr; Zavrtanik, D; Zevgolatakos, E; Zimin, N I; Zito, M; Zontar, D; Zuberi, R; Zucchelli, G C; Zumerle, G; Belokopytov, Yu; Charpentier, Ph; Gavillet, Ph; Gouz, Yu; Jarlskog, Ch; Khokhlov, Yu; Papadopoulou, Th D

    1996-01-01

    The measurement of the average lifetime of B hadrons using inclusively reconstructed secondary vertices has been updated using both an improved processing of previous data and additional statistics from new data. This has reduced the statistical and systematic uncertainties and gives τ_B = 1.582 ± 0.011 (stat.) ± 0.027 (syst.) ps. Combining this result with the previous result based on charged particle impact parameter distributions yields τ_B = 1.575 ± 0.010 (stat.) ± 0.026 (syst.) ps.

  13. The prognostic value of baseline 18F-FDG PET/CT in steroid-naive large-vessel vasculitis: introduction of volume-based parameters

    International Nuclear Information System (INIS)

    Dellavedova, L.; Carletto, M.; Maffioli, L.S.; Faggioli, P.; Sciascera, A.; Mazzone, A.; Del Sole, A.

    2016-01-01

    The aim of this study was to analyse whether the result of a baseline 18F-fluorodeoxyglucose (FDG) positron emission tomography (PET)/CT scan in large-vessel vasculitis (LVV) patients is able to predict the course of the disease, not only in terms of presence/absence of final complications but also in terms of favourable/complicated progress (response to steroid therapy, time to steroid suspension, relapses, etc.). A total of 46 consecutive patients, who underwent 18F-FDG PET/CT between May 2010 and March 2013 for fever of unknown origin (FUO) or suspected vasculitis (before starting corticosteroid therapy), were enrolled. The diagnosis of LVV was confirmed in 17 patients. Considering follow-up results, positive LVV patients were divided into two groups, one characterized by favourable (nine) and the other by complicated (eight) progress, on the basis of presence/absence of vascular complications, presence/absence of at least one other positive PET/CT during follow-up, and inability to comply with the tapering schedule of the steroid due to biochemical/symptomatic relapse. Vessel uptake in subjects of the two groups was compared in terms of intensity and extension. To evaluate the extent of active disease, we introduced two volume-based parameters: "volume of increased uptake" (VIU) and "total lesion glycolysis" (TLG). The threshold used to calculate VIU on vessel walls was obtained from the "vessel to liver" ratio by means of receiver-operating characteristic analysis and was set at 0.92 x liver maximum standardized uptake value in each patient. Measures of tracer uptake intensity were significantly higher in patients with complicated progress compared to those with a favourable one (p < 0.05). Measures of disease extension were even more significant, and TLG emerged as the best parameter to separate the two groups of patients (p = 0.01). This pilot study shows that, in LVV patients, the combined
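    The two volume-based parameters follow directly from their definitions; the sketch below uses the paper's 0.92 x liver SUVmax threshold, while the voxel volume and SUV values are assumed for illustration:

```python
import numpy as np

def viu_tlg(suv, liver_suv_max, voxel_volume_ml=0.1):
    """Volume of increased uptake (VIU) and total lesion glycolysis (TLG).

    Voxels count as active above the threshold 0.92 * liver SUVmax;
    the voxel volume (ml) is an assumed illustrative value.
    """
    mask = suv > 0.92 * liver_suv_max
    viu = mask.sum() * voxel_volume_ml       # ml of above-threshold vessel wall
    tlg = suv[mask].sum() * voxel_volume_ml  # uptake-weighted volume
    return viu, tlg

# Toy 5x5x5 SUV volume: 10 hot voxels (SUV 4.0) on a cold background (SUV 1.0)
suv = np.full((5, 5, 5), 1.0)
suv.flat[:10] = 4.0
viu, tlg = viu_tlg(suv, liver_suv_max=3.0)   # threshold = 2.76
```

    TLG weights each above-threshold voxel by its uptake, which is why it captures both extent and intensity of active disease.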

  14. DGEMP-OE (2008) Energy Baseline Scenario. Synthesis report; Scenario energetique de reference DGEMP-OE(2008). Rapport de synthese

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2008-07-01

    A 'Business as usual' or 'Baseline' scenario of energy trends to 2020-2030 is produced by France every four years, as requested by the International Energy Agency in order to update the global scenarios published in its World Energy Outlook. Since the most recent scenario of this type was drawn up in 2003-2004, the time has come to renew the effort for the IEA's next in-depth review of French energy policy. Specifically, the DGEMP seeks to predict the future of France's energy situation assuming that no policies or new measures affecting the situation (whether improving or deteriorating it) are taken other than those already in place or adopted as of 1 January 2008, in other words, before measures such as those stemming from the Grenelle Environment Forum. On the other hand, it is assumed that change in the energy system is guided by 'conventional wisdom', according to which political options and the behaviour of economic units are expected to be 'reasonable'. As a result, even should its projections prove inappropriate, this cannot be considered a 'worst-case' scenario. Indeed, beyond the IEA, this scenario can be used to establish an MEA (Multilateral Environment Agreement) scenario (based on existing measures) for national communications submitted under the U.N. Climate Convention. The scenarios by the 'Energy' Commission of the Centre d'Analyse Strategique (CAS) could have been used, particularly since the consultant who worked with the CAS to develop its scenarios was also commissioned by the DGEMP. However, several considerations argued in favour of proceeding separately: the CAS scenarios drew on the DGEMP's 2004 baseline scenario, even though certain parameters (in particular energy prices) were updated; moreover, the concept underpinning the DGEMP baseline scenario is that it should to every extent possible remain constant over time to secure continued consensus on this

  15. Design parameters and source terms: Volume 3, Source terms

    International Nuclear Information System (INIS)

    1987-10-01

    The Design Parameters and Source Terms Document was prepared in accordance with DOE request and to provide data for the environmental impact study to be performed in the future for the Deaf Smith County, Texas site for a nuclear waste repository in salt. This document updates a previous unpublished report by Stearns Catalytic Corporation (SCC), entitled ''Design Parameters and Source Terms for a Two-Phase Repository in Salt,'' 1985, to the level of the Site Characterization Plan - Conceptual Design Report. The previous unpublished SCC Study identifies the data needs for the Environmental Assessment effort for seven possible Salt Repository sites. 11 refs., 9 tabs

  16. Scheme for Generation highly monochromatic X-Rays from a baseline XFEL undulator

    International Nuclear Information System (INIS)

    Geloni, Gianluca; Kocharyan, Vitali; Saldin, Evgeni

    2010-03-01

    One goal of XFEL facilities is the production of narrow-bandwidth X-ray radiation. The self-seeding scheme was proposed to obtain a bandwidth narrower than that achievable with conventional X-ray SASE FELs. A self-seeded FEL is composed of two undulators separated by a monochromator and an electron beam bypass that must compensate for the path delay of X-rays in the monochromator. This leads to a long bypass, with a length of the order of 40-60 m, which requires modifications of the baseline undulator configuration. As an attempt to get around this obstacle, together with a study of the self-seeding scheme for the European XFEL, here we propose a novel technique based on a pulse doubler concept. Using a crystal monochromator installed within a short magnetic chicane in the baseline undulator, it is possible to decrease the bandwidth of the radiation well beyond the XFEL design, down to 10^-5. The magnetic chicane can be installed without any perturbation of the XFEL focusing structure and does not interfere with the baseline mode of operation. We present a feasibility study and make exemplifications with the parameters of the SASE2 line of the European XFEL. (orig.)

  17. An Updated Nuclear Equation of State for Neutron Stars and Supernova Simulations

    Science.gov (United States)

    Meixner, M. A.; Mathews, G. J.; Dalhed, H. E.; Lan, N. Q.

    2011-10-01

    We present an updated and improved equation of state (EoS) based upon the framework originally developed by Bowers & Wilson. The details of the EoS and its improvements are described, along with a description of how to access this EoS for numerical simulations. Among the improvements are an updated compressibility based upon recent measurements, the possibility of the formation of proton-excess (Ye > 0.5) material, and an improved treatment of nuclear statistical equilibrium and of the transition to pasta nuclei as the density approaches nuclear matter density. The possibility of a QCD chiral phase transition is also included at densities above nuclear matter density. We show comparisons of this EoS with the other two publicly available equations of state used in supernova collapse simulations. The advantage of the present EoS is that it is easily amenable to phenomenological parameterization to fit observed explosion properties and to accommodate new physical parameters.

  18. Gaussian process inference for estimating pharmacokinetic parameters of dynamic contrast-enhanced MR images.

    Science.gov (United States)

    Wang, Shijun; Liu, Peter; Turkbey, Baris; Choyke, Peter; Pinto, Peter; Summers, Ronald M

    2012-01-01

    In this paper, we propose a new pharmacokinetic model for parameter estimation of dynamic contrast-enhanced (DCE) MRI by using Gaussian process inference. Our model is based on the Tofts dual-compartment model for the description of tracer kinetics and the observed time series from DCE-MRI is treated as a Gaussian stochastic process. The parameter estimation is done through a maximum likelihood approach and we propose a variant of the coordinate descent method to solve this likelihood maximization problem. The new model was shown to outperform a baseline method on simulated data. Parametric maps generated on prostate DCE data with the new model also provided better enhancement of tumors, lower intensity on false positives, and better boundary delineation when compared with the baseline method. New statistical parameter maps from the process model were also found to be informative, particularly when paired with the PK parameter maps.
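    The Tofts forward model on which such fits are built can be sketched as a discrete convolution; the arterial input function and parameter values below are toy assumptions, and the Gaussian-process likelihood and coordinate-descent fit (the paper's contribution) are not reproduced here:

```python
import numpy as np

def tofts_ct(t, ktrans, kep, cp):
    """Standard Tofts tissue concentration,
        Ct(t) = Ktrans * integral_0^t Cp(u) * exp(-kep * (t - u)) du,
    evaluated by discrete convolution on a uniform time grid.
    """
    dt = t[1] - t[0]
    kernel = np.exp(-kep * t)                         # impulse response
    return ktrans * np.convolve(cp, kernel)[: len(t)] * dt

t = np.linspace(0.0, 5.0, 501)        # minutes
cp = np.exp(-0.5 * t)                 # toy arterial input function
ct = tofts_ct(t, ktrans=0.25, kep=1.0, cp=cp)
# For this cp, analytically Ct = 0.5*(exp(-0.5 t) - exp(-t)), peaking at 0.125
```

    Parameter estimation then amounts to adjusting Ktrans and kep so this forward curve matches the observed DCE-MRI time series.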

  19. 10 CFR 850.20 - Baseline beryllium inventory.

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Baseline beryllium inventory. 850.20 Section 850.20 Energy... Baseline beryllium inventory. (a) The responsible employer must develop a baseline inventory of the... inventory, the responsible employer must: (1) Review current and historical records; (2) Interview workers...

  20. Improving precipitation simulation from updated surface characteristics in South America

    Science.gov (United States)

    Pereira, Gabriel; Silva, Maria Elisa Siqueira; Moraes, Elisabete Caria; Chiquetto, Júlio Barboza; da Silva Cardozo, Francielle

    2017-07-01

    Land use and land cover maps and their physical-chemical and biological properties are important variables in the numerical modeling of Earth systems. In this context, the main objective of this study is to analyze the improvements resulting from updating the land use and land cover map in numerical simulations performed with the Regional Climate Model system version 4 (RegCM4), as well as the seasonal variations of physical parameters used by the Biosphere Atmosphere Transfer Scheme (BATS). In general, the update to the South America 2007 land use and land cover map used by BATS improved the simulation of precipitation by 10 %, increasing the mean temporal correlation coefficient against observed data from 0.84 to 0.92 (statistically significant). Important differences appear in the South Atlantic convergence zone (SACZ) positioning, presenting a spatial pattern of alternating areas with higher and lower precipitation rates. These differences occur due to the replacement of tropical rainforest by pasture and agriculture and the replacement of agricultural areas by pasture, scrubland, and deciduous forest.

  1. Integrated planning: A baseline development perspective

    International Nuclear Information System (INIS)

    Clauss, L.; Chang, D.

    1994-01-01

    The FEMP Baseline establishes the basis for integrating environmental activity technical requirements with their cost and schedule elements. The result is a path forward to successfully achieving the FERMCO mission. Specific to cost management, the FEMP Baseline has been incorporated into the FERMCO Project Control System (PCS) to provide a time-phased budget plan against which contractor performance is measured with an earned value management system. The result is the Performance Measurement Baseline (PMB), an important tool for keeping costs under control.
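    Earned value measurement against a performance measurement baseline rests on a few standard indices; the helper below sketches the generic formulas (it is not the FERMCO PCS implementation, and the figures in the usage line are made up):

```python
def earned_value_metrics(bcws, bcwp, acwp):
    """Standard earned-value indices measured against a baseline:
    BCWS = budgeted cost of work scheduled (planned value),
    BCWP = budgeted cost of work performed (earned value),
    ACWP = actual cost of work performed."""
    return {
        "cost_variance": bcwp - acwp,      # > 0: under budget
        "schedule_variance": bcwp - bcws,  # > 0: ahead of schedule
        "cpi": bcwp / acwp,                # cost performance index
        "spi": bcwp / bcws,                # schedule performance index
    }

# A task planned at 100 units, with 90 units earned for 95 units spent:
m = earned_value_metrics(bcws=100.0, bcwp=90.0, acwp=95.0)
```

    A CPI or SPI below 1 flags cost overrun or schedule slip against the time-phased budget plan.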

  2. Plasma parameters for alternate operating modes of TIBER-II

    International Nuclear Information System (INIS)

    Fenstermacher, M.E.; Devoto, R.S.; Logan, B.G.; Perkins, L.J.

    1987-01-01

    Parameters for operating points of TIBER-II, different from the baseline steady-state operation, are presented. These results have been generated with the MUMAK tokamak power balance code. Pulsed ignited and high performance steady-state operating points are described. 20 refs

  3. A State Space Model for Spatial Updating of Remembered Visual Targets during Eye Movements.

    Science.gov (United States)

    Mohsenzadeh, Yalda; Dash, Suryadeep; Crawford, J Douglas

    2016-01-01

    In the oculomotor system, spatial updating is the ability to aim a saccade toward a remembered visual target position despite intervening eye movements. Although this has been the subject of extensive experimental investigation, there is still no unifying theoretical framework to explain the neural mechanism for this phenomenon, and how it influences visual signals in the brain. Here, we propose a unified state-space model (SSM) to account for the dynamics of spatial updating during two types of eye movement: saccades and smooth pursuit. Our proposed model is a non-linear SSM implemented through a recurrent radial-basis-function neural network in a dual Extended Kalman filter (EKF) structure. The model parameters and internal states (remembered target position) are estimated sequentially using the EKF method. The proposed model replicates two fundamental experimental observations: continuous gaze-centered updating of visual memory-related activity during smooth pursuit, and predictive remapping of visual memory activity before and during saccades. Moreover, our model makes the new prediction that, when uncertainty of input signals is incorporated in the model, neural population activity and receptive fields expand just before and during saccades. These results suggest that visual remapping and motor updating are part of a common visuomotor mechanism, and that subjective perceptual constancy arises in part from training the visual system on motor tasks.
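
    The gaze-centred updating idea can be caricatured in one dimension: the remembered target location shifts opposite to each eye displacement, and an occasional visual re-measurement corrects the memory. The sketch below is our minimal Kalman-style illustration, not the paper's recurrent radial-basis-function dual-EKF model; all numbers and the helper `kalman_update` are hypothetical.

```python
import numpy as np

# Minimal 1-D sketch of gaze-centred spatial updating with a Kalman-style
# update (our illustration, NOT the paper's recurrent RBF dual-EKF model).
# During pursuit the remembered target shifts opposite to the eye movement;
# an optional visual re-measurement z corrects the memory.

def kalman_update(x, P, dx_eye, q=0.01, z=None, r=0.05):
    """One predict (+ optional correct) step for a gaze-centred memory."""
    x_pred = x - dx_eye          # memory shifts opposite to eye displacement
    P_pred = P + q               # process noise inflates uncertainty
    if z is None:
        return x_pred, P_pred
    K = P_pred / (P_pred + r)    # Kalman gain
    return x_pred + K * (z - x_pred), (1 - K) * P_pred

# target initially 10 deg right of gaze; eye pursues rightward at 1 deg/step
x, P = 10.0, 0.1
for _ in range(5):
    x, P = kalman_update(x, P, dx_eye=1.0)
print(round(x, 2))  # gaze-centred memory has shifted to 5.0 deg
```

    Incorporating the re-measurement `z` shrinks the variance `P`, which loosely mirrors the paper's point that input uncertainty shapes the memory representation.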

  4. Physics with a very long neutrino factory baseline

    International Nuclear Information System (INIS)

    Gandhi, Raj; Winter, Walter

    2007-01-01

    We discuss the neutrino oscillation physics of a very long neutrino factory baseline over a broad range of lengths (between 6000 km and 9000 km), centered on the 'magic baseline' (∼7500 km) where correlations with the leptonic CP phase are suppressed by matter effects. Since the magic baseline depends only on the density, we study the impact of matter density profile effects and density uncertainties over this range, and the impact of detector locations off the optimal baseline. We find that the optimal constant density describing the physics over this entire baseline range is about 5% higher than the average matter density. This implies that the magic baseline is significantly shorter than previously inferred. However, while a single detector optimization requires fine-tuning of the (very long) baseline length, its combination with a near detector at a shorter baseline is much less sensitive to the far detector location and to uncertainties in the matter density. In addition, we point out different applications of this baseline which go beyond its excellent correlation and degeneracy resolution potential. We demonstrate that such a long baseline assists in the improvement of the θ₁₃ precision and in the resolution of the octant degeneracy. Moreover, we show that the neutrino data from such a baseline could be used to extract the matter density along the profile up to 0.24% at 1σ for large sin²2θ₁₃, providing a useful discriminator between different geophysical models.

  5. The NuMAX Long Baseline Neutrino Factory Concept

    Energy Technology Data Exchange (ETDEWEB)

    Delahaye, J-P. [SLAC; Ankenbrandt, C. [MUONS Inc., Batavia; Bogacz, A. [Jefferson Lab; Huber, P. [Virginia Tech.; Kirk, H. [Brookhaven; Neuffer, D. [Fermilab; Palmer, M. A. [Fermilab; Ryne, R. [LBL, Berkeley; Snopok, P. [IIT, Chicago

    2018-03-19

    A Neutrino Factory where neutrinos of all species are produced in equal quantities by muon decay is described as a facility at the intensity frontier for exquisite precision providing ideal conditions for ultimate neutrino studies and the ideal complement to Long Baseline Facilities like LBNF at Fermilab. It is foreseen to be built in stages with progressively increasing complexity and performance, taking advantage of existing or proposed facilities at an existing laboratory like Fermilab. A tentative layout based on a recirculating linac providing opportunities for considerable saving is discussed as well as its possible evolution toward a muon collider if and when requested by Physics. Tentative parameters of the various stages are presented as well as the necessary R&D to address the technological issues and demonstrate their feasibility.

  6. Enhanced 3D PET OSEM reconstruction using inter-update Metz filtering

    International Nuclear Information System (INIS)

    Jacobson, M.; Levkovitz, R.; Ben-Tal, A.; Thielemans, K.; Spinks, T.; Belluzzo, D.; Pagani, E.; Bettinardi, V.; Gilardi, M.C.; Zverovich, A.; Mitra, G.

    2000-01-01

    We present an enhancement of the OSEM (ordered-subsets expectation maximization) algorithm for 3D PET reconstruction, which we call the inter-update Metz filtered OSEM (IMF-OSEM). The IMF-OSEM algorithm incorporates filtering action into the image updating process in order to improve the quality of the reconstruction. With this technique, the multiplicative correction image - ordinarily used to update image estimates in plain OSEM - is applied to a Metz-filtered version of the image estimate at certain intervals. In addition, we present a software implementation that employs several high-speed features to accelerate reconstruction. These features include, firstly, forward and back projection functions which make full use of symmetry as well as a fast incremental computation technique. Secondly, the software has the capability of running in parallel mode on several processors. The parallelization approach employed yields a significant speed-up, which is nearly independent of the amount of data. Together, these features lead to reasonable reconstruction times even when using large image arrays and non-axially compressed projection data. The performance of IMF-OSEM was tested on phantom data acquired on the GE Advance scanner. Our results demonstrate that an appropriate choice of Metz filter parameters can improve the contrast-noise balance of certain regions of interest relative to both plain and post-filtered OSEM, and to the GE commercial reprojection algorithm software. (author)
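
    The inter-update filtering idea can be sketched on a toy 1-D problem: run ordered-subsets EM updates, and at fixed intervals apply a smoothing filter to the current estimate before the next multiplicative correction. This is a simplification under stated assumptions: a random nonnegative matrix stands in for the projector, the data are noiseless, and a 3-tap kernel stands in for the Metz filter (which is properly defined in the frequency domain).

```python
import numpy as np

# Toy 1-D sketch of inter-update filtered OSEM. Assumptions: a random
# nonnegative matrix stands in for the projector, the data are noiseless,
# and a 3-tap smoothing kernel stands in for the Metz filter.
rng = np.random.default_rng(0)
A = rng.uniform(0.1, 1.0, size=(16, 8))       # system matrix: 16 bins, 8 pixels
x_true = np.array([0.5, 1.0, 4.0, 8.0, 8.0, 4.0, 1.0, 0.5])
y = A @ x_true                                 # noiseless "sinogram"

def smooth(x):
    """Stand-in for the Metz filter: normalized 3-tap smoothing."""
    return np.convolve(x, [0.25, 0.5, 0.25], mode="same")

def imf_osem(y, A, n_iter=30, n_subsets=4, filter_every=10):
    x = np.ones(A.shape[1])
    subsets = np.array_split(np.arange(A.shape[0]), n_subsets)
    for it in range(n_iter):
        if it > 0 and it % filter_every == 0:
            x = smooth(x)                      # inter-update filtering step
        for rows in subsets:                   # ordered-subsets EM sweep
            As, ys = A[rows], y[rows]
            x = x * (As.T @ (ys / (As @ x))) / As.sum(axis=0)
    return x

x_hat = imf_osem(y, A)
rel_err = np.linalg.norm(A @ x_hat - y) / np.linalg.norm(y)
print(rel_err < 0.15)
```

    The multiplicative update keeps the estimate nonnegative, and filtering between updates (rather than only after convergence) is what distinguishes IMF-OSEM from plain post-filtered OSEM.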

  7. Mentor update and support: what do mentors need from an update?

    Science.gov (United States)

    Phillips, Mari; Marshall, Joyce

    2015-04-01

    Mentorship is the topic of the 14th series of 'Midwifery basics', targeted at practising midwives. The aim of these articles is to provide information to raise awareness of the impact of the work of midwives on women's experience, and to encourage midwives to seek further information through a series of activities relating to the topic. In this seventh article, Mari Phillips and Joyce Marshall consider some of the key issues related to mentor updates and support, and what mentors need from their annual update.

  8. Email Updates

    Science.gov (United States)

    MedlinePlus Email Updates (https://medlineplus.gov/listserv.html): subscribe to receive email updates, view your email history, or unsubscribe; includes guidance on preventing MedlinePlus emails from being marked as "spam" or "junk".

  9. Valence-Dependent Belief Updating: Computational Validation

    Directory of Open Access Journals (Sweden)

    Bojana Kuzmanovic

    2017-06-01

    People tend to update beliefs about their future outcomes in a valence-dependent way: they are likely to incorporate good news and to neglect bad news. However, belief formation is a complex process which depends not only on motivational factors such as the desire for favorable conclusions, but also on multiple cognitive variables such as prior beliefs, knowledge about personal vulnerabilities and resources, and the size of the probabilities and estimation errors. Thus, we applied computational modeling in order to test for valence-induced biases in updating while formally controlling for relevant cognitive factors. We compared biased and unbiased Bayesian models of belief updating, and specified alternative models based on reinforcement learning. The experiment consisted of 80 trials with 80 different adverse future life events. In each trial, participants estimated the base rate of one of these events and estimated their own risk of experiencing the event before and after being confronted with the actual base rate. Belief updates corresponded to the difference between the two self-risk estimates. Valence-dependent updating was assessed by comparing trials with good news (better-than-expected base rates) with trials with bad news (worse-than-expected base rates). After receiving bad relative to good news, participants' updates were smaller and deviated more strongly from rational Bayesian predictions, indicating a valence-induced bias. Model comparison revealed that the biased (i.e., optimistic) Bayesian model of belief updating better accounted for the data than the unbiased (i.e., rational) Bayesian model, confirming that the valence of the new information influenced the amount of updating. Moreover, alternative computational modeling based on reinforcement learning demonstrated higher learning rates for good than for bad news, as well as a moderating role of personal knowledge. Finally, in this specific experimental context, the approach based on
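
    The asymmetric-learning-rate account can be illustrated in a few lines. This is a sketch with made-up numbers and rates, not the authors' fitted model: the self-risk estimate moves toward the presented base rate, with a larger learning rate when the news is good.

```python
# Minimal sketch of valence-dependent belief updating (made-up numbers and
# learning rates, not the authors' fitted model): the self-risk estimate
# moves toward the presented base rate, with a larger learning rate for
# good news (base rate lower than expected) than for bad news.

def update_risk(prior_risk, base_rate, lr_good=0.6, lr_bad=0.3):
    """Return the updated self-risk estimate after seeing the base rate."""
    estimation_error = base_rate - prior_risk
    lr = lr_good if estimation_error < 0 else lr_bad  # good news: risk lower
    return prior_risk + lr * estimation_error

# good news: prior risk 40%, base rate 20% -> large update toward 20
print(round(update_risk(40.0, 20.0), 1))  # 28.0
# bad news: prior risk 40%, base rate 60% -> smaller update toward 60
print(round(update_risk(40.0, 60.0), 1))  # 46.0
```

    The asymmetry |28 − 40| > |46 − 40| is the signature the study reports: equal-sized estimation errors produce larger updates when the news is favorable.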

  10. Nondestructive Evaluation of Railway Bridge by System Identification Using Field Vibration Measurement

    International Nuclear Information System (INIS)

    Ho, Duc Duy; Hong, Dong Soo; Kim, Jeong Tae

    2010-01-01

    This paper presents a nondestructive evaluation approach for system identification (SID) of real railway bridges using field vibration test results. First, a multi-phase SID scheme designed on the basis of the eigenvalue sensitivity concept is presented. Next, the proposed multi-phase approach is evaluated from field vibration tests on a real railway bridge (Wondongcheon bridge) located in Yangsan, Korea. On the steel girder bridge, a few natural frequencies and mode shapes are experimentally measured under the ambient vibration condition. The corresponding modal parameters are numerically calculated from a three-dimensional finite element (FE) model established for the target bridge. Eigenvalue sensitivities are analyzed for potential model-updating parameters of the FE model. Then, structural subsystems are identified phase-by-phase using the proposed model-updating procedure. Based on the model-updating results, a baseline model is established and a nondestructive evaluation of the test bridge is performed.
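
    Sensitivity-based updating rests on the classical relation dλᵢ/dkⱼ = φᵢᵀ (∂K/∂kⱼ) φᵢ for mass-normalized modes. The toy 2-DOF spring-mass sketch below (our construction, not the Wondongcheon bridge model) shows how "measured" eigenvalues drive Newton-style corrections of stiffness parameters.

```python
import numpy as np

# Toy 2-DOF sketch of sensitivity-based model updating (our construction,
# not the bridge model from the paper). With unit masses, the relation
# d(lambda_i)/d(k_j) = phi_i^T (dK/dk_j) phi_i for mass-normalized modes
# drives Newton-style corrections of the stiffness parameters.

def K_of(k):
    k1, k2 = k
    return np.array([[k1 + k2, -k2], [-k2, k2]])

dK = [np.array([[1.0, 0.0], [0.0, 0.0]]),      # dK/dk1
      np.array([[1.0, -1.0], [-1.0, 1.0]])]    # dK/dk2

def update_stiffness(k0, lam_meas, n_iter=10):
    k = np.array(k0, float)
    for _ in range(n_iter):
        lam, phi = np.linalg.eigh(K_of(k))     # modes of the current model
        S = np.array([[phi[:, i] @ dK[j] @ phi[:, i] for j in range(2)]
                      for i in range(2)])       # eigenvalue sensitivity matrix
        k = k + np.linalg.solve(S, lam_meas - lam)  # Newton step
    return k

k_true = np.array([2.0, 1.5])
lam_meas = np.linalg.eigvalsh(K_of(k_true))    # "measured" eigenvalues
k_est = update_stiffness([1.0, 1.0], lam_meas)
# caveat: k = [3.0, 1.0] has the same spectrum, so frequencies alone do not
# identify the stiffness uniquely; measured mode shapes resolve this
print(np.allclose(np.linalg.eigvalsh(K_of(k_est)), lam_meas))
```

    The non-uniqueness caveat is one reason the paper's scheme uses mode shapes as well as natural frequencies.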

  11. On-line Bayesian model updating for structural health monitoring

    Science.gov (United States)

    Rocchetta, Roberto; Broggi, Matteo; Huchet, Quentin; Patelli, Edoardo

    2018-03-01

    Fatigue-induced cracking is a dangerous failure mechanism that affects mechanical components subject to alternating load cycles. System health monitoring should be adopted to identify cracks which can jeopardise the structure. Real-time damage detection may fail to identify cracks due to different sources of uncertainty which have been poorly assessed or even fully neglected. In this paper, a novel efficient and robust procedure is used for the detection of crack locations and lengths in mechanical components. A Bayesian model updating framework is employed, which allows accounting for relevant sources of uncertainty. The idea underpinning the approach is to identify the most probable crack consistent with the experimental measurements. To tackle the computational cost of the Bayesian approach, an emulator is adopted to replace the computationally costly finite element model. To improve the overall robustness of the procedure, different numerical likelihoods, measurement noises and imprecision in the values of model parameters are analysed and their effects quantified. The accuracy of the stochastic updating and the efficiency of the numerical procedure are discussed. An experimental aluminium frame and a numerical model of a typical car suspension arm are used to demonstrate the applicability of the approach.
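
    At its core the approach computes a posterior over crack parameters from measured responses. The grid-based caricature below uses assumed forms throughout (a linear `surrogate_frequency` stand-in emulator and Gaussian measurement noise) to illustrate the "most probable crack" idea.

```python
import numpy as np

# Grid-based caricature of Bayesian crack identification (hypothetical
# setup): `surrogate_frequency` is an assumed stand-in emulator mapping
# crack length to a natural frequency, and a noisy measurement updates a
# uniform prior over candidate crack lengths.

def surrogate_frequency(crack_len):
    """Assumed emulator: frequency drops as the crack grows."""
    return 120.0 * (1.0 - 0.4 * crack_len)     # Hz, crack_len in [0, 1]

def posterior(measured_f, sigma=2.0, n_grid=201):
    lengths = np.linspace(0.0, 1.0, n_grid)
    prior = np.full(n_grid, 1.0 / n_grid)      # uniform prior
    resid = measured_f - surrogate_frequency(lengths)
    likelihood = np.exp(-0.5 * (resid / sigma) ** 2)   # Gaussian noise model
    post = prior * likelihood
    return lengths, post / post.sum()

lengths, post = posterior(measured_f=96.0)     # 96 Hz corresponds to 0.5
print(round(lengths[np.argmax(post)], 2))      # MAP crack length: 0.5
```

    The paper replaces the grid with an emulator plus sampling and treats multiple uncertainty sources, but the posterior-maximisation logic is the same.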

  12. 49 CFR 360.5 - Updating user fees.

    Science.gov (United States)

    2010-10-01

    ... updating the cost components comprising the fee. Cost components shall be updated as follows: (1) Direct... determined by the cost study in Regulations Governing Fees For Service, 1 I.C.C. 2d 60 (1984), or subsequent... by total office costs for the office directly associated with user fee activity. Actual updating of...

  13. The choice of leasing companies for automobile fleet updating on the basis of hierarchies analysis method

    OpenAIRE

    Dorohov, А.

    2007-01-01

    The basic criteria for the transport enterprises' choice of leasing companies for automobile fleet updating, such as terms of financing, size of the advance payment, assortment, and time of existence in the market, have been determined. The determination of the best leasing company according to these parameters on the basis of the hierarchies analysis method has been offered.
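
    The hierarchies analysis method (the analytic hierarchy process) ranks alternatives by the principal eigenvector of a pairwise-comparison matrix. A sketch with hypothetical Saaty-scale judgments for three leasing companies on a single criterion:

```python
import numpy as np

# Sketch of the analytic hierarchy process ("hierarchies analysis method")
# for ranking three leasing companies on one criterion. The pairwise
# judgments below are hypothetical Saaty-scale values.

# A[i, j]: how strongly company i is preferred over company j
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

def ahp_priorities(A, n_iter=50):
    """Principal eigenvector of A by power iteration = priority weights."""
    w = np.full(A.shape[0], 1.0 / A.shape[0])
    for _ in range(n_iter):
        w = A @ w
        w = w / w.sum()
    return w

w = ahp_priorities(A)
print(int(w.argmax()))  # company 0 dominates the pairwise judgments: 0
```

    A full AHP application would repeat this per criterion and combine the criterion-level weights; a consistency-ratio check on `A` is also customary.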

  14. Polarized Redundant-Baseline Calibration for 21 cm Cosmology Without Adding Spectral Structure

    Science.gov (United States)

    Dillon, Joshua S.; Kohn, Saul A.; Parsons, Aaron R.; Aguirre, James E.; Ali, Zaki S.; Bernardi, Gianni; Kern, Nicholas S.; Li, Wenyang; Liu, Adrian; Nunhokee, Chuneeta D.; Pober, Jonathan C.

    2018-04-01

    21 cm cosmology is a promising new probe of the evolution of visible matter in our universe, especially during the poorly-constrained Cosmic Dawn and Epoch of Reionization. However, in order to separate the 21 cm signal from bright astrophysical foregrounds, we need an exquisite understanding of our telescopes so as to avoid adding spectral structure to spectrally-smooth foregrounds. One powerful calibration method relies on repeated simultaneous measurements of the same interferometric baseline to solve for the sky signal and for instrumental parameters simultaneously. However, certain degrees of freedom are not constrained by asserting internal consistency between redundant measurements. In this paper, we review the origin of these degeneracies of redundant-baseline calibration and demonstrate how they can source unwanted spectral structure in our measurement and show how to eliminate that additional, artificial structure. We also generalize redundant calibration to dual-polarization instruments, derive the degeneracy structure, and explore the unique challenges to calibration and preserving spectral smoothness presented by a polarized measurement.

  15. Wind Farm Decentralized Dynamic Modeling With Parameters

    DEFF Research Database (Denmark)

    Soltani, Mohsen; Shakeri, Sayyed Mojtaba; Grunnet, Jacob Deleuran

    2010-01-01

    Development of dynamic wind flow models for wind farms is part of the research in the European FP7 project AEOLUS. The objective of this report is to provide decentralized dynamic wind flow models with parameters. The report presents a structure for decentralized flow models with inputs from local models. The results of this report are especially useful, but not limited, to the design of a decentralized wind farm controller, since in centralized controller design one can also use the model and update it in a central computing node.

  16. Second-generation speed limit map updating applications

    DEFF Research Database (Denmark)

    Tradisauskas, Nerius; Agerholm, Niels; Juhl, Jens

    2011-01-01

    Intelligent Speed Adaptation is an Intelligent Transport System developed to significantly improve road safety by helping car drivers maintain appropriate driving behaviour. The system works in connection with the speed limits on the road network. It is thus essential to keep the speed limit map used in the Intelligent Speed Adaptation scheme updated. The traditional method of updating speed limit maps on the basis of long time interval observations needed to be replaced by a more efficient speed limit updating tool, and in a Danish Intelligent Speed Adaptation trial a web-based tool was therefore developed. It was concluded that maps for map updating should preferably be made on the basis of a commercial map provider, such as Google Maps, and that the real challenge is to oblige road authorities to carry out updates.

  17. National baselines for the Sustainable Development Goals assessed in the SDG Index and Dashboards

    Science.gov (United States)

    Schmidt-Traub, Guido; Kroll, Christian; Teksoz, Katerina; Durand-Delacre, David; Sachs, Jeffrey D.

    2017-08-01

    The Sustainable Development Goals (SDGs) -- agreed in 2015 by all 193 member states of the United Nations and complemented by commitments made in the Paris Agreement -- map out a broad spectrum of economic, social and environmental objectives to be achieved by 2030. Reaching these goals will require deep transformations in every country, as well as major efforts in monitoring and measuring progress. Here we introduce the SDG Index and Dashboards as analytical tools for assessing countries' baselines for the SDGs that can be applied by researchers in the cross-disciplinary analyses required for implementation. The Index and Dashboards synthesize available country-level data for all 17 goals, and for each country estimate the size of the gap towards achieving the SDGs. They will be updated annually. All 149 countries for which sufficient data is available face significant challenges in achieving the goals, and many countries' development strategies are imbalanced across the economic, social and environmental priorities. We illustrate the analytical value of the index by examining its relationship with other widely used development indices and by showing how it accounts for cross-national differences in subjective well-being. Given significant data gaps, scope and coverage of the Index and Dashboards are limited, but we suggest that these analyses represent a starting point for a comprehensive assessment of national SDG baselines and can help policymakers determine priorities for early action and monitor progress. The tools also identify data gaps that must be closed for SDG monitoring.
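
    The gap-to-target idea behind such an index can be sketched as min-max rescaling of each indicator between the worst observed value and the target, followed by simple averaging. The indicators, bounds, and equal weights below are invented for illustration and are not the published SDG Index methodology in full detail.

```python
# Sketch of a gap-to-target index (our simplification with invented
# indicators and bounds, not the published SDG Index methodology in full).

def rescale(value, worst, target):
    """Map an indicator onto 0 (worst observed) .. 100 (target achieved)."""
    score = 100.0 * (value - worst) / (target - worst)
    return max(0.0, min(100.0, score))

# hypothetical country, two goals with two indicators each; note that for
# goal 2's first indicator lower is better (target below the worst value)
goal1 = [rescale(55.0, worst=20.0, target=90.0), rescale(0.8, 0.0, 1.0)]
goal2 = [rescale(12.0, worst=40.0, target=5.0), rescale(70.0, 30.0, 100.0)]
goal_scores = [sum(g) / len(g) for g in (goal1, goal2)]
index = sum(goal_scores) / len(goal_scores)
print(round(index, 1))  # overall index for this hypothetical country: 66.8
```

    Averaging within goals before averaging across goals keeps each goal's weight equal regardless of how many indicators it has, which is one way such dashboards avoid over-weighting data-rich goals.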

  18. Evaluation of selected parameters on exposure rates in Westinghouse designed nuclear power plants

    International Nuclear Information System (INIS)

    Bergmann, C.A.

    1989-01-01

    During the past ten years, Westinghouse, under EPRI contract and independently, has performed research and evaluation of plant data to define the trends of ex-core component exposure rates and the effects of various parameters on those exposure rates. The effects of the parameters were evaluated using comparative analyses or empirical techniques. This paper updates the information presented at the Fourth Bournemouth Conference and the conclusions obtained on the effects of selected parameters, namely coolant chemistry, physical changes, use of enriched boric acid, and cobalt input, on plant exposure rates. The trends of exposure rates and their relationship to doses are also presented. (author)

  19. Updating Environmental Media Concentration Limits and Uncertainty factors in the ERICA Tool

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J.E.; Hosseini, A. [Norwegian Radiation Protection Authority, P.O. Box 55, N-1332 Oesteraas (Norway); Alfonso, B.; Avila, R. [Facilia AB, S-167 51 Bromma (Sweden); Beresford, N.A. [Centre for Ecology and Hydrology, CEH-Lancaster, Lancaster Environment Centre, Library Avenue, Bailrigg, Lancaster LA 1 4AP (United Kingdom); Copplestone, D. [Dept. Biological and Environmental Sciences, University of Stirling, Stirling, FK9 4LA (United Kingdom)

    2014-07-01

    Tiered approaches have become a standard means of structuring information in the process of conducting environmental risk assessments. For cases involving the assessment of impacts on wildlife from ionising radiation, the ERICA integrated approach and its supporting software (the ERICA Tool) provides such a structure, splitting the system into two generic screening tiers and a third site-specific tier. The first Tier is very simple, based around Environmental Media Concentration Limits (EMCLs), and requires minimal input from the assessor. The second Tier, although still a screening tier, calculates dose rates and requires more detailed input from the assessor, allowing for scrutiny and editing of default parameters in the process. A key element of Tier 2 involves the application of Uncertainty Factors (UFs). Such factors reflect our knowledge concerning probability distribution functions and provide a way of incorporating conservatism into the assessment by considering high percentile values in underlying parameters. Following its launch in 2007, there have been significant developments regarding certain components of the ERICA integrated approach. Most notably, an extended international collation of concentration ratio data has precipitated the need to update parameter values in the Tool's databases. In addition, more considered guidance has been developed with regard to filling knowledge gaps in the absence of transfer data. Furthermore, the efficacy of the methods used in assigning probability distribution functions has been questioned, leading to an acknowledgement from the developers that the methods were not described in enough detail nor were the justifications for applying the selected approach provided in a convincing way. This has implications for the EMCL values, which are derived probabilistically using parameters including concentration ratios. Furthermore, there are implications for UF derivation that relies upon a robust consideration of underlying

  20. Chaos anti-synchronization of two non-identical chaotic systems with known or fully unknown parameters

    International Nuclear Information System (INIS)

    Al-Sawalha, Ayman

    2009-01-01

    This work is devoted to investigating the anti-synchronization between two different novel chaotic systems. Two different anti-synchronization methods are proposed. Active control is applied when system parameters are known, and adaptive control is employed when system parameters are uncertain or unknown. Controllers and parameter update laws are designed based on Lyapunov stability theory. In both cases, sufficient conditions for the anti-synchronization are obtained analytically. Finally, numerical simulations are presented to show the effectiveness of the proposed chaos anti-synchronization schemes.
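
    For the known-parameter case, the active-control recipe can be sketched with the Lorenz system (our choice of example; the paper uses two different novel chaotic systems): choosing u = −f(x) − f(y) − k·e makes the anti-synchronization error e = y + x obey e′ = −k·e, which decays to zero.

```python
import numpy as np

# Sketch of active-control anti-synchronization for the known-parameter
# case, using the Lorenz system as a stand-in for the paper's two chaotic
# systems. With u = -f(x) - f(y) - k*e the error e = y + x obeys e' = -k*e,
# so the slave converges to y = -x (anti-synchronization).

def lorenz(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def simulate(k=5.0, dt=1e-3, steps=8000):
    x = np.array([1.0, 1.0, 1.0])    # master state
    y = np.array([5.0, -2.0, 3.0])   # slave state
    for _ in range(steps):
        fx, fy = lorenz(x), lorenz(y)
        e = y + x                     # anti-synchronization error
        u = -fx - fy - k * e          # active controller (parameters known)
        x = x + dt * fx               # explicit Euler step, master
        y = y + dt * (fy + u)         # explicit Euler step, slave
    return x, y

x, y = simulate()
print(np.linalg.norm(x + y) < 1e-2)  # True: slave anti-synchronized, y = -x
```

    The adaptive-control variant would replace the known parameters in `u` with estimates driven by Lyapunov-based update laws; that machinery is omitted here.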

  1. Large-baseline InSAR for precise topographic mapping: a framework for TanDEM-X large-baseline data

    Directory of Open Access Journals (Sweden)

    M. Pinheiro

    2017-09-01

    The global Digital Elevation Model (DEM) resulting from the TanDEM-X mission provides information about the world topography with outstanding precision. In fact, performance analyses carried out with the already available data have shown that the global product is well within the requirements of 10 m absolute vertical accuracy and 2 m relative vertical accuracy for flat to moderate terrain. The mission's science phase took place from October 2014 to December 2015. During this phase, bistatic acquisitions with across-track separation between the two satellites up to 3.6 km at the equator were commanded. Since the relative vertical accuracy of InSAR-derived elevation models is, in principle, inversely proportional to the system baseline, the TanDEM-X science phase opened the doors for the generation of elevation models with improved quality with respect to the standard product. However, the interferometric processing of the large-baseline data is troublesome due to the increased volume decorrelation and very high frequency of the phase variations. Hence, in order to fully profit from the increased baseline, sophisticated algorithms for the interferometric processing, and, in particular, for the phase unwrapping have to be considered. This paper proposes a novel dual-baseline region-growing framework for the phase unwrapping of the large-baseline interferograms. Results from two experiments with data from the TanDEM-X science phase are discussed, corroborating the expected increased level of detail of the large-baseline DEMs.

  2. Update of CERN exchange network

    CERN Multimedia

    2003-01-01

    An update of the CERN exchange network will be done next April. Disturbances or even interruptions of telephony services may occur from 4th to 24th April during evenings from 18:30 to 00:00 but will not exceed more than 4 consecutive hours (see tentative planning below). In addition, the voice messaging system will be shut down on 26th March from 18:00 to 00:00. Calls supposed to be routed to the voice messaging system will not be possible during the shutdown. CERN divisions are invited to avoid any change requests (set-ups, moves or removals) of telephones and fax machines from 4th to 25th April. Everything will be done to minimize potential inconveniences which may occur during this update. There will be no loss of telephone functionalities. CERN GSM portable phones won't be affected by this change. Should you need more details, please send us your questions by email to Standard.Telephone@cern.ch. Date / Change type / Affected areas: March 26 / Update of the voice messaging system / All CERN sites; April 4 / Updat...

  3. Breast Cancer and Estrogen-Alone Update

    Science.gov (United States)

    Breast Cancer and Estrogen-Alone Update (NIH, Summer 2006): estrogen-alone hormone therapy does not increase the risk of breast cancer in postmenopausal women, according to an updated analysis ...

  4. Characterizing baseline shift with a 4th-order polynomial function for a portable biomedical near-infrared spectroscopy device

    Science.gov (United States)

    Zhao, Ke; Ji, Yaoyao; Pan, Boan; Li, Ting

    2018-02-01

    Continuous-wave near-infrared spectroscopy (NIRS) devices have been highlighted for their clinical and health-care applications in noninvasive hemodynamic measurements. The baseline shift of the deviation measurement attracts much attention for its clinical importance. Nonetheless, currently published methods have low reliability or high variability. In this study, we found a well-performing polynomial fitting function for baseline removal using NIRS. Unlike previous studies on baseline correction for near-infrared spectroscopy evaluation of non-hemodynamic particles, we focused on baseline fitting and the corresponding correction method for NIRS, and found that the 4th-order polynomial fitting function performs better than the 2nd-order function reported in previous research. Through experimental tests of hemodynamic parameters of a solid phantom, we compared the fitting quality of the 4th-order and 2nd-order polynomials by recording and analyzing the R values and the SSE (sum of squares due to error) values. The R values of the 4th-order polynomial fits are all higher than 0.99, significantly higher than the corresponding 2nd-order values, while the SSE values of the 4th order are significantly smaller than those of the 2nd order. Using this highly reliable, low-variability 4th-order polynomial fitting function, we are able to remove the baseline online to obtain more accurate NIRS measurements.
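
    The 2nd- versus 4th-order comparison can be reproduced on synthetic data. The drift shape, oscillation, and noise level below are our assumptions, not the solid-phantom measurements from the study: fit a polynomial to the raw signal, subtract it, and compare the SSE of the two fits.

```python
import numpy as np

# Reproduce the 2nd- vs 4th-order comparison on synthetic data. The drift
# shape, oscillation, and noise level are our assumptions, not the
# solid-phantom measurements from the study.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 60.0, 600)                        # 60 s of samples
drift = 0.5 + 0.02 * t - 1e-3 * t**2 + 2e-5 * t**3 - 1e-7 * t**4
hemo = 0.05 * np.sin(2 * np.pi * 0.1 * t)              # hemodynamic signal
signal = hemo + drift + rng.normal(0.0, 0.005, t.size)

def remove_baseline(t, sig, order):
    coeffs = np.polyfit(t, sig, order)                 # least-squares fit
    baseline = np.polyval(coeffs, t)
    sse = float(np.sum((sig - baseline) ** 2))         # sum of squared errors
    return sig - baseline, sse

corrected2, sse2 = remove_baseline(t, signal, order=2)
corrected4, sse4 = remove_baseline(t, signal, order=4)
print(sse4 <= sse2)  # True: a nested higher-order fit never increases SSE
```

    Because the 2nd-order model is nested inside the 4th-order one, the least-squares SSE can only decrease with order; the study's point is that the decrease at order 4 is large enough to matter for online baseline removal.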

  5. The chemistry of Magela Creek. A baseline for assessing change downstream of Ranger. Supervising Scientist report 151

    International Nuclear Information System (INIS)

    Klessa, D.A.

    2000-01-01

    The compositions of waters in Magela Creek upstream and downstream of Ranger uranium mine were reviewed. The water quality parameters examined were pH, electrical conductivity (EC) and turbidity, and dissolved calcium, magnesium, sodium, potassium, chloride, sulphate, ammonium, nitrate, copper, lead, manganese, zinc, uranium and radium-226. The frequency distributions of each of these parameters in waters upstream of the mine were characterised and statistically described to provide a baseline against which a change in water chemistry downstream of the mine can be assessed. With the exception of pH, EC, turbidity, magnesium, calcium, sodium and manganese, the data that comprise the baseline are not normally distributed. The frequency distributions of copper, lead, zinc, uranium and radium-226 forming the baseline are characterised by a large proportion of values at or near analytical detection limits and contamination in a relatively large proportion of the remainder. A comparison of upstream and downstream data shows that there is good conformity in pH, EC, turbidity, sodium, potassium and chloride. For calcium, nitrate, ammonium, lead, uranium, radium and zinc, less than 40% of the downstream data fall outside the 20th and 80th baseline percentiles, but in the case of uranium the data are biased towards relatively high values. More than 40% of downstream magnesium and sulphate data are outside these percentile boundaries and are skewed towards relatively high concentrations. Copper, lead and zinc in mine waters (characterised by the composition of waters contained in the former RP4) do not appear to pose a risk as contaminants, based upon the results of toxicity testing and water quality guideline trigger levels, with risk minimised for dilutions greater than 1 in 20.

  6. Working Memory Updating as a Predictor of Academic Attainment

    Science.gov (United States)

    Lechuga, M. Teresa; Pelegrina, Santiago; Pelaez, Jose L.; Martin-Puga, M. Eva; Justicia, M. Jose

    2016-01-01

    There is growing evidence supporting the importance of executive functions, and specifically working memory updating (WMU), for children's academic achievement. This study aimed to assess the specific contribution of updating to the prediction of academic performance. Two updating tasks, which included different updating components, were…

  7. Standard Model updates and new physics analysis with the Unitarity Triangle fit

    International Nuclear Information System (INIS)

    Bevan, A.; Bona, M.; Ciuchini, M.; Derkach, D.; Franco, E.; Silvestrini, L.; Lubicz, V.; Tarantino, C.; Martinelli, G.; Parodi, F.; Schiavi, C.; Pierini, M.; Sordini, V.; Stocchi, A.; Vagnoni, V.

    2013-01-01

    We present the summer 2012 update of the Unitarity Triangle (UT) analysis performed by the UTfit Collaboration within the Standard Model (SM) and beyond. The increased accuracy on several of the fundamental constraints is now enhancing some of the tensions amongst and within the constraints themselves. In particular, the long-standing tension between exclusive and inclusive determinations of the V_ub and V_cb CKM matrix elements is now playing a major role. We then present the generalisation of the UT analysis to investigate new physics (NP) effects, updating the constraints on NP contributions to ΔF=2 processes. In the NP analysis, both CKM and NP parameters are fitted simultaneously to obtain the possible NP effects in any specific sector. Finally, based on the NP constraints, we derive upper bounds on the coefficients of the most general ΔF=2 effective Hamiltonian. These upper bounds can be translated into lower bounds on the scale of NP that contributes to these low-energy effective interactions.

  8. THE US LONG BASELINE NEUTRINO EXPERIMENT STUDY.

    Energy Technology Data Exchange (ETDEWEB)

    BISHAI,M.

    2007-08-06

    The US Long Baseline Neutrino Experiment Study was commissioned jointly by Brookhaven National Laboratory (BNL) and Fermi National Accelerator Laboratory (FNAL) to investigate the potential for future U.S. based long baseline neutrino oscillation experiments using MW class conventional neutrino beams that can be produced at FNAL. The experimental baselines are based on two possible detector locations: (1) off-axis to the existing FNAL NuMI beamline at baselines of 700 to 810 km and (2) NSF's proposed future Deep Underground Science and Engineering Laboratory (DUSEL) at baselines greater than 1000 km. Two detector technologies are considered: a megaton class Water Cherenkov detector deployed deep underground at a DUSEL site, or a 100 kT Liquid Argon Time-Projection Chamber (TPC) deployed on the surface at any of the proposed sites. The physics sensitivities of the proposed experiments are summarized. We find that conventional horn focused wide-band neutrino beam options from FNAL aimed at a massive detector with a baseline of > 1000 km have the best sensitivity to CP violation and the neutrino mass hierarchy for values of the mixing angle θ₁₃ down to 2°.

  9. Esophageal intraluminal baseline impedance is associated with severity of acid reflux and epithelial structural abnormalities in patients with gastroesophageal reflux disease.

    Science.gov (United States)

    Zhong, Chanjuan; Duan, Liping; Wang, Kun; Xu, Zhijie; Ge, Ying; Yang, Changqing; Han, Yajing

    2013-05-01

    The esophageal intraluminal baseline impedance may be used to evaluate the status of mucosal integrity. Esophageal acid exposure decreases the baseline impedance. We aimed to compare baseline impedance in patients with various reflux events and with different acid-related parameters, and to investigate the relationships between epithelial histopathologic abnormalities and baseline impedance. A total of 229 GERD patients and 34 controls underwent 24-h multichannel intraluminal impedance and pH monitoring (MII-pH monitoring) and gastroendoscopy, and completed a GERD questionnaire (GerdQ). We quantified epithelial intercellular spaces (ICSs) and expression of tight junction (TJ) proteins by histologic techniques. Mean baseline values in reflux esophagitis (RE) (1752 ± 1018 Ω) and non-erosive reflux disease (NERD) (2640 ± 1143 Ω) were significantly lower than in controls (3360 ± 1258 Ω). Baseline values in the acid reflux group (2510 ± 1239 Ω) and the mixed acid/weakly acidic reflux group (2393 ± 1009 Ω) were also much lower than in controls (p = 0.020). Baseline impedance was negatively correlated with acid exposure time (AET) (r = -0.41). Patients with more acid reflux events and with longer AET have low baseline impedance. Baseline values are correlated with esophageal mucosal histopathologic changes such as dilated ICS and TJ alteration.

  10. A COCAP program for the statistical analysis of common cause failure parameters

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Baehyeuk; Jae, Moosung [Hanyang Univ., Seoul (Korea, Republic of). Dept. of Nuclear Engineering

    2016-03-15

    Probabilistic Safety Assessment (PSA) based applications and regulations are becoming more important in the field of nuclear energy. According to the results of PSAs in Korea, common cause failure is evaluated as one of the significant contributors to core damage frequency (CDF), since it defeats the redundancy of NPPs. The purpose of this study is to develop a COCAP (Common Cause Failure parameter Analysis for PSA) program for the accurate use of alpha factor model parameter data provided by other countries and for obtaining indigenous CCF data for NPPs in Korea through Bayesian updating.
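The Bayesian updating step mentioned in this record is conventionally done by treating the alpha-factor vector as Dirichlet-distributed and updating it with observed event multiplicities. A minimal sketch of that conjugate update, with invented prior weights and event counts (not data from the COCAP program):

```python
# Dirichlet-multinomial update: n_k = number of CCF events in which exactly
# k of the m redundant components failed. Prior weights and counts are invented.
def update_alpha_factors(prior_weights, event_counts):
    """Posterior mean of each alpha_k under a Dirichlet conjugate prior."""
    total = sum(prior_weights) + sum(event_counts)
    return [(w + n) / total for w, n in zip(prior_weights, event_counts)]

prior = [0.5, 0.5, 0.5, 0.5]      # Jeffreys-like prior, m = 4 redundant trains
counts = [120, 6, 2, 1]           # hypothetical events by failure multiplicity
alphas = update_alpha_factors(prior, counts)   # posterior alpha-factors, sum to 1
```

Generic (e.g. foreign-plant) data can enter through the prior weights, while the counts carry the plant-specific evidence, which is the usual way indigenous CCF parameters are obtained.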

  11. Very Long Baseline Interferometry: Dependencies on Frequency Stability

    Science.gov (United States)

    Nothnagel, Axel; Nilsson, Tobias; Schuh, Harald

    2018-04-01

    Very Long Baseline Interferometry (VLBI) is a differential technique observing radiation of compact extra-galactic radio sources with pairs of radio telescopes. For these observations, the frequency standards at the telescopes need to have very high stability. In this article we discuss why this is, and we investigate exactly how precise the frequency standards need to be. Four areas where good clock performance is needed are considered: coherence, geodetic parameter estimation, correlator synchronization, and UT1 determination. We show that in order to ensure the highest accuracy of VLBI, stability similar to that of a hydrogen maser is needed for time-scales up to a few hours. We consider both traditional VLBI, where extra-galactic radio sources are observed, and the observation of artificial radio sources emitted by satellites or spacecraft.

  12. AN UPDATED ULTRAVIOLET CATALOG OF GALEX NEARBY GALAXIES

    Energy Technology Data Exchange (ETDEWEB)

    Bai, Yu; Zou, Hu; Liu, JiFeng; Wang, Song, E-mail: ybai@nao.cas.cn, E-mail: zouhu@nao.cas.cn, E-mail: jfliu@nao.cas.cn, E-mail: songw@nao.cas.cn [Key Laboratory of Optical Astronomy, National Astronomical Observatories, Chinese Academy of Sciences, 20A Datun Road, Chaoyang District, 100012 Beijing (China)

    2015-09-15

    The ultraviolet (UV) catalog of nearby galaxies compiled by Gil de Paz et al. presents the integrated photometry and surface brightness profiles for 1034 nearby galaxies observed by GALEX. We provide an updated catalog of 4138 nearby galaxies based on the latest General Release (GR6/GR7) of GALEX. These galaxies are selected from HyperLeda with apparent diameters larger than 1′. From the surface brightness profiles accurately measured using the deep NUV and FUV images, we have calculated the asymptotic magnitudes, aperture (D25) magnitudes, colors, structural parameters (effective radii and concentration indices), luminosities, and effective surface brightness for these galaxies. Archival optical and infrared photometry from HyperLeda, 2MASS, and IRAS are also integrated into the catalog. Our parameter measurements and some analyses are consistent with those of Gil de Paz et al. The (FUV − K) color provides a good criterion to distinguish between early- and late-type galaxies, which can be improved further using the concentration indices. The IRX–β relation is reformulated with our UV-selected nearby galaxies.

  13. Design parameters and source terms: Volume 2, Source terms: Revision 0

    International Nuclear Information System (INIS)

    1987-10-01

    The Design Parameters and Source Terms Document was prepared in accordance with a DOE request to provide data for the environmental impact study to be performed in the future for the Deaf Smith County, Texas site for a nuclear waste repository in salt. This document updates a previous unpublished report by Stearns Catalytic Corporation (SCC), entitled ''Design Parameters and Source Terms for a Two-Phase Repository Salt,'' 1985, to the level of the Site Characterization Plan - Conceptual Design Report. The previous unpublished SCC study identifies the data needs for the Environmental Assessment effort for seven possible salt repository sites.

  14. An automated baseline correction protocol for infrared spectra of atmospheric aerosols collected on polytetrafluoroethylene (Teflon) filters

    Science.gov (United States)

    Kuzmiakova, Adele; Dillner, Ann M.; Takahama, Satoshi

    2016-06-01

    A growing body of research on statistical applications for characterization of atmospheric aerosol Fourier transform infrared (FT-IR) samples collected on polytetrafluoroethylene (PTFE) filters (e.g., Russell et al., 2011; Ruthenburg et al., 2014) and a rising interest in analyzing FT-IR samples collected by air quality monitoring networks call for an automated PTFE baseline correction solution. The existing polynomial technique (Takahama et al., 2013) is not scalable to a project with a large number of aerosol samples because it contains many parameters and requires expert intervention. The question therefore remains of how to develop an automated method for baseline correcting hundreds to thousands of ambient aerosol spectra, given the variability in both environmental mixture composition and PTFE baselines. This study approaches the question by detailing a statistical protocol which allows for the precise definition of analyte and background subregions, applies nonparametric smoothing splines to reproduce sample-specific PTFE variations, and integrates performance metrics from atmospheric aerosol and blank samples alike in the smoothing parameter selection. Referencing 794 atmospheric aerosol samples from seven Interagency Monitoring of PROtected Visual Environments (IMPROVE) sites collected during 2011, we start by identifying key FT-IR signal characteristics, such as non-negative absorbance or analyte segment transformation, to capture sample-specific transitions between background and analyte. While referring to qualitative properties of the PTFE background, the goal of smoothing-spline interpolation is to learn the baseline structure in the background region in order to predict the baseline structure in the analyte region. We then validate the model by comparing smoothing-spline baseline-corrected spectra with uncorrected and polynomial baseline (PB)-corrected equivalents via three statistical applications: (1) clustering analysis, (2) functional group quantification
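The fit-on-background, predict-in-analyte idea described in this abstract can be illustrated with a smoothing spline on a synthetic spectrum; the wavenumber windows, smoothing factor, and band shape below are invented for illustration and are not the protocol's actual settings:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

wn = np.linspace(4000, 1000, 1500)                  # wavenumber grid, cm^-1
background = 0.2 + 1e-8 * (wn - 2500.0) ** 2        # smooth PTFE-like baseline
peak = 0.5 * np.exp(-((wn - 2920.0) ** 2) / 2e3)    # analyte-like absorption band
spectrum = background + peak

analyte = (wn > 2700) & (wn < 3100)                 # subregion excluded from fit
# Fit the spline on background points only (x must be increasing for scipy),
# then predict the baseline across the analyte region and subtract it.
spl = UnivariateSpline(wn[~analyte][::-1], spectrum[~analyte][::-1], s=1e-4)
corrected = spectrum - spl(wn)
```

In the actual protocol the smoothing parameter is selected using performance metrics from aerosol and blank samples; here it is simply fixed by hand.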

  15. A new Bayesian recursive technique for parameter estimation

    Science.gov (United States)

    Kaheil, Yasir H.; Gill, M. Kashif; McKee, Mac; Bastidas, Luis

    2006-08-01

    The performance of any model depends on how well its associated parameters are estimated. In the current application, a localized Bayesian recursive estimation (LOBARE) approach is devised for parameter estimation. The LOBARE methodology is an extension of the Bayesian recursive estimation (BARE) method. It is applied in this paper on two different types of models: an artificial intelligence (AI) model in the form of a support vector machine (SVM) application for forecasting soil moisture and a conceptual rainfall-runoff (CRR) model represented by the Sacramento soil moisture accounting (SAC-SMA) model. Support vector machines, based on statistical learning theory (SLT), represent the modeling task as a quadratic optimization problem and have already been used in various applications in hydrology. They require estimation of three parameters. SAC-SMA is a very well known model that estimates runoff. It has a 13-dimensional parameter space. In the LOBARE approach presented here, Bayesian inference is used in an iterative fashion to estimate the parameter space that will most likely enclose a best parameter set. This is done by narrowing the sampling space through updating the "parent" bounds based on their fitness. These bounds are actually the parameter sets that were selected by BARE runs on subspaces of the initial parameter space. The new approach results in faster convergence toward the optimal parameter set using minimum training/calibration data and fewer sets of parameter values. The efficacy of the localized methodology is also compared with the previously used BARE algorithm.
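The bound-narrowing logic described for LOBARE can be sketched generically: sample within the current bounds, rank by fitness, and shrink the bounds around the best-performing sets. This toy implementation illustrates the idea only and is not the authors' algorithm:

```python
import random

def narrow_bounds(loss, bounds, n_samples=200, keep_frac=0.2, iters=10, seed=1):
    """Iteratively shrink parameter bounds around the fittest sampled sets."""
    rng = random.Random(seed)
    for _ in range(iters):
        samples = [[rng.uniform(lo, hi) for lo, hi in bounds]
                   for _ in range(n_samples)]
        samples.sort(key=loss)                       # best (lowest loss) first
        elite = samples[:int(n_samples * keep_frac)]
        bounds = [(min(s[j] for s in elite), max(s[j] for s in elite))
                  for j in range(len(bounds))]
    return bounds

# Toy 2-parameter problem with optimum at (3, -1):
loss = lambda p: (p[0] - 3) ** 2 + (p[1] + 1) ** 2
final = narrow_bounds(loss, [(-10.0, 10.0), (-10.0, 10.0)])
```

As in the abstract, each iteration updates the "parent" bounds from the fitness of sampled parameter sets, so the sampling space converges toward a region enclosing a best parameter set with relatively few model evaluations.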

  16. Staged Optimization Design for Updating Urban Drainage Systems in a City of China

    Directory of Open Access Journals (Sweden)

    Kui Xu

    2018-01-01

    Full Text Available In recent years, flooding has been reported more often than in the past in most cities of China. In response, China’s State Council has urged the 36 largest cities to upgrade their preparedness to handle the 50-year rainfall, which would be a massive project with large investments. We propose a staged optimization design for updating urban drainage that is not only a flexible option against environmental changes, but also an effective way to reduce the cost of the project. The staged cost optimization model, coupled with a hydraulic model, was developed for Fuzhou City, China. The model was established to minimize the total present costs, including intervention costs and flooding costs, with full consideration of the constraints of specific local conditions. The results show that considerable financial savings could be achieved by a staged design rather than the implement-once scheme. The model’s sensitivities to four data parameters were analyzed: rainfall increase rate, flood unit cost, storage unit cost, and discount rate. The results confirm the applicability and robustness of the model for updating drainage systems to meet the requirements. The findings of this study may have important implications for urban flood management in the cities of developing countries with limited construction investments.

  17. The prognostic value of baseline {sup 18}F-FDG PET/CT in steroid-naive large-vessel vasculitis: introduction of volume-based parameters

    Energy Technology Data Exchange (ETDEWEB)

    Dellavedova, L. [Ospedale Civile di Legnano, PET/CT Center - Nuclear Medicine Department, Legnano (Italy); University of Milan, Department of Health Sciences, Milan (Italy); Carletto, M.; Maffioli, L.S. [Ospedale Civile di Legnano, PET/CT Center - Nuclear Medicine Department, Legnano (Italy); Faggioli, P.; Sciascera, A.; Mazzone, A. [Ospedale Civile di Legnano, Internal Medicine Department, Legnano (Italy); Del Sole, A. [University of Milan, Department of Health Sciences, Milan (Italy)

    2016-02-15

    The aim of this study was to analyse if the result of a baseline {sup 18}F-fluorodeoxyglucose (FDG) positron emission tomography (PET)/CT scan, in large-vessel vasculitis (LVV) patients, is able to predict the course of the disease, not only in terms of presence/absence of final complications but also in terms of favourable/complicated progress (response to steroid therapy, time to steroid suspension, relapses, etc.). A total of 46 consecutive patients, who underwent {sup 18}F-FDG PET/CT between May 2010 and March 2013 for fever of unknown origin (FUO) or suspected vasculitis (before starting corticosteroid therapy), were enrolled. The diagnosis of LVV was confirmed in 17 patients. Considering follow-up results, positive LVV patients were divided into two groups, one characterized by favourable (nine) and the other by complicated progress (eight), on the basis of presence/absence of vascular complications, presence/absence of at least another positive PET/CT during follow-up and impossibility to comply with the tapering schedule of the steroid due to biochemical/symptomatic relapse. Vessel uptake in subjects of the two groups was compared in terms of intensity and extension. To evaluate the extent of active disease, we introduced two volume-based parameters: ''volume of increased uptake'' (VIU) and ''total lesion glycolysis'' (TLG). The threshold used to calculate VIU on vessel walls was obtained by the ''vessel to liver'' ratio by means of receiver-operating characteristic analysis and was set at 0.92 x liver maximum standardized uptake value in each patient. Measures of tracer uptake intensity were significantly higher in patients with complicated progress compared to those with a favourable one (p < 0.05). Measures of disease extension were even more significant and TLG emerged as the best parameter to separate the two groups of patients (p = 0.01). This pilot study shows that, in LVV patients, the

  18. Worst case prediction of additives migration from polystyrene for food safety purposes: a model update.

    Science.gov (United States)

    Martínez-López, Brais; Gontard, Nathalie; Peyron, Stéphane

    2018-03-01

    A reliable prediction of migration levels of plastic additives into food requires a robust estimation of diffusivity. Predictive modelling of diffusivity as recommended by the EU commission is carried out using a semi-empirical equation that relies on two polymer-dependent parameters. These parameters were determined for the polymers most used by the packaging industry (LLDPE, HDPE, PP, PET, PS, HIPS) from the diffusivity data available at that time. In the specific case of general purpose polystyrene, the diffusivity data published since then show that the use of the equation with the original parameters results in systematic underestimation of diffusivity. The goal of this study was, therefore, to propose an update of the aforementioned parameters for PS on the basis of up-to-date diffusivity data, so that the equation can be used for a reasoned overestimation of diffusivity.
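The semi-empirical equation the abstract refers to is commonly written with two polymer-dependent parameters (often denoted A_P and tau), the migrant's molar mass, and the temperature. A sketch under that assumption, using frequently cited illustrative PS values, i.e. the very parameters the study argues should be updated:

```python
import math

def diffusivity_cm2_per_s(M, T, A_P, tau):
    """Semi-empirical upper-bound diffusivity; M in g/mol, T in kelvin."""
    A = A_P - tau / T                    # temperature-adjusted polymer parameter
    return 1e4 * math.exp(A - 0.1351 * M ** (2.0 / 3.0) + 0.003 * M - 10454.0 / T)

# Commonly cited (original) PS parameters -- treat these as illustrative,
# since the study proposes revised values -- for a 400 g/mol migrant at 40 degC:
D = diffusivity_cm2_per_s(M=400.0, T=313.15, A_P=-1.0, tau=0.0)
```

Updating A_P (and tau) upward shifts every predicted diffusivity for that polymer, which is how a "reasoned overestimation" is restored once new data show the old parameters are too low.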

  19. Test models for improving filtering with model errors through stochastic parameter estimation

    International Nuclear Information System (INIS)

    Gershgorin, B.; Harlim, J.; Majda, A.J.

    2010-01-01

    The filtering skill for turbulent signals from nature is often limited by model errors created by utilizing an imperfect model for filtering. Updating the parameters in the imperfect model through stochastic parameter estimation is one way to increase filtering skill and model performance. Here a suite of stringent test models for filtering with stochastic parameter estimation is developed based on the Stochastic Parameterization Extended Kalman Filter (SPEKF). These new SPEKF-algorithms systematically correct both multiplicative and additive biases and involve exact formulas for propagating the mean and covariance including the parameters in the test model. A comprehensive study is presented of robust parameter regimes for increasing filtering skill through stochastic parameter estimation for turbulent signals as the observation time and observation noise are varied and even when the forcing is incorrectly specified. The results here provide useful guidelines for filtering turbulent signals in more complex systems with significant model errors.
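The general idea of stochastic parameter estimation during filtering, i.e. augmenting the state with an uncertain model parameter and updating both from observations, can be sketched with a scalar extended Kalman filter. This is a generic illustration, not the SPEKF algorithm itself (SPEKF uses exact formulas for the mean and covariance rather than linearization); all numbers are invented:

```python
import numpy as np

rng = np.random.default_rng(0)
a_true, q, r = 0.9, 0.3, 0.1      # true AR coefficient, process & obs noise std
x_t = 1.0                         # true state
x, a = 0.0, 0.5                   # filter estimates of state and parameter
P = np.eye(2)                     # joint covariance of (x, a)
for _ in range(500):
    x_t = a_true * x_t + q * rng.standard_normal()   # truth evolves
    y = x_t + r * rng.standard_normal()              # noisy observation
    # Predict: (x, a) -> (a*x, a); F is the Jacobian of the augmented map
    F = np.array([[a, x], [0.0, 1.0]])
    x = a * x
    P = F @ P @ F.T + np.diag([q**2, 1e-6])          # small drift on parameter
    # Update with observation operator H = [1, 0]
    S = P[0, 0] + r**2
    K = P[:, 0] / S
    innov = y - x
    x += K[0] * innov
    a += K[1] * innov
    P = P - np.outer(K, P[0, :])
```

After assimilating the series, the parameter estimate `a` drifts toward the true dynamics, which is the mechanism by which on-line parameter estimation compensates for model error and improves filtering skill.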

  20. Energy storage systems cost update : a study for the DOE Energy Storage Systems Program.

    Energy Technology Data Exchange (ETDEWEB)

    Schoenung, Susan M. (Longitude 122 West, Menlo Park, CA)

    2011-04-01

    This paper reports the methodology for calculating present worth of system and operating costs for a number of energy storage technologies for representative electric utility applications. The values are an update from earlier reports, categorized by application use parameters. This work presents an update of energy storage system costs assessed previously and separately by the U.S. Department of Energy (DOE) Energy Storage Systems Program. The primary objective of the series of studies has been to express electricity storage benefits and costs using consistent assumptions, so that helpful benefit/cost comparisons can be made. Costs of energy storage systems depend not only on the type of technology, but also on the planned operation and especially the hours of storage needed. Calculating the present worth of life-cycle costs makes it possible to compare benefit values estimated on the same basis.
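The present-worth calculation described here discounts each year's operating cost back to the present and adds the capital cost, so options with different cost structures can be compared on the same basis. A minimal sketch with invented numbers (not values from the study):

```python
def present_worth(capital, annual_om, discount_rate, years):
    """Capital cost plus each year's O&M cost discounted to the present."""
    pw = capital
    for t in range(1, years + 1):
        pw += annual_om / (1.0 + discount_rate) ** t
    return pw

# Two hypothetical storage options compared on the same basis:
battery = present_worth(capital=2_000_000, annual_om=50_000,
                        discount_rate=0.08, years=10)
pumped = present_worth(capital=3_000_000, annual_om=20_000,
                       discount_rate=0.08, years=10)
```

Because the hours of storage needed drive both capital and operating costs, the same comparison is repeated per application in studies of this kind.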

  1. Optimisation of milling parameters using neural network

    Directory of Open Access Journals (Sweden)

    Lipski Jerzy

    2017-01-01

    Full Text Available The purpose of this study was to design and test intelligent computer software developed to increase the average productivity of milling without compromising the design features of the final product. The developed system generates optimal milling parameters based on the extent of tool wear. The optimisation algorithm employs a multilayer model of the milling process developed with an artificial neural network. The input parameters for model training are the cutting speed vc, the feed per tooth fz, and the degree of tool wear measured by means of localised flank wear (VB3). The output parameter is the roughness Ra of the machined surface. Since the model in the neural network exhibits good approximation of functional relationships, it was applied to determine optimal milling parameters under changing tool wear conditions (VB3) and to stabilise the surface roughness parameter Ra. Our solution enables constant control over surface roughness and milling productivity after each assessment of tool condition. The recommended parameters, i.e. those which, when applied in milling, ensure the desired surface roughness and maximal productivity, are selected from all the parameters generated by the model. The developed software may constitute an expert system supporting a milling machine operator. In addition, the application may be installed on a mobile device (smartphone) connected to a tool wear diagnostics instrument and the machine tool controller in order to supply updated optimal milling parameters. The presented solution facilitates tool life optimisation and reduces tool change costs, particularly during prolonged operation.
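The selection step described above, i.e. predicting Ra from (vc, fz, VB3) and picking the most productive parameters that still meet a roughness limit, can be sketched with a quadratic least-squares surrogate standing in for the neural-network model; all coefficients, ranges, and the Ra threshold are invented:

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic "measurements" standing in for the training data
vc = rng.uniform(100, 300, 80)          # cutting speed vc, m/min
fz = rng.uniform(0.05, 0.25, 80)        # feed per tooth fz, mm
vb = rng.uniform(0.0, 0.3, 80)          # flank wear VB3, mm
ra = 0.2 + 8 * fz**2 + 2 * vb * fz + 0.0005 * vc   # invented roughness law

# Quadratic least-squares surrogate Ra = f(vc, fz, VB3)
X = np.column_stack([np.ones_like(vc), vc, fz, vb, fz**2, vb * fz])
coef, *_ = np.linalg.lstsq(X, ra, rcond=None)

# Grid search: maximise a productivity proxy vc*fz subject to Ra <= 0.8
VC, FZ = np.meshgrid(np.linspace(100, 300, 50), np.linspace(0.05, 0.25, 50))
vb_now = 0.15                           # current tool-wear reading
ones = np.ones(VC.size)
feats = np.column_stack([ones, VC.ravel(), FZ.ravel(), vb_now * ones,
                         FZ.ravel()**2, vb_now * FZ.ravel()])
ra_pred = feats @ coef
score = np.where(ra_pred <= 0.8, VC.ravel() * FZ.ravel(), -np.inf)
vc_opt, fz_opt = VC.ravel()[score.argmax()], FZ.ravel()[score.argmax()]
```

Re-running the grid search with each new VB3 reading mimics the paper's workflow of refreshing the recommended parameters after every tool-condition assessment.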

  2. Cortisol and politics: variance in voting behavior is predicted by baseline cortisol levels.

    Science.gov (United States)

    French, Jeffrey A; Smith, Kevin B; Alford, John R; Guck, Adam; Birnie, Andrew K; Hibbing, John R

    2014-06-22

    Participation in electoral politics is affected by a host of social and demographic variables, but there is growing evidence that biological predispositions may also play a role in behavior related to political involvement. We examined the role of individual variation in hypothalamic-pituitary-adrenal (HPA) stress axis parameters in explaining differences in self-reported and actual participation in political activities. Self-reported political activity, religious participation, and verified voting activity in U.S. national elections were collected from 105 participants, who were subsequently exposed to a standardized (nonpolitical) psychosocial stressor. We demonstrated that lower baseline salivary cortisol in the late afternoon was significantly associated with increased actual voting frequency in six national elections, but not with self-reported non-voting political activity. Baseline cortisol predicted significant variation in voting behavior above and beyond variation accounted for by traditional demographic variables (particularly age of participant in our sample). Participation in religious activity was weakly (and negatively) associated with baseline cortisol. Our results suggest that HPA-mediated characteristics of social, cognitive, and emotional processes may exert an influence on a trait as complex as voting behavior, and that cortisol is a better predictor of actual voting behavior, as opposed to self-reported political activity. Copyright © 2014 Elsevier Inc. All rights reserved.

  3. Cortisol and Politics: Variance in Voting Behavior is Predicted by Baseline Cortisol Levels

    Science.gov (United States)

    French, Jeffrey A.; Smith, Kevin B.; Alford, John R.; Guck, Adam; Birnie, Andrew K.; Hibbing, John R.

    2014-01-01

    Participation in electoral politics is affected by a host of social and demographic variables, but there is growing evidence that biological predispositions may also play a role in behavior related to political involvement. We examined the role of individual variation in hypothalamic-pituitary-adrenal (HPA) stress axis parameters in explaining differences in self-reported and actual participation in political activities. Self-reported political activity, religious participation, and verified voting activity in U.S. national elections were collected from 105 participants, who were subsequently exposed to a standardized (nonpolitical) psychosocial stressor. We demonstrated that lower baseline salivary cortisol in the late afternoon was significantly associated with increased actual voting frequency in six national elections, but not with self-reported non-voting political activity. Baseline cortisol predicted significant variation in voting behavior above and beyond variation accounted for by traditional demographic variables (particularly age of participant in our sample). Participation in religious activity was weakly (and negatively) associated with baseline cortisol. Our results suggest that HPA-mediated characteristics of social, cognitive, and emotional processes may exert an influence on a trait as complex as voting behavior, and that cortisol is a better predictor of actual voting behavior, as opposed to self-reported political activity. PMID:24835544

  4. The primordial helium abundance from updated emissivities

    International Nuclear Information System (INIS)

    Aver, Erik; Olive, Keith A.; Skillman, Evan D.; Porter, R.L.

    2013-01-01

    Observations of metal-poor extragalactic H II regions allow the determination of the primordial helium abundance, Y p . The He I emissivities are the foundation of the model of the H II region's emission. Porter, Ferland, Storey, and Detisch (2012) have recently published updated He I emissivities based on improved photoionization cross-sections. We incorporate these new atomic data and update our recent Markov Chain Monte Carlo analysis of the dataset published by Izotov, Thuan, and Stasińska (2007). As before, cuts are made to promote quality and reliability, and only solutions which fit the data within the 95% confidence level are used to determine the primordial He abundance. The previously qualifying dataset is almost entirely retained, with strong concordance between the physical parameters. Overall, an upward bias from the new emissivities leads to a decrease in Y p . In addition, we find a general trend to larger uncertainties in individual objects (due to changes in the emissivities) and an increased variance (due to additional objects included). From a regression to zero metallicity, we determine Y p = 0.2465 ± 0.0097, in good agreement with the BBN result, Y p = 0.2485 ± 0.0002, based on the Planck determination of the baryon density. In the future, a better understanding of why a large fraction of spectra are not well fit by the model will be crucial to achieving an increase in the precision of the primordial helium abundance determination.
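The regression-to-zero-metallicity step can be sketched as a weighted linear fit of the helium abundance Y against metallicity (O/H), read off at O/H = 0; the data points and uncertainties below are invented placeholders, not the paper's dataset:

```python
import numpy as np

oh = np.array([4.0, 6.0, 8.0, 10.0, 12.0]) * 1e-5   # O/H metallicity
y = np.array([0.249, 0.250, 0.252, 0.253, 0.255])   # He mass fraction Y
sigma = np.full_like(y, 0.002)                      # per-object uncertainty

# np.polyfit weights multiply the residuals, so pass 1/sigma for Gaussian errors
slope, intercept = np.polyfit(oh, y, deg=1, w=1.0 / sigma)
y_p = intercept                                     # extrapolation to O/H = 0
```

Galaxies enrich helium as they build up metals, so the fitted slope is positive and the intercept estimates the pre-enrichment (primordial) value.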

  5. Update of CERN exchange network

    CERN Multimedia

    2003-01-01

    An update of the CERN exchange network will be done next April. Disturbances or even interruptions of telephony services may occur from 4th to 24th April during evenings from 18:30 to 00:00 but will not exceed more than 4 consecutive hours (see tentative planning below). In addition, the voice messaging system will be shut down on March, 26th April from 18:00 to 00:00. Calls supposed to be routed to the voice messaging system will not be possible during the shutdown. CERN divisions are invited to avoid any change requests (set-ups, move or removals) of telephones and fax machines from 4th to 25th April. Everything will be done to minimize potential inconveniences which may occur during this update. There will be no loss of telephone functionalities. CERN GSM portable phones won't be affected by this change. Should you need more details, please send us your questions by email to Standard.Telephone@cern.ch. Date Change type Affected areas April 8 Update of switch in LHC 7 LHC 7 Point April 9 Update of...

  6. Rotating shaft model updating from modal data by a direct energy approach : a feasibility study

    International Nuclear Information System (INIS)

    Audebert, S.

    1996-01-01

    Investigations to improve rotating machinery monitoring increasingly rely on numerical models. The aim is to obtain multi-fluid-bearing rotor models that correctly represent dynamic behaviour, whether of modal or forced-response type. The possibility of extending the direct energy method, initially developed for undamped structures, to rotating machinery is studied. It is based on minimizing the kinetic and strain energy gap between experimental and analytic modal data. The preliminary determination of the eigenmodes of a multi-linear bearing rotor system shows the complexity of the problem in comparison with undamped, non-rotating structures: taking into account gyroscopic effects and bearing damping, which are functions of rotor velocity, leads to eigenmodes with complex components; moreover, the non-symmetric matrices related to the stiffness and damping contributions of the bearings induce distinct left-hand and right-hand eigenmodes (the left-hand eigenmodes correspond to the adjoint structure). Theoretically, the extension of the energy method is studied by considering first the intermediate case of an undamped non-gyroscopic structure and second the general case of a rotating shaft; the data used for the updating procedure are eigenfrequencies and left- and right-hand mode shapes. Since left-hand mode shapes cannot be directly measured, they are replaced by analytic ones. The method is tested on a two-bearing rotor system with an added mass; simulated data are used, relative to a non-compatible structure, i.e. one that is not part of the set of possible modified analytic structures. The parameters to be corrected are the mass density, the Young's modulus, and the linearized stiffness and damping characteristics of the bearings. If the parameters are influential with regard to the modes to be updated, the updating method permits a significant improvement of the gap between analytic and experimental modes, even for modes not involved in the procedure. Modal damping appears to be more

  7. 1993 baseline solid waste management system description

    International Nuclear Information System (INIS)

    Armacost, L.L.; Fowler, R.A.; Konynenbelt, H.S.

    1994-02-01

    Pacific Northwest Laboratory has prepared this report under the direction of Westinghouse Hanford Company. The report provides an integrated description of the system planned for managing Hanford's solid low-level waste, low-level mixed waste, transuranic waste, and transuranic mixed waste. The primary purpose of this document is to illustrate a collective view of the key functions planned at the Hanford Site to handle existing waste inventories, as well as solid wastes that will be generated in the future. By viewing this system as a whole rather than as individual projects, key facility interactions and requirements are identified and a better understanding of the overall system may be gained. The system is described so as to form a basis for modeling the system at various levels of detail. Model results provide insight into issues such as facility capacity requirements, alternative system operating strategies, and impacts of system changes (i.e., startup dates). This description of the planned Hanford solid waste processing system: defines a baseline system configuration; identifies the entering waste streams to be managed within the system; identifies basic system functions and waste flows; and highlights system constraints. This system description will evolve and be revised as issues are resolved, planning decisions are made, additional data are collected, and assumptions are tested and changed. Out of necessity, this document will also be revised and updated so that a documented system description, which reflects current system planning, is always available for use by engineers and managers. It does not provide any results generated from the many alternatives that will be modeled in the course of analyzing solid waste disposal options; such results will be provided in separate documents.

  8. Two-dimensional speckle tracking echocardiography prognostic parameters in patients after acute myocardial infarction.

    Science.gov (United States)

    Haberka, Maciej; Liszka, Jerzy; Kozyra, Andrzej; Finik, Maciej; Gąsior, Zbigniew

    2015-03-01

    The aim of the study was to evaluate left ventricle (LV) function with speckle tracking echocardiography (STE) and to assess its relation to prognosis in patients after acute myocardial infarction (AMI). Sixty-three patients (F/M = 16/47; 62.33 ± 11.85 years old) with AMI (NSTEMI/STEMI 24/39) and successful percutaneous coronary intervention (PCI) with stent implantation (thrombolysis in myocardial infarction; TIMI 3 flow) were enrolled in this study. All patients underwent baseline two-dimensional conventional echocardiography and STE 3 days (baseline) and 30 days after PCI. All patients were followed up for cardiovascular clinical endpoints, the major adverse cardiovascular endpoint (MACE), and functional status (Canadian Cardiovascular Society and New York Heart Association). During the follow-up (31.9 ± 5.1 months), there were 3 cardiovascular deaths, 15 patients had AMI, 2 patients had cerebral infarction, and 24 patients reached the MACE. Baseline LV torsion (P = 0.035), but none of the other strain parameters, was associated with the time to first unplanned cardiovascular hospitalization. Univariate analysis showed that baseline longitudinal two-chamber and four-chamber strain (sLa2 0 and sLa4 0) and the same parameters obtained 30 days after the AMI, together with transverse four-chamber strain (sLa2 30, sLa4 30, and sTa4 30), were significantly associated with the combined endpoint (MACE). The strongest association in the univariate analysis was found for the baseline sLa2. However, in multivariable analysis only left ventricular remodeling (LVR; 27% of patients) was significantly associated with MACE, and strain parameters were not associated with the combined endpoint. The assessment of LV function with STE may improve cardiovascular risk prediction in postmyocardial infarction patients. © 2014, Wiley Periodicals, Inc.

  9. Long Baseline Observatory (LBO)

    Data.gov (United States)

    Federal Laboratory Consortium — The Long Baseline Observatory (LBO) comprises ten radio telescopes spanning 5,351 miles. It's the world's largest, sharpest, dedicated telescope array. With an eye...

  10. Hanford Site technical baseline database. Revision 1

    International Nuclear Information System (INIS)

    Porter, P.E.

    1995-01-01

    This report lists the Hanford specific files (Table 1) that make up the Hanford Site Technical Baseline Database. Table 2 includes the delta files that delineate the differences between this revision and revision 0 of the Hanford Site Technical Baseline Database. This information is being managed and maintained on the Hanford RDD-100 System, which uses the capabilities of RDD-100, a systems engineering software system of Ascent Logic Corporation (ALC). This revision of the Hanford Site Technical Baseline Database uses RDD-100 version 3.0.2.2 (see Table 3). Directories reflect those controlled by the Hanford RDD-100 System Administrator. Table 4 provides information regarding the platform. A cassette tape containing the Hanford Site Technical Baseline Database is available

  11. Microbial Communities Model Parameter Calculation for TSPA/SR

    International Nuclear Information System (INIS)

    D. Jolley

    2001-01-01

This calculation has several purposes. First, the calculation reduces the information contained in ''Committed Materials in Repository Drifts'' (BSC 2001a) to usable parameters required as input to MING V1.0 (CRWMS M and O 1998, CSCI 30018 V1.0) for calculation of the effects of potential in-drift microbial communities as part of the microbial communities model. The calculation is intended to replace the parameters found in Attachment II of the current In-Drift Microbial Communities Model revision (CRWMS M and O 2000c), with the exception of Section 11-5.3. Second, this calculation provides the information necessary to supersede the following DTN: M09909SPAMING1.003 and replace it with a new qualified dataset (see Table 6.2-1). The purpose of this calculation is to create the revised qualified parameter input for MING that will allow ΔG (Gibbs free energy) to be corrected for long-term changes to the temperature of the near-field environment. Calculated herein are the quadratic (second-order) regression relationships that are used in the energy-limiting calculations of the potential growth of microbial communities in the in-drift geochemical environment. Third, the calculation performs an impact review of a new DTN: M00012MAJIONIS.000 that is intended to replace the currently cited DTN: GS9809083 12322.008 for water chemistry data used in the current ''In-Drift Microbial Communities Model'' revision (CRWMS M and O 2000c). Finally, the calculation updates the material lifetimes reported in Table 32 in Section 6.5.2.3 of the ''In-Drift Microbial Communities'' AMR (CRWMS M and O 2000c) based on the inputs reported in BSC (2001a). Changes include adding newly specified materials and updating old materials information that has changed.
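The quadratic regression relationships described above can be sketched as follows; the (T, ΔG) pairs and the fitted coefficients here are hypothetical illustrations, not values from the qualified dataset.

```python
import numpy as np

# Hypothetical (temperature, ΔG) pairs for one reaction; the actual
# values would come from the qualified dataset described above.
T = np.array([25.0, 40.0, 60.0, 80.0, 95.0])             # °C
dG = np.array([-120.5, -118.2, -115.0, -111.6, -109.1])  # kJ/mol

# Second-order (quadratic) regression ΔG(T) ≈ c2*T² + c1*T + c0,
# the form used to correct Gibbs free energy for long-term changes
# in near-field temperature.
c2, c1, c0 = np.polyfit(T, dG, deg=2)

def gibbs(temp_c):
    """Regression estimate of ΔG at an arbitrary temperature."""
    return c2 * temp_c**2 + c1 * temp_c + c0
```

Such a fitted polynomial lets the energy-limiting calculation evaluate ΔG at any temperature trajectory rather than only at the tabulated points.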

  12. Updating Parameters for Volcanic Hazard Assessment Using Multi-parameter Monitoring Data Streams And Bayesian Belief Networks

    Science.gov (United States)

    Odbert, Henry; Aspinall, Willy

    2014-05-01

Evidence-based hazard assessment at volcanoes assimilates knowledge about the physical processes of hazardous phenomena and observations that indicate the current state of a volcano. Incorporating both these lines of evidence can inform our belief about the likelihood (probability) and consequences (impact) of possible hazardous scenarios, forming a basis for formal quantitative hazard assessment. However, such evidence is often uncertain, indirect or incomplete. Approaches to volcano monitoring have advanced substantially in recent decades, increasing the variety and resolution of multi-parameter timeseries data recorded at volcanoes. Interpreting these multiple strands of parallel, partial evidence thus becomes increasingly complex. In practice, interpreting many timeseries requires an individual to be familiar with the idiosyncrasies of the volcano, monitoring techniques, configuration of recording instruments, observations from other datasets, and so on. In making such interpretations, an individual must consider how different volcanic processes may manifest as measurable observations, and then infer from the available data what can or cannot be deduced about those processes. We examine how parts of this process may be synthesised algorithmically using Bayesian inference. Bayesian Belief Networks (BBNs) use probability theory to treat and evaluate uncertainties in a rational and auditable scientific manner, but only to the extent warranted by the strength of the available evidence. The concept is a suitable framework for marshalling multiple strands of evidence (e.g. observations, model results and interpretations) and their associated uncertainties in a methodical manner. BBNs are usually implemented in graphical form and could be developed as a tool for near real-time, ongoing use in a volcano observatory, for example. We explore the application of BBNs in analysing volcanic data from the long-lived eruption at Soufriere Hills Volcano, Montserrat.
We discuss
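The probabilistic evidence-marshalling described above can be illustrated with a minimal two-node belief network; the probabilities below are hypothetical, not values from the Montserrat analysis.

```python
# Minimal two-node belief-network update: a hidden "unrest" state and
# one observable ("elevated seismicity"). All numbers are illustrative.
p_unrest = 0.10                 # prior P(volcanic unrest)
p_obs_given_unrest = 0.80       # P(elevated seismicity | unrest)
p_obs_given_quiet = 0.05        # P(elevated seismicity | no unrest)

# Bayes' rule: posterior belief in unrest after observing elevated seismicity
evidence = (p_obs_given_unrest * p_unrest
            + p_obs_given_quiet * (1.0 - p_unrest))
posterior = p_obs_given_unrest * p_unrest / evidence
print(round(posterior, 3))  # belief rises well above the 0.10 prior
```

A full BBN chains many such conditional tables (deformation, gas flux, seismicity, ...) so that each new observation revises the belief only as far as its evidential strength warrants.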

  13. A Particle Smoother with Sequential Importance Resampling for soil hydraulic parameter estimation: A lysimeter experiment

    Science.gov (United States)

    Montzka, Carsten; Hendricks Franssen, Harrie-Jan; Moradkhani, Hamid; Pütz, Thomas; Han, Xujun; Vereecken, Harry

    2013-04-01

An adequate description of soil hydraulic properties is essential for a good performance of hydrological forecasts. Several studies have shown that data assimilation can reduce parameter uncertainty by considering soil moisture observations. However, these observations and also the model forcings were recorded with a specific measurement error. It is a logical step to base state updating and parameter estimation on observations made at multiple time steps, in order to reduce the influence of outliers at single time steps given measurement errors and unknown model forcings. Such outliers could result in erroneous state estimation as well as inadequate parameters. This has been one of the reasons to use a smoothing technique as implemented for Bayesian data assimilation methods such as the Ensemble Kalman Filter (i.e. the Ensemble Kalman Smoother). Recently, an ensemble-based smoother has been developed for state updating with a SIR particle filter. However, this method has not been used for dual state-parameter estimation. In this contribution we present a Particle Smoother with sequential smoothing of particle weights for state and parameter resampling within a time window, as opposed to the single time step data assimilation used in filtering techniques. This can be seen as an intermediate variant between a parameter estimation technique using global optimization with estimation of single parameter sets valid for the whole period, and sequential Monte Carlo techniques with estimation of parameter sets evolving from one time step to another. The aims are i) to improve the forecast of evaporation and groundwater recharge by estimating hydraulic parameters, and ii) to reduce the impact of single erroneous model inputs/observations by a smoothing method. In order to validate the performance of the proposed method in a real world application, the experiment is conducted in a lysimeter environment.
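The SIR reweight-and-resample step underlying such a particle method can be sketched for a 1-D state with Gaussian observation error; this is a generic illustration, not the authors' smoother, and all numbers are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def sir_update(particles, weights, obs, obs_std):
    """One SIR step: reweight particles by the Gaussian likelihood of
    the observation, normalize, then resample (systematic resampling)."""
    lik = np.exp(-0.5 * ((particles - obs) / obs_std) ** 2)
    w = weights * lik
    w /= w.sum()
    n = len(particles)
    # systematic resampling: evenly spaced positions through the CDF
    positions = (rng.random() + np.arange(n)) / n
    idx = np.searchsorted(np.cumsum(w), positions)
    idx = np.minimum(idx, n - 1)  # guard against float round-off at the tail
    return particles[idx], np.full(n, 1.0 / n)

# Hypothetical 1-D state (e.g. one soil hydraulic parameter); a smoother
# would apply such reweighting jointly over a window of time steps.
p = rng.normal(0.30, 0.10, size=1000)   # prior ensemble
w = np.full(1000, 1e-3)                 # uniform prior weights
p, w = sir_update(p, w, obs=0.35, obs_std=0.02)
```

After the update the ensemble concentrates near the observation, with residual spread reflecting both prior and observation uncertainty.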

  14. Baseline mitral regurgitation predicts outcome in patients referred for dobutamine stress echocardiography.

    Science.gov (United States)

    O'Driscoll, Jamie M; Gargallo-Fernandez, Paula; Araco, Marco; Perez-Lopez, Manuel; Sharma, Rajan

    2017-11-01

A number of parameters recorded during dobutamine stress echocardiography (DSE) are associated with worse outcome. However, the relative importance of baseline mitral regurgitation (MR) is unknown. The aim of this study was to assess the prevalence and associated implications of functional MR with long-term mortality in a large cohort of patients referred for DSE. 6745 patients (mean age 64.9 ± 12.2 years) were studied. Demographic, baseline and peak DSE data were collected. All-cause mortality was retrospectively analyzed. DSE was successfully completed in all patients with no adverse outcomes. MR was present in 1019 (15.1%) patients. During a mean follow-up of 5.1 ± 1.8 years, 1642 (24.3%) patients died; MR was significantly associated with increased all-cause mortality, and its inclusion in prediction models significantly improved discrimination. MR is associated with all-cause mortality and adds incremental prognostic information among patients referred for DSE. The presence of MR should be taken into account when evaluating the prognostic significance of DSE results.

  15. A novel baseline correction method using convex optimization framework in laser-induced breakdown spectroscopy quantitative analysis

    Science.gov (United States)

    Yi, Cancan; Lv, Yong; Xiao, Han; Ke, Ke; Yu, Xun

    2017-12-01

In laser-induced breakdown spectroscopy (LIBS) quantitative analysis, baseline correction is an essential part of data preprocessing. Baseline drift commonly arises from fluctuations in laser energy, inhomogeneity of sample surfaces, and background noise, and has attracted the interest of many researchers. Most prevalent algorithms require preset key parameters, such as a suitable spline function or fitting order, and thus lack adaptability. Based on the characteristics of LIBS signals, namely the sparsity of spectral peaks and the low-pass character of the baseline, a novel baseline correction and spectral denoising method is studied in this paper. The technique uses a convex optimization scheme to form a non-parametric baseline correction model, while an asymmetric penalty function enhances the signal-to-noise ratio (SNR) of the LIBS signal and improves reconstruction precision. Furthermore, an efficient iterative algorithm is applied to the optimization process so as to ensure convergence. To validate the proposed method, the concentrations of chromium (Cr), manganese (Mn) and nickel (Ni) in 23 certified high-alloy steel samples are assessed using quantitative models built with Partial Least Squares (PLS) and Support Vector Machine (SVM). Because it requires no prior knowledge of sample composition and no mathematical hypotheses, the proposed method achieves better accuracy in quantitative analysis than comparable approaches and fully demonstrates its adaptability.
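A closely related non-parametric scheme, asymmetric least squares baseline estimation (a smoothness penalty combined with asymmetric weights), illustrates the idea of penalizing points above the baseline more than points below it; this is not the authors' exact algorithm, and all parameters below are illustrative.

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

def asls_baseline(y, lam=1e5, p=0.01, n_iter=10):
    """Asymmetric least squares baseline: a second-difference smoothness
    penalty (lam) plus asymmetric weights (p) that punish points below the
    baseline far less than points above it, so sharp peaks are ignored."""
    n = len(y)
    D = sparse.diags([1, -2, 1], [0, 1, 2], shape=(n - 2, n))
    w = np.ones(n)
    for _ in range(n_iter):
        W = sparse.diags(w)
        z = spsolve((W + lam * D.T @ D).tocsc(), w * y)
        w = np.where(y > z, p, 1 - p)
    return z

# Synthetic "spectrum": slow linear drift plus two sharp peaks
x = np.linspace(0, 1, 500)
drift = 2.0 + 0.5 * x
y = drift + 5 * np.exp(-((x - 0.3) / 0.01) ** 2) \
          + 3 * np.exp(-((x - 0.7) / 0.01) ** 2)
base = asls_baseline(y)   # recovers the drift, not the peaks
```

Subtracting `base` from `y` leaves the peaks on a flat background, which is the preprocessing goal the abstract describes.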

  16. Better Plants Progress Update Fall 2013

    Energy Technology Data Exchange (ETDEWEB)

    none,

    2013-09-23

This Progress Update summarizes the significant energy-saving achievements and cumulative cost savings made by Better Plants partner companies from 2010 to 2012. The update also shares the Better Plants Program's plans and priorities for the coming year as it continues to advance energy efficiency in the industrial sector.

  17. Non-Linear Approximation of Bayesian Update

    KAUST Repository

    Litvinenko, Alexander

    2016-01-01

We develop a non-linear approximation of the expensive Bayesian update formula. This non-linear approximation is applied directly to the Polynomial Chaos coefficients. In this way, we avoid Monte Carlo sampling and its sampling error. We show that the famous Kalman update formula is a particular case of this update.
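The linear special case mentioned above, the Kalman update, can be written out directly for a scalar state; the numbers are illustrative only.

```python
# Scalar Kalman update: the linear special case of the Bayesian update
# that the abstract generalizes. All numbers are illustrative.
prior_mean, prior_var = 1.0, 0.5     # prior state estimate
obs, obs_var = 1.4, 0.25             # noisy measurement

gain = prior_var / (prior_var + obs_var)            # Kalman gain
post_mean = prior_mean + gain * (obs - prior_mean)  # pulled toward obs
post_var = (1.0 - gain) * prior_var                 # variance shrinks
```

The non-linear approximation of the paper replaces this linear mean-and-gain map with a more general update acting on the Polynomial Chaos coefficients of the state.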

  18. Non-Linear Approximation of Bayesian Update

    KAUST Repository

    Litvinenko, Alexander

    2016-06-23

We develop a non-linear approximation of the expensive Bayesian update formula. This non-linear approximation is applied directly to the Polynomial Chaos coefficients. In this way, we avoid Monte Carlo sampling and its sampling error. We show that the famous Kalman update formula is a particular case of this update.

  19. 75 FR 66748 - Notice of Baseline Filings

    Science.gov (United States)

    2010-10-29

    ...- 000] Notice of Baseline Filings October 22, 2010. ONEOK Gas Transportation, L.L.C Docket No. PR11-68... above submitted a revised baseline filing of their Statement of Operating Conditions for services...

  20. Hydrochemical variations in selected geothermal groundwater and carbonated springs in Korea: a baseline study for early detection of CO2 leakage.

    Science.gov (United States)

    Choi, Hanna; Piao, Jize; Woo, Nam C; Cho, Heuynam

    2017-02-01

A baseline hydrochemistry of the above zone aquifer was examined for the potential of CO2 early detection monitoring. Among the major ionic components and stable isotope ratios of oxygen, hydrogen, and carbon, components with a relative standard deviation (RSD) of leakage into the above zone. As an analog to the zone above CO2 storage formation, we sampled deep groundwater, including geothermal groundwater from well depths of 400-700 m below the ground surface (bgs) and carbonated springs with a high CO2 content in Korea. Under the natural conditions of inland geothermal groundwater, pH, electrical conductivity (EC), bicarbonate (HCO3), δ18O, δ2H, and δ13C were relatively stable as well as sensitive to the introduction of CO2(g), thus showing good potential as monitoring parameters for early detection of CO2 leakage. In carbonated springs, the parameters identified were pH, δ18O, and δ2H. Baseline hydrochemistry monitoring could provide information on parameters useful for detecting anomalies caused by CO2 leakage as measures for early warning.

  1. Indoor Spatial Updating with Reduced Visual Information

    OpenAIRE

    Legge, Gordon E.; Gage, Rachel; Baek, Yihwa; Bochsler, Tiana M.

    2016-01-01

Purpose: Spatial updating refers to the ability to keep track of position and orientation while moving through an environment. People with impaired vision may be less accurate in spatial updating, with adverse consequences for indoor navigation. In this study, we asked how artificial restrictions on visual acuity and field size affect spatial updating, and also judgments of the size of rooms. Methods: Normally sighted young adults were tested with artificial restriction of acuity in Mild Blur (S...

  2. Updated aerosol module and its application to simulate secondary organic aerosols during IMPACT campaign May 2008

    Directory of Open Access Journals (Sweden)

    Y. P. Li

    2013-07-01

The formation of secondary organic aerosol (SOA) was simulated with the Secondary ORGanic Aerosol Model (SORGAM) using a classical gas-particle partitioning concept, the two-product model approach, which is widely used in chemical transport models. In this study, we extensively updated SORGAM with three major modifications: firstly, we derived temperature-dependence functions of the SOA yields for aromatics and biogenic VOCs (volatile organic compounds), based on recent chamber studies, within a sophisticated mathematical optimization framework; secondly, we implemented the SOA formation pathways from photo-oxidation (OH-initiated) of isoprene; thirdly, we implemented the SOA formation channel from NO3-initiated oxidation of reactive biogenic hydrocarbons (isoprene and monoterpenes). The temperature-dependence functions of the SOA yields were validated against available chamber experiments, and the updated SORGAM was evaluated with the chamber data. Good performance was found, with a normalized mean error of less than 30%. Moreover, the whole updated SORGAM module was validated against ambient SOA observations, represented by the summed oxygenated organic aerosol (OOA) concentrations extracted from aerosol mass spectrometer (AMS) measurements at a rural site near Rotterdam, the Netherlands, performed during the IMPACT campaign in May 2008. In this case, we embedded both the original and the updated SORGAM module into the EURopean Air pollution and Dispersion-Inverse Model (EURAD-IM), which showed generally good agreement with the observed meteorological parameters and several secondary products such as O3, sulfate and nitrate. With the updated SORGAM module, the EURAD-IM model also captured the observed SOA concentrations reasonably well, especially during nighttime. In contrast, the EURAD-IM model before the update underestimated the observations by a factor of up to 5.
The large improvements of the modeled
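The two-product gas-particle partitioning concept that SORGAM builds on can be sketched as an Odum-type yield curve; the alpha_i and K_i values below are illustrative, not the updated SORGAM fits.

```python
# Two-product gas-particle partitioning (Odum-type), the concept the
# SORGAM module is built on. alpha and k values are illustrative only.
def soa_yield(m_o, alpha, k):
    """Fractional SOA yield at absorbing organic mass m_o (ug/m3):
    Y = m_o * sum_i alpha_i * K_i / (1 + K_i * m_o)."""
    return m_o * sum(a * ki / (1.0 + ki * m_o) for a, ki in zip(alpha, k))

# Yield grows with the organic aerosol mass already present, because more
# absorbing mass shifts the semi-volatile products into the particle phase:
low = soa_yield(m_o=1.0, alpha=[0.05, 0.20], k=[0.5, 0.002])
high = soa_yield(m_o=50.0, alpha=[0.05, 0.20], k=[0.5, 0.002])
```

The paper's update makes the alpha_i and K_i parameters functions of temperature, fitted to chamber data, rather than constants.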

  3. Precise atmospheric parameters for the shortest-period binary white dwarfs: gravitational waves, metals, and pulsations

    International Nuclear Information System (INIS)

    Gianninas, A.; Kilic, Mukremin; Dufour, P.; Bergeron, P.; Brown, Warren R.; Hermes, J. J.

    2014-01-01

We present a detailed spectroscopic analysis of 61 low-mass white dwarfs and provide precise atmospheric parameters, masses, and updated binary system parameters based on our new model atmosphere grids and the most recent evolutionary model calculations. For the first time, we measure systematic abundances of He, Ca, and Mg for metal-rich, extremely low mass white dwarfs and examine the distribution of these abundances as a function of effective temperature and mass. Based on our preliminary results, we discuss the possibility that shell flashes may be responsible for the presence of the observed He and metals. We compare stellar radii derived from our spectroscopic analysis to model-independent measurements and find good agreement except for white dwarfs with Teff ≲ 10,000 K. We also calculate the expected gravitational wave strain for each system and discuss their significance to the eLISA space-borne gravitational wave observatory. Finally, we provide an update on the instability strip of extremely low mass white dwarf pulsators.

  4. Precise atmospheric parameters for the shortest-period binary white dwarfs: gravitational waves, metals, and pulsations

    Energy Technology Data Exchange (ETDEWEB)

    Gianninas, A.; Kilic, Mukremin [Homer L. Dodge Department of Physics and Astronomy, University of Oklahoma, 440 West Brooks Street, Norman, OK 73019 (United States); Dufour, P.; Bergeron, P. [Département de Physique, Université de Montréal, C.P. 6128, Succ. Centre-Ville, Montréal, Québec H3C 3J7 (Canada); Brown, Warren R. [Smithsonian Astrophysical Observatory, 60 Garden Street, Cambridge, MA 02138 (United States); Hermes, J. J., E-mail: alexg@nhn.ou.edu [Department of Physics, University of Warwick, Coventry CV4 7AL (United Kingdom)

    2014-10-10

We present a detailed spectroscopic analysis of 61 low-mass white dwarfs and provide precise atmospheric parameters, masses, and updated binary system parameters based on our new model atmosphere grids and the most recent evolutionary model calculations. For the first time, we measure systematic abundances of He, Ca, and Mg for metal-rich, extremely low mass white dwarfs and examine the distribution of these abundances as a function of effective temperature and mass. Based on our preliminary results, we discuss the possibility that shell flashes may be responsible for the presence of the observed He and metals. We compare stellar radii derived from our spectroscopic analysis to model-independent measurements and find good agreement except for white dwarfs with Teff ≲ 10,000 K. We also calculate the expected gravitational wave strain for each system and discuss their significance to the eLISA space-borne gravitational wave observatory. Finally, we provide an update on the instability strip of extremely low mass white dwarf pulsators.

  5. A Time Domain Update Method for Reservoir History Matching of Electromagnetic Data

    KAUST Repository

    Katterbauer, Klemens

    2014-03-25

The oil & gas industry has been the backbone of the world's economy in the last century and will continue to be in the decades to come. With increasing demand and conventional reservoirs depleting, new oil industry projects have become more complex and expensive, operating in areas that were previously considered impossible and uneconomical. Therefore, good reservoir management is key to the economic success of complex projects, requiring the incorporation of reliable uncertainty estimates for production forecasts and for optimizing reservoir exploitation. Reservoir history matching has played a key role here, incorporating production, seismic, electromagnetic and logging data for forecasting the development of reservoirs and their depletion. With the advances of the last decade, electromagnetic techniques, such as crosswell electromagnetic tomography, have enabled engineers to map reservoirs more precisely and understand their evolution. Incorporating the large amount of data efficiently and reducing uncertainty in the forecasts has been one of the key challenges for reservoir management. Computing the conductivity distribution of the field to adjust parameters in the forecasting process by solving the inverse problem has been a challenge, due to the strong ill-posedness of the inversion problem and the extensive manual calibration required, making it impossible to include in an efficient reservoir history matching forecasting algorithm. In the presented research, we have developed a novel Finite Difference Time Domain (FDTD) based method for incorporating electromagnetic data directly into the reservoir simulator. Based on an extended Archie relationship, EM simulations are performed for both forecasted and porosity-saturation-retrieved conductivity parameters, which are incorporated directly into an update step for the reservoir parameters.
This novel direct update method has significant advantages such as that it overcomes the expensive and ill
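The basic Archie relationship behind the conductivity mapping can be sketched as follows; the paper uses an extended form, and the constants here are hypothetical.

```python
# Basic Archie's relationship mapping porosity and water saturation to
# bulk rock conductivity. The paper uses an extended form; the constants
# below (sigma_w, a, m, n) are illustrative only.
def archie_conductivity(phi, s_w, sigma_w=5.0, a=1.0, m=2.0, n=2.0):
    """sigma = sigma_w * phi**m * s_w**n / a  (S/m)
    phi: porosity, s_w: water saturation, sigma_w: brine conductivity,
    a: tortuosity factor, m: cementation exponent, n: saturation exponent."""
    return sigma_w * phi**m * s_w**n / a

# As water is displaced by oil (s_w drops), conductivity falls sharply,
# which is what makes EM data informative about saturation changes:
sigma_initial = archie_conductivity(phi=0.25, s_w=0.9)
sigma_depleted = archie_conductivity(phi=0.25, s_w=0.4)
```

Inverting this map, from simulated EM conductivities back to porosity and saturation, is what lets the EM data feed directly into the reservoir-parameter update step.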

  6. 2016 Annual Technology Baseline (ATB) - Webinar Presentation

    Energy Technology Data Exchange (ETDEWEB)

Cole, Wesley; Kurup, Parthiv; Hand, Maureen; Feldman, David; Sigrin, Benjamin; Lantz, Eric; Stehly, Tyler; Augustine, Chad; Turchi, Craig; Porro, Gian; O'Connor, Patrick; Waldoch, Connor

    2016-09-13

    This deck was presented for the 2016 Annual Technology Baseline Webinar. The presentation describes the Annual Technology Baseline, which is a compilation of current and future cost and performance data for electricity generation technologies.

  7. Geographic and Operational Site Parameters List (GOSPL) for Hanford Assessments

    Energy Technology Data Exchange (ETDEWEB)

    Last, George V.; Nichols, William E.; Kincaid, Charles T.

    2006-06-01

This data package was originally prepared to support a 2004 composite analysis (CA) of low-level waste disposal at the Hanford Site. The Technical Scope and Approach for the 2004 Composite Analysis of Low Level Waste Disposal at the Hanford Site (Kincaid et al. 2004) identified the requirements for that analysis and served as the basis for initial preparation of this data package. Completion of the 2004 CA was later deferred, with the 2004 Annual Status Report for the Composite Analysis of Low-Level Waste Disposal in the Central Plateau at the Hanford Site (DOE 2005) indicating that a comprehensive update to the CA was in preparation and would be submitted in 2006. However, the U.S. Department of Energy (DOE) has recently decided to further defer the CA update and will use the cumulative assessment currently under preparation for the environmental impact statement (EIS) being prepared for tank closure and other site decisions as the updated CA. Submittal of the draft EIS is currently planned for FY 2008. This data package describes the facility-specific parameters (e.g. location, operational dates, etc.) used to numerically simulate contaminant flow and transport in large-scale Hanford assessments. Kincaid et al. (2004) indicated that the System Assessment Capability (SAC) (Kincaid et al. 2000; Bryce et al. 2002; Eslinger 2002a, 2002b) would be used to analyze over a thousand different waste sites. A master spreadsheet termed the Geographic and Operational Site Parameters List (GOSPL) was assembled to facilitate the generation of keyword input files containing general information on each waste site/facility, its operational/disposal history, and its environmental settings (past, current, and future). This report briefly describes each of the key data fields, including the source(s) of data, and provides the resulting inputs to be used for large-scale Hanford assessments.

  8. Associations between baseline allergens and polysensitization

    DEFF Research Database (Denmark)

    Carlsen, B.C.; Menne, T.; Johansen, J.D.

    2008-01-01

Background: Identification of patients at risk of developing polysensitization is not possible at present. An association between weak sensitizers and polysensitization has been hypothesized. Objectives: To examine associations of 21 allergens in the European baseline series to polysensitization. Patients/Methods: From a database-based study with 14 998 patients patch tested with the European baseline series between 1985 and 2005, a group of 759 (5.1%) patients were polysensitized. Odds ratios were calculated to determine the relative contribution of each allergen to polysensitization. Results: No common denominator for the association between the allergens and polysensitization was apparent, and any association, whether positive or negative, was relatively low. Based on these results, sensitization to specific baseline allergens cannot be used as risk indicators for polysensitization. Publication date: 2008.

  9. Associations between baseline allergens and polysensitization

    DEFF Research Database (Denmark)

    Carlsen, Berit Christina; Menné, Torkil; Johansen, Jeanne Duus

    2008-01-01

BACKGROUND: Identification of patients at risk of developing polysensitization is not possible at present. An association between weak sensitizers and polysensitization has been hypothesized. OBJECTIVES: To examine associations of 21 allergens in the European baseline series to polysensitization. PATIENTS/METHODS: From a database-based study with 14 998 patients patch tested with the European baseline series between 1985 and 2005, a group of 759 (5.1%) patients were polysensitized. Odds ratios were calculated to determine the relative contribution of each allergen to polysensitization. RESULTS: No common denominator for the association between the allergens and polysensitization was apparent, and any association, whether positive or negative, was relatively low. Based on these results, sensitization to specific baseline allergens cannot be used as risk indicators for polysensitization.

  10. TWRS technical baseline database manager definition document

    International Nuclear Information System (INIS)

    Acree, C.D.

    1997-01-01

    This document serves as a guide for using the TWRS Technical Baseline Database Management Systems Engineering (SE) support tool in performing SE activities for the Tank Waste Remediation System (TWRS). This document will provide a consistent interpretation of the relationships between the TWRS Technical Baseline Database Management software and the present TWRS SE practices. The Database Manager currently utilized is the RDD-1000 System manufactured by the Ascent Logic Corporation. In other documents, the term RDD-1000 may be used interchangeably with TWRS Technical Baseline Database Manager

  11. New uses of sulfur - update

    Energy Technology Data Exchange (ETDEWEB)

    Almond, K.P.

    1995-07-01

An update to an extensive bibliography on alternate uses of sulfur was presented. Alberta Sulphur Research Ltd. previously compiled a bibliography in volume 24 of this quarterly bulletin. This update provides an additional 44 new publications. The current research focuses on the use of sulfur in oil and gas applications, mining and metallurgy, concretes and other structural materials, waste management, rubber and textile products, and asphalts and other paving and highway applications.

  12. HL-LHC updates in Japan

    CERN Multimedia

    Antonella Del Rosso

    2014-01-01

    At a recent meeting in Japan, updates on the High Luminosity LHC (HL-LHC) project were presented, including the progress made so far and the deadlines still to be met for the upgraded machine to be operational from 2020.   New magnets made with advanced superconductor Nb3Sn in the framework of the HL-LHC project. These magnets are currently under construction at CERN by the TE-MSC group. The LHC is the world’s most powerful particle accelerator, and in 2015 it will reach yet another new record for the energy of its colliding beams. One key factor of its discovery potential is its ability to produce collisions described in mathematical terms by the parameter known as “luminosity”. In 2025, the HL-LHC project will allow the total number of collisions in the LHC to increase by a factor of 10. The first step in this rich upgrade programme is the delivery of the Preliminary Design Report (PDR), which is also a key milestone of the HiLumi LHC Design Study partly fund...

  13. Multipartite interacting scalar dark matter in the light of updated LUX data

    Energy Technology Data Exchange (ETDEWEB)

    Bhattacharya, Subhaditya; Ghosh, Purusottam; Poulose, Poulose, E-mail: subhab@iitg.ernet.in, E-mail: p.ghosh@iitg.ernet.in, E-mail: poulose@iitg.ernet.in [Department of Physics, Indian Institute of Technology Guwahati, Guwahati, Assam 781039 (India)

    2017-04-01

We explore constraints on a multipartite dark matter (DM) framework composed of singlet scalar DM interacting with the Standard Model (SM) through a Higgs portal coupling. We compute relic density and direct search constraints, including the updated LUX bound, for a two-component scenario with non-zero interactions between the two DM components in a Z2 × Z2′ framework, in comparison with one having an O(2) symmetry. We point out the availability of a significantly large region of parameter space of such a multipartite model with DM-DM interactions.

  14. News and Features Updates from USA.gov

    Data.gov (United States)

    General Services Administration — Stay on top of important government news and information with the USA.gov Updates: News and Features RSS feed. We'll update this feed when we add news and featured...

  15. Update of the Unitarity Triangle Analysis

    International Nuclear Information System (INIS)

    Tarantino, C.; Bona, M.; Sordini, V.

    2009-01-01

We present the update of the Unitarity Triangle (UT) analysis within the Standard Model (SM) and beyond. Within the SM, combining the direct measurements on sides and angles, the UT turns out to be overconstrained in a consistent way, showing that the CKM matrix is the dominant source of flavour mixing and CP violation and that New Physics (NP) effects can appear at most as small corrections to the CKM picture. Generalizing the UT analysis to investigate NP effects, constraints on b → s transitions are also included and both CKM and NP parameters are fitted simultaneously. While no evidence of NP effects is found in K-K̄ and Bd-B̄d mixing, a hint of NP is found in Bs-B̄s mixing. The UT analysis beyond the SM also allows us to derive bounds on the coefficients of the most general ΔF = 2 effective Hamiltonian, which can be translated into bounds on the NP scale. (authors)

  16. Vectorial capacity and vector control: reconsidering sensitivity to parameters for malaria elimination.

    Science.gov (United States)

    Brady, Oliver J; Godfray, H Charles J; Tatem, Andrew J; Gething, Peter W; Cohen, Justin M; McKenzie, F Ellis; Perkins, T Alex; Reiner, Robert C; Tusting, Lucy S; Sinka, Marianne E; Moyes, Catherine L; Eckhoff, Philip A; Scott, Thomas W; Lindsay, Steven W; Hay, Simon I; Smith, David L

    2016-02-01

    Major gains have been made in reducing malaria transmission in many parts of the world, principally by scaling-up coverage with long-lasting insecticidal nets and indoor residual spraying. Historically, choice of vector control intervention has been largely guided by a parameter sensitivity analysis of George Macdonald's theory of vectorial capacity that suggested prioritizing methods that kill adult mosquitoes. While this advice has been highly successful for transmission suppression, there is a need to revisit these arguments as policymakers in certain areas consider which combinations of interventions are required to eliminate malaria. Using analytical solutions to updated equations for vectorial capacity we build on previous work to show that, while adult killing methods can be highly effective under many circumstances, other vector control methods are frequently required to fill effective coverage gaps. These can arise due to pre-existing or developing mosquito physiological and behavioral refractoriness but also due to additive changes in the relative importance of different vector species for transmission. Furthermore, the optimal combination of interventions will depend on the operational constraints and costs associated with reaching high coverage levels with each intervention. Reaching specific policy goals, such as elimination, in defined contexts requires increasingly non-generic advice from modelling. Our results emphasize the importance of measuring baseline epidemiology, intervention coverage, vector ecology and program operational constraints in predicting expected outcomes with different combinations of interventions. © The Author 2016. Published by Oxford University Press on behalf of Royal Society of Tropical Medicine and Hygiene.
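
    The sensitivity argument above traces back to Macdonald's vectorial capacity, which can be sketched in a few lines. The formula below is the textbook Garrett-Jones form, and the parameter values are purely illustrative assumptions, not the updated equations or data from this paper:

    ```python
    import math

    def vectorial_capacity(m, a, p, n):
        """Classic (Garrett-Jones) vectorial capacity: C = m * a^2 * p^n / (-ln p).

        m: mosquito density per human, a: human-biting rate per day,
        p: daily adult survival probability, n: extrinsic incubation period (days).
        """
        return m * a**2 * p**n / (-math.log(p))

    # One-at-a-time sensitivity: relative change in C for a modest bump in each parameter.
    base = dict(m=10.0, a=0.3, p=0.9, n=10)
    c0 = vectorial_capacity(**base)
    for name in ("m", "a", "p"):
        bumped = dict(base)
        bumped[name] *= 1.05 if name == "p" else 1.10  # smaller bump keeps p < 1
        c1 = vectorial_capacity(**bumped)
        print(f"{name}: C changes by {100 * (c1 / c0 - 1):+.0f}%")
    ```

    Because p enters both the exponent and the logarithm, even a 5% gain in adult survival changes C far more than a 10% change in density or biting rate, which is why adult-killing interventions historically ranked first.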

  17. Rebuild America partner update, November--December 1998

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-11-01

    This issue of the update includes articles on retrofitting Duke University facilities; energy efficiency updates to buildings in Portland, Oregon, Salisbury, North Carolina, Hawaii, and Roanoke-Chowan, Virginia; and energy-savings-centered designs for lighting systems.

  18. Business-as-Unusual: Existing policies in energy model baselines

    International Nuclear Information System (INIS)

    Strachan, Neil

    2011-01-01

    Baselines are generally accepted as a key input assumption in long-term energy modelling, but energy models have traditionally been poor at identifying baseline assumptions. Notably, transparency about the current policy content of model baselines is now especially critical, as long-term climate mitigation policies have been underway for a number of years. This paper argues that the range of existing energy and emissions policies is an integral part of any long-term baseline, and hence already represents a 'with-policy' baseline, termed here Business-as-Unusual (BAuU). Crucially, existing energy policies are not a sunk effort: as the impacts of existing policy initiatives are targeted at future years, they may be revised through iterative policy making, and their quantitative effectiveness requires ex-post verification. To assess the long-term role of existing policies in energy modelling, currently identified UK policies are explicitly stripped out of the UK MARKAL Elastic Demand (MED) optimisation energy system model to generate a BAuU (with-policy) and a REF (without-policy) baseline. In terms of long-term mitigation costs, policy-baseline assumptions are comparable to another key exogenous modelling assumption, that of global fossil fuel prices. Therefore, best practice in energy modelling would be to have both a no-policy reference baseline and a current-policy reference baseline (BAuU). At a minimum, energy modelling studies should include a transparent assessment of the current policy contained within the baseline. Clearly identifying and comparing policy-baseline assumptions is required for cost-effective and objective policy making; otherwise energy models will underestimate the true cost of long-term emissions reductions.

  19. Pakistan, Sindh Province - Baseline Indicators System : Baseline Procurement Performance Assessment Report

    OpenAIRE

    World Bank

    2009-01-01

    This document provides an assessment of the public procurement system in Sindh province using the baseline indicators system developed by the Development Assistance Committee of the Organization for Economic Cooperation and Development (OECD-DAC). As part of this assessment, interviews and discussions were held with stakeholders from the public and private sectors as well as civil society. Developing...

  20. Life Support Baseline Values and Assumptions Document

    Science.gov (United States)

    Anderson, Molly S.; Ewert, Michael K.; Keener, John F.

    2018-01-01

    The Baseline Values and Assumptions Document (BVAD) provides analysts, modelers, and other life support researchers with a common set of values and assumptions which can be used as a baseline in their studies. This baseline, in turn, provides a common point of origin from which many studies in the community may depart, making research results easier to compare and providing researchers with reasonable values to assume for areas outside their experience. This document identifies many specific physical quantities that define life support systems, serving as a general reference for spacecraft life support system technology developers.

  1. Federal Education Update, December 2004. Commission Update 04-17.

    Science.gov (United States)

    California Postsecondary Education Commission, 2004

    2004-01-01

    This update presents some of the major issues affecting education occurring at the national level. These include: Higher Education Act Extended for One Year; New Law Increases Loan Forgiveness for Teachers; Domestic Appropriations Measures Completed; Change in Federal Student Aid Rules; Bush Advisor Nominated To Be Education Secretary In Second…

  2. An Updated Site Scale Saturated Zone Ground Water Transport Model For Yucca Mountain

    International Nuclear Information System (INIS)

    S. Kelkar; H. Viswanathan; A. Eddebbarrh; M. Ding; P. Reimus; B. Robinson; B. Arnold; A. Meijer

    2006-01-01

    The Yucca Mountain site scale saturated zone transport model has been revised to incorporate the updated flow model based on a hydrogeologic framework model using the latest lithology data, increased grid resolution that better resolves the geology within the model domain, updated Kd distributions for radionuclides of interest, and updated retardation factor distributions for colloid filtration. The resulting numerical transport model is used for performance assessment predictions of radionuclide transport and to guide future data collection and modeling activities. The transport model results are validated by comparing the model transport pathways with those derived from geochemical data, and by comparing the transit times from the repository footprint to the compliance boundary at the accessible environment with those derived from 14C-based age estimates. The transport model includes the processes of advection, dispersion, fracture flow, matrix diffusion, sorption, and colloid-facilitated transport. The transport of sorbing radionuclides in the aqueous phase is modeled as a linear, equilibrium process using the Kd model. The colloid-facilitated transport of radionuclides is modeled using two approaches: colloids with irreversibly embedded radionuclides undergo reversible filtration only, while the migration of radionuclides that reversibly sorb to colloids is modeled with modified values for the sorption coefficient and matrix diffusion coefficients. Model breakthrough curves for various radionuclides at the compliance boundary are presented along with their sensitivity to various parameters.
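
    The linear-equilibrium Kd model mentioned above maps a sorption coefficient onto a retardation factor, R = 1 + (rho_b / theta) * Kd, which slows a sorbing radionuclide relative to the groundwater velocity. A minimal sketch, using hypothetical values rather than site data:

    ```python
    def retardation_factor(kd_ml_per_g, bulk_density_g_per_cm3, porosity):
        """Linear-equilibrium (Kd) retardation: R = 1 + (rho_b / theta) * Kd.

        kd_ml_per_g: sorption coefficient (mL/g), bulk density in g/cm^3,
        porosity dimensionless. Note 1 mL/g * 1 g/cm^3 is dimensionless.
        """
        return 1.0 + (bulk_density_g_per_cm3 / porosity) * kd_ml_per_g

    # Illustrative (non-site) values: a mildly sorbing radionuclide, Kd = 5 mL/g,
    # in material with bulk density 1.6 g/cm^3 and porosity 0.3.
    R = retardation_factor(5.0, 1.6, 0.3)
    groundwater_velocity = 10.0  # m/yr, hypothetical
    print(R, groundwater_velocity / R)  # radionuclide front moves R times slower
    ```

    A non-sorbing tracer (Kd = 0) gives R = 1 and travels with the water, which is the limiting case the 14C-based transit-time comparison relies on.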

  3. Agent Communication for Dynamic Belief Update

    Science.gov (United States)

    Kobayashi, Mikito; Tojo, Satoshi

    Thus far, various formalizations of rational/logical agent models have been proposed. In this paper, we include the notions of communication channel and belief modality in update logic, and introduce Belief Update Logic (BUL). First, we discuss how we can reformalize the inform action of FIPA-ACL as a communication channel, which represents a connection between agents. Thus, our agents can send a message only when they believe there is, and there actually is, a channel between them and the receiver. Then, we present a static belief logic (BL) and show its soundness and completeness. Next, we extend the logic to BUL, which can update a Kripke model by the inform action; we show that in the updated model the belief operator still satisfies K45. Thereafter, we show that every sentence in BUL can be translated into BL; thus, we can contend that BUL is also sound and complete. Furthermore, we discuss the features of BUL, including the case of inconsistent information, as well as channel transmission. Finally, we summarize our contribution and discuss some future issues.

  4. Communication technology update and fundamentals

    CERN Document Server

    Grant, August E

    2010-01-01

    New communication technologies are being introduced at an astonishing rate. Making sense of these technologies is increasingly difficult. Communication Technology Update and Fundamentals is the single best source for the latest developments, trends, and issues in communication technology. Featuring the fundamental framework along with the history and background of communication technologies, Communication Technology Update and Fundamentals, 12th edition helps you stay ahead of these ever-changing and emerging technologies. As always, every chapter ha...

  5. Communication technology update and fundamentals

    CERN Document Server

    Grant, August E

    2008-01-01

    New communication technologies are being introduced at an astonishing rate. Making sense of these technologies is increasingly difficult. Communication Technology Update is the single best source for the latest developments, trends, and issues in communication technology. Now in its 11th edition, Communication Technology Update has become an indispensable information resource for business, government, and academia. As always, every chapter has been completely rewritten to reflect the latest developments and market statistics, and now covers mobile computing, dig...

  6. Updated safety analysis of ITER

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, Neill, E-mail: neill.taylor@iter.org [ITER Organization, CS 90 046, 13067 St Paul Lez Durance Cedex (France); Baker, Dennis; Ciattaglia, Sergio; Cortes, Pierre; Elbez-Uzan, Joelle; Iseli, Markus; Reyes, Susana; Rodriguez-Rodrigo, Lina; Rosanvallon, Sandrine; Topilski, Leonid [ITER Organization, CS 90 046, 13067 St Paul Lez Durance Cedex (France)

    2011-10-15

    An updated version of the ITER Preliminary Safety Report has been produced and submitted to the licensing authorities. It is revised and expanded in response to requests from the authorities after their review of an earlier version in 2008, to reflect enhancements in ITER safety provisions through design changes, to incorporate new and improved safety analyses and to take into account other ITER design evolution. The updated analyses show that changes to the Tokamak cooling water system design have enhanced confinement and reduced potential radiological releases as well as removing decay heat with very high reliability. New and updated accident scenario analyses, together with fire and explosion risk analyses, have shown that design provisions are sufficient to minimize the likelihood of accidents and reduce potential consequences to a very low level. Taken together, the improvements provided a stronger demonstration of the very good safety performance of the ITER design.

  7. Updated safety analysis of ITER

    International Nuclear Information System (INIS)

    Taylor, Neill; Baker, Dennis; Ciattaglia, Sergio; Cortes, Pierre; Elbez-Uzan, Joelle; Iseli, Markus; Reyes, Susana; Rodriguez-Rodrigo, Lina; Rosanvallon, Sandrine; Topilski, Leonid

    2011-01-01

    An updated version of the ITER Preliminary Safety Report has been produced and submitted to the licensing authorities. It is revised and expanded in response to requests from the authorities after their review of an earlier version in 2008, to reflect enhancements in ITER safety provisions through design changes, to incorporate new and improved safety analyses and to take into account other ITER design evolution. The updated analyses show that changes to the Tokamak cooling water system design have enhanced confinement and reduced potential radiological releases as well as removing decay heat with very high reliability. New and updated accident scenario analyses, together with fire and explosion risk analyses, have shown that design provisions are sufficient to minimize the likelihood of accidents and reduce potential consequences to a very low level. Taken together, the improvements provided a stronger demonstration of the very good safety performance of the ITER design.

  8. Mining Sequential Update Summarization with Hierarchical Text Analysis

    Directory of Open Access Journals (Sweden)

    Chunyun Zhang

    2016-01-01

    Full Text Available The outbreak of unexpected news events such as large-scale accidents or natural disasters creates an information access problem where traditional approaches fail. News of these events is typically sparse early on and redundant later. Hence, it is very important to provide individuals with timely and important updates on these incidents as they develop, especially in wireless and mobile Internet of Things (IoT) applications. In this paper, we define the problem of sequential update summarization extraction and present a new hierarchical update mining system which can broadcast useful, new, and timely sentence-length updates about a developing event. The new system proposes a novel method which incorporates techniques from topic-level and sentence-level summarization. To evaluate the performance of the proposed system, we apply it to the sequential update summarization task of the temporal summarization (TS) track at the Text Retrieval Conference (TREC) 2013, computing four measurements of the update mining system: expected gain, expected latency gain, comprehensiveness, and latency comprehensiveness. Experimental results show that our proposed method performs well.

  9. Run-time Phenomena in Dynamic Software Updating: Causes and Effects

    DEFF Research Database (Denmark)

    Gregersen, Allan Raundahl; Jørgensen, Bo Nørregaard

    2011-01-01

    The development of a dynamic software updating system for statically-typed object-oriented programming languages has turned out to be a challenging task. Despite the fact that the present state of the art in dynamic updating systems, like JRebel, Dynamic Code Evolution VM, JVolve and Javeleon, all provide very transparent and flexible technical solutions to dynamic updating, case studies have shown that designing dynamically updatable applications still remains a challenging task. This challenge has its roots in a number of run-time phenomena that are inherent to dynamic updating of applications written in statically-typed object-oriented programming languages. In this paper, we present our experience from developing dynamically updatable applications using a state-of-the-art dynamic updating system for Java. We believe that the findings presented in this paper provide an important step towards...

  10. 34 CFR 668.55 - Updating information.

    Science.gov (United States)

    2010-07-01

    ... 34 Education 3 2010-07-01 2010-07-01 false Updating information. 668.55 Section 668.55 Education... Information § 668.55 Updating information. (a)(1) Unless the provisions of paragraph (a)(2) or (a)(3) of this... applicant to verify the information contained in his or her application for assistance in an award year if...

  11. An Updated Performance Assessment For A New Low-Level Radioactive Waste Disposal Facility In West Texas - 12192

    Energy Technology Data Exchange (ETDEWEB)

    Dornsife, William P.; Kirk, J. Scott; Shaw, Chris G. [Waste Control Specialists LLC, Andrews, Texas (United States)

    2012-07-01

    This Performance Assessment (PA) submittal is an update to the original PA that was developed to support the licensing of the Waste Control Specialists LLC Low-Level Radioactive Waste (LLRW) disposal facility. This update covers both the Compact Waste Facility (CWF) and the Federal Waste Facility (FWF), in accordance with Radioactive Material License (RML) No. R04100, License Condition (LC) 87. While many of the baseline assumptions supporting the initial license application PA were retained in this update, a new transport code, GoldSim, and new deterministic groundwater flow codes, including HYDRUS and MODFLOW-SURFACT{sup TM}, were employed to demonstrate compliance with the performance objectives codified in the regulations and in RML No. R04100, LC 87. A revised source term, provided by the Texas Commission on Environmental Quality staff, was used to match the initial 15-year license term. Both the initial and the updated PA clearly demonstrate the robustness of the characteristics of the site's geology and the advanced engineering design of the disposal units. Based on simulations from the fate and transport models, the radiation doses predicted for members of the general public and for site workers were a small fraction of the respective criterion doses of 0.25 mSv/yr (25 mrem/yr) and 50 mSv/yr, for tens of thousands of years into the future. Draft Texas guidance on performance assessment (TCEQ, 2004) recommends a period of analysis equal to 1,000 years or until peak doses from...

  12. FAQs about Baseline Testing among Young Athletes

    Science.gov (United States)

    ... a similar exam conducted by a health care professional during the season if an athlete has a suspected concussion. Baseline testing generally takes place during the pre-season—ideally prior to the first practice. It is important to note that some baseline ...

  13. 75 FR 74706 - Notice of Baseline Filings

    Science.gov (United States)

    2010-12-01

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Notice of Baseline Filings November 24, 2010. Centana Intrastate Pipeline, LLC. Docket No. PR10-84-001. Centana Intrastate Pipeline, LLC... applicants listed above submitted a revised baseline filing of their Statement of Operating Conditions for...

  14. Sensitivity Analysis of Input Parameters for the Dose Assessment from Gaseous Effluents due to the Normal Operation of Jordan Research and Training Reactor

    International Nuclear Information System (INIS)

    Kim, Sukhoon; Lee, Seunghee; Kim, Juyoul; Kim, Juyub; Han, Moonhee

    2015-01-01

    In this study, therefore, a sensitivity analysis of the input variables for the dose assessment was performed to review the effect of each parameter on the result, after determining the type and range of parameters that could affect the exposure dose of the public. (Since the JRTR will be operated under a 'no liquid discharge' concept, the input parameters used for calculating doses due to liquid effluents are not considered in the sensitivity analysis.) In this paper, the sensitivity analysis of input parameters for the dose assessment in the vicinity of the site boundary due to gaseous effluents was performed for a total of thirty-five (35) cases. Detailed results for the input variables that have a significant effect are shown in Figures 1 through 7. In preparing an R-ER for the operating license of the JRTR, these results will be updated with additional information and could be applied to predicting the trend of the exposure dose as the input parameters for the dose assessment are updated to reflect the characteristics of the JRTR site.
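
    The one-at-a-time style of sensitivity analysis described above can be sketched generically: vary each input over its range while holding the others at baseline, and record the spread in the output. The `toy_dose` function and its parameter spans below are hypothetical stand-ins, not the actual JRTR dose model or its parameters:

    ```python
    def one_at_a_time_sensitivity(model, baseline, spans):
        """For each input, vary it across its [low, high] span with the other
        inputs held at baseline, and report the resulting output range."""
        results = {}
        for name, (low, high) in spans.items():
            outputs = []
            for value in (low, high):
                inputs = dict(baseline)
                inputs[name] = value
                outputs.append(model(**inputs))
            results[name] = max(outputs) - min(outputs)
        return results

    # Hypothetical stand-in for a gaseous-effluent dose model: dose proportional
    # to release rate and atmospheric dispersion factor, inverse in a dilution term.
    def toy_dose(release_rate, chi_over_q, dilution):
        return release_rate * chi_over_q / dilution

    baseline = dict(release_rate=1.0, chi_over_q=1e-5, dilution=2.0)
    spans = {
        "release_rate": (0.5, 2.0),
        "chi_over_q": (5e-6, 2e-5),
        "dilution": (1.0, 4.0),
    }
    print(one_at_a_time_sensitivity(toy_dose, baseline, spans))
    ```

    The returned ranges give a simple ranking of which inputs the dose result is most sensitive to, which is the kind of screening the thirty-five cases above perform.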

  15. Baseline conditions at Olkiluoto

    International Nuclear Information System (INIS)

    2003-09-01

    The main purpose of this report is to establish a reference point - defined as the data collected up until the end of year 2002 - for the coming phases of the Finnish spent nuclear fuel disposal programme. The focus is: to define the current surface and underground conditions at the site, both for the properties for which a change is expected and for the properties which are of particular interest for long-term safety or environmental impact; to establish, as far as possible, the natural fluctuation of properties that are potentially affected by construction of the underground laboratory, the ONKALO; and to provide references to data on parameters for use in model development and testing, and to use models to assist in understanding and interpreting the data. The emphasis of the baseline description is on bedrock characteristics that are relevant to the long-term safety of a spent fuel repository, and hence it includes the hydrogeological, hydrogeochemical, rock mechanical, tectonic and seismic conditions of the site. The construction of the ONKALO will also affect some conditions on the surface, and therefore a description of the main characteristics of the nature and the man-made constructions at Olkiluoto is also given. This report is primarily a road map to the available information on the prevailing conditions at the Olkiluoto site and a framework for understanding the data collected. Hence, it refers to numerous available background reports and other archived information produced over the past 20 years or more, and forms a recapitulation and revaluation of the characterisation data of the Olkiluoto site. (orig.)

  16. Community Design Parameters and the Performance of Residential Cogeneration Systems

    Directory of Open Access Journals (Sweden)

    Hazem Rashed-Ali

    2012-11-01

    Full Text Available The integration of cogeneration systems into residential and mixed-use communities has the potential to reduce their energy demand and harmful emissions and can thus play a significant role in increasing their environmental sustainability. This study investigated the impact of selected planning and architectural design parameters on the environmental and economic performance of centralized cogeneration systems integrated into residential communities in U.S. cold climates. The parameters investigated were: (1) density, (2) use mix, (3) street configuration, (4) housing typology, (5) envelope and building systems' efficiencies, and (6) passive solar energy utilization. The study integrated several simulation tools into a procedure to assess the impact of each design parameter on the cogeneration system performance. This assessment procedure included: developing a base-line model representing typical design characteristics of U.S. residential communities; assessing the cogeneration system's performance within this model using three performance indicators (percentage reduction in primary energy use, percentage reduction in CO2 emissions, and internal rate of return); assessing the impact of each parameter on the system performance by developing 46 design variations of the base-line model, representing potential changes in each parameter, and calculating the three indicators for each variation; and finally, using a multi-attribute decision analysis methodology to evaluate the relative impact of each parameter on the cogeneration system performance. The study results show that planning parameters had a higher impact on the cogeneration system performance than architectural ones. Also, a significant correlation was found between design characteristics identified as favorable for the cogeneration system performance and those of sustainable residential communities. These include high densities, high use mix, interconnected street networks, and mixing of...
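
    The final step above, comparing design variants via multi-attribute decision analysis, can be illustrated with a simple weighted-sum score over normalized indicators. The variant names, indicator values and weights below are hypothetical, not the study's data:

    ```python
    def weighted_sum_score(indicators, weights):
        """Simple multi-attribute score: min-max normalize each indicator across
        the design variants, then combine the normalized values with weights."""
        names = list(weights)
        lo = {n: min(v[n] for v in indicators.values()) for n in names}
        hi = {n: max(v[n] for v in indicators.values()) for n in names}
        scores = {}
        for variant, vals in indicators.items():
            s = 0.0
            for n in names:
                span = hi[n] - lo[n]
                norm = (vals[n] - lo[n]) / span if span > 0 else 0.0
                s += weights[n] * norm
            scores[variant] = s
        return scores

    # Hypothetical variants scored on the study's three indicators:
    # energy reduction, CO2 reduction, and internal rate of return.
    variants = {
        "baseline":     {"energy_red": 0.20, "co2_red": 0.18, "irr": 0.06},
        "high_density": {"energy_red": 0.28, "co2_red": 0.25, "irr": 0.09},
        "mixed_use":    {"energy_red": 0.24, "co2_red": 0.22, "irr": 0.08},
    }
    weights = {"energy_red": 0.4, "co2_red": 0.4, "irr": 0.2}
    print(weighted_sum_score(variants, weights))
    ```

    With min-max normalization the worst variant on every indicator scores 0 and the best on every indicator scores 1, so the weights only matter for variants that trade off one indicator against another.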

  17. Towards Dynamic Updates in Service Composition

    Directory of Open Access Journals (Sweden)

    Mario Bravetti

    2015-12-01

    Full Text Available We survey our results on the verification of adaptable processes. We present adaptable processes as a way of overcoming the limitations that process calculi have for describing patterns of dynamic process evolution. Such patterns rely on direct ways of controlling the behavior and location of running processes, and so they are at the heart of the adaptation capabilities present in many modern concurrent systems. Adaptable processes have named scopes and are sensitive to dynamic update actions at runtime; this allows us to express dynamic and static topologies of adaptable processes as well as different evolvability patterns for concurrent processes. We introduce a core calculus of adaptable processes and consider verification problems for them: first based on specific properties related to error occurrence, which we call bounded and eventual adaptation, and then by considering a simple yet expressive temporal logic over adaptable processes. We provide (un)decidability results for such verification problems over adaptable processes, considering the spectrum of topologies/evolvability patterns introduced. We then consider distributed adaptability, where a process can update part of a protocol by performing dynamic distributed updates over a set of protocol participants. Dynamic updates in this context are presented as an extension of our work on choreographies and behavioural contracts in multiparty interactions. We show how the update mechanisms considered for adaptable processes can be used to extend the theory of choreography and orchestration/contracts, allowing them to be modified at run-time by internal (self-)adaptation or external intervention.

  18. Geochemical baseline studies of soil in Finland

    Science.gov (United States)

    Pihlaja, Jouni

    2017-04-01

    Soil element concentrations vary considerably between regions in Finland. This is mostly caused by differences in bedrock type, which are reflected in soil properties. The Geological Survey of Finland (GTK) is carrying out geochemical baseline studies in Finland. In the present phase, the research focuses on urban areas and mine environments. The information can, for example, be used to determine the need for soil remediation, to assess environmental impacts, or to measure the natural state of soil in industrial areas or mine districts. The field work is done by taking soil samples, typically at depths between 0 and 10 cm. Sampling sites are chosen to represent the areas most vulnerable to human exposure from potentially toxic element concentrations in soil: playgrounds, day-care centers, schools, parks and residential areas. In the mine districts, the samples are taken from areas located outside those affected by airborne dust. Element contents of the soil samples are then analyzed with ICP-AES and ICP-MS, and Hg with CV-AAS. The results of the geochemical baseline studies are published in the Finnish national geochemical baseline database (TAPIR). The geochemical baseline map service is free for all users via an internet browser. Through this map service it is possible to calculate regional soil baseline values using the geochemical data stored in the map service database. Baseline data for 17 elements in total are provided in the map service, which can be viewed on the GTK's web pages (http://gtkdata.gtk.fi/Tapir/indexEN.html).

  19. Working memory updating occurs independently of the need to maintain task-context: accounting for triggering updating in the AX-CPT paradigm.

    Science.gov (United States)

    Kessler, Yoav; Baruchin, Liad J; Bouhsira-Sabag, Anat

    2017-01-01

    Theoretical models suggest that maintenance and updating are two functional states of working memory (WM), controlled by a gate between perceptual information and WM representations. Opening the gate enables updating WM with input, while closing it keeps the maintained information shielded from interference. However, it is still unclear when gate opening takes place, and what external signal triggers it. A version of the AX-CPT paradigm was used to examine a recent proposal in the literature suggesting that updating is triggered whenever maintenance of the context is necessary for task performance (context-dependent tasks). In four experiments using this paradigm, we show that (1) a task-switching cost takes place in both context-dependent and context-independent trials; (2) task-switching is additive to the dependency effect; and (3) unlike the switching cost, the dependency effect is not affected by preparation and therefore does not reflect context-updating. We suggest that WM updating is likely triggered by a simple mechanism that occurs in each trial of the task, regardless of whether maintaining the context is needed. The implications for WM updating and its relationship to task-switching are discussed.

  20. A visual tracking method based on deep learning without online model updating

    Science.gov (United States)

    Tang, Cong; Wang, Yicheng; Feng, Yunsong; Zheng, Chao; Jin, Wei

    2018-02-01

    The paper proposes a visual tracking method based on deep learning without online model updating. In view of the advantages of deep learning for feature representation, the deep model SSD (Single Shot Multibox Detector) is used as the object extractor in the tracking model. Simultaneously, the color histogram feature and the HOG (Histogram of Oriented Gradients) feature are combined to select the tracking object. During tracking, a multi-scale object search map is built to improve the detection performance of the deep detection model and the tracking efficiency. In experiments on eight tracking video sequences from the baseline dataset, compared with six state-of-the-art methods, the proposed method is more robust to challenging tracking factors such as deformation, scale variation, rotation, illumination variation, and background clutter; moreover, its overall performance is better than that of the other six tracking methods.

  1. Update of Standard Practices for New Method Validation in Forensic Toxicology.

    Science.gov (United States)

    Wille, Sarah M R; Coucke, Wim; De Baere, Thierry; Peters, Frank T

    2017-01-01

    International agreement on validation guidelines is important for obtaining quality forensic bioanalytical research and routine applications, as it all starts with the reporting of reliable analytical data. Standards for fundamental validation parameters are provided in guidelines such as those from the US Food and Drug Administration (FDA), the European Medicines Agency (EMA), the German-speaking Gesellschaft fuer Toxikologie und Forensische Chemie (GTFCH) and the Scientific Working Group for Forensic Toxicology (SWGTOX). These validation parameters include selectivity, matrix effects, method limits, calibration, accuracy and stability, as well as other parameters such as carryover, dilution integrity and incurred sample reanalysis. It is, however, not easy for laboratories to implement these guidelines in practice, as the international guidelines remain nonbinding protocols that depend on the applied analytical technique and that need to be updated according to the analyst's method requirements and the application type. In this manuscript, a review of the current guidelines and literature concerning bioanalytical validation parameters in a forensic context is given and discussed. In addition, suggestions for the experimental set-up, the pros and cons of statistical approaches, and adequate acceptance criteria for the validation of bioanalytical applications are given. Copyright© Bentham Science Publishers.
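
    Two of the fundamental validation parameters named above, accuracy (bias) and precision, reduce to short calculations over quality-control replicates. The replicate values and the acceptance-limit remark below are illustrative assumptions, not figures from any specific guideline:

    ```python
    from statistics import mean, stdev

    def bias_percent(measured, nominal):
        """Accuracy (bias): deviation of the mean measured concentration
        from the nominal concentration, in percent."""
        return (mean(measured) - nominal) / nominal * 100.0

    def cv_percent(measured):
        """Precision: coefficient of variation (sample SD / mean) of
        replicate measurements, in percent."""
        return stdev(measured) / mean(measured) * 100.0

    # Hypothetical QC replicates at a nominal concentration of 50 ng/mL
    replicates = [48.2, 51.0, 49.5, 50.8, 47.9]
    print(f"bias = {bias_percent(replicates, 50.0):+.1f}%, "
          f"CV = {cv_percent(replicates):.1f}%")
    ```

    Guidelines typically set acceptance limits on both quantities (often in the 15-20% range, wider at the lower limit of quantification), but the exact criteria depend on the guideline and the concentration level being validated.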

  2. Update of European bioethics

    DEFF Research Database (Denmark)

    Rendtorff, Jacob Dahl

    2015-01-01

    This paper presents an update of the research on European bioethics undertaken by the author together with Professor Peter Kemp since the 1990s on basic ethical principles in European bioethics and biolaw. In this European approach, the principles of autonomy, dignity, integrity and vulnerability are proposed as the most important ethical principles for respect for the human person in biomedical and biotechnological development. This approach to bioethics and biolaw is presented here in a short updated version that integrates the earlier research into a presentation of the present understanding of the basic ethical principles in bioethics and biolaw.

  3. Robot Visual Tracking via Incremental Self-Updating of Appearance Model

    Directory of Open Access Journals (Sweden)

    Danpei Zhao

    2013-09-01

    Full Text Available This paper proposes a target tracking method called Incremental Self-Updating Visual Tracking for robot platforms. Our tracker treats the tracking problem as a binary classification: the target and the background. Greyscale, HOG and LBP features are used in this work to represent the target and are integrated into a particle filter framework. To track the target over long sequences, the tracker has to update its model to follow the most recent appearance of the target. To address the computational waste and the lack of a model-updating strategy in traditional methods, an intelligent and effective online self-updating strategy is devised to choose the optimal update opportunity. The decision to update the appearance model is based on the change in discriminative capability between the current frame and the previously updated frame. By adjusting the update step adaptively, wasted computation on needless updates can be avoided while keeping the model stable. Moreover, the appearance model is kept away from serious drift when the target undergoes temporary occlusion. The experimental results show that the proposed tracker achieves robust and efficient performance on several challenging benchmark video sequences with complex changes in posture, scale, illumination and occlusion.
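
The update-opportunity idea described above can be sketched as a small decision rule: refresh the model only when the classifier's confidence has dropped noticeably since the last update, but freeze it entirely when confidence is so low that the target is probably occluded. The function name and threshold values below are illustrative, not taken from the paper:

```python
def should_update(score_current, score_last_update,
                  drop_threshold=0.15, occlusion_floor=0.35):
    """Decide whether to refresh the appearance model.

    Update when discriminative capability has degraded since the last
    update (model is getting stale), but skip the update when confidence
    is so low that the target is probably occluded -- updating then
    would corrupt the model with background pixels.
    """
    if score_current < occlusion_floor:        # likely occlusion: freeze model
        return False
    return (score_last_update - score_current) > drop_threshold
```

Calling this once per frame avoids both needless updates (stable score) and harmful ones (occlusion), which is the trade-off the abstract describes.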

  4. Land Boundary Conditions for the Goddard Earth Observing System Model Version 5 (GEOS-5) Climate Modeling System: Recent Updates and Data File Descriptions

    Science.gov (United States)

    Mahanama, Sarith P.; Koster, Randal D.; Walker, Gregory K.; Takacs, Lawrence L.; Reichle, Rolf H.; De Lannoy, Gabrielle; Liu, Qing; Zhao, Bin; Suarez, Max J.

    2015-01-01

    The Earth's land surface boundary conditions in the Goddard Earth Observing System version 5 (GEOS-5) modeling system were updated using recent high spatial and temporal resolution global data products. The updates include: (i) construction of a global 10-arcsec land-ocean-lakes-ice mask; (ii) incorporation of a 10-arcsec Globcover 2009 land cover dataset; (iii) implementation of Level 12 Pfafstetter hydrologic catchments; (iv) use of hybridized SRTM global topography data; (v) construction of the HWSDv1.21-STATSGO2 merged global 30-arcsec soil mineral and carbon data in conjunction with a highly refined soil classification system; (vi) production of diffuse visible and near-infrared 8-day MODIS albedo climatologies at 30-arcsec from the period 2001-2011; and (vii) production of the GEOLAND2 and MODIS merged 8-day LAI climatology at 30-arcsec for GEOS-5. The global data sets were preprocessed and used to construct global raster data files for the software (mkCatchParam) that computes parameters on catchment-tiles for various atmospheric grids. The updates also include a few bug fixes in mkCatchParam, as well as algorithmic improvements that allow it to produce tile-space parameters efficiently for high-resolution AGCM grids. The update process also includes the construction of data files describing the vegetation type fractions, soil background albedo, nitrogen deposition and mean annual 2 m air temperature to be used with the future Catchment CN model, and the global stream channel network to be used with the future global runoff routing model. This report provides detailed descriptions of the data production process and the data file format of each updated data set.

  5. Low level laser therapy (photobiomodulation) for the management of breast cancer-related lymphedema: an update

    Science.gov (United States)

    Baxter, G. David; Liu, Lizhou; Chapple, Cathy; Petrich, Simone; Anders, Juanita J.; Tumilty, Steve

    2018-04-01

    Breast cancer-related lymphedema (BCRL) is prevalent among breast cancer survivors, and may be painful and disfiguring, with associated psychological impact. Previous research shows increasing use of low level laser therapy (LLLT), now commonly referred to as photobiomodulation (PBM) therapy, for managing BCRL in countries including the United States and Australia. However, conclusions were limited by the paucity, heterogeneity, and poor quality of previous studies. LLLT (PBM) has barely been used in clinical practice in New Zealand, and no clinical studies on LLLT (PBM) for BCRL have been conducted in this country. In order to promote this potentially useful treatment modality for BCRL patients, the Laser Lymphedema Trial Team at the University of Otago conducted a program to assess the effectiveness of LLLT (PBM) in the management of BCRL. The program comprises three phases: a systematic review (completed), a feasibility study (completed), and a full-scale randomized controlled trial (proposed). The current paper provides an update on the program. Based upon the systematic review, LLLT (PBM) is considered a potentially effective treatment approach for women with BCRL; the review also indicated the need for further research, including exploration of the relevance of dosage and other LLLT (PBM) parameters. The feasibility study demonstrated that it is feasible to conduct a fully powered RCT to definitively test the effectiveness of the additional use of LLLT (PBM) in the management of BCRL, and that 114 participants will be needed at baseline in the main study. Currently, the full-scale RCT is under preparation.

  6. Framework for ensuring appropriate maintenance of baseline PSA and risk monitor models in a nuclear power plant

    International Nuclear Information System (INIS)

    Vrbanic, I.; Sorman, J.

    2005-01-01

    The necessity of observing both long-term and short-term risk changes often requires a nuclear power plant to have a baseline PSA model, which produces an estimate of long-term averaged risk, and a risk monitor, which produces a time-dependent risk curve and/or safety-function status at points in time or over a shorter period of interest. By nature, a baseline PSA reflects plant systems and operation in terms of average conditions and provides time-invariant quantitative risk metrics. A risk monitor, on the other hand, requires condition-specific modeling to produce quantitative and/or qualitative estimates of the plant's condition-specific risk metrics. While the risk monitor is used for computing condition-specific risk metrics over time, the baseline PSA model is needed for a variety of other risk-oriented applications, such as assessments of proposed design modifications or risk ranking of equipment. Given their importance and roles, it is essential that both models, i.e. the baseline PSA model and the risk monitor, are maintained in such a way that they represent, as accurately as practically achievable, the actual plant status (e.g. systems' design and plant procedures in effect) and its history (e.g. numbers of equipment failures and demands that influence relevant PSA parameters). The paper discusses the requirements for appropriate maintenance of the plant's baseline PSA model and risk monitor model and presents a framework for plant engineering and administrative procedures that would ensure these requirements are met. (author)

  7. Decentralized Consistent Network Updates in SDN with ez-Segway

    KAUST Repository

    Nguyen, Thanh Dang

    2017-03-06

    We present ez-Segway, a decentralized mechanism to consistently and quickly update the network state while preventing forwarding anomalies (loops and black-holes) and avoiding link congestion. In our design, the centralized SDN controller only pre-computes information needed by the switches during the update execution. This information is distributed to the switches, which use partial knowledge and direct message passing to efficiently realize the update. This separation of concerns has the key benefit of improving update performance as the communication and computation bottlenecks at the controller are removed. Our evaluations via network emulations and large-scale simulations demonstrate the efficiency of ez-Segway, which compared to a centralized approach, improves network update times by up to 45% and 57% at the median and the 99th percentile, respectively. A deployment of a system prototype in a real OpenFlow switch and an implementation in P4 demonstrate the feasibility and low overhead of implementing simple network update functionality within switches.

  8. Intrafractional baseline drift during free breathing breast cancer radiation therapy.

    Science.gov (United States)

    Jensen, Christer Andre; Acosta Roa, Ana María; Lund, Jo-Åsmund; Frengen, Jomar

    2017-06-01

    Intrafraction motion in breast cancer radiation therapy (BCRT) has not yet been thoroughly described in the literature. It has been observed that baseline drift occurs as part of the intrafraction motion. This study aims to measure baseline drift and its incidence in free-breathing BCRT patients using an in-house developed laser system for tracking the position of the sternum. Baseline drift was monitored in 20 right-sided breast cancer patients receiving free-breathing 3D-conformal RT using an in-house developed laser system which measures one-dimensional distance in the AP direction. A total of 357 patient respiratory traces from treatment sessions were logged and analysed. Baseline drift was compared to the patient positioning error measured from in-field portal imaging. The mean overall baseline drift at the end of treatment sessions was -1.3 mm for the patient population. Relatively small baseline drift was observed during the first fraction; however, it was clearly detectable from the second fraction onward. Over 90% of the baseline drift occurs during the first 3 min of each treatment session. The baseline drift rate for the population was -0.5 ± 0.2 mm/min in the posterior direction during the first minute after localization. Only 4% of the treatment sessions had a baseline drift of 5 mm or larger at 5 min, all towards the posterior direction. Mean baseline drift in the posterior direction in free-breathing BCRT was observed in 18 of 20 patients over all treatment sessions. This study shows that there is a substantial baseline drift in free-breathing BCRT patients. No clear baseline drift was observed during the first treatment session; however, baseline drift was markedly present in the rest of the sessions. Intrafraction motion due to baseline drift should be accounted for in margin calculations.

  9. Long-term urine biobanking: storage stability of clinical chemical parameters under moderate freezing conditions without use of preservatives.

    Science.gov (United States)

    Remer, Thomas; Montenegro-Bethancourt, Gabriela; Shi, Lijie

    2014-12-01

    To examine the long-term stability and validity of analyte concentrations of 21 clinical biochemistry parameters in 24-h urine samples stored preservative-free for 12 or 15 yr at -22°C. Healthy children's 24-h urine samples in which the respective analytes had been measured shortly after sample collection (baseline) were reanalyzed. The second measurement was performed after 12 yr (organic acids) and 15 yr (creatinine, urea, osmolality, iodine, nitrogen, anions, cations, acid-base parameters) with the same analytical methodology. Paired comparisons and correlations between the baseline and repeated measurements were done, and recovery rates were calculated. More than half of the analytes (creatinine, urea, iodine, nitrogen, sodium, potassium, magnesium, calcium, ammonium, bicarbonate, citric and uric acids) showed measurement values after >10 yr of storage not significantly different from baseline. 15 of the 21 parameters were highly correlated (r=0.99) between baseline and second measurement; the poorest correlation was r=0.77 for oxalate. Recovery ranged from 73% (oxalate) to 105% (phosphate). Our results suggest high long-term stability and measurement validity for numerous clinical chemistry parameters stored at -22°C without the addition of any urine preservative. Prospective storage of urine aliquots at -22°C, for periods even exceeding 10 yr, appears to be an acceptable and valid tool in epidemiological settings for later quantification of several urine analytes.
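
The two headline statistics of this record, analyte recovery after storage and the baseline-versus-repeat correlation, are simple to compute. A minimal sketch (function names are illustrative):

```python
import numpy as np

def recovery_percent(baseline, repeat):
    """Mean per-sample recovery after storage, as repeat/baseline * 100."""
    baseline = np.asarray(baseline, dtype=float)
    repeat = np.asarray(repeat, dtype=float)
    return float(np.mean(repeat / baseline) * 100.0)

def pearson_r(x, y):
    """Pearson correlation between baseline and repeated measurements."""
    return float(np.corrcoef(x, y)[0, 1])
```

For example, repeat values of 9, 18 and 36 against baselines of 10, 20 and 40 give a 90% recovery, the kind of figure reported above for the less stable analytes.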

  10. Modular Estimation Strategy of Vehicle Dynamic Parameters for Motion Control Applications

    Directory of Open Access Journals (Sweden)

    Rawash Mustafa

    2018-01-01

    Full Text Available The presence of motion control or active safety systems in vehicles has become increasingly important for improving vehicle performance and handling and for negotiating dangerous driving situations. The performance of such systems would be improved if combined with knowledge of vehicle dynamic parameters. Since some of these parameters are difficult to measure, for technical or economic reasons, estimation of those parameters might be the only practical alternative. In this paper, an estimation strategy for important vehicle dynamic parameters, pertaining to motion control applications, is presented. The estimation strategy has a modular structure in which each module is concerned with estimating a single vehicle parameter. Estimated parameters include: longitudinal, lateral, and vertical tire forces; longitudinal velocity; and vehicle mass. The advantage of this strategy is its independence from tire parameters or wear, road surface condition, and vehicle mass variation. Also, because of its modular structure, each module can later be updated or exchanged for a more effective one. Results from simulations on a 14-DOF vehicle model are provided to validate the strategy and show its robustness and accuracy.
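
One module of such a strategy can be sketched as a self-contained estimator for a single parameter. The example below is a hypothetical illustration (not the paper's method): a recursive least-squares module estimating vehicle mass from longitudinal force and acceleration samples via F = m·a:

```python
class RecursiveMassEstimator:
    """One 'module': estimate vehicle mass m (kg) from samples of
    longitudinal force F (N) and acceleration a (m/s^2) via recursive
    least squares on the relation F = m * a.  Gains are illustrative."""

    def __init__(self, m0=1000.0, p0=1e6):
        self.m = m0          # current mass estimate
        self.p = p0          # estimate variance (large = uncertain prior)

    def update(self, force_N, accel_ms2):
        if abs(accel_ms2) < 0.05:        # skip poorly excited samples
            return self.m
        # standard scalar RLS gain, correction, and variance update
        k = self.p * accel_ms2 / (1.0 + self.p * accel_ms2 ** 2)
        self.m += k * (force_N - self.m * accel_ms2)
        self.p *= (1.0 - k * accel_ms2)
        return self.m
```

Because the module only consumes (F, a) pairs, it can be swapped for a different mass estimator without touching the other modules, which is the design benefit the abstract emphasizes.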

  11. Extracting Baseline Electricity Usage Using Gradient Tree Boosting

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Taehoon [Ulsan Nat. Inst. of Sci. & Tech., Ulsan (South Korea); Lee, Dongeun [Ulsan Nat. Inst. of Sci. & Tech., Ulsan (South Korea); Choi, Jaesik [Ulsan Nat. Inst. of Sci. & Tech., Ulsan (South Korea); Spurlock, Anna [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sim, Alex [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Todd, Annika [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wu, Kesheng [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-05-05

    To understand how specific interventions affect a process observed over time, we need to control for the other factors that influence outcomes. A model that captures all factors other than the one of interest is generally known as a baseline. In our study of how different pricing schemes affect residential electricity consumption, the baseline needs to capture the impact of outdoor temperature along with many other factors. In this work, we examine a number of different data mining techniques and demonstrate Gradient Tree Boosting (GTB) to be an effective method for building the baseline. We train GTB on data prior to the introduction of new pricing schemes, and apply the known temperature following their introduction to predict electricity usage with the expected temperature correction. Our experiments and analyses show that the baseline models generated by GTB capture the core characteristics over the two years with the new pricing schemes. In contrast to the majority of regression-based techniques, which fail to capture the lag between the peak of daily temperature and the peak of electricity usage, the GTB-generated baselines correctly capture the delay between the temperature peak and the electricity peak. Furthermore, subtracting this temperature-adjusted baseline from the observed electricity usage, we find that the resulting values are more amenable to interpretation, which demonstrates that the temperature-adjusted baseline is indeed effective.
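
To make the GTB idea concrete, here is a deliberately tiny, numpy-only gradient tree boosting for squared loss: start from the mean, then repeatedly fit a depth-1 tree (stump) to the current residuals. This is a sketch of the technique, not the authors' pipeline, which would use a full library and many more covariates than temperature:

```python
import numpy as np

def fit_stump(x, r):
    """Best depth-1 regression tree (threshold + two leaf means) for residual r."""
    best_sse, best = np.inf, None
    for thr in np.unique(x)[:-1]:
        left = x <= thr
        pred = np.where(left, r[left].mean(), r[~left].mean())
        sse = ((r - pred) ** 2).sum()
        if sse < best_sse:
            best_sse, best = sse, (thr, r[left].mean(), r[~left].mean())
    return best

def gtb_fit(x, y, n_rounds=50, lr=0.1):
    """Boosting loop: each round fits a shrunken stump to the residuals."""
    model = [float(y.mean())]
    pred = np.full_like(y, model[0], dtype=float)
    for _ in range(n_rounds):
        thr, lv, rv = fit_stump(x, y - pred)
        model.append((thr, lr * lv, lr * rv))
        pred += np.where(x <= thr, lr * lv, lr * rv)
    return model

def gtb_predict(model, x):
    pred = np.full_like(x, model[0], dtype=float)
    for thr, lv, rv in model[1:]:
        pred += np.where(x <= thr, lv, rv)
    return pred
```

Trained on pre-intervention (temperature, usage) pairs, `gtb_predict` then yields the temperature-corrected baseline that gets subtracted from observed usage.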

  12. Multiobjective constraints for climate model parameter choices: Pragmatic Pareto fronts in CESM1

    Science.gov (United States)

    Langenbrunner, B.; Neelin, J. D.

    2017-09-01

    Global climate models (GCMs) are examples of high-dimensional input-output systems, where model output is a function of many variables, and an update in model physics commonly improves performance in one objective function (i.e., measure of model performance) at the expense of degrading another. Here concepts from multiobjective optimization in the engineering literature are used to investigate parameter sensitivity and optimization in the face of such trade-offs. A metamodeling technique called cut high-dimensional model representation (cut-HDMR) is leveraged in the context of multiobjective optimization to improve GCM simulation of the tropical Pacific climate, focusing on seasonal precipitation, column water vapor, and skin temperature. An evolutionary algorithm is used to solve for Pareto fronts, which are surfaces in objective function space along which trade-offs in GCM performance occur. This approach allows the modeler to visualize trade-offs quickly and identify the physics at play. In some cases, Pareto fronts are small, implying that trade-offs are minimal, optimal parameter value choices are more straightforward, and the GCM is well-functioning. In all cases considered here, the control run was found not to be Pareto-optimal (i.e., not on the front), highlighting an opportunity for model improvement through objectively informed parameter selection. Taylor diagrams illustrate that these improvements occur primarily in field magnitude, not spatial correlation, and they show that specific parameter updates can improve fields fundamental to tropical moist processes—namely precipitation and skin temperature—without significantly impacting others. These results provide an example of how basic elements of multiobjective optimization can facilitate pragmatic GCM tuning processes.
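
The core object in this analysis, the Pareto front, is just the non-dominated subset of the sampled model runs in objective-function space. A minimal sketch of extracting it (lower cost = better for every objective; the function name is illustrative):

```python
import numpy as np

def pareto_front(costs):
    """Boolean mask of the non-dominated rows of `costs`.

    Each row is one model run, each column one objective (lower is
    better).  A row is dominated if some other row is at least as good
    in every objective and strictly better in at least one.
    """
    n = costs.shape[0]
    mask = np.ones(n, dtype=bool)
    for i in range(n):
        if mask[i]:
            dominated = (np.all(costs >= costs[i], axis=1)
                         & np.any(costs > costs[i], axis=1))
            mask &= ~dominated
    return mask
```

Applied to, say, (precipitation error, skin temperature error) pairs from a perturbed-physics ensemble, the surviving rows trace out the trade-off surface the abstract describes; a control run lying off this front signals room for improvement.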

  13. 75 FR 57268 - Notice of Baseline Filings

    Science.gov (United States)

    2010-09-20

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR10-103-000; Docket No. PR10-104-000; Docket No. PR10-105- 000 (Not Consolidated)] Notice of Baseline Filings September 13..., 2010, and September 10, 2010, respectively the applicants listed above submitted their baseline filing...

  14. Minnesota's forest statistics, 1987: an inventory update.

    Science.gov (United States)

    Jerold T. Hahn; W. Brad Smith

    1987-01-01

    The Minnesota 1987 inventory update, derived by using tree growth models, reports 13.5 million acres of timberland, a decline of less than 1% since 1977. This bulletin presents findings from the inventory update in tables detailing timberland area, volume, and biomass.

  15. Wisconsin's forest statistics, 1987: an inventory update.

    Science.gov (United States)

    W. Brad Smith; Jerold T. Hahn

    1989-01-01

    The Wisconsin 1987 inventory update, derived by using tree growth models, reports 14.7 million acres of timberland, a decline of less than 1% since 1983. This bulletin presents findings from the inventory update in tables detailing timberland area, volume, and biomass.

  16. Research on Topographic Map Updating

    Directory of Open Access Journals (Sweden)

    Ivana Javorović

    2013-04-01

    Full Text Available The investigation of the interpretability of the panchromatic satellite image IRS-1C integrated with the multispectral Landsat TM image, with the purpose of updating the topographic map sheet at the scale of 1:25 000, is described. The geocoding of the source map was based on trigonometric points of the map sheet. Satellite images were geocoded using control points selected from the map. The contents of the map were vectorized and a topographic database designed. Digital image processing improved the interpretability of the images, after which the new contents were vectorized. Change detection for the forest and water areas was performed using unsupervised classification of spatially and spectrally merged images. Verification of the results was made using corresponding aerial photographs. Although this methodology could not ensure the complete updating of the topographic map at the scale of 1:25 000, the database was updated with a large amount of data. Erdas Imagine 8.3 software was used.

  17. Application of a Bayesian algorithm for the Statistical Energy model updating of a railway coach

    DEFF Research Database (Denmark)

    Sadri, Mehran; Brunskog, Jonas; Younesian, Davood

    2016-01-01

    The classical statistical energy analysis (SEA) theory is a common approach for vibroacoustic analysis of coupled complex structures, being efficient for predicting high-frequency noise and vibration of engineering systems. There are, however, some limitations in applying the conventional SEA. To evaluate the performance of the proposed strategy, SEA model updating of a railway passenger coach is carried out. First, a sensitivity analysis is performed to select the most sensitive parameters of the SEA model. For the selected parameters, prior probability density functions are then taken into account based on published data comparing experimental and theoretical results, so that the variance of the theory is estimated. The Monte Carlo Metropolis-Hastings algorithm is employed to estimate the modified values of the parameters. It is shown that the algorithm can be used efficiently.
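
The Metropolis-Hastings step named above can be illustrated with a toy one-parameter version: a random-walk sampler drawing from the posterior of a single model parameter given noisy observations. Everything below (the data, the Gaussian likelihood with known variance, the flat prior) is a hypothetical stand-in for the paper's SEA setup:

```python
import numpy as np

def metropolis_hastings(log_post, theta0, n_iter=4000, step=0.5, seed=0):
    """Random-walk Metropolis sampler for a 1-D parameter."""
    rng = np.random.default_rng(seed)
    theta, lp = theta0, log_post(theta0)
    samples = np.empty(n_iter)
    for i in range(n_iter):
        prop = theta + step * rng.standard_normal()
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject
            theta, lp = prop, lp_prop
        samples[i] = theta
    return samples

# Toy 'model updating': infer one parameter from noisy observations,
# Gaussian likelihood with known sigma = 0.2 and a flat prior on (0, 10).
data = np.array([2.1, 1.9, 2.3, 2.0, 1.8, 2.2])
def log_post(theta):
    if not 0.0 < theta < 10.0:
        return -np.inf
    return -0.5 * np.sum((data - theta) ** 2) / 0.2 ** 2

samples = metropolis_hastings(log_post, theta0=5.0)
```

After discarding a burn-in, the sample mean concentrates near the data mean (2.05 here), which is how the updated parameter values and their uncertainties are read off the chain.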

  18. Re-evaluation and updating of the seismic hazard of Lebanon

    Science.gov (United States)

    Huijer, Carla; Harajli, Mohamed; Sadek, Salah

    2016-01-01

    This paper presents the results of a study undertaken to evaluate the implications of the newly mapped offshore Mount Lebanon Thrust (MLT) fault system for the seismic hazard of Lebanon and the current seismic zoning and design parameters used by the local engineering community. This re-evaluation is critical, given that the MLT is located in close proximity to the major cities and economic centers of the country. The updated seismic hazard was assessed using probabilistic methods of analysis. The potential sources of seismic activity that affect Lebanon were integrated, along with all newly established characteristics, within an updated database which includes the newly mapped fault system. The earthquake recurrence relationships of these sources were developed from instrumental seismology data, historical records, and earlier studies undertaken to evaluate the seismic hazard of neighboring countries. Maps of peak ground acceleration contours, based on 10% probability of exceedance in 50 years (as per Uniform Building Code (UBC) 1997), as well as 0.2 s and 1 s peak spectral acceleration contours, based on 2% probability of exceedance in 50 years (as per International Building Code (IBC) 2012), were also developed. Finally, spectral charts for the main coastal cities of Beirut, Tripoli, Jounieh, Byblos, Saida, and Tyre are provided for use by designers.

  19. Dynamic contrast-enhanced MR imaging pharmacokinetic parameters as predictors of treatment response of brain metastases in patients with lung cancer

    Energy Technology Data Exchange (ETDEWEB)

    Kuchcinski, Gregory; Duhal, Romain; Lalisse, Maxime; Dumont, Julien; Lopes, Renaud; Pruvo, Jean-Pierre; Leclerc, Xavier; Delmaire, Christine [University of Lille, CHU Lille, Department of Neuroradiology, Lille (France); Le Rhun, Emilie [University of Lille, CHU Lille, Department of Neurosurgery, Lille (France); Oscar Lambret Center, Department of Medical Oncology, Lille (France); Inserm U1192-PRISM-Laboratoire de Proteomique, Reponse Inflammatoire, Spectrometrie de Masse, Lille (France); Cortot, Alexis B. [University of Lille, CHU Lille, Department of Thoracic Oncology, Lille (France); Drumez, Elodie [University of Lille, CHU Lille, Department of Biostatistics, Lille (France)

    2017-09-15

    To determine the diagnostic accuracy of pharmacokinetic parameters measured by dynamic contrast-enhanced (DCE) magnetic resonance imaging (MRI) in predicting the response of brain metastases to antineoplastic therapy in patients with lung cancer. Forty-four consecutive patients with lung cancer, harbouring 123 newly diagnosed brain metastases, prospectively underwent conventional 3-T MRI at baseline (within 1 month before treatment), during the early (7-10 weeks) and midterm (5-7 months) post-treatment period. An additional DCE MRI sequence was performed during baseline and early post-treatment MRI to evaluate baseline pharmacokinetic parameters (K^trans, k_ep, v_e, v_p) and their early variation (∇K^trans, ∇k_ep, ∇v_e, ∇v_p). The objective response was judged by the volume variation of each metastasis from baseline to midterm MRI. ROC curve analysis determined the best DCE MRI parameter for predicting the objective response. Baseline DCE MRI parameters were not associated with the objective response. Early ∇K^trans, ∇v_e and ∇v_p were significantly associated with the objective response (p = 0.02, p = 0.001 and p = 0.02, respectively). The best predictor of objective response was ∇v_e, with an area under the curve of 0.93 [95% CI = 0.87, 0.99]. DCE MRI and early ∇v_e may be a useful tool to predict the objective response of brain metastases in patients with lung cancer. (orig.)

  20. 75 FR 65010 - Notice of Baseline Filings

    Science.gov (United States)

    2010-10-21

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR11-1-000; Docket No. PR11-2-000; Docket No. PR11-3-000] Notice of Baseline Filings October 14, 2010. Cranberry Pipeline Docket..., 2010, respectively the applicants listed above submitted their baseline filing of its Statement of...

  1. Software Updating in Wireless Sensor Networks: A Survey and Lacunae

    Directory of Open Access Journals (Sweden)

    Cormac J. Sreenan

    2013-11-01

    Full Text Available Wireless Sensor Networks are moving out of the laboratory and into the field. For a number of reasons there is often a need to update sensor node software, or node configuration, after deployment. The need for over-the-air updates is driven both by the scale of deployments, and by the remoteness and inaccessibility of sensor nodes. This need has been recognized since the early days of sensor networks, and research results from the related areas of mobile networking and distributed systems have been applied to this area. In order to avoid any manual intervention, the update process needs to be autonomous. This paper presents a comprehensive survey of software updating in Wireless Sensor Networks, and analyses the features required to make these updates autonomous. A new taxonomy of software update features and a new model for fault detection and recovery are presented. The paper concludes by identifying the lacunae relating to autonomous software updates, providing direction for future research.

  2. 77 FR 41258 - FOIA Fee Schedule Update

    Science.gov (United States)

    2012-07-13

    ... DEFENSE NUCLEAR FACILITIES SAFETY BOARD 10 CFR Part 1703 FOIA Fee Schedule Update AGENCY: Defense Nuclear Facilities Safety Board. ACTION: Establishment of FOIA Fee Schedule. SUMMARY: The Defense Nuclear Facilities Safety Board is publishing its Freedom of Information Act (FOIA) Fee Schedule Update pursuant to...

  3. 76 FR 43819 - FOIA Fee Schedule Update

    Science.gov (United States)

    2011-07-22

    ... DEFENSE NUCLEAR FACILITIES SAFETY BOARD 10 CFR Part 1703 FOIA Fee Schedule Update AGENCY: Defense Nuclear Facilities Safety Board. ACTION: Establishment of FOIA Fee Schedule. SUMMARY: The Defense Nuclear Facilities Safety Board is publishing its Freedom of Information Act (FOIA) Fee Schedule Update pursuant to...

  4. Nonlinear adaptive synchronization rule for identification of a large amount of parameters in dynamical models

    International Nuclear Information System (INIS)

    Ma Huanfei; Lin Wei

    2009-01-01

    The existing adaptive synchronization technique based on the stability theory and invariance principle of dynamical systems, though theoretically proved valid for parameter identification in specific models, often shows a slow convergence rate and can even fail in practice when the number of parameters becomes large. Here, a novel nonlinear adaptive rule for parameter updating is proposed to accelerate convergence. Its feasibility is validated by analytical arguments as well as by parameter identification in the Lotka-Volterra model with multiple species. Two adjustable factors in this rule influence the identification accuracy, so a proper choice of these factors leads to optimal performance of the rule. In addition, a feasible method is proposed for avoiding the occurrence of approximate linear dependence among terms with parameters on the synchronized manifold.
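
The classical (linear) adaptive rule that this paper accelerates can be shown on the simplest possible example: a scalar system x' = -a·x + u(t) with unknown a, an observer that synchronizes to it, and the Lyapunov-based update a_hat' = -γ·e·y with synchronization error e = x - y. All gains and signals below are illustrative choices, not values from the paper:

```python
import numpy as np

# True system:  x' = -a*x + u(t),  with a unknown to the observer.
# Observer:     y' = -a_hat*y + u(t) + k*e,   e = x - y
# Adaptive law: a_hat' = -gamma * e * y   (classical Lyapunov-based rule)
a_true, gamma, k, dt = 1.5, 5.0, 2.0, 1e-3
x, y, a_hat = 1.0, 0.0, 0.0

for i in range(int(60.0 / dt)):          # simulate 60 s with forward Euler
    u = np.sin(i * dt)                   # persistently exciting input
    e = x - y
    x += dt * (-a_true * x + u)
    y += dt * (-a_hat * y + u + k * e)
    a_hat += dt * (-gamma * e * y)
```

With the persistently exciting input, `a_hat` converges to the true value 1.5; the paper's point is that rules of this linear form slow down badly as the parameter count grows, motivating its nonlinear replacement.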

  5. 76 FR 8725 - Notice of Baseline Filings

    Science.gov (United States)

    2011-02-15

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Notice of Baseline Filings Enstor Grama Ridge Storage and Docket No. PR10-97-002. Transportation, L.L.C.. EasTrans, LLC Docket No. PR10-30-001... revised baseline filing of their Statement of Operating Conditions for services provided under section 311...

  6. 76 FR 5797 - Notice of Baseline Filings

    Science.gov (United States)

    2011-02-02

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR10-114-001; Docket No. PR10-129-001; Docket No. PR10-131- 001; Docket No. PR10-68-002 Not Consolidated] Notice of Baseline... applicants listed above submitted a revised baseline filing of their Statement of Operating Conditions for...

  7. The CASLEO Polarimetric Survey of Main Belt Asteroids: Updated results

    Science.gov (United States)

    Gil-Hutton, R.; Cellino, A.; Cañada-Assandri, M.

    2011-10-01

    We present updated results of the polarimetric survey of main-belt asteroids at Complejo Astronómico El Leoncito (Casleo), San Juan, Argentina, using the 2.15 m telescope and the Torino and CASPROF polarimeters. The goals of this survey are to increase the database of asteroid polarimetry, to estimate diversity in polarimetric properties of asteroids belonging to different taxonomic classes, and to search for objects that exhibit anomalous polarimetric properties. The survey began in 2003, and data for a sample of more than 170 asteroids have been obtained, most of them having been polarimetrically observed for the first time. Using these data we find phase-polarization curves and polarimetric parameters for several taxonomic classes.

  8. Seeking deep convective parameter updates that improve tropical Pacific climatology in CESM using Pareto fronts

    Science.gov (United States)

    Langenbrunner, B.; Neelin, J. D.

    2016-12-01

    Despite increasing complexity and process representation in global climate models (GCMs), accurate climate simulation is limited by uncertainties in sub-grid scale model physics, where cloud processes and precipitation occur, and in the interaction with large-scale dynamics. Identifying highly sensitive parameters and constraining them against observations is therefore a valuable step in narrowing uncertainty. However, changes in parameterizations often improve some variables or aspects of the simulation while degrading others. This analysis addresses means of improving GCM simulation of present-day tropical Pacific climate in the face of these tradeoffs. Focusing on the deep convection scheme in the fully coupled Community Earth System Model (CESM) version 1, four parameters were systematically sampled, and a metamodel or model emulator was used to reconstruct the parameter space of this perturbed physics ensemble. Using this metamodel, a Pareto front is constructed to visualize multiobjective tradeoffs in model performance, and results highlight the most important aspects of model physics as well as the most sensitive parameter ranges. For example, parameter tradeoffs arise in the tropical Pacific where precipitation cannot improve without sea surface temperature getting worse. Tropical precipitation sensitivity is found to be highly nonlinear for low values of entrainment in convecting plumes, though it is fairly insensitive at the high end of the plausible range. Increasing the adjustment timescale for convective closure causes the centroid of tropical precipitation to vary as much as two degrees latitude, highlighting the effect these physics can have on large-scale features of the hydrological cycle. The optimization procedure suggests that simultaneously increasing the maximum downdraft mass flux fraction and the adjustment timescale can yield improvements to surface temperature and column water vapor without degrading the simulation of precipitation.
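
The metamodel step, reconstructing an objective function over the sampled parameter space so it can be queried cheaply, can be illustrated with the simplest emulator: a quadratic response surface fit by least squares. This is a hedged stand-in for the more structured cut-HDMR approach, shown for two illustrative parameters rather than the four the study samples:

```python
import numpy as np

def design(P):
    """Quadratic basis in two parameters: [1, p1, p2, p1^2, p2^2, p1*p2]."""
    p1, p2 = P[:, 0], P[:, 1]
    return np.column_stack([np.ones_like(p1), p1, p2, p1**2, p2**2, p1 * p2])

def fit_emulator(P, J):
    """Least-squares fit of the response surface to sampled objective values J."""
    coef, *_ = np.linalg.lstsq(design(P), J, rcond=None)
    return coef

def predict(coef, P):
    """Cheap emulator evaluation at new parameter settings."""
    return design(P) @ coef
```

Once fitted on a modest ensemble of GCM runs, an emulator like this is what an evolutionary algorithm interrogates (millions of times, at negligible cost) to trace out the Pareto front.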

  9. Environmental Regulatory Update Table, December 1989

    International Nuclear Information System (INIS)

    Houlberg, L.M.; Langston, M.E.; Nikbakht, A.; Salk, M.S.

    1990-01-01

    The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated each month with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.

  10. Environmental regulatory update table, March 1989

    International Nuclear Information System (INIS)

    Houlberg, L.; Langston, M.E.; Nikbakht, A.; Salk, M.S.

    1989-04-01

    The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated each month with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.

  11. Environmental Regulatory Update Table, April 1989

    International Nuclear Information System (INIS)

    Houlberg, L.; Langston, M.E.; Nikbakht, A.; Salk, M.S.

    1989-05-01

    The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated each month with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.

  12. Environmental Regulatory Update Table, December 1991

    Energy Technology Data Exchange (ETDEWEB)

    Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.

    1992-01-01

    The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated each month with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.

  13. Environmental Regulatory Update Table, August 1990

    International Nuclear Information System (INIS)

    Houlberg, L.M.; Nikbakht, A.; Salk, M.S.

    1990-09-01

    The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated each month with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.

  14. Environmental Regulatory Update Table, October 1991

    Energy Technology Data Exchange (ETDEWEB)

    Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.

    1991-11-01

    The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated each month with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.

  15. Environmental Regulatory Update Table, November 1991

    Energy Technology Data Exchange (ETDEWEB)

    Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.

    1991-12-01

    The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated each month with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.

  16. Environmental Regulatory Update Table, September 1991

    Energy Technology Data Exchange (ETDEWEB)

    Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.

    1991-10-01

    The Environmental Regulatory Update Table provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated each month with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.

  17. The Energy Information Administration's assessment of reformulated gasoline: An update

    International Nuclear Information System (INIS)

    1994-12-01

    This report (Part II) concludes a two-part study of the Energy Information Administration's (EIA) assessment of Reformulated Gasoline (RFG). The data contained herein update EIA's previous findings and analyses on reformulated gasoline as it affects the petroleum industry. The major findings of Part II have not changed considerably from Part I: Supplies of RFG are adequate to meet demand, but a tight supply-demand balance exists, leaving the RFG system with little ability to absorb an unexpected supply or delivery system disruption. In December 1994, the estimated demand for RFG was 2.6 million barrels per day, with production capability just meeting this demand. The study concludes that current prices for RFG are consistent with the costs underlying the product, and the price difference between RFG and conventional gasoline indicates confidence in supply. The study also follows the impact of recent events such as: postponement of the Renewable Oxygenate Standard, the decision to require importers to use the U.S. average baseline for limiting emissions, the disruption of the Colonial Pipeline in Texas, and Pennsylvania's request to opt out of the RFG program.

  18. How Do We Update Faces? Effects of Gaze Direction and Facial Expressions on Working Memory Updating

    OpenAIRE

    Artuso, Caterina; Palladino, Paola; Ricciardelli, Paola

    2012-01-01

    The aim of the study was to investigate how the biological binding between different facial dimensions, and their social and communicative relevance, may impact updating processes in working memory (WM). We focused on WM updating because it plays a key role in ongoing processing. Gaze direction and facial expression are crucial and changeable components of face processing. Direct gaze enhances the processing of approach-oriented facial emotional expressions (e.g., joy), while averted gaze enh...

  19. Ambiguities of theoretical parameters and CP or T violation in neutrino factories

    International Nuclear Information System (INIS)

    Koike, Masafumi; Ota, Toshihiko; Sato, Joe

    2002-01-01

    We study the sensitivity of the CP- or T-violation search in the presence of ambiguities in the theoretical parameters. Three generations of neutrinos are considered. The parameters whose ambiguities are considered are the differences of the squared masses, the mixing angles, and the density of matter. We first consider the statistics that are sensitive to the genuine CP-violation effect originating from the imaginary coupling. No ambiguity of the parameters is considered in this part. It is argued that the widely adopted usual statistics are not necessarily sensitive to the genuine CP-violation effect. Two statistics that are sensitive to the imaginary coupling are proposed. The qualitative difference between these statistics and the usual ones is discussed. Next we proceed to the case where the ambiguity of the parameters is present. The sensitivity of the CP-violation search is greatly spoiled when the baseline length is longer than about one thousand kilometers, which turns out to be due to the ambiguity of the matter effect. Thus the CP-violation search by use of CP-conjugate channels turns out to require low-energy neutrinos and a short baseline length. It is also shown that such a loss of sensitivity is avoided by using T-conjugate oscillation channels.

  20. 75 FR 70732 - Notice of Baseline Filings

    Science.gov (United States)

    2010-11-18

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR11-71-000; Docket No. PR11-72-000; Docket No. PR11-73- 000] Notice of Baseline Filings November 10, 2010. Docket No. PR11-71-000..., 2010, the applicants listed above submitted their baseline filing of their Statement of Operating...

  1. Robustness of Adaptive Survey Designs to Inaccuracy of Design Parameters

    Directory of Open Access Journals (Sweden)

    Burger Joep

    2017-09-01

    Full Text Available Adaptive survey designs (ASDs) optimize design features, given (1) the interactions between the design features and characteristics of sampling units and (2) a set of constraints, such as a budget and a minimum number of respondents. Estimation of the interactions is subject to both random and systematic error. In this article, we propose and evaluate four viewpoints to assess robustness of ASDs to inaccuracy of design parameter estimates: the effect of both imprecision and bias on both ASD structure and ASD performance. We additionally propose three distance measures to compare the structure of ASDs. The methodology is illustrated using a simple simulation study and a more complex but realistic case study on the Dutch Travel Survey. The proposed methodology can be applied to other ASD optimization problems. In our simulation study and case study, the ASD was fairly robust to imprecision, but not to realistic dynamics in the design parameters. To deal with the sensitivity of ASDs to changing design parameters, we recommend learning and updating the design parameters.

  2. Baseline Response Levels Are a Nuisance in Infant Contingency Learning

    Science.gov (United States)

    Millar, W. S.; Weir, Catherine

    2015-01-01

    The impact of differences in level of baseline responding on contingency learning in the first year was examined by considering the response acquisition of infants classified into baseline response quartiles. Whereas the three lower baseline groups showed the predicted increment in responding to a contingency, the highest baseline responders did…

  3. 2011 update to the Society of Thoracic Surgeons and the Society of Cardiovascular Anesthesiologists blood conservation clinical practice guidelines.

    Science.gov (United States)

    Ferraris, Victor A; Brown, Jeremiah R; Despotis, George J; Hammon, John W; Reece, T Brett; Saha, Sibu P; Song, Howard K; Clough, Ellen R; Shore-Lesserson, Linda J; Goodnough, Lawrence T; Mazer, C David; Shander, Aryeh; Stafford-Smith, Mark; Waters, Jonathan; Baker, Robert A; Dickinson, Timothy A; FitzGerald, Daniel J; Likosky, Donald S; Shann, Kenneth G

    2011-03-01

    Practice guidelines reflect published literature. Because of the ever-changing literature base, it is necessary to update and revise guideline recommendations from time to time. The Society of Thoracic Surgeons recommends review and possible update of previously published guidelines at least every three years. This summary is an update of the blood conservation guideline published in 2007. The search methods used in the current version differ from those of the previously published guideline. Literature searches were conducted using standardized MeSH terms from the National Library of Medicine PUBMED database list of search terms. The following terms comprised the standard baseline search terms for all topics and were connected with the logical 'OR' connector: Extracorporeal circulation (MeSH number E04.292), cardiovascular surgical procedures (MeSH number E04.100), and vascular diseases (MeSH number C14.907). Use of these broad search terms allowed specific topics to be added to the search with the logical 'AND' connector. In this 2011 guideline update, areas of major revision include: 1) management of dual anti-platelet therapy before operation, 2) use of drugs that augment red blood cell volume or limit blood loss, 3) use of blood derivatives including fresh frozen plasma, Factor XIII, leukoreduced red blood cells, platelet plasmapheresis, recombinant Factor VII, antithrombin III, and Factor IX concentrates, 4) changes in management of blood salvage, 5) use of minimally invasive procedures to limit perioperative bleeding and blood transfusion, 6) recommendations for blood conservation related to extracorporeal membrane oxygenation and cardiopulmonary perfusion, 7) use of topical hemostatic agents, and 8) new insights into the value of team interventions in blood management. Much has changed since the previously published 2007 STS blood management guidelines and this document contains new and revised recommendations. Copyright © 2011 The Society of Thoracic
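
    The boolean search strategy described here can be sketched programmatically. The three baseline MeSH terms are taken from the abstract; the query-tag syntax and the example topic are assumptions for illustration, not part of the guideline.

    ```python
    # Sketch of the search-string construction the update describes: broad
    # baseline MeSH terms joined with OR, then a specific topic attached with
    # AND. The "[MeSH Terms]" tag and the topic are assumed for illustration.

    baseline_terms = [
        "Extracorporeal circulation",           # MeSH number E04.292
        "cardiovascular surgical procedures",   # MeSH number E04.100
        "vascular diseases",                    # MeSH number C14.907
    ]

    def build_query(topic):
        """OR-join the broad baseline terms, then AND the specific topic."""
        broad = " OR ".join(f'"{t}"[MeSH Terms]' for t in baseline_terms)
        return f"({broad}) AND {topic}"

    query = build_query("antifibrinolytic agents")
    ```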

  4. IPCC Socio-Economic Baseline Dataset

    Data.gov (United States)

    National Aeronautics and Space Administration — The Intergovernmental Panel on Climate Change (IPCC) Socio-Economic Baseline Dataset consists of population, human development, economic, water resources, land...

  5. Reference values of clinical chemistry and hematology parameters in rhesus monkeys (Macaca mulatta).

    Science.gov (United States)

    Chen, Younan; Qin, Shengfang; Ding, Yang; Wei, Lingling; Zhang, Jie; Li, Hongxia; Bu, Hong; Lu, Yanrong; Cheng, Jingqiu

    2009-01-01

    Rhesus monkey models are valuable for studies of human biology. Reference values for clinical chemistry and hematology parameters of rhesus monkeys are required for proper data interpretation. Whole blood was collected from 36 healthy Chinese rhesus monkeys (Macaca mulatta) of either sex, 3 to 5 yr old. Routine chemistry and hematology parameters, and some special coagulation parameters including thromboelastograph and activities of coagulation factors, were tested. We present here the baseline values of clinical chemistry and hematology parameters in normal Chinese rhesus monkeys. These data may provide valuable information for veterinarians and investigators using rhesus monkeys in experimental studies.
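
    A common parametric way to turn such measurements into reference values (not necessarily the authors' exact method) is a 95% interval of mean ± 1.96 SD, assuming approximate normality. The values below are hypothetical.

    ```python
    # Sketch of a parametric 95% reference interval for a clinical parameter.
    # Assumes approximate normality; the sample values are hypothetical.
    import statistics

    def reference_interval(values, z=1.96):
        """Return (lower, upper) as mean +/- z * sample SD."""
        mean = statistics.mean(values)
        sd = statistics.stdev(values)
        return mean - z * sd, mean + z * sd

    # Hypothetical hemoglobin values (g/dL) from healthy animals.
    hb = [12.1, 12.8, 13.0, 12.5, 13.4, 12.2, 12.9, 13.1]
    lo, hi = reference_interval(hb)
    ```

    With larger samples, a nonparametric 2.5th-97.5th percentile interval is often preferred because it makes no distributional assumption.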

  6. Camera Trajectory from Wide Baseline Images

    Science.gov (United States)

    Havlena, M.; Torii, A.; Pajdla, T.

    2008-09-01

    Camera trajectory estimation, which is closely related to the structure from motion computation, is one of the fundamental tasks in computer vision. Reliable camera trajectory estimation plays an important role in 3D reconstruction, self localization, and object recognition. There are essential issues for a reliable camera trajectory estimation, for instance, choice of the camera and its geometric projection model, camera calibration, image feature detection and description, and robust 3D structure computation. Most approaches rely on classical perspective cameras because of the simplicity of their projection models and ease of their calibration. However, classical perspective cameras offer only a limited field of view, and thus occlusions and sharp camera turns may cause consecutive frames to look completely different when the baseline becomes longer. This makes the image feature matching very difficult (or impossible) and the camera trajectory estimation fails under such conditions. These problems can be avoided if omnidirectional cameras, e.g. a fish-eye lens convertor, are used. The hardware which we are using in practice is a combination of Nikon FC-E9 mounted via a mechanical adaptor onto a Kyocera Finecam M410R digital camera. Nikon FC-E9 is a megapixel omnidirectional add-on convertor with 180° view angle which provides images of photographic quality. Kyocera Finecam M410R delivers 2272×1704 images at 3 frames per second. The resulting combination yields a circular view of diameter 1600 pixels in the image. Since consecutive frames of the omnidirectional camera often share a common region in 3D space, the image feature matching is often feasible. On the other hand, the calibration of these cameras is non-trivial and is crucial for the accuracy of the resulting 3D reconstruction. We calibrate omnidirectional cameras off-line using the state-of-the-art technique and Mičušík's two-parameter model, which links the radius of the image point r to the

  7. Online updating and uncertainty quantification using nonstationary output-only measurement

    Science.gov (United States)

    Yuen, Ka-Veng; Kuok, Sin-Chi

    2016-01-01

    The extended Kalman filter (EKF) is widely adopted for state estimation and parametric identification of dynamical systems. In this algorithm, it is required to specify the covariance matrices of the process noise and measurement noise based on prior knowledge. However, improper assignment of these noise covariance matrices leads to unreliable estimation and misleading uncertainty estimation of the system state and model parameters. Furthermore, it may induce diverging estimation. To resolve these problems, we propose a Bayesian probabilistic algorithm for online estimation of the noise parameters which are used to characterize the noise covariance matrices. There are three major appealing features of the proposed approach. First, it resolves the divergence problem in the conventional usage of EKF due to improper choice of the noise covariance matrices. Second, the proposed approach ensures the reliability of the uncertainty quantification. Finally, since the noise parameters are allowed to be time-varying, nonstationary process noise and/or measurement noise are explicitly taken into account. Examples using stationary/nonstationary response of linear/nonlinear time-varying dynamical systems are presented to demonstrate the efficacy of the proposed approach. Furthermore, comparison with the conventional usage of EKF is provided to reveal the necessity of the proposed approach for reliable model updating and uncertainty quantification.
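
    The role of the noise covariances the abstract discusses can be seen in a minimal one-dimensional Kalman filter. This is the generic linear filter, not the authors' Bayesian online-estimation algorithm, and all values are illustrative.

    ```python
    # Minimal 1-D Kalman filter showing where the process-noise (Q) and
    # measurement-noise (R) covariances enter. Improper Q and R skew the gain K
    # and hence both the estimate and its reported uncertainty P.

    def kf_step(x, P, z, Q, R):
        """One predict + update step for a random-walk state model."""
        # Predict: the state stays put, uncertainty grows by the process noise.
        x_pred, P_pred = x, P + Q
        # Update: blend prediction and measurement z via the Kalman gain.
        K = P_pred / (P_pred + R)
        x_new = x_pred + K * (z - x_pred)
        P_new = (1.0 - K) * P_pred
        return x_new, P_new

    # Illustrative run: noisy measurements of a roughly constant quantity.
    x, P = 0.0, 1.0
    for z in [1.0, 1.2, 0.9, 1.1]:
        x, P = kf_step(x, P, z, Q=0.01, R=0.25)
    ```

    The estimate converges toward the measurements while P shrinks; with a badly chosen Q or R the same recursion produces an over- or under-confident P, which is the failure mode the proposed approach addresses.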

  8. Baseline methodologies for clean development mechanism projects

    Energy Technology Data Exchange (ETDEWEB)

    Lee, M.K. (ed.); Shrestha, R.M.; Sharma, S.; Timilsina, G.R.; Kumar, S.

    2005-11-15

    The Kyoto Protocol and the Clean Development Mechanism (CDM) came into force on 16th February 2005 with its ratification by Russia. The increasing momentum of this process is reflected in more than 100 projects having been submitted to the CDM Executive Board (CDM-EB) for approval of the baselines and monitoring methodologies, which is the first step in developing and implementing CDM projects. A CDM project should result in a net decrease of GHG emissions below any level that would have resulted from other activities implemented in the absence of that CDM project. The 'baseline' defines the GHG emissions of activities that would have been implemented in the absence of a CDM project. The baseline methodology is the process/algorithm for establishing that baseline. The baseline, along with the baseline methodology, are thus the most critical elements of any CDM project toward meeting the important CDM criterion that a project should result in 'real, measurable, and long term benefits related to the mitigation of climate change'. This guidebook is produced within the framework of the United Nations Environment Programme (UNEP) facilitated 'Capacity Development for the Clean Development Mechanism (CD4CDM)' Project. This document is published as part of the project's effort to develop guidebooks that cover important issues such as project finance, sustainability impacts, legal framework and institutional framework. These materials are intended to help stakeholders better understand the CDM and should ultimately contribute to maximizing the effect of the CDM in achieving the ultimate goal of UNFCCC and its Kyoto Protocol. This Guidebook should be read in conjunction with the information provided in the two other guidebooks entitled, 'Clean Development Mechanism: Introduction to the CDM' and 'CDM Information and Guidebook' developed under the CD4CDM project. (BA)
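
    The accounting role of the baseline can be sketched with the standard CDM identity ER = BE - PE - L: emission reductions equal baseline emissions minus project emissions minus leakage. The quantities below are hypothetical.

    ```python
    # Sketch of the standard CDM crediting identity. The baseline (BE) is the
    # counterfactual the abstract defines: emissions in the absence of the
    # project. All figures are hypothetical.

    def emission_reductions(baseline_tCO2e, project_tCO2e, leakage_tCO2e=0.0):
        """Net GHG reductions credited to a CDM project, in tCO2e."""
        return baseline_tCO2e - project_tCO2e - leakage_tCO2e

    er = emission_reductions(baseline_tCO2e=100_000, project_tCO2e=35_000,
                             leakage_tCO2e=5_000)  # -> 60000 tCO2e
    ```

    Because credited reductions scale one-for-one with BE, an inflated baseline directly inflates credits, which is why the baseline methodology is the most scrutinized element of a CDM project.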

  9. Baseline methodologies for clean development mechanism projects

    International Nuclear Information System (INIS)

    Lee, M.K.; Shrestha, R.M.; Sharma, S.; Timilsina, G.R.; Kumar, S.

    2005-11-01

    The Kyoto Protocol and the Clean Development Mechanism (CDM) came into force on 16th February 2005 with its ratification by Russia. The increasing momentum of this process is reflected in more than 100 projects having been submitted to the CDM Executive Board (CDM-EB) for approval of the baselines and monitoring methodologies, which is the first step in developing and implementing CDM projects. A CDM project should result in a net decrease of GHG emissions below any level that would have resulted from other activities implemented in the absence of that CDM project. The 'baseline' defines the GHG emissions of activities that would have been implemented in the absence of a CDM project. The baseline methodology is the process/algorithm for establishing that baseline. The baseline, along with the baseline methodology, are thus the most critical elements of any CDM project toward meeting the important CDM criterion that a project should result in 'real, measurable, and long term benefits related to the mitigation of climate change'. This guidebook is produced within the framework of the United Nations Environment Programme (UNEP) facilitated 'Capacity Development for the Clean Development Mechanism (CD4CDM)' Project. This document is published as part of the project's effort to develop guidebooks that cover important issues such as project finance, sustainability impacts, legal framework and institutional framework. These materials are intended to help stakeholders better understand the CDM and should ultimately contribute to maximizing the effect of the CDM in achieving the ultimate goal of UNFCCC and its Kyoto Protocol. This Guidebook should be read in conjunction with the information provided in the two other guidebooks entitled, 'Clean Development Mechanism: Introduction to the CDM' and 'CDM Information and Guidebook' developed under the CD4CDM project. (BA)

  10. Baseline budgeting for continuous improvement.

    Science.gov (United States)

    Kilty, G L

    1999-05-01

    This article is designed to introduce the techniques used to convert traditionally maintained department budgets to baseline budgets. This entails identifying key activities, evaluating for value-added, and implementing continuous improvement opportunities. Baseline Budgeting for Continuous Improvement was created as a result of a newly named company president's request to implement zero-based budgeting. The president was frustrated with the mind-set of the organization, namely, "Next year's budget should be 10 to 15 percent more than this year's spending." Zero-based budgeting was not the answer, but combining the principles of activity-based costing and the Just-in-Time philosophy of eliminating waste and continuous improvement did provide a solution to the problem.

  11. Safe and sensible preprocessing and baseline correction of pupil-size data.

    Science.gov (United States)

    Mathôt, Sebastiaan; Fabius, Jasper; Van Heusden, Elle; Van der Stigchel, Stefan

    2018-02-01

    Measurement of pupil size (pupillometry) has recently gained renewed interest from psychologists, but there is little agreement on how pupil-size data is best analyzed. Here we focus on one aspect of pupillometric analyses: baseline correction, i.e., analyzing changes in pupil size relative to a baseline period. Baseline correction is useful in experiments that investigate the effect of some experimental manipulation on pupil size. In such experiments, baseline correction improves statistical power by taking into account random fluctuations in pupil size over time. However, we show that baseline correction can also distort data if unrealistically small pupil sizes are recorded during the baseline period, which can easily occur due to eye blinks, data loss, or other distortions. Divisive baseline correction (corrected pupil size = pupil size/baseline) is affected more strongly by such distortions than subtractive baseline correction (corrected pupil size = pupil size - baseline). We discuss the role of baseline correction as a part of preprocessing of pupillometric data, and make five recommendations: (1) before baseline correction, perform data preprocessing to mark missing and invalid data, but assume that some distortions will remain in the data; (2) use subtractive baseline correction; (3) visually compare your corrected and uncorrected data; (4) be wary of pupil-size effects that emerge faster than the latency of the pupillary response allows (within ±220 ms after the manipulation that induces the effect); and (5) remove trials on which baseline pupil size is unrealistically small (indicative of blinks and other distortions).
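
    Recommendations (2) and (5) above, subtractive correction plus exclusion of trials with implausibly small baselines, can be sketched in a few lines. The threshold value and the traces here are hypothetical.

    ```python
    # Sketch of subtractive baseline correction with a safeguard against
    # unrealistically small baselines (e.g. due to blinks during the baseline
    # period). Threshold and data are hypothetical.

    def baseline_correct(trace, baseline_samples, min_baseline=2.0):
        """Subtract the mean of the baseline period from a pupil-size trace.

        Returns None if the baseline is implausibly small, so the trial can be
        excluded rather than distorting the analysis."""
        baseline = sum(trace[:baseline_samples]) / baseline_samples
        if baseline < min_baseline:
            return None  # recommendation 5: drop the trial
        return [s - baseline for s in trace]  # recommendation 2: subtractive

    good_trial = [4.0, 4.0, 4.2, 4.5, 4.8]   # mm; baseline = first 2 samples
    blink_trial = [0.1, 0.2, 4.1, 4.3, 4.6]  # blink during the baseline period

    corrected = baseline_correct(good_trial, baseline_samples=2)
    excluded = baseline_correct(blink_trial, baseline_samples=2)
    ```

    Divisive correction would instead compute `s / baseline`, which is exactly why a near-zero blink baseline distorts it so much more strongly.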

  12. Environmental Regulatory Update Table, August 1991

    Energy Technology Data Exchange (ETDEWEB)

    Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.

    1991-09-01

    This Environmental Regulatory Update Table (August 1991) provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated each month with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.

  13. Environmental regulatory update table, July 1991

    Energy Technology Data Exchange (ETDEWEB)

    Houlberg, L.M.; Hawkins, G.T.; Salk, M.S.

    1991-08-01

    This Environmental Regulatory Update Table (July 1991) provides information on regulatory initiatives of interest to DOE operations and contractor staff with environmental management responsibilities. The table is updated each month with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative with an abstract and a projection of further action.

  14. Sugammadex: An Update

    Directory of Open Access Journals (Sweden)

    Ezri Tiberiu

    2016-01-01

    Full Text Available The purpose of this update is to provide recent knowledge and debates regarding the use of sugammadex in the fields of anesthesia and critical care. The review is not intended to provide a comprehensive description of sugammadex and its clinical use.

  15. Performance Measurement Baseline Change Request

    Data.gov (United States)

    Social Security Administration — The Performance Measurement Baseline Change Request template is used to document changes to scope, cost, schedule, or operational performance metrics for SSA's Major...

  16. Assessment of early treatment response to neoadjuvant chemotherapy in breast cancer using non-mono-exponential diffusion models: a feasibility study comparing the baseline and mid-treatment MRI examinations

    Energy Technology Data Exchange (ETDEWEB)

    Bedair, Reem; Manavaki, Roido; Gill, Andrew B.; Abeyakoon, Oshaani; Gilbert, Fiona J. [University of Cambridge, Department of Radiology, School of Clinical Medicine, Cambridge (United Kingdom); Priest, Andrew N.; Patterson, Andrew J. [Cambridge University Hospitals NHS Foundation Trust, Department of Radiology, Addenbrookes Hospital, Cambridge (United Kingdom); McLean, Mary A. [Cambridge University Hospitals NHS Foundation Trust, Department of Radiology, Addenbrookes Hospital, Cambridge (United Kingdom); University of Cambridge, Li Ka Shing Centre, Cancer Research UK Cambridge Institute, Cambridge (United Kingdom); Graves, Martin J. [University of Cambridge, Department of Radiology, School of Clinical Medicine, Cambridge (United Kingdom); Cambridge University Hospitals NHS Foundation Trust, Department of Radiology, Addenbrookes Hospital, Cambridge (United Kingdom); Griffiths, John R. [University of Cambridge, Li Ka Shing Centre, Cancer Research UK Cambridge Institute, Cambridge (United Kingdom)

    2017-07-15

    To assess the feasibility of the mono-exponential, bi-exponential and stretched-exponential models in evaluating response of breast tumours to neoadjuvant chemotherapy (NACT) at 3 T. Thirty-six female patients (median age 53, range 32-75 years) with invasive breast cancer undergoing NACT were enrolled for diffusion-weighted MRI (DW-MRI) prior to the start of treatment. For assessment of early response, changes in parameters were evaluated on mid-treatment MRI in 22 patients. DW-MRI was performed using eight b values (0, 30, 60, 90, 120, 300, 600, 900 s/mm²). Apparent diffusion coefficient (ADC), tissue diffusion coefficient (D_t), vascular fraction (Florin), distributed diffusion coefficient (DDC) and alpha (α) parameters were derived. Then t tests compared the baseline and changes in parameters between response groups. Repeatability was assessed at inter- and intraobserver levels. All patients underwent baseline MRI whereas 22 lesions were available at mid-treatment. At pretreatment, mean diffusion coefficients demonstrated significant differences between groups (p < 0.05). At mid-treatment, percentage increase in ADC and DDC showed significant differences between responders (49 % and 43 %) and non-responders (21 % and 32 %) (p = 0.03, p = 0.04). Overall, stretched-exponential parameters showed excellent repeatability. DW-MRI is sensitive to baseline and early treatment changes in breast cancer using non-mono-exponential models, and the stretched-exponential model can potentially monitor such changes. (orig.)
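
    The three decay models named in the study have standard functional forms in the DW-MRI literature, sketched below. The parameter values are illustrative, not the study's fitted estimates.

    ```python
    # Standard DW-MRI signal models (normalized signal S/S0 as a function of
    # b-value in s/mm^2). Parameter values are illustrative only.
    import math

    def mono_exponential(b, adc):
        """S/S0 = exp(-b * ADC)."""
        return math.exp(-b * adc)

    def bi_exponential(b, f, d_fast, d_slow):
        """f = vascular (fast) fraction; d_slow = tissue coefficient D_t."""
        return f * math.exp(-b * d_fast) + (1.0 - f) * math.exp(-b * d_slow)

    def stretched_exponential(b, ddc, alpha):
        """ddc = distributed diffusion coefficient; 0 < alpha <= 1."""
        return math.exp(-((b * ddc) ** alpha))

    b = 900  # highest b-value used in the study, s/mm^2
    s_mono = mono_exponential(b, adc=1.0e-3)
    s_stretched = stretched_exponential(b, ddc=1.0e-3, alpha=0.8)
    ```

    At α = 1 the stretched-exponential model collapses to the mono-exponential one, which is why α measures departure from simple Gaussian diffusion.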

  17. Assessment of early treatment response to neoadjuvant chemotherapy in breast cancer using non-mono-exponential diffusion models: a feasibility study comparing the baseline and mid-treatment MRI examinations

    International Nuclear Information System (INIS)

    Bedair, Reem; Manavaki, Roido; Gill, Andrew B.; Abeyakoon, Oshaani; Gilbert, Fiona J.; Priest, Andrew N.; Patterson, Andrew J.; McLean, Mary A.; Graves, Martin J.; Griffiths, John R.

    2017-01-01

    To assess the feasibility of the mono-exponential, bi-exponential and stretched-exponential models in evaluating response of breast tumours to neoadjuvant chemotherapy (NACT) at 3 T. Thirty-six female patients (median age 53, range 32-75 years) with invasive breast cancer undergoing NACT were enrolled for diffusion-weighted MRI (DW-MRI) prior to the start of treatment. For assessment of early response, changes in parameters were evaluated on mid-treatment MRI in 22 patients. DW-MRI was performed using eight b values (0, 30, 60, 90, 120, 300, 600, 900 s/mm²). Apparent diffusion coefficient (ADC), tissue diffusion coefficient (D_t), vascular fraction (Florin), distributed diffusion coefficient (DDC) and alpha (α) parameters were derived. Then t tests compared the baseline and changes in parameters between response groups. Repeatability was assessed at inter- and intraobserver levels. All patients underwent baseline MRI whereas 22 lesions were available at mid-treatment. At pretreatment, mean diffusion coefficients demonstrated significant differences between groups (p < 0.05). At mid-treatment, percentage increase in ADC and DDC showed significant differences between responders (49 % and 43 %) and non-responders (21 % and 32 %) (p = 0.03, p = 0.04). Overall, stretched-exponential parameters showed excellent repeatability. DW-MRI is sensitive to baseline and early treatment changes in breast cancer using non-mono-exponential models, and the stretched-exponential model can potentially monitor such changes. (orig.)

  18. Parameters for characterizing sites for disposal of low-level radioactive waste

    International Nuclear Information System (INIS)

    Lutton, R.J.; Malone, P.G.; Meade, R.B.; Patrick, D.M.

    1982-05-01

    Sixty-seven site parameters and parameter groups are identified as important characteristics of sites for disposal of low-level radioactive waste that require detailed evaluation. Several of the most important parameters are needed for hydrological analysis, while others are needed for facility design, construction, and operation. Still others are needed for the baseline and detection stages of monitoring. It is recommended that all parameters be evaluated by technically qualified personnel. Appropriate tests and documentation methods are discussed in a second report, which will follow. However, site-specific testing or elaborate field measurement will not always be necessary, i.e., where indicated to be unnecessary on a technical basis. Much of this report, Appendices A through G, is directed at explaining the importance of the parameters and at establishing site-specific limitations.

  19. Value of quantitative MRI parameters in predicting and evaluating clinical outcome in conservatively treated patients with chronic midportion Achilles tendinopathy: A prospective study.

    Science.gov (United States)

    Tsehaie, J; Poot, D H J; Oei, E H G; Verhaar, J A N; de Vos, R J

    2017-07-01

    To evaluate whether baseline MRI parameters provide prognostic value for clinical outcome, and to study the correlation between MRI parameters and clinical outcome. Observational prospective cohort study. Patients with chronic midportion Achilles tendinopathy were included and performed a 16-week eccentric calf-muscle exercise program. Outcome measurements were the validated Victorian Institute of Sports Assessment-Achilles (VISA-A) questionnaire and MRI parameters at baseline and after 24 weeks. The following MRI parameters were assessed: tendon volume (Volume), tendon maximum cross-sectional area (CSA), tendon maximum anterior-posterior diameter (AP), and signal intensity (SI). Intra-class correlation coefficients (ICCs) and minimum detectable changes (MDCs) for each parameter were established in a reliability analysis. Twenty-five patients were included and complete follow-up was achieved in 20 patients. The average VISA-A score increased significantly by 12.3 points (27.6%). Reliability was fair to good for all MRI parameters, with ICCs > 0.50. Average tendon volume and CSA decreased significantly by 0.28 cm³ (5.2%) and 4.52 mm² (4.6%), respectively. Other MRI parameters did not change significantly. None of the baseline MRI parameters was univariately associated with VISA-A change after 24 weeks. The increase in MRI SI over 24 weeks was positively correlated with VISA-A score improvement (B = 0.7, R² = 0.490, p = 0.02). Tendon volume and CSA decreased significantly after 24 weeks of conservative treatment. As these differences were within the MDC limits, they could be the result of measurement error. Furthermore, MRI parameters at baseline did not predict the change in symptoms, and therefore have no added value in providing a prognosis in daily clinical practice. Copyright © 2017 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  20. Quickly Updatable Hologram Images Using Poly(N-vinyl Carbazole) (PVCz) Photorefractive Polymer Composite

    Directory of Open Access Journals (Sweden)

    Wataru Sakai

    2012-08-01

    Full Text Available Quickly updatable hologram images using a photorefractive (PR) polymer composite based on poly(N-vinyl carbazole) (PVCz) are presented. PVCz is one of the pioneering photoconductive polymers. The PR polymer composite consists of 44 wt % of PVCz, 35 wt % of 4-azacycloheptylbenzylidene-malonitrile (7-DCST) as a nonlinear optical dye, 20 wt % of carbazolylethylpropionate (CzEPA) as a photoconductive plasticizer and 1 wt % of 2,4,7-trinitro-9-fluorenone (TNF) as a sensitizer. The PR composite gives a high diffraction efficiency of 68% at E = 45 V μm−1. Response speed of optical diffraction is the key parameter for a real-time 3D holographic display. The key to obtaining quickly updatable holographic images is to keep the glass transition temperature low enough to enhance chromophore orientation. An object image of a reflected coin surface, recorded with a reference beam at 532 nm (green) in the PR polymer composite, is simultaneously reconstructed using a red probe beam at 642 nm. Instead of a coin object, a computer-generated object image was displayed on a spatial light modulator (SLM) and used for the hologram. The object beam reflected from the SLM was interfered with a reference beam on the PR polymer composite to record a hologram, which was simultaneously reconstructed by a red probe beam.

  1. Update of CERN exchange network

    CERN Multimedia

    2003-01-01

    An update of the CERN exchange network will be done next April. Disturbances or even interruptions of telephony services may occur from 4th to 24th April during evenings from 18:30 to 00:00 but will not exceed more than 4 consecutive hours (see tentative planning below). In addition, the voice messaging system will be shut down on 26th March from 18:00 to 00:00. Calls supposed to be routed to the voice messaging system will not be possible during the shutdown. CERN divisions are invited to avoid any change requests (set-ups, moves or removals) of telephones and fax machines from 4th to 25th April. Everything will be done to minimize potential inconveniences which may occur during this update. There will be no loss of telephone functionalities. CERN GSM portable phones won't be affected by this change. Should you need more details, please send us your questions by email to Standard.Telephone@cern.ch. Date Change type Affected areas March 26 Update of the voice messaging system All CERN sites April...

  2. Online Sequential Projection Vector Machine with Adaptive Data Mean Update.

    Science.gov (United States)

    Chen, Lin; Jia, Ji-Ting; Zhang, Qiong; Deng, Wan-Yu; Wei, Wei

    2016-01-01

    We propose a simple online learning algorithm designed especially for high-dimensional data. The algorithm is referred to as the online sequential projection vector machine (OSPVM); it derives from the projection vector machine and can learn from data in one-by-one or chunk-by-chunk mode. In OSPVM, data centering, dimension reduction, and neural network training are integrated seamlessly. In particular, the model parameters, including (1) the projection vectors for dimension reduction, (2) the input weights, biases, and output weights, and (3) the number of hidden nodes, can be updated simultaneously. Moreover, only one parameter, the number of hidden nodes, needs to be determined manually, which makes the algorithm easy to use in real applications. Performance comparison was made on various high-dimensional classification problems for OSPVM against other fast online algorithms, including the budgeted stochastic gradient descent (BSGD) approach, the adaptive multihyperplane machine (AMM), the primal estimated subgradient solver (Pegasos), the online sequential extreme learning machine (OSELM), and SVD + OSELM (feature selection based on SVD performed before OSELM). The results obtained demonstrated the superior generalization performance and efficiency of OSPVM.
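
    The "adaptive data mean update" in the title refers to re-centering the data as samples arrive. The abstract does not give the recursion OSPVM uses, so the sketch below shows the standard incremental mean as an assumed stand-in; the function name is illustrative.

```python
def update_mean(mean, n, x):
    """Incremental mean: the average after observing sample x, given the mean of n samples."""
    return mean + (x - mean) / (n + 1)

mean, n = 0.0, 0
for x in [2.0, 4.0, 6.0]:
    mean = update_mean(mean, n, x)
    n += 1
print(mean)  # → 4.0
```

    This recursion avoids storing past samples, which is what makes one-by-one (and, with a small extension, chunk-by-chunk) centering feasible in an online setting.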

  3. Online Sequential Projection Vector Machine with Adaptive Data Mean Update

    Directory of Open Access Journals (Sweden)

    Lin Chen

    2016-01-01

    Full Text Available We propose a simple online learning algorithm designed especially for high-dimensional data. The algorithm is referred to as the online sequential projection vector machine (OSPVM); it derives from the projection vector machine and can learn from data in one-by-one or chunk-by-chunk mode. In OSPVM, data centering, dimension reduction, and neural network training are integrated seamlessly. In particular, the model parameters, including (1) the projection vectors for dimension reduction, (2) the input weights, biases, and output weights, and (3) the number of hidden nodes, can be updated simultaneously. Moreover, only one parameter, the number of hidden nodes, needs to be determined manually, which makes the algorithm easy to use in real applications. Performance comparison was made on various high-dimensional classification problems for OSPVM against other fast online algorithms, including the budgeted stochastic gradient descent (BSGD) approach, the adaptive multihyperplane machine (AMM), the primal estimated subgradient solver (Pegasos), the online sequential extreme learning machine (OSELM), and SVD + OSELM (feature selection based on SVD performed before OSELM). The results obtained demonstrated the superior generalization performance and efficiency of OSPVM.

  4. Control of Interference during Working Memory Updating

    Science.gov (United States)

    Szmalec, Arnaud; Verbruggen, Frederick; Vandierendonck, Andre; Kemps, Eva

    2011-01-01

    The current study examined the nature of the processes underlying working memory updating. In 4 experiments using the n-back paradigm, the authors demonstrate that continuous updating of items in working memory prevents strong binding of those items to their contexts in working memory, and hence leads to an increased susceptibility to proactive…

  5. 42 CFR 414.30 - Conversion factor update.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 3 2010-10-01 2010-10-01 false Conversion factor update. 414.30 Section 414.30... Practitioners § 414.30 Conversion factor update. Unless Congress acts in accordance with section 1848(d)(3) of... preceding FY over the third preceding FY exceeds the performance standard rate of increase established for...

  6. Application of Real Time Models Updating in ABO Central Field

    International Nuclear Information System (INIS)

    Heikal, S.; Adewale, D.; Doghmi, A.; Augustine, U.

    2003-01-01

    ABO central field is the first deep offshore oil production in Nigeria, located in OML 125 (ex-OPL 316). The field was developed in a water depth of between 500 and 800 meters. Deep-water development requires much faster data handling and model updates in order to make the best possible technical decisions. This required an easy way to incorporate the latest information and dynamically update the reservoir model, enabling real-time reservoir management. The paper aims at discussing the benefits of real-time static and dynamic model updates and illustrates, with a horizontal well example, how these updates were beneficial prior to and during the drilling operation, minimizing the project CAPEX. Prior to drilling, a 3D geological model was built based on seismic and offset wells' data. The geological model was updated twice, once after the pilot hole drilling and then after reaching the landing point, prior to drilling the horizontal section. Forward modeling was made as well along the planned trajectory. During the drilling process both geo-steering and LWD data were loaded in real time into the 3D modeling software. The data were analyzed and compared with the predicted model. The location of markers was changed as drilling progressed and the entire 3D geological model was rapidly updated. The target zones were re-evaluated in the light of the new model updates. Recommendations were communicated to the field, and the well trajectory was modified to take into account the new information. The combination of speed, flexibility and update-ability of the 3D modeling software enabled continuous geological model updates on which the asset team based their trajectory modification decisions throughout the drilling phase. The well was geo-steered through 7 meters of sand thickness. After the drilling, testing showed excellent productivity results, and fluid properties data were used to update the dynamic model, revising the well production plateau and providing optimum reservoir

  7. Classifying vulnerability to sleep deprivation using baseline measures of psychomotor vigilance.

    Science.gov (United States)

    Patanaik, Amiya; Kwoh, Chee Keong; Chua, Eric C P; Gooley, Joshua J; Chee, Michael W L

    2015-05-01

    To identify measures derived from baseline psychomotor vigilance task (PVT) performance that can reliably predict vulnerability to sleep deprivation. Subjects underwent total sleep deprivation and completed a 10-min PVT every 1-2 h in a controlled laboratory setting. Participants were categorized as vulnerable or resistant to sleep deprivation, based on a median split of lapses that occurred following sleep deprivation. Standard reaction time, drift diffusion model (DDM), and wavelet metrics were derived from PVT response times collected at baseline. A support vector machine model that incorporated maximum relevance and minimum redundancy feature selection and wrapper-based heuristics was used to classify subjects as vulnerable or resistant using rested data. Two academic sleep laboratories. Independent samples of 135 (69 women, age 18 to 25 y), and 45 (3 women, age 22 to 32 y) healthy adults. In both datasets, DDM measures, the number of consecutive reaction times that differ by more than 250 ms, and two wavelet features were selected by the model as features predictive of vulnerability to sleep deprivation. Using the best set of features selected in each dataset, classification accuracy was 77% and 82% using fivefold stratified cross-validation, respectively. Despite differences in experimental conditions across studies, drift diffusion model parameters associated reliably with individual differences in performance during total sleep deprivation. These results demonstrate the utility of drift diffusion modeling of baseline performance in estimating vulnerability to psychomotor vigilance decline
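
    One of the selected features, the number of consecutive reaction times differing by more than 250 ms, is straightforward to compute. A minimal sketch (the function name and the example response times are illustrative, not taken from the study):

```python
def consecutive_diff_count(rts, threshold=0.250):
    """Count adjacent reaction-time pairs differing by more than `threshold` seconds."""
    return sum(1 for a, b in zip(rts, rts[1:]) if abs(b - a) > threshold)

# Illustrative PVT response times (seconds)
rts = [0.31, 0.29, 0.62, 0.33, 0.35, 0.90]
print(consecutive_diff_count(rts))  # → 3
```

    Large adjacent jumps of this kind capture the intermittent lapses that distinguish vulnerable from resistant performers even at baseline.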

  8. Baseline characteristics predict risk of progression and response to combined medical therapy for benign prostatic hyperplasia (BPH).

    Science.gov (United States)

    Kozminski, Michael A; Wei, John T; Nelson, Jason; Kent, David M

    2015-02-01

    To better risk stratify patients, using baseline characteristics, to help optimise decision-making for men with moderate-to-severe lower urinary tract symptoms (LUTS) secondary to benign prostatic hyperplasia (BPH) through a secondary analysis of the Medical Therapy of Prostatic Symptoms (MTOPS) trial. After review of the literature, we identified potential baseline risk factors for BPH progression. Using bivariate tests in a secondary analysis of MTOPS data, we determined which variables retained prognostic significance. We then used these factors in Cox proportional hazard modelling to: i) more comprehensively risk stratify the study population based on pre-treatment parameters and ii) to determine which risk strata stood to benefit most from medical intervention. In all, 3047 men were followed in MTOPS for a mean of 4.5 years. We found varying risks of progression across quartiles. Baseline BPH Impact Index score, post-void residual urine volume, serum prostate-specific antigen (PSA) level, age, American Urological Association Symptom Index score, and maximum urinary flow rate were found to significantly correlate with overall BPH progression in multivariable analysis. Using baseline factors permits estimation of individual patient risk for clinical progression and the benefits of medical therapy. A novel clinical decision tool based on these analyses will allow clinicians to weigh patient-specific benefits against possible risks of adverse effects for a given patient. © 2014 The Authors. BJU International © 2014 BJU International.

  9. Effect of asynchronous updating on the stability of cellular automata

    International Nuclear Information System (INIS)

    Baetens, J.M.; Van der Weeën, P.; De Baets, B.

    2012-01-01

    Highlights: ► An upper bound on the Lyapunov exponent of asynchronously updated CA is established. ► The employed update method has repercussions on the stability of CAs. ► A decision on the employed update method should be taken with care. ► Substantial discrepancies arise between synchronously and asynchronously updated CA. ► Discrepancies between different asynchronous update schemes are less pronounced. - Abstract: Although cellular automata (CAs) were conceptualized as utterly discrete mathematical models in which the states of all their spatial entities are updated simultaneously at every consecutive time step, i.e. synchronously, various CA-based models that rely on so-called asynchronous update methods have been constructed in order to overcome the limitations that are tied up with the classical way of evolving CAs. So far, only a few researchers have addressed the consequences of this way of updating on the evolved spatio-temporal patterns and the reachable stationary states. In this paper, we exploit Lyapunov exponents to determine to what extent the stability of the rules within a family of totalistic CAs is affected by the underlying update method. For that purpose, we derive an upper bound on the maximum Lyapunov exponent of asynchronously iterated CAs, and show its validity, after which we present a comparative study between the Lyapunov exponents obtained for five different update methods, namely one synchronous method and four well-established asynchronous methods. It is found that the stability of CAs is seriously affected if one of the latter methods is employed, whereas the discrepancies arising between the different asynchronous methods are far less pronounced. Finally, we discuss the repercussions of our findings on the development of CA-based models.
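
    A common way to estimate a CA's maximum Lyapunov exponent, in the spirit of this abstract, is defect spreading: perturb one cell, evolve both configurations, and track the growth of their Hamming distance. The sketch below uses a synchronously updated elementary CA (rule 30, a chaotic stand-in; the paper itself studies totalistic CAs under several update methods):

```python
import numpy as np

def step_rule30(state):
    # synchronous update of elementary CA rule 30 on a periodic ring
    l, r = np.roll(state, 1), np.roll(state, -1)
    return l ^ (state | r)

rng = np.random.default_rng(0)
n, t = 200, 50
a = rng.integers(0, 2, n)
b = a.copy()
b[n // 2] ^= 1                      # single-cell defect

for _ in range(t):
    a, b = step_rule30(a), step_rule30(b)

dist = int((a != b).sum())          # Hamming distance after t steps
lyap = np.log(dist) / t             # crude finite-time exponent estimate
print(dist > 1, lyap > 0)           # the defect spreads for this chaotic rule
```

    An asynchronous variant would update one randomly chosen cell per micro-step instead of the whole ring at once, which is exactly the kind of change whose effect on stability the paper quantifies.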

  10. Haematology and Serum Biochemistry Parameters and Variations in the Eurasian Beaver (Castor fiber).

    Science.gov (United States)

    Girling, Simon J; Campbell-Palmer, Roisin; Pizzi, Romain; Fraser, Mary A; Cracknell, Jonathan; Arnemo, Jon; Rosell, Frank

    2015-01-01

    Haematology parameters (N = 24) and serum biochemistry parameters (N = 35) were determined for wild Eurasian beavers (Castor fiber), between 6 months and 12 years old. Of the population tested in this study, N = 18 Eurasian beavers were from Norway and N = 17 originated from Bavaria but now live extensively in a reserve in England. All blood samples were collected from beavers via the ventral tail vein. All beavers were chemically restrained using inhalant isoflurane in 100% oxygen prior to blood sampling. Results were determined for haematological and serum biochemical parameters for the species and were compared between the two populations, with differences in means estimated and significant differences noted. Standard blood parameters for the Eurasian beaver were determined and their ranges characterised using percentiles. Whilst the majority of blood parameters between the two populations showed no significant variation, haemoglobin, packed cell volume, mean cell haemoglobin and white blood cell counts showed significantly greater values (p < 0.05); no significant differences were noted between male and female beavers or between sexually immature and mature beavers in the animals sampled. With Eurasian beaver reintroduction encouraged by legislation throughout Europe, knowledge of baseline blood values for the species and any variations therein is essential when assessing their health and welfare and the success or failure of any reintroduction program. This is the first study to produce baseline blood values and their variations for the Eurasian beaver.

  11. A comparison of updating algorithms for large N reduced models

    Energy Technology Data Exchange (ETDEWEB)

    Pérez, Margarita García [Instituto de Física Teórica UAM-CSIC, Universidad Autónoma de Madrid,Nicolás Cabrera 13-15, E-28049-Madrid (Spain); González-Arroyo, Antonio [Instituto de Física Teórica UAM-CSIC, Universidad Autónoma de Madrid,Nicolás Cabrera 13-15, E-28049-Madrid (Spain); Departamento de Física Teórica, C-XI Universidad Autónoma de Madrid,E-28049 Madrid (Spain); Keegan, Liam [PH-TH, CERN,CH-1211 Geneva 23 (Switzerland); Okawa, Masanori [Graduate School of Science, Hiroshima University,Higashi-Hiroshima, Hiroshima 739-8526 (Japan); Core of Research for the Energetic Universe, Hiroshima University,Higashi-Hiroshima, Hiroshima 739-8526 (Japan); Ramos, Alberto [PH-TH, CERN,CH-1211 Geneva 23 (Switzerland)

    2015-06-29

    We investigate Monte Carlo updating algorithms for simulating SU(N) Yang-Mills fields on a single-site lattice, such as for the Twisted Eguchi-Kawai model (TEK). We show that performing only over-relaxation (OR) updates of the gauge links is a valid simulation algorithm for the Fabricius and Haan formulation of this model, and that this decorrelates observables faster than using heat-bath updates. We consider two different methods of implementing the OR update: either updating the whole SU(N) matrix at once, or iterating through SU(2) subgroups of the SU(N) matrix. We find the same critical exponent in both cases, and only a slight difference between the two.
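
    Over-relaxation is easiest to see in a simpler spin model: reflect each variable about the direction of its local field, which leaves the energy exactly invariant (so in general it must be mixed with an ergodic update such as heat-bath). The sketch below uses a periodic XY chain rather than SU(N) gauge links; it is an illustrative analogue of the OR move, not the TEK simulation itself.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64
theta = rng.uniform(0, 2 * np.pi, N)  # XY spins on a periodic chain

def energy(theta):
    return -np.cos(theta - np.roll(theta, 1)).sum()

def or_sweep(theta):
    # Over-relaxation: reflect each spin about the direction of its local field.
    # The local energy -|h| cos(theta_i - phi) is invariant under theta_i -> 2*phi - theta_i.
    for i in range(len(theta)):
        left, right = theta[(i - 1) % len(theta)], theta[(i + 1) % len(theta)]
        hx, hy = np.cos(left) + np.cos(right), np.sin(left) + np.sin(right)
        phi = np.arctan2(hy, hx)
        theta[i] = 2 * phi - theta[i]
    return theta

e0 = energy(theta)
theta = or_sweep(theta)
print(abs(energy(theta) - e0) < 1e-9)  # → True: OR moves conserve the energy
```

    Because the move is large yet energy-conserving, it explores the constant-energy surface quickly, which is the intuition behind the faster decorrelation reported for OR-only updating.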

  12. A comparison of updating algorithms for large $N$ reduced models

    CERN Document Server

    Pérez, Margarita García; Keegan, Liam; Okawa, Masanori; Ramos, Alberto

    2015-01-01

    We investigate Monte Carlo updating algorithms for simulating $SU(N)$ Yang-Mills fields on a single-site lattice, such as for the Twisted Eguchi-Kawai model (TEK). We show that performing only over-relaxation (OR) updates of the gauge links is a valid simulation algorithm for the Fabricius and Haan formulation of this model, and that this decorrelates observables faster than using heat-bath updates. We consider two different methods of implementing the OR update: either updating the whole $SU(N)$ matrix at once, or iterating through $SU(2)$ subgroups of the $SU(N)$ matrix. We find the same critical exponent in both cases, and only a slight difference between the two.

  13. An Automated Baseline Correction Method Based on Iterative Morphological Operations.

    Science.gov (United States)

    Chen, Yunliang; Dai, Liankui

    2018-05-01

    Raman spectra usually suffer from baseline drift caused by fluorescence or other reasons. Therefore, baseline correction is a necessary and crucial step that must be performed before subsequent processing and analysis of Raman spectra. An automated baseline correction method based on iterative morphological operations is proposed in this work. The method can adaptively determine the structuring element first and then gradually remove the spectral peaks during iteration to get an estimated baseline. Experiments on simulated data and real-world Raman data show that the proposed method is accurate, fast, and flexible for handling different kinds of baselines in various practical situations. The comparison of the proposed method with some state-of-the-art baseline correction methods demonstrates its advantages over the existing methods in terms of accuracy, adaptability, and flexibility. Although only Raman spectra are investigated in this paper, the proposed method is hopefully to be used for the baseline correction of other analytical instrumental signals, such as IR spectra and chromatograms.
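
    The core operation can be sketched with plain NumPy: grey-scale erosion and dilation with a flat window, composed into an opening that suppresses peaks narrower than the window. This is a single-pass simplification of the paper's iterative scheme, and the window size and test signal are made up for illustration.

```python
import numpy as np

def grey_erode(y, w):
    # sliding-window minimum (flat structuring element of half-width w)
    pad = np.pad(y, w, mode='edge')
    return np.array([pad[i:i + 2 * w + 1].min() for i in range(len(y))])

def grey_dilate(y, w):
    # sliding-window maximum
    pad = np.pad(y, w, mode='edge')
    return np.array([pad[i:i + 2 * w + 1].max() for i in range(len(y))])

def baseline_opening(y, w):
    # morphological opening = erosion followed by dilation;
    # removes peaks narrower than the window while tracking the slow baseline
    return grey_dilate(grey_erode(y, w), w)

# A slowly varying baseline with one sharp Raman-like peak
x = np.linspace(0, 1, 200)
baseline = 2 + x          # linear drift
y = baseline.copy()
y[95:105] += 10.0         # narrow peak
est = baseline_opening(y, 15)
print(np.abs(est - baseline).max() < 0.2)  # → True: the drift is recovered
```

    The paper's method additionally chooses the structuring element adaptively and iterates until the peak contribution is gone; the opening above is the building block each iteration applies.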

  14. Energy Economic Data Base (EEDB) Program: Phase VI update (1983) report

    International Nuclear Information System (INIS)

    1984-09-01

    This update of the Energy Economic Data Base is the latest in a series of technical and cost studies prepared by United Engineers and Constructors Inc., during the last 18 years. The data base was developed during 1978 and has been updated annually since then. The purpose of the updates has been to reflect the impact of changing regulations and technology on the costs of electric power generating stations. This Phase VI (Sixth) Update report documents the results of the 1983 EEDB Program update effort. The latest effort was a comprehensive update of the technical and capital cost information for the pressurized water reactor, boiling water reactor, and liquid metal fast breeder reactor nuclear power plant data models and for the 800 MWe and 500 MWe high sulfur coal-fired power plant data models. The update provided representative costs for these nuclear and coal-fired power plants for the 1980's. In addition, the updated nuclear power plant data models for the 1980's were modified to provide anticipated costs for nuclear power plants for the 1990's. Consequently, the Phase VI Update has continued to provide important benchmark information through which technical and capital cost trends may be identified that have occurred since January 1, 1978

  15. Indoor Spatial Updating with Reduced Visual Information.

    Science.gov (United States)

    Legge, Gordon E; Gage, Rachel; Baek, Yihwa; Bochsler, Tiana M

    2016-01-01

    Spatial updating refers to the ability to keep track of position and orientation while moving through an environment. People with impaired vision may be less accurate in spatial updating with adverse consequences for indoor navigation. In this study, we asked how artificial restrictions on visual acuity and field size affect spatial updating, and also judgments of the size of rooms. Normally sighted young adults were tested with artificial restriction of acuity in Mild Blur (Snellen 20/135) and Severe Blur (Snellen 20/900) conditions, and a Narrow Field (8°) condition. The subjects estimated the dimensions of seven rectangular rooms with and without these visual restrictions. They were also guided along three-segment paths in the rooms. At the end of each path, they were asked to estimate the distance and direction to the starting location. In Experiment 1, the subjects walked along the path. In Experiment 2, they were pushed in a wheelchair to determine if reduced proprioceptive input would result in poorer spatial updating. With unrestricted vision, mean Weber fractions for room-size estimates were near 20%. Severe Blur but not Mild Blur yielded larger errors in room-size judgments. The Narrow Field was associated with increased error, but less than with Severe Blur. There was no effect of visual restriction on estimates of distance back to the starting location, and only Severe Blur yielded larger errors in the direction estimates. Contrary to expectation, the wheelchair subjects did not exhibit poorer updating performance than the walking subjects, nor did they show greater dependence on visual condition. If our results generalize to people with low vision, severe deficits in acuity or field will adversely affect the ability to judge the size of indoor spaces, but updating of position and orientation may be less affected by visual impairment.

  16. Indoor Spatial Updating with Reduced Visual Information.

    Directory of Open Access Journals (Sweden)

    Gordon E Legge

    Full Text Available Spatial updating refers to the ability to keep track of position and orientation while moving through an environment. People with impaired vision may be less accurate in spatial updating with adverse consequences for indoor navigation. In this study, we asked how artificial restrictions on visual acuity and field size affect spatial updating, and also judgments of the size of rooms. Normally sighted young adults were tested with artificial restriction of acuity in Mild Blur (Snellen 20/135) and Severe Blur (Snellen 20/900) conditions, and a Narrow Field (8°) condition. The subjects estimated the dimensions of seven rectangular rooms with and without these visual restrictions. They were also guided along three-segment paths in the rooms. At the end of each path, they were asked to estimate the distance and direction to the starting location. In Experiment 1, the subjects walked along the path. In Experiment 2, they were pushed in a wheelchair to determine if reduced proprioceptive input would result in poorer spatial updating. With unrestricted vision, mean Weber fractions for room-size estimates were near 20%. Severe Blur but not Mild Blur yielded larger errors in room-size judgments. The Narrow Field was associated with increased error, but less than with Severe Blur. There was no effect of visual restriction on estimates of distance back to the starting location, and only Severe Blur yielded larger errors in the direction estimates. Contrary to expectation, the wheelchair subjects did not exhibit poorer updating performance than the walking subjects, nor did they show greater dependence on visual condition. If our results generalize to people with low vision, severe deficits in acuity or field will adversely affect the ability to judge the size of indoor spaces, but updating of position and orientation may be less affected by visual impairment.

  17. Spatial snowdrift game in heterogeneous agent systems with co-evolutionary strategies and updating rules

    International Nuclear Information System (INIS)

    Xia Hai-Jiang; Li Ping-Ping; Ke Jian-Hong; Lin Zhen-Quan

    2015-01-01

    We propose an evolutionary snowdrift game model for heterogeneous systems with two types of agents, in which the inner-directed agents adopt the memory-based updating rule while the copycat-like ones take the unconditional imitation rule; moreover, each agent can change his type and adopt the other updating rule once the number of games he loses in succession exceeds his upper limit of tolerance. The cooperative behaviors of such heterogeneous systems are then investigated by Monte Carlo simulations. The numerical results show that the equilibrium cooperation frequency and composition, as functions of the cost-to-benefit ratio r, both have plateau structures with discontinuous step-like jumps, and the number of plateaux varies non-monotonically with the upper limit of tolerance ν_T as well as the initial composition of agents f_a0. Besides, the cooperation frequency and composition depend crucially on the system parameters, including ν_T, f_a0, and r. One intriguing observation is that when the upper limit of tolerance is small, the cooperation frequency is abnormally enhanced as the cost-to-benefit ratio increases in the range 0 < r < 1/4. We then probe into the relative cooperation frequencies of either type of agents, which also have plateau structures dependent on the system parameters. Our results may be helpful for understanding the cooperative behaviors of heterogeneous agent systems. (paper)

  18. CKM parameter fits, the Bs0- anti Bs0 mixing ratio xs and CP-violating phases in B decays

    International Nuclear Information System (INIS)

    Ali, A.; London, D.

    1993-02-01

    We review and update constraints on the parameters of the flavour mixing matrix (V_CKM) in the Standard Model. In performing these fits, we use inputs from the measurements of |ε|, the CP-violating parameter in K decays, x_d = (ΔM)/Γ, the mixing parameter in B⁰_d - anti-B⁰_d mixing, and the present measurements of the matrix elements |V_cb| and |V_ub|. We take into account the next-to-leading order QCD results in our analysis, wherever available, and incorporate results stemming from the ongoing lattice calculations of the B-meson coupling constants, which predict a value f_Bd = 200 ± 30 MeV, though for the sake of comparison we also show the CKM fits for smaller values of f_Bd. We use the updated CKM matrix to predict the mixing ratio x_s, relevant for B⁰_s - anti-B⁰_s mixing, and the phases in the CKM unitarity triangle, sin 2α, sin 2β and sin 2γ, which determine the CP-violating asymmetries in B decays. The importance of measuring the ratio x_s in restricting the allowed values of the CKM parameters is emphasized. (orig.)
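
    The unitarity-triangle phases quoted here follow from the orthogonality of the d and b columns of V_CKM; for reference, the standard definitions (not specific to this paper's fit) are:

```latex
% Orthogonality of the d and b columns of the CKM matrix:
V_{ud}V_{ub}^{*} + V_{cd}V_{cb}^{*} + V_{td}V_{tb}^{*} = 0,
% whose interior angles are the CP phases quoted in the abstract:
\alpha = \arg\!\left(-\frac{V_{td}V_{tb}^{*}}{V_{ud}V_{ub}^{*}}\right), \quad
\beta  = \arg\!\left(-\frac{V_{cd}V_{cb}^{*}}{V_{td}V_{tb}^{*}}\right), \quad
\gamma = \arg\!\left(-\frac{V_{ud}V_{ub}^{*}}{V_{cd}V_{cb}^{*}}\right),
\qquad \alpha + \beta + \gamma = \pi.
```

    The asymmetries sin 2α, sin 2β and sin 2γ measured in B decays constrain these angles and hence the allowed region for the CKM parameters.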

  19. Rationing with baselines

    DEFF Research Database (Denmark)

    Hougaard, Jens Leth; Moreno-Ternero, Juan D.; Østerdal, Lars Peter Raahave

    2013-01-01

    We introduce a new operator for general rationing problems in which, besides conflicting claims, individual baselines play an important role in the rationing process. The operator builds on ideas of composition, which are frequent not only in rationing but also in related problems such as bargaining, choice, and queuing. We characterize the operator and show how it preserves some standard axioms in the literature on rationing. We also relate it to recent contributions in that literature.

  20. Impact of the updating scheme on stationary states of networks

    International Nuclear Information System (INIS)

    Radicchi, F; Ahn, Y Y; Meyer-Ortmanns, H

    2008-01-01

    From Boolean networks it is well known that the number of attractors as a function of the system size depends on the updating scheme, which is chosen either synchronously or asynchronously. In this contribution, we report on a systematic interpolation between synchronous and asynchronous updating in a one-dimensional chain of Ising spins. The stationary state for fully synchronous updating is antiferromagnetic. The interpolation allows us to locate a phase transition between phases with an absorbing and a fluctuating stationary state. The associated universality class is that of parity conservation. We also report on a more recent study of asynchronous updates applied to the yeast cell-cycle network. Compared to the synchronous update, the basin of attraction of the largest attractor shrinks considerably, and the convergence to the biological pathway slows down and is less dominant. Both examples illustrate how sensitively the stationary states and the properties of attractors can depend on the updating mode of the algorithm.
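    The interpolation idea can be sketched as a single update probability p per site and sweep, with p = 1 fully synchronous and small p approaching asynchronous updating. The anti-aligning toy rule below is our own assumption, chosen so that the fully synchronous stationary state is antiferromagnetic, as stated above:

```python
import random

def sweep(spins, p, rng):
    # One sweep over a ring of Ising spins; each site takes part in the
    # synchronous update with probability p, otherwise it is left unchanged.
    n = len(spins)
    new = spins[:]
    for i in range(n):
        if rng.random() < p:
            left, right = spins[(i - 1) % n], spins[(i + 1) % n]
            s = left + right
            if s > 0:
                new[i] = -1   # anti-align with the neighbourhood majority
            elif s < 0:
                new[i] = 1
            # s == 0: keep the current spin
    return new

def staggered_magnetization(spins):
    # Order parameter: 1.0 for a perfect antiferromagnetic state.
    return abs(sum(((-1) ** i) * s for i, s in enumerate(spins))) / len(spins)
```

Running many sweeps at different p and tracking the order parameter is the kind of experiment that would expose a transition between absorbing and fluctuating stationary states.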

  1. Map updates in a dynamic Voronoi data structure

    DEFF Research Database (Denmark)

    Mioc, Darka; Antón Castro, Francesc/François; Gold, C. M.

    2006-01-01

    In this paper we use local and sequential map updates in the Voronoi data structure, which allows us to automatically record each event and the map updates performed within the system. These map updates are executed through map construction commands that are composed of atomic actions (geometric algorithms for addition, deletion, and motion of spatial objects) on the dynamic Voronoi data structure. The formalization of map commands led to the development of a spatial language comprising a set of atomic operations or constructs on spatial primitives (points and lines), powerful enough to define...

  2. Effects of triplet Higgs bosons in long baseline neutrino experiments

    Science.gov (United States)

    Huitu, K.; Kärkkäinen, T. J.; Maalampi, J.; Vihonen, S.

    2018-05-01

    The triplet scalars (Δ = Δ++, Δ+, Δ0) utilized in the so-called type-II seesaw model to explain the lightness of neutrinos would generate nonstandard interactions (NSI) for a neutrino propagating in matter. We investigate the prospects of probing these interactions in long-baseline neutrino oscillation experiments. We analyze the upper bounds that the proposed DUNE experiment might set on the nonstandard parameters and numerically derive upper bounds, as a function of the lightest neutrino mass, on the ratio of the mass M_Δ of the triplet scalars to the strength |λ_φ| of the coupling φφΔ of the triplet Δ and the conventional Higgs doublet φ. We also discuss the possible misinterpretation of these effects as effects arising from a nonunitarity of the neutrino mixing matrix and compare the results with the bounds that arise from charged lepton flavor violating processes.

  3. Key Techniques for Dynamic Updating of National Fundamental Geographic Information Database

    Directory of Open Access Journals (Sweden)

    WANG Donghua

    2015-07-01

    One of the most important missions of fundamental surveying and mapping work is to keep fundamental geographic information up to date. To this end, the National Administration of Surveying, Mapping and Geoinformation launched the project of dynamic updating of the national fundamental geographic information database in 2012, which aims to update the 1:50 000, 1:250 000 and 1:1 000 000 national fundamental geographic information databases continuously and quickly, updating and publishing once a year. This paper introduces the general technical approach of dynamic updating, describes the main technical methods, such as dynamic updating of the fundamental database, linked updating of derived databases, and multi-temporal database management and service, and finally introduces the main technical characteristics and engineering applications.

  4. Mercury baseline levels in Flemish soils (Belgium)

    International Nuclear Information System (INIS)

    Tack, Filip M.G.; Vanhaesebroeck, Thomas; Verloo, Marc G.; Van Rompaey, Kurt; Ranst, Eric van

    2005-01-01

    It is important to establish contaminant levels that are normally present in soils to provide baseline data for pollution studies. Mercury is a toxic element of concern. This study was aimed at assessing baseline mercury levels in soils in Flanders. In a previous study, mercury contents in soils in Oost-Vlaanderen were found to be significantly above levels reported elsewhere. For the current study, observations were extended over two more provinces, West-Vlaanderen and Antwerpen. Ranges of soil Hg contents were distinctly higher in the province Oost-Vlaanderen (interquartile range from 0.09 to 0.43 mg/kg) than in the other provinces (interquartile ranges from 0.07 to 0.13 and 0.07 to 0.15 mg/kg for West-Vlaanderen and Antwerpen, respectively). The standard threshold method was applied to separate soils containing baseline levels of Hg from the data. Baseline concentrations for Hg were characterised by a median of 0.10 mg Hg/kg dry soil, an interquartile range from 0.07 to 0.14 mg/kg and a 90% percentile value of 0.30 mg/kg. The influence of soil properties such as clay and organic carbon contents, and pH, on baseline Hg concentrations was not important. Maps of the spatial distribution of Hg levels showed that the province Oost-Vlaanderen exhibited zones with systematically higher Hg soil contents. This may be related to the former presence of many small-scale industries employing mercury in that region. - Increased mercury levels may reflect human activity.

  5. Esophageal baseline impedance levels in patients with pathophysiological characteristics of functional heartburn.

    Science.gov (United States)

    Martinucci, I; de Bortoli, N; Savarino, E; Piaggi, P; Bellini, M; Antonelli, A; Savarino, V; Frazzoni, M; Marchi, S

    2014-04-01

    Recently, it has been suggested that low esophageal basal impedance may reflect impaired mucosal integrity and increased acid sensitivity. We aimed to compare baseline impedance levels in patients with heartburn and pathophysiological characteristics related to functional heartburn (FH), divided into two groups on the basis of symptom relief after proton pump inhibitors (PPIs). Patients with heartburn and negative endoscopy were treated with esomeprazole or pantoprazole 40 mg daily for 8 weeks. According to MII-pH (off therapy) analysis, patients with normal acid exposure time (AET), normal reflux number, and lack of association between symptoms and refluxes were selected; of these, 30 patients with symptom relief higher than 50% after PPIs composed Group A, and 30 patients, matched for sex and age, without symptom relief composed Group B. A group of 20 healthy volunteers (HVs) was enrolled. For each patient and HV, we evaluated the baseline impedance levels at channel 3, during the overnight rest, at three different times. Group A (vs Group B) showed an increase in the following parameters: mean AET (1.4 ± 0.8% vs 0.5 ± 0.6%), mean reflux number (30.4 ± 8.7 vs 24 ± 6.9), proximal reflux number (11.1 ± 5.2 vs 8.2 ± 3.6), acid reflux number (17.9 ± 6.1 vs 10.7 ± 6.9). Baseline impedance levels were lower in Group A than in Group B and in HVs. Evaluating baseline impedance levels in patients with heartburn and normal AET could achieve a better understanding of pathophysiology in reflux disease patients, and could improve the distinction between FH and hypersensitive esophagus. © 2014 John Wiley & Sons Ltd.

  6. Implications of the top quark mass measurement for the CKM parameters, x$_{s}$ and CP asymmetries

    CERN Document Server

    Ali, A; London, D

    1995-01-01

    Motivated by the recent determination of the top quark mass by the CDF collaboration, m_t = 174 ± 10 (+13/−12) GeV, we review and update the constraints on the parameters of the quark flavour mixing matrix V_CKM in the standard model. In performing our fits, we use inputs from the measurements of the following quantities: (i) |ε|, the CP-violating parameter in K decays, (ii) ΔM_d, the mass difference due to B_d-anti-B_d mixing, (iii) the matrix elements |V_cb| and |V_ub|, and (iv) B-hadron lifetimes. We find that the allowed region of the unitarity triangle is very large, mostly due to theoretical uncertainties. (This emphasizes the importance of measurements of CP-violating rate asymmetries in the B system.) Nevertheless, the present data do somewhat restrict the allowed values of the coupling constant product f_{B_d}√(B̂_{B_d}) and the renormalization-scale invariant bag constant B̂_K. With the updated CKM matrix we present the currently-allowed range of the ratio |V_{td}/V...

  7. Tank waste remediation system technical baseline summary description

    International Nuclear Information System (INIS)

    Raymond, R.E.

    1998-01-01

    This document is one of the tools used to develop and control the mission work as depicted in the included figure. This Technical Baseline Summary Description document is the top-level tool for management of the Technical Baseline for waste storage operations

  8. Neutrino oscillations on the way to long-baseline experiments

    CERN Document Server

    Ryabov, V A

    2003-01-01

    The motivations and physical objectives of experiments in the search for ν_μ → ν_e, ν_τ oscillations in long-baseline accelerator neutrino beams are reviewed. Neutrino beams, detectors, and methods for detecting oscillations (detection of the disappearance of ν_μ, and the appearance of ν_e and ν_τ) in the current K2K (KEK to Super-Kamiokande) experiment and in the MINOS (FNAL to Soudan) and OPERA (CERN to Gran Sasso) near-future experiments are discussed. Possibilities of measuring the oscillation parameters in these experiments are considered in connection with new data obtained in the CHOOZ and Palo Verde reactor experiments, the solar neutrino deficit and the ν_μ/ν_e anomaly of atmospheric neutrinos, which are observed in large-scale underground detectors, and the excess of ν_e events in the LSND experiment. Neutrino-oscillation scenarios used in models with three and four (including sterile) types of neutrino, as well as the possibility...

  9. Carbon tetrachloride ERA soil-gas baseline monitoring

    International Nuclear Information System (INIS)

    Fancher, J.D.

    1994-01-01

    From December 1991 through December 1993, Westinghouse Hanford Company performed routine baseline monitoring of selected wells and soil-gas points twice weekly in the 200 West Area of the Hanford Site. This work supported the Carbon Tetrachloride Expedited Response Action (ERA) and provided a solid baseline of volatile organic compound (VOC) concentrations in wells and in the subsurface at the ERA site. As site remediation continues, comparisons to this baseline can be one means of measuring the success of carbon tetrachloride vapor extraction. This report contains observations of the patterns and trends associated with data obtained during soil-gas monitoring at the 200 West Area. Monitoring performed since late 1991 includes monitoring soil-gas probes and wellheads for VOCs. This report reflects monitoring data collected from December 1991 through December 1993.

  10. Energy Consumption Analysis for Concrete Residences—A Baseline Study in Taiwan

    Directory of Open Access Journals (Sweden)

    Kuo-Liang Lin

    2017-02-01

    Estimating building energy consumption is difficult because it deals with complex interactions among uncertain weather conditions, occupant behaviors, and building characteristics. To facilitate estimation, this study employs a benchmarking methodology to obtain energy baselines for sample buildings. Utilizing a scientific simulation tool, this study attempts to develop energy consumption baselines of two typical concrete residences in Taiwan, and subsequently allows a simplified energy consumption prediction process at an early design stage of building development. Using weather data of three metropolitan cities as testbeds, annual energy consumption of two types of modern residences are determined through a series of simulation sessions with different building settings. The impacts of key building characteristics, including building insulation, air tightness, orientation, location, and residence type, are carefully investigated. Sample utility bills are then collected to validate the simulated results, resulting in three adjustment parameters for normalization, including ‘number of residents’, ‘total floor area’, and ‘air conditioning comfort level’, for justification of occupant behaviors in different living conditions. Study results not only provide valuable benchmarking data serving as references for performance evaluation of different energy-saving strategies, but also show how effective extended building insulation, enhanced air tightness, and prudent selection of residence location and orientation can be for successful implementation of building sustainability in tropical and subtropical regions.

  11. Adaptive function projective synchronization of two-cell Quantum-CNN chaotic oscillators with uncertain parameters

    International Nuclear Information System (INIS)

    Sudheer, K. Sebastian; Sabir, M.

    2009-01-01

    This work investigates function projective synchronization of two-cell Quantum-CNN chaotic oscillators using an adaptive method. Quantum-CNN oscillators produce nanoscale chaotic oscillations under certain conditions. By Lyapunov stability theory, the adaptive control law and the parameter update law are derived to make the states of the two chaotic systems function projective synchronized. Numerical simulations are presented to demonstrate the effectiveness of the proposed adaptive controllers.
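    The generic shape of such a Lyapunov-based scheme can be written out for a drive system x and a response system y with unknown parameter vector θ; the notation below (scaling function α(t), regressor F(y), gains k and γ) is a standard textbook form, not necessarily the paper's exact formulation:

```latex
\dot{x} = f(x), \qquad
\dot{y} = g(y) + F(y)\,\theta + u, \qquad
e = y - \alpha(t)\,x .
\]
Choosing the control law and parameter update law
\[
u = \alpha(t) f(x) + \dot{\alpha}(t)\,x - g(y) - F(y)\,\hat{\theta} - k\,e,
\qquad
\dot{\hat{\theta}} = \gamma\, F(y)^{\!\top} e,
\]
gives the error dynamics $\dot{e} = F(y)\,\tilde{\theta} - k\,e$ with
$\tilde{\theta} = \theta - \hat{\theta}$. The Lyapunov function
\[
V = \tfrac{1}{2}\, e^{\top} e
  + \tfrac{1}{2\gamma}\, \tilde{\theta}^{\top} \tilde{\theta}
\quad\Longrightarrow\quad
\dot{V} = e^{\top} F(y)\,\tilde{\theta} - k\, e^{\top} e
        - \tilde{\theta}^{\top} F(y)^{\!\top} e
        = -\,k\, e^{\top} e \le 0,
```

so the synchronization error e converges to zero while the parameter estimates stay bounded, which is the structure of the argument the abstract summarizes.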

  12. Statistical and perceptual updating: correlated impairments in right brain injury.

    Science.gov (United States)

    Stöttinger, Elisabeth; Filipowicz, Alex; Marandi, Elahe; Quehl, Nadine; Danckert, James; Anderson, Britt

    2014-06-01

    It has been hypothesized that many of the cognitive impairments commonly seen after right brain damage (RBD) can be characterized as a failure to build or update mental models. We (Danckert et al. in Neglect as a disorder of representational updating. NOVA Open Access, New York, 2012a; Cereb Cortex 22:2745-2760, 2012b) were the first to directly assess the association between RBD and updating, and found that RBD patients were unable to exploit a strongly biased play strategy in their opponent in the children's game rock, paper, scissors (RPS). Given that this game requires many other cognitive capacities (i.e., working memory, sustained attention, reward processing), RBD patients could have failed this task for various reasons other than a failure to update. To assess the generality of updating deficits after RBD, we had RBD patients, left brain-damaged (LBD) patients, and healthy controls (HCs) describe line drawings that evolved gradually from one figure (e.g., rabbit) to another (e.g., duck), in addition to the RPS updating task. RBD patients took significantly longer to alter their perceptual report from the initial object to the final object than did LBD patients and HCs. Although both patient groups performed poorly on the RPS task, only the RBD patients showed a significant correlation between the two very different updating tasks. We suggest these data indicate a general deficiency in the ability to update mental representations following RBD.

  13. Accounting for the role of hematocrit in between-subject variations of MRI-derived baseline cerebral hemodynamic parameters and functional BOLD responses.

    Science.gov (United States)

    Xu, Feng; Li, Wenbo; Liu, Peiying; Hua, Jun; Strouse, John J; Pekar, James J; Lu, Hanzhang; van Zijl, Peter C M; Qin, Qin

    2018-01-01

    Baseline hematocrit fraction (Hct) is a determinant of baseline cerebral blood flow (CBF), and between-subject variation of Hct thus causes variation in task-based BOLD fMRI signal changes. We first verified in healthy volunteers (n = 12) that Hct values can be derived reliably from venous blood T1 values by comparison with the conventional lab test. Together with CBF measured using phase-contrast MRI, this noninvasive estimation of Hct, instead of using a population-averaged Hct value, enabled more individual determination of oxygen delivery (DO2), oxygen extraction fraction (OEF), and cerebral metabolic rate of oxygen (CMRO2). The inverse correlation of CBF and Hct explained about 80% of between-subject variation of CBF in this relatively uniform cohort of subjects, as expected based on the regulation of DO2 to maintain constant CMRO2. Furthermore, we compared the relationships of the visual task-evoked BOLD response with Hct and CBF. We showed that Hct and CBF contributed 22%-33% of the variance in the BOLD signal, and removing the positive correlation with Hct and the negative correlation with CBF allowed normalization of the BOLD signal with 16%-22% lower variability. The results of this study suggest that adjustment for Hct effects is useful for studies of MRI perfusion and BOLD fMRI. Hum Brain Mapp 39:344-353, 2018. © 2017 Wiley Periodicals, Inc.
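    The physiological relationships behind the abstract (DO2 = CBF × arterial O2 content, CMRO2 = DO2 × OEF) can be made concrete with textbook approximations; the constants and the Hb ≈ Hct/3 rule of thumb below are generic illustrations, not values from this study:

```python
O2_PER_G_HB = 1.34  # ml O2 carried per gram of hemoglobin (textbook value)

def oxygen_metrics(hct, cbf_ml_per_100g_min, sa_o2=0.98, sv_o2=0.60):
    hb = hct * 100 / 3.0                      # hemoglobin in g/dl, rule of thumb
    ca_o2 = O2_PER_G_HB * hb * sa_o2          # arterial O2 content, ml O2/dl
    do2 = cbf_ml_per_100g_min * ca_o2 / 100   # O2 delivery, ml O2/100 g/min
    oef = (sa_o2 - sv_o2) / sa_o2             # oxygen extraction fraction
    cmro2 = do2 * oef                         # O2 consumption, ml O2/100 g/min
    return do2, oef, cmro2
```

Because CMRO2 scales with the product Hct × CBF, a subject with lower Hct needs proportionally higher CBF to deliver the same oxygen, which is the inverse Hct-CBF correlation the study reports.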

  14. 324 Building Baseline Radiological Characterization

    Energy Technology Data Exchange (ETDEWEB)

    R.J. Reeder, J.C. Cooper

    2010-06-24

    This report documents the analysis of radiological data collected as part of the characterization study performed in 1998. The study was performed to create a baseline of the radiological conditions in the 324 Building.

  15. Supporting Frequent Updates in R-Trees: A Bottom-Up Approach

    DEFF Research Database (Denmark)

    Lee, Mong Li; Hsu, Wynne; Jensen, Christian Søndergaard

    2003-01-01

    Advances in hardware-related technologies promise to enable new data management applications that monitor continuous processes. In these applications, enormous amounts of state samples are obtained via sensors and are streamed to a database. Further, updates are very frequent and may exhibit... The proposed technique aims to improve update performance. It has different levels of reorganization, ranging from global to local, during updates, avoiding expensive top-down updates. A compact main-memory summary structure that allows direct access to the R-tree index nodes is used together with efficient bottom-up...
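    The core idea, a main-memory summary that maps each object id directly to its leaf so a small positional change avoids a top-down traversal, can be sketched in a toy form; the class and method names below are ours, and the top-down fallback path is deliberately stubbed out:

```python
class Leaf:
    def __init__(self, mbr):
        self.mbr = mbr         # minimum bounding rectangle (xmin, ymin, xmax, ymax)
        self.entries = {}      # object id -> (x, y)

def contains(mbr, p):
    xmin, ymin, xmax, ymax = mbr
    x, y = p
    return xmin <= x <= xmax and ymin <= y <= ymax

class BottomUpIndex:
    def __init__(self):
        self.leaf_of = {}          # summary structure: object id -> Leaf
        self.slow_path_count = 0   # updates that would need top-down work

    def insert(self, oid, pos, leaf):
        leaf.entries[oid] = pos
        self.leaf_of[oid] = leaf

    def update(self, oid, new_pos):
        leaf = self.leaf_of[oid]
        if contains(leaf.mbr, new_pos):
            leaf.entries[oid] = new_pos   # cheap local (bottom-up) update
            return "local"
        # Otherwise fall back to delete + top-down reinsert (not shown here);
        # intermediate levels of reorganization would sit between these extremes.
        self.slow_path_count += 1
        return "top-down"
```

When updates exhibit locality, most of them land in the `"local"` branch, which is the source of the performance gain the abstract describes.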

  16. The distance effect in numerical memory-updating tasks.

    Science.gov (United States)

    Lendínez, Cristina; Pelegrina, Santiago; Lechuga, Teresa

    2011-05-01

    Two experiments examined the role of numerical distance in updating numerical information in working memory. In the first experiment, participants had to memorize a new number only when it was smaller than a previously memorized number. In the second experiment, updating was based on an external signal, which removed the need to perform any numerical comparison. In both experiments, the distance between the memorized number and the new one was manipulated. The results showed that smaller distances between the new and the old information led to shorter updating times. This graded facilitation suggests that the process by which information is substituted in the focus of attention involves keeping the features shared between the new and the old number activated while selecting only the new features for activation. Thus, the updating cost may be related to the amount of new features to be activated in the focus of attention.

  17. Low-Power Bitstream-Residual Decoder for H.264/AVC Baseline Profile Decoding

    Directory of Open Access Journals (Sweden)

    Xu Ke

    2009-01-01

    We present the design and VLSI implementation of a novel low-power bitstream-residual decoder for the H.264/AVC baseline profile. It comprises a syntax parser, a parameter decoder, and an Inverse Quantization Inverse Transform (IQIT) decoder. The syntax parser detects and decodes each incoming codeword in the bitstream under the control of a hierarchical Finite State Machine (FSM); the IQIT decoder performs inverse transform and quantization with pipelining and parallelism. Various power reduction techniques, such as data-driven processing based on statistical results, nonuniform partition, precomputation, guarded evaluation, hierarchical FSM decomposition, the TAG method, zero-block skipping, and clock gating, are adopted and integrated throughout the bitstream-residual decoder. With an innovative architecture, the proposed design is able to decode QCIF video sequences at 30 fps at a clock rate as low as 1.5 MHz. A prototype H.264/AVC baseline decoding chip utilizing the proposed decoder was fabricated in UMC 0.18 μm 1P6M CMOS technology. The proposed design was measured under supply voltages from 1 V to 1.8 V in 0.1 V steps. It dissipates 76 μW at 1 V and 253 μW at 1.8 V.

  18. Environmental Regulatory Update Table, January/February 1995

    Energy Technology Data Exchange (ETDEWEB)

    Houlberg, L.M.; Hawkins, G.T.; Bock, R.E.; Mayer, S.J.; Salk, M.S.

    1995-03-01

    The Environmental Regulatory Update Table provides information on regulatory initiatives impacting environmental, health, and safety management responsibilities. The table is updated bi-monthly with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative, with an abstract and a projection of further action.

  19. Environmental Regulatory Update Table, January/February 1995

    International Nuclear Information System (INIS)

    Houlberg, L.M.; Hawkins, G.T.; Bock, R.E.; Mayer, S.J.; Salk, M.S.

    1995-03-01

    The Environmental Regulatory Update Table provides information on regulatory initiatives impacting environmental, health, and safety management responsibilities. The table is updated bi-monthly with information from the Federal Register and other sources, including direct contact with regulatory agencies. Each table entry provides a chronological record of the rulemaking process for that initiative, with an abstract and a projection of further action.

  20. UCI2001: The updated catalogue of Italy

    International Nuclear Information System (INIS)

    Peresan, A.; Panza, G.F.

    2002-05-01

    A new updated earthquake catalogue for the Italian territory, named UCI2001, is described here; it is an updated and revised version of the CCI1996 catalogue (Peresan et al., 1997). The revision essentially corresponds to the incorporation of data from the NEIC (National Earthquake Information Centre) and ALPOR (Catalogo delle Alpi Orientali) catalogues, while the updating is performed using the NEIC Preliminary Determinations of Epicenters since 1986. A brief overview of the catalogues used for the monitoring of seismicity in the Italian area is provided, together with essential information about the structure of the UCI2001 catalogue and a description of its format. A complete list of the events, as of May 1, 2002, is given in the Appendix. (author)