WorldWideScience

Sample records for cooling incorporating uncertainty

  1. Incorporating parametric uncertainty into population viability analysis models

    Science.gov (United States)

    McGowan, Conor P.; Runge, Michael C.; Larson, Michael A.

    2011-01-01

    Uncertainty in parameter estimates from sampling variation or expert judgment can introduce substantial uncertainty into ecological predictions based on those estimates. However, in standard population viability analyses, one of the most widely used tools for managing plant, fish and wildlife populations, parametric uncertainty is often ignored in or discarded from model projections. We present a method for explicitly incorporating this source of uncertainty into population models to fully account for risk in management and decision contexts. Our method involves a two-step simulation process where parametric uncertainty is incorporated into the replication loop of the model and temporal variance is incorporated into the loop for time steps in the model. Using the piping plover, a federally threatened shorebird in the USA and Canada, as an example, we compare abundance projections and extinction probabilities from simulations that exclude and include parametric uncertainty. Although final abundance was very low for all sets of simulations, estimated extinction risk was much greater for the simulation that incorporated parametric uncertainty in the replication loop. Decisions about species conservation (e.g., listing, delisting, and jeopardy) might differ greatly depending on the treatment of parametric uncertainty in population models.
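
The two-loop structure described in this abstract is easy to sketch. The following is a minimal illustration, not the authors' piping plover model: it assumes a scalar growth-rate population model with made-up values for MEAN_LAMBDA, SE_LAMBDA, ENV_SD and QUASI_EXT, draws one parameter set per replicate (parametric uncertainty in the replication loop), and adds one environmental deviate per year (temporal variance in the time-step loop).

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical values: mean and standard error of the population growth rate,
# and environmental (temporal) variation. Not taken from the piping plover study.
MEAN_LAMBDA, SE_LAMBDA = 0.98, 0.05   # parametric (sampling) uncertainty
ENV_SD = 0.10                          # temporal (environmental) variation
N0, YEARS, REPS, QUASI_EXT = 200, 50, 10_000, 20

extinct = 0
for _ in range(REPS):                              # replication loop
    lam_i = rng.normal(MEAN_LAMBDA, SE_LAMBDA)     # parametric uncertainty enters here
    n = N0
    for _ in range(YEARS):                         # time-step loop
        lam_t = lam_i + rng.normal(0.0, ENV_SD)    # temporal variance enters here
        n = max(n * lam_t, 0.0)
    if n < QUASI_EXT:
        extinct += 1

print(f"Quasi-extinction probability: {extinct / REPS:.3f}")
```

Replacing the draw of lam_i with MEAN_LAMBDA itself reproduces the "parametric uncertainty excluded" case that the abstract compares against.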

  2. Incorporating uncertainty in RADTRAN 6.0 input files.

    Energy Technology Data Exchange (ETDEWEB)

    Dennis, Matthew L.; Weiner, Ruth F.; Heames, Terence John (Alion Science and Technology)

    2010-02-01

    Uncertainty may be introduced into RADTRAN analyses by distributing input parameters. The MELCOR Uncertainty Engine (Gauntt and Erickson, 2004) has been adapted for use in RADTRAN to determine the parameter shape and minimum and maximum of the distribution, to sample on the distribution, and to create an appropriate RADTRAN batch file. Coupling input parameters is not possible in this initial application. It is recommended that the analyst be very familiar with RADTRAN and able to edit or create a RADTRAN input file using a text editor before implementing the RADTRAN Uncertainty Analysis Module. Installation of the MELCOR Uncertainty Engine is required for incorporation of uncertainty into RADTRAN. Gauntt and Erickson (2004) provides installation instructions as well as a description and user guide for the uncertainty engine.

  3. Incorporating model uncertainty into optimal insurance contract design

    OpenAIRE

    Pflug, G.; Timonina-Farkas, A.; Hochrainer-Stigler, S.

    2017-01-01

    In stochastic optimization models, the optimal solution heavily depends on the selected probability model for the scenarios. However, the scenario models are typically chosen on the basis of statistical estimates and are therefore subject to model error. We demonstrate here how the model uncertainty can be incorporated into the decision making process. We use a nonparametric approach for quantifying the model uncertainty and a minimax setup to find model-robust solutions. The method is illust...

  4. Incorporating model parameter uncertainty into inverse treatment planning

    International Nuclear Information System (INIS)

    Lian Jun; Xing Lei

    2004-01-01

Radiobiological treatment planning depends not only on the accuracy of the models describing the dose-response relation of different tumors and normal tissues but also on the accuracy of tissue-specific radiobiological parameters in these models. Whereas the general formalism remains the same, different sets of model parameters lead to different solutions and thus critically determine the final plan. Here we describe an inverse planning formalism with inclusion of model parameter uncertainties. This is made possible by using a statistical analysis-based framework developed by our group. In this formalism, the uncertainties of model parameters, such as the parameter a that describes tissue-specific effect in the equivalent uniform dose (EUD) model, are expressed by probability density functions and are included in the dose optimization process. We found that the final solution strongly depends on distribution functions of the model parameters. Considering that currently available models for computing biological effects of radiation are simplistic, and the clinical data used to derive the models are sparse and of questionable quality, the proposed technique provides us with an effective tool to minimize the effect caused by the uncertainties in a statistical sense. With the incorporation of the uncertainties, the technique has the potential to maximally utilize the available radiobiology knowledge for better IMRT treatment
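
A minimal sketch of the idea of propagating an uncertain EUD parameter is given below. It is not the authors' inverse planning framework; it only evaluates the standard generalized EUD formula, EUD = (mean(d_i^a))^(1/a), over samples of a drawn from an assumed probability density, with all dose values and distribution parameters invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def eud(dose_voxels, a):
    """Generalized equivalent uniform dose: (mean(d_i^a))^(1/a)."""
    d = np.asarray(dose_voxels, dtype=float)
    return np.mean(d ** a) ** (1.0 / a)

# Hypothetical organ-at-risk voxel doses (Gy) and an uncertain EUD parameter 'a'
# described by a probability density (here a normal, truncated to positive values).
doses = rng.uniform(10, 40, size=5000)
a_samples = np.clip(rng.normal(loc=8.0, scale=2.0, size=2000), 0.5, None)

eud_samples = np.array([eud(doses, a) for a in a_samples])
print(f"EUD mean = {eud_samples.mean():.1f} Gy, "
      f"5th-95th percentile = {np.percentile(eud_samples, [5, 95]).round(1)} Gy")
```

An objective that averages (or otherwise scores) the EUD over the distribution of a, rather than using a single point value, could then be placed inside the dose optimization loop, which is the spirit of the statistical framework the abstract describes.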

  5. Uncertainty incorporated beam angle optimization for IMPT treatment planning.

    Science.gov (United States)

    Cao, Wenhua; Lim, Gino J; Lee, Andrew; Li, Yupeng; Liu, Wei; Ronald Zhu, X; Zhang, Xiaodong

    2012-08-01

Beam angle optimization (BAO) remains an important and challenging problem in external beam radiation therapy treatment planning. Conventional BAO algorithms discussed in previous studies have all focused on photon-based therapies. The impact of BAO on proton therapy is also important, as proton therapy is receiving increasing interest. This study focuses on the potential benefits of BAO for intensity-modulated proton therapy (IMPT), which has recently become available for clinical cancer treatment. The authors developed a novel uncertainty-incorporated BAO algorithm for IMPT treatment planning, motivated by the fact that IMPT plan quality is highly sensitive to uncertainties such as proton range and setup errors. Linear programming was used to optimize intensity maps that are robust to scenario-based uncertainties for a given incident beam angle configuration. Unlike conventional intensity-modulated radiation therapy with photons (IMXT), the search space for IMPT beam angles may be relatively small, but optimizing an IMPT plan may incur higher computational costs due to the larger data size. Therefore, a deterministic local neighborhood search algorithm that requires only a very limited number of plan objective evaluations was used to optimize beam angles in IMPT treatment planning. Three prostate cancer cases and two skull base chordoma cases were studied to demonstrate the dosimetric advantages and robustness of the beam angles optimized by the proposed BAO algorithm. Two- to four-beam plans were optimized for the prostate cases, and two- and three-beam plans for the skull base cases. Compared with plans using two conventional parallel-opposed angles, all plans with optimized angles consistently improved sparing of organs at risk, i.e., the rectum and femoral heads for prostate and the brainstem for skull base, in both nominal and uncertainty-based dose distributions. The efficiency of the BAO algorithm was demonstrated by comparing it with alternative methods including simulated

  6. Making Invasion models useful for decision makers; incorporating uncertainty, knowledge gaps, and decision-making preferences

    Science.gov (United States)

    Denys Yemshanov; Frank H Koch; Mark Ducey

    2015-01-01

    Uncertainty is inherent in model-based forecasts of ecological invasions. In this chapter, we explore how the perceptions of that uncertainty can be incorporated into the pest risk assessment process. Uncertainty changes a decision maker’s perceptions of risk; therefore, the direct incorporation of uncertainty may provide a more appropriate depiction of risk. Our...

  7. Towards a relational concept of uncertainty: incorporating the human dimension

    NARCIS (Netherlands)

    Brugnach, M.; Dewulf, A.; Pahl-Wostl, C.; Taillieu, T.

    2007-01-01

    In this paper, we extend the conceptualization of uncertainties in natural resources management. Uncertainties come in different kinds, as it is apparent from the multiple classifications and typologies of uncertainties in the literature. Here, we re-contextualize uncertainty in a broader way - its

  8. Evaluating Prognostics Performance for Algorithms Incorporating Uncertainty Estimates

    Data.gov (United States)

National Aeronautics and Space Administration — Uncertainty Representation and Management (URM) are an integral part of the prognostic system development. As capabilities of prediction algorithms evolve, research...

  9. Uncertainties in physics calculations for gas cooled reactor cores

    International Nuclear Information System (INIS)

    1991-04-01

    The meeting was attended by 29 participants from Austria, China, France, Germany, Japan, Switzerland, the Union of Soviet Socialist Republics and the United States of America and was subdivided into four technical sessions: Analytical methods, comparison of predictions with results from existing HTGRs, uncertainty evaluations (3 papers); Analytical methods, predictions of performance of future HTGRs, uncertainty evaluations - part 1 (5 papers); Analytical methods, predictions of performance of future HTGRs, uncertainty evaluations - part 2 (6 papers); Critical experiments - planning and results, uncertainty evaluations (5 papers). The participants presented 19 papers on behalf of their countries or organizations. A separate abstract was prepared for each of these papers. Refs, figs and tabs

  10. Incorporating Uncertainty into Backward Erosion Piping Risk Assessments

    Directory of Open Access Journals (Sweden)

    Robbins Bryant A.

    2016-01-01

Backward erosion piping (BEP) is a type of internal erosion that typically involves the erosion of foundation materials beneath an embankment. BEP has been shown, historically, to be the cause of approximately one third of all internal erosion related failures. As such, the probability of BEP is commonly evaluated as part of routine risk assessments for dams and levees in the United States. Currently, average gradient methods are predominantly used to perform these assessments, supported by mean trends of critical gradient observed in laboratory flume tests. Significant uncertainty exists surrounding the mean trends of critical gradient used in practice. To quantify this uncertainty, over 100 laboratory piping tests were compiled and analysed to assess the variability of laboratory measurements of horizontal critical gradient. Results of these analyses indicate a large amount of uncertainty surrounding critical gradient measurements for all soils, with increasing uncertainty as soils become less uniform.

  11. How incorporating more data reduces uncertainty in recovery predictions

    Energy Technology Data Exchange (ETDEWEB)

    Campozana, F.P.; Lake, L.W.; Sepehrnoori, K. [Univ. of Texas, Austin, TX (United States)

    1997-08-01

    From the discovery to the abandonment of a petroleum reservoir, there are many decisions that involve economic risks because of uncertainty in the production forecast. This uncertainty may be quantified by performing stochastic reservoir modeling (SRM); however, it is not practical to apply SRM every time the model is updated to account for new data. This paper suggests a novel procedure to estimate reservoir uncertainty (and its reduction) as a function of the amount and type of data used in the reservoir modeling. Two types of data are analyzed: conditioning data and well-test data. However, the same procedure can be applied to any other data type. Three performance parameters are suggested to quantify uncertainty. SRM is performed for the following typical stages: discovery, primary production, secondary production, and infill drilling. From those results, a set of curves is generated that can be used to estimate (1) the uncertainty for any other situation and (2) the uncertainty reduction caused by the introduction of new wells (with and without well-test data) into the description.

  12. Thoughts on Incorporating HPRF in a Linear Cooling Channel

    International Nuclear Information System (INIS)

    Gallardo, Juan C.; Zisman, Michael S.

    2009-01-01

    We discuss a possible implementation of high-pressure gas-filled RF (HPRF) cavities in a linear cooling channel for muons and some of the technical issues that must be dealt with. The approach we describe is a hybrid approach that uses high-pressure hydrogen gas to avoid cavity breakdown, along with discrete LiH absorbers to provide the majority of the energy loss. Initial simulations show that the channel performs as well as the original vacuum RF channel while potentially avoiding the degradation in RF gradient associated with the strong magnetic field in the cooling channel.

  13. Gauge theories under incorporation of a generalized uncertainty principle

    International Nuclear Information System (INIS)

    Kober, Martin

    2010-01-01

We consider an extension of gauge theories based on the assumption of a generalized uncertainty principle, which implies a minimal length scale. A modification of the usual uncertainty principle implies a modified form of the matter field equations, such as the Dirac equation. If invariance of such a generalized field equation under local gauge transformations is postulated, the usual covariant derivative containing the gauge potential has to be replaced by a generalized covariant derivative. This leads to a generalized interaction between the matter field and the gauge field, as well as to an additional self-interaction of the gauge field. Since the existence of a minimal length scale appears to be a necessary assumption of any consistent quantum theory of gravity, the gauge principle is a constitutive ingredient of the standard model, and even gravity can be described as a gauge theory of local translations or Lorentz transformations, the presented extension of gauge theories is an important consideration.

  14. Incorporating uncertainties into economic forecasts: an application to forecasting economic activity in Croatia

    Directory of Open Access Journals (Sweden)

    Dario Rukelj

    2011-06-01

In this paper we present a framework for incorporating uncertainties into economic activity forecasts for Croatia. Using the vector error correction model (VECM) proposed by Rukelj (2010) as the benchmark model, we forecast densities of the variable of interest using stochastic simulations for incorporating future and parameter uncertainty. We exploit the use of parametric and non-parametric approaches in generating random shocks as in Garrat et al. (2003). Finally we evaluate the results by the Kolmogorov-Smirnov and Anderson-Darling tests of probability integral transforms. The main findings are: (1) the parametric and the non-parametric approach yield similar results; (2) the incorporation of parameter uncertainty results in much wider probability forecasts; and (3) evaluation of density forecasts indicates better performance when only future uncertainties are considered and parameter uncertainties are excluded.
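
The evaluation step mentioned here (probability integral transforms tested for uniformity) can be sketched generically. This is not the authors' VECM; it assumes hypothetical forecast ensembles and realized outcomes, computes the PIT of each realization under its forecast ensemble, and applies a Kolmogorov-Smirnov test for uniformity.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical one-step-ahead density forecasts: for each of T evaluation periods
# there are S simulated outcomes (e.g., from stochastic simulation of a fitted model)
# and one realized outcome.
T, S = 40, 1000
simulated = rng.normal(loc=0.5, scale=1.0, size=(T, S))   # forecast ensembles
realized = rng.normal(loc=0.5, scale=1.0, size=T)          # actual outturns

# Probability integral transform: empirical CDF of each forecast ensemble evaluated
# at the realized value. Well-calibrated densities give PITs uniform on [0, 1].
pit = np.array([(sim <= obs).mean() for sim, obs in zip(simulated, realized)])

ks_stat, ks_p = stats.kstest(pit, "uniform")
print(f"KS statistic = {ks_stat:.3f}, p-value = {ks_p:.3f}")
```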

  15. Incorporating the Technology Roadmap Uncertainties into the Project Risk Assessment

    International Nuclear Information System (INIS)

    Bonnema, B.E.

    2002-01-01

This paper describes two methods, Technology Roadmapping and Project Risk Assessment, which were used to identify and manage the technical risks relating to the treatment of sodium bearing waste at the Idaho National Engineering and Environmental Laboratory. The waste treatment technology under consideration was Direct Vitrification. The primary objective of the Technology Roadmap is to identify technical data uncertainties for the technologies involved and to prioritize the testing or development studies to fill the data gaps. Similarly, project management's objective for a multi-million dollar construction project includes managing all the key risks in accordance with DOE O 413.3, "Program and Project Management for the Acquisition of Capital Assets." In the early stages, the Project Risk Assessment is based upon a qualitative analysis for each risk's probability and consequence. In order to clearly prioritize the work to resolve the technical issues identified in the Technology Roadmap, the issues must be cross-referenced to the project's Risk Assessment. This will enable the project to get the best value for the cost to mitigate the risks.

  16. A method to incorporate uncertainty in the classification of remote sensing images

    OpenAIRE

    Gonçalves, Luísa M. S.; Fonte, Cidália C.; Júlio, Eduardo N. B. S.; Caetano, Mario

    2009-01-01

    The aim of this paper is to investigate if the incorporation of the uncertainty associated with the classification of surface elements into the classification of landscape units (LUs) increases the results accuracy. To this end, a hybrid classification method is developed, including uncertainty information in the classification of very high spatial resolution multi-spectral satellite images, to obtain a map of LUs. The developed classification methodology includes the following...

  17. An Applied Framework for Incorporating Multiple Sources of Uncertainty in Fisheries Stock Assessments.

    Science.gov (United States)

    Scott, Finlay; Jardim, Ernesto; Millar, Colin P; Cerviño, Santiago

    2016-01-01

Estimating fish stock status is very challenging given the many sources and high levels of uncertainty surrounding the biological processes (e.g. natural variability in the demographic rates), model selection (e.g. choosing growth or stock assessment models) and parameter estimation. Incorporating multiple sources of uncertainty in a stock assessment allows advice to better account for the risks associated with proposed management options, promoting decisions that are more robust to such uncertainty. However, a typical assessment only reports the model fit and variance of estimated parameters, thereby underreporting the overall uncertainty. Additionally, although multiple candidate models may be considered, only one is selected as the 'best' result, effectively rejecting the plausible assumptions behind the other models. We present an applied framework to integrate multiple sources of uncertainty in the stock assessment process. The first step is the generation and conditioning of a suite of stock assessment models that contain different assumptions about the stock and the fishery. The second step is the estimation of parameters, including fitting of the stock assessment models. The final step integrates across all of the results to reconcile the multi-model outcome. The framework is flexible enough to be tailored to particular stocks and fisheries and can draw on information from multiple sources to implement a broad variety of assumptions, making it applicable to stocks with varying levels of data availability. The Iberian hake stock in International Council for the Exploration of the Sea (ICES) Divisions VIIIc and IXa is used to demonstrate the framework, starting from length-based stock and indices data. Process and model uncertainty are considered through the growth, natural mortality, fishing mortality, survey catchability and stock-recruitment relationship. Estimation uncertainty is included as part of the fitting process. Simple model averaging is used to
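
The final integration step (combining results across a suite of models) can be illustrated with a toy model-averaging sketch. This is not the authors' Iberian hake analysis; the three model names, their spawning-stock-biomass estimates, standard errors, and weights below are all hypothetical, and the mixture simply samples a model according to its weight and then an estimate from that model, so both within- and between-model uncertainty are reflected.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical spawning stock biomass (kt) estimates from three candidate assessment
# models, each as (mean, standard error), with hypothetical model weights.
models = {
    "base":            (120.0, 15.0),
    "temp_weight_age": (105.0, 20.0),
    "multispecies":    ( 90.0, 25.0),
}
weights = np.array([0.5, 0.3, 0.2])

names = list(models)
draws = np.empty(100_000)
choice = rng.choice(len(names), size=draws.size, p=weights)   # pick a model per draw
for i, name in enumerate(names):
    mu, se = models[name]
    mask = choice == i
    draws[mask] = rng.normal(mu, se, size=mask.sum())          # then sample that model

print("Model-averaged SSB: median = {:.0f} kt, 90% interval = {} kt".format(
    np.median(draws), np.round(np.percentile(draws, [5, 95]), 0)))
```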

  18. Stochastic modelling of landfill processes incorporating waste heterogeneity and data uncertainty

    International Nuclear Information System (INIS)

    Zacharof, A.I.; Butler, A.P.

    2004-01-01

    A landfill is a very complex heterogeneous environment and as such it presents many modelling challenges. Attempts to develop models that reproduce these complexities generally involve the use of large numbers of spatially dependent parameters that cannot be properly characterised in the face of data uncertainty. An alternative method is presented, which couples a simplified microbial degradation model with a stochastic hydrological and contaminant transport model. This provides a framework for incorporating the complex effects of spatial heterogeneity within the landfill in a simplified manner, along with other key variables. A methodology for handling data uncertainty is also integrated into the model structure. Illustrative examples of the model's output are presented to demonstrate effects of data uncertainty on leachate composition and gas volume prediction

  19. Incorporating uncertainty of management costs in sensitivity analyses of matrix population models.

    Science.gov (United States)

    Salomon, Yacov; McCarthy, Michael A; Taylor, Peter; Wintle, Brendan A

    2013-02-01

    The importance of accounting for economic costs when making environmental-management decisions subject to resource constraints has been increasingly recognized in recent years. In contrast, uncertainty associated with such costs has often been ignored. We developed a method, on the basis of economic theory, that accounts for the uncertainty in population-management decisions. We considered the case where, rather than taking fixed values, model parameters are random variables that represent the situation when parameters are not precisely known. Hence, the outcome is not precisely known either. Instead of maximizing the expected outcome, we maximized the probability of obtaining an outcome above a threshold of acceptability. We derived explicit analytical expressions for the optimal allocation and its associated probability, as a function of the threshold of acceptability, where the model parameters were distributed according to normal and uniform distributions. To illustrate our approach we revisited a previous study that incorporated cost-efficiency analyses in management decisions that were based on perturbation analyses of matrix population models. Incorporating derivations from this study into our framework, we extended the model to address potential uncertainties. We then applied these results to 2 case studies: management of a Koala (Phascolarctos cinereus) population and conservation of an olive ridley sea turtle (Lepidochelys olivacea) population. For low aspirations, that is, when the threshold of acceptability is relatively low, the optimal strategy was obtained by diversifying the allocation of funds. Conversely, for high aspirations, the budget was directed toward management actions with the highest potential effect on the population. The exact optimal allocation was sensitive to the choice of uncertainty model. Our results highlight the importance of accounting for uncertainty when making decisions and suggest that more effort should be placed on
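
The decision criterion described above (maximize the probability of exceeding a threshold of acceptability rather than the expected outcome) can be sketched for a toy two-action budget allocation. This is not the koala or sea turtle analysis; the per-unit benefits, their uncertainties, and the threshold are all invented, and independent normally distributed effects are assumed.

```python
import numpy as np
from scipy import stats

# Hypothetical two-action allocation: each unit of budget spent on action i buys an
# uncertain (normally distributed) increment to the outcome. Choose the split that
# maximizes P(outcome >= threshold of acceptability).
BUDGET = 1.0
mu = np.array([0.8, 0.5])      # expected benefit per unit spend (hypothetical)
sd = np.array([0.4, 0.1])      # uncertainty per unit spend (hypothetical)
THRESHOLD = 0.45               # threshold of acceptability (hypothetical)

best = None
for x1 in np.linspace(0.0, BUDGET, 101):
    x = np.array([x1, BUDGET - x1])
    mean = x @ mu
    std = np.sqrt(np.sum((x * sd) ** 2))      # independent effects assumed
    p_accept = 1.0 - stats.norm.cdf(THRESHOLD, loc=mean, scale=std)
    if best is None or p_accept > best[1]:
        best = (x.copy(), p_accept)

print(f"Optimal allocation = {best[0]}, P(outcome >= threshold) = {best[1]:.3f}")
```

With these illustrative numbers, a low threshold yields a mixed allocation, while raising THRESHOLD pushes the whole budget onto the higher-mean, higher-variance action, mirroring the qualitative pattern described in the abstract.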

  20. Incorporating Wind Power Forecast Uncertainties Into Stochastic Unit Commitment Using Neural Network-Based Prediction Intervals.

    Science.gov (United States)

    Quan, Hao; Srinivasan, Dipti; Khosravi, Abbas

    2015-09-01

    Penetration of renewable energy resources, such as wind and solar power, into power systems significantly increases the uncertainties on system operation, stability, and reliability in smart grids. In this paper, the nonparametric neural network-based prediction intervals (PIs) are implemented for forecast uncertainty quantification. Instead of a single level PI, wind power forecast uncertainties are represented in a list of PIs. These PIs are then decomposed into quantiles of wind power. A new scenario generation method is proposed to handle wind power forecast uncertainties. For each hour, an empirical cumulative distribution function (ECDF) is fitted to these quantile points. The Monte Carlo simulation method is used to generate scenarios from the ECDF. Then the wind power scenarios are incorporated into a stochastic security-constrained unit commitment (SCUC) model. The heuristic genetic algorithm is utilized to solve the stochastic SCUC problem. Five deterministic and four stochastic case studies incorporated with interval forecasts of wind power are implemented. The results of these cases are presented and discussed together. Generation costs, and the scheduled and real-time economic dispatch reserves of different unit commitment strategies are compared. The experimental results show that the stochastic model is more robust than deterministic ones and, thus, decreases the risk in system operations of smart grids.
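
The scenario-generation step described here (prediction intervals decomposed into quantiles, an empirical CDF fitted to those quantiles, and Monte Carlo sampling of scenarios) can be sketched in a few lines. The interval values below are hypothetical; the unit commitment and genetic algorithm parts of the abstract are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical central prediction intervals for one hour of wind power (MW):
# (confidence level, lower bound, upper bound). Each contributes two quantile points.
intervals = [(0.10, 48, 52), (0.30, 45, 55), (0.50, 42, 59), (0.70, 38, 63), (0.90, 30, 70)]

probs, values = [0.5], [50.0]          # include the median forecast
for conf, lo, hi in intervals:
    probs += [(1 - conf) / 2, (1 + conf) / 2]
    values += [lo, hi]
order = np.argsort(probs)
probs, values = np.array(probs)[order], np.array(values)[order]

# Inverse of the empirical CDF by linear interpolation between the quantile points,
# then Monte Carlo sampling of wind power scenarios for the stochastic unit commitment.
def sample_scenarios(n):
    u = rng.uniform(probs.min(), probs.max(), size=n)
    return np.interp(u, probs, values)

print(np.round(sample_scenarios(10), 1))
```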

  1. Uncertainty and Sensitivity Analyses of a Pebble Bed HTGR Loss of Cooling Event

    Directory of Open Access Journals (Sweden)

    Gerhard Strydom

    2013-01-01

The Very High Temperature Reactor Methods Development group at the Idaho National Laboratory identified the need for a defensible and systematic uncertainty and sensitivity approach in 2009. This paper summarizes the results of an uncertainty and sensitivity quantification investigation performed with the SUSA code, utilizing the International Atomic Energy Agency CRP 5 Pebble Bed Modular Reactor benchmark and the INL code suite PEBBED-THERMIX. Eight model input parameters were selected for inclusion in this study, and after the input parameter variations and probability density functions were specified, a total of 800 steady state and depressurized loss of forced cooling (DLOFC) transient PEBBED-THERMIX calculations were performed. The six data sets were statistically analyzed to determine the 5% and 95% DLOFC peak fuel temperature tolerance intervals with 95% confidence levels. It was found that the uncertainties in the decay heat and graphite thermal conductivities were the most significant contributors to the propagated DLOFC peak fuel temperature uncertainty. No significant differences were observed between the results of Simple Random Sampling (SRS) or Latin Hypercube Sampling (LHS) data sets, and use of uniform or normal input parameter distributions also did not lead to any significant differences between these data sets.
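
The sampling contrast mentioned in this abstract (SRS versus LHS over uncertain inputs, followed by statistical analysis of the output) can be sketched with a stand-in for the thermal-hydraulics code. The linear toy_model and all coefficients below are invented; only the run count (800) and parameter count (8) are taken from the abstract, and simple empirical percentiles are reported rather than SUSA-style tolerance limits.

```python
import numpy as np

rng = np.random.default_rng(4)
N, K = 800, 8   # number of code runs and uncertain input parameters, as in the abstract

def latin_hypercube(n, k):
    """One stratified uniform(0,1) sample per equal-probability bin, permuted per column."""
    cols = []
    for _ in range(k):
        strata = (np.arange(n) + rng.uniform(size=n)) / n
        cols.append(rng.permutation(strata))
    return np.column_stack(cols)   # map through each parameter's inverse CDF in practice

def toy_model(x):
    """Stand-in for PEBBED-THERMIX: 'peak fuel temperature' in degC, purely illustrative."""
    return 1500 + 120 * x[:, 0] + 80 * x[:, 1] - 40 * x[:, 2] + 15 * x[:, 3:].sum(axis=1)

for name, design in [("SRS", rng.uniform(size=(N, K))), ("LHS", latin_hypercube(N, K))]:
    t_peak = toy_model(design)
    lo, hi = np.percentile(t_peak, [5, 95])
    print(f"{name}: 5%/95% peak fuel temperature = {lo:.0f} / {hi:.0f} degC")
```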

  2. Incorporating uncertainties into risk assessment with an application to the exploratory studies facilities at Yucca Mountain

    International Nuclear Information System (INIS)

    Fathauer, P.M.

    1995-08-01

A methodology that incorporates variability and reducible sources of uncertainty into the probabilistic and consequence components of risk was developed. The method was applied to the north tunnel of the Exploratory Studies Facility at Yucca Mountain in Nevada. In this assessment, variability and reducible sources of uncertainty were characterized and propagated through the risk assessment models using a Monte Carlo based software package. The results were then manipulated into risk curves at the 5% and 95% confidence levels for both the variability and overall uncertainty analyses, thus distinguishing between variability and reducible sources of uncertainty. In the Yucca Mountain application, the designation of the north tunnel as an item important to public safety, as defined by 10 CFR 60, was determined. Specifically, the annual frequency of a rock fall breaching a waste package causing an off-site dose of 500 mrem (5x10^-3 Sv) was calculated. The annual frequency, taking variability into account, ranged from 1.9x10^-9 per year at the 5% confidence level to 2.5x10^-9 per year at the 95% confidence level. The frequency range after including all uncertainty was 9.5x10^-10 to 1.8x10^-8 per year. The maximum observable frequency, at the 100% confidence level, was 4.9x10^-8 per year. This is below the 10^-6 per year frequency criterion of 10 CFR 60. Therefore, based on this work, the north tunnel does not fall under the items important to public safety designation for the event studied.
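
The separation of variability from reducible (epistemic) uncertainty described above is commonly done with a nested Monte Carlo loop. The sketch below is not the Yucca Mountain model: the event structure (rock-fall rate times breach probability) loosely follows the abstract, but every distribution and parameter value is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(5)

# Outer loop: reducible (epistemic) uncertainty about the mean rock-fall rate and the
# mean breach probability. Inner loop: variability about those means. All values invented.
N_EPISTEMIC, N_VARIABILITY = 500, 2000

freq_estimates = np.empty(N_EPISTEMIC)
for i in range(N_EPISTEMIC):
    rate_mean = rng.lognormal(mean=np.log(1e-4), sigma=0.5)       # events/yr (hypothetical)
    p_breach_mean = rng.beta(2, 50)                                # (hypothetical)
    rates = rng.gamma(shape=2.0, scale=rate_mean / 2.0, size=N_VARIABILITY)
    p_breach = rng.beta(2, 2 / p_breach_mean - 2, size=N_VARIABILITY)
    freq_estimates[i] = np.mean(rates * p_breach)                  # annual breach frequency

lo, hi = np.percentile(freq_estimates, [5, 95])
print(f"Annual breach frequency: 5% = {lo:.2e}/yr, 95% = {hi:.2e}/yr")
```

The spread of freq_estimates across the outer loop plays the role of the confidence bounds on the risk curves, while the inner loop carries the variability.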

  3. Atherosclerotic plaque component segmentation in combined carotid MRI and CTA data incorporating class label uncertainty

    DEFF Research Database (Denmark)

    van Engelen, Arna; Niessen, Wiro J.; Klein, Stefan

    2014-01-01

Atherosclerotic plaque composition can indicate plaque vulnerability. We segment atherosclerotic plaque components from the carotid artery on a combination of in vivo MRI and CT-angiography (CTA) data using supervised voxelwise classification. In contrast to previous studies the ground truth ... for training is directly obtained from 3D registration with histology for fibrous and lipid-rich necrotic tissue, and with μCT for calcification. This registration does, however, not provide accurate voxelwise correspondence. We therefore evaluate three approaches that incorporate uncertainty ...), II) samples are weighted by the local contour distance of the lumen and outer wall between histology and in vivo data, and III) 10% of each class is rejected by Gaussian outlier rejection. Classification was evaluated on the relative volumes (% of tissue type in the vessel wall) for calcified

  4. Multi-model inference for incorporating trophic and climate uncertainty into stock assessments

    Science.gov (United States)

    Ianelli, James; Holsman, Kirstin K.; Punt, André E.; Aydin, Kerim

    2016-12-01

    Ecosystem-based fisheries management (EBFM) approaches allow a broader and more extensive consideration of objectives than is typically possible with conventional single-species approaches. Ecosystem linkages may include trophic interactions and climate change effects on productivity for the relevant species within the system. Presently, models are evolving to include a comprehensive set of fishery and ecosystem information to address these broader management considerations. The increased scope of EBFM approaches is accompanied with a greater number of plausible models to describe the systems. This can lead to harvest recommendations and biological reference points that differ considerably among models. Model selection for projections (and specific catch recommendations) often occurs through a process that tends to adopt familiar, often simpler, models without considering those that incorporate more complex ecosystem information. Multi-model inference provides a framework that resolves this dilemma by providing a means of including information from alternative, often divergent models to inform biological reference points and possible catch consequences. We apply an example of this approach to data for three species of groundfish in the Bering Sea: walleye pollock, Pacific cod, and arrowtooth flounder using three models: 1) an age-structured "conventional" single-species model, 2) an age-structured single-species model with temperature-specific weight at age, and 3) a temperature-specific multi-species stock assessment model. The latter two approaches also include consideration of alternative future climate scenarios, adding another dimension to evaluate model projection uncertainty. We show how Bayesian model-averaging methods can be used to incorporate such trophic and climate information to broaden single-species stock assessments by using an EBFM approach that may better characterize uncertainty.

  5. Incorporation of Uncertainty Analysis in Experimental/Computational Fluid Dynamics Validations

    National Research Council Canada - National Science Library

    Coleman, Hugh

    2002-01-01

    A quantitative approach to verification and validation of simulations was developed which properly takes into account the uncertainties in experimental data and the uncertainties in the simulation result...

  6. Supporting Sustainable Markets Through Life Cycle Assessment: Evaluating emerging technologies, incorporating uncertainty and the consumer perspective

    Science.gov (United States)

    Merugula, Laura

As civilization's collective knowledge grows, we are met with the realization that human-induced physical and biological transformations influenced by exogenous psychosocial and economic factors affect virtually every ecosystem on the planet. Despite improvements in energy generation and efficiencies, demand for material goods and energy services increases with no sign of a slowing pace. Sustainable development requires a multi-pronged approach that involves reshaping demand, consumer education, sustainability-oriented policy, and supply chain management that does not serve the expansionist mentality. Thus, decision support tools are needed that inform developers, consumers, and policy-makers for short-term and long-term planning. These tools should incorporate uncertainty through quantitative methods as well as qualitatively informing the nature of the model as imperfect but necessary and adequate. A case study is presented of the manufacture and deployment of utility-scale wind turbines evaluated for a proposed change in blade manufacturing. It provides the first life cycle assessment (LCA) evaluating the impact of carbon nanofibers, an emerging material, proposed for integration into wind power generation systems as blade reinforcement. Few LCAs of nanoproducts are available in the scientific literature due to research and development (R&D) for applications that continues to outpace R&D for environmental, health, and safety (EHS) and life cycle impacts. LCAs of emerging technologies are crucial for informing developers of potential impacts, especially where market growth is swift and dissipative. A second case study is presented that evaluates consumer choice between disposable and reusable beverage cups. While there are a few studies that attempt to make the comparison using LCA, none adequately address uncertainty, nor are they representative of the typical American consumer. By disaggregating U.S. power generation into 26 subregional grid production mixes and evaluating

  7. A novel personal cooling system (PCS) incorporated with phase change materials (PCMs) and ventilation fans: An investigation on its cooling efficiency.

    Science.gov (United States)

    Lu, Yehu; Wei, Fanru; Lai, Dandan; Shi, Wen; Wang, Faming; Gao, Chuansi; Song, Guowen

    2015-08-01

Personal cooling systems (PCS) have been developed to mitigate the impact of severe heat stress for humans working in hot environments. It is still a great challenge to develop PCSs that are portable, inexpensive, and effective. We studied the performance of a new hybrid PCS incorporating both ventilation fans and phase change materials (PCMs). The cooling efficiency of the newly developed PCS was investigated on a sweating manikin in two hot conditions: hot humid (HH, 34°C, 75% RH) and hot dry (HD, 34°C, 28% RH). Four test scenarios were selected: fans off with no PCMs (i.e., Fan-off, the CONTROL), fans on with no PCMs (i.e., Fan-on), fans off with fully solidified PCMs (i.e., PCM+Fan-off), and fans on with fully solidified PCMs (i.e., PCM+Fan-on). It was found that the addition of PCMs provided 54-78 min of cooling in the HH condition. In contrast, the PCMs only offered 19-39 min of cooling in the HD condition. In both conditions, the ventilation fans greatly enhanced the evaporative heat loss compared with Fan-off. The hybrid PCS (i.e., PCM+Fan-on) provided a continuous cooling effect during the three-hour test and the average cooling rate for the whole body was around 111 and 315 W in the HH and HD conditions, respectively. Overall, the new hybrid PCS may be an effective means of ameliorating symptoms of heat stress in both hot-humid and hot-dry environments. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. Uncertainties

    Indian Academy of Sciences (India)

    The imperfect understanding of some of the processes and physics in the carbon cycle and chemistry models generate uncertainties in the conversion of emissions to concentration. To reflect this uncertainty in the climate scenarios, the use of AOGCMs that explicitly simulate the carbon cycle and chemistry of all the ...

  9. Because Doubt Is A Sure Thing: Incorporating Uncertainty Characterization Into Climate Change Decision-Making

    Science.gov (United States)

    Moss, R.; Rice, J.; Scott, M. J.; Unwin, S.; Whitney, P.

    2012-12-01

    This presentation describes the results of new research to develop a stakeholder-driven uncertainty characterization (UC) process to help address the challenges of regional climate change mitigation and adaptation decisions. Integrated regional Earth system models are a promising approach for modeling how climate change may affect natural resources, infrastructure, and socioeconomic conditions at regional scales, and how different adaptation and mitigation strategies may interact. However, the inherent complexity, long run-times, and large numbers of uncertainties in coupled regional human-environment systems render standard, model-driven approaches for uncertainty characterization infeasible. This new research focuses on characterizing stakeholder decision support needs as part of an overall process to identify the key uncertainties relevant for the application in question. The stakeholder-driven process reduces the dimensionality of the uncertainty modeling challenge while providing robust insights for science and decision-making. This research is being carried out as part of the integrated Regional Earth System Model (iRESM) initiative, a new scientific framework developed at Pacific Northwest National Laboratory to evaluate the interactions between human and environmental systems and mitigation and adaptation decisions at regional scales. The framework provides a flexible architecture for model couplings between a regional Earth system model, a regional integrated assessment model, and highly spatially resolved models of crop productivity, building energy demands, electricity infrastructure operation and expansion, and water supply and management. In an example of applying the stakeholder-driven UC process, the presentation first identifies stakeholder decision criteria for a particular regional mitigation or adaptation question. These criteria are used in conjunction with the flexible architecture to determine the relevant component models for coupling and the

  10. Wind Energy Management System EMS Integration Project: Incorporating Wind Generation and Load Forecast Uncertainties into Power Grid Operations

    Energy Technology Data Exchange (ETDEWEB)

    Makarov, Yuri V.; Huang, Zhenyu; Etingov, Pavel V.; Ma, Jian; Guttromson, Ross T.; Subbarao, Krishnappa; Chakrabarti, Bhujanga B.

    2010-01-01

The power system balancing process, which includes the scheduling, real time dispatch (load following) and regulation processes, is traditionally based on deterministic models. Since the conventional generation needs time to be committed and dispatched to a desired megawatt level, the scheduling and load following processes use load and wind and solar power production forecasts to achieve future balance between the conventional generation and energy storage on the one side, and system load, intermittent resources (such as wind and solar generation), and scheduled interchange on the other side. Although in real life the forecasting procedures imply some uncertainty around the load and wind/solar forecasts (caused by forecast errors), only their mean values are actually used in the generation dispatch and commitment procedures. Since the actual load and intermittent generation can deviate from their forecasts, it becomes increasingly unclear (especially with the increasing penetration of renewable resources) whether the system would actually be able to meet the conventional generation requirements within the look-ahead horizon, what additional balancing efforts would be needed as we get closer to real time, and what additional costs would be incurred by those needs. To improve the system control performance characteristics, maintain system reliability, and minimize expenses related to the system balancing functions, it becomes necessary to incorporate the predicted uncertainty ranges into the scheduling, load following, and, to some extent, into the regulation processes. It is also important to address the uncertainty problem comprehensively by taking all sources of uncertainty (load, intermittent generation, generators’ forced outages, etc.) into consideration. All aspects of uncertainty such as the imbalance size (which is the same as capacity needed to mitigate the imbalance) and generation ramping requirement must be taken into account. The latter

  11. Expressions for the evaporation of sessile liquid droplets incorporating the evaporative cooling effect.

    Science.gov (United States)

    Wang, Yilin; Ma, Liran; Xu, Xuefeng; Luo, Jianbin

    2016-12-15

    The evaporation along the surface of pinned, sessile droplets is investigated numerically by using the combined field approach. In the present model, the evaporative cooling at the droplet surface which leads to a reduction in the evaporation is taken into account. Simple, yet accurate analytical expressions for the local evaporation flux and for the total evaporation rate of sessile droplets are obtained. The theoretical analyses indicate that the reduction in the evaporation becomes more pronounced as the evaporative cooling number Ec increases. The results also reveal that the variation of total evaporation rate with contact angle will change its trend as the intensity of the evaporative cooling changes. For small values of Ec, the total evaporation rate increases with the contact angle, the same as predicted by Deegan et al. and by Hu and Larson in their isothermal models in which the evaporative cooling is neglected. Contrarily, when the evaporative cooling effect is strong enough, the total evaporation rate will decrease as the contact angle increases. The present theory is corroborated experimentally, and found in good agreement with the expressions proposed by Hu and Larson in the limiting isothermal case. Copyright © 2016 Elsevier Inc. All rights reserved.

  12. Incorporating rainfall uncertainty in a SWAT model: the river Zenne basin (Belgium) case study

    Science.gov (United States)

    Tolessa Leta, Olkeba; Nossent, Jiri; van Griensven, Ann; Bauwens, Willy

    2013-04-01

The European Union Water Framework Directive (EU-WFD) called its member countries to achieve a good ecological status for all inland and coastal water bodies by 2015. According to recent studies, the river Zenne (Belgium) is far from this objective. Therefore, an interuniversity and multidisciplinary project "Towards a Good Ecological Status in the river Zenne (GESZ)" was launched to evaluate the effects of wastewater management plans on the river. In this project, different models have been developed and integrated using the Open Modelling Interface (OpenMI). The hydrologic, semi-distributed Soil and Water Assessment Tool (SWAT) is hereby used as one of the model components in the integrated modelling chain in order to model the upland catchment processes. The assessment of the uncertainty of SWAT is an essential aspect of the decision making process, in order to design robust management strategies that take the predicted uncertainties into account. Model uncertainty stems from the uncertainties on the model parameters, the input data (e.g., rainfall), the calibration data (e.g., stream flows) and on the model structure itself. The objective of this paper is to assess the first three sources of uncertainty in a SWAT model of the river Zenne basin. For the assessment of rainfall measurement uncertainty, first, we identified independent rainfall periods, based on the daily precipitation and stream flow observations and using the Water Engineering Time Series PROcessing tool (WETSPRO). Secondly, we assigned a rainfall multiplier parameter for each of the independent rainfall periods, which serves as a multiplicative input error corruption. Finally, we treated these multipliers as latent parameters in the model optimization and uncertainty analysis (UA). For parameter uncertainty assessment, due to the high number of parameters of the SWAT model, first, we screened out its most sensitive parameters using the Latin Hypercube One-factor-At-a-Time (LH-OAT) technique

  13. Covering Materials Incorporating Radiation-Preventing Techniques to Meet Greenhouse Cooling Challenges in Arid Regions: A Review

    Directory of Open Access Journals (Sweden)

    Ahmed M. Abdel-Ghany

    2012-01-01

Cooling greenhouses is essential to provide a suitable environment for plant growth in arid regions characterized by brackish water resources. However, conventional cooling methods face many challenges. Filtering out near infra-red radiation (NIR) at the greenhouse cover can significantly reduce the heating load and can solve the overheating problem of the greenhouse air. This paper reviews (i) the problems of using conventional cooling methods and (ii) the advantages of greenhouse covers that incorporate NIR reflectors. This survey focuses on how the cover type affects the transmittance of photosynthetically active radiation (PAR), the reflectance or absorptance of NIR and the greenhouse air temperature. NIR-reflecting plastic films seem to be the most suitable, low cost and simple cover for greenhouses under arid conditions. Therefore, this review discusses how various additives should be incorporated in plastic film to increase its mechanical properties, durability and ability to stand up to extremely harsh weather. Presently, NIR-reflecting covers are able to reduce greenhouse air temperature by no more than 5°C. This reduction is not enough in regions where the ambient temperature may exceed 45°C in summer. There is a need to develop improved NIR-reflecting plastic film covers.

  14. New methodology for assessing reactivity feedbacks and uncertainties in sodium-cooled fast reactors

    International Nuclear Information System (INIS)

    Bouret, C.; Buiron, L.; Rimpault, G.

    2014-01-01

During the design of different SFR cores considered for the ASTRID reactor prototype, the main neutronic variables of interest are evaluated and then used to simulate transients. The assessment, management and reduction of uncertainties in these neutronic quantities help increase the margins and improve performance while cutting the costs of the concept. This paper aims at highlighting the benefits of using perturbation tools based on transport theory to solve the Boltzmann equation on 3D Hexagonal-Z geometry for the calculation of neutronic parameters, sensitivities and uncertainties for the heterogeneous core design. Moreover, a new methodology is proposed to assess the uncertainty affecting local reactivity feedback coefficients. This method makes it possible to take into account the correlations between the different local reactivity feedback coefficients used to simulate transients. (authors)

  15. Upper limit for Poisson variable incorporating systematic uncertainties by Bayesian approach

    International Nuclear Information System (INIS)

    Zhu, Yongsheng

    2007-01-01

To calculate the upper limit for the Poisson observable at a given confidence level with the inclusion of systematic uncertainties in background expectation and signal efficiency, formulations have been established along the lines of the Bayesian approach. A FORTRAN program, BPULE, has been developed to implement the upper limit calculation.
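
The Bayesian construction described here can be sketched numerically: marginalize the Poisson likelihood over priors on the background and efficiency, then invert the posterior CDF of the signal. This is not the BPULE program; the observed count, nuisance-parameter priors, and confidence level below are all hypothetical, and a flat prior on the signal is assumed.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Hypothetical inputs: n observed events, background expectation b with Gaussian
# systematic uncertainty, signal efficiency eps with Gaussian systematic uncertainty.
n_obs, b0, db, eps0, deps, CL = 3, 1.2, 0.3, 0.80, 0.05, 0.90

s_grid = np.linspace(0.0, 20.0, 401)                   # signal grid (flat prior assumed)
M = 20000                                               # nuisance-parameter samples
b = np.clip(rng.normal(b0, db, size=M), 0.0, None)
eps = np.clip(rng.normal(eps0, deps, size=M), 1e-6, None)

# Marginal likelihood of n_obs at each signal value, averaged over the nuisance priors.
like = np.array([stats.poisson.pmf(n_obs, eps * s + b).mean() for s in s_grid])

# Posterior CDF by trapezoidal integration, then the credible upper limit by inversion.
cum = np.concatenate(([0.0], np.cumsum(0.5 * (like[1:] + like[:-1]) * np.diff(s_grid))))
cdf = cum / cum[-1]
upper_limit = np.interp(CL, cdf, s_grid)
print(f"{CL:.0%} CL Bayesian upper limit on the signal: {upper_limit:.2f} events")
```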

  16. Uncertainties

    Indian Academy of Sciences (India)

    To reflect this uncertainty in the climate scenarios, the use of AOGCMs that explicitly simulate the carbon cycle and chemistry of all the substances are needed. The Hadley Centre has developed a version of the climate model that allows the effect of climate change on the carbon cycle and its feedback into climate, to be ...

  17. Incorporation of Satellite Data and Uncertainty in a Nationwide Groundwater Recharge Model in New Zealand

    Directory of Open Access Journals (Sweden)

    Rogier Westerhoff

    2018-01-01

A nationwide model of groundwater recharge for New Zealand (NGRM), as described in this paper, demonstrated the benefits of satellite data and global models to improve the spatial definition of recharge and the estimation of recharge uncertainty. NGRM was inspired by the global-scale WaterGAP model but with the key development of rainfall recharge calculation on scales relevant to national- and catchment-scale studies (i.e., a 1 km × 1 km cell size and a monthly timestep in the period 2000–2014), provided by satellite data (i.e., MODIS-derived evapotranspiration (AET) and vegetation) in combination with national datasets of rainfall, elevation, soil and geology. The resulting nationwide model calculates groundwater recharge estimates, including their uncertainty, consistent across the country, which makes the model unique compared to all other New Zealand estimates targeted towards groundwater recharge. At the national scale, NGRM estimated an average recharge of 2500 m³/s, or 298 mm/year, with a model uncertainty of 17%. Those results were similar to the WaterGAP model, but the improved input data resulted in better spatial characteristics of recharge estimates. Multiple uncertainty analyses led to these main conclusions: the NGRM model could give valuable initial estimates in data-sparse areas, since it compared well to most ground-observed lysimeter data and local recharge models; and the nationwide input data of rainfall and geology caused the largest uncertainty in the model equation, which revealed that the satellite data could improve spatial characteristics without significantly increasing the uncertainty. Clearly the increasing volume and availability of large-scale satellite data is creating more opportunities for the application of national-scale models at the catchment, and smaller, scales. This should result in improved utility of these models including provision of initial estimates in data-sparse areas. Topics for future
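
A highly simplified cell-wise sketch of this kind of gridded recharge calculation is given below. It is not the NGRM equations: it only forms a rainfall-minus-AET surplus per cell, scales it by a geology/soil factor, and combines assumed relative input uncertainties in quadrature; every grid value and uncertainty percentage is invented.

```python
import numpy as np

rng = np.random.default_rng(8)

# Hypothetical monthly grids (mm) for a small domain: rainfall P, satellite-derived
# actual evapotranspiration AET, and a geology/soil factor controlling how much of
# the surplus becomes groundwater recharge.
ny, nx = 4, 5
P = rng.uniform(40, 160, size=(ny, nx))
AET = rng.uniform(30, 90, size=(ny, nx))
geo_factor = rng.uniform(0.2, 0.8, size=(ny, nx))

# Simple surplus-based recharge per cell, with assumed relative uncertainties on the
# inputs combined in quadrature as a first-order cell-wise uncertainty estimate.
surplus = np.clip(P - AET, 0.0, None)
recharge = geo_factor * surplus                      # mm/month per cell

rel_unc = np.sqrt(0.15**2 + 0.20**2 + 0.25**2)       # P, AET, geology (all hypothetical)
print("Mean recharge: {:.0f} mm/month +/- {:.0f}%".format(recharge.mean(), 100 * rel_unc))
```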

  18. Wind Energy Management System Integration Project Incorporating Wind Generation and Load Forecast Uncertainties into Power Grid Operations

    Energy Technology Data Exchange (ETDEWEB)

    Makarov, Yuri V.; Huang, Zhenyu; Etingov, Pavel V.; Ma, Jian; Guttromson, Ross T.; Subbarao, Krishnappa; Chakrabarti, Bhujanga B.

    2010-09-01

    The power system balancing process, which includes the scheduling, real time dispatch (load following) and regulation processes, is traditionally based on deterministic models. Since the conventional generation needs time to be committed and dispatched to a desired megawatt level, the scheduling and load following processes use load and wind power production forecasts to achieve future balance between the conventional generation and energy storage on the one side, and system load, intermittent resources (such as wind and solar generation) and scheduled interchange on the other side. Although in real life the forecasting procedures imply some uncertainty around the load and wind forecasts (caused by forecast errors), only their mean values are actually used in the generation dispatch and commitment procedures. Since the actual load and intermittent generation can deviate from their forecasts, it becomes increasingly unclear (especially, with the increasing penetration of renewable resources) whether the system would be actually able to meet the conventional generation requirements within the look-ahead horizon, what the additional balancing efforts would be needed as we get closer to the real time, and what additional costs would be incurred by those needs. In order to improve the system control performance characteristics, maintain system reliability, and minimize expenses related to the system balancing functions, it becomes necessary to incorporate the predicted uncertainty ranges into the scheduling, load following, and, in some extent, into the regulation processes. It is also important to address the uncertainty problem comprehensively, by including all sources of uncertainty (load, intermittent generation, generators’ forced outages, etc.) into consideration. All aspects of uncertainty such as the imbalance size (which is the same as capacity needed to mitigate the imbalance) and generation ramping requirement must be taken into account. The latter unique

  19. Incorporating the Uncertainties of Nodal-Plane Orientation in the Seismo-Lineament Analysis Method (SLAM)

    Science.gov (United States)

    Cronin, V.; Sverdrup, K. A.

    2013-05-01

The process of delineating a seismo-lineament has evolved since the first description of the Seismo-Lineament Analysis Method (SLAM) by Cronin et al. (2008, Env & Eng Geol 14(3) 199-219). SLAM is a reconnaissance tool to find the trace of the fault that produced a shallow-focus earthquake by projecting the corresponding nodal planes (NP) upward to their intersections with the ground surface, as represented by a DEM or topographic map. A seismo-lineament is formed by the intersection of the uncertainty volume associated with a given NP and the ground surface. The ground-surface trace of the fault that produced the earthquake is likely to be within one of the two seismo-lineaments associated with the two NPs derived from the earthquake's focal mechanism solution. When no uncertainty estimate has been reported for the NP orientation, the uncertainty volume associated with a given NP is bounded by parallel planes that are [1] tangent to the ellipsoidal uncertainty volume around the focus and [2] parallel to the NP. If the ground surface is planar, the resulting seismo-lineament is bounded by parallel lines. When an uncertainty is reported for the NP orientation, the seismo-lineament resembles a bow tie, with the epicenter located adjacent to or within the "knot." Some published lists of focal mechanisms include only one NP with associated uncertainties. The NP orientation uncertainties in strike azimuth (+/- gamma), dip angle (+/- epsilon) and rake that are output from an FPFIT analysis (Reasenberg and Oppenheimer, 1985, USGS OFR 85-739) are taken to be the same for both NPs (Oppenheimer, 2013, pers com). The boundaries of the NP uncertainty volume are each composed of planes that are tangent to the focal uncertainty ellipsoid. One boundary, whose nearest horizontal distance from the epicenter is greater than or equal to that of the other boundary, is formed by the set of all planes with strike azimuths equal to the reported NP strike azimuth +/- gamma, and dip angle

  20. Technique Incorporating Cooling & Contraction / Expansion Analysis to Illustrate Shrinkage Tendency in Cast Irons

    Science.gov (United States)

    Stan, S.; Chisamera, M.; Riposan, I.; Neacsu, L.; Cojocaru, A. M.; Stan, I.

    2017-06-01

With the more widespread adoption of thermal analysis testing, thermal analysis data have become an indicator of cast iron quality. The cooling curve and its first derivative display patterns that can be used to predict the characteristics of a cast iron. An experimental device was developed with a technique to simultaneously evaluate cooling curves and expansion or contraction of cast metals during solidification. Its application is illustrated with results on shrinkage tendency of ductile iron treated with FeSiMgRECa master alloy and inoculated with FeSi based alloys, as affected by mould rigidity (green sand and resin sand moulds). Undercooling at the end of solidification relative to the metastable (carbidic) equilibrium temperature and the expansion within the solidification sequence appear to have a strong influence on the susceptibility to macro- and micro-shrinkage in ductile iron castings. Green sand moulds, as less rigid moulds, encourage the formation of contraction defects, not only because of high initial expansion values, but also because of a higher cooling rate during solidification, and consequently, increased undercooling below the metastable equilibrium temperature up to the end of solidification.
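
The basic thermal-analysis operation mentioned here, taking the first derivative of a measured cooling curve to locate characteristic points, can be sketched on synthetic data. The curve below is entirely synthetic (exponential cooling plus a Gaussian thermal-arrest-like feature and noise) and is not the authors' measurement or device.

```python
import numpy as np

rng = np.random.default_rng(9)

# Synthetic cooling curve (degC vs s): exponential cooling with an arrest-like feature
# around t = 300 s, plus measurement noise. Values are illustrative only.
t = np.linspace(0, 600, 1201)
T = 1250 * np.exp(-t / 900) + 80 * np.exp(-((t - 300) / 60) ** 2) + rng.normal(0, 0.5, t.size)

# Light smoothing, then the first derivative dT/dt that thermal analysis uses to pick
# characteristic points on the curve (e.g., around the end of solidification).
kernel = np.ones(15) / 15
T_s = np.convolve(T, kernel, mode="same")
dTdt = np.gradient(T_s, t)

i_min = np.argmin(dTdt[200:1000]) + 200     # most negative smoothed cooling rate
print(f"Most negative cooling rate at t = {t[i_min]:.0f} s, T = {T_s[i_min]:.0f} degC")
```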

  1. Robust framework for PET image reconstruction incorporating system and measurement uncertainties.

    Directory of Open Access Journals (Sweden)

    Huafeng Liu

In Positron Emission Tomography (PET), an optimal estimate of the radioactivity concentration is obtained from the measured emission data under certain criteria. So far, all the well-known statistical reconstruction algorithms require an exactly known system probability matrix a priori, and the quality of such a system model largely determines the quality of the reconstructed images. In this paper, we propose an algorithm for PET image reconstruction for the real-world case where the PET system model is subject to uncertainties. The method treats PET reconstruction as a regularization problem and the image estimation is achieved by means of an uncertainty-weighted least squares framework. The performance of our work is evaluated with the Shepp-Logan simulated and real phantom data, which demonstrates significant improvements in image quality over the least squares reconstruction efforts.
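
A toy version of weighted, regularized least squares reconstruction is sketched below. It is not the authors' algorithm or weighting scheme: the tiny system matrix, the variance model used to build the weights, and the Tikhonov regularization are all illustrative stand-ins for the idea of down-weighting measurements with larger modelled uncertainty.

```python
import numpy as np

rng = np.random.default_rng(10)

# Tiny toy system: x is a 16-pixel "image", A is a nominal system matrix (subject to
# uncertainty in reality), y is the measured sinogram. All sizes and values illustrative.
n_pix, n_bins = 16, 32
A_nominal = rng.uniform(0.0, 1.0, size=(n_bins, n_pix))
x_true = rng.uniform(0.0, 5.0, size=n_pix)
y = A_nominal @ x_true + rng.normal(0.0, 0.5, size=n_bins)

# Weighted least squares with Tikhonov regularization: weights down-weight measurements
# whose modelled uncertainty is large (hypothetical variance model built from the data).
meas_var = 0.25 + 0.1 * np.abs(y)
W = np.diag(1.0 / meas_var)
lam = 0.5                                            # regularization strength (assumed)
x_hat = np.linalg.solve(A_nominal.T @ W @ A_nominal + lam * np.eye(n_pix),
                        A_nominal.T @ W @ y)

print("RMSE vs truth:", np.sqrt(np.mean((x_hat - x_true) ** 2)).round(3))
```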

  2. Incorporating uncertainty in watershed management decision-making: A mercury TMDL case study

    Science.gov (United States)

    Labiosa, W.; Leckie, J.; Shachter, R.; Freyberg, D.; Rytuba, J.; ,

    2005-01-01

    Water quality impairment due to high mercury fish tissue concentrations and high mercury aqueous concentrations is a widespread problem in several sub-watersheds that are major sources of mercury to the San Francisco Bay. Several mercury Total Maximum Daily Load regulations are currently being developed to address this problem. Decisions about control strategies are being made despite very large uncertainties about current mercury loading behavior, relationships between total mercury loading and methyl mercury formation, and relationships between potential controls and mercury fish tissue levels. To deal with the issues of very large uncertainties, data limitations, knowledge gaps, and very limited State agency resources, this work proposes a decision analytical alternative for mercury TMDL decision support. The proposed probabilistic decision model is Bayesian in nature and is fully compatible with a "learning while doing" adaptive management approach. Strategy evaluation, sensitivity analysis, and information collection prioritization are examples of analyses that can be performed using this approach.

  3. Using expert knowledge to incorporate uncertainty in cause-of-death assignments for modeling of cause-specific mortality

    Science.gov (United States)

    Walsh, Daniel P.; Norton, Andrew S.; Storm, Daniel J.; Van Deelen, Timothy R.; Heisy, Dennis M.

    2018-01-01

    Implicit and explicit use of expert knowledge to inform ecological analyses is becoming increasingly common because it often represents the sole source of information in many circumstances. Thus, there is a need to develop statistical methods that explicitly incorporate expert knowledge, and can successfully leverage this information while properly accounting for associated uncertainty during analysis. Studies of cause-specific mortality provide an example of implicit use of expert knowledge when causes-of-death are uncertain and assigned based on the observer's knowledge of the most likely cause. To explicitly incorporate this use of expert knowledge and the associated uncertainty, we developed a statistical model for estimating cause-specific mortality using a data augmentation approach within a Bayesian hierarchical framework. Specifically, for each mortality event, we elicited the observer's belief of cause-of-death by having them specify the probability that the death was due to each potential cause. These probabilities were then used as prior predictive values within our framework. This hierarchical framework permitted a simple and rigorous estimation method that was easily modified to include covariate effects and regularizing terms. Although applied to survival analysis, this method can be extended to any event-time analysis with multiple event types, for which there is uncertainty regarding the true outcome. We conducted simulations to determine how our framework compared to traditional approaches that use expert knowledge implicitly and assume that cause-of-death is specified accurately. Simulation results supported the inclusion of observer uncertainty in cause-of-death assignment in modeling of cause-specific mortality to improve model performance and inference. Finally, we applied the statistical model we developed and a traditional method to cause-specific survival data for white-tailed deer, and compared results. We demonstrate that model selection
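
    The survival-model details are not given in the abstract, but the data-augmentation idea can be sketched in a simplified multinomial setting: each observer supplies a probability vector over candidate causes, a latent cause is sampled per event from the product of that belief and the current cause probabilities, and the cause probabilities are then updated from the sampled labels. The cause categories, elicited probabilities, and Dirichlet prior below are hypothetical, and the sketch omits covariates and the time-to-event component.

```python
import numpy as np

rng = np.random.default_rng(1)

# Elicited observer beliefs: each row is a mortality event, columns are candidate
# causes (e.g., predation, harvest, other); rows sum to 1 (hypothetical data).
obs_probs = np.array([
    [0.70, 0.20, 0.10],
    [0.10, 0.80, 0.10],
    [0.40, 0.40, 0.20],
    [0.90, 0.05, 0.05],
])
n_events, n_causes = obs_probs.shape
alpha = np.ones(n_causes)                  # Dirichlet prior on cause probabilities

n_iter = 2000
theta = np.full(n_causes, 1.0 / n_causes)  # current cause-specific fractions
draws = np.zeros((n_iter, n_causes))

for it in range(n_iter):
    # Data augmentation: sample each latent cause-of-death from the product of
    # the observer's belief and the current cause probabilities (renormalized).
    w = obs_probs * theta
    w /= w.sum(axis=1, keepdims=True)
    causes = np.array([rng.choice(n_causes, p=w[i]) for i in range(n_events)])

    # Conjugate update: Dirichlet posterior given the sampled cause labels.
    counts = np.bincount(causes, minlength=n_causes)
    theta = rng.dirichlet(alpha + counts)
    draws[it] = theta

print("posterior mean cause fractions:", draws[n_iter // 2:].mean(axis=0))
```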

  4. Incorporating reliability evaluation into the uncertainty analysis of electricity market price

    International Nuclear Information System (INIS)

    Kang, Chongqing; Bai, Lichao; Xia, Qing; Jiang, Jianjian; Zhao, Jing

    2005-01-01

    A novel model and algorithm for analyzing the uncertainties in an electricity market are proposed in this paper. In this model, the bidding decision is formulated as a probabilistic model that takes into account the decision-maker's willingness to bid, risk preferences, the fluctuation of fuel prices, etc. At the same time, each generating unit's uncertain output is modeled through its forced outage rate (FOR). Based on the model, the uncertainty of the market price is then analyzed. Taking the analytical results into consideration, not only can the reliability of the power system be analyzed conventionally, but the possible distribution of market prices can also be easily obtained. The probability distribution of market prices can be further used to calculate the expected output and the sales income of each generating unit in the market. Based on these results, it is also possible to evaluate the risk faced by generating units. A simple system with four generating units is used to illustrate the proposed algorithm. The proposed algorithm and the modeling technique are expected to be helpful to market participants in making their economic decisions
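
    The paper derives the price distribution analytically; a Monte Carlo stand-in conveys the same ingredients in a few lines: uncertain bids, unit availability drawn from each unit's forced outage rate, merit-order dispatch, and the marginal unit setting the price. The four-unit data below are invented for illustration and are not the paper's test system.

```python
import numpy as np

rng = np.random.default_rng(2)

# Four generating units: capacity (MW), mean bid ($/MWh), bid spread, forced outage rate
capacity = np.array([200.0, 150.0, 100.0, 80.0])
bid_mean = np.array([20.0, 30.0, 45.0, 60.0])
bid_sd   = np.array([2.0, 3.0, 5.0, 8.0])
for_rate = np.array([0.05, 0.08, 0.10, 0.12])

demand = 350.0                  # MW, assumed fixed for this sketch
n_sim = 50_000
prices = np.full(n_sim, np.nan)

for k in range(n_sim):
    available = rng.random(4) > for_rate        # unit up/down drawn from its FOR
    bids = rng.normal(bid_mean, bid_sd)         # uncertain bidding behaviour
    order = np.argsort(bids)                    # merit-order dispatch
    served, price = 0.0, np.nan
    for i in order:
        if not available[i]:
            continue
        served += capacity[i]
        price = bids[i]                         # marginal unit sets the price
        if served >= demand:
            break
    if served >= demand:
        prices[k] = price                       # else: shortfall, no market price

ok = ~np.isnan(prices)
print(f"P(shortfall) = {1 - ok.mean():.3f}, mean price = {prices[ok].mean():.1f} $/MWh")
```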

  5. Incorporating covariance estimation uncertainty in spatial sampling design for prediction with trans-Gaussian random fields

    Directory of Open Access Journals (Sweden)

    Gunter eSpöck

    2015-05-01

    Full Text Available Recently, Spock and Pilz [38] demonstrated that the spatial sampling design problem for the Bayesian linear kriging predictor can be transformed to an equivalent experimental design problem for a linear regression model with stochastic regression coefficients and uncorrelated errors. The stochastic regression coefficients derive from the polar spectral approximation of the residual process. Thus, standard optimal convex experimental design theory can be used to calculate optimal spatial sampling designs. The design functionals considered in Spock and Pilz [38] did not take into account the fact that kriging is actually a plug-in predictor which uses the estimated covariance function. The resulting optimal designs were close to space-filling configurations, because the design criterion did not consider the uncertainty of the covariance function. In this paper we also assume that the covariance function is estimated, e.g., by restricted maximum likelihood (REML). We then develop a design criterion that fully takes account of the covariance uncertainty. The resulting designs are less regular and space-filling compared to those ignoring covariance uncertainty. The new designs, however, also require some closely spaced samples in order to improve the estimate of the covariance function. We also relax the assumption of Gaussian observations and assume that the data is transformed to Gaussianity by means of the Box-Cox transformation. The resulting prediction method is known as trans-Gaussian kriging. We apply the Smith and Zhu [37] approach to this kriging method and show that the resulting optimal designs also depend on the available data. We illustrate our results with a data set of monthly rainfall measurements from Upper Austria.
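
    Trans-Gaussian kriging relies on the Box-Cox transformation mentioned in the abstract; a minimal sketch of the forward and inverse transforms is given below. The rainfall values and the choice of lambda are illustrative only, not the Upper Austria data.

```python
import numpy as np

def box_cox(x, lam):
    """Box-Cox transform to (approximate) Gaussianity; x must be positive."""
    x = np.asarray(x, dtype=float)
    return np.log(x) if lam == 0 else (x ** lam - 1.0) / lam

def box_cox_inverse(y, lam):
    """Back-transform predictions from the Gaussian scale to the data scale."""
    y = np.asarray(y, dtype=float)
    return np.exp(y) if lam == 0 else (lam * y + 1.0) ** (1.0 / lam)

# Example: monthly rainfall totals (mm), illustrative values only
rain = np.array([12.0, 48.0, 95.0, 130.0, 210.0])
z = box_cox(rain, lam=0.3)                      # kriging would be done on z
assert np.allclose(box_cox_inverse(z, lam=0.3), rain)
```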

  6. Incorporating organ movements in IMRT treatment planning for prostate cancer: Minimizing uncertainties in the inverse planning process

    International Nuclear Information System (INIS)

    Unkelbach, Jan; Oelfke, Uwe

    2005-01-01

    We investigate an off-line strategy to incorporate interfraction organ movements in IMRT treatment planning. Nowadays, imaging modalities located in the treatment room allow for several CT scans of a patient during the course of treatment. These multiple CT scans can be used to estimate a probability distribution of possible patient geometries. This probability distribution can subsequently be used to calculate the expectation value of the delivered dose distribution. In order to incorporate organ movements into the treatment planning process, it was suggested that inverse planning could be based on that probability distribution of patient geometries instead of a single snapshot. However, it was shown that a straightforward optimization of the expectation value of the dose may be insufficient since the expected dose distribution is subject to several uncertainties: first, this probability distribution has to be estimated from only a few images; and second, the distribution is only sparsely sampled over the treatment course due to a finite number of fractions. In order to obtain a robust treatment plan, these uncertainties should be considered and minimized in the inverse planning process. In the current paper, we calculate a 3D variance distribution in addition to the expectation value of the dose distribution, and both are simultaneously optimized. The variance is used as a surrogate to quantify the associated risks of a treatment plan. The feasibility of this approach is demonstrated for clinical data of prostate patients. Different scenarios of dose expectation values and corresponding variances are discussed
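
    A minimal sketch of the central quantities described above: the per-voxel expectation value and variance of dose over a set of sampled patient geometries, combined into a simple composite objective. The dose arrays, prescription value, and variance weight are placeholders rather than the authors' actual optimization.

```python
import numpy as np

rng = np.random.default_rng(3)

# Dose distributions recomputed on K sampled patient geometries (CT scans),
# flattened to n voxels each; random numbers stand in for a real dose engine.
K, n_voxels = 8, 1000
dose_samples = rng.gamma(shape=50.0, scale=1.0, size=(K, n_voxels))

expected_dose = dose_samples.mean(axis=0)          # E[D] per voxel
dose_variance = dose_samples.var(axis=0, ddof=1)   # Var[D] per voxel (risk surrogate)

# A simple composite objective in the spirit of optimizing both quantities:
# penalize deviation of the expected dose from prescription plus the variance.
prescription = 50.0
beta = 0.5                                         # weight on the variance term (illustrative)
objective = np.mean((expected_dose - prescription) ** 2 + beta * dose_variance)
print(f"composite objective: {objective:.2f}")
```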

  7. Atlas-based segmentation technique incorporating inter-observer delineation uncertainty for whole breast

    International Nuclear Information System (INIS)

    Bell, L R; Pogson, E M; Metcalfe, P; Holloway, L; Dowling, J A

    2017-01-01

    Accurate, efficient auto-segmentation methods are essential for the clinical efficacy of adaptive radiotherapy delivered with highly conformal techniques. Current atlas-based auto-segmentation techniques are adequate in this respect; however, they fail to account for inter-observer variation. An atlas-based segmentation method that incorporates inter-observer variation is proposed. This method is validated for a whole breast radiotherapy cohort containing 28 CT datasets with CTVs delineated by eight observers. To optimise atlas accuracy, the cohort was divided into categories by mean body mass index and laterality, with atlases generated for each in a leave-one-out approach. Observer CTVs were merged and thresholded to generate an auto-segmentation model representing both inter-observer and inter-patient differences. For each category, the atlas was registered to the left-out dataset to enable propagation of the auto-segmentation from atlas space. Auto-segmentation time was recorded. The segmentation was compared to the gold-standard contour using the Dice similarity coefficient (DSC) and mean absolute surface distance (MASD). Comparison with the smallest and largest CTV was also made. This atlas-based auto-segmentation method incorporating inter-observer variation was shown to be efficient (<4 min) and accurate for whole breast radiotherapy, with good agreement (DSC > 0.7, MASD < 9.3 mm) between the auto-segmented contours and CTV volumes. (paper)
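
    The two reported metrics are standard and easy to reproduce; below is a small sketch of the Dice similarity coefficient on binary masks and a brute-force symmetric mean absolute surface distance on contour point sets. The toy masks and points are illustrative, not study data.

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient between two binary masks."""
    a, b = a.astype(bool), b.astype(bool)
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

def mean_abs_surface_distance(pts_a, pts_b):
    """Symmetric mean absolute surface distance between two contour point clouds.
    Brute-force nearest neighbours; fine for small point sets."""
    d = np.linalg.norm(pts_a[:, None, :] - pts_b[None, :, :], axis=2)
    return 0.5 * (d.min(axis=1).mean() + d.min(axis=0).mean())

# Toy 2D example
auto = np.zeros((50, 50), bool); auto[10:40, 10:40] = True
gold = np.zeros((50, 50), bool); gold[12:42, 11:41] = True
print(f"DSC = {dice(auto, gold):.3f}")

# Toy contour point sets (e.g., boundary samples in mm)
pts_a = np.array([[0.0, 0.0], [10.0, 0.0], [10.0, 10.0], [0.0, 10.0]])
pts_b = pts_a + np.array([1.0, 0.5])
print(f"MASD = {mean_abs_surface_distance(pts_a, pts_b):.2f} mm")
```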

  8. Incorporating atmospheric uncertainties into estimates of the detection capability of the IMS infrasound network

    Science.gov (United States)

    Le Pichon, Alexis; Ceranna, Lars; Taillepied, Doriane

    2015-04-01

    To monitor compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), a dedicated network is being deployed. Multi-year observations recorded by the International Monitoring System (IMS) infrasound network confirm that its detection capability is highly variable in space and time. Today, numerical modeling techniques provide a basis to better understand the role of different factors describing the source and the atmosphere that influence propagation predictions. Previous studies estimated the radiated source energy from remote observations using a frequency-dependent attenuation relation and state-of-the-art specifications of the stratospheric wind. In order to account for a realistic description of the dynamic structure of the atmosphere, model predictions are further enhanced by wind and temperature error distributions as measured in the framework of the ARISE project (http://arise-project.eu/). In the context of the future verification of the CTBT, these predictions quantify uncertainties in the spatial and temporal variability of the IMS infrasound network performance at higher resolution, and will be helpful for the design, and for prioritizing the maintenance, of any infrasound monitoring network.

  9. Semi-active control for vibration mitigation of structural systems incorporating uncertainties

    International Nuclear Information System (INIS)

    Miah, Mohammad S; Chatzi, Eleni N; Weber, Felix

    2015-01-01

    This study introduces a novel semi-active control scheme, where the linear-quadratic regulator (LQR) is combined with an unscented Kalman filter (UKF) observer, for the real-time mitigation of structural vibration. Due to a number of factors, such as environmental effects and ageing processes, the controlled system may be characterized by uncertainties. The UKF, which comprises a nonlinear observer, is employed herein for devising an adaptive semi-active control scheme capable of tackling such a challenge. This is achieved through the real-time realization of joint state and parameter estimation during the structural control process via the proposed LQR-UKF approach. The behavior of the introduced scheme is exemplified through two numerical applications. The efficacy of the devised methodology is firstly compared against the standard LQR-KF approach in a linear benchmark application where the system model is assumed known a priori, and secondly, the method is validated on a joint state and parameter estimation problem where the system model is assumed uncertain, formulated as nonlinear, and updated in real-time. (paper)
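
    The scheme above combines an LQR controller with a UKF observer; the observer is beyond a short sketch, but the discrete-time LQR gain itself can be illustrated with a Riccati iteration on a toy single-storey structural model. The system matrices, weights, and time step below are assumptions for demonstration only.

```python
import numpy as np

def dlqr(A, B, Q, R, n_iter=500):
    """Discrete-time LQR gain K (u = -K x) via iteration of the Riccati recursion."""
    P = Q.copy()
    for _ in range(n_iter):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
    return K

# Toy single-storey structure: states [displacement, velocity], control = damper force
dt, m, k, c = 0.01, 1.0, 400.0, 1.0
A = np.array([[1.0, dt], [-k / m * dt, 1.0 - c / m * dt]])
B = np.array([[0.0], [dt / m]])
Q = np.diag([100.0, 1.0])       # weights on displacement and velocity
R = np.array([[0.01]])          # weight on control effort
K = dlqr(A, B, Q, R)
print("LQR gain:", K)
```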

  10. Incorporation of velocity-dependent restitution coefficient and particle surface friction into kinetic theory for modeling granular flow cooling.

    Science.gov (United States)

    Duan, Yifei; Feng, Zhi-Gang

    2017-12-01

    Kinetic theory (KT) has been successfully used to model rapid granular flows in which particle interactions are frictionless and near elastic. However, it fails when particle interactions become frictional and inelastic. For example, the KT is not able to accurately predict the free cooling process of a vibrated granular medium that consists of inelastic frictional particles under microgravity. The main reason that the classical KT fails to model these flows is its inability to account for the particle surface friction and its inelastic behavior, which are the two most important factors that need to be considered in modeling collisional granular flows. In this study, we have modified the KT model so that it is able to incorporate these two factors. The inelasticity of a particle is considered by establishing a velocity-dependent expression for the restitution coefficient based on many experimental studies found in the literature, and the particle friction effect is included by using a tangential restitution coefficient that is related to the particle friction coefficient. Theoretical predictions of the free cooling process by the classical KT and the improved KT are compared with the experimental results from a study conducted on an airplane undergoing parabolic flights without the influence of gravity [Y. Grasselli, G. Bossis, and G. Goutallier, Europhys. Lett. 86, 60007 (2009), 10.1209/0295-5075/86/60007]. Our results show that both the velocity-dependent restitution coefficient and the particle surface friction are important in predicting the free cooling process of granular flows; the modified KT model that integrates these two factors is able to improve the simulation results and leads to better agreement with the experimental results.
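
    The exact velocity-dependent form used by the authors is not given in the abstract; a commonly used viscoelastic-style dependence, e(v) = 1 - a*v^(1/5), can stand in to show the qualitative effect on dissipation. The constant a, the lower clip, and the impact speeds below are illustrative assumptions.

```python
import numpy as np

def restitution_normal(v_impact, a=0.2, e_min=0.1):
    """Velocity-dependent normal restitution coefficient (sketch).
    Uses a viscoelastic-style form e = 1 - a * v**(1/5), clipped at e_min;
    the constant a and the floor are illustrative, not fitted values."""
    e = 1.0 - a * np.asarray(v_impact, dtype=float) ** 0.2
    return np.clip(e, e_min, 1.0)

# Fraction of kinetic energy lost when an identical particle at rest is struck
# head-on is (1 - e^2)/2, so faster impacts dissipate proportionally more.
v = np.array([0.01, 0.1, 0.5, 1.0, 2.0])       # impact speeds (m/s)
e = restitution_normal(v)
print("e(v)      =", np.round(e, 3))
print("loss frac =", np.round((1.0 - e ** 2) / 2.0, 3))
```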

  11. Incorporation of velocity-dependent restitution coefficient and particle surface friction into kinetic theory for modeling granular flow cooling

    Science.gov (United States)

    Duan, Yifei; Feng, Zhi-Gang

    2017-12-01

    Kinetic theory (KT) has been successfully used to model rapid granular flows in which particle interactions are frictionless and near elastic. However, it fails when particle interactions become frictional and inelastic. For example, the KT is not able to accurately predict the free cooling process of a vibrated granular medium that consists of inelastic frictional particles under microgravity. The main reason that the classical KT fails to model these flows is its inability to account for the particle surface friction and its inelastic behavior, which are the two most important factors that need to be considered in modeling collisional granular flows. In this study, we have modified the KT model so that it is able to incorporate these two factors. The inelasticity of a particle is considered by establishing a velocity-dependent expression for the restitution coefficient based on many experimental studies found in the literature, and the particle friction effect is included by using a tangential restitution coefficient that is related to the particle friction coefficient. Theoretical predictions of the free cooling process by the classical KT and the improved KT are compared with the experimental results from a study conducted on an airplane undergoing parabolic flights without the influence of gravity [Y. Grasselli, G. Bossis, and G. Goutallier, Europhys. Lett. 86, 60007 (2009), 10.1209/0295-5075/86/60007]. Our results show that both the velocity-dependent restitution coefficient and the particle surface friction are important in predicting the free cooling process of granular flows; the modified KT model that integrates these two factors is able to improve the simulation results and leads to better agreement with the experimental results.

  12. Optimal Scanning Bandwidth Strategy Incorporating Uncertainty about Adversary’s Characteristics

    Directory of Open Access Journals (Sweden)

    Andrey Garnaev

    2014-12-01

    Full Text Available In this paper, we investigate the problem of designing a spectrum scanning strategy to detect an intelligent Invader who wants to utilize spectrum undetected for his/her unapproved purposes. To deal with this problem we model the situation as two games, between a Scanner and an Invader, and solve them sequentially. The first game is formulated to design the optimal (in the maxmin sense) scanning algorithm, while the second one allows one to find the optimal values of the parameters for the algorithm depending on the parameters of the network. These games provide solutions for two dilemmas that the rivals face. The Invader's dilemma is the following: attempting to use more bandwidth yields a larger payoff if he is not detected, but at the same time it increases the probability of being detected and thus fined. Similarly, the Scanner faces a dilemma: the wider the bandwidth scanned, the higher the probability of detecting the Invader, but at the expense of increasing the cost of building the scanning system. The equilibrium strategies are found explicitly and reveal interesting properties. In particular, we have found a discontinuous dependence of the equilibrium strategies on the network parameters, the fine, and the type of the Invader's reward. This discontinuous dependence on the fine means that the network provider has to take a human/social factor into account, since some threshold values of the fine can strongly deter the Invader, while in other situations simply increasing the fine has minimal deterrence impact. We also show how incomplete information about the Invader's technical characteristics and reward (e.g., motivated by using different types of applications, say, video streaming or file downloading) can be incorporated into the scanning strategy to increase its efficiency.

  13. Development of a decision support tool for seasonal water supply management incorporating system uncertainties and operational constraints

    Science.gov (United States)

    Wang, H.; Asefa, T.

    2017-12-01

    A real-time decision support tool (DST) for a water supply system would consider system uncertainties, e.g., uncertain streamflow and demand, as well as operational constraints and infrastructure outages (e.g., a pump station shutdown, an offline reservoir due to maintenance). Such a DST is often used by water managers for resource allocation and delivery to customers. Although most seasonal DSTs used by water managers recognize those system uncertainties and operational constraints, most use only historical information or assume a deterministic outlook of the water supply system. This study presents a seasonal DST that incorporates rainfall/streamflow uncertainties, a seasonal demand outlook and system operational constraints. Large-scale climate information is captured through a rainfall simulator driven by a Bayesian non-homogeneous Markov Chain Monte Carlo model that allows non-stationary transition probabilities contingent on the Nino 3.4 index. An ad-hoc seasonal demand forecasting model considers weather conditions explicitly and socio-economic factors implicitly. Latin Hypercube sampling is employed to effectively sample the probability density functions of flow and demand. Seasonal system operation is modelled as a mixed-integer optimization problem that aims at minimizing operational costs. It embeds the flexibility of modifying operational rules at different components, e.g., surface water treatment plants, desalination facilities, and groundwater pumping stations. The proposed framework is illustrated at a wholesale water supplier in the Southeastern United States, Tampa Bay Water. The use of the tool is demonstrated in providing operational guidance in a typical drawdown and refill cycle of a regional reservoir. The DST provided: 1) a probabilistic outlook of reservoir storage and the chance of a successful refill by the end of the rainy season; 2) operational expectations for large infrastructure (e.g., high service pumps and booster stations) throughout the season. Other potential use
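
    Latin Hypercube sampling of the flow and demand distributions is one concrete, easily sketched step of the framework. The marginals below (lognormal streamflow, normal demand) and their parameters are placeholders, not Tampa Bay Water values.

```python
import numpy as np
from scipy import stats

def latin_hypercube(n_samples, dists, rng):
    """Latin Hypercube sample: one stratified uniform draw per variable,
    mapped through each variable's inverse CDF (ppf)."""
    d = len(dists)
    u = (rng.permuted(np.tile(np.arange(n_samples), (d, 1)), axis=1).T
         + rng.random((n_samples, d))) / n_samples
    return np.column_stack([dist.ppf(u[:, j]) for j, dist in enumerate(dists)])

rng = np.random.default_rng(4)
# Hypothetical marginals: seasonal streamflow (lognormal, MGD) and demand (normal, MGD)
dists = [stats.lognorm(s=0.4, scale=120.0), stats.norm(loc=180.0, scale=15.0)]
samples = latin_hypercube(1000, dists, rng)
shortfall_prob = np.mean(samples[:, 0] < samples[:, 1])   # crude supply-vs-demand check
print(f"P(streamflow < demand) = {shortfall_prob:.3f}")
```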

  14. Growth of Errors and Uncertainties in Medium Range Ensemble Forecasts of U.S. East Coast Cool Season Extratropical Cyclones

    Science.gov (United States)

    Zheng, Minghua

    Cool-season extratropical cyclones near the U.S. East Coast often have significant impacts on the safety, health, environment and economy of this most densely populated region. Hence it is of vital importance to forecast these high-impact winter storm events as accurately as possible by numerical weather prediction (NWP), including in the medium-range. Ensemble forecasts are appealing to operational forecasters when forecasting such events because they can provide an envelope of likely solutions to serve user communities. However, it is generally accepted that ensemble outputs are not used efficiently in NWS operations mainly due to the lack of simple and quantitative tools to communicate forecast uncertainties and ensemble verification to assess model errors and biases. Ensemble sensitivity analysis (ESA), which employs a linear correlation and regression between a chosen forecast metric and the forecast state vector, can be used to analyze the forecast uncertainty development for both short- and medium-range forecasts. The application of ESA to a high-impact winter storm in December 2010 demonstrated that the sensitivity signals based on different forecast metrics are robust. In particular, the ESA based on the leading two EOF PCs can separate sensitive regions associated with cyclone amplitude and intensity uncertainties, respectively. The sensitivity signals were verified using the leave-one-out cross validation (LOOCV) method based on a multi-model ensemble from CMC, ECMWF, and NCEP. The climatology of ensemble sensitivities for the leading two EOF PCs based on 3-day and 6-day forecasts of historical cyclone cases was presented. It was found that the EOF1 pattern often represents the intensity variations while the EOF2 pattern represents the track variations along west-southwest and east-northeast direction. For PC1, the upper-level trough associated with the East Coast cyclone and its downstream ridge are important to the forecast uncertainty in cyclone
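
    Ensemble sensitivity analysis as described above reduces to a per-grid-point linear regression of a scalar forecast metric on the earlier-time ensemble state. The sketch below uses random placeholder fields and a synthetic metric; the ensemble size, grid size, and signal location are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

n_members, n_grid = 50, 400     # ensemble size, grid points in the state vector
x_early = rng.normal(size=(n_members, n_grid))   # earlier-time field (e.g., 500-hPa height)
J = x_early[:, 120] * 0.8 + rng.normal(scale=0.5, size=n_members)  # forecast metric (toy PC1)

# Ensemble sensitivity: dJ/dx_i = cov(J, x_i) / var(x_i), evaluated per grid point
x_anom = x_early - x_early.mean(axis=0)
J_anom = J - J.mean()
cov = (x_anom * J_anom[:, None]).mean(axis=0)
sensitivity = cov / x_anom.var(axis=0)

# Simple screen on the underlying correlation at the most sensitive point
corr = cov / (x_anom.std(axis=0) * J_anom.std())
i_max = int(np.abs(sensitivity).argmax())
print("max |sensitivity| at grid point", i_max, "corr =", round(float(corr[i_max]), 2))
```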

  15. Mathematical modelling and optimization of a large-scale combined cooling, heat, and power system that incorporates unit changeover and time-of-use electricity price

    International Nuclear Information System (INIS)

    Zhu, Qiannan; Luo, Xianglong; Zhang, Bingjian; Chen, Ying

    2017-01-01

    Highlights: • We propose a novel superstructure for the design and optimization of LSCCHP. • A multi-objective multi-period MINLP model is formulated. • The unit start-up cost and time-of-use electricity prices are involved. • Unit size discretization strategy is proposed to linearize the original MINLP model. • A case study is elaborated to demonstrate the effectiveness of the proposed method. - Abstract: Building energy systems, particularly large public ones, are major energy consumers and pollutant emission contributors. In this study, a superstructure of large-scale combined cooling, heat, and power system is constructed. The off-design unit, economic cost, and CO2 emission models are also formulated. Moreover, a multi-objective mixed integer nonlinear programming model is formulated for the simultaneous system synthesis, technology selection, unit sizing, and operation optimization of large-scale combined cooling, heat, and power system. Time-of-use electricity price and unit changeover cost are incorporated into the problem model. The economic objective is to minimize the total annual cost, which comprises the operation and investment costs of large-scale combined cooling, heat, and power system. The environmental objective is to minimize the annual global CO2 emission of large-scale combined cooling, heat, and power system. The augmented ε–constraint method is applied to achieve the Pareto frontier of the design configuration, thereby reflecting the set of solutions that represent optimal trade-offs between the economic and environmental objectives. Sensitivity analysis is conducted to reflect the impact of natural gas price on the combined cooling, heat, and power system. The synthesis and design of combined cooling, heat, and power system for an airport in China is studied to test the proposed synthesis and design methodology. The Pareto curve of multi-objective optimization shows that the total annual cost varies from 102.53 to 94.59 M

  16. Sensitivity and uncertainty analysis for the tritium breeding ratio of a DEMO fusion reactor with a helium cooled pebble bed blanket

    Science.gov (United States)

    Nunnenmann, Elena; Fischer, Ulrich; Stieglitz, Robert

    2017-09-01

    An uncertainty analysis was performed for the tritium breeding ratio (TBR) of a fusion power plant of the European DEMO type using the MCSEN patch to the MCNP Monte Carlo code. The breeding blanket was of the type Helium Cooled Pebble Bed (HCPB), currently under development in the European Power Plant Physics and Technology (PPPT) programme for a fusion power demonstration reactor (DEMO). A suitable 3D model of the DEMO reactor with HCPB blanket modules, as routinely used for blanket design calculations, was employed. The nuclear cross-section data were taken from the JEFF-3.2 data library. For the uncertainty analysis, the isotopes H-1, Li-6, Li-7, Be-9, O-16, Si-28, Si-29, Si-30, Cr-52, Fe-54, Fe-56, Ni-58, W-182, W-183, W-184 and W-186 were considered. The covariance data were taken from JEFF-3.2 where available. Otherwise a combination of FENDL-2.1 for Li-7, EFF-3 for Be-9 and JENDL-3.2 for O-16 were compared with data from TENDL-2014. Another comparison was performed with covariance data from JEFF-3.3T1. The analyses show an overall uncertainty of ± 3.2% for the TBR when using JEFF-3.2 covariance data with the mentioned additions. When using TENDL-2014 covariance data as replacement, the uncertainty increases to ± 8.6%. For JEFF-3.3T1 the uncertainty result is ± 5.6%. The uncertainty is dominated by O-16, Li-6 and Li-7 cross-sections.

  17. Elucidating Inherent Uncertainties in Data Assimilation for Predictions Incorporating Non-stationary Processes - Focus on Predictive Phenology

    Science.gov (United States)

    Lowman, L.; Barros, A. P.

    2017-12-01

    Data assimilation (DA) is the widely accepted procedure for estimating parameters within predictive models because of the adaptability and uncertainty quantification offered by Bayesian methods. DA applications in phenology modeling offer critical insights into how extreme weather or changes in climate impact the vegetation life cycle. Changes in leaf onset and senescence, root phenology, and intermittent leaf shedding imply large changes in the surface radiative, water, and carbon budgets at multiple scales. Models of leaf phenology require concurrent atmospheric and soil conditions to determine how biophysical plant properties respond to changes in temperature, light and water demand. Presently, climatological records for fraction of photosynthetically active radiation (FPAR) and leaf area index (LAI), the modelled states indicative of plant phenology, are not available. Further, DA models are typically trained on short periods of record (e.g. less than 10 years). Using limited records with a DA framework imposes non-stationarity on estimated parameters and the resulting predicted model states. This talk discusses how uncertainty introduced by the inherent non-stationarity of the modeled processes propagates through a land-surface hydrology model coupled to a predictive phenology model. How water demand is accounted for in the upscaling of DA model inputs, and the choice of analysis period, serve as key sources of uncertainty in the FPAR and LAI predictions. Parameters estimated from different DA periods effectively calibrate a plant water-use strategy within the land-surface hydrology model. For example, when extreme droughts are included in the DA period, the plants are trained to take up water, transpire, and assimilate carbon under favorable conditions and quickly shut down at the onset of water stress.

  18. Incorporating uncertainty into mercury-offset decisions with a probabilistic network for National Pollutant Discharge Elimination System permit holders: an interim report

    Science.gov (United States)

    Wood, Alexander

    2004-01-01

    This interim report describes an alternative approach for evaluating the efficacy of using mercury (Hg) offsets to improve water quality. Hg-offset programs may allow dischargers facing higher pollution-control costs to meet their regulatory obligations by making more cost-effective pollutant-reduction decisions. Efficient Hg management requires methods to translate the science and economics into a regulatory decision framework. This report documents the work in progress by the U.S. Geological Survey's Western Geographic Science Center in collaboration with Stanford University toward developing this decision framework to help managers, regulators, and other stakeholders decide whether offsets can cost-effectively meet the Hg total maximum daily load (TMDL) requirements in the Sacramento River watershed. Two key approaches being considered are: (1) a probabilistic approach that explicitly incorporates scientific uncertainty, cost information, and value judgments; and (2) a quantitative approach that captures uncertainty in testing the feasibility of Hg offsets. Current fate and transport-process models commonly attempt to predict chemical transformations and transport pathways deterministically. However, the physical, chemical, and biologic processes controlling the fate and transport of Hg in aquatic environments are complex and poorly understood. Deterministic models of Hg environmental behavior contain large uncertainties, reflecting this lack of understanding. The uncertainty in these underlying physical processes may produce similarly large uncertainties in the decision-making process. However, decisions about control strategies are still being made despite the large uncertainties in current Hg loadings, the relations between total Hg (HgT) loading and methylmercury (MeHg) formation, and the relations between control efforts and Hg content in fish. The research presented here focuses on an alternative analytical approach to the current use of safety factors and

  19. The incorporation of variability and uncertainty evaluations in WWTP design by means of stochastic dynamic modeling: the case of the Eindhoven WWTP upgrade.

    Science.gov (United States)

    Benedetti, Lorenzo; Belia, Evangelina; Cierkens, Katrijn; Flameling, Tony; De Baets, Bernard; Nopens, Ingmar; Weijers, Stefan

    2013-01-01

    This paper illustrates how a dynamic model can be used to evaluate a plant upgrade on the basis of post-upgrade performance data. The case study is that of the Eindhoven wastewater treatment plant upgrade completed in 2006. As a first step, the design process based on a static model was thoroughly analyzed and the choices regarding variability and uncertainty (i.e. safety factors) were made explicit. This involved the interpretation of the design guidelines and other assumptions made by the engineers. As a second step, a (calibrated) dynamic model of the plant was set up, able to reproduce the anticipated variability (duration and frequency). The third step was to define probability density functions for the parameters assumed to be uncertain, and propagate that uncertainty with the dynamic model by means of Monte Carlo simulations. The last step was the statistical evaluation and interpretation of the simulation results. This work should be regarded as a 'learning exercise' increasing the understanding of how and to what extent variability and uncertainty are currently incorporated in design guidelines used in practice and how model-based post-project appraisals could be performed.
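
    The propagate-then-evaluate pattern described above can be sketched with a stand-in effluent model: draw influent variability and uncertain kinetic parameters, run the model for each draw, and read off a design percentile. The stand-in function, the distributions, and all numbers below are invented for illustration and do not represent the Eindhoven plant model.

```python
import numpy as np

rng = np.random.default_rng(6)

def effluent_ammonia(influent_load, mu_max, volume):
    """Stand-in for a calibrated dynamic plant model (ASM-type); returns a
    crude steady-state effluent NH4 concentration (mg/L). Illustrative only."""
    capacity = mu_max * volume * 0.05
    return np.maximum(influent_load - capacity, 0.0) / 400.0 + 0.5

n_mc = 10_000
# Variability: dynamic influent load (kg N/d); uncertainty: kinetic parameter mu_max
influent = rng.lognormal(mean=np.log(4000.0), sigma=0.25, size=n_mc)
mu_max = rng.normal(loc=0.8, scale=0.1, size=n_mc)

effluent = effluent_ammonia(influent, mu_max, volume=20_000.0)
p95 = np.percentile(effluent, 95)
print(f"95th percentile effluent NH4 = {p95:.2f} mg/L "
      f"(to be compared against the permit limit used in design)")
```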

  20. Hierarchical Bayesian analysis to incorporate age uncertainty in growth curve analysis and estimates of age from length: Florida manatee (Trichechus manatus) carcasses

    Science.gov (United States)

    Schwarz, L.K.; Runge, M.C.

    2009-01-01

    Age estimation of individuals is often an integral part of species management research, and a number of age-estimation techniques are commonly employed. Often, the error in these techniques is not quantified or accounted for in other analyses, particularly in growth curve models used to describe physiological responses to environment and human impacts. Also, noninvasive, quick, and inexpensive methods to estimate age are needed. This research aims to provide two Bayesian methods to (i) incorporate age uncertainty into an age-length Schnute growth model and (ii) produce a method from the growth model to estimate age from length. The methods are then employed for Florida manatee (Trichechus manatus) carcasses. After quantifying the uncertainty in the aging technique (counts of ear bone growth layers), we fit age-length data to the Schnute growth model separately by sex and season. Independent prior information about population age structure and the results of the Schnute model are then combined to estimate age from length. Results describing the age-length relationship agree with our understanding of manatee biology. The new methods allow us to estimate age, with quantified uncertainty, for 98% of collected carcasses: 36% from ear bones, 62% from length.
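
    One standard parameterization of the Schnute (1981) growth curve, and its deterministic inverse from length to age, is sketched below; the paper instead embeds the curve in a hierarchical Bayesian model and combines it with prior age-structure information. The parameter values are illustrative assumptions, not manatee estimates.

```python
import numpy as np

def schnute_length(t, a, b, L1, L2, t1, t2):
    """Schnute (1981) growth curve, general case a != 0, b != 0:
    length at age t given reference lengths L1, L2 at reference ages t1, t2."""
    frac = (1.0 - np.exp(-a * (t - t1))) / (1.0 - np.exp(-a * (t2 - t1)))
    return (L1 ** b + (L2 ** b - L1 ** b) * frac) ** (1.0 / b)

def schnute_age(L, a, b, L1, L2, t1, t2):
    """Invert the same curve to estimate age from length (deterministic inverse;
    the paper combines the growth model with a prior age structure instead)."""
    frac = (L ** b - L1 ** b) / (L2 ** b - L1 ** b) * (1.0 - np.exp(-a * (t2 - t1)))
    return t1 - np.log(1.0 - frac) / a

# Illustrative parameter values only (not fitted to manatee data)
pars = dict(a=0.3, b=1.0, L1=150.0, L2=320.0, t1=0.5, t2=25.0)
ages = np.array([1.0, 5.0, 15.0, 25.0])
lengths = schnute_length(ages, **pars)
print(np.round(lengths, 1), np.round(schnute_age(lengths, **pars), 2))
```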

  1. Parameterizing Spatial Models of Infectious Disease Transmission that Incorporate Infection Time Uncertainty Using Sampling-Based Likelihood Approximations.

    Directory of Open Access Journals (Sweden)

    Rajat Malik

    Full Text Available A class of discrete-time models of infectious disease spread, referred to as individual-level models (ILMs), are typically fitted in a Bayesian Markov chain Monte Carlo (MCMC) framework. These models quantify probabilistic outcomes regarding the risk of infection of susceptible individuals due to various susceptibility and transmissibility factors, including their spatial distance from infectious individuals. The infectious pressure from infected individuals exerted on susceptible individuals is intrinsic to these ILMs. Unfortunately, quantifying this infectious pressure for data sets containing many individuals can be computationally burdensome, leading to a time-consuming likelihood calculation and, thus, computationally prohibitive MCMC-based analysis. This problem worsens when using data augmentation to allow for uncertainty in infection times. In this paper, we develop sampling methods that can be used to calculate a fast, approximate likelihood when fitting such disease models. A simple random sampling approach is initially considered followed by various spatially-stratified schemes. We test and compare the performance of our methods with both simulated data and data from the 2001 foot-and-mouth disease (FMD) epidemic in the U.K. Our results indicate that substantial computation savings can be obtained--albeit, of course, with some information loss--suggesting that such techniques may be of use in the analysis of very large epidemic data sets.
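
    A common ILM form writes the infection probability of a susceptible individual as one minus the exponential of the summed infectious pressure from all infectious individuals; the simple random sampling approximation replaces that sum with a rescaled random subsample. The distance kernel, the parameters alpha and beta, and the distances below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

def infection_prob(dist_to_infectious, alpha=0.8, beta=2.0):
    """ILM-style infection probability: P = 1 - exp(-alpha * sum d_ij**(-beta))."""
    pressure = alpha * np.sum(dist_to_infectious ** (-beta))
    return 1.0 - np.exp(-pressure)

def infection_prob_sampled(dist_to_infectious, frac=0.2, alpha=0.8, beta=2.0):
    """Approximate the infectious pressure from a random subsample of the
    infectious individuals, rescaled by 1/frac (simple random sampling scheme)."""
    n = len(dist_to_infectious)
    m = max(1, int(frac * n))
    idx = rng.choice(n, size=m, replace=False)
    pressure = alpha * np.sum(dist_to_infectious[idx] ** (-beta)) / frac
    return 1.0 - np.exp(-pressure)

d = rng.uniform(1.0, 20.0, size=500)   # distances from one susceptible to infectious farms
print(f"exact P = {infection_prob(d):.4f}, sampled P = {infection_prob_sampled(d):.4f}")
```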

  2. Design and experimental analysis of counter-flow heat and mass exchanger incorporating (M-cycle) for evaporative cooling

    Science.gov (United States)

    Khalid, Omar; Butt, Zubair; Tanveer, Waqas; Rao, Hasan Iqbal

    2017-04-01

    In this paper, the functioning of a dew-point cooler is improved in terms of its thermal effectiveness. For this reason, a heat and mass exchanger has been designed using a counter-flow pattern incorporating the Maisotsenko cycle (M-cycle), with an effective absorbing material (Kraft paper) on the wet channel side and an improved width-to-height ratio. Experimentation has been performed under various inlet air working parameters such as humidity, velocity and temperature, in addition to changing feed water temperature. The experimental results indicate that dew-point and wet-bulb effectiveness of 67-87 % and 104-120 %, respectively, are achieved. Analysis is performed with temperature variation between 25 and 45 °C at different absolute humidity levels ranging from 14.4 to 18 g/kg, while the inlet air velocity is varied between 0.88 and 1.50 m/s. Thus, the improved design has been found to be 5 % more effective in terms of wet-bulb effectiveness as compared to previous counter-flow designs.
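
    The two effectiveness measures quoted above have standard definitions for evaporative coolers: the drop in dry-bulb temperature normalized by the depression to the inlet wet-bulb (or dew-point) temperature. The operating point below is an illustrative assumption, not a measurement from the study.

```python
def wet_bulb_effectiveness(t_in, t_out, t_wb_in):
    """Wet-bulb effectiveness of an evaporative cooler (dry-bulb temperatures in degC)."""
    return (t_in - t_out) / (t_in - t_wb_in)

def dew_point_effectiveness(t_in, t_out, t_dp_in):
    """Dew-point effectiveness: same ratio but referenced to the inlet dew point."""
    return (t_in - t_out) / (t_in - t_dp_in)

# Illustrative operating point (values above 1 for wet-bulb effectiveness are
# possible for M-cycle coolers, as reported in the study)
t_in, t_out = 35.0, 21.0        # inlet / outlet dry-bulb, degC
t_wb, t_dp = 22.0, 18.0         # inlet wet-bulb and dew-point, degC
print(f"wet-bulb effectiveness  = {wet_bulb_effectiveness(t_in, t_out, t_wb):.2f}")
print(f"dew-point effectiveness = {dew_point_effectiveness(t_in, t_out, t_dp):.2f}")
```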

  3. Design Safety Considerations for Water Cooled Small Modular Reactors Incorporating Lessons Learned from the Fukushima Daiichi Accident

    International Nuclear Information System (INIS)

    2016-03-01

    The global future deployment of advanced nuclear reactors for electricity generation depends primarily on the ability of nuclear industries, utilities and regulatory authorities to further enhance their reliability and economic competitiveness while satisfying stringent safety requirements. The IAEA has a project to help coordinate Member States' efforts in the development and deployment of small and medium sized or small modular reactor (SMR) technology. This project aims simultaneously to facilitate SMR technology developers and potential SMR users, particularly States embarking on a nuclear power programme, in identifying key enabling technologies and enhancing capacity building by resolving issues relevant to deployment, including nuclear reactor safety. The objective of this publication is to explore common practices for Member States, which will be an essential resource for future development and deployment of SMR technology. The accident at the Fukushima Daiichi nuclear power plant was caused by an unprecedented combination of natural events: a strong earthquake, beyond the design basis, followed by a series of tsunamis of heights exceeding the design basis tsunami considered in the flood analysis for the site. Consequently, all the operating nuclear power plants and advanced reactors under development, including SMRs, have been incorporating lessons learned from the accident to assure and enhance the performance of the engineered safety features in coping with such external events

  4. Restaurant food cooling practices.

    Science.gov (United States)

    Brown, Laura Green; Ripley, Danny; Blade, Henry; Reimann, Dave; Everstine, Karen; Nicholas, Dave; Egan, Jessica; Koktavy, Nicole; Quilliam, Daniela N

    2012-12-01

    Improper food cooling practices are a significant cause of foodborne illness, yet little is known about restaurant food cooling practices. This study was conducted to examine food cooling practices in restaurants. Specifically, the study assesses the frequency with which restaurants meet U.S. Food and Drug Administration (FDA) recommendations aimed at reducing pathogen proliferation during food cooling. Members of the Centers for Disease Control and Prevention's Environmental Health Specialists Network collected data on food cooling practices in 420 restaurants. The data collected indicate that many restaurants are not meeting FDA recommendations concerning cooling. Although most restaurant kitchen managers report that they have formal cooling processes (86%) and provide training to food workers on proper cooling (91%), many managers said that they do not have tested and verified cooling processes (39%), do not monitor time or temperature during cooling processes (41%), or do not calibrate thermometers used for monitoring temperatures (15%). Indeed, 86% of managers reported cooling processes that did not incorporate all FDA-recommended components. Additionally, restaurants do not always follow recommendations concerning specific cooling methods, such as refrigerating cooling food at shallow depths, ventilating cooling food, providing open-air space around the tops and sides of cooling food containers, and refraining from stacking cooling food containers on top of each other. Data from this study could be used by food safety programs and the restaurant industry to target training and intervention efforts concerning cooling practices. These efforts should focus on the most frequent poor cooling practices, as identified by this study.

  5. Complete Sensitivity/Uncertainty Analysis of LR-0 Reactor Experiments with MSRE FLiBe Salt and Perform Comparison with Molten Salt Cooled and Molten Salt Fueled Reactor Models

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Nicholas R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Powers, Jeffrey J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Mueller, Don [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Patton, Bruce W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-12-01

    In September 2016, reactor physics measurements were conducted at Research Centre Rez (RC Rez) using the FLiBe (2 7LiF + BeF2) salt from the Molten Salt Reactor Experiment (MSRE) in the LR-0 low power nuclear reactor. These experiments were intended to inform on neutron spectral effects and nuclear data uncertainties for advanced reactor systems using FLiBe salt in a thermal neutron energy spectrum. Oak Ridge National Laboratory (ORNL), in collaboration with RC Rez, performed sensitivity/uncertainty (S/U) analyses of these experiments as part of the ongoing collaboration between the United States and the Czech Republic on civilian nuclear energy research and development. The objectives of these analyses were (1) to identify potential sources of bias in fluoride salt-cooled and salt-fueled reactor simulations resulting from cross section uncertainties, and (2) to produce the sensitivity of neutron multiplication to cross section data on an energy-dependent basis for specific nuclides. This report provides a final report on the S/U analyses of critical experiments at the LR-0 Reactor relevant to fluoride salt-cooled high temperature reactor (FHR) and liquid-fueled molten salt reactor (MSR) concepts. In the future, these S/U analyses could be used to inform the design of additional FLiBe-based experiments using the salt from MSRE. The key finding of this work is that, for both solid and liquid fueled fluoride salt reactors, radiative capture in 7Li is the most significant contributor to potential bias in neutronics calculations within the FLiBe salt.

  6. Restaurant Food Cooling Practices†

    Science.gov (United States)

    BROWN, LAURA GREEN; RIPLEY, DANNY; BLADE, HENRY; REIMANN, DAVE; EVERSTINE, KAREN; NICHOLAS, DAVE; EGAN, JESSICA; KOKTAVY, NICOLE; QUILLIAM, DANIELA N.

    2017-01-01

    Improper food cooling practices are a significant cause of foodborne illness, yet little is known about restaurant food cooling practices. This study was conducted to examine food cooling practices in restaurants. Specifically, the study assesses the frequency with which restaurants meet U.S. Food and Drug Administration (FDA) recommendations aimed at reducing pathogen proliferation during food cooling. Members of the Centers for Disease Control and Prevention’s Environmental Health Specialists Network collected data on food cooling practices in 420 restaurants. The data collected indicate that many restaurants are not meeting FDA recommendations concerning cooling. Although most restaurant kitchen managers report that they have formal cooling processes (86%) and provide training to food workers on proper cooling (91%), many managers said that they do not have tested and verified cooling processes (39%), do not monitor time or temperature during cooling processes (41%), or do not calibrate thermometers used for monitoring temperatures (15%). Indeed, 86% of managers reported cooling processes that did not incorporate all FDA-recommended components. Additionally, restaurants do not always follow recommendations concerning specific cooling methods, such as refrigerating cooling food at shallow depths, ventilating cooling food, providing open-air space around the tops and sides of cooling food containers, and refraining from stacking cooling food containers on top of each other. Data from this study could be used by food safety programs and the restaurant industry to target training and intervention efforts concerning cooling practices. These efforts should focus on the most frequent poor cooling practices, as identified by this study. PMID:23212014

  7. Probability of Loss of Assured Safety in Systems with Multiple Time-Dependent Failure Modes: Incorporation of Delayed Link Failure in the Presence of Aleatory Uncertainty.

    Energy Technology Data Exchange (ETDEWEB)

    Helton, Jon C. [Arizona State Univ., Tempe, AZ (United States); Brooks, Dusty Marie [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Sallaberry, Cedric Jean-Marie. [Engineering Mechanics Corp. of Columbus, OH (United States)

    2018-02-01

    Probability of loss of assured safety (PLOAS) is modeled for weak link (WL)/strong link (SL) systems in which one or more WLs or SLs could potentially degrade into a precursor condition to link failure that will be followed by an actual failure after some amount of elapsed time. The following topics are considered: (i) Definition of precursor occurrence time cumulative distribution functions (CDFs) for individual WLs and SLs, (ii) Formal representation of PLOAS with constant delay times, (iii) Approximation and illustration of PLOAS with constant delay times, (iv) Formal representation of PLOAS with aleatory uncertainty in delay times, (v) Approximation and illustration of PLOAS with aleatory uncertainty in delay times, (vi) Formal representation of PLOAS with delay times defined by functions of link properties at occurrence times for failure precursors, (vii) Approximation and illustration of PLOAS with delay times defined by functions of link properties at occurrence times for failure precursors, and (viii) Procedures for the verification of PLOAS calculations for the three indicated definitions of delayed link failure.

  8. Beam cooling

    OpenAIRE

    Danared, H

    2006-01-01

    Beam cooling is the technique of reducing the momentum spread and increasing the phase-space density of stored particle beams. This paper gives an introduction to beam cooling and Liouville’s theorem, and then it describes the three methods of active beam cooling that have been proven to work so far, namely electron cooling, stochastic cooling, and laser cooling. Ionization cooling is also mentioned briefly.

  9. Understanding uncertainty

    CERN Document Server

    Lindley, Dennis V

    2013-01-01

    Praise for the First Edition: "...a reference for everyone who is interested in knowing and handling uncertainty." - Journal of Applied Statistics. The critically acclaimed First Edition of Understanding Uncertainty provided a study of uncertainty addressed to scholars in all fields, showing that uncertainty could be measured by probability, and that probability obeyed three basic rules that enabled uncertainty to be handled sensibly in everyday life. These ideas were extended to embrace the scientific method and to show how decisions, containing an uncertain element, could be rationally made.

  10. Spray cooling

    International Nuclear Information System (INIS)

    Rollin, Philippe.

    1975-01-01

    Spray cooling - using water spraying in air - is surveyed as a possible system for make-up (peak clipping in open circuit) or major cooling (in closed circuit) of the cooling water of the condensers in thermal power plants. Indications are given on the experiments made in France and the systems recently developed in USA, questions relating to performance, cost and environmental effects of spray devices are then dealt with [fr

  11. Cooling tower

    International Nuclear Information System (INIS)

    Baer, E.; Dittrich, H.; Ernst, G.; Roller, W.

    1975-01-01

    The task on which the invention is based is to design a cooling tower in such a way that the negative influences of the wind, in particular strong side winds (wind velocities of over 10 m/s), on the functioning of the cooling tower are reduced or eliminated altogether. (orig./TK) [de

  12. Cooling tower waste reduction

    Energy Technology Data Exchange (ETDEWEB)

    Coleman, S.J.; Celeste, J.; Chine, R.; Scott, C.

    1998-05-01

    At Lawrence Livermore National Laboratory (LLNL), the two main cooling tower systems (central and northwest) were upgraded during the summer of 1997 to reduce the generation of hazardous waste. In 1996, these two tower systems generated approximately 135,400 lbs (61,400 kg) of hazardous sludge, which is more than 90 percent of the hazardous waste for the site annually. At both, wet decks (cascade reservoirs) were covered to block sunlight. Covering the cascade reservoirs reduced the amount of chemical conditioners (e.g., algaecide and biocide) required, and in turn the amount of waste generated was reduced. Additionally, at the northwest cooling tower system, a sand filtration system was installed to allow cyclical filtering and backflushing, and new pumps, piping, and spray nozzles were installed to increase agitation. The appurtenance upgrade increased the efficiency of the cooling towers. The sand filtration system at the northwest cooling tower system enables operators to continuously maintain the cooling tower water quality without taking the towers out of service. Operational costs (including waste handling and disposal) and maintenance activities are compared for the cooling towers before and after the upgrades. Additionally, the effectiveness of the sand filter system in conjunction with the wet deck covers (northwest cooling tower system), versus the cascade reservoir covers alone (south cooling tower system), is discussed. The overall expected return on investment is calculated to be in excess of 250 percent. This upgrade has been incorporated into the 1998 DOE complex-wide water conservation project being led by Sandia National Laboratory/Albuquerque.

  13. ELECTRON COOLING OF RHIC.

    Energy Technology Data Exchange (ETDEWEB)

    BEN-ZVI, I.; LITVINENKO, V.; BARTON, D.; ET AL.

    2005-05-16

    We report progress on the R&D program for electron-cooling of the Relativistic Heavy Ion Collider (RHIC). This electron cooler is designed to cool 100 GeV/nucleon at storage energy using 54 MeV electrons. The electron source will be a superconducting RF photocathode gun. The accelerator will be a superconducting energy recovery linac. The frequency of the accelerator is set at 703.75 MHz. The maximum electron bunch frequency is 9.38 MHz, with bunch charge of 20 nC. The R&D program has the following components: The photoinjector and its photocathode, the superconducting linac cavity, start-to-end beam dynamics with magnetized electrons, electron cooling calculations including benchmarking experiments and development of a large superconducting solenoid. The photoinjector and linac cavity are being incorporated into an energy recovery linac aimed at demonstrating ampere class current at about 20 MeV.

  14. Evaluation of Commercial Off-the-Shelf and Government Off-the-Shelf Microclimate Cooling Systems

    National Research Council Canada - National Science Library

    Laprise, Brad; Teal, Walter; Zuckerman, Leah; Cardinal, Jason

    2005-01-01

    Professionals in dire need of body cooling, namely the First Responder community, can currently choose from an extensive variety of commercially available microclimate cooling products incorporating...

  15. Ventilative Cooling

    DEFF Research Database (Denmark)

    Heiselberg, Per Kvols; Kolokotroni, Maria

    This report, by venticool, summarises the outcome of the work of the initial working phase of IEA ECB Annex 62 Ventilative Cooling and is based on the findings in the participating countries. It presents a summary of the first official Annex 62 report that describes the state-of-the-art of ventilative cooling potentials and limitations, its consideration in current energy performance regulations, available building components and control strategies and analysis methods and tools. In addition, the report provides twenty-six examples of operational buildings using ventilative cooling ranging from

  16. Davis-Besse uncertainty study

    International Nuclear Information System (INIS)

    Davis, C.B.

    1987-08-01

    The uncertainties of calculations of loss-of-feedwater transients at Davis-Besse Unit 1 were determined to address concerns of the US Nuclear Regulatory Commission relative to the effectiveness of feed and bleed cooling. Davis-Besse Unit 1 is a pressurized water reactor of the raised-loop Babcock and Wilcox design. A detailed, quality-assured RELAP5/MOD2 model of Davis-Besse was developed at the Idaho National Engineering Laboratory. The model was used to perform an analysis of the loss-of-feedwater transient that occurred at Davis-Besse on June 9, 1985. A loss-of-feedwater transient followed by feed and bleed cooling was also calculated. The evaluation of uncertainty was based on the comparisons of calculations and data, comparisons of different calculations of the same transient, sensitivity calculations, and the propagation of the estimated uncertainty in initial and boundary conditions to the final calculated results

  17. Uncertainty theory

    CERN Document Server

    Liu, Baoding

    2015-01-01

    When no samples are available to estimate a probability distribution, we have to invite some domain experts to evaluate the belief degree that each event will happen. Perhaps some people think that the belief degree should be modeled by subjective probability or fuzzy set theory. However, it is usually inappropriate because both of them may lead to counterintuitive results in this case. In order to rationally deal with belief degrees, uncertainty theory was founded in 2007 and subsequently studied by many researchers. Nowadays, uncertainty theory has become a branch of axiomatic mathematics for modeling belief degrees. This is an introductory textbook on uncertainty theory, uncertain programming, uncertain statistics, uncertain risk analysis, uncertain reliability analysis, uncertain set, uncertain logic, uncertain inference, uncertain process, uncertain calculus, and uncertain differential equation. This textbook also shows applications of uncertainty theory to scheduling, logistics, networks, data mining, c...

  18. Electron Cooling of RHIC

    International Nuclear Information System (INIS)

    Ben-Zvi, I.; Barton, D.S.; Beavis, D.B.; Blaskiewicz, M.; Brennan, J.M.; Burrill, A.; Calaga, R.; Cameron, P.; Chang, X.Y.; Connolly, R.; Eidelman, Yu.I.; Fedotov, A.V.; Fischer, W.; Gassner, D.M.; Hahn, H.; Harrison, M.; Hershcovitch, A.; Hseuh, H.-C.; Jain, A.K.; Johnson, P.D.J.; Kayran, D.; Kewisch, J.; Lambiase, R.F.; Litvinenko, V.; MacKay, W.W.; Mahler, G.J.; Malitsky, N.; McIntyre, G.T.; Meng, W.; Mirabella, K.A.M.; Montag, C.; Nehring, T.C.N.; Nicoletti, T.; Oerter, B.; Parzen, G.; Pate, D.; Rank, J.; Rao, T.; Roser, T.; Russo, T.; Scaduto, J.; Smith, K.; Trbojevic, D.; Wang, G.; Wei, J.; Williams, N.W.W.; Wu, K.-C.; Yakimenko, V.; Zaltsman, A.; Zhao, Y.; Abell, D.T.; Bruhwiler, D.L.; Bluem, H.; Burger, A.; Cole, M.D.; Favale, A.J.; Holmes, D.; Rathke, J.; Schultheiss, T.; Todd, A.M.M.; Burov, A.V.; Nagaitsev, S.; Delayen, J.R.; Derbenev, Y.S.; Funk, L. W.; Kneisel, P.; Merminga, L.; Phillips, H.L.; Preble, J.P.; Koop, I.; Parkhomchuk, V.V.; Shatunov, Y.M.; Skrinsky, A.N.; Koop, I.; Parkhomchuk, V.V.; Shatunov, Y.M.; Skrinsky, A.N.; Sekutowicz, J.S.

    2005-01-01

    We report progress on the R and D program for electron-cooling of the Relativistic Heavy Ion Collider (RHIC). This electron cooler is designed to cool 100 GeV/nucleon at storage energy using 54 MeV electrons. The electron source will be a superconducting RF photocathode gun. The accelerator will be a superconducting energy recovery linac. The frequency of the accelerator is set at 703.75 MHz. The maximum electron bunch frequency is 9.38 MHz, with bunch charge of 20 nC. The R and D program has the following components: The photoinjector and its photocathode, the superconducting linac cavity, start-to-end beam dynamics with magnetized electrons, electron cooling calculations including benchmarking experiments and development of a large superconducting solenoid. The photoinjector and linac cavity are being incorporated into an energy recovery linac aimed at demonstrating ampere class current at about 20 MeV. A Zeroth Order Design Report is in an advanced draft state, and can be found on the web at http://www.agsrhichome.bnl.gov/eCool/

  19. Electron Cooling of RHIC

    Energy Technology Data Exchange (ETDEWEB)

    I. Ben-Zvi; D.S. Barton; D.B. Beavis; M. Blaskiewicz; J.M. Brennan; A. Burrill; R. Calaga; P. Cameron; X.Y. Chang; R. Connolly; Yu.I. Eidelman; A.V. Fedotov; W. Fischer; D.M. Gassner; H. Hahn; M. Harrison; A. Hershcovitch; H.-C. Hseuh; A.K. Jain; P.D.J. Johnson; D. Kayran; J. Kewisch; R.F. Lambiase; V. Litvinenko; W.W. MacKay; G.J. Mahler; N. Malitsky; G.T. McIntyre; W. Meng; K.A.M. Mirabella; C. Montag; T.C.N. Nehring; T. Nicoletti; B. Oerter; G. Parzen; D. Pate; J. Rank; T. Rao; T. Roser; T. Russo; J. Scaduto; K. Smith; D. Trbojevic; G. Wang; J. Wei; N.W.W. Williams; K.-C. Wu; V. Yakimenko; A. Zaltsman; Y. Zhao; D.T. Abell; D.L. Bruhwiler; H. Bluem; A. Burger; M.D. Cole; A.J. Favale; D. Holmes; J. Rathke; T. Schultheiss; A.M.M. Todd; A.V. Burov; S. Nagaitsev; J.R. Delayen; Y.S. Derbenev; L. W. Funk; P. Kneisel; L. Merminga; H.L. Phillips; J.P. Preble; I. Koop; V.V. Parkhomchuk; Y.M. Shatunov; A.N. Skrinsky; I. Koop; V.V. Parkhomchuk; Y.M. Shatunov; A.N. Skrinsky; J.S. Sekutowicz

    2005-05-16

    We report progress on the R&D program for electron-cooling of the Relativistic Heavy Ion Collider (RHIC). This electron cooler is designed to cool 100 GeV/nucleon at storage energy using 54 MeV electrons. The electron source will be a superconducting RF photocathode gun. The accelerator will be a superconducting energy recovery linac. The frequency of the accelerator is set at 703.75 MHz. The maximum electron bunch frequency is 9.38 MHz, with bunch charge of 20 nC. The R&D program has the following components: The photoinjector and its photocathode, the superconducting linac cavity, start-to-end beam dynamics with magnetized electrons, electron cooling calculations including benchmarking experiments and development of a large superconducting solenoid. The photoinjector and linac cavity are being incorporated into an energy recovery linac aimed at demonstrating ampere class current at about 20 MeV. A Zeroth Order Design Report is in an advanced draft state, and can be found on the web at http://www.agsrhichome.bnl.gov/eCool/.

  20. Calibration uncertainty

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Anglov, Thomas

    2002-01-01

    Methods recommended by the International Standardization Organisation and Eurachem are not satisfactory for the correct estimation of calibration uncertainty. A novel approach is introduced and tested on actual calibration data for the determination of Pb by ICP-AES. The improved calibration...

  1. Simplified propagation of standard uncertainties

    International Nuclear Information System (INIS)

    Shull, A.H.

    1997-01-01

    An essential part of any measurement control program is adequate knowledge of the uncertainties of the measurement system standards. Only with an estimate of the standards' uncertainties can one determine whether the standard is adequate for its intended use or calculate the total uncertainty of the measurement process. Purchased standards usually have estimates of uncertainty on their certificates. However, when standards are prepared and characterized by a laboratory, variance propagation is required to estimate the uncertainty of the standard. Traditional variance propagation typically involves tedious use of partial derivatives, unfriendly software and the availability of statistical expertise. As a result, the uncertainty of prepared standards is often not determined or is determined incorrectly. For situations meeting stated assumptions, easier shortcut methods of estimation are now available which eliminate the need for partial derivatives and require only a spreadsheet or calculator. The calculations are simplified by dividing the uncertainty components into subgroups of absolute and relative uncertainties. These methods also incorporate the International Standards Organization (ISO) concepts for combining systematic and random uncertainties as published in the Guide to the Expression of Uncertainty in Measurement. Details of the simplified methods and examples of their use are included in the paper.
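
    The shortcut described above can be illustrated with a minimal sketch (not the paper's own worked examples): multiplicative contributions are grouped as relative uncertainties and additive corrections as absolute uncertainties, and each subgroup is combined in quadrature. All quantities and values below are hypothetical.

      import math

      # Minimal sketch (hypothetical values): a standard's concentration computed as
      # (mass of solute / mass of solution) * purity. Multiplicative contributions
      # combine via relative uncertainties; additive corrections would combine via
      # absolute uncertainties. Each subgroup is summed in quadrature.

      def combine_relative(*rel_components):
          """Quadrature sum of relative (fractional) uncertainty components."""
          return math.sqrt(sum(u ** 2 for u in rel_components))

      def combine_absolute(*abs_components):
          """Quadrature sum of absolute uncertainty components (same units)."""
          return math.sqrt(sum(u ** 2 for u in abs_components))

      mass_solute, u_mass_solute = 1.0000, 0.0002      # g
      mass_solution, u_mass_solution = 100.00, 0.05    # g
      purity, u_purity = 0.999, 0.001                  # mass fraction

      concentration = mass_solute / mass_solution * purity  # g solute per g solution

      rel_u = combine_relative(u_mass_solute / mass_solute,
                               u_mass_solution / mass_solution,
                               u_purity / purity)
      print(f"concentration = {concentration:.6f} +/- {rel_u * concentration:.6f} g/g (k=1)")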

  2. Investigation of V and V process for thermal fatigue issue in a sodium cooled fast reactor – Application of uncertainty quantification scheme in verification and validation with fluid-structure thermal interaction problem in T-junction piping system

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, Masaaki, E-mail: tanaka.masaaki@jaea.go.jp

    2014-11-15

    Highlights: • Outline of numerical simulation code MUGTHES for fluid-structure thermal interaction was described. • The grid convergence index (GCI) method was applied according to the ASME V and V-20 guide. • Uncertainty of MUGTHES can be successfully quantified for thermal-hydraulic problems and unsteady heat conduction problems in the structure. • Validation for fluid-structure thermal interaction problem in a T-junction piping system was well conducted. - Abstract: Thermal fatigue caused by thermal mixing phenomena is one of the most important issues in design and safety assessment of fast breeder reactors. A numerical simulation code MUGTHES consisting of two calculation modules for unsteady thermal-hydraulics analysis and unsteady heat conduction analysis in structure has been developed to predict thermal mixing phenomena and to estimate thermal response of structure under the thermal interaction between fluid and structure fields. Although verification and validation (V and V) of MUGTHES has been required, actual procedure for uncertainty quantification is not fixed yet. In order to specify an actual procedure of V and V, uncertainty quantifications with the grid convergence index (GCI) estimation according to the existing guidelines were conducted in fundamental laminar flow problems for the thermal-hydraulics analysis module, and also uncertainty for the structure heat conduction analysis module and conjugate heat transfer model was quantified in comparison with the theoretical solutions of unsteady heat conduction problems. After the verification, MUGTHES was validated for a practical fluid-structure thermal interaction problem in T-junction piping system compared with measured results of velocity and temperatures of fluid and structure. Through the numerical simulations in the verification and validation, uncertainty of the code was successfully estimated and applicability of the code to the thermal fatigue issue was confirmed.
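
    For readers unfamiliar with the grid convergence index mentioned in the highlights, the following minimal sketch shows the standard three-grid GCI estimate in the spirit of the ASME V and V-20 style procedure; it is not taken from MUGTHES, and the solution values and refinement ratio are hypothetical.

      import math

      def gci_three_grids(f1, f2, f3, r, Fs=1.25):
          """Grid Convergence Index from solutions on fine (f1), medium (f2) and
          coarse (f3) grids with a constant refinement ratio r (illustrative only)."""
          # Observed order of accuracy from the three solutions.
          p = math.log(abs((f3 - f2) / (f2 - f1))) / math.log(r)
          # Relative error between the two finest grids.
          e21 = abs((f2 - f1) / f1)
          # GCI for the fine-grid solution (safety factor Fs).
          gci_fine = Fs * e21 / (r ** p - 1.0)
          return p, gci_fine

      # Hypothetical peak-temperature results on three successively refined grids.
      p, gci = gci_three_grids(f1=512.1, f2=513.4, f3=516.9, r=2.0)
      print(f"observed order p = {p:.2f}, GCI(fine) = {100 * gci:.2f} %")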

  3. Investigation of V and V process for thermal fatigue issue in a sodium cooled fast reactor – Application of uncertainty quantification scheme in verification and validation with fluid-structure thermal interaction problem in T-junction piping system

    International Nuclear Information System (INIS)

    Tanaka, Masaaki

    2014-01-01

    Highlights: • Outline of numerical simulation code MUGTHES for fluid-structure thermal interaction was described. • The grid convergence index (GCI) method was applied according to the ASME V and V-20 guide. • Uncertainty of MUGTHES can be successfully quantified for thermal-hydraulic problems and unsteady heat conduction problems in the structure. • Validation for fluid-structure thermal interaction problem in a T-junction piping system was well conducted. - Abstract: Thermal fatigue caused by thermal mixing phenomena is one of the most important issues in design and safety assessment of fast breeder reactors. A numerical simulation code MUGTHES consisting of two calculation modules for unsteady thermal-hydraulics analysis and unsteady heat conduction analysis in structure has been developed to predict thermal mixing phenomena and to estimate thermal response of structure under the thermal interaction between fluid and structure fields. Although verification and validation (V and V) of MUGTHES has been required, actual procedure for uncertainty quantification is not fixed yet. In order to specify an actual procedure of V and V, uncertainty quantifications with the grid convergence index (GCI) estimation according to the existing guidelines were conducted in fundamental laminar flow problems for the thermal-hydraulics analysis module, and also uncertainty for the structure heat conduction analysis module and conjugate heat transfer model was quantified in comparison with the theoretical solutions of unsteady heat conduction problems. After the verification, MUGTHES was validated for a practical fluid-structure thermal interaction problem in T-junction piping system compared with measured results of velocity and temperatures of fluid and structure. Through the numerical simulations in the verification and validation, uncertainty of the code was successfully estimated and applicability of the code to the thermal fatigue issue was confirmed

  4. A very cool cooling system

    CERN Multimedia

    Antonella Del Rosso

    2015-01-01

    The NA62 Gigatracker is a jewel of technology: its sensor, which delivers the time of the crossing particles with a precision of less than 200 picoseconds (better than similar LHC detectors), has a cooling system that might become the precursor to a completely new detector technique.   The 115 metre long vacuum tank of the NA62 experiment. The NA62 Gigatracker (GTK) is composed of a set of three innovative silicon pixel detectors, whose job is to measure the arrival time and the position of the incoming beam particles. Installed in the heart of the NA62 detector, the silicon sensors are cooled down (to about -20 degrees Celsius) by a microfluidic silicon device. “The cooling system is needed to remove the heat produced by the readout chips the silicon sensor is bonded to,” explains Alessandro Mapelli, microsystems engineer working in the Physics department. “For the NA62 Gigatracker we have designed a cooling plate on top of which both the silicon sensor and the...

  5. Cooling systems

    International Nuclear Information System (INIS)

    Coutant, C.C.

    1978-01-01

    Progress on the thermal effects project is reported with regard to physiology and distribution of Corbicula; power plant effects studies on burrowing mayfly populations; comparative thermal responses of largemouth bass from northern and southern populations; temperature selection by striped bass in Cherokee Reservoir; fish population studies; and predictive thermoregulation by fishes. Progress is also reported on the following; cause and ecological ramifications of threadfin shad impingement; entrainment project; aquaculture project; pathogenic amoeba project; and cooling tower drift project

  6. Demand Uncertainty

    DEFF Research Database (Denmark)

    Nguyen, Daniel Xuyen

    This paper presents a model of trade that explains why firms wait to export and why many exporters fail. Firms face uncertain demands that are only realized after the firm enters the destination. The model retools the timing of uncertainty resolution found in productivity heterogeneity models. This retooling addresses several shortcomings. First, the imperfect correlation of demands reconciles the sales variation observed in and across destinations. Second, since demands for the firm's output are correlated across destinations, a firm can use previously realized demands to forecast unknown demands in untested destinations. The option to forecast demands causes firms to delay exporting in order to gather more information about foreign demand. Third, since uncertainty is resolved after entry, many firms enter a destination and then exit after learning that they cannot profit. This prediction reconciles...

  7. Network planning under uncertainties

    Science.gov (United States)

    Ho, Kwok Shing; Cheung, Kwok Wai

    2008-11-01

    generic framework for solving the network planning problem under uncertainties. In addition to reviewing the various network planning problems involving uncertainties, we also propose that a unified framework based on robust optimization can be used to solve a rather large segment of network planning problem under uncertainties. Robust optimization is first introduced in the operations research literature and is a framework that incorporates information about the uncertainty sets for the parameters in the optimization model. Even though robust optimization is originated from tackling the uncertainty in the optimization process, it can serve as a comprehensive and suitable framework for tackling generic network planning problems under uncertainties. In this paper, we begin by explaining the main ideas behind the robust optimization approach. Then we demonstrate the capabilities of the proposed framework by giving out some examples of how the robust optimization framework can be applied to the current common network planning problems under uncertain environments. Next, we list some practical considerations for solving the network planning problem under uncertainties with the proposed framework. Finally, we conclude this article with some thoughts on the future directions for applying this framework to solve other network planning problems.
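
    As a minimal illustration of the worst-case idea behind robust optimization (not the unified framework proposed in the paper), the sketch below sizes a single link capacity against a finite uncertainty set of demand scenarios; all costs, demands and the brute-force search are hypothetical simplifications.

      # Choose a capacity x minimizing the worst-case cost over an uncertainty set
      # of demand scenarios. A real robust model would use an LP/MIP reformulation;
      # brute force is used here only to keep the sketch self-contained.

      capacity_cost = 2.0          # cost per unit of installed capacity
      shortfall_penalty = 10.0     # cost per unit of unmet demand
      demand_scenarios = [80.0, 95.0, 110.0, 130.0]   # uncertainty set for demand

      def worst_case_cost(x):
          return max(capacity_cost * x + shortfall_penalty * max(d - x, 0.0)
                     for d in demand_scenarios)

      candidates = [c / 2.0 for c in range(0, 401)]   # capacities 0.0 .. 200.0
      x_robust = min(candidates, key=worst_case_cost)
      print(f"robust capacity = {x_robust}, worst-case cost = {worst_case_cost(x_robust):.1f}")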

  8. Cargill Incorporated

    Science.gov (United States)

    Cargill, Incorporated, 518 East Fourth Street, Watkins Glen, New York 14891 has applied to the U.S. Environmental Protection Agency (EPA) under the provisions of the Safe Drinking Water Act, 42 U.S.C. 300f et. seq (the Act), for a new Underground Injection

  9. Photometric Uncertainties

    Science.gov (United States)

    Zou, Xiao-Duan; Li, Jian-Yang; Clark, Beth Ellen; Golish, Dathon

    2018-01-01

    The OSIRIS-REx spacecraft, launched in September 2016, will study the asteroid Bennu and return a sample from its surface to Earth in 2023. Bennu is a near-Earth carbonaceous asteroid which will provide insight into the formation and evolution of the solar system. OSIRIS-REx will first approach Bennu in August 2018 and will study the asteroid for approximately two years before sampling. OSIRIS-REx will develop its photometric model (including Lommel-Seeliger, ROLO, McEwen, Minnaert and Akimov) of Bennu with OCAM and OVIRS during the Detailed Survey mission phase. The model developed during this phase will be used to photometrically correct the OCAM and OVIRS data. Here we present the analysis of the error for the photometric corrections. Based on our testing data sets, we find: 1. The model uncertainties are only estimated correctly when the full covariance matrix is used, because the parameters are highly correlated. 2. There is no evidence of domination by any single parameter in any model. 3. Both the model error and the data error contribute comparably to the final correction error. 4. We tested the uncertainty module on synthetic and real data sets and find that model performance depends on the data coverage and data quality. These tests gave us a better understanding of how different models behave in different cases. 5. The L-S model is more reliable than the others, perhaps because the simulated data are based on the L-S model; however, the test on real data (SPDIF) also shows a slight advantage for L-S. ROLO is not reliable for calculating Bond albedo. The uncertainty of the McEwen model is large in most cases. Akimov behaves unphysically on the SOPIE 1 data. 6. L-S is therefore the preferred default choice; this conclusion is based mainly on our tests on the SOPIE data and IPDIF.
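
    The first finding above, that uncertainties are only meaningful when the full parameter covariance is used, can be illustrated with a generic linearized propagation sketch; the toy phase function, parameter values and covariance matrix below are hypothetical and are not OSIRIS-REx results.

      import numpy as np

      # Propagate a fitted model's parameter covariance to the uncertainty of a
      # prediction via the Jacobian (first-order approximation). Hypothetical data.

      def model(params, phase_angle_deg):
          """Toy phase function: f = a0 + a1 * alpha (alpha in degrees)."""
          a0, a1 = params
          return a0 + a1 * phase_angle_deg

      params = np.array([0.045, -3.0e-4])
      cov = np.array([[4.0e-6, -9.0e-8],     # strong correlation between a0 and a1
                      [-9.0e-8, 4.0e-9]])

      alpha = 30.0
      J = np.array([1.0, alpha])             # Jacobian of the model at this phase angle

      var_full = J @ cov @ J                          # full covariance (correlations included)
      var_diag = J @ np.diag(np.diag(cov)) @ J        # off-diagonal terms ignored

      print(f"prediction = {model(params, alpha):.5f}")
      print(f"sigma (full covariance) = {np.sqrt(var_full):.2e}")
      print(f"sigma (diagonal only)   = {np.sqrt(var_diag):.2e}")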

  10. Cooling device for upper lid of reactor pressure vessel

    International Nuclear Information System (INIS)

    Takayama, Kazuhiko.

    1997-01-01

    The upper lid of a reactor pressure vessel in a BWR type reactor has one or more annular cooling elements on its surface, the outer side of which is covered by a temperature-keeping frame. The cooling elements are annular and hollow, and cooling water is supplied to the hollow portion. The cooling water supplied to the cooling elements is taken either from the reactor auxiliary cooling system, a cooling water system incorporated in the reactor container, or from the dehumidification system of the reactor container. A plurality of temperature sensors are disposed at various portions of the upper lid of the pressure vessel. A control device determines the temperature scatter across the entire upper lid of the pressure vessel from the signals sent by the temperature sensors. The amount of cooling water flowing to each cooling element is then controlled so as to eliminate this scatter. (I.N.)

  11. Electron Cooling of RHIC

    CERN Document Server

    Ben-Zvi, Ilan; Barton, Donald; Beavis, Dana; Blaskiewicz, Michael; Bluem, Hans; Brennan, Joseph M; Bruhwiler, David L; Burger, Al; Burov, Alexey; Burrill, Andrew; Calaga, Rama; Cameron, Peter; Chang, Xiangyun; Cole, Michael; Connolly, Roger; Delayen, Jean R; Derbenev, Yaroslav S; Eidelman, Yury I; Favale, Anthony; Fedotov, Alexei V; Fischer, Wolfram; Funk, L W; Gassner, David M; Hahn, Harald; Harrison, Michael; Hershcovitch, Ady; Holmes, Douglas; Hseuh Hsiao Chaun; Johnson, Peter; Kayran, Dmitry; Kewisch, Jorg; Kneisel, Peter; Koop, Ivan; Lambiase, Robert; Litvinenko, Vladimir N; MacKay, William W; Mahler, George; Malitsky, Nikolay; McIntyre, Gary; Meng, Wuzheng; Merminga, Lia; Meshkov, Igor; Mirabella, Kerry; Montag, Christoph; Nagaitsev, Sergei; Nehring, Thomas; Nicoletti, Tony; Oerter, Brian; Parkhomchuk, Vasily; Parzen, George; Pate, David; Phillips, Larry; Preble, Joseph P; Rank, Jim; Rao, Triveni; Rathke, John; Roser, Thomas; Russo, Thomas; Scaduto, Joseph; Schultheiss, Tom; Sekutowicz, Jacek; Shatunov, Yuri; Sidorin, Anatoly O; Skrinsky, Aleksander Nikolayevich; Smirnov, Alexander V; Smith, Kevin T; Todd, Alan M M; Trbojevic, Dejan; Troubnikov, Grigory; Wang, Gang; Wei, Jie; Williams, Neville; Wu, Kuo-Chen; Yakimenko, Vitaly; Zaltsman, Alex; Zhao, Yongxiang; ain, Animesh K

    2005-01-01

    We report progress on the R&D program for electron-cooling of the Relativistic Heavy Ion Collider (RHIC). This electron cooler is designed to cool 100 GeV/nucleon at storage energy using 54 MeV electrons. The electron source will be a superconducting RF photocathode gun. The accelerator will be a superconducting energy recovery linac. The frequency of the accelerator is set at 703.75 MHz. The maximum electron bunch frequency is 9.38 MHz, with bunch charge of 20 nC. The R&D program has the following components: The photoinjector and its photocathode, the superconducting linac cavity, start-to-end beam dynamics with magnetized electrons, electron cooling calculations including benchmarking experiments and development of a large superconducting solenoid. The photoinjector and linac cavity are being incorporated into an energy recovery linac aimed at demonstrating ampere class current at about 20 MeV. A Zeroth Order Design Report is in an advanced draft state, and can be found on the web at http://www.ags...

  12. ATLAS - Liquid Cooling Systems

    CERN Multimedia

    Bonneau, P.

    1998-01-01

    Photo 1 - Cooling Unit - Side View Photo 2 - Cooling Unit - Detail Manifolds Photo 3 - Cooling Unit - Rear View Photo 4 - Cooling Unit - Detail Pump, Heater and Exchanger Photo 5 - Cooling Unit - Detail Pump and Fridge Photo 6 - Cooling Unit - Front View

  13. Uncertainty analysis

    International Nuclear Information System (INIS)

    Thomas, R.E.

    1982-03-01

    An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software
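
    As a minimal illustration of the Latin Hypercube Sampling approach discussed above (not the report's Cork and Bottle problem), the sketch below stratifies two uncertain inputs, propagates them through a simple heat-conduction expression and summarizes the output distribution; all distributions and values are hypothetical.

      import numpy as np
      from statistics import NormalDist

      rng = np.random.default_rng(0)

      def latin_hypercube(n_samples, n_dims):
          """One random point per stratum in each dimension, independently permuted."""
          cols = [(rng.permutation(n_samples) + rng.random(n_samples)) / n_samples
                  for _ in range(n_dims)]
          return np.column_stack(cols)

      n = 200
      u = latin_hypercube(n, 2)

      # Map the unit-hypercube sample onto the input distributions.
      heat_flux = np.array([NormalDist(100.0, 5.0).inv_cdf(p) for p in u[:, 0]])  # W/m^2
      conductivity = 0.5 + 0.2 * u[:, 1]                                          # W/(m K), uniform

      # Toy model: temperature rise across a 1 cm slab.
      delta_t = heat_flux * 0.01 / conductivity

      print(f"mean dT = {delta_t.mean():.2f} K, std = {delta_t.std(ddof=1):.2f} K")
      print(f"95th percentile dT = {np.percentile(delta_t, 95):.2f} K")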

  14. Uncertainty analysis

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, R.E.

    1982-03-01

    An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software.

  15. Development of cooling strategy for an air cooled lithium-ion battery pack

    Science.gov (United States)

    Sun, Hongguang; Dixon, Regan

    2014-12-01

    This paper describes a cooling strategy development method for an air cooled battery pack with lithium-ion pouch cells used in a hybrid electric vehicle (HEV). The challenges associated with the temperature uniformity across the battery pack, the temperature uniformity within each individual lithium-ion pouch cell, and the cooling efficiency of the battery pack are addressed. Initially, a three-dimensional battery pack thermal model developed based on simplified electrode theory is correlated to physical test data. An analytical design of experiments (DOE) approach using Optimal Latin-hypercube technique is then developed by incorporating a DOE design model, the correlated battery pack thermal model, and a morphing model. Analytical DOE studies are performed to examine the effects of cooling strategies including geometries of the cooling duct, cooling channel, cooling plate, and corrugation on battery pack thermal behavior and to identify the design concept of an air cooled battery pack to maximize its durability and its driving range.

  16. Uncertainty Analyses and Strategy

    International Nuclear Information System (INIS)

    Kevin Coppersmith

    2001-01-01

    The DOE identified a variety of uncertainties, arising from different sources, during its assessment of the performance of a potential geologic repository at the Yucca Mountain site. In general, the number and detail of process models developed for the Yucca Mountain site, and the complex coupling among those models, make the direct incorporation of all uncertainties difficult. The DOE has addressed these issues in a number of ways using an approach to uncertainties that is focused on producing a defensible evaluation of the performance of a potential repository. The treatment of uncertainties oriented toward defensible assessments has led to analyses and models with so-called ''conservative'' assumptions and parameter bounds, where conservative implies lower performance than might be demonstrated with a more realistic representation. The varying maturity of the analyses and models, and uneven level of data availability, result in total system level analyses with a mix of realistic and conservative estimates (for both probabilistic representations and single values). That is, some inputs have realistically represented uncertainties, and others are conservatively estimated or bounded. However, this approach is consistent with the ''reasonable assurance'' approach to compliance demonstration, which was called for in the U.S. Nuclear Regulatory Commission's (NRC) proposed 10 CFR Part 63 regulation (64 FR 8640 [DIRS 101680]). A risk analysis that includes conservatism in the inputs will result in conservative risk estimates. Therefore, the approach taken for the Total System Performance Assessment for the Site Recommendation (TSPA-SR) provides a reasonable representation of processes and conservatism for purposes of site recommendation. However, mixing unknown degrees of conservatism in models and parameter representations reduces the transparency of the analysis and makes the development of coherent and consistent probability statements about projected repository

  17. Cool Snacks

    DEFF Research Database (Denmark)

    Krogager, Stinne Gunder Strøm; Grunert, Klaus G; Brunsø, Karen

    2016-01-01

    Young people snack and their snacking habits are not always healthy. We address the question of whether it is possible to develop a new snack product that adolescents will find attractive, even though it is based on ingredients as healthy as fruits and vegetables, and we argue that developing such a product requires an interdisciplinary effort where researchers with backgrounds in psychology, anthropology, media science, philosophy, sensory science and food science join forces. We present the COOL SNACKS project, where such a blend of competences was used first to obtain thorough insight into young people's snacking behaviour and then to develop and test new, healthier snacking solutions. These new snacking solutions were tested and found to be favourably accepted by young people. The paper therefore provides a proof of principle that the development of snacks that are both healthy and attractive...

  18. Cool visitors

    CERN Multimedia

    2006-01-01

    Pictured, from left to right: Tim Izo (saxophone, flute, guitar), Bobby Grant (tour manager), George Pajon (guitar). What do the LHC and a world-famous hip-hop group have in common? They are cool! On Saturday, 1st July, before their appearance at the Montreux Jazz Festival, three members of the 'Black Eyed Peas' came on a surprise visit to CERN, inspired by Dan Brown's Angels and Demons. At short notice, Connie Potter (Head of the ATLAS secretariat) organized a guided tour of ATLAS and the AD 'antimatter factory'. Still curious, lead vocalist Will.I.Am met CERN physicist Rolf Landua after the concert to ask many more questions on particles, CERN, and the origin of the Universe.

  19. Handling uncertainty

    DEFF Research Database (Denmark)

    Jønsson, Jesper Bosse; Fold, Niels

    2009-01-01

    Small-scale mining supports the livelihoods of several hundred thousand rural households in Africa. Nonetheless, the understanding of the organizational dynamics of small-scale miners' activities is modest. The paper outlines the small-scale mining codes in Tanzania and contrasts them to prevalent organizational practices in two Tanzanian small-scale mining settlements. It is argued that there is a need to adjust the regulatory mechanisms to well-consolidated practices: If basic practices differ substantially from official prescriptions of the mining codes over an extended period of time, certain elements… manoeuvring of local small-scale mining operators and the reasons for the variations are essential for policymakers and development practitioners to understand. By incorporating prevalent practices and context-dependent variations in some of the crucial organizational components, it is possible to design...

  20. WORKSHOP: Beam cooling

    International Nuclear Information System (INIS)

    Anon.

    1994-01-01

    Cooling - the control of unruly particles to provide well-behaved beams - has become a major new tool in accelerator physics. The main approaches of electron cooling pioneered by Gersh Budker at Novosibirsk and stochastic cooling by Simon van der Meer at CERN, are now complemented by additional ideas, such as laser cooling of ions and ionization cooling of muons

  1. Renewable Heating And Cooling

    Science.gov (United States)

    Renewable heating and cooling is a set of alternative resources and technologies that can be used in place of conventional heating and cooling technologies for common applications such as water heating, space heating, space cooling and process heat.

  2. Background and Qualification of Uncertainty Methods

    International Nuclear Information System (INIS)

    D'Auria, F.; Petruzzi, A.

    2008-01-01

    The evaluation of uncertainty constitutes the necessary supplement to Best Estimate calculations performed to understand accident scenarios in water cooled nuclear reactors. The need arises from the imperfection of computational tools on the one hand and from the interest in using such tools to obtain a more precise evaluation of safety margins on the other. The paper reviews the salient features of two independent approaches for estimating uncertainties associated with predictions of complex system codes: the propagation of code input error and the propagation of the calculation output error are the key concepts for identifying the methods of current interest for industrial applications. With either method, uncertainty bands (both upper and lower) can be derived for any desired quantity of the transient of interest. In the second case, the uncertainty method is coupled with the thermal-hydraulic code to obtain a Code with the capability of Internal Assessment of Uncertainty, whose features are discussed in more detail.

  3. Decay heat uncertainty quantification of MYRRHA

    Science.gov (United States)

    Fiorito, Luca; Buss, Oliver; Hoefer, Axel; Stankovskiy, Alexey; Eynde, Gert Van den

    2017-09-01

    MYRRHA is a lead-bismuth cooled MOX-fueled accelerator driven system (ADS) currently in the design phase at SCK·CEN in Belgium. The correct evaluation of the decay heat and of its uncertainty level is very important for the safety demonstration of the reactor. In the first part of this work we assessed the decay heat released by the MYRRHA core using the ALEPH-2 burnup code. The second part of the study focused on the nuclear data uncertainty and covariance propagation to the MYRRHA decay heat. Radioactive decay data, independent fission yield and cross section uncertainties/covariances were propagated using two nuclear data sampling codes, namely NUDUNA and SANDY. According to the results, 238U cross sections and fission yield data are the largest contributors to the MYRRHA decay heat uncertainty. The calculated uncertainty values are deemed acceptable from the safety point of view as they are well within the available regulatory limits.
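
    The random-sampling idea behind codes such as NUDUNA and SANDY can be illustrated with a minimal sketch (not their actual implementation): nuclear data are perturbed within assumed uncertainties, the decay-heat contribution is recomputed for each perturbed set, and the spread of the results is taken as the propagated uncertainty. The nuclide data below are hypothetical placeholders.

      import numpy as np

      rng = np.random.default_rng(1)
      n_samples = 1000

      # One fission-product contribution: P = N * lambda * Q, with
      # N = yield * fissions (atoms), lambda = decay constant (1/s), Q = energy (J).
      yield_mean, yield_rel_u = 0.06, 0.03          # independent fission yield, 3 %
      lam_mean, lam_rel_u = 2.9e-7, 0.01            # decay constant, 1 %
      q_joule = 2.5e-13                             # recoverable energy per decay
      fissions = 1.0e24

      samples = []
      for _ in range(n_samples):
          y = rng.normal(yield_mean, yield_rel_u * yield_mean)
          lam = rng.normal(lam_mean, lam_rel_u * lam_mean)
          samples.append(y * fissions * lam * q_joule)

      samples = np.array(samples)
      print(f"decay heat contribution = {samples.mean():.3e} W "
            f"+/- {samples.std(ddof=1):.3e} W "
            f"({100 * samples.std(ddof=1) / samples.mean():.1f} %)")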

  4. Decay heat uncertainty quantification of MYRRHA

    Directory of Open Access Journals (Sweden)

    Fiorito Luca

    2017-01-01

    Full Text Available MYRRHA is a lead-bismuth cooled MOX-fueled accelerator driven system (ADS) currently in the design phase at SCK·CEN in Belgium. The correct evaluation of the decay heat and of its uncertainty level is very important for the safety demonstration of the reactor. In the first part of this work we assessed the decay heat released by the MYRRHA core using the ALEPH-2 burnup code. The second part of the study focused on the nuclear data uncertainty and covariance propagation to the MYRRHA decay heat. Radioactive decay data, independent fission yield and cross section uncertainties/covariances were propagated using two nuclear data sampling codes, namely NUDUNA and SANDY. According to the results, 238U cross sections and fission yield data are the largest contributors to the MYRRHA decay heat uncertainty. The calculated uncertainty values are deemed acceptable from the safety point of view as they are well within the available regulatory limits.

  5. Passive Cooling of buildings by night-time ventilation

    DEFF Research Database (Denmark)

    Artmann, Nikolai; Manz, Heinrich; Heiselberg, Per

    Due to an overall trend towards an increasing cooling energy demand in buildings in many European countries over the last few decades, passive cooling by night-time ventilation is seen as a promising concept. However, because of uncertainties in thermal comfort predictions, architects and engineers...

  6. Proton and neutron electromagnetic form factors and uncertainties

    Science.gov (United States)

    Ye, Zhihong; Arrington, John; Hill, Richard J.; Lee, Gabriel

    2018-02-01

    We determine the nucleon electromagnetic form factors and their uncertainties from world electron scattering data. The analysis incorporates two-photon exchange corrections, constraints on the low-Q2 and high-Q2 behavior, and additional uncertainties to account for tensions between different data sets and uncertainties in radiative corrections.

  7. Review of groundwater cooling systems in London

    Energy Technology Data Exchange (ETDEWEB)

    Ampofo, F.; Maidment, G.G.; Missenden, J.F. [Department of Engineering Systems, Faculty of Engineering, Science and The Built Environment, London South Bank University, 103 Borough Road, London, SE1 0AA (United Kingdom)

    2006-12-15

    The environmental impact of the UK building stock has increased the pressure on architects, engineers and building operators to reduce the use of air conditioning in favour of more passive cooling solutions. Good progress has been made in this direction but many passive solutions are limited to new-build projects. For existing buildings, and those for which mechanical air conditioning cannot be avoided, low energy cooling capability can be incorporated to significantly improve overall efficiency. This paper focuses on one such low energy capability - cooling using groundwater, which has gained popularity in recent years in the London area. Among the reasons for this are the excellent energy efficiency and the increasing viability of water extraction systems. The paper shows that groundwater cooling technology can be incorporated into newly built and existing buildings to help reduce the environmental impact of the UK building stock. (author)

  8. Uncertainty and measurement

    International Nuclear Information System (INIS)

    Landsberg, P.T.

    1990-01-01

    This paper explores how the quantum mechanics uncertainty relation can be considered to result from measurements. A distinction is drawn between the uncertainties obtained by scrutinising experiments and the standard deviation type of uncertainty definition used in quantum formalism. (UK)

  9. Process fluid cooling system

    International Nuclear Information System (INIS)

    Farquhar, N.G.; Schwab, J.A.

    1977-01-01

    A system of heat exchangers is disclosed for cooling process fluids. The system is particularly applicable to cooling steam generator blowdown fluid in a nuclear plant prior to chemical purification of the fluid; it minimizes the potential for boiling of the plant cooling water that cools the blowdown fluid.

  10. Cooling Grapple System for FMEF hot cell

    International Nuclear Information System (INIS)

    Semmens, L.S.; Frandsen, G.B.; Tome, R.

    1983-01-01

    A Cooling Grapple System was designed and built to handle fuel assemblies within the FMEF hot cell. The variety of functions for which it is designed makes it unique among grapples presently in use. The Cooling Grapple can positively grip and transport assemblies vertically, retrieve assemblies from molten sodium where six inches of grapple tip is submerged, cool 7 kW assemblies in argon, and service an in-cell area of 372 m² (4000 ft²). Novel and improved operating and maintenance features were incorporated in the design, including a shear pin and mechanical catcher system to prevent overloading the grapple while allowing additional reaction time for crane shutdown.

  11. Hybrid radiator cooling system

    Science.gov (United States)

    France, David M.; Smith, David S.; Yu, Wenhua; Routbort, Jules L.

    2016-03-15

    A method and hybrid radiator-cooling apparatus for implementing enhanced radiator-cooling are provided. The hybrid radiator-cooling apparatus includes an air-side finned surface for air cooling; an elongated vertically extending surface extending outwardly from the air-side finned surface on a downstream air-side of the hybrid radiator; and a water supply for selectively providing evaporative cooling with water flow by gravity on the elongated vertically extending surface.

  12. Uncertainty and global warming : an option - pricing approach to policy

    OpenAIRE

    Baranzini, Andrea; Chesney, Marc; Morisset, Jacques

    1995-01-01

    Uncertainty is inherent in the analysis of global warming issues. Not only is there considerable scientific uncertainty about the magnitude of global warming, but even if that problem were resolved, there is uncertainty about what monetary value to assign to the costs and benefits of various policies to reduce global warming. And yet the influence of uncertainty in policymaker's decisions is ignored in most studies of the issue. The authors try to explicitly incorporate the effect of uncertai...

  13. Heating up the gas cooling market

    International Nuclear Information System (INIS)

    Watt, G.

    2001-01-01

    Gas cooling is an exciting technology with a potentially bright future. It comprises the production of cooling (and heating) in buildings and industry, by substituting environmentally friendlier natural gas or LPG for predominantly coal-fired electricity in air conditioning equipment. There are currently four established technologies using gas to provide cooling energy or conditioned air. These are: absorption, both direct gas-fired and utilising hot water or steam; gas engine driven vapour compression (GED); cogeneration, with absorption cooling driven by recovered heat; and desiccant systems. The emergence of gas cooling technologies has been, and remains, one of evolution rather than revolution. However, further development of the technology has had a revolutionary effect on the performance, reliability and consumer acceptability of gas cooling products. Developments from world-renowned manufacturers such as York, Hitachi, Robur and Thermax have produced a range of absorption equipment variously offering: the use of 100 percent environmentally-friendly refrigerants, with zero global warming potential; the ideal utilisation of waste heat from cogeneration systems; a reduction in electrical distribution and stand-by generation capacity; long product life expectancy; far less noise and vibration; performance efficiency maintained down to about 20 percent of load capacity; and highly automated and low-cost maintenance. It is expected that hybrid systems, that is, a mixture of gas and electric cooling technologies, will dominate the future market, reflecting the uncertainty in the electricity market and the prospects of stable future gas prices.

  14. Incorporating Externalities and Uncertainty into Life-Cycle Cost Analysis

    Science.gov (United States)

    2012-03-01

    byproducts of combustion that form as a result of the reaction of nitrogen and oxygen. NOx can react with other compounds in the air to form particulate...cycle air emissions of coal, domestic natural gas, LNG, and SNG for electricity generation. Environmental Science & Technology, 41(17), 6290-6296

  15. Muon cooling channels

    CERN Document Server

    Keil, Eberhard

    2003-01-01

    A procedure uses the equations that govern ionization cooling, and leads to the most important parameters of a muon cooling channel that achieves assumed performance parameters. First, purely transverse cooling is considered, followed by both transverse and longitudinal cooling in quadrupole and solenoid channels. Similarities and differences in the results are discussed in detail, and a common notation is developed. Procedure and notation are applied to a few published cooling channels. The parameters of the cooling channels are derived step by step, starting from assumed values of the initial, final and equilibrium emittances, both transverse and longitudinal, the length of the cooling channel, and the material properties of the absorber. The results obtained include cooling lengths and partition numbers, amplitude functions and limits on the dispersion at the absorber, length, aperture and spacing of the absorber, parameters of the RF system that achieve the longitudinal amplitude function and bucket area ...

  16. Decay heat uncertainty quantification of MYRRHA

    OpenAIRE

    Fiorito Luca; Buss Oliver; Hoefer Axel; Stankovskiy Alexey; Eynde Gert Van den

    2017-01-01

    MYRRHA is a lead-bismuth cooled MOX-fueled accelerator driven system (ADS) currently in the design phase at SCK·CEN in Belgium. The correct evaluation of the decay heat and of its uncertainty level is very important for the safety demonstration of the reactor. In the first part of this work we assessed the decay heat released by the MYRRHA core using the ALEPH-2 burnup code. The second part of the study focused on the nuclear data uncertainty and covariance propagation to the MYRRHA decay hea...

  17. Uncertainty quantification approaches for advanced reactor analyses.

    Energy Technology Data Exchange (ETDEWEB)

    Briggs, L. L.; Nuclear Engineering Division

    2009-03-24

    The original approach to nuclear reactor design or safety analyses was to make very conservative modeling assumptions so as to ensure meeting the required safety margins. Traditional regulation, as established by the U.S. Nuclear Regulatory Commission, required conservatisms which have subsequently been shown to be excessive. The commission has therefore moved away from excessively conservative evaluations and has determined best-estimate calculations to be an acceptable alternative to conservative models, provided the best-estimate results are accompanied by an uncertainty evaluation which can demonstrate that, when a set of analysis cases which statistically account for uncertainties of all types are generated, there is a 95% probability that at least 95% of the cases meet the safety margins. To date, nearly all published work addressing uncertainty evaluations of nuclear power plant calculations has focused on light water reactors and on large-break loss-of-coolant accident (LBLOCA) analyses. However, there is nothing in the uncertainty evaluation methodologies that is limited to a specific type of reactor or to specific types of plant scenarios. These same methodologies can be equally well applied to analyses for high-temperature gas-cooled reactors and to liquid metal reactors, and they can be applied to steady-state calculations, operational transients, or severe accident scenarios. This report reviews and compares both statistical and deterministic uncertainty evaluation approaches. Recommendations are given for selection of an uncertainty methodology and for considerations to be factored into the process of evaluating uncertainties for advanced reactor best-estimate analyses.
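
    The 95% probability / 95% confidence criterion quoted above is often demonstrated with non-parametric (Wilks-type) tolerance limits, in which a fixed number of randomly sampled cases is run and the largest result is taken as a one-sided bound; the sketch below computes the familiar minimum sample size under that assumption, shown only as a generic illustration.

      # Smallest N such that the maximum of N random cases is a one-sided
      # (coverage, confidence) tolerance bound: 1 - coverage**N >= confidence.

      def wilks_sample_size(coverage=0.95, confidence=0.95):
          n = 1
          while 1.0 - coverage ** n < confidence:
              n += 1
          return n

      print(wilks_sample_size())            # 59 cases for a first-order 95/95 bound
      print(wilks_sample_size(0.95, 0.99))  # more cases for higher confidence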

  18. The challenges on uncertainty analysis for pebble bed HTGR

    International Nuclear Information System (INIS)

    Hao, C.; Li, F.; Zhang, H.

    2012-01-01

    Uncertainty analysis is widely used and important, and much work has been done for Light Water Reactors (LWRs), although experience with uncertainty analysis in High Temperature Gas cooled Reactor (HTGR) modeling is still at an early stage. The IAEA will soon launch a Coordinated Research Project (CRP) on this topic. This paper addresses some challenges for uncertainty analysis in HTGR modeling, based on the experience of the OECD LWR Uncertainty Analysis in Modeling (UAM) activities, and taking into account the peculiarities of pebble bed HTGR designs. The main challenges for HTGR UAM are the lack of experience, the entirely different code packages, and the coupling of the power, temperature and burnup distributions through the temperature feedback and pebble flow. The most serious challenge is how to deal with the uncertainty in pebble flow and in pebble bed flow modeling, and their contribution to the uncertainty of the maximum fuel temperature, which is the parameter of greatest interest for the modular HTGR. (authors)

  19. Visualizing Summary Statistics and Uncertainty

    KAUST Repository

    Potter, K.

    2010-08-12

    The graphical depiction of uncertainty information is emerging as a problem of great importance. Scientific data sets are not considered complete without indications of error, accuracy, or levels of confidence. The visual portrayal of this information is a challenging task. This work takes inspiration from graphical data analysis to create visual representations that show not only the data value, but also important characteristics of the data including uncertainty. The canonical box plot is reexamined and a new hybrid summary plot is presented that incorporates a collection of descriptive statistics to highlight salient features of the data. Additionally, we present an extension of the summary plot to two dimensional distributions. Finally, a use-case of these new plots is presented, demonstrating their ability to present high-level overviews as well as detailed insight into the salient features of the underlying data distribution. © 2010 The Eurographics Association and Blackwell Publishing Ltd.
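
    The following sketch is not the authors' hybrid summary plot, only a minimal illustration of the underlying idea: start from the canonical box plot and overlay additional descriptive statistics (here the mean and one standard deviation) on synthetic data.

      import numpy as np
      import matplotlib.pyplot as plt

      rng = np.random.default_rng(7)
      data = [rng.normal(loc, scale, 300)
              for loc, scale in [(0.0, 1.0), (1.0, 2.0), (0.5, 0.5)]]

      fig, ax = plt.subplots()
      ax.boxplot(data, showfliers=True)
      # Overlay mean +/- one standard deviation next to each box.
      for i, d in enumerate(data, start=1):
          ax.errorbar(i + 0.25, d.mean(), yerr=d.std(ddof=1), fmt="o", capsize=4)

      ax.set_xticklabels(["case A", "case B", "case C"])
      ax.set_ylabel("value")
      ax.set_title("Box plot with overlaid mean and standard deviation")
      plt.show()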

  20. Cooling water distribution system

    Science.gov (United States)

    Orr, Richard

    1994-01-01

    A passive containment cooling system for a nuclear reactor containment vessel. Disclosed is a cooling water distribution system for introducing cooling water by gravity uniformly over the outer surface of a steel containment vessel using an interconnected series of radial guide elements, a plurality of circumferential collector elements and collector boxes to collect and feed the cooling water into distribution channels extending along the curved surface of the steel containment vessel. The cooling water is uniformly distributed over the curved surface by a plurality of weirs in the distribution channels.

  1. Laser cooling of solids

    OpenAIRE

    Nemova, Galina

    2009-01-01

    Parallel to advances in laser cooling of atoms and ions in dilute gas phase, which has progressed immensely, resulting in physics Nobel prizes in 1997 and 2001, major progress has recently been made in laser cooling of solids. I compare the physical nature of the laser cooling of atoms and ions with that of the laser cooling of solids. I point out all advantages of this new and very promising area of laser physics. Laser cooling of solids (optical refrigeration) at the present time can be lar...

  2. Cooling tower calculations

    International Nuclear Information System (INIS)

    Simonkova, J.

    1988-01-01

    The problems of the dynamic calculation of cooling towers with forced and natural air draft are summarized. The quantities and relations characterizing the simultaneous exchange of momentum, heat and mass in evaporative water cooling by atmospheric air in the packings of cooling towers are given. The method of solution is explained for the calculation of the evaporation criteria and thermal characteristics of countercurrent and cross-current cooling systems. The procedure for the calculation of cooling towers and correction curves is demonstrated, and the effect of operating at constant air number or constant outlet air volume flow on the course of these curves in fan-driven cooling towers is assessed. In natural-draft cooling towers, the unevenness of the water and air flows is assessed with respect to its effect on the resulting cooling efficiency of the towers. The calculation of the thermal and resistance response curves and cooling curves of hydraulically unevenly loaded towers is demonstrated for a water flow rate graded radially by 20% across the cross-section of the packing. As demonstrated on a concrete example, air flow unevenness due to wind acting on the outlet air flow from the tower significantly affects the cooled water temperatures in natural-draft cooling towers of designs with lower aerodynamic requirements, even at wind velocities as low as 2 m/s. (author). 11 figs., 10 refs
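
    As a generic illustration of the kind of evaporative-cooling calculation discussed (the report's own method and data are not reproduced here), the sketch below evaluates the counterflow Merkel number with the textbook four-point Chebyshev integration; the saturated-air enthalpy table and operating conditions are approximate, assumed values.

      import numpy as np

      cp_w = 4.186  # kJ/(kg K), specific heat of water

      # Approximate enthalpy of saturated air (kJ/kg dry air) vs. temperature (deg C).
      t_tab = np.array([20.0, 25.0, 30.0, 35.0, 40.0, 45.0, 50.0])
      h_tab = np.array([57.5, 76.5, 100.0, 129.5, 166.5, 213.0, 275.0])

      def h_sat(t):
          """Approximate enthalpy of saturated air at 1 atm (table interpolation)."""
          return np.interp(t, t_tab, h_tab)

      def merkel_number(t_hot, t_cold, h_air_in, water_to_air_ratio):
          """KaV/L by the four-point Chebyshev form of the Merkel integral."""
          points = np.array([0.1, 0.4, 0.6, 0.9])
          t = t_cold + points * (t_hot - t_cold)
          # Air enthalpy rises linearly with water temperature (energy balance).
          h_air = h_air_in + water_to_air_ratio * cp_w * (t - t_cold)
          driving_force = h_sat(t) - h_air
          return cp_w * (t_hot - t_cold) / 4.0 * np.sum(1.0 / driving_force)

      print(f"Me = KaV/L = {merkel_number(40.0, 30.0, 60.0, 1.2):.2f}")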

  3. Robustness to strategic uncertainty

    NARCIS (Netherlands)

    Andersson, O.; Argenton, C.; Weibull, J.W.

    We introduce a criterion for robustness to strategic uncertainty in games with continuum strategy sets. We model a player's uncertainty about another player's strategy as an atomless probability distribution over that player's strategy set. We call a strategy profile robust to strategic uncertainty

  4. Fission Spectrum Related Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    G. Aliberti; I. Kodeli; G. Palmiotti; M. Salvatores

    2007-10-01

    The paper presents a preliminary uncertainty analysis related to potential uncertainties on the fission spectrum data. Consistent results are shown for a reference fast reactor design configuration and for experimental thermal configurations. However the results obtained indicate the need for further analysis, in particular in terms of fission spectrum uncertainty data assessment.

  5. Uncertainty analysis guide

    International Nuclear Information System (INIS)

    Andres, T.H.

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
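
    Two of the propagation options named above, a series (first-order sensitivity) approximation and Monte Carlo sampling, can be compared with a minimal sketch on a toy model y = a*exp(b); the input means and standard deviations are hypothetical.

      import numpy as np

      a_mean, a_std = 2.0, 0.1
      b_mean, b_std = 0.5, 0.05

      # First-order series approximation: var(y) ~ (dy/da)^2 var(a) + (dy/db)^2 var(b).
      dyda = np.exp(b_mean)
      dydb = a_mean * np.exp(b_mean)
      y_std_series = np.sqrt((dyda * a_std) ** 2 + (dydb * b_std) ** 2)

      # Monte Carlo propagation with independent normal inputs.
      rng = np.random.default_rng(3)
      a = rng.normal(a_mean, a_std, 100_000)
      b = rng.normal(b_mean, b_std, 100_000)
      y = a * np.exp(b)

      print(f"series approximation: sigma_y = {y_std_series:.4f}")
      print(f"Monte Carlo:          sigma_y = {y.std(ddof=1):.4f}")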

  6. Uncertainty and Cognitive Control

    Directory of Open Access Journals (Sweden)

    Faisal eMushtaq

    2011-10-01

    Full Text Available A growing body of neuroimaging, behavioural and computational research has investigated the topic of outcome uncertainty in decision-making. Although evidence to date indicates that humans are very effective in learning to adapt to uncertain situations, the nature of the specific cognitive processes involved in the adaptation to uncertainty is still a matter of debate. In this article, we reviewed evidence suggesting that cognitive control processes are at the heart of uncertainty in decision-making contexts. Available evidence suggests that: (1) There is a strong conceptual overlap between the constructs of uncertainty and cognitive control; (2) There is a remarkable overlap between the neural networks associated with uncertainty and the brain networks subserving cognitive control; (3) The perception and estimation of uncertainty might play a key role in monitoring processes and the evaluation of the need for control; (4) Potential interactions between uncertainty and cognitive control might play a significant role in several affective disorders.

  7. Regulating fisheries under uncertainty

    DEFF Research Database (Denmark)

    Hansen, Lars Gårn; Jensen, Frank

    2017-01-01

    Regulator uncertainty is decisive for whether price or quantity regulation maximizes welfare in fisheries. In this paper, we develop a model of fisheries regulation that includes ecological uncertainty, variable economic uncertainty as well as structural economic uncertainty. We aggregate the effects of these uncertainties into a single welfare measure for comparing tax and quota regulation. It is shown that quotas are always preferred to fees when structural economic uncertainty dominates. Since most regulators are subject to this kind of uncertainty, this result is a potentially important qualification of the pro-price regulation message dominating the fisheries economics literature. We also believe that the model of a fishery developed in this paper could be applied to the regulation of other renewable resources where regulators are subject to uncertainty either directly or with some...

  8. Optimal Groundwater Extraction under Uncertainty and a Spatial Stock Externality

    Science.gov (United States)

    We introduce a model that incorporates two important elements to estimating welfare gains from groundwater management: stochasticity and a spatial stock externality. We estimate welfare gains resulting from optimal management under uncertainty as well as a gradual stock externali...

  9. Quantifying uncertainty due to internal variability using high-resolution regional climate model simulations

    Science.gov (United States)

    Gutmann, E. D.; Ikeda, K.; Deser, C.; Rasmussen, R.; Clark, M. P.; Arnold, J. R.

    2015-12-01

    The uncertainty in future climate predictions is as large or larger than the mean climate change signal. As such, any predictions of future climate need to incorporate and quantify the sources of this uncertainty. One of the largest sources comes from the internal, chaotic, variability within the climate system itself. This variability has been approximated using the 30 ensemble members of the Community Earth System Model (CESM) large ensemble. Here we examine the wet and dry end members of this ensemble for cool-season precipitation in the Colorado Rocky Mountains with a set of high-resolution regional climate model simulations. We have used the Weather Research and Forecasting model (WRF) to simulate the periods 1990-2000, 2025-2035, and 2070-2080 on a 4km grid. These simulations show that the broad patterns of change depicted in CESM are inherited by the high-resolution simulations; however, the differences in the height and location of the mountains in the WRF simulation, relative to the CESM simulation, means that the location and magnitude of the precipitation changes are very different. We further show that high-resolution simulations with the Intermediate Complexity Atmospheric Research model (ICAR) predict a similar spatial pattern in the change signal as WRF for these ensemble members. We then use ICAR to examine the rest of the CESM Large Ensemble as well as the uncertainty in the regional climate model due to the choice of physics parameterizations.

  10. NASA Microclimate Cooling Challenges

    Science.gov (United States)

    Trevino, Luis A.

    2004-01-01

    The purpose of this outline form presentation is to present NASA's challenges in microclimate cooling as related to the spacesuit. An overview of spacesuit flight-rated personal cooling systems is presented, which includes a brief history of cooling systems from Gemini through Space Station missions. The roles of the liquid cooling garment, thermal environment extremes, the sublimator, multi-layer insulation, and helmet visor UV and solar coatings are reviewed. A second section is presented on advanced personal cooling systems studies, which include heat acquisition studies on cooling garments, heat rejection studies on water boiler & radiators, thermal storage studies, and insulation studies. Past and present research and development and challenges are summarized for the advanced studies.

  11. The cooling of particle beams

    International Nuclear Information System (INIS)

    Sessler, A.M.

    1994-10-01

    A review is given of the various methods which can be employed for cooling particle beams. These methods include radiation damping, stimulated radiation damping, ionization cooling, stochastic cooling, electron cooling, laser cooling, and laser cooling with beam coupling. Laser cooling has provided beams of the lowest temperatures, namely 1 mK, but only for ions and only for the longitudinal temperature. Recent theoretical work has suggested how laser cooling, with the coupling of beam motion, can be used to reduce the ion beam temperature in all three directions. The majority of this paper is devoted to describing laser cooling and laser cooling with beam coupling.

  12. Initial Cooling Experiment (ICE)

    CERN Multimedia

    Photographic Service; CERN PhotoLab

    1978-01-01

    In 1977, in a record-time of 9 months, the magnets of the g-2 experiment were modified and used to build a proton/antiproton storage ring: the "Initial Cooling Experiment" (ICE). It served for the verification of the cooling methods to be used for the "Antiproton Project". Stochastic cooling was proven the same year, electron cooling followed later. Also, with ICE the experimental lower limit for the antiproton lifetime was raised by 9 orders of magnitude: from 2 microseconds to 32 hours. For its previous life as g-2 storage ring, see 7405430. More on ICE: 7711282, 7809081, 7908242.

  13. Turbine airfoil cooling system with cooling systems using high and low pressure cooling fluids

    Energy Technology Data Exchange (ETDEWEB)

    Marsh, Jan H.; Messmann, Stephen John; Scribner, Carmen Andrew

    2017-10-25

    A turbine airfoil cooling system including a low pressure cooling system and a high pressure cooling system for a turbine airfoil of a gas turbine engine is disclosed. In at least one embodiment, the low pressure cooling system may be an ambient air cooling system, and the high pressure cooling system may be a compressor bleed air cooling system. In at least one embodiment, the compressor bleed air cooling system may be in communication with a high pressure subsystem that may be a snubber cooling system positioned within a snubber. A delivery system including a movable air supply tube may be used to separate the low and high pressure cooling subsystems. The delivery system may enable high pressure cooling air to be passed to the snubber cooling system separate from low pressure cooling fluid supplied by the low pressure cooling system to other portions of the turbine airfoil cooling system.

  14. Avoiding climate change uncertainties in Strategic Environmental Assessment

    DEFF Research Database (Denmark)

    Larsen, Sanne Vammen; Kørnøv, Lone; Driscoll, Patrick Arthur

    2013-01-01

    This article is concerned with how Strategic Environmental Assessment (SEA) practice handles climate change uncertainties within the Danish planning system. First, a hypothetical model is set up for how uncertainty is handled and not handled in decision-making. The model incorporates the strategies...

  15. Improved Thermoelectrically Cooled Laser-Diode Assemblies

    Science.gov (United States)

    Glesne, Thomas R.; Schwemmer, Geary K.; Famiglietti, Joe

    1994-01-01

    Cooling decreases wavelength and increases efficiency and lifetime. Two improved thermoelectrically cooled laser-diode assemblies incorporate commercial laser diodes and provide a combination of high wavelength stability and broad wavelength tuning. These broadly tunable, highly stable devices are used for injection seeding of pulsed, high-power tunable alexandrite lasers employed in lidar remote sensing of water vapor at wavelengths in the vicinity of 727 nanometers. The assemblies provide the temperature control needed to take advantage of the tunability of commercial AlGaAs laser diodes in the present injection-seeding application.

  16. Advancing Uncertainty: Untangling and Discerning Related Concepts

    Directory of Open Access Journals (Sweden)

    Janice Penrod

    2002-12-01

    Full Text Available Methods of advancing concepts within the qualitative paradigm have been developed and articulated. In this section, I describe methodological perspectives of a project designed to advance the concept of uncertainty using multiple qualitative methods. Through a series of earlier studies, the concept of uncertainty arose repeatedly in varied contexts, working its way into prominence, and warranting further investigation. Processes of advanced concept analysis were used to initiate the formal investigation into the meaning of the concept. Through concept analysis, the concept was deconstructed to identify conceptual components and gaps in understanding. Using this skeletal framework of the concept identified through concept analysis, subsequent studies were carried out to add ‘flesh’ to the concept. First, a concept refinement using the literature as data was completed. Findings revealed that the current state of the concept of uncertainty failed to incorporate what was known of the lived experience. Therefore, using interview techniques as the primary data source, a phenomenological study of uncertainty among caregivers was conducted. Incorporating the findings of the phenomenology, the skeletal framework of the concept was further fleshed out using techniques of concept correction to produce a more mature conceptualization of uncertainty. In this section, I describe the flow of this qualitative project investigating the concept of uncertainty, with special emphasis on a particular threat to validity (called conceptual tunnel vision that was identified and addressed during the phases of concept correction. Though in this article I employ a study of uncertainty for illustration, limited substantive findings regarding uncertainty are presented to retain a clear focus on the methodological issues.

  17. Verification of uncertainty budgets

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Madsen, B.S.

    2005-01-01

    The quality of analytical results is expressed by their uncertainty, as it is estimated on the basis of an uncertainty budget; little effort is, however, often spent on ascertaining the quality of the uncertainty budget. The uncertainty budget is based on circumstantial or historical data...... observed and expected variability is tested by means of the T-test, which follows a chi-square distribution with a number of degrees of freedom determined by the number of replicates. Significant deviations between predicted and observed variability may be caused by a variety of effects, and examples...... will be presented; both underestimation and overestimation may occur, each leading to correcting the influence of uncertainty components according to their influence on the variability of experimental results. Some uncertainty components can be verified only with a very small number of degrees of freedom, because...

  18. DS02 uncertainty analysis

    International Nuclear Information System (INIS)

    Kaul, Dean C.; Egbert, Stephen D.; Woolson, William A.

    2005-01-01

    In order to avoid the pitfalls that so discredited DS86 and its uncertainty estimates, and to provide DS02 uncertainties that are both defensible and credible, this report not only presents the ensemble uncertainties assembled from uncertainties in individual computational elements and radiation dose components but also describes how these relate to comparisons between observed and computed quantities at critical intervals in the computational process. These comparisons include those between observed and calculated radiation free-field components, where observations include thermal- and fast-neutron activation and gamma-ray thermoluminescence, which are relevant to the estimated systematic uncertainty for DS02. The comparisons also include those between calculated and observed survivor shielding, where the observations consist of biodosimetric measurements for individual survivors, which are relevant to the estimated random uncertainty for DS02. (J.P.N.)

  19. Model uncertainty and probability

    International Nuclear Information System (INIS)

    Parry, G.W.

    1994-01-01

    This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses are representing different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example

  20. Uncertainty in artificial intelligence

    CERN Document Server

    Kanal, LN

    1986-01-01

    How to deal with uncertainty is a subject of much controversy in Artificial Intelligence. This volume brings together a wide range of perspectives on uncertainty, many of the contributors being the principal proponents in the controversy.Some of the notable issues which emerge from these papers revolve around an interval-based calculus of uncertainty, the Dempster-Shafer Theory, and probability as the best numeric model for uncertainty. There remain strong dissenting opinions not only about probability but even about the utility of any numeric method in this context.

  1. Uncertainties in hydrogen combustion

    International Nuclear Information System (INIS)

    Stamps, D.W.; Wong, C.C.; Nelson, L.S.

    1988-01-01

    Three important areas of hydrogen combustion with uncertainties are identified: high-temperature combustion, flame acceleration and deflagration-to-detonation transition, and aerosol resuspension during hydrogen combustion. The uncertainties associated with high-temperature combustion may affect at least three different accident scenarios: the in-cavity oxidation of combustible gases produced by core-concrete interactions, the direct containment heating hydrogen problem, and the possibility of local detonations. How these uncertainties may affect the sequence of various accident scenarios is discussed and recommendations are made to reduce these uncertainties. 40 references

  2. The final cool down

    CERN Multimedia

    On Thursday 29th May, the cool-down of the final sector (sector 4-5) of the LHC began, one week after the start of the cool-down of sector 1-2. It will take five weeks for the sectors to be cooled from room temperature to 5 K and a further two weeks to complete the cool down to 1.9 K and the commissioning of cryogenic instrumentation, as well as to fine tune the cryogenic plants and the cooling loops of cryostats. Nearly a year and a half has passed since sector 7-8 was cooled for the first time in January 2007. For Laurent Tavian, AT/CRG Group Leader, reaching the final phase of the cool down is an important milestone, confirming the basic design of the cryogenic system and the ability to operate complete sectors. “All the sectors have to operate at the same time otherwise we cannot inject the beam into the machine. The stability and reliability of the cryogenic system and its utilities are now very important. That will be the new challenge for the coming months,” he explains. The status of the cool down of ...

  3. Solar absorption cooling

    NARCIS (Netherlands)

    Kim, D.S.

    2007-01-01

    As the world concerns more and more on global climate changes and depleting energy resources, solar cooling technology receives increasing interests from the public as an environment-friendly and sustainable alternative. However, making a competitive solar cooling machine for the market still

  4. Cooling of electronic equipment

    DEFF Research Database (Denmark)

    A. Kristensen, Anders Schmidt

    2003-01-01

    Cooling of electronic equipment is studied. The design size of electronic equipment decreases, causing the thermal density to increase. This affects the cooling, which can cause, for example, failures of critical components due to overheating or thermally induced stresses. Initially a pin fin heat sink...

  5. Coherent electron cooling

    Energy Technology Data Exchange (ETDEWEB)

    Litvinenko,V.

    2009-05-04

    Cooling intense high-energy hadron beams remains a major challenge in modern accelerator physics. Synchrotron radiation is still too feeble, while the efficiency of two other cooling methods, stochastic and electron, falls rapidly either at high bunch intensities (i.e. stochastic of protons) or at high energies (e-cooling). In this talk a specific scheme of a unique cooling technique, Coherent Electron Cooling, will be discussed. The idea of coherent electron cooling using electron beam instabilities was suggested by Derbenev in the early 1980s, but the scheme presented in this talk, with cooling times under an hour for 7 TeV protons in the LHC, would be possible only with present-day accelerator technology. This talk will discuss the principles and the main limitations of the Coherent Electron Cooling process. The talk will describe the main system components, based on a high-gain free electron laser driven by an energy recovery linac, and will present some numerical examples for ions and protons in RHIC and the LHC and for electron-hadron options for these colliders. BNL plans a demonstration of the idea in the near future.

  6. Measure Guideline: Ventilation Cooling

    Energy Technology Data Exchange (ETDEWEB)

    Springer, D.; Dakin, B.; German, A.

    2012-04-01

    The purpose of this measure guideline on ventilation cooling is to provide information on a cost-effective solution for reducing cooling system energy and demand in homes located in hot-dry and cold-dry climates. This guideline provides a prescriptive approach that outlines qualification criteria, selection considerations, and design and installation procedures.

  7. Stochastic cooling at Fermilab

    International Nuclear Information System (INIS)

    Marriner, J.

    1986-08-01

    The topics discussed are the stochastic cooling systems in use at Fermilab and some of the techniques that have been employed to meet the particular requirements of the anti-proton source. Stochastic cooling at Fermilab became of paramount importance about 5 years ago when the anti-proton source group at Fermilab abandoned the electron cooling ring in favor of a high flux anti-proton source which relied solely on stochastic cooling to achieve the phase space densities necessary for colliding proton and anti-proton beams. The Fermilab systems have constituted a substantial advance in the techniques of cooling including: large pickup arrays operating at microwave frequencies, extensive use of cryogenic techniques to reduce thermal noise, super-conducting notch filters, and the development of tools for controlling and for accurately phasing the system

  8. INITIAL COOLING EXPERIMENT (ICE)

    CERN Multimedia

    CERN PhotoLab

    1979-01-01

    ICE was built in 1977, using the modified bending magnets of the g-2 muon storage ring (see 7405430). Its purpose was to verify the validity of stochastic and electron cooling for the antiproton project. Stochastic cooling proved a resounding success early in 1978 and the antiproton project could go ahead, now entirely based on stochastic cooling. Electron cooling was experimented with in 1979. The 26 kV equipment is housed in the cage to the left of the picture, adjacent to the "e-cooler" located in a straight section of the ring. With some modifications, the cooler was later transplanted into LEAR (Low Energy Antiproton Ring) and then, with further modifications, into the AD (Antiproton Decelerator), where it cools antiprotons to this day (2006). See also: 7711282, 7802099, 7809081.

  9. Initial Cooling Experiment (ICE)

    CERN Multimedia

    CERN PhotoLab

    1978-01-01

    ICE was built in 1977, in a record time of 9 months, using the modified bending magnets of the g-2 muon storage ring. Its purpose was to verify the validity of stochastic and electron cooling for the antiproton project, to be launched in 1978. Already early in 1978, stochastic cooling proved a resounding success, such that the antiproton (p-pbar)project was entirely based on it. Tests of electron cooling followed later: protons of 46 MeV kinetic energy were cooled with an electron beam of 26 kV and 1.3 A. The cage seen prominently in the foreground houses the HV equipment, adjacent to the "cooler" installed in a straight section of the ring. With some modifications, the cooler was later transplanted into LEAR (Low Energy Antiproton Ring) and then, with further modifications, into the AD (Antiproton Decelerator), where it cools antiprotons to this day (2006). See also: 7711282, 7802099, 7908242.

  10. Study of Nuclear Decay Data Contribution to Uncertainties in Heat Load Estimations for Spent Fuel Pools

    Science.gov (United States)

    Ferroukhi, H.; Leray, O.; Hursin, M.; Vasiliev, A.; Perret, G.; Pautz, A.

    2014-04-01

    At the Paul Scherrer Institut (PSI), a methodology for nuclear data uncertainty propagation in CASMO-5M (C5M) assembly calculations is under development. This paper presents a preliminary application of this methodology to C5M decay heat calculations. Applying a stochastic sampling method, nuclear decay data uncertainties are first propagated for the cooling phase only. Thereafter, the uncertainty propagation is enlarged to gradually account for cross-section as well as fission yield uncertainties during the depletion phase. On that basis, assembly heat load uncertainties as well as total uncertainty for the entire pool are quantified for cooling times up to one year. The relative contributions from the various types of nuclear data uncertainties are in this context also estimated.
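    As a hedged illustration of the stochastic-sampling idea (this is not the PSI/CASMO-5M methodology; the nuclide inventory, uncertainty magnitudes and independence assumptions below are placeholders), decay-heat uncertainty can be propagated by repeatedly perturbing the decay data and recomputing the heat load:

```python
import numpy as np

# Hypothetical sketch of stochastic sampling for decay-heat uncertainty.
# Each nuclide contributes P = lambda * N * Q (W); decay constants and
# energy releases are perturbed within assumed relative uncertainties.
rng = np.random.default_rng(42)

# Placeholder inventory: (decay constant [1/s], atoms, energy per decay [J], rel. unc.)
nuclides = {
    "Cs-137": (7.3e-10, 1.0e24, 1.2e-13, 0.01),
    "Sr-90":  (7.6e-10, 8.0e23, 1.8e-13, 0.02),
    "Ce-144": (2.8e-8,  5.0e22, 2.1e-14, 0.03),
}

n_samples = 1000
heat = np.empty(n_samples)
for k in range(n_samples):
    total = 0.0
    for lam, n_atoms, q, rel_unc in nuclides.values():
        # sample decay constant and decay energy (no correlations in this sketch)
        lam_s = lam * rng.normal(1.0, rel_unc)
        q_s = q * rng.normal(1.0, rel_unc)
        total += lam_s * n_atoms * q_s
    heat[k] = total

print(f"decay heat: {heat.mean():.2f} W +/- {heat.std(ddof=1):.2f} W (1 sigma)")
```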

  11. Cooled-Spool Piston Compressor

    Science.gov (United States)

    Morris, Brian G.

    1994-01-01

    The proposed cooled-spool piston compressor is driven by hydraulic power and features internal cooling of the piston by flowing hydraulic fluid to limit the temperature of the compressed gas. It provides sufficient cooling for higher compression ratios or reactive gases. Unlike conventional piston compressors, all parts of the compressed gas lie at all times within a relatively short distance of a cooled surface, so that the gas is cooled more effectively.

  12. Effect of closed loop cooling water transit time on containment cooling

    International Nuclear Information System (INIS)

    Smith, R.P.; Vossahlik, J.E.; Goodwin, E.F.

    1996-01-01

    Long term containment cooling analyses in nuclear plant systems are usually conducted assuming a quasi steady-state process, that is, a steady state evaluation of the cooling system is completed for each calculational step. In reality, fluid transport in the system, and heat addition to system components may affect the heat removal rate of the system. Transient effects occurring during system startup may affect the maximum temperatures experienced in the system. It is important to ensure that such transient effects do not affect operation of the system (e.g., cause a high temperature trip). To evaluate the effect of fluid transit delays, a closed loop cooling water system model has been developed that incorporates the fluid transport times when determining the closed loop cooling system performance. This paper describes the closed loop cooling system model as implemented in the CONTEMPT-LT/028 code. The evaluation of the transient temperature response of the closed loop cooling system using the model is described. The paper also describes the effect of fluid transit time on the overall containment cooling performance
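    As a rough illustration of why transit time matters (this is not the CONTEMPT-LT/028 implementation; the loop layout, heat-exchanger treatment and numbers are assumptions), a closed loop with a pure transport delay can be sketched as follows. Hot fluid leaving the heat load reaches the heat exchanger only after the transit time, so the loop responds more slowly than a quasi-steady calculation would suggest:

```python
from collections import deque

# Hypothetical sketch: a closed cooling loop where the pipe is a pure
# transport delay modeled as a FIFO of fluid parcels.
def simulate_loop(n_steps, dt, transit_time, t_init, t_sink,
                  heat_rate, m_dot, cp, effectiveness):
    delay = int(round(transit_time / dt))
    pipe = deque([t_init] * delay, maxlen=delay)   # parcels in transit
    t_return = t_init
    history = []
    for _ in range(n_steps):
        t_hot = t_return + heat_rate / (m_dot * cp)    # fluid heated by the load
        t_arrived = pipe[0]                            # parcel reaching the HX now
        pipe.append(t_hot)                             # hot parcel enters the pipe
        # heat exchanger cools the arriving parcel toward the sink temperature
        t_return = t_arrived - effectiveness * (t_arrived - t_sink)
        history.append(t_hot)
    return history

# With a 60 s transit time, parcels heated by the load only reach the heat
# exchanger after 60 steps, delaying the loop's approach to steady state.
temps = simulate_loop(n_steps=600, dt=1.0, transit_time=60.0, t_init=30.0,
                      t_sink=25.0, heat_rate=200e3, m_dot=20.0, cp=4180.0,
                      effectiveness=0.6)
print(f"peak load outlet temperature: {max(temps):.2f} C")
```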

  13. PIV uncertainty propagation

    NARCIS (Netherlands)

    Sciacchitano, A.; Wieneke, Bernhard

    2016-01-01

    This paper discusses the propagation of the instantaneous uncertainty of PIV measurements to statistical and instantaneous quantities of interest derived from the velocity field. The expression of the uncertainty of vorticity, velocity divergence, mean value and Reynolds stresses is derived. It

  14. Feedback versus uncertainty

    NARCIS (Netherlands)

    Van Nooyen, R.R.P.; Hrachowitz, M.; Kolechkina, A.G.

    2014-01-01

    Even without uncertainty about the model structure or parameters, the output of a hydrological model run still contains several sources of uncertainty. These are: measurement errors affecting the input, the transition from continuous time and space to discrete time and space, which causes loss of

  15. Schrodinger's Uncertainty Principle?

    Indian Academy of Sciences (India)

    correlation between x and p. The virtue of Schrodinger's version (5) is that it accounts for this correlation. In special cases like the free particle and the harmonic oscillator, the 'Schrodinger uncertainty product' even remains constant with time, whereas Heisenberg's does not. The glory of giving the uncertainty principle to ...
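    Equation (5) of the excerpt is not reproduced here; for context, the relation usually called the Schrödinger (or Robertson-Schrödinger) uncertainty relation for x and p takes the standard form below, where the covariance term is what accounts for the x-p correlation:

```latex
% Robertson–Schrödinger uncertainty relation for x and p (standard form).
\[
  \sigma_x^2\,\sigma_p^2 \;\ge\; \left(\frac{\hbar}{2}\right)^{2} + \sigma_{xp}^{2},
  \qquad
  \sigma_{xp} \;=\; \tfrac{1}{2}\bigl\langle \hat{x}\hat{p}+\hat{p}\hat{x}\bigr\rangle
                    - \langle \hat{x}\rangle\langle \hat{p}\rangle .
\]
```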

  16. Uncertainty and simulation

    International Nuclear Information System (INIS)

    Depres, B.; Dossantos-Uzarralde, P.

    2009-01-01

    More than 150 researchers and engineers from universities and the industrial world met to discuss on the new methodologies developed around assessing uncertainty. About 20 papers were presented and the main topics were: methods to study the propagation of uncertainties, sensitivity analysis, nuclear data covariances or multi-parameter optimisation. This report gathers the contributions of CEA researchers and engineers

  17. Study of cooling effectiveness for an integrated cooling turbine blade

    OpenAIRE

    Matsushita, Masahiro; Yamane, Takashi; Mimura, Fujio; Fukuyama, Yoshitaka; 松下 政裕; 山根 敬; 三村 富嗣雄; 福山 佳孝

    2007-01-01

    Experimental study of film cooling, impingement cooling and integrated cooling were carried out with the aim of applying them to turbine cooling. The experiments were conducted with 673 K hot gas flow and room temperature cooling air. Test plate surface temperature distributions were measured with an infrared camera. This report presents fundamental research data on cooling performance of the test plates for the validation of numerical simulation. Moreover, simplify heat transfer calculations...

  18. Physical Uncertainty Bounds (PUB)

    Energy Technology Data Exchange (ETDEWEB)

    Vaughan, Diane Elizabeth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Dean L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-03-19

    This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.

  19. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  20. Second sector cool down

    CERN Multimedia

    2007-01-01

    At the beginning of July, cool-down is starting in the second LHC sector, sector 4-5. The cool down of sector 4-5 may occasionally generate mist at Point 4, like that produced last January (photo) during the cool-down of sector 7-8. Things are getting colder in the LHC. Sector 7-8 has been kept at 1.9 K for three weeks with excellent stability (see Bulletin No. 16-17 of 16 April 2007). The electrical tests in this sector have got off to a successful start. At the beginning of July the cryogenic teams started to cool a second sector, sector 4-5. At Point 4 in Echenevex, where one of the LHC’s cryogenic plants is located, preparations for the first phase of the cool-down are underway. During this phase, the sector will first be cooled to 80 K (-193°C), the temperature of liquid nitrogen. As for the first sector, 1200 tonnes of liquid nitrogen will be used for the cool-down. In fact, the nitrogen circulates only at the surface in the ...

  1. Evaporative cooling of trapped atoms

    International Nuclear Information System (INIS)

    Ketterle, W.; Van Druten, N.J.

    1996-01-01

    This report discusses the following topics on evaporative cooling of trapped atoms: Theoretical models for evaporative cooling; the role of collisions for real atoms; experimental techniques and summary of evaporative cooling experiments. 166 refs., 6 figs., 3 tabs

  2. Cool WISPs for stellar cooling excesses

    Energy Technology Data Exchange (ETDEWEB)

    Giannotti, Maurizio [Barry Univ., Miami Shores, FL (United States). Physical Sciences; Irastorza, Igor [Zaragoza Univ. (Spain). Dept. de Fisica Teorica; Redondo, Javier [Zaragoza Univ. (Spain). Dept. de Fisica Teorica; Max-Planck-Institut fuer Physik, Muenchen (Germany); Ringwald, Andreas [DESY Hamburg (Germany). Theory Group

    2015-12-15

    Several stellar systems (white dwarfs, red giants, horizontal branch stars and possibly the neutron star in the supernova remnant Cassiopeia A) show a preference for a mild non-standard cooling mechanism when compared with theoretical models. This exotic cooling could be provided by Weakly Interacting Slim Particles (WISPs), produced in the hot cores and abandoning the star unimpeded, contributing directly to the energy loss. Taken individually, these excesses do not show a strong statistical weight. However, if one mechanism could consistently explain several of them, the hint could be significant. We analyze the hints in terms of neutrino anomalous magnetic moments, minicharged particles, hidden photons and axion-like particles (ALPs). Among them, the ALP represents the best solution. Interestingly, the hinted ALP parameter space is accessible to the next generation proposed ALP searches, such as ALPS II and IAXO.

  3. Liquid metal cooled fast breeder nuclear reactors

    International Nuclear Information System (INIS)

    Thatcher, G.; Mitchell, A.J.

    1981-01-01

    Fuel sub-assemblies for liquid metal-cooled fast breeder reactors are described which each incorporate a fluid flow control valve for regulating the rate of flow through the sub-assembly. These small electro-magnetic valves seek to maintain the outlet coolant temperature of at least some of the breeder sub-assemblies substantially constant throughout the life of the fuel assembly without severely pressurising the sub-assembly. (U.K.)

  4. Water-cooled electronics

    CERN Document Server

    Dumont, G; Righini, B

    2000-01-01

    The LHC experiments' demands on the cooling of electronic instrumentation will be extremely high. A large number of racks will be located in underground caverns and counting rooms, where cooling by conventional air conditioning would be prohibitively expensive. A series of tests on the direct water cooling of VMEbus units and of their standard power supplies is reported. A maximum dissipation of 60 W for each module and more than 1000 W delivered by the power supply to the crate have been reached. These values comply with the VMEbus specifications. (3 refs).

  5. Cooling towers: a bibliography

    International Nuclear Information System (INIS)

    Whitson, M.O.

    1981-02-01

    This bibliography cites 300 selected references containing information on various aspects of large cooling tower technology, including design, construction, operation, performance, economics, and environmental effects. The towers considered include natural-draft and mechanical-draft types employing wet, dry, or combination wet-dry cooling. A few references deal with alternative cooling methods, principally ponds or spray canals. The citations were compiled for the DOE Energy Information Data Base (EDB) covering the period January to December 1980. The references are to reports from the Department of Energy and its contractors, reports from other government or private organizations, and journal articles, books, conference papers, and monographs from US originators

  6. ON THE ESTIMATION OF SYSTEMATIC UNCERTAINTIES OF STAR FORMATION HISTORIES

    International Nuclear Information System (INIS)

    Dolphin, Andrew E.

    2012-01-01

    In most star formation history (SFH) measurements, the reported uncertainties are those due to effects whose sizes can be readily measured: Poisson noise, adopted distance and extinction, and binning choices in the solution itself. However, the largest source of error, systematics in the adopted isochrones, is usually ignored and very rarely explicitly incorporated into the uncertainties. I propose a process by which estimates of the uncertainties due to evolutionary models can be incorporated into the SFH uncertainties. This process relies on application of shifts in temperature and luminosity, the sizes of which must be calibrated for the data being analyzed. While there are inherent limitations, the ability to estimate the effect of systematic errors and include them in the overall uncertainty is significant. The effects of this are most notable in the case of shallow photometry, with which SFH measurements rely on evolved stars.

  7. Uncertainty Quantification in Fatigue Crack Growth Prognosis

    Directory of Open Access Journals (Sweden)

    Shankar Sankararaman

    2011-01-01

    Full Text Available This paper presents a methodology to quantify the uncertainty in fatigue crack growth prognosis, applied to structures with complicated geometry and subjected to variable amplitude multi-axial loading. Finite element analysis is used to address the complicated geometry and calculate the stress intensity factors. Multi-modal stress intensity factors due to multi-axial loading are combined to calculate an equivalent stress intensity factor using a characteristic plane approach. Crack growth under variable amplitude loading is modeled using a modified Paris law that includes retardation effects. During cycle-by-cycle integration of the crack growth law, a Gaussian process surrogate model is used to replace the expensive finite element analysis. The effect of different types of uncertainty – physical variability, data uncertainty and modeling errors – on crack growth prediction is investigated. The various sources of uncertainty include, but are not limited to, variability in loading conditions, material parameters, experimental data, model uncertainty, etc. Three different types of modeling errors – crack growth model error, discretization error and surrogate model error – are included in the analysis. The different types of uncertainty are incorporated into the crack growth prediction methodology to predict the probability distribution of crack size as a function of number of load cycles. The proposed method is illustrated using an application problem, surface cracking in a cylindrical structure.
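    A minimal sketch of the core idea, propagating sampled parameter variability through cycle-by-cycle integration of a Paris-type law, is shown below. It is not the authors' implementation: the finite element / surrogate model for the stress intensity factor and the retardation model are replaced by a closed-form dK, and all constants are placeholder assumptions.

```python
import numpy as np

# Hypothetical sketch: Monte Carlo propagation through Paris-law crack growth,
# da/dN = C * (dK)^m with dK = Y * dS * sqrt(pi * a).
rng = np.random.default_rng(1)

def grow_crack(a0, c, m, d_stress, y=1.12, n_cycles=200_000, block=1000):
    a = a0
    for _ in range(0, n_cycles, block):
        dk = y * d_stress * np.sqrt(np.pi * a)   # stress intensity range [MPa*sqrt(m)]
        a += block * c * dk**m                   # integrate a block of cycles
    return a

n_samples = 2000
final_sizes = np.empty(n_samples)
for i in range(n_samples):
    a0 = rng.lognormal(mean=np.log(1e-3), sigma=0.1)   # initial crack size [m]
    c = rng.lognormal(mean=np.log(3e-12), sigma=0.2)   # Paris coefficient
    m = rng.normal(3.0, 0.1)                           # Paris exponent
    d_stress = rng.normal(100.0, 10.0)                 # stress range [MPa]
    final_sizes[i] = grow_crack(a0, c, m, d_stress)

print(f"median final crack size: {np.median(final_sizes)*1e3:.2f} mm")
print(f"95th percentile:         {np.percentile(final_sizes, 95)*1e3:.2f} mm")
```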

  8. The Role of the Cooling Prescription for Disk Fragmentation: Numerical Convergence and Critical Cooling Parameter in Self-gravitating Disks

    Science.gov (United States)

    Baehr, Hans; Klahr, Hubert

    2015-12-01

    Protoplanetary disks fragment due to gravitational instability when there is enough mass for self-gravitation, described by the Toomre parameter, and when heat can be lost at a rate comparable to the local dynamical timescale, described by t_c = β Ω^-1. Simulations of self-gravitating disks show that the cooling parameter has a rough critical value at β_crit = 3. When below β_crit, gas overdensities will contract under their own gravity and fragment into bound objects while otherwise maintaining a steady state of gravitoturbulence. However, previous studies of the critical cooling parameter have found dependences on simulation resolution, indicating that the simulation of self-gravitating protoplanetary disks is not so straightforward. In particular, the simplicity of the cooling timescale t_c prevents fragments from being disrupted by pressure support as temperatures rise. We alter the cooling law so that the cooling timescale is dependent on local surface density fluctuations, which is a means of incorporating optical depth effects into the local cooling of an object. For lower resolution simulations, this results in a lower critical cooling parameter and a disk that is more stable to gravitational stresses, suggesting that the formation of large gas giant planets in large, cool disks is generally suppressed by more realistic cooling. At our highest resolution, however, the model becomes unstable to fragmentation for cooling timescales up to β = 10.
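    For illustration only, a β-cooling source term and a surface-density-dependent variant can be sketched as below. The paper's exact density scaling is not reproduced here; the quadratic Σ-dependence used in the sketch is an assumption standing in for optical-depth effects.

```python
import numpy as np

# Hypothetical sketch of beta cooling: du/dt = -u / t_cool with t_cool = beta / Omega,
# plus a variant where the cooling time grows with the local surface density
# (a crude stand-in for optical depth; the exponent is an illustrative assumption).
def cool(u, omega, beta, dt):
    t_cool = beta / omega
    return u * np.exp(-dt / t_cool)          # exact integral over one time step

def cool_density_dependent(u, omega, beta, sigma, sigma0, dt, exponent=2.0):
    # overdense regions (sigma > sigma0) cool more slowly in this sketch
    t_cool = (beta / omega) * (sigma / sigma0) ** exponent
    return u * np.exp(-dt / t_cool)

omega, beta, dt = 1.0, 3.0, 0.01
u = 1.0
print("standard beta cooling:", cool(u, omega, beta, dt))
print("in a 3x overdensity:  ", cool_density_dependent(u, omega, beta,
                                                       sigma=3.0, sigma0=1.0, dt=dt))
```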

  9. Cooling Devices in Laser therapy.

    Science.gov (United States)

    Das, Anupam; Sarda, Aarti; De, Abhishek

    2016-01-01

    Cooling devices and methods are now integrated into most laser systems, with a view to protecting the epidermis, reducing pain and erythema and improving the efficacy of the laser. On the basis of the method employed, cooling can be divided into contact cooling and non-contact cooling. With respect to the timing of laser irradiation, the nomenclature includes pre-cooling, parallel cooling and post-cooling. The choice of the cooling device is dictated by the laser device, the physician's personal preference with respect to user-friendliness, the comfort of the patient, and the price and maintenance costs of the device. We hereby briefly review the various cooling techniques employed in laser practice.

  10. Cooling of wood briquettes

    Directory of Open Access Journals (Sweden)

    Adžić Miroljub M.

    2013-01-01

    Full Text Available This paper is concerned with experimental research on the surface temperature of wood briquettes during the cooling phase along the cooling line. The cooling phase is an important part of the briquette production technology. It should be performed with care, otherwise the quality of the briquettes could deteriorate and their combustion characteristics could change. The briquette surface temperature was measured with an IR camera and a surface temperature probe at 42 sections. It was found that the briquette surface temperature dropped from 68 to 34°C after 7 minutes spent on the cooling line. The temperature at the center of the briquette decreased to 38°C during the 6 hour storage.

  11. Turbine airfoil film cooling

    Science.gov (United States)

    Hylton, L. D.; Nirmalan, V.; Sultanian, B. K.; Kaufman, R. M.

    1987-10-01

    The experimental data obtained in this program gives insight into the physical phenomena that occur on a film cooled airfoil, and should provide a relevant data base for verification of new design tools. Results indicate that the downstream film cooling process is a complex function of the thermal dilution and turbulence augmentation parameters with trends actually reversing as blowing strength and coolant-to-gas temperature ratio varied. The pressure surface of the airfoil is shown to exhibit a considerably higher degree of sensitivity to changes in the film cooling parameters and, consequently, should prove to be more of a challenge than the suction surface in accurately predicting heat transfer levels with downstream film cooling.

  12. Warm and Cool Dinosaurs.

    Science.gov (United States)

    Mannlein, Sally

    2001-01-01

    Presents an art activity in which first grade students draw dinosaurs in order to learn about the concept of warm and cool colors. Explains how the activity also helped the students learn about the concept of distance when drawing. (CMK)

  13. LHC cooling gains ground

    CERN Multimedia

    Huillet-Miraton Catherine

    The nominal cryogenic conditions of 1.9 K have been achieved in sectors 5-6 and 7-8. This means that a quarter of the machine has reached the nominal conditions for LHC operation, having attained a temperature of below 2 K (-271°C), which is colder than interstellar space! Elsewhere, the cryogenic system in Sector 8-1 has been filled with liquid helium and cooled to 2K and will soon be available for magnet testing. Sectors 6-7 and 2-3 are being cooled down and cool-down operations have started in Sector 3-4. Finally, preparations are in hand for the cool-down of Sector 1-2 in May and of Sector 4-5, which is currently being consolidated. The LHC should be completely cold for the summer. For more information: http://lhc.web.cern.ch/lhc/Cooldown_status.htm.

  14. Development of a Dynamic Lidar Uncertainty Framework

    Energy Technology Data Exchange (ETDEWEB)

    Newman, Jennifer [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Clifton, Andrew [WindForS; Bonin, Timothy [CIRES/NOAA ESRL; Choukulkar, Aditya [CIRES/NOAA ESRL; Brewer, W. Alan [NOAA ESRL; Delgado, Ruben [University of Maryland Baltimore County

    2017-08-07

    As wind turbine sizes increase and wind energy expands to more complex and remote sites, remote-sensing devices such as lidars are expected to play a key role in wind resource assessment and power performance testing. The switch to remote-sensing devices represents a paradigm shift in the way the wind industry typically obtains and interprets measurement data for wind energy. For example, the measurement techniques and sources of uncertainty for a remote-sensing device are vastly different from those associated with a cup anemometer on a meteorological tower. Current IEC standards for quantifying remote sensing device uncertainty for power performance testing consider uncertainty due to mounting, calibration, and classification of the remote sensing device, among other parameters. Values of the uncertainty are typically given as a function of the mean wind speed measured by a reference device and are generally fixed, leading to climatic uncertainty values that apply to the entire measurement campaign. However, real-world experience and a consideration of the fundamentals of the measurement process have shown that lidar performance is highly dependent on atmospheric conditions, such as wind shear, turbulence, and aerosol content. At present, these conditions are not directly incorporated into the estimated uncertainty of a lidar device. In this presentation, we describe the development of a new dynamic lidar uncertainty framework that adapts to current flow conditions and more accurately represents the actual uncertainty inherent in lidar measurements under different conditions. In this new framework, sources of uncertainty are identified for estimation of the line-of-sight wind speed and reconstruction of the three-dimensional wind field. These sources are then related to physical processes caused by the atmosphere and lidar operating conditions. The framework is applied to lidar data from a field measurement site to assess the ability of the framework to predict

  15. Laser cooling of solids

    Energy Technology Data Exchange (ETDEWEB)

    Epstein, Richard I [Los Alamos National Laboratory; Sheik-bahae, Mansoor [UNM

    2008-01-01

    We present an overview of solid-state optical refrigeration, also known as laser cooling in solids by fluorescence upconversion. The idea of cooling a solid-state optical material by simply shining a laser beam onto it may sound counterintuitive but is rapidly becoming a promising technology for future cryocoolers. We chart the evolution of this science in rare-earth doped solids and semiconductors.

  16. Application of uncertainty analyses with the MAAP4 code

    International Nuclear Information System (INIS)

    Nagashima, K.; Alammar, M.; Da Silva, H.C.; Henry, R.E.; Kenton, M.; Kuhtenia, D.; Kwee, M.; Ranval, W.

    1996-01-01

    Uncertainty analyses are an important element associated with using integral computer codes to evaluate the response of a reactor/containment system to off-normal situations. The more severe the off-normal transient, the more important the uncertainty analyses. How should such uncertainty analyses be formulated? How should the results of the uncertainty approach be applied? To address these questions for the MAAP4 code, an approach to uncertainty evaluations has been developed, defining the importance of individual physical processes (Table 1) and establishing a structure for how phenomena should be evaluated and quantified with respect to the integral assessment. Documentation of the technical basis for uncertainty bounds is essential to meaningful uncertainty analyses. In particular, the technical basis for determining oxidation rates, cooling rates, combustion rates, etc. must come from a composite of separate effects and integral experiments, as well as industrial experience. How this technical basis is developed and how it should be used must be documented so that the user has a clear understanding of what is, or is not, included in the technical basis for the phenomena of interest. This paper will discuss the approach to developing the technical basis for uncertainty evaluations related to the phenomenon of RCS failure, which includes the influence of natural circulation within the reactor coolant system. This discussion is an example of how relevant experiments and analyses must be documented to create the uncertainty bounds for each of the physical processes of interest. How these uncertainty bounds should be used in plant analyses will be discussed. As addressed by the plant specific PSAs/IPEs, there is a low frequency at which severe accidents could occur and the core debris would not be cooled within the vessel, i.e. the reactor vessel would fail and core debris would be released to the containment. Under these conditions, the objectives of accident management

  17. Confronting uncertainty in wildlife management: performance of grizzly bear management.

    Science.gov (United States)

    Artelle, Kyle A; Anderson, Sean C; Cooper, Andrew B; Paquet, Paul C; Reynolds, John D; Darimont, Chris T

    2013-01-01

    Scientific management of wildlife requires confronting the complexities of natural and social systems. Uncertainty poses a central problem. Whereas the importance of considering uncertainty has been widely discussed, studies of the effects of unaddressed uncertainty on real management systems have been rare. We examined the effects of outcome uncertainty and components of biological uncertainty on hunt management performance, illustrated with grizzly bears (Ursus arctos horribilis) in British Columbia, Canada. We found that both forms of uncertainty can have serious impacts on management performance. Outcome uncertainty alone--discrepancy between expected and realized mortality levels--led to excess mortality in 19% of cases (population-years) examined. Accounting for uncertainty around estimated biological parameters (i.e., biological uncertainty) revealed that excess mortality might have occurred in up to 70% of cases. We offer a general method for identifying targets for exploited species that incorporates uncertainty and maintains the probability of exceeding mortality limits below specified thresholds. Setting targets in our focal system using this method at thresholds of 25% and 5% probability of overmortality would require average target mortality reductions of 47% and 81%, respectively. Application of our transparent and generalizable framework to this or other systems could improve management performance in the presence of uncertainty.
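    A hedged sketch of the target-setting idea described above is given below. It is not the authors' code: the population model, uncertainty magnitudes and allowable mortality rate are placeholder assumptions, and only the structure (keep the probability of overmortality below a threshold) follows the abstract.

```python
import numpy as np

# Hypothetical sketch: choose a mortality target so that the probability of
# exceeding the allowable mortality limit stays below a given threshold,
# accounting for uncertainty in population size and in realized kills.
rng = np.random.default_rng(7)

def prob_overmortality(target, n_samples=20_000):
    pop = rng.lognormal(mean=np.log(500), sigma=0.2, size=n_samples)   # population estimate
    allowable = 0.06 * pop                                             # assumed sustainable rate
    realized = rng.normal(target, 0.25 * target, n_samples)            # outcome uncertainty
    return np.mean(realized > allowable)

def find_target(threshold, lo=0.0, hi=60.0, tol=0.1):
    # bisection on the (monotone) overmortality probability
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if prob_overmortality(mid) > threshold:
            hi = mid
        else:
            lo = mid
    return lo

for p in (0.25, 0.05):
    print(f"target with P(overmortality) <= {p:.0%}: {find_target(p):.1f} animals")
```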

  18. Confronting uncertainty in wildlife management: performance of grizzly bear management.

    Directory of Open Access Journals (Sweden)

    Kyle A Artelle

    Full Text Available Scientific management of wildlife requires confronting the complexities of natural and social systems. Uncertainty poses a central problem. Whereas the importance of considering uncertainty has been widely discussed, studies of the effects of unaddressed uncertainty on real management systems have been rare. We examined the effects of outcome uncertainty and components of biological uncertainty on hunt management performance, illustrated with grizzly bears (Ursus arctos horribilis) in British Columbia, Canada. We found that both forms of uncertainty can have serious impacts on management performance. Outcome uncertainty alone--discrepancy between expected and realized mortality levels--led to excess mortality in 19% of cases (population-years) examined. Accounting for uncertainty around estimated biological parameters (i.e., biological uncertainty) revealed that excess mortality might have occurred in up to 70% of cases. We offer a general method for identifying targets for exploited species that incorporates uncertainty and maintains the probability of exceeding mortality limits below specified thresholds. Setting targets in our focal system using this method at thresholds of 25% and 5% probability of overmortality would require average target mortality reductions of 47% and 81%, respectively. Application of our transparent and generalizable framework to this or other systems could improve management performance in the presence of uncertainty.

  19. Assignment of uncertainties to scientific data

    International Nuclear Information System (INIS)

    Froehner, F.H.

    1994-01-01

    Long-standing problems of uncertainty assignment to scientific data came into sharp focus in recent years when uncertainty information ('covariance files') had to be added to application-oriented large libraries of evaluated nuclear data such as ENDF and JEF. Questions arose about the best way to express uncertainties, the meaning of statistical and systematic errors, the origin of correlation and construction of covariance matrices, the combination of uncertain data from different sources, the general usefulness of results that are strictly valid only for Gaussian or only for linear statistical models, etc. Conventional statistical theory is often unable to give unambiguous answers, and tends to fail when statistics is bad so that prior information becomes crucial. Modern probability theory, on the other hand, incorporating decision-theoretic and group-theoretic results, is shown to provide straight and unique answers to such questions, and to deal easily with prior information and small samples. (author). 10 refs

  20. Temperature profiles of different cooling methods in porcine pancreas procurement.

    Science.gov (United States)

    Weegman, Bradley P; Suszynski, Thomas M; Scott, William E; Ferrer Fábrega, Joana; Avgoustiniatos, Efstathios S; Anazawa, Takayuki; O'Brien, Timothy D; Rizzari, Michael D; Karatzas, Theodore; Jie, Tun; Sutherland, David E R; Hering, Bernhard J; Papas, Klearchos K

    2014-01-01

    Porcine islet xenotransplantation is a promising alternative to human islet allotransplantation. Porcine pancreas cooling needs to be optimized to reduce the warm ischemia time (WIT) following donation after cardiac death, which is associated with poorer islet isolation outcomes. This study examines the effect of four different cooling Methods on core porcine pancreas temperature (n = 24) and histopathology (n = 16). All Methods involved surface cooling with crushed ice and chilled irrigation. Method A, which is the standard for porcine pancreas procurement, used only surface cooling. Method B involved an intravascular flush with cold solution through the pancreas arterial system. Method C involved an intraductal infusion with cold solution through the major pancreatic duct, and Method D combined all three cooling Methods. Surface cooling alone (Method A) gradually decreased core pancreas temperature to procurement, but incorporating an intraductal infusion (Method C) rapidly reduced core temperature 15-20 °C within the first 2 min of cooling. Combining all methods (Method D) was the most effective at rapidly reducing temperature and providing sustained cooling throughout the duration of procurement, although the recorded WIT was not different between Methods (P = 0.36). Histological scores were different between the cooling Methods (P = 0.02) and the worst with Method A. There were differences in histological scores between Methods A and C (P = 0.02) and Methods A and D (P = 0.02), but not between Methods C and D (P = 0.95), which may highlight the importance of early cooling using an intraductal infusion. In conclusion, surface cooling alone cannot rapidly cool large (porcine or human) pancreata. Additional cooling with an intravascular flush and intraductal infusion results in improved core porcine pancreas temperature profiles during procurement and histopathology scores. These data may also have implications on human pancreas procurement as use of an

  1. Black Hole Spin Measurement Uncertainty

    Science.gov (United States)

    Salvesen, Greg; Begelman, Mitchell C.

    2018-01-01

    Angular momentum, or spin, is one of only two fundamental properties of astrophysical black holes, and measuring its value has numerous applications. For instance, obtaining reliable spin measurements could constrain the growth history of supermassive black holes and reveal whether relativistic jets are powered by tapping into the black hole spin reservoir. The two well-established techniques for measuring black hole spin can both be applied to X-ray binaries, but are in disagreement for cases of non-maximal spin. This discrepancy must be resolved if either technique is to be deemed robust. We show that the technique based on disc continuum fitting is sensitive to uncertainties regarding the disc atmosphere, which are observationally unconstrained. By incorporating reasonable uncertainties into black hole spin probability density functions, we demonstrate that the spin measured by disc continuum fitting can become highly uncertain. Future work toward understanding how the observed disc continuum is altered by atmospheric physics, particularly magnetic fields, will further strengthen black hole spin measurement techniques.

  2. Comparing Social Stories™ to Cool versus Not Cool

    Science.gov (United States)

    Leaf, Justin B.; Mitchell, Erin; Townley-Cochran, Donna; McEachin, John; Taubman, Mitchell; Leaf, Ronald

    2016-01-01

    In this study we compared the cool versus not cool procedure to Social Stories™ for teaching various social behaviors to one individual diagnosed with autism spectrum disorder. The researchers randomly assigned three social skills to the cool versus not cool procedure and three social skills to the Social Stories™ procedure. Naturalistic probes…

  3. Uncertainty in oil projects

    International Nuclear Information System (INIS)

    Limperopoulos, G.J.

    1995-01-01

    This report presents an oil project valuation under uncertainty by means of two well-known financial techniques: the Capital Asset Pricing Model (CAPM) and the Black-Scholes option pricing formula. CAPM gives a linear positive relationship between expected rate of return and risk but does not take into consideration the aspect of flexibility, which is crucial for an irreversible investment such as an oil project. Introducing investment decision flexibility by using real options can increase the oil project value substantially. Some simple tests for the importance of stock market uncertainty for oil investments are performed. Uncertainty in stock returns is correlated with aggregate product market uncertainty according to Pindyck (1991). The results of the tests are not satisfactory due to the short data series, but introducing two other explanatory variables, the interest rate and Gross Domestic Product, improves the situation. 36 refs., 18 figs., 6 tabs
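    For reference, the two valuation tools named in the abstract take the standard textbook forms below; the notation is generic and not taken from the report.

```latex
% CAPM expected return and the Black–Scholes value of a European call
% (standard textbook forms; notation is generic, not the report's).
\[
  E[R_i] = R_f + \beta_i\bigl(E[R_m]-R_f\bigr), \qquad
  C = S_0\,N(d_1) - K e^{-rT} N(d_2),
\]
\[
  d_1 = \frac{\ln(S_0/K) + \bigl(r + \tfrac{1}{2}\sigma^2\bigr)T}{\sigma\sqrt{T}},
  \qquad
  d_2 = d_1 - \sigma\sqrt{T}.
\]
```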

  4. Evaluating prediction uncertainty

    International Nuclear Information System (INIS)

    McKay, M.D.

    1995-03-01

    The probability distribution of a model prediction is presented as a proper basis for evaluating the uncertainty in a model prediction that arises from uncertainty in input values. Determination of important model inputs and subsets of inputs is made through comparison of the prediction distribution with conditional prediction probability distributions. Replicated Latin hypercube sampling and variance ratios are used in estimation of the distributions and in construction of importance indicators. The assumption of a linear relation between model output and inputs is not necessary for the indicators to be effective. A sequential methodology which includes an independent validation step is applied in two analysis applications to select subsets of input variables which are the dominant causes of uncertainty in the model predictions. Comparison with results from methods which assume linearity shows how those methods may fail. Finally, suggestions for treating structural uncertainty for submodels are presented
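    A minimal sketch of the sampling-and-importance idea is shown below. It is not the paper's procedure: the test model, the number of replicates and the bin-conditional variance ratio are illustrative assumptions standing in for the replicated Latin hypercube designs and variance-ratio indicators described above.

```python
import numpy as np

# Hypothetical sketch: replicated Latin hypercube sampling with a crude
# variance-ratio importance indicator (variance of bin-conditional means
# over total output variance).
rng = np.random.default_rng(3)

def latin_hypercube(n, d, rng):
    # one stratum per point and dimension; permute strata, jitter within them
    strata = np.array([rng.permutation(n) for _ in range(d)]).T
    return (strata + rng.random((n, d))) / n

def model(x):
    # placeholder model: driven strongly by x0, weakly by x1, hardly by x2
    return 3.0 * x[:, 0] ** 2 + 0.5 * x[:, 1] + 0.01 * x[:, 2]

n_inputs, n_points, n_replicates = 3, 200, 10
x_all = np.vstack([latin_hypercube(n_points, n_inputs, rng)
                   for _ in range(n_replicates)])
y_all = model(x_all)
total_var = y_all.var()

for j in range(n_inputs):
    bins = np.digitize(x_all[:, j], np.linspace(0.0, 1.0, 11)[1:-1])
    cond_means = [y_all[bins == b].mean() for b in np.unique(bins)]
    ratio = np.var(cond_means) / total_var
    print(f"input x{j}: variance ratio ~ {ratio:.2f}")
```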

  5. Introduction to uncertainty quantification

    CERN Document Server

    Sullivan, T J

    2015-01-01

    Uncertainty quantification is a topic of increasing practical importance at the intersection of applied mathematics, statistics, computation, and numerous application areas in science and engineering. This text provides a framework in which the main objectives of the field of uncertainty quantification are defined, and an overview of the range of mathematical methods by which they can be achieved. Complete with exercises throughout, the book will equip readers with both theoretical understanding and practical experience of the key mathematical and algorithmic tools underlying the treatment of uncertainty in modern applied mathematics. Students and readers alike are encouraged to apply the mathematical methods discussed in this book to their own favourite problems to understand their strengths and weaknesses, also making the text suitable as a self-study. This text is designed as an introduction to uncertainty quantification for senior undergraduate and graduate students with a mathematical or statistical back...

  6. Thermal neutron diffusion cooling coefficient for plexiglass

    International Nuclear Information System (INIS)

    Drozdowicz, K.

    1992-08-01

    The thermal neutron diffusion cooling coefficient is a macroscopic material parameter. It is needed to describe the decay of the thermal neutron pulse in a medium and gives information on the diffusion cooling of the thermal neutron spectrum in a bounded volume. Experimental results from various measurements for plexiglass are overviewed in the paper. A method for theoretical, exact calculation of the parameter is presented. The formula utilizes some other thermal neutron parameters and a cooling function, i.e. the function which describes the deviation of the neutron spectrum in a bounded system from the distribution in an infinite one. The energy dependence of the function is obtained numerically from relations which result from the eigenvalue problem of the scattering operator when both the decay constant and the spectrum of the thermal neutron flux are expanded in powers of the geometrical buckling. The case of a 1/ν absorption cross section is considered. The calculation utilizes a synthetic scattering function elaborated for hydrogenous media by GRANADA (1985). The influence of some quantities used in the calculation on the final result is investigated. The obtained value of the diffusion cooling coefficient for plexiglass is C = 6514 cm⁴ s⁻¹ at the temperature of 20 degrees C. The uncertainty is estimated to be ± 100 cm⁴ s⁻¹ within the physical model of the scattering kernel used. (au)
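    For context, the diffusion cooling coefficient C is conventionally defined through the buckling expansion of the fundamental-mode decay constant measured in pulsed-neutron experiments; the standard form is quoted below (it is not taken from this paper).

```latex
% Conventional buckling expansion of the fundamental-mode decay constant of a
% thermal-neutron pulse in a bounded medium; C is the diffusion cooling coefficient.
\[
  \lambda(B^{2}) \;=\; v\Sigma_a \;+\; D_0 B^{2} \;-\; C B^{4} \;+\; \mathcal{O}(B^{6}).
\]
```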

  7. Where does the uncertainty come from? Attributing Uncertainty in Conceptual Hydrologic Modelling

    Science.gov (United States)

    Abu Shoaib, S.; Marshall, L. A.; Sharma, A.

    2015-12-01

    Defining an appropriate forecasting model is a key phase in water resources planning and design. Quantification of uncertainty is an important step in the development and application of hydrologic models. In this study, we examine the dependency of hydrologic model uncertainty on the observed model inputs, the defined model structure, parameter identifiability under optimization, and the identified likelihood. We present here a new uncertainty metric, the Quantile Flow Deviation or QFD, to evaluate the relative uncertainty due to each of these sources under a range of catchment conditions. Through the metric, we may identify the potential spectrum of uncertainty and variability in model simulations. The QFD assesses uncertainty by estimating the deviation in flows at a given quantile across a range of scenarios. By using a quantile-based metric, the change in uncertainty across individual percentiles can be assessed, thereby allowing uncertainty to be expressed as a function of time. The QFD method can be disaggregated to examine any part of the modelling process, including the selection of certain model subroutines or forcing data. Case study results (including catchments in Australia and the USA) suggest that model structure selection is vital irrespective of the flow percentile of interest or the catchment being studied. Examining the QFD across various quantiles additionally demonstrates that lower-yielding catchments may have greater variation due to selected model structures. By incorporating multiple model structures, it is possible to assess (i) the relative importance of various sources of uncertainty, (ii) how these vary with catchment location or hydrologic regime; and (iii) the impact of the length of available observations in uncertainty quantification.
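
    The exact definition of the QFD is not given in this record; the sketch below is only one plausible reading of "deviation in flows at a given quantile across a range of scenarios", with synthetic flows and a simple max-minus-min spread standing in for whatever deviation measure the authors actually use.

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic daily flows from 4 scenarios (e.g. model structures) over 3 years.
        n_scenarios, n_days = 4, 3 * 365
        flows = np.exp(rng.normal(loc=np.linspace(1.0, 1.3, n_scenarios)[:, None],
                                  scale=0.8, size=(n_scenarios, n_days)))

        for q in (0.1, 0.5, 0.9, 0.99):
            flow_q = np.quantile(flows, q, axis=1)    # one flow value per scenario
            deviation = flow_q.max() - flow_q.min()   # spread across scenarios at this quantile
            print(f"q={q:4.2f}: deviation across scenarios = {deviation:.2f}")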

  8. Uncertainty calculations made easier

    International Nuclear Information System (INIS)

    Hogenbirk, A.

    1994-07-01

    The results are presented of a neutron cross section sensitivity/uncertainty analysis performed in a complicated 2D model of the NET shielding blanket design inside the ITER torus design, surrounded by the cryostat/biological shield as planned for ITER. The calculations were performed with a code system developed at ECN Petten, with which sensitivity/uncertainty calculations become relatively simple. In order to check the deterministic neutron transport calculations (performed with DORT), calculations were also performed with the Monte Carlo code MCNP. Care was taken to model the 2.0 cm wide gaps between two blanket segments, as the neutron flux behind the vacuum vessel is largely determined by neutrons streaming through these gaps. The resulting neutron flux spectra are in excellent agreement up to the end of the cryostat. It is noted that at this position the attenuation of the neutron flux is about 11 orders of magnitude. The uncertainty in the energy integrated flux at the beginning of the vacuum vessel and at the beginning of the cryostat was determined in the calculations. The uncertainty appears to be strongly dependent on the exact geometry: if the gaps are filled with stainless steel, the neutron spectrum changes strongly, which results in an uncertainty of 70% in the energy integrated flux at the beginning of the cryostat in the no-gap-geometry, compared to an uncertainty of only 5% in the gap-geometry. Therefore, it is essential to take into account the exact geometry in sensitivity/uncertainty calculations. Furthermore, this study shows that an improvement of the covariance data is urgently needed in order to obtain reliable estimates of the uncertainties in response parameters in neutron transport calculations. (orig./GL)

  9. Uncertainty: lotteries and risk

    OpenAIRE

    Ávalos, Eloy

    2011-01-01

    In this paper we develop the theory of uncertainty in a context where the risks assumed by the individual are measurable and manageable. We primarily use the definition of lottery to formulate the axioms of the individual's preferences, and its representation through the utility function von Neumann - Morgenstern. We study the expected utility theorem and its properties, the paradoxes of choice under uncertainty and finally the measures of risk aversion with monetary lotteries.

  10. Sources of Judgmental Uncertainty

    Science.gov (United States)

    1977-09-01

    To avoid primacy or recency effects, which were not part of this first study, the order of information items was varied for half of the subjects. In total, 72 subjects were randomly assigned to two conditions of control and exposed to three conditions of orderliness; order effects and primacy/recency effects were examined. Keywords: judgmental uncertainty; primacy/recency; environmental uncertainty.

  11. Decision making under uncertainty

    International Nuclear Information System (INIS)

    Wu, J.S.; Apostolakis, G.E.; Okrent, D.

    1989-01-01

    The theory of evidence and the theory of possibility are considered by some analysts as potential models for uncertainty. This paper discusses two issues: how formal probability theory has been relaxed to develop these uncertainty models; and the degree to which these models can be applied to risk assessment. The scope of the second issue is limited to an investigation of their compatibility for combining various pieces of evidence, which is an important problem in PRA

  12. Relational uncertainty in service dyads

    DEFF Research Database (Denmark)

    Kreye, Melanie

    2017-01-01

    Purpose: Relational uncertainty determines how relationships develop because it enables the building of trust and commitment. However, relational uncertainty has not been explored in an inter-organisational setting. This paper investigates how organisations experience relational uncertainty in se...

  13. Laser cooling of neutral atoms

    International Nuclear Information System (INIS)

    1993-01-01

    A qualitative description of laser cooling of neutral atoms is given. Two of the most important mechanisms utilized in laser cooling, the so-called Doppler Cooling and Sisyphus Cooling, are reviewed. The minimum temperature reached by the atoms is derived using simple arguments. (Author) 7 refs

  14. Robustness of ancestral sequence reconstruction to phylogenetic uncertainty.

    Science.gov (United States)

    Hanson-Smith, Victor; Kolaczkowski, Bryan; Thornton, Joseph W

    2010-09-01

    Ancestral sequence reconstruction (ASR) is widely used to formulate and test hypotheses about the sequences, functions, and structures of ancient genes. Ancestral sequences are usually inferred from an alignment of extant sequences using a maximum likelihood (ML) phylogenetic algorithm, which calculates the most likely ancestral sequence assuming a probabilistic model of sequence evolution and a specific phylogeny--typically the tree with the ML. The true phylogeny is seldom known with certainty, however. ML methods ignore this uncertainty, whereas Bayesian methods incorporate it by integrating the likelihood of each ancestral state over a distribution of possible trees. It is not known whether Bayesian approaches to phylogenetic uncertainty improve the accuracy of inferred ancestral sequences. Here, we use simulation-based experiments under both simplified and empirically derived conditions to compare the accuracy of ASR carried out using ML and Bayesian approaches. We show that incorporating phylogenetic uncertainty by integrating over topologies very rarely changes the inferred ancestral state and does not improve the accuracy of the reconstructed ancestral sequence. Ancestral state reconstructions are robust to uncertainty about the underlying tree because the conditions that produce phylogenetic uncertainty also make the ancestral state identical across plausible trees; conversely, the conditions under which different phylogenies yield different inferred ancestral states produce little or no ambiguity about the true phylogeny. Our results suggest that ML can produce accurate ASRs, even in the face of phylogenetic uncertainty. Using Bayesian integration to incorporate this uncertainty is neither necessary nor beneficial.

  15. Bayesian models for comparative analysis integrating phylogenetic uncertainty

    Science.gov (United States)

    2012-01-01

    Background Uncertainty in comparative analyses can come from at least two sources: a) phylogenetic uncertainty in the tree topology or branch lengths, and b) uncertainty due to intraspecific variation in trait values, either due to measurement error or natural individual variation. Most phylogenetic comparative methods do not account for such uncertainties. Not accounting for these sources of uncertainty leads to false perceptions of precision (confidence intervals will be too narrow) and inflated significance in hypothesis testing (e.g. p-values will be too small). Although there is some application-specific software for fitting Bayesian models accounting for phylogenetic error, more general and flexible software is desirable. Methods We developed models to directly incorporate phylogenetic uncertainty into a range of analyses that biologists commonly perform, using a Bayesian framework and Markov Chain Monte Carlo analyses. Results We demonstrate applications in linear regression, quantification of phylogenetic signal, and measurement error models. Phylogenetic uncertainty was incorporated by applying a prior distribution for the phylogeny, where this distribution consisted of the posterior tree sets from Bayesian phylogenetic tree estimation programs. The models were analysed using simulated data sets, and applied to a real data set on plant traits, from rainforest plant species in Northern Australia. Analyses were performed using the free and open source software OpenBUGS and JAGS. Conclusions Incorporating phylogenetic uncertainty through an empirical prior distribution of trees leads to more precise estimation of regression model parameters than using a single consensus tree and enables a more realistic estimation of confidence intervals. In addition, models incorporating measurement errors and/or individual variation, in one or both variables, are easily formulated in the Bayesian framework. We show that BUGS is a useful, flexible general purpose tool for
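
    The paper fits its models in OpenBUGS and JAGS; the numpy sketch below only illustrates the underlying idea of averaging a regression estimate over a posterior sample of trees. The synthetic "tree covariances" and the per-tree GLS fits are assumptions for illustration, not the authors' Bayesian models.

        import numpy as np

        rng = np.random.default_rng(2)
        n_species, n_trees = 20, 50

        def random_tree_covariance(n, rng):
            # Synthetic stand-in for the trait covariance implied by one posterior tree.
            a = rng.normal(size=(n, n)) * 0.2
            return a @ a.T + np.eye(n)

        x = rng.normal(size=n_species)
        y = 0.8 * x + rng.multivariate_normal(np.zeros(n_species),
                                              random_tree_covariance(n_species, rng))

        X = np.column_stack([np.ones(n_species), x])
        slopes = []
        for _ in range(n_trees):
            V = random_tree_covariance(n_species, rng)           # one "posterior tree" draw
            Vi = np.linalg.inv(V)
            beta = np.linalg.solve(X.T @ Vi @ X, X.T @ Vi @ y)   # GLS fit under this tree
            slopes.append(beta[1])

        slopes = np.array(slopes)
        print(f"slope averaged over trees: {slopes.mean():.2f} +/- {slopes.std():.2f}")

    The spread of the slope across trees is the extra uncertainty that a single consensus tree would hide.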

  16. Emergency reactor cooling circuit

    International Nuclear Information System (INIS)

    Araki, Hidefumi; Matsumoto, Tomoyuki; Kataoka, Yoshiyuki.

    1994-01-01

    Cooling water from a gravity-fed reservoir is injected into the reactor pressure vessel through a pipeline in an emergency. The pipeline is inclined downward, with one end in communication with the pressure vessel. During normal operation, the cooling water in the upper portion of the inclined pipeline is heated by convective heat transfer from the connection with the pressure vessel, while the cooling water below the connection remains as cooler pockets. Temperature stratification layers therefore form in the inclined pipeline, and the temperature rise of the water in the vertical pipeline connected to it is small. With this arrangement, the amount of heat lost from the pressure vessel by way of the water injection pipeline is reduced. Furthermore, there is no concern that the cooling water to be injected in an emergency will boil at reduced pressure in the injection pipeline and delay the depressurization of the pressure vessel. (I.N.)

  17. Emergency core cooling system

    International Nuclear Information System (INIS)

    Arai, Kenji; Oikawa, Hirohide.

    1990-01-01

    The device according to this invention can ensure the cooling water required for emergency core cooling upon an abnormal event, for example a loss-of-coolant accident, without using dynamic equipment such as a centrifugal pump or a large-scale tank. The device comprises a pressure accumulation tank containing high-pressure nitrogen gas and cooling water, a condensate storage tank, a pressure suppression pool and a jet pump. The device includes a pipeline for guiding cooling water from the pressure accumulation tank as driving water to the jet pump, a pipeline for guiding cooling water stored in the condensate storage tank and the pressure suppression pool as suction water to the jet pump and, further, a pipeline for guiding the discharge from the jet pump, a mixed stream of suction and driving water, into the reactor pressure vessel. With this constitution, a sufficient amount of water can be supplied into the reactor pressure vessel over a range from relatively high to low pressure, without increasing the size of the pressure accumulation tank. (I.S.)

  18. Gas cooled leads

    International Nuclear Information System (INIS)

    Shutt, R.P.; Rehak, M.L.; Hornik, K.E.

    1993-01-01

    The intent of this paper is to cover as completely as possible and in sufficient detail the topics relevant to lead design. The first part identifies the problems associated with lead design, states the mathematical formulation, and shows the results of numerical and analytical solutions. The second part presents the results of a parametric study whose object is to determine the best choice for cooling method, material, and geometry. These findings are applied in a third part to the design of high-current leads whose end temperatures are determined from the surrounding equipment. It is found that cooling method or improved heat transfer are not critical once good heat exchange is established. The range 5 5 but extends over a large range of values. Mass flow needed to prevent thermal runaway varies linearly with current above a given threshold. Below that value, the mass flow is constant with current. Transient analysis shows no evidence of hysteresis. If cooling is interrupted, the mass flow needed to restore the lead to its initially cooled state grows exponentially with the time that the lead was left without cooling

  19. Dry cooling tower operating experience in the LOFT reactor

    International Nuclear Information System (INIS)

    Hunter, J.A.

    1980-01-01

    A dry cooling tower has been uniquely utilized to dissipate heat generated in a small experimental pressurized water nuclear reactor. Operational experience revealed that dry cooling towers can be intermittently operated with minimal wind susceptibility and water hammer occurrences by cooling potential steam sources after a reactor scram, by isolating idle tubes from the external atmosphere, and by operating at relatively high pressures. Operating experience has also revealed that tube freezing can be minimized by incorporating the proper heating and heat loss prevention features

  20. Climate policy uncertainty and investment risk

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2007-06-21

    Our climate is changing. This is certain. Less certain, however, is the timing and magnitude of climate change, and the cost of transition to a low-carbon world. Therefore, many policies and programmes are still at a formative stage, and policy uncertainty is very high. This book identifies how climate change policy uncertainty may affect investment behaviour in the power sector. For power companies, where capital stock is intensive and long-lived, those risks rank among the biggest and can create an incentive to delay investment. Our analysis shows that the risk premiums of climate change uncertainty can add 40% to the construction costs of a plant for power investors, and a 10% price surcharge for electricity end-users. This publication tells what can be done in policy design to reduce these costs. Incorporating the results of quantitative analysis, this publication also shows the sensitivity of different power sector investment decisions to different risks. It compares the effects of climate policy uncertainty with energy market uncertainty, showing the relative importance of these sources of risk for different technologies in different market types. Drawing on extensive consultation with power companies and financial investors, it also assesses the implications for policy makers, allowing the key messages to be transferred into policy designs. This book is a useful tool for governments to improve climate policy mechanisms and create more certainty for power investors.

  1. Estimating Coastal Digital Elevation Model (DEM) Uncertainty

    Science.gov (United States)

    Amante, C.; Mesick, S.

    2017-12-01

    Integrated bathymetric-topographic digital elevation models (DEMs) are representations of the Earth's solid surface and are fundamental to the modeling of coastal processes, including tsunami, storm surge, and sea-level rise inundation. Deviations in elevation values from the actual seabed or land surface constitute errors in DEMs, which originate from numerous sources, including: (i) the source elevation measurements (e.g., multibeam sonar, lidar), (ii) the interpolative gridding technique (e.g., spline, kriging) used to estimate elevations in areas unconstrained by source measurements, and (iii) the datum transformation used to convert bathymetric and topographic data to common vertical reference systems. The magnitude and spatial distribution of the errors from these sources are typically unknown, and the lack of knowledge regarding these errors represents the vertical uncertainty in the DEM. The National Oceanic and Atmospheric Administration (NOAA) National Centers for Environmental Information (NCEI) has developed DEMs for more than 200 coastal communities. This study presents a methodology developed at NOAA NCEI to derive accompanying uncertainty surfaces that estimate DEM errors at the individual cell-level. The development of high-resolution (1/9th arc-second), integrated bathymetric-topographic DEMs along the southwest coast of Florida serves as the case study for deriving uncertainty surfaces. The estimated uncertainty can then be propagated into the modeling of coastal processes that utilize DEMs. Incorporating the uncertainty produces more reliable modeling results, and in turn, better-informed coastal management decisions.
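
    One common way such cell-level uncertainty surfaces can be propagated is by Monte Carlo perturbation of the DEM. The sketch below is a generic illustration with synthetic elevations, a constant 0.3 m uncertainty and a simple "bathtub" inundation rule; none of these choices are taken from the NOAA NCEI workflow described above.

        import numpy as np

        rng = np.random.default_rng(42)

        dem = rng.normal(loc=1.0, scale=0.8, size=(50, 50))   # synthetic elevations, m
        sigma = np.full_like(dem, 0.3)                        # per-cell 1-sigma DEM uncertainty, m
        water_level = 1.2                                     # inundation level, m

        n_draws = 500
        flooded = np.zeros_like(dem)
        for _ in range(n_draws):
            realization = dem + rng.normal(0.0, sigma)        # one perturbed DEM
            flooded += realization < water_level

        prob_flooded = flooded / n_draws                      # per-cell flooding probability
        print("mean flooding probability:", round(prob_flooded.mean(), 3))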

  2. Monitoring Cray Cooling Systems

    Energy Technology Data Exchange (ETDEWEB)

    Maxwell, Don E [ORNL; Ezell, Matthew A [ORNL; Becklehimer, Jeff [Cray, Inc.; Donovan, Matthew J [ORNL; Layton, Christopher C [ORNL

    2014-01-01

    While sites generally have systems in place to monitor the health of Cray computers themselves, often the cooling systems are ignored until a computer failure requires investigation into the source of the failure. The Liebert XDP units used to cool the Cray XE/XK models as well as the Cray proprietary cooling system used for the Cray XC30 models provide data useful for health monitoring. Unfortunately, this valuable information is often available only to custom solutions not accessible by a center-wide monitoring system or is simply ignored entirely. In this paper, methods and tools used to harvest the monitoring data available are discussed, and the implementation needed to integrate the data into a center-wide monitoring system at the Oak Ridge National Laboratory is provided.

  3. Muon ionization cooling experiment

    CERN Multimedia

    CERN. Geneva

    2003-01-01

    A neutrino factory based on a muon storage ring is the ultimate tool for studies of neutrino oscillations, including possibly leptonic CP violation. It is also the first step towards muon colliders. The performance of this new and promising line of accelerators relies heavily on the concept of ionisation cooling of minimum ionising muons, for which much R&D is required. The concept of a muon ionisation cooling experiment has been extensively studied and first steps are now being taken towards its realisation by a joint international team of accelerator and particle physicists. The aim of the workshop is to explore at least two versions of an experiment based on existing cooling channel designs. If such an experiment is feasible, one shall then select, on the basis of effectiveness, simplicity, availability of components and overall cost, a design for the proposed experiment, and assemble the elements necessary for the presentation of a proposal. Please see workshop website.

  4. Reactor cooling system

    International Nuclear Information System (INIS)

    Kato, Etsuji.

    1979-01-01

    Purpose: To eliminate cleaning steps in the pipelines upon reactor shut-down by connecting a filtrating and desalting device to the cooling system to thereby always clean up the water in the pipelines. Constitution: A filtrating and desalting device is connected to the pipelines in the cooling system by way of drain valves and a check valve. Desalted water is taken out from the exit of the filtrating and desalting device and injected to one end of the cooling system pipelines by way of the drain valve and the check valve and then returned by way of another drain valve to the desalting device. Water in the pipelines is thus always desalted and the cleaning step in the pipelines is no more required in the shut-down. (Kawakami, Y.)

  5. Visualization of Uncertainty

    Science.gov (United States)

    Jones, P. W.; Strelitz, R. A.

    2012-12-01

    The output of a simulation is best comprehended through the agency and methods of visualization, but a vital component of good science is knowledge of uncertainty. While great strides have been made in the quantification of uncertainty, especially in simulation, there is still a notable gap: there is no widely accepted means of simultaneously viewing the data and the associated uncertainty in one pane. Visualization saturates the screen, using the full range of color, shadow, opacity and tricks of perspective to display even a single variable. There is no room left in the visualization expert's repertoire for uncertainty. We present a method of visualizing uncertainty without sacrificing the clarity and power of the underlying visualization that works as well in 3-D and time-varying visualizations as it does in 2-D. At its heart, it relies on a principal tenet of continuum mechanics, replacing the notion of value at a point with a more diffuse notion of density as a measure of content in a region. First, the uncertainties calculated or tabulated at each point are transformed into a piecewise continuous field of uncertainty density. We next compute a weighted Voronoi tessellation of a user-specified number N of convex polygonal/polyhedral cells such that each cell contains the same amount of uncertainty as defined by that density. The problem thus reduces to a minimization problem. Computation of such a spatial decomposition is O(N*N) and can be performed iteratively, which makes it straightforward to update over time. The polygonal mesh does not interfere with the visualization of the data and can be easily toggled on or off. In this representation, a small cell implies a great concentration of uncertainty, and conversely. The content-weighted polygons are analogous to the cartograms familiar to the information visualization community for depicting quantities such as voting results per state. Furthermore, one can dispense with the mesh or edges entirely and replace them with symbols or glyphs
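
    As a rough illustration of the idea that cells should shrink where uncertainty density is high, the sketch below samples points proportionally to a synthetic density field and clusters them with k-means, which approximates a tessellation whose cells carry roughly equal uncertainty content. The authors' weighted Voronoi construction is not reproduced here; the density field and cell count are invented.

        import numpy as np
        from scipy.cluster.vq import kmeans2

        rng = np.random.default_rng(3)

        # Synthetic uncertainty-density field on a unit square, with one hot spot.
        nx = ny = 100
        x, y = np.meshgrid(np.linspace(0, 1, nx), np.linspace(0, 1, ny))
        density = np.exp(-((x - 0.75) ** 2 + (y - 0.75) ** 2) / 0.02) + 0.05
        density /= density.sum()

        # Sample grid locations proportionally to density, then cluster into N cells.
        idx = rng.choice(nx * ny, size=20000, p=density.ravel())
        pts = np.column_stack([x.ravel()[idx], y.ravel()[idx]])
        centroids, labels = kmeans2(pts, 30, minit="++")

        # Each cell should hold roughly the same share of the sampled uncertainty.
        share = np.bincount(labels, minlength=30) / labels.size
        print("uncertainty share per cell (first five):", np.round(share, 3)[:5])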

  6. A New Framework for Quantifying Lidar Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Newman, Jennifer, F.; Clifton, Andrew; Bonin, Timothy A.; Churchfield, Matthew J.

    2017-03-24

    As wind turbine sizes increase and wind energy expands to more complex and remote sites, remote sensing devices such as lidars are expected to play a key role in wind resource assessment and power performance testing. The switch to remote sensing devices represents a paradigm shift in the way the wind industry typically obtains and interprets measurement data for wind energy. For example, the measurement techniques and sources of uncertainty for a remote sensing device are vastly different from those associated with a cup anemometer on a meteorological tower. Current IEC standards discuss uncertainty due to mounting, calibration, and classification of the remote sensing device, among other parameters. Values of the uncertainty are typically given as a function of the mean wind speed measured by a reference device. However, real-world experience has shown that lidar performance is highly dependent on atmospheric conditions, such as wind shear, turbulence, and aerosol content. At present, these conditions are not directly incorporated into the estimated uncertainty of a lidar device. In this presentation, we propose the development of a new lidar uncertainty framework that adapts to current flow conditions and more accurately represents the actual uncertainty inherent in lidar measurements under different conditions. In this new framework, sources of uncertainty are identified for estimation of the line-of-sight wind speed and reconstruction of the three-dimensional wind field. These sources are then related to physical processes caused by the atmosphere and lidar operating conditions. The framework is applied to lidar data from an operational wind farm to assess the ability of the framework to predict errors in lidar-measured wind speed.

  7. Conveying uncertainty in prognosis to patients with ESRD.

    Science.gov (United States)

    Parvez, Sanah; Abdel-Kader, Khaled; Song, Mi-Kyung; Unruh, Mark

    2015-01-01

    Prognosis is a component of medical practice imbued with uncertainty. In nephrology, where mortality rates of elderly patients on dialysis are comparable to those of cancer patients, the implications of prognosis are unavoidable. Yet while most patients with end-stage renal disease (ESRD) desire to hear their prognosis, many nephrologists balk at this prospect in part owing to the uncertainty inherent in prognostic estimates. In this review, the concept of 'uncertainty' in clinical practice is considered from physician and patient perspectives. From the training perspective, providers learn that uncertainty is inescapable in medicine and develop strategies to manage its presence, including the avoidance of communicating uncertainty to their patients. This presages infrequent discussions of prognosis, which in turn influence patient preferences for treatments that have little therapeutic benefit. A general approach to conveying prognostic uncertainty to ESRD patients includes confronting our own emotional reaction to uncertainty, learning how to effectively communicate uncertainty to our patients, and using an effective interdisciplinary team approach to demonstrate an ongoing commitment to our patients despite the presence of prognostic uncertainty. Uncertainty in prognosis is inevitable. Once providers learn to incorporate it into their discussions of prognosis and collaborate with their ESRD patients, such discussions can foster trust and reduce anxiety for both sides. © 2015 S. Karger AG, Basel.

  8. Cooling pond fog studies

    International Nuclear Information System (INIS)

    Hicks, B.B.

    1978-01-01

    The Fog Excess Water Index (FEWI) method of fog prediction has been verified by the use of data obtained at the Dresden cooling pond during 1976 and 1977 and by a reanalysis of observations made in conjunction with a study of cooling pond simulators during 1974. For applications in which the method is applied to measurements or estimates of bulk water temperature, a critical value of about 0.7 mb appears to be most appropriate. The present analyses confirm the earlier finding that wind speed plays little part in determining the susceptibility for fog generation

  9. Superconductor rotor cooling system

    Science.gov (United States)

    Gamble, Bruce B.; Sidi-Yekhlef, Ahmed; Schwall, Robert E.; Driscoll, David I.; Shoykhet, Boris A.

    2002-01-01

    A system for cooling a superconductor device includes a cryocooler located in a stationary reference frame and a closed circulation system external to the cryocooler. The closed circulation system interfaces the stationary reference frame with a rotating reference frame in which the superconductor device is located. A method of cooling a superconductor device includes locating a cryocooler in a stationary reference frame, and transferring heat from a superconductor device located in a rotating reference frame to the cryocooler through a closed circulation system external to the cryocooler. The closed circulation system interfaces the stationary reference frame with the rotating reference frame.

  10. Stochastic cooling for beginners

    International Nuclear Information System (INIS)

    Moehl, D.

    1984-01-01

    These two lectures have been prepared to give a simple introduction to the principles. In Part I we try to explain stochastic cooling using the time-domain picture which starts from the pulse response of the system. In Part II the discussion is repeated, looking more closely at the frequency-domain response. An attempt is made to familiarize the beginners with some of the elementary cooling equations, from the 'single particle case' up to equations which describe the evolution of the particle distribution. (orig.)
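
    For readers who want a number to attach to the "elementary cooling equations" mentioned above, the toy calculation below evaluates the standard simplified cooling-rate expression 1/τ ≈ (W/N)(2g − g²), which neglects mixing and amplifier noise; the bandwidth and particle number are arbitrary illustrative values, not taken from the lectures.

        W = 4.0e9   # system bandwidth, Hz (illustrative)
        N = 1.0e8   # number of particles in the beam (illustrative)

        for g in (0.5, 1.0, 1.5):
            rate = (W / N) * (2 * g - g ** 2)   # cooling rate, 1/s
            print(f"gain g={g}: rate {rate:.1f} 1/s, e-folding time {1 / rate:.3f} s")

    The coherent term 2g cools and the incoherent g² term heats, so in this simplified form the fastest cooling occurs at g = 1, giving a rate of roughly W/N.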

  11. Dealing with exploration uncertainties

    International Nuclear Information System (INIS)

    Capen, E.

    1992-01-01

    Exploration for oil and gas should fulfill the most adventurous in their quest for excitement and surprise. This paper tries to cover that tall order. The authors will touch on the magnitude of the uncertainty (which is far greater than in most other businesses), the effects of not knowing target sizes very well, how to build uncertainty into analyses naturally, how to tie reserves and chance estimates to economics, and how to look at the portfolio effect of an exploration program. With no apologies, the authors will be using a different language for some readers - the language of uncertainty, which means probability and statistics. These tools allow one to combine largely subjective exploration information with the more analytical data from the engineering and economic side

  12. Commonplaces and social uncertainty

    DEFF Research Database (Denmark)

    Lassen, Inger

    2008-01-01

    This article explores the concept of uncertainty in four focus group discussions about genetically modified food. In the discussions, members of the general public interact with food biotechnology scientists while negotiating their attitudes towards genetic engineering. Their discussions offer an example of risk discourse in which the use of commonplaces seems to be a central feature (Myers 2004: 81). My analyses support earlier findings that commonplaces serve important interactional purposes (Barton 1999) and that they are used for mitigating disagreement, for closing topics and for facilitating risk discourse (Myers 2005; 2007). In addition, however, I argue that commonplaces are used to mitigate feelings of insecurity caused by uncertainty and to negotiate new codes of moral conduct. Keywords: uncertainty, commonplaces, risk discourse, focus groups, appraisal

  13. Direct Laser Cooling Al⁺ Ion Optical Clocks

    Science.gov (United States)

    Zhang, Jie; Deng, Ke; Luo, Jun; Lu, Ze-Huang

    2017-05-01

    The Al⁺ ion optical clock is a very promising optical frequency standard candidate due to its extremely small black-body radiation shift. It has been successfully demonstrated with the indirect cooled, quantum-logic-based spectroscopy technique. Its accuracy is limited by second-order Doppler shift, and its stability is limited by the number of ions that can be probed in quantum logic processing. We propose a direct laser cooling scheme of Al⁺ ion optical clocks where both the stability and accuracy of the clocks are greatly improved. In the proposed scheme, two Al⁺ traps are utilized. The first trap is used to trap a large number of Al⁺ ions to improve the stability of the clock laser, while the second trap is used to trap a single Al⁺ ion to provide the ultimate accuracy. Both traps are cooled with a continuous wave 167 nm laser. The expected clock laser stability can reach 9.0×10⁻¹⁷/√τ. For the second trap, in addition to 167 nm laser Doppler cooling, a second stage pulsed 234 nm two-photon cooling laser is utilized to further improve the accuracy of the clock laser. The total systematic uncertainty can be reduced to about 1×10⁻¹⁸. The proposed Al⁺ ion optical clock has the potential to become the most accurate and stable optical clock. Supported by the National Basic Research Program of China under Grant No 2012CB821300, the National Natural Science Foundation of China under Grant Nos 91336213, 11304109, 91536116 and 11174095, and the Program for New Century Excellent Talents by the Ministry of Education under Grant No NCET-11-0176.
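
    A quick arithmetic check (not part of the paper) of what the quoted numbers imply: with a stability of 9.0×10⁻¹⁷/√τ, the averaging time needed for the statistical uncertainty to fall below the 1×10⁻¹⁸ systematic floor follows directly.

        sigma_1s = 9.0e-17   # fractional instability at 1 s averaging time
        floor = 1.0e-18      # quoted total systematic uncertainty

        tau = (sigma_1s / floor) ** 2   # s, since sigma(tau) = sigma_1s / sqrt(tau)
        print(f"averaging time to reach the systematic floor: ~{tau:.0f} s (~{tau / 3600:.1f} h)")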

  14. Sensitivity and uncertainty analysis

    CERN Document Server

    Cacuci, Dan G; Navon, Ionel Michael

    2005-01-01

    As computer-assisted modeling and analysis of physical processes have continued to grow and diversify, sensitivity and uncertainty analyses have become indispensable scientific tools. Sensitivity and Uncertainty Analysis. Volume I: Theory focused on the mathematical underpinnings of two important methods for such analyses: the Adjoint Sensitivity Analysis Procedure and the Global Adjoint Sensitivity Analysis Procedure. This volume concentrates on the practical aspects of performing these analyses for large-scale systems. The applications addressed include two-phase flow problems, a radiative c

  15. Uncertainty in artificial intelligence

    CERN Document Server

    Levitt, TS; Lemmer, JF; Shachter, RD

    1990-01-01

    Clearly illustrated in this volume is the current relationship between Uncertainty and AI. It has been said that research in AI revolves around five basic questions asked relative to some particular domain: What knowledge is required? How can this knowledge be acquired? How can it be represented in a system? How should this knowledge be manipulated in order to provide intelligent behavior? How can the behavior be explained? In this volume, all of these questions are addressed. From the perspective of the relationship of uncertainty to the basic questions of AI, the book divides naturally i

  16. Electron Cooling Dynamics for RHIC

    International Nuclear Information System (INIS)

    Fedotov, A.V.; Ben-Zvi, I.; Eidelman, Yu.; Litvinenko, V.N.; Malitsky, N.; Bruhwiler, D.; Meshkov, I.; Sidorin, A.; Smirnov, A.; Trubnikov, G.

    2005-01-01

    Research towards high-energy electron cooling of RHIC is presently underway at Brookhaven National Laboratory. In this new regime, electron cooling has many unique features and challenges. At high energy, due to the difficulty of providing operational reserves, the expected cooling times must be estimated with a high degree of accuracy compared to extant low-energy coolers. To address these high-energy cooling issues, a detailed study of cooling dynamics based on computer codes and experimental benchmarking was launched at BNL. In this paper, we present an update of the high-energy cooling dynamics studies. We also include a discussion of some features of electron cooling relevant to colliders, such as the effects of rapid cooling of the beam core and an accurate treatment of the intra-beam scattering for such cooled ion distributions

  17. Processes influencing cooling of reactor effluents

    International Nuclear Information System (INIS)

    Magoulas, V.E.; Murphy, C.E. Jr.

    1982-01-01

    Discharge of heated reactor cooling water from SRP reactors to the Savannah River is through sections of stream channels into the Savannah River Swamp and from the swamp into the river. Significant cooling of the reactor effluents takes place in both the streams and swamp. The majority of the cooling is through processes taking place at the surface of the water. The major means of heat dissipation are convective transfer of heat to the air, latent heat transfer through evaporation and radiative transfer of infrared radiation. A model was developed which incorporates the effects of these processes on stream and swamp cooling of reactor effluents. The model was used to simulate the effect of modifications in the stream environment on the temperature of water flowing into the river. Environmental effects simulated were the effect of changing radiant heat load, the effect of changes in tree canopy density in the swamp, the effect of total removal of trees from the swamp, and the effect of diverting the heated water from L reactor from Steel Creek to Pen Branch. 6 references, 7 figures
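
    The surface processes named above (convective transfer, evaporative latent heat, long-wave radiation) can be sketched with generic bulk formulas; the wind function and coefficients below are textbook-style placeholders chosen for illustration, not the SRP stream and swamp model described in the record.

        import numpy as np

        SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

        def saturation_vapor_pressure(T_c):
            # Magnus-type approximation, hPa, for temperature in deg C.
            return 6.112 * np.exp(17.62 * T_c / (243.12 + T_c))

        def surface_heat_loss(T_water, T_air, rh, wind):
            # Very simplified bulk formulas; wind-function coefficients are illustrative.
            f_wind = 9.2 + 0.46 * wind ** 2                       # W m^-2 hPa^-1
            e_w = saturation_vapor_pressure(T_water)
            e_a = rh * saturation_vapor_pressure(T_air)
            latent = f_wind * (e_w - e_a)                         # evaporation
            sensible = 0.61 * f_wind * (T_water - T_air)          # convection (Bowen-type)
            radiative = 0.97 * SIGMA * ((T_water + 273.15) ** 4 - (T_air + 273.15) ** 4)
            return latent + sensible + radiative                  # W m^-2

        # 35 C effluent under 20 C air, 60% relative humidity, 3 m/s wind (made-up conditions).
        print(f"surface heat loss ~ {surface_heat_loss(35.0, 20.0, 0.6, 3.0):.0f} W/m^2")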

  18. Elementary stochastic cooling

    Energy Technology Data Exchange (ETDEWEB)

    Tollestrup, A.V.; Dugan, G

    1983-12-01

    Major headings in this review include: proton sources; antiproton production; antiproton sources and Liouville, the role of the Debuncher; transverse stochastic cooling, time domain; the accumulator; frequency domain; pickups and kickers; Fokker-Planck equation; calculation of constants in the Fokker-Planck equation; and beam feedback. (GHT)

  19. Radiant Floor Cooling Systems

    DEFF Research Database (Denmark)

    Olesen, Bjarne W.

    2008-01-01

    In many countries, hydronic radiant floor systems are widely used for heating all types of buildings such as residential, churches, gymnasiums, hospitals, hangars, storage buildings, industrial buildings, and smaller offices. However, few systems are used for cooling. This article describes a floor...

  20. Measure Guideline: Ventilation Cooling

    Energy Technology Data Exchange (ETDEWEB)

    Springer, D. [Alliance for Residential Building Innovation (ARBI), David, CA (United States); Dakin, B. [Alliance for Residential Building Innovation (ARBI), David, CA (United States); German, A. [Alliance for Residential Building Innovation (ARBI), David, CA (United States)

    2012-04-01

    The purpose of this measure guideline is to provide information on a cost-effective solution for reducing cooling system energy and demand in homes located in hot-dry and cold-dry climates. This guideline provides a prescriptive approach that outlines qualification criteria, selection considerations, and design and installation procedures.

  1. Passive cooling containment study

    International Nuclear Information System (INIS)

    Shin, J.J.; Iotti, R.C.; Wright, R.F.

    1993-01-01

    Pressure and temperature transients of nuclear reactor containment following a postulated loss of coolant accident with a coincident station blackout due to total loss of all alternating current power are studied analytically and experimentally for the full-scale NPR (New Production Reactor). All reactor and containment cooling under this condition would rely on the passive cooling system, which removes reactor decay heat and provides emergency core and containment cooling. Containment passive cooling for this study takes place in the annulus between the containment steel shell and the concrete shield building by natural-convection air flow and thermal radiation. Various heat transfer coefficients inside the annular air space were investigated by running the modified CONTEMPT code CONTEMPT-NPR. In order to verify the proper heat transfer coefficient, temperature, heat flux, and velocity profiles were measured inside the annular air space of the test facility, which consists of a 24-foot (7.3 m) high, steam-heated inner cylinder of three-foot (0.91 m) diameter and an outer cylinder of five-and-a-half-foot (1.7 m) diameter. A comparison of CONTEMPT-NPR and WGOTHIC was performed for the reduced-scale NPR

  2. Experience in KINS on Best Estimate Calculation with Uncertainty Evaluation

    International Nuclear Information System (INIS)

    Bang, Young Seok; Huh, Byung-Gil; Cheong, Ae-ju; Woo, Sweng-Woong

    2013-01-01

    In the present paper, the experience of the Korea Institute of Nuclear Safety (KINS) with Best Estimate (BE) calculation and uncertainty evaluation of large break loss-of-coolant accidents (LB LOCA) of Korean Pressurized Water Reactors (PWR) with various types of Emergency Core Cooling System (ECCS) design is addressed. Specifically, the current status of the BE code, BE calculations, uncertainty parameters and related approaches are discussed. Specific problems, such as how to treat the uncertainty related to phenomena specific to a given ECCS design (e.g., upper plenum injection phenomena), are also discussed. Based on the results and discussion, it is found that the present KINS-REM has been successfully developed and applied to regulatory auditing calculations. Further study is needed on the improvement of the MARS code reflood model, the uncertainty of blowdown quenching, and reconsideration of models not yet considered and of fuel conductivity degradation with burnup. (authors)
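
    Best-estimate-plus-uncertainty LOCA evaluations of this kind commonly rely on non-parametric (Wilks) tolerance limits. Whether KINS-REM uses exactly the recipe below is an assumption on our part, but the sketch shows the usual arithmetic: with 124 random code runs, the third-highest peak cladding temperature is a 95%/95% one-sided bound. The PCT values are synthetic.

        import numpy as np
        from scipy.stats import binom

        def wilks_confidence(n, order, beta=0.95):
            # The order-th largest of n runs exceeds the beta quantile iff at least
            # `order` runs exceed it; the number of exceedances is Binomial(n, 1 - beta).
            return 1.0 - binom.cdf(order - 1, n, 1.0 - beta)

        n_runs, order = 124, 3
        print(f"confidence for 3rd-highest of {n_runs} runs: {wilks_confidence(n_runs, order):.3f}")

        # Pretend code runs: peak cladding temperatures (K) from sampled uncertain inputs.
        rng = np.random.default_rng(7)
        pct = rng.normal(loc=1300.0, scale=60.0, size=n_runs)
        print(f"95/95 PCT bound estimate: {np.sort(pct)[-order]:.0f} K")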

  3. Towards Laser Cooling of Semiconductors

    Science.gov (United States)

    Hassani nia, Iman

    This dissertation reports on novel theoretical concepts as well as experimental efforts toward laser cooling of semiconductors. The use of a quantum well system brings the opportunity to engineer bandstructure, effective masses and the spatial distribution of electrons and holes. This permits the incorporation of novel quantum mechanical phenomena to manipulate the temperature change of the material upon light-matter interaction. Inspired by the fact that Coulomb interaction can lead to blueshift of radiation after photo-absorption, the theory of Coulomb-assisted laser cooling is proposed and investigated for the first time. In order to design suitable multiple quantum well (MQW) structures with Coulomb interaction, a Poisson-Schrodinger solver was devised using MATLAB software. The software is capable of simulating all III-V material compositions and its results have been confirmed experimentally. In the next step, different MQW designs were proposed and optimized to exploit Coulomb interaction for assisting optical refrigeration. One of the suitable designs with standard InGaAsP/InAlAs/InP layers was used to grow the MQW structures using metal organic chemical vapor deposition (MOCVD). Novel techniques of fabrication were implemented to make suspended structures for detecting ultralow thermal powers. By fabricating accurate thermometers, the temperature changes of the device upon laser absorption were measured. The accurate measurement of the temperature encouraged us to characterize the electrical response of the device as another important tool to promote our understanding of the underlying physical phenomena. This is in addition to the accurate spectral and time-resolved photoluminescence measurements that provided us with a wealth of information about the effects of stress, Auger recombination and excitonic radiance in such structures. As future work, important measurements for finding the quantum efficiency of the devices via electrical characterization and

  4. Electron Cooling Study for MEIC

    International Nuclear Information System (INIS)

    Electron cooling of the ion beams is a critical R&D task for achieving high luminosities in JLab's MEIC proposal. In the present MEIC design, a multi-staged cooling scheme is adopted, which includes DC electron cooling in the booster ring and bunched beam electron cooling in the collider ring at both the injection energy and the collision energy. We explored the feasibility of using both magnetized and non-magnetized electron beams for cooling, and concluded that a magnetized electron beam is necessary. Electron cooling simulation results for the newly updated MEIC design are also presented.

  5. Uncertainties in repository modeling

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, J.R.

    1996-12-31

    The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified due to uncertainties in dating, calibration, and modeling.

  6. Uncertainties in repository modeling

    International Nuclear Information System (INIS)

    Wilson, J.R.

    1996-01-01

    The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified due to uncertainties in dating, calibration, and modeling.

  7. Risk, Uncertainty, and Entrepreneurship

    DEFF Research Database (Denmark)

    Koudstaal, Martin; Sloof, Randolph; Van Praag, Mirjam

    2016-01-01

    Theory predicts that entrepreneurs have distinct attitudes toward risk and uncertainty, but empirical evidence is mixed. To better understand these mixed results, we perform a large “lab-in-the-field” experiment comparing entrepreneurs to managers (a suitable comparison group) and employees (n = ...

  8. Schrodinger's Uncertainty Principle?

    Indian Academy of Sciences (India)

    Schrödinger's Uncertainty Principle? - Lilies can be Painted. Rajaram Nityananda. General Article, Resonance – Journal of Science Education, Volume 4, Issue 2, February 1999, pp 24-26.

  9. Risks, uncertainty, vagueness

    International Nuclear Information System (INIS)

    Haefele, W.; Renn, O.; Erdmann, G.

    1990-01-01

    The notion of 'risk' is discussed in its social and technological contexts, leading to an investigation of the terms factuality, hypotheticality, uncertainty, and vagueness, and to the problems of acceptance and acceptability, especially in the context of political decision making. (DG)

  10. Use of a temperature-initiated passive cooling system (TIPACS) for the modular high-temperature gas-cooled reactor cavity cooling system (RCCS)

    International Nuclear Information System (INIS)

    Forsberg, C.W.; Conklin, J.; Reich, W.J.

    1994-04-01

    A new type of passive cooling system has been invented (Forsberg 1993): the Temperature-Initiated Passive Cooling System (TIPACS). The characteristics of the TIPACS potentially match requirements for an improved reactor-cavity-cooling system (RCCS) for the modular high-temperature gas-cooled reactor (MHTGR). This report is an initial evaluation of the TIPACS for the MHTGR with a Rankine (steam) power conversion cycle. Limited evaluations were made of applying the TIPACS to MHTGRs with reactor pressure vessel temperatures up to 450°C. These temperatures may occur in designs of Brayton cycle (gas turbine) and process heat MHTGRs. The report is structured as follows. Section 2 describes the containment cooling issues associated with the MHTGR and the requirements for such a cooling system. Section 3 describes TIPACS in nonmathematical terms. Section 4 describes TIPACS's heat-removal capabilities. Section 5 analyzes the operation of the temperature-control mechanism that determines under what conditions the TIPACS rejects heat to the environment. Section 6 addresses other design and operational issues. Section 7 identifies uncertainties, and Section 8 provides conclusions. The appendixes provide the detailed data and models used in the analysis.

  11. Strategy under uncertainty.

    Science.gov (United States)

    Courtney, H; Kirkland, J; Viguerie, P

    1997-01-01

    At the heart of the traditional approach to strategy lies the assumption that by applying a set of powerful analytic tools, executives can predict the future of any business accurately enough to allow them to choose a clear strategic direction. But what happens when the environment is so uncertain that no amount of analysis will allow us to predict the future? What makes for a good strategy in highly uncertain business environments? The authors, consultants at McKinsey & Company, argue that uncertainty requires a new way of thinking about strategy. All too often, they say, executives take a binary view: either they underestimate uncertainty to come up with the forecasts required by their companies' planning or capital-budgeting processes, or they overestimate it, abandon all analysis, and go with their gut instinct. The authors outline a new approach that begins by making a crucial distinction among four discrete levels of uncertainty that any company might face. They then explain how a set of generic strategies--shaping the market, adapting to it, or reserving the right to play at a later time--can be used in each of the four levels. And they illustrate how these strategies can be implemented through a combination of three basic types of actions: big bets, options, and no-regrets moves. The framework can help managers determine which analytic tools can inform decision making under uncertainty--and which cannot. At a broader level, it offers executives a discipline for thinking rigorously and systematically about uncertainty and its implications for strategy.

  12. Cryopreservation: Vitrification and Controlled Rate Cooling.

    Science.gov (United States)

    Hunt, Charles J

    2017-01-01

    Cryopreservation is the application of low temperatures to preserve the structural and functional integrity of cells and tissues. Conventional cooling protocols allow ice to form and solute concentrations to rise during the cryopreservation process. The damage caused by the rise in solute concentration can be mitigated by the use of compounds known as cryoprotectants. Such compounds protect cells from the consequences of slow cooling injury, allowing them to be cooled at cooling rates which avoid the lethal effects of intracellular ice. An alternative to conventional cooling is vitrification. Vitrification methods incorporate cryoprotectants at sufficiently high concentrations to prevent ice crystallization so that the system forms an amorphous glass thus avoiding the damaging effects caused by conventional slow cooling. However, vitrification too can impose damaging consequences on cells as the cryoprotectant concentrations required to vitrify cells at lower cooling rates are potentially, and often, harmful. While these concentrations can be lowered to nontoxic levels, if the cells are ultra-rapidly cooled, the resulting metastable system can lead to damage through devitrification and growth of ice during subsequent storage and rewarming if not appropriately handled.The commercial and clinical application of stem cells requires robust and reproducible cryopreservation protocols and appropriate long-term, low-temperature storage conditions to provide reliable master and working cell banks. Though current Good Manufacturing Practice (cGMP) compliant methods for the derivation and banking of clinical grade pluripotent stem cells exist and stem cell lines suitable for clinical applications are available, current cryopreservation protocols, whether for vitrification or conventional slow freezing, remain suboptimal. Apart from the resultant loss of valuable product that suboptimal cryopreservation engenders, there is a danger that such processes will impose a selective

  13. Sorption cooling: a valid extension to passive cooling

    NARCIS (Netherlands)

    Doornink, D.J.; Burger, Johannes Faas; ter Brake, Hermanus J.M.

    2008-01-01

    Passive cooling has shown to be a very dependable cryogenic cooling method for space missions. Several missions employ passive radiators to cool down their delicate sensor systems for many years, without consuming power, without exporting vibrations or producing electromagnetic interference. So for

  14. Energy management under policy and technology uncertainty

    International Nuclear Information System (INIS)

    Tylock, Steven M.; Seager, Thomas P.; Snell, Jeff; Bennett, Erin R.; Sweet, Don

    2012-01-01

    Energy managers in public agencies are subject to multiple and sometimes conflicting policy objectives regarding cost, environmental, and security concerns associated with alternative energy technologies. Making infrastructure investment decisions requires balancing different distributions of risks and benefits that are far from clear. For example, managers at permanent Army installations must incorporate Congressional legislative objectives, executive orders, Department of Defense directives, state laws and regulations, local restrictions, and multiple stakeholder concerns when undertaking new energy initiatives. Moreover, uncertainty with regard to alternative energy technologies is typically much greater than that associated with traditional technologies, both because the technologies themselves are continuously evolving and because the intermittent nature of many renewable technologies makes a certain level of uncertainty irreducible. This paper describes a novel stochastic multi-attribute analytic approach that allows users to explore different priorities or weighting schemes in combination with uncertainties related to technology performance. To illustrate the utility of this approach for understanding conflicting policy or stakeholder perspectives, prioritizing the need for more information, and making investment decisions, we apply this approach to an energy technology decision problem representative of a permanent military base. Highlights: ► Incorporate disparate criteria with uncertain performance. ► Analyze decisions with contrasting stakeholder positions. ► Interactively compare alternatives based on uncertain weighting. ► User friendly multi-criteria decision analysis (MCDA) tool.
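
    The record describes the stochastic multi-attribute approach only in general terms; the sketch below is a generic stand-in showing how uncertain criterion weights and uncertain technology performance can be sampled together to see how often each alternative ranks first. The alternatives, scores and distributions are invented and are not the authors' case study.

        import numpy as np

        rng = np.random.default_rng(11)
        alternatives = ["solar PV", "diesel genset", "grid + storage"]

        # Mean scores on three criteria (cost, emissions, security), 0-1, higher is better.
        mean_scores = np.array([[0.6, 0.9, 0.5],
                                [0.7, 0.2, 0.8],
                                [0.5, 0.6, 0.9]])
        score_sd = 0.15   # assumed performance uncertainty

        wins = np.zeros(len(alternatives))
        n_draws = 10000
        for _ in range(n_draws):
            weights = rng.dirichlet(np.ones(3))                          # uncertain priorities
            scores = mean_scores + rng.normal(0.0, score_sd, mean_scores.shape)
            wins[np.argmax(scores @ weights)] += 1

        for name, frac in zip(alternatives, wins / n_draws):
            print(f"{name}: ranked first in {frac:.1%} of draws")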

  15. Characterizing spatial uncertainty when integrating social data in conservation planning.

    Science.gov (United States)

    Lechner, A M; Raymond, C M; Adams, V M; Polyakov, M; Gordon, A; Rhodes, J R; Mills, M; Stein, A; Ives, C D; Lefroy, E C

    2014-12-01

    Recent conservation planning studies have presented approaches for integrating spatially referenced social (SRS) data with a view to improving the feasibility of conservation action. We reviewed the growing conservation literature on SRS data, focusing on elicited or stated preferences derived through social survey methods such as choice experiments and public participation geographic information systems. Elicited SRS data includes the spatial distribution of willingness to sell, willingness to pay, willingness to act, and assessments of social and cultural values. We developed a typology for assessing elicited SRS data uncertainty which describes how social survey uncertainty propagates when projected spatially and the importance of accounting for spatial uncertainty such as scale effects and data quality. These uncertainties will propagate when elicited SRS data is integrated with biophysical data for conservation planning and may have important consequences for assessing the feasibility of conservation actions. To explore this issue further, we conducted a systematic review of the elicited SRS data literature. We found that social survey uncertainty was commonly tested for, but that these uncertainties were ignored when projected spatially. Based on these results we developed a framework which will help researchers and practitioners estimate social survey uncertainty and use these quantitative estimates to systematically address uncertainty within an analysis. This is important when using SRS data in conservation applications because decisions need to be made irrespective of data quality and well characterized uncertainty can be incorporated into decision theoretic approaches. © 2014 Society for Conservation Biology.

  16. Benchmarking observational uncertainties for hydrology (Invited)

    Science.gov (United States)

    McMillan, H. K.; Krueger, T.; Freer, J. E.; Westerberg, I.

    2013-12-01

    There is a pressing need for authoritative and concise information on the expected error distributions and magnitudes in hydrological data, to understand its information content. Many studies have discussed how to incorporate uncertainty information into model calibration and implementation, and shown how model results can be biased if uncertainty is not appropriately characterised. However, it is not always possible (for example due to financial or time constraints) to make detailed studies of uncertainty for every research study. Instead, we propose that the hydrological community could benefit greatly from sharing information on likely uncertainty characteristics and the main factors that control the resulting magnitude. In this presentation, we review the current knowledge of uncertainty for a number of key hydrological variables: rainfall, flow and water quality (suspended solids, nitrogen, phosphorus). We collated information on the specifics of the data measurement (data type, temporal and spatial resolution), error characteristics measured (e.g. standard error, confidence bounds) and error magnitude. Our results were primarily split by data type. Rainfall uncertainty was controlled most strongly by spatial scale, flow uncertainty was controlled by flow state (low, high) and gauging method. Water quality presented a more complex picture with many component errors. For all variables, it was easy to find examples where relative error magnitude exceeded 40%. We discuss some of the recent developments in hydrology which increase the need for guidance on typical error magnitudes, in particular when doing comparative/regionalisation and multi-objective analysis. Increased sharing of data, comparisons between multiple catchments, and storage in national/international databases can mean that data-users are far removed from data collection, but require good uncertainty information to reduce bias in comparisons or catchment regionalisation studies. Recently it has

  17. Aspects of uncertainty analysis in accident consequence modeling

    International Nuclear Information System (INIS)

    Travis, C.C.; Hoffman, F.O.

    1981-01-01

    Mathematical models are frequently used to determine probable dose to man from an accidental release of radionuclides by a nuclear facility. With increased emphasis on the accuracy of these models, the incorporation of uncertainty analysis has become one of the most crucial and sensitive components in evaluating the significance of model predictions. In the present paper, we address three aspects of uncertainty in models used to assess the radiological impact to humans: uncertainties resulting from the natural variability in human biological parameters; the propagation of parameter variability by mathematical models; and comparison of model predictions to observational data
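
    A minimal sketch of the second aspect, propagation of parameter variability through a dose model, is given below. The simple multiplicative dose chain and all parameter distributions are placeholders chosen for illustration, not the models or data discussed in the paper.

import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical ingestion-dose chain: dose = C_air * B_v * intake * DCF.
# All parameter distributions below are illustrative assumptions only.
c_air = rng.lognormal(mean=np.log(1e-3), sigma=0.5, size=n)      # Bq/m^3
b_v = rng.lognormal(mean=np.log(0.1), sigma=0.8, size=n)         # transfer factor
intake = rng.normal(loc=200.0, scale=40.0, size=n).clip(min=0)   # kg/yr
dcf = rng.lognormal(mean=np.log(1e-8), sigma=0.3, size=n)        # Sv/Bq

dose = c_air * b_v * intake * dcf

print("median dose      :", np.median(dose))
print("95th percentile  :", np.percentile(dose, 95))
print("ratio 95th/median:", np.percentile(dose, 95) / np.median(dose))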

  18. UQTools: The Uncertainty Quantification Toolbox - Introduction and Tutorial

    Science.gov (United States)

    Kenny, Sean P.; Crespo, Luis G.; Giesy, Daniel P.

    2012-01-01

    UQTools is the short name for the Uncertainty Quantification Toolbox, a software package designed to efficiently quantify the impact of parametric uncertainty on engineering systems. UQTools is a MATLAB-based software package and was designed to be discipline independent, employing very generic representations of the system models and uncertainty. Specifically, UQTools accepts linear and nonlinear system models and permits arbitrary functional dependencies between the system's measures of interest and the probabilistic or non-probabilistic parametric uncertainty. One of the most significant features incorporated into UQTools is the theoretical development centered on homothetic deformations and their application to set bounding and approximating failure probabilities. Beyond the set bounding technique, UQTools provides a wide range of probabilistic and uncertainty-based tools to solve key problems in science and engineering.

  19. General relativistic effects on the cooling of neutron stars

    International Nuclear Information System (INIS)

    Kindl, C.; Straumann, N.

    1981-01-01

    The authors present a discussion of general relativistic effects on the cooling of neutron stars and show analytically that these almost cancel for the dominant neutrino processes and a very stiff equation of state (apart from a trivial redshift of the surface temperature for an observer ''at infinity''). Numerical results for a ''realistic'' equation of state show larger general relativistic corrections. These are, however, still smaller than the uncertainties in the neutrino loss rates. Previous results of cooling curves would thus not be changed significantly by a general relativistic treatment of the thermal properties of neutron stars. (Auth.)

  20. Delayed neutron spectra and their uncertainties in fission product summation calculations

    Energy Technology Data Exchange (ETDEWEB)

    Miyazono, T.; Sagisaka, M.; Ohta, H.; Oyamatsu, K.; Tamaki, M. [Nagoya Univ. (Japan)

    1997-03-01

    Uncertainties in delayed neutron summation calculations are evaluated with ENDF/B-VI for 50 fissioning systems. As the first step, uncertainty calculations are performed for the aggregate delayed neutron activity with the same approximate method as proposed previously for the decay heat uncertainty analyses. Typical uncertainty values are about 6-14% for {sup 238}U(F) and about 13-23% for {sup 243}Am(F) at cooling times of 0.1-100 s. These values are typically 2-3 times larger than those in decay heat at the same cooling times. For aggregate delayed neutron spectra, the uncertainties would be larger than those for the delayed neutron activity because much more information about the nuclear structure is still necessary. (author)
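
    The way component uncertainties combine in such a summation calculation can be illustrated with the hedged sketch below, which propagates assumed, uncorrelated nuclide uncertainties into an aggregate delayed-neutron activity. The yields, emission probabilities, decay constants and uncertainties are invented placeholders, not ENDF/B-VI values.

import numpy as np

# Hypothetical precursor data: cumulative yield, Pn value, decay constant (1/s),
# and relative 1-sigma uncertainties. All numbers are illustrative only.
yields = np.array([0.03, 0.012, 0.008])
pn     = np.array([0.02, 0.05, 0.10])     # delayed-neutron emission probability
lam    = np.array([0.5, 0.1, 0.02])
rel_u  = np.array([0.08, 0.20, 0.35])

def dn_activity(t):
    """Aggregate delayed-neutron activity and its uncertainty at cooling time t."""
    terms = yields * pn * lam * np.exp(-lam * t)
    total = terms.sum()
    # uncorrelated propagation: variance of the sum = sum of term variances
    sigma = np.sqrt(((terms * rel_u) ** 2).sum())
    return total, sigma

for t in (0.1, 1.0, 10.0, 100.0):
    a, s = dn_activity(t)
    print(f"t = {t:6.1f} s  activity = {a:.3e}  rel. uncertainty = {100 * s / a:.1f}%")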

  1. Cooling devices in laser therapy

    Directory of Open Access Journals (Sweden)

    Anupam Das

    2016-01-01

    Cooling devices and methods are now integrated into most laser systems, with a view to protecting the epidermis, reducing pain and erythema and improving the efficacy of the laser. On the basis of the method employed, cooling can be divided into contact cooling and non-contact cooling. With respect to the timing of laser irradiation, the nomenclature includes pre-cooling, parallel cooling and post-cooling. The choice of the cooling device is dictated by the laser device, the physician's personal choice with respect to user-friendliness, comfort of the patient, and the price and maintenance costs of the device. We hereby briefly review the various techniques of cooling employed in laser practice.

  2. Modeling transport phenomena and uncertainty quantification in solidification processes

    Science.gov (United States)

    Fezi, Kyle S.

    Direct chill (DC) casting is the primary processing route for wrought aluminum alloys. This semicontinuous process consists of primary cooling as the metal is pulled through a water cooled mold followed by secondary cooling with a water jet spray and free falling water. To gain insight into this complex solidification process, a fully transient model of DC casting was developed to predict the transport phenomena of aluminum alloys for various conditions. This model is capable of solving mixture mass, momentum, energy, and species conservation equations during multicomponent solidification. Various DC casting process parameters were examined for their effect on transport phenomena predictions in an alloy of commercial interest (aluminum alloy 7050). The practice of placing a wiper to divert cooling water from the ingot surface was studied and the results showed that placement closer to the mold causes remelting at the surface and increases susceptibility to bleed outs. Numerical models of metal alloy solidification, like the one previously mentioned, are used to gain insight into physical phenomena that cannot be observed experimentally. However, uncertainty in model inputs causes uncertainty in results and those insights. The effect of model assumptions and probable input variability on the level of uncertainty in model predictions has not yet been quantified in solidification modeling. As a step towards understanding the effect of uncertain inputs on solidification modeling, uncertainty quantification (UQ) and sensitivity analysis were first performed on a transient solidification model of a simple binary alloy (Al-4.5wt.%Cu) in a rectangular cavity with both columnar and equiaxed solid growth models. This analysis was followed by quantifying the uncertainty in predictions from the recently developed transient DC casting model. The PRISM Uncertainty Quantification (PUQ) framework quantified the uncertainty and sensitivity in macrosegregation, solidification

  3. Cooling apparatus and couplings therefor

    Science.gov (United States)

    Lomax, Curtis; Webbon, Bruce

    1993-11-01

    A cooling apparatus includes a container filled with a quantity of coolant fluid initially cooled to a solid phase, a cooling loop disposed between a heat load and the container, a pump for circulating a quantity of the same type of coolant fluid in a liquid phase through the cooling loop, and a pair of couplings for communicating the liquid phase coolant fluid into the container in a direct interface with the solid phase coolant fluid.

  4. Cooled particle accelerator target

    Science.gov (United States)

    Degtiarenko, Pavel V.

    2005-06-14

    A novel particle beam target comprising: a rotating target disc mounted on a retainer and thermally coupled to a first array of spaced-apart parallel plate fins that extend radially inwardly from the retainer and mesh without physical contact with a second array of spaced-apart parallel plate fins that extend radially outwardly from and are thermally coupled to a cooling mechanism capable of removing heat from said second array of spaced-apart fins and located within the first array of spaced-apart parallel fins. Radiant thermal exchange between the two arrays of parallel plate fins provides removal of heat from the rotating disc. A method of cooling the rotating target is also described.

  5. ITER cooling systems

    International Nuclear Information System (INIS)

    Natalizio, A.; Hollies, R.E.; Sochaski, R.O.; Stubley, P.H.

    1992-06-01

    The ITER reference system uses low-temperature water for heat removal and high-temperature helium for bake-out. As these systems share common equipment, bake-out cannot be performed until the cooling system is drained and dried, and the reactor cannot be started until the helium has been purged from the cooling system. This study examines the feasibility of using a single high-temperature fluid to perform both heat removal and bake-out. The high temperature required for bake-out would also be in the range for power production. The study examines cost, operational benefits, and impact on reactor safety of two options: a high-pressure water system, and a low-pressure organic system. It was concluded that the cost savings and operational benefits are significant; there are no significant adverse safety impacts from operating either the water system or the organic system; and the capital costs of both systems are comparable

  6. Optimization of FRAP uncertainty analysis option

    International Nuclear Information System (INIS)

    Peck, S.O.

    1979-10-01

    The automated uncertainty analysis option that has been incorporated in the FRAP codes (FRAP-T5 and FRAPCON-2) provides the user with a means of obtaining uncertainty bands on code predicted variables at user-selected times during a fuel pin analysis. These uncertainty bands are obtained by multiple single fuel pin analyses to generate data which can then be analyzed by second order statistical error propagation techniques. In this process, a considerable amount of data is generated and stored on tape. The user has certain choices to make regarding which independent variables are to be used in the analysis and what order of error propagation equation should be used in modeling the output response. To aid the user in these decisions, a computer program, ANALYZ, has been written and added to the uncertainty analysis option package. A variety of considerations involved in fitting response surface equations and certain pit-falls of which the user should be aware are discussed. An equation is derived expressing a residual as a function of a fitted model and an assumed true model. A variety of experimental design choices are discussed, including the advantages and disadvantages of each approach. Finally, a description of the subcodes which constitute program ANALYZ is provided
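
    A stripped-down version of the response-surface step, fitting a low-order polynomial to outputs from perturbed runs, inspecting residuals and then propagating input uncertainty through the fit, might look like the sketch below. The stand-in response function and input distributions are assumptions for illustration and are unrelated to FRAP or ANALYZ.

import numpy as np

rng = np.random.default_rng(1)

# Stand-in for multiple single-fuel-pin runs: two perturbed inputs, one response.
n_runs = 60
x1 = rng.normal(0.0, 1.0, n_runs)      # e.g. normalized gap conductance
x2 = rng.normal(0.0, 1.0, n_runs)      # e.g. normalized power level
response = 3.0 + 1.5 * x1 - 0.7 * x2 + 0.4 * x1 * x2 + rng.normal(0, 0.1, n_runs)

# Second-order response surface: 1, x1, x2, x1^2, x2^2, x1*x2.
design = np.column_stack([np.ones(n_runs), x1, x2, x1**2, x2**2, x1 * x2])
coeffs, *_ = np.linalg.lstsq(design, response, rcond=None)
fitted = design @ coeffs
residuals = response - fitted

print("fitted coefficients:", np.round(coeffs, 3))
print("residual std dev   :", residuals.std(ddof=design.shape[1]))

# First-order error propagation through the fitted surface at the nominal point.
sigma_x = np.array([1.0, 1.0])                 # input standard deviations
grad = np.array([coeffs[1], coeffs[2]])        # d(response)/d(x) at x = 0
sigma_resp = np.sqrt(((grad * sigma_x) ** 2).sum())
print("propagated response std dev (1st order):", round(float(sigma_resp), 3))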

  7. Cooling and dehumidifying coils

    International Nuclear Information System (INIS)

    Murthy, M.V.K.

    1988-01-01

    The operating features of cooling and dehumidifying coils and their constructional details are discussed. The heat transfer relations as applicable to the boiling refrigerant and a single phase fluid are presented. Methods of accounting for the effect of moisture condensation on the air side heat transfer coefficient and the fin effectiveness are explained. The logic flow necessary to analyze direct expansion coils and chilled water coils is discussed

  8. Laser Cooling of Solids

    Science.gov (United States)

    2009-01-01

    Panel (b) compares the cooling efficiencies of available thermoelectric coolers (TECs) with ZBLANP:Yb3+-based optical refrigerators. Devices based on materials with low parasitic heating will outperform TECs below 200 K. Coolers made from current materials are less efficient than TECs at all ... luminescence extraction efficiency are being explored as well. A novel method based on the frustrated total internal reflection across a vacuum "nano-gap" is

  9. Conduction cooling: multicrate fastbus hardware

    International Nuclear Information System (INIS)

    Makowiecki, D.; Sims, W.; Larsen, R.

    1980-11-01

    Described is a new and novel approach for cooling nuclear instrumentation modules via heat conduction. The simplicity of liquid cooled crates and ease of thermal management with conduction cooled modules are described. While this system was developed primarily for the higher power levels expected with Fastbus electronics, it has many general applications

  10. ITER cooling system

    International Nuclear Information System (INIS)

    Kveton, O.K.

    1990-11-01

    The present specification of the ITER cooling system does not permit its operation with water above 150 C. However, the first wall needs to be heated to higher temperatures during conditioning at 250 C and bake-out at 350 C. In order to use the cooling water for these operations the cooling system would have to operate during conditioning at 37 Bar and during bake-out at 164 Bar. This is undesirable from the safety analysis point of view, and alternative heating methods are to be found. This review suggests that superheated steam or gas heating can be used for both baking and conditioning. The blanket design must consider the use of dual heat transfer media, allowing for change from one to another in both directions. Transfer from water to gas or steam is the most intricate and risky part of the entire heating process. Superheated steam conditioning appears unfavorable. The use of inert gas is recommended, although alternative heating fluids such as organic coolant should be investigated

  11. Muon Cooling - Emittance Exchange

    International Nuclear Information System (INIS)

    Parsa, Z.

    2001-01-01

    Muon cooling is the key factor in building a muon collider, (to a lesser degree) a muon storage ring, and a neutrino factory. The potential of muon colliders to provide a probe for fundamental particle physics is very interesting, but may take considerable time to realize, as much more work and study is needed. Utilizing high intensity muon sources - neutrino factories, and other intermediate steps - is very important and will greatly expand our abilities and confidence in the credibility of high energy muon colliders. To obtain the needed collider luminosity, the phase-space volume must be greatly reduced within the muon lifetime. Ionization cooling is the preferred method used to compress the phase space and reduce the emittance to obtain high luminosity muon beams. We note that the ionization losses result not only in damping, but also in heating. The use of alternating solenoid lattices has been proposed where the emittances are large. We present an overview of the cooling and discuss the formalism, solenoid magnets and some beam dynamics

  12. Reactor container cooling device

    Energy Technology Data Exchange (ETDEWEB)

    Ando, Koji; Kinoshita, Shoichiro

    1995-11-10

    The device of the present invention efficiently lowers pressure and temperature in a reactor container upon occurrence of a severe accident in a BWR-type reactor and can cool the inside of the container for a long period of time. That is, (1) pipelines on the side of the exhaust tower of the filter portion in a filter vent device of the reactor container are in communication with pipelines on the side of the steam inlet of a static container cooling device by way of horizontal pipelines, (2) a back flow check valve is disposed in the horizontal pipelines, (3) a steam discharge valve for the pressure vessel is disposed closer to the reactor container than the joint portion between the pipelines on the side of the steam inlet and the horizontal pipelines. Upon occurrence of a severe accident, if the pressure vessel is ruptured and steam containing aerosols from the reactor core fills the reactor container, the inlet valve of the static container cooling device is closed. The steam flows into the filter vent device of the reactor container, where the aerosols can be removed. (I.S.).

  13. Visualizing Java uncertainty

    Science.gov (United States)

    Knight, Claire; Munro, Malcolm

    2001-07-01

    Distributed component based systems seem to be the immediate future for software development. The use of such techniques, object oriented languages, and the combination with ever more powerful higher-level frameworks has led to the rapid creation and deployment of such systems to cater for the demand of internet and service driven business systems. This diversity of solution through both the components utilised and the physical/virtual locations of those components can provide powerful resolutions to the new demand. The problem lies in the comprehension and maintenance of such systems because they then have inherent uncertainty. The components combined at any given time for a solution may differ, the messages generated, sent, and/or received may differ, and the physical/virtual locations cannot be guaranteed. Trying to account for this uncertainty and to build it into analysis and comprehension tools is important for both development and maintenance activities.

  14. Risk, uncertainty and regulation.

    Science.gov (United States)

    Krebs, John R

    2011-12-13

    This paper reviews the relationship between scientific evidence, uncertainty, risk and regulation. Risk has many different meanings. Furthermore, if risk is defined as the likelihood of an event happening multiplied by its impact, subjective perceptions of risk often diverge from the objective assessment. Scientific evidence may be ambiguous. Scientific experts are called upon to assess risks, but there is often uncertainty in their assessment, or disagreement about the magnitude of the risk. The translation of risk assessments into policy is a political judgement that includes consideration of the acceptability of the risk and the costs and benefits of legislation to reduce the risk. These general points are illustrated with reference to three examples: regulation of risk from pesticides, control of bovine tuberculosis and pricing of alcohol as a means to discourage excessive drinking.

  15. How Uncertain is Uncertainty?

    Science.gov (United States)

    Vámos, Tibor

    The gist of the paper is the fundamental uncertain nature of all kinds of uncertainties and consequently a critical epistemic review of historical and recent approaches, computational methods, algorithms. The review follows the development of the notion from the beginnings of thinking, via the Aristotelian and Skeptic view, the medieval nominalism and the influential pioneering metaphors of ancient India and Persia to the birth of modern mathematical disciplinary reasoning. Discussing the models of uncertainty, e.g. the statistical, other physical and psychological background we reach a pragmatic model related estimation perspective, a balanced application orientation for different problem areas. Data mining, game theories and recent advances in approximation algorithms are discussed in this spirit of modest reasoning.

  16. DOD ELAP Lab Uncertainties

    Science.gov (United States)

    2012-03-01

    Briefing fragments only: laboratories certify to ISO 9001 (QMS), ISO 14001 (EMS), TS 16949 (US Automotive), etc.; the DoD QSM 4.2 standard is based on ISO/IEC 17025:2005, and each has uncertainty requirements. "Analytical Measurement Uncertainty Estimation" (Defense Technical Information Center # ADA 396946, William S. Ingersoll, 2001) follows the ISO GUM. Approved for public release; distribution unlimited.

  17. Uncertainties about climate

    International Nuclear Information System (INIS)

    Laval, Katia; Laval, Guy

    2013-01-01

    Like meteorology, climatology is not an exact science: climate change forecasts necessarily include a share of uncertainty. It is precisely this uncertainty which is brandished and exploited by opponents of the global warming theory to call into question the estimates of its future consequences. Is it legitimate to predict the future using past climate data (well documented up to 100000 years BP) or the climates of other planets, taking into account the impreciseness of the measurements and the intrinsic complexity of the Earth's machinery? How is it possible to model such a huge and interwoven system, for which any exact description has become impossible? Why do water and precipitation play such an important role in local and global forecasts, and how should they be treated? This book, written by two physicists, answers these delicate questions simply, so as to give anyone the possibility of forming their own opinion about global warming and the need to act rapidly

  18. Uncertainty and Decision Making

    Science.gov (United States)

    1979-09-01

    Included as independent variables were orderliness, the status of the source of information, the primacy versus recency of positive information items, and ... low uncertainty and high satisfaction. The primacy/recency and sequential/final variables produced no significant differences. In summary, we have ... to which the different independent variables (credibility, probability, and content) had an effect on the favorability judgments. The results were

  19. Growth uncertainty and risksharing

    OpenAIRE

    Stefano Athanasoulis; Eric Van Wincoop

    1997-01-01

    How large are potential benefits from global risksharing? In order to answer this question we propose a new methodology that is closely connected with the empirical growth literature. We obtain estimates of residual risk (growth uncertainty) at various horizons from regressions of country-specific growth in deviation from world growth on a wide set of variables in the information set. Since this residual risk can be entirely hedged through risksharing, we use it to obtain a measure of the pot...

  20. Citizen Candidates Under Uncertainty

    OpenAIRE

    Eguia, Jon X.

    2005-01-01

    In this paper we make two contributions to the growing literature on "citizen-candidate" models of representative democracy. First, we add uncertainty about the total vote count. We show that in a society with a large electorate, where the outcome of the election is uncertain and where winning candidates receive a large reward from holding office, there will be a two-candidate equilibrium and no equilibria with a single candidate. Second, we introduce a new concept of equilibrium, which we te...

  1. Uncertainty in artificial intelligence

    CERN Document Server

    Shachter, RD; Henrion, M; Lemmer, JF

    1990-01-01

    This volume, like its predecessors, reflects the cutting edge of research on the automation of reasoning under uncertainty.A more pragmatic emphasis is evident, for although some papers address fundamental issues, the majority address practical issues. Topics include the relations between alternative formalisms (including possibilistic reasoning), Dempster-Shafer belief functions, non-monotonic reasoning, Bayesian and decision theoretic schemes, and new inference techniques for belief nets. New techniques are applied to important problems in medicine, vision, robotics, and natural language und

  2. Calibration Under Uncertainty.

    Energy Technology Data Exchange (ETDEWEB)

    Swiler, Laura Painton; Trucano, Timothy Guy

    2005-03-01

    This report is a white paper summarizing the literature and different approaches to the problem of calibrating computer model parameters in the face of model uncertainty. Model calibration is often formulated as finding the parameters that minimize the squared difference between the model-computed data (the predicted data) and the actual experimental data. This approach does not allow for explicit treatment of uncertainty or error in the model itself: the model is considered the "true" deterministic representation of reality. While this approach does have utility, it is far from an accurate mathematical treatment of the true model calibration problem in which both the computed data and experimental data have error bars. This year, we examined methods to perform calibration accounting for the error in both the computer model and the data, as well as improving our understanding of its meaning for model predictability. We call this approach Calibration under Uncertainty (CUU). This talk presents our current thinking on CUU. We outline some current approaches in the literature, and discuss the Bayesian approach to CUU in detail.
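
    One minimal way to express the CUU idea, letting both measurement error and an assumed model-form error enter the likelihood rather than treating the model as exact, is sketched below. The toy linear model, the flat prior and both error magnitudes are assumptions for illustration, not the methods examined in the report.

import numpy as np

rng = np.random.default_rng(7)

# Toy "computer model": y = a * x, with unknown parameter a.
def model(a, x):
    return a * x

x_obs = np.linspace(0.0, 10.0, 8)
true_a = 2.0
y_obs = model(true_a, x_obs) + rng.normal(0.0, 0.5, x_obs.size)  # noisy data

sigma_data = 0.5     # assumed experimental error
sigma_model = 0.3    # assumed model-form (discrepancy) error
sigma_total = np.hypot(sigma_data, sigma_model)

# Simple grid-based posterior over 'a' with a flat prior on [0, 5].
a_grid = np.linspace(0.0, 5.0, 2001)
log_like = np.array([
    -0.5 * np.sum(((y_obs - model(a, x_obs)) / sigma_total) ** 2)
    for a in a_grid
])
post = np.exp(log_like - log_like.max())
post /= np.trapz(post, a_grid)

mean_a = np.trapz(a_grid * post, a_grid)
std_a = np.sqrt(np.trapz((a_grid - mean_a) ** 2 * post, a_grid))
print(f"posterior mean of a: {mean_a:.3f} +/- {std_a:.3f}")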

  3. Participation under Uncertainty

    International Nuclear Information System (INIS)

    Boudourides, Moses A.

    2003-01-01

    This essay reviews a number of theoretical perspectives about uncertainty and participation in the present-day knowledge-based society. After discussing the on-going reconfigurations of science, technology and society, we examine how appropriate for policy studies are various theories of social complexity. Post-normal science is such an example of a complexity-motivated approach, which justifies civic participation as a policy response to an increasing uncertainty. But there are different categories and models of uncertainties implying a variety of configurations of policy processes. A particular role in all of them is played by expertise whose democratization is an often-claimed imperative nowadays. Moreover, we discuss how different participatory arrangements are shaped into instruments of policy-making and framing regulatory processes. As participation necessitates and triggers deliberation, we proceed to examine the role and the barriers of deliberativeness. Finally, we conclude by referring to some critical views about the ultimate assumptions of recent European policy frameworks and the conceptions of civic participation and politicization that they invoke

  4. Cooling lubricants; Kuehlschmierstoffe

    Energy Technology Data Exchange (ETDEWEB)

    Pfeiffer, W. [Berufsgenossenschaftliches Inst. fuer Arbeitssicherheit, St. Augustin (Germany); Breuer, D. [Berufsgenossenschaftliches Inst. fuer Arbeitssicherheit, St. Augustin (Germany); Blome, H. [Berufsgenossenschaftliches Inst. fuer Arbeitssicherheit, St. Augustin (Germany); Deininger, C. [Berufsgenossenschaftliches Inst. fuer Arbeitssicherheit, St. Augustin (Germany); Hahn, J.U. [Berufsgenossenschaftliches Inst. fuer Arbeitssicherheit, St. Augustin (Germany); Kleine, H. [Berufsgenossenschaftliches Inst. fuer Arbeitssicherheit, St. Augustin (Germany); Nies, E. [Berufsgenossenschaftliches Inst. fuer Arbeitssicherheit, St. Augustin (Germany); Pflaumbaum, W. [Berufsgenossenschaftliches Inst. fuer Arbeitssicherheit, St. Augustin (Germany); Stockmann, R. [Berufsgenossenschaftliches Inst. fuer Arbeitssicherheit, St. Augustin (Germany); Willert, G. [Berufsgenossenschaftliches Inst. fuer Arbeitssicherheit, St. Augustin (Germany); Sonnenschein, G. [Maschinenbau- und Metall-Berufsgenossenschaft, Duesseldorf (Germany)

    1996-08-01

    As a rule, the base substances used are certain liquid hydrocarbons from mineral oils as well as from native and synthetic oils. Through the addition of further substances the cooling lubricant takes on the particular qualities required for the use in question. Employees working with cooling lubricants are exposed to various hazards. The assessment of the concentrations at the work station is carried out on the basis of existing technical rules for contact with hazardous substances. However, the application/implementation of compulsory investigation and supervision in accordance with these rules is made difficult by the fact that cooling lubricants are, as a rule, made up of complicated compound mixtures. In addition to protecting employees from exposure to mists and vapours from the cooling lubricants, protection for the skin is also of particular importance. Cooling lubricants should not, if at all possible, be brought into contact with the skin. Cleansing the skin and skin care is just as important as changing working clothes regularly, and hygiene and cleanliness at the workplace. Unavoidable emissions are to be immediately collected at the point where they arise or are released and safely disposed of. This means taking into account all sources of emissions. The programme presented in this report therefore gives a very detailed account of the individual protective measures and provides recommendations for the design of technical protection facilities. (orig./MG)

  5. Greenhouse with an Integrated NIR Filter and a Solar Cooling System

    NARCIS (Netherlands)

    Sonneveld, P.J.; Swinkels, G.L.A.M.; Kempkes, F.L.K.; Campen, J.B.; Bot, G.P.A.

    2006-01-01

    The scope of this paper is a new greenhouse design that incorporates both a filter for rejecting near infrared radiation (NIR) and a solar cooling system. Cooled greenhouses are an important issue for the combination of high global radiation and high outdoor temperatures. As a first measure, this

  6. Passive ventilation systems with heat recovery and night cooling

    DEFF Research Database (Denmark)

    Hviid, Christian Anker; Svendsen, Svend

    2008-01-01

    In building design the requirements for energy consumption for ventilation, heating and cooling and the requirements for increasingly better indoor climate are two opposing factors. This paper presents the schematic layout and simulation results of an innovative multifunctional ventilation concept with little energy consumption and with satisfying indoor climate. The concept is based on using passive measures like stack and wind driven ventilation, effective night cooling and low pressure loss heat recovery using two fluid coupled water-to-air heat exchangers developed at the Technical University of Denmark. Through building integration in high performance offices the system is optimized to incorporate multiple functions like heating, cooling and ventilation, thus saving the expenses of separate cooling and heating systems. The simulation results are derived using the state-of-the-art building ...

  7. Experimental tests and qualification of analytical methods to address thermohydraulic phenomena in advanced water cooled reactors. Proceedings of a technical committee meeting

    International Nuclear Information System (INIS)

    2000-05-01

    Worldwide there is considerable experience in nuclear power technology, especially in water cooled reactor technology. Of the operating plants, in September 1998, 346 were light water reactors (LWRs) totalling 306 GW(e) and 29 were heavy water reactors (HWRs) totalling 15 GW(e). The accumulated experience and lessons learned from these plants are being incorporated into new advanced reactor designs. Utility requirements documents have been formulated to guide these design activities by incorporating this experience, and results from research and development programmes, with the aim of reducing costs and licensing uncertainties by establishing the technical bases for the new designs. Common goals for advanced designs are high availability, user-friendly features, competitive economics and compliance with internationally recognized safety objectives. Large water cooled reactors with power outputs of 1300 MW(e) and above, which possess inherent safety characteristics (e.g. negative Doppler moderator temperature coefficients, and negative moderator void coefficient) and incorporate proven, active engineered systems to accomplish safety functions are being developed. Other designs with power outputs from, for example, 220 MW(e) up to about 1300 MW(e) which also possess inherent safety characteristics and which place more emphasis on utilization of passive safety systems are being developed. Passive systems are based on natural forces and phenomena such as natural convection and gravity, making safety functions less dependent on active systems and components like pumps and diesel generators. In some cases, further experimental tests for the thermohydraulic conditions of interest in advanced designs can provide improved understanding of the phenomena. Further, analytical methods to predict reactor thermohydraulic behaviour can be qualified for use by comparison with the experimental results. These activities should ultimately result in more economical designs. The

  8. Laser cooling by adiabatic transfer

    Science.gov (United States)

    Norcia, Matthew; Cline, Julia; Bartolotta, John; Holland, Murray; Thompson, James

    2017-04-01

    We have demonstrated a new method of laser cooling applicable to particles with narrow linewidth optical transitions. This simple and robust cooling mechanism uses a frequency-swept laser to adiabatically transfer atoms between internal and motional states. The role of spontaneous emission is reduced (though it is still critical) compared to Doppler cooling. This allows us to achieve greater slowing forces than would be possible with Doppler cooling, and may make this an appealing technique for cooling molecules. In this talk, I will present a demonstration of this technique in a cold strontium system. DARPA QUASAR, NIST, NSF PFC.

  9. Electron cooling experiments at Fermilab

    International Nuclear Information System (INIS)

    Forster, R.; Hardek, T.; Johnson, D.E.; Kells, W.; Kerner, V.; Lai, H.; Lennox, A.J.; Mills, F.; Miyahara, Y.; Oleksiuk, L.; Peters, R.; Rhoades, T.; Young, D.; McIntyre, P.M.

    1981-01-01

    A 115 MeV proton beam has been successfully cooled in the Electron Cooling Ring at Fermilab. Initial experiments have measured the longitudinal drag force, transverse damping rate, and equilibrium beam size. The proton beam was cooled by a factor of approximately 50 in momentum spread in 5 sec, and by a factor of 3 in transverse size in 15 sec. Long term losses were consistent with single scattering from residual gas, with lifetime approximately 1000 sec. Using the measured electron beam temperature T{sub e} of 0.8(2) eV, the observed cooling agrees well with expectations from cooling theory. 13 refs

  10. Laser cooling of molecular anions.

    Science.gov (United States)

    Yzombard, Pauline; Hamamda, Mehdi; Gerber, Sebastian; Doser, Michael; Comparat, Daniel

    2015-05-29

    We propose a scheme for laser cooling of negatively charged molecules. We briefly summarize the requirements for such laser cooling and we identify a number of potential candidates. A detailed computation study with C_{2}^{-}, the most studied molecular anion, is carried out. Simulations of 3D laser cooling in a gas phase show that this molecule could be cooled down to below 1 mK in only a few tens of milliseconds, using standard lasers. Sisyphus cooling, where no photodetachment process is present, as well as Doppler laser cooling of trapped C_{2}^{-}, are also simulated. This cooling scheme has an impact on the study of cold molecules, molecular anions, charged particle sources, and antimatter physics.

  11. Laser Cooling of Molecular Anions

    CERN Document Server

    Yzombard, Pauline; Gerber, Sebastian; Doser, Michael; Comparat, Daniel

    2015-01-01

    We propose a scheme for laser cooling of negatively charged molecules. We briefly summarise the requirements for such laser cooling and we identify a number of potential candidates. A detailed computation study with C_{2}^{-}, the most studied molecular anion, is carried out. Simulations of 3D laser cooling in a gas phase show that this molecule could be cooled down to below 1 mK in only a few tens of milliseconds, using standard lasers. Sisyphus cooling, where no photo-detachment process is present, as well as Doppler laser cooling of trapped C_{2}^{-}, are also simulated. This cooling scheme has an impact on the study of cold molecules, molecular anions, charged particle sources and antimatter physics.

  12. Passive Two-Phase Cooling of Automotive Power Electronics: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Moreno, G.; Jeffers, J. R.; Narumanchi, S.; Bennion, K.

    2014-08-01

    Experiments were conducted to evaluate the use of a passive two-phase cooling strategy as a means of cooling automotive power electronics. The proposed cooling approach utilizes an indirect cooling configuration to alleviate some reliability concerns and to allow the use of conventional power modules. An inverter-scale proof-of-concept cooling system was fabricated, and tests were conducted using the refrigerants hydrofluoroolefin HFO-1234yf and hydrofluorocarbon HFC-245fa. Results demonstrated that the system can dissipate at least 3.5 kW of heat with 250 cm3 of HFC-245fa. An advanced evaporator design that incorporates features to improve performance and reduce size was conceived. Simulation results indicate its thermal resistance can be 37% to 48% lower than automotive dual side cooled power modules. Tests were also conducted to measure the thermal performance of two air-cooled condensers--plain and rifled finned tube designs. The results combined with some analysis were then used to estimate the required condenser size per operating conditions and maximum allowable system (i.e., vapor and liquid) temperatures.

  13. Collision entropy and optimal uncertainty

    OpenAIRE

    Bosyk, G. M.; Portesi, M.; Plastino, A.

    2011-01-01

    We propose an alternative measure of quantum uncertainty for pairs of arbitrary observables in the 2-dimensional case, in terms of collision entropies. We derive the optimal lower bound for this entropic uncertainty relation, which results in an analytic function of the overlap of the corresponding eigenbases. Besides, we obtain the minimum uncertainty states. We compare our relation with other formulations of the uncertainty principle.

  14. Incorporating Feminist Standpoint Theory

    DEFF Research Database (Denmark)

    Ahlström, Kristoffer

    2005-01-01

    As has been noted by Alvin Goldman, there are some very interesting similarities between his Veritistic Social Epistemology (VSE) and Sandra Harding’s Feminist Standpoint Theory (FST). In the present paper, it is argued that these similarities are so significant as to motivate an incorporation...

  15. Cooling water systems design using process integration

    CSIR Research Space (South Africa)

    Gololo, KV

    2010-09-01

    Cooling water systems are generally designed with a set of heat exchangers arranged in parallel. This arrangement results in higher cooling water flowrate and low cooling water return temperature thus reducing cooling tower efficiency. Previous...

  16. Do Orthopaedic Surgeons Acknowledge Uncertainty?

    NARCIS (Netherlands)

    Teunis, Teun; Janssen, Stein; Guitton, Thierry G.; Ring, David; Parisien, Robert

    2016-01-01

    Much of the decision-making in orthopaedics rests on uncertain evidence. Uncertainty is therefore part of our normal daily practice, and yet physician uncertainty regarding treatment could diminish patients' health. It is not known if physician uncertainty is a function of the evidence alone or if

  17. Gas cooled HTR

    International Nuclear Information System (INIS)

    Schweiger, F.

    1985-01-01

    In the He-cooled, graphite-moderated HTR with spherical fuel elements, the steam generator is fixed outside the pressure vessel. The heat exchangers are above the reactor level. The hot gases stream from the reactor bottom over the heat exchanger, through an annular space around the heat exchanger and through feed lines in the side reflector of the reactor back to its top part. This way, in case of shutdown there is a supplementary natural draught that helps the inner natural circulation (chimney draught effect). (orig./PW)

  18. Cooled, temperature controlled electrometer

    Science.gov (United States)

    Morgan, John P.

    1992-08-04

    A cooled, temperature controlled electrometer for the measurement of small currents. The device employs a thermal transfer system to remove heat from the electrometer circuit and its environment and dissipate it to the external environment by means of a heat sink. The operation of the thermal transfer system is governed by a temperature regulation circuit which activates the thermal transfer system when the temperature of the electrometer circuit and its environment exceeds a level previously input to the external variable temperature control circuit. The variable temperature control circuit functions as a subpart of the temperature control circuit. To provide temperature stability and uniformity, the electrometer circuit is enclosed by an insulated housing.

  19. Illumination and radiative cooling

    Energy Technology Data Exchange (ETDEWEB)

    Fan, Shanhui; Raman, Aaswath Pattabhi; Zhu, Linxiao; Rephaeli, Eden

    2018-03-20

    Aspects of the present disclosure are directed to providing and/or controlling electromagnetic radiation. As may be implemented in accordance with one or more embodiments, an apparatus includes a first structure that contains an object, and a second structure that is transparent at solar wavelengths and emissive in the atmospheric electromagnetic radiation transparency window. The second structure operates with the first structure to pass light into the first structure for illuminating the object, and to radiatively cool the object while preserving the object's color.

  20. Application of Fuzzy Set Theory for Uncertainty Analysis in the Probabilistic Safety Assessment of Nuclear Power Plants

    International Nuclear Information System (INIS)

    Dybach, A.M.

    2015-01-01

    The paper discusses the application of fuzzy set theory for uncertainty analysis in the NPP probabilistic safety assessment as an alternative to statistical methods. Results obtained with the Monte Carlo method and fuzzy set theory to assess the probability and uncertainty of failure of the safety function performed by the passive emergency core cooling system are compared
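
    The contrast between the two treatments can be seen on a toy example. The sketch below propagates two component failure probabilities through a series-system model once by Monte Carlo sampling and once by alpha-cut interval arithmetic on triangular fuzzy numbers; all failure-probability values and distributions are illustrative assumptions, not numbers from the paper.

import numpy as np

rng = np.random.default_rng(3)

# Series system of two components: the system fails if either component fails.
def system_failure(p1, p2):
    return 1.0 - (1.0 - p1) * (1.0 - p2)

# --- Monte Carlo treatment: lognormal uncertainty on each failure probability.
n = 200_000
p1 = rng.lognormal(np.log(1e-3), 0.5, n)
p2 = rng.lognormal(np.log(5e-4), 0.7, n)
pf = system_failure(p1, p2)
print("MC median / 95th percentile:", np.median(pf), np.percentile(pf, 95))

# --- Fuzzy treatment: triangular fuzzy numbers (min, mode, max) per component.
tri1 = (3e-4, 1e-3, 3e-3)
tri2 = (1e-4, 5e-4, 2e-3)

def alpha_cut(tri, alpha):
    lo, mode, hi = tri
    return lo + alpha * (mode - lo), hi - alpha * (hi - mode)

for alpha in (0.0, 0.5, 1.0):
    lo1, hi1 = alpha_cut(tri1, alpha)
    lo2, hi2 = alpha_cut(tri2, alpha)
    # monotone function: interval bounds come from the endpoint combinations
    print(f"alpha={alpha:3.1f}  fuzzy interval:"
          f" [{system_failure(lo1, lo2):.2e}, {system_failure(hi1, hi2):.2e}]")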

  1. Principles of Uncertainty

    CERN Document Server

    Kadane, Joseph B

    2011-01-01

    An intuitive and mathematical introduction to subjective probability and Bayesian statistics. An accessible, comprehensive guide to the theory of Bayesian statistics, Principles of Uncertainty presents the subjective Bayesian approach, which has played a pivotal role in game theory, economics, and the recent boom in Markov Chain Monte Carlo methods. Both rigorous and friendly, the book contains: Introductory chapters examining each new concept or assumption Just-in-time mathematics -- the presentation of ideas just before they are applied Summary and exercises at the end of each chapter Discus

  2. Optimizing production under uncertainty

    DEFF Research Database (Denmark)

    Rasmussen, Svend

    This Working Paper derives criteria for optimal production under uncertainty based on the state-contingent approach (Chambers and Quiggin, 2000), and discusses potential problems involved in applying the state-contingent approach in a normative context. The analytical approach uses the concept...... of state-contingent production functions and a definition of inputs including both sort of input, activity and allocation technology. It also analyses production decisions where production is combined with trading in state-contingent claims such as insurance contracts. The final part discusses......

  3. Optimization under Uncertainty

    KAUST Repository

    Lopez, Rafael H.

    2016-01-06

    The goal of this poster is to present the main approaches to optimization of engineering systems in the presence of uncertainties. We begin by giving an insight into robust optimization. Next, we detail how to deal with probabilistic constraints in optimization, the so-called reliability-based design. Subsequently, we present the risk optimization approach, which includes the expected costs of failure in the objective function. After the basic description of each approach is given, the projects developed by CORE are presented. Finally, the main current topic of research of CORE is described.
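
    A compact way to compare the three formulations named here on one toy problem is sketched below: a design is selected by a robust margin rule, by a reliability (probabilistic constraint) rule, and by a risk rule that prices expected failure cost into the objective. The load distribution, capacity and cost models are invented for illustration and are not CORE project models.

import numpy as np

rng = np.random.default_rng(11)

# Toy design problem: choose a thickness d (cost grows with d) to carry an
# uncertain load. All numbers are illustrative assumptions.
loads = rng.normal(100.0, 15.0, 50_000)      # sampled uncertain demand
mean_load, std_load = loads.mean(), loads.std()

def capacity(d):
    return 12.0 * d

def cost(d):
    return 2.0 * d

designs = np.linspace(5.0, 15.0, 1001)

# Robust formulation: cheapest design whose capacity covers mean + 3 sigma.
d_robust = min(d for d in designs if capacity(d) >= mean_load + 3.0 * std_load)

# Reliability-based formulation: cheapest design with P(failure) <= 1e-3,
# estimated by Monte Carlo on the sampled loads.
d_rbdo = min(d for d in designs if np.mean(loads > capacity(d)) <= 1e-3)

# Risk formulation: minimize cost plus expected cost of failure.
failure_cost = 500.0
risk_obj = [cost(d) + failure_cost * np.mean(loads > capacity(d)) for d in designs]
d_risk = designs[int(np.argmin(risk_obj))]

print(f"robust design      : d = {d_robust:.2f}, cost = {cost(d_robust):.2f}")
print(f"reliability design : d = {d_rbdo:.2f}, cost = {cost(d_rbdo):.2f}")
print(f"risk-optimal design: d = {d_risk:.2f}, cost = {cost(d_risk):.2f}")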

  4. Onderzoeksrapportage duurzaam koelen : EOS Renewable Cooling

    NARCIS (Netherlands)

    Broeze, J.; Sluis, van der S.; Wissink, E.

    2010-01-01

    For reducing energy use for cooling, alternative methods (that do not rely on electricity) are needed. Renewable cooling is based on naturally available resources such as evaporative cooling, free cooling, phase change materials, ground subcooling, solar cooling, wind cooling, night radiation &

  5. Cooling of the Building Structure by Night-time Ventilation

    DEFF Research Database (Denmark)

    Artmann, Nikolai

    In modern, extensively glazed office buildings, due to high solar and internal loads and increased comfort expectations, air conditioning is increasingly applied even in moderate and cold climates, like in Central and Northern Europe. Particularly in these cases, night-time ventilation is often...... seen as a promising passive cooling concept. Many successful examples of passively cooled buildings demonstrate the possibility of providing good thermal comfort conditions without the need for energy-intensive air conditioning systems. However, due to uncertainties in the prediction of thermal comfort...... is essential for effective night cooling, and thus a sufficient amount of thermal mass is needed in the building. In order to assess the impact of different parameters, such as slab thickness, material properties and the surface heat transfer, the dynamic heat storage capacity of building elements...

  6. Cooled spool piston compressor

    Science.gov (United States)

    Morris, Brian G. (Inventor)

    1993-01-01

    A hydraulically powered gas compressor receives low pressure gas and outputs a high pressure gas. The housing of the compressor defines a cylinder with a center chamber having a cross-sectional area less than the cross-sectional area of a left end chamber and a right end chamber, and a spool-type piston assembly is movable within the cylinder and includes a left end closure, a right end closure, and a center body that are in sealing engagement with the respective cylinder walls as the piston reciprocates. First and second annular compression chambers are provided between the piston enclosures and center housing portion of the compressor, thereby minimizing the spacing between the core gas and a cooled surface of the compressor. Restricted flow passageways are provided in the piston closure members and a path is provided in the central body of the piston assembly, such that hydraulic fluid flows through the piston assembly to cool the piston assembly during its operation. The compressor of the present invention may be easily adapted for a particular application, and is capable of generating high gas pressures while maintaining both the compressed gas and the compressor components within acceptable temperature limits.

  7. Water cooled nuclear reactors

    International Nuclear Information System (INIS)

    Donaldson, A.J.

    1989-01-01

    In order to reduce any loss of primary water coolant from around a reactor core of a water cooled nuclear reactor caused by any failure of a pressure vessel, an inner vessel is positioned within and spaced from the pressure vessel. The reactor core and main portion of the primary water coolant circuit and a heat exchanger are positioned within the inner vessel to maintain some primary water coolant around the reactor core and to allow residual decay heat to be removed from the reactor core by the heat exchanger. In the embodiment shown an aperture at the upper region of the inner vessel is dimensioned configured and arranged to prevent steam from a steam space of an integral pressurised water cooled nuclear reactor for a ship entering the main portion of the primary water coolant circuit in the inner vessel if the longitudinal axis of the nuclear reactor is displaced from its normal substantially vertical position to an abnormal position at an angle to the vertical direction. Shields are integral with the inner vessel. (author)

  8. Emergency core cooling system

    International Nuclear Information System (INIS)

    Kato, Masaru.

    1981-01-01

    Purpose: To enable quick cooling of a core by efficiently utilizing coolant supplied in an emergency. Constitution: A feedwater nozzle and a water level detector are disposed in the gap between a partition plate for supporting the top of a fuel assembly and a lattice plate for supporting the bottom of the fuel assembly. At the time of a loss of coolant accident, coolant is injected from a sprinkling nozzle toward the reactor core, and is also injected from the feedwater nozzle. When the coolant reaches a prescribed level in the reactor core, the water level is detected by the detector, the coolant is fed by a pump to the lower plenum, and the submerging speed of the reactor core is thereby accelerated. When the water level again becomes lower than the prescribed level, coolant is again supplied from the feedwater nozzle, and a similar operation is thereafter repeated. Accordingly, the coolant supplied to the reactor core can be efficiently utilized to cool the reactor core. (Kamimura, M.)

  9. Probabilistic Mass Growth Uncertainties

    Science.gov (United States)

    Plumer, Eric; Elliott, Darren

    2013-01-01

    Mass has been widely used as a variable input parameter for Cost Estimating Relationships (CER) for space systems. As these space systems progress from early concept studies and drawing boards to the launch pad, their masses tend to grow substantially, hence adversely affecting a primary input to most modeling CERs. Modeling and predicting mass uncertainty, based on historical and analogous data, is therefore critical and is an integral part of modeling cost risk. This paper presents the results of a NASA on-going effort to publish a mass growth datasheet for adjusting single-point Technical Baseline Estimates (TBE) of masses of space instruments as well as spacecraft, for both earth orbiting and deep space missions at various stages of a project's lifecycle. This paper also discusses the long term strategy of NASA Headquarters in publishing similar results, using a variety of cost driving metrics, on an annual basis. This paper provides quantitative results that show decreasing mass growth uncertainties as mass estimate maturity increases. This paper's analysis is based on historical data obtained from the NASA Cost Analysis Data Requirements (CADRe) database.
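
    The adjustment described, inflating a single-point Technical Baseline Estimate with a growth percentile appropriate to the estimate's maturity, can be sketched as below. The growth-ratio samples are invented placeholders rather than CADRe data, but they illustrate how the applied factor shrinks as maturity increases.

import numpy as np

# Hypothetical historical growth ratios (final mass / estimate) by maturity.
# These numbers are illustrative only; real values would come from CADRe.
growth_ratios = {
    "concept_study": [1.05, 1.30, 1.55, 1.20, 1.80, 1.40, 1.10, 1.65],
    "pre_launch":    [1.00, 1.04, 1.08, 1.02, 1.06, 1.03, 1.01, 1.05],
}

def adjusted_mass(tbe_kg, maturity, percentile=70):
    """Adjust a single-point Technical Baseline Estimate with a growth percentile."""
    ratios = np.array(growth_ratios[maturity])
    factor = np.percentile(ratios, percentile)
    return tbe_kg * factor, factor

for maturity in growth_ratios:
    mass, factor = adjusted_mass(250.0, maturity)
    print(f"{maturity:14s} 70th-percentile growth factor {factor:.2f} ->"
          f" adjusted mass {mass:.0f} kg")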

  10. Investment, regulation, and uncertainty

    Science.gov (United States)

    Smyth, Stuart J; McDonald, Jillian; Falck-Zepeda, Jose

    2014-01-01

    As with any technological innovation, time refines the technology, improving upon the original version of the innovative product. The initial GM crops had single traits for either herbicide tolerance or insect resistance. Current varieties have both of these traits stacked together and in many cases other abiotic and biotic traits have also been stacked. This innovation requires investment. While this is relatively straightforward, certain conditions need to exist such that investments can be facilitated. The principal requirement for investment is that regulatory frameworks render consistent and timely decisions. If the certainty of regulatory outcomes weakens, the potential for changes in investment patterns increases. This article provides a summary background to the leading plant breeding technologies that are either currently being used to develop new crop varieties or are in the pipeline to be applied to plant breeding within the next few years. Challenges for existing regulatory systems are highlighted. Utilizing an option value approach from investment literature, an assessment of uncertainty regarding the regulatory approval for these varying techniques is undertaken. This research highlights which technology development options have the greatest degree of uncertainty and hence, which ones might be expected to see an investment decline. PMID:24499745

  11. Oil price uncertainty in Canada

    International Nuclear Information System (INIS)

    Elder, John; Serletis, Apostolos

    2009-01-01

    Bernanke [Bernanke, Ben S. Irreversibility, uncertainty, and cyclical investment. Quarterly Journal of Economics 98 (1983), 85-106.] shows how uncertainty about energy prices may induce optimizing firms to postpone investment decisions, thereby leading to a decline in aggregate output. Elder and Serletis [Elder, John and Serletis, Apostolos. Oil price uncertainty.] find empirical evidence that uncertainty about oil prices has tended to depress investment in the United States. In this paper we assess the robustness of these results by investigating the effects of oil price uncertainty in Canada. Our results are remarkably similar to existing results for the United States, providing additional evidence that uncertainty about oil prices may provide another explanation for why the sharp oil price declines of 1985 failed to produce rapid output growth. Impulse-response analysis suggests that uncertainty about oil prices may tend to reinforce the negative response of output to positive oil shocks. (author)

  12. Calculation and experimental test of the cooling factor of tungsten

    International Nuclear Information System (INIS)

    Puetterich, T.; Neu, R.; Dux, R.; Whiteford, A.D.; O'Mullane, M.G.; Summers, H.P.

    2010-01-01

    The cooling factor of W is evaluated using state of the art data for line radiation and an ionization balance which has been benchmarked with experiment. For the calculation of line radiation, level-resolved calculations were performed with the Cowan code to obtain the electronic structure and excitation cross sections (plane-wave Born approximation). The data were processed by a collisional radiative model to obtain electron density dependent emissions. These data were then combined with the radiative power derived from recombination rates and bremsstrahlung to obtain the total cooling factor. The effect of uncertainties in the recombination rates on the cooling factor was studied and was identified to be of secondary importance. The new cooling factor is benchmarked, by comparisons of the line radiation with spectral measurements as well as with a direct measurement of the cooling factor. Additionally, a less detailed calculation using a configuration averaged model was performed. It was used to benchmark the level-resolved calculations and to improve the prediction on radiation power from line radiation for ionization stages which are computationally challenging. The obtained values for the cooling factor validate older predictions from the literature. Its ingredients and the absolute value are consistent with the existing experimental results regarding the value itself, the spectral distribution of emissions and the ionization equilibrium. A table of the cooling factor versus electron temperature is provided. Finally, the cooling factor is used to investigate the operational window of a fusion reactor with W as intrinsic impurity. The minimum value of nTτ{sub E}, for which a thermonuclear burn is possible, is increased by 20% for a W concentration of 3.0 x 10{sup -5} compared with a plasma without any impurities, except for the He ash which is considered in both cases.
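
    The final step described, folding a cooling factor into a zero-dimensional power balance to gauge the penalty of a given tungsten concentration, can be roughed out as in the sketch below. The cooling-factor value, fusion reactivity and plasma parameters are round placeholder numbers of plausible magnitude, not values taken from the paper.

import numpy as np

# Zero-dimensional estimate of impurity radiation for a given W concentration.
# The cooling factor below is a placeholder of roughly the right order of
# magnitude, not a value from the paper.
L_W = 3e-31          # W m^3, assumed cooling factor near T_e ~ 10-20 keV
n_e = 1.0e20         # m^-3, electron density
c_W = 3.0e-5         # tungsten concentration n_W / n_e

P_imp = n_e * (c_W * n_e) * L_W          # W/m^3 radiated by tungsten
print(f"impurity radiation: {P_imp / 1e3:.1f} kW/m^3")

# Compare with a rough DT fusion power density at the same density
# (the <sigma v> value is an assumed round number for T_i ~ 15 keV).
sigma_v = 3e-22                          # m^3/s
E_fus = 17.6e6 * 1.602e-19               # J per DT fusion
n_d = n_t = 0.5 * n_e
P_fus = n_d * n_t * sigma_v * E_fus
print(f"fusion power      : {P_fus / 1e3:.1f} kW/m^3")
print(f"radiated fraction : {100 * P_imp / P_fus:.2f}%")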

  13. Electronic cooling using thermoelectric devices

    Energy Technology Data Exchange (ETDEWEB)

    Zebarjadi, M., E-mail: m.zebarjadi@rutgers.edu [Department of Mechanical and Aerospace Engineering, Rutgers University, Piscataway, New Jersey 08854 (United States); Institute of Advanced Materials, Devices, and Nanotechnology, Rutgers University, Piscataway, New Jersey 08854 (United States)

    2015-05-18

    Thermoelectric coolers or Peltier coolers are used to pump heat in the opposite direction of the natural heat flux. These coolers have also been proposed for electronic cooling, wherein the aim is to pump heat in the natural heat flux direction and from hot spots to the colder ambient temperature. In this manuscript, we show that for such applications, one needs to use thermoelectric materials with large thermal conductivity and large power factor, instead of the traditionally used high ZT thermoelectric materials. We further show that with the known thermoelectric materials, the active cooling cannot compete with passive cooling, and one needs to explore a new set of materials to provide a cooling solution better than a regular copper heat sink. We propose a set of materials and directions for exploring possible materials candidates suitable for electronic cooling. Finally, to achieve maximum cooling, we propose to use thermoelectric elements as fins attached to copper blocks.

  14. LONGITUDINAL IONIZATION COOLING WITHOUT WEDGES

    International Nuclear Information System (INIS)

    BERG, J.S.

    2001-01-01

    The emittance of a muon beam must be reduced very rapidly due to the finite lifetime of the muons. The most effective known way to accomplish this is ionization cooling. It is straightforward to reduce transverse emittance through ionization cooling, but reducing the longitudinal emittance is more challenging. Longitudinal cooling is necessary for a muon collider, and would be helpful for a neutrino factory. The method traditionally proposed for longitudinal cooling is emittance exchange involving wedges of absorber material: the longitudinal emittance is reduced at the cost of increased transverse emittance. The larger transverse emittance can then be reduced straightforwardly. An alternative method is proposed here, which does not require wedges of material but instead makes slight modifications to the standard transverse cooling lattice. We demonstrate a lattice, a slight modification of a standard Super FOFO transverse cooling lattice, whose linear eigenvalues all have magnitude less than one.

  15. Radiative cooling for thermophotovoltaic systems

    Science.gov (United States)

    Zhou, Zhiguang; Sun, Xingshu; Bermel, Peter

    2016-09-01

    Radiative cooling has recently garnered a great deal of attention for its potential as an alternative method for photovoltaic thermal management. Here, we will consider the limits of radiative cooling for thermal management of electronics broadly, as well as a specific application to thermal power generation. We show that radiative cooling power can increase rapidly with temperature, and is particularly beneficial in systems lacking standard convective cooling. This finding indicates that systems previously operating at elevated temperatures (e.g., 80°C) can be passively cooled close to ambient under appropriate conditions with a reasonable cooling area. To examine these general principles for a previously unexplored application, we consider the problem of thermophotovoltaic (TPV) conversion of heat to electricity via thermal radiation illuminating a photovoltaic diode. Since TPV systems generally operate in vacuum, convective cooling is sharply limited, but radiative cooling can be implemented with proper choice of materials and structures. In this work, realistic simulations of system performance are performed using the rigorous coupled wave analysis (RCWA) techniques to capture thermal emitter radiation, PV diode absorption, and radiative cooling. We subsequently optimize the structural geometry within realistic design constraints to find the best configurations to minimize operating temperature. It is found that low-iron soda-lime glass can potentially cool the PV diode by a substantial amount, even to below ambient temperatures. The cooling effect can be further improved by adding 2D-periodic photonic crystal structures. We find that the improvement of efficiency can be as much as an 18% relative increase over the non-radiatively cooled baseline, as well as a potentially significant improvement in PV diode lifetime.
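
    A much-simplified gray-body estimate (not the RCWA treatment described in the record) illustrates how quickly net radiative cooling power grows with emitter temperature; the emissivity and temperatures below are assumed values for illustration only.

      # Gray-body estimate of net radiative cooling power per unit area,
      # P(T) = eps * sigma * (T^4 - T_amb^4). Illustrative only: the record's results
      # come from full RCWA spectral calculations, not from this simple formula.
      SIGMA = 5.670e-8        # Stefan-Boltzmann constant [W m^-2 K^-4]
      EPS = 0.9               # assumed broadband emissivity
      T_AMB = 300.0           # assumed ambient temperature [K]

      def p_cool(T):
          return EPS * SIGMA * (T**4 - T_AMB**4)

      for T in (310.0, 330.0, 353.0):       # 353 K is roughly the 80 degC case mentioned above
          print(f"T = {T:.0f} K: net radiative cooling ~ {p_cool(T):.0f} W/m^2")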

  16. Stochastic cooling technology at Fermilab

    Energy Technology Data Exchange (ETDEWEB)

    Pasquinelli, R.J. E-mail: pasquin@fnal.gov

    2004-10-11

    The first antiproton cooling systems were installed and commissioned at Fermilab in 1984-1985. In the interim period, there have been several major upgrades, system improvements, and complete reincarnation of cooling systems. This paper will present some of the technology that was pioneered at Fermilab to implement stochastic cooling systems in both the Antiproton Source and Recycler accelerators. Current performance data will also be presented.

  17. Stochastic cooling technology at Fermilab

    Science.gov (United States)

    Pasquinelli, Ralph J.

    2004-10-01

    The first antiproton cooling systems were installed and commissioned at Fermilab in 1984-1985. In the interim period, there have been several major upgrades, system improvements, and complete reincarnation of cooling systems. This paper will present some of the technology that was pioneered at Fermilab to implement stochastic cooling systems in both the Antiproton Source and Recycler accelerators. Current performance data will also be presented.

  18. Stochastic cooling technology at Fermilab

    International Nuclear Information System (INIS)

    Pasquinelli, R.J.

    2004-01-01

    The first antiproton cooling systems were installed and commissioned at Fermilab in 1984-1985. In the interim period, there have been several major upgrades, system improvements, and complete reincarnation of cooling systems. This paper will present some of the technology that was pioneered at Fermilab to implement stochastic cooling systems in both the Antiproton Source and Recycler accelerators. Current performance data will also be presented

  19. Uncertainty analysis associated with radioactive waste disposal: a discussion paper

    International Nuclear Information System (INIS)

    Cranwell, R.M.; Helton, J.C.

    1980-01-01

    The problem of incorporating and representing uncertainty in the analysis of geologic waste disposal has been discussed. The approach has been to view uncertainty analysis in the context of the problem of how to convert from a deterministic model (i.e., a function whose input is a sequence of real numbers) to a probabilistic model (i.e., a function whose input is a sequence of random variables and whose output is one or more random variables). Uncertainty analysis then becomes the study of how the properties of the output random variables are determined by the properties of the input random variables. In the context of this approach, various questions which relate to uncertainty analysis for geologic waste disposal have been discussed, and the manner in which the problems associated with these questions are being treated in the Sandia project has been indicated.
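
    In its simplest Monte Carlo form, the conversion described above amounts to forward sampling: the deterministic function is evaluated on samples of the input random variables and the output distribution is characterized from the results. A generic sketch with a hypothetical model and hypothetical input distributions (not those of the Sandia project):

      # Generic sketch: a deterministic model y = f(x1, x2) becomes probabilistic once
      # its inputs are treated as random variables; the output distribution is then
      # characterized by forward sampling. Model and distributions are hypothetical.
      import random
      import statistics

      def f(permeability, gradient):
          # placeholder deterministic model, e.g. a Darcy-like velocity
          return permeability * gradient

      random.seed(42)
      samples = []
      for _ in range(10000):
          k = random.lognormvariate(-2.0, 0.5)     # uncertain input 1
          i = random.uniform(0.001, 0.01)          # uncertain input 2
          samples.append(f(k, i))

      print("output mean :", statistics.mean(samples))
      print("output stdev:", statistics.stdev(samples))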

  20. Earthquake Loss Estimation Uncertainties

    Science.gov (United States)

    Frolova, Nina; Bonnin, Jean; Larionov, Valery; Ugarov, Aleksander

    2013-04-01

    The paper addresses the reliability of loss assessment following strong earthquakes, with application of worldwide systems in emergency mode. Timely and correct action just after an event can result in significant benefits in saving lives. In this case the information about possible damage and the expected number of casualties is critical for decisions about search and rescue operations and offers of humanitarian assistance. Such rough information may be provided, first of all, by global systems operating in emergency mode. The experience of earthquake disasters in different earthquake-prone countries shows that the officials who are in charge of emergency response at national and international levels often lack prompt and reliable information on the scope of the disaster. Uncertainties in the parameters used in the estimation process are numerous and large: knowledge about the physical phenomena and uncertainties in the parameters used to describe them; the overall adequacy of modelling techniques to the actual physical phenomena; the actual distribution of the population at risk at the very time of the shaking (with respect to the immediate threat: buildings or the like); knowledge about the source of shaking, etc. One need not be a specialist to understand, for example, that the way a given building responds to a given shaking obeys mechanical laws which are poorly known (if not out of the reach of engineers for a large portion of the building stock); if a carefully engineered modern building is approximately predictable, this is far from the case for older buildings, which make up the bulk of inhabited buildings. The way the population inside the buildings at the time of shaking is affected by the physical damage caused to the buildings is, by far, not precisely known. The paper analyzes the influence of uncertainties in the determination of strong event parameters by Alert Seismological Surveys, and of the simulation models used at all stages, from estimating shaking intensity

  1. Theory of tapered laser cooling

    International Nuclear Information System (INIS)

    Okamoto, Hiromi; Wei, J.

    1998-01-01

    A theory of tapered laser cooling for fast circulating ion beams in a storage ring is constructed. The authors describe the fundamentals of this new cooling scheme, emphasizing that it might be the most promising route to beam crystallization. The cooling rates are analytically evaluated to study the ideal operating condition. They discuss the physical implication of the tapering factor of the cooling laser, and show how to determine its optimum value. The molecular dynamics method is employed to demonstrate the validity of the present theory.

  2. Cooling clothing utilizing water evaporation

    DEFF Research Database (Denmark)

    Sakoi, Tomonori; Tominaga, Naoto; Melikov, Arsen Krikor

    2014-01-01

    We developed cooling clothing that utilizes water evaporation to cool the human body and has a mechanism to control the cooling intensity. Clean water was supplied to the outer surface of the T-shirt of the cooling clothing, and a small fan was used to enhance evaporation on this outer surface...... temperature ranging from 27.4 to 30.7 °C to establish a suitable water supply control method. A water supply control method that prevents water accumulation in the T-shirt and water dribbling was validated; this method is established based on the concept of the water evaporation capacity under the applied...
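
    A back-of-the-envelope check (not taken from the record) of why evaporation is such an effective cooling mechanism: near skin temperature each gram of evaporated water removes roughly 2.4 kJ, so modest evaporation rates already provide tens of watts of cooling.

      # Rough estimate of the cooling power obtained by evaporating water from the
      # T-shirt surface. The latent heat near 30 degC is ~2.43 MJ/kg; the evaporation
      # rate is an assumed illustrative value, not a measurement from the study.
      H_FG = 2.43e6                                # latent heat of vaporization [J/kg]
      evaporation_rate_g_per_h = 100.0             # assumed evaporation rate
      cooling_W = (evaporation_rate_g_per_h / 1000.0 / 3600.0) * H_FG
      print(f"{evaporation_rate_g_per_h:.0f} g/h of evaporation ~ {cooling_W:.0f} W of cooling")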

  3. Direct cooled power electronics substrate

    Science.gov (United States)

    Wiles, Randy H [Powell, TN; Wereszczak, Andrew A [Oak Ridge, TN; Ayers, Curtis W [Kingston, TN; Lowe, Kirk T [Knoxville, TN

    2010-09-14

    The disclosure describes directly cooling a three-dimensional, direct metallization (DM) layer in a power electronics device. To enable sufficient cooling, coolant flow channels are formed within the ceramic substrate. The direct metallization layer (typically copper) may be bonded to the ceramic substrate, and semiconductor chips (such as IGBT and diodes) may be soldered or sintered onto the direct metallization layer to form a power electronics module. Multiple modules may be attached to cooling headers that provide in-flow and out-flow of coolant through the channels in the ceramic substrate. The modules and cooling header assembly are preferably sized to fit inside the core of a toroidal shaped capacitor.

  4. Regeneratively Cooled Porous Media Jacket

    Science.gov (United States)

    Mungas, Greg (Inventor); Fisher, David J. (Inventor); London, Adam Pollok (Inventor); Fryer, Jack Merrill (Inventor)

    2013-01-01

    The fluid and heat transfer theory for regenerative cooling of a rocket combustion chamber with a porous media coolant jacket is presented. This model is used to design a regeneratively cooled rocket or other high temperature engine cooling jacket. Cooling jackets comprising impermeable inner and outer walls, and porous media channels are disclosed. Also disclosed are porous media coolant jackets with additional structures designed to transfer heat directly from the inner wall to the outer wall, and structures designed to direct movement of the coolant fluid from the inner wall to the outer wall. Methods of making such jackets are also disclosed.

  5. Heisenberg's principle of uncertainty and the uncertainty relations

    International Nuclear Information System (INIS)

    Redei, Miklos

    1987-01-01

    The usual verbal form of the Heisenberg uncertainty principle and the usual mathematical formulation (the so-called uncertainty theorem) are not equivalent. The meaning of the concept 'uncertainty' is not unambiguous, and different interpretations are used in the literature. Recently, renewed interest has emerged in reinterpreting and reformulating the precise meaning of Heisenberg's principle and in finding an adequate mathematical form. The suggested new theorems are surveyed and critically analyzed. (D.Gy.) 20 refs

  6. Synthesis of Optimal Processing Pathway for Microalgae-based Biorefinery under Uncertainty

    DEFF Research Database (Denmark)

    Rizwan, Muhammad; Lee, Jay H.; Gani, Rafiqul

    2015-01-01

    ...... decision making, we propose a systematic framework for the synthesis and optimal design of microalgae-based processing network under uncertainty. By incorporating major uncertainties into the biorefinery superstructure model we developed previously, a stochastic mixed integer nonlinear programming (sMINLP) problem is formulated for determining the optimal biorefinery structure under given parameter uncertainties modelled as sampled scenarios. The solution to the sMINLP problem determines the optimal decisions with respect to processing technologies, material flows, and product portfolio in the presence......

  7. Energy and Uncertainty in General Relativity

    Science.gov (United States)

    Cooperstock, F. I.; Dupre, M. J.

    2018-03-01

    The issue of energy and its potential localizability in general relativity has challenged physicists for more than a century. Many non-invariant measures were proposed over the years but an invariant measure was never found. We discovered the invariant localized energy measure by expanding the domain of investigation from space to spacetime. We note from relativity that the finiteness of the velocity of propagation of interactions necessarily induces indefiniteness in measurements. This is because the elements of actual physical systems being measured as well as their detectors are characterized by entire four-velocity fields, which necessarily leads to information from a measured system being processed by the detector in a spread of time. General relativity adds additional indefiniteness because of the variation in proper time between elements. The uncertainty is encapsulated in a generalized uncertainty principle, in parallel with that of Heisenberg, which incorporates the localized contribution of gravity to energy. This naturally leads to a generalized uncertainty principle for momentum as well. These generalized forms and the gravitational contribution to localized energy would be expected to be of particular importance in the regimes of ultra-strong gravitational fields. We contrast our invariant spacetime energy measure with the standard 3-space energy measure which is familiar from special relativity, appreciating why general relativity demands a measure in spacetime as opposed to 3-space. We also address the misconceptions about our approach held by certain authors.

  8. ATLAS' major cooling project

    CERN Document Server

    2005-01-01

    In 2005, a considerable effort has been put into commissioning the various units of ATLAS' complex cryogenic system. This is in preparation for the imminent cooling of some of the largest components of the detector in their final underground configuration. The liquid helium and nitrogen ATLAS refrigerators in USA 15. Cryogenics plays a vital role in operating massive detectors such as ATLAS. In many ways the liquefied argon, nitrogen and helium are the life-blood of the detector. ATLAS could not function without cryogens that will be constantly pumped via proximity systems to the superconducting magnets and subdetectors. In recent weeks compressors at the surface and underground refrigerators, dewars, pumps, linkages and all manner of other components related to the cryogenic system have been tested and commissioned. Fifty metres underground, the helium and nitrogen refrigerators, installed inside the service cavern, are an important part of the ATLAS cryogenic system. Two independent helium refrigerators ...

  9. Water cooled nuclear reactor

    International Nuclear Information System (INIS)

    1975-01-01

    A description is given of a cooling water intake collector for a nuclear reactor. It includes multiple sub-collectors extending generally parallel to each other, each one having a first end and a second end spaced apart along its length, and multiple water outlets for connecting each one to a corresponding pressure tube of the reactor. A first end tube and a second one connect the sub-collector tubes together at their first and second ends respectively. It also includes multiple collector tubes extending transversely, crossing over the sub-collector tubes and spaced from each other in the direction of these tubes. Each collector tube has a water intake for connection to a water pump and multiple connecting tubes spaced over its length, each connecting to the corresponding sub-collector [fr

  10. Cooling of rectangular bars

    International Nuclear Information System (INIS)

    Frainer, V.J.

    1979-01-01

    A solution of the time-transient heat transfer differential equation in rectangular coordinates is presented, leading to a model which describes the temperature drop with time in rectangular bars. It is similar to another model for cylindrical bars which had previously been developed in the Laboratory of Mechanical Metallurgy of UFRGS. From these models, a generalization has been made which permits cooling-time evaluation for all profiles. The results are compared with experimental laboratory data in the 1200 to 800 °C range. Some other existing models intended to describe the same phenomenon were also studied; their mathematical forms and their predicted values are analyzed and compared with the experimental ones. (Author) [pt
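
    For reference, and assuming the usual idealizations (uniform initial temperature, constant properties, cooling to a constant ambient), the underlying problem is the transient conduction equation, and the solution for a rectangular bar factorizes into one-dimensional slab solutions:

      \[ \frac{\partial T}{\partial t} = \alpha \nabla^2 T, \qquad
         \frac{T(x,y,t) - T_\infty}{T_0 - T_\infty} = \theta_{\mathrm{slab}}(x,t)\,\theta_{\mathrm{slab}}(y,t), \]

    with each slab factor decaying essentially exponentially at long times, which is the type of temperature-drop law being compared with the 1200 to 800 °C data.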

  11. Thermoelectrically cooled water trap

    Science.gov (United States)

    Micheels, Ronald H [Concord, MA

    2006-02-21

    A water trap system based on a thermoelectric cooling device is employed to remove a major fraction of the water from air samples, prior to analysis of these samples for chemical composition by a variety of analytical techniques in which water vapor interferes with the measurement process. These analytical techniques include infrared spectroscopy, mass spectrometry, ion mobility spectrometry and gas chromatography. The thermoelectric system for trapping water present in air samples can substantially improve detection sensitivity in these analytical techniques when it is necessary to measure trace analytes with concentrations in the ppm (parts per million) or ppb (parts per billion) partial pressure range. The thermoelectric trap design is compact and amenable to use in portable gas monitoring instrumentation.

  12. The factualization of uncertainty:

    DEFF Research Database (Denmark)

    Meyer, G.; Folker, A.P.; Jørgensen, R.B.

    2005-01-01

    on risk assessment does nothing of the sort and is not likely to present an escape from the international deadlock on the use of genetic modification in agriculture and food production. The new legislation is likely to stimulate the kind of emotive reactions it was intended to prevent. In risk assessment...... exercises, scientific uncertainty is turned into risk, expressed in facts and figures. Paradoxically, this conveys an impression of certainty, while value-disagreement and conflicts of interest remain hidden below the surface of factuality. Public dialogue and negotiation along these lines are rendered...... would be to take care of itself – rethinking the role and the limitations of science in a social context, and, thereby gaining the strength to fulfill this role and to enter into dialogue with the rest of society. Scientific communities appear to be obvious candidates for prompting reflection...

  13. Uncertainty as Certainty

    Science.gov (United States)

    Petzinger, Tom

    I am trying to make money in the biotech industry from complexity science. And I am doing it with inspiration that I picked up on the edge of Appalachia spending time with June Holley and ACEnet when I was a Wall Street Journal reporter. I took some of those ideas to Pittsburgh, in biotechnology, in a completely private setting with an economic development focus, but also with a mission to return profit to private capital. And we are doing that. I submit as a hypothesis, something we are figuring out in the post-industrial era, that business evolves. It is not the definition of business, but business critically involves the design of systems in which uncertainty is treated as a certainty. That is what I have seen and what I have tried to put into practice.

  14. Traceability and Measurement Uncertainty

    DEFF Research Database (Denmark)

    Tosello, Guido; De Chiffre, Leonardo

    2004-01-01

    This report is made as a part of the project ‘Metro-E-Learn: European e-Learning in Manufacturing Metrology’, an EU project under the program SOCRATES MINERVA (ODL and ICT in Education), Contract No: 101434-CP-1-2002-1-DE-MINERVA, coordinated by Friedrich-Alexander-University Erlangen-Nürnberg, Chair for Quality Management and Manufacturing-Oriented Metrology (Germany). 'Metro-E-Learn' project proposes to develop and implement a coherent learning and competence chain that leads from introductory and foundation e-courses in initial manufacturing engineering studies towards higher...... Machine tool testing 9. The role of manufacturing metrology for QM 10. Inspection planning 11. Quality management of measurements incl. Documentation 12. Advanced manufacturing measurement technology The present report (which represents the section 2 - Traceability and Measurement Uncertainty – of the e-learning......

  15. Planning ATES systems under uncertainty

    Science.gov (United States)

    Jaxa-Rozen, Marc; Kwakkel, Jan; Bloemendal, Martin

    2015-04-01

    Aquifer Thermal Energy Storage (ATES) can contribute to significant reductions in energy use within the built environment, by providing seasonal energy storage in aquifers for the heating and cooling of buildings. ATES systems have experienced a rapid uptake over the last two decades; however, despite successful experiments at the individual level, the overall performance of ATES systems remains below expectations - largely due to suboptimal practices for the planning and operation of systems in urban areas. The interaction between ATES systems and underground aquifers can be interpreted as a common-pool resource problem, in which thermal imbalances or interference could eventually degrade the storage potential of the subsurface. Current planning approaches for ATES systems thus typically follow the precautionary principle. For instance, the permitting process in the Netherlands is intended to minimize thermal interference between ATES systems. However, as shown in recent studies (Sommer et al., 2015; Bakr et al., 2013), a controlled amount of interference may benefit the collective performance of ATES systems. An overly restrictive approach to permitting is instead likely to create an artificial scarcity of available space, limiting the potential of the technology in urban areas. In response, master plans - which take into account the collective arrangement of multiple systems - have emerged as an increasingly popular alternative. However, permits and master plans both take a static, ex ante view of ATES governance, making it difficult to predict the effect of evolving ATES use or climatic conditions on overall performance. In particular, the adoption of new systems by building operators is likely to be driven by the available subsurface space and by the performance of existing systems; these outcomes are themselves a function of planning parameters. From this perspective, the interactions between planning authorities, ATES operators, and subsurface conditions

  16. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    Energy Technology Data Exchange (ETDEWEB)

    Langenbrunner, James R. [Los Alamos National Laboratory; Booker, Jane M [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Salazar, Issac F [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. To realistically assess uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore PRAs lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with words, 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step for the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.

  17. Core catcher cooling for a gas-cooled fast breeder

    International Nuclear Information System (INIS)

    Dalle Donne, M.; Dorner, S.; Schretzmann, K.

    1976-01-01

    Water, molten salts, and liquid metals are under discussion as coolants for the core catcher of a gas-cooled fast breeder. The authors state that there is still no technically mature method of cooling a core melt. However, the investigations carried out so far suggest that there is a solution to this problem. (RW/AK) [de

  18. Incorporating Feminist Standpoint Theory

    DEFF Research Database (Denmark)

    Ahlström, Kristoffer

    2005-01-01

    As has been noted by Alvin Goldman, there are some very interesting similarities between his Veritistic Social Epistemology (VSE) and Sandra Harding’s Feminist Standpoint Theory (FST). In the present paper, it is argued that these similarities are so significant as to motivate an incorporation...... of FST into VSE, considering that (i) a substantial common ground can be found; (ii) the claims that go beyond this common ground are logically compatible; and (iii) the generality of VSE not only does justice to the inclusive ambition of FST, but even solves a well-discussed problem for the latter...

  19. Additivity of entropic uncertainty relations

    Directory of Open Access Journals (Sweden)

    René Schwonnek

    2018-03-01

    We consider the uncertainty between two pairs of local projective measurements performed on a multipartite system. We show that the optimal bound in any linear uncertainty relation, formulated in terms of the Shannon entropy, is additive. This directly implies, against naive intuition, that the minimal entropic uncertainty can always be realized by fully separable states. Hence, in contradiction to proposals by other authors, no entanglement witness can be constructed solely by comparing the attainable uncertainties of entangled and separable states. However, our result gives rise to a huge simplification for computing global uncertainty bounds as they now can be deduced from local ones. Furthermore, we provide the natural generalization of the Maassen and Uffink inequality for linear uncertainty relations with arbitrary positive coefficients.
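
    For context, the Maassen-Uffink inequality mentioned above bounds the Shannon entropies of two projective measurements A and B by their maximal overlap:

      \[ H(A) + H(B) \ge -2 \log_2 c, \qquad c = \max_{i,j} \left| \langle a_i | b_j \rangle \right|, \]

    and the record's result concerns linear relations of this type with arbitrary positive coefficients, showing that their optimal bounds are additive.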

  20. Uncertainty Management and Sensitivity Analysis

    DEFF Research Database (Denmark)

    Georgiadis, Stylianos; Fantke, Peter

    2017-01-01

    Uncertainty is always there and LCA is no exception to that. The presence of uncertainties of different types and from numerous sources in LCA results is a fact, but managing them allows one to quantify and improve the precision of a study and the robustness of its conclusions. LCA practice sometimes...... suffers from an imbalanced perception of uncertainties, justifying modelling choices and omissions. Identifying prevalent misconceptions around uncertainties in LCA is a central goal of this chapter, aiming to establish a positive approach focusing on the advantages of uncertainty management. The main...... objectives of this chapter are to learn how to deal with uncertainty in the context of LCA, how to quantify it, interpret and use it, and how to communicate it. The subject is approached more holistically than just focusing on relevant statistical methods or purely mathematical aspects. This chapter...

  1. Uncertainty assessment for accelerator-driven systems

    International Nuclear Information System (INIS)

    Finck, P. J.; Gomes, I.; Micklich, B.; Palmiotti, G.

    1999-01-01

    The concept of a subcritical system driven by an external source of neutrons provided by an accelerator, the ADS (Accelerator Driven System), has been recently revived and is becoming more popular in the world technical community, with active programs in Europe, Russia, Japan, and the U.S. A general consensus has been reached in adopting for the subcritical component a fast spectrum liquid metal cooled configuration. A lead-bismuth eutectic, sodium, and gas are all being considered as coolants; each has advantages and disadvantages. The major expected advantage is that subcriticality avoids reactivity induced transients. The potentially large subcriticality margin also should allow for the introduction of very significant quantities of waste products (minor Actinides and Fission Products) which negatively impact the safety characteristics of standard cores. In the U.S. these arguments are the basis for the development of the Accelerator Transmutation of Waste (ATW), which has significant potential in reducing nuclear waste levels. Up to now, neutronic calculations have not attached uncertainties to the values of the main nuclear integral parameters that characterize the system. Many of these parameters (e.g., degree of subcriticality) are crucial to demonstrate the validity and feasibility of this concept. In this paper we will consider uncertainties related to nuclear data only. The present knowledge of the cross sections of many isotopes that are not usually utilized in existing reactors (like Bi, Pb-207, Pb-208, and also Minor Actinides and Fission Products) suggests that uncertainties in the integral parameters will be significantly larger than for conventional reactor systems, and this raises concerns about the neutronic performance of those systems

  2. Impact of discharge data uncertainty on nutrient load uncertainty

    Science.gov (United States)

    Westerberg, Ida; Gustavsson, Hanna; Sonesten, Lars

    2016-04-01

    Uncertainty in the rating-curve model of the stage-discharge relationship leads to uncertainty in discharge time series. These uncertainties in turn affect many other analyses based on discharge data, such as nutrient load estimations. It is important to understand how large the impact of discharge data uncertainty is on such analyses, since they are often used as the basis for taking important environmental management decisions. In the Baltic Sea basin, nutrient load estimates from river mouths are a central information basis for managing and reducing eutrophication in the Baltic Sea. In this study we investigated rating curve uncertainty and its propagation to discharge data uncertainty and thereafter to uncertainty in the load of phosphorus and nitrogen for twelve Swedish river mouths. We estimated rating curve uncertainty using the Voting Point method, which accounts for random and epistemic errors in the stage-discharge relation and allows drawing multiple rating-curve realisations consistent with the total uncertainty. We sampled 40,000 rating curves, and for each sampled curve we calculated a discharge time series from 15-minute water level data for the period 2005-2014. Each discharge time series was then aggregated to daily scale and used to calculate the load of phosphorus and nitrogen from linearly interpolated monthly water samples, following the currently used methodology for load estimation. Finally the yearly load estimates were calculated and we thus obtained distributions with 40,000 load realisations per year - one for each rating curve. We analysed how the rating curve uncertainty propagated to the discharge time series at different temporal resolutions, and its impact on the yearly load estimates. Two shorter periods of daily water quality sampling around the spring flood peak allowed a comparison of load uncertainty magnitudes resulting from discharge data with those resulting from the monthly water quality sampling.
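
    A schematic sketch of the propagation chain described above - sampling rating-curve realisations, converting stage to discharge, and aggregating to annual loads. The power-law rating curve, parameter perturbations and data below are placeholders; the study itself uses the Voting Point method and 15-minute stage data.

      # Schematic propagation of rating-curve uncertainty to annual load estimates.
      # A power-law rating curve Q = a * (h - h0)^b with randomly perturbed parameters
      # stands in for the Voting Point realisations; all data below are synthetic.
      import random

      random.seed(1)
      stage = [0.8 + 0.4 * abs(random.gauss(0, 1)) for _ in range(365)]      # daily stage [m]
      concentration = [0.05 + 0.02 * random.random() for _ in range(365)]    # P conc. [mg/l]

      def annual_load(a, b, h0):
          """Annual load [tonnes] from daily discharge [m3/s] and concentration [mg/l]."""
          load = 0.0
          for h, c in zip(stage, concentration):
              q = a * max(h - h0, 0.0) ** b
              load += q * c * 86400 * 1e-6          # (g/m3 * m3/s) * s/day -> tonnes/day
          return load

      loads = []
      for _ in range(1000):                          # 1000 rating-curve realisations
          a = random.gauss(20.0, 2.0)                # perturbed rating-curve parameters
          b = random.gauss(1.8, 0.1)
          h0 = random.gauss(0.3, 0.03)
          loads.append(annual_load(a, b, h0))

      loads.sort()
      print("median load:", round(loads[500], 1), "t/yr,",
            "90% interval:", round(loads[50], 1), "-", round(loads[950], 1), "t/yr")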

  3. Decommissioning funding: ethics, implementation, uncertainties

    International Nuclear Information System (INIS)

    2006-01-01

    This status report on Decommissioning Funding: Ethics, Implementation, Uncertainties also draws on the experience of the NEA Working Party on Decommissioning and Dismantling (WPDD). The report offers, in a concise form, an overview of relevant considerations on decommissioning funding mechanisms with regard to ethics, implementation and uncertainties. Underlying ethical principles found in international agreements are identified, and factors influencing the accumulation and management of funds for decommissioning nuclear facilities are discussed together with the main sources of uncertainties of funding systems. (authors)

  4. Uncertainty analysis of environmental models

    International Nuclear Information System (INIS)

    Monte, L.

    1990-01-01

    In the present paper an evaluation of the output uncertainty of an environmental model for assessing the transfer of 137Cs and 131I in the human food chain is carried out on the basis of a statistical analysis of data reported in the literature. The uncertainty analysis offers the opportunity of obtaining some remarkable information about the uncertainty of models predicting the migration of non-radioactive substances in the environment, mainly in relation to dry and wet deposition

  5. Chemical model reduction under uncertainty

    KAUST Repository

    Najm, Habib

    2016-01-05

    We outline a strategy for chemical kinetic model reduction under uncertainty. We present highlights of our existing deterministic model reduction strategy, and describe the extension of the formulation to include parametric uncertainty in the detailed mechanism. We discuss the utility of this construction, as applied to hydrocarbon fuel-air kinetics, and the associated use of uncertainty-aware measures of error between predictions from detailed and simplified models.

  6. Reliability analysis under epistemic uncertainty

    International Nuclear Information System (INIS)

    Nannapaneni, Saideep; Mahadevan, Sankaran

    2016-01-01

    This paper proposes a probabilistic framework to include both aleatory and epistemic uncertainty within model-based reliability estimation of engineering systems for individual limit states. Epistemic uncertainty is considered due to both data and model sources. Sparse point and/or interval data regarding the input random variables leads to uncertainty regarding their distribution types, distribution parameters, and correlations; this statistical uncertainty is included in the reliability analysis through a combination of likelihood-based representation, Bayesian hypothesis testing, and Bayesian model averaging techniques. Model errors, which include numerical solution errors and model form errors, are quantified through Gaussian process models and included in the reliability analysis. The probability integral transform is used to develop an auxiliary variable approach that facilitates a single-level representation of both aleatory and epistemic uncertainty. This strategy results in an efficient single-loop implementation of Monte Carlo simulation (MCS) and FORM/SORM techniques for reliability estimation under both aleatory and epistemic uncertainty. Two engineering examples are used to demonstrate the proposed methodology. - Highlights: • Epistemic uncertainty due to data and model included in reliability analysis. • A novel FORM-based approach proposed to include aleatory and epistemic uncertainty. • A single-loop Monte Carlo approach proposed to include both types of uncertainties. • Two engineering examples used for illustration.
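
    A bare-bones illustration (not the paper's auxiliary-variable formulation) of the single-loop idea: epistemic uncertainty about a distribution parameter and aleatory variability of the load are both sampled inside one Monte Carlo loop. The limit state and distributions are hypothetical.

      # Single-loop Monte Carlo over both epistemic and aleatory uncertainty.
      # Epistemic: the mean of the load distribution is itself uncertain.
      # Aleatory: the load varies randomly about that mean.
      # Limit state g = capacity - load; failure when g < 0. All values are hypothetical.
      import random

      random.seed(0)
      CAPACITY = 10.0
      N = 200_000
      failures = 0
      for _ in range(N):
          load_mean = random.gauss(6.0, 0.5)      # epistemic: uncertain distribution parameter
          load = random.gauss(load_mean, 1.5)     # aleatory: variability given that parameter
          if CAPACITY - load < 0.0:
              failures += 1

      print("estimated failure probability:", failures / N)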

  7. Money and Growth under Uncertainty.

    Science.gov (United States)

    (*ECONOMICS, UNCERTAINTY), (*MONEY, DECISION MAKING), (*BEHAVIOR, MATHEMATICAL MODELS), PRODUCTION, CONSUMPTION, EQUILIBRIUM(PHYSIOLOGY), GROWTH(PHYSIOLOGY), MANAGEMENT ENGINEERING, PROBABILITY, INTEGRAL EQUATIONS, THESES

  8. Be Cool, Man! / Jevgeni Levik

    Index Scriptorium Estoniae

    Levik, Jevgeni

    2005-01-01

    A sequel to the 1995 crime comedy "Tooge jupats" ("Get Shorty") : the feature film "Be Cool, Chili Palmer is back!" ("Be Cool") : director F. Gary Gray, starring J. Travolta and U. Thurman : USA 2005. Short interviews with J. Travolta and U. Thurman are included.

  9. Dialogues in the COOL Project

    NARCIS (Netherlands)

    Stalpers, S.I.P.; Kroeze, C.

    2013-01-01

    The Climate Options for the Long-term (COOL) Project is a participatory integrated assessment (PIA) comprising extensive dialogues at three levels: national, European and global. The objective of the COOL Project was to ‘develop strategic notions on how to achieve drastic reductions of greenhouse

  10. Newton's Law of Cooling Revisited

    Science.gov (United States)

    Vollmer, M.

    2009-01-01

    The cooling of objects is often described by a law, attributed to Newton, which states that the temperature difference of a cooling body with respect to the surroundings decreases exponentially with time. Such behaviour has been observed for many laboratory experiments, which led to a wide acceptance of this approach. However, the heat transfer…
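
    For reference, the exponential behaviour referred to above is the solution of the standard statement of Newton's law of cooling:

      \[ \frac{dT}{dt} = -k\,(T - T_s) \quad\Longrightarrow\quad T(t) = T_s + (T_0 - T_s)\,e^{-kt}, \]

    where T_s is the temperature of the surroundings, T_0 the initial temperature and k a constant characterizing the heat transfer.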

  11. Improving cooling of cavity blackbodies

    Science.gov (United States)

    Barrat, Catherine; Chauvel, Gildas

    2013-10-01

    A cavity blackbody is the appropriate IR reference source for IR sensors which require high radiance levels. It combines high emissivity independent of wavelength with fast warm-up and high stability, thanks to its light-trap structure. However, the drawback of this structure is that it leads to a prohibitive cooling time. HGH has developed a method to shorten the cooling time.

  12. Interactions between Cool Roofs and Urban Irrigation: Do Cooling Strategies Reduce Water Consumption in the San Francisco Bay Area?

    Science.gov (United States)

    Vahmani, P.; Jones, A. D.

    2016-12-01

    California has experienced progressive drought since 2012, with 2012-2014 constituting a nearly 10,000-year drought event, resulting in a suite of policies with the goal of reducing water consumption. At the same time, climate warming effects of accelerated urbanization along with projected global climate change raise an urgent need for sustainable mitigation and adaptation strategies to cool urban climates. In this study, for the first time, we assess the overarching benefits of cooling strategies on urban water consumption. We employ a satellite-supported regional climate-modeling framework over the San Francisco Bay Area to assess the effects of cool roofs on urban irrigation, a topic of increasing importance as it accounts for a significant fraction of urban water use, particularly in arid and semi-arid regions. We use a suite of climatological simulations at high (1.5 km) spatial resolution, based on a Weather Research and Forecasting (WRF)-Urban Canopy Model (UCM) modeling framework, reinforced with remotely sensed observations of Green Vegetation Fraction (GVF), leaf area index (LAI), and albedo. Our analysis shows that widespread incorporation of cool roofs would result in a mean daytime cooling of about 0.7 °C, which in turn results in roughly 4% reduction in irrigation water, largely due to decreases in surface evapotranspiration rates. We further investigate the critical interactions between cool roofs, wind, and sea-breeze patterns as well as fog formation, a dominant weather pattern in San Francisco Bay area.

  13. Analysis of Infiltration Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    R. McCurley

    2003-10-27

    The primary objectives of this uncertainty analysis are: (1) to develop and justify a set of uncertain parameters along with associated distributions; and (2) to use the developed uncertain parameter distributions and the results from selected analog site calculations done in "Simulation of Net Infiltration for Modern and Potential Future Climates" (USGS 2001 [160355]) to obtain the net infiltration weighting factors for the glacial transition climate. These weighting factors are applied to unsaturated zone (UZ) flow fields in Total System Performance Assessment (TSPA), as outlined in the "Total System Performance Assessment-License Application Methods and Approach" (BSC 2002 [160146], Section 3.1) as a method for the treatment of uncertainty. This report is a scientific analysis because no new mathematical or physical models are developed herein, and it is based on the use of the models developed in or for "Simulation of Net Infiltration for Modern and Potential Future Climates" (USGS 2001 [160355]). Any use of the term model refers to those developed in the infiltration numerical model report. TSPA License Application (LA) has included three distinct climate regimes in the comprehensive repository performance analysis for Yucca Mountain: present-day, monsoon, and glacial transition. Each climate regime was characterized using three infiltration-rate maps, including a lower- and upper-bound and a mean value (equal to the average of the two boundary values). For each of these maps, which were obtained based on analog site climate data, a spatially averaged value was also calculated by the USGS. For a more detailed discussion of these infiltration-rate maps, see "Simulation of Net Infiltration for Modern and Potential Future Climates" (USGS 2001 [160355]). For this Scientific Analysis Report, spatially averaged values were calculated for the lower-bound, mean, and upper

  14. Climate Certainties and Uncertainties

    International Nuclear Information System (INIS)

    Morel, Pierre

    2012-01-01

    In issue 380 of Futuribles in December 2011, Antonin Pottier analysed in detail the workings of what is today termed 'climate scepticism' - namely the propensity of certain individuals to contest the reality of climate change on the basis of pseudo-scientific arguments. He emphasized particularly that what fuels the debate on climate change is, largely, the degree of uncertainty inherent in the consequences to be anticipated from observation of the facts, not the description of the facts itself. In his view, the main aim of climate sceptics is to block the political measures for combating climate change. However, since they do not admit to this political posture, they choose instead to deny the scientific reality. This month, Futuribles complements this socio-psychological analysis of climate-sceptical discourse with an - in this case, wholly scientific - analysis of what we know (or do not know) about climate change on our planet. Pierre Morel gives a detailed account of the state of our knowledge in the climate field and what we are able to predict in the medium/long-term. After reminding us of the influence of atmospheric meteorological processes on the climate, he specifies the extent of global warming observed since 1850 and the main origin of that warming, as revealed by the current state of knowledge: the increase in the concentration of greenhouse gases. He then describes the changes in meteorological regimes (showing also the limits of climate simulation models), the modifications of hydrological regimes, and also the prospects for rises in sea levels. He also specifies the mechanisms that may potentially amplify all these phenomena and the climate disasters that might ensue. Lastly, he shows what are the scientific data that cannot be disregarded, the consequences of which are now inescapable (melting of the ice-caps, rises in sea level etc.), the only remaining uncertainty in this connection being the date at which these things will happen. 'In this

  15. Cooling in the single-photon strong-coupling regime of cavity optomechanics

    Science.gov (United States)

    Nunnenkamp, A.; Børkje, K.; Girvin, S. M.

    2012-05-01

    In this Rapid Communication we discuss how red-sideband cooling is modified in the single-photon strong-coupling regime of cavity optomechanics where the radiation pressure of a single photon displaces the mechanical oscillator by more than its zero-point uncertainty. Using Fermi's golden rule we calculate the transition rates induced by the optical drive without linearizing the optomechanical interaction. In the resolved-sideband limit we find multiple-phonon cooling resonances for strong single-photon coupling that lead to nonthermal steady states including the possibility of phonon antibunching. Our study generalizes the standard linear cooling theory.
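
    For context (standard cavity-optomechanics notation, not quoted from the record), the non-linearized interaction in question is

      \[ H = \hbar\omega_c\, a^\dagger a + \hbar\omega_m\, b^\dagger b - \hbar g_0\, a^\dagger a\,(b + b^\dagger), \]

    in which a single intracavity photon displaces the oscillator equilibrium by 2 (g_0/\omega_m) x_{zpf}; the single-photon strong-coupling regime discussed above is the one in which this displacement exceeds the zero-point uncertainty x_{zpf}.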

  16. Uncertainty analysis for low-level radioactive waste disposal performance assessment at Oak Ridge National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Lee, D.W.; Yambert, M.W.; Kocher, D.C.

    1994-12-31

    A performance assessment of the operating Solid Waste Storage Area 6 (SWSA 6) facility for the disposal of low-level radioactive waste at the Oak Ridge National Laboratory has been prepared to provide the technical basis for demonstrating compliance with the performance objectives of DOE Order 5820.2A, Chapter III. An analysis of the uncertainty incorporated into the assessment was performed which addressed the quantitative uncertainty in the data used by the models, the subjective uncertainty associated with the models used for assessing performance of the disposal facility and site, and the uncertainty in the models used for estimating dose and human exposure. The results of the uncertainty analysis were used to interpret results and to formulate conclusions about the performance assessment. This paper discusses the approach taken in analyzing the uncertainty in the performance assessment and the role of uncertainty in performance assessment.

  17. Uncertainty vs. Information (Invited)

    Science.gov (United States)

    Nearing, Grey

    2017-04-01

    Information theory is the branch of logic that describes how rational epistemic states evolve in the presence of empirical data (Knuth, 2005), and any logic of science is incomplete without such a theory. Developing a formal philosophy of science that recognizes this fact results in essentially trivial solutions to several longstanding problems that are generally considered intractable, including: • Alleviating the need for any likelihood function or error model. • Derivation of purely logical falsification criteria for hypothesis testing. • Specification of a general quantitative method for process-level model diagnostics. More generally, I make the following arguments: 1. Model evaluation should not proceed by quantifying and/or reducing error or uncertainty, and instead should be approached as a problem of ensuring that our models contain as much information as our experimental data. I propose that the latter is the only question a scientist actually has the ability to ask. 2. Instead of building geophysical models as solutions to differential equations that represent conservation laws, we should build models as maximum entropy distributions constrained by conservation symmetries. This will allow us to derive predictive probabilities directly from first principles. Knuth, K. H. (2005) 'Lattice duality: The origin of probability and entropy', Neurocomputing, 67, pp. 245-274.

  18. Pandemic influenza: certain uncertainties.

    Science.gov (United States)

    Morens, David M; Taubenberger, Jeffery K

    2011-09-01

    For at least five centuries, major epidemics and pandemics of influenza have occurred unexpectedly and at irregular intervals. Despite the modern notion that pandemic influenza is a distinct phenomenon obeying such constant (if incompletely understood) rules such as dramatic genetic change, cyclicity, "wave" patterning, virus replacement, and predictable epidemic behavior, much evidence suggests the opposite. Although there is much that we know about pandemic influenza, there appears to be much more that we do not know. Pandemics arise as a result of various genetic mechanisms, have no predictable patterns of mortality among different age groups, and vary greatly in how and when they arise and recur. Some are followed by new pandemics, whereas others fade gradually or abruptly into long-term endemicity. Human influenza pandemics have been caused by viruses that evolved singly or in co-circulation with other pandemic virus descendants and often have involved significant transmission between, or establishment of, viral reservoirs within other animal hosts. In recent decades, pandemic influenza has continued to produce numerous unanticipated events that expose fundamental gaps in scientific knowledge. Influenza pandemics appear to be not a single phenomenon but a heterogeneous collection of viral evolutionary events whose similarities are overshadowed by important differences, the determinants of which remain poorly understood. These uncertainties make it difficult to predict influenza pandemics and, therefore, to adequately plan to prevent them. Published 2011. This article is a US Government work and is in the public domain in the USA.

  19. Pandemic influenza: certain uncertainties

    Science.gov (United States)

    Morens, David M.; Taubenberger, Jeffery K.

    2011-01-01

    SUMMARY For at least five centuries, major epidemics and pandemics of influenza have occurred unexpectedly and at irregular intervals. Despite the modern notion that pandemic influenza is a distinct phenomenon obeying such constant (if incompletely understood) rules such as dramatic genetic change, cyclicity, “wave” patterning, virus replacement, and predictable epidemic behavior, much evidence suggests the opposite. Although there is much that we know about pandemic influenza, there appears to be much more that we do not know. Pandemics arise as a result of various genetic mechanisms, have no predictable patterns of mortality among different age groups, and vary greatly in how and when they arise and recur. Some are followed by new pandemics, whereas others fade gradually or abruptly into long-term endemicity. Human influenza pandemics have been caused by viruses that evolved singly or in co-circulation with other pandemic virus descendants and often have involved significant transmission between, or establishment of, viral reservoirs within other animal hosts. In recent decades, pandemic influenza has continued to produce numerous unanticipated events that expose fundamental gaps in scientific knowledge. Influenza pandemics appear to be not a single phenomenon but a heterogeneous collection of viral evolutionary events whose similarities are overshadowed by important differences, the determinants of which remain poorly understood. These uncertainties make it difficult to predict influenza pandemics and, therefore, to adequately plan to prevent them. PMID:21706672

  20. Sustainability and uncertainty

    DEFF Research Database (Denmark)

    Jensen, Karsten Klint

    2007-01-01

    The widely used concept of sustainability is seldom precisely defined, and its clarification involves making up one's mind about a range of difficult questions. One line of research (bottom-up) takes sustaining a system over time as its starting point and then infers prescriptions from this requirement...... and infers prescriptions from this requirement. These two approaches may conflict, and in this conflict the top-down approach has the upper hand, ethically speaking. However, the implicit goal in the top-down approach of justice between generations needs to be refined in several dimensions. But even given...... a clarified ethical goal, disagreements can arise. At present we do not know what substitutions will be possible in the future. This uncertainty clearly affects the prescriptions that follow from the measure of sustainability. Consequently, decisions about how to make future agriculture sustainable...

  1. Huge opportunity for solar cooling

    International Nuclear Information System (INIS)

    Rowe, Daniel

    2014-01-01

    In Europe more than 400 solar cooling systems have been installed. By contrast, only a small number of solar cooling installations exist in Australia - primarily adsorption and absorption systems for commercial buildings and hospitals - although these systems are growing. As with other renewable energy technologies, cost is a challenge. However solar cooling is currently competitive with other technologies, with some suggesting that system costs have been decreasing by about 20% per annum in recent times. Australia is also leading efforts in the development of residential solar desiccant technology, currently commercialising Australian-developed technology. Commercial and industrial enterprises are increasingly aware of the impact of demand charges, the potential to install technology as a hedge against future energy price rises and opportunities associated with increased on-site generation and reduced reliance on the grid, often necessitating on-site demand reduction and management. They are also driven by environmental and corporate social responsibility objectives as well as the opportunity for energy independence and uninterruptible operation. Interestingly, many of these interests are mirrored at residential level, inspiring CSIRO's commercialisation of a domestic scale solar air conditioner with Australian manufacturer Brevis Climate Systems. Australia and other countries are increasingly aware of solar cooling as a technology which can reduce or replace grid-powered cooling, particularly in applications where large building thermal energy requirements exist. In these applications, heating, cooling and hot water are generated and used in large amounts and the relative amounts of each can be varied dynamically, depending on building requirements. Recent demonstrations of solar cooling technology in Australia include Hunter TAFE's Solar Desiccant Cooling System - which provides heating, cooling and hot water to commercial training kitchens and classrooms - GPT

  2. Plasma impurities and cooling

    International Nuclear Information System (INIS)

    Drawin, H.W.

    1976-11-01

    In high-temperature low-density plasmas radiation cooling by impurity atoms can be an important energy loss mechanism, since the radiation is not reabsorbed. In a brief historical survey it is shown that the problem is not new but has been discussed since the very beginning of controlled thermonuclear fusion research. It is then shown how radiation losses enter into the general power balance equation of a plasma containing impurities. The equations for the different types of radiation losses are given as a function of the atomic quantities. In a special section, simplifications due to the corona model assumption are discussed. There follows a detailed survey of the results obtained by several authors for the ionization balance and power losses of impurity elements observed in present high-temperature plasma machines used in CTR, especially in tokamaks. In the conclusion a survey is given of the atomic data which experimentalists and theorists need for current research on impurities in fusion-like plasmas. (86 references)

  3. Local cooling and warming effects of forests based on satellite observations

    Science.gov (United States)

    Li, Yan; Zhao, Maosheng; Motesharrei, Safa; Mu, Qiaozhen; Kalnay, Eugenia; Li, Shuangcheng

    2015-01-01

    The biophysical effects of forests on climate have been extensively studied with climate models. However, models cannot accurately reproduce local climate effects due to their coarse spatial resolution and uncertainties, and field observations are valuable but often insufficient due to their limited coverage. Here we present new evidence acquired from global satellite data to analyse the biophysical effects of forests on local climate. Results show that tropical forests have a strong cooling effect throughout the year; temperate forests show moderate cooling in summer and moderate warming in winter with net cooling annually; and boreal forests have strong warming in winter and moderate cooling in summer with net warming annually. The spatiotemporal cooling or warming effects are mainly driven by the two competing biophysical effects, evapotranspiration and albedo, which in turn are strongly influenced by rainfall and snow. Implications of our satellite-based study could be useful for informing local forestry policies. PMID:25824529

  4. On synthesis and optimization of cooling water systems with multiple cooling towers

    CSIR Research Space (South Africa)

    Gololo, KV

    2011-01-01

    Full Text Available Cooling water systems are generally designed with a set of heat exchangers arranged in parallel. This arrangement results in higher cooling water flow rate and low cooling water return temperature, thus reducing cooling tower efficiency. Previous...

  5. Multi-data reservoir history matching and uncertainty quantification framework

    KAUST Repository

    Katterbauer, Klemens

    2015-11-26

    A multi-data reservoir history matching and uncertainty quantification framework is provided. The framework can utilize multiple data sets such as production, seismic, electromagnetic, gravimetric and surface deformation data for improving the history matching process. The framework can consist of a geological model that is interfaced with a reservoir simulator. The reservoir simulator can interface with seismic, electromagnetic, gravimetric and surface deformation modules to predict the corresponding observations. The observations can then be incorporated into a recursive filter that subsequently updates the distributions of the model state and parameters, providing a general framework to quantify, and eventually reduce with the data, the uncertainty in the estimated reservoir state and parameters.
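
    The record does not name a specific recursive filter; an ensemble Kalman-type analysis step is one common choice for jointly updating state and parameter distributions against stacked data of several types. The sketch below is a minimal, hypothetical illustration in NumPy; the array names, shapes and the observation operator are assumptions, not details taken from this record.

        import numpy as np

        def enkf_update(ensemble, observations, obs_operator, obs_error_cov, rng):
            """One ensemble Kalman analysis step over a joint state+parameter ensemble.

            ensemble      : (n_members, n_state) array, each row a state+parameter vector
            observations  : (n_obs,) stacked data vector (e.g. production, seismic, gravimetric)
            obs_operator  : callable mapping one state vector to its predicted observations
            obs_error_cov : (n_obs, n_obs) observation error covariance R
            """
            n_members = ensemble.shape[0]
            predicted = np.array([obs_operator(m) for m in ensemble])   # (n_members, n_obs)

            # Ensemble anomalies (deviations from the ensemble means)
            A = ensemble - ensemble.mean(axis=0)
            Y = predicted - predicted.mean(axis=0)

            # Kalman gain K = cov(x, y) [cov(y, y) + R]^-1, estimated from the ensemble
            cov_xy = A.T @ Y / (n_members - 1)
            cov_yy = Y.T @ Y / (n_members - 1) + obs_error_cov
            K = cov_xy @ np.linalg.inv(cov_yy)

            # Perturb the data so the analysis ensemble keeps a consistent spread
            perturbed = observations + rng.multivariate_normal(
                np.zeros(observations.size), obs_error_cov, size=n_members)

            # Shift every member toward the data; parameters are updated along with the state
            return ensemble + (perturbed - predicted) @ K.T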

  6. A commentary on model uncertainty

    International Nuclear Information System (INIS)

    Apostolakis, G.

    1994-01-01

    A framework is proposed for the identification of model and parameter uncertainties in risk assessment models. Two cases are distinguished; in the first case, a set of mutually exclusive and exhaustive hypotheses (models) can be formulated, while, in the second, only one reference model is available. The relevance of this formulation to decision making and the communication of uncertainties is discussed

  7. Uncertainty Analysis Principles and Methods

    Science.gov (United States)

    2007-09-01

    total systematic uncertainties be combined in RSS. In many instances, the Student's t-statistic, t95, is set equal to 2 and URSS is replaced by U95. ... GUM, the total uncertainty (UADD, URSS or U95) was offered as a type of confidence limit, x - U95 ≤ true value ≤ x + U95. In some respects, these limits
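
    For readability, the RSS combination and the two-sided limit referred to in this excerpt can be written in generic GUM-style notation (a hedged restatement of the snippet, not a quotation of the report):

        U_{\mathrm{RSS}} = \Big( \sum_{i=1}^{N} u_i^{2} \Big)^{1/2},
        \qquad U_{95} \approx t_{95}\, U_{\mathrm{RSS}} \quad (t_{95} \approx 2),
        \qquad \bar{x} - U_{95} \;\le\; \text{true value} \;\le\; \bar{x} + U_{95}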

  8. Hydrology, society, change and uncertainty

    Science.gov (United States)

    Koutsoyiannis, Demetris

    2014-05-01

    Heraclitus, who predicated that "panta rhei", also proclaimed that "time is a child playing, throwing dice". Indeed, change and uncertainty are tightly connected. The type of change that can be predicted with accuracy is usually trivial. Also, decision making under certainty is mostly trivial. The current acceleration of change, due to unprecedented human achievements in technology, inevitably results in increased uncertainty. In turn, the increased uncertainty makes society apprehensive about the future, insecure and credulous to a developing future-telling industry. Several scientific disciplines, including hydrology, tend to become part of this industry. The social demand for certainties, no matter if these are delusional, is combined with a misconception in the scientific community that confuses science with uncertainty elimination. However, recognizing that uncertainty is inevitable and tightly connected with change will help to appreciate the positive sides of both. Hence, uncertainty becomes an important object to study, understand and model. Decision making under uncertainty, developing adaptability and resilience for an uncertain future, and using technology and engineering means for planned change to control the environment are important and feasible tasks, all of which will benefit from advancements in the Hydrology of Uncertainty.

  9. Uncertainty and climate change policy

    OpenAIRE

    Quiggin, John

    2008-01-01

    The paper consists of a summary of the main sources of uncertainty about climate change, and a discussion of the major implications for economic analysis and the formulation of climate policy. Uncertainty typically implies that the optimal policy is more risk-averse than otherwise, and therefore enhances the case for action to mitigate climate change.

  10. Relational uncertainty in service dyads

    DEFF Research Database (Denmark)

    Kreye, Melanie

    2017-01-01

    in service dyads and how they resolve it through suitable organisational responses to increase the level of service quality. Design/methodology/approach: We apply the overall logic of Organisational Information-Processing Theory (OIPT) and present empirical insights from two industrial case studies collected...... via semi-structured interviews and secondary data. Findings: The findings suggest that relational uncertainty is caused by the partner’s unresolved organisational uncertainty, i.e. their lacking capabilities to deliver or receive (parts of) the service. Furthermore, we found that resolving...... the relational uncertainty increased the functional quality while resolving the partner’s organisational uncertainty increased the technical quality of the delivered service. Originality: We make two contributions. First, we introduce relational uncertainty to the OM literature as the inability to predict...

  11. Hybrid cooling tower Neckarwestheim 2 cooling function, emission, plume dispersion

    International Nuclear Information System (INIS)

    Braeuning, G.; Ernst, G.; Maeule, R.; Necker, P.

    1990-01-01

    The fan-assisted hybrid cooling tower of the 1300 MW power plant Gemeinschafts-Kernkraftwerk Neckarwestheim 2 was designed and constructed based on results from theoretical and experimental studies and experiences from a smaller prototype. The wet part acts in counterflow. The dry part is arranged above the wet part. Each part contains 44 fans. Special attention was paid to the ducts which mix the dry into the wet plume. The cooling function and the state, mass flow and contents of the emission were measured. The dispersion of the plume in the atmosphere was observed. The central results are presented in this paper. The cooling function corresponds to the predictions. The content of drifted cooling water in the plume is extremely low. The high velocity of the plume at the exit causes an undisturbed flow into the atmosphere. The hybrid operation reduces visible plumes strongly, especially in warmer and drier ambient air

  12. Laser cooling, evaporative cooling and Bose-Einstein condensation

    International Nuclear Information System (INIS)

    Ghosh, Pradip N.

    2002-01-01

    Laser radiation is used to slow down atoms by the process of momentum transfer. This leads to reducing the temperature to the microkelvin region. Gas phase atoms are trapped by using magnetic fields. The recent advances have led to the realization of the dream of physicists of confining the atoms and reducing their velocities to the limit imposed by quantum mechanics. A number of new experiments are possible with the cooled and trapped atoms and ions that would be useful to solve many problems of theoretical physics. Further cooling by the evaporative technique has led to the observation of Bose-Einstein Condensation, predicted by Einstein and Bose nearly seventy-five years ago. A brief review is given of the laser cooling, magnetic trapping and evaporative cooling methods used for obtaining ultracold atoms. It is possible to obtain temperatures in the nanokelvin region without using cryogenic methods, thus simplifying the experimental methods to a great extent. (author)

  13. Hydrologic Scenario Uncertainty in a Comprehensive Assessment of Hydrogeologic Uncertainty

    Science.gov (United States)

    Nicholson, T. J.; Meyer, P. D.; Ye, M.; Neuman, S. P.

    2005-12-01

    A method to jointly assess hydrogeologic conceptual model and parameter uncertainties has recently been developed based on a Maximum Likelihood implementation of Bayesian Model Averaging (MLBMA). Evidence from groundwater model post-audits suggests that errors in the projected future hydrologic conditions of a site (hydrologic scenarios) are a significant source of model predictive errors. MLBMA can be extended to include hydrologic scenario uncertainty, along with conceptual model and parameter uncertainties, in a systematic and quantitative assessment of predictive uncertainty. Like conceptual model uncertainty, scenario uncertainty is represented by a discrete set of alternative scenarios. The effect of scenario uncertainty on model predictions is quantitatively assessed by conducting an MLBMA analysis under each scenario. We demonstrate that posterior model probability is a function of the scenario only through the possible dependence of prior model probabilities on the scenario. As a result, the model likelihoods (computed from calibration results) are not a function of the scenario and do not need to be recomputed under each scenario. MLBMA results for each scenario are weighted by the scenario probability and combined to render a joint assessment of scenario, conceptual model, and parameter uncertainty. Like model probability, scenario probability represents a subjective evaluation, in this case of the plausibility of the occurrence of the specific scenario. Because the scenarios describe future conditions, the scenario probabilities represent prior estimates and cannot be updated using the (past) system state data as is done to compute posterior model probabilities. Assessment of hydrologic scenario uncertainty is illustrated using a site-specific application considering future changes in land use, dam operations, and climate. Estimation of scenario probabilities and consideration of scenario characteristics (e.g., timing, magnitude) are discussed.
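
    One way to write the scenario-weighted combination described here (a hedged restatement of the abstract, with Delta a predicted quantity of interest, D the calibration data, M_j the alternative conceptual models and S_k the scenarios):

        p(\Delta \mid D) \;=\; \sum_{k} P(S_k) \sum_{j} p(\Delta \mid M_j, S_k, D)\, P(M_j \mid D, S_k),
        \qquad
        P(M_j \mid D, S_k) \;\propto\; L(M_j \mid D)\, P(M_j \mid S_k)

    Written this way, the likelihoods L(M_j | D) from calibration are computed once, and only the prior model probabilities P(M_j | S_k) can carry a scenario dependence, which is the point the abstract makes.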

  14. Efficacy of Liquid, Air, and Phase Change Material Torso Cooling During Light Exercise While Wearing NBC Clothing

    National Research Council Canada - National Science Library

    McLellan, Tom

    1998-01-01

    .... The PCM cooling vests, which were supplied by Microclimate Systems Incorporated, were worn under the NBC overgarment and were tested with a vertical (CVV) and horizontal (CVH) design. Seven males (29 yrs, 75.6 kg, 1.78 m...

  15. Effect of cooling water on stability of NLC linac components

    Energy Technology Data Exchange (ETDEWEB)

    F. Le Pimpec et al.

    2003-02-11

    Vertical vibration of linac components (accelerating structures, girders and quadrupoles) in the NLC has been studied experimentally and analytically. Effects such as structural resonances and vibration caused by cooling water both in accelerating structures and quadrupoles have been considered. Experimental data has been compared with analytical predictions and simulations using ANSYS. A design, incorporating the proper decoupling of structure vibrations from the linac quadrupoles, is being pursued.

  16. Effect of Cooling Water on Stability of NLC Linac Components

    Energy Technology Data Exchange (ETDEWEB)

    Le Pimpec, Frederic

    2002-11-01

    Vertical vibration of linac components (accelerating structures, girders and quadrupoles) in the NLC has been studied experimentally and analytically. Effects such as structural resonances and vibration caused by cooling water both in accelerating structures and quadrupoles have been considered. Experimental data has been compared with analytical predictions and simulations using ANSYS. A design, incorporating the proper decoupling of structure vibrations from the linac quadrupoles, is being pursued.

  17. Sympathetic Cooling of Quantum Simulators

    Science.gov (United States)

    Raghunandan, Meghana; Weimer, Hendrik

    2017-04-01

    We discuss the possibility of maximizing the cooling of a quantum simulator by controlling the system-environment coupling such that the system is driven into the ground state. We make use of various analytical tools such as effective operator formalism and the quantum master equations to exactly solve the model of an Ising spin chain consisting of N particles coupled to a radiation field. We maximize the cooling by finding the dependence of the effective rate of transitions of the various excited states into the ground state. We show that by adding a single dissipative qubit, we already get quite substantial cooling rates. Volkswagen Foundation, DFG.

  18. Cooling towers principles and practice

    CERN Document Server

    Hill, G B; Osborn, Peter D

    1990-01-01

    Cooling Towers: Principles and Practice, Third Edition, aims to provide the reader with a better understanding of the theory and practice, so that installations are correctly designed and operated. As with all branches of engineering, new technology calls for a level of technical knowledge which becomes progressively higher; this new edition seeks to ensure that the principles and practice of cooling towers are set against a background of up-to-date technology. The book is organized into three sections. Section A on cooling tower practice covers topics such as the design and operation of c

  19. Evaluation Of Cooling Tower Degradation

    International Nuclear Information System (INIS)

    Djunaidi

    2001-01-01

    Cooling tower degradation has been evaluated for the last 10 years. Its heat transfer capacity has been decreasing after several years of operation due to aging. Evaluation is carried out by calculating the degradation rate, namely the annual increase of the outlet temperatures of the cooling tower. Data was randomly taken daily at 15 MW reactor power. Data was taken after ± 8 hours of reactor operation. Evaluation since 1990 shows that the degradation rate is nearly one degree per year. This degradation can be minimized by replacement of damaged components, non-excessive operation and design modification of the cooling tower, namely by extending the period of contact between water and air

  20. CLIC inner detectors cooling simulations

    CERN Document Server

    Duarte Ramos, F.; Villarejo Bermudez, M.

    2014-01-01

    The strict requirements in terms of material budget for the inner region of the CLIC detector concepts require the use of a dry gas for the cooling of the respective sensors. This, in conjunction with the compactness of the inner volumes, poses several challenges for the design of a cooling system that is able to fulfil the required detector specifications. This note introduces a detector cooling strategy using dry air as a coolant and shows the results of computational fluid dynamics simulations used to validate the proposed strategy.

  1. Mold heating and cooling microprocessor conversion

    Science.gov (United States)

    Hoffman, D. P.

    1995-07-01

    Conversion of the microprocessors and software for the Mold Heating and Cooling (MHAC) pump package control systems was initiated to allow required system enhancements and provide data communications capabilities with the Plastics Information and Control System (PICS). The existing microprocessor-based control systems for the pump packages use an Intel 8088-based microprocessor board with a maximum of 64 Kbytes of program memory. The requirements for the system conversion were developed, and hardware has been selected to allow maximum reuse of existing hardware and software while providing the required additional capabilities and capacity. The new hardware will incorporate an Intel 80286-based microprocessor board with an 80287 math coprocessor; the system also includes additional memory, I/O, and RS232 communication ports.

  2. Entropic uncertainty relation based on generalized uncertainty principle

    Science.gov (United States)

    Hsu, Li-Yi; Kawamoto, Shoichi; Wen, Wen-Yu

    2017-09-01

    We explore the modification of the entropic formulation of uncertainty principle in quantum mechanics which measures the incompatibility of measurements in terms of Shannon entropy. The deformation in question is the type so-called generalized uncertainty principle that is motivated by thought experiments in quantum gravity and string theory and is characterized by a parameter of Planck scale. The corrections are evaluated for small deformation parameters by use of the Gaussian wave function and numerical calculation. As the generalized uncertainty principle has proven to be useful in the study of the quantum nature of black holes, this study would be a step toward introducing an information theory viewpoint to black hole physics.

  3. Evaluation of existing cooling systems for reducing cooling power consumption

    Energy Technology Data Exchange (ETDEWEB)

    Hatamipour, M.S. [Chemical Engineering Department, Isfahan University, Isfahan (Iran, Islamic Republic of); Mahiyar, H.; Taheri, M. [Chemical Engineering Department, Shiraz University, Shiraz (Iran, Islamic Republic of)

    2007-07-01

    This work was designed to estimate the cooling load power consumption during the summer in the hot and humid areas of Iran. The actual electrical energy consumption for cooling systems of some typical buildings with various applications (3 residential home buildings, 2 industrial plant buildings, a trade center with 38 shops, 3 public sector buildings and a city hospital) in a hot and humid region in the south of Iran was recorded during the peak load period of the year (July-August). The records were used for estimating the total power consumption of the cooling systems in this region. According to this estimation, which was confirmed by the regional electrical power distribution office, the cooling systems power consumption in this region accounted for more than 60% of the total power consumption during the peak load period of the year. A computer program was developed for simulating the effect of various parameters on the cooling load of buildings in hot and humid regions. According to the simulation results, use of double glazed windows, light colored walls and roofs, and insulated walls and roofs can reduce the cooling load of the buildings by more than 40%. (author)
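
    The record's simulation program is not available here; the minimal steady-state sketch below only illustrates the mechanism by which the listed measures (double glazing, light-coloured walls and roofs, insulation) reduce sensible cooling load. All areas, temperatures, solar fluxes and U-values are illustrative assumptions.

        # Minimal steady-state sensible cooling-load sketch (illustrative assumptions only).

        def cooling_load_w(u_wall, u_window, wall_absorptance,
                           a_wall=200.0, a_window=30.0,          # envelope areas, m^2 (assumed)
                           t_out=40.0, t_in=25.0,                # outdoor/indoor air temperatures, degC
                           solar_wall=500.0, solar_window=600.0, # incident solar flux, W/m^2 (assumed)
                           shgc=0.7, h_out=17.0):
            """Very rough sensible load: envelope conduction plus window solar gain."""
            # Sol-air temperature raises the effective outdoor temperature seen by opaque walls
            t_solair = t_out + wall_absorptance * solar_wall / h_out
            q_wall = u_wall * a_wall * (t_solair - t_in)          # conduction through walls/roof
            q_window_cond = u_window * a_window * (t_out - t_in)  # conduction through glazing
            q_window_solar = shgc * a_window * solar_window       # solar gain through glazing
            return q_wall + q_window_cond + q_window_solar

        base     = cooling_load_w(u_wall=2.0, u_window=5.8, wall_absorptance=0.8)  # dark, uninsulated, single glazed
        improved = cooling_load_w(u_wall=0.5, u_window=2.8, wall_absorptance=0.3)  # insulated, light, double glazed
        print(f"load reduction: {100 * (base - improved) / base:.0f}%")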

  4. Passive low energy cooling of buildings

    CERN Document Server

    Givoni, Baruch

    1994-01-01

    A practical sourcebook for building designers, providing comprehensive discussion of the impact of basic architectural choices on cooling efficiency, including the layout and orientation of the structure, window size and shading, exterior color, and even the use of plantings around the site. All major varieties of passive cooling systems are presented, with extensive analysis of performance in different types of buildings and in different climates: ventilation; radiant cooling; evaporative cooling; soil cooling; and cooling of outdoor spaces.

  5. Uncertainty Analysis and Expert Judgment in Seismic Hazard Analysis

    Science.gov (United States)

    Klügel, Jens-Uwe

    2011-01-01

    The large uncertainty associated with the prediction of future earthquakes is usually regarded as the main reason for increased hazard estimates which have resulted from some recent large scale probabilistic seismic hazard analysis studies (e.g. the PEGASOS study in Switzerland and the Yucca Mountain study in the USA). It is frequently overlooked that such increased hazard estimates are characteristic of a single specific method of probabilistic seismic hazard analysis (PSHA): the traditional (Cornell-McGuire) PSHA method which has found its highest level of sophistication in the SSHAC probability method. Based on a review of the SSHAC probability model and its application in the PEGASOS project, it is shown that the surprising results of recent PSHA studies can be explained to a large extent by the uncertainty model used in traditional PSHA, which deviates from the state of the art in mathematics and risk analysis. This uncertainty model, the Ang-Tang uncertainty model, mixes concepts of decision theory with probabilistic hazard assessment methods, leading to an overestimation of uncertainty in comparison to empirical evidence. Although expert knowledge can be a valuable source of scientific information, its incorporation into the SSHAC probability method does not resolve the issue of inflating uncertainties in PSHA results. Other, more data driven, PSHA approaches in use in some European countries are less vulnerable to this effect. The most valuable alternative to traditional PSHA is the direct probabilistic scenario-based approach, which is closely linked with emerging neo-deterministic methods based on waveform modelling.

  6. Electromechanically cooled germanium radiation detector system

    International Nuclear Information System (INIS)

    Lavietes, Anthony D.; Joseph Mauger, G.; Anderson, Eric H.

    1999-01-01

    We have successfully developed and fielded an electromechanically cooled germanium radiation detector (EMC-HPGe) at Lawrence Livermore National Laboratory (LLNL). This detector system was designed to provide optimum energy resolution, long lifetime, and extremely reliable operation for unattended and portable applications. For most analytical applications, high purity germanium (HPGe) detectors are the standard detectors of choice, providing an unsurpassed combination of high energy resolution performance and exceptional detection efficiency. Logistical difficulties associated with providing the required liquid nitrogen (LN) for cooling are the primary reason that these systems are found mainly in laboratories. The EMC-HPGe detector system described in this paper successfully provides HPGe detector performance in a portable instrument that allows for isotopic analysis in the field. It incorporates a unique active vibration control system that allows the use of a Sunpower Stirling cycle cryocooler unit without significant spectral degradation from microphonics. All standard isotopic analysis codes, including MGA and MGA++, GAMANL, GRPANL and MGAU, typically used with HPGe detectors can be used with this system with excellent results. Several national and international Safeguards organisations including the International Atomic Energy Agency (IAEA) and U.S. Department of Energy (DOE) have expressed interest in this system. The detector was combined with custom software and demonstrated as a rapid Field Radiometric Identification System (FRIS) for the U.S. Customs Service. The European Communities' Safeguards Directorate (EURATOM) is field-testing the first Safeguards prototype in their applications. The EMC-HPGe detector system design, recent applications, and results will be highlighted

  7. Compressor bleed cooling fluid feed system

    Science.gov (United States)

    Donahoo, Eric E; Ross, Christopher W

    2014-11-25

    A compressor bleed cooling fluid feed system for a turbine engine for directing cooling fluids from a compressor to a turbine airfoil cooling system to supply cooling fluids to one or more airfoils of a rotor assembly is disclosed. The compressor bleed cooling fluid feed system may enable cooling fluids to be exhausted from a compressor exhaust plenum through a downstream compressor bleed collection chamber and into the turbine airfoil cooling system. As such, the suction created in the compressor exhaust plenum mitigates boundary layer growth along the inner surface while providing flow of cooling fluids to the turbine airfoils.

  8. Simulations of High-Energy Electron Cooling

    CERN Document Server

    Fedotov, Alexei V; Bruhwiler, David L; Eidelman, Yury I; Litvinenko, Vladimir N; Malitsky, Nikolay; Meshkov, Igor; Sidorin, Anatoly O; Smirnov, Alexander V; Troubnikov, Grigory

    2005-01-01

    High-energy electron cooling of RHIC presents many unique features and challenges. An accurate estimate of the cooling times requires a detailed calculation of the cooling process, which takes place simultaneously with various diffusive mechanisms in RHIC. In addition, many unexplored effects of high-energy cooling in a collider complicate the task of getting very accurate estimates of cooling times. To address these high-energy cooling issues, a detailed study of cooling dynamics based on computer codes is underway at Brookhaven National Laboratory. In this paper, we present an update on code development and its application to the high-energy cooling dynamics studies for RHIC.

  9. SIMULATIONS OF HIGH-ENERGY ELECTRON COOLING.

    Energy Technology Data Exchange (ETDEWEB)

    FEDOTOV,A.V.; BEN-ZVI,I.; EIDELMAN, YU.; LITVINENKO, V.; MALITSKY, N.

    2005-05-16

    High-energy electron cooling of RHIC presents many unique features and challenges. An accurate estimate of the cooling times requires a detailed calculation of the cooling process, which takes place simultaneously with various diffusive mechanisms in RHIC. In addition, many unexplored effects of high-energy cooling in a collider complicate the task of getting very accurate estimates of cooling times. To address these high-energy cooling issues, a detailed study of cooling dynamics based on computer codes is underway at Brookhaven National Laboratory. In this paper, we present an update on code development and its application to the high-energy cooling dynamics studies for RHIC.

  10. SIMULATIONS OF HIGH-ENERGY ELECTRON COOLING

    International Nuclear Information System (INIS)

    FEDOTOV, A.V.; BEN-ZVI, I.; EIDELMAN, YU.; LITVINENKO, V.; MALITSKY, N.

    2005-01-01

    High-energy electron cooling of RHIC presents many unique features and challenges. An accurate estimate of the cooling times requires a detailed calculation of the cooling process, which takes place simultaneously with various diffusive mechanisms in RHIC. In addition, many unexplored effects of high-energy cooling in a collider complicate the task of getting very accurate estimates of cooling times. To address these high-energy cooling issues, a detailed study of cooling dynamics based on computer codes is underway at Brookhaven National Laboratory. In this paper, we present an update on code development and its application to the high-energy cooling dynamics studies for RHIC

  11. Measurement uncertainty: Friend or foe?

    Science.gov (United States)

    Infusino, Ilenia; Panteghini, Mauro

    2018-02-02

    The definition and enforcement of a reference measurement system, based on the implementation of metrological traceability of patients' results to higher order reference methods and materials, together with a clinically acceptable level of measurement uncertainty, are fundamental requirements to produce accurate and equivalent laboratory results. The uncertainty associated with each step of the traceability chain should be governed to obtain a final combined uncertainty on clinical samples fulfilling the requested performance specifications. It is important that end-users (i.e., clinical laboratory) may know and verify how in vitro diagnostics (IVD) manufacturers have implemented the traceability of their calibrators and estimated the corresponding uncertainty. However, full information about traceability and combined uncertainty of calibrators is currently very difficult to obtain. Laboratory professionals should investigate the need to reduce the uncertainty of the higher order metrological references and/or to increase the precision of commercial measuring systems. Accordingly, the measurement uncertainty should not be considered a parameter to be calculated by clinical laboratories just to fulfil the accreditation standards, but it must become a key quality indicator to describe both the performance of an IVD measuring system and the laboratory itself. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  12. Model uncertainty in safety assessment

    International Nuclear Information System (INIS)

    Pulkkinen, U.; Huovinen, T.

    1996-01-01

    The uncertainty analyses are an essential part of any risk assessment. Usually the uncertainties of reliability model parameter values are described by probability distributions and the uncertainty is propagated through the whole risk model. In addition to the parameter uncertainties, the assumptions behind the risk models may be based on insufficient experimental observations and the models themselves may not be exact descriptions of the phenomena under analysis. The description and quantification of this type of uncertainty, model uncertainty, is the topic of this report. The model uncertainty is characterized and some approaches to model and quantify it are discussed. The emphasis is on so-called mixture models, which have been applied in PSAs. Some of the possible disadvantages of the mixture model are addressed. In addition to the quantitative analyses, qualitative analysis is also discussed briefly. To illustrate the models, two simple case studies on failure intensity and human error modeling are described. In both examples, the analysis is based on simple mixture models, which are observed to apply in PSA analyses. (orig.) (36 refs., 6 figs., 2 tabs.)

  13. NASA Marshall Space Flight Center Improves Cooling System Performance: Best Management Practice Case Study #10: Cooling Towers (Fact Sheet)

    Energy Technology Data Exchange (ETDEWEB)

    2011-02-01

    National Aeronautics and Space Administration's (NASA) Marshall Space Flight Center (MSFC) has a longstanding sustainability program that revolves around energy and water efficiency as well as environmental protection. MSFC identified a problematic cooling loop with six separate compressor heat exchangers and a history of poor efficiency. The facility engineering team at MSFC partnered with Flozone Services, Incorporated to implement a comprehensive water treatment platform to improve the overall efficiency of the system.

  14. Geothermal heat can cool, too

    International Nuclear Information System (INIS)

    Wellstein, J.

    2008-01-01

    This article takes a look at how geothermal energy can not only be used to supply heating energy, but also be used to provide cooling too. The article reports on a conference on heating and cooling with geothermal energy that was held in Duebendorf, Switzerland, in March 2008. The influence of climate change on needs for heating and cooling and the need for additional knowledge and data on deeper rock layers is noted. The seasonal use of geothermal systems to provide heating in winter and cooling in summer is discussed. The planning of geothermal probe fields and their simulation is addressed. As an example, the geothermal installations under the recently renewed and extended 'Dolder Grand' luxury hotel in Zurich are quoted. The new SIA 384/6 norm on geothermal probes issued by the Swiss Association of Architects SIA is briefly reviewed.

  15. Cooling methods for power plants

    International Nuclear Information System (INIS)

    Gaspersic, B.; Fabjan, L.; Petelin, S.

    1977-01-01

    Results are presented of measurements carried out on the 275 MWe wet cooling tower at TE Sostanj and on the experimental cooling tower at the Jozef Stefan Institute. They include measurements of the outlet air conditions, measurements of the cross current of the water film and the vapour-air mixture flowing between two plates, and the velocity distribution in the boundary layer measured by anemometer

  16. Insurance Applications of Active Fault Maps Showing Epistemic Uncertainty

    Science.gov (United States)

    Woo, G.

    2005-12-01

    Insurance loss modeling for earthquakes utilizes available maps of active faulting produced by geoscientists. All such maps are subject to uncertainty, arising from lack of knowledge of fault geometry and rupture history. Field work to undertake geological fault investigations drains human and monetary resources, and this inevitably limits the resolution of fault parameters. Some areas are more accessible than others; some may be of greater social or economic importance than others; some areas may be investigated more rapidly or diligently than others; or funding restrictions may have curtailed the extent of the fault mapping program. In contrast with the aleatory uncertainty associated with the inherent variability in the dynamics of earthquake fault rupture, uncertainty associated with lack of knowledge of fault geometry and rupture history is epistemic. The extent of this epistemic uncertainty may vary substantially from one regional or national fault map to another. However aware the local cartographer may be, this uncertainty is generally not conveyed in detail to the international map user. For example, an area may be left blank for a variety of reasons, ranging from lack of sufficient investigation of a fault to lack of convincing evidence of activity. Epistemic uncertainty in fault parameters is of concern in any probabilistic assessment of seismic hazard, not least in insurance earthquake risk applications. A logic-tree framework is appropriate for incorporating epistemic uncertainty. Some insurance contracts cover specific high-value properties or transport infrastructure, and therefore are extremely sensitive to the geometry of active faulting. Alternative Risk Transfer (ART) to the capital markets may also be considered. In order for such insurance or ART contracts to be properly priced, uncertainty should be taken into account. Accordingly, an estimate is needed for the likelihood of surface rupture capable of causing severe damage. Especially where a

  17. Model uncertainty: Probabilities for models?

    International Nuclear Information System (INIS)

    Winkler, R.L.

    1994-01-01

    Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly-used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work that is underway to address these questions looks very promising

  18. Mapping Soil Transmitted Helminths and Schistosomiasis under Uncertainty: A Systematic Review and Critical Appraisal of Evidence.

    Science.gov (United States)

    Araujo Navas, Andrea L; Hamm, Nicholas A S; Soares Magalhães, Ricardo J; Stein, Alfred

    2016-12-01

    Spatial modelling of STH and schistosomiasis epidemiology is now commonplace. Spatial epidemiological studies help inform decisions regarding the number of people at risk as well as the geographic areas that need to be targeted with mass drug administration; however, limited attention has been given to propagated uncertainties, their interpretation, and consequences for the mapped values. Using currently published literature on the spatial epidemiology of helminth infections we identified: (1) the main uncertainty sources, their definition and quantification and (2) how uncertainty is informative for STH programme managers and scientists working in this domain. We performed a systematic literature search using the Preferred Reporting Items for Systematic reviews and Meta-Analysis (PRISMA) protocol. We searched Web of Knowledge and PubMed using a combination of uncertainty, geographic and disease terms. A total of 73 papers fulfilled the inclusion criteria for the systematic review. Only 9% of the studies did not address any element of uncertainty, while 91% of studies quantified uncertainty in the predicted morbidity indicators and 23% of studies mapped it. In addition, 57% of the studies quantified uncertainty in the regression coefficients but only 7% incorporated it in the regression response variable (morbidity indicator). Fifty percent of the studies discussed uncertainty in the covariates but did not quantify it. Uncertainty was mostly defined as precision, and quantified using credible intervals by means of Bayesian approaches. None of the studies considered adequately all sources of uncertainties. We highlighted the need for uncertainty in the morbidity indicator and predictor variable to be incorporated into the modelling framework. Study design and spatial support require further attention and uncertainty associated with Earth observation data should be quantified. Finally, more attention should be given to mapping and interpreting uncertainty, since they

  19. Mapping Soil Transmitted Helminths and Schistosomiasis under Uncertainty: A Systematic Review and Critical Appraisal of Evidence.

    Directory of Open Access Journals (Sweden)

    Andrea L Araujo Navas

    2016-12-01

    Full Text Available Spatial modelling of STH and schistosomiasis epidemiology is now commonplace. Spatial epidemiological studies help inform decisions regarding the number of people at risk as well as the geographic areas that need to be targeted with mass drug administration; however, limited attention has been given to propagated uncertainties, their interpretation, and consequences for the mapped values. Using currently published literature on the spatial epidemiology of helminth infections we identified: (1) the main uncertainty sources, their definition and quantification and (2) how uncertainty is informative for STH programme managers and scientists working in this domain. We performed a systematic literature search using the Preferred Reporting Items for Systematic reviews and Meta-Analysis (PRISMA) protocol. We searched Web of Knowledge and PubMed using a combination of uncertainty, geographic and disease terms. A total of 73 papers fulfilled the inclusion criteria for the systematic review. Only 9% of the studies did not address any element of uncertainty, while 91% of studies quantified uncertainty in the predicted morbidity indicators and 23% of studies mapped it. In addition, 57% of the studies quantified uncertainty in the regression coefficients but only 7% incorporated it in the regression response variable (morbidity indicator). Fifty percent of the studies discussed uncertainty in the covariates but did not quantify it. Uncertainty was mostly defined as precision, and quantified using credible intervals by means of Bayesian approaches. None of the studies considered adequately all sources of uncertainties. We highlighted the need for uncertainty in the morbidity indicator and predictor variable to be incorporated into the modelling framework. Study design and spatial support require further attention and uncertainty associated with Earth observation data should be quantified. Finally, more attention should be given to mapping and interpreting

  20. Decision making uncertainty, imperfection, deliberation and scalability

    CERN Document Server

    Kárný, Miroslav; Wolpert, David

    2015-01-01

    This volume focuses on uncovering the fundamental forces underlying dynamic decision making among multiple interacting, imperfect and selfish decision makers. The chapters are written by leading experts from different disciplines, all considering the many sources of imperfection in decision making, and always with an eye to decreasing the myriad discrepancies between theory and real world human decision making. Topics addressed include uncertainty, deliberation cost and the complexity arising from the inherent large computational scale of decision making in these systems. In particular, analyses and experiments are presented which concern: • task allocation to maximize “the wisdom of the crowd”; • design of a society of “edutainment” robots who account for one another’s emotional states; • recognizing and counteracting seemingly non-rational human decision making; • coping with extreme scale when learning causality in networks; • efficiently incorporating expert knowledge in personalized...

  1. Effect of Exhaust Pressure on the Cooling Characteristics of a Liquid-Cooled Engine

    Science.gov (United States)

    Doyle, Ronald B.; Desmon, Leland G.

    1947-01-01

    Data for a liquid-cooled engine with a displacement volume of 1710 cubic inches were analyzed to determine the effect of exhaust pressure on the engine cooling characteristics. The data covered a range of exhaust pressures from 7 to 62 inches of mercury absolute, inlet-manifold pressures from 30 to 50 inches of mercury absolute, engine speeds from 1600 to 3000 rpm, and fuel-air ratios from 0.063 to 0.100. The effect of exhaust pressure on engine cooling was satisfactorily incorporated in the NACA cooling-correlation method as a variation in effective gas temperature with exhaust pressure. Large variations of cylinder-head temperature with exhaust pressure were obtained for operation at constant charge flow. At a constant charge flow of 2 pounds per second (approximately 1000 bhp) and a fuel-air ratio of 0.085, an increase in exhaust pressure from 10 to 60 inches of mercury absolute resulted in an increase of 40 F in average cylinder-head temperature. For operation at constant engine speed and inlet-manifold pressure and variable exhaust pressure (variable charge flow), however, the effect of exhaust pressure on cylinder-head temperature is small. For example, at an inlet-manifold pressure of 40 inches of mercury absolute, an engine speed of 2400 rpm, and a fuel-air ratio of 0.085, the average cylinder-head temperature was about the same at exhaust pressures of 10 and 60 inches of mercury absolute; a rise and a subsequent decrease of about 70 occurred between these extremes.
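
    For context, the NACA cooling-correlation method mentioned here is typically built around a relation of the form below (a generic, hedged restatement; the exponents, the exact flow groupings and the liquid-cooled variant used in this report are not taken from the record):

        \frac{T_h - T_c}{T_g - T_h} \;=\; K \,\frac{W_e^{\,n}}{W_c^{\,m}}

    where T_h is the average cylinder-head temperature, T_c the coolant (or cooling-air) temperature, T_g the effective gas temperature, W_e the engine charge flow, W_c the coolant flow, and K, n, m are empirically fitted constants. The report's result can then be read as saying that T_g must be treated as a function of exhaust pressure.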

  2. Decision-making under great uncertainty

    International Nuclear Information System (INIS)

    Hansson, S.O.

    1992-01-01

    Five types of decision-uncertainty are distinguished: uncertainty of consequences, of values, of demarcation, of reliance, and of co-ordination. Strategies are proposed for each type of uncertainty. The general conclusion is that it is meaningful for decision theory to treat cases with greater uncertainty than the textbook case of 'decision-making under uncertainty'. (au)

  3. Uncertainty analysis methods for estimation of reliability of passive system of VHTR

    International Nuclear Information System (INIS)

    Han, S.J.

    2012-01-01

    An estimation of the reliability of passive systems for the probabilistic safety assessment (PSA) of a very high temperature reactor (VHTR) is under development in Korea. The essential approach of this estimation is to measure the uncertainty of the system performance under a specific accident condition. The uncertainty propagation approach, based on the simulation of phenomenological models (computer codes), is adopted as a typical method to estimate the uncertainty for this purpose. This presentation introduced the uncertainty propagation and discussed the related issues, focusing on the propagation object and its surrogates. To achieve a sufficient level of depth of the uncertainty results, the applicability of the propagation should be carefully reviewed. For an example study, the Latin hypercube sampling (LHS) method was tested as a direct propagation for a specific VHTR accident sequence. The reactor cavity cooling system (RCCS) developed by KAERI was considered for this example study. This is an air-cooled passive system that has no active components for its operation. The accident sequence is a low pressure conduction cooling (LPCC) accident, which is considered a design basis accident for the safety design of the VHTR. This sequence is due to a large failure of the pressure boundary of the reactor system, such as a guillotine break of the coolant pipe lines. The presentation discussed the insights obtained (benefits and weaknesses) regarding the application of this estimation of passive system reliability
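
    The record describes direct propagation by the Latin hypercube sampling (LHS) method through a phenomenological code. The sketch below is a minimal, hypothetical illustration of that workflow using SciPy's LatinHypercube sampler; the response function stands in for the actual RCCS/LPCC thermal-hydraulic code, and the parameter ranges and safety limit are assumptions, not values from the presentation.

        import numpy as np
        from scipy.stats import qmc

        # Hypothetical uncertain inputs for an RCCS-type passive system (illustrative ranges only)
        names    = ["emissivity", "air_inlet_temp_C", "flow_resistance"]
        l_bounds = [0.70,          20.0,               0.8]
        u_bounds = [0.95,          45.0,               1.2]

        def peak_fuel_temp_C(x):
            """Stand-in for the system code: returns the response of interest."""
            emissivity, t_in, k_loss = x
            return 1200.0 - 300.0 * emissivity + 2.0 * t_in + 150.0 * (k_loss - 1.0)

        sampler = qmc.LatinHypercube(d=len(names), seed=42)
        unit_samples = sampler.random(n=200)                       # 200 LHS samples on [0, 1)^d
        samples = qmc.scale(unit_samples, l_bounds, u_bounds)      # scale to parameter ranges

        responses = np.array([peak_fuel_temp_C(x) for x in samples])

        limit = 1600.0                                             # assumed safety limit, degC
        failure_prob = np.mean(responses > limit)                  # crude reliability estimate
        print(f"95th percentile: {np.percentile(responses, 95):.0f} C, "
              f"P(exceed limit) = {failure_prob:.3f}")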

  4. Uncertainty Analysis of the Temperature–Resistance Relationship of Temperature Sensing Fabric

    Directory of Open Access Journals (Sweden)

    Muhammad Dawood Husain

    2016-11-01

    Full Text Available This paper reports the uncertainty analysis of the temperature–resistance (TR) data of the newly developed temperature sensing fabric (TSF), which is a double-layer knitted structure fabricated on an electronic flat-bed knitting machine, made of polyester as a basal yarn, and embedded with fine metallic wire as sensing element. The measurement principle of the TSF is identical to that of a resistance temperature detector (RTD); that is, change in resistance due to change in temperature. The regression uncertainty (uncertainty within repeats) and repeatability uncertainty (uncertainty among repeats) were estimated by analysing more than 300 TR experimental repeats of 50 TSF samples. The experiments were performed under dynamic heating and cooling environments on a purpose-built test rig within the temperature range of 20–50 °C. The continuous experimental data was recorded through a LabVIEW-based graphical user interface. The result showed that temperature and resistance values were not only repeatable but reproducible, with only minor variations. The regression uncertainty was found to be less than ±0.3 °C; the TSF sample made of Ni and W wires showed regression uncertainty of <±0.13 °C in comparison to Cu-based TSF samples (>±0.18 °C). The cooling TR data showed considerably reduced values (±0.07 °C) of uncertainty in comparison with the heating TR data (±0.24 °C). The repeatability uncertainty was found to be less than ±0.5 °C. By increasing the number of samples and repeats, the uncertainties may be reduced further. The TSF could be used for continuous measurement of the temperature profile on the surface of the human body.
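
    As a concrete illustration of the TR calibration and of how a regression uncertainty in °C can be extracted from one repeat, the sketch below fits a linear RTD-style relation and converts the residual scatter into a temperature uncertainty; the synthetic data, the 0.004 per °C temperature coefficient and the 95% coverage factor are assumptions, not values or definitions taken from the paper.

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic single TR repeat: resistance rises roughly linearly with temperature (RTD behaviour)
        temp_C = np.linspace(20.0, 50.0, 300)                        # reference temperature sweep
        r0, alpha = 100.0, 0.004                                     # assumed R at 20 C (ohm) and temp. coefficient (1/C)
        resistance = r0 * (1.0 + alpha * (temp_C - 20.0)) + rng.normal(0.0, 0.02, temp_C.size)

        # Calibration: regress temperature on resistance, as a TSF readout would
        slope, intercept = np.polyfit(resistance, temp_C, 1)
        temp_predicted = slope * resistance + intercept

        # "Regression uncertainty": scatter of the fit residuals, expressed in degC
        residuals = temp_C - temp_predicted
        regression_uncertainty = 1.96 * residuals.std(ddof=2)        # ~95 % half-width
        print(f"regression uncertainty ~ +/- {regression_uncertainty:.2f} C")

        # "Repeatability uncertainty" would follow by comparing fitted curves across repeats,
        # e.g. the spread of temperatures predicted at a fixed resistance over all repeats.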

  5. Uncertainty as Information: Narrowing the Science-policy Gap

    Directory of Open Access Journals (Sweden)

    G. A. Bradshaw

    2000-07-01

    Full Text Available Conflict and indecision are hallmarks of environmental policy formulation. Some argue that the requisite information and certainty fall short of scientific standards for decision making; others argue that science is not the issue and that indecisiveness reflects a lack of political willpower. One of the most difficult aspects of translating science into policy is scientific uncertainty. Whereas scientists are familiar with uncertainty and complexity, the public and policy makers often seek certainty and deterministic solutions. We assert that environmental policy is most effective if scientific uncertainty is incorporated into a rigorous decision-theoretic framework as knowledge, not ignorance. The policies that best utilize scientific findings are defined here as those that accommodate the full scope of scientifically based predictions.

  6. Asbestos in cooling-tower waters. Final report

    International Nuclear Information System (INIS)

    Lewis, B.A.G.

    1979-03-01

    Water discharges from cooling towers constructed with asbestos fill were found to contain chrysotile-asbestos fibers at concentrations as high as 10^8 fibers/liter. The major source of these fibers appears to be the components of the towers rather than the air drawn through the towers or the makeup water taken into the towers. Suggested mechanisms for the release of chrysotile fibers from cooling-tower fill include freeze-thaw cycles and dissolution of the cement due to acidic components of the circulating water. Ash- or other material-settling ponds were found to reduce asbestos-fiber concentrations in cooling-tower effluent. The literature reviewed did not support the case for a causal relationship between adverse human health effects and drinking water containing on the order of 10^6 chrysotile-asbestos fibers/liter; for this and other reasons, it is not presently suggested that the use of asbestos fill be discontinued. However, caution and surveillance are dictated by the uncertainties in the epidemiological studies, the absence of evidence for a safe threshold concentration in water, and the conclusive evidence for adverse effects from occupational exposure. It is recommended that monitoring programs be carried out at sites where asbestos fill is used; data from such programs can be used to determine whether any mitigative measures should be taken. On the basis of estimates made in this study, monitoring for asbestos in drift from cooling towers does not appear to be warranted

  7. Adiabatic Cooling for Rovibrational Spectroscopy of Molecular Ions

    DEFF Research Database (Denmark)

    Fisher, Karin

    2017-01-01

    proposes to adiabatically relax the trapping potential, called adiabatic cooling, when performing rovibrational excitations of the molecular ion to reduce the energy spacing of the harmonic motional levels, thus increasing the likelihood of a motional transition. The work presented in this thesis covers...... the implementation of adiabatic cooling for the application of rovibrational spectroscopy on single molecular ions. This entailed constructing and testing a new DC supply capable of employing adiabatic ramps of the ion's axial frequency on a timescale of hundreds of µs. The DC supply went through several iterations...... is possible with some optimization. Rovibrational transitions in 24MgH+ are only known to the 1.5 GHz level compared to their Hz-linewidths. Simulations for broadband spectroscopy aimed at reducing this uncertainty are presented for a rovibrational transition in 24MgH+. This technique allows for illumination

  8. Cooling in the single-photon regime of optomechanics

    Science.gov (United States)

    Nunnenkamp, Andreas; Borkje, Kjetil; Girvin, Steven

    2012-02-01

    Optomechanics experiments are rapidly approaching the regime where the radiation pressure of a single photon displaces the mechanical oscillator by more than its zero-point uncertainty. We show that in this limit the power spectrum has multiple sidebands and that the cavity response has several resonances in the resolved-sideband limit [Phys. Rev. Lett. 107, 063602 (2011)]. We then discuss how red-sideband cooling is modified in this nonlinear regime. Using Fermi's Golden rule we calculate the transition rates induced by the optical drive. In the resolved-sideband limit we find multiple cooling resonances for strong single-photon coupling. They lead to non-thermal steady states and are accompanied by multiple mechanical sidebands in the optical output spectrum. Our study provides the tools to detect and take advantage of this novel regime of optomechanics.

  9. Numerical modeling of economic uncertainty

    DEFF Research Database (Denmark)

    Schjær-Jacobsen, Hans

    2007-01-01

    Representation and modeling of economic uncertainty is addressed by different modeling methods, namely stochastic variables and probabilities, interval analysis, and fuzzy numbers, in particular triple estimates. Focusing on discounted cash flow analysis numerical results are presented, comparisons...

  10. Climate Projections and Uncertainty Communication.

    Science.gov (United States)

    Joslyn, Susan L; LeClerc, Jared E

    2016-01-01

    Lingering skepticism about climate change might be due in part to the way climate projections are perceived by members of the public. Variability between scientists' estimates might give the impression that scientists disagree about the fact of climate change rather than about details concerning the extent or timing. Providing uncertainty estimates might clarify that the variability is due in part to quantifiable uncertainty inherent in the prediction process, thereby increasing people's trust in climate projections. This hypothesis was tested in two experiments. Results suggest that including uncertainty estimates along with climate projections leads to an increase in participants' trust in the information. Analyses explored the roles of time, place, demographic differences (e.g., age, gender, education level, political party affiliation), and initial belief in climate change. Implications are discussed in terms of the potential benefit of adding uncertainty estimates to public climate projections. Copyright © 2015 Cognitive Science Society, Inc.

  11. Exposing Position Uncertainty in Middleware

    DEFF Research Database (Denmark)

    Langdal, Jakob; Kjærgaard, Mikkel Baun; Toftkjær, Thomas

    2010-01-01

    Traditionally, the goal for positioning middleware is to provide developers with seamless position transparency, i.e., providing a connection between the application domain and the positioning sensors while hiding the complexity of the positioning technologies in use. A key part of the hidden...... complexity is the uncertainty associated to positions caused by inherent limitations when using sensors to convert physical phenomena to digital representations. We propose to use the notion of seamful design for developers to design a positioning middleware that provides transparent positioning and still...... allows developers some control of the uncertainty aspects of the positioning process. The design presented in this paper shows how uncertainty of positioning can be conceptualized and internalized into a positioning middleware. Furthermore, we argue that a developer who is interacting with uncertainty...

  12. The Uncertainties of Risk Management

    DEFF Research Database (Denmark)

    Vinnari, Eija; Skærbæk, Peter

    2014-01-01

    for expanding risk management. More generally, such uncertainties relate to the professional identities and responsibilities of operational managers as defined by the framing devices. Originality/value – The paper offers three contributions to the extant literature: first, it shows how risk management itself......Purpose – The purpose of this paper is to analyse the implementation of risk management as a tool for internal audit activities, focusing on unexpected effects or uncertainties generated during its application. Design/methodology/approach – Public and confidential documents as well as semi......-structured interviews are analysed through the lens of actor-network theory to identify the effects of risk management devices in a Finnish municipality. Findings – The authors found that risk management, rather than reducing uncertainty, itself created unexpected uncertainties that would otherwise not have emerged...

  13. Dimensional measurements with submicrometer uncertainty in production environment

    DEFF Research Database (Denmark)

    De Chiffre, L.; Gudnason, M. M.; Madruga, D.

    2015-01-01

    The work concerns a laboratory investigation of a method to achieve dimensional measurements with submicrometer uncertainty under conditions that are typical of a production environment. The method involves the concurrent determination of dimensions and material properties from measurements carried...... with synchronous measurements of length and temperature during cooling from 25 °C to 20 °C were carried out, using two calibrated gauge blocks as workpieces, i.e., a steel gauge block and a tungsten carbide gauge block. Each measurement was repeated 9 times. Coefficients of thermal expansion (CTE) for the two...

  14. Topology optimization for optical projection lithography with manufacturing uncertainties

    DEFF Research Database (Denmark)

    Zhou, Mingdong; Lazarov, Boyan Stefanov; Sigmund, Ole

    2014-01-01

    to manufacturing without additional optical proximity correction (OPC). The performance of the optimized device is robust toward the considered process variations. With the proposed unified approach, the design for photolithography is achieved by considering the optimal device performance and manufacturability......This article presents a topology optimization approach for micro- and nano-devices fabricated by optical projection lithography. Incorporating the photolithography process and the manufacturing uncertainties into the topology optimization process results in a binary mask that can be sent directly

  15. MANAGING UNCERTAINTY IN PRODUCT INNOVATION USING MARKETING STRATEGIES

    OpenAIRE

    Fernandes, Gláucia; Brandão, Luiz Eduardo Teixeira

    2016-01-01

    ABSTRACT Innovation is an important factor in increasing competitiveness of Brazilian enterprises. On the other hand, innovative projects are characterized by many technical and market uncertainties. This article proposes incorporating marketing strategies into risk management methods for the development of new products and technology projects, which we call the 4P's of innovation. To illustrate this concept, we apply this model to an IT project in order to determine its value and risks. The ...

  16. Propagation of dynamic measurement uncertainty

    Science.gov (United States)

    Hessling, J. P.

    2011-10-01

    The time-dependent measurement uncertainty has been evaluated in a number of recent publications, starting from a known uncertain dynamic model. This could be defined as the 'downward' propagation of uncertainty from the model to the targeted measurement. The propagation of uncertainty 'upward' from the calibration experiment to a dynamic model traditionally belongs to system identification. The use of different representations (time, frequency, etc) is ubiquitous in dynamic measurement analyses. An expression of uncertainty in dynamic measurements is formulated for the first time in this paper independent of representation, joining upward as well as downward propagation. For applications in metrology, the high quality of the characterization may be prohibitive for any reasonably large and robust model to pass the whiteness test. This test is therefore relaxed by not directly requiring small systematic model errors in comparison to the randomness of the characterization. Instead, the systematic error of the dynamic model is propagated to the uncertainty of the measurand, analogously but differently to how stochastic contributions are propagated. The pass criterion of the model is thereby transferred from the identification to acceptance of the total accumulated uncertainty of the measurand. This increases the relevance of the test of the model as it relates to its final use rather than the quality of the calibration. The propagation of uncertainty hence includes the propagation of systematic model errors. For illustration, the 'upward' propagation of uncertainty is applied to determine if an appliance box is damaged in an earthquake experiment. In this case, relaxation of the whiteness test was required to reach a conclusive result.

  17. Propagation of dynamic measurement uncertainty

    International Nuclear Information System (INIS)

    Hessling, J P

    2011-01-01

    The time-dependent measurement uncertainty has been evaluated in a number of recent publications, starting from a known uncertain dynamic model. This could be defined as the 'downward' propagation of uncertainty from the model to the targeted measurement. The propagation of uncertainty 'upward' from the calibration experiment to a dynamic model traditionally belongs to system identification. The use of different representations (time, frequency, etc) is ubiquitous in dynamic measurement analyses. An expression of uncertainty in dynamic measurements is formulated for the first time in this paper independent of representation, joining upward as well as downward propagation. For applications in metrology, the high quality of the characterization may be prohibitive for any reasonably large and robust model to pass the whiteness test. This test is therefore relaxed by not directly requiring small systematic model errors in comparison to the randomness of the characterization. Instead, the systematic error of the dynamic model is propagated to the uncertainty of the measurand, analogously but differently to how stochastic contributions are propagated. The pass criterion of the model is thereby transferred from the identification to acceptance of the total accumulated uncertainty of the measurand. This increases the relevance of the test of the model as it relates to its final use rather than the quality of the calibration. The propagation of uncertainty hence includes the propagation of systematic model errors. For illustration, the 'upward' propagation of uncertainty is applied to determine if an appliance box is damaged in an earthquake experiment. In this case, relaxation of the whiteness test was required to reach a conclusive result

  18. How to live with uncertainties?

    International Nuclear Information System (INIS)

    Michel, R.

    2012-01-01

    In a short introduction, the problem of uncertainty as a general consequence of incomplete information, as well as the approach to quantifying uncertainty in metrology, is addressed. A brief history of the more than 30 years of the working group AK SIGMA is followed by an appraisal of its achievements to date. Then, the potential future of the AK SIGMA is discussed based on its current tasks and on open scientific questions and future topics. (orig.)

  19. New Perspectives on Policy Uncertainty

    OpenAIRE

    Hlatshwayo, Sandile

    2017-01-01

    In recent years, the ubiquitous and intensifying nature of economic policy uncertainty has made it a popular explanation for weak economic performance in developed and developing markets alike. The primary channel for this effect is decreased and delayed investment as firms adopt a ``wait and see'' approach to irreversible investments (Bernanke, 1983; Dixit and Pindyck, 1994). Deep empirical examination of policy uncertainty's impact is rare because of the difficulty associated with measuring i...

  20. Investment choice and inflation uncertainty

    OpenAIRE

    Gregory Fischer

    2013-01-01

    This paper investigates the relationship between inflation uncertainty and investment using a panel of loan-level data from small businesses. Micro-level data makes it possible to study phenomena that are obscured in country or industry aggregates. The data show that periods of increased inflation uncertainty are associated with substantial reductions in total investment. Moreover, there is a shift in the composition of investment away from fixed assets and towards working capital - the more f...

  1. Radiative cooling of relativistic electron beams

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Zhirong [Stanford Univ., CA (United States)

    1998-05-01

    Modern high-energy particle accelerators and synchrotron light sources demand smaller and smaller beam emittances in order to achieve higher luminosity or better brightness. For light particles such as electrons and positrons, radiation damping is a natural and effective way to obtain low emittance beams. However, the quantum aspect of radiation introduces random noise into the damped beams, yielding equilibrium emittances which depend upon the design of a specific machine. In this dissertation, the author attempts to make a complete analysis of the process of radiation damping and quantum excitation in various accelerator systems, such as bending magnets, focusing channels and laser fields. Because radiation is formed over a finite time and emitted in quanta of discrete energies, he invokes the quantum mechanical approach whenever the quasiclassical picture of radiation is insufficient. He shows that radiation damping in a focusing system is fundamentally different from that in a bending system. Quantum excitation to the transverse dimensions is absent in a straight, continuous focusing channel, and is exponentially suppressed in a focusing-dominated ring. Thus, the transverse normalized emittances in such systems can in principle be damped to the Compton wavelength of the electron, limited only by the Heisenberg uncertainty principle. In addition, he investigates methods of rapid damping such as radiative laser cooling. He proposes a laser-electron storage ring (LESR) where the electron beam in a compact storage ring repetitively interacts with an intense laser pulse stored in an optical resonator. The laser-electron interaction gives rise to rapid cooling of electron beams and can be used to overcome the space charge effects encountered in a medium energy circular machine. Applications to the designs of low emittance damping rings and compact x-ray sources are also explored.

  2. Cost-effective conservation of an endangered frog under uncertainty.

    Science.gov (United States)

    Rose, Lucy E; Heard, Geoffrey W; Chee, Yung En; Wintle, Brendan A

    2016-04-01

    How should managers choose among conservation options when resources are scarce and there is uncertainty regarding the effectiveness of actions? Well-developed tools exist for prioritizing areas for one-time and binary actions (e.g., protect vs. not protect), but methods for prioritizing incremental or ongoing actions (such as habitat creation and maintenance) remain uncommon. We devised an approach that combines metapopulation viability and cost-effectiveness analyses to select among alternative conservation actions while accounting for uncertainty. In our study, cost-effectiveness is the ratio between the benefit of an action and its economic cost, where benefit is the change in metapopulation viability. We applied the approach to the case of the endangered growling grass frog (Litoria raniformis), which is threatened by urban development. We extended a Bayesian model to predict metapopulation viability under 9 urbanization and management scenarios and incorporated the full probability distribution of possible outcomes for each scenario into the cost-effectiveness analysis. This allowed us to discern between cost-effective alternatives that were robust to uncertainty and those with a relatively high risk of failure. We found a relatively high risk of extinction following urbanization if the only action was reservation of core habitat; habitat creation actions performed better than enhancement actions; and cost-effectiveness ranking changed depending on the consideration of uncertainty. Our results suggest that creation and maintenance of wetlands dedicated to L. raniformis is the only cost-effective action likely to result in a sufficiently low risk of extinction. To our knowledge, this is the first study to use Bayesian metapopulation viability analysis to explicitly incorporate parametric and demographic uncertainty into a cost-effective evaluation of conservation actions. The approach offers guidance to decision makers aiming to achieve cost
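
    As a rough illustration of the cost-effectiveness calculation described above, the sketch below propagates a full distribution of benefits into the ratio. The viability distributions and action cost are hypothetical placeholders, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000  # Monte Carlo replicates

# Hypothetical inputs: the benefit of an action is the change in metapopulation
# viability (probability of persistence), carried as a full distribution of
# outcomes rather than a point estimate; the cost is in arbitrary currency units.
baseline_viability = rng.beta(2, 8, n)          # assumed viability with no action
action_viability = rng.beta(5, 5, n)            # assumed viability with wetland creation
cost = 1.5e6                                    # assumed cost of the action

benefit = action_viability - baseline_viability # change in viability per replicate
ce_ratio = benefit / cost                       # cost-effectiveness per replicate

print("mean CE ratio:", ce_ratio.mean())
print("P(action worsens viability):", np.mean(benefit < 0))
print("5th-95th percentile of benefit:", np.percentile(benefit, [5, 95]))
```

    Carrying the whole distribution, rather than a single best estimate, is what allows cost-effective options with a high residual risk of failure to be distinguished from robust ones.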

  3. Reducing uncertainty in geostatistical description with well testing pressure data

    Energy Technology Data Exchange (ETDEWEB)

    Reynolds, A.C.; He, Nanqun [Univ. of Tulsa, OK (United States); Oliver, D.S. [Chevron Petroleum Technology Company, La Habra, CA (United States)

    1997-08-01

    Geostatistics has proven to be an effective tool for generating realizations of reservoir properties conditioned to static data, e.g., core and log data and geologic knowledge. Due to the lack of closely spaced data in the lateral directions, there will be significant variability in reservoir descriptions generated by geostatistical simulation, i.e., significant uncertainty in the reservoir descriptions. In past work, we have presented procedures based on inverse problem theory for generating reservoir descriptions (rock property fields) conditioned to pressure data and geostatistical information represented as prior means for log-permeability and porosity and variograms. Although we have shown that the incorporation of pressure data reduces the uncertainty below the level contained in the geostatistical model based only on static information (the prior model), our previous results did not explicitly account for uncertainties in the prior means and the parameters defining the variogram model. In this work, we investigate how pressure data can help detect errors in the prior means. If errors in the prior means are large and are not taken into account, realizations conditioned to pressure data represent incorrect samples of the a posteriori probability density function for the rock property fields, whereas, if the uncertainty in the prior mean is incorporated properly into the model, one obtains realistic realizations of the rock property fields.

  4. Uncertainty in measurements by counting

    Science.gov (United States)

    Bich, Walter; Pennecchi, Francesca

    2012-02-01

    Counting is at the base of many high-level measurements, such as, for example, frequency measurements. In some instances the measurand itself is a number of events, such as spontaneous decays in activity measurements, or objects, such as colonies of bacteria in microbiology. Countings also play a fundamental role in everyday life. In any case, a counting is a measurement. A measurement result, according to its present definition, as given in the 'International Vocabulary of Metrology—Basic and general concepts and associated terms (VIM)', must include a specification concerning the estimated uncertainty. As concerns measurements by counting, this specification is not easy to encompass in the well-known framework of the 'Guide to the Expression of Uncertainty in Measurement', known as GUM, in which there is no guidance on the topic. Furthermore, the issue of uncertainty in countings has received little or no attention in the literature, so that it is commonly accepted that this category of measurements constitutes an exception in which the concept of uncertainty is not applicable, or, alternatively, that results of measurements by counting have essentially no uncertainty. In this paper we propose a general model for measurements by counting which allows an uncertainty evaluation compliant with the general framework of the GUM.
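
    One common way to attach an uncertainty to a count, under the frequently used Poisson assumption (the paper's general model may differ), is to take the standard uncertainty as the square root of the observed count; a minimal sketch:

```python
import math

def poisson_count_uncertainty(n_counts: int):
    """Standard uncertainty of an observed count under a Poisson model."""
    u = math.sqrt(n_counts)
    return u, u / n_counts  # absolute and relative standard uncertainty

for n in (100, 10_000, 1_000_000):
    u, rel = poisson_count_uncertainty(n)
    print(f"n = {n:>9}: u = {u:8.1f}, relative = {rel:.2%}")
```

    The relative uncertainty shrinks as 1/sqrt(n), which is why large counts are often treated as effectively exact even though they are not.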

  5. Uncertainties in land use data

    Directory of Open Access Journals (Sweden)

    G. Castilla

    2007-11-01

    This paper deals with the description and assessment of uncertainties in land use data derived from Remote Sensing observations, in the context of hydrological studies. Land use is a categorical regionalised variable reporting the main socio-economic role each location has, where the role is inferred from the pattern of occupation of land. The properties of this pattern that are relevant to hydrological processes have to be known with some accuracy in order to obtain reliable results; hence, uncertainty in land use data may lead to uncertainty in model predictions. There are two main uncertainties surrounding land use data, positional and categorical. The first one is briefly addressed and the second one is explored in more depth, including the factors that influence it. We (1) argue that the conventional method used to assess categorical uncertainty, the confusion matrix, is insufficient to propagate uncertainty through distributed hydrologic models; (2) report some alternative methods to tackle this and other insufficiencies; (3) stress the role of metadata as a more reliable means to assess the degree of distrust with which these data should be used; and (4) suggest some practical recommendations.

  6. A Bayesian approach to model uncertainty

    International Nuclear Information System (INIS)

    Buslik, A.

    1994-01-01

    A Bayesian approach to model uncertainty is taken. For the case of a finite number of alternative models, the model uncertainty is equivalent to parameter uncertainty. A derivation based on Savage's partition problem is given
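
    For a finite set of alternative models, the Bayesian treatment sketched above reduces to posterior model probabilities and model averaging. The snippet below is a generic illustration with assumed priors, marginal likelihoods and per-model predictions, none of which come from the report:

```python
import numpy as np

# Hypothetical setting: three candidate models with equal prior probability;
# marginal likelihoods p(data | M_i) are assumed to come from an earlier computation.
prior = np.array([1/3, 1/3, 1/3])
marginal_likelihood = np.array([2.0e-4, 5.0e-4, 1.0e-4])  # assumed values

posterior = prior * marginal_likelihood
posterior /= posterior.sum()                 # posterior model probabilities

# Model-averaged prediction: each model predicts some quantity of interest.
predictions = np.array([0.12, 0.08, 0.20])   # assumed per-model predictions
bma_prediction = float(np.dot(posterior, predictions))

print("posterior model probabilities:", posterior.round(3))
print("model-averaged prediction:", round(bma_prediction, 4))
```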

  7. Systematic Uncertainties in High-Rate Germanium Data

    Energy Technology Data Exchange (ETDEWEB)

    Gilbert, Andrew J.; Fast, James E.; Fulsom, Bryan G.; Pitts, William K.; VanDevender, Brent A.; Wood, Lynn S.

    2016-10-06

    For many nuclear material safeguards inspections, spectroscopic gamma detectors are required which can achieve high event rates (in excess of 10^6 s^-1) while maintaining very good energy resolution for discrimination of neighboring gamma signatures in complex backgrounds. Such spectra can be useful for non-destructive assay (NDA) of spent nuclear fuel with long cooling times, which contains many potentially useful low-rate gamma lines, e.g., Cs-134, in the presence of a few dominating gamma lines, such as Cs-137. Detectors in use typically sacrifice energy resolution for count rate, e.g., LaBr3, or vice versa, e.g., CdZnTe. In contrast, we anticipate that beginning with a detector with high energy resolution, e.g., high-purity germanium (HPGe), and adapting the data acquisition for high throughput will be able to achieve the goals of the ideal detector. In this work, we present quantification of Cs-134 and Cs-137 activities, useful for fuel burn-up quantification, in fuel that has been cooling for 22.3 years. A segmented, planar HPGe detector is used for this inspection, which has been adapted for a high-rate throughput in excess of 500k counts/s. Using a very-high-statistics spectrum of 2.4×10^11 counts, isotope activities can be determined with very low statistical uncertainty. However, it is determined that systematic uncertainties dominate in such a data set, e.g., the uncertainty in the pulse line shape. This spectrum offers a unique opportunity to quantify this uncertainty and subsequently determine required counting times for given precision on values of interest.
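
    A rough back-of-the-envelope check, with an assumed fraction of counts falling in a weak line, shows why statistical uncertainty becomes negligible at these count levels and systematic effects dominate:

```python
import math

N_total = 2.4e11          # total counts in the spectrum (value quoted in the abstract)
line_fraction = 1e-6      # assumed fraction of counts in a weak Cs-134 line

N_line = N_total * line_fraction
rel_stat = 1.0 / math.sqrt(N_line)   # relative statistical (Poisson) uncertainty

print(f"counts in line: {N_line:.2e}")
print(f"relative statistical uncertainty: {rel_stat:.2e}")
# Even for a line carrying only one count in a million, the statistical
# uncertainty is about 0.2%, so systematic effects such as the pulse line
# shape will typically dominate the total uncertainty budget.
```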

  8. Immersion cooling of silicon photomultipliers (SiPM) for nuclear medicine imaging applications

    International Nuclear Information System (INIS)

    Raylman, R.R.; Stolin, A.V.

    2016-01-01

    Silicon photomultipliers (SiPM) are compact, high amplification light detection devices that have recently been incorporated into magnetic field-compatible positron emission tomography (PET) scanners. To take full advantage of these devices, it is preferable to cool them below room temperature. Most current methods are limited to the cooling of individual detector modules, increasing the complexity and cost of scanners made up of a large number of modules. In this work we investigated a new method of cooling: immersion of the detector modules in a non-electrically conductive, cooled liquid. A small-scale prototype system was constructed to cool a relatively large-area SiPM-based scintillator detector module by immersing it in a circulating bath of mineral oil. Testing demonstrated that the system rapidly decreased and stabilized the temperature of the device. Operation of the detector illustrated the expected benefits of cooling, with no apparent degradation of performance attributable to immersion in fluid. - Highlights: • Immersion cooling is a new, simple and inexpensive method for cooling solid-state-based nuclear medicine scanners. • The method was successfully tested on a scaled version of an SiPM-based PET detector module. • It can be scaled up to cool a complete PET scanner.

  9. 14 CFR 29.908 - Cooling fans.

    Science.gov (United States)

    2010-01-01

    Title 14, Aeronautics and Space; Airworthiness Standards: Transport Category Rotorcraft; Powerplant General; § 29.908 Cooling fans. For cooling fans that are a part of a powerplant installation the following apply: (a) Category A. For cooling fans installed...

  10. Impingement jet cooling in gas turbines

    CERN Document Server

    Amano, R S

    2014-01-01

    Due to the requirement for enhanced cooling technologies on modern gas turbine engines, advanced research and development has had to take place in the field of thermal engineering. Impingement jet cooling is one of the most effective in terms of cooling, manufacturability and cost. This is the first book to focus on impingement cooling alone.

  11. Incorporation, plurality, and the incorporation of plurals: a dynamic approach

    NARCIS (Netherlands)

    de Swart, H.E.; Farkas, D. F.

    2004-01-01

    This paper deals with the semantic properties of incorporated nominals that are present at clausal syntax. Such nominals exhibit a complex cluster of semantic properties, ranging from argument structure, scope, and number to discourse transparency. We develop an analysis of incorporation in the

  12. Measuring the coolness of interactive products

    DEFF Research Database (Denmark)

    Bruun, Anders; Raptis, Dimitrios; Kjeldskov, Jesper

    2016-01-01

    is the COOL questionnaire. We based the creation of the questionnaire on literature suggesting that perceived coolness is decomposed to outer cool (the style of a product) and inner cool (the personality characteristics assigned to it). In this paper, we focused on inner cool, and we identified 11 inner cool...... characteristics. These were used to create an initial pool of question items and 2236 participants were asked to assess 16 mobile devices. By performing exploratory and confirmatory factor analyses, we identified three factors that can measure the perceived inner coolness of interactive products: desirability...

  13. New cooling regulation technology of secondary cooling station in DCS

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Xuan; Yan, Jun-wei; Zhu, Dong-sheng; Liu, Fei-long; Lei, Jun-xi [The Key Lab of Enhanced Heat Transfer and Energy Conservation of Ministry of Education, School of Chemical and Energy Engineering, South China University of Technology, Guangzhou 510641 (China); Liang, Lie-quan [The Key Lab of E-Commerce Market Application Technology of Guangdong Province, Guangdong University of Business Studies, Guangzhou 510320 (China)

    2008-07-01

    In this paper, a new control technology for the secondary cooling station (constant flow rate/variable temperature difference) in a district cooling system (DCS) is proposed, in view of a series of consequences, including low efficiency and high operating cost, caused by the low temperature of supply water in DCS. This technology has been applied in the DCS of Guangzhou University City. The results indicate that this technology can increase the supply and return temperatures of the buildings and the return water temperature on the primary side of the plate heat exchanger unit; moreover, the efficiency of both the chiller and the whole system is improved significantly. (author)

  14. Thermodynamic limits of dynamic cooling.

    Science.gov (United States)

    Allahverdyan, Armen E; Hovhannisyan, Karen V; Janzing, Dominik; Mahler, Guenter

    2011-10-01

    We study dynamic cooling, where an externally driven two-level system is cooled via a reservoir, a quantum system in an initial canonical equilibrium state. We obtain explicitly the minimal possible temperature T_min > 0 reachable for the two-level system. The minimization goes over all unitary dynamic processes operating on the system and reservoir and over the reservoir energy spectrum. The minimal work needed to reach T_min grows as 1/T_min. This work cost can be significantly reduced, though, if one is satisfied by temperatures slightly above T_min. Our results on T_min > 0 prove the unattainability of the absolute zero temperature without the ambiguities that surround its derivation from the entropic version of the third law. We also study cooling via a reservoir consisting of N ≫ 1 identical spins. Here we show that T_min ∝ 1/N and find the maximal cooling compatible with the minimal work determined by the free energy. Finally we discuss cooling by a reservoir with an initially microcanonical state and show that although a purely microcanonical state can yield zero temperature, the unattainability is recovered when taking into account imperfections in preparing the microcanonical state.

  15. Newton's law of cooling revisited

    International Nuclear Information System (INIS)

    Vollmer, M

    2009-01-01

    The cooling of objects is often described by a law, attributed to Newton, which states that the temperature difference of a cooling body with respect to the surroundings decreases exponentially with time. Such behaviour has been observed for many laboratory experiments, which led to a wide acceptance of this approach. However, the heat transfer from any object to its surroundings is not only due to conduction and convection but also due to radiation. The latter does not vary linearly with temperature difference, which leads to deviations from Newton's law. This paper presents a theoretical analysis of the cooling of objects with a small Biot number. It is shown that Newton's law of cooling, i.e. simple exponential behaviour, is mostly valid if temperature differences are below a certain threshold which depends on the experimental conditions. For any larger temperature differences appreciable deviations occur which need the complete nonlinear treatment. This is demonstrated by results of some laboratory experiments which use IR imaging to measure surface temperatures of solid cooling objects with temperature differences of up to 300 K.
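
    A minimal lumped-capacitance sketch, with illustrative parameter values rather than those of the paper, compares the purely exponential (Newtonian) prediction with a model that adds the nonlinear radiative term:

```python
import numpy as np

# Compare pure Newtonian (convective) cooling with cooling that also includes
# a T^4 radiative term, for an object with a small Biot number (lumped model).
# All parameter values below are illustrative, not taken from the paper.
sigma = 5.670e-8      # Stefan-Boltzmann constant, W m^-2 K^-4
h, A = 10.0, 0.01     # convective coefficient (W m^-2 K^-1), surface area (m^2)
m, c = 0.1, 450.0     # mass (kg), specific heat (J kg^-1 K^-1)
eps = 0.9             # emissivity
T_env = 293.0         # surroundings (K)

def dTdt(T, radiative=True):
    q = h * A * (T - T_env)
    if radiative:
        q += eps * sigma * A * (T**4 - T_env**4)
    return -q / (m * c)

dt, T_newton, T_full = 1.0, 593.0, 593.0    # start 300 K above the surroundings
for _ in range(3600):                        # one hour, explicit Euler steps
    T_newton += dt * dTdt(T_newton, radiative=False)
    T_full += dt * dTdt(T_full, radiative=True)

print(f"after 1 h: Newton-only {T_newton:.1f} K, with radiation {T_full:.1f} K")
```

    At a 300 K initial temperature difference the radiative term is of the same order as the convective one in this toy setup, which is exactly the regime where the simple exponential law breaks down.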

  16. Beam Cooling with ionisation losses

    CERN Document Server

    Rubbia, Carlo; Kadi, Y; Vlachoudis, V

    2006-01-01

    A novel type of particle "cooling", called Ionization Cooling, is applicable to slow (v of the order of 0.1c) ions stored in a small ring. The many traversals through a thin foil enhance the nuclear reaction probability, in a steady configuration in which ionisation losses are recovered at each turn by a RF-cavity. For a uniform target "foil" the longitudinal momentum spread diverges exponentially since faster (slower) particles ionise less (more) than the average. In order to "cool" also longitudinally, a chromaticity has to be introduced with a wedge shaped "foil". Multiple scattering and straggling are then "cooled" in all three dimensions, with a method similar to the one of synchrotron cooling, but valid for low energy ions. Particles then stably circulate in the beam indefinitely, until they undergo for instance nuclear processes in the thin target foil. This new method is under consideration for the nuclear production of a few MeV/A ion beams. Simple reactions, for instance Li 7 + D Li 8 + p, are more ...

  17. HTGR reactor physics, thermal-hydraulics and depletion uncertainty analysis: a proposed IAEA coordinated research project

    International Nuclear Information System (INIS)

    Tyobeka, Bismark; Reitsma, Frederik; Ivanov, Kostadin

    2011-01-01

    The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of HTGR design and safety features with reliable high fidelity physics models and robust, efficient, and accurate codes. The predictive capability of coupled neutronics/thermal hydraulics and depletion simulations for reactor design and safety analysis can be assessed with sensitivity analysis and uncertainty analysis methods. In order to benefit from recent advances in modeling and simulation and the availability of new covariance data (nuclear data uncertainties) extensive sensitivity and uncertainty studies are needed for quantification of the impact of different sources of uncertainties on the design and safety parameters of HTGRs. Uncertainty and sensitivity studies are an essential component of any significant effort in data and simulation improvement. In February 2009, the Technical Working Group on Gas-Cooled Reactors recommended that the proposed IAEA Coordinated Research Project (CRP) on the HTGR Uncertainty Analysis in Modeling be implemented. In the paper the current status and plan are presented. The CRP will also benefit from interactions with the currently ongoing OECD/NEA Light Water Reactor (LWR) UAM benchmark activity by taking into consideration the peculiarities of HTGR designs and simulation requirements. (author)

  18. Condenser cooling water quality at Kaiga

    International Nuclear Information System (INIS)

    Namboodiri, E.G.A.

    1995-01-01

    Once-through circulation of river water is envisaged in Kaiga for cooling the condenser and other related equipment. Water drawn from Kali river will be used for this purpose. After cooling the condenser, the water is let into the river through the outfall system. The materials used in the cooling water system consist mainly of SS 316 and carbon steel. Chlorination is the treatment proposed to the cooling water. The cooling water quality is found to be satisfactory. (author). 2 refs

  19. Do Orthopaedic Surgeons Acknowledge Uncertainty?

    Science.gov (United States)

    Teunis, Teun; Janssen, Stein; Guitton, Thierry G; Ring, David; Parisien, Robert

    2016-06-01

    Much of the decision-making in orthopaedics rests on uncertain evidence. Uncertainty is therefore part of our normal daily practice, and yet physician uncertainty regarding treatment could diminish patients' health. It is not known if physician uncertainty is a function of the evidence alone or if other factors are involved. With added experience, uncertainty could be expected to diminish, but perhaps more influential are things like physician confidence, belief in the veracity of what is published, and even one's religious beliefs. In addition, it is plausible that the kind of practice a physician works in can affect the experience of uncertainty. Practicing physicians may not be immediately aware of these effects on how uncertainty is experienced in their clinical decision-making. We asked: (1) Does uncertainty and overconfidence bias decrease with years of practice? (2) What sociodemographic factors are independently associated with less recognition of uncertainty, in particular belief in God or other deity or deities, and how is atheism associated with recognition of uncertainty? (3) Do confidence bias (confidence that one's skill is greater than it actually is), degree of trust in the orthopaedic evidence, and degree of statistical sophistication correlate independently with recognition of uncertainty? We created a survey to establish an overall recognition of uncertainty score (four questions), trust in the orthopaedic evidence base (four questions), confidence bias (three questions), and statistical understanding (six questions). Seven hundred six members of the Science of Variation Group, a collaboration that aims to study variation in the definition and treatment of human illness, were approached to complete our survey. This group represents mainly orthopaedic surgeons specializing in trauma or hand and wrist surgery, practicing in Europe and North America, of whom the majority is involved in teaching. Approximately half of the group has more than 10 years

  20. Addressing Uncertainties in Cost Estimates for Decommissioning Nuclear Facilities

    International Nuclear Information System (INIS)

    Benjamin, Serge; Descures, Sylvain; Du Pasquier, Louis; Francois, Patrice; Buonarotti, Stefano; Mariotti, Giovanni; Tarakonov, Jurij; Daniska, Vladimir; Bergh, Niklas; Carroll, Simon; AaSTRoeM, Annika; Cato, Anna; De La Gardie, Fredrik; Haenggi, Hannes; Rodriguez, Jose; Laird, Alastair; Ridpath, Andy; La Guardia, Thomas; O'Sullivan, Patrick; ); Weber, Inge; )

    2017-01-01

    The cost estimation process of decommissioning nuclear facilities has continued to evolve in recent years, with a general trend towards demonstrating greater levels of detail in the estimate and more explicit consideration of uncertainties, the latter of which may have an impact on decommissioning project costs. The 2012 report on the International Structure for Decommissioning Costing (ISDC) of Nuclear Installations, a joint recommendation by the Nuclear Energy Agency (NEA), the International Atomic Energy Agency (IAEA) and the European Commission, proposes a standardised structure of cost items for decommissioning projects that can be used either directly for the production of cost estimates or for mapping of cost items for benchmarking purposes. The ISDC, however, provides only limited guidance on the treatment of uncertainty when preparing cost estimates. Addressing Uncertainties in Cost Estimates for Decommissioning Nuclear Facilities, prepared jointly by the NEA and IAEA, is intended to complement the ISDC, assisting cost estimators and reviewers in systematically addressing uncertainties in decommissioning cost estimates. Based on experiences gained in participating countries and projects, the report describes how uncertainty and risks can be analysed and incorporated in decommissioning cost estimates, while presenting the outcomes in a transparent manner

  1. Forced draft wet cooling systems

    International Nuclear Information System (INIS)

    Daubert, A.; Caudron, L.; Viollet, P.L.

    1975-01-01

    The disposal of the heat released from a 1000 MW power plant needs a natural draft tower of about 130 m diameter at the base and 170 m height, or a cooling system with a draft forced by about forty fans, a hundred meters in diameter and thirty meters in height. The plumes from atmospheric cooling systems form, in terms of fluid mechanics, hot jets in a cross current. They consist of complex flows that must be finely investigated with experimental and computational means. The study, currently being performed at the National Hydraulics Laboratory, shows that as far as the length and height of visible plumes are concerned, the comparison is favorable to some types of forced draft cooling system for low and medium velocities (below 5 or 6 m/s at 10 m height). Beyond these velocities, the forced draft sends the plume up to smaller heights, but the plume is generally more dilute [fr

  2. Cooled Beam Diagnostics on LEIR

    CERN Document Server

    Tranquille, G; Carli, C; Chanel, M; Prieto, V; Sautier, R; Tan, J

    2008-01-01

    Electron cooling is central in the preparation of dense bunches of lead beams for the LHC. Ion beam pulses from LINAC3 are transformed into short high-brightness bunches using multi-turn injection, cooling and accumulation in the Low Energy Ion Ring, LEIR [1]. The cooling process must therefore be continuously monitored in order to guarantee that the lead ions have the required characteristics in terms of beam size and momentum spread. In LEIR a number of systems have been developed to perform these measurements. These include Schottky diagnostics, ionisation profile monitors and scrapers. Along with their associated acquisition and analysis software packages, these instruments have proved to be invaluable for the optimisation of the electron cooler.

  3. An Investigation of the Ranger V-770-8 Engine Installation for the Edo XOSE-1 Airplane I : Cooling

    Science.gov (United States)

    Emmons, M. Arnold; Conway, Robert N.

    1945-01-01

    Engine temperature data and cooling-correlation analyses of the engine and oil cooler are presented in connection with an investigation of the cowling and cooling of the Ranger V-770-8 engine installation in the Edo XOSE-1 airplane. Three types of baffles were installed in the course of the tests: the conventional, the turbulent-flow, and the NACA diffuser baffles. Each of the types was of merit in cooling a different region on the cylinder. Incorporation of the best features of the three types into one baffle, a method which appears to be feasible, would provide improvements in cylinder cooling.

  4. Critical loads - assessment of uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Barkman, A.

    1998-10-01

    The effects of data uncertainty in applications of the critical loads concept were investigated on different spatial resolutions in Sweden and the northern Czech Republic. Critical loads of acidity (CL) were calculated for Sweden using the biogeochemical model PROFILE. Three methods with different structural complexity were used to estimate the adverse effects of SO2 concentrations in the northern Czech Republic. Data uncertainties in the calculated critical loads/levels and exceedances (EX) were assessed using Monte Carlo simulations. Uncertainties within cumulative distribution functions (CDF) were aggregated by accounting for the overlap between site-specific confidence intervals. Aggregation of data uncertainties within CDFs resulted in lower CL and higher EX best estimates in comparison with percentiles represented by individual sites. Data uncertainties were consequently found to advocate larger deposition reductions to achieve non-exceedance based on low critical loads estimates on 150 x 150 km resolution. Input data were found to impair the level of differentiation between geographical units at all investigated resolutions. Aggregation of data uncertainty within CDFs involved more constrained confidence intervals for a given percentile. Differentiation as well as identification of grid cells on 150 x 150 km resolution subjected to EX was generally improved. Calculation of the probability of EX was shown to preserve the possibility to differentiate between geographical units. Re-aggregation of the 95%-ile EX on 50 x 50 km resolution generally increased the confidence interval for each percentile. Significant relationships were found between forest decline and the three methods addressing risks induced by SO2 concentrations. Modifying SO2 concentrations by accounting for the length of the vegetation period was found to constitute the most useful trade-off between structural complexity, data availability and effects of data uncertainty. Data
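
    A toy version of the Monte Carlo treatment of exceedance described above, with assumed normal distributions for the critical load and the deposition (not the PROFILE outputs used in the study), might look like:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Critical load (CL) and deposition are both uncertain; exceedance EX = deposition - CL.
cl_mean, cl_sd = 800.0, 200.0        # eq ha^-1 yr^-1, assumed
dep_mean, dep_sd = 900.0, 150.0      # eq ha^-1 yr^-1, assumed

cl = rng.normal(cl_mean, cl_sd, n)
dep = rng.normal(dep_mean, dep_sd, n)
ex = dep - cl

print("best-estimate exceedance:", ex.mean())
print("probability of exceedance:", np.mean(ex > 0))
print("95th percentile of exceedance:", np.percentile(ex, 95))
```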

  5. Sodium-cooled nuclear reactors

    International Nuclear Information System (INIS)

    Berthoud, Georges; Ducros, Gerard; Feron, Damien; Guerin, Yannick; Latge, Christian; Limoge, Yves; Santarini, Gerard; Seiler, Jean-Marie; Vernaz, Etienne; Guidez, Joel; Andrieux, Catherine; Baque, Francois; Bonin, Bernard; Boullis, Bernard; Cabet, Celine; Carre, Frank; Dufour, Philippe; Gauche, Francois; Grouiller, Jean-Paul; Jeannot, Jean-Philippe; Le Flem, Marion; Le Coz, Pierre; Martin, Laurent; Masson, Michel; Mathonniere, Gilles; Nokhamzon, Jean-Guy; Pelletier, Michel; Rodriguez, Gilles; Saez, Manuel; Seran, Jean-Louis; Varaine, Frederic; Zaetta, Alain; Behar, Christophe; Provitina, Olivier; Lecomte, Michael; Forestier, Alain; Bender, Alexandra; Parisot, Jean-Francois; Finot, Pierre

    2014-01-01

    This book first explains the choice of sodium-cooled reactors by outlining the reasons of the choice of fast neutron reactors (fast neutrons instead of thermal neutrons, recycling opportunity for plutonium, full use of natural uranium, nuclear waste optimization, flexibility of fast neutron reactors in nuclear material management, fast neutron reactors as complements of water-cooled reactors), and by outlining the reasons for the choice of sodium as heat-transfer material. Physical, chemical, and neutron properties of sodium are presented. The second part of the book first presents the main design principles for sodium-cooled fast neutron reactors and their core. The third part proposes an historical overview and an assessment of previously operated sodium-cooled fast neutron reactors (French reactors from Rapsodie to Superphenix, other reactors in the world), and an assessment of the main incidents which occurred in these reactors. It also reports the experience and lessons learned from the dismantling of various sodium-cooled fast breeder reactors in the world. The next chapter addresses safety issues (technical and safety aspects related to the use of sodium) and environmental issues (dosimetry, gaseous and liquid releases, solid wastes, and cooling water). Then, various technological aspects of these reactors are addressed: the energy conversion system, main components, sodium chemistry, sodium-related technology, advances in in-service inspection, materials used in reactors and their behaviour, and fuel system. The next chapter addresses the fuel cycle in these reactors: its integrated specific character, report of the French experience in fast neutron reactor fuel processing, description of the transmutation of minor actinides in these reactors. The last chapter proposes an overview of reactors currently projected or under construction in the world, presents the Astrid project, and gives an assessment of the economy of these reactors. A glossary and an index

  6. Uncertainty modeling and decision support

    International Nuclear Information System (INIS)

    Yager, Ronald R.

    2004-01-01

    We first formulate the problem of decision making under uncertainty. The importance of the representation of our knowledge about the uncertainty in formulating a decision process is pointed out. We begin with a brief discussion of the case of probabilistic uncertainty. Next, in considerable detail, we discuss the case of decision making under ignorance. For this case the fundamental role of the attitude of the decision maker is noted and its subjective nature is emphasized. Next the case in which a Dempster-Shafer belief structure is used to model our knowledge of the uncertainty is considered. Here we also emphasize the subjective choices the decision maker must make in formulating a decision function. The case in which the uncertainty is represented by a fuzzy measure (monotonic set function) is then investigated. We then return to the Dempster-Shafer belief structure and show its relationship to the fuzzy measure. This relationship allows us to get a deeper understanding of the formulation of the decision function used in the Dempster-Shafer framework. We discuss how this deeper understanding allows a decision analyst to better make the subjective choices needed in the formulation of the decision function

  7. Uncertainty Quantification in Numerical Aerodynamics

    KAUST Repository

    Litvinenko, Alexander

    2017-05-16

    We consider the uncertainty quantification problem in aerodynamic simulations. We identify input uncertainties, classify them, suggest an appropriate statistical model and, finally, estimate the propagation of these uncertainties into the solution (pressure, velocity and density fields as well as the lift and drag coefficients). The deterministic problem under consideration is a compressible transonic Reynolds-averaged Navier-Stokes flow around an airfoil with random/uncertain data. Input uncertainties include: uncertain angle of attack, the Mach number, random perturbations in the airfoil geometry, mesh, shock location, turbulence model and parameters of this turbulence model. This problem requires efficient numerical/statistical methods since it is computationally expensive, especially for the uncertainties caused by random geometry variations which involve a large number of variables. In the numerical section we compare five methods, including quasi-Monte Carlo quadrature, polynomial chaos with coefficients determined by sparse quadrature and a gradient-enhanced version of Kriging, radial basis functions and point collocation polynomial chaos, in their efficiency in estimating statistics of aerodynamic performance upon random perturbation to the airfoil geometry [D. Liu et al. '17]. For modeling we used the TAU code, developed at DLR, Germany.

  8. Frequency stabilization of internal-mirror He-Ne lasers by air cooling.

    Science.gov (United States)

    Qian, Jin; Liu, Zhongyou; Shi, Chunying; Liu, Xiuying; Wang, Jianbo; Yin, Cong; Cai, Shan

    2012-09-01

    Instead of the traditional heating method, the cavity length of an internal-mirror He-Ne laser is controlled by air cooling, which is implemented by a mini cooling fan. The responsive property of the cooling fan and the thermal expansion of the internal-mirror laser tube are investigated. According to these investigations, a controlling system is designed to drive the cooling fan controlling the cavity length of the laser. Then the frequency is stabilized by comparing the light intensities of two operating longitudinal modes. The results of beating with an iodine-stabilized He-Ne laser show that a relative uncertainty (Δf/f) of 4.3×10^-9 in 5 months, a frequency fluctuation of <1.4 MHz, and an Allan deviation of 6×10^-11 (τ = 10,000 s) in 20 h are obtained.

  9. Simulated Measurements of Cooling in Muon Ionization Cooling Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Mohayai, Tanaz [IIT, Chicago; Rogers, Chris [Rutherford; Snopok, Pavel [Fermilab

    2016-06-01

    Cooled muon beams set the basis for the exploration of the physics of flavour at a Neutrino Factory and for multi-TeV collisions at a Muon Collider. The international Muon Ionization Cooling Experiment (MICE) measures beam emittance before and after an ionization cooling cell and aims to demonstrate emittance reduction in muon beams. In the current MICE Step IV configuration, the MICE muon beam passes through low-Z absorber material for reducing its transverse emittance through ionization energy loss. Two scintillating fiber tracking detectors, housed in spectrometer solenoid modules upstream and downstream of the absorber, are used for reconstructing the position and momentum of individual muons for calculating transverse emittance reduction. However, due to the existence of non-linear effects in beam optics, transverse emittance growth can be observed. Therefore, it is crucial to develop algorithms that are insensitive to this apparent emittance growth. We describe a different figure of merit for measuring muon cooling: the direct measurement of the phase-space density.

  10. Beam Dynamics With Electron Cooling

    CERN Document Server

    Uesugi, T; Noda, K; Shibuya, S; Syresin, E M

    2004-01-01

    Electron cooling experiments have been carried out at HIMAC in order to develop new technologies in heavy-ion therapy and related research. The cool-stacking method, in particular, has been studied to increase the intensity of heavy ions. The maximum stack intensity was 2 mA, above which fast ion losses occurred simultaneously with vertical coherent oscillations. The instability depends on the working point, the stacked ion density and the electron-beam density. The instability was suppressed by reducing the peak ion density with RF-knockout heating.

  11. New Approaches to Final Cooling

    Energy Technology Data Exchange (ETDEWEB)

    Neuffer, David [Fermilab

    2014-11-10

    A high-energy muon collider scenario requires a “final cooling” system that reduces transverse emittances by a factor of ~10 while allowing longitudinal emittance increase. The baseline approach has low-energy transverse cooling within high-field solenoids, with strong longitudinal heating. This approach and its recent simulation are discussed. Alternative approaches which more explicitly include emittance exchange are also presented. Round-to-flat beam transform, transverse slicing, and longitudinal bunch coalescence are possible components of the alternative approach. A more explicit understanding of solenoidal cooling beam dynamics is introduced.

  12. Quantifying allometric model uncertainty for plot-level live tree biomass stocks with a data-driven, hierarchical framework

    Science.gov (United States)

    Brian J. Clough; Matthew B. Russell; Grant M. Domke; Christopher W. Woodall

    2016-01-01

    Accurate uncertainty assessments of plot-level live tree biomass stocks are an important precursor to estimating uncertainty in annual national greenhouse gas inventories (NGHGIs) developed from forest inventory data. However, current approaches employed within the United States’ NGHGI do not specifically incorporate methods to address error in tree-scale biomass...

  13. Decommissioning Funding: Ethics, Implementation, Uncertainties

    International Nuclear Information System (INIS)

    2007-01-01

    This status report on decommissioning funding: ethics, implementation, uncertainties is based on a review of recent literature and materials presented at NEA meetings in 2003 and 2004, and particularly at a topical session organised in November 2004 on funding issues associated with the decommissioning of nuclear power facilities. The report also draws on the experience of the NEA Working Party on Decommissioning and Dismantling (WPDD). This report offers, in a concise form, an overview of relevant considerations on decommissioning funding mechanisms with regard to ethics, implementation and uncertainties. Underlying ethical principles found in international agreements are identified, and factors influencing the accumulation and management of funds for decommissioning nuclear facilities are discussed together with the main sources of uncertainties of funding systems

  14. On the uncertainty principle. V

    International Nuclear Information System (INIS)

    Halpern, O.

    1976-01-01

    The treatment of ideal experiments connected with the uncertainty principle is continued. The author analyzes successively measurements of momentum and position, and discusses the common reason why the results in all cases differ from the conventional ones. A similar difference exists for the measurement of field strengths. The interpretation given by Weizsaecker, who tried to interpret Bohr's complementarity principle by introducing a multi-valued logic, is analyzed. The treatment of the uncertainty principle ΔE Δt is deferred to a later paper, as is the interpretation of the method of variation of constants. Every ideal experiment discussed shows various lower limits for the value of the uncertainty product; these limits depend on the experimental arrangement and are always (considerably) larger than h. (Auth.)

  15. Uncertainty and Sensitivity Analyses Plan

    International Nuclear Information System (INIS)

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project

  16. Uncertainty in hydrological change modelling

    DEFF Research Database (Denmark)

    Seaby, Lauren Paige

    Hydrological change modelling methodologies generally use climate model outputs to force hydrological simulations under changed conditions. There are nested sources of uncertainty throughout this methodology, including the choice of climate model and subsequent bias correction methods. This Ph.D. study evaluates the uncertainty of the impact of climate change in hydrological simulations given multiple climate models and bias correction methods of varying complexity. Three distribution-based scaling (DBS) methods were developed and benchmarked against a more simplistic and commonly used delta change (DC) approach. These climate model projections were then used to force hydrological simulations under climate change for the island of Sjælland in Denmark to analyse the contribution of different climate models and bias correction methods to overall uncertainty in the hydrological change modelling...

  17. Uncertainty in hydrological change modelling

    DEFF Research Database (Denmark)

    Seaby, Lauren Paige

    Hydrological change modelling methodologies generally use climate model outputs to force hydrological simulations under changed conditions. There are nested sources of uncertainty throughout this methodology, including the choice of climate model and subsequent bias correction methods. This Ph.D. study evaluates the uncertainty of the impact of climate change in hydrological simulations given multiple climate models and bias correction methods of varying complexity. Three distribution-based scaling (DBS) methods were developed and benchmarked against a more simplistic and commonly used delta change (DC) approach. These climate model projections were then used to force hydrological simulations under climate change for the island of Sjælland in Denmark to analyse the contribution of different climate models and bias correction methods to overall uncertainty in the hydrological change modelling...

  18. Uncertainty and Intelligence in Computational Stochastic Mechanics

    Science.gov (United States)

    Ayyub, Bilal M.

    1996-01-01

    Classical structural reliability assessment techniques are based on precise and crisp (sharp) definitions of failure and non-failure (survival) of a structure in meeting a set of strength, function and serviceability criteria. These definitions are provided in the form of performance functions and limit state equations. Thus, the criteria provide a dichotomous definition of what real physical situations represent, in the form of abrupt change from structural survival to failure. However, based on observing the failure and survival of real structures according to the serviceability and strength criteria, the transition from a survival state to a failure state and from serviceability criteria to strength criteria are continuous and gradual rather than crisp and abrupt. That is, an entire spectrum of damage or failure levels (grades) is observed during the transition to total collapse. In the process, serviceability criteria are gradually violated with monotonically increasing level of violation, and progressively lead into the strength criteria violation. Classical structural reliability methods correctly and adequately include the ambiguity sources of uncertainty (physical randomness, statistical and modeling uncertainty) by varying amounts. However, they are unable to adequately incorporate the presence of a damage spectrum, and do not consider in their mathematical framework any sources of uncertainty of the vagueness type. Vagueness can be attributed to sources of fuzziness, unclearness, indistinctiveness, sharplessness and grayness; whereas ambiguity can be attributed to nonspecificity, one-to-many relations, variety, generality, diversity and divergence. Using the nomenclature of structural reliability, vagueness and ambiguity can be accounted for in the form of realistic delineation of structural damage based on subjective judgment of engineers. For situations that require decisions under uncertainty with cost/benefit objectives, the risk of failure should

  19. Uncertainty is a major concern for patients with implantable cardioverter defibrillators.

    Science.gov (United States)

    Flemme, Inger; Hallberg, Ulrika; Johansson, Ingela; Strömberg, Anna

    2011-01-01

    The study objective was to explore the main concern of individuals living with an implantable cardioverter defibrillator (ICD) and how they handle this in daily life. For improved management and follow-up, it is important to understand how the ICD affects the recipient's daily life. A grounded theory method was used. Sixteen Swedish recipients (9 men) living with an ICD for 6 to 24 months were interviewed. The core category, labeled "Incorporating uncertainty in daily life," illuminates the main concern. To handle uncertainty, recipients used the following strategies: restricting activities, distracting oneself, accepting being an ICD recipient, and reevaluating life. Recipients were not paralyzed by uncertainty. Instead, they incorporated uncertainty in life by using strategies to handle their daily life. Questions, comments, and plans for supportive communication were provided, which can be used by healthcare professionals in cardiac rehabilitation. Copyright © 2011 Elsevier Inc. All rights reserved.

  20. Statistics, Uncertainty, and Transmitted Variation

    Energy Technology Data Exchange (ETDEWEB)

    Wendelberger, Joanne Roth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-11-05

    The field of Statistics provides methods for modeling and understanding data and making decisions in the presence of uncertainty. When examining response functions, variation present in the input variables will be transmitted via the response function to the output variables. This phenomenon can potentially have significant impacts on the uncertainty associated with results from subsequent analysis. This presentation will examine the concept of transmitted variation, its impact on designed experiments, and a method for identifying and estimating sources of transmitted variation in certain settings.
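
    A small sketch of transmitted variation, using the first-order (delta-method) approximation for an illustrative response function and comparing it against a Monte Carlo check; the function and input distribution below are assumptions, not taken from the presentation:

```python
import numpy as np

# Transmitted variation: variation in an input x propagates through a response
# function f to the output. For a differentiable f, the first-order (delta
# method) approximation is Var[f(x)] ~= (f'(mu))^2 * Var[x].
f = lambda x: x**2 + 3.0 * x
fprime = lambda x: 2.0 * x + 3.0

mu, sigma = 5.0, 0.4
approx_sd = abs(fprime(mu)) * sigma           # first-order transmitted std dev

rng = np.random.default_rng(1)
x = rng.normal(mu, sigma, 200_000)
mc_sd = f(x).std()                            # Monte Carlo check

print(f"delta-method sd: {approx_sd:.3f}, Monte Carlo sd: {mc_sd:.3f}")
```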

  1. Regulating renewable resources under uncertainty

    DEFF Research Database (Denmark)

    Hansen, Lars Gårn

    Renewable natural resources (like water, fish and wildlife stocks, forests and grazing lands) are critical for the livelihood of millions of people and understanding how they can be managed efficiently is an important economic problem. I show how regulator uncertainty about different economic......) that a pro-quota result under uncertainty about prices and marginal costs is unlikely, requiring that the resource growth function is highly concave locally around the optimum and, 3) that quotas are always preferred if uncertainty about underlying structural economic parameters dominates. These results...

  2. Summary of existing uncertainty methods

    International Nuclear Information System (INIS)

    Glaeser, Horst

    2013-01-01

    A summary of existing and most used uncertainty methods is presented, and the main features are compared. One of these methods is the order statistics method based on Wilks' formula. It is applied in safety research as well as in licensing. This method was first proposed by GRS for use in deterministic safety analysis, and is now used by many organisations world-wide. Its advantage is that the number of potential uncertain input and output parameters is not limited to a small number. Such a limitation was necessary for the first demonstration of the Code Scaling Applicability Uncertainty Method (CSAU) by the United States Nuclear Regulatory Commission (USNRC). They did not apply Wilks' formula in their statistical method propagating input uncertainties to obtain the uncertainty of a single output variable, like peak cladding temperature. A Phenomena Identification and Ranking Table (PIRT) was set up in order to limit the number of uncertain input parameters, and consequently, the number of calculations to be performed. Another purpose of such a PIRT process is to identify the most important physical phenomena which a computer code should be suitable to calculate. The validation of the code should be focused on the identified phenomena. Response surfaces are used in some applications, replacing the computer code for performing a high number of calculations. The second well-known uncertainty method is the Uncertainty Methodology Based on Accuracy Extrapolation (UMAE) and the follow-up method 'Code with the Capability of Internal Assessment of Uncertainty (CIAU)' developed by the University of Pisa. Unlike the statistical approaches, the CIAU does compare experimental data with calculation results. It does not consider uncertain input parameters. Therefore, the CIAU is highly dependent on the experimental database. The accuracy gained from the comparison between experimental data and calculated results is extrapolated to obtain the uncertainty of the system code predictions
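
    For reference, the commonly cited one-sided, first-order form of Wilks' formula chooses the smallest number of code runs n such that 1 - γ^n ≥ β for coverage γ and confidence β; a minimal sketch (the exact variant used by a given methodology may differ):

```python
import math

def wilks_sample_size(coverage=0.95, confidence=0.95):
    """Minimum number of code runs for a one-sided tolerance limit (first-order Wilks)."""
    # n is the smallest integer with 1 - coverage**n >= confidence
    return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

print(wilks_sample_size(0.95, 0.95))   # 59 runs for the classic 95%/95% case
print(wilks_sample_size(0.95, 0.99))   # higher confidence requires more runs
```

    The number of runs is independent of how many uncertain input parameters are sampled, which is the practical advantage noted above.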

  3. Awe, uncertainty, and agency detection.

    Science.gov (United States)

    Valdesolo, Piercarlo; Graham, Jesse

    2014-01-01

    Across five studies, we found that awe increases both supernatural belief (Studies 1, 2, and 5) and intentional-pattern perception (Studies 3 and 4)-two phenomena that have been linked to agency detection, or the tendency to interpret events as the consequence of intentional and purpose-driven agents. Effects were both directly and conceptually replicated, and mediational analyses revealed that these effects were driven by the influence of awe on tolerance for uncertainty. Experiences of awe decreased tolerance for uncertainty, which, in turn, increased the tendency to believe in nonhuman agents and to perceive human agency in random events.

  4. The study on the evaporation cooling efficiency and effectiveness of cooling tower of film type

    International Nuclear Information System (INIS)

    Li Yingjian; You Xinkui; Qiu Qi; Li Jiezhi

    2011-01-01

    Based on the heat and mass transport mechanism of film-type cooling, combined with an on-site test on a counter-flow film-type cooling tower, a mathematical model of the evaporative cooling efficiency and effectiveness has been developed. Under typical climatic conditions, air-conditioning load and operating conditions, the mass and heat balances have been calculated for the air and the cooling water, including the volume of evaporated cooling water. The relationship between the coefficient of performance (COP) and the chiller load has been measured and calculated. The influences of the air and cooling-water parameters on the evaporative cooling efficiency were analyzed for a cooling tower dominated by latent-heat evaporative cooling, and detailed derivation and computation revealed that the evaporative cooling efficiency and the effectiveness are equivalent characteristic parameters of the thermal performance of a cooling tower under identical assumptions.

  5. Minimizing Slope and Kick of Intermediate Bunches for Electron Cooling

    Science.gov (United States)

    Dotson, Andrew

    2017-09-01

    Ions in the Jefferson Lab Electron-Ion Collider (JLEIC) will have transverse energy, which limits the beam density. Electron cooling is a process by which a beam of bunched electrons with small transverse kinetic energy is directed along the ion beam at the same velocity; the ions transfer their transverse kinetic energy to the electron bunches and thereby lose transverse energy. The electron bunches will be supplied by an electron gun. The current required to cool the ion beam can be reached by reusing electrons and incorporating RF kicker cavities that supply a pulsed electric field kicking every 11th bunch out of the cooling ring. An exact solution exists that yields zero kick and slope at all intermediate bunches in the cooling ring, described by a cosine series with 11 terms. The goal of this project is to determine whether solutions exist that are sufficiently close to zero kick and slope but require fewer than 4 kicker cavities. The method used to find these solutions is minimization of an objective function through Sequential Least Squares Programming (SLSQP). A Pareto front then shows the trade-off between average kick and average slope when using 1 through 4 kickers.
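
    The optimization step can be sketched as follows, where the kicker waveform is modelled as a truncated cosine series, the constraint forces a unit kick on the extracted bunch, and SLSQP minimizes the average squared kick and slope seen by the ten intermediate bunches; the waveform model, normalization and objective are illustrative assumptions rather than the project's actual formulation.

```python
import numpy as np
from scipy.optimize import minimize

N_BUNCHES = 11                      # every 11th bunch is kicked out
phases = 2 * np.pi * np.arange(1, N_BUNCHES) / N_BUNCHES   # intermediate bunches

def kick(a, theta):
    """Kicker waveform as a truncated cosine series (illustrative model)."""
    k = np.arange(len(a))
    return np.sum(a[:, None] * np.cos(np.outer(k, theta)), axis=0)

def slope(a, theta):
    k = np.arange(len(a))
    return np.sum(-a[:, None] * k[:, None] * np.sin(np.outer(k, theta)), axis=0)

def objective(a):
    # Average squared kick and slope seen by the intermediate bunches
    return np.mean(kick(a, phases)**2) + np.mean(slope(a, phases)**2)

def run(n_harmonics):
    a0 = np.full(n_harmonics, 1.0 / n_harmonics)
    constraints = [{"type": "eq",
                    "fun": lambda a: kick(a, np.array([0.0]))[0] - 1.0}]
    res = minimize(objective, a0, method="SLSQP", constraints=constraints)
    return res.fun

for n in range(1, 5):
    print(f"{n} harmonic(s): residual kick/slope metric = {run(n):.4f}")
```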

  6. Assessing population viability while accounting for demographic and environmental uncertainty.

    Science.gov (United States)

    Oppel, Steffen; Hilton, Geoff; Ratcliffe, Norman; Fenton, Calvin; Daley, James; Gray, Gerard; Vickery, Juliet; Gibbons, David

    2014-07-01

    Predicting the future trend and viability of populations is an essential task in ecology. Because many populations respond to changing environments, uncertainty surrounding environmental responses must be incorporated into population assessments. However, understanding the effects of environmental variation on population dynamics requires information on several important demographic parameters that are often difficult to estimate. Integrated population models facilitate the integration of time series data on population size and all existing demographic information from a species, allowing the estimation of demographic parameters for which limited or no empirical data exist. Although these models are ideal for assessments of population viability, they have so far not included environmental uncertainty. We incorporated environmental variation in an integrated population model to account for both demographic and environmental uncertainty in an assessment of population viability. In addition, we used this model to estimate true juvenile survival, an important demographic parameter for population dynamics that is difficult to estimate empirically. We applied this model to assess the past and future population trend of a rare island endemic songbird, the Montserrat Oriole Icterus oberi, which is threatened by volcanic activity. Montserrat Orioles experienced lower survival in years with volcanic ashfall, causing periodic population declines that were compensated by higher seasonal fecundity in years with high pre-breeding season rainfall. Due to the inclusion of both demographic and environmental uncertainty in the model, the estimated population growth rate in the immediate future was highly imprecise (95% credible interval 0.844-1.105), and the probability of extinction after three generations (in the year 2028) was low (2.1%). This projection demonstrates that accounting for both demographic and environmental sources of uncertainty provides a more realistic assessment
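
    The two sources of uncertainty can be separated in a projection loop as sketched below: demographic (parametric) uncertainty is drawn once per replicate, environmental variation is drawn each year, and the replicate ensemble yields an extinction probability and an interval on the growth rate. All parameter values and effect sizes are hypothetical placeholders, not the Montserrat Oriole estimates.

```python
import numpy as np

rng = np.random.default_rng(42)
YEARS, REPLICATES, N0 = 15, 5000, 60     # hypothetical projection settings

extinct = 0
growth_rates = []
for _ in range(REPLICATES):
    # Demographic (parametric) uncertainty: one draw per replicate
    survival = rng.beta(30, 10)          # adult survival ~ 0.75 on average
    fecundity = rng.lognormal(mean=np.log(0.8), sigma=0.2)

    n = N0
    for _ in range(YEARS):
        # Environmental uncertainty: rainfall years boost fecundity,
        # ashfall years depress survival (hypothetical effect sizes)
        env_fec = fecundity * rng.lognormal(0.0, 0.3)
        env_sur = survival * (0.8 if rng.random() < 0.1 else 1.0)
        births = rng.poisson(n * env_fec)
        n = rng.binomial(n + births, env_sur)
        if n == 0:
            break
    extinct += (n == 0)
    growth_rates.append((max(n, 1) / N0) ** (1 / YEARS))

lo, hi = np.percentile(growth_rates, [2.5, 97.5])
print(f"extinction probability: {extinct / REPLICATES:.3f}")
print(f"95% interval for annual growth rate: {lo:.3f}-{hi:.3f}")
```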

  7. Cool Runnings For String 2

    CERN Multimedia

    2001-01-01

    String 2 is a series of superconducting magnets that are prototypes of those which will be installed in the LHC. It was cooled down to 1.9 Kelvin on September 14th. On Thursday last week, the dipoles of String 2 were successfully taken to nominal current, 11850 A.

  8. Peltier cooling in molecular junctions

    Science.gov (United States)

    Cui, Longji; Miao, Ruijiao; Wang, Kun; Thompson, Dakotah; Zotti, Linda Angela; Cuevas, Juan Carlos; Meyhofer, Edgar; Reddy, Pramod

    2018-02-01

    The study of thermoelectricity in molecular junctions is of fundamental interest for the development of various technologies including cooling (refrigeration) and heat-to-electricity conversion1-4. Recent experimental progress in probing the thermopower (Seebeck effect) of molecular junctions5-9 has enabled studies of the relationship between thermoelectricity and molecular structure10,11. However, observations of Peltier cooling in molecular junctions—a critical step for establishing molecular-based refrigeration—have remained inaccessible. Here, we report direct experimental observations of Peltier cooling in molecular junctions. By integrating conducting-probe atomic force microscopy12,13 with custom-fabricated picowatt-resolution calorimetric microdevices, we created an experimental platform that enables the unified characterization of electrical, thermoelectric and energy dissipation characteristics of molecular junctions. Using this platform, we studied gold junctions with prototypical molecules (Au-biphenyl-4,4'-dithiol-Au, Au-terphenyl-4,4''-dithiol-Au and Au-4,4'-bipyridine-Au) and revealed the relationship between heating or cooling and charge transmission characteristics. Our experimental conclusions are supported by self-energy-corrected density functional theory calculations. We expect these advances to stimulate studies of both thermal and thermoelectric transport in molecular junctions where the possibility of extraordinarily efficient energy conversion has been theoretically predicted2-4,14.

  9. A cool present for LEIR

    CERN Multimedia

    2005-01-01

    LEIR (Low Energy Ion Ring), which will supply lead ions to the LHC experiments, has taken delivery of one of its key components, its electron cooling system. From left to right, Gérard Tranquille, Virginia Prieto and Roland Sautier, in charge of the electron cooling system for LEIR at CERN, and Christian Lacroix, in charge of installation for the LEIR machine. On 16 December, the day before CERN's annual closure, the LEIR teams received a rather impressive Christmas present. The "parcel" from Russia, measuring 7 metres in length and 4 metres in height, weighed no less than 20 tonnes! The component will, in fact, be one of the key elements of the future LEIR, namely its electron cooling system. LEIR is one of the links in the injector chain that will supply lead ions to the LHC experiments, in particular ALICE (see Bulletin No. 28/2004 of 5 July 2004), within the framework of the I-LHC Project. The electron cooling system is designed to reduce and standardise transverse ion velocity. This focuses the bea...

  10. Chromospheres of Luminous Cool Stars

    Science.gov (United States)

    Dupree, A. K.

    Direct ultraviolet imaging and spectroscopy of Alpha Orionis (Betelgeuse) reveals variable chromospheric structures and mass motions. Spectroscopy also demonstrates the changes of wind opacity, speeds, and mass loss in luminous stars. Cool stars have complex chromospheres that need to be considered in construction of stellar atmospheric models and subsequent spectral analyses.

  11. International Ventilation Cooling Application Database

    DEFF Research Database (Denmark)

    Holzer, Peter; Psomas, Theofanis Ch.; OSullivan, Paul

    2016-01-01

    The currently running International Energy Agency, Energy and Conservation in Buildings, Annex 62 Ventilative Cooling (VC) project, is coordinating research towards extended use of VC. Within this Annex 62 the joint research activity of International VC Application Database has been carried out, ...

  12. System for cooling a cabinet

    DEFF Research Database (Denmark)

    2015-01-01

    The present disclosure relates to a cooling system comprising an active magnetic regenerator having a cold side and a hot side, a hot side heat exchanger connected to the hot side of the magnetic regenerator, one or more cold side heat exchangers, and a cold store reservoir comprising a volume...

  13. Passive Cooling of Body Armor

    Science.gov (United States)

    Holtz, Ronald; Matic, Peter; Mott, David

    2013-03-01

    Warfighter performance can be adversely affected by heat load and weight of equipment. Current tactical vest designs are good insulators and lack ventilation, thus do not provide effective management of metabolic heat generated. NRL has undertaken a systematic study of tactical vest thermal management, leading to physics-based strategies that provide improved cooling without undesirable consequences such as added weight, added electrical power requirements, or compromised protection. The approach is based on evaporative cooling of sweat produced by the wearer of the vest, in an air flow provided by ambient wind or ambulatory motion of the wearer. Using an approach including thermodynamic analysis, computational fluid dynamics modeling, air flow measurements of model ventilated vest architectures, and studies of the influence of fabric aerodynamic drag characteristics, materials and geometry were identified that optimize passive cooling of tactical vests. Specific architectural features of the vest design allow for optimal ventilation patterns, and selection of fabrics for vest construction optimize evaporation rates while reducing air flow resistance. Cooling rates consistent with the theoretical and modeling predictions were verified experimentally for 3D mockups.

  14. Inductive cooling in quantum magnetomechanics

    Science.gov (United States)

    Romero-Sanchez, Erick; Twamley, Jason; Bowen, Warwick P.; Vanner, Michael R.

    Coupling to light or microwave fields allows quantum control of the motion of a mechanical oscillator, and offers prospects for precision sensing, quantum information systems, and tests of fundamental physics. In cavity electromechanics, ground-state cooling has been achieved using resolved sideband cooling. Here we present an alternative approach based on a magnetomechanical system that inductively couples an LC resonator to a mechanical oscillator. The experimental setup consists of a micro-cantilever with a pyramidal magnetic tip attached at the end of the beam. The sharp end of the magnetic tip is positioned close to the planar microfabricated inductor of the LC resonator. Displacement of the end of the cantilever generates a change in flux through the coil, inducing an electromotive force in the circuit; the current in the LC resonator in turn generates a magnetic field and hence a force between the tip and the coil. When they are strongly coupled and the mechanical resonance frequency ωm exceeds the electrical decay rate of the resonator γe, resolved sideband cooling can be used to cool the mechanics. We present estimates for the coupling rates and the experimental parameters required for these experiments. E. Romero acknowledges CONACyT.

  15. Model Uncertainty for Bilinear Hysteric Systems

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    In structural reliability analysis at least three types of uncertainty must be considered, namely physical uncertainty, statistical uncertainty, and model uncertainty (see e.g. Thoft-Christensen & Baker [1]). The physical uncertainty is usually modelled by a number of basic variables by predictive... density functions, Veneziano [2]. In general, model uncertainty is the uncertainty connected with mathematical modelling of the physical reality. When structural reliability analysis is related to the concept of a failure surface (or limit state surface) in the n-dimensional basic variable space then model......

  16. The Effect of Cool Deformation on the Microstructural Evolution and Flow Strength of Microalloyed Steels

    Science.gov (United States)

    Mousavi Anijdan, Seyyed Hashem

    Cool deformation is a process in which a small amount of plastic deformation is applied at temperatures well below the temperature at which the austenite transformation ends. In this thesis, a systematic study was conducted to evaluate the microstructural evolution and mechanical properties of microalloyed steels processed by thermomechanical schedules incorporating cool deformation. Thermodynamic analysis was conducted to predict the equilibrium phases formed in the presence of microalloying elements such as Ti, Nb and Mo, and their appearance was then examined by means of TEM microscopy. As well, continuous cooling torsion (CCT) was employed to study the transformation behavior of the steels for both conditioned and unconditioned austenite. Cool deformation was incorporated into a full-scale simulation of hot rolling, and the effect of prior austenite conditioning on the cool deformability of microalloyed steels was investigated. Out of these studies, a new definition of the no-recrystallization temperature (Tnr) was proposed based on dynamic precipitation, which was then identified in the Nb-bearing steels using TEM analysis as well as flow-curve analysis. Results show that cool deformation greatly improves the strength of microalloyed steels. Of the several mechanisms identified, such as work hardening, precipitation, grain refinement, and strain-induced transformation (SIT) of retained austenite, SIT was proposed, for the first time in microalloyed steels, to be the most significant strengthening mechanism due to deformation in ferrite. Results also show that the effect of precipitation in ferrite is greatly overshadowed by SIT at room temperature. Finally, considering the interplay of SIT and precipitation in the Nb-bearing steels, a rolling schedule was designed, incorporating austenite conditioning, cooling rate and cool deformation, that maximized the strength.

  17. Experimental study of in-and-ex-vessel melt cooling during a severe accident

    International Nuclear Information System (INIS)

    Kim, Sang Baik; Yoo, K. J.; Park, C. K.; Seok, S. D.; Park, R. J.; Yi, S. J.; Kang, K. H.; Ham, Y. S.; Cho, Y. R.; Kim, J. H.; Jeong, J. H.; Shin, K. Y.; Cho, J. S.; Kim, D. H.

    1997-07-01

    After core damage during a severe accident in a nuclear reactor, the degraded core has to be cooled down and the decay heat must be removed in order to halt the accident progression and maintain a stable state. The cooling of core melt is divided into in-vessel and ex-vessel cooling, depending on the location of the molten core, which in turn depends on the timing of vessel failure. Since the cooling mechanism varies with the conditions of the molten core, its surroundings and the related phenomena, it still involves many phenomenological uncertainties. In this study, an experimental study for verification of in-vessel corium cooling and several separate-effect experiments for ex-vessel cooling are carried out to verify in- and ex-vessel cooling phenomena and, finally, to develop accident management strategies and improve engineered reactor designs for severe accidents. The SONATA-IV (Simulation of Naturally Arrested Thermal Attack in Vessel) program has been set up for in-vessel cooling and its verification experiment is in progress, and an integral verification experiment of containment integrity for ex-vessel cooling is planned to be carried out based on the separate-effect experiments performed in the first phase. The first phase of SONATA-IV is a proof-of-principle experiment composed of the LALA (Lower-plenum Arrested Vessel Attack) experiment, to find the gap between the melt and the lower plenum during melt relocation and to confirm melt quenching, and the CHFG (Critical Heat Flux in Gap) experiment, to confirm the heat transfer mechanism in an artificial gap. As separate-effect experiments for ex-vessel cooling, a high-pressure melt ejection experiment related to the initial conditions for debris layer formation in the reactor cavity, a crust formation and heat transfer experiment in the molten pool, and a molten core-concrete interaction experiment are performed. (author). 150 refs., 24 tabs., 127 figs

  18. Fossil fuel and biomass burning effect on climate - heating or cooling?

    International Nuclear Information System (INIS)

    Kaufman, Y.J.; Fraser, R.S.; Mahoney, R.L.

    1991-01-01

    Emission from burning of fossil fuels and biomass (associated with deforestation) generates a radiative forcing on the atmosphere and a possible climate change. Emitted trace gases heat the atmosphere through their greenhouse effect, while particulates formed from emitted SO2 cause cooling by increasing cloud albedos through alteration of droplet size distributions. This paper reviews the characteristics of the cooling effect and applies Twomey's theory to check whether the radiative balance favours heating or cooling for the cases of fossil fuel and biomass burning. It is also shown that although coal and oil emit 120 times as many CO2 molecules as SO2 molecules, each SO2 molecule is 50-1100 times more effective in cooling the atmosphere (through the effect of aerosol particles on cloud albedo) than a CO2 molecule is in heating it. Note that this ratio accounts for the large difference in the aerosol (3-10 days) and CO2 (7-100 years) lifetimes. It is concluded that the cooling effect from coal and oil burning may presently range from 0.4 to 8 times the heating effect. Within this large uncertainty, it is presently more likely that fossil fuel burning causes cooling of the atmosphere rather than heating. Biomass burning associated with deforestation, on the other hand, is more likely to cause heating of the atmosphere than cooling since its aerosol cooling effect is only half that from fossil fuel burning and its heating effect is twice as large. Future increases in coal and oil burning, and the resultant increase in concentration of cloud condensation nuclei, may saturate the cooling effect, allowing the heating effect to dominate. For a doubling in the CO2 concentration due to fossil fuel burning, the cooling effect is expected to be 0.1 to 0.3 of the heating effect. 75 refs., 8 tabs

  19. Experimental study of in-and-ex-vessel melt cooling during a severe accident

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sang Baik; Yoo, K. J.; Park, C. K.; Seok, S. D.; Park, R. J.; Yi, S. J.; Kang, K. H.; Ham, Y. S.; Cho, Y. R.; Kim, J. H.; Jeong, J. H.; Shin, K. Y.; Cho, J. S.; Kim, D. H.

    1997-07-01

    After core damage during a severe accident in a nuclear reactor, the degraded core has to be cooled down and the decay heat must be removed in order to halt the accident progression and maintain a stable state. The cooling of core melt is divided into in-vessel and ex-vessel cooling, depending on the location of the molten core, which in turn depends on the timing of vessel failure. Since the cooling mechanism varies with the conditions of the molten core, its surroundings and the related phenomena, it still involves many phenomenological uncertainties. In this study, an experimental study for verification of in-vessel corium cooling and several separate-effect experiments for ex-vessel cooling are carried out to verify in- and ex-vessel cooling phenomena and, finally, to develop accident management strategies and improve engineered reactor designs for severe accidents. The SONATA-IV (Simulation of Naturally Arrested Thermal Attack in Vessel) program has been set up for in-vessel cooling and its verification experiment is in progress, and an integral verification experiment of containment integrity for ex-vessel cooling is planned to be carried out based on the separate-effect experiments performed in the first phase. The first phase of SONATA-IV is a proof-of-principle experiment composed of the LALA (Lower-plenum Arrested Vessel Attack) experiment, to find the gap between the melt and the lower plenum during melt relocation and to confirm melt quenching, and the CHFG (Critical Heat Flux in Gap) experiment, to confirm the heat transfer mechanism in an artificial gap. As separate-effect experiments for ex-vessel cooling, a high-pressure melt ejection experiment related to the initial conditions for debris layer formation in the reactor cavity, a crust formation and heat transfer experiment in the molten pool, and a molten core-concrete interaction experiment are performed. (author). 150 refs., 24 tabs., 127 figs.

  20. CFD Model Development and validation for High Temperature Gas Cooled Reactor Cavity Cooling System (RCCS) Applications

    Energy Technology Data Exchange (ETDEWEB)

    Hassan, Yassin [Univ. of Wisconsin, Madison, WI (United States); Texas A & M Univ., College Station, TX (United States)]; Corradini, Michael; Tokuhiro, Akira; Wei, Thomas Y.C.

    2014-07-14

    The Reactor Cavity Cooling System (RCCS) is a passive safety system that will be incorporated in the VHTR design. The system was designed to remove heat from the reactor cavity and maintain the temperature of structures and concrete walls under desired limits during normal operation (steady state) and accident scenarios. A small-scale (1:23) water-cooled experimental facility was scaled, designed, and constructed in order to study the complex thermohydraulic phenomena taking place in the RCCS during steady-state and transient conditions. The facility represents a portion of the reactor vessel with nine stainless steel coolant risers and utilizes water as coolant. The facility was equipped with instrumentation to measure temperatures and flow rates, and a general verification was completed during the shakedown. A model of the experimental facility was prepared using RELAP5-3D and simulations were performed to validate the scaling procedure. The experimental data produced during the steady-state run were compared with the simulation results obtained using RELAP5-3D. The overall behavior of the facility met expectations. The facility was confirmed to be very promising for performing additional experimental tests, including flow visualization, and for producing data for code validation.

  1. Heat exchanger with auxiliary cooling system

    Science.gov (United States)

    Coleman, John H.

    1980-01-01

    A heat exchanger with an auxiliary cooling system capable of cooling a nuclear reactor should the normal cooling mechanism become inoperable. A cooling coil is disposed around vertical heat transfer tubes that carry secondary coolant therethrough and is located in a downward flow of primary coolant that passes in heat transfer relationship with both the cooling coil and the vertical heat transfer tubes. A third coolant is pumped through the cooling coil which absorbs heat from the primary coolant which increases the downward flow of the primary coolant thereby increasing the natural circulation of the primary coolant through the nuclear reactor.

  2. Gas turbine heat transfer and cooling technology

    CERN Document Server

    Han, Je-Chin; Ekkad, Srinath

    2012-01-01

    Fundamentals: Need for Turbine Blade Cooling; Turbine-Cooling Technology; Turbine Heat Transfer and Cooling Issues; Structure of the Book; Review Articles and Book Chapters on Turbine Cooling and Heat Transfer; New Information from 2000 to 2010; References. Turbine Heat Transfer: Introduction; Turbine-Stage Heat Transfer; Cascade Vane Heat-Transfer Experiments; Cascade Blade Heat Transfer; Airfoil Endwall Heat Transfer; Turbine Rotor Blade Tip Heat Transfer; Leading-Edge Region Heat Transfer; Flat-Surface Heat Transfer; New Information from 2000 to 2010; Closure; References. Turbine Film Cooling: Introduction; Film Cooling on Rotat

  3. Cooling vests with phase change material packs: the effects of temperature gradient, mass and covering area.

    Science.gov (United States)

    Gao, Chuansi; Kuklane, Kalev; Holmer, Ingvar

    2010-05-01

    Phase change material (PCM) absorbs or releases latent heat when it changes phase, making thermally regulated clothing possible. The objective of this study was to quantify the relationships between PCM cooling rate and temperature gradient, mass and covering area on a thermal manikin in a climatic chamber. Three PCM melting temperatures (24, 28 and 32 degrees C), different masses and covering areas, and two manikin temperatures (34 and 38 degrees C) were used. The results showed that the cooling rate of the PCM vests tested is positively correlated with the temperature gradient between the thermal manikin and the PCM melting temperature. The required temperature gradient is suggested to be greater than 6 degrees C when PCM vests are used in hot climates. With the same temperature gradient, the cooling rate is mainly determined by the covering area. The duration of the cooling effect depends on the PCM mass and the latent heat. STATEMENT OF RELEVANCE: The study of factors affecting the cooling rate of personal cooling equipment incorporating PCM helps in understanding cooling mechanisms. The results suggest that climatic conditions, the required temperature gradient, PCM mass and covering area should be taken into account when choosing personal PCM cooling equipment.
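
    A back-of-envelope estimate shows how the reported dependencies combine: cooling power scales with the skin-to-melting-point gradient and the covered area, while duration follows from the latent-heat budget divided by that power. The heat-transfer coefficient and PCM properties below are placeholder values, not the study's measurements.

```python
# Back-of-envelope estimate of PCM vest cooling power and duration
# (placeholder property values; not the measured results of the study).

h = 50.0            # W/(m^2*K), assumed effective heat-transfer coefficient
area = 0.30         # m^2, body area covered by PCM packs
t_skin = 34.0       # degC, manikin/skin temperature
t_melt = 28.0       # degC, PCM melting temperature
mass = 2.0          # kg of PCM in the vest
latent_heat = 200e3 # J/kg, assumed latent heat of fusion

gradient = t_skin - t_melt                 # 6 K, the suggested minimum gradient
cooling_power = h * area * gradient        # W, scales with gradient and area
duration_h = mass * latent_heat / cooling_power / 3600.0

print(f"cooling power: {cooling_power:.0f} W")
print(f"estimated duration until the PCM is fully melted: {duration_h:.1f} h")
```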

  4. Closed circuit steam cooled turbine shroud and method for steam cooling turbine shroud

    Science.gov (United States)

    Burdgick, Steven Sebastian; Sexton, Brendan Francis; Kellock, Iain Robertson

    2002-01-01

    A turbine shroud cooling cavity is partitioned to define a plurality of cooling chambers for sequentially receiving cooling steam and impingement cooling of the radially inner wall of the shroud. An impingement baffle is provided in each cooling chamber for receiving the cooling medium, from a cooling medium inlet in the case of the first chamber or from the immediately upstream chamber in the case of the second through fourth chambers, and includes a plurality of impingement holes for effecting the impingement cooling of the shroud inner wall.

  5. Uncertainty assessment using uncalibrated objects:

    DEFF Research Database (Denmark)

    Meneghello, R.; Savio, Enrico; Larsen, Erik

    This report is made as a part of the project Easytrac, an EU project under the programme: Competitive and Sustainable Growth: Contract No: G6RD-CT-2000-00188, coordinated by UNIMETRIK S.A. (Spain). The project is concerned with low uncertainty calibrations on coordinate measuring machines...

  6. Uncertainty of dustfall monitoring results

    Directory of Open Access Journals (Sweden)

    Martin A. van Nierop

    2017-06-01

    Fugitive dust can cause a nuisance and pollute the ambient environment, particularly around human activities such as construction and industrial sites and mining operations. As such, dustfall monitoring has been conducted for many decades in South Africa, yet little has been published on the repeatability, uncertainty, accuracy and precision of dustfall monitoring. Repeatability assesses the consistency of the results of a particular measurement under the same conditions; here the consistency of the laboratory is assessed to determine the uncertainty associated with dustfall monitoring conducted by the laboratory. The aim of this study was to improve the understanding of the uncertainty in dustfall monitoring, and thereby the confidence in dustfall monitoring. The uncertainty of dustfall monitoring was assessed through a 12-month study of 12 sites located on the boundary of the study area. Each site contained a directional dustfall sampler, modified by removing the rotating lid, with four buckets (A, B, C and D) installed. Having four buckets on one stand exposes each bucket to the same conditions for the same period of time; therefore, equal amounts of dust should be deposited in each bucket. The difference in the weight (mg) of the dust recorded from each bucket at each site was determined using American Society for Testing and Materials method D1739 (ASTM D1739). The variability of the dust among the buckets provides the confidence level of dustfall monitoring when reporting to clients.
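
    The replicate-bucket design lends itself to a simple repeatability estimate, sketched below with made-up masses for the four buckets at one site; the statistics (repeatability standard deviation and an expanded uncertainty of the site mean) are generic, not the study's reported values.

```python
import numpy as np

# Hypothetical monthly dustfall masses (mg) from the four co-located buckets
# A, B, C and D at one site; the values below are illustrative only.
buckets = np.array([152.0, 147.0, 158.0, 150.0])

mean = buckets.mean()
sd = buckets.std(ddof=1)                 # repeatability standard deviation
rsd = 100.0 * sd / mean                  # relative standard deviation, %

# Expanded uncertainty of the site mean (coverage factor k = 2, ~95 %)
u_mean = sd / np.sqrt(len(buckets))
print(f"mean deposition: {mean:.1f} mg, repeatability SD: {sd:.1f} mg "
      f"({rsd:.1f} %), expanded uncertainty of the mean: {2 * u_mean:.1f} mg")
```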

  7. Uncertainty quantification and error analysis

    Energy Technology Data Exchange (ETDEWEB)

    Higdon, Dave M [Los Alamos National Laboratory]; Anderson, Mark C [Los Alamos National Laboratory]; Habib, Salman [Los Alamos National Laboratory]; Klein, Richard [Los Alamos National Laboratory]; Berliner, Mark [Ohio State Univ.]; Covey, Curt [LLNL]; Ghattas, Omar [Univ. of Texas]; Graziani, Carlo [Univ. of Chicago]; Seager, Mark [LLNL]; Sefcik, Joseph [LLNL]; Stark, Philip [UC Berkeley]; Stewart, James [SNL]

    2010-01-01

    UQ studies all sources of error and uncertainty, including: systematic and stochastic measurement error; ignorance; limitations of theoretical models; limitations of numerical representations of those models; limitations on the accuracy and reliability of computations, approximations, and algorithms; and human error. A more precise definition for UQ is suggested below.

  8. Uncertainty covariances in robotics applications

    International Nuclear Information System (INIS)

    Smith, D.L.

    1984-01-01

    The application of uncertainty covariance matrices in the analysis of robot trajectory errors is explored. First, relevant statistical concepts are reviewed briefly. Then, a simple, hypothetical robot model is considered to illustrate methods for error propagation and performance test data evaluation. The importance of including error correlations is emphasized
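
    The error-propagation step mentioned above is commonly a first-order mapping of the joint-space covariance to task space, Sigma_xy = J Sigma_q J^T. The sketch below applies it to a planar two-link arm with illustrative geometry and joint-angle uncertainties; it is a generic example, not the hypothetical robot model of the paper.

```python
import numpy as np

# Planar two-link arm (illustrative geometry): propagate the covariance of
# the joint angles to the covariance of the end-effector position using
# the Jacobian, Sigma_xy = J Sigma_q J^T (first-order propagation).
l1, l2 = 0.5, 0.4                       # link lengths, m
q1, q2 = np.deg2rad([30.0, 45.0])       # nominal joint angles

J = np.array([
    [-l1*np.sin(q1) - l2*np.sin(q1+q2), -l2*np.sin(q1+q2)],
    [ l1*np.cos(q1) + l2*np.cos(q1+q2),  l2*np.cos(q1+q2)],
])

sigma_q = np.deg2rad(0.5)               # 0.5 deg std dev on each joint
rho = 0.3                               # assumed correlation between joints
Sigma_q = sigma_q**2 * np.array([[1.0, rho], [rho, 1.0]])

Sigma_xy = J @ Sigma_q @ J.T
print("end-effector position covariance (m^2):\n", Sigma_xy)
print("position std devs (mm):", 1e3 * np.sqrt(np.diag(Sigma_xy)))
```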

  9. Subsidized Capacity Investment under Uncertainty

    NARCIS (Netherlands)

    Wen, Xingang; Hagspiel, V.; Kort, Peter

    2017-01-01

    This paper studies how the subsidy support, e.g. price support and reimbursed investment cost support, affects the investment decision of a monopoly firm under uncertainty and analyzes the implications for social welfare. The analytical results show that the unconditional, i.e., subsidy support that

  10. Modelling of Transport Projects Uncertainties

    DEFF Research Database (Denmark)

    Salling, Kim Bang; Leleur, Steen

    2009-01-01

    This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests to combine the principle of Optimism Bias, which depicts the historical tendency of overestimating transport related benefits and underestimating......-based graphs which function as risk-related decision support for the appraised transport infrastructure project....

  11. Uncertainty in the Real World

    Indian Academy of Sciences (India)

    Uncertainty in the Real World - Fuzzy Sets. Satish Kumar. General Article, Resonance – Journal of Science Education, Volume 4, Issue 2, February 1999, pp 37-47. Permanent link: https://www.ias.ac.in/article/fulltext/reso/004/02/0037-0047

  12. Knowledge Uncertainty and Composed Classifier

    Czech Academy of Sciences Publication Activity Database

    Klimešová, Dana; Ocelíková, E.

    2007-01-01

    Vol. 1, No. 2 (2007), pp. 101-105. ISSN 1998-0140. Institutional research plan: CEZ:AV0Z10750506. Keywords: boosting architecture; contextual modelling; composed classifier; knowledge management; knowledge; uncertainty. Subject RIV: IN - Informatics, Computer Science

  13. Labeling uncertainty in multitarget tracking

    NARCIS (Netherlands)

    Aoki, E.H.; Mandal, Pranab K.; Svensson, Lennart; Boers, Y.; Bagchi, Arunabha

    In multitarget tracking, the problem of track labeling (assigning labels to tracks) is an ongoing research topic. The existing literature, however, lacks an appropriate measure of uncertainty related to the assigned labels that has a sound mathematical basis as well as clear practical meaning to the

  14. Model uncertainty in growth empirics

    NARCIS (Netherlands)

    Prüfer, P.

    2008-01-01

    This thesis applies so-called Bayesian model averaging (BMA) to three different economic questions substantially exposed to model uncertainty. Chapter 2 addresses a major issue of modern development economics: the analysis of the determinants of pro-poor growth (PPG), which seeks to combine high

  15. Uncertainty Principles and Fourier Analysis

    Indian Academy of Sciences (India)

    Uncertainty Principles and Fourier Analysis. Alladi Sitaram. General Article, Resonance – Journal of Science Education, Volume 4, Issue 2, February 1999, pp 20-23. Permanent link: http://www.ias.ac.in/article/fulltext/reso/004/02/0020-0023

  16. Uncertainty in the Real World

    Indian Academy of Sciences (India)

    Uncertainty in the Real World - Fuzzy Sets. Satish Kumar. General Article, Resonance – Journal of Science Education, Volume 4, Issue 2, February 1999, pp 37-47. Permanent link: http://www.ias.ac.in/article/fulltext/reso/004/02/0037-0047

  17. Measured performance of a 3 ton LiBr absorption water chiller and its effect on cooling system operation

    Science.gov (United States)

    Namkoong, D.

    1976-01-01

    A three ton lithium bromide absorption water chiller was tested for a number of conditions involving hot water input, chilled water, and the cooling water. The primary influences on chiller capacity were the hot water inlet temperature and the cooling water inlet temperature. One combination of these two parameters extended the output to as much as 125% of design capacity, but no combination could lower the capacity to below 60% of design. A cooling system was conceptually designed so that it could provide several modes of operation. Such flexibility is needed for any solar cooling system to be able to accommodate the varying solar energy collection and the varying building demand. It was concluded that a three-ton absorption water chiller with the kind of performance that was measured can be incorporated into a cooling system such as that proposed, to provide efficient cooling over the specified ranges of operating conditions.

  18. Measured performance of a 3-ton LiBr absorption water chiller and its effect on cooling system operation

    Science.gov (United States)

    Namkoong, D.

    1976-01-01

    A 3-ton lithium bromide absorption water chiller was tested for a number of conditions involving hot-water input, chilled water, and the cooling water. The primary influences on chiller capacity were the hot water inlet temperature and the cooling water inlet temperature. One combination of these two parameters extended the output to as much as 125% of design capacity, but no combination could lower the capacity to below 60% of design. A cooling system was conceptually designed so that it could provide several modes of operation. Such flexibility is needed for any solar cooling system to be able to accommodate the varying solar energy collection and the varying building demand. It is concluded that a 3-ton absorption water chiller with the kind of performance that was measured can be incorporated into a cooling system such as that proposed, to provide efficient cooling over the specified ranges of operating conditions.

  19. Passive safety features in current and future water cooled reactors

    International Nuclear Information System (INIS)

    1990-11-01

    A better understanding of the passive safety systems and components in current and future water-cooled reactors may enhance the safety of present reactors, to the extent that passive features are backfitted. This better understanding should also improve the safety of future reactors, which can incorporate more of these features. Passive safety systems and components may help to prevent accidents, core damage, or the release of radionuclides to the environment. The Technical Committee Meeting, which was hosted by the USSR State Committee for Utilization of Nuclear Energy, was attended by about 80 experts from 16 IAEA Member States and the NEA-OECD. A total of 21 papers were presented during the meeting. The objective of the meeting was to review and discuss passive safety systems and features of current and future water-cooled reactor designs and to exchange information in this area of activity. A separate abstract was prepared for each of the 21 papers published in these proceedings. Refs, figs and tabs

  20. Structural Damage Assessment under Uncertainty

    Science.gov (United States)

    Lopez Martinez, Israel

    Structural damage assessment has applications in the majority of engineering structures and mechanical systems, ranging from aerospace vehicles to manufacturing equipment. The primary goals of any structural damage assessment and health monitoring system are to ascertain the condition of a structure and to provide an evaluation of changes as a function of time, as well as providing an early warning of an unsafe condition. Many structural health monitoring and assessment techniques have been developed in research using numerical simulations and scaled structural experiments. However, the transition from research to real-world structures has been rather slow. One major reason for this slow progress is the existence of uncertainty in every step of the damage assessment process. This dissertation research involved the experimental and numerical investigation of uncertainty in vibration-based structural health monitoring and the development of robust detection and localization methods. The basic premise of vibration-based structural health monitoring is that changes in structural characteristics, such as stiffness, mass and damping, will affect the global vibration response of the structure. The diagnostic performance of a vibration-based monitoring system is affected by uncertainty sources such as measurement errors, environmental disturbances and parametric modeling uncertainties. To address diagnostic errors due to irreducible uncertainty, a pattern recognition framework for damage detection has been developed for use in continuous monitoring of structures. The robust damage detection approach developed is based on an ensemble of dimensionality-reduction algorithms for improved damage-sensitive feature extraction. For damage localization, an experimental structural model was determined based on output-only modal analysis. An experimental model correlation technique is developed in which the discrepancies between the undamaged and damaged modal data are

  1. Incorporating psychological influences in probabilistic cost analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kujawski, Edouard; Alvaro, Mariana; Edwards, William

    2004-01-08

    Today's typical probabilistic cost analysis assumes an "ideal" project that is devoid of the human and organizational considerations that heavily influence the success and cost of real-world projects. In the real world "Money Allocated Is Money Spent" (the MAIMS principle); cost underruns are rarely available to protect against cost overruns, while task overruns are passed on to the total project cost. Realistic cost estimates therefore require a modified probabilistic cost analysis that simultaneously models the cost management strategy, including budget allocation. Psychological influences, such as overconfidence in assessing uncertainties, and dependencies among cost elements and risks are other important considerations that are generally not addressed. It should then be no surprise that actual projects often exceed their initial cost estimates and are delivered late and/or with a reduced scope. This paper presents a practical probabilistic cost analysis model that incorporates recent findings on human behavior and judgment under uncertainty, dependencies among cost elements, the MAIMS principle, and project management practices. Uncertain cost elements are elicited from experts using the direct fractile assessment method and fitted with three-parameter Weibull distributions. The full correlation matrix is specified in terms of two parameters that characterize correlations among cost elements in the same and in different subsystems. The analysis is readily implemented using standard Monte Carlo simulation tools such as @Risk and Crystal Ball®. The analysis of a representative design and engineering project substantiates that today's typical probabilistic cost analysis is likely to severely underestimate project cost for probability-of-success values of importance to contractors and procuring activities. The proposed approach provides a framework for developing a viable cost management strategy for
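
    The core of such an analysis is a Monte Carlo roll-up of correlated cost elements. The sketch below uses three-parameter Weibull marginals tied together by a Gaussian copula and contrasts a plain roll-up with a MAIMS-style roll-up in which element costs cannot fall below their budgets; all parameter values, the copula choice, and the budget rule are illustrative assumptions, not the paper's calibrated model.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
N = 20_000

# Three-parameter Weibull marginals (shape c, location, scale) for three
# cost elements, in M$; the parameters and correlations are illustrative.
marginals = [stats.weibull_min(c=1.8, loc=2.0, scale=1.5),
             stats.weibull_min(c=2.2, loc=4.0, scale=2.5),
             stats.weibull_min(c=1.5, loc=1.0, scale=1.0)]

corr = np.array([[1.0, 0.6, 0.3],       # higher correlation within a subsystem
                 [0.6, 1.0, 0.3],
                 [0.3, 0.3, 1.0]])

# Gaussian copula: correlated standard normals -> uniforms -> Weibull quantiles
z = rng.multivariate_normal(np.zeros(3), corr, size=N)
u = stats.norm.cdf(z)
costs = np.column_stack([m.ppf(u[:, i]) for i, m in enumerate(marginals)])
total = costs.sum(axis=1)

# Under the MAIMS principle, element costs below budget are not recovered:
budgets = np.array([m.ppf(0.5) for m in marginals])   # budget set at the median
total_maims = np.maximum(costs, budgets).sum(axis=1)

for name, t in [("correlated roll-up", total), ("MAIMS roll-up", total_maims)]:
    print(f"{name}: mean {t.mean():.1f} M$, "
          f"80th percentile {np.percentile(t, 80):.1f} M$")
```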

  2. Uncertainty governance: an integrated framework for managing and communicating uncertainties

    International Nuclear Information System (INIS)

    Umeki, H.; Naito, M.; Takase, H.

    2004-01-01

    Treatment of uncertainty, or in other words reasoning with imperfect information, is widely recognised as being of great importance within performance assessment (PA) of geological disposal, mainly because of the time scale of interest and the spatial heterogeneity that the geological environment exhibits. A wide range of formal methods have been proposed for the optimal processing of incomplete information. Many of these methods rely on the use of numerical information, the frequency-based concept of probability in particular, to handle the imperfections. However, taking quantitative information as a base for models that solve the problem of handling imperfect information merely creates another problem, i.e., how to provide the quantitative information. In many situations this second problem proves more resistant to solution, and in recent years several authors have looked at particularly ingenious ways of handling it in accordance with the rules of well-founded methods such as Bayesian probability theory, possibility theory, and the Dempster-Shafer theory of evidence. Those methods, while drawing inspiration from quantitative methods, do not require the kind of complete numerical information required by quantitative methods. Instead they provide information that, though less precise than that provided by quantitative techniques, is often, if not sufficient, the best that could be achieved. Rather than searching for the best method for handling all imperfect information, our strategy for uncertainty management, that is, recognition and evaluation of the uncertainties associated with PA followed by planning and implementation of measures to reduce them, is to use whichever method best fits the problem at hand. Such an eclectic position leads naturally to integration of the different formalisms. While uncertainty management based on the combination of semi-quantitative methods forms an important part of our framework for uncertainty governance, it only solves half of the problem

  3. Predicting ecological responses in a changing ocean: the effects of future climate uncertainty.

    Science.gov (United States)

    Freer, Jennifer J; Partridge, Julian C; Tarling, Geraint A; Collins, Martin A; Genner, Martin J

    2018-01-01

    Predicting how species will respond to climate change is a growing field in marine ecology, yet knowledge of how to incorporate the uncertainty from future climate data into these predictions remains a significant challenge. To help overcome it, this review separates climate uncertainty into its three components (scenario uncertainty, model uncertainty, and internal model variability) and identifies four criteria that constitute a thorough interpretation of an ecological response to climate change in relation to these parts (awareness, access, incorporation, communication). Through a literature review, the extent to which the marine ecology community has addressed these criteria in their predictions was assessed. Despite a high awareness of climate uncertainty, articles favoured the most severe emission scenario, and only a subset of climate models were used as input into ecological analyses. In the case of sea surface temperature, these models can have projections that are unrepresentative of a larger ensemble mean. Moreover, 91% of studies failed to incorporate the internal variability of a climate model into results. We explored the influence that the choice of emission scenario, climate model, and model realisation can have when predicting the future distribution of the pelagic fish Electrona antarctica. Future distributions were highly influenced by the choice of climate model, and in some cases, internal variability was important in determining the direction and severity of the distribution change. Increased clarity and availability of processed climate data would facilitate more comprehensive explorations of climate uncertainty, and increase the quality and standard of marine prediction studies.

  4. A combined capillary cooling system for cooling fuel cells

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Ana Paula; Pelizza, Pablo Rodrigo; Galante, Renan Manozzo; Bazzo, Edson [Universidade Federal de Santa Catarina (LabCET/UFSC), Florianopolis, SC (Brazil). Dept. de Engenharia Mecanica. Lab. de Combustao e Engenharia de Sistemas Termicos], Emails: ana@labcet.ufsc.br, pablo@labcet.ufsc.br, renan@labcet.ufsc.br, ebazzo@emc.ufsc.br

    2010-07-01

    Control of the operating temperature has an important influence on the performance of a PEMFC (Proton Exchange Membrane Fuel Cell). A two-phase heat transfer system is proposed as an alternative for cooling and thermal control of the PEMFC. The proposed system consists of a CPL (Capillary Pumped Loop) connected to a set of constant-conductance heat pipes. In this work, a ceramic wick and stainless steel mesh wicks have been used as the capillary structures of the CPL and heat pipes, respectively. Acetone has been used as the working fluid for the CPL and deionized water for the heat pipes. Experiments on three stainless steel heat pipes of 1/4 inch outer diameter and one CPL have been carried out and the results are presented in this paper. Further experiments are planned coupling the proposed cooling system to a module which simulates the fuel cell. (author)

  5. On the evaluation of a fuel assembly design by means of uncertainty and sensitivity measures

    International Nuclear Information System (INIS)

    Jaeger, Wadim; Sanchez Espinoza, Victor Hugo

    2012-01-01

    This paper will provide results of an uncertainty and sensitivity study in order to calculate parameters of safety related importance like the fuel centerline temperature, the cladding temperature and the fuel assembly pressure drop of a lead-alloy cooled fast system. Applying best practice guidelines, a list of uncertain parameters has been identified. The considered parameter variations are based on the experience gained during fabrication and operation of former and existing liquid metal cooled fast systems as well as on experimental results and on engineering judgment. (orig.)
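
    A minimal sketch of such a propagation and sensitivity study: uncertain inputs are sampled over assumed ranges, pushed through a crude analytic surrogate for the fuel centerline temperature, and ranked by Spearman rank correlation. The surrogate, ranges and dimensions are invented stand-ins for the actual thermal-hydraulic model of a lead-alloy cooled fast system.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
N = 200                                   # number of sampled runs (illustrative)

# Uncertain inputs (illustrative ranges): coolant inlet temperature (K),
# linear heat rate (kW/m), gap conductance (kW/m^2K), pellet conductivity (W/mK)
inputs = {
    "T_inlet":   rng.uniform(670.0, 690.0, N),
    "q_linear":  rng.uniform(25.0, 35.0, N),
    "h_gap":     rng.uniform(3.0, 6.0, N),
    "k_fuel":    rng.uniform(2.0, 3.0, N),
}

def centerline_temperature(T_in, q, h_gap, k_fuel):
    """Crude analytic surrogate for the fuel centerline temperature (K);
    a stand-in for the system code, not a validated model."""
    d_coolant_clad = 40.0                      # K, assumed film + cladding rise
    d_gap = q / (np.pi * 0.009 * h_gap)        # gap temperature rise
    d_pellet = q * 1e3 / (4.0 * np.pi * k_fuel)
    return T_in + d_coolant_clad + d_gap + d_pellet

T_cl = centerline_temperature(inputs["T_inlet"], inputs["q_linear"],
                              inputs["h_gap"], inputs["k_fuel"])

print(f"95th percentile of centerline temperature: {np.percentile(T_cl, 95):.0f} K")
for name, values in inputs.items():
    rho, _ = stats.spearmanr(values, T_cl)
    print(f"  sensitivity (Spearman rho) to {name:9s}: {rho:+.2f}")
```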

  6. On the evaluation of a fuel assembly design by means of uncertainty and sensitivity measures

    Energy Technology Data Exchange (ETDEWEB)

    Jaeger, Wadim; Sanchez Espinoza, Victor Hugo [Karlsruhe Institute of Technology (KIT), Eggenstein-Leopoldshafen (Germany). Inst. for Neutron Physics and Reactor Technology

    2012-11-15

    This paper will provide results of an uncertainty and sensitivity study in order to calculate parameters of safety related importance like the fuel centerline temperature, the cladding temperature and the fuel assembly pressure drop of a lead-alloy cooled fast system. Applying best practice guidelines, a list of uncertain parameters has been identified. The considered parameter variations are based on the experience gained during fabrication and operation of former and existing liquid metal cooled fast systems as well as on experimental results and on engineering judgment. (orig.)

  7. Cooling rate of chondrules in ordinary chondrites revisited by a new geospeedometer based on the compensation rule

    Science.gov (United States)

    Béjina, Frédéric; Sautter, Violaine; Jaoul, Olivier

    2009-01-01

    For several decades, efforts to constrain chondrule cooling rates from diffusion zoning in olivine gave rise to a range of values from 5 to 8400 K/h (Desch, S.J., Connolly Jr., H.C., 2002. A model for the thermal processing of particles in solar nebula shocks: application to cooling rates of chondrules. Meteorit. Planet. Sci. 37, 183-208; Greeney, S., Ruzicka, A., 2004. Relict forsterite in chondrules: implications for cooling rates. Lunar Planet. Sci. XXXV, abstract #1246). Such large uncertainties directly reflect the variability of diffusion data. From this variability, however, results a compensation rule, log D0 = a + bE (where diffusion coefficients are written D = D0 exp(-E/RT)). We test a new geospeedometry approach, based on this rule, on the cooling of chondrules in the chondrites Sahara-97210 LL 3.2 and Wells LL 3.3. Greeney and Ruzicka (2004) matched Fe-Mg diffusion profiles in olivine from these chondrites with cooling rates between 200 and 6000 K/h. In our geospeedometry model, the use of the compensation rule greatly reduces the uncertainties by avoiding the choice of one diffusion coefficient among many. The cooling rates we found are between 700 and 3600 K/h for Sahara and 700-1600 K/h for Wells. Finally, we discuss the influence of our analytical model parameters on our cooling rate estimates.
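
    The role of the compensation rule can be illustrated with a few lines of code: once an activation energy E is chosen, log D0 follows from the rule rather than being picked independently, and D(T) can be evaluated directly. The constants a and b below are placeholders, not the fitted values of the paper.

```python
import numpy as np

R = 8.314            # J/(mol K)

# Compensation rule: log10(D0) = a + b * E, so choosing E fixes D0 as well.
# The constants below are placeholders, not the fitted values of the paper.
a, b = -9.0, 1.5e-5            # b in 1/(J/mol)

def diffusivity(E, T):
    """Fe-Mg diffusion coefficient D = D0 * exp(-E / (R*T)), with D0 tied
    to E through the compensation rule."""
    D0 = 10.0 ** (a + b * E)
    return D0 * np.exp(-E / (R * T))

for E in (200e3, 250e3, 300e3):            # J/mol, a range of plausible values
    D = diffusivity(E, T=1800.0)           # near-liquidus chondrule temperature
    print(f"E = {E/1e3:.0f} kJ/mol -> D(1800 K) = {D:.2e} m^2/s")
```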

  8. Stereo-particle image velocimetry uncertainty quantification

    Science.gov (United States)

    Bhattacharya, Sayantan; Charonko, John J.; Vlachos, Pavlos P.

    2017-01-01

    Particle image velocimetry (PIV) measurements are subject to multiple elemental error sources and thus estimating overall measurement uncertainty is challenging. Recent advances have led to a posteriori uncertainty estimation methods for planar two-component PIV. However, no complete methodology exists for uncertainty quantification in stereo PIV. In the current work, a comprehensive framework is presented to quantify the uncertainty stemming from stereo registration error and combine it with the underlying planar velocity uncertainties. The disparity in particle locations of the dewarped images is used to estimate the positional uncertainty of the world coordinate system, which is then propagated to the uncertainty in the calibration mapping function coefficients. Next, the calibration uncertainty is combined with the planar uncertainty fields of the individual cameras through an uncertainty propagation equation and uncertainty estimates are obtained for all three velocity components. The methodology was tested with synthetic stereo PIV data for different light sheet thicknesses, with and without registration error, and also validated with an experimental vortex ring case from 2014 PIV challenge. Thorough sensitivity analysis was performed to assess the relative impact of the various parameters to the overall uncertainty. The results suggest that in absence of any disparity, the stereo PIV uncertainty prediction method is more sensitive to the planar uncertainty estimates than to the angle uncertainty, although the latter is not negligible for non-zero disparity. Overall the presented uncertainty quantification framework showed excellent agreement between the error and uncertainty RMS values for both the synthetic and the experimental data and demonstrated reliable uncertainty prediction coverage. This stereo PIV uncertainty quantification framework provides the first comprehensive treatment on the subject and potentially lays foundations applicable to volumetric
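
    The flavour of the propagation step can be shown for the simplest symmetric two-camera geometry, where the out-of-plane component follows from the difference of the two planar displacements and both the planar uncertainties and an angular (registration-type) uncertainty contribute; the geometry and numbers are illustrative, not the paper's general framework.

```python
import numpy as np

# Symmetric stereo arrangement: cameras at +/- alpha in the x-z plane.
# Reconstruction for the symmetric case:
#   u = (u1 + u2) / 2,   w = (u1 - u2) / (2 tan(alpha))
alpha = np.deg2rad(35.0)          # camera half-angle (illustrative)
u1, u2 = 2.10, 1.85               # planar displacements from the two cameras, px
s_u1, s_u2 = 0.06, 0.08           # planar (2C-PIV) uncertainties, px
s_alpha = np.deg2rad(0.2)         # registration/calibration angle uncertainty

u = 0.5 * (u1 + u2)
w = (u1 - u2) / (2.0 * np.tan(alpha))

# First-order propagation of the planar and angular uncertainties
s_u = 0.5 * np.hypot(s_u1, s_u2)
s_w = np.sqrt((s_u1**2 + s_u2**2) / (4.0 * np.tan(alpha)**2)
              + ((u1 - u2) / (2.0 * np.sin(alpha)**2))**2 * s_alpha**2)

print(f"u = {u:.3f} +/- {s_u:.3f} px,  w = {w:.3f} +/- {s_w:.3f} px")
```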

  9. Stereo-particle image velocimetry uncertainty quantification

    International Nuclear Information System (INIS)

    Bhattacharya, Sayantan; Vlachos, Pavlos P; Charonko, John J

    2017-01-01

    Particle image velocimetry (PIV) measurements are subject to multiple elemental error sources and thus estimating overall measurement uncertainty is challenging. Recent advances have led to a posteriori uncertainty estimation methods for planar two-component PIV. However, no complete methodology exists for uncertainty quantification in stereo PIV. In the current work, a comprehensive framework is presented to quantify the uncertainty stemming from stereo registration error and combine it with the underlying planar velocity uncertainties. The disparity in particle locations of the dewarped images is used to estimate the positional uncertainty of the world coordinate system, which is then propagated to the uncertainty in the calibration mapping function coefficients. Next, the calibration uncertainty is combined with the planar uncertainty fields of the individual cameras through an uncertainty propagation equation and uncertainty estimates are obtained for all three velocity components. The methodology was tested with synthetic stereo PIV data for different light sheet thicknesses, with and without registration error, and also validated with an experimental vortex ring case from 2014 PIV challenge. Thorough sensitivity analysis was performed to assess the relative impact of the various parameters to the overall uncertainty. The results suggest that in absence of any disparity, the stereo PIV uncertainty prediction method is more sensitive to the planar uncertainty estimates than to the angle uncertainty, although the latter is not negligible for non-zero disparity. Overall the presented uncertainty quantification framework showed excellent agreement between the error and uncertainty RMS values for both the synthetic and the experimental data and demonstrated reliable uncertainty prediction coverage. This stereo PIV uncertainty quantification framework provides the first comprehensive treatment on the subject and potentially lays foundations applicable to volumetric

  10. Optimal Time to Invest Energy Storage System under Uncertainty Conditions

    Directory of Open Access Journals (Sweden)

    Yongma Moon

    2014-04-01

    This paper proposes a model to determine the optimal investment time for energy storage systems (ESSs) in a price arbitrage trade application under conditions of uncertainty over future profits. The adoption of ESSs can generate profits from price arbitrage trade, which are uncertain because the future marginal prices of electricity will change depending on supply and demand. In addition, since the investment is optional, an investor can delay adopting an ESS until it becomes profitable, and can decide the optimal time. Thus, when we evaluate this investment, we need to incorporate the investor's option, which is not captured by traditional evaluation methods. In order to incorporate these aspects, we applied real option theory to our proposed model, which provides an optimal investment threshold. Our results concerning the optimal time to invest show that if the future profits expected from arbitrage trade become more uncertain, an investor needs to wait longer to invest. Also, improvement in the efficiency of ESSs can reduce the uncertainty of the arbitrage profit and, consequently, the reduced uncertainty enables earlier ESS investment, even for the same power capacity. Besides, when a higher rate of profits is expected and ESS costs are higher, an investor needs to wait longer. Also, by comparing a widely used net present value model to our real option model, we show that the net present value method underestimates the value of ESS investment and misleads the investor into investing earlier.
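
    The qualitative result, that greater profit uncertainty raises the investment threshold and delays adoption, can be reproduced with the textbook (McDonald-Siegel) real-option threshold for a project value following geometric Brownian motion; this standard formula is used here only as an illustration and is not the paper's arbitrage-specific model.

```python
import numpy as np

def investment_threshold(I, r=0.05, delta=0.03, sigma=0.25):
    """Textbook (McDonald-Siegel) real-option threshold for investing cost I
    in a project whose value follows geometric Brownian motion."""
    a = (r - delta) / sigma**2
    beta1 = 0.5 - a + np.sqrt((a - 0.5)**2 + 2.0 * r / sigma**2)
    return beta1 / (beta1 - 1.0) * I

I = 1_000_000   # ESS investment cost, e.g. $1M (illustrative)
for sigma in (0.15, 0.25, 0.40):   # volatility of future arbitrage profits
    V_star = investment_threshold(I, sigma=sigma)
    print(f"sigma = {sigma:.2f}: invest once project value exceeds "
          f"${V_star:,.0f} (NPV rule would say ${I:,.0f})")
```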

  11. Insight from uncertainty: bootstrap-derived diffusion metrics differentially predict memory function among older adults.

    Science.gov (United States)

    Vorburger, Robert S; Habeck, Christian G; Narkhede, Atul; Guzman, Vanessa A; Manly, Jennifer J; Brickman, Adam M

    2016-01-01

    Diffusion tensor imaging suffers from an intrinsically low signal-to-noise ratio. Bootstrap algorithms have been introduced as a non-parametric way to estimate the uncertainty of the measured diffusion parameters. To quantify the variability of the principal diffusion direction, bootstrap-derived metrics such as the cone of uncertainty have been proposed. However, bootstrap-derived metrics are not independent of the underlying diffusion profile: a higher mean diffusivity lowers the signal-to-noise ratio and thus increases the measurement uncertainty, and the goodness of fit of the tensor model, which depends strongly on the complexity of the underlying diffusion profile, influences bootstrap-derived metrics as well. The presented simulations clearly depict the cone of uncertainty as a function of the underlying diffusion profile. Because the relationship between the cone of uncertainty and common diffusion parameters, such as mean diffusivity and fractional anisotropy, is not linear, the cone of uncertainty offers a distinct sensitivity. In vivo analysis of the fornix reveals the cone of uncertainty to be a predictor of memory function among older adults, whereas no significant correlation is found for the common diffusion parameters. The present work not only demonstrates the cone of uncertainty as a function of the actual diffusion profile, but also establishes the cone of uncertainty as a sensitive predictor of memory function. Future studies should incorporate bootstrap-derived metrics to provide a more comprehensive analysis.
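
    To make the idea of a bootstrap-derived cone of uncertainty concrete, the following self-contained simulation (a simplified stand-in for the authors' pipeline, with hypothetical acquisition parameters) resamples repeated diffusion-weighted measurements, refits the tensor, and reports the 95% cone half-angle of the principal direction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical acquisition: 6 gradient directions, 10 repeats, b = 1000 s/mm^2.
bvecs = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1],
                  [1, 1, 0], [1, 0, 1], [0, 1, 1]], float)
bvecs /= np.linalg.norm(bvecs, axis=1, keepdims=True)
b, S0, noise, n_rep = 1000.0, 1.0, 0.02, 10
D_true = np.diag([1.7e-3, 0.3e-3, 0.3e-3])           # prolate tensor, e1 along x

signal = S0 * np.exp(-b * np.einsum('ij,jk,ik->i', bvecs, D_true, bvecs))
data = signal + noise * rng.standard_normal((n_rep, len(bvecs)))

# Log-linear tensor fit: unknowns Dxx, Dyy, Dzz, Dxy, Dxz, Dyz.
g = bvecs
X = -b * np.column_stack([g[:, 0]**2, g[:, 1]**2, g[:, 2]**2,
                          2*g[:, 0]*g[:, 1], 2*g[:, 0]*g[:, 2], 2*g[:, 1]*g[:, 2]])

def principal_direction(meas):
    d = np.linalg.lstsq(X, np.log(meas / S0), rcond=None)[0]
    D = np.array([[d[0], d[3], d[4]], [d[3], d[1], d[5]], [d[4], d[5], d[2]]])
    return np.linalg.eigh(D)[1][:, -1]                # eigenvector of the largest eigenvalue

# Repetition bootstrap: resample the repeats of each direction, refit, collect e1.
e1_mean = principal_direction(data.mean(axis=0))
cosines = []
for _ in range(1000):
    idx = rng.integers(0, n_rep, size=(n_rep, len(bvecs)))
    meas = np.take_along_axis(data, idx, axis=0).mean(axis=0)
    cosines.append(abs(principal_direction(meas) @ e1_mean))
half_angle_95 = np.degrees(np.arccos(np.clip(np.percentile(cosines, 5), -1, 1)))
print(f"cone of uncertainty (95% half-angle): {half_angle_95:.2f} deg")
```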

  12. Theoretical analysis of the performance of different cooling strategies with the concept of cool exergy

    DEFF Research Database (Denmark)

    Kazanci, Ongun Berk; Shukuya, Masanori; Olesen, Bjarne W.

    2016-01-01

    The whole chains of exergy flows for different cooling systems were compared. The effects of cooling demand (internal vs. external solar shading), space cooling method (floor cooling vs. air cooling with ventilation system), and the availability of a nearby natural heat sink (intake air… for the ventilation system being outdoor air vs. air from the crawl-space, and air-to-water heat pump vs. ground heat exchanger as cooling source) on system exergy performance were investigated. It is crucial to minimize the cooling demand because it is possible to use a wide range of heat sinks (ground, lake, sea…-water, etc.) and indoor terminal units, only with a minimized demand. The water-based floor cooling system performed better than the air-based cooling system; when an air-to-water heat pump was used as the cooling source, the required exergy input was 28% smaller for the floor cooling system. The auxiliary...
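
    A minimal numeric illustration of the "cool exergy" concept, using the standard Carnot-factor expression for heat extracted at a temperature below the environment; the cooling load and temperatures are hypothetical, and this is not the paper's full exergy-chain analysis.

```python
def cool_exergy_rate(q_cooling_w, t_space_k, t_env_k):
    # Magnitude of the exergy rate associated with removing heat q at a space
    # temperature below the environmental (reference) temperature.
    return q_cooling_w * (t_env_k / t_space_k - 1.0)

# e.g. 1 kW of cooling delivered to a 26 C room with a 33 C outdoor reference
print(f"{cool_exergy_rate(1000.0, 26 + 273.15, 33 + 273.15):.1f} W of cool exergy")
```

    The small result (roughly 23 W of exergy for 1 kW of cooling) illustrates why low-grade natural heat sinks can, in principle, cover a minimized cooling demand.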

  13. Experimental uncertainty estimation and statistics for data having interval uncertainty.

    Energy Technology Data Exchange (ETDEWEB)

    Kreinovich, Vladik (Applied Biomathematics, Setauket, New York); Oberkampf, William Louis (Applied Biomathematics, Setauket, New York); Ginzburg, Lev (Applied Biomathematics, Setauket, New York); Ferson, Scott (Applied Biomathematics, Setauket, New York); Hajagos, Janos (Applied Biomathematics, Setauket, New York)

    2007-05-01

    This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
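
    As a small illustration of descriptive statistics on interval data (an illustrative sketch, not the report's algorithms), the code below bounds the sample mean and variance of a few intervals: the mean bounds follow from the all-lower and all-upper endpoint configurations, the variance maximum is attained at a vertex of the box because the sample variance is convex, and the variance minimum is obtained by convex minimization over the box.

```python
import itertools
import numpy as np
from scipy.optimize import minimize

def mean_bounds(intervals):
    # Exact bounds on the sample mean: all-lower vs. all-upper endpoints.
    a = np.array(intervals, float)
    return a[:, 0].mean(), a[:, 1].mean()

def variance_bounds(intervals):
    # Upper bound: enumerate endpoint configurations (feasible only for small n;
    # the general problem is computationally hard). Lower bound: convex
    # minimization of the sample variance over the box of intervals.
    a = np.array(intervals, float)
    v_max = max(np.var(p, ddof=1) for p in itertools.product(*a))
    res = minimize(lambda x: np.var(x, ddof=1), a.mean(axis=1),
                   bounds=[tuple(r) for r in a])
    return res.fun, v_max

# Example: three overlapping measurement intervals
ivals = [(1.0, 2.0), (1.5, 2.5), (1.8, 3.0)]
print("mean bounds:    ", mean_bounds(ivals))
print("variance bounds:", variance_bounds(ivals))
```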

  14. Incorporating Parameter Uncertainty in Bayesian Segmentation Models: Application to Hippocampal Subfield Volumetry

    DEFF Research Database (Denmark)

    Iglesias, J. E.; Sabuncu, M. R.; Van Leemput, Koen

    2012-01-01

    Many successful segmentation algorithms are based on Bayesian models in which prior anatomical knowledge is combined with the available image information. However, these methods typically have many free parameters that are estimated to obtain point estimates only, whereas a faithful Bayesian anal...
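
    The contrast drawn above, between plugging in point estimates of the free parameters and marginalizing over their uncertainty, can be sketched with a toy one-dimensional, two-class example (purely illustrative; not the authors' hippocampal subfield method, and all numbers are hypothetical).

```python
import numpy as np

rng = np.random.default_rng(0)

x = 1.2                                                     # observed intensity of one voxel
mu_samples = rng.normal([0.0, 2.0], 0.3, size=(500, 2))     # posterior draws of the class means
sigma = 0.8                                                 # fixed, known noise level

def class_posterior(x, mu):
    # Posterior class probabilities for equal priors and Gaussian likelihoods.
    like = np.exp(-0.5 * ((x - mu) / sigma) ** 2)
    return like / like.sum()

p_point = class_posterior(x, mu_samples.mean(axis=0))                       # plug-in point estimate
p_marg = np.mean([class_posterior(x, mu) for mu in mu_samples], axis=0)     # marginalized over parameters
print("plug-in:", p_point.round(3), " marginalized:", p_marg.round(3))
```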

  15. Land-cover impacts on streamflow: a change-detection modelling approach that incorporates parameter uncertainty

    Science.gov (United States)

    Jan Seibert; Jeffrey J. McDonnell

    2010-01-01

    The effect of land-use or land-cover change on stream runoff dynamics is not fully understood. In many parts of the world, forest management is the major land-cover change agent. While the paired catchment approach has been the primary methodology used to quantify such effects, it is only possible for small headwater catchments where there is uniformity in...

  16. Towards reliable mapping of biosecurity risk: incorporating uncertainty and decision-makers’ risk aversion

    Science.gov (United States)

    Denys Yemshanov; Frank H. Koch; Mark Ducey; Robert A. Haack

    2015-01-01

    Pest risk maps are an important source of decision support when devising strategies to minimize introductions of invasive organisms and mitigate their impacts. When possible management responses to an invader include costly or socially sensitive activities, decision makers tend to follow a more certain (i.e. risk-averse) course of action. We present a new mapping...

  17. A hierarchical Bayesian model to incorporate uncertainty into methods for diversity partitioning.

    Science.gov (United States)

    Marion, Zachary H; Fordyce, James A; Fitzpatrick, Benjamin M

    2018-04-01

    Recently there have been major theoretical advances in the quantification and partitioning of diversity within and among communities, regions, and ecosystems. However, applying those advances to real data remains a challenge. Ecologists often end up describing their samples rather than estimating the diversity components of an underlying study system, and existing approaches do not easily provide statistical frameworks for testing ecological questions. Here we offer one avenue to do all of the above using a hierarchical Bayesian approach. We estimate posterior distributions of the underlying "true" relative abundances of each species within each unit sampled. These posterior estimates of relative abundance can then be used with existing formulae to estimate and partition diversity. The result is a posterior distribution of diversity metrics describing our knowledge (or beliefs) about the study system. This approach intuitively leads to statistical inferences addressing biologically motivated hypotheses via Bayesian model comparison. Using simulations, we demonstrate that our approach does as well or better at approximating the "true" diversity of a community relative to naïve or ad-hoc bias-corrected estimates. Moreover, model comparison correctly distinguishes between alternative hypotheses about the distribution of diversity within and among samples. Finally, we use an empirical ecological dataset to illustrate how the approach can be used to address questions about the makeup and diversities of assemblages at local and regional scales. © 2018 by the Ecological Society of America.
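
    A simplified, non-hierarchical version of this idea can be sketched with a conjugate Dirichlet posterior over the "true" relative abundances of one sample, from which a posterior distribution of a diversity metric follows; the counts and prior below are hypothetical, and the paper's full model is hierarchical.

```python
import numpy as np

rng = np.random.default_rng(1)

def shannon_hill(p):
    # Hill number of order 1 (exponential of Shannon entropy).
    p = p[p > 0]
    return np.exp(-np.sum(p * np.log(p)))

def posterior_diversity(counts, alpha=1.0, n_draws=4000):
    # Posterior of within-sample diversity from observed species counts, using a
    # conjugate Dirichlet prior on the underlying relative abundances.
    draws = rng.dirichlet(np.asarray(counts) + alpha, size=n_draws)
    return np.array([shannon_hill(p) for p in draws])

# Example: one community sample with 5 species
d = posterior_diversity([50, 30, 10, 5, 5])
print(f"posterior mean diversity: {d.mean():.2f}  95% CI: "
      f"({np.percentile(d, 2.5):.2f}, {np.percentile(d, 97.5):.2f})")
```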

  18. Modular high-temperature gas-cooled reactor core heatup accident simulations

    International Nuclear Information System (INIS)

    Ball, S.J.; Conklin, J.C.

    1989-01-01

    The design features of the modular high-temperature gas-cooled reactor (HTGR) have the potential to make it essentially invulnerable to damage from postulated core heatup accidents. Simulations of long-term loss-of-forced-convection (LOFC) accidents, both with and without depressurization of the primary coolant and with only passive cooling available to remove afterheat, have shown that maximum core temperatures stay below the point at which fuel failures and fission product releases are expected. Sensitivity studies have also been performed to determine the effect of prediction errors arising both from modeling uncertainties and from assumptions about operational parameters. 4 refs., 5 figs.

  19. Survey and Evaluate Uncertainty Quantification Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Guang; Engel, David W.; Eslinger, Paul W.

    2012-02-01

    The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that will develop and deploy state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately the widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models with uncertainty quantification, optimization, risk analysis and decision making capabilities. The CCSI Toolset will incorporate commercial and open-source software currently in use by industry and will also develop new software tools as necessary to fill technology gaps identified during execution of the project. The CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. The goal of CCSI is to deliver a toolset that can simulate the scale-up of a broad set of new carbon capture technologies from laboratory scale to full commercial scale. To provide a framework around which the toolset can be developed and demonstrated, we will focus on three Industrial Challenge Problems (ICPs) related to carbon capture technologies relevant to U.S. pulverized coal (PC) power plants. Post combustion capture by solid sorbents is the technology focus of the initial ICP (referred to as ICP A). The goal of the uncertainty quantification (UQ) task (Task 6) is to provide a set of capabilities to the user community for the quantification of uncertainties associated with the carbon

  20. Uncertainty analysis of geothermal energy economics

    Science.gov (United States)

    Sener, Adil Caner

    This dissertation research endeavors to explore geothermal energy economics by assessing and quantifying the uncertainties associated with the nature of geothermal energy and energy investments overall. The study introduces a stochastic geothermal cost model and a valuation approach for different geothermal power plant development scenarios. The Monte Carlo simulation technique is employed to obtain probability distributions of geothermal energy development costs and project net present values. In the study, a stochastic cost model with an incorporated dependence structure is defined and compared with a model in which the random variables are treated as independent inputs. One of the goals of the study is to shed light on the long-standing problem of modeling dependence between random input variables. The dependence between random input variables is modeled by employing the method of copulas. The study focuses on four main types of geothermal power generation technologies and introduces a stochastic levelized cost model for each technology. Moreover, we also compare the levelized costs of natural gas combined-cycle and coal-fired power plants with those of geothermal power plants. The input data used in the model rely on cost data recently reported by government agencies and non-profit organizations, such as the Department of Energy, National Laboratories, California Energy Commission and Geothermal Energy Association. The second part of the study introduces a stochastic discounted cash flow valuation model for the geothermal technologies analyzed in the first phase. In this phase of the study, the Integrated Planning Model (IPM) software was used to forecast the revenue streams of geothermal assets under different price and regulation scenarios. These results are then combined to create a stochastic revenue forecast of the power plants. The uncertainties in gas prices and environmental regulations will be modeled and their potential impacts will be
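
    A compact sketch of the kind of Monte Carlo levelized-cost calculation described above, with a Gaussian copula used to impose dependence between the random inputs; the marginal distributions, correlations, and financial assumptions are hypothetical placeholders rather than the dissertation's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 100_000

# Gaussian copula: correlated uniforms obtained from a correlated multivariate normal.
corr = np.array([[1.0, 0.6, 0.3],
                 [0.6, 1.0, 0.2],
                 [0.3, 0.2, 1.0]])
z = rng.multivariate_normal(np.zeros(3), corr, size=n)
u = stats.norm.cdf(z)

# Hypothetical marginals: capital cost ($/kW), fixed O&M ($/kW-yr), capacity factor.
capex = stats.triang.ppf(u[:, 0], c=0.4, loc=3500, scale=3000)
om = stats.lognorm.ppf(u[:, 1], s=0.25, scale=110)
cf = stats.beta.ppf(u[:, 2], a=20, b=3)

# Simple levelized cost of electricity ($/MWh), 7% discount rate over 30 years.
crf = 0.07 * (1 + 0.07) ** 30 / ((1 + 0.07) ** 30 - 1)   # capital recovery factor
mwh_per_kw = 8760 * cf / 1000
lcoe = (capex * crf + om) / mwh_per_kw

print(f"LCOE mean {lcoe.mean():.1f} $/MWh, 5th-95th percentile: "
      f"{np.percentile(lcoe, 5):.1f}-{np.percentile(lcoe, 95):.1f}")
```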