WorldWideScience

Sample records for rise methodology estimates

  1. Methodology for estimating human perception to tremors in high-rise buildings

    Science.gov (United States)

    Du, Wenqi; Goh, Key Seng; Pan, Tso-Chien

    2017-07-01

    Human perception of tremors during earthquakes in high-rise buildings is usually associated with psychological discomfort such as fear and anxiety. This paper presents a methodology for estimating the level of perception of tremors for occupants of high-rise buildings subjected to ground motion excitations. Unlike other approaches based on empirical or historical data, the proposed methodology performs a regression analysis using the analytical results of two generic models of 15 and 30 stories. Ground motions recorded in Singapore are collected and modified for structural response analyses. Simple predictive models are then developed to estimate the perception level based on a proposed ground motion intensity parameter: the average response spectrum intensity in the period range between 0.1 and 2.0 s. These models can be used to predict the percentage of occupants in high-rise buildings who may perceive the tremors at a given ground motion intensity. Furthermore, the models are validated against two recent tremor events reportedly felt in Singapore; the estimated results match reasonably well with reports in the local newspapers and from the authorities. The proposed methodology is applicable to urban regions where people living in high-rise buildings might feel tremors during earthquakes.
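The proposed intensity parameter lends itself to a direct sketch. The function below is one plausible reading of it (averaging the response-spectrum ordinates over 0.1–2.0 s with the trapezoidal rule); the function name and the toy spectrum are illustrative assumptions, not the paper's data or code:

```python
import numpy as np

def average_spectrum_intensity(periods, spectrum, t_min=0.1, t_max=2.0):
    """Average response-spectrum ordinate over [t_min, t_max] seconds,
    computed with the trapezoidal rule (illustrative implementation)."""
    m = (periods >= t_min) & (periods <= t_max)
    t, s = periods[m], spectrum[m]
    area = np.sum(0.5 * (s[1:] + s[:-1]) * np.diff(t))
    return area / (t[-1] - t[0])

# toy spectrum: constant 0.02 g over the whole period range
t = np.linspace(0.05, 3.0, 300)
s = np.full_like(t, 0.02)
print(round(average_spectrum_intensity(t, s), 3))  # → 0.02
```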

  2. Methodology of project management at implementation of projects of high-rise construction

    Science.gov (United States)

    Papelniuk, Oksana

    2018-03-01

    High-rise construction is a promising direction in urban development. The opportunity to place a large amount of living and commercial space on a relatively small land plot makes high-rise construction very attractive for developers. However, investment projects for the construction of high-rise buildings are expensive and complex, which makes their effective management a key task for the construction company. The best tool in this area today is the methodology of project management, which becomes a key factor of efficiency.

  3. Methodology for determining the investment attractiveness of construction of high-rise buildings

    Science.gov (United States)

    Nezhnikova, Ekaterina; Kashirin, Valentin; Davydova, Yana; Kazakova, Svetlana

    2018-03-01

    The article presents an analysis of the existing methods for assessing the investment attractiveness of high-rise construction. The authors determined and justified the primary choice of objects and territories that are the most attractive for the development of high-rise construction. A system of risk indicators has been developed that allows a quantitative adjustment to be made for a particular project when evaluating the efficiency of investment projects. The study is aimed at developing basic methodological concepts for a comparative evaluation of the prospects of construction of high-rise facilities, concepts that take into consideration the features of investment in construction and enable quantitative evaluation of investment effectiveness in high-rise construction.

  4. Methodology for generating waste volume estimates

    International Nuclear Information System (INIS)

    Miller, J.Q.; Hale, T.; Miller, D.

    1991-09-01

    This document describes the methodology that will be used to calculate waste volume estimates for site characterization and remedial design/remedial action activities at each of the DOE Field Office, Oak Ridge (DOE-OR) facilities. This standardized methodology is designed to ensure consistency in waste estimating across the various sites and organizations that are involved in environmental restoration activities. The criteria and assumptions that are provided for generating these waste estimates will be implemented across all DOE-OR facilities and are subject to change based on comments received and actual waste volumes measured during future sampling and remediation activities. 7 figs., 8 tabs

  5. Estimates of the Economic Effects of Sea Level Rise

    International Nuclear Information System (INIS)

    Darwin, R.F.; Tol, R.S.J.

    2001-01-01

    Regional estimates of direct cost (DC) are commonly used to measure the economic damages of sea level rise. Such estimates suffer from three limitations: (1) values of threatened endowments are not well known, (2) loss of endowments does not affect consumer prices, and (3) international trade is disregarded. Results in this paper indicate that these limitations can significantly affect economic assessments of sea level rise. Current uncertainty regarding endowment values (as reflected in two alternative data sets), for example, leads to a 17 percent difference in coastal protection, a 36 percent difference in the amount of land protected, and a 36 percent difference in DC globally. Also, global losses in equivalent variation (EV), a welfare measure that accounts for price changes, are 13 percent higher than DC estimates. Regional EV losses may be up to 10 percent lower than regional DC, however, because international trade tends to redistribute losses from regions with relatively high damages to regions with relatively low damages. 43 refs

  6. Cost estimating for CERCLA remedial alternatives a unit cost methodology

    International Nuclear Information System (INIS)

    Brettin, R.W.; Carr, D.J.; Janke, R.J.

    1995-06-01

    The United States Environmental Protection Agency (EPA) Guidance for Conducting Remedial Investigations and Feasibility Studies Under CERCLA, Interim Final, dated October 1988 (EPA 1988) requires that a detailed analysis be conducted of the most promising remedial alternatives against several evaluation criteria, including cost. To complete the detailed analysis, order-of-magnitude cost estimates (having an accuracy of +50 percent to -30 percent) must be developed for each remedial alternative. This paper presents a methodology for developing cost estimates of remedial alternatives comprising various technology and process options with a wide range of estimated contaminated media quantities. In addition, the cost estimating methodology provides flexibility for incorporating revisions to remedial alternatives and achieves the desired range of accuracy. It is important to note that the cost estimating methodology presented here was developed as a concurrent path to the development of contaminated media quantity estimates. This methodology can be initiated before contaminated media quantities are estimated. As a result, this methodology is useful in developing cost estimates for use in screening and evaluating remedial technologies and process options. However, remedial alternative cost estimates cannot be prepared without the contaminated media quantity estimates. In the conduct of the feasibility study for Operable Unit 5 at the Fernald Environmental Management Project (FEMP), fourteen remedial alternatives were retained for detailed analysis. Each remedial alternative was composed of combinations of remedial technologies and processes which were earlier determined to be best suited for addressing the media-specific contaminants found at the FEMP site and achieving desired remedial action objectives.

  7. A reconciled estimate of glacier contributions to sea level rise: 2003 to 2009.

    Science.gov (United States)

    Gardner, Alex S; Moholdt, Geir; Cogley, J Graham; Wouters, Bert; Arendt, Anthony A; Wahr, John; Berthier, Etienne; Hock, Regine; Pfeffer, W Tad; Kaser, Georg; Ligtenberg, Stefan R M; Bolch, Tobias; Sharp, Martin J; Hagen, Jon Ove; van den Broeke, Michiel R; Paul, Frank

    2013-05-17

    Glaciers distinct from the Greenland and Antarctic Ice Sheets are losing large amounts of water to the world's oceans. However, estimates of their contribution to sea level rise disagree. We provide a consensus estimate by standardizing existing, and creating new, mass-budget estimates from satellite gravimetry and altimetry and from local glaciological records. In many regions, local measurements are more negative than satellite-based estimates. All regions lost mass during 2003-2009, with the largest losses from Arctic Canada, Alaska, coastal Greenland, the southern Andes, and high-mountain Asia, but there was little loss from glaciers in Antarctica. Over this period, the global mass budget was -259 ± 28 gigatons per year, equivalent to the combined loss from both ice sheets and accounting for 29 ± 13% of the observed sea level rise.
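The reported mass budget converts directly to sea-level equivalent. A back-of-envelope check (the ocean surface area is a commonly used approximation, not a figure from the paper):

```python
OCEAN_AREA_M2 = 3.618e14                 # ~ global ocean surface area (standard approximation)
# a 1 mm ocean layer: area * 1e-3 m depth * 1000 kg/m3, expressed in Gt (1e12 kg)
GT_PER_MM = OCEAN_AREA_M2 * 1e-3 * 1000.0 / 1e12

mass_loss_gt = 259.0                     # reported global glacier loss, Gt/yr
sle_mm = mass_loss_gt / GT_PER_MM        # sea-level equivalent, mm/yr
implied_total_slr = sle_mm / 0.29        # glaciers accounted for 29% of observed rise
print(round(GT_PER_MM, 1), round(sle_mm, 2), round(implied_total_slr, 2))  # → 361.8 0.72 2.47
```

So the quoted 29% share implies an observed total rise of roughly 2.5 mm/yr over 2003-2009, consistent with altimetry-era estimates.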

  8. A Simulation-Based Soft Error Estimation Methodology for Computer Systems

    OpenAIRE

    Sugihara, Makoto; Ishihara, Tohru; Hashimoto, Koji; Muroyama, Masanori

    2006-01-01

    This paper proposes a simulation-based soft error estimation methodology for computer systems. Accumulating soft error rates (SERs) of all memories in a computer system results in pessimistic soft error estimation. This is because memory cells are used spatially and temporally and not all soft errors in them make the computer system faulty. Our soft-error estimation methodology considers the locations and the timings of soft errors occurring at every level of memory hierarchy and estimates th...

  9. CONTAMINATED SOIL VOLUME ESTIMATE TRACKING METHODOLOGY

    International Nuclear Information System (INIS)

    Durham, L.A.; Johnson, R.L.; Rieman, C.; Kenna, T.; Pilon, R.

    2003-01-01

    The U.S. Army Corps of Engineers (USACE) is conducting a cleanup of radiologically contaminated properties under the Formerly Utilized Sites Remedial Action Program (FUSRAP). The largest cost element for most of the FUSRAP sites is the transportation and disposal of contaminated soil. Project managers and engineers need an estimate of the volume of contaminated soil to determine project costs and schedule. Once excavation activities begin and additional remedial action data are collected, the actual quantity of contaminated soil often deviates from the original estimate, resulting in cost and schedule impacts to the project. The project costs and schedule need to be updated frequently by tracking the actual quantities of excavated soil and contaminated soil remaining during the life of a remedial action project. A soil volume estimate tracking methodology was developed to give project managers and engineers better control of project costs and schedule. For the FUSRAP Linde site, an estimate of the initial volume of in situ soil above the specified cleanup guidelines was calculated on the basis of discrete soil sample data and other relevant data, using indicator geostatistical techniques combined with Bayesian analysis. During the remedial action, updated volume estimates of remaining in situ soils requiring excavation were calculated on a periodic basis. In addition to taking into account the volume of soil that had been excavated, the updated volume estimates incorporated both new gamma walkover surveys and discrete sample data collected as part of the remedial action. A civil survey company provided periodic estimates of actual in situ excavated soil volumes. By combining the civil survey of actual in situ volumes excavated with the updated estimate of the remaining volume of contaminated soil requiring excavation, the USACE Buffalo District was able to forecast and update project costs and schedule.

  10. Observation-Driven Estimation of the Spatial Variability of 20th Century Sea Level Rise

    Science.gov (United States)

    Hamlington, B. D.; Burgos, A.; Thompson, P. R.; Landerer, F. W.; Piecuch, C. G.; Adhikari, S.; Caron, L.; Reager, J. T.; Ivins, E. R.

    2018-03-01

    Over the past two decades, sea level measurements made by satellites have given clear indications of both global and regional sea level rise. Numerous studies have sought to leverage the modern satellite record and available historic sea level data provided by tide gauges to estimate past sea level rise, leading to several estimates for the 20th century trend in global mean sea level in the range between 1 and 2 mm/yr. On regional scales, few attempts have been made to estimate trends over the same time period. This is due largely to the inhomogeneity and quality of the tide gauge network through the 20th century, which render commonly used reconstruction techniques inadequate. Here, a new approach is adopted, integrating data from a select set of tide gauges with prior estimates of spatial structure based on historical sea level forcing information from the major contributing processes over the past century. The resulting map of 20th century regional sea level rise is optimized to agree with the tide gauge-measured trends, and provides an indication of the likely contributions of different sources to regional patterns. Of equal importance, this study demonstrates the sensitivities of this regional trend map to current knowledge and uncertainty of the contributing processes.

  11. Methodology for completing Hanford 200 Area tank waste physical/chemical profile estimations

    International Nuclear Information System (INIS)

    Kruger, A.A.

    1996-01-01

    The purpose of the Methodology for Completing Hanford 200 Area Tank Waste Physical/Chemical Profile Estimations is to capture the logic inherent in completing 200 Area waste tank physical and chemical profile estimates. Since there has been good correlation between the estimated profiles and actual conditions during sampling and sub-segment analysis, it is worthwhile to document the current estimation methodology.

  12. Methodology for estimating biomass energy potential and its application to Colombia

    International Nuclear Information System (INIS)

    Gonzalez-Salazar, Miguel Angel; Morini, Mirko; Pinelli, Michele; Spina, Pier Ruggero; Venturini, Mauro; Finkenrath, Matthias; Poganietz, Witold-Roger

    2014-01-01

    Highlights:
    • Methodology to estimate the biomass energy potential and its uncertainty at a country level.
    • Harmonization of approaches and assumptions in existing assessment studies.
    • The theoretical and technical biomass energy potentials in Colombia are estimated for 2010.

    Abstract: This paper presents a methodology to estimate the biomass energy potential and its associated uncertainty at a country level when the quality and availability of data are limited. The current biomass energy potential in Colombia is assessed following the proposed methodology, and results are compared to existing assessment studies. The proposed methodology is a bottom-up, resource-focused approach with statistical analysis that uses a Monte Carlo algorithm to stochastically estimate the theoretical and the technical biomass energy potential. The paper also includes a proposed approach to quantify uncertainty, combining a probabilistic propagation of uncertainty, a sensitivity analysis, and a set of disaggregated sub-models to estimate the reliability of predictions and reduce the associated uncertainty. Results predict a theoretical energy potential of 0.744 EJ and a technical potential of 0.059 EJ in 2010, which might account for 1.2% of the annual primary energy production (4.93 EJ).
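The stochastic part of such a bottom-up assessment can be sketched in a few lines. The residue classes, ranges, and units below are invented for illustration and are not the paper's data; only the Monte Carlo structure (sampling uncertain inputs and averaging) reflects the described approach:

```python
import random

def theoretical_potential_ej(n_samples=20000, seed=1):
    """Monte Carlo sketch of a theoretical biomass potential: crop production,
    residue-to-product ratio (RPR) and lower heating value (LHV) are drawn
    from uniform ranges. All numbers are illustrative, not the paper's data."""
    random.seed(seed)
    total = 0.0
    for _ in range(n_samples):
        production = random.uniform(20.0, 30.0)   # Mt/yr of crop product
        rpr = random.uniform(1.0, 2.0)            # t residue per t product
        lhv = random.uniform(15.0, 18.0)          # MJ/kg of residue
        # Mt * 1e9 kg/Mt * MJ/kg * 1e6 J/MJ, converted to EJ (1e18 J)
        total += production * rpr * lhv * 1e9 * 1e6 / 1e18
    return total / n_samples

print(round(theoretical_potential_ej(), 2))
```

In a full assessment one would keep the whole sample rather than the mean, so that percentiles of the potential quantify the propagated uncertainty.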

  13. Methodical approaches to value assessment and determination of the capitalization level of high-rise construction

    Science.gov (United States)

    Smirnov, Vitaly; Dashkov, Leonid; Gorshkov, Roman; Burova, Olga; Romanova, Alina

    2018-03-01

    The article presents an analysis of methodological approaches to cost estimation and to determining the capitalization level of high-rise construction objects. Factors determining the value of real estate are considered, and three main approaches to estimating the value of real estate objects are given. The main methods of capitalization estimation are analyzed, and the most reasonable method for determining the level of capitalization of high-rise buildings is proposed. In order to increase the value of real estate objects, the authors propose measures that could significantly increase the capitalization of the enterprise through more efficient use of intangible assets and goodwill.

  14. A robust methodology for kinetic model parameter estimation for biocatalytic reactions

    DEFF Research Database (Denmark)

    Al-Haque, Naweed; Andrade Santacoloma, Paloma de Gracia; Lima Afonso Neto, Watson

    2012-01-01

    … parameters, which are strongly correlated with each other. State-of-the-art methodologies such as nonlinear regression (using progress curves) or graphical analysis (using initial rate data, for example, the Lineweaver–Burk plot, Hanes plot or Dixon plot) often incorporate errors in the estimates and rarely lead to globally optimized parameter values. In this article, a robust methodology to estimate parameters for biocatalytic reaction kinetic expressions is proposed. The methodology determines the parameters in a systematic manner by exploiting the best features of several of the current approaches …

  15. Methodology for Estimating Ingestion Dose for Emergency Response at SRS

    CERN Document Server

    Simpkins, A A

    2002-01-01

    At the Savannah River Site (SRS), emergency response models estimate dose for the inhalation and ground shine pathways. A methodology has been developed to incorporate ingestion doses into the emergency response models. The methodology follows a two-phase approach. The first phase estimates site-specific derived response levels (DRLs), which can be compared with predicted ground-level concentrations to determine if intervention is needed to protect the public. This phase uses accepted methods with little deviation from recommended guidance. The second phase uses site-specific data to estimate a 'best estimate' dose to offsite individuals from ingestion of foodstuffs. While this method deviates from recommended guidance, it is technically defensible and more realistic. As guidance is updated, these methods will also need to be updated.

  16. Estimating sea-level allowances for Atlantic Canada under conditions of uncertain sea-level rise

    Directory of Open Access Journals (Sweden)

    B. Greenan

    2015-03-01

    This paper documents the methodology of computing sea-level rise allowances for Atlantic Canada in the 21st century under conditions of uncertain sea-level rise. A sea-level rise allowance is defined as the amount by which an asset needs to be raised in order to maintain the same likelihood of future flooding events as the site has experienced in the recent past. The allowances are determined by combining the statistics of present tides and storm surges (storm tides) with regional projections of sea-level rise and the associated uncertainty. Tide-gauge data for nine sites on the Canadian Atlantic coast are used to derive the scale parameters of present sea-level extremes using the Gumbel distribution function. The allowances in the 21st century, with respect to the year 1990, were computed for the Intergovernmental Panel on Climate Change (IPCC) A1FI emission scenario. For Atlantic Canada, the allowances are regionally variable: for the period 1990–2050 they range between –13 and 38 cm, while for the period 1990–2100 they range between 7 and 108 cm. The negative allowances in the northern Gulf of St. Lawrence region are caused by land uplift due to glacial isostatic adjustment (GIA).
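Under these assumptions (Gumbel extremes, normally distributed rise) the allowance has a well-known closed form in the style of Hunter's approach, which studies of this type commonly follow; whether this paper uses exactly this expression is an assumption, and the numbers below are illustrative:

```python
def sl_allowance(mean_rise_m, sigma_m, gumbel_scale_m):
    """Allowance that keeps the expected number of flooding exceedances
    unchanged: mean projected rise plus a premium for its uncertainty,
    scaled by the Gumbel scale parameter of present storm tides
    (Hunter-style formulation; assumed, not taken from this paper)."""
    return mean_rise_m + sigma_m**2 / (2.0 * gumbel_scale_m)

# illustrative: 30 cm mean rise, 10 cm uncertainty, 12 cm Gumbel scale
print(round(sl_allowance(0.30, 0.10, 0.12), 3))  # → 0.342
```

Note how the allowance exceeds the mean rise whenever the projection is uncertain; a larger Gumbel scale (stormier site) shrinks the uncertainty premium.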

  17. Methodology for estimating sodium aerosol concentrations during breeder reactor fires

    International Nuclear Information System (INIS)

    Fields, D.E.; Miller, C.W.

    1985-01-01

    We have devised and applied a methodology for estimating the concentration of aerosols released at building surfaces and monitored at other building surface points. We have used this methodology to make calculations that suggest, for one air-cooled breeder reactor design, cooling will not be compromised by severe liquid-metal fires

  18. Application of precursor methodology in initiating frequency estimates

    International Nuclear Information System (INIS)

    Kohut, P.; Fitzpatrick, R.G.

    1991-01-01

    The precursor methodology developed in recent years provides a consistent technique to identify important accident sequence precursors. It relies on operational events (extracting information from actual experience) and infers core damage scenarios based on expected safety system responses. The ranking or categorization of each precursor is determined by considering the full spectrum of potential core damage sequences. The methodology estimates the frequency of severe core damage based on the approach suggested by Apostolakis and Mosleh, which may lead to a potential overestimation of the severe-accident sequence frequency due to the inherent dependencies between the safety systems and the initiating events. The methodology is an encompassing attempt to incorporate most of the operating information available from nuclear power plants and is an attractive tool from the point of view of risk management. In this paper, a further extension of this methodology is discussed with regard to the treatment of initiating frequency of the accident sequences

  19. Perception-oriented methodology for robust motion estimation design

    NARCIS (Netherlands)

    Heinrich, A.; Vleuten, van der R.J.; Haan, de G.

    2014-01-01

    Optimizing a motion estimator (ME) for picture rate conversion is challenging. This is because there are many types of MEs and, within each type, many parameters, which makes subjective assessment of all the alternatives impractical. To solve this problem, we propose an automatic design methodology

  20. Methodology to estimate parameters of an excitation system based on experimental conditions

    Energy Technology Data Exchange (ETDEWEB)

    Saavedra-Montes, A.J. [Carrera 80 No 65-223, Bloque M8 oficina 113, Escuela de Mecatronica, Universidad Nacional de Colombia, Medellin (Colombia); Calle 13 No 100-00, Escuela de Ingenieria Electrica y Electronica, Universidad del Valle, Cali, Valle (Colombia); Ramirez-Scarpetta, J.M. [Calle 13 No 100-00, Escuela de Ingenieria Electrica y Electronica, Universidad del Valle, Cali, Valle (Colombia); Malik, O.P. [2500 University Drive N.W., Electrical and Computer Engineering Department, University of Calgary, Calgary, Alberta (Canada)

    2011-01-15

    A methodology to estimate the parameters of a potential-source controlled rectifier excitation system model is presented in this paper. The proposed parameter estimation methodology is based on the characteristics of the excitation system. A comparison of two pseudo random binary signals, two sampling periods for each one, and three estimation algorithms is also presented. Simulation results from an excitation control system model and experimental results from an excitation system of a power laboratory setup are obtained. To apply the proposed methodology, the excitation system parameters are identified at two different levels of the generator saturation curve. The results show that it is possible to estimate the parameters of the standard model of an excitation system, recording two signals and the system operating in closed loop with the generator. The normalized sum of squared error obtained with experimental data is below 10%, and with simulation data is below 5%. (author)
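Pseudo random binary signals of the kind used for this identification are typically generated with a linear-feedback shift register. A minimal PRBS7 sketch (the register width, taps, and mapping to excitation levels are illustrative choices, not the paper's):

```python
def prbs7(length=127, seed=0x01):
    """PRBS7 from a 7-bit Fibonacci LFSR, polynomial x^7 + x^6 + 1.
    Any nonzero seed yields a maximal-length sequence of period 127."""
    state = seed & 0x7F
    out = []
    for _ in range(length):
        bit = ((state >> 6) ^ (state >> 5)) & 1   # taps at stages 7 and 6
        out.append(bit)
        state = ((state << 1) | bit) & 0x7F
    return out

seq = prbs7()
levels = [1 if b else -1 for b in seq]   # map bits to a two-level excitation signal
print(len(seq), sum(seq))  # → 127 64
```

The sampling period and amplitude would then be chosen relative to the excitation system's dominant time constants, as the comparison in the paper suggests.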

  1. A simple model to estimate the impact of sea-level rise on platform beaches

    Science.gov (United States)

    Taborda, Rui; Ribeiro, Mónica Afonso

    2015-04-01

    Estimates of future beach evolution in response to sea-level rise are needed to assess coastal vulnerability. A research gap is identified in providing adequate predictive methods to use for platform beaches. This work describes a simple model to evaluate the effects of sea-level rise on platform beaches that relies on the conservation of beach sand volume and assumes an invariant beach profile shape. In closed systems, when compared with the Inundation Model, results show larger retreats; the differences are higher for beaches with wide berms and when the shore platform develops at shallow depths. The application of the proposed model to Cascais (Portugal) beaches, using 21st century sea-level rise scenarios, shows that there will be a significant reduction in beach width.
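On an open sandy coast, volume conservation with an invariant profile reduces to the classic Bruun rule; the paper's contribution is the added shore-platform constraint, which this sketch omits. The numbers are illustrative:

```python
def bruun_retreat(slr_m, active_width_m, closure_depth_m, berm_height_m):
    """Shoreline retreat R = S * W / (h + B) from sand-volume conservation
    with an invariant profile (classic Bruun rule; the platform-beach model
    described above adds a shore-platform constraint not sketched here)."""
    return slr_m * active_width_m / (closure_depth_m + berm_height_m)

# illustrative: 0.5 m of rise, 200 m active profile, 8 m closure depth, 2 m berm
print(round(bruun_retreat(0.5, 200.0, 8.0, 2.0), 1))  # → 10.0
```

A shallow platform truncates the profile, which is why the paper finds larger retreats than inundation-only estimates for such beaches.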

  2. A Life-Cycle Cost Estimating Methodology for NASA-Developed Air Traffic Control Decision Support Tools

    Science.gov (United States)

    Wang, Jianzhong Jay; Datta, Koushik; Landis, Michael R. (Technical Monitor)

    2002-01-01

    This paper describes the development of a life-cycle cost (LCC) estimating methodology for air traffic control Decision Support Tools (DSTs) under development by the National Aeronautics and Space Administration (NASA), using a combination of parametric, analogy, and expert opinion methods. There is no one standard methodology and technique that is used by NASA or by the Federal Aviation Administration (FAA) for LCC estimation of prospective Decision Support Tools. Some of the frequently used methodologies include bottom-up, analogy, top-down, parametric, expert judgement, and Parkinson's Law. The developed LCC estimating methodology can be visualized as a three-dimensional matrix where the three axes represent coverage, estimation, and timing. This paper focuses on the three characteristics of this methodology that correspond to the three axes.

  3. A Novel Methodology for Estimating State-Of-Charge of Li-Ion Batteries Using Advanced Parameters Estimation

    Directory of Open Access Journals (Sweden)

    Ibrahim M. Safwat

    2017-11-01

    State-of-charge (SOC) estimation of Li-ion batteries has been the focus of many research studies in recent years. Many articles discuss estimating the dynamic model parameters of the Li-ion battery with a recursive least squares method using a fixed forgetting factor. However, the rate at which each parameter converges to its true value is not taken into consideration, which can lead to poor estimates. This article discusses this issue and proposes two solutions. The first is to use a variable forgetting factor instead of a fixed one; the second is to define a vector of forgetting factors, that is, one factor for each parameter. After parameter estimation, a new approach is proposed to estimate the SOC of the Li-ion battery based on Newton's method. The error percentage and computational cost are also discussed and compared with those of nonlinear Kalman filters. The methodology is applied to a 36 V 30 A Li-ion pack to validate the idea.
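The fixed-forgetting-factor RLS baseline that the article starts from can be sketched as follows; the regression model and data are illustrative, and the article's variable or vector forgetting factors would amount to adapting `lam` online (per step, or per parameter):

```python
import numpy as np

def rls(phi_rows, y, lam=0.98):
    """Recursive least squares with a fixed forgetting factor lam.
    Returns the final parameter estimate."""
    n = phi_rows.shape[1]
    theta = np.zeros(n)
    P = np.eye(n) * 1e3          # large initial covariance: weak prior
    for phi, yk in zip(phi_rows, y):
        k = P @ phi / (lam + phi @ P @ phi)
        theta = theta + k * (yk - phi @ theta)
        P = (P - np.outer(k, phi @ P)) / lam
    return theta

# identify y = 2.0*u + 0.5 from noiseless data (illustrative model)
u = np.linspace(0.0, 1.0, 200)
Phi = np.column_stack([u, np.ones_like(u)])
theta = rls(Phi, 2.0 * u + 0.5)
print(float(round(theta[0], 3)), float(round(theta[1], 3)))  # → 2.0 0.5
```

With a single `lam`, slowly varying and fast varying parameters are discounted at the same rate; that shared rate is precisely what the article's two proposals relax.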

  4. Expanded uncertainty estimation methodology in determining the sandy soils filtration coefficient

    Science.gov (United States)

    Rusanova, A. D.; Malaja, L. D.; Ivanov, R. N.; Gruzin, A. V.; Shalaj, V. V.

    2018-04-01

    A methodology for estimating the combined standard uncertainty in determining the filtration coefficient of sandy soils has been developed. Laboratory studies were carried out in which the filtration coefficient was determined and an estimate of its combined uncertainty obtained.

  5. Integrated Methodology for Estimating Water Use in Mediterranean Agricultural Areas

    Directory of Open Access Journals (Sweden)

    George C. Zalidis

    2009-08-01

    Agricultural use is by far the largest consumer of fresh water worldwide, especially in the Mediterranean, where it has reached unsustainable levels, thus posing a serious threat to water resources. A good estimate of the water used in an agricultural area would help water managers create incentives for water savings at the farmer and basin level, and meet the demands of the European Water Framework Directive. This work presents an integrated methodology for estimating water use in Mediterranean agricultural areas. It is based on well-established methods of estimating the actual evapotranspiration through surface energy fluxes, customized for better performance under Mediterranean conditions: small parcel sizes, detailed crop patterns, and a lack of necessary data. The methodology has been tested and validated on the agricultural plain of the river Strimonas (Greece) using a time series of Terra MODIS and Landsat 5 TM satellite images, and used to produce a seasonal water use map at a high spatial resolution. Finally, a tool has been designed to implement the methodology with a user-friendly interface, in order to facilitate its operational use.
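The core of surface-energy-flux methods is the latent-heat residual of the energy balance. A minimal sketch with typical daytime flux values (the numbers and the fixed latent-heat constant are illustrative assumptions; operational methods such as the one described derive the fluxes per pixel from satellite imagery):

```python
def daily_et_mm(rn_w_m2, g_w_m2, h_w_m2):
    """Actual evapotranspiration as the surface-energy-balance residual:
    latent heat flux LE = Rn - G - H (all in W/m2), converted to mm/day
    using a typical latent heat of vaporization of 2.45 MJ/kg."""
    le = rn_w_m2 - g_w_m2 - h_w_m2
    return le * 86400.0 / 2.45e6     # kg/m2/day, i.e. mm/day

# illustrative day-average fluxes: net radiation 180, soil heat 20, sensible 60
print(round(daily_et_mm(180.0, 20.0, 60.0), 2))  # → 3.53
```

Summing such daily maps over the growing season yields the seasonal water use map the abstract describes.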

  6. New way to rise estimation objectivity of radiation consequences on human

    International Nuclear Information System (INIS)

    Akhmatullina, N. B.

    2001-01-01

    Discussions of the negative consequences of radiation for humans often overlook the fact that the principal danger of elevated radiation levels in the environment lies in defects to genetic structures. It is changes in the genome that lead to the various negative consequences, and these changes not only accompany but also precede them. However, the tendency that has appeared in our country to substitute references to increased morbidity in separate nosologic groups, whose scope is widened arbitrarily, for direct genetic analysis has made the methodological approach inadequate, distorted the very notion of 'genetic consequences' and, as a result, distorted the real estimation of the consequences of the Kazakh test sites (TS) and other sources of radioactive contamination. The question arises: how can the observed effects be distinguished from those of other genotoxicants of chemical or biological origin? Different cytogenetic methods exist to detect genetic damage. The most widely used is estimation of the frequency of chromosome anomalies in somatic cells, especially in peripheral blood lymphocytes. Traditionally, research proceeds from the fine mechanisms of mutagenesis, which indicate that radiation mutagenesis leads primarily to chromosome-type aberrations, and chemical mutagenesis to chromatid-type aberrations. Under radiation exposure, chromosome aberrations appear in non-dividing lymphocytes (G1 phase) and are easily observed at the first metaphase (Browen et al. 1972; Bender et al. 1966). In contrast, aberrations induced by chemical factors appear primarily in the S phase, irrespective of the stage of the cycle at which the cells were exposed; therefore the majority of such aberrations are of the chromatid type (Evanse et al. 1980; Preston et al. 1981). Following these criteria, many original investigations of people exposed to radiation were carried out. Moreover, the application of such a method to estimate the absorbed radiation in the …

  7. Methodology for uranium resource estimates and reliability

    International Nuclear Information System (INIS)

    Blanchfield, D.M.

    1980-01-01

    The NURE uranium assessment method has evolved from a small group of geologists estimating resources on a few lease blocks to a national survey involving an interdisciplinary system consisting of the following: (1) geology and geologic analogs; (2) engineering and cost modeling; (3) mathematics and probability theory, psychology, and elicitation of subjective judgments; and (4) computerized calculations, computer graphics, and data base management. The evolution has been spurred primarily by two objectives: (1) quantification of uncertainty, and (2) elimination of simplifying assumptions. This has resulted in a tremendous data-gathering effort and the involvement of hundreds of technical experts, many in uranium geology but many from other fields as well. The rationality of the methods is still largely based on the concept of an analog and the observation that the results are reasonable. The reliability, or repeatability, of the assessments is reasonably guaranteed by the series of peer and superior technical reviews that has been formalized under the current methodology. The optimism or pessimism of individual geologists who make the initial assessments is tempered by the review process, resulting in a series of assessments that are a consistent, unbiased reflection of the facts. Despite the many improvements over past methods, several objectives for future development remain: primarily, to reduce subjectivity in utilizing factual information in the estimation of endowment, and to improve the recognition of cost uncertainties in the assessment of economic potential. The 1980 NURE assessment methodology will undoubtedly be improved, but the reader is reminded that resource estimates are, and always will be, a forecast for the future.

  8. Assessment of compliance with regulatory requirements for a best estimate methodology for evaluation of ECCS

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Un Chul; Jang, Jin Wook; Lim, Ho Gon; Jeong, Ik [Seoul National Univ., Seoul (Korea, Republic of)]; Sim, Suk Ku [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)]

    2000-03-15

    The best-estimate methodology for evaluation of ECCS proposed by KEPCO (KREM) uses a thermal-hydraulic best-estimate code, and its topical report states that the methodology meets the requirements of the USNRC regulatory guide. In this research, the assessment of compliance with regulatory requirements for the methodology is performed. The state of the licensing procedure in other countries and the best-estimate evaluation methodologies of Europe are also investigated. The applicability of the models and the propriety of the uncertainty analysis procedure of KREM are appraised, and compliance with the USNRC regulatory guide is assessed.

  9. Consistent estimate of ocean warming, land ice melt and sea level rise from Observations

    Science.gov (United States)

    Blazquez, Alejandro; Meyssignac, Benoît; Lemoine, Jean Michel

    2016-04-01

    Based on the sea level budget closure approach, this study investigates the consistency of observed Global Mean Sea Level (GMSL) estimates from satellite altimetry, observed Ocean Thermal Expansion (OTE) estimates from in-situ hydrographic data (based on Argo for depths above 2000 m and oceanic cruises below) and GRACE observations of land water storage and land ice melt for the period January 2004 to December 2014. The consistency between these datasets is a key issue if we want to constrain missing contributions to sea level rise such as the deep ocean contribution. Numerous previous studies have addressed this question by summing up the different contributions to sea level rise and comparing the sum to satellite altimetry observations (see for example Llovel et al. 2015, Dieng et al. 2015). Here we propose a novel approach which consists of correcting GRACE solutions over the ocean (essentially corrections of stripes and of leakage from ice caps) with mass observations deduced from the difference between satellite altimetry GMSL and in-situ hydrographic OTE estimates. We check that the resulting corrected GRACE solutions are consistent with the original GRACE estimates of the geoid spherical harmonic coefficients within error bars, and we compare the resulting GRACE estimates of land water storage and land ice melt with independent results from the literature. This method provides a new mass redistribution from GRACE consistent with observations from altimetry and OTE. We test the sensitivity of this method to the deep ocean contribution and to the GIA models, and propose best estimates.
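
    The budget-closure idea underlying the method can be sketched with toy series: the mass (barystatic) component implied by altimetry minus steric OTE is, by construction, the correction target for GRACE over the ocean. All numbers below are illustrative, not real altimetry/Argo/GRACE data:

```python
# Toy monthly series (mm of global mean sea level); illustrative only.
gmsl_altimetry = [0.0, 0.3, 0.7, 1.0, 1.4, 1.7]   # total sea level
steric_argo    = [0.0, 0.1, 0.3, 0.4, 0.5, 0.6]   # thermal expansion

# Budget closure: the mass component implied by the two observations,
# which the study uses to correct GRACE solutions over the ocean.
implied_mass = [g - s for g, s in zip(gmsl_altimetry, steric_argo)]

def budget_residual(gmsl, steric, mass):
    """Residual of GMSL - (steric + mass); zero when the budget closes."""
    return [g - (s + m) for g, s, m in zip(gmsl, steric, mass)]

residuals = budget_residual(gmsl_altimetry, steric_argo, implied_mass)
```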

  10. [Methodologies for estimating the indirect costs of traffic accidents].

    Science.gov (United States)

    Carozzi, Soledad; Elorza, María Eugenia; Moscoso, Nebel Silvana; Ripari, Nadia Vanina

    2017-01-01

    Traffic accidents generate multiple costs to society, including those associated with lost productivity. However, there is no consensus about the most appropriate methodology for estimating those costs. The aim of this study was to review the methods for estimating indirect costs applied in crash cost studies. A thematic review of the literature between 1995 and 2012 was carried out in PubMed with the terms cost of illness, indirect cost, road traffic injuries, and productivity loss. For the assessment of costs we used the human capital method, on the basis of the wage income lost during the time of treatment and recovery of patients and caregivers. In the case of premature death or total disability, a discount rate was applied to obtain the present value of lost future earnings. The years counted were obtained by subtracting from life expectancy at birth the average age of those affected who do not rejoin economically active life. The interest in minimizing the problem is reflected in the evolution of the implemented methodologies. We hope this review will be useful for efficiently estimating the real indirect costs of traffic accidents.
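
    The human capital calculation described above reduces to discounting the wage stream over the remaining economically active years. A minimal sketch with an assumed wage, discount rate, and ages; these figures are illustrative, not values from the reviewed studies:

```python
# Human capital method: present value of future earnings lost to a
# premature traffic death. All parameters are illustrative assumptions.
annual_wage = 12000.0      # lost yearly income
discount_rate = 0.03
age_at_death = 30
retirement_age = 65        # proxy for the end of economically active life

def lost_earnings_pv(wage, rate, age, end_age):
    """Discounted sum of wages over the remaining working years."""
    years = end_age - age
    return sum(wage / (1 + rate) ** t for t in range(1, years + 1))

pv = lost_earnings_pv(annual_wage, discount_rate, age_at_death, retirement_age)
```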

  11. A methodology to calibrate water saturation estimated from 4D seismic data

    International Nuclear Information System (INIS)

    Davolio, Alessandra; Maschio, Célio; José Schiozer, Denis

    2014-01-01

    Time-lapse seismic data can be used to estimate saturation changes within a reservoir, which is valuable information for reservoir management as it plays an important role in updating reservoir simulation models. The process of updating reservoir properties, history matching, can incorporate estimated saturation changes qualitatively or quantitatively. For quantitative approaches, reliable information from 4D seismic data is important. This work proposes a methodology to calibrate the volume of water in the estimated saturation maps, as these maps can be wrongly estimated due to problems with seismic signals (such as noise, errors associated with data processing, and resolution issues). The idea is to condition the 4D seismic data to known information provided by engineering, in this case the known amount of injected and produced water in the field. The application of the proposed methodology in an inversion process (previously published) that estimates saturation from 4D seismic data is presented, followed by a discussion concerning the use of such data in a history matching process. The methodology is applied to a synthetic dataset to validate the results, the main ones being: (1) reduction of the effects of noise and errors in the estimated saturation, yielding more reliable data to be used quantitatively or qualitatively, and (2) an improvement in the property update after using these data in a history matching procedure. (paper)
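
    The calibration idea, conditioning the estimated map to the known injected-produced water volume, can be sketched as a global rescaling of the saturation-change map. The paper's actual inversion is more elaborate; all values and the simple scaling scheme below are illustrative:

```python
# Toy 2x3 map of estimated water-saturation change per cell, and cell
# pore volumes (m^3). Synthetic values for illustration only.
dsw_map = [[0.10, 0.05, 0.00],
           [0.20, 0.15, 0.10]]
pore_vol = [[1e5, 1e5, 1e5],
            [1e5, 1e5, 1e5]]

known_net_water = 5.0e4    # injected minus produced water (m^3), from engineering

def calibrate(dsw, pv, target):
    """Scale the saturation-change map so its implied water volume
    matches the known injected-produced volume."""
    implied = sum(d * v for row_d, row_v in zip(dsw, pv)
                  for d, v in zip(row_d, row_v))
    k = target / implied
    return [[d * k for d in row] for row in dsw]

calibrated = calibrate(dsw_map, pore_vol, known_net_water)
```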

  12. Are sea-level-rise trends along the coasts of the north Indian Ocean consistent with global estimates?

    Digital Repository Service at National Institute of Oceanography (India)

    Unnikrishnan, A.S.; Shankar, D.

    yielded sea-level-rise estimates between 1.06 and 1.75 mm yr-1, with a regional average of 1.29 mm yr-1, when corrected for glacial isostatic adjustment (GIA) using model data. These estimates are consistent...

  13. Methodology used in IRSN nuclear accident cost estimates in France

    International Nuclear Information System (INIS)

    2015-01-01

    This report describes the methodology used by IRSN to estimate the cost of potential nuclear accidents in France. It concerns possible accidents involving pressurized water reactors leading to radioactive releases into the environment. These accidents have been grouped into two accident families: severe accidents and major accidents. Two model scenarios have been selected to represent each of these families. The report discusses the general methodology of nuclear accident cost estimation. The crucial point is that all costs should be considered: if not, the cost is underestimated, which can have negative consequences for the value attributed to safety and for crisis preparation. As a result, the overall cost comprises many components: the best known is offsite radiological costs, but there are many others. The proposed estimates have thus required a diversity of methods, which are described in this report. Figures are presented at the end of this report. Among other things, they show that purely radiological costs represent only a non-dominant part of the foreseeable economic consequences. (authors)

  14. A statistical methodology for quantification of uncertainty in best estimate code physical models

    International Nuclear Information System (INIS)

    Vinai, Paolo; Macian-Juan, Rafael; Chawla, Rakesh

    2007-01-01

    A novel uncertainty assessment methodology, based on a statistical non-parametric approach, is presented in this paper. It achieves quantification of code physical model uncertainty by making use of model performance information obtained from studies of appropriate separate-effect tests. Uncertainties are quantified in the form of estimated probability density functions (pdfs), calculated with a newly developed non-parametric estimator. The new estimator objectively predicts the probability distribution of the model's 'error' (its uncertainty) from databases reflecting the model's accuracy on the basis of available experiments. The methodology is completed by applying a novel multi-dimensional clustering technique based on the comparison of model error samples with the Kruskal-Wallis test. This takes into account the fact that a model's uncertainty depends on system conditions, since a best estimate code can give predictions whose accuracy is affected by the regions of the physical space in which the experiments occur. The final result is an objective, rigorous and accurate manner of assigning uncertainty to code physical models, i.e. the input information needed by code uncertainty propagation methodologies used for assessing the accuracy of best estimate codes in nuclear systems analysis. The new methodology has been applied to the quantification of the uncertainty in the RETRAN-3D void model and then used in the analysis of an independent separate-effect experiment. This has clearly demonstrated the basic feasibility of the approach, as well as its advantages in yielding narrower uncertainty bands in quantifying the code's accuracy for void fraction predictions
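
    The clustering step can be sketched as follows: error samples from two separate-effect test conditions are compared with a Kruskal-Wallis statistic, and a large value (relative to a chi-square with k-1 degrees of freedom) argues for keeping separate uncertainty pdfs for the two regions of the physical space. The data are synthetic, and the plain H statistic below omits tie corrections:

```python
# Synthetic code-vs-experiment error samples from two test conditions.
errors_low_flow  = [0.02, -0.01, 0.03, 0.00, 0.01]
errors_high_flow = [0.08, 0.06, 0.09, 0.07, 0.05]

def kruskal_h(*groups):
    """Kruskal-Wallis H statistic (no tie correction; assumes distinct values)."""
    pooled = sorted(x for g in groups for x in g)
    rank = {v: i + 1 for i, v in enumerate(pooled)}
    n = len(pooled)
    h = 0.0
    for g in groups:
        rbar = sum(rank[x] for x in g) / len(g)
        h += len(g) * (rbar - (n + 1) / 2) ** 2
    return 12.0 / (n * (n + 1)) * h

h = kruskal_h(errors_low_flow, errors_high_flow)
# Here H exceeds the 5% chi-square critical value for 1 dof (3.84), so the
# two conditions would be kept as separate clusters with separate error pdfs.
```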

  15. Methodology for estimating soil carbon for the forest carbon budget model of the United States, 2001

    Science.gov (United States)

    L. S. Heath; R. A. Birdsey; D. W. Williams

    2002-01-01

    The largest carbon (C) pool in United States forests is the soil C pool. We present methodology and soil C pool estimates used in the FORCARB model, which estimates and projects forest carbon budgets for the United States. The methodology balances knowledge, uncertainties, and ease of use. The estimates are calculated using the USDA Natural Resources Conservation...

  16. Estimation Methodology for the Electricity Consumption with the Daylight- and Occupancy-Controlled Artificial Lighting

    DEFF Research Database (Denmark)

    Larsen, Olena Kalyanova; Jensen, Rasmus Lund; Strømberg, Ida Kristine

    2017-01-01

    Artificial lighting represents 15-30% of the total electricity consumption in buildings in Scandinavia. It is possible to avoid a large share of electricity use for lighting by application of daylight control systems for artificial lighting. Existing methodology for estimation of electricity...... consumption with application of such control systems in Norway is based on Norwegian standard NS 3031:2014 and can only provide results from a rough estimate. This paper aims to introduce a new estimation methodology for the electricity usage with the daylight- and occupancy-controlled artificial lighting...

  17. Review of the Palisades pressure vessel accumulated fluence estimate and of the least squares methodology employed

    Energy Technology Data Exchange (ETDEWEB)

    Griffin, P.J.

    1998-05-01

    This report provides a review of the Palisades submittal to the Nuclear Regulatory Commission requesting endorsement of their accumulated neutron fluence estimates based on a least squares adjustment methodology. This review highlights some minor issues in the applied methodology and provides some recommendations for future work. The overall conclusion is that the Palisades fluence estimation methodology provides a reasonable approach to a "best estimate" of the accumulated pressure vessel neutron fluence and is consistent with the state-of-the-art analysis as detailed in community consensus ASTM standards.

  18. Air quality estimation by computational intelligence methodologies

    Directory of Open Access Journals (Sweden)

    Ćirić Ivan T.

    2012-01-01

    The subject of this study is to compare different computational intelligence methodologies based on artificial neural networks used for forecasting an air quality parameter, the emission of CO2, in the city of Niš. Firstly, the inputs of the CO2 emission estimator are analyzed and their measurement is explained. It is known that traffic is the single largest emitter of CO2 in Europe. Therefore, proper treatment of this component of pollution is very important for precise estimation of emission levels. With this in mind, measurements of traffic frequency and CO2 concentration were carried out at critical intersections in the city, together with monitoring of vehicle directions at the crossroads. Finally, based on the experimental data, different soft computing estimators were developed, such as a feed-forward neural network, a recurrent neural network, and a hybrid neuro-fuzzy estimator of CO2 emission levels. Test data for some characteristic cases, presented at the end of the paper, show good agreement of the developed estimators' outputs with the experimental data. The presented results are a true indicator of the implemented methods' usability. [Projects of the Ministry of Science of the Republic of Serbia, no. III42008-2/2011: Evaluation of Energy Performances and Indoor Environment Quality of Educational Buildings in Serbia with Impact on Health, and no. TR35016/2011: Research of MHD Flows around the Bodies, in the Tip Clearances and Channels and Application in the MHD Pumps Development]

  19. An Estimator of Heavy Tail Index through the Generalized Jackknife Methodology

    Directory of Open Access Journals (Sweden)

    Weiqi Liu

    2014-01-01

    In practice, the data can sometimes be divided into several blocks but only a few of the largest observations within each block are available for estimating the heavy tail index. To address this problem, we propose a new class of estimators through the Generalized Jackknife methodology based on Qi's estimator (2010). These estimators are proved to be asymptotically normal under suitable conditions. Compared to Hill's estimator and Qi's estimator, the new estimator has better asymptotic efficiency in terms of the minimum mean squared error, for a wide range of second-order shape parameters. For finite samples, the new estimator still compares favorably to Hill's estimator and Qi's estimator, providing stable sample paths as a function of the number of blocks into which the sample is divided, smaller estimation bias, and smaller MSE.
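
    Hill's estimator, the baseline against which the proposed estimators are compared, can be sketched as follows. The data are synthetic Pareto(α = 2) draws, so the tail-index estimate should come out near 1/α = 0.5; Qi's block-based estimator and the Generalized Jackknife correction are beyond this sketch:

```python
import math
import random

random.seed(1)
alpha = 2.0
# Inverse-CDF sampling of a Pareto(alpha) law: P(X > x) = x**(-alpha), x >= 1.
data = [random.random() ** (-1.0 / alpha) for _ in range(2000)]

def hill(sample, k):
    """Hill's estimator: mean log-excess over the (k+1)-th largest value."""
    x = sorted(sample, reverse=True)
    return sum(math.log(x[i] / x[k]) for i in range(k)) / k

gamma_hat = hill(data, k=200)   # estimates the extreme value index 1/alpha
```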

  20. Methodology for estimation of potential for solar water heating in a target area

    International Nuclear Information System (INIS)

    Pillai, Indu R.; Banerjee, Rangan

    2007-01-01

    Proper estimation of the potential of any renewable energy technology is essential for planning and promotion of the technology. The methods reported in the literature for estimating the potential of solar water heating in a target area are aggregate in nature. A methodology for potential estimation (technical, economic and market potential) of solar water heating in a target area is proposed in this paper. This methodology links the micro-level factors and macro-level market effects affecting the diffusion or adoption of solar water heating systems. Different sectors with end uses of low temperature hot water are considered for potential estimation. Potential is estimated at each end use point by simulation using TRNSYS, taking micro-level factors into account. The methodology is illustrated for a synthetic area in India with an area of 2 sq. km and a population of 10,000. The end use sectors considered are residential, hospitals, nursing homes and hotels. The estimated technical potential and market potential are 1700 m² and 350 m² of collector area, respectively. The annual energy savings for the technical potential in the area are estimated as 110 kWh per capita and 0.55 million kWh per sq. km, with an annual average peak saving of 1 MW. The annual savings are 650 kWh per m² of collector area and account for approximately 3% of the total electricity consumption of the target area. Some of the salient features of the model are: the factors considered for potential estimation; estimation of the electrical usage pattern for a typical day; the amount of electricity savings; and the savings during peak load. The framework is general and enables accurate estimation of the potential of solar water heating for a city or block. Energy planners and policy makers can use this framework for tracking and promoting the diffusion of solar water heating systems. (author)
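
    The reported figures for the synthetic area are internally consistent and can be cross-checked directly; the snippet below simply re-derives the total annual savings three ways from the abstract's own numbers:

```python
# Figures quoted in the abstract for the 2 sq. km, 10,000-person area.
collector_area_m2 = 1700          # technical potential, collector area
savings_per_m2_kwh = 650          # annual savings per m2 of collector
population = 10000
per_capita_kwh = 110              # annual savings per capita
area_km2 = 2
per_km2_mkwh = 0.55               # annual savings, million kWh per sq. km

# The three quoted savings figures should describe the same annual total.
total_from_collectors = collector_area_m2 * savings_per_m2_kwh   # kWh/yr
total_from_capita = population * per_capita_kwh                  # kWh/yr
total_from_area = per_km2_mkwh * 1e6 * area_km2                  # kWh/yr
```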

  1. Theoretical estimation of adiabatic temperature rise from the heat flow data obtained from a reaction calorimeter

    International Nuclear Information System (INIS)

    Das, Parichay K.

    2012-01-01

    Highlights: ► This method for estimating ΔT_ad(t) against time in a semi-batch reactor is distinctively pioneering and novel. ► It uniquely establishes a direct correspondence between the evolution of ΔT_ad(t) in the RC and C_A(t) in a semi-batch reactor. ► Through a unique reaction scheme, the independent effects of heat of mixing and heat of reaction on ΔT_ad(t) are demonstrated quantitatively. ► This work will help to build a thermally safe corridor for a thermally hazardous reaction. ► This manuscript, the author believes, will open a new vista for further research in adiabatic calorimetry. - Abstract: A novel method for estimating the transient profile of the adiabatic temperature rise has been developed from heat flow data for exothermic chemical reactions conducted in a reaction calorimeter (RC). It has also been mathematically demonstrated that there exists a direct qualitative equivalence between the temporal evolution of the adiabatic temperature rise and the concentration of the limiting reactant for an exothermic chemical reaction carried out in semi-batch mode. The proposed procedure shows that the adiabatic temperature rise will always be less than that of the reaction executed in batch mode, thereby affording a thermally safe corridor. Moreover, a unique reaction scheme has been designed to establish the independent heat effects of dissolution and reaction quantitatively. It is hoped that the transient adiabatic temperature rise profile prepared by the proposed method may provide ample scope for further research.
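
    The underlying relation can be sketched as the running integral of the calorimeter heat-flow signal scaled by the thermal mass, ΔT_ad(t) = (1/(m·Cp)) ∫ q dτ. The heat-flow trace and properties below are synthetic, and this is a simplification of the paper's method (which also separates dosing and mixing effects):

```python
# Synthetic calorimeter heat-flow trace for an exotherm.
m_cp = 2000.0                      # reactor contents mass * heat capacity, J/K
dt = 60.0                          # sampling interval, s
q = [0.0, 50.0, 120.0, 90.0, 40.0, 10.0, 0.0]   # heat flow, W

def adiabatic_rise(q, dt, m_cp):
    """Cumulative trapezoidal integral of q(t), scaled by 1/(m*Cp)."""
    rise, total = [0.0], 0.0
    for a, b in zip(q, q[1:]):
        total += 0.5 * (a + b) * dt
        rise.append(total / m_cp)
    return rise

dT_ad = adiabatic_rise(q, dt, m_cp)   # transient adiabatic rise profile, K
```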

  2. A Methodology for Estimating Large-Customer Demand Response MarketPotential

    Energy Technology Data Exchange (ETDEWEB)

    Goldman, Charles; Hopper, Nicole; Bharvirkar, Ranjit; Neenan, Bernie; Cappers, Peter

    2007-08-01

    Demand response (DR) is increasingly recognized as an essential ingredient to well-functioning electricity markets. DR market potential studies can answer questions about the amount of DR available in a given area and from which market segments. Several recent DR market potential studies have been conducted, most adapting techniques used to estimate energy-efficiency (EE) potential. In this scoping study, we: reviewed and categorized seven recent DR market potential studies; recommended a methodology for estimating DR market potential for large, non-residential utility customers that uses price elasticities to account for behavior and prices; compiled participation rates and elasticity values from six DR options offered to large customers in recent years, and demonstrated our recommended methodology with large customer market potential scenarios at an illustrative Northeastern utility. We observe that EE and DR have several important differences that argue for an elasticity approach for large-customer DR options that rely on customer-initiated response to prices, rather than the engineering approaches typical of EE potential studies. Base-case estimates suggest that offering DR options to large, non-residential customers results in 1-3% reductions in their class peak demand in response to prices or incentive payments of $500/MWh. Participation rates (i.e., enrollment in voluntary DR programs or acceptance of default hourly pricing) have the greatest influence on DR impacts of all factors studied, yet are the least well understood. Elasticity refinements to reflect the impact of enabling technologies and response at high prices provide more accurate market potential estimates, particularly when arc elasticities (rather than substitution elasticities) are estimated.
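
    The elasticity-based approach recommended in the study can be sketched with a constant own-price elasticity; all figures below (baseline demand, prices, and the elasticity value) are illustrative assumptions, not the report's estimates:

```python
# Constant-elasticity sketch of large-customer demand response to a
# high-priced event hour. All parameters are illustrative assumptions.
baseline_mw = 100.0        # enrolled class demand in the event hour
base_price = 80.0          # $/MWh, everyday price
event_price = 500.0        # $/MWh, event price or incentive level
elasticity = -0.05         # assumed own-price elasticity

def event_demand(q0, p0, p1, eps):
    """Constant-elasticity response: q1 = q0 * (p1/p0) ** eps."""
    return q0 * (p1 / p0) ** eps

reduced = event_demand(baseline_mw, base_price, event_price, elasticity)
pct_reduction = 100.0 * (1 - reduced / baseline_mw)
```

    Scaling such per-participant responses by assumed participation rates, the factor the study finds most influential, yields the market potential estimate.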

  3. Methodology applied by IRSN for nuclear accident cost estimations in France

    International Nuclear Information System (INIS)

    2013-01-01

    This report describes the methodology used by IRSN to estimate the cost of potential nuclear accidents in France. It concerns possible accidents involving pressurized water reactors leading to radioactive releases into the environment. These accidents have been grouped into two accident families: severe accidents and major accidents. Two model scenarios have been selected to represent each of these families. The report discusses the general methodology of nuclear accident cost estimation. The crucial point is that all costs should be considered: if not, the cost is underestimated, which can have negative consequences for the value attributed to safety and for crisis preparation. As a result, the overall cost comprises many components: the best known is offsite radiological costs, but there are many others. The proposed estimates have thus required a diversity of methods, which are described in this report. Figures are presented at the end of this report. Among other things, they show that purely radiological costs represent only a non-dominant part of the foreseeable economic consequences.

  4. Estimating small area health-related characteristics of populations: a methodological review

    Directory of Open Access Journals (Sweden)

    Azizur Rahman

    2017-05-01

    Estimation of health-related characteristics at a fine local geographic level is vital for effective health promotion programmes, provision of better health services, and population-specific health planning and management. The lack of a micro-dataset readily available for attributes of individuals at small areas negatively impacts the ability of local and national agencies to manage serious health issues and related risks in the community. A solution to this challenge would be to develop a method that simulates reliable small-area statistics. This paper provides a significant appraisal of the methodologies for estimating health-related characteristics of populations in geographically limited areas. Findings reveal that a range of methodologies are in use, which can be classified as three distinct sets of approaches: (i) indirect standardisation and individual-level modelling; (ii) multilevel statistical modelling; and (iii) micro-simulation modelling. Although each approach has its own strengths and weaknesses, it appears that microsimulation-based spatial models have significant robustness over the other methods and also represent a more precise means of estimating health-related population characteristics over small areas.

  5. Methodology proposal for estimation of carbon storage in urban green areas

    NARCIS (Netherlands)

    Schröder, C.; Mancosu, E.; Roerink, G.J.

    2013-01-01

    Methodology proposal for estimation of carbon storage in urban green areas; final report. Subtitle: Final report of Task 262-5-6 "Carbon sequestration in urban green infrastructure". Project manager: Marie Cugny-Seguin. Date: 15-10-2013

  6. Forensic anthropology casework-essential methodological considerations in stature estimation.

    Science.gov (United States)

    Krishan, Kewal; Kanchan, Tanuj; Menezes, Ritesh G; Ghosh, Abhik

    2012-03-01

    The examination of skeletal remains is a challenge to the medical examiner's/coroner's office and the forensic anthropologist conducting the investigation. One of the objectives of the medico-legal investigation is to estimate stature or height from various skeletal remains and body parts brought for examination. Various skeletal remains and body parts bear a positive and linear correlation with stature and have been successfully used for stature estimation. This concept is utilized in estimation of stature in forensic anthropology casework in mass disasters and other forensic examinations. Scientists have long been involved in standardizing the anthropological data with respect to various populations of the world. This review deals with some essential methodological issues that need to be addressed in research related to estimation of stature in forensic examinations. These issues have direct relevance in the identification of commingled or unknown remains and therefore it is essential that forensic nurses are familiar with the theories and techniques used in forensic anthropology. © 2012 International Association of Forensic Nurses.
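
    Stature estimation of the kind discussed rests on population-specific linear regressions of the form stature = a + b × bone length, reported with a standard error of estimate. The coefficients below are hypothetical placeholders for illustration, not published standards for any population:

```python
# Hypothetical regression of stature on maximum femur length (both in cm).
# These coefficients are placeholders, not a published standard.
a_cm = 61.4                # hypothetical intercept, cm
b = 2.4                    # hypothetical slope, cm of stature per cm of femur
see_cm = 3.3               # hypothetical standard error of estimate, cm

def estimate_stature(femur_cm):
    """Point estimate and a rough +/- 1 SEE interval, in cm."""
    s = a_cm + b * femur_cm
    return s, (s - see_cm, s + see_cm)

stature, interval = estimate_stature(45.0)
```

    The methodological issues the review raises, population specificity, secular change, and measurement standardization, all enter through the choice of a, b, and the SEE.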

  7. Compendium of Greenhouse Gas Emissions Estimation Methodologies for the Oil and Gas Industry

    Energy Technology Data Exchange (ETDEWEB)

    Shires, T.M.; Loughran, C.J. [URS Corporation, Austin, TX (United States)]

    2004-02-01

    This document is a compendium of currently recognized methods and provides details for all oil and gas industry segments to enhance consistency in emissions estimation. This Compendium aims to accomplish the following goals: Assemble an expansive collection of relevant emission factors for estimating GHG emissions, based on currently available public documents; Outline detailed procedures for conversions between different measurement unit systems, with particular emphasis on implementation of oil and gas industry standards; Provide descriptions of the multitude of oil and gas industry operations, in its various segments, and the associated emissions sources that should be considered; and Develop emission inventory examples, based on selected facilities from the various segments, to demonstrate the broad applicability of the methodologies. The overall objective of developing this document is to promote the use of consistent, standardized methodologies for estimating GHG emissions from petroleum industry operations. The resulting Compendium documents recognized calculation techniques and emission factors for estimating GHG emissions for oil and gas industry operations. These techniques cover the calculation or estimation of emissions from the full range of industry operations - from exploration and production through refining, to the marketing and distribution of products. The Compendium presents and illustrates the use of preferred and alternative calculation approaches for carbon dioxide (CO2), methane (CH4), and nitrous oxide (N2O) emissions for all common emission sources, including combustion, vented, and fugitive. Decision trees are provided to guide the user in selecting an estimation technique based on considerations of materiality, data availability, and accuracy. API will provide (free of charge) a calculation tool based on the emission estimation methodologies described herein. The tool will be made available at http://ghg.api.org/.
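
    The preferred calculation approaches described above share one core pattern: emissions = activity data × emission factor, with CH4 and N2O converted to CO2 equivalents via global warming potentials. The factors and GWPs below are illustrative placeholders rather than values taken from the Compendium's tables:

```python
# Emission-factor pattern: emissions = activity * EF, rolled up as CO2e.
# Factors and GWPs below are illustrative placeholders.
fuel_burned_mmbtu = 10000.0   # activity data: fuel fired in a heater

emission_factors = {          # kg of gas per MMBtu of fuel (assumed)
    "CO2": 53.06,
    "CH4": 0.001,
    "N2O": 0.0001,
}
gwp = {"CO2": 1.0, "CH4": 25.0, "N2O": 298.0}   # assumed GWP set

def co2e_tonnes(activity, factors, gwp):
    """Sum activity * EF * GWP across gases; report tonnes CO2e."""
    kg = sum(activity * ef * gwp[gas] for gas, ef in factors.items())
    return kg / 1000.0

total = co2e_tonnes(fuel_burned_mmbtu, emission_factors, gwp)
```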

  8. Heuristic Methodology for Estimating the Liquid Biofuel Potential of a Region

    Directory of Open Access Journals (Sweden)

    Dorel Dusmanescu

    2016-08-01

    This paper presents a heuristic methodology for estimating the possible variation of the liquid biofuel potential of a region over a future period of time. Determination of the liquid biofuel potential has so far been based either on an average (constant) yield of the energy crops used, or on a yield that varies according to a known trend, which can be estimated through a certain method. The proposed methodology uses the variation of the yield of energy crops over time to simulate a variation of the biofuel potential over a future ten-year period. This new approach to the problem of determining the liquid biofuel potential of a given land area can be useful to investors, as it allows a more realistic analysis of the investment risk and of the possibilities of recovering the investment. On the other hand, the presented methodology can be useful to government administrations in elaborating strategies and policies to ensure the supply of fuels and liquid biofuels for transportation in a given area. Unlike current methods, which approach the problem of determining the liquid biofuel potential in a deterministic way using econometric methods, the proposed methodology uses heuristic reasoning schemes to reduce the great number of factors that actually influence the biofuel potential and that usually have unknown values.

  9. Methodological Framework for Estimating the Correlation Dimension in HRV Signals

    Directory of Open Access Journals (Sweden)

    Juan Bolea

    2014-01-01

    This paper presents a methodological framework for robust estimation of the correlation dimension in HRV signals. It includes (i) a fast algorithm for on-line computation of correlation sums; (ii) fitting of log-log curves to a sigmoidal function for robust maximum-slope estimation, discarding estimates that do not meet the fitting requirements; (iii) three different approaches for linear-region slope estimation based on the latter point; and (iv) exponential fitting for robust estimation of the saturation level of the slope series with increasing embedding dimension, to finally obtain the correlation dimension estimate. Each approach for slope estimation leads to a correlation dimension estimate, called D^2, D^2⊥, and D^2max. D^2 and D^2max estimate the theoretical value of the correlation dimension for the Lorenz attractor with a relative error of 4%, and D^2⊥ with 1%. The three approaches are applied to HRV signals of pregnant women before spinal anesthesia for cesarean delivery in order to identify patients at risk for hypotension. D^2 maintains the 81% accuracy previously described in the literature, while the D^2⊥ and D^2max approaches reach 91% accuracy in the same database.
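
    The correlation sums in step (i) follow the Grassberger-Procaccia construction: C(r) is the fraction of point pairs closer than r, and the correlation dimension is the slope of log C(r) versus log r in its linear region. A minimal sketch on a synthetic 1-D uniform signal (expected slope near 1), without the paper's embedding and robust-fitting machinery:

```python
import math
import random

random.seed(0)
x = [random.random() for _ in range(400)]   # synthetic 1-D signal

def correlation_sum(pts, r):
    """Fraction of distinct point pairs whose distance is below r."""
    n = len(pts)
    close = sum(1 for i in range(n) for j in range(i + 1, n)
                if abs(pts[i] - pts[j]) < r)
    return 2.0 * close / (n * (n - 1))

# Slope of log C(r) vs log r between two radii in the linear region;
# for uniform 1-D data C(r) = r * (2 - r), so the slope is close to 1.
r1, r2 = 0.05, 0.2
slope = (math.log(correlation_sum(x, r2)) - math.log(correlation_sum(x, r1))) / (math.log(r2) - math.log(r1))
```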

  10. Development of Cost Estimation Methodology of Decommissioning for PWR

    International Nuclear Information System (INIS)

    Lee, Sang Il; Yoo, Yeon Jae; Lim, Yong Kyu; Chang, Hyeon Sik; Song, Geun Ho

    2013-01-01

    The permanent closure of a nuclear power plant should be conducted under strict laws and with thorough planning, including cost and schedule estimation, because the plant is heavily contaminated with radioactivity. In Korea, there are two types of nuclear power plant: the pressurized light water reactor (PWR), and the pressurized heavy water reactor (PHWR), known as the CANDU reactor. About 50% of the operating nuclear power plants in Korea are PWRs originally designed by CE (Combustion Engineering). There is experience with the decommissioning of Westinghouse-type PWRs, but little with CE-type PWRs. Therefore, the purpose of this paper is to develop a cost estimation methodology and evaluate the technical level of decommissioning for application to CE-type PWRs based on systems engineering. The aim of the present study is to develop a cost estimation methodology of decommissioning for application to PWRs. Through the study, the following conclusions are obtained: · Based on systems engineering, the decommissioning work can be classified into Set, Subset, Task, Subtask and Work cost units. · The Set and Task structures are grouped into 29 Sets and 15 Tasks, respectively. · The final result shows the cost and project schedule for project control and risk management. · The present results are preliminary and should be refined and improved based on modeling and cost data reflecting available technology and current costs, such as labor and waste data.

  11. A new method to estimate global mass transport and its implication for sea level rise

    Science.gov (United States)

    Yi, S.; Heki, K.

    2017-12-01

    Estimates of changes in global land mass from GRACE observations can be obtained by two methods, a mascon method and a forward modeling method; however, results from these two methods show inconsistent secular trends. The sea level budget can be used to validate consistency among observations of sea level rise from altimetry, steric change from the Argo project, and mass change from GRACE. Comparing mascon products from JPL, GSFC and CSR, we find that none of the three achieves a closed sea level budget, while the problem can be solved by a new forward modeling method. We further investigate the origin of this difference and speculate that it is caused by signal leakage from the ocean mass. It is well recognized that land signals leak into the oceans, but leakage also happens the other way around. We stress the importance of correcting for leakage from the ocean when estimating global land mass. Based on a reconciled sea level budget, we confirm that global sea level rise accelerated significantly over 2005-2015, as a result of the ongoing global temperature increase.
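The sea level budget check described above amounts to testing whether altimetric sea level equals the steric plus mass contributions within uncertainty. A toy version with made-up numbers (not GRACE/Argo/altimetry data):

```python
import numpy as np

# Illustrative annual sea level anomalies in mm (invented for this sketch)
altimetry = np.array([0.0, 3.6, 7.4, 11.1])   # total sea level (altimetry)
steric    = np.array([0.0, 1.1, 2.3, 3.4])    # thermal expansion (Argo)
mass      = np.array([0.0, 2.4, 5.0, 7.5])    # ocean mass (GRACE)

residual = altimetry - (steric + mass)        # budget misclosure per year
budget_closed = bool(np.all(np.abs(residual) < 0.5))  # closes within 0.5 mm
```

A persistent, growing residual in a comparison like this is what motivates looking for a systematic error such as ocean-to-land leakage.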

  12. Review of the Palisades pressure vessel accumulated fluence estimate and of the least squares methodology employed

    International Nuclear Information System (INIS)

    Griffin, P.J.

    1998-05-01

    This report provides a review of the Palisades submittal to the Nuclear Regulatory Commission requesting endorsement of their accumulated neutron fluence estimates based on a least squares adjustment methodology. The review highlights some minor issues in the applied methodology and provides recommendations for future work. The overall conclusion is that the Palisades fluence estimation methodology provides a reasonable approach to a "best estimate" of the accumulated pressure vessel neutron fluence and is consistent with state-of-the-art analysis as detailed in community consensus ASTM standards.

  13. Estimation of CO2 emissions from China’s cement production: Methodologies and uncertainties

    International Nuclear Information System (INIS)

    Ke, Jing; McNeil, Michael; Price, Lynn; Khanna, Nina Zheng; Zhou, Nan

    2013-01-01

    In 2010, China’s cement output was 1.9 Gt, which accounted for 56% of world cement production. Total carbon dioxide (CO2) emissions from Chinese cement production could therefore exceed 1.2 Gt. The magnitude of emissions from this single industrial sector in one country underscores the need to understand the uncertainty of current estimates of cement emissions in China. This paper compares several methodologies for calculating CO2 emissions from cement production, including the three main components of emissions: direct emissions from the calcination process for clinker production, direct emissions from fossil fuel combustion, and indirect emissions from electricity consumption. This paper examines in detail the differences between common methodologies for each emission component, and considers their effect on total emissions. We then evaluate the overall level of uncertainty implied by the differences among methodologies according to recommendations of the Joint Committee for Guides in Metrology. We find a relative uncertainty in China’s cement-related emissions in the range of 10 to 18%. This result highlights the importance of understanding and refining methods of estimating emissions in this important industrial sector. - Highlights: ► CO2 emission estimates are critical given China’s cement production scale. ► Methodological differences for emission components are compared. ► Results show relative uncertainty in China’s cement-related emissions of about 10%. ► IPCC Guidelines and CSI Cement CO2 and Energy Protocol are recommended
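Under the JCGM guidance, independent component uncertainties combine in quadrature. A hedged sketch with placeholder magnitudes (the paper's actual component values and uncertainties differ):

```python
import math

# (emissions in Mt CO2, relative uncertainty) -- illustrative values only
components = {
    "calcination": (700.0, 0.10),
    "fossil fuel": (350.0, 0.15),
    "electricity": (150.0, 0.20),
}

total = sum(v for v, _ in components.values())
# Quadrature combination of independent absolute uncertainties
u_abs = math.sqrt(sum((v * u) ** 2 for v, u in components.values()))
relative_uncertainty = u_abs / total
```

Note how the combined relative uncertainty is smaller than the largest component's, because the components are assumed independent.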

  14. Development of Methodologies for the Estimation of Thermal Properties Associated with Aerospace Vehicles

    Science.gov (United States)

    Scott, Elaine P.

    1996-01-01

    A thermal stress analysis is an important aspect in the design of aerospace structures and vehicles such as the High Speed Civil Transport (HSCT) at the National Aeronautics and Space Administration Langley Research Center (NASA-LaRC). These structures are complex and are often composed of numerous components fabricated from a variety of different materials. The thermal loads on these structures induce temperature variations within the structure, which in turn result in the development of thermal stresses. Therefore, a thermal stress analysis requires knowledge of the temperature distributions within the structures, which in turn requires accurate knowledge of the thermal properties, boundary conditions and thermal interface conditions associated with the structural materials. The goal of this multi-year research effort was to develop estimation methodologies for the determination of the thermal properties and interface conditions associated with aerospace vehicles. Specific objectives focused on the development and implementation of optimal experimental design strategies and methodologies for the estimation of thermal properties associated with simple composite and honeycomb structures. The strategy used in this multi-year research effort was to first develop methodologies for relatively simple systems and then systematically modify these methodologies to analyze complex structures. This can be thought of as a building block approach. This strategy was intended to promote maximum usability of the resulting estimation procedure by NASA-LaRC researchers through the design of in-house experimentation procedures and through the use of an existing general-purpose finite element software package.

  15. A systematic methodology to estimate added sugar content of foods.

    Science.gov (United States)

    Louie, J C Y; Moshtaghian, H; Boylan, S; Flood, V M; Rangan, A M; Barclay, A W; Brand-Miller, J C; Gill, T P

    2015-02-01

    The effect of added sugar on health is a topical area of research. However, there is currently no analytical or other method to easily distinguish between added sugars and naturally occurring sugars in foods. This study aimed to develop a systematic methodology to estimate added sugar values on the basis of analytical data and the ingredients of foods. A 10-step protocol was developed, starting with objective measures (six steps) followed by more subjective estimation (four steps) if insufficient objective data are available. The method was applied to an Australian food composition database (AUSNUT2007) as an example. Of the 3874 foods in AUSNUT2007, 2977 (77%) were assigned an estimated value on the basis of objective measures (steps 1-6), and 897 (23%) were assigned a subjectively estimated value (steps 7-10). Repeatability analysis showed that the estimated values are highly repeatable. We propose that this method can be considered a standardised approach for estimating the added sugar content of foods, to improve cross-study comparison.
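The 10-step protocol is effectively a fallback cascade: objective rules are tried first, subjective ones only if the earlier steps fail. A minimal sketch of that control flow (the step rules here are invented, not the published ones):

```python
def estimate_added_sugar(food, steps):
    """Return the first successful step's estimate plus the step number used.

    Lower step numbers correspond to more objective evidence.
    """
    for rank, step in enumerate(steps, start=1):
        value = step(food)
        if value is not None:
            return value, rank
    return None, None

# Two toy steps standing in for the protocol's ten
steps = [
    lambda f: 0.0 if f.get("unsweetened") else None,   # objective: no added sugar
    lambda f: f.get("total_sugar", 0.0) * 0.5,         # subjective fallback guess
]

value, rank = estimate_added_sugar({"total_sugar": 10.0}, steps)   # (5.0, 2)
```

Recording the step number alongside the value, as here, is what lets the authors report how many foods were assigned objectively (steps 1-6) versus subjectively (steps 7-10).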

  16. Systematic methodology for estimating direct capital costs for blanket tritium processing systems

    International Nuclear Information System (INIS)

    Finn, P.A.

    1985-01-01

    This paper describes the methodology developed for estimating the relative capital costs of blanket processing systems. The capital costs of the nine blanket concepts selected in the Blanket Comparison and Selection Study are presented and compared

  17. A regressive methodology for estimating missing data in rainfall daily time series

    Science.gov (United States)

    Barca, E.; Passarella, G.

    2009-04-01

    The presence of gaps in environmental data time series is a very common but critical problem, since it can produce biased results (Rubin, 1976). Missing data plague almost all surveys; the problem is how to deal with them once it has been deemed impossible to recover the actual missing values. Apart from the amount of missing data, another issue that plays an important role in the choice of a recovery approach is the evaluation of the missingness mechanism. When missingness is conditioned by some other variable observed in the data set (Schafer, 1997), the mechanism is called MAR (Missing At Random). When the missingness mechanism depends on the actual value of the missing data, it is called NMAR (Not Missing At Random); this is the most difficult condition to model. In the last decade, interest arose in estimating missing data by regression (single imputation). More recently, multiple imputation has also become available, which returns a distribution of estimated values (Scheffer, 2002). In this paper an automatic methodology for estimating missing data is presented. In practice, given a gauging station affected by missing data (the target station), the methodology checks the randomness of the missing data and classifies the "similarity" between the target station and the other gauging stations spread over the study area. Among the various methods for defining the degree of similarity, whose effectiveness strongly depends on the data distribution, the Spearman correlation coefficient was chosen. Once the similarity matrix is defined, a suitable nonparametric, univariate regressive method, the Theil method (Theil, 1950), is applied to estimate the missing data in the target station. Although the methodology proved rather reliable, the estimation of missing data can be improved by generalization.
A first possible improvement consists in extending the univariate technique to
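The Theil method at the core of the record above fits a line as the median of all pairwise slopes, which makes it robust to outliers in rainfall records. A minimal sketch with invented station data:

```python
import numpy as np

def theil_fit(x, y):
    """Theil's nonparametric line: median pairwise slope, median-residual intercept."""
    slopes = [(y[j] - y[i]) / (x[j] - x[i])
              for i in range(len(x))
              for j in range(i + 1, len(x))
              if x[j] != x[i]]
    b = float(np.median(slopes))
    a = float(np.median(y - b * x))
    return a, b

# Jointly observed daily rainfall (mm): most-similar neighbour x, target station y
x = np.array([0.0, 2.0, 5.0, 7.0, 10.0])
y = np.array([0.5, 2.4, 5.6, 7.4, 10.5])

a, b = theil_fit(x, y)
imputed = a + b * 4.0   # fill the target's gap on a day the neighbour saw 4 mm
```

In the methodology the regressor station would be the one ranked most similar by the Spearman correlation matrix.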

  18. Methodology for estimating reprocessing costs for nuclear fuels

    International Nuclear Information System (INIS)

    Carter, W.L.; Rainey, R.H.

    1980-02-01

    A technological and economic evaluation of reprocessing requirements for alternate fuel cycles requires a common assessment method and a common basis to which various cycles can be related. A methodology is described for the assessment of alternate fuel cycles utilizing a side-by-side comparison of functional flow diagrams of major areas of the reprocessing plant with corresponding diagrams of the well-developed Purex process as installed in the Barnwell Nuclear Fuel Plant (BNFP). The BNFP treats 1500 metric tons of uranium per year (MTU/yr). Complexity and capacity factors are determined for adjusting the estimated facility and equipment costs of BNFP to determine the corresponding costs for the alternate fuel cycle. Costs of capacities other than the reference 1500 MT of heavy metal per year are estimated by the use of scaling factors. Unit costs of reprocessed fuel are calculated using a discounted cash flow analysis for three economic bases to show the effect of low-risk, typical, and high-risk financing methods
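The complexity and capacity adjustments described above can be written as a one-line scaling relation. The exponent and figures below are placeholders (a generic six-tenths rule), not the study's calibrated factors:

```python
def scaled_cost(ref_cost, ref_capacity, capacity, complexity=1.0, exponent=0.6):
    """Scale a reference plant cost by a complexity factor and a capacity ratio."""
    return ref_cost * complexity * (capacity / ref_capacity) ** exponent

# Cost of a hypothetical 750 MTU/yr plant relative to a $1.0e9, 1500 MTU/yr reference
cost_750 = scaled_cost(1.0e9, 1500.0, 750.0)
```

The sub-unity exponent captures the economy of scale: halving capacity reduces cost by only about a third here, which is why unit reprocessing costs rise sharply for small plants.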

  19. Space-planning and structural solutions of low-rise buildings: Optimal selection methods

    Science.gov (United States)

    Gusakova, Natalya; Minaev, Nikolay; Filushina, Kristina; Dobrynina, Olga; Gusakov, Alexander

    2017-11-01

    The present study is devoted to a methodology for appropriately selecting space-planning and structural solutions for low-rise buildings. The objective of the study is to work out a system of criteria influencing the selection of the space-planning and structural solutions most suitable for low-rise buildings and structures. Applying the defined criteria in practice aims to enhance the efficiency of capital investments and of energy and resource saving, and to create comfortable conditions for the population, considering the climatic zoning of the construction site. The project's results can be applied when implementing investment-construction projects of low-rise housing in different kinds of territories based on local building materials. A system of criteria influencing the optimal selection of space-planning and structural solutions for low-rise buildings has been developed, together with a methodological basis for assessing whether the selected solutions satisfy the requirements of energy efficiency, comfort, safety and economic efficiency. The elaborated methodology makes it possible to intensify low-rise construction development for different types of territories, taking into account the climatic zoning of the construction site. Stimulation of low-rise construction should be based on a system of scientifically justified approaches, allowing the energy efficiency, comfort, safety and economic effectiveness of low-rise buildings to be enhanced.

  20. Australian methodology for the estimation of greenhouse gas emissions and sinks: Agriculture: Workbook for livestock: Workbook 6.0

    Energy Technology Data Exchange (ETDEWEB)

    Bureau of Resource Sciences, Canberra, ACT (Australia)

    1994-12-31

    This workbook details a methodology for estimating methane emissions from Australian livestock. The workbook is designed to be consistent with international guidelines and takes into account special Australian conditions. While livestock are regarded as a significant source of anthropogenic methane emissions, the document also acknowledges that they do not provide sinks for methane or any other greenhouse gas. Methane can originate both from fermentation processes in the digestive tracts of all livestock and from manure under certain management conditions. Methane emissions were estimated for beef cattle, dairy cattle, sheep, pigs, poultry, goats, horses, deer, buffalo, camels, emus and ostriches, alpacas, and donkeys and mules. Two methodologies were used to estimate emissions. One is the standard Intergovernmental Panel on Climate Change (IPCC) Tier 1 methodology, which is needed to provide inter-country comparisons of emissions. The other was developed by the Inventory Methodology Working Group and represents the best current Australian method for estimating greenhouse gas emissions from Australian livestock. (author). 6 tabs., 22 refs.
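The Tier 1 arithmetic is simply head count times a default per-head emission factor, summed over livestock classes. The populations and factors below are placeholders, not the workbook's figures:

```python
# (head count, emission factor in kg CH4 per head per year) -- illustrative only
livestock = {
    "beef_cattle":  (24_000_000, 60.0),
    "dairy_cattle": ( 2_000_000, 80.0),
    "sheep":        (120_000_000, 6.0),
}

# National enteric methane estimate, converted from kg to Gg
total_gg_ch4 = sum(pop * ef for pop, ef in livestock.values()) / 1.0e6
```

The Australian-specific method refines exactly these emission factors (by feed intake, climate zone and management), while keeping the same population-times-factor structure.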

  1. Methodologies for estimating toxicity of shoreline cleaning agents in the field

    International Nuclear Information System (INIS)

    Clayton, J.R.Jr.; Stransky, B.C.; Schwartz, M.J.; Snyder, B.J.; Lees, D.C.; Michel, J.; Reilly, T.J.

    1996-01-01

    Four methodologies that could be used in a portable kit to obtain quantitative and qualitative estimates of the toxicity of shoreline cleaning agents were evaluated. Shoreline cleaning agents (SCAs) are meant to enhance the removal of treated oil from shoreline surfaces and should not increase adverse impacts to organisms in a treated area. Tests, therefore, should be performed with resident organisms likely to be impacted during the use of SCAs. The four methodologies were Microtox™, fertilization success in echinoderm eggs, byssal thread attachment in mussels, and righting and water-escaping ability in periwinkle snails. Site-specific variations in the physical and chemical properties of the oils and SCAs were considered, and results were provided for all combinations of oils and SCAs. The evaluation showed that all four methodologies provided sufficient information to assist a user in deciding whether or not the use of an SCA was warranted. 33 refs., 7 tabs., 11 figs

  2. Uterotonic use immediately following birth: using a novel methodology to estimate population coverage in four countries.

    Science.gov (United States)

    Ricca, Jim; Dwivedi, Vikas; Varallo, John; Singh, Gajendra; Pallipamula, Suranjeen Prasad; Amade, Nazir; de Luz Vaz, Maria; Bishanga, Dustan; Plotkin, Marya; Al-Makaleh, Bushra; Suhowatsky, Stephanie; Smith, Jeffrey Michael

    2015-01-22

    Postpartum hemorrhage (PPH) is the leading cause of maternal mortality in developing countries. While the incidence of PPH can be dramatically reduced by uterotonic use immediately following birth (UUIFB) in both community and facility settings, national coverage estimates are rare. Most national health systems have no indicator to track this, and community-based measurements are even scarcer. To fill this information gap, a methodology for estimating national coverage of UUIFB was developed and piloted in four settings. The rapid estimation methodology consisted of convening a group of national technical experts and using the Delphi method to come to consensus on key data elements that were applied to a simple algorithm, generating an approximate national estimate of coverage of UUIFB. Data elements needed for the calculation were the distribution of births by location and estimates of UUIFB in each of those settings, adjusted to take account of stockout rates and potency of uterotonics. This exercise was conducted in 2013 in Mozambique, Tanzania, the state of Jharkhand in India, and Yemen. Available data showed that deliveries in public health facilities account for approximately half of births in Mozambique and Tanzania, 16% in Jharkhand and 24% of births in Yemen. Significant proportions of births occur in private facilities in Jharkhand and faith-based facilities in Tanzania. Estimated uterotonic use for facility births ranged from 70 to 100%. Uterotonics are not used routinely for PPH prevention at home births in any of the settings. National UUIFB coverage estimates of all births were 43% in Mozambique, 40% in Tanzania, 44% in Jharkhand, and 14% in Yemen. This methodology for estimating coverage of UUIFB was found to be feasible and acceptable. While the exercise produces imprecise estimates whose validity cannot be assessed objectively in the absence of a gold standard estimate, stakeholders felt they were accurate enough to be actionable. The exercise
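The Delphi-derived data elements feed a simple weighted-sum algorithm: each birth setting contributes its share of births times its adjusted uterotonic use. A sketch with invented inputs (not the study's country data):

```python
# (share of births, uterotonic use, stockout rate, potency adjustment)
settings = {
    "public facility":  (0.50, 0.90, 0.10, 0.95),
    "private facility": (0.10, 0.80, 0.05, 0.95),
    "home":             (0.40, 0.00, 0.00, 1.00),
}

# National coverage: births-weighted, stockout- and potency-adjusted use
coverage = sum(share * use * (1.0 - stockout) * potency
               for share, use, stockout, potency in settings.values())
```

With zero routine use at home births, the home share caps achievable national coverage, which mirrors the low estimates reported for settings with many home deliveries.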

  3. Methodology to estimate the cost of the severe accidents risk / maximum benefit

    International Nuclear Information System (INIS)

    Mendoza, G.; Flores, R. M.; Vega, E.

    2016-09-01

    For programs and activities to manage aging effects, any changes to plant operations, inspections, maintenance activities, systems and administrative control procedures during the renewal period that could impact the environment should be characterized and designed to manage the effects of aging as required by 10 CFR Part 54. Environmental impacts significantly different from those described in the final environmental statement for the current operating license should be described in detail. When complying with the requirements of a license renewal application, the Severe Accident Mitigation Alternatives (SAMA) analysis is contained in a supplement to the environmental report of the plant that meets the requirements of 10 CFR Part 51. In this paper, a methodology for estimating the cost of severe accident risk is established and discussed. It is then used to identify and select alternatives for severe accident mitigation, which are analyzed to estimate the maximum benefit an alternative could achieve if it eliminated all risk. The cost of severe accident risk is estimated using the regulatory analysis techniques of the US Nuclear Regulatory Commission (NRC). The ultimate goal of implementing the methodology is to identify SAMA candidates that have the potential to reduce severe accident risk and to determine whether the implementation of each candidate is cost-effective. (Author)

  4. Methodology development for estimating support behavior of spacer grid spring in core

    International Nuclear Information System (INIS)

    Yoon, Kyung Ho; Kang, Heung Seok; Kim, Hyung Kyu; Song, Kee Nam

    1998-04-01

    The fuel rod (FR) support behavior changes during operation as a result of effects such as clad creep-down, spring force relaxation due to irradiation, and irradiation growth of spacer straps with time or increasing burnup. The FR support behavior is closely associated with FR damage due to fretting; therefore, analysis of the FR support behavior is normally required to minimize the damage. The characteristics of the parameters that affect the FR support behavior, and the methodology developed for estimating the FR support behavior in the reactor core, are described in this work. The FR support condition for the KOFA (KOrean Fuel Assembly) fuel has been analyzed by this method, and the results show that fuel failure due to fuel rod fretting wear is closely related to the support behavior of the FR in the core. Therefore, the present methodology seems to be useful for estimating the actual FR support condition. In addition, optimization seems to be a reliable tool for establishing the optimal support condition on the basis of these results. (author). 15 refs., 3 tabs., 26 figs

  5. Development of Methodologies for the Estimation of Thermal Properties Associated with Aerospace Vehicles

    Science.gov (United States)

    Scott, Elaine P.

    1994-01-01

    Thermal stress analyses are an important aspect in the development of aerospace vehicles at NASA-LaRC. These analyses require knowledge of the temperature distributions within the vehicle structures, which consequently necessitates accurate thermal property data. The overall goal of this ongoing research effort is to develop methodologies for the estimation of the thermal property data needed to describe the temperature responses of these complex structures. The research strategy undertaken utilizes a building block approach: first develop property estimation methodologies for relatively simple conditions, such as isotropic materials at constant temperatures, and then systematically modify the technique for the analysis of more and more complex systems, such as anisotropic multi-component systems. The estimation methodology is a statistically based method which incorporates experimental data and a mathematical model of the system. Several aspects of this overall research effort were investigated during the ASEE summer program. One important aspect involved the calibration of the estimation procedure for the estimation of thermal properties through the thickness of a standard material. Transient experiments were conducted using a Pyrex standard at various temperatures, and the thermal properties (thermal conductivity and volumetric heat capacity) were estimated at each temperature. Confidence regions for the estimated values were also determined, and these results were compared to documented values. Another set of experimental tests was conducted on carbon composite samples at different temperatures. Again, the thermal properties were estimated for each temperature, and the results were compared with values obtained using another technique. In both sets of experiments, a 10-15 percent offset between the estimated values and the previously determined values was found.
Another effort
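The statistically based estimation described in this record can be caricatured in one parameter: fit a transient thermal model to measured temperatures by least squares. This toy version uses a lumped cooling model and a grid search in place of the full procedure; all numbers are invented:

```python
import numpy as np

def model(t, tau, T0=100.0, Tinf=20.0):
    """Lumped-capacitance cooling curve; tau lumps the thermal properties."""
    return Tinf + (T0 - Tinf) * np.exp(-t / tau)

def estimate_tau(t, T_meas, candidates):
    """Pick the candidate tau minimising the sum of squared residuals."""
    sse = [np.sum((T_meas - model(t, tau)) ** 2) for tau in candidates]
    return float(candidates[int(np.argmin(sse))])

t = np.linspace(0.0, 50.0, 26)
rng = np.random.default_rng(0)
T_meas = model(t, 12.0) + rng.normal(0.0, 0.2, t.size)   # synthetic "experiment"

tau_hat = estimate_tau(t, T_meas, np.linspace(5.0, 20.0, 151))
```

The real effort additionally chooses the experimental design (heating duration, sensor placement) to tighten the confidence region around the estimate, which a grid search alone does not address.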

  6. Methodological Challenges in Estimating Trends and Burden of Cardiovascular Disease in Sub-Saharan Africa

    Directory of Open Access Journals (Sweden)

    Jacob K. Kariuki

    2015-01-01

    Full Text Available Background. Although 80% of the burden of cardiovascular disease (CVD is in developing countries, the 2010 global burden of disease (GBD estimates have been cited to support a premise that sub-Saharan Africa (SSA is exempt from the CVD epidemic sweeping across developing countries. The widely publicized perspective influences research priorities and resource allocation at a time when secular trends indicate a rapid increase in prevalence of CVD in SSA by 2030. Purpose. To explore methodological challenges in estimating trends and burden of CVD in SSA via appraisal of the current CVD statistics and literature. Methods. This review was guided by the Critical review methodology described by Grant and Booth. The review traces the origins and evolution of GBD metrics and then explores the methodological limitations inherent in the current GBD statistics. Articles were included based on their conceptual contribution to the existing body of knowledge on the burden of CVD in SSA. Results/Conclusion. Cognizant of the methodological challenges discussed, we caution against extrapolation of the global burden of CVD statistics in a way that underrates the actual but uncertain impact of CVD in SSA. We conclude by making a case for optimal but cost-effective surveillance and prevention of CVD in SSA.

  7. Improved best estimate plus uncertainty methodology, including advanced validation concepts, to license evolving nuclear reactors

    International Nuclear Information System (INIS)

    Unal, C.; Williams, B.; Hemez, F.; Atamturktur, S.H.; McClure, P.

    2011-01-01

    Research highlights: → The best estimate plus uncertainty methodology (BEPU) is one option in the licensing of nuclear reactors. → The challenges for extending the BEPU method for fuel qualification for an advanced reactor fuel are primarily driven by schedule, the need for data, and the sufficiency of the data. → In this paper we develop an extended BEPU methodology that can potentially be used to address these new challenges in the design and licensing of advanced nuclear reactors. → The main components of the proposed methodology are verification, validation, calibration, and uncertainty quantification. → The methodology includes a formalism to quantify an adequate level of validation (predictive maturity) with respect to existing data, so that required new testing can be minimized, saving cost by demonstrating that further testing will not enhance the quality of the predictive tools. - Abstract: Many evolving nuclear energy technologies use advanced predictive multiscale, multiphysics modeling and simulation (M&S) capabilities to reduce the cost and schedule of design and licensing. Historically, the role of experiments has been as a primary tool for the design and understanding of nuclear system behavior, while M&S played the subordinate role of supporting experiments. In the new era of multiscale, multiphysics computational-based technology development, this role has been reversed. The experiments will still be needed, but they will be performed at different scales to calibrate and validate the models leading to predictive simulations for design and licensing. Minimizing the required number of validation experiments produces cost and time savings. The use of multiscale, multiphysics models introduces challenges in validating these predictive tools - traditional methodologies will have to be modified to address these challenges. This paper gives the basic aspects of a methodology that can potentially be used to address these new challenges in

  8. Experimentation and Prediction of Temperature Rise in Turning ...

    African Journals Online (AJOL)

    Experimentation and Prediction of Temperature Rise in Turning Process using Response Surface Methodology. Science, Technology and Arts Research Journal.

  9. Utility Estimation for Pediatric Vesicoureteral Reflux: Methodological Considerations Using an Online Survey Platform.

    Science.gov (United States)

    Tejwani, Rohit; Wang, Hsin-Hsiao S; Lloyd, Jessica C; Kokorowski, Paul J; Nelson, Caleb P; Routh, Jonathan C

    2017-03-01

    The advent of online task distribution has opened a new avenue for efficiently gathering the community perspectives needed for utility estimation. Methodological consensus for estimating pediatric utilities is lacking, with disagreement over whom to sample, what perspective to use (patient vs parent) and whether instrument-induced anchoring bias is significant. We evaluated which methodological factors potentially impact utility estimates for vesicoureteral reflux. Cross-sectional surveys using a time trade-off instrument were conducted via the Amazon Mechanical Turk® (https://www.mturk.com) online interface. Respondents were randomized to answer questions from child, parent or dyad perspectives on the utility of a vesicoureteral reflux health state and 1 of 3 "warm-up" scenarios (paralysis, common cold, none) before a vesicoureteral reflux scenario. Utility estimates and potential predictors were fitted to a generalized linear model to determine which factors most impacted utilities. A total of 1,627 responses were obtained. Mean respondent age was 34.9 years. Of the respondents 48% were female, 38% were married and 44% had children. Utility values were uninfluenced by child/personal vesicoureteral reflux/urinary tract infection history, income or race. Utilities were affected by perspective and were higher in the child group (34% lower in parent vs child, p ...) ... pediatric conditions. Copyright © 2017 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
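A time trade-off instrument converts the years a respondent would give up into a utility. A minimal sketch of that conversion (the survey's actual horizon and framing may differ):

```python
def tto_utility(years_traded, time_horizon):
    """Time trade-off utility: fraction of the horizon the respondent keeps."""
    return (time_horizon - years_traded) / time_horizon

# Giving up 2 of 10 years to avoid the health state implies utility 0.8
u = tto_utility(2.0, 10.0)
```

Anchoring bias, one of the record's concerns, would show up as this value shifting systematically depending on which warm-up scenario the respondent saw first.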

  10. Tidally adjusted estimates of topographic vulnerability to sea level rise and flooding for the contiguous United States

    International Nuclear Information System (INIS)

    Strauss, Benjamin H; Ziemlinski, Remik; Weiss, Jeremy L; Overpeck, Jonathan T

    2012-01-01

    Because sea level could rise 1 m or more during the next century, it is important to understand what land, communities and assets may be most at risk from increased flooding and eventual submersion. Employing a recent high-resolution edition of the National Elevation Dataset and using VDatum, a newly available tidal model covering the contiguous US, together with data from the 2010 Census, we quantify low-lying coastal land, housing and population relative to local mean high tide levels, which range from ∼0 to 3 m in elevation (North American Vertical Datum of 1988). Previous work at regional to national scales has sometimes equated elevation with the amount of sea level rise, leading to underestimated risk anywhere where the mean high tide elevation exceeds 0 m, and compromising comparisons across regions with different tidal levels. Using our tidally adjusted approach, we estimate the contiguous US population living on land within 1 m of high tide to be 3.7 million. In 544 municipalities and 38 counties, we find that over 10% of the population lives below this line; all told, some 2150 towns and cities have some degree of exposure. At the state level, Florida, Louisiana, California, New York and New Jersey have the largest sub-meter populations. We assess topographic susceptibility of land, housing and population to sea level rise for all coastal states, counties and municipalities, from 0 to 6 m above mean high tide, and find important threat levels for widely distributed communities of every size. We estimate that over 22.9 million Americans live on land within 6 m of local mean high tide. (letter)
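The tidal adjustment reduces to subtracting a spatially varying mean high water surface from the elevation grid before thresholding. A toy grid version (invented numbers, not NED/VDatum/Census data):

```python
import numpy as np

elev = np.array([[0.5, 1.2],
                 [2.0, 3.5]])      # NAVD88 elevations (m)
mhw  = np.array([[0.3, 0.3],
                 [1.0, 1.0]])      # local mean high water surface (m)
pop  = np.array([[100, 200],
                 [300, 400]])      # population per cell

height_above_tide = elev - mhw                       # tidally adjusted elevation
exposed = int(pop[height_above_tide <= 1.0].sum())   # people within 1 m of high tide
```

Thresholding raw elevation instead of `height_above_tide` is exactly the error the record describes: where the tidal datum exceeds 0 m, it undercounts the exposed population.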

  11. A robust methodology for modal parameters estimation applied to SHM

    Science.gov (United States)

    Cardoso, Rharã; Cury, Alexandre; Barbosa, Flávio

    2017-10-01

    The subject of structural health monitoring has been drawing more and more attention over recent years. Many vibration-based techniques aiming at detecting small structural changes or even damage have been developed or enhanced through successive research efforts. Lately, several studies have focused on the use of raw dynamic data to assess information about structural condition. Despite this trend and much skepticism, many methods still rely on modal parameters as fundamental data for damage detection. Therefore, it is of utmost importance that modal identification procedures are performed with a sufficient level of precision and automation. To fulfill these requirements, this paper presents a novel automated time-domain methodology to identify modal parameters based on a two-step clustering analysis. The first step clusters mode estimates from parametric models of different orders, usually presented in stabilization diagrams; in an automated manner, this first clustering analysis indicates which estimates correspond to physical modes. To avoid the detection of spurious modes or the loss of physical ones, a second clustering step is then performed, mining the information gathered in the first step. To demonstrate the robustness and efficiency of the proposed methodology, numerically generated signals as well as experimental data obtained from a simply supported beam tested in the laboratory and from a railway bridge are utilized. The results proved more robust and accurate compared to those obtained from methods based on a one-step clustering analysis.
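The two-step idea can be caricatured in a few lines: step one groups pole estimates pooled over model orders, step two keeps only well-populated (stable) groups. This greedy 1-D sketch clusters on frequency alone, whereas the full method also uses damping and further data mining:

```python
import numpy as np

def cluster_frequencies(freqs, tol=0.5):
    """Step 1: greedily group sorted frequency estimates within `tol` Hz."""
    clusters = []
    for f in sorted(freqs):
        if clusters and f - np.mean(clusters[-1]) < tol:
            clusters[-1].append(f)
        else:
            clusters.append([f])
    return clusters

def physical_modes(clusters, min_size=3):
    """Step 2: estimates recurring across many model orders are physical modes."""
    return [float(np.mean(c)) for c in clusters if len(c) >= min_size]

# Pole frequencies pooled from models of increasing order: two stable modes
# (near 2 and 7 Hz) plus spurious poles that appear only once
freqs = [1.98, 2.00, 2.02, 5.50, 6.99, 7.00, 7.01, 9.30]
modes = physical_modes(cluster_frequencies(freqs))
```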

  12. Evaluating the effects of dam breach methodologies on Consequence Estimation through Sensitivity Analysis

    Science.gov (United States)

    Kalyanapu, A. J.; Thames, B. A.

    2013-12-01

    Dam breach modeling often includes application of models that are sophisticated, yet computationally intensive, to compute flood propagation at high temporal and spatial resolutions. This results in a significant need for computational capacity that requires development of newer flood models using multi-processor and graphics processing techniques. Recently, a comprehensive benchmark exercise, the 12th Benchmark Workshop on Numerical Analysis of Dams, was organized by the International Commission on Large Dams (ICOLD) to evaluate the performance of the various tools used for dam break risk assessment. The ICOLD workshop focused on estimating the consequences of failure of a hypothetical dam near a hypothetical populated area with complex demographics and economic activity. The current study uses this hypothetical case and focuses on evaluating the effects of dam breach methodologies on consequence estimation and analysis. It uses the ICOLD hypothetical data, including the topography, dam geometric and construction information, and land use/land cover data, along with socio-economic and demographic data. The objective of this study is to evaluate the impacts of using four different dam breach methods on the consequence estimates used in the risk assessments. The four methodologies used are: i) Froehlich (1995), ii) MacDonald and Langridge-Monopolis 1984 (MLM), iii) Von Thun and Gillette 1990 (VTG), and iv) Froehlich (2008). To achieve this objective, three different modeling components were used. First, using HEC-RAS v.4.1, dam breach discharge hydrographs are developed. These hydrographs are then provided as flow inputs to a two-dimensional flood model named Flood2D-GPU, which leverages the computer's graphics card for improved computational performance.
Lastly, outputs from Flood2D-GPU, including inundated areas, depth grids, velocity grids, and flood wave arrival time grids, are input into HEC-FIA, which provides the
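
The breach-parameter step of the workflow above can be sketched with the regressions commonly cited for Froehlich (1995). Treat the coefficients and exponents below as assumptions to be verified against the original reference before any real use:

```python
import math

# Commonly cited form of the Froehlich (1995) breach-parameter regressions:
#   average breach width  B  = 0.1803 * Ko * Vw**0.32 * hb**0.19   [m]
#   failure time          tf = 0.00254 * Vw**0.53 * hb**-0.90      [h]
# Vw = reservoir volume at failure [m^3], hb = breach height [m],
# Ko = 1.4 for overtopping failures, 1.0 for piping.

def froehlich_1995(Vw, hb, overtopping=True):
    Ko = 1.4 if overtopping else 1.0
    B = 0.1803 * Ko * Vw**0.32 * hb**0.19
    tf = 0.00254 * Vw**0.53 * hb**-0.90
    return B, tf

# Hypothetical reservoir: 50 million m^3 behind a 20 m high dam.
B, tf = froehlich_1995(Vw=50e6, hb=20.0, overtopping=True)
print(round(B, 1), round(tf, 2))
```

The resulting width and failure time parameterize the breach hydrograph (here, the HEC-RAS input); the sensitivity analysis in the study stems from how differently the four regressions estimate these two quantities.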

  13. Project risk management in the construction of high-rise buildings

    Science.gov (United States)

    Titarenko, Boris; Hasnaoui, Amir; Titarenko, Roman; Buzuk, Liliya

    2018-03-01

    This paper presents project risk management methods that allow risks in the construction of high-rise buildings to be better identified and managed throughout the life cycle of the project. One of the project risk management processes is a quantitative analysis of risks. The quantitative analysis usually includes the assessment of the potential impact of project risks and of their probabilities. This paper reviews the most popular methods of risk probability assessment and indicates the advantages of the robust approach over traditional methods. Within the framework of the project risk management model, the robust approach of P. Huber is applied and extended to the tasks of regression analysis of project data. The suggested algorithms for assessing the parameters of statistical models yield reliable estimates. A review of the theoretical problems in developing robust models built on the methodology of minimax estimates was carried out, and an algorithm for the situation of asymmetric "contamination" was developed.

  14. Methodological Framework for World Health Organization Estimates of the Global Burden of Foodborne Disease.

    Directory of Open Access Journals (Sweden)

    Brecht Devleesschauwer

    Full Text Available. The Foodborne Disease Burden Epidemiology Reference Group (FERG) was established in 2007 by the World Health Organization to estimate the global burden of foodborne diseases (FBDs). This paper describes the methodological framework developed by FERG's Computational Task Force to transform epidemiological information into FBD burden estimates. The global and regional burden of 31 FBDs was quantified, along with limited estimates for 5 other FBDs, using Disability-Adjusted Life Years in a hazard- and incidence-based approach. To accomplish this task, the following workflow was defined: outline of disease models and collection of epidemiological data; design and completion of a database template; development of an imputation model; identification of disability weights; probabilistic burden assessment; and estimation of the proportion of the disease burden by each hazard that is attributable to exposure by food (i.e., source attribution). All computations were performed in R and the different functions were compiled in the R package 'FERG'. Traceability and transparency were ensured by sharing results and methods in an interactive way with all FERG members throughout the process. We developed a comprehensive framework for estimating the global burden of FBDs, in which methodological simplicity and transparency were key elements. All the tools developed have been made available and can be translated into a user-friendly national toolkit for studying and monitoring food safety at the local level.

  15. A PROPOSED METHODOLOGY FOR ESTIMATING ECOREGIONAL VALUES FOR OUTDOOR RECREATION IN THE UNITED STATES

    OpenAIRE

    Bhat, Gajanan; Bergstrom, John C.; Bowker, James Michael; Cordell, H. Ken

    1996-01-01

    This paper provides a methodology for the estimation of recreational demand functions and values using an ecoregional approach. Ten ecoregions in the continental US were defined based on similarly functioning ecosystem characteristics. The individual travel cost method was employed to estimate the recreational demand functions for activities such as motorboating and waterskiing, developed and primitive camping, coldwater fishing, sightseeing and pleasure driving, and big game hunting for each ecor...
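
The individual travel cost method can be sketched with a semi-log demand specification, ln(trips) = a - b * cost, under which consumer surplus per trip is 1/b. The specification choice and the data below are illustrative assumptions, not the paper's:

```python
import math

def fit_semilog(costs, trips):
    """OLS fit of ln(trips) = a + b*cost; expect b < 0."""
    ys = [math.log(t) for t in trips]
    n = len(costs)
    mx, my = sum(costs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(costs, ys)) \
        / sum((x - mx) ** 2 for x in costs)
    a = my - b * mx
    return a, b

costs = [10, 20, 30, 40, 50]   # travel cost per trip ($), hypothetical visitors
trips = [12, 7, 4, 2.5, 1.5]   # observed trips per season
a, b = fit_semilog(costs, trips)
cs_per_trip = -1.0 / b         # consumer surplus per trip for this model
print(round(cs_per_trip, 1))
```

Aggregating such per-trip surplus estimates over visitation within each ecoregion is what yields the ecoregional recreation values the paper describes.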

  16. A Capacitance-Based Methodology for the Estimation of Piezoelectric Coefficients of Poled Piezoelectric Materials

    KAUST Repository

    Al Ahmad, Mahmoud; Alshareef, Husam N.

    2010-01-01

    A methodology is proposed to estimate the piezoelectric coefficients of bulk piezoelectric materials using simple capacitance measurements. The extracted values of d33 and d31 from the capacitance measurements were 506 pC/N and 247 p

  17. Development of extreme rainfall PRA methodology for sodium-cooled fast reactor

    International Nuclear Information System (INIS)

    Nishino, Hiroyuki; Kurisaka, Kenichi; Yamano, Hidemasa

    2016-01-01

    The objective of this study is to develop a probabilistic risk assessment (PRA) methodology for extreme rainfall, focusing on the decay heat removal system of a sodium-cooled fast reactor. For extreme rainfall, the annual exceedance probability as a function of hazard intensity was statistically estimated from meteorological data. To identify core damage sequences, event trees were developed by assuming scenarios in which structures, systems and components (SSCs) important to safety are flooded by rainwater entering the buildings through gaps in the doors, with the SSCs failing when the level of rainwater on the ground or on the roof of the building rises above the thresholds of the doors on the first floor or on the roof during the rainfall. To estimate the failure probability of the SSCs, the level of water rise was estimated from the difference between precipitation and drainage capacity. By combining the annual exceedance probability with the failure probability of the SSCs, the event trees led to quantification of the core damage frequency, and the PRA methodology for rainfall was thus developed. (author)
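
The hazard-curve step can be sketched by fitting a Gumbel distribution to annual maximum rainfall via the method of moments and evaluating the annual exceedance probability of a given depth. The choice of the Gumbel family and the data are illustrative assumptions:

```python
import math

def gumbel_exceedance(annual_maxima, depth):
    """Annual probability that `depth` is exceeded, method-of-moments Gumbel fit."""
    n = len(annual_maxima)
    mean = sum(annual_maxima) / n
    var = sum((x - mean) ** 2 for x in annual_maxima) / (n - 1)
    beta = math.sqrt(6 * var) / math.pi   # scale parameter
    mu = mean - 0.5772 * beta             # location parameter
    return 1.0 - math.exp(-math.exp(-(depth - mu) / beta))

# 20 years of hypothetical annual maximum daily rainfall (mm/day):
amax = [95, 110, 102, 130, 88, 121, 99, 140, 105, 93,
        118, 126, 101, 97, 135, 108, 92, 115, 123, 100]
p = gumbel_exceedance(amax, depth=180.0)
print(f"{p:.4f}")   # annual probability that 180 mm/day is exceeded
```

In the PRA, this exceedance probability is combined with the conditional SSC failure probability (rainwater level exceeding the door thresholds) to quantify the core damage frequency contribution of each event-tree sequence.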

  18. ESTIMATION OF THE TEMPERATURE RISE OF A MCU ACID STREAM PIPE IN NEAR PROXIMITY TO A SLUDGE STREAM PIPE

    International Nuclear Information System (INIS)

    Fondeur, F; Michael Poirier, M; Samuel Fink, S

    2007-01-01

    Effluent streams from the Modular Caustic-Side Solvent Extraction Unit (MCU) will transfer to the tank farms and to the Defense Waste Processing Facility (DWPF). These streams will contain entrained solvent. A significant portion of the Strip Effluent (SE) pipeline (i.e., the acid stream containing Isopar® L residues) lies within one inch of a sludge stream. Personnel envisioned that the sludge stream temperature may reach 100 C during operation. The nearby SE stream may receive heat from the sludge stream and reach temperatures that could lead to flammability issues once the contents of the SE stream discharge into a larger reservoir. To this end, personnel used correlations from the literature to estimate the maximum temperature rise the SE stream may experience if the nearby sludge stream reaches boiling temperature. Several calculation methods were used to determine the temperature rise of the SE stream. One method considered a steady-state heat balance equation that employed correlation functions to estimate the heat transfer rate. This method showed the maximum temperature of the acid (SE) stream may exceed 45 C when the nearby sludge stream is 80 C or higher. A second method used an effectiveness calculation to predict the heat transfer rate in a single-pass heat exchanger. By envisioning the acid and sludge pipes as a parallel-flow pipe-to-pipe heat exchanger, this method provides a conservative estimate of the maximum temperature rise. Assuming the contact area (i.e., the area over which the heat transfer occurs) is the whole pipe area, the results found by this method nearly matched those of the previous calculation method. It is recommended that the sludge stream be maintained below 80 C to prevent a flammable vapor hazard.
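
The effectiveness method named above can be sketched with the standard parallel-flow effectiveness-NTU relation, eps = (1 - exp(-NTU*(1 + Cr))) / (1 + Cr). The UA value, flow heat capacities and temperatures below are illustrative placeholders, not the report's:

```python
import math

def parallel_flow_outlet(T_hot_in, T_cold_in, C_hot, C_cold, UA):
    """Cold-side outlet temperature for a parallel-flow heat exchanger.

    C_hot, C_cold: capacity rates m_dot*cp [W/K]; UA: overall conductance [W/K].
    """
    C_min, C_max = min(C_hot, C_cold), max(C_hot, C_cold)
    Cr = C_min / C_max
    ntu = UA / C_min
    eps = (1 - math.exp(-ntu * (1 + Cr))) / (1 + Cr)
    q = eps * C_min * (T_hot_in - T_cold_in)   # heat transfer rate [W]
    return T_cold_in + q / C_cold              # cold (SE) outlet temperature

T_se = parallel_flow_outlet(T_hot_in=100.0, T_cold_in=25.0,
                            C_hot=400.0, C_cold=200.0, UA=150.0)
print(round(T_se, 1))   # SE stream outlet temperature (C)
```

Taking the full pipe surface as the contact area maximizes UA, which is why the report treats this estimate as conservative.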

  19. Methodological framework for World Health Organization estimates of the global burden of foodborne disease

    NARCIS (Netherlands)

    B. Devleesschauwer (Brecht); J.A. Haagsma (Juanita); F.J. Angulo (Frederick); D.C. Bellinger (David); D. Cole (Dana); D. Döpfer (Dörte); A. Fazil (Aamir); E.M. Fèvre (Eric); H.J. Gibb (Herman); T. Hald (Tine); M.D. Kirk (Martyn); R.J. Lake (Robin); C. Maertens De Noordhout (Charline); C. Mathers (Colin); S.A. McDonald (Scott); S.M. Pires (Sara); N. Speybroeck (Niko); M.K. Thomas (Kate); D. Torgerson; F. Wu (Felicia); A.H. Havelaar (Arie); N. Praet (Nicolas)

    2015-01-01

    Background: The Foodborne Disease Burden Epidemiology Reference Group (FERG) was established in 2007 by the World Health Organization to estimate the global burden of foodborne diseases (FBDs). This paper describes the methodological framework developed by FERG's Computational Task Force

  20. Single-point reactive power control method on voltage rise mitigation in residential networks with high PV penetration

    DEFF Research Database (Denmark)

    Hasheminamin, Maryam; Agelidis, Vassilios; Ahmadi, Abdollah

    2018-01-01

    Voltage rise (VR) due to reverse power flow is an important obstacle to high integration of photovoltaics (PV) into residential networks. This paper introduces and elaborates a novel index-based single-point reactive power control (SPRPC) methodology to mitigate voltage rise by absorbing adequate reactive power at one selected point. The proposed index utilizes short-circuit analysis to select the best point at which to apply this Volt/VAr control method. SPRPC is supported technically and financially by the distribution network operator, which makes it cost effective, simple and efficient in a system with a high r/x ratio. The efficacy, effectiveness and cost of SPRPC are compared with droop control to evaluate its advantages.
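
The voltage-rise mechanism and the reactive-power remedy can be sketched for a single radial feeder segment using the small-angle approximation dV ≈ (P·R + Q·X)/V. The per-unit impedances and PV export level below are illustrative:

```python
def voltage_rise(P, Q, R, X, V=1.0):
    """Approximate voltage rise across a feeder segment (per unit).

    P > 0 means power flowing back toward the substation (PV export);
    Q < 0 means reactive power absorbed at the connection point.
    """
    return (P * R + Q * X) / V

R, X = 0.05, 0.04          # high r/x ratio typical of LV residential feeders
P_export = 0.8             # reverse power flow from rooftop PV (pu)

dv_uncompensated = voltage_rise(P_export, 0.0, R, X)
Q_absorb = -P_export * R / X          # reactive power that cancels the rise
dv_compensated = voltage_rise(P_export, Q_absorb, R, X)
print(round(dv_uncompensated, 3), abs(round(dv_compensated, 3)))
```

Because R dominates X in residential feeders, the compensating Q is large relative to P, which is why the paper's index concentrates the absorption at the single most effective point rather than distributing it.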

  1. Improved best estimate plus uncertainty methodology including advanced validation concepts to license evolving nuclear reactors

    International Nuclear Information System (INIS)

    Unal, Cetin; Williams, Brian; McClure, Patrick; Nelson, Ralph A.

    2010-01-01

    Many evolving nuclear energy programs plan to use advanced predictive multi-scale multi-physics simulation and modeling capabilities to reduce cost and time from design through licensing. Historically, experiments were the primary tool for design and for understanding nuclear system behavior, while modeling and simulation played the subordinate role of supporting experiments. In the new era of multi-scale multi-physics computation-based technology development, experiments will still be needed, but they will be performed at different scales to calibrate and validate the models underlying predictive simulations. The cost-saving goals of these programs require that the number of validation experiments be minimized. Utilization of more multi-scale multi-physics models introduces complexities into the validation of predictive tools, and traditional methodologies will have to be modified to address these issues. This paper lays out the basic aspects of a methodology that can potentially be used to address these new challenges in the design and licensing of evolving nuclear technology programs. The main components of the proposed methodology are verification, validation, calibration, and uncertainty quantification. An enhanced calibration concept is introduced and is accomplished through data assimilation. The goal is to enable best-estimate prediction of system behavior in both normal and safety-related environments. Achieving this goal requires the additional steps of estimating the domain of validation and quantifying the uncertainties that allow extension of results to areas of the validation domain not directly tested with experiments, which might include extension of the modeling and simulation (M&S) capabilities for application to full-scale systems. The new methodology suggests a formalism to quantify an adequate level of validation (predictive maturity) with respect to required selective data so that required testing can be minimized for

  2. Improved best estimate plus uncertainty methodology including advanced validation concepts to license evolving nuclear reactors

    Energy Technology Data Exchange (ETDEWEB)

    Unal, Cetin [Los Alamos National Laboratory; Williams, Brian [Los Alamos National Laboratory; Mc Clure, Patrick [Los Alamos National Laboratory; Nelson, Ralph A [IDAHO NATIONAL LAB

    2010-01-01

    Many evolving nuclear energy programs plan to use advanced predictive multi-scale multi-physics simulation and modeling capabilities to reduce cost and time from design through licensing. Historically, experiments were the primary tool for design and for understanding nuclear system behavior, while modeling and simulation played the subordinate role of supporting experiments. In the new era of multi-scale multi-physics computation-based technology development, experiments will still be needed, but they will be performed at different scales to calibrate and validate the models underlying predictive simulations. The cost-saving goals of these programs require that the number of validation experiments be minimized. Utilization of more multi-scale multi-physics models introduces complexities into the validation of predictive tools, and traditional methodologies will have to be modified to address these issues. This paper lays out the basic aspects of a methodology that can potentially be used to address these new challenges in the design and licensing of evolving nuclear technology programs. The main components of the proposed methodology are verification, validation, calibration, and uncertainty quantification. An enhanced calibration concept is introduced and is accomplished through data assimilation. The goal is to enable best-estimate prediction of system behavior in both normal and safety-related environments. Achieving this goal requires the additional steps of estimating the domain of validation and quantifying the uncertainties that allow extension of results to areas of the validation domain not directly tested with experiments, which might include extension of the modeling and simulation (M&S) capabilities for application to full-scale systems.
The new methodology suggests a formalism to quantify an adequate level of validation (predictive maturity) with respect to required selective data so that required testing can be minimized for cost

  3. Methodologies for Quantitative Systems Pharmacology (QSP) Models: Design and Estimation.

    Science.gov (United States)

    Ribba, B; Grimm, H P; Agoram, B; Davies, M R; Gadkar, K; Niederer, S; van Riel, N; Timmis, J; van der Graaf, P H

    2017-08-01

    With the increased interest in the application of quantitative systems pharmacology (QSP) models within medicine research and development, there is an increasing need to formalize model development and verification aspects. In February 2016, a workshop was held at Roche Pharma Research and Early Development to focus discussions on two critical methodological aspects of QSP model development: optimal structural granularity and parameter estimation. We here report in a perspective article a summary of presentations and discussions. © 2017 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.

  4. Estimation of retired mobile phones generation in China: A comparative study on methodology

    International Nuclear Information System (INIS)

    Li, Bo; Yang, Jianxin; Lu, Bin; Song, Xiaolong

    2015-01-01

    Highlights: • The sales data for mobile phones in China were revised by considering the number of smuggled and counterfeit mobile phones. • The estimation of retired mobile phones in China was made by comparing several relevant methods. • The improved estimate can help improve policy-making. • The method suggested in this paper can also be used in other countries. • Some discussions on methodology are also conducted for the sake of improvement. - Abstract: Due to the rapid development of its economy and technology, China has the largest production and ownership of mobile phones in the world. In general, mobile phones have relatively short lifetimes because the majority of users replace their mobile phones frequently. Retired mobile phones represent the most valuable electrical and electronic equipment (EEE) in the main waste stream because of such characteristics as large quantity, high reuse/recovery value and fast replacement frequency. Consequently, the huge amount of retired mobile phones in China calls for a sustainable management system. Generation estimation can provide fundamental information for constructing a sustainable management system for retired mobile phones and other waste electrical and electronic equipment (WEEE). However, reliable estimation results are difficult to obtain and verify. The primary aim of this paper is to provide a proper estimation approach for the generation of retired mobile phones in China, by comparing several relevant methods. The results show that the "sales and new" method has the highest priority for estimating retired mobile phones. This method shows that 47.92 million mobile phones were retired in 2002, rising to 739.98 million in China in 2012, a clearly increasing tendency with some fluctuations. Furthermore, some discussions on methodology, such as the selection of an improper approach and errors in the input data, are also conducted in order to
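
The "sales and new" estimation idea can be sketched as a convolution of past sales with a lifespan distribution: units retired in year t are past sales weighted by the probability of retirement at each age. The sales figures and lifespan distribution below are illustrative, not the paper's data:

```python
def retired_in_year(sales_by_year, lifespan_dist, year):
    """sales_by_year: {year: units sold}; lifespan_dist: {age_yr: probability}."""
    return sum(sales_by_year.get(year - age, 0) * p
               for age, p in lifespan_dist.items())

# Hypothetical annual sales (units) and a lifespan distribution in which
# most phones retire after 2-3 years of use.
sales = {2008: 150e6, 2009: 180e6, 2010: 220e6, 2011: 250e6}
lifespan = {1: 0.10, 2: 0.40, 3: 0.35, 4: 0.15}

print(f"{retired_in_year(sales, lifespan, 2012) / 1e6:.1f} million")  # 198.5 million
```

The paper's revision of the sales input (adding smuggled and counterfeit handsets) matters precisely because every error in sales propagates directly through this weighted sum.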

  5. Estimation of retired mobile phones generation in China: A comparative study on methodology

    Energy Technology Data Exchange (ETDEWEB)

    Li, Bo [State Key Laboratory of Urban and Regional Ecology, Research Center for Eco-Environmental Sciences, Chinese Academy of Sciences, Shuangqing Road 18, Haidian District, Beijing 100085 (China); Yang, Jianxin, E-mail: yangjx@rcees.ac.cn [State Key Laboratory of Urban and Regional Ecology, Research Center for Eco-Environmental Sciences, Chinese Academy of Sciences, Shuangqing Road 18, Haidian District, Beijing 100085 (China); Lu, Bin [State Key Laboratory of Urban and Regional Ecology, Research Center for Eco-Environmental Sciences, Chinese Academy of Sciences, Shuangqing Road 18, Haidian District, Beijing 100085 (China); Song, Xiaolong [Shanghai Cooperative Centre for WEEE Recycling, Shanghai Second Polytechnic University, Jinhai Road 2360, Pudong District, Shanghai 201209 (China)

    2015-01-15

    Highlights: • The sales data for mobile phones in China were revised by considering the number of smuggled and counterfeit mobile phones. • The estimation of retired mobile phones in China was made by comparing several relevant methods. • The improved estimate can help improve policy-making. • The method suggested in this paper can also be used in other countries. • Some discussions on methodology are also conducted for the sake of improvement. - Abstract: Due to the rapid development of its economy and technology, China has the largest production and ownership of mobile phones in the world. In general, mobile phones have relatively short lifetimes because the majority of users replace their mobile phones frequently. Retired mobile phones represent the most valuable electrical and electronic equipment (EEE) in the main waste stream because of such characteristics as large quantity, high reuse/recovery value and fast replacement frequency. Consequently, the huge amount of retired mobile phones in China calls for a sustainable management system. Generation estimation can provide fundamental information for constructing a sustainable management system for retired mobile phones and other waste electrical and electronic equipment (WEEE). However, reliable estimation results are difficult to obtain and verify. The primary aim of this paper is to provide a proper estimation approach for the generation of retired mobile phones in China, by comparing several relevant methods. The results show that the "sales and new" method has the highest priority for estimating retired mobile phones. This method shows that 47.92 million mobile phones were retired in 2002, rising to 739.98 million in China in 2012, a clearly increasing tendency with some fluctuations. Furthermore, some discussions on methodology, such as the selection of an improper approach and errors in the input data, are also conducted in order to

  6. Methodologies for estimating one-time hazardous waste generation for capacity assurance planning

    International Nuclear Information System (INIS)

    Tonn, B.; Hwang, Ho-Ling; Elliot, S.; Peretz, J.; Bohm, R.; Hendrucko, B.

    1994-04-01

    This report contains descriptions of methodologies to be used to estimate the one-time generation of hazardous waste associated with five different types of remediation programs: Superfund sites, RCRA Corrective Actions, Federal Facilities, Underground Storage Tanks, and State and Private Programs. Estimates of the amount of hazardous waste generated from these sources to be shipped off-site to commercial hazardous waste treatment and disposal facilities will be made on a state-by-state basis for the years 1993, 1999, and 2013. In most cases, estimates will also be made for the intervening years

  7. Automated methodology for estimating waste streams generated from decommissioning contaminated facilities

    International Nuclear Information System (INIS)

    Toth, J.J.; King, D.A.; Humphreys, K.K.; Haffner, D.R.

    1994-01-01

    As part of the DOE Programmatic Environmental Impact Statement (PEIS), a viable way to determine aggregate waste volumes, cost, and direct labor hours for decommissioning and decontaminating facilities is required. In this paper, a methodology is provided for determining waste streams, cost and direct labor hours from remediation of contaminated facilities. The method is developed utilizing U.S. facility remediation data and information from several decommissioning programs, including reactor decommissioning projects. The method provides for rapid, consistent analysis for many facility types. Three remediation scenarios are considered for facility D&D: unrestricted land use, semi-restricted land use, and restricted land use. Unrestricted land use involves removing radioactive components, decontaminating the building surfaces, and demolishing the remaining structure. Semi-restricted land use involves removing transuranic contamination and immobilizing the contamination on-site. Restricted land use involves removing the transuranic contamination and leaving the building standing. In both semi-restricted and restricted land use scenarios, verification of containment with environmental monitoring is required. To use the methodology, facilities are placed in a building category depending upon the level of contamination, construction design, and function of the building. Unit volume and unit area waste generation factors are used to calculate waste volumes and estimate the amount of waste generated in each of the following classifications: low-level, transuranic, and hazardous waste. Unit factors for cost and labor hours are also applied to the result to estimate D&D cost and labor hours
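
The unit-factor approach can be sketched as a lookup of per-area generation factors by waste class and remediation scenario; all factor values below are illustrative placeholders, not the report's:

```python
# m^3 of waste generated per m^2 of floor area, by waste class and scenario
# (hypothetical values for illustration only).
UNIT_FACTORS = {
    "low_level":   {"unrestricted": 0.50, "semi_restricted": 0.30, "restricted": 0.10},
    "transuranic": {"unrestricted": 0.05, "semi_restricted": 0.05, "restricted": 0.05},
    "hazardous":   {"unrestricted": 0.02, "semi_restricted": 0.01, "restricted": 0.005},
}

def waste_volumes(floor_area_m2, scenario):
    """Estimated waste volume (m^3) per classification for one facility."""
    return {cls: floor_area_m2 * factors[scenario]
            for cls, factors in UNIT_FACTORS.items()}

vols = waste_volumes(10_000, "unrestricted")
print(vols["low_level"], vols["transuranic"], vols["hazardous"])
```

Cost and direct labor hours follow the same pattern with their own unit factors, which is what makes the method fast and consistent across many facility types.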

  8. Analysis of coastal protection under rising flood risk

    Directory of Open Access Journals (Sweden)

    Megan J. Lickley

    2014-01-01

    Full Text Available. Infrastructure located along the U.S. Atlantic and Gulf coasts is exposed to rising risk of flooding from sea level rise, increasing storm surge, and subsidence. In these circumstances, coastal management commonly based on 100-year flood maps assuming current climatology is no longer adequate. A dynamic programming cost–benefit analysis is applied to the adaptation decision, illustrated by application to an energy facility in Galveston Bay. Projections from several global climate models provide inputs to estimates of the change in hurricane and storm surge activity as well as the increase in sea level. The projected rise in physical flood risk is combined with estimates of flood damage and protection costs in an analysis of the multi-period nature of the adaptation choice. The result is a planning method, using dynamic programming, that is appropriate for investment and abandonment decisions under rising coastal risk.
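
The dynamic-programming structure of the protect-or-wait decision can be sketched as follows: at each period the planner chooses the cheaper of building protection now or bearing this period's expected flood damage and deciding again next period. Probabilities, costs, growth and discount rates are illustrative, not the paper's:

```python
def min_expected_cost(t, flood_prob, protect_cost, damage,
                      horizon, growth, discount):
    """Minimum expected discounted cost from period t onward (unprotected)."""
    if t == horizon:
        return 0.0
    p = min(1.0, flood_prob * (1 + growth) ** t)   # flood risk rises over time
    wait = p * damage + discount * min_expected_cost(
        t + 1, flood_prob, protect_cost, damage, horizon, growth, discount)
    return min(protect_cost, wait)                 # protect now vs. wait

cost = min_expected_cost(0, flood_prob=0.002, protect_cost=50.0, damage=500.0,
                         horizon=30, growth=0.10, discount=0.97)
print(round(cost, 1))
```

With these inputs waiting is optimal early on (expected damage is small) while protection becomes worthwhile as the annual flood probability grows, so the optimal cost falls slightly below the cost of protecting immediately.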

  9. Methodology to estimate particulate matter emissions from certified commercial aircraft engines.

    Science.gov (United States)

    Wayson, Roger L; Fleming, Gregg G; Lovinelli, Ralph

    2009-01-01

    Today, about one-fourth of U.S. commercial service airports, including 41 of the busiest 50, are either in nonattainment or maintenance areas per the National Ambient Air Quality Standards. U.S. aviation activity is forecast to triple by 2025, while at the same time, the U.S. Environmental Protection Agency (EPA) is evaluating stricter particulate matter (PM) standards on the basis of documented human health and welfare impacts. Stricter federal standards are expected to impede capacity and limit aviation growth if regulatory-mandated emission reductions occur as for other non-aviation sources (i.e., automobiles, power plants, etc.). In addition, strong interest exists as to the role aviation emissions play in air quality and climate change issues. These reasons underpin the need to quantify and understand PM emissions from certified commercial aircraft engines, which has led to the need for a methodology to predict these emissions. Standardized sampling techniques to measure volatile and nonvolatile PM emissions from aircraft engines do not exist. As such, a first-order approximation (FOA) was derived to fill this need based on available information. FOA1.0 only allowed prediction of nonvolatile PM. FOA2.0 was a change to include volatile PM emissions on the basis of the ratio of nonvolatile to volatile emissions. Recent collaborative efforts by industry (manufacturers and airlines), research establishments, and regulators have begun to provide further insight into the estimation of the PM emissions. The resultant PM measurement datasets are being analyzed to refine sampling techniques and progress towards standardized PM measurements. These preliminary measurement datasets also support the continued refinement of the FOA methodology. FOA3.0 disaggregated the prediction techniques to allow for independent prediction of nonvolatile and volatile emissions on a more theoretical basis. The Committee for Aviation Environmental Protection of the International Civil

  10. Sea level rise and the geoid: factor analysis approach

    Directory of Open Access Journals (Sweden)

    Alexey Sadovski

    2013-08-01

    Full Text Available. Sea levels are rising around the world, and this is a particular concern along most of the coasts of the United States. A 1989 EPA report shows that sea levels rose 5-6 inches more than the global average along the Mid-Atlantic and Gulf Coasts in the last century. The main reason for this is coastal land subsidence; this sea level rise is therefore considered relative sea level rise rather than global sea level rise. Thus, instead of studying sea level rise globally, this paper describes a statistical approach using factor analysis of regional sea level rates of change. Unlike physical models and semi-empirical models that attempt to estimate how much and how fast sea levels are changing, this methodology allows for a discussion of the factor(s) that statistically affect sea level rates of change, and seeks patterns to explain spatial correlations.
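
The factor-analytic idea can be sketched by extracting the leading factor of the correlation matrix of rate-of-change series at several gauges, here via power iteration on a small hand-made matrix (the gauges and correlations are illustrative, not the paper's data):

```python
def leading_eigenvector(corr, iters=200):
    """Leading eigenvalue and eigenvector of a symmetric matrix (power iteration)."""
    n = len(corr)
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(corr[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    lam = sum(v[i] * sum(corr[i][j] * v[j] for j in range(n)) for i in range(n))
    return lam, v

# Correlation of sea level rate-of-change series at three hypothetical gauges:
corr = [[1.0, 0.8, 0.7],
        [0.8, 1.0, 0.6],
        [0.7, 0.6, 1.0]]
lam, loadings = leading_eigenvector(corr)
explained = lam / 3.0   # share of total variance carried by the first factor
print(round(explained, 2))
```

A single factor explaining most of the variance, with similar loadings at all gauges, is the kind of spatial pattern the paper looks for; gauges with distinctive loadings point to localized drivers such as subsidence.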

  11. Integrated cost estimation methodology to support high-performance building design

    Energy Technology Data Exchange (ETDEWEB)

    Vaidya, Prasad; Greden, Lara; Eijadi, David; McDougall, Tom [The Weidt Group, Minnetonka (United States); Cole, Ray [Axiom Engineers, Monterey (United States)

    2007-07-01

    Design teams evaluating the performance of energy conservation measures (ECMs) calculate energy savings rigorously with established modelling protocols, accounting for the interaction between various measures. However, incremental cost calculations do not have a similar rigor. Often there is no recognition of cost reductions with integrated design, nor is there assessment of cost interactions amongst measures. This lack of rigor feeds the notion that high-performance buildings cost more, creating a barrier for design teams pursuing aggressive high-performance outcomes. This study proposes an alternative integrated methodology to arrive at a lower perceived incremental cost for improved energy performance. The methodology is based on the use of energy simulations as a means towards integrated design and cost estimation. Various points along the spectrum of integration are identified and characterized by the amount of design effort invested, the scheduling of effort, and the relative energy performance of the resultant design. It includes a study of the interactions between building system parameters as they relate to capital costs. Several cost interactions amongst energy measures are found to be significant. The value of this approach is demonstrated with alternatives in a case study that shows the differences between perceived costs for energy measures along various points on the integration spectrum. These alternatives show design tradeoffs and identify how decisions would have been different with a standard costing approach. Areas of further research to make the methodology more robust are identified. Policy measures to encourage the integrated approach and reduce the barriers towards improved energy performance are discussed.

  12. Methodology for estimation of secondary meteorological variables to be used in local dispersion of air pollutants

    International Nuclear Information System (INIS)

    Turtos, L.; Sanchez, M.; Roque, A.; Soltura, R.

    2003-01-01

    This paper presents the main work carried out within the project Atmospheric environmental externalities of the electricity generation in Cuba, which aims to develop methodologies and corresponding software to improve the quality of the secondary meteorological data used in atmospheric pollutant dispersion calculations; specifically, the wind profile coefficients, urban and rural mixing heights, and temperature gradients.

  13. A methodology for estimating health benefits of electricity generation using renewable technologies.

    Science.gov (United States)

    Partridge, Ian; Gamkhar, Shama

    2012-02-01

    At Copenhagen, the developed countries agreed to provide up to $100 bn per year to finance climate change mitigation and adaptation by developing countries. Projects aimed at cutting greenhouse gas (GHG) emissions will need to be evaluated against dual criteria: from the viewpoint of the developed countries they must cut emissions of GHGs at reasonable cost, while host countries will assess their contribution to development, or simply their overall economic benefits. Co-benefits of some types of project will also be of interest to host countries: for example, some projects will contribute to reducing air pollution, thus improving the health of the local population. This paper uses a simple damage function methodology to quantify some of the health co-benefits of replacing coal-fired generation with wind or small hydro in China. We estimate the monetary value of these co-benefits and find that it is probably small compared to the added costs. We have not made a full cost-benefit analysis of renewable energy in China, as some likely co-benefits are omitted from our calculations. Our results are subject to considerable uncertainty; however, after careful consideration of their likely accuracy and comparisons with other studies, we believe that they provide a good first-cut estimate of co-benefits and are sufficiently robust to stand as a guide for policy makers. In addition to these empirical results, a key contribution made by the paper is to demonstrate a simple and reasonably accurate methodology for health benefits estimation that applies the most recent academic research in the field to the solution of an increasingly important problem.
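
    A damage-function calculation of the kind described above chains avoided emissions to a concentration change, a concentration-response function, and a monetary value. The sketch below uses a log-linear concentration-response form; every number in it (PM2.5 reduction, population, baseline mortality, CRF slope, statistical-life value) is an invented placeholder, not a figure from the paper.

```python
import math

def avoided_deaths(delta_pm25, population, baseline_mortality, beta=0.004):
    """Premature deaths avoided via a log-linear concentration-response function.

    delta_pm25         -- reduction in annual-mean PM2.5 (ug/m3)
    population         -- exposed population
    baseline_mortality -- baseline deaths per person per year
    beta               -- CRF slope (fractional mortality change per ug/m3)
    """
    relative_risk = math.exp(beta * delta_pm25)
    attributable_fraction = 1.0 - 1.0 / relative_risk
    return population * baseline_mortality * attributable_fraction

# Illustrative inputs: a 1 ug/m3 PM2.5 reduction over 10 million people
deaths = avoided_deaths(delta_pm25=1.0, population=10_000_000,
                        baseline_mortality=0.007)
value = deaths * 250_000  # hypothetical value of a statistical life (USD)
print(f"avoided deaths: {deaths:.0f}, monetized benefit: ${value:,.0f}")
```

    Comparing `value` against the incremental cost of the renewable capacity is the comparison the abstract reports as "probably small compared to the added costs".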

  14. Estimating the Greenland ice sheet surface mass balance contribution to future sea level rise using the regional atmospheric climate model MAR

    NARCIS (Netherlands)

    Fettweis, X.; Franco, B.; Tedesco, M.; van Angelen, J.H.; Lenaerts, J.T.M.; van den Broeke, M.R.; Gallée, H.

    2013-01-01

    To estimate the sea level rise (SLR) originating from changes in surface mass balance (SMB) of the Greenland ice sheet (GrIS), we present 21st century climate projections obtained with the regional climate model MAR (Modèle Atmosphérique Régional), forced by output of three CMIP5 (Coupled Model
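
    Converting an SMB anomaly into a sea-level contribution, as this record does, typically uses the standard equivalence of roughly 362.5 Gt of ice mass per millimetre of global mean sea level. That conversion factor is a common community value assumed here, not a number taken from the abstract.

```python
GT_PER_MM_SLR = 362.5  # Gt of ice/water per mm of global mean sea level (assumed)

def smb_anomaly_to_slr_mm(cumulative_anomaly_gt):
    """Cumulative SMB anomaly (Gt; negative = mass loss) -> sea-level rise (mm)."""
    return -cumulative_anomaly_gt / GT_PER_MM_SLR

# e.g. a hypothetical -3625 Gt cumulative anomaly over the 21st century
print(smb_anomaly_to_slr_mm(-3625.0))
```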

  15. Dasymetric high resolution population distribution estimates for improved decision making, with a case study of sea-level rise vulnerability in Boca Raton, Florida

    Science.gov (United States)

    Ziegler, Hannes Moritz

    Planners and managers often rely on coarse population distribution data from the census for addressing various social, economic, and environmental problems. In the analysis of physical vulnerabilities to sea-level rise, census units such as blocks or block groups are coarse relative to the required decision-making application. This study explores the benefits of integrating image classification and dasymetric mapping at the household level to provide detailed small-area population estimates at the scale of residential buildings. In a case study of Boca Raton, FL, a sea-level rise inundation grid based on mapping methods by NOAA is overlaid on the highly detailed population distribution data to identify vulnerable residences and estimate population displacement. The enhanced spatial detail offered through this method has the potential to better guide targeted strategies for future development, mitigation, and adaptation efforts.
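
    The core dasymetric step can be sketched as: redistribute a census block's population to its residential buildings in proportion to building size, then sum the population of buildings falling inside the inundation zone. The building areas and flood flags below are hypothetical, and real workflows would weight by classified land use rather than floor area alone.

```python
def dasymetric_allocation(block_population, floor_areas):
    """Split a block's population across buildings proportionally to floor area."""
    total = sum(floor_areas)
    return [block_population * a / total for a in floor_areas]

def displaced_population(populations, inundated_flags):
    """Sum the population of buildings flagged as inundated."""
    return sum(p for p, flooded in zip(populations, inundated_flags) if flooded)

# Five residential buildings in one census block of 120 people (hypothetical)
areas = [100.0, 250.0, 150.0, 300.0, 200.0]   # floor areas, m2
pops = dasymetric_allocation(120, areas)
flooded = [True, False, True, False, False]    # from an inundation grid overlay
print(displaced_population(pops, flooded))
```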

  16. Model methodology for estimating pesticide concentration extremes based on sparse monitoring data

    Science.gov (United States)

    Vecchia, Aldo V.

    2018-03-22

    This report describes a new methodology for using sparse (weekly or less frequent observations) and potentially highly censored pesticide monitoring data to simulate daily pesticide concentrations and associated quantities used for acute and chronic exposure assessments, such as the annual maximum daily concentration. The new methodology is based on a statistical model that expresses log-transformed daily pesticide concentration in terms of a seasonal wave, flow-related variability, long-term trend, and serially correlated errors. Methods are described for estimating the model parameters, generating conditional simulations of daily pesticide concentration given sparse (weekly or less frequent) and potentially highly censored observations, and estimating concentration extremes based on the conditional simulations. The model can be applied to datasets with as few as 3 years of record, as few as 30 total observations, and as few as 10 uncensored observations. The model was applied to atrazine, carbaryl, chlorpyrifos, and fipronil data for U.S. Geological Survey pesticide sampling sites with sufficient data for applying the model. A total of 112 sites were analyzed for atrazine, 38 for carbaryl, 34 for chlorpyrifos, and 33 for fipronil. The results are summarized in this report, and R functions, described in the report and provided in an accompanying model archive, can be used to fit the model parameters and generate conditional simulations of daily concentrations for use in investigations involving pesticide exposure risk and uncertainty.
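
    The model structure described above (log concentration = seasonal wave + long-term trend + serially correlated errors) can be sketched in a few lines; the report itself supplies R functions and censored-data fitting methods not reproduced here. All parameter values below are invented, and the flow-related component is omitted for brevity.

```python
import math
import random

def simulate_daily_log_conc(n_days, mean=-2.0, amp=1.0, trend_per_yr=-0.02,
                            phi=0.9, sigma=0.3, seed=42):
    """Simulate daily log-concentration: seasonal wave + trend + AR(1) errors."""
    rng = random.Random(seed)
    eps, series = 0.0, []
    for t in range(n_days):
        seasonal = amp * math.sin(2 * math.pi * t / 365.25)
        trend = trend_per_yr * t / 365.25
        eps = phi * eps + rng.gauss(0.0, sigma)   # serially correlated error
        series.append(mean + seasonal + trend + eps)
    return series

# Annual maximum daily concentration from one simulated year,
# the kind of extreme used in acute exposure assessment
logs = simulate_daily_log_conc(365)
annual_max = max(math.exp(x) for x in logs)
print(f"annual max daily concentration: {annual_max:.3f}")
```

    Repeating the simulation conditionally on the sparse observations, many times over, yields the distribution of the annual maximum rather than a single value.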

  17. Probabilistic reanalysis of twentieth-century sea-level rise.

    Science.gov (United States)

    Hay, Carling C; Morrow, Eric; Kopp, Robert E; Mitrovica, Jerry X

    2015-01-22

    Estimating and accounting for twentieth-century global mean sea level (GMSL) rise is critical to characterizing current and future human-induced sea-level change. Several previous analyses of tide gauge records--employing different methods to accommodate the spatial sparsity and temporal incompleteness of the data and to constrain the geometry of long-term sea-level change--have concluded that GMSL rose over the twentieth century at a mean rate of 1.6 to 1.9 millimetres per year. Efforts to account for this rate by summing estimates of individual contributions from glacier and ice-sheet mass loss, ocean thermal expansion, and changes in land water storage fall significantly short in the period before 1990. The failure to close the budget of GMSL during this period has led to suggestions that several contributions may have been systematically underestimated. However, the extent to which the limitations of tide gauge analyses have affected estimates of the GMSL rate of change is unclear. Here we revisit estimates of twentieth-century GMSL rise using probabilistic techniques and find a rate of GMSL rise from 1901 to 1990 of 1.2 ± 0.2 millimetres per year (90% confidence interval). Based on individual contributions tabulated in the Fifth Assessment Report of the Intergovernmental Panel on Climate Change, this estimate closes the twentieth-century sea-level budget. Our analysis, which combines tide gauge records with physics-based and model-derived geometries of the various contributing signals, also indicates that GMSL rose at a rate of 3.0 ± 0.7 millimetres per year between 1993 and 2010, consistent with prior estimates from tide gauge records. The increase in rate relative to the 1901-90 trend is accordingly larger than previously thought; this revision may affect some projections of future sea-level rise.

  18. Comparison of methodologies estimating emissions of aircraft pollutants, environmental impact assessment around airports

    International Nuclear Information System (INIS)

    Kurniawan, Jermanto S.; Khardi, S.

    2011-01-01

    Air transport activity has grown continuously over the years, accompanied by an increase in the amount of energy used to provide air transportation services and, by extension, in environmental impacts, in particular pollutant emissions. Traditionally, the environmental impacts of atmospheric emissions from aircraft have been addressed in two separate ways: aircraft pollutant emissions occurring during the landing and take-off (LTO) phase (local pollutant emissions), which is the focus of this study, and the non-LTO phase (global/regional pollutant emissions). Aircraft pollutant emissions are an important source of pollution and directly or indirectly harm human health, ecosystems and cultural heritage. Many methods are used by various countries to assess pollutant emissions. However, using different and separate methodologies causes variation in results and gaps in information, and the use of certain methods requires justification whose reliability must be demonstrated and proven. In relation to this issue, this paper identifies, compares and reviews some of the methodologies for aircraft pollutant assessment from past, present and planned studies and projects, focusing on emission factors, fuel consumption, and uncertainty. The paper also provides reliable information on the impacts of aircraft pollutant emissions in short-term and long-term predictions.
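
    The bookkeeping that LTO emission methodologies share can be sketched as: per-mode emissions = time-in-mode × fuel flow × emission index × number of engines. The times-in-mode below follow the standard ICAO LTO cycle (taxi/idle 26 min, take-off 0.7 min, climb-out 2.2 min, approach 4.0 min); the fuel flows and NOx emission indices are invented placeholders, not certification data for any real engine.

```python
# mode: (time-in-mode in s, fuel flow kg/s, NOx emission index g/kg)
# Times follow the ICAO LTO cycle; fuel flows and EIs are hypothetical.
LTO_MODES = {
    "taxi/idle": (26 * 60, 0.10, 4.0),
    "take-off":  (42,      1.00, 30.0),
    "climb-out": (132,     0.85, 25.0),
    "approach":  (240,     0.30, 10.0),
}

def lto_nox_grams(modes, n_engines=2):
    """Total NOx (g) for one LTO cycle of an n_engines aircraft."""
    return sum(t * ff * ei * n_engines for t, ff, ei in modes.values())

total_kg = lto_nox_grams(LTO_MODES) / 1000.0
print(f"NOx per LTO cycle: {total_kg:.2f} kg")
```

    Differences between methodologies largely come down to where these inputs come from: measured versus certification fuel flows, and generic versus engine-specific emission indices.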

  19. Estimating the potential impacts of a nuclear reactor accident: methodology and case studies

    International Nuclear Information System (INIS)

    Cartwright, J.V.; Beemiller, R.M.; Trott, E.A. Jr.; Younger, J.M.

    1982-04-01

    This monograph describes an industrial impact model that can be used to estimate the regional industry-specific impacts of disasters. Special attention is given to the impacts of possible nuclear reactor accidents. The monograph also presents three applications of the model. The impacts estimated in the case studies are based on (1) general information and reactor-specific data, supplied by the US Nuclear Regulatory Commission (NRC), (2) regional economic models derived from the Regional Input-Output Modeling System (RIMS II) developed at the Bureau of Economic Analysis (BEA), and (3) additional methodology developed especially for taking into account the unique characteristics of a nuclear reactor accident with respect to regional industrial activity.

  20. Algorithm for evaluating the effectiveness of a high-rise development project based on current yield

    Science.gov (United States)

    Soboleva, Elena

    2018-03-01

    The article addresses the operational evaluation of development project efficiency in high-rise construction under the current economic conditions in Russia. The author considers the following issues: problems of implementing development projects; the influence of the quality of operational evaluation on the overall efficiency of high-rise construction projects; the influence of the project's external environment on project performance under crisis conditions; and the quality of project management. The article proposes an algorithm and a methodological approach to managing developer project efficiency based on operational evaluation of current yield. The methodology for calculating the current efficiency of a high-rise construction development project has been updated.

  1. Wind turbine power coefficient estimation by soft computing methodologies: Comparative study

    International Nuclear Information System (INIS)

    Shamshirband, Shahaboddin; Petković, Dalibor; Saboohi, Hadi; Anuar, Nor Badrul; Inayat, Irum; Akib, Shatirah; Ćojbašić, Žarko; Nikolić, Vlastimir; Mat Kiah, Miss Laiha; Gani, Abdullah

    2014-01-01

    Highlights: • Variable speed operation of wind turbine to increase power generation. • Changeability and fluctuation of wind has to be accounted for. • To build an effective prediction model of wind turbine power coefficient. • The impact of the variation in the blade pitch angle and tip speed ratio. • Support vector regression methodology application as predictive methodology. - Abstract: Wind energy has become a strong competitor to traditional fossil fuel energy, particularly with the successful operation of multi-megawatt sized wind turbines. However, reasonable wind speed is not adequately sustainable everywhere to build an economical wind farm. In wind energy conversion systems, one of the operational problems is the changeability and fluctuation of wind. In most cases, wind speed can vacillate rapidly. Hence, the quality of produced energy becomes an important problem in wind energy conversion plants. Several control techniques have been applied to improve the quality of power generated from wind turbines. In this study, the polynomial and radial basis function (RBF) kernels are applied as the kernel function of support vector regression (SVR) to estimate the optimal power coefficient value of the wind turbines. Instead of minimizing the observed training error, SVR_poly and SVR_rbf attempt to minimize the generalization error bound so as to achieve generalized performance. The experimental results show that an improvement in predictive accuracy and capability of generalization can be achieved by the SVR approach compared with other soft computing methodologies.
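
    The quantity being predicted above, the power coefficient Cp as a function of tip speed ratio (lambda) and blade pitch angle (beta), is often illustrated with an analytic approximation of the following form. The coefficients are one commonly used textbook set, assumed here for illustration; the paper itself fits SVR models to data rather than using this formula.

```python
import math

def power_coefficient(tsr, pitch_deg):
    """Approximate Cp(lambda, beta) for a variable-speed wind turbine
    (common textbook coefficient set; illustrative only)."""
    inv_li = 1.0 / (tsr + 0.08 * pitch_deg) - 0.035 / (pitch_deg**3 + 1.0)
    return (0.5176 * (116.0 * inv_li - 0.4 * pitch_deg - 5.0)
            * math.exp(-21.0 * inv_li) + 0.0068 * tsr)

# Sweep tip speed ratio at zero pitch to locate the optimal operating point,
# which is what a variable-speed controller (or the SVR model) tracks
best = max((power_coefficient(t / 10.0, 0.0), t / 10.0) for t in range(20, 140))
print(f"max Cp ~= {best[0]:.3f} at lambda ~= {best[1]:.1f}")
```

    A surface of (tsr, pitch, Cp) samples generated this way is also a convenient synthetic training set for experimenting with regression models of the kind compared in the paper.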

  2. Towards a unified estimate of arctic glaciers contribution to sea level rise since 1972.

    Science.gov (United States)

    Dehecq, A.; Gardner, A. S.; Alexandrov, O.; McMichael, S.

    2017-12-01

    Glacier retreat contributed about one third of the observed sea level rise since 1971 (IPCC). However, long-term estimates of glacier volume changes rely on sparse field observations, and region-wide satellite observations are available mostly after 2000. The recently declassified images from the reconnaissance satellite series Hexagon (KH9), which acquired 6 m resolution stereoscopic images from 1971 to 1986, open new possibilities for glacier observation. But the film-printed images represent a processing challenge. Here we present an automatic workflow developed to generate Digital Elevation Models (DEMs) at 24 m resolution from the raw scanned KH9 images. It includes a preprocessing step to detect fiducial marks and to correct distortions of the film caused by the 40-year storage. An estimate of the unknown satellite position is obtained from a crude geolocation of the images. Each stereo image pair/triplet is then processed using the NASA Ames Stereo Pipeline to derive an unscaled DEM using standard photogrammetric techniques. This DEM is finally aligned to a reference topography to account for errors in translation, rotation and scaling. In the second part, we present DEMs generated over glaciers in the Canadian Arctic and analyze glacier volume changes from 1970 to the more recent WorldView ArcticDEM.

  3. Estimation of undernutrition and mean calorie intake in Africa: methodology, findings and implications.

    Science.gov (United States)

    van Wesenbeeck, Cornelia F A; Keyzer, Michiel A; Nubé, Maarten

    2009-06-27

    As poverty and hunger are basic yardsticks of underdevelopment and destitution, the need for reliable statistics in this domain is self-evident. While the measurement of poverty through surveys is relatively well documented in the literature, for hunger, information is much scarcer, particularly for adults, and very different methodologies are applied for children and adults. Our paper seeks to improve on this practice in two ways. One is that we estimate the prevalence of undernutrition in sub-Saharan Africa (SSA) for both children and adults based on anthropometric data available at province or district level, and secondly, we estimate the mean calorie intake and implied calorie gap for SSA, also using anthropometric data on the same geographical aggregation level. Our main results are, first, that we find a much lower prevalence of hunger than presented in the Millennium Development reports (17.3% against 27.8% for the continent as a whole). Secondly, we find that there is much less spread in mean calorie intake across the continent than reported by the Food and Agriculture Organization (FAO) in the State of Food and Agriculture, 2007, the only estimate that covers the whole of Africa. While FAO estimates for calorie availability vary from a low of 1760 Kcal/capita/day for Central Africa to a high of 2825 Kcal/capita/day for Southern Africa, our estimates lie in a range of 2245 Kcal/capita/day (Eastern Africa) to 2618 Kcal/capita/day for Southern Africa. Thirdly, we validate the main data sources used (the Demographic and Health Surveys) by comparing them over time and with other available data sources for various countries. We conclude that the picture of Africa that emerges from anthropometric data is much less negative than that usually presented.
Especially for Eastern and Central Africa, the nutritional status is less critical than commonly assumed and also mean calorie intake is higher, which implies that agricultural production and hence income must also

  4. CSA C873 Building Energy Estimation Methodology - A simplified monthly calculation for quick building optimization

    NARCIS (Netherlands)

    Legault, A.; Scott, L.; Rosemann, A.L.P.; Hopkins, M.

    2014-01-01

    CSA C873 Building Energy Estimation Methodology (BEEM) is a new series of 10 standards intended to simplify building energy calculations. The standard is based upon the German DIN Standard 18599, which has an 8-year proven track record, and has been modified for the Canadian market. The BEEM

  5. A methodology for estimating potential doses and risks from recycling U.S. Department of Energy radioactive scrap metals

    International Nuclear Information System (INIS)

    MacKinney, J.A.

    1995-01-01

    The U.S. Environmental Protection Agency (EPA) is considering writing regulations for the controlled use of materials originating from radioactively contaminated zones which may be recyclable. These materials include metals, such as steel (carbon and stainless), nickel, copper, aluminum and lead, from the decommissioning of federal and non-federal facilities. To develop criteria for the release of such materials, a risk analysis of all potential exposure pathways should be conducted. These pathways include direct exposure to the recycled material by the public and workers, both individual and collective, as well as numerous other potential exposure pathways in the life of the material. EPA has developed a risk assessment methodology for estimating doses and risks associated with recycling radioactive scrap metals. This methodology was applied to metal belonging to the U.S. Department of Energy. This paper will discuss the draft EPA risk assessment methodology as a tool for estimating doses and risks from recycling. (author)

  6. A Methodology of Health Effects Estimation from Air Pollution in Large Asian Cities

    Directory of Open Access Journals (Sweden)

    Keiko Hirota

    2017-09-01

    The increase in health effects caused by air pollution is a growing concern in Asian cities with increasing motorization. This paper discusses methods of estimating the health effects of air pollution in large Asian cities. Given the absence of statistical data in Asia, the paper carefully chooses a methodology using data from the Japanese compensation system. A basic idea of health effects is captured from simple indicators, such as population and air quality, in a correlation model. This correlation model yields more estimates of respiratory mortality caused by air pollution than the relative risk model. The correlation model could be an alternative method for estimating mortality besides the relative risk model, since its results are comparable with those of the relative risk model by city and by time series. The classification of respiratory diseases is not known from the statistical yearbooks in many countries. The estimation results could support policy decision-making with respect to public health in a cost-effective way.

  7. Estimating the global prevalence of inadequate zinc intake from national food balance sheets: effects of methodological assumptions.

    Directory of Open Access Journals (Sweden)

    K Ryan Wessells

    Full Text Available The prevalence of inadequate zinc intake in a population can be estimated by comparing the zinc content of the food supply with the population's theoretical requirement for zinc. However, assumptions regarding the nutrient composition of foods, zinc requirements, and zinc absorption may affect prevalence estimates. These analyses were conducted to: (1 evaluate the effect of varying methodological assumptions on country-specific estimates of the prevalence of dietary zinc inadequacy and (2 generate a model considered to provide the best estimates.National food balance data were obtained from the Food and Agriculture Organization of the United Nations. Zinc and phytate contents of these foods were estimated from three nutrient composition databases. Zinc absorption was predicted using a mathematical model (Miller equation. Theoretical mean daily per capita physiological and dietary requirements for zinc were calculated using recommendations from the Food and Nutrition Board of the Institute of Medicine and the International Zinc Nutrition Consultative Group. The estimated global prevalence of inadequate zinc intake varied between 12-66%, depending on which methodological assumptions were applied. However, country-specific rank order of the estimated prevalence of inadequate intake was conserved across all models (r = 0.57-0.99, P<0.01. A "best-estimate" model, comprised of zinc and phytate data from a composite nutrient database and IZiNCG physiological requirements for absorbed zinc, estimated the global prevalence of inadequate zinc intake to be 17.3%.Given the multiple sources of uncertainty in this method, caution must be taken in the interpretation of the estimated prevalence figures. However, the results of all models indicate that inadequate zinc intake may be fairly common globally. Inferences regarding the relative likelihood of zinc deficiency as a public health problem in different countries can be drawn based on the country
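
    The core comparison described above, an intake distribution set against a mean requirement, reduces to a cut-point calculation: the prevalence of inadequacy is the fraction of the population whose usual absorbed-zinc intake falls below the physiological requirement. The sketch below assumes a normal intake distribution with invented parameters; it does not reproduce the Miller absorption equation or the study's actual requirement values.

```python
import math

def prevalence_inadequate(mean_intake, sd_intake, mean_requirement):
    """P(intake < requirement) for a normally distributed usual intake."""
    z = (mean_requirement - mean_intake) / sd_intake
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # normal CDF at z

# Hypothetical: absorbed zinc ~ N(2.5, 0.6) mg/d, mean requirement 1.86 mg/d
p = prevalence_inadequate(2.5, 0.6, 1.86)
print(f"estimated prevalence of inadequate intake: {p:.1%}")
```

    Re-running this with alternative requirement values or intake distributions is exactly the kind of sensitivity analysis that produced the 12-66% spread reported above.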

  8. AEGIS methodology and a perspective from AEGIS methodology demonstrations

    International Nuclear Information System (INIS)

    Dove, F.H.

    1981-03-01

    Objectives of AEGIS (Assessment of Effectiveness of Geologic Isolation Systems) are to develop the capabilities needed to assess the post-closure safety of waste isolation in geologic formations; demonstrate these capabilities on reference sites; apply the assessment methodology to assist the NWTS program in site selection, waste package and repository design; and perform repository site analyses for the licensing needs of NWTS. This paper summarizes the AEGIS methodology, the experience gained from methodology demonstrations, and provides an overview in the following areas: estimation of the response of a repository to perturbing geologic and hydrologic events; estimation of the transport of radionuclides from a repository to man; and assessment of uncertainties.

  9. Assessing Flood Risk Under Sea Level Rise and Extreme Sea Levels Scenarios: Application to the Ebro Delta (Spain)

    Science.gov (United States)

    Sayol, J. M.; Marcos, M.

    2018-02-01

    This study presents a novel methodology to estimate the impact of local sea level rise and extreme surges and waves in coastal areas under climate change scenarios. The methodology is applied to the Ebro Delta, a valuable and vulnerable low-lying wetland located in the northwestern Mediterranean Sea. Projections of local sea level accounting for all contributions to mean sea level changes, including thermal expansion, dynamic changes, fresh water addition and glacial isostatic adjustment, have been obtained from regionalized sea level projections during the 21st century. Particular attention has been paid to the uncertainties, which have been derived from the spread of the multi-model ensemble combined with seasonal/inter-annual sea level variability from local tide gauge observations. In addition, vertical land movements have been integrated to estimate local relative sea level rise. On the other hand, regional projections over the Mediterranean basin of storm surges and wind-waves have been used to evaluate changes in extreme events. The compound effects of surges and extreme waves have been quantified using their joint probability distributions. Finally, offshore sea level projections from extreme events superimposed on mean sea level have been propagated onto a high resolution digital elevation model of the study region in order to construct flood hazard maps for the middle and end of the 21st century and under two different climate change scenarios. The effect of each contribution has been evaluated in terms of percentage of the area exposed to coastal hazards, which will help to design more efficient protection and adaptation measures.
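
    The compound-event idea above, that surges and extreme waves co-occur more often than independence would suggest, can be illustrated with an empirical joint exceedance probability. The synthetic, deliberately correlated series below are illustrative only; the study uses fitted joint probability distributions rather than this counting estimate.

```python
import random

def joint_exceedance(surges, waves, surge_thr, wave_thr):
    """Empirical P(surge > surge_thr AND wave height > wave_thr)."""
    hits = sum(1 for s, w in zip(surges, waves) if s > surge_thr and w > wave_thr)
    return hits / len(surges)

rng = random.Random(1)
n = 10_000
# Correlated synthetic series: waves partly driven by the same storms as surges
surges = [rng.gauss(0.0, 0.15) for _ in range(n)]                # surge, m
waves = [0.8 * s / 0.15 + rng.gauss(2.0, 0.5) for s in surges]   # Hs, m

p_joint = joint_exceedance(surges, waves, 0.3, 3.0)
p_indep = (sum(s > 0.3 for s in surges) / n) * (sum(w > 3.0 for w in waves) / n)
print(f"joint: {p_joint:.4f}  vs independence assumption: {p_indep:.4f}")
```

    The gap between the two probabilities is why treating surge and wave hazards independently can badly understate flood risk for a low-lying delta.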

  10. An Update of Sea Level Rise in the northwestern part of the Arabian Gulf

    Science.gov (United States)

    Alothman, Abdulaziz; Bos, Machiel; Fernandes, Rui

    2017-04-01

    Relative sea level variations in the northwestern part of the Arabian Gulf have been estimated in the past using no more than 10 to 15 years of observations. In Alothman et al. (2014), we almost doubled the period to 28.7 years by examining all available tide gauge data in the area and constructing a mean gauge time-series from seven coastal tide gauges. We found for the period 1979-2007 a relative sea level rise of about 2mm/yr, corresponding to an absolute sea level rise of about 1.5mm/yr based on the vertical displacement of GNSS stations in the region. By taking into account the temporal correlations we concluded that previously published results underestimate the true sea level rate error in this area by a factor of 5-10. In this work, we discuss and update the methodology and results from Alothman et al. (2014), particularly by checking and extending the GNSS solutions. Since three of the six GPS stations used only started observing at the end of 2011, the longer time series now yield significantly lower uncertainties in the estimated vertical rates. In addition, we compare our results with GRACE-derived ocean bottom pressure time series, which are a good proxy of the changes in water mass in this area over time.
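
    The error-underestimation point above can be illustrated with the classic effective-sample-size result for serially correlated residuals: with lag-1 autocorrelation phi, the number of effectively independent observations shrinks, and a white-noise standard error understates the true one. This AR(1) inflation factor is a standard approximation (and only approximate for trend errors); the record length and phi values below are illustrative.

```python
import math

def effective_sample_size(n, phi):
    """Effective number of independent samples for an AR(1) process."""
    return n * (1.0 - phi) / (1.0 + phi)

def error_inflation(phi):
    """Factor by which a white-noise standard error is underestimated."""
    return math.sqrt((1.0 + phi) / (1.0 - phi))

# Monthly sea-level residuals over ~29 years with strong autocorrelation
n = 29 * 12
for phi in (0.5, 0.9, 0.98):
    print(f"phi={phi}: N_eff={effective_sample_size(n, phi):.0f}, "
          f"rate error underestimated by x{error_inflation(phi):.1f}")
```

    For phi near 0.98 the inflation approaches a factor of 10, the same order as the 5-10x underestimation the abstract reports.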

  11. The Application of Best Estimate and Uncertainty Analysis Methodology to Large LOCA Power Pulse in a CANDU 6 Reactor

    International Nuclear Information System (INIS)

    Abdul-Razzak, A.; Zhang, J.; Sills, H.E.; Flatt, L.; Jenkins, D.; Wallace, D.J.; Popov, N.

    2002-01-01

    The paper briefly describes a best estimate plus uncertainty analysis (BE+UA) methodology and presents its prototype application to the power pulse phase of a limiting large Loss-of-Coolant Accident (LOCA) for a CANDU 6 reactor fuelled with CANFLEX® fuel. The methodology is consistent with and builds on world practice. The analysis is divided into two phases to focus on the dominant parameters for each phase and to allow for the consideration of all identified highly ranked parameters in the statistical analysis and response surface fits for margin parameters. The objective of this analysis is to quantify improvements in predicted safety margins under best estimate conditions. (authors)

  12. Estimation of undernutrition and mean calorie intake in Africa: methodology, findings and implications

    Directory of Open Access Journals (Sweden)

    Nubé Maarten

    2009-06-01

    Background: As poverty and hunger are basic yardsticks of underdevelopment and destitution, the need for reliable statistics in this domain is self-evident. While the measurement of poverty through surveys is relatively well documented in the literature, for hunger, information is much scarcer, particularly for adults, and very different methodologies are applied for children and adults. Our paper seeks to improve on this practice in two ways. One is that we estimate the prevalence of undernutrition in sub-Saharan Africa (SSA) for both children and adults based on anthropometric data available at province or district level, and secondly, we estimate the mean calorie intake and implied calorie gap for SSA, also using anthropometric data on the same geographical aggregation level. Results: Our main results are, first, that we find a much lower prevalence of hunger than presented in the Millennium Development reports (17.3% against 27.8% for the continent as a whole). Secondly, we find that there is much less spread in mean calorie intake across the continent than reported by the Food and Agriculture Organization (FAO) in the State of Food and Agriculture, 2007, the only estimate that covers the whole of Africa. While FAO estimates for calorie availability vary from a low of 1760 Kcal/capita/day for Central Africa to a high of 2825 Kcal/capita/day for Southern Africa, our estimates lie in a range of 2245 Kcal/capita/day (Eastern Africa) to 2618 Kcal/capita/day for Southern Africa. Thirdly, we validate the main data sources used (the Demographic and Health Surveys) by comparing them over time and with other available data sources for various countries. Conclusion: We conclude that the picture of Africa that emerges from anthropometric data is much less negative than that usually presented.
Especially for Eastern and Central Africa, the nutritional status is less critical than commonly assumed and also mean calorie intake is higher, which implies

  13. Waste management programmatic environmental impact statement methodology for estimating human health risks

    International Nuclear Information System (INIS)

    Bergenback, B.; Blaylock, B.P.; Legg, J.L.

    1995-05-01

    The US Department of Energy (DOE) has produced large quantities of radioactive and hazardous waste during years of nuclear weapons production. As a result, a large number of sites across the DOE Complex have become chemically and/or radiologically contaminated. In 1990, the Secretary of Energy charged the DOE Office of Environmental Restoration and Waste Management (EM) with the task of preparing a Programmatic Environmental Impact Statement (PEIS). The PEIS should identify and assess the potential environmental impacts of implementing several integrated Environmental Restoration (ER) and Waste Management (WM) alternatives. The determination and integration of appropriate remediation activities and sound waste management practices is vital for ensuring the diminution of adverse human health impacts during site cleanup and waste management programs. This report documents the PEIS risk assessment methodology used to evaluate human health risks posed by WM activities. The methodology presents a programmatic cradle-to-grave risk assessment for EM program activities. A unit dose approach is used to estimate risks posed by WM activities and is the subject of this document.

  14. ETE-EVAL: a methodology for D and D cost estimation

    International Nuclear Information System (INIS)

    Decobert, G.; Robic, S.; Vanel, V.

    2008-01-01

    In compliance with Article 20 of the sustainable radioactive materials and waste management act dated 28 June 2006, the CEA and AREVA are required every three years to revise the cost of decommissioning their facilities and to provide the necessary assets by constituting a dedicated fund. For the 2007 revision the CEA used ETE-EVAL V5. Similarly, AREVA reevaluated the cost of decontaminating and dismantling its facilities at La Hague, as the previous estimate in 2004 did not take into account the complete cleanup of all the structural work. ETE-EVAL V5 is a computer application designed to estimate the cost of decontamination and dismantling of basic nuclear installations (INB). It has been qualified by Bureau Veritas and audited. ETE-EVAL V5 has become the official software for cost assessment of CEA civilian and AREVA decommissioning projects. It has been used by the DPAD (Decontamination and Dismantling Projects Department) cost assessment group to estimate the cost of decommissioning some thirty facilities (cost update on completion for the dedicated fund for dismantling civilian CEA facilities) and by AREVA to estimate the cost of decommissioning its fuel cycle back-end facilities. Some necessary modifications are now being implemented to allow for the specific aspects of fuel cycle front-end facilities. The computational method is based on physical, radiological and waste inventories following a particular methodology, and on interviews with operating personnel to compile ratios and financial data (operating cost, etc.) and enter them in a database called GREEN (from the French acronym for Management Ratios for Assessment of Nuclear Facilities). ETE-EVAL V5 comprises the cost assessment module and GREEN database. It has been enriched with the lessons learned from experience, and can be adapted as necessary to meet installation-specific requirements. The cost assessment module allows the user to estimate decommissioning costs once the inventory has been
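The ratio-based costing that ETE-EVAL formalizes can be sketched as multiplying a physical/radiological inventory by management ratios (cost per unit quantity) of the kind held in a GREEN-like database. All quantities and ratios below are invented for illustration, not values from the CEA or AREVA estimates.

```python
# Hedged sketch of a ratio-based D&D cost estimate: each inventory line is
# multiplied by a unit-cost ratio and the results are summed. Items, units,
# and ratios are hypothetical.

inventory = {                # facility inventory, assumed units
    "steel_t": 1200.0,       # tonnes of contaminated steel
    "concrete_m3": 8500.0,   # m3 of structural concrete
    "waste_drums": 430.0,    # conditioned waste packages
}

ratios_eur = {               # cost ratio per unit, assumed
    "steel_t": 2500.0,
    "concrete_m3": 180.0,
    "waste_drums": 1100.0,
}

def dd_cost(inventory, ratios):
    """Sum quantity x unit-cost ratio over all inventory lines."""
    return sum(qty * ratios[item] for item, qty in inventory.items())

total = dd_cost(inventory, ratios_eur)
```

Separating the inventory from the ratio database is what lets the same cost engine be reused across facilities, as the abstract describes for fuel cycle back-end versus front-end plants.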

  15. THE FORMATION OF DESIGN AND ORGANIZATIONAL AND TECHNOLOGICAL DECISIONS OF THE CONSTRUCTION OF HIGH-RISE MULTIPURPOSE COMPLEXES

    Directory of Open Access Journals (Sweden)

    BOLSHAKOV V. I.

    2016-05-01

    Full Text Available Purpose. The formation of multiple variants for the construction of high-rise multipurpose complexes. Methodology. Formation of a system of implementation variants for the creation and functioning of high-rise multipurpose complexes using combinatorial morphological analysis and synthesis. Findings. A set of life-cycle variants for high-rise multipurpose complexes. Originality. The developed method of forming organizational and technological solutions is adapted to the conditions of the construction of high-rise multipurpose complexes; it provides for multi-variant evaluation of conditions, taking into account regulatory requirements for fire safety, insolation of buildings and premises, protection against noise and vibration, energy efficiency, infrastructure, and the population density of a residential district with a full range of institutions and enterprises of local significance, within existing resource constraints, to ensure the commissioning of objects with specified technical and economic characteristics. Practical value. The proposed model and methodology allow a rational variant of a high-rise building to be determined according to specified criteria and constraints.
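The combinatorial core of morphological analysis and synthesis can be sketched as a Cartesian product over option sets, with constraints pruning infeasible variants. The design dimensions, options, and constraint below are hypothetical, not taken from the cited study.

```python
# Minimal sketch of combinatorial morphological analysis: each design
# dimension has a set of options, candidate variants are the Cartesian
# product, and a compatibility predicate filters infeasible combinations.
from itertools import product

dimensions = {
    "structure": ["frame", "tube", "core-and-outrigger"],
    "facade": ["curtain wall", "ventilated"],
    "use_mix": ["residential", "office", "mixed"],
}

def feasible(variant):
    # Example constraint (invented): a purely residential tower here
    # is not allowed to use a tube structural system.
    return not (variant["use_mix"] == "residential" and variant["structure"] == "tube")

variants = [dict(zip(dimensions, combo)) for combo in product(*dimensions.values())]
feasible_variants = [v for v in variants if feasible(v)]
```

Real applications replace the single predicate with the full set of regulatory checks (fire safety, insolation, noise, energy efficiency) and then rank the surviving variants against criteria and resource constraints.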

  16. Methodology for uncertainty estimation of Hanford tank chemical and radionuclide inventories and concentrations

    International Nuclear Information System (INIS)

    Chen, G.; Ferryman, T.A.; Remund, K.M.

    1998-02-01

    The exact physical and chemical nature of 55 million gallons of toxic waste held in 177 underground waste tanks at the Hanford Site is not known in sufficient detail to support the safety, retrieval, and immobilization missions presented to Hanford. The Hanford Best Basis team has made point estimates of the inventories in each tank. The purpose of this study is to estimate probability distributions for each of the 71 analytes in each of the 177 tanks for which the Hanford Best Basis team has made point estimates. This will enable uncertainty intervals to be calculated for the Best Basis inventories and should facilitate the safety, retrieval, and immobilization missions. Section 2 of this document describes the overall approach used to estimate tank inventory uncertainties. Three major components are considered in this approach: chemical concentration, density, and waste volume. Section 2 also describes the two different methods used to evaluate the tank wastes in terms of sludges and in terms of supernatant or saltcakes. Sections 3 and 4 describe in detail the methodology to assess the probability distributions for each of the three components, as well as the data sources for implementation. The conclusions are given in Section 5.
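The three-component decomposition above (inventory = concentration × density × waste volume) lends itself to Monte Carlo propagation: sample each factor from its assumed distribution and read uncertainty intervals off the resulting draws. The distribution families and parameters below are illustrative, not the report's.

```python
# Hedged sketch of three-component uncertainty propagation for a tank
# inventory: concentration, density, and volume are each sampled from an
# assumed distribution and multiplied. All parameters are invented.
import random
import statistics

random.seed(42)

def sample_inventory(n=10_000):
    """Monte Carlo draws of inventory = concentration x density x volume."""
    draws = []
    for _ in range(n):
        conc = random.lognormvariate(0.0, 0.3)     # analyte concentration, assumed
        density = random.gauss(1.6, 0.1)           # bulk density, assumed
        volume = random.gauss(500_000, 50_000)     # waste volume, assumed
        draws.append(conc * density * volume)
    return draws

draws = sorted(sample_inventory())
median = statistics.median(draws)
p05, p95 = draws[500], draws[9500]                 # ~90% uncertainty interval
```

The point of such a sketch is that the interval (p05, p95), not the single Best Basis point estimate, is what downstream safety analyses would consume.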

  17. A Capacitance-Based Methodology for the Estimation of Piezoelectric Coefficients of Poled Piezoelectric Materials

    KAUST Repository

    Al Ahmad, Mahmoud

    2010-10-04

    A methodology is proposed to estimate the piezoelectric coefficients of bulk piezoelectric materials using simple capacitance measurements. The extracted values of d33 and d31 from the capacitance measurements were 506 pC/N and 247 pC/N, respectively. The d33 value is in agreement with that obtained from the Berlincourt method, which gave a d33 value of 500 pC/N. In addition, the d31 value is in agreement with the value obtained from the optical method, which gave a d31 value of 223 pC/V. These results suggest that the proposed method is a viable way to quickly estimate piezoelectric coefficients of bulk unclamped samples. © 2010 The Electrochemical Society.

  18. Vertical Rise Velocity of Equatorial Plasma Bubbles Estimated from Equatorial Atmosphere Radar Observations and High-Resolution Bubble Model Simulations

    Science.gov (United States)

    Yokoyama, T.; Ajith, K. K.; Yamamoto, M.; Niranjan, K.

    2017-12-01

    Equatorial plasma bubble (EPB) is a well-known phenomenon in the equatorial ionospheric F region. As it causes severe scintillation in the amplitude and phase of radio signals, it is important to understand and forecast the occurrence of EPBs from a space weather point of view. The development of EPBs is presently understood as an evolution of the generalized Rayleigh-Taylor instability. We have already developed a 3D high-resolution bubble (HIRB) model with a grid spacing as small as 1 km and presented the nonlinear growth of EPBs, which shows very turbulent internal structures such as bifurcation and pinching. As EPBs have field-aligned structures, the latitude range that is affected by EPBs depends on the apex altitude of EPBs over the dip equator. However, it was not easy to observe the apex altitude and vertical rise velocity of EPBs. Equatorial Atmosphere Radar (EAR) in Indonesia is capable of steering radar beams quickly so that the growth phase of EPBs can be captured clearly. The vertical rise velocities of the EPBs observed around the midnight hours are significantly smaller than those observed in postsunset hours. Further, the vertical growth of the EPBs around midnight hours ceases at relatively lower altitudes, whereas the majority of EPBs at postsunset hours were found to have grown beyond the maximum detectable altitude of the EAR. The HIRB model with varying background conditions is employed to investigate the possible factors that control the vertical rise velocity and maximum attainable altitudes of EPBs. The estimated rise velocities from EAR observations at both postsunset and midnight hours are, in general, consistent with the nonlinear evolution of EPBs from the HIRB model.

  19. Using Rising Limb Analysis to Estimate Uptake of Reactive Solutes in Advective and Transient Storage Sub-compartments of Stream Ecosystems

    Science.gov (United States)

    Thomas, S. A.; Valett, H.; Webster, J. R.; Mulholland, P. J.; Dahm, C. N.

    2001-12-01

    Identifying the locations and controls governing solute uptake is a recent area of focus in studies of stream biogeochemistry. We introduce a technique, rising limb analysis (RLA), to estimate areal nitrate uptake in the advective and transient storage (TS) zones of streams. RLA is an inverse approach that combines nutrient spiraling and transient storage modeling to calculate total uptake of reactive solutes and the fraction of uptake occurring within the advective sub-compartment of streams. The contribution of the transient storage zones to solute loss is determined by difference. Twelve-hour coinjections of conservative (Cl-) and reactive (15NO3) tracers were conducted seasonally in several headwater streams among which AS/A ranged from 0.01 to 2.0. TS characteristics were determined using an advection-dispersion model modified to include hydrologic exchange with a transient storage compartment. Whole-system uptake was determined by fitting the longitudinal pattern of NO3 to a first-order exponential decay model. Uptake in the advective sub-compartment was determined by collecting a temporal sequence of samples from a single location beginning with the arrival of the solute front and concluding with the onset of plateau conditions (i.e. the rising limb). Across the rising limb, 15NO3:Cl was regressed against the percentage of water that had resided in the transient storage zone (calculated from the TS modeling). The y-intercept thus provides an estimate of the plateau 15NO3:Cl ratio in the absence of NO3 uptake within the transient storage zone. Algebraic expressions were used to calculate the percentage of NO3 uptake occurring in the advective and transient storage sub-compartments. Application of RLA successfully estimated uptake coefficients for NO3 in the subsurface when the physical dimensions of that habitat were substantial (AS/A > 0.2) and when plateau conditions at the sampling location consisted of waters in which at least 25% had resided in the
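The whole-system uptake step above fits the downstream decline of plateau tracer concentration to C(x) = C0·exp(−kw·x), so the uptake coefficient kw falls out of a log-linear regression. The sketch below uses synthetic, noise-free data to show the mechanics.

```python
# Sketch of the first-order exponential decay fit used for whole-stream
# uptake: regress ln(C) on downstream distance; the negative slope is the
# longitudinal uptake coefficient kw. Data here are synthetic.
import math

distances = [0, 50, 100, 150, 200]          # m downstream of injection
kw_true, c0 = 0.004, 10.0                   # per-metre uptake coefficient (assumed)
conc = [c0 * math.exp(-kw_true * x) for x in distances]

# Ordinary least squares of ln(C) against x
n = len(distances)
mx = sum(distances) / n
my = sum(math.log(c) for c in conc) / n
slope = (sum(x * math.log(c) for x, c in zip(distances, conc)) - n * mx * my) / \
        (sum(x * x for x in distances) - n * mx * mx)
kw_est = -slope
```

With field data the same regression carries measurement noise, and kw is then combined with discharge and width to obtain areal uptake, as in standard nutrient spiraling practice.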

  20. Temperature-based estimation of global solar radiation using soft computing methodologies

    Science.gov (United States)

    Mohammadi, Kasra; Shamshirband, Shahaboddin; Danesh, Amir Seyed; Abdullah, Mohd Shahidan; Zamani, Mazdak

    2016-07-01

    Precise knowledge of solar radiation is indeed essential in different technological and scientific applications of solar energy. Temperature-based estimation of global solar radiation would be appealing owing to broad availability of measured air temperatures. In this study, the potentials of soft computing techniques are evaluated to estimate daily horizontal global solar radiation (DHGSR) from measured maximum, minimum, and average air temperatures (Tmax, Tmin, and Tavg) in an Iranian city. For this purpose, a comparative evaluation between three methodologies of adaptive neuro-fuzzy inference system (ANFIS), radial basis function support vector regression (SVR-rbf), and polynomial basis function support vector regression (SVR-poly) is performed. Five combinations of Tmax, Tmin, and Tavg serve as inputs to develop ANFIS, SVR-rbf, and SVR-poly models. The attained results show that all ANFIS, SVR-rbf, and SVR-poly models provide favorable accuracy. Across all techniques, the highest accuracies are achieved by models (5), which use Tmax − Tmin and Tmax as inputs. According to the statistical results, SVR-rbf outperforms SVR-poly and ANFIS. For SVR-rbf (5), the mean absolute bias error, root mean square error, and correlation coefficient are 1.1931 MJ/m2, 2.0716 MJ/m2, and 0.9380, respectively. The survey results confirm that SVR-rbf can be used efficiently to estimate DHGSR from air temperatures.
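A classical temperature-based baseline for the same task (not the paper's ANFIS/SVR models) is the Hargreaves-Samani relation, which estimates global solar radiation from the diurnal temperature range and extraterrestrial radiation Ra, with kRs around 0.16 for interior sites in the FAO-56 convention. The input values below are invented.

```python
# Hargreaves-Samani temperature-based radiation estimate: Rs = kRs *
# sqrt(Tmax - Tmin) * Ra. This is a classical baseline, not the soft
# computing models evaluated in the study above.
import math

def hargreaves_radiation(t_max, t_min, ra_mj_m2, k_rs=0.16):
    """Daily global solar radiation (MJ/m2/day) from air temperatures.

    ra_mj_m2 is the extraterrestrial radiation for the site and day;
    k_rs ~ 0.16 (interior) or ~ 0.19 (coastal) per FAO-56.
    """
    return k_rs * math.sqrt(t_max - t_min) * ra_mj_m2

rs = hargreaves_radiation(t_max=31.0, t_min=19.0, ra_mj_m2=38.0)
```

Such a closed-form baseline is useful for judging whether the extra complexity of ANFIS or SVR models actually buys accuracy at a given site.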

  1. Sensitivity analysis of hydrogeological parameters affecting groundwater storage change caused by sea level rise

    Science.gov (United States)

    Shin, J.; Kim, K.-H.; Lee, K.-K.

    2012-04-01

    Sea level rise, which is one of the representative phenomena of climate changes caused by global warming, can affect the groundwater system. The rising trend of the sea level caused by global warming is reported to be about 3 mm/year for the most recent 10-year average (IPCC, 2007). The rate of sea level rise around the Korean peninsula is reported to be 2.30±2.22 mm/yr during the 1960-1999 period (Cho, 2002) and 2.16±1.77 mm/yr (Kim et al., 2009) during the 1968-2007 period. Both of these rates are faster than the 1.8±0.5 mm/yr global average for the similar 1961-2003 period (IPCC, 2007). In this study, we analyzed changes in the groundwater environment affected by sea level rise by using an analytical methodology. We tried to find the most effective parameters of groundwater amount change in order to estimate the change in fresh water amount in coastal groundwater. A hypothetical island model of a cylindrical shape is considered. The groundwater storage change can go in either direction as the sea level rises, depending on the natural and hydrogeological conditions. Analysis of the computation results shows that topographic slope and hydraulic conductivity are the most sensitive factors. The contributions of the groundwater recharge rate and the thickness of the aquifer below sea level are relatively less effective. In an island with steep seashore slopes larger than about 1-2 degrees, the storage amount of fresh water in a coastal area increases as sea level rises. On the other hand, when sea level drops, the storage amount decreases. This is because the groundwater level also rises with the rising sea level in steep seashores. For relatively flat seashores, where the slope is smaller than about 1-2 degrees, the storage amount of coastal fresh water decreases when the sea level rises because the area flooded by the rising sea water is increased. The volume of aquifer fresh water in this circumstance is greatly reduced in proportion to the flooded area with the sea
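The slope sensitivity noted above has a simple geometric core: on a planar beach, a sea level rise dh moves the shoreline inland by dh / tan(slope), so flat coasts lose far more land (and fresh-water storage area) than steep ones. This is a pure illustration of that geometry, not the study's analytical model.

```python
# Shoreline retreat on a planar slope: horizontal inundation distance
# equals the rise divided by the tangent of the beach slope. Values are
# illustrative only.
import math

def shoreline_retreat(rise_m, slope_deg):
    """Horizontal inundation distance (m) for a planar coastal slope."""
    return rise_m / math.tan(math.radians(slope_deg))

steep = shoreline_retreat(0.3, 10.0)   # steep shore: ~1.7 m retreat
flat = shoreline_retreat(0.3, 0.5)     # flat shore: ~34 m retreat
```

The order-of-magnitude gap between the two cases is why the study finds the 1-2 degree slope threshold separating storage gain from storage loss.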

  2. The use of nanomodified concrete in construction of high-rise buildings

    Science.gov (United States)

    Prokhorov, Sergei

    2018-03-01

    Construction is one of the leading sectors of the economy. Currently, concrete is the basis of most structural elements, without which it is impossible to imagine the construction of a single building or facility. Their strength, reinforcement, and service life are determined at the design stage, taking long-term operation into account. In practice, however, structures are subject to a considerable number of impacts that affect their strength. In some cases, these impacts are random and have no standardized values. This is especially true in the construction and operation of high-rise buildings and structures. Unlike ordinary multi-storey buildings, they experience significant loads already at the erection stage, as they support load-lifting mechanisms, formwork systems, workers, etc. The purpose of the presented article is to develop a methodology for estimating the internal fatigue of concrete structures based on changes in their electrical conductivity.

  3. Compilation and review of methodologies for estimating the comparative electric power system costs for renewable energy systems. Working material

    International Nuclear Information System (INIS)

    1993-01-01

    This Working Material provides a review of methodologies for estimating the costs of renewable energy systems and the state-of-the-art knowledge on stochastic features and economic evaluation methodologies of renewable energy systems for electricity generation in a grid-integrated system. This material is expected to facilitate wider access to sources relevant to the comparative assessment activities in progress at the IAEA. Refs, figs, tabs

  4. A Consistent Methodology Based Parameter Estimation for a Lactic Acid Bacteria Fermentation Model

    DEFF Research Database (Denmark)

    Spann, Robert; Roca, Christophe; Kold, David

    2017-01-01

    Lactic acid bacteria are used in many industrial applications, e.g. as starter cultures in the dairy industry or as probiotics, and research on their cell production is in high demand. A first principles kinetic model was developed to describe and understand the biological, physical, and chemical...... mechanisms in a lactic acid bacteria fermentation. We present here a consistent approach for methodology-based parameter estimation for a lactic acid fermentation. In the beginning, just an initial knowledge-based guess of parameters was available and an initial parameter estimation of the complete set...... of parameters was performed in order to get a good model fit to the data. However, not all parameters are identifiable with the given data set and model structure. Sensitivity, identifiability, and uncertainty analysis were completed and a relevant identifiable subset of parameters was determined for a new...

  5. High-rise construction in the Russian economy: modeling of management decisions

    Science.gov (United States)

    Miroshnikova, Tatyana; Taskaeva, Natalia

    2018-03-01

    The growth in the building industry, particularly in residential high-rise construction, is having considerable influence on the country's economic development. The scientific hypothesis of the research is that the execution of town-planning programs of high-rise construction depends to a large extent on the management of the provision of material resources for construction in a city of over a million inhabitants, while the balance model is the most important tool for establishing and determining the ratio between supply and demand for material resources. In order to improve the efficiency of high-rise building management, it is proposed to develop a methodology for managing the provision of material resources to construction in large cities.

  6. Projecting Future Sea Level Rise for Water Resources Planning in California

    Science.gov (United States)

    Anderson, J.; Kao, K.; Chung, F.

    2008-12-01

    Sea level rise is one of the major concerns for the management of California's water resources. Higher water levels and salinity intrusion into the Sacramento-San Joaquin Delta could affect water supplies, water quality, levee stability, and aquatic and terrestrial flora and fauna species and their habitat. Over the 20th century, sea levels near San Francisco Bay increased by over 0.6 ft. Some tidal gauge and satellite data indicate that rates of sea level rise are accelerating. Sea levels are expected to continue to rise due to increasing air temperatures causing thermal expansion of the ocean and melting of land-based ice such as ice on Greenland and in southeastern Alaska. For water planners, two related questions arise regarding the uncertainty of future sea levels. First, what is the expected sea level at a specific point in time in the future, e.g., what is the expected sea level in 2050? Second, what is the expected point of time in the future when sea levels will exceed a certain height, e.g., what is the expected range of time when the sea level rises by one foot? To address these two types of questions, two factors are considered: (1) the long-term sea level rise trend, and (2) local extreme sea level fluctuations. A two-step approach will be used to develop sea level rise projection guidelines for decision making that takes both of these factors into account. The first step is developing global sea level rise probability distributions for the long-term trends. The second step will extend the approach to take into account the effects of local astronomical tides, changes in atmospheric pressure, wind stress, floods, and the El Niño/Southern Oscillation. In this paper, the development of the first step approach is presented. To project the long-term sea level rise trend, one option is to extend the current rate of sea level rise into the future. However, since recent data indicate rates of sea level rise are accelerating, methods for estimating sea level rise
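The two planning questions above can be framed with a quadratic trend h(t) = r·t + ½·a·t², i.e. a current rate r plus an acceleration a: evaluate it at a target year, or invert it for the year a given rise is reached. The coefficients below are illustrative assumptions, not the paper's projections.

```python
# Quadratic sea level trend: rate plus acceleration. rise_at answers
# "what is the rise at year t?"; years_until inverts the trend to answer
# "when is a given rise reached?". Coefficients are invented.
import math

r = 0.0018   # m/yr, assumed current rate of rise
a = 0.00002  # m/yr^2, assumed acceleration

def rise_at(years):
    """Projected rise (m) after `years` from now."""
    return r * years + 0.5 * a * years**2

def years_until(rise_m):
    """Positive root of 0.5*a*t^2 + r*t - rise_m = 0."""
    return (-r + math.sqrt(r * r + 2.0 * a * rise_m)) / a

rise_2050 = rise_at(42)            # rise ~4 decades ahead
t_one_foot = years_until(0.3048)   # years until one foot of rise
```

A probabilistic version replaces the fixed (r, a) pair with distributions, which is exactly the first-step product the paper describes.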

  7. Uncertainty analysis for results of thermal hydraulic codes of best-estimate-type

    International Nuclear Information System (INIS)

    Alva N, J.

    2010-01-01

    In this thesis, fundamental concepts of uncertainty analysis are presented, along with diverse methodologies applied in the analysis of nuclear power plant transient events, particularly those related to thermal hydraulic phenomena. The concepts and methodologies discussed in this work are drawn from a wide bibliographical survey of the nuclear power literature. Methodologies for uncertainty analysis have been developed by quite diverse institutions, and they have been widely used worldwide for application to results from best-estimate-type computer codes in nuclear reactor thermal hydraulics and safety analysis. Also, the main uncertainty sources, types of uncertainties, and aspects related to best estimate modeling and methods are introduced. Once the main bases of uncertainty analysis have been set out and some of the known methodologies introduced, the CSAU methodology, which is applied in the analyses, is presented in detail. The main objective of this thesis is to compare the results of an uncertainty and sensitivity analysis using the Response Surface Technique with those obtained by applying Wilks' formula, for a loss-of-coolant experiment and a rise transient in a BWR. Both techniques are options in the uncertainty and sensitivity analysis part of the CSAU methodology, which was developed for the analysis of transients and accidents at nuclear power plants and is the basis of most methodologies used in licensing of nuclear power plants practically everywhere. Finally, the results of applying both techniques are compared and discussed. (Author)
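Wilks' formula, mentioned above, fixes the number of code runs needed for a one-sided nonparametric tolerance bound: the smallest N with 1 − γ^N ≥ β, where γ is the coverage and β the confidence. For the common 95%/95% criterion this gives the well-known N = 59.

```python
# First-order, one-sided Wilks sample size: smallest N such that the
# maximum of N random code runs bounds the `coverage` quantile of the
# output with probability `confidence`.
import math

def wilks_sample_size(coverage=0.95, confidence=0.95):
    """Smallest N with 1 - coverage**N >= confidence."""
    return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

n_95_95 = wilks_sample_size()              # 59 runs for 95%/95%
n_99_95 = wilks_sample_size(0.99, 0.95)    # 299 runs for 99%/95%
```

This is why Wilks-based uncertainty analysis is attractive against response surfaces: the run count is independent of the number of uncertain input parameters.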

  8. Accelerated lifetime testing methodology for lifetime estimation of Lithium-ion batteries used in augmented wind power plants

    DEFF Research Database (Denmark)

    Stroe, Daniel Ioan; Swierczynski, Maciej Jozef; Stan, Ana-Irina

    2013-01-01

    The development of lifetime estimation models for Lithium-ion battery cells, which are working under highly variable mission profiles characteristic for wind power plant applications, requires a lot of expenditures and time resources. Therefore, batteries have to be tested under accelerated...... lifetime ageing conditions. This paper presents a three-stage methodology used for accelerated lifetime testing of Lithium-ion batteries. The results obtained at the end of the accelerated ageing process can be used for the parametrization of a performance-degradation lifetime model. In the proposed...... methodology both calendar and cycling lifetime tests are considered since both components are influencing the lifetime of Lithium-ion batteries. The methodology also proposes a lifetime model verification stage, where Lithium-ion battery cells are tested at normal operating conditions using an application...

  9. Accelerated Lifetime Testing Methodology for Lifetime Estimation of Lithium-ion Batteries used in Augmented Wind Power Plants

    DEFF Research Database (Denmark)

    Stroe, Daniel Ioan; Swierczynski, Maciej Jozef; Stan, Ana-Irina

    2014-01-01

    The development of lifetime estimation models for Lithium-ion battery cells, which are working under highly variable mission profiles characteristic for wind power plant applications, requires a lot of expenditures and time resources. Therefore, batteries have to be tested under accelerated...... lifetime ageing conditions. This paper presents a three-stage methodology used for accelerated lifetime testing of Lithium-ion batteries. The results obtained at the end of the accelerated ageing process were used for the parametrization of a performance-degradation lifetime model, which is able to predict...... both the capacity fade and the power capability decrease of the selected Lithium-ion battery cells. In the proposed methodology both calendar and cycling lifetime tests were considered since both components are influencing the lifetime of Lithium-ion batteries. Furthermore, the proposed methodology...
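A performance-degradation model of the kind parametrized from such accelerated ageing tests often takes the form of calendar capacity fade growing with the square root of time and accelerating with temperature through an Arrhenius factor. The coefficients below are invented for illustration, not fitted values from these papers.

```python
# Hedged sketch of a calendar capacity-fade model: fade ~ Arrhenius(T) *
# sqrt(t). Accelerated tests at elevated temperature shorten the time
# needed to observe measurable fade. All coefficients are assumptions.
import math

R = 8.314        # J/(mol K), gas constant
EA = 50_000.0    # J/mol, assumed activation energy
A = 1.0e8        # pre-exponential fade coefficient (% per sqrt(month)), assumed

def capacity_fade_pct(months, temp_c):
    """Calendar capacity fade (%) after `months` stored at `temp_c`."""
    arrhenius = A * math.exp(-EA / (R * (temp_c + 273.15)))
    return arrhenius * math.sqrt(months)

fade_25 = capacity_fade_pct(12, 25.0)   # nominal condition
fade_45 = capacity_fade_pct(12, 45.0)   # accelerated condition fades faster
```

The verification stage the methodology calls for then checks that the model fitted at accelerated conditions extrapolates back to the normal-operation cells.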

  10. Rural demographic change, rising wages and the restructuring of Chinese agriculture

    DEFF Research Database (Denmark)

    Li, Tianxiang; Yu, Wusheng; Baležentis, Tomas

    2017-01-01

    Purpose The purpose of this paper is to identify the effects of recent demographic transition and rising labor costs on agricultural production structure and pattern in China during 1998-2012. Design/methodology/approach The authors, first, theoretically discuss the effects of changing relative...

  11. Methodology to Estimate the Quantity, Composition, and Management of Construction and Demolition Debris in the United States

    Science.gov (United States)

    This report, Methodology to Estimate the Quantity, Composition and Management of Construction and Demolition Debris in the US, was developed to expand access to data on CDD in the US and to support research on CDD and sustainable materials management. Since past US EPA CDD estima...

  12. Quantitative testing of the methodology for genome size estimation in plants using flow cytometry: a case study of the Primulina genus

    Directory of Open Access Journals (Sweden)

    Jing eWang

    2015-05-01

    Full Text Available Flow cytometry (FCM) is a commonly used method for estimating genome size in many organisms. The use of flow cytometry in plants is influenced by endogenous fluorescence inhibitors, which may cause an inaccurate estimation of genome size, thus falsifying the relationship between genome size and phenotypic traits/ecological performance. Quantitative optimization of FCM methodology minimizes such errors, yet there are few studies detailing this methodology. We selected the genus Primulina, one of the most representative and diverse genera of the Old World Gesneriaceae, to evaluate the effect of methodology on determining genome size. Our results showed that buffer choice significantly affected genome size estimation in six of the eight species examined and altered the 2C-value (DNA content) by as much as 21.4%. The staining duration and propidium iodide (PI) concentration slightly affected the 2C-value. Our experiments showed better histogram quality when the samples were stained for 40 minutes at a PI concentration of 100 µg ml⁻¹. The quality of the estimates was not improved by one-day incubation in the dark at 4 °C or by centrifugation. Thus, our study determined an optimum protocol for genome size measurement in Primulina: LB01 buffer supplemented with 100 µg ml⁻¹ PI and staining for 40 minutes. This protocol also demonstrated high universality in other Gesneriaceae genera. We report the genome size of nine Gesneriaceae species for the first time. The results showed substantial genome size variation both within and among the species, with the 2C-value ranging between 1.62 and 2.71 pg. Our study highlights the necessity of optimizing the FCM methodology prior to obtaining reliable genome size estimates in a given taxon.
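The arithmetic behind FCM genome-size estimates like those above is a simple ratio: the sample's 2C DNA content equals the ratio of sample to internal-standard peak fluorescence multiplied by the standard's known 2C value. The peak values below are made up for illustration.

```python
# Standard FCM genome-size calculation: 2C(sample) =
# (sample peak fluorescence / standard peak fluorescence) * 2C(standard).
# Peak positions come from the PI fluorescence histogram.

def genome_size_2c(sample_peak, standard_peak, standard_2c_pg):
    """2C DNA content (pg) of the sample from relative peak positions."""
    return (sample_peak / standard_peak) * standard_2c_pg

# e.g. an internal standard with a known 2C of 4.50 pg (hypothetical values)
sample_2c = genome_size_2c(sample_peak=120.0, standard_peak=270.0, standard_2c_pg=4.50)
```

Because the estimate is a ratio of peaks, anything that shifts peak position differentially, such as the staining inhibitors the buffer comparison targets, biases the 2C-value directly.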

  13. A methodology to estimate earthquake effects on fractures intersecting canister holes

    Energy Technology Data Exchange (ETDEWEB)

    La Pointe, P.; Wallmann, P.; Thomas, A.; Follin, S. [Golder Associates Inc. (Sweden)]

    1997-03-01

    A literature review and a preliminary numerical modeling study were carried out to develop and demonstrate a method for estimating displacements on fractures near to or intersecting canister emplacement holes. The method can be applied during preliminary evaluation of candidate sites prior to any detailed drilling or underground excavation, utilizing lineament maps and published regression relations between surface rupture trace length and earthquake magnitude, rupture area and displacements. The calculated displacements can be applied to lineament traces which are assumed to be faults and may be the sites for future earthquakes. Next, a discrete fracture model is created for secondary faulting and jointing in the vicinity of the repository. These secondary fractures may displace due to the earthquake on the primary faults. The three-dimensional numerical model assumes linear elasticity and linear elastic fracture mechanics which provides a conservative displacement estimate, while still preserving realistic fracture patterns. Two series of numerical studies were undertaken to demonstrate how the methodology could be implemented and how results could be applied to questions regarding site selection and performance assessment. The first series illustrates how earthquake damage to a hypothetical repository for a specified location (Aespoe) could be estimated. A second series examined the displacements induced by earthquakes varying in magnitude from 6.0 to 8.2 as a function of how close the earthquake was in relation to the repository. 143 refs, 25 figs, 7 tabs.
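The "published regression relations" step described above can be sketched as follows: estimate magnitude from surface rupture trace length, then average displacement from magnitude. The coefficients follow the widely cited Wells and Coppersmith (1994) all-slip-type regressions; treat them as indicative and check the original tables before any real assessment.

```python
# Regression chain from lineament length to displacement, after Wells &
# Coppersmith (1994), all slip types: M = 5.08 + 1.16*log10(SRL[km]) and
# log10(AD[m]) = -4.80 + 0.69*M. Coefficients quoted from memory; verify
# against the published tables before use.
import math

def magnitude_from_rupture_length(srl_km):
    """Moment magnitude from surface rupture length (km)."""
    return 5.08 + 1.16 * math.log10(srl_km)

def avg_displacement_m(magnitude):
    """Average coseismic displacement (m) from magnitude."""
    return 10.0 ** (-4.80 + 0.69 * magnitude)

m = magnitude_from_rupture_length(40.0)   # ~M 6.9 for a 40 km lineament
d = avg_displacement_m(m)                  # on the order of a metre
```

In the methodology above, displacements of this kind are imposed on the mapped primary faults, and the fracture-mechanics model then distributes induced slip onto the secondary fractures near canister holes.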

  14. A methodology to estimate earthquake effects on fractures intersecting canister holes

    International Nuclear Information System (INIS)

    La Pointe, P.; Wallmann, P.; Thomas, A.; Follin, S.

    1997-03-01

    A literature review and a preliminary numerical modeling study were carried out to develop and demonstrate a method for estimating displacements on fractures near to or intersecting canister emplacement holes. The method can be applied during preliminary evaluation of candidate sites prior to any detailed drilling or underground excavation, utilizing lineament maps and published regression relations between surface rupture trace length and earthquake magnitude, rupture area and displacements. The calculated displacements can be applied to lineament traces which are assumed to be faults and may be the sites for future earthquakes. Next, a discrete fracture model is created for secondary faulting and jointing in the vicinity of the repository. These secondary fractures may displace due to the earthquake on the primary faults. The three-dimensional numerical model assumes linear elasticity and linear elastic fracture mechanics which provides a conservative displacement estimate, while still preserving realistic fracture patterns. Two series of numerical studies were undertaken to demonstrate how the methodology could be implemented and how results could be applied to questions regarding site selection and performance assessment. The first series illustrates how earthquake damage to a hypothetical repository for a specified location (Aespoe) could be estimated. A second series examined the displacements induced by earthquakes varying in magnitude from 6.0 to 8.2 as a function of how close the earthquake was in relation to the repository. 143 refs, 25 figs, 7 tabs

  15. The method of selecting an integrated development territory for the high-rise unique constructions

    Science.gov (United States)

    Sheina, Svetlana; Shevtsova, Elina; Sukhinin, Alexander; Priss, Elena

    2018-03-01

    On the basis of data provided by the Department of Architecture and Urban Planning of the city of Rostov-on-Don, the problem of selecting a territory for integrated development best suited to the construction of high-rise and unique buildings is solved. The objective of the study was to develop a methodology for selecting such an area and to apply the proposed method to the evaluation of four integrated-development territories. Along with standard indicators of integrated evaluation, the developed method considers additional indicators that assess a territory from the standpoint of unique high-rise construction. The final result of the study is a ranking of the territories by functional priority that takes into account the construction of both residential and public-business objects of unique high-rise construction. The use of the developed methodology will allow investors and customers to assess the investment attractiveness of a future unique construction project on the proposed site.

  16. Modular High Voltage Pulse Converter for Short Rise and Decay Times

    NARCIS (Netherlands)

    Mao, S.

    2018-01-01

    This thesis explores a modular HV pulse converter technology with short rise and decay times. A systematic methodology to derive and classify HV architectures based on a modularization level of power building blocks of the HV pulse converter is developed to summarize existing architectures and

  17. Combined methodology for estimating dose rates and health effects from exposure to radioactive pollutants

    Energy Technology Data Exchange (ETDEWEB)

    Dunning, D.E. Jr.; Leggett, R.W.; Yalcintas, M.G.

    1980-12-01

    The work described in the report is basically a synthesis of two previously existing computer codes: INREM II, developed at the Oak Ridge National Laboratory (ORNL); and CAIRD, developed by the Environmental Protection Agency (EPA). The INREM II code uses contemporary dosimetric methods to estimate doses to specified reference organs due to inhalation or ingestion of a radionuclide. The CAIRD code employs actuarial life tables to account for competing risks in estimating numbers of health effects resulting from exposure of a cohort to some incremental risk. The combined computer code, referred to as RADRISK, estimates numbers of health effects in a hypothetical cohort of 100,000 persons due to continuous lifetime inhalation or ingestion of a radionuclide. Also briefly discussed in this report is a method of estimating numbers of health effects in a hypothetical cohort due to continuous lifetime exposure to external radiation. This method employs the CAIRD methodology together with dose conversion factors generated by the computer code DOSFACTER, developed at ORNL; these dose conversion factors are used to estimate dose rates to persons due to radionuclides in the air or on the ground surface. The combined life-table and dosimetric methodology supports the development of guidelines for the release of radioactive pollutants to the atmosphere, as required by the Clean Air Act Amendments of 1977.
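The actuarial life-table idea behind CAIRD can be sketched in a few lines: follow a closed cohort through yearly age intervals twice, once with baseline mortality only and once with an added risk increment, and take the difference in deaths. The function and all rates below are illustrative assumptions, not the RADRISK implementation:

```python
def excess_deaths(baseline_q, excess_q, cohort=100_000):
    """Expected extra deaths in a closed cohort when a per-interval
    incremental risk is added to baseline mortality probabilities.
    baseline_q[i], excess_q[i]: probabilities of dying in interval i.
    All numbers are illustrative, not actual radiological risk data."""
    def deaths(qs):
        alive, total = float(cohort), 0.0
        for q in qs:
            d = alive * q          # deaths in this interval
            total += d
            alive -= d             # survivors carry into the next interval
        return total

    base = deaths(baseline_q)
    combined = deaths([min(b + e, 1.0) for b, e in zip(baseline_q, excess_q)])
    return combined - base
```

Because the cohort shrinks as deaths accumulate, the same increment late in life produces fewer excess deaths than early in life, which is exactly the competing-risks effect the life-table bookkeeping captures.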

  18. Deterministic sensitivity and uncertainty methodology for best estimate system codes applied in nuclear technology

    International Nuclear Information System (INIS)

    Petruzzi, A.; D'Auria, F.; Cacuci, D.G.

    2009-01-01

    Nuclear Power Plant (NPP) technology has been developed based on the traditional defense in depth philosophy supported by deterministic and overly conservative methods for safety analysis. In the 1970s [1], conservative hypotheses were introduced for safety analyses to address existing uncertainties. Since then, intensive thermal-hydraulic experimental research has resulted in a considerable increase in knowledge and consequently in the development of best-estimate codes able to provide more realistic information about the physical behaviour and to identify the most relevant safety issues, allowing the evaluation of the actual margins between the results of the calculations and the acceptance criteria. However, the best-estimate calculation results from complex thermal-hydraulic system codes (like Relap5, Cathare, Athlet, Trace, etc.) are affected by unavoidable approximations that are unpredictable without the use of computational tools that account for the various sources of uncertainty. Therefore the use of best-estimate (BE) codes within reactor technology, either for design or safety purposes, implies understanding and accepting the limitations and deficiencies of those codes. Taking into consideration the above framework, a comprehensive approach for utilizing quantified uncertainties arising from Integral Test Facilities (ITFs, [2]) and Separate Effect Test Facilities (SETFs, [3]) in the process of calibrating complex computer models for the application to NPP transient scenarios has been developed. The methodology proposed is capable of accommodating multiple SETFs and ITFs to learn as much as possible about uncertain parameters, allowing for the improvement of the computer model predictions based on the available experimental evidence. The proposed methodology constitutes a major step forward with respect to the generally used expert judgment and statistical methods as it permits a) to establish the uncertainties of any parameter

  19. A Hierarchical Clustering Methodology for the Estimation of Toxicity

    Science.gov (United States)

    A Quantitative Structure Activity Relationship (QSAR) methodology based on hierarchical clustering was developed to predict toxicological endpoints. This methodology utilizes Ward's method to divide a training set into a series of structurally similar clusters. The structural sim...

  20. Committed sea-level rise under the Paris Agreement and the legacy of delayed mitigation action.

    Science.gov (United States)

    Mengel, Matthias; Nauels, Alexander; Rogelj, Joeri; Schleussner, Carl-Friedrich

    2018-02-20

    Sea-level rise is a major consequence of climate change that will continue long after emissions of greenhouse gases have stopped. The 2015 Paris Agreement aims at reducing climate-related risks by reducing greenhouse gas emissions to net zero and limiting global-mean temperature increase. Here we quantify the effect of these constraints on global sea-level rise until 2300, including Antarctic ice-sheet instabilities. We estimate median sea-level rise between 0.7 and 1.2 m, if net-zero greenhouse gas emissions are sustained until 2300, varying with the pathway of emissions during this century. Temperature stabilization below 2 °C is insufficient to hold median sea-level rise until 2300 below 1.5 m. We find that each 5-year delay in near-term peaking of CO2 emissions increases median year 2300 sea-level rise estimates by ca. 0.2 m, and extreme sea-level rise estimates at the 95th percentile by up to 1 m. Our results underline the importance of near-term mitigation action for limiting long-term sea-level rise risks.

  1. Trip Energy Estimation Methodology and Model Based on Real-World Driving Data for Green Routing Applications: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Holden, Jacob [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Van Til, Harrison J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Wood, Eric W [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Gonder, Jeffrey D [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Zhu, Lei [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2018-02-09

    A data-informed model to predict energy use for a proposed vehicle trip has been developed in this paper. The methodology leverages nearly 1 million miles of real-world driving data to generate the estimation model. Driving is categorized at the sub-trip level by average speed, road gradient, and road network geometry, then aggregated by category. An average energy consumption rate is determined for each category, creating an energy rates look-up table. Proposed vehicle trips are then categorized in the same manner, and estimated energy rates are appended from the look-up table. The methodology is robust and applicable to almost any type of driving data. The model has been trained on vehicle global positioning system data from the Transportation Secure Data Center at the National Renewable Energy Laboratory and validated against on-road fuel consumption data from testing in Phoenix, Arizona. The estimation model has demonstrated an error range of 8.6% to 13.8%. The model results can be used to inform control strategies in routing tools, such as change in departure time, alternate routing, and alternate destinations to reduce energy consumption. This work provides a highly extensible framework that allows the model to be tuned to a specific driver or vehicle type.
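The look-up-table approach described above can be sketched as follows: bin historical driving segments by speed and road grade, average the energy rate per bin, then look up a proposed trip's segments in the same bins. The binning scheme, field layout and function names are assumptions for illustration, not NREL's actual categorization:

```python
from collections import defaultdict

def build_rate_table(segments):
    """Average energy rate (kWh/mi) per (speed_bin, grade_bin) category.
    segments: iterable of (avg_speed_mph, road_grade, miles, kwh).
    10-mph speed bins and 0.1 grade bins are illustrative choices."""
    sums = defaultdict(lambda: [0.0, 0.0])   # key -> [total kWh, total miles]
    for speed, grade, miles, kwh in segments:
        key = (int(speed // 10), round(grade, 1))
        sums[key][0] += kwh
        sums[key][1] += miles
    return {k: e / m for k, (e, m) in sums.items() if m > 0}

def estimate_trip(rates, trip):
    """Sum estimated energy over a proposed trip's categorized segments.
    Assumes every trip category was observed in the training data."""
    return sum(rates[(int(speed // 10), round(grade, 1))] * miles
               for speed, grade, miles in trip)
```

For example, a trip segment driven at 27 mph on flat road reuses the averaged rate of all training segments in the 20-30 mph, zero-grade bin.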

  2. Prototype application of best estimate and uncertainty safety analysis methodology to large LOCA analysis

    International Nuclear Information System (INIS)

    Luxat, J.C.; Huget, R.G.

    2001-01-01

    Development of a methodology to perform best-estimate and uncertainty nuclear safety analysis has been underway at Ontario Power Generation for the past two and one-half years. A key driver for the methodology development, and one of the major challenges faced, is the need to re-establish demonstrated safety margins that have progressively been undermined through excessive and compounding conservatism in deterministic analyses. The major focus of the prototyping applications was to quantify the safety margins that exist at the probable range of high-power operating conditions, rather than the highly improbable operating states associated with Limit of the Envelope (LOE) assumptions, in which all parameters of significance to the consequences of a postulated accident are assumed to deviate simultaneously to their limiting values. Another equally important objective of the prototyping was to demonstrate the feasibility of conducting safety analysis as an incremental analysis activity, as opposed to a major re-analysis activity. The prototype analysis solely employed prior analyses of Bruce B large break LOCA events; no new computer simulations were undertaken. This is a significant and novel feature of the prototyping work. This methodology framework has been applied to a postulated large break LOCA in a Bruce generating unit on a prototype basis. This paper presents results of the application. (author)

  3. Regional Shelter Analysis Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Dillon, Michael B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dennison, Deborah [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kane, Jave [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Walker, Hoyt [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Miller, Paul [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-01

    The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
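The core combination step, weighting outdoor exposure by where people are and how much protection each location provides, can be sketched as follows. The interface and protection-factor values are illustrative assumptions, not the LLNL implementation:

```python
def expected_dose(outdoor_dose, occupancy):
    """Population-weighted dose given the fraction of people in locations
    with different fallout protection factors (PF).
    occupancy: list of (fraction_of_population, protection_factor) pairs;
    a PF of 10 means the location reduces outdoor dose tenfold.
    Values here are illustrative; real PFs depend on building type,
    location within the building, and population posture."""
    assert abs(sum(f for f, _ in occupancy) - 1.0) < 1e-9, "fractions must sum to 1"
    return sum(f * outdoor_dose / pf for f, pf in occupancy)
```

Comparing this expectation across scenarios (e.g. unwarned daytime occupancy vs. a sheltered posture) is one way the protection estimates support evaluating casualty mitigation strategies.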

  4. The Rise in Co-authorship in the Social Sciences (1980-2013)

    DEFF Research Database (Denmark)

    Henriksen, Dorte

    2015-01-01

    This paper examines the rise in co-authorship in the Social Sciences over a 33-year period. We investigate the development in co-authorship in different research areas and discuss how the methodological differences in these research areas and changes in academia affect the tendency to co-author articles. The study is based on bibliographic data about 4.5 million peer-review articles published in the period 1980-2013 and indexed in the 56 subject categories of the Web of Science's (WoS) Social Science Citation Index (SSCI). Results show that in the majority of the subject categories we can... data set, statistical methods and/or team-production models.

  5. Estimating the Greenland ice sheet surface mass balance contribution to future sea level rise using the regional atmospheric climate model MAR

    Directory of Open Access Journals (Sweden)

    X. Fettweis

    2013-03-01

    To estimate the sea level rise (SLR) originating from changes in surface mass balance (SMB) of the Greenland ice sheet (GrIS), we present 21st century climate projections obtained with the regional climate model MAR (Modèle Atmosphérique Régional), forced by output of three CMIP5 (Coupled Model Intercomparison Project Phase 5) general circulation models (GCMs). Our results indicate that in a warmer climate, mass gain from increased winter snowfall over the GrIS does not compensate mass loss through increased meltwater run-off in summer. Despite the large spread in the projected near-surface warming, all the MAR projections show a similar non-linear increase of GrIS surface melt volume because no change is projected in the general atmospheric circulation over Greenland. By coarsely estimating the GrIS SMB changes from GCM output, we show that the uncertainty from the GCM-based forcing represents about half of the projected SMB changes. In 2100, the CMIP5 ensemble mean projects a GrIS SMB decrease equivalent to a mean SLR of +4 ± 2 cm and +9 ± 4 cm for the RCP (Representative Concentration Pathways) 4.5 and RCP 8.5 scenarios respectively. These estimates do not consider the positive melt-elevation feedback, although sensitivity experiments using perturbed ice sheet topographies consistent with the projected SMB changes demonstrate that this is a significant feedback, and highlight the importance of coupling regional climate models to an ice sheet model. Such a coupling will allow the assessment of the future response of both surface processes and ice-dynamic changes to rising temperatures, as well as their mutual feedbacks.

  6. Methodologies for estimating air emissions from three non-traditional source categories: Oil spills, petroleum vessel loading and unloading, and cooling towers. Final report, October 1991-March 1993

    International Nuclear Information System (INIS)

    Ramadan, W.; Sleva, S.; Dufner, K.; Snow, S.; Kersteter, S.L.

    1993-04-01

    The report discusses part of EPA's program to identify and characterize emissions sources not currently accounted for by either the existing Aerometric Information Retrieval System (AIRS) or State Implementation Plan (SIP) area source methodologies, and to develop appropriate emissions estimation methodologies and emission factors for a group of these source categories. Based on the results of the identification and characterization portions of this research, three source categories were selected for methodology and emission factor development: oil spills, petroleum vessel loading and unloading, and cooling towers. The report describes the category selection process and presents emissions estimation methodologies and emission factor data for the selected source categories. The discussions for each category include general background information, emissions generation activities, pollutants emitted, sources of activity and pollutant data, emissions estimation methodologies and data issues. The information used in these discussions was derived from various sources including available literature, industrial and trade association publications and contacts, experts on the category and activity, and knowledgeable federal and state personnel.

  7. Best-estimate methodology for analysis of anticipated transients without scram in pressurized water reactors

    International Nuclear Information System (INIS)

    Rebollo, L.

    1993-01-01

    Union Fenosa, a utility company in Spain, has performed research on pressurized water reactor (PWR) safety with respect to the development of a best-estimate methodology for the analysis of anticipated transients without scram (ATWS), i.e., those anticipated transients for which failure of the reactor protection system is postulated. A scientific and technical approach is adopted with respect to the ATWS phenomenon as it affects a PWR, specifically the Zorita nuclear power plant, a single-loop Westinghouse-designed PWR in Spain. In this respect, an ATWS sequence analysis methodology based on published codes that is generically applicable to any PWR is proposed, which covers all the anticipated phenomena and defines the applicable acceptance criteria. The areas contemplated are cell neutron analysis, core thermal hydraulics, and plant dynamics, which are developed, qualified, and validated by comparison with reference calculations and measurements obtained from integral or separate-effects tests.

  8. Estimating the Entropy of Binary Time Series: Methodology, Some Theory and a Simulation Study

    Directory of Open Access Journals (Sweden)

    Elie Bienenstock

    2008-06-01

    Partly motivated by entropy-estimation problems in neuroscience, we present a detailed and extensive comparison between some of the most popular and effective entropy estimation methods used in practice: the plug-in method, four different estimators based on the Lempel-Ziv (LZ) family of data compression algorithms, an estimator based on the Context-Tree Weighting (CTW) method, and the renewal entropy estimator. METHODOLOGY: Three new entropy estimators are introduced: two new LZ-based estimators, and the "renewal entropy estimator," which is tailored to data generated by a binary renewal process. For two of the four LZ-based estimators, a bootstrap procedure is described for evaluating their standard error, and a practical rule of thumb is heuristically derived for selecting the values of their parameters in practice. THEORY: We prove that, unlike their earlier versions, the two new LZ-based estimators are universally consistent, that is, they converge to the entropy rate for every finite-valued, stationary and ergodic process. An effective method is derived for the accurate approximation of the entropy rate of a finite-state hidden Markov model (HMM) with known distribution. Heuristic calculations are presented and approximate formulas are derived for evaluating the bias and the standard error of each estimator. SIMULATION: All estimators are applied to a wide range of data generated by numerous different processes with varying degrees of dependence and memory. The main conclusions drawn from these experiments include: (i) For all estimators considered, the main source of error is the bias. (ii) The CTW method is repeatedly and consistently seen to provide the most accurate results. (iii) The performance of the LZ-based estimators is often comparable to that of the plug-in method. (iv) The main drawback of the plug-in method is its computational inefficiency; with small word-lengths it fails to detect longer-range structure in
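As a concrete reference point, the plug-in method mentioned above can be sketched in a few lines: estimate the empirical distribution of overlapping length-k words and normalize the block entropy by k. This is the generic textbook version, not the paper's exact implementation:

```python
from collections import Counter
from math import log2

def plugin_entropy_rate(bits, k):
    """Plug-in estimate of the entropy rate (bits/symbol) of a binary
    series, from the empirical distribution of overlapping k-words."""
    words = [tuple(bits[i:i + k]) for i in range(len(bits) - k + 1)]
    n = len(words)
    counts = Counter(words)
    # Block entropy of the empirical k-word distribution, divided by k.
    h_k = -sum((c / n) * log2(c / n) for c in counts.values())
    return h_k / k
```

The computational drawback noted in the abstract is visible here: detecting longer-range structure requires large k, but the number of possible k-words, and hence the sample size needed for a reliable empirical distribution, grows exponentially in k.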

  9. REGULARITIES OF THE INFLUENCE OF ORGANIZATIONAL AND TECHNOLOGICAL FACTORS ON THE DURATION OF CONSTRUCTION OF HIGH-RISE MULTIFUNCTIONAL COMPLEXES

    Directory of Open Access Journals (Sweden)

    ZAIATS Yi. I.

    2015-10-01

    Problem statement. The technical and economic indicators of projects for the construction of high-rise multifunctional complexes, namely the duration of construction works and the cost of building products, depend on the construction technology and the method of construction organization, whose choice is in turn influenced by the architectural, design, structural and engineering decisions made. Purpose. To reveal the regularities of the influence of organizational and technological factors on the duration of construction of high-rise multifunctional complexes under conditions of dense city building. Conclusion. The revealed regularities of the influence of organizational and technological factors (height; complexity of the design and estimate documentation; complexity of the construction works; complexity of managing the investment and construction project; economy factor; comfort factor; technology of the designed solutions) on the duration of construction of high-rise multifunctional complexes (depending on their height: from 73.5 m to 100 m inclusive, and from 100 m to 200 m inclusive) allow their influence to be assessed quantitatively, and can be used in developing a methodology for substantiating the expediency and effectiveness of high-rise construction projects in compacted urban development, based on consideration of organizational and technological aspects.

  10. Drought risk assessment under climate change is sensitive to methodological choices for the estimation of evaporative demand.

    Science.gov (United States)

    Dewes, Candida F; Rangwala, Imtiaz; Barsugli, Joseph J; Hobbins, Michael T; Kumar, Sanjiv

    2017-01-01

    Several studies have projected increases in drought severity, extent and duration in many parts of the world under climate change. We examine sources of uncertainty arising from the methodological choices for the assessment of future drought risk in the continental US (CONUS). One such uncertainty is in the climate models' expression of evaporative demand (E0), which is not a direct climate model output but has been traditionally estimated using several different formulations. Here we analyze daily output from two CMIP5 GCMs to evaluate how differences in E0 formulation, treatment of meteorological driving data, choice of GCM, and standardization of time series influence the estimation of E0. These methodological choices yield different assessments of spatio-temporal variability in E0 and different trends in 21st century drought risk. First, we estimate E0 using three widely used E0 formulations: Penman-Monteith; Hargreaves-Samani; and Priestley-Taylor. Our analysis, which primarily focuses on the May-September warm-season period, shows that E0 climatology and its spatial pattern differ substantially between these three formulations. Overall, we find higher magnitudes of E0 and its interannual variability using Penman-Monteith, in particular for regions like the Great Plains and southwestern US where E0 is strongly influenced by variations in wind and relative humidity. When examining projected changes in E0 during the 21st century, there are also large differences among the three formulations, particularly the Penman-Monteith relative to the other two formulations. The 21st century E0 trends, particularly in percent change and standardized anomalies of E0, are found to be sensitive to the long-term mean value and the amplitude of interannual variability, i.e. if the magnitude of E0 and its interannual variability are relatively low for a particular E0 formulation, then the normalized or standardized 21st century trend based on that formulation is amplified
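Of the three formulations compared, Hargreaves-Samani is the simplest, driven only by temperature and extraterrestrial radiation, which is one reason the formulations diverge in regions where wind and humidity matter. A minimal sketch using the standard published coefficient; the study's actual treatment of driving data may differ:

```python
from math import sqrt

def hargreaves_samani_e0(t_mean, t_max, t_min, ra):
    """Hargreaves-Samani reference evaporative demand (mm/day).
    t_mean, t_max, t_min: air temperatures in degrees C.
    ra: extraterrestrial radiation expressed as mm/day of evaporation
    equivalent. 0.0023 is the standard published coefficient."""
    return 0.0023 * ra * (t_mean + 17.8) * sqrt(t_max - t_min)
```

Because the only inputs are temperature and top-of-atmosphere radiation, trends in wind or relative humidity, which strongly influence Penman-Monteith E0 in the Great Plains and southwestern US, cannot show up in this formulation at all.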

  11. Methodology for estimating realistic responses of buildings and components under earthquake motion and its application

    International Nuclear Information System (INIS)

    Ebisawa, Katsumi; Abe, Kiyoharu; Kohno, Kunihiko; Nakamura, Hidetaka; Itoh, Mamoru.

    1996-11-01

    Failure probabilities of buildings and components under earthquake motion are estimated as conditional probabilities that their realistic responses exceed their capacities. Two methods for estimating these failure probabilities have already been developed. One is a detailed method developed in the Seismic Safety Margins Research Program of Lawrence Livermore National Laboratory in the U.S.A., called the 'SSMRP method'. The other is a simplified method proposed by Kennedy et al., called the 'Zion method' or sometimes the 'response factor method'. The authors adopted the response factor method. In order to enhance the accuracy of the estimated failure probabilities, however, a new methodology for improving the response factor method was proposed. Based on the improved method, response factors of buildings and components designed to the Japanese seismic design standard were estimated, and their realistic responses were also calculated. Using these realistic responses and capacities, the failure probabilities of a reactor building and relays were estimated. In order to identify the differences between the new method, the SSMRP method and the original response factor method, the failure probabilities estimated by these three methods were compared. A method similar to SSMRP was used instead of the original SSMRP to save time and labor. Viewpoints for selecting among the methods for estimating failure probabilities of buildings and components were also proposed. (author). 55 refs
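In response-factor-style approaches, the conditional failure probability P(response > capacity) is commonly evaluated under the simplification that realistic response and capacity are both lognormally distributed. A minimal sketch of that closed form, with illustrative parameters rather than the paper's actual response factors:

```python
from math import log, sqrt
from statistics import NormalDist

def failure_probability(median_response, median_capacity, beta_r, beta_c):
    """P(response > capacity) when response and capacity are independent
    lognormal variables with the given medians and log-standard
    deviations (beta_r, beta_c). A common simplification in seismic
    fragility work, shown here as a sketch, not the paper's method."""
    # ln(R/C) is normal with mean ln(mR/mC) and variance beta_r^2 + beta_c^2.
    z = log(median_response / median_capacity) / sqrt(beta_r**2 + beta_c**2)
    return NormalDist().cdf(z)
```

When the median response equals the median capacity the result is 0.5 regardless of the betas; widening either uncertainty flattens the fragility curve around that point.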

  12. Large Volcanic Rises on Venus

    Science.gov (United States)

    Smrekar, Suzanne E.; Kiefer, Walter S.; Stofan, Ellen R.

    1997-01-01

    Large volcanic rises on Venus have been interpreted as hotspots, or the surface manifestation of mantle upwelling, on the basis of their broad topographic rises, abundant volcanism, and large positive gravity anomalies. Hotspots offer an important opportunity to study the behavior of the lithosphere in response to mantle forces. In addition to the four previously known hotspots, Atla, Bell, Beta, and western Eistla Regiones, five new probable hotspots, Dione, central Eistla, eastern Eistla, Imdr, and Themis, have been identified in the Magellan radar, gravity and topography data. These nine regions exhibit a wider range of volcano-tectonic characteristics than previously recognized for venusian hotspots, and have been classified as rift-dominated (Atla, Beta), coronae-dominated (central and eastern Eistla, Themis), or volcano-dominated (Bell, Dione, western Eistla, Imdr). The apparent depths of compensation for these regions range from 65 to 260 km. New estimates of the elastic thickness, using the degree and order 90 spherical harmonic field, are 15-40 km at Bell Regio, and 25 km at western Eistla Regio. Phillips et al. find a value of 30 km at Atla Regio. Numerous models of lithospheric and mantle behavior have been proposed to interpret the gravity and topography signature of the hotspots, with most studies focusing on Atla or Beta Regiones. Convective models with Earth-like parameters result in estimates of the thickness of the thermal lithosphere of approximately 100 km. Models of stagnant lid convection or thermal thinning infer the thickness of the thermal lithosphere to be 300 km or more. Without additional constraints, any of the model fits are equally valid. The thinner thermal lithosphere estimates are most consistent with the volcanic and tectonic characteristics of the hotspots. Estimates of the thermal gradient based on estimates of the elastic thickness also support a relatively thin lithosphere (Phillips et al.). The advantage of larger estimates of

  13. Spatial Development Modeling Methodology Application Possibilities in Vilnius

    Directory of Open Access Journals (Sweden)

    Lina Panavaitė

    2017-05-01

    In order to control the continued development of high-rise buildings and their irreversible visual impact on the overall silhouette of the city, the great cities of the world have introduced new methodological principles into their spatial development models. These methodologies and spatial planning guidelines focus not only on the controlled development of high-rise buildings, but on the spatial modelling of the whole city, defining main development criteria and estimating possible consequences. Vilnius is no exception; however, the re-establishment of Lithuania's independence triggered an uncontrolled urbanization process, so most of the city's development regulations emerged as a consequence of legalizing investors' expectations through unmanaged processes. The importance of a consistent urban fabric, and of conserving and representing the city's most important objects, gained attention only when an actual threat emerged of overshadowing them with new architecture, together with unmanaged urbanization in the city center and land-use-driven urban sprawl in the suburbs. Current Vilnius spatial planning documents clearly define the urban structure and key development principles, but the definitions are relatively abstract, leading to uniform building coverage requirements for territories with distinct qualities and to simplified planar designs that do not meet quality standards. The overall quality of urban architecture is not regulated. The article deals with current spatial modeling methods, their individual parts, principles, the criteria for quality assessment, and their applicability in Vilnius. The text contains an outline of possible building coverage regulations and impact assessment criteria for new development, and a compendium of requirements for high-quality spatial planning and building design.

  14. Appearance Principles of High Rise Buildings in the City Center: Visual Efect to Historical Heritage, Regulation Proposals

    Directory of Open Access Journals (Sweden)

    Lina Panavaitė

    2016-04-01

    High-rise buildings are a phenomenon of the 21st century and an expression of a city's economic and political power, reflecting a contemporary, modern and attractive city. Very often high-rise buildings, which are characterized by a unique morphology and by parameters of height, density and intensity, are built near historic center areas and cause an irreversible visual impact on historic sites, fundamentally altering the silhouette of the city. As a result, new problems and challenges appear. This article analyses the evolution of high-rise buildings using the examples of London, Jerusalem, Ottawa and Vilnius, and discusses the latest methodological principles applied to control the development of high-rise buildings in the central parts of a city while ensuring the preservation and representation of cultural heritage. The latest computer technologies applied in urban regulations are presented. For Lithuania, high-rise building spatial development, general spatial planning documents, urban design concepts, and the monitoring of virtual city panoramas are reviewed. A comparative analysis is carried out to identify the essential methodological differences between the cities' regulation systems.

  15. Estimation of retired mobile phones generation in China: A comparative study on methodology.

    Science.gov (United States)

    Li, Bo; Yang, Jianxin; Lu, Bin; Song, Xiaolong

    2015-01-01

    Due to the rapid development of its economy and technology, China has the world's largest production and possession of mobile phones. In general, mobile phones have relatively short lifetimes because the majority of users replace their mobile phones frequently. Retired mobile phones represent the most valuable electrical and electronic equipment (EEE) in the main waste stream because of such characteristics as large quantity, high reuse/recovery value and fast replacement frequency. Consequently, the huge amount of retired mobile phones in China calls for a sustainable management system. Generation estimation can provide fundamental information for constructing a sustainable management system for retired mobile phones and other waste electrical and electronic equipment (WEEE). However, reliable estimation results are difficult to obtain and verify. The primary aim of this paper is to identify a proper approach for estimating the generation of retired mobile phones in China by comparing several relevant methods. The results show that the sales&new method has the highest priority for estimating retired mobile phones. By this method, 47.92 million mobile phones were retired in 2002, rising to 739.98 million in China in 2012, a clearly increasing tendency with some fluctuations. Furthermore, some methodological issues, such as the selection of an improper approach and errors in the input data, are also discussed in order to improve the generation estimation of retired mobile phones and other WEEE. Copyright © 2014 Elsevier Ltd. All rights reserved.
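A common way to implement a sales-based generation estimate is to convolve historical sales with a discrete lifespan distribution. The sketch below illustrates that generic formulation; the paper's exact 'sales&new' variant may differ in how it handles the lifespan data:

```python
def retired_units(sales, lifespan_pmf):
    """Units retired per year, from historical sales and a discrete
    lifespan distribution (a generic 'sales and lifespan' sketch).
    sales[i]: units sold in year i.
    lifespan_pmf[k]: probability a unit is retired k years after sale."""
    horizon = len(sales) + len(lifespan_pmf) - 1
    retired = [0.0] * horizon
    for i, s in enumerate(sales):
        for k, p in enumerate(lifespan_pmf):
            retired[i + k] += s * p   # cohort sold in year i, retired in year i+k
    return retired
```

Every unit sold is retired exactly once (the pmf sums to 1), so the estimate conserves totals; errors enter through the assumed lifespan distribution and the sales input data, which is exactly the sensitivity the paper discusses.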

  16. Demonstration of an infiltration evaluation methodology

    International Nuclear Information System (INIS)

    Smyth, J.D.; Gee, G.W.; Kincaid, C.T.; Nichols, W.M.; Bresler, E.

    1990-07-01

    An Infiltration Evaluation Methodology (IEM) was developed for the US Nuclear Regulatory Commission (NRC) by Pacific Northwest Laboratory (PNL) to provide a consistent, well formulated approach for evaluating drainage through engineered covers at low-level radioactive waste (LLW) sites. The methodology is designed to help evaluate the ability of proposed waste site covers to minimize drainage for LLW site license applications and for sites associated with the Uranium Mill Tailings Remedial Action (UMTRA) program. The objective of this methodology is to estimate the drainage through an engineered burial site cover system. The drainage estimate can be used as an input to a broader performance assessment methodology currently under development by the NRC. The methodology is designed to simulate, at the field scale, significant factors and hydrologic conditions which determine or influence estimates of infiltration, long-term moisture content profiles, and drainage from engineered covers and barriers. The IEM developed under this study acknowledges the uncertainty inherent in soil properties and quantifies the influence of such uncertainty on the estimates of drainage in engineered cover systems at waste disposal sites. 6 refs., 1 fig

  17. Trend analysis of modern high-rise construction

    Science.gov (United States)

    Radushinsky, Dmitry; Gubankov, Andrey; Mottaeva, Asiiat

    2018-03-01

    The article reviews the main trends of modern high-rise construction and considers a number of architectural, engineering, technological, economic and image factors that have driven the intensification of high-rise construction in the 21st century. The key factors of modern high-rise construction are identified: an image component attractive to businessmen and politicians, the ability to embody current views on architecture and innovations in construction technologies, lobbying by the relevant structures, and the opportunity to serve as an effective driver for the development of a range of national economic sectors with a multiplier effect. An assessment is given of the priority currently accorded to foreign architectural bureaus in the design of super-high buildings in Russia. The economic expediency of constructing high-rise buildings, including those with only a residential function, is investigated. The article also examines the role of skyscraper construction as an image component of cities in the marketing of places and territories, the connection between the presence of a high-rise center (a City) and the possibility of attracting a "creative class", and the creation of a large working space for specialists on the basis of the territorial proximity and density of high-rise buildings.

  18. The method of selecting an integrated development territory for the high-rise unique constructions

    Directory of Open Access Journals (Sweden)

    Sheina Svetlana

    2018-01-01

    Full Text Available On the basis of data provided by the Department of Architecture and Urban Planning of the city of Rostov-on-Don, the problem of choosing a territory for integrated development that is to be given priority for the construction of high-rise and unique buildings is solved. The objective of the study was to develop a methodology for selecting such an area and to apply the proposed method to the evaluation of four integrated development territories. Along with the standard indicators of integrated evaluation, the developed method considers additional indicators that assess a territory from the standpoint of unique high-rise construction. The final result of the study is a ranking of the areas by functional priority that takes into account the construction of both residential and public-business objects of unique high-rise construction. The use of the developed methodology will allow investors and customers to assess the investment attractiveness of a future unique construction project on a proposed site.

  19. Methods of erection of high-rise buildings

    Science.gov (United States)

    Cherednichenko, Nadezhda; Oleinik, Pavel

    2018-03-01

    The article describes the factors determining the choice of methods for organizing the construction process and carrying out construction and installation work in the erection of high-rise buildings. Specific features of their underground parts are indicated, characterized by massive slab-pile foundations, large volumes of earthworks, and reinforced bases and foundations for assembly cranes. The work cycle is considered for reinforced concrete, steel and combined skeletons of high-rise buildings, and the areas of application of flow, separate and complex methods are disclosed. The main conditions for the erection of high-rise buildings and their components are singled out: the choice of formwork systems, delivery and lifting of concrete mixes, installation of reinforcement, and the selection of lifting, transporting and auxiliary equipment. The article identifies reserves for reducing the duration of construction through the creation of: complex mechanized technologies for the efficient construction of foundations in various soil conditions, including heaving, swelling, constrained, subsiding, bulk and water-saturated soils; complex mechanized technologies for the erection of monolithic reinforced concrete structures, taking into account winter production conditions and the use of mobile concrete-laying complexes and new-generation machines; modular formwork systems distinguished by versatility, lightness and simplicity of operation, suitable for complex high-rise construction; and a more refined methodology together with a set of progressive organizational and technological solutions that ensure a rational relationship between production processes and their maximum overlap in time and space.

  20. ISSUES ON USING PRICE INDICES FOR ESTIMATING GDP AND ITS COMPONENTS AT CONSTANT PRICES ACCORDING TO SNA METHODOLOGY

    Directory of Open Access Journals (Sweden)

    K. Prykhodko

    2014-06-01

    Full Text Available The article examines the requirements and methodological approaches for calculating price indices (deflators) in the national accounts. It estimates the level and dynamics of price indicators and proposes ways of improving the calculation of price indices (deflators) in the national accounts of Ukraine.

  1. Probabilistic methodology for turbine missile risk analysis

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.; Frank, R.A.

    1984-01-01

    A methodology has been developed for estimation of the probabilities of turbine-generated missile damage to nuclear power plant structures and systems. Mathematical models of the missile generation, transport, and impact events have been developed and sequenced to form an integrated turbine missile simulation methodology. Probabilistic Monte Carlo techniques are used to estimate the plant impact and damage probabilities. The methodology has been coded in the TURMIS computer code to facilitate numerical analysis and plant-specific turbine missile probability assessments. Sensitivity analyses have been performed on both the individual models and the integrated methodology, and probabilities have been estimated for a hypothetical nuclear power plant case study. (orig.)
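As a rough illustration of how Monte Carlo sequencing of the generation, transport, and impact events yields a damage probability estimate (all probabilities below are made-up placeholders, not TURMIS inputs or models):

```python
import random

# Toy Monte Carlo chain in the spirit of the integrated simulation: each trial
# samples whether a missile is generated, whether it reaches the target
# structure, and whether the impact causes damage. Probabilities are
# illustrative placeholders only.

def damage_probability(n_trials, p_generate, p_hit, p_damage, seed=0):
    rng = random.Random(seed)
    damaged = 0
    for _ in range(n_trials):
        if (rng.random() < p_generate and
                rng.random() < p_hit and
                rng.random() < p_damage):
            damaged += 1
    return damaged / n_trials

est = damage_probability(200_000, 0.1, 0.05, 0.5, seed=42)
# With independent stages the estimate should be close to the analytic
# product 0.1 * 0.05 * 0.5 = 0.0025.
print(est)
```

In a real assessment the three stages are not independent coin flips but full physical models (missile energy, trajectory, barrier response); the value of the Monte Carlo framing is that such models can be substituted stage by stage without changing the overall sampling structure.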

  2. Application of best estimate and uncertainty safety analysis methodology to loss of flow events at Ontario's Power Generation's Darlington Nuclear Generating Station

    International Nuclear Information System (INIS)

    Huget, R.G.; Lau, D.K.; Luxat, J.C.

    2001-01-01

    Ontario Power Generation (OPG) is currently developing a new safety analysis methodology based on best estimate and uncertainty (BEAU) analysis. The framework and elements of the new safety analysis methodology are defined. The evolution of safety analysis technology at OPG has been thoroughly documented. Over the years, the use of conservative limiting assumptions in OPG safety analyses has led to gradual erosion of predicted safety margins. The main purpose of the new methodology is to provide a more realistic quantification of safety margins within a probabilistic framework, using best estimate results, with an integrated accounting of the underlying uncertainties. Another objective of the new methodology is to provide a cost-effective means for on-going safety analysis support of OPG's nuclear generating stations. Discovery issues and plant aging effects require that the safety analyses be periodically revised and, in the past, the cost of reanalysis at OPG has been significant. As OPG enters the new competitive marketplace for electricity, there is a strong need to conduct safety analysis in a less cumbersome manner. This paper presents the results of the first licensing application of the new methodology in support of planned design modifications to the shutdown systems (SDSs) at Darlington Nuclear Generating Station (NGS). The design modifications restore dual trip parameter coverage over the full range of reactor power for certain postulated loss-of-flow (LOF) events. The application of BEAU analysis to the single heat transport pump trip event provides a realistic estimation of the safety margins for the primary and backup trip parameters. These margins are significantly larger than those predicted by conventional limit of the operating envelope (LOE) analysis techniques. (author)

  3. A photogrammetric methodology for estimating construction and demolition waste composition

    International Nuclear Information System (INIS)

    Heck, H.H.; Reinhart, D.R.; Townsend, T.; Seibert, S.; Medeiros, S.; Cochran, K.; Chakrabarti, S.

    2002-01-01

    Manual sorting of construction, demolition, and renovation (C and D) waste is difficult and costly. A photogrammetric method has been developed to analyze the composition of C and D waste that eliminates the need for physical contact with the waste. The only field data collected is the weight and volume of the solid waste in the storage container and a photograph of each side of the waste pile, after it is dumped on the tipping floor. The methodology was developed and calibrated based on manual sorting studies at three different landfills in Florida, where the contents of twenty roll-off containers filled with C and D waste were sorted. The component classifications used were wood, concrete, paper products, drywall, metals, insulation, roofing, plastic, flooring, municipal solid waste, land-clearing waste, and other waste. Photographs of each side of the waste pile were taken with a digital camera and the pictures were analyzed on a computer using Photoshop software. Photoshop was used to divide the picture into eighty cells composed of ten columns and eight rows. The component distribution of each cell was estimated and results were summed to get a component distribution for the pile. Two types of distribution factors were developed that allow the component volumes and weights to be estimated. One set of distribution factors was developed to correct the volume distributions and the second set was developed to correct the weight distributions. The bulk density of each of the waste components were determined and used to convert waste volumes to weights. (author)
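The per-cell estimates can be combined roughly as sketched below, with illustrative fractions and bulk densities (only 2 cells instead of the study's 80, and the study's distribution-factor corrections omitted):

```python
# Minimal sketch of the grid-based composition estimate described above.
# Cell fractions, bulk densities, and pile volume are illustrative values,
# not data from the study.

# Visually estimated volume fraction of each component in each grid cell.
cell_fractions = [
    {"wood": 0.6, "drywall": 0.4},
    {"wood": 0.2, "drywall": 0.8},
]
bulk_density = {"wood": 150.0, "drywall": 300.0}  # kg per cubic metre
pile_volume = 10.0                                 # m^3, measured in the field

def composition_by_weight(cell_fractions, bulk_density, pile_volume):
    n = len(cell_fractions)
    # Average the per-cell fractions to get the pile's volume distribution.
    vol_frac = {c: sum(cell.get(c, 0.0) for cell in cell_fractions) / n
                for c in bulk_density}
    # Convert each component's volume to weight via its bulk density.
    return {c: vol_frac[c] * pile_volume * bulk_density[c]
            for c in bulk_density}

# wood: 0.4 * 10 m^3 * 150 kg/m^3 = 600 kg; drywall: 0.6 * 10 * 300 = 1800 kg.
print(composition_by_weight(cell_fractions, bulk_density, pile_volume))
```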

  4. A photogrammetric methodology for estimating construction and demolition waste composition

    Energy Technology Data Exchange (ETDEWEB)

    Heck, H.H. [Florida Inst. of Technology, Dept. of Civil Engineering, Melbourne, Florida (United States); Reinhart, D.R.; Townsend, T.; Seibert, S.; Medeiros, S.; Cochran, K.; Chakrabarti, S.

    2002-06-15

    Manual sorting of construction, demolition, and renovation (C and D) waste is difficult and costly. A photogrammetric method has been developed to analyze the composition of C and D waste that eliminates the need for physical contact with the waste. The only field data collected is the weight and volume of the solid waste in the storage container and a photograph of each side of the waste pile, after it is dumped on the tipping floor. The methodology was developed and calibrated based on manual sorting studies at three different landfills in Florida, where the contents of twenty roll-off containers filled with C and D waste were sorted. The component classifications used were wood, concrete, paper products, drywall, metals, insulation, roofing, plastic, flooring, municipal solid waste, land-clearing waste, and other waste. Photographs of each side of the waste pile were taken with a digital camera and the pictures were analyzed on a computer using Photoshop software. Photoshop was used to divide the picture into eighty cells composed of ten columns and eight rows. The component distribution of each cell was estimated and results were summed to get a component distribution for the pile. Two types of distribution factors were developed that allow the component volumes and weights to be estimated. One set of distribution factors was developed to correct the volume distributions and the second set was developed to correct the weight distributions. The bulk density of each of the waste components were determined and used to convert waste volumes to weights. (author)

  5. Improved USGS methodology for assessing continuous petroleum resources

    Science.gov (United States)

    Charpentier, Ronald R.; Cook, Troy A.

    2010-01-01

    This report presents an improved methodology for estimating volumes of continuous (unconventional) oil and gas resources within the United States and around the world. The methodology is based on previously developed U.S. Geological Survey methodologies that rely on well-scale production data. Improvements were made primarily to how the uncertainty about estimated ultimate recoveries is incorporated in the estimates. This is particularly important when assessing areas with sparse or no production data, because the new methodology allows better use of analog data from areas with significant discovery histories.

  6. Influence of plume rise on the consequences of radioactive material releases

    International Nuclear Information System (INIS)

    Russo, A.J.; Wayland, J.R.; Ritchie, L.T.

    1977-01-01

    Estimates of health consequences resulting from a postulated nuclear reactor accident can be strongly dependent on the buoyant rise of the plume of released radioactive material. The sensitivity of the consequences of a postulated accident to two different plume rise models has been investigated. The results of these investigations are the subject of this report. One of the models includes the effects of emission angle, momentum, and radioactive heating of the released material. The difference in the consequence estimates from the two models can exceed an order of magnitude under some conditions, but in general the results are similar.

  7. Simplified Methodology to Estimate the Maximum Liquid Helium (LHe) Cryostat Pressure from a Vacuum Jacket Failure

    Science.gov (United States)

    Ungar, Eugene K.; Richards, W. Lance

    2015-01-01

    The aircraft-based Stratospheric Observatory for Infrared Astronomy (SOFIA) is a platform for multiple infrared astronomical observation experiments. These experiments carry sensors cooled to liquid helium temperatures. The liquid helium supply is contained in large (i.e., 10 liters or more) vacuum-insulated dewars. Should the dewar vacuum insulation fail, the inrushing air will condense and freeze on the dewar wall, resulting in a large heat flux on the dewar's contents. The heat flux results in a rise in pressure and the actuation of the dewar pressure relief system. A previous NASA Engineering and Safety Center (NESC) assessment provided recommendations for the wall heat flux that would be expected from a loss of vacuum and detailed an appropriate method to use in calculating the maximum pressure that would occur in a loss of vacuum event. This method involved building a detailed supercritical helium compressible flow thermal/fluid model of the vent stack and exercising the model over the appropriate range of parameters. The experimenters designing science instruments for SOFIA are not experts in compressible supercritical flows and do not generally have access to the thermal/fluid modeling packages that are required to build detailed models of the vent stacks. Therefore, the SOFIA Program engaged the NESC to develop a simplified methodology to estimate the maximum pressure in a liquid helium dewar after the loss of vacuum insulation. The method would allow the university-based science instrument development teams to conservatively determine the cryostat's vent neck sizing during preliminary design of new SOFIA Science Instruments. This report details the development of the simplified method, the method itself, and the limits of its applicability. The simplified methodology provides an estimate of the dewar pressure after a loss of vacuum insulation that can be used for the initial design of the liquid helium dewar vent stacks. However, since it is not an exact …

  8. The 1988 coal outlook: steadily rising consumption

    Energy Technology Data Exchange (ETDEWEB)

    Soras, C.G.; Stodden, J.R.

    1987-12-01

    Total coal use - domestic and foreign - will reach 910 million tons in 1988, an expansion of 1.3% from an estimated 898 million tons in 1987. The overall rise in consumption will add to inventory needs. Moreover, lower interest rates cut effective carrying costs and further encourage the holding of coal stocks by users. The result will be a gain in inventories of 3.5 tons by the end of 1988. As a result of all these factors, coal production is anticipated to rise by 11.6 million tons, or 1.2%, which projects firm markets in a time of relatively soft economic conditions in the USA. 2 tabs.

  9. Advanced Best-Estimate Methodologies for Thermal-Hydraulics Stability Analyses with TRACG code and Improvements on Operating Boiling Water Reactors

    International Nuclear Information System (INIS)

    Vedovi, J.; Trueba, M.; Ibarra, L; Espino, M.; Hoang, H.

    2016-01-01

    In recent years GE Hitachi has introduced two advanced methodologies to address thermal-hydraulic instabilities in Boiling Water Reactors (BWRs): the “Detect and Suppress Solution - Confirmation Density (DSS-CD)” and the “GEH Simplified Stability Solution (GS3).” These two methodologies are based on Best-Estimate Plus Uncertainty (BEPU) analyses and provide significant improvements in safety, plant maneuvering and fuel economics with respect to existing solutions. The DSS-CD and GS3 solutions have recently been approved by the United States Nuclear Regulatory Commission. This paper describes the main characteristics of these two stability methodologies and shares the experience of their recent implementation in operating BWRs. The BEPU approach provided a much deeper understanding of the parameters affecting instabilities in operating BWRs and allowed better calculation of plant setpoints by improving plant maneuvering restrictions and reducing manual operator actions. The DSS-CD and GS3 methodologies are both based on safety analyses performed with the best-estimate system code TRACG. The assessment of uncertainty follows the Code Scaling, Applicability and Uncertainty (CSAU) methodology documented in NUREG/CR-5249. The two solutions have already been implemented in 18 BWR units combined, with 7 more units in the process of transitioning. The main results demonstrate a significant decrease (>0.1) in the stability-based Operating Limit Minimum Critical Power Ratio (OLMCPR), which can yield significant fuel savings, and an increase in allowable stability plant setpoints that addresses instability events such as the one that occurred at the Fermi 2 plant in 2015 and can help prevent unnecessary scrams. The paper also describes the advantages of reduced plant maneuvering resulting from the transition to these solutions; in particular, the history of a BWR/6 transition to DSS-CD is discussed.

  10. Estimating the cost of delaying a nuclear power plant: methodology and application

    International Nuclear Information System (INIS)

    Hill, L.J.; Tepel, R.C.; Van Dyke, J.W.

    1985-01-01

    This paper presents an analysis of an actual 24-month nuclear power plant licensing delay under alternate assumptions about regulatory practice, sources of replacement power, and the cost of the plant. The analysis focuses on both the delay period and periods subsequent to the delay. The methodology utilized to simulate the impacts involved the recursive interaction of a generation-costing program to estimate fuel-replacement costs and a financial regulatory model to concomitantly determine the impact on the utility, its ratepayers, and security issues. The results indicate that a licensing delay has an adverse impact on the utility's internal generation of funds and financial indicators used to evaluate financial soundness. The direction of impact on electricity rates is contingent on the source of fuel used for replacement power. 5 references, 5 tables

  11. Is sea-level rising?

    Digital Repository Service at National Institute of Oceanography (India)

    Unnikrishnan, A.S.

    correction in the estimation of trends obtained from tide gauge records. The altimeter data make it possible to prepare spatial maps of sea-level rise trends. We present a map prepared for the Indian Ocean (Figure 4) north of 10°S, which shows a fairly uniform … The article draws on research papers published by the author and on the report of IPCC AR5 WG1 Chapter 13: Sea Level Changes, for which the author served as a 'Lead Author'. Figure 1 is prepared using data from the University of Colorado. Nerem, R…

  12. Validating alternative methodologies to estimate the hydrological regime of temporary streams when flow data are unavailable

    Science.gov (United States)

    Llorens, Pilar; Gallart, Francesc; Latron, Jérôme; Cid, Núria; Rieradevall, Maria; Prat, Narcís

    2016-04-01

    ) were examined. In this case, flow permanence metrics were estimated as the proportion of photographs showing stream flow. The results indicate that for streams that are dry more than 25% of the time, interviews systematically underestimated flow, but the qualitative information given by inhabitants was of great interest for understanding river dynamics. On the other hand, the use of aerial photographs gave a good estimation of flow permanence, but the seasonality captured was conditioned by the capture dates of the aerial photographs. For these reasons, we recommend using both methodologies together.

  13. The rise in co-authorship in the social sciences (1980-2013)

    DEFF Research Database (Denmark)

    Henriksen, Dorte

    2016-01-01

    This article examines the rise in co-authorship in the social sciences over a 34-year period. It investigates the development of co-authorship in different research fields and discusses how methodological differences between these fields, together with changes in academia, affect the tendency to co-author articles. The study is based on bibliographic data on 4.5 million peer-reviewed articles published in the period 1980-2013 and indexed in the 56 subject categories of the Web of Science’s (WoS) Social Science Citation Index (SSCI). The results show a rise in the average number of authors … data set, statistical methods and/or team-production models.

  14. Doubling of coastal flooding frequency within decades due to sea-level rise

    Science.gov (United States)

    Vitousek, Sean; Barnard, Patrick L.; Fletcher, Charles H.; Frazer, Neil; Erikson, Li; Storlazzi, Curt D.

    2017-01-01

    Global climate change drives sea-level rise, increasing the frequency of coastal flooding. In most coastal regions, the amount of sea-level rise occurring over years to decades is significantly smaller than normal ocean-level fluctuations caused by tides, waves, and storm surge. However, even gradual sea-level rise can rapidly increase the frequency and severity of coastal flooding. So far, global-scale estimates of increased coastal flooding due to sea-level rise have not considered elevated water levels due to waves, and thus underestimate the potential impact. Here we use extreme value theory to combine sea-level projections with wave, tide, and storm surge models to estimate increases in coastal flooding on a continuous global scale. We find that regions with limited water-level variability, i.e., short-tailed flood-level distributions, located mainly in the Tropics, will experience the largest increases in flooding frequency. The 10 to 20 cm of sea-level rise expected no later than 2050 will more than double the frequency of extreme water-level events in the Tropics, impairing the developing economies of equatorial coastal cities and the habitability of low-lying Pacific island nations.
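The mechanism can be illustrated with a toy extreme-value calculation: if extreme water levels follow a Gumbel distribution with scale sigma, raising mean sea level by dz multiplies the exceedance frequency of any fixed flood threshold by roughly exp(dz/sigma). The sigma = 0.2 m used below is an assumed illustrative value for a short-tailed (low-variability) site, not a figure from the paper:

```python
import math

# For Gumbel-distributed extremes, the exceedance probability of a high
# threshold z is approximately exp(-(z - mu) / sigma), so shifting the
# location mu upward by dz multiplies that probability by exp(dz / sigma).

def frequency_multiplier(dz_m, sigma_m):
    """Factor by which flood frequency grows for sea-level rise dz_m."""
    return math.exp(dz_m / sigma_m)

# With an assumed sigma = 0.2 m, a 10-20 cm rise gives a 1.6x-2.7x increase,
# consistent with the "more than double" behaviour for low-variability sites.
for dz in (0.10, 0.15, 0.20):
    print(f"{dz:.2f} m rise -> x{frequency_multiplier(dz, 0.20):.2f} frequency")
```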

  15. Doubling of coastal flooding frequency within decades due to sea-level rise.

    Science.gov (United States)

    Vitousek, Sean; Barnard, Patrick L; Fletcher, Charles H; Frazer, Neil; Erikson, Li; Storlazzi, Curt D

    2017-05-18

    Global climate change drives sea-level rise, increasing the frequency of coastal flooding. In most coastal regions, the amount of sea-level rise occurring over years to decades is significantly smaller than normal ocean-level fluctuations caused by tides, waves, and storm surge. However, even gradual sea-level rise can rapidly increase the frequency and severity of coastal flooding. So far, global-scale estimates of increased coastal flooding due to sea-level rise have not considered elevated water levels due to waves, and thus underestimate the potential impact. Here we use extreme value theory to combine sea-level projections with wave, tide, and storm surge models to estimate increases in coastal flooding on a continuous global scale. We find that regions with limited water-level variability, i.e., short-tailed flood-level distributions, located mainly in the Tropics, will experience the largest increases in flooding frequency. The 10 to 20 cm of sea-level rise expected no later than 2050 will more than double the frequency of extreme water-level events in the Tropics, impairing the developing economies of equatorial coastal cities and the habitability of low-lying Pacific island nations.

  16. A proposed methodology for the calculation of direct consumption of fossil fuels and electricity for livestock breeding, and its application to Cyprus

    International Nuclear Information System (INIS)

    Kythreotou, Nicoletta; Florides, Georgios; Tassou, Savvas A.

    2012-01-01

    On-farm energy consumption is becoming increasingly important in the context of rising energy costs and concerns over greenhouse gas emissions. For farmers throughout the world, energy inputs represent a major and rapidly increasing cost. In many countries such as Cyprus, however, there is a lack of systematic research on energy use in agriculture, which hinders benchmarking and the evaluation of approaches and investment decisions for energy improvement. This study established a methodology for estimating the direct consumption of fossil fuels and electricity for livestock breeding, excluding transport, for locations where full data sets are not available. This methodology was then used to estimate fossil fuel and electricity consumption for livestock breeding in Cyprus. For 2008, this energy was found to be equivalent to 40.3 GWh, which corresponds to 8% of the energy used in agriculture. Differences between the energy consumption per animal in Cyprus and other countries were found to be mainly due to differences in climatic conditions and the technologies used in the farms. -- Highlights: ► A methodology to calculate energy consumption in farming, applied to Cyprus. ► Annual consumption per animal was estimated to be 565 kWh/cow, 537 kWh/sow and 0.677 kWh/chicken. ► Direct energy consumption in livestock breeding is estimated at 40.3 GWh in 2008.
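The quoted figures can be combined in a back-of-envelope check; the herd size in the second calculation is a hypothetical example, not a Cyprus statistic:

```python
# Back-of-envelope checks of the figures quoted above.

livestock_gwh = 40.3          # direct fossil fuel + electricity use, 2008
share_of_agriculture = 0.08   # livestock share of agricultural energy use

# Implied total agricultural energy use: 40.3 / 0.08 ~ 504 GWh.
agriculture_gwh = livestock_gwh / share_of_agriculture

# Energy use of a hypothetical 10,000-cow herd at the quoted 565 kWh per
# cow per year, converted from kWh to GWh.
herd_gwh = 10_000 * 565 / 1e6

print(agriculture_gwh, herd_gwh)
```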

  17. Estimation of the daily global solar radiation based on the Gaussian process regression methodology in the Saharan climate

    Science.gov (United States)

    Guermoui, Mawloud; Gairaa, Kacem; Rabehi, Abdelaziz; Djafer, Djelloul; Benkaciali, Said

    2018-06-01

    Accurate estimation of solar radiation is a major concern in renewable energy applications. Over the past few years, many machine learning paradigms have been proposed in order to improve estimation performance, mostly based on artificial neural networks, fuzzy logic, support vector machines and adaptive neuro-fuzzy inference systems. The aim of this work is the prediction of the daily global solar radiation received on a horizontal surface through the Gaussian process regression (GPR) methodology. A case study of the Ghardaïa region (Algeria) has been used to validate the methodology. Several input combinations were tested; it was found that the GPR model based on sunshine duration, minimum air temperature and relative humidity gives the best results in terms of mean absolute bias error (MBE), root mean square error (RMSE), relative root mean square error (rRMSE), and correlation coefficient (r). The obtained values of these indicators are 0.67 MJ/m2, 1.15 MJ/m2, 5.2%, and 98.42%, respectively.
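The four indicators quoted above can be computed directly from observed and predicted series; a self-contained sketch with toy data (not the Ghardaïa series):

```python
import math

# Plain implementations of the four error indicators: mean bias error,
# root mean square error, relative RMSE (percent of the observed mean),
# and the Pearson correlation coefficient.

def mbe(obs, pred):
    return sum(p - o for o, p in zip(obs, pred)) / len(obs)

def rmse(obs, pred):
    return math.sqrt(sum((p - o) ** 2 for o, p in zip(obs, pred)) / len(obs))

def rrmse(obs, pred):
    return 100.0 * rmse(obs, pred) / (sum(obs) / len(obs))

def pearson_r(obs, pred):
    n = len(obs)
    mo, mp = sum(obs) / n, sum(pred) / n
    cov = sum((o - mo) * (p - mp) for o, p in zip(obs, pred))
    so = math.sqrt(sum((o - mo) ** 2 for o in obs))
    sp = math.sqrt(sum((p - mp) ** 2 for p in pred))
    return cov / (so * sp)

obs = [20.0, 22.0, 25.0, 18.0]   # e.g. daily global radiation, MJ/m^2
pred = [21.0, 21.5, 24.0, 19.0]
print(mbe(obs, pred), rmse(obs, pred), rrmse(obs, pred), pearson_r(obs, pred))
```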

  18. Methods of erection of high-rise buildings

    Directory of Open Access Journals (Sweden)

    Cherednichenko Nadezhda

    2018-01-01

    Full Text Available The article describes the factors determining the choice of methods for organizing the construction process and carrying out construction and installation work in the erection of high-rise buildings. Specific features of their underground parts are indicated, characterized by massive slab-pile foundations, large volumes of earthworks, and reinforced bases and foundations for assembly cranes. The work cycle is considered for reinforced concrete, steel and combined skeletons of high-rise buildings, and the areas of application of flow, separate and complex methods are disclosed. The main conditions for the erection of high-rise buildings and their components are singled out: the choice of formwork systems, delivery and lifting of concrete mixes, installation of reinforcement, and the selection of lifting, transporting and auxiliary equipment. The article identifies reserves for reducing the duration of construction through the creation of: complex mechanized technologies for the efficient construction of foundations in various soil conditions, including heaving, swelling, constrained, subsiding, bulk and water-saturated soils; complex mechanized technologies for the erection of monolithic reinforced concrete structures, taking into account winter production conditions and the use of mobile concrete-laying complexes and new-generation machines; modular formwork systems distinguished by versatility, lightness and simplicity of operation, suitable for complex high-rise construction; and a more refined methodology together with a set of progressive organizational and technological solutions that ensure a rational relationship between production processes and their maximum overlap in time and space.

  19. Anthropogenic sea level rise and adaptation in the Yangtze estuary

    Science.gov (United States)

    Cheng, H.; Chen, J.; Chen, Z.; Ruan, R.; Xu, G.; Zeng, G.; Zhu, J.; Dai, Z.; Gu, S.; Zhang, X.; Wang, H.

    2016-02-01

    Sea-level rise is a major projected threat of climate change. There are regional variations in sea-level change, depending both on natural factors, such as tectonic subsidence, geomorphology and changing river inputs, and on anthropogenic forces, such as water impoundment by artificial reservoirs within the watershed and urban land subsidence driven by groundwater depletion in the river delta. Little is known about regional sea-level fall in response to channel erosion caused by the decline in sediment discharge due to reservoir interception in the upstream watershed, or about water-level rise driven by anthropogenic measures such as land reclamation, deep-waterway regulation and freshwater reservoir construction in estuaries. Coastal cities situated in delta regions are expected to be threatened to various degrees; Shanghai is among them. Here we consider anthropogenic sea-level rise in the Yangtze estuary from the point of view of the continuous hydrodynamic system consisting of the river catchment, the estuary and the coastal sea. Land subsidence is cited as 4 mm/a (2011-2030). Scour of the estuarine channel by upstream engineering such as the Three Gorges Dam is estimated at 2-10 cm (2011-2030). The rise of water level due to the deep waterway and land reclamation is estimated at 8-10 cm (2011-2030). The relative sea-level rise is thus expected to be about 10-16 cm (2011-2030); these anthropogenic changes are superimposed on the absolute sea-level rise of 2 mm/a and the tectonic subsidence of 1 mm/a measured in the 1990s. The action guidelines proposed to the Shanghai government for its sea-level rise strategy are: (1) recent actions (2012-2015) to upgrade the city's water supply, drainage and protective engineering; (2) interim actions (2016-2020) to improve the sea-level monitoring and early-warning system, and then the special, city and regional planning considering sea-level rise; (3) long-term actions (2021

  20. A GIS-based methodology for the estimation of potential volcanic damage and its application to Tenerife Island, Spain

    Science.gov (United States)

    Scaini, C.; Felpeto, A.; Martí, J.; Carniel, R.

    2014-05-01

    This paper presents a GIS-based methodology to estimate the damage produced by volcanic eruptions. The methodology consists of four parts: definition and simulation of eruptive scenarios, exposure analysis, vulnerability assessment, and estimation of expected damage. Multi-hazard eruptive scenarios are defined for the Teide-Pico Viejo active volcanic complex and simulated with the VORIS tool. The exposure analysis identifies the elements exposed to each hazard and focuses on the relevant assets for the study area. The vulnerability analysis is based on previous studies of the built environment, complemented with an analysis of transportation and urban infrastructure. Damage assessment is performed by associating a qualitative damage rating with each combination of hazard and vulnerability. This operation consists of a GIS-based overlay, performed for each hazardous phenomenon considered and for each element. The methodology is then automated in a GIS-based tool built on ArcGIS®. Given the eruptive scenarios and the characteristics of the exposed elements, the tool produces expected-damage maps. The tool is applied to the Icod Valley (northern Tenerife Island), which is likely to be affected by volcanic phenomena in case of an eruption from either the Teide-Pico Viejo volcanic complex or the North-West basaltic rift. Results are thematic maps of vulnerability and damage that can be displayed at different levels of detail, depending on user preferences. The aim of the tool is to facilitate territorial planning and risk management in active volcanic areas.
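
    The damage-assessment step, a qualitative rating for each (hazard, vulnerability) combination evaluated per exposed element after the overlay, can be sketched as a lookup table (the class names and the rating matrix below are illustrative, not the paper's):

```python
# Sketch of the damage-assessment step: each (hazard intensity,
# vulnerability class) pair maps to a qualitative damage rating,
# evaluated per exposed element after the GIS overlay. The class
# names and the rating matrix are illustrative assumptions.

DAMAGE_MATRIX = {
    # (hazard level, vulnerability class) -> damage rating
    ("low", "robust"): "none",
    ("low", "fragile"): "light",
    ("moderate", "robust"): "light",
    ("moderate", "fragile"): "moderate",
    ("high", "robust"): "moderate",
    ("high", "fragile"): "heavy",
}

def damage_rating(hazard_level, vulnerability):
    return DAMAGE_MATRIX[(hazard_level, vulnerability)]

# One overlay cell: a fragile element hit by a moderate hazard level.
print(damage_rating("moderate", "fragile"))
```

    In the real tool this lookup is applied cell by cell for every simulated hazardous phenomenon and every exposed element, producing the thematic damage maps.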

  1. Vertical cities - the new form of high-rise construction evolution

    Science.gov (United States)

    Akristiniy, Vera A.; Boriskina, Yulia I.

    2018-03-01

    The article considers the basic principles of the formation of vertical cities for the creation of a comfortable urban environment under conditions of rapid population growth and limited territory. As urban growth increases, there is a need for new concepts and approaches to urban space planning through the massive introduction of high-rise construction. The authors analyzed and systematized the list of high-tech solutions for arranging the space of vertical cities, which are an integral part of creating a methodology for forming high-rise buildings. The vertical city concept differs in scale, in the presence of large areas of public space, in tendencies toward self-sufficiency and sustainability, and in the opportunity to offer a new, uniquely comfortable environment to the population living there.

  2. Heat flux estimate of warm water flow in a low-temperature diffuse flow site, southern East Pacific Rise 17°25‧ S

    Science.gov (United States)

    Goto, Shusaku; Kinoshita, Masataka; Mitsuzawa, Kyohiko

    2003-09-01

    A low-temperature diffuse flow site associated with abundant vent fauna was found by submersible observations on the southern East Pacific Rise at 17°25‧ S in 1997. The site was characterized by pillow and sheet lavas covered by thin sediment, with collapsed pits up to ˜15 m in diameter. There were three warm water vents (temperatures of 6.5 to 10.5 °C) within the site, above which the vented fluids rise as plumes. To estimate the heat flux of the warm water vents, a temperature logger array was deployed and the vertical temperature distribution in the water column up to 38 m above the seafloor was monitored. A stationary deep seafloor observatory system was also deployed to monitor hydrothermal activity at the site. The temperature logger array measured temperature anomalies while the plumes from the vents passed through the array. Because the temperature anomalies were measured only for specific current directions, we identified one of the vents as the source. The heat flux from the vent was estimated by applying a model of a plume in crossflow in a density-stratified environment. The average heat flux from September 13 to October 18, 1997 was 39 MW. This heat flux is of the same order as those of high-temperature black smokers, indicating that a large volume flux was discharged from the vent (1.9 m3/s). Previous observations found many similar warm water vents along the spreading axis between 17°20‧ S and 17°30‧ S. The total heat flux was estimated to be at least a few hundred megawatts. This venting style would contribute to forming effluent hydrothermal plumes extending above the spreading axis.
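
    The quoted heat and volume fluxes can be cross-checked with the basic advective relation Q = ρ·cp·V·ΔT (our consistency check, not the plume-in-crossflow model used in the study; the seawater properties and the ambient bottom temperature are assumed values):

```python
# Consistency check: heat flux carried by the vent outflow,
# Q = rho * cp * V * dT. Property values and the ambient
# temperature are assumptions, not taken from the paper.

rho = 1025.0   # seawater density, kg/m^3 (assumed)
cp = 3900.0    # seawater specific heat, J/(kg K) (assumed)
V = 1.9        # volume flux quoted in the abstract, m^3/s
dT = 5.0       # vent minus ambient temperature, K (assumed: ~6.5-10.5 C
               # vent fluids over ~2 C bottom water)

Q = rho * cp * V * dT
print(f"Q = {Q/1e6:.0f} MW")
```

    With these assumed values the advective flux comes out near 38 MW, close to the reported 39 MW average, so the quoted volume flux and temperature excess are mutually consistent.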

  3. Empirical evaluation and justification of methodologies in psychological science.

    Science.gov (United States)

    Proctor, R W; Capaldi, E J

    2001-11-01

    The purpose of this article is to describe a relatively new movement in the history and philosophy of science, naturalism, a form of pragmatism emphasizing that methodological principles are empirical statements. Thus, methodological principles must be evaluated and justified on the same basis as other empirical statements. On this view, methodological statements may be less secure than the specific scientific theories to which they give rise. The authors examined the feasibility of a naturalistic approach to methodology using logical and historical analysis and by contrasting theories that predict new facts versus theories that explain already known facts. They provide examples of how differences over methodological issues in psychology and in science generally may be resolved using a naturalistic, or empirical, approach.

  4. Estimating Areas of Vulnerability: Sea Level Rise and Storm Surge Hazards in the National Parks

    Science.gov (United States)

    Caffrey, M.; Beavers, R. L.; Slayton, I. A.

    2013-12-01

    The University of Colorado Boulder in collaboration with the National Park Service has undertaken the task of compiling sea level change and storm surge data for 105 coastal parks. The aim of our research is to highlight areas of the park system that are at increased risk of rapid inundation as well as periodic flooding due to sea level rise and storms. This research will assist park managers and planners in adapting to climate change. The National Park Service incorporates climate change data into many of their planning documents and is willing to implement innovative coastal adaptation strategies. Events such as Hurricane Sandy highlight how impacts of coastal hazards will continue to challenge management of natural and cultural resources and infrastructure along our coastlines. This poster will discuss the current status of this project. We discuss the impacts of Hurricane Sandy as well as the latest sea level rise and storm surge modeling being employed in this project. In addition to evaluating various drivers of relative sea-level change, we discuss how park planners and managers also need to consider projected storm surge values added to sea-level rise magnitudes, which could further complicate the management of coastal lands. Storm surges occurring at coastal parks will continue to change the land and seascapes of these areas, with the potential to completely submerge them. The likelihood of increased storm intensity added to increasing rates of sea-level rise make predicting the reach of future storm surges essential for planning and adaptation purposes. The National Park Service plays a leading role in developing innovative strategies for coastal parks to adapt to sea-level rise and storm surge, whilst coastal storms are opportunities to apply highly focused responses.

  5. Evaluation Methodologies for Estimating the Likelihood of Program Implementation Failure

    Science.gov (United States)

    Durand, Roger; Decker, Phillip J.; Kirkman, Dorothy M.

    2014-01-01

    Despite our best efforts as evaluators, program implementation failures abound. A wide variety of valuable methodologies have been adopted to explain and evaluate the "why" of these failures. Yet, typically these methodologies have been employed concurrently (e.g., project monitoring) or to the post-hoc assessment of program activities.…

  6. Methodological approaches to analysis of agricultural countermeasures on radioactive contaminated areas: Estimation of effectiveness and comparison of different alternatives

    DEFF Research Database (Denmark)

    Yatsalo, B.I.; Hedemann Jensen, P.; Alexakhin, R.M.

    1997-01-01

    Methodological aspects of countermeasure analysis in the long-term period after a nuclear accident are discussed, using agricultural countermeasures for illustrative purposes. The estimates of effectiveness for specific countermeasures, as well as methods for assessing justified action levels...... and comparison of different alternatives (countermeasures) based on the use of several criteria are considered....

  7. Cerebral methodology based computing to estimate real phenomena from large-scale nuclear simulation

    International Nuclear Information System (INIS)

    Suzuki, Yoshio

    2011-01-01

    Our final goal is to estimate real phenomena from large-scale nuclear simulations by using computing processes. Large-scale simulations here means simulations with such scale variety and physical complexity that corresponding experiments and/or theories do not exist. In the nuclear field, it is indispensable to estimate real phenomena from simulations in order to improve the safety and security of nuclear power plants. Here, analysis of the uncertainty included in simulations is needed to reveal the sensitivity to uncertainty due to randomness, to reduce the uncertainty due to lack of knowledge, and to establish a degree of certainty through verification and validation (V and V) and uncertainty quantification (UQ) processes. To realize this, we propose 'Cerebral Methodology based Computing (CMC)', a set of computing processes with deductive and inductive approaches modeled on human reasoning. Our idea is to execute deductive and inductive simulations corresponding to the deductive and inductive approaches. We have established a prototype system and applied it to a thermal displacement analysis of a nuclear power plant. The result shows that our idea is effective in reducing the uncertainty and obtaining a degree of certainty. (author)

  8. New methodology for estimating biofuel consumption for cooking: Atmospheric emissions of black carbon and sulfur dioxide from India

    Science.gov (United States)

    Habib, Gazala; Venkataraman, Chandra; Shrivastava, Manish; Banerjee, Rangan; Stehr, J. W.; Dickerson, Russell R.

    2004-09-01

    The dominance of biofuel combustion emissions in the Indian region, and the inherently large uncertainty in biofuel use estimates based on cooking energy surveys, prompted the current work, which develops a new methodology for estimating biofuel consumption for cooking. It is based on food consumption statistics and the specific energy for food cooking. Estimated biofuel consumption in India was 379 (247-584) Tg yr⁻¹. New information on the user population of different biofuels was compiled at the state level to derive the biofuel mix, which varied regionally and was 74:16:10% fuelwood, dung cake and crop waste, respectively, at the national level. Importantly, the uncertainty in biofuel use from quantitative error assessment using the new methodology is around 50%, a narrower bound than in previous works. From this new activity data and currently used black carbon emission factors, black carbon (BC) emissions from biofuel combustion were estimated at 220 (65-760) Gg yr⁻¹. The largest BC emissions were from fuelwood (75%), with lower contributions from dung cake (16%) and crop waste (9%). The uncertainty of 245% in the BC emissions estimate is now governed by the large spread in BC emission factors from biofuel combustion (122%), implying the need to reduce this uncertainty through measurements. Emission factors of SO2 from combustion of biofuels widely used in India were measured, and ranged from 0.03 to 0.08 g kg⁻¹ for two wood species, 0.05 to 0.20 g kg⁻¹ for 10 crop waste types, and 0.88 g kg⁻¹ for dung cake, significantly lower than currently used emission factors for wood and crop waste. Estimated SO2 emissions from biofuels, 75 (36-160) Gg yr⁻¹, were about a factor of 3 lower than in recent studies, with a large contribution from dung cake (73%), followed by fuelwood (21%) and crop waste (6%).
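
    The SO2 estimate can be reproduced approximately from the quoted fuel mix and emission factors with simple bookkeeping, E = Σ (total biofuel × mix fraction × emission factor); the midpoint emission factors for wood and crop waste are our choice within the measured ranges:

```python
# Back-of-envelope SO2 emission estimate from Indian biofuel use:
# E = sum over fuels of (total biofuel * mix fraction * emission factor).
# Mid-range emission factors for wood and crop waste are our assumption;
# the study reports ranges of 0.03-0.08 and 0.05-0.20 g/kg.

TOTAL_BIOFUEL_TG = 379.0  # Tg/yr, central estimate from the abstract
mix = {"fuelwood": 0.74, "dung cake": 0.16, "crop waste": 0.10}
ef_g_per_kg = {"fuelwood": 0.055, "dung cake": 0.88, "crop waste": 0.125}

# Unit bookkeeping: 1 Tg fuel * 1 g/kg emitted = 1 Gg of SO2.
emissions_gg = {
    fuel: TOTAL_BIOFUEL_TG * frac * ef_g_per_kg[fuel]
    for fuel, frac in mix.items()
}
total = sum(emissions_gg.values())
for fuel, e in emissions_gg.items():
    print(f"{fuel}: {e:.1f} Gg ({100 * e / total:.0f}%)")
print(f"total: {total:.0f} Gg SO2/yr")
```

    With midpoint emission factors this lands near the reported 75 (36-160) Gg yr⁻¹ total, with dung cake contributing roughly 73%, matching the quoted shares.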

  9. SU-F-T-687: Comparison of SPECT/CT-Based Methodologies for Estimating Lung Dose from Y-90 Radioembolization

    Energy Technology Data Exchange (ETDEWEB)

    Kost, S; Yu, N [Cleveland Clinic, Cleveland, OH (United States); Lin, S [Cleveland State University, Cleveland, OH (United States)

    2016-06-15

    Purpose: To compare mean lung dose (MLD) estimates from 99mTc macroaggregated albumin (MAA) SPECT/CT using two published methodologies for patients treated with 90Y radioembolization for liver cancer. Methods: MLD was estimated retrospectively with two methodologies for 40 patients from SPECT/CT images of 99mTc-MAA administered prior to radioembolization. In the two methods, lung shunt fractions (LSFs) were calculated as the ratio of scanned lung activity to the activity in the entire scan volume, or to the sum of activity in the lung and liver, respectively. Misregistration of liver activity into the lungs during SPECT acquisition was overcome by excluding lung counts within either 2 or 1.5 cm of the diaphragm apex, respectively. Patient lung density was assumed to be 0.3 g/cm³ or derived from CT densitovolumetry, respectively. Results from both approaches were compared to the MLD determined by planar scintigraphy (PS). The effect of patient size on the difference between MLD from PS and SPECT/CT was also investigated. Results: Lung density from CT densitovolumetry is not different from the reference density (p = 0.68). The second method resulted in an MLD on average 1.5 times larger than the first method; however, the difference between the means of the two estimates was not significant (p = 0.07). Lung doses from both methods were statistically different from those estimated from 2D PS (p < 0.001). There was no correlation between patient size and the difference between MLD from PS and either SPECT/CT method (r < 0.22, p > 0.17). Conclusion: There is no statistically significant difference between the MLDs estimated with the two techniques. Both methods are statistically different from conventional PS, with PS overestimating dose by a factor of three or more. The difference between lung doses estimated from 2D planar and 3D SPECT/CT imaging does not depend on patient size.
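
    The two LSF definitions, together with a standard Y-90 lung dose relation, can be sketched as follows (the dose formula assumes complete local absorption of the Y-90 beta energy, about 49.7 J per GBq, a standard approximation rather than this paper's exact pipeline; the counts, activity and lung volume are illustrative):

```python
# Sketch of the two SPECT/CT lung shunt fraction (LSF) definitions and
# a standard Y-90 mean lung dose estimate. All numbers are illustrative.

lung_counts = 2.0e5
liver_counts = 3.0e6
whole_scan_counts = 3.4e6   # includes extrahepatic activity

# Method 1: lung activity over total activity in the scan volume.
lsf_scan = lung_counts / whole_scan_counts
# Method 2: lung activity over lung + liver activity.
lsf_lung_liver = lung_counts / (lung_counts + liver_counts)

def mean_lung_dose_gy(administered_gbq, lsf, lung_mass_kg):
    """MLD assuming full local absorption of Y-90 beta energy,
    ~49.7 J per GBq administered (a standard approximation)."""
    return 49.7 * administered_gbq * lsf / lung_mass_kg

A = 1.5                 # administered activity, GBq (illustrative)
lung_mass = 0.3 * 3.5   # density (g/cm^3 = kg/L) * CT lung volume (L)
print(f"LSF (scan volume): {lsf_scan:.3f}")
print(f"LSF (lung+liver):  {lsf_lung_liver:.3f}")
print(f"MLD: {mean_lung_dose_gy(A, lsf_lung_liver, lung_mass):.1f} Gy")
```

    The lung mass line shows where the two density choices enter: either the fixed 0.3 g/cm³ reference density or a patient-specific density from CT densitovolumetry, multiplied by the segmented lung volume.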

  10. Costs of disarmament - Rethinking the price tag: A methodological inquiry into the costs and benefits of arms control

    International Nuclear Information System (INIS)

    Willett, S.

    2002-06-01

    The growing number of arms control and disarmament treaties agreed over the past decades, as well as rising concerns about the harmful environmental and public health effects of weapons disposal, have understandably led to an increase in the cost of implementing arms control agreements. As a result, the expenses associated with treaty compliance have emerged as a contentious issue within the realm of arms control and disarmament discussions. In particular, opponents of arms control and disarmament point to the perceived rising costs of meeting current and proposed treaty obligations in an attempt to limit and undermine such activities. Yet determining just how much arms control and disarmament cost remains an ambiguous task. In Costs of Disarmament - Rethinking the Price Tag: A Methodological Inquiry into the Costs and Benefits of Arms Control, Susan Willett addresses the question of how the cost of arms control ought to be measured. Emphasizing the proper allocation of the costs of arms control treaty implementation to the life cycle costs of weapon systems, and their correct weighing against the benefits they procure in terms of averted arms races and increased international security, Willett argues for a revised methodology of costing arms control and disarmament that gives a more accurate, and significantly lower, estimate of the latter. Adopting such a revised methodology, the author concludes, might dispel considerable misunderstanding and help point decisions over arms control and disarmament in the right direction.

  11. STUDY OF SHELL FOR ENERGY EFFICIENT OF SUSTAINABLE LOW-RISE BUILDING

    Directory of Open Access Journals (Sweden)

    DANISHEVSKYI V. V.

    2016-03-01

    The article presents the results of a study of the shell of an energy-efficient, environmentally friendly low-rise residential building corresponding to the criteria of sustainable development in construction. Purpose. To study the parameters of the shell of energy-efficient environmental low-rise buildings. Methodology. The research is carried out on the basis of an improved method for calculating the thermal characteristics of the external walling, as well as physical simulation of heat transfer. Conclusion. The ratio between the thickness of the external walling and the proportion of heat loss through it was determined, and the heat loss through thermal "bridges" was studied. Originality. The limits for the optimum thickness of external walling made of ecological materials were analyzed, and a solution was offered for minimizing heat loss through the nodes of the shell. Practical value. Recommendations are given for constructing the thermal shell when planning energy-efficient low-rise residential buildings.
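
    The wall-thickness versus heat-loss trade-off studied here follows the standard steady-state relation Q = U·A·ΔT with U = 1/(Rsi + Σ dᵢ/λᵢ + Rse); a minimal sketch with assumed layer materials and conductivities (not the article's measured values):

```python
# Steady-state heat loss through a layered external wall:
# U = 1 / (R_si + sum(d_i / lambda_i) + R_se), Q = U * A * dT.
# Layer thicknesses and conductivities are assumed, illustrative values.

R_SI, R_SE = 0.13, 0.04  # standard surface resistances, m^2K/W

def u_value(layers):
    """layers: list of (thickness m, conductivity W/mK)."""
    return 1.0 / (R_SI + sum(d / lam for d, lam in layers) + R_SE)

wall = [
    (0.015, 0.70),   # lime plaster
    (0.300, 0.045),  # straw/wood-fibre insulation (ecological material)
    (0.020, 0.21),   # timber cladding
]
U = u_value(wall)
area, dT = 120.0, 40.0  # envelope area m^2, indoor-outdoor dT in K
print(f"U = {U:.3f} W/m^2K, Q = {U * area * dT / 1000:.2f} kW")
```

    Doubling the insulation layer roughly halves U but with diminishing returns, which is exactly the optimum-thickness question the article investigates; thermal "bridges" add losses on top of this one-dimensional estimate.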

  12. A methodology for the estimation of the radiological consequences of a Loss of Coolant Accident

    Energy Technology Data Exchange (ETDEWEB)

    Kereszturi, Andras; Brolly, Aron; Panka, Istvan; Pazmandi, Tamas; Trosztel, Istvan [Hungarian Academy of Sciences, Budapest (Hungary). MTA EK, Centre for Energy Research

    2017-09-15

    Calculating the radiological consequences of Large Break Loss of Coolant Accident (LBLOCA) events requires a set of computer codes modeling the corresponding physical processes and disciplines, with appropriate data exchange between them. To demonstrate the methodology applied at MTA EK, a LBLOCA event at shut-down reactor state, when only a limited configuration of the Emergency Core Cooling System (ECCS) is available, was selected. In this special case, fission gas release from a number of fuel pins is obtained from the analyses. This paper describes the initiating event, the corresponding thermal hydraulic calculations, the further physical processes, the necessary models and computer codes and their connections, as well as the fuel behavior processes. Additionally, the applied conservative assumptions and the Best Estimate Plus Uncertainty (BEPU) evaluation used to characterize the pin power and burnup distribution in the core are presented. Finally, the newly developed methodology for predicting whether fuel pins lose hermeticity is described, and the results of the activity transport and dose calculations are shown.

  13. Crown-rise and crown-length dynamics: applications to loblolly pine

    Science.gov (United States)

    Harry T. Valentine; Ralph L. Amateis; Jeffrey H. Gove; Annikki. Makela

    2013-01-01

    The original crown-rise model estimates the average height of a crown-base in an even-aged mono-species stand of trees. We have elaborated this model to reduce bias and prediction error, and to also provide crown-base estimates for individual trees. Results for the latter agree with a theory of branch death based on resource availability and allocation.We use the...

  14. Detecting anthropogenic footprints in sea level rise: the role of complex colored noise

    Science.gov (United States)

    Dangendorf, Sönke; Marcos, Marta; Müller, Alfred; Zorita, Eduardo; Jensen, Jürgen

    2015-04-01

    While there is scientific consensus that global mean sea level (MSL) has been rising since the late 19th century, it remains unclear how much of this rise is due to natural variability and how much to anthropogenic forcing. Uncovering the anthropogenic contribution requires profound knowledge of the persistence of natural MSL variations. This is challenging, since observational time series represent the superposition of various processes with different spectral properties. Here we statistically estimate the upper bounds of naturally forced centennial MSL trends on the basis of two separate components: a slowly varying volumetric component (mass and density changes) and a more rapidly changing atmospheric component. Drawing on a combination of spectral analyses of tide gauge records, ocean reanalysis data and numerical Monte-Carlo experiments, we find that in records where transient atmospheric processes dominate, the persistence of natural volumetric changes is underestimated. If each component is assessed separately, natural centennial trends are locally up to ~0.5 mm/yr larger than in an integrated assessment. This implies that external trends in MSL rise related to anthropogenic forcing might generally be overestimated. By applying our approach to the outputs of a centennial ocean reanalysis (SODA), we estimate maximum natural trends on the order of 1 mm/yr for the global average. This value is larger than previous estimates, but consistent with recent paleo evidence from periods in which the anthropogenic contribution was absent. Comparing our estimate to the observed 20th century MSL rise of 1.7 mm/yr suggests a minimum external contribution of at least 0.7 mm/yr. We conclude that an accurate detection of anthropogenic footprints in MSL rise requires a more careful assessment of the persistence of intrinsic natural variability.
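
    The core of the Monte-Carlo argument, that persistent (colored) noise widens the distribution of purely natural centennial trends relative to white noise, can be sketched like this (an AR(1) process stands in for the study's spectral noise models; all parameters are illustrative):

```python
# Monte-Carlo sketch: spread of 100-year linear trends produced by
# pure noise, white vs. persistent AR(1). AR(1) is a stand-in for the
# colored-noise models of the study; parameters are illustrative.
import random
import statistics

random.seed(42)
N_YEARS, N_RUNS, SIGMA, PHI = 100, 500, 20.0, 0.8  # mm noise, AR(1) coeff

def linear_trend(y):
    n = len(y)
    tm, ym = (n - 1) / 2, sum(y) / n
    num = sum((ti - tm) * (yi - ym) for ti, yi in enumerate(y))
    den = sum((ti - tm) ** 2 for ti in range(n))
    return num / den  # mm/yr

def simulate(phi):
    x, out = 0.0, []
    for _ in range(N_YEARS):
        x = phi * x + random.gauss(0.0, SIGMA)
        out.append(x)
    return out

white = [linear_trend(simulate(0.0)) for _ in range(N_RUNS)]
red = [linear_trend(simulate(PHI)) for _ in range(N_RUNS)]
print(f"trend spread, white noise: {statistics.stdev(white):.3f} mm/yr")
print(f"trend spread, AR(1) phi={PHI}: {statistics.stdev(red):.3f} mm/yr")
```

    The persistent process produces a several-fold wider spread of spurious centennial trends, which is why underestimating persistence leads to overestimating the externally forced trend.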

  15. Methodology for Estimation of Flood Magnitude and Frequency for New Jersey Streams

    Science.gov (United States)

    Watson, Kara M.; Schopp, Robert D.

    2009-01-01

    Methodologies were developed for estimating flood magnitudes at the 2-, 5-, 10-, 25-, 50-, 100-, and 500-year recurrence intervals for unregulated or slightly regulated streams in New Jersey. Regression equations that incorporate basin characteristics were developed to estimate flood magnitude and frequency for streams throughout the State by use of a generalized least squares regression analysis. Relations between flood-frequency estimates based on streamflow-gaging-station discharge and basin characteristics were determined by multiple regression analysis, and weighted by effective years of record. The State was divided into five hydrologically similar regions to refine the regression equations. The regression analysis indicated that flood discharge, as determined by the streamflow-gaging-station annual peak flows, is related to the drainage area, main channel slope, percentage of lake and wetland areas in the basin, population density, and the flood-frequency region, at the 95-percent confidence level. The standard errors of estimate for the various recurrence-interval floods ranged from 48.1 to 62.7 percent. Annual-maximum peak flows observed at streamflow-gaging stations through water year 2007 and basin characteristics determined using geographic information system techniques for 254 streamflow-gaging stations were used for the regression analysis. Drainage areas of the streamflow-gaging stations range from 0.18 to 779 mi². Peak-flow data and basin characteristics for 191 streamflow-gaging stations located in New Jersey were used, along with peak-flow data for stations located in adjoining States, including 25 stations in Pennsylvania, 17 stations in New York, 16 stations in Delaware, and 5 stations in Maryland. Streamflow records for selected stations outside of New Jersey were included in the present study because hydrologic, physiographic, and geologic boundaries commonly extend beyond political boundaries. The StreamStats web application was developed
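
    Regional regression equations of this kind typically take a power-law form, Q_T = a·A^b·S^c·…, fitted as a linear regression in log space; a minimal sketch on synthetic data (the gauge records and coefficients are invented, and real USGS equations use generalized least squares with more basin characteristics):

```python
# Sketch of a regional flood-frequency regression: Q100 = a * A^b * S^c,
# fitted by ordinary least squares in log10 space. The gauge "records"
# below are synthetic; real equations use generalized least squares
# and additional basin characteristics.
import math

# (drainage area mi^2, channel slope ft/mi, observed Q100 cfs) - synthetic
records = [(5.0, 60.0, 900.0), (40.0, 25.0, 4200.0),
           (120.0, 35.0, 9500.0), (300.0, 10.0, 18500.0),
           (700.0, 8.0, 33000.0)]

rows = [(1.0, math.log10(A), math.log10(S)) for A, S, _ in records]
y = [math.log10(Q) for _, _, Q in records]

def solve3(X, y):
    # Gauss-Jordan on the 3x3 normal equations X^T X beta = X^T y.
    G = [[sum(r[i] * r[j] for r in X) for j in range(3)]
         + [sum(r[i] * yi for r, yi in zip(X, y))] for i in range(3)]
    for i in range(3):
        p = G[i][i]
        G[i] = [v / p for v in G[i]]
        for k in range(3):
            if k != i:
                G[k] = [vk - G[k][i] * vi for vk, vi in zip(G[k], G[i])]
    return [G[i][3] for i in range(3)]

c0, b, c = solve3(rows, y)

def q100(A, S):
    return 10 ** (c0 + b * math.log10(A) + c * math.log10(S))

print(f"Q100 for A=75 mi^2, S=15 ft/mi: {q100(75.0, 15.0):.0f} cfs")
```

    The reported standard errors of estimate (48-63 percent) describe the scatter of observed quantiles around exactly this kind of fitted surface.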

  16. Estimation of the laser cutting operating cost by support vector regression methodology

    Science.gov (United States)

    Jović, Srđan; Radović, Aleksandar; Šarkoćević, Živče; Petković, Dalibor; Alizamir, Meysam

    2016-09-01

    Laser cutting is a popular manufacturing process utilized to cut various types of materials economically. The operating cost is affected by laser power, cutting speed, assist gas pressure, nozzle diameter and focus point position as well as the workpiece material. In this article, the process factors investigated were: laser power, cutting speed, air pressure and focal point position. The aim of this work is to relate the operating cost to the process parameters mentioned above. CO2 laser cutting of stainless steel of medical grade AISI316L has been investigated. The main goal was to analyze the operating cost through the laser power, cutting speed, air pressure, focal point position and material thickness. Since estimating the laser operating cost is a complex, non-linear task, soft computing optimization algorithms can be used. The intelligent soft computing scheme support vector regression (SVR) was implemented. The performance of the proposed estimator was confirmed with the simulation results. The SVR results are then compared with artificial neural network and genetic programming. According to the results, a greater improvement in estimation accuracy can be achieved through the SVR compared to other soft computing methodologies. The new optimization methods benefit from the soft computing capabilities of global optimization and multiobjective optimization rather than choosing a starting point by trial and error and combining multiple criteria into a single criterion.
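
    A minimal sketch of the regression setup, mapping (power, speed, pressure, focus) to cost: to stay dependency-free this uses RBF kernel ridge regression, a close cousin of SVR that shares the same kernel machinery, rather than an SVR solver, and the training data are synthetic:

```python
# RBF kernel ridge regression as a lightweight stand-in for SVR:
# cost = f(laser power, cutting speed, air pressure, focal position).
# Training data are synthetic; a real study would use measured costs.
import math

def rbf(a, b, gamma=5.0):
    return math.exp(-gamma * sum((x - y) ** 2 for x, y in zip(a, b)))

def fit(X, y, lam=1e-3):
    n = len(X)
    K = [[rbf(X[i], X[j]) + (lam if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    # Solve (K + lam I) alpha = y by Gauss-Jordan elimination.
    A = [row[:] + [yi] for row, yi in zip(K, y)]
    for i in range(n):
        p = A[i][i]
        A[i] = [v / p for v in A[i]]
        for k in range(n):
            if k != i:
                A[k] = [vk - A[k][i] * vi for vk, vi in zip(A[k], A[i])]
    return [A[i][n] for i in range(n)]

def predict(X, alpha, x):
    return sum(a * rbf(xi, x) for a, xi in zip(alpha, X))

# Features scaled to ~[0, 1]: (power, speed, pressure, focus); target: cost.
X = [(0.2, 0.3, 0.5, 0.4), (0.5, 0.5, 0.5, 0.5), (0.8, 0.4, 0.6, 0.5),
     (0.6, 0.8, 0.4, 0.6), (0.3, 0.6, 0.7, 0.3)]
y = [1.1, 1.6, 2.4, 1.9, 1.3]

alpha = fit(X, y)
print(f"predicted cost: {predict(X, alpha, (0.5, 0.5, 0.5, 0.5)):.2f}")
```

    An SVR adds an epsilon-insensitive loss and support-vector sparsity on top of this kernel formulation; the feature-to-cost mapping and the need to scale inputs are the same.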

  17. Using GNSS for Assessment Recent Sea Level Rise in the Northwestern Part of the Arabian Gulf

    Science.gov (United States)

    Alothman, A. O.; Bos, M. S.; Fernandes, R.

    2017-12-01

    Due to the global warming acting on the planet in the 21st century, the associated sea level rise is predicted to reach 30 to 60 cm in some regions. Sea level monitoring is important for the Kingdom of Saudi Arabia, since it is bounded by a very long coast, about 3400 km in length, and hundreds of isolated islands. The eastern coastline of KSA, on the Arabian Gulf, needs long-term monitoring due to the low-lying nature of the region. The ongoing oil withdrawal activities in the area may also affect regional sea level rise, and the tectonic structure of the Arabian Peninsula is a further factor. The regional relative sea level on the eastern coast of Saudi Arabia has previously been estimated from more than 28 years of tide gauge data together with the vertical displacement of permanent Global Navigation Satellite System (GNSS) stations spanning only about 3 years. In this paper, we discuss and update the methodology and results of Alothman et al. (2014), particularly by checking and extending the GNSS solutions. Since 3 of the 6 GPS stations used only started observing at the end of 2011, the longer time series now have significantly lower uncertainties in the estimated vertical rate. A longer span of GNSS observations was included, 500 synthetic time series were generated, and seasonal signals were analysed. It is concluded that the varying seasonal signal present in the GNSS time series causes an underestimation of 0.1 mm/yr for short time series of 3 years. In addition to the implications of using short time series to estimate vertical land motion, we found that the problem is aggravated if varying seasonal signals are present in the data. This finding can be useful for other studies analyzing short GNSS time series.
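
    Vertical rates of this kind are usually obtained by fitting a linear trend plus annual harmonics to the daily height series; a minimal least-squares sketch on synthetic data (the rate, seasonal amplitude and noise level are invented):

```python
# Fit h(t) = a + b*t + c*sin(2*pi*t) + d*cos(2*pi*t) to a synthetic
# GNSS height series by ordinary least squares (normal equations).
# Trend, seasonal amplitude and noise level are invented values.
import math
import random

random.seed(1)
t = [i / 365.0 for i in range(6 * 365)]  # 6 years, daily, in years
TRUE_RATE = -1.2                          # mm/yr subsidence (invented)
h = [TRUE_RATE * ti + 3.0 * math.sin(2 * math.pi * ti)
     + random.gauss(0.0, 2.0) for ti in t]

cols = [[1.0] * len(t), t,
        [math.sin(2 * math.pi * ti) for ti in t],
        [math.cos(2 * math.pi * ti) for ti in t]]

# Normal equations (design^T design) beta = design^T h, Gauss-Jordan solve.
m = len(cols)
A = [[sum(cols[i][k] * cols[j][k] for k in range(len(t))) for j in range(m)]
     + [sum(cols[i][k] * h[k] for k in range(len(t)))] for i in range(m)]
for i in range(m):
    p = A[i][i]
    A[i] = [v / p for v in A[i]]
    for k in range(m):
        if k != i:
            A[k] = [vk - A[k][i] * vi for vk, vi in zip(A[k], A[i])]
print(f"estimated vertical rate: {A[1][m]:.2f} mm/yr")
```

    With only ~3 years of data the trend and the annual harmonics are poorly separated, which is the mechanism behind the ~0.1 mm/yr bias the paper reports for short series.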

  18. Approximate analysis of high-rise frames with flexible connections

    NARCIS (Netherlands)

    Hoenderkamp, J.C.D.; Snijder, H.H.

    2000-01-01

    An approximate hand method for estimating horizontal deflections in high-rise steel frames with flexible beam–column connections subjected to horizontal loading is presented. The method is developed from the continuous medium theory for coupled walls which is expressed in non-dimensional structural

  19. Comparison of regression coefficient and GIS-based methodologies for regional estimates of forest soil carbon stocks

    International Nuclear Information System (INIS)

    Elliott Campbell, J.; Moen, Jeremie C.; Ney, Richard A.; Schnoor, Jerald L.

    2008-01-01

    Estimates of forest soil organic carbon (SOC) have applications in carbon science, soil quality studies, carbon sequestration technologies, and carbon trading. Forest SOC has been modeled using a regression coefficient methodology that applies mean SOC densities (mass/area) to broad forest regions. A higher resolution model is based on an approach that employs a geographic information system (GIS) with soil databases and satellite-derived landcover images. Despite this advancement, the regression approach remains the basis of current state and federal level greenhouse gas inventories. Both approaches are analyzed in detail for Wisconsin forest soils from 1983 to 2001, applying rigorous error-fixing algorithms to soil databases. Resulting SOC stock estimates are 20% larger when determined using the GIS method rather than the regression approach. Average annual rates of increase in SOC stocks are 3.6 and 1.0 million metric tons of carbon per year for the GIS and regression approaches respectively. - Large differences in estimates of soil organic carbon stocks and annual changes in stocks for Wisconsin forestlands indicate a need for validation from forthcoming forest surveys
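
    The two estimation approaches differ only in how SOC densities are aggregated over the forest area; a toy comparison (the areas, densities and mean coefficient are invented, not Wisconsin values):

```python
# Toy comparison of the two SOC stock approaches:
# regression-coefficient: one mean density applied to the whole forest area;
# GIS-based: per-polygon soil-map densities times landcover areas.
# All areas (Mha) and densities (t C/ha) are invented.

polygons = [  # (forest area Mha, SOC density t C/ha) from a mock overlay
    (2.1, 110.0),  # wet lowland forest soils
    (3.4, 65.0),   # sandy upland soils
    (1.5, 140.0),  # organic-rich soils
]

gis_stock = sum(a * d for a, d in polygons)  # Mha * t/ha = Mt C
total_area = sum(a for a, _ in polygons)
MEAN_DENSITY = 90.0  # regional regression coefficient, t C/ha (invented)
regression_stock = total_area * MEAN_DENSITY

print(f"GIS-based stock:        {gis_stock:.0f} Mt C")
print(f"Regression-based stock: {regression_stock:.0f} Mt C")
```

    Whenever high-density soils are over-represented within the forested area, the single mean coefficient underestimates the stock, which is the direction of the 20% gap reported for Wisconsin.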

  20. Future sea level rise constrained by observations and long-term commitment

    Science.gov (United States)

    Mengel, Matthias; Levermann, Anders; Frieler, Katja; Robinson, Alexander; Marzeion, Ben; Winkelmann, Ricarda

    2016-01-01

    Sea level has been steadily rising over the past century, predominantly due to anthropogenic climate change. The rate of sea level rise will keep increasing with continued global warming, and, even if temperatures are stabilized through the phasing out of greenhouse gas emissions, sea level is still expected to rise for centuries. This will affect coastal areas worldwide, and robust projections are needed to assess mitigation options and guide adaptation measures. Here we combine the equilibrium response of the main sea level rise contributions with their last century's observed contribution to constrain projections of future sea level rise. Our model is calibrated to a set of observations for each contribution, and the observational and climate uncertainties are combined to produce uncertainty ranges for 21st century sea level rise. We project anthropogenic sea level rise of 28–56 cm, 37–77 cm, and 57–131 cm in 2100 for the greenhouse gas concentration scenarios RCP26, RCP45, and RCP85, respectively. Our uncertainty ranges for total sea level rise overlap with the process-based estimates of the Intergovernmental Panel on Climate Change. The “constrained extrapolation” approach generalizes earlier global semiempirical models and may therefore lead to a better understanding of the discrepancies with process-based projections. PMID:26903648
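
    Combining independent contribution ranges into a total uncertainty range is commonly done by Monte-Carlo sampling; a sketch under our simplifying assumption of independent normal contributions, loosely scaled to mid-scenario magnitudes rather than taken from the paper's calibrated distributions:

```python
# Monte-Carlo combination of sea level rise contributions into a total
# 2100 range. Contribution means and spreads are loose illustrative
# values, not the calibrated distributions of the paper.
import random

random.seed(7)
# (mean cm, 1-sigma cm) per contribution, illustrative mid-scenario values
contributions = {
    "thermal expansion": (24.0, 5.0),
    "glaciers":          (13.0, 3.0),
    "Greenland":         (10.0, 5.0),
    "Antarctica":        (8.0, 6.0),
}

totals = sorted(
    sum(random.gauss(mu, sd) for mu, sd in contributions.values())
    for _ in range(20000)
)
lo = totals[int(0.05 * len(totals))]
hi = totals[int(0.95 * len(totals))]
print(f"total 2100 rise, 5-95%: {lo:.0f}-{hi:.0f} cm")
```

    With these illustrative inputs the 5-95% range comes out broadly comparable to the 37-77 cm quoted above for RCP45; the paper's constrained-extrapolation method additionally ties each contribution to its observed historical response.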

  2. Analysis of offsite dose calculation methodology for a nuclear power reactor

    International Nuclear Information System (INIS)

    Moser, D.M.

    1995-01-01

    This technical study reviews the methodology for calculating offsite dose estimates as described in the offsite dose calculation manual (ODCM) for the Pennsylvania Power and Light Susquehanna Steam Electric Station (SSES). An evaluation of the SSES ODCM dose assessment methodology indicates that it conforms with the methodology accepted by the US Nuclear Regulatory Commission (NRC). Using 1993 SSES effluent data, dose estimates calculated by hand according to the SSES ODCM methodology are compared with the reported 1993 dose estimates produced by the ODCM and its associated computer model. The 1993 SSES dose estimates are based on the recommendations of Publication 2 of the International Commission on Radiological Protection (ICRP). SSES dose estimates based on the recommendations of ICRP Publications 26 and 30 reveal the total body estimates to be the most affected

  3. Diagnostics from three rising submillimeter bursts

    International Nuclear Information System (INIS)

    Zhou, Ai-Hua; Li, Jian-Ping; Wang, Xin-Dong

    2016-01-01

    In this paper we investigate three novel rising submillimeter (THz) bursts that occurred sequentially in Super Active Region NOAA 10486. The average rising rate of the flux density above 200 GHz is only 20 sfu GHz⁻¹ (corresponding to a spectral index α of 1.6) for the THz spectral components of the 2003 October 28 and November 4 bursts, but it attained 235 sfu GHz⁻¹ (α = 4.8) in the 2003 November 2 burst. The steeply rising THz spectrum can be produced by a population of highly relativistic electrons with a low-energy cutoff of 1 MeV, whereas a low-energy cutoff of only 30 keV is required for the two slowly rising THz bursts, via gyrosynchrotron (GS) radiation, based on our numerical simulations of burst spectra in the magnetic dipole field case. The electron density variation is much larger in the THz source than in the microwave (MW) source. It is interesting that the THz source radius decreased by 20%–50% during the decay phase for the three events, but the MW source increased by 28% for the 2003 November 2 event. In the paper we present a formula that can be used to calculate the energy released by ultrarelativistic electrons, taking the relativistic correction into account for the first time. We find that the energy released by energetic electrons in the THz source exceeds that in the MW source due to the strong GS radiation loss in the THz range, although the modeled THz source area is 3–4 orders of magnitude smaller than the modeled MW source area. The total energies released by energetic electrons via GS radiation in the radio sources are estimated to be 5.2 × 10³³, 3.9 × 10³³ and 3.7 × 10³² erg for the October 28, November 2 and 4 bursts, respectively, which are 131, 76 and 4 times as large as the thermal energies of 2.9 × 10³¹, 2.1 × 10³¹ and 5.2 × 10³¹ erg estimated from soft X-ray GOES observations. (paper)
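    The spectral indices quoted above follow from a power-law fit S ∝ ν^α to the rising part of the spectrum; a minimal two-point estimate, with invented flux densities, looks like this:

```python
import math

def spectral_index(nu1, s1, nu2, s2):
    """Power-law spectral index alpha from two (frequency, flux) points,
    assuming S proportional to nu**alpha."""
    return math.log(s2 / s1) / math.log(nu2 / nu1)

# Hypothetical flux densities on a rising THz spectrum:
# 1000 sfu at 200 GHz, 5000 sfu at 400 GHz
alpha = spectral_index(200.0, 1000.0, 400.0, 5000.0)
print(round(alpha, 2))
```

    In practice more than two frequencies are available and α is obtained by a least-squares fit, but the two-point form shows what the index measures.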

  4. Development and validation of a CFD based methodology to estimate the pressure loss of flow through perforated plates

    International Nuclear Information System (INIS)

    Barros Filho, Jose A.; Navarro, Moyses A.; Santos, Andre A.C. dos; Jordao, E.

    2011-01-01

    In spite of the recent great development of Computational Fluid Dynamics (CFD), there are still open questions about how to assess its accuracy. This work presents the validation of a CFD methodology devised to estimate the pressure drop of water flow through perforated plates similar to the ones used in some reactor core components. This was accomplished by comparing the results of CFD simulations against experimental data from 5 perforated plates with different geometric characteristics. The proposed methodology correlates the experimental data within a range of ±7.5%. The validation procedure recommended by the ASME Standard for Verification and Validation in Computational Fluid Dynamics and Heat Transfer (V&V 20) is also evaluated; the conclusion is that it is not adequate for this specific use. (author)
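    A hedged sketch of the quantity being validated: the plate pressure drop expressed through a loss coefficient K (Δp = K·ρv²/2), with a check against the ±7.5% band reported above. The coefficient, density, velocity and measurement below are invented numbers, not the paper's data.

```python
def pressure_drop(K, rho, v):
    """Irreversible pressure loss dp = K * rho * v^2 / 2 (Pa)."""
    return K * rho * v * v / 2.0

def within_band(predicted, measured, band=0.075):
    """True if predicted lies within +/- band of measured
    (the +/-7.5% correlation band reported in the abstract)."""
    return abs(predicted - measured) <= band * measured

dp_cfd = pressure_drop(K=2.1, rho=998.0, v=3.0)  # hypothetical plate
dp_exp = 9200.0                                   # hypothetical measurement, Pa
print(round(dp_cfd, 1), within_band(dp_cfd, dp_exp))
```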

  5. Para-Quantitative Methodology: Reclaiming Experimentalism in Educational Research

    Science.gov (United States)

    Shabani Varaki, Bakhtiar; Floden, Robert E.; Javidi Kalatehjafarabadi, Tahereh

    2015-01-01

    This article focuses on criticisms of current approaches in educational research methodology. It summarizes rationales for mixed methods and argues that mixing the quantitative and qualitative paradigms is problematic on practical and philosophical grounds. It is also indicated that the current rise of mixed methods work has…

  6. Coastal sea level rise with warming above 2 °C.

    Science.gov (United States)

    Jevrejeva, Svetlana; Jackson, Luke P; Riva, Riccardo E M; Grinsted, Aslak; Moore, John C

    2016-11-22

    Two degrees of global warming above the preindustrial level is widely suggested as an appropriate threshold beyond which climate change risks become unacceptably high. This "2 °C" threshold is likely to be reached between 2040 and 2050 for both Representative Concentration Pathway (RCP) 8.5 and 4.5. Resulting sea level rises will not be globally uniform, due to ocean dynamical processes and changes in gravity associated with water mass redistribution. Here we provide probabilistic sea level rise projections for the global coastline with warming above the 2 °C goal. By 2040, with a 2 °C warming under the RCP8.5 scenario, more than 90% of coastal areas will experience sea level rise exceeding the global estimate of 0.2 m, with up to 0.4 m expected along the Atlantic coast of North America and Norway. With a 5 °C rise by 2100, sea level will rise rapidly, reaching 0.9 m (median), and 80% of the coastline will exceed the global sea level rise at the 95th percentile upper limit of 1.8 m. Under RCP8.5, by 2100, New York may expect rises of 1.09 m, Guangzhou may expect rises of 0.91 m, and Lagos may expect rises of 0.90 m, with the 95th percentile upper limit of 2.24 m, 1.93 m, and 1.92 m, respectively. The coastal communities of rapidly expanding cities in the developing world, and vulnerable tropical coastal ecosystems, will have a very limited time after midcentury to adapt to sea level rises unprecedented since the dawn of the Bronze Age.
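    Probabilistic projections of this kind reduce, at a given location, to reading percentiles off an ensemble of possible rises. A minimal sketch with a synthetic ensemble (the distribution is invented, not the study's):

```python
import random

def percentile(samples, q):
    """Empirical q-quantile (0 <= q <= 1) by nearest rank; minimal sketch."""
    s = sorted(samples)
    idx = min(len(s) - 1, int(q * len(s)))
    return s[idx]

random.seed(0)
# Hypothetical ensemble of local rise by 2100 (m); spread is illustrative
samples = [abs(random.gauss(0.9, 0.35)) for _ in range(10000)]
med = percentile(samples, 0.5)
p95 = percentile(samples, 0.95)
print(round(med, 2), round(p95, 2))
```

    The study's "median" and "95th percentile upper limit" figures are exactly such quantiles, taken from physically constructed ensembles rather than a Gaussian.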

  7. Model and Algorithm for Substantiating Solutions for Organization of High-Rise Construction Project

    Directory of Open Access Journals (Sweden)

    Anisimov Vladimir

    2018-01-01

    Full Text Available In this paper, models and an algorithm are developed for forming an optimal plan for organizing the material and logistical processes of a high-rise construction project and their financial support. The model represents the optimization procedure as a non-linear discrete programming problem: minimizing the execution time of a set of interrelated works performed by a limited number of partially interchangeable performers, subject to a limit on the total cost of performing the work. The proposed model and algorithm are the basis for creating specific organization management methodologies for high-rise construction projects.

  8. Model and Algorithm for Substantiating Solutions for Organization of High-Rise Construction Project

    Science.gov (United States)

    Anisimov, Vladimir; Anisimov, Evgeniy; Chernysh, Anatoliy

    2018-03-01

    In this paper, models and an algorithm are developed for forming an optimal plan for organizing the material and logistical processes of a high-rise construction project and their financial support. The model represents the optimization procedure as a non-linear discrete programming problem: minimizing the execution time of a set of interrelated works performed by a limited number of partially interchangeable performers, subject to a limit on the total cost of performing the work. The proposed model and algorithm are the basis for creating specific organization management methodologies for high-rise construction projects.
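    The optimization problem described, stripped of the interdependencies between works, can be illustrated by brute force: choose one performer option per work so as to minimize total duration under a cost cap. The jobs, durations and costs are invented, and the sequential simplification is ours, not the authors' algorithm:

```python
from itertools import product

# Toy instance: pick one (duration_days, cost) option per job to
# minimize total duration subject to a total-cost cap. All numbers
# are invented for illustration.
jobs = ["excavation", "framing", "glazing"]
options = {
    "excavation": [(10, 50), (12, 40)],
    "framing":    [(20, 90), (25, 70)],
    "glazing":    [(15, 60)],
}
COST_CAP = 190

best = None
for choice in product(*(range(len(options[j])) for j in jobs)):
    dur = sum(options[j][k][0] for j, k in zip(jobs, choice))
    cost = sum(options[j][k][1] for j, k in zip(jobs, choice))
    if cost <= COST_CAP and (best is None or dur < best[0]):
        best = (dur, cost, choice)
print(best)
```

    Real instances add precedence constraints and parallel execution, which is what makes the problem non-linear and motivates a dedicated algorithm rather than enumeration.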

  9. GRACE Detected Rise of Groundwater in the Sahelian Niger River Basin

    Science.gov (United States)

    Werth, S.; White, D.; Bliss, D. W.

    2017-12-01

    West African regions along the Niger River experience climate and land cover changes that affect hydrological processes and thereby the distribution of fresh water resources (WR). This study investigates long-term changes in terrestrial water storage (TWS) of the Niger River basin and its subregions by analyzing a decade of satellite gravity data from the Gravity Recovery and Climate Experiment (GRACE) mission. The location of large trends in TWS maps of differently processed GRACE solutions points to rising groundwater stocks. Soil moisture data from a global land surface model allow the significantly increasing WR to be separated from other TWS variations. Surface water variations from a global water storage model, validated against altimetry observations, were applied to estimate the groundwater component of WR. For the whole Niger basin, the rise in groundwater stocks is estimated to be 93 ± 61 km³ between January 2003 and December 2013. A careful analysis of uncertainties in all data sets supports the significance of the groundwater rise. Our results confirm previous observations of rising water tables, indicating that effects of land cover changes on groundwater storage are relevant at basin scales. Areas with rising water storage hold a valuable reserve to mitigate possible future droughts and to deliver water to remote areas, which has implications for Niger water management strategies. Increasing groundwater recharge may, however, be accompanied by reduced water quality; this study helps to inform authorities' decisions to mitigate such negative impacts on local communities.
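    The storage separation at the heart of this estimate is a residual calculation, GW = TWS − SM − SW, with independent uncertainties combined in quadrature. A sketch with illustrative numbers (not the study's inputs):

```python
import math

def groundwater_change(tws, sm, sw):
    """Residual groundwater change: GW = TWS - SM - SW.
    Each argument is (value_km3, one_sigma_km3); independent
    uncertainties combine in quadrature. Numbers in the call
    below are illustrative, not the study's inputs."""
    v = tws[0] - sm[0] - sw[0]
    s = math.sqrt(tws[1] ** 2 + sm[1] ** 2 + sw[1] ** 2)
    return v, s

gw, sigma = groundwater_change(tws=(120.0, 40.0), sm=(20.0, 30.0), sw=(7.0, 30.0))
print(f"{gw:.0f} +/- {sigma:.0f} km3")
```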

  10. Hydrologic evaluation methodology for estimating water movement through the unsaturated zone at commercial low-level radioactive waste disposal sites

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, P.D.; Rockhold, M.L.; Nichols, W.E.; Gee, G.W. [Pacific Northwest Lab., Richland, WA (United States)

    1996-01-01

    This report identifies key technical issues related to hydrologic assessment of water flow in the unsaturated zone at low-level radioactive waste (LLW) disposal facilities. In addition, a methodology for incorporating these issues in the performance assessment of proposed LLW disposal facilities is identified and evaluated. The issues discussed fall into four areas: estimating the water balance at a site (i.e., infiltration, runoff, water storage, evapotranspiration, and recharge); analyzing the hydrologic performance of engineered components of a facility; evaluating the application of models to the prediction of facility performance; and estimating the uncertainty in predicted facility performance. To illustrate the application of the methodology, two examples are presented. The first example is of a below ground vault located in a humid environment. The second example looks at a shallow land burial facility located in an arid environment. The examples utilize actual site-specific data and realistic facility designs. The two examples illustrate the issues unique to humid and arid sites as well as the issues common to all LLW sites. Strategies for addressing the analytical difficulties arising in any complex hydrologic evaluation of the unsaturated zone are demonstrated.
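    The site water balance listed first reduces to a residual estimate of recharge: whatever precipitation is not lost to runoff, evapotranspiration, or storage change percolates downward. A minimal sketch with invented humid-site values:

```python
def recharge(precip, runoff, evapotranspiration, delta_storage):
    """Residual water-balance estimate of recharge (all terms mm/yr):
    R = P - RO - ET - dS. Values in the call below are illustrative,
    not site data from the report."""
    return precip - runoff - evapotranspiration - delta_storage

# Humid-site flavour: high precipitation, high evapotranspiration
print(recharge(precip=1100.0, runoff=180.0,
               evapotranspiration=700.0, delta_storage=20.0))
```

    At an arid site the same balance yields a near-zero residual that is smaller than the uncertainty of its terms, which is one reason the report treats the two environments as distinct cases.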

  11. Hydrologic evaluation methodology for estimating water movement through the unsaturated zone at commercial low-level radioactive waste disposal sites

    International Nuclear Information System (INIS)

    Meyer, P.D.; Rockhold, M.L.; Nichols, W.E.; Gee, G.W.

    1996-01-01

    This report identifies key technical issues related to hydrologic assessment of water flow in the unsaturated zone at low-level radioactive waste (LLW) disposal facilities. In addition, a methodology for incorporating these issues in the performance assessment of proposed LLW disposal facilities is identified and evaluated. The issues discussed fall into four areas: estimating the water balance at a site (i.e., infiltration, runoff, water storage, evapotranspiration, and recharge); analyzing the hydrologic performance of engineered components of a facility; evaluating the application of models to the prediction of facility performance; and estimating the uncertainty in predicted facility performance. To illustrate the application of the methodology, two examples are presented. The first example is of a below ground vault located in a humid environment. The second example looks at a shallow land burial facility located in an arid environment. The examples utilize actual site-specific data and realistic facility designs. The two examples illustrate the issues unique to humid and arid sites as well as the issues common to all LLW sites. Strategies for addressing the analytical difficulties arising in any complex hydrologic evaluation of the unsaturated zone are demonstrated

  12. A methodology for modeling photocatalytic reactors for indoor pollution control using previously estimated kinetic parameters

    Energy Technology Data Exchange (ETDEWEB)

    Passalia, Claudio; Alfano, Orlando M. [INTEC - Instituto de Desarrollo Tecnologico para la Industria Quimica, CONICET - UNL, Gueemes 3450, 3000 Santa Fe (Argentina); FICH - Departamento de Medio Ambiente, Facultad de Ingenieria y Ciencias Hidricas, Universidad Nacional del Litoral, Ciudad Universitaria, 3000 Santa Fe (Argentina); Brandi, Rodolfo J., E-mail: rbrandi@santafe-conicet.gov.ar [INTEC - Instituto de Desarrollo Tecnologico para la Industria Quimica, CONICET - UNL, Gueemes 3450, 3000 Santa Fe (Argentina); FICH - Departamento de Medio Ambiente, Facultad de Ingenieria y Ciencias Hidricas, Universidad Nacional del Litoral, Ciudad Universitaria, 3000 Santa Fe (Argentina)

    2012-04-15

    Highlights: ► Indoor pollution control via photocatalytic reactors. ► Scaling-up methodology based on previously determined mechanistic kinetics. ► Radiation interchange model between catalytic walls using configuration factors. ► Modeling and experimental validation of a complex geometry photocatalytic reactor. - Abstract: A methodology for modeling photocatalytic reactors for their application in indoor air pollution control is carried out. The methodology implies, firstly, the determination of intrinsic reaction kinetics for the removal of formaldehyde. This is achieved by means of a simple-geometry continuous reactor operating under a kinetic control regime and steady state. The kinetic parameters were estimated from experimental data by means of a nonlinear optimization algorithm. The second step was the application of the obtained kinetic parameters to a very different photoreactor configuration. In this case, the reactor is a corrugated wall type using nanosize TiO₂ as catalyst, irradiated by UV lamps that provided a spatially uniform radiation field. The radiative transfer within the reactor was modeled through a superficial emission model for the lamps, the ray tracing method and the computation of view factors. The velocity and concentration fields were evaluated by means of a commercial CFD tool (Fluent 12), into which the radiation model was introduced externally. The results of the model were compared with experiments in a corrugated-wall, bench-scale reactor constructed in the laboratory. The overall pollutant conversion showed good agreement between model predictions and experiments, with a root mean square error of less than 4%.
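    The first step, estimating kinetic parameters from simple-geometry reactor data by nonlinear optimization, can be sketched as follows. A Langmuir-Hinshelwood rate form and a coarse grid search are assumed here for illustration; they stand in for, and are not, the authors' kinetic model and optimization algorithm.

```python
# Minimal parameter-estimation sketch: fit rate parameters (k, K) of an
# assumed Langmuir-Hinshelwood form r = k*K*C / (1 + K*C) to data by
# least squares, using a coarse grid search in place of a proper
# nonlinear optimizer.

def rate(c, k, K):
    return k * K * c / (1.0 + K * c)

# Synthetic "measurements" generated from known parameters k=2.0, K=0.5
conc = [0.5, 1.0, 2.0, 4.0, 8.0]
obs = [rate(c, 2.0, 0.5) for c in conc]

best = None
for k10 in range(1, 51):          # k in 0.1 .. 5.0
    for K10 in range(1, 51):      # K in 0.1 .. 5.0
        k, K = k10 / 10.0, K10 / 10.0
        sse = sum((rate(c, k, K) - o) ** 2 for c, o in zip(conc, obs))
        if best is None or sse < best[0]:
            best = (sse, k, K)
print(best[1], best[2])
```

    With noisy data a gradient-based or Levenberg-Marquardt optimizer replaces the grid, but the objective (sum of squared residuals over the kinetic-control data) is the same.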

  13. Chapter three: methodology of exposure modeling

    CSIR Research Space (South Africa)

    Moschandreas, DJ

    2002-12-01

    Full Text Available methodologies and models are reviewed. Three exposure/measurement methodologies are assessed. Estimation methods focus on source evaluation and attribution, sources include those outdoors and indoors as well as in occupational and in-transit environments. Fate...

  14. Resolving the Antarctic contribution to sea-level rise: a hierarchical modelling framework.

    Science.gov (United States)

    Zammit-Mangion, Andrew; Rougier, Jonathan; Bamber, Jonathan; Schön, Nana

    2014-06-01

    Determining the Antarctic contribution to sea-level rise from observational data is a complex problem. The number of physical processes involved (such as ice dynamics and surface climate) exceeds the number of observables, some of which have very poor spatial definition. This has led, in general, to solutions that utilise strong prior assumptions or physically based deterministic models to simplify the problem. Here, we present a new approach for estimating the Antarctic contribution, which only incorporates descriptive aspects of the physically based models in the analysis and in a statistical manner. By combining physical insights with modern spatial statistical modelling techniques, we are able to provide probability distributions on all processes deemed to play a role in both the observed data and the contribution to sea-level rise. Specifically, we use stochastic partial differential equations and their relation to geostatistical fields to capture our physical understanding and employ a Gaussian Markov random field approach for efficient computation. The method, an instantiation of Bayesian hierarchical modelling, naturally incorporates uncertainty in order to reveal credible intervals on all estimated quantities. The estimated sea-level rise contribution using this approach corroborates those found using a statistically independent method. © 2013 The Authors. Environmetrics Published by John Wiley & Sons, Ltd.
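    The building block of such Bayesian updating can be illustrated with the conjugate normal-normal case: a precision-weighted combination of a prior and an observation, yielding a posterior mean, standard deviation and credible interval. The numbers are invented, and this one-dimensional update is far simpler than the paper's spatial Gaussian Markov random field machinery:

```python
import math

def normal_update(prior_mu, prior_sd, obs_mu, obs_sd):
    """Conjugate normal-normal Bayesian update (precision weighting)."""
    w1, w2 = 1.0 / prior_sd ** 2, 1.0 / obs_sd ** 2
    mu = (w1 * prior_mu + w2 * obs_mu) / (w1 + w2)
    sd = math.sqrt(1.0 / (w1 + w2))
    return mu, sd

# Hypothetical prior and observation for a single process trend (mm/yr)
mu, sd = normal_update(prior_mu=0.0, prior_sd=1.0, obs_mu=0.6, obs_sd=0.5)
lo, hi = mu - 1.96 * sd, mu + 1.96 * sd   # ~95% credible interval
print(round(mu, 3), round(sd, 3))
```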

  15. Resolving the Antarctic contribution to sea-level rise: a hierarchical modelling framework†

    Science.gov (United States)

    Zammit-Mangion, Andrew; Rougier, Jonathan; Bamber, Jonathan; Schön, Nana

    2014-01-01

    Determining the Antarctic contribution to sea-level rise from observational data is a complex problem. The number of physical processes involved (such as ice dynamics and surface climate) exceeds the number of observables, some of which have very poor spatial definition. This has led, in general, to solutions that utilise strong prior assumptions or physically based deterministic models to simplify the problem. Here, we present a new approach for estimating the Antarctic contribution, which only incorporates descriptive aspects of the physically based models in the analysis and in a statistical manner. By combining physical insights with modern spatial statistical modelling techniques, we are able to provide probability distributions on all processes deemed to play a role in both the observed data and the contribution to sea-level rise. Specifically, we use stochastic partial differential equations and their relation to geostatistical fields to capture our physical understanding and employ a Gaussian Markov random field approach for efficient computation. The method, an instantiation of Bayesian hierarchical modelling, naturally incorporates uncertainty in order to reveal credible intervals on all estimated quantities. The estimated sea-level rise contribution using this approach corroborates those found using a statistically independent method. © 2013 The Authors. Environmetrics Published by John Wiley & Sons, Ltd. PMID:25505370

  16. Validation of a physical anthropology methodology using mandibles for gender estimation in a Brazilian population

    Science.gov (United States)

    CARVALHO, Suzana Papile Maciel; BRITO, Liz Magalhães; de PAIVA, Luiz Airton Saavedra; BICUDO, Lucilene Arilho Ribeiro; CROSATO, Edgard Michel; de OLIVEIRA, Rogério Nogueira

    2013-01-01

    Validation studies of physical anthropology methods in different population groups are extremely important, especially in cases in which population variations may cause problems in the identification of a native individual by the application of norms developed for different communities. Objective This study aimed to estimate the gender of skeletons by application of the method of Oliveira, et al. (1995), previously used in a population sample from Northeast Brazil. Material and Methods The accuracy of this method was assessed for a population from Southeast Brazil and validated by statistical tests. The method used two mandibular measurements, namely the bigonial distance and the mandibular ramus height. The sample was composed of 66 skulls and the method was applied by two examiners. The results were statistically analyzed by the paired t test, logistic discriminant analysis and logistic regression. Results The results demonstrated that the application of the method of Oliveira, et al. (1995) in this population achieved very different outcomes between genders, with 100% for females and only 11% for males, which may be explained by ethnic differences. However, statistical adjustment of measurement data for the population analyzed allowed accuracy of 76.47% for males and 78.13% for females, with the creation of a new discriminant formula. Conclusion It was concluded that methods involving physical anthropology present a high rate of accuracy for human identification, easy application, low cost and simplicity; however, the methodologies must be validated for different populations due to differences in ethnic patterns, which are directly related to phenotypic aspects. In this specific case, the method of Oliveira, et al. (1995) presented good accuracy and may be used for gender estimation in Brazil in two geographic regions, namely Northeast and Southeast; however, for other regions of the country (North, Central West and South), previous methodological
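    A toy version of a two-measurement discriminant of the kind described (bigonial distance, ramus height) can be written directly; a nearest-centroid rule stands in for the paper's logistic discriminant, and all measurements below are invented:

```python
# Hedged sketch of a two-measurement sex discriminant. The sample values
# (bigonial distance, ramus height, in mm) are invented, and the
# nearest-centroid rule is a stand-in for the logistic discriminant
# analysis used in the study.

males   = [(102, 68), (105, 70), (99, 66), (107, 71)]
females = [(94, 60), (96, 62), (92, 58), (95, 61)]

def centroid(pts):
    n = len(pts)
    return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)

def classify(x, cm, cf):
    """Assign to the nearer class centroid (squared Euclidean distance)."""
    dm = (x[0] - cm[0]) ** 2 + (x[1] - cm[1]) ** 2
    df = (x[0] - cf[0]) ** 2 + (x[1] - cf[1]) ** 2
    return "male" if dm < df else "female"

cm, cf = centroid(males), centroid(females)
print(classify((104, 69), cm, cf), classify((93, 59), cm, cf))
```

    The study's point is visible even in the toy: the decision boundary depends on the training population's means, which is why a formula calibrated in one region can misclassify heavily in another.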

  17. Validation of a physical anthropology methodology using mandibles for gender estimation in a Brazilian population

    Directory of Open Access Journals (Sweden)

    Suzana Papile Maciel Carvalho

    2013-07-01

    Full Text Available Validation studies of physical anthropology methods in different population groups are extremely important, especially in cases in which population variations may cause problems in the identification of a native individual by the application of norms developed for different communities. OBJECTIVE: This study aimed to estimate the gender of skeletons by application of the method of Oliveira, et al. (1995), previously used in a population sample from Northeast Brazil. MATERIAL AND METHODS: The accuracy of this method was assessed for a population from Southeast Brazil and validated by statistical tests. The method used two mandibular measurements, namely the bigonial distance and the mandibular ramus height. The sample was composed of 66 skulls and the method was applied by two examiners. The results were statistically analyzed by the paired t test, logistic discriminant analysis and logistic regression. RESULTS: The results demonstrated that the application of the method of Oliveira, et al. (1995) in this population achieved very different outcomes between genders, with 100% for females and only 11% for males, which may be explained by ethnic differences. However, statistical adjustment of measurement data for the population analyzed allowed accuracy of 76.47% for males and 78.13% for females, with the creation of a new discriminant formula. CONCLUSION: It was concluded that methods involving physical anthropology present a high rate of accuracy for human identification, easy application, low cost and simplicity; however, the methodologies must be validated for different populations due to differences in ethnic patterns, which are directly related to phenotypic aspects. In this specific case, the method of Oliveira, et al. (1995) presented good accuracy and may be used for gender estimation in Brazil in two geographic regions, namely Northeast and Southeast; however, for other regions of the country (North, Central West and South)

  18. Sea-level rise: towards understanding local vulnerability

    Science.gov (United States)

    Rahmstorf, Stefan

    2012-06-01

    … experts are increasingly looking at its potential impacts on coasts to facilitate local adaptation planning. This is a more complex issue than one might think, because different stretches of coast can be affected in very different ways. First of all, the sea-level response to global warming will not be globally uniform, since factors like changes in ocean currents (Levermann et al 2005) and the changing gravitational pull of continental ice (Mitrovica et al 2001) affect the local rise. Secondly, superimposed on the climatic trend is natural variability in sea level, which regionally can be as large as the climatic signal on multi-decadal timescales. Over the past decades, sea level has dropped in sizable parts of the world ocean, although it has of course risen in the global mean (IPCC 2007). Thirdly, local land uplift or subsidence affects the local sea-level change relative to the coast, both for natural reasons (post-glacial isostatic adjustment centred on regions that were covered by ice sheets during the last ice age) and artificial ones (e.g., extraction of water or oil, as in the Gulf of Mexico). Finally, local vulnerability to sea-level rise depends on many factors. Two interesting new studies in this journal (Tebaldi et al 2012, Strauss et al 2012) make important steps towards understanding sea-level vulnerability along the coasts of the United States, with methods that could also be applied elsewhere. The first, by Strauss and colleagues, merges high-resolution topographic data and a newly available tidal model together with population and housing data in order to estimate what land area and population would be at risk given certain increments in sea level. The results are mapped and tabulated at county and city level. They reveal the 'hot spots' along the US coast where sea-level rise is of the highest concern because of large populations living near the high-tide line: New York City and Long Island; the New Jersey shore; the Norfolk, Virginia, area; near Charleston
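    The exposure calculation in the Strauss et al study amounts to overlaying population on elevation and tallying what lies below a given rise increment. A toy sketch with invented grid cells:

```python
# Toy exposure tally: each cell is (elevation above the high-tide line
# in m, resident population). Both columns are invented numbers standing
# in for merged topographic, tidal and census data.

cells = [
    (0.3, 1200), (0.8, 900), (1.4, 4000), (2.2, 2500), (0.5, 3100),
]

def exposed_population(cells, rise_m):
    """People living at or below the given sea-level increment."""
    return sum(pop for elev, pop in cells if elev <= rise_m)

print(exposed_population(cells, 1.0))
```

    Tabulating this sum per county or city, for a ladder of increments, gives exactly the kind of hot-spot table the editorial describes.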

  19. Dosimetric methodology of the ICRP

    International Nuclear Information System (INIS)

    Eckerman, K.F.

    1994-01-01

    Establishment of guidance for the protection of workers and members of the public from radiation exposures necessitates estimation of the radiation dose to tissues of the body at risk. The dosimetric methodology formulated by the International Commission on Radiological Protection (ICRP) is intended to be responsive to this need. While developed for radiation protection, elements of the methodology are often applied in addressing other radiation issues; e.g., risk assessment. This chapter provides an overview of the methodology, discusses its recent extension to age-dependent considerations, and illustrates specific aspects of the methodology through a number of numerical examples
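    The core dosimetric bookkeeping in this framework is a sum of intakes weighted by nuclide-specific dose coefficients, E = Σᵢ Iᵢ·eᵢ. A sketch with placeholder coefficients (assumed values for illustration, not ICRP tabulations):

```python
# Illustrative committed-effective-dose sum: E = sum_i (intake_i * e_i),
# where e_i is a nuclide-specific dose coefficient (Sv/Bq). Intakes and
# coefficients below are placeholders, not ICRP values.

intakes = {"Cs-137": 2.0e3, "Sr-90": 5.0e2}        # Bq taken in
dose_coeff = {"Cs-137": 1.3e-8, "Sr-90": 2.8e-8}   # Sv/Bq, assumed

E = sum(intakes[n] * dose_coeff[n] for n in intakes)
print(f"{E * 1e6:.1f} microSv")
```

    The age-dependent extension mentioned above amounts to swapping in age-specific coefficient tables for the same sum.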

  20. Procedures for estimating the radiation dose in the vicinity of uranium mines and mills by direct calculation methodology

    International Nuclear Information System (INIS)

    Coelho, C.P.

    1983-01-01

    A methodology for estimating the radiation doses to members of the general public in the vicinity of uranium mines and mills is presented. The data collected in the surveys performed to characterize the neighborhood of the site, and used in this work to estimate the radiation dose, are required by the Regulatory Body for the purpose of licensing. Initially, a description is given of the main processing steps to obtain the uranium concentrate, and the installation's critical radionuclides are identified. Following this, some studies required to characterize the facility neighborhood are presented, especially those related to geography, demography, meteorology, hydrology and environmental protection. The basic programs for monitoring the facility neighborhood in the pre-operational and operational phases are also included. A procedure is then proposed to estimate inhalation, ingestion and external doses. As an example, the proposed procedure is applied to a hypothetical site. Finally, some aspects related to the applicability of this work are discussed. (Author) [pt
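    The inhalation pathway in such a procedure is, at its simplest, a product of air concentration, breathing rate, exposure time and a dose conversion factor. All values below are illustrative, not the regulatory coefficients:

```python
def inhalation_dose(conc_bq_m3, breathing_m3_h, hours, dcf_sv_bq):
    """Inhalation dose (Sv) = air concentration x breathing rate
    x exposure time x dose conversion factor. All parameter values
    in the call below are illustrative assumptions."""
    return conc_bq_m3 * breathing_m3_h * hours * dcf_sv_bq

# Hypothetical member of the public at the site boundary, full year
d = inhalation_dose(conc_bq_m3=0.05, breathing_m3_h=1.2,
                    hours=8760.0, dcf_sv_bq=5.0e-7)
print(f"{d * 1e3:.2f} mSv/yr")
```

    The ingestion and external pathways follow the same pattern with their own concentration terms and conversion factors, and the three are summed per receptor.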

  1. Future rise of the sea level: consequences and strategies on the shoreline

    International Nuclear Information System (INIS)

    Teisson, C.

    1991-11-01

    The mean sea level may rise in the near future due to the warming of the atmosphere associated with the 'greenhouse effect'. The alarming estimates issued in the 1980s (several meters of rise over the next centuries) have since been revised downward: the ice sheets, the melting of which could induce such a rise, do not show signs of instability. A rise of 30 to 50 cm is likely to occur by the middle of the next century; there is a 25% probability that the rise of sea level relative to the year 1980 will exceed 1 meter by 2100. The consequences of such a rise on the shoreline and on maritime works are reviewed, and planning strategies are discussed. This study was performed in the framework of a convention between EDF-LNH and the Sea State Secretary (Service Technique des Ports Maritimes et Voies Navigables). 41 refs., 31 figs., 6 tabs

  2. Keys to creative-strategic innovation using transmedia methodology

    Directory of Open Access Journals (Sweden)

    Eduardo Pradanos Grijalvo

    2016-02-01

    Full Text Available This article presents a disruptive vision of the advertising industry through the direct testimony of two professionals. It exposes the need to find new formulas to reach a society that is increasingly hyper-connected, fragmented and mobile. This is done by employing the Connect and Develop methodology, taking a transmedia approach to the solutions.

  3. Determination of Temperature Rise and Temperature Differentials of CEMII/B-V Cement for 20MPa Mass Concrete using Adiabatic Temperature Rise Data

    Science.gov (United States)

    Chee Siang, GO

    2017-07-01

    An experimental test was carried out to determine the temperature rise characteristics of Portland fly-ash cement (CEM II/B-V, 42.5N) of Blaine fineness 418.6 m²/kg and 444.6 m²/kg, respectively, for 20 MPa mass concrete under adiabatic conditions. The estimation of adiabatic temperature rise by the CIRIA C660 method (Construction Industry Research and Information Association) was adopted to verify and validate the hot-box test results by simulating the heat generation curve of the concrete under semi-adiabatic conditions. The tests found that Portland fly-ash cement exhibited a decrease in the peak value of temperature rise and in the maximum rate of temperature rise. The results showed that the temperature development and distribution profile, which derives directly from the heat of hydration of cement over time, is affected by the insulation, the initial placing temperature, and the geometry and size of the concrete mass. The mock-up data showed a measured temperature differential significantly lower than the 20°C temperature differential required by the technical specification and the 27.7°C limiting temperature differential for granite aggregate concrete stipulated in BS 8110-2:1985. The concrete strength test results revealed that the 28-day cube compressive strength was above the stipulated 20 MPa characteristic strength at 90 days. The tests demonstrated that with a proper concrete mix design, the use of Portland fly-ash cement, a combination of chilled water and flake ice, and good insulation is effective in reducing the peak temperature rise, the temperature differential, and the adiabatic temperature rise for mass concrete pours. The established adiabatic temperature rise result can be inferred for the in-situ thermal properties of 20 MPa mass concrete, as it should be repeatable given a similar type of constituent materials and the concrete mix design adopted for permanent works at the project site.
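    An upper bound of the kind these tests characterize can be estimated from mix proportions: the full heat of hydration absorbed by the concrete's own heat capacity. The mix values below are illustrative, and this single formula is a simplification of, not a substitute for, the CIRIA C660 estimation:

```python
def adiabatic_rise(cement_kg_m3, hydration_heat_kj_kg,
                   density_kg_m3=2400.0, specific_heat_kj_kgk=1.0):
    """Upper-bound adiabatic temperature rise of a concrete mix:
    dT = (C * H) / (rho * cp). All values in the call below are
    illustrative mix assumptions, not the paper's test data."""
    return (cement_kg_m3 * hydration_heat_kj_kg
            / (density_kg_m3 * specific_heat_kj_kgk))

# e.g. 320 kg/m3 cementitious content, 250 kJ/kg total heat of hydration
print(round(adiabatic_rise(320.0, 250.0), 1), "degC")
```

    Replacing part of the clinker with fly ash lowers the effective heat of hydration term, which is the mechanism behind the reduced peak rise reported above.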

  4. Methodology for estimating aquatic dispersion of effluents from accidental and routine releases

    International Nuclear Information System (INIS)

    Borges, Diogo da S.; Lava, Deise Diana; Guimarães, Antônio C.F.; Moreira, Maria L.

    2017-01-01

    This paper presents a methodology for analyzing the dispersion of radioactive materials in an aquatic environment, specifically estuaries, based on Regulatory Guide 1.113. The objective is to present an adaptation of the methodology for computational use, made possible through numerical approximation techniques. The methodology consists of a numerical approximation of the Navier-Stokes equations applied in a finite medium with known transport mechanisms, such as the Coriolis effect, bottom drag, diffusion, salinity, temperature differences and adhesion of water molecules. The methodology is grounded in a diffusion-convection transport equation, which is similar to the one-dimensional Burgers' equation and, in multidimensional cases, to the Kardar-Parisi-Zhang equation. (author)
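    The diffusion-convection transport described above can be illustrated with a simple explicit finite-difference scheme for a 1-D advection-diffusion equation (the full methodology covers Coriolis, drag, salinity and density effects that are omitted here; the grid spacing, velocity and diffusivity below are illustrative values only):

```python
import numpy as np

# 1-D advection-diffusion dC/dt + u dC/dx = D d2C/dx2, solved with an
# explicit upwind/central scheme; u, D and the grid are illustrative.
nx, dx, dt = 200, 50.0, 10.0   # 10 km reach, 50 m cells, 10 s steps
u, D = 0.5, 5.0                # advection (m/s), diffusion (m2/s)
C = np.zeros(nx)
C[5] = 100.0                   # instantaneous release near the source

for _ in range(500):           # simulate ~83 minutes
    adv = -u * (C - np.roll(C, 1)) / dx                      # upwind
    dif = D * (np.roll(C, -1) - 2.0 * C + np.roll(C, 1)) / dx**2
    C = C + dt * (adv + dif)
# The plume has moved downstream by u*t/dx = 50 cells and spread out.
```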

  5. Methodology for estimating aquatic dispersion of effluents from accidental and routine releases

    Energy Technology Data Exchange (ETDEWEB)

    Borges, Diogo da S.; Lava, Deise Diana; Guimarães, Antônio C.F.; Moreira, Maria L., E-mail: diogosb@outlook.com, E-mail: deise_dy@hotmail.com, E-mail: tony@ien.gov.br, E-mail: malu@ien.gov.br [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2017-07-01

    This paper presents a methodology for analyzing the dispersion of radioactive materials in an aquatic environment, specifically estuaries, based on Regulatory Guide 1.113. The objective is to present an adaptation of the methodology for computational use, made possible through numerical approximation techniques. The methodology consists of a numerical approximation of the Navier-Stokes equations applied in a finite medium with known transport mechanisms, such as the Coriolis effect, bottom drag, diffusion, salinity, temperature differences and adhesion of water molecules. The methodology is grounded in a diffusion-convection transport equation, which is similar to the one-dimensional Burgers' equation and, in multidimensional cases, to the Kardar-Parisi-Zhang equation. (author)

  6. Problems of increased transport load as a result of implementation of projects of high-rise constructions

    Science.gov (United States)

    Provotorov, Ivan; Gasilov, Valentin; Anisimova, Nadezhda

    2018-03-01

    The structure of the problems of high-rise construction is suggested, including the impact on the environment, design solutions, transportation problems, financial costs of construction and operation, and others. Positive and negative aspects of high-rise construction are considered. One of the basic problems of high-rise construction is increased transport load. Construction of a subway on the basis of a concession mechanism, with unmanned control of the rolling stock, is proposed as the most expedient solution. An evaluation of the effectiveness of this project is presented; it shows quite high performance indicators for a private investor. The main problems that the project implementation may face under a lack of scientific and methodological support are outlined.

  7. Development of a model to simulate groundwater inundation induced by sea-level rise and high tides in Honolulu, Hawaii.

    Science.gov (United States)

    Habel, Shellie; Fletcher, Charles H; Rotzoll, Kolja; El-Kadi, Aly I

    2017-05-01

    Many of the world's largest cities face risk of sea-level rise (SLR) induced flooding owing to their limited elevations and proximities to the coastline. Within this century, global mean sea level is expected to reach magnitudes that will exceed the ground elevation of some built infrastructure. The concurrent rise of coastal groundwater will produce additional sources of inundation resulting from narrowing and loss of the vertical unsaturated subsurface space. This has implications for the dense network of buried and low-lying infrastructure that exists across urban coastal zones. Here, we describe a modeling approach that simulates narrowing of the unsaturated space and groundwater inundation (GWI) generated by SLR-induced lifting of coastal groundwater. The methodology combines terrain modeling, groundwater monitoring, estimation of tidal influence, and numerical groundwater-flow modeling to simulate future flood scenarios considering user-specified tide stages and magnitudes of SLR. We illustrate the value of the methodology by applying it to the heavily urbanized and low-lying Waikiki area of Honolulu, Hawaii. Results indicate that SLR of nearly 1 m generates GWI across 23% of the 13 km² study area, threatening $5 billion of taxable real estate and 48 km of roadway. Analysis of current conditions reveals that 86% of 259 active cesspool sites in the study area are likely inundated. This suggests that cesspool effluent is currently entering coastal groundwater, which not only leads to degradation of coastal environments, but also presents a future threat to public health as GWI would introduce effluent at the ground surface. Copyright © 2017 Elsevier Ltd. All rights reserved.
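    The core screening step, comparing a lifted water table against ground elevation, can be sketched as follows. This is a toy illustration of the idea, not the study's numerical groundwater-flow workflow; the terrain grid, shoreline distances, tide amplitude and inland damping factor are all invented for the example:

```python
import numpy as np

# Cells flood where the projected water table (mean sea level + SLR +
# a tidal signal damped inland) meets or exceeds the ground surface.
dem = np.array([[0.3, 0.8, 1.5],
                [0.5, 1.1, 2.0],
                [0.9, 1.6, 2.4]])   # ground elevation (m), invented
dist = np.array([[0.0, 1.0, 2.0],
                 [0.0, 1.0, 2.0],
                 [0.0, 1.0, 2.0]])  # km from shoreline, invented

def gwi_fraction(slr, tide=0.5, damping=0.3):
    """Fraction of cells with groundwater inundation (all parameters
    assumed; the study used numerical groundwater-flow modeling)."""
    water_table = slr + tide * np.exp(-damping * dist)
    return float((water_table >= dem).mean())
```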

  8. Covariance Evaluation Methodology for Neutron Cross Sections

    Energy Technology Data Exchange (ETDEWEB)

    Herman, M.; Arcilla, R.; Mattoon, C.M.; Mughabghab, S.F.; Oblozinsky, P.; Pigni, M.; Pritychenko, B.; Sonzogni, A.A.

    2008-09-01

    We present the NNDC-BNL methodology for estimating neutron cross section covariances in the thermal, resolved resonance, unresolved resonance and fast neutron regions. The three key elements of the methodology are the Atlas of Neutron Resonances, the nuclear reaction code EMPIRE, and a Bayesian code implementing the Kalman filter concept. The covariance data processing, visualization and distribution capabilities are integral components of the NNDC methodology. We illustrate its application with examples, including a relatively detailed evaluation of covariances for two individual nuclei and the massive production of simple covariance estimates for 307 materials. Certain peculiarities regarding the evaluation of covariances for resolved resonances and the consistency between resonance parameter uncertainties and thermal cross section uncertainties are also discussed.
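    The Bayesian (Kalman filter) element of such a methodology reduces, in the scalar case, to the standard update of a prior cross-section estimate with one measurement. A minimal sketch with illustrative numbers only, not tied to any actual evaluation:

```python
def kalman_update(prior, prior_var, meas, meas_var):
    """One scalar Kalman (Bayesian) update: combine a model-based prior
    cross section with a measurement, weighted by their variances."""
    gain = prior_var / (prior_var + meas_var)
    post = prior + gain * (meas - prior)
    post_var = (1.0 - gain) * prior_var
    return post, post_var

# Illustrative values: prior 2.0 b (sd 0.2), measurement 2.2 b (sd 0.1).
post, post_var = kalman_update(prior=2.0, prior_var=0.04,
                               meas=2.2, meas_var=0.01)
# The posterior lies closer to the more precise measurement, and its
# variance is smaller than either input variance.
```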

  9. Simulation of a Dispersive Tsunami due to the 2016 El Salvador-Nicaragua Outer-Rise Earthquake (Mw 6.9)

    Science.gov (United States)

    Tanioka, Yuichiro; Ramirez, Amilcar Geovanny Cabrera; Yamanaka, Yusuke

    2018-01-01

    The 2016 El Salvador-Nicaragua outer-rise earthquake (Mw 6.9) generated a small tsunami observed at the ocean-bottom pressure sensor DART 32411 in the Pacific Ocean off Central America. The observed dispersive tsunami is well simulated using the linear Boussinesq equations. From the dispersive character of the tsunami waveform, the fault length and width of the outer-rise event are estimated to be 30 and 15 km, respectively. The estimated seismic moment of 3.16 × 10¹⁹ Nm is the same as the estimate in the Global CMT catalog. The dispersive character of the tsunami in the deep ocean caused by the 2016 outer-rise El Salvador-Nicaragua earthquake could thus constrain the fault size and the slip amount, or the seismic moment, of the event.
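    The dispersion that constrains the fault size comes from the frequency dependence of the phase speed in the linear Boussinesq equations. A small sketch of one common form of that dispersion relation, c = sqrt(g h / (1 + (kh)²/3)); the depth and wavelengths below are assumed for illustration, not taken from the study:

```python
import math

def phase_speed(depth_m, wavelength_m, g=9.81):
    """Linear Boussinesq-type phase speed c = sqrt(g h / (1 + (k h)^2 / 3)):
    shorter waves travel more slowly, producing the observed dispersion."""
    k = 2.0 * math.pi / wavelength_m
    return math.sqrt(g * depth_m / (1.0 + (k * depth_m) ** 2 / 3.0))

depth = 4000.0                       # assumed deep-ocean depth (m)
c_long = phase_speed(depth, 400e3)   # long wave, nearly non-dispersive
c_short = phase_speed(depth, 60e3)   # shorter wave from a small fault
# A 30 km fault excites shorter wavelengths, so the tail of the signal
# arrives later; this lag is what constrains the fault dimensions.
```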

  10. Simulation of a Dispersive Tsunami due to the 2016 El Salvador-Nicaragua Outer-Rise Earthquake (Mw 6.9)

    Science.gov (United States)

    Tanioka, Yuichiro; Ramirez, Amilcar Geovanny Cabrera; Yamanaka, Yusuke

    2018-04-01

    The 2016 El Salvador-Nicaragua outer-rise earthquake (Mw 6.9) generated a small tsunami observed at the ocean-bottom pressure sensor DART 32411 in the Pacific Ocean off Central America. The observed dispersive tsunami is well simulated using the linear Boussinesq equations. From the dispersive character of the tsunami waveform, the fault length and width of the outer-rise event are estimated to be 30 and 15 km, respectively. The estimated seismic moment of 3.16 × 10¹⁹ Nm is the same as the estimate in the Global CMT catalog. The dispersive character of the tsunami in the deep ocean caused by the 2016 outer-rise El Salvador-Nicaragua earthquake could thus constrain the fault size and the slip amount, or the seismic moment, of the event.

  11. Methods and problems in assessing the impacts of accelerated sea-level rise

    Science.gov (United States)

    Nicholls, Robert J.; Dennis, Karen C.; Volonte, Claudio R.; Leatherman, Stephen P.

    1992-06-01

    Accelerated sea-level rise is one of the more certain responses to global warming and presents a major challenge to mankind. However, it is important to note that sea-level rise is only manifest over long timescales (decades to centuries). Coastal scientists are increasingly being called upon to assess the physical, economic and societal impacts of sea-level rise and hence investigate appropriate response strategies. Such assessments are difficult in many developing countries due to a lack of physical, demographic and economic data. In particular, there is a lack of appropriate topographic information for the first (physical) phase of the analysis. To overcome these difficulties we have developed a new rapid and low-cost reconnaissance technique: ``aerial videotape-assisted vulnerability analysis'' (AVA). It involves: 1) videotaping the coastline from a small airplane; 2) limited ground-truth measurements; and 3) archive research. Combining the video record with the ground-truth information characterizes the coastal topography and, with an appropriate land loss model, estimates of the physical impact for different sea-level rise scenarios can be made. However, such land loss estimates raise other important questions, such as the appropriate seaward limit of the beach profile. Response options also raise questions, such as the long-term costs of seawalls. Therefore, realistic low and high estimates were developed. To illustrate the method, selected results from Senegal, Uruguay and Venezuela are presented.
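    A land loss model of the kind mentioned above is, in many comparable studies, the Bruun rule; the abstract does not name the model actually used, so the following is only a generic illustration with invented profile parameters:

```python
def bruun_retreat(slr_m, profile_length_m, berm_height_m, closure_depth_m):
    """Bruun-rule shoreline retreat R = S * L / (B + h): sea-level rise S
    acting over an active profile of length L, with berm height B and
    closure depth h."""
    return slr_m * profile_length_m / (berm_height_m + closure_depth_m)

# Invented profile values, not data from Senegal, Uruguay or Venezuela:
retreat_m = bruun_retreat(slr_m=1.0, profile_length_m=500.0,
                          berm_height_m=2.0, closure_depth_m=8.0)
```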

  12. Estimation of spectral kurtosis

    Science.gov (United States)

    Sutawanir

    2017-03-01

    Rolling bearings are among the most important elements in rotating machinery. Bearings frequently fall out of service for various reasons: heavy loads, unsuitable lubrication, ineffective sealing. Bearing faults may cause a decrease in performance. Analysis of bearing vibration signals has attracted attention in the field of monitoring and fault diagnosis. Bearing vibration signals give rich information for early detection of bearing failures. Spectral kurtosis, SK, is a parameter in the frequency domain indicating how the impulsiveness of a signal varies with frequency. Faults in rolling bearings give rise to a series of short impulse responses as the rolling elements strike faults, making SK potentially useful for determining frequency bands dominated by bearing fault signals. SK can provide a measure of the distance of the analyzed bearings from a healthy one, and provides information additional to that given by the power spectral density (psd). This paper aims to explore the estimation of spectral kurtosis using the short time Fourier transform, known as the spectrogram. The estimation of SK is similar to the estimation of the psd; it falls into the class of model-free, plug-in estimators. Some numerical studies using simulations are discussed to support the methodology. The spectral kurtosis of some stationary signals is analytically obtained and used in the simulation study. Kurtosis in the time domain has been a popular tool for detecting non-normality; spectral kurtosis is its extension to the frequency domain. The relationship between time domain and frequency domain analysis is established through the power spectrum-autocovariance Fourier transform. The Fourier transform is the main tool for estimation in the frequency domain. The power spectral density is estimated through the periodogram. In this paper, the short time Fourier transform estimate of the spectral kurtosis is reviewed, and a bearing fault (inner ring and outer ring) is simulated.
The bearing response, power spectrum, and spectral kurtosis are plotted to
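    The plug-in spectrogram estimator of spectral kurtosis described above can be sketched as follows, assuming the common normalization SK(f) = E|X(t,f)|⁴ / (E|X(t,f)|²)² − 2, under which a stationary Gaussian signal gives SK ≈ 0. The sampling rate, impulse spacing and amplitude below simulate a generic impulsive fault and are not taken from the paper:

```python
import numpy as np
from scipy.signal import stft

def spectral_kurtosis(x, fs, nperseg=256):
    """Plug-in SK(f) from the spectrogram:
    SK = E|X(t,f)|^4 / (E|X(t,f)|^2)^2 - 2 (time average per frequency)."""
    _, _, Z = stft(x, fs=fs, nperseg=nperseg)
    mag2 = np.abs(Z) ** 2
    return np.mean(mag2 ** 2, axis=-1) / np.mean(mag2, axis=-1) ** 2 - 2.0

rng = np.random.default_rng(0)
fs, n = 12000, 1 << 15
noise = rng.standard_normal(n)          # healthy bearing: SK near 0
faulty = noise.copy()
faulty[::1200] += 50.0                  # periodic impacts from a defect

sk_noise = spectral_kurtosis(noise, fs)
sk_fault = spectral_kurtosis(faulty, fs)
# sk_fault is clearly positive across the excited band, flagging the
# impulsive fault signature that the plain power spectrum averages out.
```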

  13. A novel methodology to estimate the evolution of construction waste in construction sites.

    Science.gov (United States)

    Katz, Amnon; Baum, Hadassa

    2011-02-01

    This paper focuses on the accumulation of construction waste generated throughout the erection of new residential buildings. A special methodology was developed in order to provide a model that will predict the flow of construction waste. The amount of waste and its constituents, produced on 10 relatively large construction sites (7000-32,000 m² of built area), was monitored periodically for a limited time. A model that predicts the accumulation of construction waste was developed based on these field observations. According to the model, waste accumulates in an exponential manner, i.e. smaller amounts are generated during the early stages of construction and increasing amounts are generated towards the end of the project. The total amount of waste from these sites was estimated at 0.2 m³ per 1 m² of floor area. A good correlation was found between the model predictions and actual data from the field survey. Copyright © 2010 Elsevier Ltd. All rights reserved.
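    The exponential accumulation model can be sketched with a normalized growth curve. The shape parameter `k` below is an assumed value (the paper fits its own curve to field observations), while the 0.2 m³ per m² total comes from the abstract:

```python
import math

def waste_accumulated(progress, total_m3, k=3.0):
    """Cumulative construction waste (m3) at build progress in [0, 1],
    rising exponentially toward project completion. The shape factor k
    is an assumed value; the total uses the study's 0.2 m3 per m2."""
    return total_m3 * (math.exp(k * progress) - 1.0) / (math.exp(k) - 1.0)

floor_area_m2 = 10000.0
total = 0.2 * floor_area_m2            # 2000 m3 for a 10,000 m2 project
first_half = waste_accumulated(0.5, total)
second_half = total - first_half       # most waste arrives late
```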

  14. Improvement of the bubble rise velocity model in the pressurizer using ALMOD 3 computer code to calculate evaporation

    International Nuclear Information System (INIS)

    Madeira, A.A.

    1985-01-01

    The improvement of the bubble rise velocity calculation is studied by adding two different ways to estimate this velocity, one of which is more adequate for the pressures normally found in the Reactor Cooling System. Additionally, a limit on the growth of the bubble rise velocity was imposed to account for the actual behavior of bubble rise in two-phase mixtures. (Author) [pt

  15. Spatial Hedonic Models for Measuring the Impact of Sea-Level Rise on Coastal Real Estate

    OpenAIRE

    Okmyung Bin; Ben Poulter; Christopher F. Dumas; John C. Whitehead

    2009-01-01

    This study uses a unique integration of geospatial and hedonic property data to estimate the impact of sea-level rise on coastal real estate in North Carolina. North Carolina’s coastal plain is one of several large terrestrial systems around the world threatened by rising sea-levels. High-resolution topographic LIDAR (Light Detection and Ranging) data are used to provide accurate inundation maps for all properties that will be at risk under six different sea-level rise scenarios. A simulation...

  16. The substantiation of methodical instrumentation to increase the tempo of high-rise construction in region

    Science.gov (United States)

    Belyaeva, Svetlana; Makeeva, Tatyana; Chugunov, Andrei; Andreeva, Peraskovya

    2018-03-01

    One of the important conditions for the effective renovation of housing in a region through the realization of high-rise construction projects is the attraction of investment by forming a favorable investment climate, as well as the reduction of administrative barriers in construction and the renewal of the fixed assets of housing and communal services. The article proposes methodological bases for assessing the state of the investment climate in the region, as well as a methodology for the formation and evaluation of the investment program of a housing and communal services enterprise. The proposed methodologies are tested on the example of the Voronezh region. The authors also show the necessity and expediency of using a consulting mechanism in the development of state and non-state investment projects and programs.

  17. The substantiation of methodical instrumentation to increase the tempo of high-rise construction in region

    Directory of Open Access Journals (Sweden)

    Belyaeva Svetlana

    2018-01-01

    Full Text Available One of the important conditions for the effective renovation of housing in a region through the realization of high-rise construction projects is the attraction of investment by forming a favorable investment climate, as well as the reduction of administrative barriers in construction and the renewal of the fixed assets of housing and communal services. The article proposes methodological bases for assessing the state of the investment climate in the region, as well as a methodology for the formation and evaluation of the investment program of a housing and communal services enterprise. The proposed methodologies are tested on the example of the Voronezh region. The authors also show the necessity and expediency of using a consulting mechanism in the development of state and non-state investment projects and programs.

  18. Estimating the Potential Risks of Sea Level Rise for Public and Private Property Ownership, Occupation and Management

    Directory of Open Access Journals (Sweden)

    Georgia Warren-Myers

    2018-04-01

    Full Text Available The estimation of future sea level rise (SLR is a major concern for cities near coastlines and river systems. Despite this, current modelling underestimates the future risks of SLR to property. Direct risks posed to property include inundation, loss of physical property and associated economic and social costs. It is also crucial to consider the risks that emerge from scenarios after SLR. These may produce one-off or periodic events that will inflict physical, economic and social implications, and direct, indirect and consequential losses. Using a case study approach, this paper combines various forms of data to examine the implications of future SLR to further understand the potential risks. The research indicates that the financial implications for local government will be loss of rates associated with total property loss and declines in value. The challenges identified are not specific to this research. Other municipalities worldwide experience similar barriers (i.e., financial implications, coastal planning predicaments, data paucity, knowledge and capacity, and legal and political challenges. This research highlights the need for private and public stakeholders to co-develop and implement strategies to mitigate and adapt property to withstand the future challenges of climate change and SLR.

  19. PMP Estimations at Sparsely Controlled Andinian Basins and Climate Change Projections

    Science.gov (United States)

    Lagos Zúñiga, M. A.; Vargas, X.

    2012-12-01

    Probable Maximum Precipitation (PMP) estimation implies an extensive review of hydrometeorological data and an understanding of precipitation formation processes. Different methodological processes exist for its estimation, and all of them require a good spatial and temporal representation of storms. Estimating hydrometeorological PMP in sparsely controlled basins is a difficult task, especially if the studied area has an important orographic effect due to mountains and mixed precipitation occurs during the most severe storms. The main task of this study is to propose a methodology and estimate PMP in a sparsely controlled basin with abrupt topography and mixed hydrology, while also analyzing the statistical uncertainties of the estimates and possible climate change effects on them. In this study, PMP estimation under statistical and hydrometeorological approaches (watershed-based and traditional depth-area-duration analysis) was carried out in a semi-arid zone at the Puclaro dam in northern Chile. Owing to the lack of good spatial meteorological representation in the study zone, we propose a methodology to account for the orographic effects of the Andes based on orographic patterns from the RCM PRECIS-DGF and annual isohyetal maps. The estimations were validated against precipitation patterns for given winters, considering the snow route and rain gauges along the prevailing wind direction, with good results. The estimations are also compared with the largest areal storms in the USA, Australia, India and China and with frequency analyses at local rain gauge stations in order to decide on the most adequate approach for the study zone. Climate change projections were evaluated with the ECHAM5 GCM, owing to its good representation of the seasonality and magnitude of meteorological variables. 
Temperature projections for the 2040-2065 period show that there would be a rise in the catchment contributing area that would lead to an increase of the

  20. Uncertainty analysis for results of thermal hydraulic codes of best-estimate-type; Analisis de incertidumbre para resultados de codigos termohidraulicos de mejor estimacion

    Energy Technology Data Exchange (ETDEWEB)

    Alva N, J.

    2010-07-01

    In this thesis, fundamental knowledge is presented about uncertainty analysis and about the diverse methodologies applied in the study of nuclear power plant transient events, particularly those related to thermal hydraulic phenomena. The concepts and methodologies mentioned in this work come from a wide bibliographical survey of the nuclear power field. Methodologies for uncertainty analysis have been developed by quite diverse institutions, and they have been widely used worldwide for application to results from best-estimate computer codes in nuclear reactor thermal hydraulics and safety analysis. The main uncertainty sources, types of uncertainties, and aspects related to best estimate modeling and methods are also introduced. Once the main bases of uncertainty analysis have been set and some of the known methodologies introduced, the CSAU methodology, which is applied in the analyses, is presented in detail. The main objective of this thesis is to compare the results of an uncertainty and sensitivity analysis using the Response Surface Technique with those from the application of the Wilks formula, through a loss-of-coolant experiment and a rise event in a BWR. Both techniques are options in the uncertainty and sensitivity analysis part of the CSAU methodology, which was developed for the analysis of transients and accidents at nuclear power plants and is the basis of most of the methodologies used in the licensing of nuclear power plants practically everywhere. Finally, the results of applying both techniques are compared and discussed. (Author)
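    The Wilks formula referenced above fixes the number of best-estimate code runs needed for a given probability content and confidence level. A sketch of the first-order, one-sided criterion (the thesis may use a different order or a two-sided variant):

```python
def wilks_sample_size(beta=0.95, gamma=0.95):
    """Smallest n with 1 - beta**n >= gamma: the first-order, one-sided
    Wilks criterion, under which the maximum of n random code runs
    bounds the beta-quantile of the output at confidence gamma."""
    n = 1
    while 1.0 - beta ** n < gamma:
        n += 1
    return n

runs = wilks_sample_size(0.95, 0.95)   # the classic 95/95 value: 59 runs
```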

  1. Relative Hazard Calculation Methodology

    International Nuclear Information System (INIS)

    DL Strenge; MK White; RD Stenner; WB Andrews

    1999-01-01

    The methodology presented in this document was developed to provide a means of calculating the RH ratios to use in developing useful graphic illustrations. The RH equation, as presented in this methodology, is primarily a collection of key factors relevant to understanding the hazards and risks associated with projected risk management activities. The RH equation has the potential for much broader application than generating risk profiles. For example, it can be used to compare one risk management activity with another, instead of just comparing it to a fixed baseline as was done for the risk profiles. If the appropriate source term data are available, it could be used in its non-ratio form to estimate absolute values of the associated hazards. These estimated values of hazard could then be examined to help understand which risk management activities are addressing the higher hazard conditions at a site. Graphics could be generated from these absolute hazard values to compare high-hazard conditions. If the RH equation is used in this manner, care must be taken to specifically define and qualify the estimated absolute hazard values (e.g., identify which factors were considered and which ones tended to drive the hazard estimation).

  2. Response Surface Methodology

    NARCIS (Netherlands)

    Kleijnen, Jack P.C.

    2014-01-01

    Abstract: This chapter first summarizes Response Surface Methodology (RSM), which started with Box and Wilson’s article in 1951 on RSM for real, non-simulated systems. RSM is a stepwise heuristic that uses first-order polynomials to approximate the response surface locally. An estimated polynomial
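    The first-order RSM step can be sketched as fitting a local linear model to a small factorial design and moving along the estimated steepest-ascent direction (a toy noise-free response is used here for illustration; real RSM iterates this with simulated or experimental responses):

```python
import numpy as np

def rsm_step(X, y, step=1.0):
    """Fit y ~ b0 + b1*x1 + b2*x2 by least squares around the current
    design point and return a unit step along steepest ascent."""
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    grad = coef[1:]
    return step * grad / np.linalg.norm(grad)

# 2^2 factorial design around the origin with a noise-free toy response
X = np.array([[-1.0, -1.0], [-1.0, 1.0], [1.0, -1.0], [1.0, 1.0]])
y = 10.0 + 3.0 * X[:, 0] + 4.0 * X[:, 1]
direction = rsm_step(X, y)   # proportional to the true gradient (3, 4)
```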

  3. Methodology of environmental risk assessment management

    Directory of Open Access Journals (Sweden)

    Saša T. Bakrač

    2012-04-01

    Full Text Available Successful protection of the environment is mostly based on high-quality assessment of potential and present risks. Environmental risk management is a complex process which includes the identification, assessment and control of risk, namely taking measures in order to minimize the risk to an acceptable level. Environmental risk management methodology: In addition to these phases in the management of environmental risk, appropriate measures that reduce the occurrence of risk should be implemented: normative and legal regulations (laws and regulations), appropriate organizational structures in society, and the establishment of quality monitoring of the environment. The emphasis is placed on the application of assessment methodologies (the three model-concepts), as the most important aspect of successful management of environmental risk. Risk assessment methodology - European concept: The first concept of ecological risk assessment methodology is based on the so-called European model-concept. In order to better understand this methodology, two concepts - hazard and risk - are introduced. The European concept of environmental risk assessment has the following phases in its implementation: identification of the hazard (danger), identification of consequences (if there is a hazard), estimation of the scale of consequences, estimation of consequence probability, and risk assessment (also called risk characterization). The European concept is often used to assess risk in the environment as a model for addressing the distribution of stressors along the source - path - receptor line. Risk assessment methodology - Canadian concept: The second concept of the methodology of environmental risk assessment is based on the so-called Canadian model-concept. 
The assessment of ecological risk includes risk arising from natural events (floods, extreme weather conditions, etc.), technological processes and products, and agents (chemical, biological, radiological, etc

  4. MIRD methodology

    International Nuclear Information System (INIS)

    Rojo, Ana M.; Gomez Parada, Ines

    2004-01-01

    The MIRD (Medical Internal Radiation Dose) system was established by the Society of Nuclear Medicine of the USA in 1960 to assist the medical community in estimating the dose to organs and tissues due to the incorporation of radioactive materials. Since then, the 'MIRD Dose Estimate Reports' (Nos. 1 to 12) and 'Pamphlets', of great utility for dose calculations, have been published. The MIRD system was planned essentially for the calculation of doses received by patients during nuclear medicine diagnostic procedures. The MIRD methodology for absorbed dose calculations in different tissues is explained.

  5. QUANTIFYING REGIONAL SEA LEVEL RISE CONTRIBUTIONS FROM THE GREENLAND ICE SHEET

    Directory of Open Access Journals (Sweden)

    Diandong Ren

    2013-01-01

    Full Text Available This study projects the sea level contribution from the Greenland ice sheet (GrIS through to 2100, using a recently developed ice dynamics model forced by atmospheric parameters derived from three different climate models (CGCMs. The geographical pattern of the near-surface ice warming imposes a divergent flow field favoring mass loss through enhanced ice flow. The calculated average mass loss rate during the latter half of the 21st century is ~0.64±0.06 mm/year eustatic sea level rise, which is significantly larger than the IPCC AR4 estimate from surface mass balance. The difference is due largely to the positive feedbacks from reduced ice viscosity and the basal sliding mechanism present in the ice dynamics model. This inter-model, inter-scenario spread adds approximately a 20% uncertainty to the IPCC ice model estimates. The sea level rise is geographically non-uniform and reaches 1.69±0.24 mm/year by 2100 for the northeast coastal region of the United States, amplified by the expected weakening of the Atlantic meridional overturning circulation (AMOC. In contrast to previous estimates, which neglected the GrIS fresh water input, both sides of the North Atlantic Gyre are projected to experience sea level rises. The impacts on a selection of major cities on both sides of the Atlantic and in the Pacific and southern oceans also are assessed. The other ocean basins are found to be less affected than the Atlantic Ocean.

  6. An Efficient Power Estimation Methodology for Complex RISC Processor-based Platforms

    OpenAIRE

    Rethinagiri , Santhosh Kumar; Ben Atitallah , Rabie; Dekeyser , Jean-Luc; Niar , Smail; Senn , Eric

    2012-01-01

    International audience; In this contribution, we propose an efficient power estimation methodology for complex RISC processor-based platforms. In this methodology, the Functional Level Power Analysis (FLPA) is used to set up generic power models for the different parts of the system. Then, a simulation framework based on a virtual platform is developed to evaluate accurately the activities used in the related power models. The combination of the two parts above leads to a heterogeneou...

  7. The Climate Science Special Report: Rising Seas and Changing Oceans

    Science.gov (United States)

    Kopp, R. E.

    2017-12-01

    GMSL has risen by about 16-21 cm since 1900. Ocean heat content has increased at all depths since the 1960s, and global mean sea-surface temperature increased by 0.7°C/century between 1900 and 2016. Human activity contributed substantially to generating a rate of GMSL rise since 1900 faster than during any preceding century in at least 2800 years. A new set of six sea-level rise scenarios, spanning a range from 30 cm to 250 cm of 21st century GMSL rise, was developed for the CSSR. The lower scenario is based on linearly extrapolating the past two decades' rate of rise. The upper scenario is informed by literature estimates of maximum physically plausible values, observations indicating the onset of marine ice sheet instability in parts of West Antarctica, and modeling of ice-cliff and ice-shelf instability mechanisms. The new scenarios include localized projections along US coastlines. There is significant variability around the US, with rates of rise likely greater than GMSL rise in the US Northeast and the western Gulf of Mexico. Under scenarios involving extreme Antarctic contributions, regional rise would be greater than GMSL rise along almost all US coastlines. Historical sea-level rise has already driven a 5- to 10-fold increase in minor tidal flooding in several US coastal cities since the 1960s. Under the CSSR's Intermediate sea-level rise scenario (1.0 m of GMSL rise in 2100), a majority of NOAA tide gauge locations will by 2040 experience the historical 5-year coastal flood about 5 times per year. Ocean changes are not limited to rising sea levels. Ocean pH is decreasing at a rate that may be unparalleled in the last 66 million years. Along coastlines, ocean acidification can be enhanced by changes in upwelling (particularly along the US Pacific Coast); by episodic, climate change-enhanced increases in freshwater input (particularly along the US Atlantic Coast); and by the enhancement of biological respiration by nutrient runoff. 
Climate models project

  8. State estimation for a hexapod robot

    CSIR Research Space (South Africa)

    Lubbe, Estelle

    2015-09-01

    Full Text Available This paper introduces a state estimation methodology for a hexapod robot that makes use of proprioceptive sensors and a kinematic model of the robot. The methodology focuses on providing reliable full pose state estimation for a commercially...

  9. Regression methodology in groundwater composition estimation with composition predictions for Romuvaara borehole KR10

    Energy Technology Data Exchange (ETDEWEB)

    Luukkonen, A.; Korkealaakso, J.; Pitkaenen, P. [VTT Communities and Infrastructure, Espoo (Finland)

    1997-11-01

Teollisuuden Voima Oy selected five investigation areas for preliminary site studies (1987–1992). The more detailed site investigation project, launched at the beginning of 1993 and presently supervised by Posiva Oy, concentrates on three investigation areas. Romuvaara at Kuhmo is one of the present target areas, and the geochemical, structural and hydrological data used in this study are extracted from there. The aim of the study is to develop suitable methods for groundwater composition estimation based on a group of known hydrogeological variables. The input variables used are related to the host type of groundwater, the hydrological conditions around the host location, the mixing potentials between different types of groundwater, and the minerals equilibrated with the groundwater. The output variables are electrical conductivity, Ca, Mg, Mn, Na, K, Fe, Cl, S, HS, SO₄, alkalinity, ³H, ¹⁴C, ¹³C, Al, Sr, F, Br and I concentrations, and pH of the groundwater. The methodology is to associate the known hydrogeological conditions (i.e. the input variables) with the known water compositions (the output variables), and to evaluate mathematical relations between these groups. Output estimations are made with two separate procedures: partial least squares regression on the principal components of the input variables, and training of neural networks with input-output pairs. Coefficients of the linear equations and the trained networks are alternative methods for actual predictions. The quality of the output predictions is monitored with confidence limit estimations, evaluated from input variable covariances and output variances, and with charge balance calculations. Groundwater compositions in Romuvaara borehole KR10 are predicted at 10 metre intervals with both prediction methods. 46 refs.
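The record's two-step estimation idea (project the hydrogeological inputs onto principal components, then regress the water-chemistry outputs on the component scores) can be sketched with synthetic data. All variable names, dimensions and data below are hypothetical placeholders, and ordinary least squares on the component scores stands in for the paper's partial least squares step:

```python
import numpy as np

# Hypothetical stand-in for the record's procedure: principal components of
# the hydrogeological inputs, then linear regression of the groundwater
# composition outputs on the component scores (OLS here, where the paper
# uses partial least squares).
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 6))                       # hypothetical input variables
Y = X @ rng.normal(size=(6, 3)) + 0.05 * rng.normal(size=(40, 3))  # outputs

Xc = X - X.mean(axis=0)                            # centre the inputs
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)  # principal components via SVD
k = 4
T = Xc @ Vt[:k].T                                  # scores on the first k components

# Regress the (centred) outputs on the component scores
coef, *_ = np.linalg.lstsq(T, Y - Y.mean(axis=0), rcond=None)
Y_hat = T @ coef + Y.mean(axis=0)                  # predicted compositions
rmse = np.sqrt(np.mean((Y - Y_hat) ** 2))
```

Truncating to k components regularises the regression, much as the paper's confidence-limit monitoring guards against over-interpreting noisy inputs.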

  10. Using CTX Image Features to Predict HiRISE-Equivalent Rock Density

    Science.gov (United States)

    Serrano, Navid; Huertas, Andres; McGuire, Patrick; Mayer, David; Ardvidson, Raymond

    2010-01-01

Methods have been developed to quantitatively assess rock hazards at candidate landing sites with the aid of images from the HiRISE camera onboard NASA's Mars Reconnaissance Orbiter. HiRISE is able to resolve rocks as small as 1 m in diameter. Some sites of interest do not have adequate coverage by the highest-resolution sensors, and there is a need to infer relevant information (such as site safety or underlying geomorphology) from lower-resolution imagery. The proposed approach would make it possible to obtain rock density estimates at a level close to or equal to those obtained from high-resolution sensors where individual rocks are discernible.

  11. The Global Experience of Deployment of Energy-Efficient Technologies in High-Rise Construction

    Science.gov (United States)

    Potienko, Natalia D.; Kuznetsova, Anna A.; Solyakova, Darya N.; Klyueva, Yulia E.

    2018-03-01

The objective of this research is to examine issues related to the increasing importance of energy-efficient technologies in high-rise construction. The paper investigates modern approaches to building design that involve the implementation of various energy-saving technologies in diverse climates and at different structural levels, including the levels of urban development, functionality, planning, construction and engineering. The research methodology is based on a comprehensive analysis of advanced global expertise in the design and construction of energy-efficient high-rise buildings, with an examination of their positive and negative features. The research also defines the basic principles of energy-efficient architecture. In addition, it draws parallels between the climate characteristics of the countries that lead in the field of energy-efficient high-rise construction and the climate of Russia, which makes it possible to apply the vast experience of many countries, wholly or partially. The paper also gives an analytical review of the results achieved by implementing energy-efficiency principles in high-rise architecture. The study findings determine the impact of energy-efficient technologies on high-rise architectural and planning solutions. In conclusion, the research states that, apart from the aesthetic and compositional interpretation of architectural forms, an architect nowadays has to address the task of finding a synthesis between technological and architectural solutions, which requires knowledge of advanced technologies. The findings reveal that the implementation of modern energy-efficient technologies in high-rise construction is of immediate interest and promises long-term benefits.

  12. A statistical methodology for the estimation of extreme wave conditions for offshore renewable applications

    DEFF Research Database (Denmark)

    Larsén, Xiaoli Guo; Kalogeri, Christina; Galanis, George

    2015-01-01

A statistical methodology is proposed to post-process outputs from a high-resolution numerical wave modeling system for extreme wave estimation based on the significant wave height. The approach is demonstrated through data analysis at a relatively deep water site, FINO 1, as well as a relatively shallow coastal site, Horns Rev, which is located in the North Sea, west of Denmark. The post-processing targets correcting the modeled time series of the significant wave height in order to match the statistics of the corresponding measurements, including not only the conventional parameters such as the mean and standard … as a characteristic index of extreme wave conditions. The results from the proposed methodology seem to be in good agreement with the measurements at both the relatively deep, open water and the shallow, coastal water sites, providing a potentially useful tool for offshore renewable energy applications. © 2015.
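A common way to turn a significant-wave-height series into an extreme wave statistic is a block-maxima extreme value fit. The sketch below fits a Gumbel distribution to synthetic annual maxima by the method of moments and evaluates a 50-year return level; the synthetic Hs series, the Gumbel choice, and the return period are illustrative assumptions, not the paper's actual post-processing procedure:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical 30 years of hourly significant wave height (metres); the real
# study post-processes modelled Hs at FINO 1 and Horns Rev instead.
hs = rng.weibull(1.5, size=(30, 8760)) * 1.2
annual_max = hs.max(axis=1)              # block (annual) maxima

# Method-of-moments fit of a Gumbel distribution to the annual maxima
gamma = 0.5772156649                     # Euler-Mascheroni constant
beta = annual_max.std(ddof=1) * np.sqrt(6) / np.pi
mu = annual_max.mean() - gamma * beta

# 50-year return level of significant wave height
T = 50
hs_50 = mu - beta * np.log(-np.log(1 - 1 / T))
```

The return level, rather than the mean, is the kind of "characteristic index of extreme wave conditions" that offshore design works from.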

  13. Assessing water quality of the Chesapeake Bay by the impact of sea level rise and warming

    Science.gov (United States)

    Wang, P.; Linker, L.; Wang, H.; Bhatt, G.; Yactayo, G.; Hinson, K.; Tian, R.

    2017-08-01

    The influence of sea level rise and warming on circulation and water quality of the Chesapeake Bay under projected climate conditions in 2050 were estimated by computer simulation. Four estuarine circulation scenarios in the estuary were run using the same watershed load in 1991-2000 period. They are, 1) the Base Scenario, which represents the current climate condition, 2) a Sea Level Rise Scenario, 3) a Warming Scenario, and 4) a combined Sea Level Rise and Warming Scenario. With a 1.6-1.9°C increase in monthly air temperatures in the Warming Scenario, water temperature in the Bay is estimated to increase by 0.8-1°C. Summer average anoxic volume is estimated to increase 1.4 percent compared to the Base Scenario, because of an increase in algal blooms in the spring and summer, promotion of oxygen consumptive processes, and an increase of stratification. However, a 0.5-meter Sea Level Rise Scenario results in a 12 percent reduction of anoxic volume. This is mainly due to increased estuarine circulation that promotes oxygen-rich sea water intrusion in lower layers. The combined Sea Level Rise and Warming Scenario results in a 10.8 percent reduction of anoxic volume. Global warming increases precipitation and consequently increases nutrient loads from the watershed by approximately 5-7 percent. A scenario that used a 10 percent increase in watershed loads and current estuarine circulation patterns yielded a 19 percent increase in summer anoxic volume, while a scenario that used a 10 percent increase in watershed loads and modified estuarine circulation patterns by the aforementioned sea level rise and warming yielded a 6 percent increase in summer anoxic volume. Impacts on phytoplankton, sediments, and water clarity were also analysed.

  14. Global warming and sea level rise. Chikyu Ondanka to kaimen josho

    Energy Technology Data Exchange (ETDEWEB)

    Mimura, N [Ibaraki University, Ibaraki (Japan). Faculty of Engineering

    1993-10-15

    This paper describes the following matters on the problems of global warming and sea level rise. The first evaluation report published by the inter-government panel on climate change (IPCC) in 1990 estimates that, if emission of greenhouse effect gas keeps increasing at the present rate, the air temperature and the average sea level would rise by 3[degree]C and 65 centimeters, respectively by 2100. Global warming would not only result in rise of the sea level, but also accompany changes in strengths and routes of tropical low pressure areas, and precipitation patterns. Downstream areas of large rivers and island countries on coral reefs may have a risk of getting submerged. Countries having coasts developed to high densities (Japan, for example) would be subjected to a high potential effect. An 'East Hemisphere International Conference on Sea Level Rising Problem' was held in Japan in August 1993 as part of the works to prepare the second evaluation report of the IPCC (publication scheduled for 1995). The conference was attended by 24 countries, and 43 study results were reported. 4 figs.

  15. Generalized Response Surface Methodology : A New Metaheuristic

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2006-01-01

Generalized Response Surface Methodology (GRSM) is a novel general-purpose metaheuristic based on Box and Wilson's Response Surface Methodology (RSM). Both GRSM and RSM estimate local gradients to search for the optimal solution. These gradients use local first-order polynomials. GRSM, however, uses
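The core RSM loop the abstract describes, namely fitting a local first-order polynomial around the current point and stepping along the estimated gradient, can be sketched as follows. The toy response surface, the 2² factorial design, and the fixed step size are assumptions for illustration, not part of GRSM itself:

```python
import numpy as np

# Toy response surface with its maximum at (2, -1); observations are noisy.
def response(x, rng):
    return -(x[0] - 2) ** 2 - (x[1] + 1) ** 2 + 0.01 * rng.normal()

rng = np.random.default_rng(2)
center = np.array([0.0, 0.0])
# 2^2 factorial design around the current centre point
design = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]], float) * 0.2

for _ in range(20):
    pts = center + design
    y = np.array([response(p, rng) for p in pts])
    # Local first-order polynomial y ≈ b0 + b1*x1 + b2*x2, fit by least squares
    A = np.column_stack([np.ones(len(pts)), pts])
    b = np.linalg.lstsq(A, y, rcond=None)[0]
    grad = b[1:]                                   # estimated local gradient
    center = center + 0.3 * grad / (np.linalg.norm(grad) + 1e-12)  # steepest ascent
```

After a handful of iterations the centre point settles near the optimum; classical RSM would then switch to a second-order model, which is one of the steps GRSM generalizes.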

  16. Reconciling past changes in Earth's rotation with 20th century global sea-level rise: Resolving Munk's enigma.

    Science.gov (United States)

    Mitrovica, Jerry X; Hay, Carling C; Morrow, Eric; Kopp, Robert E; Dumberry, Mathieu; Stanley, Sabine

    2015-12-01

In 2002, Munk defined an important enigma of 20th century global mean sea-level (GMSL) rise that has yet to be resolved. First, he listed three canonical observations related to Earth's rotation [(i) the slowing of Earth's rotation rate over the last three millennia inferred from ancient eclipse observations, and changes in the (ii) amplitude and (iii) orientation of Earth's rotation vector over the last century estimated from geodetic and astronomic measurements] and argued that they could all be fit by a model of ongoing glacial isostatic adjustment (GIA) associated with the last ice age. Second, he demonstrated that prevailing estimates of the 20th century GMSL rise (~1.5 to 2.0 mm/year), after correction for the maximum signal from ocean thermal expansion, implied mass flux from ice sheets and glaciers at a level that would grossly misfit the residual GIA-corrected observations of Earth's rotation. We demonstrate that the combination of lower estimates of the 20th century GMSL rise (up to 1990), improved modeling of the GIA process, and correction of the eclipse record for a signal due to angular momentum exchange between the fluid outer core and the mantle reconciles all three Earth rotation observations. This resolution adds confidence to recent estimates of individual contributions to 20th century sea-level change and to projections of GMSL rise to the end of the 21st century based on them.

  17. Binational Arsenic Exposure Survey: Methodology and Estimated Arsenic Intake from Drinking Water and Urinary Arsenic Concentrations

    Directory of Open Access Journals (Sweden)

    Robin B. Harris

    2012-03-01

Full Text Available The Binational Arsenic Exposure Survey (BAsES) was designed to evaluate probable arsenic exposures in selected areas of southern Arizona and northern Mexico, two regions with known elevated levels of arsenic in groundwater reserves. This paper describes the methodology of BAsES and the relationship between estimated arsenic intake from beverages and arsenic output in urine. Households from eight communities were selected for their varying groundwater arsenic concentrations in Arizona, USA and Sonora, Mexico. Adults responded to questionnaires and provided dietary information. A first morning urine void and water from all household drinking sources were collected. Associations between urinary arsenic concentration (total, organic, inorganic) and estimated level of arsenic consumed from water and other beverages were evaluated through crude associations and by random effects models. Median estimated total arsenic intake from beverages among participants from Arizona communities ranged from 1.7 to 14.1 µg/day compared to 0.6 to 3.4 µg/day among those from Mexico communities. In contrast, median urinary inorganic arsenic concentrations were greatest among participants from Hermosillo, Mexico (6.2 µg/L), whereas a high of 2.0 µg/L was found among participants from Ajo, Arizona. Estimated arsenic intake from drinking water was associated with urinary total arsenic concentration (p < 0.001), urinary inorganic arsenic concentration (p < 0.001), and urinary sum of species (p < 0.001). Urinary arsenic concentrations increased between 7% and 12% for each one percent increase in arsenic consumed from drinking water. Variability in arsenic intake from beverages and urinary arsenic output yielded counterintuitive results. Estimated intake of arsenic from all beverages was greatest among Arizonans, yet participants in Mexico had higher urinary total and inorganic arsenic concentrations. Other contributors to urinary arsenic concentrations should be evaluated.

  18. Rising tides, rising gates: The complex ecogeomorphic response of coastal wetlands to sea-level rise and human interventions

    Science.gov (United States)

    Sandi, Steven G.; Rodríguez, José F.; Saintilan, Neil; Riccardi, Gerardo; Saco, Patricia M.

    2018-04-01

Coastal wetlands are vulnerable to submergence due to sea-level rise, as shown by predictions of up to 80% global wetland loss by the end of the century. Coastal wetlands with mixed mangrove-saltmarsh vegetation are particularly vulnerable because sea-level rise can promote mangrove encroachment on saltmarsh, reducing overall wetland biodiversity. Here we use an ecogeomorphic framework that incorporates hydrodynamic effects, mangrove-saltmarsh dynamics, and soil accretion processes to assess the effects of control structures on wetland evolution. Migration and accretion patterns of mangrove and saltmarsh are heavily dependent on topography and control structures. We find that current management practices that incorporate a fixed gate for the control of mangrove encroachment are useful initially, but soon become ineffective due to sea-level rise. Raising the gate, to counteract the effects of sea-level rise and promote suitable hydrodynamic conditions, excludes mangrove and maintains saltmarsh over the entire simulation period of 100 years.

  19. Integrating wildfire plume rises within atmospheric transport models

    Science.gov (United States)

    Mallia, D. V.; Kochanski, A.; Wu, D.; Urbanski, S. P.; Krueger, S. K.; Lin, J. C.

    2016-12-01

    Wildfires can generate significant pyro-convection that is responsible for releasing pollutants, greenhouse gases, and trace species into the free troposphere, which are then transported a significant distance downwind from the fire. Oftentimes, atmospheric transport and chemistry models have a difficult time resolving the transport of smoke from these wildfires, primarily due to deficiencies in estimating the plume injection height, which has been highlighted in previous work as the most important aspect of simulating wildfire plume transport. As a result of the uncertainties associated with modeled wildfire plume rise, researchers face difficulties modeling the impacts of wildfire smoke on air quality and constraining fire emissions using inverse modeling techniques. Currently, several plume rise parameterizations exist that are able to determine the injection height of fire emissions; however, the success of these parameterizations has been mixed. With the advent of WRF-SFIRE, the wildfire plume rise and injection height can now be explicitly calculated using a fire spread model (SFIRE) that is dynamically linked with the atmosphere simulated by WRF. However, this model has only been tested on a limited basis due to computational costs. Here, we will test the performance of WRF-SFIRE in addition to several commonly adopted plume parameterizations (Freitas, Sofiev, and Briggs) for the 2013 Patch Springs (Utah) and 2012 Baker Canyon (Washington) fires, for both of which observations of plume rise heights are available. These plume rise techniques will then be incorporated within a Lagrangian atmospheric transport model (STILT) in order to simulate CO and CO2 concentrations during NASA's CARVE Earth Science Airborne Program over Alaska during the summer of 2012. Initial model results showed that STILT model simulations were unable to reproduce enhanced CO concentrations produced by Alaskan fires observed during 2012. Near-surface concentrations were drastically

  20. Do we have to take an acceleration of sea level rise into account?

    Science.gov (United States)

    Dillingh, D.; Baart, F.; de Ronde, J.

    2012-04-01

In view of the preservation of safety against inundation and of the many values and functions of the coastal zone, coastal retreat is no longer acceptable. That is why it was decided to maintain the Dutch coastline at its 1990 position. Later the preservation concept was extended to the Dutch coastal foundation, the area that encompasses all dune areas and hard sea defences and extends seawards to the 20 m depth contour. Present Dutch coastal policy is to grow with sea level by means of sand nourishments. A main issue for the planning of sand nourishments is the rate of sea level rise, because that is the main parameter for the volume of sand needed. The question is therefore whether we already have to take an acceleration of sea level rise into account. Six stations with long water level records, well spread along the Dutch coast, were analysed. Correction of the measured data was considered necessary for an adjustment of the NAP in 2005, a consequence of movements of the top of the Pleistocene, on which the NAP benchmarks are founded, and for the 18.6-year (nodal) cycle in the time series of yearly mean sea levels. It was concluded that no significant acceleration of sea level rise could yet be detected along the Dutch coast. Over the last 120 years sea level rose at an average rate of 19 cm per century relative to NAP (the Dutch ordnance datum). Time series shorter than about 50 years gave less robust estimates of sea level rise. Future sea level rise also needs consideration in view of the estimation of future sand nourishment volumes. Scenarios for sea level rise relative to 1990 were derived for the years 2050 and 2100 by the KNMI (Dutch Met Office) in 2006 for the Dutch situation. Plausible curves have been drawn from 1990, tangent to the linear regression line in 1990 and forced through the high and low scenario projections for 2050 and 2100. These curves show discrepancies with the measurements of the last decade.
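The trend estimation with a nodal-cycle correction described above can be sketched as a least-squares fit of a linear trend plus an 18.61-year harmonic. The synthetic series, noise level and trend value below are assumptions standing in for real tide-gauge records:

```python
import numpy as np

rng = np.random.default_rng(3)
years = np.arange(1890, 2011)
t = years - years[0]
# Synthetic annual mean sea level (cm): a 19 cm/century trend plus the
# 18.61-year nodal tide and noise; hypothetical stand-in for gauge data.
truth = 0.19 * t + 1.5 * np.cos(2 * np.pi * t / 18.61 + 0.4)
msl = truth + rng.normal(0, 2.0, size=t.size)

# Least-squares fit: msl ≈ a + b*t + c*cos(2πt/18.61) + d*sin(2πt/18.61)
w = 2 * np.pi / 18.61
A = np.column_stack([np.ones_like(t), t, np.cos(w * t), np.sin(w * t)])
a, b, c, d = np.linalg.lstsq(A, msl, rcond=None)[0]

trend_cm_per_century = b * 100
```

Fitting the cosine and sine terms jointly absorbs the nodal cycle, so the trend coefficient is not biased by whichever phase of the 18.6-year cycle the record happens to end on.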

  1. Application of a rising plate meter to estimate forage yield on dairy farms in Pennsylvania

    Science.gov (United States)

Accurately assessing pasture forage yield is necessary for producers who want to budget feed expenses and make informed pasture management decisions. Clipping and weighing forage from a known area is a direct method to measure pasture forage yield; however, it is time-consuming. The rising plate mete...

  2. Experimental methodology for obtaining sound absorption coefficients

    Directory of Open Access Journals (Sweden)

    Carlos A. Macía M

    2011-07-01

    Full Text Available Objective: the authors propose a new methodology for estimating sound absorption coefficients using genetic algorithms. Methodology: sound waves are generated and conducted along a rectangular silencer. The waves are then attenuated by the absorbing material covering the silencer’s walls. The attenuated sound pressure level is used in a genetic algorithm-based search to find the parameters of the proposed attenuation expressions that include geometric factors, the wavelength and the absorption coefficient. Results: a variety of adjusted mathematical models were found that make it possible to estimate the absorption coefficients based on the characteristics of a rectangular silencer used for measuring the attenuation of the noise that passes through it. Conclusions: this methodology makes it possible to obtain the absorption coefficients of new materials in a cheap and simple manner. Although these coefficients might be slightly different from those obtained through other methodologies, they provide solutions within the engineering accuracy ranges that are used for designing noise control systems.
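A minimal genetic-algorithm search of the kind the abstract describes might look like the following sketch. The Piening-type attenuation expression, the duct geometry, the synthetic "measurement", and the GA settings are all illustrative assumptions, not the authors' fitted model:

```python
import numpy as np

rng = np.random.default_rng(4)

# Assumed Piening-type estimate of lined-duct attenuation (dB),
# 1.5 * alpha * (U/S) * L, standing in for the paper's fitted expressions.
U, S, L = 1.6, 0.16, 2.0          # lined perimeter (m), cross-section (m^2), length (m)
def attenuation(alpha):
    return 1.5 * alpha * (U / S) * L

# Synthetic "measured" attenuation generated from a known alpha = 0.35
measured = attenuation(0.35) + rng.normal(0, 0.1)

# Minimal genetic algorithm searching for the absorption coefficient
pop = rng.uniform(0, 1, size=30)
for _ in range(60):
    fitness = -np.abs(attenuation(pop) - measured)    # closer => fitter
    parents = pop[np.argsort(fitness)][-10:]          # truncation selection
    children = rng.choice(parents, size=30) + rng.normal(0, 0.02, size=30)  # mutation
    pop = np.clip(children, 0, 1)

alpha_hat = pop[np.argmin(np.abs(attenuation(pop) - measured))]
```

With a single unknown the GA is overkill, but the same loop extends directly to the multi-parameter attenuation expressions (geometric factors, wavelength, absorption coefficient) that the paper searches over.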

  3. THE RISE TIME OF NORMAL AND SUBLUMINOUS TYPE Ia SUPERNOVAE

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez-Gaitan, S.; Perrett, K.; Carlberg, R. [Department of Astronomy and Astrophysics, University of Toronto, 50 St. george Street, Toronto, ON M5S 3H4 (Canada); Conley, A. [Center for Astrophysics and Space Astronomy, University of Colorado, 593 UCB, Boulder, CO 80309-0593 (United States); Bianco, F. B.; Howell, D. A.; Graham, M. L. [Department of Physics, University of California, Santa Barbara, Broida Hall, Mail Code 9530, Santa Barbara, CA 93106-9530 (United States); Sullivan, M.; Hook, I. M. [Department of Physics (Astrophysics), University of Oxford, DWB, Keble Road, Oxford, OX1 3RH (United Kingdom); Astier, P.; Balland, C.; Fourmanoit, N.; Guy, J.; Hardin, D.; Pain, R. [LPNHE, Universite Pierre et Marie Curie Paris 6, Universite Paris Diderot Paris 7, CNRS-IN2P3, 4 Place Jussieu, 75252 Paris Cedex 05 (France); Balam, D. [Dominion Astrophysical Observatory, Herzberg Institute of Astrophysics, 5071 West Saanich Road, Victoria, BC V9E 2E7 (Canada); Basa, S. [Laboratoire d' Astrophysique de Marseille, Pole de l' Etoile Site de Chateau-Gombert, 38, rue Frederic Joliot-Curie, 13388 Marseille cedex 13 (France); Fouchez, D. [CPPM, CNRS-IN2P3 and University Aix Marseille II, Case 907, 13288 Marseille cedex 9 (France); Lidman, C. [Australian Astronomical Observatory, P.O. Box 296, Epping, NSW 1710 (Australia); Palanque-Delabrouille, N., E-mail: gonzalez@astro.utoronto.ca [DSM/IRFU/SPP, CEA-Saclay, F-91191 Gif-sur-Yvette (France); and others

    2012-01-20

We calculate the average stretch-corrected rise time of Type Ia supernovae (SNe Ia) in the Supernova Legacy Survey. We use the aggregate light curves of spectroscopically and photometrically identified SNe Ia to fit the rising part of the light curve with a simple quadratic model. We obtain a light-curve-shape-corrected, i.e., stretch-corrected, fiducial rise time of 17.02 +0.18/−0.28 (stat) days. The measured rise time differs from an earlier finding by the SNLS (Conley et al.) due to the use of different SN Ia templates. We compare it to nearby samples using the same methods and find no evolution in the early part of the light curve of SNe Ia up to z = 1. We search for variations among different populations, particularly subluminous objects, by dividing the sample in stretch. Bright and slow decliners (s > 1.0) have stretch-corrected rise times consistent with those of fainter and faster decliners (0.8 < s ≤ 1.0); the latter are shorter by 0.57 +0.47/−0.50 (stat) days. Subluminous SNe Ia (here defined as objects with s ≤ 0.8), although less constrained, are also consistent, with a rise time of 18.03 +0.81/−1.37 (stat) days. We study several systematic biases and find that the use of different fiducial templates may affect the average rise time but not the intrinsic differences between populations. Based on our results, we estimate that subluminous SNe Ia are powered by 0.05-0.35 M☉ of ⁵⁶Ni synthesized in the explosion. Our conclusions are the same for the single-stretch and two-stretch parameterizations of the light curve.

  4. Researching Education Policy in a Globalized World: Theoretical and Methodological Considerations

    Science.gov (United States)

    Lingard, Bob

    2009-01-01

    This paper shows how globalization has given rise to a number of new theoretical and methodological issues for doing education policy analysis linked to globalization's impact within critical social science. Critical policy analysis has always required critical "reflexivity" and awareness of the "positionality" of the policy analyst. However, as…

  5. Automated remedial assessment methodology software system

    International Nuclear Information System (INIS)

    Whiting, M.; Wilkins, M.; Stiles, D.

    1994-11-01

The Automated Remedial Analysis Methodology (ARAM) software system has been developed by the Pacific Northwest Laboratory to assist the U.S. Department of Energy (DOE) in evaluating cleanup options for over 10,000 contaminated sites across the DOE complex. The automated methodology comprises modules for decision logic diagrams, technology applicability and effectiveness rules, mass balance equations, cost and labor estimating factors and equations, and contaminant stream routing. ARAM is used to select technologies for meeting cleanup targets; determine the effectiveness of the technologies in destroying, removing, or immobilizing contaminants; decide the nature and amount of secondary waste requiring further treatment; and estimate the cost and labor involved when applying technologies.

  6. Methodology to estimate variations in solar radiation reaching densely forested slopes in mountainous terrain.

    Science.gov (United States)

    Sypka, Przemysław; Starzak, Rafał; Owsiak, Krzysztof

    2016-12-01

Solar radiation reaching densely forested slopes is one of the main factors influencing the water balance between the atmosphere, tree stands and the soil. It also has a major impact on site productivity, the spatial arrangement of vegetation structure, and forest succession. This paper presents a methodology to estimate variations in solar radiation reaching tree stands in a small mountain valley. Measurements taken in three inter-forest meadows unambiguously showed the relationship between the amount of solar insolation and the shading effect caused mainly by the contour of surrounding tree stands. Therefore, appropriate knowledge of the elevation, aspect and tilt angles of the analysed planes had to be taken into consideration during modelling. At critical times, especially in winter, the diffuse and reflected components of solar radiation only reached some of the sites studied, as the beam component of solar radiation was totally blocked by the densely forested mountain slopes in the neighbourhood. The cross-section contours and elevation angles of all obstructions are estimated from a digital surface model including both a digital elevation model and the height of tree stands. All the parameters in a simplified, empirical model of the solar insolation reaching a given horizontal surface within the research valley are dependent on the sky view factor (SVF). The presented simplified, empirical model and its parameterisation scheme should be easily adaptable to different complex terrains or mountain valleys characterised by diverse geometry or spatial orientation. The model was developed and validated (R² = 0.92, σ = 0.54) based on measurements taken at research sites located in the Silesian Beskid Mountain Range. A thorough understanding of the factors determining the amount of solar radiation reaching woodlands ought to considerably expand the knowledge of the water exchange balance within forest complexes as well as the estimation of site
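The shading test at the heart of such a model, where the beam component is blocked whenever the sun's elevation falls below the angular height of the surrounding forested ridge (terrain rise plus stand height over horizontal distance), can be sketched as follows; all geometry values are hypothetical:

```python
import math

# Angular height of an obstruction seen from the site: terrain rise plus
# tree-stand height, over the horizontal distance to the obstruction.
def obstruction_angle(terrain_rise_m, stand_height_m, distance_m):
    return math.degrees(math.atan2(terrain_rise_m + stand_height_m, distance_m))

# Beam radiation reaches the site only when the sun is above the obstruction.
def beam_reaches_site(solar_elevation_deg, horizon_deg):
    return solar_elevation_deg > horizon_deg

horizon = obstruction_angle(terrain_rise_m=60.0, stand_height_m=25.0, distance_m=180.0)
blocked_winter_sun = not beam_reaches_site(15.0, horizon)   # low winter sun
```

Repeating the check around the full azimuth circle yields the horizon profile from which a sky view factor can be computed.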

  7. Introducing a methodology for estimating duration of surgery in health services research.

    Science.gov (United States)

    Redelmeier, Donald A; Thiruchelvam, Deva; Daneman, Nick

    2008-09-01

The duration of surgery is an indicator of the quality, risks, and efficiency of surgical procedures. We introduce a new methodology for assessing the duration of surgery based on anesthesiology billing records, along with reviewing its fundamental logic and limitations. The validity of the methodology was assessed through a population-based cohort of patients (n=480,986) undergoing elective operations in 246 Ontario hospitals with 1,084 anesthesiologists between April 1, 1992 and March 31, 2002 (10 years). The weaknesses of the methodology relate to missing data, self-serving exaggerations by providers, imprecision arising from clinical diversity, upper limits due to accounting regulations, fluctuations from updates over the years, national differences in reimbursement schedules, and the general failings of claims-based analyses. The strengths of the methodology are in providing data that match clinical experience, correspond to chart review, are consistent over time, can detect differences where differences would be anticipated, and might have implications for examining patient outcomes after long surgical times. We suggest that an understanding and application of large studies of surgical duration may help scientists explore selected questions concerning postoperative complications.

  8. Current trends in Bayesian methodology with applications

    CERN Document Server

    Upadhyay, Satyanshu K; Dey, Dipak K; Loganathan, Appaia

    2015-01-01

    Collecting Bayesian material scattered throughout the literature, Current Trends in Bayesian Methodology with Applications examines the latest methodological and applied aspects of Bayesian statistics. The book covers biostatistics, econometrics, reliability and risk analysis, spatial statistics, image analysis, shape analysis, Bayesian computation, clustering, uncertainty assessment, high-energy astrophysics, neural networking, fuzzy information, objective Bayesian methodologies, empirical Bayes methods, small area estimation, and many more topics.Each chapter is self-contained and focuses on

  9. Population dynamics of Hawaiian seabird colonies vulnerable to sea-level rise

    Science.gov (United States)

    Hatfield, Jeff S.; Reynolds, Michelle H.; Seavy, Nathaniel E.; Krause, Crystal M.

    2012-01-01

    Globally, seabirds are vulnerable to anthropogenic threats both at sea and on land. Seabirds typically nest colonially and show strong fidelity to natal colonies, and such colonies on low-lying islands may be threatened by sea-level rise. We used French Frigate Shoals, the largest atoll in the Hawaiian Archipelago, as a case study to explore the population dynamics of seabird colonies and the potential effects sea-level rise may have on these rookeries. We compiled historic observations, a 30-year time series of seabird population abundance, lidar-derived elevations, and aerial imagery of all the islands of French Frigate Shoals. To estimate the population dynamics of 8 species of breeding seabirds on Tern Island from 1980 to 2009, we used a Gompertz model with a Bayesian approach to infer population growth rates, density dependence, process variation, and observation error. All species increased in abundance, in a pattern that provided evidence of density dependence. Great Frigatebirds (Fregata minor), Masked Boobies (Sula dactylatra), Red-tailed Tropicbirds (Phaethon rubricauda), Spectacled Terns (Onychoprion lunatus), and White Terns (Gygis alba) are likely at carrying capacity. Density dependence may exacerbate the effects of sea-level rise on seabirds because populations near carrying capacity on an island will be more negatively affected than populations with room for growth. We projected 12% of French Frigate Shoals will be inundated if sea level rises 1 m and 28% if sea level rises 2 m. Spectacled Terns and shrub-nesting species are especially vulnerable to sea-level rise, but seawalls and habitat restoration may mitigate the effects of sea-level rise. Losses of seabird nesting habitat may be substantial in the Hawaiian Islands by 2100 if sea levels rise 2 m. Restoration of higher-elevation seabird colonies represent a more enduring conservation solution for Pacific seabirds.
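The Gompertz density-dependence model used for the Tern Island time series has a simple log-linear form that can be sketched with synthetic data. The simulation below uses ordinary least squares rather than the study's Bayesian state-space approach, and all parameter values are hypothetical:

```python
import numpy as np

# Gompertz (log-linear) population model: log N_{t+1} = a + b*log N_t + noise.
# b < 1 implies density dependence, with equilibrium log-abundance a/(1-b).
rng = np.random.default_rng(5)
a, b, sigma = 1.2, 0.8, 0.1          # hypothetical growth, density dependence, noise
n = 100
logN = np.empty(n)
logN[0] = np.log(50.0)
for t in range(n - 1):
    logN[t + 1] = a + b * logN[t] + rng.normal(0, sigma)

# Recover the parameters by regressing log N_{t+1} on log N_t
A = np.column_stack([np.ones(n - 1), logN[:-1]])
a_hat, b_hat = np.linalg.lstsq(A, logN[1:], rcond=None)[0]
K_log = a_hat / (1 - b_hat)          # implied equilibrium (carrying-capacity) log-abundance
```

A fitted b close to 1 would indicate little density dependence (room for growth), while b well below 1 with abundance near exp(K_log) matches the paper's "at carrying capacity" diagnosis.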

  10. Contribution of climate-driven change in continental water storage to recent sea-level rise

    Science.gov (United States)

    Milly, P. C. D.; Cazenave, A.; Gennero, C.

    2003-01-01

    Using a global model of continental water balance, forced by interannual variations in precipitation and near-surface atmospheric temperature for the period 1981–1998, we estimate the sea-level changes associated with climate-driven changes in storage of water as snowpack, soil water, and ground water; storage in ice sheets and large lakes is not considered. The 1981–1998 trend is estimated to be 0.12 mm/yr, and substantial interannual fluctuations are inferred; for 1993–1998, the trend is 0.25 mm/yr. At the decadal time scale, the terrestrial contribution to eustatic (i.e., induced by mass exchange) sea-level rise is significantly smaller than the estimated steric (i.e., induced by density changes) trend for the same period, but is not negligibly small. In the model the sea-level rise is driven mainly by a downtrend in continental precipitation during the study period, which we believe was generated by natural variability in the climate system. PMID:14576277
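The trend figures quoted above are linear fits to a modeled sea-level time series; a minimal sketch with synthetic data (the noise level and series below are invented, not the model output):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic monthly sea-level anomaly series (all values invented): a linear
# trend of 0.12 mm/yr plus noise, mimicking the 1981-1998 analysis window.
years = np.arange(1981, 1999, 1 / 12)        # monthly time axis
true_trend = 0.12                            # mm/yr
sea_level = true_trend * (years - years[0]) + rng.normal(0, 0.3, years.size)

# Ordinary least-squares trend, in mm/yr.
trend, intercept = np.polyfit(years, sea_level, 1)
```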

  11. Cities and Sea Level Rise: A Roadmap for Flood Hazard Adaptation

    Science.gov (United States)

    Horn, Diane; Cousins, Ann

    2016-04-01

    Coastal cities will face a range of increasingly severe challenges as sea level rises, and adaptation to future flood risk will require more than structural defences. Many cities will not be able to rely solely on engineering structures for protection and will need to develop a suite of policy responses to increase their resilience to impacts of rising sea level. The tools to promote flood risk adaptation are already within the capacity of most cities, with an assortment of policy tools available to address other land-use problems which can be refashioned and used to adapt to sea level rise. This study reviews approaches for urban adaptation through detailed analyses of case studies of cities which have developed flood adaptation strategies that combine structural defences with innovative approaches to living with flood risk. The aim of the overall project is to produce a 'roadmap' to guide practitioners through the process of analysing coastal flood risk in urban areas. Methodologies and tools to estimate vulnerability to coastal flooding, damages suffered, and the assessment of flood defences and adaptation measures are complemented with a discussion on the essential impact that local policy has on the treatment of coastal flooding and the constraints and opportunities that result from the specific country or locality characteristics in relation to economic, political, social and environmental priorities, which are likely to dictate the approach to coastal flooding and the actions proposed. Case studies of adaptation strategies used by Rotterdam, Bristol, Ho Chi Minh City and Norfolk, Virginia, are used to draw out a range of good practice elements that promote effective adaptation to sea level rise. These can be grouped into risk reduction, governance issues, and insurance, and can be used to provide examples of how other cities could adopt and implement flood adaptation strategies from a relatively limited starting position. 
Most cities will neither be able to

  12. Photovoltaic module energy rating methodology development

    Energy Technology Data Exchange (ETDEWEB)

    Kroposki, B.; Myers, D.; Emery, K.; Mrig, L. [National Renewable Energy Lab., Golden, CO (United States); Whitaker, C.; Newmiller, J. [Endecon Engineering, San Ramon, CA (United States)

    1996-05-01

    A consensus-based methodology to calculate the energy output of a PV module will be described in this paper. The methodology develops a simple measure of PV module performance that provides for a realistic estimate of how a module will perform in specific applications. The approach makes use of the weather data profiles that describe conditions throughout the United States and emphasizes performance differences between various module types. An industry-representative Technical Review Committee has been assembled to provide feedback and guidance on the strawman and final approach used in developing the methodology.

  13. Lifetime prediction and reliability estimation methodology for Stirling-type pulse tube refrigerators by gaseous contamination accelerated degradation testing

    Science.gov (United States)

    Wan, Fubin; Tan, Yuanyuan; Jiang, Zhenhua; Chen, Xun; Wu, Yinong; Zhao, Peng

    2017-12-01

    Lifetime and reliability are the two performance parameters of premium importance for modern space Stirling-type pulse tube refrigerators (SPTRs), which are required to operate in excess of 10 years. Demonstration of these parameters provides a significant challenge. This paper proposes a lifetime prediction and reliability estimation method that utilizes accelerated degradation testing (ADT) for SPTRs related to gaseous contamination failure. The method was experimentally validated via three groups of gaseous contamination ADT. First, the performance degradation model based on the mechanism of contamination failure and material outgassing characteristics of SPTRs was established. Next, a preliminary test was performed to determine whether the mechanism of contamination failure of the SPTRs during ADT is consistent with normal life testing. Subsequently, the experimental program of ADT was designed for SPTRs. Then, three groups of gaseous contamination ADT were performed at elevated ambient temperatures of 40 °C, 50 °C, and 60 °C, respectively, and the estimated lifetimes of the SPTRs under normal conditions were obtained through an acceleration model (the Arrhenius model). The results show good fitting of the degradation model with the experimental data. Finally, we obtained the reliability estimate of the SPTRs using the Weibull distribution. The proposed methodology enables the reliability of SPTRs designed for more than 10 years of service to be estimated in less than one year.
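The Arrhenius extrapolation step can be illustrated with a short sketch: fit ln k versus 1/T to degradation rates observed at the three elevated test temperatures, then extrapolate to the operating temperature. All rates, the failure threshold, and the operating temperature below are hypothetical, not values from the paper.

```python
import math

# Hypothetical degradation rates (fraction of cooling capacity lost per
# year) observed in ADT at three elevated temperatures; all numbers are
# invented for illustration, not taken from the paper.
tests = {40.0: 0.08, 50.0: 0.16, 60.0: 0.30}   # degC -> rate per year

R = 8.314  # gas constant, J/(mol K)

# Fit ln k = ln A - Ea/(R T) by least squares on the three points.
xs = [1.0 / (273.15 + t) for t in tests]
ys = [math.log(k) for k in tests.values()]
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
lnA = ybar - slope * xbar
Ea = -slope * R                                # activation energy, J/mol

# Extrapolate to a 20 degC operating temperature and convert a hypothetical
# failure threshold (20% capacity loss) into a predicted lifetime.
k_20 = math.exp(lnA + slope / (273.15 + 20.0))
lifetime_years = 0.20 / k_20
```

With these invented rates the extrapolated lifetime lands in the 10-year range, which is the kind of claim the acceleration model is used to support.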

  14. A Probabilistic and Observation Based Methodology to Estimate Small Craft Harbor Vulnerability to Tsunami Events

    Science.gov (United States)

    Keen, A. S.; Lynett, P. J.; Ayca, A.

    2016-12-01

    Because of the damage resulting from the 2010 Chile and 2011 Japan tele-tsunamis, the tsunami risk to the small craft marinas in California has become an important concern. The talk will outline an assessment tool which can be used to assess the tsunami hazard to small craft harbors. The methodology is based on the demand and structural capacity of the floating dock system, composed of floating docks/fingers and moored vessels. The structural demand is determined using a Monte Carlo methodology, a probabilistic computational tool in which the governing equations may be well known, but the independent input variables (demand) as well as the resisting structural components (capacity) may not be completely known. The Monte Carlo approach draws each variable from its described distribution to generate a single computation, and the process then repeats hundreds or thousands of times. The numerical model "Method of Splitting Tsunamis" (MOST) has been used to determine the inputs for the small craft harbors within California. Hydrodynamic model results of current speed, direction and surface elevation were incorporated via the drag equations to provide the basis of the demand term. To determine the capacities, an inspection program was developed to identify common features of structural components. A total of six harbors have been inspected, ranging from Crescent City in Northern California to Oceanside Harbor in Southern California. Results from the inspection program were used to develop component capacity tables which incorporate the basic specifications of each component (e.g. bolt size and configuration) and a reduction factor (which accounts for the component's reduction in capacity with age) to estimate in situ capacities. Like the demand term, these capacities are added probabilistically into the model. To date the model has been applied to Santa Cruz Harbor as well as Noyo River. Once
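The demand-versus-capacity Monte Carlo described above can be sketched as follows. Every constant and distribution below is invented for illustration; the real model draws its demand from MOST hydrodynamic output and its capacities from inspection-derived tables.

```python
import random

random.seed(1)

# Toy demand/capacity Monte Carlo in the spirit of the abstract: drag load
# on a dock connection from tsunami currents vs. aged bolt capacity.
# Every distribution and constant below is invented for illustration.
RHO = 1025.0        # seawater density, kg/m^3
CD = 1.2            # drag coefficient (assumed)
AREA = 8.0          # projected area of moored vessel, m^2 (assumed)

def one_trial():
    u = random.gauss(3.0, 1.0)               # sampled current speed, m/s
    demand = 0.5 * RHO * CD * AREA * max(u, 0.0) ** 2   # drag force, N
    nominal = random.gauss(60e3, 8e3)        # as-built capacity, N
    aging = random.uniform(0.6, 1.0)         # age-related reduction factor
    return demand > nominal * aging          # True means the component fails

N = 20000
p_fail = sum(one_trial() for _ in range(N)) / N   # failure probability
```

Repeating the draw thousands of times turns the uncertain demand and capacity distributions into a single failure-probability estimate, which is the quantity the assessment tool reports per component.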

  15. Sea-level-rise trends off the Indian coasts during the last two decades

    Digital Repository Service at National Institute of Oceanography (India)

    Unnikrishnan, A.S.; Nidheesh, A.G.; Lengaigne, M.

    The present communication discusses sea-level-rise trends in the north Indian Ocean, particularly off the Indian coasts, based on estimates derived from satellite altimeter and tide-gauge data. Altimeter data analysis over the 1993–2012 period...

  16. The methodology proposed to estimate the absorbed dose at the entrance of the labyrinth in HDR brachytherapy facilities with IR-192

    International Nuclear Information System (INIS)

    Pujades, M. C.; Perez-Calatayud, J.; Ballester, F.

    2012-01-01

    In the absence of a procedure for evaluating the design of a brachytherapy (BT) vault with maze from the point of view of radiation protection, the formalism for external-beam radiation is usually adapted. The purpose of this study is to adapt the methodology described in the National Council on Radiation Protection and Measurements Report 151 (NCRP 151), Structural Shielding Design for Megavoltage X- and Gamma-Ray Radiotherapy Facilities, for estimating the dose at the door in BT, and to compare the results with those obtained by the Monte Carlo (MC) method for a special case of bunker. (Author) 17 refs.

  17. Evaluation of the conservativeness of the methodology for estimating earthquake-induced movements of fractures intersecting canisters

    International Nuclear Information System (INIS)

    La Pointe, Paul R.; Cladouhos, Trenton T.; Outters, Nils; Follin, Sven

    2000-04-01

    This study evaluates the parameter sensitivity and the conservativeness of the methodology outlined in TR 99-03. Sensitivity analysis focuses on understanding how variability in input parameter values impacts the calculated fracture displacements. These studies clarify what parameters play the greatest role in fracture movements, and help define critical values of these parameters in terms of canister failures. The thresholds or intervals of values that lead to a certain level of canister failure calculated in this study could be useful for evaluating future candidate sites. Key parameters include: 1. magnitude/frequency of earthquakes; 2. the distance of the earthquake from the canisters; 3. the size and aspect ratio of fractures intersecting canisters; and 4. the orientation of the fractures. The results of this study show that distance and earthquake magnitude are the most important factors, followed by fracture size. Fracture orientation is much less important. Regression relations were developed to predict induced fracture slip as a function of distance and either earthquake magnitude or slip on the earthquake fault. These regression relations were validated by using them to estimate the number of canister failures due to single damaging earthquakes at Aberg, and comparing these estimates with those presented in TR 99-03. The methodology described in TR 99-03 employs several conservative simplifications in order to devise a numerically feasible method to estimate fracture movements due to earthquakes outside of the repository over the next 100,000 years. These simplifications include: 1. fractures are assumed to be frictionless and cohesionless; 2. all energy transmitted to the fracture by the earthquake is assumed to produce elastic deformation of the fracture; no energy is diverted into fracture propagation; and 3. shielding effects of other fractures between the earthquake and the fracture are neglected. 
The numerical modeling effectively assumes that the

  18. Evaluation of the conservativeness of the methodology for estimating earthquake-induced movements of fractures intersecting canisters

    Energy Technology Data Exchange (ETDEWEB)

    La Pointe, Paul R.; Cladouhos, Trenton T. [Golder Associates Inc., Las Vegas, NV (United States); Outters, Nils; Follin, Sven [Golder Grundteknik KB, Stockholm (Sweden)

    2000-04-01

    This study evaluates the parameter sensitivity and the conservativeness of the methodology outlined in TR 99-03. Sensitivity analysis focuses on understanding how variability in input parameter values impacts the calculated fracture displacements. These studies clarify what parameters play the greatest role in fracture movements, and help define critical values of these parameters in terms of canister failures. The thresholds or intervals of values that lead to a certain level of canister failure calculated in this study could be useful for evaluating future candidate sites. Key parameters include: 1. magnitude/frequency of earthquakes; 2. the distance of the earthquake from the canisters; 3. the size and aspect ratio of fractures intersecting canisters; and 4. the orientation of the fractures. The results of this study show that distance and earthquake magnitude are the most important factors, followed by fracture size. Fracture orientation is much less important. Regression relations were developed to predict induced fracture slip as a function of distance and either earthquake magnitude or slip on the earthquake fault. These regression relations were validated by using them to estimate the number of canister failures due to single damaging earthquakes at Aberg, and comparing these estimates with those presented in TR 99-03. The methodology described in TR 99-03 employs several conservative simplifications in order to devise a numerically feasible method to estimate fracture movements due to earthquakes outside of the repository over the next 100,000 years. These simplifications include: 1. fractures are assumed to be frictionless and cohesionless; 2. all energy transmitted to the fracture by the earthquake is assumed to produce elastic deformation of the fracture; no energy is diverted into fracture propagation; and 3. shielding effects of other fractures between the earthquake and the fracture are neglected. 
The numerical modeling effectively assumes that the

  19. Combining tracer flux ratio methodology with low-flying aircraft measurements to estimate dairy farm CH4 emissions

    Science.gov (United States)

    Daube, C.; Conley, S.; Faloona, I. C.; Yacovitch, T. I.; Roscioli, J. R.; Morris, M.; Curry, J.; Arndt, C.; Herndon, S. C.

    2017-12-01

    Livestock activity (enteric fermentation of feed and anaerobic digestion of waste) contributes significantly to the methane budget of the United States (EPA, 2016). Studies question the reported magnitude of these methane sources (Miller et al., 2013), calling for more detailed research of agricultural animals (Hristov, 2014). Tracer flux ratio is an attractive experimental method to bring to this problem because it does not rely on estimates of atmospheric dispersion. Data were collected during one week at two dairy farms in central California (June 2016). The farms varied in size, layout, head count, and general operation. The tracer flux ratio method involves releasing ethane on-site at a known flow rate to serve as a tracer gas. Downwind enhancements in ethane (from the tracer) and methane (from the dairy) were measured, and their ratio used to infer the unknown methane emission rate from the farm. An instrumented van drove transects downwind of each farm on public roads while tracer gases were released on-site, employing the tracer flux ratio methodology to assess simultaneous methane and tracer gas plumes. Flying circles around each farm, a small instrumented aircraft made measurements to perform a mass balance evaluation of methane gas. In the course of these two different methane quantification techniques, we were able to validate yet a third method: tracer flux ratio measured via aircraft. Ground-based tracer release rates were applied to the aircraft-observed methane-to-ethane ratios, yielding whole-site methane emission rates. Never before has the tracer flux ratio method been executed with aircraft measurements. Estimates from this new application closely resemble results from the standard ground-based technique to within their respective uncertainties. Incorporating this new dimension into the tracer flux ratio methodology provides additional context for local plume dynamics and validation of both ground- and flight-based data.
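The core tracer-flux-ratio arithmetic is simple enough to sketch: the downwind methane-to-ethane enhancement ratio, multiplied by the known tracer release rate (with a molar-mass correction for mass units), gives the site emission rate. All measurement values below are invented.

```python
# Tracer flux ratio from paired downwind enhancements (all values invented).
M_CH4, M_C2H6 = 16.04, 30.07            # molar masses, g/mol

q_tracer_kg_h = 2.0                     # known ethane release rate, kg/h (assumed)

# Enhancements above background (ppb) from hypothetical van transects:
dch4 = [105.0, 220.0, 340.0, 460.0]     # methane, from the dairy
dc2h6 = [10.0, 21.0, 33.0, 44.0]        # ethane, from the tracer

# Through-origin least-squares slope gives the mole-ratio enhancement;
# both plumes dilute identically, so the ratio survives dispersion.
slope = sum(a * b for a, b in zip(dch4, dc2h6)) / sum(b * b for b in dc2h6)

# Scale the tracer's molar release by the ratio; convert back to mass flow.
q_ch4_kg_h = q_tracer_kg_h * slope * (M_CH4 / M_C2H6)
```

This is why the method needs no dispersion model: whatever dilution the plume undergoes applies equally to both gases, leaving the enhancement ratio intact.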

  20. Statistical methodology for estimating the mean difference in a meta-analysis without study-specific variance information.

    Science.gov (United States)

    Sangnawakij, Patarawan; Böhning, Dankmar; Adams, Stephen; Stanton, Michael; Holling, Heinz

    2017-04-30

    Statistical inference for analyzing the results from several independent studies on the same quantity of interest has been investigated frequently in recent decades. Typically, any meta-analytic inference requires that the quantity of interest is available from each study together with an estimate of its variability. The current work is motivated by a meta-analysis on comparing two treatments (thoracoscopic and open) of congenital lung malformations in young children. Quantities of interest include continuous end-points such as length of operation or number of chest tube days. As studies only report mean values (and no standard errors or confidence intervals), the question arises how meta-analytic inference can be developed. We suggest two methods to estimate study-specific variances in such a meta-analysis, where only sample means and sample sizes are available in the treatment arms. A general likelihood ratio test is derived for testing equality of variances in two groups. By means of simulation studies, the bias and estimated standard error of the overall mean difference from both methodologies are evaluated and compared with two existing approaches: complete study analysis only and partial variance information. The performance of the test is evaluated in terms of type I error. Additionally, we illustrate these methods in the meta-analysis on comparing thoracoscopic and open surgery for congenital lung malformations and in a meta-analysis on the change in renal function after kidney donation. Copyright © 2017 John Wiley & Sons, Ltd.
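Once study-specific variances are available (the paper's contribution is estimating them from means and sample sizes alone), the overall mean difference follows from standard inverse-variance pooling, sketched below with invented study data:

```python
import math

# Standard inverse-variance pooling of a mean difference across studies,
# the step that becomes possible once study variances have been estimated.
# All study values below are invented for illustration.
# Each tuple: (mean_treatment, mean_control, variance_of_the_difference)
studies = [(120.0, 150.0, 400.0),
           (110.0, 160.0, 900.0),
           (130.0, 145.0, 250.0)]

weights = [1.0 / v for (_, _, v) in studies]       # precision weights
diffs = [mt - mc for (mt, mc, _) in studies]       # per-study differences
pooled = sum(w * d for w, d in zip(weights, diffs)) / sum(weights)
se = math.sqrt(1.0 / sum(weights))                 # standard error of pooled
ci = (pooled - 1.96 * se, pooled + 1.96 * se)      # 95% confidence interval
```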

  1. The contribution to future flood risk in the Severn Estuary from extreme sea level rise due to ice sheet mass loss

    Science.gov (United States)

    Quinn, N.; Bates, P. D.; Siddall, M.

    2013-12-01

    The rate at which sea levels will rise in the coming century is of great interest to decision makers tasked with developing mitigation policies to cope with the risk of coastal inundation. Accurate estimates of future sea levels are vital in the provision of effective policy. Recent reports from the UK Climate Impacts Programme (UKCIP) suggest that mean sea levels in the UK may rise by as much as 80 cm by 2100; however, a great deal of uncertainty surrounds model predictions, particularly the contribution from ice sheets responding to climatic warming. For this reason, the application of semi-empirical modelling approaches for sea level rise predictions has increased of late, the results from which suggest that the rate of sea level rise may be greater than previously thought, exceeding 1 m by 2100. Furthermore, studies in the Red Sea indicate that rapid sea level rise beyond 1 m per century has occurred in the past. In light of such research, the latest UKCIP assessment has included a H++ scenario for sea level rise in the UK of up to 1.9 m, which is defined as improbable but, crucially, physically plausible. The significance of such low-probability sea level rise scenarios for the estimation of future flood risk is assessed using the Somerset Levels (UK) as a case study. A simple asymmetric probability distribution is constructed to include sea level rise scenarios of up to 1.9 m by 2100, which are added to a current 1:200 year event water level to force a two-dimensional hydrodynamic model of coastal inundation. From the resulting ensemble predictions an estimate of risk by 2100 is established. The results indicate that although the likelihood of extreme sea level rise due to rapid ice sheet mass loss is low, the resulting hazard can be large, resulting in a significant (27%) increase to the projected annual risk. 
Furthermore, current defence construction guidelines for the coming century in the UK are expected to account for 95% of the sea level rise distribution
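The way a low-probability tail inflates expected annual risk can be sketched with a toy discrete distribution; the scenario probabilities and damage figures below are invented, not the study's.

```python
# Expected annual damage under an asymmetric sea-level-rise distribution.
# Scenario probabilities and damages are invented for illustration.
scenarios = [                # (rise in m, probability, damage in GBP M/yr)
    (0.3, 0.50, 10.0),
    (0.8, 0.40, 25.0),
    (1.9, 0.10, 120.0),      # H++ tail: improbable but physically plausible
]
assert abs(sum(p for _, p, _ in scenarios) - 1.0) < 1e-9

risk = sum(p * d for _, p, d in scenarios)                    # with the tail
risk_no_tail = sum(p * d for _, p, d in scenarios[:2]) / 0.9  # renormalised
tail_share = (risk - risk_no_tail) / risk_no_tail             # relative increase
```

Even a 10% tail scenario can dominate the expectation when its damages are an order of magnitude larger, which is the qualitative point the study makes about the H++ case.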

  2. Mixed-mode modelling mixing methodologies for organisational intervention

    CERN Document Server

    Clarke, Steve; Lehaney, Brian

    2001-01-01

    The 1980s and 1990s have seen a growing interest in research and practice in the use of methodologies within problem contexts characterised by a primary focus on technology, human issues, or power. During the last five to ten years, this has given rise to challenges regarding the ability of a single methodology to address all such contexts, and the consequent development of approaches which aim to mix methodologies within a single problem situation. This has been particularly so where the situation has called for a mix of technological (the so-called 'hard') and human-centred (so-called 'soft') methods. The approach developed has been termed mixed-mode modelling. The area of mixed-mode modelling is relatively new, with the phrase being coined approximately four years ago by Brian Lehaney in a keynote paper published at the 1996 Annual Conference of the UK Operational Research Society. Mixed-mode modelling, as suggested above, is a new way of considering problem situations faced by organisations. Traditional...

  3. Achieving 95% probability level using best estimate codes and the code scaling, applicability and uncertainty (CSAU) [Code Scaling, Applicability and Uncertainty] methodology

    International Nuclear Information System (INIS)

    Wilson, G.E.; Boyack, B.E.; Duffey, R.B.; Griffith, P.; Katsma, K.R.; Lellouche, G.S.; Rohatgi, U.S.; Wulff, W.; Zuber, N.

    1988-01-01

    Issuance of a revised rule for loss-of-coolant accident/emergency core cooling system (LOCA/ECCS) analysis of light water reactors will allow the use of best-estimate (BE) computer codes in safety analysis, together with uncertainty analysis. This paper describes a systematic methodology, CSAU (Code Scaling, Applicability and Uncertainty), which will provide uncertainty bounds in a cost-effective, auditable, rational and practical manner. 8 figs., 2 tabs.

  4. Methodology to estimate the cost of the severe accidents risk / maximum benefit; Metodologia para estimar el costo del riesgo de accidentes severos / beneficio maximo

    Energy Technology Data Exchange (ETDEWEB)

    Mendoza, G.; Flores, R. M.; Vega, E., E-mail: gozalo.mendoza@inin.gob.mx [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico)

    2016-09-15

    For programs and activities to manage aging effects, any changes to plant operations, inspections, maintenance activities, systems, and administrative control procedures during the renewal period that could impact the environment should be characterized and designed to manage the effects of aging as required by 10 CFR Part 54. Environmental impacts significantly different from those described in the final environmental statement for the current operating license should be described in detail. When complying with the requirements of a license renewal application, the Severe Accident Mitigation Alternatives (SAMA) analysis is contained in a supplement to the environmental report of the plant that meets the requirements of 10 CFR Part 51. In this paper, the methodology for estimating the cost of severe accident risk is established and discussed; it is then used to identify and select severe accident mitigation alternatives, which are analyzed to estimate the maximum benefit that an alternative could achieve if it eliminated all risk. The cost of severe accident risk is estimated using the regulatory analysis techniques of the US Nuclear Regulatory Commission (NRC). The ultimate goal of implementing the methodology is to identify SAMA candidates that have the potential to reduce severe accident risk and to determine whether the implementation of each candidate is cost-effective. (Author)

  5. USGS Methodology for Assessing Continuous Petroleum Resources

    Science.gov (United States)

    Charpentier, Ronald R.; Cook, Troy A.

    2011-01-01

    The U.S. Geological Survey (USGS) has developed a new quantitative methodology for assessing resources in continuous (unconventional) petroleum deposits. Continuous petroleum resources include shale gas, coalbed gas, and other oil and gas deposits in low-permeability ("tight") reservoirs. The methodology is based on an approach combining geologic understanding with well productivities. The methodology is probabilistic, with both input and output variables as probability distributions, and uses Monte Carlo simulation to calculate the estimates. The new methodology is an improvement of previous USGS methodologies in that it better accommodates the uncertainties in undrilled or minimally drilled deposits that must be assessed using analogs. The publication is a collection of PowerPoint slides with accompanying comments.

  6. Methodology development for the radioecological monitoring effectiveness estimation

    International Nuclear Information System (INIS)

    Gusev, A.E.; Kozlov, A.A.; Lavrov, K.N.; Sobolev, I.A.; Tsyplyakova, T.P.

    1997-01-01

    A general model for estimating the effectiveness of programs assuring radiation and ecological protection of the public is described. A complex of purposes and criteria is selected that characterizes, and makes it possible to estimate, the effectiveness of the composition of environment protection programs. An algorithm is considered for selecting the optimal management decision from the viewpoint of the cost of work connected with improving the protection of the population. The place of radiation-ecological monitoring within the general problem of environmental pollution is determined. It is shown that the effectiveness of monitoring organization is closely connected with the radiation and ecological protection of the population

  7. Assessing the effectiveness of voluntary solid waste reduction policies: Methodology and a Flemish case study

    International Nuclear Information System (INIS)

    Jaeger, Simon de; Eyckmans, Johan

    2008-01-01

    The purpose of this paper is to illustrate the use of statistical techniques to evaluate the effectiveness of voluntary policy instruments for waste management. The voluntary character of these instruments implies that latent characteristics, unobserved by the analyst, might influence the subscription decision and might lead to biased estimates of the effectiveness of the policy instrument if standard techniques are used. We propose an extension of the difference-in-differences (DiD) estimator to evaluate the effectiveness of voluntary policy instruments, which is termed the dynamic difference-in-differences (or DDD) estimator. We illustrate the technique by estimating the effectiveness of voluntary cooperation agreements between the Flemish environmental administration and individual municipalities aimed at curbing residential solid waste. Using a dataset covering all 308 Flemish municipalities for the period 2000-2005, our results indicate that municipalities subscribing to the agreement accomplished less reduction of their waste levels compared to what could be expected on the basis of their own performance prior to subscription and the performance of the non-subscribers. This result might be explained by the rising marginal cost of extra residential solid waste reduction policies. In addition, there are indications that subscribing municipalities refrain from additional reduction efforts once the target waste level of the program is achieved. The more complicated DDD methodology is shown to generate additional insight over the ordinary DiD analysis
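The plain difference-in-differences contrast underlying the DDD extension reduces to four group-period means; a toy sketch with invented waste figures:

```python
# Difference-in-differences from four group-period means (figures invented):
# residential waste per capita (kg/yr), before vs. after the agreement.
treated_before, treated_after = 560.0, 540.0    # subscribing municipalities
control_before, control_after = 555.0, 525.0    # non-subscribers

did = (treated_after - treated_before) - (control_after - control_before)
# did > 0: subscribers reduced waste *less* than the counterfactual implied
# by non-subscribers, echoing the paper's qualitative finding.
```

The paper's DDD estimator extends this contrast to account for latent characteristics driving the subscription decision; the four-mean version above is only the baseline it improves on.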

  8. Development of a methodology for the assessment of sea level rise impacts on Florida's transportation modes and infrastructure : [summary].

    Science.gov (United States)

    2012-01-01

    In Florida, low elevations can make transportation infrastructure in coastal and low-lying areas potentially vulnerable to sea level rise (SLR). Because global SLR forecasts lack precision at local or regional scales, SLR forecasts or scenarios for p...

  9. Genome size estimation: a new methodology

    Science.gov (United States)

    Álvarez-Borrego, Josué; Gallardo-Escárate, Crisitian; Kober, Vitaly; López-Bonilla, Oscar

    2007-03-01

    Recently, within cytogenetic analysis, the evolutionary relations implied by the nuclear DNA content of plants and animals have received great attention. The first detailed measurements of nuclear DNA content were made in the early 1940s, several years before Watson and Crick proposed the molecular structure of DNA. In the following years Hewson Swift developed the concept of the "C-value" in reference to the haploid DNA content in plants. Later Mirsky and Ris carried out the first systematic study of genome size in animals, including representatives of the five superclasses of vertebrates as well as some invertebrates. From these preliminary results it became evident that DNA content varies enormously between species and that this variation bears no relation to intuitive notions of organismal complexity. This observation was reaffirmed as studies of genome size accumulated, and the phenomenon came to be called the "C-value paradox". A few years later, with the discovery of non-coding DNA, the paradox was resolved; nevertheless, numerous questions remain open to this day, and such studies are now referred to as the "C-value enigma". In this study, we report a new method for genome size estimation based on quantification of fluorescence fading. We measured the fluorescence intensity every 1600 milliseconds in DAPI-stained nuclei. The estimate of the area under the curve (integral fading) during the fading period was related to genome size.
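The "integral fading" quantity is the area under the fluorescence decay curve; a sketch using trapezoidal integration and a calibration standard of known C-value (all intensities and the standard's C-value are invented):

```python
# Sketch of the "integral fading" idea: integrate DAPI fluorescence decay
# sampled every 1.6 s and convert to genome size against a calibration
# standard of known C-value. All intensities and the standard are invented.
DT = 1.6                                   # sampling interval, s

def integral_fading(intensities, dt=DT):
    """Trapezoidal area under the fading curve."""
    return sum((a + b) * dt / 2.0 for a, b in zip(intensities, intensities[1:]))

sample = [100.0, 88.0, 77.0, 68.0, 60.0, 53.0]       # unknown nucleus
standard = [100.0, 90.0, 81.0, 73.0, 66.0, 60.0]     # nucleus of known size
C_STANDARD_PG = 3.5                        # known C-value of the standard, pg

# Genome size scales with the integral fading relative to the standard.
c_sample = C_STANDARD_PG * integral_fading(sample) / integral_fading(standard)
```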

  10. Model of investment appraisal of high-rise construction with account of cost of land resources

    Science.gov (United States)

    Okolelova, Ella; Shibaeva, Marina; Trukhina, Natalya

    2018-03-01

    The article considers the problems and potential of high-rise construction as a global urbanization trend. The results of theoretical and practical studies on the appraisal of investments in high-rise construction are provided. High-rise construction has a number of apparent upsides under modern conditions of megalopolis development, and primarily it is economically efficient. Amid a serious shortage of construction sites, skyscrapers successfully address the need for manufacturing, office, and living premises. Nevertheless, there are plenty of issues related to high-rise construction, and only thorough scrutiny of them allows the real economic efficiency of this branch to be estimated. The article focuses on the question of the economic efficiency of high-rise construction. The suggested model allows adjusting the parameters of a facility under construction, setting the tone for market value, as well as the coefficient for appreciation of the construction net cost that depends on the number of storeys, in the form of a function or discrete values.

  11. Comparison of the COMRADEX-IV and AIRDOS-EPA methodologies for estimating the radiation dose to man from radionuclide releases to the atmosphere

    International Nuclear Information System (INIS)

    Miller, C.W.; Hoffman, F.O.; Dunning, D.E. Jr.

    1981-01-01

    This report presents a comparison between two computerized methodologies for estimating the radiation dose to man from radionuclide releases to the atmosphere. The COMRADEX-IV code was designed to provide a means of assessing potential radiological consequences from postulated power reactor accidents. The AIRDOS-EPA code was developed primarily to assess routine radionuclide releases from nuclear facilities. Although a number of different calculations are performed by these codes, three calculations are common to both: atmospheric dispersion, estimation of internal dose from inhalation, and estimation of external dose from immersion in air containing gamma-emitting radionuclides. The models used in these calculations were examined and found, in general, to be the same. Most differences in the doses calculated by the two codes are due to differences in values chosen for input parameters and not due to model differences. A sample problem is presented for illustration.

  12. Fate of water pumped from underground and contributions to sea-level rise

    Science.gov (United States)

    Wada, Yoshihide; Lo, Min-Hui; Yeh, Pat J.-F.; Reager, John T.; Famiglietti, James S.; Wu, Ren-Jie; Tseng, Yu-Heng

    2016-08-01

    The contributions from terrestrial water sources to sea-level rise, other than ice caps and glaciers, are highly uncertain and heavily debated. Recent assessments indicate that groundwater depletion (GWD) may become the most important positive terrestrial contribution over the next 50 years, probably equal in magnitude to the current contributions from glaciers and ice caps. However, the existing estimates assume that nearly 100% of groundwater extracted eventually ends up in the oceans. Owing to limited knowledge of the pathways and mechanisms governing the ultimate fate of pumped groundwater, the relative fraction of global GWD that contributes to sea-level rise remains unknown. Here, using a coupled climate-hydrological model simulation, we show that only 80% of GWD ends up in the ocean. An increase in runoff to the ocean accounts for roughly two-thirds, whereas the remainder results from the enhanced net flux of precipitation minus evaporation over the ocean, due to increased atmospheric vapour transport from the land to the ocean. The contribution of GWD to global sea-level rise amounted to 0.02 (+/-0.004) mm yr-1 in 1900 and increased to 0.27 (+/-0.04) mm yr-1 in 2000. This indicates that existing studies have substantially overestimated the contribution of GWD to global sea-level rise by a cumulative amount of at least 10 mm during the twentieth century and early twenty-first century. With other terrestrial water contributions included, we estimate the net terrestrial water contribution during the period 1993-2010 to be +0.12 (+/-0.04) mm yr-1, suggesting that the net terrestrial water contribution reported in the IPCC Fifth Assessment Report is probably overestimated by a factor of three.
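The headline figures can be checked with simple arithmetic. The 80% ocean fraction and the 1900 and 2000 rates come from the abstract; the linear interpolation between the two rates is an illustrative assumption, not the paper's model-based calculation.

```python
# Back-of-envelope check of the abstract's numbers (values from the text;
# the linear rise in the rate between 1900 and 2000 is an assumption).

ocean_fraction = 0.80    # share of GWD that ends up in the ocean
rate_1900 = 0.02         # mm/yr contribution of GWD to sea level, 1900
rate_2000 = 0.27         # mm/yr contribution, 2000

# Overestimation factor if 100% of pumped groundwater were assumed to
# reach the ocean instead of 80%.
print(round(1.0 / ocean_fraction, 3))   # -> 1.25

# Cumulative 20th-century contribution at the 80% fraction, assuming a
# linear rise in the rate over the 100 years.
cumulative_mm = 0.5 * (rate_1900 + rate_2000) * 100
print(round(cumulative_mm, 1))          # -> 14.5
```

The paper's "at least 10 mm" overestimate also folds in the early twenty-first century and the model's non-linear trajectory, so this linear sketch is indicative only.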

  13. Two-stage commercial evaluation of engineering systems production projects for high-rise buildings

    Science.gov (United States)

    Bril, Aleksander; Kalinina, Olga; Levina, Anastasia

    2018-03-01

    The paper addresses the current and much-debated problem of the methodology for selecting effective innovative enterprises for venture financing. A two-stage system of commercial innovation evaluation based on the UNIDO methodology is proposed. Engineering systems account for 25 to 40% of the cost of high-rise residential buildings, and this proportion increases with the use of new construction technologies. Analysis of the construction market in Russia showed that the production of internal engineering system elements based on innovative technologies is on a growth trend. The production of simple elements is organized in small enterprises on the basis of new technologies. The most attractive development route is venture financing of small innovative businesses. To improve the efficiency of these operations, the paper proposes a methodology for a two-stage evaluation of small business development projects. Such a two-stage system of commercial evaluation of innovative projects creates an information base for informed and coordinated decision-making on venture financing of enterprises that produce engineering system elements for the construction business.

  14. Two-stage commercial evaluation of engineering systems production projects for high-rise buildings

    Directory of Open Access Journals (Sweden)

    Bril Aleksander

    2018-01-01

    Full Text Available The paper addresses the current and much-debated problem of the methodology for selecting effective innovative enterprises for venture financing. A two-stage system of commercial innovation evaluation based on the UNIDO methodology is proposed. Engineering systems account for 25 to 40% of the cost of high-rise residential buildings, and this proportion increases with the use of new construction technologies. Analysis of the construction market in Russia showed that the production of internal engineering system elements based on innovative technologies is on a growth trend. The production of simple elements is organized in small enterprises on the basis of new technologies. The most attractive development route is venture financing of small innovative businesses. To improve the efficiency of these operations, the paper proposes a methodology for a two-stage evaluation of small business development projects. Such a two-stage system of commercial evaluation of innovative projects creates an information base for informed and coordinated decision-making on venture financing of enterprises that produce engineering system elements for the construction business.

  15. Preliminary methodological proposal for estimating environmental flows in projects approved by the ministry of environment and sustainable development (MADS), Colombia

    International Nuclear Information System (INIS)

    Pinilla Agudelo, Gabriel A; Rodriguez Sandoval, Erasmo A; Camacho Botero, Luis A

    2014-01-01

    A methodological proposal for estimating environmental flows in large projects approved by the Agencia Nacional de Licencias Ambientales (ANLA) in Colombian rivers was developed. The project is the result of an agreement between the MADS and the Universidad Nacional de Colombia, Bogota (UNC). The proposed method begins with an evaluation of hydrological criteria, continues with a hydraulic and water quality validation, and concludes with the determination of habitat integrity. This is an iterative process that compares conditions before and after project construction and yields the magnitude of a monthly flow that, besides preserving the ecological functions of the river, guarantees the water uses downstream. Regarding the biotic component, the proposal includes the establishment and monitoring of biotic integrity indices for four aquatic communities (periphyton, macroinvertebrates, riparian vegetation, and fish). The effects that flow reduction may produce in the medium and long term can be assessed with these indices. We present the results of applying the methodology to several projects licensed by the MADS.

  16. Performance-based methodology for assessing seismic vulnerability and capacity of buildings

    Science.gov (United States)

    Shibin, Lin; Lili, Xie; Maosheng, Gong; Ming, Li

    2010-06-01

    This paper presents a performance-based methodology for the assessment of seismic vulnerability and capacity of buildings. The vulnerability assessment methodology is based on the HAZUS methodology and the improved capacity-demand-diagram method. The spectral displacement (Sd) of performance points on a capacity curve is used to estimate the damage level of a building. The relationship between Sd and peak ground acceleration (PGA) is established, and then a new vulnerability function is expressed in terms of PGA. Furthermore, the expected value of the seismic capacity index (SCev) is provided to estimate the seismic capacity of buildings based on the probability distribution of damage levels and the corresponding seismic capacity index. The results indicate that the proposed vulnerability methodology is able to assess seismic damage of a large building stock directly and quickly following an earthquake. The SCev provides an effective index to measure the seismic capacity of buildings and illustrate the relationship between the seismic capacity of buildings and seismic action. The estimated result is compared with damage surveys of the cities of Dujiangyan and Jiangyou in the M8.0 Wenchuan earthquake, revealing that the methodology is acceptable for seismic risk assessment and decision making. The primary reasons for discrepancies between the estimated results and the damage surveys are discussed.
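The SCev construction is, at its core, a probability-weighted average of a capacity index over damage levels. A minimal sketch, in which the damage-state labels, probabilities, and index values are all hypothetical:

```python
# Minimal sketch of the expected seismic capacity index (SCev):
# the expectation of a per-damage-level capacity index under the
# estimated damage-level probability distribution. All numbers below
# are hypothetical illustrations, not values from the paper.

def expected_capacity_index(probs, indices):
    """SCev as the expectation of the capacity index over damage levels."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(p * c for p, c in zip(probs, indices))

# Hypothetical distribution over {none, slight, moderate, extensive, complete}.
probs   = [0.30, 0.35, 0.20, 0.10, 0.05]
indices = [1.00, 0.80, 0.55, 0.30, 0.10]
print(round(expected_capacity_index(probs, indices), 3))  # -> 0.725
```

A building stock whose damage distribution shifts toward the severe states under stronger shaking would see this expectation fall, which is the sense in which SCev links capacity to seismic action.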

  17. Procedure for estimating permanent total enclosure costs

    Energy Technology Data Exchange (ETDEWEB)

    Lukey, M.E.; Prasad, C.; Toothman, D.A.; Kaplan, N.

    1999-07-01

    Industries that use add-on control devices must adequately capture emissions before delivering them to the control device. One way to capture emissions is to use permanent total enclosures (PTEs). By definition, an enclosure which meets the US Environmental Protection Agency's five-point criteria is a PTE and has a capture efficiency of 100%. Since costs play an important role in regulatory development, in selection of control equipment, and in control technology evaluations for permitting purposes, EPA has developed a Control Cost Manual for estimating costs of various items of control equipment. EPA's Manual does not contain any methodology for estimating PTE costs. In order to assist environmental regulators and potential users of PTEs, a methodology for estimating PTE costs was developed under contract with EPA, by Pacific Environmental Services, Inc. (PES) and is the subject of this paper. The methodology for estimating PTE costs follows the approach used for other control devices in the Manual. It includes procedures for sizing various components of a PTE and for estimating capital as well as annual costs. It contains verification procedures for demonstrating compliance with EPA's five-point criteria. In addition, procedures are included to determine compliance with Occupational Safety and Health Administration (OSHA) standards. Meeting these standards is an important factor in properly designing PTEs. The methodology is encoded in Microsoft Excel spreadsheets to facilitate cost estimation and PTE verification. Examples are given throughout the methodology development and in the spreadsheets to illustrate the PTE design, verification, and cost estimation procedures.
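The Manual's general approach of reporting capital cost plus an annualized total can be illustrated with the capital recovery factor used to spread capital over equipment life. This is a generic sketch, not the PTE-specific sizing procedure; the interest rate, lifetime, and costs below are hypothetical.

```python
# Illustrative annualization of a capital cost using the capital
# recovery factor (CRF), the device-independent step of control-cost
# estimation. All dollar figures, the 7% rate, and the 10-year life
# are hypothetical.

def capital_recovery_factor(i, n):
    """CRF = i(1+i)^n / ((1+i)^n - 1) for interest rate i over n years."""
    return i * (1 + i) ** n / ((1 + i) ** n - 1)

capital_cost = 120_000.0   # total capital investment, $ (hypothetical)
om_cost      = 15_000.0    # annual operating & maintenance, $/yr (hypothetical)

crf = capital_recovery_factor(0.07, 10)
total_annual_cost = capital_cost * crf + om_cost
print(round(crf, 4))               # -> 0.1424
print(round(total_annual_cost, 2)) # -> 32085.3
```

The PTE methodology adds enclosure-specific sizing (wall area, ventilation) on top of this annualization step.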

  18. Procedure for estimating permanent total enclosure costs

    Energy Technology Data Exchange (ETDEWEB)

    Lukey, M E; Prasad, C; Toothman, D A; Kaplan, N

    1999-07-01

    Industries that use add-on control devices must adequately capture emissions before delivering them to the control device. One way to capture emissions is to use permanent total enclosures (PTEs). By definition, an enclosure which meets the US Environmental Protection Agency's five-point criteria is a PTE and has a capture efficiency of 100%. Since costs play an important role in regulatory development, in selection of control equipment, and in control technology evaluations for permitting purposes, EPA has developed a Control Cost Manual for estimating costs of various items of control equipment. EPA's Manual does not contain any methodology for estimating PTE costs. In order to assist environmental regulators and potential users of PTEs, a methodology for estimating PTE costs was developed under contract with EPA, by Pacific Environmental Services, Inc. (PES) and is the subject of this paper. The methodology for estimating PTE costs follows the approach used for other control devices in the Manual. It includes procedures for sizing various components of a PTE and for estimating capital as well as annual costs. It contains verification procedures for demonstrating compliance with EPA's five-point criteria. In addition, procedures are included to determine compliance with Occupational Safety and Health Administration (OSHA) standards. Meeting these standards is an important factor in properly designing PTEs. The methodology is encoded in Microsoft Excel spreadsheets to facilitate cost estimation and PTE verification. Examples are given throughout the methodology development and in the spreadsheets to illustrate the PTE design, verification, and cost estimation procedures.

  19. Rising U.S. Earnings Inequality and Family Labor Supply: The Covariance Structure of Intrafamily Earnings

    OpenAIRE

    Dean R. Hyslop

    2001-01-01

    This paper studies the labor supply contributions to individual and family earnings inequality during the period of rising wage inequality in the early 1980's. Working couples have positively correlated labor market outcomes, which are almost entirely attributable to permanent factors. An intertemporal family labor supply model with this feature is used to estimate labor supply elasticities for husbands of 0.05, and wives of 0.40. This implies that labor supply explains little of the rising a...

  20. The Impact of Sea Level Rise on Developing Countries: A Comparative Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Dasgupta, S. [World Bank, Washington, DC (United States)

    2008-07-01

    Sea-level rise (SLR) due to climate change is a serious global threat: The scientific evidence is now overwhelming. In this paper, Geographic Information System software has been used to overlay the best available, spatially-disaggregated global data on land, population, agriculture, urban extent, wetlands, and GDP, to assess the consequences of continued SLR for 84 coastal developing countries. Estimates suggest that even a one-meter rise in sea level in coastal countries of the developing world would submerge 194,000 square kilometers of land area, and turn at least 56 million people into environmental refugees. At the country level results are extremely skewed.
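At its core, the GIS overlay reduces to thresholding gridded elevations against an SLR scenario and tallying the exposed area. A toy version with a hypothetical elevation grid and cell size (the paper works with real spatially-disaggregated global data):

```python
# Toy version of the inundation overlay: count grid cells whose
# elevation falls at or below a sea-level-rise threshold. The grid,
# the 1 m scenario, and the cell area are illustrative only.

def inundated_area(elevations_m, slr_m, cell_area_km2):
    """Area (km^2) of cells at or below the sea-level-rise threshold."""
    flooded = sum(1 for row in elevations_m for z in row if z <= slr_m)
    return flooded * cell_area_km2

grid = [
    [0.2, 0.8, 1.5],
    [0.5, 1.2, 3.0],
    [2.1, 0.9, 0.4],
]
print(inundated_area(grid, slr_m=1.0, cell_area_km2=1.0))  # -> 5.0
```

The study's population and GDP exposure figures follow the same pattern, overlaying the flooded mask on the corresponding socioeconomic layers.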

  1. The Impact of Sea Level Rise on Developing Countries: A Comparative Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Dasgupta, Susmita (World Bank, Washington, DC (United States))

    2008-07-01

    Sea-level rise (SLR) due to climate change is a serious global threat: The scientific evidence is now overwhelming. In this paper, Geographic Information System software has been used to overlay the best available, spatially-disaggregated global data on land, population, agriculture, urban extent, wetlands, and GDP, to assess the consequences of continued SLR for 84 coastal developing countries. Estimates suggest that even a one-meter rise in sea level in coastal countries of the developing world would submerge 194,000 square kilometers of land area, and turn at least 56 million people into environmental refugees. At the country level results are extremely skewed.

  2. A geospatial dataset for U.S. hurricane storm surge and sea-level rise vulnerability: Development and case study applications

    Directory of Open Access Journals (Sweden)

    Megan C. Maloney

    2014-01-01

    Full Text Available The consequences of future sea-level rise for coastal communities are a priority concern arising from anthropogenic climate change. Here, previously published methods are scaled up in order to undertake a first pass assessment of exposure to hurricane storm surge and sea-level rise for the U.S. Gulf of Mexico and Atlantic coasts. Sea-level rise scenarios ranging from +0.50 to +0.82 m by 2100 increased estimates of the area exposed to inundation by 4–13% and 7–20%, respectively, among different Saffir-Simpson hurricane intensity categories. Potential applications of these hazard layers for vulnerability assessment are demonstrated with two contrasting case studies: potential exposure of current energy infrastructure in the U.S. Southeast and exposure of current and future housing along both the Gulf and Atlantic Coasts. Estimates of the number of Southeast electricity generation facilities potentially exposed to hurricane storm surge ranged from 69 to 291 for category 1 and category 5 storms, respectively. Sea-level rise increased the number of exposed facilities by 6–60%, depending on the sea-level rise scenario and the intensity of the hurricane under consideration. Meanwhile, estimates of the number of housing units currently exposed to hurricane storm surge ranged from 4.1 to 9.4 million for category 1 and category 4 storms, respectively, while exposure for category 5 storms was estimated at 7.1 million due to the absence of landfalling category 5 hurricanes in the New England region. Housing exposure was projected to increase 83–230% by 2100 among different sea-level rise and housing scenarios, with the majority of this increase attributed to future housing development. These case studies highlight the utility of geospatial hazard information for national-scale coastal exposure or vulnerability assessment as well as the importance of future socioeconomic development in the assessment of coastal vulnerability.

  3. Principles for the formation of an effective concept of multifunctional high-rise construction investment projects

    Directory of Open Access Journals (Sweden)

    Beliakov Sergei

    2018-01-01

    Full Text Available Investment projects of high-rise construction have a number of features that determine specific risks and additional opportunities that require analysis and accounting in the formation of an effective project concept. The most significant features of high-rise construction include long construction time, complexity of technical and technological solutions, complexity of decisions on the organization of construction and operation, high cost of construction and operation, complexity in determining the ratio of areas designed to accommodate different functions, and complexity in organizing and coordinating the operation of a facility with internal zoning. Taking into account the specificity of high-rise construction, among the factors determining the effectiveness of projects it is advisable to consider organizational, technological and investment factors as key. Within the framework of the article, the author singled out key particular functions for each group of factors under consideration, and also developed a system of principles for the formation of an effective concept of multifunctional high-rise construction investment projects, including the principle of logistic efficiency, the principle of optimal functional zoning, the principle of efficiency of equipment use, the principle of optimizing technological processes, the principle of income maximization, the principle of fund management, and the principle of risk management. The model of formation of an effective concept of investment projects of multifunctional high-rise construction developed by the author can contribute to the development of methodological tools in the field of managing the implementation of high-rise construction projects, taking into account their specificity in the current economic conditions.

  4. Principles for the formation of an effective concept of multifunctional high-rise construction investment projects

    Science.gov (United States)

    Beliakov, Sergei

    2018-03-01

    Investment projects of high-rise construction have a number of features that determine specific risks and additional opportunities that require analysis and accounting in the formation of an effective project concept. The most significant features of high-rise construction include long construction time, complexity of technical and technological solutions, complexity of decisions on the organization of construction and operation, high cost of construction and operation, complexity in determining the ratio of areas designed to accommodate different functions, and complexity in organizing and coordinating the operation of a facility with internal zoning. Taking into account the specificity of high-rise construction, among the factors determining the effectiveness of projects it is advisable to consider organizational, technological and investment factors as key. Within the framework of the article, the author singled out key particular functions for each group of factors under consideration, and also developed a system of principles for the formation of an effective concept of multifunctional high-rise construction investment projects, including the principle of logistic efficiency, the principle of optimal functional zoning, the principle of efficiency of equipment use, the principle of optimizing technological processes, the principle of income maximization, the principle of fund management, and the principle of risk management. The model of formation of an effective concept of investment projects of multifunctional high-rise construction developed by the author can contribute to the development of methodological tools in the field of managing the implementation of high-rise construction projects, taking into account their specificity in the current economic conditions.

  5. Study on methodology to estimate isotope generation and depletion for core design of HTGR

    International Nuclear Information System (INIS)

    Fukaya, Yuji; Ueta, Shohei; Goto, Minoru; Shimakawa, Satoshi

    2013-12-01

    An investigation of the methodology for estimating isotope generation and depletion was performed in order to improve the accuracy of HTGR core design. The technical problem of isotope generation and depletion can be divided into three major parts: solving the burn-up equations, generating effective cross sections, and employing nuclide data. For the generation of effective cross sections in particular, the core burn-up calculation shares a technological problem with the point burn-up calculation. Thus, the core burn-up calculation was also investigated with a view to developing a new code system in the future. As a result, it was found that a cross-section structure of 108 energy groups, extended from the SRAC 107-group structure up to 20 MeV, together with cross-section collapse using the flux obtained by the deterministic code SRAC, is appropriate. In addition, an investigation of the preparation conditions for nuclear data for safety analysis and fuel design clarified the corresponding nuclear data needs. (author)
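The first of the three parts, solving the burn-up equations, can be illustrated with the analytic Bateman solution for a two-nuclide chain (parent A removed at an effective rate, daughter B decaying). The rates below are hypothetical; production codes instead use effective one-group cross sections collapsed with the computed flux, as the abstract discusses.

```python
# Minimal depletion-chain sketch: A -> B -> (removed), solved with the
# analytic Bateman formula. The removal/decay rates are hypothetical.
import math

lam_a = 1e-3   # effective removal rate of A (1/s), hypothetical
lam_b = 5e-4   # decay rate of B (1/s), hypothetical

def bateman_two(n0, lam_a, lam_b, t):
    """Number densities of A and B at time t, starting from pure A."""
    na = n0 * math.exp(-lam_a * t)
    nb = n0 * lam_a / (lam_b - lam_a) * (math.exp(-lam_a * t) - math.exp(-lam_b * t))
    return na, nb

print(tuple(round(x, 4) for x in bateman_two(1.0, lam_a, lam_b, 1000.0)))
# -> (0.3679, 0.4773)
```

A core burn-up code repeats this kind of solve nuclide-by-nuclide, region-by-region, re-collapsing the cross sections as the flux spectrum evolves.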

  6. Uncertainty in a monthly water balance model using the generalized likelihood uncertainty estimation methodology

    Science.gov (United States)

    Rivera, Diego; Rivas, Yessica; Godoy, Alex

    2015-02-01

    Hydrological models are simplified representations of natural processes and subject to errors. Uncertainty bounds are a commonly used way to assess the impact of input or model-architecture uncertainty on model outputs. Different sets of parameters can have equally robust goodness-of-fit indicators, which is known as equifinality. We assessed the outputs of a lumped conceptual hydrological model applied to an agricultural watershed in central Chile under strong interannual variability (coefficient of variation of 25%) by using the equifinality concept and uncertainty bounds. The simulation period ran from January 1999 to December 2006. Equifinality and uncertainty bounds from the GLUE (Generalized Likelihood Uncertainty Estimation) methodology were used to identify parameter sets as potential representations of the system. The aim of this paper is to exploit the use of uncertainty bounds to differentiate behavioural parameter sets in a simple hydrological model. We then analyze the presence of equifinality in order to improve the identification of relevant hydrological processes. The water balance model for the Chillan River exhibits, at a first stage, equifinality. However, it was possible to narrow the parameter ranges and eventually identify a set of parameters representing the behaviour of the watershed (a behavioural model) in agreement with observational and soft data (calculation of areal precipitation over the watershed using an isohyetal map). The mean width of the uncertainty bound around the predicted runoff for the simulation period decreased from 50 to 20 m3 s-1 after fixing the parameter controlling the areal precipitation over the watershed. This decrement is equivalent to decreasing the ratio between simulated and observed discharge from 5.2 to 2.5. Despite criticisms of the GLUE methodology, such as its lack of statistical formality, it proved a useful tool for assisting the modeller in identifying critical parameters.
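The GLUE workflow described above (sample parameter sets, score each with an informal likelihood, retain the "behavioural" sets above a threshold, and form uncertainty bounds from their simulations) can be sketched with a toy one-parameter model; the model, observations, likelihood, and threshold are all hypothetical.

```python
# Bare-bones GLUE sketch with a toy one-parameter linear "model".
# Observations, likelihood form, and the 0.5 threshold are hypothetical.
import random

random.seed(1)
obs = [2.0, 3.0, 4.0, 5.0]

def model(k):
    return [k * x for x in [1.0, 1.5, 2.0, 2.5]]   # toy model, true k = 2

def likelihood(sim):
    sse = sum((s - o) ** 2 for s, o in zip(sim, obs))
    return 1.0 / (1.0 + sse)                        # informal GLUE likelihood

# Monte Carlo sampling of the parameter, then the behavioural subset.
samples = [random.uniform(1.0, 3.0) for _ in range(1000)]
behavioural = [k for k in samples if likelihood(model(k)) > 0.5]

# Uncertainty bounds: envelope of behavioural simulations per time step.
sims = [model(k) for k in behavioural]
lower = [min(s[i] for s in sims) for i in range(len(obs))]
upper = [max(s[i] for s in sims) for i in range(len(obs))]
print(len(behavioural) > 0, all(l <= u for l, u in zip(lower, upper)))  # -> True True
```

Narrowing a parameter range, as done for the areal-precipitation parameter in the study, amounts to shrinking the behavioural set and hence the width of these bounds.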

  7. An alternative method to record rising temperatures during dental implant site preparation: a preliminary study using bovine bone

    Directory of Open Access Journals (Sweden)

    Domenica Laurito

    2010-12-01

    Full Text Available Overheating is constantly mentioned as a risk factor for bone necrosis that could compromise dental implant primary stability. Uncontrolled thermal injury can result in fibrous tissue interpositioned at the implant-bone interface, compromising the long-term prognosis. The methods used to record temperature rise include either direct recording by thermocouple instruments or indirect estimation by infrared thermography. This preliminary study was carried out using bovine bone, and a different method for estimating temperature rise is presented. Two different types of drills were tested using a fluoroptic thermometer, and the effectiveness of this alternative temperature recording method was evaluated.

  8. Food availability and the rising obesity prevalence in Malaysia

    OpenAIRE

    Geok-Lin Khor

    2012-01-01

    It is estimated that more than 1.1 billion adults and 115 million children worldwide are overweight. In Malaysia, the second and third National Health and Morbidity Surveys in 1996 and 2006 respectively reported a three-fold increase in obesity prevalence among adults, surging from 4.4% to 14% over the 10-year period. Evidence of rising childhood obesity has also emerged. The aim of this article is to gather evidence from food availability data for an insight into population shifts in dietary patterns...

  9. Development of Numerical Estimation in Young Children

    Science.gov (United States)

    Siegler, Robert S.; Booth, Julie L.

    2004-01-01

    Two experiments examined kindergartners', first graders', and second graders' numerical estimation, the internal representations that gave rise to the estimates, and the general hypothesis that developmental sequences within a domain tend to repeat themselves in new contexts. Development of estimation in this age range on 0-to-100 number lines…

  10. Methodology and data used for estimating the complex-wide impacts of alternative environmental restoration clean-up goals

    International Nuclear Information System (INIS)

    Shay, M.R.; Short, S.M.; Stiles, D.L.

    1994-03-01

    This paper describes the methodologies and data used for estimating the complex-wide impacts of alternative strategies for conducting remediation of all DOE sites and facilities, but does not address issues relating to Waste Management capabilities. Clean-up strategies and their corresponding goals for contaminated media may be driven by concentration-based regulatory standards, land-use standards (e.g., residential, industrial, wild life reserve, or totally restricted), risk-based standards, or other standards determined through stakeholder input. Strategies implemented to achieve these goals usually require the deployment of (a) clean-up technologies to destroy, remove, or contain the contaminants of concern; (b) institutional controls to prevent potential receptors from coming into contact with the contaminants; or (c) a combination of the above

  11. Interfacing system LOCA risk assessment: Methodology and application

    International Nuclear Information System (INIS)

    Galyean, W.J.; Schroeher, J.A.; Hanson, D.J.

    1991-01-01

    The United States Nuclear Regulatory Commission (NRC) is sponsoring a research program to develop an improved understanding of the human factors, hardware, and accident consequence issues that dominate the risk from an Interfacing Systems Loss-of-Coolant Accident (ISLOCA) at a nuclear power plant. To accomplish this program, a methodology has been developed for estimating the core damage frequency and risk associated with an ISLOCA. The steps of the methodology are described, with emphasis on one step which is unique: estimation of the probability of rupture of the low pressure systems. A trial application of the methodology was made for a Pressurized Water Reactor (PWR). The results are believed to be plant specific and indicate that human errors during startup and shutdown could be significant contributors to ISLOCA risk at the plant evaluated. 10 refs

  12. Population dynamics of Hawaiian seabird colonies vulnerable to sea-level rise.

    Science.gov (United States)

    Hatfield, Jeff S; Reynolds, Michelle H; Seavy, Nathaniel E; Krause, Crystal M

    2012-08-01

    Globally, seabirds are vulnerable to anthropogenic threats both at sea and on land. Seabirds typically nest colonially and show strong fidelity to natal colonies, and such colonies on low-lying islands may be threatened by sea-level rise. We used French Frigate Shoals, the largest atoll in the Hawaiian Archipelago, as a case study to explore the population dynamics of seabird colonies and the potential effects sea-level rise may have on these rookeries. We compiled historic observations, a 30-year time series of seabird population abundance, lidar-derived elevations, and aerial imagery of all the islands of French Frigate Shoals. To estimate the population dynamics of 8 species of breeding seabirds on Tern Island from 1980 to 2009, we used a Gompertz model with a Bayesian approach to infer population growth rates, density dependence, process variation, and observation error. All species increased in abundance, in a pattern that provided evidence of density dependence. Great Frigatebirds (Fregata minor), Masked Boobies (Sula dactylatra), Red-tailed Tropicbirds (Phaethon rubricauda), Spectacled Terns (Onychoprion lunatus), and White Terns (Gygis alba) are likely at carrying capacity. Density dependence may exacerbate the effects of sea-level rise on seabirds because populations near carrying capacity on an island will be more negatively affected than populations with room for growth. We projected 12% of French Frigate Shoals will be inundated if sea level rises 1 m and 28% if sea level rises 2 m. Spectacled Terns and shrub-nesting species are especially vulnerable to sea-level rise, but seawalls and habitat restoration may mitigate the effects of sea-level rise. Losses of seabird nesting habitat may be substantial in the Hawaiian Islands by 2100 if sea levels rise 2 m. Restoration of higher-elevation seabird colonies represents a more enduring conservation solution for Pacific seabirds.
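The stochastic Gompertz model used for these colony time series operates on log-abundance, with an autoregressive coefficient below 1 encoding density dependence. A simulation sketch, with hypothetical parameter values standing in for the Bayesian estimates:

```python
# Sketch of the stochastic Gompertz population model:
# x[t+1] = a + b * x[t] + noise, where x is log-abundance and b < 1
# implies density dependence. Parameters and noise SD are hypothetical.
import math, random

random.seed(42)
a, b, sigma = 0.5, 0.9, 0.05     # growth, density dependence, process SD

x = [math.log(50.0)]             # initial log-abundance
for _ in range(30):              # 30-year trajectory, as in the study period
    x.append(a + b * x[-1] + random.gauss(0.0, sigma))

# Equilibrium (carrying capacity) implied by a and b: exp(a / (1 - b)).
equilibrium = math.exp(a / (1.0 - b))
print(round(equilibrium, 1))     # -> 148.4
```

Populations whose trajectories hover near this equilibrium are the ones the study flags as "at carrying capacity", and hence most sensitive to habitat loss from inundation.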

  13. Reliability evaluation methodologies for ensuring container integrity of stored transuranic (TRU) waste

    International Nuclear Information System (INIS)

    Smith, K.L.

    1995-06-01

    This report provides methodologies for providing defensible estimates of expected transuranic waste storage container lifetimes at the Radioactive Waste Management Complex. These methodologies can be used to estimate transuranic waste container reliability (for integrity and degradation) and as an analytical tool to optimize waste container integrity. Container packaging and storage configurations, which directly affect waste container integrity, are also addressed. The methodologies presented provide a means for demonstrating Resource Conservation and Recovery Act waste storage requirements

  14. Estimating Coastal Digital Elevation Model (DEM) Uncertainty

    Science.gov (United States)

    Amante, C.; Mesick, S.

    2017-12-01

    Integrated bathymetric-topographic digital elevation models (DEMs) are representations of the Earth's solid surface and are fundamental to the modeling of coastal processes, including tsunami, storm surge, and sea-level rise inundation. Deviations in elevation values from the actual seabed or land surface constitute errors in DEMs, which originate from numerous sources, including: (i) the source elevation measurements (e.g., multibeam sonar, lidar), (ii) the interpolative gridding technique (e.g., spline, kriging) used to estimate elevations in areas unconstrained by source measurements, and (iii) the datum transformation used to convert bathymetric and topographic data to common vertical reference systems. The magnitude and spatial distribution of the errors from these sources are typically unknown, and the lack of knowledge regarding these errors represents the vertical uncertainty in the DEM. The National Oceanic and Atmospheric Administration (NOAA) National Centers for Environmental Information (NCEI) has developed DEMs for more than 200 coastal communities. This study presents a methodology developed at NOAA NCEI to derive accompanying uncertainty surfaces that estimate DEM errors at the individual cell-level. The development of high-resolution (1/9th arc-second), integrated bathymetric-topographic DEMs along the southwest coast of Florida serves as the case study for deriving uncertainty surfaces. The estimated uncertainty can then be propagated into the modeling of coastal processes that utilize DEMs. Incorporating the uncertainty produces more reliable modeling results, and in turn, better-informed coastal management decisions.
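If the three error sources listed above are treated as independent, a per-cell standard uncertainty can be formed by adding them in quadrature. This is a generic sketch of that idea, not NCEI's actual uncertainty algorithm:

```python
import numpy as np

def combined_uncertainty(sigma_meas, sigma_interp, sigma_datum):
    """Per-cell DEM standard uncertainty (same units as the inputs),
    assuming the measurement, interpolation, and datum-transformation
    errors are independent and combine in quadrature."""
    return np.sqrt(np.asarray(sigma_meas) ** 2 +
                   np.asarray(sigma_interp) ** 2 +
                   np.asarray(sigma_datum) ** 2)
```

Because the inputs can be arrays, the same call produces a full uncertainty surface when given per-cell grids for each error source.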

  15. A Comprehensive Methodology for Development, Parameter Estimation, and Uncertainty Analysis of Group Contribution Based Property Models - An Application to the Heat of Combustion

    DEFF Research Database (Denmark)

    Frutiger, Jerome; Marcarie, Camille; Abildskov, Jens

    2016-01-01

    A rigorous methodology is developed that addresses numerical and statistical issues when developing group contribution (GC) based property models, such as regression methods, optimization algorithms, performance statistics, outlier treatment, parameter identifiability, and uncertainty of the prediction. The methodology is evaluated through development of a GC method for the prediction of the heat of combustion (ΔHco) for pure components. The results showed that robust regression leads to the best performance statistics for parameter estimation. The bootstrap method is found to be a valid alternative… Due to identifiability issues, reporting of the 95% confidence intervals of the predicted property values should be mandatory, as opposed to reporting only single-value predictions, currently the norm in the literature. Moreover, inclusion of higher-order groups (additional parameters) does not always lead to improved…

  16. A Survey of Cost Estimating Methodologies for Distributed Spacecraft Missions

    Science.gov (United States)

    Foreman, Veronica L.; Le Moigne, Jacqueline; de Weck, Oliver

    2016-01-01

    Satellite constellations present unique capabilities and opportunities to Earth orbiting and near-Earth scientific and communications missions, but also present new challenges to cost estimators. An effective and adaptive cost model is essential to successful mission design and implementation, and as Distributed Spacecraft Missions (DSM) become more common, cost estimating tools must become more representative of these types of designs. Existing cost models often focus on a single spacecraft and require extensive design knowledge to produce high fidelity estimates. Previous research has examined the limitations of existing cost practices as they pertain to the early stages of mission formulation, for both individual satellites and small satellite constellations. Recommendations have been made for how to improve the cost models for individual satellites one at a time, but much of the complexity in constellation and DSM cost modeling arises from constellation systems level considerations that have not yet been examined. This paper constitutes a survey of the current state-of-the-art in cost estimating techniques, with recommendations for improvements to increase the fidelity of future constellation cost estimates. To enable our investigation, we have developed a cost estimating tool for constellation missions. The development of this tool has revealed three high-priority shortcomings within existing parametric cost estimating capabilities as they pertain to DSM architectures: design iteration, integration and test, and mission operations. Within this paper we offer illustrative examples of these discrepancies and make preliminary recommendations for addressing them. DSM and satellite constellation missions are shifting the paradigm of space-based remote sensing, showing promise in the realms of Earth science, planetary observation, and various heliophysical applications. To fully reap the benefits of DSM technology, accurate and relevant cost estimating capabilities are needed.

  17. Plume rise from multiple sources

    International Nuclear Information System (INIS)

    Briggs, G.A.

    1975-01-01

    A simple enhancement factor for plume rise from multiple sources is proposed and tested against plume-rise observations. For bent-over buoyant plumes, this results in the recommendation that multiple-source rise be calculated as [(N + S)/(1 + S)]^(1/3) times the single-source rise, Δh_1, where N is the number of sources and S = 6 (total width of source configuration/(N^(1/3) Δh_1))^(3/2). For calm conditions a crude but simple method is suggested for predicting the height of plume merger and subsequent behavior, based on the geometry and velocity variations of a single buoyant plume. Finally, it is suggested that large clusters of buoyant sources might occasionally give rise to concentrated vortices, either within the source configuration or just downwind of it.
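The recommended enhancement factor translates directly into code; the function below is a plain transcription of the two formulas in the abstract, with the same symbols:

```python
def multiple_source_rise(n_sources, total_width, dh1):
    """Briggs enhancement for bent-over buoyant plumes:
    delta_h_N = [(N + S) / (1 + S)]^(1/3) * delta_h_1,
    with S = 6 * (total_width / (N^(1/3) * delta_h_1))^(3/2)."""
    s = 6.0 * (total_width / (n_sources ** (1.0 / 3.0) * dh1)) ** 1.5
    return ((n_sources + s) / (1.0 + s)) ** (1.0 / 3.0) * dh1
```

The limits behave as expected: a single source (N = 1) recovers Δh_1 exactly, and a vanishingly compact cluster (width → 0, so S → 0) gives the full N^(1/3) enhancement.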

  18. Finite difference modelling of the temperature rise in non-linear medical ultrasound fields.

    Science.gov (United States)

    Divall, S A; Humphrey, V F

    2000-03-01

    Non-linear propagation of ultrasound can lead to increased heat generation in medical diagnostic imaging due to the preferential absorption of harmonics of the original frequency. A numerical model has been developed and tested that is capable of predicting the temperature rise due to a high amplitude ultrasound field. The acoustic field is modelled using a numerical solution to the Khokhlov-Zabolotskaya-Kuznetsov (KZK) equation, known as the Bergen Code, which is implemented in cylindrical symmetric form. A finite difference representation of the thermal equations is used to calculate the resulting temperature rises. The model allows for the inclusion of a number of layers of tissue with different acoustic and thermal properties and accounts for the effects of non-linear propagation, direct heating by the transducer, thermal diffusion and perfusion in different tissues. The effect of temperature-dependent skin perfusion and variation in background temperature between the skin and deeper layers of the body are included. The model has been tested against analytic solutions for simple configurations and then used to estimate temperature rises in realistic obstetric situations. A pulsed 3 MHz transducer operating with an average acoustic power of 200 mW leads to a maximum steady state temperature rise inside the foetus of 1.25 degrees C compared with a 0.6 degree C rise for the same transmitted power under linear propagation conditions. The largest temperature rise occurs at the skin surface, with the temperature rise at the foetus limited to less than 2 degrees C for the range of conditions considered.
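A minimal explicit finite-difference scheme for a bioheat-type equation conveys the flavor of the thermal half of such a model. This 1-D sketch is far simpler than the cylindrical, multi-layer model described above, and every material constant in it is invented for illustration, but it shows the diffusion, perfusion, and heating terms:

```python
import numpy as np

def temperature_rise(q, k=0.5, rho_c=4.0e6, w_b=2.0e3,
                     dx=1e-3, dt=0.05, steps=2000):
    """Explicit 1-D finite differences for a bioheat-type equation:
    rho*c * dT/dt = k * d2T/dx2 - w_b * T + q,
    where T is the rise above baseline temperature, q is the volumetric
    heating (W/m^3), and w_b lumps perfusion (W m^-3 K^-1).
    Zero-rise (Dirichlet) boundaries."""
    t = np.zeros_like(q, dtype=float)
    alpha = k * dt / (rho_c * dx ** 2)
    assert alpha <= 0.5, "explicit scheme stability limit violated"
    for _ in range(steps):
        lap = np.zeros_like(t)
        lap[1:-1] = t[2:] - 2.0 * t[1:-1] + t[:-2]   # discrete Laplacian
        t = t + alpha * lap + dt * (q - w_b * t) / rho_c
        t[0] = t[-1] = 0.0                            # boundary condition
    return t
```

Perfusion caps the steady-state rise at q/w_b, which is why perfused tissue heats less than the same absorbed power would suggest.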

  19. A new time-series methodology for estimating relationships between elderly frailty, remaining life expectancy, and ambient air quality.

    Science.gov (United States)

    Murray, Christian J; Lipfert, Frederick W

    2012-01-01

    Many publications estimate short-term air pollution-mortality risks, but few estimate the associated changes in life expectancy. We present a new methodology for analyzing time series of health effects, in which prior frailty is assumed to precede short-term elderly nontraumatic mortality. The model is based on a subpopulation of frail individuals whose entries and exits (deaths) are functions of daily and lagged environmental conditions: ambient temperature/season, airborne particles, and ozone. This frail susceptible population is unknown; its fluctuations cannot be observed but are estimated using maximum-likelihood methods with the Kalman filter. We used an existing 14-y set of daily data to illustrate the model and then tested the assumption of prior frailty with a new generalized model that estimates the portion of the daily death count allocated to nonfrail individuals. In this demonstration dataset, new entries into the high-risk pool are associated with lower ambient temperatures and higher concentrations of particulate matter and ozone. Accounting for these effects on antecedent frailty reduces this at-risk population, yielding frail life expectancies of 5-7 days. Associations between environmental factors and entries to the at-risk pool are about twice as strong as for mortality. Nonfrail elderly deaths are seen to make only small contributions. This new model predicts a small short-lived frail population-at-risk that is stable over a wide range of environmental conditions. The predicted effects of pollution on new entries and deaths are robust and consistent with conventional morbidity/mortality time-series studies. We recommend model verification using other suitable datasets.
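The Kalman filter machinery used to track the unobserved frail population can be illustrated with a toy local-level model. This is only a sketch of the filtering step; the paper's actual state equations, driven by temperature and pollution covariates, are more elaborate, and the variances q and r here are arbitrary:

```python
import numpy as np

def kalman_local_level(y, q=1.0, r=4.0, m0=0.0, p0=100.0):
    """Minimal Kalman filter for a local-level model:
    latent x[t] = x[t-1] + w,  w ~ N(0, q)   (state evolution)
    observed y[t] = x[t] + v,  v ~ N(0, r)   (noisy measurement)
    Returns the filtered means of the latent state."""
    m, p = m0, p0
    means = []
    for obs in y:
        p = p + q                     # predict: state variance grows
        gain = p / (p + r)            # Kalman gain
        m = m + gain * (obs - m)      # update mean toward observation
        p = (1.0 - gain) * p          # update variance
        means.append(m)
    return np.array(means)
```

In the study's setting, the latent state plays the role of the daily frail pool, and maximum likelihood over the filter's innovations estimates the model parameters.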

  20. Climate-change-driven accelerated sea-level rise detected in the altimeter era.

    Science.gov (United States)

    Nerem, R S; Beckley, B D; Fasullo, J T; Hamlington, B D; Masters, D; Mitchum, G T

    2018-02-27

    Using a 25-y time series of precision satellite altimeter data from TOPEX/Poseidon, Jason-1, Jason-2, and Jason-3, we estimate the climate-change-driven acceleration of global mean sea level over the last 25 y to be 0.084 ± 0.025 mm/y². Coupled with the average climate-change-driven rate of sea level rise over these same 25 y of 2.9 mm/y, simple extrapolation of the quadratic implies global mean sea level could rise 65 ± 12 cm by 2100 compared with 2005, roughly in agreement with the Intergovernmental Panel on Climate Change (IPCC) 5th Assessment Report (AR5) model projections. Copyright © 2018 the Author(s). Published by PNAS.
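The 65 cm figure follows from simple quadratic extrapolation of the stated rate and acceleration over the 95 years from 2005 to 2100:

```python
def gmsl_rise(rate=2.9, accel=0.084, years=95):
    """Quadratic extrapolation of global mean sea level rise (mm):
    rise = rate * t + 0.5 * accel * t^2, with t in years since 2005,
    using the rate (mm/y) and acceleration (mm/y^2) from the abstract."""
    return rate * years + 0.5 * accel * years ** 2
```

The linear term contributes about 276 mm and the acceleration term about 379 mm, for roughly 65 cm total.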

  1. PERSPECTIVE: The tripping points of sea level rise

    Science.gov (United States)

    Hecht, Alan D.

    2009-12-01

    When President Nixon created the US Environmental Protection Agency (EPA) in 1970 he said the environment must be perceived as a single, interrelated system. We are nowhere close to achieving this vision. Jim Titus and his colleagues [1] highlight one example of where one set of regulations or permits may be in conflict with another and where regulations were crafted in the absence of understanding the cumulative impact of global warming. The issue here is how to deal with the impacts of climate change on sea level and the latter's impact on wetland policies, clean water regulations, and ecosystem services. The Titus paper could also be called 'The tripping points of sea level rise'. Titus and his colleagues have looked at the impact of such sea level rise on the east coast of the United States. Adaptive responses include costly large-scale investment in shore protection (e.g. dikes, sand replenishment) and/or ecosystem migration (retreat), where coastal ecosystems move inland. Shore protection is limited by available funds, while ecosystem migrations are limited by available land use. The driving factor is the high probability of sea level rise due to climate change. Estimating sea level rise is difficult because of local land and coastal dynamics, including rising or falling land areas. It is estimated that sea level could rise between 8 inches and 2 feet by the end of this century [2]. The extensive data analysis done by Titus et al. of current land use is important because, as they observe, 'property owners and land use agencies have generally not decided how they will respond to sea level rise, nor have they prepared maps delineating where shore protection and retreat are likely'. This is the first of two 'tripping points', namely the need for adaptive planning for a pending environmental challenge that will create economic and environmental conflict among land owners, federal and state agencies, and businesses.
One way to address this gap in adaptive management

  2. A model of water and sediment balance as determinants of relative sea level rise in contemporary and future deltas

    Science.gov (United States)

    Tessler, Zachary D.; Vörösmarty, Charles J.; Overeem, Irina; Syvitski, James P. M.

    2018-03-01

    Modern deltas are dependent on human-mediated freshwater and sediment fluxes. Changes to these fluxes impact delta biogeophysical functioning and affect the long-term sustainability of these landscapes for human and for natural systems. Here we present contemporary estimates of long-term mean sediment balance and relative sea level rise across 46 global deltas. We model scenarios of contemporary and future water resource management schemes and hydropower infrastructure in upstream river basins to explore how changing sediment fluxes impact relative sea level rise in delta systems. Model results show that contemporary sediment fluxes, anthropogenic drivers of land subsidence, and sea level rise result in delta relative sea level rise rates that average 6.8 mm/y. Assessment of impacts of planned and under-construction dams on relative sea level rise rates suggests increases on the order of 1 mm/y in deltas with new upstream construction. Sediment fluxes are estimated to decrease by up to 60% in the Danube and 21% in the Ganges-Brahmaputra-Meghna if all currently planned dams are constructed. Reduced sediment retention on deltas caused by increased river channelization and management has a larger impact, increasing relative sea level rise on average by nearly 2 mm/y. Long-term delta sustainability requires a more complete understanding of how geophysical and anthropogenic change impact delta geomorphology. Local and regional strategies for sustainable delta management that focus on local and regional drivers of change, especially groundwater and hydrocarbon extraction and upstream dam construction, can be highly impactful even in the context of global climate-induced sea level rise.
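The relative sea-level rise experienced at a delta surface can be read as a simple budget: climate-driven rise plus land subsidence, offset by sediment aggradation building the surface upward. This decomposition is a schematic reading of the abstract, not the authors' model, and the numbers in any example are illustrative:

```python
def relative_slr(eustatic, subsidence, aggradation):
    """Relative sea-level rise at a delta surface (mm/y):
    climate-driven (eustatic) rise plus land subsidence, minus
    sediment aggradation raising the delta surface."""
    return eustatic + subsidence - aggradation
```

Within this framing, upstream dams and channelization act by shrinking the aggradation term, which raises the relative rate even if eustatic rise is unchanged.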

  3. Find-rate methodology and resource base estimates of the Hydrocarbon Supply Model (1990 update). Topical report

    International Nuclear Information System (INIS)

    Woods, T.

    1991-02-01

    The Hydrocarbon Supply Model is used to develop long-term trends in Lower-48 gas production and costs. The model utilizes historical find-rate patterns to predict the discovery rate and size distribution of future oil and gas field discoveries. The report documents the methodologies used to quantify historical oil and gas field find-rates and to project those discovery patterns for future drilling. It also explains the theoretical foundations for the find-rate approach. The new field and reserve growth resource base is documented and compared to other published estimates. The report has six sections. Section 1 provides background information and an overview of the model. Sections 2, 3, and 4 describe the theoretical foundations of the model, the databases, and specific techniques used. Section 5 presents the new field resource base by region and depth. Section 6 documents the reserve growth model components

  4. Modelling sea level rise impacts on storm surges along US coasts

    International Nuclear Information System (INIS)

    Tebaldi, Claudia; Strauss, Benjamin H; Zervas, Chris E

    2012-01-01

    Sound policies for protecting coastal communities and assets require good information about vulnerability to flooding. Here, we investigate the influence of sea level rise on expected storm surge-driven water levels and their frequencies along the contiguous United States. We use model output for global temperature changes, a semi-empirical model of global sea level rise, and long-term records from 55 nationally distributed tidal gauges to develop sea level rise projections at each gauge location. We employ more detailed records over the period 1979–2008 from the same gauges to elicit historic patterns of extreme high water events, and combine these statistics with anticipated relative sea level rise to project changing local extremes through 2050. We find that substantial changes in the frequency of what are now considered extreme water levels may occur even at locations with relatively slow local sea level rise, when the difference in height between presently common and rare water levels is small. We estimate that, by mid-century, some locations may experience high water levels annually that would qualify today as ‘century’ (i.e., having a chance of occurrence of 1% annually) extremes. Today’s century levels become ‘decade’ (having a chance of 10% annually) or more frequent events at about a third of the study gauges, and the majority of locations see substantially higher frequency of previously rare storm-driven water heights in the future. These results add support to the need for policy approaches that consider the non-stationarity of extreme events when evaluating risks of adverse climate impacts. (letter)
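The "century becomes decade" effect can be illustrated with Gumbel-distributed annual maxima, a common extreme-value assumption (not necessarily the distribution used in the study): raising mean sea level shifts the whole distribution upward, inflating the annual exceedance probability of today's 100-year level.

```python
import math

def new_annual_probability(z, slr, mu=0.0, beta=0.15):
    """Annual probability of exceeding today's level z after sea level
    rises by slr, for Gumbel-distributed annual maxima with location mu
    and scale beta (meters): p = 1 - F(z - slr), where
    F(x) = exp(-exp(-(x - mu) / beta))."""
    x = z - slr - mu
    return 1.0 - math.exp(-math.exp(-x / beta))
```

With an illustrative scale of beta = 0.15 m, a 0.35 m rise turns a 1%-per-year ("century") water level into roughly a 10%-per-year ("decade") event, exactly the kind of frequency shift the abstract reports. Note how the effect is largest where beta is small, i.e. where common and rare levels differ little in height.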

  5. The NLC Software Requirements Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Shoaee, Hamid

    2002-08-20

    We describe the software requirements and development methodology developed for the NLC control system. Given the longevity of that project and the likely geographical distribution of the collaborating engineers, the planned requirements management process is somewhat more formal than the norm in high-energy physics projects. The short-term goals of the requirements process are to accurately estimate costs, to decompose the problem, and to determine likely technologies. The long-term goal is to enable a smooth transition from high-level functional requirements to specific subsystem and component requirements for individual programmers, and to support distributed development. The methodology covers both ends of that life cycle: the analytical and documentary tools for software engineering, and project management support. This paper introduces the methodology, which is fully described in [1].

  6. Rising atmospheric CO2 and crops: Research methodology and direct effects

    Energy Technology Data Exchange (ETDEWEB)

    Rogers, H. [National Soil Dynamics Laboratory, Auburn, AL (United States); Acock, B. [Systems Research Laboratory, Beltsville, MD (United States)

    1993-12-31

    Carbon dioxide is the food of trees and grass. Our relentless pursuit of a better life has taken us down a traffic-jammed road, past smoking factories and forests. This pursuit is forcing a rise in the atmospheric CO2 level, and no one knows when, or if, flood stage will be reached. Some thinkers have suggested that this increase of CO2 in the atmosphere will cause warming. Whether or not this prediction is realized, more CO2 will directly affect plants. Data from controlled observations have usually, but not always, shown benefits. Our choices of scientific equipment for gathering CO2 response data are critical, since we must see what is happening through the eye of the instrument. The signals derived from our sensors will ultimately determine the truth of our conclusions, conclusions which will profoundly influence our policy decisions. Experimental gear is selected on the basis of the scale of interest and the problem to be addressed. Our imaginations and our budgets interact to set bounds on our objectives and approaches. Techniques run the gamut from cellular microprobes through whole-plant controlled-environment chambers to field-scale exposure systems. Trade-offs exist among the various CO2 exposure techniques, and many factors impinge on the choice of a method. All exposure chambers are derivatives of three primary types: batch, plug flow, and continuous stirred tank reactor. Systems for the generation of controlled test atmospheres of CO2 vary in two basic ways: size and degree of control. Among the newest is free-air CO2 enrichment, which allows tens of square meters of cropland to be studied.

  7. GLUE Based Uncertainty Estimation of Urban Drainage Modeling Using Weather Radar Precipitation Estimates

    DEFF Research Database (Denmark)

    Nielsen, Jesper Ellerbæk; Thorndahl, Søren Liedtke; Rasmussen, Michael R.

    2011-01-01

    Distributed weather radar precipitation measurements are used as rainfall input for an urban drainage model, to simulate the runoff from a small catchment of Denmark. It is demonstrated how the Generalized Likelihood Uncertainty Estimation (GLUE) methodology can be implemented and used to estimate...
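A generic GLUE workflow, stripped to its essentials, looks like the sketch below: Monte Carlo parameter sampling, a likelihood score per run, a behavioral threshold, and likelihood-weighted predictions. The one-parameter toy model, the threshold of 0.3, and the Nash-Sutcliffe-style score are illustrative choices, not those of the study:

```python
import numpy as np

def glue(simulate, observed, n_samples=2000, threshold=0.3, seed=1):
    """Minimal GLUE sketch for a model with one uncertain parameter.
    Samples parameters, scores each run with a Nash-Sutcliffe-style
    likelihood, keeps 'behavioral' sets above the threshold, and
    returns the likelihood-weighted mean prediction."""
    rng = np.random.default_rng(seed)
    params = rng.uniform(0.0, 2.0, n_samples)   # prior: uniform on [0, 2]
    obs_var = np.var(observed)
    kept = []
    for p in params:
        sim = simulate(p)
        score = 1.0 - np.mean((sim - observed) ** 2) / obs_var
        if score > threshold:                   # behavioral parameter set
            kept.append((p, score, sim))
    likes = np.array([s for _, s, _ in kept])
    weights = likes / likes.sum()               # normalized likelihoods
    preds = np.array([sim for _, _, sim in kept])
    mean_pred = (weights[:, None] * preds).sum(axis=0)
    return mean_pred, [p for p, _, _ in kept]
```

In the drainage-model setting, `simulate` would be the runoff model driven by radar rainfall, and the spread of the behavioral predictions (not just their weighted mean) quantifies the output uncertainty.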

  8. Methodologies on estimating the energy requirements for maintenance and determining the net energy contents of feed ingredients in swine: a review of recent work.

    Science.gov (United States)

    Li, Zhongchao; Liu, Hu; Li, Yakui; Lv, Zhiqian; Liu, Ling; Lai, Changhua; Wang, Junjun; Wang, Fenglai; Li, Defa; Zhang, Shuai

    2018-01-01

    In the past two decades, a considerable amount of research has focused on the determination of the digestible energy (DE) and metabolizable energy (ME) contents of feed ingredients fed to swine. Compared with the DE and ME systems, the net energy (NE) system is assumed to be the most accurate estimate of the energy actually available to the animal. However, published data pertaining to the measured NE content of ingredients fed to growing pigs are limited. Therefore, the Feed Data Group at the Ministry of Agricultural Feed Industry Centre (MAFIC) located at China Agricultural University has evaluated the NE content of many ingredients using indirect calorimetry. The present review summarizes the NE research conducted at MAFIC and compares these results with those of other research groups from a methodological perspective. These research projects mainly focus on estimating the energy requirements for maintenance and its impact on the determination, prediction, and validation of the NE content of several ingredients fed to swine. The estimation of maintenance energy is affected by methodology, growth stage, and previous feeding level. The fasting heat production method and the curvilinear regression method were used at MAFIC to estimate the NE requirement for maintenance. The NE contents of different feedstuffs were determined using indirect calorimetry through a standard experimental procedure at MAFIC. Previously generated NE equations can also be used to predict NE in situations where calorimeters are not available. Although popular, caloric efficiency is not a generally accepted method to validate the energy content of individual feedstuffs. In the future, more accurate and dynamic NE prediction equations aimed at specific ingredients should be established, and more practical validation approaches need to be developed.

  9. MIRD methodology

    Energy Technology Data Exchange (ETDEWEB)

    Rojo, Ana M [Autoridad Regulatoria Nuclear, Buenos Aires (Argentina); Gomez Parada, Ines [Sociedad Argentina de Radioproteccion, Buenos Aires (Argentina)

    2004-07-01

    The MIRD (Medical Internal Radiation Dose) system was established by the Society of Nuclear Medicine of the USA in 1960 to assist the medical community in estimating the dose to organs and tissues due to the incorporation of radioactive materials. Since then, 'MIRD Dose Estimate' reports (1 through 12) and 'Pamphlets', of great utility for dose calculations, have been published. The MIRD system was planned essentially for the calculation of doses received by patients during nuclear medicine diagnostic procedures. The MIRD methodology for absorbed dose calculations in different tissues is explained.
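At its core, the MIRD schema computes the mean absorbed dose to a target organ as a sum over source organs of the cumulated activity à in each source times a tabulated S value for that source-target pair. The sketch below states that sum; the activity and S values in any example are illustrative, not tabulated MIRD data:

```python
def absorbed_dose(cumulated_activity, s_factors):
    """MIRD schema: mean absorbed dose to a target organ (Gy) as the
    sum over source organs of the cumulated activity A-tilde (Bq*s)
    times the S value (Gy per Bq*s) for each source-target pair."""
    return sum(a_tilde * s
               for a_tilde, s in zip(cumulated_activity, s_factors))
```

All the physics (decay scheme, organ geometry, absorbed fractions) is folded into the tabulated S values, which is what makes the schema practical for clinical dose estimates.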

  10. Regional approaches in high-rise construction

    Science.gov (United States)

    Iconopisceva, O. G.; Proskurin, G. A.

    2018-03-01

    The article focuses on the evolutionary process of high-rise construction. The aim of the study was to create a retrospective matrix reflecting its tasks, such as structuring the most iconic high-rise objects within historic boundaries. The study is based on contemporary experience of high-rise construction in different countries. The main directions and regional specifics in the field of high-rise construction, as well as factors influencing the further evolution process, are analyzed. The main changes in architectural stylistics, form-building, and constructive solutions that focus on the principles of energy efficiency and biopositivity of 'sustainable buildings', as well as the search for a new typology, are noted. The most universal constructive methods and solutions that have proved particularly popular are generalized. The new typology of high-rises and an individual approach to the urban context are noted. The results of the study, presented as a graphical scheme, made it possible to represent the whole evolution of high-rise construction. The new spatial forms of high-rises lead them to a new role within urban environments. Futuristic hyperscalable concepts take on the functions of autonomous urban space and demonstrate how high-rises can replace multifunctional urban fabric by developing it inside their shells.

  11. Reconciling projections of the Antarctic contribution to sea level rise

    Science.gov (United States)

    Edwards, Tamsin; Holden, Philip; Edwards, Neil; Wernecke, Andreas

    2017-04-01

    Two recent studies of the Antarctic contribution to sea level rise this century had best estimates that differed by an order of magnitude (around 10 cm and 1 m by 2100). The first, Ritz et al. (2015), used a model calibrated with satellite data, giving a 5% probability of exceeding 30 cm by 2100 for sea level rise due to Antarctic instability. The second, DeConto and Pollard (2016), used a model evaluated with reconstructions of palaeo-sea level. They did not estimate probabilities, but using a simple assumption here about the distribution shape gives up to a 5% chance of the Antarctic contribution exceeding 2.3 m this century, with total sea level rise approaching 3 m. If robust, this would have very substantial implications for global adaptation to climate change. How are we to make sense of this apparent inconsistency? How much is down to the data - does the past tell us we will face widespread and rapid Antarctic ice losses in the future? How much is due to the mechanism of rapid ice loss ('cliff failure') proposed in the latter paper, or other parameterisation choices in these low resolution models (GRISLI and PISM, respectively)? How much is due to choices made in the ensemble design and calibration? How do these projections compare with high resolution, grounding line resolving models such as BISICLES? Could we reduce the huge uncertainties in the palaeo-study? Emulation provides a powerful tool for understanding these questions and reconciling the projections. By describing the three numerical ice sheet models with statistical models, we can re-analyse the ensembles and re-do the calibrations under a common statistical framework. This reduces uncertainty in the PISM study because it allows massive sampling of the parameter space, which reduces the sensitivity to reconstructed palaeo-sea level values and also narrows the probability intervals because the simple assumption about distribution shape above is no longer needed. We present reconciled probabilistic projections.

  12. An LWR Design Decision Methodology

    International Nuclear Information System (INIS)

    Leahy, T.J.; Rees, D.C.; Young, J.

    1982-01-01

    While all parties involved in nuclear plant regulation endeavor to make decisions which optimize the considerations of plant safety and financial impacts, these decisions are generally made without the benefit of a systematic and rigorous approach to the questions confronting the decision makers. A Design Decision Methodology has been developed which provides such a systematic approach. By employing this methodology, which makes use of currently accepted probabilistic risk assessment techniques and cost estimation, informed decisions may be made against a background of comparisons between the relative levels of safety and costs associated with various design alternatives.

  13. Soil Structure Interaction Effect on High Rise and Low Rise Buildings

    OpenAIRE

    Divya Pathak; Paresh H. Shah

    2000-01-01

    The effect of supporting soil on the response of a structure has been analyzed in the present study. A low-rise (G+5 storey) and a high-rise (G+12 storey) building have been taken for the analysis. For both types of buildings, the response with and without consideration of the soil-structure interaction effect has been compared. The without-interaction case is the case in which the ends of the structure are assumed to be fixed, while in the interaction case, the structure is assumed to be...

  14. The ideologies of positive economics: technocracy, laissez-faire, and the tensions of Friedman's methodological claims

    Directory of Open Access Journals (Sweden)

    Fernando Rugitsky

    2015-09-01

    The purpose of this paper is to address some connections between Milton Friedman's classic essay on methodology and the rise of neoliberal thinking. In order to do so, it briefly reconstructs Friedman's methodological claims and reinterprets the debate about them. Emphasis is put on the tensions between instrumentalism and realism or pragmatism, and between empiricism and the defense of Chicago price theory. These tensions are then related to a tension that is arguably inherent in neoliberalism, between technocracy and laissez-faire. The argument presented aims to contribute to bridging the gap between the recent literature on neoliberalism and the older literature on Friedman's methodological essay.

  15. Challenges in Projecting Sea Level Rise impacts on the Coastal Environment of South Florida (Invited)

    Science.gov (United States)

    Obeysekera, J.; Park, J.; Irizarry-Ortiz, M. M.; Barnes, J. A.; Trimble, P.; Said, W.

    2010-12-01

    Due to flat topography, a highly transmissive groundwater aquifer, and a growing population with the associated infrastructure, South Florida’s coastal environment is one of the most vulnerable areas to sea level rise. Current projections of sea level rise and the associated storm surges will have direct impacts on coastal beaches and infrastructure, flood protection, freshwater aquifers, and both the isolated and regional wetlands. Uncertainties in current projections have made it difficult for regional and local governments to develop adaptation strategies as such measures will depend heavily on the temporal and spatial patterns of sea level rise in the coming decades. We demonstrate the vulnerability of both the built and natural environments of the coastal region and present the current efforts to understand and predict the sea level rise estimate that management agencies could employ in planning of adaptation strategies. In particular, the potential vulnerabilities of the flood control system as well as the threat to the water supply wellfields in the coastal belt will be presented. In an effort to understand the historical variability of sea level rise, we present linkages to natural phenomena such as Atlantic Multi-Decadal Oscillation, and the analytical methods we have developed to provide probabilistic projections of both mean sea level rise and the extremes.

  16. Practical state of health estimation of power batteries based on Delphi method and grey relational grade analysis

    Science.gov (United States)

    Sun, Bingxiang; Jiang, Jiuchun; Zheng, Fangdan; Zhao, Wei; Liaw, Bor Yann; Ruan, Haijun; Han, Zhiqiang; Zhang, Weige

    2015-05-01

The state of health (SOH) estimation is critical to the battery management system to ensure the safety and reliability of EV battery operation. Here, we used a unique hybrid approach to enable complex SOH estimations. The approach hybridizes the Delphi method, known for its simplicity and effectiveness in applying weighting factors for complicated decision-making, and the grey relational grade analysis (GRGA) for multi-factor optimization. Six critical factors were considered for SOH estimation: peak power at 30% state-of-charge (SOC); capacity; the voltage drop at 30% SOC with a C/3 pulse; the temperature rises at the end of 1C discharge and 1C charge, respectively; and the open-circuit voltage at the end of charge after a 1-h rest. The weighting of these factors for SOH estimation was scored by the 'experts' in the Delphi method, indicating the influencing power of each factor on SOH. The parameters for these factors expressing the battery state variations are optimized by GRGA. Eight battery cells were used to illustrate the principle and methodology of estimating the SOH by this hybrid approach, and the results were compared with those based on capacity and power capability. The contrast among different SOH estimations is discussed.

  17. Probabilistic methodology for estimating radiation-induced cancer risk

    International Nuclear Information System (INIS)

    Dunning, D.E. Jr.; Leggett, R.W.; Williams, L.R.

    1981-01-01

The RICRAC computer code was developed at Oak Ridge National Laboratory to provide a versatile and convenient methodology for radiation risk assessment. The code allows as input essentially any dose pattern commonly encountered in risk assessments for either acute or chronic exposures, and it includes consideration of the age structure of the exposed population. Results produced by the analysis include the probability of one or more radiation-induced cancer deaths in a specified population, expected numbers of deaths, and expected years of life lost as a result of premature fatalities. These calculations include consideration of competing risks of death from all other causes. The program also generates a probability frequency distribution of the expected number of cancers in any specified cohort resulting from a given radiation dose. The methods may be applied to any specified population and dose scenario.
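The headline quantity of such a code, the probability of one or more radiation-induced deaths, follows from the expected number of deaths when occurrences are treated as Poisson-distributed. The cohort size, dose, and risk coefficient below are illustrative assumptions, not RICRAC values.

```python
import math

def prob_at_least_one(expected_deaths):
    """P(N >= 1) for Poisson-distributed occurrences with mean lambda."""
    return 1.0 - math.exp(-expected_deaths)

# Hypothetical cohort: 10,000 people at 0.01 Sv each, with an illustrative
# lifetime risk coefficient of 5e-2 per Sv (an assumption, not a RICRAC value)
population, dose_sv, risk_per_sv = 10_000, 0.01, 5e-2
lam = population * dose_sv * risk_per_sv      # expected number of deaths
p = prob_at_least_one(lam)
print(round(lam, 1), round(p, 4))
```

Even a modest expected number of deaths drives the probability of at least one death close to unity, which is why both quantities are reported separately.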

  18. Tornado missile simulation and design methodology. Volume 1: simulation methodology, design applications, and TORMIS computer code. Final report

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data-based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments. Sensitivity analyses have been performed on both the individual models and the integrated methodology, and risk has been assessed for a hypothetical nuclear power plant design case study.
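The Monte Carlo step can be illustrated with a deliberately crude sketch: sample surrogate missile landing points and count the fraction hitting a target footprint. The geometry and sampling model are invented for illustration only and bear no relation to the TORMIS transport models.

```python
import random

def impact_probability(n_trials, hit_test, seed=1):
    """Fraction of sampled surrogate landing points satisfying a hit test."""
    rng = random.Random(seed)                  # fixed seed for reproducibility
    hits = sum(hit_test(rng.uniform(-500, 500), rng.uniform(-500, 500))
               for _ in range(n_trials))
    return hits / n_trials

# Target: a 50 m x 50 m footprint centred at (100, 0) inside a 1 km square
inside = lambda x, y: 75 <= x <= 125 and -25 <= y <= 25
p = impact_probability(200_000, inside)
print(p)  # expect roughly (50*50)/(1000*1000) = 0.0025
```

Real codes replace the uniform landing model with physics-based trajectory simulation, but the counting estimator is the same.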

  19. Pulse superimposition calculational methodology for estimating the subcriticality level of nuclear fuel assemblies

    International Nuclear Information System (INIS)

    Talamo, Alberto; Gohar, Y.; Rabiti, C.; Aliberti, G.; Kondev, F.; Smith, D.; Zhong, Z.; Kiyavitskaya, H.; Bournos, V.; Fokov, Y.; Routkovskaya, C.; Serafimovich, I.

    2009-01-01

One of the most reliable experimental methods for measuring the subcriticality level of a nuclear fuel assembly is the Sjoestrand method applied to the reaction rate generated from a pulsed neutron source. This study developed a new analytical methodology simulating the Sjoestrand method, which allows comparing the experimental and analytical reaction rates and the obtained subcriticality levels. In this methodology, the reaction rate due to a single neutron pulse is calculated using the MCNP/MCNPX computer code or any other neutron transport code that explicitly simulates the delayed fission neutrons. The calculation simulates a single neutron pulse over a long time period until the delayed neutron contribution to the reaction rate has vanished. The obtained reaction rate is then superimposed onto itself, shifted in time, to simulate the repeated pulse operation until the asymptotic level of the reaction rate, set by the delayed neutrons, is achieved. The superimposition of the pulse onto itself was calculated by a simple C computer program; a parallel version of the program, using the Message Passing Interface (MPI), handles the large amount of data being processed. The analytical results of this new calculation methodology have shown an excellent agreement with the experimental data available from the YALINA-Booster facility of Belarus. This methodology can be used to calculate the Bell and Glasstone spatial correction factor.
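The superimposition step can be sketched as follows, with a synthetic single-pulse response (fast prompt decay plus a small delayed-neutron tail) standing in for a transport-code result; all time constants and amplitudes are invented for illustration.

```python
import numpy as np

def superimpose(single_pulse, period_bins, n_pulses):
    """Superimpose a single-pulse reaction-rate history onto itself,
    shifted by the pulse period, to build the repeated-pulse response."""
    total = np.zeros(len(single_pulse))
    for k in range(n_pulses):
        shift = k * period_bins
        if shift >= len(single_pulse):
            break
        total[shift:] += single_pulse[:len(single_pulse) - shift]
    return total

# Synthetic single-pulse response: fast prompt decay plus a small,
# slowly decaying delayed-neutron tail (arbitrary units and time bins).
t = np.arange(2000)
pulse = np.exp(-t / 5.0) + 1e-3 * np.exp(-t / 10000.0)
train = superimpose(pulse, period_bins=100, n_pulses=20)
# the late-time level climbs toward an asymptote set by the delayed tail
print(bool(train[90] < train[990] < train[1990]))
```

Each added shifted copy raises the baseline between prompt bursts, reproducing how repeated pulsing builds up the delayed-neutron plateau that the Sjoestrand area method exploits.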

  20. Pulse superimposition calculational methodology for estimating the subcriticality level of nuclear fuel assemblies

    Energy Technology Data Exchange (ETDEWEB)

    Talamo, Alberto [Argonne National Laboratory, 9700 South Cass Avenue, Argonne, IL 60439 (United States)], E-mail: atalamo@anl.gov; Gohar, Y. [Argonne National Laboratory, 9700 South Cass Avenue, Argonne, IL 60439 (United States); Rabiti, C. [Idaho National Laboratory, P.O. Box 2528, Idaho Falls, ID 83403 (United States); Aliberti, G.; Kondev, F.; Smith, D.; Zhong, Z. [Argonne National Laboratory, 9700 South Cass Avenue, Argonne, IL 60439 (United States); Kiyavitskaya, H.; Bournos, V.; Fokov, Y.; Routkovskaya, C.; Serafimovich, I. [Joint Institute for Power and Nuclear Research-Sosny, National Academy of Sciences (Belarus)

    2009-07-21

One of the most reliable experimental methods for measuring the subcriticality level of a nuclear fuel assembly is the Sjoestrand method applied to the reaction rate generated from a pulsed neutron source. This study developed a new analytical methodology simulating the Sjoestrand method, which allows comparing the experimental and analytical reaction rates and the obtained subcriticality levels. In this methodology, the reaction rate due to a single neutron pulse is calculated using the MCNP/MCNPX computer code or any other neutron transport code that explicitly simulates the delayed fission neutrons. The calculation simulates a single neutron pulse over a long time period until the delayed neutron contribution to the reaction rate has vanished. The obtained reaction rate is then superimposed onto itself, shifted in time, to simulate the repeated pulse operation until the asymptotic level of the reaction rate, set by the delayed neutrons, is achieved. The superimposition of the pulse onto itself was calculated by a simple C computer program; a parallel version of the program, using the Message Passing Interface (MPI), handles the large amount of data being processed. The analytical results of this new calculation methodology have shown an excellent agreement with the experimental data available from the YALINA-Booster facility of Belarus. This methodology can be used to calculate the Bell and Glasstone spatial correction factor.

  1. Construction of high-rise buildings in the Far East of Russia

    Science.gov (United States)

    Kudryavtsev, Sergey; Bugunov, Semen; Pogulyaeva, Evgeniya; Peters, Anastasiya; Kotenko, Zhanna; Grigor'yev, Danil

    2018-03-01

The construction of high-rise buildings on plate foundations in the geotechnical conditions of the Russian Far East is a complicated problem, which makes foundation engineering essential. Setting a firm foundation requires accounting for the pressure distribution at the structure base and the inhomogeneity of building deformation, which calls for collaborative geotechnical calculations complicated by a number of factors: the actual stratification of the soils, the complex geometry of the building under construction, the spatial work of the foundation ground with consideration for physical nonlinearity, the influence of the stiffness of the superstructure (reinforced concrete framing) on the development of foundation deformations, the performance of the foundation (the bed plate under the building and stairwells), and the internal forces arising in the superstructure under differential settlement. The solution of spatial problems of mutual interaction between buildings and foundations with account of the factors mentioned above is fully achievable through numerical modeling. The work reviews the results of numerical modeling of high-rise plate-foundation buildings in the geotechnical conditions of the Russian Far East, using the city of Khabarovsk as an example.

  2. A practical and transferable methodology for dose estimation in irradiated spices based on thermoluminescence dosimetry

    International Nuclear Information System (INIS)

    D'Oca, M.C.; Bartolotta, A.; Cammilleri, C.; Giuffrida, S.; Parlato, A.; Di Stefano, V.

    2008-01-01

Full text: Among the industrial applications of ionizing radiation, the treatment of food for preservation purposes is a worldwide recognized tool, provided that proper and validated identification methods are available and used. Thermoluminescence (TL) dosimetry is the physical method validated by the European Committee for Standardization for food from which silicate minerals can be isolated, such as spices and aromatic herbs. The aim of this work was to set up a reasonably simple procedure, alternative to the recommended one, for the identification of irradiated spices and, at the same time, to estimate the original dose in the irradiated product, using TL and the additive dose method, even after months of storage. We have already shown that the additive dose method can be applied with TL dosimetry if the TL response of the silicate specimen after extraction is always added to the response after each irradiation; the applied added doses were higher than 1 kGy, which can however give saturation problems. The new proposed methodology makes use of added doses lower than 600 Gy; the entire process can be completed within a few hours and a linear fit can be utilized. The method was applied to the silicates extracted from oregano samples soon after the radiation treatment (original dose: 2, 3 and 5 kGy), and after one year of storage at room conditions in the dark (original dose: 1 and 2 kGy). The procedure allows the identification of irradiated samples, without any false positives, together with an estimate of the dose range.
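The additive dose extrapolation can be sketched as a linear fit of TL response against added dose, read back to zero response: if TL = a·(D_original + D_added), the original dose is intercept/slope. The TL readings below are invented to illustrate the arithmetic only.

```python
import numpy as np

# Additive dose method (sketch): TL response taken as linear in total dose,
# TL = a * (D_original + D_added), so extrapolating the fit back to TL = 0
# gives the original dose as intercept/slope.
added_doses = np.array([0.0, 150.0, 300.0, 450.0, 600.0])   # Gy, hypothetical
tl_response = np.array([4.1, 4.4, 4.7, 5.0, 5.3])           # a.u., hypothetical

slope, intercept = np.polyfit(added_doses, tl_response, 1)
d_original = intercept / slope
print(round(d_original))   # Gy
```

With these invented readings the extrapolation recovers an original dose of about 2 kGy, the order of the treatment doses reported in the abstract.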

  3. Sea level hazards: Altimetric monitoring of tsunamis and sea level rise

    Science.gov (United States)

    Hamlington, Benjamin Dillon

    Whether on the short timescale of an impending tsunami or the much longer timescale of climate change-driven sea level rise, the threat stemming from rising and inundating ocean waters is a great concern to coastal populations. Timely and accurate observations of potentially dangerous changes in sea level are vital in determining the precautionary steps that need to be taken in order to protect coastal communities. While instruments from the past have provided in situ measurements of sea level at specific locations across the globe, satellites can be used to provide improved spatial and temporal sampling of the ocean in addition to producing more accurate measurements. Since 1993, satellite altimetry has provided accurate measurements of sea surface height (SSH) with near-global coverage. Not only have these measurements led to the first definitive estimates of global mean sea level rise, satellite altimetry observations have also been used to detect tsunami waves in the open ocean where wave amplitudes are relatively small, a vital step in providing early warning to those potentially affected by the impending tsunami. The use of satellite altimetry to monitor two specific sea level hazards is examined in this thesis. The first section will focus on the detection of tsunamis in the open ocean for the purpose of providing early warning to coastal inhabitants. The second section will focus on estimating secular trends using satellite altimetry data with the hope of improving our understanding of future sea level change. Results presented here will show the utility of satellite altimetry for sea level monitoring and will lay the foundation for further advancement in the detection of the two sea level hazards considered.

  4. Screening radon risks: A methodology for policymakers

    International Nuclear Information System (INIS)

    Eisinger, D.S.; Simmons, R.A.; Lammering, M.; Sotiros, R.

    1991-01-01

This paper provides an easy-to-use screening methodology to estimate potential excess lifetime lung cancer risk resulting from indoor radon exposure. The methodology was developed under U.S. EPA Office of Policy, Planning, and Evaluation sponsorship of the agency's Integrated Environmental Management Projects (IEMP) and State/Regional Comparative Risk Projects. These projects help policymakers understand and use scientific data to develop environmental problem-solving strategies. This research presents the risk assessment methodology, discusses its basis, and identifies appropriate applications. The paper also identifies assumptions built into the methodology and qualitatively addresses methodological uncertainties, the direction in which these uncertainties could bias analyses, and their relative importance. The methodology draws from several sources, including risk assessment formulations developed by the U.S. EPA's Office of Radiation Programs, the EPA's Integrated Environmental Management Project (Denver), the International Commission on Radiological Protection, and the National Institute for Occupational Safety and Health. When constructed as a spreadsheet program, the methodology easily facilitates analyses and sensitivity studies (the paper includes several sensitivity study options). The methodology will be most helpful to those who need to make decisions concerning radon testing, public education, and exposure prevention and mitigation programs. 26 references.

  5. Polyfactorial corruption index in the Russian regions: methodology of estimation

    Directory of Open Access Journals (Sweden)

    Elina L. Sidorenko

    2016-09-01

Full Text Available. Objective: to summarize criminological, social and economic indicators of development of the Russian Federation subjects; to identify and assess the hidden systemic dependencies between social indicators and levels of corruption; to define the links between individual indicators; and to develop a methodology for the anti-corruption ranking of the regions. Methods: comparison, analysis, synthesis, mathematical modeling, correlation comparisons and extrapolation. Results: the author describes the methodology of a complex analysis of corruption in the Russian Federation subjects and elaborates short-term and medium-term forecasts for its development. Scientific novelty: for the first time in domestic criminology, an algorithm is proposed for studying and forecasting regional corruption on the basis of a polyfactorial analysis of criminological, social and political indicators. For a profound and comprehensive study of the regional aspects of corruption, a model was developed for monitoring and forecasting on the basis of measuring the polyfactorial corruption index (PCI). The PCI consists of two groups of parameters: the corruption potential of the region (CPR) and the corruption risk in the region (CRR). Practical significance: the research results can be used in developing regional strategies of corruption counteraction, as well as in adjusting the existing methods of corruption prevention.

  6. Conducting experimental investigations of wind influence on high-rise constructions

    Science.gov (United States)

    Poddaeva, Olga I.; Fedosova, Anastasia N.; Churin, Pavel S.; Gribach, Julia S.

    2018-03-01

The design of buildings taller than 100 meters is accompanied by strict control in determining the external loads and in the subsequent calculation of building structures, owing to the uniqueness of these facilities. An important factor whose impact must be carefully studied at the design documentation stage is wind. This work is devoted to the problem of studying wind impact on buildings above 100 meters. The article presents the methodology for experimental investigation of wind influence on high-rise buildings and structures developed in the Educational-research-and-production laboratory for aerodynamic and aeroacoustic tests of building structures at NRU MGSU. The publication describes the main stages of wind tunnel testing and presents an approbation of the methodology, based on the described algorithm, on the example of a high-rise building under construction. The paper sets out the key requirements established at different stages of wind impact studies, as well as the results obtained, including the mean aerodynamic pressure coefficients, total forces and aerodynamic drag coefficients. Conclusions based on the results of the work are presented.
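The mean aerodynamic pressure coefficients reported from such tests are conventionally defined as Cp = (p − p∞)/(½ρv²); a minimal sketch with an assumed pressure-tap reading (the numbers are illustrative, not NRU MGSU data):

```python
def pressure_coefficient(p, p_inf, rho, v_inf):
    """Cp = (p - p_inf) / (0.5 * rho * v_inf**2)."""
    return (p - p_inf) / (0.5 * rho * v_inf ** 2)

# Assumed tap reading 60 Pa above free-stream static pressure, at 15 m/s
# in air (rho ~ 1.225 kg/m^3); all values are illustrative.
cp = pressure_coefficient(60.0, 0.0, 1.225, 15.0)
print(round(cp, 3))
```

Because Cp is dimensionless, model-scale coefficients measured in the tunnel transfer directly to full-scale design wind speeds.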

  7. Review and evaluation of paleohydrologic methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Foley, M.G.; Zimmerman, D.A.; Doesburg, J.M.; Thorne, P.D.

    1982-12-01

    A literature review was conducted to identify methodologies that could be used to interpret paleohydrologic environments. Paleohydrology is the study of past hydrologic systems or of the past behavior of an existing hydrologic system. The purpose of the review was to evaluate how well these methodologies could be applied to the siting of low-level radioactive waste facilities. The computer literature search queried five bibliographical data bases containing over five million citations of technical journals, books, conference papers, and reports. Two data-base searches (United States Geological Survey - USGS) and a manual search were also conducted. The methodologies were examined for data requirements and sensitivity limits. Paleohydrologic interpretations are uncertain because of the effects of time on hydrologic and geologic systems and because of the complexity of fluvial systems. Paleoflow determinations appear in many cases to be order-of-magnitude estimates. However, the methodologies identified in this report mitigate this uncertainty when used collectively as well as independently. That is, the data from individual methodologies can be compared or combined to corroborate hydrologic predictions. In this manner, paleohydrologic methodologies are viable tools to assist in evaluating the likely future hydrology of low-level radioactive waste sites.

  8. Review and evaluation of paleohydrologic methodologies

    International Nuclear Information System (INIS)

    Foley, M.G.; Zimmerman, D.A.; Doesburg, J.M.; Thorne, P.D.

    1982-12-01

A literature review was conducted to identify methodologies that could be used to interpret paleohydrologic environments. Paleohydrology is the study of past hydrologic systems or of the past behavior of an existing hydrologic system. The purpose of the review was to evaluate how well these methodologies could be applied to the siting of low-level radioactive waste facilities. The computer literature search queried five bibliographical data bases containing over five million citations of technical journals, books, conference papers, and reports. Two data-base searches (United States Geological Survey - USGS) and a manual search were also conducted. The methodologies were examined for data requirements and sensitivity limits. Paleohydrologic interpretations are uncertain because of the effects of time on hydrologic and geologic systems and because of the complexity of fluvial systems. Paleoflow determinations appear in many cases to be order-of-magnitude estimates. However, the methodologies identified in this report mitigate this uncertainty when used collectively as well as independently. That is, the data from individual methodologies can be compared or combined to corroborate hydrologic predictions. In this manner, paleohydrologic methodologies are viable tools to assist in evaluating the likely future hydrology of low-level radioactive waste sites.

  9. How accurate are forecasts of costs of energy? A methodological contribution

    International Nuclear Information System (INIS)

    Siddons, Craig; Allan, Grant; McIntyre, Stuart

    2015-01-01

Forecasts of the cost of energy are typically presented as point estimates; however, forecasts are seldom accurate, which makes it important to understand the uncertainty around these point estimates. The scale of the differences between forecasts and outturns (i.e. contemporary estimates) of costs may have important implications for government decisions on the appropriate form (and level) of support, for modelling energy scenarios, and for industry investment appraisal. This paper proposes a methodology to assess the accuracy of cost forecasts. We apply it to levelised costs of energy for different generation technologies, due to the availability of comparable forecasts and contemporary estimates; the same methodology could, however, be applied to the components of levelised costs, such as capital costs. The estimated “forecast errors” capture the accuracy of previous forecasts and can provide objective bounds to the range around current forecasts for such costs. The results from applying this method are illustrated using publicly available data for on- and offshore wind, nuclear and CCGT technologies, revealing the possible scale of “forecast errors” for these technologies. - Highlights: • A methodology to assess the accuracy of forecasts of costs of energy is outlined. • Method applied to illustrative data for four electricity generation technologies. • Results give an objective basis for sensitivity analysis around point estimates.
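One way to implement the idea is to form relative errors of past forecasts against outturns and use their empirical percentiles as bounds around a current point forecast; the figures below are invented for illustration and are not the paper's data.

```python
import numpy as np

def forecast_error_bounds(forecasts, outturns, current_forecast, lo=10, hi=90):
    """Empirical bounds around a point forecast from past relative errors."""
    f = np.asarray(forecasts, float)
    o = np.asarray(outturns, float)
    rel_err = (o - f) / f                      # past "forecast errors"
    p_lo, p_hi = np.percentile(rel_err, [lo, hi])
    return current_forecast * (1 + p_lo), current_forecast * (1 + p_hi)

# Invented past levelised-cost forecasts vs outturns (currency/MWh)
past_f = [50, 60, 55, 70, 65, 80]
past_o = [55, 72, 50, 84, 78, 76]
low, high = forecast_error_bounds(past_f, past_o, current_forecast=90)
print(round(low, 1), round(high, 1))
```

The resulting interval is objective in the sense that it is anchored in the technology's own forecasting track record rather than in analyst judgment.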

  10. HiRISE: The People's Camera

    Science.gov (United States)

    McEwen, A. S.; Eliason, E.; Gulick, V. C.; Spinoza, Y.; Beyer, R. A.; HiRISE Team

    2010-12-01

The High Resolution Imaging Science Experiment (HiRISE) camera, orbiting Mars since 2006 on the Mars Reconnaissance Orbiter (MRO), has returned more than 17,000 large images with scales as small as 25 cm/pixel. From its beginning, the HiRISE team has followed “The People’s Camera” concept, with rapid release of useful images, explanations, and tools, and facilitating public image suggestions. The camera includes 14 CCDs, each read out into 2 data channels, so compressed images are returned from MRO as 28 long (up to 120,000 line) images that are 1024 pixels wide (or binned 2x2 to 512 pixels, etc.). These raw data are very difficult to use, especially for the public. At the HiRISE operations center the raw data are calibrated and processed into a series of B&W and color products, including browse images and JPEG2000-compressed images, with tools to make it easy for everyone to explore these enormous images (see http://hirise.lpl.arizona.edu/). Automated pipelines do all of this processing, so we can keep up with the high data rate; images go directly to the format of the Planetary Data System (PDS). After students visually check each image product for errors, images are fully released just 1 month after receipt; captioned images (written by science team members) may be released sooner. These processed HiRISE images have been incorporated into tools such as Google Mars and World Wide Telescope for even greater accessibility. 51 Digital Terrain Models derived from HiRISE stereo pairs have been released, resulting in some spectacular flyover movies produced by members of the public and viewed up to 50,000 times according to YouTube. Public targeting began in 2007 via NASA Quest (http://marsoweb.nas.nasa.gov/HiRISE/quest/) and more than 200 images have been acquired, mostly by students and educators. At the beginning of 2010 we released HiWish (http://www.uahirise.org/hiwish/), opening HiRISE targeting to anyone in the world with Internet access, and already more

  11. Toxicity Estimation Software Tool (TEST)

    Science.gov (United States)

    The Toxicity Estimation Software Tool (TEST) was developed to allow users to easily estimate the toxicity of chemicals using Quantitative Structure Activity Relationships (QSARs) methodologies. QSARs are mathematical models used to predict measures of toxicity from the physical c...

  12. Qualitative and quantitative cost estimation : a methodology analysis

    NARCIS (Netherlands)

    Aram, S.; Eastman, C.; Beetz, J.; Issa, R.; Flood, I.

    2014-01-01

    This paper reports on the first part of ongoing research with the goal of designing a framework and a knowledge-based system for 3D parametric model-based quantity take-off and cost estimation in the Architecture, Engineering and Construction (AEC) industry. The authors have studied and analyzed

  13. A decade of sea level rise slowed by climate-driven hydrology.

    Science.gov (United States)

    Reager, J T; Gardner, A S; Famiglietti, J S; Wiese, D N; Eicker, A; Lo, M-H

    2016-02-12

    Climate-driven changes in land water storage and their contributions to sea level rise have been absent from Intergovernmental Panel on Climate Change sea level budgets owing to observational challenges. Recent advances in satellite measurement of time-variable gravity combined with reconciled global glacier loss estimates enable a disaggregation of continental land mass changes and a quantification of this term. We found that between 2002 and 2014, climate variability resulted in an additional 3200 ± 900 gigatons of water being stored on land. This gain partially offset water losses from ice sheets, glaciers, and groundwater pumping, slowing the rate of sea level rise by 0.71 ± 0.20 millimeters per year. These findings highlight the importance of climate-driven changes in hydrology when assigning attribution to decadal changes in sea level. Copyright © 2016, American Association for the Advancement of Science.
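The reported numbers can be checked with the standard conversion that roughly 360 Gt of water corresponds to 1 mm of global mean sea level:

```python
# Roughly 360 Gt of water spread over the ocean raises global mean sea
# level by about 1 mm (ocean area ~3.61e14 m^2; 1 Gt of water ~ 1e9 m^3).
ocean_area_m2 = 3.61e14
gt_per_mm = ocean_area_m2 * 1e-3 / 1e9        # Gt of water per mm of GMSL
stored_gt, years = 3200, 12                   # land water gain, 2002-2014
rate_mm_per_yr = stored_gt / gt_per_mm / years
print(round(gt_per_mm), round(rate_mm_per_yr, 2))
```

The back-of-envelope rate, about 0.74 mm per year, is consistent with the 0.71 ± 0.20 mm per year slowdown reported in the abstract.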

  14. Estimation of erosion-accumulative processes at the Inia River's mouth near high-rise construction zones.

    Science.gov (United States)

    Sineeva, Natalya

    2018-03-01

The relevance of our study stems from the increasing man-made impact on water bodies and associated land resources within urban areas and, as a consequence, from changes in the morphology and dynamics of river channels. This creates a need to predict the development of erosion-accumulation processes, especially within built-up urban areas. The purpose of the study is to develop a program for assessing erosion-accumulation processes in a water body, the mouth area of the Inia River, in a zone of prospective high-rise construction for a residential microdistrict, where the floodplain-channel complex is expected to develop intensively. Results of the study: comparison of the full-scale measured flow velocities with those calculated from the model recorded only a slight discrepancy, which allows us to say that the numerical model reliably describes the physical processes developing in the river. The calculations assessing the direction and intensity of channel re-formation lead to the conclusion that erosion processes slightly predominate over accumulative ones on the undeveloped part of the Inia River, with noticeable activity only in certain areas (near the banks and the island). Importance of the study: the assessment of erosion-accumulation processes can be used in design decisions for the future high-rise construction of this territory, which will increase their economic efficiency.

  15. High School Students' Accuracy in Estimating the Cost of College: A Proposed Methodological Approach and Differences among Racial/Ethnic Groups and College Financial-Related Factors

    Science.gov (United States)

    Nienhusser, H. Kenny; Oshio, Toko

    2017-01-01

    High school students' accuracy in estimating the cost of college (AECC) was examined by utilizing a new methodological approach, the absolute-deviation-continuous construct. This study used the High School Longitudinal Study of 2009 (HSLS:09) data and examined 10,530 11th grade students in order to measure their AECC for 4-year public and private…

  16. Spent fuel management fee methodology and computer code user's manual

    International Nuclear Information System (INIS)

    Engel, R.L.; White, M.K.

    1982-01-01

The methodology and computer model described here were developed to analyze the cash flows for the federal government taking title to and managing spent nuclear fuel. The methodology has been used by the US Department of Energy (DOE) to estimate the spent fuel disposal fee that will provide full cost recovery. Although the methodology was designed to analyze interim storage followed by spent fuel disposal, it could also be used to calculate a fee for reprocessing spent fuel and disposing of the waste. The methodology consists of two phases. The first phase estimates government expenditures for spent fuel management. The second phase determines the fees that will result in revenues such that the government attains full cost recovery under various revenue collection philosophies. These two phases are discussed in detail in subsequent sections of this report. The two phases are implemented as two computer modules, called SPADE (SPent fuel Analysis and Disposal Economics) and FEAN (FEe ANalysis), respectively.
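The full-cost-recovery idea can be sketched as a levelized fee: the fee per kilogram that equates the present value of fee revenues with the present value of government expenditures. All cash flows, quantities, and the discount rate below are hypothetical, not DOE figures.

```python
def full_cost_recovery_fee(costs, fuel_kg, rate):
    """Levelized fee ($/kg) equating the present value of fee revenues
    with the present value of government expenditures."""
    pv_cost = sum(c / (1 + rate) ** t for t, c in enumerate(costs))
    pv_fuel = sum(q / (1 + rate) ** t for t, q in enumerate(fuel_kg))
    return pv_cost / pv_fuel

# Hypothetical yearly outlays ($) and spent fuel receipts (kg)
costs = [100e6, 120e6, 150e6, 200e6]
fuel  = [1.5e6, 1.6e6, 1.7e6, 1.8e6]
fee = full_cost_recovery_fee(costs, fuel, rate=0.03)
print(round(fee, 2))   # $/kg
```

Different revenue collection philosophies correspond to different discounting and timing conventions in this calculation, which is what the second phase of the methodology explores.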

  17. Development of risk assessment methodology against natural external hazards for sodium-cooled fast reactors: project overview and strong Wind PRA methodology - 15031

    International Nuclear Information System (INIS)

    Yamano, H.; Nishino, H.; Kurisaka, K.; Okano, Y.; Sakai, T.; Yamamoto, T.; Ishizuka, Y.; Geshi, N.; Furukawa, R.; Nanayama, F.; Takata, T.; Azuma, E.

    2015-01-01

This paper describes mainly the strong wind probabilistic risk assessment (PRA) methodology development, in addition to an overview of the project. In this project, to date, PRA methodologies against snow, tornado and strong wind have been developed, as well as the corresponding hazard evaluation methodologies. For the volcanic eruption hazard, ash fallout simulation was carried out to contribute to the development of the hazard evaluation methodology. For the forest fire hazard, the concept of the hazard evaluation methodology was developed based on fire simulation. An event sequence assessment methodology was also developed, based on plant dynamics analysis coupled with the continuous Markov chain Monte Carlo method, for application to the event sequence against snow. In developing the strong wind PRA methodology, hazard curves were estimated using Weibull and Gumbel distributions based on weather data recorded in Japan. The obtained hazard curves were divided into five discrete categories for event tree quantification. Next, failure probabilities for decay heat removal related components were calculated as a product of two probabilities: a probability for the missiles to enter the intake or outtake of the decay heat removal system, and the fragility caused by the missile impacts. Finally, based on the event tree, the core damage frequency was estimated at about 6x10^-9 per year by multiplying the discrete hazard probabilities in the Gumbel distribution by the conditional decay heat removal failure probabilities. The dominant sequence arose from the assumption that the operators could not extinguish the fuel tank fire caused by the missile impacts and that the fire induced loss of the decay heat removal system. (authors)
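The hazard discretization and event tree quantification can be sketched as follows; the Gumbel parameters, category boundaries, and conditional failure probabilities are invented for illustration and do not reproduce the paper's 6x10^-9 per year result.

```python
import math

def gumbel_exceedance(x, mu, beta):
    """Annual exceedance probability under a Gumbel (Type I) fit:
    P(X > x) = 1 - exp(-exp(-(x - mu) / beta))."""
    return 1.0 - math.exp(-math.exp(-(x - mu) / beta))

# Hypothetical Gumbel fit to annual-maximum wind speeds (m/s)
mu, beta = 25.0, 4.0
edges = [30, 40, 50, 60, 70, 80]               # hazard category boundaries
p_cat = [gumbel_exceedance(a, mu, beta) - gumbel_exceedance(b, mu, beta)
         for a, b in zip(edges[:-1], edges[1:])]
# assumed conditional failure probabilities of decay heat removal per category
p_fail = [1e-7, 1e-5, 1e-3, 1e-2, 1e-1]
cdf = sum(p * f for p, f in zip(p_cat, p_fail))  # core damage frequency (/year)
print(f"{cdf:.1e}")
```

The frequent-but-benign low categories and rare-but-severe high categories both contribute, which is why the hazard curve must be carried out to very low exceedance probabilities.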

  18. Force acting on a spherical bubble rising through a quiescent liquid

    International Nuclear Information System (INIS)

    Takagi, Shu; Matsumoto, Yoichiro

    1996-01-01

    A direct numerical simulation is performed of a spherical bubble rising unsteadily through a quiescent liquid. The method is based on a finite-volume solution of the equations on an orthogonal curvilinear coordinate system. Calculations are performed for a bubble rising through a clean liquid and through a contaminated one. Following earlier experimental results, a tangential stress-free condition is imposed for the clean bubble and a no-slip condition for the contaminated one. The numerical results are compared with those of the model equation for the translational motion of the bubble, which is often used in numerical models of bubbly flow. The steady drag, added mass and history terms are checked through this comparison. It is revealed that the history force effect is negligible for a bubble rising through a clean liquid beyond Re=O(50). From the numerical point of view, this is quite important, because it reduces the calculation time and memory required by a bubbly flow model. For a contaminated bubble, the history force effect is not negligible even when the Reynolds number is high. It is found that expressing the history force with the Basset kernel overestimates it for a bubble rising at moderate Reynolds numbers. This error grows with increasing Reynolds number and reduces the accuracy of the bubble motion calculated with the model equation. (author)

  19. Simple methodologies to estimate the energy amount stored in a tree due to an explosive seed dispersal mechanism

    Science.gov (United States)

    do Carmo, Eduardo; Goncalves Hönnicke, Marcelo

    2018-05-01

    There are different ways to introduce and illustrate energy concepts to introductory physics students. The explosive seed dispersal mechanism found in a variety of trees is one of them. Sibipiruna trees bear fruits (pods) that exhibit such an explosive mechanism. During the explosion, the pods throw seeds several meters away. In this manuscript we present simple methodologies to estimate the amount of energy stored in a Sibipiruna tree due to this process. Two different physical approaches were used to carry out the study: monitoring the explosive seed dispersal mechanism indoors and in situ, and measuring the elastic constant of the pod shell. An energy of the order of kilojoules was found to be stored in a single tree due to this explosive mechanism.
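    The elastic-constant approach amounts to treating the pod shell as a spring storing E = ½kx². The constants below are hypothetical placeholders chosen only to land in the kilojoule range the authors report for a whole tree:

```python
# Treat the pod shell as a linear spring: stored energy E = (1/2) k x^2.
# k, x and the pod count are hypothetical placeholders chosen only to
# land in the kilojoule range reported for a whole tree.
k = 400.0          # N/m, assumed elastic constant of the pod shell
x = 0.05           # m, assumed deflection at the moment of explosion
energy_per_pod = 0.5 * k * x ** 2     # joules

pods_per_tree = 2000                  # assumed pod count
tree_energy = energy_per_pod * pods_per_tree   # joules (~1 kJ here)
```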

  20. Methodology to estimate the threshold in-cylinder temperature for self-ignition of fuel during cold start of Diesel engines

    International Nuclear Information System (INIS)

    Broatch, A.; Ruiz, S.; Margot, X.; Gil, A.

    2010-01-01

    Cold startability of automotive direct injection (DI) Diesel engines is frequently one of their weak points when compared with their closest competitor, the gasoline engine. The situation worsens with current design trends (engine downsizing) and emerging Diesel combustion concepts such as HCCI and PCCI, which require low compression ratio engines. To mitigate this difficulty, pre-heating systems (glow plugs, air heating, etc.) are frequently used and their technologies have been continuously developed. For the optimum design of these systems, it is crucial to determine the threshold temperature that the in-cylinder gas must reach to provoke self-ignition of the fuel injected during cold starting. In this paper, a novel methodology for estimating this threshold temperature is presented. In this methodology, experimental and computational procedures are combined to reach a good compromise between accuracy and effort. The measurements were used as input data and boundary conditions in 3D and 0D calculations to obtain the thermodynamic conditions of the gas in the cylinder during cold starting. The results obtained from the study of two engine configurations (low and high compression ratio) indicate that the threshold in-cylinder temperature is a single value of about 415 °C.

  1. Uncertainties in Steric Sea Level Change Estimation During the Satellite Altimeter Era: Concepts and Practices

    Science.gov (United States)

    MacIntosh, C. R.; Merchant, C. J.; von Schuckmann, K.

    2017-01-01

    This article presents a review of current practice in estimating steric sea level change, focussed on the treatment of uncertainty. Steric sea level change is the contribution to the change in sea level arising from the dependence of density on temperature and salinity. It is a significant component of sea level rise and a reflection of changing ocean heat content. However, tracking these steric changes still remains a significant challenge for the scientific community. We review the importance of understanding the uncertainty in estimates of steric sea level change. Relevant concepts of uncertainty are discussed and illustrated with the example of observational uncertainty propagation from a single profile of temperature and salinity measurements to steric height. We summarise and discuss the recent literature on methodologies and techniques used to estimate steric sea level in the context of the treatment of uncertainty. Our conclusions are that progress in quantifying steric sea level uncertainty will benefit from: greater clarity and transparency in published discussions of uncertainty, including exploitation of international standards for quantifying and expressing uncertainty in measurement; and the development of community "recipes" for quantifying the error covariances in observations and from sparse sampling and for estimating and propagating uncertainty across spatio-temporal scales.
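    The single-profile propagation example discussed above can be illustrated with a toy calculation: a thermosteric height from layer temperature anomalies, with independent observational errors combined in quadrature. The constant expansion coefficient and all numbers are simplifying assumptions, not values from the review:

```python
import math

# Thermosteric height from one profile, h = sum(alpha * dT_i * dz_i),
# with a constant thermal expansion coefficient (a simplification:
# alpha actually varies with temperature, salinity and pressure).
alpha = 2.0e-4     # 1/K, assumed thermal expansion coefficient
dz = 10.0          # m, layer thickness
dT = [0.5, 0.4, 0.3, 0.2, 0.1]    # K, layer temperature anomalies
sigma_T = 0.05     # K, assumed independent 1-sigma observation error

h = sum(alpha * t * dz for t in dT)            # m, steric height anomaly

# Independent layer errors propagate in quadrature.
sigma_h = alpha * dz * math.sqrt(len(dT)) * sigma_T   # m
```

    Correlated errors, the review notes, would require the full error covariance rather than this diagonal (quadrature) shortcut.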

  2. Microsphere estimates of blood flow: Methodological considerations

    International Nuclear Information System (INIS)

    von Ritter, C.; Hinder, R.A.; Womack, W.; Bauerfeind, P.; Fimmel, C.J.; Kvietys, P.R.; Granger, D.N.; Blum, A.L.

    1988-01-01

    The microsphere technique is a standard method for measuring blood flow in experimental animals. Sporadic reports have appeared outlining the limitations of this method. In this study the authors systematically assessed the effects of blood withdrawals for reference sampling, microsphere numbers, and anesthesia on blood flow estimates obtained with radioactive microspheres in dogs. Experiments were performed on 18 conscious and 12 anesthetized dogs. Four blood flow estimates were performed over 120 min using 1 × 10⁶ microspheres each time. The effects of excessive numbers of microspheres, pentobarbital sodium anesthesia, and replacement of the volume lost to reference samples with dextran 70 were assessed. In both conscious and anesthetized dogs a progressive decrease in gastric mucosal blood flow and cardiac output was observed over 120 min; this was also observed in the pancreas in conscious dogs. The major factor responsible for these changes was the volume loss due to the reference sample withdrawals. Replacement of the withdrawn blood with dextran 70 led to stable blood flows to all organs. The injection of excessive numbers of microspheres did not modify hemodynamics to a greater extent than did the injection of 4 million microspheres. Anesthesia exerted no influence on blood flow other than raising coronary flow. The authors conclude that although blood flow to the gastric mucosa and the pancreas is sensitive to the minor hemodynamic changes associated with the microsphere technique, replacement of the volume lost to reference samples ensures stable blood flow to all organs over a 120-min period.
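    The reference-sample arithmetic underlying the technique can be sketched as follows; the counts and withdrawal rate are illustrative, not data from this study:

```python
# Reference-sample method: an organ's flow is the reference withdrawal
# rate scaled by the ratio of trapped microsphere counts. The numbers
# are illustrative, not data from this study.
reference_rate = 7.0       # mL/min, pump withdrawal rate of reference sample
counts_reference = 2.0e4   # microspheres recovered in the reference sample
counts_mucosa = 1.2e4      # microspheres trapped in a gastric mucosa sample

flow_mucosa = reference_rate * counts_mucosa / counts_reference   # mL/min
```

    This ratio is why the reference withdrawal itself perturbs the measurement: each sample removes blood volume, which is what the dextran 70 replacement in the study compensates for.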

  3. Critical infrastructure systems of systems assessment methodology.

    Energy Technology Data Exchange (ETDEWEB)

    Sholander, Peter E.; Darby, John L.; Phelan, James M.; Smith, Bryan; Wyss, Gregory Dane; Walter, Andrew; Varnado, G. Bruce; Depoy, Jennifer Mae

    2006-10-01

    Assessing the risk of malevolent attacks against large-scale critical infrastructures requires modifications to existing methodologies that separately consider physical security and cyber security. This research has developed a risk assessment methodology that explicitly accounts for both physical and cyber security, while preserving the traditional security paradigm of detect, delay, and respond. This methodology also accounts for the condition that a facility may be able to recover from or mitigate the impact of a successful attack before serious consequences occur. The methodology uses evidence-based techniques (which are a generalization of probability theory) to evaluate the security posture of the cyber protection systems. Cyber threats are compared against cyber security posture using a category-based approach nested within a path-based analysis to determine the most vulnerable cyber attack path. The methodology summarizes the impact of a blended cyber/physical adversary attack in a conditional risk estimate where the consequence term is scaled by a "willingness to pay" avoidance approach.

  4. Methodological proposals for estimating the price of climate in France

    Science.gov (United States)

    Joly, D.; Brossard, T.; Cardot, H.; Cavailhes, J.; Hilal, M.; Wavresky, P.

    2009-09-01

    identification problem, well-known in the hedonic literature, is not a problem here, because climate is a non-produced good. Some explanatory variables may be endogenous; thus, we use the instrumental method. Finally, multicollinearity, detected by the condition number, occurs between climatic variables; thus we use a second estimation procedure, Partial Least Squares. The mean annual temperature has a significant positive effect on the housing price for owner-occupiers: a rise of 1 °C entails an increase in housing prices of 5.9-6.2% (according to the equation and estimation method). The sign is also positive for tenants, with values between 2.5 and 3.9%, roughly half as much as for owner-occupiers. The effect of warmer summers (mean July temperature minus mean annual temperature) is compounded with the preceding one for single-detached houses: an extra 1 °C entails a price increase of 3.7 to 8.4% (depending on the model). This effect is insignificant for apartments. Hot summer days (more than 30 °C) have a significant effect for owner-occupiers of single-detached houses and renters of apartments. At the median point, an extra day of heat lowers the value of housing by 4.3% (owner-occupiers) or by 1% (tenants). This effect is quadratic, probably due to seaside sites where hot summers are appreciated. French households are insensitive to cold winters, whether measured as the January temperature minus the mean annual temperature or as the number of coldest days (less than -5 °C). The number of days of rain in January and July has a significant effect on real-estate values. The January sign is as expected: prices or rents fall by 1.2-2.3% for an extra day's rain. The number of days of rainfall in July exerts a positive effect on the price of apartments (but not on the price of single-detached houses), indicating that households pay 1.4 to 4.4% more for their housing for an extra summer day's rain. Rosen S., 1974. Hedonic prices and implicit markets: product differentiation
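    The semi-log specification implied by the reported percentage effects can be illustrated with a toy ordinary least squares fit. The data here are synthetic, and the paper's instrumental-variable and Partial Least Squares corrections are omitted:

```python
import numpy as np

# Semi-log hedonic regression: log(price) on climate plus controls, so a
# coefficient near 0.06 on mean annual temperature reads as ~6% per +1 °C.
# Data are synthetic; the paper's instrumental-variable and Partial Least
# Squares corrections are omitted.
rng = np.random.default_rng(0)
n = 500
temp = rng.uniform(8.0, 16.0, n)       # mean annual temperature, °C
area = rng.uniform(50.0, 150.0, n)     # living area, m^2 (control)
log_price = 10.0 + 0.06 * temp + 0.008 * area + rng.normal(0.0, 0.1, n)

X = np.column_stack([np.ones(n), temp, area])
beta, *_ = np.linalg.lstsq(X, log_price, rcond=None)
# beta[1] recovers the temperature semi-elasticity (close to 0.06)
```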

  5. The anticipated spatial loss of microtidal beaches in the next 100 years due to sea level rise.

    Science.gov (United States)

    Alexandrakis, G.; Poulos, S.

    2012-04-01

    The anticipated sea level rise is expected to influence the Earth's coasts on a global scale in the near future and is considered a main factor in coastal retreat, with beach zones among the most vulnerable coastal landforms. Records for the period 1890-1990 show that sea level has already risen by 18 cm (min: +10 cm, max: +25 cm), while the sea level rise projected to 2100 has been estimated at 20 to 50 cm (IPCC, 2007). It has to be highlighted that a rise of a few tens of centimetres would cause a shoreline retreat of a few to tens of metres in the case of low-lying coasts, i.e. beach zones (e.g. Bruun 1962, Nichol and Letherman, 1995, Ciavola and Corbau, 2002). Within the context of climate change, sea level rise could also be related, at regional scale, to changes in meteorological factors such as the intensity, duration and direction of onshore blowing winds and variations in atmospheric pressure. In the microtidal Greek waters, temporary changes in sea level exceed 1 m (HHS, 2004). This work investigates the impact of sea level rise on sixteen beach zones along the Greek coast. More specifically, shoreline retreat has been estimated for time periods of 10, 20, 50 and 100 years and the corresponding sea level rises of 0.038 m, 0.076 m, 0.19 m and 0.38 m, according to the A1B scenario of the IPCC (2007) and utilizing Dean's (1991) equation; the latter includes in the calculations both the effect of the anticipated sea level rise and the associated storm surge. The morphodynamic and sedimentological data used for the estimation of beach retreat were deduced from field measurements. Finally, the percentage of sub-aerial area lost by each beach zone under investigation has been estimated. The results show that coastline retreat follows a linear increase in eleven of the 16 beach zones over a time period of 100 years. Santava beach zone (inner Messiniakos Gulf) undergoes most of its erosion in the first period of 20 years
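    The Bruun (1962) relation cited in the abstract can be sketched as follows; the profile parameters are illustrative, not those of the sixteen Greek beaches:

```python
# Bruun (1962) rule: retreat R = S * L / (B + h), with S the sea-level
# rise, L the active-profile width, B the berm height and h the closure
# depth. Profile values are illustrative, not those of the Greek beaches.
def bruun_retreat(s_rise, profile_width, berm_height, closure_depth):
    return s_rise * profile_width / (berm_height + closure_depth)

# The four A1B rise scenarios used in the abstract (m).
scenarios = [0.038, 0.076, 0.19, 0.38]
retreats = [bruun_retreat(s, profile_width=500.0,
                          berm_height=1.5, closure_depth=6.0)
            for s in scenarios]
# Retreat scales linearly with the rise; 0.38 m gives ~25 m here.
```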

  6. Rise, stagnation, and rise of Danish women's life expectancy

    DEFF Research Database (Denmark)

    Lindahl-Jacobsen, Rune; Rau, Roland; Jeune, Bernard

    2016-01-01

    Health conditions change from year to year, with a general tendency in many countries for improvement. These conditions also change from one birth cohort to another: some generations suffer more adverse events in childhood, smoke more heavily, eat poorer diets, etc., than generations born earlier...... favor forecasts that hinge on cohort differences. We use a combination of age decomposition and exchange of survival probabilities between countries to study the remarkable recent history of female life expectancy in Denmark, a saga of rising, stagnating, and now again rising lifespans. The gap between...... female life expectancy in Denmark vs. Sweden grew to 3.5 y in the period 1975-2000. When we assumed that Danish women born 1915-1945 had the same survival probabilities as Swedish women, the gap remained small and roughly constant. Hence, the lower Danish life expectancy is caused by these cohorts...

  7. Application of realistic (best- estimate) methodologies for large break loss of coolant (LOCA) safety analysis: licensing of Westinghouse ASTRUM evaluation model in Spain

    International Nuclear Information System (INIS)

    Lage, Carlos; Frepoli, Cesare

    2010-01-01

    When the LOCA Final Acceptance Criteria for Light Water Reactors were issued in Appendix K of 10CFR50, both the USNRC and the industry recognized that the rule was highly conservative. At that time, however, the degree of conservatism in the analysis could not be quantified. As a result, the USNRC began a research program to identify the degree of conservatism in those models permitted in the Appendix K rule and to develop improved thermal-hydraulic computer codes so that realistic accident analysis calculations could be performed. The overall results of this research program quantified the conservatism in the Appendix K rule and confirmed that some relaxation of the rule can be made without a loss in safety to the public. Also, from a risk-informed perspective it is recognized that conservatism is not always a complete defense for lack of sophistication in models. In 1988, as a result of the improved understanding of LOCA phenomena, the USNRC staff amended the requirements of 10 CFR 50.46 and Appendix K, 'ECCS Evaluation Models', so that a realistic evaluation model may be used to analyze the performance of the ECCS during a hypothetical LOCA. Under the amended rules, best-estimate plus uncertainty (BEPU) thermal-hydraulic analysis may be used in place of the overly prescriptive set of models mandated by the Appendix K rule. Further guidance for the use of best-estimate codes was provided in Regulatory Guide 1.157. To demonstrate use of the revised ECCS rule, the USNRC and its consultants developed a method called the Code Scaling, Applicability, and Uncertainty (CSAU) evaluation methodology as an approach for defining and qualifying a best-estimate thermal-hydraulic code and quantifying the uncertainties in a LOCA analysis. More recently the CSAU principles have been generalized in the Evaluation Model Development and Assessment Process (EMDAP) of Regulatory Guide 1.203.
ASTRUM is the Westinghouse Best Estimate Large Break LOCA evaluation model applicable to two-, three

  8. Relative sea-level rise and the conterminous United States : Consequences of potential land inundation in terms of population at risk and GDP loss

    NARCIS (Netherlands)

    Haer, Toon; Kalnay, Eugenia; Kearney, Michael; Moll, Henk

    2013-01-01

    Global sea-level rise poses a significant threat not only for coastal communities as development continues but also for national economies. This paper presents estimates of how future changes in relative sea-level rise put coastal populations at risk, as well as affect overall GDP in the

  9. Prediction of windings temperature rise in induction motors supplied with distorted voltage

    Energy Technology Data Exchange (ETDEWEB)

    Gnacinski, P. [Gdynia Maritime University, Department of Ship Electrical Power Engineering, Morska Street 83, 81-225 Gdynia (Poland)

    2008-04-15

    One of the features of ship power systems is the varying level and intensity of disturbances appearing during routine operation: rms voltage and frequency deviations, voltage unbalance, and voltage waveform distortion. As a result, marine induction machines are exposed to overheating due to lowered voltage quality. This paper is devoted to the prediction of winding temperature rise in marine induction cage machines supplied with distorted voltage, i.e. under realistic voltage conditions. The proposed prediction method does not require detailed knowledge of the thermal properties of a machine. Although the method was developed for marine induction motors, it is applicable to industrial machines supplied with distorted voltage. It can also be generalized and used for estimation of the steady state winding temperature rise of any electrical machinery in various working conditions. (author)

  10. Prediction of windings temperature rise in induction motors supplied with distorted voltage

    International Nuclear Information System (INIS)

    Gnacinski, P.

    2008-01-01

    One of the features of ship power systems is the varying level and intensity of disturbances appearing during routine operation: rms voltage and frequency deviations, voltage unbalance, and voltage waveform distortion. As a result, marine induction machines are exposed to overheating due to lowered voltage quality. This paper is devoted to the prediction of winding temperature rise in marine induction cage machines supplied with distorted voltage, i.e. under realistic voltage conditions. The proposed prediction method does not require detailed knowledge of the thermal properties of a machine. Although the method was developed for marine induction motors, it is applicable to industrial machines supplied with distorted voltage. It can also be generalized and used for estimation of the steady state winding temperature rise of any electrical machinery in various working conditions

  11. Disposal criticality analysis methodology's principal isotope burnup credit

    International Nuclear Information System (INIS)

    Doering, T.W.; Thomas, D.A.

    2001-01-01

    This paper presents the burnup credit aspects of the United States Department of Energy Yucca Mountain Project's methodology for performing criticality analyses for commercial light-water-reactor fuel. The disposal burnup credit methodology uses a 'principal isotope' model, which takes credit for the reduced reactivity associated with the build-up of the principal actinides and fission products in irradiated fuel. Burnup credit is important to the disposal criticality analysis methodology and to the design of commercial fuel waste packages. The burnup credit methodology developed for disposal of irradiated commercial nuclear fuel can also be applied to its storage and transportation. For all applications, a series of loading curves is developed using a best-estimate methodology; depending on the application, an additional administrative safety margin may be applied. The burnup credit methodology better represents the 'true' reactivity of the irradiated fuel configuration, and hence the real safety margin, than do evaluations using the 'fresh fuel' assumption. (author)
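    A loading curve of the kind described reduces to a simple acceptance check: the assembly's burnup must lie above a minimum that grows with initial enrichment. The linear curve and values below are hypothetical, not Yucca Mountain Project numbers:

```python
# A loading curve maps initial enrichment to the minimum assembly-average
# burnup needed for acceptance; assemblies below the curve get no credit.
# The linear curve and margin are hypothetical, not project values.
def min_required_burnup(enrichment_wt_pct):
    """Minimum burnup (GWd/MTU) for a given initial enrichment (wt% U-235)."""
    return max(0.0, 12.0 * (enrichment_wt_pct - 2.0))

def assembly_acceptable(enrichment_wt_pct, burnup_gwd_mtu, margin=0.0):
    return burnup_gwd_mtu >= min_required_burnup(enrichment_wt_pct) + margin

ok = assembly_acceptable(4.0, 30.0)     # 30 >= 24, accepted
low = assembly_acceptable(4.5, 25.0)    # 25 < 30, rejected
```

    The `margin` parameter stands in for the additional administrative safety margin the abstract mentions for some applications.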

  12. Cost Methodology for Biomass Feedstocks: Herbaceous Crops and Agricultural Residues

    Energy Technology Data Exchange (ETDEWEB)

    Turhollow Jr, Anthony F [ORNL; Webb, Erin [ORNL; Sokhansanj, Shahabaddine [ORNL

    2009-12-01

    This report describes a set of procedures and assumptions used to estimate production and logistics costs of bioenergy feedstocks from herbaceous crops and agricultural residues. The engineering-economic analysis discussed here is based on methodologies developed by the American Society of Agricultural and Biological Engineers (ASABE) and the American Agricultural Economics Association (AAEA). An engineering-economic analysis approach was chosen due to lack of historical cost data for bioenergy feedstocks. Instead, costs are calculated using assumptions for equipment performance, input prices, and yield data derived from equipment manufacturers, research literature, and/or standards. Cost estimates account for fixed and variable costs. Several examples of this costing methodology used to estimate feedstock logistics costs are included at the end of this report.
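    Engineering-economic costing of this kind typically annualizes ownership cost with a capital recovery factor and adds variable operating cost, in the spirit of the ASABE/AAEA methods the report cites. The machine parameters below are illustrative assumptions, not figures from the report:

```python
# Annualized ownership (fixed) cost via the capital recovery factor,
# plus variable operating cost. All machine parameters are illustrative.
def capital_recovery_factor(rate, years):
    return rate * (1.0 + rate) ** years / ((1.0 + rate) ** years - 1.0)

purchase_price = 120_000.0   # $, purchase price (assumed)
salvage = 24_000.0           # $, salvage value after the service life
life_years = 10
rate = 0.06                  # real interest rate
annual_hours = 400.0         # hours of use per year

fixed_annual = ((purchase_price - salvage)
                * capital_recovery_factor(rate, life_years)
                + salvage * rate)            # $/yr ownership cost
fixed_hourly = fixed_annual / annual_hours   # $/h

variable_hourly = 45.0       # $/h, fuel + labor + repairs (assumed)
total_hourly = fixed_hourly + variable_hourly
```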

  13. Review of PRA methodology for LMFBR

    International Nuclear Information System (INIS)

    Yang, J. E.

    1999-02-01

    Probabilistic Risk Assessment (PRA) has been widely used as a tool to evaluate the safety of nuclear power plants (NPPs), both in the design stage and in operation. Recently, PRA has become one of the licensing requirements for many existing and new NPPs. KALIMER is a Liquid Metal Fast Breeder Reactor (LMFBR) being developed by KAERI. Since the design concept of KALIMER is similar to that of the PRISM plant developed by GE, it is appropriate to review the PRA methodology of PRISM as the first step of the KALIMER PRA. Hence, this report summarizes the PRA methodology of the PRISM plant and, based on the review results, the work required for the PSA of KALIMER. The PRA methodology of the PRISM plant consists of the following five major tasks: (1) development of the initiating event list, (2) development of the system event tree, (3) development of the core response event tree, (4) development of the containment response event tree, and (5) consequence and risk estimation. The estimated individual and societal risk measures show that the risk from a PRISM module is substantially less than the NRC goal. Each task is reviewed and compared to the corresponding part of the Light Water Reactor (LWR)/Pressurized Heavy Water Reactor (PHWR) PSAs performed in Korea. The parts that are not modeled appropriately in the PRISM PRA are identified, and recommendations for the KALIMER PRA are stated. (author). 14 refs., 9 tabs., 4 figs

  14. SEA-LEVEL RISE. Sea-level rise due to polar ice-sheet mass loss during past warm periods.

    Science.gov (United States)

    Dutton, A; Carlson, A E; Long, A J; Milne, G A; Clark, P U; DeConto, R; Horton, B P; Rahmstorf, S; Raymo, M E

    2015-07-10

    Interdisciplinary studies of geologic archives have ushered in a new era of deciphering magnitudes, rates, and sources of sea-level rise from polar ice-sheet loss during past warm periods. Accounting for glacial isostatic processes helps to reconcile spatial variability in peak sea level during marine isotope stages 5e and 11, when the global mean reached 6 to 9 meters and 6 to 13 meters higher than present, respectively. Dynamic topography introduces large uncertainties on longer time scales, precluding robust sea-level estimates for intervals such as the Pliocene. Present climate is warming to a level associated with significant polar ice-sheet loss in the past. Here, we outline advances and challenges involved in constraining ice-sheet sensitivity to climate change with use of paleo-sea level records. Copyright © 2015, American Association for the Advancement of Science.

  15. Best estimate LB LOCA approach based on advanced thermal-hydraulic codes

    International Nuclear Information System (INIS)

    Sauvage, J.Y.; Gandrille, J.L.; Gaurrand, M.; Rochwerger, D.; Thibaudeau, J.; Viloteau, E.

    2004-01-01

    Improvements achieved in thermal-hydraulics with the development of best estimate computer codes have led a number of safety authorities to advocate realistic analyses instead of conservative calculations. The potential of a best estimate approach for the analysis of LOCAs prompted FRAMATOME to enter early, with CEA and EDF, into the development of the second-generation code CATHARE, and then of a LBLOCA BE methodology with BWNT following the Code Scaling, Applicability and Uncertainty (CSAU) procedure. CATHARE and TRAC are the basic tools for the LOCA studies that FRAMATOME will perform according to either a deterministic better estimate (dbe) methodology or a Statistical Best Estimate (SBE) methodology. (author)

  16. Software engineering methodologies and tools

    Science.gov (United States)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed, and the track record has not been good. Software development projects often miss schedules, run over budget, do not give users what they want, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems require larger and more complex software for support, and as this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity: it has been estimated that the productivity of software production has increased only one to two percent a year over the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their own job. Engineering disciplines and methodologies are now emerging, supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for a general evaluation of computer assisted software engineering (CASE) tools, based on actual installation of and experimentation with some specific tools.

  17. Quantifying the effect of sea level rise and flood defence - a point process perspective on coastal flood damage

    Science.gov (United States)

    Boettle, M.; Rybski, D.; Kropp, J. P.

    2016-02-01

    In contrast to recent advances in projecting sea levels, estimations about the economic impact of sea level rise are vague. Nonetheless, they are of great importance for policy making with regard to adaptation and greenhouse-gas mitigation. Since the damage is mainly caused by extreme events, we propose a stochastic framework to estimate the monetary losses from coastal floods in a confined region. For this purpose, we follow a Peak-over-Threshold approach employing a Poisson point process and the Generalised Pareto Distribution. By considering the effect of sea level rise as well as potential adaptation scenarios on the involved parameters, we are able to study the development of the annual damage. An application to the city of Copenhagen shows that a doubling of losses can be expected from a mean sea level increase of only 11 cm. In general, we find that for varying parameters the expected losses can be well approximated by one of three analytical expressions depending on the extreme value parameters. These findings reveal the complex interplay of the involved parameters and allow conclusions of fundamental relevance. For instance, we show that the damage typically increases faster than the sea level rise itself. This in turn can be of great importance for the assessment of sea level rise impacts on the global scale. Our results are accompanied by an assessment of uncertainty, which reflects the stochastic nature of extreme events. While the absolute value of uncertainty about the flood damage increases with rising mean sea levels, we find that it decreases in relation to the expected damage.
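    The Peak-over-Threshold model can be sketched numerically: Poisson arrivals above a threshold, Generalised Pareto exceedance sizes, and a damage function shifted by mean sea-level rise. All parameters and the linear damage function are illustrative, not the Copenhagen calibration:

```python
# Peak-over-Threshold sketch: floods above threshold u arrive as a
# Poisson process (rate lam); exceedance sizes follow a Generalised
# Pareto Distribution; expected annual damage = lam * E[damage | event].
# Parameters and the linear damage function are illustrative, not the
# Copenhagen calibration.
lam = 0.5               # events per year above the threshold
u = 1.0                 # m, threshold water level
xi, sigma = 0.1, 0.25   # GPD shape and scale

def gpd_pdf(y, xi, sigma):
    return (1.0 / sigma) * (1.0 + xi * y / sigma) ** (-1.0 / xi - 1.0)

def damage(level, slr=0.0):
    # monetary loss as a simple linear function of level plus mean rise
    return 1.0e6 * (level + slr - u)

# Numerical expectation over exceedance sizes y = level - u.
dy, ymax = 0.001, 5.0
ys = [i * dy for i in range(1, int(round(ymax / dy)))]
results = {}
for slr in (0.0, 0.11):
    results[slr] = lam * sum(gpd_pdf(y, xi, sigma) * damage(u + y, slr) * dy
                             for y in ys)
# results[0.11] > results[0.0]: mean rise raises the expected annual damage.
```

    How fast the expected damage grows with the rise depends on xi, sigma and the damage function; the doubling reported for Copenhagen is a property of that city's calibration, not of this toy setup.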

  18. Developing a methodological framework for estimating water productivity indicators in water scarce regions

    Science.gov (United States)

    Mubako, S. T.; Fullerton, T. M.; Walke, A.; Collins, T.; Mubako, G.; Walker, W. S.

    2014-12-01

    Water productivity is an area of growing interest in assessing the impact of human economic activities on water resources, especially in arid regions. Indicators of water productivity can assist water users in evaluating sectoral water use efficiency, identifying sources of pressure on water resources, and in supporting water allocation rationale under scarcity conditions. This case study for the water-scarce Middle Rio Grande River Basin aims to develop an environmental-economic accounting approach for water use in arid river basins through a methodological framework that relates water use to human economic activities impacting regional water resources. Water uses are coupled to economic transactions, and the complex but mutual relations between various water using sectors estimated. A comparison is made between the calculated water productivity indicators and representative cost/price per unit volume of water for the main water use sectors. Although it contributes very little to regional economic output, preliminary results confirm that Irrigation is among the sectors with the largest direct water use intensities. Economic sectors in the study region with high economic value and low water use intensity include Manufacturing, Mining, and Steam Electric Power. Water accounting challenges revealed by the study include differences in water management regimes between jurisdictions, and little understanding of the impact of major economic activities on the interaction between surface and groundwater systems in this region. A more comprehensive assessment would require the incorporation of environmental and social sustainability indicators into the calculated water productivity indicators.
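    The core indicator, economic output per unit of water used, reduces to a simple ratio per sector; the figures below are placeholders, not the study's data:

```python
# Water productivity indicator: economic output per unit of water used.
# Sector figures are illustrative placeholders, not the study's data.
sectors = {
    # sector: (output, million $/yr; water use, million m^3/yr)
    "Irrigation": (50.0, 400.0),
    "Manufacturing": (900.0, 30.0),
    "Mining": (600.0, 25.0),
}
productivity = {name: output / water
                for name, (output, water) in sectors.items()}   # $/m^3
# Irrigation shows low value per m^3; Manufacturing and Mining, high.
```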

  19. The Impact of Sea Level Rise on Florida's Everglades

    Science.gov (United States)

    Senarath, S. U.

    2005-12-01

    Global warming and the resulting melting of polar ice sheets could increase global sea levels significantly. Some studies have predicted mean sea level increases on the order of six inches to one foot in the next 25 to 50 years. This could have severe, irreversible impacts on low-lying areas of Florida's Everglades. The key objective of this study is to evaluate the effects of a one-foot sea level rise on Cape Sable Seaside Sparrow (CSSS) nesting areas within the Everglades National Park (ENP). A regional-scale hydrologic model is used to assess the sensitivities of this sea-level rise scenario. Florida's Everglades supports a unique ecosystem. At present, about 50 percent of this unique ecosystem has been lost to urbanization and farming, and water flow in the remnant Everglades is regulated to meet a variety of competing environmental, water-supply and flood-control needs. A 30-year, eight billion dollar (1999 estimate) project has been initiated to improve the Everglades' water flows. The expected benefits of this restoration project will be short-lived if the predicted sea level rise causes severe impacts on the environmentally sensitive areas of the Everglades. Florida's Everglades is home to many threatened and endangered species of wildlife. The Cape Sable Seaside Sparrow population in the ENP is one such species, currently listed as endangered. Since these birds build their nests close to the ground (the base of the nest is approximately six inches above the ground surface), they are directly affected by any sea-level-induced change in ponding depth, frequency or duration. The CSSS population therefore serves as a good indicator species for evaluating the negative impacts of sea level rise on the Everglades' ecosystem. The impact of sea level rise on the CSSS habitat is evaluated using the Regional Simulation Model (RSM) developed by the South Florida Water Management District. The RSM is an implicit, finite-volume, continuous

  20. Utilisation of best estimate system codes and best estimate methods in safety analyses of VVER reactors in the Czech Republic

    International Nuclear Information System (INIS)

    Macek, Jiri; Kral, Pavel

    2010-01-01

    The content of the presentation was as follows: Conservative versus best estimate approach, Brief description and selection of methodology, Description of uncertainty methods, Examples of the BE methodology. It is concluded that where BE computer codes are used, uncertainty and sensitivity analyses should be included; if best estimate codes + uncertainty are used, the safety margins increase; and BE + BSA is the next step in licensing analyses. (P.A.)

  1. Methodology for characterizing modeling and discretization uncertainties in computational simulation

    Energy Technology Data Exchange (ETDEWEB)

    ALVIN,KENNETH F.; OBERKAMPF,WILLIAM L.; RUTHERFORD,BRIAN M.; DIEGERT,KATHLEEN V.

    2000-03-01

    This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.
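
    A common way to realize a "total prediction uncertainty" estimate from such a framework is to propagate input/model-form uncertainty by Monte Carlo sampling and combine it with a separate discretization-error term. The toy model, the sampled distribution, and the discretization-error value below are illustrative assumptions, not quantities from this work.

```python
# Monte Carlo propagation of input uncertainty through a model, with a
# discretization-error term combined in quadrature (illustrative decomposition).
import random
import statistics

random.seed(0)

def model(k):
    """Stand-in for an expensive simulation code."""
    return 2.0 * k + 1.0

# Propagate an uncertain input k ~ N(5.0, 0.5) through the model.
samples = [model(random.gauss(5.0, 0.5)) for _ in range(10_000)]
mu = statistics.fmean(samples)
sigma_input = statistics.stdev(samples)   # input/model-form contribution
sigma_disc = 0.3                          # assumed discretization-error estimate
sigma_total = (sigma_input**2 + sigma_disc**2) ** 0.5
```

    With a linear model the propagated standard deviation is simply 2 × 0.5 = 1.0, which the sampled estimate should approach.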

  2. Rethinking Fragile Landscapes during the Greek Crisis: Precarious Aesthetics and Methodologies in Athenian Dance Performances

    Science.gov (United States)

    Zervou, Natalie

    2017-01-01

    The financial crisis in Greece brought about significant changes in the sociopolitical and financial landscape of the country. Severe budget cuts imposed on the arts and performing practices have given rise to a new aesthetic which has impacted the themes and methodologies of contemporary productions. To unpack this aesthetic, I explore the ways…

  3. Theoretical and methodological bases of the cooperation and the cooperative

    Directory of Open Access Journals (Sweden)

    Claudio Alberto Rivera Rodríguez

    2013-12-01

    Full Text Available The present work aims to address the theoretical and methodological foundations of the rise of the cooperatives. The article studies the logical antecedents of cooperativism and the premises established by the Industrial Revolution for the emergence of the first modern cooperative, "The Pioneers of Rochdale", which is the inflection point of cooperativism, and then analyzes the contributions of the thinking of the time that sustained this process.

  4. EnergiTools. A methodology for performance monitoring and diagnosis

    International Nuclear Information System (INIS)

    Ancion, P.; Bastien, R.; Ringdahl, K.

    2000-01-01

    EnergiTools is a performance monitoring and diagnostic tool that combines the power of on-line process data acquisition with advanced diagnosis methodologies. Analytical models based on thermodynamic principles are combined with neural networks to validate sensor data and to estimate missing or faulty measurements. Advanced diagnostic technologies are then applied to point out potential faults and areas to be investigated further. The diagnosis methodologies are based on Bayesian belief networks. Expert knowledge is captured in the form of fault-symptom relationships and includes historical information such as the likelihood of faults and symptoms. The methodology produces the likelihood of component failure root causes using the expert knowledge base. EnergiTools is used at the Ringhals nuclear power plants, where it has led to the diagnosis of various performance issues. Three case studies based on plant data and models are presented to illustrate the diagnosis support methodologies implemented in EnergiTools. In the first case, the analytical data qualification technique points out several faulty measurements. The application of a neural network for the estimation of the nuclear reactor power by interpreting several plant indicators is then illustrated. The use of the Bayesian belief networks is finally described. (author)
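
    The fault-symptom reasoning described above can be sketched as a naive-Bayes calculation: prior fault likelihoods and symptom probabilities conditioned on each fault (the "expert knowledge base") are combined with observed symptoms into posterior fault likelihoods. The faults, symptoms, and probabilities below are hypothetical placeholders, not values from EnergiTools.

```python
# Naive-Bayes sketch of fault diagnosis from observed symptoms.
# Priors and conditionals stand in for an expert knowledge base.

PRIOR = {"fouled_condenser": 0.05, "sensor_drift": 0.10, "valve_leak": 0.02}

# P(symptom present | fault)
P_SYMPTOM = {
    "fouled_condenser": {"low_vacuum": 0.9, "high_backpressure": 0.8},
    "sensor_drift":     {"low_vacuum": 0.2, "high_backpressure": 0.1},
    "valve_leak":       {"low_vacuum": 0.4, "high_backpressure": 0.6},
}

def fault_posteriors(observed):
    """Return P(fault | observed symptoms), assuming symptom independence."""
    scores = {}
    for fault, prior in PRIOR.items():
        likelihood = 1.0
        for symptom in observed:
            likelihood *= P_SYMPTOM[fault].get(symptom, 0.01)
        scores[fault] = prior * likelihood
    total = sum(scores.values())
    return {f: s / total for f, s in scores.items()}

post = fault_posteriors(["low_vacuum", "high_backpressure"])
best = max(post, key=post.get)  # most likely root cause given the evidence
```

    A full belief network would also model dependencies between symptoms; the independence assumption here is what keeps the sketch small.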

  5. Development of regional stump-to-mill logging cost estimators

    Science.gov (United States)

    Chris B. LeDoux; John E. Baumgras

    1989-01-01

    Planning logging operations requires estimating the logging costs for the sale or tract being harvested. Decisions need to be made on equipment selection and its application to terrain. In this paper a methodology is described that has been developed and implemented to solve the problem of accurately estimating logging costs by region. The methodology blends field time...

  6. Sun Radio Interferometer Space Experiment (SunRISE)

    Science.gov (United States)

    Kasper, Justin C.; SunRISE Team

    2018-06-01

    The Sun Radio Interferometer Space Experiment (SunRISE) is a NASA Heliophysics Explorer Mission of Opportunity currently in Phase A. SunRISE is a constellation of spacecraft flying in a 10-km diameter formation and operating as the first imaging radio interferometer in space. The purpose of SunRISE is to reveal critical aspects of solar energetic particle (SEP) acceleration at coronal mass ejections (CMEs) and transport into space by making the first spatially resolved observations of coherent Type II and III radio bursts produced by electrons accelerated at CMEs or released from flares. SunRISE will focus on solar Decametric-Hectometric (DH) emission, which appears in space before major SEP events but cannot be seen on Earth due to ionospheric absorption. This talk will describe SunRISE objectives and implementation. Presented on behalf of the entire SunRISE team.

  7. Sample size methodology

    CERN Document Server

    Desu, M M

    2012-01-01

    One of the most important problems in designing an experiment or a survey is sample size determination and this book presents the currently available methodology. It includes both random sampling from standard probability distributions and from finite populations. Also discussed is sample size determination for estimating parameters in a Bayesian setting by considering the posterior distribution of the parameter and specifying the necessary requirements. The determination of the sample size is considered for ranking and selection problems as well as for the design of clinical trials. Appropria
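
    A textbook instance of the problem is the sample size needed to estimate a normal mean to within a margin of error E at a given confidence level, n = (z·σ/E)². This is a generic illustration of the topic, not a formula quoted from the book.

```python
# Sample size for estimating a mean with margin of error E:
#   n = ceil((z * sigma / E)**2)
import math

def sample_size_mean(sigma, margin, z=1.96):
    """Smallest n so the z-level confidence half-width is at most `margin`."""
    return math.ceil((z * sigma / margin) ** 2)

# Example: sigma = 15, want the mean within +/- 3 at 95% confidence
n = sample_size_mean(sigma=15.0, margin=3.0, z=1.96)
```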

  8. Rise of a cold plume

    International Nuclear Information System (INIS)

    Kakuta, Michio

    1977-06-01

    The rise of smoke from the stacks of two research reactors in normal operation was measured by a photogrammetric method. The temperature of the effluent gas is less than 20 °C higher than that of the ambient air (heat emission of the order of 10⁴ cal s⁻¹), and the efflux velocity divided by the wind speed is between 0.5 and 2.8 in all 16 smoke runs. The field data obtained within a downwind distance of 150 m are compared with predictions of presently available plume rise formulas. Considering the shape of the bending-over plume, Briggs' formula for a 'jet' gives a reasonable explanation of the observed plume rise. (auth.)
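
    For reference, the bent-over momentum ("jet") rise of Briggs is commonly written z(x) = (3 Fₘ x / (β² u²))^(1/3), with momentum flux parameter Fₘ = w²r₀² and entrainment parameter β ≈ 0.6. The sketch below uses that standard form; the stack radius and speeds are illustrative values, not the reactor data from this study.

```python
# Briggs bent-over momentum ("jet") plume rise sketch:
#   z(x) = (3 * Fm * x / (beta**2 * u**2)) ** (1/3),  Fm = w**2 * r0**2

def jet_plume_rise(x, w, u, r0, beta=0.6):
    """Plume rise z (m) at downwind distance x (m), for efflux speed w (m/s),
    wind speed u (m/s), and stack radius r0 (m)."""
    fm = w ** 2 * r0 ** 2  # momentum flux parameter (m^4 s^-2)
    return (3.0 * fm * x / (beta ** 2 * u ** 2)) ** (1.0 / 3.0)

# Illustrative case: w/u = 2 (within the 0.5-2.8 range above), 150 m downwind
z = jet_plume_rise(x=150.0, w=4.0, u=2.0, r0=0.5)
```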

  9. Estimating the reliability of glycemic index values and potential sources of methodological and biological variability.

    Science.gov (United States)

    Matthan, Nirupa R; Ausman, Lynne M; Meng, Huicui; Tighiouart, Hocine; Lichtenstein, Alice H

    2016-10-01

    The utility of glycemic index (GI) values for chronic disease risk management remains controversial. Although absolute GI value determinations for individual foods have been shown to vary significantly in individuals with diabetes, there is a dearth of data on the reliability of GI value determinations and potential sources of variability among healthy adults. We examined the intra- and inter-individual variability in glycemic response to a single food challenge and methodologic and biological factors that potentially mediate this response. The GI value for white bread was determined by using standardized methodology in 63 volunteers free from chronic disease and recruited to differ by sex, age (18-85 y), and body mass index [BMI (in kg/m²): 20-35]. Volunteers randomly underwent 3 sets of food challenges involving glucose (reference) and white bread (test food), both providing 50 g available carbohydrates. Serum glucose and insulin were monitored for 5 h postingestion, and GI values were calculated by using different area under the curve (AUC) methods. Biochemical variables were measured by using standard assays and body composition by dual-energy X-ray absorptiometry. The mean ± SD GI value for white bread was 62 ± 15 when calculated by using the recommended method. Mean intra- and inter-individual CVs were 20% and 25%, respectively. Increasing sample size, replication of reference and test foods, and length of blood sampling, as well as AUC calculation method, did not improve the CVs. Among the biological factors assessed, insulin index and glycated hemoglobin values explained 15% and 16% of the variability in mean GI value for white bread, respectively. These data indicate that there is substantial variability in individual responses to GI value determinations, demonstrating that it is unlikely to be a good approach to guiding food choices. Additionally, even in healthy individuals, glycemic status significantly contributes to the variability in GI value
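
    The AUC referred to above is usually the incremental area under the glucose curve (trapezoidal rule, with area below the fasting baseline ignored), and GI = 100 × AUC_test / AUC_reference. A minimal sketch of that calculation; the glucose readings are made up, not data from this study.

```python
# Incremental AUC (trapezoidal, area below baseline ignored) and GI.

def incremental_auc(times, glucose):
    """iAUC over `times` (min) for `glucose` readings; baseline = first value."""
    base = glucose[0]
    inc = [max(g - base, 0.0) for g in glucose]
    auc = 0.0
    for i in range(1, len(times)):
        auc += 0.5 * (inc[i - 1] + inc[i]) * (times[i] - times[i - 1])
    return auc

times = [0, 15, 30, 45, 60, 90, 120]                   # minutes post-ingestion
glucose_ref = [5.0, 7.5, 8.5, 7.8, 7.0, 6.0, 5.2]      # mmol/L, glucose drink
glucose_test = [5.0, 6.5, 7.4, 7.0, 6.4, 5.6, 5.1]     # mmol/L, white bread

gi = 100.0 * incremental_auc(times, glucose_test) / incremental_auc(times, glucose_ref)
```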

  10. ALTERNATIVE METHODOLOGIES FOR THE ESTIMATION OF LOCAL POINT DENSITY INDEX: MOVING TOWARDS ADAPTIVE LIDAR DATA PROCESSING

    Directory of Open Access Journals (Sweden)

    Z. Lari

    2012-07-01

    Full Text Available Over the past few years, LiDAR systems have been established as a leading technology for the acquisition of high density point clouds over physical surfaces. These point clouds will be processed for the extraction of geo-spatial information. Local point density is one of the most important properties of the point cloud that highly affects the performance of data processing techniques and the quality of extracted information from these data. Therefore, it is necessary to define a standard methodology for the estimation of local point density indices to be considered for the precise processing of LiDAR data. Current definitions of local point density indices, which only consider the 2D neighbourhood of individual points, are not appropriate for 3D LiDAR data and cannot be applied for laser scans from different platforms. In order to resolve the drawbacks of these methods, this paper proposes several approaches for the estimation of the local point density index which take the 3D relationship among the points and the physical properties of the surfaces they belong to into account. In the simplest approach, an approximate value of the local point density for each point is defined while considering the 3D relationship among the points. In the other approaches, the local point density is estimated by considering the 3D neighbourhood of the point in question and the physical properties of the surface which encloses this point. The physical properties of the surfaces enclosing the LiDAR points are assessed through eigen-value analysis of the 3D neighbourhood of individual points and adaptive cylinder methods. This paper will discuss these approaches and highlight their impact on various LiDAR data processing activities (i.e., neighbourhood definition, region growing, segmentation, boundary detection, and classification). Experimental results from airborne and terrestrial LiDAR data verify the efficacy of considering local point density variation for
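
    The simplest of the approaches sketched above, a 3D analogue of the usual 2D density index, can be illustrated with a k-nearest-neighbour estimate: density ≈ k divided by the volume of the sphere reaching the k-th neighbour (versus the area of a disc in 2D). This brute-force sketch is a generic illustration, not the authors' implementation.

```python
# Local point density from the distance to the k-th nearest neighbour:
#   2D: k / (pi * r_k**2)        3D: k / ((4/3) * pi * r_k**3)
import math

def knn_density(points, i, k, dims=3):
    """Local density at points[i] using its k nearest neighbours (brute force)."""
    px = points[i]
    dists = sorted(
        math.dist(px[:dims], q[:dims]) for j, q in enumerate(points) if j != i
    )
    r_k = dists[k - 1]
    if dims == 2:
        measure = math.pi * r_k ** 2                 # disc area
    else:
        measure = (4.0 / 3.0) * math.pi * r_k ** 3   # sphere volume
    return k / measure

# Regular 1 m grid: 3D density should be on the order of 1 point per m^3.
pts = [(x, y, z) for x in range(5) for y in range(5) for z in range(5)]
d3 = knn_density(pts, i=62, k=6, dims=3)  # index 62 = (2, 2, 2), an interior point
```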

  11. Morphological response of the saltmarsh habitats of the Guadiana estuary due to flow regulation and sea-level rise

    Science.gov (United States)

    Sampath, D. M. R.; Boski, T.

    2016-12-01

    In the context of rapid sea-level rise in the 21st century, the reduction of fluvial sediment supply due to the regulation of river discharge represents a major challenge for the management of estuarine ecosystems. Therefore, the present study aims to assess the cumulative impacts of the reduction of river discharge and projected sea-level rise on the morphological evolution of the Guadiana estuary during the 21st century. The assessment was based on a set of analytical solutions to simplified equations of tidal wave propagation in shallow waters and empirical knowledge of the system. As the methods applied to estimate environmental flows take into consideration neither the fluvial discharge required to maintain saltmarsh habitats nor the impact of sea-level rise, simulations were carried out for ten cases of base river flow and sea-level rise so as to understand the sensitivity of saltmarsh-platform deepening to these factors. Results suggest saltmarsh habitats may not be affected severely under the lower-limit scenarios of sea-level rise and sedimentation. Similar behaviour can be expected under the upper-limit scenarios until 2050, but with significant submergence afterwards. In the upper-limit scenarios under scrutiny, there was a net erosion of sediment from the estuary. Multiplying the amplitude of the base flow function by factors of 1.5, 2, and 5 reduces the estimated net eroded sediment volume by 25, 40, and 80%, respectively, with respect to the net eroded volume for the observed river discharge. The results also indicate that defining the minimum environmental flow as a percentage of dry season flow (as done presently) should be updated to include the full spectrum of natural flows, incorporating temporal variability to better anticipate scenarios of sea-level rise during this century. 
As permanent submergence of intertidal habitats can be significant after 2050, due to the projected 79 cm rise of sea-level by the year

  12. Novel methodology for pharmaceutical expenditure forecast.

    Science.gov (United States)

    Vataire, Anne-Lise; Cetinsoy, Laurent; Aballéa, Samuel; Rémuzat, Cécile; Urbinati, Duccio; Kornfeld, Åsa; Mzoughi, Olfa; Toumi, Mondher

    2014-01-01

    The value appreciation of new drugs across countries today features a disruption that is making the historical data that are used for forecasting pharmaceutical expenditure poorly reliable. Forecasting methods rarely addressed uncertainty. The objective of this project was to propose a methodology to perform pharmaceutical expenditure forecasting that integrates expected policy changes and uncertainty (developed for the European Commission as the 'EU Pharmaceutical expenditure forecast'; see http://ec.europa.eu/health/healthcare/key_documents/index_en.htm). 1) Identification of all pharmaceuticals going off-patent and new branded medicinal products over a 5-year forecasting period in seven European Union (EU) Member States. 2) Development of a model to estimate direct and indirect impacts (based on health policies and clinical experts) on savings of generics and biosimilars. Inputs were originator sales value, patent expiry date, time to launch after marketing authorization, price discount, penetration rate, time to peak sales, and impact on brand price. 3) Development of a model for new drugs, which estimated sales progression in a competitive environment. Clinical expected benefits as well as commercial potential were assessed for each product by clinical experts. Inputs were development phase, marketing authorization dates, orphan condition, market size, and competitors. 4) Separate analysis of the budget impact of products going off-patent and new drugs according to several perspectives, distribution chains, and outcomes. 5) Addressing uncertainty surrounding estimations via deterministic and probabilistic sensitivity analysis. 
This methodology has proven to be effective by 1) identifying the main parameters impacting the variations in pharmaceutical expenditure forecasting across countries: generics discounts and penetration, brand price after patent loss, reimbursement rate, the penetration of biosimilars and discount price, distribution chains, and the time

  13. On Capillary Rise and Nucleation

    Science.gov (United States)

    Prasad, R.

    2008-01-01

    A comparison of capillary rise and nucleation is presented. It is shown that both phenomena result from a balance between two competing energy factors: a volume energy and a surface energy. Such a comparison may help to introduce nucleation with a topic familiar to the students, capillary rise. (Contains 1 table and 3 figures.)
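
    The parallel can be made explicit. In capillary rise, the equilibrium height balances surface energy gained against the gravitational potential energy of the raised column; in nucleation, the critical radius balances the surface energy cost of the interface against the bulk free energy gained. The standard textbook forms are (notation assumed here, not taken from the paper):

```latex
% Capillary rise: surface tension supports the weight of the liquid column
h = \frac{2\gamma\cos\theta}{\rho g r}

% Homogeneous nucleation: free energy of a spherical nucleus of radius r
\Delta G(r) = 4\pi r^{2}\gamma - \tfrac{4}{3}\pi r^{3}\,\Delta g_{v},
\qquad
r^{*} = \frac{2\gamma}{\Delta g_{v}} \quad\text{(from } \mathrm{d}\Delta G/\mathrm{d}r = 0\text{)}
```

    In both cases a surface term (∝ γ) competes with a volume term, which is the common structure the paper uses to introduce nucleation through capillary rise.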

  14. ExternE transport methodology for external cost evaluation of air pollution

    DEFF Research Database (Denmark)

    Jensen, S. S.; Berkowicz, R.; Brandt, J.

    The report describes how the human exposure estimates based on NERI's human exposure modelling system (AirGIS) can improve the Danish data used for exposure factors in the ExternE Transport methodology. Initially, a brief description of the ExternE Tranport methodology is given and it is summarised...

  15. Estimating absolute sea level variations by combining GNSS and Tide gauge data

    Digital Repository Service at National Institute of Oceanography (India)

    Bos, M.S.; Fernandes, R.M.S; Vethamony, P.; Mehra, P.

    Indian tide gauges can be used to estimate sea level rise. To separate relative sea level rise from vertical land motion at the tide gauges, various GNSS stations have been installed in recent years at, or near, tide gauges. Using the PSMSL...
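
    The separation described here is, at the level of linear trends, just a sum: the geocentric (absolute) sea-level trend equals the relative trend seen by the tide gauge plus the vertical land motion measured by GNSS (uplift positive). The trend values below are hypothetical, not results from this study.

```python
# Absolute sea-level trend = tide-gauge (relative) trend + GNSS vertical land motion.
# Sign convention: land uplift positive, so subsiding land makes the relative
# rise at the gauge exceed the geocentric rise.

def absolute_slr(relative_trend_mm_yr, vlm_mm_yr):
    """Geocentric sea-level trend (mm/yr) from a co-located tide gauge and GNSS station."""
    return relative_trend_mm_yr + vlm_mm_yr

# Example: gauge sees +3.0 mm/yr while GNSS shows the land subsiding at 1.2 mm/yr
abs_trend = absolute_slr(3.0, -1.2)  # geocentric rise of 1.8 mm/yr
```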

  16. Strategic advantages of high-rise construction

    Directory of Open Access Journals (Sweden)

    Yaskova Natalya

    2018-01-01

    Full Text Available Traditional methods of assessing the competitiveness of different types of real estate, in the context of the huge changes brought by the new technological way of life, do not provide building solutions that would be correct from a strategic perspective. There are many challenges due to changes in consumers' behavior in the housing area. A multiplicity of life models, a variety of opportunities and priorities, and traditions and new trends in construction should be assessed in terms of prospective benefits in the environment of the emerging new world order. At the same time, the main discourse of high-rise construction mainly relates to its design features, technical innovations, and architectural accents. We need to clarify the criteria for economic evaluation of high-rise construction in order to provide decisions with clear and quantifiable contexts. The suggested approach to assessing the strategic advantages of high-rise construction and the prospects for capitalization of high-rise buildings poses new challenges for the economy: to identify adequate quantitative methods of assessing the economic efficiency of high-rise buildings, taking into account all stages of their life cycle.

  17. Strategic advantages of high-rise construction

    Science.gov (United States)

    Yaskova, Natalya

    2018-03-01

    Traditional methods of assessing the competitiveness of different types of real estate, in the context of the huge changes brought by the new technological way of life, do not provide building solutions that would be correct from a strategic perspective. There are many challenges due to changes in consumers' behavior in the housing area. A multiplicity of life models, a variety of opportunities and priorities, and traditions and new trends in construction should be assessed in terms of prospective benefits in the environment of the emerging new world order. At the same time, the main discourse of high-rise construction mainly relates to its design features, technical innovations, and architectural accents. We need to clarify the criteria for economic evaluation of high-rise construction in order to provide decisions with clear and quantifiable contexts. The suggested approach to assessing the strategic advantages of high-rise construction and the prospects for capitalization of high-rise buildings poses new challenges for the economy: to identify adequate quantitative methods of assessing the economic efficiency of high-rise buildings, taking into account all stages of their life cycle.

  18. Evaluation of a Rising Plate Meter for Use in Multispecies Swards

    Directory of Open Access Journals (Sweden)

    S. Leanne Dillard

    2016-11-01

    Full Text Available The rising plate meter (RPM) provides rapid estimates of herbage mass (HM). Accurate calibration of the RPM is difficult due to variability in forage management, growth, and species composition. The RPM is typically calibrated by linear regression of HM on RPM height; however, the R² is usually low. Curvilinear regression, with the intercept set to zero, could provide a more robust calibration equation and decrease variability in RPM estimates. Three Pennsylvania organic dairy farms grazing lactating dairy cattle on multispecies pastures were used to determine measured HM and estimated HM using a RPM. Removal of the intercept increased the adjusted R² of all equations by between 42.8 and 89.0%. Use of quadratic and cubic regression resulted in only a 0.01 to 0.02 increase in adjusted R². Linear regression remains the simplest and preferred method of calibration; however, error can be reduced by setting calibration equations so that zero RPM height is associated with zero HM.
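
    Regression through the origin, as proposed above, has a closed form in the linear case: the slope is b = Σ(x·y)/Σx², so zero plate-meter height maps to zero herbage mass by construction. The heights and herbage masses below are synthetic illustration data, not measurements from the three farms.

```python
# Calibration of herbage mass (HM) on plate-meter height with the intercept
# forced to zero: HM = b * height, where b = sum(x*y) / sum(x*x).

def zero_intercept_slope(heights, masses):
    """Least-squares slope of a regression through the origin."""
    sxy = sum(x * y for x, y in zip(heights, masses))
    sxx = sum(x * x for x in heights)
    return sxy / sxx

heights = [2.0, 4.0, 6.0, 8.0, 10.0]             # RPM height (cm), illustrative
masses = [310.0, 590.0, 920.0, 1180.0, 1530.0]   # HM (kg DM/ha), illustrative

b = zero_intercept_slope(heights, masses)

def predict(h):
    return b * h   # zero height -> zero HM, by construction
```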

  19. Methodology for Evaluating Safety System Operability using Virtual Parameter Network

    International Nuclear Information System (INIS)

    Park, Sukyoung; Heo, Gyunyoung; Kim, Jung Taek; Kim, Tae Wan

    2014-01-01

    KAERI (Korea Atomic Energy Research Institute) and UTK (University of Tennessee Knoxville) are working on the I-NERI project to address this problem. This research, conducted with KAERI, proposes a methodology that provides an alternative signal when the reliability of some instrumentation cannot be guaranteed. The methodology assumes that several instruments are working normally under the available power supply, because instrument survivability itself is not considered. The concept of the Virtual Parameter Network (VPN) is used to identify the associations between plant parameters. This paper is an extended version of the paper submitted at the last KNS meeting, with a changed methodology and an added case study. In previous research, an Artificial Neural Network (ANN) inferential technique was used as the estimation model, but this model produced different estimates on each run due to random bias. Therefore, an Auto-Associative Kernel Regression (AAKR) model, which has the same number of inputs and outputs, is used for estimation. In addition, whereas the importance measures in the previous method depended on the estimation model, the importance measure of the improved method is independent of the estimation model. In this study, we propose a methodology to identify the internal state of the power plant when a severe accident happens, and it has been validated through a case study. An SBLOCA, which is a large contributor to severe accidents, is considered as the initiating event, and the relationships among parameters have been identified. The VPN can identify which parameters have to be observed and which parameters can substitute for a missing parameter when some instruments fail in a severe accident. The results show that parameters 2, 3, and 4 commonly have high connectivity.
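
    Auto-Associative Kernel Regression, adopted above in place of the ANN, is deterministic: a query vector is compared with stored fault-free memory vectors, Gaussian kernel weights are computed from the distances, and the corrected signal is the weighted average of the memory vectors. The memory matrix and bandwidth below are generic illustrations, not the plant data from this study.

```python
# Auto-Associative Kernel Regression (AAKR) sketch: same number of inputs
# and outputs; the output is a kernel-weighted average of memory vectors.
import math

MEMORY = [  # historical, fault-free observations of three plant parameters
    [1.0, 10.0, 100.0],
    [1.1, 11.0, 105.0],
    [0.9, 9.0, 95.0],
    [1.2, 12.0, 110.0],
]

def aakr(query, memory=MEMORY, bandwidth=1.0):
    """Return a corrected estimate of `query` from the memory matrix."""
    weights = []
    for m in memory:
        d2 = sum((q - v) ** 2 for q, v in zip(query, m))
        weights.append(math.exp(-d2 / (2.0 * bandwidth ** 2)))
    total = sum(weights)
    n = len(query)
    return [sum(w * m[j] for w, m in zip(weights, memory)) / total for j in range(n)]

# Third sensor reads anomalously high; the estimate is pulled back toward memory.
est = aakr([1.0, 10.0, 130.0])
```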

  20. Methodology for Evaluating Safety System Operability using Virtual Parameter Network

    Energy Technology Data Exchange (ETDEWEB)

    Park, Sukyoung; Heo, Gyunyoung [Kyung Hee Univ., Yongin (Korea, Republic of); Kim, Jung Taek [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Kim, Tae Wan [Kepco International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2014-05-15

    KAERI (Korea Atomic Energy Research Institute) and UTK (University of Tennessee Knoxville) are working on the I-NERI project to address this problem. This research, conducted with KAERI, proposes a methodology that provides an alternative signal when the reliability of some instrumentation cannot be guaranteed. The methodology assumes that several instruments are working normally under the available power supply, because instrument survivability itself is not considered. The concept of the Virtual Parameter Network (VPN) is used to identify the associations between plant parameters. This paper is an extended version of the paper submitted at the last KNS meeting, with a changed methodology and an added case study. In previous research, an Artificial Neural Network (ANN) inferential technique was used as the estimation model, but this model produced different estimates on each run due to random bias. Therefore, an Auto-Associative Kernel Regression (AAKR) model, which has the same number of inputs and outputs, is used for estimation. In addition, whereas the importance measures in the previous method depended on the estimation model, the importance measure of the improved method is independent of the estimation model. In this study, we propose a methodology to identify the internal state of the power plant when a severe accident happens, and it has been validated through a case study. An SBLOCA, which is a large contributor to severe accidents, is considered as the initiating event, and the relationships among parameters have been identified. The VPN can identify which parameters have to be observed and which parameters can substitute for a missing parameter when some instruments fail in a severe accident. The results show that parameters 2, 3, and 4 commonly have high connectivity.

  1. An inventory of nitrous oxide emissions from agriculture in the UK using the IPCC methodology: emission estimate, uncertainty and sensitivity analysis

    Science.gov (United States)

    Brown, L.; Armstrong Brown, S.; Jarvis, S. C.; Syed, B.; Goulding, K. W. T.; Phillips, V. R.; Sneath, R. W.; Pain, B. F.

    Nitrous oxide emission from UK agriculture was estimated, using the IPCC default values of all emission factors and parameters, to be 87 Gg N₂O-N in both 1990 and 1995. This estimate was shown, however, to have an overall uncertainty of 62%. The largest component of the emission (54%) was from the direct (soil) sector. Two of the three emission factors applied within the soil sector, EF1 (direct emission from soil) and EF3(PRP) (emission from pasture, range and paddock), were amongst the most influential on the total estimate, producing a ±31 and +11% to -17% change in emissions, respectively, when varied through the IPCC range from the default value. The indirect sector (from leached N and deposited ammonia) contributed 29% of the total emission, and had the largest uncertainty (126%). The factors determining the fraction of N leached (FracLEACH) and emissions from it (EF5) were the two most influential. These parameters are poorly specified and there is great potential to improve the emission estimate for this component. Use of mathematical models (NCYCLE and SUNDIAL) to predict FracLEACH suggested that the IPCC default value for this parameter may be too high for most situations in the UK. Comparison with other UK-derived inventories suggests that the IPCC methodology may overestimate emission. Although the IPCC approach includes additional components to the other inventories (most notably emission from indirect sources), estimates for the common components (i.e. fertiliser and animals), and emission factors used, are higher than those of other inventories. Whilst it is recognised that the IPCC approach is generalised in order to allow widespread applicability, sufficient data are available to specify at least two of the most influential parameters, i.e. EF1 and FracLEACH, more accurately, and so provide an improved estimate of nitrous oxide emissions from UK agriculture.
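
    The direct-soil term of the IPCC Tier 1 approach is a simple product of activity data and an emission factor, N₂O-N = EF1 × N input, so varying EF1 through its range propagates directly to the estimate. The EF1 default and range below follow the revised 1996 IPCC guidelines as I recall them; the national nitrogen input is a made-up figure, not the UK inventory value.

```python
# IPCC Tier-1 style direct soil emission: N2O-N = EF1 * N_input.

EF1_DEFAULT, EF1_LOW, EF1_HIGH = 0.0125, 0.0025, 0.0225  # kg N2O-N per kg N

def direct_soil_emission(n_input_kg, ef1=EF1_DEFAULT):
    """Direct soil N2O emission (kg N2O-N) for a nitrogen input (kg N)."""
    return ef1 * n_input_kg

n_applied = 1.0e9  # kg N applied nationally per year (illustrative)
central = direct_soil_emission(n_applied)
low = direct_soil_emission(n_applied, EF1_LOW)
high = direct_soil_emission(n_applied, EF1_HIGH)
```

    The wide (low, high) interval relative to the central value is exactly the sensitivity to EF1 discussed in the abstract.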

  2. Radiation monitoring for the HTTR rise-to-power test (1) and (2)

    Energy Technology Data Exchange (ETDEWEB)

    Nakazawa, Takashi; Yoshino, Toshiaki; Yasu, Katsuji; Ashikagaya, Yoshinobu; Kikuchi, Toshiki [Japan Atomic Energy Research Inst., Oarai, Ibaraki (Japan). Oarai Research Establishment

    2001-02-01

    The High Temperature Engineering Test Reactor (HTTR) is the first high-temperature gas-cooled research reactor in Japan. This reactor is a helium-gas-cooled and graphite-moderated reactor with a thermal output of 30 MW. The rated operation temperature of the outlet coolant is 850 °C (during high-temperature test operation, this reaches 950 °C). The first criticality of the HTTR was attained in November 1998. The single-loaded, parallel-loaded operation with a thermal output of 9 MW (called the HTTR Rise-to-Power Test (1)) was completed between September 16, 1999 and July 8, 2000. The single-loaded, parallel-loaded continuous operation with a thermal output of 20 MW (called the HTTR Rise-to-Power Test (2)) has also been carried out, but it was shut down at the halfway stage by a signal from the reactor, when the thermal output was 16.5 MW and the reactor outlet coolant temperature was 500 °C. This report describes the radiation monitoring carried out during the HTTR Rise-to-Power Tests (1) and (2). The data measured by the various radiation monitors are also reported. These data will be used for the estimation of radiation levels (such as the radiation dose equivalent rate, the radioactive concentration in effluents, etc.) for the next HTTR Rise-to-Power Test, and for periodic inspections. (author)

  3. Practical methodologies for the calculation of capacity in electricity markets for wind energy

    International Nuclear Information System (INIS)

    Botero B, Sergio; Giraldo V, Luis Alfonso; Isaza C, Felipe

    2008-01-01

    Determining the real capacity of the generators in a power market is an essential task in order to estimate the actual system reliability, and to estimate the reward for generators due to their capacity in the firm energy market. In the wind power case, which is an intermittent resource, several methodologies have been proposed to estimate the capacity of a wind power emplacement, not only for planning but also for firm energy remuneration purposes. This paper presents some methodologies that have been proposed or implemented around the world in order to calculate the capacity of this energy resource.
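
    One of the simplest practical rules of the kind surveyed is a percentile-of-output criterion: the firm capacity credited to a wind plant is the output level it exceeds during some large fraction of the hours considered. The sketch below illustrates that generic rule; it is not a formula taken from this paper, and the hourly outputs are invented.

```python
# Capacity credit as an exceedance percentile of hourly wind output:
# the power level exceeded during `exceedance` of the hours considered.

def capacity_credit(hourly_mw, exceedance=0.9):
    """Output (MW) exceeded in a fraction `exceedance` of the hours."""
    xs = sorted(hourly_mw)
    idx = round((1.0 - exceedance) * len(xs))
    return xs[min(max(idx, 0), len(xs) - 1)]

hours = [0.0, 2.0, 5.0, 8.0, 10.0, 12.0, 15.0, 18.0, 20.0, 25.0]  # MW, illustrative
credit = capacity_credit(hours, exceedance=0.9)
```

    Stricter exceedance levels credit less capacity, which is how the rule penalizes the intermittency of the resource.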

  4. A methodology for evaluating environmental impacts of railway freight transportation policies

    International Nuclear Information System (INIS)

    Lopez, Ignacio; Rodriguez, Javier; Buron, Jose Manuel; Garcia, Alberto

    2009-01-01

    Railway freight transportation presents a degree of complexity which frequently makes it impossible to model with sufficient precision. Currently, the energy and environmental impacts of freight transportation are usually modelled using average data, which do not reflect the characteristics of specific lines. These models allow qualitative approximations which may be used as criteria for designing high-level transportation policies: road-to-rail modal shift, regional energy planning or environmental policies. This paper proposes a methodology for estimating the railway consumption associated with a specific railway line, which yields a new degree of precision. It is based on estimating different contributions to railway consumption from a collection of mobility-, operation-, and infrastructure-related factors. This procedure also allows applying the methodology to the detailed design of transportation policies: evaluating the impact of modal shift, consumption and pollutant emissions on a specific line, as well as the effect of building tunnels, reducing slopes, improving traffic control, etc. A comparison of the estimations given by the conventional approach and the proposed methodology is offered, as well as further comments on the results.

  5. Estimation of erosion-accumulative processes at the Inia River’s mouth near high-rise construction zones.

    Directory of Open Access Journals (Sweden)

    Sineeva Natalya

    2018-01-01

    Full Text Available The relevance of our study stems from the increasing man-made impact on water bodies and associated land resources within urban areas and, as a consequence, from changes in the morphology and dynamics of river channels. This makes it necessary to predict the development of erosion-accumulation processes, especially within built-up urban areas. The purpose of the study is to develop a programme for assessing erosion-accumulation processes at a water body, the mouth area of the Inia River, in a zone of prospective high-rise construction for a residential microdistrict, where the floodplain-channel complex is expected to develop intensively. Results of the study: a comparison of water flow velocities measured in the field with those calculated from the model revealed only a slight discrepancy, which allows us to state that the numerical model reliably describes the physical processes developing in the river. The calculations performed to assess the direction and intensity of channel re-formation lead to the conclusion that erosion processes slightly predominate over accumulative ones on the undeveloped part of the Inia River, with the activity of these processes noticeable only in certain areas (near the banks and the island). Importance of the study: the evaluation of erosion-accumulation processes can be used in design decisions for the future high-rise construction of this territory, which will increase their economic efficiency.

  6. Security Events and Vulnerability Data for Cybersecurity Risk Estimation.

    Science.gov (United States)

    Allodi, Luca; Massacci, Fabio

    2017-08-01

    Current industry standards for estimating cybersecurity risk are based on qualitative risk matrices as opposed to quantitative risk estimates. In contrast, risk assessment in most other industry sectors aims at deriving quantitative risk estimations (e.g., Basel II in finance). This article presents a model and methodology to leverage the large amount of data available from the IT infrastructure of an organization's security operation center to quantitatively estimate the probability of attack. Our methodology specifically addresses untargeted attacks delivered by automatic tools that make up the vast majority of attacks in the wild against users and organizations. We consider two-stage attacks whereby the attacker first breaches an Internet-facing system and then escalates the attack to internal systems by exploiting local vulnerabilities in the target. Our methodology factors in the power of the attacker as the number of "weaponized" vulnerabilities he or she can exploit, and can be adjusted to match the risk appetite of the organization. We illustrate our methodology using data from a large financial institution, and discuss the significant mismatch between traditional qualitative risk assessments and our quantitative approach. © 2017 Society for Risk Analysis.
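As a toy illustration of the two-stage structure described in the abstract (not the authors' actual model, and with invented per-vulnerability exploitation probabilities), the probability that an untargeted attack succeeds can be composed from the chance of breaching the perimeter and then escalating internally:

```python
# Hypothetical per-vulnerability exploitation probabilities (illustrative)
perimeter_vulns = [0.05, 0.02, 0.08]   # Internet-facing system
internal_vulns = [0.10, 0.04]          # internal target system

def p_compromise(vuln_probs):
    """Probability that at least one vulnerability is successfully exploited,
    assuming independent exploitation attempts."""
    p_none = 1.0
    for p in vuln_probs:
        p_none *= (1.0 - p)
    return 1.0 - p_none

p_stage1 = p_compromise(perimeter_vulns)   # breach the perimeter
p_stage2 = p_compromise(internal_vulns)    # escalate to the internal system
p_attack = p_stage1 * p_stage2             # both stages must succeed
print(f"P(two-stage attack succeeds) = {p_attack:.4f}")
```

Increasing the number of "weaponized" vulnerabilities available to the attacker lengthens the lists and raises both stage probabilities, which is one way the attacker's power enters such a model.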

  7. VIII. THE PAST, PRESENT, AND FUTURE OF DEVELOPMENTAL METHODOLOGY.

    Science.gov (United States)

    Little, Todd D; Wang, Eugene W; Gorrall, Britt K

    2017-06-01

    This chapter selectively reviews the evolution of quantitative practices in the field of developmental methodology. The chapter begins with an overview of the past in developmental methodology, discussing the implementation and dissemination of latent variable modeling and, in particular, longitudinal structural equation modeling. It then turns to the present state of developmental methodology, highlighting current methodological advances in the field. Additionally, this section summarizes ample quantitative resources, ranging from key quantitative methods journal articles to the various quantitative methods training programs and institutes. The chapter concludes with the future of developmental methodology and puts forth seven future innovations in the field. The innovations discussed span the topics of measurement, modeling, temporal design, and planned missing data designs. Lastly, the chapter closes with a brief overview of advanced modeling techniques such as continuous time models, state space models, and the application of Bayesian estimation in the field of developmental methodology. © 2017 The Society for Research in Child Development, Inc.

  8. Teaching methodology for modeling reference evapotranspiration with artificial neural networks

    OpenAIRE

    Martí, Pau; Pulido Calvo, Inmaculada; Gutiérrez Estrada, Juan Carlos

    2015-01-01

    [EN] Artificial neural networks are a robust alternative to conventional models for estimating different targets in irrigation engineering, among others reference evapotranspiration, a key variable for estimating crop water requirements. This paper presents a didactic methodology for introducing students to the application of artificial neural networks for reference evapotranspiration estimation using MatLab©. Apart from learning a specific application of this software wi...

  9. Genome-driven evolutionary game theory helps understand the rise of metabolic interdependencies in microbial communities.

    Science.gov (United States)

    Zomorrodi, Ali R; Segrè, Daniel

    2017-11-16

    Metabolite exchanges in microbial communities give rise to ecological interactions that govern ecosystem diversity and stability. It is unclear, however, how the rise of these interactions varies across metabolites and organisms. Here we address this question by integrating genome-scale models of metabolism with evolutionary game theory. Specifically, we use microbial fitness values estimated by metabolic models to infer evolutionarily stable interactions in multi-species microbial "games". We first validate our approach using a well-characterized yeast cheater-cooperator system. We next perform over 80,000 in silico experiments to infer how metabolic interdependencies mediated by amino acid leakage in Escherichia coli vary across 189 amino acid pairs. While most pairs display shared patterns of inter-species interactions, multiple deviations are caused by pleiotropy and epistasis in metabolism. Furthermore, simulated invasion experiments reveal possible paths to obligate cross-feeding. Our study provides genomically driven insight into the rise of ecological interactions, with implications for microbiome research and synthetic ecology.

  10. Estimating the Economic Benefits of Regional Ocean Observing Systems

    National Research Council Canada - National Science Library

    Kite-Powell, Hauke L; Colgan, Charles S; Wellman, Katharine F; Pelsoci, Thomas; Wieand, Kenneth; Pendleton, Linwood; Kaiser, Mark J; Pulsipher, Allan G; Luger, Michael

    2005-01-01

    We develop a methodology to estimate the potential economic benefits from new investments in regional coastal ocean observing systems in US waters, and apply this methodology to generate preliminary...

  11. Short Lived Climate Pollutants cause a Long Lived Effect on Sea-level Rise: Analyzing climate metrics for sea-level rise

    Science.gov (United States)

    Sterner, E.; Johansson, D. J.

    2013-12-01

    Climate change depends on the increase of several different atmospheric pollutants. While long-term global warming will be determined mainly by carbon dioxide, warming in the next few decades will depend to a large extent on short-lived climate pollutants (SLCPs). Reducing emissions of SLCPs could contribute to lowering the global mean surface temperature by 0.5 °C already by 2050 (Shindell et al. 2012). Furthermore, the warming effect of one of the most potent SLCPs, black carbon (BC), may have been underestimated in the past. Bond et al. (2013) present a new best estimate of the total BC radiative forcing (RF) of 1.1 W/m2 (90% uncertainty bounds of 0.17 to 2.1 W/m2) since the beginning of the industrial era. BC is, however, never emitted alone, and cooling aerosols from the same sources offset a majority of this RF. In the wake of calls for mitigation of SLCPs it is important to study other aspects of their climate effect. One key impact of climate change is sea-level rise (SLR). In a recent study, the effect of SLCP mitigation scenarios on SLR was examined: Hu et al. (2013) find a substantial effect, with sharp SLCP mitigation reducing SLR by 22-42% by 2100. We choose a different approach focusing on emission pulses and analyse a metric based on sea-level rise so as to further elucidate the SLR consequences of SLCPs. In particular, we want to understand the time dynamics of SLR impacts caused by SLCPs compared to other greenhouse gases. The most commonly used physically based metrics are the GWP and the GTP. We propose and evaluate an additional metric: the global sea-level rise potential (GSP), defined as the ratio of the sea-level rise after a given time horizon caused by an emission pulse of a forcer to the sea-level rise after the same horizon caused by an emission pulse of CO2. The GSP is evaluated and compared to the GWP and GTP using a set of climate forcers chosen to cover the whole range of atmospheric perturbation lifetimes (BC, CH4, N2O, CO2 and SF6). The study
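The GSP definition can be illustrated with a deliberately crude toy model: the lifetimes, radiative efficiencies, and the temperature/sea-level response below are all invented for illustration, and the real metric requires a proper climate-carbon-cycle and thermal-expansion model.

```python
import numpy as np

# Toy GSP: sea-level rise from a unit pulse of a short-lived forcer, divided
# by that from a unit pulse of CO2, at time horizon H. All numbers invented.
H = 100                              # time horizon (years)
t = np.arange(H + 1, dtype=float)

def forcing(t, lifetime, efficiency):
    """Radiative forcing from a unit emission pulse decaying exponentially."""
    return efficiency * np.exp(-t / lifetime)

def sea_level_rise(rf, horizon):
    # Crude proxy: temperature approximated as the running integral of
    # forcing (no decay), and SLR as the running integral of temperature.
    temperature = np.cumsum(rf)
    return np.cumsum(temperature)[horizon]

rf_slcp = forcing(t, lifetime=12.0, efficiency=120.0)   # short-lived forcer
rf_co2 = forcing(t, lifetime=500.0, efficiency=1.0)     # long-lived reference

gsp = sea_level_rise(rf_slcp, H) / sea_level_rise(rf_co2, H)
print(f"toy GSP at H = {H} years: {gsp:.1f}")
```

Even in this caricature, the short-lived forcer's sea-level contribution persists long after its forcing has decayed, which is the long-lived effect the title refers to.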

  12. Estimating deficit probabilities with price-responsive demand in contract-based electricity markets

    International Nuclear Information System (INIS)

    Galetovic, Alexander; Munoz, Cristian M.

    2009-01-01

    Studies that estimate deficit probabilities in hydrothermal systems have generally ignored the response of demand to changing prices, in the belief that such response is largely irrelevant. We show that ignoring the response of demand to prices can lead to substantial overestimation or underestimation of the probability of an energy deficit. To make our point we present an estimation of deficit probabilities in Chile's Central Interconnected System between 2006 and 2010. This period is characterized by tight supply, fast consumption growth and rising electricity prices. When the response of demand to rising prices is acknowledged, forecasted deficit probabilities and marginal costs are shown to be substantially lower.
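A minimal Monte Carlo sketch shows why price response matters (the capacities, inflow distribution, and elasticity below are invented, not the paper's data): letting demand contract when prices rise lowers the simulated deficit probability relative to the inelastic case.

```python
import numpy as np

rng = np.random.default_rng(1)
n_sim = 100_000

# Hypothetical hydrothermal system (GWh/year): firm thermal energy plus
# uncertain hydro energy driven by inflows
thermal = 18_000.0
hydro = rng.normal(22_000.0, 4_000.0, n_sim).clip(min=0.0)
supply = thermal + hydro

base_demand = 40_000.0
elasticity = -0.1          # assumed short-run price elasticity of demand
price_rise = 0.5           # assumed 50% price increase in tight conditions

# Inelastic demand vs. demand that contracts as prices rise
demand_fixed = base_demand
demand_responsive = base_demand * (1.0 + price_rise) ** elasticity

p_deficit_fixed = np.mean(supply < demand_fixed)
p_deficit_responsive = np.mean(supply < demand_responsive)
print(f"P(deficit), inelastic demand:  {p_deficit_fixed:.3f}")
print(f"P(deficit), responsive demand: {p_deficit_responsive:.3f}")
```

The gap between the two probabilities is the bias the authors attribute to ignoring demand response.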

  13. Probabilistic graphical models to deal with age estimation of living persons.

    Science.gov (United States)

    Sironi, Emanuele; Gallidabino, Matteo; Weyermann, Céline; Taroni, Franco

    2016-03-01

    Due to the rise of criminal, civil and administrative judicial situations involving people lacking valid identity documents, age estimation of living persons has become an important operational procedure for numerous forensic and medicolegal services worldwide. The chronological age of a given person is generally estimated from the observed degree of maturity of some selected physical attributes by means of statistical methods. However, their application in the forensic framework suffers from some conceptual and practical drawbacks, as recently claimed in the specialised literature. The aim of this paper is therefore to offer an alternative solution for overcoming these limits, by reiterating the utility of a probabilistic Bayesian approach for age estimation. This approach allows one to deal in a transparent way with the uncertainty surrounding the age estimation process and to produce all the relevant information in the form of posterior probability distribution about the chronological age of the person under investigation. Furthermore, this probability distribution can also be used for evaluating in a coherent way the possibility that the examined individual is younger or older than a given legal age threshold having a particular legal interest. The main novelty introduced by this work is the development of a probabilistic graphical model, i.e. a Bayesian network, for dealing with the problem at hand. The use of this kind of probabilistic tool can significantly facilitate the application of the proposed methodology: examples are presented based on data related to the ossification status of the medial clavicular epiphysis. The reliability and the advantages of this probabilistic tool are presented and discussed.
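The core of the proposed approach, posterior probability over chronological age and the probability of exceeding a legal threshold, can be sketched with a discrete toy model. The ages, uniform prior and logistic likelihood below are invented for illustration and are not forensic reference data for the clavicular epiphysis.

```python
import numpy as np

# Toy Bayesian age estimation from one ordinal maturity indicator
ages = np.arange(15, 26)                      # candidate chronological ages
prior = np.full(ages.size, 1.0 / ages.size)   # uniform prior (illustrative)

# P(indicator observed as "fused" | age): maturity more likely at older ages
p_fused_given_age = 1.0 / (1.0 + np.exp(-(ages - 19)))

# Bayes' rule: posterior over age given the observation "fused"
posterior = prior * p_fused_given_age
posterior /= posterior.sum()

# Probability that the person exceeds a legal age threshold of interest
p_adult = posterior[ages >= 18].sum()
print(f"P(age >= 18 | fused epiphysis) = {p_adult:.3f}")
```

In the paper this computation is carried out within a Bayesian network, which lets several indicators and their dependencies be combined coherently; the toy model above uses a single indicator only.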

  14. Development of Advanced Non-LOCA Analysis Methodology for Licensing

    International Nuclear Information System (INIS)

    Jang, Chansu; Um, Kilsup; Choi, Jaedon

    2008-01-01

    KNF is developing a new design methodology for Non-LOCA licensing analysis. The code chosen is the best-estimate transient analysis code RETRAN, and the OPR1000 is the target plant. For this purpose, KNF prepared a simple nodal scheme appropriate for licensing analyses and developed the designer-friendly analysis tool ASSIST (Automatic Steady-State Initialization and Safety analysis Tool). To check the validity of the newly developed methodology, the single CEA withdrawal and locked rotor accidents were analyzed using the new methodology and compared with current design results. The comparison shows good agreement, and it is concluded that the new design methodology can be applied to the licensing calculations for OPR1000 Non-LOCA

  15. LWR design decision methodology. Phase III. Final report

    International Nuclear Information System (INIS)

    Bertucio, R.; Held, J.; Lainoff, S.; Leahy, T.; Prather, W.; Rees, D.; Young, J.

    1982-01-01

    Traditionally, management decisions regarding design options have been made using quantitative cost information and qualitative safety information. A Design Decision Methodology, which utilizes probabilistic risk assessment techniques, including event trees and fault trees, along with systems engineering and standard cost estimation methods, has been developed so that a quantitative safety measure may be obtained as well. The report documents the development of this Design Decision Methodology, a demonstration of the methodology on a current licensing issue with the cooperation of the Washington Public Power Supply System (WPPSS), and a discussion of how the results of the demonstration may be used in addressing the various issues associated with a licensing position on the issue
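The fault-tree side of such a quantitative safety measure reduces, in its simplest form, to combining basic-event probabilities through AND and OR gates. The event probabilities and the two-train layout below are invented for illustration, not taken from the report.

```python
# Hypothetical basic-event probabilities (per demand)
p_pump_fails = 1e-3
p_valve_fails = 5e-4
p_operator_error = 1e-2

def or_gate(*probs):
    """Probability that at least one independent input event occurs."""
    q = 1.0
    for p in probs:
        q *= (1.0 - p)
    return 1.0 - q

def and_gate(*probs):
    """Probability that all independent input events occur."""
    q = 1.0
    for p in probs:
        q *= p
    return q

p_train = or_gate(p_pump_fails, p_valve_fails)     # one train fails
p_system = or_gate(and_gate(p_train, p_train),     # both redundant trains fail
                   p_operator_error)               # or an operator error occurs
print(f"top-event probability ≈ {p_system:.2e}")
```

In this invented example the operator error dominates the top event, the kind of insight a quantitative safety measure adds to a cost-only comparison of design options.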

  16. Socioecological Aspects of High-rise Construction

    Science.gov (United States)

    Eichner, Michael; Ivanova, Zinaida

    2018-03-01

    In this article, the authors consider the socioecological problems that arise in the construction and operation of high-rise buildings. They examine different points of view on high-rise construction and note that approaches to this problem differ widely. They also analyse projects by modern architects and the attempts made to overcome negative impacts on nature and mankind. The article contains materials from sociological research confirming the ambivalent attitude of the urban population to high-rise buildings. In addition, a sociological survey by one of the authors reveals the level of environmental preparedness of university students studying in the field of "Construction of unique buildings and structures", raising the question of how ready future specialists are to take socioecological problems into account. The authors' conclusion: the construction of high-rise buildings is associated with huge social and environmental risks and negative impacts on the biosphere and human health. This requires future specialists to have deeper knowledge of sustainable design methods and environmentally friendly construction technologies. Professor M. Eichner presents in the article the results of his case study project on the implementation of holistic eco-sustainable construction principles for a mixed-use high-rise building in the metropolis of Cairo.

  17. [Statistical (Poisson) motor unit number estimation. Methodological aspects and normal results in the extensor digitorum brevis muscle of healthy subjects].

    Science.gov (United States)

    Murga Oporto, L; Menéndez-de León, C; Bauzano Poley, E; Núñez-Castaín, M J

    Among the different techniques for motor unit number estimation (MUNE) is the statistical (Poisson) one, in which motor units are activated by electrical stimulation and the estimate is obtained by means of a statistical analysis based on the Poisson distribution. This study was undertaken to provide an approximation to the Poisson MUNE technique, giving a comprehensible view of its methodology, and to obtain normal values for the extensor digitorum brevis muscle (EDB) in a healthy population. One hundred fourteen normal volunteers, with ages ranging from 10 to 88 years, were studied using the MUNE software contained in a Viking IV system. The subjects were divided into two age groups (10-59 and 60-88 years). The EDB MUNE for the whole sample was 184 ± 49. Both the MUNE and the amplitude of the compound muscle action potential (CMAP) were significantly lower in the older age group. The MUNE correlated better with age than the CMAP amplitude did (0.5002 and 0.4142, respectively), consistent with the physiology of the motor unit. The value of MUNE correlates better with the neuromuscular aging process than CMAP amplitude does.
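The statistical idea behind Poisson MUNE can be sketched in simulation: if the number of motor units activated by a fixed submaximal stimulus fluctuates approximately as a Poisson variable, the variance-to-mean ratio of the response amplitudes estimates the mean single-unit amplitude, and dividing the maximal CMAP by that estimate gives the MUNE. The unit size, unit count and trial count below are invented, not the study's parameters.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated statistical (Poisson) MUNE
true_units = 180                 # assumed true number of motor units
unit_amplitude_mv = 0.05         # assumed mean single-unit amplitude (mV)
mean_activated = 8.0             # mean units firing per submaximal stimulus

n_units = rng.poisson(mean_activated, 300)       # units firing on each trial
responses_mv = n_units * unit_amplitude_mv       # scan of response amplitudes

# Poisson property: variance/mean of the responses estimates the unit size
est_unit_mv = responses_mv.var() / responses_mv.mean()
max_cmap_mv = true_units * unit_amplitude_mv     # maximal CMAP amplitude
mune = max_cmap_mv / est_unit_mv
print(f"estimated single-unit amplitude: {est_unit_mv * 1000:.1f} µV")
print(f"MUNE ≈ {mune:.0f} (true value {true_units})")
```

Real recordings add single-unit amplitude variability and measurement noise, which is part of why clinical estimates carry the spread reported in the abstract.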

  18. Organic matter content and particle size modifications in mangrove sediments as responses to sea level rise.

    Science.gov (United States)

    Sanders, Christian J; Smoak, Joseph M; Waters, Mathew N; Sanders, Luciana M; Brandini, Nilva; Patchineelam, Sambasiva R

    2012-06-01

    Mangrove sediments contain large reservoirs of organic material (OM), as mangrove ecosystems produce large quantities of OM and bury it rapidly. Sediment accumulation rates of approximately 2.0 mm year(-1), based on (210)Pb(ex) dating, were estimated at the margins of two well-developed mangrove forests in southern Brazil. Regional data point to a relative sea-level (RSL) rise of up to ∼4.0 mm year(-1). This RSL rise, in turn, may directly influence the origin and quantity of OM deposited along mangrove sediments. Lithostratigraphic changes show that sand deposition is replacing the mud (<63 μm) fraction and that OM content is decreasing in successively younger sediments. Sediment accumulation in coastal areas that is not keeping pace with sea-level rise is potentially conducive to the observed shifts in particle size and OM content. Copyright © 2012 Elsevier Ltd. All rights reserved.

  19. Potential impact of predicted sea level rise on carbon sink function of mangrove ecosystems with special reference to Negombo estuary, Sri Lanka

    Science.gov (United States)

    Perera, K. A. R. S.; De Silva, K. H. W. L.; Amarasinghe, M. D.

    2018-02-01

    Its unique location at the land-sea interface makes mangrove ecosystems most vulnerable to the impacts of the sea level rise predicted to result from increasing anthropogenic CO2 emissions. Among these impacts, the carbon sink function of these tropical ecosystems, which helps reduce rising atmospheric CO2 and temperature, could potentially be affected most. The present study was undertaken to explore the extent of the impact of the predicted regional sea level rise on the total organic carbon (TOC) pools of the mangrove ecosystems in Negombo estuary, located on the west coast of Sri Lanka. The extents of coastal inundation under the minimum (0.09 m) and maximum (0.88 m) IPCC sea level rise scenarios for 2100 and an intermediate level of 0.48 m were determined with GIS tools. The estimated total capacity of organic carbon retention by these mangrove areas was 499.45 Mg C ha-1, of which 84% (418.98 Mg C ha-1) is sequestered in the mangrove soil and 16% (80.56 Mg C ha-1) in the vegetation. The total extent of land area potentially affected by inundation under the lowest sea level rise scenario was 218.9 ha, while it was 476.2 ha under the intermediate rise and 696.0 ha with the predicted maximum sea level rise. The estimated rate of loss of carbon sink function due to inundation by a sea level rise of 0.09 m is 6.30 Mg C ha-1 y-1, while the intermediate sea level rise indicated a loss of 9.92 Mg C ha-1 y-1; under the maximum sea level rise scenario, this loss further increases up to 11.32 Mg C ha-1 y-1. Adaptation of mangrove plants to withstand inundation, landward migration and escalated photosynthetic rates, augmented by changing rainfall patterns and availability of nutrients, may help reduce the rate of loss of the carbon sink function of these mangrove ecosystems. 
Predictions of change in the carbon sequestration function of mangroves in Negombo estuary reveal that it is affected not only by oceanographic and hydrological alterations associated with sea level rise but also by anthropogenic
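The per-hectare loss rates in the abstract scale directly with the inundated areas, so a quick back-of-the-envelope check of the total annual carbon-sink loss under each scenario uses only the figures quoted there:

```python
# Scaling the per-hectare loss rates reported in the abstract to the
# corresponding inundated areas gives the total annual loss of carbon-sink
# capacity for each scenario (simple arithmetic on the paper's own figures).
scenarios = {
    # sea-level rise (m): (inundated area, ha; loss rate, Mg C ha-1 y-1)
    0.09: (218.9, 6.30),
    0.48: (476.2, 9.92),
    0.88: (696.0, 11.32),
}

totals = {slr: area_ha * rate for slr, (area_ha, rate) in scenarios.items()}
for slr, total in totals.items():
    print(f"SLR {slr:.2f} m: about {total:,.0f} Mg C of sink capacity lost per year")
```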

  20. Utility of Capture-Recapture Methodology to Estimate Prevalence of Congenital Heart Defects Among Adolescents in 11 New York State Counties: 2008 to 2010.

    Science.gov (United States)

    Akkaya-Hocagil, Tugba; Hsu, Wan-Hsiang; Sommerhalter, Kristin; McGarry, Claire; Van Zutphen, Alissa

    2017-11-01

    Congenital heart defects (CHDs) are the most common birth defects in the United States, and the population of individuals living with CHDs is growing. Though CHD prevalence in infancy has been well characterized, better prevalence estimates among children and adolescents in the United States are still needed. We used capture-recapture methods to estimate CHD prevalence among adolescents residing in 11 New York counties. The three data sources used for analysis included Statewide Planning and Research Cooperative System (SPARCS) hospital inpatient records, SPARCS outpatient records, and medical records provided by seven pediatric congenital cardiac clinics from 2008 to 2010. Bayesian log-linear models were fit using the R package Conting to account for dataset dependencies and heterogeneous catchability. A total of 2537 adolescent CHD cases were captured in our three data sources. Forty-four cases were identified in all data sources, 283 cases were identified in two of three data sources, and 2210 cases were identified in a single data source. The final model yielded an estimated total adolescent CHD population of 3845, indicating that 66% of the cases in the catchment area were identified in the case-identifying data sources. Based on 2010 Census estimates, we estimated adolescent CHD prevalence as 6.4 CHD cases per 1000 adolescents (95% confidence interval: 6.2-6.6). We used capture-recapture methodology with a population-based surveillance system in New York to estimate CHD prevalence among adolescents. Future research incorporating additional data sources may improve prevalence estimates in this population. Birth Defects Research 109:1423-1429, 2017. © 2017 Wiley Periodicals, Inc.
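The intuition behind capture-recapture can be shown with the simplest two-source case. The paper itself fits Bayesian log-linear models to three sources, which handle source dependencies; the sketch below uses Chapman's variant of the Lincoln-Petersen estimator with invented counts, and assumes independent sources.

```python
# Two-source capture-recapture sketch (invented counts, independence assumed)
n1 = 1200      # cases found in source A (e.g. inpatient records)
n2 = 900       # cases found in source B (e.g. clinic records)
m = 300        # cases found in both sources

# Chapman's nearly unbiased variant of the Lincoln-Petersen estimator
n_hat = (n1 + 1) * (n2 + 1) / (m + 1) - 1
print(f"estimated total cases: {n_hat:.0f}")

# Implied ascertainment: fraction of all cases captured by A or B combined
captured = n1 + n2 - m
print(f"estimated completeness: {100 * captured / n_hat:.0f}%")
```

With three or more sources, log-linear models generalise this idea and can relax the independence assumption, which is why the authors used them.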

  1. Climate change trade measures : estimating industry effects

    Science.gov (United States)

    2009-06-01

    Estimating the potential effects of domestic emissions pricing for industries in the United States is complex. If the United States were to regulate greenhouse gas emissions, production costs could rise for certain industries and could cause output, ...

  2. Statistical analysis of the acceleration of Baltic mean sea-level rise, 1900-2012

    Directory of Open Access Journals (Sweden)

    Birgit Hünicke

    2016-07-01

    Full Text Available We analyse annual mean sea-level records from tide-gauges located in the Baltic and parts of the North Sea with the aim of detecting an acceleration of sea-level rise over the 20th and 21st centuries. The acceleration is estimated as (1) a fit to a polynomial of order two in time, (2) a long-term linear increase in the rates computed over gliding overlapping decadal time segments, and (3) a long-term increase of the annual increments of sea level. Estimation methods (1) and (2) prove to be more powerful in detecting acceleration when tested with sea-level records produced in global climate model simulations. Applied to the Baltic Sea tide-gauges, however, these methods are not powerful enough to detect a significant acceleration in most of the individual records, although most estimated accelerations are positive. This lack of detection of statistically significant acceleration at the individual tide-gauge level may be due to the high level of local noise and not necessarily to the absence of acceleration. The estimated accelerations tend to be stronger in the north and east of the Baltic Sea. Two hypotheses to explain this spatial pattern have been explored. One is that the pattern reflects the slow-down of the Glacial Isostatic Adjustment; however, a simple estimation of this effect suggests that the slow-down cannot explain the estimated acceleration. The second hypothesis is related to the diminishing sea-ice cover over the 20th century. The melting of less saline and colder sea-ice can lead to changes in sea level. Also, the melting of sea-ice can reduce the number of missing values in the tide-gauge records in winter, potentially influencing the estimated trends and acceleration of seasonal mean sea level. This hypothesis cannot be ascertained either, since the spatial patterns of acceleration computed for winter and summer separately are very similar. 
The all-station-average-record displays an
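Estimation method (1), fitting a second-order polynomial in time and reading the acceleration off the quadratic coefficient, can be sketched on synthetic data. The series length matches the 1900-2012 study period, but the trend, acceleration and noise level are invented:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic annual mean sea-level series, 1900-2012
years = np.arange(1900, 2013)
t = years - years.mean()                    # centred time axis
true_trend, true_accel = 1.5, 0.01          # mm/yr and mm/yr^2 (invented)
sea_level = (true_trend * t + 0.5 * true_accel * t**2
             + rng.normal(0.0, 20.0, t.size))   # invented local noise

# Fit a polynomial of order two; acceleration = 2 * quadratic coefficient
c2, c1, c0 = np.polyfit(t, sea_level, 2)
print(f"estimated rate:         {c1:.2f} mm/yr")
print(f"estimated acceleration: {2 * c2:.4f} mm/yr^2")
```

Rerunning with larger noise standard deviations shows how easily a real but small acceleration slips below statistical significance, which mirrors the detection problem reported for the individual Baltic tide-gauge records.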

  3. The dichotomous response of flood and storm extremes to rising global temperatures

    Science.gov (United States)

    Sharma, A.; Wasko, C.

    2017-12-01

    Rising temperatures have resulted in increases in short-duration rainfall extremes across the world. Additionally, it has been shown (doi:10.1038/ngeo2456) that storms will intensify, causing derived flood peaks to rise even more. This leads us to speculate that flood peaks will increase as a result, complying with the storyline presented in past IPCC reports. This talk, however, shows that changes in flood extremes are much more complex. Using global data on extreme flow events, the study conclusively shows that while the very extreme floods may be rising as a result of storm intensification, the more frequent flood events are decreasing in magnitude. The study argues that changes in the magnitude of floods are a function of changes in storm patterns as well as of pre-storm, or antecedent, conditions. It goes on to show that while changes in storms dominate for the most extreme events and over smaller, more urbanised catchments, changes in pre-storm conditions are the driving factor in modulating flood peaks in large rural catchments. The study concludes by providing recommendations on how future flood design should proceed, arguing that current practices (of using a design storm to estimate floods) are flawed and need changing.

  4. Rising food costs & global food security: Key issues & relevance for India

    Science.gov (United States)

    Gustafson, Daniel J.

    2013-01-01

    Rising food costs can have major impact on vulnerable households, pushing those least able to cope further into poverty and hunger. On the other hand, provided appropriate policies and infrastructure are in place, higher agricultural prices can also raise farmers’ incomes and rural wages, improve rural economies and stimulate investment for longer-term economic growth. High food prices since 2007 have had both short-term impacts and long-term consequences, both good and bad. This article reviews the evidence of how rising costs have affected global food security since the food price crisis of 2007-2008, and their impact on different categories of households and countries. In light of recent studies, we know more about how households, and countries, cope or not with food price shocks but a number of contentious issues remain. These include the adequacy of current estimates and the interpretation of national and household food and nutrition security indicators. India is a particularly important country in this regard, given the high number of food insecure, the relative weight of India in global estimates of food and nutrition insecurity, and the puzzles that remain concerning the country's reported declining per capita calorie consumption. Competing explanations for what is behind it are not in agreement, but these all point to the importance of policy and programme innovation and greater investment necessary to reach the achievable goal of food and nutrition security for all. PMID:24135190

  5. Rising food costs & global food security: Key issues & relevance for India

    Directory of Open Access Journals (Sweden)

    Daniel J Gustafson

    2013-01-01

    Full Text Available Rising food costs can have major impact on vulnerable households, pushing those least able to cope further into poverty and hunger. On the other hand, provided appropriate policies and infrastructure are in place, higher agricultural prices can also raise farmers' incomes and rural wages, improve rural economies and stimulate investment for longer-term economic growth. High food prices since 2007 have had both short-term impacts and long-term consequences, both good and bad. This article reviews the evidence of how rising costs have affected global food security since the food price crisis of 2007-2008, and their impact on different categories of households and countries. In light of recent studies, we know more about how households, and countries, cope or not with food price shocks but a number of contentious issues remain. These include the adequacy of current estimates and the interpretation of national and household food and nutrition security indicators. India is a particularly important country in this regard, given the high number of food insecure, the relative weight of India in global estimates of food and nutrition insecurity, and the puzzles that remain concerning the country's reported declining per capita calorie consumption. Competing explanations for what is behind it are not in agreement, but these all point to the importance of policy and programme innovation and greater investment necessary to reach the achievable goal of food and nutrition security for all.

  6. Rising food costs & global food security: key issues & relevance for India.

    Science.gov (United States)

    Gustafson, Daniel J

    2013-09-01

    Rising food costs can have major impact on vulnerable households, pushing those least able to cope further into poverty and hunger. On the other hand, provided appropriate policies and infrastructure are in place, higher agricultural prices can also raise farmers' incomes and rural wages, improve rural economies and stimulate investment for longer-term economic growth. High food prices since 2007 have had both short-term impacts and long-term consequences, both good and bad. This article reviews the evidence of how rising costs have affected global food security since the food price crisis of 2007-2008, and their impact on different categories of households and countries. In light of recent studies, we know more about how households, and countries, cope or not with food price shocks but a number of contentious issues remain. These include the adequacy of current estimates and the interpretation of national and household food and nutrition security indicators. India is a particularly important country in this regard, given the high number of food insecure, the relative weight of India in global estimates of food and nutrition insecurity, and the puzzles that remain concerning the country's reported declining per capita calorie consumption. Competing explanations for what is behind it are not in agreement, but these all point to the importance of policy and programme innovation and greater investment necessary to reach the achievable goal of food and nutrition security for all.

  7. A methodology for radiological accidents analysis in industrial gamma radiography

    International Nuclear Information System (INIS)

    Silva, F.C.A. da.

    1990-01-01

    A critical review of 34 published severe radiological accidents in industrial gamma radiography that occurred in 15 countries from 1960 to 1988 was performed. The most frequent causes, consequences and dose estimation methods were analysed, with the aim of establishing better radiation safety procedures and accident analyses. The objective of this work is to elaborate a methodology for analysing radiological accidents in industrial gamma radiography. The suggested methodology will enable professionals to determine the true causes of an event and to estimate the dose with good certainty. The technical analytical tree, recommended by the International Atomic Energy Agency for radiation protection and nuclear safety programmes, was adopted in elaborating the suggested methodology. The viability of using the Electron Gamma Shower 4 computer code system to calculate the absorbed dose in radiological accidents in industrial gamma radiography, mainly in 192Ir radioactive source handling situations, was also studied. (author)

  8. Intersystem LOCA risk assessment: methodology and results

    International Nuclear Information System (INIS)

    Galyean, W.J.; Kelly, D.L.; Schroeder, J.A.; Auflick, L.J.; Blackman, H.S.; Gertman, D.I.; Hanley, L.N.

    1994-01-01

    The United States Nuclear Regulatory Commission is sponsoring a research program to develop an improved understanding of the human factors, hardware and accident consequence issues that dominate the risk from an intersystem loss-of-coolant accident (ISLOCA) at a nuclear power plant. To accomplish the goals of this program, a methodology has been developed for estimating ISLOCA core damage frequency and risk. The steps in this methodology are briefly described, along with the results obtained from an application of the methodology at three pressurized water reactors. Also included are the results of a screening study of boiling water reactors. (orig.)

  9. Physical data generation methodology for return-to-power steam line break analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zee, Sung Kyun; Lee, Chung Chan; Lee, Chang Kue [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1996-02-01

    The current methodology for generating physics data for steam line break accident analysis of CE-type nuclear plants, such as Yonggwang Unit 3, is valid only if the core does not reach criticality after shutdown. The methodology therefore requires a tremendous amount of net scram worth, especially at the end of the cycle when the moderator temperature coefficient is most negative. A new methodology is thus needed to obtain reasonably conservative physics data when the reactor returns to a power condition. The current methodology uses ROCS, which includes only a closed channel model, but it is well known that the closed channel model estimates the core reactivity as too negative when the core flow rate is low. Therefore, a conservative methodology is presented which utilizes an open channel 3D HERMITE model. A return-to-power reactivity credit is produced to supplement the reactivity table generated by the closed channel model. Other data include the hot channel axial power shape, peaking factor and maximum quality for DNBR analysis, as well as a pin census for radiological consequence analysis. 48 figs., 22 tabs., 18 refs. (Author)

  10. RAMA Methodology for the Calculation of Neutron Fluence

    International Nuclear Information System (INIS)

    Villescas, G.; Corchon, F.

    2013-01-01

    The neutron fluence plays an important role in the study of the structural integrity of the reactor vessel after a certain time of neutron irradiation. The NRC defined in Regulatory Guide 1.190 how the neutron fluence must be estimated, including an uncertainty analysis of the validation process (the accepted uncertainty is ≤ 20%). TRANSWARE Enterprises Inc. developed a methodology for calculating the neutron flux based on Guide 1.190, known as RAMA. Uncertainty values obtained with this methodology, for about 18 vessels, are less than 10%.

  11. Support vector regression methodology for estimating global solar radiation in Algeria

    Science.gov (United States)

    Guermoui, Mawloud; Rabehi, Abdelaziz; Gairaa, Kacem; Benkaciali, Said

    2018-01-01

    Accurate estimation of Daily Global Solar Radiation (DGSR) has been a major goal for solar energy applications. In this paper we show the possibility of developing a simple model based on Support Vector Regression (SVM-R) that could be used to estimate DGSR on a horizontal surface in Algeria using only the sunshine ratio as input. The SVM model has been developed and tested using a data set recorded over three years (2005-2007). The data were collected at the Applied Research Unit for Renewable Energies (URAER) in Ghardaïa city. The data collected between 2005 and 2006 are used to train the model, while the 2007 data are used to test the performance of the selected model. The measured and estimated values of DGSR were compared statistically during the testing phase using the Root Mean Square Error (RMSE), relative Root Mean Square Error (rRMSE), and correlation coefficient (r²), which amounted to 1.59 MJ/m², 8.46% and 97.4%, respectively. The obtained results show that the SVM-R is highly qualified for DGSR estimation using only the sunshine ratio.
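
    The SVM-R workflow described above (train on sunshine ratio, score with RMSE, rRMSE and r²) can be sketched with scikit-learn. The synthetic data below are an assumption standing in for the URAER measurements, which are not reproduced here; this is an illustration of the approach, not the authors' code.

```python
# Illustrative sketch of the SVM-R approach (not the authors' code):
# synthetic sunshine-ratio data stand in for the URAER measurements.
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
sunshine_ratio = rng.uniform(0.2, 1.0, 300)                    # daily S/S0
dgsr = 5.0 + 25.0 * sunshine_ratio + rng.normal(0, 1.5, 300)   # MJ/m^2

# Chronological split, mimicking train on 2005-2006 / test on 2007
X_train, y_train = sunshine_ratio[:200, None], dgsr[:200]
X_test, y_test = sunshine_ratio[200:, None], dgsr[200:]

model = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X_train, y_train)
pred = model.predict(X_test)

rmse = float(np.sqrt(mean_squared_error(y_test, pred)))
rrmse = 100.0 * rmse / y_test.mean()           # relative RMSE, %
r2 = np.corrcoef(y_test, pred)[0, 1] ** 2      # squared correlation
print(round(rmse, 2), round(rrmse, 2), round(r2, 3))
```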

  12. Allowances for evolving coastal flood risk under uncertain local sea-level rise

    Science.gov (United States)

    Buchanan, M. K.; Kopp, R. E.; Oppenheimer, M.; Tebaldi, C.

    2015-12-01

    Sea-level rise (SLR) causes estimates of flood risk made under the assumption of stationary mean sea level to be biased low. However, adjustments to flood return levels made assuming fixed increases of sea level are also inaccurate when applied to sea level that is rising over time at an uncertain rate. To accommodate both the temporal dynamics of SLR and their uncertainty, we develop an Average Annual Design Life Level (AADLL) metric and associated SLR allowances [1,2]. The AADLL is the flood level corresponding to a time-integrated annual expected probability of occurrence (AEP) under uncertainty over the lifetime of an asset; AADLL allowances are the adjustment from 2000 levels that maintain current risk. Given non-stationary and uncertain SLR, AADLL flood levels and allowances provide estimates of flood protection heights and offsets for different planning horizons and different levels of confidence in SLR projections in coastal areas. Allowances are a function primarily of local SLR and are nearly independent of AEP. Here we employ probabilistic SLR projections [3] to illustrate the calculation of AADLL flood levels and allowances with a representative set of long-duration tide gauges along U.S. coastlines. [1] Rootzen et al., 2014, Water Resources Research 49: 5964-5972. [2] Hunter, 2013, Ocean Engineering 71: 17-27. [3] Kopp et al., 2014, Earth's Future 2: 383-406.
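
    Hunter's allowance [2], which the AADLL approach generalizes, has a compact closed form when extreme water levels are Gumbel distributed and the sea-level rise is normally distributed; a minimal sketch with assumed numbers (our illustration, not the authors' code):

```python
# Minimal sketch of a Hunter-style SLR allowance (reference [2]):
# the height offset that preserves the expected flood exceedance
# frequency when the rise is N(mu, sigma) and extreme water levels
# follow a Gumbel distribution with scale parameter lam.
def slr_allowance(mu, sigma, lam):
    """Allowance = mean rise + variance penalty (Hunter 2013)."""
    return mu + sigma ** 2 / (2.0 * lam)

# Assumed example: 0.5 m mean rise, 0.2 m std dev, 0.1 m Gumbel scale
print(round(slr_allowance(0.5, 0.2, 0.1), 3))  # -> 0.7
```

    Note the allowance depends on the rise distribution and the Gumbel scale but not on the chosen AEP, consistent with the near-independence from AEP reported above.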

  13. A molecular timescale of eukaryote evolution and the rise of complex multicellular life

    Science.gov (United States)

    Hedges, S. Blair; Blair, Jaime E.; Venturi, Maria L.; Shoe, Jason L.

    2004-01-01

    BACKGROUND: The pattern and timing of the rise in complex multicellular life during Earth's history has not been established. Great disparity persists between the pattern suggested by the fossil record and that estimated by molecular clocks, especially for plants, animals, fungi, and the deepest branches of the eukaryote tree. Here, we used all available protein sequence data and molecular clock methods to place constraints on the increase in complexity through time. RESULTS: Our phylogenetic analyses revealed that (i) animals are more closely related to fungi than to plants, (ii) red algae are closer to plants than to animals or fungi, (iii) choanoflagellates are closer to animals than to fungi or plants, (iv) diplomonads, euglenozoans, and alveolates each are basal to plants+animals+fungi, and (v) diplomonads are basal to other eukaryotes (including alveolates and euglenozoans). Divergence times were estimated from global and local clock methods using 20-188 proteins per node, with data treated separately (multigene) and concatenated (supergene). Different time estimation methods yielded similar results (within 5%): vertebrate-arthropod (964 million years ago, Ma), Cnidaria-Bilateria (1,298 Ma), Porifera-Eumetozoa (1,351 Ma), Pyrenomycetes-Plectomycetes (551 Ma), Candida-Saccharomyces (723 Ma), Hemiascomycetes-filamentous Ascomycota (982 Ma), Basidiomycota-Ascomycota (968 Ma), Mucorales-Basidiomycota (947 Ma), Fungi-Animalia (1,513 Ma), mosses-vascular plants (707 Ma), Chlorophyta-Tracheophyta (968 Ma), Rhodophyta-Chlorophyta+Embryophyta (1,428 Ma), Plantae-Animalia (1,609 Ma), Alveolata-plants+animals+fungi (1,973 Ma), Euglenozoa-plants+animals+fungi (1,961 Ma), and Giardia-plants+animals+fungi (2,309 Ma). By extrapolation, mitochondria arose approximately 2300-1800 Ma and plastids arose 1600-1500 Ma. Estimates of the maximum number of cell types of common ancestors, combined with divergence times, showed an increase from two cell types at 2500 Ma to

  14. A molecular timescale of eukaryote evolution and the rise of complex multicellular life

    Directory of Open Access Journals (Sweden)

    Venturi Maria L

    2004-01-01

    Full Text Available Abstract Background The pattern and timing of the rise in complex multicellular life during Earth's history has not been established. Great disparity persists between the pattern suggested by the fossil record and that estimated by molecular clocks, especially for plants, animals, fungi, and the deepest branches of the eukaryote tree. Here, we used all available protein sequence data and molecular clock methods to place constraints on the increase in complexity through time. Results Our phylogenetic analyses revealed that (i) animals are more closely related to fungi than to plants, (ii) red algae are closer to plants than to animals or fungi, (iii) choanoflagellates are closer to animals than to fungi or plants, (iv) diplomonads, euglenozoans, and alveolates each are basal to plants+animals+fungi, and (v) diplomonads are basal to other eukaryotes (including alveolates and euglenozoans). Divergence times were estimated from global and local clock methods using 20–188 proteins per node, with data treated separately (multigene) and concatenated (supergene). Different time estimation methods yielded similar results (within 5%): vertebrate-arthropod (964 million years ago, Ma), Cnidaria-Bilateria (1,298 Ma), Porifera-Eumetozoa (1,351 Ma), Pyrenomycetes-Plectomycetes (551 Ma), Candida-Saccharomyces (723 Ma), Hemiascomycetes-filamentous Ascomycota (982 Ma), Basidiomycota-Ascomycota (968 Ma), Mucorales-Basidiomycota (947 Ma), Fungi-Animalia (1,513 Ma), mosses-vascular plants (707 Ma), Chlorophyta-Tracheophyta (968 Ma), Rhodophyta-Chlorophyta+Embryophyta (1,428 Ma), Plantae-Animalia (1,609 Ma), Alveolata-plants+animals+fungi (1,973 Ma), Euglenozoa-plants+animals+fungi (1,961 Ma), and Giardia-plants+animals+fungi (2,309 Ma). By extrapolation, mitochondria arose approximately 2300-1800 Ma and plastids arose 1600-1500 Ma. Estimates of the maximum number of cell types of common ancestors, combined with divergence times, showed an increase from two cell types at 2500 Ma to ~10

  15. A Methodology for the Estimation of the Wind Generator Economic Efficiency

    Science.gov (United States)

    Zaleskis, G.

    2017-12-01

    Integration of renewable energy sources and the improvement of the technological base may not only reduce the consumption of fossil fuel and environmental load, but also ensure the power supply in regions with difficult fuel delivery or power failures. The main goal of the research is to develop the methodology of evaluation of the wind turbine economic efficiency. The research has demonstrated that the electricity produced from renewable sources may be much more expensive than the electricity purchased from the conventional grid.

  16. Application of STORMTOOLS's simplified flood inundation model with sea level rise to assess impacts to RI coastal areas

    Science.gov (United States)

    Spaulding, M. L.

    2015-12-01

    The vision for STORMTOOLS is to provide access to a suite of coastal planning tools (numerical models and related tools), available as a web service, that allows widespread accessibility and applicability at high resolution for user-selected coastal areas of interest. The first product developed under this framework was a set of flood inundation maps, with and without sea level rise, for varying return periods for RI coastal waters. The flood mapping methodology is based on using the water level vs return period at a primary NOAA water level gauging station and then spatially scaling the values, based on the predictions of high resolution storm and wave simulations performed by the Army Corps of Engineers North Atlantic Comprehensive Coastal Study (NACCS) for tropical and extratropical storms on an unstructured grid, to estimate inundation levels for varying return periods. The scaling for the RI application used Newport, RI water levels as the reference point. Predictions are provided for once-in-25, 50, and 100 yr return periods (at the upper 95% confidence level), with sea level rises of 1, 2, 3, and 5 ft. Simulations have also been performed for historical hurricane events including 1938, Carol (1954), Bob (1991), and Sandy (2012), and for nuisance flooding events with return periods of 1, 3, 5, and 10 yr. Access to the flooding maps is via a web-based map viewer that seamlessly covers all coastal waters of the state at one meter resolution. The GIS structure of the map viewer allows overlays of additional relevant data sets (roads and highways, wastewater treatment facilities, schools, hospitals, emergency evacuation routes, etc.) as desired by the user. The simplified flooding maps are publicly available and are now being implemented for state and community resilience planning and vulnerability assessment activities in response to climate change impacts.

  17. A review of methodologies applied in Australian practice to evaluate long-term coastal adaptation options

    Directory of Open Access Journals (Sweden)

    Timothy David Ramm

    2017-01-01

    Full Text Available Rising sea levels have the potential to alter coastal flooding regimes around the world and local governments are beginning to consider how to manage uncertain coastal change. In doing so, there is increasing recognition that such change is deeply uncertain and unable to be reliably described with probabilities or a small number of scenarios. Characteristics of methodologies applied in Australian practice to evaluate long-term coastal adaptation options are reviewed and benchmarked against two state-of-the-art international methods suited for conditions of uncertainty (Robust Decision Making and Dynamic Adaptive Policy Pathways). Seven out of the ten Australian case studies assumed the uncertain parameters, such as sea level rise, could be described deterministically or stochastically when identifying risk and evaluating adaptation options across multi-decadal periods. This basis is not considered sophisticated enough for long-term decision-making, implying that Australian practice needs to increase the use of scenarios to explore a much larger uncertainty space when assessing the performance of adaptation options. Two Australian case studies mapped flexible adaptation pathways to manage uncertainty, and there remains an opportunity to incorporate quantitative methodologies to support the identification of risk thresholds. The contextual framing of risk, including the approach taken to identify risk (top-down or bottom-up) and the treatment of uncertain parameters, were found to be fundamental characteristics that influenced the methodology selected to evaluate adaptation options. The small sample of case studies available suggests that long-term coastal adaptation planning in Australia is in its infancy and there is a timely opportunity to guide local government towards robust methodologies for developing long-term coastal adaptation plans.

  18. Nuclear data evaluation methodology including estimates of covariances

    Directory of Open Access Journals (Sweden)

    Smith D.L.

    2010-10-01

    Full Text Available Evaluated nuclear data rather than raw experimental and theoretical information are employed in nuclear applications such as the design of nuclear energy systems. Therefore, the process by which such information is produced and ultimately used is of critical interest to the nuclear science community. This paper provides an overview of various contemporary methods employed to generate evaluated cross sections and related physical quantities such as particle emission angular distributions and energy spectra. The emphasis here is on data associated with neutron induced reaction processes, with consideration of the uncertainties in these data, and on the more recent evaluation methods, e.g., those that are based on stochastic (Monte Carlo) techniques. There is no unique way to perform such evaluations, nor are nuclear data evaluators united in their opinions as to which methods are superior to the others in various circumstances. In some cases it is not critical which approaches are used as long as there is consistency and proper use is made of the available physical information. However, in other instances there are definite advantages to using particular methods as opposed to other options. Some of these distinctions are discussed in this paper and suggestions are offered regarding fruitful areas for future research in the development of evaluation methodology.

  19. Methodology and analysis of production safety during Pu recycling at SSC RF RIAR

    International Nuclear Information System (INIS)

    Kirillovich, A.P.

    2000-01-01

    The methodology and criteria for estimating safety in technological processes of the nuclear fuel cycle (NFC) are proposed, substantiated and verified during the large-scale Pu recycling (500 kg). The comprehensive investigation results of the radiation-ecological situation are presented during pilot production of the mixed uranium-plutonium fuel and fuel assembly at SSC RF RIAR. The methodology and experimental data bank can be used while estimating safety in the industrial recycling of Pu and minor-actinides (Np, Am, Cm) in NFC. (author)

  20. Socioecological Aspects of High-rise Construction

    Directory of Open Access Journals (Sweden)

    Eichner Michael

    2018-01-01

    Full Text Available In this article, the authors consider the socioecological problems that arise in the construction and operation of high-rise buildings. They study different points of view on high-rise construction and note that the approaches to this problem vary widely. They also analyse projects of modern architects and the attempts made to overcome negative impacts on nature and mankind. The article contains materials of sociological research confirming the ambivalent attitude of the urban population to high-rise buildings. In addition, a sociological survey by one of the authors reveals the level of environmental preparedness of university students studying in the field of "Construction of unique buildings and structures", raising the question of how ready future specialists are to take socioecological problems into account. The authors conclude that the construction of high-rise buildings is associated with huge social and environmental risks and negative impacts on the biosphere and human health. This requires future specialists to have deeper skills in sustainable design methods and environmentally friendly construction technologies. Professor M. Eichner presents in the article the results of his case study project on implementation of holistic eco-sustainable construction principles for a mixed-use high-rise building in the metropolis of Cairo.

  1. Sea Level Rise Impacts on Wastewater Treatment Systems Along the U.S. Coasts

    Science.gov (United States)

    Hummel, Michelle A.; Berry, Matthew S.; Stacey, Mark T.

    2018-04-01

    As sea levels rise, coastal communities will experience more frequent and persistent nuisance flooding, and some low-lying areas may be permanently inundated. Critical components of lifeline infrastructure networks in these areas are also at risk of flooding, which could cause significant service disruptions that extend beyond the flooded zone. Thus, identifying critical infrastructure components that are exposed to sea level rise is an important first step in developing targeted investment in protective actions and enhancing the overall resilience of coastal communities. Wastewater treatment plants are typically located at low elevations near the coastline to minimize the cost of collecting consumed water and discharging treated effluent, which makes them particularly susceptible to coastal flooding. For this analysis, we used geographic information systems to assess the exposure of wastewater infrastructure to various sea level rise projections at the national level. We then estimated the number of people who would lose wastewater services, which could be more than five times as high as previous predictions of the number of people at risk of direct flooding due to sea level rise. We also performed a regional comparison of wastewater exposure to marine and groundwater flooding in the San Francisco Bay Area. Overall, this analysis highlights the widespread exposure of wastewater infrastructure in the United States and demonstrates that local disruptions to infrastructure networks may have far-ranging impacts on areas that do not experience direct flooding.

  2. Sea level rise impacts on wastewater treatment systems along the U.S. coasts

    Science.gov (United States)

    Hummel, M.; Berry, M.; Stacey, M. T.

    2017-12-01

    As sea levels rise, coastal communities will experience more frequent and persistent nuisance flooding, and some low-lying areas may be permanently inundated. Critical components of lifeline infrastructure networks in these areas are also at risk of flooding, which could cause significant service disruptions that extend beyond the flooded zone. Thus, identifying critical infrastructure components that are vulnerable to sea level rise is an important first step in developing targeted investment in protective actions and enhancing the overall resilience of coastal communities. Wastewater treatment plants are typically located at low elevations near the coastline to minimize the cost of collecting consumed water and discharging treated effluent, which makes them particularly susceptible to coastal flooding. For this analysis, we used geographic information systems to assess the vulnerability of wastewater infrastructure to various sea level rise projections at the national level. We then estimated the number of people who would lose wastewater services, which could be more than three times as high as previous predictions of the number of people at risk of direct flooding due to sea level rise. We also considered several case studies of wastewater infrastructure in mid-sized cities to determine how topography and system configuration (centralized versus distributed) impact vulnerability. Overall, this analysis highlights the widespread vulnerability of wastewater infrastructure in the U.S. and demonstrates that local disruptions to infrastructure networks may have far-ranging impacts on areas that do not experience direct flooding.

  3. Coal resources available for development; a methodology and pilot study

    Science.gov (United States)

    Eggleston, Jane R.; Carter, M. Devereux; Cobb, James C.

    1990-01-01

    Coal accounts for a major portion of our Nation's energy supply in projections for the future. A demonstrated reserve base of more than 475 billion short tons, as the Department of Energy currently estimates, indicates that, on the basis of today's rate of consumption, the United States has enough coal to meet projected energy needs for almost 200 years. However, the traditional procedures used for estimating the demonstrated reserve base do not account for many environmental and technological restrictions placed on coal mining. A new methodology has been developed to determine the quantity of coal that might actually be available for mining under current and foreseeable conditions. This methodology is unique in its approach, because it applies restrictions to the coal resource before it is mined. Previous methodologies incorporated restrictions into the recovery factor (a percentage), which was then globally applied to the reserve (minable coal) tonnage to derive a recoverable coal tonnage. None of the previous methodologies define the restrictions and their area and amount of impact specifically. Because these restrictions and their impacts are defined in this new methodology, it is possible to achieve more accurate and specific assessments of available resources. This methodology has been tested in a cooperative project between the U.S. Geological Survey and the Kentucky Geological Survey on the Matewan 7.5-minute quadrangle in eastern Kentucky. Pertinent geologic, mining, land-use, and technological data were collected, assimilated, and plotted. The National Coal Resources Data System was used as the repository for data, and its geographic information system software was applied to these data to eliminate restricted coal and quantify that which is available for mining. This methodology does not consider recovery factors or the economic factors that would be considered by a company before mining. Results of the pilot study indicate that, of the estimated

  4. Rising Long-term Interest Rates

    DEFF Research Database (Denmark)

    Hallett, Andrew Hughes

    Rather than chronicle recent developments in European long-term interest rates as such, this paper assesses the impact of increases in those interest rates on economic performance and inflation. That puts us in a position to evaluate the economic pressures for further rises in those rates, the first question posed in this assignment, and the scope for overshooting (the second question), and then make some illustrative predictions of future interest rates in the euro area. We find a wide range of effects from rising interest rates, mostly small and mostly negative, focused on investment till the emerging European recovery is on a firmer basis and capable of overcoming increases in the cost of borrowing and shrinking fiscal space. There is also an implication that worries about rising/overshooting interest rates often reflect the fact that inflation risks are unequally distributed...

  5. Final report for sea-level rise response modeling for San Francisco Bay estuary tidal marshes

    Science.gov (United States)

    Takekawa, John Y.; Thorne, Karen M.; Buffington, Kevin J.; Spragens, Kyle A.; Swanson, Kathleen M.; Drexler, Judith Z.; Schoellhamer, David H.; Overton, Cory T.; Casazza, Michael L.

    2013-01-01

    The Intergovernmental Panel on Climate Change has identified coastal ecosystems as areas that will be disproportionally affected by climate change. Current sea-level rise projections range widely, from a 0.57 to 1.9 meter increase in mean sea level by 2100. The expected accelerated rate of sea-level rise through the 21st century will put many coastal ecosystems at risk, especially those in topographically low-gradient areas. We assessed marsh accretion and plant community state changes through 2100 at 12 tidal salt marshes around San Francisco Bay estuary with a sea-level rise response model. Detailed ground elevation, vegetation, and water level data were collected at all sites between 2008 and 2011 and used as model inputs. Sediment cores (taken by Callaway and others, 2012) at four sites around San Francisco Bay estuary were used to estimate accretion rates. A modification of the Callaway and others (1996) model, the Wetland Accretion Rate Model for Ecosystem Resilience (WARMER), was utilized to run sea-level rise response models for all sites. With a mean sea level rise of 1.24 m by 2100, WARMER projected that the vast majority, 95.8 percent (1,942 hectares), of the marsh area in our study will lose marsh plant communities by 2100 and transition to a relative elevation range consistent with mudflat habitat. Three marshes were projected to maintain marsh vegetation to 2100, but they composed only 4.2 percent (85 hectares) of the total marsh area surveyed.

  6. Fracture mechanics approach to estimate rail wear limits

    Science.gov (United States)

    2009-10-01

    This paper describes a systematic methodology to estimate allowable limits for rail head wear in terms of vertical head-height loss, gage-face side wear, and/or the combination of the two. This methodology is based on the principles of engineering fr...

  7. Study on pulsed-discharge devices with high current rising rate for point spot short-wavelength source in dense plasma observations

    International Nuclear Information System (INIS)

    Tachinami, Fumitaka; Anzai, Nobuyuki; Sasaki, Toru; Kikuchi, Takashi; Harada, Nob.

    2014-01-01

    A pulsed-power generator with a high current rise rate, based on a pulse forming network, was studied toward generating an intense point-spot X-ray source. To obtain a high rate of current rise, we designed a compact discharge device with low circuit inductance. The results indicate that the inductance of the compact discharge device was dominated by the gap switch inductance. To reduce the gap switch inductance and the operation voltage, the feasible gap switch inductance in the vacuum chamber was estimated by circuit simulation. The gap switch inductance can be reduced by lower pressure operation. This means that the designed discharge device achieves a rate of current rise of 10¹² A/s.
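
    The quoted rate of current rise can be sanity-checked with the usual inductor relation dI/dt = V/L; the voltage and inductance below are assumed round numbers for illustration, not values from the paper.

```python
# Back-of-envelope check (assumed numbers, not the authors'):
# for a discharge circuit the initial current rise rate is roughly
# dI/dt = V / L, so reaching ~1e12 A/s at tens of kilovolts demands
# a total circuit inductance of only tens of nanohenries.
V = 40e3    # charging voltage, volts (assumed)
L = 40e-9   # total circuit inductance, henries (assumed)
dI_dt = V / L
print(f"{dI_dt:.1e} A/s")  # -> 1.0e+12 A/s
```

    This is why the abstract emphasizes minimizing the gap switch inductance: it sets the floor on L and hence the ceiling on dI/dt.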

  8. The test-negative design for estimating influenza vaccine effectiveness.

    Science.gov (United States)

    Jackson, Michael L; Nelson, Jennifer C

    2013-04-19

    The test-negative design has emerged in recent years as the preferred method for estimating influenza vaccine effectiveness (VE) in observational studies. However, the methodologic basis of this design has not been formally developed. In this paper we develop the rationale and underlying assumptions of the test-negative study. Under the test-negative design for influenza VE, study subjects are all persons who seek care for an acute respiratory illness (ARI). All subjects are tested for influenza infection. Influenza VE is estimated from the ratio of the odds of vaccination among subjects testing positive for influenza to the odds of vaccination among subjects testing negative. With the assumptions that (a) the distribution of non-influenza causes of ARI does not vary by influenza vaccination status, and (b) VE does not vary by health care-seeking behavior, the VE estimate from the sample can be generalized to the full source population that gave rise to the study sample. Based on our derivation of this design, we show that test-negative studies of influenza VE can produce biased VE estimates if they include persons seeking care for ARI when influenza is not circulating or do not adjust for calendar time. The test-negative design is less susceptible to bias due to misclassification of infection and to confounding by health care-seeking behavior, relative to traditional case-control or cohort studies. The cost of the test-negative design is the additional, difficult-to-test assumptions that incidence of non-influenza respiratory infections is similar between vaccinated and unvaccinated groups within any stratum of care-seeking behavior, and that influenza VE does not vary across care-seeking strata. Copyright © 2013 Elsevier Ltd. All rights reserved.
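
    The estimator described above reduces to one minus an odds ratio computed from a two-by-two table of vaccination status by test result; a minimal sketch with hypothetical counts:

```python
# Test-negative VE estimate: VE = 1 - OR, where OR is the ratio of
# the odds of vaccination among test-positives to the odds among
# test-negatives. All counts below are hypothetical.
def ve_test_negative(vacc_pos, unvacc_pos, vacc_neg, unvacc_neg):
    odds_pos = vacc_pos / unvacc_pos   # odds of vaccination, flu-positive
    odds_neg = vacc_neg / unvacc_neg   # odds of vaccination, flu-negative
    return 1.0 - odds_pos / odds_neg

# 40 vaccinated / 160 unvaccinated among test-positives;
# 200 vaccinated / 300 unvaccinated among test-negatives.
print(round(ve_test_negative(40, 160, 200, 300), 3))  # -> 0.625
```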

  9. Distribution of flexural deflection in the worldwide outer rise area

    Science.gov (United States)

    Lin, Zi-Jun; Lin, Jing-Yi; Lin, Yi-Chin; Chin, Shao-Jinn; Chen, Yen-Fu

    2015-04-01

    The outer rise on the fringe of a subduction system is caused by an accreted load on the flexed oceanic lithosphere. The magnitude of the deflection is usually linked to the stress state borne by the oceanic plate. In a coupled subduction zone, stress accumulates across the plate boundary, which should affect the flexural properties of the subducted plate. Thus, the variation in the shape of the outer rise may reflect the seismogenic characteristics of the subduction system. In this study, we intend to find the correlation between the flexural deflection (Wb) of the outer rise and the subduction zone properties by comparing several slab parameters with the Wb distribution. The estimation of Wb is based on the available bathymetry data, and the statistical analysis of earthquakes uses the global ISC earthquake catalog for the period 1900-2015. Our result shows a progressive change of Wb in space, suggesting a robust calculation. The average Wb of worldwide subduction systems ranges from 348 to 682 m. No clear distinction in the range of Wb was observed between different subduction zones. However, in a weakly coupled subduction system, the standard deviation of Wb generally has a larger value. Relatively large Wb generally occurs in the center of a trench system, whereas small Wb occurs at the two ends of the trench. The comparison of Wb with several slab parameters shows that Wb may be correlated with the maximum magnitude and the number of earthquakes. Otherwise, no clear relationship with other parameters can be obtained.

  10. Combining urbanization and hydrodynamics data to evaluate sea level rise impacts on coastal water resources

    Science.gov (United States)

    Young, C. R.; Martin, J. B.

    2016-02-01

    Assessments of the potential for saltwater intrusion due to sea level rise require consideration of both coastal hydrodynamic and human activity thresholds. In siliciclastic systems, sea level rise may cause salt intrusion into coastal aquifers at annual or decadal scales, whereas in karst systems salt intrudes at tidal scales. In both cases, human activity affects the freshwater portion of the system by altering the water demand on the aquifer. We combine physicochemical and human activity data to evaluate the impact of sea level rise on salt intrusion in siliciclastic (Indian River Lagoon, FL, USA) and karst (Puerto Morelos, Yucatan, Mexico) systems under different sea level rise rate scenarios. Two hydrodynamic modeling scenarios are considered: flux-controlled and head-controlled. Under a flux-controlled system, hydraulic head gradients remain constant during sea level rise, while under a head-controlled system hydraulic gradients diminish, allowing saltwater intrusion. Our model contains three key terms: aquifer recharge, groundwater discharge, and hydraulic conductivity. Groundwater discharge and hydraulic conductivity were calculated from high-frequency (karst system) and decadal (siliciclastic system) field measurements. Aquifer recharge is defined as precipitation less evapotranspiration, and water demand was evaluated using urban planning data that provided the regional water demand. Water demand variables include agricultural area, tourism, traffic patterns, garbage collection, and total population. Water demand was initially estimated using a partial least squares regression on these variables. Our model indicates that water demand depends most on agricultural area, which has changed significantly over the last 30 years. In both systems, additional water demand creates a head-controlled scenario, thus increasing the potential for salt intrusion under projected sea level rise.
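    A minimal one-response partial least squares (PLS1) regression of the kind mentioned can be sketched with the NIPALS recursion; this is a generic illustration, not the authors' fitted water-demand model:

```python
import numpy as np

def pls1_fit_predict(X, y, n_components=1):
    """Minimal NIPALS PLS1: fit latent components on (X, y) and
    return in-sample predictions. Data are mean-centered internally."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    yhat = np.zeros_like(yc)
    Xr, yr = Xc.copy(), yc.copy()
    for _ in range(n_components):
        w = Xr.T @ yr                   # weight vector: covariance direction
        w = w / np.linalg.norm(w)
        t = Xr @ w                      # scores
        p = Xr.T @ t / (t @ t)          # X loadings
        b = (t @ yr) / (t @ t)          # regression coefficient on scores
        yhat = yhat + b * t
        Xr = Xr - np.outer(t, p)        # deflate X
        yr = yr - b * t                 # deflate y
    return yhat + y.mean()
```

    With one latent driver (say, agricultural area) generating both the predictors and the response, a single PLS component recovers the response exactly.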

  11. Model-based Small Area Estimates of Cancer Risk Factors and Screening Behaviors - Small Area Estimates

    Science.gov (United States)

    These model-based estimates use two surveys, the Behavioral Risk Factor Surveillance System (BRFSS) and the National Health Interview Survey (NHIS). The two surveys are combined using novel statistical methodology.

  12. The rise of hot gases resulting from hydrogen combustion at a tritium recovery plant

    International Nuclear Information System (INIS)

    Selander, W.N.

    1981-10-01

    An accidental release of hydrogen isotopes at a proposed tritium recovery plant may result in a fire or explosion. In this report estimates are given for the initial transient rise and final height of the cloud of hot gases which results from various modes of combustion. The radiation dose equivalent caused by the downwind passage of the tritium-bearing cloud is estimated to be less than 100 mrem in any mode of combustion or weather condition. The model used for calculating the final height of the cloud depends on an entrainment assumption, and the low-density cloud loses energy by entrainment at a slower rate than in conventional atmospheric processes. Consequently, the estimated final cloud height is conservative, and, therefore, the actual radiation dose equivalent would be lower than predicted.

  13. The Structure of Research Methodology Competency in Higher Education and the Role of Teaching Teams and Course Temporal Distance

    Science.gov (United States)

    Schweizer, Karl; Steinwascher, Merle; Moosbrugger, Helfried; Reiss, Siegbert

    2011-01-01

    The development of research methodology competency is a major aim of the psychology curriculum at universities. Usually, three courses concentrating on basic statistics, advanced statistics and experimental methods, respectively, serve the achievement of this aim. However, this traditional curriculum-based course structure gives rise to the…

  14. Improving determination of the Martian rotation parameters through the synergy between LaRa and RISE radioscience experiments

    Science.gov (United States)

    Le Maistre, S.; Péters, M. J.; Yseboodt, M.; Dehant, V. M. A.

    2017-12-01

    estimates obtained from RISE alone. In addition, because the two candidate landing sites of ExoMars are higher in latitude (18.20°N for Oxia Planum, 22°N for Mawrth Vallis) than InSight (4°N), we could estimate for the very first time the Chandler Wobble component of the polar motion using LaRa (Le Maistre et al., 2012), which would also provide powerful constraints on Mars interior and atmospheric models.

  15. Proposed methodology for estimating the impact of highway improvements on urban air pollution.

    Science.gov (United States)

    1971-01-01

    The aim of this methodology is to indicate the expected change in ambient air quality in the vicinity of a highway improvement and in the total background level of urban air pollution resulting from the highway improvement. Both the jurisdiction in w...

  16. Generalized estimating equations

    CERN Document Server

    Hardin, James W

    2002-01-01

    Although powerful and flexible, the method of generalized linear models (GLM) is limited in its ability to accurately deal with longitudinal and clustered data. Developed specifically to accommodate these data types, the method of Generalized Estimating Equations (GEE) extends the GLM algorithm to accommodate the correlated data encountered in health research, social science, biology, and other related fields.Generalized Estimating Equations provides the first complete treatment of GEE methodology in all of its variations. After introducing the subject and reviewing GLM, the authors examine th

  17. Impacts of rising air temperatures on electric transmission ampacity and peak electricity load in the United States

    Science.gov (United States)

    Bartos, Matthew; Chester, Mikhail; Johnson, Nathan; Gorman, Brandon; Eisenberg, Daniel; Linkov, Igor; Bates, Matthew

    2016-11-01

    Climate change may constrain future electricity supply adequacy by reducing electric transmission capacity and increasing electricity demand. The carrying capacity of electric power cables decreases as ambient air temperatures rise; similarly, during the summer peak period, electricity loads typically increase with hotter air temperatures due to increased air conditioning usage. As atmospheric carbon concentrations increase, higher ambient air temperatures may strain power infrastructure by simultaneously reducing transmission capacity and increasing peak electricity load. We estimate the impacts of rising ambient air temperatures on electric transmission ampacity and peak per-capita electricity load for 121 planning areas in the United States using downscaled global climate model projections. Together, these planning areas account for roughly 80% of current peak summertime load. We estimate climate-attributable capacity reductions to transmission lines by constructing thermal models of representative conductors, then forcing these models with future temperature projections to determine the percent change in rated ampacity. Next, we assess the impact of climate change on electricity load by using historical relationships between ambient temperature and utility-scale summertime peak load to estimate the extent to which climate change will incur additional peak load increases. We find that by mid-century (2040-2060), increases in ambient air temperature may reduce average summertime transmission capacity by 1.9%-5.8% relative to the 1990-2010 reference period. At the same time, peak per-capita summertime loads may rise by 4.2%-15% on average due to increases in ambient air temperature. In the absence of energy efficiency gains, demand-side management programs and transmission infrastructure upgrades, these load increases have the potential to upset current assumptions about future electricity supply adequacy.
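    The qualitative mechanism - ampacity falling as ambient temperature approaches the conductor's thermal limit - can be illustrated with a convection-dominated approximation in which current capacity scales with the square root of the conductor-to-air temperature difference. The temperatures below are illustrative; the paper's full thermal models also include solar gain, wind, and radiative terms:

```python
import math

def ampacity_ratio(t_conductor_max, t_amb_new, t_amb_ref):
    """Ratio of rated ampacity at a new ambient temperature to that at a
    reference ambient, assuming I is proportional to sqrt(Tmax - Tamb)."""
    return math.sqrt((t_conductor_max - t_amb_new) / (t_conductor_max - t_amb_ref))
```

    For a 75 °C conductor limit, warming the ambient air from 35 °C to 40 °C cuts rated ampacity by about 6.5% in this approximation.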

  18. Methodology for the Model-based Small Area Estimates of Cancer-Related Knowledge - Small Area Estimates

    Science.gov (United States)

    The HINTS is designed to produce reliable estimates at the national and regional levels. GIS maps using HINTS data have been used to provide a visual representation of possible geographic relationships in HINTS cancer-related variables.

  19. A soft-computing methodology for noninvasive time-spatial temperature estimation.

    Science.gov (United States)

    Teixeira, César A; Ruano, Maria Graça; Ruano, António E; Pereira, Wagner C A

    2008-02-01

    The safe and effective application of thermal therapies is restricted by the lack of reliable noninvasive temperature estimators. In this paper, the temporal echo-shifts of backscattered ultrasound signals, collected from a gel-based phantom, were tracked and, together with past temperature values, used as input information for radial basis function neural networks. The phantom was heated using a piston-like therapeutic ultrasound transducer. The neural models were assigned to estimate the temperature at different intensities and at points arranged along the therapeutic transducer radial line (60 mm from the transducer face). Model inputs, as well as the number of neurons, were selected using a multiobjective genetic algorithm (MOGA). The best attained models present, on average, a maximum absolute error of less than 0.5 degrees C, which is regarded as the borderline between a reliable and an unreliable estimator in hyperthermia/diathermia. In order to test the spatial generalization capacity, the best models were tested using spatial points not previously assessed, and some of them presented a maximum absolute error below 0.5 degrees C, being "elected" as the best models. It should also be stressed that these best models have low implementational complexity, as desired for real-time applications.
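    The interpolating role played by radial basis functions can be conveyed with plain Gaussian RBF interpolation; the paper's networks were structured and trained via MOGA, so the following is only a generic stand-in:

```python
import numpy as np

def rbf_fit(x_train, y_train, sigma=0.5):
    """Solve for Gaussian RBF weights that interpolate the training data."""
    g = np.exp(-(x_train[:, None] - x_train[None, :])**2 / (2.0 * sigma**2))
    return np.linalg.solve(g, y_train)

def rbf_predict(x_query, x_train, weights, sigma=0.5):
    """Evaluate the fitted RBF expansion at query points."""
    g = np.exp(-(x_query[:, None] - x_train[None, :])**2 / (2.0 * sigma**2))
    return g @ weights
```

    By construction the expansion reproduces the training targets exactly; generalization to unseen points, as tested in the paper, depends on the kernel width and node placement.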

  20. A correction in the CDM methodological tool for estimating methane emissions from solid waste disposal sites.

    Science.gov (United States)

    Santos, M M O; van Elk, A G P; Romanel, C

    2015-12-01

    Solid waste disposal sites (SWDS) - especially landfills - are a significant source of methane, a greenhouse gas. Although it has the potential to be captured and used as a fuel, most of the methane formed in SWDS is emitted to the atmosphere, mainly in developing countries. Methane emissions have to be estimated in national inventories. To help this task the Intergovernmental Panel on Climate Change (IPCC) has published three sets of guidelines. In addition, the Kyoto Protocol established the Clean Development Mechanism (CDM) to assist developed countries in offsetting their own greenhouse gas emissions by assisting other countries to achieve sustainable development while reducing emissions. Based on methodologies provided by the IPCC regarding SWDS, the CDM Executive Board has issued a tool to be used by project developers for estimating baseline methane emissions in their project activities - on burning biogas from landfills or on preventing biomass from being landfilled and so avoiding methane emissions. Some inconsistencies in the first two IPCC guidelines have already been pointed out in an annex of the latest IPCC edition, although with details obscured. The CDM tool uses a model for methane estimation that takes parameters, factors and assumptions from the latest IPCC guidelines, while using as its core equation the one from the second IPCC edition, with its shortcoming, as well as allowing a misunderstanding of the time variable. Consequences of wrong ex-ante estimation of baseline emissions in CDM project activities can be economic or environmental. An example of the first type is the 18% overestimation in an actual landfill biogas project in Brazil, which harms its developers; of the second type, the 35% overestimation in a project preventing municipal solid waste from being landfilled in China, which harms the environment - not because of the project per se but because of the unduly generated carbon credits.
In a simulated landfill - the same
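    The first-order decay (FOD) structure underlying such methane estimation models can be sketched as follows; the decay constant, degradable fraction, and methane fraction below are placeholders, not the CDM tool's default values:

```python
import math

def fod_methane(deposits, k=0.05, ddocm_fraction=0.1, f_ch4=0.5):
    """First-order decay sketch: deposits[x] is the waste mass landfilled in
    year x; returns the CH4 mass generated in each year of the horizon.
    ddocm_fraction is the decomposable fraction of deposited waste and
    f_ch4 the CH4 fraction of generated landfill gas (16/12 converts C to CH4)."""
    years = len(deposits)
    ch4 = [0.0] * years
    for x, w in enumerate(deposits):
        ddocm = w * ddocm_fraction
        for t in range(x, years):
            # mass decomposing in year t from the deposit of year x
            decomposed = ddocm * (math.exp(-k * (t - x)) - math.exp(-k * (t - x + 1)))
            ch4[t] += decomposed * f_ch4 * 16.0 / 12.0
    return ch4
```

    Over a long horizon the yearly amounts sum to the total decomposable carbon converted to CH4, which is the mass-balance check that an inconsistent core equation or a misread time variable would violate.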

  1. TEST (Toxicity Estimation Software Tool) Ver 4.1

    Science.gov (United States)

    The Toxicity Estimation Software Tool (T.E.S.T.) has been developed to allow users to easily estimate toxicity and physical properties using a variety of QSAR methodologies. T.E.S.T allows a user to estimate toxicity without requiring any external programs. Users can input a chem...

  2. Estimation of steady-state and transient power distributions for the RELAP analyses of the 1963 loss-of-flow and loss-of-pressure tests at BR2

    International Nuclear Information System (INIS)

    Dionne, B.; Tzanos, C.P.

    2011-01-01

    To support the safety analyses required for the conversion of the Belgian Reactor 2 (BR2) from highly-enriched uranium (HEU) to low-enriched uranium (LEU) fuel, the simulation of a number of loss-of-flow tests, with or without loss of pressure, has been undertaken. These tests were performed at BR2 in 1963 and used instrumented fuel assemblies (FAs) with thermocouples (TC) embedded in the cladding as well as probes to measure the FA power on the basis of the coolant temperature rise. The availability of experimental data for these tests offers an opportunity to better establish the credibility of the RELAP5-3D model and methodology used in the conversion analysis. In order to support the HEU to LEU conversion safety analyses of the BR2 reactor, RELAP simulations of a number of loss-of-flow/loss-of-pressure tests have been undertaken. Preliminary analyses showed that the conservative power distributions used historically in the BR2 RELAP model resulted in a significant overestimation of the peak cladding temperature during the transient. Therefore, it was concluded that better estimates of the steady-state and decay power distributions were needed to accurately predict the cladding temperatures measured during the tests and establish the credibility of the RELAP model and methodology. The new approach ('best estimate' methodology) uses the MCNP5, ORIGEN-2 and BERYL codes to obtain steady-state and decay power distributions for the BR2 core during the tests A/400/1, C/600/3 and F/400/1. This methodology can be easily extended to simulate any BR2 core configuration. Comparisons with measured peak cladding temperatures showed a much better agreement when power distributions obtained with the new methodology are used.

  3. Practitioner's knowledge representation a pathway to improve software effort estimation

    CERN Document Server

    Mendes, Emilia

    2014-01-01

    The main goal of this book is to help organizations improve their effort estimates and effort estimation processes by providing a step-by-step methodology that takes them through the creation and validation of models that are based on their own knowledge and experience. Such models, once validated, can then be used to obtain predictions, carry out risk analyses, enhance their estimation processes for new projects and generally advance them as learning organizations.Emilia Mendes presents the Expert-Based Knowledge Engineering of Bayesian Networks (EKEBNs) methodology, which she has used and adapted during the course of several industry collaborations with different companies world-wide over more than 6 years. The book itself consists of two major parts: first, the methodology's foundations in knowledge management, effort estimation (with special emphasis on the intricacies of software and Web development) and Bayesian networks are detailed; then six industry case studies are presented which illustrate the pra...

  4. Development of sea level rise scenarios for climate change assessments of the Mekong Delta, Vietnam

    Science.gov (United States)

    Doyle, Thomas W.; Day, Richard H.; Michot, Thomas C.

    2010-01-01

    Rising sea level poses critical ecological and economical consequences for the low-lying megadeltas of the world where dependent populations and agriculture are at risk. The Mekong Delta of Vietnam is one of many deltas that are especially vulnerable because much of the land surface is below mean sea level and because there is a lack of coastal barrier protection. Food security related to rice and shrimp farming in the Mekong Delta is currently under threat from saltwater intrusion, relative sea level rise, and storm surge potential. Understanding the degree of potential change in sea level under climate change is needed to undertake regional assessments of potential impacts and to formulate adaptation strategies. This report provides constructed time series of potential sea level rise scenarios for the Mekong Delta region by incorporating (1) aspects of observed intra- and inter-annual sea level variability from tide records and (2) projected estimates for different rates of regional subsidence and accelerated eustacy through the year 2100 corresponding with the Intergovernmental Panel on Climate Change (IPCC) climate models and emission scenarios.
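    A common way to construct such scenario time series is to add a quadratic eustatic acceleration term, anchored to the modern linear rate and a chosen 2100 total, on top of a linear subsidence trend. The function below is a generic sketch under that assumption, not the report's actual construction:

```python
def sea_level_scenario(years, base_year, rate_mm_per_yr, eustatic_2100_mm,
                       subsidence_mm_per_yr=0.0):
    """Sea level rise (mm, relative to base_year) for each year in `years`:
    linear modern rate + quadratic acceleration chosen so the eustatic part
    reaches eustatic_2100_mm in 2100, plus linear regional subsidence."""
    t_end = 2100 - base_year
    accel = (eustatic_2100_mm - rate_mm_per_yr * t_end) / t_end**2  # mm/yr^2 term
    return [rate_mm_per_yr * t + accel * t * t + subsidence_mm_per_yr * t
            for t in (y - base_year for y in years)]
```

    The same curve shape can then be offset by different subsidence rates to span the scenario range, mirroring how regional subsidence and accelerated eustasy are combined in the report.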

  5. Cause and countermeasure for heat up of HTTR core support plate at power rise tests

    Energy Technology Data Exchange (ETDEWEB)

    Fujimoto, Nozomu; Takada, Eiji; Nakagawa, Shigeaki; Tachibana, Yukio; Kawasaki, Kozo; Saikusa, Akio; Kojima, Takao; Iyoku, Tatuo [Japan Atomic Energy Research Inst., Oarai, Ibaraki (Japan). Oarai Research Establishment

    2002-01-01

    HTTR has carried out many kinds of tests as power rise tests, in which reactor power rises step by step after the first criticality was attained. In the tests, the temperature of a core support plate rose higher than expected at each power level, and was projected to exceed the maximum working temperature at the 100% power level. Therefore, tests under the high temperature test operation mode, in which the core flow rate differs, were carried out to predict the temperature at 100% power precisely and to investigate the cause of the temperature rise. From the investigation, it became clear that the cause was gap flow in the core support structure. Furthermore, it was estimated that the temperature of the core support plate rose locally due to a change in gap width between the core support plate and a seal plate caused by the change in core pressure drop. The maximum working temperature of the core support plate was revised. The integrity of the core support plate under the revised maximum working temperature condition was confirmed by stress analyses. (author)

  6. Methodology for cost estimate in projects for nuclear power plants decommissioning

    International Nuclear Information System (INIS)

    Salij, L.M.

    2008-01-01

    The conceptual approaches to cost estimation for nuclear power plant unit decommissioning projects were determined. The international experience and the national legislative and regulatory basis were analyzed. A possible classification of decommissioning project costs was given. The role of project costs of nuclear power plant unit decommissioning was shown to be the most important criterion for the main project decisions. The technical and economic estimation of deductions to a common-branch fund for financing decommissioning projects was substantiated.

  7. Estimation and Forecasting the Gross Domestic Product´s Growth Rate in Ecuador: a Short-term Vision

    Directory of Open Access Journals (Sweden)

    Yadier Alberto Torres−Sánchez

    2016-12-01

    Full Text Available Ecuador is the seventh largest economy in Latin America. From 2000 to 2012, the country expanded at an average rate of 1.15% on a quarter-over-quarter basis, mostly due to a rise in exports. Ecuador's economy is highly dependent on oil exports. In order to reach its full growth potential, the country needs to reduce its dependence on oil revenue, increase the tax base, achieve political stability, and reduce the levels of poverty and inequality. The main objective of this research is to estimate and forecast the growth rate of the Gross Domestic Product in Ecuador, applying the Box-Jenkins methodology for ARIMA models. A forecast of approximately 3.96% was obtained, a result consistent with the time series.
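    The flavor of a Box-Jenkins fit can be conveyed with the simplest member of the ARIMA family, an AR(1) model estimated by least squares; this sketch omits the model identification and diagnostic checking the methodology prescribes:

```python
import numpy as np

def ar1_forecast(series, steps=4):
    """Fit y_t = c + phi * y_{t-1} by least squares, then iterate the
    recursion forward to produce multi-step forecasts."""
    y = np.asarray(series, dtype=float)
    X = np.column_stack([np.ones(len(y) - 1), y[:-1]])   # regressors: intercept, lag-1
    c, phi = np.linalg.lstsq(X, y[1:], rcond=None)[0]
    forecasts, last = [], y[-1]
    for _ in range(steps):
        last = c + phi * last
        forecasts.append(last)
    return forecasts
```

    In practice a quarterly GDP growth series would first be checked for stationarity and the (p, d, q) orders chosen from the autocorrelation structure before fitting.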

  9. Decision-model estimation of the age-specific disability weight for schistosomiasis japonica: a systematic review of the literature.

    Science.gov (United States)

    Finkelstein, Julia L; Schleinitz, Mark D; Carabin, Hélène; McGarvey, Stephen T

    2008-03-05

    Schistosomiasis is among the most prevalent parasitic infections worldwide. However, current Global Burden of Disease (GBD) disability-adjusted life year estimates indicate that its population-level impact is negligible. Recent studies suggest that GBD methodologies may significantly underestimate the burden of parasitic diseases, including schistosomiasis. Furthermore, strain-specific disability weights have not been established for schistosomiasis, and the magnitude of human disease burden due to Schistosoma japonicum remains controversial. We used a decision model to quantify an alternative disability weight estimate of the burden of human disease due to S. japonicum. We reviewed S. japonicum morbidity data, and constructed decision trees for all infected persons and for two age-specific strata, <15 y and ≥15 y. We conducted stochastic and probabilistic sensitivity analyses for each model. Infection with S. japonicum was associated with an average disability weight of 0.132, with an age-specific disability weight of 0.098 for the <15 y stratum. Re-estimated disability weights were seven to 46 times greater than current GBD measures; no simulations produced disability weight estimates lower than 0.009. Nutritional morbidities had the greatest contribution to the S. japonicum disability weight. Disability weights for schistosomiasis urgently need to be revised, and species-specific disability weights should be established. Even a marginal increase in current estimates would result in a substantial rise in the estimated global burden of schistosomiasis, and have considerable implications for public health prioritization and resource allocation for schistosomiasis research, monitoring, and control.

  10. Methodology for building confidence measures

    Science.gov (United States)

    Bramson, Aaron L.

    2004-04-01

    This paper presents a generalized methodology for propagating known or estimated levels of individual source document truth reliability to determine the confidence level of a combined output. Initial document certainty levels are augmented by (i) combining the reliability measures of multiple sources, (ii) incorporating the truth reinforcement of related elements, and (iii) incorporating the importance of the individual elements for determining the probability of truth for the whole. The result is a measure of confidence in system output based on establishing links among the truth values of inputs. This methodology was developed for application to a multi-component situation awareness tool under development at the Air Force Research Laboratory in Rome, New York. Determining how improvements in data quality and the variety of documents collected affect the probability of a correct situational detection helps optimize the performance of the tool overall.

  11. Implications of sea level rise scenarios on land use /land cover classes of the coastal zones of Cochin, India.

    Science.gov (United States)

    Mani Murali, R; Dinesh Kumar, P K

    2015-01-15

    Physical responses of the coastal zones in the vicinity of Cochin, India to sea level rise are investigated based on analysis of inundation scenarios. Quantification of potential habitat loss was made by merging the land use/land cover (LU/LC) classification prepared from satellite imagery with the digital elevation model. Scenarios were generated for two different amounts of sea level rise, and the resulting changes were assessed to ascertain vulnerability and loss in extent. LU/LC classes overlaid on the 1 m and 2 m elevation zones showed that these were mostly covered by vegetation, followed by water and urban zones. For sea level rise scenarios of 1 m and 2 m, the total inundation zones were estimated at 169.11 km(2) and 598.83 km(2) respectively using a Geographic Information System (GIS). The losses of urban areas were estimated at 43 km(2) and 187 km(2) for 1 m and 2 m of sea level rise respectively, which is alarming for the most densely populated state of India. Quantitative comparison of other LU/LC classes showed significant changes under each of the inundation scenarios. The results conclusively show that sea level rise will have profound effects on land use and land cover classes as well as on coastal landforms in the study region. Coastal inundation would leave oceanfront and inland properties vulnerable. An increase in water levels would alter coastal drainage gradients; a reduction in these gradients would increase flooding attributable to rainstorms, promote saltwater intrusion into coastal aquifers, and force water tables to rise. Changes in the coastal landforms associated with inundation are of concern given that the coastal region may remain vulnerable in the coming decades due to population growth and development pressures. Copyright © 2014 Elsevier Ltd. All rights reserved.
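    The inundation overlay reduces to thresholding a DEM at the raised sea level and summing cell areas; a minimal sketch (ignoring hydrologic connectivity, which a full GIS analysis would handle):

```python
import numpy as np

def inundated_area_km2(dem_m, slr_m, cell_area_km2):
    """Area (km^2) of DEM cells at or below the raised sea level slr_m."""
    return float(np.count_nonzero(dem_m <= slr_m) * cell_area_km2)
```

    Intersecting the inundation mask with each LU/LC class raster, rather than the whole DEM, yields per-class losses such as the urban-area figures reported above.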

  12. Disposal of Kitchen Waste from High Rise Apartment

    Science.gov (United States)

    Ori, Kirki; Bharti, Ajay; Kumar, Sunil

    2017-09-01

    A high rise building has a number of floors and rooms with a variety of users or tenants for residential purposes. Huge quantities of heterogeneous mixtures of domestic food waste are generated on every floor of high rise residential buildings. Disposal of wet, biodegradable domestic kitchen waste from high rise buildings is more expensive in terms of collection and vertical transportation. This work addresses a technique to dispose of wet organic food waste from high rise or multistory buildings at the point of generation, taking advantage of gravity and the vermicomposting technique. This innovative approach to the collection and disposal of wet organic solid waste from high rise apartments is more economical and hygienic than the present system of disposal.

  13. OESbathy version 1.0: a method for reconstructing ocean bathymetry with generalized continental shelf-slope-rise structures

    Science.gov (United States)

    Goswami, A.; Olson, P. L.; Hinnov, L. A.; Gnanadesikan, A.

    2015-09-01

    We present a method for reconstructing global ocean bathymetry that combines a standard plate cooling model for the oceanic lithosphere based on the age of the oceanic crust, global oceanic sediment thicknesses, plus generalized shelf-slope-rise structures calibrated at modern active and passive continental margins. Our motivation is to develop a methodology for reconstructing ocean bathymetry in the geologic past that includes heterogeneous continental margins in addition to abyssal ocean floor. First, the plate cooling model is applied to maps of ocean crustal age to calculate depth to basement. To the depth to basement we add an isostatically adjusted, multicomponent sediment layer constrained by sediment thickness in the modern oceans and marginal seas. A three-parameter continental shelf-slope-rise structure completes the bathymetry reconstruction, extending from the ocean crust to the coastlines. Parameters of the shelf-slope-rise structures at active and passive margins are determined from modern ocean bathymetry at locations where a complete history of seafloor spreading is preserved. This includes the coastal regions of the North, South, and central Atlantic, the Southern Ocean between Australia and Antarctica, and the Pacific Ocean off the west coast of South America. The final products are global maps at 0.1° × 0.1° resolution of depth to basement, ocean bathymetry with an isostatically adjusted multicomponent sediment layer, and ocean bathymetry with reconstructed continental shelf-slope-rise structures. Our reconstructed bathymetry agrees with the measured ETOPO1 bathymetry at most passive margins, including the east coast of North America, north coast of the Arabian Sea, and northeast and southeast coasts of South America. There is disagreement at margins with anomalous continental shelf-slope-rise structures, such as around the Arctic Ocean, the Falkland Islands, and Indonesia.
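    The depth-to-basement step can be illustrated with the classic half-space cooling approximation, in which seafloor subsidence grows as the square root of crustal age; the ridge depth and subsidence coefficient below are textbook-style round numbers, not the paper's calibrated plate-cooling parameters:

```python
import math

def depth_to_basement(age_myr, ridge_depth_m=2500.0, c=350.0):
    """Half-space cooling sketch: basement depth (m) below sea level as a
    function of ocean crustal age (Myr), d(t) = d_ridge + c * sqrt(t)."""
    return ridge_depth_m + c * math.sqrt(age_myr)
```

    On top of this depth-age surface the method adds an isostatically adjusted sediment layer and the calibrated shelf-slope-rise structures to reach the coastlines.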

  14. Contemporary sea level rise.

    Science.gov (United States)

    Cazenave, Anny; Llovel, William

    2010-01-01

    Measuring sea level change and understanding its causes has considerably improved in the recent years, essentially because new in situ and remote sensing observations have become available. Here we report on most recent results on contemporary sea level rise. We first present sea level observations from tide gauges over the twentieth century and from satellite altimetry since the early 1990s. We next discuss the most recent progress made in quantifying the processes causing sea level change on timescales ranging from years to decades, i.e., thermal expansion of the oceans, land ice mass loss, and land water-storage change. We show that for the 1993-2007 time span, the sum of climate-related contributions (2.85 +/- 0.35 mm year(-1)) is only slightly less than altimetry-based sea level rise (3.3 +/- 0.4 mm year(-1)): approximately 30% of the observed rate of rise is due to ocean thermal expansion and approximately 55% results from land ice melt. Recent acceleration in glacier melting and ice mass loss from the ice sheets increases the latter contribution up to 80% for the past five years. We also review the main causes of regional variability in sea level trends: The dominant contribution results from nonuniform changes in ocean thermal expansion.

  15. Atmospheric Turbulence Estimates from a Pulsed Lidar

    Science.gov (United States)

    Pruis, Matthew J.; Delisi, Donald P.; Ahmad, Nash'at N.; Proctor, Fred H.

    2013-01-01

    Estimates of the eddy dissipation rate (EDR) were obtained from measurements made by a coherent pulsed lidar and compared with estimates from mesoscale model simulations and measurements from an in situ sonic anemometer at the Denver International Airport and with EDR estimates from the last observation time of the trailing vortex pair. The estimates of EDR from the lidar were obtained using two different methodologies. The two methodologies show consistent estimates of the vertical profiles. Comparison of EDR derived from the Weather Research and Forecast (WRF) mesoscale model with the in situ lidar estimates show good agreement during the daytime convective boundary layer, but the WRF simulations tend to overestimate EDR during the nighttime. The EDR estimates from a sonic anemometer located at 7.3 meters above ground level are approximately one order of magnitude greater than both the WRF and lidar estimates - which are from greater heights - during the daytime convective boundary layer and substantially greater during the nighttime stable boundary layer. The consistency of the EDR estimates from different methods suggests a reasonable ability to predict the temporal evolution of a spatially averaged vertical profile of EDR in an airport terminal area using a mesoscale model during the daytime convective boundary layer. In the stable nighttime boundary layer, there may be added value to EDR estimates provided by in situ lidar measurements.
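    One standard route to EDR from lidar line-of-sight velocities inverts the inertial-range second-order structure function, D2(r) = C2 * eps^(2/3) * r^(2/3); whether this matches either of the paper's two methodologies is not stated, so treat it as a generic sketch:

```python
def edr_from_structure_function(d2, r, c2=2.0):
    """Invert the Kolmogorov inertial-range relation
    D2(r) = c2 * eps**(2/3) * r**(2/3) for the dissipation rate eps.
    d2: measured structure function value (m^2/s^2) at separation r (m)."""
    return (d2 / (c2 * r ** (2.0 / 3.0))) ** 1.5
```

    In practice D2(r) is estimated by averaging squared velocity differences over many range-gate pairs at separation r within the inertial subrange before inverting.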

  16. Methodology of site studies

    International Nuclear Information System (INIS)

    Caries, J.C.; Hugon, J.; Grauby, A.

    1980-01-01

    This methodology consists of an essentially dynamic, predictive, and follow-up analysis of the impact of discharges on all the environmental compartments, whether natural or not, that play a part in the protection of man and his environment. It applies at two levels: the choice of a site, and the detailed study of the site selected. Two examples of its application are developed: at the site-selection level in the case of marine sites, and at the detailed-study level in the case of a riverside site [fr

  17. A methodology for overall consequence modeling in chemical industry

    International Nuclear Information System (INIS)

    Arunraj, N.S.; Maiti, J.

    2009-01-01

    Risk assessment in the chemical process industry is a very important issue for safeguarding humans and the ecosystem from damage. Consequence assessment is an integral part of risk assessment. However, the commonly used consequence estimation methods involve either time-consuming complex mathematical models or simple summation of losses that ignores important consequence factors. This degrades the quality of the estimated risk value, so consequence modeling has to be performed in detail, considering all major losses within a reasonable time, to improve the decision value of the estimated risk. The losses can be broadly categorized into production loss, asset loss, human health and safety loss, and environmental loss. In this paper, a conceptual framework is first developed to assess the overall consequence, considering all the important components of these major losses. A methodology is then developed for calculating each of the major losses, which are normalized to yield the overall consequence. Finally, as an illustration, the proposed methodology is applied to a case study plant involving benzene extraction. The case study results using the proposed consequence assessment scheme are compared with those from existing methodologies.

  18. Rising equity

    International Nuclear Information System (INIS)

    Burr, M.T.

    1992-01-01

    This article reports on the results of a financial rankings survey of the independent energy industry indicating that lenders and investors provided more than five billion dollars in capital for new, private power projects during the first six months of 1992. The topics of the article include rising equity requirements, corporate finance, mergers and acquisitions, project finance investors, revenue bonds, project finance lenders for new projects, project finance lenders for restructurings, and project finance advisors

  19. Topobathymetric elevation model development using a new methodology: Coastal National Elevation Database

    Science.gov (United States)

    Danielson, Jeffrey J.; Poppenga, Sandra K.; Brock, John C.; Evans, Gayla A.; Tyler, Dean; Gesch, Dean B.; Thatcher, Cindy A.; Barras, John

    2016-01-01

    During the coming decades, coastlines will respond to widely predicted sea-level rise, storm surge, and coastal inundation flooding from disastrous events. Because physical processes in coastal environments are controlled by the geomorphology of over-the-land topography and underwater bathymetry, many applications of geospatial data in coastal environments require detailed knowledge of the near-shore topography and bathymetry. In this paper, an updated methodology used by the U.S. Geological Survey Coastal National Elevation Database (CoNED) Applications Project is presented for developing coastal topobathymetric elevation models (TBDEMs) from multiple topographic data sources combined with adjacent intertidal topobathymetric and offshore bathymetric sources to generate seamlessly integrated TBDEMs. This repeatable, updatable, and logically consistent methodology assimilates topographic data (land elevation) and bathymetry (water depth) into a seamless coastal elevation model. Within the overarching framework, vertical datum transformations are standardized in a workflow that interweaves spatially consistent interpolation (gridding) techniques with a land/water boundary mask delineation approach. Output gridded raster TBDEMs are stacked into a file storage system of mosaic datasets within an Esri ArcGIS geodatabase for efficient updating while maintaining current and updated spatially referenced metadata. Topobathymetric data provide a required seamless elevation product for several science application studies, such as shoreline delineation, coastal inundation mapping, sediment transport, sea-level rise, storm surge models, and tsunami impact assessment. These detailed coastal elevation data are critical to depict regions prone to climate change impacts and are essential to planners and managers responsible for mitigating the associated risks and costs to both human communities and ecosystems. The CoNED methodology has been used to construct integrated TBDEM models.
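
    The core assembly step described above (a vertical datum transformation followed by a land/water mask merge) can be sketched as follows. This is a toy illustration, not CoNED code: the grids, the mask, and the datum offsets (topo_to_navd88, bathy_to_navd88) are all invented for the example.

```python
# Toy sketch of topobathymetric merging: shift each source to a common
# vertical datum, then pick topographic elevations on land and bathymetric
# elevations (negative, below datum) over water, per grid cell.
NODATA = -9999.0

def merge_topobathy(topo, bathy, land_mask,
                    topo_to_navd88=0.0, bathy_to_navd88=-0.3):
    """Merge equally sized topo and bathy grids into one seamless grid."""
    merged = []
    for t_row, b_row, m_row in zip(topo, bathy, land_mask):
        row = []
        for t, b, is_land in zip(t_row, b_row, m_row):
            if is_land and t != NODATA:
                row.append(t + topo_to_navd88)    # land elevation
            elif b != NODATA:
                row.append(b + bathy_to_navd88)   # water: negative elevation
            else:
                row.append(NODATA)                # gap in both sources
        merged.append(row)
    return merged

topo = [[2.0, 1.0, NODATA], [3.0, 0.5, NODATA]]
bathy = [[NODATA, NODATA, -4.0], [NODATA, -1.0, -6.0]]
mask = [[True, True, False], [True, False, False]]
print(merge_topobathy(topo, bathy, mask))
```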

  20. 21 CFR 137.290 - Self-rising yellow corn meal.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Self-rising yellow corn meal. 137.290 Section 137... Cereal Flours and Related Products § 137.290 Self-rising yellow corn meal. Self-rising yellow corn meal conforms to the definition and standard of identity prescribed by § 137.270 for self-rising white corn meal...

  1. Assessing economic impact of storm surge under projected sea level rise scenarios

    Science.gov (United States)

    Del Angel, D. C.; Yoskowitz, D.

    2017-12-01

    Global sea level is expected to rise 0.2-2 m by the year 2100. Rising sea level is expected to have a number of impacts such as erosion, saltwater intrusion, and decline in coastal wetlands, all of which have direct and indirect socio-economic impacts on coastal communities. By 2050, 25% of the world's population will reside within flood-prone areas. These statistics raise concern about the economic costs that sea level rise and flooding impose on growing coastal communities. Economic costs of storm surge inundation and rising seas may include loss of or damage to public facilities and infrastructure that may become temporarily inaccessible, as well as disruptions to business and services. The goal of this project is to assess economic impacts of storms under four SLR scenarios: low, intermediate-low, intermediate-high, and high (0.2 m, 0.5 m, 1.2 m, and 2 m, respectively) in the Northern Gulf of Mexico region. To assess flooding impact on communities from storm surge, this project utilizes HAZUS-MH software - a Geographic Information System (GIS)-based modeling tool developed by the Federal Emergency Management Agency - to estimate physical, economic, and social impacts of natural disasters such as floods, earthquakes, and hurricanes. The HAZUS database comes integrated with aggregate and site-specific inventory, which includes demographic data, general building stock, agricultural statistics, vehicle inventory, essential facilities, transportation systems, and utility systems (among other sensitive facilities). User-defined inundation scenarios will serve to identify assets at risk, and damage estimates will be generated using the Depth Damage Functions included in the HAZUS software. Results will focus on three communities in the Gulf and highlight changes in storm flood impact. This approach not only provides a method for economic impact assessment but also begins to create a link between ecosystem services and natural and nature-based features such as wetlands, beaches, and dunes.
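
    A depth-damage function of the kind HAZUS applies maps flood depth to a fraction of building value lost. The sketch below is a minimal stand-in for that step: the curve points in DEPTHS_M and DAMAGE_FRACTION are hypothetical, not actual HAZUS curves, and the building value is invented.

```python
import bisect

# Minimal depth-damage sketch: linear interpolation of damage fraction
# against flood depth, with sea-level rise added to the surge depth.
# Curve values are hypothetical placeholders, NOT actual HAZUS data.
DEPTHS_M = [0.0, 0.5, 1.0, 2.0, 3.0]             # flood depth above first floor
DAMAGE_FRACTION = [0.0, 0.15, 0.30, 0.55, 0.75]  # fraction of value lost

def damage_fraction(depth_m: float) -> float:
    """Linearly interpolate the damage fraction for a given flood depth."""
    if depth_m <= DEPTHS_M[0]:
        return DAMAGE_FRACTION[0]
    if depth_m >= DEPTHS_M[-1]:
        return DAMAGE_FRACTION[-1]
    i = bisect.bisect_right(DEPTHS_M, depth_m)
    d0, d1 = DEPTHS_M[i - 1], DEPTHS_M[i]
    f0, f1 = DAMAGE_FRACTION[i - 1], DAMAGE_FRACTION[i]
    return f0 + (f1 - f0) * (depth_m - d0) / (d1 - d0)

def building_loss(value_usd: float, surge_depth_m: float, slr_m: float) -> float:
    """Estimated loss when sea-level rise is added to the storm-surge depth."""
    return value_usd * damage_fraction(surge_depth_m + slr_m)

# Same 1.0 m storm surge under the four SLR scenarios from the abstract.
for slr in (0.2, 0.5, 1.2, 2.0):
    print(slr, building_loss(250_000, 1.0, slr))
```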

  2. Estimation of the National Disease Burden of Influenza-Associated Severe Acute Respiratory Illness in Kenya and Guatemala: A Novel Methodology

    Science.gov (United States)

    Katz, Mark A.; Lindblade, Kim A.; Njuguna, Henry; Arvelo, Wences; Khagayi, Sammy; Emukule, Gideon; Linares-Perez, Nivaldo; McCracken, John; Nokes, D. James; Ngama, Mwanajuma; Kazungu, Sidi; Mott, Joshua A.; Olsen, Sonja J.; Widdowson, Marc-Alain; Feikin, Daniel R.

    2013-01-01

    Background Knowing the national disease burden of severe influenza in low-income countries can inform policy decisions around influenza treatment and prevention. We present a novel methodology using locally generated data for estimating this burden. Methods and Findings This method begins with calculating the hospitalized severe acute respiratory illness (SARI) incidence for children Guatemala, using data from August 2009–July 2011. In Kenya (2009 population 38.6 million persons), the annual number of hospitalized influenza-associated SARI cases ranged from 17,129–27,659 for children Guatemala (2011 population 14.7 million persons), the annual number of hospitalized cases of influenza-associated pneumonia ranged from 1,065–2,259 (0.5–1.0 per 1,000 persons) among children Guatemala. This method can be performed in most low and lower-middle income countries. PMID:23573177

  3. Other best-estimate code and methodology applications in addition to licensing

    International Nuclear Information System (INIS)

    Tanarro, A.

    1999-01-01

    Along with their applications for licensing purposes, best-estimate thermal-hydraulic codes allow for a wide scope of additional uses and applications in which results that are as realistic and reliable as possible are necessary. Although many of these applications have been successfully developed, the use of best-estimate codes for applications other than those associated with licensing processes is not so well known among the nuclear community. This paper describes some of these applications, briefly noting their more significant and specific features. (Author)

  4. Sea level rise and the geoid: factor analysis approach

    OpenAIRE

    Song, Hongzhi; Sadovski, Alexey; Jeffress, Gary

    2013-01-01

    Sea levels are rising around the world, and this is a particular concern along most of the coasts of the United States. A 1989 EPA report shows that sea levels rose 5-6 inches more than the global average along the Mid-Atlantic and Gulf Coasts in the last century. The main reason for this is coastal land subsidence. This sea level rise is considered more as relative sea level rise than global sea level rise. Thus, instead of studying sea level rise globally, this paper describes a statistical...

  5. Methodology for economic evaluation of software development projects

    International Nuclear Information System (INIS)

    Witte, D.M.

    1990-01-01

    Many oil and gas exploration and production companies develop computer software in-house or with contract programmers to support their exploration activities. Software development projects compete for funding with exploration and development projects, though most companies lack valid comparison measures for the two types of projects. This paper presents a methodology of pro forma cash flow analysis for software development proposals intended for internal use. This methodology, based on estimates of development and support costs, exploration benefits, and probability of successful development and implementation, can be used to compare proposed software development projects directly with competing exploration proposals.
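
    The comparison the paper proposes (discounted cash flows, with benefits weighted by the probability of successful development and implementation) can be sketched as a risk-adjusted net present value. The helper risk_adjusted_npv and all figures below are hypothetical illustrations, not the author's model.

```python
# Minimal sketch of a pro forma cash-flow evaluation: an up-front
# development cost, then annual benefits weighted by the probability of
# successful development, discounted back to the present.

def risk_adjusted_npv(dev_cost, annual_support, annual_benefit,
                      p_success, years=5, rate=0.10):
    """Discounted expected cash flows minus the up-front development cost."""
    npv = -dev_cost
    expected_cash = p_success * annual_benefit - annual_support
    for t in range(1, years + 1):
        npv += expected_cash / (1.0 + rate) ** t
    return npv

# A $500k build returning $300k/yr in exploration benefits with a 70%
# chance of success and $50k/yr support, over 5 years at a 10% rate:
print(round(risk_adjusted_npv(500_000, 50_000, 300_000, 0.7), 2))
```

    A positive NPV under these assumptions means the software proposal can be ranked directly against exploration proposals evaluated the same way.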

  6. The ISO 50001 Impact Estimator Tool (IET 50001 V1.1.4) - User Guide and Introduction to the ISO 50001 Impacts Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Therkelsen, Peter L. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Rao, Prakash [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Aghajanzadeh, Arian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); McKane, Aimee T. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-08-01

    ISO 50001-Energy management systems – Requirements with guidance for use, is an internationally developed standard that provides organizations with a flexible framework for implementing an energy management system (EnMS) with the goal of continual energy performance improvement. The ISO 50001 standard was first published in 2011 and has since seen growth in the number of certificates issued around the world, primarily in the industrial (agriculture, manufacturing, and mining) and service (commercial) sectors. Policy makers in many regions and countries are looking to or are already using ISO 50001 as a basis for energy efficiency, carbon reduction, and other energy performance improvement schemes. The Impact Estimator Tool 50001 (IET 50001 Tool) is a computational model developed to assist researchers and policy makers determine the potential impact of ISO 50001 implementation in the industrial and service (commercial) sectors for a given region or country. The IET 50001 Tool is based upon a methodology initially developed by the Lawrence Berkeley National Laboratory that has been improved upon and vetted by a group of international researchers. By using a commonly accepted and transparent methodology, users of the IET 50001 Tool can easily and clearly communicate the potential impact of ISO 50001 for a region or country.

  7. Indirect estimators in US federal programs

    CERN Document Server

    1996-01-01

    In 1991, a subcommittee of the Federal Committee on Statistical Methodology met to document the use of indirect estimators - that is, estimators which use data drawn from a domain or time different from the domain or time for which an estimate is required. This volume comprises the eight reports which describe the use of indirect estimators and they are based on case studies from a variety of federal programs. As a result, many researchers will find this book provides a valuable survey of how indirect estimators are used in practice and which addresses some of the pitfalls of these methods.

  8. Rise and Shock: Optimal Defibrillator Placement in a High-rise Building.

    Science.gov (United States)

    Chan, Timothy C Y

    2017-01-01

    Out-of-hospital cardiac arrests (OHCA) in high-rise buildings experience lower survival and longer delays until paramedic arrival. Use of publicly accessible automated external defibrillators (AED) can improve survival, but "vertical" placement has not been studied. We aim to determine whether elevator-based or lobby-based AED placement results in shorter vertical distance travelled ("response distance") to OHCAs in a high-rise building. We developed a model of a single-elevator, n-floor high-rise building. We calculated and compared the average distance from AED to floor of arrest for the two AED locations. We modeled OHCA occurrences using floor-specific Poisson processes, with rate λ₁ for OHCA on the ground floor and rate λ on any above-ground floor. The elevator was modeled with an override function enabling direct travel to the target floor. The elevator location upon override was modeled as a discrete uniform random variable. Calculations used the laws of probability. Elevator-based AED placement had a shorter average response distance if the number of floors (n) in the building exceeded three quarters of the ratio of ground-floor OHCA risk to above-ground-floor risk (λ₁/λ) plus one half (n ≥ 3λ₁/(4λ) + 0.5). Otherwise, a lobby-based AED had a shorter average response distance. If OHCA risk on each floor was equal, an elevator-based AED had a shorter average response distance. Elevator-based AEDs travel less vertical distance to OHCAs in tall buildings or those with uniform vertical risk, while lobby-based AEDs travel less vertical distance in buildings with substantial lobby, underground, and nearby street-level traffic and OHCA risk.
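
    The decision rule quoted in the abstract reduces to a one-line threshold test. The sketch below implements it directly; the example rates are invented to illustrate the two regimes.

```python
# The abstract's placement rule: an elevator-based AED has the shorter
# average vertical response distance when n >= 3*lam1/(4*lam) + 0.5,
# where lam1 is the ground-floor OHCA rate and lam the rate on each
# above-ground floor.

def preferred_aed_location(n_floors: int, lam1: float, lam: float) -> str:
    threshold = 3.0 * lam1 / (4.0 * lam) + 0.5
    return "elevator" if n_floors >= threshold else "lobby"

# Uniform risk on every floor (lam1 == lam): threshold is 1.25 floors,
# so the elevator wins in any multi-storey building.
print(preferred_aed_location(10, lam1=1.0, lam=1.0))   # elevator
# Heavy ground-floor risk (lam1 = 40*lam) gives a threshold of 30.5
# floors, so the lobby wins unless the building is very tall.
print(preferred_aed_location(20, lam1=40.0, lam=1.0))  # lobby
print(preferred_aed_location(40, lam1=40.0, lam=1.0))  # elevator
```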

  9. CUEX methodology for assessing radiological impacts in the context of ICRP Recommendations

    International Nuclear Information System (INIS)

    Rohwer, P.S.; Kaye, S.V.; Struxness, E.G.

    1975-01-01

    The Cumulative Exposure Index (CUEX) methodology was developed to estimate and assess, in the context of International Commission on Radiological Protection (ICRP) Recommendations, the total radiation dose to man due to environmental releases of radioactivity from nuclear applications. Each CUEX, a time-integrated radionuclide concentration (e.g., μCi·h·cm⁻³), reflects the selected annual dose limit for the reference organ and the estimated total dose to that organ via all exposure modes for a specific exposure situation. To assess the radiological significance of an environmental release of radioactivity, calculated or measured radionuclide concentrations in a suitable environmental sampling medium are compared with CUEXs determined for that medium under comparable conditions. The models and computer codes used in the CUEX methodology to predict environmental transport and to estimate radiation dose have been thoroughly tested. These models and codes are identified and described briefly. Calculation of a CUEX is shown step by step. An application of the methodology to a hypothetical atmospheric release involving four radionuclides illustrates use of the CUEX computer code to assess the radiological significance of a release, and to determine the relative importance (i.e. percentage of the estimated total dose contributed) of each radionuclide and each mode of exposure. The data requirements of the system are shown to be extensive, but not excessive in view of the assessments and analyses provided by the CUEX code. (author)

  10. Methodology for estimating accidental radioactive releases in nuclear waste management

    International Nuclear Information System (INIS)

    Levy, H.B.

    1979-01-01

    Estimation of the risks of accidental radioactive releases is necessary in assessing the safety of any nuclear waste management system. The case of a radioactive waste form enclosed in a barrier system is considered. Two test calculations were carried out

  11. Why does a spinning egg rise?

    Science.gov (United States)

    Cross, Rod

    2018-03-01

    Experimental and theoretical results are presented concerning the rise of a spinning egg. It was found that an egg rises quickly while it is sliding and then more slowly when it starts rolling. The angular momentum of the egg projected in the XZ plane changed in the same direction as the friction torque, as expected, by rotating away from the vertical Z axis. The latter result does not explain the rise. However, an even larger effect arises from the Y component of the angular momentum vector. As the egg rises, the egg rotates about the Y axis, an effect that is closely analogous to rotation of the egg about the Z axis. Both effects can be described in terms of precession about the respective axes. Steady precession about the Z axis arises from the normal reaction force in the Z direction, while precession about the Y axis arises from the friction force in the Y direction. Precession about the Z axis ceases if the normal reaction force decreases to zero, and precession about the Y axis ceases if the friction force decreases to zero.

  12. DEVELOPMENT OF HIGH-RISE CONSTRUCTION IN THE CITIES WITH POPULATION FROM 250 TO 500 THOUSAND INHABITANTS (on the example of the cities of the Ural Federal District)

    Directory of Open Access Journals (Sweden)

    Olga Mikhaylovna Shentsova

    2017-09-01

    Full Text Available The article reviews the history of high-rise construction, how high-rise buildings are perceived in the urban environment, and the factors that influence the siting of high-rise buildings. The concepts of “skyscraper” and “high-rise dominant” are defined. The article also surveys the existing high-rise buildings of the large cities of the Ural Federal District with populations from 250 to 500 thousand inhabitants: Kurgan, Nizhny Tagil, Nizhnevartovsk, Surgut, and Magnitogorsk. For Magnitogorsk, an analysis of the town-planning situation identifies the possible, most probable, and proposed locations for high-rise buildings at the intersections or ends of street axes. The purpose of the work is to analyze high-rise construction in the large cities of the Ural Federal District with populations from 250 to 500 thousand inhabitants. Methods: theoretical and visual analysis, observation, and the study of literature and Internet sources. Results: systematized theoretical material in the field of architecture and town planning of the cities of the Ural Federal District. Scope of results: the findings can be applied in architectural education and in practical architectural work.

  13. Bayesian estimation and tracking a practical guide

    CERN Document Server

    Haug, Anton J

    2012-01-01

    A practical approach to estimating and tracking dynamic systems in real-world applications. Much of the literature on performing estimation for non-Gaussian systems is short on practical methodology, while Gaussian methods often lack a cohesive derivation. Bayesian Estimation and Tracking addresses the gap in the field on both accounts, providing readers with a comprehensive overview of methods for estimating both linear and nonlinear dynamic systems driven by Gaussian and non-Gaussian noises. Featuring a unified approach to Bayesian estimation and tracking, the book emphasizes the derivation

  14. Approaches to estimating decommissioning costs

    International Nuclear Information System (INIS)

    Smith, R.I.

    1990-07-01

    The chronological development of methodology for estimating the cost of nuclear reactor power station decommissioning is traced from the mid-1970s through 1990. Three techniques for developing decommissioning cost estimates are described. The two viable techniques are compared by examining estimates developed for the same nuclear power station using both methods. The comparison shows that the differences between the estimates are due largely to differing assumptions regarding the size of the utility and operating contractor overhead staffs. It is concluded that the two methods provide bounding estimates on a range of manageable costs, and provide reasonable bases for the utility rate adjustments necessary to pay for future decommissioning costs. 6 refs

  15. Keep up or drown: adjustment of western Pacific coral reefs to sea-level rise in the 21st century.

    Science.gov (United States)

    van Woesik, R; Golbuu, Y; Roff, G

    2015-07-01

    Since the Mid-Holocene, some 5000 years ago, coral reefs in the Pacific Ocean have been vertically constrained by sea level. Contemporary sea-level rise is releasing these constraints, providing accommodation space for vertical reef expansion. Here, we show that Porites microatolls, from reef-flat environments in Palau (western Pacific Ocean), are 'keeping up' with contemporary sea-level rise. Measurements of 570 reef-flat Porites microatolls at 10 locations around Palau revealed recent vertical skeletal extension (78±13 mm) over the last 6-8 years, which is consistent with the timing of the recent increase in sea level. We modelled whether microatoll growth rates will potentially 'keep up' with predicted sea-level rise in the near future, based upon average growth, and assuming a decline in growth for every 1°C increase in temperature. We then compared these estimated extension rates with rates of sea-level rise under four Representative Concentration Pathways (RCPs). Our model suggests that under low-mid RCP scenarios, reef-coral growth will keep up with sea-level rise, but if greenhouse gas concentrations exceed 670 ppm atmospheric CO2 levels and with +2.2°C sea-surface temperature by 2100 (RCP 6.0 W m(-2)), our predictions indicate that Porites microatolls will be unable to keep up with projected rates of sea-level rise in the twenty-first century.
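
    The keep-up comparison above can be sketched as a growth-minus-decline check against a projected rise rate. The measured extension rate (78 mm over roughly 7 years) is from the abstract; the growth decline per degree of warming and the per-scenario (rate, warming) pairs below are placeholder assumptions, not the authors' calibrated values.

```python
# Sketch of the 'keep up or drown' comparison. Base growth is derived from
# the abstract (78 mm over ~7 years); the decline coefficient and scenario
# numbers are hypothetical placeholders for illustration only.
BASE_GROWTH_MM_YR = 78 / 7.0     # ~11.1 mm/yr vertical skeletal extension
DECLINE_MM_YR_PER_C = 3.0        # assumed growth loss per +1 deg C warming

def keeps_up(slr_rate_mm_yr: float, warming_c: float) -> bool:
    """True if temperature-adjusted coral growth matches the rise rate."""
    growth = BASE_GROWTH_MM_YR - DECLINE_MM_YR_PER_C * warming_c
    return growth >= slr_rate_mm_yr

# Hypothetical end-of-century (rise rate mm/yr, warming deg C) per scenario.
scenarios = {"RCP2.6": (4.0, 1.0), "RCP4.5": (6.0, 1.4),
             "RCP6.0": (8.0, 2.2), "RCP8.5": (12.0, 3.7)}
for name, (rate, warming) in scenarios.items():
    print(name, "keeps up" if keeps_up(rate, warming) else "drowns")
```

    With these placeholder numbers the low-mid scenarios keep up and the high scenarios do not, matching the qualitative conclusion of the abstract.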

  16. Costs and benefits of sulphur oxide control: a methodological study

    Energy Technology Data Exchange (ETDEWEB)

    1981-01-01

    The objective is to present for the first time a methodology for estimating the costs and benefits of SO/sub x/ control strategies as an aid to policy formulation which could create the basis for further action in member countries. To illustrate the methodology, different control scenarios for Western Europe are developed and analyzed using the cost-benefit approach, and some preliminary conclusions are drawn. The next step assesses the impact of the emissions on ambient air quality, calculated with the aid of long-range and urban air quality models. Finally, the impact of the calculated concentrations of SO/sub x/ in the different scenarios on a number of environmental and human assets - materials, agricultural crops, health, and aquatic ecosystems - is estimated in order to have a measure of the benefits of control.

  17. Do pregnancy characteristics contribute to rising childhood cancer incidence rates in the United States?

    Science.gov (United States)

    Kehm, Rebecca D; Osypuk, Theresa L; Poynter, Jenny N; Vock, David M; Spector, Logan G

    2018-03-01

    Since 1975, childhood cancer incidence rates have gradually increased in the United States; however, few studies have conducted analyses across time to unpack this temporal rise. The aim of this study was to test the hypothesis that increasing cancer incidence rates are due to secular trends in pregnancy characteristics that are established risk factors for childhood cancer incidence including older maternal age, higher birthweight, and lower birth order. We also considered temporal trends in sociodemographic characteristics including race/ethnicity and poverty. We conducted a time series county-level ecologic analysis using linked population-based data from Surveillance, Epidemiology, and End Results cancer registries (1975-2013), birth data from the National Center for Health Statistics (1970-2013), and sociodemographic data from the US Census (1970-2010). We estimated unadjusted and adjusted average annual percent changes (AAPCs) in incidence of combined (all diagnoses) and individual types of cancer among children, ages 0-4 years, from Poisson mixed models. There was a statistically significant unadjusted temporal rise in incidence of combined childhood cancers (AAPC = 0.71%; 95% CI = 0.55-0.86), acute lymphoblastic leukemia (0.78%; 0.49-1.07), acute myeloid leukemia (1.86%; 1.13-2.59), central nervous system tumors (1.31%; 0.94-1.67), and hepatoblastoma (2.70%; 1.68-3.72). Adjustment for county-level maternal age reduced estimated AAPCs between 8% (hepatoblastoma) and 55% (combined). However, adjustment for other county characteristics did not attenuate AAPCs, and AAPCs remained significantly above 0% in models fully adjusted for county-level characteristics. Although rising maternal age may account for some of the increase in childhood cancer incidence over time, other factors, not considered in this analysis, may also contribute to temporal trends. © 2017 Wiley Periodicals, Inc.
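
    An AAPC of the kind reported above can be recovered from a log-linear trend in incidence rates: fit log(rate) against calendar year and transform the slope, AAPC = (exp(slope) - 1) x 100. The sketch below does this with ordinary least squares on synthetic rates built to grow by exactly 0.7% per year; they are illustrative numbers, not SEER data.

```python
import math

# Recover an average annual percent change (AAPC) from a log-linear trend,
# in the spirit of the Poisson trend models in the abstract. The rates are
# synthetic: they grow by exactly 0.7% per year, so the fitted AAPC ~0.70%.
years = list(range(1975, 2014))
rates = [20.0 * 1.007 ** (y - 1975) for y in years]  # cases per 100,000

# Ordinary least squares of log(rate) on calendar year.
n = len(years)
xbar = sum(years) / n
ybar = sum(math.log(r) for r in rates) / n
slope = (sum((x - xbar) * (math.log(r) - ybar) for x, r in zip(years, rates))
         / sum((x - xbar) ** 2 for x in years))

aapc = (math.exp(slope) - 1.0) * 100.0
print(f"AAPC = {aapc:.2f}% per year")   # ~0.70
```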

  18. Models of expected returns on the brazilian market: Empirical tests using predictive methodology

    Directory of Open Access Journals (Sweden)

    Adriano Mussa

    2009-01-01

    Full Text Available Predictive methodologies for testing expected-return models are widely used in the international academic environment. However, these methods have not been applied in Brazil in a systematic way. Generally, empirical studies of the Brazilian stock market concentrate only on the first step of these methodologies. The purpose of this article was to test and compare the CAPM, 3-factor, and 4-factor models using a predictive methodology with two steps – time-series and cross-section regressions – with standard errors obtained by the techniques of Fama and MacBeth (1973). The results indicated the superiority of the 4-factor model over the 3-factor model, and of the 3-factor model over the CAPM, but none of the tested models was sufficient to explain Brazilian stock returns. Contrary to some empirical evidence that does not use predictive methodology, the size and momentum effects seem not to exist in the Brazilian capital markets, but there is evidence of the value effect and of the relevance of the market factor in explaining expected returns. These findings raise some questions, mainly because of the originality of the methodology in the local market and because the subject is still incipient and polemic in the Brazilian academic environment.

  19. Estimating significances of differences between slopes: A new methodology and software

    Directory of Open Access Journals (Sweden)

    Vasco M. N. C. S. Vieira

    2013-09-01

    Full Text Available Determining the significance of slope differences is a common requirement in studies of self-thinning, ontogeny, and sexual dimorphism, among others. This has long been carried out by testing for the overlap of the bootstrapped 95% confidence intervals of the slopes. However, numerical random resampling with replacement favours recombinations that yield widely diverging slopes, widening the confidence intervals and thus increasing the chances of overlooking significant differences. To overcome this problem, a permutation test simulating the null hypothesis of no difference between slopes is proposed. This new methodology, when applied to both artificial and real data, showed an enhanced ability to differentiate slopes.
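
    The proposed test can be sketched directly: under the null hypothesis of equal slopes, observations are exchangeable between groups, so group labels are shuffled and the slope difference recomputed to build a null distribution. The data below are synthetic, and the sketch assumes the two groups share an intercept and x-range, which is what makes pooling (x, y) pairs permissible.

```python
import random

def ols_slope(xs, ys):
    """Ordinary least-squares slope of y on x."""
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    return (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
            / sum((x - xbar) ** 2 for x in xs))

def slope_permutation_test(a, b, n_perm=2000, seed=1):
    """p-value for H0: slope(a) == slope(b); a, b are lists of (x, y) pairs."""
    rng = random.Random(seed)
    observed = abs(ols_slope(*zip(*a)) - ols_slope(*zip(*b)))
    pooled = list(a) + list(b)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                      # exchange labels under H0
        pa, pb = pooled[:len(a)], pooled[len(a):]
        if abs(ols_slope(*zip(*pa)) - ols_slope(*zip(*pb))) >= observed:
            hits += 1
    return hits / n_perm

# Synthetic groups with true slopes 2 and 3 over the same x-range.
rnd = random.Random(0)
a = [(x, 2.0 * x + rnd.gauss(0, 0.5)) for x in range(20)]
b = [(x, 3.0 * x + rnd.gauss(0, 0.5)) for x in range(20)]
print(f"p = {slope_permutation_test(a, b):.4f}")  # small p: slopes differ
```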

  20. Refusal bias in the estimation of HIV prevalence

    NARCIS (Netherlands)

    Janssens, Wendy; van der Gaag, Jacques; Rinke de Wit, Tobias F.; Tanović, Zlata

    2014-01-01

    In 2007, UNAIDS corrected estimates of global HIV prevalence downward from 40 million to 33 million based on a methodological shift from sentinel surveillance to population-based surveys. Since then, population-based surveys are considered the gold standard for estimating HIV prevalence. However,