WorldWideScience

Sample records for reliable probability forecasts

  1. Forecasting reliability of transformer populations

    NARCIS (Netherlands)

    Schijndel, van A.; Wetzer, J.; Wouters, P.A.A.F.

    2007-01-01

The expected replacement wave in the current power grid confronts asset managers with challenging questions. Setting up a replacement strategy and planning calls for a forecast of long-term component reliability. For transformers, the future failure probability can be predicted based on the ongoing

  2. Dynamic SEP event probability forecasts

    Science.gov (United States)

    Kahler, S. W.; Ling, A.

    2015-10-01

    The forecasting of solar energetic particle (SEP) event probabilities at Earth has been based primarily on the estimates of magnetic free energy in active regions and on the observations of peak fluxes and fluences of large (≥ M2) solar X-ray flares. These forecasts are typically issued for the next 24 h or with no definite expiration time, which can be deficient for time-critical operations when no SEP event appears following a large X-ray flare. It is therefore important to decrease the event probability forecast with time as a SEP event fails to appear. We use the NOAA listing of major (≥10 pfu) SEP events from 1976 to 2014 to plot the delay times from X-ray peaks to SEP threshold onsets as a function of solar source longitude. An algorithm is derived to decrease the SEP event probabilities with time when no event is observed to reach the 10 pfu threshold. In addition, we use known SEP event size distributions to modify probability forecasts when SEP intensity increases occur below the 10 pfu event threshold. An algorithm to provide a dynamic SEP event forecast, Pd, for both situations of SEP intensities following a large flare is derived.
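The probability-decay step described above can be sketched as follows. The exponential delay distribution and its median are illustrative stand-ins for the longitude-dependent delay distributions fitted from the NOAA 1976-2014 event list:

```python
import math

def dynamic_sep_probability(p0, hours_elapsed, median_delay_h=6.0):
    """Decay an initial SEP event probability as time passes with no event.

    Hypothetical sketch: the flare-to-SEP-onset delay distribution is modeled
    as exponential with the given median, and the initial probability p0 is
    rescaled by the fraction of historical events whose delays exceed the
    elapsed time. The actual algorithm is fit to the NOAA event list as a
    function of solar source longitude.
    """
    rate = math.log(2) / median_delay_h          # exponential rate from the median
    survival = math.exp(-rate * hours_elapsed)   # P(delay > t): no onset yet
    return p0 * survival

# An initial 40% forecast; after 12 h with no 10 pfu onset the probability drops.
p12 = dynamic_sep_probability(0.40, 12.0)
```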

  3. Using inferred probabilities to measure the accuracy of imprecise forecasts

    Directory of Open Access Journals (Sweden)

    Paul Lehner

    2012-11-01

Research on forecasting is effectively limited to forecasts that are expressed with clarity, which is to say that the forecasted event must be sufficiently well defined that it can be clearly resolved whether or not the event occurred, and forecast certainties are expressed as quantitative probabilities. When forecasts are expressed with clarity, quantitative measures (scoring rules, calibration, discrimination, etc.) can be used to measure forecast accuracy, which in turn can be used to measure the comparative accuracy of different forecasting methods. Unfortunately, most real-world forecasts are not expressed clearly. This lack of clarity extends both to the description of the forecast event and to the use of vague language to express forecast certainty. It is thus difficult to assess the accuracy of most real-world forecasts, and consequently the accuracy of the methods used to generate them. This paper addresses this deficiency by presenting an approach to measuring the accuracy of imprecise real-world forecasts using the same quantitative metrics routinely used to measure the accuracy of well-defined forecasts. To demonstrate applicability, the Inferred Probability Method is applied to measure the accuracy of forecasts in fourteen documents examining complex political domains. Key words: inferred probability, imputed probability, judgment-based forecasting, forecast accuracy, imprecise forecasts, political forecasting, verbal probability, probability calibration.

  4. On the reliability of seasonal climate forecasts

    Science.gov (United States)

    Weisheimer, A.; Palmer, T. N.

    2014-01-01

Seasonal climate forecasts are being used increasingly across a range of application sectors. A recent UK governmental report asked: how good are seasonal forecasts on a scale of 1–5 (where 5 is very good), and how good can we expect them to be in 30 years' time? Seasonal forecasts are made from ensembles of integrations of numerical models of climate. We argue that 'goodness' should be assessed first and foremost in terms of the probabilistic reliability of these ensemble-based forecasts; reliable inputs are essential for any forecast-based decision-making. We propose that a '5' should be reserved for systems that are not only reliable overall, but where, in particular, small ensemble spread is a reliable indicator of low ensemble forecast error. We study the reliability of regional temperature and precipitation forecasts of the current operational seasonal forecast system of the European Centre for Medium-Range Weather Forecasts, universally regarded as one of the world-leading operational institutes producing seasonal climate forecasts. A wide range of 'goodness' rankings is found, depending on region and variable (with summer forecasts of rainfall over Northern Europe performing exceptionally poorly). Finally, we discuss the prospects of reaching '5' across all regions and variables in 30 years' time. PMID:24789559
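The notion of probabilistic reliability used here can be illustrated with a toy reliability-diagram computation, in which binned mean forecast probabilities are compared with observed event frequencies. This is a generic verification sketch, not the paper's analysis:

```python
def reliability_table(forecasts, outcomes, n_bins=5):
    """Group probability forecasts into bins and compare the mean forecast
    probability in each bin with the observed event frequency; a reliable
    forecast system has these two nearly equal in every bin."""
    bins = [[] for _ in range(n_bins)]
    for p, o in zip(forecasts, outcomes):
        i = min(int(p * n_bins), n_bins - 1)   # which probability bin p falls in
        bins[i].append((p, o))
    table = []
    for b in bins:
        if b:
            mean_p = sum(p for p, _ in b) / len(b)     # mean forecast probability
            obs_freq = sum(o for _, o in b) / len(b)   # observed relative frequency
            table.append((round(mean_p, 3), round(obs_freq, 3), len(b)))
    return table

# A perfectly reliable toy record: 10% forecasts verify 1 time in 10, 90% forecasts 9 in 10.
f = [0.1] * 10 + [0.9] * 10
o = [1] + [0] * 9 + [1] * 9 + [0]
tbl = reliability_table(f, o)   # [(0.1, 0.1, 10), (0.9, 0.9, 10)]
```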

  5. PROBABLE FORECASTING IN THE COURSE OF INTERPRETING

    Directory of Open Access Journals (Sweden)

    Ye. B. Kagan

    2017-01-01

Introduction. Translation practice is heuristic in nature and engages the cognitive structures of the interpreter's consciousness. In training translators, special attention is paid to developing the skill of probabilistic forecasting. The aim of the present publication is to understand the process of anticipation from the position of a cognitive model of translation, and to develop exercises aimed at building the prognostic abilities of students and interpreters working with newspaper articles that carry metaphorical headlines. Methodology and research methods. The study is based on the competence approach to training student translators and on a complex of interrelated scientific methods, chief among them the psycholinguistic experiment. Using quantitative data, the features of the perception of newspaper texts from their metaphorical titles are characterized. Results and scientific novelty. On the basis of an experiment in which participants predicted the content of newspaper articles with metaphorical headlines, it is concluded that the main condition of predictability is expectation. Probabilistic forecasting, as a professional competence of the future translator, is formed in the course of training by integrating the efforts of various departments of a language university. Specific exercises for developing students' anticipation in translation and interpretation courses are offered. Practical significance. The results of the study can be used by foreign-language teachers at both language and non-language universities when teaching students of different specialties to translate foreign texts.

  6. Electricity price forecasting using Enhanced Probability Neural Network

    International Nuclear Information System (INIS)

    Lin, Whei-Min; Gow, Hong-Jey; Tsai, Ming-Tang

    2010-01-01

This paper proposes a price forecasting system for electricity market participants to reduce the risk of price volatility. Combining the Probability Neural Network (PNN) and Orthogonal Experimental Design (OED), an Enhanced Probability Neural Network (EPNN) is proposed. The Locational Marginal Price (LMP), system load and temperature of the PJM system were collected, and the data clusters were embedded in an Excel database according to year, season, workday, and weekend. With the OED used to smooth parameters in the EPNN, the forecasting error can be reduced during the training process, improving accuracy and reliability so that even price "spikes" can be tracked closely. Simulation results show the effectiveness of the proposed EPNN in providing quality information in a price-volatile environment. (author)

  7. Can confidence indicators forecast the probability of expansion in Croatia?

    Directory of Open Access Journals (Sweden)

    Mirjana Čižmešija

    2016-04-01

The aim of this paper is to investigate how reliable confidence indicators are in forecasting the probability of expansion. We consider three Croatian Business Survey indicators: the Industrial Confidence Indicator (ICI), the Construction Confidence Indicator (BCI) and the Retail Trade Confidence Indicator (RTCI). The quarterly data used in the research cover the period from 1999/Q1 to 2014/Q1. The empirical analysis consists of two parts. The non-parametric Bry-Boschan algorithm is used to distinguish periods of expansion from periods of recession in the Croatian economy. Then, various nonlinear probit models were estimated. The models differ with respect to the regressors (confidence indicators) and the time lags. The positive signs of the estimated parameters suggest that the probability of expansion increases with an increase in the confidence indicators. Based on the obtained results, the conclusion is that the ICI is the most powerful predictor of the probability of expansion in Croatia.
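A probit specification of this kind maps an index of the confidence indicator through the standard normal CDF. The sketch below uses invented coefficients for illustration, not the estimates from the Croatian survey data:

```python
import math

def probit_prob(confidence_indicator, beta0=-0.5, beta1=0.04):
    """P(expansion) from a probit model: Phi(beta0 + beta1 * indicator).

    The coefficients are hypothetical; in practice they are estimated by
    maximum likelihood from a binary expansion/recession series (here, the
    regimes dated by the Bry-Boschan algorithm).
    """
    z = beta0 + beta1 * confidence_indicator
    # Standard normal CDF via the error function (no SciPy needed)
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

p = probit_prob(12.5)   # index z = 0 here, so P(expansion) = 0.5
```

A positive `beta1`, as the paper reports, means the fitted probability of expansion rises monotonically with the confidence indicator.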

  8. More intense experiences, less intense forecasts: why people overweight probability specifications in affective forecasts.

    Science.gov (United States)

    Buechel, Eva C; Zhang, Jiao; Morewedge, Carey K; Vosgerau, Joachim

    2014-01-01

    We propose that affective forecasters overestimate the extent to which experienced hedonic responses to an outcome are influenced by the probability of its occurrence. The experience of an outcome (e.g., winning a gamble) is typically more affectively intense than the simulation of that outcome (e.g., imagining winning a gamble) upon which the affective forecast for it is based. We suggest that, as a result, experiencers allocate a larger share of their attention toward the outcome (e.g., winning the gamble) and less to its probability specifications than do affective forecasters. Consequently, hedonic responses to an outcome are less sensitive to its probability specifications than are affective forecasts for that outcome. The results of 6 experiments provide support for our theory. Affective forecasters overestimated how sensitive experiencers would be to the probability of positive and negative outcomes (Experiments 1 and 2). Consistent with our attentional account, differences in sensitivity to probability specifications disappeared when the attention of forecasters was diverted from probability specifications (Experiment 3) or when the attention of experiencers was drawn toward probability specifications (Experiment 4). Finally, differences in sensitivity to probability specifications between forecasters and experiencers were diminished when the forecasted outcome was more affectively intense (Experiments 5 and 6).

  9. Objective Lightning Probability Forecasts for East-Central Florida Airports

    Science.gov (United States)

    Crawford, Winfred C.

    2013-01-01

    The forecasters at the National Weather Service in Melbourne, FL, (NWS MLB) identified a need to make more accurate lightning forecasts to help alleviate delays due to thunderstorms in the vicinity of several commercial airports in central Florida at which they are responsible for issuing terminal aerodrome forecasts. Such forecasts would also provide safer ground operations around terminals, and would be of value to Center Weather Service Units serving air traffic controllers in Florida. To improve the forecast, the AMU was tasked to develop an objective lightning probability forecast tool for the airports using data from the National Lightning Detection Network (NLDN). The resulting forecast tool is similar to that developed by the AMU to support space launch operations at Kennedy Space Center (KSC) and Cape Canaveral Air Force Station (CCAFS) for use by the 45th Weather Squadron (45 WS) in previous tasks (Lambert and Wheeler 2005, Lambert 2007). The lightning probability forecasts are valid for the time periods and areas needed by the NWS MLB forecasters in the warm season months, defined in this task as May-September.

  10. Tropical Cyclone Wind Probability Forecasting (WINDP).

    Science.gov (United States)

    1981-04-01

The accuracy of small probabilities (below 10%) is limited by the number of significant digits given; they should therefore be regarded as approximate. [Remainder of the record is illegible scanning residue.]

  11. Calibration and combination of dynamical seasonal forecasts to enhance the value of predicted probabilities for managing risk

    Science.gov (United States)

    Dutton, John A.; James, Richard P.; Ross, Jeremy D.

    2013-06-01

    Seasonal probability forecasts produced with numerical dynamics on supercomputers offer great potential value in managing risk and opportunity created by seasonal variability. The skill and reliability of contemporary forecast systems can be increased by calibration methods that use the historical performance of the forecast system to improve the ongoing real-time forecasts. Two calibration methods are applied to seasonal surface temperature forecasts of the US National Weather Service, the European Centre for Medium Range Weather Forecasts, and to a World Climate Service multi-model ensemble created by combining those two forecasts with Bayesian methods. As expected, the multi-model is somewhat more skillful and more reliable than the original models taken alone. The potential value of the multimodel in decision making is illustrated with the profits achieved in simulated trading of a weather derivative. In addition to examining the seasonal models, the article demonstrates that calibrated probability forecasts of weekly average temperatures for leads of 2-4 weeks are also skillful and reliable. The conversion of ensemble forecasts into probability distributions of impact variables is illustrated with degree days derived from the temperature forecasts. Some issues related to loss of stationarity owing to long-term warming are considered. The main conclusion of the article is that properly calibrated probabilistic forecasts possess sufficient skill and reliability to contribute to effective decisions in government and business activities that are sensitive to intraseasonal and seasonal climate variability.
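The conversion of an ensemble forecast into a probability distribution of an impact variable such as degree days can be sketched as follows; the member values and the 65 °F base are illustrative, not data from the forecast systems discussed:

```python
def heating_degree_days(ensemble_temps_f, base=65.0):
    """Convert an ensemble of daily-average temperature forecasts (deg F) into
    a distribution of heating degree days, HDD = max(0, base - T).
    Treating each calibrated member as equally likely turns the ensemble into
    an empirical probability distribution of the impact variable."""
    hdds = [max(0.0, base - t) for t in ensemble_temps_f]
    hdds.sort()
    return hdds

members = [58.0, 61.5, 63.0, 66.0, 70.0]       # illustrative ensemble (deg F)
dist = heating_degree_days(members)             # [0.0, 0.0, 2.0, 3.5, 7.0]
prob_positive = sum(h > 0 for h in dist) / len(dist)   # P(HDD > 0) = 0.6
```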

  12. Communicating likelihoods and probabilities in forecasts of volcanic eruptions

    Science.gov (United States)

    Doyle, Emma E. H.; McClure, John; Johnston, David M.; Paton, Douglas

    2014-02-01

The issuing of forecasts and warnings of natural hazard events, such as volcanic eruptions, earthquake aftershock sequences and extreme weather, often involves the use of probabilistic terms, particularly when communicated by scientific advisory groups to key decision-makers, who can differ greatly in relative expertise and in their function in the decision-making process. Recipients may also differ in their perception of the relative importance of political and economic influences on interpretation. Consequently, the interpretation of these probabilistic terms can vary greatly depending on the framing of the statements and on whether verbal or numerical terms are used. We present a review of the psychology literature on how the framing of information influences the communication of these probability terms. It is also unclear how people rate their perception of an event's likelihood throughout a time frame when a forecast time window is stated. Previous research has identified that, when presented with a 10-year time window forecast, participants viewed the likelihood of an event occurring 'today' as lower than that in year 10. Here we show that this skew in perception also occurs for short-term time windows (under one week) that are of most relevance for emergency warnings. In addition, unlike the long-time-window statements, the use of the phrasing "within the next…" instead of "in the next…" does not mitigate this skew, nor do we observe significant differences between the perceived likelihoods of scientists and non-scientists. This finding suggests that effects occurring due to the shorter time window may be 'masking' any differences in perception due to wording or career background observed for long-time-window forecasts. These results have implications for scientific advice, warning forecasts, emergency management decision-making, and public information, as any skew in perceived event likelihood towards the end of a forecast time window may result in

  13. Exploring the interactions between forecast accuracy, risk perception and perceived forecast reliability in reservoir operator's decision to use forecast

    Science.gov (United States)

    Shafiee-Jood, M.; Cai, X.

    2017-12-01

Advances in streamflow forecasts at different time scales offer promise for proactive flood management and improved risk management. Despite this huge potential, previous studies have found that water resources managers are often not willing to incorporate streamflow forecast information in decision making, particularly in risky situations. While the low accuracy of forecast information is often cited as the main reason, some studies have found that implementation of streamflow forecasts is sometimes impeded by institutional obstacles and behavioral factors (e.g., risk perception). In fact, a seminal study by O'Connor et al. (2005) found that risk perception is the strongest determinant of forecast use, while managers' perception of forecast reliability is not significant. In this study, we aim to address this issue again. However, instead of using survey data and regression analysis, we develop a theoretical framework to assess the user-perceived value of streamflow forecasts. The framework includes a novel behavioral component which incorporates both risk perception and perceived forecast reliability. The framework is then used in a hypothetical problem in which a reservoir operator must react to probabilistic flood forecasts with different reliabilities. The framework allows us to explore the interactions between risk perception and perceived forecast reliability, and between the behavioral components and information accuracy. The findings will provide insights to improve the usability of flood forecast information through better communication and education.

  14. Operational Earthquake Forecasting and Decision-Making in a Low-Probability Environment

    Science.gov (United States)

    Jordan, T. H.; the International Commission on Earthquake ForecastingCivil Protection

    2011-12-01

    for reliability and skill by retrospective testing, and the models should be under continuous prospective testing against long-term forecasts and alternative time-dependent models. (d) Short-term models used in operational forecasting should be consistent with the long-term forecasts used in probabilistic seismic hazard analysis. (e) Alert procedures should be standardized to facilitate decisions at different levels of government, based in part on objective analysis of costs and benefits. (f) In establishing alert protocols, consideration should also be given to the less tangible aspects of value-of-information, such as gains in psychological preparedness and resilience. Authoritative statements of increased risk, even when the absolute probability is low, can provide a psychological benefit to the public by filling information vacuums that lead to informal predictions and misinformation. Formal OEF procedures based on probabilistic forecasting appropriately separate hazard estimation by scientists from the decision-making role of civil protection authorities. The prosecution of seven Italian scientists on manslaughter charges stemming from their actions before the L'Aquila earthquake makes clear why this separation should be explicit in defining OEF protocols.

  15. Component fragility data base for reliability and probability studies

    International Nuclear Information System (INIS)

    Bandyopadhyay, K.; Hofmayer, C.; Kassier, M.; Pepper, S.

    1989-01-01

    Safety-related equipment in a nuclear plant plays a vital role in its proper operation and control, and failure of such equipment due to an earthquake may pose a risk to the safe operation of the plant. Therefore, in order to assess the overall reliability of a plant, the reliability of performance of the equipment should be studied first. The success of a reliability or a probability study depends to a great extent on the data base. To meet this demand, Brookhaven National Laboratory (BNL) has formed a test data base relating the seismic capacity of equipment specimens to the earthquake levels. Subsequently, the test data have been analyzed for use in reliability and probability studies. This paper describes the data base and discusses the analysis methods. The final results that can be directly used in plant reliability and probability studies are also presented in this paper
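Fragility data of this kind are commonly summarized as lognormal fragility curves relating failure probability to earthquake level. The sketch below uses invented parameters, not values from the BNL database:

```python
import math

def fragility(pga_g, median_capacity_g=0.8, beta=0.4):
    """Lognormal fragility curve: probability that an equipment item fails at
    a given peak ground acceleration (in g). The median capacity and the
    logarithmic standard deviation beta are illustrative placeholders; in a
    study they would be fitted to seismic qualification test data."""
    z = math.log(pga_g / median_capacity_g) / beta
    # Standard normal CDF of the log-capacity margin
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

p_fail = fragility(0.5)   # failure probability at 0.5 g shaking
```

By construction the curve passes through 50% at the median capacity, which is how such curves are usually parameterized in seismic probabilistic risk assessment.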

  16. Reliability of structures by using probability and fatigue theories

    International Nuclear Information System (INIS)

    Lee, Ouk Sub; Kim, Dong Hyeok; Park, Yeon Chang

    2008-01-01

Methodologies to calculate failure probability and to estimate the reliability of fatigue-loaded structures are developed. The applicability of the methodologies is evaluated with the help of the fatigue crack growth models suggested by Paris and Walker. The probability theories utilized are the FORM (first order reliability method), the SORM (second order reliability method) and MCS (Monte Carlo simulation). It is found that the failure probability decreases with an increase of the design fatigue life and the applied minimum stress, and with a decrease of the initial edge crack size, the applied maximum stress and the slope of the Paris equation. Furthermore, according to the sensitivity analysis of random variables, the slope of the Paris equation affects the failure probability dominantly among the random variables in the Paris and Walker models
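A minimal Monte Carlo version of such a failure-probability calculation, using the closed-form integration of the Paris law da/dN = C(ΔK)^m with ΔK = Δσ·Y·√(πa), might look like this. All parameter values and distributions are illustrative, not those of the paper:

```python
import math
import random

def failure_probability(n_samples=20_000, design_life_cycles=2e6, seed=1):
    """Monte Carlo estimate of fatigue failure probability: sample the initial
    crack size and stress range, integrate the Paris law in closed form to get
    cycles-to-failure, and count samples failing before the design life."""
    rng = random.Random(seed)
    C, m, Y = 1e-12, 3.0, 1.12        # illustrative Paris constants and geometry factor
    a_crit = 0.02                     # critical crack size (m), illustrative
    fails = 0
    for _ in range(n_samples):
        a0 = rng.lognormvariate(math.log(1e-3), 0.3)   # initial edge crack size (m)
        ds = rng.gauss(120.0, 15.0)                    # stress range (MPa)
        if ds <= 0:
            continue
        # Closed-form cycle count for m != 2:
        # N_f = (a0^(1-m/2) - a_crit^(1-m/2)) / ((m/2 - 1) * C * (ds*Y*sqrt(pi))^m)
        k = C * (ds * Y * math.sqrt(math.pi)) ** m
        n_f = (a0 ** (1 - m / 2) - a_crit ** (1 - m / 2)) / ((m / 2 - 1) * k)
        if n_f < design_life_cycles:
            fails += 1
    return fails / n_samples

p_fail = failure_probability()
```

Consistent with the paper's finding, lengthening the design fatigue life with everything else fixed raises the estimated failure probability.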

  17. DEVELOPMENT OF THE PROBABLY-GEOGRAPHICAL FORECAST METHOD FOR DANGEROUS WEATHER PHENOMENA

    Directory of Open Access Journals (Sweden)

    Elena S. Popova

    2015-12-01

This paper presents the scheme of a probabilistic-geographical forecast method for dangerous weather phenomena and discusses the two general stages of its realization. It is emphasized that the method under development responds to pressing questions of modern weather forecasting: the forecast is produced for a specific point in space and the corresponding moment in time.

  18. Update to the Objective Lightning Probability Forecast Tool in use at Cape Canaveral Air Force Station, Florida

    Science.gov (United States)

    Lambert, Winifred; Roeder, William

    2013-01-01

This conference poster describes the improvement of a set of lightning probability forecast equations that are used by the 45th Weather Squadron forecasters for their daily 1100 UTC (0700 EDT) weather briefing during the warm season months of May-September. This information is used for general scheduling of operations at Cape Canaveral Air Force Station and Kennedy Space Center. Forecasters at the Spaceflight Meteorology Group also make thunderstorm forecasts during Shuttle flight operations. Five modifications were made by the Applied Meteorology Unit: increased the period of record from 15 to 17 years, changed the method of calculating the flow regime of the day, calculated a new optimal layer relative humidity, used a new smoothing technique for the daily climatology, and used a new valid area. The test results indicated that the modified equations showed an increase in skill over the current equations, good reliability and an ability to distinguish between lightning and non-lightning days.

  19. Modeling and Forecasting (Un)Reliable Realized Covariances for More Reliable Financial Decisions

    DEFF Research Database (Denmark)

    Bollerslev, Tim; Patton, Andrew J.; Quaedvlieg, Rogier

We propose a new framework for modeling and forecasting common financial risks based on (un)reliable realized covariance measures constructed from high-frequency intraday data. Our new approach explicitly incorporates the effect of measurement errors and time-varying attenuation biases...

  20. Ensemble prediction of floods – catchment non-linearity and forecast probabilities

    Directory of Open Access Journals (Sweden)

    C. Reszler

    2007-07-01

Quantifying the uncertainty of flood forecasts by ensemble methods is becoming increasingly important for operational purposes. The aim of this paper is to examine how the ensemble distribution of precipitation forecasts propagates through the catchment system, and to interpret the flood forecast probabilities relative to the forecast errors. We use the 622 km2 Kamp catchment in Austria as an example, where a comprehensive data set, including a 500 yr and a 1000 yr flood, is available. A spatially distributed continuous rainfall-runoff model is used along with ensemble and deterministic precipitation forecasts that combine rain gauge data, radar data and the forecast fields of the ALADIN and ECMWF numerical weather prediction models. The analyses indicate that, for long lead times, the variability of the precipitation ensemble is amplified as it propagates through the catchment system as a result of non-linear catchment response. In contrast, for lead times shorter than the catchment lag time (e.g. 12 h and less), the variability of the precipitation ensemble is decreased as the forecasts are mainly controlled by observed upstream runoff and observed precipitation. Assuming that all ensemble members are equally likely, the statistical analyses for five flood events at the Kamp showed that the ensemble spread of the flood forecasts is always narrower than the distribution of the forecast errors. This is because the ensemble forecasts focus on the uncertainty in forecast precipitation as the dominant source of uncertainty, and other sources of uncertainty are not accounted for. However, a number of analyses, including Relative Operating Characteristic diagrams, indicate that the ensemble spread is a useful indicator to assess potential forecast errors for lead times larger than 12 h.

  1. The Probability of Default Under IFRS 9: Multi-period Estimation and Macroeconomic Forecast

    Directory of Open Access Journals (Sweden)

    Tomáš Vaněk

    2017-01-01

In this paper we propose a straightforward, flexible and intuitive computational framework for multi-period probability of default estimation incorporating macroeconomic forecasts. The concept is based on Markov models, an estimated economic adjustment coefficient and the official economic forecasts of the Czech National Bank. The economic forecasts are taken into account in a separate step to better distinguish between idiosyncratic and systemic risk. This approach is also attractive from an interpretational point of view. The proposed framework can be used especially when calculating lifetime expected credit losses under IFRS 9.
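The multi-period mechanics can be sketched with a two-state absorbing Markov chain (performing/defaulted) in which each period's marginal PD is the base PD scaled by a macroeconomic adjustment coefficient. The coefficients below are hypothetical, not CNB forecasts:

```python
def lifetime_pd(base_pd, adjustments):
    """Cumulative lifetime PD from a two-state absorbing Markov chain.

    Each period's marginal PD is the base one-period PD scaled by that
    period's macroeconomic adjustment coefficient (> 1 in a downturn,
    < 1 in an upswing). Survival probabilities multiply across periods;
    the lifetime PD is one minus the final survival probability.
    """
    survival = 1.0
    for adj in adjustments:
        survival *= 1.0 - min(1.0, base_pd * adj)   # cap the marginal PD at 1
    return 1.0 - survival

# 3-year lifetime PD for a 2% annual PD under a worsening macroeconomic outlook
pd3 = lifetime_pd(0.02, [1.0, 1.2, 1.5])
```

With all adjustment coefficients equal to 1 this reduces to the textbook formula 1 − (1 − PD)^t, which is a useful sanity check.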

  2. MASTER: a model to improve and standardize clinical breakpoints for antimicrobial susceptibility testing using forecast probabilities.

    Science.gov (United States)

    Blöchliger, Nicolas; Keller, Peter M; Böttger, Erik C; Hombach, Michael

    2017-09-01

The procedure for setting clinical breakpoints (CBPs) for antimicrobial susceptibility has been poorly standardized with respect to population data, pharmacokinetic parameters and clinical outcome. Tools to standardize CBP setting could result in improved antibiogram forecast probabilities. We propose a model to estimate probabilities for methodological categorization errors and defined zones of methodological uncertainty (ZMUs), i.e. ranges of zone diameters that cannot reliably be classified. The impact of ZMUs on methodological error rates was used for CBP optimization. The model distinguishes theoretical true inhibition zone diameters from observed diameters, which suffer from methodological variation. True diameter distributions are described with a normal mixture model. The model was fitted to observed inhibition zone diameters of clinical Escherichia coli strains. Repeated measurements for a quality control strain were used to quantify methodological variation. For 9 of 13 antibiotics analysed, our model predicted error rates of <0.1%; error rates were >0.1% for ampicillin, cefoxitin, cefuroxime and amoxicillin/clavulanic acid. Increasing the susceptible CBP (cefoxitin) and introducing ZMUs (ampicillin, cefuroxime, amoxicillin/clavulanic acid) decreased error rates to <0.1%. ZMUs contained low numbers of isolates for ampicillin and cefuroxime (3% and 6%), whereas the ZMU for amoxicillin/clavulanic acid contained 41% of all isolates and was considered not practical. We demonstrate that CBPs can be improved and standardized by minimizing methodological categorization error rates. ZMUs may be introduced if an intermediate zone is not appropriate for pharmacokinetic/pharmacodynamic or drug dosing reasons. Optimized CBPs will provide a standardized antibiotic susceptibility testing interpretation at a defined level of probability. © The Author 2017. Published by Oxford University Press on behalf of the British Society for Antimicrobial Chemotherapy. All rights reserved.

  3. Structural Reliability Using Probability Density Estimation Methods Within NESSUS

    Science.gov (United States)

    Chamis, Chrisos C. (Technical Monitor); Godines, Cody Ric

    2003-01-01

A reliability analysis studies a mathematical model of a physical system taking into account uncertainties of design variables, and common results are estimations of a response density, which also implies estimations of its parameters. Some common density parameters include the mean value, the standard deviation, and specific percentile(s) of the response, which are measures of central tendency, variation, and probability regions, respectively. Reliability analyses are important since the results can lead to different designs by calculating the probability of observing safe responses in each of the proposed designs. All of this is done at the expense of added computational time as compared to a single deterministic analysis, which results in one value of the response out of the many that make up the density of the response. Sampling methods, such as Monte Carlo (MC) and Latin hypercube sampling (LHS), can be used to perform reliability analyses and can compute nonlinear response density parameters even if the response is dependent on many random variables. Hence, both methods are very robust; however, they are computationally expensive to use in the estimation of the response density parameters. Both methods are two of 13 stochastic methods that are contained within the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) program. NESSUS is a probabilistic finite element analysis (FEA) program that was developed through funding from NASA Glenn Research Center (GRC). It has the additional capability of being linked to other analysis programs; therefore, probabilistic fluid dynamics, fracture mechanics, and heat transfer are only a few of what is possible with this software. The LHS method is the newest addition to the stochastic methods within NESSUS. Part of this work was to enhance NESSUS with the LHS method. The new LHS module is complete, has been successfully integrated with NESSUS, and has been used to study four different test cases that have been
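Generic Latin hypercube sampling, the technique added to NESSUS, can be sketched in a few lines. This is the textbook scheme on the unit hypercube, not the NESSUS implementation:

```python
import random

def latin_hypercube(n_samples, n_dims, seed=0):
    """Latin hypercube sample on the unit hypercube: each dimension's range
    is split into n_samples equal strata, each stratum is hit exactly once,
    and strata are paired across dimensions by independent random
    permutations. This stratification is why LHS covers the input space
    more evenly than plain Monte Carlo for the same sample count."""
    rng = random.Random(seed)
    cols = []
    for _ in range(n_dims):
        perm = list(range(n_samples))
        rng.shuffle(perm)                    # random stratum order for this dimension
        # One uniform draw inside each stratum [p/n, (p+1)/n)
        cols.append([(p + rng.random()) / n_samples for p in perm])
    return [tuple(col[i] for col in cols) for i in range(n_samples)]

points = latin_hypercube(10, 2)   # 10 points in 2-D, one per stratum per axis
```

Mapping each coordinate through an inverse CDF then yields stratified samples of the physical random variables (loads, material properties, etc.).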

  4. Assessing the Effectiveness of the Cone of Probability as a Visual Means of Communicating Scientific Forecasts

    Science.gov (United States)

    Orlove, B. S.; Broad, K.; Meyer, R.

    2010-12-01

    We review the evolution, communication, and differing interpretations of the National Hurricane Center (NHC) "cone of uncertainty" hurricane forecast graphic, drawing on several related disciplines: cognitive psychology, visual anthropology, and risk communication theory. We examine the 2004 hurricane season, two specific hurricanes (Katrina in 2005 and Ike in 2008), and the 2010 hurricane season, still in progress. During the 2004 hurricane season, five named storms struck Florida. Our analysis of that season draws upon interviews with key government officials and media figures, archival research of Florida newspapers, analysis of public comments on the NHC cone of uncertainty graphic, and a multiagency study of 2004 hurricane behavior. At that time, the hurricane forecast graphic was subject to misinterpretation by many members of the public. We identify several characteristics of the graphic that contributed to this misinterpretation: residents overemphasized the specific track of the eye, failed to grasp the width of hurricanes, and generally did not recognize the timing of the passage of the hurricane. Little training was provided to emergency response managers in the interpretation of forecasts. In the following year, Katrina became a national scandal, further demonstrating the limitations of the cone as a means of eliciting appropriate responses to forecasts. In the second half of the first decade of the 21st century, three major changes occurred in hurricane forecast communication: the forecasts themselves improved in accuracy and lead time, the NHC made minor changes to the graphic and expanded the explanatory material that accompanies it, and some efforts were made to reach out to emergency response planners and municipal officials to enhance their understanding of the forecasts and graphics. There were some improvements in the responses to Ike, though a number of deaths were due to inadequate evacuations, and property damage probably

  5. A Bayesian joint probability modeling approach for seasonal forecasting of streamflows at multiple sites

    Science.gov (United States)

    Wang, Q. J.; Robertson, D. E.; Chiew, F. H. S.

    2009-05-01

    Seasonal forecasting of streamflows can be highly valuable for water resources management. In this paper, a Bayesian joint probability (BJP) modeling approach for seasonal forecasting of streamflows at multiple sites is presented. A Box-Cox transformed multivariate normal distribution is proposed to model the joint distribution of future streamflows and their predictors such as antecedent streamflows and El Niño-Southern Oscillation indices and other climate indicators. Bayesian inference of model parameters and uncertainties is implemented using Markov chain Monte Carlo sampling, leading to joint probabilistic forecasts of streamflows at multiple sites. The model provides a parametric structure for quantifying relationships between variables, including intersite correlations. The Box-Cox transformed multivariate normal distribution has considerable flexibility for modeling a wide range of predictors and predictands. The Bayesian inference formulated allows the use of data that contain nonconcurrent and missing records. The model flexibility and data-handling ability means that the BJP modeling approach is potentially of wide practical application. The paper also presents a number of statistical measures and graphical methods for verification of probabilistic forecasts of continuous variables. Results for streamflows at three river gauges in the Murrumbidgee River catchment in southeast Australia show that the BJP modeling approach has good forecast quality and that the fitted model is consistent with observed data.
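
    A minimal sketch of the core idea, not the authors' BJP implementation: Box-Cox transform a predictor-predictand pair, fit a bivariate normal in transformed space, and draw an ensemble forecast from the conditional distribution. The synthetic data, the fixed transform parameter, and the point-estimate fit (standing in for full Bayesian MCMC inference) are all simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def boxcox(y, lam):
    # Box-Cox transform (lam != 0): pulls skewed flows toward normality.
    return (y**lam - 1.0) / lam

def inv_boxcox(z, lam):
    # Back-transform, clipped so small negative tails stay valid.
    return np.maximum(lam * z + 1.0, 1e-9)**(1.0 / lam)

# Synthetic predictor (antecedent flow) and predictand (seasonal flow).
x = rng.gamma(3.0, 20.0, size=200)
y = 0.5 * x + rng.gamma(2.0, 10.0, size=200)

lam = 0.3                      # fixed for illustration; BJP infers it
tx, ty = boxcox(x, lam), boxcox(y, lam)
mu = np.array([tx.mean(), ty.mean()])
cov = np.cov(np.stack([tx, ty]))

def forecast(x_new, n_samples=1000):
    # Conditional normal in transformed space: ty | tx ~ N(mu_c, var_c).
    t = boxcox(x_new, lam)
    mu_c = mu[1] + cov[0, 1] / cov[0, 0] * (t - mu[0])
    var_c = cov[1, 1] - cov[0, 1]**2 / cov[0, 0]
    z = rng.normal(mu_c, np.sqrt(var_c), size=n_samples)
    return inv_boxcox(z, lam)  # probabilistic forecast in flow units
```

    The returned ensemble approximates the forecast density of the seasonal flow given the observed antecedent flow; intersite correlation, as in the paper, would extend the normal model to more dimensions.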

  6. Regional corrections and checking the reliability of geomagnetic forecasts

    International Nuclear Information System (INIS)

    Afanas'eva, V.I.; Shevnin, A.D.

    1978-01-01

    Regional corrections of the K-index estimates relative to the Moskva observatory are reviewed in order to improve short-range forecasts of geomagnetic activity and to extend their use over the surrounding water area. The forecasts of storms of all categories and of weak perturbations have been verified against the predominant days in the catalogue of magnetic storms. It is shown that the adopted forecast methods yield reasonably good results for weak perturbations as well as for weak and moderate magnetic storms. Strong and very strong storms are less predictable

  7. Reliability analysis of reactor systems by applying probability method; Analiza pouzdanosti reaktorskih sistema primenom metoda verovatnoce

    Energy Technology Data Exchange (ETDEWEB)

    Milivojevic, S [Institute of Nuclear Sciences Boris Kidric, Vinca, Beograd (Serbia and Montenegro)

    1974-12-15

    The probability method chosen for analysing reactor system reliability is considered realistic since it is based on verified experimental data; in effect, it is a statistical method. The probability method developed takes into account the probability distributions of permitted levels of relevant parameters and their particular influence on the reliability of the system as a whole. The proposed method is rather general and was applied to the thermal safety analysis of a reactor system. This analysis makes it possible to examine basic properties of the system under different operating conditions; expressed in the form of probabilities, the results show the reliability of the system as a whole as well as the reliability of each component.

  8. Interval forecasting of cyberattack intensity on informatization objects of industry using probability cluster model

    Science.gov (United States)

    Krakovsky, Y. M.; Luzgin, A. N.; Mikhailova, E. A.

    2018-05-01

    At present, cyber-security issues associated with the informatization objects of industry occupy one of the key niches in the state management system. Functional disruption of these systems via cyberattacks may cause an emergency involving loss of life, environmental disaster, major financial and economic damage, or disrupted activities of cities and settlements. When cyberattacks occur with high intensity, there is a need to develop protection against them based on machine learning methods. This paper examines interval forecasting and presents results for a pre-set intensity level. The interval forecasting is carried out with a probabilistic cluster model. The method forecasts which of two predetermined intervals a future value of the indicator will fall into, using probability estimates for this purpose. The dividing bound between these intervals is determined by a calculation method based on statistical characteristics of the indicator. The source data comprise hourly counts of cyberattacks recorded by a honeypot from March to September 2013.

  9. Dynamic Forecasting Conditional Probability of Bombing Attacks Based on Time-Series and Intervention Analysis.

    Science.gov (United States)

    Li, Shuying; Zhuang, Jun; Shen, Shifei

    2017-07-01

    In recent years, various types of terrorist attacks have occurred, causing catastrophes worldwide. According to the Global Terrorism Database (GTD), among all attack tactics, bombing attacks happened most frequently, followed by armed assaults. In this article, a model for analyzing and forecasting the conditional probability of bombing attacks (CPBAs) based on time-series methods is developed. In addition, intervention analysis is used to analyze the sudden increase in the time-series process. The results show that the CPBA increased dramatically at the end of 2011: it rose by 16.0% over a two-month period to reach its peak value, and still remained 9.0% above the predicted level after the temporary effect gradually decayed. By contrast, no significant fluctuation can be found in the conditional probability process of armed assault. It can be inferred that social unrest, such as America's troop withdrawal from Afghanistan and Iraq, could have led to the increase of the CPBA in Afghanistan, Iraq, and Pakistan. The integrated time-series and intervention model is used to forecast the monthly CPBA in 2014 and through 2064. The average relative error compared with the real data in 2014 is 3.5%. The model is also applied to the total number of attacks recorded by the GTD between 2004 and 2014. © 2016 Society for Risk Analysis.
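
    A toy version of the intervention idea: fit an autoregressive model with a step dummy that switches on at a hypothetical intervention month, then read off the estimated step effect. The series, coefficients, and dates below are synthetic, not GTD data.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic monthly probability series with a step intervention at t0:
# y_t = c + phi * y_{t-1} + omega * step_t + noise.
n, t0 = 120, 60
step = (np.arange(n) >= t0).astype(float)
y = np.empty(n)
y[0] = 0.3
for t in range(1, n):
    y[t] = 0.05 + 0.7 * y[t - 1] + 0.08 * step[t] + rng.normal(0.0, 0.02)

# Least-squares fit recovers the autoregression and the step effect.
X = np.column_stack([np.ones(n - 1), y[:-1], step[1:]])
c_hat, phi_hat, omega_hat = np.linalg.lstsq(X, y[1:], rcond=None)[0]
```

    A significantly positive `omega_hat` is the signature of a persistent level shift, analogous to the end-of-2011 jump the article reports.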

  10. Optimizing multiple reliable forward contracts for reservoir allocation using multitime scale streamflow forecasts

    Science.gov (United States)

    Lu, Mengqian; Lall, Upmanu; Robertson, Andrew W.; Cook, Edward

    2017-03-01

    Streamflow forecasts at multiple time scales provide a new opportunity for reservoir management to address competing objectives. Market instruments such as forward contracts with specified reliability are considered as a tool that may help address the perceived risk associated with the use of such forecasts in lieu of traditional operation and allocation strategies. A water allocation process that enables multiple contracts for water supply and hydropower production with different durations, while maintaining a prescribed level of flood risk reduction, is presented. The allocation process is supported by an optimization model that considers multitime scale ensemble forecasts of monthly streamflow and flood volume over the upcoming season and year, the desired reliability and pricing of proposed contracts for hydropower and water supply. It solves for the size of contracts at each reliability level that can be allocated for each future period, while meeting target end of period reservoir storage with a prescribed reliability. The contracts may be insurable, given that their reliability is verified through retrospective modeling. The process can allow reservoir operators to overcome their concerns as to the appropriate skill of probabilistic forecasts, while providing water users with short-term and long-term guarantees as to how much water or energy they may be allocated. An application of the optimization model to the Bhakra Dam, India, provides an illustration of the process. The issues of forecast skill and contract performance are examined. A field engagement of the idea is useful to develop a real-world perspective and needs a suitable institutional environment.

  11. Problems in diagnosing and forecasting power equipment reliability

    Energy Technology Data Exchange (ETDEWEB)

    Popkov, V I; Demirchyan, K S

    1979-11-01

    This general survey deals with approaches to the resolution of such problems as the gathering, analysis and systematization of data on component defects in power equipment, and setting up feedback with the manufacturing plants and planning organizations to improve equipment reliability. Such efforts on the part of designers, manufacturers and operating and repair organizations in analyzing faults in 300 MW turbogenerators during 1974-1977 reduced the specific fault rate by 20 to 25% and the downtime per failure by 35 to 40%. Since power equipment should operate for several hundreds of thousands of hours (20 to 30 years) and the majority of power components have guaranteed service lives of no more than 10^5 hours, an extremely difficult problem is the determination of the reliability of equipment past the 10^5-hour point. The present trend in the USSR Unified Power System towards an increasing number of shutdowns and startups, which in the case of turbogenerators of up to 1200 MW can reach 7500 to 10,000 cycles, is noted. Other areas briefly treated are: MHD generator reliability and economy; nuclear power plant reliability and safety; the reliability of high-power high-voltage thyristor converters; the difficulties involved in scale modeling of power system reliability and the high cost of the requisite full-scale studies; and the poor understanding of long-term corrosion and erosion processes. The review concludes with arguments in favor of greater computerization of all aspects of power system management.

  12. Estimating the benefits of single value and probability forecasting for flood warning

    NARCIS (Netherlands)

    Verkade, J.S.; Werner, M.G.F.

    2011-01-01

    Flood risk can be reduced by means of flood forecasting, warning and response systems (FFWRS). These systems include a forecasting sub-system which is imperfect, meaning that inherent uncertainties in hydrological forecasts may result in false alarms and missed events. This forecasting uncertainty

  13. Estimating the benefits of single value and probability forecasting for flood warning

    NARCIS (Netherlands)

    Verkade, J.S.; Werner, M.G.F.

    2011-01-01

    Flood risk can be reduced by means of flood forecasting, warning and response systems (FFWRS). These systems include a forecasting sub-system which is imperfect, meaning that inherent uncertainties in hydrological forecasts may result in false alarms and missed floods, or surprises. This forecasting

  14. Forecasting the Stock Market with Linguistic Rules Generated from the Minimize Entropy Principle and the Cumulative Probability Distribution Approaches

    Directory of Open Access Journals (Sweden)

    Chung-Ho Su

    2010-12-01

    Full Text Available To forecast a complex and non-linear system, such as a stock market, advanced artificial intelligence algorithms, like neural networks (NNs) and genetic algorithms (GAs), have been proposed as new approaches. However, for the average stock investor, two major disadvantages are argued against these advanced algorithms: (1) the rules generated by NNs and GAs are difficult to apply in investment decisions; and (2) the time complexity of the algorithms to produce forecasting outcomes is very high. Therefore, to provide understandable rules for investors and to reduce the time complexity of forecasting algorithms, this paper proposes a novel model for the forecasting process, which combines two granulating methods (the minimize entropy principle approach and the cumulative probability distribution approach) and a rough set algorithm. The model verification demonstrates that the proposed model surpasses the three listed conventional fuzzy time-series models and a multiple regression model (MLR) in forecast accuracy.

  15. Probability of extreme interference levels computed from reliability approaches: application to transmission lines with uncertain parameters

    International Nuclear Information System (INIS)

    Larbi, M.; Besnier, P.; Pecqueux, B.

    2014-01-01

    This paper deals with the risk analysis of an EMC failure using a statistical approach based on reliability methods from probabilistic engineering mechanics. A computation of the probability of failure (i.e. the probability of exceeding a threshold) of a current induced by crosstalk is established, taking into account uncertainties on the input parameters that influence interference levels in the context of transmission lines. The study allowed us to evaluate the probability of failure of the induced current using reliability methods with a relatively low computational cost compared to Monte Carlo simulation. (authors)

  16. United States streamflow probabilities based on forecasted La Nina, winter-spring 2000

    Science.gov (United States)

    Dettinger, M.D.; Cayan, D.R.; Redmond, K.T.

    1999-01-01

    Although for the last 5 months the Tahiti-Darwin Southern Oscillation Index (SOI) has hovered close to normal, the "equatorial" SOI has remained in the La Niña category and predictions are calling for La Niña conditions this winter. In view of these predictions of continuing La Niña, and as a direct extension of previous studies of the relations between El Niño-Southern Oscillation (ENSO) conditions and streamflow in the United States (e.g., Redmond and Koch, 1991; Cayan and Webb, 1992; Redmond and Cayan, 1994; Dettinger et al., 1998; Garen, 1998; Cayan et al., 1999; Dettinger et al., in press), the probabilities that United States streamflows from December 1999 through July 2000 will be in the upper and lower thirds (terciles) of the historical records are estimated here. The processes that link ENSO to North American streamflow are discussed in detail in these diagnostic studies. Our justification for generating this forecast is threefold: (1) Cayan et al. (1999) recently have shown that ENSO influences on streamflow variations and extremes are proportionately larger than the corresponding precipitation teleconnections. (2) Redmond and Cayan (1994) and Dettinger et al. (in press) also have shown that the low-frequency evolution of ENSO conditions supports long-lead correlations between ENSO and streamflow in many rivers of the conterminous United States. (3) In many rivers, significant (weeks-to-months) delays between precipitation and the release to streams of snowmelt or ground-water discharge can support even longer-term forecasts of streamflow than is possible for precipitation. The relatively slow, orderly evolution of El Niño-Southern Oscillation episodes, the accentuated dependence of streamflow upon ENSO, and the long lags between precipitation and flow encourage us to provide the following analysis as a simple prediction of this year's river flows.

  17. Objective Lightning Probability Forecasting for Kennedy Space Center and Cape Canaveral Air Force Station, Phase III

    Science.gov (United States)

    Crawford, Winifred C.

    2010-01-01

    The AMU created new logistic regression equations in an effort to increase the skill of the Objective Lightning Forecast Tool developed in Phase II (Lambert 2007). One equation was created for each of five sub-seasons based on the daily lightning climatology instead of by month as was done in Phase II. The assumption was that these equations would capture the physical attributes that contribute to thunderstorm formation more so than monthly equations. However, the SS values in Section 5.3.2 showed that the Phase III equations had worse skill than the Phase II equations and, therefore, will not be transitioned into operations. The current Objective Lightning Forecast Tool developed in Phase II will continue to be used operationally in MIDDS. Three warm seasons were added to the Phase II dataset to increase the POR from 17 to 20 years (1989-2008), and data for October were included since the daily climatology showed lightning occurrence extending into that month. None of the three methods tested to determine the start of the subseason in each individual year were able to discern the start dates with consistent accuracy. Therefore, the start dates were determined by the daily climatology shown in Figure 10 and were the same in every year. The procedures used to create the predictors and develop the equations were identical to those in Phase II. The equations were made up of one to three predictors. TI and the flow regime probabilities were the top predictors followed by 1-day persistence, then VT and Ll. Each equation outperformed four other forecast methods by 7-57% using the verification dataset, but the new equations were outperformed by the Phase II equations in every sub-season. The reason for the degradation may be due to the fact that the same sub-season start dates were used in every year. It is likely there was overlap of sub-season days at the beginning and end of each defined sub-season in each individual year, which could very well affect equation

  18. Estimating reliability of degraded system based on the probability density evolution with multi-parameter

    Directory of Open Access Journals (Sweden)

    Jiang Ge

    2017-01-01

    Full Text Available System degradation is usually caused by the degradation of multiple parameters. The reliability assessment obtained with the universal generating function has low accuracy when compared with Monte Carlo simulation, and the probability density function of the system output performance cannot be obtained. A reliability assessment method based on probability density evolution with multiple parameters is therefore presented for complexly degraded systems. Firstly, the system output function is founded according to the transitive relation between component parameters and the system output performance. Then, the probability density evolution equation is established based on the probability conservation principle and the system output function. Furthermore, the probability distribution characteristics of the system output performance are obtained by solving the differential equation. Finally, the reliability of the degraded system is estimated. This method does not need to discretize the performance parameters and can establish a continuous probability density function of the system output performance with high calculation efficiency and low cost. A numerical example shows that this method is applicable to evaluating the reliability of multi-parameter degraded systems.

  19. Using the Reliability Theory for Assessing the Decision Confidence Probability for Comparative Life Cycle Assessments.

    Science.gov (United States)

    Wei, Wei; Larrey-Lassalle, Pyrène; Faure, Thierry; Dumoulin, Nicolas; Roux, Philippe; Mathias, Jean-Denis

    2016-03-01

    A comparative decision-making process is widely used to identify which option (system, product, service, etc.) has the smaller environmental footprint and to provide recommendations that help stakeholders take future decisions. However, uncertainty complicates the comparison and the decision making. Probability-based decision support in LCA is a way to help stakeholders in their decision-making process. It calculates the decision confidence probability, which expresses the probability that one option has a smaller environmental impact than another. Here we apply reliability theory to approximate the decision confidence probability, comparing the traditional Monte Carlo method with the first-order reliability method (FORM). The Monte Carlo method needs high computational time to calculate the decision confidence probability. The FORM method enables us to approximate the decision confidence probability with fewer simulations than the Monte Carlo method by approximating the response surface. Moreover, the FORM method calculates the associated importance factors that correspond to a sensitivity analysis in relation to the probability. The importance factors allow stakeholders to determine which factors influence their decision. Our results clearly show that the reliability method provides additional useful information to stakeholders as well as reducing the computational time.
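
    For a linear limit state with normal inputs, FORM reduces to an exact reliability index, which makes the comparison with Monte Carlo easy to sketch. The impact distributions below are hypothetical, not taken from the paper.

```python
import numpy as np
from statistics import NormalDist

# Hypothetical impact scores of two options (e.g. kg CO2-eq per unit),
# each normally distributed after an uncertainty analysis.
mu_a, sd_a = 95.0, 10.0
mu_b, sd_b = 110.0, 12.0

# FORM on the linear limit state g = B - A: for normal inputs the
# reliability index beta is exact, and P(A < B) = Phi(beta).
beta = (mu_b - mu_a) / np.hypot(sd_a, sd_b)
p_form = NormalDist().cdf(float(beta))

# Monte Carlo estimate of the same decision confidence probability.
rng = np.random.default_rng(1)
a = rng.normal(mu_a, sd_a, 100_000)
b = rng.normal(mu_b, sd_b, 100_000)
p_mc = float(np.mean(a < b))
```

    In this linear-normal case FORM needs no sampling at all; for nonlinear responses it instead searches for the most probable failure point and linearizes there, which is where the savings over Monte Carlo come from.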

  20. Estimating the benefits of single value and probability forecasting for flood warning

    Directory of Open Access Journals (Sweden)

    J. S. Verkade

    2011-12-01

    Full Text Available Flood risk can be reduced by means of flood forecasting, warning and response systems (FFWRS. These systems include a forecasting sub-system which is imperfect, meaning that inherent uncertainties in hydrological forecasts may result in false alarms and missed events. This forecasting uncertainty decreases the potential reduction of flood risk, but is seldom accounted for in estimates of the benefits of FFWRSs. In the present paper, a method to estimate the benefits of (imperfect FFWRSs in reducing flood risk is presented. The method is based on a hydro-economic model of expected annual damage (EAD due to flooding, combined with the concept of Relative Economic Value (REV. The estimated benefits include not only the reduction of flood losses due to a warning response, but also consider the costs of the warning response itself, as well as the costs associated with forecasting uncertainty. The method allows for estimation of the benefits of FFWRSs that use either deterministic or probabilistic forecasts. Through application to a case study, it is shown that FFWRSs using a probabilistic forecast have the potential to realise higher benefits at all lead-times. However, it is also shown that provision of warning at increasing lead-time does not necessarily lead to an increasing reduction of flood risk, but rather that an optimal lead-time at which warnings are provided can be established as a function of forecast uncertainty and the cost-loss ratio of the user receiving and responding to the warning.
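
    The cost-loss reasoning behind Relative Economic Value can be sketched in a few lines. The function below follows the standard REV formulation (expected expense under the forecast, under climatology, and under perfect information, normalized by the loss); the variable names are ours, not the paper's.

```python
def relative_economic_value(hit_rate, false_alarm_rate, base_rate, cost_loss):
    # Expected expense per event, normalized by the loss L (r = C/L):
    # protecting on a warning costs C, a missed flood costs L.
    r, s = cost_loss, base_rate
    e_forecast = (false_alarm_rate * (1.0 - s) * r   # needless protection
                  + hit_rate * s * r                 # justified protection
                  + (1.0 - hit_rate) * s)            # missed floods
    e_climate = min(r, s)    # best of "always protect" / "never protect"
    e_perfect = s * r        # protect exactly when a flood occurs
    return (e_climate - e_forecast) / (e_climate - e_perfect)
```

    A perfect warning system (hit rate 1, false alarm rate 0) scores REV = 1, while a system that performs worse than the climatological strategy scores below 0; plotting REV against the cost-loss ratio reproduces the kind of user-dependent optimum the abstract describes.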

  1. Predicting Flow Breakdown Probability and Duration in Stochastic Network Models: Impact on Travel Time Reliability

    Energy Technology Data Exchange (ETDEWEB)

    Dong, Jing [ORNL; Mahmassani, Hani S. [Northwestern University, Evanston

    2011-01-01

    This paper proposes a methodology to produce random flow breakdown endogenously in a mesoscopic operational model by capturing breakdown probability and duration. It builds on previous research findings that the probability of flow breakdown can be represented as a function of flow rate and that the duration can be characterized by a hazard model. By generating random flow breakdowns at various levels and capturing the traffic characteristics at the onset of breakdown, the stochastic network simulation model provides a tool for evaluating travel time variability. The proposed model can be used for (1) providing reliability-related traveler information; (2) designing ITS (intelligent transportation systems) strategies to improve reliability; and (3) evaluating reliability-related performance measures of the system.
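
    A sketch of the two ingredients named in the abstract, a flow-dependent breakdown probability and a hazard (here Weibull) duration model, wired into a trivial simulation loop. The functional forms and parameter values are assumptions for illustration, not those calibrated in the paper.

```python
import numpy as np

rng = np.random.default_rng(7)

def breakdown_probability(flow_rate, a=8.0, critical_flow=1800.0):
    # Assumed logistic form: breakdown in the next interval becomes
    # likelier as flow (veh/h/lane) passes a critical level.
    return 1.0 / (1.0 + np.exp(-a * (flow_rate - critical_flow) / critical_flow))

def sample_breakdown_duration(shape=1.5, scale=12.0):
    # Weibull hazard model for duration (minutes); shape > 1 means the
    # chance of recovery rises with time already spent broken down.
    return scale * rng.weibull(shape)

def degraded_minutes(flow_rates):
    # Sweep over interval flow rates, trigger breakdowns at random,
    # and accumulate minutes of degraded operation.
    total = 0.0
    for q in flow_rates:
        if rng.uniform() < breakdown_probability(q):
            total += sample_breakdown_duration()
    return total
```

    Repeating the sweep many times yields a distribution of degraded time, and hence of travel time, which is the variability the stochastic network model is built to evaluate.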

  2. Probability

    CERN Document Server

    Shiryaev, A N

    1996-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author has included new material on the probability of large deviations and on the central limit theorem for sums of dependent random variables.

  3. Reliability analysis of idealized tunnel support system using probability-based methods with case studies

    Science.gov (United States)

    Gharouni-Nik, Morteza; Naeimi, Meysam; Ahadi, Sodayf; Alimoradi, Zahra

    2014-06-01

    In order to determine the overall safety of a tunnel support lining, a reliability-based approach is presented in this paper. Support elements in jointed rock tunnels are provided to control the ground movement caused by stress redistribution during the tunnel drive. The main support elements that contribute to the stability of the tunnel structure are identified in order to examine various aspects of reliability and sustainability in the system. The selection of efficient support methods for rock tunneling is a key factor in reducing the number of problems during construction and keeping the project cost and time within the limited budget and planned schedule. This paper introduces an approach by which decision-makers will be able to find the overall reliability of a tunnel support system before selecting the final scheme of the lining system. Engineering reliability, a branch of statistics and probability, is applied to the field, and much effort has been made to use it in tunneling while investigating the reliability of the lining support system for the tunnel structure. Reliability analysis for evaluating tunnel support performance is therefore the main idea of this research. Decomposition approaches are used to produce the system block diagram and to determine the failure probability of the whole system. The effectiveness of the proposed reliability model of the tunnel lining, together with the recommended approaches, is examined using several case studies, and the final value of reliability is obtained for different design scenarios. Assuming a linear correlation between safety factors and reliability parameters, the values of isolated reliabilities are determined for the different structural components of the tunnel support system. In order to determine individual safety factors, finite element modeling is employed for the different structural subsystems and the results of numerical analyses are obtained in

  4. Kepler Planet Reliability Metrics: Astrophysical Positional Probabilities for Data Release 25

    Science.gov (United States)

    Bryson, Stephen T.; Morton, Timothy D.

    2017-01-01

    This document is very similar to KSCI-19092-003, Planet Reliability Metrics: Astrophysical Positional Probabilities, which describes the previous release of the astrophysical positional probabilities for Data Release 24. The important changes for Data Release 25 are: (1) the computation of the astrophysical positional probabilities uses the Data Release 25 processed pixel data for all Kepler Objects of Interest; (2) computed probabilities now have associated uncertainties, whose computation is described in §4.1.3; (3) the scene modeling described in §4.1.2 uses background stars detected via ground-based high-resolution imaging, described in §5.1, that are not in the Kepler Input Catalog or UKIRT catalog. These newly detected stars are presented in Appendix B. Otherwise the text describing the algorithms and examples is largely unchanged from KSCI-19092-003.

  5. Reliability assessment and probability based design of reinforced concrete containments and shear walls

    International Nuclear Information System (INIS)

    Hwang, H.; Reich, M.; Ellingwood, B.; Shinozuka, M.

    1986-03-01

    This report summarizes work completed under the program entitled "Probability-Based Load Combinations for Design of Category I Structures." Under this program, the probabilistic models for various static and dynamic loads were formulated. The randomness and uncertainties in material strengths and structural resistance were established. Several limit states of concrete containments and shear walls were identified and analytically formulated. Furthermore, the reliability analysis methods for estimating limit state probabilities were established. These reliability analysis methods can be used to evaluate the safety levels of nuclear structures under various combinations of static and dynamic loads. They can also be used to generate analytically the fragility data for PRA studies. In addition to the development of reliability analysis methods, probability-based design criteria for concrete containments and shear wall structures have also been developed. The proposed design criteria are in the load and resistance factor design (LRFD) format. The load and resistance factors are determined for several limit states and target limit state probabilities. Thus, the proposed design criteria are risk-consistent and have a well-established rationale. 73 refs., 18 figs., 16 tabs

  6. Plant calendar pattern based on rainfall forecast and the probability of its success in Deli Serdang regency of Indonesia

    Science.gov (United States)

    Darnius, O.; Sitorus, S.

    2018-03-01

    The objective of this study was to determine cropping calendar patterns for three types of crops, namely palawija, rice, and banana, based on rainfall in Deli Serdang Regency. In the first stage, we forecasted rainfall using time series analysis and obtained an appropriate ARIMA(1,0,0)(1,1,1)12 model. Based on the forecast results, we designed a cropping calendar pattern for the three types of plant. Furthermore, the probability of success for crops following the calendar pattern was calculated using a Markov process, discretizing the continuous rainfall data into three categories, namely Below Normal (BN), Normal (N), and Above Normal (AN), to form the probability transition matrix. Finally, the combination of the rainfall forecasting model and the Markov process was used to determine the cropping calendar patterns and the probability of success for the three crops. This research used rainfall data for Deli Serdang Regency taken from the office of BMKG (Meteorology, Climatology and Geophysics Agency), Sampali Medan, Indonesia.
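
    The discretization-plus-transition-matrix step can be sketched directly: classify monthly rainfall into BN/N/AN terciles and count category-to-category transitions. The synthetic gamma-distributed rainfall below stands in for the BMKG record.

```python
import numpy as np

def tercile_categories(rain):
    # Below Normal (0), Normal (1), Above Normal (2), with the tercile
    # bounds taken from the record itself.
    lo, hi = np.quantile(rain, [1.0 / 3.0, 2.0 / 3.0])
    return np.digitize(rain, [lo, hi])

def transition_matrix(states, n_states=3):
    # Count month-to-month category transitions; normalize each row to
    # turn counts into transition probabilities.
    counts = np.zeros((n_states, n_states))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1.0
    return counts / counts.sum(axis=1, keepdims=True)

rng = np.random.default_rng(3)
rain = rng.gamma(4.0, 50.0, size=120)   # stand-in: 10 years of monthly rainfall
P = transition_matrix(tercile_categories(rain))
# P[0, 2] is then the estimated chance that an Above Normal month
# follows a Below Normal one.
```

    Chaining rows of P along the forecast months gives the probability that a cropping calendar built on the ARIMA forecast actually encounters the rainfall categories it assumes.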

  7. Reliability Assessment of Wind Farm Electrical System Based on a Probability Transfer Technique

    Directory of Open Access Journals (Sweden)

    Hejun Yang

    2018-03-01

    Full Text Available The electrical system of a wind farm has a significant influence on wind farm reliability and electrical energy yield. The disconnect switches installed in an electrical system can not only improve operating flexibility, but also enhance reliability for a wind farm. Therefore, this paper develops a probabilistic transfer technique for integrating the electrical topology structure, the isolation operation of disconnect switches, and the stochastic failure of electrical equipment into the reliability assessment of the wind farm electrical system. Firstly, as the traditional two-state reliability model of electrical equipment cannot capture the isolation operation, the paper develops a three-state reliability model to replace the two-state model and incorporate the isolation operation; in addition, a proportion apportion technique is presented to evaluate the state probabilities. Secondly, the paper develops a probabilistic transfer technique based on the idea of transferring the unreliability of the electrical system into energy transmission interruptions of the wind turbine generators (WTGs). Finally, some novel indices for describing the reliability of the wind farm electrical system are designed, and the variance coefficient of the designed indices is used as a convergence criterion to terminate the assessment process. The proposed technique is applied to the reliability assessment of a wind farm with different topologies. The simulation results show that the proposed techniques are effective in practical applications.

  8. On new cautious structural reliability models in the framework of imprecise probabilities

    DEFF Research Database (Denmark)

    Utkin, Lev; Kozine, Igor

    2010-01-01

    measures when the number of events of interest or observations is very small. The main feature of the models is that prior ignorance is not modelled by a fixed single prior distribution, but by a class of priors which is defined by upper and lower probabilities that can converge as statistical data......New imprecise structural reliability models are described in this paper. They are developed based on the imprecise Bayesian inference and are imprecise Dirichlet, imprecise negative binomial, gamma-exponential and normal models. The models are applied to computing cautious structural reliability...

  9. How will climate novelty influence ecological forecasts? Using the Quaternary to assess future reliability.

    Science.gov (United States)

    Fitzpatrick, Matthew C; Blois, Jessica L; Williams, John W; Nieto-Lugilde, Diego; Maguire, Kaitlin C; Lorenz, David J

    2018-03-23

    Future climates are projected to be highly novel relative to recent climates. Climate novelty challenges models that correlate ecological patterns to climate variables and then use these relationships to forecast ecological responses to future climate change. Here, we quantify the magnitude and ecological significance of future climate novelty by comparing it to novel climates over the past 21,000 years in North America. We then use relationships between model performance and climate novelty derived from the fossil pollen record from eastern North America to estimate the expected decrease in predictive skill of ecological forecasting models as future climate novelty increases. We show that, in the high emissions scenario (RCP 8.5) and by late 21st century, future climate novelty is similar to or higher than peak levels of climate novelty over the last 21,000 years. The accuracy of ecological forecasting models is projected to decline steadily over the coming decades in response to increasing climate novelty, although models that incorporate co-occurrences among species may retain somewhat higher predictive skill. In addition to quantifying future climate novelty in the context of late Quaternary climate change, this work underscores the challenges of making reliable forecasts to an increasingly novel future, while highlighting the need to assess potential avenues for improvement, such as increased reliance on geological analogs for future novel climates and improving existing models by pooling data through time and incorporating assemblage-level information. © 2018 John Wiley & Sons Ltd.

  10. On New Cautious Structural Reliability Models in the Framework of imprecise Probabilities

    DEFF Research Database (Denmark)

    Utkin, Lev V.; Kozine, Igor

    2010-01-01

    models and gen-eralizing conventional ones to imprecise probabili-ties. The theoretical setup employed for this purpose is imprecise statistical reasoning (Walley 1991), whose general framework is provided by upper and lower previsions (expectations). The appeal of this theory is its ability to capture......Uncertainty of parameters in engineering design has been modeled in different frameworks such as inter-val analysis, fuzzy set and possibility theories, ran-dom set theory and imprecise probability theory. The authors of this paper for many years have been de-veloping new imprecise reliability...... both aleatory (stochas-tic) and epistemic uncertainty and the flexibility with which information can be represented. The previous research of the authors related to generalizing structural reliability models to impre-cise statistical measures is summarized in Utkin & Kozine (2002) and Utkin (2004...

  11. A new method for evaluating the availability, reliability, and maintainability whatever may be the probability law

    International Nuclear Information System (INIS)

    Doyon, L.R.; CEA Centre d'Etudes Nucleaires de Saclay, 91 - Gif-sur-Yvette

    1975-01-01

    A simple method is presented for solving by computer any system model (availability, reliability, and maintainability) in which the intervals between failures and the repair durations are distributed according to any probability law, and for any maintenance policy. A matrix equation is obtained using Markov diagrams. An example is given with the solution by the APAFS program (Algorithme Pour l'Analyse de la Fiabilite des Systemes) [fr
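
    As an illustration of the matrix-equation approach, the sketch below solves the steady-state equations of the simplest two-state Markov diagram, assuming exponential failure and repair laws (the method in the record covers arbitrary laws; the rates here are invented):

```python
import numpy as np

# Two-state Markov model: state 0 = up, state 1 = down (under repair).
# lam and mu are assumed constant failure and repair rates (exponential
# law); the same matrix approach extends to any finite Markov diagram.
lam, mu = 1e-3, 1e-1   # failures/h, repairs/h (illustrative values)

# Generator matrix Q of the Markov diagram.
Q = np.array([[-lam, lam],
              [mu,  -mu]])

# Steady-state probabilities solve pi @ Q = 0 with sum(pi) = 1.
A = np.vstack([Q.T, np.ones(2)])
b = np.array([0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

availability = pi[0]          # long-run fraction of time in the up state
analytic = mu / (lam + mu)    # closed form for the two-state case
```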

  12. Forecasting systems reliability based on support vector regression with genetic algorithms

    International Nuclear Information System (INIS)

    Chen, K.-Y.

    2007-01-01

    This study applies a novel neural-network technique, support vector regression (SVR), to forecast reliability in engine systems. The aim of this study is to examine the feasibility of SVR in systems reliability prediction by comparing it with existing neural-network approaches and the autoregressive integrated moving average (ARIMA) model. To build an effective SVR model, SVR's parameters must be set carefully. This study proposes a novel approach, known as GA-SVR, which searches for SVR's optimal parameters using real-valued genetic algorithms, and then adopts the optimal parameters to construct the SVR models. Real reliability data for 40 sets of turbochargers were employed as the data set. The experimental results demonstrate that SVR outperforms the existing neural-network approaches and the traditional ARIMA models in terms of normalized root mean square error and mean absolute percentage error.
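
    The genetic-algorithm search over model parameters can be sketched as follows. To keep the example dependency-free, a small RBF kernel ridge regressor stands in for SVR; the series, parameter bounds, and GA settings are all illustrative assumptions, not the study's.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "reliability" series: a smooth trend plus noise, standing in for
# the turbocharger data (which is not reproduced here).
t = np.linspace(0, 1, 40)
y = np.exp(-2 * t) + rng.normal(0, 0.02, t.size)

def fitness_error(log_gamma, log_alpha):
    """Validation error of an RBF kernel ridge fit; a dependency-free
    stand-in for SVR so the GA machinery can be shown end to end."""
    gamma, alpha = 10.0 ** log_gamma, 10.0 ** log_alpha
    train, val = t[::2], t[1::2]
    y_tr, y_val = y[::2], y[1::2]
    K = np.exp(-gamma * (train[:, None] - train[None, :]) ** 2)
    coef = np.linalg.solve(K + alpha * np.eye(len(train)), y_tr)
    K_val = np.exp(-gamma * (val[:, None] - train[None, :]) ** 2)
    return np.mean((K_val @ coef - y_val) ** 2)

# Real-valued GA: tournament selection, blend crossover, Gaussian mutation.
pop = rng.uniform([-2, -6], [2, 0], size=(20, 2))   # (log_gamma, log_alpha)
for generation in range(15):
    fit = np.array([fitness_error(*ind) for ind in pop])
    children = []
    for _ in range(len(pop)):
        i, j = rng.integers(0, len(pop), 2)
        p1 = pop[i] if fit[i] < fit[j] else pop[j]   # tournament pick 1
        i, j = rng.integers(0, len(pop), 2)
        p2 = pop[i] if fit[i] < fit[j] else pop[j]   # tournament pick 2
        w = rng.uniform(0, 1)
        child = w * p1 + (1 - w) * p2                # blend crossover
        child += rng.normal(0, 0.1, 2)               # Gaussian mutation
        children.append(child)
    pop = np.array(children)

fit = np.array([fitness_error(*ind) for ind in pop])
best = pop[np.argmin(fit)]        # best (log_gamma, log_alpha) found
best_error = fit.min()
```

In GA-SVR the fitness would instead be the cross-validated SVR forecasting error, with the chromosome encoding SVR's C, epsilon, and kernel width.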

  13. Reliability assessment for thickness measurements of pipe wall using probability of detection

    International Nuclear Information System (INIS)

    Nakamoto, Hiroyuki; Kojima, Fumio; Kato, Sho

    2013-01-01

    This paper proposes a reliability assessment method for thickness measurements of pipe wall using probability of detection (POD). Thicknesses of pipes are measured by qualified inspectors with ultrasonic thickness gauges. The inspection results are affected by human factors of the inspectors and include some errors, because the inspectors have different experiences and frequency of inspections. In order to ensure reliability for inspection results, first, POD evaluates experimental results of pipe-wall thickness inspection. We verify that the results have differences depending on inspectors including qualified inspectors. Second, two human factors that affect POD are indicated. Finally, it is confirmed that POD can identify the human factors and ensure reliability for pipe-wall thickness inspections. (author)
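
    A common way to quantify POD from hit/miss inspection records is a logistic fit of detection probability against flaw size. The sketch below uses invented wall-thinning data and plain gradient ascent; it is not the paper's exact procedure, only the standard POD(a) idea.

```python
import numpy as np

# Hypothetical hit/miss records: flaw depth (mm) and whether the
# inspector detected it (1) or missed it (0). Values are invented.
depth = np.array([0.2, 0.3, 0.4, 0.5, 0.6, 0.8, 1.0, 1.2, 1.5, 2.0])
hit   = np.array([0,   0,   0,   1,   0,   1,   1,   1,   1,   1  ])

# Fit POD(a) = 1 / (1 + exp(-(b0 + b1*a))) by logistic regression,
# using plain gradient ascent on the log-likelihood (no libraries).
b0, b1 = 0.0, 0.0
for _ in range(20000):
    p = 1.0 / (1.0 + np.exp(-(b0 + b1 * depth)))
    b0 += 0.01 * np.sum(hit - p)
    b1 += 0.01 * np.sum((hit - p) * depth)

def pod(a):
    """Estimated probability of detecting a flaw of depth a (mm)."""
    return 1.0 / (1.0 + np.exp(-(b0 + b1 * a)))
```

Fitting separate curves per inspector, as the study's comparison implies, would expose the human-factor differences as shifts in the fitted coefficients.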

  14. Response and reliability analysis of nonlinear uncertain dynamical structures by the probability density evolution method

    DEFF Research Database (Denmark)

    Nielsen, Søren R. K.; Peng, Yongbo; Sichani, Mahdi Teimouri

    2016-01-01

    The paper deals with the response and reliability analysis of hysteretic or geometric nonlinear uncertain dynamical systems of arbitrary dimensionality driven by stochastic processes. The approach is based on the probability density evolution method proposed by Li and Chen (Stochastic dynamics...... of structures, 1st edn. Wiley, London, 2009; Probab Eng Mech 20(1):33–44, 2005), which circumvents the dimensional curse of traditional methods for the determination of non-stationary probability densities based on Markov process assumptions and the numerical solution of the related Fokker–Planck and Kolmogorov......–Feller equations. The main obstacle of the method is that a multi-dimensional convolution integral needs to be carried out over the sample space of a set of basic random variables, for which reason the number of these need to be relatively low. In order to handle this problem an approach is suggested, which...

  15. Test-retest reliability of the Middlesex Assessment of Mental State (MEAMS): a preliminary investigation in people with probable dementia.

    Science.gov (United States)

    Powell, T; Brooker, D J; Papadopolous, A

    1993-05-01

    Relative and absolute test-retest reliability of the MEAMS was examined in 12 subjects with probable dementia and 12 matched controls. Relative reliability was good. Measures of absolute reliability showed scores changing by up to 3 points over an interval of a week. A version effect was found to be in evidence.

  16. M≥7 Earthquake rupture forecast and time-dependent probability for the Sea of Marmara region, Turkey

    Science.gov (United States)

    Murru, Maura; Akinci, Aybige; Falcone, Guiseppe; Pucci, Stefano; Console, Rodolfo; Parsons, Thomas E.

    2016-01-01

    We forecast time-independent and time-dependent earthquake ruptures in the Marmara region of Turkey for the next 30 years using a new fault-segmentation model. We also augment time-dependent Brownian Passage Time (BPT) probability with static Coulomb stress changes (ΔCFF) from interacting faults. We calculate Mw > 6.5 probability from 26 individual fault sources in the Marmara region. We also consider a multisegment rupture model that allows higher-magnitude ruptures over some segments of the Northern branch of the North Anatolian Fault Zone (NNAF) beneath the Marmara Sea. A total of 10 different Mw=7.0 to Mw=8.0 multisegment ruptures are combined with the other regional faults at rates that balance the overall moment accumulation. We use Gaussian random distributions to treat parameter uncertainties (e.g., aperiodicity, maximum expected magnitude, slip rate, and consequently mean recurrence time) of the statistical distributions associated with each fault source. We then estimate uncertainties of the 30-year probability values for the next characteristic event obtained from three different models (Poisson, BPT, and BPT+ΔCFF) using a Monte Carlo procedure. The Gerede fault segment located at the eastern end of the Marmara region shows the highest 30-year probability, with a Poisson value of 29% and a time-dependent interaction probability of 48%. We find an aggregated 30-year Poisson probability of M>7.3 earthquakes at Istanbul of 35%, which increases to 47% if time dependence and stress transfer are considered. We calculate a twofold probability gain (the ratio of the time-dependent to the time-independent probability) on the southern strands of the North Anatolian Fault Zone.
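
    The BPT conditional probability used in such forecasts can be computed directly from the inverse-Gaussian CDF. The recurrence parameters below are illustrative only, not the paper's fault values.

```python
from math import sqrt, exp, erf

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bpt_cdf(t, mu, alpha):
    """CDF of the Brownian Passage Time (inverse Gaussian) distribution
    with mean recurrence time mu and aperiodicity alpha."""
    lam = mu / alpha ** 2
    u1 = sqrt(lam / t) * (t / mu - 1.0)
    u2 = sqrt(lam / t) * (t / mu + 1.0)
    return norm_cdf(u1) + exp(2.0 * lam / mu) * norm_cdf(-u2)

def conditional_prob(elapsed, window, mu, alpha):
    """P(event within `window` yr | no event for `elapsed` yr)."""
    f_now = bpt_cdf(elapsed, mu, alpha)
    f_end = bpt_cdf(elapsed + window, mu, alpha)
    return (f_end - f_now) / (1.0 - f_now)

# Illustrative values: mean recurrence 250 yr, aperiodicity 0.5,
# 200 yr elapsed since the last event, 30-yr forecast window.
p30 = conditional_prob(elapsed=200.0, window=30.0, mu=250.0, alpha=0.5)
```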

  17. Personnel reliability impact on petrochemical facilities monitoring system's failure skipping probability

    Science.gov (United States)

    Kostyukov, V. N.; Naumenko, A. P.

    2017-08-01

    The paper addresses the evaluation of how the actions of operators of complex technological systems affect safe operation, considering the application of condition monitoring systems for elements and sub-systems of petrochemical production facilities. The main task of the research is to distinguish the factors and criteria describing monitoring system properties that would allow the impact of personnel errors on the operation of real-time condition monitoring and diagnostic systems for petrochemical machinery to be evaluated, and to find objective criteria for monitoring system class that take the human factor into account. On the basis of the real-time condition monitoring concepts of sudden failure skipping risk and of the static and dynamic errors of monitoring systems, one may evaluate the impact of personnel qualification on monitoring system operation, in terms of errors in the actions of personnel or operators while receiving information from monitoring systems and operating a technological system. The operator is considered part of the technological system. Personnel behaviour is usually described as a combination of the following stages: input signal (information perception), reaction (decision making), and response (decision implementation). Based on several studies of the behaviour of nuclear power station operators in the USA, Italy and other countries, as well as on research by Russian scientists, the required data on operator reliability were selected for the analysis of operator behaviour with diagnostic and monitoring systems at technological facilities.
    The calculations revealed that for the monitoring system selected as an example, the failure skipping risk for the set values of the static error (less than 0.01) and the dynamic error (less than 0.001), considering all related factors of the data on the reliability of information perception, decision making and reaction, is 0.037 in the case when all the facilities and error probability are under

  18. Long‐term time‐dependent probabilities for the third Uniform California Earthquake Rupture Forecast (UCERF3)

    Science.gov (United States)

    Field, Edward; Biasi, Glenn P.; Bird, Peter; Dawson, Timothy E.; Felzer, Karen R.; Jackson, David A.; Johnson, Kaj M.; Jordan, Thomas H.; Madden, Christopher; Michael, Andrew J.; Milner, Kevin; Page, Morgan T.; Parsons, Thomas E.; Powers, Peter; Shaw, Bruce E.; Thatcher, Wayne R.; Weldon, Ray J.; Zeng, Yuehua

    2015-01-01

    The 2014 Working Group on California Earthquake Probabilities (WGCEP 2014) presents time-dependent earthquake probabilities for the third Uniform California Earthquake Rupture Forecast (UCERF3). Building on the UCERF3 time-independent model, published previously, renewal models are utilized to represent elastic-rebound-implied probabilities. A new methodology has been developed that solves applicability issues in the previous approach for un-segmented models. The new methodology also supports magnitude-dependent aperiodicity and accounts for the historic open interval on faults that lack a date-of-last-event constraint. Epistemic uncertainties are represented with a logic tree, producing 5,760 different forecasts. Results for a variety of evaluation metrics are presented, including logic-tree sensitivity analyses and comparisons to the previous model (UCERF2). For 30-year M≥6.7 probabilities, the most significant changes from UCERF2 are a threefold increase on the Calaveras fault and a threefold decrease on the San Jacinto fault. Such changes are due mostly to differences in the time-independent models (e.g., fault slip rates), with relaxation of segmentation and inclusion of multi-fault ruptures being particularly influential. In fact, some UCERF2 faults were simply too long to produce M 6.7 sized events given the segmentation assumptions in that study. Probability model differences are also influential, with the implied gains (relative to a Poisson model) being generally higher in UCERF3. Accounting for the historic open interval is one reason. Another is an effective 27% increase in the total elastic-rebound-model weight. The exact factors influencing differences between UCERF2 and UCERF3, as well as the relative importance of logic-tree branches, vary throughout the region, and depend on the evaluation metric of interest. For example, M≥6.7 probabilities may not be a good proxy for other hazard or loss measures. This sensitivity, coupled with the

  19. Questionnaire of social probability and potential consequences: Examination of reliability and validity on Serbian population

    Directory of Open Access Journals (Sweden)

    Ranđelović Kristina M.

    2014-01-01

    Full Text Available Bias in judgment plays an important role in cognitive models of psychopathology; any selective processing of emotionally relevant stimuli is called a cognitive bias. One of the cognitive biases considered a key factor in social anxiety disorder is judgmental bias, defined as a disposition to overestimate the probability of occurrence of negative social events in the near future, as well as their potential consequences (the distress that might follow them). The perception of danger is essentially determined by the joint effect of the subjective assessment of probability and of the distress created by certain events. Research has shown that socially anxious individuals exhibit a more pronounced judgmental bias, and that it can be reduced by certain psychotherapeutic and pharmacological treatments, which proves its relevance for social anxiety disorder. Considering the significance of the judgmental bias construct for research and clinical practice, and the lack of an operational instrument in our country, the basic purpose of this paper is to check the metric characteristics of the Serbian version of one of the most frequently used questionnaires for assessing this construct, the Questionnaire of social probability and potential consequences, which has two subscales. The aims were: 1) to examine the reliability of the questionnaire on a sample of examinees from Serbia; 2) to examine the latent structure of the questionnaire; and 3) to examine the construct validity of the questionnaire by checking its correlations with other relevant constructs (personality traits, trait anxiety, and fear of negative evaluation). The questionnaire was adapted from English into Serbian. The sample consists of 166 examinees, aged from 19 to 29 (AS = 21.73; SD = 1.43).
    The questionnaire for sensitivity to confirmation assessment was used to estimate personality traits; anxiety as a trait was estimated by the

  20. Using extreme value theory approaches to forecast the probability of outbreak of highly pathogenic influenza in Zhejiang, China.

    Directory of Open Access Journals (Sweden)

    Jiangpeng Chen

    Full Text Available Influenza is a contagious disease with high transmissibility, spreading around the world with considerable morbidity and mortality, and presents an enormous burden on worldwide public health. Few mathematical models can be used because influenza incidence data are generally not normally distributed. We developed a mathematical model using Extreme Value Theory (EVT) to forecast the probability of outbreak of highly pathogenic influenza. The incidence data of highly pathogenic influenza in Zhejiang province from April 2009 to November 2013 were retrieved from the website of the Health and Family Planning Commission of Zhejiang Province. The MATLAB "VIEM" toolbox was used for data analysis and modelling. In the present work, we used the Peak Over Threshold (POT) model, assuming the frequency to be a Poisson process and the intensity to be Pareto distributed, to characterize the temporal variability of the long-term extreme incidence of highly pathogenic influenza in Zhejiang, China. The skewness and kurtosis of the incidence of highly pathogenic influenza in Zhejiang between April 2009 and November 2013 were 4.49 and 21.12, which indicated a "fat tail" distribution. A QQ plot and a mean excess plot were used to further validate the features of the distribution. After determining the threshold, we modeled the extremes and estimated the shape and scale parameters by the maximum likelihood method. The results showed that months in which the incidence of highly pathogenic influenza is about 4462, 2286, 1311, and 487 are predicted to occur once every five, three, two, and one years, respectively. Despite its simplicity, the present study successfully offers a sound modeling strategy and a methodological avenue for forecasting an epidemic in the midst of its course.
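
    The POT machinery described above can be sketched as follows. Synthetic heavy-tailed counts stand in for the Zhejiang series, and a method-of-moments fit of the Generalized Pareto excess distribution replaces the paper's maximum-likelihood step for brevity.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic monthly incidence counts standing in for the 56-month
# Zhejiang series (the real data are not reproduced here).
incidence = rng.pareto(2.5, 56) * 300.0

# Peaks-Over-Threshold: keep the excesses above a high threshold u.
u = np.quantile(incidence, 0.8)
excess = incidence[incidence > u] - u

# Method-of-moments fit of the Generalized Pareto excess distribution.
m, s2 = excess.mean(), excess.var(ddof=1)
xi = 0.5 * (1.0 - m * m / s2)             # shape parameter
sigma = 0.5 * m * (m * m / s2 + 1.0)      # scale parameter

rate = len(excess) / len(incidence)       # threshold exceedance rate/month

def return_level(months):
    """Incidence level exceeded on average once every `months` months,
    from the standard POT return-level formula."""
    return u + sigma / xi * ((months * rate) ** xi - 1.0)

level_5yr = return_level(60)   # once-in-five-years (60 months) level
```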

  1. Fuzzy forecasting based on two-factors second-order fuzzy-trend logical relationship groups and the probabilities of trends of fuzzy logical relationships.

    Science.gov (United States)

    Chen, Shyi-Ming; Chen, Shen-Wen

    2015-03-01

    In this paper, we present a new method for fuzzy forecasting based on two-factors second-order fuzzy-trend logical relationship groups and the probabilities of trends of fuzzy-trend logical relationships. Firstly, the proposed method fuzzifies the historical training data of the main factor and the secondary factor into fuzzy sets, respectively, to form two-factors second-order fuzzy logical relationships. Then, it groups the obtained two-factors second-order fuzzy logical relationships into two-factors second-order fuzzy-trend logical relationship groups. Then, it calculates the probability of the "down-trend," the probability of the "equal-trend" and the probability of the "up-trend" of the two-factors second-order fuzzy-trend logical relationships in each two-factors second-order fuzzy-trend logical relationship group, respectively. Finally, it performs the forecasting based on the probabilities of the down-trend, the equal-trend, and the up-trend of the two-factors second-order fuzzy-trend logical relationships in each two-factors second-order fuzzy-trend logical relationship group. We also apply the proposed method to forecast the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX) and the NTD/USD exchange rates. The experimental results show that the proposed method outperforms the existing methods.
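
    A drastically simplified, single-factor first-order version of the trend-probability idea can be sketched as follows; the paper's two-factor second-order fuzzy-trend groups add fuzzification and a secondary series on top of this counting scheme, and the index values below are invented.

```python
import numpy as np

# Toy index series standing in for the TAIEX closing values.
series = np.array([100, 102, 101, 101, 104, 103, 103, 106, 105, 107, 107, 110])

def trend(a, b, tol=0.0):
    """-1 = down-trend, 0 = equal-trend, +1 = up-trend."""
    d = b - a
    return 0 if abs(d) <= tol else (1 if d > 0 else -1)

trends = [trend(a, b) for a, b in zip(series[:-1], series[1:])]

# Group by the preceding trend (a single-factor first-order stand-in
# for the two-factor second-order fuzzy-trend relationship groups) and
# estimate P(down), P(equal), P(up) within each group.
probs = {}
for prev in (-1, 0, 1):
    nxt = [t2 for t1, t2 in zip(trends[:-1], trends[1:]) if t1 == prev]
    if nxt:
        probs[prev] = {k: nxt.count(k) / len(nxt) for k in (-1, 0, 1)}
```

Forecasting then picks (or weights by) the most probable next trend for the group matching the current situation.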

  2. A combined Importance Sampling and Kriging reliability method for small failure probabilities with time-demanding numerical models

    International Nuclear Information System (INIS)

    Echard, B.; Gayton, N.; Lemaire, M.; Relun, N.

    2013-01-01

    Applying reliability methods to a complex structure is often delicate for two main reasons. First, such a structure is fortunately designed with codified rules leading to a large safety margin, which means that failure is a small-probability event. Such a probability level is difficult to assess efficiently. Second, the structure's mechanical behaviour is modelled numerically in an attempt to reproduce the real response, and the numerical model tends to be more and more time-demanding as its complexity is increased to improve accuracy and to consider particular mechanical behaviour. As a consequence, performing a large number of model computations cannot be considered in order to assess the failure probability. To overcome these issues, this paper proposes an original and easily implementable method called AK-IS, for Active learning and Kriging-based Importance Sampling. This new method is based on the AK-MCS algorithm previously published by Echard et al. [AK-MCS: an active learning reliability method combining Kriging and Monte Carlo simulation. Structural Safety 2011;33(2):145–54]. It associates the Kriging metamodel and its advantageous stochastic property with the Importance Sampling method to assess small failure probabilities. It enables the correction or validation of the FORM approximation with only very few mechanical model computations. The efficiency of the method is first proved on two academic applications. It is then applied to assess the reliability of a challenging aerospace case study subjected to fatigue.
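
    The Importance Sampling ingredient of AK-IS, sampling around the FORM design point and reweighting by the density ratio, can be shown on a one-dimensional limit state with a known answer (the Kriging metamodel layer is omitted here):

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(2)

# Limit state g(x) = beta - x with x ~ N(0,1): failure {g <= 0} has the
# known probability Phi(-beta), so the estimate can be checked exactly.
beta = 4.0

def g(x):
    return beta - x

# Importance Sampling: draw from a standard normal recentred at the
# FORM design point x* = beta, then reweight by the density ratio
# phi(x) / phi(x - beta) (normalization constants cancel).
n = 10000
x = rng.normal(beta, 1.0, n)
weights = np.exp(-0.5 * x**2) / np.exp(-0.5 * (x - beta)**2)
p_is = np.mean((g(x) <= 0) * weights)

p_exact = 0.5 * (1.0 - erf(beta / sqrt(2.0)))  # Phi(-beta)
```

Crude Monte Carlo would need on the order of 1/p samples to even observe a failure; recentring the sampling density makes roughly half the samples fall in the failure domain.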

  3. NDE reliability and probability of detection (POD) evolution and paradigm shift

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Surendra [NDE Engineering, Materials and Process Engineering, Honeywell Aerospace, Phoenix, AZ 85034 (United States)

    2014-02-18

    The subject of NDE Reliability and POD has gone through multiple phases since its humble beginning in the late 1960s. This was followed by several programs, including the important one nicknamed “Have Cracks – Will Travel”, or in short “Have Cracks”, conducted by Lockheed Georgia Company for the US Air Force during 1974–1978. This and other studies ultimately led to a series of developments in the field of reliability and POD, from the introduction of fracture mechanics and Damage Tolerant Design (DTD), to the statistical framework of Berens and Hovey (1981) for POD estimation, to MIL-STD HDBK 1823 (1999) and 1823A (2009). During the last decade, various groups and researchers have further studied reliability and POD using Model Assisted POD (MAPOD), Simulation Assisted POD (SAPOD), and Bayesian statistics. Each of these developments had one objective, i.e., improving the accuracy of life prediction in components, which to a large extent depends on the reliability and capability of NDE methods. It is therefore essential to have reliable detection and sizing of large flaws in components. Currently, POD is used for studying the reliability and capability of NDE methods, though POD data offers no absolute truth regarding NDE reliability, i.e., system capability, effects of flaw morphology, and quantification of human factors. Furthermore, reliability and POD have often been treated as synonymous, but POD is not NDE reliability: POD is a subset of reliability, which consists of six phases: 1) sample selection using DOE, 2) NDE equipment setup and calibration, 3) System Measurement Evaluation (SME), including Gage Repeatability and Reproducibility (Gage R and R) and Analysis Of Variance (ANOVA), 4) NDE system capability and electronic and physical saturation, 5) acquiring and fitting data to a model, and data analysis, and 6) POD estimation. This paper provides an overview of all major POD milestones of the last several decades and discusses the rationale for using

  4. Human error probability evaluation as part of reliability analysis of digital protection system of advanced pressurized water reactor - APR 1400

    International Nuclear Information System (INIS)

    Varde, P. V.; Lee, D. Y.; Han, J. B.

    2003-03-01

    A case study on human reliability analysis has been performed as part of the reliability analysis of the digital protection system of the APR 1400 reactor, which automatically actuates the shutdown system of the reactor when demanded. The safety analysis, however, takes credit for operator action as a diverse means of tripping the reactor in the (low-probability) ATWS scenario. Based on the available information, two cases have been analyzed: human error in tripping the reactor, and calibration error for the instrumentation in the protection system. Wherever applicable, a parametric study has also been performed.

  5. Reliability analysis of idealized tunnel support system using probability-based methods with case studies

    NARCIS (Netherlands)

    Gharouni-Nik, M.; Naeimi, M.; Ahadi, S.; Alimoradi, Z.

    2014-01-01

    In order to determine the overall safety of a tunnel support lining, a reliability-based approach is presented in this paper. Support elements in jointed rock tunnels are provided to control the ground movement caused by stress redistribution during the tunnel drive. Main support elements contribute

  6. Advanced RESTART method for the estimation of the probability of failure of highly reliable hybrid dynamic systems

    International Nuclear Information System (INIS)

    Turati, Pietro; Pedroni, Nicola; Zio, Enrico

    2016-01-01

    The efficient estimation of system reliability characteristics is of paramount importance for many engineering applications. Real-world system reliability modeling calls for the capability of treating systems that are: i) dynamic, ii) complex, iii) hybrid and iv) highly reliable. Advanced Monte Carlo (MC) methods offer a way to solve these types of problems, although at potentially high computational costs. In this paper, the REpetitive Simulation Trials After Reaching Thresholds (RESTART) method is employed, extending it to hybrid systems for the first time (to the authors' knowledge). The estimation accuracy and precision of RESTART highly depend on the choice of the Importance Function (IF) indicating how close the system is to failure: in this respect, proper IFs are here originally proposed to improve the performance of RESTART for the analysis of hybrid systems. The resulting overall simulation approach is applied to estimate the probability of failure of the control system of a liquid hold-up tank and of a pump-valve subsystem subject to degradation induced by fatigue. The results are compared to those obtained by standard MC simulation and by RESTART with classical IFs available in the literature. The comparison shows the improvement in performance obtained by our approach. - Highlights: • We consider the issue of estimating small failure probabilities in dynamic systems. • We employ the RESTART method to estimate the failure probabilities. • New Importance Functions (IFs) are introduced to increase the method performance. • We adopt two dynamic, hybrid, highly reliable systems as case studies. • A comparison with literature IFs proves the effectiveness of the new IFs.

  7. Estimation of the human error probabilities in the human reliability analysis

    International Nuclear Information System (INIS)

    Liu Haibin; He Xuhong; Tong Jiejuan; Shen Shifei

    2006-01-01

    Human error data is an important issue in human reliability analysis (HRA). Bayesian parameter estimation, which can combine multiple sources of information, such as NPP historical data and expert judgment, to update the human error data, can yield human error data that more truly reflect the real situation of the NPP. This paper, using a numerical computation program developed by the authors, presents some typical examples to illustrate the process of Bayesian parameter estimation in HRA and discusses the effect of different modification data on the Bayesian parameter estimation. (authors)
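
    A minimal sketch of such a Bayesian update, using the conjugate Beta-Binomial model; the prior pseudo-counts and the "plant evidence" below are invented for illustration, not values from the paper.

```python
# Prior belief about a human error probability (HEP), encoded as a
# Beta(a, b) distribution; prior mean a/(a+b) = 0.01, i.e. roughly one
# error per 100 demands, with low prior weight (a + b = 50 pseudo-demands).
a_prior, b_prior = 0.5, 49.5

# Hypothetical plant evidence: 2 errors observed in 150 demands.
errors, demands = 2, 150

# Conjugate update: posterior is Beta(a + k, b + n - k).
a_post = a_prior + errors
b_post = b_prior + (demands - errors)

hep_prior = a_prior / (a_prior + b_prior)   # 0.01
hep_post = a_post / (a_post + b_post)       # posterior mean HEP
```

The posterior mean sits between the generic prior and the raw plant rate, which is exactly the "modification of human error data by multiple information sources" the abstract describes.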

  8. On the incidence of meteorological and hydrological processors: Effect of resolution, sharpness and reliability of hydrological ensemble forecasts

    Science.gov (United States)

    Abaza, Mabrouk; Anctil, François; Fortin, Vincent; Perreault, Luc

    2017-12-01

    Meteorological and hydrological ensemble prediction systems are imperfect. Their outputs could often be improved through the use of a statistical processor, opening up the question of the necessity of using both processors (meteorological and hydrological), only one of them, or none. This experiment compares the predictive distributions from four hydrological ensemble prediction systems (H-EPS) utilising the Ensemble Kalman filter (EnKF) probabilistic sequential data assimilation scheme. They differ in the inclusion or not of the Distribution Based Scaling (DBS) method for post-processing meteorological forecasts and the ensemble Bayesian Model Averaging (ensemble BMA) method for hydrological forecast post-processing. The experiment is implemented on three large watersheds and relies on the combination of two meteorological reforecast products: the 4-member Canadian reforecasts from the Canadian Centre for Meteorological and Environmental Prediction (CCMEP) and the 10-member American reforecasts from the National Oceanic and Atmospheric Administration (NOAA), leading to 14 members at each time step. Results show that all four tested H-EPS lead to resolution and sharpness values that are quite similar, with an advantage to DBS + EnKF. The ensemble BMA is unable to compensate for any bias left in the precipitation ensemble forecasts. On the other hand, it succeeds in calibrating ensemble members that are otherwise under-dispersed. If reliability is preferred over resolution and sharpness, DBS + EnKF + ensemble BMA performs best, making use of both processors in the H-EPS system. Conversely, for enhanced resolution and sharpness, DBS is the preferred method.

  9. Reliability, failure probability, and strength of resin-based materials for CAD/CAM restorations

    Directory of Open Access Journals (Sweden)

    Kiatlin Lim

    Full Text Available ABSTRACT Objective: This study investigated the Weibull parameters and the flexural strength at 5% fracture probability of direct, indirect, and CAD/CAM composites. Material and Methods: Disc-shaped specimens (12 mm diameter x 1 mm thick) were prepared for a direct composite [Z100 (ZO), 3M-ESPE], an indirect laboratory composite [Ceramage (CM), Shofu], and two CAD/CAM composites [Lava Ultimate (LU), 3M ESPE; Vita Enamic (VE), Vita Zahnfabrik] restorations (n=30 for each group). The specimens were polished and stored in distilled water for 24 hours at 37°C. Weibull parameters (m = Weibull modulus, σ0 = characteristic strength) and the flexural strength for 5% fracture probability (σ5%) were determined using a piston-on-three-balls device at 1 MPa/s in distilled water. Statistical analyses of biaxial flexural strength were performed by one-way ANOVA with Tukey's post hoc test (α=0.05) and by Pearson's correlation test. Results: Ranking of m was: VE (19.5), LU (14.5), CM (11.7), and ZO (9.6). Ranking of σ0 (MPa) was: LU (218.1), ZO (210.4), CM (209.0), and VE (126.5). σ5% (MPa) was 177.9 for LU, 163.2 for CM, 154.7 for ZO, and 108.7 for VE. There was no significant difference in m for ZO, CM, and LU; VE presented the highest m value, significantly higher than that of ZO. For σ0 and σ5%, ZO, CM, and LU were similar but higher than VE. Conclusion: The strength characteristics of CAD/CAM composites vary according to their composition and microstructure. VE presented the lowest strength and the highest Weibull modulus among the materials.
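The reported σ5% values follow directly from the two-parameter Weibull model, σF = σ0·(−ln(1−F))^(1/m). A quick check with the m and σ0 values given in the abstract reproduces the reported 5% strengths to within roughly 1 MPa (the small residuals reflect rounding of the reported parameters).

```python
import math

# Flexural strength at 5% fracture probability from the two-parameter
# Weibull model: sigma_F = sigma0 * (-ln(1 - F))**(1/m).
# m and sigma0 below are the values reported in the abstract.

def weibull_strength(m, sigma0, failure_prob=0.05):
    return sigma0 * (-math.log(1.0 - failure_prob)) ** (1.0 / m)

materials = {"VE": (19.5, 126.5), "LU": (14.5, 218.1),
             "CM": (11.7, 209.0), "ZO": (9.6, 210.4)}
for name, (m, s0) in materials.items():
    print(name, round(weibull_strength(m, s0), 1))
```

Note how a higher Weibull modulus m (narrower strength scatter, as for VE) keeps σ5% close to σ0, while a low m (wide scatter, as for ZO) drags the 5% strength far below the characteristic strength.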

  10. Determination of reliability of express forecasting evaluation of radiometric enriching ability of non-ferrous ores

    International Nuclear Information System (INIS)

    Kirpishchikov, S.P.

    1991-01-01

    Use of data from nuclear-physical methods of sampling and logging makes it possible to improve the reliability of the evaluation of the radiometric enriching ability of ores, and to quantify that reliability. This problem may be solved using some concepts of geostatistics. The results presented support the conclusion that data from nuclear-physical methods of sampling and logging can provide a highly reliable evaluation of the radiometric enriching ability of non-ferrous ores and of their geometrization by technological types

  11. The influence of frequency and reliability of in-service inspection on reactor pressure vessel disruptive failure probability

    International Nuclear Information System (INIS)

    Jordan, G.M.

    1977-01-01

    A simple probabilistic methodology is used to investigate the benefit, in terms of reduction of disruptive failure probability, which comes from the application of periodic In Service Inspection (ISI) to nuclear pressure vessels. The analysis indicates the strong interaction between inspection benefit and the intrinsic quality of the structure. In order to quantify the inspection benefit, assumptions are made which allow the quality to be characterized in terms of the parameters governing a Log Normal distribution of time-to-failure. Using these assumptions, it is shown that the overall benefit of ISI is unlikely to exceed an order of magnitude in terms of reduction of disruptive failure probability. The method is extended to evaluate the effect of the periodicity and reliability of the inspection process itself. (author)
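The log-normal time-to-failure characterisation above can be sketched numerically. Everything below is illustrative, not from the paper: the parameters mu and sigma, the 40-year life, and the crude assumption that inspection with detection probability 0.9 scales the failure probability by (1 − 0.9). The resulting factor-of-ten reduction simply illustrates the order-of-magnitude bound the abstract mentions.

```python
import math

# Log-normal time-to-failure model used to characterise vessel quality:
# F(t) = Phi((ln t - mu) / sigma), with Phi the standard normal CDF.
# All parameter values below are illustrative, not taken from the paper.

def lognormal_failure_prob(t, mu, sigma):
    z = (math.log(t) - mu) / sigma
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

life = 40.0  # years of service
p_no_isi = lognormal_failure_prob(life, mu=8.0, sigma=1.5)

# Crude assumption: an inspection programme that detects and removes a
# fraction of incipient defects scales the failure probability by
# (1 - detection_prob). A 90% detection probability gives a factor of 10,
# the order-of-magnitude ceiling discussed in the paper.
detection_prob = 0.9
p_with_isi = p_no_isi * (1.0 - detection_prob)
print(p_no_isi, p_with_isi, p_no_isi / p_with_isi)
```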

  12. The influence of frequency and reliability of in-service inspection on reactor pressure vessel disruptive failure probability

    International Nuclear Information System (INIS)

    Jordan, G.M.

    1978-01-01

    A simple probabilistic methodology is used to investigate the benefit, in terms of reduction of disruptive failure probability, which comes from the application of periodic in-service inspection to nuclear pressure vessels. The analysis indicates the strong interaction between inspection benefit and the intrinsic quality of the structure. In order to quantify the inspection benefit, assumptions are made which allow the quality to be characterised in terms of the parameters governing a log-normal distribution of time-to-failure. Using these assumptions, it is shown that the overall benefit of in-service inspection is unlikely to exceed an order of magnitude in terms of reduction of disruptive failure probability. The method is extended to evaluate the effect of the periodicity and reliability of the inspection process itself. (author)

  13. Formulating informative, data-based priors for failure probability estimation in reliability analysis

    International Nuclear Information System (INIS)

    Guikema, Seth D.

    2007-01-01

    Priors play an important role in the use of Bayesian methods in risk analysis, and using all available information to formulate an informative prior can lead to more accurate posterior inferences. This paper examines the practical implications of using five different methods for formulating an informative prior for a failure probability based on past data. These methods are the method of moments, maximum likelihood (ML) estimation, maximum entropy estimation, starting from a non-informative 'pre-prior', and fitting a prior based on confidence/credible interval matching. The priors resulting from the use of these different methods are compared qualitatively, and the posteriors are compared quantitatively based on a number of different scenarios of observed data used to update the priors. The results show that the amount of information assumed in the prior makes a critical difference in the accuracy of the posterior inferences. For situations in which the data used to formulate the informative prior is an accurate reflection of the data that is later observed, the ML approach yields the minimum variance posterior. However, the maximum entropy approach is more robust to differences between the data used to formulate the prior and the observed data because it maximizes the uncertainty in the prior subject to the constraints imposed by the past data
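A sketch of one of the five formulation methods compared above, the method of moments: a Beta prior is fitted so that its mean and variance match those of past failure-probability estimates, and is then updated conjugately with newly observed data. The sample values are invented for illustration.

```python
# Method-of-moments fit of a Beta prior to past failure-probability data,
# one of the five prior-formulation methods the paper compares.
# The sample values below are made up for illustration.

def beta_from_moments(samples):
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)
    common = mean * (1.0 - mean) / var - 1.0
    return mean * common, (1.0 - mean) * common  # (alpha, beta)

past_estimates = [0.010, 0.014, 0.008, 0.012, 0.011]
alpha, beta = beta_from_moments(past_estimates)
print(alpha, beta)

# Conjugate update with newly observed data: k failures in n demands.
k, n = 2, 150
posterior_mean = (alpha + k) / (alpha + beta + n)
print(round(posterior_mean, 4))
```

Because the past estimates are tightly clustered, the fitted prior is strongly informative (large alpha + beta), so the posterior moves only slightly toward the newly observed rate — the behaviour the paper's accuracy comparison turns on.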

  14. Earthquakes and forecast reliability: thermoactivation and mesomechanics of the focal zone

    Science.gov (United States)

    Kalinnikov, I. I.; Manukin, A. B.; Matyunin, V. P.

    2017-06-01

    According to our data, invoking fundamental laws of physics, in particular treating an earthquake as a peaked macroprocess driven by the thermofluctuational activation of mechanical stresses in certain media, makes it possible to move beyond the traditional view of the earthquake-prediction problem. Many formal parameters of the statistical processing of geophysical data can be given a physical meaning related to the mesomechanics of structural changes in a stressed solid. Measures for improving the efficiency of observations, and of their mathematical processing for forecasting purposes, are specified.

  15. Reliability engineering

    International Nuclear Information System (INIS)

    Lee, Chi Woo; Kim, Sun Jin; Lee, Seung Woo; Jeong, Sang Yeong

    1993-08-01

    This book begins with the question of what reliability is, covering the origin of reliability problems, the definition of reliability, and the uses of reliability. It also deals with probability and the calculation of reliability, the reliability function and failure rate, probability distributions in reliability, estimation of MTBF, probability-distribution processes, down time, maintainability and availability, breakdown maintenance and preventive maintenance, design for reliability, reliability prediction and statistics, reliability testing, reliability data, and the design and management of reliability.
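Several of the quantities this book introduces (reliability function, failure rate, MTBF, down time, availability) connect simply in the constant-failure-rate model. A minimal sketch with illustrative numbers:

```python
import math

# Constant-failure-rate (exponential) model: R(t) = exp(-lambda*t),
# MTBF = 1/lambda, and steady-state availability from MTBF and the
# mean down time per failure (MTTR). All numbers are illustrative.

failure_rate = 1e-4          # failures per hour
mtbf = 1.0 / failure_rate    # mean time between failures: 10,000 h

def reliability(t, lam=failure_rate):
    """Probability of surviving to time t without failure."""
    return math.exp(-lam * t)

mttr = 8.0                            # mean time to repair, hours
availability = mtbf / (mtbf + mttr)   # fraction of time the item is up

print(reliability(1000.0))   # survival probability over 1000 h
print(availability)
```

A useful landmark: in this model the probability of surviving one full MTBF is exp(−1) ≈ 0.37, not 0.5 — a point such textbooks typically stress.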

  16. Quantitative assessment of probability of failing safely for the safety instrumented system using reliability block diagram method

    International Nuclear Information System (INIS)

    Jin, Jianghong; Pang, Lei; Zhao, Shoutang; Hu, Bin

    2015-01-01

    Highlights: • Models of PFS for SIS were established using the reliability block diagram. • A more accurate calculation of PFS for SIS can be acquired by using the SL. • Degraded operation of a complex SIS does not affect the availability of the SIS. • The safe undetected failure is the largest contributor to the PFS of the SIS. - Abstract: The spurious trip of a safety instrumented system (SIS) brings great economic losses to production, so ensuring that the SIS is both reliable and available has become a pressing issue. However, the existing models of spurious trip rate (STR) and probability of failing safely (PFS) are oversimplified and inaccurate, and in-depth studies of availability are required to obtain a more accurate PFS for the SIS. Based on an analysis of the factors that influence the PFS for the SIS, a quantitative study of the PFS for the SIS is carried out using the reliability block diagram (RBD) method, and some application examples are given. The results show that common cause failure increases the PFS; degraded operation does not affect the availability of the SIS; if the equipment is tested and repaired one by one, the unavailability of the SIS can be ignored; the occurrence time of an independent safe undetected failure should be the system lifecycle (SL) rather than the proof test interval; and the independent safe undetected failure is the largest contributor to the PFS for the SIS
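A minimal sketch of the RBD combination rules underlying such a study: blocks in series multiply reliabilities, redundant (parallel) blocks multiply unreliabilities. The architecture and probabilities below are illustrative, not taken from the paper.

```python
# Reliability block diagram (RBD) combination rules:
# series blocks multiply reliabilities; parallel (redundant) blocks
# multiply unreliabilities. Probabilities below are illustrative.

def series(*probs):
    r = 1.0
    for p in probs:
        r *= p
    return r

def parallel(*probs):
    q = 1.0
    for p in probs:
        q *= (1.0 - p)
    return 1.0 - q

# A redundant sensor pair in series with a logic solver and a valve:
r_sensor, r_logic, r_valve = 0.95, 0.999, 0.98
r_system = series(parallel(r_sensor, r_sensor), r_logic, r_valve)
print(round(r_system, 5))
```

The same algebra applies whether the "success" probability is availability or its complement (e.g. PFS contributions): the redundant pair lifts 0.95 to 0.9975, after which the single-channel logic solver and valve dominate the system figure.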

  17. Forecast of reliability for mechanical components subjected to wearing; Pronostico de la fiabilidad de componentes mecanicos sometidos a desgaste

    Energy Technology Data Exchange (ETDEWEB)

    Angulo-Zevallos, J.; Castellote-Varona, C.; Alanbari, M.

    2010-07-01

    Generally, improving the quality and price of products, obtaining complete customer satisfaction, and achieving excellence in all processes are some of the challenges currently faced by every company. To meet them, the reliability of components must be known on a regular basis. To achieve this goal, research that contributes clear ideas and offers a methodology for assessing the parameters involved in the reliability calculation becomes necessary. A parameter closely related to this concept is the probability of product failure as a function of operating time. It is known that mechanical components fail by creep, fatigue, wear, corrosion, etc. This article proposes a methodology for finding the reliability of a component subject to wear, such as brake pads, grinding wheels, brake linings of clutch discs, etc. (Author)

  18. Linear and evolutionary polynomial regression models to forecast coastal dynamics: Comparison and reliability assessment

    Science.gov (United States)

    Bruno, Delia Evelina; Barca, Emanuele; Goncalves, Rodrigo Mikosz; de Araujo Queiroz, Heithor Alexandre; Berardi, Luigi; Passarella, Giuseppe

    2018-01-01

    In this paper, the Evolutionary Polynomial Regression data-modelling strategy has been applied to study small-scale, short-term coastal morphodynamics, given its capability for treating a wide database of known information non-linearly. Simple linear and multilinear regression models were also applied, to strike a balance between computational load and the reliability of the estimations across the three models. In fact, even though it is easy to imagine that the more complex the model, the more the prediction improves, sometimes a "slight" worsening of the estimations can be accepted in exchange for the time saved in data organization and computational load. The models' outcomes were validated through a detailed statistical error analysis, which revealed a slightly better estimation by the polynomial model with respect to the multilinear model, as expected. On the other hand, even though the data organization was identical for the two models, the multilinear one required a simpler simulation setting and a faster run time. Finally, the most reliable evolutionary polynomial regression model was used to conjecture how the uncertainty grows as the extrapolation time of the estimation is extended. The overlap rate between the confidence band of the mean of the known coast position and the prediction band of the estimated position can be a good index of the weakness in producing reliable estimations when the extrapolation time increases too much. The proposed models and tests have been applied to a coastal sector located near Torre Colimena in the Apulia region, south Italy.

  19. Statistical equivalence and test-retest reliability of delay and probability discounting using real and hypothetical rewards.

    Science.gov (United States)

    Matusiewicz, Alexis K; Carter, Anne E; Landes, Reid D; Yi, Richard

    2013-11-01

    Delay discounting (DD) and probability discounting (PD) refer to the reduction in the subjective value of outcomes as a function of delay and uncertainty, respectively. Elevated measures of discounting are associated with a variety of maladaptive behaviors, and confidence in the validity of these measures is imperative. The present research examined (1) the statistical equivalence of discounting measures when rewards were hypothetical or real, and (2) their 1-week reliability. While previous research has partially explored these issues using the low threshold of nonsignificant difference, the present study fully addressed this issue using the more-compelling threshold of statistical equivalence. DD and PD measures were collected from 28 healthy adults using real and hypothetical $50 rewards during each of two experimental sessions, one week apart. Analyses using area-under-the-curve measures revealed a general pattern of statistical equivalence, indicating equivalence of real/hypothetical conditions as well as 1-week reliability. Exceptions are identified and discussed. Copyright © 2013 Elsevier B.V. All rights reserved.
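The analyses above rest on area-under-the-curve (AUC) summaries of discounting, computed by normalised trapezoidal integration (the standard method of Myerson, Green & Warusawitharana, 2001). A sketch with hypothetical indifference points for a $50 reward:

```python
# Normalised area under the discounting curve: delays are scaled by the
# maximum delay and subjective values by the nominal amount, so
# AUC = 1.0 means no discounting and values near 0 mean steep discounting.
# The data points below are hypothetical.

def discounting_auc(delays, values, amount):
    """Trapezoidal AUC of normalised (delay, subjective value) points."""
    max_delay = delays[-1]
    xs = [d / max_delay for d in delays]
    ys = [v / amount for v in values]
    return sum((xs[i + 1] - xs[i]) * (ys[i] + ys[i + 1]) / 2.0
               for i in range(len(xs) - 1))

delays = [0, 7, 30, 90, 365]              # days
values = [50.0, 40.0, 30.0, 18.0, 8.0]    # indifference points for $50
print(round(discounting_auc(delays, values, 50.0), 3))
```

Because AUC is theory-free and bounded in [0, 1], it is well suited to the equivalence testing the study performs: two conditions (real vs. hypothetical rewards, or week 1 vs. week 2) are compared directly on this scale.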

  20. Reliable before-fabrication forecasting of normal and touch mode MEMS capacitive pressure sensor: modeling and simulation

    Science.gov (United States)

    Jindal, Sumit Kumar; Mahajan, Ankush; Raghuwanshi, Sanjeev Kumar

    2017-10-01

    An analytical model and numerical simulation of the performance of MEMS capacitive pressure sensors in both normal and touch modes are required to know the expected behavior of the sensor prior to fabrication. Obtaining such information should be based on a complete analysis of performance parameters such as the deflection of the diaphragm, the change of capacitance as the diaphragm deflects, and the sensitivity of the sensor. In the literature, limited work has been carried out on the above-stated issue; moreover, due to the approximation factors of the polynomials used, a tolerance error cannot be avoided. Reliable before-fabrication forecasting requires exact mathematical calculation of the parameters involved. A second-order polynomial equation is derived mathematically for the key performance parameters of both modes. This eliminates the approximation factor, so that an exact result can be studied while maintaining high accuracy. The elimination of approximation factors and the approach to exact results are based on a new design parameter (δ) that we propose. The design parameter gives designers an initial hint of how the sensor will behave once it is fabricated. The complete work is supported by extensive mathematical detailing of all the parameters involved. Next, we verified our claims using MATLAB® simulation. Since MATLAB® effectively provides the simulation theory for the design approach, the more complicated finite element method is not used.

  1. Maximizing Statistical Power When Verifying Probabilistic Forecasts of Hydrometeorological Events

    Science.gov (United States)

    DeChant, C. M.; Moradkhani, H.

    2014-12-01

    Hydrometeorological events (i.e. floods, droughts, precipitation) are increasingly being forecasted probabilistically, owing to the uncertainties in the underlying causes of the phenomenon. In these forecasts, the probability of the event, over some lead time, is estimated based on some model simulations or predictive indicators. By issuing probabilistic forecasts, agencies may communicate the uncertainty in the event occurring. Assuming that the assigned probability of the event is correct, which is referred to as a reliable forecast, the end user may perform some risk management based on the potential damages resulting from the event. Alternatively, an unreliable forecast may give false impressions of the actual risk, leading to improper decision making when protecting resources from extreme events. Due to this requisite for reliable forecasts to perform effective risk management, this study takes a renewed look at reliability assessment in event forecasts. Illustrative experiments will be presented, showing deficiencies in the commonly available approaches (Brier Score, Reliability Diagram). Overall, it is shown that the conventional reliability assessment techniques do not maximize the ability to distinguish between a reliable and unreliable forecast. In this regard, a theoretical formulation of the probabilistic event forecast verification framework will be presented. From this analysis, hypothesis testing with the Poisson-Binomial distribution is the most exact model available for the verification framework, and therefore maximizes one's ability to distinguish between a reliable and unreliable forecast. Application of this verification system was also examined within a real forecasting case study, highlighting the additional statistical power provided with the use of the Poisson-Binomial distribution.

  2. Reliability

    OpenAIRE

    Condon, David; Revelle, William

    2017-01-01

    Separating the signal in a test from the irrelevant noise is a challenge for all measurement. Low test reliability limits test validity, attenuates important relationships, and can lead to regression artifacts. Multiple approaches to the assessment and improvement of reliability are discussed. The advantages and disadvantages of several different approaches to reliability are considered. Practical advice on how to assess reliability using open source software is provided.

  3. How uncertain are day-ahead wind forecasts?

    Energy Technology Data Exchange (ETDEWEB)

    Grimit, E. [3TIER Environmental Forecast Group, Seattle, WA (United States)

    2006-07-01

    Recent advances in the combination of weather forecast ensembles with Bayesian statistical techniques have helped to address uncertainties in wind forecasting. Weather forecast ensembles are a collection of numerical weather predictions. The combination of several equally-skilled forecasts typically results in a consensus forecast with greater accuracy. The distribution of forecasts also provides an estimate of forecast inaccuracy. However, weather forecast ensembles tend to be under-dispersive, and not all forecast uncertainties can be taken into account. In order to address these issues, a multi-variate linear regression approach was used to correct the forecast bias for each ensemble member separately. Bayesian model averaging was used to provide a predictive probability density function to allow for multi-modal probability distributions. A test location in eastern Canada was used to demonstrate the approach. Results of the test showed that the method improved wind forecasts and generated reliable prediction intervals. Prediction intervals were much shorter than comparable intervals based on a single forecast or on historical observations alone. It was concluded that the approach will provide economic benefits to both wind energy developers and investors. refs., tabs., figs.
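A sketch of the Bayesian model averaging step described above, under simplifying assumptions: each bias-corrected ensemble member contributes a Gaussian kernel with a common spread, and the member weights are fixed by hand here rather than fitted by maximum likelihood on training data as in practice. All numbers are illustrative.

```python
import math

# BMA predictive density as a weighted Gaussian mixture over
# bias-corrected ensemble members. Weights and spread would normally be
# fitted on a training period; here they are illustrative.

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def bma_density(x, members, weights, sigma):
    return sum(w * normal_pdf(x, m, sigma) for m, w in zip(members, weights))

members = [6.2, 7.1, 5.8, 6.9]   # bias-corrected wind speeds, m/s
weights = [0.3, 0.3, 0.2, 0.2]   # nonnegative, summing to 1
print(bma_density(6.5, members, weights, 0.8))

# Probability that wind speed exceeds a threshold, by trapezoidal
# integration of the mixture density over a fine grid:
xs = [i * 0.01 for i in range(0, 2001)]  # 0 .. 20 m/s
dens = [bma_density(x, members, weights, 0.8) for x in xs]
p_exceed_7 = sum(0.01 * 0.5 * (dens[i] + dens[i + 1])
                 for i in range(700, 2000))
print(round(p_exceed_7, 3))
```

The mixture form is what lets BMA represent the multi-modal predictive distributions mentioned in the abstract: members clustered around different values produce distinct modes rather than a single inflated Gaussian.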

  4. Forecasting Italian seismicity through a spatio-temporal physical model: importance of considering time-dependency and reliability of the forecast

    Directory of Open Access Journals (Sweden)

    Amir Hakimhashemi

    2010-11-01

    Full Text Available We apply here a forecasting model to the Italian region for the spatio-temporal distribution of seismicity based on a smoothing Kernel function, Coulomb stress variations, and a rate-and-state friction law. We tested the feasibility of this approach, and analyzed the importance of introducing time-dependency in forecasting future events. The change in seismicity rate as a function of time was estimated by calculating the Coulomb stress change imparted by large earthquakes. We applied our approach to the region of Italy, and used all of the cataloged earthquakes that occurred up to 2006 to generate the reference seismicity rate. For calculation of the time-dependent seismicity rate changes, we estimated the rate-and-state stress transfer imparted by all of the ML≥4.0 earthquakes that occurred during 2007 and 2008. To validate the results, we first compared the reference seismicity rate with the distribution of ML≥1.8 earthquakes since 2007, using both a non-declustered and a declustered catalog. A positive correlation was found, and all of the forecast earthquakes had locations within 82% and 87% of the study area with the highest seismicity rate, respectively. Furthermore, 95% of the forecast earthquakes had locations within 27% and 47% of the study area with the highest seismicity rate, respectively. For the time-dependent seismicity rate changes, the number of events with locations in the regions with a seismicity rate increase was 11% more than in the regions with a seismicity rate decrease.

  5. A Wind Forecasting System for Energy Application

    Science.gov (United States)

    Courtney, Jennifer; Lynch, Peter; Sweeney, Conor

    2010-05-01

    Accurate forecasting of available energy is crucial for the efficient management and use of wind power in the national power grid. With energy output critically dependent upon wind strength, there is a need to reduce the errors associated with wind forecasting. The objective of this research is to get the best possible wind forecasts for the wind energy industry. To achieve this goal, three methods are being applied. First, a mesoscale numerical weather prediction (NWP) model called WRF (Weather Research and Forecasting) is being used to predict wind values over Ireland. Currently, a grid resolution of 10 km is used, and higher model resolutions are being evaluated to establish whether they are economically viable given the forecast skill improvement they produce. Second, the WRF model is being used in conjunction with ECMWF (European Centre for Medium-Range Weather Forecasts) ensemble forecasts to produce a probabilistic weather forecasting product. Due to the chaotic nature of the atmosphere, a single, deterministic weather forecast can only have limited skill. The ECMWF ensemble methods produce an ensemble of 51 global forecasts, twice a day, by perturbing the initial conditions of a 'control' forecast which is the best estimate of the initial state of the atmosphere. This method provides an indication of the reliability of the forecast and a quantitative basis for probabilistic forecasting. The limitation of ensemble forecasting lies in the fact that the perturbed model runs behave differently under different weather patterns and each model run is equally likely to be closest to the observed weather situation. Models have biases, and involve assumptions about physical processes and forcing factors such as underlying topography. Third, Bayesian Model Averaging (BMA) is being applied to the output from the ensemble forecasts in order to statistically post-process the results and achieve a better wind forecasting system.
BMA is a promising technique that will offer calibrated

  6. Probabilistic forecasting for extreme NO2 pollution episodes

    International Nuclear Information System (INIS)

    Aznarte, José L.

    2017-01-01

    In this study, we investigate the suitability of quantile regression for predicting extreme concentrations of NO2. In contrast with the usual point forecasting, where a single value is forecast for each horizon, probabilistic forecasting through quantile regression allows the prediction of the full probability distribution, which in turn allows models to be built that are specifically fit for the tails of this distribution. Using data from the city of Madrid, including NO2 concentrations as well as meteorological measures, we build models that predict extreme NO2 concentrations, outperforming point-forecasting alternatives, and we show that the predictions are accurate, reliable and sharp. Besides, we study the relative importance of the independent variables involved, and show how the variables important for the median quantile differ from those important for the upper quantiles. Furthermore, we present a method to compute the probability of exceedance of thresholds, which is a simple and comprehensible way to present probabilistic forecasts that maximizes their usefulness. - Highlights: • A new probabilistic forecasting system is presented to predict NO2 concentrations. • While predicting the full distribution, it also outperforms other point-forecasting models. • Forecasts show good properties and peak concentrations are properly predicted. • It forecasts the probability of exceedance of thresholds, key to decision makers. • The relative forecasting importance of the variables is obtained as a by-product.
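Once a set of quantiles has been predicted, the probability of exceedance of a threshold can be read off by interpolating the predictive CDF between quantile levels — a plausible sketch of the kind of computation the abstract describes, not the authors' exact method. The quantile values below are hypothetical; 200 µg/m3 is the EU hourly limit value for NO2.

```python
# Probability of exceedance of a threshold from a set of predicted
# quantiles, by linear interpolation of the predictive CDF.
# The quantile values are hypothetical.

def prob_exceedance(levels, values, threshold):
    """Interpolate the CDF at `threshold` and return 1 - CDF (sorted inputs)."""
    if threshold <= values[0]:
        return 1.0 - levels[0]
    if threshold >= values[-1]:
        return 1.0 - levels[-1]
    for (l0, v0), (l1, v1) in zip(zip(levels, values),
                                  zip(levels[1:], values[1:])):
        if v0 <= threshold <= v1:
            cdf = l0 + (l1 - l0) * (threshold - v0) / (v1 - v0)
            return 1.0 - cdf
    raise ValueError("inputs must be sorted")

levels = [0.05, 0.25, 0.50, 0.75, 0.95]        # quantile levels
values = [80.0, 120.0, 150.0, 185.0, 240.0]    # forecast NO2, ug/m3
print(prob_exceedance(levels, values, 200.0))  # EU hourly limit value
```

Outside the outermost predicted quantiles the CDF is simply clamped here; a production system would instead model the tails explicitly, which is precisely why the study fits dedicated models for the upper quantiles.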

  7. New Aspects of Probabilistic Forecast Verification Using Information Theory

    Science.gov (United States)

    Tödter, Julian; Ahrens, Bodo

    2013-04-01

    This work deals with information-theoretical methods in probabilistic forecast verification, particularly concerning ensemble forecasts. Recent findings concerning the "Ignorance Score" are briefly reviewed, and then a consistent generalization to continuous forecasts is motivated. For ensemble-generated forecasts, the presented measures can be calculated exactly. The Brier Score (BS) and its generalizations to the multi-categorical Ranked Probability Score (RPS) and to the Continuous Ranked Probability Score (CRPS) are prominent verification measures for probabilistic forecasts. In particular, their decompositions into measures quantifying the reliability, resolution and uncertainty of the forecasts are attractive. Information theory sets up a natural framework for forecast verification. Recently, it has been shown that the BS is a second-order approximation of the information-based Ignorance Score (IGN), which also contains easily interpretable components and can likewise be generalized to a ranked version (RIGN). Here, the IGN, its generalizations and decompositions are systematically discussed in analogy to the variants of the BS. Additionally, a Continuous Ranked IGN (CRIGN) is introduced in analogy to the CRPS. The useful properties of the conceptually appealing CRIGN are illustrated, together with an algorithm to evaluate its reliability, resolution, and uncertainty components for ensemble-generated forecasts. This algorithm can also be used to calculate the decomposition of the more traditional CRPS exactly. The applicability of the "new" measures is demonstrated in a small evaluation study of ensemble-based precipitation forecasts.
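The decomposition structure discussed above can be illustrated with the classical Murphy decomposition of the Brier Score into reliability, resolution and uncertainty — the second-order counterpart of the IGN decomposition the paper develops. The sample forecast/outcome pairs are invented.

```python
# Murphy decomposition of the Brier score, BS ~ REL - RES + UNC, computed
# from forecast/outcome pairs grouped into probability bins. The identity
# is exact when forecasts within each bin are constant; with within-bin
# spread it is the usual binned approximation.

def brier_decomposition(forecasts, outcomes, n_bins=10):
    n = len(forecasts)
    base_rate = sum(outcomes) / n
    bins = [[] for _ in range(n_bins)]
    for f, o in zip(forecasts, outcomes):
        k = min(int(f * n_bins), n_bins - 1)
        bins[k].append((f, o))
    rel = res = 0.0
    for members in bins:
        if not members:
            continue
        nk = len(members)
        f_bar = sum(f for f, _ in members) / nk   # mean forecast in bin
        o_bar = sum(o for _, o in members) / nk   # observed frequency in bin
        rel += nk * (f_bar - o_bar) ** 2
        res += nk * (o_bar - base_rate) ** 2
    unc = base_rate * (1.0 - base_rate)
    return rel / n, res / n, unc

forecasts = [0.1, 0.8, 0.35, 0.9, 0.2, 0.65, 0.05, 0.75]
outcomes  = [0,   1,   0,    1,   0,   1,    0,    0  ]
rel, res, unc = brier_decomposition(forecasts, outcomes)
print(rel, res, unc, rel - res + unc)  # last value approximates the BS
```

The same bin-wise bookkeeping — replacing squared differences with logarithmic scores — is the route to the IGN components the paper evaluates exactly for ensemble-generated forecasts.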

  8. Supply reliability in context to quality regulation. Forecast model for the supply reliability; Versorgungszuverlaessigkeit im Kontext der Qualitaetsregulierung. Prognosemodell fuer die Versorgungszuverlaessigkeit

    Energy Technology Data Exchange (ETDEWEB)

    Quadflieg, Dieter [VDE, Berlin (Germany). FNN-Projektgruppe ' ' Einflussgroessen auf die Versorgungszuverlaessigkeit' '

    2011-11-14

    The Forum Network Technology / Network Operation in the VDE (FNN) has published a technical note on supply reliability in the context of quality regulation. In the underlying investigations, on the one hand, the variables influencing supply reliability are analyzed on the basis of the FNN fault and availability statistics; on the other hand, stochastic methods are developed that allow a prediction of the stochastic reliability characteristics of each network operator.

  9. Estimation technique of corrective effects for forecasting of reliability of the designed and operated objects of the generating systems

    Science.gov (United States)

    Truhanov, V. N.; Sultanov, M. M.

    2017-11-01

    In the present article, statistical material on the failures and malfunctions affecting the operability of heat and power installations has been investigated. A mathematical model of the change in the output characteristics of the turbine as a function of the number of failures revealed in service is presented. The mathematical model is based on methods of mathematical statistics, probability theory and matrix calculus. The novelty of this model is that it allows the change of the output characteristic over time to be predicted, with the corrective effects represented in explicit form. As the desired dynamics of change of the output characteristic (the reliability function), the Weibull distribution is adopted, since it is universal: at various parameter values it turns into other types of distributions (for example, exponential, normal, etc.). It should be noted that choosing the desired control law makes it possible to determine the necessary control parameters from the accumulated change of the output characteristic. The output characteristic can be adjusted both through the speed of change of the control parameters and through the acceleration of their change. The article sets out in detail a technique for evaluating the pseudo-inverse matrix by the method of least squares and standard Microsoft Excel functions. It also considers a technique for finding the corrective effects under constraints both on the output characteristic and on the control parameters. The order and sequence of finding the control parameters are stated, and a concrete example of finding the corrective effects in the course of long-term operation of turbines is shown.
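The least-squares machinery the abstract refers to (a pseudo-inverse applied to the design matrix, reproducible with standard spreadsheet functions) reduces, for a straight-line trend of the output characteristic against the failure count, to the closed form below. The data points are illustrative, not from the article.

```python
# Ordinary least squares for y = a + b*x, i.e. the action of the
# Moore-Penrose pseudo-inverse (X^T X)^-1 X^T on y, written in the
# closed form for a straight line. Data points are illustrative.

def fit_line(xs, ys):
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx           # slope
    a = my - b * mx         # intercept
    return a, b

failures = [0, 2, 4, 6, 8, 10]                  # cumulative failures revealed
output = [100.0, 98.9, 97.7, 96.8, 95.9, 94.6]  # turbine output characteristic
a, b = fit_line(failures, output)
print(round(a, 2), round(b, 3))  # intercept and degradation per failure
```

The fitted slope plays the role of the explicit corrective-effect coefficient: it quantifies how much the output characteristic degrades per revealed failure, which is the quantity the forecasting model then projects forward.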

  10. How reliable is the offline linkage of Weather Research & Forecasting Model (WRF) and Variable Infiltration Capacity (VIC) model?

    Science.gov (United States)

    The aim of this research is to evaluate the ability of the offline linkage of the Weather Research & Forecasting Model (WRF) and the Variable Infiltration Capacity (VIC) model to produce hydrological variables, e.g. evaporation (ET), soil moisture (SM), runoff, and baseflow. First, the VIC mo...

  11. Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR): Data manual. Part 2: Human error probability (HEP) data; Volume 5, Revision 4

    Energy Technology Data Exchange (ETDEWEB)

    Reece, W.J.; Gilbert, B.G.; Richards, R.E. [EG and G Idaho, Inc., Idaho Falls, ID (United States)

    1994-09-01

    This data manual contains a hard copy of the information in the Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR) Version 3.5 database, which is sponsored by the US Nuclear Regulatory Commission. NUCLARR was designed as a tool for risk analysis. Many of the nuclear reactors in the US and several outside the US are represented in the NUCLARR database. NUCLARR includes both human error probability estimates for workers at the plants and hardware failure data for nuclear reactor equipment. Aggregations of these data yield valuable reliability estimates for probabilistic risk assessments and human reliability analyses. The data manual is organized to permit manual searches of the information if the computerized version is not available. Originally, the manual was published in three parts. In this revision the introductory material located in the original Part 1 has been incorporated into the text of Parts 2 and 3. The user can now find introductory material either in the original Part 1, or in Parts 2 and 3 as revised. Part 2 contains the human error probability data, and Part 3, the hardware component reliability data.

  12. Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR): Data manual. Part 2: Human error probability (HEP) data; Volume 5, Revision 4

    International Nuclear Information System (INIS)

    Reece, W.J.; Gilbert, B.G.; Richards, R.E.

    1994-09-01

    This data manual contains a hard copy of the information in the Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR) Version 3.5 database, which is sponsored by the US Nuclear Regulatory Commission. NUCLARR was designed as a tool for risk analysis. Many of the nuclear reactors in the US and several outside the US are represented in the NUCLARR database. NUCLARR includes both human error probability estimates for workers at the plants and hardware failure data for nuclear reactor equipment. Aggregations of these data yield valuable reliability estimates for probabilistic risk assessments and human reliability analyses. The data manual is organized to permit manual searches of the information if the computerized version is not available. Originally, the manual was published in three parts. In this revision the introductory material located in the original Part 1 has been incorporated into the text of Parts 2 and 3. The user can now find introductory material either in the original Part 1, or in Parts 2 and 3 as revised. Part 2 contains the human error probability data, and Part 3, the hardware component reliability data

  14. Evaluation of statistical models for forecast errors from the HBV model

    Science.gov (United States)

    Engeland, Kolbjørn; Renard, Benjamin; Steinsland, Ingelin; Kolberg, Sjur

    2010-04-01

    Three statistical models for the forecast errors for inflow into the Langvatn reservoir in Northern Norway have been constructed and tested according to the agreement between (i) the forecast distribution and the observations and (ii) median values of the forecast distribution and the observations. For the first model observed and forecasted inflows were transformed by the Box-Cox transformation before a first-order auto-regressive model was constructed for the forecast errors. The parameters were conditioned on weather classes. In the second model the Normal Quantile Transformation (NQT) was applied to observed and forecasted inflows before a similar first-order auto-regressive model was constructed for the forecast errors. For the third model positive and negative errors were modeled separately. The errors were first NQT-transformed before conditioning the mean error values on climate, forecasted inflow and yesterday's error. To test the three models we applied three criteria: we wanted (a) the forecast distribution to be reliable; (b) the forecast intervals to be narrow; (c) the median values of the forecast distribution to be close to the observed values. Models 1 and 2 gave almost identical results. The median values improved the forecast, with the Nash-Sutcliffe R_eff increasing from 0.77 for the original forecast to 0.87 for the corrected forecasts. Models 1 and 2 over-estimated the forecast intervals but gave the narrowest intervals. Their main drawback was that the distributions are less reliable than Model 3. For Model 3 the median values did not fit well since the auto-correlation was not accounted for. Since Model 3 did not benefit from the potential variance reduction that lies in bias estimation and removal, it gave on average wider forecast intervals than the two other models. At the same time Model 3 on average slightly under-estimated the forecast intervals, probably explained by the use of average measures to evaluate the fit.
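
Models 1 and 2 share a common structure: transform the observed and forecasted inflows, then fit a first-order autoregressive model to the forecast errors. A minimal sketch of that structure (with an illustrative Box-Cox parameter and synthetic data, not the study's Langvatn series, and omitting the conditioning on weather classes):

```python
import numpy as np

def box_cox(x, lam):
    # Box-Cox transformation; lam = 0 gives the log transform
    return np.log(x) if lam == 0 else (x**lam - 1.0) / lam

def ar1_error_model(obs, fc, lam=0.3):
    """Fit an AR(1) model to Box-Cox-transformed forecast errors,
    sketching the structure of Models 1 and 2 (illustrative only)."""
    err = box_cox(obs, lam) - box_cox(fc, lam)
    e0, e1 = err[:-1], err[1:]
    phi = np.sum(e0 * e1) / np.sum(e0 * e0)   # AR(1) coefficient
    resid = e1 - phi * e0
    sigma = np.std(resid, ddof=1)             # innovation std dev
    return phi, sigma

def corrected_forecast(fc_next, err_today, phi, sigma, lam=0.3):
    # One-step-ahead corrected forecast: mean and spread in transformed space
    return box_cox(fc_next, lam) + phi * err_today, sigma

rng = np.random.default_rng(0)
obs = rng.gamma(5.0, 10.0, size=200)
fc = obs * rng.lognormal(0.0, 0.1, size=200)   # synthetic biased forecasts
phi, sigma = ar1_error_model(obs, fc)
print(phi, sigma)
```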

  15. Optimal Release Time and Sensitivity Analysis Using a New NHPP Software Reliability Model with Probability of Fault Removal Subject to Operating Environments

    Directory of Open Access Journals (Sweden)

    Kwang Yoon Song

    2018-05-01

    Full Text Available With the latest technological developments, the software industry is at the center of the fourth industrial revolution. In today’s complex and rapidly changing environment, where software applications must be developed quickly and easily, software must be focused on rapidly changing information technology. The basic goal of software engineering is to produce high-quality software at low cost. However, because of the complexity of software systems, software development can be time consuming and expensive. Software reliability models (SRMs are used to estimate and predict the reliability, number of remaining faults, failure intensity, total and development cost, etc., of software. Additionally, it is very important to decide when, how, and at what cost to release the software to users. In this study, we propose a new nonhomogeneous Poisson process (NHPP SRM with a fault detection rate function affected by the probability of fault removal on failure subject to operating environments and discuss the optimal release time and software reliability with the new NHPP SRM. The example results show a good fit to the proposed model, and we propose an optimal release time for a given change in the proposed model.
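
As a hedged illustration of how an NHPP software reliability model yields reliability predictions, the classic Goel-Okumoto mean value function can be sketched (this is not the paper's new model, which additionally incorporates fault-removal probability and operating-environment factors; the parameter values are assumed):

```python
import math

def m_go(t, a, b):
    # Goel-Okumoto NHPP mean value function: expected faults detected by time t
    return a * (1.0 - math.exp(-b * t))

def reliability(x, t, a, b):
    # Probability of no failure in (t, t+x] after testing up to time t
    return math.exp(-(m_go(t + x, a, b) - m_go(t, a, b)))

a, b = 100.0, 0.05   # total expected faults and detection rate (illustrative)
print(reliability(1.0, 30.0, a, b))
```

Reliability grows with release time t, while further testing adds cost; the optimal release time balances the two.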

  16. Statistical eruption forecast for the Chilean Southern Volcanic Zone: typical probabilities of volcanic eruptions as baseline for possibly enhanced activity following the large 2010 Concepción earthquake

    Directory of Open Access Journals (Sweden)

    Y. Dzierma

    2010-10-01

    Full Text Available A probabilistic eruption forecast is provided for ten volcanoes of the Chilean Southern Volcanic Zone (SVZ. Since 70% of the Chilean population lives in this area, the estimation of future eruption likelihood is an important part of hazard assessment. After investigating the completeness and stationarity of the historical eruption time series, the exponential, Weibull, and log-logistic distribution functions are fit to the repose time distributions for the individual volcanoes and the models are evaluated. This procedure has been implemented in two different ways to methodologically compare details in the fitting process. With regard to the probability of at least one VEI ≥ 2 eruption in the next decade, Llaima, Villarrica and Nevados de Chillán are most likely to erupt, while Osorno shows the lowest eruption probability among the volcanoes analysed. In addition to giving a compilation of the statistical eruption forecasts along the historically most active volcanoes of the SVZ, this paper aims to give "typical" eruption probabilities, which may in the future permit to distinguish possibly enhanced activity in the aftermath of the large 2010 Concepción earthquake.
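
For the exponential repose-time model (one of the three distributions fitted in the paper), the probability of at least one eruption in the next decade follows directly from the fitted rate. A sketch with hypothetical repose times, not the SVZ eruption catalogues:

```python
import math

def p_eruption(repose_years, horizon=10.0):
    """Probability of at least one eruption within `horizon` years under an
    exponential (memoryless) repose-time model; the rate is the maximum
    likelihood estimate 1 / (mean repose time)."""
    lam = len(repose_years) / sum(repose_years)
    return 1.0 - math.exp(-lam * horizon)

# Hypothetical repose intervals (years between VEI >= 2 eruptions)
reposes = [4.0, 7.0, 3.0, 11.0, 5.0]
print(round(p_eruption(reposes), 3))
```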

  17. Generalization of information-based concepts in forecast verification

    Science.gov (United States)

    Tödter, J.; Ahrens, B.

    2012-04-01

    This work deals with information-theoretical methods in probabilistic forecast verification. Recent findings concerning the Ignorance Score are briefly reviewed, then the generalization to continuous forecasts is shown. For ensemble forecasts, the presented measures can be calculated exactly. The Brier Score (BS) and its generalizations to the multi-categorical Ranked Probability Score (RPS) and to the Continuous Ranked Probability Score (CRPS) are the prominent verification measures for probabilistic forecasts. Particularly, their decompositions into measures quantifying the reliability, resolution and uncertainty of the forecasts are attractive. Information theory sets up the natural framework for forecast verification. Recently, it has been shown that the BS is a second-order approximation of the information-based Ignorance Score (IGN), which also contains easily interpretable components and can also be generalized to a ranked version (RIGN). Here, the IGN, its generalizations and decompositions are systematically discussed in analogy to the variants of the BS. Additionally, a Continuous Ranked IGN (CRIGN) is introduced in analogy to the CRPS. The applicability and usefulness of the conceptually appealing CRIGN is illustrated, together with an algorithm to evaluate its components of reliability, resolution, and uncertainty for ensemble-generated forecasts. This is also directly applicable to the more traditional CRPS.
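
The BS and IGN compared above can be computed for binary probability forecasts in a few lines; the numbers here are illustrative, not the paper's data:

```python
import numpy as np

def brier(p, o):
    # Brier Score: mean squared error of forecast probabilities
    return np.mean((p - o) ** 2)

def ignorance(p, o, eps=1e-12):
    # Ignorance Score: mean negative log2-likelihood of the observed outcome
    p = np.clip(p, eps, 1 - eps)
    return -np.mean(o * np.log2(p) + (1 - o) * np.log2(1 - p))

p = np.array([0.9, 0.1, 0.7, 0.3])   # forecast probabilities (illustrative)
o = np.array([1.0, 0.0, 1.0, 0.0])   # observed binary outcomes
print(brier(p, o), ignorance(p, o))
```

Both scores are negatively oriented (lower is better); the IGN penalizes overconfident wrong forecasts much more heavily than the BS.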

  18. Reliability Engineering

    International Nuclear Information System (INIS)

    Lee, Sang Yong

    1992-07-01

    This book is about reliability engineering. It describes the definition and importance of reliability; the development of reliability engineering; the failure rate and failure probability density function and their types; CFR and the exponential distribution; IFR and the normal and Weibull distributions; maintainability and availability; reliability testing and reliability estimation for the exponential, normal and Weibull distribution types; reliability sampling tests; system reliability; design for reliability; and functional failure analysis by FTA.
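
As a small illustration of the distribution types listed above, the Weibull reliability and failure-rate functions can be written directly; the shape parameter beta = 1 recovers the CFR/exponential case (parameter values here are assumptions for the example):

```python
import math

def weibull_reliability(t, beta, eta):
    # Weibull reliability (survival) function: R(t) = exp(-(t/eta)^beta)
    return math.exp(-((t / eta) ** beta))

def weibull_hazard(t, beta, eta):
    # Failure (hazard) rate: constant for beta = 1 (CFR), increasing for beta > 1 (IFR)
    return (beta / eta) * (t / eta) ** (beta - 1)

# beta = 1, eta = 200 h reduces to the exponential case with rate 1/200
print(weibull_reliability(100.0, 1.0, 200.0))
```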

  19. Probability theory

    CERN Document Server

    Dorogovtsev, A Ya; Skorokhod, A V; Silvestrov, D S; Skorokhod, A V

    1997-01-01

    This book of problems is intended for students in pure and applied mathematics. There are problems in traditional areas of probability theory and problems in the theory of stochastic processes, which has wide applications in the theory of automatic control, queuing and reliability theories, and in many other modern science and engineering fields. Answers to most of the problems are given, and the book provides hints and solutions for more complicated problems.

  20. Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR): Guide to data processing and revision: Part 2, Human error probability data entry and revision procedures

    International Nuclear Information System (INIS)

    Gilmore, W.E.; Gertman, D.I.; Gilbert, B.G.; Reece, W.J.

    1988-11-01

    The Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR) is an automated data base management system for processing and storing human error probability (HEP) and hardware component failure data (HCFD). The NUCLARR system software resides on an IBM (or compatible) personal micro-computer. Users can perform data base searches to furnish HEP estimates and HCFD rates. In this manner, the NUCLARR system can be used to support a variety of risk assessment activities. This volume, Volume 3 of a 5-volume series, presents the procedures used to process HEP and HCFD for entry in NUCLARR and describes how to modify the existing NUCLARR taxonomy in order to add either equipment types or action verbs. Volume 3 also specifies the various roles of the administrative staff on assignment to the NUCLARR Clearinghouse who are tasked with maintaining the data base, dealing with user requests, and processing NUCLARR data. 5 refs., 34 figs., 3 tabs

  1. Case studies of extended model-based flood forecasting: prediction of dike strength and flood impacts

    Science.gov (United States)

    Stuparu, Dana; Bachmann, Daniel; Bogaard, Tom; Twigt, Daniel; Verkade, Jan; de Bruijn, Karin; de Leeuw, Annemargreet

    2017-04-01

    Flood forecasts, warning and emergency response are important components in flood risk management. Most flood forecasting systems use models to translate weather predictions to forecasted discharges or water levels. However, this information is often not sufficient for real-time decisions. A sound understanding of the reliability of embankments and flood dynamics is needed to react in time and reduce the negative effects of the flood. Where are the weak points in the dike system? When, how much and where will the water flow? When and where is the greatest impact expected? Model-based flood impact forecasting tries to answer these questions by adding new dimensions to existing forecasting systems, providing forecasted information about: (a) the dike strength during the event (reliability), (b) the flood extent in case of an overflow or a dike failure (flood spread) and (c) the assets at risk (impacts). This work presents three case studies in which such a set-up is applied. Special features are highlighted. Forecasting of dike strength. The first case study focuses on the forecast of dike strength in the Netherlands for the river Rhine branches Waal, Nederrijn and IJssel. A so-called reliability transformation is used to translate the predicted water levels at selected dike sections into failure probabilities during a flood event. The reliability of a dike section is defined by fragility curves - a summary of the dike strength conditional on the water level. The reliability information enhances the emergency management and inspections of embankments. Ensemble forecasting. The second case study shows the setup of a flood impact forecasting system in Dumfries, Scotland. The existing forecasting system is extended with a 2D flood spreading model in combination with the Delft-FIAT impact model. Ensemble forecasts are used to account for the uncertainty in the precipitation forecasts, which is useful to quantify the certainty of a forecasted flood event.
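
The fragility-curve approach used for the Dutch dike-strength forecast amounts to mapping a forecasted water level onto a conditional failure probability. A minimal sketch by linear interpolation; the curve values below are hypothetical, not the study's data:

```python
import numpy as np

# Hypothetical fragility curve for one dike section: conditional failure
# probability as a function of water level (m above datum)
levels = np.array([3.0, 3.5, 4.0, 4.5, 5.0])
p_fail = np.array([0.001, 0.01, 0.08, 0.35, 0.80])

def failure_probability(forecast_level):
    """Translate a forecasted water level into a failure probability by
    interpolating the fragility curve (a sketch of the 'reliability
    transformation', not the study's implementation)."""
    return float(np.interp(forecast_level, levels, p_fail))

print(failure_probability(4.25))   # midway between the 4.0 and 4.5 values
```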

  2. Establishment of turbidity forecasting model and early-warning system for source water turbidity management using back-propagation artificial neural network algorithm and probability analysis.

    Science.gov (United States)

    Yang, Tsung-Ming; Fan, Shu-Kai; Fan, Chihhao; Hsu, Nien-Sheng

    2014-08-01

    The purpose of this study is to establish a turbidity forecasting model as well as an early-warning system for turbidity management using rainfall records as the input variables. The Taipei Water Source Domain was employed as the study area, and ANOVA analysis showed that the accumulative rainfall records of 1-day Ping-lin, 2-day Ping-lin, 2-day Fei-tsui, 2-day Shi-san-gu, 2-day Tai-pin and 2-day Tong-hou were the six most significant parameters for downstream turbidity development. The artificial neural network model was developed and proven capable of predicting the turbidity concentration in the investigated catchment downstream area. The observed and model-calculated turbidity data were applied to developing the turbidity early-warning system. Using a previously determined turbidity as the threshold, the rainfall criterion, above which the downstream turbidity would possibly exceed this respective threshold turbidity, for the investigated rain gauge stations was determined. An exemplary illustration demonstrated the effectiveness of the proposed turbidity early-warning system as a precautionary alarm of possible significant increase of downstream turbidity. This study is the first report of the establishment of the turbidity early-warning system. Hopefully, this system can be applied to source water turbidity forecasting during storm events and provide a useful reference for subsequent adjustment of drinking water treatment operation.

  3. Reliability considerations of NDT by probability of detection (POD). Determination using ultrasound phased array. Results from a project in frame of the German nuclear safety research program

    International Nuclear Information System (INIS)

    Kurz, Jochen H.; Dugan, Sandra; Juengert, Anne

    2013-01-01

    Reliable assessment procedures are an important aspect of maintenance concepts. Non-destructive testing (NDT) methods are an essential part of a variety of maintenance plans. Fracture mechanical assessments require knowledge of flaw dimensions, loads and material parameters. NDT methods are able to acquire information in all of these areas. However, it has to be considered that the level of detail of this information depends on the case investigated and therefore on the applicable methods. Reliability aspects of NDT methods are of importance if quantitative information is required. Different design concepts, e.g. the damage tolerance approach in aerospace, already include reliability criteria for the NDT methods applied in maintenance plans. NDT is also an essential part of the construction and maintenance of nuclear power plants. In Germany, the type and extent of inspection are specified in Safety Standards of the Nuclear Safety Standards Commission (KTA). Only certified inspections are allowed in the nuclear industry. The qualification of NDT is carried out in the form of performance demonstrations of the inspection teams and the equipment, witnessed by an authorized inspector. The results of these tests are mainly statements regarding the detection capabilities of certain artificial flaws. In other countries, e.g. the U.S., additional blind tests on test blocks with hidden and unknown flaws may be required, in which a certain percentage of these flaws has to be detected. Knowledge of the probability of detection (POD) curves of specific flaws under specific testing conditions is often not available. This paper shows the results of a research project designed for POD determination of ultrasound phased array inspections of real and artificial cracks. A further objective of this project was to generate quantitative POD results. The distribution of the crack sizes of the specimens and the inspection planning is discussed, and results of the ultrasound inspections are presented.

  4. Verification of space weather forecasts at the UK Met Office

    Science.gov (United States)

    Bingham, S.; Sharpe, M.; Jackson, D.; Murray, S.

    2017-12-01

    The UK Met Office Space Weather Operations Centre (MOSWOC) has produced space weather guidance twice a day since its official opening in 2014. Guidance includes 4-day probabilistic forecasts of X-ray flares, geomagnetic storms, high-energy electron events and high-energy proton events. Evaluation of such forecasts is important to forecasters, stakeholders, model developers and users to understand the performance of these forecasts and also strengths and weaknesses to enable further development. Met Office terrestrial near real-time verification systems have been adapted to provide verification of X-ray flare and geomagnetic storm forecasts. Verification is updated daily to produce Relative Operating Characteristic (ROC) curves and Reliability diagrams, and rolling Ranked Probability Skill Scores (RPSSs) thus providing understanding of forecast performance and skill. Results suggest that the MOSWOC issued X-ray flare forecasts are usually not statistically significantly better than a benchmark climatological forecast (where the climatology is based on observations from the previous few months). By contrast, the issued geomagnetic storm activity forecast typically performs better against this climatological benchmark.
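
The rolling Ranked Probability Skill Score used in this verification can be sketched for a single multi-category forecast against a climatological benchmark; the three activity categories and probabilities below are illustrative, not MOSWOC data:

```python
import numpy as np

def rps(probs, outcome):
    """Ranked Probability Score for one multi-category forecast:
    squared error of cumulative forecast vs. cumulative outcome."""
    return float(np.sum((np.cumsum(probs) - np.cumsum(outcome)) ** 2))

def rpss(rps_forecast, rps_reference):
    # Skill relative to a reference (e.g. climatology): 1 perfect, <= 0 no skill
    return 1.0 - rps_forecast / rps_reference

# Three geomagnetic activity categories (quiet / active / storm), illustrative
fc = np.array([0.6, 0.3, 0.1])     # issued forecast
clim = np.array([0.7, 0.2, 0.1])   # climatological benchmark
obs = np.array([0.0, 1.0, 0.0])    # 'active' occurred
print(rpss(rps(fc, obs), rps(clim, obs)))
```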

  5. On density forecast evaluation

    NARCIS (Netherlands)

    Diks, C.

    2008-01-01

    Traditionally, probability integral transforms (PITs) have been popular means for evaluating density forecasts. For an ideal density forecast, the PITs should be uniformly distributed on the unit interval and independent. However, this is only a necessary condition, and not a sufficient one, as
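
For an ensemble forecast, a PIT is simply the empirical CDF of the ensemble evaluated at the observation; for an ideal forecast the PITs are approximately uniform on the unit interval. A minimal sketch with synthetic data:

```python
import numpy as np

def pit_values(obs, ens):
    """Probability integral transforms: the fraction of ensemble members at
    or below each observation, one value in [0, 1] per forecast case."""
    return np.array([np.mean(e <= o) for o, e in zip(obs, ens)])

rng = np.random.default_rng(1)
obs = rng.normal(size=500)
ens = rng.normal(size=(500, 50))   # ideal forecasts: same distribution as obs
pits = pit_values(obs, ens)
# For an ideal forecast the PIT histogram is flat and the mean is near 0.5
print(pits.mean())
```

As the abstract notes, uniformity and independence of the PITs are necessary but not sufficient for an ideal forecast, so a flat histogram alone does not prove the forecast correct.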

  6. Flood forecasting and uncertainty of precipitation forecasts

    International Nuclear Information System (INIS)

    Kobold, Mira; Suselj, Kay

    2004-01-01

    Timely and accurate flood forecasting is essential for reliable flood warning. The effectiveness of flood warning depends on the forecast accuracy of certain physical parameters, such as the peak magnitude of the flood, its timing, location and duration. Conceptual rainfall-runoff models enable the estimation of these parameters and lead to useful operational forecasts. Accurate rainfall is the most important input into hydrological models. The rainfall input can be real-time rain-gauge data, weather radar data, or meteorologically forecasted precipitation. A torrential nature of streams and fast runoff are characteristic of most Slovenian rivers. Extensive damage is caused almost every year by rainstorms affecting different regions of Slovenia. The lag time between rainfall and runoff is very short for Slovenian territory, and on-line data are used only for nowcasting. Forecasted precipitation is necessary for hydrological forecasts some days ahead. ECMWF (European Centre for Medium-Range Weather Forecasts) gives a general forecast for several days ahead, while more detailed precipitation data from the limited-area ALADIN/SI model are available for two days ahead. There is a certain degree of uncertainty in using such precipitation forecasts based on meteorological models. The variability of precipitation is very high in Slovenia, and the uncertainty of ECMWF predicted precipitation is very large for Slovenian territory. The ECMWF model can predict precipitation events correctly, but underestimates the amount of precipitation in general. The average underestimation is about 60% for the Slovenian region. The predictions of the limited-area ALADIN/SI model up to 48 hours ahead show greater applicability in hydrological forecasting. The hydrological models are sensitive to precipitation input. The deviation of runoff is much bigger than the rainfall deviation; the runoff-to-rainfall error fraction is about 1.6.

  7. A complex study on the reliability assessment of the containment of a PWR. Part I - Magnitude and probability of internal load behavior

    International Nuclear Information System (INIS)

    Augustin, W.; Kafka, P.

    1977-01-01

    For evaluation of the reliability of the safety enclosure in the case of accidents, the time-dependent loads by internal pressure and temperature on the spherical steel containment and the corresponding probabilities had to be calculated. Of the spectrum of possible accidents, e.g. a large LOCA, which leads to a maximum pressure of approximately 4.7 bar with all safety systems presumed working, a small LOCA, or rupture of a primary steam pipe, only those have been selected which result in a considerable increase of internal pressure in the safety containment. The pressure buildup in the steel containment depends roughly on the radioactive decay energy produced in the containment, on the performance of the safety systems operative after the accident, and on the energy absorbed and transferred by the structural parts of the containment. For simplification, the analysis of system behavior was performed in separate steps. The analysis started with the evaluation of alternative possibilities of pressure buildup depending on the function of different safety systems. Then the time-dependent changes of temperature and pressure in the containment were calculated, as well as the probabilities of the occurrence of the different maximum pressures. Technical data and accident event sequences describing the system analysed were taken from the PWR Biblis B, which at this time is typical of the PWR line of construction in the FRG. In order to avoid event sequences leading to complicated physical phenomena, such sequences were selected which allowed a well-defined description of consequences such as hydrogen production by reaction of water with the Zircaloy fuel cladding, or pressure buildup by CO2 or steam generated from concrete in contact with the core melt. The computer code ZOCO VI was used to calculate the pressure buildup for the different event sequences. This code calculates the time dependence of pressure and temperature in a multiply segmented safety containment considering accumulation and

  8. Ecological forecasts: An emerging imperative

    Science.gov (United States)

    James S. Clark; Steven R. Carpenter; Mary Barber; Scott Collins; Andy Dobson; Jonathan A. Foley; David M. Lodge; Mercedes Pascual; Roger Pielke; William Pizer; Cathy Pringle; Walter V. Reid; Kenneth A. Rose; Osvaldo Sala; William H. Schlesinger; Diana H. Wall; David Wear

    2001-01-01

    Planning and decision-making can be improved by access to reliable forecasts of ecosystem state, ecosystem services, and natural capital. Availability of new data sets, together with progress in computation and statistics, will increase our ability to forecast ecosystem change. An agenda that would lead toward a capacity to produce, evaluate, and communicate forecasts...

  9. Space Weather Forecasting at IZMIRAN

    Science.gov (United States)

    Gaidash, S. P.; Belov, A. V.; Abunina, M. A.; Abunin, A. A.

    2017-12-01

    Since 1998, the Institute of Terrestrial Magnetism, Ionosphere, and Radio Wave Propagation (IZMIRAN) has had an operating heliogeophysical service—the Center for Space Weather Forecasts. This center transfers the results of basic research in solar-terrestrial physics into daily forecasting of various space weather parameters for various lead times. The forecasts are promptly available to interested consumers. This article describes the center and the main types of forecasts it provides: solar and geomagnetic activity, magnetospheric electron fluxes, and probabilities of proton increases. The challenges associated with the forecasting of effects of coronal mass ejections and coronal holes are discussed. Verification data are provided for the center's forecasts.

  10. Medium Range Forecasts Representation (and Long Range Forecasts?)

    Science.gov (United States)

    Vincendon, J.-C.

    2009-09-01

    The progress of numerical forecasts encourages interest in more and more distant ranges, and we thus supply more and more forecasts several days ahead. Nevertheless, precautions are necessary to provide the most reliable and relevant information possible. Whether delivered in a TV bulletin or on any other support (Internet, mobile phone), the interpretation and representation of a medium-range forecast (5 - 15 days) must differ from those of a short-range forecast. Indeed, the predictability of a meteorological phenomenon decreases gradually with the range, and it decreases all the more quickly when the phenomenon is of small scale. So, beyond a few days, the probabilistic character of a forecast becomes largely dominant. That is why, at Meteo-France, the forecasts for D+4 to D+7 have been accompanied by a confidence index for around ten years. It is a figure between 1 and 5: the closer to 5, the greater the confidence in the supplied forecast. In practice, one index is supplied for the period D+4 / D+5 and another for the period D+6 / D+7, each day being able to benefit from a different forecast, that is, to be represented independently. We thus supply a global tendency over 24 hours, with less and less precise symbols as the range extends. Concrete examples will be presented. For two years now, we have also published forecasts for D+8 / D+9, accompanied by an indication of confidence ("good reliability" or "to confirm"). These two days are grouped together on a single map because, at this range, the described tendency is relevant over a duration of about 48 hours and at a spatial scale slightly above the synoptic scale. So, we avoid producing more than two zones of weather types over France and content ourselves with giving an evolution for the temperatures (steady, rising or falling). Newspapers have begun to publish this information, and it should soon be the case for television. It is particularly

  11. Online probabilistic learning with an ensemble of forecasts

    Science.gov (United States)

    Thorey, Jean; Mallet, Vivien; Chaussin, Christophe

    2016-04-01

    Our objective is to produce a calibrated weighted ensemble to forecast a univariate time series. In addition to a meteorological ensemble of forecasts, we rely on observations or analyses of the target variable. The celebrated Continuous Ranked Probability Score (CRPS) is used to evaluate the probabilistic forecasts. However applying the CRPS on weighted empirical distribution functions (deriving from the weighted ensemble) may introduce a bias because of which minimizing the CRPS does not produce the optimal weights. Thus we propose an unbiased version of the CRPS which relies on clusters of members and is strictly proper. We adapt online learning methods for the minimization of the CRPS. These methods generate the weights associated to the members in the forecasted empirical distribution function. The weights are updated before each forecast step using only past observations and forecasts. Our learning algorithms provide the theoretical guarantee that, in the long run, the CRPS of the weighted forecasts is at least as good as the CRPS of any weighted ensemble with weights constant in time. In particular, the performance of our forecast is better than that of any subset ensemble with uniform weights. A noteworthy advantage of our algorithm is that it does not require any assumption on the distributions of the observations and forecasts, both for the application and for the theoretical guarantee to hold. As application example on meteorological forecasts for photovoltaic production integration, we show that our algorithm generates a calibrated probabilistic forecast, with significant performance improvements on probabilistic diagnostic tools (the CRPS, the reliability diagram and the rank histogram).
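
The CRPS of a weighted ensemble discussed above can be estimated with the standard kernel form for a discrete distribution. This sketch is the plain estimator whose bias motivates the authors' unbiased, cluster-based variant (not reproduced here); member values and weights are illustrative:

```python
import numpy as np

def crps_weighted(members, weights, y):
    """CRPS of a weighted ensemble treated as a discrete distribution:
    sum_i w_i |x_i - y| - 0.5 * sum_ij w_i w_j |x_i - x_j|."""
    x = np.asarray(members, dtype=float)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    term1 = np.sum(w * np.abs(x - y))
    term2 = 0.5 * np.sum(w[:, None] * w[None, :] * np.abs(x[:, None] - x[None, :]))
    return float(term1 - term2)

members = [1.0, 2.0, 4.0]   # ensemble member values (illustrative)
weights = [0.2, 0.5, 0.3]   # learned weights (illustrative)
print(crps_weighted(members, weights, 2.5))
```

Lower CRPS is better; an online learner updates the weights after each observation to drive this score down over time.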

  12. reliability reliability

    African Journals Online (AJOL)

    eobe

    Corresponding author, Tel: +234-703. RELIABILITY .... V , , given by the code of practice. However, checks must .... an optimization procedure over the failure domain F corresponding .... of Concrete Members based on Utility Theory,. Technical ...

  13. A Bayesian modelling method for post-processing daily sub-seasonal to seasonal rainfall forecasts from global climate models and evaluation for 12 Australian catchments

    Directory of Open Access Journals (Sweden)

    A. Schepen

    2018-03-01

    Full Text Available Rainfall forecasts are an integral part of hydrological forecasting systems at sub-seasonal to seasonal timescales. In seasonal forecasting, global climate models (GCMs are now the go-to source for rainfall forecasts. For hydrological applications however, GCM forecasts are often biased and unreliable in uncertainty spread, and calibration is therefore required before use. There are sophisticated statistical techniques for calibrating monthly and seasonal aggregations of the forecasts. However, calibration of seasonal forecasts at the daily time step typically uses very simple statistical methods or climate analogue methods. These methods generally lack the sophistication to achieve unbiased, reliable and coherent forecasts of daily amounts and seasonal accumulated totals. In this study, we propose and evaluate a Rainfall Post-Processing method for Seasonal forecasts (RPP-S, which is based on the Bayesian joint probability modelling approach for calibrating daily forecasts and the Schaake Shuffle for connecting the daily ensemble members of different lead times. We apply the method to post-process ACCESS-S forecasts for 12 perennial and ephemeral catchments across Australia and for 12 initialisation dates. RPP-S significantly reduces bias in raw forecasts and improves both skill and reliability. RPP-S forecasts are also more skilful and reliable than forecasts derived from ACCESS-S forecasts that have been post-processed using quantile mapping, especially for monthly and seasonal accumulations. Several opportunities to improve the robustness and skill of RPP-S are identified. The new RPP-S post-processed forecasts will be used in ensemble sub-seasonal to seasonal streamflow applications.
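
The Schaake Shuffle step of RPP-S reorders the calibrated ensemble members at each lead time so that they follow the rank structure of observations from matched historical dates, restoring realistic temporal dependence across lead times. A minimal sketch with synthetic data (not the ACCESS-S forecasts):

```python
import numpy as np

def schaake_shuffle(ensemble, historical):
    """Reorder each row (lead time) of `ensemble` so that member ranks match
    the ranks of `historical` observations on matched dates; both arrays are
    (lead times x members)."""
    ens_sorted = np.sort(ensemble, axis=1)
    ranks = np.argsort(np.argsort(historical, axis=1), axis=1)
    return np.take_along_axis(ens_sorted, ranks, axis=1)

rng = np.random.default_rng(2)
ensemble = rng.gamma(2.0, 5.0, size=(3, 10))     # 3 lead times, 10 members
historical = rng.gamma(2.0, 5.0, size=(3, 10))   # matched historical dates
shuffled = schaake_shuffle(ensemble, historical)
print(shuffled.shape)
```

The marginal distribution at each lead time is unchanged (the same values appear, only reordered), while the rank correlation between lead times now mirrors the historical series.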

  14. A Bayesian modelling method for post-processing daily sub-seasonal to seasonal rainfall forecasts from global climate models and evaluation for 12 Australian catchments

    Science.gov (United States)

    Schepen, Andrew; Zhao, Tongtiegang; Wang, Quan J.; Robertson, David E.

    2018-03-01

    Rainfall forecasts are an integral part of hydrological forecasting systems at sub-seasonal to seasonal timescales. In seasonal forecasting, global climate models (GCMs) are now the go-to source for rainfall forecasts. For hydrological applications however, GCM forecasts are often biased and unreliable in uncertainty spread, and calibration is therefore required before use. There are sophisticated statistical techniques for calibrating monthly and seasonal aggregations of the forecasts. However, calibration of seasonal forecasts at the daily time step typically uses very simple statistical methods or climate analogue methods. These methods generally lack the sophistication to achieve unbiased, reliable and coherent forecasts of daily amounts and seasonal accumulated totals. In this study, we propose and evaluate a Rainfall Post-Processing method for Seasonal forecasts (RPP-S), which is based on the Bayesian joint probability modelling approach for calibrating daily forecasts and the Schaake Shuffle for connecting the daily ensemble members of different lead times. We apply the method to post-process ACCESS-S forecasts for 12 perennial and ephemeral catchments across Australia and for 12 initialisation dates. RPP-S significantly reduces bias in raw forecasts and improves both skill and reliability. RPP-S forecasts are also more skilful and reliable than forecasts derived from ACCESS-S forecasts that have been post-processed using quantile mapping, especially for monthly and seasonal accumulations. Several opportunities to improve the robustness and skill of RPP-S are identified. The new RPP-S post-processed forecasts will be used in ensemble sub-seasonal to seasonal streamflow applications.

  15. Adaptively smoothed seismicity earthquake forecasts for Italy

    Directory of Open Access Journals (Sweden)

    Yan Y. Kagan

    2010-11-01

    Full Text Available We present a model for estimation of the probabilities of future earthquakes of magnitudes m ≥ 4.95 in Italy. This model is a modified version of that proposed for California, USA, by Helmstetter et al. [2007] and Werner et al. [2010a], and it approximates seismicity using a spatially heterogeneous, temporally homogeneous Poisson point process. The temporal, spatial and magnitude dimensions are entirely decoupled. Magnitudes are independently and identically distributed according to a tapered Gutenberg-Richter magnitude distribution. We have estimated the spatial distribution of future seismicity by smoothing the locations of past earthquakes listed in two Italian catalogs: a short instrumental catalog, and a longer instrumental and historic catalog. The bandwidth of the adaptive spatial kernel is estimated by optimizing the predictive power of the kernel estimate of the spatial earthquake density in retrospective forecasts. When available and reliable, we used small earthquakes of m ≥ 2.95 to reveal active fault structures and 29 probable future epicenters. By calibrating the model with these two catalogs of different durations to create two forecasts, we intend to quantify the loss (or gain of predictability incurred when only a short, but recent, data record is available. Both forecasts were scaled to five and ten years, and have been submitted to the Italian prospective forecasting experiment of the global Collaboratory for the Study of Earthquake Predictability (CSEP). An earlier forecast from the model was submitted by Helmstetter et al. [2007] to the Regional Earthquake Likelihood Model (RELM) experiment in California, and with more than half of the five-year experimental period over, the forecast has performed better than the others.
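The adaptive kernel smoothing described above can be illustrated in a few lines: each past epicenter contributes a Gaussian kernel whose bandwidth is its distance to its k-th nearest neighbour, so smoothing is tight inside clusters and broad in sparse regions. The coordinates and the choice k = 2 are illustrative assumptions, not the study's calibrated values:

```python
import math

def adaptive_kernel_rate(epicenters, x, y, k=2):
    """Smoothed seismicity density at (x, y) from past epicenters.
    Each epicenter's bandwidth is its distance to its k-th nearest
    other epicenter (requires at least k+1 epicenters)."""
    density = 0.0
    for (ex, ey) in epicenters:
        dists = sorted(math.hypot(ex - ox, ey - oy)
                       for (ox, oy) in epicenters if (ox, oy) != (ex, ey))
        h = max(dists[k - 1], 1e-6)  # adaptive bandwidth
        r2 = ((x - ex) ** 2 + (y - ey) ** 2) / (2.0 * h * h)
        density += math.exp(-r2) / (2.0 * math.pi * h * h)
    return density / len(epicenters)
```

Evaluated on a grid and multiplied by an overall rate and a magnitude distribution, such a density yields the space-magnitude forecast the abstract describes.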

  16. Day-Ahead Probabilistic Model for Scheduling the Operation of a Wind Pumped-Storage Hybrid Power Station: Overcoming Forecasting Errors to Ensure Reliability of Supply to the Grid

    Directory of Open Access Journals (Sweden)

    Jakub Jurasz

    2018-06-01

    Full Text Available Variable renewable energy sources (VRES), such as solar photovoltaic (PV) and wind turbines (WT), are starting to play a significant role in several energy systems around the globe. To overcome the problem of their non-dispatchable and stochastic nature, several approaches have been proposed so far. This paper describes a novel mathematical model for scheduling the operation of a wind-powered pumped-storage hydroelectricity (PSH) hybrid for 25 to 48 h ahead. The model is based on mathematical programming and wind speed forecasts for the next 1 to 24 h, along with predicted upper reservoir occupancy for the 24th hour ahead. The results indicate that by coupling a 2-MW conventional wind turbine with a PSH of energy storage capacity equal to 54 MWh it is possible to significantly reduce the intraday coefficient of variation of energy generation, from 31% for a pure wind turbine to 1.15% for a wind-powered PSH. The scheduling errors calculated based on mean absolute percentage error (MAPE) are significantly smaller for such a coupling than those seen for wind generation forecasts, at 2.39% and 27%, respectively. This is emphasized even more strongly by the fact that the errors for wind generation were calculated for forecasts made for the next 1 to 24 h, while those for scheduled generation were calculated for forecasts made for the next 25 to 48 h. The results clearly show that the proposed scheduling approach ensures the high reliability of the WT–PSH energy source.

  17. Problems of Forecast

    OpenAIRE

    Kucharavy , Dmitry; De Guio , Roland

    2005-01-01

    International audience; The ability to foresee future technology is a key task of Innovative Design. The paper focuses on the obstacles to reliable prediction of technological evolution for the purpose of Innovative Design. First, a brief analysis of problems for existing forecasting methods is presented. The causes for the complexity of technology prediction are discussed in the context of reduction of the forecast errors. Second, using a contradiction analysis, a set of problems related to ...

  18. Spatial electric load forecasting

    CERN Document Server

    Willis, H Lee

    2002-01-01

    Spatial Electric Load Forecasting. Contents: Consumer Demand for Power and Reliability; Coincidence and Load Behavior; Load Curve and End-Use Modeling; Weather and Electric Load; Weather Design Criteria and Forecast Normalization; Spatial Load Growth Behavior; Spatial Forecast Accuracy and Error Measures; Trending Methods; Simulation Method: Basic Concepts; A Detailed Look at the Simulation Method; Basics of Computerized Simulation; Analytical Building Blocks for Spatial Simulation; Advanced Elements of Computerized Simulation; Hybrid Trending-Simulation Methods; Advanced

  19. Forecast Combinations

    OpenAIRE

    Timmermann, Allan G

    2005-01-01

    Forecast combinations have frequently been found in empirical studies to produce better forecasts on average than methods based on the ex-ante best individual forecasting model. Moreover, simple combinations that ignore correlations between forecast errors often dominate more refined combination schemes aimed at estimating the theoretically optimal combination weights. In this paper we analyse theoretically the factors that determine the advantages from combining forecasts (for example, the d...

  20. Forecast combinations

    OpenAIRE

    Aiolfi, Marco; Capistrán, Carlos; Timmermann, Allan

    2010-01-01

    We consider combinations of subjective survey forecasts and model-based forecasts from linear and non-linear univariate specifications as well as multivariate factor-augmented models. Empirical results suggest that a simple equal-weighted average of survey forecasts outperforms the best model-based forecasts for a majority of macroeconomic variables and forecast horizons. Additional improvements can in some cases be gained by using a simple equal-weighted average of survey and model-based fore...

  1. Incorporating forecast uncertainties into EENS for wind turbine studies

    Energy Technology Data Exchange (ETDEWEB)

    Toh, G.K.; Gooi, H.B. [School of EEE, Nanyang Technological University, Singapore 639798 (Singapore)

    2011-02-15

    The rapid increase in wind power generation around the world has stimulated the development of applicable technologies to model the uncertainties of wind power resulting from the stochastic nature of wind and fluctuations of demand for integration of wind turbine generators (WTGs). In this paper the load and wind power forecast errors are integrated into the expected energy not served (EENS) formulation through determination of probabilities using the normal distribution approach. The effects of forecast errors and wind energy penetration in the power system are examined. The impact of wind energy penetration on system reliability, total cost for energy and reserve procurement is then studied for a conventional power system. The results show a degradation of system reliability with significant wind energy penetration in the generation system. This work provides a useful insight into system reliability and economics for the independent system operator (ISO) to deploy energy/reserve providers when WTGs are integrated into the existing power system. (author)
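The normal-distribution treatment of forecast errors described above can be sketched as an expected-shortfall calculation: if the net load (load minus wind) is Gaussian, the expected energy not served for a given committed capacity has a closed form. The numbers and the simple single-period framing are illustrative assumptions, not the paper's full formulation:

```python
import math
from statistics import NormalDist

def eens_mw_h(capacity, mu_load, sigma_load, mu_wind, sigma_wind, hours=1.0):
    """Expected energy not served over a period, treating the net load
    (load minus wind) forecast error as Gaussian: the load and wind
    error distributions combine into net load ~ N(mu, sigma)."""
    mu = mu_load - mu_wind
    sigma = math.hypot(sigma_load, sigma_wind)
    nd = NormalDist()
    z = (capacity - mu) / sigma
    # E[max(net_load - capacity, 0)] for a Gaussian net load
    shortfall = sigma * nd.pdf(z) - (capacity - mu) * (1.0 - nd.cdf(z))
    return shortfall * hours
```

When capacity equals the expected net load (z = 0) the expected shortfall reduces to sigma times the standard normal density at zero, about 0.4 sigma, which shows directly how larger forecast errors degrade reliability.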

  2. Next-generation probabilistic seismicity forecasting

    Energy Technology Data Exchange (ETDEWEB)

    Hiemer, S.

    2014-07-01

    The development of probabilistic seismicity forecasts is one of the most important tasks of seismologists at present time. Such forecasts form the basis of probabilistic seismic hazard assessment, a widely used approach to generate ground motion exceedance maps. These hazard maps guide the development of building codes, and in the absence of the ability to deterministically predict earthquakes, good building and infrastructure planning is key to prevent catastrophes. Probabilistic seismicity forecasts are models that specify the occurrence rate of earthquakes as a function of space, time and magnitude. The models presented in this thesis are time-invariant mainshock occurrence models. Accordingly, the reliable estimation of the spatial and size distribution of seismicity are of crucial importance when constructing such probabilistic forecasts. Thereby we focus on data-driven approaches to infer these distributions, circumventing the need for arbitrarily chosen external parameters and subjective expert decisions. Kernel estimation has been shown to appropriately transform discrete earthquake locations into spatially continuous probability distributions. However, we show that neglecting the information from fault networks constitutes a considerable shortcoming and thus limits the skill of these current seismicity models. We present a novel earthquake rate forecast that applies the kernel-smoothing method to both past earthquake locations and slip rates on mapped crustal faults applied to Californian and European data. Our model is independent from biases caused by commonly used non-objective seismic zonations, which impose artificial borders of activity that are not expected in nature. Studying the spatial variability of the seismicity size distribution is of great importance. The b-value of the well-established empirical Gutenberg-Richter model forecasts the rates of hazard-relevant large earthquakes based on the observed rates of abundant small events. 
We propose a

  3. Next-generation probabilistic seismicity forecasting

    International Nuclear Information System (INIS)

    Hiemer, S.

    2014-01-01

    The development of probabilistic seismicity forecasts is one of the most important tasks of seismologists at present time. Such forecasts form the basis of probabilistic seismic hazard assessment, a widely used approach to generate ground motion exceedance maps. These hazard maps guide the development of building codes, and in the absence of the ability to deterministically predict earthquakes, good building and infrastructure planning is key to prevent catastrophes. Probabilistic seismicity forecasts are models that specify the occurrence rate of earthquakes as a function of space, time and magnitude. The models presented in this thesis are time-invariant mainshock occurrence models. Accordingly, the reliable estimation of the spatial and size distribution of seismicity are of crucial importance when constructing such probabilistic forecasts. Thereby we focus on data-driven approaches to infer these distributions, circumventing the need for arbitrarily chosen external parameters and subjective expert decisions. Kernel estimation has been shown to appropriately transform discrete earthquake locations into spatially continuous probability distributions. However, we show that neglecting the information from fault networks constitutes a considerable shortcoming and thus limits the skill of these current seismicity models. We present a novel earthquake rate forecast that applies the kernel-smoothing method to both past earthquake locations and slip rates on mapped crustal faults applied to Californian and European data. Our model is independent from biases caused by commonly used non-objective seismic zonations, which impose artificial borders of activity that are not expected in nature. Studying the spatial variability of the seismicity size distribution is of great importance. The b-value of the well-established empirical Gutenberg-Richter model forecasts the rates of hazard-relevant large earthquakes based on the observed rates of abundant small events. 
We propose a

  4. Operational Earthquake Forecasting: Proposed Guidelines for Implementation (Invited)

    Science.gov (United States)

    Jordan, T. H.

    2010-12-01

    The goal of operational earthquake forecasting (OEF) is to provide the public with authoritative information about how seismic hazards are changing with time. During periods of high seismic activity, short-term earthquake forecasts based on empirical statistical models can attain nominal probability gains in excess of 100 relative to the long-term forecasts used in probabilistic seismic hazard analysis (PSHA). Prospective experiments are underway by the Collaboratory for the Study of Earthquake Predictability (CSEP) to evaluate the reliability and skill of these seismicity-based forecasts in a variety of tectonic environments. How such information should be used for civil protection is by no means clear, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing formal procedures for OEF in this sort of “low-probability environment.” Nevertheless, the need to move more quickly towards OEF has been underscored by recent experiences, such as the 2009 L’Aquila earthquake sequence and other seismic crises in which an anxious public has been confused by informal, inconsistent earthquake forecasts. Whether scientists like it or not, rising public expectations for real-time information, accelerated by the use of social media, will require civil protection agencies to develop sources of authoritative information about the short-term earthquake probabilities. In this presentation, I will discuss guidelines for the implementation of OEF informed by my experience on the California Earthquake Prediction Evaluation Council, convened by CalEMA, and the International Commission on Earthquake Forecasting, convened by the Italian government following the L’Aquila disaster. (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and

  5. Statistical Methods for Solar Flare Probability Forecasting.

    Science.gov (United States)

    1980-09-01

    [The abstract in this record is OCR-garbled output from statistical tables; the legible fragments refer to measures of association (Kendall's tau, Somers' D), their significance levels, sunspot and flare variables, and a count of missing observations.]

  6. Forecasting Skill

    Science.gov (United States)

    1981-01-01

    for the third and fourth day precipitation forecasts. A marked improvement was shown for the consensus 24 hour precipitation forecast, and small... Zuckerberg (1980) found a small long term skill increase in forecasts of heavy snow events for nine eastern cities. Other National Weather Service...and maximum temperature) are each awarded marks 2, 1, or 0 according to whether the forecast is correct.

  7. Pollutant forecasting error based on persistence of wind direction

    International Nuclear Information System (INIS)

    Cooper, R.E.

    1978-01-01

    The purpose of this report is to provide a means of estimating the reliability of forecasts of downwind pollutant concentrations from atmospheric puff releases. These forecasts are based on assuming the persistence of wind direction as determined at the time of release. This initial forecast will be used to deploy survey teams, to predict population centers that may be affected, and to estimate the amount of time available for emergency response. Reliability of forecasting is evaluated by developing a cumulative probability distribution of error as a function of lapsed time following an assumed release. The cumulative error is determined by comparing the forecast pollutant concentration with the concentration measured by sampling along the real-time meteorological trajectory. It may be concluded that the assumption of meteorological persistence for emergency response is not very good for periods longer than 3 hours. Even within this period, the possibility for large error exists due to wind direction shifts. These shifts could affect population areas totally different from those areas first indicated.

  8. Assessing the potential for improving S2S forecast skill through multimodel ensembling

    Science.gov (United States)

    Vigaud, N.; Robertson, A. W.; Tippett, M. K.; Wang, L.; Bell, M. J.

    2016-12-01

    Non-linear logistic regression is well suited to probability forecasting and has been successfully applied in the past to ensemble weather and climate predictions, providing access to the full probability distribution without any Gaussian assumption. However, little work has been done at sub-monthly lead times, where relatively small re-forecast ensemble sizes and record lengths represent new challenges for which post-processing avenues have yet to be investigated. A promising approach consists in extending the definition of non-linear logistic regression by including the quantile of the forecast distribution as one of the predictors. So-called Extended Logistic Regression (ELR), which enables mutually consistent individual threshold probabilities, is here applied to ECMWF, CFSv2 and CMA re-forecasts from the S2S database in order to produce rainfall probabilities at weekly resolution. The ELR model is trained on seasonally-varying tercile categories computed for lead times of 1 to 4 weeks. It is then tested in a cross-validated manner, i.e. allowing real-time predictability applications, to produce rainfall tercile probabilities from individual weekly hindcasts that are finally combined by equal pooling. Results will be discussed over a broader North American region, where individual and MME forecasts generated out to 4 weeks lead are characterized by good probabilistic reliability but low sharpness, exhibiting systematically more skill in winter than in summer.
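The extended logistic regression idea above can be sketched in one function: the threshold q enters the regression through a monotone term, so cumulative probabilities for different thresholds can never cross. The coefficient values and the square-root link are illustrative assumptions, not the study's fitted model:

```python
import math

def elr_probability(x, q, a=-0.5, b=0.08, c=1.2):
    """P(rainfall <= q) given an ensemble-mean predictor x, from an
    extended logistic regression in which the threshold q is itself a
    predictor via a monotone sqrt link. Coefficients are illustrative,
    not fitted. A wetter predictor (larger x) lowers P(rain <= q)."""
    return 1.0 / (1.0 + math.exp(-(a - b * x + c * math.sqrt(q))))
```

Because c is positive and the sqrt term increases with q, the implied CDF is monotone in the threshold, which is the "mutually consistent individual threshold probabilities" property the abstract refers to.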

  9. Estimating predictive hydrological uncertainty by dressing deterministic and ensemble forecasts; a comparison, with application to Meuse and Rhine

    Science.gov (United States)

    Verkade, J. S.; Brown, J. D.; Davids, F.; Reggiani, P.; Weerts, A. H.

    2017-12-01

    Two statistical post-processing approaches for estimation of predictive hydrological uncertainty are compared: (i) 'dressing' of a deterministic forecast by adding a single, combined estimate of both hydrological and meteorological uncertainty and (ii) 'dressing' of an ensemble streamflow forecast by adding an estimate of hydrological uncertainty to each individual streamflow ensemble member. Both approaches aim to produce an estimate of the 'total uncertainty' that captures both the meteorological and hydrological uncertainties. They differ in the degree to which they make use of statistical post-processing techniques. In the 'lumped' approach, both sources of uncertainty are lumped by post-processing deterministic forecasts using their verifying observations. In the 'source-specific' approach, the meteorological uncertainties are estimated by an ensemble of weather forecasts. These ensemble members are routed through a hydrological model and a realization of the probability distribution of hydrological uncertainties (only) is then added to each ensemble member to arrive at an estimate of the total uncertainty. The techniques are applied to one location in the Meuse basin and three locations in the Rhine basin. Resulting forecasts are assessed for their reliability and sharpness, as well as compared in terms of multiple verification scores including the relative mean error, Brier Skill Score, Mean Continuous Ranked Probability Skill Score, Relative Operating Characteristic Score and Relative Economic Value. The dressed deterministic forecasts are generally more reliable than the dressed ensemble forecasts, but the latter are sharper. On balance, however, they show similar quality across a range of verification metrics, with the dressed ensembles coming out slightly better. Some additional analyses are suggested. 
Notably, these include statistical post-processing of the meteorological forecasts in order to increase their reliability, thus increasing the reliability
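One of the headline verification metrics above, the Continuous Ranked Probability Score, has a simple sample-based (energy) form for an ensemble forecast. A minimal sketch, not tied to the Meuse/Rhine setup:

```python
def crps_ensemble(members, obs):
    """Sample-based CRPS (energy form): mean |member - obs| minus half
    the mean absolute difference between member pairs. Lower is better;
    the score rewards both reliability and sharpness, the two qualities
    traded off between the dressed deterministic and dressed ensemble
    forecasts in the study above."""
    m = len(members)
    term1 = sum(abs(x - obs) for x in members) / m
    term2 = sum(abs(x - y) for x in members for y in members) / (2 * m * m)
    return term1 - term2
```

For a single-member "ensemble" the second term vanishes and the CRPS reduces to the absolute error, which is why the score allows deterministic and ensemble forecasts to be compared on one scale.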

  10. Probabilistic Electricity Price Forecasting Models by Aggregation of Competitive Predictors

    Directory of Open Access Journals (Sweden)

    Claudio Monteiro

    2018-04-01

    Full Text Available This article presents original probabilistic price forecasting meta-models (PPFMCP models), by aggregation of competitive predictors, for day-ahead hourly probabilistic price forecasting. The best twenty predictors of the EEM2016 EPF competition are used to create ensembles of hourly spot price forecasts. For each hour, the parameter values of the probability density function (PDF) of a Beta distribution for the output variable (hourly price) can be directly obtained from the expected and variance values associated with the ensemble for such hour, using three aggregation strategies of predictor forecasts corresponding to three PPFMCP models. A Reliability Indicator (RI) and a Loss function Indicator (LI) are also introduced to give a measure of uncertainty of probabilistic price forecasts. The three PPFMCP models were satisfactorily applied to the real-world case study of the Iberian Electricity Market (MIBEL). Results from PPFMCP models showed that PPFMCP model 2, which uses aggregation by weight values according to daily ranks of predictors, was the best probabilistic meta-model from a point of view of mean absolute errors, as well as of RI and LI. PPFMCP model 1, which uses the averaging of predictor forecasts, was the second best meta-model. PPFMCP models allow evaluations of risk decisions based on the price to be made.
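The step of obtaining Beta PDF parameters from the ensemble's expected value and variance is a method-of-moments fit, which can be sketched directly. The example prices and the [0, 100] scaling bounds are illustrative assumptions, not values from the MIBEL study:

```python
def beta_from_ensemble(prices, p_min, p_max):
    """Method-of-moments Beta parameters (alpha, beta) for an hourly
    price, from an ensemble of predictor forecasts scaled to [0, 1].
    Requires the sample variance v to satisfy v < m * (1 - m)."""
    n = len(prices)
    scaled = [(p - p_min) / (p_max - p_min) for p in prices]
    m = sum(scaled) / n
    v = sum((s - m) ** 2 for s in scaled) / (n - 1)
    common = m * (1.0 - m) / v - 1.0
    return m * common, (1.0 - m) * common  # alpha, beta
```

The fitted Beta then reproduces the ensemble's mean, alpha / (alpha + beta) = m, while its shape encodes the ensemble spread, giving the full hourly predictive density the meta-models work with.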

  11. Empirical investigation on using wind speed volatility to estimate the operation probability and power output of wind turbines

    International Nuclear Information System (INIS)

    Liu, Heping; Shi, Jing; Qu, Xiuli

    2013-01-01

    Highlights: ► Ten-minute wind speed and power generation data of an offshore wind turbine are used. ► An ARMA–GARCH-M model is built to simultaneously forecast wind speed mean and volatility. ► The operation probability and expected power output of the wind turbine are predicted. ► The integrated approach produces more accurate wind power forecasting than other conventional methods. - Abstract: In this paper, we introduce a quantitative methodology that performs the interval estimation of wind speed, calculates the operation probability of wind turbine, and forecasts the wind power output. The technological advantage of this methodology stems from the empowered capability of mean and volatility forecasting of wind speed. Based on the real wind speed and corresponding wind power output data from an offshore wind turbine, this methodology is applied to build an ARMA–GARCH-M model for wind speed forecasting, and then to compute the operation probability and the expected power output of the wind turbine. The results show that the developed methodology is effective, the obtained interval estimation of wind speed is reliable, and the forecasted operation probability and expected wind power output of the wind turbine are accurate.
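Two pieces of the methodology above can be sketched briefly: the GARCH(1,1) conditional-variance recursion that supplies the volatility forecast, and the turbine operation probability as the chance that wind speed falls between cut-in and cut-out. All parameter values (omega, alpha, beta, the cut speeds) are illustrative assumptions, not values fitted to the turbine data:

```python
from statistics import NormalDist

def garch_variance(residuals, omega=0.05, alpha=0.1, beta=0.85):
    """Conditional variance path sigma^2_t = omega + alpha * e_{t-1}^2
    + beta * sigma^2_{t-1}: the volatility half of an ARMA-GARCH-type
    model, started from the unconditional variance."""
    sigma2 = [omega / (1.0 - alpha - beta)]
    for e in residuals[:-1]:
        sigma2.append(omega + alpha * e * e + beta * sigma2[-1])
    return sigma2

def operation_probability(mu, sigma, cut_in=3.5, cut_out=25.0):
    """P(cut_in < wind speed < cut_out) for a Gaussian speed forecast
    with mean mu and forecast standard deviation sigma."""
    nd = NormalDist(mu, sigma)
    return nd.cdf(cut_out) - nd.cdf(cut_in)
```

The volatility forecast feeds the interval estimation (mean plus or minus a multiple of sigma), and the operation probability scales the expected power output of the turbine.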

  12. Flood Forecasting Based on TIGGE Precipitation Ensemble Forecast

    Directory of Open Access Journals (Sweden)

    Jinyin Ye

    2016-01-01

    Full Text Available TIGGE (the THORPEX Interactive Grand Global Ensemble) was a major part of THORPEX (The Observing System Research and Predictability Experiment). It integrates ensemble precipitation products from all the major forecast centers in the world and provides systematic evaluation of the multimodel ensemble prediction system. Development of a meteorologic-hydrologic coupled flood forecasting model and early warning model based on the TIGGE precipitation ensemble forecast can provide flood probability forecasts, extend the lead time of the flood forecast, and gain more time for decision-makers to make the right decision. In this study, precipitation ensemble forecast products from ECMWF, NCEP, and CMA are used to drive the distributed hydrologic model TOPX. We focus on the Yi River catchment and aim to build a flood forecast and early warning system. The results show that the meteorologic-hydrologic coupled model can satisfactorily predict the flow processes of four flood events. The predicted occurrence time of peak discharges is close to the observations. However, the magnitude of the peak discharges differs significantly due to the varying performance of the ensemble prediction systems. The coupled forecasting model can accurately predict the occurrence of the peak time and the corresponding risk probability of peak discharge based on the probability distribution of peak time and flood warning, which can provide users a strong theoretical foundation and valuable information as a promising new approach.
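The risk-probability idea above (each precipitation ensemble member routed through the hydrologic model yields one simulated hydrograph) reduces to counting members: the flood probability is the fraction of member hydrographs whose peak exceeds the warning level, and the peak-time distribution comes from the members' peak timesteps. A minimal sketch with made-up hydrographs, not the TOPX system:

```python
def peak_risk(ensemble_hydrographs, warning_level):
    """Flood risk from an ensemble of simulated hydrographs: the
    fraction of members whose peak discharge exceeds the warning
    level, and the most common peak timestep across members."""
    peaks = [max(h) for h in ensemble_hydrographs]
    times = [h.index(max(h)) for h in ensemble_hydrographs]
    p_exceed = sum(p > warning_level for p in peaks) / len(peaks)
    mode_time = max(set(times), key=times.count)
    return p_exceed, mode_time
```

With real TIGGE-driven ensembles the same counting gives the exceedance probability per warning threshold and the probability distribution of peak timing that the abstract describes.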

  13. A convection-allowing ensemble forecast based on the breeding growth mode and associated optimization of precipitation forecast

    Science.gov (United States)

    Li, Xiang; He, Hongrang; Chen, Chaohui; Miao, Ziqing; Bai, Shigang

    2017-10-01

    A convection-allowing ensemble forecast experiment on a squall line was conducted based on the breeding growth mode (BGM). Meanwhile, the probability matched mean (PMM) and neighborhood ensemble probability (NEP) methods were used to optimize the associated precipitation forecast. The ensemble forecast predicted the precipitation tendency accurately, which was closer to the observation than in the control forecast. For heavy rainfall, the precipitation center produced by the ensemble forecast was also better. The Fractions Skill Score (FSS) results indicated that the ensemble mean was skillful in light rainfall, while the PMM produced better probability distribution of precipitation for heavy rainfall. Preliminary results demonstrated that convection-allowing ensemble forecast could improve precipitation forecast skill through providing valuable probability forecasts. It is necessary to employ new methods, such as the PMM and NEP, to generate precipitation probability forecasts. Nonetheless, the lack of spread and the overprediction of precipitation by the ensemble members are still problems that need to be solved.
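The probability matched mean used above can be sketched compactly: keep the spatial pattern (ranks) of the ensemble-mean field, but replace its amplitudes with quantiles of the pooled member values, which restores the heavy-rainfall intensities that plain averaging smooths out. The toy fields below are illustrative, and the mid-point quantile choice is one simple convention:

```python
def probability_matched_mean(fields):
    """Probability-matched mean of ensemble precipitation fields.
    fields: m members, each a list of n grid-point values."""
    m, n = len(fields), len(fields[0])
    mean_field = [sum(f[j] for f in fields) / m for j in range(n)]
    # pool all m*n member values and keep n representative quantiles
    pooled = sorted(v for f in fields for v in f)
    quantiles = [pooled[i * m + m // 2] for i in range(n)]
    # assign the largest quantile to the grid point where the
    # ensemble mean is largest, and so on down the ranks
    order = sorted(range(n), key=lambda j: mean_field[j])
    pmm = [0.0] * n
    for rank, j in enumerate(order):
        pmm[j] = quantiles[rank]
    return pmm
```

In the toy example below the plain ensemble mean tops out at 15 while the PMM restores a 20, illustrating why PMM verifies better for heavy rainfall.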

  14. Should we use seasonal meteorological ensemble forecasts for hydrological forecasting? A case study for Nordic watersheds in Canada.

    Science.gov (United States)

    Bazile, Rachel; Boucher, Marie-Amélie; Perreault, Luc; Leconte, Robert; Guay, Catherine

    2017-04-01

    Hydro-electricity is a major source of energy for many countries throughout the world, including Canada. Long lead-time streamflow forecasts are all the more valuable as they help decision making and dam management. Different techniques exist for long-term hydrological forecasting. Perhaps the most well-known is 'Extended Streamflow Prediction' (ESP), which considers past meteorological scenarios as possible, often equiprobable, future scenarios. In the ESP framework, those past-observed meteorological scenarios (climatology) are used in turn as the inputs of a chosen hydrological model to produce ensemble forecasts (one member corresponding to each year in the available database). Many hydropower companies, including Hydro-Québec (province of Quebec, Canada) use variants of the above described ESP system operationally for long-term operation planning. The ESP system accounts for the hydrological initial conditions and for the natural variability of the meteorological variables. However, it cannot consider the current initial state of the atmosphere. Climate models can help remedy this drawback. In the context of a changing climate, dynamical forecasts issued from climate models seem to be an interesting avenue to improve upon the ESP method and could help hydropower companies to adapt their management practices to an evolving climate. Long-range forecasts from climate models can also be helpful for water management at locations where records of past meteorological conditions are short or nonexistent. In this study, we compare 7-month hydrological forecasts obtained from climate model outputs to an ESP system. The ESP system mimics the one used operationally at Hydro-Québec. The dynamical climate forecasts are produced by the European Center for Medium range Weather Forecasts (ECMWF) System4. Forecast quality is assessed using numerical scores such as the Continuous Ranked Probability Score (CRPS) and the Ignorance score and also graphical tools such as the

  15. Load forecasting

    International Nuclear Information System (INIS)

    Mak, H.

    1995-01-01

    Slides from a presentation about the changing needs for load forecasting, given at The Power of Change Conference in Vancouver, BC in April 1995, are presented. Technological innovations and population increase were said to be the prime driving forces behind the changing needs in load forecasting. Structural changes, marketplace changes, electricity supply planning changes, and changes in planning objectives were other factors discussed. It was concluded that load forecasting was a form of information gathering that provided important market intelligence.

  16. Spatial load forecasting

    Energy Technology Data Exchange (ETDEWEB)

    Willis, H.L.; Engel, M.V.; Buri, M.J.

    1995-04-01

    The reliability, efficiency, and economy of a power delivery system depend mainly on how well its substations, transmission lines, and distribution feeders are located within the utility service area, and how well their capacities match power needs in their respective localities. Often, utility planners are forced to commit to sites, rights of way, and equipment capacities years in advance. A necessary element of effective expansion planning is a forecast of where and how much demand must be served by the future T and D system. This article reports a three-stage method that forecasts with accuracy and detail, allowing meaningful determination of sites and sizes for future substation, transmission, and distribution facilities.

  17. COVAL, Compound Probability Distribution for Function of Probability Distribution

    International Nuclear Information System (INIS)

    Astolfi, M.; Elbaz, J.

    1979-01-01

    1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions
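A Monte Carlo stand-in for what COVAL does by numerical transformation, i.e. obtaining the distribution of a function of random variables, could look as follows; the load/strength model and all numbers are illustrative assumptions, not taken from the COVAL code:

```python
import random

def propagate(func, samplers, n=50_000, seed=0):
    """Monte Carlo approximation of the distribution of
    func(X1, ..., Xk), given a sampler for each input variable.
    Returns the sorted sample of function values."""
    rng = random.Random(seed)
    return sorted(func(*(s(rng) for s in samplers)) for _ in range(n))

# Illustrative reliability example: safety margin R - L for a structure
# with random load L and random strength R (values are assumptions).
margin = propagate(lambda L, R: R - L,
                   [lambda r: r.gauss(50.0, 10.0),   # load L
                    lambda r: r.gauss(80.0, 8.0)])   # strength R
p_fail = sum(m < 0 for m in margin) / len(margin)    # P(margin < 0)
```

For this normal/normal case the failure probability is known analytically (about 1%), which makes the sampled estimate easy to sanity-check.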

  18. Wind power forecast

    Energy Technology Data Exchange (ETDEWEB)

    Pestana, Rui [Rede Electrica Nacional (REN), S.A., Lisboa (Portugal). Dept. Systems and Development System Operator; Trancoso, Ana Rosa; Delgado Domingos, Jose [Univ. Tecnica de Lisboa (Portugal). Seccao de Ambiente e Energia

    2012-07-01

    Accurate wind power forecasts are needed to reduce the integration costs in the electric grid caused by wind's inherent variability. Currently, Portugal has a significant wind power penetration level and consequently the need to have reliable wind power forecasts at different temporal scales, including localized events such as ramps. This paper provides an overview of the methodologies used by REN to forecast wind power at the national level, based on statistical and probabilistic combinations of NWP and measured data with the aim of improving the accuracy of pure NWP. Results show that significant improvement can be achieved with statistical combination with persistence in the short term and with probabilistic combination in the medium term. NWP are also able to detect ramp events with 3-day notice to the operational planning. (orig.)

  19. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  20. DECOFF Probabilities of Failed Operations

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    2015-01-01

    A statistical procedure of estimation of Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. Also safety factor influence is investigated. DECOFF statistical method is benchmarked against standard Alpha-factor...

  1. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities......, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially...... updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber–Shiu functions and dependence....

  2. Generalized Probability-Probability Plots

    NARCIS (Netherlands)

    Mushkudiani, N.A.; Einmahl, J.H.J.

    2004-01-01

    We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real valued data.These plots, that are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P

  3. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  4. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  5. Human reliability

    International Nuclear Information System (INIS)

    Embrey, D.E.

    1987-01-01

    Concepts and techniques of human reliability have been developed and are used mostly in probabilistic risk assessment. For this, the major application of human reliability assessment has been to identify the human errors which have a significant effect on the overall safety of the system and to quantify the probability of their occurrence. Some of the major issues within human reliability studies are reviewed and it is shown how these are applied to the assessment of human failures in systems. This is done under the following headings: models of human performance used in human reliability assessment, the nature of human error, classification of errors in man-machine systems, practical aspects, human reliability modelling in complex situations, quantification and examination of human reliability, judgement-based approaches, holistic techniques and decision analytic approaches. (UK)

  6. Portals for Real-Time Earthquake Data and Forecasting: Challenge and Promise (Invited)

    Science.gov (United States)

    Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Feltstykket, R.; Donnellan, A.; Glasscoe, M. T.

    2013-12-01

    Earthquake forecasts have been computed by a variety of countries world-wide for over two decades. For the most part, forecasts have been computed for insurance, reinsurance and underwriters of catastrophe bonds. However, recent events clearly demonstrate that mitigating personal risk is becoming the responsibility of individual members of the public. Open access to a variety of web-based forecasts, tools, utilities and information is therefore required. Portals for data and forecasts present particular challenges, and require the development of both apps and the client/server architecture to deliver the basic information in real time. The basic forecast model we consider is the Natural Time Weibull (NTW) method (JBR et al., Phys. Rev. E, 86, 021106, 2012). This model uses small earthquakes ('seismicity-based models') to forecast the occurrence of large earthquakes, via data-mining algorithms combined with the ANSS earthquake catalog. This method computes large earthquake probabilities using the number of small earthquakes that have occurred in a region since the last large earthquake. Localizing these forecasts in space so that global forecasts can be computed in real time presents special algorithmic challenges, which we describe in this talk. Using 25 years of data from the ANSS California-Nevada catalog of earthquakes, we compute real-time global forecasts at a grid scale of 0.1°. We analyze and monitor the performance of these models using the standard tests, which include the Reliability/Attributes and Receiver Operating Characteristic (ROC) tests. It is clear from much of the analysis that data quality is a major limitation on the accurate computation of earthquake probabilities. We discuss the challenges of serving up these datasets over the web on web-based platforms such as those at www.quakesim.org, www.e-decider.org, and www.openhazards.com.
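The natural-time idea above, counting small earthquakes since the last large one, can be sketched as a conditional probability under a Weibull distribution in event-count ("natural") time. The functional form follows the general natural-time Weibull idea, but the parameters tau and beta below are illustrative placeholders, not the fitted NTW values:

```python
import math

def large_eq_probability(n_since, delta_n, tau, beta):
    """Conditional probability that the next large earthquake occurs
    within the next delta_n small events, given that n_since small
    events have elapsed since the last large one, assuming a Weibull
    distribution in natural (event-count) time."""
    def cdf(n):  # Weibull CDF in natural time
        return 1.0 - math.exp(-((n / tau) ** beta))
    survival = 1.0 - cdf(n_since)
    return (cdf(n_since + delta_n) - cdf(n_since)) / survival

# With beta > 1 the hazard grows as small events accumulate.
p_early = large_eq_probability(50, 10, tau=400.0, beta=1.5)
p_late = large_eq_probability(200, 10, tau=400.0, beta=1.5)
```

The increasing-hazard behaviour (p_late > p_early) is what lets such probabilities be updated dynamically as the small-earthquake count grows.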

  7. Time-dependent non-probability reliability analysis of corroded pipeline in service%在役腐蚀管道动态非概率可靠性分析

    Institute of Scientific and Technical Information of China (English)

    魏宗平

    2014-01-01

    Corrosion is one of the primary failure modes of pressurized pipelines. Research on the reliability of corroded pipelines has important theoretical significance and application value. Adequate data are necessary for the probabilistic reliability model and the fuzzy reliability model used to analyze the reliability of corroded pipelines. In cases where insufficient information is available, the uncertain parameters of yield strength, pipeline diameter, defect depth, operating pressure and so on were described as interval variables, to make good use of the uncertain information of the corroded pipelines and compensate for the lack of data. A time-dependent non-probability reliability model of the corroded pipeline in service was established based on the interval model. A simple method for the remaining-life prediction of corroded pipelines was given. A numerical example was used to illustrate the proposed method. The results show that the proposed method is feasible, reasonable and practicable. Finally, a sensitivity analysis was carried out on the interval variables involved in the problem. The effects of the coefficients of variation of pipeline wall thickness, defect depth, operating pressure and corrosion velocity on the non-probability reliability index of the corroded pipelines were evaluated. The results of the sensitivity analysis indicate that the non-probability reliability index is most sensitive to the coefficient of variation of the pipeline wall thickness interval variable.

  8. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  9. Bayesian flood forecasting methods: A review

    Science.gov (United States)

    Han, Shasha; Coulibaly, Paulin

    2017-08-01

    Over the past few decades, floods have been among the most common and most widely distributed natural disasters in the world. If floods could be accurately forecasted in advance, their negative impacts could be greatly minimized. It is widely recognized that quantification and reduction of the uncertainty associated with hydrologic forecasts is of great importance for flood estimation and rational decision making. The Bayesian forecasting system (BFS) offers an ideal theoretical framework for uncertainty quantification that can be developed for probabilistic flood forecasting via any deterministic hydrologic model. It provides a suitable theoretical structure, empirically validated models and a reasonable analytic-numerical computation method, and can be developed into various Bayesian forecasting approaches. This paper presents a comprehensive review of Bayesian forecasting approaches applied to flood forecasting from 1999 to the present. The review starts with an overview of the fundamentals of BFS and recent advances in BFS, followed by BFS applications in river stage forecasting and real-time flood forecasting; it then moves to a critical analysis evaluating the advantages and limitations of Bayesian forecasting methods and other predictive uncertainty assessment approaches in flood forecasting, and finally discusses future research directions in Bayesian flood forecasting. Results show that the Bayesian flood forecasting approach is an effective and advanced way to estimate floods: it considers all sources of uncertainty and produces a predictive distribution of the river stage, river discharge or runoff, thus giving more accurate and reliable flood forecasts. Some emerging Bayesian forecasting methods (e.g. ensemble Bayesian forecasting system, Bayesian multi-model combination) were shown to overcome the limitations of a single model or fixed model weights and to effectively reduce predictive uncertainty.
In recent years, various Bayesian flood forecasting approaches have been
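The core BFS idea of replacing a point forecast with a full predictive density can be illustrated with the simplest conjugate case, a normal prior updated by a normally distributed model forecast. This is a generic textbook sketch, not any of the specific BFS variants the review covers:

```python
def normal_bayes_update(prior_mean, prior_var, forecast, error_var):
    """Conjugate normal-normal Bayesian update: combine a climatological
    prior on the flow with a model forecast of known error variance.
    Returns the mean and variance of the resulting predictive (posterior)
    normal distribution instead of a single valued forecast."""
    post_var = 1.0 / (1.0 / prior_var + 1.0 / error_var)
    post_mean = post_var * (prior_mean / prior_var + forecast / error_var)
    return post_mean, post_var

# Climatological prior N(100, 400) combined with a forecast of 140
# whose error variance is also 400 (all numbers illustrative).
post_mean, post_var = normal_bayes_update(100.0, 400.0, 140.0, 400.0)
```

With equal variances the posterior mean lands halfway between prior and forecast (120) and the predictive variance halves (200), showing how combining sources both shifts the estimate and reduces uncertainty.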

  10. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of a major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory, Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think...... analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators...

  11. An application and verification of ensemble forecasting on wind power to assess operational risk indicators in power grids

    Energy Technology Data Exchange (ETDEWEB)

    Alessandrini, S.; Ciapessoni, E.; Cirio, D.; Pitto, A.; Sperati, S. [Ricerca sul Sistema Energetico RSE S.p.A., Milan (Italy). Power System Development Dept. and Environment and Sustainable Development Dept.; Pinson, P. [Technical University of Denmark, Lyngby (Denmark). DTU Informatics

    2012-07-01

    Wind energy is part of the so-called non-schedulable renewable sources, i.e. it must be exploited when it is available, otherwise it is lost. In European regulation it has priority of dispatch over conventional generation, to maximize green energy production. However, being variable and uncertain, wind (and solar) generation raises several issues for the security of power grid operation. In particular, Transmission System Operators (TSOs) need forecasts that are as accurate as possible. Nowadays a deterministic approach to wind power forecasting (WPF) could easily be considered insufficient to face the uncertainty associated with wind energy. In order to obtain information about the accuracy of a forecast and a reliable estimation of its uncertainty, probabilistic forecasting is becoming increasingly widespread. In this paper we investigate the performance of the COnsortium for Small-scale MOdelling Limited-area Ensemble Prediction System (COSMO-LEPS). First, the ensemble application is followed by an assessment of its properties (i.e. consistency, reliability) using different verification indices and diagrams calculated on wind power. Then we provide examples of how EPS-based wind power forecasts can be used in power system security analyses. Quantifying the forecast uncertainty makes it possible to determine the regulation reserve requirements more accurately, hence improving security of operation and reducing system costs. In particular, the paper also presents a probabilistic power flow (PPF) technique developed at RSE aimed at evaluating the impact of wind power forecast accuracy on the probability of security violations in power systems. (orig.)
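A standard consistency check of the kind this record applies to ensemble wind power forecasts is the rank (Talagrand) histogram: for a reliable ensemble, the observation's rank among the sorted members is uniform. A generic sketch on synthetic data (not RSE's actual verification code):

```python
import random

def rank_histogram(ensembles, observations):
    """Count, over many forecast cases, the rank of each observation
    among its ensemble members; a flat histogram indicates consistency."""
    m = len(ensembles[0])
    counts = [0] * (m + 1)
    for members, obs in zip(ensembles, observations):
        counts[sum(x < obs for x in members)] += 1
    return counts

# Synthetic check: observations drawn from the same distribution as the
# 9 ensemble members should give a roughly flat 10-bin histogram.
rng = random.Random(1)
ens = [[rng.gauss(0, 1) for _ in range(9)] for _ in range(5000)]
obs = [rng.gauss(0, 1) for _ in range(5000)]
counts = rank_histogram(ens, obs)
```

U-shaped counts would indicate under-dispersion, a dome shape over-dispersion; both are symptoms the record's verification step is designed to catch.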

  12. Exposure Forecaster

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Exposure Forecaster Database (ExpoCastDB) is EPA's database for aggregating chemical exposure information and can be used to help with chemical exposure...

  13. Strategic Forecasting

    DEFF Research Database (Denmark)

    Duus, Henrik Johannsen

    2016-01-01

    Purpose: The purpose of this article is to present an overview of the area of strategic forecasting and its research directions and to put forward some ideas for improving management decisions. Design/methodology/approach: This article is conceptual but also informed by the author’s long contact...... and collaboration with various business firms. It starts by presenting an overview of the area and argues that the area is as much a way of thinking as a toolbox of theories and methodologies. It then spells out a number of research directions and ideas for management. Findings: Strategic forecasting is seen...... as a rebirth of long range planning, albeit with new methods and theories. Firms should make the building of strategic forecasting capability a priority. Research limitations/implications: The article subdivides strategic forecasting into three research avenues and suggests avenues for further research efforts...

  14. Verification of ECMWF System 4 for seasonal hydrological forecasting in a northern climate

    Science.gov (United States)

    Bazile, Rachel; Boucher, Marie-Amélie; Perreault, Luc; Leconte, Robert

    2017-11-01

    Hydropower production requires optimal dam and reservoir management to prevent flooding damage and avoid operation losses. In a northern climate, where spring freshet constitutes the main inflow volume, seasonal forecasts can help to establish a yearly strategy. Long-term hydrological forecasts often rely on past observations of streamflow or meteorological data. Another alternative is to use ensemble meteorological forecasts produced by climate models. In this paper, those produced by the ECMWF (European Centre for Medium-Range Weather Forecasts) System 4 are examined and bias is characterized. Bias correction, through the linear scaling method, improves the performance of the raw ensemble meteorological forecasts in terms of continuous ranked probability score (CRPS). Then, three seasonal ensemble hydrological forecasting systems are compared: (1) the climatology of simulated streamflow, (2) the ensemble hydrological forecasts based on climatology (ESP) and (3) the hydrological forecasts based on bias-corrected ensemble meteorological forecasts from System 4 (corr-DSP). Simulated streamflow computed using observed meteorological data is used as benchmark. Accounting for initial conditions is valuable even for long-term forecasts. ESP and corr-DSP both outperform the climatology of simulated streamflow for lead times from 1 to 5 months depending on the season and watershed. Integrating information about future meteorological conditions also improves monthly volume forecasts. For the 1-month lead time, a gain exists for almost all watersheds during winter, summer and fall. However, volume forecast performance for spring varies from one watershed to another. For most of them, the performance is close to the performance of ESP. For longer lead times, the CRPS skill score is mostly in favour of ESP, even if for many watersheds, ESP and corr-DSP have comparable skill. Corr-DSP appears quite reliable but, in some cases, under-dispersion or bias is observed. 
A more complex bias
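The linear scaling method named in this record can be sketched in a few lines: each forecast is shifted (or rescaled) so that the forecast climatology matches the observed climatology. The minimal interface below is an assumption, not the authors' code; additive correction is typical for temperature, multiplicative for precipitation:

```python
def linear_scaling(forecasts, obs_clim_mean, fcst_clim_mean,
                   multiplicative=False):
    """Linear-scaling bias correction of a list of forecast values.

    Additive: subtract the mean bias of the forecast climatology.
    Multiplicative: rescale by the ratio of climatological means."""
    if multiplicative:  # typical choice for precipitation
        factor = obs_clim_mean / fcst_clim_mean
        return [f * factor for f in forecasts]
    bias = fcst_clim_mean - obs_clim_mean  # typical choice for temperature
    return [f - bias for f in forecasts]
```

For example, if the forecast climatology averages 11 where observations average 5, every member is shifted down by 6 before being fed to the hydrological model.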

  15. Uncertainties in Forecasting Streamflow using Entropy Theory

    Science.gov (United States)

    Cui, H.; Singh, V. P.

    2017-12-01

    Streamflow forecasting is essential in river restoration, reservoir operation, power generation, irrigation, navigation, and water management. However, uncertainties always accompany a forecast, and they may affect the forecasting results and lead to large variations. Therefore, uncertainties must be considered and properly assessed when forecasting streamflow for water management. The aim of our work is to quantify the uncertainties involved in forecasting streamflow and to provide a reliable streamflow forecast. Although streamflow time series are stochastic, they exhibit seasonal and periodic patterns. Therefore, streamflow forecasting entails modeling seasonality, periodicity, and the correlation structure, and assessing uncertainties. This study applies entropy theory to forecast streamflow and to measure uncertainties during the forecasting process. To apply entropy theory for streamflow forecasting, spectral analysis is combined with time series analysis, as spectral analysis can be employed to characterize patterns of streamflow variation and identify the periodicity of streamflow. That is, it permits the extraction of significant information for understanding the streamflow process and its prediction. Application of entropy theory for streamflow forecasting involves determination of the spectral density, determination of parameters, and extension of the autocorrelation function. The uncertainties introduced by the precipitation input, the forecasting model and the forecasted results are measured separately using entropy. Using information theory, how these uncertainties are transported and aggregated during these processes will be described.
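The periodicity-identification step described above is commonly done with a periodogram; a minimal discrete-Fourier sketch on a synthetic series with a 12-step (e.g. monthly) cycle, standing in for the record's spectral analysis:

```python
import cmath
import math

def periodogram(series):
    """Periodogram of a series: power at frequencies k/n for
    k = 1 .. n//2, computed from the mean-removed discrete Fourier sum.
    A peak at index k-1 indicates a dominant period of n/k samples."""
    n = len(series)
    mean = sum(series) / n
    x = [v - mean for v in series]
    power = []
    for k in range(1, n // 2 + 1):
        s = sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n))
        power.append(abs(s) ** 2 / n)
    return power

# A 120-sample series with period 12 should peak at k = 120/12 = 10.
series = [math.sin(2 * math.pi * t / 12) for t in range(120)]
p = periodogram(series)
dominant_k = max(range(len(p)), key=p.__getitem__) + 1
```

Once the dominant period is identified, the seasonal cycle can be modeled explicitly and the entropy analysis applied to the residual structure.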

  16. Forecast Combination under Heavy-Tailed Errors

    Directory of Open Access Journals (Sweden)

    Gang Cheng

    2015-11-01

    Forecast combination has been proven to be a very important technique for obtaining accurate predictions in various applications in economics, finance, marketing and many other areas. In many applications, forecast errors exhibit heavy-tailed behaviors for various reasons. Unfortunately, to our knowledge, little has been done to obtain reliable forecast combinations for such situations. The familiar forecast combination methods, such as the simple average, least squares regression or those based on the variance-covariance of the forecasts, may perform very poorly because outliers tend to occur, which gives these methods unstable weights and leads to non-robust forecasts. To address this problem, in this paper we propose two nonparametric forecast combination methods. One is specially proposed for situations in which the forecast errors are strongly believed to have heavy tails that can be modeled by a scaled Student's t-distribution; the other is designed for relatively more general situations when there is a lack of strong or consistent evidence on the tail behaviors of the forecast errors due to a shortage of data and/or an evolving data-generating process. Adaptive risk bounds for both methods are developed. They show that the resulting combined forecasts yield near-optimal mean forecast errors relative to the candidate forecasts. Simulations and a real example demonstrate their superior performance in that they indeed tend to have significantly smaller prediction errors than previous combination methods in the presence of forecast outliers.
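The outlier sensitivity that motivates this record is easy to demonstrate: a single wild candidate forecast drags the simple average far from consensus, while a robust combiner barely moves. The median used below is one simple robust alternative, not the paper's t-distribution-based method:

```python
from statistics import mean, median

def combine_mean(forecasts):
    """Simple-average combination: optimal under thin tails, but a
    single heavy-tailed outlier shifts the result arbitrarily far."""
    return mean(forecasts)

def combine_median(forecasts):
    """Median combination: a basic robust combiner whose result is
    bounded by the central candidates regardless of outlier size."""
    return median(forecasts)

# Three forecasters agree near 10; a fourth produces an outlier.
candidates = [10.2, 9.8, 10.1, 55.0]
avg = combine_mean(candidates)      # pulled above 21 by the outlier
med = combine_median(candidates)    # stays near the consensus, ~10.15
```

This is exactly the failure mode the paper's nonparametric methods are designed to avoid while retaining near-optimal risk when tails are in fact thin.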

  17. Real-time emergency forecasting technique for situation management systems

    Science.gov (United States)

    Kopytov, V. V.; Kharechkin, P. V.; Naumenko, V. V.; Tretyak, R. S.; Tebueva, F. B.

    2018-05-01

    The article describes a real-time emergency forecasting technique that increases the accuracy and reliability of the forecasting results of any emergency computational model applied for decision making in situation management systems. The computational models are improved by means of the Improved Brown's method, which applies fractal dimension to forecast the short time series of data received from sensors and control systems. The reliability of the emergency forecasting results is ensured by filtering out invalid sensed data using methods of correlation analysis.
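The base of the Improved Brown's method mentioned above is Brown's double (linear) exponential smoothing; a minimal sketch of that base method, without the article's fractal-dimension adaptation of the smoothing constant:

```python
def brown_forecast(series, alpha=0.4, horizon=1):
    """Brown's double exponential smoothing forecast.

    Two cascaded exponential averages s1, s2 give a local level and
    trend; the forecast extrapolates them `horizon` steps ahead."""
    s1 = s2 = series[0]
    for x in series[1:]:
        s1 = alpha * x + (1 - alpha) * s1
        s2 = alpha * s1 + (1 - alpha) * s2
    level = 2 * s1 - s2
    trend = alpha / (1 - alpha) * (s1 - s2)
    return level + horizon * trend

flat = brown_forecast([5.0] * 10)          # constant series -> 5.0
rising = brown_forecast(list(range(1, 11)))  # trending series -> ~11
```

The fractal-dimension refinement described in the article would adapt alpha to the roughness of the incoming sensor series; here alpha is simply fixed.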

  18. Reply to "Comment on 'Nonparametric forecasting of low-dimensional dynamical systems' ".

    Science.gov (United States)

    Berry, Tyrus; Giannakis, Dimitrios; Harlim, John

    2016-03-01

    In this Reply we provide additional results which allow a better comparison of the diffusion forecast and the "past-noise" forecasting (PNF) approach for the El Niño index. We remark on some qualitative differences between the diffusion forecast and PNF, and we suggest an alternative use of the diffusion forecast for the purposes of forecasting the probabilities of extreme events.

  19. Using HPC within an operational forecasting configuration

    Science.gov (United States)

    Jagers, H. R. A.; Genseberger, M.; van den Broek, M. A. F. H.

    2012-04-01

    Various natural disasters are caused by high-intensity events, for example: extreme rainfall can in a short time cause major damage in river catchments, storms can cause havoc in coastal areas. To assist emergency response teams in operational decisions, it's important to have reliable information and predictions as soon as possible. This starts before the event by providing early warnings about imminent risks and estimated probabilities of possible scenarios. In the context of various applications worldwide, Deltares has developed an open and highly configurable forecasting and early warning system: Delft-FEWS. Finding the right balance between simulation time (and hence prediction lead time) and simulation accuracy and detail is challenging. Model resolution may be crucial to capture certain critical physical processes. Uncertainty in forcing conditions may require running large ensembles of models; data assimilation techniques may require additional ensembles and repeated simulations. The computational demand is steadily increasing and data streams become bigger. Using HPC resources is a logical step; in different settings Delft-FEWS has been configured to take advantage of distributed computational resources available to improve and accelerate the forecasting process (e.g. Montanari et al, 2006). We will illustrate the system by means of a couple of practical applications including the real-time dynamic forecasting of wind driven waves, flow of water, and wave overtopping at dikes of Lake IJssel and neighboring lakes in the center of The Netherlands. Montanari et al., 2006. Development of an ensemble flood forecasting system for the Po river basin, First MAP D-PHASE Scientific Meeting, 6-8 November 2006, Vienna, Austria.

  20. Probability tales

    CERN Document Server

    Grinstead, Charles M; Snell, J Laurie

    2011-01-01

    This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.

  1. Forecasting metal prices: Do forecasters herd?

    DEFF Research Database (Denmark)

    Pierdzioch, C.; Rulke, J. C.; Stadtmann, G.

    2013-01-01

    We analyze more than 20,000 forecasts of nine metal prices at four different forecast horizons. We document that forecasts are heterogeneous and report that anti-herding appears to be a source of this heterogeneity. Forecaster anti-herding reflects strategic interactions among forecasters...

  2. Predictive Uncertainty Estimation in Water Demand Forecasting Using the Model Conditional Processor

    Directory of Open Access Journals (Sweden)

    Amos O. Anele

    2018-04-01

    In a previous paper, a number of potential models for short-term water demand (STWD) prediction were analysed to find the ones with the best fit. The results obtained in Anele et al. (2017) showed that hybrid models may be considered accurate and appropriate forecasting models for STWD prediction. However, such a best single-valued forecast does not guarantee reliable and robust decisions, which can be properly obtained via model uncertainty processors (MUPs). MUPs provide an estimate of the full predictive densities and not only the single-valued expected prediction. Amongst other MUPs, the purpose of this paper is to use the multi-variate version of the model conditional processor (MCP), proposed by Todini (2008), to demonstrate how the estimation of the predictive probability conditional on a number of relatively good predictive models may improve our knowledge, thus reducing the predictive uncertainty (PU) when forecasting into the unknown future. Through the MCP approach, the probability distribution of the future water demand can be assessed depending on the forecast provided by one or more deterministic forecasting models. Based on average weekly data of 168 h, the probability density of the future demand is built conditional on three models' predictions, namely the autoregressive moving average (ARMA), feed-forward back-propagation neural network (FFBP-NN) and a hybrid model (i.e., the combined forecast from ARMA and FFBP-NN). The results obtained show that the MCP may be effectively used for real-time STWD prediction since it brings out the PU connected to its forecast, and such information could help water utilities estimate the risk connected to a decision.
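The conditioning step at the heart of the MCP can be illustrated in the simplest univariate case: under a joint-normal assumption (in normal space), the predictive distribution of the observation given one model's forecast is the standard conditional Gaussian. A single-model sketch of the idea, not Todini's multi-variate implementation:

```python
import math

def conditional_gaussian(mu_y, sd_y, mu_x, sd_x, rho, x_forecast):
    """Mean and standard deviation of Y | X = x for jointly normal
    (Y, X) with correlation rho: the predictive density of the
    observed demand Y given a model forecast X."""
    cond_mean = mu_y + rho * sd_y / sd_x * (x_forecast - mu_x)
    cond_sd = sd_y * math.sqrt(1.0 - rho ** 2)
    return cond_mean, cond_sd

# Standardised example: a forecast 2 sd above its mean, correlation 0.8.
cond_mean, cond_sd = conditional_gaussian(0.0, 1.0, 0.0, 1.0, 0.8, 2.0)
```

Note how the predictive spread shrinks from 1.0 to 0.6: the better the model correlates with the observations, the more the predictive uncertainty is reduced, which is exactly the PU-reduction the record describes.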

  3. Hybrid PSO–SVM-based method for forecasting of the remaining useful life for aircraft engines and evaluation of its reliability

    International Nuclear Information System (INIS)

    García Nieto, P.J.; García-Gonzalo, E.; Sánchez Lasheras, F.; Cos Juez, F.J. de

    2015-01-01

    The present paper describes a hybrid PSO–SVM-based model for the prediction of the remaining useful life of aircraft engines. The proposed hybrid model combines support vector machines (SVMs), which have been successfully adopted for regression problems, with the particle swarm optimization (PSO) technique. This optimization technique involves kernel parameter setting in the SVM training procedure, which significantly influences the regression accuracy. However, its use in reliability applications has not yet been widely explored. Bearing this in mind, remaining useful life values have been successfully predicted here using the hybrid PSO–SVM-based model from the remaining measured parameters (input variables) for aircraft engines. A coefficient of determination equal to 0.9034 was obtained when this hybrid PSO–RBF–SVM-based model was applied to experimental data. The agreement of this model with the experimental data confirmed its good performance. One of the main advantages of this predictive model is that it does not require information about the previous operating states of the engine. Finally, the main conclusions of this study are presented. - Highlights: • A hybrid PSO–SVM-based model is built as a predictive model of the RUL values for aircraft engines. • The remaining physical–chemical variables in this process are studied in depth. • The obtained regression accuracy of our method is about 95%. • The results show that PSO–SVM-based model can assist in the diagnosis of the RUL values with accuracy
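The PSO half of the hybrid can be sketched as a standalone optimiser. Here a generic objective function stands in for the SVM cross-validation error being minimised over kernel parameters; the swarm constants and the toy optimum are illustrative assumptions, not the paper's settings:

```python
import random

def pso_minimize(f, bounds, n_particles=20, iters=60, seed=0):
    """Minimal particle swarm optimiser: each particle tracks its own
    best position, the swarm tracks the global best, and velocities mix
    inertia with pulls toward both."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=pbest_val.__getitem__)
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia and acceleration constants
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]),
                                bounds[d][1])
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Stand-in for SVM parameter tuning: recover a known optimum (C*, g*) = (1.0, 0.1).
best, val = pso_minimize(lambda p: (p[0] - 1.0) ** 2 + (p[1] - 0.1) ** 2,
                         [(0.0, 10.0), (0.0, 1.0)])
```

In the hybrid model, `f` would train an SVM with candidate kernel parameters and return its validation error, so the swarm converges on the best-performing kernel setting.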

  4. Novel methodology for pharmaceutical expenditure forecast

    OpenAIRE

    Vataire, Anne-Lise; Cetinsoy, Laurent; Aballéa, Samuel; Rémuzat, Cécile; Urbinati, Duccio; Kornfeld, Åsa; Mzoughi, Olfa; Toumi, Mondher

    2014-01-01

    Background and objective: The value appreciation of new drugs across countries today features a disruption that renders the historical data used for forecasting pharmaceutical expenditure unreliable. Forecasting methods have rarely addressed uncertainty. The objective of this project was to propose a methodology to perform pharmaceutical expenditure forecasting that integrates expected policy changes and uncertainty (developed for the European Commission as the ‘EU Pharmaceutical e...

  5. SHORT-TERM FORECASTING OF MORTGAGE LENDING

    Directory of Open Access Journals (Sweden)

    Irina V. Orlova

    2013-01-01

    Full Text Available The article considers the methodological and algorithmic problems arising in the modeling and forecasting of time series of mortgage loans. It focuses on the processes that form the levels of mortgage-loan time series and on the problem of model choice and identification under small-sample conditions. For forecasting, an autoregressive moving average model was selected and implemented, which made it possible to obtain reliable forecasts.
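The abstract names an autoregressive moving average model. As a minimal illustration of the autoregressive part only (not the authors' full model or identification procedure), an AR(1) process x_t = c + phi·x_(t-1) can be fitted by least squares and iterated forward:

```python
def fit_ar1(series):
    """Least-squares fit of x_t = c + phi * x_(t-1), a minimal AR(1) model."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    phi = (sum((a - mx) * (b - my) for a, b in zip(x, y))
           / sum((a - mx) ** 2 for a in x))
    c = my - phi * mx
    return c, phi

def forecast_ar1(series, c, phi, steps):
    """Iterate the fitted recursion forward from the last observation."""
    out, last = [], series[-1]
    for _ in range(steps):
        last = c + phi * last
        out.append(last)
    return out
```

For a doubling series such as [1, 2, 4, 8, 16] the fit recovers phi = 2, c = 0, and the forecasts continue the doubling.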

  6. [Combined forecasting system of peritonitis outcome].

    Science.gov (United States)

    Lebedev, N V; Klimov, A E; Agrba, S B; Gaidukevich, E K

    To create a reliable system for assessing the severity and predicting the outcome of peritonitis. A critical analysis of the systems for peritonitis severity assessment is presented. The study included outcomes of 347 patients who were admitted to the Department of Faculty Surgery of Peoples' Friendship University of Russia in 2015-2016. The causes of peritonitis were destructive forms of acute appendicitis, cholecystitis, perforated gastroduodenal ulcer, and various perforations of the small and large intestines (including tumor). A combined forecasting system for peritonitis severity assessment was created. The system includes clinical and laboratory data, assessment of the systemic inflammatory response (SIRS) and severity of organ failure (qSOFA). The authors focused on easily identifiable parameters which are available in virtually any surgical hospital. The threshold value (lethal outcome probability over 50%) is 8 points in this system. Sensitivity, specificity and accuracy were 93.3, 99.7 and 98.9%, respectively, according to ROC-curve analysis, which exceeds the corresponding values for MPI and APACHE II.

  7. PyForecastTools

    Energy Technology Data Exchange (ETDEWEB)

    2017-09-22

    The PyForecastTools package provides Python routines for calculating metrics for model validation, forecast verification and model comparison. For continuous predictands the package provides functions for calculating bias (mean error, mean percentage error, median log accuracy, symmetric signed bias), and for calculating accuracy (mean squared error, mean absolute error, mean absolute scaled error, normalized RMSE, median symmetric accuracy). Convenience routines to calculate the component parts (e.g. forecast error, scaled error) of each metric are also provided. To compare models, the package provides a generic skill score and a percent-better measure. Robust measures of scale, including median absolute deviation, robust standard deviation, robust coefficient of variation and the Sn estimator, are all provided by the package. Finally, the package implements Python classes for NxN contingency tables. In the case of a multi-class prediction, accuracy and skill metrics such as proportion correct and the Heidke and Peirce skill scores are provided as object methods. The special case of a 2x2 contingency table inherits from the NxN class and provides many additional metrics for binary classification: probability of detection, probability of false detection, false alarm ratio, threat score, equitable threat score, and bias. Confidence intervals for many of these quantities can be calculated using either the Wald method or Agresti-Coull intervals.
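For the 2x2 case, the binary-classification metrics listed above have closed forms in the four table counts. A self-contained sketch of the standard definitions (independent of the PyForecastTools API, whose class and method names are not shown in this record):

```python
def binary_metrics(hits, misses, false_alarms, correct_negatives):
    """Standard scores for a 2x2 (yes/no) forecast contingency table."""
    a, c, b, d = hits, misses, false_alarms, correct_negatives
    n = a + b + c + d
    pod = a / (a + c)                 # probability of detection
    pofd = b / (b + d)                # probability of false detection
    far = b / (a + b)                 # false alarm ratio
    threat = a / (a + b + c)          # threat score (critical success index)
    bias = (a + b) / (a + c)          # frequency bias
    # Heidke skill score: proportion correct relative to chance agreement
    chance = ((a + b) * (a + c) + (c + d) * (b + d)) / n
    hss = (a + d - chance) / (n - chance)
    return {"POD": pod, "POFD": pofd, "FAR": far,
            "TS": threat, "bias": bias, "HSS": hss}
```

A perfect forecast gives POD = 1, POFD = FAR = 0, and HSS = 1; a forecast no better than chance gives HSS = 0.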

  8. Bulk electric system reliability evaluation incorporating wind power and demand side management

    Science.gov (United States)

    Huang, Dange

    Electric power systems are experiencing dramatic changes with respect to structure, operation and regulation and are facing increasing pressure due to environmental and societal constraints. Bulk electric system reliability is an important consideration in power system planning, design and operation, particularly in the new competitive environment. A wide range of methods have been developed to perform bulk electric system reliability evaluation. Theoretically, sequential Monte Carlo simulation can include all aspects and contingencies in a power system and can be used to produce an informative set of reliability indices. It has become a practical and viable tool for large-system reliability assessment due to the growth of computing power, and it is used in the studies described in this thesis. The well-being approach used in this research provides the opportunity to integrate an accepted deterministic criterion into a probabilistic framework. This research work includes the investigation of important factors that impact bulk electric system adequacy evaluation and security-constrained adequacy assessment using the well-being analysis framework. Load forecast uncertainty is an important consideration in an electrical power system. This research incorporates load forecast uncertainty in bulk electric system reliability assessment, and the effects on system, load point and well-being indices and on reliability index probability distributions are examined. There has been increasing worldwide interest in the utilization of wind power as a renewable energy source over the last two decades due to enhanced public awareness of the environment. Increasing penetration of wind power has significant impacts on power system reliability, and security analyses become more uncertain due to the unpredictable nature of wind power. The effects of wind power additions in generating and bulk electric system reliability assessment considering site wind speed

  9. Stochastic model of forecasting spare parts demand

    Directory of Open Access Journals (Sweden)

    Ivan S. Milojević

    2012-01-01

    hypothesis of the existence of phenomenon change trends, the next step in the methodology of forecasting is the determination of a specific growth curve that describes the regularity of the development in time. These curves of growth are obtained by the analytical representation (expression) of dynamic lines. There are two basic stages in the process of expression: - the choice of the type of curve whose shape corresponds to the character of the dynamic order variation, and - the determination of the values (evaluation) of the curve parameters. The most widespread method of forecasting is trend extrapolation. The basis of trend extrapolation is the continuation of past trends into the future. The simplicity of the trend extrapolation process, on the one hand, and the absence of other information, on the other hand, are the main reasons why trend extrapolation is used for forecasting. Trend extrapolation is founded on the following assumptions: - the phenomenon development can be presented as an evolutionary trajectory or trend, and - the general conditions that influenced the trend development in the past will not undergo substantial changes in the future. Spare parts demand forecasting is constantly being done in all warehouses, workshops, and at all levels. Without demand forecasting, neither planning nor decision making can be done. Demand forecasting is the input for determining the level of reserve, size of the order, ordering cycles, etc. The question that arises is that of the reliability and accuracy of a forecast and its effects. Forecasting 'by feeling' is not to be dismissed if there is nothing better, but in this case, one must be prepared for forecasting failures that cause unnecessary accumulation of certain spare parts, and also a chronic shortage of other spare parts. All this significantly increases costs and does not provide a satisfactory supply of spare parts.
The main problem of the application of this model is that each
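Trend extrapolation as described in this record, in its simplest linear form, amounts to fitting y_t = a + b·t by ordinary least squares and extending the fitted line past the observed range. A minimal sketch (illustrative only, not the growth-curve model from the article):

```python
def linear_trend(values):
    """Ordinary least squares for y_t = a + b*t, with t = 0..n-1."""
    n = len(values)
    mt = (n - 1) / 2.0                     # mean of t = 0..n-1
    mv = sum(values) / n
    b = (sum((t - mt) * (v - mv) for t, v in enumerate(values))
         / sum((t - mt) ** 2 for t in range(n)))
    a = mv - b * mt
    return a, b

def extrapolate(values, steps):
    """Extend the fitted trend line beyond the last observation."""
    a, b = linear_trend(values)
    n = len(values)
    return [a + b * (n + k) for k in range(steps)]
```

For demand history [2, 4, 6, 8] the fit gives a = 2, b = 2, and the next two extrapolated values are 10 and 12; the assumption, as stated above, is that the past trend continues unchanged.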

  10. Operational foreshock forecasting: Fifteen years after

    Science.gov (United States)

    Ogata, Y.

    2010-12-01

    We are concerned with operational forecasting of the probability that events are foreshocks of a forthcoming earthquake that is significantly larger (mainshock). Specifically, we define foreshocks as preshocks substantially smaller than the mainshock, by a magnitude gap of 0.5 or larger. The probability gain of a foreshock forecast is extremely high compared to long-term forecasts by renewal processes or various alarm-based intermediate-term forecasts, because of a large event’s low occurrence rate in a short period and a narrow target region. Thus, it is desirable to establish operational foreshock probability forecasting as seismologists have done for aftershocks. When a series of earthquakes occurs in a region, we attempt to discriminate foreshocks from a swarm or a mainshock-aftershock sequence. Namely, after real-time identification of an earthquake cluster using methods such as the single-link algorithm, the probability is calculated by applying statistical features that discriminate foreshocks from other types of clusters, considering the events' stronger proximity in time and space and their tendency towards chronologically increasing magnitudes. These features were modeled for probability forecasting, and the coefficients of the model were estimated in Ogata et al. (1996) for the JMA hypocenter data (M≧4, 1926-1993). Fifteen years have now passed since the publication of the above-stated work, so we are able to present the performance and validation of the forecasts (1994-2009) using the same model. Taking isolated events into consideration, the probabilities of the first events in a potential cluster being foreshocks vary in a range between 0+% and 10+% depending on their locations. This conditional forecasting performs significantly better than the unconditional (average) foreshock probability of 3.7% throughout the Japan region. Furthermore, when we have additional events in a cluster, the forecast probabilities range more widely from nearly 0% to

  11. Bayesian analyses of seasonal runoff forecasts

    Science.gov (United States)

    Krzysztofowicz, R.; Reese, S.

    1991-12-01

    Forecasts of seasonal snowmelt runoff volume provide indispensable information for rational decision making by water project operators, irrigation district managers, and farmers in the western United States. Bayesian statistical models and communication frames have been researched in order to enhance the forecast information disseminated to the users, and to characterize forecast skill from the decision maker's point of view. Four products are presented: (i) a Bayesian Processor of Forecasts, which provides a statistical filter for calibrating the forecasts, and a procedure for estimating the posterior probability distribution of the seasonal runoff; (ii) the Bayesian Correlation Score, a new measure of forecast skill, which is related monotonically to the ex ante economic value of forecasts for decision making; (iii) a statistical predictor of monthly cumulative runoffs within the snowmelt season, conditional on the total seasonal runoff forecast; and (iv) a framing of the forecast message that conveys the uncertainty associated with the forecast estimates to the users. All analyses are illustrated with numerical examples of forecasts for six gauging stations from the period 1971-1988.
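The full Bayesian Processor of Forecasts calibrates forecasts against a climatological prior through a likelihood model. A heavily simplified conjugate normal-normal sketch conveys only the posterior-update step (the distributional assumptions here are illustrative, not Krzysztofowicz's actual processor):

```python
def normal_posterior(prior_mean, prior_var, forecast, forecast_var):
    """Conjugate normal-normal update: combine a climatological prior for
    seasonal runoff with one unbiased, normally distributed forecast."""
    post_var = 1.0 / (1.0 / prior_var + 1.0 / forecast_var)
    post_mean = post_var * (prior_mean / prior_var + forecast / forecast_var)
    return post_mean, post_var
```

With equal prior and forecast variances the posterior mean is the midpoint of the two, and the posterior variance is halved, illustrating how the forecast sharpens the climatological distribution.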

  12. Global-warming forecasting models

    International Nuclear Information System (INIS)

    Moeller, K.P.

    1992-01-01

    In spite of an annual man-made quantity of about 20 billion tons, carbon dioxide has remained a trace gas in the atmosphere (350 ppm at present). The reliability of model calculations which forecast temperatures is discussed in view of the world-wide increase in carbon dioxide. Computer simulations reveal a general, serious threat to the future of mankind. (DG) [de

  13. Quantifying probabilities of eruptions at Mount Etna (Sicily, Italy).

    Science.gov (United States)

    Brancato, Alfonso

    2010-05-01

    One of the major goals of modern volcanology is to set up sound risk-based decision-making in land-use planning and emergency management. Volcanic hazard must be managed with reliable estimates of quantitative long- and short-term eruption forecasting, but the large number of observables involved in a volcanic process suggests that a probabilistic approach could be a suitable tool in forecasting. The aim of this work is to quantify a probabilistic estimate of the vent location for a suitable lava flow hazard assessment at Mt. Etna volcano, through the application of the code named BET (Marzocchi et al., 2004, 2008). The BET_EF model is based on the event tree philosophy assessed by Newhall and Hoblitt (2002), further developing the concept of vent location, epistemic uncertainties, and a fuzzy approach for monitoring measurements. A Bayesian event tree is a specialized branching graphical representation of events in which individual branches are alternative steps from a general prior event, evolving into increasingly specific subsequent states. The event tree thus attempts to graphically display all relevant possible outcomes of volcanic unrest in progressively higher levels of detail. The procedure is set to estimate an a priori probability distribution based upon theoretical knowledge, to accommodate it by using past data, and to modify it further by using current monitoring data. For long-term forecasting, an a priori model dealing with the present tectonic and volcanic structure of Mt. Etna is considered. The model is mainly based on past vent location and fracture location datasets (the 20th-century eruptive history of the volcano). Considering the variation of the information through time, and its relationship with the structural setting of the volcano, we are also able to define an a posteriori probability map for the next vent opening.
For short-term forecasting vent opening hazard assessment, the monitoring has a leading role, primarily

  14. Forecasting effects of global warming on biodiversity

    DEFF Research Database (Denmark)

    Botkin, D.B.; Saxe, H.; Araújo, M.B.

    2007-01-01

    The demand for accurate forecasting of the effects of global warming on biodiversity is growing, but current methods for forecasting have limitations. In this article, we compare and discuss the different uses of four forecasting methods: (1) models that consider species individually, (2) niche...... and theoretical ecological results suggest that many species could be at risk from global warming, during the recent ice ages surprisingly few species became extinct. The potential resolution of this conundrum gives insights into the requirements for more accurate and reliable forecasting. Our eight suggestions...

  15. Monthly streamflow forecasting based on hidden Markov model and Gaussian Mixture Regression

    Science.gov (United States)

    Liu, Yongqi; Ye, Lei; Qin, Hui; Hong, Xiaofeng; Ye, Jiajun; Yin, Xingli

    2018-06-01

    Reliable streamflow forecasts can be highly valuable for water resources planning and management. In this study, we combined a hidden Markov model (HMM) and Gaussian Mixture Regression (GMR) for probabilistic monthly streamflow forecasting. The HMM is initialized using a kernelized K-medoids clustering method, and the Baum-Welch algorithm is then executed to learn the model parameters. GMR derives a conditional probability distribution for the predictand given covariate information, including the antecedent flow at a local station and two surrounding stations. The performance of HMM-GMR was verified based on the mean square error and continuous ranked probability score skill scores. The reliability of the forecasts was assessed by examining the uniformity of the probability integral transform values. The results show that HMM-GMR obtained reasonably high skill scores and the uncertainty spread was appropriate. Different HMM states were assumed to be different climate conditions, which would lead to different types of observed values. We demonstrated that the HMM-GMR approach can handle multimodal and heteroscedastic data.
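The GMR step, deriving a conditional distribution of the predictand given a covariate under a Gaussian mixture, can be sketched in the bivariate case. The component parameters below are hypothetical placeholders, and the HMM state-estimation part of the paper's method is omitted entirely:

```python
import math

def gmr_conditional(x, components):
    """Conditional mean and variance of y given x under a mixture of
    bivariate Gaussians. Each component is (weight, mx, my, sx, sy, rho)."""
    resp, cond = [], []
    for w, mx, my, sx, sy, rho in components:
        # responsibility: component weight times the marginal density of x
        px = math.exp(-0.5 * ((x - mx) / sx) ** 2) / (sx * math.sqrt(2 * math.pi))
        resp.append(w * px)
        m = my + rho * sy / sx * (x - mx)   # conditional mean of y | x
        v = (1.0 - rho ** 2) * sy ** 2      # conditional variance of y | x
        cond.append((m, v))
    total = sum(resp)
    resp = [r / total for r in resp]
    # moments of the mixture of conditionals
    mean = sum(r * m for r, (m, _) in zip(resp, cond))
    var = sum(r * (v + m * m) for r, (m, v) in zip(resp, cond)) - mean ** 2
    return mean, var
```

Because the output is a mixture of conditionals rather than a single Gaussian, this construction can represent the multimodal and heteroscedastic behaviour the abstract mentions.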

  16. A methodology for Electric Power Load Forecasting

    Directory of Open Access Journals (Sweden)

    Eisa Almeshaiei

    2011-06-01

    Full Text Available Electricity demand forecasting is a central and integral process for planning periodical operations and facility expansion in the electricity sector. Demand pattern is almost very complex due to the deregulation of energy markets. Therefore, finding an appropriate forecasting model for a specific electricity network is not an easy task. Although many forecasting methods were developed, none can be generalized for all demand patterns. Therefore, this paper presents a pragmatic methodology that can be used as a guide to construct Electric Power Load Forecasting models. This methodology is mainly based on decomposition and segmentation of the load time series. Several statistical analyses are involved to study the load features and forecasting precision such as moving average and probability plots of load noise. Real daily load data from Kuwaiti electric network are used as a case study. Some results are reported to guide forecasting future needs of this network.

  17. Magnetogram Forecast: An All-Clear Space Weather Forecasting System

    Science.gov (United States)

    Barghouty, Nasser; Falconer, David

    2015-01-01

    Solar flares and coronal mass ejections (CMEs) are the drivers of severe space weather. Forecasting the probability of their occurrence is critical in improving space weather forecasts. The National Oceanic and Atmospheric Administration (NOAA) currently uses the McIntosh active region category system, in which each active region on the disk is assigned to one of 60 categories, and uses the historical flare rates of that category to make an initial forecast that can then be adjusted by the NOAA forecaster. Flares and CMEs are caused by the sudden release of energy from the coronal magnetic field by magnetic reconnection. It is believed that the rate of flare and CME occurrence in an active region is correlated with the free energy of an active region. While the free energy cannot be measured directly with present observations, proxies of the free energy can instead be used to characterize the relative free energy of an active region. The Magnetogram Forecast (MAG4) (output is available at the Community Coordinated Modeling Center) was conceived and designed to be a data-based, all-clear forecasting system to support the operational goals of NASA's Space Radiation Analysis Group. The MAG4 system automatically downloads near-real-time line-of-sight Helioseismic and Magnetic Imager (HMI) magnetograms from the Solar Dynamics Observatory (SDO) satellite, identifies active regions on the solar disk, measures a free-energy proxy, and then applies forecasting curves to convert the free-energy proxy into predicted event rates for X-class flares, M- and X-class flares, CMEs, fast CMEs, and solar energetic particle events (SPEs). The forecast curves themselves are derived from a sample of 40,000 magnetograms from 1,300 active region samples, observed by the Solar and Heliospheric Observatory Michelson Doppler Imager. Figure 1 is an example of MAG4 visual output

  18. Predictor-weighting strategies for probabilistic wind power forecasting with an analog ensemble

    Directory of Open Access Journals (Sweden)

    Constantin Junk

    2015-04-01

    Full Text Available Unlike deterministic forecasts, probabilistic predictions provide estimates of uncertainty, which is an additional value for decision-making. Previous studies have proposed the analog ensemble (AnEn), which is a technique to generate uncertainty information from a purely deterministic forecast. The objective of this study is to improve the AnEn performance for wind power forecasts by developing static and dynamic weighting strategies, which optimize the predictor combination with a brute-force continuous ranked probability score (CRPS) minimization and a principal component analysis (PCA) of the predictors. Predictors are taken from the high-resolution deterministic forecasts of the European Centre for Medium-Range Weather Forecasts (ECMWF), including forecasts of wind at several heights, geopotential height, pressure, and temperature, among others. The weighting strategies are compared at five wind farms in Europe and the U.S. situated in regions with different terrain complexity, both onshore and offshore, and significantly improve the deterministic and probabilistic AnEn forecast performance compared to the AnEn with 10-m wind speed and direction as predictors and compared to PCA-based approaches. The AnEn methodology also provides reliable estimation of the forecast uncertainty. The optimized predictor combinations are strongly dependent on terrain complexity, local wind regimes, and atmospheric stratification. Since the proposed predictor-weighting strategies can accomplish both the selection of relevant predictors and the finding of their optimal weights, the AnEn performance is improved by up to 20% at onshore and offshore sites.

  19. Web-Based Real Time Earthquake Forecasting and Personal Risk Management

    Science.gov (United States)

    Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Turcotte, D. L.; Donnellan, A.

    2012-12-01

    Earthquake forecasts have been computed by a variety of countries and economies world-wide for over two decades. For the most part, forecasts have been computed for insurance, reinsurance and underwriters of catastrophe bonds. One example is the Working Group on California Earthquake Probabilities, which has been responsible for the official California earthquake forecast since 1988. However, in a time of increasingly severe global financial constraints, we are now moving inexorably towards personal risk management, wherein mitigating risk is becoming the responsibility of individual members of the public. Under these circumstances, open access to a variety of web-based tools, utilities and information is a necessity. Here we describe a web-based system that has been operational since 2009 at www.openhazards.com and www.quakesim.org. Models for earthquake physics and forecasting require input data, along with model parameters. The models we consider are the Natural Time Weibull (NTW) model for regional earthquake forecasting, together with models for activation and quiescence. These models use small earthquakes ("seismicity-based models") to forecast the occurrence of large earthquakes, either through varying rates of small earthquake activity, or via an accumulation of this activity over time. These approaches use data-mining algorithms combined with the ANSS earthquake catalog. The basic idea is to compute large earthquake probabilities using the number of small earthquakes that have occurred in a region since the last large earthquake. Each of these approaches has computational challenges associated with computing forecast information in real time. Using 25 years of data from the ANSS California-Nevada catalog of earthquakes, we show that real-time forecasting is possible at a grid scale of 0.1°. We have analyzed the performance of these models using Reliability/Attributes and standard Receiver Operating Characteristic (ROC) tests.
We show how the Reliability and

  20. Exploiting Domain Knowledge to Forecast Heating Oil Consumption

    Science.gov (United States)

    Corliss, George F.; Sakauchi, Tsuginosuke; Vitullo, Steven R.; Brown, Ronald H.

    2011-11-01

    The GasDay laboratory at Marquette University provides forecasts of energy consumption. One such service is the Heating Oil Forecaster, a service for a heating oil or propane delivery company. Accurate forecasts can help reduce the number of trucks and drivers while providing efficient inventory management by stretching the time between deliveries. Accurate forecasts help retain valuable customers. If a customer runs out of fuel, the delivery service incurs costs for an emergency delivery and often a service call. Further, the customer probably changes providers. The basic modeling is simple: fit delivery amounts s_k to cumulative Heating Degree Days (HDD_k = Σ max(0, 60 °F - daily average temperature)), with wind adjustment, for each delivery period: s_k ≈ ŝ_k = β0 + β1 HDD_k. For the first few deliveries, there is not enough data to provide a reliable estimate K = 1/β1, so we use Bayesian techniques with priors constructed from historical data. A fresh model is trained for each customer with each delivery, producing daily consumption forecasts using actual and forecast weather until the next delivery. In practice, a delivery may not fill the oil tank if the delivery truck runs out of oil or the automatic shut-off activates prematurely. Special outlier detection and recovery based on domain knowledge addresses this and other special cases. The error at each delivery is the difference between that delivery and the aggregate of daily forecasts using actual weather since the preceding delivery. Out-of-sample testing yields MAPE = 21.2% and an average error of 6.0% of tank capacity for Company A. The MAPE and average error as a percentage of tank capacity for Company B are 31.5% and 6.6%, respectively. One heating oil delivery company that uses this forecasting service [1] reported that instances of a customer running out of oil were reduced from about 250 in 50,000 deliveries per year before contracting for our service to about 10 with our service. They delivered slightly more
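The delivery-period model quoted in this record can be sketched directly: accumulate heating degree days over each period, then fit s_k = β0 + β1·HDD_k by least squares. The numbers below are illustrative; the production service additionally applies wind adjustment, Bayesian priors for new customers, and outlier handling:

```python
def heating_degree_days(daily_avg_temps_f, base=60.0):
    """Cumulative HDD for a delivery period: sum of max(0, base - T)."""
    return sum(max(0.0, base - t) for t in daily_avg_temps_f)

def fit_consumption(hdd_per_period, deliveries):
    """Ordinary least squares for s_k = b0 + b1 * HDD_k."""
    n = len(hdd_per_period)
    mh = sum(hdd_per_period) / n
    ms = sum(deliveries) / n
    b1 = (sum((h - mh) * (s - ms) for h, s in zip(hdd_per_period, deliveries))
          / sum((h - mh) ** 2 for h in hdd_per_period))
    b0 = ms - b1 * mh
    return b0, b1
```

The fitted 1/β1 plays the role of the heat efficiency K mentioned above: gallons consumed per degree day of heating demand.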

  1. Uncertainty Forecasts Improve Weather-Related Decisions and Attenuate the Effects of Forecast Error

    Science.gov (United States)

    Joslyn, Susan L.; LeClerc, Jared E.

    2012-01-01

    Although uncertainty is inherent in weather forecasts, explicit numeric uncertainty estimates are rarely included in public forecasts for fear that they will be misunderstood. Of particular concern are situations in which precautionary action is required at low probabilities, often the case with severe events. At present, a categorical weather…

  2. Probability for Weather and Climate

    Science.gov (United States)

    Smith, L. A.

    2013-12-01

    Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of

  3. Ensemble forecasting using sequential aggregation for photovoltaic power applications

    International Nuclear Information System (INIS)

    Thorey, Jean

    2017-01-01

    Our main objective is to improve the quality of photovoltaic power forecasts derived from weather forecasts. Such forecasts are imperfect due to meteorological uncertainties and statistical modeling inaccuracies in the conversion of weather forecasts to power forecasts. First, we gather several weather forecasts; second, we generate multiple photovoltaic power forecasts; and finally, we build linear combinations of the power forecasts. The minimization of the Continuous Ranked Probability Score (CRPS) allows the combination of these forecasts to be statistically calibrated, and provides probabilistic forecasts in the form of a weighted empirical distribution function. We investigate the CRPS bias in this context and several properties of scoring rules which can be seen as a sum of quantile-weighted losses or a sum of threshold-weighted losses. The minimization procedure is achieved with online learning techniques. Such techniques come with theoretical guarantees of robustness on the predictive power of the combination of the forecasts. Essentially no assumptions are needed for the theoretical guarantees to hold. The proposed methods are applied to the forecast of solar radiation using satellite data, and the forecast of photovoltaic power based on high-resolution weather forecasts and standard ensembles of forecasts. (author) [fr
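For a discrete ensemble of m members, the CRPS against a scalar observation has the standard empirical form E|X - y| - 0.5 E|X - X'|. A minimal sketch of the score itself (the online-learning weight updates used in the thesis are not shown):

```python
def crps_ensemble(members, obs):
    """Empirical CRPS of an ensemble forecast against a scalar observation:
    mean absolute error of the members minus half the mean pairwise spread."""
    m = len(members)
    term1 = sum(abs(x - obs) for x in members) / m
    term2 = sum(abs(a - b) for a in members for b in members) / (2.0 * m * m)
    return term1 - term2
```

For a single-member "ensemble" the spread term vanishes and the CRPS reduces to the absolute error, which is why the CRPS is a natural generalization of deterministic error to probabilistic forecasts.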

  4. Generalized Probability Functions

    Directory of Open Access Journals (Sweden)

    Alexandre Souto Martinez

    2009-01-01

    Full Text Available From the integration of nonsymmetrical hyperbolas, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by mathematical curiosity, we show that these generalized functions are suitable to generalize some probability density functions (pdfs). A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn our attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, and the generalized Gaussian and Laplace pdfs. Their cumulative functions and moments were also obtained analytically.
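A common one-parameter generalization of the logarithm consistent with the construction described is ln_q(x) = (x^q - 1)/q, with inverse exp_q(x) = (1 + q·x)^(1/q), both recovering ln and exp as q tends to 0. This form is an assumption for illustration; the paper's exact normalization may differ:

```python
import math

def gen_log(x, q):
    """One-parameter generalized logarithm: (x**q - 1)/q; -> ln(x) as q -> 0."""
    return math.log(x) if q == 0 else (x ** q - 1.0) / q

def gen_exp(x, q):
    """Inverse of gen_log: (1 + q*x)**(1/q); -> exp(x) as q -> 0."""
    return math.exp(x) if q == 0 else (1.0 + q * x) ** (1.0 / q)
```

At q = 1 the generalized logarithm reduces to x - 1, and for q between 0 and 1 it interpolates between logarithmic and linear growth, which is what allows the pair to interpolate between exponential and power-law (Zipf-Mandelbrot-like) tails.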

  5. Principles of Bridge Reliability

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle; Nowak, Andrzej S.

    The paper gives a brief introduction to the basic principles of structural reliability theory and its application to bridge engineering. Fundamental concepts like failure probability and reliability index are introduced. Ultimate as well as serviceability limit states for bridges are formulated......, and as an example the reliability profile and a sensitivity analysis for a corroded reinforced concrete bridge are shown....
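The failure probability and reliability index mentioned above are linked through the standard normal distribution, P_f = Φ(-β). A minimal sketch of the conversion in both directions:

```python
from statistics import NormalDist

def failure_probability(beta):
    """P_f = Phi(-beta): failure probability from the reliability index."""
    return NormalDist().cdf(-beta)

def reliability_index(pf):
    """beta = -Phi^{-1}(P_f): inverse of the relation above."""
    return -NormalDist().inv_cdf(pf)
```

For example, β = 0 corresponds to a 50% failure probability, while the target indices common in structural codes (β around 3 to 4) correspond to failure probabilities on the order of 10^-3 to 10^-5.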

  6. Reliability in engineering '87

    International Nuclear Information System (INIS)

    Tuma, M.

    1987-01-01

    The participants heard 51 papers dealing with the reliability of engineering products. Two of the papers were incorporated in INIS, namely ''Reliability comparison of two designs of low pressure regeneration of the 1000 MW unit at the Temelin nuclear power plant'' and ''Use of probability analysis of reliability in designing nuclear power facilities.''(J.B.)

  7. Earthquake focal mechanism forecasting in Italy for PSHA purposes

    Science.gov (United States)

    Roselli, Pamela; Marzocchi, Warner; Mariucci, Maria Teresa; Montone, Paola

    2018-01-01

    In this paper, we put forward a procedure that aims to forecast the focal mechanisms of future earthquakes. One of the primary uses of such forecasts is in probabilistic seismic hazard analysis (PSHA); in fact, aiming to reduce the epistemic uncertainty, most of the newer ground motion prediction equations consider, besides the seismicity rates, the forecast of the focal mechanism of the next large earthquakes as input data. The data set used for this purpose consists of focal mechanisms taken from the latest stress map release for Italy, containing 392 well-constrained solutions for events from 1908 to 2015 with Mw ≥ 4 and depths from 0 down to 40 km. The data set uses polarity focal mechanism solutions up to 1975 (23 events), whereas for 1976-2015 it takes into account only Centroid Moment Tensor (CMT)-like earthquake focal solutions, for data homogeneity. The forecasting model is rooted in the Total Weighted Moment Tensor concept, which weights information from past focal mechanisms evenly distributed in space according to their distance from the spatial cells and their magnitude. Specifically, for each cell of a regular 0.1° × 0.1° spatial grid, the model estimates the probability of observing a normal, reverse, or strike-slip fault plane solution for the next large earthquakes, the expected moment tensor, and the related maximum horizontal stress orientation. These results will be available for the new PSHA model for Italy under development. Finally, to evaluate the reliability of the forecasts, we test them against an independent data set consisting of some of the strongest earthquakes (Mw ≥ 3.9) that occurred during 2016 in different Italian tectonic provinces.
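
    The idea of distance-weighting past mechanisms to get per-cell style probabilities can be sketched as follows. Note this is a hypothetical simplification: the exponential decay kernel and scale `d0` are illustrative stand-ins, not the paper's Total Weighted Moment Tensor weighting, and magnitude weighting is omitted:

```python
import math
from collections import defaultdict

def mechanism_probabilities(events, cell, d0=50.0):
    """Distance-weighted probability of observing each faulting style
    (normal / reverse / strike-slip) at a grid cell.  Each event is
    (x_km, y_km, style); contributions decay as exp(-distance / d0)."""
    weights = defaultdict(float)
    for x, y, style in events:
        d = math.hypot(x - cell[0], y - cell[1])
        weights[style] += math.exp(-d / d0)
    total = sum(weights.values())
    return {style: w / total for style, w in weights.items()}

# Two nearby normal-faulting events dominate a distant reverse one:
events = [(0, 0, "normal"), (10, 0, "normal"), (200, 0, "reverse")]
probs = mechanism_probabilities(events, cell=(0, 0))
print(probs)
```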

  8. Verification of Space Weather Forecasts using Terrestrial Weather Approaches

    Science.gov (United States)

    Henley, E.; Murray, S.; Pope, E.; Stephenson, D.; Sharpe, M.; Bingham, S.; Jackson, D.

    2015-12-01

    The Met Office Space Weather Operations Centre (MOSWOC) provides a range of 24/7 operational space weather forecasts, alerts, and warnings, which provide valuable information on space weather that can degrade electricity grids, radio communications, and satellite electronics. Forecasts issued include arrival times of coronal mass ejections (CMEs), and probabilistic forecasts for flares, geomagnetic storm indices, and energetic particle fluxes and fluences. These forecasts are produced twice daily using a combination of output from models such as Enlil, near-real-time observations, and forecaster experience. Verification of forecasts is crucial for users, researchers, and forecasters to understand the strengths and limitations of the forecasts, and to assess the added value forecasters provide. To this end, the Met Office (in collaboration with Exeter University) has been adapting verification techniques from terrestrial weather, and has been working closely with the International Space Environment Service (ISES) to standardise verification procedures. We will present the results of part of this work, analysing forecast and observed CME arrival times and assessing skill using 2x2 contingency tables. These MOSWOC forecasts can be objectively compared to those produced by the NASA Community Coordinated Modelling Center - a useful benchmark. This approach cannot be taken for the other forecasts, as they are probabilistic and categorical (e.g., geomagnetic storm forecasts give probabilities of exceeding levels from minor to extreme). We will present appropriate verification techniques being developed to address these forecasts, such as the rank probability skill score, and comparison of forecasts against climatology and persistence benchmarks. As part of this, we will outline the use of discrete-time Markov chains to assess and improve the performance of our geomagnetic storm forecasts. We will also discuss work to adapt a terrestrial verification visualisation system to space weather, to help...
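
    The 2x2 contingency-table approach mentioned for CME arrival verification reduces each forecast/observation pair to hits, misses, false alarms, and correct negatives, from which standard scores follow. A minimal sketch using textbook definitions (the abstract does not specify which scores MOSWOC reports, so POD, FAR, and the Heidke skill score are shown as representative examples):

```python
def contingency_scores(hits, misses, false_alarms, correct_negatives):
    """Standard 2x2 contingency-table verification scores for
    deterministic event forecasts such as CME arrival predictions."""
    n = hits + misses + false_alarms + correct_negatives
    pod = hits / (hits + misses)                # probability of detection
    far = false_alarms / (hits + false_alarms)  # false alarm ratio
    # Heidke skill score: accuracy relative to random-chance agreement
    expected = ((hits + misses) * (hits + false_alarms)
                + (correct_negatives + misses)
                * (correct_negatives + false_alarms)) / n
    hss = (hits + correct_negatives - expected) / (n - expected)
    return {"POD": pod, "FAR": far, "HSS": hss}

print(contingency_scores(hits=20, misses=5, false_alarms=10,
                         correct_negatives=65))
```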

  9. kosh Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  10. kpdt Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  11. kewr Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  12. kiso Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  13. kpga Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  14. kbkw Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  15. ktcl Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  16. pgwt Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  17. kpsp Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  18. kbih Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  19. kdnl Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  20. kart Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  1. kilm Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  2. kpne Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  3. kabi Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  4. ptpn Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  5. kblf Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  6. panc Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  7. kpbi Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  8. kgdv Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  9. kcmx Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  10. kdls Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  11. koaj Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  12. krhi Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  13. kbpk Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  14. khuf Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  15. kbpi Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  16. ktrk Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  17. kwmc Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  18. katy Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  19. tjmz Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  20. kdet Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  1. kcxp Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  2. kbur Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  3. krkd Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  4. pawg Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  5. kloz Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  6. kcec Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  7. kdec Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  8. paor Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  9. kavl Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  10. kdrt Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  11. kstl Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  12. kbfi Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  13. khsv Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  14. pafa Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  15. kekn Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  16. tncm Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  17. kith Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  18. kgnv Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  19. ktoi Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  20. kgso Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  1. nstu Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  2. kmgm Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  3. khib Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  4. pavd Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  5. kfar Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  6. kluk Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  7. kwwr Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  8. klse Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  9. ksts Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  10. koth Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  11. kbfl Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  12. ksgf Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  13. kpkb Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  14. krog Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  15. kbjc Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  16. ksea Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  17. kbwi Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  18. kftw Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  19. kpuw Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  20. kabq Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  1. ksny Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  2. khio Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  3. klaf Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  4. kfoe Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  5. ksmx Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  6. kipt Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  7. klch Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  8. kink Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  9. krut Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  10. kbli Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  11. kaoo Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  12. klit Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  13. ktup Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  14. ktop Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  15. klax Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  16. kprc Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  17. katl Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  18. kmcn Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  19. kogb Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  20. kama Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  1. ptkk Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  2. kiwa Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  3. kavp Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  4. kdca Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  5. kbwg Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  6. kdfw Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  7. kssi Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  8. pahn Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  9. ksrq Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  10. kpvd Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  11. kisp Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  12. kttd Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  13. pmdy Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  14. kont Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  15. kyng Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  16. kcwa Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  17. kflg Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  18. krsw Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  19. kmyl Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  20. krbg Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  1. kril Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  2. ksus Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  3. padq Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  4. kbil Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  5. krfd Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  6. kdug Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  7. ktix Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  8. kcod Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  9. kslk Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  10. kgfl Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  11. kguc Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  12. kmlu Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  13. kbff Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  14. ksmn Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  15. kdro Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  16. kmce Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  17. ktpa Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  18. kmot Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  19. kcre Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  20. klws Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  1. kotm Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  2. khqm Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  3. kabr Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  4. klal Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  5. kelp Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  6. kecg Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  7. khbg Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  8. kpbf Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  9. konp Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  10. pkwa Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  11. ktvf Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  12. paga Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  13. khks Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  14. kdsm Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  15. kpsm Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  16. kgrb Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  17. kgmu Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  18. papg Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  19. kbgm Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  20. pamc Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  1. klrd Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  2. ksan Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  3. patk Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  4. kowb Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  5. klru Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  6. kfxe Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  7. kjct Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  8. kcrg Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  9. paaq Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  10. kaex Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  11. klbx Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  12. kmia Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  13. kpit Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  14. kcrw Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  15. paen Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  16. kast Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  17. kuin Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  18. kmht Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  19. kcys Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  20. kflo Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  1. pakn Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  2. pabt Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  3. krdg Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  4. khdn Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  5. kjac Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...

  6. kphx Terminal Aerodrome Forecast

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — TAF (terminal aerodrome forecast or terminal area forecast) is a format for reporting weather forecast information, particularly as it relates to aviation. TAFs are...
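
The TAF records above all describe the same coded reporting format. As a minimal illustration of what "decoding" such a report involves, the sketch below parses one standard TAF wind group (e.g. `27012G20KT`: direction, sustained speed, optional gust). The function name `parse_taf_wind` is our own; this handles only the simplified common case, not the full TAF grammar.

```python
import re

def parse_taf_wind(group):
    """Decode a simplified TAF wind group like '27012G20KT':
    3-digit direction (or VRB), sustained speed in knots, and an
    optional gust speed after 'G'."""
    m = re.fullmatch(r"(\d{3}|VRB)(\d{2,3})(?:G(\d{2,3}))?KT", group)
    if m is None:
        raise ValueError(f"not a TAF wind group: {group!r}")
    direction, speed, gust = m.groups()
    return {
        "direction_deg": None if direction == "VRB" else int(direction),
        "speed_kt": int(speed),
        "gust_kt": int(gust) if gust else None,
    }

# Example: winds from 270 degrees at 12 knots, gusting 20 knots.
wind = parse_taf_wind("27012G20KT")
```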

  7. Knowledge typology for imprecise probabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)

    2002-01-01

    When characterizing the reliability of a complex system, there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data are scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper explores how we might approach elicitation if methods other than probability (e.g., Dempster-Shafer theory or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we consider whether experts have different types of knowledge that may be better characterized in ways other than standard probability theory.
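
The abstract above mentions Dempster-Shafer theory as an alternative to probability for representing expert knowledge. A core operation there is combining two experts' mass functions with Dempster's rule; the sketch below (our own illustration, not from the paper) does this for hypotheses represented as `frozenset`s over a frame of discernment.

```python
from itertools import product

def combine_ds(m1, m2):
    """Dempster's rule of combination for two mass functions.
    Each mass function maps frozenset hypotheses to belief mass;
    masses on non-intersecting hypotheses become conflict, which
    is renormalised away."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    k = 1.0 - conflict
    return {h: w / k for h, w in combined.items()}

# Two experts on a subsystem: one puts mass 0.6 on "fail", the rest
# on the whole frame (ignorance); the other splits 0.5/0.5.
theta = frozenset({"fail", "ok"})
expert1 = {frozenset({"fail"}): 0.6, theta: 0.4}
expert2 = {frozenset({"fail"}): 0.5, theta: 0.5}
fused = combine_ds(expert1, expert2)
```

Note how ignorance (mass on the full frame) is handled natively, which is the kind of expert knowledge the authors suggest standard probability captures poorly.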

  8. Robust forecast comparison

    OpenAIRE

    Jin, Sainan; Corradi, Valentina; Swanson, Norman

    2015-01-01

    Forecast accuracy is typically measured in terms of a given loss function. However, as a consequence of the use of misspecified models in multiple model comparisons, relative forecast rankings are loss function dependent. This paper addresses this issue by using a novel criterion for forecast evaluation which is based on the entire distribution of forecast errors. We introduce the concepts of general-loss (GL) forecast superiority and convex-loss (CL) forecast superiority, and we establish a ...
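
The abstract above argues for comparing forecasters over the entire distribution of forecast errors rather than a single loss function. A simple sketch of that idea (ours, not the paper's estimator) is a first-order stochastic dominance check on the empirical CDFs of absolute errors: if one forecaster's error CDF lies everywhere above the other's, it is superior under any monotone loss.

```python
def ecdf(values, x):
    """Empirical CDF of `values` evaluated at x."""
    return sum(v <= x for v in values) / len(values)

def dominates(errors_a, errors_b, grid=None):
    """True if forecaster A's absolute errors first-order
    stochastically dominate B's: F_A(x) >= F_B(x) at every grid
    point, i.e. A's errors are distributionally no larger than B's,
    so A is weakly better under any monotone loss function."""
    abs_a = [abs(e) for e in errors_a]
    abs_b = [abs(e) for e in errors_b]
    if grid is None:
        grid = sorted(set(abs_a) | set(abs_b))
    return all(ecdf(abs_a, x) >= ecdf(abs_b, x) for x in grid)

# Forecaster A has uniformly smaller errors than B.
a_better = dominates([0.1, -0.2, 0.1], [0.5, -0.6, 0.4])
```

When neither sample dominates, the ranking is genuinely loss-function dependent, which is precisely the situation the paper's GL/CL superiority tests are built to handle formally.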

  9. On the validity of cosmological Fisher matrix forecasts

    International Nuclear Information System (INIS)

    Wolz, Laura; Kilbinger, Martin; Weller, Jochen; Giannantonio, Tommaso

    2012-01-01

    We present a comparison of Fisher matrix forecasts for cosmological probes with Monte Carlo Markov Chain (MCMC) posterior likelihood estimation methods. We analyse the performance of future Dark Energy Task Force (DETF) stage-III and stage-IV dark-energy surveys using supernovae, baryon acoustic oscillations and weak lensing as probes. We concentrate in particular on the dark-energy equation-of-state parameters w_0 and w_a. For purely geometrical probes, and especially when marginalising over w_a, we find considerable disagreement between the two methods, since in this case the Fisher matrix cannot reproduce the highly non-elliptical shape of the likelihood function. More quantitatively, the Fisher method underestimates the marginalised errors for purely geometrical probes by 30%-70%. For cases including structure formation, such as weak lensing, we find that the posterior probability contours from the Fisher matrix estimation are in good agreement with the MCMC contours, with the forecast errors changing only at the 5% level. We then explore non-linear transformations resulting in physically-motivated parameters and investigate whether these parameterisations exhibit Gaussian behaviour. We conclude that for purely geometrical probes and, more generally, in cases where it is not known whether the likelihood is close to Gaussian, the Fisher matrix is not the appropriate tool to produce reliable forecasts.
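
The marginalised errors discussed above come from a standard identity: under the Gaussian approximation, the 1-sigma marginalised error on parameter i is the square root of the i-th diagonal element of the inverse Fisher matrix. A minimal sketch (illustrative values, not the paper's survey Fisher matrices):

```python
import numpy as np

def marginalized_errors(fisher):
    """Marginalised 1-sigma forecast errors from a Fisher matrix F:
    sigma_i = sqrt((F^{-1})_{ii}).  Marginalising over correlated
    parameters inflates the error relative to the conditional
    (all-others-fixed) value 1 / sqrt(F_{ii})."""
    cov = np.linalg.inv(np.asarray(fisher, dtype=float))
    return np.sqrt(np.diag(cov))

# Toy 2-parameter Fisher matrix with correlated parameters.
sigmas = marginalized_errors([[4.0, 2.0],
                              [2.0, 2.0]])
```

The paper's point is that this identity is only as good as the Gaussian (elliptical-contour) approximation, which fails badly for the purely geometrical probes.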

  10. Forecaster Behaviour and Bias in Macroeconomic Forecasts

    OpenAIRE

    Roy Batchelor

    2007-01-01

    This paper documents the presence of systematic bias in the real GDP and inflation forecasts of private sector forecasters in the G7 economies in the years 1990–2005. The data come from the monthly Consensus Economics forecasting service, and bias is measured and tested for significance using parametric fixed effect panel regressions and nonparametric tests on accuracy ranks. We examine patterns across countries and forecasters to establish whether the bias reflects the inefficient use of i...
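
The simplest version of the bias test described above is a t-test on the mean forecast error: under unbiasedness the errors average to zero. The sketch below (our own illustration; the paper's panel regressions add fixed effects across forecasters and countries) computes that t-statistic.

```python
import math

def bias_tstat(forecasts, outcomes):
    """t-statistic for H0: E[forecast error] = 0, where
    error = forecast - outcome.  A large positive |t| signals
    systematic over-prediction; large negative, under-prediction."""
    errors = [f - y for f, y in zip(forecasts, outcomes)]
    n = len(errors)
    mean = sum(errors) / n
    var = sum((e - mean) ** 2 for e in errors) / (n - 1)
    return mean / math.sqrt(var / n)

# A forecaster who consistently over-predicts GDP growth by ~1 point.
t = bias_tstat([2.0, 3.0, 2.0, 3.0], [1.0, 1.5, 1.5, 2.0])
```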

  11. Concerning the justiciability of demand forecasts

    International Nuclear Information System (INIS)

    Nierhaus, M.

    1977-01-01

    This subject plays at present in particular a role in the course of judicial examinations of immediately enforceable orders for the partial construction licences of nuclear power plants. The author distinguishes between three kinds of forecast decisions: 1. Appraising forecast decisions with standards of judgment taken mainly from the fields of art, culture, morality and religion are, according to the author, only legally verifiable to a limited extent. 2. With regard to forecast decisions that are not arguable, e.g. where the future behaviour of persons is concerned, the same should basically apply. 3. In contrast to this, the following is applicable for programmatic, proceedings-like, or creative forecast decisions, in particular in economics: 'An administrative estimation privilege in a prognostic sense with the consequence that the court has to accept the forecast decision which lies within the forecast margins and which cannot be disproved, and that the court may not replace this forecast decision by its own probability judgment. In these cases, administration has the right to create its own forecast standards.' Judicial control in these cases was limited to certain substantive and procedural mistakes made by the administration in the course of forecast decision finding. (orig./HP) [de

  12. Concerning the justiciability of demand forecasts

    Energy Technology Data Exchange (ETDEWEB)

    Nierhaus, M [Koeln Univ. (Germany, F.R.)

    1977-01-01

    This subject plays at present in particular a role in the course of judicial examinations of immediately enforceable orders for the partial construction licences of nuclear power plants. The author distinguishes between three kinds of forecast decisions: 1. Appraising forecast decisions with standards of judgment taken mainly from the fields of art, culture, morality and religion are, according to the author, only legally verifiable to a limited extent. 2. With regard to forecast decisions that are not arguable, e.g. where the future behaviour of persons is concerned, the same should basically apply. 3. In contrast to this, the following is applicable for programmatic, proceedings-like, or creative forecast decisions, in particular in economics: 'An administrative estimation privilege in a prognostic sense with the consequence that the court has to accept the forecast decision which lies within the forecast margins and which cannot be disproved, and that the court may not replace this forecast decision by its own probability judgment. In these cases, administration has the right to create its own forecast standards.' Judicial control in these cases was limited to certain substantive and procedural mistakes made by the administration in the course of forecast decision finding.

  13. Economic assessment of flood forecasts for a risk-averse decision-maker

    Science.gov (United States)

    Matte, Simon; Boucher, Marie-Amélie; Boucher, Vincent; Fortier-Filion, Thomas-Charles

    2017-04-01

    observed values) and in terms of their economic value. This assessment is performed for lead times of one to five days. The three systems are: (1) simple statistically dressed deterministic forecasts, (2) forecasts based on meteorological ensembles and (3) a variant of the latter that also includes an estimation of state variables uncertainty. The comparison takes place on the Montmorency River, a small flood-prone watershed in south central Quebec, Canada. The results show that forecast quality as assessed by well-known tools such as the Continuous Ranked Probability Score or the reliability diagram does not necessarily translate directly into economic value, especially if the decision-maker is not risk-neutral. In addition, results show that the economic value of forecasts for a risk-averse decision-maker is very much influenced by the most extreme members of ensemble forecasts (upper tail of the predictive distributions). This study provides a new basis for further improvement of our comprehension of the complex interactions between forecast uncertainty, risk-aversion and decision-making.
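    The Continuous Ranked Probability Score mentioned above can be estimated for an ensemble forecast from its energy-score form, CRPS = E|X - y| - 0.5 E|X - X'|. A minimal sketch with a hypothetical five-member streamflow ensemble (not data from the study):

```python
import numpy as np

def crps_ensemble(members, obs):
    """Empirical CRPS of an ensemble forecast against one observation,
    using the energy-score form E|X - y| - 0.5 E|X - X'|."""
    members = np.asarray(members, dtype=float)
    term1 = np.mean(np.abs(members - obs))
    term2 = 0.5 * np.mean(np.abs(members[:, None] - members[None, :]))
    return term1 - term2

# Hypothetical 5-member streamflow ensemble (m^3/s) and the observed value.
ens = [120.0, 135.0, 150.0, 160.0, 180.0]
print(crps_ensemble(ens, 148.0))
```

    For a single-member (deterministic) forecast the second term vanishes and the CRPS reduces to the absolute error, which is what makes it a convenient common yardstick for the three systems compared here.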

  14. Weather forecast

    CERN Document Server

    Courtier, P

    1994-02-07

    Weather prediction is performed using a numerical model of the evolution of the atmosphere. The evolution equations are derived from the Navier-Stokes equations for the adiabatic part, but they are greatly complicated by the change of phase of water, the radiation process and the boundary layer. The technique used operationally is described. Weather prediction is an initial value problem, and accurate initial conditions need to be specified. Due to the small number of observations available (10^5) compared to the dimension of the model state variable (10^7), the problem is largely underdetermined. Techniques of optimal control and inverse problems are used and have been adapted to the large dimension of the problem. The atmosphere is a chaotic system; the implication for weather prediction is discussed. Ensemble prediction is used operationally, and the technique for generating initial conditions which lead to a numerical divergence of the subsequent forecasts is described.

  15. Operational 0-3 h probabilistic quantitative precipitation forecasts: Recent performance and potential enhancements

    Science.gov (United States)

    Sokol, Z.; Kitzmiller, D.; Pešice, P.; Guan, S.

    2009-05-01

    The NOAA National Weather Service has maintained an automated, centralized 0-3 h prediction system for probabilistic quantitative precipitation forecasts since 2001. This advective-statistical system (ADSTAT) produces probabilities that rainfall will exceed multiple threshold values up to 50 mm at some location within a 40-km grid box. Operational characteristics and development methods for the system are described. Although development data were stratified by season and time of day, ADSTAT utilizes only a single set of nation-wide equations that relate predictor variables derived from radar reflectivity, lightning, satellite infrared temperatures, and numerical prediction model output to rainfall occurrence. A verification study documented herein showed that the operational ADSTAT reliably models regional variations in the relative frequency of heavy rain events. This was true even in the western United States, where no regional-scale, gridded hourly precipitation data were available during the development period in the 1990s. An effort was recently launched to improve the quality of ADSTAT forecasts by regionalizing the prediction equations and to adapt the model for application in the Czech Republic. We have experimented with incorporating various levels of regional specificity in the probability equations. The geographic localization study showed that in the warm season, regional climate differences and variations in the diurnal temperature cycle have a marked effect on the predictor-predictand relationships, and thus regionalization would lead to better statistical reliability in the forecasts.

  16. Forecasting energy demand and CO{sub 2}-emissions from energy production in the forest industry

    Energy Technology Data Exchange (ETDEWEB)

    Malinen, H

    1998-12-31

    The purpose of this study was to develop new energy forecasting methods for forest industry energy use. Scenarios have been the most commonly used forecasts, but they require a lot of work. The recent scenarios made for the forest industry give a wide range of results, e.g. from 27.8 TWh to 38 TWh for electricity use in 2010. There is a need for simpler and more accurate forecasting methods. The time scale for the study is from 1975 to 2010, i.e. 36 years. The basic data for the study were collected from the period 1975 - 1995 and include wood use, production of main product categories and energy use in the forest industry. The factors affecting energy use at both industry level and mill level are presented. The most probable technology trends which can affect energy production and use and CO{sub 2}-emissions are studied. Recent forecasts for forest industry energy use up to the year 2010 are reviewed and analysed. Three alternative forecasting methods are studied more closely: (a) regression analysis, (b) growth curves and (c) the Delphi method. Total electricity demand, share of purchased electricity, total fuel demand and share of process-based biofuels are estimated for the period 1996 - 2010. The results from the different methods are compared to each other and to the recent scenarios. The comparison covers both the resulting energy use and the usefulness of the methods in practical work. The average energy consumption given by the forecasts for 2010 was 31.6 TWh for electricity and 6.2 Mtoe for fuel. The share of purchased electricity totalled 73 % and process-based fuels 77 %. The corresponding figures for 1995 are 22.8 TWh, 5.5 Mtoe, 64 % and 68 %. All three methods were suitable for forecasting. All the methods required fewer working hours and were easier to use than scenarios. The methods gave results with a smaller deviation than scenarios, e.g. with electricity use in 2010 from

  17. Forecasting energy demand and CO{sub 2}-emissions from energy production in the forest industry

    Energy Technology Data Exchange (ETDEWEB)

    Malinen, H.

    1997-12-31

    The purpose of this study was to develop new energy forecasting methods for forest industry energy use. Scenarios have been the most commonly used forecasts, but they require a lot of work. The recent scenarios made for the forest industry give a wide range of results, e.g. from 27.8 TWh to 38 TWh for electricity use in 2010. There is a need for simpler and more accurate forecasting methods. The time scale for the study is from 1975 to 2010, i.e. 36 years. The basic data for the study were collected from the period 1975 - 1995 and include wood use, production of main product categories and energy use in the forest industry. The factors affecting energy use at both industry level and mill level are presented. The most probable technology trends which can affect energy production and use and CO{sub 2}-emissions are studied. Recent forecasts for forest industry energy use up to the year 2010 are reviewed and analysed. Three alternative forecasting methods are studied more closely: (a) regression analysis, (b) growth curves and (c) the Delphi method. Total electricity demand, share of purchased electricity, total fuel demand and share of process-based biofuels are estimated for the period 1996 - 2010. The results from the different methods are compared to each other and to the recent scenarios. The comparison covers both the resulting energy use and the usefulness of the methods in practical work. The average energy consumption given by the forecasts for 2010 was 31.6 TWh for electricity and 6.2 Mtoe for fuel. The share of purchased electricity totalled 73 % and process-based fuels 77 %. The corresponding figures for 1995 are 22.8 TWh, 5.5 Mtoe, 64 % and 68 %. All three methods were suitable for forecasting. All the methods required fewer working hours and were easier to use than scenarios. The methods gave results with a smaller deviation than scenarios, e.g. with electricity use in 2010 from

  18. Global Grid of Probabilities of Urban Expansion to 2030

    Data.gov (United States)

    National Aeronautics and Space Administration — The Global Grid of Probabilities of Urban Expansion to 2030 presents spatially explicit probabilistic forecasts of global urban land cover change from 2000 to 2030...

  19. A physics-based probabilistic forecasting model for rainfall-induced shallow landslides at regional scale

    Directory of Open Access Journals (Sweden)

    S. Zhang

    2018-03-01

    Full Text Available Conventional outputs of physics-based landslide forecasting models are presented as deterministic warnings by calculating the safety factor (Fs of potentially dangerous slopes. However, these models are highly dependent on variables such as cohesion force and internal friction angle which are affected by a high degree of uncertainty especially at a regional scale, resulting in unacceptable uncertainties of Fs. Under such circumstances, the outputs of physical models are more suitable if presented in the form of landslide probability values. In order to develop such models, a method to link the uncertainty of soil parameter values with landslide probability is devised. This paper proposes the use of Monte Carlo methods to quantitatively express uncertainty by assigning random values to physical variables inside a defined interval. The inequality Fs < 1 is tested for each pixel in n simulations which are integrated in a unique parameter. This parameter links the landslide probability to the uncertainties of soil mechanical parameters and is used to create a physics-based probabilistic forecasting model for rainfall-induced shallow landslides. The prediction ability of this model was tested in a case study, in which simulated forecasting of landslide disasters associated with heavy rainfalls on 9 July 2013 in the Wenchuan earthquake region of Sichuan province, China, was performed. The proposed model successfully forecasted landslides in 159 of the 176 disaster points registered by the geo-environmental monitoring station of Sichuan province. Such testing results indicate that the new model can be operated in a highly efficient way and show more reliable results, attributable to its high prediction accuracy. Accordingly, the new model can be potentially packaged into a forecasting system for shallow landslides providing technological support for the mitigation of these disasters at regional scale.
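    The Monte Carlo step described above can be sketched for a single pixel as follows. The infinite-slope safety-factor formula and all parameter intervals below are illustrative assumptions for the sketch, not the paper's own model:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000  # Monte Carlo simulations for one pixel

# Hypothetical uncertainty intervals (illustrative, not values from the paper).
cohesion = rng.uniform(0.0, 8e3, n)           # soil cohesion c, Pa
phi = np.radians(rng.uniform(20.0, 30.0, n))  # internal friction angle

# Fixed slope properties for this pixel (also illustrative).
slope = np.radians(30.0)   # slope angle
gamma = 18e3               # unit weight of soil, N/m^3
depth = 2.0                # depth of the potential failure surface, m

# Infinite-slope safety factor -- a common simplification; the paper's own
# physical model may differ.
Fs = (cohesion + gamma * depth * np.cos(slope) ** 2 * np.tan(phi)) / (
    gamma * depth * np.sin(slope) * np.cos(slope))

p_landslide = np.mean(Fs < 1.0)  # landslide probability = fraction of runs with Fs < 1
print(p_landslide)
```

    Repeating this per pixel yields the probability map that replaces the single deterministic Fs warning.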

  20. 3-D visualization of ensemble weather forecasts - Part 2: Forecasting warm conveyor belt situations for aircraft-based field campaigns

    Science.gov (United States)

    Rautenhaus, M.; Grams, C. M.; Schäfler, A.; Westermann, R.

    2015-02-01

    We present the application of interactive 3-D visualization of ensemble weather predictions to forecasting warm conveyor belt situations during aircraft-based atmospheric research campaigns. Motivated by forecast requirements of the T-NAWDEX-Falcon 2012 campaign, a method to predict 3-D probabilities of the spatial occurrence of warm conveyor belts has been developed. Probabilities are derived from Lagrangian particle trajectories computed on the forecast wind fields of the ECMWF ensemble prediction system. Integration of the method into the 3-D ensemble visualization tool Met.3D, introduced in the first part of this study, facilitates interactive visualization of WCB features and derived probabilities in the context of the ECMWF ensemble forecast. We investigate the sensitivity of the method with respect to trajectory seeding and forecast wind field resolution. Furthermore, we propose a visual analysis method to quantitatively analyse the contribution of ensemble members to a probability region and, thus, to assist the forecaster in interpreting the obtained probabilities. A case study, revisiting a forecast case from T-NAWDEX-Falcon, illustrates the practical application of Met.3D and demonstrates the use of 3-D and uncertainty visualization for weather forecasting and for planning flight routes in the medium forecast range (three to seven days before take-off).

  1. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming framework that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  2. Do probabilistic forecasts lead to better decisions?

    Directory of Open Access Journals (Sweden)

    M. H. Ramos

    2013-06-01

    Full Text Available The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started focusing attention on ways of communicating the probabilistic forecasts to decision-makers. Communicating probabilistic forecasts includes preparing tools and products for visualisation, but also requires understanding how decision-makers perceive and use uncertainty information in real time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision-makers. Answers were collected and analysed. In this paper, we present the results of this exercise and discuss if we indeed make better decisions on the basis of probabilistic forecasts.
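    Decision experiments like this one are often scored against the classic static cost-loss model, in which a rational decision-maker protects whenever the forecast probability exceeds the cost-loss ratio C/L. A minimal sketch with illustrative numbers (not taken from the EGU experiment):

```python
# Classic static cost-loss model often used to assess the economic value of
# probabilistic forecasts. Protecting costs C; an unprotected flood costs L.
# All numbers are illustrative.
C, L = 10.0, 100.0
threshold = C / L  # a rational decision-maker protects when p > C/L

def expected_expense(p, protect):
    """Expected expense for one event given flood probability p."""
    return C if protect else p * L

p = 0.25
assert p > threshold  # protection is warranted at this forecast probability
assert expected_expense(p, True) < expected_expense(p, False)  # 10 < 25
print(expected_expense(p, True), expected_expense(p, False))
```

    Comparing participants' choices with this baseline is one simple way to quantify whether probabilistic forecasts indeed led to better decisions.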

  3. Probabilistic Forecast of Wind Power Generation by Stochastic Differential Equation Models

    KAUST Repository

    Elkantassi, Soumaya

    2017-01-01

    Reliable forecasting of wind power generation is crucial to optimal control of costs in generation of electricity with respect to the electricity demand. Here, we propose and analyze stochastic wind power forecast models described by parametrized

  4. National Forecast Charts

    Science.gov (United States)


  5. Are Forecast Updates Progressive?

    NARCIS (Netherlands)

    C-L. Chang (Chia-Lin); Ph.H.B.F. Franses (Philip Hans); M.J. McAleer (Michael)

    2010-01-01

    textabstractMacro-economic forecasts typically involve both a model component, which is replicable, as well as intuition, which is non-replicable. Intuition is expert knowledge possessed by a forecaster. If forecast updates are progressive, forecast updates should become more accurate, on average,

  6. Use and Communication of Probabilistic Forecasts.

    Science.gov (United States)

    Raftery, Adrian E

    2016-12-01

    Probabilistic forecasts are becoming more and more available. How should they be used and communicated? What are the obstacles to their use in practice? I review experience with five problems where probabilistic forecasting played an important role. This leads me to identify five types of potential users: Low Stakes Users, who don't need probabilistic forecasts; General Assessors, who need an overall idea of the uncertainty in the forecast; Change Assessors, who need to know if a change is out of line with expectations; Risk Avoiders, who wish to limit the risk of an adverse outcome; and Decision Theorists, who quantify their loss function and perform the decision-theoretic calculations. This suggests that it is important to interact with users and to consider their goals. The cognitive research tells us that calibration is important for trust in probability forecasts, and that it is important to match the verbal expression with the task. The cognitive load should be minimized, reducing the probabilistic forecast to a single percentile if appropriate. Probabilities of adverse events and percentiles of the predictive distribution of quantities of interest seem often to be the best way to summarize probabilistic forecasts. Formal decision theory has an important role, but in a limited range of applications.

  7. Use and Communication of Probabilistic Forecasts

    Science.gov (United States)

    Raftery, Adrian E.

    2015-01-01

    Probabilistic forecasts are becoming more and more available. How should they be used and communicated? What are the obstacles to their use in practice? I review experience with five problems where probabilistic forecasting played an important role. This leads me to identify five types of potential users: Low Stakes Users, who don't need probabilistic forecasts; General Assessors, who need an overall idea of the uncertainty in the forecast; Change Assessors, who need to know if a change is out of line with expectations; Risk Avoiders, who wish to limit the risk of an adverse outcome; and Decision Theorists, who quantify their loss function and perform the decision-theoretic calculations. This suggests that it is important to interact with users and to consider their goals. The cognitive research tells us that calibration is important for trust in probability forecasts, and that it is important to match the verbal expression with the task. The cognitive load should be minimized, reducing the probabilistic forecast to a single percentile if appropriate. Probabilities of adverse events and percentiles of the predictive distribution of quantities of interest seem often to be the best way to summarize probabilistic forecasts. Formal decision theory has an important role, but in a limited range of applications. PMID:28446941

  8. Forecasting freight flows

    DEFF Research Database (Denmark)

    Lyk-Jensen, Stéphanie

    2011-01-01

    Trade patterns and transport markets are changing as a result of the growth and globalization of international trade, and forecasting future freight flow has to rely on trade forecasts. Forecasting freight flows is critical for matching infrastructure supply to demand and for assessing investment... ...constitute a valuable input to freight models for forecasting future capacity problems.

  9. Uncertainty Analysis of Multi-Model Flood Forecasts

    Directory of Open Access Journals (Sweden)

    Erich J. Plate

    2015-12-01

    Full Text Available This paper demonstrates, by means of a systematic uncertainty analysis, that the use of outputs from more than one model can significantly improve conditional forecasts of discharges or water stages, provided the models are structurally different. Discharge forecasts from two models and the actual forecasted discharge are assumed to form a three-dimensional joint probability density distribution (jpdf, calibrated on long time series of data. The jpdf is decomposed into conditional probability density distributions (cpdf by means of Bayes formula, as suggested and explored by Krzysztofowicz in a series of papers. In this paper his approach is simplified to optimize conditional forecasts for any set of two forecast models. Its application is demonstrated by means of models developed in a study of flood forecasting for station Stung Treng on the middle reach of the Mekong River in South-East Asia. Four different forecast models were used and pairwise combined: forecast with no model, with persistence model, with a regression model, and with a rainfall-runoff model. Working with cpdfs requires determination of dependency among variables, for which linear regressions are required, as was done by Krzysztofowicz. His Bayesian approach based on transforming observed probability distributions of discharges and forecasts into normal distributions is also explored. Results obtained with his method for normal prior and likelihood distributions are identical to results from direct multiple regressions. Furthermore, it is shown that in the present case forecast accuracy is only marginally improved, if Weibull distributed basic data were converted into normally distributed variables.
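    The abstract notes that, for normal priors and likelihoods, the Bayesian combination of two forecasts coincides with direct multiple regression. A minimal sketch of that baseline, combining two synthetic model forecasts by least squares (all data simulated, not from the Mekong study):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500

# Synthetic "true" discharges and two structurally different, noisy forecasts.
truth = rng.gamma(shape=4.0, scale=500.0, size=n)
f1 = truth + rng.normal(0.0, 300.0, n)        # e.g. a persistence-like model
f2 = 0.8 * truth + rng.normal(0.0, 200.0, n)  # e.g. a biased rainfall-runoff model

# Direct multiple regression: truth ~ a + b1*f1 + b2*f2, fitted by least squares.
X = np.column_stack([np.ones(n), f1, f2])
coef, *_ = np.linalg.lstsq(X, truth, rcond=None)
combined = X @ coef

def rmse(pred):
    return np.sqrt(np.mean((pred - truth) ** 2))

# In-sample, the combined conditional forecast cannot be worse than either model.
assert rmse(combined) <= min(rmse(f1), rmse(f2))
print(rmse(f1), rmse(f2), rmse(combined))
```

    The gain over the better single model is what the paper's systematic uncertainty analysis quantifies, conditional on the two models being structurally different.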

  10. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  11. Bayesian methods in reliability

    Science.gov (United States)

    Sander, P.; Badoux, R.

    1991-11-01

    The present proceedings from a course on Bayesian methods in reliability encompasses Bayesian statistical methods and their computational implementation, models for analyzing censored data from nonrepairable systems, the traits of repairable systems and growth models, the use of expert judgment, and a review of the problem of forecasting software reliability. Specific issues addressed include the use of Bayesian methods to estimate the leak rate of a gas pipeline, approximate analyses under great prior uncertainty, reliability estimation techniques, and a nonhomogeneous Poisson process. Also addressed are the calibration sets and seed variables of expert judgment systems for risk assessment, experimental illustrations of the use of expert judgment for reliability testing, and analyses of the predictive quality of software-reliability growth models such as the Weibull order statistics.

  12. Forecasting in Complex Systems

    Science.gov (United States)

    Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Turcotte, D. L.; Donnellan, A.

    2014-12-01

    Complex nonlinear systems are typically characterized by many degrees of freedom, as well as interactions between the elements. Interesting examples can be found in the areas of earthquakes and finance. In these two systems, fat tails play an important role in the statistical dynamics. For earthquake systems, the Gutenberg-Richter magnitude-frequency relation is applicable, whereas daily returns for securities in the financial markets are known to be characterized by leptokurtotic statistics in which the tails are power law. Very large fluctuations are present in both systems. In earthquake systems, one has the example of great earthquakes such as the M9.1, March 11, 2011 Tohoku event. In financial systems, one has the example of the market crash of October 19, 1987. Both were largely unexpected events that severely and systemically impacted the earth and financial systems. Other examples include the M9.3 Andaman earthquake of December 26, 2004, and the Great Recession, which began with the fall of the Lehman Brothers investment bank in September 2008. Forecasting the occurrence of these damaging events has great societal importance. In recent years, national funding agencies in a variety of countries have emphasized the importance of societal relevance in research, and in particular, the goal of improved forecasting technology. Previous work has shown that both earthquakes and financial crashes can be described by a common Landau-Ginzburg-type free energy model. These metastable systems are characterized by fat-tail statistics near the classical spinodal. Correlations in these systems can grow and recede, but do not imply causation, a common source of misunderstanding. In both systems, a common set of techniques can be used to compute the probabilities of future earthquakes or crashes. In this talk, we describe the basic phenomenology of these systems, emphasize their similarities and differences, and consider the problem of forecast validation and verification.
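    The Gutenberg-Richter relation mentioned above, log10 N(>=M) = a - b*M, converts directly into event rates and Poisson exceedance probabilities. A minimal sketch with illustrative a and b values (not from the talk):

```python
import math

# Gutenberg-Richter magnitude-frequency relation: log10 N(>=M) = a - b*M.
# The a and b values below are illustrative placeholders, not from the talk.
a, b = 5.0, 1.0  # b ~ 1 is typical for tectonic regions

def annual_rate(M):
    """Expected number of events per year with magnitude >= M."""
    return 10.0 ** (a - b * M)

def prob_at_least_one(M, years):
    """Poisson probability of at least one magnitude >= M event in the window."""
    lam = annual_rate(M) * years
    return 1.0 - math.exp(-lam)

# With b = 1 the rate drops tenfold per magnitude unit.
assert abs(annual_rate(6.0) / annual_rate(7.0) - 10.0) < 1e-9
print(prob_at_least_one(7.0, 30))
```

    The fat-tail behaviour discussed in the talk is exactly this power-law decay of rates with event size, which makes the largest, most damaging events rare but far from negligible.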

  13. Fatigue Reliability under Random Loads

    DEFF Research Database (Denmark)

    Talreja, R.

    1979-01-01

    We consider the problem of estimating the probability of survival (non-failure) and the probability of safe operation (strength greater than a limiting value) of structures subjected to random loads. These probabilities are formulated in terms of the probability distributions of the loads...... propagation stage. The consequences of this behaviour on the fatigue reliability are discussed....

  14. Reliable computer systems.

    Science.gov (United States)

    Wear, L L; Pinkert, J R

    1993-11-01

    In this article, we looked at some decisions that apply to the design of reliable computer systems. We began with a discussion of several terms such as testability, then described some systems that call for highly reliable hardware and software. The article concluded with a discussion of methods that can be used to achieve higher reliability in computer systems. Reliability and fault tolerance in computers probably will continue to grow in importance. As more and more systems are computerized, people will want assurances about the reliability of these systems, and their ability to work properly even when sub-systems fail.

  15. An enhanced radial basis function network for short-term electricity price forecasting

    International Nuclear Information System (INIS)

    Lin, Whei-Min; Gow, Hong-Jey; Tsai, Ming-Tang

    2010-01-01

    This paper proposes a price forecasting system for electric market participants to reduce the risk of price volatility. Combining the Radial Basis Function Network (RBFN) and Orthogonal Experimental Design (OED), an Enhanced Radial Basis Function Network (ERBFN) is proposed for the solving process. The Locational Marginal Price (LMP), system load, transmission flow and temperature of the PJM system were collected, and the data clusters were embedded in the Excel Database according to the year, season, workday and weekend. With the OED applied to the learning rates in the ERBFN, the forecasting error can be reduced during the training process to improve both accuracy and reliability. This means that even the 'spikes' can be tracked closely. The Back-propagation Neural Network (BPN), Probability Neural Network (PNN), other algorithms, and the proposed ERBFN were all developed and compared to check the performance. Simulation results demonstrated the effectiveness of the proposed ERBFN in providing quality information in a price-volatile environment. (author)

  16. Ensemble Forecasts with Useful Skill-Spread Relationships for African meningitis and Asia Streamflow Forecasting

    Science.gov (United States)

    Hopson, T. M.

    2014-12-01

    One potential benefit of an ensemble prediction system (EPS) is its capacity to forecast its own forecast error through the ensemble spread-error relationship. In practice, an EPS is often quite limited in its ability to represent the variable expectation of forecast error through the variable dispersion of the ensemble, and perhaps more fundamentally, in its ability to provide enough variability in the ensemble's dispersion to make the skill-spread relationship even potentially useful (irrespective of whether the EPS is well calibrated or not). In this paper we examine the ensemble skill-spread relationship of an ensemble constructed from the TIGGE (THORPEX Interactive Grand Global Ensemble) dataset of global forecasts and a combination of multi-model and post-processing approaches. Both the multi-model and post-processing techniques are based on quantile regression (QR) under a step-wise forward selection framework, leading to ensemble forecasts with both good reliability and sharpness. The methodology utilizes the ensemble's ability to self-diagnose forecast instability to produce calibrated forecasts with informative skill-spread relationships. A context for these concepts is provided by assessing the constructed ensemble in forecasting district-level humidity impacting the incidence of meningitis in the meningitis belt of Africa, and in forecasting flooding events in the Brahmaputra and Ganges basins of South Asia.
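    The spread-error relationship discussed above can be diagnosed by correlating per-case ensemble spread with the error of the ensemble mean. A minimal synthetic sketch (not the TIGGE data) in which the dispersion genuinely varies between cases, as the abstract argues it must for the relationship to be useful:

```python
import numpy as np

rng = np.random.default_rng(2)
n_cases, n_members = 400, 20

# Synthetic ensemble: each case has its own dispersion; a useful skill-spread
# relationship requires this dispersion to vary between cases.
true_spread = rng.uniform(0.5, 3.0, n_cases)
truth = rng.normal(0.0, 1.0, n_cases)
ens = truth[:, None] + rng.normal(0.0, 1.0, (n_cases, n_members)) * true_spread[:, None]

spread = ens.std(axis=1)                    # per-case ensemble spread
abs_err = np.abs(ens.mean(axis=1) - truth)  # per-case error of the ensemble mean

# Pearson correlation as a simple spread-skill diagnostic: for this
# well-dispersed toy ensemble, larger spread should accompany larger errors.
r = np.corrcoef(spread, abs_err)[0, 1]
print(r)
```

    An EPS whose dispersion barely varies between cases would show r near zero here even if it were perfectly calibrated, which is the limitation the paper's QR post-processing is designed to address.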

  17. An Intelligent Decision Support System for Workforce Forecast

    Science.gov (United States)

    2011-01-01

    growth. Brown (1999) developed a model to forecast dental workforce size and mix (by sex) for the first twenty years of the twenty first century in...forecasted competencies required to deliver needed dental services. Labor market signaling approaches based workforce forecasting model was presented...techniques viz. algebra, calculus or probability theory, (Law and Kelton, 1991). Simulation processes, same as conducting experiments on computers, deals

  18. Robust Forecasting of Non-Stationary Time Series

    OpenAIRE

    Croux, C.; Fried, R.; Gijbels, I.; Mahieu, K.

    2010-01-01

    This paper proposes a robust forecasting method for non-stationary time series. The time series is modelled using non-parametric heteroscedastic regression, and fitted by a localized MM-estimator, combining high robustness and large efficiency. The proposed method is shown to produce reliable forecasts in the presence of outliers, non-linearity, and heteroscedasticity. In the absence of outliers, the forecasts are only slightly less precise than those based on a localized Least Squares estima...

  19. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.

  20. A physics-based probabilistic forecasting model for rainfall-induced shallow landslides at regional scale

    Science.gov (United States)

    Zhang, Shaojie; Zhao, Luqiang; Delgado-Tellez, Ricardo; Bao, Hongjun

    2018-03-01

    Conventional outputs of physics-based landslide forecasting models are presented as deterministic warnings by calculating the safety factor (Fs) of potentially dangerous slopes. However, these models are highly dependent on variables such as cohesion force and internal friction angle, which are affected by a high degree of uncertainty, especially at a regional scale, resulting in unacceptable uncertainties in Fs. Under such circumstances, the outputs of physical models are more suitable if presented in the form of landslide probability values. In order to develop such models, a method to link the uncertainty of soil parameter values with landslide probability is devised. This paper proposes the use of Monte Carlo methods to quantitatively express uncertainty by assigning random values to physical variables inside a defined interval. The inequality Fs < 1 is then evaluated over the sampled distribution of soil mechanical parameters and is used to create a physics-based probabilistic forecasting model for rainfall-induced shallow landslides. The prediction ability of this model was tested in a case study, in which simulated forecasting of landslide disasters associated with heavy rainfall on 9 July 2013 in the Wenchuan earthquake region of Sichuan province, China, was performed. The proposed model successfully forecasted landslides in 159 of the 176 disaster points registered by the geo-environmental monitoring station of Sichuan province. Such testing results indicate that the new model can be operated in a highly efficient way and shows more reliable results, attributable to its high prediction accuracy. Accordingly, the new model can potentially be packaged into a forecasting system for shallow landslides, providing technological support for the mitigation of these disasters at regional scale.
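    The Monte Carlo idea can be sketched with the classical dry infinite-slope stability formula, Fs = c/(γ z sinβ cosβ) + tanφ/tanβ. The slope geometry and parameter intervals below are invented for illustration and are not the paper's calibrated values (the paper also couples rainfall and pore pressure, which this sketch omits):

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000
    beta = np.radians(35.0)   # slope angle
    gamma = 19e3              # soil unit weight, N/m^3
    z = 2.0                   # failure-plane depth, m

    # uncertain strength parameters sampled inside defined intervals
    c = rng.uniform(2e3, 8e3, n)                    # cohesion, Pa
    phi = np.radians(rng.uniform(25.0, 35.0, n))    # internal friction angle

    # dry infinite-slope factor of safety for every sample
    fs = c / (gamma * z * np.sin(beta) * np.cos(beta)) + np.tan(phi) / np.tan(beta)

    # landslide probability = fraction of samples violating Fs >= 1
    p_failure = np.mean(fs < 1.0)
    ```

    Instead of a single deterministic Fs, each cell thus reports a failure probability that reflects the parameter uncertainty.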

  1. Skilful seasonal forecasts of streamflow over Europe?

    Science.gov (United States)

    Arnal, Louise; Cloke, Hannah L.; Stephens, Elisabeth; Wetterhall, Fredrik; Prudhomme, Christel; Neumann, Jessica; Krzeminski, Blazej; Pappenberger, Florian

    2018-04-01

    This paper considers whether there is any added value in using seasonal climate forecasts instead of historical meteorological observations for forecasting streamflow on seasonal timescales over Europe. A Europe-wide analysis of the skill of the newly operational EFAS (European Flood Awareness System) seasonal streamflow forecasts (produced by forcing the Lisflood model with the ECMWF System 4 seasonal climate forecasts), benchmarked against the ensemble streamflow prediction (ESP) forecasting approach (produced by forcing the Lisflood model with historical meteorological observations), is undertaken. The results suggest that, on average, the System 4 seasonal climate forecasts improve the streamflow predictability over historical meteorological observations for the first month of lead time only (in terms of hindcast accuracy, sharpness and overall performance). However, the predictability varies in space and time and is greater in winter and autumn. Parts of Europe additionally exhibit a longer predictability, up to 7 months of lead time, for certain months within a season. In terms of hindcast reliability, the EFAS seasonal streamflow hindcasts are on average less skilful than the ESP for all lead times. The results also highlight the potential usefulness of the EFAS seasonal streamflow forecasts for decision-making (measured in terms of the hindcast discrimination for the lower and upper terciles of the simulated streamflow). Although the ESP is the most potentially useful forecasting approach in Europe, the EFAS seasonal streamflow forecasts appear more potentially useful than the ESP in some regions and for certain seasons, especially in winter for almost 40 % of Europe. Patterns in the EFAS seasonal streamflow hindcast skill are however not mirrored in the System 4 seasonal climate hindcasts, hinting at the need for a better understanding of the link between hydrological and meteorological variables on seasonal timescales, with the aim of improving climate

  2. Fuel cycle forecasting - there are forecasts and there are forecasts

    Energy Technology Data Exchange (ETDEWEB)

    Puechl, K H

    1975-12-01

    The FORECAST-NUCLEAR computer program described recognizes that forecasts are made to answer a variety of questions and, therefore, that no single forecast is universally appropriate. Also, it recognizes that no two individuals will completely agree as to the input data that are appropriate for obtaining an answer to even a single simple question. Accordingly, the program was written from a utilitarian standpoint: it allows working with multiple projections; data inputting is simple to allow game-playing; computation time is short to minimize the cost of 'what if' assessments; and detail is internally carried to allow meaningful analysis.

  3. Fuel cycle forecasting - there are forecasts and there are forecasts

    International Nuclear Information System (INIS)

    Puechl, K.H.

    1975-01-01

    The FORECAST-NUCLEAR computer program described recognizes that forecasts are made to answer a variety of questions and, therefore, that no single forecast is universally appropriate. Also, it recognizes that no two individuals will completely agree as to the input data that are appropriate for obtaining an answer to even a single simple question. Accordingly, the program was written from a utilitarian standpoint: it allows working with multiple projections; data inputting is simple to allow game-playing; computation time is short to minimize the cost of 'what if' assessments; and detail is internally carried to allow meaningful analysis. (author)

  4. An analog ensemble for short-term probabilistic solar power forecast

    International Nuclear Information System (INIS)

    Alessandrini, S.; Delle Monache, L.; Sperati, S.; Cervone, G.

    2015-01-01

    Highlights: • A novel method for solar power probabilistic forecasting is proposed. • The forecast accuracy does not depend on the nominal power. • The impact of climatology on forecast accuracy is evaluated. - Abstract: The energy produced by photovoltaic farms has a variable nature depending on astronomical and meteorological factors. The former are the solar elevation and the solar azimuth, which are easily predictable without any uncertainty. The amount of liquid water met by the solar radiation within the troposphere is the main meteorological factor influencing the solar power production, as a fraction of short wave solar radiation is reflected by the water particles and cannot reach the earth surface. The total cloud cover is a meteorological variable often used to indicate the presence of liquid water in the troposphere and has a limited predictability, which is also reflected on the global horizontal irradiance and, as a consequence, on solar photovoltaic power prediction. This lack of predictability makes the solar energy integration into the grid challenging. A cost-effective utilization of solar energy over a grid strongly depends on the accuracy and reliability of the power forecasts available to the Transmission System Operators (TSOs). Furthermore, several countries have in place legislation requiring solar power producers to pay penalties proportional to the errors of day-ahead energy forecasts, which makes the accuracy of such predictions a determining factor for producers to reduce their economic losses. Probabilistic predictions can provide accurate deterministic forecasts along with a quantification of their uncertainty, as well as a reliable estimate of the probability to overcome a certain production threshold. In this paper we propose the application of an analog ensemble (AnEn) method to generate probabilistic solar power forecasts (SPF). The AnEn is based on an historical set of deterministic numerical weather prediction (NWP) model
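    The analog ensemble idea itself is compact: given today's deterministic NWP forecast, find the most similar past forecasts in the archive and use their paired observations as the ensemble. A hedged numpy sketch with synthetic predictors (a plain Euclidean distance is used here; the published AnEn weights and normalizes predictors, which this sketch omits):

    ```python
    import numpy as np

    def analog_ensemble(hist_fcst, hist_obs, current_fcst, n_analogs=20):
        """Probabilistic forecast from observations paired with the n_analogs
        past deterministic forecasts closest to the current forecast."""
        d = np.linalg.norm(hist_fcst - current_fcst, axis=1)
        idx = np.argsort(d)[:n_analogs]
        return hist_obs[idx]                 # ensemble of plausible outcomes

    rng = np.random.default_rng(1)
    # synthetic archive: two forecast predictors and observed normalized power
    fcst = rng.uniform(0, 1, size=(1000, 2))       # e.g. irradiance, cloud cover
    obs = 0.8 * fcst[:, 0] * (1 - 0.5 * fcst[:, 1]) + rng.normal(0, 0.02, 1000)

    ens = analog_ensemble(fcst, obs, current_fcst=np.array([0.9, 0.1]))
    p10, p90 = np.percentile(ens, [10, 90])        # uncertainty band
    ```

    The ensemble mean supplies a deterministic forecast while the percentile spread quantifies its uncertainty, e.g. the probability of exceeding a production threshold.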

  5. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
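    One classic instance is the derangement problem: the probability that a random permutation of n items leaves no item in its original place tends to 1/e as n grows. A quick Monte Carlo check:

    ```python
    import math
    import random

    def no_fixed_point(n):
        """True if a uniformly random permutation of range(n) is a derangement."""
        p = list(range(n))
        random.shuffle(p)
        return all(p[i] != i for i in range(n))

    random.seed(0)
    trials = 20_000
    frac = sum(no_fixed_point(12) for _ in range(trials)) / trials
    # frac should be close to 1/e ~ 0.3679 (already so for n = 12)
    ```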

  6. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  7. Monthly forecasting of agricultural pests in Switzerland

    Science.gov (United States)

    Hirschi, M.; Dubrovsky, M.; Spirig, C.; Samietz, J.; Calanca, P.; Weigel, A. P.; Fischer, A. M.; Rotach, M. W.

    2012-04-01

    Given the repercussions of pests and diseases on agricultural production, detailed forecasting tools have been developed to simulate the degree of infestation depending on actual weather conditions. The life cycle of pests is most successfully predicted if the micro-climate of the immediate environment (habitat) of the causative organisms can be simulated. Sub-seasonal pest forecasts therefore require weather information for the relevant habitats and the appropriate time scale. The pest forecasting system SOPRA (www.sopra.info) currently in operation in Switzerland relies on such detailed weather information, using hourly weather observations up to the day the forecast is issued, but only a climatology for the forecasting period. Here, we aim at improving the skill of SOPRA forecasts by transforming the weekly information provided by ECMWF monthly forecasts (MOFCs) into hourly weather series as required for the prediction of upcoming life phases of the codling moth, the major insect pest in apple orchards worldwide. Due to the probabilistic nature of operational monthly forecasts and their limited spatial and temporal resolution, their information needs to be post-processed for use in a pest model. In this study, we developed a statistical downscaling approach for MOFCs that includes the following steps: (i) application of a stochastic weather generator to generate a large pool of daily weather series consistent with the climate at a specific location, (ii) a subsequent re-sampling of weather series from this pool to optimally represent the evolution of the weekly MOFC anomalies, and (iii) a final extension to hourly weather series suitable for the pest forecasting model. Results show a clear improvement in the forecast skill of occurrences of upcoming codling moth life phases when incorporating MOFCs as compared to the operational pest forecasting system. This is true both in terms of root mean squared errors and of the continuous ranked probability scores of the

  8. Financial forecasts accuracy in Brazil's social security system.

    Directory of Open Access Journals (Sweden)

    Carlos Patrick Alves da Silva

    Full Text Available Long-term social security statistical forecasts produced and disseminated by the Brazilian government aim to provide accurate results that would serve as background information for optimal policy decisions. These forecasts are being used as support for the government's proposed pension reform that plans to radically change the Brazilian Constitution insofar as Social Security is concerned. However, the reliability of official results is uncertain since no systematic evaluation of these forecasts has ever been published by the Brazilian government or anyone else. This paper aims to present a study of the accuracy and methodology of the instruments used by the Brazilian government to carry out long-term actuarial forecasts. We base our research on an empirical and probabilistic analysis of the official models. Our empirical analysis shows that the long-term Social Security forecasts are systematically biased in the short term and have significant errors that render them meaningless in the long run. Moreover, the low level of transparency in the methods impaired the replication of results published by the Brazilian Government and the use of outdated data compromises forecast results. In the theoretical analysis, based on a mathematical modeling approach, we discuss the complexity and limitations of the macroeconomic forecast through the computation of confidence intervals. We demonstrate the problems related to error measurement inherent to any forecasting process. We then extend this exercise to the computation of confidence intervals for Social Security forecasts. This mathematical exercise raises questions about the degree of reliability of the Social Security forecasts.

  9. Financial forecasts accuracy in Brazil's social security system.

    Science.gov (United States)

    Silva, Carlos Patrick Alves da; Puty, Claudio Alberto Castelo Branco; Silva, Marcelino Silva da; Carvalho, Solon Venâncio de; Francês, Carlos Renato Lisboa

    2017-01-01

    Long-term social security statistical forecasts produced and disseminated by the Brazilian government aim to provide accurate results that would serve as background information for optimal policy decisions. These forecasts are being used as support for the government's proposed pension reform that plans to radically change the Brazilian Constitution insofar as Social Security is concerned. However, the reliability of official results is uncertain since no systematic evaluation of these forecasts has ever been published by the Brazilian government or anyone else. This paper aims to present a study of the accuracy and methodology of the instruments used by the Brazilian government to carry out long-term actuarial forecasts. We base our research on an empirical and probabilistic analysis of the official models. Our empirical analysis shows that the long-term Social Security forecasts are systematically biased in the short term and have significant errors that render them meaningless in the long run. Moreover, the low level of transparency in the methods impaired the replication of results published by the Brazilian Government and the use of outdated data compromises forecast results. In the theoretical analysis, based on a mathematical modeling approach, we discuss the complexity and limitations of the macroeconomic forecast through the computation of confidence intervals. We demonstrate the problems related to error measurement inherent to any forecasting process. We then extend this exercise to the computation of confidence intervals for Social Security forecasts. This mathematical exercise raises questions about the degree of reliability of the Social Security forecasts.

  10. Financial forecasts accuracy in Brazil’s social security system

    Science.gov (United States)

    2017-01-01

    Long-term social security statistical forecasts produced and disseminated by the Brazilian government aim to provide accurate results that would serve as background information for optimal policy decisions. These forecasts are being used as support for the government’s proposed pension reform that plans to radically change the Brazilian Constitution insofar as Social Security is concerned. However, the reliability of official results is uncertain since no systematic evaluation of these forecasts has ever been published by the Brazilian government or anyone else. This paper aims to present a study of the accuracy and methodology of the instruments used by the Brazilian government to carry out long-term actuarial forecasts. We base our research on an empirical and probabilistic analysis of the official models. Our empirical analysis shows that the long-term Social Security forecasts are systematically biased in the short term and have significant errors that render them meaningless in the long run. Moreover, the low level of transparency in the methods impaired the replication of results published by the Brazilian Government and the use of outdated data compromises forecast results. In the theoretical analysis, based on a mathematical modeling approach, we discuss the complexity and limitations of the macroeconomic forecast through the computation of confidence intervals. We demonstrate the problems related to error measurement inherent to any forecasting process. We then extend this exercise to the computation of confidence intervals for Social Security forecasts. This mathematical exercise raises questions about the degree of reliability of the Social Security forecasts. PMID:28859172

  11. Robust Approaches to Forecasting

    OpenAIRE

    Jennifer Castle; David Hendry; Michael P. Clements

    2014-01-01

    We investigate alternative robust approaches to forecasting, using a new class of robust devices, contrasted with equilibrium correction models. Their forecasting properties are derived facing a range of likely empirical problems at the forecast origin, including measurement errors, impulses, omitted variables, unanticipated location shifts and incorrectly included variables that experience a shift. We derive the resulting forecast biases and error variances, and indicate when the methods ar...

  12. Inflation Forecast Contracts

    OpenAIRE

    Gersbach, Hans; Hahn, Volker

    2012-01-01

    We introduce a new type of incentive contract for central bankers: inflation forecast contracts, which make central bankers’ remunerations contingent on the precision of their inflation forecasts. We show that such contracts enable central bankers to influence inflation expectations more effectively, thus facilitating more successful stabilization of current inflation. Inflation forecast contracts improve the accuracy of inflation forecasts, but have adverse consequences for output. On balanc...

  13. Measuring inaccuracy in travel demand forecasting

    DEFF Research Database (Denmark)

    Flyvbjerg, Bent

    2005-01-01

    Project promoters, forecasters, and managers sometimes object to two things in measuring inaccuracy in travel demand forecasting: (1) using the forecast made at the time of making the decision to build as the basis for measuring inaccuracy and (2) using traffic during the first year of operations as the basis for measurement. This paper presents the case against both objections. First, if one is interested in learning whether decisions about building transport infrastructure are based on reliable information, then it is exactly the traffic forecasted at the time of making the decision to build that is of interest. Second, although ideally studies should take into account so-called demand "ramp up" over a period of years, the empirical evidence and practical considerations do not support this ideal requirement, at least not for large-N studies. Finally, the paper argues that large samples of inaccuracy...

  14. Flood Forecasting in River System Using ANFIS

    International Nuclear Information System (INIS)

    Ullah, Nazrin; Choudhury, P.

    2010-01-01

    The aim of the present study is to investigate the applicability of artificial intelligence techniques such as ANFIS (Adaptive Neuro-Fuzzy Inference System) in forecasting flood flow in a river system. The proposed technique combines the learning ability of a neural network with the transparent linguistic representation of a fuzzy system. The technique is applied to forecast discharge at a downstream station using flow information at various upstream stations. A total of three years of data was selected for the implementation of this model. ANFIS models with various input structures and membership functions are constructed, trained and tested to evaluate the efficiency of the models. Statistical indices such as the Root Mean Square Error (RMSE), Correlation Coefficient (CORR) and Coefficient of Efficiency (CE) are used to evaluate the performance of the ANFIS models in forecasting river floods. The values of the indices show that the ANFIS model can accurately and reliably be used to forecast floods in a river system.
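    The three verification indices named above are standard and easy to state in code; the observed and simulated discharge values below are invented for illustration:

    ```python
    import numpy as np

    def forecast_scores(obs, sim):
        """RMSE, Pearson correlation, and Nash-Sutcliffe coefficient of efficiency."""
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        rmse = np.sqrt(np.mean((sim - obs) ** 2))
        corr = np.corrcoef(obs, sim)[0, 1]
        # CE = 1 means a perfect fit; CE <= 0 means no better than the mean of obs
        ce = 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)
        return rmse, corr, ce

    # hypothetical observed vs. forecast discharge (m^3/s)
    obs = [120.0, 340.0, 510.0, 280.0, 150.0]
    sim = [130.0, 320.0, 495.0, 300.0, 160.0]
    rmse, corr, ce = forecast_scores(obs, sim)
    ```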

  15. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko, [No Value

    2004-01-01

    Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  16. Weather Forecasts are for Wimps. Why Water Resource Managers Do Not Use Climate Forecasts

    Energy Technology Data Exchange (ETDEWEB)

    Rayner, S. [James Martin Institute of Science and Civilization, Said Business School, University of Oxford, OX1 1HP (United Kingdom); Lach, D. [Oregon State University, Corvallis, OR, 97331-4501 (United States); Ingram, H. [School of Social Ecology, University of California Irvine, Irvine, CA, 92697-7075 (United States)

    2005-04-15

    Short-term climate forecasting offers the promise of improved hydrologic management strategies. However, water resource managers in the United States have proven reluctant to incorporate them in decision making. While managers usually cite poor reliability of the forecasts as the reason for this, they are seldom able to demonstrate knowledge of the actual performance of forecasts or to consistently articulate the level of reliability that they would require. Analysis of three case studies in California, the Pacific Northwest, and metro Washington DC identifies institutional reasons that appear to lie behind managers' reluctance to use the forecasts. These include traditional reliance on large built infrastructure, organizational conservatism and complexity, mismatch of temporal and spatial scales of forecasts to management needs, political disincentives to innovation, and regulatory constraints. The paper concludes that wider acceptance of the forecasts will depend on their being incorporated in existing organizational routines and industrial codes and practices, as well as changes in management incentives to innovation. Finer spatial resolution of forecasts and the regional integration of multi-agency functions would also enhance their usability. The title of this article is taken from an advertising slogan for the Oldsmobile Bravura SUV.

  17. 7 CFR 1710.206 - Approval requirements for load forecasts prepared pursuant to approved load forecast work plans.

    Science.gov (United States)

    2010-01-01

    ... financial ratings, and participation in reliability council, power pool, regional transmission group, power... analysis and modeling of the borrower's electric system loads as provided for in the load forecast work plan. (5) A narrative discussing the borrower's past, existing, and forecast of future electric system...

  18. Three-dimensional visualization of ensemble weather forecasts - Part 2: Forecasting warm conveyor belt situations for aircraft-based field campaigns

    Science.gov (United States)

    Rautenhaus, M.; Grams, C. M.; Schäfler, A.; Westermann, R.

    2015-07-01

    We present the application of interactive three-dimensional (3-D) visualization of ensemble weather predictions to forecasting warm conveyor belt situations during aircraft-based atmospheric research campaigns. Motivated by forecast requirements of the T-NAWDEX-Falcon 2012 (THORPEX - North Atlantic Waveguide and Downstream Impact Experiment) campaign, a method to predict 3-D probabilities of the spatial occurrence of warm conveyor belts (WCBs) has been developed. Probabilities are derived from Lagrangian particle trajectories computed on the forecast wind fields of the European Centre for Medium Range Weather Forecasts (ECMWF) ensemble prediction system. Integration of the method into the 3-D ensemble visualization tool Met.3D, introduced in the first part of this study, facilitates interactive visualization of WCB features and derived probabilities in the context of the ECMWF ensemble forecast. We investigate the sensitivity of the method with respect to trajectory seeding and grid spacing of the forecast wind field. Furthermore, we propose a visual analysis method to quantitatively analyse the contribution of ensemble members to a probability region and, thus, to assist the forecaster in interpreting the obtained probabilities. A case study, revisiting a forecast case from T-NAWDEX-Falcon, illustrates the practical application of Met.3D and demonstrates the use of 3-D and uncertainty visualization for weather forecasting and for planning flight routes in the medium forecast range (3 to 7 days before take-off).

  19. Three-dimensional visualization of ensemble weather forecasts – Part 2: Forecasting warm conveyor belt situations for aircraft-based field campaigns

    Directory of Open Access Journals (Sweden)

    M. Rautenhaus

    2015-07-01

    Full Text Available We present the application of interactive three-dimensional (3-D) visualization of ensemble weather predictions to forecasting warm conveyor belt situations during aircraft-based atmospheric research campaigns. Motivated by forecast requirements of the T-NAWDEX-Falcon 2012 (THORPEX – North Atlantic Waveguide and Downstream Impact Experiment) campaign, a method to predict 3-D probabilities of the spatial occurrence of warm conveyor belts (WCBs) has been developed. Probabilities are derived from Lagrangian particle trajectories computed on the forecast wind fields of the European Centre for Medium Range Weather Forecasts (ECMWF) ensemble prediction system. Integration of the method into the 3-D ensemble visualization tool Met.3D, introduced in the first part of this study, facilitates interactive visualization of WCB features and derived probabilities in the context of the ECMWF ensemble forecast. We investigate the sensitivity of the method with respect to trajectory seeding and grid spacing of the forecast wind field. Furthermore, we propose a visual analysis method to quantitatively analyse the contribution of ensemble members to a probability region and, thus, to assist the forecaster in interpreting the obtained probabilities. A case study, revisiting a forecast case from T-NAWDEX-Falcon, illustrates the practical application of Met.3D and demonstrates the use of 3-D and uncertainty visualization for weather forecasting and for planning flight routes in the medium forecast range (3 to 7 days before take-off).

  20. Forecaster’s utility and forecasts coherence

    DEFF Research Database (Denmark)

    Chini, Emilio Zanetti

    model to ease the statistical inference. A simulation study reveals that the test behaves consistently with the requirements of the theoretical literature. The locality of the scoring rule is fundamental to set dating algorithms to measure and forecast probability of recession in US business cycle...

  1. Electricity demand forecasting techniques

    International Nuclear Information System (INIS)

    Gnanalingam, K.

    1994-01-01

    Electricity demand forecasting plays an important role in power generation. The two areas of data that have to be forecasted in a power system are peak demand which determines the capacity (MW) of the plant required and annual energy demand (GWH). Methods used in electricity demand forecasting include time trend analysis and econometric methods. In forecasting, identification of manpower demand, identification of key planning factors, decision on planning horizon, differentiation between prediction and projection (i.e. development of different scenarios) and choosing from different forecasting techniques are important
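    The time-trend approach mentioned above can be illustrated with a log-linear fit, which assumes a constant annual growth rate; the peak-demand series below is hypothetical:

    ```python
    import numpy as np

    # annual peak demand in MW (hypothetical series for illustration)
    years = np.arange(2015, 2023)
    peak_mw = np.array([980, 1020, 1075, 1110, 1160, 1195, 1250, 1310], float)

    # log-linear time trend: log(demand) = a + b*t implies growth rate exp(b) - 1
    t = years - years[0]
    b, a = np.polyfit(t, np.log(peak_mw), 1)
    growth = np.exp(b) - 1.0                            # implied annual growth rate
    forecast_2025 = np.exp(a + b * (2025 - years[0]))   # trend extrapolation
    ```

    Econometric methods would instead regress demand on drivers such as GDP and price; the trend model is the simpler benchmark the abstract contrasts them with.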

  2. Spatial electric load forecasting

    CERN Document Server

    Willis, H Lee

    2002-01-01

    Containing 12 new chapters, this second edition contains offers increased-coverage of weather correction and normalization of forecasts, anticipation of redevelopment, determining the validity of announced developments, and minimizing risk from over- or under-planning. It provides specific examples and detailed explanations of key points to consider for both standard and unusual utility forecasting situations, information on new algorithms and concepts in forecasting, a review of forecasting pitfalls and mistakes, case studies depicting challenging forecast environments, and load models illustrating various types of demand.

  3. Near-term probabilistic forecast of significant wildfire events for the Western United States

    Science.gov (United States)

    Haiganoush K. Preisler; Karin L. Riley; Crystal S. Stonesifer; Dave E. Calkin; Matt Jolly

    2016-01-01

    Fire danger and potential for large fires in the United States (US) is currently indicated via several forecasted qualitative indices. However, landscape-level quantitative forecasts of the probability of a large fire are currently lacking. In this study, we present a framework for forecasting large fire occurrence - an extreme value event - and evaluating...

  4. Reliability of windstorm predictions in the ECMWF ensemble prediction system

    Science.gov (United States)

    Becker, Nico; Ulbrich, Uwe

    2016-04-01

    Windstorms caused by extratropical cyclones are one of the most dangerous natural hazards in the European region. Therefore, reliable predictions of such storm events are needed. Case studies have shown that ensemble prediction systems (EPS) are able to provide useful information about windstorms between two and five days prior to the event. In this work, ensemble predictions with the European Centre for Medium-Range Weather Forecasts (ECMWF) EPS are evaluated over a four-year period. Within the 50 ensemble members, which are initialized every 12 hours and run for 10 days, windstorms are identified and tracked in time and space. By using a clustering approach, different predictions of the same storm are identified in the different ensemble members and compared to reanalysis data. The occurrence probability of the predicted storms is estimated by fitting a bivariate normal distribution to the storm track positions. Our results show, for example, that predicted storm clusters with occurrence probabilities of more than 50% have a matching observed storm in 80% of all cases at a lead time of two days. The predicted occurrence probabilities are reliable up to 3 days lead time. At longer lead times the occurrence probabilities are overestimated by the EPS.
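    Reliability of probability forecasts, as assessed here, is commonly checked by binning forecast probabilities and comparing each bin's mean probability with the observed event frequency. A generic sketch with synthetic, perfectly calibrated forecasts (not the storm-clustering pipeline of the paper):

    ```python
    import numpy as np

    def reliability_table(p_fcst, outcome, bins=5):
        """Per-bin (mean forecast probability, observed frequency, count)."""
        edges = np.linspace(0, 1, bins + 1)
        rows = []
        for lo, hi in zip(edges[:-1], edges[1:]):
            m = (p_fcst >= lo) & (p_fcst < hi) if hi < 1 else (p_fcst >= lo)
            if m.any():
                rows.append((p_fcst[m].mean(), outcome[m].mean(), int(m.sum())))
        return rows

    rng = np.random.default_rng(7)
    p = rng.uniform(0, 1, 10_000)
    hit = (rng.uniform(0, 1, 10_000) < p).astype(float)   # calibrated by design
    table = reliability_table(p, hit)
    ```

    For a reliable system each row's two values nearly coincide; systematic overestimation, as found at long lead times above, shows up as observed frequencies falling below the forecast probabilities.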

  5. Spatial Distribution of the Coefficient of Variation and Bayesian Forecast for the Paleo-Earthquakes in Japan

    Science.gov (United States)

    Nomura, Shunichi; Ogata, Yosihiko

    2016-04-01

    We propose a Bayesian method of probability forecasting for recurrent earthquakes on inland active faults in Japan. Renewal processes with the Brownian Passage Time (BPT) distribution are applied to more than half of the active faults in Japan by the Headquarters for Earthquake Research Promotion (HERP) of Japan. Long-term forecasting with the BPT distribution needs two parameters: the mean and the coefficient of variation (COV) of the recurrence intervals. HERP applies a common COV parameter to all of these faults because most of them have very few recorded paleoseismic events, too few to estimate reliable COV values for individual faults. However, related works have proposed different COV estimates from the same paleoseismic catalog. Applying different COV estimates can make a critical difference in the forecast, so the COV should be selected carefully for individual faults. Recurrence intervals on a fault are, on average, determined by the long-term slip rate caused by tectonic motion, but fluctuate with nearby seismicity, which perturbs the surrounding stress field. The COVs of recurrence intervals depend on such stress perturbations and therefore show spatial trends due to the heterogeneity of tectonic motion and seismicity. We therefore introduce a spatial structure on the COV parameter through Bayesian modeling with a Gaussian process prior, so that the COVs of closely located faults are correlated and take similar values. We find that the spatial trends in the estimated COV values coincide with the density of active faults in Japan. We also present Bayesian forecasts from the proposed model, computed with a Markov chain Monte Carlo method. Our forecasts differ from HERP's especially on active faults where HERP's forecasts are very high or very low.
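
The long-term forecast the abstract describes reduces, for a single fault, to a conditional probability under the BPT density f(t) = sqrt(mu / (2 pi alpha^2 t^3)) exp(-(t - mu)^2 / (2 mu alpha^2 t)). A sketch under stated assumptions (the numerical values and the trapezoidal integration are illustrative; the Bayesian spatial estimation of the COV alpha is not reproduced here):

```python
import numpy as np

def bpt_pdf(t, mu, alpha):
    """Brownian Passage Time density with mean recurrence interval `mu`
    and coefficient of variation (aperiodicity) `alpha`."""
    t = np.asarray(t, dtype=float)
    return (np.sqrt(mu / (2.0 * np.pi * alpha**2 * t**3))
            * np.exp(-(t - mu) ** 2 / (2.0 * mu * alpha**2 * t)))

def bpt_cdf(t, mu, alpha, n=20001):
    """CDF by trapezoidal integration of the density on (0, t]."""
    grid = np.linspace(1e-9, t, n)
    y = bpt_pdf(grid, mu, alpha)
    return float(np.sum((y[1:] + y[:-1]) * np.diff(grid)) / 2.0)

def conditional_rupture_prob(mu, alpha, elapsed, horizon):
    """P(rupture within `horizon` years | no rupture for `elapsed` years)."""
    f0 = bpt_cdf(elapsed, mu, alpha)
    f1 = bpt_cdf(elapsed + horizon, mu, alpha)
    return (f1 - f0) / (1.0 - f0)
```

Because the BPT hazard rises as elapsed time approaches the mean recurrence interval, the choice of alpha strongly changes the conditional probability, which is the sensitivity the abstract emphasizes.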

  6. Toward a generalized probability theory: conditional probabilities

    International Nuclear Information System (INIS)

    Cassinelli, G.

    1979-01-01

    The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)

  7. INFERENCE AND SENSITIVITY IN STOCHASTIC WIND POWER FORECAST MODELS.

    KAUST Repository

    Elkantassi, Soumaya

    2017-10-03

    Reliable forecasting of wind power generation is crucial for optimal control of electricity-generation costs relative to demand. Here, we propose and analyze stochastic wind power forecast models described by parametrized stochastic differential equations, which introduce appropriate fluctuations into numerical forecast outputs. We use an approximate maximum likelihood method to infer the model parameters, taking into account the time-correlated nature of the data. Furthermore, we study the validity and sensitivity of the parameters for each model. We applied our models to Uruguayan wind power production as determined by historical data and corresponding numerical forecasts for the period of March 1 to May 31, 2016.
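
The "parametrized stochastic differential equations" of the record above can be illustrated with the simplest mean-reverting case: an Ornstein-Uhlenbeck process for the fluctuation around the numerical forecast, fitted by maximum likelihood through its exact AR(1) discretization. This is a hedged stand-in; the paper's actual model class and inference details are not reproduced:

```python
import numpy as np

def simulate_ou(theta, sigma, x0, dt, n, seed=0):
    """Euler-Maruyama path of dX = -theta*X dt + sigma dW, a mean-reverting
    fluctuation around the numerical forecast."""
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal(n - 1)
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = x[i-1] - theta * x[i-1] * dt + sigma * np.sqrt(dt) * noise[i-1]
    return x

def fit_ou(x, dt):
    """Approximate MLE via the exact AR(1) form x[i+1] = a*x[i] + eps,
    with a = exp(-theta*dt) and Var(eps) = sigma^2 (1 - a^2) / (2*theta)."""
    a = np.sum(x[:-1] * x[1:]) / np.sum(x[:-1] ** 2)
    resid = x[1:] - a * x[:-1]
    theta = -np.log(a) / dt
    sigma = resid.std() * np.sqrt(2.0 * theta / (1.0 - a ** 2))
    return theta, sigma
```

The AR(1) trick is what makes likelihood inference tractable for time-correlated data of this kind.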

  8. INFERENCE AND SENSITIVITY IN STOCHASTIC WIND POWER FORECAST MODELS.

    KAUST Repository

    Elkantassi, Soumaya; Kalligiannaki, Evangelia; Tempone, Raul

    2017-01-01

    Reliable forecasting of wind power generation is crucial for optimal control of electricity-generation costs relative to demand. Here, we propose and analyze stochastic wind power forecast models described by parametrized stochastic differential equations, which introduce appropriate fluctuations into numerical forecast outputs. We use an approximate maximum likelihood method to infer the model parameters, taking into account the time-correlated nature of the data. Furthermore, we study the validity and sensitivity of the parameters for each model. We applied our models to Uruguayan wind power production as determined by historical data and corresponding numerical forecasts for the period of March 1 to May 31, 2016.

  9. Lady luck the theory of probability

    CERN Document Server

    Weaver, Warren

    1982-01-01

    ""Should I take my umbrella?"" ""Should I buy insurance?"" ""Which horse should I bet on?"" Every day ― in business, in love affairs, in forecasting the weather or the stock market questions arise which cannot be answered by a simple ""yes"" or ""no."" Many of these questions involve probability. Probabilistic thinking is as crucially important in ordinary affairs as it is in the most abstruse realms of science. This book is the best nontechnical introduction to probability ever written. Its author, the late Dr. Warren Weaver, was a professor of mathematics, active in the Rockefeller and Sloa

  10. Non-parametric data-based approach for the quantification and communication of uncertainties in river flood forecasts

    Science.gov (United States)

    Van Steenbergen, N.; Willems, P.

    2012-04-01

    Reliable flood forecasts are among the most important non-structural measures to reduce the impact of floods. However, flood forecasting systems are subject to uncertainty originating from the input data, model structure and model parameters of the different hydraulic and hydrological submodels. To quantify this uncertainty, a non-parametric data-based approach has been developed. This approach analyses the historical forecast residuals (differences between the predictions and the observations at river gauging stations) without using a predefined statistical error distribution. Because the residuals are correlated with the value of the forecasted water level and with the lead time, the residuals are split up into discrete classes of simulated water levels and lead times. For each class, percentile values of the model residuals are calculated and stored in a 'three dimensional error' matrix. By 3D interpolation in this error matrix, the uncertainty in new forecasted water levels can be quantified. In addition to the quantification of the uncertainty, the communication of this uncertainty is equally important. The communication has to be done in a consistent way, reducing the chance of misinterpretation. It also needs to be adapted to the audience: the majority of the general public is not interested in in-depth information on the uncertainty of the predicted water levels, but only in the likelihood of exceedance of certain alarm levels. Water managers need more information, e.g. time-dependent uncertainty information, because they rely on it to undertake the appropriate flood mitigation action. There are various ways of presenting uncertainty information (numerical, linguistic, graphical, time (in)dependent, etc.), each with advantages and disadvantages for a specific audience. A useful method to communicate uncertainty of flood forecasts is probabilistic flood mapping. These maps give a representation of the
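
A compact sketch of the "three dimensional error" matrix idea described above: bin historical residuals by forecasted water-level class and lead-time class, store residual percentiles per bin, and look them up for a new forecast. The bin edges, the percentile choices and the nearest-bin lookup (in place of the paper's full 3D interpolation) are simplifying assumptions:

```python
import numpy as np

def build_error_matrix(levels, leads, residuals, level_edges, lead_edges,
                       pcts=(5, 50, 95)):
    """Percentile matrix of historical residuals per (water-level class,
    lead-time class): a minimal 'three dimensional error' matrix."""
    li = np.digitize(levels, level_edges)
    ti = np.digitize(leads, lead_edges)
    n_l, n_t = len(level_edges) + 1, len(lead_edges) + 1
    mat = np.full((n_l, n_t, len(pcts)), np.nan)
    for a in range(n_l):
        for b in range(n_t):
            sel = residuals[(li == a) & (ti == b)]
            if sel.size:
                mat[a, b] = np.percentile(sel, pcts)
    return mat

def uncertainty_band(level, lead, mat, level_edges, lead_edges):
    """Nearest-class lookup of the residual percentiles for a new forecast
    (the paper interpolates in 3D instead)."""
    return mat[np.digitize(level, level_edges), np.digitize(lead, lead_edges)]
```

The returned percentiles translate directly into an uncertainty band around a new forecasted water level, which is the quantity communicated to water managers.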

  11. Access to Risk Mitigating Weather Forecasts and Changes in Farming Operations in East and West Africa: Evidence from a Baseline Survey

    Directory of Open Access Journals (Sweden)

    Abayomi Samuel Oyekale

    2015-10-01

    Unfavorable weather currently ranks among the major challenges facing agricultural development in many African countries. Impact mitigation through access to reliable and timely weather forecasts and other adaptive mechanisms is foremost in Africa’s policy dialogues and socio-economic development agendas. This paper analyzed the factors influencing access to forecasts on incidence of pests/diseases (PD) and start of rainfall (SR). The data were collected by Climate Change Agriculture and Food Security (CCAFS) and analyzed with Probit regression separately for East Africa, West Africa and the combined dataset. The results show that 62.7% and 56.4% of the farmers from East and West Africa, respectively, had access to forecasts on start of rainfall. In addition, 39.3% and 49.4% of the farmers from East Africa indicated that forecasts on outbreak of pests/diseases and start of rainfall, respectively, were accompanied with advice, as against 18.2% and 41.9% for West Africa. Having received forecasts on start of rainfall, 24.0% and 17.6% of the farmers from East and West Africa, respectively, made decisions on timing of farming activities. Probabilities of having access to forecasts on PD significantly increased with access to formal education, farm income and previous exposure to climatic shocks. Furthermore, probabilities of having access to forecasts on SR significantly increased (p < 0.05) with access to business income, radio and perception of more erratic rainfall, among others. It was recommended that promotion of informal education among illiterate farmers would enhance their climatic resilience, among other measures.

  12. Reliability and safety engineering

    CERN Document Server

    Verma, Ajit Kumar; Karanki, Durga Rao

    2016-01-01

    Reliability and safety are core issues that must be addressed throughout the life cycle of engineering systems. Reliability and Safety Engineering presents an overview of the basic concepts, together with simple and practical illustrations. The authors present reliability terminology in various engineering fields, viz., electronics engineering, software engineering, mechanical engineering, structural engineering and power systems engineering. The book describes the latest applications in the area of probabilistic safety assessment, such as technical specification optimization, risk monitoring and risk informed in-service inspection. Reliability and safety studies must, inevitably, deal with uncertainty, so the book includes uncertainty propagation methods: Monte Carlo simulation, fuzzy arithmetic, Dempster-Shafer theory and probability bounds. Reliability and Safety Engineering also highlights advances in system reliability and safety assessment including dynamic system modeling and uncertainty management. Cas...

  13. a system approach to the long term forecasting of the climat data in baikal region

    Science.gov (United States)

    Abasov, N.; Berezhnykh, T.

    2003-04-01

    The Angara River, running from Lake Baikal with a cascade of hydropower plants built on it, plays a peculiar role in the economy of the region. In view of the high variability of water inflow into the rivers and lakes (long-term low-water periods and catastrophic floods), which is due to climatic peculiarities of water resource formation, long-term forecasting is developed and applied for risk reduction at hydropower plants. The methodology and methods of long-term forecasting of natural-climatic processes employ some ideas of the research schools of Academician I.P. Druzhinin and Prof. A.P. Reznikhov, and consist of detailed investigation of cause-effect relations, finding physical analogs and applying them in formalized methods of long-term forecasting. These are divided into qualitative methods (background method; method of analogs based on solar activity), probabilistic and approximative methods (analog-similarity relations; discrete-continuous model). The forecasting methods have been implemented as analytical tools of the information-forecasting software "GIPSAR", which provides some elements of artificial intelligence. Background forecasts of the runoff of the Ob, Yenisei and Angara Rivers in the south of Siberia are based on space-time regularities revealed by taking account of the phase shifts in the occurrence of secular maxima and minima on integral-difference curves of many-year hydrological processes in the objects compared. Solar activity plays an essential role in investigations of global variations of climatic processes. Its consideration in the method of superimposed epochs has led to the conclusion that a low-water period in the actual inflow to Lake Baikal is more probable on the increasing branch of the 11-year solar activity cycle. A high-water period is more probable on the decreasing branch of solar activity, from the 2nd to the 5th year after its maximum.
Probabilistic method

  14. FORECASTING OF PERFORMANCE EVALUATION OF NEW VEHICLES

    Directory of Open Access Journals (Sweden)

    O. S. Krasheninin

    2016-12-01

    Purpose. The research work focuses on forecasting of performance evaluation of the tractive and non-tractive vehicles that will satisfy and meet the needs and requirements of the railway industry, which is constantly evolving. Methodology. Analysis of the technical condition of the existing fleet of rolling stock (tractive and non-tractive) of Ukrainian Railways shows a substantial reduction, which occurs in connection with its obsolescence and physical wear, as well as insufficient and limited purchase of new units of tractive and non-tractive rolling stock in the required quantity. In this situation it is necessary to search for methods to determine rolling stock technical characteristics. One such urgent and effective measure is forecasting of the defining characteristics of the vehicles based on the processes of their reproduction under limited resources, using a continuous exponential function. The growth-rate function of the projected indicator for the vehicle follows a logistic characteristic: with unlimited resources it has the form of an exponential, and with scarce resources that of a line. Findings. The data obtained according to the proposed method allowed determining the expected (future) values, that is, the ratio of load to body volume for non-tractive rolling stock (gondola cars) and the weight-to-power ratio for tractive rolling stock, as well as the degree of forecast reliability and the standard forecast error, which show high prediction accuracy for the completed procedure. As a result, this will allow estimating the required characteristics of vehicles in the forecast year with high accuracy. Originality. The concept of forecasting the characteristics of the vehicles for decision-making on the evaluation of their prospects was proposed. Practical value. The forecasting methodology will reliably determine the technical parameters of tractive and non-tractive rolling stock, which will meet
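
The logistic growth behavior described in the record above (exponential-like while resources are unlimited, saturating as they become scarce) can be written out directly. The symbols x0 (current value of the indicator), r (growth rate) and K (resource ceiling) are generic illustrative notation, not the paper's:

```python
import math

def logistic_forecast(x0, r, K, t):
    """Logistic growth of a performance indicator: behaves like x0*exp(r*t)
    while x << K, then saturates toward the resource ceiling K."""
    return K / (1.0 + (K - x0) / x0 * math.exp(-r * t))
```

Extrapolating such a curve to the forecast year is the kind of computation the abstract's "expected (future) values" refer to.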

  15. Satellite based Ocean Forecasting, the SOFT project

    Science.gov (United States)

    Stemmann, L.; Tintoré, J.; Moneris, S.

    2003-04-01

    The knowledge of future oceanic conditions would have enormous impact on human marine-related activities. For such reasons, a number of international efforts are being carried out to obtain reliable and manageable ocean forecasting systems. Among the possible techniques that can be used to estimate the near-future states of the ocean, an ocean forecasting system based on satellite imagery is developed through the Satellite-based Ocean ForecasTing (SOFT) project. SOFT, established by the European Commission, considers the development of a forecasting system for ocean space-time variability based on satellite data, using Artificial Intelligence techniques. This system will be merged with numerical simulation approaches, via assimilation techniques, to get a hybrid SOFT-numerical forecasting system of improved performance. The results of the project will provide efficient forecasting of sea-surface temperature structures, currents, dynamic height, and biological activity associated with chlorophyll fields. All these quantities could give valuable information for the planning and management of human activities in marine environments, such as navigation, fisheries, pollution control, or coastal management. A detailed identification of present or new needs and of potential end-users concerned by such an operational tool is being performed. The project will study solutions adapted to these specific needs.

  16. Optimizing Probability of Detection Point Estimate Demonstration

    Science.gov (United States)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. These NDE methods are required to detect real flaws such as cracks and crack-like flaws. A reliably detectable crack size is required for safe-life analysis of fracture-critical parts. The paper discusses optimizing probability of detection (POD) demonstration experiments using the Point Estimate Method, which is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false calls (POF), while keeping the flaw sizes in the set as small as possible.
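
The binomial arithmetic behind the point-estimate demonstration is easy to reproduce: detecting all 29 flaws demonstrates roughly 90% POD at 95% confidence, since 0.9^29 < 0.05. A sketch (the function names are illustrative, not from the mh1823 software):

```python
import math

def prob_pass_demo(true_pod, n=29, max_misses=0):
    """Probability of passing an n-flaw demonstration, allowing at most
    `max_misses` missed detections (binomial model)."""
    return sum(math.comb(n, k) * (1.0 - true_pod) ** k * true_pod ** (n - k)
               for k in range(max_misses + 1))

def pod_lower_bound(n=29, confidence=0.95):
    """POD demonstrated with the given confidence when all n flaws are
    detected: solve p**n = 1 - confidence for p."""
    return (1.0 - confidence) ** (1.0 / n)
```

Trading off `n`, `max_misses`, PPD and POF against flaw size is the optimization the abstract describes.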

  17. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  18. Inaccuracy in traffic forecasts

    DEFF Research Database (Denmark)

    Flyvbjerg, Bent; Holm, Mette K. Skamris; Buhl, Søren Ladegaard

    2006-01-01

    This paper presents results from the first statistically significant study of traffic forecasts in transportation infrastructure projects. The sample used is the largest of its kind, covering 210 projects in 14 nations worth US$58 billion. The study shows with very high statistical significance that forecasters generally do a poor job of estimating the demand for transportation infrastructure projects. The result is substantial downside financial and economic risk. Forecasts have not become more accurate over the 30-year period studied. If techniques and skills for arriving at accurate demand forecasts ... forecasting. Highly inaccurate traffic forecasts combined with large standard deviations translate into large financial and economic risks. But such risks are typically ignored or downplayed by planners and decision-makers, to the detriment of social and economic welfare. The paper presents the data...

  19. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned

  20. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  1. FORECASTING MODELS IN MANAGEMENT

    OpenAIRE

    Sindelar, Jiri

    2008-01-01

    This article deals with the problems of forecasting models. First part of the article is dedicated to definition of the relevant areas (vertical and horizontal pillar of definition) and then the forecasting model itself is defined; as article presents theoretical background for further primary research, this definition is crucial. Finally the position of forecasting models within the management system is identified. The paper is a part of the outputs of FEM CULS grant no. 1312/11/3121.

  2. Forecasting in Planning

    OpenAIRE

    Ike, P.; Voogd, Henk; Voogd, Henk; Linden, Gerard

    2004-01-01

    This chapter begins with a discussion of qualitative forecasting by describing a number of methods that depend on judgements made by stakeholders, experts or other interested parties to arrive at forecasts. Two qualitative approaches are illuminated, the Delphi and scenario methods respectively. Quantitative forecasting is illustrated with a brief overview of time series methods. Both qualitative and quantitative methods are illustrated by an example. The role and relative importance of forec...

  3. The strategy of professional forecasting

    DEFF Research Database (Denmark)

    Ottaviani, Marco; Sørensen, Peter Norman

    2006-01-01

    We develop and compare two theories of professional forecasters’ strategic behavior. The first theory, reputational cheap talk, posits that forecasters endeavor to convince the market that they are well informed. The market evaluates their forecasting talent on the basis of the forecasts and the realized state. If the market expects forecasters to report their posterior expectations honestly, then forecasts are shaded toward the prior mean. With correct market expectations, equilibrium forecasts are imprecise but not shaded. The second theory posits that forecasters compete in a forecasting contest with pre-specified rules. In a winner-take-all contest, equilibrium forecasts are excessively differentiated...

  4. Wind and load forecast error model for multiple geographically distributed forecasts

    Energy Technology Data Exchange (ETDEWEB)

    Makarov, Yuri V.; Reyes-Spindola, Jorge F.; Samaan, Nader; Diao, Ruisheng; Hafen, Ryan P. [Pacific Northwest National Laboratory, Richland, WA (United States)

    2010-07-01

    The impact of wind and load forecast errors on power grid operations is frequently evaluated by conducting multi-variant studies, where these errors are simulated repeatedly as random processes based on their known statistical characteristics. To simulate these errors correctly, we need to reflect their distributions (which do not necessarily follow a known distribution law), standard deviations, and auto- and cross-correlations. For instance, load and wind forecast errors can be closely correlated in different zones of the system. This paper introduces a new methodology for generating multiple cross-correlated random processes to produce forecast error time-domain curves, based on a transition probability matrix computed from an empirical error distribution function. The matrix is used to generate new error time series with statistical features similar to observed errors. We present the derivation of the method and some experimental results obtained by generating new error forecasts together with their statistics. (orig.)
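
A minimal single-series sketch of the transition-probability-matrix idea above: discretize observed errors into bins, estimate the empirical transition matrix, and sample a new series with similar one-step statistics. The binning scheme and smoothing constant are illustrative assumptions, and the paper's cross-correlated multi-zone extension is not shown:

```python
import numpy as np

def fit_transition_matrix(errors, edges):
    """Row-stochastic transition matrix over discretized error states,
    estimated from the observed error series."""
    states = np.digitize(errors, edges)
    n = len(edges) + 1
    counts = np.zeros((n, n))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1.0
    counts += 1e-9  # guard against empty rows
    return counts / counts.sum(axis=1, keepdims=True)

def sample_error_series(P, state_values, n_steps, start_state=0, seed=0):
    """Generate a new error series whose one-step transition statistics
    mimic the observed series."""
    rng = np.random.default_rng(seed)
    s = start_state
    out = np.empty(n_steps)
    for i in range(n_steps):
        s = rng.choice(len(state_values), p=P[s])
        out[i] = state_values[s]
    return out
```

Because the chain is fitted from the empirical series, the generated errors inherit its marginal distribution and lag-1 autocorrelation without assuming any distribution law.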

  5. Multi-step wind speed forecasting based on a hybrid forecasting architecture and an improved bat algorithm

    International Nuclear Information System (INIS)

    Xiao, Liye; Qian, Feng; Shao, Wei

    2017-01-01

    Highlights: • Propose a hybrid architecture based on a modified bat algorithm for multi-step wind speed forecasting. • Improve the accuracy of multi-step wind speed forecasting. • Modify the bat algorithm with CG to improve optimization performance. - Abstract: As one of the most promising sustainable energy sources, wind energy plays an important role in energy development because it is clean and non-polluting. Generally, wind speed forecasting, which has an essential influence on wind power systems, is regarded as a challenging task. Analyses based on single-step wind speed forecasting have been widely used, but their results are insufficient for ensuring the reliability and controllability of wind power systems. In this paper, a new forecasting architecture based on decomposition algorithms and modified neural networks is successfully developed for multi-step wind speed forecasting. Four different hybrid models are contained in this architecture, and to further improve the forecasting performance, a modified bat algorithm (BA) with the conjugate gradient (CG) method is developed to optimize the initial weights between layers and the thresholds of the hidden layer of the neural networks. To investigate the forecasting abilities of the four models, wind speed data collected from four different wind power stations in Penglai, China, were used as a case study. The numerical experiments showed that the hybrid model including singular spectrum analysis and a general regression neural network with CG-BA (SSA-CG-BA-GRNN) achieved the most accurate forecasting results in one-step to three-step wind speed forecasting.
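
The GRNN component of the hybrid model above is essentially Gaussian-kernel regression over lagged inputs, which is simple to sketch. This stand-in omits the SSA decomposition and the CG-BA optimization (a fixed bandwidth `sigma` is assumed instead of optimized smoothing):

```python
import numpy as np

def make_lagged(series, n_lags):
    """Lagged input matrix and one-step-ahead targets."""
    X = np.array([series[i:i + n_lags] for i in range(len(series) - n_lags)])
    return X, series[n_lags:]

def grnn_predict(X_train, y_train, x, sigma=0.5):
    """General regression neural network: a Gaussian-kernel weighted
    average of training targets (Nadaraya-Watson regression)."""
    d2 = np.sum((X_train - x) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return float(np.sum(w * y_train) / np.sum(w))
```

Feeding each prediction back in as the newest lag yields the multi-step forecasts the abstract evaluates; the bandwidth plays the role the paper assigns to the CG-BA-tuned network parameters.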

  6. Reliability and mechanical design

    International Nuclear Information System (INIS)

    Lemaire, Maurice

    1997-01-01

    Many results in mechanical design are obtained from a model of physical reality and from a numerical solution that leads to the evaluation of needs and resources. The goal of reliability analysis is to evaluate the confidence that can be granted to the chosen design, through the calculation of a probability of failure linked to the retained scenario. Two types of analysis are proposed: sensitivity analysis and reliability analysis. Approximate methods are applicable to problems related to reliability, availability, maintainability and safety (RAMS).
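
The "probability of failure linked to the retained scenario" can be illustrated with the classic stress-strength formulation, estimated by Monte Carlo simulation. The normal distributions and their parameters below are hypothetical, chosen only to make the example concrete:

```python
import numpy as np

def failure_probability(n=1_000_000, seed=0):
    """Monte Carlo estimate of P(load > resistance) for a stress-strength
    reliability scenario (distribution parameters are hypothetical)."""
    rng = np.random.default_rng(seed)
    resistance = rng.normal(300.0, 30.0, n)  # capacity, e.g. MPa
    load = rng.normal(200.0, 25.0, n)        # demand
    return float((load > resistance).mean())
```

For two independent normals the result can be checked analytically, since the safety margin resistance - load is itself normal with mean 100 and standard deviation sqrt(30^2 + 25^2).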

  7. A global flash flood forecasting system

    Science.gov (United States)

    Baugh, Calum; Pappenberger, Florian; Wetterhall, Fredrik; Hewson, Tim; Zsoter, Ervin

    2016-04-01

    The sudden and devastating nature of flash flood events means it is imperative to provide early warnings such as those derived from Numerical Weather Prediction (NWP) forecasts. Currently such systems exist on basin, national and continental scales in Europe, North America and Australia, but they rely on high-resolution NWP forecasts or rainfall-radar nowcasting, neither of which has global coverage. To produce global flash flood forecasts, this work investigates the possibility of using forecasts from a global NWP system. In particular we: (i) discuss how global NWP can be used for flash flood forecasting, along with its strengths and weaknesses; (ii) demonstrate how a robust evaluation can be performed given the rarity of the event; (iii) highlight the challenges and opportunities in communicating flash flood uncertainty to decision makers; and (iv) explore future developments which would significantly improve global flash flood forecasting. The proposed forecast system uses ensemble surface runoff forecasts from the ECMWF H-TESSEL land surface scheme. A flash flood index is generated using the ERIC (Enhanced Runoff Index based on Climatology) methodology [Raynaud et al., 2014]. This global methodology is applied to a series of flash floods across southern Europe. Results from the system are compared against warnings produced using the higher-resolution COSMO-LEPS limited area model. The global system is evaluated by comparing forecasted warning locations against a flash flood database of media reports created in partnership with floodlist.com. To deal with the lack of objectivity in media reports, we carefully assess the suitability of different skill scores and apply spatial uncertainty thresholds to the observations. To communicate the uncertainties of the flash flood system output, we experiment with a dynamic region-growing algorithm. This automatically clusters regions of similar return period exceedance probabilities, thus presenting the at-risk areas at a spatial
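
The region-growing step mentioned above can be sketched as a flood-fill over grid cells whose exceedance probability passes a threshold. Using 4-connectivity and a single fixed threshold (rather than the paper's dynamic clustering of similar return-period probabilities) is a simplifying assumption:

```python
from collections import deque

def grow_regions(grid, threshold):
    """Cluster 4-connected grid cells whose exceedance probability is at
    least `threshold`; each cluster is one contiguous at-risk area."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    clusters = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] >= threshold and not seen[r][c]:
                queue = deque([(r, c)])
                seen[r][c] = True
                cells = []
                while queue:  # breadth-first region growth
                    i, j = queue.popleft()
                    cells.append((i, j))
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if (0 <= ni < rows and 0 <= nj < cols
                                and not seen[ni][nj]
                                and grid[ni][nj] >= threshold):
                            seen[ni][nj] = True
                            queue.append((ni, nj))
                clusters.append(cells)
    return clusters
```

Presenting clusters rather than individual cells is what makes the at-risk areas legible to decision makers.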

  8. House Price Forecasts, Forecaster Herding, and the Recent Crisis

    Directory of Open Access Journals (Sweden)

    Christian Pierdzioch

    2012-11-01

    We used the Wall Street Journal survey data for the period 2006–2012 to analyze whether forecasts of house prices and housing starts provide evidence of (anti-)herding of forecasters. Forecasts are consistent with herding (anti-herding) of forecasters if forecasts are biased towards (away from) the consensus forecast. We found that anti-herding is prevalent among forecasters of house prices. We also report that, following the recent crisis, the prevalence of forecaster anti-herding seems to have changed over time.

  9. House Price Forecasts, Forecaster Herding, and the Recent Crisis

    DEFF Research Database (Denmark)

    Stadtmann, Georg; Pierdzioch; Ruelke

    2013-01-01

    We used the Wall Street Journal survey data for the period 2006–2012 to analyze whether forecasts of house prices and housing starts provide evidence of (anti-)herding of forecasters. Forecasts are consistent with herding (anti-herding) of forecasters if forecasts are biased towards (away from) the consensus forecast. We found that anti-herding is prevalent among forecasters of house prices. We also report that, following the recent crisis, the prevalence of forecaster anti-herding seems to have changed over time.

  10. World Area Forecast System (WAFS)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The World Area Forecast System (WAFS) is a worldwide system by which world area forecast centers provide aeronautical meteorological en-route forecasts in uniform...

  11. Forecasting in Planning

    NARCIS (Netherlands)

    Ike, P.; Voogd, Henk; Voogd, Henk; Linden, Gerard

    2004-01-01

    This chapter begins with a discussion of qualitative forecasting by describing a number of methods that depend on judgements made by stakeholders, experts or other interested parties to arrive at forecasts. Two qualitative approaches are illuminated, the Delphi and scenario methods respectively.

  12. Improving Garch Volatility Forecasts

    NARCIS (Netherlands)

    Klaassen, F.J.G.M.

    1998-01-01

    Many researchers use GARCH models to generate volatility forecasts. We show, however, that such forecasts are too variable. To correct for this, we extend the GARCH model by distinguishing two regimes with different volatility levels. GARCH effects are allowed within each regime, so that our model

  13. Forecast Accuracy Uncertainty and Momentum

    OpenAIRE

    Bing Han; Dong Hong; Mitch Warachka

    2009-01-01

    We demonstrate that stock price momentum and earnings momentum can result from uncertainty surrounding the accuracy of cash flow forecasts. Our model has multiple information sources issuing cash flow forecasts for a stock. The investor combines these forecasts into an aggregate cash flow estimate that has minimal mean-squared forecast error. This aggregate estimate weights each cash flow forecast by the estimated accuracy of its issuer, which is obtained from their past forecast errors. Mome...

  14. Foundations of probability

    International Nuclear Information System (INIS)

    Fraassen, B.C. van

    1979-01-01

    The interpretation of probabilities in physical theories is considered, whether quantum or classical. The following points are discussed: 1) the functions P(μ, Q), in terms of which states and propositions can be represented, are classical (Kolmogoroff) probabilities, formally speaking; 2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and 3) testing of the theory typically takes the form of confronting the expectation values of an observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)

  15. The quantum probability calculus

    International Nuclear Information System (INIS)

    Jauch, J.M.

    1976-01-01

    The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)

  16. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications.

  17. Probability of satellite collision

    Science.gov (United States)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.

  18. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...

  19. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio

  20. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  1. Proposed reliability cost model

    Science.gov (United States)

    Delionback, L. M.

    1973-01-01

    The research investigations involved in the study include: cost analysis/allocation, reliability and product assurance, forecasting methodology, systems analysis, and model-building. This is a classic example of an interdisciplinary problem, since the model-building requirements include the need for understanding and communication between technical disciplines on one hand, and the financial/accounting skill categories on the other. The systems approach is utilized within this context to establish a clearer and more objective relationship between reliability assurance and the subcategories (or subelements) that provide, or reinforce, the reliability assurance for a system. Subcategories are further subdivided as illustrated by a tree diagram. The reliability assurance elements can be seen to be potential alternative strategies, or approaches, depending on the specific goals/objectives of the trade studies. The scope was limited to the establishment of a proposed reliability cost-model format. The model format/approach depends upon the use of a series of subsystem-oriented CERs and, where possible, CTRs in devising a suitable cost-effective policy.

  2. Evaluating machine-learning techniques for recruitment forecasting of seven North East Atlantic fish species

    KAUST Repository

    Fernandes, José Antonio

    2015-01-01

    The effect of different factors (spawning biomass, environmental conditions) on recruitment is a subject of great importance in the management of fisheries, recovery plans and scenario exploration. In this study, recently proposed supervised classification techniques, tested by the machine-learning community, are applied to forecast the recruitment of seven fish species of the North East Atlantic (anchovy, sardine, mackerel, horse mackerel, hake, blue whiting and albacore), using spawning, environmental and climatic data. In addition, the use of the probabilistic flexible naive Bayes classifier (FNBC) is proposed as a modelling approach in order to reduce uncertainty for fisheries management purposes. These improvements aim to provide better probability estimates of each possible outcome (low, medium and high recruitment) based on kernel density estimation, which is crucial for informed management decision making under high uncertainty. Finally, a comparison between goodness-of-fit and generalization power is provided, in order to assess the reliability of the final forecasting models. It is found that in most cases the proposed methodology provides useful information for management, whereas the case of horse mackerel illustrates the limitations of the approach. The proposed improvements allow for a better probabilistic estimation of the different scenarios, i.e. they reduce the uncertainty in the provided forecasts.

  3. The effort to increase the space weather forecasting accuracy in KSWC

    Science.gov (United States)

    Choi, J. S.

    2017-12-01

    The Korean Space Weather Center (KSWC) of the National Radio Research Agency (RRA) is a government agency which is the official source of space weather information for the Korean Government and the primary action agency for emergency measures under severe space weather conditions, as the Regional Warning Center of the International Space Environment Service (ISES). KSWC's main role is providing alerts, watches, and forecasts in order to minimize space weather impacts on both public and commercial sectors: satellites, aviation, communications, navigation, power grids, etc. KSWC is also in charge of monitoring the space weather condition and conducting research and development for its space weather operations in Korea. Recently, KSWC has been focusing on increasing the accuracy of space weather forecasting results and verifying model-generated results. The forecasting accuracy will be calculated based on probabilistic statistical estimation so that the results can be compared numerically. Regarding the cosmic radiation dose, we are gathering actual measured radiation dose data in cooperation with the domestic airlines. Based on the measurements, we are going to verify the reliability of the SAFE system, which was developed by KSWC to provide cosmic radiation dose information to airline cabin crews and public users.

  4. Propagation of Uncertainty in Bayesian Kernel Models - Application to Multiple-Step Ahead Forecasting

    DEFF Research Database (Denmark)

    Quinonero, Joaquin; Girard, Agathe; Larsen, Jan

    2003-01-01

    The object of Bayesian modelling is the predictive distribution, which, in a forecasting scenario, enables evaluation of forecasted values and their uncertainties. We focus on reliably estimating the predictive mean and variance of forecasted values using Bayesian kernel based models such as the Gaussian process and the relevance vector machine. We derive novel analytic expressions for the predictive mean and variance for Gaussian kernel shapes under the assumption of a Gaussian input distribution in the static case, and of a recursive Gaussian predictive density in iterative forecasting...

  5. Ensemble Solar Forecasting Statistical Quantification and Sensitivity Analysis: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Cheung, WanYin; Zhang, Jie; Florita, Anthony; Hodge, Bri-Mathias; Lu, Siyuan; Hamann, Hendrik F.; Sun, Qian; Lehman, Brad

    2015-12-08

    Uncertainties associated with solar forecasts present challenges to maintain grid reliability, especially at high solar penetrations. This study aims to quantify the errors associated with the day-ahead solar forecast parameters and the theoretical solar power output for a 51-kW solar power plant in a utility area in the state of Vermont, U.S. Forecasts were generated by three numerical weather prediction (NWP) models, including the Rapid Refresh, the High Resolution Rapid Refresh, and the North American Model, and a machine-learning ensemble model. A photovoltaic (PV) performance model was adopted to calculate theoretical solar power generation using the forecast parameters (e.g., irradiance, cell temperature, and wind speed). Errors of the power outputs were quantified using statistical moments and a suite of metrics, such as the normalized root mean squared error (NRMSE). In addition, the PV model's sensitivity to different forecast parameters was quantified and analyzed. Results showed that the ensemble model yielded forecasts in all parameters with the smallest NRMSE. The NRMSE of solar irradiance forecasts of the ensemble NWP model was reduced by 28.10% compared to the best of the three NWP models. Further, the sensitivity analysis indicated that the errors of the forecasted cell temperature contributed only approximately 0.12% to the NRMSE of the power output, as opposed to 7.44% from the forecasted solar irradiance.

  6. Device for forecasting reactor power-up routes

    International Nuclear Information System (INIS)

    Fukuzaki, Takaharu.

    1980-01-01

    Purpose: To improve the reliability and forecasting accuracy of a device that forecasts state changes on line in BWR type reactors. Constitution: The present state of the nuclear reactor is estimated in a present-state judging section based on measurement signals for thermal power, core flow rate, control rod density and the like from the reactor, and the estimated results are accumulated in an operation result collecting section. Meanwhile, a forecasting section forecasts the future state of the reactor based on signals from the forecasting condition setting section. The actual values from the collecting section and the forecast results are compared with each other. If they are not equal, new setting signals are output from the setting section and the forecast is performed again. These procedures are repeated until the difference between the forecast results and the actual values is minimized, by which accurate forecasting of the reactor state is made possible. (Furukawa, Y.)

  7. On Bayesian System Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Soerensen Ringi, M

    1995-05-01

    The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so-called frequentist school. A new model for system reliability prediction is given in two papers. The model incorporates the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non-identical environments. 85 refs.

  8. On Bayesian System Reliability Analysis

    International Nuclear Information System (INIS)

    Soerensen Ringi, M.

    1995-01-01

    The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so-called frequentist school. A new model for system reliability prediction is given in two papers. The model incorporates the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non-identical environments. 85 refs

  9. Multicomponent ensemble models to forecast induced seismicity

    Science.gov (United States)

    Király-Proag, E.; Gischig, V.; Zechar, J. D.; Wiemer, S.

    2018-01-01

    In recent years, human-induced seismicity has become a more and more relevant topic due to its economic and social implications. Several models and approaches have been developed to explain underlying physical processes or forecast induced seismicity. They range from simple statistical models to coupled numerical models incorporating complex physics. We advocate the need for forecast testing as currently the best method for ascertaining whether models can reasonably account for the key governing physical processes. Moreover, operational forecast models are of great interest for supporting on-site decision-making in projects entailing induced earthquakes. We previously introduced a standardized framework following the guidelines of the Collaboratory for the Study of Earthquake Predictability, the Induced Seismicity Test Bench, to test, validate, and rank induced seismicity models. In this study, we describe how to construct multicomponent ensemble models based on Bayesian weightings that deliver more accurate forecasts than individual models in the case of the Basel 2006 and Soultz-sous-Forêts 2004 enhanced geothermal stimulation projects. For this, we examine five calibrated variants of two significantly different model groups: (1) Shapiro and Smoothed Seismicity, based on the seismogenic index, simple modified Omori-law-type seismicity decay, and temporally weighted smoothed seismicity; (2) Hydraulics and Seismicity, based on numerically modelled pore pressure evolution that triggers seismicity using the Mohr-Coulomb failure criterion. We also demonstrate how the individual and ensemble models would perform as part of an operational Adaptive Traffic Light System. Investigating seismicity forecasts based on a range of potential injection scenarios, we use forecast periods of different durations to compute the occurrence probabilities of seismic events M ≥ 3. We show that in the case of the Basel 2006 geothermal stimulation the models forecast hazardous levels

  10. Improving operational flood forecasting through data assimilation

    Science.gov (United States)

    Rakovec, Oldrich; Weerts, Albrecht; Uijlenhoet, Remko; Hazenberg, Pieter; Torfs, Paul

    2010-05-01

    Accurate flood forecasts have been a challenging topic in hydrology for decades. Uncertainty in hydrological forecasts is due to errors in initial state (e.g. forcing errors in historical mode), errors in model structure and parameters and last but not least the errors in model forcings (weather forecasts) during the forecast mode. More accurate flood forecasts can be obtained through data assimilation by merging observations with model simulations. This enables identification of the sources of uncertainty in the flood forecasting system. Our aim is to assess the different sources of error that affect the initial state and to investigate how they propagate through hydrological models with different levels of spatial variation, starting from lumped models. The knowledge thus obtained can then be used in a data assimilation scheme to improve the flood forecasts. This study presents the first results of this framework and focuses on quantifying precipitation errors and their effect on discharge simulations within the Ourthe catchment (1600 km2), which is situated in the Belgian Ardennes and is one of the larger subbasins of the Meuse River. Inside the catchment, hourly rain gauge information from 10 different locations is available over a period of 15 years. Based on these time series, the bootstrap method has been applied to generate precipitation ensembles. These were then used to simulate the catchment's discharges at the outlet. The corresponding streamflow ensembles were further assimilated with observed river discharges to update the model states of lumped hydrological models (R-PDM, HBV) through Residual Resampling. This particle filtering technique is a sequential data assimilation method and takes no prior assumption of the probability density function for the model states, which, in contrast to the Ensemble Kalman filter, need not be Gaussian. Our further research will be aimed at quantifying and reducing the sources of uncertainty that affect the initial
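
    The Residual Resampling step mentioned in this record can be sketched as a generic particle-filter utility. This is a minimal illustration, not the authors' R-PDM/HBV code; the weight vector in the usage example is purely illustrative.

```python
import random

def residual_resample(weights, seed=0):
    """Residual resampling: deterministically keep floor(N * w_i) copies
    of particle i, then fill the remaining slots by sampling from the
    residual weights. Returns the list of selected particle indices."""
    rng = random.Random(seed)
    n = len(weights)
    scaled = [n * w for w in weights]
    counts = [int(s) for s in scaled]          # deterministic copies
    residuals = [s - c for s, c in zip(scaled, counts)]
    indices = [i for i, c in enumerate(counts) for _ in range(c)]
    total = sum(residuals)
    for _ in range(n - len(indices)):          # fill remaining slots at random
        u = rng.random() * total
        acc = 0.0
        for i, r in enumerate(residuals):
            acc += r
            if r > 0.0 and u <= acc:
                indices.append(i)
                break
        else:
            indices.append(n - 1)
    return indices

# Example: particles 0 and 1 are kept deterministically, the last slot
# is drawn from the residual mass on particles 2 and 3.
selected = residual_resample([0.5, 0.25, 0.125, 0.125], seed=3)
```

    Compared with plain multinomial resampling, the deterministic part reduces the variance introduced by the resampling step itself.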

  11. Automated time series forecasting for biosurveillance.

    Science.gov (United States)

    Burkom, Howard S; Murphy, Sean Patrick; Shmueli, Galit

    2007-09-30

    For robust detection performance, traditional control chart monitoring for biosurveillance is based on input data free of trends, day-of-week effects, and other systematic behaviour. Time series forecasting methods may be used to remove this behaviour by subtracting forecasts from observations to form residuals for algorithmic input. We describe three forecast methods and compare their predictive accuracy on each of 16 authentic syndromic data streams. The methods are (1) a non-adaptive regression model using a long historical baseline, (2) an adaptive regression model with a shorter, sliding baseline, and (3) the Holt-Winters method for generalized exponential smoothing. Criteria for comparing the forecasts were the root-mean-square error, the median absolute per cent error (MedAPE), and the median absolute deviation. The median-based criteria showed best overall performance for the Holt-Winters method. The MedAPE measures over the 16 test series averaged 16.5, 11.6, and 9.7 for the non-adaptive regression, adaptive regression, and Holt-Winters methods, respectively. The non-adaptive regression forecasts were degraded by changes in the data behaviour in the fixed baseline period used to compute model coefficients. The mean-based criterion was less conclusive because of the effects of poor forecasts on a small number of calendar holidays. The Holt-Winters method was also most effective at removing serial autocorrelation, with most 1-day-lag autocorrelation coefficients below 0.15. The forecast methods were compared without tuning them to the behaviour of individual series. We achieved improved predictions with such tuning of the Holt-Winters method, but practical use of such improvements for routine surveillance will require reliable data classification methods.
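
    For concreteness, the Holt-Winters method and the MedAPE criterion quoted in this record can be sketched as additive exponential smoothing over a weekly cycle. The smoothing constants and the synthetic series below are illustrative assumptions, not the tuned values or data from the study.

```python
import statistics

def holt_winters_additive(y, period, alpha=0.4, beta=0.1, gamma=0.3):
    """One-step-ahead forecasts with additive Holt-Winters smoothing.
    Level/trend/seasonal states are initialised from the first two cycles;
    the smoothing constants here are illustrative, not tuned values."""
    level = sum(y[:period]) / period
    trend = (sum(y[period:2 * period]) - sum(y[:period])) / period ** 2
    season = [y[i] - level for i in range(period)]
    forecasts = []
    for t in range(period, len(y)):
        forecasts.append(level + trend + season[t % period])
        prev_level = level
        level = alpha * (y[t] - season[t % period]) + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        season[t % period] = gamma * (y[t] - level) + (1 - gamma) * season[t % period]
    return forecasts

def medape(actual, forecast):
    """Median absolute percent error, one of the criteria quoted above."""
    return statistics.median(
        abs(a - f) / abs(a) * 100 for a, f in zip(actual, forecast))

# Illustrative series with a linear trend and a day-of-week pattern.
pattern = [10, -10, 5, -5, 0, 3, -3]
y = [100 + 0.5 * t + pattern[t % 7] for t in range(70)]
f = holt_winters_additive(y, 7)
```

    On such a series, forecast residuals shrink quickly once the level, trend, and seasonal states converge, which mirrors why the median-based criteria favoured Holt-Winters in the comparison above.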

  12. Calculating Cumulative Binomial-Distribution Probabilities

    Science.gov (United States)

    Scheuer, Ernest M.; Bowerman, Paul N.

    1989-01-01

    Cumulative-binomial computer program, CUMBIN, one of set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. CUMBIN, NEWTONP (NPO-17556), and CROSSER (NPO-17557), used independently of one another. Reliabilities and availabilities of k-out-of-n systems analyzed. Used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. Used for calculations of reliability and availability. Program written in C.
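
    The cumulative binomial quantity that CUMBIN tabulates (the program itself is written in C) can be sketched in Python for the k-out-of-n reliability case; the function name below is a hypothetical stand-in, not CUMBIN's actual interface.

```python
from math import comb

def k_out_of_n_reliability(n, k, p):
    """Probability that at least k of n independent components work,
    each with success probability p: the upper tail of the
    binomial(n, p) distribution."""
    return sum(comb(n, i) * p ** i * (1 - p) ** (n - i) for i in range(k, n + 1))

# Example: a 2-out-of-3 system of components with reliability 0.9.
r = k_out_of_n_reliability(3, 2, 0.9)  # 0.9^3 + 3 * 0.9^2 * 0.1 = 0.972
```
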

  13. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  14. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...

  15. Energy Consumption Forecasting for University Sector Buildings

    Directory of Open Access Journals (Sweden)

    Khuram Pervez Amber

    2017-10-01

    Reliable energy forecasting helps managers to prepare future budgets for their buildings. Therefore, a simple, less time-consuming and reliable forecasting model that could be used for different types of buildings is desired. In this paper, we present a forecasting model based on five years of real data sets for one dependent variable (the daily electricity consumption) and six explanatory variables (ambient temperature, solar radiation, relative humidity, wind speed, weekday index and building type). A single mathematical equation for forecasting the daily electricity usage of university buildings has been developed using the Multiple Regression (MR) technique. Data from two such buildings, located at the Southwark Campus of London South Bank University, have been used for this study. The predicted test results of the MR model are examined and judged against real electricity consumption data of both buildings for the year 2011. The results demonstrate that out of the six explanatory variables, three (ambient temperature, weekday index and building type) have significant influence on building energy consumption. The results of this model are associated with a Normalized Root Mean Square Error (NRMSE) of 12% for the administrative building and 13% for the academic building. Finally, some limitations of this study are also discussed.
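
    The multiple-regression fit and the NRMSE reported in this record can be sketched as follows. The predictor names and the toy data are illustrative assumptions; the paper's actual coefficients and data are not reproduced, and NRMSE is taken here as RMSE normalised by the observed range.

```python
def fit_ols(X, y):
    """Ordinary least squares via the normal equations X'X b = X'y,
    solved with Gaussian elimination (fine for a handful of predictors)."""
    Xt = list(zip(*X))
    A = [[sum(r * s for r, s in zip(a, b)) for b in Xt] for a in Xt]
    v = [sum(r * t for r, t in zip(a, y)) for a in Xt]
    m = len(A)
    for i in range(m):
        piv = max(range(i, m), key=lambda r: abs(A[r][i]))  # partial pivoting
        A[i], A[piv] = A[piv], A[i]
        v[i], v[piv] = v[piv], v[i]
        for r in range(i + 1, m):
            f = A[r][i] / A[i][i]
            for c in range(i, m):
                A[r][c] -= f * A[i][c]
            v[r] -= f * v[i]
    b = [0.0] * m
    for i in range(m - 1, -1, -1):                          # back substitution
        b[i] = (v[i] - sum(A[i][c] * b[c] for c in range(i + 1, m))) / A[i][i]
    return b

def nrmse(actual, pred):
    """RMSE normalised by the observed range, as a percentage."""
    mse = sum((a - p) ** 2 for a, p in zip(actual, pred)) / len(actual)
    return mse ** 0.5 / (max(actual) - min(actual)) * 100

# Toy design matrix: intercept, ambient temperature, weekday index.
X = [[1, 10, 0], [1, 15, 1], [1, 20, 0], [1, 25, 1], [1, 12, 0], [1, 18, 1]]
y = [50 + 2 * row[1] + 10 * row[2] for row in X]
coeffs = fit_ols(X, y)
```
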

  16. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  17. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo

  18. A High Precision Artificial Neural Networks Model for Short-Term Energy Load Forecasting

    Directory of Open Access Journals (Sweden)

    Ping-Huan Kuo

    2018-01-01

    One of the most important research topics in smart grid technology is load forecasting, because the accuracy of load forecasting strongly influences the reliability of smart grid systems. In the past, load forecasts were obtained by traditional analysis techniques such as time series analysis and linear regression. Since load forecasting focuses on aggregated electricity consumption patterns, researchers have recently integrated deep learning approaches with machine learning techniques. In this study, an accurate deep neural network algorithm for short-term load forecasting (STLF) is introduced. The forecasting performance of the proposed algorithm is compared with the performances of five artificial intelligence algorithms that are commonly used in load forecasting. The Mean Absolute Percentage Error (MAPE) and Cumulative Variation of Root Mean Square Error (CV-RMSE) are used as accuracy evaluation indexes. The experimental results show that the MAPE and CV-RMSE of the proposed algorithm are 9.77% and 11.66%, respectively, displaying very high forecasting accuracy.
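
    The two accuracy indexes can be computed as below. CV-RMSE is taken here in its usual coefficient-of-variation form (RMSE divided by the mean of the observations); this is an assumption about the paper's exact definition.

```python
def mape(actual, pred):
    """Mean Absolute Percentage Error, in percent."""
    return sum(abs(a - p) / abs(a) for a, p in zip(actual, pred)) / len(actual) * 100

def cv_rmse(actual, pred):
    """Coefficient of variation of the RMSE: RMSE over the mean load, in percent."""
    mse = sum((a - p) ** 2 for a, p in zip(actual, pred)) / len(actual)
    return mse ** 0.5 / (sum(actual) / len(actual)) * 100

# Example: symmetric 10% errors around a flat load of 100.
m = mape([100, 100], [90, 110])      # 10.0
c = cv_rmse([100, 100], [90, 110])   # 10.0
```
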

  19. Probabilistic forecasting of wind power generation using extreme learning machine

    DEFF Research Database (Denmark)

    Wan, Can; Xu, Zhao; Pinson, Pierre

    2014-01-01

    Accurate and reliable forecast of wind power is essential to power system operation and control. However, due to the nonstationarity of wind power series, traditional point forecasting can hardly be accurate, leading to increased uncertainties and risks for system operation. This paper proposes an extreme learning machine (ELM)-based probabilistic forecasting method for wind power generation. To account for the uncertainties in the forecasting results, several bootstrap methods have been compared for modeling the regression uncertainty, based on which the pairs bootstrap method is identified with the best performance. Consequently, a new method for prediction interval formulation based on the ELM and the pairs bootstrap is developed. Wind power forecasting has been conducted in different seasons using the proposed approach with the historical wind power time series as the inputs alone. The results...
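
    The pairs bootstrap idea in this record can be sketched with a simple least-squares line standing in for the ELM: resample (x, y) pairs with replacement, refit, and take percentiles of the bootstrap predictions. The regressor, the interval level, and the data are illustrative assumptions, not the paper's method in full.

```python
import random

def pairs_bootstrap_interval(xs, ys, x_new, n_boot=500, alpha=0.1, seed=1):
    """Pairs bootstrap prediction interval: resample (x, y) pairs with
    replacement, refit a least-squares line (stand-in for the ELM in the
    paper), and take percentiles of the bootstrap predictions at x_new."""
    rng = random.Random(seed)
    preds = []
    n = len(xs)
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        bx = [xs[i] for i in idx]
        by = [ys[i] for i in idx]
        mx = sum(bx) / n
        my = sum(by) / n
        sxx = sum((x - mx) ** 2 for x in bx)
        if sxx == 0:                       # degenerate resample, skip
            continue
        slope = sum((x - mx) * (y - my) for x, y in zip(bx, by)) / sxx
        preds.append(my + slope * (x_new - mx))
    preds.sort()
    lo = preds[int(len(preds) * alpha / 2)]
    hi = preds[int(len(preds) * (1 - alpha / 2)) - 1]
    return lo, hi

# Example: an (alpha = 0.1) 90% interval for the prediction at x = 5.
xs = [float(i) for i in range(10)]
ys = [2.0 * x for x in xs]
lo, hi = pairs_bootstrap_interval(xs, ys, 5.0, n_boot=200)
```

    Because pairs are resampled jointly, the interval reflects uncertainty in both the inputs and the fitted model, which is what makes the method attractive for nonstationary series.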

  20. Estimating market probabilities of future interest rate changes

    OpenAIRE

    Hlušek, Martin

    2002-01-01

    The goal of this paper is to estimate the market consensus forecast of future monetary policy development and to quantify the priced-in probability of interest rate changes for different future time horizons. The proposed model uses the current spot money market yield curve and available money market derivative instruments (forward rate agreements, FRAs) and estimates the market probability of interest rate changes up to a 12-month horizon.

  1. Coping with Changes in International Classifications of Sectors and Occupations: Application in Skills Forecasting. Research Paper No 43

    Science.gov (United States)

    Kvetan, Vladimir, Ed.

    2014-01-01

    Reliable and consistent time series are essential to any kind of economic forecasting. Skills forecasting needs to combine data from national accounts and labour force surveys, with the pan-European dimension of Cedefop's skills supply and demand forecasts, relying on different international classification standards. Sectoral classification (NACE)…

  2. Drought forecasting in Luanhe River basin involving climatic indices

    Science.gov (United States)

    Ren, Weinan; Wang, Yixuan; Li, Jianzhu; Feng, Ping; Smith, Ronald J.

    2017-11-01

    Drought is regarded as one of the most severe natural disasters globally. This is especially the case in Tianjin City, Northern China, where drought can affect economic development and people's livelihoods. Drought forecasting, the basis of drought management, is an important mitigation strategy. In this paper, we develop a probabilistic forecasting model that forecasts transition probabilities from a current Standardized Precipitation Index (SPI) value to a future SPI class, based on the conditional distribution of the multivariate normal distribution, so that two large-scale climatic indices can be involved at the same time, and apply the forecasting model to 26 rain gauges in the Luanhe River basin in North China. The establishment of the model and the derivation of the SPI rest on the assumption that aggregated monthly precipitation is normally distributed. Pearson correlation and Shapiro-Wilk normality tests are used to select an appropriate SPI time scale and large-scale climatic indices. Findings indicated that longer-term aggregated monthly precipitation was, in general, more likely to be normally distributed, and that forecasting models should be applied to each gauge individually rather than to the whole basin. Taking Liying Gauge as an example, we illustrate the impact of the SPI time scale and lead time on transition probabilities. The controlling climatic indices for each gauge are then selected by the Pearson correlation test, and the multivariate normality of the SPI, the corresponding climatic indices for the current month, and the SPI 1, 2, and 3 months later is demonstrated using the Shapiro-Wilk normality test. Subsequently, we illustrate the impact of large-scale oceanic-atmospheric circulation patterns on transition probabilities. Finally, we use a score method to evaluate and compare the performance of the three forecasting models and compare them with two traditional models that forecast transition probabilities from a current to a future SPI class. The results show that the
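
    A stripped-down bivariate version of the transition-probability idea can be sketched as follows: if the current and future SPI are standard bivariate normal with correlation rho, the conditional law of the future SPI is N(rho * current, 1 - rho^2), and a class probability is a difference of normal CDFs. This two-variable sketch omits the climatic indices of the paper's full multivariate model, and the rho and class bounds below are illustrative.

```python
from math import erf, sqrt

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def spi_transition_prob(current_spi, rho, lower, upper):
    """P(lower < SPI_future <= upper | SPI_now = current_spi) when the two
    SPI values are standard bivariate normal with correlation rho.
    The conditional distribution is N(rho * current_spi, 1 - rho**2)."""
    mu = rho * current_spi
    sd = sqrt(1.0 - rho ** 2)
    return phi((upper - mu) / sd) - phi((lower - mu) / sd)

# Example: probability of landing in the class (-1.5, -1.0] next month,
# given SPI now at -1.2 and an assumed lag-1 correlation of 0.6.
p = spi_transition_prob(-1.2, 0.6, -1.5, -1.0)
```
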

  3. An economic framework for forecasting land-use and ecosystem change

    International Nuclear Information System (INIS)

    Lewis, David J.

    2010-01-01

    This paper develops a joint econometric-simulation framework to forecast detailed empirical distributions of the spatial pattern of land-use and ecosystem change. In-sample and out-of-sample forecasting tests are used to examine the performance of the parcel-scale econometric and simulation models, and the importance of multiple forecasting challenges is assessed. The econometric-simulation method is integrated with an ecological model to generate forecasts of the probability of localized extinctions of an amphibian species. The paper demonstrates the potential of integrating economic and ecological models to generate ecological forecasts in the presence of alternative market conditions and land-use policy constraints. (author)

  4. Probability an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

    Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h

  5. The concept of probability

    International Nuclear Information System (INIS)

    Bitsakis, E.I.; Nicolaides, C.A.

    1989-01-01

    The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our everyday life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs

  6. About the National Forecast Chart

    Science.gov (United States)

    The Weather Prediction Center's National Forecast Charts provide analyses and forecasts, with local forecasts available by "City, St" and links to federal, state, and local government web resources and services.

  7. MSSM Forecast for the LHC

    CERN Document Server

    Cabrera, Maria Eugenia; de Austri, Roberto Ruiz

    2009-01-01

    We perform a forecast of the MSSM with universal soft terms (CMSSM) for the LHC, based on an improved Bayesian analysis. We do not incorporate ad hoc measures of fine-tuning to penalize unnatural possibilities: such penalization arises from the Bayesian analysis itself when the experimental value of $M_Z$ is considered. This allows the whole parameter space to be scanned, allowing arbitrarily large soft terms. Still, the low-energy region is statistically favoured (even before including dark matter or g-2 constraints). Contrary to other studies, the results are almost unaffected by changing the upper limits taken for the soft terms. The results are also remarkably stable when using flat or logarithmic priors, a fact that arises from the larger statistical weight of the low-energy region in both cases. We then incorporate all the important experimental constraints into the analysis, obtaining a map of the probability density of the MSSM parameter space, i.e. the forecast of the MSSM. Since not all the experimental i...

  8. Marine Point Forecasts

    Science.gov (United States)

    The interactive map links to the zone forecast and then allows further zooming to the point of interest. Offices include Honolulu, HI; Chicago, IL; Northern Indiana, IN; Lake Charles, LA; New Orleans, LA; Boston, MA; and Caribou, ME.

  9. Socioeconomic Forecasting : [Technical Summary]

    Science.gov (United States)

    2012-01-01

    Because the traffic forecasts produced by the Indiana Statewide Travel Demand Model (ISTDM) are driven by the demographic and socioeconomic inputs to the model, particular attention must be given to obtaining the most accurate demographic and...

  10. NYHOPS Forecast Model Results

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — 3D Marine Nowcast/Forecast System for the New York Bight NYHOPS subdomain. Currents, waves, surface meteorology, and water conditions.

  11. Inflow forecasting at BPA

    Energy Technology Data Exchange (ETDEWEB)

    McManamon, A. [Bonneville Power Administration, Portland, OR (United States)

    2007-07-01

    The Columbia River Power System operates with consideration for flood control, endangered species, navigation, irrigation, water supply, recreation, other fish and wildlife concerns, and power production. The Bonneville Power Administration (BPA), located in Portland, Oregon, is responsible for 35-40 per cent of the power consumed within the region. This presentation discussed inflow forecasting concerns at BPA. The presentation illustrated the elevational relief of projects; annual and daily variability; the hydrologic cycle; the National Weather Service River Forecast System (NWSRFS); components of the NWSRFS; and hydrologic forecast locations. Project operations and inventory were included, along with a comparison of the 71-year average unregulated flow with the regulated flow at The Dalles. Consistency between short-term and long-term forecasts, as well as long-term streamflow forecasts, was also illustrated in graphical format. The presentation also discussed the issues of reducing model and parameter uncertainty; reducing initial-conditions uncertainty; snow updating; and reducing meteorological uncertainty. tabs., figs.

  12. CCAA seasonal forecasting

    International Development Research Centre (IDRC) Digital Library (Canada)

    Integrating meteorological and indigenous knowledge-based seasonal climate forecasts in ..... Explanation is based on spiritual and social values. Taught by .... that provided medicine and food became the subject of strict rules and practices ...

  13. Forecast Icing Product

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Forecast Icing Product (FIP) is an automatically-generated index suitable for depicting areas of potentially hazardous airframe icing. The FIP algorithm uses...

  14. Proof tests on reliability

    International Nuclear Information System (INIS)

    Mishima, Yoshitsugu

    1983-01-01

    In order to obtain public understanding of nuclear power plants, tests should be carried out to prove the reliability and safety of present LWR plants. For example, the aseismicity of nuclear power plants must be verified by using a large-scale earthquake simulator. Reliability tests began in fiscal 1975, and the proof tests on steam generators and on PWR support and flexure pins against stress corrosion cracking have already been completed; the results have been highly appreciated internationally. The capacity factor of nuclear power plant operation in Japan rose to 80% in the summer of 1983, which, considering the periods of regular inspection, means operation at almost full capacity. Japanese LWR technology has now risen to the top place in the world after having overcome its defects. The significance of the reliability tests is to secure functioning until the age limit is reached, to confirm correct forecasts of deterioration processes, to confirm the effectiveness of remedies for defects, and to confirm the accuracy of predictions of the behavior of facilities. The reliability of nuclear valves, fuel assemblies, the heat-affected zones in welding, reactor cooling pumps and electric instruments has been tested or is being tested. (Kako, I.)

  15. Communicating weather forecast uncertainty: Do individual differences matter?

    Science.gov (United States)

    Grounds, Margaret A; Joslyn, Susan L

    2018-03-01

    Research suggests that people make better weather-related decisions when they are given numeric probabilities for critical outcomes (Joslyn & Leclerc, 2012, 2013). However, it is unclear whether all users can take advantage of probabilistic forecasts to the same extent. The research reported here assessed key cognitive and demographic factors to determine their relationship to the use of probabilistic forecasts to improve decision quality. In two studies, participants decided between spending resources to prevent icy conditions on roadways or risk a larger penalty when freezing temperatures occurred. Several forecast formats were tested, including a control condition with the night-time low temperature alone and experimental conditions that also included the probability of freezing and advice based on expected value. All but those with extremely low numeracy scores made better decisions with probabilistic forecasts. Importantly, no groups made worse decisions when probabilities were included. Moreover, numeracy was the best predictor of decision quality, regardless of forecast format, suggesting that the advantage may extend beyond understanding the forecast to general decision strategy issues. This research adds to a growing body of evidence that numerical uncertainty estimates may be an effective way to communicate weather danger to general public end users. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
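
    The expected-value advice tested in the experimental condition reduces to a one-line threshold rule: spend the treatment cost whenever the probability of freezing times the penalty exceeds that cost. A minimal sketch follows; the dollar amounts are illustrative assumptions, not the study's parameters.

```python
# Hedged sketch of expected-value advice for the icy-roads decision:
# treat the roads iff the expected loss from not treating exceeds the
# fixed treatment cost. The costs below are illustrative, not the study's.
def should_treat(p_freeze: float, treat_cost: float, penalty: float) -> bool:
    """Treat iff p(freeze) * penalty > treat_cost."""
    return p_freeze * penalty > treat_cost

# With a $1000 treatment cost and a $5000 penalty, treating pays off
# whenever the probability of freezing exceeds 1000 / 5000 = 0.2.
assert should_treat(0.30, 1000, 5000) is True
assert should_treat(0.10, 1000, 5000) is False
```

    The threshold interpretation (treat when the forecast probability exceeds the cost-to-penalty ratio) is what a numeric probability enables and a deterministic low-temperature forecast does not.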

  16. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure-theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to postgraduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method, either prior to or as an alternative to a characteristic-function presentation. Additionally, considerable emphasis is placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes censored-data martingales. The text includes measure theoretic...

  17. Concepts of probability theory

    CERN Document Server

    Pfeiffer, Paul E

    1979-01-01

    Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, and more. For advanced undergraduate students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.

  18. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics", which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to his memory and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are included especially for their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum, ranging from foundations of probability, across psychological aspects of formulating subjective probability statements and abstract measure-theoretical considerations, to contributions to theoretical statistics an...

  19. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  20. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...