WorldWideScience

Sample records for reliable probability forecasts

  1. Forecasting reliability of transformer populations

    NARCIS (Netherlands)

    Schijndel, van A.; Wetzer, J.; Wouters, P.A.A.F.

    2007-01-01

    The expected replacement wave in the current power grid confronts asset managers with challenging questions. Setting up a replacement strategy and planning calls for a forecast of long-term component reliability. For transformers, the future failure probability can be predicted based on the ongoing

  2. Calibration and combination of dynamical seasonal forecasts to enhance the value of predicted probabilities for managing risk

    Science.gov (United States)

    Dutton, John A.; James, Richard P.; Ross, Jeremy D.

    2013-06-01

    Seasonal probability forecasts produced with numerical dynamics on supercomputers offer great potential value in managing risk and opportunity created by seasonal variability. The skill and reliability of contemporary forecast systems can be increased by calibration methods that use the historical performance of the forecast system to improve the ongoing real-time forecasts. Two calibration methods are applied to seasonal surface temperature forecasts of the US National Weather Service, the European Centre for Medium Range Weather Forecasts, and to a World Climate Service multi-model ensemble created by combining those two forecasts with Bayesian methods. As expected, the multi-model is somewhat more skillful and more reliable than the original models taken alone. The potential value of the multimodel in decision making is illustrated with the profits achieved in simulated trading of a weather derivative. In addition to examining the seasonal models, the article demonstrates that calibrated probability forecasts of weekly average temperatures for leads of 2-4 weeks are also skillful and reliable. The conversion of ensemble forecasts into probability distributions of impact variables is illustrated with degree days derived from the temperature forecasts. Some issues related to loss of stationarity owing to long-term warming are considered. The main conclusion of the article is that properly calibrated probabilistic forecasts possess sufficient skill and reliability to contribute to effective decisions in government and business activities that are sensitive to intraseasonal and seasonal climate variability.
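
    The calibration idea described in this record (using a forecast system's historical performance to adjust its real-time probabilities) can be sketched in a few lines. The equal-width binning and the data below are hypothetical and stand in for the more sophisticated methods the article applies:

```python
# Minimal sketch of recalibrating a probability forecast against the
# observed frequency in its historical forecast bin. Data and the simple
# equal-width binning are hypothetical.

def recalibrate(hist_forecasts, hist_outcomes, new_forecast, n_bins=5):
    """Map a raw forecast probability to the observed event frequency
    among past forecasts that fell into the same probability bin."""
    bins = [[] for _ in range(n_bins)]
    for p, o in zip(hist_forecasts, hist_outcomes):
        bins[min(int(p * n_bins), n_bins - 1)].append(o)
    idx = min(int(new_forecast * n_bins), n_bins - 1)
    if not bins[idx]:            # no history in this bin: leave unchanged
        return new_forecast
    return sum(bins[idx]) / len(bins[idx])

# An overconfident system (says ~0.9, verifies 50%) is pulled toward 0.5:
print(recalibrate([0.9, 0.85, 0.9, 0.82], [1, 0, 1, 0], 0.88))  # -> 0.5
```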

  3. Electricity price forecasting using Enhanced Probability Neural Network

    International Nuclear Information System (INIS)

    Lin, Whei-Min; Gow, Hong-Jey; Tsai, Ming-Tang

    2010-01-01

    This paper proposes a price forecasting system that helps electricity market participants reduce the risk of price volatility. Combining the Probability Neural Network (PNN) and Orthogonal Experimental Design (OED), an Enhanced Probability Neural Network (EPNN) is developed. The Locational Marginal Price (LMP), system load and temperature of the PJM system were collected, and the data clusters were stored in an Excel database according to year, season, workday, and weekend. With the OED used to tune the smoothing parameters of the EPNN, the forecasting error is reduced during the training process, improving accuracy and reliability so that even price "spikes" can be tracked closely. Simulation results show the effectiveness of the proposed EPNN in providing quality information in a price-volatile environment. (author)
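
    For readers unfamiliar with the PNN that the EPNN extends, a bare-bones Parzen-window classifier conveys the core idea. The OED smoothing-parameter tuning from the paper is not reproduced; sigma is simply fixed, and the training data are invented:

```python
# Bare-bones Probabilistic Neural Network (Parzen-window classifier):
# score each class by the average Gaussian kernel density of its training
# points at the query, then pick the highest-scoring class.
import math

def pnn_classify(train, x, sigma=1.0):
    """train: {label: [feature vectors]}; returns the label whose average
    Gaussian-kernel density at x is largest."""
    def kernel(u, v):
        d2 = sum((a - b) ** 2 for a, b in zip(u, v))
        return math.exp(-d2 / (2.0 * sigma ** 2))
    scores = {lab: sum(kernel(p, x) for p in pts) / len(pts)
              for lab, pts in train.items()}
    return max(scores, key=scores.get)

train = {"low": [[1.0, 1.0], [1.2, 0.8]], "high": [[3.0, 3.1], [2.8, 3.0]]}
print(pnn_classify(train, [1.1, 0.9]))   # -> low
print(pnn_classify(train, [2.9, 3.0]))   # -> high
```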

  4. Dynamic SEP event probability forecasts

    Science.gov (United States)

    Kahler, S. W.; Ling, A.

    2015-10-01

    The forecasting of solar energetic particle (SEP) event probabilities at Earth has been based primarily on the estimates of magnetic free energy in active regions and on the observations of peak fluxes and fluences of large (≥ M2) solar X-ray flares. These forecasts are typically issued for the next 24 h or with no definite expiration time, which can be deficient for time-critical operations when no SEP event appears following a large X-ray flare. It is therefore important to decrease the event probability forecast with time as a SEP event fails to appear. We use the NOAA listing of major (≥10 pfu) SEP events from 1976 to 2014 to plot the delay times from X-ray peaks to SEP threshold onsets as a function of solar source longitude. An algorithm is derived to decrease the SEP event probabilities with time when no event is observed to reach the 10 pfu threshold. In addition, we use known SEP event size distributions to modify probability forecasts when SEP intensity increases occur below the 10 pfu event threshold. An algorithm to provide a dynamic SEP event forecast, Pd, for both situations of SEP intensities following a large flare is derived.
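
    One simple, hypothetical way to realize such a decaying forecast is a Bayesian update against a delay-time distribution. The exponential CDF below is illustrative only, not the algorithm the authors derive from the NOAA event list:

```python
# Bayesian decay of an event probability while no event is observed.
# F(t) is the CDF of X-ray-peak-to-SEP-onset delay times; an exponential
# with a 12 h mean is assumed here purely for illustration.
import math

def dynamic_probability(p0, t_hours, mean_delay=12.0):
    """P(event still to come | no onset observed through t_hours)."""
    cdf = 1.0 - math.exp(-t_hours / mean_delay)   # P(onset by t | event)
    survive = p0 * (1.0 - cdf)                    # event occurs, onset later
    return survive / (survive + (1.0 - p0))

p0 = 0.4
for t in (0, 12, 24, 48):
    print(f"t={t:2d} h  Pd={dynamic_probability(p0, t):.3f}")
```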

  5. More intense experiences, less intense forecasts: why people overweight probability specifications in affective forecasts.

    Science.gov (United States)

    Buechel, Eva C; Zhang, Jiao; Morewedge, Carey K; Vosgerau, Joachim

    2014-01-01

    We propose that affective forecasters overestimate the extent to which experienced hedonic responses to an outcome are influenced by the probability of its occurrence. The experience of an outcome (e.g., winning a gamble) is typically more affectively intense than the simulation of that outcome (e.g., imagining winning a gamble) upon which the affective forecast for it is based. We suggest that, as a result, experiencers allocate a larger share of their attention toward the outcome (e.g., winning the gamble) and less to its probability specifications than do affective forecasters. Consequently, hedonic responses to an outcome are less sensitive to its probability specifications than are affective forecasts for that outcome. The results of 6 experiments provide support for our theory. Affective forecasters overestimated how sensitive experiencers would be to the probability of positive and negative outcomes (Experiments 1 and 2). Consistent with our attentional account, differences in sensitivity to probability specifications disappeared when the attention of forecasters was diverted from probability specifications (Experiment 3) or when the attention of experiencers was drawn toward probability specifications (Experiment 4). Finally, differences in sensitivity to probability specifications between forecasters and experiencers were diminished when the forecasted outcome was more affectively intense (Experiments 5 and 6).

  6. Using inferred probabilities to measure the accuracy of imprecise forecasts

    Directory of Open Access Journals (Sweden)

    Paul Lehner

    2012-11-01

    Research on forecasting is effectively limited to forecasts that are expressed with clarity, which is to say that the forecasted event must be sufficiently well defined that it can be clearly resolved whether or not the event occurred, and forecast certainties are expressed as quantitative probabilities. When forecasts are expressed with clarity, quantitative measures (scoring rules, calibration, discrimination, etc.) can be used to measure forecast accuracy, which in turn can be used to measure the comparative accuracy of different forecasting methods. Unfortunately, most real-world forecasts are not expressed clearly. This lack of clarity extends both to the description of the forecast event and to the use of vague language to express forecast certainty. It is thus difficult to assess the accuracy of most real-world forecasts, and consequently the accuracy of the methods used to generate them. This paper addresses this deficiency by presenting an approach to measuring the accuracy of imprecise real-world forecasts using the same quantitative metrics routinely used to measure the accuracy of well-defined forecasts. To demonstrate applicability, the Inferred Probability Method is applied to measure the accuracy of forecasts in fourteen documents examining complex political domains. Key words: inferred probability, imputed probability, judgment-based forecasting, forecast accuracy, imprecise forecasts, political forecasting, verbal probability, probability calibration.
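
    The quantitative scoring this record refers to can be illustrated with the simplest such measure, the Brier score for binary-event probability forecasts (lower is better):

```python
# The Brier score: mean squared difference between forecast probabilities
# and binary outcomes (0 = event did not occur, 1 = it did).

def brier_score(probs, outcomes):
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

# A sharp, well-calibrated forecaster beats a hedging one on the same events:
print(round(brier_score([0.9, 0.1, 0.8], [1, 0, 1]), 4))  # -> 0.02
print(round(brier_score([0.5, 0.5, 0.5], [1, 0, 1]), 4))  # -> 0.25
```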

  7. Operational Earthquake Forecasting and Decision-Making in a Low-Probability Environment

    Science.gov (United States)

    Jordan, T. H.; the International Commission on Earthquake ForecastingCivil Protection

    2011-12-01

    for reliability and skill by retrospective testing, and the models should be under continuous prospective testing against long-term forecasts and alternative time-dependent models. (d) Short-term models used in operational forecasting should be consistent with the long-term forecasts used in probabilistic seismic hazard analysis. (e) Alert procedures should be standardized to facilitate decisions at different levels of government, based in part on objective analysis of costs and benefits. (f) In establishing alert protocols, consideration should also be given to the less tangible aspects of value-of-information, such as gains in psychological preparedness and resilience. Authoritative statements of increased risk, even when the absolute probability is low, can provide a psychological benefit to the public by filling information vacuums that lead to informal predictions and misinformation. Formal OEF procedures based on probabilistic forecasting appropriately separate hazard estimation by scientists from the decision-making role of civil protection authorities. The prosecution of seven Italian scientists on manslaughter charges stemming from their actions before the L'Aquila earthquake makes clear why this separation should be explicit in defining OEF protocols.

  8. On the reliability of seasonal climate forecasts

    Science.gov (United States)

    Weisheimer, A.; Palmer, T. N.

    2014-01-01

    Seasonal climate forecasts are being used increasingly across a range of application sectors. A recent UK governmental report asked: how good are seasonal forecasts on a scale of 1–5 (where 5 is very good), and how good can we expect them to be in 30 years time? Seasonal forecasts are made from ensembles of integrations of numerical models of climate. We argue that ‘goodness’ should be assessed first and foremost in terms of the probabilistic reliability of these ensemble-based forecasts; reliable inputs are essential for any forecast-based decision-making. We propose that a ‘5’ should be reserved for systems that are not only reliable overall, but where, in particular, small ensemble spread is a reliable indicator of low ensemble forecast error. We study the reliability of regional temperature and precipitation forecasts of the current operational seasonal forecast system of the European Centre for Medium-Range Weather Forecasts, universally regarded as one of the world-leading operational institutes producing seasonal climate forecasts. A wide range of ‘goodness’ rankings, depending on region and variable (with summer forecasts of rainfall over Northern Europe performing exceptionally poorly) is found. Finally, we discuss the prospects of reaching ‘5’ across all regions and variables in 30 years time. PMID:24789559

  9. Objective Lightning Probability Forecasts for East-Central Florida Airports

    Science.gov (United States)

    Crawford, Winfred C.

    2013-01-01

    The forecasters at the National Weather Service in Melbourne, FL, (NWS MLB) identified a need to make more accurate lightning forecasts to help alleviate delays due to thunderstorms in the vicinity of several commercial airports in central Florida at which they are responsible for issuing terminal aerodrome forecasts. Such forecasts would also provide safer ground operations around terminals, and would be of value to Center Weather Service Units serving air traffic controllers in Florida. To improve the forecast, the AMU was tasked to develop an objective lightning probability forecast tool for the airports using data from the National Lightning Detection Network (NLDN). The resulting forecast tool is similar to that developed by the AMU to support space launch operations at Kennedy Space Center (KSC) and Cape Canaveral Air Force Station (CCAFS) for use by the 45th Weather Squadron (45 WS) in previous tasks (Lambert and Wheeler 2005, Lambert 2007). The lightning probability forecasts are valid for the time periods and areas needed by the NWS MLB forecasters in the warm season months, defined in this task as May-September.

  10. Exploring the interactions between forecast accuracy, risk perception and perceived forecast reliability in reservoir operator's decision to use forecast

    Science.gov (United States)

    Shafiee-Jood, M.; Cai, X.

    2017-12-01

    Advances in streamflow forecasts at different time scales offer promise for proactive flood management and improved risk management. Despite this potential, previous studies have found that water resources managers are often unwilling to incorporate streamflow forecast information into decision making, particularly in risky situations. While low forecast accuracy is often cited as the main reason, some studies have found that implementation of streamflow forecasts is sometimes impeded by institutional obstacles and behavioral factors (e.g., risk perception). In fact, a seminal study by O'Connor et al. (2005) found that risk perception is the strongest determinant of forecast use, while managers' perception of forecast reliability is not significant. In this study, we aim to address this issue again. However, instead of using survey data and regression analysis, we develop a theoretical framework to assess the user-perceived value of streamflow forecasts. The framework includes a novel behavioral component which incorporates both risk perception and perceived forecast reliability. The framework is then applied to a hypothetical problem in which a reservoir operator must react to probabilistic flood forecasts of differing reliability. The framework allows us to explore the interactions between risk perception and perceived forecast reliability, and between the behavioral components and information accuracy. The findings will provide insights to improve the usability of flood forecast information through better communication and education.

  11. Can confidence indicators forecast the probability of expansion in Croatia?

    Directory of Open Access Journals (Sweden)

    Mirjana Čižmešija

    2016-04-01

    The aim of this paper is to investigate how reliable confidence indicators are in forecasting the probability of expansion. We consider three Croatian Business Survey indicators: the Industrial Confidence Indicator (ICI), the Construction Confidence Indicator (BCI) and the Retail Trade Confidence Indicator (RTCI). The quarterly data used in the research cover the period from 1999/Q1 to 2014/Q1. The empirical analysis consists of two parts. First, the non-parametric Bry-Boschan algorithm is used to distinguish periods of expansion from periods of recession in the Croatian economy. Then, various nonlinear probit models are estimated. The models differ with respect to the regressors (confidence indicators) and the time lags. The positive signs of the estimated parameters suggest that the probability of expansion increases with an increase in the confidence indicators. Based on the obtained results, the conclusion is that the ICI is the most powerful predictor of the probability of expansion in Croatia.
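
    A minimal probit specification of the kind the paper estimates might look as follows; the coefficients here are invented for illustration, not the values fitted to the Croatian 1999/Q1-2014/Q1 data:

```python
# Illustrative probit link: P(expansion) = Phi(b0 + b1 * ICI), where Phi is
# the standard normal CDF. Coefficients b0, b1 are hypothetical.
import math

def phi(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def p_expansion(ici, b0=-0.5, b1=0.04):
    return phi(b0 + b1 * ici)

# Higher confidence-indicator readings raise the probability of expansion:
for ici in (-20, 0, 20):
    print(f"ICI={ici:+d}  P(expansion)={p_expansion(ici):.3f}")
```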

  12. Modeling and Forecasting (Un)Reliable Realized Covariances for More Reliable Financial Decisions

    DEFF Research Database (Denmark)

    Bollerslev, Tim; Patton, Andrew J.; Quaedvlieg, Rogier

    We propose a new framework for modeling and forecasting common financial risks based on (un)reliable realized covariance measures constructed from high-frequency intraday data. Our new approach explicitly incorporates the effect of measurement errors and time-varying attenuation biases into the covariance forecasts.

  13. DEVELOPMENT OF THE PROBABLY-GEOGRAPHICAL FORECAST METHOD FOR DANGEROUS WEATHER PHENOMENA

    Directory of Open Access Journals (Sweden)

    Elena S. Popova

    2015-12-01

    This paper presents a scheme of the probabilistic-geographical method for forecasting dangerous weather phenomena. Two general stages in the implementation of this method are discussed. It is emphasized that the method under development responds to pressing questions of modern weather forecasting: the forecast is issued for a specific point in space and a corresponding moment in time.

  14. Update to the Objective Lightning Probability Forecast Tool in use at Cape Canaveral Air Force Station, Florida

    Science.gov (United States)

    Lambert, Winifred; Roeder, William

    2013-01-01

    This conference poster describes the improvement of a set of lightning probability forecast equations that are used by the 45th Weather Squadron forecasters for their daily 1100 UTC (0700 EDT) weather briefing during the warm season months of May-September. This information is used for general scheduling of operations at Cape Canaveral Air Force Station and Kennedy Space Center. Forecasters at the Spaceflight Meteorology Group also make thunderstorm forecasts during Shuttle flight operations. Five modifications were made by the Applied Meteorology Unit: increased the period of record from 15 to 17 years, changed the method of calculating the flow regime of the day, calculated a new optimal layer relative humidity, used a new smoothing technique for the daily climatology, and used a new valid area. The test results indicated that the modified equations showed an increase in skill over the current equations, good reliability, and an ability to distinguish between lightning and non-lightning days.

  15. Maximizing Statistical Power When Verifying Probabilistic Forecasts of Hydrometeorological Events

    Science.gov (United States)

    DeChant, C. M.; Moradkhani, H.

    2014-12-01

    Hydrometeorological events (e.g., floods, droughts, precipitation) are increasingly being forecasted probabilistically, owing to the uncertainties in the underlying causes of these phenomena. In these forecasts, the probability of the event, over some lead time, is estimated based on some model simulations or predictive indicators. By issuing probabilistic forecasts, agencies may communicate the uncertainty in the event occurring. Assuming that the assigned probability of the event is correct, which is referred to as a reliable forecast, the end user may perform some risk management based on the potential damages resulting from the event. Alternatively, an unreliable forecast may give false impressions of the actual risk, leading to improper decision making when protecting resources from extreme events. Due to this requisite for reliable forecasts to perform effective risk management, this study takes a renewed look at reliability assessment in event forecasts. Illustrative experiments will be presented, showing deficiencies in the commonly available approaches (Brier Score, Reliability Diagram). Overall, it is shown that the conventional reliability assessment techniques do not maximize the ability to distinguish between a reliable and unreliable forecast. In this regard, a theoretical formulation of the probabilistic event forecast verification framework will be presented. From this analysis, hypothesis testing with the Poisson-Binomial distribution is the most exact model available for the verification framework, and therefore maximizes one's ability to distinguish between a reliable and unreliable forecast. Application of this verification system was also examined within a real forecasting case study, highlighting the additional statistical power provided with the use of the Poisson-Binomial distribution.
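
    The Poisson-Binomial idea is concrete: if the issued probabilities are reliable, the observed event count is a draw from the Poisson-Binomial distribution those probabilities define. A sketch of its exact PMF via dynamic programming (illustrative only, not the study's full testing procedure; the probabilities are invented):

```python
# Exact PMF of K = sum of independent Bernoulli(p_i) event indicators,
# built up one forecast at a time by dynamic programming.

def poisson_binomial_pmf(probs):
    """P(K = k) for K = number of events among forecasts with probs p_i."""
    pmf = [1.0]
    for p in probs:
        new = [0.0] * (len(pmf) + 1)
        for k, mass in enumerate(pmf):
            new[k] += mass * (1.0 - p)    # this event does not occur
            new[k + 1] += mass * p        # this event occurs
        pmf = new
    return pmf

pmf = poisson_binomial_pmf([0.2, 0.5, 0.7, 0.9])
print([round(x, 4) for x in pmf])
# Observing zero events would be surprising under these forecasts:
print(round(pmf[0], 4))  # -> 0.012
```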

  16. PROBABLE FORECASTING IN THE COURSE OF INTERPRETING

    Directory of Open Access Journals (Sweden)

    Ye. B. Kagan

    2017-01-01

    Introduction. Translation practice is heuristic in nature and engages the cognitive structures of the interpreter's consciousness. When training translators, special attention is paid to developing their skill of probabilistic forecasting (anticipation). The aim of the present publication is to examine the process of anticipation from the standpoint of the cognitive model of translation and to develop exercises for building the prognostic abilities of students and interpreters working with newspaper articles that carry metaphorical headlines. Methodology and research methods. The study is based on the competence approach to the training of student translators and on a complex of interrelated scientific methods, chief among them the psycholinguistic experiment. Using quantitative data, the features of the perception of newspaper texts through their metaphorical titles are characterized. Results and scientific novelty. On the basis of an experiment in which participants predicted the content of newspaper articles from their metaphorical headlines, it is concluded that the main precondition of predictability is expectation. Probabilistic forecasting as a professional competence of a future translator is formed during training activities by integrating the efforts of the various departments of a language university. Specific exercises for developing students' anticipation skills in translation and interpretation courses are offered. Practical significance. The results of the study can be used by foreign language teachers at both language and non-language universities when teaching students of different specialties to translate foreign texts.

  17. Component fragility data base for reliability and probability studies

    International Nuclear Information System (INIS)

    Bandyopadhyay, K.; Hofmayer, C.; Kassier, M.; Pepper, S.

    1989-01-01

    Safety-related equipment in a nuclear plant plays a vital role in its proper operation and control, and failure of such equipment due to an earthquake may pose a risk to the safe operation of the plant. Therefore, in order to assess the overall reliability of a plant, the reliability of performance of the equipment should be studied first. The success of a reliability or a probability study depends to a great extent on the data base. To meet this demand, Brookhaven National Laboratory (BNL) has formed a test data base relating the seismic capacity of equipment specimens to the earthquake levels. Subsequently, the test data have been analyzed for use in reliability and probability studies. This paper describes the data base and discusses the analysis methods. The final results that can be directly used in plant reliability and probability studies are also presented in this paper

  18. Optimizing multiple reliable forward contracts for reservoir allocation using multitime scale streamflow forecasts

    Science.gov (United States)

    Lu, Mengqian; Lall, Upmanu; Robertson, Andrew W.; Cook, Edward

    2017-03-01

    Streamflow forecasts at multiple time scales provide a new opportunity for reservoir management to address competing objectives. Market instruments such as forward contracts with specified reliability are considered as a tool that may help address the perceived risk associated with the use of such forecasts in lieu of traditional operation and allocation strategies. A water allocation process that enables multiple contracts for water supply and hydropower production with different durations, while maintaining a prescribed level of flood risk reduction, is presented. The allocation process is supported by an optimization model that considers multitime scale ensemble forecasts of monthly streamflow and flood volume over the upcoming season and year, the desired reliability and pricing of proposed contracts for hydropower and water supply. It solves for the size of contracts at each reliability level that can be allocated for each future period, while meeting target end of period reservoir storage with a prescribed reliability. The contracts may be insurable, given that their reliability is verified through retrospective modeling. The process can allow reservoir operators to overcome their concerns as to the appropriate skill of probabilistic forecasts, while providing water users with short-term and long-term guarantees as to how much water or energy they may be allocated. An application of the optimization model to the Bhakra Dam, India, provides an illustration of the process. The issues of forecast skill and contract performance are examined. A field engagement of the idea is useful to develop a real-world perspective and needs a suitable institutional environment.

  19. Reliability of structures by using probability and fatigue theories

    International Nuclear Information System (INIS)

    Lee, Ouk Sub; Kim, Dong Hyeok; Park, Yeon Chang

    2008-01-01

    Methodologies to calculate failure probability and to estimate the reliability of fatigue-loaded structures are developed. The applicability of the methodologies is evaluated with the help of the fatigue crack growth models suggested by Paris and Walker. Probability theories such as the FORM (first order reliability method), the SORM (second order reliability method) and the MCS (Monte Carlo simulation) are utilized. It is found that the failure probability decreases with an increase of the design fatigue life and the applied minimum stress, and with a decrease of the initial edge crack size, the applied maximum stress and the slope of the Paris equation. Furthermore, according to the sensitivity analysis of random variables, the slope of the Paris equation affects the failure probability dominantly among the random variables in the Paris and the Walker models
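
    A toy Monte Carlo version of this kind of analysis can be sketched with the Paris law da/dN = C (ΔK)^m, ΔK = Δσ√(πa), geometry factor Y = 1. All parameter values and distributions below are hypothetical, chosen only to show the MCS idea, not taken from the paper:

```python
# Toy Monte Carlo estimate of fatigue failure probability via the Paris law.
# Units assumed: crack size a in m, stress range ds in MPa, C in m/cycle
# per (MPa*sqrt(m))^m. All values are illustrative.
import math
import random

def cycles_to_failure(a0, af, ds, C=1e-11, m=3.0):
    """Closed-form integration of the Paris law from a0 to af (m != 2, Y = 1)."""
    e = 1.0 - m / 2.0
    return (af ** e - a0 ** e) / (C * (ds * math.sqrt(math.pi)) ** m * e)

def failure_probability(design_life, trials=20000, seed=1):
    """Fraction of sampled crack-size/stress pairs failing before design life."""
    random.seed(seed)
    failures = 0
    for _ in range(trials):
        a0 = random.lognormvariate(math.log(1e-3), 0.2)  # initial crack size
        ds = random.gauss(100.0, 10.0)                   # stress range
        if cycles_to_failure(a0, 0.02, ds) < design_life:
            failures += 1
    return failures / trials

print(failure_probability(design_life=9e5))
```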

  20. Generalization of information-based concepts in forecast verification

    Science.gov (United States)

    Tödter, J.; Ahrens, B.

    2012-04-01

    This work deals with information-theoretical methods in probabilistic forecast verification. Recent findings concerning the Ignorance Score are briefly reviewed, then the generalization to continuous forecasts is shown. For ensemble forecasts, the presented measures can be calculated exactly. The Brier Score (BS) and its generalizations to the multi-categorical Ranked Probability Score (RPS) and to the Continuous Ranked Probability Score (CRPS) are the prominent verification measures for probabilistic forecasts. Particularly, their decompositions into measures quantifying the reliability, resolution and uncertainty of the forecasts are attractive. Information theory sets up the natural framework for forecast verification. Recently, it has been shown that the BS is a second-order approximation of the information-based Ignorance Score (IGN), which also contains easily interpretable components and can also be generalized to a ranked version (RIGN). Here, the IGN, its generalizations and decompositions are systematically discussed in analogy to the variants of the BS. Additionally, a Continuous Ranked IGN (CRIGN) is introduced in analogy to the CRPS. The applicability and usefulness of the conceptually appealing CRIGN are illustrated, together with an algorithm to evaluate its components (reliability, resolution, and uncertainty) for ensemble-generated forecasts. This is also directly applicable to the more traditional CRPS.
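
    The decomposition mentioned above is Murphy's partition of the Brier score, BS = reliability - resolution + uncertainty, which can be computed directly for binned binary forecasts. The identity is exact when forecasts within a bin are identical, as in this toy example:

```python
# Murphy decomposition of the Brier score for binned binary forecasts.

def murphy_decomposition(probs, outcomes, n_bins=10):
    """Return (reliability, resolution, uncertainty) components."""
    n = len(probs)
    obar = sum(outcomes) / n                       # climatological base rate
    bins = {}
    for p, o in zip(probs, outcomes):
        bins.setdefault(min(int(p * n_bins), n_bins - 1), []).append((p, o))
    rel = res = 0.0
    for members in bins.values():
        nk = len(members)
        pk = sum(p for p, _ in members) / nk       # mean forecast in the bin
        ok = sum(o for _, o in members) / nk       # observed frequency in bin
        rel += nk * (pk - ok) ** 2
        res += nk * (ok - obar) ** 2
    return rel / n, res / n, obar * (1.0 - obar)

probs = [0.1, 0.1, 0.9, 0.9, 0.5, 0.5]
outcomes = [0, 0, 1, 1, 1, 0]
rel, res, unc = murphy_decomposition(probs, outcomes)
print(round(rel, 4), round(res, 4), round(unc, 4))
print(round(rel - res + unc, 4))   # -> 0.09, the Brier score itself
```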

  1. Ensemble prediction of floods – catchment non-linearity and forecast probabilities

    Directory of Open Access Journals (Sweden)

    C. Reszler

    2007-07-01

    Quantifying the uncertainty of flood forecasts by ensemble methods is becoming increasingly important for operational purposes. The aim of this paper is to examine how the ensemble distribution of precipitation forecasts propagates in the catchment system, and to interpret the flood forecast probabilities relative to the forecast errors. We use the 622 km² Kamp catchment in Austria as an example where a comprehensive data set, including a 500 yr and a 1000 yr flood, is available. A spatially-distributed continuous rainfall-runoff model is used along with ensemble and deterministic precipitation forecasts that combine rain gauge data, radar data and the forecast fields of the ALADIN and ECMWF numerical weather prediction models. The analyses indicate that, for long lead times, the variability of the precipitation ensemble is amplified as it propagates through the catchment system as a result of non-linear catchment response. In contrast, for lead times shorter than the catchment lag time (e.g. 12 h and less), the variability of the precipitation ensemble is decreased as the forecasts are mainly controlled by observed upstream runoff and observed precipitation. Assuming that all ensemble members are equally likely, the statistical analyses for five flood events at the Kamp showed that the ensemble spread of the flood forecasts is always narrower than the distribution of the forecast errors. This is because the ensemble forecasts focus on the uncertainty in forecast precipitation as the dominant source of uncertainty, and other sources of uncertainty are not accounted for. However, a number of analyses, including Relative Operating Characteristic diagrams, indicate that the ensemble spread is a useful indicator to assess potential forecast errors for lead times larger than 12 h.

  2. The Probability of Default Under IFRS 9: Multi-period Estimation and Macroeconomic Forecast

    Directory of Open Access Journals (Sweden)

    Tomáš Vaněk

    2017-01-01

    In this paper we propose a straightforward, flexible and intuitive computational framework for the multi-period probability of default estimation incorporating macroeconomic forecasts. The concept is based on Markov models, the estimated economic adjustment coefficient and the official economic forecasts of the Czech National Bank. The economic forecasts are taken into account in a separate step to better distinguish between idiosyncratic and systemic risk. This approach is also attractive from the interpretational point of view. The proposed framework can be used especially when calculating lifetime expected credit losses under IFRS 9.
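
    The Markov backbone of such a multi-period PD calculation can be sketched with a hypothetical three-state transition matrix; the paper's economic adjustment coefficient and the CNB forecasts are omitted here:

```python
# Raise a rating transition matrix to the power t and read off the
# cumulative probability of ending in the absorbing default state.
# The matrix is hypothetical.

def mat_mul(A, B):
    """Plain list-of-lists matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# States: 0 = performing, 1 = watch list, 2 = default (absorbing)
P = [[0.90, 0.08, 0.02],
     [0.15, 0.75, 0.10],
     [0.00, 0.00, 1.00]]

Pt = P
for t in range(1, 4):
    print(f"cumulative PD after {t} period(s): {Pt[0][2]:.5f}")
    Pt = mat_mul(Pt, P)
# prints 0.02000, 0.04600, 0.07564
```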

  3. Communicating likelihoods and probabilities in forecasts of volcanic eruptions

    Science.gov (United States)

    Doyle, Emma E. H.; McClure, John; Johnston, David M.; Paton, Douglas

    2014-02-01

    The issuing of forecasts and warnings of natural hazard events, such as volcanic eruptions, earthquake aftershock sequences and extreme weather often involves the use of probabilistic terms, particularly when communicated by scientific advisory groups to key decision-makers, who can differ greatly in relative expertise and function in the decision making process. Recipients may also differ in their perception of relative importance of political and economic influences on interpretation. Consequently, the interpretation of these probabilistic terms can vary greatly due to the framing of the statements, and whether verbal or numerical terms are used. We present a review from the psychology literature on how the framing of information influences communication of these probability terms. It is also unclear as to how people rate their perception of an event's likelihood throughout a time frame when a forecast time window is stated. Previous research has identified that, when presented with a 10-year time window forecast, participants viewed the likelihood of an event occurring ‘today’ as being of less than that in year 10. Here we show that this skew in perception also occurs for short-term time windows (under one week) that are of most relevance for emergency warnings. In addition, unlike the long-time window statements, the use of the phrasing “within the next…” instead of “in the next…” does not mitigate this skew, nor do we observe significant differences between the perceived likelihoods of scientists and non-scientists. This finding suggests that effects occurring due to the shorter time window may be ‘masking’ any differences in perception due to wording or career background observed for long-time window forecasts. These results have implications for scientific advice, warning forecasts, emergency management decision-making, and public information as any skew in perceived event likelihood towards the end of a forecast time window may result in

  4. How uncertain are day-ahead wind forecasts?

    Energy Technology Data Exchange (ETDEWEB)

    Grimit, E. [3TIER Environmental Forecast Group, Seattle, WA (United States)

    2006-07-01

    Recent advances in the combination of weather forecast ensembles with Bayesian statistical techniques have helped to address uncertainties in wind forecasting. Weather forecast ensembles are a collection of numerical weather predictions. The combination of several equally-skilled forecasts typically results in a consensus forecast with greater accuracy. The distribution of forecasts also provides an estimate of forecast inaccuracy. However, weather forecast ensembles tend to be under-dispersive, and not all forecast uncertainties can be taken into account. In order to address these issues, a multi-variate linear regression approach was used to correct the forecast bias for each ensemble member separately. Bayesian model averaging was used to provide a predictive probability density function to allow for multi-modal probability distributions. A test location in eastern Canada was used to demonstrate the approach. Results of the test showed that the method improved wind forecasts and generated reliable prediction intervals. Prediction intervals were much shorter than comparable intervals based on a single forecast or on historical observations alone. It was concluded that the approach will provide economic benefits to both wind energy developers and investors. refs., tabs., figs.
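The Bayesian model averaging step described above can be sketched as a weighted mixture of Gaussian kernels centred on bias-corrected ensemble members. The member values, weights and spread below are illustrative assumptions, not fitted values from the study:

```python
import numpy as np

def bma_pdf(x, member_means, weights, sigma):
    """BMA predictive density: a weighted mixture of Gaussians
    centred on bias-corrected ensemble member forecasts."""
    x = np.asarray(x, dtype=float)
    dens = np.zeros_like(x)
    for m, w in zip(member_means, weights):
        dens += w * np.exp(-0.5 * ((x - m) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))
    return dens

# Three hypothetical bias-corrected wind speed forecasts (m/s) and BMA weights.
members = [6.0, 7.5, 9.0]
weights = [0.5, 0.3, 0.2]
grid = np.linspace(0.0, 20.0, 2001)
dx = grid[1] - grid[0]
pdf = bma_pdf(grid, members, weights, sigma=1.2)
print(np.sum(pdf) * dx)  # density integrates to approximately 1
```

The resulting multi-modal density can then be used to read off prediction intervals directly, which is what makes the intervals sharper than those from a single forecast.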

  5. New Aspects of Probabilistic Forecast Verification Using Information Theory

    Science.gov (United States)

    Tödter, Julian; Ahrens, Bodo

    2013-04-01

    This work deals with information-theoretical methods in probabilistic forecast verification, particularly concerning ensemble forecasts. Recent findings concerning the "Ignorance Score" are briefly reviewed, then a consistent generalization to continuous forecasts is motivated. For ensemble-generated forecasts, the presented measures can be calculated exactly. The Brier Score (BS) and its generalizations to the multi-categorical Ranked Probability Score (RPS) and to the Continuous Ranked Probability Score (CRPS) are prominent verification measures for probabilistic forecasts. In particular, their decompositions into measures quantifying the reliability, resolution and uncertainty of the forecasts are attractive. Information theory sets up a natural framework for forecast verification. Recently, it has been shown that the BS is a second-order approximation of the information-based Ignorance Score (IGN), which also contains easily interpretable components and can also be generalized to a ranked version (RIGN). Here, the IGN, its generalizations and decompositions are systematically discussed in analogy to the variants of the BS. Additionally, a Continuous Ranked IGN (CRIGN) is introduced in analogy to the CRPS. The useful properties of the conceptually appealing CRIGN are illustrated, together with an algorithm to evaluate its components (reliability, resolution, and uncertainty) for ensemble-generated forecasts. This algorithm can also be used to calculate the decomposition of the more traditional CRPS exactly. The applicability of the "new" measures is demonstrated in a small evaluation study of ensemble-based precipitation forecasts.
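For a binary event, the Brier Score and the (logarithmic) Ignorance Score it approximates can be computed directly. This is a generic sketch of the two scores, not the authors' code:

```python
import numpy as np

def brier_score(p, o):
    """Brier score for binary-event probability forecasts p and outcomes o (0/1)."""
    p, o = np.asarray(p, float), np.asarray(o, float)
    return np.mean((p - o) ** 2)

def ignorance_score(p, o, eps=1e-12):
    """Ignorance (logarithmic) score in bits; the BS is its
    second-order approximation around perfect forecasts."""
    p, o = np.asarray(p, float), np.asarray(o, float)
    p = np.clip(p, eps, 1.0 - eps)  # guard against log(0)
    return -np.mean(o * np.log2(p) + (1.0 - o) * np.log2(1.0 - p))

probs = [0.9, 0.2, 0.7, 0.1]
obs   = [1,   0,   1,   0]
print(brier_score(probs, obs), ignorance_score(probs, obs))
```

Both scores are negatively oriented (lower is better); the Ignorance Score penalizes confident misses much more severely than the Brier Score does.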

  6. Probabilistic forecasting for extreme NO2 pollution episodes

    International Nuclear Information System (INIS)

    Aznarte, José L.

    2017-01-01

    In this study, we investigate the suitability of quantile regression for predicting extreme concentrations of NO2. In contrast to the usual point-forecasting, where a single value is forecast for each horizon, probabilistic forecasting through quantile regression allows for the prediction of the full probability distribution, which in turn allows one to build models specifically fit for the tails of this distribution. Using data from the city of Madrid, including NO2 concentrations as well as meteorological measures, we build models that predict extreme NO2 concentrations, outperforming point-forecasting alternatives, and we prove that the predictions are accurate, reliable and sharp. In addition, we study the relative importance of the independent variables involved, and show how the important variables for the median quantile differ from those important for the upper quantiles. Furthermore, we present a method to compute the probability of exceedance of thresholds, which is a simple and comprehensible manner of presenting probabilistic forecasts that maximizes their usefulness. - Highlights: • A new probabilistic forecasting system is presented to predict NO2 concentrations. • While predicting the full distribution, it also outperforms other point-forecasting models. • Forecasts show good properties and peak concentrations are properly predicted. • It forecasts the probability of exceedance of thresholds, key to decision makers. • Relative forecasting importance of the variables is obtained as a by-product.
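Quantile regression models are trained by minimising the pinball loss, and a set of predicted quantiles can be turned into a threshold-exceedance probability by interpolating the implied CDF. The following is a rough illustration under those assumptions, with invented values:

```python
import numpy as np

def pinball_loss(y_true, y_pred, tau):
    """Pinball (quantile) loss, minimised by the tau-th conditional quantile."""
    d = np.asarray(y_true, float) - np.asarray(y_pred, float)
    return np.mean(np.where(d >= 0, tau * d, (tau - 1.0) * d))

def exceedance_probability(levels, q_values, threshold):
    """P(X > threshold) from predicted quantiles, by linear interpolation
    of the implied CDF (a rough approximation; clamps outside the range)."""
    return 1.0 - np.interp(threshold, q_values, levels)

# Hypothetical predicted NO2 quantiles (ug/m3) for one horizon.
levels   = np.array([0.1, 0.5, 0.9])
q_values = np.array([20.0, 40.0, 80.0])
print(exceedance_probability(levels, q_values, 40.0))  # 0.5 by construction
print(pinball_loss([1.0, 3.0], [2.0, 2.0], tau=0.5))
```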

  7. Operational Earthquake Forecasting: Proposed Guidelines for Implementation (Invited)

    Science.gov (United States)

    Jordan, T. H.

    2010-12-01

    The goal of operational earthquake forecasting (OEF) is to provide the public with authoritative information about how seismic hazards are changing with time. During periods of high seismic activity, short-term earthquake forecasts based on empirical statistical models can attain nominal probability gains in excess of 100 relative to the long-term forecasts used in probabilistic seismic hazard analysis (PSHA). Prospective experiments are underway by the Collaboratory for the Study of Earthquake Predictability (CSEP) to evaluate the reliability and skill of these seismicity-based forecasts in a variety of tectonic environments. How such information should be used for civil protection is by no means clear, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing formal procedures for OEF in this sort of “low-probability environment.” Nevertheless, the need to move more quickly towards OEF has been underscored by recent experiences, such as the 2009 L’Aquila earthquake sequence and other seismic crises in which an anxious public has been confused by informal, inconsistent earthquake forecasts. Whether scientists like it or not, rising public expectations for real-time information, accelerated by the use of social media, will require civil protection agencies to develop sources of authoritative information about the short-term earthquake probabilities. In this presentation, I will discuss guidelines for the implementation of OEF informed by my experience on the California Earthquake Prediction Evaluation Council, convened by CalEMA, and the International Commission on Earthquake Forecasting, convened by the Italian government following the L’Aquila disaster. (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and

  8. Monthly streamflow forecasting based on hidden Markov model and Gaussian Mixture Regression

    Science.gov (United States)

    Liu, Yongqi; Ye, Lei; Qin, Hui; Hong, Xiaofeng; Ye, Jiajun; Yin, Xingli

    2018-06-01

    Reliable streamflow forecasts can be highly valuable for water resources planning and management. In this study, we combined a hidden Markov model (HMM) and Gaussian Mixture Regression (GMR) for probabilistic monthly streamflow forecasting. The HMM is initialized using a kernelized K-medoids clustering method, and the Baum-Welch algorithm is then executed to learn the model parameters. GMR derives a conditional probability distribution for the predictand given covariate information, including the antecedent flow at a local station and two surrounding stations. The performance of HMM-GMR was verified based on the mean square error and continuous ranked probability score skill scores. The reliability of the forecasts was assessed by examining the uniformity of the probability integral transform values. The results show that HMM-GMR obtained reasonably high skill scores and the uncertainty spread was appropriate. Different HMM states were assumed to be different climate conditions, which would lead to different types of observed values. We demonstrated that the HMM-GMR approach can handle multimodal and heteroscedastic data.
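The PIT-based reliability check mentioned above can be sketched as follows, assuming Gaussian predictive distributions purely for illustration (the study's actual predictive densities come from GMR):

```python
import numpy as np
from math import erf, sqrt

def pit_values(obs, means, stds):
    """Probability integral transform of observations under
    Gaussian predictive distributions (illustrative assumption)."""
    z = (np.asarray(obs, float) - np.asarray(means, float)) / np.asarray(stds, float)
    return np.array([0.5 * (1.0 + erf(v / sqrt(2.0))) for v in z])

def uniformity_stat(pit):
    """Max deviation of the empirical PIT CDF from the uniform CDF (KS-style);
    small values indicate a reliable (well-calibrated) forecast."""
    pit = np.sort(pit)
    n = len(pit)
    ecdf = np.arange(1, n + 1) / n
    return float(np.max(np.abs(ecdf - pit)))

rng = np.random.default_rng(0)
truth = rng.normal(0.0, 1.0, 500)
pit = pit_values(truth, np.zeros(500), np.ones(500))  # perfectly calibrated case
print(uniformity_stat(pit))
```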

  9. A Bayesian joint probability modeling approach for seasonal forecasting of streamflows at multiple sites

    Science.gov (United States)

    Wang, Q. J.; Robertson, D. E.; Chiew, F. H. S.

    2009-05-01

    Seasonal forecasting of streamflows can be highly valuable for water resources management. In this paper, a Bayesian joint probability (BJP) modeling approach for seasonal forecasting of streamflows at multiple sites is presented. A Box-Cox transformed multivariate normal distribution is proposed to model the joint distribution of future streamflows and their predictors such as antecedent streamflows and El Niño-Southern Oscillation indices and other climate indicators. Bayesian inference of model parameters and uncertainties is implemented using Markov chain Monte Carlo sampling, leading to joint probabilistic forecasts of streamflows at multiple sites. The model provides a parametric structure for quantifying relationships between variables, including intersite correlations. The Box-Cox transformed multivariate normal distribution has considerable flexibility for modeling a wide range of predictors and predictands. The Bayesian inference formulated allows the use of data that contain nonconcurrent and missing records. The model flexibility and data-handling ability means that the BJP modeling approach is potentially of wide practical application. The paper also presents a number of statistical measures and graphical methods for verification of probabilistic forecasts of continuous variables. Results for streamflows at three river gauges in the Murrumbidgee River catchment in southeast Australia show that the BJP modeling approach has good forecast quality and that the fitted model is consistent with observed data.
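The Box-Cox transform underlying the BJP model, together with its inverse, is straightforward to implement; lambda = 0.2 below is an arbitrary illustrative value, not a fitted parameter from the paper:

```python
import numpy as np

def boxcox(y, lam):
    """Box-Cox transform; reduces to log when lambda == 0."""
    y = np.asarray(y, float)
    return np.log(y) if lam == 0 else (y ** lam - 1.0) / lam

def inv_boxcox(z, lam):
    """Inverse Box-Cox transform, mapping back to the original scale."""
    z = np.asarray(z, float)
    return np.exp(z) if lam == 0 else (lam * z + 1.0) ** (1.0 / lam)

flows = np.array([1.0, 5.0, 20.0, 75.0])  # invented streamflow values
z = boxcox(flows, 0.2)
print(inv_boxcox(z, 0.2))  # round-trips to the original flows
```

Fitting a multivariate normal in the transformed space, as the BJP approach does, lets skewed hydrological variables be modeled with Gaussian machinery.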

  10. Estimating reliability of degraded system based on the probability density evolution with multi-parameter

    Directory of Open Access Journals (Sweden)

    Jiang Ge

    2017-01-01

    Full Text Available System degradation is usually caused by the degradation of multiple parameters. Assessment of system reliability by the universal generating function has low accuracy compared with Monte Carlo simulation, and the probability density function of the system output performance cannot be obtained. A reliability assessment method based on multi-parameter probability density evolution is therefore presented for complex degraded systems. Firstly, the system output function is formulated according to the transitive relation between component parameters and the system output performance. Then, a probability density evolution equation is established based on the probability conservation principle and the system output function. Furthermore, the probability distribution characteristics of the system output performance are obtained by solving this differential equation. Finally, the reliability of the degraded system is estimated. The method does not need to discretize the performance parameters and can establish a continuous probability density function of the system output performance with high computational efficiency and low cost. A numerical example shows that the method is applicable to evaluating the reliability of multi-parameter degraded systems.

  11. Assessing the Effectiveness of the Cone of Probability as a Visual Means of Communicating Scientific Forecasts

    Science.gov (United States)

    Orlove, B. S.; Broad, K.; Meyer, R.

    2010-12-01

    We review the evolution, communication, and differing interpretations of the National Hurricane Center (NHC)'s "cone of uncertainty" hurricane forecast graphic, drawing on several related disciplines—cognitive psychology, visual anthropology, and risk communication theory. We examine the 2004 hurricane season, two specific hurricanes (Katrina 2005 and Ike 2008) and the 2010 hurricane season, still in progress. During the 2004 hurricane season, five named storms struck Florida. Our analysis of that season draws upon interviews with key government officials and media figures, archival research of Florida newspapers, analysis of public comments on the NHC cone of uncertainty graphic and a multiagency study of 2004 hurricane behavior. At that time, the hurricane forecast graphic was subject to misinterpretation by many members of the public. We identify several characteristics of this graphic that contributed to public misinterpretation. Residents overemphasized the specific track of the eye, failed to grasp the width of hurricanes, and generally did not recognize the timing of the passage of the hurricane. Little training was provided to emergency response managers in the interpretation of forecasts. In the following year, Katrina became a national scandal, further demonstrating the limitations of the cone as a means of leading to appropriate responses to forecasts. In the second half of the first decade of the 21st century, three major changes occurred in hurricane forecast communication: the forecasts themselves improved in terms of accuracy and lead time, the NHC made minor changes in the graphics and expanded the explanatory material that accompanies the graphics, and some efforts were made to reach out to emergency response planners and municipal officials to enhance their understanding of the forecasts and graphics. There were some improvements in the responses to Ike, though a number of deaths were due to inadequate evacuations, and property damage probably

  12. Plant calendar pattern based on rainfall forecast and the probability of its success in Deli Serdang regency of Indonesia

    Science.gov (United States)

    Darnius, O.; Sitorus, S.

    2018-03-01

    The objective of this study was to determine the plant calendar pattern of three types of crops, namely palawija, rice, and banana, based on rainfall in Deli Serdang Regency. In the first stage, we forecasted rainfall by using time series analysis, and obtained an appropriate seasonal model, ARIMA(1,0,0)(1,1,1)₁₂. Based on the forecast result, we designed a plant calendar pattern for the three types of plant. Furthermore, the probability of success for the crop types following the plant calendar pattern was calculated by using a Markov process, discretizing the continuous rainfall data into three categories, namely Below Normal (BN), Normal (N), and Above Normal (AN), to form the probability transition matrix. Finally, the combination of the rainfall forecasting model and the Markov process was used to determine the pattern of cropping calendars and the probability of success for the three crops. This research used rainfall data of Deli Serdang Regency taken from the office of BMKG (the Meteorology, Climatology and Geophysics Agency), Sampali Medan, Indonesia.
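A transition matrix over the three rainfall categories (BN, N, AN) can be estimated by counting category-to-category transitions; the monthly sequence below is invented for illustration:

```python
import numpy as np

CATS = ["BN", "N", "AN"]  # Below Normal, Normal, Above Normal

def transition_matrix(cat_series):
    """Row-stochastic Markov transition matrix estimated by counting
    consecutive category transitions in the sequence."""
    idx = {c: i for i, c in enumerate(CATS)}
    counts = np.zeros((3, 3))
    for a, b in zip(cat_series[:-1], cat_series[1:]):
        counts[idx[a], idx[b]] += 1
    rows = counts.sum(axis=1, keepdims=True)
    rows[rows == 0] = 1.0  # avoid division by zero for unseen states
    return counts / rows

seq = ["N", "N", "AN", "N", "BN", "N", "AN", "AN", "N", "BN"]  # invented
P = transition_matrix(seq)
print(P)
```

Powers of `P` then give multi-step category probabilities, from which the probability of a "successful" rainfall sequence for a given cropping calendar can be accumulated.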

  13. Pollutant forecasting error based on persistence of wind direction

    International Nuclear Information System (INIS)

    Cooper, R.E.

    1978-01-01

    The purpose of this report is to provide a means of estimating the reliability of forecasts of downwind pollutant concentrations from atmospheric puff releases. These forecasts are based on assuming the persistence of wind direction as determined at the time of release. This initial forecast will be used to deploy survey teams, to predict population centers that may be affected, and to estimate the amount of time available for emergency response. Reliability of forecasting is evaluated by developing a cumulative probability distribution of error as a function of elapsed time following an assumed release. The cumulative error is determined by comparing the forecast pollutant concentration with the concentration measured by sampling along the real-time meteorological trajectory. It may be concluded that the assumption of meteorological persistence for emergency response is not very good for periods longer than 3 hours. Even within this period, the possibility for large error exists due to wind direction shifts. These shifts could affect population areas totally different from those areas first indicated.

  14. Reliability analysis of reactor systems by applying probability method; Analiza pouzdanosti reaktorskih sistema primenom metoda verovatnoce

    Energy Technology Data Exchange (ETDEWEB)

    Milivojevic, S [Institute of Nuclear Sciences Boris Kidric, Vinca, Beograd (Serbia and Montenegro)

    1974-12-15

    The probability method chosen for analysing reactor system reliability is considered realistic since it is based on verified experimental data; in essence, it is a statistical method. The method developed takes into account the probability distribution of permitted levels of the relevant parameters and their particular influence on the reliability of the system as a whole. The proposed method is rather general and was applied to the problem of thermal safety analysis of a reactor system. The analysis reveals basic properties of the system under different operating conditions, expressed in the form of probabilities that show the reliability of the system as a whole as well as of each component.

  15. Forecasting systems reliability based on support vector regression with genetic algorithms

    International Nuclear Information System (INIS)

    Chen, K.-Y.

    2007-01-01

    This study applies a novel neural-network technique, support vector regression (SVR), to forecast reliability in engine systems. The aim of this study is to examine the feasibility of SVR in systems reliability prediction by comparing it with the existing neural-network approaches and the autoregressive integrated moving average (ARIMA) model. To build an effective SVR model, SVR's parameters must be set carefully. This study proposes a novel approach, known as GA-SVR, which searches for SVR's optimal parameters using real-value genetic algorithms, and then adopts the optimal parameters to construct the SVR models. Real reliability data for 40 sets of turbochargers were employed as the data set. The experimental results demonstrate that SVR outperforms the existing neural-network approaches and the traditional ARIMA models based on the normalized root mean square error and mean absolute percentage error.
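The two evaluation metrics named above, normalized root mean square error and mean absolute percentage error, can be computed as follows. Normalizing the RMSE by the range of the actuals is one common convention and an assumption here, as is the invented data:

```python
import numpy as np

def nrmse(actual, pred):
    """Root mean square error normalized by the range of the actuals
    (one common convention; other normalizations exist)."""
    actual, pred = np.asarray(actual, float), np.asarray(pred, float)
    rmse = np.sqrt(np.mean((actual - pred) ** 2))
    return rmse / (actual.max() - actual.min())

def mape(actual, pred):
    """Mean absolute percentage error; actuals must be non-zero."""
    actual, pred = np.asarray(actual, float), np.asarray(pred, float)
    return 100.0 * np.mean(np.abs((actual - pred) / actual))

y  = [100.0, 120.0, 150.0, 130.0]  # invented reliability observations
yh = [110.0, 115.0, 140.0, 135.0]  # invented model predictions
print(nrmse(y, yh), mape(y, yh))
```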

  16. Estimating predictive hydrological uncertainty by dressing deterministic and ensemble forecasts; a comparison, with application to Meuse and Rhine

    Science.gov (United States)

    Verkade, J. S.; Brown, J. D.; Davids, F.; Reggiani, P.; Weerts, A. H.

    2017-12-01

    Two statistical post-processing approaches for estimation of predictive hydrological uncertainty are compared: (i) 'dressing' of a deterministic forecast by adding a single, combined estimate of both hydrological and meteorological uncertainty and (ii) 'dressing' of an ensemble streamflow forecast by adding an estimate of hydrological uncertainty to each individual streamflow ensemble member. Both approaches aim to produce an estimate of the 'total uncertainty' that captures both the meteorological and hydrological uncertainties. They differ in the degree to which they make use of statistical post-processing techniques. In the 'lumped' approach, both sources of uncertainty are lumped by post-processing deterministic forecasts using their verifying observations. In the 'source-specific' approach, the meteorological uncertainties are estimated by an ensemble of weather forecasts. These ensemble members are routed through a hydrological model and a realization of the probability distribution of hydrological uncertainties (only) is then added to each ensemble member to arrive at an estimate of the total uncertainty. The techniques are applied to one location in the Meuse basin and three locations in the Rhine basin. Resulting forecasts are assessed for their reliability and sharpness, as well as compared in terms of multiple verification scores including the relative mean error, Brier Skill Score, Mean Continuous Ranked Probability Skill Score, Relative Operating Characteristic Score and Relative Economic Value. The dressed deterministic forecasts are generally more reliable than the dressed ensemble forecasts, but the latter are sharper. On balance, however, they show similar quality across a range of verification metrics, with the dressed ensembles coming out slightly better. Some additional analyses are suggested. Notably, these include statistical post-processing of the meteorological forecasts in order to increase their reliability, thus increasing the reliability
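The Continuous Ranked Probability Score used in comparisons like the one above can be computed for an ensemble forecast via its energy form; for a single-member (deterministic) forecast it reduces to the absolute error. A generic sketch:

```python
import numpy as np

def crps_ensemble(members, obs):
    """CRPS of one ensemble forecast for a scalar observation, via the
    energy form E|X - y| - 0.5 * E|X - X'| (plain, not the 'fair' variant)."""
    x = np.asarray(members, float)
    term1 = np.mean(np.abs(x - obs))
    term2 = 0.5 * np.mean(np.abs(x[:, None] - x[None, :]))
    return term1 - term2

print(crps_ensemble([1.0, 2.0, 3.0], 2.5))  # 7/18 for this toy ensemble
print(crps_ensemble([2.0], 2.5))            # deterministic case: |2.0 - 2.5| = 0.5
```

Averaging over many forecast-observation pairs and referencing a climatological forecast yields the Mean CRPS Skill Score reported in the study.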

  17. Empirical investigation on using wind speed volatility to estimate the operation probability and power output of wind turbines

    International Nuclear Information System (INIS)

    Liu, Heping; Shi, Jing; Qu, Xiuli

    2013-01-01

    Highlights: ► Ten-minute wind speed and power generation data of an offshore wind turbine are used. ► An ARMA–GARCH-M model is built to simultaneously forecast wind speed mean and volatility. ► The operation probability and expected power output of the wind turbine are predicted. ► The integrated approach produces more accurate wind power forecasting than other conventional methods. - Abstract: In this paper, we introduce a quantitative methodology that performs the interval estimation of wind speed, calculates the operation probability of wind turbine, and forecasts the wind power output. The technological advantage of this methodology stems from the empowered capability of mean and volatility forecasting of wind speed. Based on the real wind speed and corresponding wind power output data from an offshore wind turbine, this methodology is applied to build an ARMA–GARCH-M model for wind speed forecasting, and then to compute the operation probability and the expected power output of the wind turbine. The results show that the developed methodology is effective, the obtained interval estimation of wind speed is reliable, and the forecasted operation probability and expected wind power output of the wind turbine are accurate
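Given a forecast mean and volatility such as an ARMA-GARCH-M model would produce, the turbine operation probability is the probability that wind speed falls between the cut-in and cut-out speeds. The Gaussian assumption and the cut-in/cut-out values below are illustrative, not taken from the paper:

```python
from math import erf, sqrt

def norm_cdf(x, mu, sigma):
    """Standard normal CDF evaluated for N(mu, sigma^2)."""
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

def operation_probability(mu, sigma, cut_in=3.5, cut_out=25.0):
    """P(cut_in <= wind speed <= cut_out) under a Gaussian forecast whose
    mean (mu) and volatility (sigma) come from the time-series model.
    Cut-in/cut-out speeds here are generic illustrative values (m/s)."""
    return norm_cdf(cut_out, mu, sigma) - norm_cdf(cut_in, mu, sigma)

print(operation_probability(8.0, 2.0))  # mean well inside the operating range
print(operation_probability(2.0, 2.0))  # mean below cut-in: low probability
```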

  18. Flood Forecasting Based on TIGGE Precipitation Ensemble Forecast

    Directory of Open Access Journals (Sweden)

    Jinyin Ye

    2016-01-01

    Full Text Available TIGGE (THORPEX International Grand Global Ensemble) was a major component of THORPEX (The Observing System Research and Predictability Experiment). It integrates ensemble precipitation products from all the major forecast centers in the world and provides systematic evaluation of the multimodel ensemble prediction system. Development of a meteorologic-hydrologic coupled flood forecasting model and early warning model based on the TIGGE precipitation ensemble forecast can provide flood probability forecasts, extend the lead time of the flood forecast, and gain more time for decision-makers to make the right decision. In this study, precipitation ensemble forecast products from ECMWF, NCEP, and CMA are used to drive the distributed hydrologic model TOPX. We focus on the Yi River catchment and aim to build a flood forecast and early warning system. The results show that the meteorologic-hydrologic coupled model can satisfactorily predict the flow process of four flood events. The predicted occurrence time of peak discharges is close to the observations. However, the magnitude of the peak discharges differs significantly due to the varying performance of the ensemble prediction systems. The coupled forecasting model can accurately predict the peak time and the corresponding risk probability of peak discharge based on their probability distributions, providing users a strong theoretical foundation and valuable information for a promising new approach.

  19. Interval forecasting of cyberattack intensity on informatization objects of industry using probability cluster model

    Science.gov (United States)

    Krakovsky, Y. M.; Luzgin, A. N.; Mikhailova, E. A.

    2018-05-01

    At present, cyber-security issues associated with the informatization objects of industry occupy one of the key niches in the state management system. As a result of functional disruption of these systems via cyberattacks, an emergency may arise related to loss of life, environmental disasters, major financial and economic damage, or disrupted activities of cities and settlements. When cyberattacks occur with high intensity, there is a need to develop protection against them based on machine learning methods. This paper examines interval forecasting and presents results for a pre-set intensity level. The interval forecasting is carried out based on a probabilistic cluster model. This method involves forecasting which of two predetermined intervals a future value of the indicator will fall into; probability estimates are used for this purpose. The dividing bound of these intervals is determined by a calculation method based on statistical characteristics of the indicator. The source data include hourly counts of cyberattacks recorded by a honeypot from March to September 2013.

  20. Assessing the potential for improving S2S forecast skill through multimodel ensembling

    Science.gov (United States)

    Vigaud, N.; Robertson, A. W.; Tippett, M. K.; Wang, L.; Bell, M. J.

    2016-12-01

    Non-linear logistic regression is well suited to probability forecasting and has been successfully applied in the past to ensemble weather and climate predictions, providing access to the full probability distribution without any Gaussian assumption. However, little work has been done at sub-monthly lead times, where relatively small re-forecast ensemble sizes and record lengths present new challenges for which post-processing avenues have yet to be investigated. A promising approach consists of extending the definition of non-linear logistic regression by including the quantile of the forecast distribution as one of the predictors. So-called Extended Logistic Regression (ELR), which yields mutually consistent individual threshold probabilities, is here applied to ECMWF, CFSv2 and CMA re-forecasts from the S2S database in order to produce rainfall probabilities at weekly resolution. The ELR model is trained on seasonally-varying tercile categories computed for lead times of 1 to 4 weeks. It is then tested in a cross-validated manner, i.e. allowing real-time predictability applications, to produce rainfall tercile probabilities from individual weekly hindcasts that are finally combined by equal pooling. Results are discussed over a broad North American region, where individual and MME forecasts generated out to 4 weeks lead are characterized by good probabilistic reliability but low sharpness, exhibiting systematically more skill in winter than in summer.
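The ELR idea of including the threshold (quantile) as a predictor guarantees that probability curves for different thresholds never cross. A sketch with hypothetical coefficients (not fitted values from the study):

```python
import numpy as np

def elr_cdf(x, q, a, b, c):
    """Extended logistic regression: P(Y <= q | x), with the threshold q
    entering as a predictor so that curves for different q are mutually
    consistent (non-crossing) whenever c > 0."""
    return 1.0 / (1.0 + np.exp(-(a + b * x + c * q)))

# Hypothetical coefficients and weekly rainfall tercile thresholds (mm).
a, b, c = -1.0, -0.1, 0.05
x = 10.0                                # e.g. an ensemble-mean rainfall predictor
thresholds = np.array([5.0, 20.0, 40.0])
probs = elr_cdf(x, thresholds, a, b, c)
print(probs)  # non-decreasing in q by construction
```

Category probabilities (below/normal/above) then follow by differencing consecutive cumulative probabilities.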

  1. A convection-allowing ensemble forecast based on the breeding growth mode and associated optimization of precipitation forecast

    Science.gov (United States)

    Li, Xiang; He, Hongrang; Chen, Chaohui; Miao, Ziqing; Bai, Shigang

    2017-10-01

    A convection-allowing ensemble forecast experiment on a squall line was conducted based on the breeding growth mode (BGM). Meanwhile, the probability matched mean (PMM) and neighborhood ensemble probability (NEP) methods were used to optimize the associated precipitation forecast. The ensemble forecast predicted the precipitation tendency accurately, which was closer to the observation than in the control forecast. For heavy rainfall, the precipitation center produced by the ensemble forecast was also better. The Fractions Skill Score (FSS) results indicated that the ensemble mean was skillful in light rainfall, while the PMM produced better probability distribution of precipitation for heavy rainfall. Preliminary results demonstrated that convection-allowing ensemble forecast could improve precipitation forecast skill through providing valuable probability forecasts. It is necessary to employ new methods, such as the PMM and NEP, to generate precipitation probability forecasts. Nonetheless, the lack of spread and the overprediction of precipitation by the ensemble members are still problems that need to be solved.
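The probability matched mean keeps the spatial pattern of the ensemble mean while restoring the amplitude distribution pooled from all members; a common generic implementation (not the paper's code) looks like this:

```python
import numpy as np

def probability_matched_mean(ens):
    """PMM: re-rank the ensemble-mean field onto the pooled amplitude
    distribution of all members. ens has shape (n_members, n_points)."""
    n_members, n_points = ens.shape
    mean_field = ens.mean(axis=0)
    # Pool and sort all member values, then subsample every n_members-th
    # value so exactly n_points amplitudes remain.
    pooled = np.sort(ens.ravel())[n_members // 2::n_members]
    pmm = np.empty(n_points)
    pmm[np.argsort(mean_field)] = pooled  # assign by rank of the mean field
    return pmm

rng = np.random.default_rng(1)
ens = rng.gamma(2.0, 3.0, size=(10, 50))  # 10 members, 50 grid points (synthetic)
pmm = probability_matched_mean(ens)
```

This counteracts the smoothing of the arithmetic ensemble mean, which is why PMM performs better for heavy rainfall in the study.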

  2. Reliability assessment and probability based design of reinforced concrete containments and shear walls

    International Nuclear Information System (INIS)

    Hwang, H.; Reich, M.; Ellingwood, B.; Shinozuka, M.

    1986-03-01

    This report summarizes work completed under the program entitled "Probability-Based Load Combinations for Design of Category I Structures." Under this program, the probabilistic models for various static and dynamic loads were formulated. The randomness and uncertainties in material strengths and structural resistance were established. Several limit states of concrete containments and shear walls were identified and analytically formulated. Furthermore, the reliability analysis methods for estimating limit state probabilities were established. These reliability analysis methods can be used to evaluate the safety levels of nuclear structures under various combinations of static and dynamic loads. They can also be used to generate analytically the fragility data for PRA studies. In addition to the development of reliability analysis methods, probability-based design criteria for concrete containments and shear wall structures have also been developed. The proposed design criteria are in the load and resistance factor design (LRFD) format. The load and resistance factors are determined for several limit states and target limit state probabilities. Thus, the proposed design criteria are risk-consistent and have a well-established rationale. 73 refs., 18 figs., 16 tabs
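The LRFD format mentioned above reduces to a simple factored check, phi*R >= sum(gamma_i * Q_i). A minimal sketch with invented resistance, loads, and factors:

```python
def lrfd_ok(resistance, phi, loads, gammas):
    """Load and resistance factor design check:
    phi * R >= sum(gamma_i * Q_i), with all values invented for illustration."""
    return phi * resistance >= sum(g * q for g, q in zip(gammas, loads))

# Hypothetical member: R = 1000 kN, phi = 0.9; dead and live loads with
# illustrative factors 1.2 and 1.6.
print(lrfd_ok(1000.0, 0.9, [400.0, 300.0], [1.2, 1.6]))  # 900 >= 960 -> False
```

Calibrating phi and the gamma_i to hit target limit-state probabilities is exactly what makes the criteria risk-consistent.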

  3. Web-Based Real Time Earthquake Forecasting and Personal Risk Management

    Science.gov (United States)

    Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Turcotte, D. L.; Donnellan, A.

    2012-12-01

Earthquake forecasts have been computed by a variety of countries and economies world-wide for over two decades. For the most part, forecasts have been computed for insurance, reinsurance and underwriters of catastrophe bonds. One example is the Working Group on California Earthquake Probabilities that has been responsible for the official California earthquake forecast since 1988. However, in a time of increasingly severe global financial constraints, we are now moving inexorably towards personal risk management, wherein mitigating risk is becoming the responsibility of individual members of the public. Under these circumstances, open access to a variety of web-based tools, utilities and information is a necessity. Here we describe a web-based system that has been operational since 2009 at www.openhazards.com and www.quakesim.org. Models for earthquake physics and forecasting require input data, along with model parameters. The models we consider are the Natural Time Weibull (NTW) model for regional earthquake forecasting, together with models for activation and quiescence. These models use small earthquakes ("seismicity-based models") to forecast the occurrence of large earthquakes, either through varying rates of small earthquake activity, or via an accumulation of this activity over time. These approaches use data-mining algorithms combined with the ANSS earthquake catalog. The basic idea is to compute large earthquake probabilities using the number of small earthquakes that have occurred in a region since the last large earthquake. Each of these approaches has computational challenges associated with computing forecast information in real time. Using 25 years of data from the ANSS California-Nevada catalog of earthquakes, we show that real-time forecasting is possible at a grid scale of 0.1°. We have analyzed the performance of these models using Reliability/Attributes and standard Receiver Operating Characteristic (ROC) tests. We show how the Reliability and
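The counting idea behind such seismicity-based forecasts — a conditional probability of a large event given the number of small events since the last one, using a Weibull distribution in natural time — can be sketched as follows. All parameter values here are purely illustrative, not those of the operational NTW model:

```python
import math

def weibull_cdf(n, tau, beta):
    """CDF of a Weibull distribution in natural time (small-event counts)."""
    return 1.0 - math.exp(-((n / tau) ** beta))

def conditional_large_eq_prob(n_small, delta_n, tau, beta):
    """Probability of a large earthquake within the next delta_n small events,
    given that n_small small events have occurred since the last large one."""
    survived = 1.0 - weibull_cdf(n_small, tau, beta)
    window = weibull_cdf(n_small + delta_n, tau, beta) - weibull_cdf(n_small, tau, beta)
    return window / survived

# Illustrative parameters: characteristic count tau = 300, shape beta = 1.4
p = conditional_large_eq_prob(n_small=250, delta_n=50, tau=300.0, beta=1.4)
```

With a shape parameter above 1 the conditional probability grows as small events accumulate, which is the qualitative behaviour the abstract describes.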

  4. Fuzzy forecasting based on two-factors second-order fuzzy-trend logical relationship groups and the probabilities of trends of fuzzy logical relationships.

    Science.gov (United States)

    Chen, Shyi-Ming; Chen, Shen-Wen

    2015-03-01

In this paper, we present a new method for fuzzy forecasting based on two-factors second-order fuzzy-trend logical relationship groups and the probabilities of trends of fuzzy-trend logical relationships. First, the proposed method fuzzifies the historical training data of the main factor and the secondary factor into fuzzy sets, respectively, to form two-factors second-order fuzzy logical relationships. It then groups the obtained two-factors second-order fuzzy logical relationships into two-factors second-order fuzzy-trend logical relationship groups. Next, it calculates the probability of the "down-trend," the probability of the "equal-trend" and the probability of the "up-trend" of the two-factors second-order fuzzy-trend logical relationships in each group. Finally, it performs the forecasting based on these trend probabilities. We also apply the proposed method to forecast the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX) and the NTD/USD exchange rates. The experimental results show that the proposed method outperforms the existing methods.
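The probability step — estimating P(down), P(equal) and P(up) within each fuzzy-trend logical relationship group — reduces to relative frequencies of observed trend labels. A minimal sketch with a hypothetical group:

```python
from collections import Counter

def trend_probabilities(trends):
    """Estimate P(down), P(equal), P(up) for one fuzzy-trend logical
    relationship group from its observed trend labels."""
    counts = Counter(trends)
    total = sum(counts.values())
    return {t: counts.get(t, 0) / total for t in ("down", "equal", "up")}

group = ["up", "up", "down", "equal", "up"]   # hypothetical group
print(trend_probabilities(group))             # {'down': 0.2, 'equal': 0.2, 'up': 0.6}
```

The forecast for a new observation falling into this group would then be weighted by these three probabilities.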

  5. Long‐term time‐dependent probabilities for the third Uniform California Earthquake Rupture Forecast (UCERF3)

    Science.gov (United States)

    Field, Edward; Biasi, Glenn P.; Bird, Peter; Dawson, Timothy E.; Felzer, Karen R.; Jackson, David A.; Johnson, Kaj M.; Jordan, Thomas H.; Madden, Christopher; Michael, Andrew J.; Milner, Kevin; Page, Morgan T.; Parsons, Thomas E.; Powers, Peter; Shaw, Bruce E.; Thatcher, Wayne R.; Weldon, Ray J.; Zeng, Yuehua

    2015-01-01

The 2014 Working Group on California Earthquake Probabilities (WGCEP 2014) presents time-dependent earthquake probabilities for the third Uniform California Earthquake Rupture Forecast (UCERF3). Building on the UCERF3 time-independent model, published previously, renewal models are utilized to represent elastic-rebound-implied probabilities. A new methodology has been developed that solves applicability issues in the previous approach for unsegmented models. The new methodology also supports magnitude-dependent aperiodicity and accounts for the historic open interval on faults that lack a date-of-last-event constraint. Epistemic uncertainties are represented with a logic tree, producing 5,760 different forecasts. Results for a variety of evaluation metrics are presented, including logic-tree sensitivity analyses and comparisons to the previous model (UCERF2). For 30-year M≥6.7 probabilities, the most significant changes from UCERF2 are a threefold increase on the Calaveras fault and a threefold decrease on the San Jacinto fault. Such changes are due mostly to differences in the time-independent models (e.g., fault slip rates), with relaxation of segmentation and inclusion of multi-fault ruptures being particularly influential. In fact, some UCERF2 faults were simply too short to produce M≥6.7 events given the segmentation assumptions in that study. Probability model differences are also influential, with the implied gains (relative to a Poisson model) being generally higher in UCERF3. Accounting for the historic open interval is one reason. Another is an effective 27% increase in the total elastic-rebound-model weight. The exact factors influencing differences between UCERF2 and UCERF3, as well as the relative importance of logic-tree branches, vary throughout the region and depend on the evaluation metric of interest. For example, M≥6.7 probabilities may not be a good proxy for other hazard or loss measures. This sensitivity, coupled with the
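The elastic-rebound idea — a conditional rupture probability from a renewal model, and its implied gain relative to a time-independent Poisson model — can be sketched as below. The lognormal recurrence distribution and all numbers are illustrative stand-ins, not the BPT models or fault data used by WGCEP:

```python
from math import exp, log
from statistics import NormalDist

def lognormal_cdf(t, median, sigma):
    """CDF of a lognormal recurrence-time distribution."""
    return NormalDist().cdf((log(t) - log(median)) / sigma)

def renewal_cond_prob(elapsed, horizon, median, sigma):
    """P(rupture within `horizon` years | `elapsed` years since last event)."""
    f0 = lognormal_cdf(elapsed, median, sigma)
    f1 = lognormal_cdf(elapsed + horizon, median, sigma)
    return (f1 - f0) / (1.0 - f0)

def poisson_prob(horizon, mean_interval):
    """Time-independent probability over the same horizon."""
    return 1.0 - exp(-horizon / mean_interval)

# Illustrative fault: median recurrence 150 yr, sigma 0.5, 120 yr elapsed
p_renewal = renewal_cond_prob(120.0, 30.0, 150.0, 0.5)
gain = p_renewal / poisson_prob(30.0, 150.0)
```

Late in the cycle the renewal probability exceeds the Poisson probability, i.e. the implied gain is above 1, mirroring the gains discussed in the abstract.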

  6. Next-generation probabilistic seismicity forecasting

    Energy Technology Data Exchange (ETDEWEB)

    Hiemer, S.

    2014-07-01

The development of probabilistic seismicity forecasts is one of the most important tasks of seismologists at the present time. Such forecasts form the basis of probabilistic seismic hazard assessment, a widely used approach to generate ground motion exceedance maps. These hazard maps guide the development of building codes, and in the absence of the ability to deterministically predict earthquakes, good building and infrastructure planning is key to preventing catastrophes. Probabilistic seismicity forecasts are models that specify the occurrence rate of earthquakes as a function of space, time and magnitude. The models presented in this thesis are time-invariant mainshock occurrence models. Accordingly, the reliable estimation of the spatial and size distribution of seismicity is of crucial importance when constructing such probabilistic forecasts. We therefore focus on data-driven approaches to infer these distributions, circumventing the need for arbitrarily chosen external parameters and subjective expert decisions. Kernel estimation has been shown to appropriately transform discrete earthquake locations into spatially continuous probability distributions. However, we show that neglecting the information from fault networks constitutes a considerable shortcoming and thus limits the skill of these current seismicity models. We present a novel earthquake rate forecast that applies the kernel-smoothing method to both past earthquake locations and slip rates on mapped crustal faults, applied to Californian and European data. Our model is independent of biases caused by commonly used non-objective seismic zonations, which impose artificial borders of activity that are not expected in nature. Studying the spatial variability of the seismicity size distribution is of great importance. The b-value of the well-established empirical Gutenberg-Richter model forecasts the rates of hazard-relevant large earthquakes based on the observed rates of abundant small events. We propose a
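The fixed-bandwidth kernel step that such rate models build on can be sketched as a Gaussian smoothing of past epicenter locations onto a grid. The coordinates and bandwidth below are toy values; the thesis models additionally smooth fault slip rates and use adaptive bandwidths:

```python
import math

def smoothed_rate(grid_points, epicenters, bandwidth):
    """Gaussian kernel estimate of the relative earthquake rate at each
    grid point from past epicenter locations (x, y in km)."""
    rates = []
    norm = 2.0 * math.pi * bandwidth ** 2 * len(epicenters)
    for gx, gy in grid_points:
        density = sum(
            math.exp(-((gx - ex) ** 2 + (gy - ey) ** 2) / (2.0 * bandwidth ** 2))
            for ex, ey in epicenters
        )
        rates.append(density / norm)
    return rates

quakes = [(0.0, 0.0), (1.0, 0.5), (10.0, 10.0)]   # hypothetical epicenters
grid = [(0.5, 0.2), (20.0, 20.0)]
near, far = smoothed_rate(grid, quakes, bandwidth=2.0)
```

Grid cells close to past activity receive a higher forecast rate than remote cells, without imposing any zonation boundaries.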

  7. Next-generation probabilistic seismicity forecasting

    International Nuclear Information System (INIS)

    Hiemer, S.

    2014-01-01

The development of probabilistic seismicity forecasts is one of the most important tasks of seismologists at the present time. Such forecasts form the basis of probabilistic seismic hazard assessment, a widely used approach to generate ground motion exceedance maps. These hazard maps guide the development of building codes, and in the absence of the ability to deterministically predict earthquakes, good building and infrastructure planning is key to preventing catastrophes. Probabilistic seismicity forecasts are models that specify the occurrence rate of earthquakes as a function of space, time and magnitude. The models presented in this thesis are time-invariant mainshock occurrence models. Accordingly, the reliable estimation of the spatial and size distribution of seismicity is of crucial importance when constructing such probabilistic forecasts. We therefore focus on data-driven approaches to infer these distributions, circumventing the need for arbitrarily chosen external parameters and subjective expert decisions. Kernel estimation has been shown to appropriately transform discrete earthquake locations into spatially continuous probability distributions. However, we show that neglecting the information from fault networks constitutes a considerable shortcoming and thus limits the skill of these current seismicity models. We present a novel earthquake rate forecast that applies the kernel-smoothing method to both past earthquake locations and slip rates on mapped crustal faults, applied to Californian and European data. Our model is independent of biases caused by commonly used non-objective seismic zonations, which impose artificial borders of activity that are not expected in nature. Studying the spatial variability of the seismicity size distribution is of great importance. The b-value of the well-established empirical Gutenberg-Richter model forecasts the rates of hazard-relevant large earthquakes based on the observed rates of abundant small events. We propose a

  8. Verification of space weather forecasts at the UK Met Office

    Science.gov (United States)

    Bingham, S.; Sharpe, M.; Jackson, D.; Murray, S.

    2017-12-01

The UK Met Office Space Weather Operations Centre (MOSWOC) has produced space weather guidance twice a day since its official opening in 2014. Guidance includes 4-day probabilistic forecasts of X-ray flares, geomagnetic storms, high-energy electron events and high-energy proton events. Evaluation of such forecasts is important to forecasters, stakeholders, model developers and users, both to understand the performance of these forecasts and to identify strengths and weaknesses that enable further development. Met Office terrestrial near real-time verification systems have been adapted to provide verification of X-ray flare and geomagnetic storm forecasts. Verification is updated daily to produce Relative Operating Characteristic (ROC) curves, reliability diagrams and rolling Ranked Probability Skill Scores (RPSSs), thus providing understanding of forecast performance and skill. Results suggest that the MOSWOC issued X-ray flare forecasts are usually not statistically significantly better than a benchmark climatological forecast (where the climatology is based on observations from the previous few months). By contrast, the issued geomagnetic storm activity forecast typically performs better against this climatological benchmark.
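The Ranked Probability Skill Score used in such verification compares the issued probabilistic forecasts against a climatological benchmark. A minimal sketch with hypothetical three-category forecasts (the categories, probabilities and climatology below are invented for illustration):

```python
def rps(forecast_probs, observed_category):
    """Ranked probability score for one forecast over ordered categories."""
    cum_f, cum_o, score = 0.0, 0.0, 0.0
    for k, p in enumerate(forecast_probs):
        cum_f += p
        cum_o += 1.0 if k == observed_category else 0.0
        score += (cum_f - cum_o) ** 2
    return score

def rpss(forecasts, climatology, observations):
    """Skill of issued forecasts relative to a climatological benchmark;
    1 is perfect, 0 matches climatology, negative is worse."""
    rps_f = sum(rps(f, o) for f, o in zip(forecasts, observations))
    rps_c = sum(rps(climatology, o) for o in observations)
    return 1.0 - rps_f / rps_c

# Hypothetical 3-category storm forecasts (quiet / minor / severe)
fcsts = [[0.7, 0.2, 0.1], [0.2, 0.5, 0.3]]
clim = [0.6, 0.3, 0.1]
obs = [0, 1]
print(round(rpss(fcsts, clim, obs), 3))   # 0.574
```

A rolling RPSS simply recomputes this over a moving window of recent forecast-observation pairs.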

  9. Forecasting the Stock Market with Linguistic Rules Generated from the Minimize Entropy Principle and the Cumulative Probability Distribution Approaches

    Directory of Open Access Journals (Sweden)

    Chung-Ho Su

    2010-12-01

Full Text Available To forecast a complex and non-linear system, such as a stock market, advanced artificial intelligence algorithms, like neural networks (NNs) and genetic algorithms (GAs), have been proposed as new approaches. However, for the average stock investor, two major disadvantages are argued against these advanced algorithms: (1) the rules generated by NNs and GAs are difficult to apply in investment decisions; and (2) the time complexity of the algorithms to produce forecasting outcomes is very high. Therefore, to provide understandable rules for investors and to reduce the time complexity of forecasting algorithms, this paper proposes a novel model for the forecasting process, which combines two granulating methods (the minimize entropy principle approach and the cumulative probability distribution approach) and a rough set algorithm. The model verification demonstrates that the proposed model surpasses the three listed conventional fuzzy time-series models and a multiple regression model (MLR) in forecast accuracy.

  10. MASTER: a model to improve and standardize clinical breakpoints for antimicrobial susceptibility testing using forecast probabilities.

    Science.gov (United States)

    Blöchliger, Nicolas; Keller, Peter M; Böttger, Erik C; Hombach, Michael

    2017-09-01

The procedure for setting clinical breakpoints (CBPs) for antimicrobial susceptibility has been poorly standardized with respect to population data, pharmacokinetic parameters and clinical outcome. Tools to standardize CBP setting could result in improved antibiogram forecast probabilities. We propose a model to estimate probabilities for methodological categorization errors and defined zones of methodological uncertainty (ZMUs), i.e. ranges of zone diameters that cannot reliably be classified. The impact of ZMUs on methodological error rates was used for CBP optimization. The model distinguishes theoretical true inhibition zone diameters from observed diameters, which suffer from methodological variation. True diameter distributions are described with a normal mixture model. The model was fitted to observed inhibition zone diameters of clinical Escherichia coli strains. Repeated measurements for a quality control strain were used to quantify methodological variation. For 9 of 13 antibiotics analysed, our model predicted error rates of < 0.1%; error rates were > 0.1% for ampicillin, cefoxitin, cefuroxime and amoxicillin/clavulanic acid. Increasing the susceptible CBP (cefoxitin) and introducing ZMUs (ampicillin, cefuroxime, amoxicillin/clavulanic acid) decreased error rates to < 0.1%. ZMUs contained low numbers of isolates for ampicillin and cefuroxime (3% and 6%), whereas the ZMU for amoxicillin/clavulanic acid contained 41% of all isolates and was considered not practical. We demonstrate that CBPs can be improved and standardized by minimizing methodological categorization error rates. ZMUs may be introduced if an intermediate zone is not appropriate for pharmacokinetic/pharmacodynamic or drug dosing reasons. Optimized CBPs will provide a standardized antibiotic susceptibility testing interpretation at a defined level of probability. © The Author 2017. Published by Oxford University Press on behalf of the British Society for Antimicrobial Chemotherapy. All rights reserved. For
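The core error-rate computation — the chance that measurement noise pushes an observed zone diameter across the breakpoint — can be sketched as below. The diameters, breakpoint and methodological standard deviation are hypothetical, and the sketch uses point values where the paper fits a normal mixture to the true-diameter distribution:

```python
from statistics import NormalDist

def misclassification_rate(true_diameters, cbp, method_sd):
    """Expected fraction of isolates whose observed inhibition-zone diameter
    falls on the wrong side of the breakpoint, given methodological
    measurement noise ~ N(0, method_sd)."""
    noise = NormalDist(0.0, method_sd)
    errors = []
    for d in true_diameters:
        if d >= cbp:   # truly susceptible: error if observed < cbp
            errors.append(noise.cdf(cbp - d))
        else:          # truly resistant: error if observed >= cbp
            errors.append(1.0 - noise.cdf(cbp - d))
    return sum(errors) / len(errors)

# Hypothetical true diameters (mm), breakpoint 20 mm, method SD 1.2 mm
rate = misclassification_rate([14.0, 15.0, 24.0, 25.0, 26.0], 20.0, 1.2)
```

Isolates whose true diameters lie near the breakpoint dominate the error rate, which is why excluding a zone of methodological uncertainty around the CBP drives the rate down.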

  11. Medium Range Forecasts Representation (and Long Range Forecasts?)

    Science.gov (United States)

    Vincendon, J.-C.

    2009-09-01

The progress of numerical forecasting encourages us to look at ever more distant ranges, and we therefore issue more and more forecasts several days ahead. Nevertheless, precautions are necessary to provide the most reliable and relevant information possible. Whether delivered in a TV bulletin or on any other medium (Internet, mobile phone), the interpretation and representation of a medium-range forecast (5 - 15 days) must differ from those of a short-range forecast. Indeed, the predictability of a meteorological phenomenon decreases gradually with lead time, and it decreases all the more quickly for small-scale phenomena. After a few days, the probabilistic character of a forecast therefore becomes widely dominant. That is why, at Meteo-France, the forecasts for D+4 to D+7 have been accompanied by a confidence index for around ten years. It is a figure between 1 and 5: the closer it is to 5, the greater the confidence in the supplied forecast. In practice, one index is supplied for the period D+4 / D+5 and another for D+6 / D+7, and each day can receive a different forecast and be represented independently. We thus supply a global tendency over 24 hours, with symbols that become less precise as the range increases. Concrete examples will be presented. For the past two years, we have also published forecasts for D+8 / D+9, accompanied by a confidence label ("good reliability" or "to be confirmed"). These two days are grouped together on a single map because, at this range, the described tendency is relevant over a period of about 48 hours and at a spatial scale slightly larger than the synoptic scale. We therefore avoid producing more than two weather-type zones over France and confine ourselves to giving a trend for the temperatures (steady, rising or falling). Newspapers have begun to publish this information; televisions should soon follow. It is particularly

  12. Incorporating forecast uncertainties into EENS for wind turbine studies

    Energy Technology Data Exchange (ETDEWEB)

    Toh, G.K.; Gooi, H.B. [School of EEE, Nanyang Technological University, Singapore 639798 (Singapore)

    2011-02-15

The rapid increase in wind power generation around the world has stimulated the development of technologies to model the uncertainties of wind power, resulting from the stochastic nature of wind and fluctuations of demand, for the integration of wind turbine generators (WTGs). In this paper the load and wind power forecast errors are integrated into the expected energy not served (EENS) formulation through determination of probabilities using the normal distribution approach. The effects of forecast errors and wind energy penetration in the power system are examined. The impact of wind energy penetration on system reliability and the total cost of energy and reserve procurement is then studied for a conventional power system. The results show a degradation of system reliability with significant wind energy penetration in the generation system. This work provides useful insight into system reliability and economics for the independent system operator (ISO) to deploy energy/reserve providers when WTGs are integrated into the existing power system. (author)
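The normal-distribution treatment of forecast errors can be sketched with the closed-form expected shortfall of a normal variable: the net demand error (load minus wind) is modelled as Gaussian, and its expected exceedance of the committed capacity gives one period's EENS contribution. All numbers below are illustrative:

```python
from math import exp, pi, sqrt
from statistics import NormalDist

def eens_contribution(capacity, load_forecast, wind_forecast,
                      load_sd, wind_sd, hours=1.0):
    """Expected energy not served (MWh) for one period, modelling the net
    demand forecast error (load minus wind) as a normal variable."""
    net_mean = load_forecast - wind_forecast
    net_sd = sqrt(load_sd ** 2 + wind_sd ** 2)       # independent errors
    z = (capacity - net_mean) / net_sd
    phi = exp(-0.5 * z * z) / sqrt(2.0 * pi)          # standard normal pdf
    # E[max(net_demand - capacity, 0)] for a normal variable, closed form
    shortfall = net_sd * (phi - z * (1.0 - NormalDist().cdf(z)))
    return shortfall * hours

# Illustrative: 1000 MW load and 100 MW wind forecasts, two capacity levels
e_high = eens_contribution(1200.0, 1000.0, 100.0, 30.0, 50.0)
e_low = eens_contribution(950.0, 1000.0, 100.0, 30.0, 50.0)
```

Summing such contributions over periods (and over generation-outage states) yields the system EENS; larger wind forecast error widens the net distribution and raises the index.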

  13. Using the Reliability Theory for Assessing the Decision Confidence Probability for Comparative Life Cycle Assessments.

    Science.gov (United States)

    Wei, Wei; Larrey-Lassalle, Pyrène; Faure, Thierry; Dumoulin, Nicolas; Roux, Philippe; Mathias, Jean-Denis

    2016-03-01

Comparative decision making processes are widely used to identify which option (system, product, service, etc.) has the smaller environmental footprint and to provide recommendations that help stakeholders take future decisions. However, uncertainty complicates the comparison and the decision making. Probability-based decision support in LCA is a way to help stakeholders in their decision-making process. It calculates the decision confidence probability, which expresses the probability that one option has a smaller environmental impact than another. Here we apply reliability theory to approximate the decision confidence probability. We compare the traditional Monte Carlo method with a reliability method called the FORM method. The Monte Carlo method needs high computational time to calculate the decision confidence probability. The FORM method enables us to approximate the decision confidence probability with fewer simulations than the Monte Carlo method by approximating the response surface. Moreover, the FORM method calculates the associated importance factors that correspond to a sensitivity analysis in relation to the probability. The importance factors allow stakeholders to determine which factors influence their decision. Our results clearly show that the reliability method provides additional useful information to stakeholders while also reducing the computational time.
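The quantity being approximated — the decision confidence probability P(impact_A < impact_B) — is easy to state as a Monte Carlo estimate, which is the baseline the FORM method is compared against. The lognormal impact distributions and their parameters below are hypothetical:

```python
import random

def decision_confidence(mu_a, sd_a, mu_b, sd_b, n=20000, seed=42):
    """Monte Carlo estimate of the decision confidence probability
    P(impact_A < impact_B) for two options with lognormal impact
    distributions (parameters on the log scale)."""
    rng = random.Random(seed)
    wins = sum(
        rng.lognormvariate(mu_a, sd_a) < rng.lognormvariate(mu_b, sd_b)
        for _ in range(n)
    )
    return wins / n

# Hypothetical footprints: option A centred below option B on the log scale
p = decision_confidence(mu_a=1.0, sd_a=0.3, mu_b=1.4, sd_b=0.3)
```

FORM replaces this sampling loop with a search for the most probable point on the limit-state surface impact_A = impact_B, which is why it needs far fewer model evaluations and yields importance factors as a by-product.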

  14. Dynamic Forecasting Conditional Probability of Bombing Attacks Based on Time-Series and Intervention Analysis.

    Science.gov (United States)

    Li, Shuying; Zhuang, Jun; Shen, Shifei

    2017-07-01

In recent years, various types of terrorist attacks have occurred, causing worldwide catastrophes. According to the Global Terrorism Database (GTD), among all attack tactics, bombing attacks happened most frequently, followed by armed assaults. In this article, a model for analyzing and forecasting the conditional probability of bombing attacks (CPBAs) based on time-series methods is developed. In addition, intervention analysis is used to analyze the sudden increase in the time-series process. The results show that the CPBA increased dramatically at the end of 2011. During that time, the CPBA increased by 16.0% in a two-month period to reach the peak value, and it remained 9.0% greater than the predicted level after the temporary effect gradually decayed. By contrast, no significant fluctuation can be found in the conditional probability process of armed assault. It can be inferred that some social unrest, such as America's troop withdrawal from Afghanistan and Iraq, could have led to the increase of the CPBA in Afghanistan, Iraq, and Pakistan. The integrated time-series and intervention model is used to forecast the monthly CPBA in 2014 and through 2064. The average relative error compared with the real data in 2014 is 3.5%. The model is also applied to the total number of attacks recorded by the GTD between 2004 and 2014. © 2016 Society for Risk Analysis.
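The intervention component the abstract describes — a jump at the event month that decays geometrically toward a permanent residual shift — can be sketched as follows. The baseline level, onset month and decay rate are illustrative, with the peak (+16%) and residual (+9%) taken from the abstract:

```python
def intervention_effect(t, onset, peak, residual, delta):
    """Intervention term: a jump of size `peak` at month `onset` that
    decays geometrically (rate `delta`) toward a permanent `residual`
    level shift."""
    if t < onset:
        return 0.0
    return residual + (peak - residual) * delta ** (t - onset)

# Illustrative monthly CPBA: 10% baseline, +16% peak decaying to +9%
series = [0.10 + intervention_effect(t, onset=24, peak=0.16,
                                     residual=0.09, delta=0.7)
          for t in range(40)]
```

Adding this term to an ordinary time-series model (e.g. an ARIMA fit of the pre-intervention months) gives the integrated model used for the forecast.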

  15. Operational foreshock forecasting: Fifteen years after

    Science.gov (United States)

    Ogata, Y.

    2010-12-01

We are concerned with operational forecasting of the probability that events are foreshocks of a forthcoming earthquake that is significantly larger (mainshock). Specifically, we define foreshocks as preshocks substantially smaller than the mainshock, by a magnitude gap of 0.5 or larger. The probability gain of a foreshock forecast is extremely high compared to long-term forecasts by renewal processes or various alarm-based intermediate-term forecasts, because of a large event's low occurrence rate in a short period and a narrow target region. Thus, it is desirable to establish operational foreshock probability forecasting, as seismologists have done for aftershocks. When a series of earthquakes occurs in a region, we attempt to discriminate foreshocks from a swarm or a mainshock-aftershock sequence. Namely, after real-time identification of an earthquake cluster using methods such as the single-link algorithm, the probability is calculated by applying statistical features that discriminate foreshocks from other types of clusters, considering the events' stronger proximity in time and space and their tendency towards chronologically increasing magnitudes. These features were modeled for probability forecasting, and the coefficients of the model were estimated in Ogata et al. (1996) for the JMA hypocenter data (M≧4, 1926-1993). Fifteen years have now passed since the publication of the above-stated work, so we are able to present the performance and validation of the forecasts (1994-2009) using the same model. Taking isolated events into consideration, the probability of the first events in a potential cluster being foreshocks varies in a range between 0+% and 10+% depending on their locations. This conditional forecasting performs significantly better than the unconditional (average) foreshock probability of 3.7% throughout the Japan region. Furthermore, when we have additional events in a cluster, the forecast probabilities range more widely from nearly 0% to
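The forecast idea — raising or lowering a cluster's foreshock probability as discriminating features are observed — can be illustrated with a simple Bayesian odds update. This is a schematic stand-in, not the statistical model of Ogata et al. (1996), and the likelihood ratios are hypothetical:

```python
def foreshock_probability(prior, likelihood_ratios):
    """Update the probability that a cluster is a foreshock sequence,
    given likelihood ratios for observed discriminating features
    (e.g. tight space-time clustering, increasing magnitudes)."""
    odds = prior / (1.0 - prior)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1.0 + odds)

# Hypothetical: 3.7% base rate, two features that each triple the odds
p = foreshock_probability(0.037, [3.0, 3.0])
```

Starting from the unconditional 3.7% base rate, a few moderately diagnostic features already move the forecast by the kind of margin the abstract reports.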

  16. Kepler Planet Reliability Metrics: Astrophysical Positional Probabilities for Data Release 25

    Science.gov (United States)

    Bryson, Stephen T.; Morton, Timothy D.

    2017-01-01

This document is very similar to KSCI-19092-003, Planet Reliability Metrics: Astrophysical Positional Probabilities, which describes the previous release of the astrophysical positional probabilities for Data Release 24. The important changes for Data Release 25 are: 1. The computation of the astrophysical positional probabilities uses the Data Release 25 processed pixel data for all Kepler Objects of Interest. 2. Computed probabilities now have associated uncertainties, whose computation is described in §4.1.3. 3. The scene modeling described in §4.1.2 uses background stars detected via ground-based high-resolution imaging, described in §5.1, that are not in the Kepler Input Catalog or UKIRT catalog. These newly detected stars are presented in Appendix B. Otherwise the text describing the algorithms and examples is largely unchanged from KSCI-19092-003.

  17. Bulk electric system reliability evaluation incorporating wind power and demand side management

    Science.gov (United States)

    Huang, Dange

Electric power systems are experiencing dramatic changes with respect to structure, operation and regulation and are facing increasing pressure due to environmental and societal constraints. Bulk electric system reliability is an important consideration in power system planning, design and operation, particularly in the new competitive environment. A wide range of methods have been developed to perform bulk electric system reliability evaluation. Theoretically, sequential Monte Carlo simulation can include all aspects and contingencies in a power system and can be used to produce an informative set of reliability indices. Owing to the growth of computing power, it has become a practical and viable technique for large-system reliability assessment, and it is used in the studies described in this thesis. The well-being approach used in this research provides the opportunity to integrate an accepted deterministic criterion into a probabilistic framework. This research work includes the investigation of important factors that impact bulk electric system adequacy evaluation and security constrained adequacy assessment using the well-being analysis framework. Load forecast uncertainty is an important consideration in an electrical power system. This research includes load forecast uncertainty considerations in bulk electric system reliability assessment, and the effects on system, load point and well-being indices and on reliability index probability distributions are examined. There has been increasing worldwide interest in the utilization of wind power as a renewable energy source over the last two decades due to enhanced public awareness of the environment. Increasing penetration of wind power has significant impacts on power system reliability, and security analyses become more uncertain due to the unpredictable nature of wind power. The effects of wind power additions in generating and bulk electric system reliability assessment considering site wind speed

  18. Case studies of extended model-based flood forecasting: prediction of dike strength and flood impacts

    Science.gov (United States)

    Stuparu, Dana; Bachmann, Daniel; Bogaard, Tom; Twigt, Daniel; Verkade, Jan; de Bruijn, Karin; de Leeuw, Annemargreet

    2017-04-01

Flood forecasts, warning and emergency response are important components in flood risk management. Most flood forecasting systems use models to translate weather predictions into forecasted discharges or water levels. However, this information is often not sufficient for real-time decisions. A sound understanding of the reliability of embankments and of flood dynamics is needed to react in time and reduce the negative effects of a flood. Where are the weak points in the dike system? When, where and how much water will flow? When and where is the greatest impact expected? Model-based flood impact forecasting tries to answer these questions by adding new dimensions to existing forecasting systems, providing forecasted information about: (a) the dike strength during the event (reliability), (b) the flood extent in case of an overflow or a dike failure (flood spread) and (c) the assets at risk (impacts). This work presents three case studies in which such a set-up is applied. Special features are highlighted. Forecasting of dike strength. The first case study focusses on the forecast of dike strength in the Netherlands for the river Rhine branches Waal, Nederrijn and IJssel. A so-called reliability transformation is used to translate the predicted water levels at selected dike sections into failure probabilities during a flood event. The reliability of a dike section is defined by fragility curves - a summary of the dike strength conditional on the water level. The reliability information enhances the emergency management and inspection of embankments. Ensemble forecasting. The second case study shows the setup of a flood impact forecasting system in Dumfries, Scotland. The existing forecasting system is extended with a 2D flood spreading model in combination with the Delft-FIAT impact model. Ensemble forecasts are used to account for the uncertainty in the precipitation forecasts, which is useful for quantifying the certainty of a forecasted flood event. From global

  19. Adaptively smoothed seismicity earthquake forecasts for Italy

    Directory of Open Access Journals (Sweden)

    Yan Y. Kagan

    2010-11-01

Full Text Available We present a model for estimation of the probabilities of future earthquakes of magnitudes m ≥ 4.95 in Italy. This model is a modified version of that proposed for California, USA, by Helmstetter et al. [2007] and Werner et al. [2010a], and it approximates seismicity using a spatially heterogeneous, temporally homogeneous Poisson point process. The temporal, spatial and magnitude dimensions are entirely decoupled. Magnitudes are independently and identically distributed according to a tapered Gutenberg-Richter magnitude distribution. We have estimated the spatial distribution of future seismicity by smoothing the locations of past earthquakes listed in two Italian catalogs: a short instrumental catalog, and a longer instrumental and historic catalog. The bandwidth of the adaptive spatial kernel is estimated by optimizing the predictive power of the kernel estimate of the spatial earthquake density in retrospective forecasts. When available and reliable, we used small earthquakes of m ≥ 2.95 to reveal active fault structures and 29 probable future epicenters. By calibrating the model with these two catalogs of different durations to create two forecasts, we intend to quantify the loss (or gain) of predictability incurred when only a short, but recent, data record is available. Both forecasts were scaled to five and ten years, and have been submitted to the Italian prospective forecasting experiment of the global Collaboratory for the Study of Earthquake Predictability (CSEP). An earlier forecast from the model was submitted by Helmstetter et al. [2007] to the Regional Earthquake Likelihood Model (RELM) experiment in California, and with more than half of the five-year experimental period over, the forecast has performed better than the others.
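The "adaptive" part of the smoothing — a bandwidth per earthquake that shrinks in dense zones and grows in sparse ones — is commonly taken as the distance to the k-th nearest neighbouring event. A minimal sketch with toy coordinates (the actual model optimizes k retrospectively):

```python
import math

def adaptive_bandwidths(epicenters, k=2):
    """Adaptive kernel bandwidth per epicenter: the distance to the k-th
    nearest other earthquake (small in dense zones, large in sparse ones)."""
    bws = []
    for i, (xi, yi) in enumerate(epicenters):
        dists = sorted(
            math.hypot(xi - xj, yi - yj)
            for j, (xj, yj) in enumerate(epicenters) if j != i
        )
        bws.append(dists[k - 1])
    return bws

pts = [(0.0, 0.0), (0.1, 0.0), (0.2, 0.1), (5.0, 5.0)]  # toy epicenters
bw = adaptive_bandwidths(pts, k=2)
```

Each event then contributes a kernel of its own bandwidth to the spatial density, so isolated historical events spread their probability widely while clustered ones stay sharp.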

  20. On New Cautious Structural Reliability Models in the Framework of imprecise Probabilities

    DEFF Research Database (Denmark)

    Utkin, Lev V.; Kozine, Igor

    2010-01-01

Uncertainty of parameters in engineering design has been modeled in different frameworks such as interval analysis, fuzzy set and possibility theories, random set theory and imprecise probability theory. The authors of this paper have for many years been developing new imprecise reliability models and generalizing conventional ones to imprecise probabilities. The theoretical setup employed for this purpose is imprecise statistical reasoning (Walley 1991), whose general framework is provided by upper and lower previsions (expectations). The appeal of this theory is its ability to capture both aleatory (stochastic) and epistemic uncertainty and the flexibility with which information can be represented. The previous research of the authors related to generalizing structural reliability models to imprecise statistical measures is summarized in Utkin & Kozine (2002) and Utkin (2004...

  1. 3-D visualization of ensemble weather forecasts - Part 2: Forecasting warm conveyor belt situations for aircraft-based field campaigns

    Science.gov (United States)

    Rautenhaus, M.; Grams, C. M.; Schäfler, A.; Westermann, R.

    2015-02-01

    We present the application of interactive 3-D visualization of ensemble weather predictions to forecasting warm conveyor belt situations during aircraft-based atmospheric research campaigns. Motivated by forecast requirements of the T-NAWDEX-Falcon 2012 campaign, a method to predict 3-D probabilities of the spatial occurrence of warm conveyor belts has been developed. Probabilities are derived from Lagrangian particle trajectories computed on the forecast wind fields of the ECMWF ensemble prediction system. Integration of the method into the 3-D ensemble visualization tool Met.3D, introduced in the first part of this study, facilitates interactive visualization of WCB features and derived probabilities in the context of the ECMWF ensemble forecast. We investigate the sensitivity of the method with respect to trajectory seeding and forecast wind field resolution. Furthermore, we propose a visual analysis method to quantitatively analyse the contribution of ensemble members to a probability region and, thus, to assist the forecaster in interpreting the obtained probabilities. A case study, revisiting a forecast case from T-NAWDEX-Falcon, illustrates the practical application of Met.3D and demonstrates the use of 3-D and uncertainty visualization for weather forecasting and for planning flight routes in the medium forecast range (three to seven days before take-off).

  2. Three-dimensional visualization of ensemble weather forecasts - Part 2: Forecasting warm conveyor belt situations for aircraft-based field campaigns

    Science.gov (United States)

    Rautenhaus, M.; Grams, C. M.; Schäfler, A.; Westermann, R.

    2015-07-01

    We present the application of interactive three-dimensional (3-D) visualization of ensemble weather predictions to forecasting warm conveyor belt situations during aircraft-based atmospheric research campaigns. Motivated by forecast requirements of the T-NAWDEX-Falcon 2012 (THORPEX - North Atlantic Waveguide and Downstream Impact Experiment) campaign, a method to predict 3-D probabilities of the spatial occurrence of warm conveyor belts (WCBs) has been developed. Probabilities are derived from Lagrangian particle trajectories computed on the forecast wind fields of the European Centre for Medium Range Weather Forecasts (ECMWF) ensemble prediction system. Integration of the method into the 3-D ensemble visualization tool Met.3D, introduced in the first part of this study, facilitates interactive visualization of WCB features and derived probabilities in the context of the ECMWF ensemble forecast. We investigate the sensitivity of the method with respect to trajectory seeding and grid spacing of the forecast wind field. Furthermore, we propose a visual analysis method to quantitatively analyse the contribution of ensemble members to a probability region and, thus, to assist the forecaster in interpreting the obtained probabilities. A case study, revisiting a forecast case from T-NAWDEX-Falcon, illustrates the practical application of Met.3D and demonstrates the use of 3-D and uncertainty visualization for weather forecasting and for planning flight routes in the medium forecast range (3 to 7 days before take-off).
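The probability field described above can be made concrete with a small sketch: under the assumption that each member's trajectories have already been filtered to WCB-qualifying ones, the per-cell probability is the fraction of ensemble members with at least one trajectory point in that grid cell (the function name and the `to_index` mapping are illustrative, not from the paper):

```python
import numpy as np

def wcb_probability(member_trajectories, grid_shape, to_index):
    """Per-grid-cell WCB occurrence probability, estimated as the fraction
    of ensemble members with at least one trajectory point in the cell.

    member_trajectories: list (one entry per ensemble member) of arrays of
        trajectory points, e.g. shape (n_points, 3) in (lon, lat, pressure).
    to_index: function mapping a point to a grid index tuple, or None if the
        point falls outside the grid.
    """
    counts = np.zeros(grid_shape)
    for points in member_trajectories:
        hit = np.zeros(grid_shape, dtype=bool)  # each member counts once per cell
        for p in points:
            idx = to_index(p)
            if idx is not None:
                hit[idx] = True
        counts += hit
    return counts / len(member_trajectories)
```

The member-wise de-duplication (`hit`) matters: a member crossing a cell with many points still contributes a single vote, so the result is a probability over members, not a trajectory density.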

  3. Probability of extreme interference levels computed from reliability approaches: application to transmission lines with uncertain parameters

    International Nuclear Information System (INIS)

    Larbi, M.; Besnier, P.; Pecqueux, B.

    2014-01-01

    This paper deals with the risk analysis of an EMC fault using a statistical approach based on reliability methods from probabilistic engineering mechanics. The probability of failure (i.e. the probability of exceeding a threshold) of a current induced by crosstalk is computed by taking into account uncertainties on the input parameters that influence interference levels in the context of transmission lines. The study allowed us to evaluate the probability of failure of the induced current using reliability methods at a relatively low computational cost compared to Monte Carlo simulation. (authors)
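As a rough illustration of the exceedance-probability framing (the Monte Carlo baseline the paper compares against, not its reliability method), here is a sketch with an assumed toy surrogate in place of the transmission-line computation; the parameter distributions are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def exceedance_probability(threshold, n_samples=200_000):
    """Monte Carlo estimate of the probability of failure, i.e. the
    probability that the induced current exceeds a threshold.
    The 'current' model and input distributions are illustrative."""
    height = rng.uniform(0.5, 1.5, n_samples)   # assumed uncertain inputs
    spacing = rng.uniform(0.8, 1.2, n_samples)
    current = 1.0 / (height * spacing)          # toy surrogate model
    return float(np.mean(current > threshold))
```

Reliability methods such as FORM aim to reproduce this probability with far fewer model evaluations, which is the computational advantage the abstract refers to.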

  4. Predictive Uncertainty Estimation in Water Demand Forecasting Using the Model Conditional Processor

    Directory of Open Access Journals (Sweden)

    Amos O. Anele

    2018-04-01

    Full Text Available In a previous paper, a number of potential models for short-term water demand (STWD) prediction were analysed to find the ones with the best fit. The results obtained in Anele et al. (2017) showed that hybrid models may be considered accurate and appropriate forecasting models for STWD prediction. However, even the best single-valued forecast does not guarantee reliable and robust decisions, which can be properly obtained via model uncertainty processors (MUPs). MUPs provide an estimate of the full predictive densities and not only the single-valued expected prediction. Amongst the available MUPs, the purpose of this paper is to use the multi-variate version of the model conditional processor (MCP), proposed by Todini (2008), to demonstrate how estimating the predictive probability conditional on a number of relatively good predictive models may improve our knowledge, thus reducing the predictive uncertainty (PU) when forecasting into the unknown future. Through the MCP approach, the probability distribution of the future water demand can be assessed depending on the forecast provided by one or more deterministic forecasting models. Based on average weekly data of 168 h, the probability density of the future demand is built conditional on three models' predictions, namely the autoregressive-moving average (ARMA), feed-forward back-propagation neural network (FFBP-NN) and hybrid model (i.e., the combined forecast from ARMA and FFBP-NN). The results obtained show that the MCP may be effectively used for real-time STWD prediction since it brings out the PU connected to its forecast, and such information could help water utilities estimate the risk connected to a decision.
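The core of the MCP is Gaussian conditioning: in a normal-transformed space, the predictive density of the observation given several model forecasts follows from the joint covariance. A minimal sketch, assuming the variables are already (approximately) jointly normal and skipping the normal quantile transform the full method uses:

```python
import numpy as np

def mcp_conditional(y_hist, X_hist, x_new):
    """Gaussian sketch of the Model Conditional Processor: condition the
    observed variable on k model predictions.

    y_hist: (n,) historical observations; X_hist: (n, k) matching forecasts
    from k models; x_new: (k,) current forecasts.
    Returns (mean, variance) of the predictive density.
    """
    data = np.column_stack([y_hist, X_hist])
    mu = data.mean(axis=0)
    cov = np.cov(data, rowvar=False)
    s_yy = cov[0, 0]        # variance of observations
    s_yx = cov[0, 1:]       # covariance obs vs forecasts
    s_xx = cov[1:, 1:]      # covariance among forecasts
    w = np.linalg.solve(s_xx, s_yx)
    mean = mu[0] + w @ (np.asarray(x_new) - mu[1:])
    var = s_yy - s_yx @ w   # residual (predictive) variance
    return mean, var
```

The predictive variance is always no larger than the climatological variance of the observations, which is exactly the "reduction of predictive uncertainty" the abstract describes.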

  5. Evaluation of statistical models for forecast errors from the HBV model

    Science.gov (United States)

    Engeland, Kolbjørn; Renard, Benjamin; Steinsland, Ingelin; Kolberg, Sjur

    2010-04-01

    Three statistical models for the forecast errors for inflow into the Langvatn reservoir in Northern Norway have been constructed and tested according to the agreement between (i) the forecast distribution and the observations and (ii) median values of the forecast distribution and the observations. For the first model, observed and forecasted inflows were transformed by the Box-Cox transformation before a first-order auto-regressive model was constructed for the forecast errors. The parameters were conditioned on weather classes. In the second model, the Normal Quantile Transformation (NQT) was applied to observed and forecasted inflows before a similar first-order auto-regressive model was constructed for the forecast errors. For the third model, positive and negative errors were modeled separately. The errors were first NQT-transformed before conditioning the mean error values on climate, forecasted inflow and yesterday's error. To test the three models we applied three criteria: we wanted (a) the forecast distribution to be reliable; (b) the forecast intervals to be narrow; (c) the median values of the forecast distribution to be close to the observed values. Models 1 and 2 gave almost identical results. The median values improved the forecast, with the Nash-Sutcliffe efficiency R_eff increasing from 0.77 for the original forecast to 0.87 for the corrected forecasts. Models 1 and 2 over-estimated the forecast intervals but gave the narrowest intervals. Their main drawback was that their distributions are less reliable than Model 3's. For Model 3 the median values did not fit well since the auto-correlation was not accounted for. Since Model 3 did not benefit from the potential variance reduction that lies in bias estimation and removal, it gave on average wider forecast intervals than the two other models. At the same time, Model 3 on average slightly under-estimated the forecast intervals, probably explained by the use of average measures to evaluate the fit.
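The first model's two ingredients, a Box-Cox transform followed by a first-order autoregressive error model, can be sketched in a few lines (the Box-Cox parameter and the weather-class conditioning of the paper are omitted here):

```python
import numpy as np

def boxcox(x, lam):
    """Box-Cox transform; reduces to log for lam == 0."""
    return np.log(x) if lam == 0 else (x ** lam - 1.0) / lam

def fit_ar1(errors):
    """Fit a first-order autoregressive model e_t = a * e_{t-1} + w_t by
    least squares; returns (a, residual standard deviation)."""
    e0, e1 = errors[:-1], errors[1:]
    a = np.dot(e0, e1) / np.dot(e0, e0)
    resid = e1 - a * e0
    return a, resid.std(ddof=1)

# Sketch of the workflow (lam is an assumed transform parameter):
#   errors = boxcox(observed, lam) - boxcox(forecast, lam)
#   a, s = fit_ar1(errors)
#   next_error_mean = a * errors[-1]   # centre of the corrected forecast
```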

  6. Predicting Flow Breakdown Probability and Duration in Stochastic Network Models: Impact on Travel Time Reliability

    Energy Technology Data Exchange (ETDEWEB)

    Dong, Jing [ORNL; Mahmassani, Hani S. [Northwestern University, Evanston

    2011-01-01

    This paper proposes a methodology to produce random flow breakdown endogenously in a mesoscopic operational model, by capturing breakdown probability and duration. It builds on previous research findings that the probability of flow breakdown can be represented as a function of flow rate and that the duration can be characterized by a hazard model. By generating random flow breakdowns at various levels and capturing the traffic characteristics at the onset of the breakdown, the stochastic network simulation model provides a tool for evaluating travel time variability. The proposed model can be used for (1) providing reliability-related traveler information; (2) designing ITS (intelligent transportation systems) strategies to improve reliability; and (3) evaluating reliability-related performance measures of the system.
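The two components named above, a breakdown probability that grows with flow rate and a hazard model for duration, can be sketched as follows; the logistic coefficients and Weibull parameters are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

def breakdown_probability(flow, a=-8.0, b=0.004):
    """Logistic model: P(breakdown) rises with flow rate (veh/h).
    Coefficients a, b are illustrative."""
    return 1.0 / (1.0 + np.exp(-(a + b * flow)))

def sample_breakdown_duration(scale=15.0, shape=1.5):
    """Breakdown duration (minutes) drawn from a Weibull hazard model
    (assumed parameters)."""
    return scale * rng.weibull(shape)

def simulate_step(flow):
    """One simulation step: draw whether flow breaks down at this flow
    rate, and if so, for how long."""
    if rng.random() < breakdown_probability(flow):
        return sample_breakdown_duration()
    return 0.0
```

Repeating `simulate_step` over a network's links yields the distribution of travel times, from which reliability measures can be computed.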

  7. Estimating the benefits of single value and probability forecasting for flood warning

    NARCIS (Netherlands)

    Verkade, J.S.; Werner, M.G.F.

    2011-01-01

    Flood risk can be reduced by means of flood forecasting, warning and response systems (FFWRS). These systems include a forecasting sub-system which is imperfect, meaning that inherent uncertainties in hydrological forecasts may result in false alarms and missed events. This forecasting uncertainty

  8. Estimating the benefits of single value and probability forecasting for flood warning

    NARCIS (Netherlands)

    Verkade, J.S.; Werner, M.G.F.

    2011-01-01

    Flood risk can be reduced by means of flood forecasting, warning and response systems (FFWRS). These systems include a forecasting sub-system which is imperfect, meaning that inherent uncertainties in hydrological forecasts may result in false alarms and missed floods, or surprises. This forecasting

  9. Three-dimensional visualization of ensemble weather forecasts – Part 2: Forecasting warm conveyor belt situations for aircraft-based field campaigns

    Directory of Open Access Journals (Sweden)

    M. Rautenhaus

    2015-07-01

    Full Text Available We present the application of interactive three-dimensional (3-D) visualization of ensemble weather predictions to forecasting warm conveyor belt situations during aircraft-based atmospheric research campaigns. Motivated by forecast requirements of the T-NAWDEX-Falcon 2012 (THORPEX – North Atlantic Waveguide and Downstream Impact Experiment) campaign, a method to predict 3-D probabilities of the spatial occurrence of warm conveyor belts (WCBs) has been developed. Probabilities are derived from Lagrangian particle trajectories computed on the forecast wind fields of the European Centre for Medium Range Weather Forecasts (ECMWF) ensemble prediction system. Integration of the method into the 3-D ensemble visualization tool Met.3D, introduced in the first part of this study, facilitates interactive visualization of WCB features and derived probabilities in the context of the ECMWF ensemble forecast. We investigate the sensitivity of the method with respect to trajectory seeding and grid spacing of the forecast wind field. Furthermore, we propose a visual analysis method to quantitatively analyse the contribution of ensemble members to a probability region and, thus, to assist the forecaster in interpreting the obtained probabilities. A case study, revisiting a forecast case from T-NAWDEX-Falcon, illustrates the practical application of Met.3D and demonstrates the use of 3-D and uncertainty visualization for weather forecasting and for planning flight routes in the medium forecast range (3 to 7 days before take-off).

  10. Online probabilistic learning with an ensemble of forecasts

    Science.gov (United States)

    Thorey, Jean; Mallet, Vivien; Chaussin, Christophe

    2016-04-01

    Our objective is to produce a calibrated weighted ensemble to forecast a univariate time series. In addition to a meteorological ensemble of forecasts, we rely on observations or analyses of the target variable. The celebrated Continuous Ranked Probability Score (CRPS) is used to evaluate the probabilistic forecasts. However, applying the CRPS to weighted empirical distribution functions (deriving from the weighted ensemble) may introduce a bias, in which case minimizing the CRPS does not produce the optimal weights. Thus we propose an unbiased version of the CRPS which relies on clusters of members and is strictly proper. We adapt online learning methods for the minimization of the CRPS. These methods generate the weights associated with the members in the forecast empirical distribution function. The weights are updated before each forecast step using only past observations and forecasts. Our learning algorithms provide the theoretical guarantee that, in the long run, the CRPS of the weighted forecasts is at least as good as the CRPS of any weighted ensemble with weights constant in time. In particular, the performance of our forecast is better than that of any subset ensemble with uniform weights. A noteworthy advantage of our algorithm is that it does not require any assumption on the distributions of the observations and forecasts, either for the application or for the theoretical guarantee to hold. As an application example on meteorological forecasts for photovoltaic production integration, we show that our algorithm generates a calibrated probabilistic forecast, with significant performance improvements on probabilistic diagnostic tools (the CRPS, the reliability diagram and the rank histogram).
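For reference, the plain (energy-form) CRPS of a weighted ensemble treated as a discrete distribution looks like the sketch below; this naive form is the estimator the abstract notes can be biased for the underlying forecast distribution, which motivates the paper's cluster-based unbiased variant:

```python
import numpy as np

def crps_weighted(members, weights, obs):
    """CRPS of a weighted ensemble treated as a discrete distribution:
    E|X - y| - 0.5 * E|X - X'|  (energy form of the CRPS)."""
    x = np.asarray(members, dtype=float)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    term1 = np.sum(w * np.abs(x - obs))
    term2 = 0.5 * np.sum(w[:, None] * w[None, :]
                         * np.abs(x[:, None] - x[None, :]))
    return term1 - term2
```

For a single member the score collapses to the absolute error, so the CRPS generalizes the mean absolute error to probabilistic forecasts.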

  11. Reliability assessment for thickness measurements of pipe wall using probability of detection

    International Nuclear Information System (INIS)

    Nakamoto, Hiroyuki; Kojima, Fumio; Kato, Sho

    2013-01-01

    This paper proposes a reliability assessment method for thickness measurements of pipe wall using probability of detection (POD). Thicknesses of pipes are measured by qualified inspectors with ultrasonic thickness gauges. The inspection results are affected by human factors of the inspectors and include some errors, because the inspectors have different experiences and frequency of inspections. In order to ensure reliability for inspection results, first, POD evaluates experimental results of pipe-wall thickness inspection. We verify that the results have differences depending on inspectors including qualified inspectors. Second, two human factors that affect POD are indicated. Finally, it is confirmed that POD can identify the human factors and ensure reliability for pipe-wall thickness inspections. (author)
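The POD evaluation described above can be illustrated with a minimal hit/miss estimator; the binning variable (wall-thickness range, inspector group) and names are assumptions, and formal POD studies usually fit a parametric model rather than binning:

```python
import numpy as np

def pod_by_bin(sizes, detected, bin_edges):
    """Empirical probability of detection (POD): the fraction of hit/miss
    inspection results that are hits, per bin."""
    sizes = np.asarray(sizes, dtype=float)
    detected = np.asarray(detected, dtype=float)  # 1 = detected, 0 = missed
    pods = []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        mask = (sizes >= lo) & (sizes < hi)
        pods.append(detected[mask].mean() if mask.any() else np.nan)
    return np.array(pods)
```

Comparing such curves across inspectors is one way to expose the human factors the abstract mentions.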

  12. On new cautious structural reliability models in the framework of imprecise probabilities

    DEFF Research Database (Denmark)

    Utkin, Lev; Kozine, Igor

    2010-01-01

    measures when the number of events of interest or observations is very small. The main feature of the models is that prior ignorance is not modelled by a fixed single prior distribution, but by a class of priors which is defined by upper and lower probabilities that can converge as statistical data......New imprecise structural reliability models are described in this paper. They are developed based on the imprecise Bayesian inference and are imprecise Dirichlet, imprecise negative binomial, gamma-exponential and normal models. The models are applied to computing cautious structural reliability...
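The imprecise Dirichlet model mentioned above has a particularly compact form for a single event: with `s` pseudo-observations of prior ignorance, the lower and upper probabilities bracket the relative frequency and converge as data accumulate. A minimal sketch (the default `s = 2` is a common convention, not prescribed by the paper):

```python
def idm_bounds(successes, trials, s=2.0):
    """Imprecise Dirichlet model bounds on the probability of an event
    after `trials` observations with `successes` occurrences.
    With no data the interval is the vacuous [0, 1]; its width s/(n + s)
    shrinks as observations accumulate."""
    return successes / (trials + s), (successes + s) / (trials + s)
```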

  13. How will climate novelty influence ecological forecasts? Using the Quaternary to assess future reliability.

    Science.gov (United States)

    Fitzpatrick, Matthew C; Blois, Jessica L; Williams, John W; Nieto-Lugilde, Diego; Maguire, Kaitlin C; Lorenz, David J

    2018-03-23

    Future climates are projected to be highly novel relative to recent climates. Climate novelty challenges models that correlate ecological patterns to climate variables and then use these relationships to forecast ecological responses to future climate change. Here, we quantify the magnitude and ecological significance of future climate novelty by comparing it to novel climates over the past 21,000 years in North America. We then use relationships between model performance and climate novelty derived from the fossil pollen record from eastern North America to estimate the expected decrease in predictive skill of ecological forecasting models as future climate novelty increases. We show that, in the high emissions scenario (RCP 8.5) and by late 21st century, future climate novelty is similar to or higher than peak levels of climate novelty over the last 21,000 years. The accuracy of ecological forecasting models is projected to decline steadily over the coming decades in response to increasing climate novelty, although models that incorporate co-occurrences among species may retain somewhat higher predictive skill. In addition to quantifying future climate novelty in the context of late Quaternary climate change, this work underscores the challenges of making reliable forecasts to an increasingly novel future, while highlighting the need to assess potential avenues for improvement, such as increased reliance on geological analogs for future novel climates and improving existing models by pooling data through time and incorporating assemblage-level information. © 2018 John Wiley & Sons Ltd.
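A common way to quantify climate novelty of the kind discussed above is the standardized distance to the nearest analogue in a baseline pool of climates; the sketch below is a generic version of that idea, not the paper's exact metric:

```python
import numpy as np

def climate_novelty(future_climates, baseline_climates):
    """Novelty of each future climate: Euclidean distance (in units of the
    baseline standard deviation per variable) to its nearest analogue in
    the baseline pool. Rows are climates, columns are climate variables."""
    f = np.asarray(future_climates, dtype=float)
    b = np.asarray(baseline_climates, dtype=float)
    sd = b.std(axis=0, ddof=1)
    f, b = f / sd, b / sd
    d = np.sqrt(((f[:, None, :] - b[None, :, :]) ** 2).sum(axis=-1))
    return d.min(axis=1)
```

High novelty means no good analogue exists in the baseline, which is exactly the regime in which correlative forecasting models are expected to lose skill.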

  14. A Bayesian modelling method for post-processing daily sub-seasonal to seasonal rainfall forecasts from global climate models and evaluation for 12 Australian catchments

    Science.gov (United States)

    Schepen, Andrew; Zhao, Tongtiegang; Wang, Quan J.; Robertson, David E.

    2018-03-01

    Rainfall forecasts are an integral part of hydrological forecasting systems at sub-seasonal to seasonal timescales. In seasonal forecasting, global climate models (GCMs) are now the go-to source for rainfall forecasts. For hydrological applications however, GCM forecasts are often biased and unreliable in uncertainty spread, and calibration is therefore required before use. There are sophisticated statistical techniques for calibrating monthly and seasonal aggregations of the forecasts. However, calibration of seasonal forecasts at the daily time step typically uses very simple statistical methods or climate analogue methods. These methods generally lack the sophistication to achieve unbiased, reliable and coherent forecasts of daily amounts and seasonal accumulated totals. In this study, we propose and evaluate a Rainfall Post-Processing method for Seasonal forecasts (RPP-S), which is based on the Bayesian joint probability modelling approach for calibrating daily forecasts and the Schaake Shuffle for connecting the daily ensemble members of different lead times. We apply the method to post-process ACCESS-S forecasts for 12 perennial and ephemeral catchments across Australia and for 12 initialisation dates. RPP-S significantly reduces bias in raw forecasts and improves both skill and reliability. RPP-S forecasts are also more skilful and reliable than forecasts derived from ACCESS-S forecasts that have been post-processed using quantile mapping, especially for monthly and seasonal accumulations. Several opportunities to improve the robustness and skill of RPP-S are identified. The new RPP-S post-processed forecasts will be used in ensemble sub-seasonal to seasonal streamflow applications.
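The Schaake Shuffle used above to connect daily ensemble members across lead times can be sketched in one common formulation: sort the calibrated members at each lead time, then reorder them so their ranks match those of a historical template of observed trajectories:

```python
import numpy as np

def schaake_shuffle(ensemble, template):
    """Reorder calibrated ensemble members at each lead time so their rank
    structure matches a historical template, restoring temporal coherence.

    ensemble, template: arrays of shape (n_members, n_leads).
    """
    ens = np.sort(np.asarray(ensemble, dtype=float), axis=0)  # values per lead
    ranks = np.argsort(np.argsort(template, axis=0), axis=0)  # template ranks
    out = np.empty_like(ens)
    for t in range(ens.shape[1]):
        out[:, t] = ens[ranks[:, t], t]
    return out
```

The marginal distribution at each lead time is unchanged (the same values appear, just reassigned to members), while the day-to-day correlation of each member now mimics the historical template.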

  15. Reliability analysis of idealized tunnel support system using probability-based methods with case studies

    Science.gov (United States)

    Gharouni-Nik, Morteza; Naeimi, Meysam; Ahadi, Sodayf; Alimoradi, Zahra

    2014-06-01

    In order to determine the overall safety of a tunnel support lining, a reliability-based approach is presented in this paper. Support elements in jointed rock tunnels are provided to control the ground movement caused by stress redistribution during the tunnel drive. The main support elements contributing to the stability of the tunnel structure are identified in order to capture the various aspects of reliability and sustainability in the system. The selection of efficient support methods for rock tunneling is a key factor in reducing the number of problems during construction and keeping the project cost and time within the limited budget and planned schedule. This paper introduces an approach by which decision-makers can find the overall reliability of the tunnel support system before selecting the final scheme of the lining system. Engineering reliability, a branch of statistics and probability, is applied to assess the reliability of the lining support system for the tunnel structure. Reliability analysis for evaluating the tunnel support performance is therefore the main idea used in this research. Decomposition approaches are used to produce the system block diagram and determine the failure probability of the whole system. The effectiveness of the proposed reliability model of the tunnel lining, together with the recommended approaches, is examined using several case studies, and the final value of reliability is obtained for different design scenarios. Assuming a linear correlation between safety factors and reliability parameters, the values of isolated reliabilities are determined for the different structural components of the tunnel support system. In order to determine individual safety factors, finite element modeling is employed for the different structural subsystems and the results of numerical analyses are obtained in

  16. Should we use seasonal meteorological ensemble forecasts for hydrological forecasting? A case study for Nordic watersheds in Canada.

    Science.gov (United States)

    Bazile, Rachel; Boucher, Marie-Amélie; Perreault, Luc; Leconte, Robert; Guay, Catherine

    2017-04-01

    Hydro-electricity is a major source of energy for many countries throughout the world, including Canada. Long lead-time streamflow forecasts are all the more valuable as they help decision making and dam management. Different techniques exist for long-term hydrological forecasting. Perhaps the most well-known is 'Extended Streamflow Prediction' (ESP), which considers past meteorological scenarios as possible, often equiprobable, future scenarios. In the ESP framework, those past-observed meteorological scenarios (climatology) are used in turn as the inputs of a chosen hydrological model to produce ensemble forecasts (one member corresponding to each year in the available database). Many hydropower companies, including Hydro-Québec (province of Quebec, Canada), use variants of the above-described ESP system operationally for long-term operation planning. The ESP system accounts for the hydrological initial conditions and for the natural variability of the meteorological variables. However, it cannot consider the current initial state of the atmosphere. Climate models can help remedy this drawback. In the context of a changing climate, dynamical forecasts issued from climate models seem to be an interesting avenue to improve upon the ESP method and could help hydropower companies adapt their management practices to an evolving climate. Long-range forecasts from climate models can also be helpful for water management at locations where records of past meteorological conditions are short or nonexistent. In this study, we compare 7-month hydrological forecasts obtained from climate model outputs to an ESP system. The ESP system mimics the one used operationally at Hydro-Québec. The dynamical climate forecasts are produced by the European Centre for Medium-Range Weather Forecasts (ECMWF) System 4. Forecast quality is assessed using numerical scores such as the Continuous Ranked Probability Score (CRPS) and the Ignorance score and also graphical tools such as the
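The ESP scheme described above is conceptually a single loop: run the hydrological model from today's initial conditions once per historical weather year. A minimal sketch (the model interface is an assumption for illustration):

```python
def esp_forecast(hydro_model, initial_state, historical_weather_by_year):
    """Extended Streamflow Prediction (ESP): run the hydrological model
    from the current initial conditions once per historical weather year;
    the resulting traces form an equiprobable ensemble forecast."""
    return [hydro_model(initial_state, weather)
            for weather in historical_weather_by_year.values()]
```

Because every member shares the same initial state, the ensemble spread reflects only meteorological climatology, which is precisely why ESP cannot exploit the current state of the atmosphere the way a dynamical seasonal forecast can.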

  17. Portals for Real-Time Earthquake Data and Forecasting: Challenge and Promise (Invited)

    Science.gov (United States)

    Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Feltstykket, R.; Donnellan, A.; Glasscoe, M. T.

    2013-12-01

    Earthquake forecasts have been computed by a variety of countries world-wide for over two decades. For the most part, forecasts have been computed for insurance, reinsurance and underwriters of catastrophe bonds. However, recent events clearly demonstrate that mitigating personal risk is becoming the responsibility of individual members of the public. Open access to a variety of web-based forecasts, tools, utilities and information is therefore required. Portals for data and forecasts present particular challenges, and require the development of both apps and the client/server architecture to deliver the basic information in real time. The basic forecast model we consider is the Natural Time Weibull (NTW) method (JBR et al., Phys. Rev. E, 86, 021106, 2012). This model uses small earthquakes ('seismicity-based models') to forecast the occurrence of large earthquakes, via data-mining algorithms combined with the ANSS earthquake catalog. This method computes large earthquake probabilities using the number of small earthquakes that have occurred in a region since the last large earthquake. Localizing these forecasts in space so that global forecasts can be computed in real time presents special algorithmic challenges, which we describe in this talk. Using 25 years of data from the ANSS California-Nevada catalog of earthquakes, we compute real-time global forecasts at a grid scale of 0.1o. We analyze and monitor the performance of these models using the standard tests, which include the Reliability/Attributes and Receiver Operating Characteristic (ROC) tests. It is clear from much of the analysis that data quality is a major limitation on the accurate computation of earthquake probabilities. We discuss the challenges of serving these datasets in real time on web-based platforms such as those at www.quakesim.org, www.e-decider.org, and www.openhazards.com.
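The ROC test mentioned above reduces to sweeping a probability threshold over the forecasts and recording hit rate against false-alarm rate; a generic sketch (not the authors' implementation):

```python
import numpy as np

def roc_points(probabilities, outcomes, thresholds):
    """(false-positive rate, hit rate) pairs for a set of probability
    thresholds, as used in ROC tests of binary forecasts."""
    p = np.asarray(probabilities, dtype=float)
    y = np.asarray(outcomes, dtype=bool)
    pts = []
    for th in thresholds:
        warn = p >= th                              # forecast says "event"
        hit = (warn & y).sum() / max(y.sum(), 1)    # fraction of events caught
        far = (warn & ~y).sum() / max((~y).sum(), 1)  # false-alarm rate
        pts.append((far, hit))
    return pts
```

A skilful forecast traces a curve well above the diagonal; the diagonal itself corresponds to forecasts carrying no information.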

  18. A Bayesian modelling method for post-processing daily sub-seasonal to seasonal rainfall forecasts from global climate models and evaluation for 12 Australian catchments

    Directory of Open Access Journals (Sweden)

    A. Schepen

    2018-03-01

    Full Text Available Rainfall forecasts are an integral part of hydrological forecasting systems at sub-seasonal to seasonal timescales. In seasonal forecasting, global climate models (GCMs) are now the go-to source for rainfall forecasts. For hydrological applications however, GCM forecasts are often biased and unreliable in uncertainty spread, and calibration is therefore required before use. There are sophisticated statistical techniques for calibrating monthly and seasonal aggregations of the forecasts. However, calibration of seasonal forecasts at the daily time step typically uses very simple statistical methods or climate analogue methods. These methods generally lack the sophistication to achieve unbiased, reliable and coherent forecasts of daily amounts and seasonal accumulated totals. In this study, we propose and evaluate a Rainfall Post-Processing method for Seasonal forecasts (RPP-S), which is based on the Bayesian joint probability modelling approach for calibrating daily forecasts and the Schaake Shuffle for connecting the daily ensemble members of different lead times. We apply the method to post-process ACCESS-S forecasts for 12 perennial and ephemeral catchments across Australia and for 12 initialisation dates. RPP-S significantly reduces bias in raw forecasts and improves both skill and reliability. RPP-S forecasts are also more skilful and reliable than forecasts derived from ACCESS-S forecasts that have been post-processed using quantile mapping, especially for monthly and seasonal accumulations. Several opportunities to improve the robustness and skill of RPP-S are identified. The new RPP-S post-processed forecasts will be used in ensemble sub-seasonal to seasonal streamflow applications.

  19. Estimating the benefits of single value and probability forecasting for flood warning

    Directory of Open Access Journals (Sweden)

    J. S. Verkade

    2011-12-01

    Full Text Available Flood risk can be reduced by means of flood forecasting, warning and response systems (FFWRS). These systems include a forecasting sub-system which is imperfect, meaning that inherent uncertainties in hydrological forecasts may result in false alarms and missed events. This forecasting uncertainty decreases the potential reduction of flood risk, but is seldom accounted for in estimates of the benefits of FFWRSs. In the present paper, a method to estimate the benefits of (imperfect) FFWRSs in reducing flood risk is presented. The method is based on a hydro-economic model of expected annual damage (EAD) due to flooding, combined with the concept of Relative Economic Value (REV). The estimated benefits include not only the reduction of flood losses due to a warning response, but also consider the costs of the warning response itself, as well as the costs associated with forecasting uncertainty. The method allows for estimation of the benefits of FFWRSs that use either deterministic or probabilistic forecasts. Through application to a case study, it is shown that FFWRSs using a probabilistic forecast have the potential to realise higher benefits at all lead times. However, it is also shown that provision of warning at increasing lead time does not necessarily lead to an increasing reduction of flood risk, but rather that an optimal lead time at which warnings are provided can be established as a function of forecast uncertainty and the cost-loss ratio of the user receiving and responding to the warning.
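The Relative Economic Value concept used above has a standard closed form in the static cost-loss model: the value of the warning system relative to climatology, normalized by the value of a perfect forecast. A sketch, with expenses expressed in units of the avoidable loss L:

```python
def relative_economic_value(hit_rate, false_alarm_rate, base_rate, cost_loss):
    """REV of a warning system in the static cost-loss model.
    cost_loss = C/L, the ratio of protection cost to avoidable loss."""
    s, r = base_rate, cost_loss
    # expected expense when acting on the warnings:
    e_forecast = (false_alarm_rate * (1 - s) * r   # needless protection
                  + hit_rate * s * r               # protection on true events
                  + (1 - hit_rate) * s)            # losses from missed events
    e_climate = min(r, s)    # best of "always protect" / "never protect"
    e_perfect = s * r        # protect exactly when the event occurs
    return (e_climate - e_forecast) / (e_climate - e_perfect)
```

REV is 1 for a perfect warning system, 0 when the system is no better than acting on climatology alone, and depends on the user's cost-loss ratio, which is why an optimal warning lead time exists per user.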

  20. Probabilistic Electricity Price Forecasting Models by Aggregation of Competitive Predictors

    Directory of Open Access Journals (Sweden)

    Claudio Monteiro

    2018-04-01

    Full Text Available This article presents original probabilistic price forecasting meta-models (PPFMCP models), by aggregation of competitive predictors, for day-ahead hourly probabilistic price forecasting. The best twenty predictors of the EEM2016 EPF competition are used to create ensembles of hourly spot price forecasts. For each hour, the parameter values of the probability density function (PDF) of a Beta distribution for the output variable (hourly price) can be directly obtained from the expected value and variance associated with the ensemble for that hour, using three aggregation strategies of predictor forecasts corresponding to three PPFMCP models. A Reliability Indicator (RI) and a Loss function Indicator (LI) are also introduced to give a measure of uncertainty of probabilistic price forecasts. The three PPFMCP models were satisfactorily applied to the real-world case study of the Iberian Electricity Market (MIBEL). Results showed that PPFMCP model 2, which uses aggregation by weight values according to daily ranks of predictors, was the best probabilistic meta-model from the point of view of mean absolute errors, as well as of the RI and LI. PPFMCP model 1, which uses the averaging of predictor forecasts, was the second-best meta-model. PPFMCP models allow risk-based decisions founded on the price forecast to be evaluated.
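Recovering Beta distribution parameters from an ensemble's mean and variance, as done above, is the method-of-moments inversion; a sketch for a price already scaled to [0, 1]:

```python
def beta_from_moments(mean, var):
    """Method-of-moments Beta(alpha, beta) parameters for a variable scaled
    to [0, 1]. Requires 0 < var < mean * (1 - mean)."""
    common = mean * (1 - mean) / var - 1.0
    return mean * common, (1 - mean) * common
```

For example, a mean of 0.5 with variance 0.05 yields Beta(2, 2), whose moments reproduce those inputs exactly.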

  1. Regional corrections and checking the reliability of geomagnetic forecasts

    International Nuclear Information System (INIS)

    Afanas'eva, V.I.; Shevnin, A.D.

    1978-01-01

    Regional corrections of the K-index estimate relative to the Moskva observatory are reviewed in order to improve the short-range forecast of geomagnetic activity and to extend it to ocean areas. Forecasts of storms of all categories and of weak disturbances have been verified for the predominant days in the catalogue of magnetic storms. It is shown that the adopted forecasting methods yield considerably good results for weak disturbances as well as for weak and moderate magnetic storms. Strong and very strong storms are less predictable

  2. Operational 0-3 h probabilistic quantitative precipitation forecasts: Recent performance and potential enhancements

    Science.gov (United States)

    Sokol, Z.; Kitzmiller, D.; Pešice, P.; Guan, S.

    2009-05-01

    The NOAA National Weather Service has maintained an automated, centralized 0-3 h prediction system for probabilistic quantitative precipitation forecasts since 2001. This advective-statistical system (ADSTAT) produces probabilities that rainfall will exceed multiple threshold values up to 50 mm at some location within a 40-km grid box. Operational characteristics and development methods for the system are described. Although development data were stratified by season and time of day, ADSTAT utilizes only a single set of nation-wide equations that relate predictor variables derived from radar reflectivity, lightning, satellite infrared temperatures, and numerical prediction model output to rainfall occurrence. A verification study documented herein showed that the operational ADSTAT reliably models regional variations in the relative frequency of heavy rain events. This was true even in the western United States, where no regional-scale, gridded hourly precipitation data were available during the development period in the 1990s. An effort was recently launched to improve the quality of ADSTAT forecasts by regionalizing the prediction equations and to adapt the model for application in the Czech Republic. We have experimented with incorporating various levels of regional specificity in the probability equations. The geographic localization study showed that in the warm season, regional climate differences and variations in the diurnal temperature cycle have a marked effect on the predictor-predictand relationships, and thus regionalization would lead to better statistical reliability in the forecasts.

  3. On density forecast evaluation

    NARCIS (Netherlands)

    Diks, C.

    2008-01-01

    Traditionally, probability integral transforms (PITs) have been popular means for evaluating density forecasts. For an ideal density forecast, the PITs should be uniformly distributed on the unit interval and independent. However, this is only a necessary condition, and not a sufficient one, as

  4. Weather Forecasts are for Wimps. Why Water Resource Managers Do Not Use Climate Forecasts

    Energy Technology Data Exchange (ETDEWEB)

    Rayner, S. [James Martin Institute of Science and Civilization, Said Business School, University of Oxford, OX1 1HP (United Kingdom)]; Lach, D. [Oregon State University, Corvallis, OR, 97331-4501 (United States)]; Ingram, H. [School of Social Ecology, University of California Irvine, Irvine, CA, 92697-7075 (United States)]

    2005-04-15

    Short-term climate forecasting offers the promise of improved hydrologic management strategies. However, water resource managers in the United States have proven reluctant to incorporate them in decision making. While managers usually cite poor reliability of the forecasts as the reason for this, they are seldom able to demonstrate knowledge of the actual performance of forecasts or to consistently articulate the level of reliability that they would require. Analysis of three case studies in California, the Pacific Northwest, and metro Washington DC identifies institutional reasons that appear to lie behind managers' reluctance to use the forecasts. These include traditional reliance on large built infrastructure, organizational conservatism and complexity, mismatch of temporal and spatial scales of forecasts to management needs, political disincentives to innovation, and regulatory constraints. The paper concludes that wider acceptance of the forecasts will depend on their being incorporated in existing organizational routines and industrial codes and practices, as well as on changes in management incentives for innovation. Finer spatial resolution of forecasts and the regional integration of multi-agency functions would also enhance their usability. The title of this article is taken from an advertising slogan for the Oldsmobile Bravura SUV.

  5. M≥7 Earthquake rupture forecast and time-dependent probability for the Sea of Marmara region, Turkey

    Science.gov (United States)

    Murru, Maura; Akinci, Aybige; Falcone, Giuseppe; Pucci, Stefano; Console, Rodolfo; Parsons, Thomas E.

    2016-01-01

    We forecast time-independent and time-dependent earthquake ruptures in the Marmara region of Turkey for the next 30 years using a new fault-segmentation model. We also augment time-dependent Brownian Passage Time (BPT) probability with static Coulomb stress changes (ΔCFF) from interacting faults. We calculate Mw > 6.5 probability from 26 individual fault sources in the Marmara region. We also consider a multisegment rupture model that allows higher-magnitude ruptures over some segments of the Northern branch of the North Anatolian Fault Zone (NNAF) beneath the Marmara Sea. A total of 10 different Mw=7.0 to Mw=8.0 multisegment ruptures are combined with the other regional faults at rates that balance the overall moment accumulation. We use Gaussian random distributions to treat parameter uncertainties (e.g., aperiodicity, maximum expected magnitude, slip rate, and consequently mean recurrence time) of the statistical distributions associated with each fault source. We then estimate uncertainties of the 30-year probability values for the next characteristic event obtained from three different models (Poisson, BPT, and BPT+ΔCFF) using a Monte Carlo procedure. The Gerede fault segment located at the eastern end of the Marmara region shows the highest 30-year probability, with a Poisson value of 29%, and a time-dependent interaction probability of 48%. We find an aggregated 30-year Poisson probability of M > 7.3 earthquakes at Istanbul of 35%, which increases to 47% if time dependence and stress transfer are considered. We calculate a two-fold probability gain (the ratio of time-dependent to time-independent probability) on the southern strands of the North Anatolian Fault Zone.
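    The time-dependent renewal probability used above can be sketched numerically from the BPT (inverse Gaussian) density; the fault parameters below are hypothetical, and the paper's ΔCFF stress interaction and Monte Carlo treatment of parameter uncertainty are not reproduced:

```python
import math

def bpt_pdf(t, mu, alpha):
    """Brownian Passage Time (inverse Gaussian) density with mean
    recurrence time mu and aperiodicity alpha."""
    return math.sqrt(mu / (2 * math.pi * alpha**2 * t**3)) * \
        math.exp(-(t - mu)**2 / (2 * mu * alpha**2 * t))

def bpt_cdf(t, mu, alpha, n=20000):
    """Numerical CDF by trapezoidal integration from ~0 to t."""
    eps = 1e-6
    h = (t - eps) / n
    s = 0.5 * (bpt_pdf(eps, mu, alpha) + bpt_pdf(t, mu, alpha))
    for i in range(1, n):
        s += bpt_pdf(eps + i * h, mu, alpha)
    return s * h

def conditional_prob(elapsed, window, mu, alpha):
    """P(event in [elapsed, elapsed + window] | no event up to elapsed)."""
    f1 = bpt_cdf(elapsed, mu, alpha)
    f2 = bpt_cdf(elapsed + window, mu, alpha)
    return (f2 - f1) / (1.0 - f1)

# Hypothetical fault: 250-year mean recurrence, aperiodicity 0.5,
# 200 years elapsed since the last event, 30-year forecast window.
p30 = conditional_prob(200.0, 30.0, 250.0, 0.5)
```

    The same window evaluated under a Poisson model would be 1 - exp(-30/250); the renewal model raises or lowers this depending on the elapsed time.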

  6. Ecological forecasts: An emerging imperative

    Science.gov (United States)

    James S. Clark; Steven R. Carpenter; Mary Barber; Scott Collins; Andy Dobson; Jonathan A. Foley; David M. Lodge; Mercedes Pascual; Roger Pielke; William Pizer; Cathy Pringle; Walter V. Reid; Kenneth A. Rose; Osvaldo Sala; William H. Schlesinger; Diana H. Wall; David Wear

    2001-01-01

    Planning and decision-making can be improved by access to reliable forecasts of ecosystem state, ecosystem services, and natural capital. Availability of new data sets, together with progress in computation and statistics, will increase our ability to forecast ecosystem change. An agenda that would lead toward a capacity to produce, evaluate, and communicate forecasts...

  7. Structural Reliability Using Probability Density Estimation Methods Within NESSUS

    Science.gov (United States)

    Chamis, Christos C. (Technical Monitor); Godines, Cody Ric

    2003-01-01

    A reliability analysis studies a mathematical model of a physical system taking into account uncertainties of design variables, and common results are estimations of a response density, which also implies estimations of its parameters. Some common density parameters include the mean value, the standard deviation, and specific percentile(s) of the response, which are measures of central tendency, variation, and probability regions, respectively. Reliability analyses are important since the results can lead to different designs by calculating the probability of observing safe responses in each of the proposed designs. All of this is done at the expense of added computational time as compared to a single deterministic analysis, which results in one value of the response out of the many that make up the density of the response. Sampling methods, such as Monte Carlo (MC) and Latin hypercube sampling (LHS), can be used to perform reliability analyses and can compute nonlinear response density parameters even if the response depends on many random variables. Hence, both methods are very robust; however, they are computationally expensive to use in the estimation of the response density parameters. Both methods are 2 of the 13 stochastic methods contained within the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) program. NESSUS is a probabilistic finite element analysis (FEA) program that was developed through funding from NASA Glenn Research Center (GRC). It has the additional capability of being linked to other analysis programs; therefore, probabilistic fluid dynamics, fracture mechanics, and heat transfer are only a few of the analyses possible with this software. The LHS method is the newest addition to the stochastic methods within NESSUS. Part of this work was to enhance NESSUS with the LHS method. The new LHS module is complete, has been successfully integrated with NESSUS, and has been used to study four different test cases that have been
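    A minimal sketch of how LHS estimates a failure probability, using a toy limit state rather than a NESSUS finite element model; the distributions and limit state are invented for illustration:

```python
import random

def latin_hypercube(n, dims, rng):
    """Generate n points in [0, 1)^dims with exactly one point per
    equal-width stratum in every dimension (strata shuffled independently)."""
    cols = []
    for _ in range(dims):
        col = [(i + rng.random()) / n for i in range(n)]
        rng.shuffle(col)
        cols.append(col)
    return list(zip(*cols))

def limit_state(r, s):
    """Illustrative limit state g = R - S; failure when g < 0."""
    return r - s

rng = random.Random(42)
pts = latin_hypercube(2000, 2, rng)
failures = 0
for u1, u2 in pts:
    r = 4.0 + 2.0 * u1   # resistance ~ U(4, 6)
    s = 5.0 * u2         # load ~ U(0, 5)
    if limit_state(r, s) < 0.0:
        failures += 1
pf = failures / len(pts)  # exact value for this toy case is 0.05
```

    The stratification is what distinguishes LHS from plain MC: each marginal is sampled evenly, which typically lowers the variance of the estimate for the same number of model evaluations.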

  8. Uncertainty Analysis of Multi-Model Flood Forecasts

    Directory of Open Access Journals (Sweden)

    Erich J. Plate

    2015-12-01

    Full Text Available This paper demonstrates, by means of a systematic uncertainty analysis, that the use of outputs from more than one model can significantly improve conditional forecasts of discharges or water stages, provided the models are structurally different. Discharge forecasts from two models and the actual forecasted discharge are assumed to form a three-dimensional joint probability density distribution (jpdf), calibrated on long time series of data. The jpdf is decomposed into conditional probability density distributions (cpdf) by means of Bayes' formula, as suggested and explored by Krzysztofowicz in a series of papers. In this paper his approach is simplified to optimize conditional forecasts for any set of two forecast models. Its application is demonstrated by means of models developed in a study of flood forecasting for station Stung Treng on the middle reach of the Mekong River in South-East Asia. Four different forecast models were used and pairwise combined: forecast with no model, with persistence model, with a regression model, and with a rainfall-runoff model. Working with cpdfs requires determination of dependency among variables, for which linear regressions are required, as was done by Krzysztofowicz. His Bayesian approach based on transforming observed probability distributions of discharges and forecasts into normal distributions is also explored. Results obtained with his method for normal prior and likelihood distributions are identical to results from direct multiple regressions. Furthermore, it is shown that in the present case forecast accuracy is only marginally improved if Weibull-distributed basic data are converted into normally distributed variables.
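    The abstract notes that, for normal prior and likelihood distributions, the Bayesian combination reduces to a direct multiple regression of observations on the two model forecasts. A minimal stdlib-only sketch with synthetic data; the forecast values and weights are invented for illustration:

```python
def ols_combine(f1, f2, obs):
    """Least-squares weights for y ~ a + b1*f1 + b2*f2 via the normal
    equations, solved by Gaussian elimination with partial pivoting."""
    n = len(obs)
    X = [[1.0, f1[i], f2[i]] for i in range(n)]
    # Build X^T X and X^T y.
    A = [[sum(X[i][r] * X[i][c] for i in range(n)) for c in range(3)]
         for r in range(3)]
    b = [sum(X[i][r] * obs[i] for i in range(n)) for r in range(3)]
    # Forward elimination.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, 3):
            m = A[r][col] / A[col][col]
            for c in range(col, 3):
                A[r][c] -= m * A[col][c]
            b[r] -= m * b[col]
    # Back substitution.
    w = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        w[r] = (b[r] - sum(A[r][c] * w[c] for c in range(r + 1, 3))) / A[r][r]
    return w

# Synthetic example: observations follow 0.7*f1 + 0.3*f2 exactly.
f1 = [10.0, 20.0, 30.0, 40.0, 25.0, 15.0]
f2 = [12.0, 18.0, 33.0, 38.0, 22.0, 17.0]
obs = [0.7 * a + 0.3 * b for a, b in zip(f1, f2)]
a0, w1, w2 = ols_combine(f1, f2, obs)
```

    With real data the recovered weights reflect each model's skill and the correlation between the two forecasts, which is why structurally different models combine best.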

  9. Reliability engineering

    International Nuclear Information System (INIS)

    Lee, Chi Woo; Kim, Sun Jin; Lee, Seung Woo; Jeong, Sang Yeong

    1993-08-01

    This book begins by asking what reliability is, covering the origin of reliability problems and the definition and uses of reliability. It also deals with probability and the calculation of reliability; reliability functions and failure rates; probability distributions in reliability; estimation of MTBF; processes of probability distributions; downtime, maintainability, and availability; breakdown and preventive maintenance; reliability design; reliability prediction and statistics; reliability testing; reliability data; and the design and management of reliability.
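    For the exponential lifetime model that underlies many of the topics listed (reliability function, failure rate, MTBF), the quantities have simple closed forms; a sketch with an assumed constant failure rate:

```python
import math

failure_rate = 0.002          # assumed constant failure rate (lambda), per hour

def reliability(t, lam=failure_rate):
    """Survival probability R(t) = exp(-lambda * t) for a constant
    failure rate (exponential lifetime model)."""
    return math.exp(-lam * t)

mtbf = 1.0 / failure_rate     # for the exponential model, MTBF = 1/lambda
r_100 = reliability(100.0)    # probability of surviving 100 hours
```

    More general models (e.g., Weibull) let the failure rate vary with time, which is what distinguishes wear-out behavior from the constant-rate case above.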

  10. Uncertainty Forecasts Improve Weather-Related Decisions and Attenuate the Effects of Forecast Error

    Science.gov (United States)

    Joslyn, Susan L.; LeClerc, Jared E.

    2012-01-01

    Although uncertainty is inherent in weather forecasts, explicit numeric uncertainty estimates are rarely included in public forecasts for fear that they will be misunderstood. Of particular concern are situations in which precautionary action is required at low probabilities, often the case with severe events. At present, a categorical weather…

  11. Real-time emergency forecasting technique for situation management systems

    Science.gov (United States)

    Kopytov, V. V.; Kharechkin, P. V.; Naumenko, V. V.; Tretyak, R. S.; Tebueva, F. B.

    2018-05-01

    The article describes a real-time emergency forecasting technique that increases the accuracy and reliability of the forecasting results of any emergency computational model applied for decision making in situation management systems. The computational models are improved by the Improved Brown's method, which applies fractal dimension to forecast short time series of data received from sensors and control systems. Reliability of the emergency forecasting results is ensured by filtering out invalid sensed data using methods of correlation analysis.
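    The method builds on Brown's exponential smoothing. Below is a sketch of classical Brown's double smoothing on a short sensor series; the fractal-dimension refinement and the correlation-based data filtering from the article are not reproduced:

```python
def brown_forecast(series, alpha=0.5, horizon=1):
    """Brown's double exponential smoothing: a linear-trend forecast
    built from two cascaded single smoothings."""
    s1 = s2 = series[0]
    for x in series[1:]:
        s1 = alpha * x + (1 - alpha) * s1   # first smoothing
        s2 = alpha * s1 + (1 - alpha) * s2  # smoothing of the smoothing
    level = 2 * s1 - s2
    trend = alpha / (1 - alpha) * (s1 - s2)
    return level + horizon * trend

readings = [10.0, 12.0, 14.0, 16.0, 18.0]   # short, linearly rising sensor series
next_value = brown_forecast(readings)       # one-step-ahead forecast
```

    On this perfectly linear toy series the forecast approaches, but does not exactly reach, the true next value of 20, because the smoothing is still catching up to the trend from its initialization.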

  12. Spatial electric load forecasting

    CERN Document Server

    Willis, H Lee

    2002-01-01

    Spatial Electric Load Forecasting. Contents: Consumer Demand for Power and Reliability; Coincidence and Load Behavior; Load Curve and End-Use Modeling; Weather and Electric Load; Weather Design Criteria and Forecast Normalization; Spatial Load Growth Behavior; Spatial Forecast Accuracy and Error Measures; Trending Methods; Simulation Method: Basic Concepts; A Detailed Look at the Simulation Method; Basics of Computerized Simulation; Analytical Building Blocks for Spatial Simulation; Advanced Elements of Computerized Simulation; Hybrid Trending-Simulation Methods; Advanced…

  13. United States streamflow probabilities based on forecasted La Nina, winter-spring 2000

    Science.gov (United States)

    Dettinger, M.D.; Cayan, D.R.; Redmond, K.T.

    1999-01-01

    Although for the last 5 months the Tahiti-Darwin Southern Oscillation Index (SOI) has hovered close to normal, the “equatorial” SOI has remained in the La Niña category and predictions are calling for La Niña conditions this winter. In view of these predictions of continuing La Niña and as a direct extension of previous studies of the relations between El Niño-Southern Oscillation (ENSO) conditions and streamflow in the United States (e.g., Redmond and Koch, 1991; Cayan and Webb, 1992; Redmond and Cayan, 1994; Dettinger et al., 1998; Garen, 1998; Cayan et al., 1999; Dettinger et al., in press), the probabilities that United States streamflows from December 1999 through July 2000 will be in the upper and lower thirds (terciles) of the historical records are estimated here. The processes that link ENSO to North American streamflow are discussed in detail in these diagnostic studies. Our justification for generating this forecast is threefold: (1) Cayan et al. (1999) recently have shown that ENSO influences on streamflow variations and extremes are proportionately larger than the corresponding precipitation teleconnections. (2) Redmond and Cayan (1994) and Dettinger et al. (in press) also have shown that the low-frequency evolution of ENSO conditions supports long-lead correlations between ENSO and streamflow in many rivers of the conterminous United States. (3) In many rivers, significant (weeks-to-months) delays between precipitation and the release to streams of snowmelt or ground-water discharge can support even longer term forecasts of streamflow than is possible for precipitation. The relatively slow, orderly evolution of El Niño-Southern Oscillation episodes, the accentuated dependence of streamflow upon ENSO, and the long lags between precipitation and flow encourage us to provide the following analysis as a simple prediction of this year’s river flows.

  14. Space Weather Forecasting at IZMIRAN

    Science.gov (United States)

    Gaidash, S. P.; Belov, A. V.; Abunina, M. A.; Abunin, A. A.

    2017-12-01

    Since 1998, the Institute of Terrestrial Magnetism, Ionosphere, and Radio Wave Propagation (IZMIRAN) has had an operating heliogeophysical service—the Center for Space Weather Forecasts. This center transfers the results of basic research in solar-terrestrial physics into daily forecasting of various space weather parameters for various lead times. The forecasts are promptly available to interested consumers. This article describes the center and the main types of forecasts it provides: solar and geomagnetic activity, magnetospheric electron fluxes, and probabilities of proton increases. The challenges associated with the forecasting of effects of coronal mass ejections and coronal holes are discussed. Verification data are provided for the center's forecasts.

  15. Reliability Assessment of Wind Farm Electrical System Based on a Probability Transfer Technique

    Directory of Open Access Journals (Sweden)

    Hejun Yang

    2018-03-01

    Full Text Available The electrical system of a wind farm has a significant influence on the wind farm's reliability and electrical energy yield. The disconnect switches installed in an electrical system not only improve operating flexibility, but also enhance reliability for a wind farm. Therefore, this paper develops a probability transfer technique for integrating the electrical topology structure, the isolation operation of disconnect switches, and the stochastic failure of electrical equipment into the reliability assessment of the wind farm electrical system. Firstly, since the traditional two-state reliability model of electrical equipment cannot account for the isolation operation, the paper develops a three-state reliability model to replace it, and a proportion apportion technique is presented to evaluate the state probabilities. Secondly, the paper develops a probability transfer technique based on the idea of transferring the unreliability of the electrical system to the energy transmission interruption of wind turbine generators (WTGs). Finally, some novel indices for describing the reliability of the wind farm electrical system are designed, and the variance coefficient of the designed indices is used as a convergence criterion to determine the termination of the assessment process. The proposed technique is applied to the reliability assessment of a wind farm with different topologies. The simulation results show that the proposed techniques are effective in practical applications.

  16. An analog ensemble for short-term probabilistic solar power forecast

    International Nuclear Information System (INIS)

    Alessandrini, S.; Delle Monache, L.; Sperati, S.; Cervone, G.

    2015-01-01

    Highlights: • A novel method for solar power probabilistic forecasting is proposed. • The forecast accuracy does not depend on the nominal power. • The impact of climatology on forecast accuracy is evaluated. - Abstract: The energy produced by photovoltaic farms has a variable nature depending on astronomical and meteorological factors. The former are the solar elevation and the solar azimuth, which are easily predictable without any uncertainty. The amount of liquid water met by the solar radiation within the troposphere is the main meteorological factor influencing the solar power production, as a fraction of short wave solar radiation is reflected by the water particles and cannot reach the earth surface. The total cloud cover is a meteorological variable often used to indicate the presence of liquid water in the troposphere and has a limited predictability, which is also reflected on the global horizontal irradiance and, as a consequence, on solar photovoltaic power prediction. This lack of predictability makes the solar energy integration into the grid challenging. A cost-effective utilization of solar energy over a grid strongly depends on the accuracy and reliability of the power forecasts available to the Transmission System Operators (TSOs). Furthermore, several countries have in place legislation requiring solar power producers to pay penalties proportional to the errors of day-ahead energy forecasts, which makes the accuracy of such predictions a determining factor for producers to reduce their economic losses. Probabilistic predictions can provide accurate deterministic forecasts along with a quantification of their uncertainty, as well as a reliable estimate of the probability to overcome a certain production threshold. In this paper we propose the application of an analog ensemble (AnEn) method to generate probabilistic solar power forecasts (SPF). The AnEn is based on an historical set of deterministic numerical weather prediction (NWP) model
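    The AnEn idea, matching the current deterministic forecast against similar past forecasts and using the corresponding observations as the predictive ensemble, can be sketched as follows; the data and the single-predictor distance metric are toy assumptions (operational AnEn implementations use several weighted predictors over a time window):

```python
def analog_ensemble(current, past_forecasts, past_observations, k=3):
    """Rank historical deterministic forecasts by similarity (absolute
    difference) to the current forecast and return the k corresponding
    observations as the predictive ensemble."""
    ranked = sorted(range(len(past_forecasts)),
                    key=lambda i: abs(past_forecasts[i] - current))
    return [past_observations[i] for i in ranked[:k]]

# Toy history of forecast/observed solar power (arbitrary units).
hist_fc  = [5.0, 7.0, 6.5, 9.0, 4.0, 6.8]
hist_obs = [4.8, 7.5, 6.0, 8.2, 4.5, 7.1]
ens = analog_ensemble(6.9, hist_fc, hist_obs)          # nearest analogs' obs
prob_above_6 = sum(o > 6.0 for o in ens) / len(ens)    # P(power > 6.0)
```

    The ensemble spread directly encodes how consistently similar forecasts verified in the past, which is where the uncertainty estimate comes from.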

  17. An application and verification of ensemble forecasting on wind power to assess operational risk indicators in power grids

    Energy Technology Data Exchange (ETDEWEB)

    Alessandrini, S.; Ciapessoni, E.; Cirio, D.; Pitto, A.; Sperati, S. [Ricerca sul Sistema Energetico RSE S.p.A., Milan (Italy). Power System Development Dept. and Environment and Sustainable Development Dept.]; Pinson, P. [Technical University of Denmark, Lyngby (Denmark). DTU Informatics]

    2012-07-01

    Wind energy is one of the so-called non-schedulable renewable sources, i.e., it must be exploited when it is available, otherwise it is lost. In European regulation it has priority of dispatch over conventional generation, to maximize green energy production. However, being variable and uncertain, wind (and solar) generation raises several issues for the security of power grid operation. In particular, Transmission System Operators (TSOs) need forecasts that are as accurate as possible. Nowadays a deterministic approach to wind power forecasting (WPF) can easily be considered insufficient to face the uncertainty associated with wind energy. In order to obtain information about the accuracy of a forecast and a reliable estimation of its uncertainty, probabilistic forecasting is becoming increasingly widespread. In this paper we investigate the performance of the COnsortium for Small-scale MOdelling Limited area Ensemble Prediction System (COSMO-LEPS). First, the ensemble properties (i.e., consistency, reliability) are assessed using different verification indices and diagrams calculated on wind power. Then we provide examples of how EPS-based wind power forecasts can be used in power system security analyses. Quantifying the forecast uncertainty makes it possible to determine the regulation reserve requirements more accurately, hence improving security of operation and reducing system costs. In particular, the paper also presents a probabilistic power flow (PPF) technique developed at RSE, aimed at evaluating the impact of wind power forecast accuracy on the probability of security violations in power systems. (orig.)

  18. Test-retest reliability of the Middlesex Assessment of Mental State (MEAMS): a preliminary investigation in people with probable dementia.

    Science.gov (United States)

    Powell, T; Brooker, D J; Papadopolous, A

    1993-05-01

    Relative and absolute test-retest reliability of the MEAMS was examined in 12 subjects with probable dementia and 12 matched controls. Relative reliability was good. Measures of absolute reliability showed scores changing by up to 3 points over an interval of a week. A version effect was found to be in evidence.

  19. A physics-based probabilistic forecasting model for rainfall-induced shallow landslides at regional scale

    Directory of Open Access Journals (Sweden)

    S. Zhang

    2018-03-01

    Full Text Available Conventional outputs of physics-based landslide forecasting models are presented as deterministic warnings by calculating the safety factor (Fs) of potentially dangerous slopes. However, these models are highly dependent on variables such as cohesion force and internal friction angle which are affected by a high degree of uncertainty especially at a regional scale, resulting in unacceptable uncertainties of Fs. Under such circumstances, the outputs of physical models are more suitable if presented in the form of landslide probability values. In order to develop such models, a method to link the uncertainty of soil parameter values with landslide probability is devised. This paper proposes the use of Monte Carlo methods to quantitatively express uncertainty by assigning random values to physical variables inside a defined interval. The inequality Fs < 1 is tested for each pixel in n simulations, which are then integrated into a single parameter. This parameter links the landslide probability to the uncertainties of soil mechanical parameters and is used to create a physics-based probabilistic forecasting model for rainfall-induced shallow landslides. The prediction ability of this model was tested in a case study, in which simulated forecasting of landslide disasters associated with heavy rainfalls on 9 July 2013 in the Wenchuan earthquake region of Sichuan province, China, was performed. The proposed model successfully forecasted landslides in 159 of the 176 disaster points registered by the geo-environmental monitoring station of Sichuan province. Such testing results indicate that the new model can be operated in a highly efficient way and shows more reliable results, attributable to its high prediction accuracy. Accordingly, the new model can be potentially packaged into a forecasting system for shallow landslides providing technological support for the mitigation of these disasters at regional scale.
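    The Monte Carlo scheme described above, sampling uncertain soil parameters and counting Fs < 1 outcomes, can be sketched for a single pixel with a dry infinite-slope stability model; all distributions and parameter values below are illustrative, not the paper's calibrated inputs:

```python
import math, random

def factor_of_safety(c, phi_deg, slope_deg=35.0, depth=2.0, gamma=19.0):
    """Infinite-slope factor of safety (dry case):
    Fs = c / (gamma * z * sin(b) * cos(b)) + tan(phi) / tan(b)."""
    b = math.radians(slope_deg)
    phi = math.radians(phi_deg)
    return c / (gamma * depth * math.sin(b) * math.cos(b)) + \
        math.tan(phi) / math.tan(b)

def landslide_probability(n=20000, seed=1):
    """Monte Carlo: sample uncertain soil parameters, count Fs < 1."""
    rng = random.Random(seed)
    fails = 0
    for _ in range(n):
        c = max(0.0, rng.gauss(8.0, 2.0))      # cohesion, kPa (assumed)
        phi = max(1.0, rng.gauss(28.0, 4.0))   # friction angle, deg (assumed)
        if factor_of_safety(c, phi) < 1.0:
            fails += 1
    return fails / n

p_slide = landslide_probability()   # fraction of simulations with Fs < 1
```

    Repeating this per pixel, with rainfall-dependent pore pressure added to the limit state, yields the probabilistic hazard map the paper describes.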

  20. A combined Importance Sampling and Kriging reliability method for small failure probabilities with time-demanding numerical models

    International Nuclear Information System (INIS)

    Echard, B.; Gayton, N.; Lemaire, M.; Relun, N.

    2013-01-01

    Applying reliability methods to a complex structure is often delicate for two main reasons. First, such a structure is fortunately designed with codified rules leading to a large safety margin, which means that failure is a small-probability event. Such a probability level is difficult to assess efficiently. Second, the structure's mechanical behaviour is modelled numerically in an attempt to reproduce the real response, and the numerical model tends to be more and more time-demanding as its complexity is increased to improve accuracy and to consider particular mechanical behaviour. As a consequence, performing a large number of model computations cannot be considered in order to assess the failure probability. To overcome these issues, this paper proposes an original and easily implementable method called AK-IS for active learning and Kriging-based Importance Sampling. This new method is based on the AK-MCS algorithm previously published by Echard et al. [AK-MCS: an active learning reliability method combining Kriging and Monte Carlo simulation. Structural Safety 2011;33(2):145–54]. It associates the Kriging metamodel and its advantageous stochastic property with the Importance Sampling method to assess small failure probabilities. It enables the correction or validation of the FORM approximation with only a few mechanical model computations. The efficiency of the method is first proved on two academic applications. It is then applied to assess the reliability of a challenging aerospace case study subjected to fatigue.
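    The importance-sampling half of AK-IS can be illustrated on a linear limit state whose exact failure probability is known; the Kriging surrogate and active learning are omitted here, and the proposal density is simply centered at the design point:

```python
import math, random

def failure_probability_is(beta=3.5, n=20000, seed=7):
    """Importance sampling for a small failure probability.
    Limit state g(u) = beta - u with u ~ N(0,1), so the exact answer is
    pf = Phi(-beta). Samples are drawn from N(beta, 1), i.e. centered at
    the design point u* = beta, and re-weighted by the density ratio."""
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(n):
        u = rng.gauss(beta, 1.0)                      # proposal sample
        if beta - u < 0.0:                            # failure indicator
            # weight = phi(u) / phi(u - beta) = exp(beta^2/2 - beta*u)
            acc += math.exp(0.5 * beta**2 - beta * u)
        # (in AK-IS the indicator would come from the Kriging surrogate
        #  instead of the true limit state)
    return acc / n

pf = failure_probability_is()
exact = 0.5 * math.erfc(3.5 / math.sqrt(2))           # Phi(-3.5)
```

    Crude Monte Carlo would need millions of samples to see this event; centering the sampling density at the design point makes a few thousand suffice.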

  1. Predictor-weighting strategies for probabilistic wind power forecasting with an analog ensemble

    Directory of Open Access Journals (Sweden)

    Constantin Junk

    2015-04-01

    Full Text Available Unlike deterministic forecasts, probabilistic predictions provide estimates of uncertainty, which is an additional value for decision-making. Previous studies have proposed the analog ensemble (AnEn, which is a technique to generate uncertainty information from a purely deterministic forecast. The objective of this study is to improve the AnEn performance for wind power forecasts by developing static and dynamic weighting strategies, which optimize the predictor combination with a brute-force continuous ranked probability score (CRPS) minimization and a principal component analysis (PCA) of the predictors. Predictors are taken from the high-resolution deterministic forecasts of the European Centre for Medium-Range Weather Forecasts (ECMWF), including forecasts of wind at several heights, geopotential height, pressure, and temperature, among others. The weighting strategies are compared at five wind farms in Europe and the U.S. situated in regions with different terrain complexity, both onshore and offshore, and significantly improve the deterministic and probabilistic AnEn forecast performance compared to the AnEn with 10-m wind speed and direction as predictors and compared to PCA-based approaches. The AnEn methodology also provides reliable estimation of the forecast uncertainty. The optimized predictor combinations are strongly dependent on terrain complexity, local wind regimes, and atmospheric stratification. Since the proposed predictor-weighting strategies can accomplish both the selection of relevant predictors as well as finding their optimal weights, the AnEn performance is improved by up to 20% at onshore and offshore sites.
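    The CRPS minimized by the brute-force weighting strategy can be computed for an ensemble forecast with the standard empirical formula; a minimal sketch with toy values:

```python
def crps_ensemble(members, obs):
    """Empirical CRPS of an ensemble forecast against one observation:
    CRPS = E|X - y| - 0.5 * E|X - X'|  (lower is better)."""
    m = len(members)
    term1 = sum(abs(x - obs) for x in members) / m
    term2 = sum(abs(a - b) for a in members for b in members) / (m * m)
    return term1 - 0.5 * term2

ens = [4.0, 5.0, 6.0]          # toy ensemble members (e.g., wind power)
score = crps_ensemble(ens, 5.5)
```

    A brute-force weighting search, as in the study, would evaluate this score averaged over a training period for each candidate predictor-weight combination and keep the combination with the lowest mean CRPS.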

  2. Personnel reliability impact on petrochemical facilities monitoring system's failure skipping probability

    Science.gov (United States)

    Kostyukov, V. N.; Naumenko, A. P.

    2017-08-01

    The paper dwells upon urgent issues of evaluating the impact of actions by operators of complex technological systems on their safe operation, considering the application of condition monitoring systems for elements and subsystems of petrochemical production facilities. The main task of the research is to distinguish factors and criteria for describing monitoring system properties that would allow evaluation of the impact of personnel errors on the operation of real-time condition monitoring and diagnostic systems for petrochemical machinery, and to find an objective criterion for the monitoring system class that accounts for the human factor. On the basis of the real-time condition monitoring concepts of sudden-failure skipping risk and static and dynamic error, one may evaluate the impact that personnel qualification has on monitoring system operation in terms of errors in personnel or operators' actions while receiving information from monitoring systems and operating a technological system. The operator is considered a part of the technological system. Personnel behaviour is usually a combination of the following parameters: input signal (information perception), reaction (decision making), and response (decision implementation). Based on several studies of the behaviour of nuclear power station operators in the USA, Italy, and other countries, as well as on research conducted by Russian scientists, the required data on operator reliability were selected for analysis of operator behaviour with diagnostic and monitoring systems at technological facilities. The calculations revealed that for the monitoring system selected as an example, the failure skipping risk for the set values of static (less than 0.01) and dynamic (less than 0.001) errors, considering all related factors of data on reliability of information perception, decision making, and reaction, is 0.037, in the case when all the facilities and error probability are under

  3. Flood forecasting and uncertainty of precipitation forecasts

    International Nuclear Information System (INIS)

    Kobold, Mira; Suselj, Kay

    2004-01-01

    Timely and accurate flood forecasting is essential for reliable flood warning. The effectiveness of flood warning depends on the forecast accuracy of certain physical parameters, such as the peak magnitude of the flood and its timing, location, and duration. Conceptual rainfall-runoff models enable the estimation of these parameters and lead to useful operational forecasts. Accurate rainfall is the most important input into hydrological models. The rainfall input can be real-time rain-gauge data, weather radar data, or meteorologically forecasted precipitation. The torrential nature of streams and fast runoff are characteristic of most Slovenian rivers. Extensive damage is caused almost every year by rainstorms affecting different regions of Slovenia. The lag time between rainfall and runoff is very short for Slovenian territory, and on-line data are used only for nowcasting. Forecasted precipitation is necessary for hydrological forecasts some days ahead. ECMWF (European Centre for Medium-Range Weather Forecasts) gives general forecasts for several days ahead, while more detailed precipitation data from the limited-area ALADIN/SI model are available for two days ahead. There is a certain degree of uncertainty in using such precipitation forecasts based on meteorological models. The variability of precipitation is very high in Slovenia, and the uncertainty of ECMWF-predicted precipitation is very large for Slovenian territory. The ECMWF model can predict precipitation events correctly, but in general underestimates the amount of precipitation; the average underestimation is about 60% for the Slovenian region. The predictions of the limited-area ALADIN/SI model up to 48 hours ahead show greater applicability in hydrological forecasting. The hydrological models are sensitive to precipitation input: the deviation of runoff is much bigger than the rainfall deviation, with a runoff-to-rainfall error fraction of about 1.6.
    If spatial and time distribution

  4. Use and Communication of Probabilistic Forecasts.

    Science.gov (United States)

    Raftery, Adrian E

    2016-12-01

    Probabilistic forecasts are becoming more and more available. How should they be used and communicated? What are the obstacles to their use in practice? I review experience with five problems where probabilistic forecasting played an important role. This leads me to identify five types of potential users: Low Stakes Users, who don't need probabilistic forecasts; General Assessors, who need an overall idea of the uncertainty in the forecast; Change Assessors, who need to know if a change is out of line with expectations; Risk Avoiders, who wish to limit the risk of an adverse outcome; and Decision Theorists, who quantify their loss function and perform the decision-theoretic calculations. This suggests that it is important to interact with users and to consider their goals. The cognitive research tells us that calibration is important for trust in probability forecasts, and that it is important to match the verbal expression with the task. The cognitive load should be minimized, reducing the probabilistic forecast to a single percentile if appropriate. Probabilities of adverse events and percentiles of the predictive distribution of quantities of interest seem often to be the best way to summarize probabilistic forecasts. Formal decision theory has an important role, but in a limited range of applications.

  6. Problems of Forecast

    OpenAIRE

    Kucharavy, Dmitry; De Guio, Roland

    2005-01-01

    The ability to foresee future technology is a key task of Innovative Design. The paper focuses on the obstacles to reliable prediction of technological evolution for the purpose of Innovative Design. First, a brief analysis of problems with existing forecasting methods is presented. The causes of the complexity of technology prediction are discussed in the context of reducing forecast errors. Second, using a contradiction analysis, a set of problems related to ...

  7. A physics-based probabilistic forecasting model for rainfall-induced shallow landslides at regional scale

    Science.gov (United States)

    Zhang, Shaojie; Zhao, Luqiang; Delgado-Tellez, Ricardo; Bao, Hongjun

    2018-03-01

    Conventional outputs of physics-based landslide forecasting models are presented as deterministic warnings by calculating the safety factor (Fs) of potentially dangerous slopes. However, these models are highly dependent on variables such as cohesion force and internal friction angle, which are affected by a high degree of uncertainty, especially at a regional scale, resulting in unacceptable uncertainties in Fs. Under such circumstances, the outputs of physical models are more suitable if presented in the form of landslide probability values. In order to develop such models, a method to link the uncertainty of soil parameter values with landslide probability is devised. This paper proposes the use of Monte Carlo methods to quantitatively express uncertainty by assigning random values to physical variables inside a defined interval. The inequality Fs < 1, evaluated over the sampled soil mechanical parameters, is used to create a physics-based probabilistic forecasting model for rainfall-induced shallow landslides. The prediction ability of this model was tested in a case study, in which forecasting of the landslide disasters associated with heavy rainfall on 9 July 2013 in the Wenchuan earthquake region of Sichuan province, China, was simulated. The proposed model successfully forecasted landslides at 159 of the 176 disaster points registered by the geo-environmental monitoring station of Sichuan province. These testing results indicate that the new model can be operated in a highly efficient way and shows more reliable results, attributable to its high prediction accuracy. Accordingly, the new model can potentially be packaged into a forecasting system for shallow landslides, providing technological support for the mitigation of these disasters at regional scale.
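The probabilistic scheme described above, in which soil parameters are drawn at random inside defined intervals and the landslide probability is the fraction of draws with Fs < 1, can be sketched as follows. This is a minimal illustration, not the authors' model: the simplified infinite-slope Fs formula, the parameter intervals, and all numeric values are assumptions.

```python
import math
import random

def factor_of_safety(cohesion, friction_deg, slope_deg=35.0,
                     unit_weight=19.0, depth=2.0):
    """Simplified infinite-slope factor of safety (dry conditions)."""
    beta = math.radians(slope_deg)
    phi = math.radians(friction_deg)
    return (cohesion / (unit_weight * depth * math.sin(beta) * math.cos(beta))
            + math.tan(phi) / math.tan(beta))

def failure_probability(n=20000, seed=42):
    """Landslide probability as the fraction of Monte Carlo draws with
    Fs < 1; cohesion (kPa) and friction angle (deg) are sampled uniformly
    inside assumed intervals."""
    rng = random.Random(seed)
    failures = sum(
        factor_of_safety(rng.uniform(2.0, 10.0),
                         rng.uniform(20.0, 35.0)) < 1.0
        for _ in range(n))
    return failures / n
```

Widening the parameter intervals widens the spread of Fs and changes the estimated probability accordingly, which is exactly the link between parameter uncertainty and landslide probability that the record describes.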

  8. A Wind Forecasting System for Energy Application

    Science.gov (United States)

    Courtney, Jennifer; Lynch, Peter; Sweeney, Conor

    2010-05-01

    Accurate forecasting of available energy is crucial for the efficient management and use of wind power in the national power grid. With energy output critically dependent upon wind strength, there is a need to reduce the errors associated with wind forecasting. The objective of this research is to get the best possible wind forecasts for the wind energy industry. To achieve this goal, three methods are being applied. First, a mesoscale numerical weather prediction (NWP) model called WRF (Weather Research and Forecasting) is being used to predict wind values over Ireland. Currently, a grid resolution of 10 km is used, and higher model resolutions are being evaluated to establish whether they are economically viable given the forecast skill improvement they produce. Second, the WRF model is being used in conjunction with ECMWF (European Centre for Medium-Range Weather Forecasts) ensemble forecasts to produce a probabilistic weather forecasting product. Due to the chaotic nature of the atmosphere, a single, deterministic weather forecast can only have limited skill. The ECMWF ensemble methods produce an ensemble of 51 global forecasts, twice a day, by perturbing the initial conditions of a 'control' forecast which is the best estimate of the initial state of the atmosphere. This method provides an indication of the reliability of the forecast and a quantitative basis for probabilistic forecasting. The limitation of ensemble forecasting lies in the fact that the perturbed model runs behave differently under different weather patterns and each model run is equally likely to be closest to the observed weather situation. Models have biases, and involve assumptions about physical processes and forcing factors such as underlying topography. Third, Bayesian Model Averaging (BMA) is being applied to the output from the ensemble forecasts in order to statistically post-process the results and achieve a better wind forecasting system.
BMA is a promising technique that will offer calibrated
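In BMA, the predictive PDF is a weighted mixture over the bias-corrected ensemble members, with weights reflecting each member's historical skill. A minimal sketch for a Gaussian mixture is below; the member values, weights and spread are illustrative assumptions, and the EM fitting of weights used in standard BMA is omitted.

```python
import math

def normal_cdf(x, mu, sigma):
    """CDF of N(mu, sigma^2) via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def bma_exceedance_prob(members, weights, sigma, threshold):
    """P(wind speed > threshold) under a BMA mixture of normals,
    one component centred on each (bias-corrected) ensemble member."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(w * (1.0 - normal_cdf(threshold, f, sigma))
               for f, w in zip(members, weights))
```

For example, with equally weighted members at 8, 10 and 12 m/s and a common spread, the probability of exceeding the central member's value is 0.5 by symmetry, which is a quick sanity check on the mixture.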

  9. The Value, Protocols, and Scientific Ethics of Earthquake Forecasting

    Science.gov (United States)

    Jordan, Thomas H.

    2013-04-01

    Earthquakes are different from other common natural hazards because precursory signals diagnostic of the magnitude, location, and time of impending seismic events have not yet been found. Consequently, the short-term, localized prediction of large earthquakes at high probabilities with low error rates (false alarms and failures-to-predict) is not yet feasible. An alternative is short-term probabilistic forecasting based on empirical statistical models of seismic clustering. During periods of high seismic activity, short-term earthquake forecasts can attain prospective probability gains up to 1000 relative to long-term forecasts. The value of such information is by no means clear, however, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing operational forecasting protocols in this sort of "low-probability environment." This paper will explore the complex interrelations among the valuation of low-probability earthquake forecasting, which must account for social intangibles; the protocols of operational forecasting, which must factor in large uncertainties; and the ethics that guide scientists as participants in the forecasting process, who must honor scientific principles without doing harm. Earthquake forecasts possess no intrinsic societal value; rather, they acquire value through their ability to influence decisions made by users seeking to mitigate seismic risk and improve community resilience to earthquake disasters. According to the recommendations of the International Commission on Earthquake Forecasting (www.annalsofgeophysics.eu/index.php/annals/article/view/5350), operational forecasting systems should appropriately separate the hazard-estimation role of scientists from the decision-making role of civil protection authorities and individuals. 
They should
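The Reasenberg and Jones (1989) model cited above combines Omori-Utsu temporal decay with a Gutenberg-Richter magnitude distribution. A sketch of the resulting rate and window-probability calculation follows; the default parameter values are generic illustrative stand-ins (the Garcia et al. regionalized values are not reproduced here), and p != 1 is assumed so the rate integral has a closed form.

```python
import math

def rj_rate(t, mag_main, mag_min, a=-1.67, b=0.91, c=0.05, p=1.08):
    """Reasenberg-Jones aftershock rate (events/day) of magnitude >= mag_min,
    t days after a mainshock of magnitude mag_main."""
    return 10.0 ** (a + b * (mag_main - mag_min)) * (t + c) ** (-p)

def rj_probability(t1, t2, mag_main, mag_min,
                   a=-1.67, b=0.91, c=0.05, p=1.08):
    """P(at least one aftershock >= mag_min in the window [t1, t2] days),
    treating the sequence as a Poisson process with the RJ rate."""
    k = 10.0 ** (a + b * (mag_main - mag_min))
    # closed-form integral of (t + c)^(-p); valid only for p != 1
    n_expected = k * ((t2 + c) ** (1 - p) - (t1 + c) ** (1 - p)) / (1 - p)
    return 1.0 - math.exp(-n_expected)
```

Because the rate decays with time, the same seven-day window yields a much lower probability a month after the mainshock than immediately after it, which is why the forecasts above were issued over rolling week, month and year horizons.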

  10. Communicating weather forecast uncertainty: Do individual differences matter?

    Science.gov (United States)

    Grounds, Margaret A; Joslyn, Susan L

    2018-03-01

    Research suggests that people make better weather-related decisions when they are given numeric probabilities for critical outcomes (Joslyn & Leclerc, 2012, 2013). However, it is unclear whether all users can take advantage of probabilistic forecasts to the same extent. The research reported here assessed key cognitive and demographic factors to determine their relationship to the use of probabilistic forecasts to improve decision quality. In two studies, participants decided between spending resources to prevent icy conditions on roadways or risk a larger penalty when freezing temperatures occurred. Several forecast formats were tested, including a control condition with the night-time low temperature alone and experimental conditions that also included the probability of freezing and advice based on expected value. All but those with extremely low numeracy scores made better decisions with probabilistic forecasts. Importantly, no groups made worse decisions when probabilities were included. Moreover, numeracy was the best predictor of decision quality, regardless of forecast format, suggesting that the advantage may extend beyond understanding the forecast to general decision strategy issues. This research adds to a growing body of evidence that numerical uncertainty estimates may be an effective way to communicate weather danger to general public end users.

  11. Probability for Weather and Climate

    Science.gov (United States)

    Smith, L. A.

    2013-12-01

    Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of

  12. 7 CFR 1710.206 - Approval requirements for load forecasts prepared pursuant to approved load forecast work plans.

    Science.gov (United States)

    2010-01-01

    ... financial ratings, and participation in reliability council, power pool, regional transmission group, power... analysis and modeling of the borrower's electric system loads as provided for in the load forecast work plan. (5) A narrative discussing the borrower's past, existing, and forecast of future electric system...

  13. Novel methodology for pharmaceutical expenditure forecast

    OpenAIRE

    Vataire, Anne-Lise; Cetinsoy, Laurent; Aballéa, Samuel; Rémuzat, Cécile; Urbinati, Duccio; Kornfeld, Åsa; Mzoughi, Olfa; Toumi, Mondher

    2014-01-01

    Background and objective: The value appreciation of new drugs across countries today features a disruption that is making the historical data that are used for forecasting pharmaceutical expenditure poorly reliable. Forecasting methods rarely addressed uncertainty. The objective of this project was to propose a methodology to perform pharmaceutical expenditure forecasting that integrates expected policy changes and uncertainty (developed for the European Commission as the ‘EU Pharmaceutical e...

  14. Benchmark analysis of forecasted seasonal temperature over different climatic areas

    Science.gov (United States)

    Giunta, G.; Salerno, R.; Ceppi, A.; Ercolani, G.; Mancini, M.

    2015-12-01

    From a long-term perspective, an improvement of seasonal forecasting, which is often exclusively based on climatology, could provide a new capability for the management of energy resources on a time scale of just a few months. This paper presents a benchmark analysis of long-term temperature forecasts over Italy in the year 2010, comparing the eni-kassandra meteo forecast (e-kmf®) model, the Climate Forecast System-National Centers for Environmental Prediction (CFS-NCEP) model, and the climatological reference (based on 25-year data) with observations. Statistical indexes are used to assess the reliability of predictions of 2-m monthly air temperature up to 12 weeks ahead. The results show that the best performance is achieved by the e-kmf® system, which improves the reliability of long-term forecasts compared to climatology and the CFS-NCEP model. By using the reliable high-performance forecast system, it is possible to optimize the natural gas portfolio and management operations, thereby obtaining a competitive advantage in the European energy market.
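The benchmark above hinges on comparing each forecast system's error against the climatological reference. One common way to condense such a comparison into a single index is an MSE-based skill score; the sketch below is generic and does not reproduce the paper's exact index definitions.

```python
def mse(pred, obs):
    """Mean squared error over paired forecast/observation values."""
    return sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs)

def skill_score(forecast, reference, obs):
    """MSE skill score: 1 is perfect, 0 matches the reference forecast
    (e.g. climatology), negative means worse than the reference.
    Assumes the reference is not itself a perfect forecast."""
    return 1.0 - mse(forecast, obs) / mse(reference, obs)
```

A system that beats climatology, as e-kmf® does in the record above, would show up here as a positive skill score against the climatological reference.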

  15. Advanced RESTART method for the estimation of the probability of failure of highly reliable hybrid dynamic systems

    International Nuclear Information System (INIS)

    Turati, Pietro; Pedroni, Nicola; Zio, Enrico

    2016-01-01

    The efficient estimation of system reliability characteristics is of paramount importance for many engineering applications. Real world system reliability modeling calls for the capability of treating systems that are: i) dynamic, ii) complex, iii) hybrid and iv) highly reliable. Advanced Monte Carlo (MC) methods offer a way to solve these types of problems, which would otherwise be infeasible owing to the potentially high computational costs. In this paper, the REpetitive Simulation Trials After Reaching Thresholds (RESTART) method is employed, extending it to hybrid systems for the first time (to the authors’ knowledge). The estimation accuracy and precision of RESTART highly depend on the choice of the Importance Function (IF) indicating how close the system is to failure: in this respect, proper IFs are here originally proposed to improve the performance of RESTART for the analysis of hybrid systems. The resulting overall simulation approach is applied to estimate the probability of failure of the control system of a liquid hold-up tank and of a pump-valve subsystem subject to degradation induced by fatigue. The results are compared to those obtained by standard MC simulation and by RESTART with classical IFs available in the literature. The comparison shows the improvement in the performance obtained by our approach. - Highlights: • We consider the issue of estimating small failure probabilities in dynamic systems. • We employ the RESTART method to estimate the failure probabilities. • New Importance Functions (IFs) are introduced to increase the method performance. • We adopt two dynamic, hybrid, highly reliable systems as case studies. • A comparison with literature IFs proves the effectiveness of the new IFs.

  16. Using HPC within an operational forecasting configuration

    Science.gov (United States)

    Jagers, H. R. A.; Genseberger, M.; van den Broek, M. A. F. H.

    2012-04-01

    Various natural disasters are caused by high-intensity events, for example: extreme rainfall can in a short time cause major damage in river catchments, and storms can cause havoc in coastal areas. To assist emergency response teams in operational decisions, it is important to have reliable information and predictions as soon as possible. This starts before the event by providing early warnings about imminent risks and estimated probabilities of possible scenarios. In the context of various applications worldwide, Deltares has developed an open and highly configurable forecasting and early warning system: Delft-FEWS. Finding the right balance between simulation time (and hence prediction lead time) and simulation accuracy and detail is challenging. Model resolution may be crucial to capture certain critical physical processes. Uncertainty in forcing conditions may require running large ensembles of models; data assimilation techniques may require additional ensembles and repeated simulations. The computational demand is steadily increasing and data streams become bigger. Using HPC resources is a logical step; in different settings Delft-FEWS has been configured to take advantage of distributed computational resources available to improve and accelerate the forecasting process (e.g. Montanari et al., 2006). We will illustrate the system by means of a couple of practical applications, including the real-time dynamic forecasting of wind driven waves, flow of water, and wave overtopping at dikes of Lake IJssel and neighboring lakes in the center of the Netherlands. Montanari et al., 2006. Development of an ensemble flood forecasting system for the Po river basin, First MAP D-PHASE Scientific Meeting, 6-8 November 2006, Vienna, Austria.

  17. SHORT-TERM FORECASTING OF MORTGAGE LENDING

    Directory of Open Access Journals (Sweden)

    Irina V. Orlova

    2013-01-01

    The article considers the methodological and algorithmic problems arising in modeling and forecasting time series of mortgage loans. It focuses on the processes forming the levels of mortgage-loan time series and on the problems of model choice and identification under small samples. For forecasting, an autoregressive moving average model was selected and implemented, which made it possible to obtain reliable forecasts.

  18. Reliability of windstorm predictions in the ECMWF ensemble prediction system

    Science.gov (United States)

    Becker, Nico; Ulbrich, Uwe

    2016-04-01

    Windstorms caused by extratropical cyclones are one of the most dangerous natural hazards in the European region. Therefore, reliable predictions of such storm events are needed. Case studies have shown that ensemble prediction systems (EPS) are able to provide useful information about windstorms between two and five days prior to the event. In this work, ensemble predictions with the European Centre for Medium-Range Weather Forecasts (ECMWF) EPS are evaluated over a four-year period. Within the 50 ensemble members, which are initialized every 12 hours and run for 10 days, windstorms are identified and tracked in time and space. By using a clustering approach, different predictions of the same storm are identified in the different ensemble members and compared to reanalysis data. The occurrence probability of the predicted storms is estimated by fitting a bivariate normal distribution to the storm track positions. Our results show, for example, that predicted storm clusters with occurrence probabilities of more than 50% have a matching observed storm in 80% of all cases at a lead time of two days. The predicted occurrence probabilities are reliable up to 3 days lead time. At longer lead times the occurrence probabilities are overestimated by the EPS.
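Checking whether predicted occurrence probabilities are reliable, as done above, amounts to binning the forecast probabilities and comparing each bin's mean forecast probability with the fraction of forecasts in that bin that verified. A minimal sketch of that bookkeeping (the bin count and data are illustrative, not the study's verification setup):

```python
def reliability_table(probs, outcomes, n_bins=5):
    """Bin forecast probabilities and, per non-empty bin, return
    (mean forecast probability, observed event frequency, count) --
    the data behind a reliability diagram."""
    bins = [[] for _ in range(n_bins)]
    for p, o in zip(probs, outcomes):
        idx = min(int(p * n_bins), n_bins - 1)  # clamp p == 1.0 into last bin
        bins[idx].append((p, o))
    table = []
    for cases in bins:
        if cases:
            mean_p = sum(p for p, _ in cases) / len(cases)
            obs_freq = sum(o for _, o in cases) / len(cases)
            table.append((mean_p, obs_freq, len(cases)))
    return table
```

For a reliable system the two entries in each row are close; overestimation at long lead times, as reported above, shows up as mean forecast probabilities exceeding the observed frequencies.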

  19. Multivariate performance reliability prediction in real-time

    International Nuclear Information System (INIS)

    Lu, S.; Lu, H.; Kolarik, W.J.

    2001-01-01

    This paper presents a technique for predicting system performance reliability in real time, considering multiple failure modes. The technique includes on-line multivariate monitoring and forecasting of selected performance measures and conditional performance reliability estimates. The performance measures across time are treated as a multivariate time series. A state-space approach is used to model the multivariate time series. Recursive forecasting is performed by adopting Kalman filtering. The predicted mean vectors and covariance matrix of the performance measures are used for the assessment of system survival/reliability with respect to the conditional performance reliability. The technique and modeling protocol discussed in this paper provide a means to forecast and evaluate the performance of an individual system in a dynamic environment in real time. The paper also presents an example to demonstrate the technique.
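The recursive forecasting step can be illustrated with a scalar Kalman filter plus a Gaussian reliability estimate. The paper's model is multivariate and state-space based, so this is only a one-dimensional sketch with assumed system and noise parameters.

```python
import math

def kalman_step(x, P, z, A=1.0, Q=0.01, H=1.0, R=0.25):
    """One predict/update cycle of a scalar Kalman filter
    (A: state transition, Q: process noise, H: observation, R: meas. noise)."""
    # predict
    x_pred = A * x
    P_pred = A * P * A + Q
    # update with measurement z
    K = P_pred * H / (H * P_pred * H + R)
    x_new = x_pred + K * (z - H * x_pred)
    P_new = (1.0 - K * H) * P_pred
    return x_new, P_new

def conditional_reliability(x, P, spec_limit):
    """P(performance measure stays below spec_limit), using the Gaussian
    predictive distribution N(x, P) produced by the filter."""
    return 0.5 * (1.0 + math.erf((spec_limit - x) / math.sqrt(2.0 * P)))
```

Feeding each new measurement through `kalman_step` and evaluating `conditional_reliability` on the predicted mean and variance mirrors, in one dimension, the real-time survival assessment the record describes.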

  20. Wind and load forecast error model for multiple geographically distributed forecasts

    Energy Technology Data Exchange (ETDEWEB)

    Makarov, Yuri V.; Reyes-Spindola, Jorge F.; Samaan, Nader; Diao, Ruisheng; Hafen, Ryan P. [Pacific Northwest National Laboratory, Richland, WA (United States)

    2010-07-01

    The impact of wind and load forecast errors on power grid operations is frequently evaluated by conducting multi-variant studies, where these errors are simulated repeatedly as random processes based on their known statistical characteristics. To simulate these errors correctly, we need to reflect their distributions (which do not necessarily follow a known distribution law), standard deviations, and auto- and cross-correlations. For instance, load and wind forecast errors can be closely correlated in different zones of the system. This paper introduces a new methodology for generating multiple cross-correlated random processes to produce forecast error time-domain curves based on a transition probability matrix computed from an empirical error distribution function. The matrix will be used to generate new error time series with statistical features similar to observed errors. We present the derivation of the method and some experimental results obtained by generating new error forecasts together with their statistics. (orig.)
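The core of the methodology, a transition probability matrix estimated from the empirical error series and then used to generate new series with similar statistics, can be sketched as a simple Markov chain over discretized error bins. The bin count, interval and data below are illustrative assumptions, and the cross-correlation machinery of the paper is omitted.

```python
import random

def transition_matrix(errors, n_bins, lo, hi):
    """Empirical transition probability matrix of a discretized error series."""
    width = (hi - lo) / n_bins
    states = [min(int((e - lo) / width), n_bins - 1) for e in errors]
    counts = [[0] * n_bins for _ in range(n_bins)]
    for i, j in zip(states, states[1:]):
        counts[i][j] += 1
    matrix = []
    for row in counts:
        total = sum(row)
        matrix.append([c / total for c in row] if total
                      else [1.0 / n_bins] * n_bins)  # unseen state: uniform
    return matrix

def simulate_errors(matrix, bin_centers, n, seed=1):
    """Generate a synthetic error series by walking the Markov chain."""
    rng = random.Random(seed)
    state = 0
    series = []
    for _ in range(n):
        series.append(bin_centers[state])
        state = rng.choices(range(len(matrix)), weights=matrix[state])[0]
    return series
```

Synthetic series generated this way inherit the one-step autocorrelation structure of the observed errors, which is the property the multi-variant grid studies rely on.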

  1. Bayesian analyses of seasonal runoff forecasts

    Science.gov (United States)

    Krzysztofowicz, R.; Reese, S.

    1991-12-01

    Forecasts of seasonal snowmelt runoff volume provide indispensable information for rational decision making by water project operators, irrigation district managers, and farmers in the western United States. Bayesian statistical models and communication frames have been researched in order to enhance the forecast information disseminated to the users, and to characterize forecast skill from the decision maker's point of view. Four products are presented: (i) a Bayesian Processor of Forecasts, which provides a statistical filter for calibrating the forecasts, and a procedure for estimating the posterior probability distribution of the seasonal runoff; (ii) the Bayesian Correlation Score, a new measure of forecast skill, which is related monotonically to the ex ante economic value of forecasts for decision making; (iii) a statistical predictor of monthly cumulative runoffs within the snowmelt season, conditional on the total seasonal runoff forecast; and (iv) a framing of the forecast message that conveys the uncertainty associated with the forecast estimates to the users. All analyses are illustrated with numerical examples of forecasts for six gauging stations from the period 1971-1988.
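Product (i), the Bayesian Processor of Forecasts, updates a climatological prior with the forecast to obtain a posterior runoff distribution. Under the simplest conjugate assumption of a Gaussian prior and a Gaussian, unbiased forecast error (a sketch, not the operational BPF), the update is:

```python
def posterior_runoff(prior_mean, prior_var, forecast, forecast_error_var):
    """Posterior N(mu, var) of seasonal runoff: climatological prior
    combined with an unbiased Gaussian forecast (normal-normal update)."""
    w = prior_var / (prior_var + forecast_error_var)  # weight on the forecast
    mu = prior_mean + w * (forecast - prior_mean)
    var = prior_var * forecast_error_var / (prior_var + forecast_error_var)
    return mu, var
```

When the prior and forecast variances are equal the posterior mean splits the difference, and the posterior variance is always smaller than either input variance, which is the calibration-plus-sharpening effect the record attributes to the processor.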

  2. COVAL, Compound Probability Distribution for Function of Probability Distribution

    International Nuclear Information System (INIS)

    Astolfi, M.; Elbaz, J.

    1979-01-01

    1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions
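COVAL's task, the distribution of a function of random variables, can also be approximated by plain Monte Carlo sampling. The following sketch shows the idea on a reliability-style load-versus-strength problem; the function names and all numeric values are illustrative assumptions, not COVAL's numerical-transformation algorithm.

```python
import random

def function_distribution(g, samplers, n=20000, seed=7):
    """Sorted Monte Carlo samples of g(X1, ..., Xk), each Xi drawn by the
    corresponding sampler -- an empirical stand-in for the exact
    transformation of probability distributions."""
    rng = random.Random(seed)
    return sorted(g(*[s(rng) for s in samplers]) for _ in range(n))

def prob_exceeds(samples, threshold):
    """P(g > threshold) estimated from the empirical distribution."""
    return sum(v > threshold for v in samples) / len(samples)

# illustrative structural-reliability use: random load vs. random strength
load = lambda rng: rng.gauss(50.0, 5.0)
strength = lambda rng: rng.gauss(70.0, 5.0)
margin = function_distribution(lambda l, s: l - s, [load, strength])
p_fail = prob_exceeds(margin, 0.0)  # probability that load exceeds strength
```

Here `margin` is the empirical distribution of the load-strength difference, and `p_fail` is the failure probability, the same quantity COVAL obtains by numerical transformation rather than sampling.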

  3. Tropical Cyclone Wind Probability Forecasting (WINDP).

    Science.gov (United States)

    1981-04-01

    The accuracy of small probabilities (below 10%) is limited by the number of significant digits given; they should therefore be regarded as approximate.

  4. Economic assessment of flood forecasts for a risk-averse decision-maker

    Science.gov (United States)

    Matte, Simon; Boucher, Marie-Amélie; Boucher, Vincent; Fortier-Filion, Thomas-Charles

    2017-04-01

    observed values) and in terms of their economic value. This assessment is performed for lead times of one to five days. The three systems are: (1) simple statistically dressed deterministic forecasts, (2) forecasts based on meteorological ensembles and (3) a variant of the latter that also includes an estimation of state-variable uncertainty. The comparison takes place on the Montmorency River, a small flood-prone watershed in south central Quebec, Canada. The results show that forecast quality, as assessed by well-known tools such as the Continuous Ranked Probability Score or the reliability diagram, does not necessarily translate directly into economic value, especially if the decision-maker is not risk-neutral. In addition, results show that the economic value of forecasts for a risk-averse decision-maker is very much influenced by the most extreme members of ensemble forecasts (the upper tail of the predictive distributions). This study provides a new basis for further improvement of our comprehension of the complex interactions between forecast uncertainty, risk aversion and decision-making.
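One of the verification tools named above, the Continuous Ranked Probability Score, has a convenient kernel form for an ensemble forecast: CRPS = E|X - y| - 0.5 E|X - X'|, where X, X' are independent draws from the ensemble and y is the observation. A minimal sketch (O(m^2) in the ensemble size, fine for typical ensembles):

```python
def crps_ensemble(members, obs):
    """CRPS of an ensemble forecast via the kernel form
    E|X - y| - 0.5 * E|X - X'|; lower is better, 0 is perfect."""
    m = len(members)
    t1 = sum(abs(x - obs) for x in members) / m
    t2 = sum(abs(x - y) for x in members for y in members) / (2 * m * m)
    return t1 - t2
```

A degenerate ensemble sitting exactly on the observation scores 0, and spreading the members away from the observation raises the score, which is the sharpness/calibration trade-off the study exploits.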

  5. Access to Risk Mitigating Weather Forecasts and Changes in Farming Operations in East and West Africa: Evidence from a Baseline Survey

    Directory of Open Access Journals (Sweden)

    Abayomi Samuel Oyekale

    2015-10-01

    Unfavorable weather currently ranks among the major challenges facing agricultural development in many African countries. Impact mitigation through access to reliable and timely weather forecasts and other adaptive mechanisms are foremost in Africa’s policy dialogues and socio-economic development agendas. This paper analyzed the factors influencing access to forecasts on incidence of pests/diseases (PD and start of rainfall (SR. The data were collected by Climate Change Agriculture and Food Security (CCAFS and analyzed with Probit regression separately for East Africa, West Africa and the combined dataset. The results show that 62.7% and 56.4% of the farmers from East and West Africa had access to forecasts on start of rainfall, respectively. In addition, 39.3% and 49.4% of the farmers from East Africa indicated that forecasts on outbreak of pests/diseases and start of rainfall were respectively accompanied with advice as against 18.2% and 41.9% for West Africa. Having received forecasts on start of rainfall, 24.0% and 17.6% of the farmers from East and West Africa made decisions on timing of farming activities respectively. Probabilities of having access to forecasts on PD significantly increased with access to formal education, farm income and previous exposure to climatic shocks. Furthermore, probabilities of having access to forecasts on SR significantly increased (p < 0.05 with access to business income, radio and perception of more erratic rainfall, among others. It was recommended that promotion of informal education among illiterate farmers would enhance their climatic resilience, among others.

  6. Objective Lightning Probability Forecasting for Kennedy Space Center and Cape Canaveral Air Force Station, Phase III

    Science.gov (United States)

    Crawford, Winifred C.

    2010-01-01

    The AMU created new logistic regression equations in an effort to increase the skill of the Objective Lightning Forecast Tool developed in Phase II (Lambert 2007). One equation was created for each of five sub-seasons based on the daily lightning climatology instead of by month as was done in Phase II. The assumption was that these equations would capture the physical attributes that contribute to thunderstorm formation more so than monthly equations. However, the SS values in Section 5.3.2 showed that the Phase III equations had worse skill than the Phase II equations and, therefore, will not be transitioned into operations. The current Objective Lightning Forecast Tool developed in Phase II will continue to be used operationally in MIDDS. Three warm seasons were added to the Phase II dataset to increase the POR from 17 to 20 years (1989-2008), and data for October were included since the daily climatology showed lightning occurrence extending into that month. None of the three methods tested to determine the start of the sub-season in each individual year were able to discern the start dates with consistent accuracy. Therefore, the start dates were determined by the daily climatology shown in Figure 10 and were the same in every year. The procedures used to create the predictors and develop the equations were identical to those in Phase II. The equations were made up of one to three predictors. TI and the flow regime probabilities were the top predictors, followed by 1-day persistence, then VT and LI. Each equation outperformed four other forecast methods by 7-57% using the verification dataset, but the new equations were outperformed by the Phase II equations in every sub-season. The reason for the degradation may be that the same sub-season start dates were used in every year. It is likely there was overlap of sub-season days at the beginning and end of each defined sub-season in each individual year, which could very well affect equation

  7. International Aftershock Forecasting: Lessons from the Gorkha Earthquake

    Science.gov (United States)

    Michael, A. J.; Blanpied, M. L.; Brady, S. R.; van der Elst, N.; Hardebeck, J.; Mayberry, G. C.; Page, M. T.; Smoczyk, G. M.; Wein, A. M.

    2015-12-01

    Following the M7.8 Gorkha, Nepal, earthquake of April 25, 2015, the USGS issued a series of aftershock forecasts. The initial impetus for these forecasts was a request from the USAID Office of US Foreign Disaster Assistance to support their Disaster Assistance Response Team (DART), which coordinated US Government disaster response, including search and rescue, with the Government of Nepal. Because of the possible utility of the forecasts to people in the region and other response teams, the USGS released these forecasts publicly through the USGS Earthquake Program web site. The initial forecast used the Reasenberg and Jones (Science, 1989) model with generic parameters developed for active deep continental regions based on the Garcia et al. (BSSA, 2012) tectonic regionalization. These were then updated to reflect a lower productivity and higher decay rate based on the observed aftershocks, although relying on teleseismic observations, with a high magnitude-of-completeness, limited the amount of data. After the 12 May M7.3 aftershock, the forecasts used an Epidemic Type Aftershock Sequence model to better characterize the multiple sources of earthquake clustering. This model provided better estimates of aftershock uncertainty. These forecast messages were crafted based on lessons learned from the Christchurch earthquake along with input from the U.S. Embassy staff in Kathmandu. Challenges included how to balance simple messaging with forecasts over a variety of time periods (week, month, and year), whether to characterize probabilities with words such as those suggested by the IPCC (IPCC, 2010), how to word the messages in a way that would translate accurately into Nepali and not alarm the public, and how to present the probabilities of unlikely but possible large and potentially damaging aftershocks, such as the M7.3 event, which had an estimated probability of only 1-in-200 for the week in which it occurred.

  8. On the validity of cosmological Fisher matrix forecasts

    International Nuclear Information System (INIS)

    Wolz, Laura; Kilbinger, Martin; Weller, Jochen; Giannantonio, Tommaso

    2012-01-01

We present a comparison of Fisher matrix forecasts for cosmological probes with Monte Carlo Markov Chain (MCMC) posterior likelihood estimation methods. We analyse the performance of future Dark Energy Task Force (DETF) stage-III and stage-IV dark-energy surveys using supernovae, baryon acoustic oscillations and weak lensing as probes. We concentrate in particular on the dark-energy equation of state parameters w0 and wa. For purely geometrical probes, and especially when marginalising over wa, we find considerable disagreement between the two methods, since in this case the Fisher matrix cannot reproduce the highly non-elliptical shape of the likelihood function. More quantitatively, the Fisher method underestimates the marginalized errors for purely geometrical probes by 30%-70%. For cases including structure formation, such as weak lensing, we find that the posterior probability contours from the Fisher matrix estimation are in good agreement with the MCMC contours, with the forecasted errors changing only at the 5% level. We then explore non-linear transformations resulting in physically-motivated parameters and investigate whether these parameterisations exhibit Gaussian behaviour. We conclude that for the purely geometrical probes and, more generally, in cases where it is not known whether the likelihood is close to Gaussian, the Fisher matrix is not the appropriate tool to produce reliable forecasts.
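The Fisher forecast itself is mechanically simple: invert the matrix and read marginalized errors off the diagonal of the resulting covariance. A toy two-parameter sketch (the matrix entries below are made up for illustration, not taken from the DETF surveys analysed in the paper):

```python
import numpy as np

# Toy Fisher matrix for (w0, wa); values are illustrative only.
F = np.array([[40.0, -12.0],
              [-12.0,   5.0]])

cov = np.linalg.inv(F)            # Gaussian approximation to the posterior covariance
sigma_w0 = np.sqrt(cov[0, 0])     # marginalized 1-sigma error on w0
sigma_wa = np.sqrt(cov[1, 1])     # marginalized 1-sigma error on wa

# Conditional (wa held fixed) error on w0, for comparison:
sigma_w0_fixed = 1.0 / np.sqrt(F[0, 0])
print(sigma_w0, sigma_wa, sigma_w0_fixed)
```

The marginalized error always exceeds the fixed-parameter error when the parameters are correlated; the paper's point is that even the marginalized Fisher error can badly underestimate the true MCMC error when the likelihood is non-Gaussian.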

  9. PyForecastTools

    Energy Technology Data Exchange (ETDEWEB)

    2017-09-22

The PyForecastTools package provides Python routines for calculating metrics for model validation, forecast verification and model comparison. For continuous predictands the package provides functions for calculating bias (mean error, mean percentage error, median log accuracy, symmetric signed bias), and for calculating accuracy (mean squared error, mean absolute error, mean absolute scaled error, normalized RMSE, median symmetric accuracy). Convenience routines to calculate the component parts (e.g. forecast error, scaled error) of each metric are also provided. To compare models, the package provides a generic skill score and a percent-better measure. Robust measures of scale, including median absolute deviation, robust standard deviation, robust coefficient of variation and the Sn estimator, are all provided by the package. Finally, the package implements Python classes for NxN contingency tables. In the case of a multi-class prediction, accuracy and skill metrics such as proportion correct and the Heidke and Peirce skill scores are provided as object methods. The special case of a 2x2 contingency table inherits from the NxN class and provides many additional metrics for binary classification: probability of detection, probability of false detection, false alarm ratio, threat score, equitable threat score, bias. Confidence intervals for many of these quantities can be calculated using either the Wald method or Agresti-Coull intervals.
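As a sketch of the kind of 2x2 contingency-table metrics listed above (the stand-alone function below is a hypothetical illustration of the standard definitions, not the PyForecastTools class API):

```python
def binary_metrics(hits, false_alarms, misses, correct_negatives):
    """Standard 2x2 contingency-table verification metrics."""
    a, b, c, d = hits, false_alarms, misses, correct_negatives
    n = a + b + c + d
    pod = a / (a + c)          # probability of detection
    pofd = b / (b + d)         # probability of false detection
    far = b / (a + b)          # false alarm ratio
    ts = a / (a + b + c)       # threat score (critical success index)
    pss = pod - pofd           # Peirce skill score
    # Heidke skill score: accuracy relative to the random-chance expectation
    expected = ((a + b) * (a + c) + (c + d) * (b + d)) / n
    hss = ((a + d) - expected) / (n - expected)
    return {"POD": pod, "POFD": pofd, "FAR": far,
            "TS": ts, "PSS": pss, "HSS": hss}

m = binary_metrics(hits=50, false_alarms=20, misses=10, correct_negatives=120)
print(m)
```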

  10. Forecasting effects of global warming on biodiversity

    DEFF Research Database (Denmark)

    Botkin, D.B.; Saxe, H.; Araújo, M.B.

    2007-01-01

    The demand for accurate forecasting of the effects of global warming on biodiversity is growing, but current methods for forecasting have limitations. In this article, we compare and discuss the different uses of four forecasting methods: (1) models that consider species individually, (2) niche...... and theoretical ecological results suggest that many species could be at risk from global warming, during the recent ice ages surprisingly few species became extinct. The potential resolution of this conundrum gives insights into the requirements for more accurate and reliable forecasting. Our eight suggestions...

  11. Drought forecasting in Luanhe River basin involving climatic indices

    Science.gov (United States)

    Ren, Weinan; Wang, Yixuan; Li, Jianzhu; Feng, Ping; Smith, Ronald J.

    2017-11-01

Drought is regarded as one of the most severe natural disasters globally. This is especially the case in Tianjin City, Northern China, where drought can affect economic development and people's livelihoods. Drought forecasting, the basis of drought management, is an important mitigation strategy. In this paper, we develop a probabilistic forecasting model, which forecasts transition probabilities from a current Standardized Precipitation Index (SPI) value to a future SPI class, based on the conditional distribution of a multivariate normal distribution so as to involve two large-scale climatic indices at the same time, and apply the forecasting model to 26 rain gauges in the Luanhe River basin in North China. The establishment of the model and the derivation of the SPI rest on the hypothesis that aggregated monthly precipitation is normally distributed. Pearson correlation and Shapiro-Wilk normality tests are used to select an appropriate SPI time scale and large-scale climatic indices. Findings indicated that longer-term aggregated monthly precipitation was, in general, more likely to be normally distributed, and that forecasting models should be applied to each gauge individually rather than to the whole basin. Taking Liying Gauge as an example, we illustrate the impact of the SPI time scale and lead time on transition probabilities. Then, the controlling climatic indices of every gauge are selected by the Pearson correlation test, and the multivariate normality of the SPI, the corresponding climatic indices for the current month, and the SPI 1, 2, and 3 months later is demonstrated using the Shapiro-Wilk normality test. Subsequently, we illustrate the impact of large-scale oceanic-atmospheric circulation patterns on transition probabilities. Finally, we use a score method to evaluate and compare the performance of the three forecasting models and compare them with two traditional models which forecast transition probabilities from a current to a future SPI class. The results show that the
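The core computation described above, conditioning a jointly normal vector on observed predictors to obtain class probabilities for a future SPI, can be sketched as follows (all numbers are hypothetical illustrations, not fitted values for the Luanhe basin):

```python
import numpy as np
from math import erf, sqrt

def norm_cdf(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def transition_probs(mu, cov, observed, class_edges):
    """Probability that the future SPI falls in each class, conditional on
    observed predictors, for a jointly normal vector whose LAST entry is
    the future SPI and whose other entries are the predictors
    (current SPI and large-scale climatic indices)."""
    mu = np.asarray(mu, float)
    cov = np.asarray(cov, float)
    obs = np.asarray(observed, float)
    k = len(mu) - 1
    s11, s12, s22 = cov[:k, :k], cov[:k, k], cov[k, k]
    w = np.linalg.solve(s11, s12)               # regression weights
    cond_mean = mu[k] + w @ (obs - mu[:k])      # conditional mean
    cond_sd = np.sqrt(s22 - s12 @ w)            # conditional std. dev.
    cdf = [norm_cdf((e - cond_mean) / cond_sd) for e in class_edges]
    return np.diff([0.0] + cdf + [1.0])         # class probabilities

# Hypothetical setup: predictors = (current SPI, one climatic index)
mu = [0.0, 0.0, 0.0]
cov = [[1.0, 0.3, 0.5],
       [0.3, 1.0, 0.2],
       [0.5, 0.2, 1.0]]
edges = [-1.5, -1.0, 1.0, 1.5]   # SPI class boundaries (illustrative)
p = transition_probs(mu, cov, observed=[-1.2, 0.8], class_edges=edges)
print(p, p.sum())
```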

  12. Ensemble Forecasts with Useful Skill-Spread Relationships for African meningitis and Asia Streamflow Forecasting

    Science.gov (United States)

    Hopson, T. M.

    2014-12-01

One potential benefit of an ensemble prediction system (EPS) is its capacity to forecast its own forecast error through the ensemble spread-error relationship. In practice, an EPS is often quite limited in its ability to represent the variable expectation of forecast error through the variable dispersion of the ensemble, and perhaps more fundamentally, in its ability to provide enough variability in the ensemble's dispersion to make the skill-spread relationship even potentially useful (irrespective of whether the EPS is well-calibrated or not). In this paper we examine the ensemble skill-spread relationship of an ensemble constructed from the TIGGE (THORPEX Interactive Grand Global Ensemble) dataset of global forecasts and a combination of multi-model and post-processing approaches. Both the multi-model and post-processing techniques are based on quantile regression (QR) under a step-wise forward selection framework, leading to ensemble forecasts with both good reliability and sharpness. The methodology utilizes the ensemble's ability to self-diagnose forecast instability to produce calibrated forecasts with informative skill-spread relationships. A context for these concepts is provided by assessing the constructed ensemble in forecasting district-level humidity impacting the incidence of meningitis in the meningitis belt of Africa, and in forecasting flooding events in the Brahmaputra and Ganges basins of South Asia.
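The spread-error relationship referred to above can be illustrated with synthetic ensembles whose dispersion genuinely varies from case to case; when it does, spread correlates with the magnitude of the ensemble-mean error. This is a toy sketch of the diagnostic, not the TIGGE/quantile-regression methodology itself:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
# Synthetic cases with genuinely varying forecast uncertainty:
true_sd = rng.uniform(0.5, 3.0, n)                    # per-case uncertainty
obs = rng.normal(0.0, true_sd)                        # verifying observations
members = rng.normal(0.0, true_sd[:, None], (n, 20))  # 20-member ensemble

spread = members.std(axis=1)                  # ensemble spread per case
error = np.abs(obs - members.mean(axis=1))    # abs. error of ensemble mean

# A useful skill-spread relationship appears as positive correlation
# between the spread and the magnitude of the error:
print(np.corrcoef(spread, error)[0, 1])
```

If `true_sd` were constant across cases, the correlation would collapse toward zero, which is the "not enough variability in the dispersion" failure mode the abstract describes.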

  13. Wind power forecast

    Energy Technology Data Exchange (ETDEWEB)

    Pestana, Rui [Rede Electrica Nacional (REN), S.A., Lisboa (Portugal). Dept. Systems and Development System Operator; Trancoso, Ana Rosa; Delgado Domingos, Jose [Univ. Tecnica de Lisboa (Portugal). Seccao de Ambiente e Energia

    2012-07-01

Accurate wind power forecasts are needed to reduce the integration costs that wind's inherent variability imposes on the electric grid. Portugal currently has a significant wind power penetration level and consequently needs reliable wind power forecasts at different temporal scales, including for localized events such as ramps. This paper provides an overview of the methodologies used by REN to forecast wind power at the national level, based on statistical and probabilistic combinations of NWP and measured data with the aim of improving on the accuracy of pure NWP. Results show that significant improvement can be achieved with statistical combination with persistence in the short term and with probabilistic combination in the medium term. NWP is also able to detect ramp events with 3 days' notice for operational planning. (orig.)

  14. An independent system operator's perspective on operational ramp forecasting

    Energy Technology Data Exchange (ETDEWEB)

    Porter, G. [New Brunswick System Operator, Fredericton, NB (Canada)

    2010-07-01

One of the principal roles of the power system operator is to select the most economical resources to reliably supply electric system power needs. Operational wind power production forecasts are required by system operators in order to understand the impact of ramp event forecasting on dispatch functions. A centralized dispatch approach can contribute to a more efficient use of resources than traditional economic dispatch methods. Wind ramping events can have a significant impact on system reliability. Power systems can have constrained or robust transmission systems, and may also be islanded or have large connections to neighbouring systems. Power resources can include both flexible and inflexible generation resources. Wind integration tools must be used by system operators to improve communications and connections with wind power plants. Improved wind forecasting techniques are also needed. Sensitivity to forecast errors is dependent on current system conditions. System operators require basic production forecasts, probabilistic forecasts, and event forecasts. Forecasting errors were presented, as well as charts outlining the implications of various forecasts. tabs., figs.

  15. Earthquake focal mechanism forecasting in Italy for PSHA purposes

    Science.gov (United States)

    Roselli, Pamela; Marzocchi, Warner; Mariucci, Maria Teresa; Montone, Paola

    2018-01-01

In this paper, we put forward a procedure that aims to forecast the focal mechanisms of future earthquakes. One of the primary uses of such forecasts is in probabilistic seismic hazard analysis (PSHA); in fact, aiming at reducing the epistemic uncertainty, most of the newer ground motion prediction equations consider, besides the seismicity rates, the forecast of the focal mechanism of the next large earthquakes as input data. The data set used for this purpose consists of focal mechanisms taken from the latest stress map release for Italy, containing 392 well-constrained solutions of events from 1908 to 2015 with Mw ≥ 4 and depths from 0 down to 40 km. The data set considers polarity focal mechanism solutions up to 1975 (23 events), whereas for 1976-2015 it takes into account only Centroid Moment Tensor (CMT)-like earthquake focal solutions, for data homogeneity. The forecasting model is rooted in the Total Weighted Moment Tensor concept, which weighs information from past focal mechanisms evenly distributed in space, according to their distance from the spatial cells and their magnitude. Specifically, for each cell of a regular 0.1° × 0.1° spatial grid, the model estimates the probability of observing a normal, reverse, or strike-slip fault plane solution for the next large earthquakes, the expected moment tensor and the related maximum horizontal stress orientation. These results will be available for the new PSHA model for Italy under development. Finally, to evaluate the reliability of the forecasts, we test them against an independent data set consisting of some of the strongest earthquakes, with Mw ≥ 3.9, that occurred during 2016 in different Italian tectonic provinces.

  16. Robust Forecasting of Non-Stationary Time Series

    OpenAIRE

    Croux, C.; Fried, R.; Gijbels, I.; Mahieu, K.

    2010-01-01

    This paper proposes a robust forecasting method for non-stationary time series. The time series is modelled using non-parametric heteroscedastic regression, and fitted by a localized MM-estimator, combining high robustness and large efficiency. The proposed method is shown to produce reliable forecasts in the presence of outliers, non-linearity, and heteroscedasticity. In the absence of outliers, the forecasts are only slightly less precise than those based on a localized Least Squares estima...

  17. Statistical eruption forecast for the Chilean Southern Volcanic Zone: typical probabilities of volcanic eruptions as baseline for possibly enhanced activity following the large 2010 Concepción earthquake

    Directory of Open Access Journals (Sweden)

    Y. Dzierma

    2010-10-01

Full Text Available A probabilistic eruption forecast is provided for ten volcanoes of the Chilean Southern Volcanic Zone (SVZ. Since 70% of the Chilean population lives in this area, the estimation of future eruption likelihood is an important part of hazard assessment. After investigating the completeness and stationarity of the historical eruption time series, the exponential, Weibull, and log-logistic distribution functions are fit to the repose time distributions for the individual volcanoes and the models are evaluated. This procedure has been implemented in two different ways to methodologically compare details in the fitting process. With regard to the probability of at least one VEI ≥ 2 eruption in the next decade, Llaima, Villarrica and Nevados de Chillán are most likely to erupt, while Osorno shows the lowest eruption probability among the volcanoes analysed. In addition to giving a compilation of the statistical eruption forecasts for the historically most active volcanoes of the SVZ, this paper aims to give "typical" eruption probabilities, which may in the future permit the identification of possibly enhanced activity in the aftermath of the large 2010 Concepción earthquake.
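For the simplest of the fitted repose-time models, the exponential (memoryless) distribution, the probability of at least one eruption within a horizon follows directly from the mean repose time. A minimal sketch with hypothetical repose times, not the fitted SVZ values:

```python
import math

def eruption_probability(mean_repose_years, horizon_years):
    """Probability of at least one eruption within the horizon, for an
    exponential (Poisson) repose-time model: P = 1 - exp(-T / tau)."""
    return 1.0 - math.exp(-horizon_years / mean_repose_years)

# Hypothetical mean repose times (years) for illustration:
for name, tau in [("frequently active volcano", 6.0),
                  ("quiescent volcano", 25.0)]:
    print(name, round(eruption_probability(tau, 10.0), 3))
```

The Weibull and log-logistic fits used in the paper generalize this by letting the hazard rate rise or fall with time since the last eruption, rather than staying constant.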

  18. An Intelligent Decision Support System for Workforce Forecast

    Science.gov (United States)

    2011-01-01

growth. Brown (1999) developed a model to forecast dental workforce size and mix (by sex) for the first twenty years of the twenty-first century in...forecasted competencies required to deliver needed dental services. A labor market signaling approach based workforce forecasting model was presented...techniques, viz. algebra, calculus or probability theory (Law and Kelton, 1991). Simulation processes, same as conducting experiments on computers, deals

  19. Magnetogram Forecast: An All-Clear Space Weather Forecasting System

    Science.gov (United States)

    Barghouty, Nasser; Falconer, David

    2015-01-01

Solar flares and coronal mass ejections (CMEs) are the drivers of severe space weather. Forecasting the probability of their occurrence is critical in improving space weather forecasts. The National Oceanic and Atmospheric Administration (NOAA) currently uses the McIntosh active region category system, in which each active region on the disk is assigned to one of 60 categories, and uses the historical flare rates of that category to make an initial forecast that can then be adjusted by the NOAA forecaster. Flares and CMEs are caused by the sudden release of energy from the coronal magnetic field by magnetic reconnection. It is believed that the rate of flare and CME occurrence in an active region is correlated with the free energy of the active region. While the free energy cannot be measured directly with present observations, proxies of the free energy can instead be used to characterize the relative free energy of an active region. The Magnetogram Forecast (MAG4) (output is available at the Community Coordinated Modeling Center) was conceived and designed to be a databased, all-clear forecasting system to support the operational goals of NASA's Space Radiation Analysis Group. The MAG4 system automatically downloads near-real-time line-of-sight Helioseismic and Magnetic Imager (HMI) magnetograms from the Solar Dynamics Observatory (SDO) satellite, identifies active regions on the solar disk, measures a free-energy proxy, and then applies forecasting curves to convert the free-energy proxy into predicted event rates for X-class flares, M- and X-class flares, CMEs, fast CMEs, and solar energetic particle events (SPEs). The forecast curves themselves are derived from a sample of 40,000 magnetograms from 1,300 active region samples, observed by the Solar and Heliospheric Observatory Michelson Doppler Imager. Figure 1 is an example of MAG4 visual output.

  20. Non-parametric data-based approach for the quantification and communication of uncertainties in river flood forecasts

    Science.gov (United States)

    Van Steenbergen, N.; Willems, P.

    2012-04-01

    Reliable flood forecasts are the most important non-structural measures to reduce the impact of floods. However flood forecasting systems are subject to uncertainty originating from the input data, model structure and model parameters of the different hydraulic and hydrological submodels. To quantify this uncertainty a non-parametric data-based approach has been developed. This approach analyses the historical forecast residuals (differences between the predictions and the observations at river gauging stations) without using a predefined statistical error distribution. Because the residuals are correlated with the value of the forecasted water level and the lead time, the residuals are split up into discrete classes of simulated water levels and lead times. For each class, percentile values are calculated of the model residuals and stored in a 'three dimensional error' matrix. By 3D interpolation in this error matrix, the uncertainty in new forecasted water levels can be quantified. In addition to the quantification of the uncertainty, the communication of this uncertainty is equally important. The communication has to be done in a consistent way, reducing the chance of misinterpretation. Also, the communication needs to be adapted to the audience; the majority of the larger public is not interested in in-depth information on the uncertainty on the predicted water levels, but only is interested in information on the likelihood of exceedance of certain alarm levels. Water managers need more information, e.g. time dependent uncertainty information, because they rely on this information to undertake the appropriate flood mitigation action. There are various ways in presenting uncertainty information (numerical, linguistic, graphical, time (in)dependent, etc.) each with their advantages and disadvantages for a specific audience. A useful method to communicate uncertainty of flood forecasts is by probabilistic flood mapping. These maps give a representation of the
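The percentile error matrix described above can be sketched as follows, here on synthetic residuals and with hypothetical class edges (a real system would use archived forecasts and gauging-station observations, and would add the 3D interpolation step for new forecasts):

```python
import numpy as np

def build_error_matrix(levels, leads, residuals, level_edges, lead_edges,
                       percentiles=(5, 25, 50, 75, 95)):
    """Percentiles of historical forecast residuals in each
    (water-level class, lead-time class) cell."""
    nL, nT = len(level_edges) - 1, len(lead_edges) - 1
    mat = np.full((nL, nT, len(percentiles)), np.nan)
    li = np.digitize(levels, level_edges) - 1   # water-level class index
    ti = np.digitize(leads, lead_edges) - 1     # lead-time class index
    for i in range(nL):
        for j in range(nT):
            r = residuals[(li == i) & (ti == j)]
            if r.size:
                mat[i, j] = np.percentile(r, percentiles)
    return mat

# Synthetic illustration: residual scatter grows with level and lead time.
rng = np.random.default_rng(0)
levels = rng.uniform(0, 4, 5000)        # forecasted water level [m]
leads = rng.uniform(0, 48, 5000)        # lead time [h]
residuals = rng.normal(0, 0.1 + 0.05 * levels + 0.002 * leads)

mat = build_error_matrix(levels, leads, residuals,
                         level_edges=[0, 1, 2, 3, 4],
                         lead_edges=[0, 12, 24, 36, 48])
print(mat[3, 3])   # residual percentiles: high levels, long lead times
```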

  1. The Art and Science of Long-Range Space Weather Forecasting

    Science.gov (United States)

    Hathaway, David H.; Wilson, Robert M.

    2006-01-01

Long-range space weather forecasts are akin to seasonal forecasts of terrestrial weather. We don't expect to forecast individual events, but we do hope to forecast the underlying level of activity important for satellite operations and mission planning. Forecasting space weather conditions years or decades into the future has traditionally been based on empirical models of the solar cycle. Models for the shape of the cycle as a function of its amplitude become reliable once the amplitude is well determined, usually two to three years after minimum. Forecasting the amplitude of a cycle well before that time has been more of an art than a science, usually based on cycle statistics and trends. Recent developments in dynamo theory, the theory explaining the generation of the Sun's magnetic field and the solar activity cycle, have now produced models with predictive capabilities. Testing these models with historical sunspot cycle data indicates that these predictions may be highly reliable one, or even two, cycles into the future.

  2. The Value of Seasonal Climate Forecasts in Managing Energy Resources.

    Science.gov (United States)

    Brown Weiss, Edith

    1982-04-01

Research and interviews with officials of the United States energy industry and a systems analysis of decision making in a natural gas utility lead to the conclusion that seasonal climate forecasts would only have limited value in fine-tuning the management of energy supply, even if the forecasts were more reliable and detailed than at present. On the other hand, reliable forecasts could be useful to state and local governments both as a signal to adopt long-term measures to increase the efficiency of energy use and to initiate short-term measures to reduce energy demand in anticipation of a weather-induced energy crisis. To be useful for these purposes, state governments would need better data on energy demand patterns and available energy supplies, staff competent to interpret climate forecasts, and greater incentive to conserve. The use of seasonal climate forecasts is not likely to be constrained by fear of legal action by those claiming to be injured by a possible incorrect forecast.

  3. Exploiting Domain Knowledge to Forecast Heating Oil Consumption

    Science.gov (United States)

    Corliss, George F.; Sakauchi, Tsuginosuke; Vitullo, Steven R.; Brown, Ronald H.

    2011-11-01

The GasDay laboratory at Marquette University provides forecasts of energy consumption. One such service is the Heating Oil Forecaster, a service for heating oil or propane delivery companies. Accurate forecasts can help reduce the number of trucks and drivers while providing efficient inventory management by stretching the time between deliveries. Accurate forecasts also help retain valuable customers: if a customer runs out of fuel, the delivery service incurs costs for an emergency delivery and often a service call, and the customer probably changes providers. The basic modeling is simple: fit delivery amounts sk to cumulative Heating Degree Days (HDDk = Σ max(0, 60 °F - daily average temperature)), with a wind adjustment, for each delivery period: sk ≈ ŝk = β0 + β1·HDDk. For the first few deliveries there is not enough data to provide a reliable estimate K = 1/β1, so we use Bayesian techniques with priors constructed from historical data. A fresh model is trained for each customer with each delivery, producing daily consumption forecasts using actual and forecast weather until the next delivery. In practice, a delivery may not fill the oil tank if the delivery truck runs out of oil or the automatic shut-off activates prematurely. Special outlier detection and recovery based on domain knowledge addresses this and other special cases. The error at each delivery is the difference between that delivery and the aggregate of daily forecasts using actual weather since the preceding delivery. Out-of-sample testing yields MAPE = 21.2% and an average error of 6.0% of tank capacity for Company A. The MAPE and average error as a percentage of tank capacity for Company B are 31.5% and 6.6%, respectively. One heating oil delivery company that uses this forecasting service [1] reported that instances of a customer running out of oil fell from about 250 in 50,000 deliveries per year before contracting for our service to about 10 with our service. They delivered slightly more
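The basic regression described above can be sketched in a few lines (the delivery data below are synthetic; the production system adds the wind adjustment, Bayesian priors for early deliveries, and outlier handling):

```python
import numpy as np

# Delivery amounts (gallons) and cumulative heating degree days between
# deliveries -- synthetic numbers for illustration only.
hdd = np.array([310.0, 450.0, 520.0, 600.0, 710.0])
deliveries = np.array([150.0, 205.0, 240.0, 270.0, 330.0])

# Fit s_k ~= beta0 + beta1 * HDD_k by least squares.
A = np.column_stack([np.ones_like(hdd), hdd])
beta0, beta1 = np.linalg.lstsq(A, deliveries, rcond=None)[0]

# K = 1/beta1 acts as the customer's HDD-per-gallon efficiency factor.
K = 1.0 / beta1

# Forecast consumption for a hypothetical period accumulating 400 HDD:
print(beta0 + beta1 * 400.0, K)
```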

  4. Financial forecasts accuracy in Brazil's social security system.

    Directory of Open Access Journals (Sweden)

    Carlos Patrick Alves da Silva

    Full Text Available Long-term social security statistical forecasts produced and disseminated by the Brazilian government aim to provide accurate results that would serve as background information for optimal policy decisions. These forecasts are being used as support for the government's proposed pension reform that plans to radically change the Brazilian Constitution insofar as Social Security is concerned. However, the reliability of official results is uncertain since no systematic evaluation of these forecasts has ever been published by the Brazilian government or anyone else. This paper aims to present a study of the accuracy and methodology of the instruments used by the Brazilian government to carry out long-term actuarial forecasts. We base our research on an empirical and probabilistic analysis of the official models. Our empirical analysis shows that the long-term Social Security forecasts are systematically biased in the short term and have significant errors that render them meaningless in the long run. Moreover, the low level of transparency in the methods impaired the replication of results published by the Brazilian Government and the use of outdated data compromises forecast results. In the theoretical analysis, based on a mathematical modeling approach, we discuss the complexity and limitations of the macroeconomic forecast through the computation of confidence intervals. We demonstrate the problems related to error measurement inherent to any forecasting process. We then extend this exercise to the computation of confidence intervals for Social Security forecasts. This mathematical exercise raises questions about the degree of reliability of the Social Security forecasts.

  5. Financial forecasts accuracy in Brazil's social security system.

    Science.gov (United States)

    Silva, Carlos Patrick Alves da; Puty, Claudio Alberto Castelo Branco; Silva, Marcelino Silva da; Carvalho, Solon Venâncio de; Francês, Carlos Renato Lisboa

    2017-01-01

    Long-term social security statistical forecasts produced and disseminated by the Brazilian government aim to provide accurate results that would serve as background information for optimal policy decisions. These forecasts are being used as support for the government's proposed pension reform that plans to radically change the Brazilian Constitution insofar as Social Security is concerned. However, the reliability of official results is uncertain since no systematic evaluation of these forecasts has ever been published by the Brazilian government or anyone else. This paper aims to present a study of the accuracy and methodology of the instruments used by the Brazilian government to carry out long-term actuarial forecasts. We base our research on an empirical and probabilistic analysis of the official models. Our empirical analysis shows that the long-term Social Security forecasts are systematically biased in the short term and have significant errors that render them meaningless in the long run. Moreover, the low level of transparency in the methods impaired the replication of results published by the Brazilian Government and the use of outdated data compromises forecast results. In the theoretical analysis, based on a mathematical modeling approach, we discuss the complexity and limitations of the macroeconomic forecast through the computation of confidence intervals. We demonstrate the problems related to error measurement inherent to any forecasting process. We then extend this exercise to the computation of confidence intervals for Social Security forecasts. This mathematical exercise raises questions about the degree of reliability of the Social Security forecasts.

  6. Device for forecasting reactor power-up routes

    International Nuclear Information System (INIS)

    Fukuzaki, Takaharu.

    1980-01-01

Purpose: To improve the reliability and forecasting accuracy of a device that forecasts state changes on-line in BWR type reactors. Constitution: The present state of the nuclear reactor is estimated in a present-state judging section based on measurement signals for thermal power, core flow rate, control rod density and the like from the reactor, and the estimated results are accumulated in an operation result collecting section. Meanwhile, a forecasting section forecasts the future state of the reactor based on signals from the forecasting condition setting section. The actual result values from the collecting section and the forecast results are compared with each other. If they are not equal, new setting signals are output from the setting section and the forecast is performed again. These procedures are repeated until the difference between the forecast results and the actual result values is minimized, by which accurate forecasting of the reactor state is made possible. (Furukawa, Y.)

  7. Propagation of Uncertainty in Bayesian Kernel Models - Application to Multiple-Step Ahead Forecasting

    DEFF Research Database (Denmark)

    Quinonero, Joaquin; Girard, Agathe; Larsen, Jan

    2003-01-01

The object of Bayesian modelling is the predictive distribution, which, in a forecasting scenario, enables evaluation of forecasted values and their uncertainties. We focus on reliably estimating the predictive mean and variance of forecasted values using Bayesian kernel based models such as the Gaussian process and the relevance vector machine. We derive novel analytic expressions for the predictive mean and variance for Gaussian kernel shapes under the assumption of a Gaussian input distribution in the static case, and of a recursive Gaussian predictive density in iterative forecasting...

  8. DECOFF Probabilities of Failed Operations

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    2015-01-01

    A statistical procedure of estimation of Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. Also safety factor influence is investigated. DECOFF statistical method is benchmarked against standard Alpha-factor...

  9. An enhanced radial basis function network for short-term electricity price forecasting

    International Nuclear Information System (INIS)

    Lin, Whei-Min; Gow, Hong-Jey; Tsai, Ming-Tang

    2010-01-01

This paper proposes a price forecasting system for electric market participants to reduce the risk of price volatility. Combining the Radial Basis Function Network (RBFN) and Orthogonal Experimental Design (OED), an Enhanced Radial Basis Function Network (ERBFN) is proposed for the solving process. The Locational Marginal Price (LMP), system load, transmission flow and temperature of the PJM system were collected, and the data clusters were embedded in an Excel database according to year, season, workday and weekend. With the OED applied to the learning rates in the ERBFN, the forecasting error can be reduced during the training process, improving both accuracy and reliability. This means that even 'spikes' can be tracked closely. The Back-propagation Neural Network (BPN), Probability Neural Network (PNN), other algorithms, and the proposed ERBFN were all developed and compared to check their performance. Simulation results demonstrated the effectiveness of the proposed ERBFN in providing quality information in a price-volatile environment. (author)

  10. Reply to "Comment on 'Nonparametric forecasting of low-dimensional dynamical systems' ".

    Science.gov (United States)

    Berry, Tyrus; Giannakis, Dimitrios; Harlim, John

    2016-03-01

    In this Reply we provide additional results which allow a better comparison of the diffusion forecast and the "past-noise" forecasting (PNF) approach for the El Niño index. We remark on some qualitative differences between the diffusion forecast and PNF, and we suggest an alternative use of the diffusion forecast for the purposes of forecasting the probabilities of extreme events.

  11. Concerning the justiciability of demand forecasts

    International Nuclear Information System (INIS)

    Nierhaus, M.

    1977-01-01

    At present this subject plays a particular role in judicial examinations of immediately enforceable orders for the partial construction licences of nuclear power plants. The author distinguishes between three kinds of forecast decisions: 1. Appraising forecast decisions, with standards of judgment taken mainly from the fields of art, culture, morality and religion, are, according to the author, legally verifiable only to a limited extent. 2. The same applies in principle to forecast decisions that cannot be argued, e.g. those concerning the future behaviour of persons. 3. In contrast, the following applies to programmatic, proceedings-like, or creative forecast decisions, in particular in economics: 'An administrative estimation privilege in a prognostic sense, with the consequence that the court has to accept a forecast decision which lies within the forecast margins and cannot be disproved, and that the court may not replace this forecast decision by its own probability judgment. In these cases, the administration has the right to create its own forecast standards.' Judicial control in these cases is limited to certain substantive and procedural mistakes made by the administration in the course of reaching the forecast decision. (orig./HP) [de]

  12. Concerning the justiciability of demand forecasts

    Energy Technology Data Exchange (ETDEWEB)

    Nierhaus, M [Koeln Univ. (Germany, F.R.)

    1977-01-01

    At present this subject plays a particular role in judicial examinations of immediately enforceable orders for the partial construction licences of nuclear power plants. The author distinguishes between three kinds of forecast decisions: 1. Appraising forecast decisions, with standards of judgment taken mainly from the fields of art, culture, morality and religion, are, according to the author, legally verifiable only to a limited extent. 2. The same applies in principle to forecast decisions that cannot be argued, e.g. those concerning the future behaviour of persons. 3. In contrast, the following applies to programmatic, proceedings-like, or creative forecast decisions, in particular in economics: 'An administrative estimation privilege in a prognostic sense, with the consequence that the court has to accept a forecast decision which lies within the forecast margins and cannot be disproved, and that the court may not replace this forecast decision by its own probability judgment. In these cases, the administration has the right to create its own forecast standards.' Judicial control in these cases is limited to certain substantive and procedural mistakes made by the administration in the course of reaching the forecast decision.

  13. Verification of Space Weather Forecasts using Terrestrial Weather Approaches

    Science.gov (United States)

    Henley, E.; Murray, S.; Pope, E.; Stephenson, D.; Sharpe, M.; Bingham, S.; Jackson, D.

    2015-12-01

    The Met Office Space Weather Operations Centre (MOSWOC) provides a range of 24/7 operational space weather forecasts, alerts, and warnings, which provide valuable information on space weather that can degrade electricity grids, radio communications, and satellite electronics. Forecasts issued include arrival times of coronal mass ejections (CMEs), and probabilistic forecasts for flares, geomagnetic storm indices, and energetic particle fluxes and fluences. These forecasts are produced twice daily using a combination of output from models such as Enlil, near-real-time observations, and forecaster experience. Verification of forecasts is crucial for users, researchers, and forecasters to understand the strengths and limitations of forecasts, and to assess the added value that forecasters provide. To this end, the Met Office (in collaboration with Exeter University) has been adapting verification techniques from terrestrial weather, and has been working closely with the International Space Environment Service (ISES) to standardise verification procedures. We will present the results of part of this work, analysing forecast and observed CME arrival times and assessing skill using 2x2 contingency tables. These MOSWOC forecasts can be objectively compared to those produced by the NASA Community Coordinated Modelling Center - a useful benchmark. This approach cannot be taken for the other forecasts, as they are probabilistic and categorical (e.g., geomagnetic storm forecasts give probabilities of exceeding levels from minor to extreme). We will present appropriate verification techniques being developed to address these forecasts, such as the ranked probability skill score, and comparison of forecasts against climatology and persistence benchmarks. As part of this, we will outline the use of discrete-time Markov chains to assess and improve the performance of our geomagnetic storm forecasts.
We will also discuss work to adapt a terrestrial verification visualisation system to space weather, to help
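
    The 2x2 contingency-table verification mentioned above uses standard scores. A small sketch, with purely illustrative counts rather than MOSWOC data:

```python
# hits, false alarms, misses, correct negatives from a hypothetical
# set of yes/no CME-arrival forecasts (numbers are illustrative)
a, b, c, d = 18, 7, 5, 70

n = a + b + c + d
pod = a / (a + c)        # probability of detection
far = b / (a + b)        # false alarm ratio
csi = a / (a + b + c)    # critical success index (threat score)

# expected correct forecasts by chance, for the Heidke skill score
e = ((a + b) * (a + c) + (c + d) * (b + d)) / n
hss = (a + d - e) / (n - e)
```

    A skill score such as HSS is preferred over raw accuracy because it discounts forecasts that would have been correct by chance.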

  14. A system approach to the long-term forecasting of climate data in the Baikal region

    Science.gov (United States)

    Abasov, N.; Berezhnykh, T.

    2003-04-01

    The Angara River, running from Lake Baikal with a cascade of hydropower plants built on it, plays a peculiar role in the economy of the region. In view of the high variability of water inflow into the rivers and lakes (long-term low-water periods and catastrophic floods), which is due to climatic peculiarities of water resource formation, long-term forecasting is developed and applied to decrease risk at hydropower plants. The methodology and methods of long-term forecasting of natural-climatic processes employ some ideas of the research schools of Academician I.P. Druzhinin and Prof. A.P. Reznikhov, and consist in detailed investigation of cause-effect relations, finding physical analogs and applying them to formalized methods of long-term forecasting. They are divided into qualitative methods (the background method; the method of analogs based on solar activity), probabilistic methods and approximative methods (analog-similarity relations; a discrete-continuous model). These forecasting methods have been implemented as the analytical tools of the information-forecasting software "GIPSAR", which provides some elements of artificial intelligence. Background forecasts of the runoff of the Ob, the Yenisei and the Angara Rivers in the south of Siberia are based on space-time regularities revealed by taking account of the phase shifts in the occurrence of secular maxima and minima on integral-difference curves of many-year hydrological processes in the objects compared. Solar activity plays an essential role in investigations of global variations of climatic processes. Its consideration in the method of superimposed epochs has led to the conclusion that a low-water period in the actual inflow to Lake Baikal is more probable on the increasing branch of the 11-year solar activity cycle. A high-water period is more probable on the decreasing branch, from the 2nd to the 5th year after the maximum. 
Probabilistic method

  15. Ensemble Flow Forecasts for Risk Based Reservoir Operations of Lake Mendocino in Mendocino County, California: A Framework for Objectively Leveraging Weather and Climate Forecasts in a Decision Support Environment

    Science.gov (United States)

    Delaney, C.; Hartman, R. K.; Mendoza, J.; Whitin, B.

    2017-12-01

    Forecast informed reservoir operations (FIRO) is a methodology that incorporates short- to mid-range precipitation and flow forecasts to inform the flood operations of reservoirs. The Ensemble Forecast Operations (EFO) alternative is a probabilistic approach to FIRO that incorporates ensemble streamflow predictions (ESPs) made by NOAA's California-Nevada River Forecast Center (CNRFC). With the EFO approach, release decisions are made to manage the forecasted risk of reaching critical operational thresholds. A water management model was developed for Lake Mendocino, a 111,000 acre-foot reservoir located near Ukiah, California, to evaluate whether the EFO alternative can improve water supply reliability without increasing downstream flood risk. Lake Mendocino is a dual-use reservoir, owned and operated for flood control by the United States Army Corps of Engineers and operated for water supply by the Sonoma County Water Agency. Due to recent changes in the operations of an upstream hydroelectric facility, this reservoir has suffered from water supply reliability issues since 2007. The EFO alternative was simulated using a 26-year (1985-2010) ESP hindcast generated by the CNRFC. The ESP hindcast was developed using Global Ensemble Forecast System version 10 precipitation reforecasts processed with the Hydrologic Ensemble Forecast System to generate daily reforecasts of 61 flow ensemble members for a 15-day forecast horizon. Model simulation results demonstrate that the EFO alternative may improve water supply reliability for Lake Mendocino without increasing flood risk for downstream areas. The developed operations framework can directly leverage improved skill in the second week of the forecast and is extendable into the S2S time domain, given the demonstration of improved skill through a reliable reforecast of adequate historical duration that is consistent with operationally available numerical weather predictions.
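
    The core of the EFO decision logic, choosing a release that keeps the forecasted probability of exceeding an operational threshold within a tolerance, can be sketched as follows (the numbers and the random ensemble are illustrative assumptions, not CNRFC hindcast data):

```python
import numpy as np

rng = np.random.default_rng(0)

flood_pool = 111_000       # acre-feet, storage not to be exceeded
risk_tolerance = 0.10      # max acceptable exceedance probability
storage_now = 95_000

# hypothetical 61-member ensemble of cumulative 15-day inflow volumes (acre-feet)
inflow = rng.lognormal(mean=9.2, sigma=0.8, size=61)

def release_needed(storage, inflows, pool, tol):
    # smallest release (searched on a 500 acre-foot grid) for which the
    # forecast probability of exceeding the flood pool stays within tolerance
    for release in range(0, 50_000, 500):
        exceed = float(np.mean(storage + inflows - release > pool))
        if exceed <= tol:
            return release, exceed
    return None, None

release, risk = release_needed(storage_now, inflow, flood_pool, risk_tolerance)
```

    Each ensemble member is treated as an equally likely future, so the exceedance probability is simply the fraction of members crossing the threshold.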

  16. A robust method to forecast volcanic ash clouds

    Science.gov (United States)

    Denlinger, Roger P.; Pavolonis, Mike; Sieglaff, Justin

    2012-01-01

    Ash clouds emanating from volcanic eruption columns often form trails of ash extending thousands of kilometers through the Earth's atmosphere, disrupting air traffic and posing a significant hazard to air travel. To mitigate such hazards, the community charged with reducing flight risk must accurately assess risk of ash ingestion for any flight path and provide robust forecasts of volcanic ash dispersal. In response to this need, a number of different transport models have been developed for this purpose and applied to recent eruptions, providing a means to assess uncertainty in forecasts. Here we provide a framework for optimal forecasts and their uncertainties given any model and any observational data. This involves random sampling of the probability distributions of input (source) parameters to a transport model and iteratively running the model with different inputs, each time assessing the predictions that the model makes about ash dispersal by direct comparison with satellite data. The results of these comparisons are embodied in a likelihood function whose maximum corresponds to the minimum misfit between model output and observations. Bayes theorem is then used to determine a normalized posterior probability distribution and from that a forecast of future uncertainty in ash dispersal. The nature of ash clouds in heterogeneous wind fields creates a strong maximum likelihood estimate in which most of the probability is localized to narrow ranges of model source parameters. This property is used here to accelerate probability assessment, producing a method to rapidly generate a prediction of future ash concentrations and their distribution based upon assimilation of satellite data as well as model and data uncertainties. Applying this method to the recent eruption of Eyjafjallajökull in Iceland, we show that the 3 and 6 h forecasts of ash cloud location probability encompassed the location of observed satellite-determined ash cloud loads, providing an
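
    The probabilistic machinery described above (sample source parameters, score each sample against observations with a likelihood, normalize via Bayes' theorem) can be sketched with a toy one-parameter stand-in for the transport model; the real models and satellite data are far richer:

```python
import numpy as np

rng = np.random.default_rng(1)

# toy stand-in for the transport model: ash load as a function of a
# single source parameter (e.g. eruption rate); real models are 3-D
def model(source_rate, x):
    return source_rate * np.exp(-x)

x_obs = np.linspace(0, 3, 20)
truth = model(2.0, x_obs)
obs = truth + rng.normal(0, 0.05, size=x_obs.size)   # synthetic "satellite" data

# random sampling of a flat prior on the source parameter
samples = rng.uniform(0.5, 4.0, size=2000)
sigma = 0.05

# Gaussian likelihood of each sampled parameter given the observations
misfit = np.array([np.sum((model(s, x_obs) - obs) ** 2) for s in samples])
loglik = -misfit / (2 * sigma ** 2)
w = np.exp(loglik - loglik.max())
posterior = w / w.sum()          # normalized posterior weights (Bayes' theorem)

post_mean = float(np.sum(posterior * samples))
```

    The maximum of the likelihood corresponds to the minimum misfit between model output and observations, exactly as in the framework described in the abstract; forecasts would then be issued by propagating the weighted samples forward.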

  17. The effort to increase the space weather forecasting accuracy in KSWC

    Science.gov (United States)

    Choi, J. S.

    2017-12-01

    The Korean Space Weather Center (KSWC) of the National Radio Research Agency (RRA) is a government agency which is the official source of space weather information for the Korean Government and the primary action agency for emergency measures in severe space weather conditions, as a Regional Warning Center of the International Space Environment Service (ISES). KSWC's main role is providing alerts, watches, and forecasts in order to minimize space weather impacts on both public and commercial sectors: satellites, aviation, communications, navigation, power grids, etc. KSWC is also in charge of monitoring the space weather condition and conducting research and development in support of its space weather operations in Korea. Recently, KSWC has been focusing on increasing the accuracy of space weather forecasting results and verifying model-generated results. The forecasting accuracy will be calculated using probabilistic statistical estimation so that the results can be compared numerically. Regarding the cosmic radiation dose, we are gathering actual measured radiation dose data in cooperation with domestic airlines. Based on these measurements, we are going to verify the reliability of the SAFE system, which was developed by KSWC to provide cosmic radiation dose information to airline cabin crews and public users.

  18. Verification of ECMWF System 4 for seasonal hydrological forecasting in a northern climate

    Science.gov (United States)

    Bazile, Rachel; Boucher, Marie-Amélie; Perreault, Luc; Leconte, Robert

    2017-11-01

    Hydropower production requires optimal dam and reservoir management to prevent flooding damage and avoid operation losses. In a northern climate, where the spring freshet constitutes the main inflow volume, seasonal forecasts can help to establish a yearly strategy. Long-term hydrological forecasts often rely on past observations of streamflow or meteorological data. Another alternative is to use ensemble meteorological forecasts produced by climate models. In this paper, those produced by the ECMWF (European Centre for Medium-Range Weather Forecasts) System 4 are examined and their bias is characterized. Bias correction, through the linear scaling method, improves the performance of the raw ensemble meteorological forecasts in terms of the continuous ranked probability score (CRPS). Then, three seasonal ensemble hydrological forecasting systems are compared: (1) the climatology of simulated streamflow, (2) the ensemble hydrological forecasts based on climatology (ESP) and (3) the hydrological forecasts based on bias-corrected ensemble meteorological forecasts from System 4 (corr-DSP). Simulated streamflow computed using observed meteorological data is used as the benchmark. Accounting for initial conditions is valuable even for long-term forecasts. ESP and corr-DSP both outperform the climatology of simulated streamflow for lead times from 1 to 5 months, depending on the season and watershed. Integrating information about future meteorological conditions also improves monthly volume forecasts. For the 1-month lead time, a gain exists for almost all watersheds during winter, summer and fall. However, volume forecast performance for spring varies from one watershed to another; for most of them, the performance is close to that of ESP. For longer lead times, the CRPS skill score is mostly in favour of ESP, even if for many watersheds ESP and corr-DSP have comparable skill. Corr-DSP appears quite reliable but, in some cases, under-dispersion or bias is observed. 
A more complex bias
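
    The linear scaling method referred to above corrects the forecast climatology toward the observed one with a multiplicative factor. A minimal sketch with hypothetical monthly precipitation means:

```python
import numpy as np

def linear_scaling_factor(obs_clim, fcst_clim):
    # multiplicative correction: ratio of observed to forecast mean
    return np.mean(obs_clim) / np.mean(fcst_clim)

# hypothetical monthly precipitation climatologies (mm)
obs = np.array([80.0, 95.0, 110.0, 70.0, 60.0])
fcst = np.array([100.0, 120.0, 130.0, 90.0, 80.0])   # wet-biased raw forecasts

factor = linear_scaling_factor(obs, fcst)
corrected = fcst * factor
```

    By construction the corrected forecasts reproduce the observed climatological mean; in practice the factor is computed per month (or per season) and separately for each watershed.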

  19. A new method for evaluating the availability, reliability, and maintainability whatever may be the probability law

    International Nuclear Information System (INIS)

    Doyon, L.R.; CEA Centre d'Etudes Nucleaires de Saclay, 91 - Gif-sur-Yvette

    1975-01-01

    A simple method is presented for computer solution of any system model (availability, reliability, and maintenance) in which the intervals between failures and the repair durations are distributed according to any probability law, and for any maintenance policy. A matrix equation is obtained using Markov diagrams. An example is given, with the solution produced by the APAFS program (Algorithme Pour l'Analyse de la Fiabilite des Systemes). [fr]
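
    The matrix-equation approach from Markov diagrams can be illustrated for the simplest repairable component with exponential failure and repair laws (the APAFS program itself handles arbitrary laws and maintenance policies; this sketch covers only the two-state exponential case):

```python
import numpy as np

lam, mu = 0.01, 0.5   # failure and repair rates (per hour), illustrative

# generator matrix for the states {up, down}
Q = np.array([[-lam,  lam],
              [  mu,  -mu]])

# steady-state distribution pi solves pi @ Q = 0 with pi summing to 1;
# stack the normalization row onto the transposed generator
A = np.vstack([Q.T, np.ones(2)])
b = np.array([0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

availability = pi[0]   # long-run fraction of time in the "up" state
```

    For this two-state case the result reduces to the textbook formula mu / (lam + mu); the matrix formulation is what generalizes to larger state diagrams.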

  20. On the incidence of meteorological and hydrological processors: Effect of resolution, sharpness and reliability of hydrological ensemble forecasts

    Science.gov (United States)

    Abaza, Mabrouk; Anctil, François; Fortin, Vincent; Perreault, Luc

    2017-12-01

    Meteorological and hydrological ensemble prediction systems are imperfect. Their outputs could often be improved through the use of a statistical processor, opening up the question of the necessity of using both processors (meteorological and hydrological), only one of them, or none. This experiment compares the predictive distributions from four hydrological ensemble prediction systems (H-EPS) utilising the Ensemble Kalman filter (EnKF) probabilistic sequential data assimilation scheme. They differ in the inclusion or not of the Distribution Based Scaling (DBS) method for post-processing meteorological forecasts and the ensemble Bayesian Model Averaging (ensemble BMA) method for hydrological forecast post-processing. The experiment is implemented on three large watersheds and relies on the combination of two meteorological reforecast products: the 4-member Canadian reforecasts from the Canadian Centre for Meteorological and Environmental Prediction (CCMEP) and the 10-member American reforecasts from the National Oceanic and Atmospheric Administration (NOAA), leading to 14 members at each time step. Results show that all four tested H-EPS lead to resolution and sharpness values that are quite similar, with an advantage to DBS + EnKF. The ensemble BMA is unable to compensate for any bias left in the precipitation ensemble forecasts. On the other hand, it succeeds in calibrating ensemble members that are otherwise under-dispersed. If reliability is preferred over resolution and sharpness, DBS + EnKF + ensemble BMA performs best, making use of both processors in the H-EPS system. Conversely, for enhanced resolution and sharpness, DBS is the preferred method.

  1. Towards an Australian ensemble streamflow forecasting system for flood prediction and water management

    Science.gov (United States)

    Bennett, J.; David, R. E.; Wang, Q.; Li, M.; Shrestha, D. L.

    2016-12-01

    Flood forecasting in Australia has historically relied on deterministic forecasting models run only when floods are imminent, with considerable forecaster input and interpretation. These now co-exist with a continually available 7-day streamflow forecasting service (also deterministic) aimed at operational water management applications such as environmental flow releases. The 7-day service is not optimised for flood prediction. We describe progress on developing a system for ensemble streamflow forecasting that is suitable for both flood prediction and water management applications. Precipitation uncertainty is handled through post-processing of Numerical Weather Prediction (NWP) output with a Bayesian rainfall post-processor (RPP). The RPP corrects biases, downscales NWP output, and produces reliable ensemble spread. Ensemble precipitation forecasts are used to force a semi-distributed conceptual rainfall-runoff model. Uncertainty in precipitation forecasts is insufficient to describe streamflow forecast uncertainty reliably, particularly at shorter lead-times. We characterise hydrological prediction uncertainty separately with a 4-stage error model. The error model relies on data transformation to ensure residuals are homoscedastic and symmetrically distributed. To ensure streamflow forecasts are accurate and reliable, the residuals are modelled using a mixture-Gaussian distribution with distinct parameters for the rising and falling limbs of the forecast hydrograph. In a case study of the Murray River in south-eastern Australia, we show that ensemble predictions of floods generally have lower errors than deterministic forecasting methods. We also discuss some of the challenges in operationalising short-term ensemble streamflow forecasts in Australia, including meeting the need for accurate predictions across all flow ranges and comparing forecasts generated by event and continuous hydrological models.

  2. Quantifying probabilities of eruptions at Mount Etna (Sicily, Italy).

    Science.gov (United States)

    Brancato, Alfonso

    2010-05-01

    One of the major goals of modern volcanology is to set up sound risk-based decision-making in land-use planning and emergency management. Volcanic hazard must be managed with reliable estimates of quantitative long- and short-term eruption forecasting, but the large number of observables involved in a volcanic process suggests that a probabilistic approach is a suitable forecasting tool. The aim of this work is to quantify a probabilistic estimate of the vent location for suitable lava flow hazard assessment at Mt. Etna volcano, through the application of the code named BET (Marzocchi et al., 2004, 2008). The BET_EF model is based on the event tree philosophy assessed by Newhall and Hoblitt (2002), further developing the concepts of vent location, epistemic uncertainties, and a fuzzy approach to monitoring measurements. A Bayesian event tree is a specialized branching graphical representation of events in which individual branches are alternative steps from a general prior event, evolving into increasingly specific subsequent states. The event tree thus attempts to display graphically all relevant possible outcomes of volcanic unrest in progressively higher levels of detail. The procedure is set to estimate an a priori probability distribution based upon theoretical knowledge, to update it using past data, and to modify it further using current monitoring data. For long-term forecasting, an a priori model dealing with the present tectonic and volcanic structure of Mt. Etna is considered. The model is mainly based on past vent locations and fracture location datasets (the 20th century of the eruptive history of the volcano). Considering the variation of this information through time, and its relationship with the structural setting of the volcano, we are also able to define an a posteriori probability map for the next vent opening. 
For short-term forecasting vent opening hazard assessment, the monitoring has a leading role, primarily

  3. Financial forecasts accuracy in Brazil’s social security system

    Science.gov (United States)

    2017-01-01

    Long-term social security statistical forecasts produced and disseminated by the Brazilian government aim to provide accurate results that would serve as background information for optimal policy decisions. These forecasts are being used as support for the government’s proposed pension reform that plans to radically change the Brazilian Constitution insofar as Social Security is concerned. However, the reliability of official results is uncertain since no systematic evaluation of these forecasts has ever been published by the Brazilian government or anyone else. This paper aims to present a study of the accuracy and methodology of the instruments used by the Brazilian government to carry out long-term actuarial forecasts. We base our research on an empirical and probabilistic analysis of the official models. Our empirical analysis shows that the long-term Social Security forecasts are systematically biased in the short term and have significant errors that render them meaningless in the long run. Moreover, the low level of transparency in the methods impaired the replication of results published by the Brazilian Government and the use of outdated data compromises forecast results. In the theoretical analysis, based on a mathematical modeling approach, we discuss the complexity and limitations of the macroeconomic forecast through the computation of confidence intervals. We demonstrate the problems related to error measurement inherent to any forecasting process. We then extend this exercise to the computation of confidence intervals for Social Security forecasts. This mathematical exercise raises questions about the degree of reliability of the Social Security forecasts. PMID:28859172

  4. Uncertainties in Forecasting Streamflow using Entropy Theory

    Science.gov (United States)

    Cui, H.; Singh, V. P.

    2017-12-01

    Streamflow forecasting is essential in river restoration, reservoir operation, power generation, irrigation, navigation, and water management. However, uncertainties always accompany a forecast, and they may affect the forecasting results and lead to large variations. Therefore, uncertainties must be considered and assessed properly when forecasting streamflow for water management. The aim of our work is to quantify the uncertainties involved in forecasting streamflow and to provide reliable streamflow forecasts. Although streamflow time series are stochastic, they exhibit seasonal and periodic patterns. Therefore, streamflow forecasting entails modeling seasonality, periodicity, and the correlation structure, and assessing uncertainties. This study applies entropy theory to forecast streamflow and to measure uncertainties during the forecasting process. To apply entropy theory for streamflow forecasting, spectral analysis is combined with time series analysis, as spectral analysis can be employed to characterize patterns of streamflow variation and identify the periodicity of streamflow. That is, it permits the extraction of significant information for understanding the streamflow process and its prediction. Application of entropy theory for streamflow forecasting involves determination of the spectral density, determination of parameters, and extension of the autocorrelation function. The uncertainties introduced by the precipitation input, the forecasting model, and the forecasted results are measured separately using entropy. Using information theory, we describe how these uncertainties are transported and aggregated during these processes.
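
    Measuring each uncertainty source by entropy can be sketched by discretizing hypothetical error samples on a common grid and computing the Shannon entropy; the error distributions below are stand-ins, not results from the study:

```python
import numpy as np

def shannon_entropy(samples, edges):
    # Shannon entropy (nats) of the empirical distribution on a fixed grid
    counts, _ = np.histogram(samples, bins=edges)
    p = counts[counts > 0] / counts.sum()
    return float(-np.sum(p * np.log(p)))

rng = np.random.default_rng(2)
# hypothetical forecast errors at two stages of the forecasting chain
input_errors = rng.normal(0.0, 1.0, size=5000)   # e.g. precipitation input
model_errors = rng.normal(0.0, 0.5, size=5000)   # e.g. model structure

edges = np.linspace(-4, 4, 41)   # shared bins so the entropies are comparable
h_input = shannon_entropy(input_errors, edges)
h_model = shannon_entropy(model_errors, edges)
```

    Using common bin edges is essential: a wider error distribution then yields a higher entropy, quantifying which stage contributes the most uncertainty.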

  5. Forecast Combination under Heavy-Tailed Errors

    Directory of Open Access Journals (Sweden)

    Gang Cheng

    2015-11-01

    Full Text Available Forecast combination has been proven to be a very important technique for obtaining accurate predictions in economics, finance, marketing and many other areas. In many applications, forecast errors exhibit heavy-tailed behavior for various reasons. Unfortunately, to our knowledge, little has been done to obtain reliable forecast combinations for such situations. The familiar forecast combination methods, such as the simple average, least squares regression, or those based on the variance-covariance of the forecasts, may perform very poorly because outliers tend to occur, giving these methods unstable weights and leading to non-robust forecasts. To address this problem, in this paper we propose two nonparametric forecast combination methods. One is specially designed for situations in which the forecast errors are strongly believed to have heavy tails that can be modeled by a scaled Student's t-distribution; the other is designed for relatively more general situations in which there is a lack of strong or consistent evidence on the tail behavior of the forecast errors, due to a shortage of data and/or an evolving data-generating process. Adaptive risk bounds for both methods are developed. They show that the resulting combined forecasts yield near-optimal mean forecast errors relative to the candidate forecasts. Simulations and a real example demonstrate their superior performance: they indeed tend to have significantly smaller prediction errors than previous combination methods in the presence of forecast outliers.
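
    As a simple illustration of why heavy tails matter (this is not the paper's nonparametric method), compare the simple average with a median combination when forecast errors follow a heavy-tailed Student's t-distribution:

```python
import numpy as np

rng = np.random.default_rng(3)
T, K = 2000, 5   # time steps, candidate forecasters

truth = np.sin(np.linspace(0, 10, T))

# candidate forecasts contaminated with heavy-tailed Student-t errors (df=1.5)
forecasts = truth[:, None] + 0.3 * rng.standard_t(df=1.5, size=(T, K))

mean_comb = forecasts.mean(axis=1)       # simple average combination
median_comb = np.median(forecasts, axis=1)  # robust alternative

mae_mean = float(np.mean(np.abs(mean_comb - truth)))
mae_median = float(np.mean(np.abs(median_comb - truth)))
```

    A single outlying forecaster drags the average far from the truth, while the median ignores it, which is the instability the paper's methods are designed to control with theoretical guarantees.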

  6. A methodology for Electric Power Load Forecasting

    Directory of Open Access Journals (Sweden)

    Eisa Almeshaiei

    2011-06-01

    Full Text Available Electricity demand forecasting is a central and integral process for planning periodical operations and facility expansion in the electricity sector. The demand pattern is often very complex due to the deregulation of energy markets. Therefore, finding an appropriate forecasting model for a specific electricity network is not an easy task. Although many forecasting methods have been developed, none can be generalized to all demand patterns. Therefore, this paper presents a pragmatic methodology that can be used as a guide to construct Electric Power Load Forecasting models. This methodology is mainly based on decomposition and segmentation of the load time series. Several statistical analyses are involved to study the load features and forecasting precision, such as moving averages and probability plots of load noise. Real daily load data from the Kuwaiti electric network are used as a case study. Some results are reported to guide forecasting of the future needs of this network.
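
    The decomposition step of such a methodology can be sketched with a centered moving average that strips the weekly cycle from a synthetic daily load series (the series below is invented, not the Kuwaiti data):

```python
import numpy as np

def moving_average(x, window=7):
    # centered running mean; trims the edges where the window is incomplete
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="valid")

rng = np.random.default_rng(4)
days = np.arange(365)
# hypothetical daily load: annual cycle + weekly cycle + noise (MW)
load = (6000 + 800 * np.sin(2 * np.pi * days / 365)
        + 300 * np.sin(2 * np.pi * days / 7)
        + rng.normal(0, 100, size=days.size))

trend = moving_average(load, window=7)   # a 7-day window removes the weekly cycle
noise = load[3:-3] - trend               # residual: weekly cycle plus random noise
```

    The residual can then be examined with probability plots, as the paper suggests, to decide how the noise should be modeled for each segment of the series.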

  7. Forecasting infectious disease emergence subject to seasonal forcing.

    Science.gov (United States)

    Miller, Paige B; O'Dea, Eamon B; Rohani, Pejman; Drake, John M

    2017-09-06

    Despite high vaccination coverage, many childhood infections pose a growing threat to human populations. Accurate disease forecasting would be of tremendous value to public health. Forecasting disease emergence using early warning signals (EWS) is possible in non-seasonal models of infectious diseases. Here, we assessed whether EWS also anticipate disease emergence in seasonal models. We simulated the dynamics of an immunizing infectious pathogen approaching the tipping point to disease endemicity. To explore the effect of seasonality on the reliability of early warning statistics, we varied the amplitude of fluctuations around the average transmission. We proposed and analyzed two new early warning signals based on the wavelet spectrum. We measured the reliability of the early warning signals depending on the strength of their trend preceding the tipping point and then calculated the Area Under the Curve (AUC) statistic. Early warning signals were reliable when disease transmission was subject to seasonal forcing. Wavelet-based early warning signals were as reliable as other conventional early warning signals. We found that removing seasonal trends, prior to analysis, did not improve early warning statistics uniformly. Early warning signals anticipate the onset of critical transitions for infectious diseases which are subject to seasonal forcing. Wavelet-based early warning statistics can also be used to forecast infectious disease.
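
    The AUC statistic used to grade early warning signals is equivalent to the Mann-Whitney rank statistic. A sketch with hypothetical trend statistics for emerging versus null simulations:

```python
import numpy as np

def auc(scores_pos, scores_neg):
    # probability that a randomly chosen "emerging" series scores higher
    # than a "null" one (Mann-Whitney formulation of the AUC)
    pos = np.asarray(scores_pos)[:, None]
    neg = np.asarray(scores_neg)[None, :]
    return float(np.mean(pos > neg) + 0.5 * np.mean(pos == neg))

rng = np.random.default_rng(6)
# hypothetical trend statistics (e.g. Kendall's tau of the variance) for
# simulations approaching the tipping point versus null simulations
emerging = rng.normal(0.6, 0.3, size=200)
null = rng.normal(0.0, 0.3, size=200)

score = auc(emerging, null)
```

    An AUC of 0.5 means the signal is no better than chance, while values approaching 1 indicate a reliable early warning statistic.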

  8. The reliable solution and computation time of variable parameters logistic model

    Science.gov (United States)

    Wang, Pengfei; Pan, Xinnong

    2018-05-01

    The study investigates the reliable computation time (RCT, termed T c) by applying a double-precision computation of a variable-parameter logistic map (VPLM). Firstly, by using the proposed method, we obtain the reliable solutions for the logistic map. Secondly, we construct 10,000 samples of reliable experiments from a VPLM with time-dependent non-stationary parameters and then calculate the mean T c. The results indicate that, for each different initial value, the T cs of the VPLM are generally different. However, the mean T c tends to a constant value when the sample number is large enough. The maximum, minimum, and probability distribution functions of T c are also obtained, which can help us to identify the robustness of applying nonlinear time series theory to forecasting with the VPLM output. In addition, the T c of the fixed-parameter experiments of the logistic map is obtained, and the results suggest that this T c matches the value predicted by the theoretical formula.
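
    A toy experiment in the spirit of T c (how many iterations of a chaotic logistic map remain trustworthy in double precision) tracks when two orbits started a round-off apart separate; this is a proxy for the paper's procedure, not a reproduction of it:

```python
def logistic_divergence_time(x0, r=4.0, eps=1e-15, tol=1e-3, max_iter=2000):
    # iterate the logistic map from x0 and a perturbed copy; return the
    # step at which the two double-precision orbits first differ by > tol
    a, b = x0, x0 + eps
    for n in range(max_iter):
        if abs(a - b) > tol:
            return n
        a = r * a * (1.0 - a)
        b = r * b * (1.0 - b)
    return max_iter

tc = logistic_divergence_time(0.3)
```

    Because round-off errors grow roughly exponentially in a chaotic map, larger initial perturbations exhaust the trustworthy horizon sooner, which is why forecasts based on chaotic model output are only reliable up to a finite T c.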

  9. Bayesian quantitative precipitation forecasts in terms of quantiles

    Science.gov (United States)

    Bentzien, Sabrina; Friederichs, Petra

    2014-05-01

    Ensemble prediction systems (EPS) for numerical weather predictions on the mesoscale are developed particularly to obtain probabilistic guidance for high-impact weather. An EPS issues not only a deterministic future state of the atmosphere but a sample of possible future states. Ensemble postprocessing then translates such a sample of forecasts into probabilistic measures. This study focuses on probabilistic quantitative precipitation forecasts in terms of quantiles. Quantiles are particularly suitable to describe precipitation at various locations, since no assumption is required on the distribution of precipitation. The focus is on prediction during high-impact events, related to the Volkswagen Stiftung funded project WEX-MOP (Mesoscale Weather Extremes - Theory, Spatial Modeling and Prediction). Quantile forecasts are derived from the raw ensemble and via quantile regression. The neighborhood method and time-lagging are effective tools to inexpensively increase the ensemble spread, which results in more reliable forecasts, especially for extreme precipitation events. Since an EPS provides a large number of potentially informative predictors, variable selection is required in order to obtain a stable statistical model. A Bayesian formulation of quantile regression allows for inference about the selection of predictive covariates by the use of appropriate prior distributions. Moreover, the implementation of an additional process layer for the regression parameters accounts for spatial variations of the parameters. Bayesian quantile regression and its spatially adaptive extension are illustrated for the German-focused mesoscale weather prediction ensemble COSMO-DE-EPS, which has run (pre)operationally since December 2010 at the German Meteorological Service (DWD). Objective out-of-sample verification uses the quantile score (QS), a weighted absolute error between quantile forecasts and observations. The QS is a proper scoring function and can be decomposed into
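    The quantile score used for verification here is the pinball loss, a weighted absolute error minimized in expectation by the true quantile. A minimal sketch of the standard definition (not the study's code):

```python
def quantile_score(tau, forecast, obs):
    """Pinball loss at probability level tau: tau*(obs-forecast) if the
    observation exceeds the forecast, (tau-1)*(obs-forecast) otherwise.
    It is a proper scoring function for the tau-quantile."""
    u = obs - forecast
    return (tau - (u < 0)) * u

def mean_qs(tau, forecasts, observations):
    """Average quantile score over forecast/observation pairs."""
    return sum(quantile_score(tau, f, o)
               for f, o in zip(forecasts, observations)) / len(forecasts)
```

    Under-forecasting a high quantile (e.g. tau = 0.9) is penalized far more heavily than over-forecasting it, which is what makes the score appropriate for extreme precipitation.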

  10. Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR): Data manual. Part 2: Human error probability (HEP) data; Volume 5, Revision 4

    Energy Technology Data Exchange (ETDEWEB)

    Reece, W.J.; Gilbert, B.G.; Richards, R.E. [EG and G Idaho, Inc., Idaho Falls, ID (United States)

    1994-09-01

    This data manual contains a hard copy of the information in the Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR) Version 3.5 database, which is sponsored by the US Nuclear Regulatory Commission. NUCLARR was designed as a tool for risk analysis. Many of the nuclear reactors in the US and several outside the US are represented in the NUCLARR database. NUCLARR includes both human error probability estimates for workers at the plants and hardware failure data for nuclear reactor equipment. Aggregations of these data yield valuable reliability estimates for probabilistic risk assessments and human reliability analyses. The data manual is organized to permit manual searches of the information if the computerized version is not available. Originally, the manual was published in three parts. In this revision the introductory material located in the original Part 1 has been incorporated into the text of Parts 2 and 3. The user can now find introductory material either in the original Part 1, or in Parts 2 and 3 as revised. Part 2 contains the human error probability data, and Part 3, the hardware component reliability data.

  11. Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR): Data manual. Part 2: Human error probability (HEP) data; Volume 5, Revision 4

    International Nuclear Information System (INIS)

    Reece, W.J.; Gilbert, B.G.; Richards, R.E.

    1994-09-01

    This data manual contains a hard copy of the information in the Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR) Version 3.5 database, which is sponsored by the US Nuclear Regulatory Commission. NUCLARR was designed as a tool for risk analysis. Many of the nuclear reactors in the US and several outside the US are represented in the NUCLARR database. NUCLARR includes both human error probability estimates for workers at the plants and hardware failure data for nuclear reactor equipment. Aggregations of these data yield valuable reliability estimates for probabilistic risk assessments and human reliability analyses. The data manual is organized to permit manual searches of the information if the computerized version is not available. Originally, the manual was published in three parts. In this revision the introductory material located in the original Part 1 has been incorporated into the text of Parts 2 and 3. The user can now find introductory material either in the original Part 1, or in Parts 2 and 3 as revised. Part 2 contains the human error probability data, and Part 3, the hardware component reliability data

  12. Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR): Data manual, Part 2: Human Error Probability (HEP) Data. Volume 5, Revision 4

    International Nuclear Information System (INIS)

    Reece, W.J.; Gilbert, B.G.; Richards, R.E.

    1994-09-01

    This data manual contains a hard copy of the information in the Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR) Version 3.5 database, which is sponsored by the US Nuclear Regulatory Commission. NUCLARR was designed as a tool for risk analysis. Many of the nuclear reactors in the US and several outside the US are represented in the NUCLARR database. NUCLARR includes both human error probability estimates for workers at the plants and hardware failure data for nuclear reactor equipment. Aggregations of these data yield valuable reliability estimates for probabilistic risk assessments and human reliability analyses. The data manual is organized to permit manual searches of the information if the computerized version is not available. Originally, the manual was published in three parts. In this revision the introductory material located in the original Part 1 has been incorporated into the text of Parts 2 and 3. The user can now find introductory material either in the original Part 1, or in Parts 2 and 3 as revised. Part 2 contains the human error probability data, and Part 3, the hardware component reliability data

  13. Spatial load forecasting

    Energy Technology Data Exchange (ETDEWEB)

    Willis, H.L.; Engel, M.V.; Buri, M.J.

    1995-04-01

    The reliability, efficiency, and economy of a power delivery system depend mainly on how well its substations, transmission lines, and distribution feeders are located within the utility service area, and how well their capacities match power needs in their respective localities. Often, utility planners are forced to commit to sites, rights of way, and equipment capacities years in advance. A necessary element of effective expansion planning is a forecast of where and how much demand must be served by the future T and D system. This article reports a three-stage method that forecasts with sufficient accuracy and detail to allow meaningful determination of sites and sizes for future substation, transmission, and distribution facilities.

  14. [Combined forecasting system of peritonitis outcome].

    Science.gov (United States)

    Lebedev, N V; Klimov, A E; Agrba, S B; Gaidukevich, E K

    To create a reliable system for assessing the severity and predicting the outcome of peritonitis. A critical analysis of existing systems for peritonitis severity assessment is presented. The study included the outcomes of 347 patients admitted to the Department of Faculty Surgery of Peoples' Friendship University of Russia in 2015-2016. The causes of peritonitis were destructive forms of acute appendicitis, cholecystitis, perforated gastroduodenal ulcer, and various perforations of the small and large intestine (including tumor). A combined forecasting system for peritonitis severity assessment was created. The system includes clinical and laboratory data, assessment of the systemic inflammatory response (SIRS) and severity of organ failure (qSOFA). The authors focused on easily identifiable parameters which are available in virtually any surgical hospital. The threshold value (lethal outcome probability over 50%) is 8 points in this system. According to ROC-curve analysis, sensitivity, specificity and accuracy were 93.3, 99.7 and 98.9%, respectively, exceeding the corresponding values for MPI and APACHE II.

  15. Effects of track and threat information on judgments of hurricane strike probability.

    Science.gov (United States)

    Wu, Hao-Che; Lindell, Michael K; Prater, Carla S; Samuelson, Charles D

    2014-06-01

    Although evacuation is one of the best strategies for protecting citizens from hurricane threat, the ways that local elected officials use hurricane data in deciding whether to issue hurricane evacuation orders are not well understood. To begin to address this problem, we examined the effects of hurricane track and intensity information in a laboratory setting where participants judged the probability that hypothetical hurricanes with a constant bearing (i.e., a straight-line forecast track) would make landfall in each of eight 45-degree sectors around the Gulf of Mexico. The results from 162 participants in a student sample showed that the judged strike probability distributions over the eight sectors within each scenario were, unsurprisingly, unimodal and centered on the sector toward which the forecast track pointed. More significantly, although strike probability judgments for the sector in the direction of the forecast track were generally higher than the corresponding judgments for the other sectors, the latter were not zero. Most significantly, there were no appreciable differences in the patterns of strike probability judgments for hurricane tracks represented by a forecast track only, an uncertainty cone only, or a forecast track with an uncertainty cone, a result consistent with a recent survey of coastal residents threatened by Hurricane Charley. The study results suggest that people are able to correctly process basic information about hurricane tracks, but they do make some errors. More research is needed to understand the sources of these errors and to identify better methods of displaying uncertainty about hurricane parameters. © 2013 Society for Risk Analysis.

  16. Ensemble forecasting using sequential aggregation for photovoltaic power applications

    International Nuclear Information System (INIS)

    Thorey, Jean

    2017-01-01

    Our main objective is to improve the quality of photovoltaic power forecasts derived from weather forecasts. Such forecasts are imperfect due to meteorological uncertainties and statistical modeling inaccuracies in the conversion of weather forecasts to power forecasts. First we gather several weather forecasts, secondly we generate multiple photovoltaic power forecasts, and finally we build linear combinations of the power forecasts. Minimization of the Continuous Ranked Probability Score (CRPS) makes it possible to statistically calibrate the combination of these forecasts, and provides probabilistic forecasts in the form of a weighted empirical distribution function. We investigate the CRPS bias in this context and several properties of scoring rules which can be seen as a sum of quantile-weighted losses or a sum of threshold-weighted losses. The minimization procedure is achieved with online learning techniques. Such techniques come with theoretical guarantees of robustness on the predictive power of the combination of the forecasts; essentially no assumptions are needed for the guarantees to hold. The proposed methods are applied to the forecast of solar radiation using satellite data, and to the forecast of photovoltaic power based on high-resolution weather forecasts and standard ensembles of forecasts. (author)
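    The CRPS of a weighted empirical distribution function, as produced by such a forecast combination, has the standard closed "energy" form E|X - y| - 0.5 E|X - X'|. A minimal sketch under that textbook definition (not the thesis implementation):

```python
def crps_weighted_ensemble(members, weights, obs):
    """CRPS of a weighted empirical distribution: E|X - y| - 0.5*E|X - X'|,
    where X, X' are independent draws from the weighted forecast members
    and y is the verifying observation. Weights are normalized internally."""
    total = sum(weights)
    w = [wi / total for wi in weights]
    term1 = sum(wi * abs(xi - obs) for wi, xi in zip(w, members))
    term2 = 0.5 * sum(wi * wj * abs(xi - xj)
                      for wi, xi in zip(w, members)
                      for wj, xj in zip(w, members))
    return term1 - term2
```

    For a single member the score reduces to the absolute error, which is why the CRPS generalizes the MAE to probabilistic forecasts.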

  17. Forecasting electricity consumption in Pakistan: the way forward

    International Nuclear Information System (INIS)

    Hussain, Anwar; Rahman, Muhammad; Memon, Junaid Alam

    2016-01-01

    The growing shortfall of electricity in Pakistan affects almost all sectors of its economy. For proper policy formulation, it is imperative to have reliable forecasts of electricity consumption. This paper applies Holt-Winter and Autoregressive Integrated Moving Average (ARIMA) models to secondary time series data from 1980 to 2011 to forecast total and component-wise electricity consumption in Pakistan. Results reveal that Holt-Winter is the appropriate model for forecasting electricity consumption in Pakistan. They also suggest that electricity consumption will continue to increase throughout the projected period and widen the consumption-production gap if the issue is not addressed appropriately. The results further reveal that demand will be highest in the household sector compared to all other sectors, and that the increase in energy generation will be less than the increase in total electricity consumption throughout the projected period. The study discusses various options to reduce the demand-supply gap and provide reliable electricity to different sectors of the economy. - Highlights: • We forecast total and component-wise electricity consumption for Pakistan. • The electricity shortfall in Pakistan will increase in future if the present situation persists. • Various options exist to cope with the electricity crisis in the country. • The Holt-Winter model gives the best forecasts for electricity consumption in the country.
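    For non-seasonal annual data like these, the Holt-Winter family reduces to Holt's linear-trend exponential smoothing. A minimal sketch with illustrative smoothing parameters (not the paper's fitted model):

```python
def holt_linear_forecast(series, alpha=0.5, beta=0.5, horizon=1):
    """Holt's linear-trend exponential smoothing (the non-seasonal core of
    Holt-Winters). Maintains a smoothed level and trend, then extrapolates
    them linearly for `horizon` steps. Needs at least two observations."""
    level, trend = series[0], series[1] - series[0]
    for y in series[1:]:
        prev_level = level
        level = alpha * y + (1 - alpha) * (level + trend)   # level update
        trend = beta * (level - prev_level) + (1 - beta) * trend  # trend update
    return [level + (h + 1) * trend for h in range(horizon)]
```

    On a perfectly linear consumption series the method simply continues the line; in practice alpha and beta would be chosen by minimizing in-sample forecast error.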

  18. Response and reliability analysis of nonlinear uncertain dynamical structures by the probability density evolution method

    DEFF Research Database (Denmark)

    Nielsen, Søren R. K.; Peng, Yongbo; Sichani, Mahdi Teimouri

    2016-01-01

    The paper deals with the response and reliability analysis of hysteretic or geometric nonlinear uncertain dynamical systems of arbitrary dimensionality driven by stochastic processes. The approach is based on the probability density evolution method proposed by Li and Chen (Stochastic dynamics of structures, 1st edn. Wiley, London, 2009; Probab Eng Mech 20(1):33–44, 2005), which circumvents the dimensional curse of traditional methods for the determination of non-stationary probability densities based on Markov process assumptions and the numerical solution of the related Fokker–Planck and Kolmogorov–Feller equations. The main obstacle of the method is that a multi-dimensional convolution integral needs to be carried out over the sample space of a set of basic random variables, for which reason the number of these needs to be relatively low. In order to handle this problem an approach is suggested, which

  19. Probabilistic Forecast of Wind Power Generation by Stochastic Differential Equation Models

    KAUST Repository

    Elkantassi, Soumaya

    2017-01-01

    Reliable forecasting of wind power generation is crucial to optimal control of costs in generation of electricity with respect to the electricity demand. Here, we propose and analyze stochastic wind power forecast models described by parametrized

  20. Forecasting seeing and parameters of long-exposure images by means of ARIMA

    Science.gov (United States)

    Kornilov, Matwey V.

    2016-02-01

    Atmospheric turbulence is one of the major limiting factors for ground-based astronomical observations. In this paper, the problem of short-term forecasting of seeing is discussed. Real data obtained from atmospheric optical turbulence (OT) measurements above Mount Shatdzhatmaz in 2007-2013 have been analysed. Linear autoregressive integrated moving average (ARIMA) models are used for the forecasting. A new procedure for forecasting the image characteristics of direct astronomical observations (central image intensity, full width at half maximum, radius encircling 80% of the energy) has been proposed. Probability density functions of the forecasts of these quantities are 1.5-2 times narrower than the respective unconditional probability density functions. Overall, this study found that the described technique can adequately describe temporal stochastic variations of the OT power.
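    As a minimal stand-in for the ARIMA models used here, an AR(1) model can be fitted by ordinary least squares and iterated forward. This illustrates the model class only, not the paper's fitted specification:

```python
def ar1_forecast(series, steps=1):
    """Fit x_t = c + phi * x_{t-1} by ordinary least squares on lagged
    pairs of the series, then iterate the fitted recursion forward to
    produce `steps` point forecasts."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    phi = cov / var
    c = my - phi * mx
    out, last = [], series[-1]
    for _ in range(steps):
        last = c + phi * last
        out.append(last)
    return out
```

    A full ARIMA treatment would add differencing and moving-average terms, but the conditional-forecast idea, and the resulting narrowing of the predictive density relative to the unconditional one, is already visible in this simplest case.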

  1. Profit Forecast Model Using Monte Carlo Simulation in Excel

    Directory of Open Access Journals (Sweden)

    Petru BALOGH

    2014-01-01

    Full Text Available Profit forecasting is very important for any company. The purpose of this study is to provide a method to estimate the profit and the probability of obtaining the expected profit. Monte Carlo methods are stochastic techniques, meaning they are based on the use of random numbers and probability statistics to investigate problems. Monte Carlo simulation furnishes the decision-maker with a range of possible outcomes and the probabilities with which they will occur for any choice of action. Our example of Monte Carlo simulation in Excel is a simplified profit forecast model. Each step of the analysis is described in detail. The input data for the case presented (the number of leads per month, the percentage of leads that result in sales, the cost of a single lead, the profit per sale, and the fixed cost) allow the profit and the associated probability of achieving it to be estimated.
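    The same kind of Monte Carlo profit model can be sketched outside Excel; all input ranges below are illustrative placeholders, not the paper's data:

```python
import random

def simulate_profit(n_sims=10_000, seed=42):
    """Monte Carlo profit sketch: draw the uncertain inputs, compute profit
    for each trial, and return the mean profit together with the estimated
    probability of a positive profit. All distributions are illustrative."""
    random.seed(seed)
    profits = []
    for _ in range(n_sims):
        leads = random.randint(800, 1200)           # leads per month
        conversion = random.uniform(0.02, 0.05)     # share of leads that buy
        cost_per_lead = random.uniform(4.0, 6.0)
        profit_per_sale = random.uniform(180.0, 220.0)
        fixed_cost = 2_000.0
        sales = leads * conversion
        profits.append(sales * profit_per_sale
                       - leads * cost_per_lead - fixed_cost)
    mean = sum(profits) / n_sims
    p_positive = sum(p > 0 for p in profits) / n_sims
    return mean, p_positive
```

    The spreadsheet version does the same thing cell by cell: each simulated row is one draw of the inputs, and the probability of reaching the target is the fraction of rows that exceed it.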

  2. Energy Consumption Forecasting for University Sector Buildings

    Directory of Open Access Journals (Sweden)

    Khuram Pervez Amber

    2017-10-01

    Full Text Available Reliable energy forecasting helps managers prepare future budgets for their buildings. Therefore, a simple, less time-consuming, and reliable forecasting model that could be used for different types of buildings is desired. In this paper, we present a forecasting model based on five years of real data sets for one dependent variable (the daily electricity consumption) and six explanatory variables (ambient temperature, solar radiation, relative humidity, wind speed, weekday index and building type). A single mathematical equation for forecasting the daily electricity usage of university buildings has been developed using the Multiple Regression (MR) technique. Data from two such buildings, located at the Southwark Campus of London South Bank University, have been used for this study. The predicted test results of the MR model are examined and judged against real electricity consumption data of both buildings for the year 2011. The results demonstrate that, of the six explanatory variables, three (ambient temperature, weekday index and building type) have a significant influence on building energy consumption. The results of this model are associated with a Normalized Root Mean Square Error (NRMSE) of 12% for the administrative building and 13% for the academic building. Finally, some limitations of this study are also discussed.
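    The NRMSE used to judge the regression model can be computed as the RMSE normalized by the range of the observations; this is one common convention, and the paper's exact normalization (range, mean, or standard deviation) is assumed here:

```python
def nrmse(actual, predicted):
    """Normalized Root Mean Square Error: RMSE divided by the range of the
    observations, so a value of 0.12 corresponds to the 12% reported for
    the administrative building. Returns a fraction, not a percentage."""
    n = len(actual)
    rmse = (sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n) ** 0.5
    return rmse / (max(actual) - min(actual))
```

    Normalizing by the range makes the error comparable across buildings with very different absolute consumption levels.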

  3. Multi-step wind speed forecasting based on a hybrid forecasting architecture and an improved bat algorithm

    International Nuclear Information System (INIS)

    Xiao, Liye; Qian, Feng; Shao, Wei

    2017-01-01

    Highlights: • Propose a hybrid architecture based on a modified bat algorithm for multi-step wind speed forecasting. • Improve the accuracy of multi-step wind speed forecasting. • Modify the bat algorithm with CG to improve its optimization performance. - Abstract: As one of the most promising sustainable energy sources, wind energy plays an important role in energy development because it is clean and non-polluting. Generally, wind speed forecasting, which has an essential influence on wind power systems, is regarded as a challenging task. Analyses based on single-step wind speed forecasting have been widely used, but their results are insufficient for ensuring the reliability and controllability of wind power systems. In this paper, a new forecasting architecture based on decomposing algorithms and modified neural networks is developed for multi-step wind speed forecasting. Four different hybrid models are contained in this architecture, and to further improve the forecasting performance, a modified bat algorithm (BA) with the conjugate gradient (CG) method is developed to optimize the initial weights between layers and the thresholds of the hidden layer of the neural networks. To investigate the forecasting abilities of the four models, wind speed data collected from four different wind power stations in Penglai, China, were used as a case study. The numerical experiments showed that the hybrid model combining singular spectrum analysis and a general regression neural network with CG-BA (SSA-CG-BA-GRNN) achieved the most accurate forecasting results in one-step to three-step wind speed forecasting.

  4. Forecasting Volatility of USD/MUR Exchange Rate using a GARCH ...

    African Journals Online (AJOL)

    that both distributions may forecast quite well with a slight advantage to the. GARCH(1 ... Financial time series tend to exhibit certain characteristic features such as volatility ... Heteroscedasticity-adjusted MAE to evaluate the forecasts. Chuanga et .... the Student's-t distribution or the GED with the following probability density.

  5. An economic framework for forecasting land-use and ecosystem change

    International Nuclear Information System (INIS)

    Lewis, David J.

    2010-01-01

    This paper develops a joint econometric-simulation framework to forecast detailed empirical distributions of the spatial pattern of land-use and ecosystem change. In-sample and out-of-sample forecasting tests are used to examine the performance of the parcel-scale econometric and simulation models, and the importance of multiple forecasting challenges is assessed. The econometric-simulation method is integrated with an ecological model to generate forecasts of the probability of localized extinctions of an amphibian species. The paper demonstrates the potential of integrating economic and ecological models to generate ecological forecasts in the presence of alternative market conditions and land-use policy constraints. (author)

  6. Objective Lightning Forecasting at Kennedy Space Center and Cape Canaveral Air Force Station using Cloud-to-Ground Lightning Surveillance System Data

    Science.gov (United States)

    Lambert, Winfred; Wheeler, Mark; Roeder, William

    2005-01-01

    The 45th Weather Squadron (45 WS) at Cape Canaveral Air Force Station (CCAFS) in Florida issues a probability of lightning occurrence in its daily 24-hour and weekly planning forecasts. This information is used for general planning of operations at CCAFS and Kennedy Space Center (KSC). These facilities are located in east-central Florida at the east end of a corridor known as 'Lightning Alley', an indication that lightning has a large impact on space-lift operations. Much of the current lightning probability forecast is based on a subjective analysis of model and observational data and an objective forecast tool developed over 30 years ago. The 45 WS requested that a new lightning probability forecast tool, based on statistical analysis of more recent historical warm-season (May-September) data, be developed in order to increase the objectivity of the daily thunderstorm probability forecast. The resulting tool is a set of statistical lightning forecast equations, one for each month of the warm season, that provide a lightning occurrence probability for the day by 1100 UTC (0700 EDT).

  7. Bayesian flood forecasting methods: A review

    Science.gov (United States)

    Han, Shasha; Coulibaly, Paulin

    2017-08-01

    Over the past few decades, floods have been among the most common and most widely distributed natural disasters in the world. If floods could be accurately forecasted in advance, their negative impacts could be greatly minimized. It is widely recognized that quantification and reduction of the uncertainty associated with hydrologic forecasts is of great importance for flood estimation and rational decision making. The Bayesian forecasting system (BFS) offers an ideal theoretical framework for uncertainty quantification that can be developed for probabilistic flood forecasting via any deterministic hydrologic model. It provides a suitable theoretical structure, empirically validated models and reasonable analytic-numerical computation methods, and can be developed into various Bayesian forecasting approaches. This paper presents a comprehensive review of Bayesian forecasting approaches applied in flood forecasting from 1999 until now. The review starts with an overview of the fundamentals of BFS and recent advances in BFS, followed by BFS applications in river stage forecasting and real-time flood forecasting; it then moves to a critical analysis evaluating the advantages and limitations of Bayesian forecasting methods and other predictive uncertainty assessment approaches in flood forecasting, and finally discusses future research directions in Bayesian flood forecasting. Results show that the Bayesian flood forecasting approach is an effective and advanced way for flood estimation; it considers all sources of uncertainty and produces a predictive distribution of the river stage, river discharge or runoff, thus giving more accurate and reliable flood forecasts. Some emerging Bayesian forecasting methods (e.g. ensemble Bayesian forecasting system, Bayesian multi-model combination) were shown to overcome the limitations of a single model or fixed model weights and effectively reduce predictive uncertainty.
In recent years, various Bayesian flood forecasting approaches have been

  8. The influence of frequency and reliability of in-service inspection on reactor pressure vessel disruptive failure probability

    International Nuclear Information System (INIS)

    Jordan, G.M.

    1977-01-01

    A simple probabilistic methodology is used to investigate the benefit, in terms of reduction of disruptive failure probability, which comes from the application of periodic In Service Inspection (ISI) to nuclear pressure vessels. The analysis indicates the strong interaction between inspection benefit and the intrinsic quality of the structure. In order to quantify the inspection benefit, assumptions are made which allow the quality to be characterized in terms of the parameters governing a Log Normal distribution of time-to-failure. Using these assumptions, it is shown that the overall benefit of ISI is unlikely to exceed an order of magnitude in terms of reduction of disruptive failure probability. The method is extended to evaluate the effect of the periodicity and reliability of the inspection process itself. (author)
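    The log-normal time-to-failure model used to characterize vessel quality gives the cumulative failure probability in closed form via the error function. A minimal sketch with illustrative parameters (the paper's actual parameter values are not reproduced here):

```python
import math

def lognormal_failure_prob(t, median_life, sigma):
    """P(failure before time t) for a log-normal time-to-failure
    distribution with the given median life and log-space standard
    deviation sigma. Uses the standard normal CDF via math.erf."""
    mu = math.log(median_life)
    z = (math.log(t) - mu) / (sigma * math.sqrt(2.0))
    return 0.5 * (1.0 + math.erf(z))
```

    Comparing this cumulative probability with and without the fraction of defects assumed to be caught at each periodic inspection is the kind of calculation behind the "order of magnitude" benefit bound; a smaller sigma (tighter quality) leaves less for inspection to remove.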

  9. The influence of frequency and reliability of in-service inspection on reactor pressure vessel disruptive failure probability

    International Nuclear Information System (INIS)

    Jordan, G.M.

    1978-01-01

    A simple probabilistic methodology is used to investigate the benefit, in terms of reduction of disruptive failure probability, which comes from the application of periodic in-service inspection to nuclear pressure vessels. The analysis indicates the strong interaction between inspection benefit and the intrinsic quality of the structure. In order to quantify the inspection benefit, assumptions are made which allow the quality to be characterised in terms of the parameters governing a log-normal distribution of time-to-failure. Using these assumptions, it is shown that the overall benefit of in-service inspection is unlikely to exceed an order of magnitude in terms of reduction of disruptive failure probability. The method is extended to evaluate the effect of the periodicity and reliability of the inspection process itself. (author)

  10. Towards a GME ensemble forecasting system: Ensemble initialization using the breeding technique

    Directory of Open Access Journals (Sweden)

    Jan D. Keller

    2008-12-01

    Full Text Available The quantitative forecast of precipitation requires a probabilistic background, particularly with regard to forecast lead times of more than 3 days. As only ensemble simulations can provide useful information on the underlying probability density function, we built a new ensemble forecasting system (GME-EFS) based on the GME model of the German Meteorological Service (DWD). For the generation of appropriate initial ensemble perturbations we chose the breeding technique developed by Toth and Kalnay (1993, 1997), which develops perturbations by estimating the regions of largest model-error-induced uncertainty. This method is applied and tested in the framework of quasi-operational forecasts for a three-month period in 2007. The performance of the resulting ensemble forecasts is compared to the operational ensemble prediction systems ECMWF EPS and NCEP GFS by means of the ensemble spread of free-atmosphere parameters (geopotential and temperature) and the ensemble skill of precipitation forecasting. This comparison indicates that the GME ensemble forecasting system (GME-EFS) provides reasonable forecasts with a spread skill score comparable to that of the NCEP GFS. An analysis with the continuous ranked probability score exhibits a lack of resolution for the GME forecasts compared to the operational ensembles. However, with significant enhancements during the 3 month test period, the first results of our work with the GME-EFS indicate possibilities for further development as well as the potential for later operational usage.
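    The breeding cycle of Toth and Kalnay can be sketched on a toy model: integrate a control run and a perturbed run, and periodically rescale their difference back to a fixed amplitude so that the perturbation aligns with the fastest-growing error directions. The logistic-map "model" below is a stand-in for GME, and all parameters are illustrative:

```python
def model_step(state, r=3.9):
    # Toy chaotic 'forecast model': an independent logistic map per gridpoint.
    return [r * x * (1.0 - x) for x in state]

def breed(control, perturbation, cycles, amplitude=1e-3):
    """Breeding cycle: add the bred vector to the control state, step both
    runs forward, then rescale their difference back to a fixed amplitude
    (max-norm). Requires a nonzero initial perturbation."""
    norm = max(abs(p) for p in perturbation)
    pert = [p * amplitude / norm for p in perturbation]
    for _ in range(cycles):
        perturbed = [c + p for c, p in zip(control, pert)]
        control = model_step(control)
        perturbed = model_step(perturbed)
        diff = [b - a for a, b in zip(control, perturbed)]
        norm = max(abs(d) for d in diff)
        pert = [d * amplitude / norm for d in diff]
    return control, pert
```

    After enough cycles the arbitrary initial perturbation is forgotten and the bred vector reflects the flow-dependent instabilities of the model state, which is what makes it a cheap substitute for singular-vector ensemble initialization.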

  11. A new spinning reserve requirement forecast method for deregulated electricity markets

    International Nuclear Information System (INIS)

    Amjady, Nima; Keynia, Farshid

    2010-01-01

    Ancillary services are necessary for maintaining the security and reliability of power systems and constitute an important part of trade in competitive electricity markets. Spinning Reserve (SR) is one of the most important ancillary services for preserving power system stability and integrity in response to the contingencies and disturbances that continuously occur in power systems. Hence, an accurate day-ahead forecast of the SR requirement helps the Independent System Operator (ISO) to conduct a reliable and economic operation of the power system. However, the SR signal has complex, non-stationary and volatile behavior along the time domain and depends greatly on system load. In this paper, a new hybrid forecast engine is proposed for SR requirement prediction. The proposed forecast engine has an iterative training mechanism composed of the Levenberg-Marquardt (LM) learning algorithm and a Real Coded Genetic Algorithm (RCGA), implemented on a Multi-Layer Perceptron (MLP) neural network. The proposed forecast methodology is examined by means of real data from the Pennsylvania-New Jersey-Maryland (PJM) electricity market and the California ISO (CAISO) controlled grid. The obtained forecast results are presented and compared with those of other SR forecast methods. (author)

  12. A new spinning reserve requirement forecast method for deregulated electricity markets

    Energy Technology Data Exchange (ETDEWEB)

    Amjady, Nima; Keynia, Farshid [Department of Electrical Engineering, Semnan University, Semnan (Iran)

    2010-06-15

    Ancillary services are necessary for maintaining the security and reliability of power systems and constitute an important part of trade in competitive electricity markets. Spinning Reserve (SR) is one of the most important ancillary services for preserving power system stability and integrity in response to the contingencies and disturbances that continuously occur in power systems. Hence, an accurate day-ahead forecast of the SR requirement helps the Independent System Operator (ISO) to conduct a reliable and economic operation of the power system. However, the SR signal has complex, non-stationary and volatile behavior along the time domain and depends greatly on system load. In this paper, a new hybrid forecast engine is proposed for SR requirement prediction. The proposed forecast engine has an iterative training mechanism composed of the Levenberg-Marquardt (LM) learning algorithm and a Real Coded Genetic Algorithm (RCGA), implemented on a Multi-Layer Perceptron (MLP) neural network. The proposed forecast methodology is examined by means of real data from the Pennsylvania-New Jersey-Maryland (PJM) electricity market and the California ISO (CAISO) controlled grid. The obtained forecast results are presented and compared with those of other SR forecast methods. (author)

  13. Problems in diagnosing and forecasting power equipment reliability

    Energy Technology Data Exchange (ETDEWEB)

    Popkov, V I; Demirchyan, K S

    1979-11-01

    This general survey deals with approaches to the resolution of such problems as the gathering, analysis and systematization of data on component defects in power equipment, and the setting up of feedback to manufacturing plants and planning organizations to improve equipment reliability. Such efforts on the part of designers, manufacturers and operating and repair organizations in analyzing faults in 300 MW turbogenerators during 1974-1977 reduced the specific fault rate by 20 to 25% and the downtime per failure by 35 to 40%. Since power equipment should operate for several hundreds of thousands of hours (20 to 30 years) and the majority of power components have guaranteed service lives of no more than 10^5 hours, an extremely difficult problem is the determination of the reliability of equipment past the 10^5-hour point. The present trend in the USSR Unified Power System towards an increasing number of shutdowns and startups, which in the case of turbogenerators of up to 1200 MW can reach 7500 to 10,000 cycles, is noted. Other areas briefly treated are: MHD generator reliability and economy; nuclear power plant reliability and safety; the reliability of high-power high-voltage thyristor converters; the difficulties involved in scale modeling of power system reliability and the high cost of the requisite full-scale studies; and the poor understanding of long-term corrosion and erosion processes. The review concludes with arguments in favor of greater computerization of all aspects of power system management.

  14. Forecasting Italian seismicity through a spatio-temporal physical model: importance of considering time-dependency and reliability of the forecast

    Directory of Open Access Journals (Sweden)

    Amir Hakimhashemi

    2010-11-01

    Full Text Available We apply here a forecasting model to the Italian region for the spatio-temporal distribution of seismicity based on a smoothing Kernel function, Coulomb stress variations, and a rate-and-state friction law. We tested the feasibility of this approach, and analyzed the importance of introducing time-dependency in forecasting future events. The change in seismicity rate as a function of time was estimated by calculating the Coulomb stress change imparted by large earthquakes. We applied our approach to the region of Italy, and used all of the cataloged earthquakes that occurred up to 2006 to generate the reference seismicity rate. For calculation of the time-dependent seismicity rate changes, we estimated the rate-and-state stress transfer imparted by all of the ML≥4.0 earthquakes that occurred during 2007 and 2008. To validate the results, we first compared the reference seismicity rate with the distribution of ML≥1.8 earthquakes since 2007, using both a non-declustered and a declustered catalog. A positive correlation was found, and all of the forecast earthquakes had locations within 82% and 87% of the study area with the highest seismicity rate, respectively. Furthermore, 95% of the forecast earthquakes had locations within 27% and 47% of the study area with the highest seismicity rate, respectively. For the time-dependent seismicity rate changes, the number of events with locations in the regions with a seismicity rate increase was 11% more than in the regions with a seismicity rate decrease.

  15. Measuring inaccuracy in travel demand forecasting

    DEFF Research Database (Denmark)

    Flyvbjerg, Bent

    2005-01-01

    Project promoters, forecasters, and managers sometimes object to two things in measuring inaccuracy in travel demand forecasting: (1) using the forecast made at the time of making the decision to build as the basis for measuring inaccuracy, and (2) using traffic during the first year of operations as the basis for measurement. This paper presents the case against both objections. First, if one is interested in learning whether decisions about building transport infrastructure are based on reliable information, then it is exactly the traffic forecasted at the time of making the decision to build that is of interest. Second, although ideally studies should take into account so-called demand "ramp up" over a period of years, the empirical evidence and practical considerations do not support this ideal requirement, at least not for large-N studies. Finally, the paper argues that large samples of inaccuracy...

  16. Near-term probabilistic forecast of significant wildfire events for the Western United States

    Science.gov (United States)

    Haiganoush K. Preisler; Karin L. Riley; Crystal S. Stonesifer; Dave E. Calkin; Matt Jolly

    2016-01-01

    Fire danger and potential for large fires in the United States (US) is currently indicated via several forecasted qualitative indices. However, landscape-level quantitative forecasts of the probability of a large fire are currently lacking. In this study, we present a framework for forecasting large fire occurrence - an extreme value event - and evaluating...

  17. Research and Application of a Hybrid Forecasting Model Based on Data Decomposition for Electrical Load Forecasting

    Directory of Open Access Journals (Sweden)

    Yuqi Dong

    2016-12-01

    Full Text Available Accurate short-term electrical load forecasting plays a pivotal role in the national economy and people’s livelihood through providing effective future plans and ensuring a reliable supply of sustainable electricity. Although considerable work has been done to select suitable models and optimize the model parameters to forecast the short-term electrical load, few models are built based on the characteristics of time series, which will have a great impact on the forecasting accuracy. For that reason, this paper proposes a hybrid model based on data decomposition considering periodicity, trend and randomness of the original electrical load time series data. Through preprocessing and analyzing the original time series, the generalized regression neural network optimized by genetic algorithm is used to forecast the short-term electrical load. The experimental results demonstrate that the proposed hybrid model can not only achieve a good fitting ability, but it can also approximate the actual values when dealing with non-linear time series data with periodicity, trend and randomness.

  18. Exploring What Determines the Use of Forecasts of Varying Time Periods in Guanacaste, Costa Rica

    Science.gov (United States)

    Babcock, M.; Wong-Parodi, G.; Grossmann, I.; Small, M. J.

    2016-12-01

    Weather and climate forecasts are promoted as ways to improve water management, especially in the face of changing environmental conditions. However, studies indicate many stakeholders who may benefit from such information do not use it. This study sought to better understand which personal factors (e.g., trust in forecast sources, perceptions of accuracy) were important determinants of the use of 4-day, 3-month, and 12-month rainfall forecasts by stakeholders in water management-related sectors in the seasonally dry province of Guanacaste, Costa Rica. From August to October 2015, we surveyed 87 stakeholders from a mix of government agencies, local water committees, large farms, tourist businesses, environmental NGOs, and the public. The results of an exploratory factor analysis suggest that trust in "informal" forecast sources (traditional methods, family advice) and in "formal" sources (government, university and private company science) are independent of each other. The results of logistic regression analyses suggest that 1) greater understanding of forecasts is associated with a greater probability of 4-day and 3-month forecast use, but not 12-month forecast use; 2) a greater probability of 3-month forecast use is associated with a lower level of trust in "informal" sources; and 3) feeling less secure about water resources and regularly using many sources of information (specifically formal meetings and reports) are each associated with a greater probability of using 12-month forecasts. While limited by the sample size, and affected by the factoring method and regression model assumptions, these results suggest that while forecasts of all time scales are used to some extent, local decision makers' use of 4-day and 3-month forecasts appears to be more intrinsically motivated (based on their level of understanding and trust), while the use of 12-month forecasts seems to be more motivated by a sense of requirement or mandate.

  19. MAG4 Versus Alternative Techniques for Forecasting Active-Region Flare Productivity

    Science.gov (United States)

    Falconer, David A.; Moore, Ronald L.; Barghouty, Abdulnasser F.; Khazanov, Igor

    2014-01-01

    MAG4 (Magnetogram Forecast), developed originally for NASA/SRAG (Space Radiation Analysis Group), is an automated program that analyzes magnetograms from the HMI (Helioseismic and Magnetic Imager) instrument on NASA's SDO (Solar Dynamics Observatory) and automatically forecasts the rate (or probability) of major flares (M- and X-class), Coronal Mass Ejections (CMEs), and Solar Energetic Particle Events. MAG4 does not forecast that a flare will occur at a particular time in the next 24 or 48 hours; rather, it forecasts the probability of one occurring.

  20. INFERENCE AND SENSITIVITY IN STOCHASTIC WIND POWER FORECAST MODELS.

    KAUST Repository

    Elkantassi, Soumaya

    2017-10-03

    Reliable forecasting of wind power generation is crucial to optimal control of the costs of electricity generation with respect to electricity demand. Here, we propose and analyze stochastic wind power forecast models described by parametrized stochastic differential equations, which introduce appropriate fluctuations in numerical forecast outputs. We use an approximate maximum likelihood method to infer the model parameters, taking into account the time-correlated data sets. Furthermore, we study the validity and sensitivity of the parameters for each model. We applied our models to Uruguayan wind power production as determined by historical data and corresponding numerical forecasts for the period of March 1 to May 31, 2016.

  1. INFERENCE AND SENSITIVITY IN STOCHASTIC WIND POWER FORECAST MODELS.

    KAUST Repository

    Elkantassi, Soumaya; Kalligiannaki, Evangelia; Tempone, Raul

    2017-01-01

    Reliable forecasting of wind power generation is crucial to optimal control of the costs of electricity generation with respect to electricity demand. Here, we propose and analyze stochastic wind power forecast models described by parametrized stochastic differential equations, which introduce appropriate fluctuations in numerical forecast outputs. We use an approximate maximum likelihood method to infer the model parameters, taking into account the time-correlated data sets. Furthermore, we study the validity and sensitivity of the parameters for each model. We applied our models to Uruguayan wind power production as determined by historical data and corresponding numerical forecasts for the period of March 1 to May 31, 2016.
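
    The parametrized-SDE idea can be sketched with a mean-reverting Euler-Maruyama simulation around a numerical forecast; the drift/diffusion form, the parameter values and all variable names below are illustrative assumptions, not the authors' fitted model:

```python
import math
import random

def simulate_paths(forecast, theta, sigma, dt=1.0, n_paths=200, seed=42):
    """Euler-Maruyama paths of dX = -theta*(X - p(t))*dt + sigma*dW,
    a mean-reverting fluctuation around the numerical forecast p(t)."""
    rng = random.Random(seed)
    paths = []
    for _ in range(n_paths):
        x = forecast[0]                    # start each path on the forecast
        path = [x]
        for p in forecast[1:]:
            # drift pulls the path back toward the forecast; noise adds spread
            x += -theta * (x - p) * dt + sigma * math.sqrt(dt) * rng.gauss(0, 1)
            path.append(x)
        paths.append(path)
    return paths

forecast = [0.5, 0.55, 0.6, 0.58, 0.52]    # normalized wind power forecast (toy)
paths = simulate_paths(forecast, theta=0.8, sigma=0.05)
ens_mean = [sum(p[t] for p in paths) / len(paths) for t in range(len(forecast))]
```

    The ensemble of paths turns the single deterministic forecast into a distribution from which prediction intervals can be read off.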

  2. Hourly weather forecasts for gas turbine power generation

    Directory of Open Access Journals (Sweden)

    G. Giunta

    2017-06-01

    Full Text Available An hourly short-term weather forecast can optimize processes in Combined Cycle Gas Turbine (CCGT) plants by helping to reduce imbalance charges on the national power grid. Consequently, a reliable meteorological prediction for a given power plant is crucial for obtaining competitive prices on the electricity market, better planning and stock management, and sales and supplies of energy sources. The paper discusses short-term hourly temperature forecasts, at lead times day+1 and day+2, over a period of thirteen months in 2012 and 2013 for six Italian CCGT power plants of 390 MW each (260 MW from the gas turbine and 130 MW from the steam turbine). These CCGT plants are located in three different Italian climate areas: the Po Valley, the Adriatic coast, and the North Tyrrhenian coast. The meteorological model applied in this study is the eni-Kassandra Meteo Forecast (e-kmf™), a multi-model approach that provides probabilistic forecasts, with a Kalman filter used to improve the accuracy of local temperature predictions. Performance skill scores, computed from the output data of the meteorological model, are compared with local observations and used to evaluate forecast reliability. In the study, the approach has shown good overall scores encompassing more than 50,000 hourly temperature values. Some differences from one site to another, due to local meteorological phenomena, can affect the short-term forecast performance, with consequent impacts on gas-to-power production and related negative imbalances. For operational application of the methodology in CCGT power plants, the benefits and limits have been identified.
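
    The Kalman-filter step mentioned above can be illustrated with a minimal scalar filter that tracks a slowly varying forecast bias; this is a generic sketch with made-up noise parameters and temperature values, not the proprietary e-kmf™ implementation:

```python
# Minimal scalar Kalman filter: the state x is the current forecast bias
# (observation minus raw forecast), modelled as a random walk.
def kalman_bias_correct(raw_forecasts, observations, q=0.05, r=1.0):
    """Return bias-corrected forecasts; q = process noise, r = obs noise."""
    x, p = 0.0, 1.0                       # initial bias estimate and variance
    corrected = []
    for f, obs in zip(raw_forecasts, observations):
        corrected.append(f + x)           # correct with the *prior* bias estimate
        p += q                            # predict step (random-walk state)
        k = p / (p + r)                   # Kalman gain
        x += k * ((obs - f) - x)          # update with the new bias sample
        p *= (1 - k)
    return corrected

raw = [20.0, 21.0, 19.5, 22.0, 20.5]      # raw model temperatures (toy)
obs = [22.1, 23.0, 21.4, 24.2, 22.6]      # local truth runs ~2 degrees warmer
corr = kalman_bias_correct(raw, obs)
```

    After a few updates the filter has learned the local warm bias, so later corrected values sit much closer to the observations than the raw forecasts do.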

  3. Do probabilistic forecasts lead to better decisions?

    Directory of Open Access Journals (Sweden)

    M. H. Ramos

    2013-06-01

    Full Text Available The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started focusing attention on ways of communicating the probabilistic forecasts to decision-makers. Communicating probabilistic forecasts includes preparing tools and products for visualisation, but also requires understanding how decision-makers perceive and use uncertainty information in real time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision-makers. Answers were collected and analysed. In this paper, we present the results of this exercise and discuss if we indeed make better decisions on the basis of probabilistic forecasts.

  4. Evaluating machine-learning techniques for recruitment forecasting of seven North East Atlantic fish species

    KAUST Repository

    Fernandes, José Antonio

    2015-01-01

    The effect of different factors (spawning biomass, environmental conditions) on recruitment is a subject of great importance in the management of fisheries, recovery plans and scenario exploration. In this study, recently proposed supervised classification techniques, tested by the machine-learning community, are applied to forecast the recruitment of seven fish species of the North East Atlantic (anchovy, sardine, mackerel, horse mackerel, hake, blue whiting and albacore), using spawning, environmental and climatic data. In addition, the use of the probabilistic flexible naive Bayes classifier (FNBC) is proposed as a modelling approach in order to reduce uncertainty for fisheries management purposes. These improvements aim to improve the probability estimation of each possible outcome (low, medium and high recruitment) based on kernel density estimation, which is crucial for informed management decision-making under high uncertainty. Finally, a comparison between goodness-of-fit and generalization power is provided, in order to assess the reliability of the final forecasting models. It is found that in most cases the proposed methodology provides useful information for management, whereas the case of horse mackerel is an example of the limitations of the approach. The proposed improvements allow for a better probabilistic estimation of the different scenarios, i.e. they reduce the uncertainty in the provided forecasts.
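
    The kernel-density naive Bayes idea behind the FNBC can be sketched for a single hypothetical predictor and the three outcome classes used in the paper (low/medium/high recruitment); the training values, bandwidth and helper names here are illustrative only, not the paper's implementation:

```python
import math

def gauss_kde(points, h):
    """1-D Gaussian kernel density estimate with bandwidth h."""
    def pdf(x):
        return sum(math.exp(-0.5 * ((x - p) / h) ** 2)
                   for p in points) / (len(points) * h * math.sqrt(2 * math.pi))
    return pdf

def fit_kde_nb(samples_by_class, h=0.5):
    """samples_by_class: {label: [feature values]} -> posterior function."""
    n = sum(len(v) for v in samples_by_class.values())
    priors = {c: len(v) / n for c, v in samples_by_class.items()}
    dens = {c: gauss_kde(v, h) for c, v in samples_by_class.items()}
    def posterior(x):
        # Bayes rule with KDE class-conditional densities, normalized to sum to 1
        scores = {c: priors[c] * dens[c](x) for c in priors}
        z = sum(scores.values())
        return {c: s / z for c, s in scores.items()}
    return posterior

# Toy single-feature example (e.g. an environmental index):
train = {"low": [0.1, 0.3, 0.2], "medium": [1.0, 1.2, 0.9], "high": [2.0, 2.2, 2.1]}
post = fit_kde_nb(train)
probs = post(2.05)    # posterior over the three recruitment scenarios
```

    The point of the KDE step is that the three class probabilities come out calibrated against the observed density of each class, rather than from a rigid parametric assumption.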

  5. Accurate Short-Term Power Forecasting of Wind Turbines: The Case of Jeju Island’s Wind Farm

    OpenAIRE

    BeomJun Park; Jin Hur

    2017-01-01

    Short-term wind power forecasting is a technique which tells system operators how much wind power can be expected at a specific time. Due to the increasing penetration of wind generating resources into the power grids, short-term wind power forecasting is becoming an important issue for grid integration analysis. The high reliability of wind power forecasting can contribute to the successful integration of wind generating resources into the power grids. To guarantee the reliability of forecas...

  6. Human error probability evaluation as part of reliability analysis of digital protection system of advanced pressurized water reactor - APR 1400

    International Nuclear Information System (INIS)

    Varde, P. V.; Lee, D. Y.; Han, J. B.

    2003-03-01

    A case study on human reliability analysis has been performed as part of the reliability analysis of the digital protection system of the APR 1400 reactor, which automatically actuates the shutdown system of the reactor when demanded. However, the safety analysis takes credit for operator action as a diverse means of tripping the reactor for the, albeit low-probability, ATWS scenario. Based on the available information, two cases, viz., human error in tripping the reactor and calibration error for instrumentation in the protection system, have been analyzed. Wherever applicable, a parametric study has also been performed.

  7. Reliability Engineering

    International Nuclear Information System (INIS)

    Lee, Sang Yong

    1992-07-01

    This book is about reliability engineering. It describes the definition and importance of reliability; the development of reliability engineering; the failure rate and failure probability density function and their types; CFR and the exponential distribution; IFR, the normal distribution and the Weibull distribution; maintainability and availability; reliability testing and reliability estimation for the exponential, normal and Weibull distribution types; reliability sampling tests; system reliability; design for reliability; and functional failure analysis by FTA.
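
    The constant-failure-rate (CFR) and Weibull models listed in the book's contents can be sketched as standard reliability functions; the parameter values below are illustrative:

```python
import math

def reliability_exp(t, lam):
    """CFR model: R(t) = exp(-lambda * t)."""
    return math.exp(-lam * t)

def reliability_weibull(t, beta, eta):
    """Weibull model: R(t) = exp(-(t/eta)^beta).
    beta < 1: infant mortality; beta = 1: reduces to the exponential (CFR)
    case; beta > 1: wear-out (increasing failure rate, IFR)."""
    return math.exp(-((t / eta) ** beta))

def mttf_exp(lam):
    """Mean time to failure under a constant failure rate."""
    return 1.0 / lam

# Example: lambda = 1e-4 failures/hour over a 1000-hour mission.
r = reliability_exp(1000, 1e-4)
```

    With beta = 1 and eta = 1/lambda the Weibull curve coincides with the exponential one, which is a convenient sanity check on any implementation.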

  8. Global Grid of Probabilities of Urban Expansion to 2030

    Data.gov (United States)

    National Aeronautics and Space Administration — The Global Grid of Probabilities of Urban Expansion to 2030 presents spatially explicit probabilistic forecasts of global urban land cover change from 2000 to 2030...

  9. Estimating market probabilities of future interest rate changes

    OpenAIRE

    Hlušek, Martin

    2002-01-01

    The goal of this paper is to estimate the market consensus forecast of future monetary policy development and to quantify the priced-in probability of interest rate changes for different future time horizons. The proposed model uses the current spot money market yield curve and available money market derivative instruments (forward rate agreements, FRAs) and estimates the market probability of interest rate changes up to a 12-month horizon.
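
    In the simplest two-outcome case, the priced-in probability falls out of a forward rate as a probability-weighted average of the 'no change' and 'move' scenarios; a sketch of this textbook approximation (illustrative rates, not the paper's full yield-curve model):

```python
def implied_move_probability(forward_rate, rate_unchanged, rate_after_move):
    """Two-outcome approximation: the forward rate is the probability-weighted
    average of the two scenarios, so
        p = (f - r_unchanged) / (r_move - r_unchanged)."""
    p = (forward_rate - rate_unchanged) / (rate_after_move - rate_unchanged)
    return min(max(p, 0.0), 1.0)        # clip to a valid probability

# FRA implies 2.15%, policy rate is 2.00%, a hike would take it to 2.25%.
p_hike = implied_move_probability(0.0215, 0.0200, 0.0225)
```

    Here the market prices a 60% chance of the hike; richer models (as in the paper) extend this to several meetings and horizons along the curve.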

  10. Stochastic model of forecasting spare parts demand

    Directory of Open Access Journals (Sweden)

    Ivan S. Milojević

    2012-01-01

    hypothesis of the existence of phenomenon change trends, the next step in the forecasting methodology is the determination of a specific growth curve that describes the regularity of the development in time. These curves of growth are obtained by the analytical representation (expression) of dynamic series. There are two basic stages in this process: the choice of the type of curve whose shape corresponds to the character of the dynamic order variation, and the determination of the values (evaluation) of the curve parameters. The most widespread method of forecasting is trend extrapolation, whose basis is the continuation of past trends into the future. The simplicity of the trend extrapolation process, on the one hand, and the absence of other information, on the other, are the main reasons why trend extrapolation is used for forecasting. Trend extrapolation is founded on the following assumptions: the phenomenon's development can be presented as an evolutionary trajectory or trend, and the general conditions that influenced the trend development in the past will not undergo substantial changes in the future. Spare parts demand forecasting is constantly being done in all warehouses and workshops, and at all levels. Without demand forecasting, neither planning nor decision making can be done. Demand forecasting is the input for determining the level of reserve, the size of the order, ordering cycles, etc. The question that arises is that of the reliability and accuracy of a forecast and its effects. Forecasting 'by feeling' is not to be dismissed if there is nothing better, but in this case one must be prepared for forecasting failures that cause unnecessary accumulation of certain spare parts and a chronic shortage of others. All this significantly increases costs and does not provide a satisfactory supply of spare parts. The main problem of the application of this model is that each
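
    The trend-extrapolation step described above can be sketched with an ordinary least-squares linear trend continued past the sample; the monthly demand series is a toy example:

```python
def linear_trend(series):
    """Ordinary least squares fit y = a + b*t for t = 0..n-1."""
    n = len(series)
    t_mean = (n - 1) / 2
    y_mean = sum(series) / n
    b = (sum((t - t_mean) * (y - y_mean) for t, y in enumerate(series))
         / sum((t - t_mean) ** 2 for t in range(n)))
    a = y_mean - b * t_mean
    return a, b

def extrapolate(series, horizon):
    """Continue the fitted trend 'horizon' steps past the sample."""
    a, b = linear_trend(series)
    n = len(series)
    return [a + b * t for t in range(n, n + horizon)]

demand = [10, 12, 11, 13, 14, 15]          # monthly spare-part demand (toy)
forecast = extrapolate(demand, 3)
```

    The same scheme applies with any other growth curve (exponential, logistic) in place of the straight line; only the fitting step changes.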

  11. The multi temporal/multi-model approach to predictive uncertainty assessment in real-time flood forecasting

    Science.gov (United States)

    Barbetta, Silvia; Coccia, Gabriele; Moramarco, Tommaso; Brocca, Luca; Todini, Ezio

    2017-08-01

    This work extends the multi-temporal approach of the Model Conditional Processor (MCP-MT) to the multi-model case and to the four Truncated Normal Distributions (TNDs) approach, demonstrating the improvement on the single-temporal one. The study is framed in the context of probabilistic Bayesian decision-making that is appropriate to take rational decisions on uncertain future outcomes. As opposed to the direct use of deterministic forecasts, the probabilistic forecast identifies a predictive probability density function that represents a fundamental knowledge on future occurrences. The added value of MCP-MT is the identification of the probability that a critical situation will happen within the forecast lead-time and when, more likely, it will occur. MCP-MT is thoroughly tested for both single-model and multi-model configurations at a gauged site on the Tiber River, central Italy. The stages forecasted by two operative deterministic models, STAFOM-RCM and MISDc, are considered for the study. The dataset used for the analysis consists of hourly data from 34 flood events selected on a time series of six years. MCP-MT improves over the original models' forecasts: the peak overestimation and the rising limb delayed forecast, characterizing MISDc and STAFOM-RCM respectively, are significantly mitigated, with a reduced mean error on peak stage from 45 to 5 cm and an increased coefficient of persistence from 0.53 up to 0.75. The results show that MCP-MT outperforms the single-temporal approach and is potentially useful for supporting decision-making because the exceedance probability of hydrometric thresholds within a forecast horizon and the most probable flooding time can be estimated.

  12. Staged decision making based on probabilistic forecasting

    Science.gov (United States)

    Booister, Nikéh; Verkade, Jan; Werner, Micha; Cranston, Michael; Cumiskey, Lydia; Zevenbergen, Chris

    2016-04-01

    Flood forecasting systems reduce, but cannot eliminate, uncertainty about the future. Probabilistic forecasts explicitly show that uncertainty remains. However, compared to deterministic forecasts, a dimension is added ('probability' or 'likelihood'), and with this added dimension decision making becomes slightly more complicated. One technique of decision support is the cost-loss approach, which defines whether or not to issue a warning or implement mitigation measures (a risk-based method). With the cost-loss method, a warning is issued when the ratio of the response costs to the damage reduction is less than or equal to the probability of the possible flood event. This cost-loss method is not widely used, because it motivates decisions based only on economic values and is relatively static (no reasoning, a yes/no decision). Nevertheless, it has high potential to improve risk-based decision making based on probabilistic flood forecasting, because no other methods are known that deal with probabilities in decision making. The main aim of this research was to explore ways of making decision making based on probabilities with the cost-loss method better applicable in practice. The exploration began by identifying other situations in which decisions are taken based on uncertain forecasts or predictions. These cases spanned a range of degrees of uncertainty: from known uncertainty to deep uncertainty. Based on the types of uncertainties, concepts of dealing with situations and responses were analysed and possibly applicable concepts were chosen. Out of this analysis, the concepts of flexibility and robustness appeared to fit the existing method. Instead of taking big decisions with bigger consequences at once, the idea is that actions and decisions are cut up into smaller pieces, and the decision to implement is finally made based on the economic costs of decisions and measures and the reduced effect of flooding. The more lead-time there is in
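
    The cost-loss rule described above can be sketched in a few lines; the cost, loss and probability values are illustrative:

```python
def issue_warning(p_flood, cost, loss):
    """Cost-loss rule: act when the cost/loss ratio does not exceed
    the forecast probability of the event."""
    return (cost / loss) <= p_flood

def expected_expense(p_flood, cost, loss, warn):
    """Expected outlay of one decision: pay 'cost' if we act; otherwise
    pay the avoidable 'loss' with probability p_flood."""
    return cost if warn else p_flood * loss

p = 0.3                                    # forecast flood probability
warn = issue_warning(p, cost=20.0, loss=100.0)   # ratio 0.2 <= 0.3, so act
```

    Over many repeated decisions, following this rule minimizes the expected expense, which is exactly the sense in which probabilistic forecasts carry economic value.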

  13. Using extreme value theory approaches to forecast the probability of outbreak of highly pathogenic influenza in Zhejiang, China.

    Directory of Open Access Journals (Sweden)

    Jiangpeng Chen

    Full Text Available Influenza is a contagious disease with high transmissibility that spreads around the world with considerable morbidity and mortality, and presents an enormous burden on worldwide public health. Few mathematical models can be used because influenza incidence data are generally not normally distributed. We developed a mathematical model using Extreme Value Theory (EVT) to forecast the probability of outbreak of highly pathogenic influenza. The incidence data of highly pathogenic influenza in Zhejiang province from April 2009 to November 2013 were retrieved from the website of the Health and Family Planning Commission of Zhejiang Province. The MATLAB "VIEM" toolbox was used for data analysis and modelling. In the present work, we used the Peaks Over Threshold (POT) model, assuming the frequency to be a Poisson process and the intensity to be Pareto distributed, to characterize the temporal variability of the long-term extreme incidence of highly pathogenic influenza in Zhejiang, China. The skewness and kurtosis of the incidence of highly pathogenic influenza in Zhejiang between April 2009 and November 2013 were 4.49 and 21.12, which indicated a "fat tail" distribution. A QQ plot and a mean excess plot were used to further validate the features of the distribution. After determining the threshold, we modeled the extremes and estimated the shape and scale parameters by the maximum likelihood method. The results showed that months in which the incidence of highly pathogenic influenza is about 4462/2286/1311/487 are predicted to occur once every five/three/two/one years, respectively. Despite its simplicity, the present study successfully offers a sound modeling strategy and a methodological avenue to implement forecasting of an epidemic in the midst of its course.
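
    The POT return-level calculation described above can be sketched from the standard generalized Pareto return-level formula; the threshold, scale, shape and exceedance rate below are illustrative, not the paper's fitted values:

```python
import math

def pot_return_level(u, sigma, xi, rate, m):
    """Level exceeded on average once every m observations under a
    Peaks-Over-Threshold model: threshold u, GPD scale sigma and shape xi,
    'rate' = fraction of observations exceeding u."""
    if abs(xi) < 1e-12:                     # xi -> 0 limit: exponential tail
        return u + sigma * math.log(m * rate)
    return u + (sigma / xi) * ((m * rate) ** xi - 1.0)

# Illustrative parameters for a monthly incidence series:
# threshold 300 cases, 20% of months exceed it, heavy tail (xi = 0.4).
level_5yr = pot_return_level(u=300.0, sigma=150.0, xi=0.4, rate=0.2, m=60)
level_1yr = pot_return_level(u=300.0, sigma=150.0, xi=0.4, rate=0.2, m=12)
```

    Longer return periods map to higher incidence levels, which is how the paper reads off the once-per-one/two/three/five-year outbreak sizes from the fitted tail.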

  14. Long Range River Discharge Forecasting Using the Gravity Recovery and Climate Experiment (GRACE) Satellite to Predict Conditions for Endemic Cholera

    Science.gov (United States)

    Jutla, A.; Akanda, A. S.; Colwell, R. R.

    2014-12-01

    Prediction of the conditions of an impending disease outbreak remains a challenge, but is achievable if the associated large-scale hydroclimatic processes can be estimated in advance. Outbreaks of diarrheal diseases such as cholera are related to episodic seasonal variability in river discharge in regions where water and sanitation infrastructure are inadequate and insufficient. However, forecasting river discharge a few months in advance remains elusive where cholera outbreaks are frequent, probably due to the non-availability of geophysical data as well as transboundary water stresses. Here, we show that satellite-derived water storage from the Gravity Recovery and Climate Experiment (GRACE) sensors can provide reliable estimates of river discharge at least two months in advance over regional scales. Bayesian regression models predicted flooding and drought conditions, a prerequisite for cholera outbreaks, in the Bengal Delta with an overall accuracy of 70% for up to 60 days in advance, without using any other ancillary ground-based data. Forecasting of river discharge will have significant impacts on planning and designing intervention strategies for potential cholera outbreaks in the coastal regions where the disease remains endemic and often fatal.

  15. Forecaster’s utility and forecasts coherence

    DEFF Research Database (Denmark)

    Chini, Emilio Zanetti

    model to ease the statistical inference. A simulation study reveals that the test behaves consistently with the requirements of the theoretical literature. The locality of the scoring rule is fundamental to set dating algorithms to measure and forecast probability of recession in US business cycle...

  16. Statistical-Dynamical Seasonal Forecasts of Central-Southwest Asian Winter Precipitation.

    Science.gov (United States)

    Tippett, Michael K.; Goddard, Lisa; Barnston, Anthony G.

    2005-06-01

    Interannual precipitation variability in central-southwest (CSW) Asia has been associated with East Asian jet stream variability and western Pacific tropical convection. However, atmospheric general circulation models (AGCMs) forced by observed sea surface temperature (SST) poorly simulate the region's interannual precipitation variability. The statistical-dynamical approach uses statistical methods to correct systematic deficiencies in the response of AGCMs to SST forcing. Statistical correction methods linking model-simulated Indo-west Pacific precipitation and observed CSW Asia precipitation result in modest, but statistically significant, cross-validated simulation skill in the northeast part of the domain for the period from 1951 to 1998. The statistical-dynamical method is also applied to recent (winter 1998/99 to 2002/03) multimodel, two-tier December-March precipitation forecasts initiated in October. This period includes 4 yr (winter of 1998/99 to 2001/02) of severe drought. Tercile probability forecasts are produced using ensemble-mean forecasts and forecast error estimates. The statistical-dynamical forecasts show enhanced probability of below-normal precipitation for the four drought years and capture the return to normal conditions in part of the region during the winter of 2002/03. "May Kabul be without gold, but not without snow." (Traditional Afghan proverb)
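
    Tercile probability forecasts from an ensemble mean and a forecast error estimate can be sketched by integrating a Gaussian forecast distribution over the climatological tercile bins; the anomaly values below are illustrative, not the paper's numbers:

```python
import math

def normal_cdf(x, mu, sigma):
    """Gaussian CDF via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def tercile_probs(forecast_mean, forecast_err_sd, lower_terc, upper_terc):
    """Split a Gaussian forecast distribution at the climatological terciles,
    returning (P_below, P_normal, P_above)."""
    p_below = normal_cdf(lower_terc, forecast_mean, forecast_err_sd)
    p_above = 1.0 - normal_cdf(upper_terc, forecast_mean, forecast_err_sd)
    return p_below, 1.0 - p_below - p_above, p_above

# Standardized anomalies: the climatological terciles of a standard normal
# climate are about -0.43 and +0.43; a dry forecast shifts mass below-normal.
probs = tercile_probs(-0.8, 0.9, -0.43, 0.43)
```

    A forecast mean well below the lower tercile, as in the drought winters, yields an enhanced below-normal probability exactly as described above.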

  17. FORECASTING OF PERFORMANCE EVALUATION OF NEW VEHICLES

    Directory of Open Access Journals (Sweden)

    O. S. Krasheninin

    2016-12-01

    Full Text Available Purpose. The research work focuses on forecasting the performance characteristics of tractive and non-tractive vehicles that will satisfy the needs and requirements of the railway industry, which is constantly evolving. Methodology. Analysis of the technical condition of the existing fleet of rolling stock (tractive and non-tractive) of Ukrainian Railways shows a substantial reduction, which occurs in connection with its moral and physical wear and tear, as well as the insufficient and limited purchase of new units of tractive and non-tractive rolling stock in the desired quantity. In this situation it becomes necessary to search for methods to determine rolling stock technical characteristics. One such effective measure is to forecast the defining characteristics of the vehicles based on the processes of their reproduction under conditions of limited resources, using a continuous exponential function. The growth-rate function of the projected indicator for the vehicle follows a logistic characteristic, which with unlimited resources takes the form of an exponential and with scarce resources that of a straight line. Findings. The data obtained according to the proposed method allowed determining the expected (future) value, that is, the ratio of load to body volume for non-tractive rolling stock (gondola cars) and the weight-to-power ratio for tractive rolling stock, the degree of forecast reliability and the standard forecast error, which show high prediction accuracy for the completed procedure. As a result, this will allow estimating the required characteristics of vehicles in the forecast year with high accuracy. Originality. The concept of forecasting the characteristics of vehicles for decision-making on the evaluation of their prospects was proposed. Practical value. The forecasting methodology will reliably determine the technical parameters of tractive and non-tractive rolling stock, which will meet

  18. Global-warming forecasting models

    International Nuclear Information System (INIS)

    Moeller, K.P.

    1992-01-01

    In spite of an annual man-made quantity of about 20 billion tons, carbon dioxide has remained a trace gas in the atmosphere (350 ppm at present). The reliability of model calculations which forecast temperatures is discussed in view of the world-wide increase in carbon dioxide. Computer simulations reveal a general, serious threat to the future of mankind. (DG) [de

  19. Forecasting distribution of numbers of large fires

    Science.gov (United States)

    Eidenshink, Jeffery C.; Preisler, Haiganoush K.; Howard, Stephen; Burgan, Robert E.

    2014-01-01

    Systems to estimate forest fire potential commonly utilize one or more indexes that relate to expected fire behavior; however, they indicate neither the chance that a large fire will occur nor the expected number of large fires. That is, they do not quantify the probabilistic nature of fire danger. In this work we use large fire occurrence information from the Monitoring Trends in Burn Severity project, and satellite and surface observations of fuel conditions in the form of the Fire Potential Index, to estimate two aspects of fire danger: (1) the probability that a 1-acre ignition will result in a 100+ acre fire, and (2) the probabilities of having at least 1, 2, 3, or 4 large fires within a Predictive Services Area in the forthcoming week. These statistical procedures are the main thrust of the paper and are used to produce two daily national forecasts that are available from the U.S. Geological Survey, Earth Resources Observation and Science Center and via the Wildland Fire Assessment System. A validation study of our forecasts for the 2013 fire season demonstrated good agreement between observed and forecasted values.
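The record above does not give its statistical model, but probabilities of at least 1, 2, 3, or 4 events in a week are commonly derived from a count distribution. A minimal sketch under a Poisson assumption; the weekly rate below is a made-up value, not an output of the Fire Potential Index system:

```python
import math

def prob_at_least_k(lam, k):
    """P(N >= k) for a Poisson-distributed count N with mean lam."""
    cdf = sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(k))
    return 1.0 - cdf

# lam is a hypothetical expected number of large fires in one
# Predictive Services Area for the forthcoming week.
lam = 0.8
probs = {k: prob_at_least_k(lam, k) for k in (1, 2, 3, 4)}
```

The four probabilities are necessarily decreasing in k, which matches the nested "at least 1, 2, 3, or 4" formulation of the forecast.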

  20. Spatial Distribution of the Coefficient of Variation and Bayesian Forecast for the Paleo-Earthquakes in Japan

    Science.gov (United States)

    Nomura, Shunichi; Ogata, Yosihiko

    2016-04-01

    We propose a Bayesian method of probability forecasting for recurrent earthquakes of inland active faults in Japan. Renewal processes with the Brownian Passage Time (BPT) distribution are applied to over half of the active faults in Japan by the Headquarters for Earthquake Research Promotion (HERP) of Japan. Long-term forecasting with the BPT distribution needs two parameters: the mean and the coefficient of variation (COV) of the recurrence intervals. HERP applies a common COV parameter to all of these faults because most of them have very few specified paleoseismic events, not enough to estimate reliable COV values for the respective faults. However, different COV estimates have been proposed for the same paleoseismic catalog in related works. Applying different COV estimates can make a critical difference in the forecast, so the COV should be carefully selected for individual faults. Recurrence intervals on a fault are, on average, determined by the long-term slip rate caused by tectonic motion, but they fluctuate with nearby seismicity, which influences the surrounding stress field. The COVs of recurrence intervals depend on such stress perturbations and so have spatial trends due to the heterogeneity of tectonic motion and seismicity. We therefore introduce a spatial structure on the COV parameter by Bayesian modeling with a Gaussian process prior. The COVs on active faults are correlated and take similar values for closely located faults. We find that the spatial trends in the estimated COV values coincide with the density of active faults in Japan. We also show Bayesian forecasts by the proposed model using a Markov chain Monte Carlo method. Our forecasts differ from HERP's especially on the active faults where HERP's forecasts are very high or low.
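As a worked illustration of the BPT forecast described above, the conditional probability of rupture in a coming time window can be computed from the BPT (inverse Gaussian) density. The mean recurrence, COV, and elapsed time below are hypothetical, and the simple trapezoidal quadrature is an implementation choice, not HERP's procedure:

```python
import math

def bpt_pdf(t, mu, alpha):
    """Brownian Passage Time density with mean mu and coefficient of variation alpha."""
    if t <= 0:
        return 0.0
    return math.sqrt(mu / (2 * math.pi * alpha**2 * t**3)) * \
        math.exp(-(t - mu)**2 / (2 * mu * alpha**2 * t))

def bpt_cdf(t, mu, alpha, n=5000):
    """CDF via trapezoidal integration of the density over (0, t]."""
    if t <= 0:
        return 0.0
    h = t / n
    s = 0.5 * (bpt_pdf(0.0, mu, alpha) + bpt_pdf(t, mu, alpha))
    s += sum(bpt_pdf(i * h, mu, alpha) for i in range(1, n))
    return s * h

def conditional_prob(elapsed, horizon, mu, alpha):
    """P(rupture within `horizon` years | no rupture in the `elapsed` years so far)."""
    f0 = bpt_cdf(elapsed, mu, alpha)
    f1 = bpt_cdf(elapsed + horizon, mu, alpha)
    return (f1 - f0) / (1.0 - f0)

# Hypothetical fault: mean recurrence 1000 yr, COV 0.24, 800 yr since the last event.
p30 = conditional_prob(elapsed=800, horizon=30, mu=1000, alpha=0.24)
```

This is exactly why the COV matters: with the mean held fixed, changing alpha reshapes the hazard and can move the 30-year conditional probability substantially.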

  1. Statistical equivalence and test-retest reliability of delay and probability discounting using real and hypothetical rewards.

    Science.gov (United States)

    Matusiewicz, Alexis K; Carter, Anne E; Landes, Reid D; Yi, Richard

    2013-11-01

    Delay discounting (DD) and probability discounting (PD) refer to the reduction in the subjective value of outcomes as a function of delay and uncertainty, respectively. Elevated measures of discounting are associated with a variety of maladaptive behaviors, and confidence in the validity of these measures is imperative. The present research examined (1) the statistical equivalence of discounting measures when rewards were hypothetical or real, and (2) their 1-week reliability. While previous research has partially explored these issues using the low threshold of nonsignificant difference, the present study fully addressed this issue using the more-compelling threshold of statistical equivalence. DD and PD measures were collected from 28 healthy adults using real and hypothetical $50 rewards during each of two experimental sessions, one week apart. Analyses using area-under-the-curve measures revealed a general pattern of statistical equivalence, indicating equivalence of real/hypothetical conditions as well as 1-week reliability. Exceptions are identified and discussed. Copyright © 2013 Elsevier B.V. All rights reserved.
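The area-under-the-curve (AUC) measure used in the study is a normalized trapezoidal area over indifference points. A small sketch of the standard computation, assuming the series starts at delay 0 with the full reward value:

```python
def discounting_auc(delays, values, nominal):
    """Normalized area under the discounting curve.

    delays  -- delays (or odds-against, for probability discounting),
               starting at 0 where the value equals the nominal reward
    values  -- indifference points at each delay
    nominal -- undiscounted reward amount (e.g. a $50 reward)
    Returns a number in [0, 1]; smaller values mean steeper discounting.
    """
    x = [d / delays[-1] for d in delays]      # normalize delays to [0, 1]
    y = [v / nominal for v in values]         # normalize values to [0, 1]
    auc = 0.0
    for i in range(1, len(x)):
        auc += (x[i] - x[i - 1]) * (y[i] + y[i - 1]) / 2.0
    return auc
```

Because AUC is a single bounded summary per participant and condition, it is a convenient quantity on which to test both equivalence across real/hypothetical rewards and 1-week test-retest reliability.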

  2. Rebuttal of "Polar bear population forecasts: a public-policy forecasting audit"

    Science.gov (United States)

    Amstrup, Steven C.; Caswell, Hal; DeWeaver, Eric; Stirling, Ian; Douglas, David C.; Marcot, Bruce G.; Hunter, Christine M.

    2009-01-01

    Observed declines in the Arctic sea ice have resulted in a variety of negative effects on polar bears (Ursus maritimus). Projections for additional future declines in sea ice resulted in a proposal to list polar bears as a threatened species under the United States Endangered Species Act. To provide information for the Department of the Interior's listing-decision process, the US Geological Survey (USGS) produced a series of nine research reports evaluating the present and future status of polar bears throughout their range. In response, Armstrong et al. [Armstrong, J. S., K. C. Green, W. Soon. 2008. Polar bear population forecasts: A public-policy forecasting audit. Interfaces 38(5) 382–405], which we will refer to as AGS, performed an audit of two of these nine reports. AGS claimed that the general circulation models upon which the USGS reports relied were not valid forecasting tools, that USGS researchers were not objective or lacked independence from policy decisions, that they did not utilize all available information in constructing their forecasts, and that they violated numerous principles of forecasting espoused by AGS. AGS (p. 382) concluded that the two USGS reports were "unscientific and inconsequential to decision makers." We evaluate the AGS audit and show how AGS are mistaken or misleading on every claim. We provide evidence that general circulation models are useful in forecasting future climate conditions and that corporate and government leaders are relying on these models to do so. We clarify the strict independence of the USGS from the listing decision. We show that the allegations of failure to follow the principles of forecasting espoused by AGS are either incorrect or are based on misconceptions about the Arctic environment, polar bear biology, or statistical and mathematical methods. We conclude by showing that the AGS principles of forecasting are too ambiguous and subjective to be used as a reliable basis for auditing scientific

  3. Evaluation of the fast orthogonal search method for forecasting chloride levels in the Deltona groundwater supply (Florida, USA)

    Science.gov (United States)

    El-Jaat, Majda; Hulley, Michael; Tétreault, Michel

    2018-02-01

    Despite the broad impact and importance of saltwater intrusion in coastal aquifers, little research has been directed towards forecasting saltwater intrusion in areas where the source of saltwater is uncertain. Saline contamination in inland groundwater supplies is a concern for numerous communities in the southern US including the city of Deltona, Florida. Furthermore, conventional numerical tools for forecasting saltwater contamination are heavily dependent on reliable characterization of the physical characteristics of underlying aquifers, information that is often absent or challenging to obtain. To overcome these limitations, a reliable alternative data-driven model for forecasting salinity in a groundwater supply was developed for Deltona using the fast orthogonal search (FOS) method. FOS was applied on monthly water-demand data and corresponding chloride concentrations at water supply wells. Groundwater salinity measurements from Deltona water supply wells were applied to evaluate the forecasting capability and accuracy of the FOS model. Accurate and reliable groundwater salinity forecasting is necessary to support effective and sustainable coastal-water resource planning and management. The available (27) water supply wells for Deltona were randomly split into three test groups for the purposes of FOS model development and performance assessment. Based on four performance indices (RMSE, RSR, NSEC, and R), the FOS model proved to be a reliable and robust forecaster of groundwater salinity. FOS is relatively inexpensive to apply, is not based on rigorous physical characterization of the water supply aquifer, and yields reliable estimates of groundwater salinity in active water supply wells.

  4. UQ for Decision Making: How (at least five) Kinds of Probability Might Come Into Play

    Science.gov (United States)

    Smith, L. A.

    2013-12-01

    In 1959 I. J. Good published the discussion "Kinds of Probability" in Science. Good identified (at least) five kinds. The need for (at least) a sixth kind of probability when quantifying uncertainty in the context of climate science is discussed. This discussion brings out the differences between weather-like forecasting tasks and climate-like tasks, with a focus on the effective use both of science and of modelling in support of decision making. Good also introduced the idea of a "dynamic probability": a probability one expects to change without any additional empirical evidence; the probabilities assigned by a chess-playing program when it is only half through its analysis are an example. This case is contrasted with that of "mature probabilities", where a forecast algorithm (or model) has converged on its asymptotic probabilities and the question hinges on whether or not those probabilities are expected to change significantly before the event in question occurs, even in the absence of new empirical evidence. If so, then how might one rationally report and deploy such immature probabilities in scientific support of decision-making? Mature probability is suggested as a useful sixth kind; although Good would doubtless argue that we can get by with just one, effective communication with decision makers may be enhanced by speaking as if the others existed. This again highlights the distinction between weather-like contexts and climate-like contexts. In the former one has access to a relevant climatology (a relevant, arguably informative distribution prior to any model simulations); in the latter that information is not available, although one can fall back on the scientific basis upon which the model itself rests and estimate the probability that the model output is in fact misinformative. This subjective "probability of a big surprise" is one way to communicate the probability of model-based information holding in practice, the probability that the

  5. Enhanced short-term wind power forecasting and value to grid operations. The wind forecasting improvement project

    Energy Technology Data Exchange (ETDEWEB)

    Orwig, Kirsten D. [National Renewable Energy Laboratory (NREL), Golden, CO (United States). Transmission Grid Integration; Benjamin, Stan; Wilczak, James; Marquis, Melinda [National Oceanic and Atmospheric Administration, Boulder, CO (United States). Earth System Research Lab.; Stern, Andrew [National Oceanic and Atmospheric Administration, Silver Spring, MD (United States); Clark, Charlton; Cline, Joel [U.S. Department of Energy, Washington, DC (United States). Wind and Water Power Program; Finley, Catherine [WindLogics, Grand Rapids, MN (United States); Freedman, Jeffrey [AWS Truepower, Albany, NY (United States)

    2012-07-01

    The current state-of-the-art wind power forecasting in the 0- to 6-h timeframe has levels of uncertainty that are adding increased costs and risks to the U.S. electrical grid. It is widely recognized within the electrical grid community that improvements to these forecasts could greatly reduce the costs and risks associated with integrating higher penetrations of wind energy. The U.S. Department of Energy has sponsored a research campaign in partnership with the National Oceanic and Atmospheric Administration (NOAA) and private industry to foster improvements in wind power forecasting. The research campaign involves a three-pronged approach: (1) a one-year field measurement campaign within two regions; (2) enhancement of NOAA's experimental 3-km High-Resolution Rapid Refresh (HRRR) model by assimilating the data from the field campaign; and (3) evaluation of the economic and reliability benefits of improved forecasts to grid operators. This paper and presentation provide an overview of the regions selected, instrumentation deployed, data quality and control, assimilation of data into HRRR, and preliminary results of HRRR performance analysis. (orig.)

  6. Forecast of reliability for mechanical components subjected to wearing; Pronostico de la fiabilidad de componentes mecanicos sometidos a desgaste

    Energy Technology Data Exchange (ETDEWEB)

    Angulo-Zevallos, J.; Castellote-Varona, C.; Alanbari, M.

    2010-07-01

    Generally, improving the quality and price of products, achieving complete customer satisfaction, and attaining excellence in all processes are among the challenges currently faced by every company. To meet them, the reliability of a component must be known at regular intervals, which requires research that contributes clear ideas and offers a methodology for assessing the parameters involved in the reliability calculation. A parameter closely related to this concept is the probability of product failure as a function of operating time. It is known that mechanical components fail by creep, fatigue, wear, corrosion, etc. This article proposes a methodology for finding the reliability of a component subject to wear, such as brake pads, grinding wheels, and the brake linings of clutch discs. (Author)

  7. A multivariate time series approach to modeling and forecasting demand in the emergency department.

    Science.gov (United States)

    Jones, Spencer S; Evans, R Scott; Allen, Todd L; Thomas, Alun; Haug, Peter J; Welch, Shari J; Snow, Gregory L

    2009-02-01

    The goals of this investigation were to study the temporal relationships between the demands for key resources in the emergency department (ED) and the inpatient hospital, and to develop multivariate forecasting models. Hourly data were collected from three diverse hospitals for the year 2006. Descriptive analysis and model fitting were carried out using graphical and multivariate time series methods. Multivariate models were compared to a univariate benchmark model in terms of their ability to provide out-of-sample forecasts of ED census and the demands for diagnostic resources. Descriptive analyses revealed little temporal interaction between the demand for inpatient resources and the demand for ED resources at the facilities considered. Multivariate models provided more accurate forecasts of ED census and of the demands for diagnostic resources. Our results suggest that multivariate time series models can be used to reliably forecast ED patient census; however, forecasts of the demands for diagnostic resources were not sufficiently reliable to be useful in the clinical setting.

  8. Bayesian methods in reliability

    Science.gov (United States)

    Sander, P.; Badoux, R.

    1991-11-01

    The present proceedings from a course on Bayesian methods in reliability encompasses Bayesian statistical methods and their computational implementation, models for analyzing censored data from nonrepairable systems, the traits of repairable systems and growth models, the use of expert judgment, and a review of the problem of forecasting software reliability. Specific issues addressed include the use of Bayesian methods to estimate the leak rate of a gas pipeline, approximate analyses under great prior uncertainty, reliability estimation techniques, and a nonhomogeneous Poisson process. Also addressed are the calibration sets and seed variables of expert judgment systems for risk assessment, experimental illustrations of the use of expert judgment for reliability testing, and analyses of the predictive quality of software-reliability growth models such as the Weibull order statistics.

  9. Lady luck the theory of probability

    CERN Document Server

    Weaver, Warren

    1982-01-01

    "Should I take my umbrella?" "Should I buy insurance?" "Which horse should I bet on?" Every day, in business, in love affairs, in forecasting the weather or the stock market, questions arise which cannot be answered by a simple "yes" or "no." Many of these questions involve probability. Probabilistic thinking is as crucially important in ordinary affairs as it is in the most abstruse realms of science. This book is the best nontechnical introduction to probability ever written. Its author, the late Dr. Warren Weaver, was a professor of mathematics, active in the Rockefeller and Sloa

  10. Long-range forecasting of intermittent streamflow

    Science.gov (United States)

    van Ogtrop, F. F.; Vervoort, R. W.; Heller, G. Z.; Stasinopoulos, D. M.; Rigby, R. A.

    2011-11-01

    Long-range forecasting of intermittent streamflow in semi-arid Australia poses a number of major challenges. One of the challenges relates to modelling zero, skewed, non-stationary, and non-linear data. To address this, a statistical model to forecast streamflow up to 12 months ahead is applied to five semi-arid catchments in South Western Queensland. The model uses logistic regression through Generalised Additive Models for Location, Scale and Shape (GAMLSS) to determine the probability of flow occurring in any of the systems. We then use the same regression framework in combination with a right-skewed distribution, the Box-Cox t distribution, to model the intensity (depth) of the non-zero streamflows. Time, seasonality and climate indices, describing the Pacific and Indian Ocean sea surface temperatures, are tested as covariates in the GAMLSS model to make probabilistic 6 and 12-month forecasts of the occurrence and intensity of streamflow. The output reveals that in the study region the occurrence and variability of flow is driven by sea surface temperatures and therefore forecasts can be made with some skill.
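The two-part structure described above (a logistic model for flow occurrence plus a conditional model for flow depth) can be sketched as follows. The coefficients and the single SST covariate are illustrative stand-ins, not the fitted GAMLSS terms from the paper:

```python
import math

def occurrence_prob(sst_index, b0=-0.5, b1=-1.2):
    """Logistic model for P(flow > 0) given a sea surface temperature index.

    b0 and b1 are hypothetical coefficients chosen for illustration only.
    """
    eta = b0 + b1 * sst_index
    return 1.0 / (1.0 + math.exp(-eta))

def expected_flow(sst_index, conditional_mean_mm):
    """Unconditional expected flow: P(occurrence) x E[depth | flow occurs].

    In the paper the conditional depth follows a Box-Cox t distribution;
    here only its mean is used, as a placeholder.
    """
    return occurrence_prob(sst_index) * conditional_mean_mm
```

The split matters for intermittent rivers: the zeros are handled by the occurrence model, so the intensity model only ever sees strictly positive, skewed flow depths.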

  11. Long-range forecasting of intermittent streamflow

    Directory of Open Access Journals (Sweden)

    F. F. van Ogtrop

    2011-11-01

    Full Text Available Long-range forecasting of intermittent streamflow in semi-arid Australia poses a number of major challenges. One of the challenges relates to modelling zero, skewed, non-stationary, and non-linear data. To address this, a statistical model to forecast streamflow up to 12 months ahead is applied to five semi-arid catchments in South Western Queensland. The model uses logistic regression through Generalised Additive Models for Location, Scale and Shape (GAMLSS) to determine the probability of flow occurring in any of the systems. We then use the same regression framework in combination with a right-skewed distribution, the Box-Cox t distribution, to model the intensity (depth) of the non-zero streamflows. Time, seasonality and climate indices, describing the Pacific and Indian Ocean sea surface temperatures, are tested as covariates in the GAMLSS model to make probabilistic 6 and 12-month forecasts of the occurrence and intensity of streamflow. The output reveals that in the study region the occurrence and variability of flow is driven by sea surface temperatures and therefore forecasts can be made with some skill.

  12. Risk Analysis of Reservoir Flood Routing Calculation Based on Inflow Forecast Uncertainty

    Directory of Open Access Journals (Sweden)

    Binquan Li

    2016-10-01

    Full Text Available Possible risks in reservoir flood control and regulation cannot be objectively assessed by deterministic flood forecasts, resulting in the probability of reservoir failure. We demonstrated a risk analysis of reservoir flood routing calculation accounting for inflow forecast uncertainty in a sub-basin of Huaihe River, China. The Xinanjiang model was used to provide deterministic flood forecasts, and was combined with the Hydrologic Uncertainty Processor (HUP) to quantify reservoir inflow uncertainty in the probability density function (PDF) form. Furthermore, the PDFs of reservoir water level (RWL) and the risk rate of RWL exceeding a defined safety control level could be obtained. Results suggested that the median forecast (50th percentiles) of HUP showed better agreement with observed inflows than the Xinanjiang model did in terms of the performance measures of flood process, peak, and volume. In addition, most observations (77.2%) were bracketed by the uncertainty band of the 90% confidence interval, with some small exceptions of high flows. Results proved that this framework of risk analysis could provide not only the deterministic forecasts of inflow and RWL, but also the fundamental uncertainty information (e.g., the 90% confidence band) for the reservoir flood routing calculation.

  13. Short-term Probabilistic Load Forecasting with the Consideration of Human Body Amenity

    Directory of Open Access Journals (Sweden)

    Ning Lu

    2013-02-01

    Full Text Available Load forecasting is the basis of power system planning and design. It is important for the economic operation and reliability assurance of the power system. However, the results of load forecasting given by most existing methods are deterministic. This study aims at probabilistic load forecasting. First, support vector machine regression is used to acquire the deterministic results of load forecasting with the consideration of human body amenity. Then the probabilistic load forecast at a certain confidence level is given after analysis of the error distribution law corresponding to a certain heat index interval. The final simulation shows that this probabilistic forecasting method is easy to implement and can provide more information than deterministic forecasting results, and is thus helpful for decision-makers to make reasonable decisions.

  14. Probabilistic forecasting of wind power generation using extreme learning machine

    DEFF Research Database (Denmark)

    Wan, Can; Xu, Zhao; Pinson, Pierre

    2014-01-01

    Accurate and reliable forecast of wind power is essential to power system operation and control. However, due to the nonstationarity of wind power series, traditional point forecasting can hardly be accurate, leading to increased uncertainties and risks for system operation. This paper proposes an extreme learning machine (ELM)-based probabilistic forecasting method for wind power generation. To account for the uncertainties in the forecasting results, several bootstrap methods have been compared for modeling the regression uncertainty, based on which the pairs bootstrap method is identified with the best performance. Consequently, a new method for prediction intervals formulation based on the ELM and the pairs bootstrap is developed. Wind power forecasting has been conducted in different seasons using the proposed approach with the historical wind power time series as the inputs alone. The results
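The pairs bootstrap named in the record resamples input-output pairs with replacement and refits the model on each resample. A minimal sketch using ordinary least squares as a stand-in for the ELM; the percentile band it produces reflects only model-fit uncertainty, whereas full prediction intervals would also add a noise term:

```python
import random

def fit_line(pts):
    """Ordinary least squares slope/intercept for a list of (x, y) pairs."""
    n = len(pts)
    mx = sum(x for x, _ in pts) / n
    my = sum(y for _, y in pts) / n
    sxx = sum((x - mx) ** 2 for x, _ in pts)
    sxy = sum((x - mx) * (y - my) for x, y in pts)
    b = sxy / sxx
    return my - b * mx, b

def pairs_bootstrap_band(pts, x_new, n_boot=2000, alpha=0.1, seed=1):
    """(1 - alpha) percentile band at x_new: resample (x, y) pairs with
    replacement, refit the model, and take quantiles of the predictions."""
    rng = random.Random(seed)
    preds = []
    for _ in range(n_boot):
        sample = [rng.choice(pts) for _ in pts]
        try:
            a, b = fit_line(sample)
        except ZeroDivisionError:   # degenerate resample with all x equal
            continue
        preds.append(a + b * x_new)
    preds.sort()
    lo = preds[int(alpha / 2 * len(preds))]
    hi = preds[int((1 - alpha / 2) * len(preds)) - 1]
    return lo, hi
```

Swapping `fit_line` for any regression routine (an ELM included) leaves the bootstrap loop unchanged, which is why the comparison of bootstrap variants in the record is model-agnostic.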

  15. On the relation between forecast precision and trading profitability of financial analysts

    DEFF Research Database (Denmark)

    Marinelli, Carlo; Weissensteiner, Alex

    2014-01-01

    We analyze the relation between earnings forecast accuracy and the expected profitability of financial analysts. Modeling forecast errors with a multivariate normal distribution, a complete characterization of the payoff of each analyst is provided. In particular, closed-form expressions for the probability density function, for the expectation, and, more generally, for moments of all orders are obtained. Our analysis shows that the relationship between forecast precision and trading profitability need not be monotonic, and that the impact of the correlation between the forecasts on the expected

  16. Flood Forecasting in River System Using ANFIS

    International Nuclear Information System (INIS)

    Ullah, Nazrin; Choudhury, P.

    2010-01-01

    The aim of the present study is to investigate the applicability of artificial intelligence techniques such as ANFIS (Adaptive Neuro-Fuzzy Inference System) in forecasting flood flow in a river system. The proposed technique combines the learning ability of a neural network with the transparent linguistic representation of a fuzzy system. The technique is applied to forecast discharge at a downstream station using flow information from various upstream stations. A total of three years of data was selected for the implementation of this model. ANFIS models with various input structures and membership functions were constructed, trained and tested to evaluate the efficiency of the models. Statistical indices such as Root Mean Square Error (RMSE), Correlation Coefficient (CORR) and Coefficient of Efficiency (CE) are used to evaluate the performance of the ANFIS models in forecasting river floods. The values of the indices show that the ANFIS model can accurately and reliably be used to forecast floods in a river system.

  17. Approach for an integral power transformer reliability model

    NARCIS (Netherlands)

    Schijndel, van A.; Wouters, P.A.A.F.; Steennis, E.F.; Wetzer, J.M.

    2012-01-01

    In electrical power transmission and distribution networks power transformers represent a crucial group of assets both in terms of reliability and investments. In order to safeguard the required quality at acceptable costs, decisions must be based on a reliable forecast of future behaviour. The aim

  18. Robust Forecasting of Non-Stationary Time Series

    NARCIS (Netherlands)

    Croux, C.; Fried, R.; Gijbels, I.; Mahieu, K.

    2010-01-01

    This paper proposes a robust forecasting method for non-stationary time series. The time series is modelled using non-parametric heteroscedastic regression, and fitted by a localized MM-estimator, combining high robustness and large efficiency. The proposed method is shown to produce reliable

  19. Application of the Fokker-Plank-Kolmogorov equation for affluence forecast to hydropower reservoirs (Betania Case)

    International Nuclear Information System (INIS)

    Dominguez Calle, Efrain Antonio

    2004-01-01

    This paper shows a modeling technique to forecast probability density curves for the flows that represent the monthly affluence to hydropower reservoirs. Briefly, it points out the factors that require affluence forecasts in terms of probabilities, the range of existing forecast methods, and the contradiction between those techniques and the real requirements of decision-making procedures. The mentioned contradiction is resolved by applying the Fokker-Planck-Kolmogorov equation, which describes the time evolution of a stochastic process that can be considered Markovian. We show the numerical scheme for this equation, its initial and boundary conditions, and the results of its application to the case of the Betania reservoir

  20. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  1. Results of the Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California.

    Science.gov (United States)

    Lee, Ya-Ting; Turcotte, Donald L; Holliday, James R; Sachs, Michael K; Rundle, John B; Chen, Chien-Chih; Tiampo, Kristy F

    2011-10-04

    The Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California was the first competitive evaluation of forecasts of future earthquake occurrence. Participants submitted expected probabilities of occurrence of M ≥ 4.95 earthquakes in 0.1° × 0.1° cells for the period January 1, 2006, to December 31, 2010. Probabilities were submitted for 7,682 cells in California and adjacent regions. During this period, 31 M ≥ 4.95 earthquakes occurred in the test region. These earthquakes occurred in 22 test cells. This seismic activity was dominated by earthquakes associated with the M = 7.2, April 4, 2010, El Mayor-Cucapah earthquake in northern Mexico. This earthquake occurred in the test region, and 16 of the other 30 earthquakes in the test region could be associated with it. Nine complete forecasts were submitted by six participants. In this paper, we present the forecasts in a way that allows the reader to evaluate which forecast is the most "successful" in terms of the locations of future earthquakes. We conclude that the RELM test was a success and suggest ways in which the results can be used to improve future forecasts.
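Gridded forecasts of this kind are typically scored cell by cell under a Poisson assumption (the likelihood statistic used in RELM/CSEP-style evaluations). A minimal sketch; the guard value for empty cells is an implementation choice, not part of the test definition:

```python
import math

def poisson_loglik(expected, observed):
    """Joint log-likelihood of observed earthquake counts per cell, given
    forecast expected counts, assuming independent Poisson counts per cell."""
    ll = 0.0
    for lam, n in zip(expected, observed):
        lam = max(lam, 1e-12)                      # guard cells forecast as empty
        ll += n * math.log(lam) - lam - math.lgamma(n + 1)
    return ll
```

A forecast that concentrates its expected rate where the earthquakes actually occurred scores a higher joint log-likelihood than one that spreads it elsewhere, which is the sense in which one submission is more "successful" than another.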

  2. Satellite based Ocean Forecasting, the SOFT project

    Science.gov (United States)

    Stemmann, L.; Tintoré, J.; Moneris, S.

    2003-04-01

    The knowledge of future oceanic conditions would have enormous impact on human marine-related activities. For such reasons, a number of international efforts are being carried out to obtain reliable and manageable ocean forecasting systems. Among the possible techniques that can be used to estimate the near-future states of the ocean, an ocean forecasting system based on satellite imagery is being developed through the Satellite-based Ocean ForecasTing (SOFT) project. SOFT, established by the European Commission, considers the development of a forecasting system for ocean space-time variability based on satellite data, using Artificial Intelligence techniques. This system will be merged with numerical simulation approaches, via assimilation techniques, to get a hybrid SOFT-numerical forecasting system of improved performance. The results of the project will provide efficient forecasting of sea-surface temperature structures, currents, dynamic height, and biological activity associated with chlorophyll fields. All these quantities could give valuable information for the planning and management of human activities in marine environments such as navigation, fisheries, pollution control, or coastal management. A detailed identification of present and new needs, and of potential end-users concerned by such an operational tool, is being performed. The project will study solutions adapted to these specific needs.

  3. A Bayesian Method for Short-Term Probabilistic Forecasting of Photovoltaic Generation in Smart Grid Operation and Control

    Directory of Open Access Journals (Sweden)

    Gabriella Ferruzzi

    2013-02-01

    Full Text Available A new short-term probabilistic forecasting method is proposed to predict the probability density function of the hourly active power generated by a photovoltaic system. First, the probability density function of the hourly clearness index is forecasted using a Bayesian autoregressive time series model; the model takes into account the dependence of solar radiation on meteorological variables such as cloud cover and humidity. Then, a Monte Carlo simulation procedure is used to evaluate the predictive probability density function of the hourly active power by applying the photovoltaic system model to random samples of the clearness index distribution. A numerical application demonstrates the effectiveness and advantages of the proposed forecasting method.
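The Monte Carlo step described above can be sketched as follows. The Beta distribution stands in for the forecasted clearness-index PDF, and the toy PV model (rated power, clear-sky fraction) is an illustrative assumption, not the paper's fitted Bayesian model:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical predictive distribution for the hourly clearness index k_t.
# The paper derives this from a Bayesian autoregressive model; Beta(5, 2)
# here is purely illustrative.
k_t = rng.beta(5.0, 2.0, size=10_000)

# Toy PV system model: power proportional to the clearness index,
# clipped at the rated power. Parameters are assumptions.
rated_power_kw = 5.0
clear_sky_fraction = 0.9
power_kw = np.clip(rated_power_kw * clear_sky_fraction * k_t, 0.0, rated_power_kw)

# Empirical predictive density of hourly active power via a histogram,
# summarized by a few quantiles.
density, edges = np.histogram(power_kw, bins=50, density=True)
p10, p50, p90 = np.percentile(power_kw, [10, 50, 90])
print(f"P10={p10:.2f} kW, P50={p50:.2f} kW, P90={p90:.2f} kW")
```

Propagating samples (rather than moments) through the PV model is what lets the method produce a full predictive density instead of a point forecast.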

  4. Probabilistic Forecast of Wind Power Generation by Stochastic Differential Equation Models

    KAUST Repository

    Elkantassi, Soumaya

    2017-04-01

    Reliable forecasting of wind power generation is crucial to optimal control of costs in generation of electricity with respect to the electricity demand. Here, we propose and analyze stochastic wind power forecast models described by parametrized stochastic differential equations, which introduce appropriate fluctuations in numerical forecast outputs. We use an approximate maximum likelihood method to infer the model parameters taking into account the time correlated sets of data. Furthermore, we study the validity and sensitivity of the parameters for each model. We applied our models to Uruguayan wind power production as determined by historical data and corresponding numerical forecasts for the period of March 1 to May 31, 2016.
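A minimal sketch of the kind of SDE-based fluctuation model described above, simulated with the Euler-Maruyama scheme: a mean-reverting process around a numerical forecast. The sinusoidal "forecast" and the parameters theta and sigma are illustrative assumptions, not the inferred values of the paper's models:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a numerical wind power forecast (normalized to [0, 1]).
hours = np.arange(48)
forecast = 0.5 + 0.3 * np.sin(2 * np.pi * hours / 24)

# Mean-reverting SDE around the forecast p_t, via Euler-Maruyama:
#   dX_t = theta * (p_t - X_t) dt + sigma * dW_t
# theta (reversion speed) and sigma (noise scale) are illustrative values.
theta, sigma, dt = 0.5, 0.05, 1.0
n_paths = 1000
paths = np.empty((n_paths, len(hours)))
paths[:, 0] = forecast[0]
for t in range(1, len(hours)):
    drift = theta * (forecast[t] - paths[:, t - 1]) * dt
    noise = sigma * np.sqrt(dt) * rng.standard_normal(n_paths)
    paths[:, t] = np.clip(paths[:, t - 1] + drift + noise, 0.0, 1.0)

# The path ensemble yields a probabilistic forecast band.
lo, hi = np.percentile(paths, [5, 95], axis=0)
```

In the paper's setting, the drift and diffusion parameters would instead be inferred from historical forecast-production pairs by approximate maximum likelihood.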

  5. An Integrated Modeling Approach for Forecasting Long-Term Energy Demand in Pakistan

    OpenAIRE

    Syed Aziz Ur Rehman; Yanpeng Cai; Rizwan Fazal; Gordhan Das Walasai; Nayyar Hussain Mirjat

    2017-01-01

    Energy planning and policy development require an in-depth assessment of energy resources and long-term demand forecast estimates. Pakistan, unfortunately, lacks reliable data on its energy resources and does not have dependable long-term energy demand forecasts. As a result, policy makers have been unable to formulate an effective energy policy in the country's history. Energy demand forecasting has attained greater attention than ever in the perspective of a growing population and diminishing fo...

  6. Forecasting metal prices: Do forecasters herd?

    DEFF Research Database (Denmark)

    Pierdzioch, C.; Rulke, J. C.; Stadtmann, G.

    2013-01-01

    We analyze more than 20,000 forecasts of nine metal prices at four different forecast horizons. We document that forecasts are heterogeneous and report that anti-herding appears to be a source of this heterogeneity. Forecaster anti-herding reflects strategic interactions among forecasters...

  7. Meta-heuristic CRPS minimization for the calibration of short-range probabilistic forecasts

    Science.gov (United States)

    Mohammadi, Seyedeh Atefeh; Rahmani, Morteza; Azadi, Majid

    2016-08-01

    This paper deals with probabilistic short-range temperature forecasts over synoptic meteorological stations across Iran using non-homogeneous Gaussian regression (NGR). NGR creates a Gaussian forecast probability density function (PDF) from the ensemble output. The mean of the normal predictive PDF is a bias-corrected weighted average of the ensemble members, and its variance is a linear function of the raw ensemble variance. The coefficients for the mean and variance are estimated by minimizing the continuous ranked probability score (CRPS), a scoring rule for distributional forecasts, over a training period. Gneiting et al. (Mon Weather Rev 133:1098-1118, 2005) used the Broyden-Fletcher-Goldfarb-Shanno (BFGS) method to minimize the CRPS. Since BFGS is a conventional optimization method with its own limitations, we suggest using particle swarm optimization (PSO), a robust meta-heuristic method, instead. The ensemble prediction system used in this study consists of nine different configurations of the Weather Research and Forecasting model for 48-h forecasts of temperature during autumn and winter 2011 and 2012. The probabilistic forecasts were evaluated using several common verification scores, including the Brier score, the attributes diagram and the rank histogram. Results show that both BFGS and PSO find the optimal solution and yield the same evaluation scores, but PSO does so from a feasible random first guess and with much less computational complexity.
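The core of the NGR calibration is a derivative-free minimization of the closed-form Gaussian CRPS over the four regression coefficients. The sketch below uses synthetic training pairs and plain random search as a minimal stand-in for the paper's PSO (both are derivative-free meta-heuristics, but PSO adds swarm dynamics omitted here); all data and bounds are illustrative:

```python
import numpy as np
from math import erf, exp, sqrt, pi

def crps_gaussian(mu, sigma, y):
    # Closed-form CRPS of a Gaussian predictive PDF (as in Gneiting et al. 2005).
    z = (y - mu) / sigma
    pdf = exp(-0.5 * z * z) / sqrt(2 * pi)
    cdf = 0.5 * (1.0 + erf(z / sqrt(2)))
    return sigma * (z * (2 * cdf - 1) + 2 * pdf - 1 / sqrt(pi))

rng = np.random.default_rng(1)
# Synthetic training pairs: ensemble mean/variance and verifying observations,
# biased and underdispersive by construction.
ens_mean = rng.normal(15.0, 5.0, 200)
ens_var = rng.uniform(0.5, 2.0, 200)
obs = ens_mean + 1.0 + rng.normal(0.0, 1.2, 200)

def mean_crps(params):
    # NGR predictive PDF: mean a + b*ens_mean, variance c + d*ens_var.
    a, b, c, d = params
    mu = a + b * ens_mean
    sigma = np.sqrt(np.maximum(c + d * ens_var, 1e-6))
    return float(np.mean([crps_gaussian(m, s, y)
                          for m, s, y in zip(mu, sigma, obs)]))

best = np.array([0.0, 1.0, 0.0, 1.0])    # start from the raw ensemble
best_score = mean_crps(best)
for _ in range(2000):
    cand = rng.uniform([-3.0, 0.5, 0.0, 0.0], [3.0, 1.5, 3.0, 2.0])
    score = mean_crps(cand)
    if score < best_score:
        best, best_score = cand, score
```

Because the CRPS of a Gaussian has this closed form, each candidate parameter set is cheap to score, which is what makes population-based searches such as PSO practical here.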

  8. Technical note: Combining quantile forecasts and predictive distributions of streamflows

    Science.gov (United States)

    Bogner, Konrad; Liechti, Katharina; Zappa, Massimiliano

    2017-11-01

    The enhanced availability of many different hydro-meteorological modelling and forecasting systems raises the issue of how to optimally combine this wealth of information. In particular, the use of deterministic and probabilistic forecasts with sometimes widely divergent predicted future streamflow values makes it even more complicated for decision makers to sift out the relevant information. In this study, multiple sources of streamflow forecast information are aggregated based on several different predictive distributions and quantile forecasts. For this combination, the Bayesian model averaging (BMA) approach, non-homogeneous Gaussian regression (NGR), also known as the ensemble model output statistics (EMOS) technique, and a novel method called Beta-transformed linear pooling (BLP) are applied. With the help of the quantile score (QS) and the continuous ranked probability score (CRPS), the combination results for the Sihl River in Switzerland, based on about 5 years of forecast data, are compared, and the differences between the raw and optimally combined forecasts are highlighted. The results demonstrate the importance of applying proper forecast combination methods for decision makers in the field of flood and water resource management.
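A minimal illustration of the combination-and-scoring idea: two hypothetical, oppositely biased forecast systems are combined by an equal-weight linear pool and compared with the quantile (pinball) score. The equal-weight average is only the simplest member of the family of methods the study uses (BMA, NGR/EMOS, BLP), and all data here are synthetic:

```python
import numpy as np

rng = np.random.default_rng(7)
obs = rng.gamma(2.0, 50.0, 500)              # synthetic daily streamflow (m^3/s)

# Two hypothetical forecasting systems with opposite multiplicative biases.
fcst_a = obs * rng.normal(1.10, 0.15, 500)   # wet-biased system
fcst_b = obs * rng.normal(0.90, 0.15, 500)   # dry-biased system

def quantile_score(q_pred, y, tau):
    # Pinball loss for a tau-quantile forecast; lower is better.
    d = y - q_pred
    return float(np.mean(np.maximum(tau * d, (tau - 1) * d)))

# Naive equal-weight linear pool of the two median forecasts.
pooled = 0.5 * fcst_a + 0.5 * fcst_b

tau = 0.5
qs_a = quantile_score(fcst_a, obs, tau)
qs_b = quantile_score(fcst_b, obs, tau)
qs_p = quantile_score(pooled, obs, tau)
print(f"tau={tau}: A={qs_a:.2f}  B={qs_b:.2f}  pooled={qs_p:.2f}")
```

Even this crude pool cancels the opposing biases and scores better than either member, which is the basic motivation for the more sophisticated combination methods in the paper.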

  9. Comparative Analysis of NOAA REFM and SNB3GEO Tools for the Forecast of the Fluxes of High-Energy Electrons at GEO

    Science.gov (United States)

    Balikhin, M. A.; Rodriguez, J. V.; Boynton, R. J.; Walker, S. N.; Aryan, Homayon; Sibeck, D. G.; Billings, S. A.

    2016-01-01

    Reliable forecasts of relativistic electrons at geostationary orbit (GEO) are important for the mitigation of their hazardous effects on spacecraft at GEO. For a number of years the Space Weather Prediction Center at NOAA has provided advanced online forecasts of the fluence of electrons with energy >2 MeV at GEO using the Relativistic Electron Forecast Model (REFM). The REFM forecasts are based on real-time solar wind speed observations at L1. The high reliability of this forecasting tool serves as a benchmark for the assessment of other forecasting tools. Since 2012 the Sheffield SNB3GEO model has been operating online, providing a 24 h ahead forecast of the same fluxes. In addition to solar wind speed, the SNB3GEO forecasts use solar wind density and interplanetary magnetic field Bz observations at L1. The period of joint operation of both of these forecasts has been used to compare their accuracy. Daily averaged measurements of electron fluxes by GOES 13 have been used to estimate the prediction efficiency of both forecasting tools. To assess the reliability of both models to forecast infrequent events of very high fluxes, the Heidke skill score was employed. The results obtained indicate that SNB3GEO provides a more accurate 1 day ahead forecast when compared to REFM. It is shown that the correction methodology utilized by REFM potentially can improve the SNB3GEO forecast.
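The Heidke skill score mentioned above is a standard categorical verification measure computed from a 2x2 contingency table of event forecasts versus observations. A sketch, with hypothetical event counts:

```python
def heidke_skill_score(hits, false_alarms, misses, correct_negatives):
    """Heidke skill score for a 2x2 contingency table: the improvement in
    proportion correct over that expected by chance (1 = perfect,
    0 = no skill beyond chance, < 0 = worse than chance)."""
    a, b, c, d = hits, false_alarms, misses, correct_negatives
    n = a + b + c + d
    # Number of correct forecasts expected from random chance, given the
    # marginal frequencies of forecasts and observations.
    expected = ((a + b) * (a + c) + (c + d) * (b + d)) / n
    return (a + d - expected) / (n - expected)

# Hypothetical counts for "flux exceeds a high threshold" forecasts.
hss = heidke_skill_score(hits=50, false_alarms=20, misses=10,
                         correct_negatives=120)
print(f"HSS = {hss:.3f}")
```

Because the chance-expected correct count is subtracted out, the HSS rewards genuine discrimination of the rare high-flux events rather than the easy correct negatives that dominate the sample.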

  10. Comparative analysis of NOAA REFM and SNB3GEO tools for the forecast of the fluxes of high-energy electrons at GEO

    Science.gov (United States)

    Balikhin, M. A.; Rodriguez, J. V.; Boynton, R. J.; Walker, S. N.; Aryan, H.; Sibeck, D. G.; Billings, S. A.

    2016-01-01

    Reliable forecasts of relativistic electrons at geostationary orbit (GEO) are important for the mitigation of their hazardous effects on spacecraft at GEO. For a number of years the Space Weather Prediction Center at NOAA has provided advanced online forecasts of the fluence of electrons with energy >2 MeV at GEO using the Relativistic Electron Forecast Model (REFM). The REFM forecasts are based on real-time solar wind speed observations at L1. The high reliability of this forecasting tool serves as a benchmark for the assessment of other forecasting tools. Since 2012 the Sheffield SNB3GEO model has been operating online, providing a 24 h ahead forecast of the same fluxes. In addition to solar wind speed, the SNB3GEO forecasts use solar wind density and interplanetary magnetic field Bz observations at L1. The period of joint operation of both of these forecasts has been used to compare their accuracy. Daily averaged measurements of electron fluxes by GOES 13 have been used to estimate the prediction efficiency of both forecasting tools. To assess the reliability of both models to forecast infrequent events of very high fluxes, the Heidke skill score was employed. The results obtained indicate that SNB3GEO provides a more accurate 1 day ahead forecast when compared to REFM. It is shown that the correction methodology utilized by REFM potentially can improve the SNB3GEO forecast.

  11. Reliability-Based Code Calibration

    DEFF Research Database (Denmark)

    Faber, M.H.; Sørensen, John Dalsgaard

    2003-01-01

    The present paper addresses fundamental concepts of reliability-based code calibration. First, basic principles of structural reliability theory are introduced, and it is shown how the results of FORM-based reliability analysis may be related to partial safety factors and characteristic values. Thereafter, the code calibration problem is presented in its principal decision-theoretical form, and it is discussed how acceptable levels of failure probability (or target reliabilities) may be established. Furthermore, suggested values for acceptable annual failure probabilities are given for ultimate and serviceability limit states. Finally, the paper describes the Joint Committee on Structural Safety (JCSS) recommended procedure, CodeCal, for the practical implementation of reliability-based code calibration of LRFD-based design codes.

  12. Integrating a Storage Factor into R-NARX Neural Networks for Flood Forecasts

    Science.gov (United States)

    Chou, Po-Kai; Chang, Li-Chiu; Chang, Fi-John; Shih, Ban-Jwu

    2017-04-01

    Because mountainous terrain and steep landforms rapidly accelerate flood flows on the island of Taiwan, accurate multi-step-ahead inflow forecasts during typhoon events, which provide reliable information for decisions on reservoir pre-storm release and flood-control operation, are both crucial and challenging. Various types of artificial neural networks (ANNs) have been successfully applied in hydrological fields. This study proposes a recurrent configuration of the nonlinear autoregressive with exogenous inputs (NARX) network, called R-NARX, with various effective inputs to forecast the inflows of the Feitsui Reservoir, the pivotal water-supply reservoir for the Taipei metropolitan area in Taiwan, during typhoon periods. The proposed R-NARX is constructed based on the recurrent neural network (RNN), which is commonly used for modelling nonlinear dynamical systems. A large number of hourly rainfall and inflow data sets collected from 95 historical typhoon events in the last thirty years are used to train, validate and test the models. The potential input variables, including rainfall in previous time steps (one to six hours), cumulative rainfall, the storage factor and the storage function, are assessed, and various models are constructed and tested for reliability and accuracy. We find that the previous (t-2) rainfall and cumulative rainfall are crucial inputs, and that the storage factor and the storage function also improve the forecast accuracy of the models. We demonstrate that the R-NARX model not only accurately forecasts the inflows but also effectively catches the peak flow without adopting observed inflow data during the entire typhoon period. Moreover, the model with the storage factor is superior to the model with the storage function, with an improvement of up to 24%. This approach can well model the rainfall-runoff process for the entire flood forecasting period without the use of observed inflow data and can provide

  13. In-season retail sales forecasting using survival models

    African Journals Online (AJOL)

    Retail sales forecasting, survival analysis, time series analysis, Holt's smoothing .... where fx(t) is the probability density function of the future lifetime, Tx, of a .... Adjustments were made to the shape of the smoothed mortality rates in light of new.

  14. Skilful seasonal forecasts of streamflow over Europe?

    Science.gov (United States)

    Arnal, Louise; Cloke, Hannah L.; Stephens, Elisabeth; Wetterhall, Fredrik; Prudhomme, Christel; Neumann, Jessica; Krzeminski, Blazej; Pappenberger, Florian

    2018-04-01

    This paper considers whether there is any added value in using seasonal climate forecasts instead of historical meteorological observations for forecasting streamflow on seasonal timescales over Europe. A Europe-wide analysis of the skill of the newly operational EFAS (European Flood Awareness System) seasonal streamflow forecasts (produced by forcing the Lisflood model with the ECMWF System 4 seasonal climate forecasts), benchmarked against the ensemble streamflow prediction (ESP) forecasting approach (produced by forcing the Lisflood model with historical meteorological observations), is undertaken. The results suggest that, on average, the System 4 seasonal climate forecasts improve the streamflow predictability over historical meteorological observations for the first month of lead time only (in terms of hindcast accuracy, sharpness and overall performance). However, the predictability varies in space and time and is greater in winter and autumn. Parts of Europe additionally exhibit a longer predictability, up to 7 months of lead time, for certain months within a season. In terms of hindcast reliability, the EFAS seasonal streamflow hindcasts are on average less skilful than the ESP for all lead times. The results also highlight the potential usefulness of the EFAS seasonal streamflow forecasts for decision-making (measured in terms of the hindcast discrimination for the lower and upper terciles of the simulated streamflow). Although the ESP is the most potentially useful forecasting approach in Europe, the EFAS seasonal streamflow forecasts appear more potentially useful than the ESP in some regions and for certain seasons, especially in winter for almost 40 % of Europe. Patterns in the EFAS seasonal streamflow hindcast skill are however not mirrored in the System 4 seasonal climate hindcasts, hinting at the need for a better understanding of the link between hydrological and meteorological variables on seasonal timescales, with the aim of improving climate

  15. Ensemble Solar Forecasting Statistical Quantification and Sensitivity Analysis: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Cheung, WanYin; Zhang, Jie; Florita, Anthony; Hodge, Bri-Mathias; Lu, Siyuan; Hamann, Hendrik F.; Sun, Qian; Lehman, Brad

    2015-12-08

    Uncertainties associated with solar forecasts present challenges to maintaining grid reliability, especially at high solar penetrations. This study aims to quantify the errors associated with the day-ahead solar forecast parameters and the theoretical solar power output for a 51-kW solar power plant in a utility area in the state of Vermont, U.S. Forecasts were generated by three numerical weather prediction (NWP) models, including the Rapid Refresh, the High Resolution Rapid Refresh, and the North American Model, and by a machine-learning ensemble model. A photovoltaic (PV) performance model was adopted to calculate theoretical solar power generation using the forecast parameters (e.g., irradiance, cell temperature, and wind speed). Errors of the power outputs were quantified using statistical moments and a suite of metrics, such as the normalized root mean squared error (NRMSE). In addition, the PV model's sensitivity to different forecast parameters was quantified and analyzed. Results showed that the ensemble model yielded forecasts of all parameters with the smallest NRMSE. The NRMSE of the solar irradiance forecasts of the ensemble model was reduced by 28.10% compared to the best of the three NWP models. Further, the sensitivity analysis indicated that the errors of the forecasted cell temperature contributed only approximately 0.12% to the NRMSE of the power output, as opposed to 7.44% from the forecasted solar irradiance.
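The NRMSE used above is a plain RMSE divided by a reference scale. A short sketch; the hourly power values are hypothetical, and the choice of normalizer (plant capacity here, observed mean by default) is a convention that varies between studies:

```python
import numpy as np

def nrmse(forecast, observed, norm=None):
    """Root mean squared error normalized by a reference scale
    (observed mean by default; other conventions divide by capacity
    or by the observed range)."""
    forecast = np.asarray(forecast, dtype=float)
    observed = np.asarray(observed, dtype=float)
    rmse = np.sqrt(np.mean((forecast - observed) ** 2))
    norm = np.mean(observed) if norm is None else norm
    return float(rmse / norm)

# Hypothetical hourly power output (kW) for a 51-kW plant.
obs = np.array([0.0, 5.2, 18.4, 33.1, 41.0, 38.7, 24.5, 9.8])
fc = np.array([0.0, 6.0, 16.9, 35.0, 43.2, 36.1, 26.0, 8.5])
print(f"NRMSE = {nrmse(fc, obs, norm=51.0):.3%}")  # normalized by capacity
```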

  16. Verification of an ensemble prediction system for storm surge forecast in the Adriatic Sea

    Science.gov (United States)

    Mel, Riccardo; Lionello, Piero

    2014-12-01

    In the Adriatic Sea, storm surges present a significant threat to Venice and to the flat coastal areas of the northern coast of the basin. Sea level forecast is of paramount importance for the management of daily activities and for operating the movable barriers that are presently being built for the protection of the city. In this paper, an EPS (ensemble prediction system) for operational forecasting of storm surge in the northern Adriatic Sea is presented and applied to a 3-month-long period (October-December 2010). The sea level EPS is based on the HYPSE (hydrostatic Padua Sea elevation) model, which is a standard single-layer nonlinear shallow water model, whose forcings (mean sea level pressure and surface wind fields) are provided by the ensemble members of the ECMWF (European Center for Medium-Range Weather Forecasts) EPS. Results are verified against observations at five tide gauges located along the Croatian and Italian coasts of the Adriatic Sea. Forecast uncertainty increases with the predicted value of the storm surge and with the forecast lead time. The EMF (ensemble mean forecast) provided by the EPS has a rms (root mean square) error lower than the DF (deterministic forecast), especially for short (up to 3 days) lead times. Uncertainty for short lead times of the forecast and for small storm surges is mainly caused by uncertainty of the initial condition of the hydrodynamical model. Uncertainty for large lead times and large storm surges is mainly caused by uncertainty in the meteorological forcings. The EPS spread increases with the rms error of the forecast. For large lead times the EPS spread and the forecast error substantially coincide. However, the EPS spread in this study, which does not account for uncertainty in the initial condition, underestimates the error during the early part of the forecast and for small storm surge values. On the contrary, it overestimates the rms error for large surge values. 
The PF (probability forecast) of the EPS

  17. Ensemble Streamflow Forecast Improvements in NYC's Operations Support Tool

    Science.gov (United States)

    Wang, L.; Weiss, W. J.; Porter, J.; Schaake, J. C.; Day, G. N.; Sheer, D. P.

    2013-12-01

    Like most other water supply utilities, New York City's Department of Environmental Protection (DEP) has operational challenges associated with drought and wet weather events. During drought conditions, DEP must maintain water supply reliability to 9 million customers as well as meet environmental release requirements downstream of its reservoirs. During and after wet weather events, DEP must maintain turbidity compliance in its unfiltered Catskill and Delaware reservoir systems and minimize spills to mitigate downstream flooding. Proactive reservoir management - such as release restrictions to prepare for a drought or preventative drawdown in advance of a large storm - can alleviate negative impacts associated with extreme events. It is important for water managers to understand the risks associated with proactive operations so unintended consequences such as endangering water supply reliability with excessive drawdown prior to a storm event are minimized. Probabilistic hydrologic forecasts are a critical tool in quantifying these risks and allow water managers to make more informed operational decisions. DEP has recently completed development of an Operations Support Tool (OST) that integrates ensemble streamflow forecasts, real-time observations, and a reservoir system operations model into a user-friendly graphical interface that allows its water managers to take robust and defensible proactive measures in the face of challenging system conditions. Since initial development of OST was first presented at the 2011 AGU Fall Meeting, significant improvements have been made to the forecast system. First, the monthly AR1 forecasts ('Hirsch method') were upgraded with a generalized linear model (GLM) utilizing historical daily correlations ('Extended Hirsch method' or 'eHirsch'). 
The development of eHirsch forecasts improved predictive skill over the Hirsch method in the first week to a month from the forecast date and produced more realistic hydrographs on the tail

  18. Parametric decadal climate forecast recalibration (DeFoReSt 1.0)

    Directory of Open Access Journals (Sweden)

    A. Pasternack

    2018-01-01

    Full Text Available Near-term climate predictions such as decadal climate forecasts are increasingly being used to guide adaptation measures. For near-term probabilistic predictions to be useful, systematic errors of the forecasting systems have to be corrected. While methods for the calibration of probabilistic forecasts are readily available, these have to be adapted to the specifics of decadal climate forecasts including the long time horizon of decadal climate forecasts, lead-time-dependent systematic errors (drift) and the errors in the representation of long-term changes and variability. These features are compounded by small ensemble sizes to describe forecast uncertainty and a relatively short period for which typically pairs of reforecasts and observations are available to estimate calibration parameters. We introduce the Decadal Climate Forecast Recalibration Strategy (DeFoReSt), a parametric approach to recalibrate decadal ensemble forecasts that takes the above specifics into account. DeFoReSt optimizes forecast quality as measured by the continuous ranked probability score (CRPS). Using a toy model to generate synthetic forecast observation pairs, we demonstrate the positive effect on forecast quality in situations with pronounced and limited predictability. Finally, we apply DeFoReSt to decadal surface temperature forecasts from the MiKlip prototype system and find consistent, and sometimes considerable, improvements in forecast quality compared with a simple calibration of the lead-time-dependent systematic errors.

  19. Parametric decadal climate forecast recalibration (DeFoReSt 1.0)

    Science.gov (United States)

    Pasternack, Alexander; Bhend, Jonas; Liniger, Mark A.; Rust, Henning W.; Müller, Wolfgang A.; Ulbrich, Uwe

    2018-01-01

    Near-term climate predictions such as decadal climate forecasts are increasingly being used to guide adaptation measures. For near-term probabilistic predictions to be useful, systematic errors of the forecasting systems have to be corrected. While methods for the calibration of probabilistic forecasts are readily available, these have to be adapted to the specifics of decadal climate forecasts including the long time horizon of decadal climate forecasts, lead-time-dependent systematic errors (drift) and the errors in the representation of long-term changes and variability. These features are compounded by small ensemble sizes to describe forecast uncertainty and a relatively short period for which typically pairs of reforecasts and observations are available to estimate calibration parameters. We introduce the Decadal Climate Forecast Recalibration Strategy (DeFoReSt), a parametric approach to recalibrate decadal ensemble forecasts that takes the above specifics into account. DeFoReSt optimizes forecast quality as measured by the continuous ranked probability score (CRPS). Using a toy model to generate synthetic forecast observation pairs, we demonstrate the positive effect on forecast quality in situations with pronounced and limited predictability. Finally, we apply DeFoReSt to decadal surface temperature forecasts from the MiKlip prototype system and find consistent, and sometimes considerable, improvements in forecast quality compared with a simple calibration of the lead-time-dependent systematic errors.

  20. Tool for Forecasting Cool-Season Peak Winds Across Kennedy Space Center and Cape Canaveral Air Force Station

    Science.gov (United States)

    Barrett, Joe H., III; Roeder, William P.

    2010-01-01

    The expected peak wind speed for the day is an important element in the daily morning forecast for ground and space launch operations at Kennedy Space Center (KSC) and Cape Canaveral Air Force Station (CCAFS). The 45th Weather Squadron (45 WS) must issue forecast advisories for KSC/CCAFS when they expect peak gusts for >= 25, >= 35, and >= 50 kt thresholds at any level from the surface to 300 ft. In Phase I of this task, the 45 WS tasked the Applied Meteorology Unit (AMU) to develop a cool-season (October - April) tool to help forecast the non-convective peak wind from the surface to 300 ft at KSC/CCAFS. During the warm season, these wind speeds are rarely exceeded except during convective winds or under the influence of tropical cyclones, for which other techniques are already in use. The tool used single and multiple linear regression equations to predict the peak wind from the morning sounding. The forecaster manually entered several observed sounding parameters into a Microsoft Excel graphical user interface (GUI), and then the tool displayed the forecast peak wind speed, average wind speed at the time of the peak wind, the timing of the peak wind and the probability the peak wind will meet or exceed 35, 50 and 60 kt. The 45 WS customers later dropped the requirement for >= 60 kt wind warnings. During Phase II of this task, the AMU expanded the period of record (POR) by six years to increase the number of observations used to create the forecast equations. A large number of possible predictors were evaluated from archived soundings, including inversion depth and strength, low-level wind shear, mixing height, temperature lapse rate and winds from the surface to 3000 ft. Each day in the POR was stratified in a number of ways, such as by low-level wind direction, synoptic weather pattern, precipitation and Bulk Richardson number. The most accurate Phase II equations were then selected for an independent verification. The Phase I and II forecast methods were
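The regression-plus-threshold-probability idea behind the tool can be sketched as follows. The predictors, coefficients, and the new sounding are all synthetic illustrations (not the AMU's actual predictors or fitted equations), and Gaussian residuals are an assumption:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(11)

# Hypothetical training set: three morning-sounding predictors vs. the
# observed daily peak gust (kt).
n = 300
X = np.column_stack([
    rng.uniform(5, 40, n),    # low-level wind speed (kt)
    rng.uniform(0, 8, n),     # inversion strength (deg C)
    rng.uniform(1, 10, n),    # mixing height (kft)
])
peak_gust = 10.0 + X @ np.array([1.2, -0.8, 0.9]) + rng.normal(0, 3.0, n)

# Multiple linear regression via least squares (the tool's approach in spirit).
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, peak_gust, rcond=None)
sigma = (peak_gust - A @ coef).std(ddof=A.shape[1])

# Forecast for one new morning sounding, plus a threshold-exceedance
# probability assuming Gaussian regression residuals.
x_new = np.array([1.0, 25.0, 2.0, 6.0])   # intercept + three predictors
mu = float(x_new @ coef)
p_ge_35 = 0.5 * (1 - erf((35.0 - mu) / (sigma * sqrt(2))))
print(f"predicted peak gust {mu:.1f} kt, P(peak >= 35 kt) = {p_ge_35:.0%}")
```

Turning the regression's residual spread into an exceedance probability is what lets a deterministic peak-wind equation support the warning thresholds described above.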

  1. Short-Term Wind Speed Forecasting for Power System Operations

    KAUST Repository

    Zhu, Xinxin

    2012-04-01

    The emphasis on renewable energy and concerns about the environment have led to large-scale wind energy penetration worldwide. However, there are also significant challenges associated with the use of wind energy due to the intermittent and unstable nature of wind. High-quality short-term wind speed forecasting is critical to reliable and secure power system operations. This article begins with an overview of the current status of worldwide wind power developments and future trends. It then reviews some statistical short-term wind speed forecasting models, including traditional time series approaches and more advanced space-time statistical models. It also discusses the evaluation of forecast accuracy, in particular, the need for realistic loss functions. New challenges in wind speed forecasting regarding ramp events and offshore wind farms are also presented. © 2012 The Authors. International Statistical Review © 2012 International Statistical Institute.

  2. On Bayesian System Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Soerensen Ringi, M

    1995-05-01

    The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so called frequentistic school. A new model for system reliability prediction is given in two papers. The model encloses the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non identical environments. 85 refs.

  3. On Bayesian System Reliability Analysis

    International Nuclear Information System (INIS)

    Soerensen Ringi, M.

    1995-01-01

    The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so called frequentistic school. A new model for system reliability prediction is given in two papers. The model encloses the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non identical environments. 85 refs

  4. Probabilistic Wind Power Ramp Forecasting Based on a Scenario Generation Method

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Qin [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Florita, Anthony R [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Krishnan, Venkat K [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hodge, Brian S [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Cui, Mingjian [University of Texas at Dallas; Feng, Cong [University of Texas at Dallas; Wang, Zhenke [University of Texas at Dallas; Zhang, Jie [University of Texas at Dallas

    2018-02-01

    Wind power ramps (WPRs) are particularly important in the management and dispatch of wind power and are currently drawing the attention of balancing authorities. With the aim of reducing the impact of WPRs on power system operations, this paper develops a probabilistic ramp forecasting method based on a large number of simulated scenarios. An ensemble machine learning technique is first adopted to forecast the basic wind power scenario and calculate the historical forecasting errors. A continuous Gaussian mixture model (GMM) is used to fit the probability distribution function (PDF) of the forecasting errors, and the cumulative distribution function (CDF) is analytically deduced. The inverse transform method, based on Monte Carlo sampling of the CDF, is used to generate a massive number of forecasting error scenarios. An optimized swinging door algorithm is adopted to extract all the WPRs from the complete set of wind power forecasting scenarios. The probabilistic forecasting results for ramp duration and start time are generated based on all scenarios. Numerical simulations on publicly available wind power data show that, within a predefined tolerance level, the developed probabilistic wind power ramp forecasting method is able to predict WPRs with a high level of sharpness and accuracy.
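The GMM-plus-inverse-transform step can be sketched as follows. The mixture parameters stand in for a GMM already fitted to historical forecasting errors (the weights, means, and standard deviations below are illustrative, not fitted values), and the base forecast is hypothetical:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(3)

# Assumed two-component Gaussian mixture of normalized forecasting errors.
weights = np.array([0.7, 0.3])
means = np.array([0.0, 0.05])
stds = np.array([0.02, 0.08])

# Tabulate the mixture CDF on a grid, then invert it by interpolation
# (the "inverse transform method" of the abstract).
x = np.linspace(-0.4, 0.4, 4001)
cdf = np.zeros_like(x)
for w, m, s in zip(weights, means, stds):
    cdf += w * 0.5 * (1 + np.array([erf((xi - m) / (s * sqrt(2))) for xi in x]))

# Inverse-transform sampling: map uniform draws through the inverted CDF.
u = rng.uniform(cdf[0], cdf[-1], size=10_000)
error_scenarios = np.interp(u, cdf, x)

# Each error scenario added to the base forecast yields one power scenario;
# ramps would then be extracted from these scenario time series.
base_forecast_mw = 100.0
scenarios_mw = base_forecast_mw * (1 + error_scenarios)
```

In the paper the scenarios are full time series, from which the swinging door algorithm extracts candidate ramps; the single-value sketch above shows only the sampling machinery.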

  5. Monthly forecasting of agricultural pests in Switzerland

    Science.gov (United States)

    Hirschi, M.; Dubrovsky, M.; Spirig, C.; Samietz, J.; Calanca, P.; Weigel, A. P.; Fischer, A. M.; Rotach, M. W.

    2012-04-01

    Given the repercussions of pests and diseases on agricultural production, detailed forecasting tools have been developed to simulate the degree of infestation depending on actual weather conditions. The life cycle of pests is most successfully predicted if the micro-climate of the immediate environment (habitat) of the causative organisms can be simulated. Sub-seasonal pest forecasts therefore require weather information for the relevant habitats and the appropriate time scale. The pest forecasting system SOPRA (www.sopra.info) currently in operation in Switzerland relies on such detailed weather information, using hourly weather observations up to the day the forecast is issued, but only a climatology for the forecasting period. Here, we aim at improving the skill of SOPRA forecasts by transforming the weekly information provided by ECMWF monthly forecasts (MOFCs) into hourly weather series as required for the prediction of upcoming life phases of the codling moth, the major insect pest in apple orchards worldwide. Due to the probabilistic nature of operational monthly forecasts and the limited spatial and temporal resolution, their information needs to be post-processed for use in a pest model. In this study, we developed a statistical downscaling approach for MOFCs that includes the following steps: (i) application of a stochastic weather generator to generate a large pool of daily weather series consistent with the climate at a specific location, (ii) a subsequent re-sampling of weather series from this pool to optimally represent the evolution of the weekly MOFC anomalies, and (iii) a final extension to hourly weather series suitable for the pest forecasting model. Results show a clear improvement in the forecast skill of occurrences of upcoming codling moth life phases when incorporating MOFCs as compared to the operational pest forecasting system. This is true both in terms of root mean squared errors and of the continuous ranked probability scores of the

  6. Forecasting Fire Insurance Loss Ratio in Misr Insurance Company

    Directory of Open Access Journals (Sweden)

    Tarek TAHA

    2017-06-01

    Loss ratio is one of the most important indicators with many applications to strategic decisions, such as pricing, underwriting, investment, reinsurance and reserving. It serves as an early warning of the financial solvency of insurance companies, and the strength of their financial position can be judged from it. The aim of this study is to identify a reliable time-series forecasting model for the loss ratio of the fire segment at Misr Insurance Company. Box-Jenkins analysis is applied to actual reported loss-ratio data for Misr Insurance Company for the period 1980/1981–2013/2014. The study concludes that the best forecasting model is ARMA(1,1).
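
    Once an ARMA(1,1) model has been identified, one-step-ahead forecasts follow a simple recursion: x_hat[t] = c + phi*x[t-1] + theta*e[t-1], where e is the previous forecast error. The sketch below uses hypothetical coefficients; the record does not publish the fitted values, which in practice come from Box-Jenkins maximum-likelihood estimation.

```python
def arma11_forecasts(series, c, phi, theta):
    """One-step-ahead ARMA(1,1) forecasts.

    x_hat[t] = c + phi * x[t-1] + theta * e[t-1], with e[t] = x[t] - x_hat[t].
    """
    forecasts, e_prev = [], 0.0
    for t in range(1, len(series)):
        x_hat = c + phi * series[t - 1] + theta * e_prev
        forecasts.append(x_hat)
        e_prev = series[t] - x_hat  # innovation feeds the MA term next step
    return forecasts

# Illustrative loss-ratio series and made-up coefficients.
ratios = [0.50, 0.60, 0.55]
preds = arma11_forecasts(ratios, c=0.1, phi=0.8, theta=0.3)
```
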

  7. A hybrid approach for probabilistic forecasting of electricity price

    DEFF Research Database (Denmark)

    Wan, Can; Xu, Zhao; Wang, Yelei

    2014-01-01

    The electricity market plays a key role in realizing the economic prophecy of smart grids. Accurate and reliable electricity market price forecasting is essential to facilitate various decision-making activities of market participants in the future smart grid environment. However, due to the nonstationarities involved in market clearing prices (MCPs), it is rather difficult to accurately predict MCPs in advance. The challenge is intensified as more and more renewable energy and other new technologies emerge in smart grids. Therefore, a transformation from traditional point forecasts … electricity price forecasting is proposed in this paper. The effectiveness of the proposed hybrid method has been validated through comprehensive tests using real price data from the Australian electricity market.

  8. Automated time series forecasting for biosurveillance.

    Science.gov (United States)

    Burkom, Howard S; Murphy, Sean Patrick; Shmueli, Galit

    2007-09-30

    For robust detection performance, traditional control chart monitoring for biosurveillance is based on input data free of trends, day-of-week effects, and other systematic behaviour. Time series forecasting methods may be used to remove this behaviour by subtracting forecasts from observations to form residuals for algorithmic input. We describe three forecast methods and compare their predictive accuracy on each of 16 authentic syndromic data streams. The methods are (1) a non-adaptive regression model using a long historical baseline, (2) an adaptive regression model with a shorter, sliding baseline, and (3) the Holt-Winters method for generalized exponential smoothing. Criteria for comparing the forecasts were the root-mean-square error, the median absolute per cent error (MedAPE), and the median absolute deviation. The median-based criteria showed best overall performance for the Holt-Winters method. The MedAPE measures over the 16 test series averaged 16.5, 11.6, and 9.7 for the non-adaptive regression, adaptive regression, and Holt-Winters methods, respectively. The non-adaptive regression forecasts were degraded by changes in the data behaviour in the fixed baseline period used to compute model coefficients. The mean-based criterion was less conclusive because of the effects of poor forecasts on a small number of calendar holidays. The Holt-Winters method was also most effective at removing serial autocorrelation, with most 1-day-lag autocorrelation coefficients below 0.15. The forecast methods were compared without tuning them to the behaviour of individual series. We achieved improved predictions with such tuning of the Holt-Winters method, but practical use of such improvements for routine surveillance will require reliable data classification methods.
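
    The residual-forming step described above, subtracting one-step-ahead forecasts from observations, can be sketched for the Holt-Winters case. This is a generic additive Holt-Winters implementation with a weekly season, not the authors' code; the smoothing constants are illustrative, and the initialization (level, linear trend and seasonal deviations from the first two cycles) is one common convention.

```python
def holt_winters_residuals(y, period=7, alpha=0.4, beta=0.1, gamma=0.2):
    """Additive Holt-Winters smoothing; returns one-step-ahead residuals
    y[t] - forecast[t], i.e. the series handed to the detection algorithm.
    Initialization assumes at least two full seasonal cycles of data."""
    mid = (period - 1) / 2.0
    base = sum(y[:period]) / period                      # mean of first cycle
    trend = (sum(y[period:2 * period]) - sum(y[:period])) / period ** 2
    level = base + mid * trend                            # level at end of cycle 1
    season = [y[i] - (base + (i - mid) * trend) for i in range(period)]
    residuals = []
    for t in range(period, len(y)):
        forecast = level + trend + season[t % period]
        residuals.append(y[t] - forecast)
        prev = level
        level = alpha * (y[t] - season[t % period]) + (1 - alpha) * (level + trend)
        trend = beta * (level - prev) + (1 - beta) * trend
        season[t % period] = gamma * (y[t] - level) + (1 - gamma) * season[t % period]
    return residuals

# Synthetic series with a linear trend and a day-of-week effect: the
# residuals should be essentially zero once trend and season are tracked.
pattern = [0, 5, 3, -2, -4, 1, -3]
series = [100 + 0.5 * t + pattern[t % 7] for t in range(42)]
res = holt_winters_residuals(series)
```

    On an authentic syndromic stream the residuals would retain noise and outbreak signal but be largely free of the trend and day-of-week structure.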

  9. How uncertainty in socio-economic variables affects large-scale transport model forecasts

    DEFF Research Database (Denmark)

    Manzo, Stefano; Nielsen, Otto Anker; Prato, Carlo Giacomo

    2015-01-01

    A strategic task assigned to large-scale transport models is to forecast the demand for transport over long periods of time to assess transport projects. However, by modelling complex systems, transport models have an inherent uncertainty which increases over time. As a consequence, the longer the period forecasted, the less reliable the forecasted model output. Describing uncertainty propagation patterns over time is therefore important in order to provide complete information to the decision makers. Among the existing literature, only few studies analyze uncertainty propagation patterns over…

  10. A robust combination approach for short-term wind speed forecasting and analysis – Combination of the ARIMA (Autoregressive Integrated Moving Average), ELM (Extreme Learning Machine), SVM (Support Vector Machine) and LSSVM (Least Square SVM) forecasts using a GPR (Gaussian Process Regression) model

    International Nuclear Information System (INIS)

    Wang, Jianzhou; Hu, Jianming

    2015-01-01

    With the increasing importance of wind power as a component of power systems, the problems induced by the stochastic and intermittent nature of wind speed have compelled system operators and researchers to search for more reliable techniques to forecast wind speed. This paper proposes a combination model for probabilistic short-term wind speed forecasting. In this proposed hybrid approach, EWT (Empirical Wavelet Transform) is employed to extract meaningful information from a wind speed series by designing an appropriate wavelet filter bank. The GPR (Gaussian Process Regression) model is utilized to combine independent forecasts generated by various forecasting engines (ARIMA (Autoregressive Integrated Moving Average), ELM (Extreme Learning Machine), SVM (Support Vector Machine) and LSSVM (Least Square SVM)) in a nonlinear way rather than the commonly used linear way. The proposed approach provides more probabilistic information for wind speed predictions besides improving the forecasting accuracy for single-value predictions. The effectiveness of the proposed approach is demonstrated with wind speed data from two wind farms in China. The results indicate that the individual forecasting engines do not consistently forecast short-term wind speed for the two sites, and the proposed combination method can generate a more reliable and accurate forecast. - Highlights: • The proposed approach enables probabilistic modeling of wind speed series. • The proposed approach adapts to the time-varying characteristic of the wind speed. • The hybrid approach can extract the meaningful components from the wind speed series. • The proposed method can generate adaptive, reliable and more accurate forecasting results. • The proposed model combines four independent forecasting engines in a nonlinear way.
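
    The nonlinear combination idea, feeding the individual engines' forecasts into a GPR model as input features, can be sketched with a bare-bones GP posterior mean: k_*^T (K + sigma^2 I)^{-1} y with an RBF kernel. Everything below (the toy forecasts, lengthscale, noise level, and the hand-rolled linear solver used to stay dependency-free) is illustrative, not the paper's configuration.

```python
import math

def rbf(a, b, ls=1.0):
    """Squared-exponential kernel between two forecast vectors."""
    return math.exp(-sum((x - y) ** 2 for x, y in zip(a, b)) / (2 * ls ** 2))

def solve(A, b):
    """Gaussian elimination with partial pivoting (keeps the sketch stdlib-only)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def gpr_combine(X, y, x_star, noise=1e-2):
    """GP posterior mean at x_star: rows of X are per-engine forecasts,
    y the observed wind speeds the combiner is trained against."""
    n = len(X)
    K = [[rbf(X[i], X[j]) + (noise if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    alpha = solve(K, y)
    return sum(rbf(x_star, X[i]) * alpha[i] for i in range(n))

# Toy training set: two engines' forecasts (columns) and observed speeds.
X = [[4.0, 4.4], [6.0, 5.6], [8.0, 8.2], [10.0, 9.6]]
y = [4.2, 5.8, 8.1, 9.8]
pred = gpr_combine(X, y, [4.0, 4.4])
```

    The GP's predictive variance (omitted here) is what supplies the extra probabilistic information beyond the single-value prediction.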

  11. Communicating Uncertainty in Volcanic Ash Forecasts: Decision-Making and Information Preferences

    Science.gov (United States)

    Mulder, Kelsey; Black, Alison; Charlton-Perez, Andrew; McCloy, Rachel; Lickiss, Matthew

    2016-04-01

    The Robust Assessment and Communication of Environmental Risk (RACER) consortium, an interdisciplinary research team focusing on the communication of uncertainty with respect to natural hazards, hosted a Volcanic Ash Workshop to discuss issues related to volcanic ash forecasting, especially forecast uncertainty. Part of the workshop was a decision game in which participants, including forecasters, academics, and members of the aviation industry, were given hypothetical volcanic ash concentration forecasts and asked whether they would approve a given flight path. The uncertainty information was presented in different formats, including hazard maps, line graphs, and percent probabilities. Results from the decision game will be presented with a focus on information preferences, understanding of the forecasts, and whether different formats of the same volcanic ash forecast resulted in different flight decisions. Implications of this research will help the design and presentation of volcanic ash plume decision tools and can also inform the design of other natural hazard information.

  12. Knowledge typology for imprecise probabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)

    2002-01-01

    When characterizing the reliability of a complex system, there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data are scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (e.g., Dempster-Shafer theory or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider whether experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  13. Coping with Changes in International Classifications of Sectors and Occupations: Application in Skills Forecasting. Research Paper No 43

    Science.gov (United States)

    Kvetan, Vladimir, Ed.

    2014-01-01

    Reliable and consistent time series are essential to any kind of economic forecasting. Skills forecasting needs to combine data from national accounts and labour force surveys, with the pan-European dimension of Cedefop's skills supply and demand forecasts, relying on different international classification standards. Sectoral classification (NACE)…

  14. NDE reliability and probability of detection (POD) evolution and paradigm shift

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Surendra [NDE Engineering, Materials and Process Engineering, Honeywell Aerospace, Phoenix, AZ 85034 (United States)

    2014-02-18

    The subject of NDE reliability and POD has gone through multiple phases since its humble beginning in the late 1960s. This was followed by several programs, including the important one nicknamed “Have Cracks – Will Travel”, or in short “Have Cracks”, by Lockheed Georgia Company for the US Air Force during 1974–1978. This and other studies ultimately led to a series of developments in the field of reliability and POD, starting from the introduction of fracture mechanics and Damage Tolerant Design (DTD), to the statistical framework of Berens and Hovey (1981) for POD estimation, to MIL-HDBK-1823 (1999) and 1823A (2009). During the last decade, various groups and researchers have further studied reliability and POD using Model Assisted POD (MAPOD), Simulation Assisted POD (SAPOD), and Bayesian statistics. Each of these developments had one objective, i.e., improving the accuracy of life prediction in components, which to a large extent depends on the reliability and capability of NDE methods. Therefore, it is essential to have reliable detection and sizing of large flaws in components. Currently, POD is used for studying the reliability and capability of NDE methods, though POD data offer no absolute truth regarding NDE reliability, i.e., system capability, effects of flaw morphology, and quantifying the human factors. Furthermore, reliability and POD have often been treated as synonymous, but POD is not NDE reliability. POD is a subset of reliability, which consists of six phases: 1) sample selection using DOE, 2) NDE equipment setup and calibration, 3) System Measurement Evaluation (SME) including Gage Repeatability and Reproducibility (Gage R and R) and Analysis Of Variance (ANOVA), 4) NDE system capability and electronic and physical saturation, 5) acquiring and fitting data to a model, and data analysis, and 6) POD estimation. This paper provides an overview of all major POD milestones of the last several decades and discusses the rationale for using

  15. Markov Chain Modelling for Short-Term NDVI Time Series Forecasting

    Directory of Open Access Journals (Sweden)

    Stepčenko Artūrs

    2016-12-01

    In this paper, an NDVI time series forecasting model has been developed based on a discrete-time, continuous-state Markov chain of suitable order. The normalised difference vegetation index (NDVI) is an indicator that describes the amount of chlorophyll (the green mass) and shows the relative density and health of vegetation; therefore, it is an important variable for vegetation forecasting. A Markov chain is a stochastic process defined on a state space; it undergoes transitions from one state to another with certain probabilities. A Markov chain forecast model is flexible in accommodating various forecast assumptions and structures. The present paper discusses the considerations and techniques in building a Markov chain forecast model at each step. The continuous-state Markov chain model is described analytically. Finally, the application of the proposed Markov chain model is illustrated with reference to a set of NDVI time series data.
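
    The mechanics of a Markov chain forecast can be sketched in the simpler discrete-state setting (the paper itself works with a continuous state space): bin the NDVI values into states, estimate a transition matrix from observed state pairs, and propagate the current state's distribution forward. The NDVI values and bin edges below are hypothetical.

```python
def to_states(values, lo, hi, n_states):
    """Discretise NDVI values into equal-width state bins."""
    return [min(int((v - lo) / (hi - lo) * n_states), n_states - 1) for v in values]

def transition_matrix(states, n_states):
    """First-order transition probabilities estimated from a state sequence."""
    counts = [[0] * n_states for _ in range(n_states)]
    for a, b in zip(states, states[1:]):
        counts[a][b] += 1
    P = []
    for row in counts:
        total = sum(row)
        P.append([c / total if total else 0.0 for c in row])
    return P

def forecast_distribution(P, current_state, steps=1):
    """Propagate the state distribution `steps` transitions ahead."""
    dist = [0.0] * len(P)
    dist[current_state] = 1.0
    for _ in range(steps):
        dist = [sum(dist[i] * P[i][j] for i in range(len(P)))
                for j in range(len(P))]
    return dist

# Toy seasonal NDVI trace binned into 3 states (low / medium / high greenness).
ndvi = [0.20, 0.50, 0.80, 0.50, 0.20, 0.50, 0.80, 0.50, 0.20]
states = to_states(ndvi, 0.0, 0.9, 3)
P = transition_matrix(states, 3)
dist = forecast_distribution(P, states[-1], steps=2)
```
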

  16. A Condition Based Maintenance Approach to Forecasting B-1 Aircraft Parts

    Science.gov (United States)

    2017-03-23

    Air Force Institute of Technology, AFIT Scholar Theses and Dissertations, 3-23-2017. A Condition Based Maintenance Approach to Forecasting B-1 Aircraft ... component's life history where reliability forecasts could be stipulated based on a component's current condition. One of the major issues their report noted ... Engine Condition Monitoring System Specification. Contract Number DOT-CG-80513-A. Grand Prairie, TX. Air Force Materiel Command. (2011) Requirements For

  17. A High Precision Artificial Neural Networks Model for Short-Term Energy Load Forecasting

    Directory of Open Access Journals (Sweden)

    Ping-Huan Kuo

    2018-01-01

    One of the most important research topics in smart grid technology is load forecasting, because the accuracy of load forecasting strongly influences the reliability of smart grid systems. In the past, load forecasts were obtained by traditional analysis techniques such as time series analysis and linear regression. Since load forecasting focuses on aggregated electricity consumption patterns, researchers have recently integrated deep learning approaches with machine learning techniques. In this study, an accurate deep neural network algorithm for short-term load forecasting (STLF) is introduced. The forecasting performance of the proposed algorithm is compared with the performances of five artificial intelligence algorithms that are commonly used in load forecasting. The Mean Absolute Percentage Error (MAPE) and Cumulative Variation of Root Mean Square Error (CV-RMSE) are used as accuracy evaluation indexes. The experimental results show that the MAPE and CV-RMSE of the proposed algorithm are 9.77% and 11.66%, respectively, indicating very high forecasting accuracy.
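
    The two evaluation indexes are straightforward to compute. A minimal sketch follows; CV-RMSE is implemented here in its usual reading as the RMSE normalised by the mean load, expressed in percent, which matches how the acronym is conventionally defined.

```python
import math

def mape(actual, forecast):
    """Mean Absolute Percentage Error, in percent."""
    return 100.0 * sum(abs((a - f) / a) for a, f in zip(actual, forecast)) / len(actual)

def cv_rmse(actual, forecast):
    """CV-RMSE: root-mean-square error normalised by the mean load, in percent."""
    rmse = math.sqrt(sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual))
    return 100.0 * rmse / (sum(actual) / len(actual))

# Tiny illustrative load series (MW): actual vs. forecast.
actual = [100.0, 200.0]
forecast = [110.0, 190.0]
m = mape(actual, forecast)
c = cv_rmse(actual, forecast)
```
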

  18. Forecasting ability of the investor sentiment endurance index: The case of oil service stock returns and crude oil prices

    International Nuclear Information System (INIS)

    He, Ling T.; Casey, K.M.

    2015-01-01

    Using a binomial probability distribution model, this paper creates an endurance index of oil service investor sentiment. The index reflects the probability of the high or low stock price being the close price for the PHLX Oil Service Sector Index. Results of this study reveal the substantial forecasting ability of the sentiment endurance index. Monthly and quarterly rolling forecasts of returns of oil service stocks have an overall accuracy as high as 52% to 57%. In addition, the index shows decent forecasting ability for changes in crude oil prices, especially WTI prices. The accuracy of 6-quarter rolling forecasts is 55%. The sentiment endurance index, along with the true-forecasting procedure and accuracy ratio applied in this study, provides investors and analysts of oil service sector stocks and crude oil prices, as well as energy policy-makers, with effective analytical tools.
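
    One simplified reading of such an index is the estimated probability of the close landing at the high minus the probability of it landing at the low, with the probabilities estimated as daily frequencies. The sketch below is a hypothetical interpretation for illustration, not the authors' exact construction, and the price bars are made up.

```python
def endurance_index(highs, lows, closes, tol=1e-9):
    """Sentiment endurance sketch: P(close == high) - P(close == low),
    with probabilities estimated as frequencies over the window.
    Positive values suggest bullish endurance, negative bearish."""
    n = len(closes)
    p_high = sum(abs(c - h) <= tol for h, c in zip(highs, closes)) / n
    p_low = sum(abs(c - l) <= tol for l, c in zip(lows, closes)) / n
    return p_high - p_low

# Hypothetical four-day window of index bars.
highs = [10.0, 11.0, 12.0, 13.0]
lows = [9.0, 10.0, 11.0, 12.0]
closes = [10.0, 10.5, 11.0, 13.0]
idx = endurance_index(highs, lows, closes)
```
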

  19. Pipe failure probability - the Thomas paper revisited

    International Nuclear Information System (INIS)

    Lydell, B.O.Y.

    2000-01-01

    Almost twenty years ago, in Volume 2 of Reliability Engineering (the predecessor of Reliability Engineering and System Safety), a paper by H. M. Thomas of Rolls Royce and Associates Ltd. presented a generalized approach to the estimation of piping and vessel failure probability. The 'Thomas approach' used insights from actual failure statistics to calculate the probability of leakage and the conditional probability of rupture given leakage. It was intended for practitioners without access to data on the service experience with piping and piping system components. This article revisits the Thomas paper by drawing on insights from the development of a new database on piping failures in commercial nuclear power plants worldwide (SKI-PIPE). Partially sponsored by the Swedish Nuclear Power Inspectorate (SKI), the R and D leading up to this note was performed during 1994-1999. Motivated by the data requirements of reliability analysis and probabilistic safety assessment (PSA), the new database supports statistical analysis of piping failure data. Against the background of this database development program, the article reviews the applicability of the 'Thomas approach' in applied risk and reliability analysis. It addresses the question of whether a new and expanded database on the service experience with piping systems would alter the original piping reliability correlation as suggested by H. M. Thomas

  20. Probabilistic Wind Power Forecasting with Hybrid Artificial Neural Networks

    DEFF Research Database (Denmark)

    Wan, Can; Song, Yonghua; Xu, Zhao

    2016-01-01

    …probabilities of prediction errors provide an alternative yet effective solution. This article proposes a hybrid artificial neural network approach to generate prediction intervals of wind power. An extreme learning machine is applied to conduct point prediction of wind power and estimate model uncertainties via a bootstrap technique. Subsequently, the maximum likelihood estimation method is employed to construct a distinct neural network to estimate the noise variance of forecasting results. The proposed approach has been tested on multi-step forecasting of high-resolution (10-min) wind power using actual wind power data from Denmark. The numerical results demonstrate that the proposed hybrid artificial neural network approach is effective and efficient for probabilistic forecasting of wind power and has high potential in practical applications.

  1. Transport project evaluation: feasibility risk assessment and scenario forecasting

    DEFF Research Database (Denmark)

    Salling, Kim Bang; Leleur, Steen

    2017-01-01

    This paper presents a new approach to transport project assessment in terms of feasibility risk assessment and reference class forecasting. Conventionally, transport project assessment is based upon a Cost-Benefit Analysis (CBA) where evaluation criteria such as Benefit-Cost Ratios (BCRs) … on the preliminary construction cost estimates. Hereafter, a quantitative risk analysis is provided making use of Monte Carlo simulation. This approach facilitates random input parameters based upon reference class forecasting; hence, a parameter data fit has been performed in order to obtain validated probability … Scenario Forecasting (RSF) frame. The RSF is anchored in the cost-benefit analysis; thus, it provides decision-makers with a quantitative means of assessing the transport infrastructure project. First, the RSF method introduces uncertainties within the CBA by applying Optimism Bias uplifts…
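
    The Monte Carlo step can be sketched as follows: sample a cost uplift factor from a reference-class-style distribution, recompute the Benefit-Cost Ratio for each draw, and summarise the resulting BCR distribution. The triangular uplift parameters and project figures below are hypothetical, standing in for a fitted Optimism Bias distribution.

```python
import random

def simulate_bcr(benefits, base_cost, n=10000, seed=1):
    """Monte Carlo BCR under construction-cost uncertainty.

    The uplift factor is drawn from a triangular(low=1.0, high=1.8, mode=1.2)
    distribution, a hypothetical stand-in for a reference-class fit.
    Returns the mean BCR and the probability that BCR >= 1 (viability)."""
    rng = random.Random(seed)
    bcrs = [benefits / (base_cost * rng.triangular(1.0, 1.8, 1.2))
            for _ in range(n)]
    mean_bcr = sum(bcrs) / n
    p_viable = sum(b >= 1.0 for b in bcrs) / n
    return mean_bcr, p_viable

# A project with deterministic BCR 1.5 before cost uplift.
mean_bcr, p_viable = simulate_bcr(150.0, 100.0)
```

    Reporting `p_viable` alongside the mean BCR is what turns the point estimate into a feasibility risk statement.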

  2. Day-Ahead Probabilistic Model for Scheduling the Operation of a Wind Pumped-Storage Hybrid Power Station: Overcoming Forecasting Errors to Ensure Reliability of Supply to the Grid

    Directory of Open Access Journals (Sweden)

    Jakub Jurasz

    2018-06-01

    Variable renewable energy sources (VRES), such as solar photovoltaic (PV) and wind turbines (WT), are starting to play a significant role in several energy systems around the globe. To overcome the problem of their non-dispatchable and stochastic nature, several approaches have been proposed so far. This paper describes a novel mathematical model for scheduling the operation of a wind-powered pumped-storage hydroelectricity (PSH) hybrid for 25 to 48 h ahead. The model is based on mathematical programming and wind speed forecasts for the next 1 to 24 h, along with predicted upper reservoir occupancy for the 24th hour ahead. The results indicate that by coupling a 2-MW conventional wind turbine with a PSH of energy storing capacity equal to 54 MWh, it is possible to significantly reduce the intraday energy generation coefficient of variation from 31% for a pure wind turbine to 1.15% for a wind-powered PSH. The scheduling errors calculated based on mean absolute percentage error (MAPE) are significantly smaller for such a coupling than those seen for wind generation forecasts, at 2.39% and 27%, respectively. This is further emphasized by the fact that those for wind generation were calculated for forecasts made for the next 1 to 24 h, while those for scheduled generation were calculated for forecasts made for the next 25 to 48 h. The results clearly show that the proposed scheduling approach ensures the high reliability of the WT–PSH energy source.

  3. Bounds on survival probability given mean probability of failure per demand; and the paradoxical advantages of uncertainty

    International Nuclear Information System (INIS)

    Strigini, Lorenzo; Wright, David

    2014-01-01

    When deciding whether to accept into service a new safety-critical system, or choosing between alternative systems, uncertainty about the parameters that affect future failure probability may be a major problem. This uncertainty can be extreme if there is the possibility of unknown design errors (e.g. in software), or wide variation between nominally equivalent components. We study the effect of parameter uncertainty on future reliability (survival probability), for systems required to have low risk of even only one failure or accident over the long term (e.g. their whole operational lifetime) and characterised by a single reliability parameter (e.g. probability of failure per demand – pfd). A complete mathematical treatment requires stating a probability distribution for any parameter with uncertain value. This is hard, so calculations are often performed using point estimates, like the expected value. We investigate conditions under which such simplified descriptions yield reliability values that are sure to be pessimistic (or optimistic) bounds for a prediction based on the true distribution. Two important observations are (i) using the expected value of the reliability parameter as its true value guarantees a pessimistic estimate of reliability, a useful property in most safety-related decisions; (ii) with a given expected pfd, broader distributions (in a formally defined meaning of “broader”), that is, systems that are a priori “less predictable”, lower the risk of failures or accidents. Result (i) justifies the simplification of using a mean in reliability modelling; we discuss within which scope this justification applies, and explore related scenarios, e.g. how things improve if we can test the system before operation. Result (ii) not only offers more flexible ways of bounding reliability predictions, but also has important, often counter-intuitive implications for decision making in various areas, like selection of components, project management
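
    Observation (i) is a direct consequence of Jensen's inequality: survival over n independent demands is (1-pfd)^n, which is convex in pfd, so averaging over any pfd distribution can only raise it above the value computed at the mean pfd. The numerical sketch below illustrates both observations with a hypothetical narrow and broad pfd distribution sharing the same mean.

```python
def survival_using_mean_pfd(mean_pfd, n_demands):
    """Survival probability computed at the point estimate E[pfd]:
    this is the pessimistic bound discussed in observation (i)."""
    return (1.0 - mean_pfd) ** n_demands

def expected_survival(pfd_samples, n_demands):
    """Survival probability averaged over a pfd distribution,
    represented here by equally weighted samples."""
    return sum((1.0 - p) ** n_demands for p in pfd_samples) / len(pfd_samples)

N = 1000  # demands over the operational lifetime
s_point = survival_using_mean_pfd(0.001, N)

# Two hypothetical distributions, both with E[pfd] = 0.001:
s_narrow = expected_survival([0.001, 0.001], N)        # no uncertainty
s_broad = expected_survival([0.0, 0.002], N)           # "less predictable"
```

    `s_point` equals `s_narrow` and is below `s_broad`: using the mean pfd is pessimistic (i), and the broader distribution yields a higher survival probability (ii).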

  4. Selection of risk reduction portfolios under interval-valued probabilities

    International Nuclear Information System (INIS)

    Toppila, Antti; Salo, Ahti

    2017-01-01

    A central problem in risk management is that of identifying the optimal combination (or portfolio) of improvements that enhance the reliability of the system most through reducing failure event probabilities, subject to the availability of resources. This optimal portfolio can be sensitive with regard to epistemic uncertainties about the failure events' probabilities. In this paper, we develop an optimization model to support the allocation of resources to improvements that mitigate risks in coherent systems in which interval-valued probabilities defined by lower and upper bounds are employed to capture epistemic uncertainties. Decision recommendations are based on portfolio dominance: a resource allocation portfolio is dominated if there exists another portfolio that improves system reliability (i) at least as much for all feasible failure probabilities and (ii) strictly more for some feasible probabilities. Based on non-dominated portfolios, recommendations about improvements to implement are derived by inspecting in how many non-dominated portfolios a given improvement is contained. We present an exact method for computing the non-dominated portfolios. We also present an approximate method that simplifies the reliability function using total order interactions so that larger problem instances can be solved with reasonable computational effort. - Highlights: • Reliability allocation under epistemic uncertainty about probabilities. • Comparison of alternatives using dominance. • Computational methods for generating the non-dominated alternatives. • Deriving decision recommendations that are robust with respect to epistemic uncertainty.
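
    The dominance test can be sketched for the special case of a series system with multiplicative risk-reduction factors. The sketch checks all corners of the interval box, which suffices here because the difference between the two portfolios' reliabilities is multilinear in the failure probabilities, so its extrema lie at box vertices; the general coherent-system case in the record requires the authors' exact and approximate methods. All numbers are hypothetical.

```python
from itertools import product

def series_reliability(pfail):
    """Reliability of a series system: all components must survive."""
    r = 1.0
    for p in pfail:
        r *= (1.0 - p)
    return r

def dominates(port_a, port_b, intervals):
    """port_x[i] is the factor by which portfolio x reduces failure
    probability i (1.0 = no improvement). Returns True iff A is at least
    as reliable as B at every corner of the probability box and strictly
    better at some corner."""
    strictly_better = False
    for corner in product(*intervals):
        ra = series_reliability([p * f for p, f in zip(corner, port_a)])
        rb = series_reliability([p * f for p, f in zip(corner, port_b)])
        if ra < rb - 1e-12:
            return False
        if ra > rb + 1e-12:
            strictly_better = True
    return strictly_better

# Interval-valued failure probabilities for two events, and two portfolios.
intervals = [(0.01, 0.05), (0.02, 0.04)]
improve_first = [0.5, 1.0]   # halve the first failure probability
do_nothing = [1.0, 1.0]
a_dominates = dominates(improve_first, do_nothing, intervals)
```
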

  5. A national econometric forecasting model of the dental sector.

    Science.gov (United States)

    Feldstein, P J; Roehrig, C S

    1980-01-01

    The Econometric Model of the Dental Sector forecasts a broad range of dental sector variables, including dental care prices; the amount of care produced and consumed; employment of hygienists, dental assistants, and clericals; hours worked by dentists; dental incomes; and number of dentists. These forecasts are based upon values specified by the user for the various factors which help determine the supply and demand for dental care, such as the size of the population, per capita income, the proportion of the population covered by private dental insurance, the cost of hiring clericals and dental assistants, and relevant government policies. In a test of its reliability, the model forecast dental sector behavior quite accurately for the period 1971 through 1977. PMID:7461974

  6. Finite element reliability analysis of fatigue life

    International Nuclear Information System (INIS)

    Harkness, H.H.; Belytschko, T.; Liu, W.K.

    1992-01-01

    Fatigue reliability is addressed by the first-order reliability method combined with a finite element method. Two-dimensional finite element models of components with cracks in mode I are considered with crack growth treated by the Paris law. Probability density functions of the variables affecting fatigue are proposed to reflect a setting where nondestructive evaluation is used, and the Rosenblatt transformation is employed to treat non-Gaussian random variables. Comparisons of the first-order reliability results and Monte Carlo simulations suggest that the accuracy of the first-order reliability method is quite good in this setting. Results show that the upper portion of the initial crack length probability density function is crucial to reliability, which suggests that if nondestructive evaluation is used, the probability of detection curve plays a key role in reliability. (orig.)

  7. Probabilistic Wind Power Ramp Forecasting Based on a Scenario Generation Method: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Qin [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Florita, Anthony R [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Krishnan, Venkat K [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hodge, Brian S [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Cui, Mingjian [Univ. of Texas-Dallas, Richardson, TX (United States); Feng, Cong [Univ. of Texas-Dallas, Richardson, TX (United States); Wang, Zhenke [Univ. of Texas-Dallas, Richardson, TX (United States); Zhang, Jie [Univ. of Texas-Dallas, Richardson, TX (United States)

    2017-08-31

    Wind power ramps (WPRs) are particularly important in the management and dispatch of wind power, and they are currently drawing the attention of balancing authorities. With the aim of reducing the impact of WPRs on power system operations, this paper develops a probabilistic ramp forecasting method based on a large number of simulated scenarios. An ensemble machine learning technique is first adopted to forecast the basic wind power forecasting scenario and calculate the historical forecasting errors. A continuous Gaussian mixture model (GMM) is used to fit the probability distribution function (PDF) of forecasting errors. The cumulative distribution function (CDF) is analytically deduced. The inverse transform method based on Monte Carlo sampling and the CDF is used to generate a massive number of forecasting error scenarios. An optimized swinging door algorithm is adopted to extract all the WPRs from the complete set of wind power forecasting scenarios. The probabilistic forecasting results of ramp duration and start time are generated based on all scenarios. Numerical simulations on publicly available wind power data show that within a predefined tolerance level, the developed probabilistic wind power ramp forecasting method is able to predict WPRs with a high level of sharpness and accuracy.

  8. Single Trial Probability Applications: Can Subjectivity Evade Frequency Limitations?

    Directory of Open Access Journals (Sweden)

    David Howden

    2009-10-01

    Full Text Available Frequency probability theorists define an event’s probability distribution as the limit of a repeated set of trials belonging to a homogeneous collective. The subsets of this collective are events about which we have deficient knowledge on an individual level, although for the larger collective we have knowledge of its aggregate behavior. Hence, probabilities can only be obtained through repeated trials of these subsets, arriving at the established frequencies that define the probabilities. Crovelli (2009) argues that this is a mistaken approach, and that a subjective assessment of individual trials should be used instead. Bifurcating between the two concepts of risk and uncertainty, Crovelli first asserts that probability is the tool used to manage uncertain situations, and then attempts to rebuild a definition of probability theory with this in mind. We show that such an attempt has little to gain, and results in an indeterminate application of entrepreneurial forecasting to uncertain decisions, a process far removed from any application of probability theory.

  9. Determining effective forecast horizons for multi-purpose reservoirs with short- and long-term operating objectives

    Science.gov (United States)

    Luchner, Jakob; Anghileri, Daniela; Castelletti, Andrea

    2017-04-01

    Real-time control of multi-purpose reservoirs can benefit significantly from hydro-meteorological forecast products. Because of their reliability, the most widely used forecasts cover time scales from hours to a few days and are suitable for short-term operation targets such as flood control. In recent years, hydro-meteorological forecasts have become more accurate and reliable on longer time scales, which are more relevant to long-term reservoir operation targets such as water supply. While the forecast quality of such products has been studied extensively, the forecast value, i.e. the operational effectiveness of using forecasts to support water management, has received comparatively little attention. It is comparatively easy to identify the most effective forecasting information needed to design reservoir operation rules for flood control, but it is not straightforward to identify which forecast variable and lead time are needed to define effective hedging rules for operational targets with slow dynamics such as water supply. The task is even more complex when multiple targets, with diverse slow and fast dynamics, are considered at the same time. In these cases, the relative importance of different pieces of information, e.g. the magnitude and timing of the peak flow rate and the inflow accumulated over different time lags, may vary depending on the season or the hydrological conditions. In this work, we analyze the relationship between operational forecast value and streamflow forecast horizon for different multi-purpose reservoir trade-offs. We use the Information Selection and Assessment (ISA) framework to identify the most effective forecast variables and horizons for informing multi-objective reservoir operation over short- and long-term temporal scales. The ISA framework is an automatic iterative procedure to discriminate the information with the highest potential to improve multi-objective reservoir operating performance. Forecast variables and horizons are selected using a feature

  10. Evaluation of ensemble precipitation forecasts generated through post-processing in a Canadian catchment

    Science.gov (United States)

    Jha, Sanjeev K.; Shrestha, Durga L.; Stadnyk, Tricia A.; Coulibaly, Paulin

    2018-03-01

    Flooding in Canada is often caused by heavy rainfall during the snowmelt period. Hydrologic forecast centers rely on precipitation forecasts obtained from numerical weather prediction (NWP) models to force hydrological models for streamflow forecasting. The uncertainties in raw quantitative precipitation forecasts (QPFs) are amplified by physiographic and orographic effects over a diverse landscape, particularly in the western catchments of Canada. A Bayesian post-processing approach called rainfall post-processing (RPP), developed in Australia (Robertson et al., 2013; Shrestha et al., 2015), has been applied to assess its forecast performance in a Canadian catchment. Raw QPFs obtained from two sources, the Global Ensemble Forecasting System (GEFS) Reforecast 2 project, from the National Centers for Environmental Prediction, and the Global Deterministic Forecast System (GDPS), from Environment and Climate Change Canada, are used in this study. The study period from January 2013 to December 2015 covered a major flood event in Calgary, Alberta, Canada. Post-processed results show that the RPP is able to remove the bias and reduce the errors of both GEFS and GDPS forecasts. Ensembles generated from the RPP reliably quantify the forecast uncertainty.

  11. Forecasting Wind and Solar Generation: Improving System Operations, Greening the Grid (Spanish Version)

    Energy Technology Data Exchange (ETDEWEB)

    Tian, Tian; Chernyakhovskiy, Ilya; Brancucci Martinez-Anido, Carlo

    2016-04-01

    This document is the Spanish version of 'Greening the Grid: Forecasting Wind and Solar Generation, Improving System Operations'. It discusses improving system operations through wind and solar generation forecasting. By integrating variable renewable energy (VRE) forecasts into system operations, power system operators can anticipate up- and down-ramps in VRE generation in order to cost-effectively balance load and generation in intra-day and day-ahead scheduling. This leads to reduced fuel costs, improved system reliability, and maximum use of renewable resources.

  12. Optimizing Probability of Detection Point Estimate Demonstration

    Science.gov (United States)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used to assess the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. NDE methods are required to detect real flaws such as cracks and crack-like flaws, and a reliably detectable crack size is required for safe-life analysis of fracture-critical parts. The paper discusses optimizing POD demonstration experiments using the point estimate method, which is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. The optimization is performed to provide an acceptable probability of passing the demonstration (PPD) and an acceptable probability of false call (POF) while keeping the flaw sizes in the set as small as possible.
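
    The binomial arithmetic behind the point-estimate demonstration is compact enough to sketch. The 29-flaw, zero-miss plan is the standard design referred to above; the helper function and the example POD values are ours:

```python
from math import comb

def prob_pass(n_flaws, max_misses, true_pod):
    """Probability of passing a binomial POD demonstration that allows
    at most max_misses missed detections among n_flaws flaws, given a
    true per-flaw probability of detection true_pod."""
    return sum(comb(n_flaws, k) * (1.0 - true_pod) ** k * true_pod ** (n_flaws - k)
               for k in range(max_misses + 1))

# Classic 29-of-29 point-estimate plan (no misses allowed): a flaw size
# whose true POD is exactly 0.90 passes only about 4.7% of the time,
# which is why surviving 29/29 supports a 90% POD / 95% confidence claim.
ppd_at_090 = prob_pass(29, 0, 0.90)
ppd_at_099 = prob_pass(29, 0, 0.99)
```

    Allowing a miss or two with a larger flaw set trades a higher PPD against a larger demonstration, which is precisely the kind of optimization the paper discusses.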

  13. Natural gas demand forecast system based on the application of artificial neural networks

    International Nuclear Information System (INIS)

    Sanfeliu, J.M.; Doumanian, J.E.

    1997-01-01

    Gas Natural BAN, a gas distribution company operating since 1993 in the north and west areas of Buenos Aires, Argentina, with 1,000,000 customers, had to develop a gas demand forecast system that complied with the following basic requirements: produce reliable forecasts from short historical records (2 years); distinguish demands in areas of different characteristics, i.e. mainly residential or mainly industrial; and have self-learning capability. To accomplish the above goals, Gas Natural BAN chose, in view of its own needs, an artificial intelligence application (neural networks). 'SANDRA', the gas demand forecast system used by Gas Natural BAN, has the following features: daily gas demand forecast, hourly gas demand forecast, and a breakdown of both forecasts for each of the 3 basic zones into which the distribution area of Gas Natural BAN is divided. (au)

  14. Analysis of probability of defects in the disposal canisters

    International Nuclear Information System (INIS)

    Holmberg, J.-E.; Kuusela, P.

    2011-06-01

    This report presents a probability model for the reliability of the spent nuclear waste final disposal canister. Reliability means here that the welding of the canister lid has no critical defects from the long-term safety point of view. From this point of view, the reliability of the welding process (that no critical defects are created) and of the non-destructive testing (NDT) process (that all critical defects are detected) are equally important. In the probability model, critical defects in a weld were simplified into a few types, and the possibility of human error in the NDT process was taken into account in a simple manner. At the moment there is very little representative data with which to determine the reliability of welding, and the available NDT data are not well suited to the needs of this study. The calculations presented here are therefore based on expert judgement and on several assumptions that have not yet been verified. The Bayesian probability model shows the importance of uncertainty in the estimation of the reliability parameters: the effect of this uncertainty is that the probability distribution of the number of defective canisters becomes flat for larger numbers of canisters, compared with the binomial probability distribution obtained for known parameter values. In order to reduce the uncertainty, more information is needed on the reliability of both the welding and NDT processes. It would also be important to analyse the role of human factors in these processes, since their role is not reflected in the typical test data used to estimate 'normal process variation'. The reported model should be seen as a tool to quantify the roles of different methods and procedures in the weld inspection process. (orig.)
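
    The flattening effect described above can be illustrated with a toy calculation: compare the binomial distribution of the defect count for an exactly known defect probability with the beta-binomial predictive distribution that results when a probability with the same mean is uncertain. All numbers below are illustrative assumptions, not the report's data.

```python
from math import comb, exp, lgamma

N = 1000        # number of canisters, illustrative
p = 0.001       # assumed known probability of an undetected critical defect

def binom_pmf(k, n, q):
    """Binomial PMF: defect count when the parameter q is known exactly."""
    return comb(n, k) * q ** k * (1.0 - q) ** (n - k)

# Uncertain parameter: a Beta(a, b) distribution on q with the same mean
# (a / (a + b) = 0.001) but substantial spread; the predictive distribution
# of the defect count is then beta-binomial.
a, b = 0.5, 499.5

def betabinom_pmf(k, n, a, b):
    """Beta-binomial PMF, computed via log-gamma for numerical stability."""
    return exp(lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
               + lgamma(k + a) + lgamma(n - k + b) + lgamma(a + b)
               - lgamma(a) - lgamma(b) - lgamma(n + a + b))

binom = [binom_pmf(k, N, p) for k in range(11)]
betab = [betabinom_pmf(k, N, a, b) for k in range(11)]
# Parameter uncertainty flattens the distribution: the beta-binomial puts
# more mass both at zero defects and in the tail than the plain binomial.
```
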

  15. Enhancing COSMO-DE ensemble forecasts by inexpensive techniques

    Directory of Open Access Journals (Sweden)

    Zied Ben Bouallègue

    2013-02-01

    Full Text Available COSMO-DE-EPS, a convection-permitting ensemble prediction system based on the high-resolution numerical weather prediction model COSMO-DE, has been pre-operational since December 2010, providing probabilistic forecasts covering Germany. The ensemble system comprises 20 members based on variations of the lateral boundary conditions, the physics parameterizations and the initial conditions. In order to increase the sample size in a computationally inexpensive way, COSMO-DE-EPS is combined with alternative ensemble techniques: the neighborhood method and the time-lagged approach. Their impact on the quality of the resulting probabilistic forecasts is assessed. Objective verification is performed over a six-month period, and scores based on the Brier score and its decomposition are shown for June 2011. The combination of the ensemble system with the alternative approaches improves probabilistic forecasts of precipitation, in particular for high precipitation thresholds. Moreover, combining COSMO-DE-EPS with only the time-lagged approach improves the skill of area probabilities for precipitation and does not deteriorate the skill of 2-m temperature and wind gust forecasts.
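
    The Brier score decomposition used in such verification can be sketched on synthetic data. The ten-bin scheme and the perfectly calibrated toy forecasts below are assumptions for illustration, not details of the COSMO-DE-EPS study:

```python
import numpy as np

def brier_decomposition(p_forecast, outcomes, n_bins=10):
    """Murphy decomposition of the Brier score into reliability,
    resolution and uncertainty, using forecasts binned into n_bins."""
    p = np.asarray(p_forecast, dtype=float)
    o = np.asarray(outcomes, dtype=float)
    n = p.size
    obar = o.mean()
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    idx = np.digitize(p, edges[1:-1])     # bin index 0 .. n_bins-1
    rel = 0.0
    res = 0.0
    for k in range(n_bins):
        mask = idx == k
        nk = mask.sum()
        if nk == 0:
            continue
        pk, ok = p[mask].mean(), o[mask].mean()
        rel += nk * (pk - ok) ** 2        # reliability: calibration error
        res += nk * (ok - obar) ** 2      # resolution: discrimination
    return rel / n, res / n, obar * (1.0 - obar)

# Perfectly calibrated toy forecasts: the event occurs with exactly the
# forecast probability, so the reliability term should be near zero.
rng = np.random.default_rng(1)
p = rng.uniform(size=20000)
y = (rng.uniform(size=20000) < p).astype(float)
rel, res, unc = brier_decomposition(p, y)
brier = float(np.mean((p - y) ** 2))
# Up to a small within-bin residual, brier == rel - res + unc.
```
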

  16. A global flash flood forecasting system

    Science.gov (United States)

    Baugh, Calum; Pappenberger, Florian; Wetterhall, Fredrik; Hewson, Tim; Zsoter, Ervin

    2016-04-01

    The sudden and devastating nature of flash flood events makes it imperative to provide early warnings such as those derived from Numerical Weather Prediction (NWP) forecasts. Currently such systems exist on basin, national and continental scales in Europe, North America and Australia, but they rely on high-resolution NWP forecasts or rainfall-radar nowcasting, neither of which has global coverage. To produce global flash flood forecasts, this work investigates the possibility of using forecasts from a global NWP system. In particular we: (i) discuss how global NWP can be used for flash flood forecasting, with its strengths and weaknesses; (ii) demonstrate how a robust evaluation can be performed given the rarity of the event; (iii) highlight the challenges and opportunities in communicating flash flood uncertainty to decision makers; and (iv) explore future developments which would significantly improve global flash flood forecasting. The proposed forecast system uses ensemble surface runoff forecasts from the ECMWF H-TESSEL land surface scheme. A flash flood index is generated using the ERIC (Enhanced Runoff Index based on Climatology) methodology [Raynaud et al., 2014]. This global methodology is applied to a series of flash floods across southern Europe. Results from the system are compared against warnings produced using the higher-resolution COSMO-LEPS limited-area model. The global system is evaluated by comparing forecasted warning locations against a flash flood database of media reports created in partnership with floodlist.com. To deal with the lack of objectivity in media reports we carefully assess the suitability of different skill scores and apply spatial uncertainty thresholds to the observations. To communicate the uncertainties of the flash flood system output we experiment with a dynamic region-growing algorithm. This automatically clusters regions of similar return-period exceedance probabilities, thus presenting the at-risk areas at a spatial

  17. Use of Markov chains for forecasting labor requirements in black coal mines

    Energy Technology Data Exchange (ETDEWEB)

    Penar, L.; Przybyla, H.

    1987-01-01

    Increasing mining depth, deterioration of mining conditions and technology development cause changes in labor requirements. In mines with stable coal output these changes are in most cases qualitative; in mines with increasing or decreasing coal output they are quantitative. Methods for forecasting personnel needs, in particular professional requirements, are discussed. Quantitative and qualitative changes are accurately described by heterogeneous Markov chains. A structure consisting of interdependent variables is the subject of the forecast, and the changes that occur within this structure over time are the subject of investigation. For a homogeneous Markov chain, the probabilities of a transition from state i to state j are time-independent; for a heterogeneous Markov chain, they depend on time. The method was developed for ODRA 1325 computers. 8 refs.
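
    The forecasting mechanics can be sketched as follows; the three-state personnel structure and the transition matrices are illustrative assumptions, not data from the paper:

```python
import numpy as np

# Illustrative three-state personnel structure (e.g. face workers,
# technicians, supervisors); all numbers are assumptions.
x0 = np.array([600.0, 300.0, 100.0])

# Homogeneous case: one time-independent transition matrix; row i gives
# the probabilities of moving from state i to each state j in one year.
P = np.array([[0.90, 0.08, 0.02],
              [0.05, 0.90, 0.05],
              [0.00, 0.10, 0.90]])

# Heterogeneous case: the matrix itself changes from year to year,
# e.g. as mining depth increases and technology develops.
P_by_year = [P,
             np.array([[0.85, 0.12, 0.03],
                       [0.05, 0.88, 0.07],
                       [0.00, 0.08, 0.92]])]

x = x0.copy()
for P_t in P_by_year:       # two-year forecast of the labor structure
    x = x @ P_t
```

    Because each row sums to one, the total workforce is conserved while its professional structure shifts between states.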

  18. Short-Term Wind Speed Hybrid Forecasting Model Based on Bias Correcting Study and Its Application

    OpenAIRE

    Mingfei Niu; Shaolong Sun; Jie Wu; Yuanlei Zhang

    2015-01-01

    The accuracy of wind speed forecasting is becoming increasingly important to improve and optimize renewable wind power generation. In particular, reliable short-term wind speed forecasting can enable model predictive control of wind turbines and real-time optimization of wind farm operation. However, due to the strong stochastic nature and dynamic uncertainty of wind speed, the forecasting of wind speed data using different patterns is difficult. This paper proposes a novel combination bias c...

  19. Forecasting of future earthquakes in the northeast region of India considering energy released concept

    Science.gov (United States)

    Zarola, Amit; Sil, Arjun

    2018-04-01

    This study presents the forecasting of the time and magnitude of the next earthquake in northeast India, using four probability distribution models (Gamma, Lognormal, Weibull and Log-logistic) and an updated earthquake catalog of magnitude Mw ≥ 6.0 events that occurred from 1737 to 2015 in the study area. On the basis of the past seismicity of the region, two types of conditional probabilities have been estimated using the best-fit model and its parameters. The first is the probability that the seismic energy (e × 10^20 ergs) expected to be released in the future earthquake exceeds a certain level of seismic energy (E × 10^20 ergs). The second is the probability that the seismic energy (a × 10^20 ergs/year) expected to be released per year exceeds a certain level of seismic energy per year (A × 10^20 ergs/year). The log-likelihood functions (ln L) were also estimated for all four probability distribution models; a higher value of ln L indicates a better model and a lower value a worse one. The time of the future earthquake is forecast by dividing the total seismic energy expected to be released in the future earthquake by the total seismic energy expected to be released per year. The epicentres of the recent 4 January 2016 Manipur earthquake (M 6.7), 13 April 2016 Myanmar earthquake (M 6.9) and 24 August 2016 Myanmar earthquake (M 6.8) are located in zones Z.12, Z.16 and Z.15, respectively, which are identified seismic source zones in the study area; this shows that the proposed techniques and models yield good forecasting accuracy.
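
    The model-comparison step (choosing the distribution with the highest ln L) can be sketched on synthetic recurrence intervals; the lognormal MLE is closed-form, and a method-of-moments gamma fit stands in for the full maximum-likelihood fits used in the study:

```python
import numpy as np
from math import lgamma, log, pi

# Synthetic inter-event times (years) standing in for the catalog's
# M >= 6.0 recurrence intervals; purely illustrative data.
rng = np.random.default_rng(2)
t = rng.lognormal(mean=1.0, sigma=1.0, size=500)
n = t.size

# Lognormal: the MLE has closed form in terms of log(t).
mu, sig = np.log(t).mean(), np.log(t).std()
lnL_lognorm = float(np.sum(-np.log(t * sig * np.sqrt(2.0 * pi))
                           - (np.log(t) - mu) ** 2 / (2.0 * sig ** 2)))

# Gamma: method-of-moments fit (shape k, scale theta).
m, v = t.mean(), t.var()
k, theta = m * m / v, v / m
lnL_gamma = float(np.sum((k - 1.0) * np.log(t) - t / theta)
                  - n * (k * log(theta) + lgamma(k)))

# As in the study, the candidate with the larger ln L is preferred.
best = 'lognormal' if lnL_lognorm > lnL_gamma else 'gamma'
```
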

  20. Lightning Initiation Forecasting: An Operational Dual-Polarimetric Radar Technique

    Science.gov (United States)

    Woodard, Crystal J.; Carey, L. D.; Petersen, W. A.; Roeder, W. P.

    2011-01-01

    The objective of this NASA MSFC and NOAA CSTAR funded study is to develop and test operational forecast algorithms for the prediction of lightning initiation utilizing the C-band dual-polarimetric radar, UAHuntsville's Advanced Radar for Meteorological and Operational Research (ARMOR). Although there is a rich research history of radar signatures associated with lightning initiation, few studies have utilized dual-polarimetric radar signatures (e.g., Z(sub dr) columns) and capabilities (e.g., fuzzy-logic particle identification [PID] of precipitation ice) in an operational algorithm for first flash forecasting. The specific goal of this study is to develop and test polarimetric techniques that enhance the performance of current operational radar reflectivity based first flash algorithms. Improving lightning watch and warning performance will positively impact personnel safety in both work and leisure environments. Advanced warnings can provide space shuttle launch managers time to respond appropriately to secure equipment and personnel, while they can also provide appropriate warnings for spectators and players of leisure sporting events to seek safe shelter. Through the analysis of eight case dates, consisting of 35 pulse-type thunderstorms and 20 non-thunderstorm case studies, lightning initiation forecast techniques were developed and tested. The hypothesis is that the additional dual-polarimetric information could potentially reduce false alarms while maintaining high probability of detection and increasing lead-time for the prediction of the first lightning flash relative to reflectivity-only based techniques. To test the hypothesis, various physically-based techniques using polarimetric variables and/or PID categories, which are strongly correlated to initial storm electrification (e.g., large precipitation ice production via drop freezing), were benchmarked against the operational reflectivity-only based approaches to find the best compromise between

  1. Economic evaluation of short-term wind power forecast in ERCOT. Preliminary results

    Energy Technology Data Exchange (ETDEWEB)

    Orwig, Kirsten D.; Hodge, Bri-Mathias; Brinkman, Greg; Ela, Erik; Milligan, Michael [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Banunarayanan, Venkat; Nasir, Saleh [ICF International, Fairfax, VA (United States); Freedman, Jeff [AWS Truepower, Albany, NY (United States)

    2012-07-01

    A number of wind energy integration studies have investigated the monetary value of using day-ahead wind power forecasts for grid operation decisions. Historically, these studies have shown that large cost savings could be gained by grid operators implementing the forecasts in their system operations. To date, none of these studies have investigated the value of shorter term (0- to 6-h ahead) wind power forecasts. In 2010, the Department of Energy and the National Oceanic and Atmospheric Administration partnered to form the Wind Forecasting Improvement Project (WFIP) to fund improvements in short-term wind forecasts and determine the economic value of these improvements to grid operators. In this work, we discuss the preliminary results of the economic benefit analysis portion of the WFIP for the Electric Reliability Council of Texas. The improvements seen in the wind forecasts are examined and the economic results of a production cost model simulation are analyzed. (orig.)

  2. Demonstrating the value of larger ensembles in forecasting physical systems

    Directory of Open Access Journals (Sweden)

    Reason L. Machete

    2016-12-01

    Full Text Available Ensemble simulation propagates a collection of initial states forward in time in a Monte Carlo fashion. Depending on the fidelity of the model and the properties of the initial ensemble, the goal of ensemble simulation can range from merely quantifying variations in the sensitivity of the model all the way to providing actionable probability forecasts of the future. Whatever the goal, success depends on the properties of the ensemble, and there is a longstanding discussion in meteorology as to the size of initial-condition ensemble most appropriate for Numerical Weather Prediction. In terms of resource allocation: how is one to divide finite computing resources between model complexity, ensemble size, data assimilation and other components of the forecast system? One wishes to avoid undersampling the information available from the model's dynamics, yet one also wishes to use the highest-fidelity model available. Arguably, a higher-fidelity model can better exploit a larger ensemble; nevertheless it is often suggested that a relatively small ensemble, say ~16 members, is sufficient and that larger ensembles are not an effective investment of resources. This claim is shown to be dubious when the goal is probabilistic forecasting, even in settings where the forecast model is informative but imperfect. Probability forecasts for a ‘simple’ physical system are evaluated at different lead times; ensembles of up to 256 members are considered. The pure density estimation context (where ensemble members are drawn from the same underlying distribution as the target) differs from the forecasting context, where one is given a high-fidelity (but imperfect) model. In the forecasting context, the information provided by additional members depends also on the fidelity of the model, the ensemble formation scheme (data assimilation), the ensemble interpretation and the nature of the observational noise. The effect of increasing the ensemble size is quantified by
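
    The claim can be illustrated with the raw ensemble CRPS estimator, whose expected value shrinks as members are added even when every member is drawn from the correct distribution. The standard normal 'truth' and the 8-versus-256 comparison below are illustrative assumptions, not the paper's experiment:

```python
import numpy as np

rng = np.random.default_rng(3)

def crps_ensemble(members, obs):
    """Empirical CRPS of an ensemble against a scalar verification:
    mean|X - y| - 0.5 * mean|X - X'| (the raw, size-biased estimator)."""
    m = np.asarray(members, dtype=float)
    return np.abs(m - obs).mean() - 0.5 * np.abs(m[:, None] - m[None, :]).mean()

def avg_crps(n_members, n_cases=2000):
    """Average CRPS when both ensemble and verification are drawn from
    the same standard normal 'truth' (a perfect but finite ensemble)."""
    scores = [crps_ensemble(rng.normal(size=n_members), rng.normal())
              for _ in range(n_cases)]
    return float(np.mean(scores))

small, large = avg_crps(8), avg_crps(256)
# Even with a perfect model, the 8-member ensemble scores noticeably
# worse than the 256-member one: sampling density alone carries value.
```
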

  3. Quantitative assessment of probability of failing safely for the safety instrumented system using reliability block diagram method

    International Nuclear Information System (INIS)

    Jin, Jianghong; Pang, Lei; Zhao, Shoutang; Hu, Bin

    2015-01-01

    Highlights: • Models of PFS for SIS were established using reliability block diagrams. • A more accurate calculation of PFS for SIS can be obtained by using the SL. • Degraded operation of a complex SIS does not affect the availability of the SIS. • Safe undetected failures are the largest contribution to the PFS of the SIS. - Abstract: Spurious trips of a safety instrumented system (SIS) bring great economic losses to production, so ensuring that the SIS is both reliable and available has become a pressing concern. However, the existing models of spurious trip rate (STR) and probability of failing safely (PFS) are oversimplified and inaccurate, and more in-depth studies of availability are required to obtain a more accurate PFS for the SIS. Based on an analysis of the factors that influence the PFS of the SIS, a quantitative study of the PFS is carried out using the reliability block diagram (RBD) method, and some application examples are given. The results show that common cause failure increases the PFS; that degraded operation does not affect the availability of the SIS; that if the equipment is tested and repaired one item at a time, the unavailability of the SIS can be ignored; that the occurrence time of an independent safe undetected failure should be the system lifecycle (SL) rather than the proof test interval; and that independent safe undetected failures are the largest contribution to the PFS of the SIS.
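
    A minimal sketch of the reliability-block-diagram arithmetic behind a PFS calculation, assuming a purely illustrative safe-failure rate and two simple voting architectures (the paper's actual models are considerably more detailed):

```python
from math import exp

# Illustrative safe-failure (spurious trip) rate per hour and a one-year
# interval; both values are assumptions, not the paper's data.
lambda_s = 2e-6
t_hours = 8760.0

# Per-channel probability of a safe failure during the interval.
q = 1.0 - exp(-lambda_s * t_hours)

# Reliability-block-diagram logic for two common voting architectures:
# 1oo2 trips spuriously if EITHER channel fails safely (series logic),
# 2oo2 trips spuriously only if BOTH channels fail safely (parallel logic).
pfs_1oo2 = 1.0 - (1.0 - q) ** 2
pfs_2oo2 = q ** 2
```

    The same series/parallel composition extends block by block to larger architectures, which is what makes the RBD method convenient for quantitative PFS studies.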

  4. Improving wave forecasting by integrating ensemble modelling and machine learning

    Science.gov (United States)

    O'Donncha, F.; Zhang, Y.; James, S. C.

    2017-12-01

    Modern smart-grid networks use technologies to instantly relay information on supply and demand to support effective decision making. Integration of renewable-energy resources with these systems demands accurate forecasting of energy production (and demand) capacities. For wave-energy converters, this requires wave-condition forecasting to enable estimates of energy production. Current operational wave forecasting systems exhibit substantial errors, with wave-height RMSEs of 40 to 60 cm being typical, which limits the reliability of energy-generation predictions and thereby impedes integration with the distribution grid. In this study, we integrate physics-based models with statistical learning aggregation techniques that combine forecasts from multiple, independent models into a single "best-estimate" prediction of the true state. The Simulating Waves Nearshore physics-based model is used to compute wind- and current-augmented waves in the Monterey Bay area. Ensembles are developed from multiple simulations that perturb the input data supplied to the model (wave characteristics at the model boundaries, and winds). A learning-aggregation technique uses past observations and past model forecasts to calculate a weight for each model. The aggregated forecasts are compared to observation data to quantify the performance of the model ensemble and aggregation techniques. The appropriately weighted ensemble model outperforms an individual ensemble member with regard to forecasting wave conditions.
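
    The learning-aggregation idea can be sketched in its simplest form, inverse-MSE weighting on a training window; the synthetic wave heights and error levels below are assumptions, and the study's actual aggregation technique may differ:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic "true" significant wave height (m) and three imperfect model
# forecasts with different error levels; all values are illustrative.
truth = 1.5 + 0.5 * np.sin(np.linspace(0.0, 12.0, 1000))
models = [truth + rng.normal(0.0, s, truth.size) for s in (0.2, 0.4, 0.8)]

# Weight each model by the inverse of its mean squared error on a
# training window, then combine the forecasts with those weights.
train, test = slice(0, 500), slice(500, 1000)
mse = np.array([np.mean((m[train] - truth[train]) ** 2) for m in models])
w = (1.0 / mse) / np.sum(1.0 / mse)
combined = sum(wi * m for wi, m in zip(w, models))

# Out-of-sample check: the weighted combination should beat each member.
rmse_single = min(np.sqrt(np.mean((m[test] - truth[test]) ** 2)) for m in models)
rmse_combined = float(np.sqrt(np.mean((combined[test] - truth[test]) ** 2)))
```

    With independent errors, inverse-variance weighting is the optimal linear combination, which is why the aggregate outperforms even the best single model.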

  5. Towards the Olympic Games: Guanabara Bay Forecasting System and its Application on the Floating Debris Cleaning Actions.

    Science.gov (United States)

    Pimentel, F. P.; Marques Da Cruz, L.; Cabral, M. M.; Miranda, T. C.; Garção, H. F.; Oliveira, A. L. S. C.; Carvalho, G. V.; Soares, F.; São Tiago, P. M.; Barmak, R. B.; Rinaldi, F.; dos Santos, F. A.; Da Rocha Fragoso, M.; Pellegrini, J. C.

    2016-02-01

    Marine debris is a widespread pollution issue that affects almost all water bodies and is particularly relevant in estuaries and bays. Rio de Janeiro will host the 2016 Olympic Games, and Guanabara Bay will be the venue for the sailing competitions. Having historically served as a deposit for all types of waste, this water body suffers from major environmental problems, one of them being the massive presence of floating garbage. It is therefore of great importance to count on effective contingency actions to address this issue. To this end, an operational ocean forecasting system was designed and is presently being used by the Rio de Janeiro State Government to manage and control the cleaning actions on the bay. The forecasting system makes use of high-resolution hydrodynamic and atmospheric models and a Lagrangian particle transport model, in order to provide probabilistic forecast maps of the areas where the debris is most probably accumulating. All the results are displayed on an interactive GIS web platform along with the tracks of the boats that collect the garbage, so the decision makers can easily command the actions, enhancing their efficiency. The integration of in situ data and advanced techniques such as Lyapunov exponent analysis is also being developed in the system, so as to increase its forecast reliability. Additionally, the system gathers and compiles in its database all the information on the debris collection, including quantity, type, locations, accumulation areas and their correlation with the environmental factors that drive the runoff and surface drift. Combining probabilistic, deterministic and statistical approaches, the forecasting system of Guanabara Bay is proving to be a powerful tool for environmental management and will be of great importance in helping to secure the safety and fairness of the Olympic sailing competitions. The system design, its components and main results are presented in this paper.

  6. Peak Wind Tool for General Forecasting

    Science.gov (United States)

    Barrett, Joe H., III

    2010-01-01

    The expected peak wind speed of the day is an important forecast element in the 45th Weather Squadron's (45 WS) daily 24-Hour and Weekly Planning Forecasts. The forecasts are used for ground and space launch operations at the Kennedy Space Center (KSC) and Cape Canaveral Air Force Station (CCAFS). The 45 WS also issues wind advisories for KSC/CCAFS when they expect wind gusts to meet or exceed 25 kt, 35 kt and 50 kt thresholds at any level from the surface to 300 ft. The 45 WS forecasters have indicated that peak wind speeds are challenging to forecast, particularly in the cool-season months of October - April. In Phase I of this task, the Applied Meteorology Unit (AMU) developed a tool to help the 45 WS forecast non-convective winds at KSC/CCAFS for the 24-hour period of 0800 to 0800 local time. The tool was delivered as a Microsoft Excel graphical user interface (GUI). The GUI displayed the forecast of peak wind speed, the 5-minute average wind speed at the time of the peak wind, the timing of the peak wind and the probability the peak speed would meet or exceed 25 kt, 35 kt and 50 kt. For the current task (Phase II), the 45 WS requested that additional observations be used in the creation of the forecast equations by expanding the period of record (POR). Additional parameters were evaluated as predictors, including wind speeds between 500 ft and 3000 ft, static stability classification, Bulk Richardson Number, mixing depth, vertical wind shear, temperature inversion strength and depth, and wind direction. Using a verification data set, the AMU compared the performance of the Phase I and II prediction methods. Just as in Phase I, the tool was delivered as a Microsoft Excel GUI. The 45 WS requested the tool also be made available in the Meteorological Interactive Data Display System (MIDDS). The AMU first expanded the POR by two years by adding tower observations, surface observations and CCAFS (XMR) soundings for the cool-season months of March 2007 to April 2009. The POR was expanded

  7. Probabilistic forecasting of the solar irradiance with recursive ARMA and GARCH models

    DEFF Research Database (Denmark)

    David, M.; Ramahatana, F.; Trombe, Pierre-Julien

    2016-01-01

    Forecasting of the solar irradiance is a key feature in order to increase the penetration rate of solar energy into the energy grids. Indeed, the anticipation of the fluctuations of the solar renewables allows a better management of the production means of electricity and a better operation...... sky index show some similarities with that of financial time series. The aim of this paper is to assess the performances of a commonly used combination of two linear models (ARMA and GARCH) in econometrics in order to provide probabilistic forecasts of solar irradiance. In addition, a recursive...... regarding the statistical distribution of the error, the reliability of the probabilistic forecasts stands in the same order of magnitude as other works done in the field of solar forecasting....

  8. Calculating Cumulative Binomial-Distribution Probabilities

    Science.gov (United States)

    Scheuer, Ernest M.; Bowerman, Paul N.

    1989-01-01

    Cumulative-binomial computer program, CUMBIN, one of set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. CUMBIN, NEWTONP (NPO-17556), and CROSSER (NPO-17557), used independently of one another. Reliabilities and availabilities of k-out-of-n systems analyzed. Used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. Used for calculations of reliability and availability. Program written in C.
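The tail probability CUMBIN computes, and its use for k-out-of-n reliability, can be written in a few lines. This is a plain Python illustration of the same arithmetic, not the NASA C program itself:

```python
from math import comb

def cum_binomial(n, k, p):
    """P(X >= k) for X ~ Binomial(n, p): probability of at least k successes."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def k_out_of_n_reliability(n, k, r):
    """A k-out-of-n system works if at least k of its n identical components
    (each with reliability r) work, which is exactly the cumulative binomial tail."""
    return cum_binomial(n, k, r)
```

For example, a 1-out-of-2 (fully redundant) system of components with reliability 0.9 has system reliability 1 - 0.1 * 0.1 = 0.99.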

  9. Forecasting the Seasonal Timing of Maine's Lobster Fishery

    Directory of Open Access Journals (Sweden)

    Katherine E. Mills

    2017-11-01

    Full Text Available The fishery for American lobster is currently the highest-valued commercial fishery in the United States, worth over US$620 million in dockside value in 2015. During a marine heat wave in 2012, the fishery was disrupted by the early warming of spring ocean temperatures and subsequent influx of lobster landings. This situation resulted in a price collapse, as the supply chain was not prepared for the early and abundant landings of lobsters. Motivated by this series of events, we have developed a forecast of when the Maine (USA) lobster fishery will shift into its high-volume summer landings period. The forecast uses a regression approach to relate spring ocean temperatures derived from four NERACOOS buoys along the coast of Maine to the start day of the high landings period of the fishery. Tested against conditions in past years, the forecast is able to predict the start day to within 1 week of the actual start, and the forecast can be issued 3–4 months prior to the onset of the high-landings period, providing valuable lead time for the fishery and its associated supply chain to prepare for the upcoming season. Forecast results are conveyed in a probabilistic manner and are updated weekly over a 6-week forecasting period so that users can assess the certainty and consistency of the forecast and factor the uncertainty into their use of the information in a given year. By focusing on the timing of events, this type of seasonal forecast provides climate-relevant information to users at time scales that are meaningful for operational decisions. As climate change alters seasonal phenology and reduces the reliability of past experience as a guide for future expectations, this type of forecast can enable fishing industry participants to better adjust to and prepare for operating in the context of climate change.

  10. Applications of the gambling score in evaluating earthquake predictions and forecasts

    Science.gov (United States)

    Zhuang, Jiancang; Zechar, Jeremy D.; Jiang, Changsheng; Console, Rodolfo; Murru, Maura; Falcone, Giuseppe

    2010-05-01

    This study presents a new method, the gambling score, for scoring the performance of earthquake forecasts or predictions. Unlike most other scoring procedures, which require a regular forecast scheme and treat every earthquake equally regardless of its magnitude, this new scoring method compensates for the risk that the forecaster has taken. Starting with a certain number of reputation points, once a forecaster makes a prediction or forecast, he is assumed to have bet some of his reputation points. The reference model, which plays the role of the house, determines according to a fair rule how many reputation points the forecaster gains if he succeeds, and takes away the reputation points bet by the forecaster if he loses. The method is also extended to the continuous case of point-process models, where the reputation points bet by the forecaster become a continuous mass on the space-time-magnitude range of interest. For discrete predictions, we apply this method to evaluate the performance of Shebalin's predictions made using the Reverse Tracing of Precursors (RTP) algorithm and of the predictions issued by the Annual Consultation Meeting on Earthquake Tendency held by the China Earthquake Administration. For the continuous case, we use it to compare probability forecasts of seismicity in the Abruzzo region before and after the L'Aquila earthquake based on the ETAS model and the PPE model.
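The fair rule described above can be sketched concretely: if the forecaster bets r reputation points on an event to which the reference model assigns probability p0, a fair payoff is r(1 - p0)/p0 on success and -r on failure, so the expected gain under the reference model is zero. A minimal illustration (the paper's exact payoff rule may differ in detail):

```python
def gambling_score(bets):
    """Total reputation change for a list of bets (r, p0, occurred):
    r reputation points are bet on an event to which the reference model
    assigns probability p0; the fair payoff is r*(1-p0)/p0 on success and
    -r on failure, so the expected gain under the reference model is zero."""
    total = 0.0
    for r, p0, occurred in bets:
        total += r * (1.0 - p0) / p0 if occurred else -r
    return total
```

Betting on a rare event (small p0) thus yields a large reward when it occurs, which is how the score compensates for the risk the forecaster takes.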

  11. House Price Forecasts, Forecaster Herding, and the Recent Crisis

    DEFF Research Database (Denmark)

    Stadtmann, Georg; Pierdzioch; Ruelke

    2013-01-01

    We used the Wall Street Journal survey data for the period 2006–2012 to analyze whether forecasts of house prices and housing starts provide evidence of (anti-)herding of forecasters. Forecasts are consistent with herding (anti-herding) of forecasters if forecasts are biased towards (away from) the consensus forecast. We found that anti-herding is prevalent among forecasters of house prices. We also report that, following the recent crisis, the prevalence of forecaster anti-herding seems to have changed over time.

  12. Determining the bounds of skilful forecast range for probabilistic prediction of system-wide wind power generation

    Directory of Open Access Journals (Sweden)

    Dirk Cannon

    2017-06-01

    Full Text Available State-of-the-art wind power forecasts beyond a few hours ahead rely on global numerical weather prediction models to forecast the future large-scale atmospheric state. Often they provide initial and boundary conditions for nested high-resolution simulations. In this paper, both upper and lower bounds on forecast range are identified within which global ensemble forecasts provide skilful information for system-wide wind power applications. An upper bound on forecast range is associated with the limit of predictability, beyond which forecasts have no more skill than predictions based on climatological statistics. A lower bound is defined at the lead time beyond which the resolved uncertainty associated with estimating the future large-scale atmospheric state is larger than the unresolved uncertainty associated with estimating the system-wide wind power response to a given large-scale state. The bounds of skilful ensemble forecast range are quantified for three leading global forecast systems. The power system of Great Britain (GB) is used as an example because independent verifying data are available from National Grid. The upper bound defined by forecasts of GB-total wind power generation at a specific point in time is found to be 6–8 days. The lower bound is found to be 1.4–2.4 days. Both bounds depend on the global forecast system and vary seasonally. In addition, forecasts of the probability of an extreme power ramp event were found to possess a shorter limit of predictability (4.5–5.5 days). The upper bound on this forecast range can only be extended by improving the global forecast system (outside the control of most users) or by changing the metric used in the probability forecast. Improved downscaling and microscale modelling of the wind farm response may act to decrease the lower bound. The potential gains from such improvements have diminishing returns beyond the short range (out to around 2 days).

  13. Operational hydrological forecasting in Bavaria. Part II: Ensemble forecasting

    Science.gov (United States)

    Ehret, U.; Vogelbacher, A.; Moritz, K.; Laurent, S.; Meyer, I.; Haag, I.

    2009-04-01

    In part I of this study, the operational flood forecasting system in Bavaria and an approach to identify and quantify forecast uncertainty were introduced. The approach is split into the calculation of an empirical 'overall error' from archived forecasts and the calculation of an empirical 'model error' based on hydrometeorological forecast tests in which rainfall observations were used instead of forecasts. The 'model error' can, especially in upstream catchments where forecast uncertainty depends strongly on the current predictability of the atmosphere, be superimposed on the spread of a hydrometeorological ensemble forecast. In Bavaria, two meteorological ensemble prediction systems are currently being tested for operational use: the 16-member COSMO-LEPS forecast and a poor man's ensemble composed of DWD GME, DWD Cosmo-EU, NCEP GFS, Aladin-Austria and MeteoSwiss Cosmo-7. The determination of the overall forecast uncertainty depends on the catchment characteristics: 1. Upstream catchment with high influence of the weather forecast: a) A hydrological ensemble forecast is calculated using each of the meteorological forecast members as forcing. b) Corresponding to the characteristics of the meteorological ensemble forecast, each resulting forecast hydrograph can be regarded as equally likely. c) The 'model error' distribution, with parameters dependent on hydrological case and lead time, is added to each forecast timestep of each ensemble member. d) For each forecast timestep, the overall error distribution (i.e. over the 'model error' distributions of all ensemble members) is calculated. e) From this distribution, the uncertainty range at a desired level (here: the 10% and 90% percentiles) is extracted and drawn as a forecast envelope. f) As the mean or median of an ensemble forecast does not necessarily exhibit a meteorologically sound temporal evolution, a single hydrological forecast termed the 'lead forecast' is chosen and shown in addition to the uncertainty bounds.
This can be
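Steps a)-e) above amount to convolving each ensemble member with the empirical 'model error' distribution and reading percentiles off the pooled sample. A minimal sketch under the simplifying assumption of additive error samples that do not depend on lead time (the operational system conditions the error distribution on hydrological case and lead time); all numbers are invented:

```python
def forecast_envelope(members, error_samples, lo=0.1, hi=0.9):
    """members: ensemble forecast hydrographs (lists of equal length).
    error_samples: additive 'model error' draws in the same units.
    Returns the (lower, upper) envelope per timestep, taken as empirical
    percentiles of the pooled member-plus-error distribution."""
    n_steps = len(members[0])
    lower, upper = [], []
    for t in range(n_steps):
        pooled = sorted(m[t] + e for m in members for e in error_samples)
        lower.append(pooled[int(lo * (len(pooled) - 1))])
        upper.append(pooled[int(hi * (len(pooled) - 1))])
    return lower, upper

# two-member ensemble (discharge, m3/s) and three error draws, for illustration
low_env, high_env = forecast_envelope([[10.0, 20.0], [12.0, 22.0]], [-1.0, 0.0, 1.0])
```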

  14. House Price Forecasts, Forecaster Herding, and the Recent Crisis

    Directory of Open Access Journals (Sweden)

    Christian Pierdzioch

    2012-11-01

    Full Text Available We used the Wall Street Journal survey data for the period 2006–2012 to analyze whether forecasts of house prices and housing starts provide evidence of (anti-)herding of forecasters. Forecasts are consistent with herding (anti-herding) of forecasters if forecasts are biased towards (away from) the consensus forecast. We found that anti-herding is prevalent among forecasters of house prices. We also report that, following the recent crisis, the prevalence of forecaster anti-herding seems to have changed over time.

  15. Short-Term Wind Power Forecasting Using the Enhanced Particle Swarm Optimization Based Hybrid Method

    Directory of Open Access Journals (Sweden)

    Wen-Yeau Chang

    2013-09-01

    Full Text Available High penetration of wind power in the electricity system presents many challenges to power system operators, mainly due to the unpredictability and variability of wind power generation. Although wind energy may not be dispatchable, an accurate forecasting method for wind speed and power generation can help power system operators reduce the risk of an unreliable electricity supply. This paper proposes an enhanced particle swarm optimization (EPSO) based hybrid forecasting method for short-term wind power forecasting. The hybrid forecasting method combines the persistence method, the back propagation neural network, and the radial basis function (RBF) neural network. The EPSO algorithm is employed to optimize the weight coefficients in the hybrid forecasting method. To demonstrate the effectiveness of the proposed method, it is tested on practical wind power generation data from a wind energy conversion system (WECS) installed on the Taichung coast of Taiwan. Comparisons of forecasting performance are made with the individual forecasting methods. Good agreement between the realistic values and the forecast values is obtained; the test results show the proposed forecasting method is accurate and reliable.
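The role of EPSO here is to tune the weights that blend the individual predictors. As a stand-in, a plain particle swarm (not the authors' enhanced variant) can be sketched minimizing the mean squared error of a weighted combination; the particle count, inertia and acceleration constants below are conventional textbook choices, not values from the paper, and the two predictor series are invented:

```python
import random

def pso_combination_weights(preds, actual, n_particles=20, iters=60, seed=1):
    """Find weights w so that sum_i w[i]*preds[i][t] minimizes MSE vs. actual.
    preds: list of per-method forecast lists; actual: observed series."""
    rng = random.Random(seed)
    dim = len(preds)

    def mse(w):
        err = 0.0
        for t in range(len(actual)):
            c = sum(w[i] * preds[i][t] for i in range(dim))
            err += (c - actual[t]) ** 2
        return err / len(actual)

    pos = [[rng.uniform(0.0, 1.0) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [mse(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                # inertia + cognitive + social velocity update
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * rng.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            f = mse(pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

back_prop = [1.0, 2.0, 3.0, 4.0]       # stand-in output of one predictor
persistence = [2.0, 2.0, 2.0, 2.0]     # stand-in output of another predictor
actual = [1.5, 2.0, 2.5, 3.0]          # exactly 0.5*back_prop + 0.5*persistence
w_opt, err = pso_combination_weights([back_prop, persistence], actual)
```

On this toy convex problem the swarm should drive the combination error close to zero.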

  16. Coupling meteorological and hydrological models for flood forecasting

    Directory of Open Access Journals (Sweden)

    Bartholmes

    2005-01-01

    Full Text Available This paper deals with the problem of analysing the coupling of meteorological meso-scale quantitative precipitation forecasts with distributed rainfall-runoff models to extend the forecasting horizon. Traditionally, semi-distributed rainfall-runoff models have been used for real-time flood forecasting. More recently, increased computer capabilities have allowed the utilisation of distributed hydrological models with mesh sizes from tens of metres to a few kilometres. On the other hand, meteorological models, providing the quantitative precipitation forecast, tend to produce average values on meshes ranging from slightly less than 10 to 200 kilometres. Therefore, to improve the quality of flood forecasts, the effects of coupling the meteorological and the hydrological models at different scales were analysed. A distributed hydrological model (TOPKAPI) was developed and calibrated using a 1x1 km mesh for the case of the river Po closed at Ponte Spessa (catchment area c. 37000 km2). The model was then coupled with several other European meteorological models, ranging from the Limited Area Models (provided by DMI and DWD), with resolutions from 0.0625° * 0.0625°, to the ECMWF ensemble predictions with a resolution of 1.85° * 1.85°. Interesting results, describing the coupled model behaviour, are available for a meteorological extreme event in Northern Italy (Nov. 1994). The results demonstrate the poor reliability of the quantitative precipitation forecasts produced by the meteorological models presently available; this is not resolved by using the Ensemble Forecasting technique, when compared with results obtainable with measured rainfall.

  17. Evaluation of ensemble precipitation forecasts generated through post-processing in a Canadian catchment

    Directory of Open Access Journals (Sweden)

    S. K. Jha

    2018-03-01

    Full Text Available Flooding in Canada is often caused by heavy rainfall during the snowmelt period. Hydrologic forecast centers rely on precipitation forecasts obtained from numerical weather prediction (NWP) models to force hydrological models for streamflow forecasting. The uncertainties in raw quantitative precipitation forecasts (QPFs) are enhanced by physiography and orography effects over a diverse landscape, particularly in the western catchments of Canada. A Bayesian post-processing approach called rainfall post-processing (RPP), developed in Australia (Robertson et al., 2013; Shrestha et al., 2015), has been applied to assess its forecast performance in a Canadian catchment. Raw QPFs obtained from two sources, the Global Ensemble Forecasting System (GEFS) Reforecast 2 project from the National Centers for Environmental Prediction and the Global Deterministic Forecast System (GDPS) from Environment and Climate Change Canada, are used in this study. The study period from January 2013 to December 2015 covered a major flood event in Calgary, Alberta, Canada. Post-processed results show that the RPP is able to remove the bias and reduce the errors of both GEFS and GDPS forecasts. Ensembles generated from the RPP reliably quantify the forecast uncertainty.

  18. The Forecasting Procedure for Long-Term Wind Speed in the Zhangye Area

    Directory of Open Access Journals (Sweden)

    Zhenhai Guo

    2010-01-01

    Full Text Available The energy crisis has made it urgent to find alternative energy sources for a sustainable energy supply; wind energy is one of the attractive alternatives. Within a wind energy system, the wind speed is one key parameter; accurate forecasting of wind speed can minimize scheduling errors and in turn increase the reliability of the electric power grid and reduce power market ancillary service costs. This paper proposes a new hybrid model for long-term wind speed forecasting based on the first definite season index method and the Autoregressive Moving Average (ARMA) or Generalized Autoregressive Conditional Heteroskedasticity (GARCH) forecasting models. The forecasting errors are analyzed and compared with the ones obtained from the ARMA and GARCH models and the Support Vector Machine (SVM); the simulation process and results show that the developed method is simple and quite efficient for daily average wind speed forecasting of the Hexi Corridor in China.

  19. Towards real-time eruption forecasting in the Auckland Volcanic Field: application of BET_EF during the New Zealand National Disaster Exercise `Ruaumoko'

    Science.gov (United States)

    Lindsay, Jan; Marzocchi, Warner; Jolly, Gill; Constantinescu, Robert; Selva, Jacopo; Sandri, Laura

    2010-03-01

    The Auckland Volcanic Field (AVF) is a young basaltic field that lies beneath the urban area of Auckland, New Zealand’s largest city. Over the past 250,000 years the AVF has produced at least 49 basaltic centers; the last eruption was only 600 years ago. In recognition of the high risk associated with a possible future eruption in Auckland, the New Zealand government ran Exercise Ruaumoko in March 2008, a test of New Zealand’s nation-wide preparedness for responding to a major disaster resulting from a volcanic eruption in Auckland City. The exercise scenario was developed in secret and covered the period of precursory activity up until the eruption. During Exercise Ruaumoko we adapted a recently developed statistical code for eruption forecasting, namely BET_EF (Bayesian Event Tree for Eruption Forecasting), to independently track the unrest evolution and to forecast the most likely onset time, location and style of the initial phase of the simulated eruption. The code was set up before the start of the exercise by entering reliable information on the past history of the AVF as well as the monitoring signals expected in the event of magmatic unrest and an impending eruption. The average probabilities calculated by BET_EF during Exercise Ruaumoko corresponded well to the probabilities subjectively (and independently) estimated by the advising scientists (differences of a few percentage points), and provided a sound forecast of the timing (before the event, the eruption probability reached 90%) and location of the eruption. This application of BET_EF to a volcanic field that has experienced no historical activity, and for which otherwise limited prior information is available, shows its versatility and potential usefulness as a tool to aid decision-making for a wide range of volcano types. Our near real-time application of BET_EF during Exercise Ruaumoko highlighted its potential to clarify and possibly optimize decision-making procedures in a future AVF eruption

  20. Development of Thresholds and Exceedance Probabilities for Influent Water Quality to Meet Drinking Water Regulations

    Science.gov (United States)

    Reeves, K. L.; Samson, C.; Summers, R. S.; Balaji, R.

    2017-12-01

    Drinking water treatment utilities (DWTUs) are tasked with the challenge of meeting disinfection and disinfection byproduct (DBP) regulations to provide safe, reliable drinking water under changing climate and land surface characteristics. DBPs form in drinking water when disinfectants, commonly chlorine, react with organic matter, as measured by total organic carbon (TOC), while physical removal of pathogenic microorganisms is achieved by filtration and monitored by turbidity removal. Turbidity and TOC in influent waters to DWTUs are expected to increase due to variable climate and more frequent fires and droughts. Traditional methods for forecasting turbidity and TOC require catchment-specific data (i.e. streamflow) and have difficulties predicting them under a non-stationary climate. A modelling framework was developed to assist DWTUs with assessing their risk of future compliance with disinfection and DBP regulations under changing climate. A local polynomial method was developed to predict surface water TOC using climate data collected from NOAA, Normalized Difference Vegetation Index (NDVI) data from the IRI Data Library, and historical TOC data from three DWTUs in diverse geographic locations. Characteristics of the DWTUs were used in the EPA Water Treatment Plant model to determine thresholds for influent TOC that resulted in DBP concentrations within compliance. Lastly, extreme value theory was used to predict probabilities of threshold exceedances under the current climate. Results from the utilities were used to produce a generalized TOC threshold approach that only requires water temperature and bromide concentration. The threshold exceedance model will be used to estimate probabilities of exceedances under projected climate scenarios. Initial results show that TOC can be forecasted using widely available data via statistical methods, where temperature, precipitation, the Palmer Drought Severity Index, and NDVI with various lags were shown to be important
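The exceedance-probability step can be illustrated with a deliberately simplified stand-in: fit a Gumbel distribution to block maxima of influent TOC by the method of moments and evaluate the upper tail at a treatment threshold. The study's actual extreme value analysis may use different distributions and estimators, and the TOC values below are invented for illustration.

```python
import math

def gumbel_exceedance_prob(maxima, threshold):
    """P(X > threshold) under a Gumbel fit to block maxima, with
    method-of-moments parameters: beta = s*sqrt(6)/pi, mu = mean - gamma*beta."""
    n = len(maxima)
    mean = sum(maxima) / n
    var = sum((x - mean) ** 2 for x in maxima) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi
    mu = mean - 0.5772156649 * beta    # Euler-Mascheroni constant
    cdf = math.exp(-math.exp(-(threshold - mu) / beta))
    return 1.0 - cdf

toc_maxima = [3.1, 3.4, 2.9, 3.8, 3.2, 4.0, 3.5]    # hypothetical annual TOC maxima (mg/L)
p_exceed = gumbel_exceedance_prob(toc_maxima, 4.5)  # P(annual max TOC > 4.5 mg/L)
```

Lowering the threshold raises the exceedance probability, as the monotone Gumbel CDF requires.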

  1. Developing a Local Neurofuzzy Model for Short-Term Wind Power Forecasting

    Directory of Open Access Journals (Sweden)

    E. Faghihnia

    2014-01-01

    Full Text Available Large-scale integration of wind generation capacity into power systems introduces operational challenges due to wind power uncertainty and variability. Therefore, accurate wind power forecasts are important for reliable and economic operation of power systems. The complexities and nonlinearities exhibited by wind power time series necessitate the use of elaborate and sophisticated approaches for wind power forecasting. In this paper, a local neurofuzzy (LNF) approach, trained by the polynomial model tree (POLYMOT) learning algorithm, is proposed for short-term wind power forecasting. The LNF approach is constructed from the contributions of local polynomial models, which can efficiently model wind power generation. Data from the Sotavento wind farm in Spain were used to validate the proposed LNF approach. Comparison between the performance of the proposed approach and several recently published approaches illustrates the capability of the LNF model for accurate wind power forecasting.

  2. Multicomponent ensemble models to forecast induced seismicity

    Science.gov (United States)

    Király-Proag, E.; Gischig, V.; Zechar, J. D.; Wiemer, S.

    2018-01-01

    In recent years, human-induced seismicity has become an increasingly relevant topic due to its economic and social implications. Several models and approaches have been developed to explain the underlying physical processes or to forecast induced seismicity. They range from simple statistical models to coupled numerical models incorporating complex physics. We advocate the need for forecast testing as currently the best method for ascertaining whether models are capable of reasonably accounting for the key governing physical processes, or not. Moreover, operational forecast models are of great interest for on-site decision-making in projects entailing induced earthquakes. We previously introduced a standardized framework following the guidelines of the Collaboratory for the Study of Earthquake Predictability, the Induced Seismicity Test Bench, to test, validate, and rank induced seismicity models. In this study, we describe how to construct multicomponent ensemble models based on Bayesian weightings that deliver more accurate forecasts than individual models in the case of the Basel 2006 and Soultz-sous-Forêts 2004 enhanced geothermal stimulation projects. For this, we examine five calibrated variants of two significantly different model groups: (1) Shapiro and Smoothed Seismicity, based on the seismogenic index, simple modified Omori-law-type seismicity decay, and temporally weighted smoothed seismicity; (2) Hydraulics and Seismicity, based on numerically modelled pore pressure evolution that triggers seismicity using the Mohr-Coulomb failure criterion. We also demonstrate how the individual and ensemble models would perform as part of an operational Adaptive Traffic Light System. Investigating seismicity forecasts based on a range of potential injection scenarios, we use forecast periods of different durations to compute the occurrence probabilities of seismic events M ≥ 3. We show that in the case of the Basel 2006 geothermal stimulation the models forecast hazardous levels
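The Bayesian weighting behind such ensembles can be sketched schematically: each model receives a weight proportional to its likelihood on past observed seismicity, and the per-bin forecast rates are averaged with those weights. This is an illustration of the idea, not the Induced Seismicity Test Bench implementation; the log-likelihoods and forecast values below are invented.

```python
import math

def bayesian_ensemble(log_likelihoods, forecasts):
    """Combine per-model forecasts (e.g. expected event counts per bin)
    with weights proportional to each model's likelihood on past data."""
    m = max(log_likelihoods)
    w = [math.exp(ll - m) for ll in log_likelihoods]   # subtract max to avoid underflow
    s = sum(w)
    w = [x / s for x in w]
    combined = [sum(wi * f[j] for wi, f in zip(w, forecasts))
                for j in range(len(forecasts[0]))]
    return w, combined

# model B is three times as likely as model A given the past seismicity
weights, combined = bayesian_ensemble([0.0, math.log(3.0)], [[1.0, 1.0], [5.0, 1.0]])
```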

  3. Against all odds -- Probabilistic forecasts and decision making

    Science.gov (United States)

    Liechti, Katharina; Zappa, Massimiliano

    2015-04-01

    In the city of Zurich (Switzerland), the setting is such that the damage potential due to flooding of the river Sihl is estimated at about 5 billion US dollars. The flood forecasting system used by the administration for decision making has run continuously since 2007. It has a time horizon of at most five days and operates at hourly time steps. The flood forecasting system includes three different model chains. Two of these are driven by the deterministic NWP models COSMO-2 and COSMO-7, and one is driven by the probabilistic NWP model COSMO-LEPS. The model chains have been consistent since February 2010, so five full years are available for the evaluation of the system. The system was evaluated continuously and is a very nice example with which to present the added value that lies in probabilistic forecasts. The forecasts are available to the decision makers on an online platform. Several graphical representations of the forecasts and forecast history are available to support decision making and to rate the current situation. The communication between forecasters and decision makers is quite close. In short, an ideal situation. However, an event, or better put a non-event, in summer 2014 showed that knowledge about the general superiority of probabilistic forecasts does not necessarily mean that the decisions taken in a specific situation will be based on that probabilistic forecast. Some years of experience allow confidence in the system to grow, both for the forecasters and for the decision makers. Even if, from the theoretical point of view, the handling during a crisis situation is well designed, a first event demonstrated that the dialogue with the decision makers still lacks exercise during such situations. We argue that a false alarm is a needed experience to consolidate real-time emergency procedures relying on ensemble predictions. A missed event would probably also fit, but, in our case, we are very happy not to have to report on that option.

  4. Very Short-term Nonparametric Probabilistic Forecasting of Renewable Energy Generation - with Application to Solar Energy

    DEFF Research Database (Denmark)

    Golestaneh, Faranak; Pinson, Pierre; Gooi, Hoay Beng

    2016-01-01

    Due to the inherent uncertainty involved in renewable energy forecasting, uncertainty quantification is a key input to maintain acceptable levels of reliability and profitability in power system operation. A proposal is formulated and evaluated here for the case of solar power generation, when only...... approach to generate very short-term predictive densities, i.e., for lead times between a few minutes to one hour ahead, with fast frequency updates. We rely on an Extreme Learning Machine (ELM) as a fast regression model, trained in varied ways to obtain both point and quantile forecasts of solar power...... generation. Four probabilistic methods are implemented as benchmarks. Rival approaches are evaluated based on a number of test cases for two solar power generation sites in different climatic regions, allowing us to show that our approach results in generation of skilful and reliable probabilistic forecasts...

  5. Reliability in engineering '87

    International Nuclear Information System (INIS)

    Tuma, M.

    1987-01-01

    The participants heard 51 papers dealing with the reliability of engineering products. Two of the papers were incorporated in INIS, namely ''Reliability comparison of two designs of low pressure regeneration of the 1000 MW unit at the Temelin nuclear power plant'' and ''Use of probability analysis of reliability in designing nuclear power facilities.''(J.B.)

  6. EXPENSES FORECASTING MODEL IN UNIVERSITY PROJECTS PLANNING

    Directory of Open Access Journals (Sweden)

    Sergei A. Arustamov

    2016-11-01

    Full Text Available The paper presents a mathematical model of cash flows in project funding. We describe the different types of expenses linked to university project activities and identify the budgeting problems that contribute the most uncertainty. As an example of the model implementation, we consider the calculation of vacation allowance expenses for project participants. We define the problems of forecasting funds to be reserved: calculation based on the methodology established by the Ministry of Education and Science, calculation according to the vacation schedule, and prediction of the most probable amount. A stochastic model of vacation allowance expenses has been developed. We propose methods and solutions that increase the accuracy of forecasting funds reservation based on 2015 data.

  7. Assessing North American multimodel ensemble (NMME) seasonal forecast skill to assist in the early warning of hydrometeorological extremes over East Africa

    Science.gov (United States)

    Shukla, Shraddhanand; Roberts, Jason B.; Hoell, Andrew; Funk, Chris; Robertson, Franklin R.; Kirtman, Benjamin

    2016-01-01

    The skill of North American multimodel ensemble (NMME) seasonal forecasts in East Africa (EA), which encompasses one of the most food and water insecure areas of the world, is evaluated using deterministic, categorical, and probabilistic evaluation methods. The skill is estimated for all three primary growing seasons: March–May (MAM), July–September (JAS), and October–December (OND). It is found that the precipitation forecast skill in this region is generally limited and statistically significant over only a small part of the domain. In the case of MAM (JAS) [OND] season it exceeds the skill of climatological forecasts in parts of equatorial EA (Northern Ethiopia) [equatorial EA] for up to 2 (5) [5] months lead. Temperature forecast skill is generally much higher than precipitation forecast skill (in terms of deterministic and probabilistic skill scores) and statistically significant over a majority of the region. Over the region as a whole, temperature forecasts also exhibit greater reliability than the precipitation forecasts. The NMME ensemble forecasts are found to be more skillful and reliable than the forecast from any individual model. The results also demonstrate that for some seasons (e.g. JAS), the predictability of precipitation signals varies and is higher during certain climate events (e.g. ENSO). Finally, potential room for improvement in forecast skill is identified in some models by comparing homogeneous predictability in individual NMME models with their respective forecast skill.
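A standard probabilistic score behind evaluations like this is the Brier skill score, which compares the mean squared error of the probability forecasts with that of a constant climatological forecast. This is a generic illustration, not the exact metric suite of the study:

```python
def brier_skill_score(probs, outcomes, base_rate):
    """BSS = 1 - BS_forecast / BS_climatology for binary-event probability
    forecasts; positive values indicate skill beyond the climatological base rate."""
    n = len(probs)
    bs = sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / n
    bs_ref = sum((base_rate - o) ** 2 for o in outcomes) / n
    return 1.0 - bs / bs_ref
```

A perfect forecast scores 1, a forecast that always issues the base rate scores 0, and worse-than-climatology forecasts score below 0.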

  8. The Use of Some Forecasting Methods and SWOT Analysis in the Selected Processes of Foundry

    Directory of Open Access Journals (Sweden)

    Szymszal J.

    2017-12-01

    Full Text Available Forecasting and SWOT analysis are helpful tools in business activity because, under conditions of dynamic change in both the closer and the more distant surroundings, reliable forward-looking information and trend analysis play a decisive role. At present, the ability to use available data in forecasting and other analyses in accordance with changes in the business environment is among the key managerial skills required, since both forecasting and SWOT analysis are an integral part of the management process, and an appropriate level of forecasting knowledge is increasingly appreciated. Examples of the practical use of some forecasting methods in optimization of the procurement, production and distribution processes in foundries are given. The possibilities of using conventional quantitative forecasting methods based on econometric and adaptive models applying the creeping trend and harmonic weights are presented. The econometric models were additionally supplemented with a presentation of the error estimation methodology, quality assessment and statistical verification of the forecast. The possibility of using qualitative forecasts based on SWOT analysis is also discussed.

  9. Short-range quantitative precipitation forecasting using Deep Learning approaches

    Science.gov (United States)

    Akbari Asanjan, A.; Yang, T.; Gao, X.; Hsu, K. L.; Sorooshian, S.

    2017-12-01

    Predicting short-range quantitative precipitation is very important for flood forecasting, early flood warning and other hydrometeorological purposes. This study aims to improve precipitation forecasting skill using a recently developed and advanced machine learning technique named Long Short-Term Memory (LSTM). The proposed LSTM learns the changing patterns of clouds from Cloud-Top Brightness Temperature (CTBT) images, retrieved from the infrared channel of the Geostationary Operational Environmental Satellite (GOES), using a sophisticated and effective learning method. After learning the dynamics of clouds, the LSTM model predicts the upcoming rainy CTBT events. The proposed model is then merged with a precipitation estimation algorithm termed Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks (PERSIANN) to provide precipitation forecasts. The results of the merged LSTM with PERSIANN are compared to those of an Elman-type Recurrent Neural Network (RNN) merged with PERSIANN and to the Final Analysis of the Global Forecast System model over the states of Oklahoma, Florida and Oregon. The performance of each model is investigated during three storm events, each located over one of the study regions. The results indicate that the merged LSTM forecasts outperform the numerical and statistical baselines in terms of Probability of Detection (POD), False Alarm Ratio (FAR), Critical Success Index (CSI), RMSE and correlation coefficient, especially in convective systems. The proposed method shows superior capabilities in short-term forecasting over the compared methods.
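The categorical scores cited above (POD, FAR, CSI) are standard functions of a 2x2 contingency table of forecast versus observed events. A minimal sketch, not taken from the paper, using hypothetical counts:

```python
def categorical_scores(hits, misses, false_alarms):
    """Standard categorical verification scores from a 2x2 contingency table."""
    pod = hits / (hits + misses)                 # Probability of Detection
    far = false_alarms / (hits + false_alarms)   # False Alarm Ratio
    csi = hits / (hits + misses + false_alarms)  # Critical Success Index
    return pod, far, csi
```

For example, `categorical_scores(80, 20, 20)` gives POD = 0.8, FAR = 0.2 and CSI = 2/3.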

  10. ECONOMETRIC FORECAST OF AGRICULTURAL SECTOR INVESTING IN LVOV REGION

    Directory of Open Access Journals (Sweden)

    Rostyslav Lytvyn

    2014-07-01

    Full Text Available Forecasting economic processes in agriculture with applied econometric methods has become more relevant and urgent in recent years. In this paper, these methods are used to forecast investment and the main agricultural industry indicators of the Lvov region of Ukraine. A linear trend model, a parabolic trend model and an exponential trend model were estimated for the period 2000 to 2009, using the statistical package STATGRAPHICS and EXCEL spreadsheets, and with the assistance of these models a forecast of investment, based on the essential indicators of the region's agrarian sector, was made for 2010 and 2011. All models are adequate to the experimental data for 2000-2009 with probability p = 0.95, which allows the forecast of investment and of the main agricultural indicators of the researched region to be made with these models for 2010 and 2011. Nevertheless, it should be pointed out that, because of the small amount of input data, the analysis of the regression equation coefficients indicates a qualitative rather than a quantitative influence upon the resulting variable y6.
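Trend models of the kind named in the abstract can be fitted by ordinary least squares; the exponential trend reduces to a linear fit on the logarithm of the series. A minimal sketch, illustrative only and using made-up series rather than the Lvov region data:

```python
import math

def fit_linear_trend(y):
    """OLS fit of y_t = a + b*t for t = 0..n-1; returns (a, b)."""
    n = len(y)
    t_bar = (n - 1) / 2
    y_bar = sum(y) / n
    num = sum((t - t_bar) * (yt - y_bar) for t, yt in enumerate(y))
    den = sum((t - t_bar) ** 2 for t in range(n))
    b = num / den
    return y_bar - b * t_bar, b

def fit_exponential_trend(y):
    """Fit y_t = A * exp(g*t) by OLS on log(y); returns (A, g)."""
    log_a, g = fit_linear_trend([math.log(v) for v in y])
    return math.exp(log_a), g

def forecast_linear(a, b, t):
    """Extrapolate the fitted linear trend to period t."""
    return a + b * t
```

Fitting `[1, 3, 5, 7, 9]` recovers intercept 1 and slope 2, so the forecast for t = 5 is 11.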

  11. Estimating deficit probabilities with price-responsive demand in contract-based electricity markets

    International Nuclear Information System (INIS)

    Galetovic, Alexander; Munoz, Cristian M.

    2009-01-01

    Studies that estimate deficit probabilities in hydrothermal systems have generally ignored the response of demand to changing prices, in the belief that such a response is largely irrelevant. We show that ignoring the response of demand to prices can lead to substantial overestimation or underestimation of the probability of an energy deficit. To make our point we present an estimation of deficit probabilities in Chile's Central Interconnected System between 2006 and 2010. This period is characterized by tight supply, fast consumption growth and rising electricity prices. When the response of demand to rising prices is acknowledged, forecasted deficit probabilities and marginal costs are shown to be substantially lower.

  12. An experimental system for flood risk forecasting at global scale

    Science.gov (United States)

    Alfieri, L.; Dottori, F.; Kalas, M.; Lorini, V.; Bianchi, A.; Hirpa, F. A.; Feyen, L.; Salamon, P.

    2016-12-01

    Global flood forecasting and monitoring systems are nowadays a reality and are being applied by an increasing range of users and practitioners in disaster risk management. Furthermore, there is an increasing demand from users to integrate flood early warning systems with risk-based forecasts, combining streamflow estimations with expected inundated areas and flood impacts. To this end, we have developed an experimental procedure for near-real-time flood mapping and impact assessment based on the daily forecasts issued by the Global Flood Awareness System (GloFAS). The methodology translates GloFAS streamflow forecasts into event-based flood hazard maps, based on the predicted flow magnitude and the forecast lead time, together with a database of flood hazard maps with global coverage. Flood hazard maps are then combined with exposure and vulnerability information to derive flood risk. Impacts of the forecasted flood events are evaluated in terms of flood-prone areas, potential economic damage, and affected population, infrastructure and cities. To further increase the reliability of the proposed methodology we integrated the model-based estimations with an innovative methodology for social media monitoring, which allows for real-time verification of impact forecasts. The preliminary tests provided good results and showed the potential of the developed real-time operational procedure in helping emergency response and management. In particular, the link with social media is crucial for improving the accuracy of impact predictions.

  13. Probability an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

    Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h

  14. A new forecast presentation tool for offshore contractors

    Science.gov (United States)

    Jørgensen, M.

    2009-09-01

    Contractors working offshore are often very sensitive to both sea and weather conditions, and it is essential that they have easy access to reliable information on coming conditions so they can plan when to start or shut down offshore operations to avoid loss of life and materials. The Danish Meteorological Institute (DMI), in cooperation with business partners in the field, recently developed a new application to accommodate that need. The "Marine Forecast Service" is a browser-based forecast presentation tool. It provides an interface giving the user easy and quick access to all relevant meteorological and oceanographic forecasts and observations for a given area of interest. Each customer gains access to the application via a standard login/password procedure. Once logged in, the user can inspect animated forecast maps of parameters such as wind, gust, wave height, swell and current, among others. Supplementing the general maps, the user can choose to look at forecast graphs for each of the locations where the user is running operations. These forecast graphs can also be overlaid with the user's own in situ observations, if such exist. Furthermore, the data from the graphs can be exported as data files that the customer can use in his own applications as desired. As part of the application, a forecaster's view of the current and near-future weather situation is presented to the user as well, adding further value to the information presented through maps and graphs. Among other features of the product, animated radar and satellite images can be mentioned. Finally, the application provides the possibility of a "second opinion" through traditional weather charts from another recognized provider of weather forecasts. The presentation will provide more detailed insight into the contents of the application as well as some of the experiences with the product.

  15. Human reliability

    International Nuclear Information System (INIS)

    Embrey, D.E.

    1987-01-01

    Concepts and techniques of human reliability have been developed and are used mostly in probabilistic risk assessment. Here, the major application of human reliability assessment has been to identify the human errors which have a significant effect on the overall safety of the system and to quantify the probability of their occurrence. Some of the major issues within human reliability studies are reviewed and it is shown how these are applied to the assessment of human failures in systems. This is done under the following headings: models of human performance used in human reliability assessment, the nature of human error, classification of errors in man-machine systems, practical aspects, human reliability modelling in complex situations, quantification and examination of human reliability, judgement-based approaches, holistic techniques and decision-analytic approaches. (UK)

  16. Probability theory

    CERN Document Server

    Dorogovtsev, A Ya; Skorokhod, A V; Silvestrov, D S; Skorokhod, A V

    1997-01-01

    This book of problems is intended for students in pure and applied mathematics. There are problems in traditional areas of probability theory and problems in the theory of stochastic processes, which has wide applications in the theory of automatic control, queuing and reliability theories, and in many other modern science and engineering fields. Answers to most of the problems are given, and the book provides hints and solutions for more complicated problems.

  17. Reliable computer systems.

    Science.gov (United States)

    Wear, L L; Pinkert, J R

    1993-11-01

    In this article, we looked at some decisions that apply to the design of reliable computer systems. We began with a discussion of several terms such as testability, then described some systems that call for highly reliable hardware and software. The article concluded with a discussion of methods that can be used to achieve higher reliability in computer systems. Reliability and fault tolerance in computers probably will continue to grow in importance. As more and more systems are computerized, people will want assurances about the reliability of these systems, and their ability to work properly even when sub-systems fail.

  18. Evaluation of structural reliability using simulation methods

    Directory of Open Access Journals (Sweden)

    Baballëku Markel

    2015-01-01

    Full Text Available Eurocode describes the 'index of reliability' as a measure of structural reliability, related to the 'probability of failure'. This paper focuses on the assessment of this index for a reinforced concrete bridge pier. Reliability concepts are rarely used explicitly in structural design, yet the problems of structural engineering are better understood through them. The main methods for estimating the probability of failure include exact analytical integration, numerical integration, approximate analytical methods and simulation methods. Monte Carlo simulation is used in this paper because it offers a very good tool for estimating probabilities involving multivariate functions: complicated probability and statistics problems are solved through computer-aided simulation of a large number of trials. The procedure of structural reliability assessment for the bridge pier and the comparison with the partial factor method of the Eurocodes are demonstrated in this paper.
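The link between the Monte Carlo failure-probability estimate and the Eurocode reliability index can be sketched for a toy resistance-minus-load limit state. The normal distributions and parameters below are invented for illustration, not taken from the bridge-pier study:

```python
import random
import statistics

def monte_carlo_reliability(mu_r, s_r, mu_s, s_s, n=200_000, seed=1):
    """Estimate P(failure) = P(R - S < 0) for normal resistance R and
    load effect S by crude Monte Carlo sampling, then convert to the
    Eurocode reliability index beta = -Phi^{-1}(pf)."""
    rng = random.Random(seed)
    failures = sum(
        1 for _ in range(n)
        if rng.gauss(mu_r, s_r) - rng.gauss(mu_s, s_s) < 0.0
    )
    pf = failures / n
    beta = -statistics.NormalDist().inv_cdf(pf)
    return pf, beta
```

For R ~ N(3, 1) and S ~ N(1, 1) the exact index is beta = 2/sqrt(2) ≈ 1.414, and the simulation reproduces it to about two decimals.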

  19. Modeling and Computing of Stock Index Forecasting Based on Neural Network and Markov Chain

    Science.gov (United States)

    Dai, Yonghui; Han, Dongmei; Dai, Weihui

    2014-01-01

    The stock index reflects the fluctuation of the stock market. There has long been a great deal of research on stock index forecasting. However, traditional methods are limited in achieving ideal precision in a dynamic market because of the influence of many factors, such as the economic situation, policy changes, and emergency events. Approaches based on adaptive modeling and conditional probability transfer have therefore attracted new attention from researchers. This paper presents a new forecasting method combining an improved back-propagation (BP) neural network with a Markov chain, together with its modeling and computing technology. The method comprises initial forecasting by the improved BP neural network, division of the Markov state regions, computation of the state transition probability matrix, and adjustment of the prediction. Results of an empirical study show that this method can achieve high accuracy in stock index prediction, and it could provide a good reference for investment in the stock market. PMID:24782659

  20. Modeling and Computing of Stock Index Forecasting Based on Neural Network and Markov Chain

    Directory of Open Access Journals (Sweden)

    Yonghui Dai

    2014-01-01

    Full Text Available The stock index reflects the fluctuation of the stock market. There has long been a great deal of research on stock index forecasting. However, traditional methods are limited in achieving ideal precision in a dynamic market because of the influence of many factors, such as the economic situation, policy changes, and emergency events. Approaches based on adaptive modeling and conditional probability transfer have therefore attracted new attention from researchers. This paper presents a new forecasting method combining an improved back-propagation (BP) neural network with a Markov chain, together with its modeling and computing technology. The method comprises initial forecasting by the improved BP neural network, division of the Markov state regions, computation of the state transition probability matrix, and adjustment of the prediction. Results of an empirical study show that this method can achieve high accuracy in stock index prediction, and it could provide a good reference for investment in the stock market.
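The state transition probability matrix at the heart of the Markov step can be estimated by counting transitions between discretized states. A minimal sketch, with an invented state sequence rather than the paper's stock data:

```python
def transition_matrix(states, n_states):
    """Maximum-likelihood estimate of a Markov transition matrix from a
    sequence of state indices in 0..n_states-1; rows with no observed
    transitions fall back to a uniform distribution."""
    counts = [[0] * n_states for _ in range(n_states)]
    for a, b in zip(states, states[1:]):
        counts[a][b] += 1
    matrix = []
    for row in counts:
        total = sum(row)
        matrix.append([c / total if total else 1.0 / n_states for c in row])
    return matrix
```

For the sequence `[0, 1, 0, 1, 1]`, state 0 always moves to state 1, while state 1 splits evenly between the two states.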

  1. Fatigue Reliability under Random Loads

    DEFF Research Database (Denmark)

    Talreja, R.

    1979-01-01

    We consider the problem of estimating the probability of survival (non-failure) and the probability of safe operation (strength greater than a limiting value) of structures subjected to random loads. These probabilities are formulated in terms of the probability distributions of the loads...... propagation stage. The consequences of this behaviour on the fatigue reliability are discussed....

  2. Model for Adjustment of Aggregate Forecasts using Fuzzy Logic

    Directory of Open Access Journals (Sweden)

    Taracena–Sanz L. F.

    2010-07-01

    Full Text Available This research contributes to the implementation of forecasting models. The proposed model is developed with the aim of fitting the demand projection to the firm's environment, based on three considerations that often make demand forecasts differ from reality: (1) one of the most difficult problems to model in forecasting is the uncertainty related to the information available; (2) the methods traditionally used by firms for demand projection are based mainly on past market behavior (historical demand); and (3) these methods do not consider in their analysis the factors that drive the observed behavior. The proposed model is therefore based on the implementation of Fuzzy Logic, integrating the main variables that affect the behavior of market demand and that are not considered in the classical statistical methods. The model was applied to a bottling plant for carbonated beverages, and with the adjustment of the demand projection a more reliable forecast was obtained.

  3. Toward the Probabilistic Forecasting of High-latitude GPS Phase Scintillation

    Science.gov (United States)

    Prikryl, P.; Jayachandran, P.T.; Mushini, S. C.; Richardson, I. G.

    2012-01-01

    The phase scintillation index was obtained from L1 GPS data collected with the Canadian High Arctic Ionospheric Network (CHAIN) during the extended solar minimum years 2008-2010. Phase scintillation occurs predominantly on the dayside in the cusp and in the nightside auroral oval. We set forth a probabilistic forecast method for phase scintillation in the cusp based on the arrival time of either solar wind corotating interaction regions (CIRs) or interplanetary coronal mass ejections (ICMEs). CIRs on the leading edge of high-speed streams (HSS) from coronal holes are known to cause recurrent geomagnetic and ionospheric disturbances that can be forecast one or several solar rotations in advance. Superposed epoch analysis of phase scintillation occurrence showed a sharp increase in scintillation occurrence just after the arrival of high-speed solar wind, and a peak associated with weak to moderate CMEs during the solar minimum. Cumulative probability distribution functions for the phase scintillation occurrence in the cusp are obtained from statistical data for days before and after CIR and ICME arrivals. The probability curves are also specified for low and high (below and above median) values of various solar wind plasma parameters. The initial results are used to demonstrate a forecasting technique on two example periods of CIRs and ICMEs.

  4. Reference Scenario Forecasting: A New Approach to Transport Project Assessment

    DEFF Research Database (Denmark)

    Salling, Kim Bang; Leleur, Steen; Skougaard, Britt Zoëga

    2010-01-01

    This paper presents a new approach to transport project assessment in terms of feasibility risk assessment and reference class forecasting. Normally, transport project assessment is based upon a cost-benefit approach where evaluation criteria such as net present values are obtained. Recent research...... construction cost estimates. Hereafter, a quantitative risk analysis is provided making use of Monte Carlo simulation. This stochastic approach facilitates random input parameters based upon reference class forecasting, hence, a parameter data fit has been performed in order to obtain validated probability...... forecasting (RSF) frame. The RSF is anchored in the cost-benefit analysis (CBA), thus, it provides decision-makers with a quantitative mean of assessing the transport infrastructure project. First, the RSF method introduces uncertainties within the CBA by applying Optimism Bias uplifts on the preliminary...

  5. Probabilistic forecasting and Bayesian data assimilation

    CERN Document Server

    Reich, Sebastian

    2015-01-01

    In this book the authors describe the principles and methods behind probabilistic forecasting and Bayesian data assimilation. Instead of focusing on particular application areas, the authors adopt a general dynamical systems approach, with a profusion of low-dimensional, discrete-time numerical examples designed to build intuition about the subject. Part I explains the mathematical framework of ensemble-based probabilistic forecasting and uncertainty quantification. Part II is devoted to Bayesian filtering algorithms, from classical data assimilation algorithms such as the Kalman filter, variational techniques, and sequential Monte Carlo methods, through to more recent developments such as the ensemble Kalman filter and ensemble transform filters. The McKean approach to sequential filtering in combination with coupling of measures serves as a unifying mathematical framework throughout Part II. Assuming only some basic familiarity with probability, this book is an ideal introduction for graduate students in ap...

  6. Model-free aftershock forecasts constructed from similar sequences in the past

    Science.gov (United States)

    van der Elst, N.; Page, M. T.

    2017-12-01

    The basic premise behind aftershock forecasting is that sequences in the future will be similar to those in the past. Forecast models typically use empirically tuned parametric distributions to approximate past sequences, and project those distributions into the future to make a forecast. While parametric models do a good job of describing average outcomes, they are not explicitly designed to capture the full range of variability between sequences, and can suffer from over-tuning of the parameters. In particular, parametric forecasts may produce a high rate of "surprises" - sequences that land outside the forecast range. Here we present a non-parametric forecast method that cuts out the parametric "middleman" between training data and forecast. The method is based on finding past sequences that are similar to the target sequence, and evaluating their outcomes. We quantify similarity as the Poisson probability that the observed event count in a past sequence reflects the same underlying intensity as the observed event count in the target sequence. Event counts are defined in terms of differential magnitude relative to the mainshock. The forecast is then constructed from the distribution of past sequences outcomes, weighted by their similarity. We compare the similarity forecast with the Reasenberg and Jones (RJ95) method, for a set of 2807 global aftershock sequences of M≥6 mainshocks. We implement a sequence-specific RJ95 forecast using a global average prior and Bayesian updating, but do not propagate epistemic uncertainty. The RJ95 forecast is somewhat more precise than the similarity forecast: 90% of observed sequences fall within a factor of two of the median RJ95 forecast value, whereas the fraction is 85% for the similarity forecast. However, the surprise rate is much higher for the RJ95 forecast; 10% of observed sequences fall in the upper 2.5% of the (Poissonian) forecast range. The surprise rate is less than 3% for the similarity forecast. 
The similarity
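One plausible reading of the similarity weighting described above (an assumption on our part, not the authors' exact formula) is to weight each past sequence by the Poisson probability of its early event count when the intensity is taken to be the target sequence's count, then forecast the outcome as the weighted mean of past outcomes:

```python
import math

def poisson_pmf(k, lam):
    """P(N = k) for N ~ Poisson(lam)."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

def similarity_weighted_forecast(n_target, past_sequences):
    """Forecast the target outcome as the similarity-weighted mean of
    past outcomes. `past_sequences` is a list of (early_count, outcome)
    pairs; the weighting scheme here is a hypothetical sketch."""
    weights = [poisson_pmf(n, n_target) for n, _ in past_sequences]
    total = sum(weights)
    return sum(w * out for w, (_, out) in zip(weights, past_sequences)) / total
```

A past sequence whose early count matches the target dominates the forecast, while a wildly different sequence receives negligible weight.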

  7. Using Bayesian Model Averaging (BMA) to calibrate probabilistic surface temperature forecasts over Iran

    Energy Technology Data Exchange (ETDEWEB)

    Soltanzadeh, I. [Tehran Univ. (Iran, Islamic Republic of). Inst. of Geophysics; Azadi, M.; Vakili, G.A. [Atmospheric Science and Meteorological Research Center (ASMERC), Teheran (Iran, Islamic Republic of)

    2011-07-01

    Using Bayesian Model Averaging (BMA), an attempt was made to obtain calibrated probabilistic numerical forecasts of 2-m temperature over Iran. The ensemble employs three limited area models (WRF, MM5 and HRM), with WRF used with five different configurations. Initial and boundary conditions for MM5 and WRF are obtained from the National Centers for Environmental Prediction (NCEP) Global Forecast System (GFS), and for HRM the initial and boundary conditions come from the analysis of the Global Model Europe (GME) of the German Weather Service. The resulting ensemble of seven members was run for a period of 6 months (from December 2008 to May 2009) over Iran. The 48-h raw ensemble outputs were calibrated using the BMA technique for 120 days, using a 40-day training sample of forecasts and corresponding verification data. The calibrated probabilistic forecasts were assessed using rank histograms and attribute diagrams. Results showed that application of BMA improved the reliability of the raw ensemble. Using the weighted ensemble mean forecast as a deterministic forecast, it was found that the deterministic-style BMA forecasts usually performed better than the best member's deterministic forecast. (orig.)

  8. Using Bayesian Model Averaging (BMA to calibrate probabilistic surface temperature forecasts over Iran

    Directory of Open Access Journals (Sweden)

    I. Soltanzadeh

    2011-07-01

    Full Text Available Using Bayesian Model Averaging (BMA), an attempt was made to obtain calibrated probabilistic numerical forecasts of 2-m temperature over Iran. The ensemble employs three limited area models (WRF, MM5 and HRM), with WRF used with five different configurations. Initial and boundary conditions for MM5 and WRF are obtained from the National Centers for Environmental Prediction (NCEP) Global Forecast System (GFS), and for HRM the initial and boundary conditions come from the analysis of the Global Model Europe (GME) of the German Weather Service. The resulting ensemble of seven members was run for a period of 6 months (from December 2008 to May 2009) over Iran. The 48-h raw ensemble outputs were calibrated using the BMA technique for 120 days, using a 40-day training sample of forecasts and corresponding verification data. The calibrated probabilistic forecasts were assessed using rank histograms and attribute diagrams. Results showed that application of BMA improved the reliability of the raw ensemble. Using the weighted ensemble mean forecast as a deterministic forecast, it was found that the deterministic-style BMA forecasts usually performed better than the best member's deterministic forecast.
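The BMA predictive distribution described in both records is a weighted mixture of kernels centred on the member forecasts. A minimal Gaussian-kernel sketch with hand-picked weights and spread (in practice these would come from EM training on the 40-day sample, and the member forecasts would first be bias-corrected):

```python
import statistics

def bma_predictive_pdf(x, member_forecasts, weights, sigma):
    """BMA predictive density at x: a weighted mixture of Gaussian
    kernels, one per ensemble member. Weights must sum to 1."""
    return sum(
        w * statistics.NormalDist(f, sigma).pdf(x)
        for f, w in zip(member_forecasts, weights)
    )

def bma_mean(member_forecasts, weights):
    """The weighted ensemble mean used as a deterministic-style forecast."""
    return sum(w * f for f, w in zip(member_forecasts, weights))
```

With a single member of weight 1, the mixture collapses to an ordinary Gaussian density centred on that member's forecast.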

  9. Proposed reliability cost model

    Science.gov (United States)

    Delionback, L. M.

    1973-01-01

    The research investigations involved in the study include: cost analysis/allocation, reliability and product assurance, forecasting methodology, systems analysis, and model-building. This is a classic example of an interdisciplinary problem, since the model-building requirements include the need for understanding and communication between technical disciplines on the one hand and the financial/accounting skill categories on the other. The systems approach is utilized within this context to establish a clearer and more objective relationship between reliability assurance and the subcategories (or subelements) that provide, or reinforce, the reliability assurance for a system. Subcategories are further subdivided as illustrated by a tree diagram. The reliability assurance elements can be seen to be potential alternative strategies, or approaches, depending on the specific goals/objectives of the trade studies. The scope was limited to the establishment of a proposed reliability cost-model format. The model format/approach depends upon the use of a series of subsystem-oriented CER's and, where possible, CTR's in devising a suitable cost-effective policy.

  10. Forecasting energy demand and CO{sub 2}-emissions from energy production in the forest industry

    Energy Technology Data Exchange (ETDEWEB)

    Malinen, H

    1998-12-31

    The purpose of this study was to develop new methods for forecasting energy use in the forest industry. Scenarios have been the most commonly used forecasts, but they require a lot of work, and recent scenarios made for the forest industry give a wide range of results, e.g. from 27.8 TWh to 38 TWh for electricity use in 2010. There is thus a need for simpler and more accurate forecasting methods. The time scale of the study runs from 1975 to 2010, i.e. 36 years. The basic data, collected for the period 1975-1995, include wood use, production of the main product categories and energy use in the forest industry. The factors affecting energy use at both industry and mill level are presented, and the technology trends most likely to affect energy production and use and CO{sub 2}-emissions are studied. Recent forecasts of forest industry energy use up to the year 2010 are reviewed and analysed. Three alternative forecasting methods are studied more closely: (a) regression analysis, (b) growth curves and (c) the Delphi method. Total electricity demand, the share of purchased electricity, total fuel demand and the share of process-based biofuels are estimated for the period 1996-2010. The results of the different methods are compared with each other and with the recent scenarios, both for the resulting energy use and for the usefulness of the methods in practical work. The average forecast energy consumption for 2010 was 31.6 TWh for electricity and 6.2 Mtoe for fuel, with the share of purchased electricity totalling 73 % and process-based fuels 77 %; the corresponding figures for 1995 are 22.8 TWh, 5.5 Mtoe, 64 % and 68 %. All three methods were suitable for forecasting, required fewer working hours and were easier to use than scenarios, and gave results with a smaller deviation than scenarios, e.g.
with electricity use in 2010 from

  11. Forecasting energy demand and CO{sub 2}-emissions from energy production in the forest industry

    Energy Technology Data Exchange (ETDEWEB)

    Malinen, H.

    1997-12-31

    The purpose of this study was to develop new methods for forecasting energy use in the forest industry. Scenarios have been the most commonly used forecasts, but they require a lot of work, and recent scenarios made for the forest industry give a wide range of results, e.g. from 27.8 TWh to 38 TWh for electricity use in 2010. There is thus a need for simpler and more accurate forecasting methods. The time scale of the study runs from 1975 to 2010, i.e. 36 years. The basic data, collected for the period 1975-1995, include wood use, production of the main product categories and energy use in the forest industry. The factors affecting energy use at both industry and mill level are presented, and the technology trends most likely to affect energy production and use and CO{sub 2}-emissions are studied. Recent forecasts of forest industry energy use up to the year 2010 are reviewed and analysed. Three alternative forecasting methods are studied more closely: (a) regression analysis, (b) growth curves and (c) the Delphi method. Total electricity demand, the share of purchased electricity, total fuel demand and the share of process-based biofuels are estimated for the period 1996-2010. The results of the different methods are compared with each other and with the recent scenarios, both for the resulting energy use and for the usefulness of the methods in practical work. The average forecast energy consumption for 2010 was 31.6 TWh for electricity and 6.2 Mtoe for fuel, with the share of purchased electricity totalling 73 % and process-based fuels 77 %; the corresponding figures for 1995 are 22.8 TWh, 5.5 Mtoe, 64 % and 68 %. All three methods were suitable for forecasting, required fewer working hours and were easier to use than scenarios, and gave results with a smaller deviation than scenarios, e.g.
with electricity use in 2010 from

  12. Reliable before-fabrication forecasting of normal and touch mode MEMS capacitive pressure sensor: modeling and simulation

    Science.gov (United States)

    Jindal, Sumit Kumar; Mahajan, Ankush; Raghuwanshi, Sanjeev Kumar

    2017-10-01

    An analytical model and numerical simulation of the performance of MEMS capacitive pressure sensors in both normal and touch modes are required to forecast the expected behavior of the sensor prior to fabrication. Such information should be based on a complete analysis of performance parameters such as the deflection of the diaphragm, the change of capacitance as the diaphragm deflects, and the sensitivity of the sensor. In the literature, limited work has been carried out on this issue; moreover, because of the approximation factors in the polynomials used, a tolerance error cannot be ruled out. Reliable before-fabrication forecasting requires exact mathematical calculation of the parameters involved. A second-order polynomial equation is derived mathematically for the key performance parameters of both modes. This eliminates the approximation factor, so an exact result can be studied while maintaining high accuracy. The elimination of approximation factors and the derivation of exact results are based on a new design parameter (δ) that we propose. The design parameter gives designers an initial hint of how the sensor will behave once it is fabricated. The complete work is supported by extensive mathematical detailing of all the parameters involved. Finally, we verified our claims using MATLAB® simulation. Since MATLAB® effectively provides the simulation theory for the design approach, the more complicated finite element method is not used.
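For orientation only, the textbook parallel-plate idealisation of a normal-mode capacitive pressure sensor is sketched below; this is not the paper's second-order polynomial model or its design parameter δ, just the standard physics the performance parameters build on:

```python
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def normal_mode_capacitance(area_m2, gap_m, deflection_m):
    """Idealised capacitance of a deflected diaphragm treated as a
    rigid parallel plate: C = eps0 * A / (g - w). Once the deflection
    reaches the gap, the sensor enters touch mode and this model no
    longer applies."""
    if deflection_m >= gap_m:
        raise ValueError("touch mode: deflection exceeds the gap")
    return EPS0 * area_m2 / (gap_m - deflection_m)
```

Capacitance grows as the diaphragm deflects toward the substrate, which is what makes the deflection-capacitance relation the central performance parameter.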

  13. MSSM Forecast for the LHC

    CERN Document Server

    Cabrera, Maria Eugenia; de Austri, Roberto Ruiz

    2009-01-01

    We perform a forecast of the MSSM with universal soft terms (CMSSM) for the LHC, based on an improved Bayesian analysis. We do not incorporate ad hoc measures of fine-tuning to penalize unnatural possibilities: such penalization arises from the Bayesian analysis itself when the experimental value of $M_Z$ is considered. This allows scanning of the whole parameter space, with arbitrarily large soft terms permitted. Still, the low-energy region is statistically favoured (even before including dark matter or g-2 constraints). Contrary to other studies, the results are almost unaffected by changing the upper limits taken for the soft terms. The results are also remarkably stable when using flat or logarithmic priors, a fact that arises from the larger statistical weight of the low-energy region in both cases. We then incorporate all the important experimental constraints into the analysis, obtaining a map of the probability density of the MSSM parameter space, i.e. the forecast of the MSSM. Since not all the experimental i...

  14. Prediction of kharif rice yield at Kharagpur using disaggregated extended range rainfall forecasts

    Science.gov (United States)

    Dhekale, B. S.; Nageswararao, M. M.; Nair, Archana; Mohanty, U. C.; Swain, D. K.; Singh, K. K.; Arunbabu, T.

    2017-08-01

    The Extended Range Forecast System (ERFS) has been generating monthly and seasonal forecasts on a real-time basis throughout the year over India since 2009. India is one of the major rice producers and consumers in South Asia; more than 50% of the Indian population depends on rice as a staple food. Rice is mainly grown in the kharif season, which contributes 84% of the country's total annual rice production. Rice cultivation in India is largely rainfed, so the reliability of the rainfall forecast plays a crucial role in planning the kharif rice crop. In the present study, an attempt has been made to test the reliability of seasonal and sub-seasonal ERFS summer monsoon rainfall forecasts for kharif rice yield predictions at Kharagpur, West Bengal by using the CERES-Rice (DSSATv4.5) model. The ERFS forecasts are produced as monthly and seasonal mean values and are converted into daily sequences with stochastic weather generators for use with crop growth models. The daily sequences are generated from ERFS seasonal (June-September) and sub-seasonal (July-September, August-September, and September) summer monsoon rainfall forecasts, which are used as input to the CERES-Rice crop simulation model for crop yield prediction in hindcast (1985-2008) and real-time (2009-2015) modes. The yield simulated using India Meteorological Department (IMD) observed daily rainfall data is considered the baseline yield for evaluating the performance of yields predicted using the ERFS forecasts. The findings revealed that stochastic disaggregation can be used to disaggregate the monthly/seasonal ERFS forecasts into daily sequences.
The year-to-year variability in rice yield at Kharagpur is efficiently predicted by using the ERFS forecast products in hindcast as well as real time, and significant enhancement in the prediction skill is noticed with advancement in the season due to incorporation of observed weather data which reduces uncertainty of

  15. Forecast Informed Reservoir Operations: Bringing Science and Decision-Makers Together to Explore Use of Hydrometeorological Forecasts to Support Future Reservoir Operations

    Science.gov (United States)

    Ralph, F. M.; Jasperse, J.

    2017-12-01

    Forecast Informed Reservoir Operations (FIRO) is a proposed strategy that explores the incorporation of improved hydrometeorological forecasts of land-falling atmospheric rivers on the U.S. West Coast into reservoir operations. The first testbed for this strategy is Lake Mendocino, located on the East Fork of the 1485 mi2 Russian River Watershed in northern California. This project is guided by the Lake Mendocino FIRO Steering Committee (SC), an ad hoc committee of water managers and scientists from several federal, state, and local agencies and universities who have teamed up to evaluate whether current or improved technology and scientific understanding can be utilized to improve water supply reliability, enhance flood mitigation and support recovery of listed salmon for the Russian River of northern California. In 2015, the SC created a detailed work plan, which included a Preliminary Viability Assessment that has now been completed. The SC developed a vision that operational efficiency would be improved by using forecasts to inform decisions about releasing or storing water. FIRO would use available reservoir storage in an efficient manner by (1) better forecasting inflow (or lack of inflow) with enhanced technology, and (2) adapting operation in real time to meet the need for storage, rather than making storage available just in case it is needed. The envisioned FIRO strategy has the potential to simultaneously improve water supply reliability, flood protection, and ecosystem outcomes through more efficient use of existing infrastructure while requiring minimal capital improvements to the physical structure of the dam. This presentation will provide an overview of the creation of the FIRO SC and how it operates, and describe the lessons learned through this partnership. Results of the FIRO Preliminary Viability Assessment will be summarized and next steps described.

  16. A survey on wind power ramp forecasting.

    Energy Technology Data Exchange (ETDEWEB)

    Ferreira, C.; Gama, J.; Matias, L.; Botterud, A.; Wang, J. (Decision and Information Sciences); (INESC Porto)

    2011-02-23

    The increasing use of wind power as a source of electricity poses new challenges with regard to both power production and load balance in the electricity grid. This new source of energy is volatile and highly variable. The only way to integrate such power into the grid is to develop reliable and accurate wind power forecasting systems. Electricity generated from wind power can be highly variable at several different timescales: sub-hourly, hourly, daily, and seasonally. Wind energy, like other electricity sources, must be scheduled. Although wind power forecasting methods are used, the ability to predict wind plant output remains relatively low for short-term operation. Because instantaneous electrical generation and consumption must remain in balance to maintain grid stability, wind power's variability can present substantial challenges when large amounts of wind power are incorporated into a grid system. A critical issue is ramp events, which are sudden and large changes (increases or decreases) in wind power. This report presents an overview of current ramp definitions and state-of-the-art approaches in ramp event forecasting.
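
A minimal sketch of the kind of ramp definition the survey catalogues: a ramp is flagged when the power change over a fixed time window exceeds a threshold. The power series and thresholds below are invented:

```python
import numpy as np

# A common (non-unique) ramp definition: a change in power exceeding a
# threshold within a fixed time window. Data are invented.
def detect_ramps(power, window, threshold):
    """Return indices t where |power[t+window] - power[t]| >= threshold."""
    diffs = power[window:] - power[:-window]
    return np.flatnonzero(np.abs(diffs) >= threshold)

power = np.array([10.0, 11, 12, 30, 55, 57, 56, 40, 20, 18])  # MW, hourly
print(detect_ramps(power, window=2, threshold=25))  # → [2 3 6]
```

Real definitions differ in window length, threshold (absolute vs. relative to capacity), and whether up- and down-ramps are treated separately, which is exactly the variety the report surveys.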

  17. Questionnaire of social probability and potential consequences: Examination of reliability and validity on Serbian population

    Directory of Open Access Journals (Sweden)

    Ranđelović Kristina M.

    2014-01-01

    Full Text Available Judgment bias plays an important role in cognitive models of psychopathology. Any selective processing of emotionally relevant stimuli is called a cognitive bias, and judgment bias is considered a key factor in social anxiety disorder. It is defined as a disposition to overestimate the probability of negative social events occurring in the near future, as well as their potential consequences (the distress that might follow them). The perception of danger is essentially determined by the joint effect of the subjective assessment of probability and the distress created by certain events. Research has shown that socially anxious individuals display a more pronounced judgment bias and that it can be reduced by certain psychotherapeutic and pharmacological treatments, which confirms its relevance to social anxiety disorder. Considering the significance of the judgment bias construct for research and clinical practice, and the lack of an operational instrument in our country, the basic purpose of this paper is to check the metric characteristics of the Serbian version of one of the most frequently used questionnaires for assessing this construct, the Questionnaire of social probability and potential consequences, which has two subscales. The aims were: (1) to examine the reliability of the questionnaire on a sample of examinees from Serbia; (2) to examine the latent structure of the questionnaire; and (3) to examine the construct validity of the questionnaire by checking its correlations with other relevant constructs (personality traits, trait anxiety and fear of negative evaluation). The questionnaire was adapted from English into Serbian. The sample consists of 166 examinees aged from 19 to 29 (M = 21.73; SD = 1.43).
The questionnaire for sensitivity to confirmation assessment was used to estimate personality traits; trait anxiety was estimated by the

  18. Comparison of Statistical Post-Processing Methods for Probabilistic Wind Speed Forecasting

    Science.gov (United States)

    Han, Keunhee; Choi, JunTae; Kim, Chansoo

    2018-02-01

    In this study, statistical post-processing methods that produce bias-corrected and probabilistic forecasts of wind speed measured in PyeongChang, which is scheduled to host the 2018 Winter Olympics, are compared and analyzed to provide more accurate weather information. The six post-processing methods used in this study are as follows: the mean bias-corrected forecast, the mean and variance bias-corrected forecast, the decaying averaging forecast, the mean absolute bias-corrected forecast, and alternative implementations of the ensemble model output statistics (EMOS) and Bayesian model averaging (BMA) models, namely EMOS and BMA models assuming exchangeable ensemble members and simplified versions of the EMOS and BMA models. Observations of wind speed were obtained from 26 stations in PyeongChang, and 51 ensemble member forecasts derived from the European Centre for Medium-Range Weather Forecasts (ECMWF Directorate, 2012) were obtained between 1 May 2013 and 18 March 2016. Prior to applying the post-processing methods, reliability analysis was conducted using rank histograms to identify the statistical consistency of the ensemble forecasts and corresponding observations. Based on the results of our study, we found that the prediction skills of the probabilistic EMOS and BMA forecasts were superior to the bias-corrected forecasts in terms of deterministic prediction, whereas in probabilistic prediction, the BMA models showed better prediction skill than EMOS. Even though the simplified version of the BMA model exhibited the best prediction skill among the six methods, the results showed that the differences in prediction skill between the versions of EMOS and BMA were negligible.
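
The rank-histogram check described above can be sketched as follows; the ensemble and observations are synthetic, with 51 members mirroring the ECMWF ensemble size:

```python
import numpy as np

# Rank histogram (Talagrand diagram) sketch: for each case, find the rank of
# the observation within the sorted ensemble. A flat histogram indicates a
# statistically consistent ensemble. Data below are synthetic.
rng = np.random.default_rng(0)
n_cases, n_members = 5000, 51
ensemble = rng.normal(0.0, 1.0, size=(n_cases, n_members))
obs = rng.normal(0.0, 1.0, size=n_cases)   # drawn consistently, so ~flat

# Rank = number of members below the observation -> 0..n_members
ranks = (ensemble < obs[:, None]).sum(axis=1)
hist = np.bincount(ranks, minlength=n_members + 1)
print(hist.size, hist.sum())  # 52 bins, one count per case
```

An underdispersive ensemble would pile counts into the two outermost bins; a biased one would slope the histogram.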

  19. AN EVALUATION OF POINT AND DENSITY FORECASTS FOR SELECTED EU FARM GATE MILK PRICES

    Directory of Open Access Journals (Sweden)

    Dennis Bergmann

    2018-01-01

    Full Text Available Fundamental changes to the common agricultural policy (CAP) have led to greater market orientation, which in turn has resulted in sharply increased variability of EU farm gate milk prices and thus farmers' income. In this market environment, reliable forecasts of farm gate milk prices are extremely important, as farmers can make improved decisions with regard to cash flow management and budget preparation. In addition, these forecasts may be used in setting fixed-price contracts between dairy farmers and processors, thus providing certainty and reducing risk. In this study, both point and density forecasts from various time series models for farm gate milk prices in Germany, Ireland and for an average EU price series are evaluated using a rolling window framework. Additionally, forecasts of the individual models are combined using different combination schemes. The results of the out-of-sample evaluation show that ARIMA-type models perform well at short forecast horizons (1 to 3 months), while the structural time series approach performs well at longer forecast horizons (12 months). Finally, combining the individual forecasts of different models significantly improves the forecast performance at all forecast horizons.
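
A toy version of the combination result: an equal-weighted average of two imperfect point forecasts often beats either one. All prices and forecasts below are invented:

```python
import numpy as np

# Equal-weighted combination of two point forecasts, evaluated by RMSE.
actual = np.array([30.1, 31.0, 29.5, 32.2, 33.0])    # synthetic prices
f_arima = np.array([29.8, 31.4, 29.0, 31.8, 33.5])   # model 1 forecasts
f_struct = np.array([30.6, 30.2, 30.2, 32.8, 32.2])  # model 2 forecasts

rmse = lambda f: float(np.sqrt(np.mean((f - actual) ** 2)))
combined = 0.5 * (f_arima + f_struct)   # simple equal-weight scheme
print(rmse(f_arima), rmse(f_struct), rmse(combined))
```

Here the models' errors partially cancel, which is the usual mechanism behind combination gains; a rolling-window evaluation would repeat this comparison as the estimation sample advances.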

  20. Developing integrated performance assessment and forecasting the level of financial and economic enterprise stability

    Directory of Open Access Journals (Sweden)

    Khudyakova T.A.

    2017-01-01

    Full Text Available The article deals with the problem of assessing and forecasting the level of financial and economic stability of an enterprise through the development of integrated indicators. Currently, many enterprises operate in a variable external environment, which imposes a strict requirement to account for this uncertainty. For the evaluation, analysis and prediction of enterprise sustainability under crisis conditions, we believe it is possible and necessary to use the apparatus of probability theory and mathematical statistics. Solving this problem will improve the quantitative assessment of the level of financial and economic stability and the forecasting of possible scenarios of enterprise development and, therefore, based on proactive management principles and adaptation processes, will greatly increase the effectiveness of their functioning as well as reduce the probability of bankruptcy.

  1. Principles of Bridge Reliability

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle; Nowak, Andrzej S.

    The paper gives a brief introduction to the basic principles of structural reliability theory and its application to bridge engineering. Fundamental concepts like failure probability and reliability index are introduced. Ultimate as well as serviceability limit states for bridges are formulated......, and as an example the reliability profile and a sensitivity analysis for a corroded reinforced concrete bridge are shown....

  2. Forecasting of Radiation Belts: Results From the PROGRESS Project.

    Science.gov (United States)

    Balikhin, M. A.; Arber, T. D.; Ganushkina, N. Y.; Walker, S. N.

    2017-12-01

    The overall goal of the PROGRESS project, funded within the EU Horizon 2020 programme, is to combine first-principles models with systems-science methodologies to achieve reliable forecasts of the geo-space particle radiation environment. PROGRESS incorporates three themes: the propagation of the solar wind to L1, the forecast of geomagnetic indices, and the forecast of fluxes of energetic electrons within the magnetosphere. One important aspect of the PROGRESS project is the development of statistical wave models for magnetospheric waves that affect the dynamics of energetic electrons, such as lower-band chorus, hiss and equatorial noise. The error reduction ratio (ERR) concept has been used to optimise the set of solar wind and geomagnetic parameters for organisation of statistical wave models for these emissions. The resulting sets of parameters and statistical wave models will be presented and discussed. However, the ERR analysis also indicates that the combination of solar wind and geomagnetic parameters accounts for only part of the variance of the emissions under investigation (lower-band chorus, hiss and equatorial noise). In addition, the advances in the forecast of fluxes of energetic electrons achieved by the PROGRESS project, exploiting empirical models and the first-principles IMPTAM model, are presented.

  3. Statistical Short-Range Guidance for Peak Wind Speed Forecasts on Kennedy Space Center/Cape Canaveral Air Force Station: Phase I Results

    Science.gov (United States)

    Lambert, Winifred C.; Merceret, Francis J. (Technical Monitor)

    2002-01-01

    This report describes the results of the AMU's (Applied Meteorology Unit) Short-Range Statistical Forecasting task for peak winds. Peak wind speeds are an important forecast element for the Space Shuttle and Expendable Launch Vehicle programs. The 45th Weather Squadron and the Spaceflight Meteorology Group indicate that peak winds are challenging to forecast. The Applied Meteorology Unit was tasked to develop tools that aid in short-range forecasts of peak winds at tower sites of operational interest. A 7-year record of wind tower data was used in the analysis. Hourly and directional climatologies by tower and month were developed to determine the seasonal behavior of the average and peak winds. In all climatologies, the average and peak wind speeds were highly variable in time, indicating that the development of a peak wind forecasting tool would be difficult. Probability density functions (PDFs) of peak wind speed were calculated to determine the distribution of peak speed with average speed. These provide forecasters with a means of determining the probability of meeting or exceeding a certain peak wind given an observed or forecast average speed. The climatologies and PDFs provide tools with which to make the peak wind forecasts that are critical to safe operations.
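
The PDF-based tool can be sketched as an empirical conditional exceedance probability, P(peak ≥ threshold | average in bin), estimated from paired observations. The wind data below are synthetic, not tower records:

```python
import numpy as np

# Synthetic paired (average, peak) wind speeds: peak = average times a
# gust factor >= 1 drawn from a skewed distribution. Values are invented.
rng = np.random.default_rng(1)
avg = rng.uniform(5, 25, 20000)                     # knots
gust_factor = 1.0 + rng.gamma(2.0, 0.15, avg.size)  # peak/avg ratio
peak = avg * gust_factor

def prob_peak_exceeds(threshold, avg_lo, avg_hi):
    """Empirical P(peak >= threshold) within an average-speed bin."""
    sel = (avg >= avg_lo) & (avg < avg_hi)
    return float(np.mean(peak[sel] >= threshold))

p_low = prob_peak_exceeds(25.0, 10.0, 12.0)
p_high = prob_peak_exceeds(25.0, 18.0, 20.0)
print(p_low, p_high)  # higher averages make the 25 kt peak more likely
```

The operational tool would condition the same way on tower, month, hour and direction, which is what the climatologies in the report provide.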

  4. High-Resolution Hydrological Sub-Seasonal Forecasting for Water Resources Management Over Europe

    Science.gov (United States)

    Wood, E. F.; Wanders, N.; Pan, M.; Sheffield, J.; Samaniego, L. E.; Thober, S.; Kumar, R.; Prudhomme, C.; Houghton-Carr, H.

    2017-12-01

    For decision-making at the sub-seasonal and seasonal time scale, hydrological forecasts with a high temporal and spatial resolution are required by water managers. So far such forecasts have been unavailable due to 1) lack of availability of meteorological seasonal forecasts, 2) coarse temporal resolution of meteorological seasonal forecasts, requiring temporal downscaling, 3) lack of consistency between observations and seasonal forecasts, requiring bias-correction. The EDgE (End-to-end Demonstrator for improved decision making in the water sector in Europe) project commissioned by the ECMWF (C3S) created a unique dataset of hydrological seasonal forecasts derived from four global climate models (CanCM4, FLOR-B01, ECMF, LFPW) in combination with four global hydrological models (PCR-GLOBWB, VIC, mHM, Noah-MP), resulting in 208 forecasts for any given day. The forecasts provide a daily temporal and 5-km spatial resolution, and are bias corrected against E-OBS meteorological observations. The forecasts are communicated to stakeholders via Sectoral Climate Impact Indicators (SCIIs), created in collaboration with the end-user community of the EDgE project (e.g. the percentage of ensemble realizations above the 10th percentile of monthly river flow, or below the 90th). Results show skillful forecasts for discharge from 3 months to 6 months (the latter for northern Europe due to snow); for soil moisture up to three months due to precipitation forecast skill and short initial-condition memory; and for groundwater greater than 6 months (lowest skill in western Europe). The SCIIs are effective in communicating both forecast skill and uncertainty. Overall the new system provides an unprecedented ensemble for seasonal forecasts with significant skill over Europe to support water management. The consistency in both the GCM forecasts and the LSM parameterization ensures a stable and reliable forecast framework and methodology, even if additional GCMs or LSMs are added in the future.
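
One of the SCIIs quoted above (the fraction of ensemble realizations beyond a climatological percentile) can be sketched directly; the flows below are synthetic, and only the 208-member ensemble size mirrors the abstract:

```python
import numpy as np

# Sketch of a low-flow Sectoral Climate Impact Indicator: the percentage of
# ensemble realizations falling below the 10th percentile of climatological
# monthly river flow. All flow values are synthetic.
rng = np.random.default_rng(2)
climatology = rng.gamma(3.0, 50.0, 1000)   # historical monthly flows, m3/s
q10 = np.percentile(climatology, 10)       # climatological low-flow level

ensemble_flow = rng.gamma(2.0, 40.0, 208)  # 208 forecasts, as in EDgE
scii = 100.0 * np.mean(ensemble_flow < q10)  # percent of members below q10
print(round(scii, 1))
```

A high value signals an elevated low-flow risk to the water manager while implicitly carrying the ensemble spread, which is why such indicators communicate skill and uncertainty together.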

  5. Extending Data Worth Analyses to Select Multiple Observations Targeting Multiple Forecasts

    DEFF Research Database (Denmark)

    Vilhelmsen, Troels Norvin; Ferre, Ty Paul

    2017-01-01

. In the present study, we extend previous data worth analyses to include: simultaneous selection of multiple new measurements and consideration of multiple forecasts of interest. We show how the suggested approach can be used to optimize data collection. This can be used in a manner that suggests specific...... measurement sets or that produces probability maps indicating areas likely to be informative for specific forecasts. Moreover, we provide examples documenting that sequential measurement selection approaches often lead to suboptimal designs and that estimates of data covariance should be included when

  6. Probabilistic Forecasting of Photovoltaic Generation: An Efficient Statistical Approach

    DEFF Research Database (Denmark)

    Wan, Can; Lin, Jin; Song, Yonghua

    2017-01-01

    This letter proposes a novel efficient probabilistic forecasting approach to accurately quantify the variability and uncertainty of the power production from photovoltaic (PV) systems. Distinguished from most existing models, a linear programming based prediction interval construction model for PV power generation is proposed based on extreme learning machine and quantile regression, featuring high reliability and computational efficiency. The proposed approach is validated through numerical studies on PV data from Denmark.

  7. Use of High-Resolution WRF Simulations to Forecast Lightning Threat

    Science.gov (United States)

    McCaul, E. W., Jr.; LaCasse, K.; Goodman, S. J.; Cecil, D. J.

    2008-01-01

    Recent observational studies have confirmed the existence of a robust statistical relationship between lightning flash rates and the amount of large precipitating ice hydrometeors aloft in storms. This relationship is exploited, in conjunction with the capabilities of cloud-resolving forecast models such as WRF, to forecast explicitly the threat of lightning from convective storms using selected output fields from the model forecasts. The simulated vertical flux of graupel at -15°C and the shape of the simulated reflectivity profile are tested in this study as proxies for charge separation processes and their associated lightning risk. Our lightning forecast method differs from others in that it is entirely based on high-resolution simulation output, without reliance on any climatological data. Short (6-8 h) simulations are conducted for a number of case studies for which three-dimensional lightning validation data from the North Alabama Lightning Mapping Array are available. Experiments indicate that initialization of the WRF model on a 2 km grid using Eta boundary conditions, Doppler radar radial velocity fields, and METAR and ACARS data yield satisfactory simulations. Analyses of the lightning threat fields suggest that both the graupel flux and reflectivity profile approaches, when properly calibrated, can yield reasonable lightning threat forecasts, although an ensemble approach is probably desirable in order to reduce the tendency for misplacement of modeled storms to hurt the accuracy of the forecasts. Our lightning threat forecasts are also compared to other more traditional means of forecasting thunderstorms, such as those based on inspection of the convective available potential energy field.

  8. The management of subsurface uncertainty using probabilistic modeling of life cycle production forecasts and cash flows

    International Nuclear Information System (INIS)

    Olatunbosun, O. O.

    1998-01-01

    The subject pertains to the implementation of the full range of subsurface uncertainties in life cycle probabilistic forecasting and its extension to project cash flows using the methodology of probabilities. A new tool has been developed in the probabilistic application Crystal Ball which can model reservoir volumetrics, life cycle production forecasts and project cash flows in a single environment. The tool is modular, such that the volumetrics and cash flow modules are optional. Production forecasts are often generated by applying a decline equation to single best-estimate values of input parameters such as initial potential, decline rate, abandonment rate, etc., or sometimes from the results of reservoir simulation. This new tool provides a means of implementing the full range of uncertainties and interdependencies of the input parameters into the production forecasts by defining the input parameters as probability density functions (PDFs) and performing several iterations to generate an expectation curve forecast. The abandonment rate is implemented in each iteration via a link to an OPEX model. The expectation curve forecast is input into a cash flow model to generate a probabilistic NPV. Base case and sensitivity runs from reservoir simulation can likewise form the basis for a probabilistic production forecast from which a probabilistic cash flow can be generated. A good illustration of the application of this tool is in the modelling of the production forecast for a well that encounters its target reservoirs in an OUT/ODT situation and thus has significant uncertainties. The uncertainty in the presence and size (if present) of a gas cap and the dependency between ultimate recovery and initial potential, amongst other uncertainties, can easily be implemented in the production forecast with this tool. From the expectation curve forecast, a probabilistic NPV can be easily generated. Possible applications of this tool include: i.
estimation of range of actual recoverable volumes based
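
A minimal Monte Carlo sketch of the expectation-curve idea described above, using an exponential decline and invented parameter distributions (the tool itself runs in Crystal Ball and includes an OPEX-linked abandonment rate, both omitted here):

```python
import numpy as np

# Monte Carlo production forecast: initial potential and decline rate are
# drawn from PDFs and an exponential decline is run per iteration.
# All numbers are hypothetical.
rng = np.random.default_rng(3)
n_iter, years = 5000, np.arange(0, 15)

qi = rng.lognormal(mean=np.log(1000.0), sigma=0.3, size=n_iter)  # bbl/d
decline = rng.uniform(0.10, 0.25, n_iter)                        # 1/yr

# rate[i, t] = qi_i * exp(-D_i * t): one forecast realization per row
rates = qi[:, None] * np.exp(-decline[:, None] * years[None, :])
cum = rates.sum(axis=1) * 365.0   # crude annual-step cumulative, bbl

# Expectation-curve style summary: P90 (conservative) to P10 (optimistic),
# where P90 is the value exceeded with 90% probability.
p90, p50, p10 = np.percentile(cum, [10, 50, 90])
print(p90 < p50 < p10)
```

Feeding each realization through a cash flow model, instead of summarizing first, is what turns this into the probabilistic NPV described in the abstract.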

  9. Forecast Combinations

    OpenAIRE

    Timmermann, Allan G

    2005-01-01

    Forecast combinations have frequently been found in empirical studies to produce better forecasts on average than methods based on the ex-ante best individual forecasting model. Moreover, simple combinations that ignore correlations between forecast errors often dominate more refined combination schemes aimed at estimating the theoretically optimal combination weights. In this paper we analyse theoretically the factors that determine the advantages from combining forecasts (for example, the d...

  10. Forecaster Behaviour and Bias in Macroeconomic Forecasts

    OpenAIRE

    Roy Batchelor

    2007-01-01

    This paper documents the presence of systematic bias in the real GDP and inflation forecasts of private sector forecasters in the G7 economies in the years 1990–2005. The data come from the monthly Consensus Economics forecasting service, and bias is measured and tested for significance using parametric fixed effect panel regressions and nonparametric tests on accuracy ranks. We examine patterns across countries and forecasters to establish whether the bias reflects the inefficient use of i...

  11. Clustering and Support Vector Regression for Water Demand Forecasting and Anomaly Detection

    Directory of Open Access Journals (Sweden)

    Antonio Candelieri

    2017-03-01

    Full Text Available This paper presents a completely data-driven and machine-learning-based approach, in two stages, to first characterize and then forecast hourly water demand in the short term, with application to two different data sources: urban water demand (SCADA data) and individual customer water consumption (AMR data). In the first case, reliable forecasting can be used to optimize operations, particularly the pumping schedule, in order to reduce energy-related costs, while in the second case, the comparison between forecast and actual values may support the online detection of anomalies, such as smart meter faults, fraud or possible cyber-physical attacks. Results are presented for a real case: the water distribution network in Milan.
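
The anomaly-detection stage can be illustrated with a simplified stand-in: comparing actual consumption against a forecast and flagging large residuals. The paper's actual pipeline uses clustering plus support vector regression; here a fixed hourly profile and a z-score threshold are substituted, and all data are synthetic:

```python
import numpy as np

# Flag hours where actual consumption deviates from the forecast by an
# unusually large residual (stand-in for the paper's clustering + SVR).
rng = np.random.default_rng(4)
hours = 24 * 30
forecast = 100 + 20 * np.sin(2 * np.pi * np.arange(hours) / 24)  # L/h profile
actual = forecast + rng.normal(0, 3, hours)
actual[500] += 40.0   # injected fault (e.g. meter failure or leak)

resid = actual - forecast
z = (resid - resid.mean()) / resid.std()      # standardized residuals
anomalies = np.flatnonzero(np.abs(z) > 4.0)   # 4-sigma alert threshold
print(anomalies)
```

In the SCADA use case the same forecast would instead feed pump scheduling, where accuracy translates directly into energy-cost savings.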

  12. Structural hybrid reliability index and its convergent solving method based on random–fuzzy–interval reliability model

    OpenAIRE

    Hai An; Ling Zhou; Hui Sun

    2016-01-01

    Aiming to resolve the problem that a variety of uncertainty variables coexist in engineering structural reliability analysis, a new hybrid reliability index to evaluate structural hybrid reliability, based on the random–fuzzy–interval model, is proposed in this article. A convergent solving method is also presented. First, the truncated probability reliability model, the fuzzy random reliability model, and the non-probabilistic interval reliability model are introduced. Then, the new...

  13. Do location specific forecasts pose a new challenge for communicating uncertainty?

    Science.gov (United States)

    Abraham, Shyamali; Bartlett, Rachel; Standage, Matthew; Black, Alison; Charlton-Perez, Andrew; McCloy, Rachel

    2015-04-01

    In the last decade, the growth of local, site-specific weather forecasts delivered by mobile phone or website represents arguably the fastest change in forecast consumption since the beginning of television weather forecasts 60 years ago. In this study, a street-interception survey of 274 members of the public shows, for the first time, a clear first preference for narrow weather forecasts over traditional broad weather forecasts, with a clear bias towards this preference among users under 40. The impact of this change on the understanding of forecast probability and intensity information is explored. While correct interpretation of the statement 'There is a 30% chance of rain tomorrow' is still low in the cohort, in common with previous studies, a clear impact of age and educational attainment on understanding is shown, with those under 40 and those educated to degree level or above more likely to interpret it correctly. The interpretation of rainfall intensity descriptors ('Light', 'Moderate', 'Heavy') by the cohort is shown to be significantly different from official and expert assessments of the same descriptors and to have large variance amongst the cohort. However, despite these key uncertainties, members of the cohort generally seem to make appropriate decisions about rainfall forecasts. There is some evidence that the decisions made differ depending on the communication format used, and the cohort expressed a clear preference for tabular over graphical weather forecast presentation.

  14. Forecast combinations

    OpenAIRE

    Aiolfi, Marco; Capistrán, Carlos; Timmermann, Allan

    2010-01-01

    We consider combinations of subjective survey forecasts and model-based forecasts from linear and non-linear univariate specifications as well as multivariate factor-augmented models. Empirical results suggest that a simple equal-weighted average of survey forecasts outperforms the best model-based forecasts for a majority of macroeconomic variables and forecast horizons. Additional improvements can in some cases be gained by using a simple equal-weighted average of survey and model-based fore...

  15. Extending Data Worth Analyses to Select Multiple Observations Targeting Multiple Forecasts.

    Science.gov (United States)

    Vilhelmsen, Troels N; Ferré, Ty P A

    2017-09-15

    Hydrological models are often set up to provide specific forecasts of interest. Owing to the inherent uncertainty in the data used to derive model structure and to constrain parameter variations, the model forecasts will be uncertain. Additional data collection is often performed to minimize this forecast uncertainty. Given common financial restrictions, it is critical that we identify the data with maximal information content with respect to the forecasts of interest. In practice, this often devolves to qualitative decisions based on expert opinion. However, there is no assurance that this will lead to an optimal design, especially for complex hydrogeological problems. Specifically, these complexities include considerations of multiple forecasts, shared information among potential observations, information content of existing data, and the assumptions and simplifications underlying model construction. In the present study, we extend previous data worth analyses to include simultaneous selection of multiple new measurements and consideration of multiple forecasts of interest. We show how the suggested approach can be used to optimize data collection. This can be used in a manner that suggests specific measurement sets or that produces probability maps indicating areas likely to be informative for specific forecasts. Moreover, we provide examples documenting that sequential measurement selection approaches often lead to suboptimal designs and that estimates of data covariance should be included when selecting future measurement sets. © 2017, National Ground Water Association.

  16. An overview of wind power forecast types and their use in large-scale integration of wind power

    Energy Technology Data Exchange (ETDEWEB)

    Nielsen, Henrik Aalborg; Nielsen, Torben Skov [ENFOR A/S, Hørsholm (Denmark); Madsen, Henrik [Technical Univ. of Denmark, Lyngby (Denmark). Informatics and Mathematical Modelling

    2011-07-01

    Wind power forecast characteristics are described and it is shown how analyses of actual decision problems can be used to derive the forecast characteristics important in a given situation. Generally, characteristics related to resolution in space and time, together with the required maximal forecast horizon, are easily identified. However, identification of the forecast characteristics required for optimal decision support calls for a more thorough investigation, which is illustrated by examples. Generally, quantile forecasts of the future wind power production are required, but the transformation of a quantile forecast into actual decisions is highly dependent on the precise formulation of the decision problem. Furthermore, when the consequences of neighbouring time steps interact, quantile forecasts are not sufficient. It is argued that a general solution in such cases is to base the decision on reliable scenarios of the future wind power production. (orig.)

  17. Influence of wind energy forecast in deterministic and probabilistic sizing of reserves

    Energy Technology Data Exchange (ETDEWEB)

    Gil, A.; Torre, M. de la; Dominguez, T.; Rivas, R. [Red Electrica de Espana (REE), Madrid (Spain). Dept. Centro de Control Electrico

    2010-07-01

    One of the challenges in large-scale wind energy integration in electrical systems is coping with wind forecast uncertainties when sizing generation reserves. These reserves must be sized large enough that they do not compromise security of supply or the balance of the system, but economic efficiency must also be kept in mind. This paper describes two methods of sizing spinning reserves that take wind forecast uncertainties into account: a deterministic method using a probabilistic wind forecast, and a probabilistic method using stochastic variables. The deterministic method calculates the spinning reserve needed by adding components, each intended to cover a single source of uncertainty: demand errors, the largest thermal generation loss and wind forecast errors. The probabilistic method assumes that demand forecast errors, short-term thermal group unavailability and wind forecast errors are independent stochastic variables and calculates the probability density function of the three variables combined. These methods are being used in the case of the Spanish peninsular system, in which wind energy accounted for 14% of the total electrical energy produced in 2009 and which is one of the systems in the world with the highest wind penetration levels. (orig.)
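The probabilistic method described above amounts to convolving the independent error densities and reading the reserve off at a chosen coverage quantile. The following minimal sketch is not REE's implementation; the Gaussian error shapes, the grid, and the 99.9% coverage level are illustrative assumptions:

```python
import numpy as np

# Combine independent forecast-error densities by numerical convolution
# and size the reserve at a chosen coverage quantile (illustrative sketch).
def combined_reserve(pdfs, x, coverage=0.999):
    step = x[1] - x[0]
    combined, lo = pdfs[0], x[0]
    for pdf in pdfs[1:]:
        combined = np.convolve(combined, pdf) * step  # density of the sum
        lo += x[0]                                    # support of the sum widens
    combined /= combined.sum() * step                 # renormalise the density
    cdf = np.cumsum(combined) * step
    return lo + np.searchsorted(cdf, coverage) * step

step = 10.0                                           # MW grid resolution
x = np.arange(-2000.0, 2000.0 + step, step)

def gauss(sigma):
    return np.exp(-0.5 * (x / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

# Hypothetical demand error, thermal-unit outage and wind error spreads (MW).
reserve = combined_reserve([gauss(300.0), gauss(400.0), gauss(500.0)], x)
```

Because the errors are independent, the combined standard deviation is the root-sum-square of the components (here about 707 MW), so the probabilistic reserve is far smaller than the sum of the three deterministic worst cases.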

  18. Combining empirical approaches and error modelling to enhance predictive uncertainty estimation in extrapolation for operational flood forecasting. Tests on flood events on the Loire basin, France.

    Science.gov (United States)

    Berthet, Lionel; Marty, Renaud; Bourgin, François; Viatgé, Julie; Piotte, Olivier; Perrin, Charles

    2017-04-01

    An increasing number of operational flood forecasting centres assess the predictive uncertainty associated with their forecasts and communicate it to the end users. This information can match the end users' needs (i.e. prove useful for efficient crisis management) only if it is reliable: reliability is therefore a key quality of operational flood forecasts. In 2015, the French flood forecasting national and regional services (Vigicrues network; www.vigicrues.gouv.fr) implemented a framework to compute quantitative discharge and water level forecasts and to assess the predictive uncertainty. Among the possible technical options to achieve this goal, a statistical analysis of past forecasting errors of deterministic models was selected (QUOIQUE method, Bourgin, 2014). It is a data-based and non-parametric approach relying on as few assumptions as possible about the mathematical structure of the forecasting error. In particular, a very simple assumption is made regarding the predictive uncertainty distributions for large events outside the range of the calibration data: the multiplicative error distribution is assumed to be constant, whatever the magnitude of the flood. Indeed, the predictive distributions may not be reliable in extrapolation. However, estimating the predictive uncertainty for these rare events is crucial when major floods are of concern. In order to improve forecast reliability for major floods, an attempt is made to combine the operational strength of the empirical statistical analysis with simple error modelling. Since the heteroscedasticity of forecast errors can considerably weaken predictive reliability for large floods, this error modelling is based on the log-sinh transformation, which has been shown to significantly reduce the heteroscedasticity of the transformed error in a simulation context, even for flood peaks (Wang et al., 2012). Exploratory tests on some operational forecasts issued during the recent floods experienced in

  19. Earthquakes and forecast reliability: thermoactivation and mesomechanics of the focal zone

    Science.gov (United States)

    Kalinnikov, I. I.; Manukin, A. B.; Matyunin, V. P.

    2017-06-01

    According to our data, the involvement of the fundamental laws of physics, in particular, consideration of an earthquake as a particular macroprocess with a peak together with the thermofluctuational activation of mechanical stresses in some environments, makes it possible to move beyond the traditional idea of the issue of earthquake prediction. Many formal parameters of statistical processing of the geophysical data can be provided with a physical sense related to the mesomechanics of structural changes in a stressed solid body. Measures for improving the efficiency of observations and their mathematical processing to solve the forecasting issues have been specified.

  20. Data Assimilation at FLUXNET to Improve Models towards Ecological Forecasting (Invited)

    Science.gov (United States)

    Luo, Y.

    2009-12-01

    Dramatically increased volumes of data from observational and experimental networks such as FLUXNET call for a transformation of ecological research to increase its emphasis on quantitative forecasting. Ecological forecasting will also meet the societal need to develop better strategies for natural resource management in a world of ongoing global change. Traditionally, ecological forecasting has been based on process-based models, informed by data in largely ad hoc ways. Although most ecological models incorporate some representation of mechanistic processes, today’s ecological models are generally not adequate to quantify real-world dynamics and provide reliable forecasts with accompanying estimates of uncertainty. A key tool to improve ecological forecasting is data assimilation, which uses data to inform initial conditions and to help constrain a model during simulation to yield results that approximate reality as closely as possible. In an era with dramatically increased availability of data from observational and experimental networks, data assimilation is a key technique that helps convert raw data into ecologically meaningful products so as to accelerate our understanding of ecological processes, test ecological theory, forecast changes in ecological services, and better serve society. This talk will use examples to illustrate how data from FLUXNET have been assimilated with process-based models to improve estimates of model parameters and state variables; to quantify uncertainties in ecological forecasting arising from observations, models and their interactions; and to evaluate the information contributions of data and models toward short- and long-term forecasting of ecosystem responses to global change.

  1. PredicForex. A tool for a reliable market. Playing with currencies.

    Directory of Open Access Journals (Sweden)

    C. Cortés Velasco

    2009-12-01

    Full Text Available The Forex market is a very interesting market, and finding a suitable tool to forecast currency behaviour is of great interest. It is almost impossible to find a 100% reliable tool: this market, like any other, is unpredictable. Nevertheless, we developed a tool that makes use of a web crawler, data mining and web services to offer forecasts and advice to any user or broker.

  2. Diagnosing Geospatial Uncertainty Visualization Challenges in Seasonal Temperature and Precipitation Forecasts

    Science.gov (United States)

    Speciale, A.; Kenney, M. A.; Gerst, M.; Baer, A. E.; DeWitt, D.; Gottschalk, J.; Handel, S.

    2017-12-01

    The uncertainty of future weather and climate conditions is important for many decisions made in communities and economic sectors. One tool that decision-makers use in gauging this uncertainty is forecasts, especially maps (or visualizations) of probabilistic forecast results. However, visualizing geospatial uncertainty is challenging because including probability introduces an extra variable to represent, and probability is often poorly understood by users. Using focus group and survey methods, this study seeks to understand the barriers to using probabilistic temperature and precipitation visualizations for specific decisions in the agriculture, energy, emergency management, and water resource sectors. Preliminary results shown here focus on findings about emergency manager needs. Our experimental design uses the National Oceanic and Atmospheric Administration's (NOAA) Climate Prediction Center (CPC) climate outlooks, which produce probabilistic temperature and precipitation forecast visualizations at the 6-10 day, 8-14 day, 3-4 week, and 1 and 3 month timeframes. Users were asked to complete questions related to how they use weather information, how uncertainty is represented, and design elements (e.g., color, contour lines) of the visualizations. Preliminary results from the emergency management sector indicate there is significant confusion about how "normal" weather is defined, the boundaries between probability ranges, and the meaning of the contour lines. After a complete understandability diagnosis is made using results from all sectors, we will collaborate with CPC to suggest modifications to the climate outlook visualizations. These modifications will then be retested in similar focus groups and web-based surveys to confirm that they better meet the needs of users.

  3. Short-term droughts forecast using Markov chain model in Victoria, Australia

    Science.gov (United States)

    Rahmat, Siti Nazahiyah; Jayasuriya, Niranjali; Bhuiyan, Muhammed A.

    2017-07-01

    A comprehensive risk management strategy for dealing with drought should include both short-term and long-term planning. The objective of this paper is to present an early warning method to forecast drought using the Standardised Precipitation Index (SPI) and a non-homogeneous Markov chain model. A model such as this is useful for short-term planning. The developed method has been used to forecast droughts at a number of meteorological monitoring stations that have been regionalised into six homogeneous clusters with similar drought characteristics based on the SPI. The non-homogeneous Markov chain model was used to estimate drought probabilities and drought predictions up to 3 months ahead. The drought severity classes defined using the SPI were computed at a 12-month time scale. The drought probabilities and the predictions were computed for six clusters that depict similar drought characteristics in Victoria, Australia. Overall, the drought severity class predicted was quite similar for all the clusters, with the non-drought class probabilities ranging from 49 to 57 %. For all clusters, the near normal class had a probability of occurrence varying from 27 to 38 %. For the moderate and severe classes, the probabilities ranged from 2 to 13 % and 3 to 1 %, respectively. The developed model predicted drought situations 1 month ahead reasonably well. However, 2 and 3 months ahead predictions should be used with caution until the models are developed further.
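The core Markov-chain step can be sketched as follows. For simplicity the sketch uses a homogeneous chain with a single transition matrix, whereas the study's non-homogeneous chain lets the matrix vary through the year; the four class labels and the toy SPI-class sequence are assumptions:

```python
import numpy as np

# Illustrative SPI drought classes (an assumption, not the paper's exact set).
CLASSES = ["non-drought", "near normal", "moderate", "severe"]

def transition_matrix(states, n_classes):
    """Maximum-likelihood transition matrix from an observed class sequence."""
    counts = np.zeros((n_classes, n_classes))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)

def forecast_probabilities(p0, P, steps):
    """Class probabilities `steps` months ahead given current distribution p0."""
    return p0 @ np.linalg.matrix_power(P, steps)

# Toy monthly class sequence (indices into CLASSES).
states = [0, 0, 1, 1, 0, 2, 1, 0, 0, 3, 2, 1, 0, 0, 1, 0]
P = transition_matrix(states, len(CLASSES))
# Probabilities of each class 3 months ahead, starting from "non-drought".
p3 = forecast_probabilities(np.array([1.0, 0.0, 0.0, 0.0]), P, 3)
```

Multiplying the current class distribution by powers of the transition matrix is what yields the 1-, 2- and 3-month-ahead class probabilities reported in the abstract.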

  4. Improving operational flood forecasting through data assimilation

    Science.gov (United States)

    Rakovec, Oldrich; Weerts, Albrecht; Uijlenhoet, Remko; Hazenberg, Pieter; Torfs, Paul

    2010-05-01

    Accurate flood forecasts have been a challenging topic in hydrology for decades. Uncertainty in hydrological forecasts is due to errors in the initial state (e.g. forcing errors in historical mode), errors in model structure and parameters and, last but not least, errors in the model forcings (weather forecasts) during forecast mode. More accurate flood forecasts can be obtained through data assimilation by merging observations with model simulations. This makes it possible to identify the sources of uncertainty in the flood forecasting system. Our aim is to assess the different sources of error that affect the initial state and to investigate how they propagate through hydrological models with different levels of spatial variation, starting from lumped models. The knowledge thus obtained can then be used in a data assimilation scheme to improve the flood forecasts. This study presents the first results of this framework and focuses on quantifying precipitation errors and their effect on discharge simulations within the Ourthe catchment (1600 km2), which is situated in the Belgian Ardennes and is one of the larger subbasins of the Meuse River. Inside the catchment, hourly rain gauge information from 10 different locations is available over a period of 15 years. Based on these time series, the bootstrap method has been applied to generate precipitation ensembles. These were then used to simulate the catchment's discharges at the outlet. The corresponding streamflow ensembles were further assimilated with observed river discharges to update the model states of lumped hydrological models (R-PDM, HBV) through residual resampling. This particle filtering technique is a sequential data assimilation method and makes no prior assumption about the probability density function of the model states, which, in contrast to the ensemble Kalman filter, does not have to be Gaussian. Our further research will be aimed at quantifying and reducing the sources of uncertainty that affect the initial
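The residual-resampling update mentioned above can be sketched in a few lines (a generic textbook formulation, not the authors' code): each particle is first replicated deterministically according to the integer part of N times its weight, and the remaining slots are drawn multinomially from the residual weights, which reduces resampling noise compared with plain multinomial resampling.

```python
import numpy as np

def residual_resample(weights, rng):
    """Return resampled particle indices using residual resampling."""
    n = len(weights)
    counts = np.floor(n * weights).astype(int)   # deterministic replication
    n_rest = n - counts.sum()                    # slots left to fill
    if n_rest > 0:
        residual = n * weights - counts          # leftover weight mass
        counts += rng.multinomial(n_rest, residual / residual.sum())
    return np.repeat(np.arange(n), counts)       # indices into the particle set

# Toy example with four particles and normalised weights.
w = np.array([0.5, 0.3, 0.1, 0.1])
idx = residual_resample(w, np.random.default_rng(0))
```

The returned indices select which particles (model states) survive into the next assimilation cycle; here particle 0 is guaranteed at least two copies and particle 1 at least one.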

  5. Space-time wind speed forecasting for improved power system dispatch

    KAUST Repository

    Zhu, Xinxin

    2014-02-27

    To support large-scale integration of wind power into electric energy systems, state-of-the-art wind speed forecasting methods should be able to provide accurate and adequate information to enable efficient, reliable, and cost-effective scheduling of wind power. Here, we incorporate space-time wind forecasts into electric power system scheduling. First, we propose a modified regime-switching, space-time wind speed forecasting model that allows the forecast regimes to vary with the dominant wind direction and with the seasons, hence avoiding a subjective choice of regimes. Then, results from the wind forecasts are incorporated into a power system economic dispatch model, the cost of which is used as a loss measure of the quality of the forecast models. This, in turn, leads to cost-effective scheduling of system-wide wind generation. Potential economic benefits arise from system-wide generation cost savings and from ancillary service cost savings. We illustrate the economic benefits using a test system in the northwest region of the United States. Compared with persistence and autoregressive models, our model suggests that cost savings from integration of wind power could be on the scale of tens of millions of dollars annually in regions with high wind penetration, such as Texas and the Pacific northwest. © 2014 Sociedad de Estadística e Investigación Operativa.

  6. Interval forecasting of cyber-attacks on industrial control systems

    Science.gov (United States)

    Ivanyo, Y. M.; Krakovsky, Y. M.; Luzgin, A. N.

    2018-03-01

    At present, cyber-security issues of industrial control systems occupy one of the key niches in a state system of planning and management. Functional disruption of these systems via cyber-attacks may lead to emergencies related to loss of life, environmental disasters, major financial and economic damage, or disrupted activities of cities and settlements. There is therefore an urgent need to develop protection methods against cyber-attacks. This paper studied the results of cyber-attack interval forecasting with a pre-set intensity level of cyber-attacks. Interval forecasting predicts which of two predetermined intervals will contain a future value of the indicator; probability estimates of these events were used for this purpose. For interval forecasting, a probabilistic neural network with a dynamically updated smoothing parameter was used. The dividing bound of these intervals was determined by a calculation method based on statistical characteristics of the indicator. The number of cyber-attacks per hour received through a honeypot from March to September 2013 for the group ‘zeppo-norcal’ was selected as the indicator.
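The interval-forecasting idea reduces to estimating the probability that the next value falls above or below a dividing bound derived from the indicator's statistics. A deliberately simplified sketch follows: the paper uses a probabilistic neural network with a dynamically updated smoothing parameter, whereas here an empirical exceedance frequency over a recent window stands in for the probability estimator, and the mean-plus-one-standard-deviation bound is an assumption.

```python
import numpy as np

def interval_forecast(history, window=24):
    """Forecast which of two intervals the next value falls in.

    The dividing bound (mean + one std of the history) and the
    window-based exceedance frequency are illustrative simplifications.
    """
    history = np.asarray(history, dtype=float)
    bound = history.mean() + history.std()            # dividing bound
    p_high = float(np.mean(history[-window:] > bound))  # P(upper interval)
    return ("above" if p_high >= 0.5 else "below"), bound, p_high

# Toy hourly cyber-attack counts: a long quiet period, then a burst.
counts = np.concatenate([np.full(100, 3.0), np.full(10, 30.0)])
label, bound, p_high = interval_forecast(counts)
```

The forecast is simply the more probable of the two intervals, together with the estimated probability, which mirrors the paper's two-interval formulation.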

  7. Coastal and Riverine Flood Forecast Model powered by ADCIRC

    Science.gov (United States)

    Khalid, A.; Ferreira, C.

    2017-12-01

    might provide better and more reliable forecasts for the flood-affected communities.

  8. Understanding Farmers’ Forecast Use from Their Beliefs, Values, Social Norms, and Perceived Obstacles

    Science.gov (United States)

    Hu, Qi; Pytlik Zillig, Lisa M.; Lynne, Gary D.; Tomkins, Alan J.; Waltman, William J.; Hayes, Michael J.; Hubbard, Kenneth G.; Artikov, Ikrom; Hoffman, Stacey J.; Wilhite, Donald A.

    2006-09-01

    Although the accuracy of weather and climate forecasts is continuously improving and new information retrieved from climate data is adding to the understanding of climate variation, use of the forecasts and climate information by farmers in farming decisions has changed little. This lack of change may result from knowledge barriers and psychological, social, and economic factors that undermine farmer motivation to use forecasts and climate information. According to the theory of planned behavior (TPB), the motivation to use forecasts may arise from personal attitudes, social norms, and perceived control or ability to use forecasts in specific decisions. These attributes are examined using data from a survey designed around the TPB and conducted among farming communities in the region of eastern Nebraska and the western U.S. Corn Belt. There were three major findings: 1) The utility and value of the forecasts for farming decisions as perceived by farmers are, on average, around 3.0 on a 0-7 scale, indicating much room to improve attitudes toward the forecast value. 2) The use of forecasts by farmers to influence decisions is likely affected by several social groups that can provide “expert viewpoints” on forecast use. 3) A major obstacle, next to forecast accuracy, is the perceived identity and reliability of the forecast makers. Given the rapidly increasing number of forecasts in this growing service business, the ambiguous identity of forecast providers may have left farmers confused and may have prevented them from developing both trust in forecasts and the skills to use them. These findings shed light on productive avenues for increasing the influence of forecasts, which may lead to greater farming productivity. In addition, this study establishes a set of reference points that can be used for comparisons with future studies to quantify changes in forecast use and influence.

  9. Comparison of ensemble post-processing approaches, based on empirical and dynamical error modelisation of rainfall-runoff model forecasts

    Science.gov (United States)

    Chardon, J.; Mathevet, T.; Le Lay, M.; Gailhard, J.

    2012-04-01

    In the context of a national energy company (EDF: Electricité de France), hydro-meteorological forecasts are necessary to ensure the safety and security of installations, meet environmental standards and improve water resources management and decision making. Hydrological ensemble forecasts allow a better representation of meteorological and hydrological forecast uncertainties and improve the human expertise of hydrological forecasts, which is essential to synthesize the available information coming from different meteorological and hydrological models and human experience. An operational hydrological ensemble forecasting chain has been developed at EDF since 2008 and has been used since 2010 on more than 30 watersheds in France. This ensemble forecasting chain is characterized by ensemble pre-processing (rainfall and temperature) and post-processing (streamflow), where a large amount of human expertise is solicited. The aim of this paper is to compare two hydrological ensemble post-processing methods developed at EDF in order to improve ensemble forecast reliability (similar to Montanari & Brath, 2004; Schaefli et al., 2007). The aim of the post-processing methods is to dress hydrological ensemble forecasts with hydrological model uncertainties, based on perfect forecasts. The first method (called the empirical approach) is based on a statistical modelling of the empirical error of perfect forecasts, using streamflow sub-samples by quantile class and lead time. The second method (called the dynamical approach) is based on streamflow sub-samples by quantile class and streamflow variation, and lead time. On a set of 20 watersheds used for operational forecasts, results show that both approaches are necessary to ensure good post-processing of the hydrological ensemble, allowing a clear improvement of the reliability, skill and sharpness of ensemble forecasts.
The comparison of the empirical and dynamical approaches shows the limits of the empirical approach, which is not able to take into account hydrological
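The empirical approach described above can be sketched as follows: past multiplicative errors of the deterministic model are binned by forecast quantile class, and a new deterministic forecast is "dressed" with the error sample of its class. The class edges, the multiplicative error model and the synthetic data are illustrative assumptions, not EDF's operational configuration:

```python
import numpy as np

def fit_error_classes(forecasts, observations, edges):
    """Group past multiplicative errors (obs / forecast) by quantile class."""
    errors = np.asarray(observations) / np.asarray(forecasts)
    cls = np.digitize(forecasts, edges)          # quantile class per forecast
    return {c: errors[cls == c] for c in np.unique(cls)}

def dress(forecast, error_classes, edges):
    """Turn a deterministic forecast into an empirical predictive sample."""
    c = np.digitize([forecast], edges)[0]
    return forecast * error_classes[c]

# Synthetic archive of past forecasts and matching observations.
rng = np.random.default_rng(1)
f = rng.uniform(10.0, 100.0, 500)                # past deterministic forecasts
o = f * rng.lognormal(0.0, 0.2, 500)             # matching observations
edges = np.quantile(f, [0.33, 0.66])             # three quantile classes
sample = dress(50.0, fit_error_classes(f, o, edges), edges)
```

The resulting sample approximates the predictive distribution for the new forecast; the dynamical approach would additionally condition the sub-samples on recent streamflow variation.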

  10. Novel methodology for pharmaceutical expenditure forecast.

    Science.gov (United States)

    Vataire, Anne-Lise; Cetinsoy, Laurent; Aballéa, Samuel; Rémuzat, Cécile; Urbinati, Duccio; Kornfeld, Åsa; Mzoughi, Olfa; Toumi, Mondher

    2014-01-01

    The value appreciation of new drugs across countries today features a disruption that renders the historical data used for forecasting pharmaceutical expenditure unreliable. Forecasting methods have rarely addressed uncertainty. The objective of this project was to propose a methodology for pharmaceutical expenditure forecasting that integrates expected policy changes and uncertainty (developed for the European Commission as the 'EU Pharmaceutical expenditure forecast'; see http://ec.europa.eu/health/healthcare/key_documents/index_en.htm). 1) Identification of all pharmaceuticals going off-patent and new branded medicinal products over a 5-year forecasting period in seven European Union (EU) Member States. 2) Development of a model to estimate direct and indirect impacts (based on health policies and clinical experts) on savings from generics and biosimilars. Inputs were originator sales value, patent expiry date, time to launch after marketing authorization, price discount, penetration rate, time to peak sales, and impact on brand price. 3) Development of a model for new drugs, which estimated sales progression in a competitive environment. Expected clinical benefits as well as commercial potential were assessed for each product by clinical experts. Inputs were development phase, marketing authorization dates, orphan condition, market size, and competitors. 4) Separate analysis of the budget impact of products going off-patent and new drugs according to several perspectives, distribution chains, and outcomes. 5) Addressing uncertainty surrounding estimations via deterministic and probabilistic sensitivity analysis.
This methodology has proven to be effective by 1) identifying the main parameters impacting the variations in pharmaceutical expenditure forecasting across countries: generics discounts and penetration, brand price after patent loss, reimbursement rate, the penetration of biosimilars and discount price, distribution chains, and the time

  11. Operational hydrological forecasting in Bavaria. Part I: Forecast uncertainty

    Science.gov (United States)

    Ehret, U.; Vogelbacher, A.; Moritz, K.; Laurent, S.; Meyer, I.; Haag, I.

    2009-04-01

    In Bavaria, operational flood forecasting has been established since the disastrous flood of 1999. Nowadays, forecasts based on rainfall information from about 700 raingauges and 600 rivergauges are calculated and issued for nearly 100 rivergauges. With the added experience of the 2002 and 2005 floods, awareness grew that the standard deterministic forecast, which neglects the uncertainty associated with each forecast, is misleading, creating a false sense of unambiguity. As a consequence, a system to identify, quantify and communicate the sources and magnitude of forecast uncertainty has been developed, which is presented in part I of this study. In this system, the use of ensemble meteorological forecasts plays a key role, which is presented in part II. In developing the system, several constraints stemming from the range of hydrological regimes and operational requirements had to be met. Firstly, operational time constraints obviate the variation of all components of the modeling chain as would be done in a full Monte Carlo simulation. Therefore, an approach was chosen where only the most relevant sources of uncertainty were dynamically considered while the others were jointly accounted for by static error distributions from offline analysis. Secondly, the dominant sources of uncertainty vary over the wide range of forecasted catchments: in alpine headwater catchments, typically a few hundred square kilometers in size, rainfall forecast uncertainty is the key factor for forecast uncertainty, with a magnitude dynamically changing with the prevailing predictability of the atmosphere. In lowland catchments encompassing several thousands of square kilometers, forecast uncertainty in the desired range (usually up to two days) is mainly dependent on upstream gauge observation quality, routing and unpredictable human impact such as reservoir operation. The determination of forecast uncertainty comprised the following steps: a) From comparison of gauge

  12. Statistical thunderstorm short time forecast for the Barranquilla airport

    International Nuclear Information System (INIS)

    Cardenas Posso, Yadira; Pabon Caicedo, Jose Daniel; Montoya Gaviria, Gerardo de Jesus

    2004-01-01

    Based on logistic regression, an approach to thunderstorm forecasting is proposed, together with a model for the airport of Barranquilla (Colombia). Through the analysis of both surface and upper-air meteorological variables, as well as thermodynamic indices representing the physical processes involved in thunderstorm generation, the relationship between these variables and the occurrence of the phenomenon is brought out; the variables and indices with the greatest influence were identified and, using them, the thunderstorm processes were summarized in a single mathematical function that determines the probability of occurrence or non-occurrence of a thunderstorm on a specific day. That function was tested as a forecast tool for the Barranquilla airport.
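The "single mathematical function" of a logistic-regression model is the logistic transform of a linear combination of predictors. A toy sketch follows; the coefficients, intercept and index values are invented for illustration and are not the fitted Barranquilla model:

```python
import numpy as np

def thunderstorm_probability(indices, coefs, intercept):
    """Logistic-regression probability of a thunderstorm day."""
    z = intercept + float(np.dot(coefs, indices))
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical predictors: [K index, lifted index, precipitable water (mm)],
# with invented coefficients for illustration only.
p = thunderstorm_probability(np.array([35.0, -4.0, 45.0]),
                             np.array([0.08, -0.30, 0.02]),
                             intercept=-4.0)
```

A day is then classified as a thunderstorm day when the probability exceeds a chosen threshold (commonly 0.5).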

  13. Short-term wind power combined forecasting based on error forecast correction

    International Nuclear Information System (INIS)

    Liang, Zhengtang; Liang, Jun; Wang, Chengfu; Dong, Xiaoming; Miao, Xiaofeng

    2016-01-01

    Highlights: • The correlation relationships of short-term wind power forecast errors are studied. • The correlation analysis method of the multi-step forecast errors is proposed. • A strategy for selecting the input variables for the error forecast models is proposed. • Several novel combined models based on error forecast correction are proposed. • The combined models have improved the short-term wind power forecasting accuracy. - Abstract: With the increasing contribution of wind power to electric power grids, accurate forecasting of short-term wind power has become particularly valuable for wind farm operators, utility operators and customers. The aim of this study is to investigate the interdependence structure of errors in short-term wind power forecasting, which is crucial for building error forecast models with regression learning algorithms to correct predictions and improve final forecasting accuracy. In this paper, several novel short-term wind power combined forecasting models based on error forecast correction are proposed in the one-step ahead, continuous and discontinuous multi-step ahead forecasting modes. First, the correlation relationships of the forecast errors of the autoregressive model, the persistence method and the support vector machine model in various forecasting modes were investigated to determine whether the error forecast models can be established by regression learning algorithms. Second, according to the results of the correlation analysis, the range of input variables is defined and an efficient strategy for selecting the input variables for the error forecast models is proposed. Finally, several combined forecasting models are proposed, in which the error forecast models are based on support vector machine/extreme learning machine, and correct the short-term wind power forecast values. The data collected from a wind farm in Hebei Province, China, are selected as a case study to demonstrate the effectiveness of the proposed
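The error-correction idea can be sketched end to end: compute the base model's past one-step errors, fit an error forecast model on them, and add the predicted error to the next base forecast. The sketch below uses persistence as the base model and a least-squares AR(2) on the errors in place of the paper's SVM/extreme-learning-machine error models:

```python
import numpy as np

def corrected_forecast(series):
    """One-step-ahead persistence forecast corrected by an AR(2) error model."""
    series = np.asarray(series, dtype=float)
    errors = series[1:] - series[:-1]            # past one-step persistence errors
    # Fit e[t] ~ a * e[t-1] + b * e[t-2] by least squares.
    X = np.column_stack([errors[1:-1], errors[:-2]])
    y = errors[2:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    e_hat = coef @ np.array([errors[-1], errors[-2]])  # predicted next error
    return series[-1] + e_hat                    # base forecast plus correction
```

For a perfectly linear series the persistence error is constant, the AR(2) reproduces it exactly, and the corrected forecast hits the next value; on real wind power data the correction only removes the predictable part of the error.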

  14. Research and Application of Hybrid Forecasting Model Based on an Optimal Feature Selection System—A Case Study on Electrical Load Forecasting

    Directory of Open Access Journals (Sweden)

    Yunxuan Dong

    2017-04-01

    Full Text Available The process of modernizing the smart grid markedly increases the complexity and uncertainty in the scheduling and operation of power systems; hence, in order to develop a more reliable, flexible, efficient and resilient grid, electrical load forecasting is not only an important key but also remains a difficult and challenging task. In this paper, a short-term electrical load forecasting model with a unit for feature learning, named the Pyramid System, and recurrent neural networks has been developed; it can effectively promote the stability and security of the power grid. Nine types of methods for feature learning are compared in this work to select the best one for the learning target, and two criteria have been employed to evaluate the accuracy of the prediction intervals. Furthermore, an electrical load forecasting method based on recurrent neural networks has been formed to obtain the relational diagram of historical data; specifically, the proposed techniques are applied to electrical load forecasting using data collected from New South Wales, Australia. The simulation results show that the proposed hybrid models not only satisfactorily approximate the actual values but are also effective tools in the planning of smart grids.

  15. Incorporating operational experience and design changes in availability forecasts

    International Nuclear Information System (INIS)

    Norman, D.

    1988-01-01

    Reliability or availability forecasts which are based solely on past operating experience will be precise if the sample is large enough, and unbiased if nothing in the future design, environment, operating region or anything else changes. Unfortunately, life is never like that. This paper considers the methodology and philosophy of modifying forecasts based on past experience to take account also of changes in design, construction methods, operating philosophy, environments, operator training and so on, between the plants which provided the operating experience and the plant for which the forecast is being made. This emphasises the importance of collecting, assessing, and learning from past data and of a thorough knowledge of future designs, and procurement, operation, and maintenance policies. The difference between targets and central estimates is also discussed. The paper concludes that improvements in future availability can be made by learning from past experience, but that certain conditions must be fulfilled in order to do so. (author)

  16. Geomagnetic storm forecasting service StormFocus: 5 years online

    Science.gov (United States)

    Podladchikova, Tatiana; Petrukovich, Anatoly; Yermolaev, Yuri

    2018-04-01

    Forecasting geomagnetic storms is highly important for many space weather applications. In this study, we review the performance of the geomagnetic storm forecasting service StormFocus during 2011-2016. The service was implemented in 2011 at SpaceWeather.Ru and predicts the expected strength of geomagnetic storms, as measured by the Dst index, several hours ahead. The forecast is based on L1 solar wind and IMF measurements and is updated every hour. The solar maximum of cycle 24 is weak, so most of the statistics cover rather moderate storms. We verify the quality of the selection criteria, as well as the reliability of the real-time input data in comparison with the final values available in archives. In real-time operation 87% of storms were correctly predicted, while the reanalysis running on final OMNI data successfully predicts 97% of storms. Thus the main reasons for prediction errors are discrepancies between real-time and final data (Dst, solar wind and IMF) due to processing errors and the specifics of the datasets.

  17. Mean-term forecast of coke production in the world

    International Nuclear Information System (INIS)

    Ukhmylova, G.S.

    1996-01-01

    The causes of the decrease in world consumption of metallurgical coke in the 1990s and at the present time are analyzed. A reduction in reliable coke supply sources to the world market is noted. Data on world coke imports and exports in 1990-1994 are presented, and corresponding forecasts for 2000 and 2005 are given

  18. Adaptive calibration of (u,v)‐wind ensemble forecasts

    DEFF Research Database (Denmark)

    Pinson, Pierre

    2012-01-01

    … of sufficient reliability. The original framework introduced here allows for an adaptive bivariate calibration of these ensemble forecasts. The originality of this methodology lies in the fact that calibrated ensembles still consist of a set of (space–time) trajectories after translation and dilation … of translation and dilation factors are discussed. Copyright © 2012 Royal Meteorological Society

  19. Fuel cycle forecasting - there are forecasts and there are forecasts

    International Nuclear Information System (INIS)

    Puechl, K.H.

    1975-01-01

    The FORECAST-NUCLEAR computer program described recognizes that forecasts are made to answer a variety of questions and, therefore, that no single forecast is universally appropriate. Also, it recognizes that no two individuals will completely agree as to the input data that are appropriate for obtaining an answer to even a single simple question. Accordingly, the program was written from a utilitarian standpoint: it allows working with multiple projections; data inputting is simple to allow game-playing; computation time is short to minimize the cost of 'what if' assessments; and detail is internally carried to allow meaningful analysis. (author)

  20. Fuel cycle forecasting - there are forecasts and there are forecasts

    Energy Technology Data Exchange (ETDEWEB)

    Puechl, K H

    1975-12-01

    The FORECAST-NUCLEAR computer program described recognizes that forecasts are made to answer a variety of questions and, therefore, that no single forecast is universally appropriate. Also, it recognizes that no two individuals will completely agree as to the input data that are appropriate for obtaining an answer to even a single simple question. Accordingly, the program was written from a utilitarian standpoint: it allows working with multiple projections; data inputting is simple to allow game-playing; computation time is short to minimize the cost of 'what if' assessments; and detail is internally carried to allow meaningful analysis.

  1. Predicting critical transitions in dynamical systems from time series using nonstationary probability density modeling.

    Science.gov (United States)

    Kwasniok, Frank

    2013-11-01

    A time series analysis method for predicting the probability density of a dynamical system is proposed. A nonstationary parametric model of the probability density is estimated from data within a maximum likelihood framework and then extrapolated to forecast the future probability density and explore the system for critical transitions or tipping points. A full systematic account of parameter uncertainty is taken. The technique is generic, independent of the underlying dynamics of the system. The method is verified on simulated data and then applied to prediction of Arctic sea-ice extent.
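As an illustrative sketch of the idea (not the paper's exact parametric model), a Gaussian forecast density whose mean drifts linearly in time can be fitted by maximum likelihood and extrapolated forward; for Gaussian noise the ML trend estimate reduces to least squares. All values below are synthetic.

```python
import numpy as np

# Synthetic declining series standing in for an observed record
rng = np.random.default_rng(1)
t = np.arange(200.0)
x = 10.0 - 0.02 * t + rng.normal(0.0, 0.5, t.size)

# For Gaussian noise, maximum likelihood for a linear mean is ordinary least squares
c1, c0 = np.polyfit(t, x, 1)
sigma = float(np.std(x - (c0 + c1 * t)))  # ML estimate of the density's spread

# Extrapolate the fitted density parameters to a future time
t_future = 400.0
mu_future = float(c0 + c1 * t_future)
print(round(mu_future, 1), round(sigma, 2))
```

A real application would also propagate parameter uncertainty into the extrapolated density, as the paper emphasizes.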

  2. Wave ensemble forecast in the Western Mediterranean Sea, application to an early warning system.

    Science.gov (United States)

    Pallares, Elena; Hernandez, Hector; Moré, Jordi; Espino, Manuel; Sairouni, Abdel

    2015-04-01

    The Western Mediterranean Sea is a highly heterogeneous and variable area, as is reflected in the wind field, the current field, and the waves, mainly in the first kilometers offshore. As a result of this variability, wave forecasting in these regions is quite complicated, usually with accuracy problems during energetic storm events. Moreover, it is in these areas that most economic activities take place, including fisheries, sailing, tourism, coastal management and offshore renewable energy platforms. In order to introduce an indicator of the probability of occurrence of the different sea states and give more detailed forecast information to the end users, an ensemble wave forecast system is considered. Ensemble prediction systems have already been used in recent decades for meteorological forecasting; to deal with the uncertainties of the initial conditions and the different parametrizations used in the models, which may introduce errors into the forecast, a set of differently perturbed meteorological simulations is considered as possible future scenarios and compared with the deterministic forecast. In the present work, the SWAN wave model (v41.01) has been implemented for the Western Mediterranean Sea, forced with wind fields produced by the deterministic Global Forecast System (GFS) and the Global Ensemble Forecast System (GEFS). The wind fields include a deterministic forecast (also named control), between 11 and 21 ensemble members, and some intelligent members obtained from the ensemble, such as the mean of all the members. Four buoys located in the study area, moored in coastal waters, have been used to validate the results. The outputs include all the time series, with a forecast horizon of 8 days, represented in spaghetti diagrams, the spread of the system and the probability at different thresholds. The main goal of this exercise is to be able to determine the degree of the uncertainty of the wave forecast, meaningful
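The "probability at different thresholds" output described above is commonly estimated as the fraction of ensemble members exceeding a given value. A minimal sketch, with hypothetical significant-wave-height members (not the actual forecast data):

```python
def exceed_prob(members, threshold):
    """Exceedance probability estimated as the fraction of ensemble
    members above the threshold."""
    return sum(m > threshold for m in members) / len(members)

# 11 hypothetical significant-wave-height members (metres)
hs = [1.2, 1.8, 2.1, 2.4, 1.6, 2.9, 2.2, 1.9, 2.5, 2.0, 1.4]
print(exceed_prob(hs, 2.0))  # fraction of members above 2 m
```

With calibrated ensembles, this raw fraction is often further post-processed, but it is the basic quantity behind threshold-probability products.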

  3. Initial phase of the development of sunspot groups and their forecast

    International Nuclear Information System (INIS)

    Berlyand, B.O.; Burov, V.A.; Stepanyan, N.N.

    1979-01-01

    Some characteristics of the initial phase of sunspot groups and their forecast have been considered. Experimental data on 340 sunspot groups were obtained in 1967-1969. It was found that oscillations of the magnetic flux in the groups indicate the possible existence of typical periods (2 and 4 days) in the magnetic field development. Most of the groups appear in young plages. The probability of proton injection from the young groups is very small. The typical time for the development of a proton centre is 10-30 days. The characteristics of a group on the first day of its existence are only vaguely connected with the lifetime of the group. On the second and third days the magnetic characteristics (the summary magnetic flux and the number of unipolar regions) have the highest correlation coefficient (approximately 70%) with the lifetime of the group. The problem of forecasting the group lifetime was solved with the pattern recognition technique. Based on observations from the second day of the group's existence, the verification of the resulting forecast exceeds that of the climatological forecast by 14%. Forecasting the Zurich class with the same technique is effective beginning with the fifth day of the group's existence, and forecasting the flare activity of the group is effective from the day of its appearance. The improvements in verification over the climatological forecasts in these problems are 10% and 8%, respectively

  4. Forecasting Housing Approvals in Australia: Do Forecasters Herd?

    DEFF Research Database (Denmark)

    Stadtmann, Georg; Pierdzioch; Rülke

    2012-01-01

    Price trends in housing markets may reflect herding of market participants. A natural question is whether such herding, to the extent that it occurred, reflects herding in forecasts of professional forecasters. Using more than 6,000 forecasts of housing approvals for Australia, we did not find...

  5. Probability based load combinations for design of category I structures

    International Nuclear Information System (INIS)

    Reich, M.; Hwang, H.

    1985-01-01

    This paper discusses a reliability analysis method and a procedure for developing the load combination design criteria for category I structures. For the safety evaluation of category I concrete structures under various static and dynamic loads, a probability-based reliability analysis method has been developed. This reliability analysis method is also used as a tool for determining the load factors for the design of category I structures. In this paper, the load combinations for the design of concrete containments, corresponding to a target limit state probability of 1.0 x 10^-6 in 4 years, are described. A comparison of containments designed using the ASME code and the proposed design criteria is also presented
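For context, a target limit-state probability of this magnitude is often restated as a reliability index, beta = -Phi^-1(P_f), where Phi is the standard normal CDF. A quick check of the standard normal relation (illustrative only, not the paper's full reliability method):

```python
from statistics import NormalDist

# Target limit-state probability over the 4-year period
p_f = 1.0e-6

# Reliability index: beta = -Phi^{-1}(P_f)
beta = -NormalDist().inv_cdf(p_f)
print(round(beta, 2))
```

A probability of 1.0 x 10^-6 corresponds to a reliability index of about 4.75, a common way such design targets are quoted.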

  6. Short-term ensemble radar rainfall forecasts for hydrological applications

    Science.gov (United States)

    Codo de Oliveira, M.; Rico-Ramirez, M. A.

    2016-12-01

    Flooding is a very common natural disaster around the world, putting local populations and economies at risk. Forecasting floods several hours ahead and issuing warnings are of major importance to permit a proper response in emergency situations. However, it is important to know the uncertainties related to the rainfall forecast in order to produce more reliable forecasts. Nowcasting models (short-term rainfall forecasts) are able to produce high spatial and temporal resolution predictions that are useful in hydrological applications. Nonetheless, they are subject to uncertainties mainly due to the nowcasting model used, errors in radar rainfall estimation, the temporal development of the velocity field, and the fact that precipitation processes such as growth and decay are not taken into account. In this study an ensemble generation scheme using rain gauge data as a reference to estimate radar errors is used to produce forecasts with up to 3 h lead time. The ensembles try to assess in a realistic way the residual uncertainties that remain even after correction algorithms are applied to the radar data. The ensembles produced are compared to a stochastic ensemble generator. Furthermore, the rainfall forecast output was used as input to a hydrodynamic sewer network model and also to a hydrological model for catchments of different sizes in northern England. A comparative analysis was carried out to assess how the radar uncertainties propagate into these models. The first named author is grateful to CAPES - Ciencia sem Fronteiras for funding this PhD research.

  7. Forecasting freight flows

    DEFF Research Database (Denmark)

    Lyk-Jensen, Stéphanie

    2011-01-01

    Trade patterns and transport markets are changing as a result of the growth and globalization of international trade, and forecasting future freight flows has to rely on trade forecasts. Forecasting freight flows is critical for matching infrastructure supply to demand and for assessing investment … constitute a valuable input to freight models for forecasting future capacity problems.

  8. Short-term Forecast of Automatic Frequency Restoration Reserve from a Renewable Energy Based Virtual Power Plant

    OpenAIRE

    Camal , Simon; Michiorri , Andrea; Kariniotakis , Georges; Liebelt , Andreas

    2017-01-01

    International audience; This paper presents the initial findings on a new forecast approach for ancillary services delivered by aggregated renewable power plants. The increasing penetration of distributed variable generators challenges grid reliability. Wind and photovoltaic power plants are technically able to provide ancillary services, but their stochastic behavior currently impedes their integration into reserve mechanisms. A methodology is developed to forecast the flexibility that a win...

  9. A spatiotemporal clustering model for the Third Uniform California Earthquake Rupture Forecast (UCERF3‐ETAS): Toward an operational earthquake forecast

    Science.gov (United States)

    Field, Edward; Milner, Kevin R.; Hardebeck, Jeanne L.; Page, Morgan T.; van der Elst, Nicholas; Jordan, Thomas H.; Michael, Andrew J.; Shaw, Bruce E.; Werner, Maximilian J.

    2017-01-01

    We, the ongoing Working Group on California Earthquake Probabilities, present a spatiotemporal clustering model for the Third Uniform California Earthquake Rupture Forecast (UCERF3), with the goal being to represent aftershocks, induced seismicity, and otherwise triggered events as a potential basis for operational earthquake forecasting (OEF). Specifically, we add an epidemic‐type aftershock sequence (ETAS) component to the previously published time‐independent and long‐term time‐dependent forecasts. This combined model, referred to as UCERF3‐ETAS, collectively represents a relaxation of segmentation assumptions, the inclusion of multifault ruptures, an elastic‐rebound model for fault‐based ruptures, and a state‐of‐the‐art spatiotemporal clustering component. It also represents an attempt to merge fault‐based forecasts with statistical seismology models, such that information on fault proximity, activity rate, and time since last event are considered in OEF. We describe several unanticipated challenges that were encountered, including a need for elastic rebound and characteristic magnitude–frequency distributions (MFDs) on faults, both of which are required to get realistic triggering behavior. UCERF3‐ETAS produces synthetic catalogs of M≥2.5 events, conditioned on any prior M≥2.5 events that are input to the model. We evaluate results with respect to both long‐term (1000 year) simulations as well as for 10‐year time periods following a variety of hypothetical scenario mainshocks. Although the results are very plausible, they are not always consistent with the simple notion that triggering probabilities should be greater if a mainshock is located near a fault. Important factors include whether the MFD near faults includes a significant characteristic earthquake component, as well as whether large triggered events can nucleate from within the rupture zone of the mainshock. Because UCERF3‐ETAS has many sources of uncertainty, as
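The ETAS component referenced above is, in its generic textbook form (not the UCERF3-specific parameterization), a conditional intensity combining a background rate with Omori-Utsu aftershock decay:

```latex
\lambda(t \mid \mathcal{H}_t) \;=\; \mu \;+\; \sum_{i:\, t_i < t} \frac{K \, e^{\alpha (m_i - M_c)}}{(t - t_i + c)^{p}}
```

Here \(\mu\) is the background seismicity rate, the sum runs over prior events of magnitude \(m_i\) at times \(t_i\), \(M_c\) is the catalog completeness magnitude, and \(K, \alpha, c, p\) are the productivity and Omori decay parameters.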

  10. Robust forecast comparison

    OpenAIRE

    Jin, Sainan; Corradi, Valentina; Swanson, Norman

    2015-01-01

    Forecast accuracy is typically measured in terms of a given loss function. However, as a consequence of the use of misspecified models in multiple model comparisons, relative forecast rankings are loss function dependent. This paper addresses this issue by using a novel criterion for forecast evaluation which is based on the entire distribution of forecast errors. We introduce the concepts of general-loss (GL) forecast superiority and convex-loss (CL) forecast superiority, and we establish a ...

  11. Flood Risk Assessment and Forecasting for the Ganges-Brahmaputra-Meghna River Basins

    Science.gov (United States)

    Hopson, T. M.; Priya, S.; Young, W.; Avasthi, A.; Clayton, T. D.; Brakenridge, G. R.; Birkett, C. M.; Riddle, E. E.; Broman, D.; Boehnert, J.; Sampson, K. M.; Kettner, A.; Singh, D.

    2017-12-01

    During the 2017 South Asia monsoon, torrential rains and catastrophic floods affected more than 45 million people, including 16 million children, across the Ganges-Brahmaputra-Meghna (GBM) basins. The basin is recognized as one of the world's most disaster-prone regions, with severe floods occurring almost annually causing extreme loss of life and property. In light of this vulnerability, the World Bank and collaborators have contributed toward reducing future flood impacts through recent developments to improve operational preparedness for such events, as well as efforts in more general preparedness and resilience building through planning based on detailed risk assessments. With respect to improved event-specific flood preparedness through operational warnings, we discuss a new forecasting system that provides probability-based flood forecasts developed for more than 85 GBM locations. Forecasts are available online, along with near-real-time data maps of rainfall (predicted and actual) and river levels. The new system uses multiple data sets and multiple models to enhance forecasting skill, and provides improved forecasts up to 16 days in advance of the arrival of high waters. These longer lead times provide the opportunity to save both lives and livelihoods. With sufficient advance notice, for example, farmers can harvest a threatened rice crop or move vulnerable livestock to higher ground. Importantly, the forecasts not only predict future water levels but indicate the level of confidence in each forecast. Knowing whether the probability of a danger-level flood is 10 percent or 90 percent helps people to decide what, if any, action to take. With respect to efforts in general preparedness and resilience building, we also present a recent flood risk assessment, and how it provides, for the first time, a numbers-based view of the impacts of different size floods across the Ganges basin. The findings help identify priority areas for tackling flood risks (for

  12. Forecasting deflation, intrusion and eruption at inflating volcanoes

    Science.gov (United States)

    Blake, Stephen; Cortés, Joaquín A.

    2018-01-01

    A principal goal of volcanology is to successfully forecast the start of volcanic eruptions. This paper introduces a general forecasting method, which relies on a stream of monitoring data and a statistical description of a given threshold criterion for an eruption to start. Specifically we investigate the timing of intrusive and eruptive events at inflating volcanoes. The gradual inflation of the ground surface is a well-known phenomenon at many volcanoes and is attributable to pressurised magma accumulating within a shallow chamber. Inflation usually culminates in a rapid deflation event caused by magma escaping from the chamber to produce a shallow intrusion and, in some cases, a volcanic eruption. We show that the ground elevation during 15 inflation periods at Krafla volcano, Iceland, increased with time towards a limiting value by following a decaying exponential with characteristic timescale τ. The available data for Krafla, Kilauea and Mauna Loa volcanoes show that the duration of inflation (t*) is approximately equal to τ. The distribution of t* / τ values follows a log-logistic distribution in which the central 60% of the data lie in a narrow range above 0.99, which allows the probability of a deflation event starting during a specified time interval to be estimated. The time window in which there is a specified probability of deflation starting can also be forecast, and forecasts can be updated after each new deformation measurement. The method provides stronger forecasts than one based on the distribution of repose times alone and is transferable to other types of monitoring data and/or other patterns of pre-eruptive unrest.
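The decaying-exponential inflation model can be fitted to a deformation record to estimate the timescale τ, from which the expected inflation duration t* ≈ τ follows. A sketch with synthetic data (illustrative values, not the Krafla measurements):

```python
import numpy as np
from scipy.optimize import curve_fit

def inflation(t, h_inf, dh, tau):
    """Ground elevation approaching a limiting value h_inf with timescale tau:
    h(t) = h_inf - dh * exp(-t / tau)."""
    return h_inf - dh * np.exp(-t / tau)

# Synthetic inflation record (days, metres) with measurement noise
rng = np.random.default_rng(0)
t = np.linspace(0.0, 300.0, 60)
obs = inflation(t, 0.50, 0.40, 90.0) + rng.normal(0.0, 0.005, t.size)

# Nonlinear least-squares fit of (h_inf, dh, tau)
(h_inf, dh, tau), _ = curve_fit(inflation, t, obs, p0=(0.4, 0.3, 50.0))

# Since t*/tau clusters near 1, tau itself is the forecast inflation duration
print(round(float(tau), 1))
```

Each new deformation measurement can be appended to `obs` and the fit repeated, mirroring the paper's updating forecasts.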

  13. A Simple and Effective Approach for the Prediction of Turbine Power Production From Wind Speed Forecast

    Directory of Open Access Journals (Sweden)

    Marino Marrocu

    2017-11-01

    Full Text Available An accurate forecast of the power generated by a wind turbine is of paramount importance for its optimal exploitation. Several forecasting methods have been proposed, based either on physical modeling or on a statistical approach. All of them rely on the availability of high-quality measurements of local wind speed and corresponding generated power, and on numerical weather forecasts. In this paper, a simple and effective wind power forecast technique, based on the probability distribution mapping of wind speed forecasts and observed power data, is presented and applied to two turbines located on the island of Borkum (Germany) in the North Sea. The wind speed forecast of the ECMWF model at 100 m above the ground is used as the prognostic meteorological parameter. Training procedures are based entirely on relatively short time series of power measurements. Results show that our approach has skill similar to or better than that obtained using more standard methods, as measured by mean absolute error.
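One way probability-distribution mapping can be sketched is as empirical quantile mapping: each forecast wind speed is mapped to the power value occupying the same quantile in the training power record. The function name and the training values below are hypothetical, not the Borkum data:

```python
import numpy as np

def pdm_forecast(train_speed, train_power, new_speed):
    """Map each forecast wind speed to the power value at the same
    empirical quantile in the training power distribution."""
    s = np.sort(np.asarray(train_speed))
    p = np.sort(np.asarray(train_power))
    # Empirical quantile of each new speed within the training speeds
    q = np.searchsorted(s, new_speed, side="right") / len(s)
    # Look up the power value at that quantile
    idx = np.clip((q * len(p)).astype(int), 0, len(p) - 1)
    return p[idx]

speeds = [4, 6, 8, 10, 12]            # m/s, illustrative training sample
powers = [50, 200, 600, 1200, 1500]   # kW, corresponding measured power
print(pdm_forecast(speeds, powers, np.array([5.0, 11.0])))
```

In practice the empirical CDFs would be built from the full training series and interpolated rather than indexed on a 5-point sample.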

  14. On multivariate imputation and forecasting of decadal wind speed missing data.

    Science.gov (United States)

    Wesonga, Ronald

    2015-01-01

    This paper demonstrates the application of multiple imputation by chained equations and time series forecasting to wind speed data. The study was motivated by the high prevalence of missing values in historic wind speed data. Findings based on the fully conditional specification under multiple imputation by chained equations provided reliable imputations of the missing wind speed data. Further, the forecasting model shows the smoothing parameter, alpha (0.014), close to zero, confirming that recent past observations are more suitable for use in forecasting wind speeds. The maximum decadal wind speed for Entebbe International Airport was estimated to be 17.6 metres per second at a 0.05 level of significance with a bound on the error of estimation of 10.8 metres per second. The large bound on the error of estimation confirms the dynamic tendencies of wind speed at the airport under study.
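The forecasting component with a smoothing parameter alpha is consistent with simple exponential smoothing, where the level is updated as level = alpha*y + (1-alpha)*level and the one-step-ahead forecast is the last level. A minimal sketch (illustrative series, not the Entebbe record):

```python
def ses_forecast(series, alpha):
    """Simple exponential smoothing: the one-step-ahead forecast is the
    final smoothed level."""
    level = series[0]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
    return level

wind = [5.2, 4.8, 6.1, 5.5, 5.9]  # m/s, hypothetical observations
print(round(ses_forecast(wind, alpha=0.014), 3))
```

With alpha this small, the level changes only slightly with each new observation, so the forecast stays close to the long-run smoothed value.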

  15. Use of MLCM3 Software for Flash Flood Modeling and Forecasting

    Directory of Open Access Journals (Sweden)

    Inna Pivovarova

    2018-01-01

    Full Text Available Accurate and timely flash flood forecasting, especially in ungauged and poorly gauged basins, is one of the most important and challenging problems to be solved by the international hydrological community. Under a changing climate and variable anthropogenic impact on river basins, as well as the low density of surface hydrometeorological networks, flash flood forecasting based on “traditional” physically based, conceptual, or statistical hydrological models often becomes inefficient. Unfortunately, most river basins in Russia are poorly gauged or ungauged; besides, a lack of hydrogeological data is quite typical. However, the developing economy and population safety necessitate issuing warnings based on reliable forecasts. For this purpose, a new hydrological model, MLCM3 (Multi-Layer Conceptual Model, 3rd generation), has been developed at the Russian State Hydrometeorological University. The model showed good results in more than 50 tested basins.

  16. Forecasting in Complex Systems

    Science.gov (United States)

    Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Turcotte, D. L.; Donnellan, A.

    2014-12-01

    Complex nonlinear systems are typically characterized by many degrees of freedom, as well as by interactions between the elements. Interesting examples can be found in the areas of earthquakes and finance. In these two systems, fat tails play an important role in the statistical dynamics. For earthquake systems, the Gutenberg-Richter magnitude-frequency relation is applicable, whereas daily returns of securities in the financial markets are known to be characterized by leptokurtotic statistics in which the tails follow a power law. Very large fluctuations are present in both systems. In earthquake systems, one has the example of great earthquakes such as the M9.1, March 11, 2011 Tohoku event. In financial systems, one has the example of the market crash of October 19, 1987. Both were largely unexpected events that severely impacted the earth and financial systems systemically. Other examples include the M9.3 Andaman earthquake of December 26, 2004, and the Great Recession, which began with the fall of the Lehman Brothers investment bank on September 15, 2008. Forecasting the occurrence of these damaging events has great societal importance. In recent years, national funding agencies in a variety of countries have emphasized the importance of societal relevance in research and, in particular, the goal of improved forecasting technology. Previous work has shown that both earthquakes and financial crashes can be described by a common Landau-Ginzburg-type free energy model. These metastable systems are characterized by fat-tail statistics near the classical spinodal. Correlations in these systems can grow and recede, but do not imply causation, a common source of misunderstanding. In both systems, a common set of techniques can be used to compute the probabilities of future earthquakes or crashes. In this talk, we describe the basic phenomenology of these systems and emphasize their similarities and differences. We also consider the problem of forecast validation and verification

  17. Ecological Forecasting in the Applied Sciences Program and Input to the Decadal Survey

    Science.gov (United States)

    Skiles, Joseph

    2015-01-01

    Ecological forecasting uses knowledge of physics, ecology and physiology to predict how ecosystems will change in the future in response to environmental factors. Further, ecological forecasting employs observations and models to predict the effects of environmental change on ecosystems. In doing so, it applies information from the physical, biological, and social sciences and promotes a scientific synthesis across the domains of physics, geology, chemistry, biology, and psychology. The goal is reliable forecasts that give decision makers access to science-based tools for projecting changes in living systems. The next decadal survey will direct the development of Earth Observation sensors and satellites for the next ten years. It is important that these new sensors and satellites address the requirements for ecosystem models, imagery, and other data for resource management. This presentation will give examples of these model inputs and some of the resources needed for NASA to continue effective Ecological Forecasting.

  18. Optimal Release Time and Sensitivity Analysis Using a New NHPP Software Reliability Model with Probability of Fault Removal Subject to Operating Environments

    Directory of Open Access Journals (Sweden)

    Kwang Yoon Song

    2018-05-01

    Full Text Available With the latest technological developments, the software industry is at the center of the fourth industrial revolution. In today’s complex and rapidly changing environment, where software applications must be developed quickly and easily, software must be focused on rapidly changing information technology. The basic goal of software engineering is to produce high-quality software at low cost. However, because of the complexity of software systems, software development can be time consuming and expensive. Software reliability models (SRMs) are used to estimate and predict the reliability, number of remaining faults, failure intensity, total and development cost, etc., of software. Additionally, it is very important to decide when, how, and at what cost to release the software to users. In this study, we propose a new nonhomogeneous Poisson process (NHPP) SRM with a fault detection rate function affected by the probability of fault removal on failure, subject to operating environments, and discuss the optimal release time and software reliability with the new NHPP SRM. The example results show a good fit of the proposed model, and we propose an optimal release time for a given change in the proposed model.
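As a generic NHPP illustration (the classic Goel-Okumoto form, not the paper's proposed model), the mean value function m(t) gives the expected number of faults detected by time t, the expected remaining faults are a - m(t), and the conditional reliability over a mission of length x is R(x|t) = exp(-(m(t+x) - m(t))). All parameter values are illustrative:

```python
import math

def m(t, a=100.0, b=0.05):
    """Goel-Okumoto NHPP mean value function: expected faults detected by t."""
    return a * (1.0 - math.exp(-b * t))

def reliability(x, t, a=100.0, b=0.05):
    """Probability of no failure in (t, t+x] after testing up to time t."""
    return math.exp(-(m(t + x, a, b) - m(t, a, b)))

t_test = 40.0  # testing time so far (illustrative units)
print(round(m(t_test), 1),              # faults found so far
      round(100.0 - m(t_test), 1),      # expected remaining faults
      round(reliability(5.0, t_test), 3))
```

Release-time analyses of the kind the paper performs trade the cost of further testing against the reliability gain implied by such a curve.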

  19. USA Nutrient management forecasting via the "Fertilizer Forecaster": linking surface runoff, nutrient application and ecohydrology.

    Science.gov (United States)

    Drohan, Patrick; Buda, Anthony; Kleinman, Peter; Miller, Douglas; Lin, Henry; Beegle, Douglas; Knight, Paul

    2017-04-01

    USA and state nutrient management planning offers strategic guidance that strives to educate farmers and those involved in nutrient management to make wise management decisions. A goal of such programs is to manage hotspots of water quality degradation that threaten human and ecosystem health, water and food security. The guidance provided by nutrient management plans does not provide the day-to-day support necessary to make operational decisions, particularly when and where to apply nutrients over the short term. These short-term decisions on when and where to apply nutrients often make the difference between whether the nutrients impact water quality or are efficiently utilized by crops. Infiltrating rainfall events occurring shortly after broadcast nutrient applications are beneficial, given that they will wash soluble nutrients into the soil where they are used by crops. Rainfall events that generate runoff shortly after nutrients are broadcast may wash off applied nutrients and produce substantial nutrient losses from that site. We are developing a model- and data-based support tool for nutrient management, the Fertilizer Forecaster, which identifies the relative probability of runoff or infiltrating events in Pennsylvania (PA) landscapes in order to improve water quality. This tool will support field-specific decisions by farmers and land managers on when and where to apply fertilizers and manures over 24-, 48- and 72-hour periods. Our objectives are to: (1) monitor agricultural hillslopes in watersheds representing four of the five Physiographic Provinces of the Chesapeake Bay basin; (2) validate a high-resolution mapping model that identifies soils prone to runoff; (3) develop an empirically based approach to relate state-of-the-art weather forecast variables to site-specific rainfall infiltration or runoff occurrence; (4) test the empirical forecasting model against alternative approaches to forecasting runoff occurrence; and (5) recruit farmers from the four

  20. Using data-driven agent-based models for forecasting emerging infectious diseases

    Directory of Open Access Journals (Sweden)

    Srinivasan Venkatramanan

    2018-03-01

    Full Text Available Producing timely, well-informed and reliable forecasts for an ongoing epidemic of an emerging infectious disease is a huge challenge. Epidemiologists and policy makers have to deal with poor data quality, limited understanding of the disease dynamics, rapidly changing social environment and the uncertainty on effects of various interventions in place. Under this setting, detailed computational models provide a comprehensive framework for integrating diverse data sources into a well-defined model of disease dynamics and social behavior, potentially leading to better understanding and actions. In this paper, we describe one such agent-based model framework developed for forecasting the 2014–2015 Ebola epidemic in Liberia, and subsequently used during the Ebola forecasting challenge. We describe the various components of the model, the calibration process and summarize the forecast performance across scenarios of the challenge. We conclude by highlighting how such a data-driven approach can be refined and adapted for future epidemics, and share the lessons learned over the course of the challenge. Keywords: Emerging infectious diseases, Agent-based models, Simulation optimization, Bayesian calibration, Ebola

  1. Fine-Tuning Nonhomogeneous Regression for Probabilistic Precipitation Forecasts: Unanimous Predictions, Heavy Tails, and Link Functions

    DEFF Research Database (Denmark)

    Gebetsberger, Manuel; Messner, Jakob W.; Mayr, Georg J.

    2017-01-01

    … to obtain automatically corrected weather forecasts. This study applies the nonhomogeneous regression framework as a state-of-the-art ensemble postprocessing technique to predict a full forecast distribution and improves its forecast performance with three statistical refinements. First of all, a novel split … functions for the optimization of regression coefficients for the scale parameter. These three refinements are tested for 10 stations in a small area of the European Alps for lead times from +24 to +144 h and accumulation periods of 24 and 6 h. Together, they improve probabilistic forecasts for precipitation amounts as well as the probability of precipitation events over the default postprocessing method. The improvements are largest for the shorter accumulation periods and shorter lead times, where the information of unanimous ensemble predictions is more important.

  2. Probabilistic eruption forecasting at short and long time scales

    Science.gov (United States)

    Marzocchi, Warner; Bebbington, Mark S.

    2012-10-01

    Any effective volcanic risk mitigation strategy requires a scientific assessment of the future evolution of a volcanic system and its eruptive behavior. Some consider the onus should be on volcanologists to provide simple but emphatic deterministic forecasts. This traditional way of thinking, however, does not deal with the implications of inherent uncertainties, both aleatoric and epistemic, that are inevitably present in observations, monitoring data, and interpretation of any natural system. In contrast to deterministic predictions, probabilistic eruption forecasting attempts to quantify these inherent uncertainties utilizing all available information to the extent that it can be relied upon and is informative. As with many other natural hazards, probabilistic eruption forecasting is becoming established as the primary scientific basis for planning rational risk mitigation actions: at short-term (hours to weeks or months), it allows decision-makers to prioritize actions in a crisis; and at long-term (years to decades), it is the basic component for land use and emergency planning. Probabilistic eruption forecasting consists of estimating the probability of an eruption event and where it sits in a complex multidimensional time-space-magnitude framework. In this review, we discuss the key developments and features of models that have been used to address the problem.

  3. Forecasting Skill

    Science.gov (United States)

    1981-01-01

    … for the third- and fourth-day precipitation forecasts. A marked improvement was shown for the consensus 24-hour precipitation forecast, and small … Zuckerberg (1980) found a small long-term skill increase in forecasts of heavy snow events for nine eastern cities. Other National Weather Service … and maximum temperature) are each awarded marks 2, 1, or 0 according to whether the forecast is correct …

  4. Recent advances in flood forecasting and flood risk assessment

    Directory of Open Access Journals (Sweden)

    G. Arduino

    2005-01-01

    Full Text Available Recent large floods in Europe have led to increased interest in research and development of flood forecasting systems. Some of these events have been provoked by some of the wettest rainfall periods on record, which has led to speculation that such extremes are attributable in some measure to anthropogenic global warming and represent the beginning of a period of higher flood frequency. Whilst current trends in extreme event statistics will be difficult to discern conclusively, there has been a substantial increase in the frequency of high floods in the 20th century for basins greater than 2×10⁵ km². There is also increasing evidence that anthropogenic forcing of climate change may lead to an increased probability of extreme precipitation and, hence, of flooding. There is, therefore, major emphasis on the improvement of operational flood forecasting systems in Europe, with significant European Community spending on research and development on prototype forecasting systems and flood risk management projects. This Special Issue synthesises the most relevant scientific and technological results presented at the International Conference on Flood Forecasting in Europe held in Rotterdam from 3-5 March 2003. During that meeting 150 scientists, forecasters and stakeholders from four continents assembled to present their work and current operational best practice and to discuss future directions of scientific and technological efforts in flood prediction and prevention. The papers presented at the conference fall into seven themes, as follows.

  5. Energy operations and planning decision support for systems using weather forecast information

    International Nuclear Information System (INIS)

    Altalo, M.G.

    2004-01-01

    Hydroelectric utilities deal with uncertainties on a regular basis. These include uncertainties in weather, policy and markets. This presentation outlined regional studies to define uncertainty, sources of uncertainty and their effect on power managers, power marketers, power insurers and end users. Solutions to minimize uncertainties include better forecasting and better business processes to mobilize action. The main causes of uncertainty in energy operations and planning include uncaptured wind, precipitation and wind events. Load model errors also contribute to uncertainty in energy operations. This presentation presented the results of a 2002-2003 study conducted by the National Oceanic and Atmospheric Administration (NOAA) on the impact of uncertainties in northeast energy weather forecasts. The study demonstrated the cost of seabreeze error on transmission and distribution. The impact of climate-scale events was also presented along with energy demand implications. It was suggested that energy planners should incorporate climate change parameters into planning, and that models should include probability distribution forecasts and ensemble forecasting methods that incorporate microclimate estimates. It was also suggested that seabreeze, lake effect, fog, afternoon thunderstorms and frontal passage should be incorporated into forecasts. tabs., figs

  6. Forecasting oil price trends using wavelets and hidden Markov models

    International Nuclear Information System (INIS)

    Souza e Silva, Edmundo G. de; Souza e Silva, Edmundo A. de; Legey, Luiz F.L.

    2010-01-01

    The crude oil price is influenced by a great number of factors, most of which interact in very complex ways. For this reason, forecasting it through a fundamentalist approach is a difficult task. An alternative is to use time series methodologies, with which the price's past behavior is conveniently analyzed, and used to predict future movements. In this paper, we investigate the usefulness of a nonlinear time series model, known as hidden Markov model (HMM), to predict future crude oil price movements. Using an HMM, we develop a forecasting methodology that consists of, basically, three steps. First, we employ wavelet analysis to remove high frequency price movements, which can be assumed as noise. Then, the HMM is used to forecast the probability distribution of the price return accumulated over the next F days. Finally, from this distribution, we infer future price trends. Our results indicate that the proposed methodology might be a useful decision support tool for agents participating in the crude oil market. (author)
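    The second step of the methodology above can be sketched with a toy discrete HMM (this is an illustrative sketch, not the authors' code; the two regimes, three return bins, and all probabilities are assumed values): the forward algorithm filters the hidden-state probabilities from observed returns, and propagating them through the transition matrix yields the forecast distribution of the next return.

```python
# Hypothetical sketch of HMM-based return forecasting (assumed parameters).

def filter_states(obs, start, trans, emit):
    """Forward algorithm: P(state_t | obs_1..t) for a discrete-emission HMM."""
    alpha = [start[s] * emit[s][obs[0]] for s in range(len(start))]
    norm = sum(alpha)
    alpha = [a / norm for a in alpha]
    for o in obs[1:]:
        alpha = [sum(alpha[r] * trans[r][s] for r in range(len(alpha))) * emit[s][o]
                 for s in range(len(alpha))]
        norm = sum(alpha)
        alpha = [a / norm for a in alpha]
    return alpha

def forecast_distribution(alpha, trans, emit, horizon=1):
    """Mixture weights over return bins 'horizon' steps ahead."""
    p = alpha[:]
    for _ in range(horizon):
        p = [sum(p[r] * trans[r][s] for r in range(len(p))) for s in range(len(p))]
    n_sym = len(emit[0])
    return [sum(p[s] * emit[s][k] for s in range(len(p))) for k in range(n_sym)]

# Two hidden regimes ("bearish", "bullish"), three return bins (down/flat/up).
trans = [[0.9, 0.1], [0.2, 0.8]]                  # assumed transition matrix
emit = [[0.6, 0.3, 0.1], [0.1, 0.3, 0.6]]         # assumed emission probabilities
alpha = filter_states([0, 0, 1], [0.5, 0.5], trans, emit)
dist = forecast_distribution(alpha, trans, emit, horizon=1)
print([round(p, 3) for p in dist])                # P(down), P(flat), P(up) tomorrow
```

    From such a distribution, the price-trend inference of the paper's third step amounts to comparing the probability mass assigned to the "down" and "up" bins.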

  7. Forecasting Natural Rubber Price In Malaysia Using Arima

    Science.gov (United States)

    Zahari, Fatin Z.; Khalid, Kamil; Roslan, Rozaini; Sufahani, Suliadi; Mohamad, Mahathir; Saifullah Rusiman, Mohd; Ali, Maselan

    2018-04-01

    Based on the title mentioned, the high volatility of the price of natural rubber nowadays poses a significant risk to the producers, traders, consumers, and other parties involved in the production of natural rubber. To help them in making decisions, forecasting is needed to predict the price of natural rubber. The main objective of the research is to forecast the upcoming price of natural rubber by using a reliable statistical method. The data are gathered from the Malaysia Rubber Board, covering January 2000 until December 2015. In this research, the average monthly price of Standard Malaysia Rubber 20 (SMR20) is forecast by using the Box-Jenkins approach. A time series plot is used to determine the pattern of the data. The data have a trend pattern, indicating that the data are non-stationary and need to be transformed. By using the Box-Jenkins method, the best-fit model for the time series data is ARIMA (1, 1, 0), which satisfies all the criteria needed. Hence, ARIMA (1, 1, 0) is the best fitted model and is used to forecast the average monthly price of Standard Malaysia Rubber 20 (SMR20) for twelve months ahead.
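    An ARIMA(1, 1, 0) model is simply an AR(1) fitted to the first-differenced series. The sketch below (illustrative only; the price series is made up and not the SMR20 data) estimates the AR coefficient by ordinary least squares without an intercept, forecasts the differences recursively, and integrates back to the price level.

```python
# Illustrative ARIMA(1,1,0) forecast sketch (hypothetical prices, not SMR20 data).

def arima_110_forecast(prices, steps):
    d = [prices[i + 1] - prices[i] for i in range(len(prices) - 1)]
    num = sum(d[i] * d[i + 1] for i in range(len(d) - 1))
    den = sum(x * x for x in d[:-1])
    phi = num / den                     # OLS estimate of the AR(1) coefficient
    level, diff, out = prices[-1], d[-1], []
    for _ in range(steps):
        diff = phi * diff               # forecast the next first difference
        level += diff                   # integrate back to the price level
        out.append(level)
    return phi, out

prices = [4.0, 4.2, 4.1, 4.4, 4.5, 4.3, 4.6, 4.8, 4.7, 5.0]
phi, fc = arima_110_forecast(prices, steps=3)
print(round(phi, 3), [round(p, 3) for p in fc])
```

    A production analysis would instead use a library such as statsmodels, which also handles intercepts, diagnostics, and confidence intervals.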

  8. Evaluating information in multiple horizon forecasts. The DOE's energy price forecasts

    International Nuclear Information System (INIS)

    Sanders, Dwight R.; Manfredo, Mark R.; Boris, Keith

    2009-01-01

    The United States Department of Energy's (DOE) quarterly price forecasts for energy commodities are examined to determine the incremental information provided at the one- through four-quarter forecast horizons. A direct test for determining information content at alternative forecast horizons, developed by Vuchelen and Gutierrez [Vuchelen, J. and Gutierrez, M.-I. 'A Direct Test of the Information Content of the OECD Growth Forecasts.' International Journal of Forecasting. 21(2005):103-117.], is used. The results suggest that the DOE's price forecasts for crude oil, gasoline, and diesel fuel do indeed provide incremental information out to three quarters ahead, while natural gas and electricity forecasts are informative out to the four-quarter horizon. In contrast, the DOE's coal price forecasts at two, three, and four quarters ahead provide no incremental information beyond that provided for the one-quarter horizon. Recommendations on how to use these results for making forecast adjustments are also provided. (author)

  9. Survey of methods used to asses human reliability in the human factors reliability benchmark exercise

    International Nuclear Information System (INIS)

    Poucet, A.

    1988-01-01

    The Joint Research Centre of the European Commission has organised a Human Factors Reliability Benchmark Exercise (HF-RBE) with the aim to assess the state-of-the-art in human reliability modelling and assessment. Fifteen teams from eleven countries, representing industry, utilities, licensing organisations and research institutes, participate in the HF-RBE, which is organised around two study cases: (1) analysis of routine functional test and maintenance procedures, with the aim to assess the probability of test-induced failures, the probability of failures to remain unrevealed, and the potential to initiate transients because of errors performed in the test; and (2) analysis of human actions during an operational transient, with the aim to assess the probability that the operators will correctly diagnose the malfunctions and take proper corrective action. The paper briefly reports how the HF-RBE was structured and gives an overview of the methods that have been used for predicting human reliability in both study cases. The experience in applying these methods is discussed and the results obtained are compared. (author)

  10. Forecasting forest chip energy production in Finland 2008-2014

    International Nuclear Information System (INIS)

    Linden, Mikael

    2011-01-01

    Energy policy measures aim to increase energy production from forest chips in Finland to 10 TWh by year 2010. However, on the regional level production differences are large, and the regional estimates of the potential base of raw materials for the production of forest chips are heterogeneous. In order to analyse the validity of the above target, two methods are proposed to derive forecasts for region-level energy production from forest chips in Finland in the years 2008-2014. The plant-level data from 2003-2007 gives a starting point for a detailed statistical analysis of present and future region-level forest chip production. Observed 2008 regional levels are above the estimated prediction 95% confidence intervals based on aggregation of plant-level time averages. A simple time trend model with fixed-region effects provides accurate forecasts for the years 2008-2014. Forest chip production forecast confidence intervals cover almost all regions for the 2008 levels and the estimates of potential production levels for 2014. The forecast confidence intervals are also derived with re-sampling methods, i.e. with bootstrap methods, to obtain more reliable results. Results confirm that a general materials shortfall is not expected in the near future for forest chip energy production in Finland.
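    The re-sampling step mentioned above can be illustrated with a percentile bootstrap (a minimal sketch, not the study's code; the plant-level production figures are hypothetical): resample the observations with replacement many times and take the empirical quantiles of the recomputed statistic as the confidence interval.

```python
# Percentile bootstrap confidence interval sketch (made-up plant-level data, GWh).
import random

def bootstrap_ci(data, stat, n_boot=2000, alpha=0.05, seed=42):
    rng = random.Random(seed)
    reps = sorted(stat([rng.choice(data) for _ in data]) for _ in range(n_boot))
    lo = reps[int(n_boot * alpha / 2)]
    hi = reps[int(n_boot * (1 - alpha / 2)) - 1]
    return lo, hi

production = [12.0, 15.5, 9.8, 20.1, 14.2, 11.7, 18.3, 13.9]  # hypothetical
mean = lambda xs: sum(xs) / len(xs)
lo, hi = bootstrap_ci(production, mean)
print(round(lo, 2), round(hi, 2))  # 95% percentile interval for the mean
```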

  11. Reliability evaluation of the ECCS of LWR No.2

    International Nuclear Information System (INIS)

    Tsujimura, Yasuhiro; Suzuki, Eiji

    1987-01-01

    In this paper, a new characteristic function of probability importance is proposed and discussed. The function represents overall characteristics of the system reliability relating to a failure probability of each system component. Further, results of evaluation brought about by the method for practical system reliability design are shown. (author)
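    The record does not give the proposed characteristic function, so the sketch below illustrates the classical Birnbaum probability importance instead, I_k = R_sys(component k working) − R_sys(component k failed), for a hypothetical system of component 0 in series with the parallel pair (1, 2); all component reliabilities are assumed values.

```python
# Birnbaum probability importance for an assumed series-parallel system
# (illustrative sketch, not the paper's proposed function).

def system_reliability(p):
    # Component 0 in series with the parallel pair (1, 2).
    return p[0] * (1.0 - (1.0 - p[1]) * (1.0 - p[2]))

def birnbaum_importance(p, k):
    up, down = list(p), list(p)
    up[k], down[k] = 1.0, 0.0
    return system_reliability(up) - system_reliability(down)

p = [0.95, 0.90, 0.80]  # assumed component reliabilities
imp = [round(birnbaum_importance(p, k), 4) for k in range(3)]
print(imp)  # the series component dominates the system reliability
```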

  12. Reliability and continuous regeneration model

    Directory of Open Access Journals (Sweden)

    Anna Pavlisková

    2006-06-01

    Full Text Available The failure-free function of an object is very important for service. This leads to an interest in determining the object's reliability and failure intensity. The reliability of an element is defined by the theory of probability. The element durability T is a continuous random variable with probability density f. The failure intensity λ(t) is a very important reliability characteristic of the element. Often it is an increasing function, which corresponds to the ageing of the element. We disposed of data about belt conveyor failures recorded during a period of 90 months. The given data set behaves according to the normal distribution. By using mathematical analysis and mathematical statistics, we found the failure intensity function λ(t). The function λ(t) increases almost linearly.
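    For a normally distributed time to failure, the failure intensity is λ(t) = f(t) / (1 − F(t)), which is increasing and close to linear well above the mean, matching the behaviour reported for the belt-conveyor data. The sketch below uses assumed parameters, not the paper's data.

```python
# Failure intensity (hazard) of a normal time-to-failure distribution
# (assumed mean/sd in months, illustrative only).
import math

def normal_pdf(t, mu, sigma):
    return math.exp(-0.5 * ((t - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def normal_cdf(t, mu, sigma):
    return 0.5 * (1.0 + math.erf((t - mu) / (sigma * math.sqrt(2))))

def failure_intensity(t, mu=45.0, sigma=15.0):
    return normal_pdf(t, mu, sigma) / (1.0 - normal_cdf(t, mu, sigma))

rates = [round(failure_intensity(t), 4) for t in (30, 45, 60, 75, 90)]
print(rates)  # monotonically increasing with age
```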

  13. The Next Level in Automated Solar Flare Forecasting: the EU FLARECAST Project

    Science.gov (United States)

    Georgoulis, M. K.; Bloomfield, D.; Piana, M.; Massone, A. M.; Gallagher, P.; Vilmer, N.; Pariat, E.; Buchlin, E.; Baudin, F.; Csillaghy, A.; Soldati, M.; Sathiapal, H.; Jackson, D.; Alingery, P.; Argoudelis, V.; Benvenuto, F.; Campi, C.; Florios, K.; Gontikakis, C.; Guennou, C.; Guerra, J. A.; Kontogiannis, I.; Latorre, V.; Murray, S.; Park, S. H.; Perasso, A.; Sciacchitano, F.; von Stachelski, S.; Torbica, A.; Vischi, D.

    2017-12-01

    We attempt an informative description of the Flare Likelihood And Region Eruption Forecasting (FLARECAST) project, the European Commission's first large-scale investment to explore the limits of reliability and accuracy achieved for the forecasting of major solar flares. We outline the consortium, top-level objectives and first results of the project, highlighting the diversity and fusion of expertise needed to deliver what was promised. The project's final product, featuring an openly accessible, fully modular and free-to-download flare forecasting facility, will be delivered in early 2018. The project's three objectives, namely science, research-to-operations (R2O) and dissemination / communication, are also discussed: in terms of science, we encapsulate our close-to-final assessment of how close (or far) we are from practically exploitable solar flare forecasting. In terms of R2O, we briefly describe the architecture of the FLARECAST infrastructure, which includes rigorous validation for each forecasting step. Of the three communication levers of the project, we finally focus on lessons learned from the two-way interaction with the community of stakeholders and governmental organizations. The FLARECAST project has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No. 640216.

  14. Visualizing Uncertainty for Probabilistic Weather Forecasting based on Reforecast Analogs

    Science.gov (United States)

    Pelorosso, Leandro; Diehl, Alexandra; Matković, Krešimir; Delrieux, Claudio; Ruiz, Juan; Gröeller, M. Eduard; Bruckner, Stefan

    2016-04-01

    Numerical weather forecasts are prone to uncertainty coming from inaccuracies in the initial and boundary conditions and lack of precision in numerical models. Ensembles of forecasts partially address these problems by considering several runs of the numerical model. Each forecast is generated with different initial and boundary conditions and different model configurations [GR05]. The ensembles can be expressed as probabilistic forecasts, which have proven to be very effective in decision-making processes [DE06]. The ensemble of forecasts represents only some of the possible future atmospheric states, usually underestimating the degree of uncertainty in the predictions [KAL03, PH06]. Hamill and Whitaker [HW06] introduced the "Reforecast Analog Regression" (RAR) technique to overcome the limitations of ensemble forecasting. This technique produces probabilistic predictions based on the analysis of historical forecasts and observations. Visual analytics provides tools for processing, visualizing, and exploring data to get new insights and discover hidden information patterns in an interactive exchange between the user and the application [KMS08]. In this work, we introduce Albero, a visual analytics solution for probabilistic weather forecasting based on the RAR technique. Albero targets at least two different types of users: "forecasters", who are meteorologists working in operational weather forecasting, and "researchers", who work on the construction of numerical prediction models. Albero is an efficient tool for analyzing precipitation forecasts, allowing forecasters to make and communicate quick decisions. Our solution facilitates the analysis of a set of probabilistic forecasts, associated statistical data, observations and uncertainty. A dashboard with small-multiples of probabilistic forecasts allows the forecasters to analyze at a glance the distribution of probabilities as a function of time, space, and magnitude. It provides the user with a more

  15. Lifetime and economic analyses of lithium-ion batteries for balancing wind power forecast error

    DEFF Research Database (Denmark)

    Swierczynski, Maciej Jozef; Stroe, Daniel Ioan; Stroe, Ana-Irina

    2015-01-01

    … is considered. In this paper, the economic feasibility of lithium-ion batteries for balancing the wind power forecast error is analysed. In order to perform a reliable assessment, an ageing model of the lithium-ion battery was developed considering both cycling and calendar life. The economic analysis considers two … It was found that, for total elimination of the wind power forecast error, a 25-MWh Li-ion battery energy storage system is required for the considered 2 MW WT.

  16. Fundamentals of reliability engineering applications in multistage interconnection networks

    CERN Document Server

    Gunawan, Indra

    2014-01-01

    This book presents the fundamentals of reliability engineering with applications in evaluating the reliability of multistage interconnection networks. The first part of the book introduces the concept of reliability engineering, elements of probability theory, probability distributions, availability and data analysis. The second part of the book provides an overview of parallel/distributed computing, network design considerations, and more. The book covers comprehensive reliability engineering methods and their practical aspects in interconnection network systems. Students, engineers, researchers and managers will find this book a valuable reference source.

  17. Sharing wind power forecasts in electricity markets: A numerical analysis

    International Nuclear Information System (INIS)

    Exizidis, Lazaros; Kazempour, S. Jalal; Pinson, Pierre; Greve, Zacharie de; Vallée, François

    2016-01-01

    Highlights: • Information sharing among different agents can be beneficial for electricity markets. • System cost decreases by sharing wind power forecasts between different agents. • Market power of wind producer may increase by sharing forecasts with market operator. • Extensive out-of-sample analysis is employed to draw reliable conclusions. - Abstract: In an electricity pool with a significant share of wind power, all generators including conventional and wind power units are generally scheduled in a day-ahead market based on wind power forecasts. Then, a real-time market is cleared given the updated wind power forecast and fixed day-ahead decisions to adjust power imbalances. This sequential market-clearing process may face serious operational challenges such as severe power shortage in real-time due to erroneous wind power forecasts in the day-ahead market. To overcome such situations, several solutions can be considered, such as adding flexible resources to the system. In this paper, we address another potential solution based on information sharing, in which market players share their own wind power forecasts with others in the day-ahead market. This solution may improve the functioning of the sequential market-clearing process through making more informed day-ahead schedules, which reduces the need for balancing resources in real-time operation. This paper numerically evaluates the potential value of sharing forecasts for the whole system in terms of system cost reduction. Besides, its impact on each market player’s profit is analyzed. The framework of this study is based on a stochastic two-stage market setup and complementarity modeling, which allows us to gain further insights into information sharing impacts.

  18. The Reliability and Stability of an Inferred Phylogenetic Tree from Empirical Data.

    Science.gov (United States)

    Katsura, Yukako; Stanley, Craig E; Kumar, Sudhir; Nei, Masatoshi

    2017-03-01

    The reliability of a phylogenetic tree obtained from empirical data is usually measured by the bootstrap probability (Pb) of interior branches of the tree. If the bootstrap probability is high for most branches, the tree is considered to be reliable. If some interior branches show relatively low bootstrap probabilities, we are not sure that the inferred tree is really reliable. Here, we propose another quantity measuring the reliability of the tree, called the stability of a subtree. This quantity, Ps, is the probability of obtaining a given subtree of the inferred tree. We then show that if the tree is to be reliable, both Pb and Ps must be high. We also show that Ps is given by the bootstrap probability of the subtree with the closest outgroup sequence, and the computer program RESTA for computing the Pb and Ps values is presented. © The Author 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.

  19. Validation of the CME Geomagnetic forecast alerts under COMESEP alert system

    Science.gov (United States)

    Dumbovic, Mateja; Srivastava, Nandita; Khodia, Yamini; Vršnak, Bojan; Devos, Andy; Rodriguez, Luciano

    2017-04-01

    An automated space weather alert system has been developed under the EU FP7 project COMESEP (COronal Mass Ejections and Solar Energetic Particles: http://comesep.aeronomy.be) to forecast solar energetic particle (SEP) and coronal mass ejection (CME) risk levels at Earth. The COMESEP alert system uses the automated detection tool CACTus to detect potentially threatening CMEs, the drag-based model (DBM) to predict their arrival and the CME geo-effectiveness tool (CGFT) to predict their geomagnetic impact. Whenever CACTus detects a halo or partial halo CME and issues an alert, DBM calculates its arrival time at Earth and CGFT calculates its geomagnetic risk level. The geomagnetic risk level is calculated based on an estimation of the CME arrival probability and its likely geo-effectiveness, as well as an estimate of the geomagnetic-storm duration. We present the evaluation of the CME risk level forecast with the COMESEP alert system based on a study of geo-effective CMEs observed during 2014. The validation of the forecast tool is done by comparing the forecasts with observations. In addition, we test the success rate of the automatic forecasts (without human intervention) against the forecasts with human intervention using advanced versions of DBM and CGFT (self-standing tools available at the Hvar Observatory website: http://oh.geof.unizg.hr). The results indicate that the success rate of the forecast is higher with human intervention and using more advanced tools. This work has received funding from the European Commission FP7 Project COMESEP (263252). We acknowledge the support of Croatian Science Foundation under the project 6212 „Solar and Stellar Variability".
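    The drag-based model mentioned above treats the CME as decelerating (or accelerating) toward the solar-wind speed through an aerodynamic-drag term. The sketch below is not the COMESEP implementation: it numerically integrates dv/dt = −γ(v − w)|v − w| from an assumed starting distance to 1 AU, with made-up values for the drag parameter γ, the initial CME speed and the solar-wind speed w.

```python
# Minimal numerical sketch of the drag-based model (DBM) idea; all inputs are
# assumed values, not COMESEP operational parameters.
AU = 1.496e8          # km
R_SUN = 6.957e5       # km

def dbm_arrival(v0, w, gamma, r0=20 * R_SUN, dt=60.0):
    """Euler integration; v0, w in km/s, gamma in 1/km, dt in seconds."""
    r, v, t = r0, v0, 0.0
    while r < AU:
        v += -gamma * (v - w) * abs(v - w) * dt   # drag deceleration
        r += v * dt
        t += dt
    return t / 3600.0, v   # travel time in hours, arrival speed in km/s

hours, v_arr = dbm_arrival(v0=1000.0, w=400.0, gamma=0.2e-7)
print(round(hours, 1), round(v_arr, 1))  # roughly a 2-3 day Sun-Earth transit
```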

  20. Continuous hydrological modelling in the context of real time flood forecasting in alpine Danube tributary catchments

    International Nuclear Information System (INIS)

    Stanzel, Ph; Kahl, B; Haberl, U; Herrnegger, M; Nachtnebel, H P

    2008-01-01

    A hydrological modelling framework applied within operational flood forecasting systems in three alpine Danube tributary basins, Traisen, Salzach and Enns, is presented. A continuous, semi-distributed rainfall-runoff model, accounting for the main hydrological processes of snow accumulation and melt, interception, evapotranspiration, infiltration, runoff generation and routing, is set up. Spatial discretization relies on the division of watersheds into subbasins and subsequently into hydrologic response units based on spatial information on soil types, land cover and elevation bands. The hydrological models are calibrated with meteorological ground measurements and with meteorological analyses incorporating radar information. Operationally, each forecasting sequence starts with the re-calculation of the last 24 to 48 hours. Errors between simulated and observed runoff are minimized by optimizing a correction factor for the input to provide improved system states. For the hydrological forecast, quantitative 48- or 72-hour forecast grids of temperature and precipitation - deterministic and probabilistic - are used as input. The forecasted hydrograph is corrected with an autoregressive model. The forecasting sequences are repeated every 15 minutes. First evaluations of the resulting hydrological forecasts are presented and the reliability of forecasts with different lead times is discussed.
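    The autoregressive output correction mentioned above can be sketched as follows (a minimal illustration, not the operational code; the discharge values, last error and decay coefficient are assumed): the most recent simulation error is propagated into the forecast with a decay factor, so the correction fades with lead time.

```python
# AR(1)-style error correction of a forecasted hydrograph (hypothetical values).

def ar1_corrected_forecast(forecast, last_error, rho=0.9):
    """forecast: raw model discharges; last_error = observed - simulated now."""
    out, err = [], last_error
    for q in forecast:
        err *= rho                   # error influence decays with lead time
        out.append(q + err)
    return out

raw = [52.0, 60.0, 75.0, 90.0]       # hypothetical discharge forecast, m3/s
corrected = ar1_corrected_forecast(raw, last_error=5.0)
print([round(q, 2) for q in corrected])  # correction fades as lead time grows
```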

  1. National Forecast Charts

    Science.gov (United States)

    National forecast charts from the NWS Weather Prediction Center (WPC).

  2. A Probabilistic Short-Term Water Demand Forecasting Model Based on the Markov Chain

    Directory of Open Access Journals (Sweden)

    Francesca Gagliardi

    2017-07-01

    Full Text Available This paper proposes a short-term water demand forecasting method based on the use of the Markov chain. This method provides estimates of future demands by calculating probabilities that the future demand value will fall within pre-assigned intervals covering the expected total variability. More specifically, two models based on homogeneous and non-homogeneous Markov chains were developed and presented. These models, together with two benchmark models (based on artificial neural network and naïve methods), were applied to three real-life case studies for the purpose of forecasting the respective water demands from 1 to 24 h ahead. The results obtained show that the model based on a homogeneous Markov chain provides more accurate short-term forecasts than the one based on a non-homogeneous Markov chain, and is in line with the artificial neural network model. Both Markov chain models enable probabilistic information regarding the stochastic demand forecast to be easily obtained.
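    The homogeneous-Markov-chain idea can be sketched in a few lines (a hedged illustration, not the paper's model; the demand-interval sequence is made up): demands are binned into intervals, a transition matrix is counted from history, and the row for the current interval gives the probability that the next demand falls in each interval.

```python
# Homogeneous Markov chain demand-forecast sketch (hypothetical demand bins).

def transition_matrix(states, n_bins):
    counts = [[0] * n_bins for _ in range(n_bins)]
    for a, b in zip(states, states[1:]):
        counts[a][b] += 1
    return [[c / sum(row) if sum(row) else 0.0 for c in row] for row in counts]

# Made-up sequence of hourly demand intervals (0 = low, 1 = medium, 2 = high).
history = [0, 0, 1, 1, 2, 1, 0, 1, 2, 2, 1, 1, 0, 1, 1, 2, 1, 0]
P = transition_matrix(history, 3)
current = history[-1]
print([round(p, 2) for p in P[current]])  # P(next interval | current interval)
```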

  3. On the reliable use of satellite-derived surface water products for global flood monitoring

    Science.gov (United States)

    Hirpa, F. A.; Revilla-Romero, B.; Thielen, J.; Salamon, P.; Brakenridge, R.; Pappenberger, F.; de Groeve, T.

    2015-12-01

    Early flood warning and real-time monitoring systems play a key role in flood risk reduction and disaster response management. To this end, real-time flood forecasting and satellite-based detection systems have been developed at global scale. However, due to the limited availability of up-to-date ground observations, the reliability of these systems for real-time applications has not been assessed in large parts of the globe. In this study, we performed comparative evaluations of the commonly used satellite-based global flood detection systems and an operational flood forecasting system using 10 major flood cases reported over three years (2012-2014). Specifically, we assessed the flood detection capabilities of the near real-time global flood maps from the Global Flood Detection System (GFDS) and from the Moderate Resolution Imaging Spectroradiometer (MODIS), and the operational forecasts from the Global Flood Awareness System (GloFAS), for the major flood events recorded in global flood databases. We present the evaluation results of the global flood detection and forecasting systems in terms of correctly indicating the reported flood events and highlight the existing limitations of each system. Finally, we propose possible ways forward to improve the reliability of large-scale flood monitoring tools.
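    Event-based verification of this kind typically reduces to contingency-table scores. The sketch below is hedged (the event IDs are invented, and this is not the study's code): it computes the probability of detection (hit rate) and the false-alarm ratio from sets of reported and detected flood events.

```python
# Contingency-table verification sketch: POD and FAR from event sets
# (hypothetical event identifiers, illustrative only).

def contingency_scores(reported, detected):
    hits = len(reported & detected)
    misses = len(reported - detected)
    false_alarms = len(detected - reported)
    pod = hits / (hits + misses) if hits + misses else 0.0
    far = false_alarms / (hits + false_alarms) if hits + false_alarms else 0.0
    return pod, far

reported = {"ev1", "ev2", "ev3", "ev4", "ev5"}   # floods in the database
detected = {"ev1", "ev2", "ev4", "ev6"}          # floods flagged by the system
pod, far = contingency_scores(reported, detected)
print(round(pod, 2), round(far, 2))  # → 0.6 0.25
```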

  4. Probabilistic forecasting of shallow, rainfall-triggered landslides using real-time numerical weather predictions

    Directory of Open Access Journals (Sweden)

    J. Schmidt

    2008-04-01

    Full Text Available A project established at the National Institute of Water and Atmospheric Research (NIWA) in New Zealand is aimed at developing a prototype of a real-time landslide forecasting system. The objective is to predict temporal changes in landslide probability for shallow, rainfall-triggered landslides, based on quantitative weather forecasts from numerical weather prediction models. Global weather forecasts from the United Kingdom Met Office (MO) Numerical Weather Prediction model (NWP) are coupled with a regional data-assimilating NWP model (New Zealand Limited Area Model, NZLAM) to forecast atmospheric variables such as precipitation and temperature up to 48 h ahead for all of New Zealand. The weather forecasts are fed into a hydrologic model to predict the development of soil moisture and groundwater levels. The forecasted catchment-scale patterns in soil moisture and soil saturation are then downscaled using topographic indices to predict soil moisture status at the local scale, and an infinite slope stability model is applied to determine the triggering soil water threshold at a local scale. The model uses uncertainty of soil parameters to produce probabilistic forecasts of spatio-temporal landslide occurrence 48 h ahead. The system was evaluated for a damaging landslide event in New Zealand. Comparison with landslide densities estimated from satellite imagery resulted in hit rates of 70–90%.
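    The infinite slope stability model can be sketched as a factor-of-safety calculation (illustrative only, with assumed slope and soil parameters, not NIWA's model): the triggering soil-water threshold is the saturation fraction m at which the factor of safety drops to 1.

```python
# Infinite-slope factor of safety as a function of saturation fraction m
# (all parameter values are assumed for illustration).
import math

def factor_of_safety(m, beta_deg=35.0, z=1.5, c=4.0,
                     phi_deg=30.0, gamma=19.0, gamma_w=9.81):
    """m: saturated fraction of soil depth; c in kPa, z in m, unit weights kN/m3."""
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    resist = c + (gamma - m * gamma_w) * z * math.cos(beta) ** 2 * math.tan(phi)
    drive = gamma * z * math.sin(beta) * math.cos(beta)
    return resist / drive

# Scan the saturation fraction to bracket the triggering threshold (FS = 1).
for m in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(m, round(factor_of_safety(m), 2))
```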

  5. Forecasting telecommunication new service demand by analogy method and combined forecast

    Directory of Open Access Journals (Sweden)

    Lin Feng-Jenq

    2005-01-01

    Full Text Available In the modeling forecast field, we are usually faced with the difficult problem of forecasting market demand for a new service or product. A new service or product is defined as one for which historical data are absent in the market, so models can hardly be used to execute the forecasting work directly. In the Taiwan telecommunication industry, after liberalization in 1996, many new services have been opened continually. For optimal investment, it is necessary that the operators, who have been granted the concessions and licenses, forecast these new services within their planning process. Though there are some methods to solve or avoid this predicament, in this paper we propose one forecasting procedure that integrates the concept of the analogy method and the idea of combined forecast to generate new service forecasts. In view of the above, the first half of this paper describes the procedure of the analogy method and the approach of combined forecast, and the second half provides the case of forecasting low-tier phone demand in Taiwan to illustrate the procedure's feasibility.

  6. Probability of crack-initiation and application to NDE

    Energy Technology Data Exchange (ETDEWEB)

    Prantl, G [Nuclear Safety Inspectorate HSK (Switzerland)]

    1988-12-31

    Fracture toughness is a property with a certain variability. When a statistical distribution is assumed, the probability of crack initiation may be calculated for a given problem defined by its geometry and the applied stress. Experiments have shown that cracks which experience a certain small amount of ductile growth can reliably be detected by acoustic emission measurements. The probability of crack detection by AE techniques may be estimated using this experimental finding and the calculated probability of crack initiation. (author).
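
    The idea can be illustrated with a small sketch: if fracture toughness follows an assumed statistical distribution (here a three-parameter Weibull, as used in master-curve-style treatments), the initiation probability is the chance that toughness falls below the applied stress intensity, and multiplying by an assumed acoustic emission detection probability gives a detection estimate. All numerical values below are illustrative, not from the abstract.

```python
import math

def p_crack_initiation(k_applied, k0=110.0, k_min=20.0, m=4.0):
    """Probability that fracture toughness K_Ic (three-parameter Weibull
    with threshold k_min, scale k0 and shape m, units MPa*sqrt(m))
    falls below the applied stress intensity k_applied."""
    if k_applied <= k_min:
        return 0.0
    return 1.0 - math.exp(-(((k_applied - k_min) / (k0 - k_min)) ** m))

# Combine with an assumed probability of detection (POD) for acoustic
# emission, exploiting the finding that initiated cracks are reliably heard:
pod_ae = 0.95  # assumed AE detection probability (hypothetical)
for k in (40.0, 80.0, 120.0):
    p_init = p_crack_initiation(k)
    print(f"K = {k:5.1f}: P(initiation) = {p_init:.3f}, "
          f"P(detection by AE) = {pod_ae * p_init:.3f}")
```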

  7. Inflow forecasting using Artificial Neural Networks for reservoir operation

    Directory of Open Access Journals (Sweden)

    C. Chiamsathit

    2016-05-01

    Full Text Available In this study, multi-layer perceptron (MLP) artificial neural networks have been applied to forecast one-month-ahead inflow for the Ubonratana reservoir, Thailand. To assess how well the forecast inflows have performed in the operation of the reservoir, simulations were carried out guided by the system's rule curves. As a basis of comparison, four inflow situations were considered: (1) inflow known and assumed to be the historic (Type A); (2) inflow known and assumed to be the forecast (Type F); (3) inflow known and assumed to be the historic mean for the month (Type M); and (4) inflow unknown, with the release decision conditioned only on the starting reservoir storage (Type N). Reservoir performance was summarised in terms of reliability, resilience, vulnerability and sustainability. It was found that the Type F inflow situation produced the best performance while Type N was the worst performing. This clearly demonstrates the importance of good inflow information for effective reservoir operation.
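
    The reliability/resilience/vulnerability indices used to summarise such simulations can be computed from a demand and release series as below. This is a generic sketch of the Hashimoto-style definitions on hypothetical monthly data, not the study's actual simulation.

```python
def performance_indices(demand, release):
    """Time-based reliability, resilience, and vulnerability of a
    reservoir simulation (Hashimoto et al.-style definitions)."""
    failures = [r < d for d, r in zip(demand, release)]
    n = len(failures)
    # reliability: fraction of periods in which demand is met
    reliability = 1 - sum(failures) / n
    # resilience: chance that a failure period is followed by a success
    recoveries = sum(1 for i in range(n - 1) if failures[i] and not failures[i + 1])
    resilience = recoveries / sum(failures) if any(failures) else 1.0
    # vulnerability: mean supply deficit over failure periods
    deficits = [d - r for d, r, f in zip(demand, release, failures) if f]
    vulnerability = sum(deficits) / len(deficits) if deficits else 0.0
    return reliability, resilience, vulnerability

demand  = [100, 100, 100, 100, 100, 100]   # hypothetical monthly demand
release = [100,  80, 100,  70,  60, 100]   # hypothetical simulated releases
rel, res, vul = performance_indices(demand, release)
print(rel, res, vul)  # → 0.5 0.666... 30.0
```

Running the same indices under the four inflow assumptions (Types A, F, M, N) is what allows the ranking reported in the abstract.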

  8. Reliability analysis of service water system under earthquake

    International Nuclear Information System (INIS)

    Yu Yu; Qian Xiaoming; Lu Xuefeng; Wang Shengfei; Niu Fenglei

    2013-01-01

    Service water system is one of the important safety systems in nuclear power plants, whose failure probability is usually obtained by system reliability analysis. The probability of equipment failure under an earthquake is a function of the peak acceleration of the earthquake motion, while the occurrence of an earthquake is random, thus the traditional fault tree method in current probability safety assessment is not powerful enough to deal with such conditional probability problems. An analysis framework for system reliability evaluation under seismic conditions was put forward in this paper, in which Monte Carlo simulation was used to deal with the conditional probability problem. The annual failure probability of the service water system was calculated, and a failure probability of 1.46×10⁻⁴ per year was obtained. The analysis result is in accordance with the data indicating the equipment's seismic resistance capability, and the rationality of the model is validated. (authors)
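
    The structure of such a Monte Carlo treatment — sample whether an earthquake occurs in a year, sample its peak ground acceleration, then sample failure from a fragility curve conditioned on that acceleration — can be sketched as follows. All distributions and parameter values are illustrative stand-ins, not the plant-specific data of the study.

```python
import math
import random

random.seed(1)

def annual_quake_pga(rate=0.1):
    """Sample peak ground acceleration (g) for one year, or None if no
    earthquake occurs. Occurrence is Poisson with the given annual rate;
    PGA is lognormal. Both are assumed, illustrative choices."""
    if random.random() > 1 - math.exp(-rate):
        return None
    return random.lognormvariate(math.log(0.08), 0.8)

def p_fail_given_pga(pga, median=0.5, beta=0.4):
    """Lognormal fragility curve: system failure probability vs. PGA."""
    z = (math.log(pga) - math.log(median)) / beta
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

# Monte Carlo over many simulated years to get the annual failure probability
n, failures = 100_000, 0
for _ in range(n):
    pga = annual_quake_pga()
    if pga is not None and random.random() < p_fail_given_pga(pga):
        failures += 1
print(f"annual failure probability ≈ {failures / n:.2e}")
```

The key point the abstract makes is captured here: the fragility is a conditional probability given PGA, and the randomness of occurrence is handled by sampling rather than by a static fault tree.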

  9. Probability of US Heat Waves Affected by a Subseasonal Planetary Wave Pattern

    Science.gov (United States)

    Teng, Haiyan; Branstator, Grant; Wang, Hailan; Meehl, Gerald A.; Washington, Warren M.

    2013-01-01

    Heat waves are thought to result from subseasonal atmospheric variability. Atmospheric phenomena driven by tropical convection, such as the Asian monsoon, have been considered potential sources of predictability on subseasonal timescales. Mid-latitude atmospheric dynamics have been considered too chaotic to allow significant prediction skill at lead times beyond the typical 10-day range of weather forecasts. Here we use a 12,000-year integration of an atmospheric general circulation model to identify a pattern of subseasonal atmospheric variability that can help improve forecast skill for heat waves in the United States. We find that heat waves tend to be preceded, by 15-20 days, by a pattern of anomalous atmospheric planetary waves with a wavenumber of 5. This circulation pattern can arise as a result of internal atmospheric dynamics and is not necessarily linked to tropical heating. We conclude that some mid-latitude circulation anomalies that increase the probability of heat waves are predictable beyond the typical weather forecast range.

  10. Impact bias or underestimation? Outcome specifications predict the direction of affective forecasting errors.

    Science.gov (United States)

    Buechel, Eva C; Zhang, Jiao; Morewedge, Carey K

    2017-05-01

    Affective forecasts are used to anticipate the hedonic impact of future events and decide which events to pursue or avoid. We propose that because affective forecasters are more sensitive to outcome specifications of events than experiencers, the outcome specification values of an event, such as its duration, magnitude, probability, and psychological distance, can be used to predict the direction of affective forecasting errors: whether affective forecasters will overestimate or underestimate its hedonic impact. When specifications are positively correlated with the hedonic impact of an event, forecasters will overestimate the extent to which high specification values will intensify and low specification values will discount its impact. When outcome specifications are negatively correlated with its hedonic impact, forecasters will overestimate the extent to which low specification values will intensify and high specification values will discount its impact. These affective forecasting errors compound additively when multiple specifications are aligned in their impact: In Experiment 1, affective forecasters underestimated the hedonic impact of winning a smaller prize that they expected to win, and they overestimated the hedonic impact of winning a larger prize that they did not expect to win. In Experiment 2, affective forecasters underestimated the hedonic impact of a short unpleasant video about a temporally distant event, and they overestimated the hedonic impact of a long unpleasant video about a temporally near event. Experiments 3A and 3B showed that differences in the affect-richness of forecasted and experienced events underlie these differences in sensitivity to outcome specifications, therefore accounting for both the impact bias and its reversal. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  11. An accurate and efficient reliability-based design optimization using the second order reliability method and improved stability transformation method

    Science.gov (United States)

    Meng, Zeng; Yang, Dixiong; Zhou, Huanlin; Yu, Bo

    2018-05-01

    The first order reliability method has been extensively adopted for reliability-based design optimization (RBDO), but it shows inaccuracy in calculating the failure probability with highly nonlinear performance functions. Thus, the second order reliability method is required to evaluate the reliability accurately. However, its application to RBDO is quite challenging owing to the expensive computational cost incurred by the repeated reliability evaluations and Hessian calculations of the probabilistic constraints. In this article, a new improved stability transformation method is proposed to search for the most probable point efficiently, and the Hessian matrix is calculated by the symmetric rank-one update. The computational capability of the proposed method is illustrated and compared to existing RBDO approaches through three mathematical and two engineering examples. The comparison results indicate that the proposed method is very efficient and accurate, providing an alternative tool for RBDO of engineering structures.
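
    The symmetric rank-one (SR1) update mentioned above builds a Hessian estimate from gradient differences, avoiding explicit second derivatives. A minimal sketch (generic quasi-Newton formula, not the article's full RBDO algorithm) with a safeguard against a vanishing denominator:

```python
import numpy as np

def sr1_update(B, s, y, eps=1e-8):
    """Symmetric rank-one update of Hessian estimate B, given step
    s = x_new - x_old and gradient change y = g_new - g_old.
    Skips the update when the denominator is numerically unsafe."""
    r = y - B @ s
    denom = r @ s
    if abs(denom) <= eps * max(1.0, np.linalg.norm(r) * np.linalg.norm(s)):
        return B  # skip: update would be ill-conditioned
    return B + np.outer(r, r) / denom

# Verify on a quadratic f(x) = 0.5 x^T A x, whose exact Hessian is A:
A = np.array([[3.0, 1.0], [1.0, 2.0]])
grad = lambda x: A @ x
B = np.eye(2)
x = np.array([1.0, 1.0])
for step in [np.array([0.5, 0.0]), np.array([0.0, 0.5]), np.array([0.3, -0.2])]:
    y = grad(x + step) - grad(x)
    B = sr1_update(B, step, y)
    x = x + step
print(np.round(B, 3))  # recovers A after two independent steps
```

On a quadratic, SR1 recovers the exact Hessian after n independent steps, which is why it is attractive for second-order reliability evaluations where Hessians are otherwise expensive.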

  12. Flood forecasting and warning systems in Pakistan

    International Nuclear Information System (INIS)

    Ali Awan, Shaukat

    2004-01-01

    Meteorologically, there are two situations which may cause three types of floods in the Indus Basin in Pakistan: (i) the meteorological situation for Category-I floods, when the seasonal low, a semi-permanent weather system situated over south-eastern Balochistan, south-western Punjab and adjoining parts of Sindh, gets intensified and causes moisture from the Arabian Sea to be brought up to the upper catchments of the Chenab and Jhelum rivers; (ii) the meteorological situation for Category-II and Category-III floods, which is linked with a monsoon low/depression. Such monsoon systems originate in the Bay of Bengal region, then move across India in a generally west/north-westerly direction and arrive over Rajasthan or any of the adjoining states of India. Flood management in Pakistan is a multi-functional process involving a number of different organizations. The first step in the process is the issuance of a flood forecast/warning, which is performed by the Pakistan Meteorological Department (PMD) utilizing satellite cloud pictures and quantitative precipitation measurement radar data, in addition to the conventional weather forecasting facilities. For quantitative flood forecasting, hydrological data are obtained through the Provincial Irrigation Departments and WAPDA. Furthermore, improved rainfall/runoff and flood routing models have been developed to provide more reliable and explicit flood information to flood-prone populations. (Author)

  13. Computational Intelligence Techniques Applied to the Day Ahead PV Output Power Forecast: PHANN, SNO and Mixed

    Directory of Open Access Journals (Sweden)

    Emanuele Ogliari

    2018-06-01

    Full Text Available An accurate forecast of the exploitable energy from Renewable Energy Sources is extremely important for the stability of the electric grid and the reliability of the bidding markets. This paper presents a comparison among different forecasting methods for photovoltaic output power, introducing a new method that mixes some peculiarities of the others: the Physical Hybrid Artificial Neural Network and the five-parameter model estimated by Social Network Optimization. In particular, day-ahead forecasts evaluated against real data measured for two years in an existing photovoltaic plant located in Milan, Italy, are compared by means of both new and the most common error indicators. Results reported in this work show the best forecasting capability of the new “mixed method”, which scored the best forecast skill and Enveloped Mean Absolute Error on a yearly basis (47% and 24.67%, respectively).

  14. Forecasting the magnitude and onset of El Niño based on climate network

    Science.gov (United States)

    Meng, Jun; Fan, Jingfang; Ashkenazy, Yosef; Bunde, Armin; Havlin, Shlomo

    2018-04-01

    El Niño is probably the most influential climate phenomenon on inter-annual time scales. It affects the global climate system, is associated with natural disasters and has serious consequences in many aspects of human life. However, forecasts of the onset and, in particular, the magnitude of El Niño are still not accurate enough, at least more than half a year ahead. Here, we introduce a new forecasting index based on climate network links representing the similarity of low-frequency temporal temperature anomaly variations between different sites in the Niño 3.4 region. We find that significant upward trends in our index forecast the onset of El Niño approximately 1 year ahead, and the highest peak in our index since the end of the last El Niño forecasts the magnitude of the following event. We study the forecasting capability of the proposed index on several datasets, including ERA-Interim, NCEP Reanalysis I, PCMDI-AMIP 1.1.3 and ERSST.v5.

  15. Short-range solar radiation forecasts over Sweden

    Directory of Open Access Journals (Sweden)

    T. Landelius

    2018-04-01

    Full Text Available In this article the performance of short-range solar radiation forecasts by the global deterministic and ensemble models from the European Centre for Medium-Range Weather Forecasts (ECMWF) is compared with an ensemble of the regional mesoscale model HARMONIE-AROME used by the national meteorological services in Sweden, Norway and Finland. Note, however, that only the control members and the ensemble means are included in the comparison. The models' resolutions differ considerably: 18 km for the ECMWF ensemble, 9 km for the ECMWF deterministic model, and 2.5 km for the HARMONIE-AROME ensemble. The models share the same radiation code. It turns out that they all systematically underestimate the Direct Normal Irradiance (DNI) for clear-sky conditions. Except for this shortcoming, the HARMONIE-AROME ensemble model shows the best agreement with the distribution of observed Global Horizontal Irradiance (GHI) and DNI values. During mid-day the HARMONIE-AROME ensemble mean performs best. The control member of the HARMONIE-AROME ensemble also scores better than the global deterministic ECMWF model. This is an interesting result, since mesoscale models have so far not shown good results when compared to the ECMWF models. Three days with clear, mixed and cloudy skies are used to illustrate the possible added value of a probabilistic forecast. It is shown that in these cases the mesoscale ensemble could provide decision support to a grid operator in terms of forecasts of both the amount of solar power and its probabilities.

  16. Day-Ahead Wind Power Forecasting Using a Two-Stage Hybrid Modeling Approach Based on SCADA and Meteorological Information, and Evaluating the Impact of Input-Data Dependency on Forecasting Accuracy

    Directory of Open Access Journals (Sweden)

    Dehua Zheng

    2017-12-01

    Full Text Available The power generated by wind generators is usually associated with uncertainties, due to the intermittency of wind speed and other weather variables. This creates a big challenge for transmission system operators (TSOs and distribution system operators (DSOs in terms of connecting, controlling and managing power networks with high-penetration wind energy. Hence, in these power networks, accurate wind power forecasts are essential for their reliable and efficient operation. They support TSOs and DSOs in enhancing the control and management of the power network. In this paper, a novel two-stage hybrid approach based on the combination of the Hilbert-Huang transform (HHT, genetic algorithm (GA and artificial neural network (ANN is proposed for day-ahead wind power forecasting. The approach is composed of two stages. The first stage utilizes numerical weather prediction (NWP meteorological information to predict wind speed at the exact site of the wind farm. The second stage maps actual wind speed vs. power characteristics recorded by SCADA. Then, the wind speed forecast in the first stage for the future day is fed to the second stage to predict the future day’s wind power. Comparative selection of input-data parameter sets for the forecasting model and impact analysis of input-data dependency on forecasting accuracy have also been studied. The proposed approach achieves significant forecasting accuracy improvement compared with three other artificial intelligence-based forecasting approaches and a benchmark model using the smart persistence method.

  17. Recommendations for the tuning of rare event probability estimators

    International Nuclear Information System (INIS)

    Balesdent, Mathieu; Morio, Jérôme; Marzat, Julien

    2015-01-01

    Being able to accurately estimate rare event probabilities is a challenging issue for improving the reliability of complex systems. Several powerful methods such as importance sampling, importance splitting or extreme value theory have been proposed in order to reduce the computational cost and to improve the accuracy of extreme probability estimation. However, the performance of these methods is highly correlated with the choice of tuning parameters, which are very difficult to determine. In order to highlight recommended tunings for such methods, an empirical campaign of automatic tuning on a set of representative test cases is conducted for splitting methods. This provides a reduced set of tuning parameters that may lead to reliable estimation of rare event probabilities for various problems. The relevance of the obtained results is assessed on a series of real-world aerospace problems
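
    The sensitivity to tuning that the abstract describes is easy to see in the simplest of these methods, importance sampling. The sketch below estimates P(X > 4) for a standard normal by sampling from a shifted proposal and reweighting; the proposal mean is exactly the kind of tuning parameter whose choice makes or breaks the estimator. This is a textbook illustration, not the authors' benchmark.

```python
import math
import random

random.seed(0)

def rare_prob_importance_sampling(threshold=4.0, n=100_000):
    """Estimate P(X > threshold) for X ~ N(0,1) by sampling from a
    shifted proposal N(threshold, 1) and reweighting each sample by
    the likelihood ratio phi(x) / phi(x - threshold)."""
    total = 0.0
    for _ in range(n):
        x = random.gauss(threshold, 1.0)
        if x > threshold:
            # likelihood ratio simplifies to exp(-t*x + t^2/2)
            total += math.exp(-threshold * x + threshold ** 2 / 2)
    return total / n

est = rare_prob_importance_sampling()
exact = 0.5 * math.erfc(4.0 / math.sqrt(2))  # ≈ 3.17e-5
print(f"IS estimate: {est:.2e}, exact: {exact:.2e}")
```

A crude Monte Carlo estimate with the same sample size would see roughly three exceedances in total; shifting the proposal to the threshold concentrates samples where they matter, but shifting it too far (or not far enough) inflates the weight variance — hence the need for recommended tunings.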

  18. Real-time drought forecasting system for irrigation management

    Science.gov (United States)

    Ceppi, Alessandro; Ravazzani, Giovanni; Corbari, Chiara; Masseroni, Daniele; Meucci, Stefania; Pala, Francesca; Salerno, Raffaele; Meazza, Giuseppe; Chiesa, Marco; Mancini, Marco

    2013-04-01

    In recent years frequent periods of water scarcity have enhanced the need to use water more carefully, even in European areas traditionally rich in water such as the Po Valley. In dry periods, the problem of water shortage can be aggravated by conflicting uses of water, such as irrigation, industrial use and power production (hydroelectric and thermoelectric). Further, over the last decade social attention to this issue has increased due to the climate change and global warming scenarios presented in the latest IPCC Report. The increased frequency of dry periods has stimulated improvements in irrigation and water management. In this study we show the development and implementation of the real-time drought forecasting system Pre.G.I., an Italian acronym that stands for "Hydro-Meteorological forecast for irrigation management". The system is based on ensemble predictions at long range (30 days) with hydrological simulation of the water balance to forecast the soil water content in every parcel of the Consorzio Muzza basin. The studied area covers 74,000 ha in the middle of the Po Valley, near the city of Lodi. The hydrological ensemble forecasts are based on 20 meteorological members of the non-hydrostatic WRF model with a 30-day lead time, provided by the Epson Meteo Centre, while the hydrological model used to generate the soil moisture and water table simulations is the rainfall-runoff distributed FEST-WB model, developed at Politecnico di Milano. The hydrological model was validated against measurements of latent heat flux and soil moisture acquired by an eddy-covariance station. The reliability of the forecasting system and its benefits were assessed on several case studies from recent years.

  19. Elapsed decision time affects the weighting of prior probability in a perceptual decision task

    Science.gov (United States)

    Hanks, Timothy D.; Mazurek, Mark E.; Kiani, Roozbeh; Hopp, Elizabeth; Shadlen, Michael N.

    2012-01-01

    Decisions are often based on a combination of new evidence with prior knowledge of the probable best choice. Optimal combination requires knowledge about the reliability of evidence, but in many realistic situations, this is unknown. Here we propose and test a novel theory: the brain exploits elapsed time during decision formation to combine sensory evidence with prior probability. Elapsed time is useful because (i) decisions that linger tend to arise from less reliable evidence, and (ii) the expected accuracy at a given decision time depends on the reliability of the evidence gathered up to that point. These regularities allow the brain to combine prior information with sensory evidence by weighting the latter in accordance with reliability. To test this theory, we manipulated the prior probability of the rewarded choice while subjects performed a reaction-time discrimination of motion direction using a range of stimulus reliabilities that varied from trial to trial. The theory explains the effect of prior probability on choice and reaction time over a wide range of stimulus strengths. We found that prior probability was incorporated into the decision process as a dynamic bias signal that increases as a function of decision time. This bias signal depends on the speed-accuracy setting of human subjects, and it is reflected in the firing rates of neurons in the lateral intraparietal cortex (LIP) of rhesus monkeys performing this task. PMID:21525274

  20. Reliability and safety analyses under fuzziness

    International Nuclear Information System (INIS)

    Onisawa, T.; Kacprzyk, J.

    1995-01-01

    Fuzzy theory, for example possibility theory, is compatible with probability theory. What has been shown so far is that probability theory need not be replaced by fuzzy theory, but rather that the former works much better in applications if it is combined with the latter. In fact, it is said that there are two essential uncertainties in the field of reliability and safety analyses: one is probabilistic uncertainty, which is more relevant for mechanical systems and the natural environment, and the other is fuzziness (imprecision) caused by the existence of human beings in systems. The classical probability theory alone is therefore not sufficient to deal with uncertainties in humanistic systems. In such a context this collection of works marks a milestone in the debate between probability theory and fuzzy theory. This volume covers fault analysis, lifetime analysis, reliability, quality control, safety analysis and risk analysis. (orig./DG). 106 figs

  1. Problems involved in calculating the probability of rare occurrences

    International Nuclear Information System (INIS)

    Tittes, E.

    1986-01-01

    Also with regard to characteristics such as occurrence probability or occurrence rate, there are limits which have to be observed, or else probability data, and thus the concept of determinable risk itself, will lose their practical value. The mathematical models applied for probability assessment are based on data supplied by the insurance companies, by reliability experts in the automobile industry, or by planning experts in the field of traffic or information supply. (DG) [de

  2. Hailstorm forecast from stability indexes in Southwestern France

    Science.gov (United States)

    Melcón, Pablo; Merino, Andrés; Sánchez, José Luis; Dessens, Jean; Gascón, Estíbaliz; Berthet, Claude; López, Laura; García-Ortega, Eduardo

    2016-04-01

    Forecasting hailstorms is a difficult task because of their small spatial and temporal scales. Over recent decades, stability indexes have been commonly used in operational forecasting to provide a simplified representation of different thermodynamic characteristics of the atmosphere regarding the onset of convective events. However, they are estimated from vertical profiles obtained by radiosondes, which are usually available only twice a day and have limited spatial representativeness. Numerical model predictions can be used to overcome these drawbacks, providing vertical profiles with higher spatio-temporal resolution. The main objective of this study is to create a tool for hail prediction in the southwest of France, one of the European regions where hailstorms have a higher incidence. The Association Nationale d'Etude et de Lutte contre les Fléaux Atmosphériques (ANELFA) maintains a dense hailpad network there in continuous operation, which has created an extensive database of hail events, used in this study as ground truth. The new technique aims to classify the spatial distribution of different stability indexes on hail days. These indexes were calculated from vertical profiles at 1200 UTC provided by the WRF numerical model, validated with radiosonde data from Bordeaux. Binary logistic regression is used to select those indexes that best represent the thermodynamic conditions related to the occurrence of hail in the zone. They are then combined in a single algorithm that surpasses the predictive power they have when used independently. Regression equation results on hail days are used in a cluster analysis to identify different spatial patterns given by the probability algorithm. This new tool can be used in operational forecasting, in combination with synoptic and mesoscale techniques, to properly define hail probability and distribution. Acknowledgements The authors would like to thank the CEPA González Díez Foundation and the University of Leon for its
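
    The combination step — binary logistic regression turning several stability indexes into a single hail probability — can be sketched on synthetic data as below. The indexes, coefficients and labels here are invented for illustration; they are not the ANELFA dataset or the study's fitted model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training set: two standardized stability indexes (e.g. a
# CAPE-like and a lifted-index-like value) and hail-day labels drawn
# from an assumed "true" logistic relationship.
n = 400
X = rng.normal(size=(n, 2))
true_w, true_b = np.array([1.5, -1.0]), -0.5
p_true = 1 / (1 + np.exp(-(X @ true_w + true_b)))
y = (rng.random(n) < p_true).astype(float)

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Binary logistic regression by gradient descent: returns weights
    and bias combining the indexes into one hail-probability score."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(steps):
        q = 1 / (1 + np.exp(-(X @ w + b)))
        w -= lr * (X.T @ (q - y)) / len(y)
        b -= lr * np.mean(q - y)
    return w, b

w, b = fit_logistic(X, y)
hail_prob = lambda x: 1 / (1 + np.exp(-(x @ w + b)))
print("fitted weights:", np.round(w, 2), "bias:", round(b, 2))
```

The fitted `hail_prob` plays the role of the single algorithm the abstract describes: evaluated on a grid of model-derived indexes, its output can then be clustered into spatial patterns.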

  3. Prospective testing of Coulomb short-term earthquake forecasts

    Science.gov (United States)

    Jackson, D. D.; Kagan, Y. Y.; Schorlemmer, D.; Zechar, J. D.; Wang, Q.; Wong, K.

    2009-12-01

    Earthquake-induced Coulomb stresses, whether static or dynamic, suddenly change the probability of future earthquakes. Models to estimate stress and the resulting seismicity changes could help to illuminate earthquake physics and guide appropriate precautionary response. But do these models have improved forecasting power compared to empirical statistical models? The best answer lies in prospective testing, in which a fully specified model, with no subsequent parameter adjustments, is evaluated against future earthquakes. The Collaboratory for the Study of Earthquake Predictability (CSEP) facilitates such prospective testing of earthquake forecasts, including several short-term forecasts. Formulating Coulomb stress models for formal testing involves several practical problems, mostly shared with other short-term models. First, earthquake probabilities must be calculated after each “perpetrator” earthquake but before the triggered earthquakes, or “victims”. The time interval between a perpetrator and its victims may be very short, as characterized by the Omori law for aftershocks. CSEP evaluates short-term models daily, and allows daily updates of the models. However, a lot can happen in a day. An alternative is to test and update models on the occurrence of each earthquake over a certain magnitude. To make such updates rapidly enough, and to qualify as prospective, earthquake focal mechanisms, slip distributions, stress patterns, and earthquake probabilities would have to be produced by computer without human intervention. This scheme would be more appropriate for evaluating scientific ideas, but it may be less useful for practical applications than daily updates. Second, triggered earthquakes are imperfectly recorded following larger events because their seismic waves are buried in the coda of the earlier event. To solve this problem, testing methods need to allow for “censoring” of early aftershock data, and a quantitative model for detection threshold as a function of

  4. Forecasting Global Rainfall for Points Using ECMWF's Global Ensemble and Its Applications in Flood Forecasting

    Science.gov (United States)

    Pillosu, F. M.; Hewson, T.; Mazzetti, C.

    2017-12-01

    Prediction of local extreme rainfall has historically been the remit of nowcasting and high-resolution limited-area modelling, which represent only limited areas, may not be spatially accurate, and give reasonable results only for limited lead times. Here we describe a statistical post-processing software (“ecPoint-Rainfall”, ecPR; operational in 2017) that uses ECMWF Ensemble (ENS) output to deliver global probabilistic rainfall forecasts for points up to day 10. Firstly, ecPR applies a new notion of “remote calibration”, which (1) allows us to replicate a multi-centennial training period using only one year of data, and (2) provides forecasts for anywhere in the world. Secondly, the software applies an understanding of how different rainfall generation mechanisms lead to different degrees of sub-grid variability in rainfall totals, and of where biases in the model can be improved upon. A long-term verification has shown that the post-processed rainfall has better reliability and resolution at every lead time compared with ENS, and for large totals, ecPR outputs have the same skill at day 5 that the raw ENS has at day 1 (ROC area metric). ecPR could be used as input for hydrological models if its probabilistic output is modified according to the input requirements of hydrological models. Indeed, ecPR does not provide information on where the highest total is likely to occur inside the gridbox, nor on the spatial distribution of rainfall values nearby. “Scenario forecasts” could be a solution. They are derived by locating the rainfall peak in sensitive positions (e.g. urban areas), and then redistributing the remaining quantities in the gridbox, modifying traditional spatial correlation characterization methodologies (e.g. variogram analysis) in order to take account, for instance, of the type of rainfall forecast (stratiform, convective). Such an approach could be a turning point in the field of medium-range global real-time riverine flood forecasts.

  5. Performance of the ocean state forecast system at Indian National Centre for Ocean Information Services

    Digital Repository Service at National Institute of Oceanography (India)

    Nair, T.M.B.; Sirisha, P.; Sandhya, K.G.; Srinivas, K.; SanilKumar, V.; Sabique, L.; Nherakkol, A.; KrishnaPrasad, B.; RakhiKumari; Jeyakumar, C.; Kaviyazhahu, K.; RameshKumar, M.; Harikumar, R.; Shenoi, S.S.C.; Nayak, S.

    The reliability of the operational Ocean State Forecast system at the Indian National Centre for Ocean Information Services (INCOIS) during tropical cyclones that affect the coastline of India is described in this article. The performance...

  6. Factors Reducing Efficiency of the Operational Oceanographic Forecast Systems in the Arctic Basin

    Directory of Open Access Journals (Sweden)

    V.N. Belokopytov

    2017-04-01

    Full Text Available Reliability of the forecasted fields in the Arctic Basin is limited by a number of problems resulting, first of all, from the lack of operational information. Due to the ice cover, satellite data on the sea level and the sea surface temperature are either completely unavailable or only partially accessible in summer. The number of CTD measuring systems functioning in the operational mode (3–5 probes) is not sufficient. The number of temperature-profiling buoys, whose probing depth is limited to 60 m, is not sufficient for the Arctic either. The lack of spatial resolution of the available altimetry information (14 km), as compared to the Rossby radius in the Arctic Ocean (2–12 km), requires a thorough analysis of the forecasting system's practical goals. The basic factor enhancing the reliability of the oceanographic forecast is the fact that the key oceanographic regions, namely the eastern parts of the Norwegian and Greenland seas, the Barents Sea and the Chukchi Sea including the Bering Strait (where the Atlantic and Pacific waters flow in and transform, and the halocline structure is formed), are partially or completely free of ice and significantly better provided with operational information.

  7. A dynamic system to forecast ionospheric storm disturbances based on solar wind conditions

    Directory of Open Access Journals (Sweden)

    L. R. Cander

    2005-06-01

Full Text Available For the reliable performance of technologically advanced radio communication systems under geomagnetically disturbed conditions, the forecasting and modelling of the ionospheric response during storms is a high priority. The ionospheric storm forecasting models currently in operation have shown a high degree of reliability during quiet conditions, but they have proved inadequate during storm events. To improve their prediction accuracy, we have to take advantage of the deeper understanding of ionospheric storm dynamics that is currently available, indicating a correlation between Interplanetary Magnetic Field (IMF) disturbances and the qualitative signature of ionospheric storm disturbances at middle-latitude stations. In this paper we analyse observations of the foF2 critical frequency parameter from one mid-latitude European ionospheric station (Chilton) in conjunction with observations of IMF parameters (total magnitude Bt and the Bz-IMF component) from the ACE spacecraft mission for eight storm events. The determination of the time delay in the ionospheric response to interplanetary medium disturbances leads to significant results concerning the forecast of ionospheric storm onset and development during the first 24 h. In this way the real-time ACE observations of the solar wind parameters may be used in the development of a real-time dynamic ionospheric storm model with adequate accuracy.
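The time delay between an IMF disturbance and the ionospheric response can be estimated, in the simplest case, by lagged cross-correlation of the driver and response series. A minimal sketch with synthetic data (not the actual foF2 or ACE records analysed in the paper):

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

def best_lag(driver, response, max_lag):
    """Lag (in samples) at which the response correlates most strongly with the driver."""
    scores = {}
    for lag in range(max_lag + 1):
        d = driver[:len(driver) - lag] if lag else driver
        scores[lag] = pearson(d, response[lag:])
    return max(scores, key=scores.get)

# Synthetic example: response is the driver delayed by 3 samples.
driver = [math.sin(0.3 * t) for t in range(200)]
response = [0.0] * 3 + driver[:-3]
print(best_lag(driver, response, max_lag=10))  # -> 3
```

In practice the physically meaningful lag window and the significance of the peak correlation would have to be assessed per storm event.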

  8. Reliability and safety engineering

    CERN Document Server

    Verma, Ajit Kumar; Karanki, Durga Rao

    2016-01-01

Reliability and safety are core issues that must be addressed throughout the life cycle of engineering systems. Reliability and Safety Engineering presents an overview of the basic concepts, together with simple and practical illustrations. The authors present reliability terminology in various engineering fields, viz., electronics engineering, software engineering, mechanical engineering, structural engineering and power systems engineering. The book describes the latest applications in the area of probabilistic safety assessment, such as technical specification optimization, risk monitoring and risk-informed in-service inspection. Reliability and safety studies must, inevitably, deal with uncertainty, so the book includes uncertainty propagation methods: Monte Carlo simulation, fuzzy arithmetic, Dempster-Shafer theory and probability bounds. Reliability and Safety Engineering also highlights advances in system reliability and safety assessment including dynamic system modeling and uncertainty management. Cas...

  9. Combining 2-m temperature nowcasting and short range ensemble forecasting

    Directory of Open Access Journals (Sweden)

    A. Kann

    2011-12-01

    variables. Validation results indicate that all three methods produce sharp and reliable probabilistic 2-m temperature forecasts. However, the statistical and combined dynamic-statistical methods slightly outperform the pure dynamical approach, mainly due to the under-dispersive behavior of ALADIN-LAEF outside the nowcasting range. The training length does not have a pronounced impact on forecast skill, but a spread re-scaling improves the forecast skill substantially. Refinements of the statistical methods yield a slight further improvement.
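The spread re-scaling step mentioned above can be sketched generically: inflate each member's anomaly about the ensemble mean by a factor chosen so that the average ensemble variance matches the mean squared error of the ensemble mean over a training period. A minimal illustration with invented numbers, not the ALADIN-LAEF implementation:

```python
import math
import statistics

def rescale_ensemble(members, factor):
    """Inflate/deflate ensemble anomalies about the ensemble mean by `factor`."""
    m = statistics.mean(members)
    return [m + factor * (x - m) for x in members]

def spread_factor(train_ensembles, train_obs):
    """Factor making mean ensemble variance match the MSE of the ensemble mean."""
    mse = statistics.mean((statistics.mean(e) - o) ** 2
                          for e, o in zip(train_ensembles, train_obs))
    var = statistics.mean(statistics.pvariance(e) for e in train_ensembles)
    return math.sqrt(mse / var)

# Toy under-dispersive ensemble: small spread, larger actual errors.
ens = [[10.0, 10.5, 9.5], [20.0, 20.5, 19.5]]
obs = [12.0, 18.0]
f = spread_factor(ens, obs)
wider = rescale_ensemble(ens[0], f)
```

The rescaling preserves the ensemble mean and only widens (here, substantially) the distribution of members around it.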

  10. Evaluation of probabilistic forecasts with the scoringRules package

    Science.gov (United States)

    Jordan, Alexander; Krüger, Fabian; Lerch, Sebastian

    2017-04-01

Over the last decades, probabilistic forecasts in the form of predictive distributions have become popular in many scientific disciplines. With the proliferation of probabilistic models arises the need for decision-theoretically principled tools to evaluate the appropriateness of models and forecasts in a generalized way, in order to better understand sources of prediction errors and to improve the models. Proper scoring rules are functions S(F, y) which evaluate the accuracy of a forecast distribution F, given that an outcome y was observed. In coherence with decision-theoretical principles, they allow alternative models to be compared, a crucial ability given the variety of theories, data sources and statistical specifications that are available in many situations. This contribution presents the software package scoringRules for the statistical programming language R, which provides functions to compute popular scoring rules, such as the continuous ranked probability score, for a variety of distributions F that come up in applied work. For univariate variables, the two main classes are parametric distributions, such as normal, t, or gamma distributions, and distributions that are not known analytically but are indirectly described through a sample of simulation draws; ensemble weather forecasts, for example, take this form. The scoringRules package aims to be a convenient, dictionary-like reference for computing scoring rules. We offer state-of-the-art implementations of several known (but not routinely applied) formulas, and implement closed-form expressions that were previously unavailable. Whenever more than one implementation variant exists, we offer statistically principled default choices. Recent developments include the addition of scoring rules to evaluate multivariate forecast distributions. The use of the scoringRules package is illustrated in an example on post-processing ensemble forecasts of temperature.
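For a Gaussian forecast N(μ, σ²), the continuous ranked probability score has a well-known closed form, which is the kind of expression scoringRules implements (the package itself is R; this is a Python transcription of the standard formula):

```python
import math

def crps_normal(mu, sigma, y):
    """Closed-form CRPS of a N(mu, sigma^2) forecast, observation y.

    CRPS = sigma * ( z*(2*Phi(z)-1) + 2*phi(z) - 1/sqrt(pi) ),  z = (y-mu)/sigma
    """
    z = (y - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)
    cdf = 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return sigma * (z * (2 * cdf - 1) + 2 * pdf - 1 / math.sqrt(math.pi))

print(round(crps_normal(0.0, 1.0, 0.0), 4))  # -> 0.2337
```

Being a proper score, it is smallest when the forecast distribution is centred on the outcome and grows as the observation moves into the tails.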

  11. Reliability and mechanical design

    International Nuclear Information System (INIS)

    Lemaire, Maurice

    1997-01-01

Many results in mechanical design are obtained from a modelling of physical reality and from a numerical solution, which lead to the evaluation of needs and resources. The goal of reliability analysis is to evaluate the confidence that can be placed in the chosen design through the calculation of a probability of failure linked to the retained scenario. Two types of analysis are proposed: sensitivity analysis and reliability analysis. Approximate methods are applicable to problems related to reliability, availability, maintainability and safety (RAMS).

  12. A data-driven multi-model methodology with deep feature selection for short-term wind forecasting

    International Nuclear Information System (INIS)

    Feng, Cong; Cui, Mingjian; Hodge, Bri-Mathias; Zhang, Jie

    2017-01-01

Highlights: • An ensemble model is developed to produce both deterministic and probabilistic wind forecasts. • A deep feature selection framework is developed to optimally determine the inputs to the forecasting methodology. • The developed ensemble methodology has improved the forecasting accuracy by up to 30%. - Abstract: With the growing wind penetration into power systems worldwide, improving wind power forecasting accuracy is becoming increasingly important to ensure continued economic and reliable power system operations. In this paper, a data-driven multi-model wind forecasting methodology is developed with a two-layer ensemble machine learning technique. The first layer is composed of multiple machine learning models that generate individual forecasts. A deep feature selection framework is developed to determine the most suitable inputs to the first-layer machine learning models. Then, a blending algorithm is applied in the second layer to create an ensemble of the forecasts produced by the first-layer models and generate both deterministic and probabilistic forecasts. This two-layer model seeks to utilize the statistically different characteristics of each machine learning algorithm. A number of machine learning algorithms are selected and compared in both layers. The developed multi-model wind forecasting methodology is compared to several benchmarks and evaluated on 1-hour-ahead wind speed forecasting at seven locations of the Surface Radiation network. Numerical results show that, compared to single-algorithm models, the developed multi-model framework with the deep feature selection procedure improves the forecasting accuracy by up to 30%.
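The two-layer idea can be sketched as: first-layer models each produce a forecast, and a second-layer blender learns how to weight those forecasts against held-out observations. A minimal least-squares blend of two hypothetical first-layer forecasts (illustrative numbers only; the paper's actual blending algorithm and models are not reproduced here):

```python
def blend_weight(f1, f2, obs):
    """Weight w minimizing sum((w*f1 + (1-w)*f2 - obs)^2), in closed form."""
    num = sum((a - b) * (o - b) for a, b, o in zip(f1, f2, obs))
    den = sum((a - b) ** 2 for a, b in zip(f1, f2))
    return num / den

# Hypothetical first-layer forecasts and verifying observations
f1 = [5.0, 6.0, 7.0, 8.0]
f2 = [4.0, 7.0, 6.0, 9.0]
obs = [4.7, 6.3, 6.7, 8.3]

w = blend_weight(f1, f2, obs)
blended = [w * a + (1 - w) * b for a, b in zip(f1, f2)]
```

With more than two first-layer models the same idea becomes a general linear (or nonlinear) regression of observations on the stacked forecasts.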

  13. Aggregated wind power generation probabilistic forecasting based on particle filter

    International Nuclear Information System (INIS)

    Li, Pai; Guan, Xiaohong; Wu, Jiang

    2015-01-01

Highlights: • A new method for probabilistic forecasting of aggregated wind power generation. • A dynamic system is established based on a numerical weather prediction model. • The new method handles the non-Gaussian and time-varying wind power uncertainties. • Particle filter is applied to forecast predictive densities of wind generation. - Abstract: The probability distribution of the aggregated wind power generation in a region is one of the important issues for power system daily operation. This paper presents a novel method to forecast the predictive densities of the aggregated wind power generation from several geographically distributed wind farms, considering the non-Gaussian and non-stationary characteristics of wind power uncertainties. Based on a mesoscale numerical weather prediction model, a dynamic system is established to formulate the relationship between the atmospheric and near-surface wind fields of geographically distributed wind farms. A recursively backtracking framework based on the particle filter is applied to estimate the atmospheric state from the near-surface wind power generation measurements, and to forecast possible samples of the aggregated wind power generation. The predictive densities of the aggregated wind power generation are then estimated from these predicted samples by a kernel density estimator. In case studies, the new method is tested on a system of 9 wind farms in the Midwestern United States. The testing results show that the new method provides competitive interval forecasts for the aggregated wind power generation compared with conventional statistics-based models, which validates its effectiveness.
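A particle filter of the kind used here repeats three steps: propagate particles through the dynamics, reweight them by the likelihood of the new measurement, and resample. A generic one-dimensional reweight-and-resample step (toy state and noise levels, not the paper's NWP-based state model):

```python
import math
import random

def pf_step(particles, weights, measurement, obs_std=0.5):
    """One reweight-and-resample step of a bootstrap particle filter."""
    # Reweight by a Gaussian measurement likelihood
    w = [wi * math.exp(-0.5 * ((p - measurement) / obs_std) ** 2)
         for p, wi in zip(particles, weights)]
    total = sum(w)
    w = [wi / total for wi in w]
    # Systematic resampling back to uniform weights
    n = len(particles)
    u = random.random() / n
    cum, acc = [], 0.0
    for wi in w:
        acc += wi
        cum.append(acc)
    out, j = [], 0
    for i in range(n):
        pos = u + i / n
        while j < n - 1 and cum[j] < pos:
            j += 1
        out.append(particles[j])
    return out, [1.0 / n] * n

random.seed(1)
particles = [random.gauss(0.0, 2.0) for _ in range(500)]   # prior cloud
weights = [1.0 / 500] * 500
particles, weights = pf_step(particles, weights, measurement=1.0)
```

After the step, the particle cloud concentrates near the measurement, which is how the method tracks a non-Gaussian, time-varying state.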

  14. Advancing satellite-based solar power forecasting through integration of infrared channels for automatic detection of coastal marine inversion layer

    Energy Technology Data Exchange (ETDEWEB)

    Kostylev, Vladimir; Kostylev, Andrey; Carter, Chris; Mahoney, Chad; Pavlovski, Alexandre; Daye, Tony [Green Power Labs Inc., Dartmouth, NS (Canada); Cormier, Dallas Eugene; Fotland, Lena [San Diego Gas and Electric Co., San Diego, CA (United States)

    2012-07-01

The marine atmospheric boundary layer is a layer of cool, moist maritime air, a few thousand feet thick, immediately below a temperature inversion. In coastal areas, as moist air rises from the ocean surface it becomes trapped and is often compressed into fog, above which a layer of stratus clouds often forms. This phenomenon is a recurring challenge for satellite-based solar radiation monitoring and forecasting. Hour-ahead satellite-based solar radiation forecasts commonly use visible-spectrum satellite images, from which it is difficult to automatically differentiate low stratus clouds and fog from high-altitude clouds. This complicates cloud motion tracking and cloud cover forecasting. The San Diego Gas and Electric® (SDG&E®) Marine Layer Project was undertaken to obtain information for integration with PV forecasts, and to develop a detailed understanding of the long-term benefits of forecasting Marine Layer (ML) events and their effects on PV production. In order to establish climatological ML patterns and the spatial extent and distribution of the marine layer, we analyzed an archive of visible and IR spectrum satellite images (GOES WEST) covering eleven years (2000-2010). Historical boundaries of marine layer impact were established based on the cross-classification of visible-spectrum (VIS) and infrared (IR) images. This approach is successfully used by us and elsewhere for evaluating cloud albedo in common satellite-based techniques for solar radiation monitoring and forecasting. The approach allows differentiation of cloud cover and helps distinguish the low-lying fog which is the main consequence of marine layer formation. ML occurrence probability and maximum inland extent were established for each hour and day of the analyzed period, and seasonal patterns were described. The SDG&E service area is the region most affected by ML events, with the highest extent and probability of ML occurrence. Influence of ML was the

  15. Tool for Forecasting Cool-Season Peak Winds Across Kennedy Space Center and Cape Canaveral Air Force Station (CCAFS)

    Science.gov (United States)

    Barrett, Joe H., III; Roeder, William P.

    2010-01-01

Peak wind speed is an important element in the 24-Hour and Weekly Planning Forecasts issued by the 45th Weather Squadron (45 WS). Forecasts are issued for planning operations at KSC/CCAFS. 45 WS wind advisories are issued for wind gusts greater than or equal to 25 kt, 35 kt and 50 kt from the surface to 300 ft. The AMU developed a cool-season (Oct-Apr) tool to help the 45 WS forecast: the daily peak wind speed, the 5-minute average speed at the time of the peak wind, and the probability that the peak speed is greater than or equal to 25 kt, 35 kt and 50 kt. The AMU tool also forecasts the daily average wind speed from 30 ft to 60 ft. The Phase I and II tools were delivered as a Microsoft Excel graphical user interface (GUI). The Phase II tool was also delivered as a Meteorological Interactive Data Display System (MIDDS) GUI. The Phase I and II forecast methods were compared to climatology, 45 WS wind advisories and North American Mesoscale model (MesoNAM) forecasts in a verification data set.

  16. Advanced mesoscale forecasts of icing events for Gaspe wind farms

    International Nuclear Information System (INIS)

    Gayraud, A.; Benoit, R.; Camion, A.

    2009-01-01

    Atmospheric icing includes every event which causes ice accumulations of various shapes on different structures. In terms of its effects on wind farms, atmospheric icing can decrease the aerodynamic performance, cause structure overloading, and add vibrations leading to failure and breaking. This presentation discussed advanced mesoscale forecasts of icing events for Gaspe wind farms. The context of the study was discussed with particular reference to atmospheric icing; effects on wind farms; and forecast objectives. The presentation also described the models and results of the study. These included MC2, a compressible community model, as well as a Milbrandt and Yau condensation scheme. It was shown that the study has provided good estimates of the duration of events as well as reliable precipitation categories. tabs., figs.

  17. Inflation Forecast Contracts

    OpenAIRE

    Gersbach, Hans; Hahn, Volker

    2012-01-01

    We introduce a new type of incentive contract for central bankers: inflation forecast contracts, which make central bankers’ remunerations contingent on the precision of their inflation forecasts. We show that such contracts enable central bankers to influence inflation expectations more effectively, thus facilitating more successful stabilization of current inflation. Inflation forecast contracts improve the accuracy of inflation forecasts, but have adverse consequences for output. On balanc...

  18. Operational flood-forecasting in the Piemonte region – development and verification of a fully distributed physically-oriented hydrological model

    Directory of Open Access Journals (Sweden)

    D. Rabuffetti

    2009-03-01

Full Text Available A hydrological model for real-time flood forecasting for Civil Protection services requires reliability and rapidity. At present, computational capabilities overcome the rapidity needs even when a fully distributed hydrological model is adopted for a large river catchment such as the Upper Po river basin closed at Ponte Becca (nearly 40 000 km2). This approach allows simulating the whole domain and obtaining the responses of large as well as medium and small sub-catchments. The FEST-WB hydrological model (Mancini, 1990; Montaldo et al., 2007; Rabuffetti et al., 2008) is implemented. The calibration and verification activities are based on more than 100 flood events that occurred along the main tributaries of the Po river in the period 2000–2003. More than 300 meteorological stations are used to obtain the forcing fields; 10 cross sections with continuous and reliable discharge time series are used for calibration, while verification is performed on about 40 monitored cross sections. Furthermore, meteorological forecasting models are used to force the hydrological model with Quantitative Precipitation Forecasts (QPFs) over a 36 h horizon in "operational setting" experiments. Particular care is devoted to understanding how the QPF affects the accuracy of the Quantitative Discharge Forecasts (QDFs) and to assessing the impact of QDF uncertainty on the reliability of the warning system. Results are presented in terms of both QDF and warning issues, highlighting the importance of an "operationally based" verification approach.

  19. System-Reliability Cumulative-Binomial Program

    Science.gov (United States)

    Scheuer, Ernest M.; Bowerman, Paul N.

    1989-01-01

    Cumulative-binomial computer program, NEWTONP, one of set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. NEWTONP, CUMBIN (NPO-17555), and CROSSER (NPO-17557), used independently of one another. Program finds probability required to yield given system reliability. Used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. Program written in C.
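The task NEWTONP performs — finding the component probability that yields a given system reliability — can be sketched with a cumulative-binomial evaluation plus a root search. This is a generic reimplementation of the idea for a k-out-of-n system of identical components, not the NEWTONP source (which is written in C):

```python
from math import comb

def system_reliability(p, k, n):
    """Probability that at least k of n identical components (each reliability p) work."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def required_component_reliability(target, k, n, tol=1e-10):
    """Bisection: component reliability p giving system reliability >= target."""
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if system_reliability(mid, k, n) < target:
            lo = mid
        else:
            hi = mid
    return hi

# Component reliability needed for a 2-out-of-3 system to reach 99.9%
p = required_component_reliability(0.999, k=2, n=3)
```

Bisection works here because the system reliability is monotonically increasing in the component reliability; NEWTONP, as the name suggests, uses a Newton-type iteration for the same root.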

  20. Safety and reliability analysis based on nonprobabilistic methods

    International Nuclear Information System (INIS)

    Kozin, I.O.; Petersen, K.E.

    1996-01-01

Imprecise probabilities, developed over the last two decades, offer a considerably more general theory with many advantages that make it very promising for reliability and safety analysis. The objective of the paper is to argue that imprecise probabilities are a more appropriate tool for reliability and safety analysis, that they allow the behaviour of nuclear industry objects to be modelled more comprehensively, and that they make it possible to solve some problems unsolved in the framework of the conventional approach. Furthermore, some specific examples are given from which we can see the usefulness of the tool for solving reliability tasks.

  1. Plant and control system reliability and risk model

    International Nuclear Information System (INIS)

    Niemelae, I.M.

    1986-01-01

A new reliability modelling technique for control systems and plants is demonstrated. It is based on modified boolean algebra and has been automated in an efficient computer code called RELVEC. The code is useful for getting an overall view of the reliability parameters or for an in-depth reliability analysis, which is essential in risk analysis, where the model must be capable of answering specific questions such as: 'What is the probability of this temperature limiter producing a false alarm?' or 'What is the probability of the air pressure in this subsystem dropping below the lower limit?'. (orig./DG)
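The boolean-algebra style of plant reliability model can be illustrated with a tiny fault-tree evaluator: AND gates multiply independent failure probabilities, OR gates combine them through complements. A toy sketch with invented probabilities, unrelated to the RELVEC code itself:

```python
def p_and(*probs):
    """Failure probability of an AND gate: all independent inputs must fail."""
    out = 1.0
    for p in probs:
        out *= p
    return out

def p_or(*probs):
    """Failure probability of an OR gate: any independent input failing suffices."""
    out = 1.0
    for p in probs:
        out *= (1 - p)
    return 1 - out

# Toy tree: the top event occurs if the sensor fails OR both redundant pumps fail.
sensor, pump_a, pump_b = 0.01, 0.05, 0.05
top = p_or(sensor, p_and(pump_a, pump_b))
print(round(top, 6))  # -> 0.012475
```

Answering a question like "probability of a false alarm from this limiter" amounts to evaluating such a gate expression over the relevant subtree.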

  2. Energy Demand Forecasting: Combining Cointegration Analysis and Artificial Intelligence Algorithm

    Directory of Open Access Journals (Sweden)

    Junbing Huang

    2018-01-01

Full Text Available Energy is vital for the sustainable development of China. Accurate forecasts of annual energy demand are essential to schedule energy supply and provide valuable suggestions for developing related industries. In the existing literature on energy use prediction, the artificial intelligence-based (AI-based) model has received considerable attention. However, little econometric and statistical evidence exists to prove the reliability of current AI-based models, an area that still needs to be addressed. In this study, a new energy demand forecasting framework is presented. On the basis of historical annual data of electricity usage over the period 1985–2015, the coefficients of the linear and quadratic forms of the AI-based model are optimized by combining an adaptive genetic algorithm with a cointegration analysis, shown as an example. Prediction results of the proposed model indicate that the annual growth rate of electricity demand in China will slow down. However, China will still demand about 13 trillion kilowatt hours in 2030 because of population growth, economic growth, and urbanization. In addition, the model has greater accuracy and reliability compared with other single optimization methods.

  3. Human factors reliability Benchmark exercise

    International Nuclear Information System (INIS)

    Poucet, A.

    1989-06-01

    The Joint Research Centre of the European Commission has organized a Human Factors Reliability Benchmark Exercise (HF-RBE) with the aim of assessing the state of the art in human reliability modelling and assessment. Fifteen teams from eleven countries, representing industry, utilities, licensing organisations and research institutes, participated in the HF-RBE. The HF-RBE was organized around two study cases: (1) analysis of routine functional Test and Maintenance (T and M) procedures: with the aim of assessing the probability of test induced failures, the probability of failures to remain unrevealed and the potential to initiate transients because of errors performed in the test; (2) analysis of human actions during an operational transient: with the aim of assessing the probability that the operators will correctly diagnose the malfunctions and take proper corrective action. This report contains the final summary reports produced by the participants in the exercise

  4. Medium-range reference evapotranspiration forecasts for the contiguous United States based on multi-model numerical weather predictions

    Science.gov (United States)

    Medina, Hanoi; Tian, Di; Srivastava, Puneet; Pelosi, Anna; Chirico, Giovanni B.

    2018-07-01

Reference evapotranspiration (ET0) plays a fundamental role in agronomic, forestry, and water resources management. Estimating and forecasting ET0 have long been recognized as a major challenge for researchers and practitioners in these communities. This work explored the potential of multiple leading numerical weather predictions (NWPs) for estimating and forecasting summer ET0 at 101 U.S. Regional Climate Reference Network stations over nine climate regions across the contiguous United States (CONUS). Three leading global NWP model forecasts from the THORPEX Interactive Grand Global Ensemble (TIGGE) dataset were used in this study, including the single-model ensemble forecasts from the European Centre for Medium-Range Weather Forecasts (EC), the National Centers for Environmental Prediction Global Forecast System (NCEP), and the United Kingdom Meteorological Office forecasts (MO), as well as multi-model ensemble forecasts from combinations of these NWP models. A regression calibration was employed to bias-correct the ET0 forecasts. The impact of individual forecast variables on ET0 forecasts was also evaluated. The results showed that the EC forecasts provided the least error and the highest skill and reliability, followed by the MO and NCEP forecasts. The multi-model ensembles constructed from the combination of EC and MO forecasts provided slightly better performance than the single-model EC forecasts. The regression process greatly improved ET0 forecast performance, particularly for regions with stations near the coast or with complex orography. The performance of EC forecasts was only slightly influenced by the number of ensemble members, particularly at short lead times. Even with fewer ensemble members, EC still performed better than the other two NWPs. Errors in the radiation forecasts, followed by those in the wind, had the most detrimental effects on the ET0 forecast performance.
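The regression calibration used to bias-correct the ET0 forecasts can be sketched as an ordinary least-squares fit of observations on raw forecasts over a training period, then applied to new forecasts. A generic sketch with invented numbers, not the paper's station data:

```python
def fit_linear(x, y):
    """OLS slope and intercept for y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return a, my - a * mx

# Training period: raw forecasts with a systematic multiplicative and additive bias.
raw = [3.0, 4.0, 5.0, 6.0, 7.0]
obs = [(x - 0.5) / 1.2 for x in raw]   # i.e. raw = 1.2*obs + 0.5

a, b = fit_linear(raw, obs)
calibrated = a * 5.5 + b   # bias-corrected value for a new raw forecast of 5.5
```

In operation, the fit would be refreshed per station and lead time, so that the correction tracks local, flow-dependent biases.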

  5. Sensitivity of the probability of failure to probability of detection curve regions

    International Nuclear Information System (INIS)

    Garza, J.; Millwater, H.

    2016-01-01

    Non-destructive inspection (NDI) techniques have been shown to play a vital role in fracture control plans, structural health monitoring, and ensuring availability and reliability of piping, pressure vessels, mechanical and aerospace equipment. Probabilistic fatigue simulations are often used in order to determine the efficacy of an inspection procedure with the NDI method modeled as a probability of detection (POD) curve. These simulations can be used to determine the most advantageous NDI method for a given application. As an aid to this process, a first order sensitivity method of the probability-of-failure (POF) with respect to regions of the POD curve (lower tail, middle region, right tail) is developed and presented here. The sensitivity method computes the partial derivative of the POF with respect to a change in each region of a POD or multiple POD curves. The sensitivities are computed at no cost by reusing the samples from an existing Monte Carlo (MC) analysis. A numerical example is presented considering single and multiple inspections. - Highlights: • Sensitivities of probability-of-failure to a region of probability-of-detection curve. • The sensitivities are computed with negligible cost. • Sensitivities identify the important region of a POD curve. • Sensitivities can be used as a guide to selecting the optimal POD curve.

  6. Using Bayes Model Averaging for Wind Power Forecasts

    Science.gov (United States)

    Preede Revheim, Pål; Beyer, Hans Georg

    2014-05-01

For operational purposes, forecasts of the lumped output of groups of wind farms spread over larger geographic areas will often be of interest. A naive approach is to make forecasts for each individual site and sum them up to get the group forecast. It is however well documented that a better choice is to use a model that also takes advantage of spatial smoothing effects. It might however be the case that some sites tend to reflect the total output of the region more accurately, either in general or for certain wind directions. It will then be of interest to give these a greater influence over the group forecast. Bayesian model averaging (BMA) is a statistical post-processing method for producing probabilistic forecasts from ensembles. Raftery et al. [1] show how BMA can be used for statistical post-processing of forecast ensembles, producing PDFs of future weather quantities. The BMA predictive PDF of a future weather quantity is a weighted average of the ensemble members' PDFs, where the weights can be interpreted as posterior probabilities and reflect the ensemble members' contribution to overall forecasting skill over a training period. In Revheim and Beyer [2] the BMA procedure used in Sloughter, Gneiting and Raftery [3] was found to produce fairly accurate PDFs for the future mean wind speed of a group of sites from the single sites' wind speeds. However, when the procedure was applied to wind power it resulted in either problems with the estimation of the parameters (mainly caused by longer consecutive periods of no power production) or severe underestimation (mainly caused by problems with reflecting the power curve). In this paper the problems that arose when applying BMA to wind power forecasting are met through two strategies. First, the BMA procedure is run with a combination of single-site wind speeds and single-site wind power production as input.
This solves the problem with longer consecutive periods where the input data
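The BMA predictive PDF described above can be sketched as a weighted mixture of member densities; here with Gaussian member PDFs and fixed illustrative weights (a real BMA application estimates the weights over a training period, e.g. by EM, and may use non-Gaussian kernels as in Sloughter et al.):

```python
import math

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def bma_pdf(x, means, sigmas, weights):
    """BMA predictive density: posterior-weighted mixture of member PDFs."""
    return sum(w * normal_pdf(x, m, s) for w, m, s in zip(weights, means, sigmas))

def bma_mean(means, weights):
    """Mean of the mixture = weighted mean of member centers."""
    return sum(w * m for w, m in zip(weights, means))

# Three hypothetical ensemble members with (illustrative) posterior weights
means, sigmas, weights = [8.0, 10.0, 11.0], [1.5, 1.5, 1.5], [0.5, 0.3, 0.2]
print(round(bma_mean(means, weights), 2))  # -> 9.2
```

Sites that better reflect the regional output would, after training, simply receive larger posterior weights in this mixture.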

  7. Forecast of Antarctic Sea Ice and Meteorological Fields

    Science.gov (United States)

    Barreira, S.; Orquera, F.

    2017-12-01

Since 2001, we have been forecasting the climatic fields of Antarctic sea ice (SI) and of surface air temperature, surface pressure and precipitation anomalies for the Southern Hemisphere at the Meteorological Department of the Argentine Naval Hydrographic Service, with techniques that have evolved over the years. The forecast is based on the results of Principal Components Analysis applied, on the one hand, to SI series (S-Mode), which gives patterns of temporal series with validity areas (these series are important to determine which areas in Antarctica will have positive or negative SI anomalies based on what happens in the atmosphere) and, on the other hand, to SI fields (T-Mode), which gives the form of the SI field anomalies based on a classification of 16 patterns. Each T-Mode pattern has unique atmospheric fields associated with it. Therefore, it is possible to forecast whichever atmospheric variable we choose for the Southern Hemisphere. When the forecast is obtained, each pattern has a probability of occurrence, and sometimes it is necessary to compose more than one of them to obtain the final result. S-Mode and T-Mode are updated monthly with new data; for that reason the forecasts have improved with the increase of cases since 2001. We used the Monthly Polar Gridded Sea Ice Concentrations database derived from satellite information generated by the NASA Team algorithm, provided monthly by the National Snow and Ice Data Center of the USA, which begins in November 1978. Recently, we have been experimenting with a multilayer Perceptron (neural network) with supervised learning and a back-propagation algorithm to improve the forecast. The Perceptron is the most common Artificial Neural Network topology dedicated to image pattern recognition. It was implemented through the use of temperature and pressure anomaly field images that were associated with the different sea ice anomaly patterns.
The variables analyzed included only composites of surface air temperature and pressure anomalies

  8. Reliability analysis of RC containment structures under combined loads

    International Nuclear Information System (INIS)

    Hwang, H.; Reich, M.; Kagami, S.

    1984-01-01

    This paper discusses a reliability analysis method and load combination design criteria for reinforced concrete containment structures under combined loads. The probability based reliability analysis method is briefly described. For load combination design criteria, derivations of the load factors for accidental pressure due to a design basis accident and safe shutdown earthquake (SSE) for three target limit state probabilities are presented

  9. An Advanced Bayesian Method for Short-Term Probabilistic Forecasting of the Generation of Wind Power

    Directory of Open Access Journals (Sweden)

    Antonio Bracale

    2015-09-01

Full Text Available Currently, among renewable distributed generation systems, wind generators are receiving a great deal of interest due to the great economic, technological, and environmental incentives they involve. However, the uncertainties due to the intermittent nature of wind energy make it difficult to operate electrical power systems optimally and to make decisions that satisfy the needs of all the stakeholders of the electricity energy market. Thus, there is increasing interest in determining how to forecast wind power production accurately. Most of the methods published in the relevant literature provide deterministic forecasts, even though great interest has recently been focused on probabilistic forecast methods. In this paper, an advanced probabilistic method is proposed for short-term forecasting of wind power production. A mixture of two Weibull distributions was used as a probability function to model the uncertainties associated with wind speed. Then, a Bayesian inference approach with a particularly effective autoregressive integrated moving-average model was used to determine the parameters of the mixture Weibull distribution. Numerical applications are also presented to provide evidence of the forecasting performance of the Bayesian-based approach.
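The wind-speed model above, a two-component Weibull mixture, can be sketched as follows. Parameter values are invented for illustration, and the paper's Bayesian/ARIMA estimation of them is not reproduced:

```python
import math
import random

def weibull_pdf(v, k, lam):
    """Weibull density with shape k and scale lam (v >= 0)."""
    if v < 0:
        return 0.0
    return (k / lam) * (v / lam) ** (k - 1) * math.exp(-((v / lam) ** k))

def mixture_pdf(v, params, weight):
    """Two-component mixture: weight*W(k1,lam1) + (1-weight)*W(k2,lam2)."""
    (k1, l1), (k2, l2) = params
    return weight * weibull_pdf(v, k1, l1) + (1 - weight) * weibull_pdf(v, k2, l2)

def sample_mixture(params, weight, rng):
    """Draw a wind speed: pick a component, then invert its Weibull CDF."""
    k, lam = params[0] if rng.random() < weight else params[1]
    return lam * (-math.log(1 - rng.random())) ** (1 / k)

# Hypothetical parameters: a light-wind regime and a stronger-wind regime
rng = random.Random(7)
params, weight = [(2.0, 6.0), (3.5, 11.0)], 0.6
speeds = [sample_mixture(params, weight, rng) for _ in range(2000)]
```

The mixture captures bimodality (e.g. two wind regimes) that a single Weibull cannot, which is the motivation for using it as the forecast uncertainty model.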

  10. The Astringency of the GP Algorithm for Forecasting Software Failure Data Series

    Directory of Open Access Journals (Sweden)

    Yong-qiang Zhang

    2007-05-01

Full Text Available The forecasting of software failure data series by Genetic Programming (GP) can be realized without any assumptions before modeling. This relaxes the assumptions required by traditional statistical modeling methods and improves model applicability. The individuals' characteristics, which change randomly during the evolution of generations, are treated as Markov random processes. This paper also proposes that a GP algorithm with an "optimal individuals reserved strategy" converges, so that the best-adapted individuals will eventually be evolved. This allows practical application in software reliability modeling, analysis and forecasting of failure behaviors, and it verifies, on a theoretical basis, the feasibility and availability of the GP algorithm applied to software failure data series forecasting. The results show that the GP algorithm is a sound solution for forecasting software failure behaviors in a variety of disciplines.
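The "optimal individuals reserved strategy" is elitism: the best individual survives each generation unchanged, which makes the best-fitness sequence monotone and hence convergent. A minimal sketch with a toy real-valued genome (an assumption for illustration; the paper evolves GP program trees, not vectors):

```python
import random

def evolve(fitness, pop_size=30, genome_len=8, generations=40, seed=0):
    """Toy evolutionary loop with elitism: the best individual is always
    carried over unchanged, so the best fitness never worsens."""
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(genome_len)] for _ in range(pop_size)]
    history = []
    for _ in range(generations):
        pop.sort(key=fitness)                  # ascending: pop[0] is best
        history.append(fitness(pop[0]))
        elite = pop[0][:]                      # reserved optimal individual
        children = [elite]
        while len(children) < pop_size:
            a, b = rng.sample(pop[:10], 2)     # select among the fittest
            cut = rng.randrange(genome_len)
            child = a[:cut] + b[cut:]          # one-point crossover
            if rng.random() < 0.3:             # random mutation
                child[rng.randrange(genome_len)] = rng.random()
            children.append(child)
        pop = children
    return history

# Minimize distance of every gene to 0.5 (a stand-in objective).
hist = evolve(lambda g: sum((x - 0.5) ** 2 for x in g))
print(hist[0], hist[-1])
```

Because the elite is never mutated, `hist` is non-increasing, which is the monotonicity property underlying the convergence argument.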

  11. A Novel Nonlinear Combined Forecasting System for Short-Term Load Forecasting

    Directory of Open Access Journals (Sweden)

    Chengshi Tian

    2018-03-01

Full Text Available Short-term load forecasting plays an indispensable role in electric power systems, yet it remains an extremely challenging task because of the complex nonlinear characteristics of load data. Most previous combined forecasting models optimize weight coefficients to build a linear combination of individual predictors; such a linear combination captures only the linear contribution of each predictor to the model's performance, which can lead to poor forecasting results because potentially significant nonlinear terms are neglected. In this paper, a novel nonlinear combined forecasting system, which consists of three modules (an improved data pre-processing module, a forecasting module and an evaluation module), is developed for short-term load forecasting. Unlike the simple data pre-processing of most previous studies, the improved data pre-processing module is based on longitudinal data selection, which further improves the effectiveness of the pre-processing and thus the final forecasting performance. Furthermore, a modified support vector machine is developed to integrate all the individual predictors and obtain the final prediction, which overcomes the aforementioned drawbacks of the linear combined model. Moreover, an evaluation module performs a scientific evaluation of the developed system. Half-hourly electrical load data from New South Wales are employed to verify the effectiveness of the developed forecasting system, and the results reveal that it can be employed in dispatching and planning for smart grids.
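The nonlinear-combination idea can be illustrated with kernel ridge regression standing in for the paper's modified support vector machine (the synthetic data, RBF kernel width and ridge strength below are assumptions for the sketch): instead of learning fixed linear weights, the combiner learns a nonlinear function of the base forecasts.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian RBF kernel matrix between row vectors of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_nonlinear_combiner(base_preds, y, lam=0.1, gamma=1.0):
    """Kernel ridge regression over base forecasts: learns a nonlinear
    combination f(p1, p2, ...) rather than fixed linear weights."""
    K = rbf_kernel(base_preds, base_preds, gamma)
    alpha = np.linalg.solve(K + lam * np.eye(len(y)), y)
    return lambda P: rbf_kernel(P, base_preds, gamma) @ alpha

rng = np.random.default_rng(0)
t = np.linspace(0, 8 * np.pi, 400)
y = np.sin(t) + 0.1 * rng.standard_normal(t.size)          # "load" series
p1 = np.sin(t) + 0.3 * rng.standard_normal(t.size)         # noisy predictor 1
p2 = np.sin(t - 0.2) + 0.3 * rng.standard_normal(t.size)   # noisy predictor 2
X = np.column_stack([p1, p2])

combine = fit_nonlinear_combiner(X[:300], y[:300])          # train
err_comb = float(np.mean((combine(X[300:]) - y[300:]) ** 2))  # test MSE
err_p1 = float(np.mean((p1[300:] - y[300:]) ** 2))
print(err_comb, err_p1)
```

On this toy problem the learned combination reduces test error relative to either base predictor alone, which is the motivation for replacing linear weights with a nonlinear integrator.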

  12. Probabilistic Forecasting of the Wave Energy Flux

    DEFF Research Database (Denmark)

    Pinson, Pierre; Reikard, G.; Bidlot, J.-R.

    2012-01-01

    Wave energy will certainly have a significant role to play in the deployment of renewable energy generation capacities. As with wind and solar, probabilistic forecasts of wave power over horizons of a few hours to a few days are required for power system operation as well as trading in electricit......% and 70% in terms of Continuous Rank Probability Score (CRPS), depending upon the test case and the lead time. It is finally shown that the log-Normal assumption can be seen as acceptable, even though it may be refined in the future....
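The CRPS used above to score probabilistic wave-power forecasts can be estimated directly from forecast samples; a minimal sketch with made-up ensembles (not the paper's data) using the standard sample-based estimator:

```python
import numpy as np

def crps_ensemble(samples, obs):
    """Sample-based CRPS estimator (lower is better):
    CRPS = E|X - y| - 0.5 * E|X - X'|, X, X' i.i.d. forecast draws."""
    samples = np.asarray(samples, dtype=float)
    term1 = np.abs(samples - obs).mean()
    term2 = np.abs(samples[:, None] - samples[None, :]).mean()
    return float(term1 - 0.5 * term2)

rng = np.random.default_rng(1)
obs = 3.0
sharp = crps_ensemble(rng.normal(3.0, 0.5, 2000), obs)  # well-centred, sharp
wide = crps_ensemble(rng.normal(5.0, 2.0, 2000), obs)   # biased, diffuse
print(sharp, wide)
```

A useful sanity check: for a degenerate (single-valued) forecast the CRPS reduces to the absolute error, so it generalizes the MAE to probabilistic forecasts.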

  13. Reliability Analysis of Free Jet Scour Below Dams

    Directory of Open Access Journals (Sweden)

    Chuanqi Li

    2012-12-01

Full Text Available Current formulas for calculating the scour depth below a free overfall are mostly deterministic in nature and do not adequately consider the uncertainties of the various scouring parameters. A reliability-based assessment of scour, taking into account the uncertainties of the parameters and coefficients involved, should therefore be performed. This paper studies the reliability of a dam foundation under the threat of scour. A model for calculating the reliability of scour and estimating the probability of failure of the dam foundation subjected to scour is presented. The Maximum Entropy Method is applied to construct the probability density function (PDF) of the performance function subject to the moment constraints, and Monte Carlo simulation (MCS) is applied for uncertainty analysis. An example is considered: its reliability against scour is computed, and the influence of the various random variables on the failure probability is analyzed.
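The Monte Carlo part of such an analysis can be sketched with a simple performance function g = R - S (failure when g < 0). The normal distributions and their parameters below are illustrative assumptions, not the paper's scour variables; for normal R and S the estimate can be checked against the closed-form failure probability.

```python
import math
import numpy as np

def mc_failure_probability(n=200_000, seed=0):
    """Monte Carlo estimate of P[g < 0] for g = R - S, with resistance R and
    scour load S treated as independent normals (illustrative values)."""
    rng = np.random.default_rng(seed)
    R = rng.normal(10.0, 1.5, n)   # resistance
    S = rng.normal(6.0, 2.0, n)    # scour load
    return float(np.mean(R - S < 0.0))

pf = mc_failure_probability()

# Analytic check: reliability index beta = (mu_R - mu_S) / sqrt(s_R^2 + s_S^2),
# failure probability Phi(-beta).
beta = (10.0 - 6.0) / math.sqrt(1.5**2 + 2.0**2)
pf_exact = 0.5 * math.erfc(beta / math.sqrt(2))
print(pf, pf_exact)
```

The same simulation loop works unchanged when the closed form is unavailable, e.g. with a Maximum-Entropy PDF for the performance function.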

  14. Types of Forecast and Weather-Related Information Used among Tourism Businesses in Coastal North Carolina

    Science.gov (United States)

    Ayscue, Emily P.

    This study profiles the coastal tourism sector, a large and diverse consumer of climate and weather information. It is crucial to provide reliable, accurate and relevant resources for the climate and weather-sensitive portions of this stakeholder group in order to guide them in capitalizing on current climate and weather conditions and to prepare them for potential changes. An online survey of tourism business owners, managers and support specialists was conducted within the eight North Carolina oceanfront counties asking respondents about forecasts they use and for what purposes as well as why certain forecasts are not used. Respondents were also asked about their perceived dependency of their business on climate and weather as well as how valuable different forecasts are to their decision-making. Business types represented include: Agriculture, Outdoor Recreation, Accommodations, Food Services, Parks and Heritage, and Other. Weekly forecasts were the most popular forecasts with Monthly and Seasonal being the least used. MANOVA and ANOVA analyses revealed outdoor-oriented businesses (Agriculture and Outdoor Recreation) as perceiving themselves significantly more dependent on climate and weather than indoor-oriented ones (Food Services and Accommodations). Outdoor businesses also valued short-range forecasts significantly more than indoor businesses. This suggests a positive relationship between perceived climate and weather dependency and forecast value. The low perceived dependency and value of short-range forecasts of indoor businesses presents an opportunity to create climate and weather information resources directed at how they can capitalize on positive climate and weather forecasts and how to counter negative effects with forecasted adverse conditions. The low use of long-range forecasts among all business types can be related to the low value placed on these forecasts. However, these forecasts are still important in that they are used to make more

  15. Using subseasonal-to-seasonal (S2S) extreme rainfall forecasts for extended-range flood prediction in Australia

    Science.gov (United States)

    White, C. J.; Franks, S. W.; McEvoy, D.

    2015-06-01

Meteorological and hydrological centres around the world are looking at ways to improve their capacity to produce and deliver skilful and reliable forecasts of high-impact extreme rainfall and flooding events on a range of prediction timescales (e.g. sub-daily, daily, multi-week, seasonal). Making improvements to extended-range rainfall and flood forecast models, assessing forecast skill and uncertainty, and exploring how to apply flood forecasts and communicate their benefits to decision-makers are significant challenges facing the forecasting and water resources management communities. This paper presents some of the latest science and initiatives from Australia on the development, application and communication of extreme rainfall and flood forecasts on the extended-range "subseasonal-to-seasonal" (S2S) forecasting timescale, with a focus on risk-based decision-making, increasing flood risk awareness and preparedness, capturing uncertainty, understanding human responses to flood forecasts and warnings, and the growing adoption of "climate services". The paper also demonstrates how forecasts of flood events across a range of prediction timescales could benefit a range of sectors and society, most notably disaster risk reduction (DRR) activities, emergency management and response, and the strengthening of community resilience. Extended-range S2S extreme flood forecasts, if presented as easily accessible, timely and relevant information, are a valuable resource to help society better prepare for, and subsequently cope with, extreme flood events.

  16. Uncertainties in reservoir performance forecasts; Estimativa de incertezas na previsao de desempenho de reservatorios

    Energy Technology Data Exchange (ETDEWEB)

    Loschiavo, Roberto

    1999-07-01

Project economic evaluation, as well as facilities design, for oil exploration is in general based on production forecasts. Since a production forecast depends on several parameters that are not completely known, one should take a probabilistic approach to reservoir modeling and numerical flow simulation. In this work, we propose a procedure to estimate probabilistic production forecast profiles based on the decision tree technique. The most influential parameters of a reservoir model are identified and combined to generate a number of realizations of the reservoir. The combination of branch probabilities along each path of the decision tree defines the probability associated with each reservoir model. A computer program was developed to automatically generate the reservoir models, submit them to the numerical simulator, and process the results. Parallel computing was used to improve the performance of the procedure. (author)
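The decision-tree combination step can be sketched as follows: each uncertain parameter contributes one tree level, each full path is one reservoir realization, and the path probability is the product of its branch probabilities. The parameter names, levels and probabilities below are hypothetical, chosen only to illustrate the bookkeeping.

```python
from itertools import product

# Hypothetical uncertain reservoir parameters with discrete outcome levels
# and branch probabilities (illustrative values only).
parameters = {
    "permeability": {"low": 0.3, "mid": 0.5, "high": 0.2},
    "porosity":     {"low": 0.25, "high": 0.75},
    "aquifer":      {"weak": 0.4, "strong": 0.6},
}

# Each realization is one path through the tree; its probability is the
# product of the branch probabilities along that path.
names = list(parameters)
realizations = []
for combo in product(*(parameters[n].items() for n in names)):
    levels = {n: level for n, (level, _) in zip(names, combo)}
    prob = 1.0
    for _, p in combo:
        prob *= p
    realizations.append((levels, prob))

total = sum(p for _, p in realizations)
print(len(realizations), total)  # 12 scenarios whose probabilities sum to 1
```

Each realization would then be submitted to the flow simulator, and the resulting production profiles weighted by these probabilities give the probabilistic forecast.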

  17. The european flood alert system EFAS – Part 2: Statistical skill assessment of probabilistic and deterministic operational forecasts

    Directory of Open Access Journals (Sweden)

    J. C. Bartholmes

    2009-02-01

Full Text Available Since 2005 the European Flood Alert System (EFAS) has been producing probabilistic hydrological forecasts in pre-operational mode at the Joint Research Centre (JRC) of the European Commission. EFAS aims at increasing preparedness for floods in trans-national European river basins by providing medium-range deterministic and probabilistic flood forecasting information, from 3 to 10 days in advance, to national hydro-meteorological services.

    This paper is Part 2 of a study presenting the development and skill assessment of EFAS. In Part 1, the scientific approach adopted in the development of the system has been presented, as well as its basic principles and forecast products. In the present article, two years of existing operational EFAS forecasts are statistically assessed and the skill of EFAS forecasts is analysed with several skill scores. The analysis is based on the comparison of threshold exceedances between proxy-observed and forecasted discharges. Skill is assessed both with and without taking into account the persistence of the forecasted signal during consecutive forecasts.

Skill assessment approaches are mostly adopted from meteorology, and the analysis also compares probabilistic and deterministic aspects of EFAS. Furthermore, the utility of different skill scores is discussed and their strengths and shortcomings are illustrated. The analysis shows the benefit of incorporating past forecasts into the probability analysis for medium-range forecasts, which effectively increases the skill of the forecasts.
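A standard score for the threshold-exceedance comparison described above is the Brier score; a minimal sketch with made-up exceedance probabilities and binary proxy observations (not EFAS data), benchmarked against an uninformative climatological forecast:

```python
import numpy as np

def brier_score(prob_exceed, exceeded):
    """Brier score for forecast exceedance probabilities against binary
    (0/1) observations; lower is better."""
    p = np.asarray(prob_exceed, dtype=float)
    o = np.asarray(exceeded, dtype=float)
    return float(np.mean((p - o) ** 2))

# Illustrative: a sharp, reliable forecast vs. constant climatology.
obs = np.array([1, 0, 0, 1, 0, 0, 0, 1, 0, 0])
good = np.array([0.9, 0.1, 0.2, 0.8, 0.1, 0.0, 0.1, 0.7, 0.2, 0.1])
clim = np.full(10, obs.mean())
print(brier_score(good, obs), brier_score(clim, obs))
```

The skill of a forecast system is then often reported relative to the reference, e.g. as a Brier skill score 1 - BS_forecast / BS_climatology.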

  18. Benefits of spatiotemporal modeling for short-term wind power forecasting at both individual and aggregated levels

    DEFF Research Database (Denmark)

    Lenzi, Amanda; Steinsland, Ingelin; Pinson, Pierre

    2018-01-01

    The share of wind energy in total installed power capacity has grown rapidly in recent years. Producing accurate and reliable forecasts of wind power production, together with a quantification of the uncertainty, is essential to optimally integrate wind energy into power systems. We build...... spatiotemporal models for wind power generation and obtain full probabilistic forecasts from 15 min to 5 h ahead. Detailed analyses of forecast performances on individual wind farms and aggregated wind power are provided. The predictions from our models are evaluated on a data set from wind farms in western...... Denmark using a sliding window approach, for which estimation is performed using only the last available measurements. The case study shows that it is important to have a spatiotemporal model instead of a temporal one to achieve calibrated aggregated forecasts. Furthermore, spatiotemporal models have...
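The sliding-window evaluation mentioned above, where estimation at each forecast issue time uses only the most recent measurements, can be sketched as follows. The synthetic series and the simple AR(1) predictor are assumptions for illustration, not the authors' spatiotemporal model.

```python
import numpy as np

def sliding_window_eval(series, window=48, horizon=1):
    """At each time step, fit a simple AR(1) coefficient on only the last
    `window` observations and score the `horizon`-step-ahead forecast."""
    errors = []
    for t in range(window, len(series) - horizon):
        train = series[t - window:t]
        # Least-squares AR(1) fit on the window: x[i] ~ phi * x[i-1].
        phi = np.dot(train[:-1], train[1:]) / np.dot(train[:-1], train[:-1])
        pred = series[t - 1] * phi ** horizon
        errors.append((pred - series[t - 1 + horizon]) ** 2)
    return float(np.mean(errors))

# Synthetic AR(1) "wind power" series for the demonstration.
rng = np.random.default_rng(2)
x = np.zeros(500)
for i in range(1, 500):
    x[i] = 0.9 * x[i - 1] + 0.1 * rng.standard_normal()

mse = sliding_window_eval(x)
print(mse)
```

Because each fit uses only the trailing window, the evaluation mimics operational conditions where older data are discarded as the process drifts.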

  19. An integrated, probabilistic model for improved seasonal forecasting of agricultural crop yield under environmental uncertainty

    Directory of Open Access Journals (Sweden)

    Nathaniel K. Newlands

    2014-06-01

Full Text Available We present a novel forecasting method for generating agricultural crop yield forecasts at the seasonal and regional scale, integrating agroclimate variables and remotely-sensed indices. The method devises a multivariate statistical model to compute bias and uncertainty in forecasted yield at the Census of Agricultural Region (CAR) scale across the Canadian Prairies. The method uses robust variable selection to select the best predictors within spatial subregions. Markov Chain Monte Carlo (MCMC) simulation and random forest machine learning techniques are then integrated to generate sequential forecasts through the growing season. Cross-validation of the model was performed by hindcasting/backcasting and comparing its forecasts against available historical data (1987-2011) for spring wheat (Triticum aestivum L.). The model was also validated for the 2012 growing season by comparing its forecast skill at the CAR, provincial and Canadian Prairie region scales against available statistical survey data. Forecasted wheat yields were under-estimated by 1-4% (mean percent departure) in mid-season and over-estimated by 1% at the end of the growing season. This integrated methodology offers a consistent, generalizable approach for sequentially forecasting crop yield at the regional scale. It provides a statistically robust yet flexible way to concurrently adjust to data-rich and data-sparse situations, to adaptively select different predictors of yield under changing levels of environmental uncertainty, and to update forecasts sequentially so as to incorporate new data as they become available. This integrated method also provides additional statistical support for assessing the accuracy and reliability of model-based crop yield forecasts in time and space.

  20. Proof tests on reliability

    International Nuclear Information System (INIS)

    Mishima, Yoshitsugu

    1983-01-01

In order to obtain public understanding of nuclear power plants, tests should be carried out to prove the reliability and safety of present LWR plants. For example, the aseismicity of nuclear power plants must be verified by using a large-scale earthquake simulator. Reliability testing began in fiscal 1975, and the proof tests on steam generators and on PWR support and flexure pins against stress corrosion cracking have already been completed; the results have been highly appreciated internationally. The capacity factor of nuclear power plant operation in Japan rose to 80% in the summer of 1983, which, considering the periods of regular inspection, means operation at almost full capacity. Japanese LWR technology has now risen to the top place in the world after having overcome earlier defects. The significance of the reliability tests is to secure functioning until the age limit is reached, to confirm the correct forecast of deterioration processes, to confirm the effectiveness of remedies to defects, and to confirm the accuracy of predicting the behavior of facilities. The reliability of nuclear valves, fuel assemblies, the heat-affected zones in welding, reactor cooling pumps and electric instruments has been tested or is being tested. (Kako, I.)