WorldWideScience

Sample records for global probabilistic approach

  1. A probabilistic approach to examine the impacts of mitigation policies on future global PM emissions from on-road vehicles

    Yan, F.; Winijkul, E.; Bond, T. C.; Streets, D. G.

    2012-12-01

    Determination of future emission reduction potential is deficient, especially when uncertainty is considered. Mitigation measures for some economic sectors have been proposed, but few studies evaluate the amount of PM emission reduction that different strategies can achieve in future years. We attribute the absence of helpful mitigation strategy analysis to limitations in the technical detail of future emission scenarios, which make it impossible to relate technological or regulatory interventions to emission changes. The purpose of this work is to provide a better understanding of the potential benefits of mitigation policies in addressing global and regional emissions. We introduce a probabilistic approach to explore the impacts of retrofit and scrappage programs on global PM emissions from on-road vehicles in the coming decades. The approach combines scenario analysis, sensitivity analysis and Monte Carlo simulations. A dynamic model of vehicle population linked to emission characteristics, SPEW-Trend, is used to estimate future emissions and evaluate policies. Three basic questions are answered in this work: (1) what contribution can these two programs make to reducing global emissions in the future? (2) in which regions are such programs most and least effective in reducing emissions, and what features of the vehicle fleet cause these results? (3) what is the level of confidence in the projected emission reductions, given uncertain parameters describing the dynamic vehicle fleet?

  2. Probabilistic approach to mechanisms

    Sandler, BZ

    1984-01-01

    This book discusses the application of probabilistics to the investigation of mechanical systems. The book shows, for example, how random function theory can be applied directly to the investigation of random processes in the deflection of cam profiles, pitch or gear teeth, pressure in pipes, etc. The author also deals with some other technical applications of probabilistic theory, including, amongst others, those relating to pneumatic and hydraulic mechanisms and roller bearings. Many of the aspects are illustrated by examples of applications of the techniques under discussion.

  3. Probabilistic approaches to recommendations

    Barbieri, Nicola; Ritacco, Ettore

    2014-01-01

    The importance of accurate recommender systems has been widely recognized by academia and industry, and recommendation is rapidly becoming one of the most successful applications of data mining and machine learning. Understanding and predicting the choices and preferences of users is a challenging task: real-world scenarios involve users behaving in complex situations, where prior beliefs, specific tendencies, and reciprocal influences jointly contribute to determining the preferences of users toward huge amounts of information, services, and products. Probabilistic modeling represents a robust…

  4. Probabilistic modeling and global sensitivity analysis for CO2 storage in geological formations: a spectral approach

    Saad, Bilal Mohammed

    2017-09-18

    This work focuses on the simulation of CO2 storage in deep underground formations under uncertainty and seeks to understand the impact of uncertainties in reservoir properties on CO2 leakage. To simulate the process, a non-isothermal two-phase two-component flow system with equilibrium phase exchange is used. Since model evaluations are computationally intensive, instead of traditional Monte Carlo methods, we rely on polynomial chaos (PC) expansions for representation of the stochastic model response. A non-intrusive approach is used to determine the PC coefficients. We establish the accuracy of the PC representations within a reasonable error threshold through systematic convergence studies. In addition to characterizing the distributions of model observables, we compute probabilities of excess CO2 leakage. Moreover, we consider the injection rate as a design parameter and compute an optimum injection rate that ensures that the risk of excess pressure buildup at the leaky well remains below acceptable levels. We also provide a comprehensive analysis of sensitivities of CO2 leakage, where we compute the contributions of the random parameters, and their interactions, to the variance by computing first, second, and total order Sobol’ indices.
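
    The non-intrusive PC projection step lends itself to a one-dimensional illustration. The sketch below assumes a single standard-normal input and a hypothetical scalar stand-in f for the reservoir simulator; the quadrature order and the model are illustrative assumptions, not the paper's setup.

```python
import math

import numpy as np
from numpy.polynomial import hermite_e as He

def f(xi):
    # Hypothetical scalar stand-in; the real CO2-storage simulator is far
    # more complex and expensive.
    return np.exp(0.3 * xi)

order = 6
nodes, w = He.hermegauss(order + 1)       # Gauss-Hermite(e) quadrature rule
w = w / np.sqrt(2.0 * np.pi)              # renormalize to the N(0,1) measure

V = He.hermevander(nodes, order)          # He_0..He_order evaluated at nodes
norms = np.array([math.factorial(n) for n in range(order + 1)])  # E[He_n^2]
coeffs = (V * w[:, None]).T @ f(nodes) / norms   # non-intrusive projection

mean = coeffs[0]                                 # PC mean
variance = np.sum(coeffs[1:] ** 2 * norms[1:])   # PC variance decomposition
print(mean, variance)
```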

  5. Probabilistic modeling and global sensitivity analysis for CO2 storage in geological formations: a spectral approach

    Saad, Bilal Mohammed; Alexanderian, Alen; Prudhomme, Serge; Knio, Omar

    2017-01-01

    This work focuses on the simulation of CO2 storage in deep underground formations under uncertainty and seeks to understand the impact of uncertainties in reservoir properties on CO2 leakage. To simulate the process, a non-isothermal two-phase two-component flow system with equilibrium phase exchange is used. Since model evaluations are computationally intensive, instead of traditional Monte Carlo methods, we rely on polynomial chaos (PC) expansions for representation of the stochastic model response. A non-intrusive approach is used to determine the PC coefficients. We establish the accuracy of the PC representations within a reasonable error threshold through systematic convergence studies. In addition to characterizing the distributions of model observables, we compute probabilities of excess CO2 leakage. Moreover, we consider the injection rate as a design parameter and compute an optimum injection rate that ensures that the risk of excess pressure buildup at the leaky well remains below acceptable levels. We also provide a comprehensive analysis of sensitivities of CO2 leakage, where we compute the contributions of the random parameters, and their interactions, to the variance by computing first, second, and total order Sobol’ indices.

  6. Global/local methods for probabilistic structural analysis

    Millwater, H. R.; Wu, Y.-T.

    1993-04-01

    A probabilistic global/local method is proposed to reduce the computational requirements of probabilistic structural analysis. A coarser global model is used for most of the computations, with a more refined local model used only at key probabilistic conditions. The global model is used to establish the cumulative distribution function (cdf) and the Most Probable Point (MPP). The local model then uses the predicted MPP to adjust the cdf value. The global/local method is used within the advanced mean value probabilistic algorithm. The local model can be more refined than the global model in terms of finer mesh, smaller time step, tighter tolerances, etc., and can be used with linear or nonlinear models. The basis for this approach is described in terms of the correlation between the global and local models, which can be estimated from the global and local MPPs. A numerical example is presented using the NESSUS probabilistic structural analysis program with the finite element method used for the structural modeling. The results clearly indicate significant computer savings with minimal loss in accuracy.

  7. Probabilistic approach to EMP assessment

    Bevensee, R.M.; Cabayan, H.S.; Deadrick, F.J.; Martin, L.C.; Mensing, R.W.

    1980-09-01

    The development of nuclear EMP hardness requirements must account for uncertainties in the environment, in interaction and coupling, and in the susceptibility of subsystems and components. Typical uncertainties of the last two kinds are briefly summarized, and an assessment methodology is outlined, based on a probabilistic approach that encompasses the basic concepts of reliability. It is suggested that statements of survivability be made compatible with system reliability. Validation of the approach taken for simple antenna/circuit systems is performed with experiments and calculations that involve a Transient Electromagnetic Range, numerical antenna modeling, separate device failure data, and a failure analysis computer program.

  8. Probabilistic Approaches to Video Retrieval

    Ianeva, Tzvetanka; Boldareva, L.; Westerveld, T.H.W.; Cornacchia, Roberto; Hiemstra, Djoerd; de Vries, A.P.

    Our experiments for TRECVID 2004 further investigate the applicability of the so-called "Generative Probabilistic Models" to video retrieval. TRECVID 2003 results demonstrated that mixture models computed from video shot sequences improve the precision of "query by examples" results when…

  9. A global empirical system for probabilistic seasonal climate prediction

    Eden, J. M.; van Oldenborgh, G. J.; Hawkins, E.; Suckling, E. B.

    2015-12-01

    Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a simple empirical system based on multiple linear regression for producing probabilistic forecasts of seasonal surface air temperature and precipitation across the globe. The global CO2-equivalent concentration is taken as the primary predictor; subsequent predictors, including large-scale modes of variability in the climate system and local-scale information, are selected on the basis of their physical relationship with the predictand. The focus given to the climate change signal as a source of skill and the probabilistic nature of the forecasts produced constitute a novel approach to global empirical prediction. Hindcasts for the period 1961-2013 are validated against observations using deterministic (correlation of seasonal means) and probabilistic (continuous ranked probability skill score) metrics. Good skill is found in many regions, particularly for surface air temperature and most notably in much of Europe during the spring and summer seasons. For precipitation, skill is generally limited to regions with known El Niño-Southern Oscillation (ENSO) teleconnections. The system is used in a quasi-operational framework to generate empirical seasonal forecasts on a monthly basis.
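
    A minimal sketch of the ingredients named above (a linear regression on a CO2-like predictor, a Gaussian probabilistic hindcast, and CRPS scoring) is given below; the co2 and temp series are synthetic placeholders, not the study's data.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
years = np.arange(1961, 2014)
co2 = 330.0 + 1.5 * (years - 1961)                    # assumed predictor
temp = 0.01 * co2 + rng.normal(0.0, 0.3, years.size)  # assumed predictand

crps = []
for i in range(years.size):                 # leave-one-out hindcasts
    mask = np.arange(years.size) != i
    A = np.column_stack([np.ones(mask.sum()), co2[mask]])
    beta, res, *_ = np.linalg.lstsq(A, temp[mask], rcond=None)
    mu = beta[0] + beta[1] * co2[i]                 # forecast mean
    sigma = np.sqrt(res[0] / (mask.sum() - 2))      # residual spread
    z = (temp[i] - mu) / sigma
    # closed-form CRPS of a Gaussian forecast (Gneiting & Raftery, 2007)
    crps.append(sigma * (z * (2 * norm.cdf(z) - 1) + 2 * norm.pdf(z)
                         - 1 / np.sqrt(np.pi)))
print(np.mean(crps))   # lower is better
```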

  10. Probabilistic approach to earthquake prediction.

    G. D'Addezio

    2002-06-01

    The evaluation of any earthquake forecast hypothesis requires the application of rigorous statistical methods. It implies a univocal definition of the model characterising the anomaly or precursor concerned, so that it can be objectively recognised in any circumstance and by any observer. A valid forecast hypothesis is expected to maximise successes and minimise false alarms. The probability gain associated with a precursor is also a popular way to estimate the quality of predictions based on that precursor. Some scientists make use of a statistical approach based on the computation of the likelihood of an observed realisation of seismic events, and on the comparison of the likelihoods obtained under different hypotheses. This method can be extended to algorithms that allow the computation of the density distribution of the conditional probability of earthquake occurrence in space, time and magnitude. Whatever method is chosen for building up a new hypothesis, the final assessment of its validity should be carried out by a test on a new and independent set of observations. The implementation of this test could, however, be problematic for seismicity characterised by long-term recurrence intervals. Even using the historical record, which may span time windows varying from a few centuries to a few millennia, we have a low probability of catching more than one or two events on the same fault. By extending the record of past earthquakes back in time up to several millennia, paleoseismology represents a great opportunity to study how earthquakes recur through time and thus provides innovative contributions to time-dependent seismic hazard assessment. Sets of paleoseismologically dated earthquakes have been established for some faults in the Mediterranean area: the Irpinia fault in Southern Italy, the Fucino fault in Central Italy, the El Asnam fault in Algeria and the Skinos fault in Central Greece. By using the age of the…

  11. Probabilistic approach to manipulator kinematics and dynamics

    Rao, S.S.; Bhatti, P.K.

    2001-01-01

    A high performance, high speed robotic arm must be able to manipulate objects with a high degree of accuracy and repeatability. As with any other physical system, there are a number of factors causing uncertainties in the behavior of a robotic manipulator. These factors include manufacturing and assembling tolerances, and errors in the joint actuators and controllers. In order to study the effect of these uncertainties on the robotic end-effector and to obtain a better insight into the manipulator behavior, the manipulator kinematics and dynamics are modeled using a probabilistic approach. Based on the probabilistic model, kinematic and dynamic performance criteria are defined to provide measures of the behavior of the robotic end-effector. Techniques are presented to compute the kinematic and dynamic reliabilities of the manipulator. The effects of tolerances associated with the various manipulator parameters on the reliabilities are studied. Numerical examples are presented to illustrate the procedures

  12. Global assessment of predictability of water availability: A bivariate probabilistic Budyko analysis

    Wang, Weiguang; Fu, Jianyu

    2018-02-01

    Estimating continental water availability is of great importance for water resources management, in terms of maintaining ecosystem integrity and sustaining societal development. To quantify the predictability of water availability more accurately, a bivariate probabilistic Budyko approach was developed on the basis of the univariate probabilistic Budyko framework, using a copula-based joint distribution model to account for the dependence between the parameter ω of Wang-Tang's equation and the Normalized Difference Vegetation Index (NDVI), and was applied globally. The results indicate that predictive performance for global water availability is conditional on the climatic setting. In comparison with the simple univariate distribution, the bivariate one produces a narrower interquartile range for the same global dataset, especially in regions with higher NDVI values, highlighting the importance of developing the joint distribution by taking into account the dependence structure of the parameter ω and NDVI, which provides a more accurate probabilistic evaluation of water availability.
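
    A hedged sketch of the copula step follows: a Gaussian copula couples the parameter ω and NDVI before marginal transforms. The marginals, correlation value and copula family are assumptions for illustration; the study fits these quantities to global data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
rho = 0.6                                     # assumed dependence strength
L = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))
z = L @ rng.standard_normal((2, 10_000))      # correlated Gaussian pairs
u = stats.norm.cdf(z)                         # uniform marginals (the copula)

omega = stats.gamma(a=4.0, scale=0.6).ppf(u[0])   # assumed omega marginal
ndvi = stats.beta(a=2.0, b=2.0).ppf(u[1])         # assumed NDVI marginal

# conditioning omega on an observed NDVI class narrows the spread of a
# Budyko-based water-availability estimate
print(np.corrcoef(omega, ndvi)[0, 1])
```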

  13. PROBABILISTIC APPROACH OF STABILIZED ELECTROMAGNETIC FIELD EFFECTS

    FELEA. I.

    2017-09-01

    The effects of the omnipresence of the electromagnetic field are certain and recognized. Assessing these effects as accurately as possible, since they characterize random phenomena, requires statistical-probabilistic calculation. This paper aims at assessing the probability of exceeding the admissible values of the characteristic quantities of the electromagnetic field: magnetic induction and electric field strength. The first part justifies the need for concern and specifies how to approach it. The mathematical model of approach and treatment is presented in the second part of the paper, and the results obtained with reference to 14 power stations are synthesized in the third part. The last part formulates the conclusions of the evaluations.

  14. Probabilistic interpretation of data a physicist's approach

    Miller, Guthrie

    2013-01-01

    This book is a physicist's approach to the interpretation of data using Markov Chain Monte Carlo (MCMC). The concepts are derived from first principles using a style of mathematics that quickly elucidates the basic ideas, sometimes with the aid of examples. Probabilistic data interpretation is a straightforward problem involving conditional probability. A prior probability distribution is essential, and examples are given. In this small book (200 pages) the reader is led from the most basic concepts of mathematical probability all the way to parallel-processing algorithms for Markov Chain Monte Carlo. Fortran source code (for eigenvalue analysis of finite discrete Markov Chains, for MCMC, and for nonlinear least squares) is included with the supplementary material for this book (available online).
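
    The MCMC core of such an approach can be sketched in a few lines. The random-walk Metropolis example below uses Python rather than the book's Fortran, and its flat prior, Gaussian likelihood and synthetic data are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
data = rng.normal(3.0, 1.0, size=20)          # synthetic observations

def log_post(theta):
    if not 0.0 <= theta <= 10.0:              # flat prior on [0, 10]
        return -np.inf
    return -0.5 * np.sum((data - theta) ** 2)  # Gaussian likelihood, sigma=1

chain, theta = [], 5.0
lp = log_post(theta)
for _ in range(10_000):
    prop = theta + rng.normal(0.0, 0.5)       # symmetric proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis accept/reject
        theta, lp = prop, lp_prop
    chain.append(theta)
print(np.mean(chain[2000:]))                  # posterior mean after burn-in
```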

  15. A probabilistic approach to controlling crevice chemistry

    Millett, P.J.; Brobst, G.E.; Riddle, J.

    1995-01-01

    It has been generally accepted that the corrosion of steam generator tubing could be reduced if the local pH in regions where impurities concentrate could be controlled. The practice of molar ratio control is based on this assumption. Unfortunately, due to the complexity of the crevice concentration process, efforts to model the crevice chemistry based on bulk water conditions are quite uncertain. In-situ monitoring of the crevice chemistry is desirable, but may not be achievable in the near future. The current methodology for assessing the crevice chemistry is to monitor the hideout return chemistry when the plant shuts down. This approach also has its shortcomings, but may provide sufficient data to evaluate whether the crevice pH is in a desirable range. In this paper, an approach to controlling the crevice chemistry based on a target molar ratio indicator is introduced. The molar ratio indicator is based on what is believed to be the most reliable hideout return data. Probabilistic arguments are then used to show that the crevice pH will most likely be in a desirable range when the target molar ratio is achieved.

  16. Semantics of probabilistic processes an operational approach

    Deng, Yuxin

    2015-01-01

    This book discusses the semantic foundations of concurrent systems with nondeterministic and probabilistic behaviour. Particular attention is given to clarifying the relationship between testing and simulation semantics and characterising bisimulations from metric, logical, and algorithmic perspectives. Besides presenting recent research outcomes in probabilistic concurrency theory, the book exemplifies the use of many mathematical techniques to solve problems in computer science, which is intended to be accessible to postgraduate students in Computer Science and Mathematics. It can also be used…

  17. Probabilistic approaches for geotechnical site characterization and slope stability analysis

    Cao, Zijun; Li, Dianqing

    2017-01-01

    This is the first book to revisit geotechnical site characterization from a probabilistic point of view and to provide rational tools to probabilistically characterize geotechnical properties and underground stratigraphy using the limited information obtained from a specific site. The book not only provides new probabilistic approaches for geotechnical site characterization and slope stability analysis, but also tackles the difficulties in the practical implementation of these approaches. In addition, it develops efficient Monte Carlo simulation approaches for slope stability analysis and implements these approaches in a commonly available spreadsheet environment. These approaches and the software package are readily available to geotechnical practitioners and relieve them of the underlying reliability computation algorithms. Readers will find the information a non-specialist needs to determine project-specific statistics of geotechnical properties and to perform probabilistic analyses of slope stability.

  18. A sampling-based approach to probabilistic pursuit evasion

    Mahadevan, Aditya; Amato, Nancy M.

    2012-01-01

    Probabilistic roadmaps (PRMs) are a sampling-based approach to motion planning that encodes feasible paths through the environment using a graph created from a subset of valid positions. Prior research has shown that PRMs can be augmented with useful information to model interesting scenarios related to multi-agent interaction and coordination.
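
    A toy PRM construction along these lines is sketched below; the single disk obstacle, connection radius and sample count are assumptions for illustration.

```python
import itertools

import numpy as np

rng = np.random.default_rng(3)
center, radius = np.array([0.5, 0.5]), 0.2    # one circular obstacle

def valid(p):
    return np.linalg.norm(p - center) > radius

def segment_free(a, b, steps=20):
    # straight-line collision check by sampling along the segment
    return all(valid(a + t * (b - a)) for t in np.linspace(0.0, 1.0, steps))

nodes = [p for p in rng.random((300, 2)) if valid(p)]   # valid samples
edges = {i: [] for i in range(len(nodes))}
for i, j in itertools.combinations(range(len(nodes)), 2):
    if (np.linalg.norm(nodes[i] - nodes[j]) < 0.15
            and segment_free(nodes[i], nodes[j])):
        edges[i].append(j)
        edges[j].append(i)

# a graph search (BFS/Dijkstra) over `edges` then answers motion queries
print(len(nodes), sum(len(v) for v in edges.values()) // 2)
```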

  19. A Markov Chain Approach to Probabilistic Swarm Guidance

    Acikmese, Behcet; Bayard, David S.

    2012-01-01

    This paper introduces a probabilistic guidance approach for the coordination of swarms of autonomous agents. The main idea is to drive the swarm to a prescribed density distribution in a prescribed region of the configuration space. In its simplest form, the probabilistic approach is completely decentralized and does not require communication or collaboration between agents. Agents make statistically independent probabilistic decisions based solely on their own state, which ultimately guide the swarm to the desired density distribution in the configuration space. In addition to being completely decentralized, the probabilistic guidance approach has a novel autonomous self-repair property: once the desired swarm density distribution is attained, the agents automatically repair any damage to the distribution without collaborating and without any knowledge about the damage.
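
    The decentralized mechanism can be sketched as follows: each agent independently jumps between bins according to a Markov matrix whose stationary distribution equals the desired density. The Metropolis-Hastings synthesis of the matrix below is an assumption; the paper derives its own transition matrices.

```python
import numpy as np

rng = np.random.default_rng(4)
target = np.array([0.1, 0.2, 0.3, 0.4])       # desired density over 4 bins
n = target.size

M = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        if i != j:                             # uniform proposal, MH accept
            M[i, j] = (1.0 / n) * min(1.0, target[j] / target[i])
    M[i, i] = 1.0 - M[i].sum()                 # self-loop keeps rows stochastic

agents = rng.integers(0, n, size=5000)         # arbitrary initial placement
for _ in range(50):                            # each agent decides on its own
    agents = np.array([rng.choice(n, p=M[a]) for a in agents])

print(np.bincount(agents, minlength=n) / agents.size)  # approaches target
```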

  20. Overview of the probabilistic risk assessment approach

    Reed, J.W.

    1985-01-01

    The techniques of probabilistic risk assessment (PRA) are applicable to Department of Energy facilities. The background and techniques of PRA are given with special attention to seismic, wind and flooding external events. A specific application to seismic events is provided to demonstrate the method. However, the PRA framework is applicable also to wind and external flooding. 3 references, 8 figures, 1 table

  1. A random probabilistic approach to seismic nuclear power plant analysis

    Romo, M.P.

    1985-01-01

    A probabilistic method for the seismic analysis of structures which takes into account the random nature of earthquakes and the uncertainties in soil parameters is presented in this paper. The method was developed by combining elements of perturbation theory, random vibration theory and the complex response method. The probabilistic method is evaluated by comparing the responses of a single-degree-of-freedom system computed with this approach and with the Monte Carlo method. (orig.)

  2. A probabilistic approach to crack instability

    Chudnovsky, A.; Kunin, B.

    1989-01-01

    A probabilistic model of brittle fracture is examined with reference to two-dimensional problems. The model is illustrated by using experimental data obtained for 25 macroscopically identical specimens made of short-fiber-reinforced composites. It is shown that the model proposed here provides a predictive formalism for the probability distributions of critical crack depth, critical loads, and crack arrest depths. It also provides similarity criteria for small-scale testing.

  3. Future trends in flood risk in Indonesia - A probabilistic approach

    Muis, Sanne; Guneralp, Burak; Jongman, Brenden; Ward, Philip

    2014-05-01

    Indonesia is one of the 10 most populous countries in the world and is highly vulnerable to (river) flooding. Catastrophic floods occur on a regular basis; total estimated damages were US$0.8 bn in 2010 and US$3 bn in 2013. Large parts of Greater Jakarta, the capital city, are annually subject to flooding. Flood risks (i.e. the product of hazard, exposure and vulnerability) are increasing due to rapid increases in exposure, such as strong population growth and ongoing economic development. The increase in risk may also be amplified by increasing flood hazards, such as increasing flood frequency and intensity due to climate change and land subsidence. The implementation of adaptation measures, such as the construction of dykes and strategic urban planning, may counteract these increasing trends. However, despite its importance for adaptation planning, a comprehensive assessment of current and future flood risk in Indonesia is lacking. This contribution addresses this issue and aims to provide insight into how socio-economic trends and climate change projections may shape future flood risks in Indonesia. Flood risks were calculated using an adapted version of the GLOFRIS global flood risk assessment model. Using this approach, we produced probabilistic maps of flood risks (i.e. annual expected damage) at a resolution of 30"x30" (ca. 1 km x 1 km at the equator). To represent flood exposure, we produced probabilistic projections of urban growth in a Monte Carlo fashion based on probability density functions of projected population and GDP values for 2030. To represent flood hazard, inundation maps were computed using the hydrological-hydraulic component of GLOFRIS. These maps show flood inundation extent and depth for several return periods and were produced for several combinations of GCMs and future socioeconomic scenarios. Finally, the implementation of different adaptation strategies was incorporated into the model to explore to what extent adaptation may be able to…

  4. Probabilistic Forecasting of Photovoltaic Generation: An Efficient Statistical Approach

    Wan, Can; Lin, Jin; Song, Yonghua

    2017-01-01

    This letter proposes a novel efficient probabilistic forecasting approach to accurately quantify the variability and uncertainty of the power production from photovoltaic (PV) systems. Distinguished from most existing models, a linear programming based prediction interval construction model for PV power generation is proposed based on extreme learning machine and quantile regression, featuring high reliability and computational efficiency. The proposed approach is validated through numerical studies on PV data from Denmark.

  5. Deterministic and probabilistic approach to safety analysis

    Heuser, F.W.

    1980-01-01

    The examples discussed in this paper show that reliability analysis methods can be applied fairly well to interpret deterministic safety criteria in quantitative terms. For a further, improved extension of applied reliability analysis, it has turned out that the influence of operational and control systems and of component protection devices should be considered in detail with the aid of reliability analysis methods. Of course, an extension of probabilistic analysis must be accompanied by further development of the methods and a broadening of the data base. (orig.)

  6. A global probabilistic tsunami hazard assessment from earthquake sources

    Davies, Gareth; Griffin, Jonathan; Lovholt, Finn; Glimsdal, Sylfest; Harbitz, Carl; Thio, Hong Kie; Lorito, Stefano; Basili, Roberto; Selva, Jacopo; Geist, Eric L.; Baptista, Maria Ana

    2017-01-01

    Large tsunamis occur infrequently but have the capacity to cause enormous numbers of casualties, damage to the built environment and critical infrastructure, and economic losses. A sound understanding of tsunami hazard is required to underpin management of these risks, and while tsunami hazard assessments are typically conducted at regional or local scales, globally consistent assessments are required to support international disaster risk reduction efforts, and can serve as a reference for local and regional studies. This study presents a global-scale probabilistic tsunami hazard assessment (PTHA), extending previous global-scale assessments based largely on scenario analysis. Only earthquake sources are considered, as they represent about 80% of the recorded damaging tsunami events. Globally extensive estimates of tsunami run-up height are derived at various exceedance rates, and the associated uncertainties are quantified. Epistemic uncertainties in the exceedance rates of large earthquakes often lead to large uncertainties in tsunami run-up. Deviations between modelled tsunami run-up and event observations are quantified, and found to be larger than suggested in previous studies. Accounting for these deviations in PTHA is important, as it leads to a pronounced increase in predicted tsunami run-up for a given exceedance rate.

  7. Variational approach to probabilistic finite elements

    Belytschko, T.; Liu, W. K.; Mani, A.; Besterfield, G.

    1991-08-01

    Probabilistic finite element methods (PFEM), synthesizing the power of finite element methods with second-moment techniques, are formulated for various classes of problems in structural and solid mechanics. Time-invariant random materials, geometric properties and loads are incorporated in terms of their fundamental statistics viz. second-moments. Analogous to the discretization of the displacement field in finite element methods, the random fields are also discretized. Preserving the conceptual simplicity, the response moments are calculated with minimal computations. By incorporating certain computational techniques, these methods are shown to be capable of handling large systems with many sources of uncertainties. By construction, these methods are applicable when the scale of randomness is not very large and when the probabilistic density functions have decaying tails. The accuracy and efficiency of these methods, along with their limitations, are demonstrated by various applications. Results obtained are compared with those of Monte Carlo simulation and it is shown that good accuracy can be obtained for both linear and nonlinear problems. The methods are amenable to implementation in deterministic FEM based computer codes.
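
    A first-order second-moment calculation, the simplest instance of the second-moment techniques mentioned above, is sketched below on a one-element bar rather than a full finite element model; all numbers are illustrative assumptions.

```python
import numpy as np

L_bar, A = 2.0, 1.0e-4                      # length [m], cross-section [m^2]
mu = np.array([2.1e11, 1.0e4])              # mean Young's modulus [Pa], load [N]
cov = np.diag([(0.05 * mu[0]) ** 2,         # 5% and 10% coefficients of
               (0.10 * mu[1]) ** 2])        # variation, independent inputs

def u(x):
    # tip displacement of an axially loaded bar: u = P*L/(E*A)
    E, P = x
    return P * L_bar / (E * A)

# response gradient at the mean (analytic here; FE sensitivities in general)
grad = np.array([-mu[1] * L_bar / (mu[0] ** 2 * A),
                 L_bar / (mu[0] * A)])

mean_u = u(mu)                              # first-order mean
std_u = np.sqrt(grad @ cov @ grad)          # first-order standard deviation
print(mean_u, std_u)
```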

  8. Global Infrasound Association Based on Probabilistic Clutter Categorization

    Arora, Nimar; Mialle, Pierrick

    2016-04-01

    The IDC advances its methods and continuously improves its automatic system for infrasound technology. The IDC focuses on enhancing the automatic system for the identification of valid signals and the optimization of the network detection threshold by identifying ways to refine signal characterization methodology and association criteria. An objective of this study is to reduce the number of associated infrasound arrivals that are rejected from the automatic bulletins when generating the reviewed event bulletins. Indeed, a considerable number of signal detections are due to local clutter sources such as microbaroms, waterfalls, dams, gas flares and surf (ocean breaking waves). These sources are either too diffuse or too local to form events. Worse still, the repetitive nature of this clutter leads to a large number of false event hypotheses due to the random matching of clutter at multiple stations. Previous studies, for example [1], have worked on categorization of clutter using long-term trends of detection azimuth, frequency and amplitude at each station. In this work we continue the same line of reasoning to build a probabilistic model of clutter that is used as part of NETVISA [2], a Bayesian approach to network processing. The resulting model is a fusion of seismic, hydroacoustic and infrasound processing built on a unified probabilistic framework. References: [1] Infrasound categorization: Towards a statistics-based approach. J. Vergoz, P. Gaillard, A. Le Pichon, N. Brachet, and L. Ceranna. ITW 2011. [2] NETVISA: Network Processing Vertically Integrated Seismic Analysis. N. S. Arora, S. Russell, and E. Sudderth. BSSA 2013.

  9. The probabilistic approach and the deterministic licensing procedure

    Fabian, H.; Feigel, A.; Gremm, O.

    1984-01-01

    If safety goals are given, the creativity of the engineers is necessary to transform the goals into actual safety measures. That is, safety goals are not sufficient for the derivation of a safety concept; the licensing process asks ''What does a safe plant look like?'' The answer cannot be given by a probabilistic procedure, but needs definite deterministic statements; the conclusion is that the licensing process needs a deterministic approach. The probabilistic approach should be used in a complementary role in cases where deterministic criteria are not complete, not detailed enough or not consistent, and where additional arguments for decision making in connection with the adequacy of a specific measure are necessary. But also in these cases the probabilistic answer has to be transformed into a clear deterministic statement. (orig.)

  10. Information fusion in signal and image processing major probabilistic and non-probabilistic numerical approaches

    Bloch, Isabelle

    2010-01-01

    The area of information fusion has grown considerably during the last few years, leading to a rapid and impressive evolution. In such fast-moving times, it is important to take stock of the changes that have occurred. As such, this books offers an overview of the general principles and specificities of information fusion in signal and image processing, as well as covering the main numerical methods (probabilistic approaches, fuzzy sets and possibility theory and belief functions).

  11. A sampling-based approach to probabilistic pursuit evasion

    Mahadevan, Aditya

    2012-05-01

    Probabilistic roadmaps (PRMs) are a sampling-based approach to motion-planning that encodes feasible paths through the environment using a graph created from a subset of valid positions. Prior research has shown that PRMs can be augmented with useful information to model interesting scenarios related to multi-agent interaction and coordination. © 2012 IEEE.

  12. PROBABILISTIC APPROACH TO OBJECT DETECTION AND RECOGNITION FOR VIDEOSTREAM PROCESSING

    Volodymyr Kharchenko

    2017-07-01

    Purpose: The research results presented are aimed at improving the theoretical basics of computer vision and artificial intelligence of dynamical systems. The proposed approach to object detection and recognition is based on probabilistic fundamentals to ensure the required level of correct object recognition. Methods: The presented approach is grounded in probabilistic methods, statistical methods of probability density estimation, and computer-based simulation at the verification stage of development. Results: The proposed approach for object detection and recognition for video stream data processing has shown several advantages in comparison with existing methods due to its simple realization and short data processing time. The presented results of experimental verification look plausible for object detection and recognition in video streams. Discussion: The approach can be implemented in dynamical systems within changeable environments, such as remotely piloted aircraft systems, and can be a part of artificial intelligence in navigation and control systems.

  13. Application of probabilistic risk based optimization approaches in environmental restoration

    Goldammer, W.

    1995-01-01

    The paper presents a general approach to site-specific risk assessments and optimization procedures. In order to account for uncertainties in the assessment of the current situation and future developments, optimization parameters are treated as probabilistic distributions. The assessments are performed within the framework of a cost-benefit analysis. Radiation hazards and conventional risks are treated within an integrated approach. Special consideration is given to consequences of low-probability events such as earthquakes or major floods. Risks and financial costs are combined into an overall figure of detriment, allowing one to distinguish between the benefits of available reclamation options. The probabilistic analysis uses a Monte Carlo simulation technique. The paper demonstrates the applicability of this approach in aiding reclamation planning, using an example from the German reclamation program for uranium mining and milling sites.

  14. A probabilistic approach to delineating functional brain regions

    Kalbitzer, Jan; Svarer, Claus; Frokjaer, Vibe G

    2009-01-01

    The purpose of this study was to develop a reliable observer-independent approach to delineating volumes of interest (VOIs) for functional brain regions that are not identifiable on structural MR images. The case is made for the raphe nuclei, a collection of nuclei situated in the brain stem known to be densely packed with serotonin transporters (the 5-hydroxytryptaminic [5-HTT] system). METHODS: A template set for the raphe nuclei, based on their high content of 5-HTT as visualized in parametric (11)C-labeled 3-amino-4-(2-dimethylaminomethyl-phenylsulfanyl)-benzonitrile PET images, was created for 10 healthy subjects. The templates were subsequently included in the region sets used in a previously published automatic MRI-based approach to create an observer- and activity-independent probabilistic VOI map. The probabilistic map approach was tested in a different group of 10 subjects and compared…

  15. Probabilistic approaches to life prediction of nuclear plant structural components

    Villain, B.; Pitner, P.; Procaccia, H.

    1996-01-01

    In the last decade there has been an increasing interest at EDF in developing and applying probabilistic methods for a variety of purposes. In the field of structural integrity and reliability they are used to evaluate the effect of deterioration due to aging mechanisms, mainly on major passive structural components such as steam generators, pressure vessels and piping in nuclear plants. Because there can be numerous uncertainties involved in an assessment of the performance of these structural components, probabilistic methods provide an attractive alternative or supplement to more conventional deterministic methods. The benefits of a probabilistic approach are the clear treatment of uncertainty and the possibility to perform sensitivity studies from which it is possible to identify and quantify the effect of key factors and mitigative actions. They thus provide information to support effective decisions to optimize In-Service Inspection planning and maintenance strategies and for realistic lifetime prediction or reassessment. The purpose of the paper is to discuss and illustrate the methods available at EDF for probabilistic component life prediction. This includes a presentation of software tools in classical, Bayesian and structural reliability, and an application on two case studies (steam generator tube bundle, reactor pressure vessel). (authors)

  16. Probabilistic approaches to life prediction of nuclear plant structural components

    Villain, B.; Pitner, P.; Procaccia, H.

    1996-01-01

    In the last decade there has been an increasing interest at EDF in developing and applying probabilistic methods for a variety of purposes. In the field of structural integrity and reliability they are used to evaluate the effect of deterioration due to aging mechanisms, mainly on major passive structural components such as steam generators, pressure vessels and piping in nuclear plants. Because there can be numerous uncertainties involved in an assessment of the performance of these structural components, probabilistic methods provide an attractive alternative or supplement to more conventional deterministic methods. The benefits of a probabilistic approach are the clear treatment of uncertainty and the possibility to perform sensitivity studies from which it is possible to identify and quantify the effect of key factors and mitigative actions. They thus provide information to support effective decisions to optimize In-Service Inspection planning and maintenance strategies and for realistic lifetime prediction or reassessment. The purpose of the paper is to discuss and illustrate the methods available at EDF for probabilistic component life prediction. This includes a presentation of software tools in classical, Bayesian and structural reliability, and an application on two case studies (steam generator tube bundle, reactor pressure vessel)

  17. Global optimization of maintenance and surveillance testing based on reliability and probabilistic safety assessment. Research project

    Martorell, S.; Serradell, V.; Munoz, A.; Sanchez, A.

    1997-01-01

    The background, objective, scope, detailed working plan, follow-up, and final product of the project ''Global optimization of maintenance and surveillance testing based on reliability and probabilistic safety assessment'' are described.

  18. Standardized approach for developing probabilistic exposure factor distributions

    Maddalena, Randy L.; McKone, Thomas E.; Sohn, Michael D.

    2003-03-01

    The effectiveness of a probabilistic risk assessment (PRA) depends critically on the quality of input information that is available to the risk assessor and specifically on the probabilistic exposure factor distributions that are developed and used in the exposure and risk models. Deriving probabilistic distributions for model inputs can be time consuming and subjective. The absence of a standard approach for developing these distributions can result in PRAs that are inconsistent and difficult to review by regulatory agencies. We present an approach that reduces subjectivity in the distribution development process without limiting the flexibility needed to prepare relevant PRAs. The approach requires two steps. First, we analyze data pooled at a population scale to (1) identify the most robust demographic variables within the population for a given exposure factor, (2) partition the population data into subsets based on these variables, and (3) construct archetypal distributions for each subpopulation. Second, we sample from these archetypal distributions according to site- or scenario-specific conditions to simulate exposure factor values and use these values to construct the scenario-specific input distribution. It is envisaged that the archetypal distributions from step 1 will be generally applicable so risk assessors will not have to repeatedly collect and analyze raw data for each new assessment. We demonstrate the approach for two commonly used exposure factors--body weight (BW) and exposure duration (ED)--using data for the U.S. population. For these factors we provide a first set of subpopulation based archetypal distributions along with methodology for using these distributions to construct relevant scenario-specific probabilistic exposure factor distributions.
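
    The two-step procedure can be sketched directly: fit archetypal distributions per subpopulation, then mix them according to scenario-specific conditions. The synthetic body-weight data and the lognormal family below are assumptions, not the paper's fitted archetypes.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
bw_data = {                                   # body weight [kg] by subgroup
    "adult_male": rng.normal(88.0, 13.0, 500),
    "adult_female": rng.normal(75.0, 15.0, 500),
}

archetypes = {}
for group, x in bw_data.items():              # step 1: archetypal fits
    shape, loc, scale = stats.lognorm.fit(x, floc=0.0)
    archetypes[group] = stats.lognorm(shape, loc, scale)

# step 2: a site assumed to be 60% male / 40% female
weights = {"adult_male": 0.6, "adult_female": 0.4}
samples = np.concatenate([
    dist.rvs(size=int(10_000 * weights[g]), random_state=rng)
    for g, dist in archetypes.items()
])
print(samples.mean(), np.percentile(samples, 95))
```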

  19. Probabilistic Approach to Conditional Probability of Release of Hazardous Materials from Railroad Tank Cars during Accidents

    2009-10-13

    This paper describes a probabilistic approach to estimate the conditional probability of release of hazardous materials from railroad tank cars during train accidents. Monte Carlo methods are used in developing a probabilistic model to simulate head ...

  20. A probabilistic approach to Radiological Environmental Impact Assessment

    Avila, Rodolfo; Larsson, Carl-Magnus

    2001-01-01

    Since a radiological environmental impact assessment typically relies on limited data and poorly founded extrapolation methods, point estimates, as implied by a deterministic approach, do not suffice. To be of practical use for risk management, it is necessary to quantify the uncertainty margins of the estimates as well. In this paper we discuss how to work out a probabilistic approach for dealing with uncertainties in assessments of the radiological risks to non-human biota from radioactive contamination. Possible strategies for deriving the relevant probability distribution functions from available empirical data and theoretical knowledge are outlined.

  1. Globalization - Different approaches

    Viorica Puscaciu

    2014-11-01

    In this paper we investigate different approaches to the globalization phenomenon. Despite geographical distances, the links between people grow ever stronger in different ways and on different planes: through technology and through political, economic and cultural world events, among many other aspects. We examine the link between globalization and democracy and its impact on the most important social and economic matters. We also consider the impact of the internet revolution and its corollary, e-commerce, together with its sometimes unpredictable consequences. Another problem analysed is that of governments trying, and sometimes succeeding, to control the money, products, people and ideas that move freely inside national frontiers, thereby slowing or stopping progress. Nevertheless, this global interaction between people also creates phenomena of insecurity in different forms: terrorism, trafficking of arms and drugs, economic aggression against the environment, and other inconvenient facts and situations.

  2. Probabilistic Approach to Enable Extreme-Scale Simulations under Uncertainty and System Faults. Final Technical Report

    Knio, Omar [Duke Univ., Durham, NC (United States). Dept. of Mechanical Engineering and Materials Science

    2017-05-05

    The current project develops a novel approach that uses a probabilistic description to capture the current state of knowledge about the computational solution. To effectively spread the computational effort over multiple nodes, the global computational domain is split into many subdomains. Computational uncertainty in the solution translates into uncertain boundary conditions for the equation system to be solved on those subdomains, and many independent, concurrent subdomain simulations are used to account for this boundary condition uncertainty. By relying on the fact that solutions on neighboring subdomains must agree with each other, a more accurate estimate for the global solution can be achieved. Statistical approaches in this update process make it possible to account for the effect of system faults in the probabilistic description of the computational solution, and the associated uncertainty is reduced through successive iterations. By combining all of these elements, the probabilistic reformulation allows splitting the computational work over very many independent tasks for good scalability, while being robust to system faults.

  3. Probabilistic logics and probabilistic networks

    Haenni, Rolf; Wheeler, Gregory; Williamson, Jon; Andrews, Jill

    2014-01-01

    Probabilistic Logic and Probabilistic Networks presents a groundbreaking framework within which various approaches to probabilistic logic naturally fit. Additionally, the text shows how to develop computationally feasible methods to mesh with this framework.

  4. A probabilistic approach for representation of interval uncertainty

    Zaman, Kais; Rangavajhala, Sirisha; McDonald, Mark P.; Mahadevan, Sankaran

    2011-01-01

    In this paper, we propose a probabilistic approach to represent interval data for input variables in reliability and uncertainty analysis problems, using flexible families of continuous Johnson distributions. Such a probabilistic representation of interval data facilitates a unified framework for handling aleatory and epistemic uncertainty. For fitting probability distributions, methods such as moment matching are commonly used in the literature. However, unlike point data where single estimates for the moments of data can be calculated, moments of interval data can only be computed in terms of upper and lower bounds. Finding bounds on the moments of interval data has been generally considered an NP-hard problem because it includes a search among the combinations of multiple values of the variables, including interval endpoints. In this paper, we present efficient algorithms based on continuous optimization to find the bounds on second and higher moments of interval data. With numerical examples, we show that the proposed bounding algorithms are scalable in polynomial time with respect to increasing number of intervals. Using the bounds on moments computed using the proposed approach, we fit a family of Johnson distributions to interval data. Furthermore, using an optimization approach based on percentiles, we find the bounding envelopes of the family of distributions, termed as a Johnson p-box. The idea of bounding envelopes for the family of Johnson distributions is analogous to the notion of empirical p-box in the literature. Several sets of interval data with different numbers of intervals and type of overlap are presented to demonstrate the proposed methods. As against the computationally expensive nested analysis that is typically required in the presence of interval variables, the proposed probabilistic representation enables inexpensive optimization-based strategies to estimate bounds on an output quantity of interest.
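
    The moment-bounding idea can be sketched with a general-purpose optimizer, as below; the paper's own algorithms are specialized and polynomial-time, and a local optimizer may only find a local bound (the variance maximum sits at a vertex of the box).

```python
import numpy as np
from scipy.optimize import minimize

# each true value x_i may lie anywhere in [lo_i, hi_i]; moment bounds are
# therefore optima over the box of admissible x (intervals are assumptions)
lo = np.array([1.0, 2.5, 3.0, 4.2])
hi = np.array([2.0, 3.5, 4.5, 5.0])

var = lambda x: np.var(x, ddof=1)
box = list(zip(lo, hi))
x0 = 0.5 * (lo + hi)

v_lo = minimize(var, x0, bounds=box).fun                  # lower variance bound
v_hi = -minimize(lambda x: -var(x), x0, bounds=box).fun   # upper (local) bound

# mean bounds need no optimization: they follow from interval endpoints
print([lo.mean(), hi.mean()], [v_lo, v_hi])
```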

  5. Convex models and probabilistic approach of nonlinear fatigue failure

    Qiu Zhiping; Lin Qiang; Wang Xiaojun

    2008-01-01

    This paper is concerned with the nonlinear fatigue failure problem with uncertainties in the structural systems. In the present study, in order to solve the nonlinear problem by convex models, the theory of ellipsoidal algebra is applied with the aid of interval-analysis concepts. Using the inclusion-monotonic property of ellipsoidal functions, the nonlinear fatigue failure problem with uncertainties can be solved. A numerical example of a 25-bar truss structure is given to illustrate the efficiency of the presented method in comparison with the probabilistic approach.

  6. Assessing uncertainties in global cropland futures using a conditional probabilistic modelling framework

    Engström, Kerstin; Olin, Stefan; Rounsevell, Mark D A; Brogaard, Sara; Van Vuuren, Detlef P.; Alexander, Peter; Murray-Rust, Dave; Arneth, Almut

    2016-01-01

    We present a modelling framework to simulate probabilistic futures of global cropland areas that are conditional on the SSP (shared socio-economic pathway) scenarios. Simulations are based on the Parsimonious Land Use Model (PLUM) linked with the global dynamic vegetation model LPJ-GUESS.

  7. A probabilistic approach for RIA fuel failure criteria

    Carlo Vitanza, Dr.

    2008-01-01

    Substantial experimental data have been produced in support of the definition of RIA safety limits for water reactor fuels at high burn-up. Based on these data, fuel failure enthalpy limits can be derived using methods of varying complexity. However, regardless of sophistication, it is unlikely that any deterministic approach would result in perfect predictions of all failure and non-failure data obtained in RIA tests. Accordingly, a probabilistic approach is proposed in this paper, where in addition to a best-estimate evaluation of the failure enthalpy, a RIA fuel failure probability distribution is defined within an enthalpy band surrounding the best-estimate failure enthalpy. The band width and the failure probability distribution within this band are determined on the basis of the whole data set, including failure and non-failure data and accounting for the actual scatter of the database. The present probabilistic approach can be used in conjunction with any deterministic model or correlation. For deterministic models or correlations having good prediction capability, the probability distribution will increase sharply within a narrow band around the best-estimate value. For deterministic predictions of lower quality, the resulting probability distribution will instead be broader and coarser.

  8. Probabilistic energy forecasting: Global Energy Forecasting Competition 2014 and beyond

    Hong, Tao; Pinson, Pierre; Fan, Shu

    2016-01-01

    The energy industry has been going through a significant modernization process over the last decade. Its infrastructure is being upgraded rapidly. The supply, demand and prices are becoming more volatile and less predictable than ever before. Even its business model is being challenged fundamentally. In this competitive and dynamic environment, many decision-making processes rely on probabilistic forecasts to quantify the uncertain future. Although most of the papers in the energy forecasting literature focus on point or single-valued forecasts, the research interest in probabilistic energy…

  9. Using probabilistic finite automata to simulate hourly series of global radiation

    Mora-Lopez, L. [Universidad de Malaga (Spain). Dpto. Lenguajes y Computacion; Sidrach-de-Cardona, M. [Universidad de Malaga (Spain). Dpto. Fisica Aplicada II

    2003-03-01

    A model to generate synthetic series of hourly exposure of global radiation is proposed. This model has been constructed using a machine learning approach. It is based on the use of a subclass of probabilistic finite automata which can be used for variable-order Markov processes. This model allows us to represent the different relationships and the representative information observed in the hourly series of global radiation; the variable-order Markov process is a natural way to represent different types of days and to take into account the ''variable memory'' of cloudiness. A method to generate new series of hourly global radiation, which incorporates the randomness observed in recorded series, is also proposed. As input data, this method only uses the mean monthly value of the daily global solar radiation. We examine whether the recorded and simulated series are similar. It can be concluded that both series have the same statistical properties. (author)
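
    The variable-order fallback at the heart of such automata can be sketched compactly; the three-class coding and the training sequence below are assumptions standing in for discretized hourly radiation data.

```python
from collections import Counter, defaultdict

import numpy as np

rng = np.random.default_rng(6)
train = rng.integers(0, 3, size=2000).tolist()  # stand-in for coded radiation

max_order = 3
counts = defaultdict(Counter)
for k in range(1, max_order + 1):               # context statistics, order 1..k
    for i in range(len(train) - k):
        counts[tuple(train[i:i + k])][train[i + k]] += 1

def next_state(history):
    for k in range(max_order, 0, -1):           # longest seen context wins
        ctx = tuple(history[-k:])
        if ctx in counts:
            symbols, freqs = zip(*counts[ctx].items())
            return rng.choice(symbols, p=np.array(freqs) / sum(freqs))
    return int(rng.integers(0, 3))              # unseen context: uniform

synth = train[:max_order]                       # seed with observed values
for _ in range(500):
    synth.append(int(next_state(synth)))
print(Counter(synth))
```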

  10. A dynamic probabilistic safety margin characterization approach in support of Integrated Deterministic and Probabilistic Safety Analysis

    Di Maio, Francesco; Rai, Ajit; Zio, Enrico

    2016-01-01

    The challenge of Risk-Informed Safety Margin Characterization (RISMC) is to develop a methodology for estimating system safety margins in the presence of stochastic and epistemic uncertainties affecting the system dynamic behavior. This is useful to support decision-making for licensing purposes. In the present work, safety margin uncertainties are handled by Order Statistics (OS) (with both Bracketing and Coverage approaches) to jointly estimate percentiles of the distributions of the safety parameter and of the time required for it to reach these percentile values during its dynamic evolution. The novelty of the proposed approach consists in the integration of dynamic aspects (i.e., timing of events) into the definition of a dynamic safety margin for a probabilistic Quantification of Margin and Uncertainties (QMU). The system considered here for demonstration purposes is the Lead-Bismuth Eutectic eXperimental Accelerator Driven System (LBE-XADS). - Highlights: • We integrate dynamic aspects into the definition of safety margins. • We consider stochastic and epistemic uncertainties affecting the system dynamics. • Uncertainties are handled by Order Statistics (OS). • We estimate the system grace time during accidental scenarios. • We apply the approach to an LBE-XADS accidental scenario.
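
    The order-statistics machinery can be illustrated with the classic one-sided Wilks computation below; the 95%/95% first-order numbers are textbook values, not the paper's LBE-XADS application.

```python
from scipy.stats import beta

# with N independent code runs, the sample maximum bounds the p-quantile of
# the safety parameter with confidence 1 - p**N
p = 0.95
N = 59
print(1 - p ** N)                       # ≈ 0.95, so N = 59 suffices

# same result via the Beta law of the r-th order statistic of N uniforms
r = N                                    # use the sample maximum
print(1 - beta.cdf(p, r, N - r + 1))     # P(U_(N) > p), identical value
```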

  11. The probabilistic approach in the licensing process and the development of probabilistic risk assessment methodology in Japan

    Togo, Y.; Sato, K.

    1981-01-01

    The probabilistic approach has long seemed to be one of the most comprehensive methods for evaluating the safety of nuclear plants. So far, most of the guidelines and criteria for licensing are based on the deterministic concept. However, there have been a few examples to which the probabilistic approach was directly applied, such as the evaluation of aircraft crashes and turbine missiles. One may find other examples of such applications. However, a much more important role is now to be played by this concept in implementing the 52 recommendations drawn from the lessons learned from the TMI accident. To develop the probabilistic risk assessment methodology most relevant to Japanese situations, a five-year programme plan has been adopted and is to be conducted by the Japan Atomic Energy Research Institute from fiscal 1980. Various problems have been identified and are to be solved through this programme plan. The current status of developments is described together with activities outside the government programme. (author)

  12. A Probabilistic, Facility-Centric Approach to Lightning Strike Location

    Huddleston, Lisa L.; Roeder, William P.; Merceret, Francis J.

    2012-01-01

    A new probabilistic facility-centric approach to lightning strike location has been developed. This process uses the bivariate Gaussian distribution of probability density provided by the current lightning location error ellipse for the most likely location of a lightning stroke and integrates it to determine the probability that the stroke is inside any specified radius of any location, even if that location is not centered on or even within the location error ellipse. This technique is adapted from a method of calculating the probability of debris collision with spacecraft. Such a technique is important in spaceport processing activities because it allows engineers to quantify the risk of induced current damage to critical electronics due to nearby lightning strokes. This technique was tested extensively and is now in use by space launch organizations at Kennedy Space Center and Cape Canaveral Air Force Station. Future applications could include forensic meteorology.
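    A hedged sketch of the core computation, using Monte Carlo in place of the analytic integration described above; the ellipse parameters and offsets are illustrative:

```python
import numpy as np

# Integrate the bivariate Gaussian location-error density over a disc of
# radius R centred on a facility. All numbers below are illustrative.

def p_strike_within(mu, cov, centre, radius, n=1_000_000, seed=1):
    rng = np.random.default_rng(seed)
    pts = rng.multivariate_normal(mu, cov, size=n)   # candidate stroke locations
    d2 = ((pts - centre) ** 2).sum(axis=1)
    return (d2 <= radius**2).mean()                  # MC estimate of the probability

mu = np.array([0.0, 0.0])                       # most likely stroke location (km)
cov = np.array([[0.4**2, 0.1], [0.1, 0.7**2]])  # error-ellipse covariance (km^2)
facility = np.array([0.5, -0.3])                # facility offset from ellipse centre (km)
print(p_strike_within(mu, cov, facility, radius=1.0))
```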

  13. A probabilistic approach to the evaluation of the PTS issue

    Cheverton, R.D.; Selby, D.L.

    1991-01-01

    An integrated probabilistic approach for the evaluation of the pressurized-thermal-shock (PTS) issue was developed at the Oak Ridge National Laboratory (ORNL) at the request of the Nuclear Regulatory Commission (NRC). The purpose was to provide a method for identifying dominant plant design and operating features, evaluating possible remedial measures and the validity of the NRC PTS screening criteria, and to provide an additional tool for estimating vessel life expectancy. The approach was to be integrated in the sense that it would include the postulation of transients; estimates of their frequencies of occurrence; systems analyses to obtain the corresponding primary-system pressure, downcomer coolant temperature, and fluid-film heat-transfer coefficient adjacent to the vessel wall; and a probabilistic fracture-mechanics analysis using the latter data as input. A summation of the products of frequency of transient and conditional probability of failure for all postulated transients provides an estimate of the frequency of vessel failure. In the process of developing the integrated pressurized-thermal-shock (IPTS) methodology, three specific plant analyses were conducted. The results indicate that the NRC screening criteria may not be appropriate for all US pressurized water reactor (PWR) plants; that is, for some PWRs, the calculated mean frequency of vessel failure corresponding to the screening criteria may be greater than the maximum permissible value in Regulatory Guide 1.154. A recent review of the ORNL IPTS study, which was completed in 1985, indicates that there are a number of areas in which the methodology can and should be updated, but it is not clear whether the update will increase or decrease the calculated probabilities. 31 refs., 2 tabs
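    In symbols, the summation described above reads:

```latex
% Frequency of vessel failure as the sum over postulated transients T_i of
% occurrence frequency times conditional probability of failure:
\Phi_{\mathrm{failure}} \;=\; \sum_{i} f(T_i)\, P\bigl(\text{vessel failure} \mid T_i\bigr)
```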

  14. Transmission capacity assessment by probabilistic planning. An approach

    Lammintausta, M.

    2002-01-01

    The Finnish electricity markets participate in the Scandinavian markets, Nord-Pool. The Finnish market is free for marketers, producers and consumers. All these participants can be seen as customers of the transmission network, which in turn can be considered a market place in which electricity can be sold and bought. The Finnish transmission network is owned and operated by an independent company, Fingrid, which has full responsibility for the Finnish transmission system. The available transfer capacity of a transmission route is traditionally limited by deterministic security constraints. More efficient and flexible network utilisation could be achieved with probabilistic planning methods. This report introduces a simple and practical probabilistic approach for transfer limit and risk assessment. The method is based on economic benefit and risk predictions. It also uses existing deterministic results and could be applied side by side with the deterministic method. The basic concept and the necessary equations for the expected risks of various market players have been derived for further development. The outage costs, and thereby the risks of the market participants, depend on how the system operator reacts to faults. In the Finnish power system, consumers will usually experience no costs due to faults because of the meshed network and the counter-trade method preferred by the system operator. The costs to producers and dealers are also low because of the counter-trade method. The network company will lose the cost of repair, additional losses and the cost of regulation power because of counter trades. If power flows are rearranged drastically because of aggressive strategies used in the electricity markets, the only way to fulfil the needs of free markets is that the network operator buys regulation power for short-term problems and reinforces the network in long-term situations. The reinforcement is done if the network can not be

  15. Probabilistic encoding of stimulus strength in astrocyte global calcium signals.

    Croft, Wayne; Reusch, Katharina; Tilunaite, Agne; Russell, Noah A; Thul, Rüdiger; Bellamy, Tomas C

    2016-04-01

    Astrocyte calcium signals can range in size from subcellular microdomains to waves that spread through the whole cell (and into connected cells). The differential roles of such local or global calcium signaling are under intense investigation, but the mechanisms by which local signals evolve into global signals in astrocytes are not well understood, nor are the computational rules by which physiological stimuli are transduced into a global signal. To investigate these questions, we transiently applied receptor agonists linked to calcium signaling to primary cultures of cerebellar astrocytes. Astrocytes repetitively tested with the same stimulus responded with global signals intermittently, indicating that each stimulus had a defined probability for triggering a response. The response probability varied between agonists, increased with agonist concentration, and could be positively and negatively modulated by crosstalk with other signaling pathways. To better understand the processes determining the evolution of a global signal, we recorded subcellular calcium "puffs" throughout the whole cell during stimulation. The key requirement for puffs to trigger a global calcium wave following receptor activation appeared to be the synchronous release of calcium from three or more sites, rather than an increasing calcium load accumulating in the cytosol due to increased puff size, amplitude, or frequency. These results suggest that the concentration of transient stimuli will be encoded into a probability of generating a global calcium response, determined by the likelihood of synchronous release from multiple subcellular sites. © 2015 Wiley Periodicals, Inc.

  16. A Probabilistic Approach for Breast Boundary Extraction in Mammograms

    Hamed Habibi Aghdam

    2013-01-01

    Full Text Available The extraction of the breast boundary is crucial to perform further analysis of mammograms. Methods to extract the breast boundary can be classified into two categories: methods based on image processing techniques and those based on models. The former use image transformation techniques such as thresholding, morphological operations, and region growing. In the second category, the boundary is extracted using more advanced techniques, such as the active contour model. The problem with thresholding methods is that it is hard to automatically find the optimal threshold value by using histogram information. On the other hand, active contour models require defining a starting point close to the actual boundary to be able to successfully extract the boundary. In this paper, we propose a probabilistic approach to address the aforementioned problems. In our approach we use local binary patterns to describe the texture around each pixel. In addition, the smoothness of the boundary is handled by using a new probability model. Experimental results show that the proposed method achieves 38% and 50% improvement with respect to the results obtained by the active contour model and threshold-based methods respectively, and it increases the stability of the boundary extraction process by up to 86%.
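    As an illustration of the texture descriptor the method relies on, a minimal 8-neighbour local binary pattern (LBP) sketch; the probability model itself is not reproduced here, and the input patch is a stand-in:

```python
import numpy as np

# Each pixel is encoded by thresholding its 8 neighbours against it,
# producing an 8-bit code that summarizes the local texture.

def lbp_image(img):
    """8-neighbour LBP codes for the interior pixels of a 2-D grayscale array."""
    c = img[1:-1, 1:-1]
    # neighbour offsets in a fixed clockwise order, each contributing one bit
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(c, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offsets):
        neigh = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
        codes |= ((neigh >= c).astype(np.uint8) << bit)
    return codes

mammogram = np.random.rand(64, 64)  # placeholder for a real mammogram patch
print(lbp_image(mammogram)[:2, :2])
```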

  17. A probabilistic approach for validating protein NMR chemical shift assignments

    Wang, Bowei; Wang, Yunjun; Wishart, David S.

    2010-01-01

    It has been estimated that more than 20% of the proteins in the BMRB are improperly referenced and that about 1% of all chemical shift assignments are mis-assigned. These statistics also reflect the likelihood that any newly assigned protein will have shift assignment or shift referencing errors. The relatively high frequency of these errors continues to be a concern for the biomolecular NMR community. While several programs do exist to detect and/or correct chemical shift mis-referencing or chemical shift mis-assignments, most can only do one or the other. The one program (SHIFTCOR) that is capable of handling both chemical shift mis-referencing and mis-assignments requires the 3D structure coordinates of the target protein. Given that chemical shift mis-assignments and chemical shift re-referencing issues should ideally be addressed prior to 3D structure determination, there is a clear need to develop a structure-independent approach. Here, we present a new structure-independent protocol, which is based on using residue-specific and secondary structure-specific chemical shift distributions calculated over small (3-6 residue) fragments to identify mis-assigned resonances. The method is also able to identify and re-reference mis-referenced chemical shift assignments. Comparisons against existing re-referencing or mis-assignment detection programs show that the method is as good as or superior to existing approaches. The protocol described here has been implemented into a freely available Java program called 'Probabilistic Approach for protein Nmr Assignment Validation (PANAV)' and as a web server (http://redpoll.pharmacy.ualberta.ca/PANAV) which can be used to validate and/or correct as well as re-reference assigned protein chemical shifts.

  18. Assessing Probabilistic Risk Assessment Approaches for Insect Biological Control Introductions.

    Kaufman, Leyla V; Wright, Mark G

    2017-07-07

    The introduction of biological control agents to new environments requires host specificity tests to estimate potential non-target impacts of a prospective agent. Currently, the approach is conservative, and is based on physiological host ranges determined under captive rearing conditions, without consideration for ecological factors that may influence realized host range. We use historical data and current field data from introduced parasitoids that attack an endemic Lepidoptera species in Hawaii to validate a probabilistic risk assessment (PRA) procedure for non-target impacts. We use data on known host range and habitat use in the place of origin of the parasitoids to determine whether contemporary levels of non-target parasitism could have been predicted using PRA. Our results show that reasonable predictions of potential non-target impacts may be made if comprehensive data are available from places of origin of biological control agents, but scant data produce poor predictions. Using apparent mortality data rather than marginal attack rate estimates in PRA resulted in over-estimates of predicted non-target impact. Incorporating ecological data into PRA models improved the predictive power of the risk assessments.

  19. Assessing Probabilistic Risk Assessment Approaches for Insect Biological Control Introductions

    Leyla V. Kaufman

    2017-07-01

    Full Text Available The introduction of biological control agents to new environments requires host specificity tests to estimate potential non-target impacts of a prospective agent. Currently, the approach is conservative, and is based on physiological host ranges determined under captive rearing conditions, without consideration for ecological factors that may influence realized host range. We use historical data and current field data from introduced parasitoids that attack an endemic Lepidoptera species in Hawaii to validate a probabilistic risk assessment (PRA) procedure for non-target impacts. We use data on known host range and habitat use in the place of origin of the parasitoids to determine whether contemporary levels of non-target parasitism could have been predicted using PRA. Our results show that reasonable predictions of potential non-target impacts may be made if comprehensive data are available from places of origin of biological control agents, but scant data produce poor predictions. Using apparent mortality data rather than marginal attack rate estimates in PRA resulted in over-estimates of predicted non-target impact. Incorporating ecological data into PRA models improved the predictive power of the risk assessments.

  20. A probabilistic approach to emission-line galaxy classification

    de Souza, R. S.; Dantas, M. L. L.; Costa-Duarte, M. V.; Feigelson, E. D.; Killedar, M.; Lablanche, P.-Y.; Vilalta, R.; Krone-Martins, A.; Beck, R.; Gieseke, F.

    2017-12-01

    We invoke a Gaussian mixture model (GMM) to jointly analyse two traditional emission-line classification schemes of galaxy ionization sources: the Baldwin-Phillips-Terlevich (BPT) and WH α versus [N II]/H α (WHAN) diagrams, using spectroscopic data from the Sloan Digital Sky Survey Data Release 7 and SEAGal/STARLIGHT data sets. We apply a GMM to empirically define classes of galaxies in a three-dimensional space spanned by the log [O III]/H β, log [N II]/H α and log EW(H α) optical parameters. The best-fitting GMM based on several statistical criteria suggests a solution around four Gaussian components (GCs), which are capable of explaining up to 97 per cent of the data variance. Using elements of information theory, we compare each GC to their respective astronomical counterpart. GC1 and GC4 are associated with star-forming galaxies, suggesting the need to define a new starburst subgroup. GC2 is associated with BPT's active galactic nuclei (AGN) class and WHAN's weak AGN class. GC3 is associated with BPT's composite class and WHAN's strong AGN class. Conversely, there is no statistical evidence - based on four GCs - for the existence of a Seyfert/low-ionization nuclear emission-line region (LINER) dichotomy in our sample. Notwithstanding, the inclusion of an additional GC5 unravels it. The GC5 appears associated with the LINER and passive galaxies on the BPT and WHAN diagrams, respectively. This indicates that if the Seyfert/LINER dichotomy is there, it does not contribute significantly to the global data variance and may be overlooked by standard metrics of goodness of fit. Subtleties aside, we demonstrate the potential of our methodology to recover/unravel different objects inside the wilderness of astronomical data sets, while retaining the ability to convey physically interpretable results. The probabilistic classifications from the GMM analysis are publicly available within the COINtoolbox at https://cointoolbox.github.io/GMM_Catalogue/.
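    A minimal sketch of this style of analysis using scikit-learn's GaussianMixture on synthetic stand-in data, with BIC as one plausible choice among the "several statistical criteria" mentioned above:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Fit GMMs with an increasing number of components to points in the
# (log [OIII]/Hb, log [NII]/Ha, log EW(Ha)) space and pick the component
# count by BIC. The data here are synthetic stand-ins for the SDSS sample.

X = np.random.default_rng(2).normal(size=(500, 3))
models = {k: GaussianMixture(n_components=k, random_state=0).fit(X)
          for k in range(1, 7)}
best_k = min(models, key=lambda k: models[k].bic(X))   # lower BIC is better
posteriors = models[best_k].predict_proba(X)           # soft, probabilistic classes
print(best_k, posteriors.shape)
```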

  1. Dynamics beyond uniform hyperbolicity a global geometric and probabilistic perspective

    Bonatti, Christian; Viana, Marcelo

    2005-01-01

    The notion of uniform hyperbolicity, introduced by Steve Smale in the early sixties, unified important developments and led to a remarkably successful theory for a large class of systems: uniformly hyperbolic systems often exhibit complicated evolution which, nevertheless, is now rather well understood, both geometrically and statistically. Another revolution has been taking place in the last couple of decades, as one tries to build a global theory for "most" dynamical systems, recovering as much as possible of the conclusions of the uniformly hyperbolic case, in great generality. This book aims to put such recent developments in a unified perspective, and to point out open problems and likely directions for further progress. It is aimed at researchers, both young and senior, willing to get a quick, yet broad, view of this part of dynamics. Main ideas, methods, and results are discussed, at variable degrees of depth, with references to the original works for details and complementary information.

  2. Comparison of Two Probabilistic Fatigue Damage Assessment Approaches Using Prognostic Performance Metrics

    National Aeronautics and Space Administration — A general framework for probabilistic prognosis using maximum entropy approach, MRE, is proposed in this paper to include all available information and uncertainties...

  3. Estimated Quality of Multistage Process on the Basis of Probabilistic Approach with Continuous Response Functions

    Yuri B. Tebekin

    2011-11-01

    Full Text Available The article is devoted to the problem of quality management for multiphase processes on the basis of the probabilistic approach. A method with continuous response functions is proposed, based on the application of the method of Lagrange multipliers.

  4. A Probabilistic Approach for Robustness Evaluation of Timber Structures

    Kirkegaard, Poul Henning; Sørensen, John Dalsgaard

    A probabilistic robustness analysis has been performed for a glulam frame structure supporting the roof over the main court in a Norwegian sports centre. The robustness analysis is based on the framework for robustness analysis introduced in the Danish Code of Practice for the Safety of Structures and a probabilistic modelling of the timber material proposed in the Probabilistic Model Code (PMC) of the Joint Committee on Structural Safety (JCSS). Due to the framework in the Danish Code, the timber structure has to be evaluated with respect to the following criteria, of which at least one shall be fulfilled. With respect to criteria a) and b), the timber frame structure has one column with a reliability index a bit lower than an assumed target level. By removing three columns one by one, no significant extensive failure of the entire structure or significant parts of it is obtained. Therefore the structure can be considered robust.

  5. An approximate methods approach to probabilistic structural analysis

    Mcclung, R. C.; Millwater, H. R.; Wu, Y.-T.; Thacker, B. H.; Burnside, O. H.

    1989-01-01

    A probabilistic structural analysis method (PSAM) is described which makes an approximate calculation of the structural response of a system, including the associated probabilistic distributions, with minimal computation time and cost, based on a simplified representation of the geometry, loads, and material. The method employs the fast probability integration (FPI) algorithm of Wu and Wirsching. Typical solution strategies are illustrated by formulations for a representative critical component chosen from the Space Shuttle Main Engine (SSME) as part of a major NASA-sponsored program on PSAM. Typical results are presented to demonstrate the role of the methodology in engineering design and analysis.

  6. A tiered approach for probabilistic ecological risk assessment of contaminated sites

    Zolezzi, M.; Nicolella, C.; Tarazona, J.V.

    2005-01-01

    This paper presents a tiered methodology for probabilistic ecological risk assessment. The proposed approach starts from a deterministic comparison (ratio) of single exposure concentration and threshold or safe level calculated from a dose-response relationship, goes through comparison of probabilistic distributions that describe exposure values and toxicological responses of organisms to the chemical of concern, and finally determines the so-called distribution-based quotients (DBQs). In order to illustrate the proposed approach, soil concentrations of 1,2,4-trichlorobenzene (1,2,4-TCB) measured in an industrial contaminated site were used for site-specific probabilistic ecological risk assessment. By using probabilistic distributions, the risk, which exceeds a level of concern for soil organisms with the deterministic approach, is associated with the presence of hot spots reaching concentrations able to acutely affect more than 50% of the soil species, while the large majority of the area presents 1,2,4-TCB concentrations below those reported as toxic.

  7. Dynamic Fault Diagnosis for Nuclear Installation Using Probabilistic Approach

    Djoko Hari Nugroho; Deswandri; Ahmad Abtokhi; Darlis

    2003-01-01

    A probabilistic fault diagnosis, representing the relationship between cause and consequence of events for troubleshooting, is developed in this research based on Bayesian networks. The contribution of on-line data from sensors and of system/component reliability in the cause nodes is expected to increase the belief level of the Bayesian networks. (author)
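    A minimal sketch of the underlying belief update: a single-evidence Bayes rule rather than a full Bayesian network, with illustrative causes, symptom, and numbers:

```python
# P(cause | evidence) ∝ P(evidence | cause) P(cause); numbers are illustrative.

priors = {"pump_failure": 0.02, "sensor_drift": 0.05, "normal": 0.93}
likelihood_low_flow = {"pump_failure": 0.90, "sensor_drift": 0.40, "normal": 0.01}

def posterior(priors, likelihood):
    """Belief over candidate root causes after observing one symptom."""
    joint = {c: priors[c] * likelihood[c] for c in priors}
    z = sum(joint.values())
    return {c: p / z for c, p in joint.items()}

print(posterior(priors, likelihood_low_flow))  # belief after seeing low flow
```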

  8. Identification of probabilistic approaches and map-based navigation ...

    B Madhevan

    2018-02-07

    The proposed navigation framework consists of three processes: map learning (ML), localization and path planning (PP).

  9. Basic Ideas to Approach Metastability in Probabilistic Cellular Automata

    Cirillo, Emilio N. M.; Nardi, Francesca R.; Spitoni, Cristian

    2016-01-01

    Cellular Automata are discrete-time dynamical systems on a spatially extended discrete space which provide paradigmatic examples of nonlinear phenomena. Their stochastic generalizations, i.e., Probabilistic Cellular Automata, are discrete-time Markov chains on lattice with finite single-cell

  10. CANDU type fuel behavior evaluation - a probabilistic approach

    Moscalu, D.R.; Horhoianu, G.; Popescu, I.A.; Olteanu, G.

    1995-01-01

    In order to realistically assess the behavior of fuel elements during in-reactor operation, probabilistic methods have recently been introduced in the analysis of fuel performance. The present paper summarizes the achievements in this field at the Institute for Nuclear Research (INR), pointing out some advantages of the utilized method in the evaluation of CANDU type fuel behavior under steady-state conditions. The Response Surface Method (RSM) has been selected for the investigation of the effects of variability in fuel element computer code inputs on the code outputs (fuel element performance parameters). A newly developed version of the probabilistic code APMESRA, based on RSM, is briefly presented. The examples of application include the analysis of the results of an in-reactor fuel element experiment and the investigation of the calculated performance parameter distribution for a new CANDU type extended burnup fuel element design. (author)
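    A hedged sketch of the Response Surface Method idea: fit a quadratic surface to a handful of runs of an expensive code, then run Monte Carlo on the cheap surrogate. The stand-in code and distributions below are illustrative, not the APMESRA models:

```python
import numpy as np

def expensive_code(x1, x2):            # stand-in for the fuel-performance code
    return 1200 + 80 * x1 - 30 * x2 + 15 * x1 * x2

# design points (e.g., a small factorial design in coded units)
X1, X2 = np.meshgrid(np.linspace(-1, 1, 5), np.linspace(-1, 1, 5))
y = expensive_code(X1.ravel(), X2.ravel())

# fit y ~ b0 + b1 x1 + b2 x2 + b3 x1^2 + b4 x2^2 + b5 x1 x2 by least squares
A = np.column_stack([np.ones_like(y), X1.ravel(), X2.ravel(),
                     X1.ravel()**2, X2.ravel()**2, X1.ravel() * X2.ravel()])
b, *_ = np.linalg.lstsq(A, y, rcond=None)

# Monte Carlo on the surrogate: input variability -> output distribution
rng = np.random.default_rng(3)
x1, x2 = rng.normal(0, 0.3, 100_000), rng.normal(0, 0.3, 100_000)
out = b[0] + b[1]*x1 + b[2]*x2 + b[3]*x1**2 + b[4]*x2**2 + b[5]*x1*x2
print(out.mean(), np.percentile(out, 95))
```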

  11. A probabilistic approach of sum rules for heat polynomials

    Vignat, C; Lévêque, O

    2012-01-01

    In this paper, we show that the sum rules for generalized Hermite polynomials derived by Daboul and Mizrahi (2005 J. Phys. A: Math. Gen. http://dx.doi.org/10.1088/0305-4470/38/2/010) and by Graczyk and Nowak (2004 C. R. Acad. Sci., Ser. 1 338 849) can be interpreted and easily recovered using a probabilistic moment representation of these polynomials. The covariance property of the raising operator of the harmonic oscillator, which is at the origin of the identities proved in Daboul and Mizrahi and the dimension reduction effect expressed in the main result of Graczyk and Nowak are both interpreted in terms of the rotational invariance of the Gaussian distributions. As an application of these results, we uncover a probabilistic moment interpretation of two classical integrals of the Wigner function that involve the associated Laguerre polynomials. (paper)
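    As an aside, the flavour of probabilistic moment representation invoked above can be illustrated with the classical identity for the probabilists' Hermite polynomials; the generalized Hermite and heat-polynomial cases treated in the paper are analogous:

```latex
% Moment representation of the probabilists' Hermite polynomials:
% with Z a standard Gaussian random variable,
He_n(x) \;=\; \mathbb{E}\!\left[(x + iZ)^n\right], \qquad Z \sim \mathcal{N}(0,1),
% so sum rules follow from elementary manipulations of Gaussian moments,
% e.g. the rotational invariance of Gaussian vectors used in the results above.
```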

  12. Intermediate probabilistic safety assessment approach for safety critical digital systems

    Taeyong, Sung; Hyun Gook, Kang

    2001-01-01

    Even though conventional probabilistic safety assessment methods are immature for application to microprocessor-based digital systems, practical needs force their application. In Korea, UCN units 5 and 6 are being constructed and the Korean Next Generation Reactor is being designed using digital instrumentation and control equipment for safety-related functions. The Korean regulatory body requires probabilistic safety assessment. This paper analyzes the difficulties in the assessment of digital systems and suggests an intermediate framework for evaluating their safety using fault tree models. The framework deals with several important characteristics of digital systems, including software modules and fault-tolerant features. We expect that the analysis results will provide valuable design feedback. (authors)

  13. Systems analysis approach to probabilistic modeling of fault trees

    Bartholomew, R.J.; Qualls, C.R.

    1985-01-01

    A method of probabilistic modeling of fault tree logic combined with stochastic process theory (Markov modeling) has been developed. Systems are then quantitatively analyzed probabilistically in terms of their failure mechanisms, including common-cause/common-mode effects and time-dependent failure and/or repair rate effects that include synergistic and propagational mechanisms. The modeling procedure results in a set of first-order, linear, inhomogeneous differential equations for the state vector, describing the time-dependent probabilities of failure described by the fault tree. The solutions of this Failure Mode State Variable (FMSV) model are cumulative probability distribution functions of the system. A method of appropriate synthesis of subsystems to form larger systems is developed and applied to practical nuclear power safety systems
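    A minimal sketch of the Markov ingredient: the state probabilities of a single repairable component obey a linear ODE system, solved here with a matrix exponential. The rates are illustrative:

```python
import numpy as np
from scipy.linalg import expm

# A repairable component with failure rate lam and repair rate mu:
# dP/dt = Q^T P, with generator Q whose rows sum to zero.

lam, mu = 1e-3, 1e-1          # per hour, illustrative
Q = np.array([[-lam,  lam],   # state 0 = working, state 1 = failed
              [  mu,  -mu]])
P0 = np.array([1.0, 0.0])     # starts in the working state

for t in (10.0, 100.0, 1000.0):
    Pt = expm(Q.T * t) @ P0
    print(t, Pt[1])           # time-dependent probability of being failed
```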

  14. Bayesian probabilistic network approach for managing earthquake risks of cities

    Bayraktarli, Yahya; Faber, Michael

    2011-01-01

    This paper considers the application of Bayesian probabilistic networks (BPNs) to large-scale risk based decision making in regard to earthquake risks. A recently developed risk management framework is outlined which utilises Bayesian probabilistic modelling, generic indicator-based risk models ... and a fourth module on the consequences of an earthquake. Each of these modules is integrated into a BPN. Special attention is given to aggregated risk, i.e. the risk contribution from assets at multiple locations in a city subjected to the same earthquake. The application of the methodology is illustrated ... on an example considering a portfolio of reinforced concrete structures in a city located close to the western part of the North Anatolian Fault in Turkey.

  15. Tools for voltage stability analysis, including a probabilistic approach

    Vieira Filho, X; Martins, N; Bianco, A; Pinto, H J.C.P. [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, M V.F. [Power System Research (PSR), Inc., Rio de Janeiro, RJ (Brazil); Gomes, P; Santos, M.G. dos [ELETROBRAS, Rio de Janeiro, RJ (Brazil)

    1994-12-31

    This paper reviews some voltage stability analysis tools that are being used or envisioned for expansion and operational planning studies in the Brazilian system, as well as their applications. The paper also shows that deterministic tools can be linked together in a probabilistic framework, so as to provide complementary help to the analyst in choosing the most adequate operation strategies, or the best planning solutions for a given system. (author) 43 refs., 8 figs., 8 tabs.

  16. Operational intervention levels in a nuclear emergency, general concepts and a probabilistic approach

    Lauritzen, B.; Baeverstam, U.; Naadland Holo, E.; Sinkko, K.

    1997-12-01

    This report deals with Operational Intervention Levels (OILs) in a nuclear or radiation emergency. OILs are defined as the values of environmental measurements, in particular dose rate measurements, above which specific protective actions should be carried out in emergency exposure situations. The derivation and the application of OILs are discussed, and an overview of the presently adopted values is provided, with emphasis on the situation in the Nordic countries. A new, probabilistic approach to derive OILs is presented and the method is illustrated by calculating dose rate OILs in a simplified setting. Contrary to the standard approach, the probabilistic approach allows for optimization of OILs. It is argued, that optimized OILs may be much larger than the presently adopted or suggested values. It is recommended, that the probabilistic approach is further developed and employed in determining site specific OILs and in optimizing environmental measuring strategies. (au)

  17. Optimization-Based Approaches to Control of Probabilistic Boolean Networks

    Koichi Kobayashi

    2017-02-01

    Full Text Available Control of gene regulatory networks is one of the fundamental topics in systems biology. In the last decade, control theory of Boolean networks (BNs), which are well known as a model of gene regulatory networks, has been widely studied. In this review paper, our previously proposed methods on optimal control of probabilistic Boolean networks (PBNs) are introduced. First, the outline of PBNs is explained. Next, an optimal control method using polynomial optimization is explained. The finite-time optimal control problem is reduced to a polynomial optimization problem. Furthermore, another finite-time optimal control problem, which can be reduced to an integer programming problem, is also explained.
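    A minimal sketch of PBN dynamics (control is not shown): a toy 3-gene network where each gene has one or more candidate Boolean update rules, selected at each step according to its selection probability. The rules and probabilities are illustrative:

```python
import random

# Per-gene candidate rules: (update function over the current state, probability).
rules = {
    0: [(lambda s: s[1] and not s[2], 0.7), (lambda s: s[1], 0.3)],
    1: [(lambda s: not s[0], 1.0)],
    2: [(lambda s: s[0] or s[1], 0.6), (lambda s: s[2], 0.4)],
}

def step(state, rng):
    new = []
    for gene in sorted(rules):
        fs, ps = zip(*rules[gene])
        f = rng.choices(fs, weights=ps)[0]   # pick one rule for this gene
        new.append(f(state))
    return tuple(new)

rng = random.Random(4)
state = (True, False, True)
for _ in range(5):
    state = step(state, rng)
    print(state)
```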

  18. Impact of external events on site evaluation: a probabilistic approach

    Jaccarino, E.; Giuliani, P.; Zaffiro, C.

    1975-01-01

    A probabilistic method is proposed for definition of the reference external events of nuclear sites. The external events taken into account are earthquakes, floods and tornadoes. On the basis of the available historical data for each event it is possible to perform statistical analyses to determine the probability of occurrence on site of events of given characteristics. For earthquakes, the method of analysis takes into consideration both the annual frequency of seismic events in Italy and the probabilistic distribution of areas stricken by each event. For floods, the methods of analysis of hydrological data and the basic criteria for the determination of design events are discussed and the general lines of the hydraulic analysis of a nuclear site are shown. For tornadoes, the statistical analysis has been performed for the events which occurred in Italy during the last 40 years; these events have been classified according to an empirical intensity scale. The probability of each reference event should be a function of the potential radiological damage associated with the particular type of plant which must be installed on the site. Thus the reference event could be chosen such that for the whole of the national territory the risk for safety and environmental protection is the same. (author)

  19. xLPR - a probabilistic approach to piping integrity analysis

    Harrington, C.; Rudland, D.; Fyfitch, S.

    2015-01-01

    The xLPR Code is a probabilistic fracture mechanics (PFM) computational tool that can be used to quantitatively determine a best-estimate probability of failure with well characterized uncertainties for reactor coolant system components, beginning with the piping systems and including the effects of relevant active degradation mechanisms. The initial application planned for xLPR is somewhat narrowly focused on validating LBB (leak-before-break) compliance in PWSCC-susceptible systems such as coolant systems of PWRs. The xLPR code incorporates a set of deterministic models that represent the full range of physical phenomena necessary to evaluate both fatigue and PWSCC degradation modes from crack initiation through failure. These models are each implemented in a modular form and linked together by a probabilistic framework that contains the logic for xLPR execution, exercises the individual modules as required, and performs necessary administrative and bookkeeping functions. The completion of the first production version of the xLPR code in a fully documented, releasable condition is presently planned for spring 2015

  20. All-possible-couplings approach to measuring probabilistic context.

    Ehtibar N Dzhafarov

    Full Text Available From behavioral sciences to biology to quantum mechanics, one encounters situations where (i) a system outputs several random variables in response to several inputs, (ii) for each of these responses only some of the inputs may "directly" influence them, but (iii) other inputs provide a "context" for this response by influencing its probabilistic relations to other responses. These contextual influences are very different, say, in classical kinetic theory and in the entanglement paradigm of quantum mechanics, which are traditionally interpreted as representing different forms of physical determinism. One can mathematically construct systems with other types of contextuality, whether or not empirically realizable: those that form special cases of the classical type, those that fall between the classical and quantum ones, and those that violate the quantum type. We show how one can quantify and classify all logically possible contextual influences by studying various sets of probabilistic couplings, i.e., sets of joint distributions imposed on random outputs recorded at different (mutually incompatible) values of inputs.

  1. Probabilistic Seasonal Forecasts to Deterministic Farm Level Decisions: Innovative Approach

    Mwangi, M. W.

    2015-12-01

    Climate change and vulnerability are major challenges in ensuring household food security. Climate information services have the potential to cushion rural households from extreme climate risks. However, the probabilistic nature of climate information products is not easily understood by the majority of smallholder farmers. Despite this, climate information has proved to be a valuable climate risk adaptation strategy at the farm level. This calls for innovative ways to help farmers understand and apply climate information services to inform their farm-level decisions. The study endeavored to co-design and test appropriate innovation systems for climate information services uptake and scale-up necessary for achieving climate risk development. It also determined the conditions necessary to support the effective performance of the proposed innovation system. Data and information sources included systematic literature review, secondary sources, government statistics, focused group discussions, household surveys and semi-structured interviews. Data was analyzed using both quantitative and qualitative data analysis techniques. Quantitative data was analyzed using the Statistical Package for Social Sciences (SPSS) software. Qualitative data was analyzed using qualitative techniques, which involved establishing the categories and themes, relationships/patterns and conclusions in line with the study objectives. Sustainable livelihoods, reduced household poverty and climate change resilience were the impacts that resulted from the study.

  2. A Probabilistic Model for Diagnosing Misconceptions by a Pattern Classification Approach.

    Tatsuoka, Kikumi K.

    A probabilistic approach is introduced to classify and diagnose erroneous rules of operation resulting from a variety of misconceptions ("bugs") in a procedural domain of arithmetic. The model is contrasted with the deterministic approach which has commonly been used in the field of artificial intelligence, and the advantage of treating the…

  3. Unit commitment with probabilistic reserve: An IPSO approach

    Lee, Tsung-Ying; Chen, Chun-Lung

    2007-01-01

    This paper presents a new algorithm for solution of the nonlinear optimal scheduling problem. This algorithm is named iteration particle swarm optimization (IPSO). A new index, called iteration best, is incorporated into particle swarm optimization (PSO) to improve the solution quality and computation efficiency. IPSO is applied to solve the unit commitment with probabilistic reserve problem of a power system. The outage cost as well as the fuel cost of thermal units was considered in the unit commitment program to evaluate the level of spinning reserve. The optimal scheduling of on-line generation units was reached while minimizing the sum of fuel cost and outage cost. A 48-unit power system was used as a numerical example to test the new algorithm. The optimal scheduling of on-line generation units could be reached in the testing results while satisfying the requirement of the objective function

  4. The development of a probabilistic approach to forecast coastal change

    Lentz, Erika E.; Hapke, Cheryl J.; Rosati, Julie D.; Wang, Ping; Roberts, Tiffany M.

    2011-01-01

    This study demonstrates the applicability of a Bayesian probabilistic model as an effective tool in predicting post-storm beach changes along sandy coastlines. Volume change and net shoreline movement are modeled for two study sites at Fire Island, New York in response to two extratropical storms in 2007 and 2009. Both study areas include modified areas adjacent to unmodified areas in morphologically different segments of coast. Predicted outcomes are evaluated against observed changes to test model accuracy and uncertainty along 163 cross-shore transects. Results show strong agreement in the cross validation of predictions vs. observations, with 70-82% accuracies reported. Although no consistent spatial pattern in inaccurate predictions could be determined, the highest prediction uncertainties appeared in locations that had been recently replenished. Further testing and model refinement are needed; however, these initial results show that Bayesian networks have the potential to serve as important decision-support tools in forecasting coastal change.

  5. Probabilistic Risk Assessment (PRA): A Practical and Cost Effective Approach

    Lee, Lydia L.; Ingegneri, Antonino J.; Djam, Melody

    2006-01-01

    The Lunar Reconnaissance Orbiter (LRO) is the first mission of the Robotic Lunar Exploration Program (RLEP), a space exploration venture to the Moon, Mars and beyond. The LRO mission includes spacecraft developed by NASA Goddard Space Flight Center (GSFC) and seven instruments built by GSFC, Russia, and contractors across the nation. LRO is defined as a measurement mission, not a science mission. It emphasizes the overall objectives of obtaining data to facilitate returning mankind safely to the Moon in preparation for an eventual manned mission to Mars. As the first mission in response to the President's commitment to the journey of exploring the solar system and beyond (returning to the Moon in the next decade, then venturing further into the solar system, ultimately sending humans to Mars and beyond), LRO has high visibility to the public but limited resources and a tight schedule. This paper demonstrates how NASA's Lunar Reconnaissance Orbiter Mission project office incorporated reliability analyses in assessing risks and performing design tradeoffs to ensure mission success. Risk assessment is performed using NASA Procedural Requirements (NPR) 8705.5 - Probabilistic Risk Assessment (PRA) Procedures for NASA Programs and Projects to formulate probabilistic risk assessment (PRA). As required, a limited-scope PRA is being performed for the LRO project. The PRA is used to optimize the mission design within mandated budget, manpower, and schedule constraints. The technique that the LRO project office uses to perform PRA relies on the application of a component failure database to quantify the potential mission success risks. To ensure mission success in an efficient manner, at low cost and on a tight schedule, the traditional reliability analyses, such as reliability predictions, Failure Modes and Effects Analysis (FMEA), and Fault Tree Analysis (FTA), are used to perform PRA for the large system of LRO with more than 14,000 piece parts and over 120 purchased or contractor

  6. Seismic Probabilistic Risk Assessment (SPRA), approach and results

    Campbell, R.D.

    1995-01-01

    During the past 15 years there have been over 30 Seismic Probabilistic Risk Assessments (SPRAs) and Seismic Probabilistic Safety Assessments (SPSAs) conducted of Western nuclear power plants, principally of US design. In this paper PRA and PSA are used interchangeably as the overall process is essentially the same. Some similar assessments have been done for reactors in Taiwan, Korea, Japan, Switzerland and Slovenia. These plants were also principally US supplied or built under US license. Since the restructuring of the governments in former Soviet Bloc countries, there has been grave concern regarding the safety of the reactors in these countries. To date there has been considerable activity in conducting partial seismic upgrades but the overall quantification of risk has not been pursued to the depth that it has in Western countries. This paper summarizes the methodology for seismic PRA/PSA and compares results of two partially completed and two completed PRAs of Soviet-designed reactors to results from earlier PRAs of US reactors. A WWER 440 and a WWER 1000 located in low seismic activity regions have completed PRAs and results show the seismic risk to be very low for both designs. For more active regions, partially completed PRAs of a WWER 440 and WWER 1000 located at the same site show the WWER 440 to have much greater seismic risk than the WWER 1000 plant. The seismic risk from the 1000 MW plant compares with the high end of seismic risk for earlier seismic PRAs in the US. Just as for most US plants, the seismic risk appears to be less than the risk from internal events if risk is measured in terms of mean core damage frequency. However, due to the lack of containment for the earlier WWER 440s, the risk to the public may be significantly greater due to the more probable scenario of an early release. The studies reported have not taken the accident sequences beyond the stage of core damage, hence the public health risk ratios are speculative. (author)

  7. Deterministic and probabilistic approach to determine seismic risk of nuclear power plants; a practical example

    Soriano Pena, A.; Lopez Arroyo, A.; Roesset, J.M.

    1976-01-01

    The probabilistic and deterministic approaches for calculating the seismic risk of nuclear power plants are both applied to a particular case in Southern Spain. The results obtained by both methods, when varying the input data, are presented and some conclusions drawn in relation to the applicability of the methods, their reliability and their sensitivity to change

  8. Determining reserve requirements in DK1 area of Nord Pool using a probabilistic approach

    Saez Gallego, Javier; Morales González, Juan Miguel; Madsen, Henrik

    2014-01-01

    We propose a probabilistic framework where the reserve requirements are computed based on scenarios of wind power forecast errors, load forecast errors and power plant outages. Our approach is first motivated by the increasing wind power penetration in power systems worldwide as well as the current market design of the DK1 ... System Operator). © 2014 Elsevier Ltd. All rights reserved.

  9. An approach to handle Real Time and Probabilistic behaviors in e-commerce

    Diaz, G.; Larsen, Kim Guldstrand; Pardo, J.

    2005-01-01

    In this work we describe an approach to deal with systems having both probabilistic and real-time behaviors. The main goal of the paper is to show the automatic translation from a real-time model based on the UPPAAL tool, which performs automatic verification of real-time systems, to the R...

  10. Use of adjoint methods in the probabilistic finite element approach to fracture mechanics

    Liu, Wing Kam; Besterfield, Glen; Lawrence, Mark; Belytschko, Ted

    1988-01-01

    The adjoint method approach to probabilistic finite element methods (PFEM) is presented. When the number of objective functions is small compared to the number of random variables, the adjoint method is far superior to the direct method in evaluating the objective function derivatives with respect to the random variables. The PFEM is extended to probabilistic fracture mechanics (PFM) using an element which has the near crack-tip singular strain field embedded. Since only two objective functions (i.e., mode I and II stress intensity factors) are needed for PFM, the adjoint method is well suited.

  11. Improving PERSIANN-CCS rain estimation using probabilistic approach and multi-sensors information

    Karbalaee, N.; Hsu, K. L.; Sorooshian, S.; Kirstetter, P.; Hong, Y.

    2016-12-01

    This presentation discusses recently implemented approaches to improve the rainfall estimation from Precipitation Estimation from Remotely Sensed Information using Artificial Neural Network-Cloud Classification System (PERSIANN-CCS). PERSIANN-CCS is an infrared (IR) based algorithm being integrated in IMERG (Integrated Multi-satellitE Retrievals for the Global Precipitation Measurement mission, GPM) to create a precipitation product at 0.1 x 0.1 degree resolution over the chosen domain 50N to 50S every 30 minutes. Although PERSIANN-CCS has a high spatial and temporal resolution, it overestimates or underestimates rainfall due to some limitations. PERSIANN-CCS estimates rainfall based on information extracted from IR channels at three different temperature threshold levels (220, 235, and 253 K). The algorithm relies only on infrared data to estimate rainfall indirectly, which causes rainfall from warm clouds to be missed and produces false estimates for non-precipitating cold clouds. In this research the effectiveness of using other channels of GOES satellites, such as visible and water vapor, has been investigated. By using multiple sensors, precipitation can be estimated based on information extracted from multiple channels. Also, instead of using the exponential function for estimating rainfall from cloud-top temperature, a probabilistic method has been used. Using probability distributions of precipitation rates instead of deterministic values has improved the rainfall estimation for different types of clouds.
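    A heavily hedged sketch of the probabilistic idea: replace a deterministic temperature-to-rain-rate function with sampling from a conditional distribution P(R | T). The gamma parameterisation below is hypothetical, not the PERSIANN-CCS calibration:

```python
import numpy as np

rng = np.random.default_rng(5)

def sample_rain_rate(T_kelvin, n=1):
    """Colder cloud tops -> higher mean rain rate, with the spread retained.
    The mean-temperature relation and the fixed shape are hypothetical."""
    mean = max(0.1, 0.15 * (253.0 - T_kelvin))     # mm/h, hypothetical relation
    shape = 2.0                                    # hypothetical gamma shape
    return rng.gamma(shape, mean / shape, size=n)  # gamma mean = shape * scale

print(sample_rain_rate(220.0, n=5))   # ensemble of plausible rates for one pixel
```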

  12. Probabilistic Approach to Fatigue Assessment for Stay Cables

    Baussaron, Julien; Sørensen, John Dalsgaard; Toft, Henrik Stensgaard

    2013-01-01

    Many parameters used for predicting times to failure of structures due to fatigue are uncertain, and their variations have a big influence on the real lifetime. This paper focuses on a global methodology to take the main sources of variability in fatigue prediction for stay cables into account. The first...

  13. Application of probabilistic approach to UP3-A reprocessing plant

    Mercier, J P; Bonneval, F; Weber, M [Institut de Protection et de Surete Nucleaire, Commissariat a l' Energie Atomique, Centre d' Etudes Nucleaires, Fontenay-aux-Roses (France)

    1992-02-01

    In the UP3-A design studies, three complementary approaches were used: - observance of regulations, and good practice; - review of experience feedback; - the correlation of probabilities and consequences making use of an acceptability graph. The latter approach was considered by the safety authorities to be an acceptable practice where the probability calculations were sufficiently accurate. Examples of its application are presented. (author)

  14. Précis of bayesian rationality: The probabilistic approach to human reasoning.

    Oaksford, Mike; Chater, Nick

    2009-02-01

    According to Aristotle, humans are the rational animal. The borderline between rationality and irrationality is fundamental to many aspects of human life including the law, mental health, and language interpretation. But what is it to be rational? One answer, deeply embedded in the Western intellectual tradition since ancient Greece, is that rationality concerns reasoning according to the rules of logic--the formal theory that specifies the inferential connections that hold with certainty between propositions. Piaget viewed logical reasoning as defining the end-point of cognitive development; and contemporary psychology of reasoning has focussed on comparing human reasoning against logical standards. Bayesian Rationality argues that rationality is defined instead by the ability to reason about uncertainty. Although people are typically poor at numerical reasoning about probability, human thought is sensitive to subtle patterns of qualitative Bayesian, probabilistic reasoning. In Chapters 1-4 of Bayesian Rationality (Oaksford & Chater 2007), the case is made that cognition in general, and human everyday reasoning in particular, is best viewed as solving probabilistic, rather than logical, inference problems. In Chapters 5-7 the psychology of "deductive" reasoning is tackled head-on: It is argued that purportedly "logical" reasoning problems, revealing apparently irrational behaviour, are better understood from a probabilistic point of view. Data from conditional reasoning, Wason's selection task, and syllogistic inference are captured by recasting these problems probabilistically. The probabilistic approach makes a variety of novel predictions which have been experimentally confirmed. The book considers the implications of this work, and the wider "probabilistic turn" in cognitive science and artificial intelligence, for understanding human rationality.

  15. A Probabilistic Approach to Tropical Cyclone Conditions of Readiness (TCCOR)

    Wallace, Kenneth A

    2008-01-01

    Tropical Cyclone Conditions of Readiness (TCCOR) are set at DoD installations in the Western Pacific to convey the risk associated with the onset of destructive winds from approaching tropical cyclones...

  16. A Probabilistic Approach to Baffle Bolt IASCC Predictions

    Griesbach, Timothy J.; Licina, George J.; Riccardella, Peter C.; Rashid, Joe R.; Nickell, Robert E.

    2012-01-01

    test results reported in the literature, and plotted as percent of irradiated yield strength versus irradiation dose. This method was used to determine the probability of IASCC occurring at various stress levels, using a Weibull fit of the cumulative failure probability vs. IASCC ratio. To benchmark the model, bolt-by-bolt stresses in a typical PWR were estimated and the accumulated fluence or dose level were used with the model to predict the probabilities (or numbers) of baffle-former bolt failures due to IASCC over time (i.e., at various Effective Full Power Years). The resulting predictions were compared to actual field experience with bolt cracking in several operating PWRs. The model provides a probabilistic estimate of the number of cracked bolts that might be expected to be found during any future refueling outage with inspections of the baffle-former bolts. Such a priori knowledge is important because the plan for inspection of the baffle-former bolts may require additional contingencies depending on the likely outcome. (author)

  17. Probabilistic models for neural populations that naturally capture global coupling and criticality.

    Humplik, Jan; Tkačik, Gašper

    2017-09-01

    Advances in multi-unit recordings pave the way for statistical modeling of activity patterns in large neural populations. Recent studies have shown that the summed activity of all neurons strongly shapes the population response. A separate recent finding has been that neural populations also exhibit criticality, an anomalously large dynamic range for the probabilities of different population activity patterns. Motivated by these two observations, we introduce a class of probabilistic models which takes into account the prior knowledge that the neural population could be globally coupled and close to critical. These models consist of an energy function which parametrizes interactions between small groups of neurons, and an arbitrary positive, strictly increasing, and twice differentiable function which maps the energy of a population pattern to its probability. We show that: 1) augmenting a pairwise Ising model with a nonlinearity yields an accurate description of the activity of retinal ganglion cells which outperforms previous models based on the summed activity of neurons; 2) prior knowledge that the population is critical translates to prior expectations about the shape of the nonlinearity; 3) the nonlinearity admits an interpretation in terms of a continuous latent variable globally coupling the system whose distribution we can infer from data. Our method is independent of the underlying system's state space; hence, it can be applied to other systems such as natural scenes or amino acid sequences of proteins which are also known to exhibit criticality.

  18. Registration of indoor TLS data: in favor of a probabilistic approach initialized by geo-location

    Hullo, Jean-Francois

    2013-01-01

    Many pre-maintenance operations of industrial facilities currently rely on three-dimensional CAD models. The acquisition of these models is performed from point clouds measured by Terrestrial Laser Scanning (TLS). When the scenes are complex, several viewpoints for scanning, also known as stations, are necessary to ensure the completeness and the density of the survey data. The generation of a global point cloud, i.e. the expression of all the acquired data in a common reference frame, is a crucial step called registration. During this process, the pose parameters are estimated. While GNSS systems are now a solution for many outdoor scenes, the registration of indoor TLS data still remains a challenge. The goal of this thesis is to improve the acquisition process of TLS data in industrial environments. The aim is to guarantee the precision and accuracy of acquired data, while optimizing on-site acquisition time and protocols by, as often as possible, freeing the operator from the constraints inherent to conventional topography surveys. In a first part, we consider the state of the art of the means and methods used during the acquisition of dense point clouds of complex interior scenes (Part I). In a second part, we study and evaluate the data available for the registration: terrestrial laser data, primitive reconstruction algorithms in point clouds and indoor geo-location systems (Part II). In the third part, we then formalize and experiment with a registration algorithm based on the use of matched primitives, reconstructed from per-station point clouds (Part III). We finally propose a probabilistic approach for matching primitives, allowing the integration of a priori information and uncertainty in the constraint system used for calculating poses (Part IV). The contributions of our work are as follows: - to take a critical look at current methods of TLS data acquisition in industrial environments, - to evaluate, through experiments, the information

  19. A probabilistic approach to quantum mechanics based on 'tomograms'

    Caponigro, M.; Mancini, S.; Man'ko, V.I.

    2006-01-01

    It is usually believed that a picture of Quantum Mechanics in terms of true probabilities cannot be given due to the uncertainty relations. Here we discuss a tomographic approach to quantum states that leads to a probability representation of quantum states. This can be regarded as a classical-like formulation of quantum mechanics which avoids the counterintuitive concepts of wave function and density operator. The relevant concepts of quantum mechanics are then reconsidered and the epistemological implications of such approach discussed. (Abstract Copyright [2006], Wiley Periodicals, Inc.)

  20. Probabilistic Damage Characterization Using the Computationally-Efficient Bayesian Approach

    Warner, James E.; Hochhalter, Jacob D.

    2016-01-01

    This work presents a computationally-efficient approach for damage determination that quantifies uncertainty in the provided diagnosis. Given strain sensor data that are polluted with measurement errors, Bayesian inference is used to estimate the location, size, and orientation of damage. This approach uses Bayes' Theorem to combine any prior knowledge an analyst may have about the nature of the damage with information provided implicitly by the strain sensor data to form a posterior probability distribution over possible damage states. The unknown damage parameters are then estimated based on samples drawn numerically from this distribution using a Markov Chain Monte Carlo (MCMC) sampling algorithm. Several modifications are made to the traditional Bayesian inference approach to provide significant computational speedup. First, an efficient surrogate model is constructed using sparse grid interpolation to replace a costly finite element model that must otherwise be evaluated for each sample drawn with MCMC. Next, the standard Bayesian posterior distribution is modified using a weighted likelihood formulation, which is shown to improve the convergence of the sampling process. Finally, a robust MCMC algorithm, Delayed Rejection Adaptive Metropolis (DRAM), is adopted to sample the probability distribution more efficiently. Numerical examples demonstrate that the proposed framework effectively provides damage estimates with uncertainty quantification and can yield orders of magnitude speedup over standard Bayesian approaches.
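    A minimal sketch of the MCMC ingredient: plain random-walk Metropolis rather than DRAM, on a toy 1-D damage-size posterior. The surrogate model, prior bounds, and noise level are stand-ins, not the paper's models:

```python
import numpy as np

rng = np.random.default_rng(6)
observed = 2.0                                   # noisy strain-derived statistic

def log_post(a):                                 # damage size a, flat prior on [0, 5]
    if not 0.0 <= a <= 5.0:
        return -np.inf
    pred = np.sqrt(a) + a / 10.0                 # cheap surrogate model (stand-in)
    return -0.5 * ((observed - pred) / 0.1) ** 2 # Gaussian measurement error

samples, a = [], 1.0
for _ in range(20_000):
    prop = a + rng.normal(0, 0.2)                # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(a):
        a = prop                                 # accept; otherwise keep current a
    samples.append(a)

post = np.array(samples[5000:])                  # discard burn-in
print(post.mean(), np.percentile(post, [2.5, 97.5]))
```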

  1. A probabilistic approach to the drag-based model

    Napoletano, Gianluca; Forte, Roberta; Moro, Dario Del; Pietropaolo, Ermanno; Giovannelli, Luca; Berrilli, Francesco

    2018-02-01

    The forecast of the time of arrival (ToA) of a coronal mass ejection (CME) to Earth is of critical importance for our high-technology society and for any future manned exploration of the Solar System. As critical as the forecast accuracy is the knowledge of its precision, i.e. the error associated with the estimate. We propose a statistical approach to the computation of the ToA using the drag-based model, introducing probability distributions, rather than exact values, as input parameters, thus allowing the evaluation of the uncertainty on the forecast. We test this approach using a set of CMEs whose transit times are known, and obtain extremely promising results: the average of the absolute differences between measured and forecast ToAs is 9.1 h, and half of these residuals are within the estimated errors. These results suggest that this approach deserves further investigation. We are working to realize a real-time implementation which ingests the outputs of automated CME tracking algorithms as inputs, to create a database of events useful for further validation of the approach.
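
    The propagation of input distributions through the drag-based model can be illustrated with a simple Monte Carlo loop; the equation of motion dv/dt = -gamma (v - w)|v - w| is the standard drag-based form, but every distribution and parameter value below is an illustrative assumption, not the paper's calibration:

        import numpy as np

        AU = 1.496e11  # m
        rng = np.random.default_rng(1)
        n = 2000

        # Illustrative input distributions: drag parameter gamma (around
        # 1e-7 per km), ambient solar-wind speed w, initial CME speed v0.
        gamma = rng.lognormal(np.log(1e-10), 0.5, n)  # 1/m
        w = rng.normal(400e3, 50e3, n)                # m/s
        v0 = rng.normal(1000e3, 100e3, n)             # m/s

        toa_hours = []
        for g, ws, v in zip(gamma, w, v0):
            r, t, dt = 0.05 * AU, 0.0, 600.0  # start at 0.05 AU, 10-min steps
            while r < AU:
                v += -g * (v - ws) * abs(v - ws) * dt  # drag-based deceleration
                r += v * dt
                t += dt
            toa_hours.append(t / 3600.0)

        toa = np.array(toa_hours)
        print(f"ToA = {toa.mean():.1f} h +/- {toa.std():.1f} h")

    The spread of the resulting ToA distribution is the quantity of interest here: it provides the error bar that a single deterministic run cannot.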

  2. Assessing Probabilistic Risk Assessment Approaches for Insect Biological Control Introductions

    Kaufman, Leyla V.; Wright, Mark G.

    2017-01-01

    The introduction of biological control agents to new environments requires host specificity tests to estimate potential non-target impacts of a prospective agent. Currently, the approach is conservative, and is based on physiological host ranges determined under captive rearing conditions, without consideration for ecological factors that may influence realized host range. We use historical data and current field data from introduced parasitoids that attack an endemic Lepidoptera species in H...

  3. A probabilistic approach to assessing radioactive waste container lifetimes

    Porter, F.M.; Naish, C.C.; Sharland, S.M.

    1994-01-01

    A general methodology has been developed to make assessments of the lifetime of specific radioactive waste container designs in a repository environment. The methodology employs a statistical approach which aims to reflect uncertainty in the corrosion rates and the evolution of the environmental conditions. In this paper, the methodology is demonstrated for an intermediate-level waste (ILW) container in the anticipated UK repository situation.

  4. Combination of Evidence with Different Weighting Factors: A Novel Probabilistic-Based Dissimilarity Measure Approach

    Mengmeng Ma

    2015-01-01

    To address the failure of Dempster-Shafer evidence theory (DS) under high conflict in multisensor data fusion, this paper presents a novel combination approach for conflicting evidence with different weighting factors, using a new probabilistic dissimilarity measure. Firstly, an improved probabilistic transformation function is proposed to map basic belief assignments (BBAs) to probabilities. Then, a new dissimilarity measure integrating fuzzy nearness and an introduced correlation coefficient is proposed to characterize not only the difference between BBAs but also the divergence degree of the hypotheses that two BBAs support. Finally, the weighting factors used to reassign conflicts on BBAs are developed, and Dempster's rule is chosen to combine the discounted sources. Simple numerical examples are employed to demonstrate the merit of the proposed method. Through analysis and comparison of the results, the new combination approach can effectively solve the problem of conflict management, with better convergence performance and robustness.
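
    For reference, the classical Dempster's rule that the weighting scheme builds on can be written compactly; the two highly conflicting sources below reproduce the textbook pathology (Zadeh's example) that motivates discounting, and the frame and masses are illustrative:

        from itertools import product

        def dempster_combine(m1, m2):
            """Combine two basic belief assignments (dicts mapping frozenset
            hypotheses to masses) with Dempster's rule of combination."""
            combined, conflict = {}, 0.0
            for (a, wa), (b, wb) in product(m1.items(), m2.items()):
                inter = a & b
                if inter:
                    combined[inter] = combined.get(inter, 0.0) + wa * wb
                else:
                    conflict += wa * wb  # mass falling on the empty set
            if conflict >= 1.0:
                raise ValueError("total conflict: sources cannot be combined")
            return {h: w / (1.0 - conflict) for h, w in combined.items()}

        A, B, C = frozenset("A"), frozenset("B"), frozenset("C")
        m1 = {A: 0.99, C: 0.01}
        m2 = {B: 0.99, C: 0.01}  # highly conflicting second source
        # The classical rule assigns all belief to C even though both
        # sources consider it nearly impossible; weighted discounting
        # approaches are designed to avoid exactly this behavior.
        print(dempster_combine(m1, m2))  # {frozenset({'C'}): 1.0}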

  5. Development of a Quantitative Framework for Regulatory Risk Assessments: Probabilistic Approaches

    Wilmot, R.D.

    2003-11-01

    The Swedish regulators have been active in the field of performance assessment for many years and have developed sophisticated approaches to the development of scenarios and other aspects of assessments. These assessments have generally used dose as the assessment end-point and have been based on deterministic calculations. Recently introduced Swedish regulations specify a risk criterion for radioactive waste disposal: the annual risk of harmful effects after closure of a disposal facility should not exceed 10⁻⁶ for a representative individual in the group exposed to the greatest risk. A recent review of the overall structure of risk assessments in safety cases concluded that there are a number of decisions and assumptions in the development of a risk assessment methodology that could potentially affect the calculated results. Regulatory understanding of these issues, potentially supported by independent calculations, is important in preparing for review of a proponent's risk assessment. One approach to evaluating risk in performance assessments is to use the concept of probability to express uncertainties, and to propagate these probabilities through the analysis. This report describes the various approaches available for undertaking such probabilistic analyses, both as a means of accounting for uncertainty in the determination of risk and, more generally, as a means of sensitivity and uncertainty analysis. The report discusses the overall nature of probabilistic analyses and how they are applied to both the calculation of risk and sensitivity analyses. Several approaches are available, including differential analysis, response surface methods and simulation. Simulation is the approach most commonly used, both in assessments for radioactive waste disposal and in other subject areas, and the report describes the key stages of this approach in detail. Decisions relating to the development of input PDFs, sampling methods (including approaches to the treatment

  6. Approach to modeling of human performance for purposes of probabilistic risk assessment

    Swain, A.D.

    1983-01-01

    This paper describes the general approach taken in NUREG/CR-1278 to model human performance in sufficient detail to permit probabilistic risk assessments of nuclear power plant operations. To show the basis for the more specific models in the above NUREG, a simplified model of the human component in man-machine systems is presented, the role of performance shaping factors is discussed, and special problems in modeling the cognitive aspects of behavior are described.

  7. A probabilistic approach for the estimation of earthquake source parameters from spectral inversion

    Supino, M.; Festa, G.; Zollo, A.

    2017-12-01

    The amplitude spectrum of a seismic signal related to an earthquake source carries information about the size of the rupture, moment, stress and energy release. Furthermore, it can be used to characterize the Green's function of the medium crossed by the seismic waves. We describe the earthquake amplitude spectrum assuming a generalized Brune (1970) source model, and direct P- and S-waves propagating in a layered velocity model characterized by a frequency-independent Q attenuation factor. The observed displacement spectrum thus depends on three source parameters: the seismic moment (through the low-frequency spectral level), the corner frequency (a proxy for the fault length) and the high-frequency decay parameter. These parameters are strongly correlated with each other and with the quality factor Q; a rigorous estimation of the associated uncertainties and parameter resolution is thus needed to obtain reliable estimates. In this work, the uncertainties are characterized by adopting a probabilistic approach to parameter estimation. Assuming an L2-norm based misfit function, we perform a global exploration of the parameter space to find the absolute minimum of the cost function, and we then explore the joint a-posteriori probability density function associated with the cost function around this minimum, to extract the correlation matrix of the parameters. The global exploration relies on building a Markov chain in the parameter space and on combining a deterministic minimization with a random exploration of the space (basin-hopping technique). The joint pdf is built from the misfit function using the maximum likelihood principle and assuming a Gaussian-like distribution of the parameters. It is then computed on a grid centered at the global minimum of the cost function. The numerical integration of the pdf finally provides the mean, variance and correlation matrix associated with the set of best-fit parameters describing the model. Synthetic tests are performed to
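
    The basin-hopping stage can be sketched with SciPy's implementation on a toy spectral misfit; the omega-square spectral shape, noise model, and starting point below are illustrative assumptions (the attenuation term and high-frequency decay parameter of the paper are omitted for brevity):

        import numpy as np
        from scipy.optimize import basinhopping

        f = np.logspace(-1, 1.3, 60)  # frequency band, Hz

        def brune_spectrum(params, freq):
            # Brune-type source spectrum; params are log10 of the
            # low-frequency level (moment proxy) and the corner frequency.
            log_omega0, log_fc = params
            return 10 ** log_omega0 / (1.0 + (freq / 10 ** log_fc) ** 2)

        rng = np.random.default_rng(2)
        obs = brune_spectrum([2.0, 0.0], f) * rng.lognormal(0.0, 0.2, f.size)

        def misfit(params):
            # L2 misfit in log-amplitude, as is common for spectra.
            return np.sum((np.log(obs) - np.log(brune_spectrum(params, f))) ** 2)

        # Deterministic local minimization combined with random jumps:
        # the basin-hopping strategy for finding the global minimum.
        result = basinhopping(misfit, x0=[1.0, 0.5], niter=100, seed=3)
        print("best-fit (log10 Omega0, log10 fc):", result.x)

    Around the minimum found this way, the joint a-posteriori pdf can then be evaluated on a grid to extract variances and the correlation matrix, as the abstract describes.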

  8. Simulation approaches to probabilistic structural design at the component level

    Stancampiano, P.A.

    1978-01-01

    In this paper, structural failure of large nuclear components is viewed as a random process with a low probability of occurrence. Therefore, a statistical interpretation of probability does not apply, and statistical inferences cannot be made due to the sparsity of actual structural failure data. In such cases, analytical estimates of the failure probabilities may be obtained from stress-strength interference theory. Since the majority of real design applications are complex, numerical methods are required to obtain solutions. Monte Carlo simulation appears to be the best general numerical approach. However, meaningful applications of simulation methods suggest research activities in three categories: methods development, failure-mode model development, and statistical data model development. (Auth.)
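
    The stress-strength interference idea reduces to estimating P(strength < stress) by sampling; the distributions below are illustrative assumptions, and the example also shows why plain Monte Carlo struggles at the very low failure probabilities typical of large nuclear components:

        import numpy as np

        rng = np.random.default_rng(4)
        n = 1_000_000

        # Assumed distributions (MPa): applied stress and material strength.
        stress = rng.normal(300.0, 40.0, n)
        strength = rng.lognormal(np.log(520.0), 0.08, n)

        # Stress-strength interference: failure whenever stress exceeds strength.
        p_fail = np.mean(stress > strength)
        print(f"estimated failure probability: {p_fail:.2e}")

    With a true failure probability of order 1e-4 or below, only a handful of the million samples fail, so variance reduction or analytical approximations become necessary, which is exactly the methods-development need the abstract identifies.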

  9. Analytic and probabilistic approaches to dynamics in negative curvature

    Peigné, Marc; Sambusetti, Andrea

    2014-01-01

    The work of E. Hopf and G.A. Hedlund, in the 1930s, on transitivity and ergodicity of the geodesic flow for hyperbolic surfaces, marked the beginning of the investigation of the statistical properties and stochastic behavior of the flow. The first central limit theorem for the geodesic flow was proved in the 1960s by Y. Sinai for compact hyperbolic manifolds. Since then, strong relationships have been found between the fields of ergodic theory, analysis, and geometry. Different approaches and new tools have been developed to study the geodesic flow, including measure theory, thermodynamic formalism, transfer operators, Laplace operators, and Brownian motion. All these different points of view have led to a deep understanding of more general dynamical systems, in particular the so-called Anosov systems, with applications to geometric problems such as counting, equirepartition, mixing, and recurrence properties of the orbits. This book comprises two independent texts that provide a self-contained introduction t...

  10. Approaches to probabilistic model learning for mobile manipulation robots

    Sturm, Jürgen

    2013-01-01

    Mobile manipulation robots are envisioned to provide many useful services both in domestic environments as well as in the industrial context. Examples include domestic service robots that implement large parts of the housework, and versatile industrial assistants that provide automation, transportation, inspection, and monitoring services. The challenge in these applications is that the robots have to function under changing, real-world conditions, be able to deal with considerable amounts of noise and uncertainty, and operate without the supervision of an expert. This book presents novel learning techniques that enable mobile manipulation robots, i.e., mobile platforms with one or more robotic manipulators, to autonomously adapt to new or changing situations. The approaches presented in this book cover the following topics: (1) learning the robot's kinematic structure and properties using actuation and visual feedback, (2) learning about articulated objects in the environment in which the robot is operating,...

  11. A new probabilistic approach to the microdosimetry of BNCT

    Santa Cruz, G.A.; Palmer, M.R.; Kiger, W.S. III; Zamenhof, R.G.; Matatagui, E.

    2000-01-01

    Using H and E micrographs of glioma, melanoma, and normal brain cells and applying stereological reconstruction, we computed the chord length distributions of the nuclei in these tissues, which are the basis of a new approach to microdosimetry. This new formalism, derived from the field of geometric probability, allows the calculation of event statistics and mean specific energy (z_F and z_D) values for different kinds of ¹⁰B distributions, including the case of cell-surface-bound compounds. The results suggest a new method for predicting boron compound efficacy in terms of cell geometry and microscopic boron distribution. The new formalism can be applied to any high-LET particle type, such as protons or heavy recoil particles. Illustrative results are presented. (author)

  12. Multilevel probabilistic approach to evaluate manufacturing defect in composite aircraft structures

    Caracciolo, Paola

    2014-01-01

    In this work, a reliability-based approach to the design and analysis of composite structures is developed and its feasibility assessed. Robustness- and reliability-based designs are compared against the traditional design to demonstrate the gain that can be achieved with a probabilistic approach. The use of a stochastic treatment of the uncertain parameters in combination with multi-scale analysis is the main objective of this paper. The work is dedicated to analyzing the uncertainties in the design, tests, manufacturing process, and key gates such as materials characteristics.

  14. A unifying probabilistic Bayesian approach to derive electron density from MRI for radiation therapy treatment planning

    Gudur, Madhu Sudhan Reddy; Hara, Wendy; Le, Quynh-Thu; Wang, Lei; Xing, Lei; Li, Ruijiang

    2014-01-01

    MRI significantly improves the accuracy and reliability of target delineation in radiation therapy for certain tumors due to its superior soft tissue contrast compared to CT. A treatment planning process with MRI as the sole imaging modality will eliminate systematic CT/MRI co-registration errors, reduce cost and radiation exposure, and simplify clinical workflow. However, MRI lacks the key electron density information necessary for accurate dose calculation and generating reference images for patient setup. The purpose of this work is to develop a unifying method to derive electron density from standard T1-weighted MRI. We propose to combine both intensity and geometry information into a unifying probabilistic Bayesian framework for electron density mapping. For each voxel, we compute two conditional probability density functions (PDFs) of electron density given its: (1) T1-weighted MRI intensity, and (2) geometry in a reference anatomy, obtained by deformable image registration between the MRI of the atlas and test patient. The two conditional PDFs containing intensity and geometry information are combined into a unifying posterior PDF, whose mean value corresponds to the optimal electron density value under the mean-square error criterion. We evaluated the algorithm's accuracy of electron density mapping and its ability to detect bone in the head for eight patients, using an additional patient as the atlas or template. Mean absolute HU error between the estimated and true CT, as well as receiver operating characteristics for bone detection (HU > 200), were calculated. The performance was compared with a global intensity approach based on T1 and no density correction (set whole head to water). The proposed technique significantly reduced the errors in electron density estimation, with a mean absolute HU error of 126, compared with 139 for deformable registration (p = 2 × 10⁻⁴), 283 for the intensity approach (p = 2 × 10⁻⁶) and 282
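
    When both conditional PDFs are approximated as Gaussians, the fused estimate has a closed form (a precision-weighted mean); the HU numbers below are purely illustrative, and the paper's PDFs need not be Gaussian:

        def fuse_gaussian_pdfs(mu_i, var_i, mu_g, var_g):
            """Mean/variance of the (unnormalized) product of two Gaussian
            conditional PDFs: one intensity-based, one geometry-based."""
            w_i, w_g = 1.0 / var_i, 1.0 / var_g            # precisions
            var_post = 1.0 / (w_i + w_g)
            mu_post = var_post * (w_i * mu_i + w_g * mu_g)  # precision-weighted mean
            return mu_post, var_post

        # One voxel: the intensity channel suggests soft tissue while the
        # atlas geometry suggests bone; the narrower PDF dominates.
        mu, var = fuse_gaussian_pdfs(mu_i=80.0, var_i=150.0**2,
                                     mu_g=450.0, var_g=120.0**2)
        print(f"fused estimate: {mu:.0f} HU (sd {var**0.5:.0f})")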

  15. CLASSIFYING X-RAY BINARIES: A PROBABILISTIC APPROACH

    Gopalan, Giri; Bornn, Luke; Vrtilek, Saeqa Dil

    2015-01-01

    In X-ray binary star systems consisting of a compact object that accretes material from an orbiting secondary star, there is no straightforward means to decide whether the compact object is a black hole or a neutron star. To assist in this process, we develop a Bayesian statistical model that makes use of the fact that X-ray binary systems appear to cluster based on their compact object type when viewed in a three-dimensional coordinate system derived from X-ray spectral data, where the first coordinate is the ratio of counts in the mid- to low-energy band (color 1), the second coordinate is the ratio of counts in the high- to low-energy band (color 2), and the third coordinate is the sum of counts in all three bands. We use this model to estimate the probabilities of an X-ray binary system containing a black hole, a non-pulsing neutron star, or a pulsing neutron star. In particular, we utilize a latent variable model in which the latent variables follow a Gaussian process prior distribution, and hence we are able to induce the spatial correlation which we believe exists between systems of the same type. The utility of this approach is demonstrated by the accurate prediction of system types using Rossi X-ray Timing Explorer All Sky Monitor data, but it is not flawless. In particular, non-pulsing neutron star systems containing “bursters” that are close to the boundary demarcating systems containing black holes tend to be classified as black hole systems. As a byproduct of our analyses, we provide the astronomer with public R code which can be used to predict the compact object type of XRBs given training data.

  16. Exploring the uncertainties in cancer risk assessment using the integrated probabilistic risk assessment (IPRA) approach.

    Slob, Wout; Bakker, Martine I; Biesebeek, Jan Dirk Te; Bokkers, Bas G H

    2014-08-01

    Current methods for cancer risk assessment result in single values, without any quantitative information on the uncertainties in these values. Therefore, single risk values could easily be overinterpreted. In this study, we discuss a full probabilistic cancer risk assessment approach in which all the generally recognized uncertainties in both exposure and hazard assessment are quantitatively characterized and probabilistically evaluated, resulting in a confidence interval for the final risk estimate. The methodology is applied to three example chemicals (aflatoxin, N-nitrosodimethylamine, and methyleugenol). These examples illustrate that the uncertainty in a cancer risk estimate may be huge, making single value estimates of cancer risk meaningless. Further, a risk based on linear extrapolation tends to be lower than the upper 95% confidence limit of a probabilistic risk estimate, and in that sense it is not conservative. Our conceptual analysis showed that there are two possible basic approaches for cancer risk assessment, depending on the interpretation of the dose-incidence data measured in animals. However, it remains unclear which of the two interpretations is the more adequate one, adding an additional uncertainty to the already huge confidence intervals for cancer risk estimates. © 2014 Society for Risk Analysis.

  17. Probabilistic and machine learning-based retrieval approaches for biomedical dataset retrieval

    Karisani, Payam; Qin, Zhaohui S; Agichtein, Eugene

    2018-01-01

    The bioCADDIE dataset retrieval challenge brought together different approaches to retrieval of biomedical datasets relevant to a user’s query, expressed as a text description of a needed dataset. We describe experiments in applying a data-driven, machine learning-based approach to biomedical dataset retrieval as part of this challenge. We report on a series of experiments carried out to evaluate the performance of both probabilistic and machine learning-driven techniques from information retrieval, as applied to this challenge. Our experiments with probabilistic information retrieval methods, such as query term weight optimization, automatic query expansion and simulated user relevance feedback, demonstrate that automatically boosting the weights of important keywords in a verbose query is more effective than other methods. We also show that although there is a rich space of potential representations and features available in this domain, machine learning-based re-ranking models are not able to improve on probabilistic information retrieval techniques with the currently available training data. The models and algorithms presented in this paper can serve as a viable implementation of a search engine to provide access to biomedical datasets. The retrieval performance is expected to be further improved by using additional training data that is created by expert annotation, or gathered through usage logs, clicks and other processes during natural operation of the system. Database URL: https://github.com/emory-irlab/biocaddie

  18. A Probabilistic Approach to Predict Thermal Fatigue Life for Ball Grid Array Solder Joints

    Wei, Helin; Wang, Kuisheng

    2011-11-01

    Numerous studies of the reliability of solder joints have been performed. Most life prediction models are limited to a deterministic approach. However, manufacturing induces uncertainty in the geometry parameters of solder joints, and the environmental temperature varies widely due to end-user diversity, creating uncertainties in the reliability of solder joints. In this study, a methodology for accounting for variation in the lifetime prediction for lead-free solder joints of ball grid array packages (PBGA) is demonstrated. The key aspects of the solder joint parameters and the cyclic temperature range related to reliability are involved. Probabilistic solutions of the inelastic strain range and thermal fatigue life based on the Engelmaier model are developed to determine the probability of solder joint failure. The results indicate that the standard deviation increases significantly when more random variations are involved. Using the probabilistic method, the influence of each variable on the thermal fatigue life is quantified. This information can be used to optimize product design and process validation acceptance criteria. The probabilistic approach creates the opportunity to identify the root causes of failed samples from product fatigue tests and field returns. The method can be applied to better understand how variation affects parameters of interest in an electronic package design with area array interconnections.

  19. Multiple sequential failure model: A probabilistic approach to quantifying human error dependency

    Samanta

    1985-01-01

    This paper presents a probabilistic approach to quantifying human error dependency when multiple tasks are performed. Dependent human failures are dominant contributors to risks from nuclear power plants. An overview of the Multiple Sequential Failure (MSF) model developed and its use in probabilistic risk assessments (PRAs), depending on the available data, is given. A small-scale psychological experiment was conducted on the nature of human dependency, and the interpretation of the experimental data by the MSF model shows remarkable accommodation of the dependent failure data. The model, which provides a unique method for quantification of dependent failures in human reliability analysis, can be used in conjunction with any of the general methods currently used for performing the human reliability aspects of PRAs.

  20. A probabilistic approach for debris impact risk with numerical simulations of debris behaviors

    Kihara, Naoto; Matsuyama, Masafumi; Fujii, Naoki

    2013-01-01

    We propose a probabilistic approach for evaluating the impact risk of tsunami debris through Monte Carlo simulations with a combined system comprising a depth-averaged two-dimensional shallow water model and a discrete element model customized to simulate the motions of floating objects such as vessels. In the proposed method, first, probabilistic tsunami hazard analysis is carried out, and the exceedance probability of tsunami height and numerous tsunami time series for various hazard levels on the offshore side of a target site are estimated. Second, a characteristic tsunami time series for each hazard level is created by cluster analysis. Third, using the Monte Carlo simulation model the debris impact probability with the buildings of interest and the exceedance probability of debris impact speed are evaluated. (author)

  1. A Probabilistic Approach to Network Event Formation from Pre-Processed Waveform Data

    Kohl, B. C.; Given, J.

    2017-12-01

    The current state of the art for seismic event detection still largely depends on signal detection at individual sensor stations, including picking accurate arrival times and correctly identifying phases, and on fusion algorithms to associate individual signal detections into event hypotheses. But increasing computational capability has enabled progress toward the objective of fully utilizing body-wave recordings in an integrated manner to detect events without the necessity of previously recorded ground-truth events. In 2011-2012 Leidos (then SAIC) operated a seismic network to monitor activity associated with geothermal field operations in western Nevada. We developed a new association approach for detecting and quantifying events by probabilistically combining pre-processed waveform data to deal with noisy data and clutter at local distance ranges. The ProbDet algorithm maps continuous waveform data into continuous conditional probability traces, using a source model (e.g. Brune earthquake or Mueller-Murphy explosion) to map frequency content and an attenuation model to map amplitudes. Event detection and classification are accomplished by combining the conditional probabilities from the entire network using a Bayesian formulation. This approach was successful in producing a high-Pd, low-Pfa automated bulletin for a local network, and preliminary tests with regional and teleseismic data show that it has promise for global seismic and nuclear monitoring applications. The approach highlights several features that we believe are essential to achieving low-threshold automated event detection: it minimizes the utilization of individual seismic phase detections (in traditional techniques, errors in signal detection, timing, feature measurement and initial phase ID compound and propagate into errors in event formation); it has a formalized framework that utilizes information from non-detecting stations; and it has a formalized framework that utilizes source information, in
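
    The network-level combination can be illustrated with a naive-Bayes sketch: per-station conditional probabilities are converted to likelihood ratios and multiplied, so that non-detecting stations (ratios below one) legitimately pull the event probability down. This is a stand-in for, not a description of, the ProbDet formulation, and the prior and station probabilities are assumed values:

        import numpy as np

        def network_event_probability(p_station, prior=1e-4):
            """Combine per-station P(event | data_k) into a network-level
            probability, assuming conditional independence given the
            event state (naive-Bayes combination)."""
            p = np.asarray(p_station)
            # Likelihood ratio of each station's data for event vs. no event.
            lr = (p / (1.0 - p)) * ((1.0 - prior) / prior)
            posterior_odds = np.prod(lr) * prior / (1.0 - prior)
            return posterior_odds / (1.0 + posterior_odds)

        # Three stations see the signal weakly; two see nothing special
        # (posteriors near or below the prior), yet still contribute.
        p_stations = [0.30, 0.20, 0.15, 5e-5, 8e-5]
        print(f"network event probability: "
              f"{network_event_probability(p_stations):.3f}")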

  2. Dietary Exposure Assessment of Danish Consumers to Dithiocarbamate Residues in Food: a Comparison of the Deterministic and Probabilistic Approach

    Jensen, Bodil Hamborg; Andersen, Jens Hinge; Petersen, Annette

    2008-01-01

    Probabilistic and deterministic estimates of the acute and chronic exposure of the Danish population to dithiocarbamate residues were performed. The Monte Carlo Risk Assessment programme (MCRA 4.0) was used for the probabilistic risk assessment. Food consumption data were obtained from the nationwide dietary survey conducted in 2000-02. Residue data for 5721 samples from the monitoring programme conducted in the period 1998-2003 were used for dithiocarbamates, which had been determined as carbon disulphide. Contributions from 26 commodities were included in the calculations. Using the probabilistic approach, the daily acute intakes at the 99.9th percentile for adults and children were 11.2 and 28.2 μg kg⁻¹ body weight day⁻¹, representing 5.6% and 14.1% of the ARfD for maneb, respectively. When comparing the point estimate approach with the probabilistic approach, the outcome

  3. Tractable approximations for probabilistic models: The adaptive Thouless-Anderson-Palmer mean field approach

    Opper, Manfred; Winther, Ole

    2001-01-01

    We develop an advanced mean field method for approximating averages in probabilistic data models that is based on the Thouless-Anderson-Palmer (TAP) approach of disorder physics. In contrast to conventional TAP, where knowledge of the distribution of couplings between the random variables is required, our method adapts to the concrete couplings. We demonstrate the validity of our approach, which is so far restricted to models with nonglassy behavior, by replica calculations for a wide class of models as well as by simulations for a real data set.

  4. Universal Generating Function Based Probabilistic Production Simulation Approach Considering Wind Speed Correlation

    Yan Li

    2017-11-01

    Due to the volatile and correlated nature of wind speed, a high share of wind power penetration poses challenges for power system production simulation. Existing probabilistic production simulation approaches fall short of considering the time-varying characteristics of wind power and load, as well as the correlation between wind speeds at the same time, which causes problems in planning and analysis for power systems with high wind power penetration. Based on the universal generating function (UGF), this paper proposes a novel probabilistic production simulation approach that accounts for wind speed correlation. UGF is utilized to develop chronological models of wind power that characterize wind speed correlation, as well as chronological models of conventional generation sources and load. Supply and demand are matched chronologically to obtain not only generation schedules but also reliability indices, both at each simulation interval and over the whole period. The proposed approach has been tested on the improved IEEE-RTS 79 test system and is compared with the Monte Carlo approach and the sequence operation theory approach. The results verify the proposed approach and its merits of computational simplicity and accuracy.
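
    A universal generating function is just a discrete performance distribution, and combining independent units reduces to a polynomial-style product over a composition operator (here, summation of capacities); all unit states and probabilities below are illustrative, and the chronological wind-correlation modeling of the paper is beyond this sketch:

        from collections import defaultdict
        from itertools import product

        def ugf_combine(u1, u2):
            """Combine two UGFs, each a dict {performance level: probability},
            with the summation composition operator (total capacity)."""
            out = defaultdict(float)
            for (g1, p1), (g2, p2) in product(u1.items(), u2.items()):
                out[g1 + g2] += p1 * p2
            return dict(out)

        # Illustrative units: a conventional generator (full/derated/down, MW)
        # and a wind unit whose levels come from a discretized speed model.
        gen = {100: 0.90, 50: 0.07, 0: 0.03}
        wind = {60: 0.25, 30: 0.45, 0: 0.30}
        system = ugf_combine(gen, wind)

        # Loss-of-load probability against a 120 MW demand at this interval.
        lolp = sum(p for g, p in system.items() if g < 120)
        print(f"P(capacity < 120 MW) = {lolp:.4f}")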

  5. Caffeine and paraxanthine in aquatic systems: Global exposure distributions and probabilistic risk assessment.

    Rodríguez-Gil, J L; Cáceres, N; Dafouz, R; Valcárcel, Y

    2018-01-15

    This study presents one of the most complete applications of probabilistic methodologies to the risk assessment of emerging contaminants. Perhaps the most data-rich of these compounds, caffeine, as well as its main metabolite (paraxanthine), were selected for this study. Information for a total of 29,132 individual caffeine and 7442 paraxanthine samples was compiled, including samples where the compounds were not detected. The inclusion of non-detect samples (as censored data) in the estimation of environmental exposure distributions (EEDs) allowed for a realistic characterization of the global presence of these compounds in aquatic systems. EEDs were compared to species sensitivity distributions (SSDs), when possible, in order to calculate joint probability curves (JPCs) describing the risk to aquatic organisms. In this way, it was determined that unacceptable environmental risk (defined as 5% of species being potentially exposed to concentrations able to cause effects in >5% of cases) could be expected from chronic exposure to caffeine in effluent (28.4% of cases), surface water (6.7% of cases) and estuary water (5.4% of cases). Probabilities of exceedance of acute predicted no-effect concentrations (PNECs) for paraxanthine were higher than 5% for all assessed matrices except drinking water and ground water; however, no experimental effects data were available for paraxanthine, resulting in a precautionary deterministic hazard assessment for this compound. Given the chemical similarities between the two compounds, real effect thresholds, and thus risk, for paraxanthine would be expected to be close to those observed for caffeine. Negligible human health risk from exposure to caffeine via drinking water or groundwater is expected from the compiled data. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Identification of failure type in corroded pipelines: a bayesian probabilistic approach.

    Breton, T; Sanchez-Gheno, J C; Alamilla, J L; Alvarez-Ramirez, J

    2010-07-15

    Spillover of hazardous materials from transport pipelines can lead to catastrophic events with serious and dangerous environmental impact, potential fire events and human fatalities. The problem is more serious for large pipelines when the construction material is under environmental corrosion conditions, as in the petroleum and gas industries. In this way, predictive models can provide a suitable framework for risk evaluation, maintenance policies and substitution procedure design oriented to reducing these hazards. This work proposes a Bayesian probabilistic approach to identify and predict the type of failure (leakage or rupture) for steel pipelines under realistic corroding conditions. In the first step of the modeling process, the mechanical performance of the pipe is considered for establishing the conditions under which either leakage or rupture failure can occur. In the second step, experimental burst tests are used to introduce a mean probabilistic boundary defining a region where the type of failure is uncertain. In the boundary vicinity, the failure discrimination is carried out with a probabilistic model in which the events are considered random variables. In turn, the model parameters are estimated with available experimental data and contrasted with a real catastrophic event, showing good discrimination capacity. The results are discussed in terms of policies oriented to the inspection and maintenance of large-size pipelines in the oil and gas industry. 2010 Elsevier B.V. All rights reserved.

  7. Duplicate Detection in Probabilistic Data

    Panse, Fabian; van Keulen, Maurice; de Keijzer, Ander; Ritter, Norbert

    2009-01-01

    Collected data often contains uncertainties. Probabilistic databases have been proposed to manage uncertain data. To combine data from multiple autonomous probabilistic databases, an integration of probabilistic data has to be performed. Until now, however, data integration approaches have focused

  8. Specification of test criteria and probabilistic approach: the case of plutonium air transport

    Hubert, P.; Pages, P.; Ringot, C.; Tomachewsky, E.

    1989-03-01

    The safety of international transportation relies on compliance with IAEA regulations, which specify a series of tests that the package is supposed to withstand. For plutonium air transport, some national regulations, namely the US regulations, are more stringent than the IAEA ones. For example, the drop test is to be performed at 129 m·s⁻¹ instead of 13.4 m·s⁻¹. The development of international plutonium exchanges has raised the question of the adequacy of both standards. The purpose of this paper is to show how a probabilistic approach helps in assessing the efficiency of a move towards more stringent tests.

  9. A Probabilistic Approach to Fitting Period–luminosity Relations and Validating Gaia Parallaxes

    Sesar, Branimir; Fouesneau, Morgan; Bailer-Jones, Coryn A. L.; Gould, Andy; Rix, Hans-Walter [Max Planck Institute for Astronomy, Königstuhl 17, D-69117 Heidelberg (Germany); Price-Whelan, Adrian M., E-mail: bsesar@mpia.de [Department of Astrophysical Sciences, Princeton University, Princeton, NJ 08544 (United States)

    2017-04-01

    Pulsating stars, such as Cepheids, Miras, and RR Lyrae stars, are important distance indicators and calibrators of the “cosmic distance ladder,” and yet their period–luminosity–metallicity (PLZ) relations are still constrained using simple statistical methods that cannot take full advantage of available data. To enable optimal usage of data provided by the Gaia mission, we present a probabilistic approach that simultaneously constrains parameters of PLZ relations and uncertainties in Gaia parallax measurements. We demonstrate this approach by constraining PLZ relations of type ab RR Lyrae stars in near-infrared W1 and W2 bands, using Tycho-Gaia Astrometric Solution (TGAS) parallax measurements for a sample of ≈100 type ab RR Lyrae stars located within 2.5 kpc of the Sun. The fitted PLZ relations are consistent with previous studies, and in combination with other data, deliver distances precise to 6% (once various sources of uncertainty are taken into account). To a precision of 0.05 mas (1σ), we do not find a statistically significant offset in TGAS parallaxes for this sample of distant RR Lyrae stars (median parallax of 0.8 mas and distance of 1.4 kpc). With only minor modifications, our probabilistic approach can be used to constrain PLZ relations of other pulsating stars, and we intend to apply it to Cepheid and Mira stars in the near future.

  10. Simplified probabilistic approach to determine safety factors in deterministic flaw acceptance criteria

    Barthelet, B.; Ardillon, E.

    1997-01-01

    The flaw acceptance rules for nuclear components rely on deterministic criteria intended to ensure the safe operation of plants. The interest in having a reliable method of evaluating the safety margins and the integrity of components led Electricite de France to launch a study linking safety factors with target reliability. A simplified analytical probabilistic approach is developed to analyse the failure risk in fracture mechanics. Assuming lognormal distributions of the main random variables, it is possible, considering a simple Linear Elastic Fracture Mechanics model, to determine the failure probability as a function of the mean values and logarithmic standard deviations. The 'design' failure point can be calculated analytically. Partial safety factors on the main variables (stress, crack size, material toughness) are obtained in relation to target reliability values. The approach is generalized to elastic-plastic fracture mechanics (piping) by fitting J as a power-law function of stress, crack size and yield strength. The simplified approach is validated by detailed probabilistic computations with the PROBAN computer program. Assuming reasonable coefficients of variation (logarithmic standard deviations), the method helps to calibrate safety factors for different components, taking into account target reliability values in normal, emergency and faulted conditions. Statistical data for the mechanical properties of the main basic materials complement the study. The work involves laboratory results and manufacturing data. The results of this study are discussed within a working group of the French in-service inspection code RSE-M. (authors)
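
    The analytical core of such a simplified approach is that for lognormal variables the log-margin is normal, so the failure probability and the safety factor needed for a target reliability both have closed forms; the medians and logarithmic standard deviations below are assumed for illustration:

        from math import exp, log, sqrt
        from scipy.stats import norm

        # Assumed lognormal inputs: applied stress intensity K and fracture
        # toughness KIc (medians in MPa*sqrt(m), log standard deviations).
        med_K, sd_lnK = 40.0, 0.20
        med_KIc, sd_lnKIc = 120.0, 0.15

        # ln(KIc) - ln(K) is normal, giving the reliability index directly.
        sigma_m = sqrt(sd_lnK**2 + sd_lnKIc**2)
        beta = (log(med_KIc) - log(med_K)) / sigma_m
        print(f"beta = {beta:.2f}, P_f = {norm.cdf(-beta):.1e}")

        # The same algebra calibrates a central (median) safety factor to a
        # target reliability index, e.g. beta_t = 3.7:
        beta_t = 3.7
        print(f"required median safety factor: {exp(beta_t * sigma_m):.2f}")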

  11. Illustration of probabilistic approach in consequence assessment of accidental radioactive releases

    Pecha, P.; Hofman, R.; Kuca, P.

    2008-01-01

    We describe an application of uncertainty analysis of the environmental model HARP, applied to its atmospheric and deposition sub-model. Simulating the propagation of uncertainties through the model is a basic and unavoidable task, providing data for advanced techniques of probabilistic consequence assessment and for further improvement of the reliability of model predictions based on statistical procedures of assimilation with measured data. The activities are investigated in the institute IITA AV CR within a grant project supported by GACR (2007-2009). The problem is solved in close cooperation with the section of information systems of the institute NRPI. The subject of investigation is the evaluation of the consequences of radioactivity propagation after an accidental release from a nuclear facility. The transport of activity is studied from the initial atmospheric propagation, through deposition of radionuclides on terrain, to spreading through food chains towards the human body. Subsequent deposition processes of admixtures and food-chain activity transport are modeled. In the final step, a hazard estimate based on doses to the population is integrated into the software system HARP. The extension to a probabilistic approach has increased the complexity substantially, but it offers a much more informative background for modern methods of estimation that account for the inherently stochastic nature of the problem. The example of probabilistic assessment illustrated here is based on uncertainty analysis of the input parameters of the SGPM model. The predicted background field of Cs-137 deposition is labelled with the index p as X^p_SGPM. The final goal is the estimation of a certain unknown true background vector χ_true, which also accounts for deficiencies of the SGPM formulation itself, consisting in an insufficient description of reality. We must bear in mind that even if we knew the true values of all input parameters θ_m^true (m = 1, ..., M) of the SGPM model, χ_true would still remain uncertain. One possibility how to approach reality insists

  12. Experiencing a probabilistic approach to clarify and disclose uncertainties when setting occupational exposure limits.

    Vernez, David; Fraize-Frontier, Sandrine; Vincent, Raymond; Binet, Stéphane; Rousselle, Christophe

    2018-03-15

    Assessment factors (AFs) are commonly used for deriving reference concentrations for chemicals. These factors account for variabilities as well as uncertainties in the dataset, such as inter-species and intra-species variability, extrapolation across exposure durations, or extrapolation from the lowest-observed-adverse-effect level (LOAEL) to the no-observed-adverse-effect level (NOAEL). In a deterministic approach, the value of an AF is the result of a debate among experts, and often a conservative default value is used. A probabilistic framework to better take into account uncertainty and/or variability when setting occupational exposure limits (OELs) is presented and discussed in this paper. Each AF is considered a random variable with a probability distribution. A short literature review was conducted before setting default distribution ranges and shapes for each commonly used AF. Random sampling, using Monte Carlo techniques, is then used to propagate the identified uncertainties and compute the final OEL distribution. Starting from the broad default distributions obtained, experts narrow them to their most likely ranges, according to the scientific knowledge available for the specific chemical. Introducing distributions rather than single deterministic values allows disclosing and clarifying the variability and/or uncertainties inherent in the OEL construction process. This probabilistic approach yields quantitative insight into both the possible range and the relative likelihood of values for model outputs. It thereby provides better support for decision-making and improves transparency. This work is available in Open Access model and licensed under a CC BY-NC 3.0 PL license.
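
    The mechanics of the probabilistic derivation are straightforward to sketch: each AF becomes a lognormal random variable and the OEL distribution is obtained by Monte Carlo division; the point of departure and all distribution parameters below are illustrative assumptions, not recommended defaults:

        import numpy as np

        rng = np.random.default_rng(5)
        n = 100_000

        pod = 10.0  # assumed point of departure (e.g. animal NOAEL), mg/m3

        # Each assessment factor as a distribution instead of a fixed default.
        af_interspecies = rng.lognormal(np.log(4.0), 0.4, n)
        af_intraspecies = rng.lognormal(np.log(3.0), 0.4, n)
        af_duration = rng.lognormal(np.log(2.0), 0.3, n)

        oel = pod / (af_interspecies * af_intraspecies * af_duration)

        # The resulting distribution discloses the uncertainty that a single
        # deterministic OEL would hide; a low percentile can serve as a
        # protective value.
        p5, p50 = np.percentile(oel, [5, 50])
        print(f"median OEL = {p50:.3f} mg/m3, 5th percentile = {p5:.3f} mg/m3")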

  14. Various Approaches to Globalization of Women's Rights

    محمد تقی رفیعی

    2017-03-01

    Globalization is an undeniable fact; however, given its complex dimensions, assessing the concept is very difficult. The idea of a boundless world has always figured in discussions of globalization; thus, any discussion of globalizing women's rights, which touch the culture, tradition and moral values of a society, is controversial and momentous. First, regarding terminology, the concept of globalization is clarified. Then, three approaches (traditional, reformist and religious-modernist), which view the globalization of women's rights differently, are defined and examined. These approaches understand the globalization of women's rights as a way of achieving the same rules that are enshrined in the Convention on the Elimination of All Forms of Discrimination against Women (CEDAW). All of these approaches take religious attitudes toward the discussion; atheist opinions are not subject to this study. Finally, it seems that, in line with the viewpoint derived from the new religious-modernist perspective and with the great concern of the reformist approach for protecting religious values, new mechanisms can be designed which are not harmful to religious foundations on the one hand, and which pave the way for globalizing women's rights on the other.

  15. Comparison of a Traditional Probabilistic Risk Assessment Approach with Advanced Safety Analysis

    Smith, Curtis L; Mandelli, Diego; Zhegang Ma

    2014-11-01

    As part of the Light Water Reactor Sustainability (LWRS) Program [1], the purpose of the Risk Informed Safety Margin Characterization (RISMC) [2] Pathway research and development (R&D) is to support plant decisions for risk-informed margin management, with the aim of improving the economics and reliability, and sustaining the safety, of current NPPs. In this paper, we describe the RISMC analysis process, illustrating how mechanistic and probabilistic approaches are combined in order to estimate a safety margin. We use the scenario of a “station blackout” (SBO), wherein offsite power and onsite power are lost, thereby challenging plant safety systems. We describe the RISMC approach, illustrate the station blackout modeling, and contrast this with traditional risk analysis modeling for this type of accident scenario. We also describe the approach we are using to represent advanced flooding analysis.

  16. A novel approach for voltage secure operation using Probabilistic Neural Network in transmission network

    Santi Behera

    2016-05-01

    This work proposes a unique approach for improving the voltage stability limit using a Probabilistic Neural Network (PNN) classifier that selects among the corrective controls available in the system under contingency scenarios. The sensitivity of the system is analyzed to identify weak buses, with the ENVCI evaluation approaching zero. The inputs for training the classifier, termed the voltage stability enhancing neural network (VSENN) classifier, are line flows and bus voltages near the notch point of the P–V curve, and the output of the VSENN is a control variable. For various contingencies, the control action that improves both the voltage profile and the stability index is identified and trained accordingly. The trained VSENN is finally tested for its robustness in improving the load margin and the ENVCI beyond the trained set of operating conditions of the system and contingencies. The proposed approach is verified on the IEEE 39-bus test system.

  17. Survey on application of probabilistic fracture mechanics approach to nuclear piping

    Kashima, Koichi

    1987-01-01

    The probabilistic fracture mechanics (PFM) approach is newly developed as one of the tools to evaluate the structural integrity of nuclear components. This report describes the current status of PFM studies for pressure vessel and piping systems in light water reactors and focuses on the investigations of piping failure probability undertaken by the USNRC. The USNRC reevaluates the double-ended guillotine break (DEGB) of reactor coolant piping as a design basis event for nuclear power plants by using the PFM approach. For PWR piping systems designed by Westinghouse, two causes of pipe break are considered: pipe failure due to crack growth, and pipe failure indirectly caused by failure of component supports due to an earthquake. The PFM approach shows that the probability of DEGB from either cause is very low and that the effect of earthquakes on pipe failure can be neglected. (author)

  18. Probabilistic approach in treatment of deterministic analyses results of severe accidents

    Krajnc, B.; Mavko, B.

    1996-01-01

    Severe accident sequences resulting in loss of core geometric integrity have been found to have a small probability of occurrence. Because of their potential consequences for public health and safety, an evaluation of the core degradation progression and the resulting effects on the containment is necessary to determine the probability of a significant release of radioactive materials. This requires assessment of many interrelated phenomena, including: steel and zircaloy oxidation, steam spikes, in-vessel debris cooling, potential vessel failure mechanisms, release of core material to the containment, containment pressurization from steam generation or from generation of non-condensable gases or hydrogen burn, and ultimately the coolability of degraded core material. To assess the answers from the containment event trees, in the sense of whether a certain phenomenological event would happen or not, plant-specific deterministic analyses should be performed. Because there is large uncertainty in the prediction of severe accident phenomena in Level 2 analyses (containment event trees), a combination of the probabilistic and deterministic approaches should be used. In effect, the results of deterministic severe accident analyses are treated in a probabilistic manner, owing to the large uncertainty of the results, which is a consequence of the lack of detailed knowledge. This paper discusses the approach used in many IPEs, which assures that the probability assigned to a certain question in the event tree represents the probability that the event will or will not happen, and that this probability also includes its uncertainty, which is mainly the result of lack of knowledge. (author)

  19. Comparison of Two Probabilistic Fatigue Damage Assessment Approaches Using Prognostic Performance Metrics

    Xuefei Guan

    2011-01-01

    In this paper, two probabilistic prognosis updating schemes are compared. One is based on the classical Bayesian approach and the other on the newly developed maximum relative entropy (MRE) approach. The algorithm performance of the two models is evaluated using a set of recently developed prognostics-based metrics. Various uncertainties from measurements, modeling, and parameter estimation are integrated into the prognosis framework as random input variables for fatigue damage of materials. Measurements of response variables are then used to update the statistical distributions of the random variables, and the prognosis results are updated using posterior distributions. A Markov Chain Monte Carlo (MCMC) technique is employed to provide the posterior samples for model updating in the framework. Experimental data are used to demonstrate the operation of the proposed probabilistic prognosis methodology. A set of prognostics-based metrics is employed to quantitatively evaluate the prognosis performance and compare the proposed entropy method with the classical Bayesian updating algorithm. In particular, model accuracy, precision, robustness and convergence are rigorously evaluated, in addition to a qualitative visual comparison. Following this, potential developments and improvements of the prognostics-based metrics are discussed in detail.

  20. Assessing dynamic postural control during exergaming in older adults: A probabilistic approach.

    Soancatl Aguilar, V; Lamoth, C J C; Maurits, N M; Roerdink, J B T M

    2018-02-01

    Digital games controlled by body movements (exergames) have been proposed as a way to improve postural control among older adults. Exergames are meant to be played at home in an unsupervised way. However, only few studies have investigated the effect of unsupervised home-exergaming on postural control. Moreover, suitable methods to dynamically assess postural control during exergaming are still scarce. Dynamic postural control (DPC) assessment could be used to provide both meaningful feedback and automatic adjustment of exergame difficulty. These features could potentially foster unsupervised exergaming at home and improve the effectiveness of exergames as tools to improve balance control. The main aim of this study is to investigate the effect of six weeks of unsupervised home-exergaming on DPC as assessed by a recently developed probabilistic model. High probability values suggest 'deteriorated' postural control, whereas low probability values suggest 'good' postural control. In a pilot study, ten healthy older adults (mean age 77.9, SD 7.2 years) played an ice-skating exergame at home for half an hour per day, three times a week, for six weeks. The intervention effect on DPC was assessed using exergaming trials recorded by Kinect at baseline and every other week. Visualization of the results suggests that the probabilistic model is suitable for real-time DPC assessment. Moreover, linear mixed model analysis and parametric bootstrapping suggest a significant intervention effect on DPC. In conclusion, these results suggest that unsupervised exergaming for improving DPC among older adults is indeed feasible and that probabilistic models could be a new approach to assess DPC. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. A probabilistic approach to the computation of the levelized cost of electricity

    Geissmann, Thomas

    2017-01-01

    This paper sets forth a novel approach to calculate the levelized cost of electricity (LCOE) using a probabilistic model that accounts for endogenous input parameters. The approach is applied to the example of a nuclear and gas power project. Monte Carlo simulation results show that a correlation between input parameters has a significant effect on the model outcome. By controlling for endogeneity, a statistically significant difference in the mean LCOE estimate and a change in the order of input leverages is observed. Moreover, the paper discusses the role of discounting options and external costs in detail. In contrast to the gas power project, the economic viability of the nuclear project is considerably weaker. - Highlights: • First model of levelized cost of electricity accounting for uncertainty and endogeneities in input parameters. • Allowance for endogeneities significantly affects results. • Role of discounting options and external costs is discussed and modelled.
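
    A compact way to see the effect of correlated (endogenous) inputs is a Monte Carlo LCOE with a Gaussian copula between construction cost and construction time; the levelization formula is deliberately simplified and every number below is an illustrative assumption, not the paper's calibration:

        import numpy as np

        rng = np.random.default_rng(6)
        n = 50_000

        # Correlated inputs: overnight cost and build time move together.
        corr = np.array([[1.0, 0.6], [0.6, 1.0]])
        z = rng.multivariate_normal([0.0, 0.0], corr, size=n)
        capex = 4000.0 * np.exp(0.15 * z[:, 0])                # $/kW
        years_build = np.clip(7.0 + 2.0 * z[:, 1], 4.0, None)  # years

        opex = rng.normal(100.0, 10.0, n)  # $/kW-yr
        cf = rng.beta(20, 3, n)            # capacity factor
        r, lifetime = 0.07, 40             # discount rate, plant life

        # Longer builds inflate financed capital; then levelize over output.
        disc_capex = capex * (1.0 + r) ** (years_build / 2.0)
        crf = r / (1.0 - (1.0 + r) ** -lifetime)  # capital recovery factor
        lcoe = (disc_capex * crf + opex) / (cf * 8760.0) * 1000.0  # $/MWh

        print(f"LCOE = {np.mean(lcoe):.0f} $/MWh "
              f"(5th-95th: {np.percentile(lcoe, 5):.0f}"
              f"-{np.percentile(lcoe, 95):.0f})")

    Re-running with the off-diagonal correlation set to zero shifts the mean estimate and changes the width of the interval, which is the endogeneity effect the abstract reports.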

  2. A probabilistic approach to rock mechanical property characterization for nuclear waste repository design

    Kim, Kunsoo; Gao, Hang

    1996-01-01

    A probabilistic approach is proposed for the characterization of host rock mechanical properties at the Yucca Mountain site. This approach helps define the probability distribution of rock properties by utilizing extreme value statistics and Monte Carlo simulation. We analyze mechanical property data of tuff obtained by the NNWSI Project to assess the utility of the methodology. The analysis indicates that laboratory-measured strength and deformation data of Calico Hills and Bullfrog tuffs follow an extremal probability distribution (the third-type asymptotic distribution of smallest values). Monte Carlo simulation is carried out to estimate rock mass deformation moduli using a one-dimensional tuff model proposed by Zimmermann and Finley. We suggest that the results of these analyses be incorporated into the repository design.
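
    As a rough illustration of the proposed workflow (extreme-value fitting plus Monte Carlo), the sketch below fits a three-parameter Weibull distribution, i.e. the third asymptotic distribution of smallest values, to synthetic strength data and resamples it; the data are placeholders, not NNWSI measurements.

```python
# Hedged sketch: fitting a smallest-extreme-value (Weibull) distribution to
# laboratory strength data and sampling it in a Monte Carlo loop.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
strength = rng.weibull(5.0, 200) * 180.0 + 20.0   # synthetic strengths, MPa

# Third asymptotic distribution of smallest values == 3-parameter Weibull
shape, loc, scale = stats.weibull_min.fit(strength)
print(f"shape {shape:.2f}, location {loc:.1f} MPa, scale {scale:.1f} MPa")

# Monte Carlo propagation: sample strengths for a rock-mass model
samples = stats.weibull_min.rvs(shape, loc, scale, size=10_000, random_state=rng)
print("5th-percentile design strength:", np.percentile(samples, 5))
```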

  3. Development of Nuclear Safety Culture evaluation method for an operation team based on the probabilistic approach

    Han, Sang Min; Lee, Seung Min; Yim, Ho Bin; Seong, Poong Hyun

    2018-01-01

    Highlights:
    • We proposed a Probabilistic Safety Culture Healthiness Evaluation Method.
    • A positive relationship between the 'success' states of NSC and performance was shown.
    • The state probability profile showed a unique ratio regardless of the scenarios.
    • Cutset analysis provided not only the root causes but also the latent causes of failures.
    • Pro-SCHEMe was found to be applicable to Korean NPPs.
    Abstract: The aim of this study is to propose a new quantitative evaluation method for Nuclear Safety Culture (NSC) in Nuclear Power Plant (NPP) operation teams based on the probabilistic approach. Various NSC evaluation methods have been developed, and the Korean NPP utility company has conducted NSC assessments according to international practice. However, most methods rely on interviews, observations, and self-assessment. Consequently, the results are often qualitative, subjective, and mainly dependent on the evaluator's judgement, so the assessment results can be interpreted from different perspectives. To resolve the limitations of present evaluation methods, the concept of Safety Culture Healthiness was suggested to produce quantitative results and provide a faster evaluation process. This paper presents the Probabilistic Safety Culture Healthiness Evaluation Method (Pro-SCHEMe) to generate quantitative inputs for Human Reliability Assessment (HRA) in Probabilistic Safety Assessment (PSA). Evaluation items which correspond to basic events in PSA are derived in the first part of the paper through a literature survey, mostly from nuclear-related organizations such as the International Atomic Energy Agency (IAEA), the United States Nuclear Regulatory Commission (U.S.NRC), and the Institute of Nuclear Power Operations (INPO). Event trees (ETs) and fault trees (FTs) are devised to apply the evaluation items to PSA based on the relationships among such items. Modeling Guidelines are also suggested to classify and calculate NSC characteristics of
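
    Since the abstract maps evaluation items to basic events combined through ETs and FTs, a minimal sketch of computing a fault tree top-event probability from minimal cut sets may help; the event names, probabilities, and cut sets below are hypothetical, and independence of basic events is assumed.

```python
# Hedged sketch: top-event probability of a small fault tree from minimal
# cut sets, via inclusion-exclusion under independent basic events.
from itertools import combinations
from math import prod

p = {"A": 0.02, "B": 0.05, "C": 0.01}    # basic-event probabilities (hypothetical)
cut_sets = [("A", "B"), ("C",)]           # minimal cut sets (hypothetical)

# Inclusion-exclusion over cut sets (exact for independent events)
total = 0.0
for k in range(1, len(cut_sets) + 1):
    for combo in combinations(cut_sets, k):
        union_events = set().union(*combo)
        total += (-1) ** (k + 1) * prod(p[e] for e in union_events)

print(f"top-event probability: {total:.6f}")
```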

  4. Flood risk and adaptation strategies under climate change and urban expansion: A probabilistic analysis using global data.

    Muis, Sanne; Güneralp, Burak; Jongman, Brenden; Aerts, Jeroen C J H; Ward, Philip J

    2015-12-15

    An accurate understanding of flood risk and its drivers is crucial for effective risk management. Detailed risk projections, including uncertainties, are however rarely available, particularly in developing countries. This paper presents a method that integrates recent advances in global-scale modeling of flood hazard and land change, which enables the probabilistic analysis of future trends in national-scale flood risk. We demonstrate its application to Indonesia. We develop 1000 spatially-explicit projections of urban expansion from 2000 to 2030 that account for uncertainty associated with population and economic growth projections, as well as uncertainty in where urban land change may occur. The projections show that the urban extent increases by 215%-357% (5th and 95th percentiles). Urban expansion is particularly rapid on Java, which accounts for 79% of the national increase. From 2000 to 2030, increases in exposure will elevate flood risk by, on average, 76% and 120% for river and coastal floods. While sea level rise will further increase the exposure-induced trend by 19%-37%, the response of river floods to climate change is highly uncertain. However, as urban expansion is the main driver of future risk, the implementation of adaptation measures is increasingly urgent, regardless of the wide uncertainty in climate projections. Using probabilistic urban projections, we show that spatial planning can be a very effective adaptation strategy. Our study emphasizes that global data can be used successfully for probabilistic risk assessment in data-scarce countries. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. Retention and Curve Number Variability in a Small Agricultural Catchment: The Probabilistic Approach

    Kazimierz Banasik

    2014-04-01

    The variability of the curve number (CN) and the retention parameter (S) of the Soil Conservation Service curve number (SCS-CN) method in a small agricultural lowland watershed (23.4 km2 to the gauging station) in central Poland has been assessed using a probabilistic approach: distribution fitting and confidence intervals (CIs). Empirical CNs and Ss were computed directly from recorded rainfall depths and direct runoff volumes. Two measures of goodness of fit were used as selection criteria in the identification of the parent distribution function. The measures specified the generalized extreme value (GEV), normal and general logistic (GLO) distributions for 100-CN, and the GLO, lognormal and GEV distributions for S. The characteristics estimated from the theoretical distributions (median, quantiles) were compared to the tabulated CN and to the antecedent runoff conditions of Hawkins and Hjelmfelt. The distribution fitting for the whole sample revealed good agreement between the tabulated CN and the median, and between the antecedent runoff conditions (ARCs) of Hawkins and Hjelmfelt, which confirmed good calibration of the model. However, splitting the CN sample by heavy and moderate rainfall depths revealed a serious inconsistency between the parameters mentioned. This analysis proves that the application of the SCS-CN method should rely on deep insight into the probabilistic properties of CN and S.
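
    A small sketch of the empirical CN computation and distribution fitting described here, assuming the standard SCS-CN relations and the Hawkins closed-form inversion; the rainfall-runoff pairs are synthetic placeholders for the Polish catchment records.

```python
# Hedged sketch: empirical CN values from rainfall-runoff pairs and a GEV
# fit to 100-CN. P-Q data are synthetic.
import numpy as np
from scipy import stats

P = np.array([32.0, 45.0, 58.0, 71.0, 24.0, 90.0])   # rainfall depth, mm
Q = np.array([2.1, 5.0, 9.8, 16.0, 1.2, 28.0])       # direct runoff, mm

# Invert Q = (P - 0.2 S)^2 / (P + 0.8 S) for S (Hawkins closed form)
S = 5.0 * (P + 2.0 * Q - np.sqrt(4.0 * Q**2 + 5.0 * P * Q))
CN = 25400.0 / (254.0 + S)                            # S in mm

# Fit a GEV distribution to 100-CN and report the median CN
shape, loc, scale = stats.genextreme.fit(100.0 - CN)
median_cn = 100.0 - stats.genextreme.median(shape, loc, scale)
print(np.round(CN, 1), f"median CN from GEV: {median_cn:.1f}")
```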

  6. Simple probabilistic approach to evaluate radioiodine behavior at severe accidents: application to Phebus test FPT1

    Rydl, A.

    2007-01-01

    The contribution of radioiodine to risk from a severe accident is recognized to be one of the highest among all the fission products. In the long term (e.g. several days), volatile species of iodine are the most important forms of iodine from the safety point of view. These volatile forms ('volatile iodine') are mainly molecular iodine, I2, and various types of organic iodides, RI. A certain controversy exists today among the international research community about the relative importance of the processes leading to volatile iodine formation in containment under severe accident conditions. The body of experimental knowledge on the phenomenology of iodine behavior is enormous, and it is embedded in specialized mechanistic or empirical codes. An exhaustive description of the processes governing iodine behavior in containment is given in reference 1. Yet all this knowledge is still not enough to resolve some important questions. Moreover, the results of different codes, when applied to relatively simple experiments such as RTF or CAIMAN, vary widely. Thus, as a complement (or maybe even as an alternative in some instances) to deterministic analyses of iodine behavior, a simple probabilistic approach is proposed in this work which could help to see the whole problem in a different perspective. The final goal of using this approach should be the characterization of the uncertainties in the description of the various processes in question. This would allow identification of the processes which contribute most significantly to the overall uncertainty of the predictions of iodine volatility in containment. In this work we constructed a dedicated small event tree to describe iodine behavior during an accident, and we used that tree for a simple sensitivity study. For the evaluation of the tree, the US NRC code EVNTRE was used. To test the proposed probabilistic approach we analyzed results of the integral PHEBUS FPT1 experiment which comprises most of the important
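
    To illustrate what evaluating such a small event tree involves, the sketch below enumerates branch combinations and accumulates end-state probabilities; the top events, probabilities, and volatility mapping are purely hypothetical and are not the tree evaluated with EVNTRE.

```python
# Hedged sketch: evaluating a small event tree for iodine volatility by
# enumerating branch paths; names and probabilities are illustrative.
from itertools import product

# Each top event: (name, P(success-branch)); failure branch gets 1 - p
top_events = [("pH_controlled", 0.7),
              ("paint_organics_low", 0.6),
              ("dose_rate_high", 0.5)]

end_states = {}
for outcome in product([True, False], repeat=len(top_events)):
    prob = 1.0
    for (name, p), branch in zip(top_events, outcome):
        prob *= p if branch else 1.0 - p
    # crude mapping of branch combinations to an iodine-volatility class
    volatile = (not outcome[0]) or (not outcome[1] and outcome[2])
    key = "high_volatility" if volatile else "low_volatility"
    end_states[key] = end_states.get(key, 0.0) + prob

print(end_states)   # probabilities sum to 1 across end states
```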

  7. Improving probabilistic prediction of daily streamflow by identifying Pareto optimal approaches for modelling heteroscedastic residual errors

    McInerney, David; Thyer, Mark; Kavetski, Dmitri; Kuczera, George

    2017-04-01

    This study provides guidance that enables hydrological researchers to produce probabilistic predictions of daily streamflow with the best reliability and precision for different catchment types (e.g. high/low degree of ephemerality). Reliable and precise probabilistic prediction of daily catchment-scale streamflow requires statistical characterization of residual errors of hydrological models. It is commonly known that hydrological model residual errors are heteroscedastic, i.e. there is a pattern of larger errors in higher streamflow predictions. Although multiple approaches exist for representing this heteroscedasticity, few studies have undertaken a comprehensive evaluation and comparison of these approaches. This study fills this research gap by evaluating 8 common residual error schemes, including standard and weighted least squares, the Box-Cox transformation (with fixed and calibrated power parameter, lambda) and the log-sinh transformation. Case studies include 17 perennial and 6 ephemeral catchments in Australia and the USA, and two lumped hydrological models. We find the choice of heteroscedastic error modelling approach significantly impacts on predictive performance, though no single scheme simultaneously optimizes all performance metrics. The set of Pareto optimal schemes, reflecting performance trade-offs, comprises Box-Cox schemes with lambda of 0.2 and 0.5, and the log scheme (lambda=0, perennial catchments only). These schemes significantly outperform even the average-performing remaining schemes (e.g., across ephemeral catchments, median precision tightens from 105% to 40% of observed streamflow, and median biases decrease from 25% to 4%). Theoretical interpretations of empirical results highlight the importance of capturing the skew/kurtosis of raw residuals and reproducing zero flows. Recommendations for researchers and practitioners seeking robust residual error schemes for practical work are provided.
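
    A minimal sketch of one Pareto optimal scheme named here, the Box-Cox transformation with lambda = 0.2: residuals are computed in transformed space, assumed Gaussian there, and mapped back to produce predictive intervals; the flow series are synthetic.

```python
# Hedged sketch: a Box-Cox (lambda = 0.2) residual-error scheme for
# probabilistic streamflow prediction. Streamflow series are synthetic.
import numpy as np

rng = np.random.default_rng(3)

def boxcox(q, lam=0.2):
    return (q**lam - 1.0) / lam

def inv_boxcox(z, lam=0.2):
    return np.maximum(lam * z + 1.0, 0.0) ** (1.0 / lam)

q_obs = rng.gamma(2.0, 5.0, 500)                   # observed flows
q_sim = q_obs * rng.lognormal(0.0, 0.15, 500)      # hydrological model output

# Residuals in transformed space are approximately homoscedastic
eta = boxcox(q_obs) - boxcox(q_sim)
mu, sd = eta.mean(), eta.std(ddof=1)

# Probabilistic prediction for a new simulated flow value
q_new = 12.0
replicates = inv_boxcox(boxcox(q_new) + rng.normal(mu, sd, 10_000))
print(np.percentile(replicates, [5, 50, 95]))      # 90% predictive interval
```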

  8. Probabilistic approach to requalification of existing NPPs under aircraft crash loading

    Birbraer, A.N.; Roleder, A.J.; Shulman, G.S.

    1993-01-01

    A probabilistic approach to the analysis of NPP safety under aircraft impact is discussed. It may be used both for the requalification of existing NPPs and in the process of NPP design. The NPP is considered as a system of components: structures, pipes, different kinds of equipment, soil, and foundation. Exceeding the limit probability of radioactive product release outside the containment (i.e., non-fulfilment of the NPP safety requirements) is taken as the system failure criterion. An example of an event tree representing the sequence of events causing the failure is given. The methods for estimating the probabilities of the elementary events, from which the composite failure probability is evaluated, are described. (author)

  9. A Heuristic Probabilistic Approach to Estimating Size-Dependent Mobility of Nonuniform Sediment

    Woldegiorgis, B. T.; Wu, F. C.; van Griensven, A.; Bauwens, W.

    2017-12-01

    Simulating the mechanism of bed sediment mobility is essential for modelling sediment dynamics. Despite the fact that many studies have been carried out on this subject, they use complex mathematical formulations that are computationally expensive and often not easy to implement. In order to present a simple and computationally efficient complement to detailed sediment mobility models, we developed a heuristic probabilistic approach to estimating the size-dependent mobilities of nonuniform sediment based on the pre- and post-entrainment particle size distributions (PSDs), assuming that the PSDs are lognormally distributed. The approach fits a lognormal probability density function (PDF) to the pre-entrainment PSD of bed sediment and uses the threshold particle size of incipient motion and the concept of sediment mixture to estimate the PSDs of the entrained sediment and post-entrainment bed sediment. The new approach is simple in a physical sense and significantly reduces the complexity, computation time and resources required by detailed sediment mobility models. It is calibrated and validated with laboratory and field data by comparing to the size-dependent mobilities predicted with the existing empirical lognormal cumulative distribution function (CDF) approach. The novel features of the current approach are: (1) separating the entrained and non-entrained sediments by a threshold particle size, which is a modified critical particle size of incipient motion accounting for mixed-size effects, and (2) using the mixture-based pre- and post-entrainment PSDs to provide a continuous estimate of the size-dependent sediment mobility.
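
    A minimal sketch of the mixture idea, assuming a lognormal pre-entrainment PSD split at a threshold particle size; parameter values are illustrative.

```python
# Hedged sketch: size-dependent mobility from a lognormal pre-entrainment
# PSD split at a threshold particle size.
import numpy as np
from scipy import stats

mu, sigma = np.log(2.0), 0.8          # lognormal PSD of bed sediment, mm
d_threshold = 2.5                      # threshold size of incipient motion, mm

psd = stats.lognorm(s=sigma, scale=np.exp(mu))

# Fraction of bed sediment finer than the threshold -> entrained fraction
entrained_fraction = psd.cdf(d_threshold)

# PSDs of entrained / residual sediment are the truncated components
sizes = np.linspace(0.1, 20.0, 400)
pdf_entrained = np.where(sizes <= d_threshold, psd.pdf(sizes), 0.0) / entrained_fraction
pdf_residual = np.where(sizes > d_threshold, psd.pdf(sizes), 0.0) / (1.0 - entrained_fraction)

print(f"entrained fraction: {entrained_fraction:.2f}")
```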

  10. Numerical probabilistic analysis for slope stability in fractured rock masses using DFN-DEM approach

    Alireza Baghbanan

    2017-06-01

    Due to the existence of uncertainties in the input geometrical properties of fractures, there is no unique solution for assessing the stability of slopes in jointed rock masses. Therefore, the necessity of applying probabilistic analysis in these cases is inevitable. In this study a probabilistic analysis procedure together with relevant algorithms is developed using the Discrete Fracture Network-Distinct Element Method (DFN-DEM) approach. In the right abutment of Karun 4 dam and downstream of the dam body, five joint sets and one major joint have been identified. According to the geometrical properties of fractures in the Karun river valley, instability situations are probable in this abutment. In order to evaluate the stability of the rock slope, different combinations of joint set geometrical parameters are selected, and a series of numerical DEM simulations are performed on generated and validated DFN models in the DFN-DEM approach to measure the minimum required support patterns in dry and saturated conditions. Results indicate that the distribution of required bolt length is well fitted by a lognormal distribution in both circumstances. In dry conditions, the calculated mean value is 1125.3 m, and more than 80 percent of models need only 1614.99 m of bolts, which corresponds to a bolt pattern with 2 m spacing and 12 m length. For slopes in the saturated condition, the calculated mean value is 1821.8 m, and more than 80 percent of models need only 2653.49 m of bolts, which is equivalent to a bolt pattern with 15 m length and 1.5 m spacing. Comparison of the results obtained with the numerical and empirical methods shows that investigating slope stability with different DFN realizations conducted in different block patterns is more efficient than the empirical methods.
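
    For illustration, the "more than 80 percent of models" style of statement can be reproduced from a fitted lognormal, as in the sketch below; the standard deviation is an assumed value since the abstract quotes only the mean.

```python
# Hedged sketch: quantile of required bolt length from a lognormal fit.
import numpy as np
from scipy import stats

mean_len, sd_len = 1125.3, 420.0          # sd is an assumed value
sigma = np.sqrt(np.log(1.0 + (sd_len / mean_len) ** 2))
mu = np.log(mean_len) - 0.5 * sigma**2

bolt_length = stats.lognorm(s=sigma, scale=np.exp(mu))
print(f"80th percentile of required bolt length: {bolt_length.ppf(0.8):.1f} m")
```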

  11. From Sensory Signals to Modality-Independent Conceptual Representations: A Probabilistic Language of Thought Approach.

    Goker Erdogan

    2015-11-01

    People learn modality-independent, conceptual representations from modality-specific sensory signals. Here, we hypothesize that any system that accomplishes this feat will include three components: a representational language for characterizing modality-independent representations, a set of sensory-specific forward models for mapping from modality-independent representations to sensory signals, and an inference algorithm for inverting forward models, that is, an algorithm for using sensory signals to infer modality-independent representations. To evaluate this hypothesis, we instantiate it in the form of a computational model that learns object shape representations from visual and/or haptic signals. The model uses a probabilistic grammar to characterize modality-independent representations of object shape, uses a computer graphics toolkit and a human hand simulator to map from object representations to visual and haptic features, respectively, and uses a Bayesian inference algorithm to infer modality-independent object representations from visual and/or haptic signals. Simulation results show that the model infers identical object representations when an object is viewed, grasped, or both. That is, the model's percepts are modality invariant. We also report the results of an experiment in which different subjects rated the similarity of pairs of objects in different sensory conditions, and show that the model provides a very accurate account of subjects' ratings. Conceptually, this research significantly contributes to our understanding of modality invariance, an important type of perceptual constancy, by demonstrating how modality-independent representations can be acquired and used. Methodologically, it provides an important contribution to cognitive modeling, particularly the emerging probabilistic language-of-thought approach, by showing how symbolic and statistical approaches can be combined in order to understand aspects of human perception.

  12. Probabilistic Logic and Probabilistic Networks

    Haenni, R.; Romeijn, J.-W.; Wheeler, G.; Williamson, J.

    2009-01-01

    While in principle probabilistic logics might be applied to solve a range of problems, in practice they are rarely applied at present. This is perhaps because they seem disparate, complicated, and computationally intractable. However, we shall argue in this programmatic paper that several approaches

  13. 77 FR 29391 - An Approach for Probabilistic Risk Assessment in Risk-Informed Decisions on Plant-Specific...

    2012-05-17

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0110] An Approach for Probabilistic Risk Assessment in Risk-Informed Decisions on Plant-Specific Changes to the Licensing Basis AGENCY: Nuclear Regulatory Commission. ACTION: Draft regulatory guide; request for comment. SUMMARY: The U.S. Nuclear Regulatory...

  14. Global energy modeling - A biophysical approach

    Dale, Michael

    2010-09-15

    This paper contrasts the standard economic approach to energy modelling with energy models using a biophysical approach. Neither of these approaches includes changing energy-returns-on-investment (EROI) due to declining resource quality or the capital intensive nature of renewable energy sources. Both of these factors will become increasingly important in the future. An extension to the biophysical approach is outlined which encompasses a dynamic EROI function that explicitly incorporates technological learning. The model is used to explore several scenarios of long-term future energy supply especially concerning the global transition to renewable energy sources in the quest for a sustainable energy system.

  15. A probabilistic approach to quantify the uncertainties in internal dose assessment using response surface and neural network

    Baek, M.; Lee, S.K.; Lee, U.C.; Kang, C.S.

    1996-01-01

    A probabilistic approach is formulated to assess the internal radiation exposure following the intake of radioisotopes. This probabilistic approach consists of 4 steps as follows: (1) screening, (2) quantification of uncertainties, (3) propagation of uncertainties, and (4) analysis of output. The approach has been applied to Pu-induced internal dose assessment, and a multi-compartment dosimetric model is used for internal transport. In this approach, surrogate models of the original system are constructed using response surfaces and neural networks, and the results of these surrogate models are compared with those of the original model. Each surrogate model approximates the original model well. The uncertainty and sensitivity analyses of the model parameters are evaluated in this process. Dominant contributors to the dose of each organ are identified, and the results show that this approach could serve as a good tool for assessing internal radiation exposure.
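
    A minimal sketch of the response-surface side of this workflow: fit a cheap polynomial surrogate to a handful of runs of an expensive model, then do Monte Carlo on the surrogate; the "dose model" below is a stand-in, not the Pu biokinetic model.

```python
# Hedged sketch: polynomial response surface as a surrogate for an
# expensive dose model, then Monte Carlo propagation through it.
import numpy as np

rng = np.random.default_rng(4)

def dose_model(x1, x2):
    # placeholder for the expensive multi-compartment model
    return 0.8 * x1 + 0.3 * x2 + 0.5 * x1 * x2

# Design points -> fit a quadratic response surface by least squares
X1, X2 = rng.uniform(0, 1, (2, 50))
A = np.column_stack([np.ones(50), X1, X2, X1 * X2, X1**2, X2**2])
coef, *_ = np.linalg.lstsq(A, dose_model(X1, X2), rcond=None)

# Cheap Monte Carlo on the surrogate
x1, x2 = rng.uniform(0, 1, (2, 100_000))
A_mc = np.column_stack([np.ones(x1.size), x1, x2, x1 * x2, x1**2, x2**2])
dose = A_mc @ coef
print(f"mean dose {dose.mean():.3f}, 95th percentile {np.percentile(dose, 95):.3f}")
```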

  16. Probabilistic approach to the prediction of radioactive contamination of agricultural production

    Fesenko, S.F.; Chernyaeva, L.G.; Sanzharova, N.I.; Aleksakhin, R.M.

    1993-01-01

    The organization of agricultural production on territory contaminated as a result of the Chernobyl reactor disaster involves predicting the content of radionuclides in agro-industrial products. Traditional methods of predicting the contamination of the products do not give sufficient agreement with actual data, and as a result it is difficult to make the necessary decisions about eliminating the consequences of the disaster in the agro-industrial complex. In many ways this is because the available methods are based on data on the radionuclide content in soils, plants, and plant and animal products. The parameters of the models used in the prediction are also evaluated on the basis of these results. Even if obtained from a single field or herd of livestock, however, such indicators have substantial variation coefficients due to various factors such as the spatial structure of the fallout, the variability of the soil properties, the sampling error, the errors of processing and measuring the samples, as well as the data-averaging error. Consequently, the parameters of radionuclide transfer along the agricultural chains are highly variable, thus considerably reducing the reliability of predicted values. The reliability of the prediction of radioactive contamination of agricultural products can be increased substantially by taking a probabilistic approach involving information about the random laws of contamination of farming land and the statistical features of the parameters of radionuclide migration along food chains. Considering the above, a comparative analysis is made of the results obtained on the basis of the traditional treatment (deterministic in the simplest form) and its probabilistic analog

  17. A probabilistic approach for optimal sensor allocation in structural health monitoring

    Azarbayejani, M; Reda Taha, M M; El-Osery, A I; Choi, K K

    2008-01-01

    Recent advances in sensor technology promote using large sensor networks to efficiently and economically monitor, identify and quantify damage in structures. In structural health monitoring (SHM) systems, the effectiveness and reliability of the sensor network are crucial to determine the optimal number and locations of sensors in SHM systems. Here, we suggest a probabilistic approach for identifying the optimal number and locations of sensors for SHM. We demonstrate a methodology to establish the probability distribution function that identifies the optimal sensor locations such that damage detection is enhanced. The approach is based on using the weights of a neural network trained from simulations using a priori knowledge about damage locations and damage severities to generate a normalized probability distribution function for optimal sensor allocation. We also demonstrate that the optimal sensor network can be related to the highest probability of detection (POD). The redundancy of the proposed sensor network is examined using a 'leave one sensor out' analysis. A prestressed concrete bridge is selected as a case study to demonstrate the effectiveness of the proposed method. The results show that the proposed approach can provide a robust design for sensor networks that are more efficient than a uniform distribution of sensors on a structure
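
    The abstract gives few algorithmic details, so the sketch below is only a loose illustration of the idea: normalize network-derived saliencies into a probability distribution over candidate sensor sites and select the top-k; the weight vector is a random placeholder, not output from an actual trained SHM network.

```python
# Hedged sketch: normalized probability distribution over candidate
# sensor locations, derived from (placeholder) trained-network weights.
import numpy as np

rng = np.random.default_rng(5)
n_candidates, k = 30, 8

# Saliency of each candidate location, e.g. summed |weights| feeding it
saliency = np.abs(rng.normal(size=n_candidates))

pdf = saliency / saliency.sum()            # normalized allocation PDF
chosen = np.argsort(pdf)[::-1][:k]         # k most informative locations

print("sensor sites:", sorted(chosen.tolist()))
print("coverage of total probability:", pdf[chosen].sum().round(3))
```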

  18. Probabilistic and Fuzzy Arithmetic Approaches for the Treatment of Uncertainties in the Installation of Torpedo Piles

    Denise Margareth Kazue Nishimura Kunitaki

    2008-01-01

    The "torpedo" pile is a foundation system that has recently been considered to anchor mooring lines and risers of floating production systems for offshore oil exploitation. The pile is installed in a free-fall operation from a vessel. However, the soil parameters involved in the penetration model of the torpedo pile contain uncertainties that can affect the precision of the analysis methods used to evaluate its final penetration depth. Therefore, this paper deals with methodologies for assessing the sensitivity of the response to variations of the uncertain parameters and, mainly, for incorporating into the analysis method techniques for the formal treatment of the uncertainties. Probabilistic and "possibilistic" approaches are considered, involving, respectively, the Monte Carlo method (MC) and concepts of fuzzy arithmetic (FA). The results and performance of both approaches are compared, stressing the ability of the latter approach to deal efficiently with the uncertainties of the model, with outstanding computational efficiency, and therefore to constitute an effective design tool.
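
    A minimal sketch of the fuzzy arithmetic (FA) side, using alpha-cuts and interval propagation through a toy monotonic penetration model; the triangular fuzzy numbers and the depth formula are illustrative, not the paper's penetration model.

```python
# Hedged sketch: fuzzy arithmetic via alpha-cuts and interval propagation
# for a toy penetration-depth model.
import numpy as np

def alpha_cut(tri, alpha):
    a, m, b = tri                     # triangular fuzzy number (min, mode, max)
    return a + alpha * (m - a), b - alpha * (b - m)

undrained_strength = (20.0, 30.0, 45.0)   # kPa, fuzzy soil parameter
impact_velocity = (18.0, 22.0, 25.0)      # m/s, fuzzy operational parameter

def depth(su, v):                          # toy monotonic penetration model
    return 3.0 * v / np.sqrt(su)

for alpha in (0.0, 0.5, 1.0):
    su_lo, su_hi = alpha_cut(undrained_strength, alpha)
    v_lo, v_hi = alpha_cut(impact_velocity, alpha)
    # monotonicity: depth decreases with su, increases with v
    d_lo, d_hi = depth(su_hi, v_lo), depth(su_lo, v_hi)
    print(f"alpha={alpha:.1f}: depth in [{d_lo:.2f}, {d_hi:.2f}] m")
```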

  19. A probabilistic approach to the management of multi-stage multicriteria process

    Yu. V. Bugaev

    2017-01-01

    Currently, any production process is viewed as the primary means of generating profit and competitiveness; in other words, the process approach has become dominant. In this approach, the production of the final product appears as a network of interconnected processing steps during which inputs are converted into outputs, and a stable, accurately executed process delivers the planned quality most efficiently and cost-effectively. An example is the organization of bread production. The modern period is characterized by classical recovery technology, which improves the palatability of bread, enhances its flavor, and prolongs its freshness. Baking is a process that must be controlled in order to obtain the required quality parameters of the final product. One new and promising method of quality management for such processes is a probabilistic method that determines the increase in the probability of releasing quality products within the resources allocated to measures for improving the quality level. In this paper, a quality management concept based on a probabilistic approach is applied to a multi-stage process, with the probability of releasing high-quality products adopted as one of the main criteria. However, implementing measures to improve quality obviously requires committing certain resources, which is inevitably associated with monetary costs. Thus, we arrive at an optimal control problem with at least two criteria: the probability of successful completion of the process, which should be maximized, and the total cost of the corrective measures, which should be minimized. An idealized model of optimal control has been developed by the authors for the case when a single measure affects only a single step. A special case of the vector Warshall-Floyd algorithm was used to optimize the structure of the multi-step process. The use of vector optimization on graphs allowed the authors to
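
    To make the graph-optimization step concrete, here is a minimal scalar analogue (an assumption, since the paper's vector algorithm jointly tracks probability and cost): Floyd-Warshall adapted to maximize the product of per-stage success probabilities over a hypothetical stage graph.

```python
# Hedged sketch: Floyd-Warshall with (max, *) semiring to find the
# routing through a multi-stage process with the highest success
# probability. Stage graph and probabilities are hypothetical.
n = 4
P = [[0.0] * n for _ in range(n)]      # P[i][j]: success prob of step i -> j
P[0][1], P[1][2], P[2][3] = 0.9, 0.8, 0.95
P[0][2], P[1][3] = 0.7, 0.85           # alternative routings
for i in range(n):
    P[i][i] = 1.0

best = [row[:] for row in P]
for k in range(n):
    for i in range(n):
        for j in range(n):
            best[i][j] = max(best[i][j], best[i][k] * best[k][j])

print(f"best success probability from stage 0 to {n-1}: {best[0][n-1]:.3f}")
```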

  20. A shortened version of the THERP/Handbook approach to human reliability analysis for probabilistic risk assessment

    Swain, A.D.

    1986-01-01

    The approach to human reliability analysis (HRA) known as THERP/Handbook has been applied in several probabilistic risk assessments (PRAs) of nuclear power plants (NPPs) and other complex systems. The approach is based on a thorough task analysis of the man-machine interfaces, including the interactions among the people involved in the operations being assessed. The idea is to fully assess the underlying performance shaping factors (PSFs) and dependence effects which result in either reliable or unreliable human performance

  1. A hybrid deterministic-probabilistic approach to model the mechanical response of helically arranged hierarchical strands

    Fraldi, M.; Perrella, G.; Ciervo, M.; Bosia, F.; Pugno, N. M.

    2017-09-01

    Very recently, a Weibull-based probabilistic strategy has been successfully applied to bundles of wires to determine their overall stress-strain behaviour, also capturing previously unpredicted nonlinear and post-elastic features of hierarchical strands. This approach is based on the so-called "Equal Load Sharing (ELS)" hypothesis by virtue of which, when a wire breaks, the load acting on the strand is homogeneously redistributed among the surviving wires. Despite the overall effectiveness of the method, some discrepancies between theoretical predictions and in silico Finite Element-based simulations or experimental findings might arise when more complex structures are analysed, e.g. helically arranged bundles. To overcome these limitations, an enhanced hybrid approach is proposed in which the probability of rupture is combined with a deterministic mechanical model of a strand constituted by helically-arranged and hierarchically-organized wires. The analytical model is validated comparing its predictions with both Finite Element simulations and experimental tests. The results show that generalized stress-strain responses - incorporating tension/torsion coupling - are naturally found and, once one or more elements break, the competition between geometry and mechanics of the strand microstructure, i.e. the different cross sections and helical angles of the wires in the different hierarchical levels of the strand, determines the no longer homogeneous stress redistribution among the surviving wires whose fate is hence governed by a "Hierarchical Load Sharing" criterion.
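
    The classical Weibull/ELS fibre-bundle response that this work builds on can be written in a few lines: under equal load sharing, the bundle stress equals the intact-wire stress times the Weibull survival probability. The modulus and Weibull parameters below are illustrative assumptions.

```python
# Hedged sketch: Weibull/ELS fibre-bundle stress-strain response.
import numpy as np

E = 200e3            # wire Young's modulus, MPa (assumed)
m, eps0 = 5.0, 0.02  # Weibull modulus and scale strain (assumed)

eps = np.linspace(0.0, 0.06, 300)
survival = np.exp(-(eps / eps0) ** m)     # fraction of unbroken wires
sigma = E * eps * survival                # ELS bundle stress

peak = eps[np.argmax(sigma)]
print(f"peak bundle stress {sigma.max():.0f} MPa at strain {peak:.4f}")
# analytic peak strain for this model: eps0 * m**(-1/m)
print("analytic peak strain:", eps0 * m ** (-1.0 / m))
```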

  2. Integrated software health management for aerospace guidance, navigation, and control systems: A probabilistic reasoning approach

    Mbaya, Timmy

    Embedded aerospace systems have to perform safety- and mission-critical operations in a real-time environment where timing and functional correctness are extremely important. Guidance, Navigation, and Control (GN&C) systems substantially rely on complex software interfacing with hardware in real time; any faults in software or hardware, or in their interaction, could result in fatal consequences. Integrated Software Health Management (ISWHM) provides an approach for the detection and diagnosis of software failures while the software is in operation. The ISWHM approach is based on probabilistic modeling of software and hardware sensors using a Bayesian network. To meet the memory and timing constraints of real-time embedded execution, the Bayesian network is compiled into an Arithmetic Circuit, which is used for on-line monitoring. This type of system monitoring, using an ISWHM, provides automated reasoning capabilities that compute diagnoses in a timely manner when failures occur. This reasoning capability enables time-critical mitigating decisions and relieves the human agent from the time-consuming and arduous task of foraging through a multitude of isolated, and often contradictory, diagnosis data. For the purpose of demonstrating the relevance of ISWHM, modeling and reasoning is performed on a simple simulated aerospace system running on a real-time operating system emulator, the OSEK/Trampoline platform. Models for a small satellite and an F-16 fighter jet GN&C (Guidance, Navigation, and Control) system have been implemented. Analysis of the ISWHM is then performed by injecting faults and analyzing the ISWHM's diagnoses.

  3. Application of a time probabilistic approach to seismic landslide hazard estimates in Iran

    Rajabi, A. M.; Del Gaudio, V.; Capolongo, D.; Khamehchiyan, M.; Mahdavifar, M. R.

    2009-04-01

    Iran is a country located in a tectonically active belt and is prone to earthquakes and related phenomena. In recent years, several earthquakes caused many fatalities and damage to facilities, e.g. the Manjil (1990), Avaj (2002), Bam (2003) and Firuzabad-e-Kojur (2004) earthquakes. These earthquakes generated many landslides. For instance, catastrophic landslides triggered by the Manjil earthquake (Ms = 7.7) in 1990 buried the village of Fatalak, killed more than 130 people and cut many important roads and other lifelines, resulting in major economic disruption. In general, earthquakes in Iran have been concentrated in two major zones with different seismicity characteristics: one is the region of Alborz and Central Iran and the other is the Zagros Orogenic Belt. Understanding where seismically induced landslides are most likely to occur is crucial to reducing property damage and loss of life in future earthquakes. For this purpose, a time-probabilistic approach for earthquake-induced landslide hazard at regional scale, proposed by Del Gaudio et al. (2003), has been applied to the whole Iranian territory to provide the basis of hazard estimates. This method consists in evaluating the recurrence of seismically induced slope failure conditions inferred from Newmark's model. First, by adopting Arias intensity to quantify seismic shaking and using different Arias attenuation relations for the Alborz - Central Iran and Zagros regions, well-established methods of seismic hazard assessment, based on the Cornell (1968) method, were employed to obtain the occurrence probabilities for different levels of seismic shaking in a time interval of interest (50 years). Then, following Jibson (1998), empirical formulae specifically developed for Alborz - Central Iran and Zagros were used to represent, according to Newmark's model, the relation linking Newmark's displacement Dn to Arias intensity Ia and to slope critical acceleration ac. These formulae were employed to evaluate
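
    A sketch of the Newmark displacement regression attributed to Jibson (1998), with the coefficients quoted from memory and therefore to be verified against the original paper before any real use.

```python
# Hedged sketch: Newmark displacement from Arias intensity and critical
# acceleration, log10(Dn) = 1.521 log10(Ia) - 1.993 log10(ac) - 1.546
# (Dn in cm, Ia in m/s, ac in g). Coefficients quoted from memory.
import numpy as np

def newmark_displacement_cm(ia_ms, ac_g):
    return 10.0 ** (1.521 * np.log10(ia_ms) - 1.993 * np.log10(ac_g) - 1.546)

ia = np.array([0.5, 1.0, 2.0])      # Arias intensity, m/s
ac = 0.1                            # slope critical acceleration, g
print(np.round(newmark_displacement_cm(ia, ac), 1))
```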

  4. Improving predictive power of physically based rainfall-induced shallow landslide models: a probabilistic approach

    S. Raia

    2014-03-01

    Distributed models to forecast the spatial and temporal occurrence of rainfall-induced shallow landslides are based on deterministic laws. These models extend spatially the static stability models adopted in geotechnical engineering and adopt an infinite-slope geometry to balance the resisting and driving forces acting on the sliding mass. An infiltration model is used to determine how rainfall changes pore-water conditions, modulating the local stability/instability conditions. A problem with the operation of the existing models lies in the difficulty of obtaining accurate values for the several variables that describe the material properties of the slopes. The problem is particularly severe when the models are applied over large areas, for which sufficient information on the geotechnical and hydrological conditions of the slopes is generally not available. To help solve the problem, we propose a probabilistic Monte Carlo approach to the distributed modeling of rainfall-induced shallow landslides. For this purpose, we have modified the transient rainfall infiltration and grid-based regional slope-stability analysis (TRIGRS) code. The new code (TRIGRS-P) adopts a probabilistic approach to compute, on a cell-by-cell basis, transient pore-pressure changes and related changes in the factor of safety due to rainfall infiltration. Infiltration is modeled using analytical solutions of partial differential equations describing one-dimensional vertical flow in isotropic, homogeneous materials. Both saturated and unsaturated soil conditions can be considered. TRIGRS-P copes with the natural variability inherent in the mechanical and hydrological properties of the slope materials by allowing the values of the TRIGRS model input parameters to be sampled randomly from a given probability distribution. The range of variation and the mean value of the parameters can be determined by the usual methods used for preparing the TRIGRS input parameters. The outputs
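
    A minimal sketch of the probabilistic, cell-by-cell idea: Monte Carlo sampling of soil strength parameters through the standard infinite-slope factor-of-safety expression used by TRIGRS; the parameter ranges are illustrative, not calibrated inputs.

```python
# Hedged sketch: Monte Carlo factor-of-safety for an infinite slope.
import numpy as np

rng = np.random.default_rng(6)
n = 50_000

beta = np.radians(35.0)                   # slope angle
z, gamma_s, gamma_w = 2.0, 20.0, 9.81     # depth (m), unit weights (kN/m3)
psi = 0.5                                 # pressure head at depth z (m)

# Random soil strength parameters (sampled like TRIGRS-P inputs)
c = rng.uniform(2.0, 8.0, n)                     # cohesion, kPa
phi = np.radians(rng.normal(32.0, 3.0, n))       # friction angle

fs = (np.tan(phi) / np.tan(beta)
      + (c - psi * gamma_w * np.tan(phi)) / (gamma_s * z * np.sin(beta) * np.cos(beta)))

print(f"P(FS < 1) = {(fs < 1.0).mean():.3f}")
```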

  5. A probabilistic approach for the interpretation of RNA profiles as cell type evidence.

    de Zoete, Jacob; Curran, James; Sjerps, Marjan

    2016-01-01

    DNA profiles can be used as evidence to distinguish between possible donors of a crime stain. In some cases, both the prosecution and the defence claim that the cell material was left by the suspect, but they dispute which cell type was left behind. For example, in sexual offense cases the prosecution could claim that the sample contains semen cells whereas the defence argues that the sample contains skin cells. In these cases, traditional methods (e.g. a phosphatase test) can be used to examine the cell type contained in the sample. However, there are some drawbacks to using these methods. For instance, many of these techniques need to be carried out separately for each cell type, and each of them requires part of the available sample, which reduces the amount that can be used for DNA analysis. Another option is messenger RNA (mRNA) evidence. mRNA expression levels vary among cell types and can be used to make (probability) statements about the cell type(s) present in a sample. Existing methods for the interpretation of RNA profiles as evidence for the presence of certain cell types aim at making categorical statements. Such statements limit the possibility of reporting the associated uncertainty. Some of these existing methods are discussed; most notably, a method based on an 'n/2' scoring rule (Lindenbergh et al.) and a method using marker values and cell type scoring thresholds (Roeder et al.). From a statistical point of view, a probabilistic approach is the most obvious choice. Two approaches (multinomial logistic regression and naïve Bayes) are suggested. All methods are compared, using two different datasets and several criteria regarding their ability to assess the evidential value of RNA profiles. We conclude that both the naïve Bayes method and a method based on multinomial logistic regression, which produce a probabilistic statement as a measure of evidential value, are an important improvement over the existing methods. Besides a better performance
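
    A minimal sketch of the multinomial logistic regression option on synthetic marker data; the marker layout and cell-type names are placeholders, not the actual mRNA panel.

```python
# Hedged sketch: multinomial logistic regression over mRNA marker values
# to output cell-type probabilities instead of categorical calls.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
classes = ["semen", "skin", "saliva"]

# Synthetic marker intensity profiles, 6 markers per sample
X = np.vstack([rng.normal(loc, 1.0, (60, 6)) for loc in (0.0, 1.5, 3.0)])
y = np.repeat(classes, 60)

clf = LogisticRegression(max_iter=1000).fit(X, y)

profile = rng.normal(1.4, 1.0, (1, 6))       # questioned RNA profile
for cls, p in zip(clf.classes_, clf.predict_proba(profile)[0]):
    print(f"P({cls} | profile) = {p:.3f}")
```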

  6. Hybrid biasing approaches for global variance reduction

    Wu, Zeyun; Abdel-Khalik, Hany S.

    2013-01-01

    A new variant of a Monte Carlo-deterministic (DT) hybrid variance reduction approach based on Gaussian process theory is presented for accelerating the convergence of Monte Carlo simulation and compared with the Forward-Weighted Consistent Adjoint Driven Importance Sampling (FW-CADIS) approach implemented in the SCALE package from Oak Ridge National Laboratory. The new approach, denoted the Gaussian process approach, treats the responses of interest as normally distributed random processes. The Gaussian process approach improves the selection of the weight windows of simulated particles by identifying a subspace that captures the dominant sources of statistical response variations. Like the FW-CADIS approach, the Gaussian process approach utilizes particle importance maps obtained from deterministic adjoint models to derive weight-window biasing. In contrast to the FW-CADIS approach, the Gaussian process approach identifies the response correlations (via a covariance matrix) and employs them to reduce the computational overhead required for global variance reduction (GVR). The effective rank of the covariance matrix identifies the minimum number of uncorrelated pseudo responses, which are employed to bias the simulated particles. Numerical experiments, serving as a proof of principle, are presented to compare the Gaussian process and FW-CADIS approaches in terms of the global reduction in the standard deviation of the estimated responses.
    Highlights:
    ► A hybrid Monte Carlo-deterministic method based on a Gaussian process model is introduced.
    ► The method employs a deterministic model to calculate response correlations.
    ► The method employs the correlations to bias Monte Carlo transport.
    ► The method is compared to the FW-CADIS methodology in the SCALE code.
    ► An order-of-magnitude speed-up is achieved for a PWR core model.

  7. Probabilistic approaches to accounting for data variability in the practical application of bioavailability in predicting aquatic risks from metals.

    Ciffroy, Philippe; Charlatchka, Rayna; Ferreira, Daniel; Marang, Laura

    2013-07-01

    The biotic ligand model (BLM) theoretically enables the derivation of environmental quality standards that are based on true bioavailable fractions of metals. Several physicochemical variables (especially pH, major cations, dissolved organic carbon, and dissolved metal concentrations) must, however, be assigned to run the BLM, but they are highly variable in time and space in natural systems. This article describes probabilistic approaches for integrating such variability during the derivation of risk indexes. To describe each variable using a probability density function (PDF), several methods were combined to 1) treat censored data (i.e., data below the limit of detection), 2) incorporate the uncertainty of the solid-to-liquid partitioning of metals, and 3) detect outliers. From a probabilistic perspective, 2 alternative approaches that are based on log-normal and Γ distributions were tested to estimate the probability of the predicted environmental concentration (PEC) exceeding the predicted no-effect concentration (PNEC), i.e., p(PEC/PNEC>1). The probabilistic approach was tested on 4 real-case studies based on Cu-related data collected from stations on the Loire and Moselle rivers. The approach described in this article is based on BLM tools that are freely available for end-users (i.e., the Bio-Met software) and on accessible statistical data treatments. This approach could be used by stakeholders who are involved in risk assessments of metals for improving site-specific studies. Copyright © 2013 SETAC.
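
    A minimal sketch of estimating p(PEC/PNEC > 1) by Monte Carlo with lognormal PDFs; the parameters are illustrative, not fitted to the Loire and Moselle Cu data.

```python
# Hedged sketch: p(PEC/PNEC > 1) by Monte Carlo with lognormal PDFs.
import numpy as np

rng = np.random.default_rng(8)
n = 200_000

pec = rng.lognormal(mean=np.log(2.0), sigma=0.6, size=n)    # exposure, ug/L
pnec = rng.lognormal(mean=np.log(5.0), sigma=0.4, size=n)   # effect threshold, ug/L

risk_index = pec / pnec
print(f"p(PEC/PNEC > 1) = {(risk_index > 1.0).mean():.4f}")
```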

  8. A Probabilistic Alternative Approach to Optimal Project Profitability Based on the Value-at-Risk

    Yonggu Kim

    2018-03-01

    This paper focuses on an investment decision-making process for sustainable development based on the profitability impact factors for overseas projects. Investors prefer to use the discounted cash-flow method. Although this method is simple and straightforward, its critical weakness is its inability to reflect the volatility of the factors associated with the project evaluation. To overcome this weakness, the Value-at-Risk method is used to incorporate the volatility of the profitability impact factors, thereby reflecting the risks and establishing decision-making criteria for risk-averse investors. Risk-averse investors can lose relatively acceptable investment opportunities to risk-neutral or risk-tolerant investors due to strict investment decision-making criteria. To overcome this problem, critical factors are selected through a Monte Carlo simulation and a sensitivity analysis, and solutions to the critical-factor problems are then found by using the Theory of Inventive Problem Solving and a business version of the Project Definition Rating Index. This study examines the process of recovering investment opportunities with projects that are investment feasible but have been rejected when applying the criterion of the Value-at-Risk method. To do this, a probabilistic alternative approach is taken. To validate this methodology, the proposed framework for an improved decision-making process is demonstrated using two actual overseas projects of a Korean steel-making company.
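
    A minimal sketch of a Value-at-Risk criterion applied to a simulated project NPV distribution; the cash-flow model and volatilities are illustrative assumptions, not the case-study projects.

```python
# Hedged sketch: Value-at-Risk on a Monte Carlo NPV distribution.
import numpy as np

rng = np.random.default_rng(9)
n, years, rate = 100_000, 10, 0.08

capex = 120.0                                        # $M, year 0 (assumed)
margin = rng.normal(22.0, 6.0, (n, years))           # uncertain annual cash flow, $M
disc = (1 + rate) ** -np.arange(1, years + 1)
npv = margin @ disc - capex

var95 = -np.percentile(npv, 5)                       # 95% Value-at-Risk
print(f"mean NPV {npv.mean():.1f} $M, 95% VaR {var95:.1f} $M")
print("accept under a strict VaR criterion:", np.percentile(npv, 5) > 0)
```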

  9. Environmental risk assessment of white phosphorus from the use of munitions - a probabilistic approach.

    Voie, Øyvind Albert; Johnsen, Arnt; Strømseng, Arnljot; Longva, Kjetil Sager

    2010-03-15

    White phosphorus (P4) is a highly toxic compound used in various pyrotechnic products. Munitions containing P4 are widely used in military training areas, where unburned P4 contaminates soil and local ponds. Traditional risk assessment methods presuppose a homogeneous spatial distribution of pollutants. The distribution of P4 in military training areas is heterogeneous, which reduces the probability of potential receptors being exposed to the P4 by ingestion, for example. The current approach to assessing the environmental risk from the use of P4 suggests a Bayesian network (Bn) as a risk assessment tool. The probabilistic reasoning supported by a Bn allows us to take into account the heterogeneous distribution of P4. Furthermore, one can combine empirical data and expert knowledge, which allows the inclusion of all kinds of data that are relevant to the problem. The current work includes an example of the use of the Bn as a risk assessment tool, where the risk of P4 poisoning in humans and grazing animals at a military shooting range in Northern Norway was calculated. P4 was detected in several craters on the range at concentrations up to 5.7 g/kg. The risk to human health was considered acceptable under the current land use. The risk for grazing animals such as sheep, however, was higher, suggesting that precautionary measures may be advisable.

  10. A robust probabilistic approach for variational inversion in shallow water acoustic tomography

    Berrada, M; Badran, F; Crépon, M; Thiria, S; Hermand, J-P

    2009-01-01

    This paper presents a variational methodology for inverting shallow water acoustic tomography (SWAT) measurements. The aim is to determine the vertical profile of the speed of sound c(z), knowing the acoustic pressures generated by a frequency source and collected by a sparse vertical hydrophone array (VRA). A variational approach that minimizes a cost function measuring the distance between observations and their modeled equivalents is used. A regularization term in the form of a quadratic restoring term to a background is also added. To avoid inverting the variance-covariance matrix associated with the weighted quadratic background term above, this work proposes to model the sound speed vector using probabilistic principal component analysis (PPCA). The PPCA introduces an optimum reduced number of non-correlated latent variables η, which determine a new control vector and a new regularization term, expressed as ηᵀη. The PPCA represents a rigorous formalism for the use of a priori information and allows an efficient implementation of the variational inverse method.
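
    A minimal sketch of the PPCA re-parameterization, using the Tipping-Bishop closed-form solution to express sound-speed profiles through a few latent variables η with regularizer ηᵀη; the profile ensemble below is synthetic, not SWAT data.

```python
# Hedged sketch: probabilistic PCA (Tipping & Bishop closed form) used to
# re-parameterize sound-speed profiles by a few latent variables.
import numpy as np

rng = np.random.default_rng(10)
n, d, q = 300, 40, 3                      # profiles, depth points, latent dims

# Synthetic ensemble of sound-speed profiles c(z)
z = np.linspace(0, 1, d)
C = (1500 + 20 * np.outer(rng.normal(size=n), np.sin(np.pi * z))
          + 5 * rng.normal(size=(n, d)))

mu = C.mean(axis=0)
evals, evecs = np.linalg.eigh(np.cov(C, rowvar=False))
evals, evecs = evals[::-1], evecs[:, ::-1]          # descending order

sigma2 = evals[q:].mean()                           # ML noise variance
W = evecs[:, :q] * np.sqrt(evals[:q] - sigma2)      # ML loading matrix

# A profile is modeled as c = mu + W @ eta + noise; eta is the reduced
# control vector, and the regularization term is simply eta @ eta.
eta = rng.normal(size=q)
c_model = mu + W @ eta
print(c_model[:5].round(1))
```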

  11. Probabilistic approaches applied to damage and embrittlement of structural materials in nuclear power plants

    Vincent, L.

    2012-01-01

    The present study deals with the long-term mechanical behaviour and damage of structural materials in nuclear power plants. An experimental approach is first followed to study the thermal fatigue of austenitic stainless steels, with a focus on the effects of mean stress and biaxiality. Furthermore, the measurement of displacement fields by Digital Image Correlation techniques has been used successfully to detect early crack initiation during high-cycle fatigue tests. A probabilistic model based on the shielding zones surrounding existing cracks is proposed to describe the development of crack networks. A more numerical approach is then followed to study the embrittlement consequences of the irradiation hardening of the bainitic steel constituting nuclear pressure vessels. A crystalline plasticity law, developed in agreement with lower-scale results (Dislocation Dynamics), is introduced into a Finite Element code in order to run simulations on aggregates and obtain the distributions of the maximum principal stress inside a Representative Volume Element. These distributions are then used to improve the classical Local Approach to Fracture, which estimates the probability for a microstructural defect to be loaded up to a critical level. (author)

  12. Sensitivity analysis on uncertainty variables affecting the NPP's LUEC with probabilistic approach

    Nuryanti; Akhmad Hidayatno; Erlinda Muslim

    2013-01-01

    One thing that is quite crucial to review prior to any investment decision on a nuclear power plant (NPP) project is the calculation of project economics, including the calculation of the Levelized Unit Electricity Cost (LUEC). Infrastructure projects such as NPP projects are vulnerable to a number of uncertainty variables. Information on the uncertainty variables which make the LUEC value quite sensitive to their changes is necessary so that cost overruns can be avoided. Therefore, this study aimed to perform a sensitivity analysis on the variables that affect LUEC with a probabilistic approach. This analysis was done using the Monte Carlo technique to simulate the relationship between the uncertainty variables and the resulting impact on LUEC. The sensitivity analysis results show significant changes in the LUEC values of AP1000 and OPR due to the sensitivity of the investment cost and capacity factor, while LUEC changes due to the sensitivity of the U3O8 price appear less significant. (author)

  13. Complexity characterization in a probabilistic approach to dynamical systems through information geometry and inductive inference

    Ali, S A; Kim, D-H; Cafaro, C; Giffin, A

    2012-01-01

    Information geometric techniques and inductive inference methods hold great promise for solving computational problems of interest in classical and quantum physics, especially with regard to complexity characterization of dynamical systems in terms of their probabilistic description on curved statistical manifolds. In this paper, we investigate the possibility of describing the macroscopic behavior of complex systems in terms of the underlying statistical structure of their microscopic degrees of freedom by the use of statistical inductive inference and information geometry. We review the maximum relative entropy formalism and the theoretical structure of the information geometrodynamical approach to chaos on statistical manifolds M_S. Special focus is devoted to a description of the roles played by the sectional curvature K_{M_S}, the Jacobi field intensity J_{M_S} and the information geometrodynamical entropy S_{M_S}. These quantities serve as powerful information-geometric complexity measures of information-constrained dynamics associated with arbitrary chaotic and regular systems defined on M_S. Finally, the application of such information-geometric techniques to several theoretical models is presented.

  14. Spatial probabilistic approach on landslide susceptibility assessment from high resolution sensors derived parameters

    Aman, S N A; Latif, Z Abd; Pradhan, B

    2014-01-01

    Landslide occurrence depends on various interrelated factors which can ultimately set in motion massive masses of soil and rock debris that move downhill under the action of gravity. LiDAR offers a progressive approach to landslide mitigation by permitting the generation of more accurate DEMs than other active spaceborne and airborne remote sensing techniques. The objective of this research is to assess landslide susceptibility in the Ulu Klang area by investigating the correlation between past landslide events and geo-environmental factors. A high-resolution LiDAR DEM was constructed to produce topographic attributes such as slope, curvature and aspect. These data were utilized to derive secondary landslide parameters such as the topographic wetness index (TWI), surface area ratio (SAR) and stream power index (SPI), as well as the NDVI generated from IKONOS imagery. Subsequently, a probabilistic frequency ratio model was applied to establish the spatial relationship between the landslide locations and each landslide-related factor. Factor ratings were summed to obtain the Landslide Susceptibility Index (LSI) used to construct the landslide susceptibility map.
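
    A minimal sketch of the frequency ratio computation behind an LSI map; the class counts are made up.

```python
# Hedged sketch: frequency ratio (FR) per factor class. For each class,
# FR = (share of landslide cells in the class) / (share of all cells).
import numpy as np

# factor classes (e.g. slope bins): total cells and landslide cells
total_cells = np.array([50_000, 30_000, 15_000, 5_000])
slide_cells = np.array([100, 220, 260, 120])

fr = (slide_cells / slide_cells.sum()) / (total_cells / total_cells.sum())
print(np.round(fr, 2))   # FR > 1 marks classes over-represented in slides

# LSI for a cell = sum of the FRs of the classes it falls in across all
# factors, e.g. a cell in slope class 3 contributes fr[2] from this factor.
```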

  15. A probabilistic scenario approach for developing improved Reduced Emissions from Deforestation and Degradation (REDD+) baselines

    Malika Virah-Sawmy

    2015-07-01

    By generating robust probabilistic baseline scenarios, exponential smoothing models can facilitate the effectiveness of REDD+ payments, support a more efficient allocation of scarce conservation resources, and improve our understanding of effective forest conservation investments, also beyond REDD+.

  16. Diagnostic efficacy of optimised evaluation of planar MIBI myocardium perfusion scintigraphy: a probabilistic approach

    Kusmierek, J.; Plachcinska, A.

    1999-01-01

    Background: The Bayesian (probabilistic) approach to the results of a diagnostic test appears to be more informative than an interpretation of results in binary terms (having the disease or not). The aim of our study was to analyse the effect of an optimised evaluation of myocardium perfusion scintigrams on the probability of CAD in individual patients. Methods: 197 patients (132 males and 65 females) suspected of CAD, with no history of myocardial infarction, were examined. Scintigraphic images were evaluated applying two methods of analysis: visual (semiquantitative) and quantitative, and the combination of both. The sensitivity and specificity of both methods (and their combination) in the detection of CAD were determined, and optimal methods of scintigram evaluation were selected separately for males and females. All patients were subjected to coronary angiography. The pre-test probability of CAD was assessed according to Diamond (1) and the post-test probability was evaluated in accordance with Bayes' theorem. Patients were divided, according to the pre-test probability of CAD, into 3 groups: with low, medium and high probability of the disease. The same subdivision was made in relation to the post-test probability of CAD. The numbers of patients in the respective subgroups, before and after the test, were compared. Moreover, in order to test the reliability of the post-test probability, its values were compared with the real percentages of CAD occurrence among the patients under study, as demonstrated by the angiography. Results: The combination of visual and quantitative methods was accepted as the optimal method of male scintigram evaluation (with sensitivity and specificity equalling 95% and 82%, respectively) and a sole quantitative analysis as the optimal method of female scintigram evaluation (sensitivity and specificity amounted to 81% and 84%, respectively). In the subgroup of males the percentage of individuals with medium pre-test CAD probability equalled 52 and
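
    A minimal sketch of the Bayesian step: converting a pre-test CAD probability into a post-test probability from the optimal method's sensitivity and specificity (the 95%/82% male values quoted above); the pre-test probabilities are illustrative.

```python
# Hedged sketch: post-test probability via Bayes' theorem.
def post_test_probability(pre, sens, spec, positive=True):
    if positive:   # probability of disease given a positive scan
        return sens * pre / (sens * pre + (1 - spec) * (1 - pre))
    # probability of disease given a negative scan
    return (1 - sens) * pre / ((1 - sens) * pre + spec * (1 - pre))

for pre in (0.2, 0.5, 0.8):            # low / medium / high pre-test groups
    print(f"pre {pre:.0%}: positive -> {post_test_probability(pre, 0.95, 0.82):.0%}, "
          f"negative -> {post_test_probability(pre, 0.95, 0.82, positive=False):.0%}")
```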

  17. Global floor planning approach for VLSI design

    LaPotin, D.P.

    1986-01-01

    Within a hierarchical design environment, initial decisions regarding the partitioning and choice of module attributes greatly impact the quality of the resulting IC in terms of area and electrical performance. This dissertation presents a global floor-planning approach which allows designers to quickly explore layout issues during the initial stages of the IC design process. In contrast to previous efforts, which address the floor-planning problem from a strict module placement point of view, this approach considers floor-planning from an area planning point of view. The approach is based upon a combined min-cut and slicing paradigm, which ensures routability. To provide flexibility, modules may be specified as having a number of possible dimensions and orientations, and I/O pads as well as layout constraints are considered. A slicing-tree representation is employed, upon which a sequence of traversal operations are applied in order to obtain an area efficient layout. An in-place partitioning technique, which provides an improvement over previous min-cut and slicing-based efforts, is discussed. Global routing and module I/O pin assignment are provided for floor-plan evaluation purposes. A computer program, called Mason, has been developed which efficiently implements the approach and provides an interactive environment for designers to perform floor-planning. Performance of this program is illustrated via several industrial examples

  18. Need to use probabilistic risk approach in performance assessment of waste disposal facilities

    Bonano, E.J.; Gallegos, D.P.

    1991-01-01

    Regulations governing the disposal of radioactive, hazardous, and/or mixed wastes will likely require, either directly or indirectly, that the performance of disposal facilities be assessed quantitatively. Such analyses, commonly called ''performance assessments,'' rely on the use of predictive models to arrive at a quantitative estimate of the potential impact of disposal on the environment and the safety and health of the public. It has been recognized that a suite of uncertainties affect the results of a performance assessment. These uncertainties are conventionally categorized as (1) uncertainty in the future state of the disposal system (facility and surrounding medium), (2) uncertainty in models (including conceptual models, mathematical models, and computer codes), and (3) uncertainty in data and parameters. Decisions regarding the suitability of a waste disposal facility must be made in light of these uncertainties. Hence, an approach is needed that would allow the explicit consideration of these uncertainties so that their impact on the estimated consequences of disposal can be evaluated. While most regulations for waste disposal do not prescribe the consideration of uncertainties, it is proposed that, even in such cases, a meaningful decision regarding the suitability of a waste disposal facility cannot be made without considering the impact of the attendant uncertainties. A probabilistic risk assessment (PRA) approach provides the formalism for considering the uncertainties and the technical basis that the decision makers can use in discharging their duties. A PRA methodology developed and demonstrated for the disposal of high-level radioactive waste provides a general framework for assessing the disposal of all types of wastes (radioactive, hazardous, and mixed). 15 refs., 1 fig., 1 tab

  19. A probabilistic multidimensional approach to quantify large wood recruitment from hillslopes in mountainous-forested catchments

    Cislaghi, Alessio; Rigon, Emanuel; Lenzi, Mario Aristide; Bischetti, Gian Battista

    2018-04-01

    Large wood (LW) plays a key role in physical, chemical, environmental, and biological processes in most natural and seminatural streams. However, it is also a source of hydraulic hazard in anthropised territories. Recruitment from fluvial processes has been the subject of many studies, whereas less attention has been given to hillslope recruitment, which is linked to episodic and spatially distributed events and requires a reliable and accurate slope stability model and a hillslope-channel transfer model. The purpose of this study is to develop an innovative LW hillslope-recruitment estimation approach that combines forest stand characteristics in a spatially distributed form, a probabilistic multidimensional slope stability model able to include the reinforcement exerted by roots, and a hillslope-channel transfer procedure. The approach was tested on a small mountain headwater catchment in the eastern Italian Alps that is prone to shallow landslide and debris flow phenomena. The slope stability model (which had not been calibrated) performed accurately, both in identifying unstable areas according to the landslide inventory (AUC = 0.832) and in estimating LW volume in comparison with the LW volume produced by inventoried landslides (7702 m3, corresponding to a recurrence time of about 30 years in the susceptibility curve). The results showed that most LW potentially mobilised by landslides does not reach the channel network (only about 16%), in agreement with the few data reported by other studies, as well as the data normalized for unit length of channel and unit length of channel per year (0-116 m3/km and 0-4 m3/km y-1). This study represents an important contribution to LW research. A rigorous and site-specific estimation of LW hillslope recruitment should, in fact, be an integral part of more general studies on LW dynamics, of forest planning and management, and of the positioning of in-channel wood retention structures.

  20. Improving probabilistic prediction of daily streamflow by identifying Pareto optimal approaches for modeling heteroscedastic residual errors

    McInerney, David; Thyer, Mark; Kavetski, Dmitri; Lerat, Julien; Kuczera, George

    2017-03-01

    Reliable and precise probabilistic prediction of daily catchment-scale streamflow requires statistical characterization of residual errors of hydrological models. This study focuses on approaches for representing error heteroscedasticity with respect to simulated streamflow, i.e., the pattern of larger errors in higher streamflow predictions. We evaluate eight common residual error schemes, including standard and weighted least squares, the Box-Cox transformation (with fixed and calibrated power parameter λ) and the log-sinh transformation. Case studies include 17 perennial and 6 ephemeral catchments in Australia and the United States, and two lumped hydrological models. Performance is quantified using predictive reliability, precision, and volumetric bias metrics. We find that the choice of heteroscedastic error modeling approach significantly impacts predictive performance, though no single scheme simultaneously optimizes all performance metrics. The set of Pareto optimal schemes, reflecting performance trade-offs, comprises Box-Cox schemes with λ of 0.2 and 0.5, and the log scheme (λ = 0, perennial catchments only). These schemes significantly outperform even the average-performing remaining schemes (e.g., across ephemeral catchments, median precision tightens from 105% to 40% of observed streamflow, and median biases decrease from 25% to 4%). Theoretical interpretations of empirical results highlight the importance of capturing the skew/kurtosis of raw residuals and reproducing zero flows. Paradoxically, calibration of λ is often counterproductive: in perennial catchments, it tends to overfit low flows at the expense of abysmal precision in high flows. The log-sinh transformation is dominated by the simpler Pareto optimal schemes listed above. Recommendations for researchers and practitioners seeking robust residual error schemes for practical work are provided.
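
    A minimal sketch of a fixed-λ Box-Cox residual-error scheme of the kind evaluated above: residuals are assumed Gaussian and homoscedastic in the transformed space, and probabilistic predictions are obtained by adding transformed-space noise and back-transforming. This is illustrative only; the study's full schemes also cover weighting, calibrated λ, and zero-flow handling.

        import numpy as np

        def boxcox(q, lam):
            return np.log(q) if lam == 0 else (q**lam - 1.0) / lam

        def inv_boxcox(z, lam):
            return np.exp(z) if lam == 0 else np.maximum(lam * z + 1.0, 1e-12)**(1.0 / lam)

        def predictive_samples(q_sim, q_obs, lam=0.2, n_reps=1000, seed=0):
            """Probabilistic streamflow predictions under a fixed-lambda Box-Cox scheme."""
            rng = np.random.default_rng(seed)
            resid = boxcox(q_obs, lam) - boxcox(q_sim, lam)   # transformed-space residuals
            mu, sigma = resid.mean(), resid.std(ddof=1)        # assumed Gaussian, homoscedastic
            eps = rng.normal(mu, sigma, size=(n_reps, q_sim.size))
            return inv_boxcox(boxcox(q_sim, lam) + eps, lam)   # back-transform to flow space

        # Toy usage: predictive bands widen at higher simulated flows (heteroscedasticity)
        q_sim = np.array([0.5, 2.0, 10.0, 50.0])
        q_obs = np.array([0.4, 2.6, 8.5, 61.0])
        print(np.percentile(predictive_samples(q_sim, q_obs), [5, 50, 95], axis=0))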

  1. A probabilistic approach to quantifying spatial patterns of flow regimes and network-scale connectivity

    Garbin, Silvia; Alessi Celegon, Elisa; Fanton, Pietro; Botter, Gianluca

    2017-04-01

    The temporal variability of river flow regime is a key feature structuring and controlling fluvial ecological communities and ecosystem processes. In particular, streamflow variability induced by climate/landscape heterogeneities or other anthropogenic factors significantly affects the connectivity between streams, with notable implications for river fragmentation. Hydrologic connectivity is a fundamental property that guarantees species persistence and ecosystem integrity in riverine systems. In riverine landscapes, most ecological transitions are flow-dependent and the structure of flow regimes may affect ecological functions of endemic biota (i.e., fish spawning or grazing of invertebrate species). Therefore, minimum flow thresholds must be guaranteed to support specific ecosystem services, like fish migration, aquatic biodiversity and habitat suitability. In this contribution, we present a probabilistic approach aiming at a spatially-explicit, quantitative assessment of hydrologic connectivity at the network-scale as derived from river flow variability. Dynamics of daily streamflows are estimated based on catchment-scale climatic and morphological features, integrating a stochastic, physically based approach that accounts for the stochasticity of rainfall with a water balance model and a geomorphic recession flow model. The non-exceedance probability of ecologically meaningful flow thresholds is used to evaluate the fragmentation of individual stream reaches, and the ensuing network-scale connectivity metrics. A multi-dimensional Poisson Process for the stochastic generation of rainfall is used to evaluate the impact of climate signature on reach-scale and catchment-scale connectivity. The analysis shows that streamflow patterns and network-scale connectivity are influenced by the topology of the river network and the spatial variability of climatic properties (rainfall, evapotranspiration). The framework offers a robust basis for the prediction of the impact of

  2. Predicting carcinogenicity of diverse chemicals using probabilistic neural network modeling approaches

    Singh, Kunwar P., E-mail: kpsingh_52@yahoo.com [Academy of Scientific and Innovative Research, Council of Scientific and Industrial Research, New Delhi (India); Environmental Chemistry Division, CSIR-Indian Institute of Toxicology Research, Post Box 80, Mahatma Gandhi Marg, Lucknow 226 001 (India); Gupta, Shikha; Rai, Premanjali [Academy of Scientific and Innovative Research, Council of Scientific and Industrial Research, New Delhi (India); Environmental Chemistry Division, CSIR-Indian Institute of Toxicology Research, Post Box 80, Mahatma Gandhi Marg, Lucknow 226 001 (India)

    2013-10-15

    Robust global models capable of discriminating positive and non-positive carcinogens and predicting the carcinogenic potency of chemicals in rodents were developed. A dataset of 834 structurally diverse chemicals, extracted from the Carcinogenic Potency Database (CPDB), was used, which contained 466 positive and 368 non-positive carcinogens. Twelve non-quantum mechanical molecular descriptors were derived. Structural diversity of the chemicals and nonlinearity in the data were evaluated using the Tanimoto similarity index and Brock–Dechert–Scheinkman statistics. Probabilistic neural network (PNN) and generalized regression neural network (GRNN) models were constructed for classification and function optimization problems using the carcinogenicity end point in rat. Validation of the models was performed using internal and external procedures employing a wide series of statistical checks. The PNN constructed using five descriptors rendered a classification accuracy of 92.09% in the complete rat data. The PNN model rendered classification accuracies of 91.77%, 80.70% and 92.08% in mouse, hamster and pesticide data, respectively. The GRNN constructed with nine descriptors yielded a correlation coefficient of 0.896 between the measured and predicted carcinogenic potency with mean squared error (MSE) of 0.44 in the complete rat data. The rat carcinogenicity model (GRNN) applied to the mouse and hamster data yielded correlation coefficient and MSE of 0.758, 0.71 and 0.760, 0.46, respectively. The results suggest wide applicability of the inter-species models in predicting carcinogenic potency of chemicals. Both the PNN and GRNN (inter-species) models constructed here can be useful tools in predicting the carcinogenicity of new chemicals for regulatory purposes. - Graphical abstract: Figure (a) shows classification accuracies (positive and non-positive carcinogens) in rat, mouse, hamster, and pesticide data yielded by optimal PNN model. Figure (b) shows generalization and predictive
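
    A probabilistic neural network of the kind used above is essentially a Parzen-window (Gaussian kernel) density estimate per class, with classification by the larger class-conditional density. A minimal sketch under assumed toy data (illustrative; not the authors' five-descriptor model):

        import numpy as np

        class PNN:
            """Minimal probabilistic neural network: one Gaussian kernel per training pattern."""
            def __init__(self, sigma=0.5):
                self.sigma = sigma

            def fit(self, X, y):
                self.classes = np.unique(y)
                self.patterns = {c: X[y == c] for c in self.classes}
                return self

            def predict(self, X):
                scores = []
                for c in self.classes:
                    # Squared distances from each query point to each stored pattern of class c
                    d2 = ((X[:, None, :] - self.patterns[c][None, :, :])**2).sum(-1)
                    scores.append(np.exp(-d2 / (2 * self.sigma**2)).mean(axis=1))
                return self.classes[np.argmax(scores, axis=0)]

        # Toy usage with two descriptor dimensions and two classes
        rng = np.random.default_rng(0)
        X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
        y = np.array([0] * 50 + [1] * 50)
        model = PNN(sigma=0.8).fit(X, y)
        print((model.predict(X) == y).mean())   # training accuracy on the toy data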

  3. ProLBB - A Probabilistic Approach to Leak Before Break Demonstration

    Dillstroem, Peter; Weilin Zang (Inspecta Technology AB, Stockholm (SE))

    2007-11-15

    Recently, the Swedish Nuclear Power Inspectorate has developed guidelines on how to demonstrate the existence of Leak Before Break (LBB). The guidelines, mainly based on NUREG/CR-6765, define the steps that must be fulfilled to get a conservative assessment of LBB acceptability. In this report, a probabilistic LBB approach is defined and implemented into the software ProLBB. The main conclusions, from the study presented in this report, are summarized below. - The probabilistic approach developed in this study was applied to different piping systems in both Boiler Water Reactors (BWR) and Pressurised Water Reactors (PWR). Pipe sizes were selected so that small, medium and large pipes were included in the analysis. The present study shows that the conditional probability of fracture is in general small for the larger diameter pipes when evaluated as function of leak flow rate. However, when evaluated as function of fraction of crack length around the circumference, then the larger diameter pipes will belong to the ones with the highest conditional fracture probabilities. - The total failure probability, corresponding to the product between the leak probability and the conditional fracture probability, will be very small for all pipe geometries when evaluated as function of fraction of crack length around the circumference. This is mainly due to a small leak probability which is consistent with expectations since no active damage mechanism has been assumed. - One of the objectives of the approach was to be able to check the influence of off-centre cracks (i.e. the possibility that cracks occur randomly around the pipe circumference). To satisfy this objective, new stress intensity factor solutions for off-centre cracks were developed. Also to check how off-centre cracks influence crack opening areas, new form factors solutions for COA were developed taking plastic deformation into account. - The influence from an off-center crack position on the conditional

  5. A hybrid approach for global sensitivity analysis

    Chakraborty, Souvik; Chowdhury, Rajib

    2017-01-01

    Distribution-based sensitivity analysis (DSA) computes the sensitivity of the input random variables with respect to the change in the distribution of the output response. Although DSA is widely appreciated as the best tool for sensitivity analysis, the computational issue associated with this method prohibits its use for complex structures involving costly finite element analysis. To address this issue, this paper presents a method that couples polynomial correlated function expansion (PCFE) with DSA. PCFE is a fully equivalent operational model which integrates the concepts of analysis-of-variance decomposition, extended bases and the homotopy algorithm. By integrating PCFE into DSA, it is possible to considerably alleviate the computational burden. Three examples are presented to demonstrate the performance of the proposed approach for sensitivity analysis. For all the problems, the proposed approach yields excellent results with significantly reduced computational effort. The results obtained, to some extent, indicate that the proposed approach can be utilized for sensitivity analysis of large-scale structures. - Highlights: • A hybrid approach for global sensitivity analysis is proposed. • The proposed approach integrates PCFE within distribution-based sensitivity analysis. • The proposed approach is highly efficient.
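
    The computational burden that PCFE targets is easy to see in the plain Monte Carlo version of a variance-based index (the analysis-of-variance decomposition mentioned above). A sketch of the classic pick-freeze estimator of a first-order Sobol index, under an assumed toy model (the function f and all settings here are hypothetical, not from the paper):

        import numpy as np

        def first_order_sobol(model, n_dim, i, n=100_000, seed=1):
            """Pick-freeze Monte Carlo estimate of the first-order Sobol index of input i."""
            rng = np.random.default_rng(seed)
            a = rng.random((n, n_dim))
            b = rng.random((n, n_dim))
            ab = b.copy()
            ab[:, i] = a[:, i]                    # freeze input i, resample the rest
            ya, yab, yb = model(a), model(ab), model(b)
            return np.mean(ya * (yab - yb)) / np.var(ya)

        # Toy model in which the second input dominates the output variance
        f = lambda x: np.sin(2 * np.pi * x[:, 0]) + 5 * np.sin(2 * np.pi * x[:, 1])**2
        print([round(first_order_sobol(f, 3, i), 2) for i in range(3)])  # ~[0.14, 0.86, 0.0]

    Note the cost: three model evaluations per sample per input, which is exactly what becomes prohibitive with finite element models and what surrogate approaches such as PCFE are meant to avoid.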

  6. Xplicit, a novel approach in probabilistic spatiotemporally explicit exposure and risk assessment for plant protection products.

    Schad, Thorsten; Schulz, Ralf

    2011-10-01

    The quantification of risk (the likelihood and extent of adverse effects) is a prerequisite in regulatory decision making for plant protection products and is the goal of the Xplicit project. In its present development stage, realism is increased in the exposure assessment (EA), first by using real-world data on, e.g., landscape factors affecting exposure, and second, by taking the variability of key factors into account. Spatial and temporal variability is explicitly addressed. Scale dependencies are taken into account, which allows for risk quantification at different scales, for example, at landscape scale, an overall picture of the potential exposure of nontarget organisms can be derived (e.g., for all off-crop habitats in a given landscape); at local scale, exposure might be relevant to assess recovery and recolonization potential; intermediate scales might best refer to population level and hence might be relevant for risk management decisions (e.g., individual off-crop habitats). The Xplicit approach is designed to comply with a central paradigm of probabilistic approaches, namely, that each individual case that is derived from the variability functions employed should represent a potential real-world case. This is mainly achieved by operating in a spatiotemporally explicit fashion. Landscape factors affecting the local exposure of habitats of nontarget species (i.e., receptors) are derived from geodatabases. Variability in time is resolved by operating at discrete time steps, with the probability of events (e.g., application) or conditions (e.g., wind conditions) defined in probability density functions (PDFs). The propagation of variability of parameters into variability of exposure and risk is done using a Monte Carlo approach. Among the outcomes are expectancy values on the realistic worst-case exposure (predicted environmental concentration [PEC]), the probability p that the PEC exceeds the ecologically acceptable concentration (EAC) for a given

  7. SEX-DETector: A Probabilistic Approach to Study Sex Chromosomes in Non-Model Organisms

    Muyle, Aline; Käfer, Jos; Zemp, Niklaus; Mousset, Sylvain; Picard, Franck; Marais, Gabriel AB

    2016-01-01

    We propose a probabilistic framework to infer autosomal and sex-linked genes from RNA-seq data of a cross for any sex chromosome type (XY, ZW, and UV). Sex chromosomes (especially the non-recombining and repeat-dense Y, W, U, and V) are notoriously difficult to sequence. Strategies have been developed to obtain partially assembled sex chromosome sequences. Most of them remain difficult to apply to numerous non-model organisms, either because they require a reference genome, or because they are designed for evolutionarily old systems. Sequencing a cross (parents and progeny) by RNA-seq to study the segregation of alleles and infer sex-linked genes is a cost-efficient strategy, which also provides expression level estimates. However, the lack of a proper statistical framework has limited a broader application of this approach. Tests on empirical Silene data show that our method identifies 20–35% more sex-linked genes than existing pipelines, while making reliable inferences for downstream analyses. Based on simulations, approximately 12 individuals are needed for optimal results. For species with an unknown sex-determination system, the method can assess the presence and type (XY vs. ZW) of sex chromosomes through a model comparison strategy. The method is particularly well optimized for sex chromosomes of young or intermediate age, which are expected in thousands of yet unstudied lineages. Any organism that can be bred in the lab, including non-model ones for which nothing is known a priori, is suitable for our method. SEX-DETector and its implementation in a Galaxy workflow are made freely available. PMID:27492231

  8. Validation of the probabilistic approach for the analysis of PWR transients

    Amesz, J.; Francocci, G.F.; Clarotti, C.

    1978-01-01

    This paper reviews the pilot study currently being carried out on the validation of probabilistic methodology with real data coming from the operational records of the PWR power station at Obrigheim (KWO, Germany), operating since 1969. The aim of this analysis is to validate the a priori predictions of reactor transients performed by a probabilistic methodology against the a posteriori analysis of transients that actually occurred at the power station. Two levels of validation have been distinguished: (a) validation of the rate of occurrence of initiating events; (b) validation of the transient-parameter amplitude (i.e., overpressure) caused by the above-mentioned initiating events. The paper describes the a priori calculations performed using a fault-tree analysis by means of a probabilistic code (SALP 3) and event-trees coupled with a PWR system deterministic computer code (LOOP 7). Finally, the principal results of these analyses are presented and critically reviewed

  9. Probabilistic Design Analysis (PDA) Approach to Determine the Probability of Cross-System Failures for a Space Launch Vehicle

    Shih, Ann T.; Lo, Yunnhon; Ward, Natalie C.

    2010-01-01

    Quantifying the probability of significant launch vehicle failure scenarios for a given design, while still in the design process, is critical to mission success and to the safety of the astronauts. Probabilistic risk assessment (PRA) is chosen from many system safety and reliability tools to verify the loss of mission (LOM) and loss of crew (LOC) requirements set by the NASA Program Office. To support the integrated vehicle PRA, probabilistic design analysis (PDA) models are developed by using vehicle design and operation data to better quantify failure probabilities and to better understand the characteristics of a failure and its outcome. This PDA approach uses a physics-based model to describe the system behavior and response for a given failure scenario. Each driving parameter in the model is treated as a random variable with a distribution function. Monte Carlo simulation is used to perform probabilistic calculations to statistically obtain the failure probability. Sensitivity analyses are performed to show how input parameters affect the predicted failure probability, providing insight for potential design improvements to mitigate the risk. The paper discusses the application of the PDA approach in determining the probability of failure for two scenarios from the NASA Ares I project
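
    The PDA workflow described above, driving parameters as random variables through a physics-based model and Monte Carlo simulation for the failure probability, can be sketched generically as follows (the limit state and all distributions are hypothetical illustrations, not the Ares I models):

        import numpy as np

        rng = np.random.default_rng(42)
        n = 1_000_000

        # Hypothetical driving parameters treated as random variables
        burst_pressure = rng.normal(21.0, 1.5, n)                  # MPa, capacity
        chamber_pressure = rng.lognormal(np.log(16.0), 0.08, n)    # MPa, demand

        # Physics-based limit state: failure when demand exceeds capacity
        failure = chamber_pressure > burst_pressure
        p_f = failure.mean()
        se = np.sqrt(p_f * (1 - p_f) / n)                          # Monte Carlo standard error
        print(f"P(failure) = {p_f:.2e} +/- {se:.1e}")

    Sensitivity analysis of the kind mentioned above would then perturb each input distribution in turn and record the change in the estimated failure probability.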

  10. An alternative approach to probabilistic seismic hazard analysis in the Aegean region using Monte Carlo simulation

    Weatherill, Graeme; Burton, Paul W.

    2010-09-01

    The Aegean is the most seismically active and tectonically complex region in Europe. Damaging earthquakes have occurred here throughout recorded history, often resulting in considerable loss of life. The Monte Carlo method of probabilistic seismic hazard analysis (PSHA) is used to determine the level of ground motion likely to be exceeded in a given time period. Multiple random simulations of seismicity are generated to calculate, directly, the ground motion for a given site. Within the seismic hazard analysis we explore the impact of different seismic source models, incorporating both uniform zones and distributed seismicity. A new, simplified, seismic source model, derived from seismotectonic interpretation, is presented for the Aegean region. This is combined into the epistemic uncertainty analysis alongside existing source models for the region, and models derived by a K-means cluster analysis approach. Seismic source models derived using the K-means approach bring a degree of objectivity and reproducibility to the otherwise subjective approach of delineating seismic sources using expert judgment. Similar review and analysis is undertaken for the selection of peak ground acceleration (PGA) attenuation models, incorporating into the epistemic analysis Greek-specific models, European models and a Next Generation Attenuation model. Hazard maps for PGA on a "rock" site with a 10% probability of being exceeded in 50 years are produced and different source and attenuation models are compared. These indicate that Greek-specific attenuation models, with their smaller aleatory variability terms, produce lower PGA hazard, whilst recent European models and the Next Generation Attenuation (NGA) model produce similar results. The Monte Carlo method is extended further to assimilate epistemic uncertainty into the hazard calculation, thus integrating across several appropriate source and PGA attenuation models. Site condition and fault-type are also integrated into the hazard
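
    The core of the Monte Carlo PSHA method described above is direct: simulate many synthetic years of seismicity, attenuate each event to the site including the aleatory scatter of the attenuation model, and read the required ground motion off the ranked annual maxima. A compressed sketch under toy assumptions (a single uniform source zone, an untruncated Gutenberg-Richter law, and schematic attenuation coefficients, none of which are the paper's models):

        import numpy as np

        rng = np.random.default_rng(7)
        n_years, rate, b = 50_000, 5.0, 1.0   # mean annual count of M>=4.5 events; G-R b-value

        annual_max = np.zeros(n_years)
        for year in range(n_years):
            n_eq = rng.poisson(rate)
            if n_eq == 0:
                continue
            mags = 4.5 + rng.exponential(1.0 / (b * np.log(10)), n_eq)  # Gutenberg-Richter
            dists = np.sqrt(rng.uniform(10.0**2, 150.0**2, n_eq))       # km, uniform in area
            # Schematic attenuation: ln PGA(g) = -3.5 + 0.9*M - 1.1*ln(R) + aleatory scatter
            ln_pga = -3.5 + 0.9 * mags - 1.1 * np.log(dists) + rng.normal(0.0, 0.6, n_eq)
            annual_max[year] = np.exp(ln_pga).max()

        # PGA with a 10% exceedance probability in 50 years (return period ~475 years)
        print(np.quantile(annual_max, 1.0 - 1.0 / 475.0))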

  11. Probabilistic Approaches for Multi-Hazard Risk Assessment of Structures and Systems

    Kwag, Shinyoung

    Performance assessment of structures, systems, and components for multi-hazard scenarios has received significant attention in recent years. However, the concept of multi-hazard analysis is quite broad in nature and the focus of existing literature varies across a wide range of problems. In some cases, such studies focus on hazards that either occur simultaneously or are closely correlated with each other. For example, seismically induced flooding or seismically induced fires. In other cases, multi-hazard studies relate to hazards that are not dependent or correlated but have strong likelihood of occurrence at different times during the lifetime of a structure. The current approaches for risk assessment need enhancement to account for multi-hazard risks. They must be able to account for uncertainty propagation in a systems-level analysis, consider correlation among events or failure modes, and allow integration of newly available information from continually evolving simulation models, experimental observations, and field measurements. This dissertation presents a detailed study that proposes enhancements by incorporating Bayesian networks and Bayesian updating within a performance-based probabilistic framework. The performance-based framework allows propagation of risk as well as uncertainties in the risk estimates within a systems analysis. Unlike conventional risk assessment techniques such as a fault-tree analysis, a Bayesian network can account for statistical dependencies and correlations among events/hazards. The proposed approach is extended to develop a risk-informed framework for quantitative validation and verification of high fidelity system-level simulation tools. Validation of such simulations can be quite formidable within the context of a multi-hazard risk assessment in nuclear power plants. The efficiency of this approach lies in identification of critical events, components, and systems that contribute to the overall risk. Validation of any event or

  12. A perturbed martingale approach to global optimization

    Sarkar, Saikat [Computational Mechanics Lab, Department of Civil Engineering, Indian Institute of Science, Bangalore 560012 (India); Roy, Debasish, E-mail: royd@civil.iisc.ernet.in [Computational Mechanics Lab, Department of Civil Engineering, Indian Institute of Science, Bangalore 560012 (India); Vasu, Ram Mohan [Department of Instrumentation and Applied Physics, Indian Institute of Science, Bangalore 560012 (India)

    2014-08-01

    A new global stochastic search, guided mainly through derivative-free directional information computable from the sample statistical moments of the design variables within a Monte Carlo setup, is proposed. The search is aided by imparting to the directional update term additional layers of random perturbations referred to as ‘coalescence’ and ‘scrambling’. A selection step, constituting yet another avenue for random perturbation, completes the global search. The direction-driven nature of the search is manifest in the local extremization and coalescence components, which are posed as martingale problems that yield gain-like update terms upon discretization. As anticipated and numerically demonstrated, to a limited extent, against the problem of parameter recovery given the chaotic response histories of a couple of nonlinear oscillators, the proposed method appears to offer a more rational, more accurate and faster alternative to most available evolutionary schemes, prominently the particle swarm optimization. - Highlights: • Evolutionary global optimization is posed as a perturbed martingale problem. • Resulting search via additive updates is a generalization over Gateaux derivatives. • Additional layers of random perturbation help avoid trapping at local extrema. • The approach ensures efficient design space exploration and high accuracy. • The method is numerically assessed via parameter recovery of chaotic oscillators.

  13. On the evaluation of the efficacy of a smart damper: a new equivalent energy-based probabilistic approach

    Aly, A M; Christenson, R E

    2008-01-01

    Smart damping technology has been proposed to protect civil structures from dynamic loads. Each application of smart damping control provides varying levels of performance relative to active and passive control strategies. Currently, researchers compare the relative efficacy of smart damping control to active and passive strategies by running numerous simulations. These simulations can require significant computation time and resources. Because of this, it is desirable to develop an approach to assess the applicability of smart damping technology which requires less computation time. This paper discusses and verifies a probabilistic approach to determine the efficacy of smart damping technology based on clipped optimal state feedback control theory

  14. Learning Probabilistic Logic Models from Probabilistic Examples.

    Chen, Jianzhong; Muggleton, Stephen; Santos, José

    2008-10-01

    We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error accompanied by improved insight from the learned results compared with the PILP models learned from non-probabilistic examples.

  15. An integrated approach to the probabilistic assessments of aircraft strikes and structural mode of damages to nuclear power plants

    Godbout, P.; Brais, A.

    1975-01-01

    The possibilities of an aircraft striking a Canadian nuclear power plant in the vicinity of an airport and of inducing structural failure modes have been evaluated. This evaluation, together with other studies, may enhance decisions in the development of general criteria for the siting of reactors near airports. The study made use, for assessment, of the probabilistic approach and made judicious applications of the finite Canadian, French, German, American and English resources that were available. The tools, techniques and methods used for achieving the above form what may be called an integrated approach. This approach requires that the study be made in six consecutive steps as follows: the qualitative evaluation of an aircraft strike on a site situated near an airport, with the use of the logic model technique; the gathering of statistical data on aircraft movements and accidents; evaluating the probability distribution and calculating the basic event probabilities; evaluating the probability of an aircraft strike and the application of the sensitivity approach; generating the probability density distribution versus strike impact energy, that is, the evaluation of the energy envelope; and the probabilistic evaluation of structural failure mode inducements

  16. Global approaches to regulating electronic cigarettes

    Kennedy, Ryan David; Awopegba, Ayodeji; De León, Elaine; Cohen, Joanna E

    2017-01-01

    Objectives: Classify and describe the policy approaches used by countries to regulate e-cigarettes. Methods: National policies regulating e-cigarettes were identified by (1) conducting web searches on Ministry of Health websites, and (2) broad web searches. The mechanisms used to regulate e-cigarettes were classified as new/amended laws, or existing laws. The policy domains identified include restrictions or prohibitions on product: sale, manufacturing, importation, distribution, use, product design including e-liquid ingredients, advertising/promotion/sponsorship, trademarks, and regulation requiring: taxation, health warning labels and child-safety standards. The classification of the policy was reviewed by a country expert. Results: The search identified 68 countries that regulate e-cigarettes: 22 countries regulate e-cigarettes using existing regulations; 25 countries enacted new policies to regulate e-cigarettes; 7 countries made amendments to existing legislation; 14 countries use a combination of new/amended and existing regulation. Common policies include a minimum-age-of-purchase, indoor-use (vape-free public places) bans and marketing restrictions. Few countries are applying a tax to e-cigarettes. Conclusions: A range of regulatory approaches are being applied to e-cigarettes globally; many countries regulate e-cigarettes using legislation not written for e-cigarettes. PMID:27903958

  17. Global approach of mean service satisfaction assessment

    Ahmed Dooguy Kora

    2014-01-01

    A theoretical expression for mobile service satisfaction assessment is proposed. Mobile network users' satisfaction is a major concern for operators and regulators, and operators are therefore required to offer consumers a certain level of network quality, enforced through decisions initiated by the regulatory authority. For the assessment of the level of satisfaction, several methodologies and tools (measuring and monitoring) have emerged. Falling into two broad categories, namely objective and subjective methods, both have advantages as well as disadvantages. This Letter proposes a unified approach to evaluate users' level of satisfaction with a service more objectively, based on the rates of the most common network key performance indicators (KPIs) under the different methods. The main advantage of this approach is that it combines the strengths of the existing methods and overcomes their limitations, thanks to the introduced concept of a global KPI. In addition, the sample sizes required by each method have been considered. The resulting theoretical expression for mean service satisfaction is offered to regulatory authorities, consumer associations and operators as a common basis for service satisfaction assessment.

  18. Probabilistic vs linear blending approaches to shared control for wheelchair driving.

    Ezeh, Chinemelu; Trautman, Pete; Devigne, Louise; Bureau, Valentin; Babel, Marie; Carlson, Tom

    2017-07-01

    Some people with severe mobility impairments are unable to operate powered wheelchairs reliably and effectively, using commercially available interfaces. This has sparked a body of research into "smart wheelchairs", which assist users to drive safely and create opportunities for them to use alternative interfaces. Various "shared control" techniques have been proposed to provide an appropriate level of assistance that is satisfactory and acceptable to the user. Most shared control techniques employ a traditional strategy called linear blending (LB), where the user's commands and wheelchair's autonomous commands are combined in some proportion. In this paper, however, we implement a more generalised form of shared control called probabilistic shared control (PSC). This probabilistic formulation improves the accuracy of modelling the interaction between the user and the wheelchair by taking into account uncertainty in the interaction. In this paper, we demonstrate the practical success of PSC over LB in terms of safety, particularly for novice users.
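
    The contrast described above can be made concrete. Linear blending mixes the two commands in a fixed proportion; a probabilistic blend instead treats each command as noisy evidence about the intended motion and weights it by its confidence, so the arbitration adapts to the uncertainty. A schematic sketch assuming Gaussian user and planner models (illustrative, not the authors' implementation):

        import numpy as np

        def linear_blend(u_user, u_robot, alpha=0.5):
            """Traditional shared control: fixed-proportion mixing of the two commands."""
            return alpha * u_user + (1 - alpha) * u_robot

        def probabilistic_blend(u_user, var_user, u_robot, var_robot):
            """Fuse commands as independent Gaussian evidence: precision-weighted mean."""
            w_user, w_robot = 1.0 / var_user, 1.0 / var_robot
            return (w_user * u_user + w_robot * u_robot) / (w_user + w_robot)

        u_user = np.array([0.8, 0.1])    # (linear, angular) velocity from the interface
        u_robot = np.array([0.5, 0.4])   # collision-avoiding command from the planner
        print(linear_blend(u_user, u_robot))
        # Noisy interface near an obstacle: the planner dominates automatically
        print(probabilistic_blend(u_user, var_user=0.5, u_robot=u_robot, var_robot=0.05))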

  19. Prediction Uncertainty and Groundwater Management: Approaches to get the Most out of Probabilistic Outputs

    Peeters, L. J.; Mallants, D.; Turnadge, C.

    2017-12-01

    Groundwater impact assessments are increasingly being undertaken in a probabilistic framework whereby various sources of uncertainty (model parameters, model structure, boundary conditions, and calibration data) are taken into account. This has resulted in groundwater impact metrics being presented as probability density functions and/or cumulative distribution functions, spatial maps displaying isolines of percentile values for specific metrics, etc. Groundwater management, on the other hand, typically uses single values (i.e., in a deterministic framework) to evaluate what decisions are required to protect groundwater resources. For instance, in New South Wales, Australia, a nominal drawdown value of two metres is specified by the NSW Aquifer Interference Policy as a trigger-level threshold. In many cases, when drawdowns induced by groundwater extraction exceed two metres, "make-good" provisions are enacted (such as the surrendering of extraction licenses). The information obtained from a quantitative uncertainty analysis can be used to guide decision making in several ways. Two examples are discussed here, the first of which would not require modification of existing "deterministic" trigger or guideline values, whereas the second example assumes that the regulatory criteria are also expressed in probabilistic terms. The first example is a straightforward interpretation of calculated percentile values for specific impact metrics. The second example goes a step further, as the previous deterministic thresholds do not currently allow for a probabilistic interpretation; e.g., there is no statement that "the probability of exceeding the threshold shall not be larger than 50%". It would indeed be sensible to have a set of thresholds with an associated acceptable probability of exceedance (or probability of not exceeding a threshold) that decreases as the impact increases. We here illustrate how both the prediction uncertainty and management rules can be expressed in a

  20. Probabilistic approach to the analysis of reactor pressure vessel integrity during a pressurized thermal shock

    Adamec, P.

    2000-12-01

    Following a general summary of the issue, an overview of international experience (USA; Belgium, France, Germany, Russia, Spain, Sweden, The Netherlands, and the UK; and probabilistic PTS assessment for the reactor pressure vessel at Loviisa-1, Finland) is presented, and the applicable computer codes (VISA-II, OCA-P, FAVOR, ZERBERUS) are highlighted and their applicability to VVER type reactor pressure vessels is outlined. (P.A.)

  1. Combination of the deterministic and probabilistic approaches for risk-informed decision-making in US NRC regulatory guides

    Patrik, M.; Babic, P.

    2001-06-01

    The report responds to the trend where probabilistic safety analyses are attached, on a voluntary basis (as yet), to the mandatory deterministic assessment of modifications of NPP systems or operating procedures, resulting in risk-informed type documents. It contains a nearly complete Czech translation of US NRC Regulatory Guide 1.177 and presents some suggestions for improving a) PSA study applications; b) the development of NPP documents for the regulatory body; and c) the interconnection between PSA and traditional deterministic analyses as contained in the risk-informed approach. (P.A.)

  2. Incorporating linguistic, probabilistic, and possibilistic information in a risk-based approach for ranking contaminated sites.

    Zhang, Kejiang; Achari, Gopal; Pei, Yuansheng

    2010-10-01

    Different types of uncertain information (linguistic, probabilistic, and possibilistic) exist in site characterization. Their representation and propagation significantly influence the management of contaminated sites. In the absence of a framework with which to properly represent and integrate these quantitative and qualitative inputs together, decision makers cannot fully take advantage of the available and necessary information to identify all the plausible alternatives. A systematic methodology was developed in the present work to incorporate linguistic, probabilistic, and possibilistic information into the Preference Ranking Organization METHod for Enrichment Evaluation (PROMETHEE), a subgroup of Multi-Criteria Decision Analysis (MCDA) methods for ranking contaminated sites. The identification of criteria based on the paradigm of comparative risk assessment provides a rationale for risk-based prioritization. Uncertain linguistic, probabilistic, and possibilistic information identified in characterizing contaminated sites can be properly represented as numerical values, intervals, probability distributions, fuzzy sets or possibility distributions, and linguistic variables according to their nature. These different kinds of representation are first transformed into a 2-tuple linguistic representation domain. The propagation of hybrid uncertainties is then carried out in the same domain. This methodology can use the original site information directly as much as possible. The case study shows that this systematic methodology provides more reasonable results. © 2010 SETAC.

  3. A note on probabilistic models over strings: the linear algebra approach.

    Bouchard-Côté, Alexandre

    2013-12-01

    Probabilistic models over strings have played a key role in developing methods that take into consideration indels as phylogenetically informative events. There is an extensive literature on using automata and transducers on phylogenies to do inference on these probabilistic models, in which an important theoretical question is the complexity of computing the normalization of a class of string-valued graphical models. This question has been investigated using tools from combinatorics, dynamic programming, and graph theory, and has practical applications in Bayesian phylogenetics. In this work, we revisit this theoretical question from a different point of view, based on linear algebra. The main contribution is a set of results based on this linear algebra view that facilitate the analysis and design of inference algorithms on string-valued graphical models. As an illustration, we use this method to give a new elementary proof of a known result on the complexity of inference on the "TKF91" model, a well-known probabilistic model over strings. Compared to previous work, our proving method is easier to extend to other models, since it relies on a novel weak condition, triangular transducers, which is easy to establish in practice. The linear algebra view provides a concise way of describing transducer algorithms and their compositions, opens the possibility of transferring fast linear algebra libraries (for example, based on GPUs), as well as low rank matrix approximation methods, to string-valued inference problems.
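
    The linear algebra view pays off because summing a weighted automaton over all (infinitely many) strings reduces to a matrix geometric series: with transition weight matrix A (spectral radius below 1), start vector s and stop vector t, the normalization is Z = s^T (I - A)^{-1} t. A small numerical sketch with a hypothetical two-state automaton (not the TKF91 model itself):

        import numpy as np

        # A[i, j] sums the weights of all single-symbol moves from state i to state j
        A = np.array([[0.3, 0.2],
                      [0.1, 0.4]])
        s = np.array([1.0, 0.0])   # start in state 0
        t = np.array([0.2, 0.3])   # stopping weights

        # Z = sum over all path lengths n of s^T A^n t = s^T (I - A)^{-1} t,
        # valid here because the spectral radius of A is 0.5 < 1
        Z = s @ np.linalg.solve(np.eye(2) - A, t)
        print(Z)

    Solving the linear system rather than inverting explicitly is the standard numerically stable choice, and it is this formulation that lets fast linear algebra libraries be brought to bear, as the abstract notes.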

  4. Global Environmental Change: An integrated modelling approach

    Den Elzen, M.

    1993-01-01

    Two major global environmental problems are dealt with: climate change and stratospheric ozone depletion (and their mutual interactions), briefly surveyed in part 1. In Part 2 a brief description of the integrated modelling framework IMAGE 1.6 is given. Some specific parts of the model are described in more detail in other Chapters, e.g. the carbon cycle model, the atmospheric chemistry model, the halocarbon model, and the UV-B impact model. In Part 3 an uncertainty analysis of climate change and stratospheric ozone depletion is presented (Chapter 4). Chapter 5 briefly reviews the social and economic uncertainties implied by future greenhouse gas emissions. Chapters 6 and 7 describe a model and sensitivity analysis pertaining to the scientific uncertainties and/or lacunae in the sources and sinks of methane and carbon dioxide, and their biogeochemical feedback processes. Chapter 8 presents an uncertainty and sensitivity analysis of the carbon cycle model, the halocarbon model, and the IMAGE model 1.6 as a whole. Part 4 presents the risk assessment methodology as applied to the problems of climate change and stratospheric ozone depletion more specifically. In Chapter 10, this methodology is used as a means with which to asses current ozone policy and a wide range of halocarbon policies. Chapter 11 presents and evaluates the simulated globally-averaged temperature and sea level rise (indicators) for the IPCC-1990 and 1992 scenarios, concluding with a Low Risk scenario, which would meet the climate targets. Chapter 12 discusses the impact of sea level rise on the frequency of the Dutch coastal defence system (indicator) for the IPCC-1990 scenarios. Chapter 13 presents projections of mortality rates due to stratospheric ozone depletion based on model simulations employing the UV-B chain model for a number of halocarbon policies. Chapter 14 presents an approach for allocating future emissions of CO 2 among regions. (Abstract Truncated)

  5. Global Software Engineering: A Software Process Approach

    Richardson, Ita; Casey, Valentine; Burton, John; McCaffery, Fergal

    Our research has shown that many companies are struggling with the successful implementation of global software engineering, due to temporal, cultural and geographical distance, which causes a range of factors to come into play. For example, cultural, project management and communication difficulties continually cause problems for software engineers and project managers. While the implementation of efficient software processes can be used to improve the quality of the software product, published software process models do not cater explicitly for the recent growth in global software engineering. Our thesis is that global software engineering factors should be included in software process models to ensure their continued usefulness in global organisations. Based on extensive global software engineering research, we have developed a software process, Global Teaming, which includes specific practices and sub-practices. The purpose is to ensure that requirements for successful global software engineering are stipulated so that organisations can ensure successful implementation of global software engineering.

  6. An integral approach to the use of probabilistic risk assessment methods

    Schwarzblat, M.; Arellano, J.

    1987-01-01

    In this chapter some of the work developed at the Instituto de Investigaciones Electricas in the area of probabilistic risk analysis is presented. In this area, work has basically focused on two directions: the development and implementation of methods, and applications to real systems. The first part of this paper describes the area of methods development and implementation, presenting an integrated package of computer programs for fault tree analysis. In the second part some of the most important applications developed for real systems are presented. (author)
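
    At its core, fault-tree quantification of the kind such packages implement propagates basic-event probabilities through AND/OR gates, assuming independent events. A minimal sketch (the example tree and its numbers are hypothetical):

        def and_gate(*p):
            """All inputs must fail: product of probabilities (independence assumed)."""
            out = 1.0
            for pi in p:
                out *= pi
            return out

        def or_gate(*p):
            """Any single input failing suffices: complement of all inputs surviving."""
            out = 1.0
            for pi in p:
                out *= (1.0 - pi)
            return 1.0 - out

        # Hypothetical tree: top event occurs if both redundant pumps fail, or the
        # control circuit fails on its own
        pump_a, pump_b, control = 1e-2, 1e-2, 1e-4
        print(or_gate(and_gate(pump_a, pump_b), control))   # ~2.0e-4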

  7. Foundation plate on the elastic half-space, deterministic and probabilistic approach

    Tvrdá Katarína

    2017-01-01

    Interaction between a foundation plate and the subgrade can be described by different mathematical-physical models. The elastic foundation can be modelled by different types of models, e.g. a one-parameter model, a two-parameter model, or a comprehensive model; here the comprehensive Boussinesq model (elastic half-space) was used. The article deals with the deterministic and probabilistic analysis of the deflection of a foundation plate on the elastic half-space. Contact between the foundation plate and the subsoil was modelled using node-to-node contact elements. Finally, the obtained results are presented.

  8. The Global Approach to Quantum Field Theory

    Folacci, Antoine; Jensen, Bruce

    2003-01-01

    Thanks to its impressive success in the second half of the 20th century, both in high-energy physics and in critical phenomena, quantum field theory has enjoyed an abundant literature. We therefore greet yet another book on this subject with caution: what can a monograph on quantum field theory bring now that is new, either conceptually or pedagogically? But when it is written by a physicist such as Bryce DeWitt, who has made his own contribution to the collection of field theory books with The Global Approach to Quantum Field Theory, all suspicion is naturally abandoned. DeWitt has made a formidable contribution to various areas of physics: general relativity, the interpretation of quantum mechanics, and most of all the quantization of non-Abelian gauge theories and quantum gravity. In addition, his pedagogical publications, especially the Les Houches schools of 1963 and 1983, have had a great impact on quantum field theory. We must begin by alerting the potential readers of this book that it cannot be compared to any other book in the field. This uniqueness applies to both the scientific content and the way the ideas are presented. For DeWitt, a central concept of field theory is that of 'space of histories'. For a field varphi^i defined on a given spacetime M, the set of all varphi^i(x) for all x in all charts of M defines its history. It is the space Phi of all possible histories (dynamically allowed or not) of the fields defined on M which is called the 'space of histories' by DeWitt. If only bosonic fields are considered, the space of histories is an infinite-dimensional manifold and if fermionic fields are also present, it must be viewed as an infinite-dimensional supermanifold. The fields can then be regarded as coordinates on these structures, and the geometrical notions of differentiation, metric, connections, measure, as well as the geodesics which can be defined on it, are of fundamental importance in the development of the formalism of quantum field

  10. A probabilistic approach to cost and duration uncertainties in environmental decisions

    Boak, D.M.; Painton, L.

    1996-01-01

    Sandia National Laboratories has developed a method for analyzing life-cycle costs using probabilistic cost forecasting and utility theory to determine the most cost-effective alternatives for safe interim storage of radioactive materials. The method explicitly incorporates uncertainties in cost and storage duration by (1) treating uncertain component costs as random variables represented by probability distributions, (2) treating uncertain durations as chance nodes in a decision tree, and (3) using stochastic simulation tools to generate life-cycle cost forecasts for each storage alternative. The method applies utility functions to the forecasted costs to incorporate the decision maker's risk preferences, making it possible to compare alternatives on the basis of both cost and cost utility. Finally, the method is used to help identify key contributors to the uncertainty in forecasted costs to focus efforts aimed at reducing cost uncertainties. Where significant cost and duration uncertainties exist, and where programmatic decisions must be made despite these uncertainties, probabilistic forecasting techniques can yield important insights into decision alternatives, especially when used as part of a larger decision analysis framework and when properly balanced with deterministic analyses. Although the method is built around an interim storage example, it is potentially applicable to many other environmental decision problems
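
    Steps (1)-(3) above can be sketched in a few lines: component costs as probability distributions, storage duration as a chance node, Monte Carlo simulation for the life-cycle cost forecast, and a risk-averse utility for ranking. All numbers below are hypothetical illustrations, not the Sandia figures:

        import numpy as np

        rng = np.random.default_rng(3)
        n = 200_000

        # (2) Chance node: storage duration of 10 or 25 years with assumed probabilities
        duration = rng.choice([10, 25], size=n, p=[0.6, 0.4])
        # (1) Uncertain component costs as random variables ($M, hypothetical)
        capital = rng.triangular(20, 25, 40, n)
        annual_om = rng.normal(1.5, 0.3, n)

        # (3) Stochastic simulation of the life-cycle cost forecast
        life_cycle_cost = capital + annual_om * duration

        # Exponential disutility of cost; rho sets the decision maker's risk tolerance
        rho = 30.0
        expected_utility = np.mean(-np.exp(life_cycle_cost / rho))

        print(f"mean cost ${life_cycle_cost.mean():.1f}M, "
              f"90th pct ${np.percentile(life_cycle_cost, 90):.1f}M, "
              f"E[utility] {expected_utility:.3f}")

    Comparing alternatives on expected utility rather than mean cost is what lets the ranking reflect risk preferences, as the abstract describes.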

  11. A Global Approach to Foreign Language Education.

    Conner, Maurice W., Ed.

    The papers collected here are largely devoted to foreign language education as a means of increasing international and cross-cultural understanding. Titles include: (1) "Language Is the Medium, Culture Is the Message: Globalizing Foreign Languages" (Lorraine A. Strasheim); (2) "Cultural Understanding for Global Citizenship: An Inservice Model"…

  12. South Africa's transformational approach to global governance ...

    One goal was to transform structures and institutions of global governance while another aim was to place developmental goals on the global agenda. As South Africa targeted UN agencies, notably the Security Council, the IMF, World Bank, WTO and more recently the G20, the curious question begs: will South Africa ...

  13. Probabilistic and possibilistic approach for assessment of radiological risk due to organically bound and tissue free water tritium

    Dahiya, Sudhir; Hegde, A.G.; Joshi, M.L.; Verma, P.C.; Kushwaha, H.S.

    2006-01-01

    This study illustrates the use of two approaches, namely a probabilistic approach using Monte Carlo simulation (MCS) and a possibilistic approach using the fuzzy α-cut (FAC), to estimate the radiological cancer risk to the population from ingestion of organically bound tritium (OBT) and tissue-free water tritium (TFWT) through fish consumption from the Rana Pratap Sagar Lake (RPSL), Kota. Using the FAC technique, the radiological cancer risk rates (year^-1) at the α = 1.0 level were 1.15E-08 and 1.50E-09 for OBT and TFWT, respectively, from the fish ingestion pathway. The radiological cancer risk rates (year^-1) using the MCS approach at the 50th percentile (median) level were 1.14E-08 and 1.49E-09 for OBT and TFWT, respectively, from ingestion of freshwater fish. (author)

  14. Some probabilistic aspects of fracture

    Thomas, J.M.

    1982-01-01

    Some probabilistic aspects of fracture in structural and mechanical components are examined. The principles of fracture mechanics, material quality and inspection uncertainty are formulated into a conceptual and analytical framework for prediction of failure probability. The role of probabilistic fracture mechanics in a more global context of risk and optimization of decisions is illustrated. An example, where Monte Carlo simulation was used to implement a probabilistic fracture mechanics analysis, is discussed. (orig.)
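
    The Monte Carlo implementation mentioned above typically samples flaw size (from inspection data) and fracture toughness (from material quality data) and counts how often the stress-intensity factor exceeds toughness. A textbook-style sketch with illustrative values, not the example analysed in the paper:

        import numpy as np

        rng = np.random.default_rng(11)
        n = 1_000_000

        stress = 300.0                           # MPa, applied stress (deterministic here)
        a = rng.exponential(8e-3, n)             # m, flaw depth from assumed inspection data
        k_ic = rng.normal(90.0, 10.0, n)         # MPa*sqrt(m), fracture toughness scatter
        Y = 1.12                                 # geometry factor for a surface flaw

        k_i = Y * stress * np.sqrt(np.pi * a)    # stress-intensity factor per sampled flaw
        print("P(fracture) =", np.mean(k_i > k_ic))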

  15. [Academic review of global health approaches: an analytical framework].

    Franco-Giraldo, Alvaro

    2015-09-01

    In order to identify perspectives on global health, this essay analyzes different trends from academia that have enriched global health and international health. A database was constructed with information from the world's leading global health centers. The search covered authors on global diplomacy and global health and was performed in PubMed, LILACS, and Google Scholar with the key words "global health" and "international health". Research and training centers in different countries have taken various academic approaches to global health; various interests and ideological orientations have emerged in relation to the global health concept. Based on the mosaic of global health centers and their positions, the review concludes that the new concept reflects the construction of a paradigm of renewal in international health and global health, the pre-paradigmatic stage of which has still not reached a final version.

  16. Global Mindset: An Entrepreneur's Perspective on the Born-Global Approach

    Robert Poole

    2012-10-01

    The born-global approach calls for a startup to address the needs of a global market from inception. This approach provides an attractive alternative to the conventional staged approach to internationalization whereby a startup first operates in its home market and then enters one or more foreign markets sequentially. This article highlights the mindset change that an entrepreneur must make to move from the conventional staged approach to the born-global approach. The author of this article is an experienced entrepreneur and the article describes his own mindset change that occurred when enacting the born-global approach. The author uses his own experience and company as a case study to develop recommendations for other entrepreneurs who are evaluating the born-global approach to launch and grow a technology company.

  17. Approaching Pediatric Cancers through a Global Lens

    Since the incidence of pediatric cancer is relatively constant worldwide, strengthening population-based registries to collect data on the extent of disease at diagnosis would be helpful in determining if late diagnosis may explain differences in outcomes globally.

  18. Substation design improvement with a probabilistic reliability approach using the TOPASE program

    Bulot, M.; Heroin, G.; Bergerot, J-L.; Le Du, M. [Electricite de France (France)

    1997-12-31

    TOPASE (the French acronym for Probabilistic Tools and Data Processing for the Analysis of Electric Systems), developed by Electricite de France (EDF) to perform reliability studies on transmission substations, was described. TOPASE serves the dual objective of assisting in the automation of HV substation studies and of enabling electrical systems experts who are not necessarily specialists in reliability studies to perform such studies. The program is capable of quantifying the occurrence rate of undesirable events and of identifying critical equipment and the main incident scenarios. The program can be used to improve an existing substation, to choose an HV structure during the design stage, or to choose a system of protective devices. Data collected during 1996 and 1997 will be analyzed to identify useful experiences and to validate the basic concepts of the program. 4 figs.

  19. A Probabilistic Approach to Control of Complex Systems and Its Application to Real-Time Pricing

    Koichi Kobayashi

    2014-01-01

    Control of complex systems is one of the fundamental problems in control theory. In this paper, a control method for complex systems modeled by a probabilistic Boolean network (PBN) is studied. A PBN is widely used as a model of complex systems such as gene regulatory networks. For a PBN, the structural control problem is newly formulated. In this problem, the discrete probability distribution appearing in a PBN is controlled by a continuous-valued input. For this problem, an approximate solution method using a matrix-based representation of a PBN is proposed. The problem is then approximated by a linear programming problem. Furthermore, the proposed method is applied to the design of real-time pricing systems for electricity. Electricity conservation is achieved by appropriately determining the electricity price over time. The effectiveness of the proposed method is presented by a numerical example on real-time pricing systems.
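
    A toy version of the idea, under strong simplifying assumptions (a 2-gene network, a scalar input u that mixes two candidate transition matrices, and a one-step objective), shows how this kind of structural control problem can reduce to a linear program; the matrices and target below are invented for illustration.

```python
import numpy as np
from scipy.optimize import linprog

# Two Boolean-network transition matrices over the 4 states of a 2-gene network
# (entry [next, current]); columns sum to one. Values are illustrative.
A1 = np.array([[1, 0, 0, 0],
               [0, 0, 1, 0],
               [0, 1, 0, 0],
               [0, 0, 0, 1]], dtype=float)
A2 = np.array([[0, 0, 0, 0],
               [1, 1, 0, 0],
               [0, 0, 1, 0],
               [0, 0, 0, 1]], dtype=float)

x0 = np.array([0.25, 0.25, 0.25, 0.25])   # initial state distribution
target = 1                                # index of the desired state

# One-step distribution x1(u) = (u*A1 + (1-u)*A2) @ x0 is linear in the input u,
# so maximizing x1[target] subject to 0 <= u <= 1 is a (trivial) linear program.
c = -((A1 @ x0) - (A2 @ x0))[target]      # coefficient of u in the objective
res = linprog(c=[c], bounds=[(0.0, 1.0)])
u_opt = res.x[0]
x1 = (u_opt * A1 + (1 - u_opt) * A2) @ x0
print(f"optimal input u = {u_opt:.2f}, P(target state) = {x1[target]:.3f}")
```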

  20. The Performance of Structure-Controller Coupled Systems Analysis Using Probabilistic Evaluation and Identification Model Approach

    Mosbeh R. Kaloop

    2017-01-01

    This study evaluates the performance of a passively controlled steel frame building under dynamic loads using time series analysis. A novel application of time- and frequency-domain evaluation is utilized to analyze the behavior of the controlling systems. In addition, autoregressive moving average (ARMA) neural networks are employed to identify the performance of the controller system. Three passive vibration control devices are utilized in this study, namely, the tuned mass damper (TMD), tuned liquid damper (TLD), and tuned liquid column damper (TLCD). The results show that the TMD control system is a more reliable controller than the TLD and TLCD systems in terms of vibration mitigation. The probabilistic evaluation and identification model showed that the probability analysis and the ARMA neural network model are suitable to evaluate and predict the response of coupled building-controller systems.

  1. Probabilistic insurance

    Wakker, P.P.; Thaler, R.H.; Tversky, A.

    1997-01-01

    Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in the premium to compensate for a 1% default risk. While these preferences are intuitively appealing they are difficult to reconcile with expected utility theory. Under highly plausible assumptions about the utility function, willingness to pay for probabilistic i...
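
    The expected-utility side of the argument is easy to reproduce. The sketch below, under an assumed CRRA utility and one common specification of probabilistic insurance (premium refunded on default), finds the premium at which the consumer is indifferent; the required reduction comes out near 1%, far below the 20%+ observed in surveys. All numbers are hypothetical.

```python
from scipy.optimize import brentq

W, L, p, r = 100_000.0, 20_000.0, 0.01, 0.01  # wealth, loss, loss prob., default prob.

def u(w, gamma=2.0):
    """CRRA utility (an illustrative, not canonical, choice)."""
    return w ** (1 - gamma) / (1 - gamma)

premium = p * L                      # actuarially fair premium for full insurance
EU_full = u(W - premium)

def EU_prob(prem):
    # Probabilistic insurance: with probability r, the insurer defaults when a
    # loss occurs; the premium is refunded and the loss is borne by the consumer.
    return (1 - p) * u(W - prem) + p * ((1 - r) * u(W - prem) + r * u(W - L))

# Premium that makes the expected-utility maximizer indifferent.
prem_star = brentq(lambda x: EU_prob(x) - EU_full, 1.0, premium)
print(f"required premium reduction under EU: "
      f"{100 * (premium - prem_star) / premium:.1f}%  (surveys: >20%)")
```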

  2. US Department of Energy Approach to Probabilistic Evaluation of Long-Term Safety for a Potential Yucca Mountain Repository

    Dr. R. Dyer; Dr. R. Andrews; Dr. A. Van Luik

    2005-01-01

    Regulatory requirements being addressed in the US geological repository program for spent nuclear fuel and high-level waste disposal specify probabilistically defined mean-value dose limits. These dose limits reflect acceptable levels of risk. The probabilistic approach mandated by regulation calculates a "risk of a dose", the risk of a potential given dose value at a specific time in the future to a hypothetical person. The mean value of the time-dependent performance measure needs to remain below an acceptable level defined by regulation. Because there are uncertain parameters that are important to system performance, the regulation mandates an analysis focused on the mean value of the performance measure, but one that also explores the "full range of defensible and reasonable parameter distributions"... System performance evaluations should not be unduly influenced by... "extreme physical situations and parameter values". Challenges in this approach lie in defending the scientific basis for the models selected, and the data and distributions sampled. A significant challenge lies in showing that uncertainties are properly identified and evaluated. A single-value parameter has no uncertainty, and where such values are used they need to be supported by scientific information showing the selected value is appropriate. Uncertainties are inherent in data, but are also introduced by creating parameter distributions from data sets, selecting models from among alternative models, abstracting models for use in probabilistic analysis, and selecting the range of initiating event probabilities for unlikely events. The goal of the assessment currently in progress is to evaluate the level of risk inherent in moving ahead to the next phase of repository development: construction. During the construction phase, more will be learned to inform a new long-term risk evaluation to support moving to the next phase: accepting waste. Therefore, though there was sufficient confidence of safety

  3. Tsunamigenic scenarios for southern Peru and northern Chile seismic gap: Deterministic and probabilistic hybrid approach for hazard assessment

    González-Carrasco, J. F.; Gonzalez, G.; Aránguiz, R.; Yanez, G. A.; Melgar, D.; Salazar, P.; Shrivastava, M. N.; Das, R.; Catalan, P. A.; Cienfuegos, R.

    2017-12-01

    Plausible worst-case tsunamigenic scenario definition plays a relevant role in tsunami hazard assessment focused on emergency preparedness and evacuation planning for coastal communities. During the last decade, the occurrence of major and moderate tsunamigenic earthquakes along worldwide subduction zones has given clues about critical parameters involved in near-field tsunami inundation processes, i.e. slip spatial distribution, shelf resonance of edge waves and local geomorphology effects. To analyze the effects of these seismic and hydrodynamic variables on the epistemic uncertainty of coastal inundation, we implement a combined methodology using deterministic and probabilistic approaches to construct 420 tsunamigenic scenarios in a mature seismic gap of southern Peru and northern Chile, extending from 17°S to 24°S. The deterministic scenarios are calculated using a regional distribution of trench-parallel gravity anomaly (TPGA) and trench-parallel topography anomaly (TPTA), the three-dimensional Slab 1.0 worldwide subduction zone geometry model and published interseismic coupling (ISC) distributions. As a result, we find four high slip deficit zones interpreted as major seismic asperities of the gap, used in a hierarchical tree scheme to generate ten tsunamigenic scenarios with seismic magnitudes fluctuating between Mw 8.4 and Mw 8.9. Additionally, we construct ten homogeneous slip scenarios as an inundation baseline. For the probabilistic approach, we implement a Karhunen-Loève expansion to generate 400 stochastic tsunamigenic scenarios over the maximum extension of the gap, with the same magnitude range as the deterministic sources. All the scenarios are simulated with the non-hydrostatic tsunami model Neowave 2D, using a classical nesting scheme, for five major coastal cities in northern Chile (Arica, Iquique, Tocopilla, Mejillones and Antofagasta), obtaining high resolution data of inundation depth, runup, coastal currents and sea level elevation. The
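
    The Karhunen-Loève step can be illustrated with a one-dimensional toy fault: build a Gaussian covariance over patches, take its leading eigenmodes, and weight them with standard normal coefficients to draw stochastic slip scenarios. Patch count, correlation length, and mean slip below are illustrative, not the study's values.

```python
import numpy as np

rng = np.random.default_rng(7)

# 1-D fault discretized into n patches; a Gaussian covariance with correlation
# length L controls slip smoothness. Purely illustrative parameters.
n, length, L = 100, 400.0, 60.0            # patches, fault length [km], corr. length [km]
x = np.linspace(0.0, length, n)
C = np.exp(-((x[:, None] - x[None, :]) ** 2) / (2 * L ** 2))

# Karhunen-Loeve expansion: eigendecomposition of the covariance, truncated to
# the leading modes, gives slip = mean + sum_k sqrt(lam_k) * z_k * phi_k.
lam, phi = np.linalg.eigh(C)
idx = np.argsort(lam)[::-1][:20]           # keep the 20 dominant modes
lam = np.clip(lam[idx], 0.0, None)         # guard against tiny negative eigenvalues
phi = phi[:, idx]

mean_slip = 5.0                            # mean slip [m]
z = rng.standard_normal(len(lam))
slip = mean_slip + phi @ (np.sqrt(lam) * z)
slip = np.clip(slip, 0.0, None)            # no negative slip
print(f"scenario slip: min {slip.min():.2f} m, max {slip.max():.2f} m")
```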

  4. Agent autonomy approach to probabilistic physics-of-failure modeling of complex dynamic systems with interacting failure mechanisms

    Gromek, Katherine Emily

    A novel computational and inference framework for physics-of-failure (PoF) reliability modeling of complex dynamic systems has been established in this research. The PoF-based reliability models are used to perform a real-time simulation of system failure processes, so that system-level reliability modeling constitutes inferences from checking the status of component-level reliability at any given time. The "agent autonomy" concept is applied as a solution method for the system-level probabilistic PoF-based (i.e. PPoF-based) modeling. This concept originated from artificial intelligence (AI) as a leading intelligent computational inference in modeling of multi-agent systems (MAS). The concept of agent autonomy in the context of reliability modeling was first proposed by M. Azarkhail [1], where a fundamentally new idea of system representation by autonomous intelligent agents for the purpose of reliability modeling was introduced. The contribution of the current work lies in the further development of the agent autonomy concept, particularly the refined agent classification within the scope of PoF-based system reliability modeling, new approaches to the learning and autonomy properties of the intelligent agents, and modeling of interacting failure mechanisms within the dynamic engineering system. The autonomous property of intelligent agents is defined as an agent's ability to self-activate, deactivate or completely redefine its role in the analysis. This property of agents, together with the ability to model interacting failure mechanisms of the system elements, makes agent autonomy fundamentally different from all existing methods of probabilistic PoF-based reliability modeling. 1. Azarkhail, M., "Agent Autonomy Approach to Physics-Based Reliability Modeling of Structures and Mechanical Systems", PhD thesis, University of Maryland, College Park, 2007.

  5. The Global Approach to Quantum Field Theory

    Fulling, S A [Texas A and M University (United States)

    2006-05-21

    Parts I and II develop the basic classical and quantum kinematics of fields and other dynamical systems. The presentation is conducted in the utmost generality, allowing for dynamical quantities that may be anticommuting (supernumbers) and theories subject to the most general possible gauge symmetry. The basic ingredients are action functionals and the Peierls bracket, a manifestly covariant replacement for the Poisson bracket and equal-time commutation relations. For DeWitt the logical progression is Peierls bracket → Schwinger action principle → Feynman functional integral, although he points out that the historical development was in the opposite order. It must be pointed out that the Peierls-Schwinger-DeWitt approach, despite some advantages over initial-value formulations, has some troubles of its own. In particular, it has never completely escaped from the arena of scattering theory, the paradigm of conventional particle physics. One is naturally led to study matrix elements between an 'in-vacuum' and an 'out-vacuum', though such concepts are murky in situations, such as big bangs and black holes, where the ambient geometry is not asymptotically static in the far past and future. The newest material in the treatise appears in two chapters in part II devoted to the interpretation of quantum theory, incorporating some unpublished work of David Deutsch on the meaning of probability in physics. Parts III through V apply the formalism in depth to successively more difficult classes of systems: quantum mechanics, linear (free) fields, and interacting fields. DeWitt's characteristic tools of effective actions, heat kernels, and ghost fields are developed. Chapters 26 and 31 outline new approaches developed in collaboration with DeWitt's recent students C Molina-Paris and C Y Wang, respectively. Most of parts VI and VII consist of special topics, such as anomalies, particle creation by external fields, Unruh acceleration

  7. Reliability calculation of cracked components using probabilistic fracture mechanics and a Markovian approach

    Schmidt, T.

    1988-01-01

    The numerical reliability calculation of cracked structural components under cyclic fatigue stress can be done with the help of models of probabilistic fracture mechanics. An alternative to the Monte Carlo simulation method is examined; the alternative method is based on the description of failure processes with the help of a Markov process. The Markov method is traced back directly to the stochastic parameters of a two-dimensional fracture mechanics model, with the effects of inspections and repairs also being considered. The probability of failure and the expected failure frequency can be determined as time functions from the transition and conditional probabilities of the original or derived Markov process. For concrete calculation, an approximating Markov chain is designed which, under certain conditions, is capable of giving a sufficient approximation of the original Markov process and the reliability characteristics determined by it. The application of the MARKOV program code developed from this algorithm reveals sufficient conformity with the Monte Carlo reference results. The starting point of the investigation was the 'Deutsche Risikostudie B (DWR)' ('German Risk Study B (DWR)'), specifically, the reliability of the main coolant line. (orig./HP)
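
    The essence of the Markov alternative to Monte Carlo can be sketched with a small discrete chain: crack-depth states plus an absorbing failure state, propagated by a transition matrix, with periodic inspections that move detected cracks back to the initial state. The transition and detection probabilities below are illustrative assumptions, not values from the study.

```python
import numpy as np

# Discrete crack-depth states 0..4 plus an absorbing "failed" state 5.
# P[i, j] = probability of moving from state i to state j per load block.
P = np.array([
    [0.95, 0.05, 0.00, 0.00, 0.00, 0.00],
    [0.00, 0.93, 0.07, 0.00, 0.00, 0.00],
    [0.00, 0.00, 0.90, 0.09, 0.00, 0.01],
    [0.00, 0.00, 0.00, 0.85, 0.12, 0.03],
    [0.00, 0.00, 0.00, 0.00, 0.90, 0.10],
    [0.00, 0.00, 0.00, 0.00, 0.00, 1.00],
])

def inspect(p, detect=np.array([0.1, 0.3, 0.6, 0.8, 0.9, 0.0])):
    """Inspection/repair: detected cracks are repaired back to state 0."""
    repaired = p * detect
    p = p * (1 - detect)
    p[0] += repaired.sum()
    return p

p = np.zeros(6)
p[0] = 1.0                               # start in the smallest crack state
for block in range(1, 41):
    p = p @ P                            # propagate the state distribution
    if block % 10 == 0:                  # periodic in-service inspection
        p = inspect(p)

# Cumulative failure probability is the mass in the absorbing state.
print(f"P(failure within 40 blocks): {p[5]:.3e}")
```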

  8. Multi-model approach to petroleum resource appraisal using analytic methodologies for probabilistic systems

    Crovelli, R.A.

    1988-01-01

    The geologic appraisal model that is selected for a petroleum resource assessment depends upon purpose of the assessment, basic geologic assumptions of the area, type of available data, time available before deadlines, available human and financial resources, available computer facilities, and, most importantly, the available quantitative methodology with corresponding computer software and any new quantitative methodology that would have to be developed. Therefore, different resource assessment projects usually require different geologic models. Also, more than one geologic model might be needed in a single project for assessing different regions of the study or for cross-checking resource estimates of the area. Some geologic analyses used in the past for petroleum resource appraisal involved play analysis. The corresponding quantitative methodologies of these analyses usually consisted of Monte Carlo simulation techniques. A probabilistic system of petroleum resource appraisal for play analysis has been designed to meet the following requirements: (1) includes a variety of geologic models, (2) uses an analytic methodology instead of Monte Carlo simulation, (3) possesses the capacity to aggregate estimates from many areas that have been assessed by different geologic models, and (4) runs quickly on a microcomputer. Geologic models consist of four basic types: reservoir engineering, volumetric yield, field size, and direct assessment. Several case histories and present studies by the U.S. Geological Survey are discussed. © 1988 International Association for Mathematical Geology.

  9. Stable oscillations of a predator-prey probabilistic cellular automaton: a mean-field approach

    Tome, Tania; Carvalho, Kelly C de

    2007-01-01

    We analyze a probabilistic cellular automaton describing the dynamics of coexistence of a predator-prey system. The individuals of each species are localized over the sites of a lattice and the local stochastic updating rules are inspired by the processes of the Lotka-Volterra model. Two levels of mean-field approximations are set up. The simple approximation is equivalent to an extended patch model, a simple metapopulation model with patches colonized by prey, patches colonized by predators and empty patches. This approximation is capable of describing the limited available space for species occupancy. The pair approximation is moreover able to describe two types of coexistence of prey and predators: one where population densities are constant in time and another displaying self-sustained time oscillations of the population densities. The oscillations are associated with limit cycles and arise through a Hopf bifurcation. They are stable against changes in the initial conditions and, in this sense, they differ from the Lotka-Volterra cycles which depend on initial conditions. In this respect, the present model is biologically more realistic than the Lotka-Volterra model.
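
    The simple (one-site) mean-field approximation mentioned above can be written down directly as a patch-occupancy ODE; consistent with the abstract, this level of approximation relaxes to constant densities, and the self-sustained oscillations appear only at the pair level. The rates below are illustrative stand-ins for the automaton's local rules.

```python
# Mean-field (patch) approximation of the predator-prey automaton:
# x = fraction of prey patches, y = predator patches, 1 - x - y = empty patches.
a, b, c = 1.0, 2.0, 0.6      # prey colonization, predation, predator death (illustrative)

def rhs(x, y):
    dx = a * x * (1 - x - y) - b * x * y   # prey spread limited by free space
    dy = b * x * y - c * y                 # predators grow on prey, die out
    return dx, dy

x, y, dt = 0.4, 0.1, 0.01                  # initial densities, Euler time step
for step in range(50_000):
    dx, dy = rhs(x, y)
    x, y = x + dt * dx, y + dt * dy

# The simple approximation damps toward a fixed point (constant densities).
print(f"long-time densities: prey {x:.3f}, predators {y:.3f}")
```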

  10. A probabilistic approach to safety/reliability of space nuclear power systems

    Medford, G.; Williams, K.; Kolaczkowski, A.

    1989-01-01

    An ongoing effort is investigating the feasibility of using probabilistic risk assessment (PRA) modeling techniques to construct a living model of a space nuclear power system. This is being done in conjunction with a traditional reliability and survivability analysis of the SP-100 space nuclear power system. The initial phase of the project consists of three major parts with the overall goal of developing a top-level system model and defining initiating events of interest for the SP-100 system. The three major tasks were performing a traditional survivability analysis, performing a simple system reliability analysis, and constructing a top-level system fault-tree model. Each of these tasks and their interim results are discussed in this paper. Initial results from the study support the conclusion that PRA modeling techniques can provide a valuable design and decision-making tool for space reactors. The ability of the model to rank and calculate relative contributions from various failure modes allows design optimization for maximum safety and reliability. Future efforts in the SP-100 program will see data development and quantification of the model to allow parametric evaluations of the SP-100 system. Current efforts have shown the need for formal data development and test programs within such a modeling framework.

  11. Application of dynamic probabilistic safety assessment approach for accident sequence precursor analysis: Case study for steam generator tube rupture

    Lee, Han Sul; Heo, Gyun Young [Kyung Hee University, Yongin (Korea, Republic of); Kim, Tae Wan [Incheon National University, Incheon (Korea, Republic of)

    2017-03-15

    The purpose of this research is to introduce the technical standard of accident sequence precursor (ASP) analysis, and to propose a case study using the dynamic probabilistic safety assessment (D-PSA) approach. The D-PSA approach can aid in the determination of high-risk/low-frequency accident scenarios from all potential scenarios. It can also be used to investigate the dynamic interaction between the physical state and the actions of the operator in an accident situation for risk quantification. This approach offers significant potential for safety analysis. Furthermore, the D-PSA approach provides a more realistic risk assessment by minimizing the assumptions used in the conventional PSA model (the so-called static PSA, or S-PSA, model), which is comparatively static. We performed risk quantification of a steam generator tube rupture (SGTR) accident using the dynamic event tree (DET) methodology, which is the most widely used methodology in D-PSA. The risk quantification results of D-PSA and S-PSA are compared and evaluated. Suggestions and recommendations for using D-PSA are described in order to provide a technical perspective.

  13. Probabilistic Approach to Provide Scenarios of Earthquake-Induced Slope Failures (PARSIFAL Applied to the Alcoy Basin (South Spain

    Salvatore Martino

    2018-02-01

    The PARSIFAL (Probabilistic Approach to pRovide Scenarios of earthquake-Induced slope FAiLures) approach was applied in the basin of Alcoy (Alicante, South Spain) to provide a comprehensive scenario of earthquake-induced landslides. The basin of Alcoy is well known for several historical landslides, mainly represented by earth-slides, that involve urban settlements as well as infrastructure (i.e., roads, bridges). PARSIFAL overcomes several limits of other approaches, allowing the concomitant analysis of: (i) first-time landslides (due to both rock-slope failures and shallow earth-slides) and reactivations of existing landslides; (ii) slope stability analyses of different failure mechanisms; (iii) comprehensive mapping of earthquake-induced landslide scenarios in terms of exceedance probability of critical threshold values of co-seismic displacements. Geotechnical data were used to constrain the slope stability analysis, while specific field surveys were carried out to measure jointing and strength conditions of rock masses and to inventory already existing landslides. GIS-based susceptibility analyses were performed to assess the proneness to shallow earth-slides as well as to verify kinematic compatibility with planar or wedge rock-slides and with topples. The application of PARSIFAL to the Alcoy basin: (i) confirms the suitability of the approach at a municipality scale, (ii) highlights the main role of saturation in conditioning slope instabilities in this case study, (iii) demonstrates the reliability of the obtained results with respect to the historical data.

  14. Probabilistic Networks

    Jensen, Finn Verner; Lauritzen, Steffen Lilholt

    2001-01-01

    This article describes the basic ideas and algorithms behind specification and inference in probabilistic networks based on directed acyclic graphs, undirected graphs, and chain graphs.

  15. Probabilistic Insurance

    Wakker, P.P.; Thaler, R.H.; Tversky, A.

    1997-01-01

    Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in premium to compensate for a 1% default risk. These observations cannot be

  17. Stormwater Tank Performance: Design and Management Criteria for Capture Tanks Using a Continuous Simulation and a Semi-Probabilistic Analytical Approach

    Flavio De Martino

    2013-10-01

    Stormwater tank performance significantly depends on management practices. This paper proposes a procedure to assess tank efficiency in terms of volume and pollutant concentration using four different capture tank management protocols. The comparison of the efficiency results reveals that, as expected, a combined bypass-stormwater tank system achieves better results than a tank alone. The management practices tested for the tank-only systems provide notably different efficiency results. The practice of emptying the tank immediately after the end of the event exhibits significant levels of efficiency and operational advantages. All other configurations exhibit either significant operational problems or very low performance. The continuous simulation and the semi-probabilistic approach for the best tank management practice are compared. The semi-probabilistic approach is based on a Weibull probabilistic model of the main characteristics of the rainfall process. Following this approach, efficiency indexes were established. The comparison with continuous simulations shows the reliability of the probabilistic approach, even if the latter is certainly very site-sensitive.
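
    A sketch of the semi-probabilistic idea for the best-performing protocol (tank emptied immediately after each event): if event runoff volumes follow a fitted Weibull distribution, the volumetric capture efficiency of a tank of volume V can be estimated as E[min(V_event, V)] / E[V_event]. The Weibull parameters and tank volume below are illustrative, not fitted to any site.

```python
import numpy as np

rng = np.random.default_rng(3)

# Event runoff volumes modeled as Weibull (shape k, scale lam, in m^3);
# purely illustrative parameters, in reality fitted to local rainfall records.
k, lam = 0.8, 50.0
V_tank = 80.0                                 # tank volume [m^3]

vols = lam * rng.weibull(k, size=1_000_000)   # sampled event volumes
captured = np.minimum(vols, V_tank)           # tank assumed empty before each event
efficiency = captured.sum() / vols.sum()
print(f"volumetric capture efficiency: {efficiency:.2%}")
```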

  18. A Decision Support System Coupling Fuzzy Logic and Probabilistic Graphical Approaches for the Agri-Food Industry: Prediction of Grape Berry Maturity.

    Perrot, Nathalie; Baudrit, Cédric; Brousset, Jean Marie; Abbal, Philippe; Guillemin, Hervé; Perret, Bruno; Goulet, Etienne; Guerin, Laurence; Barbeau, Gérard; Picque, Daniel

    2015-01-01

    Agri-food is one of the most important sectors of industry and a major contributor to the global warming potential in Europe. Sustainability issues pose a huge challenge for this sector. In this context, a big issue is to be able to predict the multiscale dynamics of those systems using computing science. A robust predictive mathematical tool is implemented for this sector and applied to the wine industry, and it can easily be generalized to other applications. Grape berry maturation relies on complex and coupled physicochemical and biochemical reactions which are climate dependent. Moreover, one experiment represents one year, so climate variability cannot be covered by the experiments alone. Consequently, harvest mostly relies on expert predictions. A big challenge for the wine industry is nevertheless to be able to anticipate the reactions for sustainability purposes. We propose to implement a decision support system called FGRAPEDBN, able to (1) capitalize on the heterogeneous fragmented knowledge available, including data and expertise, and (2) predict the sugar (resp. the acidity) concentrations with a relevant RMSE of 7 g/l (resp. 0.44 g/l and 0.11 g/kg). FGRAPEDBN is based on a coupling between a probabilistic graphical approach and a fuzzy expert system.

  19. Game meat consumption by hunters and their relatives: A probabilistic approach.

    Sevillano Morales, Jesus; Moreno-Ortega, Alicia; Amaro Lopez, Manual Angel; Arenas Casas, Antonio; Cámara-Martos, Fernando; Moreno-Rojas, Rafael

    2018-06-18

    This study aimed to estimate the consumption of meat and products derived from hunting by the consumer population and, specifically, by hunters and their relatives. For this purpose, a survey was conducted on the frequency of consuming meat from the four most representative game species in Spain: two big game species, wild boar (Sus scrofa) and red deer (Cervus elaphus), and two small game species, rabbit (Oryctolagus cuniculus) and red partridge (Alectoris rufa), as well as processed meat products (salami-type sausage) made from those big game species. The survey was carried out on 337 habitual consumers of these types of products (hunters and their relatives). The total mean game meat consumption, per capita in this population group, is 6.87 kg/person/year of meat and 8.57 kg/person/year if the processed meat products are also considered. Consumption of rabbit, red partridge, red deer and wild boar, individually, was 1.85, 0.82, 2.28 and 1.92 kg/person/year, respectively. It was observed that hunters generally registered a larger intake of game meat, this being statistically significant in the case of rabbit meat consumption. Using probabilistic methods, the meat consumption frequency distributions were estimated for each hunting species studied, as well as for the products made from the big game species and for total consumption, both of meat by itself and including the products made from it. The consumption frequency distributions were fitted to exponential distributions, with goodness of fit verified by the Akaike Information Criterion, the Bayesian Information Criterion, and the Chi-Squared and Kolmogorov-Smirnov statistics. In addition, the consumption percentiles of the different distributions were obtained. The latter could be a good tool when making nutrition or contaminant studies, since they permit the assessment of exposure to the compound in question.

  20. Global evolution: New approach to understanding

    Damiani, V.

    1993-01-01

    Current threats to the health of the environment - urban air pollution, deforestation, water pollution, etc. are taking on an ever increasing global dimension and it is becoming clear that it will be impossible to develop real solutions to these problems without effectively and contemporaneously resolving the deep social and political problems which are affecting just about every part of the globe. The cause of past failures in environmental protection policy implementation can be ascribed to one of the main defects of our society - that of not having cultivated intuitive knowledge through direct as opposed to intellectual experience; and this defect was probably the result of man having separated biological and cultural aspects from human nature during the course of civilization. To accomplish the formidable task of global environmental restoration, mankind must re-program his ways of living and reasoning which are not in harmony with nature. Conventional rational methods of thinking, which are highly linear, must give way to an intuitive process of comprehension to allow man to successfully deal with the maintenance of the earth's dynamic and non-linear ecosystems and socio-economic frameworks

  1. Unified approach for estimating the probabilistic design S-N curves of three commonly used fatigue stress-life models

    Zhao Yongxiang; Wang Jinnuo; Gao Qing

    2001-01-01

    A unified approach, referred to as the general maximum likelihood method, is presented for estimating the probabilistic design S-N curves and their confidence bounds for the three commonly used fatigue stress-life models, namely the three-parameter, Langer and Basquin models. The curves are described by a general form of mean and standard deviation S-N curves of the logarithm of fatigue life. Different from existing methods, i.e., the conventional method and the classical maximum likelihood method, the present approach considers the statistical characteristics of the whole test data set. The parameters of the mean curve are first estimated by the least squares method, and then the parameters of the standard deviation curve are evaluated by a mathematical programming method so as to agree with the maximum likelihood principle. Fit quality of the curves is assessed by the fitted relation coefficient, the total fitted standard error and the confidence bounds. Application to the virtual stress amplitude-crack initiation life data of a nuclear engineering material, Chinese 1Cr18Ni9Ti stainless steel pipe-weld metal, has indicated the validity of the approach for S-N data where both S and N show the character of random variables. Application to the two sets of S-N data of Chinese 45 carbon steel notched specimens (kt = 2.0) has indicated the validity of the present approach for test results obtained from group fatigue testing and from maximum likelihood fatigue testing, respectively. In these applications, it was revealed that in general the fit is best for the three-parameter model, slightly inferior for the Langer relation and poor for the Basquin equation. Relative to the existing methods, the present approach provides a better fit. In addition, the possible non-conservative predictions of the existing methods, which result from the influence of local statistical characteristics of the data, are also overcome by the present approach.
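
    The two-step structure described above (least squares for the mean curve, then a maximum likelihood fit for the scatter) can be sketched on synthetic Basquin-type data; the stress-dependent form chosen for sigma(S) and all numbers below are illustrative assumptions, not the paper's.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Synthetic S-N data for the Basquin model log10(N) = A + B * log10(S),
# with stress-dependent scatter; values are illustrative only.
S = np.repeat([200, 250, 300, 350, 400], 6).astype(float)
logN = 12.0 - 3.0 * np.log10(S) + rng.normal(0, 0.15, S.size)

# Step 1: least squares for the mean curve parameters.
X = np.column_stack([np.ones_like(S), np.log10(S)])
coef, *_ = np.linalg.lstsq(X, logN, rcond=None)
resid = logN - X @ coef

# Step 2: maximum likelihood (via numerical programming) for a stress-dependent
# standard deviation sigma(S) = c0 + c1 * log10(S), fitted to the residuals.
def neg_loglik(c):
    sigma = c[0] + c[1] * np.log10(S)
    if np.any(sigma <= 0):
        return np.inf
    return np.sum(np.log(sigma) + 0.5 * (resid / sigma) ** 2)

res = minimize(neg_loglik, x0=[0.1, 0.0], method="Nelder-Mead")
print("mean curve  A, B :", coef)
print("sigma curve c0,c1:", res.x)
```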

  2. Principles or imagination? Two approaches to global justice.

    Coeckelbergh, Mark

    2007-01-01

    What does it mean to introduce the notion of imagination in the discussion about global justice? What is gained by studying the role of imagination in thinking about global justice? Does a focus on imagination imply that we must replace existing influential principle-centred approaches such as that

  3. Children as Global Citizens: A Socratic Approach to Teaching Character

    Helterbran, Valeri R.; Strahler, Brianna R.

    2013-01-01

    Educators around the world are being challenged to promote positive global citizenship skills in the face of daily news concerning widespread discord, dissonance, injustice, and corruption. This article describes a Socratic approach to developing global citizenship. Recognizing the central role of teachers in educating future generations of a…

  4. A probabilistic approach for estimating the spatial extent of pesticide agricultural use sites and potential co-occurrence with listed species for use in ecological risk assessments.

    Budreski, Katherine; Winchell, Michael; Padilla, Lauren; Bang, JiSu; Brain, Richard A

    2016-04-01

    A crop footprint refers to the estimated spatial extent of growing areas for a specific crop, and is commonly used to represent the potential "use site" footprint for a pesticide labeled for use on that crop. A methodology for developing probabilistic crop footprints to estimate the likelihood of pesticide use and the potential co-occurrence of pesticide use and listed species locations was tested at the national scale and compared to alternative methods. The probabilistic aspect of the approach accounts for annual crop rotations and the uncertainty in remotely sensed crop and land cover data sets. The crop footprints used historically are derived exclusively from the National Land Cover Database (NLCD) Cultivated Crops and/or Pasture/Hay classes. This approach broadly aggregates agriculture into 2 classes, which grossly overestimates the spatial extent of individual crops that are labeled for pesticide use. The approach also does not use all the available crop data, represents a single point in time, and does not account for the uncertainty in land cover data set classifications. The probabilistic crop footprint approach described herein incorporates best available information at the time of analysis from the National Agricultural Statistics Service (NASS) Cropland Data Layer (CDL) for 5 y (2008-2012 at the time of analysis), the 2006 NLCD, the 2007 NASS Census of Agriculture, and 5 y of NASS Quick Stats (2008-2012). The approach accounts for misclassification of crop classes in the CDL by incorporating accuracy assessment information by state, year, and crop. The NLCD provides additional information to improve the CDL crop probability through an adjustment based on the NLCD accuracy assessment data using the principles of Bayes' Theorem. Finally, crop probabilities are scaled at the state level by comparing against NASS surveys (Census of Agriculture and Quick Stats) of reported planted acres by crop. In an example application of the new method, the probabilistic
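
    The Bayes' theorem adjustment at the heart of the approach can be sketched for a single pixel: combine a prior crop share with the classification accuracies reported in an accuracy assessment. All probabilities below are hypothetical, not values from the CDL or NLCD assessments.

```python
# Sketch of the Bayes-theorem adjustment for a pixel the CDL classifies as "corn".
p_corn_prior = 0.30          # prior share of corn in the region (e.g., from surveys)

# Accuracy-assessment inputs: P(classified corn | truly corn) and the
# false-positive rate P(classified corn | not corn).
p_class_given_corn = 0.90
p_class_given_not = 0.05

# Bayes' theorem: P(truly corn | classified corn)
num = p_class_given_corn * p_corn_prior
den = num + p_class_given_not * (1 - p_corn_prior)
p_corn_posterior = num / den
print(f"posterior crop probability: {p_corn_posterior:.2f}")   # ~0.89 here
```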

  5. Several comparison result of two types of equilibrium (Pareto Schemes and Stackelberg Scheme) of game theory approach in probabilistic vendor – buyer supply chain system with imperfect quality

    Setiawan, R.

    2018-05-01

    In this paper, the Economic Order Quantity (EOQ) of a vendor-buyer supply-chain model under a probabilistic condition with imperfect quality items is analysed. The analysis is delivered using two concepts from the game theory approach, namely the Stackelberg equilibrium and Pareto optimality, under non-cooperative and cooperative games, respectively. A further result is a comparison of the optimal results between the integrated scheme and the game theory approach, based on analytical and numerical results using appropriate simulation data.

  6. Global approaches to regulating electronic cigarettes

    Kennedy, Ryan David; Awopegba, Ayodeji; De León, Elaine; Cohen, Joanna E

    2016-01-01

    Objectives: Classify and describe the policy approaches used by countries to regulate e-cigarettes. Methods: National policies regulating e-cigarettes were identified by (1) conducting web searches on Ministry of Health websites, and (2) broad web searches. The mechanisms used to regulate e-cigarettes were classified as new/amended laws, or existing laws. The policy domains identified include restrictions or prohibitions on product: sale, manufacturing, importation, distribution, use, product d...

  7. A policy synthesis approach for slowing global warming

    Timilsina, G.R.

    1996-01-01

    Global warming is a burning environmental issue today, but one confronted with subjective as well as policy conflicts. The findings of various studies indicate that the developed countries that are capable of affording effective measures towards global warming mitigation have fewer incentives for doing so, because they will suffer minimal damage from global warming. The developing countries, although they will suffer greater damage, are unlikely to divert their development budgets to preventive actions against global warming. The only solution in this situation is to design a policy that encourages all nations in the world to participate in programs for slowing global warming. Without the active participation of all nations, it seems unlikely that the global warming problem can be reduced in an effective way. This study presents a qualitative policy recommendation extracted from a comprehensive analysis of the findings of several studies conducted so far in this field. The study categorizes the policy approaches for mitigating global warming into three groups: the engineering approach, the forestry approach and the economic approach.

  8. Use of risk quotient and probabilistic approaches to assess risks of pesticides to birds

    When conducting ecological risk assessments for pesticides, the United States Environmental Protection Agency typically relies upon the risk quotient (RQ). This approach is intended to be conservative in nature, making assumptions related to exposure and effects that are intended...

  10. Probabilistic safety assessment and optimal control of hazardous technological systems. A marked point process approach

    Holmberg, J [VTT Automation, Espoo (Finland)

    1997-04-01

    The thesis models risk management as an optimal control problem for a stochastic process. The approach classes the decisions made by management into three categories according to the control methods of a point process: (1) planned process lifetime, (2) modification of the design, and (3) operational decisions. The approach is used for optimization of plant shutdown criteria and surveillance test strategies of a hypothetical nuclear power plant. 62 refs. The thesis also includes five previous publications by the author.

  11. Probabilistic global maps of the CO2 column at daily and monthly scales from sparse satellite measurements

    Chevallier, Frédéric; Broquet, Grégoire; Pierangelo, Clémence; Crisp, David

    2017-07-01

    The column-average dry air-mole fraction of carbon dioxide in the atmosphere (XCO2) is measured by scattered satellite measurements like those from the Orbiting Carbon Observatory (OCO-2). We show that global continuous maps of XCO2 (corresponding to level 3 of the satellite data) at daily or coarser temporal resolution can be inferred from these data with a Kalman filter built on a model of persistence. Our application of this approach on 2 years of OCO-2 retrievals indicates that the filter provides better information than a climatology of XCO2 at both daily and monthly scales. Provided that the assigned observation uncertainty statistics are tuned in each grid cell of the XCO2 maps from an objective method (based on consistency diagnostics), the errors predicted by the filter at daily and monthly scales represent the true error statistics reasonably well, except for a bias in the high latitudes of the winter hemisphere and a lack of resolution (i.e., a too small discrimination skill) of the predicted error standard deviations. Due to the sparse satellite sampling, the broad-scale patterns of XCO2 described by the filter seem to lag behind the real signals by a few weeks. Finally, the filter offers interesting insights into the quality of the retrievals, both in terms of random and systematic errors.
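
    A per-grid-cell sketch of a Kalman filter built on a model of persistence: the forecast step leaves the estimate unchanged while its variance grows by a process-noise term, and sparse soundings update it when available. The noise variances and sampling rate below are illustrative, not the tuned values from the paper.

```python
import numpy as np

rng = np.random.default_rng(5)

# Scalar Kalman filter for one grid cell with a persistence model:
# x_t = x_{t-1} + w_t (process noise variance q), sparsely observed with noise r.
q, r = 0.05 ** 2, 1.0 ** 2        # ppm^2; illustrative variances
x, P = 400.0, 4.0                 # initial XCO2 estimate [ppm] and its variance

truth = 400.0
for day in range(60):
    truth += rng.normal(0, 0.05)                 # slowly drifting true XCO2
    # Prediction (persistence): estimate unchanged, uncertainty grows.
    P += q
    # On ~20% of days a satellite sounding falls in this cell.
    if rng.random() < 0.2:
        y = truth + rng.normal(0, 1.0)           # noisy retrieval
        K = P / (P + r)                          # Kalman gain
        x += K * (y - x)
        P *= (1 - K)

print(f"estimate {x:.2f} ppm, 1-sigma {np.sqrt(P):.2f} ppm (truth {truth:.2f})")
```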

  12. Study of Power Fluctuation from Dispersed Generations and Loads and its Impact on a Distribution Network through a Probabilistic Approach

    Chen, Peiyuan; Chen, Zhe; Bak-Jensen, Birgitte

    2007-01-01

    In order to assess the performance of a distribution system under normal operating conditions with large integration of renewable energy based dispersed generation (DG) units, probabilistic modeling of the distribution system is necessary in order to take into consideration the stochastic behavior of load demands and DG units such as wind generation and combined heat and power plant generation. This paper classifies probabilistic models of load demands and DG units into summer and winter periods, weekday and weekend, as well as over the 24 hours of a day. The voltage results from the probabilistic load flow

  13. Principles or Imagination? Two Approaches to Global Justice

    Coeckelbergh, Mark

    2006-01-01

    In this paper I distinguish and discuss two approaches to global justice. One approach is Rawlsian and Kantian in inspiration. Discussions within this tradition typically focus on the question whether Rawls’s theory of justice (1971), designed for the national level, can or should be applied to the

  14. A probabilistic multi objective CLSC model with Genetic algorithm-ε_Constraint approach

    Alireza TaheriMoghadam

    2014-05-01

    In this paper an uncertain multi-objective closed-loop supply chain is developed. The first objective function is maximizing the total profit. The second objective function is minimizing the use of raw materials; in other words, the second objective function is maximizing the amount of remanufacturing and recycling. A genetic algorithm is used for optimization, and the epsilon-constraint method is used to find the Pareto optimal front. Finally, a numerical example is solved with the proposed approach and the performance of the model is evaluated for different problem sizes. The results show that this approach is effective and useful for managerial decisions.

  15. A Global Public Goods Approach to the Health of Migrants.

    Widdows, Heather; Marway, Herjeet

    2015-07-01

    This paper explores a global public goods approach to the health of migrants. It suggests that this approach establishes that there are a number of health goods which must be provided to migrants not because these are theirs by right (although this may independently be the case), but because these goods are primary goods which fit the threefold criteria of global public goods. There are two key advantages to this approach: first, it is non-confrontational and non-oppositional, and second, it provides self-interested arguments to provide at least some health goods to migrants and thus appeals to those little moved by rights-based arguments.

  16. Global approach of emergency response, reflection analysis

    Velasco Garcia, E.; Garcia Ahumada, F.; Albaladejo Vidal, S.

    1998-01-01

    The emergency response management approach must be dealt with adequately within company strategy, since a badly managed emergency situation can adversely affect a company, not only in terms of assets, but also in terms of the negative impact on its credibility, profitability and image. Thereby, it can be said that there are three main supports to manage the response in an emergency situation: a) diagnosis, b) prognosis, c) communications. To reach these capabilities, coordination of different actions is necessary at the following levels: i. facility operation (local level); ii. facility property (national level); iii. local authority (local level); iv. national authority (national level). Taking all of the above into account, the following functions must be covered: a) management, incorporating the communication, diagnosis and prognosis areas; b) decision, incorporating communication and information means; c) services, in order to facilitate the decision, as well as the execution of this decision; d) analysis, in order to facilitate the situations that make it easier to decide; e) documentation, to seek the information for the analysts and decision makers. (Author)

  17. A probabilistic approach to identify putative drug targets in biochemical networks.

    Murabito, E.; Smalbone, K.; Swinton, J.; Westerhoff, H.V.; Steuer, R.

    2011-01-01

    Network-based drug design holds great promise in clinical research as a way to overcome the limitations of traditional approaches in the development of drugs with high efficacy and low toxicity. This novel strategy aims to study how a biochemical network as a whole, rather than its individual

  18. Potential changes in the extreme climate conditions at the regional scale: from observed data to modelling approaches and towards probabilistic climate change information

    Gachon, P.; Radojevic, M.; Harding, A.; Saad, C.; Nguyen, V.T.V.

    2008-01-01

    The changes in the characteristics of extreme climate conditions are one of the most critical challenges for all ecosystems, human beings and infrastructure in the context of on-going global climate change. However, the extremes information needed for impact studies cannot be obtained directly from coarse-scale global climate models (GCMs), due mainly to their difficulty in incorporating the regional-scale feedbacks and processes responsible in part for the occurrence, intensity and duration of extreme events. Downscaling approaches, namely statistical and dynamical downscaling techniques (i.e. SD and RCM), have emerged as useful tools to develop high-resolution climate change information, in particular for extremes, as they are theoretically more capable of taking into account regional/local forcings and their feedbacks from large-scale influences, being driven with GCM synoptic variables. Nevertheless, in spite of the potential added value of downscaling methods (statistical and dynamical), a rigorous assessment of these methods is needed, as inherent difficulties in simulating extremes are still present. In this paper, different series of RCM and SD simulations using three different GCMs are presented and evaluated with respect to observed values over the current period and over a river basin in southern Quebec, with future ensemble runs centered on the 2050s (i.e. the 2041-2070 period, using the SRES A2 emission scenario). Results suggest that the downscaling performance over the baseline period varies significantly between the two downscaling techniques and over the various seasons, with more regularly reliable simulated values from the SD technique for temperature than from the RCM runs, while both approaches produce quite similar temperature changes in the future in terms of median values, with more divergence for extremes. For precipitation, less accurate information is obtained compared to observed data, and with more differences among models and higher uncertainties in the

  19. Development of Probabilistic and Possibilistic Approaches to Approximate Reasoning and Its Applications

    1989-10-31

    [...] AI (circumscription, non-monotonic reasoning, and default reasoning), our approach is based on fuzzy logic and, more specifically, on the theory of

  20. Microscopic and probabilistic approach to thermal steady state based on a dice and coin toy model

    Onorato, Pasquale; Moggio, Lorenzo; Oss, Stefano; Malgieri, Massimiliano

    2017-01-01

    In this article we present an educational approach to thermal equilibrium which was tested on a group of 13 undergraduate students at the University of Trento. The approach is based on a stochastic toy model, in which bodies in thermal contact are represented by rows of squares on a cardboard table, which exchange coins placed on the squares based on the roll of two dice. The discussion of several physical principles, such as the exponential approach to equilibrium, the determination of the equilibrium temperature, and the interpretation of the equilibrium state as the most probable macrostate, proceeds through a continual comparison between the outcomes obtained with the toy model and the results of a real experiment on the thermal contact of two masses of water at different temperatures. At the end of the sequence, a re-analysis of the experimental results in view of both the Boltzmann and Clausius definitions of entropy reveals some limits of the toy model, but also allows for a critical discussion of the concepts of temperature and entropy. In order to provide the reader with a feeling of how the sequence was received by students, and how it helped them understand the topics introduced, we discuss some excerpts from their answers to a conceptual item given at the end of the sequence. (paper)
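
    The toy model is straightforward to simulate. The sketch below simplifies the dice rolls to uniform random square choices (a donor and a receiver) and tracks the mean number of coins per square in each body as the analogue of temperature; the body sizes and initial coin counts are illustrative.

```python
import random

random.seed(0)

# Two "bodies": rows of squares holding coins (energy quanta). Each step, pick
# a donor and a receiver square at random; one coin moves if the donor has one.
N_A, N_B = 12, 12
coins = [8] * N_A + [2] * N_B            # body A starts "hotter" than body B

def step(coins):
    donor = random.randrange(len(coins))
    receiver = random.randrange(len(coins))
    if coins[donor] > 0:
        coins[donor] -= 1
        coins[receiver] += 1

for t in range(20_000):
    step(coins)
    if t % 5_000 == 0:
        T_A = sum(coins[:N_A]) / N_A     # "temperature" = mean coins per square
        T_B = sum(coins[N_A:]) / N_B
        print(f"t={t:6d}  T_A={T_A:.2f}  T_B={T_B:.2f}")
# The two "temperatures" relax exponentially toward a common equilibrium value.
```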

  1. ECONOMIC SECURITY – NEW APPROACHES IN THE CONTEXT OF GLOBALIZATION

    Gabriel ANDRUSEAC

    2015-08-01

    Nowadays, more than ever, economic relations between states are the ones that define the general character of the relations between them and establish economic security as a concept which cannot be neglected anymore. Globalization, the process that shapes the international environment, undermines the old definition of economic security and forces its redefinition. The article aims to identify and analyse the effects of globalization on economic security and the new approaches it takes in this context.

  2. Station blackout: Deterministic and probabilistic approach in the field of electrical supply losses by EDF

    Meslin, T.; Carnino, A.

    1986-01-01

    This example shows the thoroughness of EDF's approach to the difficult problems posed by the loss of electrical power supplies. Efforts are continuing in several directions: continued revision and improvement of operating procedures in the event of loss of electrical power supplies; PWR plant operator training courses devoted to the problems of power supply losses; and continued testing on simulators, particularly testing under real conditions, including tests lasting several hours made possible by the performance of the new EDF simulators (two-phase code and taking all power losses into account).

  3. Alternative legal and institutional approaches to global change

    Thacher, P.S.

    1991-01-01

    The processes of global change currently under way cannot be dealt with in isolation. Factors linked to environmental quality such as demographic growth, economic interdependence and indebtedness, sociopolitical changes, and others must be managed collectively. In looking at the problems of global change, a central question before us is: how comprehensive should a legal regime be in a world of considerable uncertainty, in which everything is interrelated with everything else, and what we do may or may not have irreversible consequences for future generations? This article focuses on the problem of global warming to provide a model approach to the larger issues of global change. This reduces the scope of global change to a manageable but representative class of the problems at issue. The author suggests an approach to stabilize global climate by the end of the next century. However, even within this relatively narrow context of stabilizing the climate, a comprehensive approach is needed to address all heat-trapping gases - not just CO2 - to ensure that all human activities generating these gases are managed properly, without causing other problems.

  4. Probabilistic, meso-scale flood loss modelling

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2016-04-01

    Flood risk analyses are an important basis for decisions on flood risk management and adaptation. However, such analyses are associated with significant uncertainty, all the more when changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention during the last years, they are still not standard practice for flood risk assessments, and even less so for flood loss modelling. The state of the art in flood loss modelling is still the use of simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood loss models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we demonstrate and evaluate the upscaling of the approach to the meso-scale, namely on the basis of land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany (Botto et al. submitted). The application of bagging decision tree based loss models provides a probability distribution of estimated loss per municipality. Validation is undertaken on the one hand via a comparison with eight deterministic loss models, including stage-damage functions as well as multi-variate models. On the other hand the results are compared with official loss data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of loss estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation approach is that it inherently provides quantitative information about the uncertainty of the prediction. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Botto A, Kreibich H, Merz B, Schröter K (submitted) Probabilistic, multi-variable flood loss modelling on the meso-scale with BT-FLEMO. Risk Analysis.
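
    As a rough illustration of how bagging decision trees turn loss estimation into a distribution rather than a point value, the sketch below trains an ensemble on synthetic predictors and reads a predictive distribution off the individual trees. The data, features, and ensemble size are assumptions for illustration (scikit-learn >= 1.2 is assumed for the `estimator` keyword); this is not the BT-FLEMO model itself.

```python
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)

# Synthetic stand-ins for loss predictors (e.g., water depth, duration,
# precaution score) and relative building loss -- illustrative only.
X = rng.uniform(0, 1, size=(500, 3))
y = 0.5 * X[:, 0] + 0.2 * X[:, 1] * X[:, 2] + rng.normal(0, 0.05, 500)

model = BaggingRegressor(estimator=DecisionTreeRegressor(),
                         n_estimators=100, random_state=0).fit(X, y)

# Instead of one deterministic estimate, query every tree in the ensemble:
# the spread of per-tree predictions quantifies the loss-estimate uncertainty.
x_new = np.array([[0.8, 0.3, 0.6]])
per_tree = np.array([t.predict(x_new)[0] for t in model.estimators_])
print(f"mean loss {per_tree.mean():.3f}, "
      f"90% interval [{np.percentile(per_tree, 5):.3f}, "
      f"{np.percentile(per_tree, 95):.3f}]")
```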

  5. Estimation of macroscopic elastic characteristics for hierarchical anisotropic solids based on probabilistic approach

    Smolina, Irina Yu.

    2015-10-01

    Mechanical properties of a cable are of great importance in the design and strength calculation of flexible cables. The problem of determining the elastic properties and rigidity characteristics of a cable modeled as an anisotropic helical elastic rod is considered. These characteristics are calculated indirectly from parameters obtained by statistical processing of experimental data; these parameters are treated as random quantities. Taking the probabilistic nature of these parameters into account, formulas for estimating the macroscopic elastic moduli of a cable are obtained. Calculating expressions for the macroscopic flexural rigidity, shear rigidity, and torsional rigidity in terms of the previously obtained macroscopic elastic characteristics are presented. Statistical estimates of the rigidity characteristics of some cable grades are given, together with a comparison against the characteristics obtained with a deterministic approach.

  6. A probabilistic approach using deformable organ models for automatic definition of normal anatomical structures for 3D treatment planning

    Fritsch, Daniel; Yu Liyun; Johnson, Valen; McAuliffe, Matthew; Pizer, Stephen; Chaney, Edward

    1996-01-01

    Purpose/Objective: Current clinical methods for defining normal anatomical structures on tomographic images are time consuming and subject to intra- and inter-user variability. With the widespread implementation of 3D RTP, conformal radiotherapy, and dose escalation, the implications of imprecise object definition have assumed a much higher level of importance. Object definition and volume-weighted metrics for normal anatomy, such as DVHs and NTCPs, play critical roles in aiming, shaping, and weighting beams. Improvements in object definition, including computer automation, are essential to yield reliable volume-weighted metrics and gains in human efficiency. The purpose of this study was to investigate a probabilistic approach using deformable models to automatically recognize and extract normal anatomy in tomographic images. Materials and Methods: Object models were created from normal organs that were segmented by an interactive method which involved placing a cursor near the center of the object on a slice and clicking a mouse button to initiate computation of structures called cores. Cores describe the skeletal and boundary shape of image objects in a manner that, in 2D, associates a location on the skeleton with the width of the object at that location. A significant advantage of cores is stability against image disturbances such as noise and blur. The model was composed of a relatively small set of extracted points on the skeleton and boundary. The points were carefully chosen to summarize the shape information captured by the cores. Neighborhood relationships between points were represented mathematically by energy functions that penalize, due to warping of the model, the 'goodness' of match between the model and the image data at any stage during the segmentation process. The model was matched against the image data using a probabilistic approach based on Bayes theorem, which provides a means for computing a posteriori (posterior) probability from 1) a

  7. Probabilistic conditional independence structures

    Studeny, Milan

    2005-01-01

    Probabilistic Conditional Independence Structures provides the mathematical description of probabilistic conditional independence structures; the author uses non-graphical methods of their description, and takes an algebraic approach. The monograph presents the methods of structural imsets and supermodular functions, and deals with independence implication and equivalence of structural imsets. Motivation, mathematical foundations and areas of application are included, and a rough overview of graphical methods is also given. In particular, the author has been careful to use suitable terminology, and presents the work so that it will be understood by both statisticians and researchers in artificial intelligence. The necessary elementary mathematical notions are recalled in an appendix.

  8. Mixing Carrots and Sticks to Conserve Forests in the Brazilian Amazon: A Spatial Probabilistic Modeling Approach

    Börner, Jan; Marinho, Eduardo; Wunder, Sven

    2015-01-01

    Annual forest loss in the Brazilian Amazon had declined by 2012 to less than 5,000 sqkm, from over 27,000 in 2004. Mounting empirical evidence suggests that changes in Brazilian law enforcement strategy and the related governance system may account for a large share of the overall success in curbing deforestation rates. At the same time, Brazil is experimenting with alternative approaches to compensate farmers for conservation actions through economic incentives, such as payments for environmental services, at various administrative levels. We develop a spatially explicit simulation model for deforestation decisions in response to policy incentives and disincentives. The model builds on elements of optimal enforcement theory and introduces the notion of imperfect payment contract enforcement in the context of avoided deforestation. We implement the simulations using official deforestation statistics and data collected from field-based forest law enforcement operations in the Amazon region. We show that a large-scale integration of payments with the existing regulatory enforcement strategy involves a tradeoff between the cost-effectiveness of forest conservation and landholder incomes. Introducing payments as a complementary policy measure increases policy implementation cost, reduces income losses for those hit hardest by law enforcement, and can provide additional income to some land users. The magnitude of the tradeoff varies in space, depending on deforestation patterns, conservation opportunity and enforcement costs. Enforcement effectiveness becomes a key determinant of efficiency in the overall policy mix. PMID:25650966

  9. Mixing carrots and sticks to conserve forests in the Brazilian Amazon: a spatial probabilistic modeling approach.

    Börner, Jan; Marinho, Eduardo; Wunder, Sven

    2015-01-01

    Annual forest loss in the Brazilian Amazon had declined by 2012 to less than 5,000 sqkm, from over 27,000 in 2004. Mounting empirical evidence suggests that changes in Brazilian law enforcement strategy and the related governance system may account for a large share of the overall success in curbing deforestation rates. At the same time, Brazil is experimenting with alternative approaches to compensate farmers for conservation actions through economic incentives, such as payments for environmental services, at various administrative levels. We develop a spatially explicit simulation model for deforestation decisions in response to policy incentives and disincentives. The model builds on elements of optimal enforcement theory and introduces the notion of imperfect payment contract enforcement in the context of avoided deforestation. We implement the simulations using official deforestation statistics and data collected from field-based forest law enforcement operations in the Amazon region. We show that a large-scale integration of payments with the existing regulatory enforcement strategy involves a tradeoff between the cost-effectiveness of forest conservation and landholder incomes. Introducing payments as a complementary policy measure increases policy implementation cost, reduces income losses for those hit hardest by law enforcement, and can provide additional income to some land users. The magnitude of the tradeoff varies in space, depending on deforestation patterns, conservation opportunity and enforcement costs. Enforcement effectiveness becomes a key determinant of efficiency in the overall policy mix.

  10. ChromaSig: a probabilistic approach to finding common chromatin signatures in the human genome.

    Gary Hon

    2008-10-01

    Computational methods to identify functional genomic elements using genetic information have been very successful in determining gene structure and in identifying a handful of cis-regulatory elements. But the vast majority of regulatory elements have yet to be discovered, and it has become increasingly apparent that their discovery will not come from using genetic information alone. Recently, high-throughput technologies have enabled the creation of information-rich epigenetic maps, most notably for histone modifications. However, tools that search for functional elements using this epigenetic information have been lacking. Here, we describe an unsupervised learning method called ChromaSig to find, in an unbiased fashion, commonly occurring chromatin signatures in both tiling microarray and sequencing data. Applying this algorithm to nine chromatin marks across a 1% sampling of the human genome in HeLa cells, we recover eight clusters of distinct chromatin signatures, five of which correspond to known patterns associated with transcriptional promoters and enhancers. Interestingly, we observe that the distinct chromatin signatures found at enhancers mark distinct functional classes of enhancers in terms of transcription factor and coactivator binding. In addition, we identify three clusters of novel chromatin signatures that contain evolutionarily conserved sequences and potential cis-regulatory elements. Applying ChromaSig to a panel of 21 chromatin marks mapped genomewide by ChIP-Seq reveals 16 classes of genomic elements marked by distinct chromatin signatures. Interestingly, four classes containing enrichment for repressive histone modifications appear to be locally heterochromatic sites and are enriched in quickly evolving regions of the genome. The utility of this approach in uncovering novel, functionally significant genomic elements will aid future efforts of genome annotation via chromatin modifications.

  11. Probabilistic approach to rock fall hazard assessment: potential of historical data analysis

    C. Dussauge-Peisser

    2002-01-01

    We study the rock fall volume distribution for three rock fall inventories and we fit the observed data by a power-law distribution, which has recently been proposed to describe landslide and rock fall volume distributions, and is also observed for many other natural phenomena, such as volcanic eruptions or earthquakes. We use these statistical distributions of past events to estimate rock fall occurrence rates in the studied areas. This is an alternative to deterministic approaches, which have not proved successful in predicting individual rock falls. The first inventory concerns calcareous cliffs around Grenoble, French Alps, from 1935 to 1995. The second data set was gathered during the 1912–1992 time window in Yosemite Valley, USA, in granite cliffs. The third one covers the 1954–1976 period in the Arly gorges, French Alps, with metamorphic and sedimentary rocks. For the three data sets, we find a good agreement between the observed volume distributions and a fit by a power-law distribution for volumes larger than 50 m3, or 20 m3 for the Arly gorges. We obtain similar values of the b exponent, close to 0.45, for the 3 data sets. In agreement with previous studies, this suggests that the b value does not depend on the geological setting. Regarding the rate of rock fall activity, determined as the number of rock fall events with volume larger than 1 m3 per year, we find a large variability from one site to the other. The rock fall activity, as part of a local erosion rate, is thus spatially dependent. We discuss the implications of these observations for rock fall hazard evaluation. First, assuming that the volume distributions are temporally stable, a complete rock fall inventory allows for the prediction of recurrence rates for future events of a given volume in the range of the observed historical data. Second, assuming that the observed volume distribution follows a power-law distribution without cutoff at small or large scales, we can
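
    A minimal sketch of the central computation, under stated assumptions: the volumes are synthetic, the completeness cutoff mirrors the 50 m3 threshold above, and the exponent of the cumulative distribution N(>v) ~ v^(-b) is estimated by maximum likelihood before extrapolating the annual rate of events larger than 1 m3, as the abstract discusses.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic rock fall volumes (m^3) drawn from a power law above a 50 m^3
# cutoff -- stand-ins for an inventory like those described above.
b_true, v_min = 0.45, 50.0
volumes = v_min * rng.pareto(b_true, size=200) + v_min   # heavy-tailed sample
years = 60.0                                             # observation window

# Hill / maximum-likelihood estimate of the exponent b of the cumulative
# distribution N(>v) ~ v^(-b) above the completeness cutoff.
v = volumes[volumes >= v_min]
b_hat = len(v) / np.log(v / v_min).sum()

# Assuming the power law extends down to 1 m^3 (the extrapolation the
# abstract discusses), the annual rate of events above 1 m^3 follows from
# N(>v) = N(>v_min) * (v / v_min)^(-b).
rate_vmin = len(v) / years
rate_1m3 = rate_vmin * (1.0 / v_min) ** (-b_hat)
print(f"b = {b_hat:.2f}, events > 1 m^3 per year ~ {rate_1m3:.1f}")
```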

  12. Probabilistic approach to rock fall hazard assessment: potential of historical data analysis

    Dussauge-Peisser, C.; Helmstetter, A.; Grasso, J.-R.; Hantz, D.; Desvarreux, P.; Jeannin, M.; Giraud, A.

    We study the rock fall volume distribution for three rock fall inventories and we fit the observed data by a power-law distribution, which has recently been proposed to describe landslide and rock fall volume distributions, and is also observed for many other natural phenomena, such as volcanic eruptions or earthquakes. We use these statistical distributions of past events to estimate rock fall occurrence rates in the studied areas. This is an alternative to deterministic approaches, which have not proved successful in predicting individual rock falls. The first inventory concerns calcareous cliffs around Grenoble, French Alps, from 1935 to 1995. The second data set was gathered during the 1912-1992 time window in Yosemite Valley, USA, in granite cliffs. The third one covers the 1954-1976 period in the Arly gorges, French Alps, with metamorphic and sedimentary rocks. For the three data sets, we find a good agreement between the observed volume distributions and a fit by a power-law distribution for volumes larger than 50 m3, or 20 m3 for the Arly gorges. We obtain similar values of the b exponent, close to 0.45, for the 3 data sets. In agreement with previous studies, this suggests that the b value does not depend on the geological setting. Regarding the rate of rock fall activity, determined as the number of rock fall events with volume larger than 1 m3 per year, we find a large variability from one site to the other. The rock fall activity, as part of a local erosion rate, is thus spatially dependent. We discuss the implications of these observations for rock fall hazard evaluation. First, assuming that the volume distributions are temporally stable, a complete rock fall inventory allows for the prediction of recurrence rates for future events of a given volume in the range of the observed historical data. Second, assuming that the observed volume distribution follows a power-law distribution without cutoff at small or large scales, we can extrapolate these

  13. Probabilistic Structural Analysis of SSME Turbopump Blades: Probabilistic Geometry Effects

    Nagpal, V. K.

    1985-01-01

    A probabilistic study was initiated to evaluate the effects of geometric and material property tolerances on the structural response of turbopump blades. To complete this study, a number of important probabilistic variables were identified which are considered to affect the structural response of the blade. In addition, a methodology was developed to statistically quantify the influence of these probabilistic variables in an optimized way. The identified variables include random geometric and material property perturbations, different loadings, and a probabilistic combination of these loadings. Influences of these probabilistic variables are planned to be quantified by evaluating the blade structural response. Studies of the geometric perturbations were conducted for a flat plate geometry as well as for a Space Shuttle Main Engine blade geometry using a special purpose code based on the finite element approach. Analyses indicate that the variances of the perturbations about given mean values have significant influence on the response.

  14. Naples between two fires: eruptive scenarios for the next eruptions by an integrated volcanological-probabilistic approach.

    Mastrolorenzo, G.; Pappalardo, L.; de Natale, G.; Troise, C.; Rossano, S.; Panizza, A.

    2009-04-01

    Probabilistic approaches based on available volcanological data from real eruptions of Campi Flegrei and Somma-Vesuvius are assembled in a comprehensive assessment of volcanic hazards for the Neapolitan area. This allows comparison of the volcanic hazards related to the different types of events, which can be used for evaluating the conditional probability of flow and fall hazards in case of a volcanic crisis. Hazard maps are presented, based on a rather complete set of numerical simulations, produced using field and laboratory data as input parameters for a large range (VEI 1 to 5) of fallout and pyroclastic-flow events and their relative occurrence. The results allow us to quantitatively evaluate and compare the hazard related to pyroclastic fallout and pyroclastic density currents (PDCs) at the Neapolitan volcanoes and their surroundings, including the city of Naples. Due to its position between the two volcanic areas, the city of Naples is particularly exposed to volcanic risk from VEI>2 eruptions, as recorded in the local volcanic succession. Because of dominant wind directions, the area of Naples is particularly prone to fallout hazard from Campi Flegrei caldera eruptions in the VEI range 2-5. The hazard from PDCs decreases roughly radially with distance from the eruptive vents and is strongly controlled by the topographic heights. Campi Flegrei eruptions are particularly hazardous for Naples, although the Camaldoli and Posillipo hills form an effective barrier against propagation to the very central part of Naples. PDCs from Vesuvius eruptions with VEI>4 can cover the city of Naples, whereas even VEI>3 eruptions pose a moderate fallout hazard there.

  15. A fuzzy-based reliability approach to evaluate basic events of fault tree analysis for nuclear power plant probabilistic safety assessment

    Purba, Julwan Hendry

    2014-01-01

    Highlights: • We propose a fuzzy-based reliability approach to evaluate basic event reliabilities. • It implements the concepts of failure possibilities and fuzzy sets. • Experts evaluate basic event failure possibilities using qualitative words. • Triangular fuzzy numbers mathematically represent qualitative failure possibilities. • It is a very good alternative to the conventional reliability approach. - Abstract: Fault tree analysis has been widely utilized as a tool for nuclear power plant probabilistic safety assessment. This analysis can be completed only if all basic events of the system fault tree have their quantitative failure rates or failure probabilities. However, it is difficult to obtain those failure data due to insufficient data, changing environments, or new components. This study proposes a fuzzy-based reliability approach to evaluate basic events of system fault trees whose precise probability distributions of lifetime to failure are not available. It applies the concept of failure possibilities to qualitatively evaluate basic events and the concept of fuzzy sets to quantitatively represent the corresponding failure possibilities. To demonstrate the feasibility and the effectiveness of the proposed approach, the actual basic event failure probabilities collected from the operational experiences of the Davis–Besse design of the Babcock and Wilcox reactor protection system fault tree are used to benchmark the failure probabilities generated by the proposed approach. The results confirm that the proposed fuzzy-based reliability approach arises as a suitable alternative for the conventional probabilistic reliability approach when basic events do not have the corresponding quantitative historical failure data for determining their reliability characteristics. Hence, it overcomes the limitation of the conventional fault tree analysis for nuclear power plant probabilistic safety assessment.
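
    The sketch below illustrates the general pipeline such a fuzzy-based approach implies: linguistic judgements are mapped to triangular fuzzy numbers, defuzzified to a failure possibility score, and converted to a failure probability. The linguistic scale and the Onisawa-type possibility-to-probability transformation are common choices in the fuzzy fault tree literature and are assumptions here, not necessarily the paper's exact values.

```python
# A minimal sketch of the fuzzy-possibility route to a failure probability.
# Scale values and the conversion formula below are assumptions.

def centroid(tfn):
    """Centroid defuzzification of a triangular fuzzy number (a, m, b)."""
    a, m, b = tfn
    return (a + m + b) / 3.0

def possibility_to_probability(fps):
    """Onisawa-type transformation from a failure possibility score (FPS)
    to a failure probability (assumed conversion)."""
    if fps <= 0:
        return 0.0
    k = ((1.0 - fps) / fps) ** (1.0 / 3.0) * 2.301
    return 10.0 ** (-k)

# Qualitative expert judgements mapped to triangular fuzzy numbers on [0, 1].
scale = {
    "very low": (0.0, 0.1, 0.2),
    "low":      (0.1, 0.25, 0.4),
    "medium":   (0.3, 0.5, 0.7),
    "high":     (0.6, 0.75, 0.9),
}

for word, tfn in scale.items():
    fps = centroid(tfn)
    print(f"{word:9s} -> possibility {fps:.3f} -> probability "
          f"{possibility_to_probability(fps):.2e}")
```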

  16. Probabilistic linguistics

    Bod, R.; Heine, B.; Narrog, H.

    2010-01-01

    Probabilistic linguistics takes all linguistic evidence as positive evidence and lets statistics decide. It allows for accurate modelling of gradient phenomena in production and perception, and suggests that rule-like behaviour is no more than a side effect of maximizing probability. This chapter

  17. Probabilistic Design

    Sørensen, John Dalsgaard; Burcharth, H. F.

    This chapter describes how partial safety factors can be used in design of vertical wall breakwaters and an example of a code format is presented. The partial safety factors are calibrated on a probabilistic basis. The code calibration process used to calibrate some of the partial safety factors...

  18. Solving Unconstrained Global Optimization Problems via Hybrid Swarm Intelligence Approaches

    Jui-Yu Wu

    2013-01-01

    Stochastic global optimization (SGO) algorithms such as the particle swarm optimization (PSO) approach have become popular for solving unconstrained global optimization (UGO) problems. The PSO approach, which belongs to the swarm intelligence domain, does not require gradient information, enabling it to overcome this limitation of traditional nonlinear programming methods. Unfortunately, PSO algorithm implementation and performance depend on several parameters, such as the cognitive parameter, social parameter, and constriction coefficient, which are tuned by trial and error. To reduce the parametrization of a PSO method, this work presents two efficient hybrid SGO approaches, namely, a real-coded genetic algorithm-based PSO (RGA-PSO) method and an artificial immune algorithm-based PSO (AIA-PSO) method. The specific parameters of the internal PSO algorithm are optimized using the external RGA and AIA approaches, and then the internal PSO algorithm is applied to solve UGO problems. The performances of the proposed RGA-PSO and AIA-PSO algorithms are then evaluated using a set of benchmark UGO problems. Numerical results indicate that, besides their ability to converge to a global minimum for each test UGO problem, the proposed RGA-PSO and AIA-PSO algorithms outperform many hybrid SGO algorithms. Thus, the RGA-PSO and AIA-PSO approaches can be considered alternative SGO approaches for solving standard-dimensional UGO problems.
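
    For readers unfamiliar with the internal PSO step being tuned, the sketch below implements a bare constriction-coefficient PSO on a benchmark sphere function. The parameter values c1, c2, and chi are exactly the kind of settings an outer RGA or AIA would optimize in the hybrid methods; fixing them here is an illustrative simplification.

```python
import numpy as np

rng = np.random.default_rng(42)

def sphere(x):
    """Benchmark UGO objective: global minimum 0 at the origin."""
    return np.sum(x * x, axis=-1)

# The three parameters the abstract mentions; in RGA-PSO / AIA-PSO these
# would be tuned by an outer genetic or immune algorithm rather than fixed.
c1, c2, chi = 2.05, 2.05, 0.729   # cognitive, social, constriction

n, dim, iters = 30, 5, 200
pos = rng.uniform(-5, 5, (n, dim))
vel = np.zeros((n, dim))
pbest = pos.copy()
pbest_val = sphere(pos)
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    vel = chi * (vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos))
    pos += vel
    val = sphere(pos)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("best value found:", pbest_val.min())
```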

  19. Flood risk and adaptation strategies under climate change and urban expansion: A probabilistic analysis using global data

    Muis, S.; Güneralp, B.; Jongman, B.; Aerts, J.C.J.H.; Ward, P.J.

    2015-01-01

    An accurate understanding of flood risk and its drivers is crucial for effective risk management. Detailed risk projections, including uncertainties, are however rarely available, particularly in developing countries. This paper presents a method that integrates recent advances in global-scale

  20. Probabilistic Structural Analysis Theory Development

    Burnside, O. H.

    1985-01-01

    The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity is concentrating on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and Space Shuttle Main Engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic analysis of stochastic structures. It is the goal of the approximate methods effort to increase problem solving efficiency relative to finite element methods by using energy methods to generate trial solutions which satisfy the structural boundary conditions. These approximate methods will be less computer intensive relative to the finite element approach.

  1. An integrated artificial neural networks approach for predicting global radiation

    Azadeh, A.; Maghsoudi, A.; Sohrabkhani, S.

    2009-01-01

    This article presents an integrated artificial neural network (ANN) approach for predicting solar global radiation from climatological variables. The integrated ANN trains and tests data with the multi layer perceptron (MLP) approach which has the lowest mean absolute percentage error (MAPE). The proposed approach is particularly useful for locations where no measurement equipment is available. Also, it considers all related climatological and meteorological parameters as input variables. To show the applicability and superiority of the integrated ANN approach, monthly data were collected for 6 years (1995-2000) in six nominal cities in Iran. A separate model for each city is considered and the quantity of solar global radiation in each city is calculated. Furthermore, an integrated ANN model has been introduced for prediction of solar global radiation. The acquired results of the integrated model have shown high accuracy of about 94%. The results of the integrated model have been compared with the traditional Angstrom model to show its considerable accuracy. Therefore, the proposed approach can be used as an efficient tool for prediction of solar radiation in remote and rural locations with no direct measurement equipment.
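
    A compact sketch of the modelling step follows, with synthetic stand-ins for the climatological inputs: an MLP regressor is trained and scored by MAPE, the criterion the article uses for model selection. The network size and data are assumptions for illustration, not the study's measured Iranian data.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)

# Synthetic climatological inputs (e.g., sunshine hours, temperature,
# humidity) and monthly global radiation -- illustrative stand-ins only.
X = rng.uniform(0, 1, size=(300, 3))
y = 15 + 10 * X[:, 0] + 3 * X[:, 1] - 2 * X[:, 2] + rng.normal(0, 0.5, 300)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
mlp = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000,
                   random_state=0).fit(X_tr, y_tr)

# Model selection in the article is driven by MAPE; a MAPE around 6% would
# correspond to the reported ~94% accuracy.
mape = np.mean(np.abs((y_te - mlp.predict(X_te)) / y_te)) * 100
print(f"MAPE = {mape:.1f}%")
```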

  2. Economic order quantity (EOQ) by game theory approach in probabilistic supply chain system under service level constraint for items with imperfect quality

    Setiawan, R.

    2018-03-01

    In this paper, the economic order quantity (EOQ) of a probabilistic two-level supply-chain system for items with imperfect quality has been analyzed under a service level constraint. A firm applies an active service level constraint to avoid unpredictable shortage terms in the objective function. Mathematical analysis of the optimal result is delivered using two equilibrium concepts from game theory: Stackelberg equilibrium for a cooperative strategy and Stackelberg equilibrium for a noncooperative strategy. This is a new application of game-theoretic results to inventory systems in which a service level constraint is applied by the firm.
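
    The abstract does not reproduce its formulas, so the sketch below shows only the classical deterministic EOQ baseline, Q* = sqrt(2DS/h), on which such models build; the probabilistic demand, imperfect-quality screening, service level constraint, and Stackelberg layers of the paper are not reproduced here.

```python
import math

# Classical EOQ baseline. The probabilistic, imperfect-quality model in the
# paper adds defective fractions, screening, a service level constraint, and
# game-theoretic coordination on top of this; those are omitted here.
def eoq(demand_per_year, order_cost, holding_cost_per_unit_year):
    """Economic order quantity minimizing ordering plus holding cost."""
    return math.sqrt(2 * demand_per_year * order_cost
                     / holding_cost_per_unit_year)

# Illustrative numbers (assumptions).
D, S, h = 12000, 50.0, 2.5
q = eoq(D, S, h)
total_cost = D / q * S + q / 2 * h   # ordering + cycle-stock holding cost
print(f"EOQ = {q:.0f} units, relevant cost = {total_cost:.0f} per year")
```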

  3. Examining the global health arena: strengths and weaknesses of a convention approach to global health challenges.

    Haffeld, Just Balstad; Siem, Harald; Røttingen, John-Arne

    2010-01-01

    The article comprises a conceptual framework to analyze the strengths and weaknesses of a global health convention. The analyses are inspired by Lawrence Gostin's suggested Framework Convention on Global Health. The analytical model takes as its starting point events tentatively following a logical sequence: Input (global health funding), Processes (coordination, cooperation, accountability, allocation of aid), Output (definition of basic survival needs), Outcome (access to health services), and Impact (health for all). It then examines to what degree binding international regulations can create order in such a sequence of events. We conclude that a global health convention could be an appropriate instrument to deal with some of the problems of global health. We also show that some of the tasks preceding a convention approach might be to muster international support for supra-national health regulations, negotiate compromises between existing stakeholders in the global health arena, and to utilize WHO as a platform for further discussions on a global health convention.

  4. Global Sectoral Industry Approaches to Climate Change. The Way Forward

    Stigson, B.; Egenhofer, C.; Fujiwara, N.

    2008-01-01

    The structure of some industrial sectors is so highly concentrated that just a handful of companies are responsible for producing a significant share of that sector's total greenhouse gases emissions worldwide. These sectors are thus a 'natural' focus of policy-makers concerned with climate change and have attracted keen interest from the EU. So-called 'sectoral approaches' are seen as having the potential to broaden the range of contributions by all parties, including emerging economies, to greenhouse gas emissions reductions, and to help moderate competitiveness concerns in trade-exposed industries. In particular, such approaches may help to quantify emissions on a sector-by-sector basis, building confidence that policies and measures can be put in place to reduce emissions. They can also help identify national or global commitments through the aggregation of sectoral data. While sectoral approaches allow policy-makers to concentrate on those individual sectors that contribute significantly to global emissions, they also pose a number of challenges. This CEPS Task Force Report identifies the principal issues associated with sectoral approaches - their rationale and the associated political dynamics - and gives an overview of existing approaches, the formulation of preconditions that would allow sectoral approaches to be implemented and an analysis of the potential interaction of sectoral approaches with existing climate change policies. The concluding chapter sketches a possible way forward

  5. Hazardous waste transportation risk assessment: Benefits of a combined deterministic and probabilistic Monte Carlo approach in expressing risk uncertainty

    Policastro, A.J.; Lazaro, M.A.; Cowen, M.A.; Hartmann, H.M.; Dunn, W.E.; Brown, D.F.

    1995-01-01

    This paper presents a combined deterministic and probabilistic methodology for modeling hazardous waste transportation risk and expressing the uncertainty in that risk. Both the deterministic and probabilistic methodologies are aimed at providing tools useful in the evaluation of alternative management scenarios for US Department of Energy (DOE) hazardous waste treatment, storage, and disposal (TSD). The probabilistic methodology can be used to provide perspective on and quantify uncertainties in deterministic predictions. The methodology developed has been applied to 63 DOE shipments made in fiscal year 1992, which contained poison by inhalation chemicals that represent an inhalation risk to the public. Models have been applied to simulate shipment routes, truck accident rates, chemical spill probabilities, spill/release rates, dispersion, population exposure, and health consequences. The simulation presented in this paper is specific to trucks traveling from DOE sites to their commercial TSD facilities, but the methodology is more general. Health consequences are presented as the number of people with potentially life-threatening health effects. Probabilistic distributions were developed (based on actual item data) for accident release amounts, time of day and season of the accident, and meteorological conditions
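
    A toy version of the probabilistic chain the methodology samples (accident on route, then spill, then release, then exposure) is sketched below; every distribution and parameter value is an illustrative assumption, not the study's data, but the structure shows how Monte Carlo sampling turns a deterministic point estimate into a consequence distribution.

```python
import numpy as np

rng = np.random.default_rng(3)

# Minimal Monte Carlo sketch of the event chain described above. All
# distributions and parameter values are illustrative assumptions.
n = 100_000
route_km = 1200.0
accident_rate = rng.lognormal(np.log(3e-7), 0.5, n)   # accidents per km
p_spill = rng.beta(2, 8, n)                           # P(spill | accident)
release_frac = rng.beta(1.5, 5, n)                    # fraction of cargo released
exposed = rng.lognormal(np.log(50), 1.0, n)           # people exposed | release

# Toy consequence measure: expected number of people with potentially
# life-threatening effects per shipment, per Monte Carlo realization.
consequences = route_km * accident_rate * p_spill * release_frac * exposed
print(f"median {np.median(consequences):.2e}, "
      f"95th percentile {np.percentile(consequences, 95):.2e}")
```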

  6. The biopsychosocial approach and global mental health: Synergies and opportunities

    Emmanuel Babalola

    2017-01-01

    The biopsychosocial (BPS) approach proposed by Engel four decades ago was regarded as one of the most important developments in medicine and psychiatry in the late 20th century. Unlike the biomedical model, the BPS approach posits that biological, psychological, and social factors play a significant role in disease causation and treatment. This approach brought about a new way of conceptualizing mental health difficulties and engendered changes within research, medical teaching and practice. Global mental health (GMH) is a relatively new area of study and practice that seeks to bridge inequities and inequality in mental healthcare service provision for people worldwide. The significance of the BPS approach for understanding mental health difficulties is being debated in the context of GMH initiatives. This paper critically evaluates the strengths and weaknesses of the BPS approach to mental health difficulties and explores its relevance to GMH initiatives.

  7. Probabilistic 21st and 22nd Century Sea-Level Projections at a Global Network of Tide-Gauge Sites

    Kopp, Robert E.; Horton, Radley M.; Little, Christopher M.; Mitrovica, Jerry X.; Oppenheimer, Michael; Rasmussen, D. J.; Strauss, Benjamin H.; Tebaldi, Claudia

    2014-01-01

    Sea-level rise due to both climate change and non-climatic factors threatens coastal settlements, infrastructure, and ecosystems. Projections of mean global sea-level (GSL) rise provide insufficient information to plan adaptive responses; local decisions require local projections that accommodate different risk tolerances and time frames and that can be linked to storm surge projections. Here we present a global set of local sea-level (LSL) projections to inform decisions on timescales ranging from the coming decades through the 22nd century. We provide complete probability distributions, informed by a combination of expert community assessment, expert elicitation, and process modeling. Between the years 2000 and 2100, we project a very likely (90% probability) GSL rise of 0.5–1.2 m under representative concentration pathway (RCP) 8.5, 0.4–0.9 m under RCP 4.5, and 0.3–0.8 m under RCP 2.6. Site-to-site differences in LSL projections are due to varying non-climatic background uplift or subsidence, oceanographic effects, and spatially variable responses of the geoid and the lithosphere to shrinking land ice. The Antarctic ice sheet (AIS) constitutes a growing share of variance in GSL and LSL projections. In the global average and at many locations, it is the dominant source of variance in late 21st century projections, though at some sites oceanographic processes contribute the largest share throughout the century. LSL rise dramatically reshapes flood risk, greatly increasing the expected number of “1-in-10” and “1-in-100” year events.
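
    Reading a "very likely" range off a probabilistic projection is a simple percentile computation, as the sketch below shows on a toy sample; the distribution parameters are invented stand-ins, not the paper's expert-informed projections.

```python
import numpy as np

rng = np.random.default_rng(11)

# Toy ensemble of year-2100 GSL-rise samples (metres) under one scenario,
# a stand-in for the paper's expert-informed probability distribution.
samples = rng.normal(loc=0.79, scale=0.21, size=100_000)

# "Very likely" above means the central 90% probability interval, i.e. the
# 5th-95th percentile range of the projection distribution.
lo, hi = np.percentile(samples, [5, 95])
print(f"very likely range: {lo:.2f}-{hi:.2f} m")
```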

  8. Global GPS Ionospheric Modelling Using Spherical Harmonic Expansion Approach

    Byung-Kyu Choi

    2010-12-01

    In this study, we developed a global ionosphere model based on measurements from a worldwide network of the global positioning system (GPS). The total number of international GPS reference stations used for development of the ionospheric model is about 100, and the spherical harmonic expansion approach was used as the mathematical method. In order to produce the ionospheric total electron content (TEC) in grid form, we defined spatial resolutions of 2.0 degrees and 5.0 degrees in latitude and longitude, respectively. Two-dimensional TEC maps were constructed at intervals of one hour, and have a high temporal resolution compared to the global ionosphere maps produced by several analysis centers. As a result, we could detect the sudden increase of TEC by processing GPS observables on 29 October 2003, when a massive solar flare took place.
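
    A minimal sketch of fitting spherical harmonic coefficients to TEC observations by least squares is given below; the synthetic data, maximum degree, and sampling are assumptions, and real processing would start from slant-to-vertical mapped GPS measurements rather than the toy field used here.

```python
import numpy as np
from scipy.special import sph_harm

rng = np.random.default_rng(5)

# Synthetic vertical TEC observations at scattered lat/lon points -- a
# stand-in for GPS-derived TEC; degree limit and geometry are assumptions.
lat = rng.uniform(-87.5, 87.5, 400)
lon = rng.uniform(-180, 180, 400)
tec = 20 + 15 * np.cos(np.radians(lat)) + rng.normal(0, 1, 400)

theta = np.radians(lon + 180)     # azimuth in [0, 2*pi]
phi = np.radians(90 - lat)        # colatitude in [0, pi]

# Real-valued spherical harmonic design matrix up to degree N.
N = 4
cols = []
for n in range(N + 1):
    for m in range(n + 1):
        y = sph_harm(m, n, theta, phi)
        cols.append(y.real)
        if m > 0:
            cols.append(y.imag)
A = np.column_stack(cols)

# Least-squares estimate of the (N+1)^2 coefficients.
coeffs, *_ = np.linalg.lstsq(A, tec, rcond=None)
rms = float(np.sqrt(np.mean((A @ coeffs - tec) ** 2)))
print(f"fitted {len(coeffs)} coefficients; rms residual = {rms:.2f} TECU")
```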

  9. Detection of crack-like indications in digital radiography by global optimisation of a probabilistic estimation function

    Alekseychuk, O.

    2006-07-01

    A new algorithm for detection of longitudinal crack-like indications in radiographic images is developed in this work. Conventional local detection techniques give unsatisfactory results for this task due to the low signal to noise ratio (SNR ∝ 1) of crack-like indications in radiographic images. The usage of global features of crack-like indications provides the necessary noise resistance, but this is connected with prohibitive computational complexities of detection and difficulties in a formal description of the indication shape. Conventionally, the excessive computational complexity of the solution is reduced by the usage of heuristics. The heuristics to be used are selected on a trial-and-error basis, are problem-dependent, and do not guarantee the optimal solution. A distinctive feature of the algorithm developed here is that it does not follow this route. Instead, a global characteristic of a crack-like indication (the estimation function) is used, whose maximum in the space of all possible positions, lengths and shapes can be found exactly, i.e. without any heuristics. The proposed estimation function is defined as a sum of a posteriori information gains about the hypothesis of indication presence in each point along the whole hypothetical indication. The gain in the information about the hypothesis of indication presence results from the analysis of the underlying image in the local area. Such an estimation function is theoretically justified and exhibits a desirable behaviour on changing signals. The developed algorithm is implemented in the C++ programming language and tested on synthetic as well as on real images. It delivers good results (high correct detection rate at a given false alarm rate) which are comparable to the performance of trained human inspectors.

  10. Detection of crack-like indications in digital radiography by global optimisation of a probabilistic estimation function

    Alekseychuk, O.

    2006-01-01

    A new algorithm for detection of longitudinal crack-like indications in radiographic images is developed in this work. Conventional local detection techniques give unsatisfactory results for this task due to the low signal to noise ratio (SNR ∝ 1) of crack-like indications in radiographic images. The usage of global features of crack-like indications provides the necessary noise resistance, but this is connected with prohibitive computational complexities of detection and difficulties in a formal description of the indication shape. Conventionally, the excessive computational complexity of the solution is reduced by the usage of heuristics. The heuristics to be used are selected on a trial-and-error basis, are problem-dependent, and do not guarantee the optimal solution. A distinctive feature of the algorithm developed here is that it does not follow this route. Instead, a global characteristic of a crack-like indication (the estimation function) is used, whose maximum in the space of all possible positions, lengths and shapes can be found exactly, i.e. without any heuristics. The proposed estimation function is defined as a sum of a posteriori information gains about the hypothesis of indication presence in each point along the whole hypothetical indication. The gain in the information about the hypothesis of indication presence results from the analysis of the underlying image in the local area. Such an estimation function is theoretically justified and exhibits a desirable behaviour on changing signals. The developed algorithm is implemented in the C++ programming language and tested on synthetic as well as on real images. It delivers good results (high correct detection rate at a given false alarm rate) which are comparable to the performance of trained human inspectors.

  11. Foreign Direct Investment versus Portfolio Investment : A Global Games Approach

    Yamin Ahmad; Pietro Cova; Rodrigo Harrison

    2004-01-01

    We present a model of investment under uncertainty about fundamentals, using a global games approach. Goldstein & Razin (2003) show that there is an information based trade-off between foreign direct investment (FDI) and portfolio investment (PI) which rationalizes some well known stylised facts in the literature - the relative volatility and reversibility of foreign direct investment versus portfolio investment. We extend their result and show that uncertainty about fundamentals does not imp...

  12. Helicopter precision approach capability using the Global Positioning System

    Kaufmann, David N.

    1992-01-01

    The period between 1 July and 31 December, 1992, was spent developing a research plan as well as a navigation system document and flight test plan to investigate helicopter precision approach capability using the Global Positioning System (GPS). In addition, all hardware and software required for the research was acquired, developed, installed, and verified on both the test aircraft and the ground-based reference station.

  13. Memristive Probabilistic Computing

    Alahmadi, Hamzah

    2017-10-01

    In the era of Internet of Things and Big Data, unconventional techniques are rising to accommodate the large size of data and the resource constraints. New computing structures are advancing based on non-volatile memory technologies and different processing paradigms. Additionally, the intrinsic resiliency of current applications leads to the development of creative techniques in computations. In those applications, approximate computing provides a perfect fit to optimize the energy efficiency while compromising on the accuracy. In this work, we build probabilistic adders based on stochastic memristors. Probabilistic adders are analyzed with respect to the stochastic behavior of the underlying memristors. Multiple adder implementations are investigated and compared. The memristive probabilistic adder provides a different approach from the typical approximate CMOS adders. Furthermore, it allows for large area savings and design flexibility in trading performance against power saving. To reach a performance level similar to approximate CMOS adders, the memristive adder achieves 60% power savings. An image-compression application is investigated using the memristive probabilistic adders with the performance and energy trade-off.
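
    As a rough illustration of the accuracy-for-energy trade such probabilistic adders make, the sketch below models a ripple-carry adder whose carry logic faults with a small probability, loosely mimicking stochastic memristor switching; the error model and fault rate are assumptions, not the paper's device model.

```python
import random

# Toy probabilistic ripple-carry adder: each full-adder stage produces the
# wrong carry with probability p_err (assumed fault model).
def probabilistic_add(a, b, bits=8, p_err=0.01):
    carry, result = 0, 0
    for i in range(bits):
        x, y = (a >> i) & 1, (b >> i) & 1
        s = x ^ y ^ carry
        carry = (x & y) | (carry & (x ^ y))
        if random.random() < p_err:
            carry ^= 1                  # stochastic carry fault
        result |= s << i
    return result

# Average absolute error versus the exact sum over random operands.
random.seed(0)
pairs = [(random.randrange(128), random.randrange(128)) for _ in range(10_000)]
err = sum(abs(probabilistic_add(a, b) - (a + b)) for a, b in pairs) / len(pairs)
print(f"mean absolute error: {err:.2f}")
```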

  14. A probabilistic approach for the computation of non-linear vibrations of tubes under cross-flow

    Payen, Th.; Langre, E. de.

    1996-01-01

    For the predictive analysis of flow-induced vibration and wear of tube bundles, a probabilistic method is proposed taking into account the uncertainties of the physical parameters. Monte-Carlo simulations are performed to estimate the probability density function of the wear work rate, and a sensitivity analysis is done on the physical parameters influencing wear for the case of a loosely supported tube under cross-flow.
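
    The sketch below illustrates the Monte Carlo structure of such an analysis with a toy surrogate response in place of the non-linear tube dynamics: uncertain physical parameters are sampled, propagated to a wear work rate, and ranked by rank correlation as a crude sensitivity measure. All distributions and the surrogate itself are assumptions.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(9)

# Uncertain physical parameters (assumed distributions).
n = 50_000
gap = rng.normal(0.5e-3, 0.05e-3, n)             # support clearance (m)
velocity = rng.lognormal(np.log(2.0), 0.2, n)    # cross-flow velocity (m/s)
friction = rng.uniform(0.2, 0.6, n)              # friction coefficient

# Toy surrogate for the wear work rate (W); a real study would run the
# non-linear vibration code for each sample instead.
wear_rate = friction * velocity**2 / gap * 1e-6

# Crude sensitivity ranking: rank correlation of each input with the output.
for name, x in [("gap", gap), ("velocity", velocity), ("friction", friction)]:
    rho, _ = spearmanr(x, wear_rate)
    print(f"{name:9s} rank correlation: {rho:+.2f}")
print(f"median wear work rate: {np.median(wear_rate):.2e} W")
```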

  15. Operationalizing the One Health approach: the global governance challenges.

    Lee, Kelley; Brumme, Zabrina L

    2013-10-01

    While there has been wide-ranging commitment to the One Health approach, its operationalisation has so far proven challenging. One Health calls upon the human, animal and environmental health sectors to cross professional, disciplinary and institutional boundaries, and to work in a more integrated fashion. At the global level, this paper argues that this vision is hindered by dysfunctions characterising current forms of global health governance (GHG), namely institutional proliferation, fragmentation, competition for scarce resources, lack of an overarching authority, and donor-driven vertical programmes. This has contributed, in part, to shortcomings in how One Health has been articulated to date. An agreed operational definition of One Health among key global institutions, efforts to build One Health institutions from the ground up, comparative case studies of what works or does not work institutionally, and high-level global support for research, training and career opportunities would all help to enable One Health to help remedy, and not be subsumed by, existing dysfunctions in GHG.

  16. A systematic approach to developing a global surgery elective.

    Hoehn, Richard S; Davis, Bradley R; Huber, Nathan L; Edwards, Michael J; Lungu, Douglas; Logan, Jocelyn M

    2015-01-01

    Interest in global health has been increasing for years among American residents and medical students. Many residency programs have developed global health tracks or electives in response to this need. Our goal was to create a global surgery elective based on a synergistic partnership between our institution and a hospital in the developing world. We created a business plan and 1-year schedule for researching potential sites and completing a pilot rotation at our selected hospital. We administered a survey to general surgery residents at the University of Cincinnati and visited medical facilities in Sierra Leone, Cameroon, and Malawi. The survey was given to all general surgery residents. A resident and a faculty member executed the fact-finding trip as well as the pilot rotation. Our general surgery residents view an international elective as integral to residency training and would participate in such an elective. After investigating 6 hospitals in sub-Saharan Africa, we conducted a pilot rotation at our selected hospital and gained the necessary information to organize a curriculum. We will begin sending senior residents for 8-week rotations in the coming academic year. By systematically approaching the process of creating a global surgery elective, we were able to gain considerable insight into choosing a location and organizing the elective.

  17. Approach on a global HTGR R and D network

    Lensa, W. von

    1997-01-01

    The present situation of nuclear power in general and of the innovative nuclear reactor systems in particular requires more comprehensive, coordinated R and D efforts on a broad international level to respond to today's requirements with respect to public and economic acceptance as well as to globalization trends and global environmental problems. HTGR technology development has already reached a high degree of maturity that will be complemented by the operation of the two new test reactors in Japan and China, representing technological milestones for the demonstration of HTGR safety characteristics and Nuclear Process Heat generation capabilities. It is proposed by the IAEA 'International Working Group on Gas-Cooled Reactors' to establish a 'Global HTGR R and D Network' on basic HTGR technology for the stable, long-term advancement of the specific HTGR features and as a basis for the future market introduction of this innovative reactor system. The background and the motivation for this approach are illustrated, as well as first proposals on the main objectives, the structure and the further procedures for the implementation of such a multinational working sharing R and D network. Modern telecooperation methods are foreseen as an interactive tool for effective communication and collaboration on a global scale.

  18. Computational Approaches to Simulation and Optimization of Global Aircraft Trajectories

    Ng, Hok Kwan; Sridhar, Banavar

    2016-01-01

    This study examines three possible approaches to improving the speed of generating wind-optimal routes for air traffic at the national or global level: (a) using the resources of a supercomputer, (b) running the computations on multiple commercially available computers, and (c) implementing the same algorithms in NASA's Future ATM Concepts Evaluation Tool (FACET); each is compared with a standard implementation run on a single CPU. Wind-optimal aircraft trajectories are computed using global air traffic schedules. The run time and wait time on the supercomputer for trajectory optimization using various numbers of CPUs ranging from 80 to 10,240 units are compared with the total computational time for running the same computation on a single desktop computer and on multiple commercially available computers for potential computational enhancement through parallel processing on the computer clusters. This study also re-implements the trajectory optimization algorithm for further reduction of computational time through algorithm modifications and integrates that with FACET to facilitate the use of the new features which calculate time-optimal routes between worldwide airport pairs in a wind field for use with existing FACET applications. The implementations of trajectory optimization algorithms use the MATLAB, Python, and Java programming languages. The performance evaluations are done by comparing computational efficiencies and are based on the potential application of optimized trajectories. The paper shows that, in the absence of special privileges on a supercomputer, a cluster of commercially available computers provides a feasible approach for national and global air traffic system studies.

  19. Global music approach to persons with dementia: evidence and practice

    Raglio A

    2014-10-01

    Music is an important resource for achieving psychological, cognitive, and social goals in the field of dementia. This paper describes the different types of evidence-based music interventions that can be found in the literature and proposes a structured intervention model (global music approach to persons with dementia, GMA-D). The literature concerning music and dementia was considered and analyzed. The reported studies included more recent studies and/or studies with relevant scientific characteristics. From this background, a global music approach was proposed using music and sound–music elements according to the needs, clinical characteristics, and therapeutic–rehabilitation goals that emerge in the care of persons with dementia. From the literature analysis the following evidence-based interventions emerged: active music therapy (psychological and rehabilitative approaches), active music therapy with family caregivers and persons with dementia, music-based interventions, caregivers singing, individualized listening to music, and background music. Characteristics of each type of intervention are described and discussed. Standardizing the operational methods and evaluation of the single activities and a joint practice can contribute to achieve the validation of the application model. The proposed model can be considered a low-cost nonpharmacological intervention and a therapeutic–rehabilitation method for the reduction of behavioral disturbances, for stimulation of cognitive functions, and for increasing the overall quality of life of persons with dementia.

  20. Applying probabilistic temporal and multisite data quality control methods to a public health mortality registry in Spain: a systematic approach to quality control of repositories.

    Sáez, Carlos; Zurriaga, Oscar; Pérez-Panadés, Jordi; Melchor, Inma; Robles, Montserrat; García-Gómez, Juan M

    2016-11-01

    To assess the variability in data distributions among data sources and over time through a case study of a large multisite repository as a systematic approach to data quality (DQ). Novel probabilistic DQ control methods based on information theory and geometry are applied to the Public Health Mortality Registry of the Region of Valencia, Spain, with 512 143 entries from 2000 to 2012, disaggregated into 24 health departments. The methods provide DQ metrics and exploratory visualizations for (1) assessing the variability among multiple sources and (2) monitoring and exploring changes with time. The methods are suited to big data and multitype, multivariate, and multimodal data. The repository was partitioned into 2 probabilistically separated temporal subgroups following a change in the Spanish National Death Certificate in 2009. Isolated temporal anomalies were noticed due to a one-off increase in missing data, along with outlying and clustered health departments due to differences in populations or in practices. Changes in protocols, differences in populations, biased practices, or other systematic DQ problems affected data variability. Even if semantic and integration aspects are addressed in data sharing infrastructures, probabilistic variability may still be present. Solutions include fixing or excluding data and analyzing different sites or time periods separately. A systematic approach to assessing temporal and multisite variability is proposed. Multisite and temporal variability in data distributions affects DQ, hindering data reuse, and an assessment of such variability should be a part of systematic DQ procedures.
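
    One way to make "variability among multiple sources" concrete is to compare each site's distribution of a coded variable against the pooled distribution with an information-theoretic distance, as in the sketch below; the data are synthetic, and the Jensen-Shannon distance stands in for the richer probabilistic metrics and geometries the paper builds on.

```python
import numpy as np
from scipy.spatial.distance import jensenshannon

rng = np.random.default_rng(13)

# Synthetic multisite data: per-site counts of a coded variable (e.g.,
# cause-of-death chapter) across 24 departments -- assumptions throughout.
n_sites, n_categories = 24, 10
p_common = rng.dirichlet(np.ones(n_categories))
counts = rng.multinomial(2000, p_common, size=n_sites)
# Make one site deliberately anomalous (e.g., a biased coding practice).
counts[5] = rng.multinomial(2000, rng.dirichlet(np.ones(n_categories)))

pooled = counts.sum(axis=0) / counts.sum()
for i, c in enumerate(counts):
    d = jensenshannon(c / c.sum(), pooled)
    flag = " <- outlying" if d > 0.1 else ""   # ad hoc threshold for the demo
    print(f"site {i:2d}: JS distance {d:.3f}{flag}")
```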

  1. A tiered approach for probabilistic ecological risk assessment of contaminated sites

    Zolezzi, M. [Fisia Italimpianti SpA, Genova (Italy); Nicolella, C. [Pisa Univ., Pisa (Italy). Dipartimento di ingegneria chimica, chimica industriale e scienza dei materiali; Tarazona, J.V. [Instituto Nacional de Investigacion y Tecnologia Agraria y Alimentaria, Madrid (Spain). Departamento de Medio Ambiente, Laboratorio de toxicologia

    2005-09-15

    This paper presents a tiered methodology for probabilistic ecological risk assessment. The proposed approach starts from a deterministic comparison (ratio) of a single exposure concentration and a threshold or safe level calculated from a dose-response relationship, goes through a comparison of the probabilistic distributions that describe exposure values and toxicological responses of organisms to the chemical of concern, and finally determines the so-called distribution-based quotients (DBQs). In order to illustrate the proposed approach, soil concentrations of 1,2,4-trichlorobenzene (1,2,4-TCB) measured in an industrial contaminated site were used for site-specific probabilistic ecological risk assessment. By using probabilistic distributions, the risk, which exceeds a level of concern for soil organisms under the deterministic approach, is associated with the presence of hot spots reaching concentrations able to affect acutely more than 50% of the soil species, while the large majority of the area presents 1,2,4-TCB concentrations below those reported as toxic.
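
    The final tier, the distribution-based quotient, lends itself to a short Monte Carlo sketch: draw from an exposure distribution and a species sensitivity distribution and study the distribution of their ratio. The lognormal parameters below are illustrative assumptions, not the site's 1,2,4-TCB data.

```python
import numpy as np

rng = np.random.default_rng(21)

# Distribution-based quotient (DBQ) sketch: divide random draws from the
# exposure-concentration distribution by draws from the species sensitivity
# distribution (SSD). Parameters are illustrative assumptions.
n = 100_000
exposure = rng.lognormal(np.log(0.5), 1.2, n)   # soil concentration (mg/kg)
ssd = rng.lognormal(np.log(20.0), 0.8, n)       # species toxicity endpoints

dbq = exposure / ssd

# P(DBQ > 1) summarizes ecological risk in one number, while the full DBQ
# distribution shows how extreme the hot spots can get.
print(f"P(DBQ > 1) = {np.mean(dbq > 1):.3%}")
print(f"95th percentile DBQ = {np.percentile(dbq, 95):.2f}")
```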

  2. Systems Approaches: A Global and Historical Perspective on Integrative Medicine

    2012-01-01

    The globalization of healing systems is a dance of cultural awareness and cultural dominance that has arisen throughout history. With the development of greater communication and interest in whole-systems approaches to healing, the opportunity for the development of a global perspective on healing has emerged with new life force. The birth of integrative holistic healing systems in the West, such as naturopathic, homeopathic, anthroposophic, integral and functional medicine, and others, echoes the ocean of wisdom present in traditional healing systems, such as traditional Chinese medicine (TCM) and Ayurveda. In working to integrate the lessons from these systems, we see the inextricable link between man and the natural world, we work to understand the root cause of disease, we focus on the whole person to return balance, and we use empiric observation in large populations over time to grasp the interrelationships inherent in the whole-systems view of illness and wellness. PMID:24278794

  3. Global music approach to persons with dementia: evidence and practice.

    Raglio, Alfredo; Filippi, Stefania; Bellandi, Daniele; Stramba-Badiale, Marco

    2014-01-01

    Music is an important resource for achieving psychological, cognitive, and social goals in the field of dementia. This paper describes the different types of evidence-based music interventions that can be found in the literature and proposes a structured intervention model (global music approach to persons with dementia, GMA-D). The literature concerning music and dementia was considered and analyzed. The reported studies included more recent studies and/or studies with relevant scientific characteristics. From this background, a global music approach was proposed using music and sound-music elements according to the needs, clinical characteristics, and therapeutic-rehabilitation goals that emerge in the care of persons with dementia. From the literature analysis the following evidence-based interventions emerged: active music therapy (psychological and rehabilitative approaches), active music therapy with family caregivers and persons with dementia, music-based interventions, caregiver singing, individualized listening to music, and background music. Characteristics of each type of intervention are described and discussed. Standardizing the operational methods, evaluating the single activities, and establishing a joint practice can contribute to validating the application model. The proposed model can be considered a low-cost nonpharmacological intervention and a therapeutic-rehabilitation method for the reduction of behavioral disturbances, for stimulation of cognitive functions, and for increasing the overall quality of life of persons with dementia.

  4. A comparative study of the probabilistic fracture mechanics and the stochastic Markovian process approaches for structural reliability assessment

    Stavrakakis, G.; Lucia, A.C.; Solomos, G. (Commission of the European Communities, Ispra (Italy). Joint Research Centre)

    1990-01-01

    The two computer codes COVASTOL and RELIEF, developed for the modeling of cumulative damage processes in the framework of probabilistic structural reliability, are compared. They are based, respectively, on the randomisation of a differential crack growth law and on the theory of discrete Markov processes. The codes are applied to fatigue crack growth predictions using two sets of crack propagation data from specimens. The results are critically analyzed, followed by an extensive discussion of the merits and limitations of each code. Their transferability to the reliability assessment of real structures is investigated. (author).
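
    For illustration, a sketch in the spirit of randomising a differential crack growth law, here a Paris-type law with a lognormally scattered material constant; all values are invented and the closed-form integration assumes a constant stress range:

        # Sketch: randomised Paris-law fatigue crack growth, Monte Carlo over the
        # material constant C. Illustrative constants, not COVASTOL's model.
        import numpy as np

        rng = np.random.default_rng(0)
        m, a0, a_crit = 3.0, 1e-3, 2e-2          # Paris exponent; initial/critical crack size (m)
        dsig, n_service = 100.0, 5.0e5           # stress range (MPa); service life (cycles)

        # Randomise C (log-normal scatter), then integrate
        # da/dN = C * (dsig * sqrt(pi * a))**m in closed form (valid for m != 2).
        C = rng.lognormal(np.log(1e-11), 0.4, 100_000)
        k = C * (dsig * np.sqrt(np.pi)) ** m
        expo = 1.0 - m / 2.0
        N_fail = (a_crit ** expo - a0 ** expo) / (expo * k)   # cycles to grow a0 -> a_crit

        print("P(failure within service life):", (N_fail < n_service).mean())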

  5. A Robust Optimisation Approach using CVaR for Unit Commitment in a Market with Probabilistic Offers

    Bukhsh, W. A.; Papakonstantinou, Athanasios; Pinson, Pierre

    2016-01-01

    The large scale integration of renewable energy sources (RES) challenges power system planners and operators alike as it can potentially introduce the need for costly investments in infrastructure. Furthermore, traditional market clearing mechanisms are no longer optimal due to the stochastic nature of RES. This paper presents a risk-aware market clearing strategy for a network with significant shares of RES. We propose an electricity market that embeds the uncertainty brought by wind power and other stochastic renewable sources by accepting probabilistic offers and using a risk measure defined...

  6. Global justice, capabilities approach and commercial surrogacy in India.

    Saravanan, Sheela

    2015-08-01

    Inequalities, ineffective governance, unclear surrogacy regulations and unethical practices make India an ideal environment for global injustice in the process of commercial surrogacy. This article aims to apply the 'capabilities approach' to find possibilities of global justice through human fellowship in the context of commercial surrogacy. I draw primarily on my research findings supplemented by other relevant empirical research and documentary films on surrogacy. The paper reveals inequalities and inadequate basic entitlements among surrogate mothers as a consequence of which they are engaged in unjust contracts. Their limited entitlements also limit their opportunities to engage in enriching goals. It is the role of the state to provide all its citizens with basic entitlements and protect their basic human rights. Individuals in India evading their basic duty also contribute to the existing inequalities. Individual responsibilities of the medical practitioners and the intended parents are in question here as they are more inclined towards self-interest rather than commitment towards human fellowship. At the global level, the injustice in transnational commercial surrogacy practices in developing countries calls for an international declaration of women and child rights in third party reproduction with a normative vision of mutual fellowship and human dignity.

  7. Probabilistic Unawareness

    Mikaël Cozic

    2016-11-01

    The modeling of awareness and unawareness is a significant topic in the doxastic logic literature, where it is usually tackled in terms of full belief operators. The present paper aims at a treatment in terms of partial belief operators. It draws upon the modal probabilistic logic that was introduced by Aumann (1999) at the semantic level, and then axiomatized by Heifetz and Mongin (2001). The paper embodies in this framework those properties of unawareness that have been highlighted in the seminal paper by Modica and Rustichini (1999). Their paper deals with full belief, but we argue that the properties in question also apply to partial belief. Our main result is a (soundness and completeness) theorem that reunites the two strands—modal and probabilistic—of doxastic logic.

  8. Defining and Theorizing Terrorism: A Global Actor-Centered Approach

    Omar Lizardo

    2015-08-01

    Arriving at a consensual definition of the phenomenon of terrorism has been a particularly difficult undertaking. Some definitions are either too specific or too vague, concentrating on some essential “terrorist” aspect of the actions, strategies, or types of non-state organizations that engage in terrorism. In this paper I draw on global approaches from international relations and world systems theories to propose a definition of terrorism that skirts these issues by concentrating on terrorist actors rather than terrorist behavior. I argue that this approach has several advantages, including the dissolution of several empirical and analytical problems produced by more essentialist definitions, and the location of terrorism within a two dimensional continuum of collective-violence phenomena in the international system which discloses important theoretical insights. I proceed to examine the characteristics of terrorism by comparing it with other forms of violence in the international system. I propose that terrorism may be part of the cycles and trends of unrest in the world system, responding to the same broad families of global dynamics as other forms of system-level conflict.

  9. A global sensitivity analysis approach for morphogenesis models

    Boas, Sonja E. M.; Navarro Jimenez, Maria I.; Merks, Roeland M. H.; Blom, Joke G.

    2015-11-21

    Background: Morphogenesis is a developmental process in which cells organize into shapes and patterns. Complex, non-linear and multi-factorial models with images as output are commonly used to study morphogenesis. It is difficult to understand the relation between the uncertainty in the input and the output of such ‘black-box’ models, giving rise to the need for sensitivity analysis tools. In this paper, we introduce a workflow for a global sensitivity analysis approach to study the impact of single parameters and the interactions between them on the output of morphogenesis models. Results: To demonstrate the workflow, we used a published, well-studied model of vascular morphogenesis. The parameters of this cellular Potts model (CPM) represent cell properties and behaviors that drive the mechanisms of angiogenic sprouting. The global sensitivity analysis correctly identified the dominant parameters in the model, consistent with previous studies. Additionally, the analysis provided information on the relative impact of single parameters and of interactions between them. This is very relevant because interactions of parameters impede the experimental verification of the predicted effect of single parameters. The parameter interactions, although of low impact, provided also new insights in the mechanisms of in silico sprouting. Finally, the analysis indicated that the model could be reduced by one parameter. Conclusions: We propose global sensitivity analysis as an alternative approach to study the mechanisms of morphogenesis. Comparison of the ranking of the impact of the model parameters to knowledge derived from experimental data and from manipulation experiments can help to falsify models and to find the operand mechanisms in morphogenesis. The workflow is applicable to all ‘black-box’ models, including high-throughput in vitro models in which output measures are affected by a set of experimental perturbations.
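
    A minimal sketch of computing first- and total-order Sobol' indices, assuming the SALib package and a cheap stand-in function in place of the cellular Potts model:

        # Sketch: global sensitivity analysis with Sobol' indices via SALib
        # (assumed available). A toy function replaces the actual CPM model.
        import numpy as np
        from SALib.sample import saltelli
        from SALib.analyze import sobol

        problem = {
            "num_vars": 3,
            "names": ["adhesion", "chemotaxis", "elasticity"],   # hypothetical parameters
            "bounds": [[0.0, 1.0]] * 3,
        }

        X = saltelli.sample(problem, 1024)          # (N*(2D+2), D) design matrix
        Y = X[:, 0] + 2.0 * X[:, 1] * X[:, 2]       # stand-in for the model output

        Si = sobol.analyze(problem, Y)
        print("first-order S1:", Si["S1"])          # single-parameter effects
        print("total-order ST:", Si["ST"])          # effects including interactions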

  11. Probabilistic methods for physics

    Cirier, G

    2013-01-01

    We present an asymptotic method giving the probability of presence of the iterates of R^d under a polynomial function f. We use the well-known Perron-Frobenius (PF) operator, which leaves certain sets and measures invariant under f. Probabilistic solutions can exist for the deterministic iteration. While the theoretical result is already known, here we quantify these probabilities. This approach seems useful for computing in situations where the deterministic methods do not run. Among the examined applications are asymptotic solutions of the Lorenz, Navier-Stokes and Hamilton equations. In this approach, linearity induces many difficult problems, not all of which we have yet resolved.
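
    As a concrete instance of quantifying the probability of presence of iterates, the following sketch estimates the invariant density of the logistic map f(x) = 4x(1-x), for which the exact answer is known and serves as a check:

        # Sketch: empirical invariant density of a deterministic iteration.
        import numpy as np

        x, samples = 0.2345, []
        for i in range(200_000):
            x = 4.0 * x * (1.0 - x)
            if i > 1_000:                      # discard the transient
                samples.append(x)

        hist, edges = np.histogram(samples, bins=50, range=(0.0, 1.0), density=True)
        centers = 0.5 * (edges[:-1] + edges[1:])
        exact = 1.0 / (np.pi * np.sqrt(centers * (1.0 - centers)))   # known invariant density
        print("max |empirical - exact|:", np.abs(hist - exact).max())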

  12. Near-source mobile methane emission estimates using EPA Method 33A and a novel probabilistic approach as a basis for leak quantification in urban areas

    Albertson, J. D.

    2015-12-01

    Methane emissions from underground pipeline leaks remain an ongoing issue in the development of accurate methane emission inventories for the natural gas supply chain. Application of mobile methods during routine street surveys would help address this issue, but there are large uncertainties in current approaches. In this paper, we describe results from a series of near-source (< 30 m) controlled methane releases where an instrumented van was used to measure methane concentrations during both fixed location sampling and during mobile traverses immediately downwind of the source. The measurements were used to evaluate the application of EPA Method 33A for estimating methane emissions downwind of a source and also to test the application of a new probabilistic approach for estimating emission rates from mobile traverse data.
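
    For illustration, a sketch of a probabilistic (Monte Carlo) inversion of a near-source traverse using the standard Gaussian-plume centerline formula; the numbers and plume-spread values are invented, and this is not the paper's estimator:

        # Sketch: leak-rate estimate from a near-source traverse, Monte Carlo over
        # uncertain wind speed, with a Gaussian-plume forward model.
        import numpy as np

        rng = np.random.default_rng(9)
        c_peak_obs = 2.5e-3           # observed peak enhancement (g/m^3), assumed
        sigma_y, sigma_z = 2.0, 1.5   # plume spreads at the traverse distance (m), assumed

        # Ground-level centerline formula C = Q / (pi * u * sigma_y * sigma_z),
        # inverted for Q while sampling the wind speed u.
        u = rng.normal(3.0, 0.8, 50_000).clip(0.5)          # wind speed (m/s)
        Q = c_peak_obs * np.pi * u * sigma_y * sigma_z      # emission rate (g/s)

        print("median Q:", np.median(Q), "g/s")
        print("90% interval:", np.quantile(Q, [0.05, 0.95]))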

  13. Review of cause-based decision tree approach for the development of domestic standard human reliability analysis procedure in low power/shutdown operation probabilistic safety assessment

    Kang, D. I.; Jung, W. D.

    2003-01-01

    We review the Cause-Based Decision Tree (CBDT) approach to decide whether to incorporate it into the development of a domestic standard Human Reliability Analysis (HRA) procedure in low power/shutdown operation Probabilistic Safety Assessment (PSA). In this paper, we introduce the cause-based decision tree approach, quantify human errors using it, and identify its merits and demerits in comparison with the previously used THERP. The review results show that it is difficult to incorporate the CBDT method into the development of a domestic standard HRA procedure in low power/shutdown PSA because the CBDT method requires the subjective judgment of the HRA analyst, as THERP does. However, it is expected that incorporating the CBDT method into the development of the domestic standard HRA procedure, only for the comparison of quantitative HRA results, will relieve the burden of developing a detailed HRA procedure and will help maintain consistent quantitative HRA results

  14. Helicopter Approach Capability Using the Differential Global Positioning System

    Kaufmann, David N.

    1994-01-01

    The results of flight tests to determine the feasibility of using the Global Positioning System (GPS) in the Differential mode (DGPS) to provide high accuracy, precision navigation and guidance for helicopter approaches to landing are presented. The airborne DGPS receiver and associated equipment is installed in a NASA UH-60 Black Hawk helicopter. The ground-based DGPS reference receiver is located at a surveyed test site and is equipped with a real-time VHF data link to transmit correction information to the airborne DGPS receiver. The corrected airborne DGPS information, together with the preset approach geometry, is used to calculate guidance commands which are sent to the aircraft's approach guidance instruments. The use of DGPS derived guidance for helicopter approaches to landing is evaluated by comparing the DGPS data with the laser tracker truth data. The errors indicate that the helicopter position based on DGPS guidance satisfies the International Civil Aviation Organization (ICAO) Category 1 (CAT 1) lateral and vertical navigational accuracy requirements.

  15. Development of system based code for integrity of FBR. Fundamental probabilistic approach, Part 1: Model calculation of creep-fatigue damage (Research report)

    Kawasaki, Nobuchika; Asayama, Tai

    2001-09-01

    Both reliability and safety have to be further improved for the successful commercialization of FBRs. At the same time, construction and operation costs need to be reduced to the same level as future LWRs. To realize compatibility among reliability, safety, and cost, the Structural Mechanics Research Group in JNC started the development of a System Based Code for Integrity of FBR. This code extends the present structural design standard to include the areas of fabrication, installation, plant system design, safety design, operation and maintenance, and so on. A quantitative index is necessary to connect the different partial standards in this code. Failure probability is considered a candidate index. Therefore we decided to make a model calculation using failure probability and judge its applicability. We first investigated other probabilistic standards such as ASME Code Case N-578. A probabilistic approach to structural integrity evaluation was created based on these results, and an evaluation flow was also proposed. According to this flow, a model calculation of creep-fatigue damage was performed. This trial calculation was for a vessel in a sodium-cooled FBR. As the result of this model calculation, a crack initiation probability and a crack penetration probability were found to be effective indices. Lastly, we discuss the merits of this System Based Code, which are presented in this report. Furthermore, this report presents future development tasks. (author)

  16. The exposure of Sydney (Australia) to earthquake-generated tsunamis, storms and sea level rise: a probabilistic multi-hazard approach.

    Dall'Osso, F; Dominey-Howes, D; Moore, C; Summerhayes, S; Withycombe, G

    2014-12-10

    Approximately 85% of Australia's population live along the coastal fringe, an area with high exposure to extreme inundations such as tsunamis. However, to date, no Probabilistic Tsunami Hazard Assessments (PTHA) that include inundation have been published for Australia. This limits the development of appropriate risk reduction measures by decision and policy makers. We describe our PTHA undertaken for the Sydney metropolitan area. Using the NOAA NCTR model MOST (Method for Splitting Tsunamis), we simulate 36 earthquake-generated tsunamis with annual probabilities of 1:100, 1:1,000 and 1:10,000, occurring under present and future predicted sea level conditions. For each tsunami scenario we generate a high-resolution inundation map of the maximum water level and flow velocity, and we calculate the exposure of buildings and critical infrastructure. Results indicate that exposure to earthquake-generated tsunamis is relatively low for present events, but increases significantly with higher sea level conditions. The probabilistic approach allowed us to undertake a comparison with an existing storm surge hazard assessment. Interestingly, the exposure to all the simulated tsunamis is significantly lower than that for the 1:100 storm surge scenarios, under the same initial sea level conditions. The results have significant implications for multi-risk and emergency management in Sydney.
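
    A small worked example of combining scenario annual probabilities into a probability of at least one exceedance over a planning horizon, under a Poisson-occurrence and independence assumption (the return periods match the scenario classes above; the horizon is arbitrary):

        # Sketch: from annual exceedance probabilities to multi-year risk.
        import numpy as np

        annual_p = np.array([1 / 100, 1 / 1000, 1 / 10000])  # 1:100, 1:1,000, 1:10,000 classes
        rates = -np.log(1.0 - annual_p)      # equivalent Poisson occurrence rates
        horizon = 50.0                       # planning horizon (years), arbitrary

        # Independence of the scenario classes is assumed, so rates add.
        p_any = 1.0 - np.exp(-rates.sum() * horizon)
        print(f"P(at least one exceedance in {horizon:.0f} yr) = {p_any:.3f}")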

  17. Probabilistic Near and Far-Future Climate Scenarios of Precipitation and Surface Temperature for the North American Monsoon Region Under a Weighted CMIP5-GCM Ensemble Approach.

    Montero-Martinez, M. J.; Colorado, G.; Diaz-Gutierrez, D. E.; Salinas-Prieto, J. A.

    2017-12-01

    It is well known that the North American Monsoon (NAM) region is a very dry region under considerable stress due to the lack of water resources at multiple locations. Interestingly, even under those conditions, the Mexican part of the NAM region is the most agriculturally productive in Mexico. Thus, it is very important to have realistic climate scenarios for climate variables such as temperature, precipitation, relative humidity, radiation, etc. This study tries to tackle that problem by generating probabilistic climate scenarios using a weighted CMIP5-GCM ensemble approach based on the Xu et al. (2010) technique, which is itself an improvement on the better-known Reliability Ensemble Averaging algorithm of Giorgi and Mearns (2002). In addition, the individual performances of the 20-plus GCMs and of the weighted ensemble are compared against observed data (CRU TS2.1) using different metrics and Taylor diagrams. This study focuses on probabilistic results reaching a certain threshold, given that those types of products could be of potential use for agricultural applications.
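
    A minimal sketch of a reliability-ensemble-averaging style weighting, in which each GCM is down-weighted by its historical bias and by its distance from a weighted consensus; exponents and data are placeholders rather than the Xu et al. (2010) calibration:

        # Sketch: REA-style ensemble weights from bias and convergence criteria.
        import numpy as np

        obs = 2.1                                        # observed climatology (e.g., mm/day)
        models = np.array([2.0, 2.6, 1.2, 2.2, 3.0])     # GCM historical means, invented
        eps = 0.3                                        # natural variability scale, assumed

        bias = np.abs(models - obs)
        consensus = np.average(models, weights=1.0 / np.maximum(bias, eps))
        dist = np.abs(models - consensus)

        weights = (eps / np.maximum(bias, eps)) * (eps / np.maximum(dist, eps))
        weights /= weights.sum()
        print("weights:", weights.round(3))
        print("weighted projection:", np.average(models, weights=weights))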

  18. Probabilistic liver atlas construction.

    Dura, Esther; Domingo, Juan; Ayala, Guillermo; Marti-Bonmati, Luis; Goceri, E

    2017-01-13

    Anatomical atlases are 3D volumes or shapes representing an organ or structure of the human body. They contain either the prototypical shape of the object of interest together with other shapes representing its statistical variations (statistical atlas) or a probability map of belonging to the object (probabilistic atlas). Probabilistic atlases are mostly built with simple estimations only involving the data at each spatial location. A new method for probabilistic atlas construction that uses a generalized linear model is proposed. This method aims to improve the estimation of the probability to be covered by the liver. Furthermore, all methods to build an atlas involve previous coregistration of the sample of shapes available. The influence of the geometrical transformation adopted for registration in the quality of the final atlas has not been sufficiently investigated. The ability of an atlas to adapt to a new case is one of the most important quality criteria that should be taken into account. The presented experiments show that some methods for atlas construction are severely affected by the previous coregistration step. We show the good performance of the new approach. Furthermore, results suggest that extremely flexible registration methods are not always beneficial, since they can reduce the variability of the atlas and hence its ability to give sensible values of probability when used as an aid in segmentation of new cases.
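
    For illustration, a sketch of a generalized-linear-model atlas: a logistic regression maps voxel coordinates (with quadratic terms) to the probability of lying inside the organ, fitted here on synthetic coregistered masks that stand in for real liver segmentations:

        # Sketch: GLM (logistic) probabilistic atlas on synthetic binary masks.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(3)
        grid = np.stack(np.meshgrid(np.arange(32), np.arange(32), np.arange(32),
                                    indexing="ij"), axis=-1).reshape(-1, 3).astype(float)

        # Fake "subjects": spheres with jittered centres play the role of livers.
        masks = []
        for _ in range(5):
            c = np.array([16.0, 16.0, 16.0]) + rng.normal(0, 1.5, 3)
            masks.append((np.linalg.norm(grid - c, axis=1) < 8.0).astype(int))

        X = np.tile(np.hstack([grid, (grid - 16.0) ** 2]), (5, 1))   # linear + quadratic terms
        y = np.concatenate(masks)
        atlas = LogisticRegression(max_iter=1000).fit(X, y)

        prob_map = atlas.predict_proba(X[: grid.shape[0]])[:, 1].reshape(32, 32, 32)
        print("max probability:", prob_map.max().round(3), "mean:", prob_map.mean().round(3))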

  19. Explaining Differences Between Bioaccumulation Measurements in Laboratory and Field Data Through Use of a Probabilistic Modeling Approach

    Selck, Henriette; Drouillard, Ken; Eisenreich, Karen

    2012-01-01

    was improved by accounting for bioavailability and absorption efficiency limitations, due to the presence of black carbon in sediment, and was used for probabilistic modeling of variability and propagation of error. Results showed that at lower trophic levels (mayfly and polychaete), variability in bioaccumulation was mainly driven by sediment exposure, sediment composition and chemical partitioning to sediment components, which was in turn dominated by the influence of black carbon. At higher trophic levels (yellow perch and the little owl), food web structure (i.e., diet composition and abundance…) components. Improvements in the accuracy of aqueous exposure appear to be less relevant when applied to moderate to highly hydrophobic compounds, because this route contributes only marginally to total uptake. The determination of chemical bioavailability and the increase in understanding and qualifying...

  20. A High Performance Computing Approach to Tree Cover Delineation in 1-m NAIP Imagery Using a Probabilistic Learning Framework

    Basu, Saikat; Ganguly, Sangram; Michaelis, Andrew; Votava, Petr; Roy, Anshuman; Mukhopadhyay, Supratik; Nemani, Ramakrishna

    2015-01-01

    Tree cover delineation is a useful instrument in deriving Above Ground Biomass (AGB) density estimates from Very High Resolution (VHR) airborne imagery data. Numerous algorithms have been designed to address this problem, but most of them do not scale to these datasets, which are of the order of terabytes. In this paper, we present a semi-automated probabilistic framework for the segmentation and classification of 1-m National Agriculture Imagery Program (NAIP) for tree-cover delineation for the whole of Continental United States, using a High Performance Computing Architecture. Classification is performed using a multi-layer Feedforward Backpropagation Neural Network and segmentation is performed using a Statistical Region Merging algorithm. The results from the classification and segmentation algorithms are then consolidated into a structured prediction framework using a discriminative undirected probabilistic graphical model based on Conditional Random Field, which helps in capturing the higher order contextual dependencies between neighboring pixels. Once the final probability maps are generated, the framework is updated and re-trained by relabeling misclassified image patches. This leads to a significant improvement in the true positive rates and reduction in false positive rates. The tree cover maps were generated for the whole state of California, spanning a total of 11,095 NAIP tiles covering a total geographical area of 163,696 sq. miles. The framework produced true positive rates of around 88% for fragmented forests and 74% for urban tree cover areas, with false positive rates lower than 2% for both landscapes. Comparative studies with the National Land Cover Data (NLCD) algorithm and the LiDAR canopy height model (CHM) showed the effectiveness of our framework for generating accurate high-resolution tree-cover maps.

  2. Global approaches and local strategies for phase unwrapping

    Guerriero, L.; Refice, A.; Stramaglia, S.; Chiaradia, M. T.; Satalino, G.; Veneziani, N.; Blonda, P.; Pasquariello, G.

    2001-01-01

    Phase unwrapping, i.e. the retrieval of absolute phases from wrapped, noisy measures, is a tough problem because of the presence of rotational inconsistencies (residues), randomly generated by noise and undersampling on the principal phase gradient field. These inconsistencies prevent the recovery of the absolute phase field by direct integration of the wrapped gradients. In this paper we examine the relative merits of known global approaches and then present evidence that the approach based on stochastic annealing can recover the true phase field even in noisy areas with severe undersampling, where other methods fail. Then, some experiments with local approaches are presented. A fast neural filter has been trained to eliminate close residue couples by joining them in a way that takes into account the local phase information; it removes about 60-70% of the residues. Finally, other experiments have been aimed at designing an automated method for the determination of weight matrices to use in conjunction with local phase unwrapping algorithms. The method, tested with the minimum cost flow algorithm, gives good performances over both simulated and real data
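
    A minimal sketch of the residue detection that underlies both the global and local strategies: the wrapped phase gradient is summed around every elementary 2x2 loop, and any loop sum of plus or minus 2*pi marks a residue:

        # Sketch: locating residues on a wrapped phase field (synthetic data).
        import numpy as np

        def wrap(a):
            return (a + np.pi) % (2.0 * np.pi) - np.pi

        rng = np.random.default_rng(7)
        xx, yy = np.meshgrid(np.linspace(0, 6 * np.pi, 64), np.linspace(0, 6 * np.pi, 64))
        phi = 2.0 * np.sin(xx) + 0.5 * yy          # smooth synthetic phase
        phi += rng.normal(0, 0.8, phi.shape)       # noise that creates residues
        psi = wrap(phi)                            # wrapped observation

        dx = wrap(np.diff(psi, axis=1))            # wrapped horizontal differences
        dy = wrap(np.diff(psi, axis=0))            # wrapped vertical differences
        loop = dx[:-1, :] + dy[:, 1:] - dx[1:, :] - dy[:, :-1]
        residues = np.rint(loop / (2.0 * np.pi)).astype(int)    # -1, 0 or +1

        print("positive residues:", (residues > 0).sum(),
              "negative residues:", (residues < 0).sum())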

  3. PROBABILISTIC SENSITIVITY AND UNCERTAINTY ANALYSIS WORKSHOP SUMMARY REPORT

    Seitz, R

    2008-06-25

    Stochastic or probabilistic modeling approaches are being applied more frequently in the United States and globally to quantify uncertainty and enhance understanding of model response in performance assessments for disposal of radioactive waste. This increased use has resulted in global interest in sharing results of research and applied studies that have been completed to date. This technical report reflects the results of a workshop that was held to share results of research and applied work related to performance assessments conducted at United States Department of Energy sites. Key findings of this research and applied work are discussed and recommendations for future activities are provided.

  4. Global Natural Disaster Risk Hotspots: Transition to a Regional Approach

    Lerner-Lam, A.; Chen, R.; Dilley, M.

    2005-12-01

    The "Hotspots Project" is a collaborative study of the global distribution and occurrence of multiple natural hazards and the associated exposures of populations and their economic output. In this study we assess the global risks of two disaster-related outcomes: mortality and economic losses. We estimate risk levels by combining hazard exposure with historical vulnerability for two indicators of elements at risk-gridded population and Gross Domestic Product (GDP) per unit area - for six major natural hazards: earthquakes, volcanoes, landslides, floods, drought, and cyclones. By calculating relative risks for each grid cell rather than for countries as a whole, we are able to estimate risk levels at sub-national scales. These can then be used to estimate aggregate relative multiple hazard risk at regional and national scales. Mortality-related risks are assessed on a 2.5' x 2.5' latitude-longitude grid of global population (GPW Version 3). Economic risks are assessed at the same resolution for gridded GDP per unit area, using World Bank estimates of GDP based on purchasing power parity. Global hazard data were compiled from multiple sources. The project collaborated directly with UNDP and UNEP, the International Research Institute for Climate Prediction (IRI) at Columbia, and the Norwegian Geotechnical Institute (NGI) in the creation of data sets for several hazards for which global data sets did not previously exist. Drought, flood and volcano hazards are characterized in terms of event frequency, storms by frequency and severity, earthquakes by frequency and ground acceleration exceedance probability, and landslides by an index derived from probability of occurrence. The global analysis undertaken in this project is clearly limited by issues of scale as well as by the availability and quality of data. For some hazards, there exist only 15- to 25-year global records with relatively crude spatial information. Data on historical disaster losses, and particularly on

  5. Global Crop Monitoring: A Satellite-Based Hierarchical Approach

    Bingfang Wu

    2015-04-01

    Taking advantage of multiple new remote sensing data sources, especially from Chinese satellites, the CropWatch system has expanded the scope of its international analyses through the development of new indicators and an upgraded operational methodology. The approach adopts a hierarchical system covering four spatial levels of detail: global, regional, national (thirty-one key countries, including China) and “sub-country” (for the nine largest countries). The thirty-one countries encompass more than 80% of both production and exports of maize, rice, soybean and wheat. The methodology resorts to climatic and remote sensing indicators at different scales. The global patterns of crop environmental growing conditions are first analyzed with indicators for rainfall, temperature, photosynthetically active radiation (PAR) as well as potential biomass. At the regional scale, the indicators pay more attention to crops and include the Vegetation Health Index (VHI), Vegetation Condition Index (VCI), Cropped Arable Land Fraction (CALF) as well as Cropping Intensity (CI). Together, they characterize crop situation, farming intensity and stress. CropWatch carries out detailed crop condition analyses at the national scale with a comprehensive array of variables and indicators. The Normalized Difference Vegetation Index (NDVI), cropped areas and crop conditions are integrated to derive food production estimates. For the nine largest countries, CropWatch zooms into the sub-national units to acquire detailed information on crop condition and production by including new indicators (e.g., crop type proportion). Based on trend analysis, CropWatch also issues crop production supply outlooks, covering both long-term variations and short-term dynamic changes in key food exporters and importers. The hierarchical approach adopted by CropWatch is the basis of the analyses of climatic and crop conditions assessments published in the quarterly “CropWatch bulletin” which

  6. Toward a global space exploration program: A stepping stone approach

    Ehrenfreund, Pascale; McKay, Chris; Rummel, John D.; Foing, Bernard H.; Neal, Clive R.; Masson-Zwaan, Tanja; Ansdell, Megan; Peter, Nicolas; Zarnecki, John; Mackwell, Steve; Perino, Maria Antionetta; Billings, Linda; Mankins, John; Race, Margaret

    2012-01-01

    In response to the growing importance of space exploration in future planning, the Committee on Space Research (COSPAR) Panel on Exploration (PEX) was chartered to provide independent scientific advice to support the development of exploration programs and to safeguard the potential scientific assets of solar system objects. In this report, PEX elaborates a stepwise approach to achieve a new level of space cooperation that can help develop world-wide capabilities in space science and exploration and support a transition that will lead to a global space exploration program. The proposed stepping stones are intended to transcend cross-cultural barriers, leading to the development of technical interfaces and shared legal frameworks and fostering coordination and cooperation on a broad front. Input for this report was drawn from expertise provided by COSPAR Associates within the international community and via the contacts they maintain in various scientific entities. The report provides a summary and synthesis of science roadmaps and recommendations for planetary exploration produced by many national and international working groups, aiming to encourage and exploit synergies among similar programs. While science and technology represent the core and, often, the drivers for space exploration, several other disciplines and their stakeholders (Earth science, space law, and others) should be more robustly interlinked and involved than they have been to date. The report argues that a shared vision is crucial to this linkage, and to providing a direction that enables new countries and stakeholders to join and engage in the overall space exploration effort. Building a basic space technology capacity within a wider range of countries, ensuring new actors in space act responsibly, and increasing public awareness and engagement are concrete steps that can provide a broader interest in space exploration, worldwide, and build a solid basis for program sustainability. By engaging

  7. Local to global: a collaborative approach to volcanic risk assessment

    Calder, Eliza; Loughlin, Sue; Barsotti, Sara; Bonadonna, Costanza; Jenkins, Susanna

    2017-04-01

    -economic conditions tending to influence longer term well-being and recovery. The volcanological community includes almost 100 Volcano Observatories worldwide, the official institutions responsible for monitoring volcanoes. They may be dedicated institutions, or operate from national institutions (geological surveys, universities, met agencies). They have a key role in early warning, forecasting and long term hazard assessment (often in the form of volcanic hazards maps). The complexity of volcanic systems means that once unrest begins there are multiple potential eruptive outcomes and short term forecasts can change rapidly. This local knowledge of individual volcanoes underpins hazard and risk assessments developed at national, regional and global scales. Combining this local expertise with the knowledge of the international research community (including interdisciplinary perspectives) creates a powerful partnership. A collaborative approach is therefore needed to develop effective volcanic risk assessments at regional to global scale. The World Organisation of Volcano Observatories is a Commission of IAVCEI, alongside other Commissions such as 'Hazard and Risk' (with an active working group on volcanic hazards maps) and the 'Cities and Volcanoes' Commission. The Global Volcano Model network is a collaborative initiative developing hazards and risk information at national to global scales, underpinned by local expertise. Partners include IAVCEI, Smithsonian Institution, International Volcanic Health Hazard Network, VHub and other initiatives and institutions.

  8. A probabilistic approach of the Flash Flood Early Warning System (FF-EWS) in Catalonia based on radar ensemble generation

    Velasco, David; Sempere-Torres, Daniel; Corral, Carles; Llort, Xavier; Velasco, Enrique

    2010-05-01

    probabilistic component to the FF-EWS. As a first step, we have incorporated the uncertainty in rainfall estimates and forecasts based on an ensemble of equiprobable rainfall scenarios. The presented study has focused on a number of rainfall events, and the performance of the FF-EWS has been evaluated in terms of its ability to produce probabilistic hazard warnings for decision-making support.
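
    For illustration, a sketch of turning an ensemble of equiprobable rainfall scenarios into a probabilistic warning; the gamma-distributed members and the threshold are invented:

        # Sketch: exceedance probability from an equiprobable rainfall ensemble.
        import numpy as np

        rng = np.random.default_rng(4)
        members = rng.gamma(shape=2.0, scale=6.0, size=(50, 6))  # 50 members x 6 hourly totals (mm)
        accum = members.sum(axis=1)                              # 6-h accumulation per member
        threshold = 100.0                                        # warning threshold (mm / 6 h)
        print("P(exceed threshold):", (accum > threshold).mean())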

  9. A global "imaging'' view on systems approaches in immunology.

    Ludewig, Burkhard; Stein, Jens V; Sharpe, James; Cervantes-Barragan, Luisa; Thiel, Volker; Bocharov, Gennady

    2012-12-01

    The immune system exhibits an enormous complexity. High throughput methods such as the "-omic'' technologies generate vast amounts of data that facilitate dissection of immunological processes at ever finer resolution. Using high-resolution data-driven systems analysis, causal relationships between complex molecular processes and particular immunological phenotypes can be constructed. However, processes in tissues, organs, and the organism itself (so-called higher level processes) also control and regulate the molecular (lower level) processes. Reverse systems engineering approaches, which focus on the examination of the structure, dynamics and control of the immune system, can help to understand the construction principles of the immune system. Such integrative mechanistic models can properly describe, explain, and predict the behavior of the immune system in health and disease by combining both higher and lower level processes. Moving from molecular and cellular levels to a multiscale systems understanding requires the development of methodologies that integrate data from different biological levels into multiscale mechanistic models. In particular, 3D imaging techniques and 4D modeling of the spatiotemporal dynamics of immune processes within lymphoid tissues are central for such integrative approaches. Both dynamic and global organ imaging technologies will be instrumental in facilitating comprehensive multiscale systems immunology analyses as discussed in this review. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Spatiotemporal radiotherapy planning using a global optimization approach

    Adibi, Ali; Salari, Ehsan

    2018-02-01

    This paper aims at quantifying the extent of potential therapeutic gain, measured using biologically effective dose (BED), that can be achieved by altering the radiation dose distribution over treatment sessions in fractionated radiotherapy. To that end, a spatiotemporally integrated planning approach is developed, where the spatial and temporal dose modulations are optimized simultaneously. The concept of equivalent uniform BED (EUBED) is used to quantify and compare the clinical quality of spatiotemporally heterogeneous dose distributions in target and critical structures. This gives rise to a large-scale non-convex treatment-plan optimization problem, which is solved using global optimization techniques. The proposed spatiotemporal planning approach is tested on two stylized cancer cases resembling two different tumor sites and sensitivity analysis is performed for radio-biological and EUBED parameters. Numerical results validate that spatiotemporal plans are capable of delivering a larger BED to the target volume without increasing the BED in critical structures compared to conventional time-invariant plans. In particular, this additional gain is attributed to the irradiation of different regions of the target volume at different treatment sessions. Additionally, the trade-off between the potential therapeutic gain and the number of distinct dose distributions is quantified, which suggests a diminishing marginal gain as the number of dose distributions increases.
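
    A small worked example of the biologically effective dose that underlies EUBED, BED = sum_i d_i (1 + d_i/(alpha/beta)); the fraction doses and alpha/beta values are illustrative only:

        # Sketch: BED for a uniform vs a temporally modulated fractionation plan.
        import numpy as np

        ab_tumor, ab_oar = 10.0, 3.0                     # alpha/beta ratios (Gy), typical values
        uniform = np.full(5, 2.0)                        # 5 fractions of 2 Gy
        varied = np.array([4.0, 1.0, 4.0, 1.0, 0.0])     # same 10 Gy physical dose, modulated in time

        def bed(d, ab):
            # BED = sum_i d_i * (1 + d_i / (alpha/beta))
            return float(np.sum(d * (1.0 + d / ab)))

        for name, d in (("uniform", uniform), ("varied", varied)):
            print(name, "tumor BED:", bed(d, ab_tumor), "OAR BED:", bed(d, ab_oar))
        # The paper's gain additionally relies on varying *where* the dose goes each
        # session, so tumor BED rises without raising BED in critical structures.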

  11. Probabilistic approach for assessing infants' health risks due to ingestion of nanoscale silver released from consumer products.

    Pang, Chengfang; Hristozov, Danail; Zabeo, Alex; Pizzol, Lisa; Tsang, Michael P; Sayre, Phil; Marcomini, Antonio

    2017-02-01

    Silver nanoparticles (n-Ag) are widely used in consumer products and many medical applications because of their unique antibacterial properties. Their use is raising concern about potential human exposures and health effects. Therefore, it is informative to assess the potential human health risks of n-Ag in order to ensure that nanotechnology-based consumer products are deployed in a safe and sustainable way. Even though toxicity studies clearly show the potential hazard of n-Ag, there have been few attempts to integrate hazard and exposure assessments to evaluate risks. The underlying reason for this is the difficulty in characterizing exposure and the lack of toxicity studies essential for human health risk assessment (HHRA). Such data gaps introduce significant uncertainty into the risk assessment process. This study uses probabilistic methods to assess the relative uncertainty and potential risks of n-Ag exposure to infants. In this paper, we estimate the risks for infants potentially exposed to n-Ag through drinking juice or milk from sippy cups or licking baby blankets containing n-Ag. We explicitly evaluate uncertainty and variability contained in available dose-response and exposure data in order to make the risk characterization process transparent. Our results showed that individual margins of exposure for oral exposure to sippy cups and baby blankets containing n-Ag indicated minimal risk. Copyright © 2016. Published by Elsevier Ltd.
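
    A minimal sketch of a probabilistic margin-of-exposure calculation for the infant oral route; every distribution and the point of departure are assumed values, not the study's inputs:

        # Sketch: Monte Carlo margin of exposure, MoE = point of departure / dose.
        import numpy as np

        rng = np.random.default_rng(11)
        n = 100_000
        pod = 10.0                                       # mg/kg-bw/day, hypothetical NOAEL
        migration = rng.lognormal(np.log(1e-3), 0.8, n)  # mg n-Ag per serving, assumed
        servings = rng.integers(1, 5, n)                 # servings per day
        bw = rng.normal(9.0, 1.2, n).clip(5.0)           # infant body weight (kg)

        dose = migration * servings / bw                 # mg/kg-bw/day
        moe = pod / dose
        # Large MoE values (here, far above a factor of 100) point to minimal risk,
        # mirroring the style of conclusion reported above.
        print("P(MoE < 100):", (moe < 100).mean(),
              "5th percentile MoE:", np.quantile(moe, 0.05))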

  12. Is it possible to predict the presence of colorectal cancer in a blood test? A probabilistic approach method.

    Navarro Rodríguez, José Manuel; Gallego Plazas, Javier; Borrás Rocher, Fernando; Calpena Rico, Rafael; Ruiz Macia, José Antonio; Morcillo Ródenas, Miguel Ángel

    2017-10-01

    The assessment of the state of immunosurveillance (the ability of the organism to prevent the development of neoplasias) in the blood has prognostic implications of interest in colorectal cancer. We evaluated and quantified a possible predictive character of the disease in a blood test using a mathematical interaction index of several blood parameters. The predictive capacity of the index to detect colorectal cancer was also assessed. We performed a retrospective case-control study of a comparative analysis of the distribution of blood parameters in 266 patients with colorectal cancer and 266 healthy patients during the period from 2009 to 2013. Statistically significant differences (p < 0.05) were observed between patients with colorectal cancer and the control group in terms of platelet counts, fibrinogen, total leukocytes, neutrophils, systemic immunovigilance indexes (neutrophil to lymphocyte ratio and platelet to lymphocyte ratio), hemoglobin, hematocrit and eosinophil levels. These differences allowed the design of a blood analytical profile that calculates the risk of colorectal cancer. This risk profile can be quantified via a mathematical formula with a probabilistic capacity to identify patients with the highest risk of the presence of colorectal cancer (area under the ROC curve = 0.85). We showed that a colorectal cancer predictive character exists in blood which can be quantified by an interaction index of several blood parameters. The design and development of interaction indexes of blood parameters constitutes an interesting research line for the development and improvement of programs for the screening of colorectal cancer.
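
    For illustration, a sketch of fitting an interaction index of blood parameters by logistic regression and scoring it with the area under the ROC curve; the synthetic case/control data merely mimic the direction of the reported differences:

        # Sketch: a blood-parameter risk index fitted and scored by ROC AUC.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(5)
        n = 266
        # Columns: NLR, PLR, hemoglobin (g/dL); invented distributions.
        controls = np.column_stack([rng.normal(2.0, 0.6, n),
                                    rng.normal(120, 30, n),
                                    rng.normal(14.5, 1.2, n)])
        cases = np.column_stack([rng.normal(3.2, 1.0, n),
                                 rng.normal(180, 50, n),
                                 rng.normal(12.5, 1.5, n)])

        X = np.vstack([controls, cases])
        y = np.r_[np.zeros(n), np.ones(n)]
        model = LogisticRegression(max_iter=1000).fit(X, y)
        risk = model.predict_proba(X)[:, 1]              # the "interaction index"
        print("AUC:", round(roc_auc_score(y, risk), 3))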

  14. A probabilistic approach to the assessment of some life history pattern parameters in a Middle Pleistocene human population.

    Durand, A I; Ipina, S L; Bermúdez de Castro, J M

    2000-06-01

    Parameters of a Middle Pleistocene human population such as the expected length of the female reproductive period (E(Y)), the expected interbirth interval (E(X)), the survival rate (τ) for females after the expected reproductive period, the rate (φ2) of women who, given that they reach first birth, do not survive to the end of the expected reproductive period, and the female infant plus juvenile mortality rate (φ1) have been assessed from a probabilistic standpoint, provided that such a population were stationary. The hominid sample studied, from the Sima de los Huesos (SH) cave site, Sierra de Atapuerca (Spain), is the most exhaustive human fossil sample currently available. Results suggest that the Atapuerca (SH) sample can derive from a stationary population. Further, in the case that the expected reproductive period ends between 37 and 40 yr of age, then 24 ≲ E(Y) ≲ 27 yr, E(X) = 3 yr, 0.224

  15. Absorption systems at z ˜ 2 as a probe of the circum galactic medium: a probabilistic approach

    Mongardi, C.; Viel, M.; D'Odorico, V.; Kim, T.-S.; Barai, P.; Murante, G.; Monaco, P.

    2018-05-01

    We characterize the properties of the intergalactic medium (IGM) around a sample of galaxies extracted from state-of-the-art hydrodynamical simulations of structure formation in a cosmological volume of 25 comoving Mpc at z ˜ 2. The simulations are based on two different sub-resolution schemes for star formation and supernova feedback: the MUlti-Phase Particle Integrator (MUPPI) scheme and the Effective Model. We develop a quantitative and probabilistic analysis, based on the apparent optical depth method, of the properties of the absorbers as a function of impact parameter from their nearby galaxies: in such a way we probe different environments, from the circumgalactic medium (CGM) to low density filaments. Absorbers' properties are then compared with a spectroscopic observational data set obtained from high resolution quasar spectra. Our main focus is on the N(C IV)-N(H I) relation around simulated galaxies: the results obtained with MUPPI and the Effective Model are remarkably similar, with small differences only confined to regions at impact parameters b = [1-3] × rvir. Using C IV as a tracer of the metallicity, we obtain evidence that the observed metal absorption systems have the highest probability to be confined in a region of 150-400 kpc around galaxies. Near-filament environments have instead metallicities too low to be probed by present-day telescopes, but could be probed by future spectroscopic studies. Finally we compute C IV covering fractions, which are in agreement with observational data.

  16. A probabilistic approach to combining smart meter and electric vehicle charging data to investigate distribution network impacts

    Neaimeh, Myriam; Wardle, Robin; Jenkins, Andrew M.; Yi, Jialiang; Hill, Graeme; Lyons, Padraig F.; Hübner, Yvonne; Blythe, Phil T.; Taylor, Phil C.

    2015-01-01

    Highlights: • Working with unique datasets of EV charging and smart meter load demand. • Distribution networks are not a homogeneous group and have more capability to accommodate EVs than previously suggested. • Spatial and temporal diversity of EV charging demand alleviates the impacts on networks. • An extensive recharging infrastructure could enable connection of additional EVs on constrained distribution networks. • Electric utilities could increase the network capability to accommodate EVs by investing in recharging infrastructure. - Abstract: This work uses a probabilistic method to combine two unique datasets of real-world electric vehicle charging profiles and residential smart meter load demand. The data were used to study the impact of the uptake of Electric Vehicles (EVs) on electricity distribution networks. Two real networks representing an urban and a rural area, and a generic network representative of a heavily loaded UK distribution network, were used. The findings show that distribution networks are not a homogeneous group: their capabilities to accommodate EVs vary, and there is greater capability than previous studies have suggested. Consideration of the spatial and temporal diversity of EV charging demand has been demonstrated to reduce the estimated impacts on the distribution networks. It is suggested that distribution network operators could collaborate with new market players, such as charging infrastructure operators, to support the roll-out of an extensive charging infrastructure in a way that makes the network more robust, creates more opportunities for demand-side management, and reduces planning uncertainties associated with the stochastic nature of EV charging demand.
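
    A minimal sketch of the probabilistic combination step: sampled EV charging events are added to sampled household profiles and the feeder peak is compared with a transformer rating; all profiles and parameters are synthetic placeholders for the real datasets:

        # Sketch: Monte Carlo combination of household load and EV charging.
        import numpy as np

        rng = np.random.default_rng(42)
        n_homes, n_trials, rating_kw = 100, 2_000, 170.0

        peaks = np.zeros(n_trials)
        for t in range(n_trials):
            household = rng.lognormal(np.log(0.8), 0.5, (n_homes, 48))  # half-hourly kW
            ev_owner = rng.random(n_homes) < 0.3                # assumed 30% EV uptake
            start = rng.integers(34, 44, n_homes)               # evening plug-in slot
            for h in np.where(ev_owner)[0]:
                household[h, start[h]:start[h] + 6] += 3.5      # 3.5 kW, 3-hour charge
            peaks[t] = household.sum(axis=0).max()              # feeder peak this trial

        print("P(feeder peak > rating):", (peaks > rating_kw).mean())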

  17. Integration of anatomical and external response mappings explains crossing effects in tactile localization: A probabilistic modeling approach.

    Badde, Stephanie; Heed, Tobias; Röder, Brigitte

    2016-04-01

    To act upon a tactile stimulus its original skin-based, anatomical spatial code has to be transformed into an external, posture-dependent reference frame, a process known as tactile remapping. When the limbs are crossed, anatomical and external location codes are in conflict, leading to a decline in tactile localization accuracy. It is unknown whether this impairment originates from the integration of the resulting external localization response with the original, anatomical one or from a failure of tactile remapping in crossed postures. We fitted probabilistic models based on these diverging accounts to the data from three tactile localization experiments. Hand crossing disturbed tactile left-right location choices in all experiments. Furthermore, the size of these crossing effects was modulated by stimulus configuration and task instructions. The best model accounted for these results by integration of the external response mapping with the original, anatomical one, while applying identical integration weights for uncrossed and crossed postures. Thus, the model explained the data without assuming failures of remapping. Moreover, performance differences across tasks were accounted for by non-individual parameter adjustments, indicating that individual participants' task adaptation results from one common functional mechanism. These results suggest that remapping is an automatic and accurate process, and that the observed localization impairments in touch result from a cognitively controlled integration process that combines anatomically and externally coded responses.
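
    A minimal sketch of the best-fitting account: responses integrate the anatomical and external location codes with posture-independent weights, so crossed-hands impairment emerges from code conflict rather than remapping failure; probabilities and weights are invented:

        # Sketch: weighted integration of anatomical and external response codes.
        def integrate(p_anat, p_ext, w_anat=0.6):
            """Probability of a given response after weighted integration."""
            return w_anat * p_anat + (1.0 - w_anat) * p_ext

        # Uncrossed: both codes agree; crossed: the external code conflicts.
        p_uncrossed = integrate(p_anat=0.9, p_ext=0.9)
        p_crossed = integrate(p_anat=0.9, p_ext=0.1)
        # Identical weights in both postures, yet crossed accuracy drops.
        print("uncrossed:", p_uncrossed, "crossed:", p_crossed)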

  18. Blind RRT: A probabilistically complete distributed RRT

    Rodriguez, Cesar; Denny, Jory; Jacobs, Sam Ade; Thomas, Shawna; Amato, Nancy M.

    2013-01-01

    Rapidly-Exploring Random Trees (RRTs) have been successful at finding feasible solutions for many types of problems. With motion planning becoming more computationally demanding, we turn to parallel motion planning for efficient solutions. Existing work on distributed RRTs has been limited by the overhead that global communication requires. A recent approach, Radial RRT, demonstrated a scalable algorithm that subdivides the space into regions to increase the computation locality. However, if an obstacle completely blocks RRT growth in a region, the planning space is not covered and is thus not probabilistically complete. We present a new algorithm, Blind RRT, which ignores obstacles during initial growth to efficiently explore the entire space. Because obstacles are ignored, free components of the tree become disconnected and fragmented. Blind RRT merges parts of the tree that have become disconnected from the root. We show how this algorithm can be applied to the Radial RRT framework allowing both scalability and effectiveness in motion planning. This method is a probabilistically complete approach to parallel RRTs. We show that our method not only scales but also overcomes the motion planning limitations that Radial RRT has in a series of difficult motion planning tasks. © 2013 IEEE.

  20. Consolidating Data of Global Urban Populations: a Comparative Approach

    Blankespoor, B.; Khan, A.; Selod, H.

    2017-12-01

    Global data on city populations are essential for the study of urbanization, city growth and the spatial distribution of human settlements. Such data are either gathered by combining official estimates of urban populations from across countries or extracted from gridded population models that combine these estimates with geospatial data. These data sources provide varying estimates of urban populations, and each approach has its advantages and limitations. In particular, official figures suffer from a lack of consistency in defining urban units (across both space and time) and often provide data for jurisdictions rather than for functionally meaningful urban areas. Gridded population models, on the other hand, require a user-imposed definition to identify urban areas and are constrained by the modelling techniques and input data employed. To address these drawbacks, we combine these approaches by consolidating information from three established sources: (i) Citypopulation.de (Brinkhoff, 2016); (ii) the World Urban Prospects data (United Nations, 2014); and (iii) the Global Human Settlements population grid (GHS-POP) (EC - JRC, 2015). We create urban footprints with GHS-POP and spatially merge georeferenced city points from both the UN WUP and Citypopulation.de with these footprints to identify city points that belong to a single agglomeration. We then create a consolidated dataset by combining population data from the UN WUP and Citypopulation.de. The flexible framework outlined can incorporate information from alternative inputs to identify urban clusters (e.g., night-time lights, built-up area, or alternative gridded population models such as WorldPop or LandScan), and the parameters employed (e.g., density thresholds for urban footprints) may also be adjusted as a function of city-specific characteristics. Our consolidated dataset provides a wider and more accurate coverage of city populations to support studies of urbanization. We apply the data to re

  1. Arbitrage and Hedging in a non probabilistic framework

    Alvarez, Alexander; Ferrando, Sebastian; Olivares, Pablo

    2011-01-01

    The paper studies the concepts of hedging and arbitrage in a non-probabilistic framework. It provides conditions for non-probabilistic arbitrage based on the topological structure of the trajectory space and makes connections with the usual notion of arbitrage. Several examples illustrate non-probabilistic arbitrage as well as perfect replication of options under continuous and discontinuous trajectories; the results can then be applied in probabilistic models path by path. The approach is r...

  2. Risk-based probabilistic approach to assess the impact of false mussel invasions on farmed hard clams.

    Liao, Chung-Min; Ju, Yun-Ru; Chio, Chia-Pin; Chen, Wei-Yu

    2010-02-01

    The purpose of this article is to provide a risk-based predictive model to assess the impact of false mussel Mytilopsis sallei invasions on hard clam Meretrix lusoria farms in the southwestern region of Taiwan. The actual spread of the invasive false mussel was predicted using analytical models based on advection-diffusion and gravity models. The proportion of hard clams colonized and the infestation by false mussel were used to characterize risk estimates. A mortality model was parameterized to assess hard clam mortality risk characterized by false mussel density and infestation intensity. Published data were reanalyzed to parameterize a predictive threshold model, described by a cumulative Weibull distribution function, that can be used to estimate the exceedance thresholds for the proportion of hard clams colonized and for infestation. Results indicated that the infestation thresholds were 2-17 ind clam(-1) for adult hard clams and 4 ind clam(-1) for nursery hard clams. The average colonization thresholds were estimated to be 81-89% for cultivated and nursery hard clam farms, respectively. Our results indicated that the false mussel density and infestation causing 50% hard clam mortality were 2,812 ind m(-2) and 31 ind clam(-1), respectively. This study further indicated that hard clam farms close to the coastal area have at least a 50% probability of 43% mortality caused by infestation. This study highlights that a probabilistic risk-based framework characterized by probability distributions and risk curves is an effective representation of scientific assessments for farmed hard clams in response to the nonnative false mussel invasion.
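
    The predictive threshold model is a cumulative Weibull distribution function, so the probability that a given infestation intensity exceeds the response threshold follows directly from the fitted scale and shape parameters. A minimal sketch, with placeholder parameters rather than the values fitted in the study:

        import math

        def weibull_cdf(x, scale, shape):
            # F(x) = 1 - exp(-(x/scale)^shape): probability that the response
            # threshold is exceeded at intensity x.
            return 1.0 - math.exp(-((x / scale) ** shape))

        scale, shape = 10.0, 2.0              # illustrative placeholders only
        for intensity in (2, 4, 17, 31):      # ind clam(-1) values from the abstract
            print(f"intensity {intensity:>2}: P(exceed) = {weibull_cdf(intensity, scale, shape):.2f}")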

  3. DB2: a probabilistic approach for accurate detection of tandem duplication breakpoints using paired-end reads.

    Yavaş, Gökhan; Koyutürk, Mehmet; Gould, Meetha P; McMahon, Sarah; LaFramboise, Thomas

    2014-03-05

    With the advent of paired-end high-throughput sequencing, it is now possible to identify various types of structural variation on a genome-wide scale. Although many methods have been proposed for structural variation detection, most do not provide precise boundaries for identified variants. In this paper, we propose a new method, Distribution Based detection of Duplication Boundaries (DB2), for accurate detection of tandem duplication breakpoints, an important class of structural variation, with high precision and recall. Our computational experiments on simulated data show that DB2 outperforms state-of-the-art methods in finding breakpoints of tandem duplications, with a higher positive predictive value (precision) in calling the duplications' presence. In particular, DB2's prediction of tandem duplications is correct 99% of the time even for very noisy data, while narrowing down the space of possible breakpoints to within a margin of 15 to 20 bps on average. Most existing methods provide boundaries in ranges that extend to hundreds of bases, with lower precision values. Our method is also highly robust to varying properties of the sequencing library and to the sizes of the tandem duplications, as shown by its stable precision, recall and mean boundary mismatch performance. We demonstrate our method's efficacy using both simulated paired-end reads and reads generated from a melanoma sample and two ovarian cancer samples. Newly discovered tandem duplications were validated using PCR and Sanger sequencing. Our method, DB2, uses discordantly aligned reads, taking into account the distribution of fragment length, to predict tandem duplications along with their breakpoints on a donor genome. The proposed method fine-tunes the breakpoint calls by applying a novel probabilistic framework that incorporates the empirical fragment length distribution to score each feasible breakpoint. DB2 is implemented in the Java programming language and is freely available
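
    The scoring idea - weight each feasible breakpoint by the likelihood of the fragment lengths it would imply under the fragment-length distribution - can be caricatured as follows. This is a schematic reading of the abstract, not DB2's code: the Gaussian stand-in for the empirical distribution, the coordinate conventions and the toy read pairs are all assumptions.

        import math

        def fragment_pdf(length, mean=400.0, sd=50.0):
            # Stand-in for the empirical fragment-length distribution.
            return math.exp(-0.5 * ((length - mean) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

        def score(bp_end, pairs, bp_start):
            # A discordant pair (left, right) straddles the junction of a tandem
            # duplication (bp_start, bp_end): its fragment runs left -> bp_end,
            # then wraps around to bp_start -> right.
            return sum(math.log(fragment_pdf((bp_end - left) + (right - bp_start)) + 1e-300)
                       for left, right in pairs)

        pairs = [(980, 1080), (1010, 1110), (995, 1095)]   # toy discordant mates
        best = max(range(1200, 1401, 5), key=lambda e: score(e, pairs, bp_start=1000))
        print("best-scoring right breakpoint:", best)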

  4. A global approach to risk management: lessons from the nuclear industry

    Lazo, T.; Kaufer, B.

    2003-01-01

    The industry's nuclear safety experts are continuously striving to minimise the possible risk and extent of a nuclear accident, while nuclear regulatory authorities work to ensure that all safety requirements are met. Relying on a combination of deterministic and probabilistic approaches, they are obtaining positive results in terms of both risk-informed regulation and nuclear safety management. This article addresses this aspect of risk management, as well as the management of radiation exposure risk. It looks into nuclear emergency planning, preparedness and management, and stresses the importance of coordinating potential protection approaches and providing effective communication should a nuclear accident occur. (authors)

  5. Probabilistic methods used in NUSS

    Fischer, J.; Giuliani, P.

    1985-01-01

    Probabilistic considerations are used implicitly or explicitly in all technical areas. In the NUSS codes and guides, design and siting are the two areas where most use is made of these concepts. A brief review of the relevant documents in these two areas is made in this paper. It covers the documents in which either probabilistic considerations are implied or probabilistic approaches are recommended in the evaluation of situations and events. In the siting guides, the review mainly covers the analysis of seismic, hydrological and external man-made events, as well as some aspects of the analysis of extreme meteorological events. Probabilistic methods are recommended in the design guides, but they are not made a requirement. There are several reasons for this, mainly the lack of reliable data and the absence of quantitative safety limits or goals against which to judge the design analysis. As far as practical, engineering judgement should be backed up by quantitative probabilistic analysis. Examples are given, and the concept of design basis as used in the NUSS design guides is explained. (author)

  6. Economic responses to global warming: Prospects for cooperative approaches

    Schelling, T.C.

    1991-01-01

    At the outset, any cooperative approach to global warming will have to reach some rough consensus on two sets of magnitudes and the marginal trade-off between them. One set of magnitudes relates to CO2 production and abatement. It is the cost and difficulties of reducing energy use by households, farms, and industry, and of switching to cleaner fossil fuels or converting to nonfossil energies. These are the kinds of things that economists and engineers, sometimes sociologists and architects, have been working on with special motivation since 1973. The uncertainties remain great, and they increase many-fold when projected to the middle of the next century. But these estimates do receive attention. The other set of magnitudes has to do with the impact of changing climate on economic productivity, on health and comfort, on the quality of life in general, and on the differential rates of progress among countries. These estimates, on which virtually no work was done until recently, are doubly uncertain. In this study the author offers a judgment about the magnitude of the consequences of failing to reduce CO2 emissions drastically below what they would be in the absence of such an effort. The author takes 'drastic' to mean anything between an emissions growth rate half of what it would otherwise be and an emissions growth rate of zero beginning one or two decades from now - that is, annual emissions leveling off within a decade or two. That level would still leave emissions growing at the maximum achieved rate

  7. Historical approach of contemporary understanding of school in globalization

    Parlić-Božović Jasna Lj.

    2015-01-01

    Intercultural education respects and promotes diversity in all areas of human life. This phenomenon indicates that people naturally and spontaneously develop different lifestyles, customs and worldviews. These differences need to be considered a form of wealth. When we talk about education, and about school as a narrower concept in which differences are usually promoted, we have a vision of a community that provides equal opportunities, opposes injustice and discrimination, and strives for the values on which equality is built. This is particularly salient in our conditions, as we are still adapting to the reforms of globalization. In the contemporary world, promoting democracy becomes a key goal of education, as well as of society as a whole. Therefore, the education system should take into account the multicultural character of a society that aims to contribute actively to peaceful coexistence and positive interaction between different cultural groups. Traditionally, there are two approaches in education in this sense: multicultural education, which strives to provide acceptance and tolerance of other cultures through learning about them, and intercultural education, which aims to overcome passive coexistence and achieve a developed and sustainable way of living together in a multicultural society. The latter is achieved through a constructive process of understanding, mutual respect and dialogue among groups of different cultures, ensuring equal opportunities and combating discrimination.

  8. Probabilistic metric spaces

    Schweizer, B

    2005-01-01

    Topics include special classes of probabilistic metric spaces, topologies, and several related structures, such as probabilistic normed and inner-product spaces. 1983 edition, updated with 3 new appendixes. Includes 17 illustrations.

  9. A Bayesian Approach to Integrate Real-Time Data into Probabilistic Risk Analysis of Remediation Efforts in NAPL Sites

    Fernandez-Garcia, D.; Sanchez-Vila, X.; Bolster, D.; Tartakovsky, D. M.

    2010-12-01

    The release of non-aqueous phase liquids (NAPLs) such as petroleum hydrocarbons and chlorinated solvents in the subsurface is a severe source of groundwater and vapor contamination. Because these liquids are essentially immiscible and have low solubility, the contaminants dissolve slowly into groundwater and/or volatilize in the vadose zone, threatening the environment and public health over a long period. Many remediation technologies and strategies have been developed in recent decades for restoring the water quality of these contaminated sites. The failure of an on-site treatment technology is often due to the unnoticed presence of dissolved NAPL entrapped in low-permeability areas (heterogeneity) and/or the persistence of substantial amounts of pure phase after remediation efforts. Fully understanding the impact of remediation efforts is complicated by the many interlinked physical and biochemical processes taking place along several potential pathways of exposure to multiple receptors in a highly uncertain, heterogeneous environment. Due to these difficulties, the design of remediation strategies and the definition of remediation endpoints have traditionally been determined without quantifying the risk associated with the failure of such efforts. We conduct a probabilistic risk analysis (PRA) of the likelihood of success of an on-site NAPL treatment technology that integrates all aspects of the problem (causes, pathways, and receptors) without extensive modeling. Importantly, the method can further incorporate the inherent uncertainty that often exists in the exact location where the dissolved NAPL plume leaves the source zone. This is achieved by describing the failure of the system as a function of this source-zone exit location, parameterized in terms of a vector of parameters. Using a Bayesian interpretation of the system and by means of the posterior multivariate distribution, the failure of the
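
    The shape of the computation the abstract points to can be sketched in a few lines: express the probability of remediation failure as a function of the uncertain source-zone exit location, then average it over the posterior distribution of that location by Monte Carlo. Both the conditional failure model and the Gaussian posterior below are placeholders, not the authors' model.

        import math, random

        def p_fail_given_exit(x):
            # Placeholder: failure becomes likely once the plume exits far from
            # where the treatment was designed to intercept it (logistic form assumed).
            return 1.0 / (1.0 + math.exp(-(abs(x) - 5.0)))

        def marginal_p_fail(n=100_000, mu=3.0, sigma=2.0):
            # Average over the posterior of the exit location (assumed Gaussian).
            return sum(p_fail_given_exit(random.gauss(mu, sigma)) for _ in range(n)) / n

        print(f"P(remediation failure) ~ {marginal_p_fail():.3f}")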

  10. Probabilistic risk assessment methodology

    Shinaishin, M.A.

    1988-06-01

    The objective of this work is to provide the tools necessary for clear identification of: the purpose of a Probabilistic Risk Study, the bounds and depth of the study, the proper modeling techniques to be used, the failure modes contributing to the analysis, the classical and Bayesian approaches for manipulating data necessary for quantification, ways of treating uncertainties, and available computer codes that may be used in performing such probabilistic analyses. In addition, it provides the means for measuring the importance of a safety feature to maintaining a level of risk at a nuclear power plant and the worth of optimizing a safety system for risk reduction. In applying these techniques so that they accommodate our national resources and needs, it was felt that emphasis should be put on the system reliability analysis level of PRA. Objectives of such studies could include: comparing the system designs of the various vendors at the bidding stage, and performing grid reliability and human performance analyses using nation-specific data. (author)

  12. A unified approach to global and local beam position feedback

    Chung, Y.

    1994-01-01

    The Advanced Photon Source (APS) will implement both global and local beam position feedback systems to stabilize the particle and X-ray beams for the storage ring. The global feedback system uses 40 BPMs and 40 correctors per plane. Singular value decomposition (SVD) of the response matrix is used for closed-orbit correction. The local feedback system uses two X-ray BPMs, two rf BPMs, and a four-magnet local bump to control the angle and displacement of the X-ray beam from a bending magnet or an insertion device. Both the global and local feedback systems are based on digital signal processing (DSP) running at a 4-kHz sampling rate with a proportional-integral-derivative (PID) control algorithm. In this paper, we discuss resolution of the conflict among multiple local feedback systems due to local bump closure error, and decoupling of the global and local feedback systems to maximize correction efficiency. In this scheme, the global feedback system absorbs the local bump closure error and the local feedback systems compensate for the effect of global feedback on the local beamlines. The required data sharing between the global and local feedback systems is done through fiber-optically networked reflective memory.
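
    The SVD-based closed-orbit correction mentioned above is a standard computation: solve R*theta = -x for the corrector strengths theta via a truncated pseudo-inverse of the response matrix R. A generic numpy sketch follows; the random 40x40 response matrix and orbit vector are purely illustrative.

        import numpy as np

        rng = np.random.default_rng(0)
        R = rng.normal(size=(40, 40))   # BPM response to the 40 correctors (illustrative)
        x = rng.normal(size=40)         # measured closed-orbit error at the 40 BPMs

        # Truncated SVD pseudo-inverse: small singular values are discarded so that
        # measurement noise is not amplified into large corrector kicks.
        U, s, Vt = np.linalg.svd(R)
        keep = s > 1e-3 * s[0]
        theta = -(Vt.T[:, keep] @ np.diag(1.0 / s[keep]) @ U.T[keep, :]) @ x

        print("residual orbit rms:", np.linalg.norm(R @ theta + x) / np.sqrt(x.size))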

  13. Evaluation Methodology between Globalization and Localization Features Approaches for Skin Cancer Lesions Classification

    Ahmed, H. M.; Al-azawi, R. J.; Abdulhameed, A. A.

    2018-05-01

    Huge efforts have been put into developing diagnostic methods for skin cancer. In this paper, two different approaches are addressed for detecting skin cancer in dermoscopy images. The first approach uses a global method that classifies skin lesions with global features, whereas the second uses a local method that classifies skin lesions with local features. The aim of this paper is to select the best approach for skin lesion classification. The dataset used in this paper consists of 200 dermoscopy images from Pedro Hispano Hospital (PH2). The achieved results are: sensitivity of about 96%, specificity of about 100%, precision of about 100%, and accuracy of about 97% for the globalization approach, and sensitivity, specificity, precision, and accuracy all of about 100% for the localization approach. These results show that the localization approach achieved acceptable accuracy, better than the globalization approach, for skin cancer lesion classification.

  14. Transport of nutrients from land to sea: Global modeling approaches and uncertainty analyses

    Beusen, A.H.W.

    2014-01-01

    This thesis presents four examples of global models developed as part of the Integrated Model to Assess the Global Environment (IMAGE). They describe different components of global biogeochemical cycles of the nutrients nitrogen (N), phosphorus (P) and silicon (Si), with a focus on approaches to

  15. Probabilistic logic networks a comprehensive framework for uncertain inference

    Goertzel, Ben; Goertzel, Izabela Freire; Heljakka, Ari

    2008-01-01

    This comprehensive book describes Probabilistic Logic Networks (PLN), a novel conceptual, mathematical and computational approach to uncertain inference. A broad scope of reasoning types is considered.

  16. Finite element meshing approached as a global minimization process

    WITKOWSKI,WALTER R.; JUNG,JOSEPH; DOHRMANN,CLARK R.; LEUNG,VITUS J.

    2000-03-01

    The ability to generate a suitable finite element mesh in an automatic fashion is becoming the key to automating the entire engineering analysis process. However, placing an all-hexahedron mesh in a general three-dimensional body continues to be an elusive goal. The approach investigated in this research is fundamentally different from any other known to the authors. A physical-analogy viewpoint is used to formulate the meshing problem, which constructs a global mathematical description of the problem. The analogy used was that of minimizing the electrical potential of a system of charged particles within a charged domain. The particles in the presented analogy represent duals to mesh elements (i.e., quads or hexes). Particle movement is governed by a mathematical functional which accounts for inter-particle repulsive, attractive and alignment forces. This functional is minimized to find the optimal location and orientation of each particle. After the particles are connected, a mesh can be easily resolved. The mathematical description for this problem is as easy to formulate in three dimensions as it is in two or one. The meshing algorithm was developed within CoMeT. It can solve the two-dimensional meshing problem for convex and concave geometries in a purely automated fashion. Investigation of the robustness of the technique has shown a success rate of approximately 99% for the two-dimensional geometries tested. Run times to mesh a 100-element complex geometry were typically in the 10-minute range. Efficiency of the technique is still an issue that needs to be addressed; performance is critical for most engineers generating meshes, but it was not for this project. The primary focus of this work was to investigate and evaluate a meshing algorithm/philosophy, with efficiency issues being secondary. The algorithm was also extended to mesh three-dimensional geometries. Unfortunately, only simple geometries were tested
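
    A toy version of the physical analogy - spread particles through a domain by minimizing a pairwise repulsive potential - fits in a few lines. This sketch keeps only inverse-distance repulsion inside a unit square; the attraction and alignment terms of the actual functional, and the subsequent dual-mesh construction, are omitted.

        import numpy as np
        from scipy.optimize import minimize

        def potential(flat, n):
            # Total pairwise inverse-distance repulsion (crude electrical analogy).
            p = flat.reshape(n, 2)
            d = np.linalg.norm(p[:, None, :] - p[None, :, :], axis=-1)
            return np.sum(1.0 / (d[np.triu_indices(n, k=1)] + 1e-9))

        n = 16
        start = np.random.default_rng(1).uniform(0.2, 0.8, size=2 * n)
        res = minimize(potential, start, args=(n,),
                       bounds=[(0.0, 1.0)] * (2 * n), method="L-BFGS-B")
        print(res.x.reshape(n, 2).round(2))   # relaxed particle (dual-node) layout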

  17. Debating Globalization in Social Studies Education: Approaching Globalization Historically and Discursively

    Agbaria, Ayman K.

    2011-01-01

    The purpose of this paper is to explore the dominant positions in the debates on globalization in American social studies education. Specifically, the paper illustrates that, first, globalization is conceived of as more of an unprecedented new age and less of a historical development. Second, it is conceived of as more of a natural process and…

  18. Probabilistic Approaches to Examining Linguistic Features of Test Items and Their Effect on the Performance of English Language Learners

    Solano-Flores, Guillermo

    2014-01-01

    This article addresses validity and fairness in the testing of English language learners (ELLs)--students in the United States who are developing English as a second language. It discusses limitations of current approaches to examining the linguistic features of items and their effect on the performance of ELL students. The article submits that…

  19. Computation of a coastal protection, using classical method, the PIANC-method or a full probabilistic approach ?

    Verhagen, H.J.

    2003-01-01

    In a classical design approach to breakwaters, a design wave height is determined and inserted into a design formula, and some undefined margin of safety is added. In the method using partial safety coefficients (as developed by PIANC [1992] and recently also adopted by the Coastal Engineering Manual of the US
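
    In a full probabilistic approach the designer evaluates the failure probability of a limit state Z = R - S directly, rather than a single design wave height with an undefined safety margin. A generic Monte Carlo sketch follows; the Gumbel wave loading, the lognormal resistance, and all parameter values are assumptions for illustration, not taken from PIANC.

        import math, random

        def annual_failure_probability(n=200_000):
            failures = 0
            for _ in range(n):
                u = random.random() or 1e-12
                # S: annual-maximum significant wave height (assumed Gumbel).
                s = 4.0 - 0.5 * math.log(-math.log(u))
                # R: resistance expressed as the wave height the structure can
                # withstand (assumed lognormal around 6 m).
                r = random.lognormvariate(math.log(6.0), 0.1)
                failures += (r - s) < 0.0
            return failures / n

        print(f"P(failure per year) ~ {annual_failure_probability():.4f}")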

  20. Probabilistic modelling in urban drainage – two approaches that explicitly account for temporal variation of model errors

    Löwe, Roland; Del Giudice, Dario; Mikkelsen, Peter Steen

    of input uncertainties observed in the models. The explicit inclusion of such variations in the modelling process will lead to a better fulfilment of the assumptions made in formal statistical frameworks, thus reducing the need to resolve to informal methods. The two approaches presented here...

  1. Probabilistic costing of transmission services

    Wijayatunga, P.D.C.

    1992-01-01

    Costing of transmission services of electrical utilities is required for transactions involving the transport of energy over a power network. The calculation of these costs based on Short Run Marginal Costing (SRMC) is preferred over other methods proposed in the literature due to its economic efficiency. In the research work discussed here, the concept of probabilistic costing of use-of-system based on SRMC, which emerges as a consequence of the uncertainties in a power system, is introduced using two different approaches. The first approach, based on the Monte Carlo method, generates a large number of possible system states by simulating random variables in the system using pseudo-random number generators. A second approach to probabilistic use-of-system costing is proposed based on numerical convolution and a multi-area representation of the transmission network. (UK)
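
    The Monte Carlo approach described first reduces, in caricature, to sampling random system states and averaging the short-run marginal cost of serving demand in each state. The two-state system, outage rate and cost figures below are invented purely to show the shape of the loop.

        import random

        def expected_use_of_system_cost(n=100_000):
            total = 0.0
            for _ in range(n):
                line_up = random.random() > 0.02               # 2% forced-outage rate (assumed)
                demand = max(random.gauss(100.0, 15.0), 0.0)   # MW (assumed)
                marginal = 1.2 if line_up else 3.5             # $/MWh with/without the line
                total += marginal * demand
            return total / n

        print(f"expected use-of-system cost ~ {expected_use_of_system_cost():.0f} $/h")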

  2. Adequacy of the default values for skin surface area used for risk assessment and French anthropometric data by a probabilistic approach.

    Dornic, N; Ficheux, A S; Bernard, A; Roudot, A C

    2017-08-01

    The notes of guidance for the testing of cosmetic ingredients and their safety evaluation by the Scientific Committee on Consumer Safety (SCCS) are a document dedicated to ensuring the safety of European consumers. They contain useful data for risk assessment, such as default values for skin surface area (SSA). A more in-depth study of anthropometric data across Europe reveals considerable variation. The default SSA value was derived from a study on the Dutch population, which is known to be one of the tallest in the world. This value could be inadequate for the shorter populations of Europe. Data were collected in a survey on cosmetic consumption in France. Probabilistic treatment of these data, and analysis of the case of methylisothiazolinone, a sensitizer recently evaluated by a deterministic approach submitted to the SCCS, suggest that the default SSA value used in quantitative risk assessment might not be relevant for a significant share of the French female population. Other female populations of Southern Europe may also be excluded. This is important given that some studies show an increasing risk of developing skin sensitization among women. The disparities in anthropometric data across Europe should be taken into consideration. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Probabilistic approach to decision making under uncertainty during volcanic crises. Retrospective analysis of the 2011 eruption of El Hierro, in the Canary Islands

    Sobradelo, Rosa; Martí, Joan; Kilburn, Christopher; López, Carmen

    2014-05-01

    Understanding the potential evolution of a volcanic crisis is crucial to improving the design of effective mitigation strategies. This is especially the case for volcanoes close to densely-populated regions, where inappropriate decisions may trigger widespread loss of life, economic disruption and public distress. An outstanding goal for improving the management of volcanic crises, therefore, is to develop objective, real-time methodologies for evaluating how an emergency will develop and how scientists communicate with decision makers. Here we present a new model, BADEMO (Bayesian Decision Model), that applies a general and flexible probabilistic approach to managing volcanic crises. The model combines the hazard and risk factors that decision makers need for a holistic analysis of a volcanic crisis. These factors include eruption scenarios and their probabilities of occurrence, the vulnerability of populations and their activities, and the costs of false alarms and failed forecasts. The model can be implemented before an emergency, to identify actions for reducing the vulnerability of a district; during an emergency, to identify the optimum mitigating actions and how these may change as new information is obtained; and after an emergency, to assess the effectiveness of a mitigating response and, from the results, to improve strategies before another crisis occurs. As illustrated by a retrospective analysis of the 2011 eruption of El Hierro, in the Canary Islands, BADEMO provides the basis for quantifying the uncertainty associated with each recommended action as an emergency evolves, and serves as a mechanism for improving communications between scientists and decision makers.

  4. A probabilistic approach for assessing the vulnerability of transportation infrastructure to flooding from sea level rise and storm surge.

    Douglas, E. M.; Kirshen, P. H.; Bosma, K.; Watson, C.; Miller, S.; McArthur, K.

    2015-12-01

    There now exists a plethora of information attesting to the reality of our changing climate and its impacts on both human and natural systems. There also exists a growing literature linking climate change impacts and transportation infrastructure (highways, bridges, tunnels, railway, shipping ports, etc.) which largely agrees that the nation's transportation systems are vulnerable. To assess this vulnerability along the coast, flooding due to sea level rise and storm surge has most commonly been evaluated by simply increasing the water surface elevation and then estimating flood depth by comparing the new water surface elevation with the topographic elevations of the land surface. While this rudimentary "bathtub" approach may provide a first order identification of potential areas of vulnerability, accurate assessment requires a high resolution, physically-based hydrodynamic model that can simulate inundation due to the combined effects of sea level rise, storm surge, tides and wave action for site-specific locations. Furthermore, neither the "bathtub" approach nor other scenario-based approaches can quantify the probability of flooding due to these impacts. We developed a high resolution coupled ocean circulation-wave model (ADCIRC/SWAN) that utilizes a Monte Carlo approach for predicting the depths and associated exceedance probabilities of flooding due to both tropical (hurricanes) and extra-tropical storms under current and future climate conditions. This required the development of an entirely new database of meteorological forcing (e.g. pressure, wind speed, etc.) for historical Nor'easters in the North Atlantic basin. Flooding due to hurricanes and Nor'easters was simulated separately and then composite flood probability distributions were developed. Model results were used to assess the vulnerability of the Central Artery/Tunnel system in Boston, Massachusetts to coastal flooding now and in the future. Local and regional adaptation strategies were
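
    Combining the separately simulated hurricane and Nor'easter results into composite flood probabilities can be illustrated with the standard identity for independent annual exceedance: P = 1 - (1 - Ph)(1 - Pn). A small sketch with made-up exceedance curves:

        # Annual exceedance probabilities of given flood depths, estimated
        # separately from hurricane and Nor'easter simulations (made-up numbers).
        depths      = [0.5, 1.0, 1.5, 2.0]       # metres above datum
        p_hurricane = [0.20, 0.08, 0.03, 0.010]
        p_noreaster = [0.30, 0.10, 0.02, 0.005]

        # Assuming the two storm populations are independent:
        for d, ph, pn in zip(depths, p_hurricane, p_noreaster):
            print(f"{d:.1f} m: composite annual P = {1 - (1 - ph) * (1 - pn):.3f}")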

  5. Some thoughts on the future of probabilistic structural design of nuclear components

    Stancampiano, P.A.

    1978-01-01

    This paper presents some views on the future role of probabilistic methods in the structural design of nuclear components. The existing deterministic design approach is discussed and compared to the probabilistic approach. Some of the objections to both deterministic and probabilistic design are listed. Extensive research and development activities are required to mature the probabilistic approach sufficiently to make it cost-effective and competitive with current deterministic design practices. The required research activities deal with the development of probabilistic methods, of more realistic causal failure mode models, and of statistical data models. A quasi-probabilistic structural design approach is recommended which accounts for the random error in the design models. (Auth.)

  6. Probabilistic escalation modelling

    Korneliussen, G.; Eknes, M.L.; Haugen, K.; Selmer-Olsen, S. [Det Norske Veritas, Oslo (Norway)

    1997-12-31

    This paper describes how structural reliability methods may successfully be applied within quantitative risk assessment (QRA) as an alternative to traditional event tree analysis. The emphasis is on fire escalation in hydrocarbon production and processing facilities. This choice was made due to potential improvements over current QRA practice associated with both the probabilistic approach and more detailed modelling of the dynamics of escalating events. The physical phenomena important for the events of interest are explicitly modelled as functions of time. Uncertainties are represented through probability distributions. The uncertainty modelling enables the analysis to be simple when possible and detailed when necessary. The methodology features several advantages compared with traditional risk calculations based on event trees. (Author)

  7. Probabilistic fracture finite elements

    Liu, W. K.; Belytschko, T.; Lua, Y. J.

    1991-05-01

    Probabilistic Fracture Mechanics (PFM) is a promising method for estimating the fatigue life and inspection cycles for mechanical and structural components. The Probability Finite Element Method (PFEM), which is based on second-moment analysis, has proved to be a promising, practical approach to handling problems with uncertainties. As the PFEM provides a powerful computational tool to determine the first and second moments of random parameters, the second-moment reliability method can easily be combined with the PFEM to obtain measures of the reliability of a structural system. The method is also being applied to fatigue crack growth. Uncertainties in the material properties of advanced materials such as polycrystalline alloys, ceramics, and composites are commonly observed in experimental tests. This is mainly attributed to intrinsic microcracks, which are randomly distributed as a result of the applied load and the residual stress.
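
    For the simplest case of a linear limit state g = R - S with independent normal R and S, the second-moment machinery reduces to the Cornell reliability index, which is the kind of quantity the PFEM moments feed into. A worked sketch with illustrative moments:

        import math

        mu_R, sigma_R = 300.0, 30.0   # resistance moments (illustrative), MPa
        mu_S, sigma_S = 200.0, 40.0   # load-effect moments (illustrative), MPa

        # Cornell reliability index and the corresponding failure probability
        # P_f = Phi(-beta) for g = R - S with independent normal R and S.
        beta = (mu_R - mu_S) / math.sqrt(sigma_R**2 + sigma_S**2)
        p_f = 0.5 * math.erfc(beta / math.sqrt(2))
        print(f"beta = {beta:.2f}, P_f = {p_f:.2e}")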

  8. Probabilistic retinal vessel segmentation

    Wu, Chang-Hua; Agam, Gady

    2007-03-01

    Optic fundus assessment is widely used for diagnosing vascular and non-vascular pathology. Inspection of the retinal vasculature may reveal hypertension, diabetes, arteriosclerosis, cardiovascular disease and stroke. Due to various imaging conditions, retinal images may be degraded. Consequently, the enhancement of such images and of the vessels in them is an important task with direct clinical applications. We propose a novel technique for vessel enhancement in retinal images that is capable of enhancing vessel junctions in addition to linear vessel segments. This is an extension of vessel filters we have previously developed for vessel enhancement in thoracic CT scans. The proposed approach is based on probabilistic models which can discern vessels and junctions. Evaluation shows the proposed filter is better than several known techniques and is comparable to the state of the art when evaluated on a standard dataset. A ridge-based vessel tracking process is applied on the enhanced image to demonstrate the effectiveness of the enhancement filter.

  9. Developing a Korean standard brain atlas on the basis of statistical and probabilistic approach and visualization tool for functional image analysis

    Koo, B. B.; Lee, J. M.; Kim, J. S.; Kim, I. Y.; Kim, S. I. [Hanyang University, Seoul (Korea, Republic of); Lee, J. S.; Lee, D. S.; Kwon, J. S. [Seoul National University College of Medicine, Seoul (Korea, Republic of); Kim, J. J. [Yonsei University College of Medicine, Seoul (Korea, Republic of)

    2003-06-01

    Probabilistic anatomical maps are used to localize functional neuro-images and morphological variability. A quantitative indicator is very important for identifying the anatomical position of an activated region, because functional image data have low resolution and no inherent anatomical information. Although the previously developed MNI probabilistic anatomical map was adequate for localizing such data, it was not suitable for Korean brains because of the morphological differences between Occidental and Oriental populations. In this study, we develop a probabilistic anatomical map for the Korean normal brain. Seventy-five normal brains of T1-weighted spoiled gradient echo magnetic resonance images were acquired on a 1.5-T GE SIGNA scanner. A standard brain was then selected from the group by a clinician searching for a brain of average properties in the Talairach coordinate system. On the standard brain, an anatomist delineated 89 regions of interest (ROIs), parcellating cortical and subcortical areas. The parcellated ROIs of the standard brain were warped onto and overlapped with each brain by maximizing intensity similarity, and every brain was automatically labeled with the registered ROIs. Each same-labeled region was linearly normalized to the standard brain, and the occurrence of each region was counted. Finally, 89 probabilistic ROI volumes were generated. This paper presents a probabilistic anatomical map for localizing the functional and structural analysis of the Korean normal brain. In the future, we will develop group-specific probabilistic anatomical maps for OCD and schizophrenia.
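
    The final step - counting, voxel by voxel, how often each label occurs across the registered brains - is a simple computation. A schematic numpy version, with random toy label volumes standing in for the 75 registered, labeled subjects:

        import numpy as np

        rng = np.random.default_rng(0)
        n_subjects, shape, n_rois = 75, (8, 8, 8), 89

        # Toy stand-in: one integer label volume per registered subject
        # (0 = background, 1..n_rois = anatomical regions).
        labels = rng.integers(0, n_rois + 1, size=(n_subjects, *shape))

        # Probabilistic map for one ROI: per-voxel fraction of subjects carrying it.
        roi = 7
        prob_map = (labels == roi).mean(axis=0)
        print("max voxelwise probability for ROI", roi, "=", prob_map.max())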

  11. Deterministic global optimization an introduction to the diagonal approach

    Sergeyev, Yaroslav D

    2017-01-01

    This book begins with a concentrated introduction to deterministic global optimization and moves forward to present new original results from the authors, who are well-known experts in the field. Multiextremal continuous problems that have an unknown structure, with Lipschitz objective functions and functions having Lipschitz first derivatives, defined over hyperintervals, are examined. A class of algorithms using several Lipschitz constants is introduced, which has its origins in the DIRECT (DIviding RECTangles) method. This new class is based on an efficient strategy that is applied for partitioning the search domain. In addition, a survey of derivative-free methods and methods using first derivatives is given for both the one-dimensional and multi-dimensional cases. Non-smooth and smooth minorants and acceleration techniques that can speed up several classes of global optimization methods are discussed, with examples of applications and problems arising in the numerical testing of global optimization algorithms...

  12. Global warming policy: A coherent-sequential approach

    Manicke, R.L.

    1996-01-01

    This paper addresses two closely related themes: (1) the need to structure and evaluate global climate policy sequentially, and (2) the need to incorporate the analysis of real options, which may contribute significantly to global climate policy. The paper is organized into four sections. The first deals with benefit-cost analysis and capital budgeting as they are generally practiced and discusses the reasons why the traditional benefit-cost formulation is inadequate. The second discusses the case of one financial option, namely the European call option, and presents some important results. The third addresses some of the important results and principles derived in the literature on real options; while most of the mathematics is not easily transferred nor relevant to global climate policy, there are many principles that can be applied. In the fourth section the author discusses the implications of a real-options environment for the policy process
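
    The European call option discussed in the second section has the closed-form Black-Scholes value, the usual reference point for the real-options principles the paper draws on. A compact implementation, with illustrative parameters:

        import math

        def norm_cdf(x):
            return 0.5 * math.erfc(-x / math.sqrt(2))

        def black_scholes_call(S, K, T, r, sigma):
            # C = S*N(d1) - K*exp(-rT)*N(d2)
            d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
            d2 = d1 - sigma * math.sqrt(T)
            return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

        # Spot 100, strike 100, 1 year, 5% rate, 20% volatility (illustrative).
        print(f"call value = {black_scholes_call(100, 100, 1.0, 0.05, 0.2):.2f}")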

  13. Probabilistic machine learning and artificial intelligence.

    Ghahramani, Zoubin

    2015-05-28

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  15. Probabilistic assessment of nuclear safety and safeguards

    Higson, D.J.

    1987-01-01

    Nuclear reactor accidents and diversions of materials from the nuclear fuel cycle are perceived by many people as particularly serious threats to society. Probabilistic assessment is a rational approach to the evaluation of both threats, and may provide a basis for decisions on appropriate actions to control them. Probabilistic methods have become standard tools used in the analysis of safety, but there are disagreements on the criteria to be applied when assessing the results of analysis. Probabilistic analysis and assessment of the effectiveness of nuclear material safeguards are still at an early stage of development. (author)

  16. Integrated Deterministic-Probabilistic Safety Assessment Methodologies

    Kudinov, P.; Vorobyev, Y.; Sanchez-Perea, M.; Queral, C.; Jimenez Varas, G.; Rebollo, M. J.; Mena, L.; Gomez-Magin, J.

    2014-02-01

    IDPSA (Integrated Deterministic-Probabilistic Safety Assessment) is a family of methods which use tightly coupled probabilistic and deterministic approaches to address their respective sources of uncertainty, enabling risk-informed decision making in a consistent manner. The starting point of the IDPSA framework is that safety justification must be based on the coupling of deterministic (consequences) and probabilistic (frequency) considerations to address the mutual interactions between stochastic disturbances (e.g. failures of equipment, human actions, stochastic physical phenomena) and the deterministic response of the plant (i.e. transients). This paper gives a general overview of some IDPSA methods as well as some possible applications to PWR safety analyses. (Author)

  17. Approaches to the Understanding of the Global Governance

    Кукарцев, Олег Вікторович

    2016-01-01

    A central issue in this paper is the origin and specific character of the global governance discourse that formed at the turn of the twenty-first century. The new discourse marked a gradual displacement of most earlier discourses, «international relations» and «world government». It is demonstrated that the new term «global governance» signals a grand reconstruction of the previous international relations discourse, in which international society was considered as the complex of states and their governme...

  18. Network approach for decision making under risk—How do we choose among probabilistic options with the same expected value?

    Pan, Wei; Chen, Yi-Shin

    2018-01-01

    Conventional decision theory suggests that under risk, people choose option(s) by maximizing the expected utility. However, theories deal ambiguously with different options that have the same expected utility. A network approach is proposed by introducing 'goal' and 'time' factors to reduce the ambiguity in strategies for calculating the time-dependent probability of reaching a goal. As such, a mathematical foundation that explains the irrational behavior of choosing an option with a lower expected utility is revealed, which could imply that humans possess rationality in foresight. PMID:29702665
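
    The tie-breaking idea - two options with equal expected value can differ sharply in the probability of reaching a goal within a time horizon - is easy to demonstrate by simulation. A sketch under assumed gamble parameters, not the paper's model:

        import random

        def p_reach_goal(win, p, goal=10.0, horizon=20, trials=50_000):
            # Probability that repeated plays of a gamble paying `win` with
            # probability p (else 0) accumulate `goal` within `horizon` plays.
            hits = 0
            for _ in range(trials):
                wealth = 0.0
                for _ in range(horizon):
                    if random.random() < p:
                        wealth += win
                    if wealth >= goal:
                        hits += 1
                        break
            return hits / trials

        # Both options pay 1 per play in expectation, yet differ under a goal.
        print("safe  (1 w.p. 1.0):", p_reach_goal(1.0, 1.0))
        print("risky (5 w.p. 0.2):", p_reach_goal(5.0, 0.2))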

  1. Joining the UN global compact in Spain: an institutional approach

    Ainhoa Garayar Erro

    2012-12-01

    The aim of this study is to analyse the incentives for Spanish organisations that have adopted a voluntary code of conduct such as the United Nations Global Compact (GC). In the light of the sociological approach of neoinstitutional theory, we sought to determine the main isomorphic processes that result from joining the GC and the factors of the institutional field that might undermine the legitimacy of organisations participating in this voluntary initiative. On the one hand, the main results, while not conclusive, showed that Spanish participants in the GC have more than one reason for joining the initiative. The findings suggest that both mimetic and normative institutional isomorphism explain participation in the GC, with respondents emphasizing improvements in employee and customer satisfaction as well as being part of sustainable development efforts. On the other hand, the respondent organisations do not perceive major threats from the institutional environment that might undermine the legitimacy gained by joining the GC. Nonetheless, they highlight weak governance and corruption as the main institutional factor that might undermine an organisation's legitimacy.

  2. Conflicts over natural resources in the Global South : conceptual approaches

    Bavinck, M.; Pellegrini, L.; Mostert, E.

    2014-01-01

    Inhabitants of poor, rural areas in the Global South heavily depend on natural resources in their immediate vicinity. Conflicts over and exploitation of these resources - whether it is water, fish, wood fuel, minerals, or land - severely affect their livelihoods. The contributors to this volume

  3. Global Challenges and Threats: European and US Approaches

    Antonio Marquina

    2010-01-01

    This article presents the similarities and differences in the security approaches of the European Union and the United States, as well as their implications for NATO. The European Security Strategy emphasizes global challenges and threats, relegating the traditional security problems of the European periphery to the background. The United States, for its part, as a global military power, tends to consider European security problems in a more global context. The article reviews the policies put in place by the European Union and the United States to address global challenges, and explains their similarities and differences in order to understand the crucial problems that NATO member states must tackle to give consistency and permanence to the new NATO strategic concept now being drafted.

  4. Study Behaviour: A counselling approach

    Okpechi

    The researcher recommended that students take their studies seriously, as their failure and success depend on it. He equally draws students' attention to the essentials of study behaviour: time management, organisation of study tasks, etc. Global Journal of Educational Research Vol. 5 (1&2) 2006: pp. 5-11

  5. Probabilistic Logical Characterization

    Hermanns, Holger; Parma, Augusto; Segala, Roberto

    2011-01-01

    Probabilistic automata exhibit both probabilistic and non-deterministic choice. They are therefore a powerful semantic foundation for modeling concurrent systems with random phenomena arising in many applications ranging from artificial intelligence, security and systems biology to performance modeling. Several variations of bisimulation and simulation relations have proved to be useful as means to abstract and compare different automata. This paper develops a taxonomy of logical characterizations of these relations on image-finite and image-infinite probabilistic automata.

  6. Conditional Probabilistic Population Forecasting

    Sanderson, W.C.; Scherbov, S.; O'Neill, B.C.; Lutz, W.

    2003-01-01

    Since policy makers often prefer to think in terms of scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy makers because it allows them to answer "what if"...
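
    The probabilistic analog of scenario analysis amounts to conditioning a simulated ensemble: keep only the trajectories that satisfy the scenario and read off conditional quantiles. A schematic sketch with a toy stochastic population model (all parameters invented):

        import random

        def simulate_population(years=50, p0=100.0):
            # Toy model: annual growth rate ~ N(0.5%, 0.8%).
            path, p = [], p0
            for _ in range(years):
                p *= 1.0 + random.gauss(0.005, 0.008)
                path.append(p)
            return path

        paths = [simulate_population() for _ in range(20_000)]

        # Conditional forecast: population at year 50, given the scenario
        # "population exceeds 110 (million, say) at year 25".
        cond = sorted(p[-1] for p in paths if p[24] > 110.0)
        if cond:
            print(f"{len(cond)} paths match; conditional median at year 50 = "
                  f"{cond[len(cond) // 2]:.1f}")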

  9. Approaches to Building Global Strategic Deterrence System after 2021

    Vitaliy V. Kabernik

    2016-01-01

    The article studies prospects for the transformation of the current global deterrence system in the 21st century, paying special attention to the structure of treaties past 2021. After the expiration of the mainstay arms control treaty (New START), development of a new system of treaties and agreements seems inevitable, quite possibly on a multilateral basis. The hypothesis stressing the possibility of a multilateral deterrence system for global stability is quite popular nowadays. Studying the dynamics of nuclear arms cuts and monitoring progress on the New START treaty, we can see numerous positive effects. However, the nuclear modernization programs currently in progress or planned for the near future should be taken into account in future agreements. This is where geospatial analysis is important, demonstrating effectively which states are deterring each other and for which ones this is simply impossible because of the available weapons delivery range. This analysis is performed for three possible candidates for future multilateral treaties: the USA, Russia and China, mentioning Great Britain and France as well. Going further into the geospatial analysis, the strategic ABM factor is accounted for, and the role of global ABM is estimated for future treaties. Numerical estimates of the nuclear potentials of third countries, incomparable to the current numbers in possession of the two main nuclear powers, are performed specifically. Based on the analysis provided, we can effectively deny the possibility of multilateral agreements for future deterrence scenarios. However, some steps for involving third countries in the global process of nuclear regulation can be outlined. This includes a number of bilateral agreements for arms control in certain regions, specifically developed to form a system of treaties aimed at global tension reduction, moving towards a safer world in the 21st century.

  11. Estimates of dietary exposure to bisphenol A (BPA) from light metal packaging using food consumption and packaging usage data: a refined deterministic approach and a fully probabilistic (FACET) approach.

    Oldring, P K T; Castle, L; O'Mahony, C; Dixon, J

    2014-01-01

    The FACET tool is a probabilistic model to estimate exposure to chemicals in foodstuffs, originating from flavours, additives and food contact materials. This paper demonstrates the use of the FACET tool to estimate exposure to BPA (bisphenol A) from light metal packaging. For exposure to migrants from food packaging, FACET uses industry-supplied data on the occurrence of substances in the packaging, their concentrations and construction of the packaging, which were combined with data from a market research organisation and food consumption data supplied by national database managers. To illustrate the principles, UK packaging data were used together with consumption data from the UK National Diet and Nutrition Survey (NDNS) dietary survey for 19-64 year olds for a refined deterministic verification. The UK data were chosen mainly because the consumption surveys are detailed, data for UK packaging at a detailed level were available and, arguably, the UK population is composed of high consumers of packaged foodstuffs. Exposures were run for each food category that could give rise to BPA from light metal packaging. Consumer loyalty to a particular type of packaging, commonly referred to as packaging loyalty, was set. The BPA extraction levels used for the 15 types of coating chemistries that could release BPA were in the range of 0.00005-0.012 mg dm⁻². The estimates of exposure to BPA using FACET for the total diet were 0.0098 (mean) and 0.0466 (97.5th percentile) mg/person/day, corresponding to 0.00013 (mean) and 0.00059 (97.5th percentile) mg kg⁻¹ body weight day⁻¹ for consumers of foods packed in light metal packaging. This is well below the current EFSA (and other recognised bodies) TDI of 0.05 mg kg⁻¹ body weight day⁻¹. These probabilistic estimates were compared with estimates using a refined deterministic approach drawing on the same input data. The results from FACET for the mean, 95th and 97.5th percentile exposures to BPA lay between the
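
    The structure of such a probabilistic exposure estimate can be illustrated with a small Monte Carlo sketch. The distributions and parameter values below are illustrative assumptions, not FACET inputs; only the overall pattern (sample a migration level, contact area, intake and body weight, then read percentiles off the resulting dose distribution) follows the approach described above.

      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000  # simulated consumers

      # All distributions below are assumptions for illustration only.
      migration = rng.uniform(0.00005, 0.012, n)        # mg/dm^2, the extraction range cited above
      contact_area = rng.normal(8.0, 2.0, n).clip(1)    # dm^2 of coating per kg of food (assumed)
      intake = rng.lognormal(np.log(0.15), 0.6, n)      # kg/day of canned food (assumed)
      body_weight = rng.normal(75.0, 12.0, n).clip(40)  # kg (assumed)

      dose = migration * contact_area * intake / body_weight  # mg/kg bw/day
      print(f"mean: {dose.mean():.6f} mg/kg bw/day")
      print(f"97.5th percentile: {np.percentile(dose, 97.5):.6f} mg/kg bw/day")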

  12. Probabilistic Analysis Methods for Hybrid Ventilation

    Brohus, Henrik; Frier, Christian; Heiselberg, Per

    This paper discusses a general approach for the application of probabilistic analysis methods in the design of ventilation systems. The aims and scope of probabilistic versus deterministic methods are addressed with special emphasis on hybrid ventilation systems. A preliminary application of stochastic differential equations is presented, comprising a general heat balance for an arbitrary number of loads and zones in a building to determine the thermal behaviour under random conditions.
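
    As a sketch of what a stochastic-differential-equation heat balance can look like, the following integrates a single-zone model with the Euler-Maruyama scheme; the time constant, capacity, load and noise level are assumed for illustration and do not come from the paper.

      import numpy as np

      # Single-zone heat balance dT = (-(T - T_amb)/tau + q/C) dt + sigma dW,
      # integrated with Euler-Maruyama. All parameters are illustrative.
      rng = np.random.default_rng(0)
      tau, C, T_amb = 3600.0, 1e5, 18.0   # s, J/K, deg C
      q, sigma = 500.0, 0.02              # W mean load, noise intensity
      dt, steps = 60.0, 24 * 60           # one day at 1-minute resolution

      T = np.empty(steps)
      T[0] = 20.0
      for k in range(1, steps):
          dW = rng.normal(0.0, np.sqrt(dt))  # Wiener increment
          T[k] = T[k-1] + (-(T[k-1] - T_amb)/tau + q/C) * dt + sigma * dW

      print(f"mean {T.mean():.2f} C, std {T.std():.2f} C")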

  13. A probabilistic risk assessment approach used to prioritize chemical constituents in mainstream smoke of cigarettes sold in China.

    Xie, Jianping; Marano, Kristin M; Wilson, Cody L; Liu, Huimin; Gan, Huamin; Xie, Fuwei; Naufal, Ziad S

    2012-03-01

    The chemical and physical complexity of cigarette mainstream smoke (MSS) presents a challenge in the understanding of risk for smoking-related diseases. Quantitative risk assessment is a useful tool for assessing the toxicological risks that may be presented by smoking currently available commercial cigarettes. In this study, yields of a selected group of chemical constituents were quantified in machine-generated MSS from 30 brands of cigarettes sold in China. Using constituent yields, exposure estimates specific to and representative of the Chinese population, and available dose-response data, a Monte Carlo method was applied to simulate probability distributions for incremental lifetime cancer risk (ILCR), hazard quotient (HQ), and margin of exposure (MOE) values for each constituent as appropriate. Measures of central tendency were extracted from the outcome distributions and constituents were ranked according to these three risk assessment indices (those with ILCR > 10⁻⁴, HQ > 1, or a low MOE ranking highest). By quantifying the risk contributed by each MSS constituent, this approach provides a plausible and objective framework for the prioritization of toxicants in cigarette smoke and is valuable in guiding tobacco risk management.
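
    The Monte Carlo construction of the three indices can be sketched as follows; the yields, exposure factors and toxicity values are invented placeholders, not the study's data, and only the index definitions (ILCR = dose x cancer slope factor, HQ = dose / reference dose) are standard.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 50_000
      cigs_per_day = rng.lognormal(np.log(15), 0.4, n)  # smoking rate (assumed)
      body_weight = rng.normal(65.0, 10.0, n).clip(40)  # kg (assumed)

      # name: ((median yield mg/cig, lognormal sigma), slope factor, reference dose)
      constituents = {
          "A": ((0.05, 0.5), 0.3, 1e-3),
          "B": ((0.002, 0.7), 5.0, 1e-5),
      }
      for name, ((med, sg), slope, rfd) in constituents.items():
          yld = rng.lognormal(np.log(med), sg, n)   # mg per cigarette
          dose = yld * cigs_per_day / body_weight   # mg/kg bw/day
          ilcr, hq = dose * slope, dose / rfd
          print(name, f"median ILCR = {np.median(ilcr):.2e}", f"median HQ = {np.median(hq):.2f}")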

  14. Local and Global Gestalt Laws: A Neurally Based Spectral Approach.

    Favali, Marta; Citti, Giovanna; Sarti, Alessandro

    2017-02-01

    This letter presents a mathematical model of figure-ground articulation that takes into account both local and global gestalt laws and is compatible with the functional architecture of the primary visual cortex (V1). The local gestalt law of good continuation is described by means of suitable connectivity kernels that are derived from Lie group theory and quantitatively compared with long-range connectivity in V1. Global gestalt constraints are then introduced in terms of spectral analysis of a connectivity matrix derived from these kernels. This analysis performs grouping of local features and individuates perceptual units with the highest salience. Numerical simulations are performed, and results are obtained by applying the technique to a number of stimuli.
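
    The spectral grouping step can be illustrated on a toy problem: build a connectivity (affinity) matrix over a set of points with a simple Gaussian kernel, a stand-in for the cortical connectivity kernels derived in the letter, then read the perceptual units off the leading eigenvectors, whose eigenvalue order ranks their salience. The points and kernel scale are assumed.

      import numpy as np

      rng = np.random.default_rng(2)
      pts = np.vstack([rng.normal(0.0, 0.1, (20, 2)),   # perceptual unit 1
                       rng.normal(2.0, 0.1, (20, 2))])  # perceptual unit 2

      d2 = ((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1)
      A = np.exp(-d2 / (2 * 0.3 ** 2))                  # Gaussian affinity (stand-in kernel)

      vals, vecs = np.linalg.eigh(A)                    # symmetric matrix -> eigh
      u1, u2 = vecs[:, -1], vecs[:, -2]                 # two most salient eigenvectors
      group = (u2 ** 2 > u1 ** 2).astype(int)           # each eigenvector localizes on one unit
      print("group sizes:", np.bincount(group))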

  15. Whither probabilistic security management for real-time operation of power systems ?

    Karangelos, Efthymios; Panciatici, Patrick; Wehenkel, Louis

    2016-01-01

    This paper investigates the stakes of introducing probabilistic approaches for the management of power system security. In real-time operation, the aim is to arbitrate in a rational way between preventive and corrective control, while taking into account i) the prior probabilities of contingencies, ii) the possible failure modes of corrective control actions, and iii) the socio-economic consequences of service interruptions. This work is a first step towards the construction of a globally co...
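
    The preventive/corrective arbitration reduces to comparing expected costs. A toy numerical example, with all probabilities and costs assumed purely for illustration:

      # Expected-cost arbitration between preventive and corrective control.
      p_contingency = 1e-3     # prior probability of the contingency
      p_corr_fail = 0.05       # probability that the corrective action fails
      c_preventive = 10_000.0  # cost of acting preventively (e.g. redispatch)
      c_corrective = 2_000.0   # cost of the corrective action if triggered
      c_interruption = 5e6     # socio-economic cost of a service interruption

      cost_preventive = c_preventive
      cost_corrective = p_contingency * (c_corrective + p_corr_fail * c_interruption)
      print(f"expected cost, preventive: {cost_preventive:,.0f}")
      print(f"expected cost, corrective: {cost_corrective:,.0f}")
      # With these numbers the corrective strategy is far cheaper in expectation;
      # a higher contingency probability or failure rate tips the balance.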

  16. Fiscal federalism approach for controlling global environmental pollution

    Murty, M.N.

    1996-01-01

    It is found that optimal international carbon taxes are country-specific, and that a tax on a domestically produced carbon-intensive commodity can be decomposed into a revenue tax, a tax to control local atmospheric pollution and an international carbon tax. It is shown that an institutional arrangement for the world economy similar to fiscal federalism in federal countries can be useful for internalizing the global externalities of atmospheric pollution. 18 refs
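
    The stated decomposition can be written compactly. The symbols are illustrative labels rather than the paper's notation: for a carbon-intensive commodity produced in country $i$,

      % decomposition of the optimal commodity tax (illustrative notation)
      t_i \;=\; t_i^{\text{rev}} \;+\; t_i^{\text{local}} \;+\; t_i^{\text{global}},

    where $t_i^{\text{rev}}$ is the revenue component, $t_i^{\text{local}}$ the corrective tax on local atmospheric pollution, and $t_i^{\text{global}}$ the country-specific international carbon tax.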

  17. Unemployment, Investment and Global Expected Returns: A Panel FAVAR Approach

    Ron Smith; Gylfi Zoega

    2005-01-01

    We consider the hypothesis that a common factor, global expected returns, drives unemployment and investment in 21 OECD countries over the period 1960-2002. We investigate this hypothesis using a panel factor-augmented vector autoregression (FAVAR). We first estimate the common factors of unemployment and investment by principal components and show that the first principal component of unemployment is almost identical to that of investment and that they both show the pattern one would expect ...
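
    The first step, extracting a common factor by principal components, can be sketched on synthetic data; the panel below is generated with a known latent factor so the recovery can be checked, and all numbers are assumptions.

      import numpy as np

      rng = np.random.default_rng(3)
      T, N = 43, 21                           # years x countries, as in the study period
      common = np.cumsum(rng.normal(size=T))  # latent common factor
      loadings = rng.uniform(0.5, 1.5, N)     # country-specific loadings
      panel = np.outer(common, loadings) + rng.normal(0.0, 0.5, (T, N))

      X = panel - panel.mean(axis=0)          # demean each country series
      U, S, Vt = np.linalg.svd(X, full_matrices=False)
      pc1 = U[:, 0] * S[0]                    # first principal component
      corr = np.corrcoef(pc1, common)[0, 1]
      print(f"|correlation| with the latent factor: {abs(corr):.3f}")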

  18. Global facial beauty: approaching a unified aesthetic ideal.

    Sands, Noah B; Adamson, Peter A

    2014-04-01

    Recognition of facial beauty is both inborn and learned through social discourses and exposures. Demographic shifts across the globe, in addition to the cross-cultural interactions that typify 21st century globalization in virtually all industries, comprise major active evolutionary forces that reshape our individual notions of facial beauty. This article highlights the changing perceptions of beauty, while defining and distinguishing natural beauty and artificial beauty.

  19. Electromagnetic microinstabilities in tokamak plasmas using a global spectral approach

    Falchetto, G. L

    2002-03-01

    Electromagnetic microinstabilities in tokamak plasmas are studied by means of a linear global eigenvalue numerical code. The code is the electromagnetic extension of an existing electrostatic global gyrokinetic spectral toroidal code, called GLOGYSTO. Ion dynamics is described by the gyrokinetic equation, so that ion finite Larmor radius effects are taken into account to all orders. Non-adiabatic electrons are included in the model, with passing particles described by the drift-kinetic equation and trapped particles through the bounce-averaged drift-kinetic equation. A low-frequency electromagnetic perturbation is applied to a low, but finite, β plasma (where the parameter β denotes the ratio of plasma pressure to magnetic pressure); thus, the parallel perturbations of the magnetic field are neglected. The system is closed by the quasi-neutrality equation and the parallel component of Ampere's law. The formulation is applied to a large aspect ratio toroidal configuration, with circular shifted surfaces. Such a simple configuration enables one to derive the gyrocenter trajectories analytically. The system is solved in Fourier space, taking advantage of a decomposition adapted to the toroidal geometry. The major contributions of this thesis are as follows. The electromagnetic effects on toroidal Ion Temperature Gradient driven (ITG) modes are studied. The stabilization of these modes with increasing β, as predicted in previous work, is confirmed. The inclusion of trapped electron dynamics enables the study of its coupling to the ITG modes and of Trapped Electron Modes (TEM). The effects of finite β are considered together with those of different magnetic shear profiles and of the Shafranov shift. The threshold for the destabilization of an electromagnetic mode is identified. Moreover, the global formulation yields for the first time the radial structure of this so-called Alfvenic Ion Temperature Gradient (AITG) mode. The stability of the

  20. A novel approach for the global localization problem

    Abraham Sánchez

    2012-03-01

    Full Text Available This paper describes a simultaneous planning, localization and mapping (SPLAM) methodology focussed on the global localization problem, where the robot explores the environment efficiently and also considers the requisites of the simultaneous localization and mapping algorithm. The method is based on the randomized incremental generation of a data structure called the Sensor-based Random Tree, which represents a roadmap of the explored area with an associated safe region. A continuous localization procedure based on B-spline features of the safe region is integrated in the scheme.
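
    The incremental random tree at the core of the method can be sketched in a few lines; the safe-region sensing and the B-spline localization are omitted, and the workspace, step size and `is_free` test below are assumptions standing in for the sensor-based construction.

      import math, random

      random.seed(4)
      step, workspace = 0.5, 10.0
      tree = [(5.0, 5.0)]  # root the tree at the robot start

      def is_free(p):      # placeholder for the safe-region check
          return 0.0 <= p[0] <= workspace and 0.0 <= p[1] <= workspace

      for _ in range(200):
          q = (random.uniform(0, workspace), random.uniform(0, workspace))
          near = min(tree, key=lambda n: math.dist(n, q))  # nearest node in the tree
          d = math.dist(near, q)
          new = (near[0] + step * (q[0] - near[0]) / d,
                 near[1] + step * (q[1] - near[1]) / d)    # extend one step towards q
          if is_free(new):
              tree.append(new)

      print(f"tree size: {len(tree)} nodes")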

  1. Glass viscosity calculation based on a global statistical modelling approach

    Fluegel, Alex

    2007-02-01

    A global statistical glass viscosity model was developed for predicting the complete viscosity curve, based on more than 2200 composition-property data of silicate glasses from the scientific literature, including soda-lime-silica container and float glasses, TV panel glasses, borosilicate fiber wool and E type glasses, low expansion borosilicate glasses, glasses for nuclear waste vitrification, lead crystal glasses, binary alkali silicates, and various further compositions from over half a century. It is shown that within a measurement series from a specific laboratory the reported viscosity values are often over-estimated at higher temperatures due to alkali and boron oxide evaporation during the measurement and glass preparation, including data by Lakatos et al. (1972) and the recently published High temperature glass melt property database for process modeling by Seward et al. (2005). Similarly, in the glass transition range many experimental data of borosilicate glasses are reported too high due to phase separation effects. The developed global model corrects those errors. The model standard error was 9-17°C, with R^2 = 0.985-0.989. The prediction 95% confidence interval for glass in mass production largely depends on the glass composition of interest, the composition uncertainty, and the viscosity level. New insights in the mixed-alkali effect are provided.
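
    A complete viscosity curve of the kind the model predicts is commonly parameterized by the Vogel-Fulcher-Tammann equation, log10(eta) = A + B/(T - T0). The sketch below fits that form to a few invented data points; it illustrates the curve shape only and is unrelated to the paper's 2200-glass statistical model.

      import numpy as np
      from scipy.optimize import curve_fit

      def vft(T, A, B, T0):  # Vogel-Fulcher-Tammann form
          return A + B / (T - T0)

      T = np.array([800.0, 1000.0, 1200.0, 1400.0])  # K (invented points)
      log_eta = np.array([12.0, 6.5, 3.9, 2.4])      # log10(Pa s) (invented points)

      (A, B, T0), _ = curve_fit(vft, T, log_eta, p0=(-2.0, 4000.0, 500.0))
      print(f"A = {A:.2f}, B = {B:.0f} K, T0 = {T0:.0f} K")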

  2. On the quality of global emission inventories. Approaches, methodologies, input data and uncertainties

    Olivier, J.G.J.

    2002-01-01

    Four key scientific questions will be investigated: (1) How does a user define the 'quality' of a global (or national) emission inventory? (Chapter 2); (2) What determines the quality of a global emission inventory? (Chapters 2 and 7); (3) How can inventory quality be achieved in practice and expressed in quantitative terms ('uncertainty')? (Chapters 3 to 6); and (4) What is the preferred approach for compiling a global emission inventory, given the practical limitations and the desired inventory quality? (Chapters 7 and 8)

  3. Adaptive approach to global synchronization of directed networks with fast switching topologies

    Qin Buzhi; Lu Xinbiao

    2010-01-01

    Global synchronization of directed networks with switching topologies is investigated. It is found that if there exists at least one directed spanning tree in the network with the fixed time-average topology and the time-average topology is achieved sufficiently fast, the network will reach global synchronization for an appropriate coupling strength. Furthermore, this appropriate coupling strength may be obtained by a local adaptive approach. A sufficient condition for global synchronization is given. Numerical simulations verify the effectiveness of the adaptive strategy.
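
    The local adaptive idea can be sketched on a toy directed ring (which contains a directed spanning tree): the coupling strength grows with the local synchronization error until the network locks. The dynamics, gains and network below are all assumptions for illustration.

      import numpy as np

      rng = np.random.default_rng(13)
      N, gamma, dt, steps = 10, 0.5, 0.01, 20_000
      x = rng.uniform(-1.0, 1.0, N)       # initial states
      sigma = 0.0                         # adaptive coupling strength
      A = np.roll(np.eye(N), 1, axis=1)   # directed ring adjacency

      for _ in range(steps):
          err = A @ x - x                 # local synchronization error
          x += dt * (0.1 * np.sin(x) + sigma * err)  # weak intrinsic drift + coupling
          sigma += dt * gamma * (err ** 2).sum()     # adaptive law: grow with error

      print(f"final coupling {sigma:.2f}, state spread {x.max() - x.min():.4f}")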

  4. Solving stochastic multiobjective vehicle routing problem using probabilistic metaheuristic

    Gannouni Asmae

    2017-01-01

    closed-form expression. This novel approach is based on combinatorial probability and can be incorporated in a multiobjective evolutionary algorithm. (ii) Provide probabilistic approaches to elitism and diversification in multiobjective evolutionary algorithms. Finally, the behavior of the resulting Probabilistic Multi-objective Evolutionary Algorithms (PrMOEAs) is empirically investigated on the multi-objective stochastic VRP problem.

  5. Probabilistic composition of preferences, theory and applications

    Parracho Sant'Anna, Annibal

    2015-01-01

    Putting forward a unified presentation of the features and possible applications of probabilistic preferences composition, and serving as a methodology for decisions employing multiple criteria, this book maximizes reader insights into evaluation in probabilistic terms and the development of composition approaches that do not depend on assigning weights to the criteria. With key applications in important areas of management such as failure modes and effects analysis and productivity analysis - together with explanations about the application of the concepts involved - this book makes available numerical examples of probabilistic transformation development and probabilistic composition. Useful not only as a reference source for researchers, but also in teaching graduate courses in Production Engineering and Management Science, the key themes of the book will be of especial interest to researchers in the field of Operational Research.

  6. APPROACHES TO GLOBAL SECURITY. ACTORS, MANIFESTATIONS AND TENDENCIES

    Gheorghe MINCULETE

    2013-01-01

    Full Text Available Nowadays, the world seems to be in a transition from the current system founded on the liberal social, economic and political model to a more diverse and heterogeneous model in which the determinant role is played by a number of state and non-state actors. The step from the Western system of cultural, political and predominantly economic values to a more diverse and heterogeneous system makes the actors involved not only defend their visions, but also promote their own interests. The differences between visions gain relevance and clarity because the countries supporting them obtain increased power, and that is more than obvious. All this leads to an asymmetric allocation of different means, which generates uncertainties and diminishes unilateral actions. This transition process impacts global security especially through the asymmetric, unconventional and hybrid risks and threats manifesting worldwide.

  7. Global Internet Governance: Russian Approach and International Practice

    Elena S. Zinovieva

    2015-01-01

    Full Text Available The article studies the processes of Internet governance at the international level in the context of the position and interests of Russia in this area. The theory of global governance was used as the theoretical and methodological framework of the study. Initially, Internet governance was carried out at the state level, with coordination serving the interests of the United States, which created the Internet. At the present stage, states and other actors in world politics have to be integrated into the existing system of Internet governance, resulting in the development of multi-level or multi-directional diplomacy, the formation of so-called "hybrid" organizations and new models of cooperation. New formats for regulating international relations are being formed under the influence of scientific and technological progress. Russia's position on Internet governance is based on the goal of ensuring equal consideration of the interests of all states in the governance of the Internet.

  8. Probabilistic Design of Wind Turbines

    Sørensen, John Dalsgaard; Toft, H.S.

    2010-01-01

    Probabilistic design of wind turbines requires definition of the structural elements to be included in the probabilistic basis: e.g., blades, tower, foundation; identification of important failure modes; careful stochastic modeling of the uncertain parameters; and recommendations for target reliability. It is described how uncertainties in wind turbine design related to computational models, statistical data from test specimens, results from a few full-scale tests and from prototype wind turbines can be accounted for using the Maximum Likelihood Method and a Bayesian approach. Assessment of the optimal reliability level by cost-benefit optimization is illustrated by an offshore wind turbine example. Uncertainty modeling is illustrated by an example where physical, statistical and model uncertainties are estimated.

  9. Probabilistic approaches to robotic perception

    Ferreira, João Filipe

    2014-01-01

    This book tries to address the following questions: How should the uncertainty and incompleteness inherent to sensing the environment be represented and modelled in a way that will increase the autonomy of a robot? How should a robotic system perceive, infer, decide and act efficiently? These are two of the challenging questions the robotics community and robotics researchers have been facing. The development of the robotics domain by the 1980s spurred the convergence of automation to autonomy, and the field of robotics has consequently converged towards the field of artificial intelligence (AI). Since the end of that decade, the general public's imagination has been stimulated by high expectations on autonomy, where AI and robotics try to solve difficult cognitive problems through algorithms developed from either philosophical and anthropological conjectures or incomplete notions of cognitive reasoning. Many of these developments do not unveil even a few of the processes through which biological organisms solve thes...

  10. Probabilistic Approaches to Energy Systems

    Iversen, Jan Emil Banning

    Energy generation from wind and sun is increasing rapidly in many parts of the world. This presents new challenges on how to integrate this uncertain, intermittent and non-dispatchable energy source. This thesis deals with forecasting and decision making in energy systems with a large proportion of renewable energy generation. Particularly, we focus on producing forecasting models that can predict renewable energy generation and single-user demand, and provide advanced forecast products that are needed for an efficient integration of renewable energy into the power generation mix. Thus, forecast products should be developed in unison with the decision-making tool, as they are two sides of the same overall challenge.

  11. Reducing marine mammal bycatch in global fisheries: An economics approach

    Lent, Rebecca; Squires, Dale

    2017-06-01

    The broader ecosystem impacts of fishing continue to present a challenge to scientists and resource managers around the world. Bycatch is of greatest concern for marine mammals, for which fishery bycatch and entanglement are the number one cause of direct mortality. Climate change will only add to the challenge, as marine species and fishing practices adapt to a changing environment, creating a dynamic pattern of overlap between fishing and species (both target and bycatch). Economists suggest policy instruments for reducing bycatch that move away from top-down, command-and-control measures (e.g. effort reduction, time/area closures, gear restrictions, bycatch quotas) towards an approach that creates incentives to reduce bycatch (e.g. transferable bycatch allowances, taxes, and other measures). The advantages of this flexible, incentive-oriented approach are even greater in a changing and increasingly variable environment, as regulatory measures would have to be adapted constantly to keep up with climate change. Unlike the regulatory process, individual operators in the fishery sector can make adjustments to their harvesting practices as soon as the incentives for such changes are apparent and inputs or operations can be modified. This paper explores policy measures that create economic incentives not only to reduce marine mammal bycatch, but also to increase compliance and induce technological advances by fishery operators. Economists also suggest exploration of direct economic incentives as have been used in other conservation programs, such as payments for ecosystem services, in an approach that addresses marine mammal bycatch as part of a larger conservation strategy. Expanding the portfolio of mandatory and, potentially, voluntary measures to include novel approaches will provide a broader array of opportunities for successful stewardship of the marine environment.

  12. An empirical system for probabilistic seasonal climate prediction

    Eden, Jonathan; van Oldenborgh, Geert Jan; Hawkins, Ed; Suckling, Emma

    2016-04-01

    Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a simple empirical system based on multiple linear regression for producing probabilistic forecasts of seasonal surface air temperature and precipitation across the globe. The global CO2-equivalent concentration is taken as the primary predictor; subsequent predictors, including large-scale modes of variability in the climate system and local-scale information, are selected on the basis of their physical relationship with the predictand. The focus given to the climate change signal as a source of skill and the probabilistic nature of the forecasts produced constitute a novel approach to global empirical prediction. Hindcasts for the period 1961-2013 are validated against observations using deterministic (correlation of seasonal means) and probabilistic (continuous ranked probability skill scores) metrics. Good skill is found in many regions, particularly for surface air temperature and most notably in much of Europe during the spring and summer seasons. For precipitation, skill is generally limited to regions with known El Niño-Southern Oscillation (ENSO) teleconnections. The system is used in a quasi-operational framework to generate empirical seasonal forecasts on a monthly basis.
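
    A stripped-down version of such a system is easy to write out: regress the seasonal mean on a trend predictor and one mode of variability, then issue a Gaussian predictive distribution from the residual spread. All data below are synthetic, and the predictors are stand-ins for the CO2-equivalent concentration and an ENSO index.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(5)
      years = np.arange(1961, 2014)
      co2 = 320.0 + 1.7 * (years - 1961)       # trend predictor (assumed)
      enso = rng.normal(0.0, 1.0, years.size)  # ENSO-like index (assumed)
      temp = 0.01 * co2 + 0.3 * enso + rng.normal(0.0, 0.2, years.size)

      X = np.column_stack([np.ones_like(co2), co2, enso])
      beta, res, *_ = np.linalg.lstsq(X, temp, rcond=None)
      sigma = np.sqrt(res[0] / (years.size - X.shape[1]))  # residual spread

      x_new = np.array([1.0, 320.0 + 1.7 * 53, 0.5])       # hypothetical next season
      mu = x_new @ beta
      print(f"forecast: {mu:.2f} +/- {sigma:.2f}")
      print(f"P(above climatology) = {1 - stats.norm.cdf(temp.mean(), mu, sigma):.2f}")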

  13. The Methodical Approaches to the Research of Informatization of the Global Economic Development

    Kazakova Nadezhda A.

    2018-03-01

    Full Text Available The article is aimed at researching the informatization of global economic development. A complex of issues connected with the development of informatization across the world's countries under conditions of globalization is considered. The development of informatization in the global economic space facilitates the opening of new markets for international trade enterprises, transnational corporations and other organizations, which not only provide exports, but also create production capacities for local producers. A methodical approach is proposed that includes three stages, together with the formation of input information on the status of informatization in the global economic development of the world's countries.

  14. Probabilistic Structural Analysis Program

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and life-prediction methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available, such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
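
    The probability-of-failure computation at the heart of such a tool can be illustrated with plain Monte Carlo on an assumed limit state g = R - S (resistance minus load); algorithms like the advanced mean value method exist precisely to obtain the same answer with far fewer model evaluations. The distributions below are assumptions, not NESSUS inputs.

      import numpy as np

      rng = np.random.default_rng(6)
      n = 1_000_000
      R = rng.lognormal(np.log(500.0), 0.08, n)  # resistance (assumed distribution)
      S = rng.normal(300.0, 50.0, n)             # load (assumed distribution)

      fail = R - S < 0.0                         # limit state g(R, S) = R - S < 0
      pf = fail.mean()
      se = np.sqrt(pf * (1 - pf) / n)            # binomial standard error
      print(f"P_f = {pf:.2e} +/- {se:.1e}")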

  15. Probabilistic flood damage modelling at the meso-scale

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2014-05-01

    Decisions on flood risk management and adaptation are usually based on risk analyses. Such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention during the last years, they are still not standard practice for flood risk assessments. Most damage models have in common that complex damaging processes are described by simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood damage models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we show how the model BT-FLEMO (Bagging decision Tree based Flood Loss Estimation MOdel) can be applied on the meso-scale, namely on the basis of ATKIS land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany. The application of BT-FLEMO provides a probability distribution of estimated damage to residential buildings per municipality. Validation is undertaken on the one hand via a comparison with eight other damage models, including stage-damage functions as well as multi-variate models. On the other hand, the results are compared with official damage data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of damage estimation remain high. Thus, the significant advantage of the probabilistic flood loss estimation model BT-FLEMO is that it inherently provides quantitative information about the uncertainty of the prediction. Reference: Merz, B., Kreibich, H., Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS 13(1), 53-64.
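
    The bagging-decision-tree idea behind such a model can be sketched by hand: fit each tree on a bootstrap sample and read a damage distribution off the spread of per-tree predictions. The synthetic predictors below (water depth, inundation duration) and the damage relation are assumptions, not BT-FLEMO's real input set.

      import numpy as np
      from sklearn.tree import DecisionTreeRegressor

      rng = np.random.default_rng(7)
      n = 500
      X = np.column_stack([rng.uniform(0, 3, n),    # water depth (m)
                           rng.uniform(0, 96, n)])  # inundation duration (h)
      damage = 0.2 * X[:, 0] + 0.001 * X[:, 1] + rng.normal(0, 0.05, n)

      trees = []
      for _ in range(100):
          idx = rng.integers(0, n, n)               # bootstrap resample
          trees.append(DecisionTreeRegressor(max_depth=5).fit(X[idx], damage[idx]))

      x_new = np.array([[1.5, 48.0]])               # one land-use unit
      preds = np.array([t.predict(x_new)[0] for t in trees])
      print(f"damage ratio: median {np.median(preds):.3f}, "
            f"90% interval [{np.percentile(preds, 5):.3f}, {np.percentile(preds, 95):.3f}]")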

  16. Probabilistic Open Set Recognition

    Jain, Lalit Prithviraj

    Real-world tasks in computer vision, pattern recognition and machine learning often touch upon the open set recognition problem: multi-class recognition with incomplete knowledge of the world and many unknown inputs. An obvious way to approach such problems is to develop a recognition system that thresholds probabilities to reject unknown classes. Traditional rejection techniques are not about the unknown; they are about the uncertain boundary and rejection around that boundary. Thus traditional techniques only represent the "known unknowns". However, a proper open set recognition algorithm is needed to reduce the risk from the "unknown unknowns". This dissertation examines this concept and finds that existing probabilistic multi-class recognition approaches are ineffective for true open set recognition. We hypothesize the cause to be weak ad hoc assumptions combined with closed-world assumptions made by existing calibration techniques. Intuitively, if we could accurately model just the positive data for any known class without overfitting, we could reject the large set of unknown classes even under this assumption of incomplete class knowledge. For this, we formulate the problem as one of modeling positive training data by invoking statistical extreme value theory (EVT) near the decision boundary of positive data with respect to negative data. We provide a new algorithm called the PI-SVM for estimating the unnormalized posterior probability of class inclusion. This dissertation also introduces a new open set recognition model called Compact Abating Probability (CAP), where the probability of class membership decreases in value (abates) as points move from known data toward open space. We show that CAP models improve open set recognition for multiple algorithms. Leveraging the CAP formulation, we go on to describe the novel Weibull-calibrated SVM (W-SVM) algorithm, which combines the useful properties of statistical EVT for score calibration with one-class and binary
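
    The EVT flavour of this calibration can be sketched as follows: fit a Weibull to the positive-class scores nearest the decision boundary and use its CDF as a probability of class inclusion. The scores below are synthetic, and the actual W-SVM construction is more involved than this sketch.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(8)
      pos_scores = rng.normal(2.0, 0.5, 500)  # decision scores of known positives
      tail = np.sort(pos_scores)[:50]         # scores nearest the boundary

      # Fix the location just below the smallest tail score, fit shape and scale.
      shape, loc, scale = stats.weibull_min.fit(tail, floc=tail.min() - 1e-6)

      for s in (0.8, 1.5, 2.5):
          p = stats.weibull_min.cdf(s, shape, loc, scale)
          print(f"score {s:.1f} -> P(inclusion) = {p:.2f}")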

  17. Probabilistic programmable quantum processors

    Buzek, V.; Ziman, M.; Hillery, M.

    2004-01-01

    We analyze how to improve the performance of probabilistic programmable quantum processors. We show how the probability of success of the probabilistic processor can be enhanced by using the processor in loops. In addition, we show that arbitrary SU(2) transformations of qubits can be encoded in the program state of a universal programmable probabilistic quantum processor. The probability of success of this processor can be enhanced by a systematic correction of errors via conditional loops. Finally, we show that all our results can be generalized also for qudits. (Abstract Copyright [2004], Wiley Periodicals, Inc.)
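
    The benefit of looping can be written down directly for the simplest case: if one pass of the probabilistic processor implements the desired operation with probability $p$, and a failure is detected and the attempt repeated, then after at most $k$ conditional loops

      % success probability of k conditional repetitions (simplest case)
      P_{\mathrm{success}}(k) \;=\; 1 - (1 - p)^{k},

    which approaches unity geometrically; the paper's systematic error correction via conditional loops is more structured than this independent-repetition sketch.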

  18. A Multisensor Approach to Global Retrievals of Land Surface Albedo

    Aku Riihelä

    2018-05-01

    Full Text Available Satellite-based retrievals offer the most cost-effective way to comprehensively map the surface albedo of the Earth, a key variable for understanding the dynamics of radiative energy interactions in the atmosphere-surface system. Surface albedo retrievals have commonly been designed separately for each spaceborne optical imager. Here, we introduce a novel type of processing framework that combines the data from two polar-orbiting optical imager families, the Advanced Very High Resolution Radiometer (AVHRR) and the Moderate Resolution Imaging Spectroradiometer (MODIS). The goal of the paper is to demonstrate that multisensor albedo retrievals can provide a significant reduction in the sampling time required for a robust and comprehensive surface albedo retrieval, without a major degradation in retrieval accuracy, as compared to state-of-the-art single-sensor retrievals. We evaluated the multisensor retrievals against reference in situ albedo measurements and compared them with existing datasets. The results show that global land surface albedo retrievals with a sampling period of 10 days can offer near-complete spatial coverage, with a retrieval bias mostly comparable to existing single-sensor datasets, except for bright surfaces (deserts and snow), where the retrieval framework shows degraded performance because of atmospheric correction design compromises. A level difference is found between the single-sensor datasets and the demonstrator developed here, pointing towards a need for further work in the atmospheric correction, particularly over bright surfaces, and in inter-sensor radiance homogenization. The introduced framework is expandable to include other sensors in the future.

  19. Space Applications and Global Information Infrastructure: a Global Approach against Epidemics

    Bastos, C. R.

    2002-01-01

    Brazilian space expenditures correspond to a low-middle rank among the space-faring nations. In this regard, international partnerships have opened doors for the country to take part in a wider range of projects than would be possible if carried out on its own. Within the above framework, this paper addresses a concept in which countries join efforts in pursuit of common objectives and needs in the field of health, countries whose similarities tend to make them face the same types of health problems. Exactly for this reason, such countries can get together and share the costs, risks and ultimately the benefits of their joint efforts. Infectious diseases are mankind's leading causes of death, and their agents travel around the world by the action of their vectors: insects, birds, winds, infected individuals, and others. The ways in which the Global Information Infrastructure and space applications can help in detecting, identifying, tracking and fighting migratory diseases are then discussed. A concept for an international cooperative initiative is presented, addressing its composition, its implementation, the international coordination requirements, the financial and funding issues related to its implementation and sustainability, and the roles to be played by such an organization. The funding issue deserves closer attention, since many good ideas are killed by financial problems in their early implementation stages. Finally, a conclusion draws the audience's attention towards the potential advantages of space-based assets in covering large portions of the Earth, and consequently being suitable for global initiatives for the benefit of mankind.

  20. THE PANCHROMATIC HUBBLE ANDROMEDA TREASURY. IV. A PROBABILISTIC APPROACH TO INFERRING THE HIGH-MASS STELLAR INITIAL MASS FUNCTION AND OTHER POWER-LAW FUNCTIONS

    Weisz, Daniel R.; Fouesneau, Morgan; Dalcanton, Julianne J.; Clifton Johnson, L.; Beerman, Lori C.; Williams, Benjamin F. [Department of Astronomy, University of Washington, Box 351580, Seattle, WA 98195 (United States); Hogg, David W.; Foreman-Mackey, Daniel T. [Center for Cosmology and Particle Physics, New York University, 4 Washington Place, New York, NY 10003 (United States); Rix, Hans-Walter; Gouliermis, Dimitrios [Max Planck Institute for Astronomy, Koenigstuhl 17, D-69117 Heidelberg (Germany); Dolphin, Andrew E. [Raytheon Company, 1151 East Hermans Road, Tucson, AZ 85756 (United States); Lang, Dustin [Department of Astrophysical Sciences, Princeton University, Princeton, NJ 08544 (United States); Bell, Eric F. [Department of Astronomy, University of Michigan, 500 Church Street, Ann Arbor, MI 48109 (United States); Gordon, Karl D.; Kalirai, Jason S. [Space Telescope Science Institute, 3700 San Martin Drive, Baltimore, MD 21218 (United States); Skillman, Evan D., E-mail: dweisz@astro.washington.edu [Minnesota Institute for Astrophysics, University of Minnesota, 116 Church Street SE, Minneapolis, MN 55455 (United States)

    2013-01-10

    We present a probabilistic approach for inferring the parameters of the present-day power-law stellar mass function (MF) of a resolved young star cluster. This technique (1) fully exploits the information content of a given data set; (2) can account for observational uncertainties in a straightforward way; (3) assigns meaningful uncertainties to the inferred parameters; (4) avoids the pitfalls associated with binning data; and (5) can be applied to virtually any resolved young cluster, laying the groundwork for a systematic study of the high-mass stellar MF (M ≳ 1 M⊙). Using simulated clusters and Markov Chain Monte Carlo sampling of the probability distribution functions, we show that estimates of the MF slope, α, are unbiased and that the uncertainty, Δα, depends primarily on the number of observed stars and on the range of stellar masses they span, assuming that the uncertainties on individual masses and the completeness are both well characterized. Using idealized mock data, we compute the theoretical precision, i.e., lower limits, on α, and provide an analytic approximation for Δα as a function of the observed number of stars and mass range. Comparison with literature studies shows that ~3/4 of quoted uncertainties are smaller than the theoretical lower limit. By correcting these uncertainties to the theoretical lower limits, we find that the literature studies yield ⟨α⟩ = 2.46, with a 1σ dispersion of 0.35 dex. We verify that it is impossible for a power-law MF to obtain meaningful constraints on the upper mass limit of the initial mass function, beyond the lower bound of the most massive star actually observed. We show that avoiding substantial biases in the MF slope requires (1) including the MF as a prior when deriving individual stellar mass estimates, (2) modeling the uncertainties in the individual stellar masses, and (3) fully characterizing and then explicitly modeling the
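
    The no-binning point can be illustrated with the closed-form unbinned maximum-likelihood estimator for a pure power law above a mass cutoff; unlike the paper's method, this sketch ignores the upper mass truncation, completeness and per-star mass uncertainties, and the sample is simulated.

      import numpy as np

      rng = np.random.default_rng(9)
      alpha_true, m_min, N = 2.35, 1.0, 500
      # Inverse-CDF sampling from dN/dm ~ m^(-alpha) above m_min.
      masses = m_min * (1.0 - rng.random(N)) ** (-1.0 / (alpha_true - 1.0))

      alpha_hat = 1.0 + N / np.log(masses / m_min).sum()  # unbinned MLE
      sigma = (alpha_hat - 1.0) / np.sqrt(N)              # asymptotic uncertainty
      print(f"alpha = {alpha_hat:.2f} +/- {sigma:.2f} (true {alpha_true})")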

  3. The Comparative Study Of Local Governance: Towards A Global Approach

    Gerry Stoker

    2010-12-01

    Full Text Available The comparative study of local governance has been too focused on the institutional arrangements of the systems of different nation states rather than on the more fundamental issue of the societal functions performed by local government. This article focuses attention on four societal roles that local government systems undertake. They can support political identity, underwrite economic development, facilitate social welfare provision or act as a lifestyle co-ordinator through the practice of community governance. Linking our investigation to the embedded societal roles of local government in different systems opens up the opportunity for a more genuinely global comparative perspective. It also helps us to understand the likely forms of politics associated with different systems of local governance, and enables us to explore the sustainability of different systems of local governance. It is suggested that a strong system of local government is likely to be one that is able to combine societal roles to a substantial degree. A vulnerable local government system is one trapped with one function that, in changing societal and economic circumstances, could find itself under threat.

  4. A fuzzy regression with support vector machine approach to the estimation of horizontal global solar radiation

    Baser, Furkan; Demirhan, Haydar

    2017-01-01

    Accurate estimation of the amount of horizontal global solar radiation for a particular field is an important input for decision processes in solar radiation investments. In this article, we focus on the estimation of yearly mean daily horizontal global solar radiation by using an approach that utilizes fuzzy regression functions with support vector machines (FRF-SVM). This approach is not seriously affected by outlier observations and does not suffer from the over-fitting problem. To demonstrate the utility of the FRF-SVM approach in the estimation of horizontal global solar radiation, we conduct an empirical study over a dataset collected in Turkey and apply the FRF-SVM approach with several kernel functions. Then, we compare the estimation accuracy of the FRF-SVM approach to an adaptive neuro-fuzzy system and a coplot-supported genetic programming approach. We observe that the FRF-SVM approach with a Gaussian kernel function is affected by neither outliers nor the over-fitting problem and gives the most accurate estimates of horizontal global solar radiation among the applied approaches. Consequently, the use of hybrid fuzzy functions and support vector machine approaches is found beneficial in long-term forecasting of horizontal global solar radiation over a region with complex climatic and terrestrial characteristics. - Highlights: • A fuzzy regression functions with support vector machines approach is proposed. • The approach is robust against outlier observations and the over-fitting problem. • Estimation accuracy of the model is superior to several existent alternatives. • A new solar radiation estimation model is proposed for the region of Turkey. • The model is useful under complex terrestrial and climatic conditions.
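
    A plain support-vector regression with a Gaussian (RBF) kernel, as a stand-in for the paper's fuzzy-regression-functions-with-SVM estimator, can be sketched as follows; the inputs (latitude, sunshine duration), the target relation and all values are synthetic assumptions.

      import numpy as np
      from sklearn.svm import SVR

      rng = np.random.default_rng(10)
      n = 200
      lat = rng.uniform(36.0, 42.0, n)      # deg N (a Turkey-like range, assumed)
      sunshine = rng.uniform(4.0, 10.0, n)  # h/day (assumed)
      radiation = 25.0 - 0.3 * lat + 1.2 * sunshine + rng.normal(0.0, 0.5, n)

      X = np.column_stack([lat, sunshine])
      model = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X, radiation)

      x_new = np.array([[39.0, 7.5]])
      print(f"estimated radiation: {model.predict(x_new)[0]:.1f} MJ/m^2/day")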

  5. Asthma Treatments for Children and Adolescents: Strategies for a Global Approach

    Robert L Thivierge

    1995-01-01

    Full Text Available Strategies for a global approach to the management of asthma in children and adolescents are described. Such an approach requires the physician to explain to the patient the pathophysiology of asthma, to evaluate and, whenever possible, change predisposing environmental factors, to establish a written plan of action and to maintain a close follow-up of the patient to ensure compliance.

  6. Probabilistic Infinite Secret Sharing

    Csirmaz, László

    2013-01-01

    The study of probabilistic secret sharing schemes using arbitrary probability spaces and possibly infinite number of participants lets us investigate abstract properties of such schemes. It highlights important properties, explains why certain definitions work better than others, connects this topic to other branches of mathematics, and might yield new design paradigms. A probabilistic secret sharing scheme is a joint probability distribution of the shares and the secret together with a colle...

  7. Probabilistic Programming (Invited Talk)

    Yang, Hongseok

    2017-01-01

    Probabilistic programming refers to the idea of using standard programming constructs for specifying probabilistic models from machine learning and statistics, and employing generic inference algorithms for answering various queries on these models, such as posterior inference and estimation of model evidence. Although this idea itself is not new and was, in fact, explored by several programming-language and statistics researchers in the early 2000s, it is only in the last few years that proba...

  8. Developing a curriculum framework for global health in family medicine: emerging principles, competencies, and educational approaches.

    Redwood-Campbell, Lynda; Pakes, Barry; Rouleau, Katherine; MacDonald, Colla J; Arya, Neil; Purkey, Eva; Schultz, Karen; Dhatt, Reena; Wilson, Briana; Hadi, Abdullahel; Pottie, Kevin

    2011-07-22

    Recognizing the growing demand from medical students and residents for more comprehensive global health training, and the paucity of explicit curricula on such issues, global health and curriculum experts from the six Ontario Family Medicine Residency Programs worked together to design a framework for global health curricula in family medicine training programs. A working group comprised of global health educators from Ontario's six medical schools conducted a scoping review of global health curricula, competencies, and pedagogical approaches. The working group then hosted a full day meeting, inviting experts in education, clinical care, family medicine and public health, and developed a consensus process and draft framework to design global health curricula. Through a series of weekly teleconferences over the next six months, the framework was revised and used to guide the identification of enabling global health competencies (behaviours, skills and attitudes) for Canadian Family Medicine training. The main outcome was an evidence-informed interactive framework http://globalhealth.ennovativesolution.com/ to provide a shared foundation to guide the design, delivery and evaluation of global health education programs for Ontario's family medicine residency programs. The curriculum framework blended a definition and mission for global health training, core values and principles, global health competencies aligning with the Canadian Medical Education Directives for Specialists (CanMEDS) competencies, and key learning approaches. The framework guided the development of subsequent enabling competencies. The shared curriculum framework can support the design, delivery and evaluation of global health curriculum in Canada and around the world, lay the foundation for research and development, provide consistency across programmes, and support the creation of learning and evaluation tools to align with the framework. The process used to develop this framework can be applied

  9. Developing a curriculum framework for global health in family medicine: emerging principles, competencies, and educational approaches

    Wilson Briana

    2011-07-01

    Full Text Available Abstract Background Recognizing the growing demand from medical students and residents for more comprehensive global health training, and the paucity of explicit curricula on such issues, global health and curriculum experts from the six Ontario Family Medicine Residency Programs worked together to design a framework for global health curricula in family medicine training programs. Methods A working group comprised of global health educators from Ontario's six medical schools conducted a scoping review of global health curricula, competencies, and pedagogical approaches. The working group then hosted a full day meeting, inviting experts in education, clinical care, family medicine and public health, and developed a consensus process and draft framework to design global health curricula. Through a series of weekly teleconferences over the next six months, the framework was revised and used to guide the identification of enabling global health competencies (behaviours, skills and attitudes) for Canadian Family Medicine training. Results The main outcome was an evidence-informed interactive framework http://globalhealth.ennovativesolution.com/ to provide a shared foundation to guide the design, delivery and evaluation of global health education programs for Ontario's family medicine residency programs. The curriculum framework blended a definition and mission for global health training, core values and principles, global health competencies aligning with the Canadian Medical Education Directives for Specialists (CanMEDS) competencies, and key learning approaches. The framework guided the development of subsequent enabling competencies. Conclusions The shared curriculum framework can support the design, delivery and evaluation of global health curriculum in Canada and around the world, lay the foundation for research and development, provide consistency across programmes, and support the creation of learning and evaluation tools to align with the

  10. Globalization

    Tulio Rosembuj

    2006-12-01

    Full Text Available There is no singular globalization, nor is it the result of an individual agent. We could start by saying that global action has different angles, that the subjects who perform it are different, and that so are its objectives. The global is an invisible invasion of materials and immediate effects.

  12. Comparison of Control Approaches in Genetic Regulatory Networks by Using Stochastic Master Equation Models, Probabilistic Boolean Network Models and Differential Equation Models and Estimated Error Analyzes

    Caglar, Mehmet Umut; Pal, Ranadip

    2011-03-01

    The central dogma of molecular biology states that "information cannot be transferred back from protein to either protein or nucleic acid". However, this assumption is not exactly correct in most cases. There are many feedback loops and interactions between different levels of systems. These types of interactions are hard to analyze due to the lack of cell-level data and the probabilistic, nonlinear nature of the interactions. Several models are widely used to analyze and simulate these types of nonlinear interactions. Stochastic Master Equation (SME) models capture the probabilistic nature of the interactions in a detailed manner, at a high computational cost. On the other hand, Probabilistic Boolean Network (PBN) models give a coarse-scale picture of the stochastic processes, at a lower computational cost. Differential Equation (DE) models give the time evolution of the mean values of the processes in a highly cost-effective way. Understanding the relations between the predictions of these models is important for judging the reliability of simulations of genetic regulatory networks. In this work the success of the mapping between SME, PBN and DE models is analyzed, and the accuracy and effectiveness of the control policies generated by using PBN and DE models are compared.
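
    A minimal sketch of the trade-off described above, using an invented one-gene birth-death process: a Gillespie stochastic simulation stands in for the probabilistic detail that an SME-level description tracks, while a forward-Euler integration of the mean-field ODE dx/dt = k - g*x plays the role of the DE model. Rates, horizon and sample count are illustrative assumptions, not values from the paper.

        import random

        k, g = 5.0, 0.5      # hypothetical production / degradation rates
        T = 20.0             # simulated time horizon

        def gillespie(x0=0, seed=1):
            """Exact stochastic simulation of the birth-death process."""
            random.seed(seed)
            t, x = 0.0, x0
            while True:
                rates = (k, g * x)
                total = rates[0] + rates[1]
                t += random.expovariate(total)
                if t > T:
                    return x
                x += 1 if random.random() < rates[0] / total else -1

        def ode_mean(x0=0.0, dt=1e-3):
            """Deterministic mean-field model dx/dt = k - g*x (forward Euler)."""
            x = x0
            for _ in range(int(T / dt)):
                x += (k - g * x) * dt
            return x

        samples = [gillespie(seed=s) for s in range(200)]
        print("stochastic mean at T:", sum(samples) / len(samples))  # fluctuates near k/g = 10
        print("ODE mean at T      :", round(ode_mean(), 2))          # converges to k/g = 10

    The stochastic runs cost far more than the single ODE integration but also yield the distribution around the mean, which is exactly the cost/fidelity spectrum the abstract maps between SME, PBN and DE models.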

  13. HIERARCHICAL PROBABILISTIC INFERENCE OF COSMIC SHEAR

    Schneider, Michael D.; Dawson, William A.; Hogg, David W.; Marshall, Philip J.; Bard, Deborah J.; Meyers, Joshua; Lang, Dustin

    2015-01-01

    Point estimators for the shearing of galaxy images induced by gravitational lensing involve a complex inverse problem in the presence of noise, pixelization, and model uncertainties. We present a probabilistic forward modeling approach to gravitational lensing inference that has the potential to mitigate the biased inferences in most common point estimators and is practical for upcoming lensing surveys. The first part of our statistical framework requires specification of a likelihood function for the pixel data in an imaging survey given parameterized models for the galaxies in the images. We derive the lensing shear posterior by marginalizing over all intrinsic galaxy properties that contribute to the pixel data (i.e., not limited to galaxy ellipticities) and learn the distributions for the intrinsic galaxy properties via hierarchical inference with a suitably flexible conditional probability distribution specification. We use importance sampling to separate the modeling of small imaging areas from the global shear inference, thereby rendering our algorithm computationally tractable for large surveys. With simple numerical examples we demonstrate the improvements in accuracy from our importance sampling approach, as well as the significance of the conditional distribution specification for the intrinsic galaxy properties when the data are generated from an unknown number of distinct galaxy populations with different morphological characteristics.
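
    The computational core of the separation step above is importance sampling. The toy below reweights draws from a deliberately mismatched Gaussian proposal to estimate an expectation under a target density; both densities and the estimand are invented for illustration, not taken from the paper.

        import math, random

        random.seed(0)

        def normal_pdf(x, mu, sigma):
            return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

        mu_p, sigma_p = 0.0, 1.0      # target p: N(0, 1)
        mu_q, sigma_q = 0.5, 1.5      # proposal q: N(0.5, 1.5^2), deliberately mismatched

        xs = [random.gauss(mu_q, sigma_q) for _ in range(100_000)]
        weights = [normal_pdf(x, mu_p, sigma_p) / normal_pdf(x, mu_q, sigma_q) for x in xs]

        # Self-normalized importance-sampling estimate of E_p[x^2] (true value: 1.0)
        estimate = sum(w * x * x for w, x in zip(weights, xs)) / sum(weights)
        print(f"IS estimate of E[x^2]: {estimate:.3f}")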

  14. Time-varying correlations in global real estate markets: A multivariate GARCH with spatial effects approach

    Gu, Huaying; Liu, Zhixue; Weng, Yingliang

    2017-04-01

    The present study applies the multivariate generalized autoregressive conditional heteroscedasticity (MGARCH) with spatial effects approach for the analysis of the time-varying conditional correlations and contagion effects among global real estate markets. A distinguishing feature of the proposed model is that it can simultaneously capture the spatial interactions and the dynamic conditional correlations compared with the traditional MGARCH models. Results reveal that the estimated dynamic conditional correlations have exhibited significant increases during the global financial crisis from 2007 to 2009, thereby suggesting contagion effects among global real estate markets. The analysis further indicates that the returns of the regional real estate markets that are in close geographic and economic proximities exhibit strong co-movement. In addition, evidence of significantly positive leverage effects in global real estate markets is also determined. The findings have significant implications on global portfolio diversification opportunities and risk management practices.
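
    The paper's spatial MGARCH specification is beyond a short sketch, but the flavour of a time-varying conditional correlation can be conveyed with a much simpler RiskMetrics-style EWMA covariance update on synthetic two-market returns. The decay factor, break point and return model below are arbitrary choices, and the spatial terms of the paper's model are not reproduced.

        import numpy as np

        rng = np.random.default_rng(0)
        T, lam = 500, 0.94

        # Synthetic returns for two "markets" whose correlation rises halfway through
        r = rng.standard_normal((T, 2)) * 0.01
        r[T // 2:, 1] = 0.8 * r[T // 2:, 0] + 0.6 * r[T // 2:, 1]

        S = np.cov(r[:50].T)                    # initialize with a sample covariance
        corr_path = []
        for t in range(T):
            # EWMA update: S_t = lam * S_{t-1} + (1 - lam) * r_t r_t'
            S = lam * S + (1 - lam) * np.outer(r[t], r[t])
            corr_path.append(S[0, 1] / np.sqrt(S[0, 0] * S[1, 1]))

        print("mean correlation, first half :", np.mean(corr_path[:T // 2]).round(2))
        print("mean correlation, second half:", np.mean(corr_path[T // 2:]).round(2))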

  15. Global Kalman filter approaches to estimate absolute angles of lower limb segments.

    Nogueira, Samuel L; Lambrecht, Stefan; Inoue, Roberto S; Bortole, Magdo; Montagnoli, Arlindo N; Moreno, Juan C; Rocon, Eduardo; Terra, Marco H; Siqueira, Adriano A G; Pons, Jose L

    2017-05-16

    In this paper we propose the use of global Kalman filters (KFs) to estimate absolute angles of lower limb segments. Standard approaches adopt KFs to improve the performance of inertial sensors based on individual link configurations. In consequence, for a multi-body system like a lower limb exoskeleton, the inertial measurements of one link (e.g., the shank) are not taken into account in other link angle estimations (e.g., foot). Global KF approaches, on the other hand, correlate the collective contribution of all signals from lower limb segments observed in the state-space model through the filtering process. We present a novel global KF (matricial global KF) relying only on inertial sensor data, and validate both this KF and a previously presented global KF (Markov Jump Linear Systems, MJLS-based KF), which fuses data from inertial sensors and encoders from an exoskeleton. We furthermore compare both methods to the commonly used local KF. The results indicate that the global KFs performed significantly better than the local KF, with an average root mean square error (RMSE) of respectively 0.942° for the MJLS-based KF, 1.167° for the matricial global KF, and 1.202° for the local KFs. Including the data from the exoskeleton encoders also resulted in a significant increase in performance. The results indicate that the current practice of using KFs based on local models is suboptimal. Both the presented KF based on inertial sensor data, as well as our previously presented global approach fusing inertial sensor data with data from exoskeleton encoders, were superior to local KFs. We therefore recommend using global KFs for gait analysis and exoskeleton control.
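
    A minimal single-segment ("local") Kalman filter conveys the building block being compared above: the gyro rate drives the prediction and a noisy absolute-angle measurement corrects it. In a global KF the scalar state below becomes a stacked vector of all segment angles, so every sensor updates every angle through the cross-covariances. All noise levels and the motion model are invented for the sketch.

        import numpy as np

        rng = np.random.default_rng(42)
        dt, T = 0.01, 1000
        true_rate = 30.0                               # deg/s, hypothetical constant swing

        angle_true = np.cumsum(np.full(T, true_rate * dt))
        gyro = true_rate + rng.normal(0, 1.0, T)       # rate sensor, noisy
        meas = angle_true + rng.normal(0, 2.0, T)      # angle measurement, noisier

        x, P = 0.0, 1.0                                # state estimate and its variance
        Q, R = (1.0 * dt) ** 2, 2.0 ** 2               # process / measurement noise
        est = []
        for t in range(T):
            x, P = x + gyro[t] * dt, P + Q             # predict with the gyro
            K = P / (P + R)                            # Kalman gain
            x, P = x + K * (meas[t] - x), (1 - K) * P  # update with the angle measurement
            est.append(x)

        rmse = np.sqrt(np.mean((np.array(est) - angle_true) ** 2))
        print(f"RMSE of filtered angle: {rmse:.3f} deg")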

  16. Probabilistic Modeling and Visualization for Bankruptcy Prediction

    Antunes, Francisco; Ribeiro, Bernardete; Pereira, Francisco Camara

    2017-01-01

    In accounting and finance domains, bankruptcy prediction is of great utility for all of the economic stakeholders. The challenge of accurate assessment of business failure prediction, specially under scenarios of financial crisis, is known to be complicated. Although there have been many successful studies on bankruptcy detection, probabilistic approaches have seldom been carried out. In this paper we assume a probabilistic point of view by applying Gaussian Processes (GP) in the context of bankruptcy prediction, comparing it against the Support Vector Machines (SVM) and the Logistic Regression (LR). Using real-world bankruptcy data, an in-depth analysis is conducted showing that, in addition to a probabilistic interpretation, the GP can effectively improve the bankruptcy prediction performance with high accuracy when compared to the other approaches. We additionally generate a complete graphical...
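
    A compact stand-in for the three-way comparison, run on synthetic imbalanced data rather than the paper's bankruptcy set, using scikit-learn's implementations of the three classifiers:

        from sklearn.datasets import make_classification
        from sklearn.model_selection import train_test_split
        from sklearn.gaussian_process import GaussianProcessClassifier
        from sklearn.gaussian_process.kernels import RBF
        from sklearn.svm import SVC
        from sklearn.linear_model import LogisticRegression

        # Synthetic, imbalanced stand-in for failure data (not the paper's data set)
        X, y = make_classification(n_samples=600, n_features=10, n_informative=5,
                                   weights=[0.8, 0.2], random_state=0)
        Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

        models = {
            "GP (RBF kernel)":     GaussianProcessClassifier(kernel=1.0 * RBF(1.0), random_state=0),
            "SVM":                 SVC(probability=True, random_state=0),
            "logistic regression": LogisticRegression(max_iter=1000),
        }
        for name, model in models.items():
            model.fit(Xtr, ytr)
            # GP and LR natively output posterior class probabilities, which is the
            # probabilistic interpretation the abstract highlights; SVC needs Platt scaling.
            print(f"{name:19s} accuracy = {model.score(Xte, yte):.3f}")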

  17. Integrated assessment of the global warming problem. A decision-analytical approach

    Van Lenthe, J.; Hendrickx, L.; Vlek, C.A.J.

    1995-01-01

    The project on the title subject aims at developing a policy-oriented methodology for the integrated assessment of the global warming problem. Decision analysis in general, and influence diagrams in particular, appear to constitute an appropriate integrated assessment methodology. The influence-diagram approach is illustrated by a preliminary integrated modeling of the global warming problem. In the next stages of the research, attention will shift from the methodology of integrated assessment to the contents of integrated models. 4 figs., 5 refs

  18. Finsler metrics—a global approach with applications to geometric function theory

    Abate, Marco

    1994-01-01

    Complex Finsler metrics appear naturally in complex analysis. To develop new tools in this area, the book provides a graduate-level introduction to differential geometry of complex Finsler metrics. After reviewing real Finsler geometry stressing global results, complex Finsler geometry is presented introducing connections, Kählerianity, geodesics, curvature. Finally global geometry and complex Monge-Ampère equations are discussed for Finsler manifolds with constant holomorphic curvature, which are important in geometric function theory. Following E. Cartan, S.S. Chern and S. Kobayashi, the global approach carries the full strength of hermitian geometry of vector bundles avoiding cumbersome computations, and thus fosters applications in other fields.

  19. An Approach for Assessing the Benefits of IT Investments in Global Supply Chains

    Betz, Michaela; Henningsson, Stefan

    2016-01-01

    This paper develops and demonstrates a novel approach for ex-ante assessment of business benefits from IT investments in global supply chains. Extant IT assessment approaches are typically based on the assumption that benefit realization from IT investments involves a single stakeholder and that benefits are produced by the technology as an isolated product. In contrast, research on global supply chains has shown that benefits generated from IT investments in this domain are typically generated by the coordinated use of many stakeholders and by technologies producing complementary effects in systemic relationships. The assessment approach in this paper brings the contingent inter-organizational and technological dependencies of IT investments to the forefront of the assessment. It provides actors in industries relating to global supply chains the means to better apprehend the possible benefits from an IT investment...

  20. An integrated approach coupling physically based models and probabilistic method to assess quantitatively landslide susceptibility at different scale: application to different geomorphological environments

    Vandromme, Rosalie; Thiéry, Yannick; Sedan, Olivier; Bernardie, Séverine

    2016-04-01

    Landslide hazard assessment is the estimation of a target area where landslides of a particular type, volume, runout and intensity may occur within a given period. The first step in analyzing landslide hazard consists of assessing the spatial and temporal failure probability (when the information is available, i.e. susceptibility assessment). Two types of approach are generally recommended to achieve this goal: (i) qualitative approaches (i.e. inventory-based methods and knowledge-driven methods) and (ii) quantitative approaches (i.e. data-driven methods or deterministic physically based methods). Among quantitative approaches, deterministic physically based methods (PBM) are generally used at local and/or site-specific scales (1:5,000-1:25,000 and >1:5,000, respectively). The main advantage of these methods is the calculation of the probability of failure (safety factor) under specific environmental conditions. For some models it is possible to integrate land use and climatic change. On the other hand, the major drawbacks are the large amounts of reliable and detailed data required (especially the material types, their thickness and the heterogeneity of the geotechnical parameters over a large area) and the fact that only shallow landslides are taken into account. This is why they are often used at site-specific scales (>1:5,000). Thus, to take into account (i) the heterogeneity of materials, (ii) the spatial variation of physical parameters and (iii) different landslide types, the French Geological Survey (i.e. BRGM) has developed a physically based model (PBM) implemented in a GIS environment. This PBM couples a global hydrological model (GARDENIA®) including a transient unsaturated/saturated hydrological component with a physically based model computing the stability of slopes (ALICE®, Assessment of Landslides Induced by Climatic Events) based on the Morgenstern-Price method for any slip surface. The variability of mechanical parameters is handled by a Monte Carlo approach. The
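
    As a sketch of the Monte Carlo treatment of parameter variability (not the Morgenstern-Price machinery itself), the snippet below propagates assumed cohesion and friction-angle distributions through an infinite-slope factor of safety and reports a failure probability for one cell. All geometry, unit weights and distributions are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(7)
        N = 100_000

        beta = np.radians(30.0)          # slope angle
        z, m = 2.0, 0.5                  # failure depth (m), water table ratio
        gamma, gamma_w = 19.0, 9.81      # soil / water unit weights (kN/m3)
        c = rng.normal(8.0, 2.0, N)      # cohesion (kPa), hypothetical distribution
        phi = np.radians(rng.normal(32.0, 3.0, N))  # friction angle, hypothetical

        # Infinite-slope factor of safety with seepage
        resisting = c + (gamma - m * gamma_w) * z * np.cos(beta) ** 2 * np.tan(phi)
        driving = gamma * z * np.sin(beta) * np.cos(beta)
        FS = resisting / driving

        print(f"P(FS < 1) = {np.mean(FS < 1.0):.4f}")   # failure probability for this cell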

  1. Probabilistic record linkage.

    Sayers, Adrian; Ben-Shlomo, Yoav; Blom, Ashley W; Steele, Fiona

    2016-06-01

    Studies involving the use of probabilistic record linkage are becoming increasingly common. However, the methods underpinning probabilistic record linkage are not widely taught or understood, and therefore these studies can appear to be a 'black box' research tool. In this article, we aim to describe the process of probabilistic record linkage through a simple exemplar. We first introduce the concept of deterministic linkage and contrast this with probabilistic linkage. We illustrate each step of the process using a simple exemplar and describe the data structure required to perform a probabilistic linkage. We describe the process of calculating and interpreting matched weights and how to convert matched weights into posterior probabilities of a match using Bayes theorem. We conclude this article with a brief discussion of some of the computational demands of record linkage, how you might assess the quality of your linkage algorithm, and how epidemiologists can maximize the value of their record-linked research using robust record linkage methods. © The Author 2015; Published by Oxford University Press on behalf of the International Epidemiological Association.
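
    A bare-bones version of the computation described: per-field match weights log2(m/u) for agreements and log2((1-m)/(1-u)) for disagreements are summed, then converted into a posterior match probability with Bayes' theorem. The m- and u-probabilities and the prior odds below are invented for the exemplar.

        import math

        # For each field: m = P(agrees | true match), u = P(agrees | non-match)
        fields = {
            #  field     m      u      agrees?
            "surname": (0.95, 0.010, True),
            "sex":     (0.98, 0.500, True),
            "dob":     (0.90, 0.005, False),
        }

        total_weight = 0.0
        for name, (m, u, agrees) in fields.items():
            w = math.log2(m / u) if agrees else math.log2((1 - m) / (1 - u))
            total_weight += w
            print(f"{name:8s} weight = {w:+.2f}")

        # Bayes' theorem in odds form: posterior odds = prior odds * likelihood ratio,
        # with a prior of 1 true match per 1000 candidate pairs (an assumption).
        prior_odds = 1 / 999
        posterior_odds = prior_odds * 2 ** total_weight
        print(f"P(match | evidence) = {posterior_odds / (1 + posterior_odds):.3f}")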

  2. Globalization

    Andrușcă Maria Carmen

    2013-01-01

    The field of globalization has highlighted an interdependence implied by a more harmonious understanding, determined by the daily interaction between nations through the inducement of peace and the management of the streamlining and effectiveness of the global economy. For globalization to function, the developing countries, which can be helped by the developed ones, must be involved. The international community can contribute to the institution of the development environment of the gl...

  3. Quantitative interpretation of myocardial Tl-201 single-photon emission computerized tomograms: A probabilistic approach to the assessment of coronary artery disease

    Maddahi, J.; Prigent, F.; Staniloff, H.; Garcia, E.; Becerra, A.; Van Train, K.; Swan, H.J.C.; Waxman, A.; Berman, D.

    1985-01-01

    Probabilistic criteria for abnormality would enhance the application of stress-redistribution Tl-201 rotational tomography (tomo) for the evaluation of coronary artery disease (CAD). Thus, 91 pts were studied, of whom 45 had angiographic CAD (≥ 50% coronary narrowing) and 46 were normal (nl). The validity of this model was prospectively tested in the remaining 51 pts (26 nls and 25 with CAD) by comparing the predicted and observed likelihood of CAD in four subgroups (I-IV). In this paper a logistic model is developed and validated that assigns a CAD likelihood to the quantified size of tomographic myocardial perfusion defects.
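
    The shape of such a model is easy to state concretely: the sketch below maps a quantified defect size to a CAD likelihood through a logistic function. The coefficients are made up for illustration; the paper's fitted values are not reproduced here.

        import math

        b0, b1 = -2.0, 0.25    # hypothetical intercept and slope

        def cad_likelihood(defect_size_pct):
            """Logistic model: P(CAD) = 1 / (1 + exp(-(b0 + b1 * size)))."""
            return 1.0 / (1.0 + math.exp(-(b0 + b1 * defect_size_pct)))

        for size in (0, 5, 10, 20):
            print(f"defect size {size:2d}% -> P(CAD) = {cad_likelihood(size):.2f}")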

  4. Formalizing Probabilistic Safety Claims

    Herencia-Zapana, Heber; Hagen, George E.; Narkawicz, Anthony J.

    2011-01-01

    A safety claim for a system is a statement that the system, which is subject to hazardous conditions, satisfies a given set of properties. Following work by John Rushby and Bev Littlewood, this paper presents a mathematical framework that can be used to state and formally prove probabilistic safety claims. It also enables hazardous conditions, their uncertainties, and their interactions to be integrated into the safety claim. This framework provides a formal description of the probabilistic composition of an arbitrary number of hazardous conditions and their effects on system behavior. An example is given of a probabilistic safety claim for a conflict detection algorithm for aircraft in a 2D airspace. The motivation for developing this mathematical framework is that it can be used in an automated theorem prover to formally verify safety claims.

  5. Application of probabilistic precipitation forecasts from a ...

    Application of probabilistic precipitation forecasts from a deterministic model towards increasing the lead-time of flash flood forecasts in South Africa. ... The procedure is applied to a real flash flood event and the ensemble-based rainfall forecasts are verified against rainfall estimated by the SAFFG system. The approach ...

  6. A Probabilistic Framework for Curve Evolution

    Dahl, Vedrana Andersen

    2017-01-01

    Advantages of our approach include the ability to handle textured images, simple generalization to multiple regions, and efficiency in computation. We test our probabilistic framework in combination with parametric (snakes) and geometric (level-sets) curves. The experimental results on composed and natural images demonstrate...

  7. Probabilistic Durability Analysis in Advanced Engineering Design

    A. Kudzys

    2000-01-01

    Full Text Available The expedience of probabilistic durability concepts and approaches in the advanced engineering design of building materials, structural members and systems is considered. Target margin values of structural safety and serviceability indices are analyzed and their draft values are presented. Analytical methods of the cumulative coefficient of correlation and the limit transient action effect for the calculation of reliability indices are given. The analysis can be used for the probabilistic durability assessment of load-carrying and enclosure structures of metal, reinforced concrete, wood, plastic and masonry, both homogeneous and sandwich or composite, and some kinds of equipment. The analysis models can be applied in other engineering fields.

  8. Coordinated approaches to quantify long-term ecosystem dynamics in response to global change

    Liu, Y.; Melillo, J.; Niu, S.

    2011-01-01

    Many serious ecosystem consequences of climate change will take decades or even centuries to emerge. Long-term ecological responses to global change are strongly regulated by slow processes, such as changes in species composition, carbon dynamics in soil and by long-lived plants, and accumulation... a coordinated approach that combines long-term, large-scale global change experiments with process studies and modeling. Long-term global change manipulative experiments, especially in high-priority ecosystems such as tropical forests and high-latitude regions, are essential to maximize information gain... to be the most effective strategy to gain the best information on long-term ecosystem dynamics in response to global change.

  9. Probabilistic Mu-Calculus

    Larsen, Kim Guldstrand; Mardare, Radu Iulian; Xue, Bingtian

    2016-01-01

    We introduce a version of the probabilistic µ-calculus (PMC) built on top of a probabilistic modal logic that allows encoding n-ary inequational conditions on transition probabilities. PMC extends previously studied calculi and we prove that, despite its expressiveness, it enjoys a series of good metaproperties. Firstly, we prove the decidability of satisfiability checking by establishing the small model property. An algorithm for deciding the satisfiability problem is developed. As a second major result, we provide a complete axiomatization for the alternation-free fragment of PMC. The completeness proof...

  10. Adding a soil fertility dimension to the global farming systems approach, with cases from Africa

    Smaling, E.M.A.; Dixon, J.

    2006-01-01

    The global farming systems (GFS) approach is extended by adding a soil fertility and nutrient management dimension for Africa's forest-based, maize mixed, cereal-root crop mixed, and agro-pastoral millet/sorghum farming systems. Use is made of sustainable livelihood concepts, translated into farmer

  11. Global Practical Stabilization and Tracking for an Underactuated Ship - A Combined Averaging and Backstepping Approach

    Kristin Y. Pettersen

    1999-10-01

    Full Text Available We solve both the global practical stabilization and tracking problem for an underactuated ship, using a combined integrator backstepping and averaging approach. Exponential convergence to an arbitrarily small neighbourhood of the origin and of the reference trajectory, respectively, is proved. Simulation results are included.

  12. Quantifying Spatial Variation in Ecosystem Services Demand : A Global Mapping Approach

    Wolff, S.; Schulp, C. J E; Kastner, T.; Verburg, P. H.

    2017-01-01

    Understanding the spatial-temporal variability in ecosystem services (ES) demand can help anticipate externalities of land use change. This study presents new operational approaches to quantify and map demand for three non-commodity ES on a global scale: animal pollination, wild medicinal plants and

  13. Global Learning in a Geography Course Using the Mystery Method as an Approach to Complex Issues

    Applis, Stefan

    2014-01-01

    The study on which this essay is based examines whether the complexity of global issues can be addressed at the level of teaching methodology. In this context, the first qualitative and constructive study was carried out that researches the Mystery Method within the Thinking-Through-Geography approach (David Leat,…

  14. Scalable group level probabilistic sparse factor analysis

    Hinrich, Jesper Løve; Nielsen, Søren Føns Vind; Riis, Nicolai Andre Brogaard

    2017-01-01

    Many data-driven approaches exist to extract neural representations of functional magnetic resonance imaging (fMRI) data, but most of them lack a proper probabilistic formulation. We propose a scalable group level probabilistic sparse factor analysis (psFA) allowing spatially sparse maps, component pruning using automatic relevance determination (ARD) and subject specific heteroscedastic spatial noise modeling. For task-based and resting state fMRI, we show that the sparsity constraint gives rise to components similar to those obtained by group independent component analysis. The noise modeling shows that noise is reduced in areas typically associated with activation by the experimental design. The psFA model identifies sparse components and the probabilistic setting provides a natural way to handle parameter uncertainties. The variational Bayesian framework easily extends to more complex...

  15. Probabilistic Design of Wave Energy Devices

    Sørensen, John Dalsgaard; Kofoed, Jens Peter; Ferreira, C.B.

    2011-01-01

    Wave energy has a large potential for contributing significantly to the production of renewable energy. However, the wave energy sector is still not able to deliver cost-competitive and reliable solutions. But the sector has already demonstrated several proofs of concept. The design of wave energy devices is a new and expanding technical area where there is no tradition for probabilistic design; in fact, very few full-scale devices have been built to date, so it can be said that no design tradition really exists in this area. For this reason it is considered to be of great importance to develop and advocate for a probabilistic design approach, as it is assumed (in other areas this has been demonstrated) that this leads to more economical designs compared to designs based on deterministic methods. In the present paper a general framework for probabilistic design and reliability analysis of wave energy...

  16. Probabilistic Flood Defence Assessment Tools

    Slomp Robert

    2016-01-01

    institutions managing the flood defences, and not by just a small number of experts in probabilistic assessment. Therefore, data management and use of software are main issues that have been covered in courses and training in 2016 and 2017. All in all, this is the largest change in the assessment of Dutch flood defences since 1996. In 1996 probabilistic techniques were first introduced to determine hydraulic boundary conditions (water levels and waves (wave height, wave period and direction)) for different return periods. To simplify the process, the assessment continues to consist of a three-step approach, moving from simple decision rules, to the methods for semi-probabilistic assessment, and finally to a fully probabilistic analysis to compare the strength of flood defences with the hydraulic loads. The formal assessment results are thus mainly based on the fully probabilistic analysis and the ultimate limit state of the strength of a flood defence. For complex flood defences, additional models and software were developed. The current Hydra software suite (for policy analysis, formal flood defence assessment and design) will be replaced by the model Ringtoets. New stand-alone software has been developed for revetments, geotechnical analysis and slope stability of the foreshore. Design software and policy analysis software, including the Delta model, will be updated in 2018. A fully probabilistic method results in more precise assessments and more transparency in the process of assessment and reconstruction of flood defences. This is of increasing importance, as large-scale infrastructural projects in a highly urbanized environment are increasingly subject to political and societal pressure to add additional features. For this reason, it is of increasing importance to be able to determine which new feature really adds to flood protection, to quantify how much it adds to the level of flood protection and to evaluate if it is really worthwhile. Please note: The Netherlands
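
    The core of a fully probabilistic assessment is a load/strength comparison under distributions rather than at a single design point. The snippet below estimates an annual failure probability by Monte Carlo for one invented dike section; the distributions and parameters are illustrative only and are not calibrated to any Dutch flood defence.

        import numpy as np

        rng = np.random.default_rng(1)
        N = 1_000_000

        crest = rng.normal(6.0, 0.15, N)       # strength: crest level (m), uncertain
        water = rng.gumbel(3.5, 0.45, N)       # load: annual max water level (m), Gumbel

        p_fail = np.mean(water > crest)        # P(load exceeds strength)
        print(f"annual failure probability ~ {p_fail:.2e}")
        print(f"return period ~ {1 / p_fail:,.0f} years")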

  17. Probabilistic systems coalgebraically: A survey

    Sokolova, Ana

    2011-01-01

    We survey the work on both discrete and continuous-space probabilistic systems as coalgebras, starting with how probabilistic systems are modeled as coalgebras and followed by a discussion of their bisimilarity and behavioral equivalence, mentioning results that follow from the coalgebraic treatment of probabilistic systems. It is interesting to note that, for different reasons, for both discrete and continuous probabilistic systems it may be more convenient to work with behavioral equivalence than with bisimilarity. PMID:21998490

  18. Adjoint-based global variance reduction approach for reactor analysis problems

    Zhang, Qiong; Abdel-Khalik, Hany S.

    2011-01-01

    A new variant of a hybrid Monte Carlo-Deterministic approach for simulating particle transport problems is presented and compared to the SCALE FW-CADIS approach. The new approach, denoted by the Subspace approach, optimizes the selection of the weight windows for reactor analysis problems where detailed properties of all fuel assemblies are required everywhere in the reactor core. Like the FW-CADIS approach, the Subspace approach utilizes importance maps obtained from deterministic adjoint models to derive automatic weight-window biasing. In contrast to FW-CADIS, the Subspace approach identifies the correlations between weight window maps to minimize the computational time required for global variance reduction, i.e., when the solution is required everywhere in the phase space. The correlations are employed to reduce the number of maps required to achieve the same level of variance reduction that would be obtained with single-response maps. Numerical experiments, serving as proof of principle, are presented to compare the Subspace and FW-CADIS approaches in terms of the global reduction in standard deviation. (author)

  19. Global sourcing risk management approaches: A study of small clothing and textile retailers in Gauteng

    Wesley Niemann

    2018-02-01

    Full Text Available Background: Global sourcing has increased as buyers searched for new markets that offered better pricing, quality, variety and delivery lead times than their local markets. However, the increase in global sourcing has also exposed businesses to many supply risks. Purpose: The purpose of this descriptive qualitative study was to explore the global sourcing supply risks encountered by small clothing and textile retailers in Gauteng and to determine what supply risk identification and management approaches they utilise. Method: This study utilised semi-structured interviews conducted with 12 small clothing and textile retail owners. Results: The study found that the three major supply risks encountered by these retailers were fluctuating exchange rates, communication barriers and costly and complicated logistics, which included high customs costs. Furthermore, although aware of the supply risks, none of the small clothing and textile retailers had formal identification and management approaches in place. Instead, risks are dealt with at the sole discretion of the owner as and when they occur. The study also found that informal identification and management approaches were being applied by some of the retailers. These included factoring exchange rate fluctuations into the profit margins and using translators to combat communication barriers. Contribution: The study is one of the first empirical studies conducted on global supply risks and the associated identification and management approaches in the South African small business context, specifically focused on clothing and textile retailers. Conclusion: Small clothing and textile retailers need to proactively identify and manage global sourcing risk using the identified approaches in order to reduce and mitigate potential supply disruptions.

  20. Confluence reduction for probabilistic systems

    Timmer, Mark; van de Pol, Jan Cornelis; Stoelinga, Mariëlle Ida Antoinette

    In this presentation we introduce a novel technique for state space reduction of probabilistic specifications, based on a newly developed notion of confluence for probabilistic automata. We proved that this reduction preserves branching probabilistic bisimulation and can be applied on-the-fly. To

  1. River Export of Plastic from Land to Sea: A Global Modeling Approach

    Siegfried, Max; Gabbert, Silke; Koelmans, Albert A.; Kroeze, Carolien; Löhr, Ansje; Verburg, Charlotte

    2016-04-01

    Plastic is increasingly considered a serious cause of water pollution. It is a threat to aquatic ecosystems, including rivers, coastal waters and oceans. Rivers transport considerable amounts of plastic from land to sea. The quantity and its main sources, however, are not well known. Assessing the amount of macro- and microplastic transport from river to sea is, therefore, important for understanding the dimension and the patterns of plastic pollution of aquatic ecosystems. In addition, it is crucial for assessing short- and long-term impacts caused by plastic pollution. Here we present a global modelling approach to quantify river export of plastic from land to sea. Our approach accounts for different types of plastic, including both macro- and micro-plastics. Moreover, we distinguish point sources and diffuse sources of plastic in rivers. Our modelling approach is inspired by global nutrient models, which include more than 6000 river basins. In this paper, we will present our modelling approach, as well as first model results for micro-plastic pollution in European rivers. Important sources of micro-plastics include personal care products, laundry, household dust and car tyre wear. We combine information on these sources with information on sewage management, and plastic retention during river transport for the largest European rivers. Our modelling approach may help to better understand and prevent water pollution by plastic, and at the same time serves as a 'proof of concept' for future application on a global scale.
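
    A toy version of the bookkeeping such a model performs for one hypothetical basin: per-capita source loads are aggregated, routed through sewage treatment, and attenuated by in-river retention. Every number below is a placeholder for illustration, not a model result.

        # Illustrative microplastic source accounting for one hypothetical river basin
        SOURCES_KG_PER_CAP_YR = {
            "personal care products": 0.010,
            "laundry fibres":         0.120,
            "household dust":         0.080,
            "car tyre wear":          0.900,
        }
        population = 2_000_000
        sewage_connection = 0.85      # share of population connected to sewers (assumed)
        wwtp_removal = 0.90           # treatment-plant removal efficiency (assumed)
        river_retention = 0.40        # fraction retained during river transport (assumed)

        emitted = 0.0
        for source, load in SOURCES_KG_PER_CAP_YR.items():
            gross = load * population
            # Only the sewered share passes through treatment; the rest enters untreated
            to_river = gross * (sewage_connection * (1 - wwtp_removal)
                                + (1 - sewage_connection))
            emitted += to_river

        export_to_sea = emitted * (1 - river_retention)
        print(f"microplastic export to sea: {export_to_sea / 1000:.1f} t/yr")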

  2. A GOCE-only global gravity field model by the space-wise approach

    Migliaccio, Frederica; Reguzzoni, Mirko; Gatti, Andrea

    2011-01-01

    The global gravity field model computed by the space-wise approach is one of three official solutions delivered by ESA from the analysis of the GOCE data. The model consists of a set of spherical harmonic coefficients and the corresponding error covariance matrix. The main idea behind this approach... the orbit to reduce the noise variance and correlation before gridding the data. In the first release of the space-wise approach, based on a period of about two months, some prior information coming from existing gravity field models entered into the solution especially at low degrees and low orders... degrees; the second is an internally computed GOCE-only prior model to be used in place of the official quick-look model, thus removing the dependency on EIGEN5C especially in the polar gaps. Once the procedure to obtain a GOCE-only solution has been outlined, a new global gravity field model has been...

  3. Probabilistic thread algebra

    Bergstra, J.A.; Middelburg, C.A.

    2015-01-01

    We add probabilistic features to basic thread algebra and its extensions with thread-service interaction and strategic interleaving. Here, threads represent the behaviours produced by instruction sequences under execution and services represent the behaviours exhibited by the components of execution

  4. Probabilistic simple sticker systems

    Selvarajoo, Mathuri; Heng, Fong Wan; Sarmin, Nor Haniza; Turaev, Sherzod

    2017-04-01

    A model for DNA computing using the recombination behavior of DNA molecules, known as a sticker system, was introduced by L. Kari, G. Paun, G. Rozenberg, A. Salomaa, and S. Yu in the paper entitled "DNA computing, sticker systems and universality" (Acta Informatica, vol. 35, pp. 401-420, 1998). A sticker system uses the Watson-Crick complementarity feature of DNA molecules: starting from incomplete double-stranded sequences, sticking operations are applied iteratively until a complete double-stranded sequence is obtained. It is known that sticker systems with finite sets of axioms and sticker rules generate only regular languages. Hence, different types of restrictions have been considered to increase the computational power of sticker systems. Recently, a variant of restricted sticker systems, called probabilistic sticker systems, has been introduced [4]. In this variant, probabilities are initially associated with the axioms, and the probability of a generated string is computed by multiplying the probabilities of all occurrences of the initial strings in the computation of the string. Strings for the language are selected according to some probabilistic requirements. In this paper, we study fundamental properties of probabilistic simple sticker systems. We prove that the probabilistic enhancement increases the computational power of simple sticker systems.
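
    The probabilistic selection rule is easy to state concretely: multiply the probabilities of the axioms used in a derivation and keep strings whose probability clears a cut-off. The axioms, derivations and threshold below are invented for the example.

        axiom_prob = {"A": 0.5, "B": 0.3, "C": 0.2}

        def derivation_probability(axioms_used):
            """Product of the probabilities of all axiom occurrences in the derivation."""
            p = 1.0
            for a in axioms_used:
                p *= axiom_prob[a]
            return p

        threshold = 0.05
        derivations = {"w1": ["A", "A"], "w2": ["A", "B"], "w3": ["B", "C", "C"]}
        language = {w for w, ax in derivations.items()
                    if derivation_probability(ax) >= threshold}
        print(language)   # {'w1', 'w2'}: w3 has probability 0.012 < 0.05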

  5. Visualizing Probabilistic Proof

    Guerra-Pujol, Enrique

    2015-01-01

    The author revisits the Blue Bus Problem, a famous thought-experiment in law involving probabilistic proof, and presents simple Bayesian solutions to different versions of the blue bus hypothetical. In addition, the author expresses his solutions in standard and visual formats, i.e. in terms of probabilities and natural frequencies.
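
    For one witness-based version of the hypothetical, the Bayesian solution fits in a few lines; the base rate and witness reliability below are customary illustrative figures in the spirit of the problem, not values from the article.

        # Blue Bus in natural frequencies: the Blue Bus company runs 80% of the buses
        # in town, and an eyewitness identifies bus colour correctly 90% of the time.
        base_rate_blue = 0.80
        sensitivity = 0.90            # P(witness says "blue" | bus was blue)
        false_alarm = 0.10            # P(witness says "blue" | bus was not blue)

        says_blue_and_blue = base_rate_blue * sensitivity            # 0.72
        says_blue_and_not = (1 - base_rate_blue) * false_alarm       # 0.02
        posterior = says_blue_and_blue / (says_blue_and_blue + says_blue_and_not)
        print(f"P(blue bus | witness says blue) = {posterior:.3f}")  # ~0.973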

  6. Probabilistic Load Flow

    Chen, Peiyuan; Chen, Zhe; Bak-Jensen, Birgitte

    2008-01-01

    This paper reviews the development of the probabilistic load flow (PLF) techniques. Applications of the PLF techniques in different areas of power system steady-state analysis are also discussed. The purpose of the review is to identify different available PLF techniques and their corresponding...
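
    The idea common to most Monte Carlo PLF techniques can be shown on a deliberately trivial two-bus DC model: sample the uncertain injections, solve the (here one-line) network, and read distributions rather than single values off the results. All parameters are hypothetical.

        import numpy as np

        rng = np.random.default_rng(3)
        N = 200_000

        x_line = 0.1                          # line reactance (p.u.)
        load = rng.normal(1.0, 0.15, N)       # uncertain load at bus 2 (p.u.)

        flow = load                           # the single line carries the whole load
        angle_diff = flow * x_line            # theta1 - theta2 = P * x (DC approximation)

        limit = 1.3                           # assumed thermal limit of the line (p.u.)
        print(f"P(line overload)    = {np.mean(flow > limit):.4f}")
        print(f"95th pct angle diff = {np.quantile(angle_diff, 0.95):.4f} rad")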

  7. Transitive probabilistic CLIR models.

    Kraaij, W.; de Jong, Franciska M.G.

    2004-01-01

    Transitive translation could be a useful technique to enlarge the number of supported language pairs for a cross-language information retrieval (CLIR) system in a cost-effective manner. The paper describes several setups for transitive translation based on probabilistic translation models. The

  8. A hybrid computational approach to estimate solar global radiation: An empirical evidence from Iran

    Mostafavi, Elham Sadat; Ramiyani, Sara Saeidi; Sarvar, Rahim; Moud, Hashem Izadi; Mousavi, Seyyed Mohammad

    2013-01-01

    This paper presents an innovative hybrid approach for the estimation of the solar global radiation. New prediction equations were developed for the global radiation using an integrated search method of genetic programming (GP) and simulated annealing (SA), called GP/SA. The solar radiation was formulated in terms of several climatological and meteorological parameters. Comprehensive databases containing monthly data collected for 6 years in two cities of Iran were used to develop GP/SA-based models. Separate models were established for each city. The generalization of the models was verified using a separate testing database. A sensitivity analysis was conducted to investigate the contribution of the parameters affecting the solar radiation. The derived models make accurate predictions of the solar global radiation and notably outperform the existing models. Highlights: a hybrid approach is presented for the estimation of the solar global radiation; the proposed method integrates the capabilities of GP and SA; several climatological and meteorological parameters are included in the analysis; the GP/SA models make accurate predictions of the solar global radiation.
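
    The SA half of such a hybrid search can be sketched by fitting an Angstrom-Prescott-type relation H/H0 = a + b(S/S0) to synthetic data. The GP half (evolving the functional form itself) is omitted, and the data, proposal moves and cooling schedule are all invented.

        import math, random

        random.seed(0)
        s = [random.uniform(0.2, 0.9) for _ in range(60)]            # sunshine fraction S/S0
        h = [0.25 + 0.5 * si + random.gauss(0, 0.02) for si in s]    # synthetic "observed" H/H0

        def rmse(a, b):
            return math.sqrt(sum((a + b * si - hi) ** 2 for si, hi in zip(s, h)) / len(s))

        a, b = 0.0, 1.0                      # initial guess
        cost = rmse(a, b)
        T = 1.0
        while T > 1e-4:
            a2, b2 = a + random.gauss(0, 0.05), b + random.gauss(0, 0.05)
            cost2 = rmse(a2, b2)
            # Accept downhill moves always, uphill moves with Boltzmann probability
            if cost2 < cost or random.random() < math.exp((cost - cost2) / T):
                a, b, cost = a2, b2, cost2
            T *= 0.995                       # geometric cooling schedule

        print(f"fitted a = {a:.3f}, b = {b:.3f}, RMSE = {cost:.4f}")  # near a=0.25, b=0.5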

  9. Limited probabilistic risk assessment applications in plant backfitting

    Desaedeleer, G.

    1987-01-01

    Plant backfitting programs are defined on the basis of deterministic (e.g. Systematic Evaluation Program) or probabilistic (e.g. Probabilistic Risk Assessment) approaches. Each approach provides valuable assets in defining the program and has its own advantages and disadvantages. Ideally one should combine the strong points of each approach. This chapter summarizes actual experience gained from combinations of deterministic and probabilistic approaches to define and implement PWR backfitting programs. Such combinations relate to limited applications of probabilistic techniques and are illustrated for upgrading fluid systems. These evaluations allow a sound and rational optimization of system upgrades. However, the boundaries of the reliability analysis need to be clearly defined, and system reliability may have to go beyond classical boundaries (e.g. identification of weak links in support systems). Also, the implementation of upgrades on a system-per-system basis is not necessarily cost-effective. (author)

  10. Probabilistic risk assessment for six vapour intrusion algorithms

    Provoost, J.; Reijnders, L.; Bronders, J.; Van Keer, I.; Govaerts, S.

    2014-01-01

    A probabilistic assessment with sensitivity analysis using Monte Carlo simulation for six vapour intrusion algorithms, used in various regulatory frameworks for contaminated land management, is presented here. In addition a deterministic approach with default parameter sets is evaluated against

  11. Fairness through regulation? Reflections on a cosmopolitan approach to global finance

    Marta Božina Beroš

    2013-11-01

    Full Text Available In the aftermath of the last financial crisis a strong message prevails that ‘something’ has to be changed in the manner global finance is governed. What exactly this ‘something’ entails and what could constitute the ‘common ground’ of anticipated change is more difficult to determine. Many envisage future improvements of global financial governance by evoking deliberative democracy, political equality and cosmopolitanism. As financial regulation is the main instrument through which global finance is shaped and governed nowadays, these principles should then be transmitted to regulatory arrangements. This paper focuses on a new conceptual approach to regulatory and governance issues in global finance, by employing the philosophical idea of cosmopolitanism. It argues that although as a concept, cosmopolitanism cannot mitigate all the flaws attributed to contemporary finance, its development and extension to international financial regulation that is promulgated by institutions of the global financial system, would represent a worthwhile endeavour in making global finance more accountable and just in the eyes of many.

  12. Global Fund financing of public-private mix approaches for delivery of tuberculosis care.

    Lal, S S; Uplekar, Mukund; Katz, Itamar; Lonnroth, Knut; Komatsu, Ryuichi; Yesudian Dias, Hannah Monica; Atun, Rifat

    2011-06-01

    To map the extent and scope of public-private mix (PPM) interventions in tuberculosis (TB) control programmes supported by the Global Fund. We reviewed the Global Fund's official documents and data to analyse the distribution, characteristics and budgets of PPM approaches within Global Fund supported TB grants in recipient countries between 2003 and 2008. We supplemented this analysis with data on contribution of PPM to TB case notifications in 14 countries reported to World Health Organization in 2009, for the preparation of the global TB control report. Fifty-eight of 93 countries and multi-country recipients of Global Fund-supported TB grants had PPM activities in 2008. Engagement with 'for-profit' private sector was more prevalent in South Asia while involvement of prison health services has been common in Eastern Europe and central Asia. In the Middle East and North Africa, involving non-governmental organizations seemed to be the focus. Average and median spending on PPM within grants was 10% and 5% respectively, ranging from 0.03% to 69% of the total grant budget. In China, India, Nigeria and the Philippines, PPM contributed to detecting more than 25% TB cases while maintaining high treatment success rates. In spite of evidence of cost-effectiveness, PPM constitutes only a modest part of overall TB control activities. Scaling up PPM across countries could contribute to expanding access to TB care, increasing case detection, improving treatment outcomes and help achieve the global TB control targets. © 2011 Blackwell Publishing Ltd.

  13. Global Climate Change as Perceived by Elementary School Teachers in Yogyakarta, Indigenous Psychology Approach

    Aquilina Tanti Arini

    2017-12-01

    Full Text Available This study aimed to describe how global climate change was perceived by teachers of elementary schools. The subjects were 111 teachers from 7 elementary schools in Yogyakarta City and Sleman district. The data were collected using open-ended questions (including perception of the weather, feelings evoked by the words 'global warming', and free responses related to global warming issues). The data were analyzed using the technique of qualitative and quantitative content analysis with the Indigenous Psychology Approach. The results showed that only one teacher reported no weather anomaly, while 110 teachers reported that they perceived a weather anomaly. Those who perceived a weather anomaly mostly referred to natural conditions (including global climatic conditions and environmental destruction) and human behavior as its causes. Responses about feelings evoked by the words 'global warming' were classified into three categories: emotional, physical and irrelevant responses. Free responses about global warming were classified into four categories, respectively from the highest frequency of responses: prevention (including the statement 'must be prevented', prevention behaviors and prevention efforts), states (including weather states and feelings), causes (including technological advances and human behavior generally), and others. The research finding was discussed in the frame of environmental concern as a means of character education in elementary school.

  14. Integrated assessment of the global warming problem: A decision-analytical approach

    Van Lenthe, J.; Hendrickx, L.; Vlek, C.A.J.

    1994-12-01

    The multi-disciplinary character of the global warming problem asks for an integrated assessment approach for ordering and combining the various physical, ecological, economic, and sociological results. The Netherlands initiated their own National Research Program (NRP) on Global Air Pollution and Climate Change (NRP). The first phase (NRP-1) identified the integration theme as one of five central research themes. The second phase (NRP-2) shows a growing concern for integrated assessment issues. The current two-year research project 'Characterizing the risks: a comparative analysis of the risks of global warming and of relevant policy options', which started in September 1993, comes under the integrated assessment part of the Dutch NRP. The first part of the interim report describes the search for an integrated assessment methodology. It starts with emphasizing the need for integrated assessment at a relatively high level of aggregation and from a policy point of view. The conclusion will be that a decision-analytical approach might fit the purpose of a policy-oriented integrated modeling of the global warming problem. The discussion proceeds with an account of decision analysis and its explicit incorporation and analysis of uncertainty. Then influence diagrams, a relatively recent development in decision analysis, are introduced as a useful decision-analytical approach for integrated assessment. Finally, a software environment for creating and analyzing complex influence diagram models is discussed. The second part of the interim report provides a first, provisional integrated modeling of the global warming problem, emphasizing the illustration of the decision-analytical approach. Major problem elements are identified and an initial problem structure is developed. The problem structure is described in terms of hierarchical influence diagrams. At some places the qualitative structure is filled with quantitative data.
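
    The decision-analytical core of evaluating an influence diagram is expected-utility maximization over a chance node. The miniature below compares two policy options under an uncertain climate-sensitivity state, with purely illustrative probabilities and payoffs.

        # Two-branch decision-analytic toy: choose the policy with higher expected utility
        P_HIGH_SENSITIVITY = 0.35    # assumed probability of the "high sensitivity" state

        # utility[policy][state]: aggregated outcome in arbitrary units (invented)
        utility = {
            "aggressive abatement": {"high": -20, "low": -15},
            "business as usual":    {"high": -60, "low":  -5},
        }

        for policy, u in utility.items():
            eu = P_HIGH_SENSITIVITY * u["high"] + (1 - P_HIGH_SENSITIVITY) * u["low"]
            print(f"{policy:22s} expected utility = {eu:+.2f}")
        # aggressive abatement: -16.75; business as usual: -24.25 -> abatement preferred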

  15. A framework to assess the impacts of Climate Change for different hazards at local and regional scale through probabilistic multi-model approaches

    Mercogliano, P.; Reder, A.; Rianna, G.

    2017-12-01

    Extreme weather events (EWEs) are projected to become more frequent and severe across the globe because of global warming. This poses challenging problems for critical infrastructures (CIs), which can be dramatically affected by EWEs and need adaptation countermeasures against changing climate conditions. In this work, we present the main results achieved in the framework of the FP7 European project INTACT, aimed at analyzing the resilience of CIs against shocks and stresses due to climate change. To identify variations in the hazard induced by climate change, appropriate Extreme Weather Indicators (EWIs) are defined for several case studies, and different approaches are analyzed to obtain local climate projections. The different approaches, with increasing refinement depending on the local information available and the methodologies selected, are investigated considering raw versus bias-corrected data and weighted or equiprobable ensemble mean projections given by the regional climate models within the Euro-CORDEX program. Specifically, this work focuses on two case studies selected from the five proposed within the INTACT project and for which local station data are available: rainfall-induced landslides affecting the Campania region (Southern Italy), with a special view on the Nocera municipality; and storms and heavy rainfall/winds in the port of Rotterdam (Netherlands). In general, our results show a small sensitivity to the weighting approach and a large sensitivity to bias correction in the future projections. For landslides in the Campania region, the Euro-CORDEX simulations projected a generalized worsening of the safety conditions depending on the scenario (RCP4.5/8.5) and period (2011-2040/2041-2070/2071-2100) considered. For the port of Rotterdam, the Euro-CORDEX simulations projected an increase in intense events of daily and weekly precipitation, also in this case depending on the scenario and period considered. Considering framework, methodologies and results, the
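
    The bias-correction step whose influence the results emphasize is often an empirical quantile mapping: each raw model value is replaced by the observed value at the same empirical quantile of a common reference period. The sketch below applies this to synthetic gamma-distributed "precipitation"; all distributions are invented.

        import numpy as np

        rng = np.random.default_rng(5)
        obs = rng.gamma(2.0, 4.0, 3000)          # "observed" daily precipitation (mm)
        model = rng.gamma(2.0, 5.5, 3000)        # biased model output, reference period
        future = rng.gamma(2.0, 6.0, 3000)       # raw model projection to correct

        def quantile_map(x, model_ref, obs_ref):
            # Empirical quantile of x in the model's reference climate...
            q = np.searchsorted(np.sort(model_ref), x) / len(model_ref)
            # ...mapped to the observed value at the same quantile
            return np.quantile(obs_ref, np.clip(q, 0, 1))

        corrected = quantile_map(future, model, obs)
        print("raw model mean     :", future.mean().round(2))
        print("corrected mean     :", corrected.mean().round(2))
        print("observed ref. mean :", obs.mean().round(2))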

  16. A probabilistic-based approach to monitoring tool wear state and assessing its effect on workpiece quality in nickel-based alloys

    Akhavan Niaki, Farbod

    The objective of this research is first to investigate the applicability and advantage of statistical state estimation methods, over deterministic methods, for predicting tool wear in machining nickel-based superalloys, and second to study the effects of cutting tool wear on the quality of the part. Nickel-based superalloys are among those classes of materials known as hard-to-machine alloys. These materials maintain their strength at high temperature and have high resistance to corrosion and creep. These unique characteristics make them an ideal candidate for harsh environments like the combustion chambers of gas turbines. However, the same characteristics that make nickel-based alloys suitable for aggressive conditions introduce difficulties when machining them. High strength and low thermal conductivity accelerate cutting tool wear and increase the possibility of in-process tool breakage. A blunt tool deteriorates the surface integrity and damages the quality of the machined part by inducing high tensile residual stresses, generating micro-cracks, altering the microstructure or leaving a poor roughness profile behind. As a consequence, the expensive superalloy would have to be scrapped. The current dominant solution for industry is to sacrifice the productivity rate by replacing the tool in the early stages of its life or to choose conservative cutting conditions in order to lower the wear rate and preserve workpiece quality. Thus, monitoring the state of the cutting tool and estimating its effects on part quality is a critical task for increasing productivity and profitability in machining superalloys. This work aims first to introduce a probabilistic framework for estimating tool wear in milling and turning of superalloys and second to study the detrimental effects of the functional state of the cutting tool, in terms of wear and wear rate, on part quality. In the milling operation, the
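
    A bootstrap particle filter is one kind of statistical state estimator the first objective refers to. The toy below tracks a synthetic wear state from an indirect signal assumed proportional to wear; the wear-rate model, sensor gain and noise levels are all invented stand-ins, not the dissertation's models.

        import numpy as np

        rng = np.random.default_rng(11)
        steps, n_particles = 60, 2000

        # Synthetic "truth": wear (mm) grows at an uncertain rate with process noise
        true_wear = np.cumsum(rng.normal(5e-3, 1e-3, steps))
        meas = 2.0 * true_wear + rng.normal(0, 5e-3, steps)   # indirect sensor feature

        particles = np.zeros(n_particles)
        for t in range(steps):
            # Propagate each particle through the stochastic wear-rate model
            particles += rng.normal(5e-3, 2e-3, n_particles)
            # Weight by the measurement likelihood, then resample (bootstrap filter)
            w = np.exp(-0.5 * ((meas[t] - 2.0 * particles) / 5e-3) ** 2)
            w /= w.sum()
            particles = rng.choice(particles, size=n_particles, p=w)

        print(f"true wear  : {true_wear[-1]:.4f} mm")
        print(f"PF estimate: {particles.mean():.4f} mm")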

  17. Probabilistic Approach to Heterogeneous or Fractured Porous Media in Relation to Two-Phase Flows

    Jacquin C.

    2006-11-01

    Full Text Available The structural features of fractured or heterogeneous oil fields must be taken into consideration to improve production forecasting. The description of such fields is based on a probabilistic approach leading to an estimate of the characteristics of the reservoir rock, i.e. the distribution of the block sizes of a fissured reservoir and the scales of heterogeneity. These characteristics are fed into deterministic models that describe fluid flows. Special attention is paid to problems raised by the transposition of laboratory results obtained on small samples to a field. Such problems include the change in geometric scale, the estimation of ultimate recovery and how production will evolve in time.

  18. Investing for Impact: The Global Fund Approach to Measurement of AIDS Response.

    Jain, Suman; Zorzi, Nathalie

    2017-07-01

    The Global Fund raises and invests nearly US$4 billion a year to support programs run in more than 140 countries. The Global Fund strategy 2012-2016 is focused on "Investing for Impact". In order to accomplish this, timely and accurate data are needed to inform strategies and prioritize activities to achieve greater coverage with quality services. Monitoring and evaluation is intrinsic to the Global Fund's system of performance-based funding. The Global Fund invests in strengthening measurement and reporting of results at all stages of the grant cycle. The Global Fund approach to measurement is based on three key principles: (1) simplified reporting: the Global Fund has updated its measurement guidance to focus on impact, coverage and quality with the use of a harmonized set of indicators; (2) supporting data systems: based on a common framework developed and supported by partners, it promotes investment in five common data systems: routine reporting including HMIS; surveys (population-based and risk-group surveys); analysis, reviews and transparency; administrative and financial data sources; and vital registration systems; (3) strengthening data use: Global Fund funding encourages the use of data at all levels: national, subnational and site level. Countries do not automatically prioritize M&E, but when guidance, tools and investments are available, there is high-level utilization of M&E systems in program design, planning, implementation, and results reporting. An in-depth analysis of the available data helps the Global Fund and countries to direct investments towards interventions where impact could be achieved and to focus on the target population groups and geographic areas that are most affected.

  19. Senior University Officials' Approaches to Global Engagement: A Case Study of a Private and a Public Research University

    Chan, Shirley

    2013-01-01

    The phenomenon of globalization has a significant impact on higher education, but the lack of a clear roadmap for how senior university officials should create and implement global engagement strategies and for how these approaches support (or impede) an organizational culture that fosters globalization remains a gap in knowledge in higher…

  20. The Emergence of Cambodian Civil Society within Global Educational Governance: A Morphogenetic Approach to Agency and Structure

    Edwards, D. Brent, Jr.; Brehm, William C.

    2015-01-01

    This paper uses Margaret Archer's morphogenetic approach to analyze the emergence of civil society within global educational governance. The purpose is to understand the intersection of historical structures with global actors and spaces that have accompanied the globalization of education. Based on findings from a study on the impact in Cambodia…

  1. Globalization

    Plum, Maja

    Globalization is often referred to as external to education - a state of affairs facing the modern curriculum with numerous challenges. In this paper it is examined as internal to curriculum; analysed as a problematization in a Foucaultian sense. That is, as a complex of attentions, worries, ways of reasoning, producing curricular variables. The analysis is made through an example of early childhood curriculum in Danish pre-school, and the way the curricular variable of the pre-school child comes into being through globalization as a problematization, carried forth by the comparative practices of PISA...

  2. Globalization

    F. Gerard Adams

    2008-01-01

    The rapid globalization of the world economy is causing fundamental changes in patterns of trade and finance. Some economists have argued that globalization has arrived and that the world is "flat". While the geographic scope of markets has increased, the author argues that new patterns of trade and finance are a result of the discrepancies between "old" countries and "new". As the differences are gradually wiped out, particularly if knowledge and technology spread worldwide, the t...

  3. 78 FR 10181 - Global Quality Systems-An Integrated Approach To Improving Medical Product Safety; Public Workshop

    2013-02-13

    ...] Global Quality Systems--An Integrated Approach To Improving Medical Product Safety; Public Workshop... (AFDO), is announcing a public workshop entitled ``Global Quality Systems--An Integrated Approach to... topics concerning FDA requirements related to the production and marketing of drugs and/or devices...

  4. A global approach to estimate irrigated areas - a comparison between different data and statistics

    Meier, Jonas; Zabel, Florian; Mauser, Wolfram

    2018-02-01

    Agriculture is the largest global consumer of water. Irrigated areas constitute 40 % of the total area used for agricultural production (FAO, 2014a). Information on their spatial distribution is highly relevant for regional water management and food security, and spatial information on irrigation is highly important for policy and decision makers, who are facing the transition towards more efficient, sustainable agriculture. However, the mapping of irrigated areas still represents a challenge for land use classifications, and existing global data sets differ strongly in their results. The following study tests an existing irrigation map based on statistics and extends the irrigated area using ancillary data. The approach processes and analyzes multi-temporal normalized difference vegetation index (NDVI) SPOT-VGT data and agricultural suitability data, both at a spatial resolution of 30 arcsec, incrementally in a multiple decision tree. It covers the period from 1999 to 2012. The results globally show an 18 % larger irrigated area than existing approaches based on statistical data. The largest differences compared to the official national statistics are found in Asia, particularly in China and India. The additional areas are mainly identified within already known irrigated regions where irrigation is denser than previously estimated. The validation with global and regional products shows the large divergence of existing data sets with respect to the size and distribution of irrigated areas, caused by spatial resolution, the considered time period and the input data and assumptions made.
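
    As a toy illustration of the decision-tree idea described above, the following hedged Python sketch flags pixels as irrigated when a multi-temporal NDVI stack stays vigorous despite low rain-fed agricultural suitability. The thresholds and the flag_irrigated helper are hypothetical stand-ins, not the study's calibrated decision-tree parameters.

      import numpy as np

      # Hypothetical thresholds; the study's calibrated values are not given here.
      NDVI_PEAK_MIN = 0.5    # minimum seasonal NDVI peak expected for irrigated crops
      SUITABILITY_MAX = 0.3  # low rain-fed suitability plus dense vegetation suggests irrigation

      def flag_irrigated(ndvi_series, suitability):
          """Flag pixels as irrigated when multi-temporal NDVI stays high
          despite low agricultural suitability under rain-fed conditions.

          ndvi_series: (time, rows, cols) NDVI stack, e.g. from SPOT-VGT composites
          suitability: (rows, cols) rain-fed agricultural suitability in [0, 1]
          """
          ndvi_peak = ndvi_series.max(axis=0)
          # Vigorous vegetation where rain-fed cropping is unlikely implies irrigation.
          return (ndvi_peak >= NDVI_PEAK_MIN) & (suitability <= SUITABILITY_MAX)

      # Example with synthetic data
      ndvi = np.random.uniform(0.0, 0.9, size=(12, 4, 4))
      suit = np.random.uniform(0.0, 1.0, size=(4, 4))
      print(flag_irrigated(ndvi, suit))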

  5. Cultivating Global Competencies in Costa Rica: One Community College's Innovative Approach to Taking Early Childhood Education Global

    Delafield, Julia

    2018-01-01

    Giving preservice early childhood educators an immersive global experience lays the foundation for building their global competencies and thereby helps them provide their own students with 21st century skills.

  6. A new approach to a global fit of the CKM matrix

    Hoecker, A.; Lacker, H.; Laplace, S. [Laboratoire de l' Accelerateur Lineaire, 91 - Orsay (France); Le Diberder, F. [Laboratoire de Physique Nucleaire et des Hautes Energies, 75 - Paris (France)

    2001-05-01

    We report on a new approach to a global CKM matrix analysis taking into account the most recent experimental and theoretical results. The statistical framework (Rfit) developed in this paper advocates frequentist statistics. Other approaches, such as Bayesian statistics or the 95% CL scan method, are also discussed. We emphasize the distinction between a model-testing phase and a model-dependent, metrological phase in which the various parameters of the theory are estimated. Measurements and theoretical parameters entering the global fit are thoroughly discussed, in particular with respect to their theoretical uncertainties. Graphical results for confidence levels are drawn in various one- and two-dimensional parameter spaces. Numerical results are provided for all relevant CKM parameterizations, the CKM elements and theoretical input parameters. Predictions for branching ratios of rare K and B meson decays are obtained. A simple, predictive SUSY extension of the Standard Model is discussed. (authors)
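
    As a minimal numerical illustration of the frequentist conversion at the heart of such global fits, the Python sketch below turns the chi-square difference between a constrained fit and the global minimum into a confidence level. The function name and the one-degree-of-freedom default are illustrative assumptions, not the Rfit implementation itself.

      from scipy.stats import chi2

      def confidence_level(chi2_fixed, chi2_global_min, ndof=1):
          """Convert the chi-square rise above the global minimum, with one
          parameter held fixed, into a frequentist confidence level."""
          delta = chi2_fixed - chi2_global_min
          return chi2.sf(delta, df=ndof)  # survival function = 1 - CDF

      # Example: a point whose constrained fit sits 3.84 units above the minimum
      print(confidence_level(15.84, 12.0))  # ~0.05, i.e. excluded at 95% CL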

  7. A global learning-centered approach to higher education: workplace development in the 21st century

    Carlos Tasso Eira de Aquino

    2017-01-01

    Competition in the 21st century economy requires corporations, organizations, and professionals to face a common challenge: diverse individuals need consistent motivation towards building competences that increase personal marketability through a combination of higher education and professional development. This article is an evolving report summary of a non-traditional, learning-centered approach focusing on the adult competences necessary for succeeding in the competitive global marketplace of the 21st century. Its purpose is to understand the constantly changing demands employers make of the work environment. Exploring contemporary approaches to skill development, adult education, and learning processes will be the path towards higher levels of professional success. The article provides readers with a discussion of the adult skills and competencies professionals need to succeed in the global marketplace.

  8. Probabilistic reasoning for assembly-based 3D modeling

    Chaudhuri, Siddhartha

    2011-01-01

    Assembly-based modeling is a promising approach to broadening the accessibility of 3D modeling. In assembly-based modeling, new models are assembled from shape components extracted from a database. A key challenge in assembly-based modeling is the identification of relevant components to be presented to the user. In this paper, we introduce a probabilistic reasoning approach to this problem. Given a repository of shapes, our approach learns a probabilistic graphical model that encodes semantic and geometric relationships among shape components. The probabilistic model is used to present components that are semantically and stylistically compatible with the 3D model that is being assembled. Our experiments indicate that the probabilistic model increases the relevance of presented components. © 2011 ACM.
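
    The sketch below is a deliberately simplified stand-in for the idea of scoring candidate components against a partially assembled model: it uses plain co-occurrence counts over a toy database rather than the learned probabilistic graphical model of the paper, and all names and data are hypothetical.

      from collections import Counter
      from itertools import combinations

      # Toy shape database: each model is the set of component labels it contains.
      database = [
          {"chair_seat", "chair_back", "four_legs"},
          {"chair_seat", "chair_back", "swivel_base"},
          {"chair_seat", "four_legs"},
      ]

      pair_counts = Counter()
      part_counts = Counter()
      for model in database:
          part_counts.update(model)
          pair_counts.update(frozenset(p) for p in combinations(sorted(model), 2))

      def compatibility(candidate, assembled):
          """Average pairwise co-occurrence probability of the candidate with
          the components already placed (a crude proxy for semantic and
          stylistic compatibility)."""
          if not assembled:
              return part_counts[candidate] / len(database)
          scores = [pair_counts[frozenset({candidate, part})] / max(part_counts[part], 1)
                    for part in assembled]
          return sum(scores) / len(scores)

      print(compatibility("four_legs", {"chair_seat"}))    # 2/3: common pairing
      print(compatibility("swivel_base", {"chair_seat"}))  # 1/3: rarer pairing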

  9. The response of terrestrial ecosystems to global climate change: Towards an integrated approach

    Rustad, Lindsey E.

    2008-01-01

    Accumulating evidence points to an anthropogenic 'fingerprint' on the global climate change that has occurred in the last century. Climate change has, and will continue to have, profound effects on the structure and function of terrestrial ecosystems. As such, there is a critical need to continue to develop a sound scientific basis for national and international policies regulating carbon sequestration and greenhouse gas emissions. This paper reflects on the nature of current global change experiments and provides recommendations for a unified multidisciplinary approach to future research in this dynamic field. These recommendations include: (1) better integration between experiments and models, and amongst experimental, monitoring, and space-for-time studies; (2) stable and increased support for long-term studies and multi-factor experiments; (3) explicit inclusion of biodiversity, disturbance, and extreme events in experiments and models; (4) consideration of timing vs intensity of global change factors in experiments and models; (5) evaluation of potential thresholds or ecosystem 'tipping points'; and (6) increased support for model-model and model-experiment comparisons. These recommendations, which reflect discussions within the TERACC international network of global change scientists, will facilitate the unraveling of the complex direct and indirect effects of global climate change on terrestrial ecosystems and their components.

  10. Perspectives on global climate change: A review of the adaptation and mitigation approaches

    Morrisette, P.M.

    1992-01-01

    This paper was prepared for the conference on Global Climate Change and International Security sponsored by the Midwest Consortium for International Security Studies of the American Academy of Arts and Sciences and held in Chicago, Illinois on February 11-13, 1992. The purpose of the paper is to provide some background on the different perceptions and perspectives that are presently shaping the policy debate on how to respond to the problem of global warming. For better or worse, this debate has focused primarily on whether to adapt to climate change in the future or to mitigate climate change in the present, and as the issue has become increasingly political, the debate has become polarized. The two approaches, as this paper notes, are not mutually exclusive; in fact, they share much in common. Differences, however, can be found in how proponents of each view the risks of global climate change. This paper provides a brief outline of the progression of global warming from an obscure scientific concern into a leading international political issue, reviews previous efforts by social scientists to assess attitudes and positions on global warming, and examines in detail the adaptation and mitigation perspectives, assessing how they differ on the basis of different conceptions of uncertainty and risk, equity, and technology.

  11. Probabilistic Methods for the Quantification of Uncertainty and Error in Computational Fluid Dynamic Simulations

    Faragher, John

    2004-01-01

    ... conservatism to allow for them. This report examines the feasibility of using a probabilistic approach for modelling the component temperatures in an engine using CFD (Computational Fluid Dynamics).
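
    A minimal sketch of the general idea, assuming a cheap algebraic surrogate in place of full CFD runs: uncertain inputs are sampled and propagated to a component metal temperature by Monte Carlo, so that a design percentile replaces a fixed conservative margin. The surrogate and all distributions below are invented for illustration.

      import numpy as np

      rng = np.random.default_rng(42)

      def surrogate_metal_temperature(gas_temp, htc):
          """Stand-in for a CFD run: an algebraic surrogate mapping uncertain
          inputs to a component metal temperature (illustrative only)."""
          coolant_temp = 900.0  # K, assumed fixed
          return coolant_temp + (gas_temp - coolant_temp) * htc / (htc + 2000.0)

      # Propagate input uncertainty by Monte Carlo sampling
      gas_temps = rng.normal(1800.0, 50.0, size=10_000)        # K
      htcs = rng.lognormal(np.log(1500.0), 0.2, size=10_000)   # W/(m^2 K)

      temps = surrogate_metal_temperature(gas_temps, htcs)
      print(f"mean = {temps.mean():.1f} K, 99th percentile = {np.percentile(temps, 99):.1f} K")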

  12. Guidelines and recommendations for regional approaches to disarmament within the context of global security

    Mason, P.

    1994-01-01

    Guidelines and recommendations for regional approaches to disarmament within the context of global security provide both a conceptual framework within which to pursue arms control in South Asia and a variety of concrete mechanisms or tools to carry out the task. However, they cannot operate independently of a broader process of political accommodation, which might be termed 'cooperative security building'. That process, however embryonic, is under way across the Asia-Pacific region.

  13. Comprehensive and market-based approaches to global-change policy

    Stewart, R.B.

    1992-01-01

    The summary highlights the need to take a system-wide approach to global change. The need to minimize costs while achieving environmental goals suggests the utility of employing market-based incentives that (1) ensure that the greatest reductions are achieved by those who can make them at least cost and (2) encourage innovation, rather than using 'command-and-control' tactics that mandate uniform adoption of centrally selected techniques. 18 refs

  14. Probabilistic assessment of faults

    Foden, R.W.

    1987-01-01

    Probabilistic safety analysis (PSA) is the process by which the probability (or frequency of occurrence) of reactor fault conditions that could lead to unacceptable consequences is assessed. The basic objective of a PSA is to allow a judgement to be made as to whether or not the principal probabilistic requirement is satisfied. It also gives insights into the reliability of the plant, which can be used to identify possible improvements. This is explained in the article. The scope of a PSA and the PSAs performed by the National Nuclear Corporation (NNC) for the Heysham II and Torness AGRs and the Sizewell-B PWR are discussed. The NNC methods for hazards, common cause failure and operator error are mentioned. (UK)
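
    As a worked illustration of the probability arithmetic underlying such analyses, the sketch below combines independent basic-event probabilities through AND/OR gates into a top-event probability. Real PSAs, as noted above, also treat hazards, common cause failure and operator error, which this toy example omits; all numbers are hypothetical.

      # Minimal fault-tree arithmetic, assuming independent basic events.

      def or_gate(*probs):
          """P(at least one event occurs) for independent events."""
          p = 1.0
          for q in probs:
              p *= 1.0 - q
          return 1.0 - p

      def and_gate(*probs):
          """P(all events occur) for independent events."""
          p = 1.0
          for q in probs:
              p *= q
          return p

      # Hypothetical tree: pump failure OR (valve failure AND backup failure)
      p_top = or_gate(1e-3, and_gate(5e-3, 2e-2))
      print(f"top event probability per demand: {p_top:.2e}")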

  15. Global partnering related to nuclear materials safeguards and security - A pragmatic approach to international safeguards work

    Stanford, Dennis

    2007-01-01

    This paper documents issues Nuclear Fuel Services, Inc. has addressed in the performance of international safeguards and security work. It begins with a description of the package we put together for a sample proposal for the Global Threat Reduction Initiative, for which we were ranked number one for technical approach and cost, and concludes with a discussion of approaches that we have taken to performing this work, including issues related to performing the work as part of a team. The primary focus is on communication, workforce, equipment, and coordination issues. Finally, the paper documents the rules that we use to assure the work is performed safely and successfully. (author)

  16. Critical remarks on Simon Caney's humanity-centered approach to global justice

    Julian Culp

    2016-09-01

    The practice-independent approach to theorizing justice (PIA) holds that the social practices to which a particular conception of justice is meant to apply are of no importance for the justification of such a conception. In this paper I argue that this approach to theorizing justice is incompatible with the method of reflective equilibrium (MRE), because the MRE is antithetical to a clean separation between issues of justification and application. In particular, I maintain that this incompatibility renders Simon Caney's cosmopolitan theory of global justice inconsistent, because Caney claims to endorse both a humanity-centered PIA and the MRE.

  17. Mode decomposition methods for flows in high-contrast porous media. Global-local approach

    Ghommem, Mehdi; Presho, Michael; Calo, Victor M.; Efendiev, Yalchin R.

    2013-01-01

    In this paper, we combine concepts of the generalized multiscale finite element method (GMsFEM) and mode decomposition methods to construct a robust global-local approach for model reduction of flows in high-contrast porous media. This is achieved by implementing Proper Orthogonal Decomposition (POD) and Dynamic Mode Decomposition (DMD) techniques on a coarse grid computed using GMsFEM. The resulting reduced-order approach enables a significant reduction in the flow problem size while accurately capturing the behavior of fully-resolved solutions. We consider a variety of high-contrast coefficients and present the corresponding numerical results to illustrate the effectiveness of the proposed technique. This paper is a continuation of our work presented in Ghommem et al. (2013) [1], where we examine the applicability of POD and DMD to derive simplified and reliable representations of flows in high-contrast porous media on fully resolved models. In the current paper, we discuss how these global model reduction approaches can be combined with local techniques to speed up the simulations. The speed-up is due to inexpensive, yet sufficiently accurate, computations of global snapshots. © 2013 Elsevier Inc.
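
    A minimal sketch of the POD step mentioned above, assuming solution snapshots are collected column-wise: the reduced basis is taken from the leading left singular vectors of the snapshot matrix, truncated at a prescribed energy fraction. The helper name and the synthetic data are illustrative, not the GMsFEM workflow itself.

      import numpy as np

      def pod_basis(snapshots, energy=0.99):
          """Return the leading left singular vectors capturing the given
          fraction of snapshot energy.

          snapshots: (n_dof, n_snapshots) matrix of solution states
          """
          U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
          cumulative = np.cumsum(s**2) / np.sum(s**2)
          r = int(np.searchsorted(cumulative, energy)) + 1
          return U[:, :r]

      # Synthetic snapshots: a rank-3 field plus small noise
      rng = np.random.default_rng(0)
      X = rng.standard_normal((1000, 3)) @ rng.standard_normal((3, 50))
      X += 0.01 * rng.standard_normal(X.shape)
      print(pod_basis(X).shape)  # typically (1000, 3): the low-rank structure is recovered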

  18. Probabilistic Model Development

    Adam, James H., Jr.

    2010-01-01

    Objective: develop a probabilistic model for the solar energetic particle environment, together with a tool that provides a reference solar particle radiation environment that (1) will not be exceeded at a user-specified confidence level and (2) supplies reference environments for (a) peak flux, (b) event-integrated fluence, and (c) mission-integrated fluence. The reference environments consist of elemental energy spectra for protons, helium, and heavier ions.
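
    A hedged sketch of the "will not be exceeded at a user-specified confidence level" idea: given samples of mission-integrated fluence drawn from an assumed lognormal stand-in distribution, the reference environment is read off as a percentile. The distribution and all numbers are invented for illustration.

      import numpy as np

      rng = np.random.default_rng(7)

      # Hypothetical stand-in: mission-integrated proton fluences (cm^-2) sampled
      # from a lognormal, as heavy-tailed solar-particle statistics often are.
      fluences = rng.lognormal(mean=np.log(1e9), sigma=1.0, size=100_000)

      def reference_environment(samples, confidence):
          """Fluence level not exceeded at the given confidence."""
          return float(np.percentile(samples, 100.0 * confidence))

      for cl in (0.90, 0.95, 0.99):
          print(f"{cl:.0%} confidence fluence: {reference_environment(fluences, cl):.2e} cm^-2")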

  19. Geothermal probabilistic cost study

    Orren, L.H.; Ziman, G.M.; Jones, S.C.; Lee, T.K.; Noll, R.; Wilde, L.; Sadanand, V.

    1981-08-01

    A tool to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM), is presented. The GPCM is used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of geothermal risk that can shift the risk among different agents are analyzed: the leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance. (MHR)
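
    A minimal Monte Carlo sketch in the spirit of a probabilistic cost model, with invented distributions for well cost and drilling success rate; the actual GPCM inputs and structure are not reproduced here.

      import numpy as np

      rng = np.random.default_rng(1)
      N = 50_000

      # Hypothetical input distributions (not the GPCM's calibrated values).
      well_cost = rng.normal(3.0e6, 0.5e6, size=N)          # $ per well drilled
      success_rate = rng.beta(8, 2, size=N)                 # fraction of productive wells
      wells_drilled = 10 / np.clip(success_rate, 0.1, 1.0)  # wells needed for 10 producers

      field_cost = wells_drilled * well_cost
      print(f"median field development cost: ${np.median(field_cost) / 1e6:.1f}M")
      print(f"90th percentile (risk case):   ${np.percentile(field_cost, 90) / 1e6:.1f}M")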