WorldWideScience

Sample records for on-line probabilistic diagnostic

  1. Probabilistic Forecasting for On-line Operation of Urban Drainage Systems

    DEFF Research Database (Denmark)

    Löwe, Roland

    This thesis deals with the generation of probabilistic forecasts in urban hydrology. In particular, we focus on the case of runoff forecasting for real-time control (RTC) on horizons of up to two hours. For the generation of probabilistic on-line runoff forecasts, we apply the stochastic grey-box model approach … and forecasts have on on-line runoff forecast quality. Finally, we implement the stochastic grey-box model approach in a real-world real-time control (RTC) setup and study how RTC can benefit from a dynamic quantification of runoff forecast uncertainty.

  2. Probabilistic inversion for chicken processing lines

    International Nuclear Information System (INIS)

    Cooke, Roger M.; Nauta, Maarten; Havelaar, Arie H.; Fels, Ine van der

    2006-01-01

    We discuss an application of probabilistic inversion techniques to a model of campylobacter transmission in chicken processing lines. Such techniques are indicated when we wish to quantify a model which is new and perhaps unfamiliar to the expert community. In this case there are no measurements for estimating model parameters, and experts are typically unable to give a considered judgement. In such cases, experts are asked to quantify their uncertainty regarding variables which can be predicted by the model. The experts' distributions (after combination) are then pulled back onto the parameter space of the model, a process termed 'probabilistic inversion'. This study illustrates two such techniques, iterative proportional fitting (IPF) and PARameter Fitting for Uncertain Models (PARFUM). In addition, we illustrate how expert judgement on predicted observable quantities, in combination with probabilistic inversion, may be used for model validation and/or model criticism.
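    To make the IPF idea concrete, here is a minimal sketch of sample-reweighting probabilistic inversion: parameter samples are pushed through a forward model and their weights are iteratively rescaled until the weighted model output matches expert-elicited quantiles. The power-law model, quantile values and bin probabilities are invented for illustration and are not from the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical forward model for one processing stage: predicted
    # contamination y given uncertain parameters (a, b). Invented for
    # illustration; not the campylobacter model from the study.
    def model(a, b, x=1000.0):
        return a * x ** b

    # 1. Sample parameters from a broad design distribution.
    N = 50_000
    a = rng.uniform(0.0, 1.0, N)
    b = rng.uniform(0.5, 1.5, N)
    y = model(a, b)

    # 2. Expert-elicited 5/50/95% quantiles of the observable y (invented),
    #    defining four bins with target probabilities 5/45/45/5%.
    edges = np.array([1e2, 2e3, 2e4])
    targets = np.array([0.05, 0.45, 0.45, 0.05])

    # 3. IPF: rescale sample weights until the weighted output histogram
    #    matches the expert bin probabilities.
    bins = np.digitize(y, edges)
    w = np.full(N, 1.0 / N)
    for _ in range(50):
        for k, p in enumerate(targets):
            mass = w[bins == k].sum()
            if mass > 0.0:
                w[bins == k] *= p / mass
        w /= w.sum()

    # The reweighted (a, b) samples approximate the pulled-back distribution.
    print("inverted mean a:", np.average(a, weights=w))
    print("inverted mean b:", np.average(b, weights=w))
    ```

    With a single partition of the output the loop converges in one sweep; iteration matters when several predicted observables impose overlapping constraints, which is the situation IPF and PARFUM are designed for.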

  3. The Diagnostic Challenge Competition: Probabilistic Techniques for Fault Diagnosis in Electrical Power Systems

    Science.gov (United States)

    Ricks, Brian W.; Mengshoel, Ole J.

    2009-01-01

    Reliable systems health management is an important research area of NASA. A health management system that can accurately and quickly diagnose faults in various on-board systems of a vehicle will play a key role in the success of current and future NASA missions. In this paper we introduce ProDiagnose, a diagnostic algorithm that takes a probabilistic approach, using Bayesian network models compiled to arithmetic circuits, to diagnose these systems. We describe the ProDiagnose algorithm, how it works, and the probabilistic models involved. We show through experimentation on two Electrical Power Systems based on the ADAPT testbed, used in the Diagnostic Challenge Competition (DX 09), that ProDiagnose can produce results with over 96% accuracy and less than 1 second mean diagnostic time.

  4. Efficient Probabilistic Diagnostics for Electrical Power Systems

    Science.gov (United States)

    Mengshoel, Ole J.; Chavira, Mark; Cascio, Keith; Poll, Scott; Darwiche, Adnan; Uckun, Serdar

    2008-01-01

    We consider in this work the probabilistic approach to model-based diagnosis when applied to electrical power systems (EPSs). Our probabilistic approach is formally well-founded, as it is based on Bayesian networks and arithmetic circuits. We investigate the diagnostic task known as fault isolation, and pay special attention to meeting two of the main challenges, model development and real-time reasoning, often associated with real-world application of model-based diagnosis technologies. To address the challenge of model development, we develop a systematic approach to representing electrical power systems as Bayesian networks, supported by an easy-to-use specification language. To address the real-time reasoning challenge, we compile Bayesian networks into arithmetic circuits. Arithmetic circuit evaluation supports real-time diagnosis by being predictable and fast. In essence, we introduce a high-level EPS specification language from which Bayesian networks that can diagnose multiple simultaneous failures are auto-generated, and we illustrate the feasibility of using arithmetic circuits, compiled from Bayesian networks, for real-time diagnosis on real-world EPSs of interest to NASA. The experimental system is a real-world EPS, namely the Advanced Diagnostic and Prognostic Testbed (ADAPT) located at the NASA Ames Research Center. In experiments with the ADAPT Bayesian network, which currently contains 503 discrete nodes and 579 edges, we find high diagnostic accuracy in scenarios where one to three faults, both in components and sensors, were inserted. The time taken to compute the most probable explanation using arithmetic circuits has a small mean of 0.2625 milliseconds and standard deviation of 0.2028 milliseconds. In experiments with data from ADAPT we also show that arithmetic circuit evaluation substantially outperforms joint tree propagation and variable elimination, two alternative algorithms for diagnosis using Bayesian network inference.
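    As a rough, invented illustration of the computation this abstract describes (not the ADAPT model or the authors' compiler), the sketch below poses a two-sensor fault-isolation problem as a tiny Bayesian network and answers the diagnostic query by exact enumeration; an arithmetic circuit compiled from such a network evaluates the same sum-of-products in time linear in circuit size, which is what makes the sub-millisecond query times quoted above feasible.

    ```python
    # Tiny fault-isolation example with invented numbers (not the ADAPT model).
    # H: component health; S1, S2: two redundant sensor readings.
    P_H = {"ok": 0.99, "faulty": 0.01}
    P_HIGH_GIVEN_H = {"ok": 0.02, "faulty": 0.90}   # P(sensor reads "high" | H)

    def joint(h, s1, s2):
        """P(H = h, S1 = s1, S2 = s2) for the two-sensor network."""
        p = P_H[h]
        for s in (s1, s2):
            p_high = P_HIGH_GIVEN_H[h]
            p *= p_high if s == "high" else 1.0 - p_high
        return p

    # Posterior over H given both sensors reading "high", by exact enumeration.
    # A compiled arithmetic circuit evaluates this same sum-of-products,
    # but in time linear in the circuit size.
    evidence = ("high", "high")
    unnorm = {h: joint(h, *evidence) for h in P_H}
    z = sum(unnorm.values())
    for h, p in unnorm.items():
        print(h, round(p / z, 4))
    ```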

  5. On-line diagnostics for a real time system

    International Nuclear Information System (INIS)

    Sreenivasan, P.

    1976-01-01

    The purpose of on-line diagnostics is to give an on-line computer the ability to diagnose itself, thereby enhancing its dependability in a real-time system. Such diagnostics, evolved for the CDPS of the Fast Breeder Test Reactor at Kalpakkam, are reported. The two phases of the diagnostics, i.e., malfunction detection and post-detection action, are described in some detail. (A.K.)

  6. Experience with on-line diagnostics for bushings and current transformers

    Energy Technology Data Exchange (ETDEWEB)

    Brusetti, R.

    2004-02-01

    The application of on-line diagnostic techniques is advocated as an alternative, under certain conditions, to the conventional off-line power factor test used to evaluate the condition of bushings and current transformers. Bushings which are tied to critical system apparatus, or which cannot be readily removed from service, are considered excellent candidates for on-line diagnostics. A case history involving 138-kV bushings is used to demonstrate the value of the on-line diagnostic system which, by taking into account power factor and capacitance values along with their rate of change, not only detected the high power factor associated with the bushings, but allowed the high power factor bushings to continue operating with a known problem for over a year. With conventional testing, bushings that exhibited this level of power factor would typically be removed from service. 4 refs., 3 figs.

  7. Reliability assessment of fiber optic communication lines depending on external factors and diagnostic errors

    Science.gov (United States)

    Bogachkov, I. V.; Lutchenko, S. S.

    2018-05-01

    The article deals with a method for assessing the reliability of fiber optic communication lines (FOCL), taking into account the effect of optical fiber tension, the influence of temperature, and errors of the first kind in the built-in diagnostic equipment. The reliability is assessed in terms of the availability factor using the theory of Markov chains and probabilistic mathematical modeling. To obtain a mathematical model, the following steps are performed: the FOCL states are defined and validated; the state graph and system transitions are described; the system transitions of states that occur at a certain point are specified; and the real and the observed time of system presence in the considered states are identified. According to the permissible value of the availability factor, it is possible to determine the limiting frequency of FOCL maintenance.
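    The availability-factor computation the article describes can be sketched with a small continuous-time Markov chain. The three states and all transition rates below are assumptions for illustration (the paper's actual state graph and rates are not reproduced here); the stationary probability of the operating state is the availability factor.

    ```python
    import numpy as np

    # Invented illustrative rates (per hour). States: 0 = line operating,
    # 1 = down for repair of a real fault, 2 = taken out of service by a
    # false alarm (diagnostic error of the first kind).
    lam, mu = 1e-4, 0.05     # failure and repair rates
    phi, psi = 2e-4, 0.5     # false-alarm and alarm-clearing rates

    # Generator matrix of the continuous-time Markov chain (rows sum to 0).
    Q = np.array([
        [-(lam + phi), lam,  phi ],
        [ mu,          -mu,  0.0 ],
        [ psi,         0.0, -psi ],
    ])

    # Stationary distribution pi solves pi @ Q = 0 with sum(pi) = 1.
    A = np.vstack([Q.T, np.ones(3)])
    b = np.array([0.0, 0.0, 0.0, 1.0])
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)

    print("availability factor K_a =", pi[0])
    ```

    In this sketch the closed form is K_a = 1 / (1 + lam/mu + phi/psi), so diagnostic errors of the first kind degrade availability exactly as an extra failure mode would.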

  8. Mjollnir Rotational Line Scan Diagnostics.

    Science.gov (United States)

    1981-05-19

    Using long cavity. M8: removable pellicle beam splitter for the He-Ne lineup beam; removed before the HF or DF laser is turned on. … of the chopper … three probe laser lines; however, three lines were sequentially measured to verify the diagnostic equipment. Two of the three lines have been monitored

  9. Statistical physics of medical diagnostics: Study of a probabilistic model.

    Science.gov (United States)

    Mashaghi, Alireza; Ramezanpour, Abolfazl

    2018-03-01

    We study a diagnostic strategy which is based on the anticipation of the diagnostic process by simulation of the dynamical process starting from the initial findings. We show that such a strategy could result in more accurate diagnoses compared to a strategy that is solely based on the direct implications of the initial observations. We demonstrate this by employing the mean-field approximation of statistical physics to compute the posterior disease probabilities for a given subset of observed signs (symptoms) in a probabilistic model of signs and diseases. A Monte Carlo optimization algorithm is then used to maximize an objective function of the sequence of observations, which favors the more decisive observations resulting in more polarized disease probabilities. We see how the observed signs change the nature of the macroscopic (Gibbs) states of the sign and disease probability distributions. The structure of these macroscopic states in the configuration space of the variables affects the quality of any approximate inference algorithm (so the diagnostic performance) which tries to estimate the sign-disease marginal probabilities. In particular, we find that the simulation (or extrapolation) of the diagnostic process is helpful when the disease landscape is not trivial and the system undergoes a phase transition to an ordered phase.

  11. Diagnostic efficacy of optimised evaluation of planar MIBI myocardium perfusion scintigraphy: a probabilistic approach

    International Nuclear Information System (INIS)

    Kusmierek, J.; Plachcinska, A.

    1999-01-01

    Background: The Bayesian (probabilistic) approach to the results of a diagnostic test appears to be more informative than an interpretation of results in binary terms (having the disease or not). The aim of our study was to analyse the effect of an optimised evaluation of myocardium perfusion scintigrams on the probability of CAD in individual patients. Methods: 197 patients (132 males and 65 females) suspected of CAD, with no history of myocardial infarction, were examined. Scintigraphic images were evaluated applying two methods of analysis, visual (semiquantitative) and quantitative, and the combination of both. The sensitivity and specificity of both methods (and their combination) in the detection of CAD were determined, and optimal methods of scintigram evaluation, separately for males and females, were selected. All patients were subjected to coronary angiography. The pre-test probability of CAD was assessed according to Diamond (1) and the post-test probability was evaluated in accordance with Bayes's theorem. Patients were divided, according to the pre-test probability of CAD, into 3 groups: with low, medium and high probability of the disease. The same subdivision was made in relation to the post-test probability of CAD. The numbers of patients in the respective subgroups, before and after the test, were compared. Moreover, in order to test the reliability of the post-test probability, its values were compared with the real percentages of CAD occurrence among the patients under study, as demonstrated by the angiography. Results: The combination of visual and quantitative methods was accepted as the optimal method of male scintigram evaluation (with sensitivity and specificity equalling 95% and 82%, respectively) and sole quantitative analysis as the optimal method of female scintigram evaluation (sensitivity and specificity amounted to 81% and 84%, respectively). In the subgroup of males the percentage of individuals with medium pre-test CAD probability equalled 52 and …
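    The Bayes step described above is compact enough to state directly. The sketch below applies Bayes's theorem to a binary test result using the sensitivity and specificity reported in the abstract for the male subgroup (95% and 82%); the pre-test probabilities are arbitrary placeholders for the low/medium/high groups.

    ```python
    def post_test_probability(pre, sensitivity, specificity, positive=True):
        """Bayes's theorem for a binary diagnostic test result."""
        if positive:
            tp = sensitivity * pre                     # true positives
            fp = (1.0 - specificity) * (1.0 - pre)     # false positives
            return tp / (tp + fp)
        fn = (1.0 - sensitivity) * pre                 # false negatives
        tn = specificity * (1.0 - pre)                 # true negatives
        return fn / (fn + tn)

    # Sensitivity/specificity of the optimal male evaluation from the abstract.
    se, sp = 0.95, 0.82
    for pre in (0.2, 0.5, 0.8):    # placeholder low/medium/high pre-test values
        print(f"pre-test {pre:.0%} -> post-test (positive scan) "
              f"{post_test_probability(pre, se, sp):.1%}")
    ```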

  12. Temperature diagnostic line ratios of Fe XVII

    International Nuclear Information System (INIS)

    Raymond, J.C.; Smith, B.W. (Los Alamos National Lab., NM)

    1986-01-01

    Based on extensive calculations of the excitation rates of Fe XVII, four temperature-sensitive line ratios are investigated, paying special attention to the contribution of resonances to the excitation rates and to the contributions of dielectronic recombination satellites to the observed line intensities. The predictions are compared to FPCS observations of Puppis A and to Solar Maximum Mission (SMM) and SOLEX observations of the sun. Temperature-sensitive line ratios are also computed for emitting gas covering a broad temperature range. It is found that each ratio yields a differently weighted average for the temperature and that this accounts for some apparent discrepancies between the theoretical ratios and solar observations. The effects of this weighting on the Fe XVII temperature diagnostics and on the analogous Fe XXIV/Fe XXV satellite line temperature diagnostics are discussed. 27 references

  13. Quantitative Analysis of Probabilistic Models of Software Product Lines with Statistical Model Checking

    DEFF Research Database (Denmark)

    ter Beek, Maurice H.; Legay, Axel; Lluch Lafuente, Alberto

    2015-01-01

    We investigate the suitability of statistical model checking techniques for analysing quantitative properties of software product line models with probabilistic aspects. For this purpose, we enrich the feature-oriented language FLAN with action rates, which specify the likelihood of exhibiting pa…

  14. Statistical analysis of probabilistic models of software product lines with quantitative constraints

    DEFF Research Database (Denmark)

    Beek, M.H. ter; Legay, A.; Lluch Lafuente, Alberto

    2015-01-01

    We investigate the suitability of statistical model checking for the analysis of probabilistic models of software product lines with complex quantitative constraints and advanced feature installation options. Such models are specified in the feature-oriented language QFLan, a rich process algebra … of certain behaviour to the expected average cost of products. This is supported by a Maude implementation of QFLan, integrated with the SMT solver Z3 and the distributed statistical model checker MultiVeStA. Our approach is illustrated with a bikes product line case study.
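    Statistical model checking, at its core, estimates the probability of a property by simulating the probabilistic model many times and attaching a confidence bound to the empirical frequency. The sketch below illustrates that core idea on a made-up product-line model; it is not QFLan or MultiVeStA, and the feature success rates are invented.

    ```python
    import math
    import random

    random.seed(1)

    # Toy probabilistic product-line model (invented): each feature install
    # succeeds with a given rate; the property is "a complete bike is built".
    rates = {"frame": 0.99, "wheels": 0.97, "basket": 0.60}

    def run() -> bool:
        """One simulated derivation of a product."""
        return all(random.random() < p for p in rates.values())

    # Statistical model checking: estimate P(property) from n i.i.d. runs
    # and attach a 95% normal-approximation confidence interval.
    n = 100_000
    p = sum(run() for _ in range(n)) / n
    half_width = 1.96 * math.sqrt(p * (1.0 - p) / n)
    print(f"P(complete product) = {p:.4f} +/- {half_width:.4f}")
    ```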

  15. On-line diagnostic techniques for air-operated control valves based on time series analysis

    International Nuclear Information System (INIS)

    Ito, Kenji; Matsuoka, Yoshinori; Minamikawa, Shigeru; Komatsu, Yasuki; Satoh, Takeshi.

    1996-01-01

    The objective of this research is to study the feasibility of applying on-line diagnostic techniques based on time series analysis to air-operated control valves, numerous valves of which are used in PWR plants. Generally, these techniques can detect anomalies caused by failures in their initial stages, when detection by conventional surveillance of directly measured process parameters is difficult. However, the effectiveness of these techniques depends on the system being diagnosed. The difficulties in applying diagnostic techniques to air-operated control valves seem to come from the reduced sensitivity of their response as compared with hydraulic control systems, as well as the need to identify anomalies in low-level signals that fluctuate only slightly but continuously. In this research, simulation tests were performed by setting various kinds of failure modes for a test valve with the same specifications as a valve actually used in the plants. Actual control signals recorded from an operating plant were then used as input signals for simulation. The results of the tests confirmed the feasibility of applying on-line diagnostic techniques based on time series analysis to air-operated control valves. (author)
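    One generic way to realize the kind of time-series diagnosis described above, assumed here for illustration rather than taken from the paper, is to fit an autoregressive (AR) model to signals from a healthy valve and flag samples whose one-step prediction residuals become improbably large. The signal, the AR order, the disturbance, and the 5-sigma threshold below are all invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def ar_fit(x, p=4):
        """Least-squares fit of an AR(p) model to a 1-D signal."""
        X = np.column_stack([x[p - k - 1 : len(x) - k - 1] for k in range(p)])
        coef, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
        return coef

    def ar_residuals(x, coef):
        """One-step prediction residuals of the fitted AR model."""
        p = len(coef)
        X = np.column_stack([x[p - k - 1 : len(x) - k - 1] for k in range(p)])
        return x[p:] - X @ coef

    # Reference signal from a "healthy" valve: slow oscillation plus noise.
    t = np.arange(2000)
    healthy = np.sin(2 * np.pi * t / 200) + 0.05 * rng.standard_normal(t.size)
    coef = ar_fit(healthy)
    sigma = ar_residuals(healthy, coef).std()

    # Same signal with extra broadband fluctuation (e.g. stem chatter)
    # superimposed after sample 1000.
    faulty = healthy.copy()
    faulty[1000:] += 0.5 * rng.standard_normal(1000)
    alarm = np.abs(ar_residuals(faulty, coef)) > 5 * sigma

    print("first alarm near sample:", np.argmax(alarm) + len(coef))
    ```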

  16. A modular multi-microcomputer system for on-line vibration diagnostics

    International Nuclear Information System (INIS)

    Saedtler, E.

    1988-01-01

    A new modular multi-microprocessor system for on-line vibration monitoring and diagnostics of PWRs is described. The aim of the system is to make feasible the early detection of developing failures in relevant regions of a reactor plant, to verify the mechanical integrity of the investigated components, and thereby to improve the operational safety of the plant. After a discussion of the implemented surveillance methods and algorithms, which are based on hierarchically structured identification (estimation) and statistical pattern recognition tools, the system architecture (software and hardware) is portrayed. The classification scheme itself works sequentially, so that samples (or features) can arrive on-line. This on-line classification is important in order to take necessary actions in time. Furthermore, the system has learning capabilities, which means it is adaptable to different, varying states and plant conditions. The main features of the system are presented and its contribution to the automation of complex surveillance and monitoring tasks is shown. (author)

  17. Probabilistic Safety Analysis of High Speed and Conventional Lines Using Bayesian Networks

    Energy Technology Data Exchange (ETDEWEB)

    Grande Andrade, Z.; Castillo Ron, E.; O' Connor, A.; Nogal, M.

    2016-07-01

    A Bayesian network approach is presented for the probabilistic safety analysis (PSA) of railway lines. The idea consists of identifying and reproducing all the elements that the train encounters when circulating along a railway line, such as light and speed limit signals, tunnel or viaduct entries or exits, cuttings and embankments, acoustic sounds received in the cabin, curves, switches, etc. In addition, since human error is very relevant for safety evaluation, the automatic train protection (ATP) systems and the driver behaviour and its time evolution are modelled and taken into account to determine the probabilities of human errors. The nodes of the Bayesian network, their links and the associated probability tables are automatically constructed based on the line data, which need to be carefully given. The conditional probability tables are reproduced by closed formulas, which facilitates the modelling and the sensitivity analysis. A sorted list of the most dangerous elements in the line is obtained, which permits making decisions about line safety and programming maintenance operations in order to optimize them and reduce maintenance costs substantially. The proposed methodology is illustrated by its application to several cases that include real lines such as the Palencia-Santander and the Dublin-Belfast lines. (Author)

  18. Design considerations for on-line vibration diagnostic systems

    International Nuclear Information System (INIS)

    Branagan, L.A.; Schjeibel, J.R.

    1989-01-01

    The decisions made in the design of a data system for an on-line vibration diagnostic system in a power plant define how well the system will meet its intended goals. Direct use of the data for troubleshooting or for developing operating correlations requires an understanding of the subtle impact of the design decisions incorporated in the data system. A data system includes data acquisition, data storage, and data retrieval. Data acquisition includes the selection of sensors, of vibration measurement modes, and of the time stamping format, and the arrangement of data collection cycles. Data storage requires the evaluation of data compression options and of data segregation. Data retrieval design requires an understanding of the data storage and acquisition techniques. Each of these options and design decisions involves compromises, many of which are discussed in this paper. Actual and synthetic data are presented to illustrate these points. The authors' experience with multiple data collection cycles, with frequent monitoring, and with storage by exception suggests that these techniques can be developed into an effective diagnostic system.

  19. On-line quantile regression in the RKHS (Reproducing Kernel Hilbert Space) for operational probabilistic forecasting of wind power

    International Nuclear Information System (INIS)

    Gallego-Castillo, Cristobal; Bessa, Ricardo; Cavalcante, Laura; Lopez-Garcia, Oscar

    2016-01-01

    Wind power probabilistic forecasts are used as input in several decision-making problems, such as stochastic unit commitment, operating reserve setting and electricity market bidding. This work introduces a new on-line quantile regression model based on the Reproducing Kernel Hilbert Space (RKHS) framework. Its application to the field of wind power forecasting involves a discussion on the choice of the bias term of the quantile models, and the consideration of the operational framework in order to mimic real conditions. Benchmarking against linear and spline quantile regression models was performed for a real case study over an 18-month period. Model parameter selection was based on k-fold cross-validation. Results showed a noticeable improvement in terms of calibration, a key criterion for the wind power industry. Modest improvements in terms of the Continuous Ranked Probability Score (CRPS) were also observed for prediction horizons between 6 and 20 h ahead. - Highlights: • New on-line quantile regression model based on the Reproducing Kernel Hilbert Space. • First application to operational probabilistic wind power forecasting. • Modest improvements of CRPS for prediction horizons between 6 and 20 h ahead. • Noticeable improvements in terms of calibration due to on-line learning.
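    To show the flavour of on-line quantile regression, the sketch below updates a quantile model one observation at a time by stochastic gradient steps on the pinball loss. It is deliberately simpler than the paper's method, linear rather than RKHS-based, and the "wind" data, step size and quantile levels are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def online_quantile(X, y, tau, lr=0.05):
        """One-pass SGD on the pinball (quantile) loss for a linear model."""
        w, b = np.zeros(X.shape[1]), 0.0
        for x_t, y_t in zip(X, y):
            # Negative gradient of the pinball loss w.r.t. the prediction.
            g = tau if y_t > w @ x_t + b else tau - 1.0
            w += lr * g * x_t
            b += lr * g
        return w, b

    # Synthetic stand-in for wind power data: output rises with wind speed.
    n = 20_000
    speed = rng.uniform(0.0, 1.0, (n, 1))
    power = 0.8 * speed[:, 0] + 0.1 * rng.standard_normal(n)

    for tau in (0.1, 0.5, 0.9):
        w, b = online_quantile(speed, power, tau)
        print(f"tau={tau}: quantile forecast at speed 0.5 = {w[0] * 0.5 + b:.3f}")
    ```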

  20. Experiences with 'on-line' diagnostic instrumentation in nuclear power plants

    International Nuclear Information System (INIS)

    Gopal, R.; Ciaramitaro, W.; Smith, J.R.

    1981-01-01

    Over the past several years, Westinghouse has developed a coordinated system of on-line diagnostic instrumentation for the acquisition and analysis of data for diagnostics and incipient failure detection of critical plant equipment and systems. The primary motivation for this work is to improve Nuclear Steam Supply System (NSSS) availability and maintainability through the detection of malfunctions at their inception. These systems include: 1) acoustic leak monitoring for detection and location of leaks in the primary system pressure boundary and other piping systems in PWRs; 2) metal impact monitoring for detection of loose debris in the reactor vessel and steam generators; 3) nuclear noise monitoring of core barrel vibration. Summarized in this paper are some of the features of the systems and in-plant experience. (author)

  1. Density dependence of line intensities and application to plasma diagnostics

    International Nuclear Information System (INIS)

    Masai, Kuniaki.

    1993-02-01

    The electron density dependence of spectral lines is discussed in view of its application to the density diagnostics of plasmas. The dependence arises from competitive level population processes: radiative and collisional transitions from the excited states. Results of measurements on tokamak plasmas are presented to demonstrate the usefulness of line intensity ratios for density diagnostics. General characteristics related to density dependence are also discussed, with atomic-number scaling for H-like and He-like systems, to be helpful for application to higher density plasmas. (author)

  2. An on-line BCI for control of hand grasp sequence and holding using adaptive probabilistic neural network.

    Science.gov (United States)

    Hazrati, Mehrnaz Kh; Erfanian, Abbas

    2008-01-01

    This paper presents a new EEG-based Brain-Computer Interface (BCI) for on-line control of the sequence of hand grasping and holding in a virtual reality environment. The goal of this research is to develop an interaction technique that will allow the BCI to be effective in real-world scenarios for hand grasp control. Moreover, for consistency of the man-machine interface, it is desirable that the intended movement be what the subject imagines. For this purpose, we developed an on-line BCI based on the classification of EEG associated with imagination of the movement of hand grasping versus the resting state. A classifier based on a probabilistic neural network (PNN) was introduced for classifying the EEG. The PNN is a feedforward neural network that realizes the Bayes decision discriminant function by estimating the probability density function using mixtures of Gaussian kernels. Two types of classification schemes were considered here for on-line hand control: adaptive and static. In contrast to static classification, the adaptive classifier was continuously updated on-line during recording. The experimental evaluation on six subjects on different days demonstrated that by using the static scheme, a classification accuracy as high as the rate obtained by the adaptive scheme can be achieved. In the best case, an average classification accuracy of 93.0% and 85.8% was obtained using the adaptive and static schemes, respectively. The results obtained from more than 1500 trials on six subjects showed that an interactive virtual reality environment can be used as an effective tool for subject training in BCI.
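    The PNN described above admits a very small implementation: one Gaussian (Parzen) kernel per training pattern, class-conditional densities obtained by averaging, and the Bayes rule reduced to picking the class with the larger density estimate. The sketch below uses synthetic two-dimensional stand-ins for EEG features; the spread parameter sigma and all data are invented.

    ```python
    import numpy as np

    def pnn_classify(x, train_X, train_y, sigma=0.5):
        """PNN: Parzen density per class (Gaussian kernels) + Bayes decision."""
        scores = {}
        for c in np.unique(train_y):
            Xc = train_X[train_y == c]
            d2 = ((Xc - x) ** 2).sum(axis=1)
            scores[c] = np.exp(-d2 / (2.0 * sigma ** 2)).mean()
        return max(scores, key=scores.get)

    rng = np.random.default_rng(4)
    # Synthetic 2-D stand-ins for EEG features of the two mental states.
    imagery = rng.normal([1.0, 1.0], 0.4, (100, 2))
    rest = rng.normal([-1.0, -1.0], 0.4, (100, 2))
    train_X = np.vstack([imagery, rest])
    train_y = np.array(["imagery"] * 100 + ["rest"] * 100)

    print(pnn_classify(np.array([0.8, 1.2]), train_X, train_y))  # -> imagery
    ```

    The adaptive scheme in the paper corresponds to appending newly labelled trials to train_X and train_y between classifications, which a kernel-per-pattern classifier accommodates without retraining.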

  3. Characterization of the Goubau line for testing beam diagnostic instruments

    Science.gov (United States)

    Kim, S. Y.; Stulle, F.; Sung, C. K.; Yoo, K. H.; Seok, J.; Moon, K. J.; Choi, C. U.; Chung, Y.; Kim, G.; Woo, H. J.; Kwon, J.; Lee, I. G.; Choi, E. M.; Chung, M.

    2017-12-01

    One of the main characteristics of the Goubau line is that it supports a low-loss, non-radiated surface wave guided by a dielectric-coated metal wire. The dominant mode of the surface wave along the Goubau line is a TM01 mode, which resembles the pattern of the electromagnetic fields induced in the metallic beam pipe when a charged particle beam passes through it. Therefore, the Goubau line can be used for preliminary bench tests and performance optimization of beam diagnostic instruments without requiring charged particle beams from an accelerator. In this paper, we discuss the basic properties of the Goubau line for testing beam diagnostic instruments and present initial test results for button-type beam position monitors (BPMs). The experimental results are consistent with the theoretical estimations, which indicates that the Goubau line allows effective testing of beam diagnostic equipment.

  4. On-line monitoring system for utility boiler diagnostics

    International Nuclear Information System (INIS)

    Radovanovic, P.M.; Afgan, N.H.; Caralho, M.G.

    1997-01-01

    The paper deals with a newly developed modular Monitoring System for Utility Boiler Diagnostics. Each module is intended to assess a specific process and can be used as a stand-alone application. Four modules are developed, namely: LTC, a module for the on-line monitoring of parameters related to the lifetime consumption of selected boiler components; TRD, a module for tube rupture detection by position and working fluid leakage quantity; FAM, a module for the assessment of boiler surface fouling (slagging); and FLAP, a module for visualization of the boiler furnace flame position. All four modules are tested on respective pilot plants built on 200 and 300 MWe utility boilers. The Monitoring System is commercially available and can be realized in any combination of its modules, depending on demands induced by the operational problems of a specific boiler. Further development of the Monitoring System is performed in accordance with the respective EU project on the development of a Boiler Expert System. (Author)

  5. Time-dependent analysis of visible helium line-ratios for electron temperature and density diagnostic using synthetic simulations on NSTX-U

    Energy Technology Data Exchange (ETDEWEB)

    Muñoz Burgos, J. M., E-mail: jmunozbu@pppl.gov; Stutman, D.; Tritz, K. [Department of Physics and Astronomy, Johns Hopkins University, Baltimore, Maryland 21218 (United States); Barbui, T.; Schmitz, O. [Department of Engineering Physics, University of Wisconsin-Madison, Madison, Wisconsin 53706 (United States)

    2016-11-15

    Helium line-ratios for electron temperature (T{sub e}) and density (n{sub e}) plasma diagnostics in the Scrape-Off-Layer (SOL) and edge regions of tokamaks are widely used. Due to their intensities and proximity in wavelength, the singlet lines at 667.8 and 728.1 nm and the triplet line at 706.5 nm in the visible have typically been preferred. The time dependency of the triplet line (706.5 nm) has previously been analyzed in detail by including transient effects on line-ratios during gas-puff diagnostic applications. In this work, several line-ratio combinations within each of the two spin systems are analyzed with the purpose of eliminating transient effects, to extend the application of this powerful diagnostic to high temporal resolution characterization of plasmas. The analysis is done using synthetic emission modeling and diagnostics of several visible lines for low electron density NSTX SOL plasma conditions. Quasi-static equilibrium and time-dependent models are employed to evaluate transient effects on the atomic population levels that may affect the derived electron temperatures and densities as the helium gas-puff penetrates the plasma. The analysis of a wider range of spectral lines will help to extend this powerful diagnostic to experiments where the wavelength range of the measured spectra may be constrained either by limitations of the spectrometer or by other conflicting lines from different ions.

  6. Development of Probabilistic Performance Evaluation Procedure for Umbilical Lines of Seismically Isolated NPPs

    International Nuclear Information System (INIS)

    Hahm, Daegi; Park, Junhee; Choi, Inkil

    2013-01-01

    In this study, we propose a procedure for the probabilistic performance evaluation of interface piping systems for seismically isolated NPPs, and carry out a preliminary performance evaluation of a target example umbilical line. For EDB-level earthquakes, the target performance goal cannot be fulfilled, but we also find that the result can change with respect to variation of the assumed values, i.e., the distribution of the response and the limit state of the piping system. Recently, to design nuclear power plants (NPPs) more efficiently and safely against strong seismic loads, many researchers have focused on seismic isolation systems. For the adoption of a seismic isolation system in NPPs, the seismic performance of the isolation devices, structures, and components should be guaranteed first. Hence, several studies have been performed to determine the seismic performance of such items. For the interface piping system between the isolated structure and the non-isolated structure, the seismic capacity should be carefully estimated, since the required displacement absorption capacity will be increased significantly by the adoption of the seismic isolation system. In a recent NUREG report, probabilistic performance criteria for isolated NPP structures and components are proposed. Hence, in this study, we developed a probabilistic performance evaluation method and procedure for the interface piping system, and applied the method to an example pipe. The detailed procedure and main results are summarized in the next section.

  7. Vacuum and beam diagnostic controls for ORIC beam lines

    International Nuclear Information System (INIS)

    Tatum, B.A.

    1991-01-01

    Vacuum and beam diagnostic equipment on beam lines from the Oak Ridge Isochronous Cyclotron, ORIC, is now controlled by a new dedicated system. The new system is based on an industrial programmable logic controller, with an IBM AT personal computer providing the control room operator interface. Expansion of this system requires minimal reconfiguration and programming, thus facilitating the construction of additional beam lines. Details of the implementation, operation, and performance of the system are discussed. 2 refs., 2 figs

  8. On-line spectral diagnostic system for Dalian Coherent Light Source

    Energy Technology Data Exchange (ETDEWEB)

    Li, Chaoyang; Wei, Shen; Du, Xuewei [Dalian Institute of Chemical Physics, Chinese Academy of Sciences, Dalian 116023 (China); Du, Liangliang [National Synchrotron Radiation Laboratory, University of Science & Technology of China, Hefei, Anhui 230029 (China); Wang, Qiuping, E-mail: qiuping@ustc.edu.cn [Dalian Institute of Chemical Physics, Chinese Academy of Sciences, Dalian 116023 (China); Zhang, Weiqing; Wu, Guorong; Dai, Dongxu [Dalian Institute of Chemical Physics, Chinese Academy of Sciences, Dalian 116023 (China); Yang, Xueming, E-mail: xmyang@dicp.ac.cn [Dalian Institute of Chemical Physics, Chinese Academy of Sciences, Dalian 116023 (China)

    2015-05-21

    The Dalian Coherent Light Source (DCLS) is a free electron laser (FEL) user facility currently under construction in the northeast of China. It is designed to work on the high-gain harmonic generation principle, with the wavelength continuously tunable in the EUV regime of 50–150 nm. The light source has unique features such as tunable radiation frequency, wide spectral range, high brightness and peak power, very short pulse time structure, etc. A key diagnostic task in DCLS is on-line recording of the source spectral characteristics during source development, and for the definition of the experimental conditions. For this purpose, an on-line grazing incidence spectrometer with a toroidal mirror and a variable-line-spacing plane grating is designed and presented in this paper to monitor each single FEL pulse. A circular stage is chosen to fit the focal curve and to realize the wavelength scanning. This scanning mechanism is simple and stable. The resolving power (λ/Δλ) of this spectrometer is better than 12,000 in the whole wavelength range.

  9. On-line spectral diagnostic system for Dalian Coherent Light Source

    International Nuclear Information System (INIS)

    Li, Chaoyang; Wei, Shen; Du, Xuewei; Du, Liangliang; Wang, Qiuping; Zhang, Weiqing; Wu, Guorong; Dai, Dongxu; Yang, Xueming

    2015-01-01

    The Dalian Coherent Light Source (DCLS) is a free electron laser (FEL) user facility currently under construction in the northeast of China. It is designed to work on the high-gain harmonic generation principle, with the wavelength continuously tunable in the EUV regime of 50–150 nm. The light source has unique features such as tunable radiation frequency, wide spectral range, high brightness and peak power, very short pulse time structure, etc. A key diagnostic task in DCLS is on-line recording of the source spectral characteristics during source development, and for the definition of the experimental conditions. For this purpose, an on-line grazing incidence spectrometer with a toroidal mirror and a variable-line-spacing plane grating is designed and presented in this paper to monitor each single FEL pulse. A circular stage is chosen to fit the focal curve and to realize the wavelength scanning. This scanning mechanism is simple and stable. The resolving power (λ/Δλ) of this spectrometer is better than 12,000 in the whole wavelength range.

  10. A new gamma-ray diagnostic for energetic ion distributions - The Compton tail on the neutron capture line

    International Nuclear Information System (INIS)

    Vestrand, W.T.

    1990-01-01

    This paper presents a new radiation diagnostic for assaying the energy spectrum and the angular distribution of energetic ions incident on thick hydrogen-rich thermal targets. This diagnostic compares the number of emergent photons in the narrow neutron capture line at 2.223 MeV to the number of Compton-scattered photons that form a low-energy tail on the line. It is shown that the relative strength of the tail can be used as a measure of the hardness of the incident ion energy spectrum. Application of this diagnostic to solar flare conditions is the main thrust of the work presented here. We examine how the strength of the Compton tail varies with flare viewing angle and with the angular distribution of the flare-accelerated particles. Application to compact X-ray binary systems is also briefly discussed. 39 refs

  11. On-line Vibration Diagnostics of the Optical Elements at BL-28 of the Photon Factory

    International Nuclear Information System (INIS)

    Maruyama, T.; Kashiwagi, T.; Kikuchi, T.; Toyoshima, A.; Kubota, M.; Ono, K.

    2007-01-01

    We have analyzed the data from encoders attached to the optical elements and developed an on-line vibration diagnostics system for the monochromator. After eliminating the vibration source, we have been able to improve the performance of the monochromator.

  12. Operational experiences on the Borssele nuclear power plant using computer based surveillance and diagnostic system on-line

    International Nuclear Information System (INIS)

    Turkcan, E.; Quaadvliet, W.H.J.; Peeters, T.T.J.M.; Verhoef, J.P

    1991-06-01

    The on-line monitoring and diagnostics system of the Borssele nuclear power plant (NPP), designed and established by the ECN Energy Research Foundation, has been operating continuously since 1983. The system has been extended into a multiprocessing, multi-tasking structure performing real-time monitoring, on-line calculation of reactor parameters, and database preparation for expert systems, providing early information on possible malfunctions, even at the incipient stage, by alerting through passive alarms. The system has already been operating over the course of 7 fuel cycles of the reactor, from start-up through normal power operation. An expert system operating on a VAX workstation has been added to the surveillance and diagnostics system for database management of the observed physical parameters relevant to the NPP under supervision. The paper highlights the surveillance and diagnostic modules involved, in their actual hierarchical form in use; presents theoretical considerations applied to the design of the surveillance system, together with the results obtained through the 12th to 17th fuel cycles of the NPP, including start-ups and shut-downs; and reveals the experience thus gained by both the utility and ECN through the application of the system described. (author). 19 refs.; 4 figs

  13. A probabilistic approach to emission-line galaxy classification

    Science.gov (United States)

    de Souza, R. S.; Dantas, M. L. L.; Costa-Duarte, M. V.; Feigelson, E. D.; Killedar, M.; Lablanche, P.-Y.; Vilalta, R.; Krone-Martins, A.; Beck, R.; Gieseke, F.

    2017-12-01

    We invoke a Gaussian mixture model (GMM) to jointly analyse two traditional emission-line classification schemes of galaxy ionization sources: the Baldwin-Phillips-Terlevich (BPT) and WH α versus [N II]/H α (WHAN) diagrams, using spectroscopic data from the Sloan Digital Sky Survey Data Release 7 and SEAGal/STARLIGHT data sets. We apply a GMM to empirically define classes of galaxies in a three-dimensional space spanned by the log [O III]/H β, log [N II]/H α and log EW(H α) optical parameters. The best-fitting GMM based on several statistical criteria suggests a solution around four Gaussian components (GCs), which are capable of explaining up to 97 per cent of the data variance. Using elements of information theory, we compare each GC to its respective astronomical counterpart. GC1 and GC4 are associated with star-forming galaxies, suggesting the need to define a new starburst subgroup. GC2 is associated with BPT's active galactic nuclei (AGN) class and WHAN's weak AGN class. GC3 is associated with BPT's composite class and WHAN's strong AGN class. Conversely, there is no statistical evidence, based on four GCs, for the existence of a Seyfert/low-ionization nuclear emission-line region (LINER) dichotomy in our sample. Notwithstanding, the inclusion of an additional GC5 unravels it. GC5 appears associated with the LINER and passive galaxies on the BPT and WHAN diagrams, respectively. This indicates that if the Seyfert/LINER dichotomy is there, it does not contribute significantly to the global data variance and may be overlooked by standard metrics of goodness of fit. Subtleties aside, we demonstrate the potential of our methodology to recover/unravel different objects inside the wilderness of astronomical data sets, without lacking the ability to convey physically interpretable results. The probabilistic classifications from the GMM analysis are publicly available within the COINtoolbox at https://cointoolbox.github.io/GMM_Catalogue/.
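    The workflow described above, fitting GMMs of increasing size, selecting one by a statistical criterion, and reading off soft class memberships, can be reproduced in miniature with scikit-learn. The three-dimensional data below are synthetic stand-ins for the paper's optical parameters; only the procedure mirrors the study.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(5)

    # Synthetic stand-ins for the three optical parameters
    # (log [O III]/Hb, log [N II]/Ha, log EW(Ha)); values are invented.
    star_forming = rng.normal([-0.3, -0.5, 1.5], 0.15, (500, 3))
    agn = rng.normal([0.6, 0.1, 0.8], 0.20, (300, 3))
    X = np.vstack([star_forming, agn])

    # Fit GMMs of increasing size and select one by BIC, mirroring the
    # paper's criterion-based choice of the number of components.
    fits = [GaussianMixture(n_components=k, random_state=0).fit(X)
            for k in range(1, 6)]
    best = min(fits, key=lambda m: m.bic(X))
    print("components selected:", best.n_components)

    # Soft (probabilistic) memberships for a new object.
    print(best.predict_proba([[0.5, 0.0, 0.9]]).round(3))
    ```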

  14. On-line surveillance system for Borssele nuclear power plant monitoring and diagnostics

    International Nuclear Information System (INIS)

    Tuerkcan, E.; Ciftcioglu, Oe.

    1993-08-01

    An operating on-line surveillance and diagnostic system is described, addressing information processing for monitoring, fault diagnosis and plant maintenance. The surveillance system, by means of its real-time multiprocessing, multitasking execution capabilities, can perform plant-wide and wide-range monitoring for enhanced plant safety and operational reliability as well as enhanced maintenance. At the same time, the system provides possibilities for goal-oriented research and development, such as estimation, filtering, verification and validation, and neural networks. (orig./HP)

  15. On-line fatigue monitoring and margins probabilistic assessment

    International Nuclear Information System (INIS)

    Fournier, I.; Morilhat, P.

    1993-01-01

    An on-line computer-aided system has been developed by Electricite de France, the French utility, for fatigue monitoring of critical locations in the nuclear steam supply system. This tool, called a fatigue meter, includes as input data only existing plant parameters and is based on some conservative assumptions at several steps of the damage assessment (thermal boundary conditions, stress computation, ...). This paper presents recent developments performed toward a better assessment of the margins involved in the complete analysis. The methodology is illustrated with an example showing the influence of uncertainty in plant parameters on the final stress computed at a PWR 900 MW unit pressurizer surge line nozzle. (author)

  16. Agent-based station for on-line diagnostics by self-adaptive laser Doppler vibrometry

    Science.gov (United States)

    Serafini, S.; Paone, N.; Castellini, P.

    2013-12-01

    A self-adaptive diagnostic system based on laser vibrometry is proposed for the quality control of mechanical defects by vibration testing; it is developed for appliances at the end of an assembly line, but its characteristics are generally suited to testing most types of electromechanical products. It consists of a laser Doppler vibrometer, equipped with scanning mirrors and a camera, which implements self-adaptive behaviour to optimize the measurement. The system is conceived as a Quality Control Agent (QCA) and is part of a Multi-Agent System that supervises the whole production line. The QCA behaviour is defined so as to minimize measurement uncertainty during the on-line tests and to compensate for target mis-positioning under the guidance of a vision system. The best measurement conditions are reached by maximizing the amplitude of the optical Doppler beat signal (signal quality) and consequently minimizing uncertainty. In this paper, the optimization strategy for measurement enhancement achieved by the downhill algorithm (Nelder-Mead algorithm) and its effect on signal quality improvement are discussed. Tests on a washing machine under controlled operating conditions allow evaluation of the efficacy of the method; a significant reduction of noise on vibration velocity spectra is observed. Results from on-line tests are presented, which demonstrate the potential of the system for industrial quality control.
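    For a feel of the optimization step, the sketch below treats Doppler signal quality as a black-box function of the two scanning-mirror angles and lets the Nelder-Mead simplex climb it; since the real objective exists only as a measurement, a hypothetical noisy Gaussian peak stands in for it here.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(6)

    # Hypothetical stand-in for the measured Doppler signal quality as a
    # function of the two scanning-mirror angles: a smooth peak plus a
    # little measurement noise (the real objective is not analytic).
    OPTIMUM = np.array([0.12, -0.07])
    def signal_quality(angles):
        peak = np.exp(-40.0 * np.sum((np.asarray(angles) - OPTIMUM) ** 2))
        return peak + 0.005 * rng.normal()

    # Nelder-Mead needs no gradients, which suits a noisy, measured
    # objective; we minimise the negative signal quality.
    res = minimize(lambda a: -signal_quality(a), x0=[0.0, 0.0],
                   method="Nelder-Mead",
                   options={"xatol": 1e-3, "fatol": 1e-3})
    print("mirror angles found:", res.x.round(3))
    ```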

  18. A Study on Remote On-Line Diagnostic System for Vehicles by Integrating the Technology of OBD, GPS, and 3G

    OpenAIRE

    Jyong Lin; Shih-Chang Chen; Yu-Tsen Shih; Shi-Huang Chen

    2009-01-01

    This paper presents a remote on-line diagnostic system for vehicles via the use of On-Board Diagnostic (OBD), GPS, and 3G techniques. The main parts of the proposed system are the on-board computer, the vehicle monitor server, and the vehicle status browser. First, the on-board computer obtains the location of the driver and the vehicle status from the GPS receiver and the OBD interface, respectively. Then the on-board computer connects with the vehicle monitor server through the 3G network to trans...

  19. Disjunctive Probabilistic Modal Logic is Enough for Bisimilarity on Reactive Probabilistic Systems

    OpenAIRE

    Bernardo, Marco; Miculan, Marino

    2016-01-01

    Larsen and Skou characterized probabilistic bisimilarity over reactive probabilistic systems with a logic including true, negation, conjunction, and a diamond modality decorated with a probabilistic lower bound. Later on, Desharnais, Edalat, and Panangaden showed that negation is not necessary to characterize the same equivalence. In this paper, we prove that the logical characterization holds also when conjunction is replaced by disjunction, with negation still not being necessary. To this e...

  20. Resonance broadening of Hg lines as a density diagnostic in high intensity discharge lamps

    International Nuclear Information System (INIS)

    Lawler, J E

    2004-01-01

    The use of width measurements on resonance broadened lines of Hg as a density diagnostic in high intensity discharge (HID) lamps is reviewed and further developed in this paper. Optical depths of Hg I lines at 491.6 nm, 577.0 nm, and 1014 nm are computed as a function of temperature to confirm that these lines are optically thin in most HID lamps. The effect of quadratic and quartic radial temperature variation on the width of resonance broadened lines is computed for arc core temperatures from 4000 K to 7000 K. Such variations in temperature, and inverse variations in Hg density, are found to increase the line widths by less than 10% for 'side-on' emission measurements averaged over the arc radius. Theoretical profiles of resonance broadened spectral lines, both radially averaged and as a function of chord offset, are presented. Observations of resonance broadened lines in a metal-halide HID lamp are presented and analysed. It is concluded that the widths of resonance broadened lines provide a convenient and reliable diagnostic for the arc core Hg density but are generally not very sensitive to the radial temperature and Hg density gradient

  1. Diagnostics on Z (invited)

    International Nuclear Information System (INIS)

    Nash, T. J.; Derzon, M. S.; Chandler, G. A.; Fehl, D. L.; Leeper, R. J.; Porter, J. L.; Spielman, R. B.; Ruiz, C.; Cooper, G.; McGurn, J.

    2001-01-01

    The 100 ns, 20 MA pinch driver Z is surrounded by an extensive set of diagnostics. There are nine radial lines of sight set at 12° above horizontal, and each of these may be equipped with up to five diagnostic ports. Instruments routinely fielded viewing the pinch from the side with these ports include x-ray diode arrays, photoconducting detector arrays, bolometers, transmission grating spectrometers, time-resolved x-ray pinhole cameras, x-ray crystal spectrometers, calorimeters, silicon photodiodes, and neutron detectors. A diagnostic package fielded on axis for viewing internal pinch radiation consists of nine lines of sight. This package accommodates virtually the same diagnostics as the radial ports. Other diagnostics not fielded on the axial or radial ports include current B-dot monitors, filtered x-ray scintillators coupled by fiber optics to streak cameras, streaked visible spectroscopy, a velocity interferometer system for any reflector (VISAR), bremsstrahlung cameras, and active shock breakout measurement of hohlraum temperature. The data acquisition system is capable of recording up to 500 channels and the data from each shot are available on the Internet. A major new diagnostic presently under construction is the BEAMLET backlighter. We briefly describe each of these diagnostics and present some of the highest-quality data from them

  2. Power plant experience with artificial intelligence based, on-line diagnostic systems

    International Nuclear Information System (INIS)

    Osborne, R.L.; Coffman, M.

    1987-01-01

    The utility industry is entering a period in which generation equipment availability becomes increasingly critical due to the lack of new power plants being planned and built. The increasing percentage of all-electric homes, adding to peak demands, requires more plant equipment to be used in a cyclic duty mode. Availability is on the increase, with forced and planned maintenance hours decreasing. Factors contributing to this improvement are new units coming on-line with the latest technology, coupled with the installation of retrofit components containing that same technology, such as Rigi-Flex generators and ruggedized turbine rotors. In conjunction with hardware advances, technology advancements in monitoring and diagnostics are permitting the identification of potential malfunctions so that corrective actions can be taken, thus preventing lengthy outages. It is this last area that this paper addresses

  3. Influence of a source line position on results of EM observations applied to the diagnostics of underground heating system pipelines in urban area

    Science.gov (United States)

    Vetrov, A.

    2009-05-01

    The condition of underground constructions, communication and supply systems in cities has to be periodically monitored and controlled in order to prevent breakage, which can result in serious accidents, especially in urban areas. At most risk of damage are underground constructions made of steel, such as the pipelines widely used for water, gas and heat supply. To ensure pipeline survivability it is necessary to carry out operative and inexpensive control of pipeline condition. Induced electromagnetic methods of geophysics can be applied to provide such diagnostics. The highly developed surface in urban areas is one of the causes hampering the realization of electromagnetic methods of diagnostics. The main problem is finding an appropriate place for the source line and electrodes on a limited surface area, and their optimal position relative to the observation path, to minimize their influence on observed data. The author made a number of experiments on underground heating system pipeline diagnostics using different positions of the source line and electrodes. The experiments were made on a 200-meter section over a 2-meter-deep pipeline. The admissible length of the source line and the angle between the source line and the observation path were determined. The minimal length of the source line for the experiment conditions and accuracy was 30 meters; the maximum admissible angle departure from the perpendicular position was 30 degrees. The work was undertaken in cooperation with the diagnostics company DIsSO, Saint-Petersburg, Russia.

  4. Novel Infiltration Diagnostics based on Laser-line Scanning and Infrared Temperature Field Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Xinwei [Iowa State Univ., Ames, IA (United States)

    2017-12-08

    This project targets the building energy efficiency problems induced by building infiltration/leaks. Current infiltration inspection techniques often require extensive visual inspection and/or a whole-building pressure test. These techniques cannot meet more than three of the five criteria of an ideal infiltration diagnostic: 1. location and extent diagnostics, 2. building-level application, 3. least surface preparation, 4. weather-proofness, and 5. non-disruption to building occupants. They are either too expensive or time consuming, and often lack accuracy and repeatability. They are hardly applicable to facades/facade sections. The goal of the project was to develop a novel infiltration diagnostic technology based on laser line-scanning and simultaneous infrared temperature imaging. A laboratory-scale experimental setup was designed to mimic a model house with a well-defined pressure difference below or above the outside pressure. Algorithms and Matlab-based programs were developed for recognition of hole locations in infrared images. Our experiments, based on laser wavelengths of 450 and 1550 nm and laser beam diameters of 4-25 mm, showed that the locations of holes could be identified using laser heating; the diagnostic approach, however, could not readily distinguish between infiltration and non-infiltration points. To significantly improve the scanning throughput and recognition accuracy, a second approach was explored, developed, and extensively tested. It incorporates a liquid spray on the surface to induce an extra phase-change cooling effect. In this spray method, which we termed PECIT (Phase-change Enhanced Cooling Infrared Thermography), phase-change enhanced cooling was used, which significantly amplifies the effect of air flow (infiltration and exfiltration). This heat transfer method worked extremely well to identify infiltration and exfiltration locations with high accuracy and increased throughput. The PECIT technique was

  5. On-line fatigue monitoring and probabilistic assessment of margins

    Energy Technology Data Exchange (ETDEWEB)

    Fournier, I. [Electricite de France, 93 - Saint-Denis (France). Direction des Etudes et Recherches; Morilhat, P. [Electricite de France, 93 - Saint-Denis (France). Direction des Etudes et Recherches

    1995-01-01

    An on-line computer-aided system has been developed by Electricite de France, the French utility, for fatigue monitoring of critical locations in the nuclear steam supply system. This tool, called a fatigue meter, takes plant parameters as input data and is based on conservative assumptions at several steps of the damage assessment (thermal boundary conditions, stress computation, ...). In this paper we present recent developments performed towards a better assessment of the margins involved in the complete analysis. The methodology is illustrated with an example showing the influence of uncertainty in plant parameters on the final stress computed at the pressurizer surge line nozzle of a 900 MW pressurized water reactor unit. A second example illustrates the possibility of defining transient archetypes. ((orig.)).

  6. A Practical Probabilistic Graphical Modeling Tool for Weighing ...

    Science.gov (United States)

    Past weight-of-evidence frameworks for adverse ecological effects have provided soft-scoring procedures for judgments based on the quality and measured attributes of evidence. Here, we provide a flexible probabilistic structure for weighing and integrating lines of evidence for ecological risk determinations. Probabilistic approaches can provide both a quantitative weighing of lines of evidence and methods for evaluating risk and uncertainty. The current modeling structure was developed for propagating uncertainties in measured endpoints and their influence on the plausibility of adverse effects. To illustrate the approach, we apply the model framework to the sediment quality triad, using example lines of evidence for sediment chemistry measurements, bioassay results, and in situ infauna diversity of benthic communities in a simplified hypothetical case study. We then combine the three lines of evidence, evaluate sensitivity to the input parameters, and show how uncertainties are propagated and how additional information can be incorporated to rapidly update the probability of impacts. The developed network model can be expanded to accommodate additional lines of evidence, variables and states of importance, and different types of uncertainties in the lines of evidence, including spatial and temporal as well as measurement errors. We provide a flexible Bayesian network structure for weighing and integrating lines of evidence for ecological risk determinations.
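
    The kind of probabilistic weighing described above can be reduced, in its simplest form, to a Bayesian odds update over independent lines of evidence. The sketch below is a minimal illustration of that mechanism under an independence assumption, not the authors' network model; the prior and the likelihood ratios are invented for the example.

```python
# Minimal sketch: weighing lines of evidence for an "impact" hypothesis
# with a naive Bayesian odds update. All numbers are illustrative.

def posterior(prior: float, likelihood_ratios: list[float]) -> float:
    """P(impact | evidence), where each likelihood ratio is
    P(evidence | impact) / P(evidence | no impact)."""
    odds = prior / (1.0 - prior)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1.0 + odds)

# Hypothetical sediment-quality-triad lines of evidence:
# chemistry exceedance, bioassay toxicity, reduced infauna diversity.
triad = [4.0, 2.5, 1.8]
print(posterior(0.5, triad))          # probability of impact from three lines
print(posterior(0.5, triad + [3.0]))  # rapid update when new evidence arrives
```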

  7. Implementation of an integrated on-line process surveillance and diagnostic system at the Halden reactor project: MOAS

    International Nuclear Information System (INIS)

    Kim, I.S.; Grini, R.-E.; Nilsen, S.

    2001-01-01

    MOAS is an integrated on-line process surveillance and diagnostic system that uses several different models for knowledge acquisition and diagnostic reasoning, such as the goal-tree success-tree model, process monitor trees, and sensor failure diagnosis trees. Within these models, knowledge of the process and its operation is incorporated, including deep knowledge such as mass balances or controller algorithms. During an extensive review, made as part of the integrated diagnosis system project of the Halden reactor project, MOAS (Maryland Operator Advisory System) was identified as one of the most thorough systems developed thus far. MOAS encompasses the diverse functional aspects required for effective process disturbance management: (1) intelligent process monitoring and alarming, (2) on-line sensor data validation and sensor failure diagnosis, (3) on-line hardware (besides sensors) failure diagnosis, and (4) real-time corrective measure synthesis. The MOAS methodology was applied to the NORS (Nokia Research Simulator) process at the Halden man-machine laboratory HAMMLAB of the OECD Halden reactor project. Performance tests of MOAS, implemented in the G2 real-time expert system shell, show that MOAS successfully carries out its intended functions, i.e. quickly recognizing an occurring disturbance, correctly diagnosing its cause, and presenting advice on its control to the operator. The lessons learned and insights gained during the implementation and performance tests are also discussed.

  8. The beam diagnostic instruments in Beijing radioactive ion-beam facilities isotope separator on-line

    International Nuclear Information System (INIS)

    Ma, Y.; Cui, B.; Ma, R.; Tang, B.; Chen, L.; Huang, Q.; Jiang, W.

    2014-01-01

    The beam diagnostic instruments for the Beijing Radioactive Ion-beam Facilities Isotope Separator On-Line are introduced [B. Q. Cui, Z. H. Peng, Y. J. Ma, R. G. Ma, B. Tang, T. Zhang, and W. S. Jiang, Nucl. Instrum. Methods 266, 4113 (2008); T. J. Zhang, X. L. Guan, and B. Q. Cui, in Proceedings of APAC 2004, Gyeongju, Korea, 2004, http://www.jacow.org , p. 267]. For the low intensity ion beam (30-300 keV, 1 pA-10 μA), a beam profile monitor, an emittance measurement unit, and an analyzing slit will be installed. For the primary proton beam (100 MeV, 200 μA), a beam profile scanner will be installed. For identification of the nuclide, a beam identification unit will be installed. Details of the prototypes of the beam diagnostic units and some experimental results are described in this article.

  9. Line and continuum spectroscopy as diagnostic tools for gamma ray bursts

    International Nuclear Information System (INIS)

    Liang, E.P.

    1990-12-01

    We review the theoretical framework of both line and continuum spectrum formation in gamma ray bursts. These include the cyclotron features at tens of keV, redshifted annihilation features at ∼400 keV, as well as other potentially detectable nuclear transition lines, atomic x-ray lines, proton cyclotron lines and plasma oscillation lines. By combining the parameters derived from line and continuum modeling we can try to reconstruct the location, geometry and physical conditions of the burst emission region, thereby constraining and discriminating between the astrophysical models. Hence spectroscopy with current and future generations of detectors should provide powerful diagnostic tools for gamma ray bursters. 48 refs., 10 figs., 4 tabs

  10. Methods for Probabilistic Fault Diagnosis: An Electrical Power System Case Study

    Science.gov (United States)

    Ricks, Brian W.; Mengshoel, Ole J.

    2009-01-01

    Health management systems that more accurately and quickly diagnose faults that may occur in different technical systems on-board a vehicle will play a key role in the success of future NASA missions. We discuss in this paper the diagnosis of abrupt continuous (or parametric) faults within the context of probabilistic graphical models, more specifically Bayesian networks compiled to arithmetic circuits. This paper extends our previous research, within the same probabilistic setting, on the diagnosis of abrupt discrete faults. Our approach and diagnostic algorithm ProDiagnose are domain-independent; however, we use an electrical power system testbed called ADAPT as a case study. In one set of ADAPT experiments, performed as part of the 2009 Diagnostic Challenge, our system turned out to have the best performance among all competitors. In a second set of experiments, we show how we have recently made further significant improvements to the performance of the probabilistic model of ADAPT. While these results were obtained on an electrical power system testbed, we believe they can easily be transitioned to real-world systems, thus promising to increase the success of future NASA missions.
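
    A note on the arithmetic-circuit step: compiling a Bayesian network yields a network polynomial whose bottom-up evaluation takes a fixed number of additions and multiplications, which is what makes the diagnosis time predictable. The toy sketch below (two nodes and invented numbers, not the ADAPT model) evaluates such a polynomial directly.

```python
# Network polynomial of a tiny Bayesian network Fault -> Reading,
# evaluated bottom-up; numbers are illustrative assumptions.

p_fault = {True: 0.01, False: 0.99}          # P(F)
p_read = {                                   # P(R | F)
    (True, "bad"): 0.95, (True, "ok"): 0.05,
    (False, "bad"): 0.02, (False, "ok"): 0.98,
}

def evidence_probability(reading: str) -> float:
    """Sum over fault states of P(f) * P(reading | f): one pass over the circuit."""
    return sum(p_fault[f] * p_read[(f, reading)] for f in (True, False))

def posterior_fault(reading: str) -> float:
    """P(F = true | reading): one more product and a division."""
    return p_fault[True] * p_read[(True, reading)] / evidence_probability(reading)

print(posterior_fault("bad"))   # fault isolation from a suspicious reading
```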

  11. On Farmer's line, probability density functions, and overall risk

    International Nuclear Information System (INIS)

    Munera, H.A.; Yadigaroglu, G.

    1986-01-01

    Limit lines used to define quantitative probabilistic safety goals can be categorized according to whether they are based on discrete pairs of event sequences and associated probabilities, on probability density functions (pdf's), or on complementary cumulative density functions (CCDFs). In particular, the concept of the well-known Farmer's line and its subsequent reinterpretations is clarified. It is shown that Farmer's lines are pdf's and, therefore, the overall risk (defined as the expected value of the pdf) that they represent can be easily calculated. It is also shown that the area under Farmer's line is proportional to probability, while the areas under CCDFs are generally proportional to expected value
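
    The abstract's two claims can be written out explicitly. For a non-negative consequence variable C with pdf f and CCDF F̄, the overall risk is the expected value of the pdf, and the area under the CCDF equals that same expected value (the standard tail-sum identity):

```latex
\begin{align}
  R &= \mathbb{E}[C] = \int_0^\infty c\, f(c)\, \mathrm{d}c, \\
  \int_0^\infty \bar{F}(c)\, \mathrm{d}c
    &= \int_0^\infty \Pr(C > c)\, \mathrm{d}c = \mathbb{E}[C].
\end{align}
```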

  12. Plasma diagnostics using the He I 447.1 nm line at high and low densities

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez, Manuel A [Departamento de Fisica Aplicada, E.T.S.I. Informatica, Universidad de Valladolid, 47071 Valladolid (Spain); Ivkovic, Milivoje; Jovicevic, Sonja; Konjevic, Nikola [Institute of Physics, University of Belgrade, 11081 Belgrade, PO Box 68 (Serbia); Gigosos, Marco A; Lara, Natividad, E-mail: manuelgd@termo.uva.es, E-mail: gigosos@coyanza.opt.cie.uva.es [Departamento de Fisica Teorica, Atomica y Optica, Facultad de Ciencias, Universidad de Valladolid, 47071 Valladolid (Spain)

    2011-05-18

    The broadening of the He I 447.1 nm line and its forbidden components in plasmas is studied using computer simulation techniques, and the results are compared with our own and other experiments. The calculations cover wide ranges of electron densities and temperatures. Experimental measurements are performed with a high electron density pulsed discharge and with a low electron density microwave torch at atmospheric pressure. Both calculations and experimental measurements are extended from previous works towards low electron densities in order to study the accuracy of plasma diagnostics using this line in ranges of interest for different practical applications. The calculation results are compared with experimental profiles registered in plasmas diagnosed using independent techniques. The agreement obtained justifies the use of these line parameters for plasma diagnostics. The influence of self-absorption on line parameters is also analysed. It is shown that the separation between the peaks of the allowed and forbidden components exhibits a clear dependence on plasma electron density that is free of self-absorption influence. This allows the peak separation to be used as a good parameter for plasma diagnostics. From the simulation results, a simple fitting formula is derived that permits obtaining the electron number density in the range 5 × 10^22 - 7 × 10^23 m^-3. At lower densities, fitting simulated to experimental full profiles is a reliable method for N_e determination.
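
    The abstract does not reproduce the fitting formula itself, so the sketch below assumes a generic power law s = a·N_e^b between the allowed-forbidden peak separation s and the electron density N_e, fits it in log-log space from invented calibration points, and inverts it, simply to show how such a diagnostic would be applied in practice.

```python
# Hypothetical peak-separation density diagnostic: fit s = a * Ne**b
# in log-log space and invert it. All numbers are illustrative, not
# the published calibration.
import numpy as np

ne_sim = np.array([5e22, 1e23, 3e23, 7e23])  # m^-3, simulated densities
s_sim = np.array([0.14, 0.20, 0.33, 0.48])   # nm, simulated peak separations

b, log_a = np.polyfit(np.log(ne_sim), np.log(s_sim), 1)  # log s = b*log Ne + log a
a = np.exp(log_a)

s_measured = 0.25                             # nm, from an observed profile
ne_estimate = (s_measured / a) ** (1.0 / b)   # invert the power law
print(f"Ne ~ {ne_estimate:.2e} m^-3")
```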

  13. Prediction of temperature-insensitive molecular absorption lines in laser-assisted combustion diagnostics

    International Nuclear Information System (INIS)

    Walewski, Joachim W.; Elmqvist, Anders

    2005-01-01

    In laser-assisted combustion diagnostics it is a recurring task to predict molecular transitions whose signal strength depends only weakly on variations in temperature. The signal strength is proportional to the Boltzmann fraction of the level probed and the amplitude of the absorption line profile. In the past, investigations have been presented in which this task was attacked by detailed numerical calculations of the temperature dependence of the pertinent physical properties of the molecule. Another widely applied approach relies on an analytical formula for the Boltzmann fraction of heteronuclear diatomic molecules and the neglect of line shape effects. The analytical approach enjoys continuing popularity in laser-assisted combustion diagnostics, which is why we compared the two approaches with each other. The objective of this comparison was to assess the accuracy of the analytical approach and to reveal its potential pitfalls. Our comparison revealed that the analytical approach suffers from mediocre accuracy, which makes it unfit for practical applications. One cause is the neglect of higher-lying vibrational levels, which show a non-negligible population at typical flame temperatures. Another is the neglect of fine structure splitting in molecules with non-zero orbital angular momentum in the ground state. A further reason for the observed inaccuracy is the neglect of line shape effects and quenching, which were found to have a significant effect on the temperature sensitivity of a line. Because of its insufficient accuracy, due to both oversimplified models of the molecular energy levels and the neglect of line shape effects and quenching, we discourage the use of the analytical approach and recommend detailed numerical approaches that are free of the above limitations.
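
    As a minimal illustration of the detailed numerical route, the sketch below scans rigid-rotor Boltzmann fractions over a flame temperature range and selects the rotational level whose population varies least. It deliberately omits vibrational levels, fine structure, line shapes and quenching, exactly the effects the paper shows cannot be neglected, and uses approximate constants for CO.

```python
# Find the most temperature-insensitive rotational level of a rigid-
# rotor diatomic (simplified: no vibration, fine structure, or line
# shape effects). Constants are approximate values for CO.
import numpy as np

B = 1.93    # cm^-1, rotational constant of CO (approximate)
k = 0.695   # cm^-1 per kelvin, Boltzmann constant in wavenumbers
T = np.linspace(1200.0, 2200.0, 201)   # flame temperature range, K

def boltzmann_fraction(J, T):
    """Population fraction of rotational level J at temperatures T."""
    Js = np.arange(200)
    E = B * Js * (Js + 1)
    Z = np.sum((2 * Js[:, None] + 1) * np.exp(-E[:, None] / (k * T)), axis=0)
    return (2 * J + 1) * np.exp(-B * J * (J + 1) / (k * T)) / Z

spread = {}
for J in range(40):
    f = boltzmann_fraction(J, T)
    spread[J] = np.ptp(f) / f.mean()   # relative variation over the T range

best_J = min(spread, key=spread.get)
print(best_J, spread[best_J])          # least temperature-sensitive level
```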

  14. A framework for the probabilistic analysis of meteotsunamis

    Science.gov (United States)

    Geist, Eric L.; ten Brink, Uri S.; Gove, Matthew D.

    2014-01-01

    A probabilistic technique is developed to assess the hazard from meteotsunamis. Meteotsunamis are unusual sea-level events, generated when the speed of an atmospheric pressure or wind disturbance is comparable to the phase speed of long waves in the ocean. A general aggregation equation is proposed for the probabilistic analysis, based on previous frameworks established for both tsunamis and storm surges, incorporating different sources and source parameters of meteotsunamis. Parameterization of atmospheric disturbances and numerical modeling is performed for the computation of maximum meteotsunami wave amplitudes near the coast. A historical record of pressure disturbances is used to establish a continuous analytic distribution of each parameter as well as the overall Poisson rate of occurrence. A demonstration study is presented for the northeast U.S. in which only isolated atmospheric pressure disturbances from squall lines and derechos are considered. For this study, Automated Surface Observing System stations are used to determine the historical parameters of squall lines from 2000 to 2013. The probabilistic equations are implemented using a Monte Carlo scheme, where a synthetic catalog of squall lines is compiled by sampling the parameter distributions. For each entry in the catalog, ocean wave amplitudes are computed using a numerical hydrodynamic model. Aggregation of the results from the Monte Carlo scheme results in a meteotsunami hazard curve that plots the annualized rate of exceedance with respect to maximum event amplitude for a particular location along the coast. Results from using multiple synthetic catalogs, resampled from the parent parameter distributions, yield mean and quantile hazard curves. Further refinements and improvements for probabilistic analysis of meteotsunamis are discussed.
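
    A stripped-down version of the Monte Carlo aggregation might look as follows. The Poisson rate, the parameter distributions and the closed-form amplitude "model" are all illustrative placeholders; the actual study runs a numerical hydrodynamic model for each catalog entry.

```python
# Toy Monte Carlo meteotsunami hazard curve; all numbers invented.
import numpy as np

rng = np.random.default_rng(0)
rate_per_year = 4.0            # assumed Poisson rate of squall lines
years = 10_000                 # length of one synthetic catalog
n = rng.poisson(rate_per_year * years)

speed = rng.normal(25.0, 5.0, n)        # m/s, disturbance propagation speed
pressure = rng.lognormal(0.0, 0.5, n)   # hPa, pressure-jump amplitude

# Placeholder amplitude model: resonance-like amplification when the
# disturbance speed matches an assumed long-wave phase speed c.
c = 22.0                                 # m/s
amplitude = 0.05 * pressure / (0.1 + np.abs(speed - c) / c)   # m

levels = np.linspace(0.01, 1.0, 100)     # m, exceedance levels
annual_rate = [(amplitude > a).sum() / years for a in levels]
# (levels, annual_rate) is the hazard curve: annualized exceedance
# rate versus maximum event amplitude at this hypothetical site.
```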

  15. Probabilistic logics and probabilistic networks

    CERN Document Server

    Haenni, Rolf; Wheeler, Gregory; Williamson, Jon; Andrews, Jill

    2014-01-01

    Probabilistic Logic and Probabilistic Networks presents a groundbreaking framework within which various approaches to probabilistic logic naturally fit. Additionally, the text shows how to develop computationally feasible methods to mesh with this framework.

  16. Learning Probabilistic Logic Models from Probabilistic Examples.

    Science.gov (United States)

    Chen, Jianzhong; Muggleton, Stephen; Santos, José

    2008-10-01

    We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats, using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) - to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible-worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and that the PILP models learned from probabilistic examples lead to a significant decrease in error, accompanied by improved insight from the learned results, compared with the PILP models learned from non-probabilistic examples.

  17. A new beam emission polarimetry diagnostic for measuring the magnetic field line angle at the plasma edge of ASDEX Upgrade.

    Science.gov (United States)

    Viezzer, E; Dux, R; Dunne, M G

    2016-11-01

    A new edge beam emission polarimetry diagnostic dedicated to the measurement of the magnetic field line angle has been installed on the ASDEX Upgrade tokamak. The new diagnostic relies on the motional Stark effect and is based on the simultaneous measurement of the polarization direction of the linearly polarized π (parallel to the electric field) and σ (perpendicular to the electric field) lines of the Balmer line Dα. The technical properties of the system are described. The calibration procedures are discussed and first measurements are presented.

  18. An Approach to On-line Risk Assessment in NPP

    International Nuclear Information System (INIS)

    Simic, Z.; Mikulicic, V.; O'Brien, J.

    1996-01-01

    Probabilistic Risk Assessment (PRA) can provide safety status information for a plant during different configurations; additional effort is needed to do this in real time for on-line operation. This paper describes an approach to using PRA to achieve these goals. A Risk Assessment On-Line (RAOL) application was developed to monitor maintenance (on-line and planned) activities. RAOL is based on results from a full-scope PRA and engineering/operational judgment, and incorporates a user-friendly program interface. Results from RAOL can be used by planners or operations staff to effectively manage the level of risk by controlling the actual plant configuration. (author)

  19. On synchronous parallel computations with independent probabilistic choice

    International Nuclear Information System (INIS)

    Reif, J.H.

    1984-01-01

    This paper introduces probabilistic choice to synchronous parallel machine models, in particular parallel RAMs. The power of probabilistic choice in parallel computations is illustrated by parallelizing some known probabilistic sequential algorithms. The authors characterize the computational complexity of time, space, and processor bounded probabilistic parallel RAMs in terms of the computational complexity of probabilistic sequential RAMs. They show that parallelism uniformly speeds up time bounded probabilistic sequential RAM computations by nearly a quadratic factor. They also show that probabilistic choice can be eliminated from parallel computations by introducing nonuniformity

  20. Nuclear Energy Research Initiative (NERI): On-Line Intelligent Self-Diagnostic Monitoring for Next Generation Nuclear Plants - Phase I Annual Report

    International Nuclear Information System (INIS)

    Bond, L.G.; Doctor, S.R.; Gilbert, R.W.; Jarrell, D.B.; Greitzer, F.L.; Meador, R.J.

    2000-01-01

    The objective of this project is to design and demonstrate the operation of the real-time intelligent self-diagnostic and prognostic system for next generation nuclear power plant systems. This new self-diagnostic technology is titled ''On-Line Intelligent Self-Diagnostic Monitoring System'' (SDMS). This project provides a proof-of-principle technology demonstration for SDMS on a pilot plant scale service water system, where a distributed array of sensors is integrated with active components and passive structures typical of next generation nuclear power reactor and plant systems. This project employs state-of-the-art sensors, instrumentation, and computer processing to improve the monitoring and assessment of the power reactor system and to provide diagnostic and automated prognostics capabilities

  1. On Probabilistic Alpha-Fuzzy Fixed Points and Related Convergence Results in Probabilistic Metric and Menger Spaces under Some Pompeiu-Hausdorff-Like Probabilistic Contractive Conditions

    OpenAIRE

    De la Sen, M.

    2015-01-01

    In the framework of complete probabilistic metric spaces and, in particular, in probabilistic Menger spaces, this paper investigates some relevant properties of convergence of sequences to probabilistic α-fuzzy fixed points under some types of probabilistic contractive conditions.

  2. Probabilistic safety assessment goals in Canada

    International Nuclear Information System (INIS)

    Snell, V.G.

    1986-01-01

    CANDU safety philosophy, both in design and in licensing, has always had a strong bias towards quantitative probabilistically-based goals derived from comparative safety. Formal probabilistic safety assessment began in Canada as a design tool. The influence of this carried over later into the definition of the deterministic safety guidelines used in CANDU licensing. Design goals were further developed which extended the consequence/frequency spectrum of 'acceptable' events from the two points defined by the deterministic single/dual failure analysis to a line passing through lower and higher frequencies. Since these were design tools, a complete risk summation was not necessary, allowing a cutoff at low event frequencies while preserving the identification of the most significant safety-related events. These goals gave a logical framework for making decisions on implementing design changes proposed as a result of the probabilistic safety analysis. Performing this analysis became a regulatory requirement, and the design goals remained the framework under which it was submitted. Recently, there have been initiatives to incorporate more detailed probabilistic safety goals into the regulatory process in Canada. These range from far-reaching safety optimization across society to initiatives aimed at the nuclear industry only. The effectiveness of the latter is minor at very low and very high event frequencies; at medium frequencies, a justification against expenditures per life saved in other industries should be part of the goal setting

  3. Quantitative diffusion tensor deterministic and probabilistic fiber tractography in relapsing-remitting multiple sclerosis

    International Nuclear Information System (INIS)

    Hu Bing; Ye Binbin; Yang Yang; Zhu Kangshun; Kang Zhuang; Kuang Sichi; Luo Lin; Shan Hong

    2011-01-01

    Purpose: Our aim was to study the quantitative fiber tractography variations and patterns in patients with relapsing-remitting multiple sclerosis (RRMS) and to assess the correlation between quantitative fiber tractography and the Expanded Disability Status Scale (EDSS). Material and methods: Twenty-eight patients with RRMS and 28 age-matched healthy volunteers underwent a diffusion tensor MR imaging study. Quantitative deterministic and probabilistic fiber tractography were generated in all subjects, and mean numbers of tracked lines and fiber density were counted. Paired-samples t tests were used to compare tracked lines and fiber density in RRMS patients with those in controls. A bivariate linear regression model was used to determine the relationship between quantitative fiber tractography and EDSS in RRMS. Results: Both deterministic and probabilistic tractography's tracked lines and fiber density in RRMS patients were lower than those in controls (P < .001). Both deterministic and probabilistic tractography's tracked lines and fiber density were negatively correlated with EDSS in RRMS (P < .001). The fiber tract disruptions and reductions in RRMS were directly visualized on fiber tractography. Conclusion: Changes of white matter tracts can be detected by quantitative diffusion tensor fiber tractography and correlate with clinical impairment in RRMS.

  4. Beam diagnostics and data acquisition system for ion beam transport line used in applied research

    International Nuclear Information System (INIS)

    Skuratov, V.A.; Didyk, A.Yu.; Arkhipov, A.V.; Illes, A.; Bodnar, K.; Illes, Z.; Havancsak, K.

    1999-01-01

    The ion beam transport line for applied research at the U-400 cyclotron, together with the beam diagnostics and data acquisition system for condensed matter studies, is described. The main features of the Windows-based real-time program are considered

  5. Effectiveness of Securities with Fuzzy Probabilistic Return

    Directory of Open Access Journals (Sweden)

    Krzysztof Piasecki

    2011-01-01

    Full Text Available The generalized fuzzy present value of a security is defined here as the fuzzy-valued utility of a cash flow. The generalized fuzzy present value cannot depend on the value of future cash flow. There exists a generalized fuzzy present value which is not a fuzzy present value in the sense given by some authors. If the present value is a fuzzy number and the future value is a random one, then the return rate is given as a probabilistic fuzzy subset on the real line. This kind of return rate is called a fuzzy probabilistic return. The main goal of this paper is to derive the family of effective securities with fuzzy probabilistic return. Achieving this goal requires the study of the basic parameters characterizing fuzzy probabilistic return; therefore, fuzzy expected value and variance are determined for this case of return. These results are the starting point for constructing a three-dimensional image. The set of effective securities is introduced as the Pareto optimal set determined by the maximization of the expected return rate and minimization of the variance. Finally, the set of effective securities is distinguished as a fuzzy set. These results are obtained without the assumption that the distribution of future values is Gaussian. (original abstract)

  6. NEW STRONG-LINE ABUNDANCE DIAGNOSTICS FOR H II REGIONS: EFFECTS OF κ-DISTRIBUTED ELECTRON ENERGIES AND NEW ATOMIC DATA

    Energy Technology Data Exchange (ETDEWEB)

    Dopita, Michael A.; Sutherland, Ralph S.; Nicholls, David C.; Kewley, Lisa J.; Vogt, Frédéric P. A., E-mail: Michael.Dopita@anu.edu.au [Research School of Astronomy and Astrophysics, Australian National University, Cotter Rd., Weston ACT 2611 (Australia)

    2013-09-01

    Recently, Nicholls et al., inspired by in situ observations of solar system astrophysical plasmas, suggested that the electrons in H II regions are characterized by a κ-distribution of energies rather than a simple Maxwell-Boltzmann distribution. Here, we have collected together new atomic data within a modified photoionization code to explore the effects of both the new atomic data and the κ-distribution on the strong-line techniques used to determine chemical abundances in H II regions. By comparing the recombination temperatures (T {sub rec}) with the forbidden line temperatures (T {sub FL}), we conclude that κ ∼ 20. While representing only a mild deviation from equilibrium, this result is sufficient to strongly influence abundances determined using methods that depend on measurements of the electron temperature from forbidden lines. We present a number of new emission line ratio diagnostics that cleanly separate the two parameters determining the optical spectrum of H II regions—the ionization parameter q or U and the chemical abundance, 12+log(O/H). An automated code to extract these parameters is presented. Using the homogeneous data set from van Zee et al., we find self-consistent results between all of these different diagnostics. The systematic errors between different line ratio diagnostics are much smaller than those found in the earlier strong-line work. Overall, the effect of the κ-distribution on the strong-line abundances derived solely on the basis of theoretical models is rather small.
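
    For reference, the κ-distribution of electron energies invoked above has the standard form (normalization omitted), which recovers the Maxwell-Boltzmann distribution in the limit κ → ∞; a mild deviation such as κ ∼ 20 mainly enhances the high-energy tail:

```latex
\begin{equation}
  f_\kappa(E) \;\propto\; \sqrt{E}\,
    \left[ 1 + \frac{E}{(\kappa - 3/2)\, k_B T} \right]^{-(\kappa + 1)}
\end{equation}
```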

  7. Nuclear Energy Research Initiative (NERI): On-Line Intelligent Self-Diagnostic Monitoring for Next Generation Nuclear Plants - Phase I Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    L. J. Bond; S. R. Doctor; R. W. Gilbert; D. B. Jarrell; F. L. Greitzer; R. J. Meador

    2000-09-01

    The objective of this project is to design and demonstrate the operation of the real-time intelligent self-diagnostic and prognostic system for next generation nuclear power plant systems. This new self-diagnostic technology is titled ''On-Line Intelligent Self-Diagnostic Monitoring System'' (SDMS). This project provides a proof-of-principle technology demonstration for SDMS on a pilot plant scale service water system, where a distributed array of sensors is integrated with active components and passive structures typical of next generation nuclear power reactor and plant systems. This project employs state-of-the-art sensors, instrumentation, and computer processing to improve the monitoring and assessment of the power reactor system and to provide diagnostic and automated prognostics capabilities.

  8. Use and Communication of Probabilistic Forecasts.

    Science.gov (United States)

    Raftery, Adrian E

    2016-12-01

    Probabilistic forecasts are becoming more and more available. How should they be used and communicated? What are the obstacles to their use in practice? I review experience with five problems where probabilistic forecasting played an important role. This leads me to identify five types of potential users: Low Stakes Users, who don't need probabilistic forecasts; General Assessors, who need an overall idea of the uncertainty in the forecast; Change Assessors, who need to know if a change is out of line with expectations; Risk Avoiders, who wish to limit the risk of an adverse outcome; and Decision Theorists, who quantify their loss function and perform the decision-theoretic calculations. This suggests that it is important to interact with users and to consider their goals. The cognitive research tells us that calibration is important for trust in probability forecasts, and that it is important to match the verbal expression with the task. The cognitive load should be minimized, reducing the probabilistic forecast to a single percentile if appropriate. Probabilities of adverse events and percentiles of the predictive distribution of quantities of interest seem often to be the best way to summarize probabilistic forecasts. Formal decision theory has an important role, but in a limited range of applications.

  9. Use and Communication of Probabilistic Forecasts

    Science.gov (United States)

    Raftery, Adrian E.

    2015-01-01

    Probabilistic forecasts are becoming more and more available. How should they be used and communicated? What are the obstacles to their use in practice? I review experience with five problems where probabilistic forecasting played an important role. This leads me to identify five types of potential users: Low Stakes Users, who don’t need probabilistic forecasts; General Assessors, who need an overall idea of the uncertainty in the forecast; Change Assessors, who need to know if a change is out of line with expectations; Risk Avoiders, who wish to limit the risk of an adverse outcome; and Decision Theorists, who quantify their loss function and perform the decision-theoretic calculations. This suggests that it is important to interact with users and to consider their goals. The cognitive research tells us that calibration is important for trust in probability forecasts, and that it is important to match the verbal expression with the task. The cognitive load should be minimized, reducing the probabilistic forecast to a single percentile if appropriate. Probabilities of adverse events and percentiles of the predictive distribution of quantities of interest seem often to be the best way to summarize probabilistic forecasts. Formal decision theory has an important role, but in a limited range of applications. PMID:28446941

  10. A Methodology for Probabilistic Accident Management

    International Nuclear Information System (INIS)

    Munteanu, Ion; Aldemir, Tunc

    2003-01-01

    While techniques have been developed to tackle different tasks in accident management, there have been very few attempts to develop an on-line operator assistance tool for accident management, and none can be found in the literature that uses probabilistic arguments, which are important in today's licensing climate. The state/parameter estimation capability of the dynamic system doctor (DSD) approach is combined with the dynamic event-tree generation capability of the integrated safety assessment (ISA) methodology to address this issue. The DSD uses the cell-to-cell mapping technique for system representation, which models the system evolution in terms of probabilities of transitions in time between sets of user-defined parameter/state variable magnitude intervals (cells) within a user-specified time interval (e.g., the data sampling interval). The cell-to-cell transition probabilities are obtained from the given system model. The ISA follows the system dynamics in tree form and branches every time a setpoint for system/operator intervention is exceeded. The combined approach (a) can automatically account for uncertainties in the monitored system state, inputs, and modeling uncertainties through the appropriate choice of the cells, as well as providing a probabilistic measure to rank the likelihood of possible system states in view of these uncertainties; (b) allows flexibility in system representation; (c) yields the lower and upper bounds on the estimated values of state variables/parameters as well as their expected values; and (d) leads to fewer branchings in the dynamic event-tree generation. Using a simple but realistic pressurizer model, the potential use of the DSD-ISA methodology for on-line probabilistic accident management is illustrated
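
    The cell-to-cell mapping bookkeeping can be sketched in a few lines: discretize a scalar state variable into cells, estimate the cell-to-cell transition matrix by sampling a model with uncertain inputs over one sampling interval, and propagate the cell-probability vector. The toy model below is illustrative and is not the DSD code.

```python
# Cell-to-cell mapping sketch for a scalar state variable.
import numpy as np

rng = np.random.default_rng(1)
edges = np.linspace(0.0, 10.0, 21)   # cell boundaries
n = len(edges) - 1

def model_step(x):
    """One sampling interval of a toy dynamic model with uncertain input."""
    return 0.95 * x + rng.normal(0.3, 0.2)

# Estimate the transition probabilities cell -> cell by Monte Carlo.
P = np.zeros((n, n))
for i in range(n):
    x0 = rng.uniform(edges[i], edges[i + 1], 2000)
    j = np.clip(np.digitize([model_step(x) for x in x0], edges) - 1, 0, n - 1)
    P[i] = np.bincount(j, minlength=n) / len(j)

p = np.zeros(n)
p[4] = 1.0                   # state initially known to lie in cell 4
for _ in range(10):          # ten sampling intervals later:
    p = p @ P
print(p.argmax(), p.max())   # most likely cell and its probability
```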

  11. Architecture for Integrated Medical Model Dynamic Probabilistic Risk Assessment

    Science.gov (United States)

    Jaworske, D. A.; Myers, J. G.; Goodenow, D.; Young, M.; Arellano, J. D.

    2016-01-01

    Probabilistic Risk Assessment (PRA) is a modeling tool used to predict potential outcomes of a complex system based on a statistical understanding of many initiating events. Utilizing a Monte Carlo method, thousands of instances of the model are considered and outcomes are collected. PRA is considered static, utilizing probabilities alone to calculate outcomes. Dynamic Probabilistic Risk Assessment (dPRA) is an advanced concept where modeling predicts the outcomes of a complex system based not only on the probabilities of many initiating events, but also on a progression of dependencies brought about by progressing down a time line. Events are placed in a single time line, adding each event to a queue, as managed by a planner. Progression down the time line is guided by rules, as managed by a scheduler. The recently developed Integrated Medical Model (IMM) summarizes astronaut health as governed by the probabilities of medical events and mitigation strategies. Managing the software architecture process provides a systematic means of creating, documenting, and communicating a software design early in the development process. The software architecture process begins with establishing requirements and the design is then derived from the requirements.

  12. Computerized systems for on-line management of failures: a state-of-the-art discussion of alarm systems and diagnostic systems applied in the nuclear industry

    International Nuclear Information System (INIS)

    Kim, I.S.

    1994-01-01

    It is now well perceived in the nuclear industry that improving plant information systems is vital for enhancing the operational safety of nuclear power plants. Considerable work is underway worldwide to support operators' decision-making, particularly in their difficult tasks of managing process anomalies on-line. The work includes development of (1) advanced alarm systems, such as various kinds of computer-based alarm processing systems, the Critical Function Monitoring System, the Success Path Monitoring System and Safety Assessment System II, and (2) real-time diagnostic systems, such as the Disturbance Analysis System, Maryland Operator Advisory System II, Model-Integrated Diagnostic Analysis System, Diagnosis System using Knowledge Engineering Technique, Detailed Diagnosis, and Operator Advisor System. This paper presents a state-of-the-art review of plant information systems for on-line management of failures in nuclear power plants, focusing on the methodological features of computerized alarm systems and diagnostic systems. (author)

  13. Impact of external events on site evaluation: a probabilistic approach

    International Nuclear Information System (INIS)

    Jaccarino, E.; Giuliani, P.; Zaffiro, C.

    1975-01-01

    A probabilistic method is proposed for definition of the reference external events of nuclear sites. The external events taken into account are earthquakes, floods and tornadoes. On the basis of the available historical data for each event it is possible to perform statistical analyses to determine the probability of occurrence on site of events of given characteristics. For earthquakes, the method of analysis takes into consideration both the annual frequency of seismic events in Italy and the probabilistic distribution of areas stricken by each event. For floods, the methods of analysis of hydrological data and the basic criteria for the determination of design events are discussed and the general lines of the hydraulic analysis of a nuclear site are shown. For tornadoes, the statistical analysis has been performed for the events which occurred in Italy during the last 40 years; these events have been classified according to an empirical intensity scale. The probability of each reference event should be a function of the potential radiological damage associated with the particular type of plant which must be installed on the site. Thus the reference event could be chosen such that for the whole of the national territory the risk for safety and environmental protection is the same. (author)

  14. Global Infrasound Association Based on Probabilistic Clutter Categorization

    Science.gov (United States)

    Arora, Nimar; Mialle, Pierrick

    2016-04-01

    The IDC advances its methods and continuously improves its automatic system for infrasound technology. The IDC focuses on enhancing the automatic system for the identification of valid signals and the optimization of the network detection threshold by identifying ways to refine the signal characterization methodology and association criteria. An objective of this study is to reduce the number of associated infrasound arrivals that are rejected from the automatic bulletins when generating the reviewed event bulletins. Indeed, a considerable number of signal detections are due to local clutter sources such as microbaroms, waterfalls, dams, gas flares, surf (ocean breaking waves), etc. These sources are either too diffuse or too local to form events. Worse still, the repetitive nature of this clutter leads to a large number of false event hypotheses due to the random matching of clutter at multiple stations. Previous studies, for example [1], have worked on the categorization of clutter using long-term trends in detection azimuth, frequency, and amplitude at each station. In this work we continue the same line of reasoning to build a probabilistic model of clutter that is used as part of NETVISA [2], a Bayesian approach to network processing. The resulting model is a fusion of seismic, hydroacoustic and infrasound processing built on a unified probabilistic framework. References: [1] Infrasound categorization: Towards a statistics-based approach. J. Vergoz, P. Gaillard, A. Le Pichon, N. Brachet, and L. Ceranna. ITW 2011. [2] NETVISA: Network Processing Vertically Integrated Seismic Analysis. N. S. Arora, S. Russell, and E. Sudderth. BSSA 2013.

  15. Dynamic Fault Diagnosis for Nuclear Installation Using Probabilistic Approach

    International Nuclear Information System (INIS)

    Djoko Hari Nugroho; Deswandri; Ahmad Abtokhi; Darlis

    2003-01-01

    A probabilistic fault diagnosis method, representing the cause-consequence relationships between events for troubleshooting, is developed in this research based on Bayesian networks. On-line data from sensors, together with system/component reliability in the cause nodes, is expected to increase the belief level of the Bayesian networks. (author)

  16. Fast and sensitive medical diagnostic protocol based on integrating circular current lines for magnetic washing and optical detection of fluorescent magnetic nanobeads

    Directory of Open Access Journals (Sweden)

    Jaiyam Sharma

    2016-07-01

    Full Text Available Magnetic nanoparticles (MNPs) are increasingly being used as 'magnetic labels' in medical diagnostics. Practical applications of MNPs necessitate reducing their non-specific interactions with sensor surfaces, which result in noise in measurements. Here we describe the design and implementation of a sensing platform that incorporates circular-shaped current lines that reduce non-specific binding by enabling the 'magnetic washing' of MNPs loosely attached to the sensor surface. Generating magnetic fields by passing electrical currents through the circular-shaped current lines enabled the capture and collection of fluorescent MNPs more efficiently and effectively than the straight current lines reported to date. The use of fluorescent MNPs allows their optical detection, rather than detection with the widely used magnetoresistive sensors; as a result, our approach is not affected by magnetic noise due to the flow of currents. Our design is expected to improve the speed, accuracy, and sensitivity of MNP-based medical diagnostics. Keywords: Biosensors, Magnetic beads, Fluorescent magnetic nanoparticles, Lab on chip, Point of care testing

  17. Efficient probabilistic model checking on general purpose graphic processors

    NARCIS (Netherlands)

    Bosnacki, D.; Edelkamp, S.; Sulewski, D.; Pasareanu, C.S.

    2009-01-01

    We present algorithms for parallel probabilistic model checking on general purpose graphic processing units (GPGPUs). For this purpose we exploit the fact that some of the basic algorithms for probabilistic model checking rely on matrix vector multiplication. Since this kind of linear algebraic
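
    To make the linear-algebraic core concrete: bounded reachability probabilities in a discrete-time Markov chain are computed by repeated matrix-vector products with the transition matrix, which is the operation such GPGPU implementations parallelize. A serial sketch with an invented four-state chain:

```python
# Bounded reachability in a DTMC by repeated mat-vec products.
import numpy as np

# Row-stochastic transition matrix; state 3 is the goal (absorbing).
P = np.array([
    [0.5, 0.3, 0.2, 0.0],
    [0.0, 0.6, 0.2, 0.2],
    [0.1, 0.0, 0.4, 0.5],
    [0.0, 0.0, 0.0, 1.0],
])

x = np.array([0.0, 0.0, 0.0, 1.0])   # P(goal reached within 0 steps)
for _ in range(25):
    x = P @ x                        # P(goal reached within k+1 steps)
print(x)  # per-state probability of reaching the goal within 25 steps
```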

  18. Probabilistic Structural Analysis of SSME Turbopump Blades: Probabilistic Geometry Effects

    Science.gov (United States)

    Nagpal, V. K.

    1985-01-01

    A probabilistic study was initiated to evaluate the effects of geometric and material property tolerances on the structural response of turbopump blades. To complete this study, a number of important probabilistic variables were identified which are conceived to affect the structural response of the blade. In addition, a methodology was developed to statistically quantify the influence of these probabilistic variables in an optimized way. The identified variables include random geometric and material property perturbations, different loadings, and a probabilistic combination of these loadings. Influences of these probabilistic variables are planned to be quantified by evaluating the blade structural response. Studies of the geometric perturbations were conducted for a flat plate geometry as well as for a space shuttle main engine blade geometry using a special-purpose code based on the finite element approach. Analyses indicate that the variances of the perturbations about given mean values have a significant influence on the response.

  19. Wood pole overhead lines

    CERN Document Server

    Wareing, Brian

    2005-01-01

    This new book concentrates on the mechanical aspects of distribution wood pole lines, including live line working, environmental influences, climate change and international standards. Other topics include statutory requirements, safety, profiling, traditional and probabilistic design, weather loads, bare and covered conductors, different types of overhead systems, conductor choice, construction and maintenance. A section has also been devoted to the topic of lightning, which is one of the major sources of faults on overhead lines. The book focuses on the effects of this problem and the strate

  20. Probabilistic Networks

    DEFF Research Database (Denmark)

    Jensen, Finn Verner; Lauritzen, Steffen Lilholt

    2001-01-01

    This article describes the basic ideas and algorithms behind specification and inference in probabilistic networks based on directed acyclic graphs, undirected graphs, and chain graphs.

  1. Initiating Events Modeling for On-Line Risk Monitoring Application

    International Nuclear Information System (INIS)

    Simic, Z.; Mikulicic, V.

    1998-01-01

    In order to make the on-line risk monitoring application of Probabilistic Risk Assessment more complete and realistic, special attention needs to be dedicated to initiating events modeling. Two different issues are of special importance: the first is how to model initiating event frequencies according to the current plant configuration (equipment alignment and out-of-service status) and operating conditions (weather and various activities); the second is how to preserve the dependencies between the initiating events model and the rest of the PRA model. First, the paper will discuss how initiating events can be treated in an on-line risk monitoring application. Second, a practical example of initiating events modeling in EPRI's Equipment Out of Service on-line monitoring tool will be presented. Gains from the application and possible improvements will be discussed in the conclusion. (author)
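
    One simple way to realize configuration-dependent initiating event frequencies, sketched below with invented numbers and condition names (not the actual model of the EPRI tool), is to scale a baseline frequency by multipliers for the currently active plant conditions.

```python
# Hypothetical configuration-dependent initiating event frequency.
BASELINE_FREQ = {"loss_of_offsite_power": 3.0e-2}   # events per year, assumed

CONDITION_MULTIPLIERS = {            # illustrative values only
    "switchyard_maintenance": 2.5,   # equipment alignment / out of service
    "severe_weather_watch": 4.0,     # operating condition
}

def current_frequency(ie: str, active_conditions: list[str]) -> float:
    """Baseline frequency scaled by the active configuration multipliers."""
    f = BASELINE_FREQ[ie]
    for cond in active_conditions:
        f *= CONDITION_MULTIPLIERS.get(cond, 1.0)
    return f

print(current_frequency("loss_of_offsite_power",
                        ["switchyard_maintenance", "severe_weather_watch"]))
```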

  2. Diagnostic tools used in the calibration and verification of protein crystallography synchrotron beam lines and apparatus

    International Nuclear Information System (INIS)

    Rotella, F.J.; Alkire, R.W.; Duke, N.E.C.; Molitsky, M.J.

    2011-01-01

    Diagnostic tools have been developed for use at the Structural Biology Center beam lines at the Advanced Photon Source. These tools are used in the calibration and operating verification of these synchrotron X-ray beam lines and constituent equipment.

  3. SDSS-IV MaNGA: the impact of diffuse ionized gas on emission-line ratios, interpretation of diagnostic diagrams and gas metallicity measurements

    Science.gov (United States)

    Zhang, Kai; Yan, Renbin; Bundy, Kevin; Bershady, Matthew; Haffner, L. Matthew; Walterbos, René; Maiolino, Roberto; Tremonti, Christy; Thomas, Daniel; Drory, Niv; Jones, Amy; Belfiore, Francesco; Sánchez, Sebastian F.; Diamond-Stanic, Aleksandar M.; Bizyaev, Dmitry; Nitschelm, Christian; Andrews, Brett; Brinkmann, Jon; Brownstein, Joel R.; Cheung, Edmond; Li, Cheng; Law, David R.; Roman Lopes, Alexandre; Oravetz, Daniel; Pan, Kaike; Storchi Bergmann, Thaisa; Simmons, Audrey

    2017-04-01

    Diffuse ionized gas (DIG) is prevalent in star-forming galaxies. Using a sample of 365 nearly face-on star-forming galaxies observed by Mapping Nearby Galaxies at APO, we demonstrate how DIG in star-forming galaxies impacts the measurements of emission-line ratios, hence the interpretation of diagnostic diagrams and gas-phase metallicity measurements. At fixed metallicity, DIG-dominated low ΣHα regions display enhanced [S II]/Hα, [N II]/Hα, [O II]/Hβ and [O I]/Hα. The gradients in these line ratios are determined by metallicity gradients and ΣHα. In line ratio diagnostic diagrams, contamination by DIG moves H II regions towards composite or low-ionization nuclear emission-line region (LI(N)ER)-like regions. A harder ionizing spectrum is needed to explain DIG line ratios. Leaky H II region models can only shift line ratios slightly relative to H II region models, and thus fail to explain the composite/LI(N)ER line ratios displayed by DIG. Our result favours ionization by evolved stars as a major ionization source for DIG with LI(N)ER-like emission. DIG can significantly bias the measurement of gas metallicity and metallicity gradients derived using strong-line methods. Metallicities derived using N2O2 are optimal because they exhibit the smallest bias and error. Using O3N2, R23, N2 = [N II]/Hα and N2S2Hα to derive metallicities introduces bias in the derived metallicity gradients as large as the gradient itself. The strong-line method of Blanc et al. (IZI hereafter) cannot be applied to DIG to get an accurate metallicity because it currently contains only H II region models that fail to describe the DIG.

  4. Using technique vibration diagnostics for assessing the quality of power transmission line supports repairs

    Directory of Open Access Journals (Sweden)

    Cherpakov Aleksander

    2017-01-01

    Full Text Available The method considered for assessing the quality of repair work to restore the rack supports of power transmission lines is based on vibration diagnostics. Power transmission line supports with symmetrical destruction of the protective layer of concrete at the ground line, in violation of the design cross-section, were chosen as the object of study. The finite element modelling package Ansys was used in assessing the quality of the repair work. An example is given of evaluating repair quality using the relative adhesion area of the defective section as the design criterion in the analysis of natural vibration frequencies.

  5. Probabilistic Decision Graphs - Combining Verification and AI Techniques for Probabilistic Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2004-01-01

    We adopt probabilistic decision graphs developed in the field of automated verification as a tool for probabilistic model representation and inference. We show that probabilistic inference has linear time complexity in the size of the probabilistic decision graph, that the smallest probabilistic ...

  6. A Practical Probabilistic Graphical Modeling Tool for Weighing Ecological Risk-Based Evidence

    Science.gov (United States)

    Past weight-of-evidence frameworks for adverse ecological effects have provided soft-scoring procedures for judgments based on the quality and measured attributes of evidence. Here, we provide a flexible probabilistic structure for weighing and integrating lines of evidence for e...

  7. Characterizing Neutron Diagnostics on the nTOF Line at SUNY Geneseo

    Science.gov (United States)

    Harrison, Hannah; Seppala, Hannah; Visca, Hannah; Wakwella, Praveen; Fletcher, Kurt; Padalino, Stephen; Forrest, Chad; Regan, Sean; Sangster, Craig

    2016-10-01

    Charged particle beams from SUNY Geneseo's 1.7 MV Tandem Pelletron Accelerator induce nuclear reactions that emit neutrons ranging from 0.5 to 17.9 MeV via 2H(d,n)3He and 11B(d,n)12C. This adjustable neutron source can be used to calibrate ICF and HEDP neutron scintillators for ICF diagnostics. However, gamma rays and muons, which are often present during an accelerator-based calibration, are difficult to differentiate from neutron signals in scintillators. To mitigate this problem, a new neutron time-of-flight (nTOF) line has been constructed. The nTOF timing is measured using the associated particle technique. A charged particle produced by the nuclear reaction serves as a start signal, while its associated neutron is the stop signal. Each reaction is analyzed event-by-event to determine whether the scintillator signal was generated by a neutron, gamma or muon. Using this nTOF technique, the neutron response for different scintillation detectors can be determined. Funded in part by a LLE contract through the DOE.
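
    The time-of-flight arithmetic behind the associated-particle technique is compact: the charged particle starts the clock, the associated neutron stops it, and the flight time over a known path gives the neutron energy. A non-relativistic sketch with an illustrative d-d neutron:

```python
# Neutron energy from time of flight (non-relativistic approximation).
M_N = 939.565e6         # neutron rest mass energy, eV
C = 299_792_458.0       # speed of light, m/s

def neutron_energy_mev(flight_path_m: float, tof_ns: float) -> float:
    v = flight_path_m / (tof_ns * 1e-9)       # neutron speed, m/s
    return 0.5 * M_N * (v / C) ** 2 / 1e6     # kinetic energy, MeV

# A ~2.45 MeV d-d neutron over a 2 m flight path takes roughly 92 ns:
print(neutron_energy_mev(2.0, 92.4))
```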

  8. Some thoughts on the future of probabilistic structural design of nuclear components

    International Nuclear Information System (INIS)

    Stancampiano, P.A.

    1978-01-01

    This paper presents some views on the future role of probabilistic methods in the structural design of nuclear components. The existing deterministic design approach is discussed and compared to the probabilistic approach. Some of the objections to both deterministic and probabilistic design are listed. Extensive research and development activities are required to mature the probabilistic approach sufficiently to make it cost-effective and competitive with current deterministic design practices. The required research activities deal with probabilistic methods development, more realistic causal failure mode models development, and statistical data models development. A quasi-probabilistic structural design approach is recommended which accounts for the random error in the design models. (Auth.)

  9. Modelling of JET diagnostics using Bayesian Graphical Models

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, J. [IPP Greifswald, Greifswald (Germany); Ford, O. [Imperial College, London (United Kingdom); McDonald, D.; Hole, M.; Nessi, G. von; Meakins, A.; Brix, M.; Thomsen, H.; Werner, A.; Sirinelli, A.

    2011-07-01

    The mapping between the physics parameters (such as densities, currents, flows, temperatures, etc.) defining the plasma 'state' under a given model and the raw observations of each plasma diagnostic 1) depends on the particular physics model used, and 2) is inherently probabilistic, owing to uncertainties in both the observations and instrumental aspects of the mapping, such as calibrations, instrument functions, etc. A flexible and principled way of modelling such interconnected probabilistic systems is through so-called Bayesian graphical models. Being an amalgam of graph theory and probability theory, Bayesian graphical models can simulate the complex interconnections between physics models and diagnostic observations from multiple heterogeneous diagnostic systems, making it relatively easy to optimally combine the observations from multiple diagnostics for joint inference on parameters of the underlying physics model, which can itself be represented as part of the graph. At JET about 10 diagnostic systems have to date been modelled in this way, which has led to a number of new results, including: the reconstruction of the flux surface topology and q-profiles without any specific equilibrium assumption, using information from a number of different diagnostic systems; profile inversions taking into account the uncertainties in the flux surface positions; and a substantial increase in accuracy of JET electron density and temperature profiles, including improved pedestal resolution, through the joint analysis of three diagnostic systems. It is believed that the Bayesian graph approach could potentially be utilised for very large sets of diagnostics, providing a generic data analysis framework for nuclear fusion experiments that would be able to optimally utilize the information from multiple diagnostics simultaneously, and where the explicit graph representation of the connections to underlying physics models could be used for sophisticated model testing. This

  10. Forbidden lines of highly ionized ions for localized plasma diagnostics

    International Nuclear Information System (INIS)

    Hinnov, E.; Fonck, R.; Suckewer, S.

    1980-06-01

    Numerous optically forbidden lines resulting from magnetic dipole transitions in low-lying electron configurations of highly ionized Fe, Ti and Cr atoms have been identified in PLT and PDX tokamak discharges, and applied for localized diagnostics in the high-temperature (0.5 to 3.0 keV) interior of these plasmas. The measurements include determination of local ion densities and their variation in time, and of ion motions (ion temperature, plasma rotations) through the Doppler effect on the lines. These forbidden lines are particularly appropriate for such measurements because under typical tokamak conditions their emissivities are quite high (10^11 to 10^14 photons/cm^3-sec), and their relatively long wavelengths allow the use of intricate optical techniques and instrumentation. The spatial location of the emissivity is directly measurable, and tends to occur near radii where the ionization potential of the ion in question is equal to the local electron temperature. In future larger and presumably higher-temperature tokamaks, analogous measurements with somewhat heavier atoms, particularly krypton and perhaps zirconium, appear both feasible and desirable

  11. Line Shape Modeling for the Diagnostic of the Electron Density in a Corona Discharge

    Directory of Open Access Journals (Sweden)

    Joël Rosato

    2017-09-01

    We present an analysis of spectra observed in a corona discharge designed for the study of dielectrics in electrical engineering. The medium is a gas of helium and the discharge was performed in the vicinity of a tip electrode under high voltage. The shape of the helium lines is dominated by Stark broadening due to the plasma microfield. Using a computer simulation method, we examine the sensitivity of the He 492 nm line shape to the electron density. Our results indicate the possibility of a density diagnostic based on passive spectroscopy. The influence of collisional broadening due to interactions between the emitters and neutrals is discussed.
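
    A hedged sketch of the resulting density diagnostic: once line-shape simulations provide a calibration of the Stark FWHM against electron density, a measured width can be inverted for n_e. The power-law calibration constants below are placeholders, not the paper's simulation results:

    ```python
    n_ref, w_ref, alpha = 1e21, 0.05, 1.0   # m^-3, nm, exponent (all assumed)

    def electron_density(w_measured_nm):
        """Invert an assumed calibration w = w_ref * (n_e / n_ref)**alpha."""
        return n_ref * (w_measured_nm / w_ref) ** (1.0 / alpha)

    print(f"n_e ≈ {electron_density(0.08):.2e} m^-3")
    ```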

  12. Efficient probabilistic inference in generic neural networks trained with non-probabilistic feedback.

    Science.gov (United States)

    Orhan, A Emin; Ma, Wei Ji

    2017-07-26

    Animals perform near-optimal probabilistic inference in a wide range of psychophysical tasks. Probabilistic inference requires trial-to-trial representation of the uncertainties associated with task variables and subsequent use of this representation. Previous work has implemented such computations using neural networks with hand-crafted and task-dependent operations. We show that generic neural networks trained with a simple error-based learning rule perform near-optimal probabilistic inference in nine common psychophysical tasks. In a probabilistic categorization task, error-based learning in a generic network simultaneously explains a monkey's learning curve and the evolution of qualitative aspects of its choice behavior. In all tasks, the number of neurons required for a given level of performance grows sublinearly with the input population size, a substantial improvement on previous implementations of probabilistic inference. The trained networks develop a novel sparsity-based probabilistic population code. Our results suggest that probabilistic inference emerges naturally in generic neural networks trained with error-based learning rules. Behavioural tasks often require probability distributions to be inferred about task-specific variables. Here, the authors demonstrate that generic neural networks can be trained using a simple error-based learning rule to perform such probabilistic computations efficiently without any need for task-specific operations.
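
    The core claim can be reproduced in miniature: a single sigmoid unit trained with a plain delta rule on binary category feedback converges toward the Bayes posterior of a toy categorization task. This is only an illustration, far simpler than the networks and the nine tasks studied in the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    sigma = 1.0                     # stimulus noise; classes have means -1 and +1
    w, b, lr = 0.0, 0.0, 0.05

    for _ in range(20000):
        c = rng.integers(0, 2)                    # true class (0 or 1)
        s = (1.0 if c else -1.0) + rng.normal(0, sigma)
        p = 1.0 / (1.0 + np.exp(-(w * s + b)))    # network output
        w += lr * (c - p) * s                     # error-based (delta) updates
        b += lr * (c - p)

    for s in (-1.0, 0.0, 1.0):                    # Bayes: P(C=1|s) = sigmoid(2s/sigma^2)
        net = 1.0 / (1.0 + np.exp(-(w * s + b)))
        bayes = 1.0 / (1.0 + np.exp(-2.0 * s / sigma ** 2))
        print(f"s={s:+.1f}  network={net:.3f}  bayes={bayes:.3f}")
    ```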

  13. Non-probabilistic defect assessment for structures with cracks based on interval model

    International Nuclear Information System (INIS)

    Dai, Qiao; Zhou, Changyu; Peng, Jian; Chen, Xiangwei; He, Xiaohua

    2013-01-01

    Highlights: • Non-probabilistic approach is introduced to defect assessment. • Definition and establishment of IFAC are put forward. • Determination of assessment rectangle is proposed. • Solution of non-probabilistic reliability index is presented. -- Abstract: Traditional defect assessment methods conservatively treat uncertainty of parameters as safety factors, while the probabilistic method is based on a clear understanding of detailed statistical information of parameters. In this paper, the non-probabilistic approach is introduced to the failure assessment diagram (FAD) to propose a non-probabilistic defect assessment method for structures with cracks. This novel defect assessment method contains three critical processes: establishment of the interval failure assessment curve (IFAC), determination of the assessment rectangle, and solution of the non-probabilistic reliability degree. Based on the interval theory, uncertain parameters such as crack sizes, material properties and loads are considered as interval variables. As a result, the failure assessment curve (FAC) will vary in a certain range, which is defined as the IFAC. And the assessment point will vary within a rectangular zone, which is defined as the assessment rectangle. Based on the interval model, the establishment of the IFAC and the determination of the assessment rectangle are presented. Then, according to the interval possibility degree method, the non-probabilistic reliability degree of the IFAC can be determined. Meanwhile, in order to clearly introduce the non-probabilistic defect assessment method, a numerical example for the assessment of a pipe with a crack is given. In addition, the assessment result of the proposed method is compared with that of the traditional probabilistic method, which confirms that this non-probabilistic defect assessment can reasonably resolve practical problems with interval variables.
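
    A rough sketch of the interval idea: propagate interval-valued inputs to the assessment point by vertex enumeration, giving the assessment rectangle, and test it against a failure assessment curve (here the familiar R6 Option 1 curve; the paper's IFAC and its non-probabilistic reliability degree are more elaborate). All intervals are invented:

    ```python
    import numpy as np
    from itertools import product

    def fac(lr):
        """R6 Option 1 failure assessment curve."""
        return (1 - 0.14 * lr ** 2) * (0.3 + 0.7 * np.exp(-0.65 * lr ** 6))

    Lr = (0.8, 1.2)          # load ratio interval (assumed)
    K_app = (30.0, 40.0)     # applied stress intensity, MPa*m^0.5 (assumed)
    K_mat = (90.0, 110.0)    # fracture toughness interval (assumed)

    corners = [(lr, k / km) for lr, k, km in product(Lr, K_app, K_mat)]
    lr_hi = max(c[0] for c in corners)
    kr_hi = max(c[1] for c in corners)

    print(f"assessment rectangle corner: Lr={lr_hi}, Kr={kr_hi:.3f}")
    print("worst corner inside the FAC?", kr_hi <= fac(lr_hi))
    ```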

  14. Non-probabilistic defect assessment for structures with cracks based on interval model

    Energy Technology Data Exchange (ETDEWEB)

    Dai, Qiao; Zhou, Changyu, E-mail: changyu_zhou@163.com; Peng, Jian; Chen, Xiangwei; He, Xiaohua

    2013-09-15

    Highlights: • Non-probabilistic approach is introduced to defect assessment. • Definition and establishment of IFAC are put forward. • Determination of assessment rectangle is proposed. • Solution of non-probabilistic reliability index is presented. -- Abstract: Traditional defect assessment methods conservatively treat uncertainty of parameters as safety factors, while the probabilistic method is based on a clear understanding of detailed statistical information of parameters. In this paper, the non-probabilistic approach is introduced to the failure assessment diagram (FAD) to propose a non-probabilistic defect assessment method for structures with cracks. This novel defect assessment method contains three critical processes: establishment of the interval failure assessment curve (IFAC), determination of the assessment rectangle, and solution of the non-probabilistic reliability degree. Based on the interval theory, uncertain parameters such as crack sizes, material properties and loads are considered as interval variables. As a result, the failure assessment curve (FAC) will vary in a certain range, which is defined as the IFAC. And the assessment point will vary within a rectangular zone, which is defined as the assessment rectangle. Based on the interval model, the establishment of the IFAC and the determination of the assessment rectangle are presented. Then, according to the interval possibility degree method, the non-probabilistic reliability degree of the IFAC can be determined. Meanwhile, in order to clearly introduce the non-probabilistic defect assessment method, a numerical example for the assessment of a pipe with a crack is given. In addition, the assessment result of the proposed method is compared with that of the traditional probabilistic method, which confirms that this non-probabilistic defect assessment can reasonably resolve practical problems with interval variables.

  15. Influence of probabilistic safety analysis on design and operation of PWR plants

    International Nuclear Information System (INIS)

    Bastl, W.; Hoertner, H.; Kafka, P.

    1978-01-01

    This paper gives a comprehensive presentation of the connections and influences of probabilistic safety analysis on the design and operation of PWR plants. In this context, a short historical retrospective of probabilistic reliability analysis is given. In the main part of the paper, some examples are presented in detail, showing specific outcomes of such probabilistic investigations. Additional paragraphs illustrate some activities and issues in the field of probabilistic safety analysis.

  16. Probabilistic Logical Characterization

    DEFF Research Database (Denmark)

    Hermanns, Holger; Parma, Augusto; Segala, Roberto

    2011-01-01

    Probabilistic automata exhibit both probabilistic and non-deterministic choice. They are therefore a powerful semantic foundation for modeling concurrent systems with random phenomena arising in many applications ranging from artificial intelligence, security, and systems biology to performance modeling. Several variations of bisimulation and simulation relations have proved to be useful as means to abstract and compare different automata. This paper develops a taxonomy of logical characterizations of these relations on image-finite and image-infinite probabilistic automata.

  17. Confluence reduction for probabilistic systems

    NARCIS (Netherlands)

    Timmer, Mark; van de Pol, Jan Cornelis; Stoelinga, Mariëlle Ida Antoinette

    In this presentation we introduce a novel technique for state space reduction of probabilistic specifications, based on a newly developed notion of confluence for probabilistic automata. We proved that this reduction preserves branching probabilistic bisimulation and can be applied on-the-fly.

  18. Diagnostics for the 1.5 GeV Transport Line at the NSRRC

    CERN Document Server

    Hu, K H; Hsu, K T; Kuo, C H; Lee, D; Wang, C J; Yang, Y T

    2005-01-01

    The extracted 1.5 GeV electron beams from the booster synchrotron are transported via a transport line and injected into the storage ring. This booster-to-storage-ring transport line is equipped with stripline beam position monitors, integrated current transformers, a fast current transformer, and screen monitors. Commercial log-ratio BPM electronics were adopted to process the 500 MHz bunch signal directly. The position of the passing beam is digitized by a VME analog interface. The transmission efficiency is measured by the integrated current transformers. Screen monitors are used to support routine operation. This report summarizes the system architecture, software tools, and performance of the BTS diagnostics.
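
    The log-ratio processing mentioned above amounts to a one-line position estimate per electrode pair; the scale factor below is a placeholder that in practice comes from the pickup geometry and calibration:

    ```python
    import math

    def bpm_position_mm(v_a, v_b, k_mm=8.0):
        """Log-ratio stripline BPM estimate: x ≈ k * log10(V_A / V_B), linear near axis."""
        return k_mm * math.log10(v_a / v_b)

    print(f"{bpm_position_mm(1.05, 0.95):+.3f} mm")   # beam slightly toward electrode A
    ```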

  19. Online probabilistic learning with an ensemble of forecasts

    Science.gov (United States)

    Thorey, Jean; Mallet, Vivien; Chaussin, Christophe

    2016-04-01

    Our objective is to produce a calibrated weighted ensemble to forecast a univariate time series. In addition to a meteorological ensemble of forecasts, we rely on observations or analyses of the target variable. The celebrated Continuous Ranked Probability Score (CRPS) is used to evaluate the probabilistic forecasts. However, applying the CRPS to weighted empirical distribution functions (deriving from the weighted ensemble) may introduce a bias, because of which minimizing the CRPS does not produce the optimal weights. Thus we propose an unbiased version of the CRPS which relies on clusters of members and is strictly proper. We adapt online learning methods for the minimization of the CRPS. These methods generate the weights associated with the members in the forecast empirical distribution function. The weights are updated before each forecast step using only past observations and forecasts. Our learning algorithms provide the theoretical guarantee that, in the long run, the CRPS of the weighted forecasts is at least as good as the CRPS of any weighted ensemble with weights constant in time. In particular, the performance of our forecast is better than that of any subset ensemble with uniform weights. A noteworthy advantage of our algorithm is that it does not require any assumption on the distributions of the observations and forecasts, either for the application or for the theoretical guarantee to hold. As an application example on meteorological forecasts for photovoltaic production integration, we show that our algorithm generates a calibrated probabilistic forecast, with significant performance improvements on probabilistic diagnostic tools (the CRPS, the reliability diagram and the rank histogram).
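
    As a sketch of such online learning, the gradient of the plain (non-clustered) CRPS of a weighted ensemble has a closed form, and an exponentiated-gradient step keeps the weights on the simplex. The paper's unbiased, cluster-based CRPS and its regret guarantees go beyond this toy version; all data are synthetic:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # CRPS of a weighted ensemble x with weights w at observation y:
    #   CRPS = sum_i w_i |x_i - y| - 0.5 * sum_ij w_i w_j |x_i - x_j|
    #   dCRPS/dw_k = |x_k - y| - sum_j w_j |x_k - x_j|
    M, T, eta = 5, 2000, 0.1
    bias = np.linspace(-1.0, 1.0, M)            # members with different biases
    w = np.full(M, 1.0 / M)

    for _ in range(T):
        y = rng.normal()                         # observation
        x = y + bias + rng.normal(0, 0.3, M)     # ensemble forecast
        grad = np.abs(x - y) - np.abs(x[:, None] - x[None, :]) @ w
        w *= np.exp(-eta * grad)                 # exponentiated-gradient update
        w /= w.sum()

    print(np.round(w, 3))                        # low-bias members end up favored
    ```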

  20. THE FORMATION OF IRIS DIAGNOSTICS. VII. THE FORMATION OF THE O i 135.56 NM LINE IN THE SOLAR ATMOSPHERE

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Hsiao-Hsuan; Carlsson, Mats, E-mail: h.h.lin@astro.uio.no, E-mail: mats.carlsson@astro.uio.no [Institute of Theoretical Astrophysics, University of Oslo, P.O. Box 1029 Blindern, NO-0315 Oslo (Norway)

    2015-11-01

    The O i 135.56 nm line is covered by NASA's Interface Region Imaging Spectrograph (IRIS) small explorer mission, which studies how the solar atmosphere is energized. We study here the formation and diagnostic potential of this line by means of non-local thermodynamic equilibrium modeling, employing both 1D semi-empirical and 3D radiation magnetohydrodynamic models. We study the basic formation mechanisms and derive a quintessential model atom that incorporates the essential atomic physics for the formation of the O i 135.56 nm line. This atomic model has 16 levels and describes recombination cascades through highly excited levels by effective recombination rates. The ionization balance O i/O ii is set by the hydrogen ionization balance through charge exchange reactions. The emission in the O i 135.56 nm line is dominated by a recombination cascade and the line is optically thin. The Doppler shift of the maximum emission correlates strongly with the vertical velocity in its line forming region, which is typically located at 1.0–1.5 Mm height. The total intensity of the line emission is correlated with the square of the electron density. Since the O i 135.56 nm line is optically thin, the width of the emission line is a very good diagnostic of non-thermal velocities. We conclude that the O i 135.56 nm line is an excellent probe of the middle chromosphere, and complements other powerful chromospheric diagnostics of IRIS such as the Mg ii h and k lines and the C ii lines around 133.5 nm.

  1. Design of Probabilistic Random Forests with Applications to Anticancer Drug Sensitivity Prediction.

    Science.gov (United States)

    Rahman, Raziur; Haider, Saad; Ghosh, Souparno; Pal, Ranadip

    2015-01-01

    Random forests consisting of an ensemble of regression trees with equal weights are frequently used for the design of predictive models. In this article, we consider an extension of the methodology by representing the regression trees in the form of probabilistic trees and analyzing the nature of heteroscedasticity. The probabilistic tree representation allows for analytical computation of confidence intervals (CIs), and the tree weight optimization is expected to provide stricter CIs with comparable performance in mean error. We approach the prediction of the ensemble of probabilistic trees from two perspectives: as a mixture distribution and as a weighted sum of correlated random variables. We applied our methodology to the drug sensitivity prediction problem on synthetic data and the Cancer Cell Line Encyclopedia dataset, and illustrated that tree weights can be selected to reduce the average length of the CI without an increase in mean error.
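
    The mixture-distribution view admits a compact illustration: given per-tree leaf means and variances, the law of total variance yields the ensemble variance analytically, from which an approximate interval follows. The numbers stand in for a fitted forest, and the Gaussian interval is a simplification of the paper's analysis:

    ```python
    import numpy as np

    m = np.array([10.2, 9.8, 10.5, 9.9])   # per-tree leaf means (assumed)
    v = np.array([0.4, 0.6, 0.5, 0.3])     # per-tree leaf variances (assumed)
    w = np.array([0.3, 0.2, 0.3, 0.2])     # optimized tree weights, summing to 1

    mean = w @ m
    var = w @ (v + m ** 2) - mean ** 2     # law of total variance for a mixture
    half = 1.96 * np.sqrt(var)             # Gaussian approximation to the mixture
    print(f"prediction {mean:.2f}, approx. 95% CI [{mean - half:.2f}, {mean + half:.2f}]")
    ```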

  2. The Far Infrared Lines of OH as Molecular Cloud Diagnostics

    Science.gov (United States)

    Smith, Howard A.

    2004-01-01

    Future IR missions should give some priority to high resolution spectroscopic observations of the set of far-IR transitions of OH. There are 15 far-IR lines arising between the lowest eight rotational levels of OH, and ISO detected nine of them. Furthermore, ISO found the OH lines, sometimes in emission and sometimes in absorption, in a wide variety of galactic and extragalactic objects ranging from AGB stars to molecular clouds to active galactic nuclei and ultra-luminous IR galaxies. The ISO/LWS Fabry-Perot resolved the 119 μm doublet line in a few of the strong sources. This set of OH lines provides a uniquely important diagnostic for many reasons: the lines span a wide wavelength range (28.9 μm to 163.2 μm); the transitions have fast radiative rates; the abundance of the species is relatively high; the IR continuum plays an important role as a pump; the contribution from shocks is relatively minor; and, not least, the powerful centimeter-wave radiation from OH allows comparison with radio and VLBI datasets. The problem is that the large number of sensitive free parameters, and the large optical depths of the strongest lines, make modeling the full set a difficult job. The SWAS Monte Carlo radiative transfer code has been used to analyze the ISO/LWS spectra of a number of objects with good success, in both the lines and the FIR continuum; the DUSTY radiative transfer code was used to ensure a self-consistent continuum. Other far-IR lines, including those from H2O, CO, and [OI], are also in the code. The OH lines all show features which future FIR spectrometers should be able to resolve, and which will enable further refinements in the details of each cloud's structure. Some examples are given, including the case of S140, for which independent SWAS data found evidence for bulk flows.

  3. On probabilistic forecasting of wind power time-series

    DEFF Research Database (Denmark)

    Pinson, Pierre

    …power dynamics. In both cases, the model parameters are adaptively and recursively estimated, time-adaptivity being the result of exponential forgetting of past observations. The probabilistic forecasting methodology is applied at the Horns Rev wind farm in Denmark, for 10-minute-ahead probabilistic forecasting of wind power generation. Probabilistic forecasts generated from the proposed methodology clearly have higher skill than those obtained from a classical Gaussian assumption about wind power predictive densities. Corresponding point forecasts also exhibit significantly lower error criteria.

  4. Probabilistic record linkage.

    Science.gov (United States)

    Sayers, Adrian; Ben-Shlomo, Yoav; Blom, Ashley W; Steele, Fiona

    2016-06-01

    Studies involving the use of probabilistic record linkage are becoming increasingly common. However, the methods underpinning probabilistic record linkage are not widely taught or understood, and therefore these studies can appear to be a 'black box' research tool. In this article, we aim to describe the process of probabilistic record linkage through a simple exemplar. We first introduce the concept of deterministic linkage and contrast this with probabilistic linkage. We illustrate each step of the process using a simple exemplar and describe the data structure required to perform a probabilistic linkage. We describe the process of calculating and interpreting match weights and how to convert match weights into posterior probabilities of a match using Bayes' theorem. We conclude this article with a brief discussion of some of the computational demands of record linkage, how you might assess the quality of your linkage algorithm, and how epidemiologists can maximize the value of their record-linked research using robust record linkage methods.
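
    A minimal worked example of the steps the article describes (m- and u-probabilities per field, log2 match weights, and conversion to a posterior match probability via Bayes' theorem in odds form); all probabilities are invented:

    ```python
    import math

    fields = {                     # field: (m, u, agreement observed?)
        "surname":    (0.95, 0.01, True),
        "birth_year": (0.99, 0.05, True),
        "postcode":   (0.90, 0.02, False),
    }

    weight = 0.0                   # total log2 likelihood ratio
    for m, u, agrees in fields.values():
        weight += math.log2(m / u if agrees else (1 - m) / (1 - u))

    prior = 1e-4                   # prior probability that a random pair matches
    post_odds = prior / (1 - prior) * 2.0 ** weight    # Bayes' theorem, odds form
    print(f"match weight {weight:.2f}, P(match) = {post_odds / (1 + post_odds):.4f}")
    ```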

  5. An on-line diagnostic expert system

    International Nuclear Information System (INIS)

    Felkel, L.

    1987-01-01

    As experience with on-line information systems, expert systems and artificial intelligence tools grows, the authors retreat from the initial euphoria that AI could help them solve the problems they were unable to solve with conventional programming. The major effort of the development time goes into building the knowledge base. There is no such thing as a generic knowledge base for nuclear power plants as there is, for example, for the diagnosis of a Boeing 747 aircraft. AI methods, tools and hardware are still in a state which does not optimally lend itself to real-time application. The ability to develop prototype systems to investigate variants otherwise too costly to justify is one advantage that the authors gladly accept. Last, but not least, the tools provide a flexible and adaptable user interface (desktop window systems), etc. The development of such tools within a project would be prohibitive, and room for experimentation would be limited.

  6. Probabilistic Harmonic Analysis on Distributed Photovoltaic Integration Considering Typical Weather Scenarios

    Science.gov (United States)

    Bin, Che; Ruoying, Yu; Dongsheng, Dang; Xiangyan, Wang

    2017-05-01

    Distributed Generation (DG) integrated into the network causes harmonic pollution, which can damage electrical devices and affect the normal operation of the power system. On the other hand, due to the randomness of wind and solar irradiation, the output of DG is also random, which leads to uncertainty in the harmonics generated by the DG. Thus, probabilistic methods are needed to analyse the impacts of DG integration. In this work we studied the probabilistic distribution of harmonic voltage and the harmonic distortion in the distribution network after distributed photovoltaic (DPV) systems are integrated, under different weather conditions, namely sunny, cloudy, rainy and snowy days. The probabilistic distribution function of the DPV output power in the different typical weather conditions was acquired via maximum likelihood parameter estimation. The Monte-Carlo simulation method was adopted to calculate the probabilistic distribution of harmonic voltage content at different frequency orders as well as the total harmonic distortion (THD) in typical weather conditions. The case study was based on the IEEE 33-bus system, and the results for the probabilistic distribution of harmonic voltage content as well as THD in typical weather conditions were compared.
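
    A toy Monte-Carlo version of the procedure: sample the DPV output from a weather-dependent Beta distribution and collect the resulting THD distribution, assuming (as a simplification) roughly constant absolute harmonic injection, so relative distortion grows at low output. The distribution parameters and harmonic spectrum are placeholders rather than values identified from measurements:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    weather_beta = {"sunny": (8, 2), "cloudy": (3, 3), "rainy": (2, 6)}
    h_frac = np.array([0.04, 0.03, 0.015, 0.01])   # 5th/7th/11th/13th, p.u. at rated output
    h_total = np.sqrt((h_frac ** 2).sum())         # combined harmonic magnitude, p.u.

    for weather, (a, b) in weather_beta.items():
        p = rng.beta(a, b, 10000)                  # sampled PV output, p.u.
        thd = h_total / np.maximum(p, 0.05)        # THD = I_harmonics / I_fundamental
        print(f"{weather:6s}: mean THD {100 * thd.mean():.1f}%, "
              f"95th percentile {100 * np.percentile(thd, 95):.1f}%")
    ```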

  7. Memristive Probabilistic Computing

    KAUST Repository

    Alahmadi, Hamzah

    2017-10-01

    In the era of the Internet of Things and Big Data, unconventional techniques are rising to accommodate the large size of data and the resource constraints. New computing structures are advancing based on non-volatile memory technologies and different processing paradigms. Additionally, the intrinsic resiliency of current applications leads to the development of creative techniques in computation. In those applications, approximate computing provides a perfect fit to optimize energy efficiency while compromising on accuracy. In this work, we build probabilistic adders based on stochastic memristors. The probabilistic adders are analyzed with respect to the stochastic behavior of the underlying memristors. Multiple adder implementations are investigated and compared. The memristive probabilistic adder provides a different approach from the typical approximate CMOS adders. Furthermore, it allows for high area savings and design flexibility in trading off performance against power saving. While reaching a performance level similar to approximate CMOS adders, the memristive adder achieves a 60% power saving. An image-compression application is investigated using the memristive probabilistic adders, illustrating the performance-energy trade-off.

  8. Probabilistic systems coalgebraically: A survey

    Science.gov (United States)

    Sokolova, Ana

    2011-01-01

    We survey the work on both discrete and continuous-space probabilistic systems as coalgebras, starting with how probabilistic systems are modeled as coalgebras and followed by a discussion of their bisimilarity and behavioral equivalence, mentioning results that follow from the coalgebraic treatment of probabilistic systems. It is interesting to note that, for different reasons, for both discrete and continuous probabilistic systems it may be more convenient to work with behavioral equivalence than with bisimilarity. PMID:21998490

  9. On-line experimental validation of a model-based diagnostic algorithm dedicated to a solid oxide fuel cell system

    Science.gov (United States)

    Polverino, Pierpaolo; Esposito, Angelo; Pianese, Cesare; Ludwig, Bastian; Iwanschitz, Boris; Mai, Andreas

    2016-02-01

    In the current energy scenario, Solid Oxide Fuel Cells (SOFCs) exhibit appealing features which make them suitable for environmentally friendly power production, especially for stationary applications. An example is represented by micro-combined heat and power (μ-CHP) generation units based on SOFC stacks, which are able to produce electric and thermal power with high efficiency and low pollutant and greenhouse gas emissions. However, the main limitations to their diffusion into the mass market consist of high maintenance and production costs and short lifetime. To improve these aspects, current research activity focuses on the development of robust and generalizable diagnostic techniques, aimed at detecting and isolating faults within the entire system (i.e. SOFC stack and balance of plant). Coupled with appropriate recovery strategies, diagnosis can prevent undesired system shutdowns during faulty conditions, with a consequent increase in lifetime and reduction in maintenance costs. This paper deals with the on-line experimental validation of a model-based diagnostic algorithm applied to a pre-commercial SOFC system. The proposed algorithm exploits a Fault Signature Matrix based on a Fault Tree Analysis and improved through fault simulations. The algorithm is characterized on the considered system and validated by means of experimental induction of faulty states in controlled conditions.
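
    The isolation logic of a Fault Signature Matrix fits in a few lines: an observed binary symptom vector is matched against the fault columns. The symptoms, faults and signatures below are invented for illustration and are not the paper's actual SOFC signatures:

    ```python
    import numpy as np

    symptoms = ["stack_T_high", "dP_air_low", "V_stack_low", "burner_T_high"]
    faults = ["air_blower_fault", "fuel_leak", "stack_degradation"]

    FSM = np.array([[1, 0, 1],     # rows: symptoms, columns: faults
                    [1, 0, 0],
                    [0, 1, 1],
                    [0, 1, 0]])

    observed = np.array([1, 1, 0, 0])   # residuals exceeding their thresholds

    isolated = [f for j, f in enumerate(faults) if np.array_equal(FSM[:, j], observed)]
    print("isolated fault(s):", isolated or ["no unique match"])
    ```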

  10. Probabilistic insurance

    OpenAIRE

    Wakker, P.P.; Thaler, R.H.; Tversky, A.

    1997-01-01

    Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in the premium to compensate for a 1% default risk. While these preferences are intuitively appealing they are difficult to reconcile with expected utility theory. Under highly plausible assumptions about the utility function, willingness to pay for probabilistic insurance...

  11. THE FORMATION OF IRIS DIAGNOSTICS. II. THE FORMATION OF THE Mg II h and k LINES IN THE SOLAR ATMOSPHERE

    Energy Technology Data Exchange (ETDEWEB)

    Leenaarts, J.; Pereira, T. M. D.; Carlsson, M.; De Pontieu, B. [Institute of Theoretical Astrophysics, University of Oslo, P.O. Box 1029 Blindern, NO-0315 Oslo (Norway); Uitenbroek, H., E-mail: jorritl@astro.uio.no, E-mail: tiago.pereira@astro.uio.no, E-mail: mats.carlsson@astro.uio.no, E-mail: bdp@lmsal.com, E-mail: huitenbroek@nso.edu [NSO/Sacramento Peak P.O. Box 62 Sunspot, NM 88349-0062 (United States)

    2013-08-01

    NASA's Interface Region Imaging Spectrograph (IRIS) small explorer mission will study how the solar atmosphere is energized. IRIS contains an imaging spectrograph that covers the Mg II h and k lines as well as a slit-jaw imager centered at Mg II k. Understanding the observations requires forward modeling of Mg II h and k line formation from three-dimensional (3D) radiation-magnetohydrodynamic (RMHD) models. This paper is the second in a series where we undertake this modeling. We compute the vertically emergent h and k intensity from a snapshot of a dynamic 3D RMHD model of the solar atmosphere, and investigate which diagnostic information about the atmosphere is contained in the synthetic line profiles. We find that the Doppler shift of the central line depression correlates strongly with the vertical velocity at optical depth unity, which is typically located less than 200 km below the transition region (TR). By combining the Doppler shifts of the h and k lines we can retrieve the sign of the velocity gradient just below the TR. The intensity in the central line depression is anti-correlated with the formation height, especially in subfields of a few square Mm. This intensity could thus be used to measure the spatial variation of the height of the TR. The intensity in the line-core emission peaks correlates with the temperature at its formation height, especially for strong emission peaks. The peaks can thus be exploited as a temperature diagnostic. The wavelength difference between the blue and red peaks provides a diagnostic of the velocity gradients in the upper chromosphere. The intensity ratio of the blue and red peaks correlates strongly with the average velocity in the upper chromosphere. We conclude that the Mg II h and k lines are excellent probes of the very upper chromosphere just below the TR, a height regime that is impossible to probe with other spectral lines. They also provide decent temperature and velocity diagnostics of the middle chromosphere.

  12. Fast probabilistic file fingerprinting for big data.

    Science.gov (United States)

    Tretyakov, Konstantin; Laur, Sven; Smant, Geert; Vilo, Jaak; Prins, Pjotr

    2013-01-01

    Biological data acquisition is raising new challenges, both in data analysis and handling. Not only is it proving hard to analyze the data at the rate it is generated today, but simply reading and transferring data files can be prohibitively slow due to their size. This primarily concerns logistics within and between data centers, but is also important for workstation users in the analysis phase. Common usage patterns, such as comparing and transferring files, are proving computationally expensive and are tying down shared resources. We present an efficient method for calculating file uniqueness for large scientific data files, that takes less computational effort than existing techniques. This method, called Probabilistic Fast File Fingerprinting (PFFF), exploits the variation present in biological data and computes file fingerprints by sampling randomly from the file instead of reading it in full. Consequently, it has a flat performance characteristic, correlated with data variation rather than file size. We demonstrate that probabilistic fingerprinting can be as reliable as existing hashing techniques, with provably negligible risk of collisions. We measure the performance of the algorithm on a number of data storage and access technologies, identifying its strengths as well as limitations. Probabilistic fingerprinting may significantly reduce the use of computational resources when comparing very large files. Utilisation of probabilistic fingerprinting techniques can increase the speed of common file-related workflows, both in the data center and for workbench analysis. The implementation of the algorithm is available as an open-source tool named pfff, as a command-line tool as well as a C library. The tool can be downloaded from http://biit.cs.ut.ee/pfff.
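
    A from-scratch sketch of the sampling idea (not the pfff tool's actual format or API): hash a fixed number of pseudo-randomly chosen blocks, seeding the sampler from the file size so equal-sized files are probed at identical offsets:

    ```python
    import hashlib
    import os
    import random

    def probabilistic_fingerprint(path, n_samples=64, block=64, seed=0):
        """Fingerprint from n_samples random blocks; cost is independent of file size."""
        size = os.path.getsize(path)
        rng = random.Random(seed ^ size)        # deterministic per (seed, size)
        h = hashlib.sha1(str(size).encode())
        with open(path, "rb") as f:
            for _ in range(n_samples):
                f.seek(rng.randrange(max(size - block, 1)))
                h.update(f.read(block))
        return h.hexdigest()

    # print(probabilistic_fingerprint("reads.fastq"))   # hypothetical input file
    ```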

  13. Probabilistic information on object weight shapes force dynamics in a grip-lift task.

    Science.gov (United States)

    Trampenau, Leif; Kuhtz-Buschbeck, Johann P; van Eimeren, Thilo

    2015-06-01

    Advance information, such as object weight, size and texture, modifies predictive scaling of grip forces in a grip-lift task. Here, we examined the influence of probabilistic advance information about object weight. Fifteen healthy volunteers repeatedly grasped and lifted an object equipped with a force transducer between their thumb and index finger. Three clearly distinguishable object weights were used. Prior to each lift, the probabilities for the three object weights were given by a visual cue. We examined the effect of probabilistic pre-cues on grip and lift force dynamics. We expected predictive scaling of grip force parameters to follow predicted values calculated according to probabilistic contingencies of the cues. We observed that probabilistic cues systematically influenced peak grip and load force rates, as an index of predictive motor scaling. However, the effects of probabilistic cues on force rates were nonlinear, and anticipatory adaptations of the motor output generally seemed to overestimate high probabilities and underestimate low probabilities. These findings support the suggestion that anticipatory adaptations and force scaling of the motor system can integrate probabilistic information. However, probabilistic information seems to influence motor programs in a nonlinear fashion.

  14. Students’ difficulties in probabilistic problem-solving

    Science.gov (United States)

    Arum, D. P.; Kusmayadi, T. A.; Pramudya, I.

    2018-03-01

    Many errors can be identified when students solve mathematics problems, particularly probabilistic problems. The present study aims to investigate students' difficulties in solving probabilistic problems. It focuses on analyzing and describing students' errors during problem solving. This research used the qualitative method with a case study strategy. The subjects in this research were ten students of 9th grade selected by purposive sampling. Data in this research comprise students' probabilistic problem-solving results and recorded interviews regarding students' difficulties in solving the problems. The data were analyzed descriptively using Miles and Huberman's steps. The results show that students' difficulties in solving probabilistic problems can be divided into three categories. The first relates to students' difficulties in understanding the probabilistic problem. The second concerns students' difficulties in choosing and using appropriate strategies for solving the problem. The third concerns students' difficulties with the computational process in solving the problem. Based on the results, it seems that students still have difficulties in solving probabilistic problems, meaning that they are not yet able to use their knowledge and ability to respond to probabilistic problems. Therefore, it is important for mathematics teachers to plan probabilistic learning that could optimize students' probabilistic thinking ability.

  15. Priority directions for the introduction of diagnostic equipment at NPPs

    International Nuclear Information System (INIS)

    Morozov, V.I.

    2000-01-01

    Diagnostic provision creates the conditions for increasing the safety and reliability of NPP operation and for technical service and maintenance based on actual equipment condition. Given the large number of NPP equipment elements, the limited financial resources, and the differing technical-economic effects of diagnostics, determining the priority directions for introducing technical diagnostic means into operational practice is one of the main tasks. A method for determining these priorities is proposed. The main aspects of the method and its mathematical models, based on logical-probabilistic modeling, are presented. The essence of the method consists in ranking the technical-economic effect of introducing various items of diagnostic equipment [ru]

  16. Quality of information accompanying on-line marketing of home diagnostic tests.

    Science.gov (United States)

    Datta, Adrija K; Selman, Tara J; Kwok, Tony; Tang, Teresa; Khan, Khalid S

    2008-01-01

    To assess the quality of information provided to consumers by websites marketing medical home diagnostic tests. A cross-sectional analysis of a database developed from searching targeted websites. Data sources were websites written in English which marketed medical home diagnostic tests. A meta-search engine was used to identify the first 20 citations for each type of home diagnostic medical test. Relevant websites limited to those written in English were reviewed independently and in triplicate, with disputes resolved by two further reviewers. Information on the quality of these sites was extracted using a pre-piloted proforma. 168 websites were suitable for inclusion in the review. The quality of these sites showed marked variation. Only 24 of 168 (14.2%) complied with at least three-quarters of the quality items and just over half (95 of 168, 56.5%) reported official approval or certification of the test. Information on accuracy of the test marketed was reported by 87 of 168 (51.7%) websites, with 15 of 168 (8.9%) providing a scientific reference. Instructions for use of the product were found in 97 of 168 (57.9%). However, the course of action to be taken after obtaining the test result was stated in only 63 of 168 (37.5%) for a positive result and 43 of 168 (25.5%) for a negative result. The quality of information posted on commercial websites marketing home tests online is unsatisfactory and potentially misleading for consumers.

  17. A General Framework for Probabilistic Characterizing Formulae

    DEFF Research Database (Denmark)

    Sack, Joshua; Zhang, Lijun

    2012-01-01

    Recently, a general framework on characteristic formulae was proposed by Aceto et al. It offers a simple theory that allows one to easily obtain characteristic formulae of many non-probabilistic behavioral relations. Our paper studies their techniques in a probabilistic setting. We provide a general method for determining characteristic formulae of behavioral relations for probabilistic automata using fixed-point probability logics. We consider such behavioral relations as simulations and bisimulations, probabilistic bisimulations, probabilistic weak simulations, and probabilistic forward simulations.

  18. Fielding of the on-axis diagnostic package at Z

    International Nuclear Information System (INIS)

    Hurst, M.J.; Nash, T.J.; Derzon, M.; Kellogg, J.W.; Torres, J.; McGurn, J.; Seaman, J.; Jobe, D.; Lazier, S.E.

    1998-06-01

    The authors have developed a comprehensive diagnostic package for observing z-pinch radiation along the pinch axis on the Z accelerator. The instruments fielded in the axial package are x-ray diagnostics requiring direct lines of sight to the target. The diagnostics require vacuum access to the center of the accelerator. The environment is a hostile one, where one must deal with an intense, energetic photon flux (>100 keV), EMP, debris (e.g. bullets or shrapnel), and mechanical shock in order for the diagnostics to survive. In addition, practical constraints require the package to be refurbished and utilized on a once-a-day shot schedule. In spite of this harsh environment, the authors have successfully fielded the diagnostic package with high survivability of the data and the instruments. In this paper, they describe the environment and issues related to the re-entrant diagnostic package's implementation and maintenance.

  19. A utility theoretic view on probabilistic safety criteria

    International Nuclear Information System (INIS)

    Holmberg, J.E.

    1997-03-01

    A probabilistic safety criterion specifies the maximum acceptable hazard rates of various accidental consequences. Assuming that the criterion depends also on the benefit of the process to society and on the licensing time applied, we can regard such statements as preference relations. In this paper, a probabilistic safety criterion is interpreted to mean that if the accident hazard rate is higher than the accident hazard rate criterion, then the optimal stopping time of a hazardous process is shorter than the licensing time. This interpretation yields a condition for a feasible utility function. In particular, we derive such a condition for the parameters of a linear plus exponential utility function. (orig.) (12 refs.)
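
    For concreteness, one standard parameterization of a linear-plus-exponential utility is written out below; the paper's feasibility condition on the parameters is not reproduced here, and the parameter names are ours:

    ```latex
    % Illustrative linear-plus-exponential utility:
    \[
      u(x) = a\,x + b\left(1 - e^{-c x}\right), \qquad a, b, c > 0,
    \]
    % with absolute risk aversion decreasing in x:
    \[
      -\frac{u''(x)}{u'(x)} = \frac{b\,c^{2} e^{-c x}}{a + b\,c\,e^{-c x}} .
    \]
    ```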

  20. Quality of information accompanying on-line marketing of home diagnostic tests

    Science.gov (United States)

    Datta, Adrija K; Selman, Tara J; Kwok, Tony; Tang, Teresa; Khan, Khalid S

    2008-01-01

    Objective To assess the quality of information provided to consumers by websites marketing medical home diagnostic tests. Design A cross-sectional analysis of a database developed from searching targeted websites. Setting Data sources were websites written in English which marketed medical home diagnostic tests. Main outcome measures A meta-search engine was used to identify the first 20 citations for each type of home diagnostic medical test. Relevant websites limited to those written in English were reviewed independently and in triplicate, with disputes resolved by two further reviewers. Information on the quality of these sites was extracted using a pre-piloted proforma. Results 168 websites were suitable for inclusion in the review. The quality of these sites showed marked variation. Only 24 of 168 (14.2%) complied with at least three-quarters of the quality items and just over half (95 of 168, 56.5%) reported official approval or certification of the test. Information on accuracy of the test marketed was reported by 87 of 168 (51.7%) websites, with 15 of 168 (8.9%) providing a scientific reference. Instructions for use of the product were found in 97 of 168 (57.9%). However, the course of action to be taken after obtaining the test result was stated in only 63 of 168 (37.5%) for a positive result and 43 of 168 (25.5%) for a negative result. Conclusions The quality of information posted on commercial websites marketing home tests online is unsatisfactory and potentially misleading for consumers. PMID:18263912

  1. Density-dependent lines of one- and two-electron ions in diagnostics of laboratory plasma. I. The rates of collision relaxation of excited levels

    Energy Technology Data Exchange (ETDEWEB)

    Shevelko, V P; Skobelev, I Yu; Vinogradov, A V [Lebedev Physical Institute, Academy of Sciences of the USSR, Moscow, USSR

    1977-01-01

    Plasma devices with inertial plasma confinement, such as laser-produced plasmas, exploding wires, plasma focus, etc., which have been rapidly developed during recent years, appear to be very intensive sources of spectral line radiation in the far-UV and X-ray regions. Analysis of this radiation provides a good tool for plasma diagnostics at very high electron densities, up to 10^22 cm^-3. In this work, consisting of two parts, the authors consider the mechanism of the formation of spectral lines in hot and dense plasma. The key point for density diagnostics is the fact that for some ion levels the rate of collisional relaxation has the same order of magnitude as the radiative decay. Thus the intensities of spectral lines arising from these levels show a strong dependence on electron density, which makes diagnostics possible. In this paper, emphasis is laid on the calculation of the rates of transitions between close ion levels induced by electron or ion impact, which usually give the main contribution to the collisional relaxation constants. The influence of plasma polarization effects on the collision frequency in a dense plasma is also considered.

  2. Performing Probabilistic Risk Assessment Through RAVEN

    Energy Technology Data Exchange (ETDEWEB)

    A. Alfonsi; C. Rabiti; D. Mandelli; J. Cogliati; R. Kinoshita

    2013-06-01

    The Reactor Analysis and Virtual control ENvironment (RAVEN) code is a software tool that acts as the control logic driver and post-processing engine for the newly developed thermal-hydraulic code RELAP-7. RAVEN is now a multi-purpose Probabilistic Risk Assessment (PRA) software framework that allows dispatching different functionalities: • Derive and actuate the control logic required to simulate the plant control system and operator actions (guided procedures), allowing on-line monitoring/controlling in the phase space • Perform both Monte-Carlo sampling of randomly distributed events and Dynamic Event Tree based analysis • Facilitate the input/output handling through a Graphical User Interface (GUI) and a post-processing data mining module

  3. On the progress towards probabilistic basis for deterministic codes

    International Nuclear Information System (INIS)

    Ellyin, F.

    1975-01-01

    Fundamental arguments for a probabilistic basis of codes are presented. A class of code formats is outlined in which explicit statistical measures of the uncertainty of design variables are incorporated. The format looks very much like present (deterministic) codes except for having a probabilistic background. An example is provided whereby the design factors are plotted against the safety index, the probability of failure, and the risk of mortality. The safety level of the present codes is also indicated. A decision regarding the new probabilistically based code parameters could thus be made with full knowledge of the implied consequences.
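
    The standard link between the safety index β and the notional failure probability used in such plots is p_f = Φ(−β), with Φ the standard normal CDF; a two-line check:

    ```python
    from math import erf, sqrt

    def failure_probability(beta):
        """p_f = Phi(-beta) for the standard normal CDF."""
        return 0.5 * (1.0 + erf(-beta / sqrt(2.0)))

    for beta in (2.0, 3.0, 4.0):
        print(f"beta = {beta:.1f}  ->  p_f = {failure_probability(beta):.2e}")
    ```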

  4. On the use of data and judgment in probabilistic risk and safety analysis

    International Nuclear Information System (INIS)

    Kaplan, S.

    1986-01-01

    This paper reviews the line of thought of a nuclear plant probabilistic risk analysis (PRA), identifying the points where data and judgment enter. At the 'bottom' of the process, data and judgment are combined, using one- and two-stage Bayesian methods, to express what is known about the elemental variables. Higher in the process, we see the use of judgment in identifying scenarios, developing models, and specifying initiating event categories. Finally, we discuss the judgments involved in deciding to do a PRA and in applying the results. (orig.)

  5. Probabilistic Insurance

    NARCIS (Netherlands)

    Wakker, P.P.; Thaler, R.H.; Tversky, A.

    1997-01-01

    Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in premium to compensate for a 1% default risk.

  6. Probabilistic Insurance

    NARCIS (Netherlands)

    P.P. Wakker (Peter); R.H. Thaler (Richard); A. Tversky (Amos)

    1997-01-01

    Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in the premium to compensate for a 1% default risk.

  7. Diagnostics of Coronal Magnetic Fields through the Hanle Effect in UV and IR Lines

    Energy Technology Data Exchange (ETDEWEB)

    Raouafi, Nour E. [The John Hopkins University Applied Physics Laboratory, Laurel, MD (United States); Riley, Pete [Predictive Science Inc., San Diego, CA (United States); Gibson, Sarah [High Altitude Observatory, National Center for Atmospheric Research, Boulder, CO (United States); Fineschi, Silvano [The Astrophysical Observatory of Turin, National Institute for Astrophysics, Turin (Italy); Solanki, Sami K., E-mail: noureddine.raouafi@jhuapl.edu [Max-Planck-Institut für Sonnensystemforschung, Göttingen (Germany); School of Space Research, Kyung Hee University, Yongin, South (Korea, Republic of)

    2016-06-22

    The plasma thermodynamics in the solar upper atmosphere, particularly in the corona, are dominated by the magnetic field, which controls the flow and dissipation of energy. The relative lack of knowledge of the coronal vector magnetic field is a major handicap for progress in coronal physics. This makes the development of measurement methods of coronal magnetic fields a high priority in solar physics. The Hanle effect in the UV and IR spectral lines is a largely unexplored diagnostic. We use magnetohydrodynamic (MHD) simulations to study the magnitude of the signal to be expected for typical coronal magnetic fields for selected spectral lines in the UV and IR wavelength ranges, namely the H i Ly-α and the He i 10,830 Å lines. We show that the selected lines are useful for reliable diagnosis of coronal magnetic fields. The results show that the combination of polarization measurements of spectral lines with different sensitivities to the Hanle effect may be most appropriate for deducing coronal magnetic properties from future observations.

  8. Confluence Reduction for Probabilistic Systems (extended version)

    NARCIS (Netherlands)

    Timmer, Mark; Stoelinga, Mariëlle Ida Antoinette; van de Pol, Jan Cornelis

    2010-01-01

    This paper presents a novel technique for state space reduction of probabilistic specifications, based on a newly developed notion of confluence for probabilistic automata. We prove that this reduction preserves branching probabilistic bisimulation and can be applied on-the-fly.

  9. Active galactic nuclei emission line diagnostics and the mass-metallicity relation up to redshift z ∼ 2: The impact of selection effects and evolution

    Energy Technology Data Exchange (ETDEWEB)

    Juneau, Stéphanie; Bournaud, Frédéric; Daddi, Emanuele; Elbaz, David; Duc, Pierre-Alain; Gobat, Raphael; Jean-Baptiste, Ingrid; Le Floc'h, Émeric; Pannella, Maurilio; Schreiber, Corentin [CEA-Saclay, DSM/IRFU/SAp, F-91191 Gif-sur-Yvette (France); Charlot, Stéphane; Lehnert, M. D.; Pacifici, Camilla [UPMC-CNRS, UMR 7095, Institut d'Astrophysique de Paris, F-75014 Paris (France); Trump, Jonathan R. [University of California Observatories/Lick Observatory, University of California, Santa Cruz, CA 95064 (United States); Brinchmann, Jarle [Leiden Observatory, Leiden University, P.O. Box 9513, 2300 RA Leiden (Netherlands); Dickinson, Mark, E-mail: stephanie.juneau@cea.fr [National Optical Astronomy Observatory, 950 North Cherry Avenue, Tucson, AZ 85719 (United States)

    2014-06-10

    Emission line diagnostic diagrams probing the ionization sources in galaxies, such as the Baldwin-Phillips-Terlevich (BPT) diagram, have been used extensively to distinguish active galactic nuclei (AGN) from purely star-forming galaxies. However, they remain poorly understood at higher redshifts. We shed light on this issue with an empirical approach based on a z ∼ 0 reference sample built from ∼300,000 Sloan Digital Sky Survey galaxies, from which we mimic selection effects due to typical emission line detection limits at higher redshift. We combine this low-redshift reference sample with a simple prescription for luminosity evolution of the global galaxy population to predict the loci of high-redshift galaxies on the BPT and Mass-Excitation (MEx) diagnostic diagrams. The predicted bivariate distributions agree remarkably well with direct observations of galaxies out to z ∼ 1.5, including the observed stellar mass-metallicity (MZ) relation evolution. As a result, we infer that high-redshift star-forming galaxies are consistent with having normal interstellar medium (ISM) properties out to z ∼ 1.5, after accounting for selection effects and line luminosity evolution. Namely, their optical line ratios and gas-phase metallicities are comparable to those of low-redshift galaxies with equivalent emission-line luminosities. In contrast, AGN narrow-line regions may show a shift toward lower metallicities at higher redshift. While a physical evolution of the ISM conditions is not ruled out for purely star-forming galaxies and may be more important starting at z ≳ 2, we find that reliably quantifying this evolution is hindered by selection effects. The recipes provided here may serve as a basis for future studies toward this goal. Code to predict the loci of galaxies on the BPT and MEx diagnostic diagrams and the MZ relation as a function of emission line luminosity limits is made publicly available.

  10. Active galactic nuclei emission line diagnostics and the mass-metallicity relation up to redshift z ∼ 2: The impact of selection effects and evolution

    International Nuclear Information System (INIS)

    Juneau, Stéphanie; Bournaud, Frédéric; Daddi, Emanuele; Elbaz, David; Duc, Pierre-Alain; Gobat, Raphael; Jean-Baptiste, Ingrid; Le Floc'h, Émeric; Pannella, Maurilio; Schreiber, Corentin; Charlot, Stéphane; Lehnert, M. D.; Pacifici, Camilla; Trump, Jonathan R.; Brinchmann, Jarle; Dickinson, Mark

    2014-01-01

    Emission line diagnostic diagrams probing the ionization sources in galaxies, such as the Baldwin-Phillips-Terlevich (BPT) diagram, have been used extensively to distinguish active galactic nuclei (AGN) from purely star-forming galaxies. However, they remain poorly understood at higher redshifts. We shed light on this issue with an empirical approach based on a z ∼ 0 reference sample built from ∼300,000 Sloan Digital Sky Survey galaxies, from which we mimic selection effects due to typical emission line detection limits at higher redshift. We combine this low-redshift reference sample with a simple prescription for luminosity evolution of the global galaxy population to predict the loci of high-redshift galaxies on the BPT and Mass-Excitation (MEx) diagnostic diagrams. The predicted bivariate distributions agree remarkably well with direct observations of galaxies out to z ∼ 1.5, including the observed stellar mass-metallicity (MZ) relation evolution. As a result, we infer that high-redshift star-forming galaxies are consistent with having normal interstellar medium (ISM) properties out to z ∼ 1.5, after accounting for selection effects and line luminosity evolution. Namely, their optical line ratios and gas-phase metallicities are comparable to those of low-redshift galaxies with equivalent emission-line luminosities. In contrast, AGN narrow-line regions may show a shift toward lower metallicities at higher redshift. While a physical evolution of the ISM conditions is not ruled out for purely star-forming galaxies and may be more important starting at z ≳ 2, we find that reliably quantifying this evolution is hindered by selection effects. The recipes provided here may serve as a basis for future studies toward this goal. Code to predict the loci of galaxies on the BPT and MEx diagnostic diagrams and the MZ relation as a function of emission line luminosity limits is made publicly available.

  11. Probabilistic safety analysis vs probabilistic fracture mechanics -relation and necessary merging

    International Nuclear Information System (INIS)

    Nilsson, Fred

    1997-01-01

    A comparison is made between some general features of probabilistic fracture mechanics (PFM) and probabilistic safety assessment (PSA) in its standard form. We conclude that: • The result from PSA is a numerically expressed level of confidence in the system based on the state of current knowledge; it is thus not an objective measure of risk. • It is important to carefully define the precise nature of the probabilistic statement and relate it to a well defined situation. • Standardisation of PFM methods is necessary. • PFM seems to be the only way to obtain estimates of the pipe break probability. • Service statistics are of doubtful value because of scarcity of data and statistical inhomogeneity. • Collection of service data should be directed towards the occurrence of growing cracks.

  12. Structural reliability codes for probabilistic design

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager

    1997-01-01

    …probabilistic code format has not only strong influence on the formal reliability measure, but also on the formal cost of failure to be associated if a design made to the target reliability level is considered to be optimal. In fact, the formal cost of failure can be different by several orders of size for two different, but by and large equally justifiable, probabilistic code formats. Thus, the consequence is that a code format based on decision theoretical concepts and formulated as an extension of a probabilistic code format must specify formal values to be used as costs of failure. A principle of prudence is suggested for guiding the choice of the reference probabilistic code format for constant reliability. In the author's opinion there is an urgent need for establishing a standard probabilistic reliability code.

  13. Evaluation of Probabilistic Disease Forecasts.

    Science.gov (United States)

    Hughes, Gareth; Burnett, Fiona J

    2017-10-01

    The statistical evaluation of probabilistic disease forecasts often involves calculation of metrics defined conditionally on disease status, such as sensitivity and specificity. However, for the purpose of disease management decision making, metrics defined conditionally on the result of the forecast (predictive values) are also important, although less frequently reported. In this context, the application of scoring rules in the evaluation of probabilistic disease forecasts is discussed. An index of separation with application in the evaluation of probabilistic disease forecasts, described in the clinical literature, is also considered and its relation to scoring rules illustrated. Scoring rules provide a principled basis for the evaluation of probabilistic forecasts used in plant disease management. In particular, the decomposition of scoring rules into interpretable components is an advantageous feature of their application in the evaluation of disease forecasts.
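
    The decomposition mentioned above can be made concrete with the Brier score, whose Murphy decomposition splits it into reliability, resolution and uncertainty terms; the forecast/outcome data here are synthetic:

    ```python
    import numpy as np

    p = np.array([0.1, 0.1, 0.3, 0.3, 0.3, 0.7, 0.7, 0.9, 0.9, 0.9])  # forecasts
    y = np.array([0,   0,   0,   1,   0,   1,   1,   1,   0,   1])    # outcomes

    brier = np.mean((p - y) ** 2)
    ybar = y.mean()
    rel = res = 0.0
    for pk in np.unique(p):                    # bin by distinct forecast value
        idx = p == pk
        ok = y[idx].mean()                     # observed frequency in the bin
        rel += idx.mean() * (pk - ok) ** 2     # reliability (smaller is better)
        res += idx.mean() * (ok - ybar) ** 2   # resolution (larger is better)
    unc = ybar * (1 - ybar)

    print(f"Brier {brier:.3f} = rel {rel:.3f} - res {res:.3f} + unc {unc:.3f}")
    ```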

  14. On revision of partially specified convex probabilistic belief bases

    CSIR Research Space (South Africa)

    Rens, G

    2016-08-01

    We propose a method for an agent to revise its incomplete probabilistic beliefs when a new piece of propositional information is observed. In this work, an agent's beliefs are represented by a set of probabilistic formulae – a belief base...

  15. SYSTEM OF CONTROL FOR ACTIVE CAR SUSPENSION CHARACTERISTICS IN THE COMPOSITION OF THE EXPRESS DIAGNOSTICS LINE

    Directory of Open Access Journals (Sweden)

    Y. Borodenko

    2017-12-01

    Full Text Available The issues related to the organization and technical implementation of a control area for the output parameters of active suspensions, as part of a technical inspection line, are considered. An option is proposed to supplement the basic express-diagnostics line with an additional bench for checking the alignment angles of the rear axle wheels. To reduce the cost of the hardware implementation and to increase the productivity of the measuring complex, it is proposed to combine the software of the computer diagnostic tools into a single testing system.

  16. Filterscope diagnostic system on EAST tokamak

    International Nuclear Information System (INIS)

    Xu, Z.; Wu, Z.W.; Gao, W.; Zhang, L.; Huang, J.; Chen, Y.J.; Wu, C.R.; Zhang, P.F.

    2015-01-01

    Filterscope diagnostic systems, designed for monitoring line emission in fusion plasmas, have been widely used on fusion devices such as DIII-D, NSTX, CDX-U and KSTAR. On EAST (Experimental Advanced Superconducting Tokamak), a filterscope diagnostic system has been installed to observe line emission and visible bremsstrahlung emission in the plasma since the 2014 discharge campaign. It plays a crucial role in studying Edge Localized Modes (ELMs) and the H-mode, thanks to its high temporal resolution (0.005 ms) and good spatial resolution (∼2 cm). Furthermore, multi-channel signals can be digitized simultaneously at sampling rates of up to 200 kHz. The wavelength coverage includes He II (468.5 nm), Li I (670.8 nm), Li II (548.3 nm), C III (465.0 nm), O II (441.5 nm), Mo I (386.4 nm), W I (400.9 nm) and visible bremsstrahlung radiation at 538 nm, besides Dα (656.1 nm) and Dγ (433.9 nm), with the corresponding wavelength filters. The newly developed filterscope system was operating during the EAST 2014 fall experimental campaign and several types of ELMs have been observed. (author)

  17. Integration of Probabilistic Exposure Assessment and Probabilistic Hazard Characterization

    NARCIS (Netherlands)

    Voet, van der H.; Slob, W.

    2007-01-01

    A method is proposed for integrated probabilistic risk assessment where exposure assessment and hazard characterization are both included in a probabilistic way. The aim is to specify the probability that a random individual from a defined (sub)population will have an exposure high enough to cause a
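
    A minimal Monte Carlo reading of such an integration, assuming, purely for illustration, lognormal distributions for both the individual exposure and the individual critical dose:

```python
# Integrated probabilistic risk: probability that a random individual's
# exposure exceeds that same individual's critical dose.
# All distribution parameters below are invented.
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000
exposure = rng.lognormal(mean=0.0, sigma=1.0, size=n)       # e.g. daily intake
critical_dose = rng.lognormal(mean=2.0, sigma=0.5, size=n)  # individual threshold

p_effect = np.mean(exposure > critical_dose)
print(f"P(exposure > critical dose) = {p_effect:.4f}")
```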

  18. Characterizing ICF Neutron Diagnostics on the nTOF line at SUNY Geneseo

    Science.gov (United States)

    Simone, Angela; Padalino, Stephen; Turner, Ethan; Ginnane, Mary Kate; Dubois, Natalie; Fletcher, Kurtis; Giordano, Michael; Lawson-Keister, Patrick; Harrison, Hannah; Visca, Hannah; Sangster, Craig; Regan, Sean

    2014-10-01

    Charged particle beams from the Geneseo 1.7 MV tandem Pelletron accelerator produce nuclear reactions that emit neutrons in the range of 0.5 to 17.9 MeV via the d(d,n)3He and 11B(d,n)12C reactions. The neutron energy and flux can be adjusted by controlling the accelerator beam current and potential. This adjustable neutron source makes it possible to calibrate ICF and HEDP neutron scintillator diagnostics. However, gamma rays, which are often present during an accelerator-based calibration, are difficult to differentiate from neutron signals in scintillators. To distinguish neutrons from gamma rays and to determine their energy, a permanent neutron time-of-flight (nTOF) line is being constructed. By detecting the scintillator signal in coincidence with an associated charged particle (ACP) produced in the reaction, the identity of the neutron can be established and its energy determined by time of flight. Using a 100% efficient surface barrier detector (SBD) to count the ACPs, the absolute efficiency of the scintillator as a function of neutron energy can be determined from the ratio of ACP counts in the singles spectrum to coincidence counts for matched solid angles of the SBD and scintillator. Funded in part by a LLE contract through the DOE.
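
    The time-of-flight energy determination rests on the non-relativistic relation E = m(d/t)^2/2. A quick numerical check (the 2 m flight path and the timing values are illustrative, not the actual Geneseo geometry):

```python
# Neutron kinetic energy from time of flight over a known path length.
M_N = 939.565e6   # neutron rest mass energy, eV
C = 2.998e8       # speed of light, m/s

def tof_energy_mev(distance_m, time_ns):
    """Non-relativistic E = 0.5 m v^2; adequate at a few MeV."""
    v = distance_m / (time_ns * 1e-9)        # m/s
    return 0.5 * M_N * (v / C) ** 2 / 1e6    # MeV

# A 2.96 MeV neutron covers a 2 m flight path in roughly 84 ns:
for t in (80.0, 84.0, 90.0):
    print(f"t = {t:5.1f} ns -> E = {tof_energy_mev(2.0, t):.2f} MeV")
```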

  19. Update on diagnostic strategies of pulmonary embolism

    International Nuclear Information System (INIS)

    Kauczor, H.U.; Heussel, C.P.; Thelen, M.

    1999-01-01

    Acute pulmonary embolism is a frequent disease with non-specific findings, high mortality, and multiple therapeutic options. A definitive diagnosis must be established by accurate, non-invasive, easily performed, cost-effective, and widely available imaging modalities. Conventional diagnostic strategies have relied on ventilation-perfusion scintigraphy complemented by venous imaging. If the results are inconclusive, pulmonary angiography, which is regarded as the gold standard, is to be performed. Recently, marked improvements in CT and MRI and shortcomings of scintigraphy have led to an update of the diagnostic strategy. Spiral CT is successfully employed as a second-line procedure to clarify indeterminate scintigraphic results, avoiding pulmonary angiography. It can also be used as a first-line screening tool if service and expertise are provided. Venous imaging is indicated if CT is inconclusive. The MRI technique can be applied as an alternative second-line test if spiral CT is not available or is contraindicated; it has the greatest potential for further developments and refinements. Echocardiography should be used as a first-line bedside examination in critical patients. If it is inconclusive, stabilized patients should undergo spiral CT, while unstable patients should be referred for pulmonary angiography. Chronic thromboembolic pulmonary hypertension is a rare sequela of acute pulmonary embolism which can be cured surgically. Morphology, complications, and differential diagnoses are better illustrated by spiral CT and MRA, whereas invasive acquisition of hemodynamic data is the sole advantage of angiography. (orig.)

  20. On the Probabilistic Characterization of Robustness and Resilience

    DEFF Research Database (Denmark)

    Faber, Michael Havbro; Qin, J.; Miraglia, Simona

    2017-01-01

    Over the last decade significant research efforts have been devoted to the probabilistic modeling and analysis of system characteristics. Especially performance characteristics of systems subjected to random disturbances, such as robustness and resilience, have been the focus of these efforts...... in the modeling of robustness and resilience in the research areas of natural disaster risk management, socio-ecological systems and social systems, and we propose a generic decision analysis framework for the modeling and analysis of systems across application areas. The proposed framework extends the concept...... of direct and indirect consequences and associated risks in probabilistic systems modeling formulated by the Joint Committee on Structural Safety (JCSS) to facilitate the modeling and analysis of resilience in addition to robustness and vulnerability. Moreover, based on recent insights in the modeling...

  1. Probabilistic Structural Analysis Theory Development

    Science.gov (United States)

    Burnside, O. H.

    1985-01-01

    The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity is concentrating on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and space shuttle main engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic analysis of stochastic structures. It is the goal of the approximate methods effort to increase problem-solving efficiency relative to finite element methods by using energy methods to generate trial solutions which satisfy the structural boundary conditions. These approximate methods will be less computer-intensive relative to the finite element approach.

  2. Probabilistic modeling of timber structures

    DEFF Research Database (Denmark)

    Köhler, Jochen; Sørensen, John Dalsgaard; Faber, Michael Havbro

    2007-01-01

    The present paper contains a proposal for the probabilistic modeling of timber material properties. It is produced in the context of the Probabilistic Model Code (PMC) of the Joint Committee on Structural Safety (JCSS) [Joint Committee of Structural Safety. Probabilistic Model Code, Internet...... Publication: www.jcss.ethz.ch; 2001] and of the COST action E24 ‘Reliability of Timber Structures' [COST Action E 24, Reliability of timber structures. Several meetings and Publications, Internet Publication: http://www.km.fgg.uni-lj.si/coste24/coste24.htm; 2005]. The present proposal is based on discussions...... and comments from participants of the COST E24 action and the members of the JCSS. The paper contains a description of the basic reference properties for timber strength parameters and ultimate limit state equations for timber components. The recommended probabilistic model for these basic properties...

  3. Probabilistic Mu-Calculus

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand; Mardare, Radu Iulian; Xue, Bingtian

    2016-01-01

    We introduce a version of the probabilistic µ-calculus (PMC) built on top of a probabilistic modal logic that allows encoding n-ary inequational conditions on transition probabilities. PMC extends previously studied calculi and we prove that, despite its expressiveness, it enjoys a series of good...... metaproperties. Firstly, we prove the decidability of satisfiability checking by establishing the small model property. An algorithm for deciding the satisfiability problem is developed. As a second major result, we provide a complete axiomatization for the alternation-free fragment of PMC. The completeness proof...

  4. Probabilistic metric spaces

    CERN Document Server

    Schweizer, B

    2005-01-01

    Topics include special classes of probabilistic metric spaces, topologies, and several related structures, such as probabilistic normed and inner-product spaces. 1983 edition, updated with 3 new appendixes. Includes 17 illustrations.

  5. Simultaneous-Fault Diagnosis of Gas Turbine Generator Systems Using a Pairwise-Coupled Probabilistic Classifier

    Directory of Open Access Journals (Sweden)

    Zhixin Yang

    2013-01-01

    Full Text Available A reliable fault diagnostic system for a gas turbine generator system (GTGS), which is complicated and subject to many types of component faults, is essential to avoid interruption of the electricity supply. However, GTGS diagnosis faces challenges in terms of the existence of simultaneous faults and the high cost of acquiring the exponentially increasing number of simultaneous-fault vibration signals needed for constructing the diagnostic system. This research proposes a new diagnostic framework combining feature extraction, a pairwise-coupled probabilistic classifier, and decision threshold optimization. The feature extraction module adopts wavelet packet transform and time-domain statistical features to extract vibration signal features. Kernel principal component analysis is then applied to further reduce the redundant features. The features of single faults in a simultaneous-fault pattern are extracted and then detected using a probabilistic classifier, namely the pairwise-coupled relevance vector machine, which is trained with single-fault patterns only. Therefore, a training dataset of simultaneous-fault patterns is unnecessary. To optimize the decision threshold, this research proposes the grid search method, which can ensure a global solution over the grid as compared with traditional computational intelligence techniques. Experimental results show that the proposed framework performs well for both single-fault and simultaneous-fault diagnosis and is superior to frameworks without feature extraction and pairwise coupling.
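
    The decision-threshold grid search can be sketched as follows; the probability matrix stands in for the outputs of the pairwise-coupled classifier, and micro-averaged F1 is an assumed objective (the paper's own criterion may differ):

```python
# Exhaustive grid search for a decision threshold on per-fault probabilities.
import numpy as np
from sklearn.metrics import f1_score

rng = np.random.default_rng(1)
n_samples, n_faults = 200, 4
probs = rng.uniform(size=(n_samples, n_faults))            # classifier outputs
labels = (probs + rng.normal(0, 0.3, probs.shape)) > 0.5   # noisy ground truth

best_t, best_f1 = None, -1.0
for t in np.linspace(0.05, 0.95, 19):  # exhaustive grid -> global optimum on grid
    pred = probs > t                   # a sample may activate several fault labels
    score = f1_score(labels, pred, average="micro")
    if score > best_f1:
        best_t, best_f1 = t, score
print(f"best threshold = {best_t:.2f}, micro-F1 = {best_f1:.3f}")
```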

  6. On-line integration of computer controlled diagnostic devices and medical information systems in undergraduate medical physics education for physicians.

    Science.gov (United States)

    Hanus, Josef; Nosek, Tomas; Zahora, Jiri; Bezrouk, Ales; Masin, Vladimir

    2013-01-01

    We designed and evaluated an innovative computer-aided-learning environment based on the on-line integration of computer controlled medical diagnostic devices and a medical information system for use in the preclinical medical physics education of medical students. Our learning system simulates the actual clinical environment in a hospital or primary care unit. It uses a commercial medical information system for on-line storage and processing of clinical type data acquired during physics laboratory classes. Every student adopts two roles, the role of 'patient' and the role of 'physician'. As a 'physician' the student operates the medical devices to clinically assess 'patient' colleagues and records all results in an electronic 'patient' record. We also introduced an innovative approach to the use of supportive education materials, based on the methods of adaptive e-learning. A survey of student feedback is included and statistically evaluated. The results from the student feedback confirm the positive response of the latter to this novel implementation of medical physics and informatics in preclinical education. This approach not only significantly improves learning of medical physics and informatics skills but has the added advantage that it facilitates students' transition from preclinical to clinical subjects. Copyright © 2011 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  7. Probabilistic brains: knowns and unknowns

    Science.gov (United States)

    Pouget, Alexandre; Beck, Jeffrey M; Ma, Wei Ji; Latham, Peter E

    2015-01-01

    There is strong behavioral and physiological evidence that the brain both represents probability distributions and performs probabilistic inference. Computational neuroscientists have started to shed light on how these probabilistic representations and computations might be implemented in neural circuits. One particularly appealing aspect of these theories is their generality: they can be used to model a wide range of tasks, from sensory processing to high-level cognition. To date, however, these theories have only been applied to very simple tasks. Here we discuss the challenges that will emerge as researchers start focusing their efforts on real-life computations, with a focus on probabilistic learning, structural learning and approximate inference. PMID:23955561

  8. On the quality and value of probabilistic forecasts of wind generation

    DEFF Research Database (Denmark)

    Pinson, Pierre; Juban, Jeremie; Kariniotakis, Georges

    2006-01-01

    the uncertainty information, can be seen as optimal for the management or trading of wind generation. This paper explores the differences and relations between the quality (i.e. statistical performance) and the operational value of these forecasts. An application is presented on the use of probabilistic...... predictions for bidding in a European electricity market. The benefits of a probabilistic view of wind power forecasting are clearly demonstrated....

  9. Effects of methamphetamine administration on information gathering during probabilistic reasoning in healthy humans.

    Science.gov (United States)

    Ermakova, Anna O; Ramachandra, Pranathi; Corlett, Philip R; Fletcher, Paul C; Murray, Graham K

    2014-01-01

    Jumping to conclusions (JTC) during probabilistic reasoning is a cognitive bias repeatedly demonstrated in people with schizophrenia and shown to be associated with delusions. Little is known about the neurochemical basis of probabilistic reasoning. We tested the hypothesis that catecholamines influence data gathering and probabilistic reasoning by administering intravenous methamphetamine, which is known to cause synaptic release of the catecholamines noradrenaline and dopamine, to healthy humans whilst they undertook a probabilistic inference task. Our study used a randomised, double-blind, cross-over design. Seventeen healthy volunteers on three visits were administered either placebo or methamphetamine or methamphetamine preceded by amisulpride. In all three conditions participants performed the "beads" task in which participants decide how much information to gather before making a probabilistic inference, and which measures the cognitive bias towards jumping to conclusions. Psychotic symptoms triggered by methamphetamine were assessed using Comprehensive Assessment of At-Risk Mental States (CAARMS). Methamphetamine induced mild psychotic symptoms, but there was no effect of drug administration on the number of draws to decision (DTD) on the beads task. DTD was a stable trait that was highly correlated within subjects across visits (intra-class correlation coefficients of 0.86 and 0.91 on two versions of the task). The less information was sampled in the placebo condition, the more psychotic-like symptoms the person had after the methamphetamine plus amisulpride condition (p = 0.028). Our results suggest that information gathering during probabilistic reasoning is a stable trait, not easily modified by dopaminergic or noradrenergic modulation.
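
    The beads task itself reduces to sequential Bayesian updating between two jars with complementary bead proportions. A toy model (the 85/15 split is the classic choice; the 0.95 decision criterion is an assumption):

```python
# Posterior over "jar A" after each drawn bead; draws-to-decision is the
# first draw at which the posterior crosses a confidence criterion.
def beads_posterior(draws, p=0.85, prior=0.5):
    """draws: string of 'a'/'b' beads; returns P(jar A) after each bead."""
    post, out = prior, []
    for bead in draws:
        like_a = p if bead == "a" else 1 - p
        like_b = 1 - p if bead == "a" else p
        post = like_a * post / (like_a * post + like_b * (1 - post))
        out.append(post)
    return out

seq = "aabaaa"
criterion = 0.95
for i, q in enumerate(beads_posterior(seq), start=1):
    flag = "  <- decision" if q > criterion else ""
    print(f"after bead {i} ({seq[i-1]}): P(jar A) = {q:.3f}{flag}")
```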

  10. The Formation of IRIS Diagnostics. IX. The Formation of the C i 135.58 NM Line in the Solar Atmosphere

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Hsiao-Hsuan; Carlsson, Mats; Leenaarts, Jorrit, E-mail: mats.carlsson@astro.uio.no, E-mail: jorrit.leenaarts@astro.su.se [Institute of Theoretical Astrophysics, University of Oslo, P.O. Box 1029 Blindern, NO-0315 Oslo (Norway)

    2017-09-01

    The C i 135.58 nm line is located in the wavelength range of NASA's Interface Region Imaging Spectrograph (IRIS) small explorer mission. We study the formation and diagnostic potential of this line by means of non-local-thermodynamic-equilibrium modeling, employing both 1D and 3D radiation-magnetohydrodynamic models. The C i/C ii ionization balance is strongly influenced by photoionization by Lyα emission. The emission in the C i 135.58 nm line is dominated by a recombination cascade and the line forming region is optically thick. The Doppler shift of the line correlates strongly with the vertical velocity in its line forming region, which is typically located at 1.5 Mm height. With IRIS, the C i 135.58 nm line is usually observed together with the O i 135.56 nm line, and from the Doppler shift of both lines, we obtain the velocity difference between the line forming regions of the two lines. From the ratio of the C i/O i line core intensity, we can determine the distance between the C i and the O i forming layers. Combined with the velocity difference, the velocity gradient at mid-chromospheric heights can be derived. The C i/O i total intensity line ratio is correlated with the inverse of the electron density in the mid-chromosphere. We conclude that the C i 135.58 nm line is an excellent probe of the middle chromosphere by itself, and together with the O i 135.56 nm line the two lines provide even more information, which complements other powerful chromospheric diagnostics of IRIS such as the Mg ii h and k lines and the C ii lines around 133.5 nm.
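
    The velocity information in both lines follows from the first-order Doppler relation v = c·Δλ/λ0. A small numerical illustration (the rest wavelengths are the two IRIS lines from the record; the line-core shifts are invented):

```python
# Line-of-sight velocity from a Doppler-shifted line core.
C_KM_S = 2.998e5  # speed of light, km/s

def doppler_v(lambda_obs_nm, lambda0_nm):
    return C_KM_S * (lambda_obs_nm - lambda0_nm) / lambda0_nm

v_ci = doppler_v(135.5820, 135.5800)  # C I 135.58 nm, invented shift
v_oi = doppler_v(135.5615, 135.5600)  # O I 135.56 nm, invented shift
print(f"C I: {v_ci:+.2f} km/s, O I: {v_oi:+.2f} km/s, "
      f"layer-to-layer difference: {v_ci - v_oi:+.2f} km/s")
```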

  11. Unit commitment with probabilistic reserve: An IPSO approach

    International Nuclear Information System (INIS)

    Lee, Tsung-Ying; Chen, Chun-Lung

    2007-01-01

    This paper presents a new algorithm for the solution of the nonlinear optimal scheduling problem, named iteration particle swarm optimization (IPSO). A new index, called iteration best, is incorporated into particle swarm optimization (PSO) to improve the solution quality and computation efficiency. IPSO is applied to solve the unit commitment with probabilistic reserve problem of a power system. The outage cost as well as the fuel cost of thermal units are considered in the unit commitment program to evaluate the level of spinning reserve. The optimal schedule of on-line generation units is reached while minimizing the sum of fuel cost and outage cost. A 48-unit power system was used as a numerical example to test the new algorithm. The optimal scheduling of on-line generation units could be reached in the test results while satisfying the requirements of the objective function
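
    The "iteration best" idea can be sketched by adding a third attraction term to the standard PSO velocity update, pulling each particle toward the best particle of the current iteration. The toy cost function and all coefficients below are invented; this is a sketch of the mechanism, not the paper's unit-commitment formulation:

```python
# PSO with an extra "iteration best" term on a toy minimization problem.
import numpy as np

rng = np.random.default_rng(7)
cost = lambda x: np.sum(x ** 2, axis=1)   # stand-in for fuel + outage cost

n, dim, iters = 30, 5, 100
x = rng.uniform(-5, 5, (n, dim))
v = np.zeros((n, dim))
pbest, pbest_cost = x.copy(), cost(x)
gbest = pbest[np.argmin(pbest_cost)].copy()

w, c1, c2, c3 = 0.7, 1.5, 1.5, 0.8        # c3 weights the iteration best
for _ in range(iters):
    ibest = x[np.argmin(cost(x))]         # best particle of *this* iteration
    r1, r2, r3 = rng.uniform(size=(3, n, dim))
    v = (w * v + c1 * r1 * (pbest - x)
               + c2 * r2 * (gbest - x)
               + c3 * r3 * (ibest - x))
    x = x + v
    c = cost(x)
    improved = c < pbest_cost
    pbest[improved], pbest_cost[improved] = x[improved], c[improved]
    gbest = pbest[np.argmin(pbest_cost)].copy()

print("best cost found:", pbest_cost.min())
```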

  12. Powerloads on the front end components and the duct of the heating and diagnostic neutral beam lines at ITER

    Energy Technology Data Exchange (ETDEWEB)

    Singh, M. J.; Boilson, D.; Hemsworth, R. S.; Geli, F.; Graceffa, J.; Urbani, M.; Schunke, B.; Chareyre, J. [ITER Organisation, 13607 St. Paul-Lez-Durance Cedex (France); Dlougach, E.; Krylov, A. [RRC Kurchatov institute, 1, Kurchatov Sq, Moscow, 123182 (Russian Federation)

    2015-04-08

    The heating and current drive beam lines (HNB) at ITER are expected to deliver ∼16.7 MW of power per beam line for H beams at 870 keV and D beams at 1 MeV during the H-He and the DD/DT phases of ITER operation, respectively. The diagnostic neutral beam (DNB) line, on the other hand, shall deliver ∼2 MW of power for H beams at 100 keV during both phases. The path lengths over which the beams from the HNB and DNB beam lines need to be transported are 25.6 m and 20.7 m, respectively. The transport of the beams over these path lengths results in beam losses, mainly by direct interception of the beam with the beam line components and by reionisation. The lost power is deposited on the surfaces of the various components of the beam line. In order to ensure the survival of these components over the operational lifetime of ITER, it is important to determine, to the best possible extent, the operational power loads and power densities on the various surfaces which are impacted by the beam in one way or another during its transport. The main factors contributing to these are the divergence of the beamlets and the halo fraction in the beam, the beam aiming, the horizontal and vertical misalignment of the beam, and the gas profile along the beam path, which determines the re-ionisation loss, together with the re-ionisation cross sections. The estimates have been made using a combination of a modified version of the Monte Carlo Gas Flow code (MCGF) and the BTR code. The MCGF is used to determine the gas profile in the beam line and takes into account the active gas feed into the ion source and neutraliser, the HNB-DNB cross-over, the gas entering the beam line from the ITER machine, the additional gas atoms generated in the beam line by impacting ions, and the pumping speed of the cryopumps. The BTR code has been used to obtain the power loads and power densities on the various surfaces of the front end components and the duct modules for different scenarios of ITER

  13. Characterizing ICF Neutron Scintillation Diagnostics on the nTOF line at SUNY Geneseo

    Science.gov (United States)

    Lawson-Keister, Pat; Padawar-Curry, Jonah; Visca, Hannah; Fletcher, Kurt; Padalino, Stephen; Sangster, T. Craig; Regan, Sean

    2015-11-01

    Neutron scintillator diagnostics for ICF and HEDP can be characterized using the neutron time-of-flight (nTOF) line on Geneseo's 1.7 MV tandem Pelletron accelerator. Neutron signals can be differentiated from gamma signals by employing coincidence methods. A 1.8-MeV beam of deuterons incident on a deuterated polyethylene target produces neutrons via the 2H(d,n)3He reaction. Neutrons emerging at a lab angle of 88° have an energy of 2.96 MeV; the 3He ions associated with these neutrons are detected at a scattering angle of 43° using a surface barrier detector. The time of flight of the neutron can be measured by using the 3He detection as a "start" signal and the scintillation detection as a "stop" signal. This time-of-flight requirement is used to identify the 2.96-MeV neutron signals in the scintillator. To measure the light curve produced by these monoenergetic neutrons, two photomultiplier tubes (PMTs) are attached to the scintillator. The full-aperture PMT establishes the nTOF coincidence. The other PMT is fitted with a pinhole to collect single events. The time between the full-aperture PMT signal and the arrival of the signal in the pinhole PMT is used to determine the light curve for the scintillator. This system will enable the neutron response of various scintillators to be compared. Supported in part by the Department of Energy National Nuclear Security Administration under Award Number DE-NA0001944.

  14. Probabilistic Structural Analysis Program

    Science.gov (United States)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes the probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and life-prediction methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include nonlinear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available, such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
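
    Not NESSUS itself, but the elementary computation that such programs generalize can be shown in a few lines: Monte Carlo estimation of the probability of failure P(g < 0) for a limit state g = R - S, with invented distributions for resistance and load:

```python
# Crude Monte Carlo probability of failure for the limit state g = R - S.
import numpy as np

rng = np.random.default_rng(5)
n = 1_000_000
R = rng.normal(50.0, 5.0, n)   # resistance (e.g. strength), invented
S = rng.normal(30.0, 8.0, n)   # load effect, invented
pf = np.mean(R - S < 0.0)      # P(g < 0)
print(f"P_f = {pf:.2e}")
```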

  15. Risk-based equipment removal guide for on-line maintenance at PSE&G

    International Nuclear Information System (INIS)

    Knoll, A.; Smith, C.; Pollock, J.

    1995-01-01

    On-line maintenance plays an important role in achieving safe and reliable power generation in a nuclear power plant. However, maintenance, if not properly planned and performed, may also be an important contributor to plant risk. Therefore, plant-specific procedures are needed for equipment removal from service to enhance the benefits of on-line maintenance and minimize the risks involved. The problem is to identify and implement the most effective on-line maintenance policy in the form of a proceduralized guide to assure plant safety under various operation and maintenance constraints. This paper presents a methodology to develop plant-specific on-line maintenance strategies and acceptance criteria using a multivariate safety approach based on risk assessment. Based on plant-specific data as modeled in the individual plant evaluation (IPE) and the updated probabilistic safety assessment (PSA), the risk-based methodology is currently being applied to the development of proceduralized equipment removal guides at Hope Creek and Salem units 1 and 2 of Public Service Electric and Gas Company (PSE&G)

  16. Charge exchange spectroscopy as a fast ion diagnostic on TEXTOR

    International Nuclear Information System (INIS)

    Delabie, E.; Jaspers, R. J. E.; Hellermann, M. G. von; Nielsen, S. K.; Marchuk, O.

    2008-01-01

    An upgraded charge exchange spectroscopy diagnostic has been taken into operation at the TEXTOR tokamak. The angles of the viewing lines with the toroidal magnetic field are close to the pitch angles at birth of fast ions injected by one of the neutral beam injectors. Using another neutral beam for active spectroscopy, injected counter to the direction in which the fast ions injected by the first beam are circulating, we can simultaneously measure a fast ion tail on the blue wing of the Dα spectrum while the beam emission spectrum is Doppler shifted to the red wing. An analysis combining the two parts of the spectrum offers possibilities to improve the accuracy of the absolute (fast) ion density profiles. Fast beam modulation or passive viewing lines cannot be used for background subtraction in this diagnostic setup, and therefore the background has to be modeled and fitted to the data together with a spectral model for the slowing-down feature. The analysis of the fast ion Dα spectrum obtained with the new diagnostic is discussed.

  17. Probabilistic numerical discrimination in mice.

    Science.gov (United States)

    Berkay, Dilara; Çavdaroğlu, Bilgehan; Balcı, Fuat

    2016-03-01

    Previous studies showed that both human and non-human animals can discriminate between different quantities (i.e., time intervals, numerosities) with a limited level of precision due to their endogenous/representational uncertainty. In addition, other studies have shown that subjects can modulate their temporal categorization responses adaptively by incorporating information gathered regarding probabilistic contingencies into their time-based decisions. Despite the psychophysical similarities between the interval timing and nonverbal counting functions, the sensitivity of count-based decisions to probabilistic information remains an unanswered question. In the current study, we investigated whether exogenous probabilistic information can be integrated into numerosity-based judgments by mice. In the task employed in this study, reward was presented either after few (i.e., 10) or many (i.e., 20) lever presses, the last of which had to be emitted on the lever associated with the corresponding trial type. In order to investigate the effect of probabilistic information on performance in this task, we manipulated the relative frequency of different trial types across different experimental conditions. We evaluated the behavioral performance of the animals under models that differed in terms of their assumptions regarding the cost of responding (e.g., logarithmically increasing vs. no response cost). Our results showed for the first time that mice could adaptively modulate their count-based decisions based on the experienced probabilistic contingencies in directions predicted by optimality.

  18. Perceptual learning as improved probabilistic inference in early sensory areas.

    Science.gov (United States)

    Bejjanki, Vikranth R; Beck, Jeffrey M; Lu, Zhong-Lin; Pouget, Alexandre

    2011-05-01

    Extensive training on simple tasks such as fine orientation discrimination results in large improvements in performance, a form of learning known as perceptual learning. Previous models have argued that perceptual learning is due to either sharpening and amplification of tuning curves in early visual areas or to improved probabilistic inference in later visual areas (at the decision stage). However, early theories are inconsistent with the conclusions of psychophysical experiments manipulating external noise, whereas late theories cannot explain the changes in neural responses that have been reported in cortical areas V1 and V4. Here we show that we can capture both the neurophysiological and behavioral aspects of perceptual learning by altering only the feedforward connectivity in a recurrent network of spiking neurons so as to improve probabilistic inference in early visual areas. The resulting network shows modest changes in tuning curves, in line with neurophysiological reports, along with a marked reduction in the amplitude of pairwise noise correlations.

  19. UV and X-ray spectral lines of Be-like Fe ion for plasma diagnostics

    International Nuclear Information System (INIS)

    Murakami, Izumi; Kato, Takako; Dubau, J.

    1996-04-01

    We have calculated X-ray and UV spectra of the Be-like Fe (FeXXIII) ion using a collisional-radiative model including all fine-structure transitions among the 2s^2, 2s2p, 2p^2, 2snl, and 2pnl levels, where n = 3 and 4, adopting data for the collision strengths by Zhang and Sampson (1992) and by Sampson, Goett, and Clark (1984). Some line intensity ratios can be used for the temperature diagnostic. We show 5 ratios in the UV region and 9 ratios in the X-ray region as functions of electron temperature and density, for electron temperatures from 0.3 keV and electron densities Ne = 1-10^25 cm^-3. The effect of cascades on these line ratios is discussed. (author)

  20. A Novel TRM Calculation Method by Probabilistic Concept

    Science.gov (United States)

    Audomvongseree, Kulyos; Yokoyama, Akihiko; Verma, Suresh Chand; Nakachi, Yoshiki

    In a new competitive environment, it becomes possible for third parties to access a transmission facility. Under this structure, to efficiently manage the utilization of the transmission network, a new definition of Available Transfer Capability (ATC) has been proposed. According to the North American Electric Reliability Council (NERC) definition, ATC depends on several parameters, i.e. Total Transfer Capability (TTC), Transmission Reliability Margin (TRM), and Capacity Benefit Margin (CBM). This paper focuses on the calculation of TRM, which is one of the security margins reserved for uncertainty in system conditions. A TRM calculation by a probabilistic method is proposed in this paper. Based on the modeling of load forecast error and error in the transmission line limit, various cases of transmission transfer capability and its related probabilistic nature can be calculated. By consideration of the proposed concept of risk analysis, the appropriate required amount of TRM can be obtained. The objective of this research is to provide realistic information on the actual ability of the network, which may be an alternative choice for system operators to make appropriate decisions in the competitive market. The advantages of the proposed method are illustrated by application to the IEEJ-WEST10 model system.
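
    A toy Monte Carlo version of the proposed idea: sample the load forecast error and the line-limit error, form the resulting transfer capability, and size TRM so that the probability of overestimating the capability stays below a chosen risk level. All numbers are invented:

```python
# Probabilistic TRM as a quantile of the simulated transfer capability.
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
ttc_nominal = 1000.0                             # MW, deterministic TTC
load_err = rng.normal(0.0, 30.0, n)              # MW, load forecast error
limit_err = rng.normal(0.0, 20.0, n)             # MW, line-limit uncertainty

capability = ttc_nominal - load_err + limit_err  # realized transfer capability
risk = 0.05                                      # accepted shortfall probability
trm = ttc_nominal - np.quantile(capability, risk)
print(f"TRM at {risk:.0%} risk level = {trm:.1f} MW")
```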

  1. VIBRO-DIAGNOSTIC SYSTEM ON BASIS OF PERSONAL COMPUTER

    Directory of Open Access Journals (Sweden)

    V. V. Bokut

    2007-01-01

    Full Text Available A system for vibration diagnostics based on a mobile computer and a two-channel microprocessor measuring device has been developed. The use of the fast Hartley-Fourier transform makes it possible to increase the frequency resolution to 25,000 spectral lines, which makes the system suitable for a wide range of applications.
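
    The frequency-resolution arithmetic behind the 25,000-line figure: a record of N samples at rate fs yields N/2 usable spectral lines spaced fs/N apart. A sketch using NumPy's real FFT in place of the Hartley-Fourier transform (all numbers illustrative):

```python
# Amplitude spectrum of a vibration-like signal and its line spacing.
import numpy as np

fs, n = 50_000.0, 50_000   # 1 s record -> 25,000 lines spaced 1 Hz apart
t = np.arange(n) / fs
signal = (np.sin(2 * np.pi * 997.0 * t)
          + 0.1 * np.random.default_rng(0).normal(size=n))

spectrum = np.abs(np.fft.rfft(signal)) / (n / 2)
freqs = np.fft.rfftfreq(n, d=1 / fs)
print(f"{len(freqs) - 1} lines, resolution {fs / n:.2f} Hz")
print("dominant line at", freqs[np.argmax(spectrum[1:]) + 1], "Hz")
```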

  2. Probabilistic Modelling of Timber Material Properties

    DEFF Research Database (Denmark)

    Nielsen, Michael Havbro Faber; Köhler, Jochen; Sørensen, John Dalsgaard

    2001-01-01

    The probabilistic modeling of timber material characteristics is considered with special emphasis on modeling the effect of different quality control and selection procedures used as means for grading timber in the production line. It is shown how statistical models may be established...... on the basis of the same type of information which is normally collected as a part of the quality control procedures and, furthermore, how the efficiency of different control procedures may be compared. The tail behavior of the probability distributions of timber material characteristics plays an important role...... such that they may readily be applied in structural reliability analysis, and the format appears to be appropriate for codification purposes of quality control and selection for grading procedures...

  3. Diode line scanner for beam diagnostics

    International Nuclear Information System (INIS)

    Gustov, S.A.

    1987-01-01

    The device, a scanning diode line, is described. It is used for beam profile measurement with a spatial precision better than ±0.5 mm and with a discreteness of 3 mm along the Y-axis and 0.25 mm along the X-axis. The device is simple in construction, reliable, and has a short data acquisition time (2-5 min). The working range is from 100 to 10^6 rad/min (10^6-10^10 particles/mm^2/s for 660 MeV protons). The radiation resistance is 10^7 rad. The device can be used for precise tuning of beam line elements during beam transport and for emittance measurement. The fixed diode line (a simplified version of the device) has smaller dimensions and a shorter data acquisition time (2-5 s). It is used for quick preliminary beamline tuning. The flowsheet and different variants of the presentation of beam profile data are given

  4. Probabilistic Fatigue Analysis of Jacket Support Structures for Offshore Wind Turbines Exemplified on Tubular Joints

    OpenAIRE

    Kelma, Sebastian; Schaumann, Peter

    2015-01-01

    The design of offshore wind turbines is usually based on the semi-probabilistic safety concept. Using probabilistic methods, the aim is to find an advanced structural design of OWTs in order to improve safety and reduce costs. The probabilistic design is exemplified on tubular joints of a jacket substructure. Loads and resistance are considered by their respective probability distributions. Time series of loads are generated by fully-coupled numerical simulation of the offshore wind turbine. ...

  5. Duplicate Detection in Probabilistic Data

    NARCIS (Netherlands)

    Panse, Fabian; van Keulen, Maurice; de Keijzer, Ander; Ritter, Norbert

    2009-01-01

    Collected data often contains uncertainties. Probabilistic databases have been proposed to manage uncertain data. To combine data from multiple autonomous probabilistic databases, an integration of probabilistic data has to be performed. Until now, however, data integration approaches have focused

  6. Diagnostic performance of automated liquid culture and molecular line probe assay in smear-negative pulmonary tuberculosis.

    Science.gov (United States)

    Kotwal, Aarti; Biswas, Debasis; Raghuvanshi, Shailendra; Sindhwani, Girish; Kakati, Barnali; Sharma, Shweta

    2017-04-01

    The diagnosis of smear-negative pulmonary tuberculosis (PTB) is particularly challenging, and automated liquid culture and molecular line probe assays (LPA) may prove particularly useful. The objective of our study was to evaluate the diagnostic potential of automated liquid culture (ALC) technology and a commercial LPA in sputum smear-negative PTB suspects. Spot sputum samples were collected from 145 chest-symptomatic smear-negative patients and subjected to ALC, direct drug susceptibility testing (DST) and LPA, as per the manufacturers' instructions. A diagnostic yield of 26.2% was observed among sputum smear-negative TB suspects, with 47.4% of the culture isolates being either INH- and/or rifampicin-resistant. Complete agreement was observed between the results of the ALC assay and LPA except for two isolates which demonstrated sensitivity to INH and rifampicin on direct DST but were rifampicin-resistant by LPA. Two novel mutations were also detected among the multidrug-resistant isolates by LPA. In view of the diagnostic challenges associated with the diagnosis of TB in sputum smear-negative patients, our study demonstrates the applicability of ALC and LPA in establishing diagnostic evidence of TB.

  7. Polarization of Coronal Forbidden Lines

    Energy Technology Data Exchange (ETDEWEB)

    Li, Hao; Qu, Zhongquan [Yunnan Observatories, Chinese Academy of Sciences, Kunming, Yunnan 650011 (China); Landi Degl’Innocenti, Egidio, E-mail: sayahoro@ynao.ac.cn [Dipartimento di Astronomia e Scienza dello Spazio, Università di Firenze, Largo E. Fermi 2, I-50125 Firenze (Italy)

    2017-03-20

    Since the magnetic field is responsible for most manifestations of solar activity, one of the most challenging problems in solar physics is the diagnostics of solar magnetic fields, particularly in the outer atmosphere. To this end, it is important to develop rigorous diagnostic tools to interpret polarimetric observations in suitable spectral lines. This paper is devoted to analyzing the diagnostic content of linear polarization imaging observations in coronal forbidden lines. Although this technique is restricted to off-limb observations, it represents a significant tool to diagnose the magnetic field structure in the solar corona, where the magnetic field is intrinsically weak and still poorly known. We adopt the quantum theory of polarized line formation developed in the framework of the density matrix formalism, and synthesize images of the emergent linear polarization signal in coronal forbidden lines using potential-field source-surface magnetic field models. The influence of electronic collisions, active regions, and Thomson scattering on the linear polarization of coronal forbidden lines is also examined. It is found that active regions and Thomson scattering are capable of conspicuously influencing the orientation of the linear polarization. These effects have to be carefully taken into account to increase the accuracy of the field diagnostics. We also found that linear polarization observation in suitable lines can give valuable information on the long-term evolution of the magnetic field in the solar corona.

  8. A convergence theory for probabilistic metric spaces | Jäger ...

    African Journals Online (AJOL)

    We develop a theory of probabilistic convergence spaces based on Tardiff's neighbourhood systems for probabilistic metric spaces. We show that the resulting category is a topological universe and we characterize a subcategory that is isomorphic to the category of probabilistic metric spaces. Keywords: Probabilistic metric ...

  9. Probabilistic causality and radiogenic cancers

    International Nuclear Information System (INIS)

    Groeer, P.G.

    1986-01-01

    A review and scrutiny of the literature on probability and probabilistic causality shows that it is possible, under certain assumptions, to estimate the probability that a certain type of cancer diagnosed in an individual exposed to radiation prior to diagnosis was caused by this exposure. Diagnosis of this causal relationship, like the diagnosis of any disease, malignant or not, always requires some subjective judgments by the diagnostician. It is, therefore, illusory to believe that tables based on actuarial data can provide objective estimates of the chance that a cancer diagnosed in an individual is radiogenic. It is argued that such tables can only provide a base from which the diagnostician(s) deviate in one direction or the other according to individual (or consensual) judgment. Acceptance of a physician's diagnostic judgment by patients is commonplace. A similar widespread acceptance of expert judgment by claimants in radiation compensation cases does not presently exist. Judicious use of the present radioepidemiological tables prepared by the Working Group of the National Institutes of Health, or of updated future versions of similar tables, may improve the situation. 20 references

  10. Lessons learned on probabilistic methodology for precursor analyses

    Energy Technology Data Exchange (ETDEWEB)

    Babst, Siegfried [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) gGmbH, Berlin (Germany); Wielenberg, Andreas; Gaenssmantel, Gerhard [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) gGmbH, Garching (Germany)

    2016-11-15

    Based on its experience in precursor assessment of operating experience from German NPP and related international activities in the field, GRS has identified areas for enhancing probabilistic methodology. These are related to improving the completeness of PSA models, to insufficiencies in probabilistic assessment approaches, and to enhancements of precursor assessment methods. Three examples from the recent practice in precursor assessments illustrating relevant methodological insights are provided and discussed in more detail. Our experience reinforces the importance of having full scope, current PSA models up to Level 2 PSA and including hazard scenarios for precursor analysis. Our lessons learned include that PSA models should be regularly updated regarding CCF data and inclusion of newly discovered CCF mechanisms or groups. Moreover, precursor classification schemes should be extended to degradations and unavailabilities of the containment function. Finally, PSA and precursor assessments should put more emphasis on the consideration of passive provisions for safety, e. g. by sensitivity cases.

  11. Lessons learned on probabilistic methodology for precursor analyses

    International Nuclear Information System (INIS)

    Babst, Siegfried; Wielenberg, Andreas; Gaenssmantel, Gerhard

    2016-01-01

    Based on its experience in precursor assessment of operating experience from German NPP and related international activities in the field, GRS has identified areas for enhancing probabilistic methodology. These are related to improving the completeness of PSA models, to insufficiencies in probabilistic assessment approaches, and to enhancements of precursor assessment methods. Three examples from the recent practice in precursor assessments illustrating relevant methodological insights are provided and discussed in more detail. Our experience reinforces the importance of having full scope, current PSA models up to Level 2 PSA and including hazard scenarios for precursor analysis. Our lessons learned include that PSA models should be regularly updated regarding CCF data and inclusion of newly discovered CCF mechanisms or groups. Moreover, precursor classification schemes should be extended to degradations and unavailabilities of the containment function. Finally, PSA and precursor assessments should put more emphasis on the consideration of passive provisions for safety, e. g. by sensitivity cases.

  12. On Probabilistic Automata in Continuous Time

    DEFF Research Database (Denmark)

    Eisentraut, Christian; Hermanns, Holger; Zhang, Lijun

    2010-01-01

    We develop a compositional behavioural model that integrates a variation of probabilistic automata into a conservative extension of interactive Markov chains. The model is rich enough to embody the semantics of generalised stochastic Petri nets. We define strong and weak bisimulations and discuss...

  13. Convex sets in probabilistic normed spaces

    International Nuclear Information System (INIS)

    Aghajani, Asadollah; Nourouzi, Kourosh

    2008-01-01

    In this paper we obtain some results on convexity in a probabilistic normed space. We also investigate the concept of CSN-closedness and CSN-compactness in a probabilistic normed space and generalize the corresponding results of normed spaces

  14. Probabilistic Programming (Invited Talk)

    OpenAIRE

    Yang, Hongseok

    2017-01-01

    Probabilistic programming refers to the idea of using standard programming constructs for specifying probabilistic models from machine learning and statistics, and employing generic inference algorithms for answering various queries on these models, such as posterior inference and estimation of model evidence. Although this idea itself is not new and was, in fact, explored by several programming-language and statistics researchers in the early 2000, it is only in the last few years that proba...

  15. Formalizing Probabilistic Safety Claims

    Science.gov (United States)

    Herencia-Zapana, Heber; Hagen, George E.; Narkawicz, Anthony J.

    2011-01-01

    A safety claim for a system is a statement that the system, which is subject to hazardous conditions, satisfies a given set of properties. Following work by John Rushby and Bev Littlewood, this paper presents a mathematical framework that can be used to state and formally prove probabilistic safety claims. It also enables hazardous conditions, their uncertainties, and their interactions to be integrated into the safety claim. This framework provides a formal description of the probabilistic composition of an arbitrary number of hazardous conditions and their effects on system behavior. An example is given of a probabilistic safety claim for a conflict detection algorithm for aircraft in a 2D airspace. The motivation for developing this mathematical framework is that it can be used in an automated theorem prover to formally verify safety claims.

  16. A Probabilistic Short-Term Water Demand Forecasting Model Based on the Markov Chain

    Directory of Open Access Journals (Sweden)

    Francesca Gagliardi

    2017-07-01

    Full Text Available This paper proposes a short-term water demand forecasting method based on the use of the Markov chain. The method provides estimates of future demands by calculating the probabilities that the future demand value will fall within pre-assigned intervals covering the expected total variability. More specifically, two models based on homogeneous and non-homogeneous Markov chains were developed and presented. These models, together with two benchmark models (based on artificial neural network and naïve methods), were applied to three real-life case studies for the purpose of forecasting the respective water demands from 1 to 24 h ahead. The results obtained show that the model based on a homogeneous Markov chain provides more accurate short-term forecasts than the one based on a non-homogeneous Markov chain, which is in line with the artificial neural network model. Both Markov chain models enable probabilistic information regarding the stochastic demand forecast to be easily obtained.
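
    A minimal homogeneous-chain sketch of the approach: bin the demand series into states, estimate the transition matrix from observed counts, and read off the probability of each demand interval one step ahead. The demand series below is synthetic:

```python
# One-step-ahead interval probabilities from a homogeneous Markov chain.
import numpy as np

rng = np.random.default_rng(11)
demand = (50 + 10 * np.sin(np.linspace(0, 20 * np.pi, 2000))
          + rng.normal(0, 2, 2000))

n_states = 8
edges = np.quantile(demand, np.linspace(0, 1, n_states + 1))
states = np.clip(np.searchsorted(edges, demand, side="right") - 1,
                 0, n_states - 1)

P = np.zeros((n_states, n_states))        # transition counts -> probabilities
for s, t in zip(states[:-1], states[1:]):
    P[s, t] += 1
P /= np.maximum(P.sum(axis=1, keepdims=True), 1)

current = states[-1]
print("P(next demand interval | current state):")
for k, p in enumerate(P[current]):
    print(f"  [{edges[k]:6.1f}, {edges[k+1]:6.1f}) : {p:.2f}")
```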

  17. Probabilistic programming in Python using PyMC3

    Directory of Open Access Journals (Sweden)

    John Salvatier

    2016-04-01

    Full Text Available Probabilistic programming allows for automatic Bayesian inference on user-defined probabilistic models. Recent advances in Markov chain Monte Carlo (MCMC) sampling allow inference on increasingly complex models. This class of MCMC, known as Hamiltonian Monte Carlo, requires gradient information which is often not readily available. PyMC3 is a new open source probabilistic programming framework written in Python that uses Theano to compute gradients via automatic differentiation, as well as to compile probabilistic programs on the fly to C for increased speed. Contrary to other probabilistic programming languages, PyMC3 allows model specification directly in Python code. The lack of a domain-specific language allows for great flexibility and direct interaction with the model. This paper is a tutorial-style introduction to this software package.
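
    A standard PyMC3 usage pattern of the kind the paper introduces: the model is declared inside a context manager directly in Python, and gradient-based MCMC runs automatically. Data and priors are invented; keyword names follow recent PyMC3 releases:

```python
# Bayesian estimation of a normal mean and scale with PyMC3.
import numpy as np
import pymc3 as pm

data = np.random.default_rng(0).normal(1.0, 2.0, size=100)

with pm.Model():
    mu = pm.Normal("mu", mu=0.0, sigma=10.0)       # weakly informative priors
    sigma = pm.HalfNormal("sigma", sigma=5.0)
    pm.Normal("obs", mu=mu, sigma=sigma, observed=data)
    trace = pm.sample(1000, tune=1000, cores=1)    # NUTS selected automatically

print(pm.summary(trace))
```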

  18. Conditional Probabilistic Population Forecasting

    OpenAIRE

    Sanderson, Warren C.; Scherbov, Sergei; O'Neill, Brian C.; Lutz, Wolfgang

    2004-01-01

    Since policy-makers often prefer to think in terms of alternative scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy-makers because...

  19. Arbitrage and Hedging in a non probabilistic framework

    OpenAIRE

    Alvarez, Alexander; Ferrando, Sebastian; Olivares, Pablo

    2011-01-01

    The paper studies the concepts of hedging and arbitrage in a non-probabilistic framework. It provides conditions for non-probabilistic arbitrage based on the topological structure of the trajectory space and makes connections with the usual notion of arbitrage. Several examples illustrate the non-probabilistic arbitrage as well as perfect replication of options under continuous and discontinuous trajectories; the results can then be applied in probabilistic models path by path. The approach is r...

  20. New receiving line for the remote-steering antenna of the 140 GHz CTS diagnostics in the FTU Tokamak

    Science.gov (United States)

    D'Arcangelo, O.; Bin, W.; Bruschi, A.; Cappelli, M.; Fanale, F.; Gittini, G.; Pallotta, F.; Rocchi, G.; Tudisco, O.; Garavaglia, S.; Granucci, G.; Moro, A.; Tuccillo, A. A.

    2018-01-01

    A new receiving antenna for collecting signals of the Collective Thomson Scattering (CTS) diagnostic in the FTU tokamak has recently been installed. The square corrugated section and the precisely defined length make it possible to receive from different directions by remotely steering the receiving mirrors. This type of Remote-Steering (RS) antenna, being studied on FTU for the DEMO Electron Cyclotron Heating (ECH) system launch, is already installed on the W7-X stellarator and will be tested in the next campaign. The transmission of the signal from the antenna in the tokamak hall to the CTS diagnostics hall will mainly be realized by means of oversized circular corrugated waveguides carrying the hybrid HE11 (quasi-Gaussian) waveguide mode, with the inclusion of a special smooth-waveguide section and a short run of reduced-size square-corrugated waveguide through the tokamak bio-shield. The coupling between different waveguide types is made with ellipsoidal focusing mirrors, using quasi-optical matching formulas between the Gaussian-shaped beams at the input and output of the waveguides. In this work, after a complete feasibility study of the overall line, a design for the receiving line is proposed, in order to produce an executive layout to be used as a guideline for the commissioning phase.

  1. Probabilistic Harmonic Modeling of Wind Power Plants

    DEFF Research Database (Denmark)

    Guest, Emerson; Jensen, Kim H.; Rasmussen, Tonny Wederberg

    2017-01-01

    A probabilistic sequence domain (SD) harmonic model of a grid-connected voltage-source converter is used to estimate harmonic emissions in a wind power plant (WPP) comprised of Type-IV wind turbines. The SD representation naturally partitioned converter-generated voltage harmonics into those...... with deterministic phase and those with probabilistic phase. A case study performed on a string of ten 3 MW Type-IV wind turbines implemented in PSCAD was used to verify the probabilistic SD harmonic model. The probabilistic SD harmonic model can be employed in the planning phase of WPP projects to assess harmonic...

  2. Probabilistic wind power forecasting based on logarithmic transformation and boundary kernel

    International Nuclear Information System (INIS)

    Zhang, Yao; Wang, Jianxue; Luo, Xu

    2015-01-01

    Highlights: • Quantitative information on the uncertainty of wind power generation. • Kernel density estimator provides non-Gaussian predictive distributions. • Logarithmic transformation reduces the skewness of wind power density. • Boundary kernel method eliminates the density leakage near the boundary. - Abstract: Probabilistic wind power forecasting not only produces the expectation of wind power output, but also gives quantitative information on the associated uncertainty, which is essential for making better decisions about power system and market operations with the increasing penetration of wind power generation. This paper presents a novel kernel density estimator for probabilistic wind power forecasting, addressing two characteristics of wind power which have adverse impacts on forecast accuracy, namely the heavily skewed and double-bounded nature of wind power density. A logarithmic transformation is used to reduce the skewness of the wind power density, which improves the effectiveness of the kernel density estimator in the transformed scale. The transformation partially relieves the boundary effect problem of the kernel density estimator caused by the double-bounded nature of wind power density. However, the case study shows that there are still serious problems of density leakage after the transformation. In order to solve this problem in the transformed scale, a boundary kernel method is employed to eliminate the density leakage at the bounds of the wind power distribution. The improvement of the proposed method over the standard kernel density estimator is demonstrated by short-term probabilistic forecasting results based on data from an actual wind farm. Then, a detailed comparison is carried out between the proposed method and some existing probabilistic forecasting methods
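
    The logarithmic-transformation step can be sketched in a few lines: estimate the density in log space, then map it back with the Jacobian 1/x. The boundary-kernel correction at the upper (rated capacity) bound proposed in the paper is omitted here, and the data are synthetic:

```python
# Kernel density estimation of a skewed, lower-bounded variable via a
# log transform; f(x) = g(log x) / x for the log-space density g.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(2)
power = rng.weibull(1.5, size=2000) * 10.0   # skewed, bounded below by 0

z = np.log(power + 1e-6)                     # transform reduces skewness
kde_z = gaussian_kde(z)

def density(x):
    x = np.maximum(np.asarray(x, float), 1e-6)
    return kde_z(np.log(x)) / x              # back-transform with Jacobian

print(density(np.linspace(0.5, 40.0, 5)))
```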

  3. Probabilistic numerics and uncertainty in computations.

    Science.gov (United States)

    Hennig, Philipp; Osborne, Michael A; Girolami, Mark

    2015-07-08

    We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data has led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations.

  4. Conditional Probabilistic Population Forecasting

    OpenAIRE

    Sanderson, W.C.; Scherbov, S.; O'Neill, B.C.; Lutz, W.

    2003-01-01

    Since policy makers often prefer to think in terms of scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy makers because it allows them to answer "what if"...

  5. Conditional probabilistic population forecasting

    OpenAIRE

    Sanderson, Warren; Scherbov, Sergei; O'Neill, Brian; Lutz, Wolfgang

    2003-01-01

    Since policy-makers often prefer to think in terms of alternative scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy-makers because it allows them...

  6. The choice between two designs for the safety-injection system of a pressurized-water reactor, using probabilistic methods

    International Nuclear Information System (INIS)

    Villemeur, Alain

    1982-01-01

    A probabilistic study has been carried out to compare two designs for the safety-injection circuit of a pressurized-water reactor. It appears that the unavailability of the circuit after an accident involving loss of coolant decreases little when one moves from a 2-line to a 3-line system. These results are compared with the disadvantages arising from increased redundancy, in particular the increased cost of the installations. The 2-line circuit appears to be the optimal one on the basis of cost and reliability criteria. It has been chosen for the 1300-MWe units.
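
    The weak benefit of extra redundancy reported above is typically driven by common-cause failures; a back-of-the-envelope Python sketch with a beta-factor model (all numbers invented, not taken from the study) shows the effect.

```python
# Beta-factor common-cause model: a fraction beta of line failures disables
# all lines at once; the rest fail independently. All numbers are assumed.
q, beta = 1e-2, 0.05          # per-line unavailability, common-cause fraction
q_ccf = beta * q              # simultaneous failure of every line
q_ind = (1.0 - beta) * q      # independent per-line part

u2 = q_ccf + q_ind ** 2       # 2-line system, 1-of-2 success criterion assumed
u3 = q_ccf + q_ind ** 3       # 3-line system, 1-of-3 success criterion assumed

print(f"2-line unavailability ~ {u2:.2e}")
print(f"3-line unavailability ~ {u3:.2e}")  # the common-cause term dominates
```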

  7. Delineating probabilistic species pools in ecology and biogeography

    OpenAIRE

    Karger, Dirk Nikolaus; Cord, Anna F; Kessler, Michael; Kreft, Holger; Kühn, Ingolf; Pompe, Sven; Sandel, Brody; Sarmento Cabral, Juliano; Smith, Adam B; Svenning, Jens-Christian; Tuomisto, Hanna; Weigelt, Patrick; Wesche, Karsten

    2016-01-01

    Aim To provide a mechanistic and probabilistic framework for defining the species pool based on species-specific probabilities of dispersal, environmental suitability and biotic interactions within a specific temporal extent, and to show how probabilistic species pools can help disentangle the geographical structure of different community assembly processes. Innovation Probabilistic species pools provide an improved species pool definition based on probabilities in conjunction...

  8. Visualizing Uncertainty for Probabilistic Weather Forecasting based on Reforecast Analogs

    Science.gov (United States)

    Pelorosso, Leandro; Diehl, Alexandra; Matković, Krešimir; Delrieux, Claudio; Ruiz, Juan; Gröeller, M. Eduard; Bruckner, Stefan

    2016-04-01

    Numerical weather forecasts are prone to uncertainty coming from inaccuracies in the initial and boundary conditions and lack of precision in numerical models. Ensembles of forecasts partially address these problems by considering several runs of the numerical model. Each forecast is generated with different initial and boundary conditions and different model configurations [GR05]. The ensembles can be expressed as probabilistic forecasts, which have proven to be very effective in decision-making processes [DE06]. The ensemble of forecasts represents only some of the possible future atmospheric states, usually underestimating the degree of uncertainty in the predictions [KAL03, PH06]. Hamill and Whitaker [HW06] introduced the "Reforecast Analog Regression" (RAR) technique to overcome the limitations of ensemble forecasting. This technique produces probabilistic predictions based on the analysis of historical forecasts and observations. Visual analytics provides tools for processing, visualizing, and exploring data to get new insights and discover hidden information patterns in an interactive exchange between the user and the application [KMS08]. In this work, we introduce Albero, a visual analytics solution for probabilistic weather forecasting based on the RAR technique. Albero targets at least two different types of users: "forecasters", who are meteorologists working in operational weather forecasting, and "researchers", who work on the construction of numerical prediction models. Albero is an efficient tool for analyzing precipitation forecasts, allowing forecasters to make and communicate quick decisions. Our solution facilitates the analysis of a set of probabilistic forecasts, associated statistical data, observations and uncertainty. A dashboard with small-multiples of probabilistic forecasts allows the forecasters to analyze at a glance the distribution of probabilities as a function of time, space, and magnitude. It provides the user with a more...

  9. A probabilistic Hu-Washizu variational principle

    Science.gov (United States)

    Liu, W. K.; Belytschko, T.; Besterfield, G. H.

    1987-01-01

    A Probabilistic Hu-Washizu Variational Principle (PHWVP) for the Probabilistic Finite Element Method (PFEM) is presented. This formulation is developed for both linear and nonlinear elasticity. The PHWVP allows incorporation of the probabilistic distributions for the constitutive law, compatibility condition, equilibrium, domain and boundary conditions into the PFEM. Thus, a complete probabilistic analysis can be performed where all aspects of the problem are treated as random variables and/or fields. The Hu-Washizu variational formulation is available in many conventional finite element codes thereby enabling the straightforward inclusion of the probabilistic features into present codes.

  10. Simultaneous-Fault Diagnosis of Gearboxes Using Probabilistic Committee Machine

    Science.gov (United States)

    Zhong, Jian-Hua; Wong, Pak Kin; Yang, Zhi-Xin

    2016-01-01

    This study combines signal de-noising, feature extraction, two pairwise-coupled relevance vector machines (PCRVMs) and particle swarm optimization (PSO) for parameter optimization to form an intelligent diagnostic framework for gearbox fault detection. Firstly, the sensor signals are de-noised using the wavelet threshold method to lower the noise level. Then, the Hilbert-Huang transform (HHT) and energy pattern calculation are applied to extract the fault features from the de-noised signals. After that, an eleven-dimensional vector, which consists of the energies of nine intrinsic mode functions (IMFs), the maximum value of the HHT marginal spectrum and its corresponding frequency component, is obtained to represent the features of each gearbox fault. The two PCRVMs serve as two different fault detection committee members, and they are trained by using vibration and sound signals, respectively. The individual diagnostic result from each committee member is then combined by applying a new probabilistic ensemble method, which can improve the overall diagnostic accuracy and increase the number of detectable faults as compared to individual classifiers acting alone. The effectiveness of the proposed framework is experimentally verified by using test cases. The experimental results show the proposed framework is superior to existing single classifiers in terms of diagnostic accuracies for both single- and simultaneous-faults in the gearbox. PMID:26848665

  11. Galaxy emission line classification using three-dimensional line ratio diagrams

    Energy Technology Data Exchange (ETDEWEB)

    Vogt, Frédéric P. A.; Dopita, Michael A.; Kewley, Lisa J.; Sutherland, Ralph S. [Research School of Astronomy and Astrophysics, Australian National University, Canberra, ACT 2611 (Australia); Scharwächter, Julia [Observatoire de Paris, LERMA (CNRS: UMR8112), 61 Av. de l' Observatoire, F-75014 Paris (France); Basurah, Hassan M.; Ali, Alaa; Amer, Morsi A., E-mail: frederic.vogt@anu.edu.au [Astronomy Department, King Abdulaziz University, P.O. Box 80203, Jeddah (Saudi Arabia)

    2014-10-01

    Two-dimensional (2D) line ratio diagnostic diagrams have become a key tool in understanding the excitation mechanisms of galaxies. The curves used to separate the different regions—H II-like or excited by an active galactic nucleus (AGN)—have been refined over time but the core technique has not evolved significantly. However, the classification of galaxies based on their emission line ratios really is a multi-dimensional problem. Here we exploit recent software developments to explore the potential of three-dimensional (3D) line ratio diagnostic diagrams. We introduce the ZQE diagrams, which are a specific set of 3D diagrams that separate the oxygen abundance and the ionization parameter of H II region-like spectra and also enable us to probe the excitation mechanism of the gas. By examining these new 3D spaces interactively, we define the ZE diagnostics, a new set of 2D diagnostics that can provide the metallicity of objects excited by hot young stars and that cleanly separate H II region-like objects from the different classes of AGNs. We show that these ZE diagnostics are consistent with the key log [N II]/Hα versus log [O III]/Hβ diagnostic currently used by the community. They also have the advantage of attaching a probability that a given object belongs to one class or the other. Finally, we discuss briefly why ZQE diagrams can provide a new way to differentiate and study the different classes of AGNs in anticipation of a dedicated follow-up study.

  12. A logic for inductive probabilistic reasoning

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2005-01-01

    Inductive probabilistic reasoning is understood as the application of inference patterns that use statistical background information to assign (subjective) probabilities to single events. The simplest such inference pattern is direct inference: from "70% of As are Bs" and "a is an A" infer...... that a is a B with probability 0.7. Direct inference is generalized by Jeffrey's rule and the principle of cross-entropy minimization. To adequately formalize inductive probabilistic reasoning is an interesting topic for artificial intelligence, as an autonomous system acting in a complex environment may have...... to base its actions on a probabilistic model of its environment, and the probabilities needed to form this model can often be obtained by combining statistical background information with particular observations made, i.e., by inductive probabilistic reasoning. In this paper a formal framework

  13. Probabilistic Wind Power Ramp Forecasting Based on a Scenario Generation Method

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Qin [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Florita, Anthony R [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Krishnan, Venkat K [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hodge, Brian S [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Cui, Mingjian [University of Texas at Dallas; Feng, Cong [University of Texas at Dallas; Wang, Zhenke [University of Texas at Dallas; Zhang, Jie [University of Texas at Dallas

    2018-02-01

    Wind power ramps (WPRs) are particularly important in the management and dispatch of wind power and currently drawing the attention of balancing authorities. With the aim to reduce the impact of WPRs for power system operations, this paper develops a probabilistic ramp forecasting method based on a large number of simulated scenarios. An ensemble machine learning technique is first adopted to forecast the basic wind power forecasting scenario and calculate the historical forecasting errors. A continuous Gaussian mixture model (GMM) is used to fit the probability distribution function (PDF) of forecasting errors. The cumulative distribution function (CDF) is analytically deduced. The inverse transform method based on Monte Carlo sampling and the CDF is used to generate a massive number of forecasting error scenarios. An optimized swinging door algorithm is adopted to extract all the WPRs from the complete set of wind power forecasting scenarios. The probabilistic forecasting results of ramp duration and start-time are generated based on all scenarios. Numerical simulations on publicly available wind power data show that within a predefined tolerance level, the developed probabilistic wind power ramp forecasting method is able to predict WPRs with a high level of sharpness and accuracy.
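
    The error-scenario step of the method can be sketched with scikit-learn; the synthetic error sample, component count and point forecast below are assumptions, and GaussianMixture.sample() plays the role of the inverse-transform Monte Carlo step.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(2)

# Toy stand-in for historical wind power forecasting errors (p.u. of capacity)
errors = np.concatenate([rng.normal(-0.05, 0.03, 3000),
                         rng.normal(0.08, 0.05, 1000)]).reshape(-1, 1)

# Fit a continuous GMM to the error distribution (3 components assumed)
gmm = GaussianMixture(n_components=3, random_state=0).fit(errors)

# Draw a large number of error scenarios from the fitted mixture
scen_err, _ = gmm.sample(n_samples=5000)

# Superimpose errors on a basic (point) forecast to get power scenarios,
# from which ramps would then be extracted (e.g. by a swinging door algorithm)
point_forecast = 0.6  # p.u., hypothetical
scenarios = np.clip(point_forecast + scen_err.ravel(), 0.0, 1.0)
print("scenario 5/50/95 percentiles:", np.percentile(scenarios, [5, 50, 95]))
```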

  14. Probabilistic analysis of flaw distribution on structure under cyclic load

    International Nuclear Information System (INIS)

    Kwak, Sang Log; Choi, Young Hwan; Kim, Hho Jung

    2003-01-01

    Flaw geometries, applied stress, and material properties are major input variables for fracture mechanics analysis. A probabilistic approach can be applied to account for the uncertainties in these input variables, but probabilistic analysis requires many assumptions due to the lack of initial flaw distribution data. In this study, correlations are examined between initial flaw distributions and in-service flaw distributions on structures under cyclic load. For the analysis, LEFM theories and Monte Carlo simulation are applied. Results show that in-service flaw distributions are determined by initial flaw distributions rather than by the fatigue crack growth rate. Thus, initial flaw distributions can be derived from in-service flaw distributions
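
    The link between initial and in-service flaw distributions can be illustrated with a Paris-law Monte Carlo sketch in Python; the initial flaw distribution, Paris constants, stress range and cycle count are all assumed values, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 10_000

a0 = rng.lognormal(np.log(0.5), 0.4, n)  # initial flaw depths (mm), assumed
C, m = 1e-11, 3.0                        # Paris law da/dN = C*(dK)^m, assumed
dS, Y = 100.0, 1.12                      # stress range (MPa), geometry factor
n_cycles, dN = 200_000, 1_000            # service cycles, integration step

a = a0.copy()
for _ in range(n_cycles // dN):
    dK = Y * dS * np.sqrt(np.pi * a * 1e-3)  # MPa*sqrt(m); depth in metres
    a += C * dK ** m * dN * 1e3              # growth increment back in mm

# Compare the simulated in-service distribution with the initial one
print("initial    mean / 95th pct:", a0.mean(), np.percentile(a0, 95))
print("in-service mean / 95th pct:", a.mean(), np.percentile(a, 95))
```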

  15. Probabilistic soft sets and dual probabilistic soft sets in decision making with positive and negative parameters

    Science.gov (United States)

    Fatimah, F.; Rosadi, D.; Hakim, R. B. F.

    2018-03-01

    In this paper, we motivate and introduce probabilistic soft sets and dual probabilistic soft sets for handling decision making problem in the presence of positive and negative parameters. We propose several types of algorithms related to this problem. Our procedures are flexible and adaptable. An example on real data is also given.

  16. Probabilistic Logic and Probabilistic Networks

    NARCIS (Netherlands)

    Haenni, R.; Romeijn, J.-W.; Wheeler, G.; Williamson, J.

    2009-01-01

    While in principle probabilistic logics might be applied to solve a range of problems, in practice they are rarely applied at present. This is perhaps because they seem disparate, complicated, and computationally intractable. However, we shall argue in this programmatic paper that several approaches

  17. On-line Monitoring System for Power Transformers

    Directory of Open Access Journals (Sweden)

    Alexandru HOTEA

    2016-12-01

    Power transformers are the most important and expensive equipment in the electricity transmission system, so it is very important to know the real state of health of such equipment at every moment. Accidental de-energization of a power transformer due to internal defects can generate high costs. In many cases, annual maintenance has proved ineffective at determining the internal condition of the equipment because some faults evolve rapidly. An on-line monitoring system for power transformers supports real-time condition assessment and detects faults early enough to take action to eliminate or minimize them. After an abnormality is detected, it is still important to perform full diagnostic tests to determine the exact condition of the equipment. On-line monitoring systems can help increase the availability and reliability of power transformers and lower the costs of accidental interruptions. This paper presents case studies on several power transformers equipped with on-line monitoring systems in Transelectrica substations.

  18. A New Diagnostic Diagram of Ionization Sources for High-redshift Emission Line Galaxies

    Science.gov (United States)

    Zhang, Kai; Hao, Lei

    2018-04-01

    We propose a new diagram, the kinematics–excitation (KEx) diagram, which uses the [O III] λ5007/Hβ line ratio and the [O III] λ5007 emission line width (σ[O III]) to diagnose the ionization source and physical properties of active galactic nuclei (AGNs) and star-forming galaxies (SFGs). The KEx diagram is a suitable tool to classify emission line galaxies at intermediate redshift because it uses only the [O III] λ5007 and Hβ emission lines. We use the main galaxy sample of SDSS DR7 and the Baldwin-Phillips-Terlevich (BPT) diagnostic to calibrate the diagram at low redshift. The diagram can be divided into three regions: the KEx-AGN region, which consists mainly of pure AGNs, the KEx-composite region, which is dominated by composite galaxies, and the KEx-SFG region, which contains mostly SFGs. LINERs strongly overlap with the composite and AGN regions. AGNs are separated from SFGs in this diagram mainly because they preferentially reside in luminous and massive galaxies and have higher [O III]/Hβ than SFGs. The separation between AGNs and SFGs is even cleaner thanks to the additional 0.15/0.12 dex offset in σ[O III] at fixed luminosity/stellar mass. We apply the KEx diagram to 7866 galaxies at 0.3 < z < 1 from the DEEP2 Galaxy Redshift Survey, and compare it to an independent X-ray classification scheme using Chandra observations. X-ray AGNs are mostly located in the KEx-AGN region, while X-ray SFGs are mostly located in the KEx-SFG region. Almost all Type 1 AGNs lie in the KEx-AGN region. These tests support the reliability of this classification diagram for emission line galaxies at intermediate redshift. At z ∼ 2, the demarcation line between SFGs and AGNs is shifted by ∼0.3 dex toward higher values of σ[O III] due to evolution effects.

  19. Suppression of panel flutter of near-space aircraft based on non-probabilistic reliability theory

    Directory of Open Access Journals (Sweden)

    Ye-Wei Zhang

    2016-03-01

    The active vibration control of composite panels with uncertain parameters in hypersonic flow is studied using non-probabilistic reliability theory. Using piezoelectric patches as active control actuators, the dynamic equations of the panel are established by the finite element method and Hamilton's principle, and a control model of the panel with uncertain parameters is obtained. Based on H∞ robust control theory and non-probabilistic reliability theory, a non-probabilistic reliability performance function is defined in terms of the non-probabilistic reliability index. Moreover, the relationships between the robust controller, the H∞ performance index and reliability are established. Numerical results show that the control method, under the influence of reliability, the H∞ performance index and approaching velocity, is effective for vibration suppression of the panel over the whole interval of uncertain parameters.

  20. Probabilistic Modeling of Graded Timber Material Properties

    DEFF Research Database (Denmark)

    Faber, M. H.; Köhler, J.; Sørensen, John Dalsgaard

    2004-01-01

    The probabilistic modeling of timber material characteristics is considered with special emphasis on the modeling of the effect of different quality control and selection procedures used as means for quality grading in the production line. It is shown how statistical models may be established...... on the basis of the same type of information which is normally collected as a part of the quality control procedures and furthermore, how the efficiency of different control procedures may be quantified and compared. The tail behavior of the probability distributions of timber material characteristics plays...... such that they may readily be applied in structural reliability analysis and their format appears to be appropriate for codification purposes of quality control and selection for grading procedures....

  1. Probabilistic programmable quantum processors

    International Nuclear Information System (INIS)

    Buzek, V.; Ziman, M.; Hillery, M.

    2004-01-01

    We analyze how to improve the performance of probabilistic programmable quantum processors. We show how the probability of success of a probabilistic processor can be enhanced by using the processor in loops. In addition, we show that an arbitrary SU(2) transformation of a qubit can be encoded in the program state of a universal programmable probabilistic quantum processor. The probability of success of this processor can be enhanced by a systematic correction of errors via conditional loops. Finally, we show that all our results can also be generalized to qudits. (Abstract Copyright [2004], Wiley Periodicals, Inc.)

  2. Searching Algorithms Implemented on Probabilistic Systolic Arrays

    Czech Academy of Sciences Publication Activity Database

    Kramosil, Ivan

    1996-01-01

    Roč. 25, č. 1 (1996), s. 7-45 ISSN 0308-1079 R&D Projects: GA ČR GA201/93/0781 Keywords: searching algorithms * probabilistic algorithms * systolic arrays * parallel algorithms Impact factor: 0.214, year: 1996

  3. PROBABILISTIC RELATIONAL MODELS OF COMPLETE IL-SEMIRINGS

    OpenAIRE

    Tsumagari, Norihiro

    2012-01-01

    This paper studies basic properties of probabilistic multirelations, which generalize the semantic domain of probabilistic systems, and then provides two probabilistic models of complete IL-semirings using probabilistic multirelations. It is also shown that these models need not be models of complete idempotent semirings.

  4. Probabilistic reasoning in data analysis.

    Science.gov (United States)

    Sirovich, Lawrence

    2011-09-20

    This Teaching Resource provides lecture notes, slides, and a student assignment for a lecture on probabilistic reasoning in the analysis of biological data. General probabilistic frameworks are introduced, and a number of standard probability distributions are described using simple intuitive ideas. Particular attention is focused on random arrivals that are independent of prior history (Markovian events), with an emphasis on waiting times, Poisson processes, and Poisson probability distributions. The use of these various probability distributions is applied to biomedical problems, including several classic experimental studies.
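
    A tiny simulation in Python makes the Markovian point of the lecture concrete: exponential waiting times and Poisson counts are two views of the same arrival process (the rate and horizon below are chosen arbitrarily).

```python
import numpy as np

rng = np.random.default_rng(4)
rate, t_end = 2.0, 10_000.0   # arrivals per unit time; observation window

# Memoryless arrivals: waiting times are exponential with mean 1/rate
waits = rng.exponential(1.0 / rate, size=int(2 * rate * t_end))
times = np.cumsum(waits)
times = times[times < t_end]

# Counts in unit-length windows then follow a Poisson(rate) distribution,
# whose mean and variance are both equal to the rate
counts, _ = np.histogram(times, bins=np.arange(0.0, t_end + 1.0))
print("count mean / variance:", counts.mean(), counts.var())   # both ~ 2.0
print("mean waiting time    :", waits.mean(), "~", 1.0 / rate)
```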

  5. Probabilistic finite elements

    Science.gov (United States)

    Belytschko, Ted; Wing, Kam Liu

    1987-01-01

    In the Probabilistic Finite Element Method (PFEM), finite element methods have been efficiently combined with second-order perturbation techniques to provide an effective method for informing the designer of the range of response which is likely in a given problem. The designer must provide as input the statistical character of the input variables, such as yield strength, load magnitude, and Young's modulus, by specifying their mean values and their variances. The output then consists of the mean response and the variance in the response. Thus the designer is given a much broader picture of the predicted performance than with simply a single response curve. These methods are applicable to a wide class of problems, provided that the scale of randomness is not too large and the probability density functions possess decaying tails. By incorporating the computational techniques we have developed in the past 3 years for efficiency, the probabilistic finite element methods are capable of handling large systems with many sources of uncertainty. Sample results are given for an elastic-plastic ten-bar structure and an elastic-plastic plane continuum with a circular hole subject to cyclic loading, with the yield stress modeled as a random field.
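
    The mean-and-variance output of the perturbation approach can be shown on the simplest possible "finite element", a single bar with u = FL/(EA); the statistics of load and modulus below are invented for illustration.

```python
import numpy as np

# First-order second-moment propagation for a one-element bar, u = F*L/(E*A)
F_m, L, A = 10e3, 2.0, 1e-3             # mean load (N), length (m), area (m^2)
E_m = 210e9                             # mean Young's modulus (Pa)
var_F, var_E = (1e3) ** 2, (10e9) ** 2  # input variances (assumed)

u_mean = F_m * L / (E_m * A)            # mean (zeroth-order) response

# First-order sensitivities evaluated at the mean point
du_dF = L / (E_m * A)
du_dE = -F_m * L / (E_m ** 2 * A)

# Second-moment propagation, inputs assumed independent
u_var = du_dF ** 2 * var_F + du_dE ** 2 * var_E
print(f"mean displacement {u_mean:.3e} m, std {np.sqrt(u_var):.3e} m")
```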

  6. A new perspective into root-cause analysis and diagnostics

    International Nuclear Information System (INIS)

    Kim, Inn Seock; Kim, Tae Kwon; Kim, Min Chull

    1998-01-01

    A critical review of diagnostic and root-cause analysis methods developed in the nuclear, chemical process, and aviation industries was made. Based on this review, insights into both off-line and on-line diagnostics, as well as root-cause analysis, are presented from a new perspective. This perspective may be applied for various purposes, including real-time on-line process diagnosis, root-cause analysis of reactor scrams, diagnosis of severe accidents, or situation identification during an on-going emergency at a nuclear site

  7. Probabilistic Design

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Burcharth, H. F.

    This chapter describes how partial safety factors can be used in design of vertical wall breakwaters and an example of a code format is presented. The partial safety factors are calibrated on a probabilistic basis. The code calibration process used to calibrate some of the partial safety factors...

  8. Consideration of aging in probabilistic safety assessment

    International Nuclear Information System (INIS)

    Titina, B.; Cepin, M.

    2007-01-01

    Probabilistic safety assessment is a standardised tool for the assessment of the safety of nuclear power plants. It is a complement to the safety analyses. Standard probabilistic models of safety equipment assume a constant component failure rate. Ageing of systems, structures and components can theoretically be included in a new age-dependent probabilistic safety assessment, which generally causes the failure rate to be a function of age. New age-dependent probabilistic safety assessment models, which offer explicit calculation of ageing effects, are developed. Several groups of components are considered, each requiring its own model: e.g. operating components and stand-by components. The developed component-level models are inserted into the models of the probabilistic safety assessment so that the ageing effects are evaluated for complete systems. The preliminary results show that the lack of the data necessary for the consideration of ageing leads to highly uncertain models and, consequently, results. (author)

  9. Is Probabilistic Evidence a Source of Knowledge?

    Science.gov (United States)

    Friedman, Ori; Turri, John

    2015-01-01

    We report a series of experiments examining whether people ascribe knowledge for true beliefs based on probabilistic evidence. Participants were less likely to ascribe knowledge for beliefs based on probabilistic evidence than for beliefs based on perceptual evidence (Experiments 1 and 2A) or testimony providing causal information (Experiment 2B).…

  10. Learning on probabilistic manifolds in massive fusion databases: Application to confinement regime identification

    International Nuclear Information System (INIS)

    Verdoolaege, Geert; Van Oost, Guido

    2012-01-01

    Highlights: ► We present an integrated framework for pattern recognition in fusion data. ► We model measurement uncertainty through an appropriate probability distribution. ► We use the geodesic distance on probabilistic manifolds as a similarity measure. ► We apply the framework to confinement mode classification. ► The classification accuracy benefits from uncertainty information and its geometry. - Abstract: We present an integrated framework for (real-time) pattern recognition in fusion data. The main premise is the inherent probabilistic nature of measurements of plasma quantities. We propose the geodesic distance on probabilistic manifolds as a similarity measure between data points. Substructure induced by data dependencies may further reduce the dimensionality and redundancy of the data set. We present an application to confinement mode classification, showing the distinct advantage obtained by considering the measurement uncertainty and its geometry.
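
    For measurements modelled as univariate Gaussians, the geodesic distance used above has a closed form via the hyperbolic half-plane; a short Python sketch with toy numbers follows (the specific measurement values are invented).

```python
import numpy as np

def fisher_rao_gaussian(mu1, s1, mu2, s2):
    """Fisher-Rao geodesic distance between N(mu1, s1^2) and N(mu2, s2^2).

    Uses the closed form obtained by mapping each Gaussian to the point
    (mu/sqrt(2), sigma) of the hyperbolic half-plane.
    """
    num = (mu1 - mu2) ** 2 / 2.0 + (s1 - s2) ** 2
    return np.sqrt(2.0) * np.arccosh(1.0 + num / (2.0 * s1 * s2))

# The distance between two noisy measurements of a plasma quantity depends
# on their error bars, not only on the difference of their means.
print(fisher_rao_gaussian(1.0, 0.10, 1.2, 0.10))
print(fisher_rao_gaussian(1.0, 0.10, 1.2, 0.50))
```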

  11. A comprehensive probabilistic analysis model of oil pipelines network based on Bayesian network

    Science.gov (United States)

    Zhang, C.; Qin, T. X.; Jiang, B.; Huang, C.

    2018-02-01

    Oil pipeline networks are among the most important facilities for energy transportation, but an oil pipeline network accident may result in serious disasters. Analysis models for these accidents have been established mainly on the basis of three methods: event trees, accident simulation and Bayesian networks. Among these, the Bayesian network is suitable for probabilistic analysis, but not all the important influencing factors have been considered, and a deployment rule for the factors has not been established. This paper proposes a probabilistic analysis model of oil pipeline networks based on a Bayesian network. Most of the important influencing factors, including the key environmental conditions and emergency response, are considered in this model. The paper also introduces a deployment rule for these factors. The model can be used in probabilistic analysis and sensitivity analysis of oil pipeline network accidents.
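
    A toy version of such a network can be written with the pgmpy library (class names as in recent pgmpy releases); the two-cause structure and every probability below are invented for illustration and are unrelated to the paper's factors.

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# Two failure drivers feeding a leak node (structure and numbers assumed)
model = BayesianNetwork([("Corrosion", "Leak"), ("ThirdParty", "Leak")])

cpd_cor = TabularCPD("Corrosion", 2, [[0.95], [0.05]])
cpd_tp = TabularCPD("ThirdParty", 2, [[0.98], [0.02]])
cpd_leak = TabularCPD(
    "Leak", 2,
    [[0.999, 0.90, 0.85, 0.50],   # P(no leak | parent states)
     [0.001, 0.10, 0.15, 0.50]],  # P(leak | parent states)
    evidence=["Corrosion", "ThirdParty"], evidence_card=[2, 2])
model.add_cpds(cpd_cor, cpd_tp, cpd_leak)
assert model.check_model()

infer = VariableElimination(model)
print(infer.query(["Leak"]).values)                             # prior leak risk
print(infer.query(["Corrosion"], evidence={"Leak": 1}).values)  # diagnosis
```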

  12. On-line determination of operating limits incorporating constraint costs and reliability assessment

    International Nuclear Information System (INIS)

    Meisingset, M.; Lovas, G. G.

    1997-01-01

    Problems regarding power system operation following deregulation are discussed. The problems arise from the changed power flow patterns created by deregulation and competitive power markets, which can push flows beyond the N-1 limit (the transfer capability that remains secure with any single line out of service) and thereby create bottlenecks. In such a situation, constraint costs and security costs (i.e. the cost of supply interruptions) are incurred as a direct result of the deterministic criteria used in reliability assessment. This paper describes an on-line probabilistic method to determine operating limits based on a trade-off between constraint costs and security costs. The probability of contingencies depends on the prevailing weather conditions, which therefore have a significant impact on the calculated operating limit. In consequence, the proposed method allows power flow to exceed the N-1 limit during normal weather, whereas under adverse weather conditions the N-1 criterion should be maintained. 15 refs., 13 figs
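
    The trade-off the record describes, constraint costs against weather-dependent security costs, can be caricatured in a few lines of Python; every number below (costs, outage probabilities, limits) is an assumption for illustration, not a value from the paper.

```python
import numpy as np

limits = np.linspace(800.0, 1200.0, 401)   # candidate operating limits (MW)
n_minus_1 = 1000.0                         # deterministic N-1 limit (MW)
desired = 1200.0                           # market-desired exchange (MW)

def expected_cost(limit, p_outage):
    constraint = 50.0 * np.maximum(desired - limit, 0.0)   # $/h of curtailment
    lost_load = np.maximum(limit - n_minus_1, 0.0)         # MW shed if a line trips
    security = p_outage * 20_000.0 * lost_load             # expected interruption cost
    return constraint + security

# Higher outage probability in adverse weather pulls the optimum back to N-1
for weather, p in [("normal", 1e-4), ("adverse", 5e-3)]:
    cost = expected_cost(limits, p)
    print(f"{weather:8s} weather -> optimal limit {limits[np.argmin(cost)]:.0f} MW")
```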

  13. Probabilistic methods in combinatorial analysis

    CERN Document Server

    Sachkov, Vladimir N

    2014-01-01

    This 1997 work explores the role of probabilistic methods for solving combinatorial problems. These methods not only provide the means of efficiently using such notions as characteristic and generating functions, the moment method and so on but also let us use the powerful technique of limit theorems. The basic objects under investigation are nonnegative matrices, partitions and mappings of finite sets, with special emphasis on permutations and graphs, and equivalence classes specified on sequences of finite length consisting of elements of partially ordered sets; these specify the probabilist

  14. Statistical Model Checking for Product Lines

    DEFF Research Database (Denmark)

    ter Beek, Maurice H.; Legay, Axel; Lluch Lafuente, Alberto

    2016-01-01

    average cost of products (in terms of the attributes of the products’ features) and the probability of features to be (un)installed at runtime. The product lines must be modelled in QFLan, which extends the probabilistic feature-oriented language PFLan with novel quantitative constraints among features...

  15. Implementation of condition-dependent probabilistic risk assessment using surveillance data on passive components

    International Nuclear Information System (INIS)

    Lewandowski, Radoslaw; Denning, Richard; Aldemir, Tunc; Zhang, Jinsuo

    2016-01-01

    Highlights: • Condition-dependent probabilistic risk assessment (PRA). • Time-dependent characterization of plant-specific risk. • Containment bypass involving FAC in secondary system piping and SCC in SG tubes. - Abstract: A great deal of surveillance data are collected for a nuclear power plant, reflecting the changing condition of the plant as it ages. Although surveillance data are used to determine failure probabilities of active components for the plant's probabilistic risk assessment (PRA) and to indicate the need for maintenance activities, they are not used in a structured manner to characterize the evolving risk of the plant. The present study explores the feasibility of using a condition-dependent PRA framework that takes a first-principles approach to modeling the progression of degradation mechanisms to characterize evolving risk, periodically adapting the model to account for surveillance results. A case study is described involving a potential containment bypass accident sequence due to the progression of flow-accelerated corrosion in secondary system piping and stress corrosion cracking of steam generator tubes. In this sequence, a steam line break accompanied by failure to close of a main steam isolation valve results in depressurization of the steam generator and induces the rupture of one or more faulted steam generator tubes. The case study indicates that a condition-dependent PRA framework might be capable of providing early identification of degradation mechanisms important to plant risk.

  16. Bisimulations meet PCTL equivalences for probabilistic automata

    DEFF Research Database (Denmark)

    Song, Lei; Zhang, Lijun; Godskesen, Jens Chr.

    2013-01-01

    Probabilistic automata (PAs) have been successfully applied in formal verification of concurrent and stochastic systems. Efficient model checking algorithms have been studied, where the most often used logics for expressing properties are based on probabilistic computation tree logic (PCTL) and its...

  17. Balmer line diagnostic of electron heating at collisionless shocks in supernova remnants

    International Nuclear Information System (INIS)

    Rakowski, C.

    2008-01-01

    The mechanism and extent of electron heating at collisionless shocks has recently been under intense investigation. Hα Balmer line emission is excited immediately behind the shock front and provides the best diagnostic for the electron-to-proton temperature ratio at supernova remnant shocks. Two components of emission are produced: a narrow component from electron and proton impact excitation of cold neutrals, and a broad component produced through charge exchange between the cold neutrals and the shock-heated protons. Thus the broad and narrow component fluxes reflect the competition between electron and proton impact ionization, electron and proton impact excitation, and charge exchange. This diagnostic has led to the discovery of an approximate inverse-square relationship between the electron-to-proton temperature ratio and the shock velocity. In turn, this implies a constant level of electron heating, independent of shock speed above ∼450 km/s. In this talk I will present the observational evidence to date. Time permitting, I will introduce how lower-hybrid waves in an extended cosmic ray precursor could explain such a relationship, and how this and other parameters in the Hα profile might relate to properties of cosmic rays and magnetic field amplification ahead of the shock. (author)

  18. 14th International Probabilistic Workshop

    CERN Document Server

    Taerwe, Luc; Proske, Dirk

    2017-01-01

    This book presents the proceedings of the 14th International Probabilistic Workshop that was held in Ghent, Belgium in December 2016. Probabilistic methods are currently of crucial importance for research and developments in the field of engineering, which face challenges presented by new materials and technologies and rapidly changing societal needs and values. Contemporary needs related to, for example, performance-based design, service-life design, life-cycle analysis, product optimization, assessment of existing structures and structural robustness give rise to new developments as well as accurate and practically applicable probabilistic and statistical engineering methods to support these developments. These proceedings are a valuable resource for anyone interested in contemporary developments in the field of probabilistic engineering applications.

  19. Probabilistic assessment of nuclear safety and safeguards

    International Nuclear Information System (INIS)

    Higson, D.J.

    1987-01-01

    Nuclear reactor accidents and diversions of materials from the nuclear fuel cycle are perceived by many people as particularly serious threats to society. Probabilistic assessment is a rational approach to the evaluation of both threats, and may provide a basis for decisions on appropriate actions to control them. Probabilistic methods have become standard tools used in the analysis of safety, but there are disagreements on the criteria to be applied when assessing the results of analysis. Probabilistic analysis and assessment of the effectiveness of nuclear material safeguards are still at an early stage of development. (author)

  20. Integrated Deterministic-Probabilistic Safety Assessment Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Kudinov, P.; Vorobyev, Y.; Sanchez-Perea, M.; Queral, C.; Jimenez Varas, G.; Rebollo, M. J.; Mena, L.; Gomez-Magin, J.

    2014-02-01

    IDPSA (Integrated Deterministic-Probabilistic Safety Assessment) is a family of methods which use tightly coupled probabilistic and deterministic approaches to address respective sources of uncertainties, enabling risk-informed decision making in a consistent manner. The starting point of the IDPSA framework is that safety justification must be based on the coupling of deterministic (consequences) and probabilistic (frequency) considerations to address the mutual interactions between stochastic disturbances (e.g. failures of the equipment, human actions, stochastic physical phenomena) and the deterministic response of the plant (i.e. transients). This paper gives a general overview of some IDPSA methods as well as some possible applications to PWR safety analyses. (Author)

  1. Optimization (Alara) and probabilistic exposures: the application of optimization criteria to the control of risks due to exposures of a probabilistic nature

    International Nuclear Information System (INIS)

    Gonzalez, A.J.

    1989-01-01

    The paper describes the application of the principles of optimization recommended by the International Commission on Radiological Protection (ICRP) to the restraint of radiation risks due to exposures that may or may not be incurred and to which a probability of occurrence can be assigned. After describing the concept of probabilistic exposures, it proposes a basis for a converging policy of control for both certain and probabilistic exposures, namely the dose-risk relationship adopted for radiation protection purposes. On that basis, some coherent approaches for dealing with probabilistic exposures, such as the limitation of individual risks, are discussed. The optimization of safety for reducing all risks from probabilistic exposures to as-low-as-reasonably-achievable (ALARA) levels is reviewed in full. The principles of optimization of protection are used as a basic framework, and the relevant factors to be taken into account when moving to probabilistic exposures are presented. The paper also reviews decision-aiding techniques suitable for performing optimization, with particular emphasis on the multi-attribute utility analysis technique. Finally, there is a discussion of some practical applications of decision-aiding multi-attribute utility analysis to probabilistic exposures, including the use of probabilistic utilities. In its final outlook, the paper emphasizes the need for standardization and solutions to generic problems if optimization of safety is to be successful

  2. Some probabilistic aspects of fracture

    International Nuclear Information System (INIS)

    Thomas, J.M.

    1982-01-01

    Some probabilistic aspects of fracture in structural and mechanical components are examined. The principles of fracture mechanics, material quality and inspection uncertainty are formulated into a conceptual and analytical framework for prediction of failure probability. The role of probabilistic fracture mechanics in a more global context of risk and optimization of decisions is illustrated. An example, where Monte Carlo simulation was used to implement a probabilistic fracture mechanics analysis, is discussed. (orig.)

  3. Chemo-radioresistance of small cell lung cancer cell lines derived from untreated primary tumors obtained by diagnostic bronchofiberscopy

    International Nuclear Information System (INIS)

    Tanio, Yoshiro; Watanabe, Masatoshi; Inoue, Tamotsu

    1990-01-01

    New cell lines of small cell lung cancer (SCLC) were established from specimens of untreated primary tumors biopsied by diagnostic bronchofiberscopy. The advantage of this method was the ease of obtaining specimens from lung tumors. Establishment of cell lines was successful for 4 of 13 specimens (30%). Clinical responses of the tumors showed considerable variation, but were well correlated with the in vitro sensitivity of the respective cell lines to chemotherapeutic drugs and irradiation. One of the cell lines was resistant to all drugs tested and to irradiation, while another was sensitive to all of them. Although the acquired resistance of SCLC is the biggest problem in treatment, natural resistance to therapy is another significant problem. Whether acquired or natural, the resistance mechanisms of SCLC may be elucidated by the use of such cell lines derived from untreated tumors. This method and these SCLC cell lines are expected to be useful for the serial study of biologic and genetic changes of untreated and pre-treated tumors, or primary and secondary tumors. (author)

  4. Comparing Categorical and Probabilistic Fingerprint Evidence.

    Science.gov (United States)

    Garrett, Brandon; Mitchell, Gregory; Scurich, Nicholas

    2018-04-23

    Fingerprint examiners traditionally express conclusions in categorical terms, opining that impressions do or do not originate from the same source. Recently, probabilistic conclusions have been proposed, with examiners estimating the probability of a match between recovered and known prints. This study presented a nationally representative sample of jury-eligible adults with a hypothetical robbery case in which an examiner opined on the likelihood that a defendant's fingerprints matched latent fingerprints in categorical or probabilistic terms. We studied model language developed by the U.S. Defense Forensic Science Center to summarize results of statistical analysis of the similarity between prints. Participant ratings of the likelihood the defendant left prints at the crime scene and committed the crime were similar when exposed to categorical and strong probabilistic match evidence. Participants reduced these likelihoods when exposed to the weaker probabilistic evidence, but did not otherwise discriminate among the prints assigned different match probabilities. © 2018 American Academy of Forensic Sciences.

  5. A common fixed point for operators in probabilistic normed spaces

    International Nuclear Information System (INIS)

    Ghaemi, M.B.; Lafuerza-Guillen, Bernardo; Razani, A.

    2009-01-01

    Probabilistic metric spaces were introduced by Karl Menger. Alsina, Schweizer and Sklar gave a general definition of a probabilistic normed space based on the definition of Menger [Alsina C, Schweizer B, Sklar A. On the definition of a probabilistic normed space. Aequationes Math 1993;46:91-8]. Here, we consider the equicontinuity of a class of linear operators in probabilistic normed spaces, and finally a common fixed point theorem is proved. An application to quantum mechanics is considered.

  6. CHEMICAL EVOLUTION OF THE UNIVERSE AT 0.7 < z < 1.6 DERIVED FROM ABUNDANCE DIAGNOSTICS OF THE BROAD-LINE REGION OF QUASARS

    Energy Technology Data Exchange (ETDEWEB)

    Sameshima, H. [Laboratory of Infrared High-resolution Spectroscopy, Koyama Astronomical Observatory, Kyoto Sangyo University, Motoyama, Kamigamo, Kita-ku, Kyoto 603-8555 (Japan); Yoshii, Y.; Kawara, K., E-mail: sameshima@cc.kyoto-su.ac.jp [Institute of Astronomy, School of Science, University of Tokyo, 2-21-1 Osawa, Mitaka, Tokyo 181-0015 (Japan)

    2017-01-10

    We present an analysis of Mg ii λ2798 and Fe ii UV emission lines for archival Sloan Digital Sky Survey (SDSS) quasars to explore diagnostics of the magnesium-to-iron abundance ratio in a broad-line region cloud. Our sample consists of 17,432 quasars selected from the SDSS Data Release 7 with a redshift range of 0.72 < z < 1.63. A strong anticorrelation between the Mg ii equivalent width (EW) and the Eddington ratio is found, while only a weak positive correlation is found between the Fe ii EW and the Eddington ratio. To investigate the origin of these differing behaviors of the Mg ii and Fe ii emission lines, we perform photoionization calculations using the Cloudy code, where constraints from recent reverberation mapping studies are considered. We find from the calculations that (1) the Mg ii and Fe ii emission lines are created in different regions of a photoionized cloud, and (2) their EW correlations with the Eddington ratio can be explained by just changing the cloud gas density. These results indicate that the Mg ii/Fe ii flux ratio, which has been used as a first-order proxy for the Mg/Fe abundance ratio in chemical evolution studies with quasar emission lines, depends largely on the cloud gas density. By correcting for this density dependence, we propose new diagnostics of the Mg/Fe abundance ratio for a broad-line region cloud. In comparing the derived Mg/Fe abundance ratios with chemical evolution models, we suggest that α-enrichment by mass loss from metal-poor intermediate-mass stars occurred at z ∼ 2 or earlier.

  7. Probabilistic Wind Power Ramp Forecasting Based on a Scenario Generation Method: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Qin [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Florita, Anthony R [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Krishnan, Venkat K [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hodge, Brian S [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Cui, Mingjian [Univ. of Texas-Dallas, Richardson, TX (United States); Feng, Cong [Univ. of Texas-Dallas, Richardson, TX (United States); Wang, Zhenke [Univ. of Texas-Dallas, Richardson, TX (United States); Zhang, Jie [Univ. of Texas-Dallas, Richardson, TX (United States)

    2017-08-31

    Wind power ramps (WPRs) are particularly important in the management and dispatch of wind power, and they are currently drawing the attention of balancing authorities. With the aim to reduce the impact of WPRs for power system operations, this paper develops a probabilistic ramp forecasting method based on a large number of simulated scenarios. An ensemble machine learning technique is first adopted to forecast the basic wind power forecasting scenario and calculate the historical forecasting errors. A continuous Gaussian mixture model (GMM) is used to fit the probability distribution function (PDF) of forecasting errors. The cumulative distribution function (CDF) is analytically deduced. The inverse transform method based on Monte Carlo sampling and the CDF is used to generate a massive number of forecasting error scenarios. An optimized swinging door algorithm is adopted to extract all the WPRs from the complete set of wind power forecasting scenarios. The probabilistic forecasting results of ramp duration and start time are generated based on all scenarios. Numerical simulations on publicly available wind power data show that within a predefined tolerance level, the developed probabilistic wind power ramp forecasting method is able to predict WPRs with a high level of sharpness and accuracy.

  8. Probabilistic safety assessment as a standpoint for decision making

    International Nuclear Information System (INIS)

    Cepin, M.

    2001-01-01

    This paper focuses on the role of probabilistic safety assessment in decision-making. The prerequisites for using the results of probabilistic safety assessment and the criteria for decision-making based on probabilistic safety assessment are discussed. The decision-making process is described. It provides a risk evaluation of the impact of the issue under investigation. Selected examples, which highlight the described process, are discussed. (authors)

  9. Probabilistic Modeling of Timber Structures

    DEFF Research Database (Denmark)

    Köhler, J.D.; Sørensen, John Dalsgaard; Faber, Michael Havbro

    2005-01-01

    The present paper contains a proposal for the probabilistic modeling of timber material properties. It is produced in the context of the Probabilistic Model Code (PMC) of the Joint Committee on Structural Safety (JCSS) and of the COST action E24 'Reliability of Timber Structures'. The present...... proposal is based on discussions and comments from participants of the COST E24 action and the members of the JCSS. The paper contains a description of the basic reference properties for timber strength parameters and ultimate limit state equations for components and connections. The recommended...

  10. A note on probabilistic models over strings: the linear algebra approach.

    Science.gov (United States)

    Bouchard-Côté, Alexandre

    2013-12-01

    Probabilistic models over strings have played a key role in developing methods that take into consideration indels as phylogenetically informative events. There is an extensive literature on using automata and transducers on phylogenies to do inference on these probabilistic models, in which an important theoretical question is the complexity of computing the normalization of a class of string-valued graphical models. This question has been investigated using tools from combinatorics, dynamic programming, and graph theory, and has practical applications in Bayesian phylogenetics. In this work, we revisit this theoretical question from a different point of view, based on linear algebra. The main contribution is a set of results based on this linear algebra view that facilitate the analysis and design of inference algorithms on string-valued graphical models. As an illustration, we use this method to give a new elementary proof of a known result on the complexity of inference on the "TKF91" model, a well-known probabilistic model over strings. Compared to previous work, our proving method is easier to extend to other models, since it relies on a novel weak condition, triangular transducers, which is easy to establish in practice. The linear algebra view provides a concise way of describing transducer algorithms and their compositions, opens the possibility of transferring fast linear algebra libraries (for example, based on GPUs), as well as low rank matrix approximation methods, to string-valued inference problems.
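
    The normalization question discussed above has a compact linear algebra answer for a toy weighted automaton: if M sums the per-symbol transition matrices and its spectral radius is below one, the total weight of all finite strings is alpha^T (I - M)^(-1) omega. A minimal sketch with invented weights:

```python
import numpy as np

alpha = np.array([1.0, 0.0])          # initial state distribution
omega = np.array([0.1, 0.2])          # stopping (acceptance) weights
M_a = np.array([[0.30, 0.10],
                [0.00, 0.35]])        # transition weights for symbol 'a'
M_b = np.array([[0.20, 0.05],
                [0.15, 0.25]])        # transition weights for symbol 'b'

# Summing over all strings of all lengths gives a Neumann (geometric) series,
# which converges when the spectral radius of M = M_a + M_b is below one.
M = M_a + M_b
assert np.max(np.abs(np.linalg.eigvals(M))) < 1.0

Z = alpha @ np.linalg.solve(np.eye(2) - M, omega)
print("normalization constant:", Z)
```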

  11. Probabilistic confidence for decisions based on uncertain reliability estimates

    Science.gov (United States)

    Reid, Stuart G.

    2013-05-01

    Reliability assessments are commonly carried out to provide a rational basis for risk-informed decisions concerning the design or maintenance of engineering systems and structures. However, calculated reliabilities and associated probabilities of failure often have significant uncertainties associated with the possible estimation errors relative to the 'true' failure probabilities. For uncertain probabilities of failure, a measure of 'probabilistic confidence' has been proposed to reflect the concern that uncertainty about the true probability of failure could result in a system or structure that is unsafe and could subsequently fail. The paper describes how the concept of probabilistic confidence can be applied to evaluate and appropriately limit the probabilities of failure attributable to particular uncertainties such as design errors that may critically affect the dependability of risk-acceptance decisions. This approach is illustrated with regard to the dependability of structural design processes based on prototype testing with uncertainties attributable to sampling variability.
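
    One simple reading of "probabilistic confidence" — the probability that the true failure probability lies below a target, given limited prototype tests — can be computed directly; the test counts, target and Jeffreys prior below are assumptions for illustration, not the paper's formulation.

```python
from scipy.stats import beta

failures, trials = 0, 45   # hypothetical prototype test outcomes
target = 0.05              # acceptable failure probability (assumed)

# Bayesian posterior for the failure probability with a Jeffreys prior
posterior = beta(0.5 + failures, 0.5 + trials - failures)

# "Probabilistic confidence" that the true failure probability is acceptable
confidence = posterior.cdf(target)
print(f"P(p_fail < {target}) = {confidence:.3f}")
```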

  12. Probabilistic safety criteria on high burnup HWR fuels

    International Nuclear Information System (INIS)

    Marino, A.C.

    2002-01-01

    BACO is a code for the simulation of the thermo-mechanical and fission gas behaviour of a cylindrical fuel rod under operation conditions. Their input parameters and, therefore, output ones may include statistical dispersion. In this paper, experimental CANDU fuel rods irradiated at the NRX reactor together with experimental MOX fuel rods and the IAEA-CRP FUMEX cases are used in order to determine the sensitivity of BACO code predictions. The techniques for sensitivity analysis defined in BACO are: the 'extreme case analysis', the 'parametric analysis' and the 'probabilistic (or statistics) analysis'. We analyse the CARA and CAREM fuel rods relation between predicted performance and statistical dispersion in order of enhanced their original designs taking account probabilistic safety criteria and using the BACO's sensitivity analysis. (author)

  13. Probabilistic methods used in NUSS

    International Nuclear Information System (INIS)

    Fischer, J.; Giuliani, P.

    1985-01-01

    Probabilistic considerations are used implicitly or explicitly in all technical areas. In the NUSS codes and guides the two areas of design and siting are those where more use is made of these concepts. A brief review of the relevant documents in these two areas is made in this paper. It covers the documents where either probabilistic considerations are implied or where probabilistic approaches are recommended in the evaluation of situations and of events. In the siting guides the review mainly covers the area of seismic hydrological and external man-made events analysis, as well as some aspects of meteorological extreme events analysis. Probabilistic methods are recommended in the design guides but they are not made a requirement. There are several reasons for this, mainly lack of reliable data and the absence of quantitative safety limits or goals against which to judge the design analysis. As far as practical, engineering judgement should be backed up by quantitative probabilistic analysis. Examples are given and the concept of design basis as used in NUSS design guides is explained. (author)

  14. Building a high-resolution T2-weighted MR-based probabilistic model of tumor occurrence in the prostate.

    Science.gov (United States)

    Nagarajan, Mahesh B; Raman, Steven S; Lo, Pechin; Lin, Wei-Chan; Khoshnoodi, Pooria; Sayre, James W; Ramakrishna, Bharath; Ahuja, Preeti; Huang, Jiaoti; Margolis, Daniel J A; Lu, David S K; Reiter, Robert E; Goldin, Jonathan G; Brown, Matthew S; Enzmann, Dieter R

    2018-02-19

    We present a method for generating a T2 MR-based probabilistic model of tumor occurrence in the prostate to guide the selection of anatomical sites for targeted biopsies and to serve as a diagnostic tool to aid radiological evaluation of prostate cancer. In our study, the prostate and any radiological findings within it were segmented retrospectively on 3D T2-weighted MR images of 266 subjects who underwent radical prostatectomy. Subsequent histopathological analysis determined both the ground truth and the Gleason grade of the tumors. A randomly chosen subset of 19 subjects was used to generate a multi-subject-derived prostate template. Subsequently, a cascading registration algorithm involving both affine and non-rigid B-spline transforms was used to register the prostate of every subject to the template. Corresponding transformation of radiological findings yielded a population-based probabilistic model of tumor occurrence. The quality of our probabilistic model building approach was statistically evaluated by measuring the proportion of correct placements of tumors in the prostate template, i.e., the number of tumors that maintained their anatomical location within the prostate after their transformation into the prostate template space. The probabilistic model built with tumors deemed clinically significant demonstrated a heterogeneous distribution of tumors, with a higher likelihood of tumor occurrence at the mid-gland anterior transition zone and the base-to-mid-gland posterior peripheral zones. Of 250 MR lesions analyzed, 248 maintained their original anatomical location with respect to the prostate zones after transformation to the template. We present a robust method for generating a probabilistic model of tumor occurrence in the prostate that could aid clinical decision making, such as the selection of anatomical sites for MR-guided prostate biopsies.
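
    Once findings are registered to a common template, the probabilistic model itself reduces to a voxel-wise frequency map; a minimal numpy sketch follows, with random stand-in blobs in place of the registered lesion masks (grid size and blob shapes are assumptions).

```python
import numpy as np

rng = np.random.default_rng(7)
shape = (64, 64, 32)   # toy template grid (assumed)
n_subjects = 266

# Stand-ins for lesion masks already registered to the template space;
# in the real pipeline these come from the affine + B-spline registration.
prob_map = np.zeros(shape)
for _ in range(n_subjects):
    mask = np.zeros(shape, dtype=bool)
    cx, cy, cz = rng.integers(16, 48), rng.integers(16, 48), rng.integers(8, 24)
    mask[cx-2:cx+2, cy-2:cy+2, cz-1:cz+1] = True   # small blob "lesion"
    prob_map += mask

prob_map /= n_subjects   # voxel-wise probability of tumor occurrence
print("peak tumor-occurrence probability:", prob_map.max())
```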

  15. Wind effects on long-span bridges: Probabilistic wind data format for buffeting and VIV load assessments

    Science.gov (United States)

    Hoffmann, K.; Srouji, R. G.; Hansen, S. O.

    2017-12-01

    The technology development within the structural design of long-span bridges in Norwegian fjords has created a need to reformulate the calculation format and the physical quantities used to describe the properties of wind and the associated wind-induced effects on bridge decks. Parts of a new probabilistic format describing the incoming, undisturbed wind are presented. It is expected that a fixed probabilistic format will facilitate a more physically consistent and precise description of the wind conditions, which in turn increases the accuracy and considerably reduces the uncertainties in wind load assessments. Because the format is probabilistic, a quantification of the level of safety and uncertainty in predicted wind loads is readily accessible. A simple buffeting response calculation demonstrates the use of probabilistic wind data in the assessment of wind loads and responses. Furthermore, vortex-induced fatigue damage is discussed in relation to probabilistic wind turbulence data and response measurements from wind tunnel tests.

  16. Drawing the line on the sand

    Science.gov (United States)

    Ranasinghe, R.; Jongejan, R.; Wainwright, D.; Callaghan, D. P.

    2016-02-01

    Up to 70% of the world's sandy coastlines are eroding, resulting in gradual and continuous coastline recession. The rate of coastline recession is likely to increase due to the projected impacts of climate change on mean sea levels, offshore wave climate and storm surges. At the same time, rapid development in the world's coastal zones continues to increase potential damages, while often reducing the resilience of coastal systems. The risks associated with coastline recession are thus likely to increase over the coming decades, unless effective risk management plans are put in place. Land-use restrictions are a key component of coastal zone risk management plans. These involve the use of coastal setback lines, which are mainly established by linearly adding the impacts of storms, recession due to sea level rise, and ambient long-term trends in shoreline evolution. This approach does not differentiate between uncertainties that develop differently over time, nor does it take into account the value and lifetime of property developments. Both shortcomings could entail considerable social cost. For balancing risk and reward, probabilistic estimates of coastline recession are a prerequisite, yet the presently adopted deterministic methods for establishing setback lines are unable to provide such estimates. Here, we present a quantitative risk analysis (QRA) model, underpinned by a multi-scale, physics-based coastal recession model, capable of providing time-dependent risk estimates. The modelling approach enables the determination of setback lines in terms of exceedance probabilities, a quantity that directly feeds into risk evaluations and economic optimizations. As a demonstration, the risk-informed approach is applied to Narrabeen Beach, Sydney, Australia.
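
    To make the idea of a setback line defined by an exceedance probability concrete, here is a minimal Monte Carlo sketch. It is not the authors' multi-scale, physics-based model; the recession components and every distribution and parameter below are purely illustrative.

        import numpy as np

        rng = np.random.default_rng(42)
        n, horizon = 100_000, 50                      # simulations, planning horizon [yr]

        trend = rng.normal(0.3, 0.1, n)               # ambient recession trend [m/yr]
        slr_rate = rng.triangular(0.2, 0.5, 1.0, n)   # SLR-driven recession [m/yr]
        storms = rng.poisson(0.5 * horizon, n)        # number of erosive storms
        # Sum of k Gamma(2, 5) storm cuts is Gamma(2k, 5), so this vectorizes exactly
        storm_total = np.where(storms > 0,
                               rng.gamma(np.maximum(2.0 * storms, 1e-9), 5.0), 0.0)

        total = (trend + slr_rate) * horizon + storm_total

        # Setback line corresponding to a 1% exceedance probability over the horizon
        print(f"99% non-exceedance recession: {np.quantile(total, 0.99):.1f} m")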

  17. Development of probabilistic fatigue curve for asphalt concrete based on viscoelastic continuum damage mechanics

    Directory of Open Access Journals (Sweden)

    Himanshu Sharma

    2016-07-01

    Due to its roots in a fundamental thermodynamic framework, the continuum damage approach is popular for modeling asphalt concrete behavior. Currently used continuum damage models use mixture-averaged values for model parameters and assume a deterministic damage process. On the other hand, significant scatter is found in fatigue data generated even under extremely controlled laboratory testing conditions, so currently used continuum damage models fail to account for the scatter observed in fatigue data. This paper illustrates a novel approach to probabilistic fatigue life prediction based on the viscoelastic continuum damage approach. Several specimens were tested for their viscoelastic properties and damage properties under uniaxial loading, and the data thus generated were analyzed using viscoelastic continuum damage mechanics principles to predict fatigue life. Two-parameter Weibull, three-parameter Weibull, and lognormal distributions were fit to the fatigue lives predicted by the viscoelastic continuum damage approach. It was observed that fatigue damage is better described by the Weibull distribution than by the lognormal distribution; due to its flexibility, the three-parameter Weibull distribution was found to fit better than the two-parameter Weibull distribution. Further, significant differences were found between the probabilistic fatigue curves developed in this research and the traditional deterministic fatigue curve. The proposed methodology combines the advantages of continuum damage mechanics and of probabilistic approaches, and the resulting probabilistic fatigue curves can be conveniently used for reliability-based pavement design. Keywords: Probabilistic fatigue curve, Continuum damage mechanics, Weibull distribution, Lognormal distribution
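
    A minimal sketch of the distribution-fitting step, with hypothetical fatigue lives standing in for the lives predicted by the viscoelastic continuum damage analysis:

        import numpy as np
        from scipy import stats

        lives = np.array([1.2e5, 2.3e5, 1.8e5, 3.1e5, 2.7e5, 1.5e5, 4.0e5, 2.1e5])

        w2 = stats.weibull_min.fit(lives, floc=0)     # 2-parameter Weibull (loc fixed)
        w3 = stats.weibull_min.fit(lives)             # 3-parameter Weibull (loc free)
        ln = stats.lognorm.fit(lives, floc=0)         # lognormal

        # Compare fits by log-likelihood (higher is better)
        for name, dist, p in [("Weibull-2p", stats.weibull_min, w2),
                              ("Weibull-3p", stats.weibull_min, w3),
                              ("Lognormal", stats.lognorm, ln)]:
            print(name, np.sum(dist.logpdf(lives, *p)))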

  18. Do probabilistic forecasts lead to better decisions?

    Directory of Open Access Journals (Sweden)

    M. H. Ramos

    2013-06-01

    The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started focusing attention on ways of communicating probabilistic forecasts to decision-makers. Communicating probabilistic forecasts includes preparing tools and products for visualisation, but also requires understanding how decision-makers perceive and use uncertainty information in real time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision-makers. Answers were collected and analysed. In this paper, we present the results of this exercise and discuss whether we indeed make better decisions on the basis of probabilistic forecasts.

  19. A Time-Varied Probabilistic ON/OFF Switching Algorithm for Cellular Networks

    KAUST Repository

    Rached, Nadhir B.; Ghazzai, Hakim; Kadri, Abdullah; Alouini, Mohamed-Slim

    2018-01-01

    In this letter, we develop a time-varied probabilistic on/off switching planning method for cellular networks to reduce their energy consumption. It consists of a risk-aware optimization approach that takes into consideration the randomness of the user profile associated with each base station (BS). The proposed approach jointly determines (i) the instants of time at which the current active BS configuration must be updated due to an increase or decrease of the network traffic load, and (ii) the minimum set of BSs to be activated to serve the network's subscribers. Probabilistic metrics modeling the traffic profile variation are developed to trigger this dynamic on/off switching operation. Selected simulation results are then presented to validate the proposed algorithm for different system parameters.

  1. Probabilistic safety analysis forecast for Trillo 1 NPP

    International Nuclear Information System (INIS)

    Carretero Fernandino, J.A.; Martin Alvarez, L.; Gomez, F.; Cuallado, G.

    1995-01-01

    The performance of Probabilistic Safety Analyses (PSA) at Trillo 1 NPP is facing a number of challenges, unprecedented in previous PSAs carried out in Spain, due to the particular design characteristics of the plant. On account of this, it has been necessary to implement specific approaches and methodological alternatives in order to perform a PSA which, while maintaining a level of detail and requirements in line with PSAs carried out previously in Spain, offers a solution technically adapted to the characteristics of the SIEMENS-KWU design, as opposed to the other Spanish reactors, which follow the standard US Westinghouse-General Electric designs. The purpose of this paper is to describe the most significant characteristics of the PSA at Trillo 1 NPP and the methodology used to date, taking into account current project progress.

  2. Probabilistic estimates of drought impacts on agricultural production

    Science.gov (United States)

    Madadgar, Shahrbanou; AghaKouchak, Amir; Farahmand, Alireza; Davis, Steven J.

    2017-08-01

    Increases in the severity and frequency of drought in a warming climate may negatively impact agricultural production and food security. Unlike previous studies that have estimated the agricultural impacts of climate conditions using single-crop yield distributions, we develop a multivariate probabilistic model that uses projected climatic conditions (e.g., precipitation amount or soil moisture) throughout a growing season to estimate the probability distribution of crop yields. We demonstrate the model with an analysis of the historical period 1980-2012, including the Millennium Drought in Australia (2001-2009). We find that precipitation and soil moisture deficits in dry growing seasons reduced the average annual yield of the five largest crops in Australia (wheat, broad beans, canola, lupine, and barley) by 25-45% relative to wet growing seasons. The model can thus produce region- and crop-specific agricultural sensitivities to climate conditions and variability. Probabilistic estimates of yield may help decision-makers in government and business to quantitatively assess the vulnerability of agriculture to climate variations.
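
    One simple way to obtain a probability distribution of yield conditional on seasonal climate is Gaussian conditioning on a joint climate-yield distribution. The sketch below illustrates that generic idea only; it is not the authors' model, and all statistics are invented.

        import numpy as np

        # Hypothetical joint statistics of (precipitation [mm], soil moisture [-], yield [t/ha])
        mu = np.array([450.0, 0.30, 2.8])
        cov = np.array([[90.0**2, 0.9,     35.0],
                        [0.9,     0.05**2, 0.004],
                        [35.0,    0.004,   0.5**2]])

        def conditional_yield(x_obs):
            """Gaussian conditional distribution of yield given (precip, soil moisture)."""
            s11, s12, s22 = cov[:2, :2], cov[:2, 2], cov[2, 2]
            w = np.linalg.solve(s11, s12)
            mean = mu[2] + w @ (x_obs - mu[:2])
            var = s22 - s12 @ w
            return mean, np.sqrt(var)

        m_dry, s_dry = conditional_yield(np.array([300.0, 0.20]))   # dry season
        m_wet, s_wet = conditional_yield(np.array([550.0, 0.38]))   # wet season
        print(f"dry: {m_dry:.2f} +/- {s_dry:.2f} t/ha, wet: {m_wet:.2f} +/- {s_wet:.2f} t/ha")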

  3. Research on probabilistic assessment method based on the corroded pipeline assessment criteria

    International Nuclear Information System (INIS)

    Zhang Guangli; Luo, Jinheng; Zhao Xinwei; Zhang Hua; Zhang Liang; Zhang Yi

    2012-01-01

    Pipeline integrity assessments are commonly performed using conventional deterministic approaches, even though many of the parameters in the assessment are uncertain. In this paper, a probabilistic assessment method is provided for gas pipelines with corrosion defects, based on the current corroded-pipe evaluation criteria, and the failure probability of corroded pipelines due to uncertainties in loading, material properties, and measurement accuracy is estimated using the Monte Carlo technique. Furthermore, a sensitivity analysis approach is introduced to rank the influence of the various random variables on the safety of the pipeline, and a method to determine the critical defect size based on an acceptable failure probability is proposed. Highlights: ► The Folias factor in pipeline corrosion assessment methods was analyzed. ► A probabilistic method was applied within the corrosion assessment methods. ► The influence of the assessment variables on pipeline reliability was ranked. ► The acceptable failure probability was used to determine the critical defect size.
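
    As a sketch of the Monte Carlo step, the code below propagates uncertain strength, defect size and pressure through the modified B31G burst model (which uses the Folias bulging factor mentioned in the highlights). All distributions and dimensions are hypothetical; a real assessment would follow the governing criteria document.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 200_000

        D, t = 0.508, 0.0079                       # pipe diameter and wall thickness [m]
        smys = rng.normal(415e6, 20e6, n)          # yield strength [Pa]
        d = rng.normal(0.35 * t, 0.05 * t, n)      # defect depth incl. tool error [m]
        L = rng.normal(0.20, 0.02, n)              # defect length [m]
        Pop = rng.normal(9.0e6, 0.5e6, n)          # operating pressure [Pa]

        sigma_f = smys + 68.95e6                   # modified B31G flow stress
        z = L**2 / (D * t)
        M = np.sqrt(1 + 0.6275 * z - 0.003375 * z**2)   # Folias factor (valid z <= 50)
        dt = np.clip(d / t, 0.0, 0.8)
        Pb = (2 * t * sigma_f / D) * (1 - 0.85 * dt) / (1 - 0.85 * dt / M)

        pf = np.mean(Pb < Pop)                     # Monte Carlo failure probability
        print(f"P_failure ~ {pf:.2e}")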

  4. Unary probabilistic and quantum automata on promise problems

    OpenAIRE

    Gainutdinova, Aida; Yakaryilmaz, Abuzer

    2015-01-01

    We continue the systematic investigation of probabilistic and quantum finite automata (PFAs and QFAs) on promise problems by focusing on unary languages. We show that bounded-error QFAs are more powerful than PFAs. However, in contrast to the binary problems, the computational powers of Las-Vegas QFAs and bounded-error PFAs are equivalent to that of deterministic finite automata (DFAs). Lastly, we present a new family of unary promise problems with two parameters such that when fixing one parameter QFAs ...
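
    For readers unfamiliar with the model: the probability that a PFA accepts the unary word a^n is pi M^n eta, for a row-stochastic transition matrix M, initial distribution pi, and accepting-state indicator eta. A toy sketch (our illustration, not from the paper):

        import numpy as np

        pi = np.array([1.0, 0.0])           # initial distribution
        M = np.array([[0.9, 0.1],
                      [0.2, 0.8]])          # row-stochastic transition matrix for 'a'
        eta = np.array([0.0, 1.0])          # state 2 is accepting

        def accept_prob(n):
            """Probability that the PFA accepts the unary word a^n."""
            return pi @ np.linalg.matrix_power(M, n) @ eta

        for n in (0, 1, 5, 50):
            print(n, accept_prob(n))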

  5. Probabilistic composition of preferences, theory and applications

    CERN Document Server

    Parracho Sant'Anna, Annibal

    2015-01-01

    Putting forward a unified presentation of the features and possible applications of probabilistic preference composition, and serving as a methodology for decisions employing multiple criteria, this book maximizes reader insight into evaluation in probabilistic terms and the development of composition approaches that do not depend on assigning weights to the criteria. With key applications in important areas of management, such as failure modes and effects analysis and productivity analysis, together with explanations about the application of the concepts involved, this book makes available numerical examples of probabilistic transformation development and probabilistic composition. Useful not only as a reference source for researchers but also for teaching graduate courses in Production Engineering and Management Science, the key themes of the book will be of special interest to researchers in the field of Operational Research.

  6. Use of probabilistic risk assessment in maintenance activities at Palo Verde

    International Nuclear Information System (INIS)

    Lindquist, R.C.; Pobst, D.S.

    1993-01-01

    Probabilistic risk assessment (PRA) is an important tool in addressing various maintenance activities. At the Palo Verde nuclear generating station (PVNGS), the PRA has been used in a variety of ways to support a wide and diverse selection of maintenance-related activities. For on-line or at-power maintenance, the PRA was used to evaluate combinations of maintenance activities possible with the 12-week or floating maintenance schedule. The maintenance schedule was evaluated to identify any higher-risk, undesirable combinations of equipment outages, such as the sole steam-driven auxiliary feedwater pump and the same-train emergency diesel generator. Table I is a sampling of the results from the maintenance schedule evaluation in terms of the increase in conditional core damage frequency (CDF) above the baseline value due to maintenance on some important key safety systems and combinations thereof. The baseline CDF is 7.4 x 10^-7 per 72 h.
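
    A common way to turn such conditional-CDF tables into scheduling decisions is the incremental core damage probability (ICDP) of a maintenance window. A small sketch with hypothetical numbers, reusing only the 72-h baseline quoted above:

        # ICDP: (conditional CDP - baseline CDP) accumulated over the outage duration.
        BASELINE_72H = 7.4e-7                  # baseline core damage probability per 72-h window

        def icdp(conditional_cdp_72h, hours):
            """Risk increase of taking equipment out of service for `hours`."""
            return (conditional_cdp_72h - BASELINE_72H) * hours / 72.0

        # e.g. steam-driven AFW pump plus same-train diesel out together (value invented)
        print(f"ICDP = {icdp(6.5e-6, 24):.2e}")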

  7. Atomic emission spectroscopy for the on-line monitoring of incineration processes

    NARCIS (Netherlands)

    Timmermans, E.A.H.; de Groote, F.P.J.; Jonkers, J.; Gamero, A.; Sola, A.; Mullen, van der J.J.A.M.

    2003-01-01

    A diagnostic measurement system based on atomic emission spectroscopy has been developed for the purpose of on-line monitoring of hazardous elements in industrial combustion gases. The aim was to construct a setup with high durability under rough and variable experimental conditions, e.g. a strongly

  8. Diagnostic development

    International Nuclear Information System (INIS)

    Barnett, C.F.; Brisson, D.A.; Greco, S.E.

    1978-01-01

    During the past year the far-infrared or submillimeter diagnostic research program resulted in three major developments: (1) an optically pumped 385-μm D_2O laser oscillator-amplifier system was operated at a power level of 1 MW with a line width of less than 50 MHz; (2) a conical Pyrex submillimeter laser beam dump with a retention efficiency greater than 10^4 was developed for the ion temperature Thomson scattering experiment; and (3) a new diagnostic technique was developed that makes use of the Faraday rotation of a modulated submillimeter laser beam to determine the plasma current profile. Measurements of the asymmetric distortion of the H_α (6563 Å) spectral line profile show that the effective toroidal drift velocity, dv_∥i/dT_i, may be used as an indicator of plasma quality and as a complement to other ion temperature diagnostics

  9. Overview of Future of Probabilistic Methods and RMSL Technology and the Probabilistic Methods Education Initiative for the US Army at the SAE G-11 Meeting

    Science.gov (United States)

    Singhal, Surendra N.

    2003-01-01

    The SAE G-11 RMSL Division and Probabilistic Methods Committee meeting, sponsored by the Picatinny Arsenal during March 1-3, 2004 at the Westin Morristown, will report progress on projects for probabilistic assessment of Army systems and launch an initiative for probabilistic education. The meeting features several Army and industry senior executives and an Ivy League professor, providing an industry/government/academia forum to review RMSL technology; reliability and probabilistic technology; reliability-based design methods; software reliability; and maintainability standards. With over 100 members, including members of national and international standing, the mission of the G-11's Probabilistic Methods Committee is to enable and facilitate rapid deployment of probabilistic technology to enhance the competitiveness of our industries through better, faster, greener, smarter, affordable and reliable product development.

  10. High-resolution spectroscopy diagnostics for measuring impurity ion temperature and velocity on the COMPASS tokamak

    Energy Technology Data Exchange (ETDEWEB)

    Weinzettl, Vladimir, E-mail: vwei@ipp.cas.cz [Institute of Plasma Physics ASCR, Prague (Czech Republic); Shukla, Gaurav [Institute of Plasma Physics ASCR, Prague (Czech Republic); Department of Applied Physics, Ghent University, Ghent (Belgium); Faculty of Mathematics and Physics, Charles University in Prague, Prague (Czech Republic); Ghosh, Joydeep [Institute for Plasma Research, Bhat, Gandhinagar (India); Melich, Radek; Panek, Radomir [Institute of Plasma Physics ASCR, Prague (Czech Republic); Tomes, Matej; Imrisek, Martin; Naydenkova, Diana [Institute of Plasma Physics ASCR, Prague (Czech Republic); Faculty of Mathematics and Physics, Charles University in Prague, Prague (Czech Republic); Varju, Josef [Institute of Plasma Physics ASCR, Prague (Czech Republic); Pereira, Tiago [Instituto de Plasmas e Fusão Nuclear, Lisboa (Portugal); Instituto Superior Técnico, Universidade de Lisboa, Lisboa (Portugal); Gomes, Rui [Instituto de Plasmas e Fusão Nuclear, Lisboa (Portugal); Abramovic, Ivana; Jaspers, Roger [Eindhoven University of Technology, Eindhoven (Netherlands); Pisarik, Michael [SQS Vlaknova optika a.s., Nova Paka (Czech Republic); Department of Electromagnetic Field, Faculty of Electrical Engineering, Czech Technical University in Prague (Czech Republic); Odstrcil, Tomas [Max-Planck-Institut fur Plasmaphysik, Garching (Germany); Van Oost, Guido [Department of Applied Physics, Ghent University, Ghent (Belgium)

    2015-10-15

    Highlights: • We built a new diagnostic of poloidal plasma rotation on the COMPASS tokamak. • Improvements in throughput via toroidal integration and fiber optimizations shown. • Poloidal rotation and ion temperature measured in L- and H-mode and during RMP. • Design and parameters of a new CXRS diagnostic for COMPASS are introduced. - Abstract: High-resolution spectroscopy is a powerful tool for the measurement of plasma rotation as well as ion temperature using the Doppler shift of the emitted spectral lines and their Doppler broadening, respectively. Both passive and active diagnostic variants for the COMPASS tokamak are introduced. The passive diagnostic focused on the C III lines at about 465 nm is utilized for the observation of the poloidal plasma rotation. The current set-up of the measuring system is described, including the intended high-throughput optics upgrade. Different options to increase the fiber collection area are mentioned, including a flower-like fiber bundle, and the use of micro-lenses or tapered fibers. Recent measurements of poloidal plasma rotation of the order of 0–6 km/s are shown. The design of the new active diagnostic using a deuterium heating beam and based on charge exchange recombination spectroscopy (C VI line at 529 nm) is introduced. The tool will provide both space (0.5–5 cm) and time (10 ms) resolved toroidal plasma rotation and ion temperature profiles. The results of the Simulation of Spectra code used to examine the feasibility of charge exchange measurements on COMPASS are shown and connected with a selection of the spectrometer coupled with the CCD camera.

  12. Bayesian based Diagnostic Model for Condition based Maintenance of Offshore Wind Farms

    DEFF Research Database (Denmark)

    Asgarpour, Masoud; Sørensen, John Dalsgaard

    2018-01-01

    Operation and maintenance costs are a major contributor to the Levelized Cost of Energy for electricity produced by offshore wind, and can be significantly reduced if existing corrective actions are performed as efficiently as possible and if future corrective actions are avoided by performing sufficient preventive actions. This paper presents an applied and generic diagnostic model for fault detection and condition based maintenance of offshore wind components. The diagnostic model is based on two probabilistic matrices; the first is a confidence matrix, representing the probability of detection using ... for a wind turbine component based on vibration, temperature, and oil particle fault detection methods. The last part of the paper discusses the case study results and presents conclusions.
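
    The probability-of-detection idea can be sketched with a simple Bayesian update over component state, where a matrix of detection probabilities plays the role of the confidence matrix. The structure and numbers below are our illustration, not the paper's calibrated model.

        import numpy as np

        # States: 0 = healthy, 1 = faulty; prior fault probability of the component
        prior = np.array([0.95, 0.05])

        # P(alarm | state) per detection method (vibration, temperature, oil particles)
        p_alarm = np.array([[0.08, 0.70],
                            [0.05, 0.55],
                            [0.10, 0.80]])

        def posterior_fault(alarms):
            """Posterior P(faulty) after independent alarm/no-alarm observations."""
            post = prior.copy()
            for method, fired in enumerate(alarms):
                like = p_alarm[method] if fired else 1.0 - p_alarm[method]
                post = post * like
                post /= post.sum()
            return post[1]

        print(posterior_fault([True, False, True]))   # vibration and oil alarms fired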

  13. Probabilistic broadcasting of mixed states

    International Nuclear Information System (INIS)

    Li Lvjun; Li Lvzhou; Wu Lihua; Zou Xiangfu; Qiu Daowen

    2009-01-01

    It is well known that the non-broadcasting theorem proved by Barnum et al is a fundamental principle of quantum communication. As far as we are aware, optimal broadcasting (OB) is the only method to broadcast noncommuting mixed states approximately. In this paper, motivated by the probabilistic cloning of quantum states proposed by Duan and Guo, we propose a new way of broadcasting noncommuting mixed states: probabilistic broadcasting (PB), and we present a sufficient condition for PB of mixed states. To a certain extent, we generalize the probabilistic cloning theorem from pure states to mixed states; in particular, we generalize the non-broadcasting theorem, since the case in which commuting mixed states can be broadcast exactly can be thought of as a special instance of PB with success ratio 1. Moreover, we discuss probabilistic local broadcasting (PLB) of separable bipartite states

  14. On-line valve monitoring at the Ormen Lange gas plant

    Energy Technology Data Exchange (ETDEWEB)

    Greenlees, R.; Hale, S. [Score Atlanta Inc., Kennesaw, Georgia (United States)

    2011-07-01

    The purpose of this presentation is to discuss replacing time- and labor-intensive nuclear outage activities with on-line condition monitoring solutions, primarily the periodic verification of MOV functionality discussed in USNRC Generic Letter 96-05. This regulation requires that MOV age-related performance degradations be properly identified and accounted for, causing utilities to have to retest valves periodically for the duration of the plant's operating license. AECL-designed CANDU reactors have a world-class performance and safety record, with typical average annual capacity factors of 90%. The CANDU reactor design has the ability to refuel on line; as a result, (a) it can be a challenge to schedule all required valve testing into limited-duration outage work windows, (b) at multi-unit sites, Unit 0 valves can be difficult to test because they are rarely ever out of service, and (c) deuterium-oxide (heavy-water) moderator is expensive to manufacture, so effective through-valve leakage monitoring is essential. These three factors alone make CANDU sites the most suitable candidates for on-line valve monitoring systems. Nuclear industry regulations have been instrumental in the development of 'at the valve' diagnostic systems, but diagnostic testing has not typically been utilized to the same degree in other, less regulated industries. That trend is changing, however, and the move toward valve diagnostics and condition monitoring has been fastest in the offshore oil and gas industry on the Norwegian side of the North Sea. The Ormen Lange plant, located on Nyhamna Island on the west coast of Norway and operated by Shell, is one of the world's most advanced gas processing plants. A stated maintenance goal for the plant is that 70% of the maintenance budget and spend should be based on the results of on-line condition monitoring, utilizing monitoring systems equipped with switch sensing, strain gages, hydraulic and pneumatic pressure transducers and acoustic leakage

  16. Predictive control for stochastic systems based on multi-layer probabilistic sets

    Directory of Open Access Journals (Sweden)

    Huaqing LIANG

    2016-04-01

    Aiming at a class of discrete-time stochastic systems with Markov jump features, the state-feedback predictive control problem under probabilistic constraints on the input variables is studied. On the basis of the concept and method of multi-layer probabilistic sets, a predictive controller design algorithm with soft constraints of different probabilities is presented. Under the control of the multi-step feedback laws, the system state moves to different ellipses with specified probabilities. The stability of the system is guaranteed, the feasible region of the control problem is enlarged, and the system performance is improved. Finally, a simulation example is given to demonstrate the effectiveness of the proposed method.

  17. Probabilistic Tsunami Hazard Analysis

    Science.gov (United States)

    Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.

    2006-12-01

    The recent tsunami disaster caused by the 2004 Sumatra-Andaman earthquake has focused our attention on the hazard posed by large earthquakes that occur under water, in particular subduction zone earthquakes, and the tsunamis that they generate. Even though these kinds of events are rare, the very large loss of life and material destruction caused by this earthquake warrant a significant effort towards the mitigation of the tsunami hazard. For ground motion hazard, Probabilistic Seismic Hazard Analysis (PSHA) has become a standard practice in the evaluation and mitigation of seismic hazard to populations, in particular with respect to structures, infrastructure and lifelines. Its ability to condense the complexities and variability of seismic activity into a manageable set of parameters greatly facilitates the design of effective seismic-resistant buildings and the planning of infrastructure projects. Probabilistic Tsunami Hazard Analysis (PTHA) achieves the same goal for hazards posed by tsunami. There are great advantages to implementing such a method to evaluate the total risk (seismic and tsunami) to coastal communities. The method that we have developed is based on the traditional PSHA and is therefore completely consistent with standard seismic practice. Because of the strong dependence of tsunami wave heights on bathymetry, we use full-waveform tsunami computations in lieu of the attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast generated for sets of subfaults that comprise larger earthquake faults, we can efficiently synthesize tsunami waveforms for any slip distribution on those faults by summing the individual subfault tsunami waveforms (weighted by their slip). This efficiency makes it feasible to use Green's function summation in lieu of attenuation relations to provide very accurate estimates of tsunami height for probabilistic calculations, where one typically computes
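
    The Green's-function summation described above is linear and easy to state in code. A toy sketch, with random arrays standing in for the precomputed unit-slip subfault waveforms:

        import numpy as np

        def synthesize_tsunami(subfault_waveforms, slip):
            """Waveform at one coastal point for an arbitrary slip distribution,
            from precomputed unit-slip responses of shape (n_subfaults, n_samples)."""
            return np.tensordot(slip, subfault_waveforms, axes=1)

        rng = np.random.default_rng(0)
        G = rng.standard_normal((3, 1000)) * 0.01     # stand-in for stored waveforms
        slip = np.array([2.0, 5.0, 1.0])              # slip on each subfault [m]
        eta = synthesize_tsunami(G, slip)
        print(eta.shape, float(eta.max()))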

  18. Limited probabilistic risk assessment applications in plant backfitting

    International Nuclear Information System (INIS)

    Desaedeleer, G.

    1987-01-01

    Plant backfitting programs are defined on the basis of deterministic (e.g. Systematic Evaluation Program) or probabilistic (e.g. Probabilistic Risk Assessment) approaches. Each approach provides valuable assets in defining the program and has its own advantages and disadvantages; ideally one should combine the strong points of each. This chapter summarizes actual experience gained from combinations of deterministic and probabilistic approaches used to define and implement PWR backfitting programs. Such combinations relate to limited applications of probabilistic techniques and are illustrated for upgrading fluid systems. These evaluations allow a sound and rational optimization of system upgrades. However, the boundaries of the reliability analysis need to be clearly defined, and the system reliability analysis may have to go beyond classical boundaries (e.g. identification of weak links in support systems). Also, implementing upgrades on a system-by-system basis is not necessarily cost-effective. (author)

  19. Probabilistic Analysis Methods for Hybrid Ventilation

    DEFF Research Database (Denmark)

    Brohus, Henrik; Frier, Christian; Heiselberg, Per

    This paper discusses a general approach for the application of probabilistic analysis methods in the design of ventilation systems. The aims and scope of probabilistic versus deterministic methods are addressed, with special emphasis on hybrid ventilation systems. A preliminary application of stochastic differential equations is presented, comprising a general heat balance for an arbitrary number of loads and zones in a building to determine the thermal behaviour under random conditions.
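
    As a minimal illustration of the stochastic-differential-equation approach, consider a single-zone heat balance with an additive noise term, integrated by the Euler-Maruyama scheme. All parameters are hypothetical.

        import numpy as np

        rng = np.random.default_rng(3)
        C, UA = 5e6, 250.0          # zone heat capacity [J/K], loss coefficient [W/K]
        T_out, Q = 10.0, 3000.0     # outdoor temperature [degC], mean internal+solar load [W]
        sigma = 0.02                # diffusion coefficient on zone temperature [K/sqrt(s)]

        dt, n = 60.0, 24 * 60       # 1-min steps over one day
        T = np.empty(n); T[0] = 20.0
        for k in range(n - 1):
            drift = (Q - UA * (T[k] - T_out)) / C
            T[k + 1] = T[k] + drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()

        print(f"mean {T.mean():.2f} degC, std {T.std():.2f} K")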

  20. PROBABILISTIC FLOW DISTRIBUTION AS A REACTION TO THE STOCHASTICITY OF THE LOAD IN THE POWER SYSTEM

    Directory of Open Access Journals (Sweden)

    A. M. Hashimov

    2016-01-01

    For the analysis and control of power systems, deterministic approaches, implemented in the form of well-known methods and models for calculating steady-state and transient modes, are mostly used in current practice. With these methods it is possible to obtain solutions only for fixed parameters of the system scheme and under the assumption that active and reactive power demand and generation at the nodal points of the network remain constant. In reality, the stochastic character of power consumption causes random fluctuations of the voltages at the nodes and of the power flows in the lines of the power system. Such random fluctuations of operation can be estimated with probabilistic simulation of the power flows. In this article, the results of research on the influence of the depth of random fluctuations of the system load on the probability distribution of nodal voltages, as well as of the active and reactive power flows in the lines, are presented. Probabilistic modeling of the flow under stochastic load change is performed for different levels of fluctuation and for loading of the system up to peak load. A test study to quantify the effect of stochastic load variability on the probabilistic distribution parameters of the modes was carried out on the electrical network of a real power system. The results of the simulation of the probabilistic flow distribution for these load fluctuations, represented as discrete sampled values of the active power obtained with the analytical Monte Carlo method, were compared with real measurements of their values in the network under examination.
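
    A minimal Monte Carlo probabilistic power flow can be sketched with a DC (linearized) network model: sample random nodal injections, solve for the angles, and collect the distribution of the line flows. The 3-bus system below is purely illustrative.

        import numpy as np

        rng = np.random.default_rng(7)

        # 3-bus DC power flow: bus 0 is slack; lines (0-1), (1-2), (0-2), x = 0.1 p.u.
        B = np.array([[20.0, -10.0],
                      [-10.0, 20.0]])       # reduced susceptance matrix (buses 1, 2)
        lines = [(0, 1), (1, 2), (0, 2)]
        b_line = 10.0                        # 1/x for every line

        n = 50_000
        # Random injections at buses 1 and 2 (loads negative); statistics invented
        P = np.column_stack([rng.normal(-0.6, 0.12, n), rng.normal(-0.4, 0.10, n)])

        theta = np.linalg.solve(B, P.T).T    # bus angles (slack angle = 0)
        th = np.hstack([np.zeros((n, 1)), theta])
        flows = np.array([b_line * (th[:, i] - th[:, j]) for i, j in lines]).T

        # Probabilistic description of the flow on line 0-2
        print(f"line 0-2: mean {flows[:, 2].mean():.3f} p.u., "
              f"P(|flow| > 0.5) = {(np.abs(flows[:, 2]) > 0.5).mean():.3%}")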

  1. Mid-IR Properties of an Unbiased AGN Sample of the Local Universe. 1; Emission-Line Diagnostics

    Science.gov (United States)

    Weaver, K. A.; Melendez, M.; Mushotzky, R. F.; Kraemer, S.; Engle, K.; Malumuth, E.; Tueller, J.; Markwardt, C.; Berghea, C. T.; Dudik, R. P.; et al.

    2010-01-01

    We compare mid-IR emission-line properties from high-resolution Spitzer IRS spectra of a statistically complete hard X-ray (14-195 keV) selected sample of nearby (z < 0.05) AGN detected by the Burst Alert Telescope (BAT) aboard Swift. The luminosity distributions for the mid-infrared emission lines [O IV] 25.89 microns, [Ne II] 12.81 microns, [Ne III] 15.56 microns and [Ne V] 14.32 microns, and the hard X-ray continuum show no differences between the Seyfert 1 and Seyfert 2 populations, although six newly discovered BAT AGNs are shown to be under-luminous in [O IV], most likely the result of dust extinction in the host galaxy. The overall tightness of the correlations between the mid-infrared lines and the BAT luminosities suggests that the emission lines primarily arise in gas ionized by the AGN. We also compared the mid-IR emission lines in the BAT AGNs with those from published studies of star-forming galaxies and LINERs. We found that the BAT AGN fall into a distinctive region when comparing the [Ne III]/[Ne II] and [O IV]/[Ne III] ratios. From this we found that sources previously classified in the mid-infrared/optical as AGN have smaller emission-line ratios than those found for the BAT AGNs, suggesting that, in our X-ray selected sample, the AGN represents the main contribution to the observed line emission. Overall, we present a different set of emission-line diagnostics to distinguish between AGN and star-forming galaxies that can be used as a tool to find new AGN.

  2. Strategic Team AI Path Plans: Probabilistic Pathfinding

    Directory of Open Access Journals (Sweden)

    Tng C. H. John

    2008-01-01

    This paper proposes a novel method to generate strategic team AI pathfinding plans for computer games and simulations using probabilistic pathfinding. The method is inspired by genetic algorithms (Russell and Norvig, 2002), in that a fitness function is used to test the quality of the path plans. The method generates high-quality path plans by eliminating low-quality ones: the path plans are generated by probabilistic pathfinding, and the elimination is done by a fitness test of the path plans. This path-plan generation method is able to produce varied, high-quality paths, which is desirable in games to increase replay value. This work is an extension of our earlier work on team AI: probabilistic pathfinding (John et al., 2006). We explore ways to combine probabilistic pathfinding and genetic algorithms to create a new method for generating strategic team AI pathfinding plans.
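
    A toy version of the generate-and-eliminate scheme, ours rather than the paper's: candidate paths come from a goal-biased random walk (the probabilistic pathfinding step) and a fitness test keeps the elite.

        import numpy as np

        rng = np.random.default_rng(11)
        start, goal, size = (0, 0), (9, 9), 10

        def random_path(max_steps=60):
            """Biased random walk toward the goal (probabilistic path generation)."""
            p, path = start, [start]
            for _ in range(max_steps):
                if p == goal:
                    break
                moves = [(1, 0), (0, 1), (-1, 0), (0, -1)]
                # favour moves that reduce the Manhattan distance to the goal
                w = np.array([0.35 if (abs(goal[0]-p[0]-dx) + abs(goal[1]-p[1]-dy))
                              < (abs(goal[0]-p[0]) + abs(goal[1]-p[1])) else 0.15
                              for dx, dy in moves])
                dx, dy = moves[rng.choice(4, p=w / w.sum())]
                p = (min(max(p[0]+dx, 0), size-1), min(max(p[1]+dy, 0), size-1))
                path.append(p)
            return path

        def fitness(path):
            """Shorter paths that actually reach the goal score higher."""
            return (path[-1] == goal) / (1 + len(path))

        # Generate many candidate plans, eliminate the low-fitness ones, keep the elite
        plans = [random_path() for _ in range(500)]
        elite = sorted(plans, key=fitness, reverse=True)[:10]
        print("best plan length:", len(elite[0]))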

  3. MULTI-LINE STOKES INVERSION FOR PROMINENCE MAGNETIC-FIELD DIAGNOSTICS

    International Nuclear Information System (INIS)

    Casini, R.; Lopez Ariste, A.; Paletou, F.; Leger, L.

    2009-01-01

    We present test results on the simultaneous inversion of the Stokes profiles of the He I lines at 587.6 nm (D_3) and 1083.0 nm in prominences (90 deg. scattering). We created data sets of synthetic Stokes profiles for the case of quiescent prominences, adopting 10^-3 of the peak intensity as the polarimetric sensitivity of the simulated observations. In this work, we focus on the error analysis for the inference of the magnetic field vector, under the usual assumption that the prominence can be assimilated to a slab of finite optical thickness with uniform magnetic and thermodynamic properties. We find that the simultaneous inversion of the two lines significantly reduces the errors in the inference of the magnetic field vector with respect to the case of single-line inversion. These results provide a solid justification for current and future instrumental efforts with multi-line capabilities for the observation of solar prominences and filaments.

  4. Linac4 chopper line commissioning strategy

    CERN Document Server

    Bellodi, G; Lombardi, A M; Posocco, P A; Sargsyan, E

    2010-01-01

    The report outlines the strategy for beam-based commissioning of the Linac4 3 MeV chopper line as currently scheduled to start in the second half of 2011 in the Test Stand Area. A dedicated temporary diagnostics test bench will complement the measurement devices foreseen for permanent installation in the chopper line. A commissioning procedure is set out as a series of consecutive phases, each one supposed to meet a well-defined milestone in the path to fully characterise the beam-line. Specific set-ups for each stage are defined in terms of beam characteristics, machine settings and diagnostics used. Operational guidelines are given and expected results at the relative points of measurements are shown for simulated scenarios (on the basis of multi-particle tracking studies carried out with the codes PATH and TRACEWin). These are then interpreted in the light of the resolution limits of the available diagnostics instruments to assess the precision reach on individual measurements and the feasibility of techn...

  5. A Markov Chain Approach to Probabilistic Swarm Guidance

    Science.gov (United States)

    Acikmese, Behcet; Bayard, David S.

    2012-01-01

    This paper introduces a probabilistic guidance approach for the coordination of swarms of autonomous agents. The main idea is to drive the swarm to a prescribed density distribution in a prescribed region of the configuration space. In its simplest form, the probabilistic approach is completely decentralized and does not require communication or collaboration between agents. Agents make statistically independent probabilistic decisions based solely on their own state, that ultimately guides the swarm to the desired density distribution in the configuration space. In addition to being completely decentralized, the probabilistic guidance approach has a novel autonomous self-repair property: once the desired swarm density distribution is attained, the agents automatically repair any damage to the distribution without collaborating and without any knowledge about the damage.
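
    One standard way to realize such probabilistic guidance is to synthesize a Markov transition matrix whose stationary distribution is the desired density (here via a Metropolis construction, our choice of synthesis, not necessarily the paper's) and let each agent transition independently. A sketch with a 4-bin configuration space:

        import numpy as np

        rng = np.random.default_rng(5)

        # Desired swarm density over 4 bins of the configuration space
        v = np.array([0.1, 0.2, 0.3, 0.4])

        # Uniform proposal plus Metropolis acceptance makes v the stationary distribution
        n_bins = len(v)
        M = np.zeros((n_bins, n_bins))
        for i in range(n_bins):
            for j in range(n_bins):
                if i != j:
                    M[i, j] = min(1.0, v[j] / v[i]) / n_bins
            M[i, i] = 1.0 - M[i].sum()

        # Each agent transitions independently according to row M[current]
        agents = np.zeros(10_000, dtype=int)          # all agents start in bin 0
        for _ in range(200):
            u = rng.random(agents.size)
            cdf = M[agents].cumsum(axis=1)
            agents = (u[:, None] > cdf).sum(axis=1)   # inverse-CDF sampling per agent

        print(np.bincount(agents, minlength=n_bins) / agents.size)  # approaches v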

  6. Considerations in applying on-line IC techniques to BWR's

    International Nuclear Information System (INIS)

    Kaleda, R.J.

    1992-01-01

    Ion chromatography (IC) has moved from its traditional role as a laboratory analytical tool to a real-time, dynamic, on-line measurement device used to follow ppb and sub-ppb concentrations of deleterious impurities in nuclear power plants. The Electric Power Research Institute (EPRI), individual utilities, and industry have all played significant roles in effecting the transition. This paper highlights considerations and the evolution of current on-line ion chromatography systems. The first applications of on-line techniques were demonstrated by General Electric (GE) under EPRI sponsorship at the Rancho Seco (1980), Calvert Cliffs, and McGuire nuclear units, primarily for diagnostic purposes. Today, on-line IC applications have been expanded to include process control and routine plant monitoring. Current on-line ICs are innovative in design, promote operational simplicity, are modular for simplified maintenance and repair, and use field-proven components that enhance reliability. Conductivity detection with electronic or chemical suppression and spectrometric detection techniques are intermixed in applications. Remote multi-point sample systems have addressed memory effects. Early applications measured ionic species in the part-per-billion range; today, reliable part-per-trillion measurements are common for on-line systems. Current systems are meeting the challenge of EPRI guideline requirements. Today's on-line ICs, with programmed sampling systems, monitor fluid streams throughout a power plant, supplying data that can be trended, stored and retrieved easily. The on-line IC has come of age. Many technical challenges were overcome to achieve today's IC

  7. Probabilistic risk assessment in nuclear power plant regulation

    Energy Technology Data Exchange (ETDEWEB)

    Wall, J B

    1980-09-01

    A specific program is recommended to utilize probabilistic risk assessment more effectively in nuclear power plant regulation. It is based upon the engineering insights from the Reactor Safety Study (WASH-1400) and some follow-on risk assessment research by the USNRC. The Three Mile Island accident is briefly discussed from a risk viewpoint to illustrate a weakness in current practice. The development of a probabilistic safety goal is recommended, with some suggestions on underlying principles. Some ongoing work on risk perception and the draft probabilistic safety goal being reviewed in Canada are described. Some suggestions are offered on further risk assessment research. Finally, some recent U.S. Nuclear Regulatory Commission actions are described.

  8. Probabilistic finite elements for fracture mechanics

    Science.gov (United States)

    Besterfield, Glen

    1988-01-01

    The probabilistic finite element method (PFEM) is developed for probabilistic fracture mechanics (PFM). A finite element which has the near-crack-tip singular strain embedded in the element is used. Probabilistic quantities, such as the expectation, covariance and correlation of the stress intensity factors, are calculated for random load, random material properties and random crack length. The method is computationally quite efficient and can be expected to determine the probability of fracture or reliability.
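
    The flavor of such second-moment calculations can be shown with first-order propagation through the closed-form K = Y*sigma*sqrt(pi*a), used here as a stand-in for the embedded-singularity finite element; all values are illustrative.

        import numpy as np
        from scipy.stats import norm

        Y = 1.12                                  # geometry factor (edge crack)
        m_s, sd_s = 100.0, 10.0                   # stress mean/std [MPa]
        m_a, sd_a = 0.005, 0.001                  # crack length mean/std [m]

        K_mean = Y * m_s * np.sqrt(np.pi * m_a)
        dK_ds = Y * np.sqrt(np.pi * m_a)          # partial derivatives at the means
        dK_da = Y * m_s * np.sqrt(np.pi) / (2 * np.sqrt(m_a))
        K_sd = np.hypot(dK_ds * sd_s, dK_da * sd_a)

        # Probability of fracture P(K > K_Ic) under a normal approximation
        K_Ic = 18.0                               # fracture toughness [MPa*sqrt(m)]
        print(f"K: {K_mean:.2f} +/- {K_sd:.2f}; "
              f"P(fracture) = {norm.sf(K_Ic, K_mean, K_sd):.3e}")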

  9. Adaptive predictors based on probabilistic SVM for real time disruption mitigation on JET

    Science.gov (United States)

    Murari, A.; Lungaroni, M.; Peluso, E.; Gaudio, P.; Vega, J.; Dormido-Canto, S.; Baruzzo, M.; Gelfusa, M.; Contributors, JET

    2018-05-01

    Detecting disruptions with sufficient anticipation time is essential to undertake any form of remedial strategy, mitigation or avoidance. Traditional predictors based on machine learning techniques can perform very well, if properly optimised, but they do not provide a natural estimate of the quality of their outputs and they typically age very quickly. In this paper a new set of tools, based on probabilistic extensions of support vector machines (SVM), are introduced and applied for the first time to JET data. The probabilistic output constitutes a natural qualification of the prediction quality and provides additional flexibility. An adaptive training strategy 'from scratch' has also been devised, which allows the performance to be preserved even when the experimental conditions change significantly. Large JET databases of disruptions, covering entire campaigns and thousands of discharges, have been analysed, both for the case of the graphite wall and the ITER-Like Wall. Performance significantly better than that of any previous predictor using adaptive training has been achieved, satisfying even the requirements of the next generation of devices. The adaptive approach to training has also provided unique information about the evolution of the operational space. The fact that the developed tools give the probability of disruption improves the interpretability of the results, provides an estimate of the predictor quality and gives new insights into the physics. Moreover, the probabilistic treatment makes it easier to insert these classifiers into general decision support and control systems.
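
    Probabilistic SVM outputs of this kind are commonly obtained by Platt scaling, i.e., fitting a sigmoid to the SVM scores. A generic sketch with synthetic stand-in features (not JET data and not the authors' adaptive scheme):

        import numpy as np
        from sklearn.svm import SVC

        rng = np.random.default_rng(2)

        # Synthetic stand-in for discharge features; class 1 = disruptive, 0 = safe
        X0 = rng.normal(0.0, 1.0, (300, 2))
        X1 = rng.normal(2.0, 1.0, (300, 2))
        X = np.vstack([X0, X1]); y = np.r_[np.zeros(300), np.ones(300)]

        # probability=True enables Platt scaling: a sigmoid fitted to the SVM
        # scores in cross-validation, yielding calibrated P(disruption | x)
        clf = SVC(kernel="rbf", probability=True).fit(X, y)

        p = clf.predict_proba([[1.8, 1.5]])[0, 1]
        print(f"P(disruption) = {p:.2f}")   # trigger mitigation above a threshold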

  10. Diagnostic causal reasoning with verbal information.

    Science.gov (United States)

    Meder, Björn; Mayrhofer, Ralf

    2017-08-01

    In diagnostic causal reasoning, the goal is to infer the probability of causes from one or multiple observed effects. Typically, studies investigating such tasks provide subjects with precise quantitative information regarding the strength of the relations between causes and effects or sample data from which the relevant quantities can be learned. By contrast, we sought to examine people's inferences when causal information is communicated through qualitative, rather vague verbal expressions (e.g., "X occasionally causes A"). We conducted three experiments using a sequential diagnostic inference task, where multiple pieces of evidence were obtained one after the other. Quantitative predictions of different probabilistic models were derived using the numerical equivalents of the verbal terms, taken from an unrelated study with different subjects. We present a novel Bayesian model that allows for incorporating the temporal weighting of information in sequential diagnostic reasoning, which can be used to model both primacy and recency effects. On the basis of 19,848 judgments from 292 subjects, we found a remarkably close correspondence between the diagnostic inferences made by subjects who received only verbal information and those of a matched control group to whom information was presented numerically. Whether information was conveyed through verbal terms or numerical estimates, diagnostic judgments closely resembled the posterior probabilities entailed by the causes' prior probabilities and the effects' likelihoods. We observed interindividual differences regarding the temporal weighting of evidence in sequential diagnostic reasoning. Our work provides pathways for investigating judgment and decision making with verbal information within a computational modeling framework.
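
    The temporal-weighting idea can be sketched by down-weighting older log-likelihoods in a sequential Bayesian update. This is a generic illustration, not the authors' fitted model; all probabilities are invented.

        import numpy as np

        # Two candidate causes with prior probabilities and per-effect likelihoods
        prior = np.array([0.3, 0.7])                 # P(cause)
        lik = np.array([[0.8, 0.2],                  # P(effect_k | cause), 3 effects
                        [0.6, 0.5],
                        [0.1, 0.7]])

        def posterior(observed, decay=1.0):
            """Sequential diagnostic inference; decay < 1 down-weights older
            evidence (a simple recency model; decay > 1 would favour primacy)."""
            log_p = np.log(prior)
            n = len(observed)
            for k, eff in enumerate(observed):
                w = decay ** (n - 1 - k)             # older evidence weighs less
                log_p += w * np.log(lik[eff])
            p = np.exp(log_p - log_p.max())
            return p / p.sum()

        print(posterior([0, 1, 2]))                  # equal weighting
        print(posterior([0, 1, 2], decay=0.7))       # recency-weighted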

  11. Cooperation in an evolutionary prisoner’s dilemma game with probabilistic strategies

    International Nuclear Information System (INIS)

    Li Haihong; Dai Qionglin; Cheng Hongyan; Yang Junzhong

    2012-01-01

    Highlights: ► Probabilistic strategies are introduced in place of the pure C/D strategies in the PDG. ► The strategy patterns depend on interaction structures and updating rules. ► There exists an optimal increment of the probabilistic strategy. - Abstract: In this work, we investigate an evolutionary prisoner’s dilemma game in structured populations with probabilistic strategies instead of the pure strategies of cooperation and defection. We explore the model in detail by considering different strategy update rules and different population structures. We find that the distribution of probabilistic strategy patterns depends on both the interaction structures and the updating rules. We also find that, when an individual updates her strategy by increasing or decreasing her probabilistic strategy a certain amount towards that of her opponent, there exists an optimal increment of the probabilistic strategy at which the cooperator frequency reaches its maximum.
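
    A compact simulation of the described update rule, in which a player moves her cooperation probability by a fixed increment toward a better-performing neighbour; the lattice, payoff values and increment below are illustrative, not the paper's settings.

        import numpy as np

        rng = np.random.default_rng(9)
        L, b, delta = 20, 1.2, 0.1            # lattice size, temptation, increment

        p = rng.random((L, L))                # cooperation probability per player

        def payoff(pi, pj):
            """Expected PD payoff of mixed strategy pi against pj (R=1, T=b, S=P=0)."""
            return pi * pj * 1.0 + (1 - pi) * pj * b

        def total(x, y):
            """Total payoff against the four lattice neighbours (periodic)."""
            return sum(payoff(p[x, y], p[(x + a) % L, (y + c) % L])
                       for a, c in [(1, 0), (-1, 0), (0, 1), (0, -1)])

        for _ in range(5000):
            i, j = rng.integers(L, size=2)
            di, dj = [(1, 0), (-1, 0), (0, 1), (0, -1)][rng.integers(4)]
            ni, nj = (i + di) % L, (j + dj) % L
            if total(ni, nj) > total(i, j):   # step toward the better neighbour
                p[i, j] = np.clip(p[i, j] + delta * np.sign(p[ni, nj] - p[i, j]), 0, 1)

        print(f"mean cooperation probability: {p.mean():.3f}")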

  12. Probabilistic Learning by Rodent Grid Cells.

    Science.gov (United States)

    Cheung, Allen

    2016-10-01

    Mounting evidence shows mammalian brains are probabilistic computers, but the specific cells involved remain elusive. Parallel research suggests that grid cells of the mammalian hippocampal formation are fundamental to spatial cognition, but their diverse response properties still defy explanation. No plausible model exists which explains stable grids in darkness for twenty minutes or longer, even though this was one of the first results ever published on grid cells. Similarly, no current explanation can tie together grid fragmentation and grid rescaling, which show very different forms of flexibility in grid responses when the environment is varied. Other properties, such as attractor dynamics and grid anisotropy, seem to be at odds with one another unless additional properties are assumed, such as a varying velocity gain. Modelling efforts have largely ignored the breadth of response patterns, while also failing to account for the disastrous effects of sensory noise during spatial learning and recall, especially in darkness. Here, published electrophysiological evidence from a range of experiments is reinterpreted using a novel probabilistic learning model, which shows that grid cell responses are accurately predicted by a probabilistic learning process. Diverse response properties of probabilistic grid cells are statistically indistinguishable from rat grid cells across key manipulations. A simple, coherent set of probabilistic computations explains stable grid fields in darkness, partial grid rescaling in resized arenas, low-dimensional attractor grid cell dynamics, and grid fragmentation in hairpin mazes. The same computations also reconcile oscillatory dynamics at the single-cell level with attractor dynamics at the cell-ensemble level. Additionally, a clear functional role for boundary cells is proposed for spatial learning. These findings provide a parsimonious and unified explanation of grid cell function, and implicate grid cells as an accessible neuronal population

  13. Compression of Probabilistic XML documents

    NARCIS (Netherlands)

    Veldman, Irma

    2009-01-01

    Probabilistic XML (PXML) files resulting from data integration can become extremely large, which is undesired. For XML there are several techniques available to compress the document and since probabilistic XML is in fact (a special form of) XML, it might benefit from these methods even more. In

  14. Probabilistic uniformities of uniform spaces

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez Lopez, J.; Romaguera, S.; Sanchis, M.

    2017-07-01

    The theory of metric spaces in the fuzzy context has shown to be an interesting area of study, not only from a theoretical point of view but also for its applications. Nevertheless, it is usual to consider these spaces as classical topological or uniform spaces, and there are not too many results about constructing fuzzy topological structures starting from a fuzzy metric. Maybe Höhle was the first to show how to construct a probabilistic uniformity and a Lowen uniformity from a probabilistic pseudometric [Hohle78, Hohle82a]. His method can be directly translated to the context of fuzzy metrics and allows one to characterize the categories of probabilistic uniform spaces or Lowen uniform spaces by means of certain families of fuzzy pseudometrics [RL]. On the other hand, other different fuzzy uniformities can be constructed in a fuzzy metric space: a Hutton [0,1]-quasi-uniformity [GGPV06]; a fuzzifying uniformity [YueShi10], etc. The paper [GGRLRo] gives a study of several methods of endowing a fuzzy pseudometric space with a probabilistic uniformity and a Hutton [0,1]-quasi-uniformity. In 2010, J. Gutiérrez García, S. Romaguera and M. Sanchis [GGRoSanchis10] proved that the category of uniform spaces is isomorphic to a category formed by sets endowed with a fuzzy uniform structure, i.e. a family of fuzzy pseudometrics satisfying certain conditions. We will show here that, by means of this isomorphism, we can obtain several methods to endow a uniform space with a probabilistic uniformity. Furthermore, these constructions allow us to obtain a factorization of some functors introduced in [GGRoSanchis10]. (Author)

  15. Probabilistic Infinite Secret Sharing

    OpenAIRE

    Csirmaz, László

    2013-01-01

    The study of probabilistic secret sharing schemes using arbitrary probability spaces and possibly infinite number of participants lets us investigate abstract properties of such schemes. It highlights important properties, explains why certain definitions work better than others, connects this topic to other branches of mathematics, and might yield new design paradigms. A probabilistic secret sharing scheme is a joint probability distribution of the shares and the secret together with a colle...

  16. GUI program to compute probabilistic seismic hazard analysis

    International Nuclear Information System (INIS)

    Shin, Jin Soo; Chi, H. C.; Cho, J. C.; Park, J. H.; Kim, K. G.; Im, I. S.

    2006-12-01

    The development of a program to compute probabilistic seismic hazard, based on a graphical user interface (GUI), has been completed. The main program consists of three parts: the data input processes, the probabilistic seismic hazard analysis, and the result output processes. The probabilistic seismic hazard analysis needs various input data representing attenuation formulae, the seismic zoning map, and the earthquake event catalog. The input procedure of previous programs, based on a text interface, takes much time to prepare the data, and in those existing methods the data cannot be checked directly on screen to catch erroneous input. The new program simplifies the input process and enables the data to be checked graphically, in order to minimize input errors as far as possible
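
    The core hazard integral that such a program evaluates can be sketched for a single source with a truncated Gutenberg-Richter magnitude distribution and a toy attenuation relation; all parameters below are illustrative only.

        import numpy as np
        from scipy.stats import norm

        # Annual exceedance rate: lambda(a) = nu * sum_m P(M=m) * P(PGA > a | m, r)
        nu, b_val = 0.05, 1.0                 # events/yr above Mmin; G-R b-value
        Mmin, Mmax, r = 5.0, 7.5, 30.0        # magnitude range; source distance [km]

        m = np.linspace(Mmin, Mmax, 100)
        beta = b_val * np.log(10.0)
        pdf = beta * np.exp(-beta * (m - Mmin)) / (1 - np.exp(-beta * (Mmax - Mmin)))
        pm = pdf * (m[1] - m[0])              # discretized magnitude probabilities

        def ln_median_pga(m, r):              # toy attenuation relation
            return -3.5 + 1.0 * m - 1.0 * np.log(r)

        sigma = 0.6                           # aleatory variability of ln PGA

        a_grid = np.logspace(-2, 0, 30)       # PGA levels [g]
        lam = [nu * np.sum(pm * norm.sf(np.log(a), ln_median_pga(m, r), sigma))
               for a in a_grid]

        for a, l in zip(a_grid[::10], lam[::10]):
            print(f"PGA > {a:.3f} g : {l:.2e} /yr")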

  17. Probabilistic Damage Stability Calculations for Ships

    DEFF Research Database (Denmark)

    Jensen, Jørgen Juncher

    1996-01-01

    The aim of these notes is to provide background material for the present probabilistic damage stability rules for dry cargo ships. The formulas for the damage statistics are derived, and shortcomings as well as possible improvements are discussed. The advantage of the definition of fictitious...... compartments in the formulation of a computer-based general procedure for probabilistic damaged stability assessment is shown. Some comments are given on the current state of knowledge on the ship survivability in damaged conditions. Finally, problems regarding proper account of water ingress through openings...

  18. Probabilistic modeling of discourse-aware sentence processing.

    Science.gov (United States)

    Dubey, Amit; Keller, Frank; Sturt, Patrick

    2013-07-01

    Probabilistic models of sentence comprehension are increasingly relevant to questions concerning human language processing. However, such models are often limited to syntactic factors. This restriction is unrealistic in light of experimental results suggesting interactions between syntax and other forms of linguistic information in human sentence processing. To address this limitation, this article introduces two sentence processing models that augment a syntactic component with information about discourse co-reference. The novel combination of probabilistic syntactic components with co-reference classifiers permits them to more closely mimic human behavior than existing models. The first model uses a deep model of linguistics, based in part on probabilistic logic, allowing it to make qualitative predictions on experimental data; the second model uses shallow processing to make quantitative predictions on a broad-coverage reading-time corpus. Copyright © 2013 Cognitive Science Society, Inc.

  19. The Role of Language in Building Probabilistic Thinking

    Science.gov (United States)

    Nacarato, Adair Mendes; Grando, Regina Célia

    2014-01-01

    This paper is based on research that investigated the development of probabilistic language and thinking by students 10-12 years old. The focus was on the adequate use of probabilistic terms in social practice. A series of tasks was developed for the investigation and completed by the students working in groups. The discussions were video recorded…

  20. Probabilistic simple sticker systems

    Science.gov (United States)

    Selvarajoo, Mathuri; Heng, Fong Wan; Sarmin, Nor Haniza; Turaev, Sherzod

    2017-04-01

    A model for DNA computing using the recombination behavior of DNA molecules, known as a sticker system, was introduced by L. Kari, G. Paun, G. Rozenberg, A. Salomaa, and S. Yu in the paper entitled "DNA computing, sticker systems and universality", Acta Informatica, vol. 35, pp. 401-420, 1998. A sticker system uses the Watson-Crick complementarity feature of DNA molecules: starting from incomplete double-stranded sequences, sticking operations are applied iteratively until a complete double-stranded sequence is obtained. It is known that sticker systems with finite sets of axioms and sticker rules generate only regular languages. Hence, different types of restrictions have been considered to increase the computational power of sticker systems. Recently, a variant of restricted sticker systems, called probabilistic sticker systems, has been introduced [4]. In this variant, probabilities are initially associated with the axioms, and the probability of a generated string is computed by multiplying the probabilities of all occurrences of the initial strings in the computation of the string. Strings for the language are selected according to some probabilistic requirements. In this paper, we study fundamental properties of probabilistic simple sticker systems. We prove that the probabilistic enhancement increases the computational power of simple sticker systems.
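
    The probability computation described above (multiplying the probabilities of all axiom occurrences, then selecting strings against a threshold) can be sketched in a few lines of Python. The axiom names, probabilities and cut-point below are invented for illustration.

        # Assumed axioms with initial probabilities.
        axiom_prob = {"A1": 0.5, "A2": 0.3, "A3": 0.2}

        def string_probability(derivation):
            # Product of the probabilities of all axiom occurrences
            # used in the computation of the string.
            p = 1.0
            for axiom in derivation:
                p *= axiom_prob[axiom]
            return p

        p = string_probability(["A1", "A1", "A2"])   # a strand built from A1, A1, A2
        print(p)                                     # 0.075
        print("in language:", p >= 0.05)             # cut-point selection (threshold assumed)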

  1. Diagnostic performance of line-immunoassay based algorithms for incident HIV-1 infection

    Directory of Open Access Journals (Sweden)

    Schüpbach Jörg

    2012-04-01

    Full Text Available Abstract. Background: Serologic testing algorithms for recent HIV seroconversion (STARHS) provide important information for HIV surveillance. We have previously demonstrated that a patient's antibody reaction pattern in a confirmatory line immunoassay (INNO-LIA™ HIV I/II Score) provides information on the duration of infection, which is unaffected by clinical, immunological and viral variables. In this report we set out to determine the diagnostic performance of Inno-Lia algorithms for identifying incident infections in patients with known duration of infection, and evaluated the algorithms in annual cohorts of HIV notifications. Methods: Diagnostic sensitivity was determined in 527 treatment-naive patients infected for up to 12 months. Specificity was determined in 740 patients infected for longer than 12 months. Plasma was tested by Inno-Lia and classified as either incident (infected for up to 12 months) or older infection. Results: The 10 best algorithms had a mean raw sensitivity of 59.4% and a mean specificity of 95.1%. Adjustment for overrepresentation of patients in the first quarter year of infection further reduced the sensitivity. In the preferred model, the mean adjusted sensitivity was 37.4%. Application of the 10 best algorithms to four annual cohorts of HIV-1 notifications totalling 2595 patients yielded a mean IIR of 0.35 in 2005/6 (baseline) and of 0.45, 0.42 and 0.35 in 2008, 2009 and 2010, respectively. The increase between baseline and 2008 and the ensuing decreases were highly significant. Other adjustment models yielded different absolute IIR, although the relative changes between the cohorts were identical for all models. Conclusions: The method can be used for comparing IIR in annual cohorts of HIV notifications. The use of several different algorithms in combination, each with its own sensitivity and specificity to detect incident infection, is advisable as this reduces the impact of individual imperfections stemming primarily from relatively low sensitivities and

  2. Calibration of the charge exchange recombination spectroscopy diagnostic for core poloidal rotation velocity measurements on JET

    International Nuclear Information System (INIS)

    Crombe, K.; Andrew, Y.; Giroud, C.; Hawkes, N.C.; Murari, A.; Valisa, M.; Oost, G. van; Zastrow, K.-D.

    2004-01-01

    This article describes recent improvements in the measurement of C6+ impurity ion poloidal rotation velocities in the core plasma of JET using charge exchange recombination spectroscopy. Two independent techniques are used to provide an accurate line calibration. The first method uses a Perkin-Elmer type 303-306 samarium hollow cathode discharge lamp, with a Sm I line at 528.291 nm close to the C VI line at 529.1 nm. The second method uses the Be II line at 527.06 nm and the C III line at 530.47 nm in the plasma spectrum as two marker lines on either side of the C VI line. Since the viewing chords have both a toroidal and a poloidal component, it is important to determine the contribution of the toroidal rotation velocity component separately. The toroidal rotation velocity in the plasma core is measured with an independent charge exchange recombination spectroscopy diagnostic, looking tangentially at the plasma core. The contribution of this velocity along the lines of sight of the poloidal rotation diagnostic has been determined experimentally in L-mode plasmas keeping the poloidal component constant (K. Crombe et al., Proc. 30th EPS Conference, St. Petersburg, Russia, 7-11 July 2003, p. 1.55). The results from these experiments are compared with calculations of the toroidal contribution that take into account the original design parameters of the diagnostic and the magnetic geometry of individual shots
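
    A minimal sketch of the kind of geometric correction described above: subtracting the independently measured toroidal contribution from the line-of-sight velocity to recover the poloidal component. The simple cosine decomposition and all angles and velocities are assumptions for illustration and do not reproduce the actual JET chord geometry.

        import math

        def poloidal_velocity(v_los, v_tor, ang_tor_deg, ang_pol_deg):
            # v_los = v_tor*cos(ang_tor) + v_pol*cos(ang_pol), with ang_* the
            # angles between the chord and each rotation direction (assumed).
            v_tor_part = v_tor * math.cos(math.radians(ang_tor_deg))
            return (v_los - v_tor_part) / math.cos(math.radians(ang_pol_deg))

        # Illustrative numbers only: 5 km/s along the chord, 40 km/s toroidal
        # rotation, chord at 85 deg to the toroidal and 30 deg to the poloidal direction.
        print(poloidal_velocity(5e3, 40e3, 85.0, 30.0))   # ~1.75e3 m/s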

  3. Elaboration and installation of technology of on-line diagnostics of important equipment damage as a procedure of NPP lifetime management

    International Nuclear Information System (INIS)

    Bakirov, M.; Povarov, V.

    2012-01-01

    In contrast to conventional approaches to diagnostics (i.e. when inspection results are used as data for numerical calculative strength analysis), the specific feature of the proposed new approach is that it is based on application of the 'inverse problem' principle. To implement the proposed approach, a detailed numerical finite-element model of the monitored equipment must first be developed. Results of preliminary calculations allow a reasonable selection of the installation places and types of control sensors intended for more effective and precise work of the calculative model. As a rule, the control sensors are high-temperature strain gauges, temperature probes, pressure, acceleration and displacement sensors, as well as acoustic-emission and ultrasonic sensors used for monitoring of the actual defectiveness kinetics in the zone of potential damage. All the sensors work in on-line mode during several years of operation, the optimal frequency of data records is selected, and all recorded data, after prompt processing, are transferred to the finite-element calculation module for strength calculations of the monitored zone. The software for strength calculation must be based on an individual calculation code, since it should also work in on-line mode. Comprehensive strength analysis in conjunction with the obtained results of defectiveness kinetics monitoring allows not only the most unfavourable damage scenario to be foreseen, but also prompt analysis and elaboration of compensating measures to reduce operational loadings. In the report, the results of the development and practical application of the new approach and the corresponding technology at NPPs are presented. (author)

  4. Probabilistic safety goals. Phase 3 - Status report

    Energy Technology Data Exchange (ETDEWEB)

    Holmberg, J.-E. (VTT (Finland)); Knochenhauer, M. (Relcon Scandpower AB, Sundbyberg (Sweden))

    2009-07-15

    The first phase of the project (2006) described the status, concepts and history of probabilistic safety goals for nuclear power plants. The second and third phases (2007-2008) have provided guidance related to the resolution of some of the problems identified, and resulted in a common understanding regarding the definition of safety goals. The basic aim of phase 3 (2009) has been to increase the scope and level of detail of the project, and to start preparations of a guidance document. Based on the conclusions from the previous project phases, the following issues have been covered: 1) Extension of international overview. Analysis of results from the questionnaire performed within the ongoing OECD/NEA WGRISK activity on probabilistic safety criteria, including participation in the preparation of the working report for OECD/NEA/WGRISK (to be finalised in phase 4). 2) Use of subsidiary criteria and relations between these (to be finalised in phase 4). 3) Numerical criteria when using probabilistic analyses in support of deterministic safety analysis (to be finalised in phase 4). 4) Guidance for the formulation, application and interpretation of probabilistic safety criteria (to be finalised in phase 4). (LN)

  5. Probabilistic safety goals. Phase 3 - Status report

    International Nuclear Information System (INIS)

    Holmberg, J.-E.; Knochenhauer, M.

    2009-07-01

    The first phase of the project (2006) described the status, concepts and history of probabilistic safety goals for nuclear power plants. The second and third phases (2007-2008) have provided guidance related to the resolution of some of the problems identified, and resulted in a common understanding regarding the definition of safety goals. The basic aim of phase 3 (2009) has been to increase the scope and level of detail of the project, and to start preparations of a guidance document. Based on the conclusions from the previous project phases, the following issues have been covered: 1) Extension of international overview. Analysis of results from the questionnaire performed within the ongoing OECD/NEA WGRISK activity on probabilistic safety criteria, including participation in the preparation of the working report for OECD/NEA/WGRISK (to be finalised in phase 4). 2) Use of subsidiary criteria and relations between these (to be finalised in phase 4). 3) Numerical criteria when using probabilistic analyses in support of deterministic safety analysis (to be finalised in phase 4). 4) Guidance for the formulation, application and interpretation of probabilistic safety criteria (to be finalised in phase 4). (LN)

  6. Probabilistic Design and Analysis Framework

    Science.gov (United States)

    Strack, William C.; Nagpal, Vinod K.

    2010-01-01

    PRODAF is a software package designed to aid analysts and designers in conducting probabilistic analysis of components and systems. PRODAF can integrate multiple analysis programs to ease the tedious process of conducting a complex analysis that requires the use of multiple software packages. The work uses a commercial finite element analysis (FEA) program with modules from NESSUS to conduct a probabilistic analysis of a hypothetical turbine blade, disk, and shaft model. PRODAF applies the response surface method at the component level and extrapolates the component-level responses to the system level. Hypothetical components of a gas turbine engine are first deterministically modeled using FEA. Variations in selected geometrical dimensions and loading conditions are analyzed to determine their effects on the stress state within each component. Geometric variations include the chord length and height for the blade, and the inner radius, outer radius, and thickness for the disk. Probabilistic analysis is carried out using software packages under development, such as System Uncertainty Analysis (SUA) and PRODAF. PRODAF was used with a commercial deterministic FEA program in conjunction with modules from the probabilistic analysis program NESTEM to perturb loads and geometries to provide a reliability and sensitivity analysis. PRODAF simplified the handling of data among the various programs involved, and will work with many commercial and open-source deterministic programs, probabilistic programs, or modules.

  7. Probabilistic Role Models and the Guarded Fragment

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2004-01-01

    We propose a uniform semantic framework for interpreting probabilistic concept subsumption and probabilistic role quantification through statistical sampling distributions. This general semantic principle serves as the foundation for the development of a probabilistic version of the guarded fragment of first-order logic. A characterization of equivalence in that logic in terms of bisimulations is given....

  8. Probabilistic role models and the guarded fragment

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2006-01-01

    We propose a uniform semantic framework for interpreting probabilistic concept subsumption and probabilistic role quantification through statistical sampling distributions. This general semantic principle serves as the foundation for the development of a probabilistic version of the guarded fragment of first-order logic. A characterization of equivalence in that logic in terms of bisimulations is given....

  9. On the logical specification of probabilistic transition models

    CSIR Research Space (South Africa)

    Rens, G

    2013-05-01

    Full Text Available We investigate the requirements for specifying the behaviors of actions in a stochastic domain. That is, we propose how to write sentences in a logical language to capture a model of probabilistic transitions due to the execution of actions of some...

  10. Topics in Probabilistic Judgment Aggregation

    Science.gov (United States)

    Wang, Guanchun

    2011-01-01

    This dissertation is a compilation of several studies that are united by their relevance to probabilistic judgment aggregation. In the face of complex and uncertain events, panels of judges are frequently consulted to provide probabilistic forecasts, and aggregation of such estimates in groups often yields better results than could have been made…

  11. Coronal temperature diagnostics from high-resolution soft X-ray spectra

    Science.gov (United States)

    Strong, K. T.; Claflin, E. S.; Lemen, J. R.; Linford, G. A.

    1988-01-01

    The problem of deriving the temperature of the coronal plasma from soft X-ray spectra is discussed. Spectral atlas scans of the soft X-ray spectrum from the Flat Crystal Spectrometer on the Solar Maximum Mission are compared with theoretical predictions of the relative intensities of some of the brighter lines to determine which line intensity ratios give the most reliable temperature diagnostics. The techniques considered include line widths, He-like G ratios, intensity ratios, and ratios of lines formed by different elements. It is found that the best temperature diagnostics come from the ratios of lines formed by successive ionization stages of the same element.
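
    As a toy illustration of line-ratio thermometry, the sketch below inverts a monotonic theoretical intensity-ratio curve to recover an isothermal temperature. The emissivity functions are invented functional forms, not real atomic data from the study.

        import numpy as np

        T_grid = np.linspace(2e6, 2e7, 400)               # temperature grid [K]
        # Invented emissivity shapes for lines of successive ionization stages:
        G_hot  = np.exp(-1.2e7 / T_grid)
        G_cool = np.exp(-4.0e6 / T_grid) * (1e7 / T_grid)
        ratio_grid = G_hot / G_cool                       # monotonic in T here

        def temperature_from_ratio(observed_ratio):
            # Invert the theoretical ratio curve by interpolation.
            return np.interp(observed_ratio, ratio_grid, T_grid)

        print(temperature_from_ratio(0.2))                # ~6.7e6 K in this toy model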

  12. Application of the probabilistic method at the E.D.F

    International Nuclear Information System (INIS)

    Gachot, Bernard

    1976-01-01

    Having first evoked the problems arising from the definition of a so-called 'acceptable risk', the probabilistic safety study programme carried out at the E.D.F. is described. The different aspects of the probabilistic estimation of a hazard are presented, as well as the different steps, i.e. collecting the information and carrying out quantitative and qualitative analyses, which characterize the probabilistic study of safety problems. The problem of determining reliability data for the equipment is considered; it is concluded that, in spite of the limited accuracy of the present data, probabilistic methods already appear as a highly valuable tool favouring a homogeneous and coherent approach to nuclear plant safety [fr]

  13. Anatomicopathological basis and clinical diagnostic significance of Kerley's A line

    International Nuclear Information System (INIS)

    Wang Zhenguang; Ma Daqing; Chen Budong; He Wen; Wang Xinlian; Guan Yansheng; Zhang Yansong

    2007-01-01

    Objective: To study the anatomic and pathological basis of Kerley's A line, and to evaluate the role of Kerley's A line in the differential diagnosis of diffuse lung diseases (DLD). Methods: HRCT scans, gross specimen sections (50-100 μm thickness) and histologic sections (5-8 μm thickness) were performed and analyzed comparatively on 28 dry lung specimens from patients with coal worker's pneumoconiosis and an occupational history of exposure to coal dusts. At the same time, HRCT images of 176 patients with DLD were retrospectively reviewed for the detection of Kerley's A line. Results: Kerley's A lines were seen in 17 of 28 lung specimens on coronal HRCT images. The anatomic basis of Kerley's A line was the continuity of two or more thickened interlobular septa (14 cases) and incomplete fibrotic septa between segments or subsegments (3 cases). Histologically, the linear opacities represented deposits of coal dust, fibrosis, edema, inflammation and thickened vessel walls within interlobular septa. Kerley's A lines were present in 11 of 176 patients (6.3%), including interstitial pulmonary edema (5 cases), viral pneumonia (2 cases), lymphangitic carcinomatosis (2 cases), sarcoidosis (1 case) and pulmonary alveolar proteinosis (1 case). Conclusion: Kerley's A line has limited usefulness in the differential diagnosis of DLD because it is seen infrequently and is not easily discernible. (authors)

  14. Probabilistic somatotopy of the spinothalamic pathway at the ventroposterolateral nucleus of the thalamus in the human brain.

    Science.gov (United States)

    Hong, J H; Kwon, H G; Jang, S H

    2011-08-01

    The STP has been regarded as the most plausible neural tract responsible for the pathogenesis of central poststroke pain. The VPL nucleus has been a target for neurosurgical procedures for control of central poststroke pain. However, to our knowledge, no DTI studies have been conducted to investigate the somatotopic location of the STP at the VPL nucleus of the thalamus. In the current study, we attempted to investigate this location in the human brain by using a probabilistic tractography technique of DTI. DTI was performed at 1.5T by using a Synergy-L SENSE head coil. STPs for both the hand and leg were obtained by selection of fibers passing through 2 regions of interest (the area of the spinothalamic tract in the posterolateral medulla and the postcentral gyrus) for 41 healthy volunteers. Somatotopic mapping was obtained from the highest probabilistic location at the ACPC level. The highest probabilistic locations for the hand and leg were an average of 16.86 and 16.37 mm lateral to the ACPC line and 7.53 and 8.71 mm posterior to the midpoint of the ACPC line, respectively. Somatotopic locations for the hand and leg were different in the anteroposterior direction (P < .05). We found the somatotopic locations for the hand and leg of the STP at the VPL nucleus; these somatotopies were arranged in the anteroposterior direction.

  15. Measuring reliability under epistemic uncertainty: Review on non-probabilistic reliability metrics

    Directory of Open Access Journals (Sweden)

    Kang Rui

    2016-06-01

    Full Text Available In this paper, a systematic review of non-probabilistic reliability metrics is conducted to assist the selection of appropriate reliability metrics to model the influence of epistemic uncertainty. Five frequently used non-probabilistic reliability metrics are critically reviewed, i.e., evidence-theory-based reliability metrics, interval-analysis-based reliability metrics, fuzzy-interval-analysis-based reliability metrics, possibility-theory-based reliability metrics (posbist reliability) and uncertainty-theory-based reliability metrics (belief reliability). It is pointed out that a qualified reliability metric that is able to consider the effect of epistemic uncertainty needs to (1) compensate for the conservatism in the estimations of the component-level reliability metrics caused by epistemic uncertainty, and (2) satisfy the duality axiom; otherwise it might lead to paradoxical and confusing results in engineering applications. The five commonly used non-probabilistic reliability metrics are compared in terms of these two properties, and the comparison can serve as a basis for the selection of the appropriate reliability metrics.
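
    The duality axiom mentioned above can be made concrete with a small evidence-theory example: belief (Bel) and plausibility (Pl) bracket the reliability, and Bel alone is not self-dual, which is the kind of behaviour the review flags. The mass assignments below are assumed for illustration.

        # Frame of discernment {work, fail}; the masses m() are assumed.
        m = {frozenset({"work"}): 0.6,
             frozenset({"fail"}): 0.1,
             frozenset({"work", "fail"}): 0.3}   # epistemic ignorance

        def bel(A):   # total mass that certainly supports A
            return sum(v for s, v in m.items() if s <= A)

        def pl(A):    # total mass that does not contradict A
            return sum(v for s, v in m.items() if s & A)

        A, Ac = frozenset({"work"}), frozenset({"fail"})
        print(bel(A), pl(A))      # [0.6, 0.9] reliability interval
        print(bel(A) + bel(Ac))   # 0.7 < 1: Bel is not self-dual
        print(bel(A) + pl(Ac))    # 1.0: Bel and Pl are dual to each other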

  16. Staged decision making based on probabilistic forecasting

    Science.gov (United States)

    Booister, Nikéh; Verkade, Jan; Werner, Micha; Cranston, Michael; Cumiskey, Lydia; Zevenbergen, Chris

    2016-04-01

    Flood forecasting systems reduce, but cannot eliminate, uncertainty about the future. Probabilistic forecasts explicitly show that uncertainty remains. However, compared to deterministic forecasts, a dimension is added ('probability' or 'likelihood'), and with this added dimension decision making is made slightly more complicated. A technique of decision support is the cost-loss approach, which defines whether or not to issue a warning or implement mitigation measures (risk-based method). With the cost-loss method a warning will be issued when the ratio of the response costs to the damage reduction is less than or equal to the probability of the possible flood event. This cost-loss method is not widely used, because it motivates decisions based only on economic values and is a technique that is relatively static (no reasoning, yes/no decision). Nevertheless it has high potential to improve risk-based decision making based on probabilistic flood forecasting, because no other methods are known that deal with probabilities in decision making. The main aim of this research was to explore ways of making decision making based on probabilities with the cost-loss method better applicable in practice. The exploration began by identifying other situations in which decisions were taken based on uncertain forecasts or predictions. These cases spanned a range of degrees of uncertainty: from known uncertainty to deep uncertainty. Based on the types of uncertainties, concepts of dealing with situations and responses were analysed and possibly applicable concepts were chosen. Out of this analysis the concepts of flexibility and robustness appeared to fit the existing method. Instead of taking big decisions with bigger consequences at once, the idea is that actions and decisions are cut up into smaller pieces, and finally the decision to implement is made based on the economic costs of decisions and measures and the reduced effect of flooding. The more lead-time there is in
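
    The cost-loss rule quoted above ("warn when C/L <= p") reduces to a one-line test; the sketch below applies it to a probabilistic forecast. The cost and loss figures are invented for illustration.

        def issue_warning(cost, loss_reduction, prob_flood):
            # Risk-based rule: act when C/L <= p, i.e. the expected avoided
            # damage p*L outweighs the response cost C.
            return cost / loss_reduction <= prob_flood

        # Invented figures: response costs 20k, would avert 200k of damage.
        for p in (0.05, 0.10, 0.30):
            print(f"forecast p={p:.2f}: warn={issue_warning(20_000, 200_000, p)}")
        # C/L = 0.1, so the warning is issued once the forecast probability
        # reaches 10%, a threshold read directly off the probabilistic forecast.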

  17. Probabilistic analysis of a materially nonlinear structure

    Science.gov (United States)

    Millwater, H. R.; Wu, Y.-T.; Fossum, A. F.

    1990-01-01

    A probabilistic finite element program is used to perform probabilistic analysis of a materially nonlinear structure. The program used in this study is NESSUS (Numerical Evaluation of Stochastic Structure Under Stress), under development at Southwest Research Institute. The cumulative distribution function (CDF) of the radial stress of a thick-walled cylinder under internal pressure is computed and compared with the analytical solution. In addition, sensitivity factors showing the relative importance of the input random variables are calculated. Significant plasticity is present in this problem and has a pronounced effect on the probabilistic results. The random input variables are the material yield stress and internal pressure with Weibull and normal distributions, respectively. The results verify the ability of NESSUS to compute the CDF and sensitivity factors of a materially nonlinear structure. In addition, the ability of the Advanced Mean Value (AMV) procedure to assess the probabilistic behavior of structures which exhibit a highly nonlinear response is shown. Thus, the AMV procedure can be applied with confidence to other structures which exhibit nonlinear behavior.
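
    A minimal Monte Carlo sketch in the spirit of the study: sampling the internal pressure and reading off the empirical CDF of the radial stress via the elastic Lamé solution. Geometry and distribution parameters are assumed, and the plasticity that dominates the NESSUS results is deliberately omitted here.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000

        a, b, r = 0.10, 0.15, 0.10     # inner/outer/evaluation radius [m] (assumed)
        p = rng.normal(80e6, 8e6, n)   # internal pressure [Pa] (assumed moments)

        # Elastic Lame radial stress; the yield-stress variable enters only
        # through the plastic response, which this purely elastic sketch omits.
        sigma_r = p * a**2 / (b**2 - a**2) * (1.0 - b**2 / r**2)

        # One point of the empirical CDF of the radial stress:
        print("P(sigma_r <= -60 MPa) ~", np.mean(sigma_r <= -60e6))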

  18. Probabilistic dual heuristic programming-based adaptive critic

    Science.gov (United States)

    Herzallah, Randa

    2010-02-01

    Adaptive critic (AC) methods have common roots as generalisations of dynamic programming for neural reinforcement learning approaches. Since they approximate the dynamic programming solutions, they are potentially suitable for learning in noisy, non-linear and non-stationary environments. In this study, a novel probabilistic dual heuristic programming (DHP)-based AC controller is proposed. Distinct from current approaches, the proposed probabilistic DHP AC method takes the uncertainties of the forward model and inverse controller into consideration. Therefore, it is suitable for deterministic and stochastic control problems characterised by functional uncertainty. The theoretical development of the proposed method is validated by analytically evaluating the correct value of the cost function which satisfies the Bellman equation in a linear quadratic control problem. The target value of the probabilistic critic network is then calculated and shown to be equal to the analytically derived correct value. A full derivation of the Riccati solution for this non-standard stochastic linear quadratic control problem is also provided. Moreover, the performance of the proposed probabilistic controller is demonstrated on linear and non-linear control examples.

  19. Next-generation probabilistic seismicity forecasting

    Energy Technology Data Exchange (ETDEWEB)

    Hiemer, S.

    2014-07-01

    The development of probabilistic seismicity forecasts is one of the most important tasks of seismologists at present time. Such forecasts form the basis of probabilistic seismic hazard assessment, a widely used approach to generate ground motion exceedance maps. These hazard maps guide the development of building codes, and in the absence of the ability to deterministically predict earthquakes, good building and infrastructure planning is key to prevent catastrophes. Probabilistic seismicity forecasts are models that specify the occurrence rate of earthquakes as a function of space, time and magnitude. The models presented in this thesis are time-invariant mainshock occurrence models. Accordingly, the reliable estimation of the spatial and size distribution of seismicity are of crucial importance when constructing such probabilistic forecasts. Thereby we focus on data-driven approaches to infer these distributions, circumventing the need for arbitrarily chosen external parameters and subjective expert decisions. Kernel estimation has been shown to appropriately transform discrete earthquake locations into spatially continuous probability distributions. However, we show that neglecting the information from fault networks constitutes a considerable shortcoming and thus limits the skill of these current seismicity models. We present a novel earthquake rate forecast that applies the kernel-smoothing method to both past earthquake locations and slip rates on mapped crustal faults applied to Californian and European data. Our model is independent from biases caused by commonly used non-objective seismic zonations, which impose artificial borders of activity that are not expected in nature. Studying the spatial variability of the seismicity size distribution is of great importance. The b-value of the well-established empirical Gutenberg-Richter model forecasts the rates of hazard-relevant large earthquakes based on the observed rates of abundant small events. We propose a
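
    The kernel-smoothing step described above can be sketched directly: each epicentre spreads a Gaussian weight over a grid, turning discrete locations into a continuous occurrence-rate density. The coordinates, bandwidth and catalogue length below are illustrative assumptions, and the fault slip-rate component of the thesis model is not included.

        import numpy as np

        # Assumed epicentre list (x, y in km on a local grid), illustrative only.
        quakes = np.array([[10.0, 12.0], [11.0, 12.5], [40.0, 35.0], [42.0, 33.0]])
        years = 30.0        # catalogue duration [yr]
        bandwidth = 5.0     # Gaussian kernel width [km]

        xs, ys = np.meshgrid(np.arange(0, 60, 1.0), np.arange(0, 60, 1.0))
        grid = np.stack([xs.ravel(), ys.ravel()], axis=1)

        # Kernel-smoothed rate: sum each event's Gaussian weight at every cell.
        d2 = ((grid[:, None, :] - quakes[None, :, :]) ** 2).sum(axis=2)
        kernel = np.exp(-d2 / (2 * bandwidth**2)) / (2 * np.pi * bandwidth**2)
        rate = kernel.sum(axis=1) / years    # events per km^2 per year

        print("peak smoothed rate:", rate.max())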

  20. Next-generation probabilistic seismicity forecasting

    International Nuclear Information System (INIS)

    Hiemer, S.

    2014-01-01

    The development of probabilistic seismicity forecasts is one of the most important tasks of seismologists at present time. Such forecasts form the basis of probabilistic seismic hazard assessment, a widely used approach to generate ground motion exceedance maps. These hazard maps guide the development of building codes, and in the absence of the ability to deterministically predict earthquakes, good building and infrastructure planning is key to prevent catastrophes. Probabilistic seismicity forecasts are models that specify the occurrence rate of earthquakes as a function of space, time and magnitude. The models presented in this thesis are time-invariant mainshock occurrence models. Accordingly, the reliable estimation of the spatial and size distribution of seismicity are of crucial importance when constructing such probabilistic forecasts. Thereby we focus on data-driven approaches to infer these distributions, circumventing the need for arbitrarily chosen external parameters and subjective expert decisions. Kernel estimation has been shown to appropriately transform discrete earthquake locations into spatially continuous probability distributions. However, we show that neglecting the information from fault networks constitutes a considerable shortcoming and thus limits the skill of these current seismicity models. We present a novel earthquake rate forecast that applies the kernel-smoothing method to both past earthquake locations and slip rates on mapped crustal faults applied to Californian and European data. Our model is independent from biases caused by commonly used non-objective seismic zonations, which impose artificial borders of activity that are not expected in nature. Studying the spatial variability of the seismicity size distribution is of great importance. The b-value of the well-established empirical Gutenberg-Richter model forecasts the rates of hazard-relevant large earthquakes based on the observed rates of abundant small events. We propose a

  1. Analysis of truncation limit in probabilistic safety assessment

    International Nuclear Information System (INIS)

    Cepin, Marko

    2005-01-01

    A truncation limit defines the boundary between what is considered in the probabilistic safety assessment and what is neglected. The truncation limit that is the focus here is the limit on the size of the minimal cut set contribution below which cut sets are cut off. A new method was developed which defines the truncation limit in probabilistic safety assessment. The method specifies truncation limits with more stringency than existing documents dealing with truncation criteria in probabilistic safety assessment do. The results of this paper indicate that the truncation limits for more complex probabilistic safety assessments, which consist of a larger number of basic events, should be more severe than presently recommended in existing documents if more accuracy is desired. The truncation limits defined by the new method reduce the relative errors of importance measures and produce more accurate results for probabilistic safety assessment applications. The reduced relative errors of importance measures can prevent situations where the acceptability of a change to the equipment under investigation according to RG 1.174 would be shifted from the region where changes can be accepted to the region where changes cannot be accepted, if the results were calculated with a smaller truncation limit
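
    A toy illustration of the truncation trade-off: summing minimal cut set contributions under the rare-event approximation and measuring the relative error introduced by different truncation limits. The cut set probabilities are invented.

        # Hypothetical minimal cut set probabilities for a toy fault tree.
        cutset_probs = [3e-4, 8e-5, 2e-5, 7e-7, 4e-8, 9e-10]

        def top_event_prob(probs, truncation_limit):
            # Rare-event approximation: sum the retained cut set contributions.
            kept = [p for p in probs if p >= truncation_limit]
            return sum(kept), len(probs) - len(kept)

        exact, _ = top_event_prob(cutset_probs, 0.0)
        for limit in (1e-6, 1e-8, 1e-10):
            approx, dropped = top_event_prob(cutset_probs, limit)
            print(f"limit={limit:.0e}: P={approx:.3e}, dropped={dropped}, "
                  f"rel. error={(exact - approx) / exact:.1e}")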

  2. Probabilistic studies of accident sequences

    International Nuclear Information System (INIS)

    Villemeur, A.; Berger, J.P.

    1986-01-01

    For several years, Electricité de France has carried out probabilistic assessments of accident sequences for nuclear power plants. In the framework of this programme, many methods were developed. As interest in these studies increased and as adapted methods became available, Electricité de France undertook a probabilistic safety assessment of a nuclear power plant [fr]

  3. Probabilistic Design of Wave Energy Devices

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Kofoed, Jens Peter; Ferreira, C.B.

    2011-01-01

    Wave energy has a large potential for contributing significantly to the production of renewable energy. However, the wave energy sector is still not able to deliver cost-competitive and reliable solutions, although it has already demonstrated several proofs of concept. The design of wave energy...... devices is a new and expanding technical area where there is no tradition for probabilistic design; in fact very few full-scale devices have been built to date, so it can be said that no design tradition really exists in this area. For this reason it is considered to be of great importance to develop...... and advocate for a probabilistic design approach, as it is assumed (in other areas this has been demonstrated) that this leads to more economical designs compared to designs based on deterministic methods. In the present paper a general framework for probabilistic design and reliability analysis of wave energy...

  4. Integrated on-line accelerator modeling at CEBAF

    International Nuclear Information System (INIS)

    Bowling, B.A.; Shoaee, H.; Van Zeijts, J.; Witherspoon, S.; Watson, W.

    1995-01-01

    An on-line accelerator modeling facility is currently under development at CEBAF. The model server, which is integrated with the EPICS control system, provides coupled and 2nd-order matrices for the entire accelerator, and forms the foundation for automated model-based control and diagnostic applications. Four types of machine models are provided: design, golden or certified, live, and scratch or simulated. Provisions are also made for the use of multiple lattice modeling programs such as DIMAD, PARMELA, and TLIE. Design and implementation details are discussed. 2 refs., 4 figs

  5. Probabilistic costing of transmission services

    International Nuclear Information System (INIS)

    Wijayatunga, P.D.C.

    1992-01-01

    Costing of transmission services of electrical utilities is required for transactions involving the transport of energy over a power network. The calculation of these costs based on Short Run Marginal Costing (SRMC) is preferred over other methods proposed in the literature due to its economic efficiency. In the research work discussed here, the concept of probabilistic costing of use-of-system based on SRMC, which emerges as a consequence of the uncertainties in a power system, is introduced using two different approaches. The first approach, based on the Monte Carlo method, generates a large number of possible system states by simulating random variables in the system using pseudo-random number generators. A second approach to probabilistic use-of-system costing is proposed based on numerical convolution and a multi-area representation of the transmission network. (UK)

  6. Documentation design for probabilistic risk assessment

    International Nuclear Information System (INIS)

    Parkinson, W.J.; von Herrmann, J.L.

    1985-01-01

    This paper describes a framework for documentation design of probabilistic risk assessment (PRA) and is based on the EPRI document NP-3470 ''Documentation Design for Probabilistic Risk Assessment''. The goals for PRA documentation are stated. Four audiences are identified which PRA documentation must satisfy, and the documentation consistent with the needs of the various audiences are discussed, i.e., the Summary Report, the Executive Summary, the Main Report, and Appendices. The authors recommend the documentation specifications discussed herein as guides rather than rigid definitions

  7. Probabilistic tsunami hazard assessment for Point Lepreau Generating Station

    Energy Technology Data Exchange (ETDEWEB)

    Mullin, D., E-mail: dmullin@nbpower.com [New Brunswick Power Corporation, Point Lepreau Generating Station, Point Lepreau (Canada); Alcinov, T.; Roussel, P.; Lavine, A.; Arcos, M.E.M.; Hanson, K.; Youngs, R., E-mail: trajce.alcinov@amecfw.com, E-mail: patrick.roussel@amecfw.com [AMEC Foster Wheeler Environment & Infrastructure, Dartmouth, NS (Canada)

    2015-07-01

    In 2012 the Geological Survey of Canada published a preliminary probabilistic tsunami hazard assessment in Open File 7201 that presents the most up-to-date information on all potential tsunami sources in a probabilistic framework on a national level, thus providing the underlying basis for conducting site-specific tsunami hazard assessments. However, the assessment identified a poorly constrained hazard for the Atlantic coastline and recommended further evaluation. As a result, NB Power has embarked on performing a Probabilistic Tsunami Hazard Assessment (PTHA) for Point Lepreau Generating Station. This paper provides the methodology and the progress of hazard evaluation results for Point Lepreau G.S. (author)

  8. Probabilistic safety assessment model in consideration of human factors based on object-oriented bayesian networks

    International Nuclear Information System (INIS)

    Zhou Zhongbao; Zhou Jinglun; Sun Quan

    2007-01-01

    The effect of human factors on system safety is increasingly serious, yet it is often ignored in traditional probabilistic safety assessment methods. A new probabilistic safety assessment model based on object-oriented Bayesian networks is proposed in this paper. Human factors are integrated into the existing event sequence diagrams. Then the classes of the object-oriented Bayesian networks are constructed and converted to latent Bayesian networks for inference. Finally, the inference results are integrated into event sequence diagrams for probabilistic safety assessment. The new method is applied to a loss-of-coolant accident in a nuclear power plant. The results show that the model is applicable not only to real-time situation assessment, but also to situation assessment based on a certain amount of information. The modeling complexity is kept down and the new method is appropriate for large complex systems thanks to its object-oriented design. (authors)
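
    As a simplified sketch of folding a human factor into an event sequence, the fragment below quantifies a toy loss-of-coolant sequence with a stress-dependent human error probability. All probabilities and the two-state stress variable are assumptions; the paper's object-oriented Bayesian networks would treat these as network nodes with elicited or learned conditional tables.

        # Toy event-sequence quantification (all numbers assumed for illustration).
        p_initiator = 1e-3       # /year, loss of coolant
        p_hw_fail = 5e-3         # safety injection hardware fails on demand
        p_human_error = {True: 0.2, False: 0.02}   # operator error, high vs normal stress

        def core_damage_freq(high_stress):
            # Core damage if hardware fails OR the operator fails to start backup.
            p_he = p_human_error[high_stress]
            p_mitigation_fails = p_hw_fail + (1 - p_hw_fail) * p_he
            return p_initiator * p_mitigation_fails

        print("normal stress:", core_damage_freq(False))
        print("high stress:  ", core_damage_freq(True))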

  9. Real time PV manufacturing diagnostic system

    Energy Technology Data Exchange (ETDEWEB)

    Kochergin, Vladimir [MicroXact Inc., Blacksburg, VA (United States); Crawford, Michael A. [MicroXact Inc., Blacksburg, VA (United States)

    2015-09-01

    The main obstacle the photovoltaic (PV) industry is facing at present is the higher cost of PV energy compared to that of fossil energy. While solar cell efficiencies continue to make incremental gains, these improvements are so far insufficient to drive PV costs down to match those of fossil energy. Improved in-line diagnostics, however, have the potential to significantly increase productivity and reduce cost by improving the yield of the process. On this Phase I/Phase II SBIR project MicroXact developed, and demonstrated at a CIGS pilot manufacturing line, a high-throughput in-line PV manufacturing diagnostic system, which was verified to provide fast and accurate data on the spatial uniformity of thickness and composition of the thin films comprising the solar cell as it is processed reel-to-reel. In the Phase II project MicroXact developed a stand-alone system prototype and demonstrated the following technical characteristics: 1) real-time defect/composition-inconsistency detection over a 60 cm wide web at web speeds up to 3 m/minute; 2) better than 1 mm spatial resolution on a 60 cm wide web; 3) an average spectral resolution better than 20 nm, resulting in more than sufficient sensitivity to composition imperfections (copper-rich and copper-poor regions were detected). The system was verified to be high-vacuum compatible. The Phase II results completely validated both the technical and the economic feasibility of the proposed concept. MicroXact's solution is an enabling technique for in-line PV manufacturing diagnostics, increasing the productivity of PV manufacturing lines and reducing the cost of solar energy, thus reducing US dependency on foreign oil while simultaneously reducing emission of greenhouse gases.

  10. The Impact of a Line Probe Assay Based Diagnostic Algorithm on Time to Treatment Initiation and Treatment Outcomes for Multidrug Resistant TB Patients in Arkhangelsk Region, Russia.

    Science.gov (United States)

    Eliseev, Platon; Balantcev, Grigory; Nikishova, Elena; Gaida, Anastasia; Bogdanova, Elena; Enarson, Donald; Ornstein, Tara; Detjen, Anne; Dacombe, Russell; Gospodarevskaya, Elena; Phillips, Patrick P J; Mann, Gillian; Squire, Stephen Bertel; Mariandyshev, Andrei

    2016-01-01

    In the Arkhangelsk region of Northern Russia, multidrug-resistant (MDR) tuberculosis (TB) rates in new cases are amongst the highest in the world. In 2014, MDR-TB rates reached 31.7% among new cases and 56.9% among retreatment cases. The development of new diagnostic tools allows for faster detection of both TB and MDR-TB and should lead to reduced transmission by earlier initiation of anti-TB therapy. The PROVE-IT (Policy Relevant Outcomes from Validating Evidence on Impact) Russia study aimed to assess the impact of the implementation of line probe assay (LPA) as part of an LPA-based diagnostic algorithm for patients with presumptive MDR-TB, focusing on the time from the first care-seeking visit to the initiation of MDR-TB treatment, rather than diagnostic accuracy, as the primary outcome, and to assess treatment outcomes. We hypothesized that the implementation of LPA would result in faster time to treatment initiation and better treatment outcomes. A culture-based diagnostic algorithm used prior to LPA implementation was compared to an LPA-based algorithm that replaced BacTAlert and Löwenstein Jensen (LJ) for drug sensitivity testing. A total of 295 MDR-TB patients were included in the study, 163 diagnosed with the culture-based algorithm and 132 with the LPA-based algorithm. Among smear-positive patients, the implementation of the LPA-based algorithm was associated with a median decrease in time to MDR-TB treatment initiation of 50 and 66 days compared to the culture-based algorithm (BacTAlert and LJ respectively, p<0.001). Among smear-negative patients, it was associated with a median decrease in time to MDR-TB treatment initiation of 78 days when compared to the culture-based algorithm (LJ, p<0.001). The LPA-based algorithm thus led to reduced time to MDR diagnosis and earlier treatment initiation as well as better treatment outcomes for patients with MDR-TB. These findings also highlight the need for further improvements within the health system to reduce both patient and diagnostic delays to truly optimize the impact of new, rapid diagnostics.

  11. Probabilistic safety evaluation: Development of procedures with applications on components used in nuclear power plants

    International Nuclear Information System (INIS)

    Dillstroem, P.

    2000-12-01

    A probabilistic procedure has been developed by SAQ Kontroll AB to calculate two different failure probabilities, P_F: the probability of failure with the defect size given by NDT/NDE, and the probability of failure with the defect not detected by NDT/NDE. Based on the procedure, SAQ Kontroll AB has developed a computer program PROPSE (PRObabilistic Program for Safety Evaluation). Within PROPSE, two different algorithms to calculate the probability of failure are implemented: simple Monte Carlo simulation (MCS), with an error estimate on P_F, and the first-order reliability method (FORM), with sensitivity factors using the most probable point of failure in a standard normal space; using these factors, it is possible to rank the parameters within an analysis. Also implemented is the estimation of partial safety factors, given an input target failure probability and characteristic values for fracture toughness, yield strength, tensile strength and defect depth. Extensive validation has been carried out, using the probabilistic computer program STAR6 from Nuclear Electric and the deterministic program SACC from SAQ Kontroll AB. The validation showed that the results from PROPSE were correct, and that the algorithms used in STAR6 were not intended to work for a general problem, when the standard deviation is either 'small' or 'large'. Distributions to be used in a probabilistic analysis are discussed. Examples of data to be used are also given
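
    A minimal sketch of the simple Monte Carlo option described above, with the standard-error estimate on P_F. The limit state (an infinite-plate stress intensity against fracture toughness) and all distribution parameters are assumptions, not PROPSE's actual models.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 200_000

        # Assumed random inputs for a fracture limit state (illustrative moments):
        K_Ic   = rng.normal(150.0, 15.0, n)            # fracture toughness [MPa*sqrt(m)]
        a      = rng.lognormal(np.log(0.02), 0.8, n)   # defect depth [m]
        stress = rng.normal(200.0, 20.0, n)            # applied stress [MPa]

        # Failure when g < 0; K_I taken as the infinite-plate form stress*sqrt(pi*a).
        g = K_Ic - stress * np.sqrt(np.pi * a)
        pf = np.mean(g < 0.0)
        se = np.sqrt(pf * (1.0 - pf) / n)   # the "error estimate on P_F" of simple MCS
        print(f"P_F = {pf:.2e} +/- {se:.1e}")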

  12. Probabilistic safety evaluation: Development of procedures with applications on components used in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Dillstroem, P. [Det Norske Veritas AB, Stockholm (Sweden)

    2000-12-01

    A probabilistic procedure has been developed by SAQ Kontroll AB to calculate two different failure probabilities, P_F: the probability of failure with the defect size given by NDT/NDE, and the probability of failure with the defect not detected by NDT/NDE. Based on the procedure, SAQ Kontroll AB has developed a computer program PROPSE (PRObabilistic Program for Safety Evaluation). Within PROPSE, two different algorithms to calculate the probability of failure are implemented: simple Monte Carlo simulation (MCS), with an error estimate on P_F, and the first-order reliability method (FORM), with sensitivity factors using the most probable point of failure in a standard normal space; using these factors, it is possible to rank the parameters within an analysis. Also implemented is the estimation of partial safety factors, given an input target failure probability and characteristic values for fracture toughness, yield strength, tensile strength and defect depth. Extensive validation has been carried out, using the probabilistic computer program STAR6 from Nuclear Electric and the deterministic program SACC from SAQ Kontroll AB. The validation showed that the results from PROPSE were correct, and that the algorithms used in STAR6 were not intended to work for a general problem, when the standard deviation is either 'small' or 'large'. Distributions to be used in a probabilistic analysis are discussed. Examples of data to be used are also given.

  13. Non-unitary probabilistic quantum computing

    Science.gov (United States)

    Gingrich, Robert M.; Williams, Colin P.

    2004-01-01

    We present a method for designing quantum circuits that perform non-unitary quantum computations on n-qubit states probabilistically, and give analytic expressions for the success probability and fidelity.

  14. Probabilistic reasoning with graphical security models

    NARCIS (Netherlands)

    Kordy, Barbara; Pouly, Marc; Schweitzer, Patrick

    This work provides a computational framework for meaningful probabilistic evaluation of attack–defense scenarios involving dependent actions. We combine the graphical security modeling technique of attack–defense trees with probabilistic information expressed in terms of Bayesian networks. In order

  15. Process for computing geometric perturbations for probabilistic analysis

    Science.gov (United States)

    Fitch, Simeon H. K. [Charlottesville, VA; Riha, David S [San Antonio, TX; Thacker, Ben H [San Antonio, TX

    2012-04-10

    A method for computing geometric perturbations for probabilistic analysis. The probabilistic analysis is based on finite element modeling, in which uncertainties in the modeled system are represented by changes in the nominal geometry of the model, referred to as "perturbations". These changes are accomplished using displacement vectors, which are computed for each node of a region of interest and are based on mean-value coordinate calculations.
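
    The displacement-vector mechanism can be sketched compactly: each node of the region of interest carries a direction, and a random scale factor produces one perturbed realization of the nominal geometry. The patch and tolerance below are invented, and the patent's mean-value coordinate calculation of the vectors is not reproduced.

        import numpy as np

        rng = np.random.default_rng(2)

        # Nominal node coordinates of a toy 2D region of interest (assumed).
        nodes = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
        # Per-node unit displacement vectors: the direction in which one
        # geometric uncertainty (e.g. a thickness tolerance) moves each node.
        directions = np.array([[0.0, -1.0], [0.0, -1.0], [0.0, 1.0], [0.0, 1.0]])

        def perturbed_mesh(scale):
            # Perturbation = nominal geometry + random scale along the field.
            return nodes + scale * directions

        # One probabilistic realization with tolerance ~ N(0, 0.01) (assumed):
        print(perturbed_mesh(rng.normal(0.0, 0.01)))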

  16. Probabilistic evaluation of fatigue crack growth in SA 508 and SA 533 B steel

    International Nuclear Information System (INIS)

    Dufresne, J.; Rieunier, J.B.

    1982-07-01

    This paper describes the method used to select the most representative law of fatigue crack growth in view of its introduction in a probabilistic computer code. A modelling of the selected law (the Paris law) and the statistical distribution of the corresponding numerical coefficients are presented. Results of the computation are given in the case of a PWR pressure vessel with defects in the belt line weld
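
    A hedged sketch of how such a probabilistic code might propagate the statistical scatter of the Paris law coefficients to a life distribution, using the closed-form integral of da/dN = C (ΔK)^m under constant stress range. All distribution parameters, the geometry factor and the stress values are assumed, not taken from the paper.

        import numpy as np

        rng = np.random.default_rng(3)
        n = 50_000

        # Paris law da/dN = C * (dK)^m, with dK = d_sigma * sqrt(pi * a).
        C = rng.lognormal(np.log(5e-12), 0.3, n)   # scatter of C (assumed)
        m = rng.normal(3.0, 0.1, n)                # scatter of m (assumed)
        d_sigma = 100.0                            # stress range [MPa] (assumed constant)
        a0, ac = 2e-3, 20e-3                       # initial / critical depth [m] (assumed)

        # Integrating the Paris law from a0 to ac (valid for m != 2):
        # N = (ac^(1-m/2) - a0^(1-m/2)) / (C * (1-m/2) * (d_sigma*sqrt(pi))^m)
        e = 1.0 - m / 2.0
        N = (ac**e - a0**e) / (C * e * (d_sigma * np.sqrt(np.pi)) ** m)

        print("median cycles to critical depth:", np.median(N))
        print("P(failure before 1e6 cycles):", np.mean(N < 1e6))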

  17. Automatic segmentation of coronary angiograms based on fuzzy inferring and probabilistic tracking

    Directory of Open Access Journals (Sweden)

    Shoujun Zhou

    2010-08-01

    Full Text Available Abstract. Background: Segmentation of the coronary angiogram is important in computer-assisted artery motion analysis or reconstruction of 3D vascular structures from a single-plane or biplane angiographic system. Developing fully automated and accurate vessel segmentation algorithms is highly challenging, especially when extracting vascular structures with large variations in image intensities and noise, as well as with variable cross-sections or vascular lesions. Methods: This paper presents a novel tracking method for automatic segmentation of the coronary artery tree in X-ray angiographic images, based on probabilistic vessel tracking and fuzzy structure pattern inferring. The method is composed of two main steps: preprocessing and tracking. In preprocessing, multiscale Gabor filtering and Hessian matrix analysis were used to enhance and extract vessel features from the original angiographic image, leading to a vessel feature map as well as a vessel direction map. In tracking, a seed point was first automatically detected by analyzing the vessel feature map. Subsequently, two operators [a probabilistic tracking operator (PTO) and a vessel structure pattern detector (SPD)] worked together, based on the detected seed point, to extract vessel segments or branches one at a time. The local structure pattern was inferred by a multi-feature-based fuzzy inferring function employed in the SPD. The identified structure pattern, such as a crossing or bifurcation, was used to control the tracking process, for example, to keep tracking the current segment or start tracking a new one, depending on the detected pattern. Results: By appropriate integration of these advanced preprocessing and tracking steps, our tracking algorithm is able to extract both vessel axis lines and edge points, as well as measure the arterial diameters in various complicated cases. For example, it can walk across gaps along the longitudinal vessel direction, manage varying vessel
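
    The Hessian-analysis step of the preprocessing stage can be illustrated with a Frangi-style line filter: the eigenvalues of the Gaussian-scale Hessian favour tubular (vessel-like) structures. This sketch covers only a vessel feature map, not the Gabor filtering, tracking or fuzzy inference of the paper, and its parameters are assumed for unit-range images.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def vesselness(img, sigma=2.0, beta=0.5, c=0.05):
            # Gaussian-scale Hessian components (axis 0 = y, axis 1 = x).
            Hxx = gaussian_filter(img, sigma, order=(0, 2))
            Hyy = gaussian_filter(img, sigma, order=(2, 0))
            Hxy = gaussian_filter(img, sigma, order=(1, 1))
            # Eigenvalues of the symmetric 2x2 Hessian, ordered |l1| <= |l2|.
            tmp = np.sqrt((Hxx - Hyy) ** 2 + 4.0 * Hxy**2)
            ev1 = 0.5 * (Hxx + Hyy + tmp)
            ev2 = 0.5 * (Hxx + Hyy - tmp)
            swap = np.abs(ev1) > np.abs(ev2)
            l1 = np.where(swap, ev2, ev1)
            l2 = np.where(swap, ev1, ev2)
            # Line measure: low anisotropy ratio plus enough second-order structure.
            rb2 = (l1 / np.where(l2 == 0.0, 1e-12, l2)) ** 2
            s2 = l1**2 + l2**2
            v = np.exp(-rb2 / (2.0 * beta**2)) * (1.0 - np.exp(-s2 / (2.0 * c**2)))
            return np.where(l2 > 0.0, v, 0.0)   # keep dark-on-bright ridges (X-ray vessels)

        # Synthetic check: a dark vertical line on a flat background lights up.
        img = np.zeros((64, 64))
        img[:, 30:33] = -1.0
        print(vesselness(img).max())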

  18. Transitive probabilistic CLIR models.

    NARCIS (Netherlands)

    Kraaij, W.; de Jong, Franciska M.G.

    2004-01-01

    Transitive translation could be a useful technique to enlarge the number of supported language pairs for a cross-language information retrieval (CLIR) system in a cost-effective manner. The paper describes several setups for transitive translation based on probabilistic translation models. The
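
    The pivot construction behind transitive translation is a marginalisation over the intermediate language, P(t|s) = sum_p P(t|p) * P(p|s). The toy dictionaries below (French source, English pivot, German target) are invented for illustration.

        # Probabilistic translation dictionaries (all entries assumed).
        p_pivot_given_src = {"chat": {"cat": 0.9, "chat": 0.1}}
        p_tgt_given_pivot = {"cat": {"Katze": 0.8, "Kater": 0.2},
                             "chat": {"Chat": 1.0}}

        def transitive(source_word):
            # Marginalise over the pivot-language translations.
            out = {}
            for pivot, p_ps in p_pivot_given_src[source_word].items():
                for target, p_tp in p_tgt_given_pivot.get(pivot, {}).items():
                    out[target] = out.get(target, 0.0) + p_tp * p_ps
            return out

        print(transitive("chat"))   # {'Katze': 0.72, 'Kater': 0.18, 'Chat': 0.1}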

  19. Metastatic non-small-cell lung cancer: consensus on pathology and molecular tests, first-line, second-line, and third-line therapy: 1st ESMO Consensus Conference in Lung Cancer; Lugano 2010

    DEFF Research Database (Denmark)

    Felip, E; Gridelli, C; Baas, P

    2011-01-01

    The 1st ESMO Consensus Conference on lung cancer was held in Lugano, Switzerland on 21 and 22 May 2010 with the participation of a multidisciplinary panel of leading professionals in pathology and molecular diagnostics, medical oncology, surgical oncology and radiation oncology. Before...... the conference, the expert panel prepared clinically relevant questions concerning five areas: early and locally advanced non-small-cell lung cancer (NSCLC), first-line metastatic NSCLC, second-/third-line NSCLC, NSCLC pathology and molecular testing, and small-cell lung cancer, to be addressed through discussion...... at the Consensus Conference. All relevant scientific literature for each question was reviewed in advance. During the Consensus Conference, the panel developed recommendations for each specific question. The consensus agreement on three of these areas: NSCLC pathology and molecular testing, the treatment of first-line...

  20. GUI program to compute probabilistic seismic hazard analysis

    International Nuclear Information System (INIS)

    Shin, Jin Soo; Chi, H. C.; Cho, J. C.; Park, J. H.; Kim, K. G.; Im, I. S.

    2005-12-01

    The first stage of development of a program to compute probabilistic seismic hazard has been completed, based on a graphical user interface (GUI). The main program consists of three parts: the data input processes, the probabilistic seismic hazard analysis, and the result output processes. The first part has been developed; the others are being developed in the current term. The probabilistic seismic hazard analysis needs various input data representing attenuation formulae, a seismic zoning map, and an earthquake event catalog. The input procedure of previous programs, based on a text interface, takes much time to prepare the data, and the data cannot be checked directly on screen to prevent erroneous input. The new program simplifies the input process and enables the data to be checked graphically in order to minimize artificial error as far as possible

  1. PRECIS -- A probabilistic risk assessment system

    International Nuclear Information System (INIS)

    Peterson, D.M.; Knowlton, R.G. Jr.

    1996-01-01

    A series of computer tools has been developed to conduct the exposure assessment and risk characterization phases of human health risk assessments within a probabilistic framework. The tools are collectively referred to as the Probabilistic Risk Evaluation and Characterization Investigation System (PRECIS). With this system, a risk assessor can calculate the doses and risks associated with multiple environmental and exposure pathways, for both chemicals and radioactive contaminants. Exposure assessment models in the system account for transport of contaminants to receptor points from a source zone originating in unsaturated soils above the water table. In addition to performing calculations of dose and risk based on initial concentrations, PRECIS can also be used in an inverse manner to compute soil concentrations in the source area that must not be exceeded if prescribed limits on dose or risk are to be met. Such soil contaminant levels, referred to as soil guidelines, are computed for both single contaminants and chemical mixtures and can be used as action levels or cleanup levels. Probabilistic estimates of risk, dose and soil guidelines are derived using Monte Carlo techniques
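
    A minimal sketch of the forward dose calculation and its inversion into a soil guideline for a single ingestion pathway. The pathway structure, distributions and dose limit are assumptions; PRECIS covers many more pathways, chemical mixtures, and transport from an unsaturated source zone.

        import numpy as np

        rng = np.random.default_rng(4)
        n = 100_000

        # One ingestion pathway: dose [mg/kg-day] = C_soil [mg/kg] * IR / BW.
        IR = rng.lognormal(np.log(1e-4), 0.5, n)   # soil ingestion rate [kg/day] (assumed)
        BW = rng.normal(70.0, 10.0, n)             # body weight [kg] (assumed)
        dose_per_conc = IR / BW                    # dose per unit soil concentration

        # Forward: probabilistic dose for a given source concentration.
        c_soil = 50.0                              # mg/kg
        print("95th-percentile dose:", np.percentile(c_soil * dose_per_conc, 95))

        # Inverse: soil guideline such that the 95th-percentile dose meets the limit.
        dose_limit = 1e-4                          # mg/kg-day (assumed)
        print("soil guideline [mg/kg]:", dose_limit / np.percentile(dose_per_conc, 95))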

  2. Japanese round robin analysis for probabilistic fracture mechanics

    International Nuclear Information System (INIS)

    Yagawa, G.; Yoshimura, S.; Handa, N.

    1991-01-01

    Recently, attention has focused on probabilistic fracture mechanics, a branch of fracture mechanics that uses probability theory as a rational means to assess the strength of components and structures. In particular, probabilistic fracture mechanics is recognized as a powerful means for the quantitative investigation of the significance of factors and the rational evaluation of life in problems involving a number of uncertainties, such as degradation of material strength and the accuracy and frequency of inspection. Comparisons with reference experiments are generally employed to assure analytical accuracy. However, the accuracy and reliability of analytical methods in probabilistic fracture mechanics are hardly verifiable by experiments. Therefore, it is strongly needed to verify probabilistic fracture mechanics through round robin analysis. This paper describes results from the round robin analysis of a flat plate with semi-elliptical surface cracks, conducted by the PFM Working Group of the LE Subcommittee of the Japan Welding Society under contract to the Japan Atomic Energy Research Institute, with participation by Tokyo University, Yokohama National University, the Power Reactor and Nuclear Fuel Corporation, Tokyo Electric Power Co., the Central Research Institute of Electric Power Industry, Toshiba Corporation, Kawasaki Heavy Industry Co. and Mitsubishi Heavy Industry Co. (author)

  3. Implications of probabilistic risk assessment

    International Nuclear Information System (INIS)

    Cullingford, M.C.; Shah, S.M.; Gittus, J.H.

    1987-01-01

    Probabilistic risk assessment (PRA) is an analytical process that quantifies the likelihoods, consequences and associated uncertainties of the potential outcomes of postulated events. Starting with planned or normal operation, probabilistic risk assessment covers a wide range of potential accidents and considers the whole plant and the interactions of systems and human actions. Probabilistic risk assessment can be applied in safety decisions in design, licensing and operation of industrial facilities, particularly nuclear power plants. The proceedings include a review of PRA procedures, methods and technical issues in treating uncertainties, operating and licensing issues and future trends. Risk assessments for specific reactor types or components and specific risks (e.g. an aircraft crashing onto a reactor) are used to illustrate the points raised. All 52 articles are indexed separately. (U.K.)

  4. Branching bisimulation congruence for probabilistic systems

    NARCIS (Netherlands)

    Trcka, N.; Georgievska, S.; Aldini, A.; Baier, C.

    2008-01-01

    The notion of branching bisimulation for the alternating model of probabilistic systems is not a congruence with respect to parallel composition. In this paper we first define another branching bisimulation in the more general model allowing consecutive probabilistic transitions, and we prove that

  5. Quantum logic networks for probabilistic teleportation

    Institute of Scientific and Technical Information of China (English)

    刘金明; 张永生; et al.

    2003-01-01

    By means of primitive operations consisting of single-qubit gates, two-qubit controlled-NOT gates, von Neumann measurements and classically controlled operations, we construct efficient quantum logic networks for implementing probabilistic teleportation of a single qubit, a two-particle entangled state, and an N-particle entanglement. Based on the quantum networks, we show that after the partially entangled states are concentrated into maximal entanglement, the above three kinds of probabilistic teleportation are the same as standard teleportation using the corresponding maximally entangled states as the quantum channels.

  6. Nuclear power plant diagnostic system

    International Nuclear Information System (INIS)

    Prokop, K.; Volavy, J.

    1982-01-01

    Basic information is presented on diagnostic systems used at nuclear power plants with PWR reactors. They include systems used at the Novovoronezh nuclear power plant in the USSR, at the Nord power plant in the GDR, the system developed at the Hungarian VEIKI institute, the system used at the V-1 nuclear power plant at Jaslovske Bohunice in Czechoslovakia and systems of the Rockwell International company used in US nuclear power plants. These diagnostic systems are basically founded on monitoring vibrations and noise, loose parts, pressure pulsations, neutron noise, coolant leaks and acoustic emissions. The Rockwell International system represents a complex unit whose advantage is the on-line evaluation of signals which gives certain instructions for the given situation directly to the operator. The other described systems process signals using similar methods. Digitized signals only serve off-line computer analyses. (Z.M.)

  7. MAGNETIC DIAGNOSTICS OF THE SOLAR CHROMOSPHERE WITH THE Mg II h–k LINES

    Energy Technology Data Exchange (ETDEWEB)

    Del Pino Alemán, T.; Casini, R. [High Altitude Observatory, National Center for Atmospheric Research, P.O. Box 3000, Boulder, CO 80307-3000 (United States); Manso Sainz, R. [Max-Planck-Institut für Sonnensystemforschung, Justus-von-Liebig-Weg 3, D-37077 Göttingen (Germany)

    2016-10-20

    We investigated the formation of the Mg ii h–k doublet in a weakly magnetized atmosphere (20–100 G) using a newly developed numerical code for polarized radiative transfer in a plane-parallel geometry, which implements a recent formulation of partially coherent scattering by polarized multi-term atoms in arbitrary magnetic-field regimes. Our results confirm the importance of partial redistribution effects in the formation of the Mg ii h and k lines, as pointed out by previous work in the non-magnetic case. We show that the presence of a magnetic field can produce measurable modifications of the broadband linear polarization even for relatively small field strengths (∼10 G), while the circular polarization remains well represented by the classical magnetograph formula. Both these results open an important new window for the weak-field diagnostics of the upper solar atmosphere.

  8. ELT-MELAS analyzer and its on-line programs

    International Nuclear Information System (INIS)

    Anikeev, V.B.; Berezhnoj, V.A.; Glupova

    1976-01-01

    The ELT-MELAS device, constructed for automatic analysis of pictures from big bubble chambers, is described. It is controlled by a medium-size ICL-1903A computer and has two measuring modes: analysis of the ''agreement'' signal and digitization of slice-scans. Main features of the hardware and of the on-line controlling and diagnostic software are presented. Test results of the MELAS complex as well as preliminary results of the slice-scan measurements of pictures from the 15′ chamber are given

  9. Invariant and semi-invariant probabilistic normed spaces

    Energy Technology Data Exchange (ETDEWEB)

    Ghaemi, M.B. [School of Mathematics Iran, University of Science and Technology, Narmak, Tehran (Iran, Islamic Republic of)], E-mail: mghaemi@iust.ac.ir; Lafuerza-Guillen, B. [Departamento de Estadistica y Matematica Aplicada, Universidad de Almeria, Almeria E-04120 (Spain)], E-mail: blafuerz@ual.es; Saiedinezhad, S. [School of Mathematics Iran, University of Science and Technology, Narmak, Tehran (Iran, Islamic Republic of)], E-mail: ssaiedinezhad@yahoo.com

    2009-10-15

    Probabilistic metric spaces were introduced by Karl Menger. Alsina, Schweizer and Sklar gave a general definition of probabilistic normed space based on the definition of Menger. We introduce the concept of semi-invariance among PN spaces. In this paper we find a sufficient condition for some PN spaces to be semi-invariant. We show that PN spaces are normal spaces, and prove Urysohn's lemma and the Tietze extension theorem for them.

  10. Solving stochastic multiobjective vehicle routing problem using probabilistic metaheuristic

    Directory of Open Access Journals (Sweden)

    Gannouni Asmae

    2017-01-01

    closed form expression. This novel approach is based on combinatorial probability and can be incorporated in a multiobjective evolutionary algorithm. (ii) Provide probabilistic approaches to elitism and diversification in multiobjective evolutionary algorithms. Finally, the behavior of the resulting Probabilistic Multi-objective Evolutionary Algorithms (PrMOEAs) is empirically investigated on the multi-objective stochastic VRP problem.

  11. A Probabilistic Analysis of the Sacco and Vanzetti Evidence

    CERN Document Server

    Kadane, Joseph B

    2011-01-01

    A Probabilistic Analysis of the Sacco and Vanzetti Evidence is a Bayesian analysis of the trial and post-trial evidence in the Sacco and Vanzetti case, based on subjectively determined probabilities and assumed relationships among evidential events. It applies the ideas of charting evidence and probabilistic assessment to this case, which is perhaps the ranking cause celebre in all of American legal history. Modern computation methods applied to inference networks are used to show how the inferential force of evidence in a complicated case can be graded. The authors employ probabilistic assess

  12. Probabilistic Modeling and Visualization for Bankruptcy Prediction

    DEFF Research Database (Denmark)

    Antunes, Francisco; Ribeiro, Bernardete; Pereira, Francisco Camara

    2017-01-01

    In accounting and finance domains, bankruptcy prediction is of great utility for all of the economic stakeholders. The challenge of accurately predicting business failure, especially under scenarios of financial crisis, is known to be complicated. Although there have been many successful...... studies on bankruptcy detection, probabilistic approaches have seldom been carried out. In this paper we assume a probabilistic point-of-view by applying Gaussian Processes (GP) in the context of bankruptcy prediction, comparing it against the Support Vector Machines (SVM) and the Logistic Regression (LR......). Using real-world bankruptcy data, an in-depth analysis is conducted showing that, in addition to a probabilistic interpretation, the GP can effectively improve the bankruptcy prediction performance with high accuracy when compared to the other approaches. We additionally generate a complete graphical...
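
    A minimal sketch of the kind of comparison described above, using scikit-learn's GaussianProcessClassifier and LogisticRegression on synthetic data; the study used real-world bankruptcy data, so the features, sample sizes and outputs here are purely illustrative.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for financial-ratio features and bankruptcy labels
X, y = make_classification(n_samples=400, n_features=6, n_informative=4, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

gp = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0), random_state=0).fit(X_tr, y_tr)
lr = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

for name, model in [("GP", gp), ("LR", lr)]:
    proba = model.predict_proba(X_te)[:, 1]  # probabilistic output, not just a label
    acc = accuracy_score(y_te, proba > 0.5)
    print(f"{name}: accuracy={acc:.3f}, mean predicted failure probability={proba.mean():.3f}")
```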

  13. Probabilistic approach to mechanisms

    CERN Document Server

    Sandler, BZ

    1984-01-01

    This book discusses the application of probabilistics to the investigation of mechanical systems. The book shows, for example, how random function theory can be applied directly to the investigation of random processes in the deflection of cam profiles, pitch or gear teeth, pressure in pipes, etc. The author also deals with some other technical applications of probabilistic theory, including, amongst others, those relating to pneumatic and hydraulic mechanisms and roller bearings. Many of the aspects are illustrated by examples of applications of the techniques under discussion.

  14. Information theoretic quantification of diagnostic uncertainty.

    Science.gov (United States)

    Westover, M Brandon; Eiseman, Nathaniel A; Cash, Sydney S; Bianchi, Matt T

    2012-01-01

    Diagnostic test interpretation remains a challenge in clinical practice. Most physicians receive training in the use of Bayes' rule, which specifies how the sensitivity and specificity of a test for a given disease combine with the pre-test probability to quantify the change in disease probability incurred by a new test result. However, multiple studies demonstrate physicians' deficiencies in probabilistic reasoning, especially with unexpected test results. Information theory, a branch of probability theory dealing explicitly with the quantification of uncertainty, has been proposed as an alternative framework for diagnostic test interpretation, but is even less familiar to physicians. We have previously addressed one key challenge in the practical application of Bayes theorem: the handling of uncertainty in the critical first step of estimating the pre-test probability of disease. This essay aims to present the essential concepts of information theory to physicians in an accessible manner, and to extend previous work regarding uncertainty in pre-test probability estimation by placing this type of uncertainty within a principled information theoretic framework. We address several obstacles hindering physicians' application of information theoretic concepts to diagnostic test interpretation. These include issues of terminology (mathematical meanings of certain information theoretic terms differ from clinical or common parlance) as well as the underlying mathematical assumptions. Finally, we illustrate how, in information theoretic terms, one can understand the effect on diagnostic uncertainty of considering ranges instead of simple point estimates of pre-test probability.
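
    A short worked example of the quantities discussed above, assuming illustrative sensitivity, specificity and pre-test probability values; note that, as the essay's framework makes explicit, a test result can increase as well as decrease diagnostic uncertainty in the entropy sense.

```python
import math

def post_test_probability(pretest, sens, spec, positive=True):
    """Bayes' rule for a binary diagnostic test result."""
    if positive:
        num, den = sens * pretest, sens * pretest + (1 - spec) * (1 - pretest)
    else:
        num, den = (1 - sens) * pretest, (1 - sens) * pretest + spec * (1 - pretest)
    return num / den

def entropy_bits(p):
    """Diagnostic uncertainty (binary entropy) of a disease probability, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

pre, sens, spec = 0.10, 0.90, 0.80
post = post_test_probability(pre, sens, spec, positive=True)
print(f"post-test probability: {post:.3f}")               # ~0.333
print(f"uncertainty before: {entropy_bits(pre):.3f} bits")
print(f"uncertainty after:  {entropy_bits(post):.3f} bits")
# Here a positive result moves the probability toward 0.5, so entropy rises:
# the result is informative, yet the diagnosis is *less* certain than before.
```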

  15. A new interferometry-based electron density fluctuation diagnostic on Alcator C-Mod

    Science.gov (United States)

    Kasten, C. P.; Irby, J. H.; Murray, R.; White, A. E.; Pace, D. C.

    2012-10-01

    The two-color interferometry diagnostic on the Alcator C-Mod tokamak has been upgraded to measure fluctuations in the electron density and density gradient for turbulence and transport studies. Diagnostic features and capabilities are described. In differential mode, fast phase demodulation electronics detect the relative phase change between ten adjacent, radially-separated (ΔR = 1.2 cm, adjustable), vertical-viewing chords, which allows for measurement of the line-integrated electron density gradient. The system can be configured to detect the absolute phase shift of each chord by comparison to a local oscillator, measuring the line-integrated density. Each chord is sensitive to density fluctuations with kR < 20.3 cm-1 and is digitized at up to 10 MS/s, resolving aspects of ion temperature gradient-driven modes and other long-wavelength turbulence. Data from C-Mod discharges is presented, including observations of the quasi-coherent mode in enhanced D-alpha H-mode plasmas and the weakly coherent mode in I-mode.

  16. Probabilistic machine learning and artificial intelligence.

    Science.gov (United States)

    Ghahramani, Zoubin

    2015-05-28

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  18. Probabilistic, meso-scale flood loss modelling

    Science.gov (United States)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2016-04-01

    Flood risk analyses are an important basis for decisions on flood risk management and adaptation. However, such analyses are associated with significant uncertainty, all the more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention in recent years, they are still not standard practice for flood risk assessments, let alone for flood loss modelling. The state of the art in flood loss modelling is still the use of simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood loss models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we demonstrate and evaluate the upscaling of the approach to the meso-scale, namely on the basis of land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany (Botto et al. submitted). The application of bagging decision tree based loss models provides a probability distribution of estimated loss per municipality. Validation is undertaken, on the one hand, via a comparison with eight deterministic loss models including stage-damage functions as well as multi-variate models; on the other hand, the results are compared with official loss data provided by the Saxon Relief Bank (SAB). The results show that the uncertainties of loss estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation approach is that it inherently provides quantitative information about the uncertainty of the prediction. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Botto A, Kreibich H, Merz B, Schröter K (submitted) Probabilistic, multi-variable flood loss modelling on the meso-scale with BT-FLEMO. Risk Analysis.
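
    A minimal sketch of how bagged decision trees yield a loss distribution rather than a point estimate, using scikit-learn's BaggingRegressor on synthetic stand-in data; it is not BT-FLEMO, and the predictors and coefficients are invented for illustration.

```python
import numpy as np
from sklearn.ensemble import BaggingRegressor

rng = np.random.default_rng(1)
n = 500
# Invented predictors: water depth (m), building area (m2), precaution score (0-3)
X = np.column_stack([rng.uniform(0, 3, n),
                     rng.uniform(50, 400, n),
                     rng.integers(0, 4, n)])
# Synthetic relative loss in [0, 1] standing in for surveyed damage data
y = np.clip(0.2 * X[:, 0] + 0.0003 * X[:, 1] - 0.05 * X[:, 2]
            + rng.normal(0, 0.05, n), 0, 1)

# Bagged regression trees (the default base estimator is a decision tree)
model = BaggingRegressor(n_estimators=200, random_state=0).fit(X, y)

x_new = np.array([[1.5, 120.0, 1.0]])
# The spread of per-tree predictions approximates a predictive loss distribution
tree_preds = np.array([tree.predict(x_new)[0] for tree in model.estimators_])
lo, med, hi = np.percentile(tree_preds, [5, 50, 95])
print(f"relative loss: median={med:.2f}, 90% interval=[{lo:.2f}, {hi:.2f}]")
```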

  19. Probabilistic linguistics

    NARCIS (Netherlands)

    Bod, R.; Heine, B.; Narrog, H.

    2010-01-01

    Probabilistic linguistics takes all linguistic evidence as positive evidence and lets statistics decide. It allows for accurate modelling of gradient phenomena in production and perception, and suggests that rule-like behaviour is no more than a side effect of maximizing probability. This chapter

  20. Games people play: How video games improve probabilistic learning.

    Science.gov (United States)

    Schenk, Sabrina; Lech, Robert K; Suchan, Boris

    2017-09-29

    Recent research suggests that video game playing is associated with many cognitive benefits. However, little is known about the neural mechanisms mediating such effects, especially with regard to probabilistic categorization learning, which is a widely unexplored area in gaming research. Therefore, the present study aimed to investigate the neural correlates of probabilistic classification learning in video gamers in comparison to non-gamers. Subjects were scanned in a 3T magnetic resonance imaging (MRI) scanner while performing a modified version of the weather prediction task. Behavioral data yielded evidence for better categorization performance in video gamers, particularly under conditions characterized by stronger uncertainty. Furthermore, a post-experimental questionnaire showed that video gamers had acquired higher declarative knowledge about the card combinations and the related weather outcomes. Functional imaging data revealed stronger activation clusters for video gamers in the hippocampus, the precuneus, the cingulate gyrus and the middle temporal gyrus as well as in occipital visual areas and in areas related to attentional processes. All these areas are connected with each other and represent critical nodes for semantic memory, visual imagery and cognitive control. Apart from this, and in line with previous studies, both groups showed activation in brain areas that are related to attention and executive functions as well as in the basal ganglia and in memory-associated regions of the medial temporal lobe. These results suggest that playing video games might enhance the usage of declarative knowledge as well as hippocampal involvement, and might enhance overall learning performance during probabilistic learning. In contrast to non-gamers, video gamers showed better categorization performance independently of the uncertainty of the condition. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. The probabilistic approach in the licensing process and the development of probabilistic risk assessment methodology in Japan

    International Nuclear Information System (INIS)

    Togo, Y.; Sato, K.

    1981-01-01

    The probabilistic approach has long seemed to be one of the most comprehensive methods for evaluating the safety of nuclear plants. So far, most of the guidelines and criteria for licensing have been based on the deterministic concept. However, there have been a few examples in which the probabilistic approach was directly applied, such as the evaluation of aircraft crashes and turbine missiles, and one may find other examples of such applications. A much more important role is now to be played by this concept in implementing the 52 recommendations drawn from the lessons learned from the TMI accident. To develop the probabilistic risk assessment methodology most relevant to Japanese conditions, a five-year programme plan has been adopted, to be conducted by the Japan Atomic Energy Research Institute from fiscal 1980. Various problems have been identified and are to be solved through this programme plan. The current status of developments is described together with activities outside the government programme. (author)

  2. Relative Gains, Losses, and Reference Points in Probabilistic Choice in Rats

    Science.gov (United States)

    Marshall, Andrew T.; Kirkpatrick, Kimberly

    2015-01-01

    Theoretical reference points have been proposed to differentiate probabilistic gains from probabilistic losses in humans, but such a phenomenon in non-human animals has yet to be thoroughly elucidated. Three experiments evaluated the effect of reward magnitude on probabilistic choice in rats, seeking to determine reference point use by examining the effect of previous outcome magnitude(s) on subsequent choice behavior. Rats were trained to choose between an outcome that always delivered reward (low-uncertainty choice) and one that probabilistically delivered reward (high-uncertainty). The probability of high-uncertainty outcome receipt and the magnitudes of low-uncertainty and high-uncertainty outcomes were manipulated within and between experiments. Both the low- and high-uncertainty outcomes involved variable reward magnitudes, so that either a smaller or larger magnitude was probabilistically delivered, as well as reward omission following high-uncertainty choices. In Experiments 1 and 2, the between groups factor was the magnitude of the high-uncertainty-smaller (H-S) and high-uncertainty-larger (H-L) outcome, respectively. The H-S magnitude manipulation differentiated the groups, while the H-L magnitude manipulation did not. Experiment 3 showed that manipulating the probability of differential losses as well as the expected value of the low-uncertainty choice produced systematic effects on choice behavior. The results suggest that the reference point for probabilistic gains and losses was the expected value of the low-uncertainty choice. Current theories of probabilistic choice behavior have difficulty accounting for the present results, so an integrated theoretical framework is proposed. Overall, the present results have implications for understanding individual differences and corresponding underlying mechanisms of probabilistic choice behavior. PMID:25658448

  3. Relative gains, losses, and reference points in probabilistic choice in rats.

    Directory of Open Access Journals (Sweden)

    Andrew T Marshall

    Full Text Available Theoretical reference points have been proposed to differentiate probabilistic gains from probabilistic losses in humans, but such a phenomenon in non-human animals has yet to be thoroughly elucidated. Three experiments evaluated the effect of reward magnitude on probabilistic choice in rats, seeking to determine reference point use by examining the effect of previous outcome magnitude(s) on subsequent choice behavior. Rats were trained to choose between an outcome that always delivered reward (low-uncertainty choice) and one that probabilistically delivered reward (high-uncertainty). The probability of high-uncertainty outcome receipt and the magnitudes of low-uncertainty and high-uncertainty outcomes were manipulated within and between experiments. Both the low- and high-uncertainty outcomes involved variable reward magnitudes, so that either a smaller or larger magnitude was probabilistically delivered, as well as reward omission following high-uncertainty choices. In Experiments 1 and 2, the between-groups factor was the magnitude of the high-uncertainty-smaller (H-S) and high-uncertainty-larger (H-L) outcome, respectively. The H-S magnitude manipulation differentiated the groups, while the H-L magnitude manipulation did not. Experiment 3 showed that manipulating the probability of differential losses as well as the expected value of the low-uncertainty choice produced systematic effects on choice behavior. The results suggest that the reference point for probabilistic gains and losses was the expected value of the low-uncertainty choice. Current theories of probabilistic choice behavior have difficulty accounting for the present results, so an integrated theoretical framework is proposed. Overall, the present results have implications for understanding individual differences and corresponding underlying mechanisms of probabilistic choice behavior.

  4. Update on the status of the ITER ECE diagnostic design

    Directory of Open Access Journals (Sweden)

    Taylor G.

    2017-01-01

    Full Text Available Considerable progress has been made on the design of the ITER electron cyclotron emission (ECE) diagnostic over the past two years. Radial and oblique views are still included in the design in order to measure distortions in the electron momentum distribution, but the oblique view has been redirected to reduce stray millimeter radiation from the electron cyclotron heating system. A major challenge has been designing the 1000 K calibration sources and remotely activated mirrors located in the ECE diagnostic shield module (DSM) in equatorial port plug #09. These critical systems are being modeled and prototypes are being developed. Providing adequate neutron shielding in the DSM while allowing sufficient space for optical components is also a significant challenge. Four 45-meter long low-loss transmission lines transport the 70–1000 GHz ECE from the DSM to the ECE instrumentation room. Prototype transmission lines are being tested, as are the polarization splitter modules that separate O-mode and X-mode polarized ECE. A highly integrated prototype 200–300 GHz radiometer is being tested on the DIII-D tokamak in the USA. Design activities also include integration of ECE signals into the ITER plasma control system and determining the hardware and software architecture needed to control and calibrate the ECE instruments.

  5. Why do probabilistic finite element analysis?

    CERN Document Server

    Thacker, Ben H

    2008-01-01

    The intention of this book is to provide an introduction to performing probabilistic finite element analysis. As a short guideline, the objective is to inform the reader of the use, benefits and issues associated with performing probabilistic finite element analysis without excessive theory or mathematical detail.

  6. Error Discounting in Probabilistic Category Learning

    Science.gov (United States)

    Craig, Stewart; Lewandowsky, Stephan; Little, Daniel R.

    2011-01-01

    The assumption in some current theories of probabilistic categorization is that people gradually attenuate their learning in response to unavoidable error. However, existing evidence for this error discounting is sparse and open to alternative interpretations. We report 2 probabilistic-categorization experiments in which we investigated error…

  7. Comprehensive diagnostic set for intense lithium ion hohlraum experiments on PBFA II

    International Nuclear Information System (INIS)

    Leeper, R.J.; Bailey, J.E.; Carlson, A.L.

    1994-01-01

    A review of the comprehensive diagnostic package developed at Sandia National Laboratories for intense lithium ion hohlraum target experiments on PBFA II will be presented. This package contains an extensive suite of x-ray spectral and imaging diagnostics that enable measurements of target radiation smoothing, hydro-motion, and temperature. The x-ray diagnostics include time-integrated and time-resolved pinhole cameras, energy-resolved 1-D streaked imaging diagnostics, time-integrated and time-resolved grazing incidence spectrographs, a transmission grating spectrograph, an elliptical crystal spectrograph, a bolometer array, an eleven element x-ray diode (XRD) array, and an eleven element PIN diode detector array. A hohlraum temperature measurement technique under development is a shock breakout diagnostic that measures the radiation pressure at the hohlraum wall. The incident Li beam symmetry and an estimate of incident Li beam power density are measured from ion beam-induced characteristic x-ray line and neutron emissions. An attempt to measure the Li beam intensity directly on target used Rutherford scattered ions into an ion movie camera and a magnetic spectrograph. The philosophy used in designing all the diagnostics in the set has emphasized redundant and independent measurements of fundamental physical quantities relevant to the performance of the target. Details of each diagnostic, its integration, data reduction procedures, and recent PBFA-II data will be discussed

  8. A novelty detection diagnostic methodology for gearboxes operating under fluctuating operating conditions using probabilistic techniques

    Science.gov (United States)

    Schmidt, S.; Heyns, P. S.; de Villiers, J. P.

    2018-02-01

    In this paper, a fault diagnostic methodology is developed which is able to detect, locate and trend gear faults under fluctuating operating conditions when only vibration data from a single transducer, measured on a healthy gearbox are available. A two-phase feature extraction and modelling process is proposed to infer the operating condition and based on the operating condition, to detect changes in the machine condition. Information from optimised machine and operating condition hidden Markov models are statistically combined to generate a discrepancy signal which is post-processed to infer the condition of the gearbox. The discrepancy signal is processed and combined with statistical methods for automatic fault detection and localisation and to perform fault trending over time. The proposed methodology is validated on experimental data and a tacholess order tracking methodology is used to enhance the cost-effectiveness of the diagnostic methodology.
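
    A minimal sketch of the discrepancy-signal idea, assuming the third-party hmmlearn package and Gaussian synthetic data in place of real vibration features; the paper's actual methodology combines machine- and operating-condition models and is considerably richer.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM  # assumed third-party dependency

rng = np.random.default_rng(0)
# Gaussian stand-ins for condition-indicator sequences derived from vibration data
healthy = rng.normal(0.0, 1.0, size=(2000, 1))
faulty = rng.normal(0.8, 1.3, size=(500, 1))

# Train a machine-condition model on healthy data only
model = GaussianHMM(n_components=3, covariance_type="diag", random_state=0)
model.fit(healthy)

def discrepancy(model, window):
    """Negative per-sample log-likelihood: grows as behaviour departs from healthy."""
    return -model.score(window) / len(window)

for name, data in [("healthy", healthy[-200:]), ("faulty", faulty[-200:])]:
    print(f"{name} window: discrepancy = {discrepancy(model, data):.3f}")
```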

  9. Probabilistic considerations on the effects of random soil properties on the stability of ground structures of nuclear power plants

    International Nuclear Information System (INIS)

    Ootori, Yasuki; Ishikawa, Hiroyuki; Takeda, Tomoyoshi

    2004-01-01

    In JEAG4601-1987 (Japan Electric Association Guide for earthquake resistance design), either a conventional deterministic method or a probabilistic method is used for evaluating the stability of ground foundations and surrounding slopes at nuclear power plants. The deterministic method, in which soil properties of 'mean ± coefficient × standard deviation' are adopted in the calculations, has generally been used at the design stage to date. The probabilistic method, in which the soil properties are assumed to follow probability distributions, is described as a future method. The deterministic method facilitates the evaluation; however, it is necessary to clarify the relationship between the deterministic and probabilistic methods. In order to investigate this relationship, a simple model that can take into account the dynamic effect of structures, and a simplified method for taking spatial randomness into account, are proposed in this study. As a result, it is found that the shear strength of soil is the most important factor for the stability of grounds and slopes, and that the probability of falling below the safety factor evaluated with soil properties of 'mean − 1.0 × standard deviation' by the deterministic method is much lower. (author)

  10. State of technology, system and solution supporting on-line maintenance - company's activities and products

    International Nuclear Information System (INIS)

    Nishitani, Junichi; Shimizu, Shunichi; Higasa, Hisakazu

    2010-01-01

    The new inspection system based on operators' maintenance and monitoring programs at nuclear power plants was introduced in Japan more than one year ago, and the recommended on-line maintenance (maintenance during operation) will be carried out to increase the capacity factor while keeping plant operation safe and reliable. In this feature article, nine experts describe the state of technology, systems and solutions supporting on-line maintenance, together with the corresponding companies' activities and products. The contributions are titled 'MHI's technology supporting on-line maintenance', 'Technology supporting on-line maintenance - Toshiba's activities to upgrade monitoring and diagnostic service and maintenance management', 'AsahiKASEI's activities of on-line maintenance', 'Importance of information sharing of on-line maintenance and its ideal method - function of impact plan of IBM Maximo Asset Management for Nuclear', 'US's on-line maintenance and information systems', 'SmartProcedures realizing safe operation of nuclear power plant - proposal of computerized procedures', 'Ultrasonic leak detection system SDT170', 'Application of infrared thermography for equipment maintenance in nuclear power plant' and 'On-line condition monitoring system - condition eye'. (T. Tanaka)

  11. Preliminary probabilistic prediction of ice/snow accretion on stay cables based on meteorological variables

    DEFF Research Database (Denmark)

    Roldsgaard, Joan Hee; Kiremidjian, A.; Georgakis, Christos T.

    The scope of the present paper is to present a framework for assessment of the probability of occurrence of ice/snow accretion on bridge cables. The framework utilizes Bayesian Probabilistic Networks and the methodology is illustrated with an example of the cable-stayed Øresund Bridge. The case...

  12. Effects of structural nonlinearity and foundation sliding on probabilistic response of a nuclear structure

    International Nuclear Information System (INIS)

    Hashemi, Alidad; Elkhoraibi, Tarek; Ostadan, Farhang

    2015-01-01

    Highlights: • Probabilistic SSI analysis including structural nonlinearity and sliding is shown. • Analysis is done for a soil and a rock site and probabilistic demands are obtained. • Structural drift ratios and in-structure response spectra are evaluated. • Structural nonlinearity significantly impacts local demands in the structure. • Sliding generally reduces seismic demands and can be accommodated in design. - Abstract: This paper examines the effects of structural nonlinearity and foundation sliding on the results of probabilistic structural analysis of a typical nuclear structure where structural nonlinearity, foundation sliding and soil-structure interaction (SSI) are explicitly included. The evaluation is carried out for a soil and a rock site at 10^4, 10^5, and 10^6 year return periods (1E-4, 1E-5, and 1E-6 hazard levels, respectively). The input motions at each considered hazard level are deaggregated into low frequency (LF) and high frequency (HF) motions and a sample size of 30 is used for uncertainty propagation. The statistical distributions of structural responses including story drifts and in-structure response spectra (ISRS) as well as foundation sliding displacements are examined. The probabilistic implementation of explicit structural nonlinearity and foundation sliding in combination with the SSI effects is demonstrated using nonlinear response history analysis (RHA) of the structure with the foundation motions obtained from elastic SSI analyses, which are applied as input to fixed-base inelastic analyses. This approach quantifies the expected structural nonlinearity and sliding for the particular structural configuration and provides a robust analytical basis for the estimation of the probabilistic distribution of selected demand parameters both at the design level and beyond design level seismic input. For the subject structure, the inclusion of foundation sliding in the analysis is found to have reduced both

  13. Valid Probabilistic Predictions for Ginseng with Venn Machines Using Electronic Nose

    Directory of Open Access Journals (Sweden)

    You Wang

    2016-07-01

    Full Text Available In the application of electronic noses (E-noses), probabilistic prediction is a good way to estimate how confident we are about our prediction. In this work, a homemade E-nose system embedded with 16 metal-oxide semi-conductive gas sensors was used to discriminate nine kinds of ginsengs of different species or production places. A flexible machine learning framework, the Venn machine (VM), was introduced to make probabilistic predictions for each prediction. Three Venn predictors were developed based on three classical probabilistic prediction methods (Platt's method, Softmax regression and Naive Bayes). The three Venn predictors and the three classical probabilistic prediction methods were compared in terms of classification rate and especially the validity of the estimated probability. A best classification rate of 88.57% was achieved with Platt's method in offline mode, and the classification rate of VM-SVM (Venn machine based on Support Vector Machine) was 86.35%, just 2.22% lower. The validity of the Venn predictors was better than that of the corresponding classical probabilistic prediction methods, and the validity of VM-SVM was superior to the other methods. The results demonstrate that the Venn machine is a flexible tool to make precise and valid probabilistic predictions in the application of E-noses, and that VM-SVM achieved the best performance for the probabilistic prediction of ginseng samples.
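
    A minimal sketch of Platt's method, one of the three underlying probabilistic predictors named above: a logistic regression is fitted to held-out SVM decision scores to turn them into class probabilities. Data and parameters are synthetic stand-ins, and the Venn-machine layer itself is not shown.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic stand-in for 16-sensor E-nose feature vectors with binary labels
X, y = make_classification(n_samples=600, n_features=16, random_state=0)
X_tr, X_cal, y_tr, y_cal = train_test_split(X, y, test_size=0.3, random_state=0)

svm = SVC(kernel="rbf", gamma="scale").fit(X_tr, y_tr)

# Platt's method: fit a logistic regression to held-out SVM decision scores
scores_cal = svm.decision_function(X_cal).reshape(-1, 1)
platt = LogisticRegression().fit(scores_cal, y_cal)

new_scores = svm.decision_function(X_cal[:5]).reshape(-1, 1)
print("calibrated P(class=1):", np.round(platt.predict_proba(new_scores)[:, 1], 3))
```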

  14. Probabilistic exposure assessment to face and oral care cosmetic products by the French population.

    Science.gov (United States)

    Bernard, A; Dornic, N; Roudot, Ac; Ficheux, As

    2018-01-01

    Cosmetic exposure data for the face and mouth are limited in Europe. The aim of the study was to assess exposure to face cosmetics using recent French consumption data (Ficheux et al., 2016b, 2015). Exposure was assessed using a probabilistic method for thirty-one face products from four product lines - cleanser, care, make-up and make-up remover products - and for two oral care products. Probabilistic exposure was assessed for different subpopulations according to sex and age among adults and children. Pregnant women were also studied. The levels of exposure to moisturizing cream, lip balm, mascara, eyeliner, cream foundation, toothpaste and mouthwash were higher than the values currently used by the Scientific Committee on Consumer Safety (SCCS). Exposure values found for eye shadow, lipstick, lotion and milk (make-up remover) were lower than the SCCS values. These new French exposure values will be useful for safety assessors and for safety agencies in order to protect the general population and at-risk populations. Copyright © 2017. Published by Elsevier Ltd.

  15. Probabilistic Graph Layout for Uncertain Network Visualization.

    Science.gov (United States)

    Schulz, Christoph; Nocaj, Arlind; Goertler, Jochen; Deussen, Oliver; Brandes, Ulrik; Weiskopf, Daniel

    2017-01-01

    We present a novel uncertain network visualization technique based on node-link diagrams. Nodes expand spatially in our probabilistic graph layout, depending on the underlying probability distributions of edges. The visualization is created by computing a two-dimensional graph embedding that combines samples from the probabilistic graph. A Monte Carlo process is used to decompose a probabilistic graph into its possible instances and to continue with our graph layout technique. Splatting and edge bundling are used to visualize point clouds and network topology. The results provide insights into probability distributions for the entire network, not only for individual nodes and edges. We validate our approach using three data sets that represent a wide range of network types: synthetic data, protein-protein interactions from the STRING database, and travel times extracted from Google Maps. Our approach reveals general limitations of the force-directed layout and allows the user to recognize that some nodes of the graph are at a specific position just by chance.
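
    A minimal sketch of the Monte Carlo decomposition step described above, assuming the networkx package: edge probabilities are sampled into concrete graph instances, each instance is laid out, and per-node position statistics approximate the probabilistic layout (splatting and edge bundling are omitted).

```python
import networkx as nx
import numpy as np

rng = np.random.default_rng(0)
nodes = range(4)
# Probabilistic graph: each edge exists with the given probability
edges = {(0, 1): 0.9, (1, 2): 0.7, (2, 3): 0.4, (3, 0): 0.8, (0, 2): 0.3}

layouts = []
for _ in range(200):  # Monte Carlo decomposition into possible graph instances
    G = nx.Graph()
    G.add_nodes_from(nodes)
    G.add_edges_from(e for e, p in edges.items() if rng.random() < p)
    pos = nx.spring_layout(G, seed=42)  # fixed seed keeps instances comparable
    layouts.append(np.array([pos[v] for v in nodes]))

layouts = np.stack(layouts)                # (samples, nodes, 2)
mean_pos = layouts.mean(axis=0)            # splat centre per node
spread = layouts.std(axis=0).mean(axis=1)  # positional uncertainty per node
for v in nodes:
    print(f"node {v}: mean position {mean_pos[v].round(2)}, spread {spread[v]:.2f}")
```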

  16. Global/local methods for probabilistic structural analysis

    Science.gov (United States)

    Millwater, H. R.; Wu, Y.-T.

    1993-04-01

    A probabilistic global/local method is proposed to reduce the computational requirements of probabilistic structural analysis. A coarser global model is used for most of the computations, with a locally more refined model used only at key probabilistic conditions. The global model is used to establish the cumulative distribution function (cdf) and the Most Probable Point (MPP). The local model then uses the predicted MPP to adjust the cdf value. The global/local method is used within the advanced mean value probabilistic algorithm. The local model can be more refined than the global model in terms of finer mesh, smaller time step, tighter tolerances, etc., and can be used with linear or nonlinear models. The basis for this approach is described in terms of the correlation between the global and local models, which can be estimated from the global and local MPPs. A numerical example is presented using the NESSUS probabilistic structural analysis program with the finite element method used for the structural modeling. The results clearly indicate significant computer savings with minimal loss in accuracy.
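
    As a small illustration of the MPP concept used above, the following sketch finds the Most Probable Point of a toy limit state in standard normal space with scipy and converts the reliability index into a first-order failure probability; the limit-state function is invented and the global/local correction itself is not shown.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def g(u):
    """Invented limit state in standard normal space; failure when g(u) <= 0."""
    return 3.0 - u[0] - 0.5 * u[1] ** 2

# MPP: the point on the failure surface g(u) = 0 closest to the origin
res = minimize(lambda u: u @ u, x0=np.array([1.0, 1.0]),
               constraints=[{"type": "eq", "fun": g}])
beta = np.sqrt(res.fun)  # reliability index
print(f"MPP: {np.round(res.x, 3)}, beta = {beta:.3f}")
print(f"first-order failure probability ~ {norm.cdf(-beta):.2e}")
```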

  17. Probabilistic seismic hazards: Guidelines and constraints in evaluating results

    International Nuclear Information System (INIS)

    Sadigh, R.K.; Power, M.S.

    1989-01-01

    In conducting probabilistic seismic hazard analyses, consideration of the dispersion as well as of the upper bounds on ground motion is of great significance. In particular, truncation of ground motion levels at some upper limit has a major influence on the computed hazard at low-to-very-low probability levels. Additionally, other deterministic guidelines and constraints should be considered in evaluating probabilistic seismic hazard results. In contrast to probabilistic seismic hazard evaluations, mean plus one standard deviation ground motions are typically used for deterministic estimates of ground motions from maximum events that may affect a structure. To be consistent with standard deterministic practice, such maximum estimates should be the highest ground motion values considered for the site, and these maximum values should be associated with the largest possible event occurring at the site. Furthermore, the relationship between ground motion level and probability of exceedance should reflect a transition from purely probabilistic assessments of ground motion at high probability levels, where there are multiple chances for events, to a deterministic upper bound ground motion at very low probability levels, where there is very limited opportunity for maximum events to occur. In interplate regions, where the seismic sources may be characterized by a high-to-very-high rate of activity, the deterministic bounds will be approached or exceeded by the computed probabilistic hazard values at annual probability of exceedance levels typically as high as 10^-2 to 10^-3. Thus, at these or lower probability levels, probabilistically computed hazard values can readily be interpreted in the light of the deterministic constraints

  18. Probabilistic Anomaly Detection Based On System Calls Analysis

    Directory of Open Access Journals (Sweden)

    Przemysław Maciołek

    2007-01-01

    Full Text Available We present an application of the probabilistic approach to anomaly detection (PAD). By analyzing selected system calls (and their arguments), the chosen applications are monitored in the Linux environment. This allows us to estimate the '(ab)normality' of their behavior (by comparison to previously collected profiles). We have attached results of threat detection in a typical computer environment.
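
    A minimal sketch of profile-based system-call anomaly scoring in the spirit of the approach above, using only the Python standard library; the n-gram model, smoothing floor and example traces are illustrative assumptions rather than the authors' actual method.

```python
from collections import Counter
import math

def ngram_model(trace, n=3):
    """Frequency model of system-call n-grams from a normal-behaviour profile."""
    grams = Counter(tuple(trace[i:i + n]) for i in range(len(trace) - n + 1))
    total = sum(grams.values())
    return {g: c / total for g, c in grams.items()}, total

def abnormality(model, total, trace, n=3):
    """Average negative log-probability; higher means less 'normal'."""
    floor = 1.0 / (10 * total)  # crude smoothing for unseen n-grams
    grams = [tuple(trace[i:i + n]) for i in range(len(trace) - n + 1)]
    return -sum(math.log(model.get(g, floor)) for g in grams) / len(grams)

profile = ["open", "read", "read", "close", "open", "read", "write", "close"] * 50
model, total = ngram_model(profile)
normal = ["open", "read", "read", "close"]
suspicious = ["open", "exec", "socket", "close"]
print("normal trace:    ", round(abnormality(model, total, normal), 2))
print("suspicious trace:", round(abnormality(model, total, suspicious), 2))
```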

  19. Probabilistic cellular automata.

    Science.gov (United States)

    Agapie, Alexandru; Andreica, Anca; Giuclea, Marius

    2014-09-01

    Cellular automata are binary lattices used for modeling complex dynamical systems. The automaton evolves iteratively from one configuration to another, using some local transition rule based on the number of ones in the neighborhood of each cell. With respect to the number of cells allowed to change per iteration, we speak of either synchronous or asynchronous automata. If randomness is involved to some degree in the transition rule, we speak of probabilistic automata, otherwise they are called deterministic. With either type of cellular automaton we are dealing with, the main theoretical challenge stays the same: starting from an arbitrary initial configuration, predict (with highest accuracy) the end configuration. If the automaton is deterministic, the outcome simplifies to one of two configurations, all zeros or all ones. If the automaton is probabilistic, the whole process is modeled by a finite homogeneous Markov chain, and the outcome is the corresponding stationary distribution. Based on our previous results for the asynchronous case, connecting the probability of a configuration in the stationary distribution to its number of zero-one borders, the article offers both numerical and theoretical insight into the long-term behavior of synchronous cellular automata.
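
    A minimal sketch of a synchronous probabilistic cellular automaton on a ring, simulated to see which absorbing-like configuration (mostly ones or mostly zeros) the chain settles near; the noisy-majority rule and all parameters are illustrative choices, not those of the article.

```python
import numpy as np

rng = np.random.default_rng(3)

def step(config, p):
    """Synchronous update: each cell adopts its neighbourhood majority with
    probability p, otherwise keeps its current state."""
    left, right = np.roll(config, 1), np.roll(config, -1)
    majority = ((left + config + right) >= 2).astype(int)
    adopt = rng.random(config.size) < p
    return np.where(adopt, majority, config)

runs, ones_runs = 500, 0
for _ in range(runs):
    c = rng.integers(0, 2, size=21)
    for _ in range(200):
        c = step(c, p=0.8)
    ones_runs += int(c.sum() > c.size // 2)  # which side did the chain settle near?
print(f"fraction of runs ending mostly ones: {ones_runs / runs:.3f}")
```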

  20. Probabilistic Simulation of Multi-Scale Composite Behavior

    Science.gov (United States)

    Chamis, Christos C.

    2012-01-01

    A methodology is developed to computationally assess the non-deterministic composite response at all composite scales (from micro to structural) due to the uncertainties in the constituent (fiber and matrix) properties, in the fabrication process and in structural variables (primitive variables). The methodology is computationally efficient for simulating the probability distributions of composite behavior, such as material properties and laminate and structural responses. By-products of the methodology are probabilistic sensitivities of the composite primitive variables. The methodology has been implemented into the computer codes PICAN (Probabilistic Integrated Composite ANalyzer) and IPACS (Integrated Probabilistic Assessment of Composite Structures). The accuracy and efficiency of this methodology are demonstrated by simulating the uncertainties in typical composite laminates and comparing the results with the Monte Carlo simulation method. Available experimental data of composite laminate behavior at all scales fall within the scatters predicted by PICAN. Multi-scaling is extended to simulate probabilistic thermo-mechanical fatigue and to simulate the probabilistic design of a composite redome in order to illustrate its versatility. Results show that probabilistic fatigue can be simulated for different temperature amplitudes and for different cyclic stress magnitudes. Results also show that laminate configurations can be selected to increase the redome reliability by several orders of magnitude without increasing the laminate thickness - a unique feature of structural composites. The old reference denotes that nothing fundamental has been done since that time.

  1. Comparative study of probabilistic methodologies for small signal stability assessment

    Energy Technology Data Exchange (ETDEWEB)

    Rueda, J.L.; Colome, D.G. [Universidad Nacional de San Juan (IEE-UNSJ), San Juan (Argentina). Inst. de Energia Electrica], Emails: joseluisrt@iee.unsj.edu.ar, colome@iee.unsj.edu.ar

    2009-07-01

    Traditional deterministic approaches for small signal stability assessment (SSSA) are unable to properly reflect the uncertainties existing in real power systems. Hence, the probabilistic analysis of small signal stability (SSS) is attracting more attention from power system engineers. This paper discusses and compares two probabilistic methodologies for SSSA, based on the two-point estimation method and the so-called Monte Carlo method, respectively. The comparisons are based on the results obtained for several power systems of different sizes and with different SSS performance. It is demonstrated that although an analytical approach can reduce the amount of computation of probabilistic SSSA, the different degrees of approximation that are adopted lead to deceptive results. Conversely, Monte Carlo based probabilistic SSSA can be carried out with reasonable computational effort while maintaining satisfactory estimation precision. (author)
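
    A minimal sketch contrasting the two methodologies on a toy response function: Rosenblueth's 2^n point-estimate scheme (a classical point-estimation method) versus plain Monte Carlo sampling; the function and input statistics are invented for illustration.

```python
import itertools
import numpy as np

def response(x):
    """Invented stand-in for a stability index depending on two uncertain parameters."""
    return np.exp(-0.5 * x[0]) * (1.0 + 0.3 * x[1] ** 2)

mu = np.array([1.0, 0.5])
sigma = np.array([0.2, 0.1])

# Rosenblueth's 2^n point-estimate method for uncorrelated, symmetric inputs:
# evaluate the response at every (mu_i +/- sigma_i) corner with equal weight.
corners = [response(mu + np.array(s) * sigma)
           for s in itertools.product((-1, 1), repeat=len(mu))]
pe_mean, pe_std = np.mean(corners), np.std(corners)

rng = np.random.default_rng(0)
mc = response(rng.normal(mu, sigma, size=(100_000, 2)).T)
print(f"point estimate: mean={pe_mean:.4f}, std={pe_std:.4f}")
print(f"Monte Carlo:    mean={mc.mean():.4f}, std={mc.std():.4f}")
```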

  2. Probabilistic Design of Offshore Structural Systems

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    1988-01-01

    Probabilistic design of structural systems is considered in this paper. The reliability is estimated using first-order reliability methods (FORM). The design problem is formulated as the optimization problem to minimize a given cost function such that the reliability of the single elements...... satisfies given requirements or such that the systems reliability satisfies a given requirement. Based on a sensitivity analysis optimization procedures to solve the optimization problems are presented. Two of these procedures solve the system reliability-based optimization problem sequentially using quasi......-analytical derivatives. Finally an example of probabilistic design of an offshore structure is considered....

  4. A ligand prediction tool based on modeling and reasoning with imprecise probabilistic knowledge.

    Science.gov (United States)

    Liu, Weiru; Yue, Anbu; Timson, David J

    2010-04-01

    Ligand prediction has been driven by a fundamental desire to understand more about how biomolecules recognize their ligands and by the commercial imperative to develop new drugs. Most of the currently available software systems are very complex and time-consuming to use. Therefore, developing simple and efficient tools to perform initial screening of interesting compounds is an appealing idea. In this paper, we introduce our tool for very rapid screening for likely ligands (either substrates or inhibitors) based on reasoning with imprecise probabilistic knowledge elicited from past experiments. Probabilistic knowledge is input to the system via a user-friendly interface showing a base compound structure. A prediction of whether a particular compound is a substrate is queried against the acquired probabilistic knowledge base and a probability is returned as an indication of the prediction. This tool will be particularly useful in situations where a number of similar compounds have been screened experimentally, but information is not available for all possible members of that group of compounds. We use two case studies to demonstrate how to use the tool. 2009 Elsevier Ireland Ltd. All rights reserved.

  5. Probabilistic Cue Combination: Less Is More

    Science.gov (United States)

    Yurovsky, Daniel; Boyer, Ty W.; Smith, Linda B.; Yu, Chen

    2013-01-01

    Learning about the structure of the world requires learning probabilistic relationships: rules in which cues do not predict outcomes with certainty. However, in some cases, the ability to track probabilistic relationships is a handicap, leading adults to perform non-normatively in prediction tasks. For example, in the "dilution effect,"…

  6. Risk management on nuclear power plant. Application of probabilistic risk assessment

    International Nuclear Information System (INIS)

    Kojima, Shigeo

    2003-01-01

    In the U.S.A., nuclear safety regulation is moving to risk-informed regulation (RIR), so the necessity of a standard providing the contents of the probabilistic risk assessment (PRA) that underpins it has been discussed for a long time. In 1998, the Committee on Nuclear Risk Management (CNRM) of the American Society of Mechanical Engineers (ASME) began to investigate such a standard, the latest edition of which was published as the Standard for Probabilistic Risk Assessment for Nuclear Power Plant Applications, RA-S-2002, in April 2002. Since the Nuclear Regulatory Commission (NRC), electric power companies, national institutes, PRA specialists and others took part in the Committee and carried out many energetic discussions on risk management in the U.S.A., the standard was completed after about four years of effort. In the U.S.A., risk management that already uses PRA is successfully practiced, and the U.S.A. is at a more advanced stage of risk management than Japan. The PRA standard and concrete risk management methods carried out at nuclear power stations are described here. (G.K.)

  7. Report on probabilistic safety assessment (PSA) quality assurance in utilization of risk information

    International Nuclear Information System (INIS)

    2006-12-01

    Recently in Japan, the introduction of nuclear safety regulation using risk information such as probabilistic safety assessment (PSA) has been considered, and the utilization of risk information in rational and practical safety assurance measures has made progress, starting with the operation and inspection areas. The report compiles the results of investigations and studies of PSA quality assurance in risk-informed activities in the USA. The relevant regulatory guide and standard review plan, as well as issues and recommendations, were reviewed for technical adequacy and advancement of probabilistic risk assessment technology in risk-informed decision making. Useful and important information to be referred to as issues in PSA quality assurance was identified. (T. Tanaka)

  8. The analysis of probability task completion; Taxonomy of probabilistic thinking-based across gender in elementary school students

    Science.gov (United States)

    Sari, Dwi Ivayana; Budayasa, I. Ketut; Juniati, Dwi

    2017-08-01

    The formulation of mathematical learning goals is now oriented not only to cognitive products but also to cognitive processes, one of which is probabilistic thinking. Probabilistic thinking is needed by students to make decisions, and elementary school students are required to develop it as a foundation for learning probability at higher levels. A framework of students' probabilistic thinking had been developed using the SOLO taxonomy, consisting of prestructural, unistructural, multistructural and relational probabilistic thinking. This study aimed to analyze probability task completion based on this taxonomy of probabilistic thinking. The subjects were two fifth-grade students, a boy and a girl, selected on the basis of high mathematical ability as measured by a test. The subjects were given probability tasks covering sample space, probability of an event and probability comparison. The data analysis consisted of categorization, reduction, interpretation and conclusion, and data credibility was established by time triangulation. The results indicated that the boy's probabilistic thinking in completing the probability tasks was at the multistructural level, while the girl's was at the unistructural level; that is, the boy's level of probabilistic thinking was higher than the girl's. The results could contribute to curriculum developers in formulating probability learning goals for elementary school students. Indeed, teachers could teach probability with regard to gender differences.

  9. Development of a Probabilistic Technique for On-line Parameter and State Estimation in Non-linear Dynamic Systems

    International Nuclear Information System (INIS)

    Tunc Aldemir; Miller, Don W.; Hajek, Brian K.; Peng Wang

    2002-01-01

    The DSD (Dynamic System Doctor) is a system-independent, interactive software package under development for on-line state/parameter estimation in dynamic systems (1), partially supported through a Nuclear Engineering Education Research (NEER) grant during 1998-2001. This paper summarizes the recent accomplishments in improving the user-friendliness and computational capability of DSD

  10. Probabilistic Seismic Hazard Assessment Method for Nonlinear Soil Sites based on the Hazard Spectrum of Bedrock Sites

    International Nuclear Information System (INIS)

    Hahm, Dae Gi; Seo, Jeong Moon; Choi, In Kil

    2011-01-01

    For the probabilistic safety assessment of nuclear power plants (NPPs) under seismic events, a rational probabilistic seismic hazard estimation should be performed. Generally, the probabilistic seismic hazard of an NPP site is represented by the uniform hazard spectrum (UHS) for a specific annual frequency. In most cases, since the attenuation equations are defined for bedrock sites, the standard attenuation laws cannot be applied to general soft-soil sites. Hence, for the probabilistic estimation of the seismic hazard of soft-soil sites, a methodology of probabilistic seismic hazard analysis (PSHA) coupled with nonlinear dynamic analyses of the soil column is required. Two methods are commonly used for site response analysis considering the nonlinearity of sites: the deterministic method and the probabilistic method. In the analysis of site response, there exist many uncertainty factors, such as the variation of the magnitude and frequency content of the input ground motion, and the material properties of the soil deposits. Hence, adoption of the probabilistic method for the PSHA of soft-soil deposits is nowadays recommended to account for such uncertainty factors. In this study, we estimated the amplification factor of the surface of soft-soil deposits, considering the uncertainties of the input ground motions and the soil material properties. We then proposed a probabilistic methodology to evaluate the UHS of a soft-soil site by multiplying the amplification factor by the UHS of the bedrock site. The proposed method was applied to four typical target sites of the KNGR and APR1400 NPP site categories.
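
    The final multiplication step lends itself to a compact sketch: sample frequency-dependent amplification factors that carry the soil and input-motion uncertainties, and scale the bedrock UHS to a probabilistic surface UHS. The lognormal amplification model and all numbers below are assumptions for illustration, not KNGR/APR1400 site data.

```python
import numpy as np

rng = np.random.default_rng(1)
freqs = np.array([0.5, 1.0, 2.5, 5.0, 10.0])          # spectral frequencies (Hz)
uhs_rock = np.array([0.08, 0.15, 0.30, 0.25, 0.18])   # bedrock UHS (g), assumed

# Frequency-dependent amplification factor, lognormal to carry the
# soil-property and input-motion uncertainties (values illustrative).
median_af = np.array([1.8, 2.2, 1.6, 1.2, 0.9])
sigma_ln = 0.35
af = median_af * np.exp(rng.normal(0.0, sigma_ln, size=(10_000, freqs.size)))

uhs_soil = uhs_rock * af                              # probabilistic surface UHS
for p in (50, 84):                                    # median / 84th percentile
    print(f"{p}th pct:", np.percentile(uhs_soil, p, axis=0).round(3))
```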

  11. CAD Parts-Based Assembly Modeling by Probabilistic Reasoning

    KAUST Repository

    Zhang, Kai-Ke

    2016-04-11

    Nowadays, an increasing number of parts and sub-assemblies are publicly available, which can be used directly for product development instead of being created from scratch. In this paper, we propose an interactive design framework for efficient and smart assembly modeling, in order to improve design efficiency. Our approach is based on probabilistic reasoning. Given a collection of industrial assemblies, we learn a probabilistic graphical model from the relationships between the parts of the assemblies. Then, in the modeling stage, this probabilistic model is used to suggest the most likely parts compatible with the current assembly. Finally, the parts are assembled under certain geometric constraints. We demonstrate the effectiveness of our framework through a variety of assembly models produced by our prototype system. © 2015 IEEE.

  12. CAD Parts-Based Assembly Modeling by Probabilistic Reasoning

    KAUST Repository

    Zhang, Kai-Ke; Hu, Kai-Mo; Yin, Li-Cheng; Yan, Dongming; Wang, Bin

    2016-01-01

    Nowadays, an increasing number of parts and sub-assemblies are publicly available, which can be used directly for product development instead of being created from scratch. In this paper, we propose an interactive design framework for efficient and smart assembly modeling, in order to improve design efficiency. Our approach is based on probabilistic reasoning. Given a collection of industrial assemblies, we learn a probabilistic graphical model from the relationships between the parts of the assemblies. Then, in the modeling stage, this probabilistic model is used to suggest the most likely parts compatible with the current assembly. Finally, the parts are assembled under certain geometric constraints. We demonstrate the effectiveness of our framework through a variety of assembly models produced by our prototype system. © 2015 IEEE.

  13. Systematic evaluations of probabilistic floor response spectrum generation

    International Nuclear Information System (INIS)

    Lilhanand, K.; Wing, D.W.; Tseng, W.S.

    1985-01-01

    The relative merits of the current methods for direct generation of probabilistic floor response spectra (FRS) from the prescribed design response spectra (DRS) are evaluated. The explicit probabilistic methods, which explicitly use the relationship between the power spectral density function (PSDF) and response spectra (RS), i.e., the PSDF-RS relationship, are found to have advantages for practical applications over the implicit methods. To evaluate the accuracy of the explicit methods, the root-mean-square (rms) response and the peak factor contained in the PSDF-RS relationship are systematically evaluated, especially for the narrow-band floor spectral response, by comparing the analytical results with simulation results. Based on the evaluation results, a method is recommended for practical use for the direct generation of probabilistic FRS. (orig.)
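
    For background, the PSDF-RS relationship evaluated above is usually written as follows (a standard random-vibration form; the paper's exact expressions may differ): a spectral ordinate at oscillator frequency $\omega_0$ and damping $\zeta$ equals a peak factor $p$ times the rms response $\sigma$,

$$ S(\omega_0,\zeta) = p\,\sigma, \qquad \sigma^2 = \int_0^\infty \frac{G(\omega)\, d\omega}{(\omega_0^2-\omega^2)^2 + (2\zeta\omega_0\omega)^2}, $$

    where $G(\omega)$ is the one-sided PSDF and $p$ is often taken from Davenport's approximation, $p \approx \sqrt{2\ln(\nu T)} + 0.5772/\sqrt{2\ln(\nu T)}$, with $\nu$ the mean zero-crossing rate and $T$ the strong-motion duration.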

  14. Probabilistic Criterion for the Economical Assessment of Nuclear Reactors

    International Nuclear Information System (INIS)

    Juanico, L; Florido, Pablo; Bergallo, Juan

    2000-01-01

    In this paper a Monte Carlo probabilistic model for the economic evaluation of nuclear power plants is presented. The probabilistic results have shown a wide spread in economic performance due to the schedule complexity and the coupling of tasks. This spread increases with the discount rate and hence becomes more important for developing countries.
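
    The mechanism described, schedule spread amplified by discounting, can be reproduced with a toy Monte Carlo: sample the build time from two coupled construction paths and capitalize the spend to the (random) completion date. All distributions and figures below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5000
# Two coupled construction paths; completion time is their maximum.
dur = np.maximum(rng.lognormal(np.log(5.0), 0.25, n),
                 rng.lognormal(np.log(6.0), 0.35, n))   # build time (years)
spend = 500.0                                           # M$ per build year

for rate in (0.03, 0.08, 0.12):
    # Capitalize each year's spend to the (random) completion date, so
    # schedule overruns compound with the discount rate.
    cost = np.array([spend * sum((1.0 + rate) ** (d - t)
                                 for t in range(1, int(np.ceil(d)) + 1))
                     for d in dur])
    print(f"rate={rate:.0%}: mean={cost.mean():7.0f} M$, std={cost.std():6.0f} M$")
```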

  15. A probabilistic coverage for on-the-fly test generation algorithms

    OpenAIRE

    Goga, N.

    2003-01-01

    This paper describes a way to compute the coverage of an on-the-fly test generation algorithm based on a probabilistic approach. The on-the-fly test generation and execution process, and the development process of an implementation from a specification, are viewed as stochastic processes. The probabilities of the stochastic processes are integrated into a generalized definition of coverage. The generalized formulas are instantiated for the ioco theory and for the specification of the TorX test g...

  16. Design of x-ray diagnostic beam line for a synchrotron radiation source and measurement results

    Energy Technology Data Exchange (ETDEWEB)

    Garg, Akash Deep, E-mail: akash-deep@rrcat.gov.in; Karnewar, A.K.; Ojha, A.; Shrivastava, B.B.; Holikatti, A.C.; Puntambekar, T.A.; Navathe, C.P.

    2014-08-01

    Indus-2 is a 2.5 GeV synchrotron radiation source (SRS) operational at the Raja Ramanna Centre for Advanced Technology (RRCAT) in India. We have designed, developed and commissioned an x-ray diagnostic beam line (X-DBL) at Indus-2, based on pinhole array imaging (8–18 keV). We have derived new equations for on-line measurement of the source position and emission angle with pinhole array optics. The measured values are compared with measurements from an independent x-ray beam position monitor (a staggered-pair blade monitor) installed in the X-DBL, and are close to the theoretically expected values within ±12 µm (or ±1.5 µrad) over a sufficiently wide range of beam movements. So, besides the beam size and the beam emittance, on-line information on the vertical position and angle is also used in orbit steering. In this paper, the various design considerations of the X-DBL and on-line measurement results are presented.

  17. Real-time control of tearing modes using a line-of-sight electron cyclotron emission diagnostic

    International Nuclear Information System (INIS)

    Hennen, B A; Westerhof, E; De Baar, M R; Bongers, W A; Thoen, D J; Nuij, P W J M; Steinbuch, M; Oosterbeek, J W; Buerger, A

    2010-01-01

    The stability and performance of tokamak plasmas are limited by instabilities such as neoclassical tearing modes. This paper reports an experimental proof of principle of a feedback control approach for real-time, autonomous suppression and stabilization of tearing modes in a tokamak. The system combines an electron cyclotron emission diagnostic, for sensing the tearing modes, in the same line of sight as a steerable electron cyclotron resonance heating and current drive (ECRH/ECCD) antenna. A methodology for fast detection of q = m/n = 2/1 tearing modes and retrieval of their location, rotation frequency and phase is presented. Set-points to establish alignment of the ECRH/ECCD deposition location with the centre of the tearing mode are generated in real time and forwarded in closed loop to the steerable launcher, and as a modulation pulse train to the gyrotron. Experimental results demonstrate the capability of the control system to track externally perturbed tearing modes in real time.

  18. Guidance for the definition and application of probabilistic safety criteria

    International Nuclear Information System (INIS)

    Holmberg, J.-E.; Knochenhauer, M.

    2011-05-01

    The project 'The Validity of Safety Goals' has been financed jointly by NKS (Nordic Nuclear Safety Research), SSM (Swedish Radiation Safety Authority) and the Swedish and Finnish nuclear utilities. The national financing went through NPSAG, the Nordic PSA Group (Swedish contributions) and SAFIR2010, the Finnish research programme on NPP safety (Finnish contributions). The project has been performed in four phases during 2006-2010. This guidance document aims at describing, on the basis of the work performed throughout the project, issues to consider when defining, applying and interpreting probabilistic safety criteria. Thus, the basic aim of the document is to serve as a checklist and toolbox for the definition and application of probabilistic safety criteria. The document describes the terminology and concepts involved, the levels of criteria and relations between these, how to define a probabilistic safety criterion, how to apply a probabilistic safety criterion, on what to apply the probabilistic safety criterion, and how to interpret the result of the application. The document specifically deals with what makes up a probabilistic safety criterion, i.e., the risk metric, the frequency criterion, the PSA used for assessing compliance and the application procedure for the criterion. It also discusses the concept of subsidiary criteria, i.e., different levels of safety goals. The results from the project can be used as a platform for discussions at the utilities on how to define and use quantitative safety goals. The results can also be used by safety authorities as a reference for risk-informed regulation. The outcome can have an impact on the requirements on PSA, e.g., regarding quality, scope, level of detail, and documentation. Finally, the results can be expected to support on-going activities concerning risk-informed applications. (Author)

  19. Guidance for the definition and application of probabilistic safety criteria

    Energy Technology Data Exchange (ETDEWEB)

    Holmberg, J.-E. (VTT Technical Research Centre of Finland (Finland)); Knochenhauer, M. (Scandpower AB (Sweden))

    2011-05-15

    The project 'The Validity of Safety Goals' has been financed jointly by NKS (Nordic Nuclear Safety Research), SSM (Swedish Radiation Safety Authority) and the Swedish and Finnish nuclear utilities. The national financing went through NPSAG, the Nordic PSA Group (Swedish contributions) and SAFIR2010, the Finnish research programme on NPP safety (Finnish contributions). The project has been performed in four phases during 2006-2010. This guidance document aims at describing, on the basis of the work performed throughout the project, issues to consider when defining, applying and interpreting probabilistic safety criteria. Thus, the basic aim of the document is to serve as a checklist and toolbox for the definition and application of probabilistic safety criteria. The document describes the terminology and concepts involved, the levels of criteria and relations between these, how to define a probabilistic safety criterion, how to apply a probabilistic safety criterion, on what to apply the probabilistic safety criterion, and how to interpret the result of the application. The document specifically deals with what makes up a probabilistic safety criterion, i.e., the risk metric, the frequency criterion, the PSA used for assessing compliance and the application procedure for the criterion. It also discusses the concept of subsidiary criteria, i.e., different levels of safety goals. The results from the project can be used as a platform for discussions at the utilities on how to define and use quantitative safety goals. The results can also be used by safety authorities as a reference for risk-informed regulation. The outcome can have an impact on the requirements on PSA, e.g., regarding quality, scope, level of detail, and documentation. Finally, the results can be expected to support on-going activities concerning risk-informed applications. (Author)

  20. Probabilistic Modeling of Intracranial Pressure Effects on Optic Nerve Biomechanics

    Science.gov (United States)

    Ethier, C. R.; Feola, Andrew J.; Raykin, Julia; Myers, Jerry G.; Nelson, Emily S.; Samuels, Brian C.

    2016-01-01

    Altered intracranial pressure (ICP) is implicated in several ocular conditions: papilledema, glaucoma and Visual Impairment and Intracranial Pressure (VIIP) syndrome. The biomechanical effects of altered ICP on optic nerve head (ONH) tissues in these conditions are uncertain but likely important. We have quantified ICP-induced deformations of ONH tissues, using finite element (FE) and probabilistic modeling (Latin Hypercube Sampling (LHS)) to consider a range of tissue properties and relevant pressures.
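
    A minimal sketch of the Latin Hypercube workflow named above: stratified sampling of the uncertain inputs, a model evaluation per sample, and percentile summaries of the response. The parameter ranges and the stand-in "finite element model" are placeholders, not values from the study.

```python
import numpy as np
from scipy.stats import qmc

sampler = qmc.LatinHypercube(d=3, seed=3)
u = sampler.random(n=200)                       # 200 stratified samples in [0,1)^3
# Map to assumed ranges: sclera E (MPa), lamina E (MPa), ICP (mmHg).
lo, hi = np.array([1.0, 0.1, 5.0]), np.array([5.0, 1.0, 25.0])
x = qmc.scale(u, lo, hi)

def fe_model(sclera_E, lamina_E, icp):
    # Placeholder for the real finite element solve: returns a mock
    # peak strain that falls with stiffness and rises with pressure.
    return 0.02 * icp / (sclera_E + 4.0 * lamina_E)

strain = fe_model(x[:, 0], x[:, 1], x[:, 2])
print("median strain:", np.median(strain).round(4),
      "95th pct:", np.percentile(strain, 95).round(4))
```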

  1. Reliability analysis of minimum energy on target for laser facilities with more beam lines

    International Nuclear Information System (INIS)

    Chen Guangyu

    2008-01-01

    Shot reliability performance measures of laser facilities with many beam lines fall into three categories: minimum energy on target, power balance, and shot diagnostics. Exploiting the symmetry of the NIF beam line design and the similarity of subset reliabilities within a partition, a fault tree for meeting minimum energy on target in a type-K shot of a large laser facility, together with a simplified evaluation method, is presented. These are used to analyze the hypothetical reliability of partition subsets, in order to identify how an increasing number of beam lines and diverse shot types influence the shot reliability of large laser facilities. Finally, it is found that improving component reliability is more crucial for laser facilities with more beam lines than for those with fewer, and that functional diversity arising from design flexibility greatly helps to improve shot reliability. (authors)
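
    The trend claimed for growing facilities can be seen in a deliberately simplified stand-in for the fault tree: with independent, identical beam lines, the probability that every line delivers decays rapidly with the line count unless per-line (component) reliability improves. The numbers below are invented.

```python
from math import comb

def p_shot_success(n_lines: int, k_required: int, r_line: float) -> float:
    """Probability that at least k of n independent, identical beam lines
    deliver their energy -- a simplified stand-in for the fault tree."""
    return sum(comb(n_lines, k) * r_line**k * (1.0 - r_line)**(n_lines - k)
               for k in range(k_required, n_lines + 1))

# Demanding "all lines deliver" degrades quickly as facilities grow,
# so per-line (component) reliability must improve with the line count.
for n in (8, 48, 192):
    print(n, "lines:", round(p_shot_success(n, n, 0.995), 3))
```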

  2. Diagnostics of vector magnetic fields

    Science.gov (United States)

    Stenflo, J. O.

    1985-01-01

    It is shown that the vector magnetic fields derived from observations with a filter magnetograph will be severely distorted if the spatially unresolved magnetic structure is not properly accounted for. Thus the apparent vector field will appear much more horizontal than it really is, but this distortion is strongly dependent on the area factor and the temperature line weakenings. As the available fluxtube models are not sufficiently well determined, it is not possible to correct the filter magnetograph observations for these effects in a reliable way, although a crude correction is of course much better than no correction at all. The solution to this diagnostic problem is to observe simultaneously in suitable combinations of spectral lines, and/or to use Stokes line profiles recorded with very high spectral resolution. The diagnostic power of using a Fourier transform spectrometer for polarimetry is shown, and some results from I and V spectra are illustrated. The line asymmetries caused by mass motions inside the fluxtubes add an extra complication to the diagnostic problem, in particular as there are indications that the motions are nonstationary in nature. The temperature structure appears to be a function of fluxtube diameter, as a clear difference between plage and network fluxtubes was revealed. The divergence of the magnetic field with height plays an essential role in the explanation of the Stokes V asymmetries (in combination with the mass motions). A self-consistent treatment of the subarcsec field geometry may be required to allow an accurate derivation of the spatially averaged vector magnetic field from spectrally resolved data.

  3. Probabilistic Reversible Automata and Quantum Automata

    OpenAIRE

    Golovkins, Marats; Kravtsev, Maksim

    2002-01-01

    To study relationship between quantum finite automata and probabilistic finite automata, we introduce a notion of probabilistic reversible automata (PRA, or doubly stochastic automata). We find that there is a strong relationship between different possible models of PRA and corresponding models of quantum finite automata. We also propose a classification of reversible finite 1-way automata.

  4. Probabilistic Assessment of the Occurrence and Duration of Ice Accretion on Cables

    DEFF Research Database (Denmark)

    Roldsgaard, Joan Hee; Georgakis, Christos Thomas; Faber, Michael Havbro

    2015-01-01

    This paper presents an operational framework for assessing the probability of occurrence of in-cloud and precipitation icing and its duration. The framework utilizes the features of Bayesian Probabilistic Networks, and its performance is illustrated through a case study of the cable-stayed Oresund Bridge. The Bayesian Probabilistic Network model used for the estimation of the occurrence and duration probabilities is studied and found to be robust with respect to changes in the choice of distribution types used to model the meteorological variables that influence the two icing types.
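
    The Bayesian Probabilistic Network machinery referred to above can be illustrated with a minimal discrete network. The structure (two meteorological parents of an icing node) and every probability below are invented placeholders, not values from the Oresund Bridge study.

```python
# Toy discrete Bayesian network:  Temp -> Icing <- Humidity.
# All conditional probability tables are invented placeholders.
p_temp = {"cold": 0.3, "mild": 0.7}
p_hum = {"high": 0.4, "low": 0.6}
p_ice = {("cold", "high"): 0.35, ("cold", "low"): 0.05,
         ("mild", "high"): 0.02, ("mild", "low"): 0.001}

# Marginal probability of icing, by enumeration over the parent nodes.
p_icing = sum(p_temp[t] * p_hum[h] * p_ice[(t, h)]
              for t in p_temp for h in p_hum)
print(f"P(icing) = {p_icing:.4f}")

# Diagnostic query via Bayes' rule: P(temp = cold | icing observed).
p_cold_ice = sum(p_temp["cold"] * p_hum[h] * p_ice[("cold", h)] for h in p_hum)
print(f"P(cold | icing) = {p_cold_ice / p_icing:.3f}")
```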

  5. Probabilistic safety assessment for seismic events

    International Nuclear Information System (INIS)

    1993-10-01

    This Technical Document on Probabilistic Safety Assessment for Seismic Events is mainly associated with the Safety Practice on Treatment of External Hazards in PSA and discusses in detail one specific external hazard, i.e. earthquakes

  6. ISSUES ASSOCIATED WITH PROBABILISTIC FAILURE MODELING OF DIGITAL SYSTEMS

    International Nuclear Information System (INIS)

    CHU, T.L.; MARTINEZ-GURIDI, G.; LIHNER, J.; OVERLAND, D.

    2004-01-01

    The current U.S. Nuclear Regulatory Commission (NRC) licensing process for instrumentation and control (I and C) systems is based on deterministic requirements, e.g., single failure criteria, and defense in depth and diversity. Probabilistic considerations can be used as supplements to the deterministic process. The National Research Council has recommended the development of methods for estimating failure probabilities of digital systems, including commercial off-the-shelf (COTS) equipment, for use in probabilistic risk assessment (PRA). NRC staff have developed informal qualitative and quantitative requirements for PRA modeling of digital systems. Brookhaven National Laboratory (BNL) has performed a review of the state of the art of the methods and tools that can potentially be used to model digital systems. The objectives of this paper are to summarize the review, discuss the issues associated with probabilistic modeling of digital systems, and identify potential areas of research that would advance the state of the art toward a satisfactory modeling method that could be integrated with a typical probabilistic risk assessment.

  7. Probabilistic safety analysis procedures guide

    International Nuclear Information System (INIS)

    Papazoglou, I.A.; Bari, R.A.; Buslik, A.J.

    1984-01-01

    A procedures guide for the performance of probabilistic safety assessment has been prepared for interim use in Nuclear Regulatory Commission programs. The probabilistic safety assessment studies performed are intended to produce probabilistic predictive models that can be used and extended by the utilities and by the NRC to sharpen the focus of inquiries into a range of issues affecting reactor safety. This guide addresses the determination of the probability (per year) of core damage resulting from accident initiators internal to the plant and from loss of offsite electric power. The scope includes analyses of problem-solving (cognitive) human errors, a determination of the importance of the various core damage accident sequences, and an explicit treatment and display of uncertainties for the key accident sequences. Ultimately, the guide will be augmented to include the plant-specific analysis of in-plant processes (i.e., containment performance) and the risk associated with external accident initiators, as consensus is developed regarding suitable methodologies in these areas. This guide provides the structure of a probabilistic safety study to be performed, and indicates what products of the study are essential for regulatory decision making. Methodology is treated in the guide only to the extent necessary to indicate the range of methods that is acceptable; ample reference is given to alternative methodologies which may be utilized in the performance of the study.

  8. Development of a high resolution cylindrical crystal spectrometer for line shape and spectral diagnostics of x-rays emitted from - hot - plasmas. Final report, June 1, 1976-December 31, 1983

    International Nuclear Information System (INIS)

    Kaellne, E.G.

    1984-01-01

    The development, installation and evaluation of high resolution X-ray spectroscopic diagnostics are reported. The approach has been to optimize spectrometer throughput, to enable single-shot plasma diagnostics with good time resolution, and to ensure sufficient energy resolution to allow line profile analysis. These goals have been achieved using a new X-ray geometry combined with a new position-sensitive X-ray detector. These diagnostics have been used at Alcator C to detect X-ray emission from highly ionized impurity elements as well as from argon seed elements specially introduced into the plasma for this diagnostic. Temporally resolved ion temperature profiles have been obtained from the recorded X-ray spectra simultaneously with other plasma parameters such as electron temperature, ionization temperature and ionization stage distribution. Radial profiles have also been measured. The developed X-ray diagnostics thus serve as a major multiparameter probe of the central core of the plasma, with complementary information on radial profiles.

  9. Multiobjective optimal allocation problem with probabilistic non ...

    African Journals Online (AJOL)

    This paper considers the optimum compromise allocation in multivariate stratified sampling with a non-linear objective function and a probabilistic non-linear cost constraint. The probabilistic non-linear cost constraint is converted into an equivalent deterministic one by using chance-constrained programming. A numerical ...
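
    As background on the chance-constrained conversion mentioned above (the textbook linear-normal case; the paper treats a more general non-linear cost): for a normally distributed cost vector $c$ with mean $\bar c$ and covariance $\Sigma$,

$$ \Pr\{\,c^{\mathsf T}x \le C_0\,\} \ge p \;\Longleftrightarrow\; \bar{c}^{\mathsf T}x + \Phi^{-1}(p)\,\sqrt{x^{\mathsf T}\Sigma x} \;\le\; C_0, $$

    where $\Phi^{-1}$ is the standard normal quantile; the probabilistic constraint thus becomes a deterministic constraint (convex for $p \ge 0.5$).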

  10. Pre-processing for Triangulation of Probabilistic Networks

    NARCIS (Netherlands)

    Bodlaender, H.L.; Koster, A.M.C.A.; Eijkhof, F. van den; Gaag, L.C. van der

    2001-01-01

    The currently most efficient algorithm for inference with a probabilistic network builds upon a triangulation of the network's graph. In this paper, we show that pre-processing can help in finding good triangulations for probabilistic networks, that is, triangulations with a minimal maximum clique size.

  11. Bayesian tomography and integrated data analysis in fusion diagnostics

    Science.gov (United States)

    Li, Dong; Dong, Y. B.; Deng, Wei; Shi, Z. B.; Fu, B. Z.; Gao, J. M.; Wang, T. B.; Zhou, Yan; Liu, Yi; Yang, Q. W.; Duan, X. R.

    2016-11-01

    In this article, a Bayesian tomography method using a non-stationary Gaussian process as a prior is introduced. The Bayesian formalism allows quantities which bear uncertainty to be expressed in probabilistic form, so that the uncertainty of a final solution can be fully resolved from the confidence interval of a posterior probability. Moreover, a consistency check of that solution can be performed by checking whether the misfits between predicted and measured data lie reasonably within the assumed data error. In particular, the accuracy of the reconstructions is significantly improved by using the non-stationary Gaussian process, which can adapt to the varying smoothness of the emission distribution. The application of this method to a soft X-ray diagnostic on HL-2A has been used to explore relevant physics in equilibrium and MHD instability modes. This project is carried out within a large-scale inference framework, aiming at an integrated analysis of heterogeneous diagnostics.
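
    For a linear forward operator and Gaussian noise, the Bayesian tomography posterior is available in closed form, which is the backbone of the approach described. The sketch below uses a stationary squared-exponential kernel as a stand-in for the paper's non-stationary one; the geometry matrix and emission profile are synthetic.

```python
import numpy as np

rng = np.random.default_rng(4)
# Toy 1-D "tomography": data are line integrals (a known linear operator A)
# of an emission profile f on a grid, with a Gaussian process prior on f.
n, m = 60, 12
xs = np.linspace(0.0, 1.0, n)
A = rng.uniform(0.0, 1.0, (m, n)) / n            # stand-in geometry matrix

# Squared-exponential prior kernel (stationary here; the paper's
# non-stationary kernel lets the smoothness vary across the profile).
ell, amp = 0.1, 1.0
K = amp * np.exp(-0.5 * (xs[:, None] - xs[None, :]) ** 2 / ell ** 2)

f_true = np.exp(-0.5 * (xs - 0.5) ** 2 / 0.01)   # synthetic emission profile
noise = 1e-3
d = A @ f_true + rng.normal(0.0, np.sqrt(noise), m)

# Linear-Gaussian posterior in closed form: mean and covariance.
S = A @ K @ A.T + noise * np.eye(m)              # data covariance
G = K @ A.T @ np.linalg.inv(S)                   # "gain" matrix
f_mean = G @ d
f_cov = K - G @ A @ K
half_width = 2.0 * np.sqrt(np.diag(f_cov))       # ~95% credible half-width
print("max |error| vs truth:", float(np.abs(f_mean - f_true).max()))
print("mean credible half-width:", float(half_width.mean()))
```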

  12. A probabilistic model for the identification of confinement regimes and edge localized mode behavior, with implications to scaling laws

    International Nuclear Information System (INIS)

    Verdoolaege, Geert; Van Oost, Guido

    2012-01-01

    Pattern recognition is becoming an important tool in fusion data analysis. However, fusion diagnostic measurements are often affected by considerable statistical uncertainties, rendering the extraction of useful patterns a significant challenge. Therefore, we assume a probabilistic model for the data and perform pattern recognition in the space of probability distributions. We show the considerable advantage of our method for identifying confinement regimes and edge localized mode behavior, and we discuss the potential for scaling laws.

  13. Probabilistic Geoacoustic Inversion in Complex Environments

    Science.gov (United States)

    2015-09-30

    ... long-range inversion methods can fail to provide sufficient resolution. For proper quantitative examination of variability, parameter uncertainty must ... The project aims to advance probabilistic geoacoustic inversion methods for complex ocean environments for a range of geoacoustic data types. The work is ...

  14. On the functional failures concept and probabilistic safety margins: challenges in application for evaluation of effectiveness of shutdown systems - 15318

    International Nuclear Information System (INIS)

    Serghiuta, D.; Tholammakkil, J.

    2015-01-01

    The use of a level-3 reliability approach and the concept of functional failure probability could provide the basis for defining a safety margin metric that would include a limit on the probability of functional failure, in line with the definition of a reliability-based design. It can also allow a quantification of the level of confidence, by explicit modeling and quantification of uncertainties, and provide a better framework for representing the actual design and optimizing design margins within an integrated probabilistic-deterministic model. This paper reviews the attributes of, and challenges in applying, the functional failure concept to the evaluation of risk-informed safety margins, using the effectiveness of CANDU reactor shutdown systems as an illustrative example. A risk-informed formulation is first introduced for the estimation of a reasonable limit for the functional failure probability using a Swiss cheese model. It is concluded that more research is needed in this area and that a deterministic-probabilistic approach may be a reasonable intermediate step for the evaluation of functional failure probability at the system level. The views expressed in this paper are those of the authors and do not necessarily reflect those of the CNSC, or any part thereof. (authors)

  15. Probabilistic risk assessment methodology

    International Nuclear Information System (INIS)

    Shinaishin, M.A.

    1988-06-01

    The objective of this work is to provide the tools necessary for clear identification of: the purpose of a probabilistic risk study, the bounds and depth of the study, the proper modeling techniques to be used, the failure modes contributing to the analysis, the classical and Bayesian approaches for manipulating the data necessary for quantification, ways of treating uncertainties, and available computer codes that may be used in performing such probabilistic analyses. In addition, it provides the means for measuring the importance of a safety feature to maintaining a level of risk at a nuclear power plant, and the worth of optimizing a safety system for risk reduction. In applying these techniques so that they accommodate our national resources and needs, it was felt that emphasis should be put on the system reliability analysis level of PRA. Objectives of such studies could include comparing the system designs of the various vendors in the bidding stage, and performing grid reliability and human performance analyses using nation-specific data. (author)

  16. Probabilistic risk assessment methodology

    Energy Technology Data Exchange (ETDEWEB)

    Shinaishin, M A

    1988-06-15

    The objective of this work is to provide the tools necessary for clear identification of: the purpose of a probabilistic risk study, the bounds and depth of the study, the proper modeling techniques to be used, the failure modes contributing to the analysis, the classical and Bayesian approaches for manipulating the data necessary for quantification, ways of treating uncertainties, and available computer codes that may be used in performing such probabilistic analyses. In addition, it provides the means for measuring the importance of a safety feature to maintaining a level of risk at a nuclear power plant, and the worth of optimizing a safety system for risk reduction. In applying these techniques so that they accommodate our national resources and needs, it was felt that emphasis should be put on the system reliability analysis level of PRA. Objectives of such studies could include comparing the system designs of the various vendors in the bidding stage, and performing grid reliability and human performance analyses using nation-specific data. (author)

  17. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system structural components

    Science.gov (United States)

    Cruse, T. A.

    1987-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.

  18. Probabilistic Structural Analysis Methods for select space propulsion system structural components (PSAM)

    Science.gov (United States)

    Cruse, T. A.; Burnside, O. H.; Wu, Y.-T.; Polch, E. Z.; Dias, J. B.

    1988-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.

  19. Probabilistic structural analysis of aerospace components using NESSUS

    Science.gov (United States)

    Shiao, Michael C.; Nagpal, Vinod K.; Chamis, Christos C.

    1988-01-01

    Probabilistic structural analysis of a Space Shuttle main engine turbopump blade is conducted using the computer code NESSUS (numerical evaluation of stochastic structures under stress). The goal of the analysis is to derive probabilistic characteristics of the blade response given probabilistic descriptions of uncertainties in blade geometry, material properties, and temperature and pressure distributions. Probability densities are derived for critical blade responses. Risk assessment and failure life analysis are conducted assuming different failure models.

  20. The dialectical thinking about deterministic and probabilistic safety analysis

    International Nuclear Information System (INIS)

    Qian Yongbai; Tong Jiejuan; Zhang Zuoyi; He Xuhong

    2005-01-01

    There are two methods for designing and analysing the safety performance of a nuclear power plant: the traditional deterministic method and the probabilistic method. To date, the design of nuclear power plants has been based on the deterministic method, and practice has proved it effective for current nuclear power plants. However, the probabilistic method (Probabilistic Safety Assessment, PSA) considers a much wider range of faults, takes an integrated look at the plant as a whole, and uses realistic criteria for the performance of the plant's systems and structures. PSA can be seen, in principle, to provide a broader and more realistic perspective on safety issues than the deterministic approaches. In this paper, the historical origins and development trends of the above two methods are briefly reviewed and summarized. Based on the discussion of two application cases - one on changes to specific design provisions of the general design criteria (GDC), the other on the risk-informed categorization of structures, systems and components - it is concluded that the deterministic and probabilistic methods are dialectical and unified, are gradually merging into each other, and are being used in coordination. (authors)

  1. Probabilistic conditional independence structures

    CERN Document Server

    Studeny, Milan

    2005-01-01

    Probabilistic Conditional Independence Structures provides the mathematical description of probabilistic conditional independence structures; the author uses non-graphical methods for their description and takes an algebraic approach. The monograph presents the methods of structural imsets and supermodular functions, and deals with independence implication and equivalence of structural imsets. Motivation, mathematical foundations and areas of application are included, and a rough overview of graphical methods is also given. In particular, the author has been careful to use suitable terminology, and presents the work so that it will be understood by both statisticians and researchers in artificial intelligence. The necessary elementary mathematical notions are recalled in an appendix.

  2. McMurray's Test and Joint Line Tenderness for Medial Meniscus ...

    African Journals Online (AJOL)

    The wide variations reported have an impact on the clinical decision concerning whether to perform other diagnostic tests before proceeding to diagnostic arthroscopy, which is considered the gold standard. The purpose of this study was to determine the diagnostic value of joint line tenderness and McMurray's test as clinical signs ...

  3. Probabilistic safety analysis and interpretation thereof

    International Nuclear Information System (INIS)

    Steininger, U.; Sacher, H.

    1999-01-01

    Increasing use of the instruments of PSA is being made in Germany for quantitative technical safety assessment, for example with regard to incidents which must be reported and the forwarding of information, especially in the case of modification of nuclear plants. The Commission for Nuclear Reactor Safety recommends regular execution of PSA on a ten-year cycle. According to the PSA guidance instructions, probabilistic analyses serve to assess the degree of safety of the entire plant, expressed as the expected value of the frequency of endangering conditions. The authors describe the method, sequence of steps and evaluation of probabilistic safety analyses. The limits of probabilistic safety analyses arise in the practical implementation. Normally the guidance instructions for PSA are confined to the safety systems, so that in practice they are at best suitable for operational optimisation only to a limited extent. The present restriction of the analyses has a similar effect on power output operation of the plant. This seriously degrades the utilitarian value of these analyses for the plant operators. In order to further develop PSA as a supervisory and operational optimisation instrument, both authors consider it appropriate to bring together the specific know-how of analysts, manufacturers, plant operators and experts. (orig.)

  4. Probabilistic induction of delayed health hazards in occupational radiation workers

    International Nuclear Information System (INIS)

    Mohamad, M.H.M.; Abdel-Ghani, A.H.

    2003-01-01

    Occupational radiation workers are periodically monitored for their personal occupational dose. Various types of radiation measurement devices are used, mostly film badges and thermoluminescent dosimeters. Several thousand occupational radiation workers were monitored over a period of seven years (Jan. 1995 - Dec. 2001). These included atomic energy personnel, nuclear materials personnel, staff of radiology departments (diagnostic, therapeutic and nuclear medicine) and industrial workers handling industrial radiography equipment, besides other applications of radiation sources in industry. The probability of induction of health hazards in these radiation workers was assessed using the nominal probability coefficient adopted by the ICRP (1991) for both hereditary effects and cancer induction. In this treatise, the data procured are presented and discussed in the light of basic postulations of the probabilistic occurrence of radiation-induced delayed health effects.

  5. Development of probabilistic evaluation methodology for structural integrity of nuclear components

    International Nuclear Information System (INIS)

    Lee, Gang Yong; Yang, Jee Hyeok; Shin, Jeong Woo; Hong, Soon Won; Lee, Won Gyu; Kim, Goo Yeong

    1999-03-01

    Since integrity is very important in nuclear power plants, there has been a great deal of research, and several rules are provided. But these are mostly based on the concept of deterministic fracture mechanics, and in many cases those rules are unrealistic or conservative. Therefore, the concept of probabilistic fracture mechanics, which considers the realistic failure of the structure and a quantitative failure probability, has been introduced in many fields. There has been much research on probabilistic fracture mechanics worldwide, but little in Korea. The final objective of our research is to develop such a code. In the first-year study, we established the concepts of probabilistic fracture mechanics by reviewing papers on the integrity evaluation of nuclear pressure vessels based on probabilistic fracture mechanics, and selected the important random variables by comparing their effects on the failure probability using an existing code.

  6. EnergiTools(R) - a power plant performance monitoring and diagnosis tool

    International Nuclear Information System (INIS)

    Ancion, P.V.; Bastien, R.; Ringdahl, K.

    2000-01-01

    Westinghouse EnergiTools(R) is a performance diagnostic tool for power generation plants that combines the power of on-line process data acquisition with advanced diagnostic methodologies. The system uses analytical models based on thermodynamic principles combined with the knowledge of component diagnostic experts. An issue in modeling expert knowledge is to have a framework that can represent and process uncertainty in complex systems, where it is nearly impossible to build deterministic models of the effects of faults on symptoms. A methodology based on causal probabilistic graphs, more specifically on Bayesian belief networks, has been implemented in EnergiTools(R) to capture the fault-symptom relationships; it estimates the likelihood of the various component failures using those relationships. The system also has the ability to use neural networks for processes that are difficult to model analytically; an application is the estimation of the reactor power in a nuclear power plant by interpreting several plant indicators. EnergiTools(R) is used for on-line performance monitoring and diagnostics at the Vattenfall Ringhals nuclear power plants in Sweden, where it has led to the diagnosis of various performance issues with plant components. Two case studies are presented. In the first case, an overestimate of the thermal power due to a faulty instrument was found, which had led to plant operation below its optimal power; the paper shows how the problem was discovered using the analytical thermodynamic calculations. The second case shows an application of EnergiTools(R) to the diagnosis of a condenser failure using causal probabilistic graphs.
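
    The causal-probabilistic-graph idea can be sketched with a toy fault-symptom model. The faults, symptoms, noisy-OR likelihoods and the per-fault scoring against a no-fault alternative below are all illustrative simplifications, not EnergiTools' actual knowledge base.

```python
import numpy as np

# Toy fault-symptom model in the spirit of a causal probabilistic graph.
# Two candidate faults, three symptoms; all numbers are invented.
faults = ["fouled_condenser", "drifting_sensor"]
prior = np.array([0.02, 0.05])                  # prior fault probabilities
like = np.array([[0.90, 0.70, 0.10],            # P(symptom | fault), rows=faults
                 [0.10, 0.20, 0.95]])
leak = 0.01                                     # symptom base rate with no fault
observed = np.array([1, 1, 0])                  # symptoms seen on-line

# Score each fault against a no-fault alternative (a simplification of
# full multi-fault inference) using noisy-OR symptom likelihoods.
for f, name in enumerate(faults):
    p_obs_fault, p_obs_none = 1.0, 1.0
    for s, o in enumerate(observed):
        p_s = 1.0 - (1.0 - leak) * (1.0 - like[f, s])   # noisy-OR with leak
        p_obs_fault *= p_s if o else 1.0 - p_s
        p_obs_none *= leak if o else 1.0 - leak
    num = p_obs_fault * prior[f]
    post = num / (num + p_obs_none * (1.0 - prior[f]))
    print(f"{name}: posterior ~ {post:.2f}")
```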

  7. Site-specific Probabilistic Analysis of DCGLs Using RESRAD Code

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jeongju; Yoon, Suk Bon; Sohn, Wook [KHNP CRI, Daejeon (Korea, Republic of)

    2016-10-15

    In general, DCGLs can be conservative (screening DCGLs) if they do not take into account site-specific factors. Use of such conservative DCGLs can lead to additional remediation that would not be required if the effort were made to develop site-specific DCGLs. Therefore, the objective of this work is to provide an example of the use of the RESRAD 6.0 probabilistic (site-specific) dose analysis for comparison with the screening DCGL. Site release regulations state that a site will be considered acceptable for unrestricted use if the residual radioactivity that is distinguishable from background radiation results in a total effective dose equivalent (TEDE) to an average member of the critical group of less than the site release criterion, for example 0.25 mSv per year in the U.S. Utilities use computer dose modeling codes to establish an acceptable level of contamination, the derived concentration guideline level (DCGL), that will meet this regulatory limit. Since the DCGL value is the principal measure of residual radioactivity, it is critical to understand the technical basis of these dose modeling codes. This work provides an example of nuclear power plant decommissioning dose analysis in a probabilistic framework, focusing on the demonstration of regulatory compliance for surface soil contamination using the RESRAD 6.0 code. Both the screening and site-specific probabilistic dose analysis methodologies were examined. Example analyses performed with the screening probabilistic dose analysis confirmed the conservatism of the NRC screening values and indicated the effectiveness of probabilistic dose analysis in reducing the conservatism in DCGL derivation.

  8. bayesPop: Probabilistic Population Projections

    Directory of Open Access Journals (Sweden)

    Hana Ševčíková

    2016-12-01

    We describe bayesPop, an R package for producing probabilistic population projections for all countries. This uses probabilistic projections of total fertility and life expectancy generated by Bayesian hierarchical models. It produces a sample from the joint posterior predictive distribution of future age- and sex-specific population counts, fertility rates and mortality rates, as well as future numbers of births and deaths. It provides graphical ways of summarizing this information, including trajectory plots and various kinds of probabilistic population pyramids. An expression language is introduced which allows the user to produce the predictive distribution of a wide variety of derived population quantities, such as the median age or the old age dependency ratio. The package produces aggregated projections for sets of countries, such as UN regions or trading blocs. The methodology has been used by the United Nations to produce their most recent official population projections for all countries, published in the World Population Prospects.

  9. bayesPop: Probabilistic Population Projections

    Science.gov (United States)

    Ševčíková, Hana; Raftery, Adrian E.

    2016-01-01

    We describe bayesPop, an R package for producing probabilistic population projections for all countries. This uses probabilistic projections of total fertility and life expectancy generated by Bayesian hierarchical models. It produces a sample from the joint posterior predictive distribution of future age- and sex-specific population counts, fertility rates and mortality rates, as well as future numbers of births and deaths. It provides graphical ways of summarizing this information, including trajectory plots and various kinds of probabilistic population pyramids. An expression language is introduced which allows the user to produce the predictive distribution of a wide variety of derived population quantities, such as the median age or the old age dependency ratio. The package produces aggregated projections for sets of countries, such as UN regions or trading blocs. The methodology has been used by the United Nations to produce their most recent official population projections for all countries, published in the World Population Prospects. PMID:28077933

  10. Uncertainty analysis on probabilistic fracture mechanics assessment methodology

    International Nuclear Information System (INIS)

    Rastogi, Rohit; Vinod, Gopika; Chandra, Vikas; Bhasin, Vivek; Babar, A.K.; Rao, V.V.S.S.; Vaze, K.K.; Kushwaha, H.S.; Venkat-Raj, V.

    1999-01-01

    Fracture mechanics has found profound usage in the areas of component design and of assessing fitness for purpose/residual life estimation of operating components. Since defect sizes and material properties are statistically distributed, various probabilistic approaches have been employed for the computation of fracture probability. Monte Carlo simulation is one such procedure for the analysis of fracture probability. This paper deals with uncertainty analysis using Monte Carlo simulation methods. These methods were developed based on the R6 failure assessment procedure, which has been widely used in analysing the integrity of structures. The application of this method is illustrated with a case study. (author)
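
    A minimal Monte Carlo sketch in the spirit of an R6-type assessment: sample the statistically distributed defect size and toughness, and count the fraction of samples whose stress intensity factor exceeds the toughness. The distributions, stress level and K_I model are assumptions for illustration, not the paper's inputs.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200_000
# Sample the statistically distributed inputs (all values illustrative).
a = rng.lognormal(np.log(0.010), 0.4, n)        # crack depth (m)
Kic = rng.normal(120.0, 15.0, n)                # fracture toughness (MPa*sqrt(m))
stress, Y = 400.0, 1.12                         # applied stress (MPa), geometry factor

Ki = Y * stress * np.sqrt(np.pi * a)            # stress intensity factor
fail = Ki > Kic                                 # brittle-fracture criterion
pf = fail.mean()
print(f"failure probability ~ {pf:.3e} "
      f"(MC std err {np.sqrt(pf * (1 - pf) / n):.1e})")
```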

  11. A novel soft tissue prediction methodology for orthognathic surgery based on probabilistic finite element modelling.

    Science.gov (United States)

    Knoops, Paul G M; Borghi, Alessandro; Ruggiero, Federica; Badiali, Giovanni; Bianchi, Alberto; Marchetti, Claudio; Rodriguez-Florez, Naiara; Breakey, Richard W F; Jeelani, Owase; Dunaway, David J; Schievano, Silvia

    2018-01-01

    Repositioning of the maxilla in orthognathic surgery is carried out for functional and aesthetic purposes. Pre-surgical planning tools can predict 3D facial appearance by computing the response of the soft tissue to the changes to the underlying skeleton. The clinical use of commercial prediction software remains controversial, likely due to the deterministic nature of these computational predictions. A novel probabilistic finite element model (FEM) for the prediction of postoperative facial soft tissues is proposed in this paper. The probabilistic FEM was developed and validated on a cohort of eight patients who underwent maxillary repositioning and had pre- and postoperative cone beam computed tomography (CBCT) scans taken. Firstly, a variable correlation analysis assessed the various modelling parameters. Secondly, a design of experiments (DOE) provided a range of potential outcomes based on uniformly distributed input parameters, followed by an optimisation. Lastly, the second DOE iteration provided optimised predictions with a probability range. A range of 3D predictions was obtained using the probabilistic FEM and validated using reconstructed soft tissue surfaces from the postoperative CBCT data. The predictions in the nose and upper lip areas accurately include the true postoperative position, whereas the prediction under-estimates the position of the cheeks and lower lip. A probabilistic FEM has been developed and validated for the prediction of facial appearance following orthognathic surgery. This method shows how inaccuracies in the modelling and uncertainties in executing the surgical plan influence the soft tissue prediction, and it provides a range of predictions including a minimum and maximum, which may be helpful for patients in understanding the impact of surgery on the face.

  12. BWR recirculation pump diagnostic expert system

    International Nuclear Information System (INIS)

    Chiang, S.C.; Morimoto, C.N.; Torres, M.R.

    2004-01-01

    At General Electric (GE), an on-line expert system to support maintenance decisions for BWR recirculation pumps in nuclear power plants has been developed. This diagnostic expert system is an interactive on-line system that furnishes diagnostic information concerning BWR recirculation pump operational problems, effectively providing recirculation pump diagnostic expertise in the plant control room continuously, 24 hours a day. The expert system is interfaced to an on-line monitoring system, which uses existing plant sensors to acquire non-safety-related data in real time. The expert system correlates and evaluates process data and vibration data by applying knowledge-based expert rules to determine the condition of a BWR recirculation pump system. Any diagnosis is automatically displayed, indicating which pump may have a problem, the category of the problem, and the degree of concern expressed by a validity index and colour hierarchy. The rules incorporate expert knowledge from various technical sources such as plant experience, engineering principles, and published reports. These rules are installed in IF-THEN format, and the resulting truth values are expressed in fuzzy terms with a certainty factor called a validity index. The GE Recirculation Pump Expert System uses industry-standard software, hardware, and network access to provide flexible interfaces with other possible data acquisition systems. The Gensym G2 Real-Time Expert System is used as the expert shell and provides the graphical user interface, knowledge base, and inference engine capabilities. (author)

  13. Probabilistic population aging

    Science.gov (United States)

    2017-01-01

    We merge two methodologies: prospective measures of population aging and probabilistic population forecasts. We compare the speed of change and variability in forecasts of the old-age dependency ratio and the prospective old-age dependency ratio, and make the same comparison for the median age and the prospective median age. While conventional measures of population aging are computed on the basis of the number of years people have already lived, prospective measures are computed also taking account of the expected number of years they have left to live. Those remaining life expectancies change over time and differ from place to place. We compare the probabilistic distributions of the conventional and prospective measures using examples from China, Germany, Iran, and the United States. The changes over time and the variability of the prospective indicators are smaller than those observed in the conventional ones. A wide variety of new results emerge from the combination of methodologies. For example, for Germany, Iran, and the United States the likelihood that the prospective median age of the population in 2098 will be lower than it is today is close to 100 percent. PMID:28636675

  14. Probabilistic reasoning for assembly-based 3D modeling

    KAUST Repository

    Chaudhuri, Siddhartha

    2011-01-01

    Assembly-based modeling is a promising approach to broadening the accessibility of 3D modeling. In assembly-based modeling, new models are assembled from shape components extracted from a database. A key challenge in assembly-based modeling is the identification of relevant components to be presented to the user. In this paper, we introduce a probabilistic reasoning approach to this problem. Given a repository of shapes, our approach learns a probabilistic graphical model that encodes semantic and geometric relationships among shape components. The probabilistic model is used to present components that are semantically and stylistically compatible with the 3D model that is being assembled. Our experiments indicate that the probabilistic model increases the relevance of presented components. © 2011 ACM.

  15. Plasma diagnostics on large tokamaks

    International Nuclear Information System (INIS)

    Orlinskij, D.V.; Magyar, G.

    1988-01-01

    The main tasks of the large tokamaks which are under construction (T-15 and Tore Supra) and of those which have already been built (TFTR, JET, JT-60 and DIII-D) together with their design features which are relevant to plasma diagnostics are briefly discussed. The structural features and principal characteristics of the diagnostic systems being developed or already being used on these devices are also examined. The different diagnostic methods are described according to the physical quantities to be measured: electric and magnetic diagnostics, measurements of electron density, electron temperature, the ion components of the plasma, radiation loss measurements, spectroscopy of impurities, edge diagnostics and study of plasma stability. The main parameters of the various diagnostic systems used on the six large tokamaks are summarized in tables. (author). 351 refs, 44 figs, 22 tabs

  16. Laser-induced Breakdown spectroscopy quantitative analysis method via adaptive analytical line selection and relevance vector machine regression model

    International Nuclear Information System (INIS)

    Yang, Jianhong; Yi, Cancan; Xu, Jinwu; Ma, Xianghong

    2015-01-01

    A new LIBS quantitative analysis method based on adaptive analytical line selection and a Relevance Vector Machine (RVM) regression model is proposed. First, a scheme for adaptively selecting analytical lines is put forward in order to overcome the drawback of high dependency on a priori knowledge. Candidate analytical lines are automatically selected based on the built-in characteristics of the spectral lines, such as spectral intensity, wavelength and width at half height. The analytical lines to be used as input variables of the regression model are determined adaptively according to the samples for both training and testing. Second, an LIBS quantitative analysis method based on RVM is presented. The intensities of the analytical lines and the elemental concentrations of certified standard samples are used to train the RVM regression model. The predicted elemental concentrations are given in the form of a confidence interval of a probabilistic distribution, which is helpful for evaluating the uncertainty contained in the measured spectra. Chromium concentration analysis experiments on 23 certified standard high-alloy steel samples were carried out. The multiple correlation coefficient of the prediction was up to 98.85%, and the average relative error of the prediction was 4.01%. The experimental results showed that the proposed LIBS quantitative analysis method achieves better prediction accuracy and modeling robustness than methods based on partial least squares regression, artificial neural networks and the standard support vector machine. - Highlights: • Both training and testing samples are considered for analytical line selection. • The analytical lines are auto-selected based on the built-in characteristics of the spectral lines. • The new method achieves better prediction accuracy and modeling robustness. • Model predictions are given with a confidence interval of a probabilistic distribution.
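
    scikit-learn ships no relevance vector machine, so the sketch below substitutes `sklearn.linear_model.BayesianRidge`, which likewise returns a predictive mean with a standard deviation, to mimic the concentration-with-confidence-interval output described above. The line intensities and concentrations are mock data, not the paper's measurements.

```python
import numpy as np
from sklearn.linear_model import BayesianRidge

rng = np.random.default_rng(6)
# Mock training set: intensities of 5 auto-selected analytical lines for
# 23 reference samples vs certified Cr concentration (wt%). Invented data.
n_samples, n_lines = 23, 5
X = rng.uniform(0.1, 1.0, (n_samples, n_lines))
w_true = np.array([2.0, 0.5, 1.0, 0.2, 0.8])
y = X @ w_true + rng.normal(0.0, 0.05, n_samples)   # "certified" Cr wt%

model = BayesianRidge().fit(X, y)
x_new = rng.uniform(0.1, 1.0, (1, n_lines))
mean, std = model.predict(x_new, return_std=True)
print(f"predicted Cr = {mean[0]:.2f} +/- {1.96 * std[0]:.2f} wt% (95% interval)")
```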

  17. Probabilistic structural analysis methods for select space propulsion system components

    Science.gov (United States)

    Millwater, H. R.; Cruse, T. A.

    1989-01-01

    The Probabilistic Structural Analysis Methods (PSAM) project developed at the Southwest Research Institute integrates state-of-the-art structural analysis techniques with probability theory for the design and analysis of complex large-scale engineering structures. An advanced efficient software system (NESSUS) capable of performing complex probabilistic analysis has been developed. NESSUS contains a number of software components to perform probabilistic analysis of structures. These components include: an expert system, a probabilistic finite element code, a probabilistic boundary element code and a fast probability integrator. The NESSUS software system is shown. An expert system is included to capture and utilize PSAM knowledge and experience. NESSUS/EXPERT is an interactive menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator (FPI). The expert system menu structure is summarized. The NESSUS system contains a state-of-the-art nonlinear probabilistic finite element code, NESSUS/FEM, to determine the structural response and sensitivities. A broad range of analysis capabilities and an extensive element library is present.
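
    The FPI algorithm is not detailed in this record, but the first-order reliability computation such integrators speed up can be shown in a few lines: for a linear limit state g = R - S with independent normal strength and load, the reliability index and failure probability follow in closed form (numbers below are illustrative, not from the report):

      from scipy.stats import norm

      # First-order reliability sketch for a linear limit state g = R - S,
      # with independent normal resistance R and load S (illustrative values).
      mu_R, sigma_R = 500.0, 40.0   # strength, MPa
      mu_S, sigma_S = 350.0, 55.0   # stress, MPa

      beta = (mu_R - mu_S) / (sigma_R**2 + sigma_S**2) ** 0.5  # reliability index
      pf = norm.cdf(-beta)                                     # failure probability
      print(f"beta = {beta:.2f}, P(failure) ~ {pf:.2e}")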

  18. On-line evaluation of position-sensitive detector (PSD) diffraction data

    International Nuclear Information System (INIS)

    Stansfield, R.F.D.; McIntyre, G.J.

    1985-01-01

    The amount of raw data accumulated in a single-crystal diffraction experiment using a two-dimensional Position Sensitive Detector is usually so large that it is impracticable to store it. It is therefore necessary to reduce each local three-dimensional array of counts to a Bragg intensity, in a time not longer than the average time that one reflection is active. The statistically optimum procedure comprises an estimation of the background from a large number of counts, and an integration of peak intensity within a suitable three-dimensional envelope. A typical on-line method is described, using as an example the D19 diffractometer at the Institut Max von Laue - Paul Langevin (ILL) high-flux reactor. Current methods of PSD data reduction are reviewed. These fall into three groups according to the basis of the method used to find the integration envelope: (a) statistical criteria, (b) three-dimensional sigma(I)/I analysis, and (c) pre-calculation of the resolution function. On-line data reduction imposes special requirements on diagnostics to check the precision of the reduced data, especially at the start of an experiment, when any peculiarities must be identified and allowed for in the data-reduction procedure. The diagnostic possibilities resulting from the comparison of local with global characteristics of the background and the integration envelope are discussed. (author)
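
    A toy version of the background-plus-envelope integration step (not the D19 software): estimate the per-voxel background from the many counts outside a fixed envelope, integrate the peak inside it, and report sigma(I)/I as an on-line diagnostic. All counts below are synthetic.

      import numpy as np

      rng = np.random.default_rng(1)
      # Toy 3D detector array around one reflection: flat background + peak.
      shape = (15, 15, 15)
      counts = rng.poisson(5.0, shape).astype(float)
      zz, yy, xx = np.indices(shape)
      r2 = (zz - 7) ** 2 + (yy - 7) ** 2 + (xx - 7) ** 2
      counts += rng.poisson(200.0 * np.exp(-r2 / 8.0))

      peak = r2 <= 25          # integration envelope (here a simple sphere)
      bkg = ~peak              # background region: everything outside

      b_per_voxel = counts[bkg].mean()                 # background estimate
      I = counts[peak].sum() - b_per_voxel * peak.sum()
      # Poisson variance: peak counts plus scaled background uncertainty
      var = counts[peak].sum() + (peak.sum() ** 2 / bkg.sum()) * b_per_voxel
      print(f"I = {I:.0f}, sigma(I)/I = {np.sqrt(var) / I:.3f}")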

  19. Automatic Probabilistic Program Verification through Random Variable Abstraction

    Directory of Open Access Journals (Sweden)

    Damián Barsotti

    2010-06-01

    The weakest pre-expectation calculus has been proved to be a mature theory for analyzing quantitative properties of probabilistic and nondeterministic programs. We present an automatic method for proving quantitative linear properties on any denumerable state space using iterative backwards fixed point calculation in the general framework of abstract interpretation. In order to accomplish this task we present the technique of random variable abstraction (RVA), and we also postulate a sufficient condition to achieve exact fixed point computation in the abstract domain. The feasibility of our approach is shown with two examples, one obtaining the expected running time of a probabilistic program, and the other the expected gain of a gambling strategy. Our method works on general guarded probabilistic and nondeterministic transition systems instead of plain pGCL programs, allowing us to easily model a wide range of systems, including distributed ones and unstructured programs. We present the operational and weakest precondition semantics for these programs and prove their equivalence.
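
    As a toy illustration (not the paper's RVA machinery), the expected running time of a one-line probabilistic loop can be computed by exactly the kind of iterative backwards fixed-point calculation the abstract describes; the program and probability below are invented for the example.

      # Expected running time of "while flip(p): skip" by Kleene iteration of
      # the backwards fixed-point equation E = 1 + p * E (cost 1 per iteration).
      p = 0.75
      E = 0.0
      for _ in range(200):     # iterate towards the least fixed point
          E = 1.0 + p * E
      print(E, "expected iterations; closed form gives", 1.0 / (1.0 - p))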

  20. Probabilistic Model for Fatigue Crack Growth in Welded Bridge Details

    DEFF Research Database (Denmark)

    Toft, Henrik Stensgaard; Sørensen, John Dalsgaard; Yalamas, Thierry

    2013-01-01

    In the present paper a probabilistic model for fatigue crack growth in welded steel details in road bridges is presented. The probabilistic model takes the influence of bending stresses in the joints into account. The bending stresses can either be introduced by e.g. misalignment or redistribution...... of stresses in the structure. The fatigue stress ranges are estimated from traffic measurements and a generic bridge model. Based on the probabilistic models for the resistance and load the reliability is estimated for a typical welded steel detail. The results show that large misalignments in the joints can...

  1. A linear process-algebraic format for probabilistic systems with data

    NARCIS (Netherlands)

    Katoen, Joost P.; van de Pol, Jan Cornelis; Stoelinga, Mariëlle Ida Antoinette; Timmer, Mark; Gomes, L.; Khomenko, V.; Fernandes, J.M.

    This paper presents a novel linear process algebraic format for probabilistic automata. The key ingredient is a symbolic transformation of probabilistic process algebra terms that incorporate data into this linear format while preserving strong probabilistic bisimulation. This generalises similar

  2. Spectroscopic Diagnostics of Solar Magnetic Flux Ropes Using Iron Forbidden Line

    Science.gov (United States)

    Cheng, X.; Ding, M. D.

    2016-05-01

    In this Letter, we present Interface Region Imaging Spectrograph Fe xxi 1354.08 Å forbidden line emission of two magnetic flux ropes (MFRs) that caused two fast coronal mass ejections with velocities of ≥1000 km s⁻¹ and strong flares (X1.6 and M6.5) on 2014 September 10 and 2015 June 22, respectively. The extreme-ultraviolet images at the 131 and 94 Å passbands provided by the Atmospheric Imaging Assembly on board the Solar Dynamics Observatory reveal that both MFRs initially appear as suspended hot channel-like structures. Interestingly, part of the MFRs is also visible in the Fe xxi 1354.08 forbidden line, even prior to the eruption, e.g., for the SOL2014-09-10 event. However, the line emission is very weak and appears only at a few locations, not over the whole structure of the MFRs. This implies that the MFRs could be comprised of different threads with different temperatures and densities, given that the formation of the Fe xxi forbidden line requires a critical temperature (~11.5 MK) and density. Moreover, the line shows a non-thermal broadening and a blueshift in the early phase. This suggests that magnetic reconnection has been initiated by that time; it not only heats the MFR and, at the same time, produces the non-thermal broadening of the Fe xxi line, but also produces the poloidal flux, leading to the ascent of the MFRs.

  3. SPECTROSCOPIC DIAGNOSTICS OF SOLAR MAGNETIC FLUX ROPES USING IRON FORBIDDEN LINE

    International Nuclear Information System (INIS)

    Cheng, X.; Ding, M. D.

    2016-01-01

    In this Letter, we present Interface Region Imaging Spectrograph Fe xxi 1354.08 Å forbidden line emission of two magnetic flux ropes (MFRs) that caused two fast coronal mass ejections with velocities of ≥1000 km s⁻¹ and strong flares (X1.6 and M6.5) on 2014 September 10 and 2015 June 22, respectively. The extreme-ultraviolet images at the 131 and 94 Å passbands provided by the Atmospheric Imaging Assembly on board the Solar Dynamics Observatory reveal that both MFRs initially appear as suspended hot channel-like structures. Interestingly, part of the MFRs is also visible in the Fe xxi 1354.08 forbidden line, even prior to the eruption, e.g., for the SOL2014-09-10 event. However, the line emission is very weak and appears only at a few locations, not over the whole structure of the MFRs. This implies that the MFRs could be comprised of different threads with different temperatures and densities, given that the formation of the Fe xxi forbidden line requires a critical temperature (~11.5 MK) and density. Moreover, the line shows a non-thermal broadening and a blueshift in the early phase. This suggests that magnetic reconnection has been initiated by that time; it not only heats the MFR and, at the same time, produces the non-thermal broadening of the Fe xxi line, but also produces the poloidal flux, leading to the ascent of the MFRs.

  4. SPECTROSCOPIC DIAGNOSTICS OF SOLAR MAGNETIC FLUX ROPES USING IRON FORBIDDEN LINE

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, X.; Ding, M. D., E-mail: xincheng@nju.edu.cn [School of Astronomy and Space Science, Nanjing University, Nanjing 210093 (China)

    2016-05-20

    In this Letter, we present Interface Region Imaging Spectrograph Fe xxi 1354.08 Å forbidden line emission of two magnetic flux ropes (MFRs) that caused two fast coronal mass ejections with velocities of ≥1000 km s⁻¹ and strong flares (X1.6 and M6.5) on 2014 September 10 and 2015 June 22, respectively. The extreme-ultraviolet images at the 131 and 94 Å passbands provided by the Atmospheric Imaging Assembly on board the Solar Dynamics Observatory reveal that both MFRs initially appear as suspended hot channel-like structures. Interestingly, part of the MFRs is also visible in the Fe xxi 1354.08 forbidden line, even prior to the eruption, e.g., for the SOL2014-09-10 event. However, the line emission is very weak and appears only at a few locations, not over the whole structure of the MFRs. This implies that the MFRs could be comprised of different threads with different temperatures and densities, given that the formation of the Fe xxi forbidden line requires a critical temperature (~11.5 MK) and density. Moreover, the line shows a non-thermal broadening and a blueshift in the early phase. This suggests that magnetic reconnection has been initiated by that time; it not only heats the MFR and, at the same time, produces the non-thermal broadening of the Fe xxi line, but also produces the poloidal flux, leading to the ascent of the MFRs.

  5. Chiefly Symmetric: Results on the Scalability of Probabilistic Model Checking for Operating-System Code

    Directory of Open Access Journals (Sweden)

    Marcus Völp

    2012-11-01

    Reliability in terms of functional properties from the safety-liveness spectrum is an indispensable requirement of low-level operating-system (OS) code. However, with ever more complex and thus less predictable hardware, quantitative and probabilistic guarantees become more and more important. Probabilistic model checking is one technique to automatically obtain these guarantees. First experiences with the automated quantitative analysis of low-level operating-system code confirm the expectation that the naive probabilistic model checking approach rapidly reaches its limits when increasing the number of processes. This paper reports on our work in progress to tackle the state explosion problem for low-level OS code caused by the exponential blow-up of the model size when the number of processes grows. We studied the symmetry reduction approach and carried out our experiments with a simple test-and-test-and-set lock case study as a representative example for a wide range of protocols with natural inter-process dependencies and long-run properties. We quickly see a state-space explosion for scenarios where inter-process dependencies are insignificant. However, once inter-process dependencies dominate the picture, models with a hundred or more processes can be constructed and analysed.

  6. A study on the weather sampling method for probabilistic consequence analysis

    International Nuclear Information System (INIS)

    Oh, Hae Cheol

    1996-02-01

    The main task of a probabilistic accident consequence analysis model is to predict the radiological situation and to provide a reliable quantitative data base for making decisions on countermeasures. The magnitude of an accident consequence depends on the characteristics of the accident and the coincident weather. In probabilistic accident consequence analysis, it is necessary to repeat the atmospheric dispersion calculation with several hundred weather sequences to predict the full distribution of consequences which may occur following a postulated accidental release. It is desirable to select a representative sample of weather sequences from a meteorological record which is typical of the area over which the released radionuclides will disperse and which spans a sufficiently long period. The selection is done by means of sampling techniques from a full year of hourly weather data characteristic of the plant site. In this study, the proposed weighted importance sampling method selects weather sequences in proportion to each bin size, to closely approximate the true frequency distribution of weather conditions at the site. The weighted importance sampling method results in substantially less sampling uncertainty than the previous technique, and can thus give improved confidence in risk estimates
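
    A compact sketch of bin-proportional ("weighted") sampling of weather sequences, with hypothetical bin labels standing in for real stability/wind/rain classes:

      import numpy as np

      rng = np.random.default_rng(2)
      # One year of hourly start times, each assigned to a weather bin
      # (a bin = combination of stability class, wind speed, rain, ...).
      hours = 8760
      bins = rng.integers(0, 20, size=hours)       # hypothetical bin labels

      n_samples = 200
      counts = np.bincount(bins, minlength=20)
      # Allocate samples to bins proportional to bin size, so the sample
      # reproduces the true frequency of weather conditions at the site.
      alloc = np.maximum(1, np.round(n_samples * counts / hours).astype(int))
      sample = np.concatenate([
          rng.choice(np.flatnonzero(bins == b), size=min(a, counts[b]),
                     replace=False)
          for b, a in enumerate(alloc) if counts[b] > 0
      ])
      print(len(sample), "weather sequences selected")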

  7. OCA-P, PWR Vessel Probabilistic Fracture Mechanics

    International Nuclear Information System (INIS)

    Cheverton, R.D.; Ball, D.G.

    2001-01-01

    1 - Description of program or function: OCA-P is a probabilistic fracture-mechanics code prepared specifically for evaluating the integrity of pressurized-water reactor vessels subjected to overcooling-accident loading conditions. Based on linear-elastic fracture mechanics, it has two- and limited three-dimensional flaw capability, and can treat cladding as a discrete region. Both deterministic and probabilistic analyses can be performed. For deterministic analysis, it is possible to conduct a search for critical values of the fluence and the nil-ductility reference temperature corresponding to incipient initiation of the initial flaw. The probabilistic portion of OCA-P is based on Monte Carlo techniques, and simulated parameters include fluence, flaw depth, fracture toughness, nil-ductility reference temperature, and concentrations of copper, nickel, and phosphorous. Plotting capabilities include the construction of critical-crack-depth diagrams (deterministic analysis) and a variety of histograms (probabilistic analysis). 2 - Method of solution: OCA-P accepts as input the reactor primary-system pressure and the reactor pressure-vessel downcomer coolant temperature, as functions of time in the specified transient. Then, the wall temperatures and stresses are calculated as a function of time and radial position in the wall, and the fracture-mechanics analysis is performed to obtain the stress intensity factors as a function of crack depth and time in the transient. In a deterministic analysis, values of the static crack initiation toughness and the crack arrest toughness are also calculated for all crack depths and times in the transient. A comparison of these values permits an evaluation of flaw behavior. For a probabilistic analysis, OCA-P generates a large number of reactor pressure vessels, each with a different combination of the various values of the parameters involved in the analysis of flaw behavior. For each of these vessels, a deterministic fracture-mechanics analysis is performed.
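
    A heavily simplified sketch of the probabilistic branch described above: simulate many vessels, compare the applied stress intensity with a simulated initiation toughness, and count initiations. The distributions, toughness curve, and numbers are all invented for illustration and are not OCA-P's models.

      import numpy as np

      rng = np.random.default_rng(3)
      n = 100_000
      # Illustrative distributions only -- not OCA-P's actual correlations.
      K_I = rng.normal(55.0, 8.0, n)       # applied stress intensity, MPa*sqrt(m)
      RT_ndt = rng.normal(60.0, 15.0, n)   # simulated nil-ductility ref. temp., degC
      T_tip = 90.0                         # crack-tip temperature at this time
      # Toy initiation-toughness curve rising with (T - RT_NDT), capped:
      K_Ic = np.minimum(40.0 + 20.0 * np.exp(0.03 * (T_tip - RT_ndt)), 220.0)

      p_init = (K_I > K_Ic).mean()   # fraction of simulated vessels that initiate
      print(f"conditional P(flaw initiation) ~ {p_init:.4f}")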

  8. Formulation of probabilistic models of protein structure in atomic detail using the reference ratio method

    DEFF Research Database (Denmark)

    Valentin, Jan B.; Andreetta, Christian; Boomsma, Wouter

    2014-01-01

    We propose a method to formulate probabilistic models of protein structure in atomic detail, for a given amino acid sequence, based on Bayesian principles, while retaining a close link to physics. We start from two previously developed probabilistic models of protein structure on a local length scale. ... The results indicate that the proposed method and the probabilistic models show considerable promise for probabilistic protein structure prediction and related applications. © 2013 Wiley Periodicals, Inc.

  9. Probabilistic assessment of dry transport with burnup credit

    International Nuclear Information System (INIS)

    Lake, W.H.

    2003-01-01

    The general concept of probabilistic analysis and its application to the use of burnup credit in spent fuel transport is explored. Discussion of the probabilistic analysis method is presented. The concepts of risk and its perception are introduced, and models are suggested for performing probability and risk estimates. The general probabilistic models are used for evaluating the application of burnup credit to dry spent nuclear fuel transport. Two basic cases are considered. The first addresses the question of the relative likelihood of exceeding an established criticality safety limit with and without burnup credit. The second examines the effect of using burnup credit on the overall risk for dry spent fuel transport. Using reasoned arguments and related failure probability and consequence data, an analysis is performed to estimate the risks of using burnup credit for the dry transport of spent nuclear fuel. (author)

  10. Biological sequence analysis: probabilistic models of proteins and nucleic acids

    National Research Council Canada - National Science Library

    Durbin, Richard

    1998-01-01

    ... analysis methods are now based on principles of probabilistic modelling. Examples of such methods include the use of probabilistically derived score matrices to determine the significance of sequence alignments, the use of hidden Markov models as the basis for profile searches to identify distant members of sequence families, and the inference...
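
    As a concrete example of the probabilistic machinery the book builds on, here is a minimal scaled forward algorithm for a two-state HMM over DNA; all parameters are invented for the example.

      import numpy as np

      states = ["AT-rich", "GC-rich"]              # hidden states (toy model)
      start = np.array([0.5, 0.5])
      trans = np.array([[0.9, 0.1],
                        [0.1, 0.9]])
      emit = np.array([[0.35, 0.15, 0.15, 0.35],   # emission P(A,C,G,T | state)
                       [0.15, 0.35, 0.35, 0.15]])
      idx = {c: i for i, c in enumerate("ACGT")}

      def loglik(seq):
          """log P(seq | model) via the scaled forward recursion."""
          alpha = start * emit[:, idx[seq[0]]]
          logp = np.log(alpha.sum())
          alpha = alpha / alpha.sum()
          for c in seq[1:]:
              alpha = (alpha @ trans) * emit[:, idx[c]]
              s = alpha.sum()          # rescale to avoid numerical underflow
              logp += np.log(s)
              alpha = alpha / s
          return logp

      print(loglik("ACGCGGCGTAT"))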

  11. Probabilistic thread algebra

    NARCIS (Netherlands)

    Bergstra, J.A.; Middelburg, C.A.

    2015-01-01

    We add probabilistic features to basic thread algebra and its extensions with thread-service interaction and strategic interleaving. Here, threads represent the behaviours produced by instruction sequences under execution and services represent the behaviours exhibited by the components of execution

  12. Growing hierarchical probabilistic self-organizing graphs.

    Science.gov (United States)

    López-Rubio, Ezequiel; Palomo, Esteban José

    2011-07-01

    Since the introduction of the growing hierarchical self-organizing map, much work has been done on self-organizing neural models with a dynamic structure. These models allow adjusting the layers of the model to the features of the input dataset. Here we propose a new self-organizing model which is based on a probabilistic mixture of multivariate Gaussian components. The learning rule is derived from the stochastic approximation framework, and a probabilistic criterion is used to control the growth of the model. Moreover, the model is able to adapt to the topology of each layer, so that a hierarchy of dynamic graphs is built. This overcomes the limitations of the self-organizing maps with a fixed topology, and gives rise to a faithful visualization method for high-dimensional data.
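
    The paper's growth criterion is its own, but the general pattern, growing a Gaussian mixture until a probabilistic criterion stops improving, can be sketched with scikit-learn's GaussianMixture and BIC as a stand-in criterion:

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(4)
      X = np.vstack([rng.normal(0, 1, (200, 2)),
                     rng.normal(5, 1, (200, 2))])

      # Grow the mixture one component at a time; stop when the probabilistic
      # criterion (here BIC, standing in for the paper's criterion) worsens.
      best, best_bic = None, np.inf
      for k in range(1, 8):
          gm = GaussianMixture(n_components=k, random_state=0).fit(X)
          bic = gm.bic(X)
          if bic >= best_bic:
              break
          best, best_bic = gm, bic
      print(f"selected {best.n_components} components (BIC = {best_bic:.1f})")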

  13. Review of the Brunswick Steam Electric Plant Probabilistic Risk Assessment

    International Nuclear Information System (INIS)

    Sattison, M.B.; Davis, P.R.; Satterwhite, D.G.; Gilmore, W.E.; Gregg, R.E.

    1989-11-01

    A review of the Brunswick Steam Electric Plant Probabilistic Risk Assessment was conducted with the objective of confirming the safety perspectives brought to light by the probabilistic risk assessment. The scope of the review included the entire Level I probabilistic risk assessment, including external events, which is consistent with the scope of the probabilistic risk assessment itself. The review included an assessment of the assumptions, methods, models, and data used in the study. 47 refs., 14 figs., 15 tabs

  14. Deliverable D74.2. Probabilistic analysis methods for support structures

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    2018-01-01

    Relevant Description: Report describing the probabilistic analysis for offshore substructures and results attained. This includes comparison with experimental data and with conventional design. Specific targets: 1) Estimate current reliability level of support structures 2) Development of basis...... for probabilistic calculations and evaluation of reliability for offshore support structures (substructures) 3) Development of a probabilistic model for stiffness and strength of soil parameters and for modeling geotechnical load bearing capacity 4) Comparison between probabilistic analysis and deterministic...

  15. Probabilistic inductive inference: a survey

    OpenAIRE

    Ambainis, Andris

    2001-01-01

    Inductive inference is a recursion-theoretic theory of learning, first developed by E. M. Gold (1967). This paper surveys developments in probabilistic inductive inference. We mainly focus on finite inference of recursive functions, since this simple paradigm has produced the most interesting (and most complex) results.

  16. Making Probabilistic Relational Categories Learnable

    Science.gov (United States)

    Jung, Wookyoung; Hummel, John E.

    2015-01-01

    Theories of relational concept acquisition (e.g., schema induction) based on structured intersection discovery predict that relational concepts with a probabilistic (i.e., family resemblance) structure ought to be extremely difficult to learn. We report four experiments testing this prediction by investigating conditions hypothesized to facilitate…

  17. A History of Probabilistic Inductive Logic Programming

    Directory of Open Access Journals (Sweden)

    Fabrizio eRiguzzi

    2014-09-01

    The field of Probabilistic Logic Programming (PLP) has seen significant advances in the last 20 years, with many proposals for languages that combine probability with logic programming. Since the start, the problem of learning probabilistic logic programs has been the focus of much attention. Learning these programs represents a whole subfield of Inductive Logic Programming (ILP). In Probabilistic ILP (PILP) two problems are considered: learning the parameters of a program given the structure (the rules), and learning both the structure and the parameters. Usually structure learning systems use parameter learning as a subroutine. In this article we present an overview of PILP and discuss the main results.

  18. Database of emission lines

    Science.gov (United States)

    Binette, L.; Ortiz, P.; Joguet, B.; Rola, C.

    1998-11-01

    A widely accessible data bank (available through Netscape), consisting of all (or most) of the emission lines reported in the literature, is being built. It will comprise objects as diverse as HII regions, PN, AGN, HHO. One of its uses will be to define/refine existing diagnostic emission line diagrams.

  19. Probabilistic assessment of faults

    International Nuclear Information System (INIS)

    Foden, R.W.

    1987-01-01

    Probabilistic safety analysis (PSA) is the process by which the probability (or frequency of occurrence) of reactor fault conditions which could lead to unacceptable consequences is assessed. The basic objective of a PSA is to allow a judgement to be made as to whether or not the principal probabilistic requirement is satisfied. It also gives insights into the reliability of the plant which can be used to identify possible improvements. This is explained in the article. The scope of a PSA and the PSA performed by the National Nuclear Corporation (NNC) for the Heysham II and Torness AGRs and Sizewell-B PWR are discussed. The NNC methods for hazards, common cause failure and operator error are mentioned. (UK)

  20. Probabilistic coding of quantum states

    International Nuclear Information System (INIS)

    Grudka, Andrzej; Wojcik, Antoni; Czechlewski, Mikolaj

    2006-01-01

    We discuss the properties of probabilistic coding of two qubits to one qutrit and generalize the scheme to higher dimensions. We show that the protocol preserves the entanglement between the qubits to be encoded and the environment and can also be applied to mixed states. We present a protocol that enables encoding of n qudits to one qudit of dimension smaller than the Hilbert space of the original system and then allows probabilistic but error-free decoding of any subset of k qudits. We give a formula for the probability of successful decoding

  1. Application of probabilistic precipitation forecasts from a ...

    African Journals Online (AJOL)

    2014-02-14

    Feb 14, 2014 ... Application of probabilistic precipitation forecasts from a deterministic model ... aim of this paper is to investigate the increase in the lead-time of flash flood warnings of the SAFFG using probabilistic precipitation forecasts ... The procedure is applied to a real flash flood event and the ensemble-based.

  2. Probabilistic seasonal Forecasts to deterministic Farm Leve Decisions: Innovative Approach

    Science.gov (United States)

    Mwangi, M. W.

    2015-12-01

    Climate change and vulnerability are major challenges in ensuring household food security. Climate information services have the potential to cushion rural households from extreme climate risks. However, the probabilistic nature of climate information products is not easily understood by the majority of smallholder farmers. Despite their probabilistic nature, climate information services have proved to be a valuable climate risk adaptation strategy at the farm level. This calls for innovative ways to help farmers understand and apply climate information services to inform their farm-level decisions. The study endeavored to co-design and test appropriate innovation systems for climate information services uptake and scale-up necessary for achieving climate risk development. In addition, it determined the conditions necessary to support the effective performance of the proposed innovation system. Data and information sources included a systematic literature review, secondary sources, government statistics, focus group discussions, household surveys and semi-structured interviews. Data were analyzed using both quantitative and qualitative techniques. Quantitative data were analyzed using the Statistical Package for Social Sciences (SPSS) software. Qualitative data were analyzed using qualitative techniques, which involved establishing categories and themes, relationships/patterns and conclusions in line with the study objectives. Sustainable livelihoods, reduced household poverty and climate change resilience were the impacts that resulted from the study.

  3. Development and Implementation of a New HELIOS Diagnostic using a Fast Piezoelectric Valve on the Prototype Material Plasma Exposure eXperiment

    Science.gov (United States)

    Ray, Holly; Biewer, Theodore; Caneses, Juan; Green, Jonathan; Lindquist, Elizabeth; McQuown, Levon; Schmitz, Oliver

    2017-10-01

    A new helium line-ratio spectral monitoring (HELIOS) diagnostic is being implemented on Oak Ridge National Laboratory's (ORNL) Prototype Material Plasma Exposure eXperiment (Proto-MPEX), using a piezoelectric valve with high duty cycles (on/off times on the order of milliseconds), allowing for good background correction, and measured particle flow rates on the order of 10^20 particles/second. Built in collaboration with the University of Wisconsin - Madison, the HELIOS diagnostic communicates with a LabVIEW program for controlled bursts of helium into the vessel. The open magnetic geometry of Proto-MPEX is ideal for testing and characterizing a HELIOS diagnostic. The circular cross-section with four ports allows for cross-comparison between different diagnostics: 1) helium injection with the piezoelectric puff valve, 2) HELIOS line-of-sight high-gain observation, 3) a scannable double Langmuir probe, and 4) HELIOS 2D imaging observation. Electron density and temperature measurements from the various techniques will be compared. This work was supported by U.S. D.O.E. contracts DE-AC05-00OR22725 and DE-SC00013911.

  4. Progress on development of SPIDER diagnostics

    Science.gov (United States)

    Pasqualotto, R.; Agostini, M.; Barbisan, M.; Bernardi, M.; Brombin, M.; Cavazzana, R.; Croci, G.; Palma, M. Dalla; Delogu, R. S.; Gorini, G.; Lotto, L.; Muraro, A.; Peruzzo, S.; Pimazzoni, A.; Pomaro, N.; Rizzolo, A.; Serianni, G.; Spolaore, M.; Tardocchi, M.; Zaniol, B.; Zaupa, M.

    2017-08-01

    The SPIDER experiment, the full-size prototype of the beam source for the ITER heating neutral beam injector, has to demonstrate extraction and acceleration to 100 kV of a large negative hydrogen or deuterium ion beam with a co-extracted electron fraction e-/D- < 1. While the SPIDER plant systems are being installed, the different diagnostic systems are in the procurement phase. Their final design is described here with a focus on some key solutions and the most original and cost-effective implementations. Thermocouples used to measure the power load distribution in the source and over the beam dump front surface will be efficiently fixed with a proven technique and acquired through commercial and custom electronics. Spectroscopy needs to use well-collimated lines of sight and will employ spectrometers of novel design with higher efficiency and resolution, and filtered detectors with custom-built amplifiers. The electrostatic probes will be operated through electronics specifically developed to cope with the challenging environment of the RF source. The instrumented calorimeter STRIKE will use new CFC tiles, still under development. Two linear cameras, one built in house, have been tested as suitable for optical beam tomography. Some diagnostic components are off the shelf, others are custom developed: some of these are being prototyped or are under test before final production and installation, which will be completed before the start of SPIDER operation.

  5. An assessment of the acute dietary exposure to glyphosate using deterministic and probabilistic methods.

    Science.gov (United States)

    Stephenson, C L; Harris, C A; Clarke, R

    2018-02-01

    Use of glyphosate in crop production can lead to residues of the active substance and related metabolites in food. Glyphosate has never been considered acutely toxic; however, in 2015 the European Food Safety Authority (EFSA) proposed an acute reference dose (ARfD). This differs from the Joint FAO/WHO Meeting on Pesticide Residues (JMPR), which in 2016, in line with its existing position, concluded that an ARfD was not necessary for glyphosate. This paper makes a comprehensive assessment of short-term dietary exposure to glyphosate from potentially treated crops grown in the EU and imported third-country food sources. European Union and global deterministic models were used to make estimates of short-term dietary exposure (generally defined as up to 24 h). Estimates were refined using food-processing information, residues monitoring data, national dietary exposure models, and basic probabilistic approaches to estimating dietary exposure. Calculated exposure levels were compared to the ARfD, considered to be the amount of a substance that can be consumed in a single meal, or 24-h period, without appreciable health risk. Acute dietary intakes were well below the ARfD. Probabilistic exposure estimates showed that the acute intake on no person-days exceeded 10% of the ARfD, even for the pessimistic scenario.
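
    A basic probabilistic (Monte Carlo) exposure sketch of the kind alluded to above, with entirely hypothetical consumption and residue distributions; only the EFSA ARfD of 0.5 mg/kg bw is taken from the literature.

      import numpy as np

      rng = np.random.default_rng(5)
      n = 100_000
      # Hypothetical input distributions (illustrative, not the paper's data):
      consumption = rng.lognormal(mean=4.0, sigma=0.6, size=n)    # g commodity/day
      residue = rng.lognormal(mean=-1.5, sigma=0.8, size=n)       # mg glyphosate/kg
      bodyweight = rng.normal(70.0, 12.0, size=n).clip(40, None)  # kg

      intake = consumption / 1000.0 * residue / bodyweight        # mg/kg bw per day
      arfd = 0.5                                                  # EFSA ARfD, mg/kg bw
      print(f"99.9th percentile intake = {np.percentile(intake, 99.9):.4f} mg/kg bw; "
            f"{100.0 * np.mean(intake > arfd):.4f}% of person-days exceed the ARfD")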

  6. Relative risk of probabilistic category learning deficits in patients with schizophrenia and their siblings

    Science.gov (United States)

    Weickert, Thomas W.; Goldberg, Terry E.; Egan, Michael F.; Apud, Jose A.; Meeter, Martijn; Myers, Catherine E.; Gluck, Mark A; Weinberger, Daniel R.

    2010-01-01

    Background While patients with schizophrenia display an overall probabilistic category learning performance deficit, the extent to which this deficit occurs in unaffected siblings of patients with schizophrenia is unknown. There are also discrepant findings regarding probabilistic category learning acquisition rate and performance in patients with schizophrenia. Methods A probabilistic category learning test was administered to 108 patients with schizophrenia, 82 unaffected siblings, and 121 healthy participants. Results Patients with schizophrenia displayed significant differences from their unaffected siblings and healthy participants with respect to probabilistic category learning acquisition rates. Although siblings on the whole failed to differ from healthy participants on strategy and quantitative indices of overall performance and learning acquisition, application of a revised learning criterion enabling classification into good and poor learners based on individual learning curves revealed significant differences between percentages of sibling and healthy poor learners: healthy (13.2%), siblings (34.1%), patients (48.1%), yielding a moderate relative risk. Conclusions These results clarify previous discrepant findings pertaining to probabilistic category learning acquisition rate in schizophrenia and provide the first evidence for the relative risk of probabilistic category learning abnormalities in unaffected siblings of patients with schizophrenia, supporting genetic underpinnings of probabilistic category learning deficits in schizophrenia. These findings also raise questions regarding the contribution of antipsychotic medication to the probabilistic category learning deficit in schizophrenia. The distinction between good and poor learning may be used to inform genetic studies designed to detect schizophrenia risk alleles. PMID:20172502

  7. A new diagnostic accuracy measure and cut-point selection criterion.

    Science.gov (United States)

    Dong, Tuochuan; Attwood, Kristopher; Hutson, Alan; Liu, Song; Tian, Lili

    2017-12-01

    Most diagnostic accuracy measures and criteria for selecting optimal cut-points are only applicable to diseases with binary or three stages. Currently, there exist two diagnostic measures for diseases with general k stages: the hypervolume under the manifold and the generalized Youden index. While hypervolume under the manifold cannot be used for cut-points selection, generalized Youden index is only defined upon correct classification rates. This paper proposes a new measure named maximum absolute determinant for diseases with k stages ([Formula: see text]). This comprehensive new measure utilizes all the available classification information and serves as a cut-points selection criterion as well. Both the geometric and probabilistic interpretations for the new measure are examined. Power and simulation studies are carried out to investigate its performance as a measure of diagnostic accuracy as well as cut-points selection criterion. A real data set from Alzheimer's Disease Neuroimaging Initiative is analyzed using the proposed maximum absolute determinant.
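
    For a three-stage disease, a cut-point selection criterion such as the generalized Youden index can be evaluated by brute force; the sketch below uses synthetic biomarker data (the paper's maximum absolute determinant itself is not reproduced here).

      import numpy as np

      rng = np.random.default_rng(6)
      # Hypothetical biomarker values for a three-stage disease:
      x1 = rng.normal(0.0, 1.0, 300)   # healthy
      x2 = rng.normal(1.5, 1.0, 300)   # early stage
      x3 = rng.normal(3.0, 1.0, 300)   # late stage

      def gyi(c1, c2):
          """Generalized Youden index: sum of the three correct
          classification rates for cut-points c1 < c2 (up to scaling)."""
          return (np.mean(x1 < c1)
                  + np.mean((x2 >= c1) & (x2 < c2))
                  + np.mean(x3 >= c2))

      grid = np.linspace(-2, 5, 141)
      best = max((gyi(a, b), a, b) for a in grid for b in grid if a < b)
      print(f"GYI = {best[0]:.3f} at cut-points ({best[1]:.2f}, {best[2]:.2f})")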

  8. On-line defected fuel monitoring using GFP data

    International Nuclear Information System (INIS)

    Livingstone, S.; Lewis, B.J.

    2008-01-01

    This paper describes the initial development of an on-line defected fuel diagnostic tool. The tool is based on coolant activity, and uses a quantitative and qualitative approach from existing mechanistic fission product release models, and also empirical rules based on commercial and experimental experience. The model departs from the usual methodology of analyzing steady-state fission product coolant activities, and instead uses steady-state fission product release rates calculated from the transient coolant activity data. An example of real-time defected fuel analysis work is presented using a prototype of this tool with station data. The model is in an early developmental stage, and this paper demonstrates the promising potential of this technique. (author)
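
    The inference step the abstract describes, recovering release rates from transient coolant activity, reduces in the simplest single-nuclide balance to R(t) = dA/dt + (lambda + beta)A; here is a sketch with synthetic data and an assumed purification constant.

      import numpy as np

      lam = 1.0e-6      # I-131 decay constant, 1/s (half-life ~ 8 d)
      beta = 1.0e-5     # hypothetical coolant purification constant, 1/s
      t = np.linspace(0.0, 5.0e5, 500)                # s
      A = 1.0e4 * (1.0 - np.exp(-(lam + beta) * t))   # synthetic activity, Bq/kg

      # Single-nuclide coolant balance dA/dt = R - (lam + beta)*A, inverted for R:
      R = np.gradient(A, t) + (lam + beta) * A
      print(f"inferred release rate ~ {R[-1]:.3f} Bq/(kg*s) at steady state")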

  9. Undecidability of model-checking branching-time properties of stateless probabilistic pushdown process

    OpenAIRE

    Lin, T.

    2014-01-01

    In this paper, we settle a problem in the probabilistic verification of infinite-state processes (specifically, probabilistic pushdown processes). We show that model checking stateless probabilistic pushdown processes (pBPA) against probabilistic computational tree logic (PCTL) is undecidable.

  10. A probabilistic model-based soft sensor to monitor lactic acid bacteria fermentations

    DEFF Research Database (Denmark)

    Spann, Robert; Roca, Christophe; Kold, David

    2018-01-01

    A probabilistic soft sensor based on a mechanistic model was designed to monitor S. thermophilus fermentations, and validated with experimental lab-scale data. It considered uncertainties in the initial conditions, on-line measurements, and model parameters by performing Monte Carlo simulations within the monitoring system. It predicted, therefore, the probability distributions of the unmeasured states, such as biomass, lactose, and lactic acid concentrations. To this end, a mechanistic model was developed first, and a statistical parameter estimation was performed in order to assess the model parameters that were then used as input to the mechanistic model. The soft sensor predicted both the current state variables, as well as the future course of the fermentation, e.g. with a relative mean error of the biomass concentration of 8%. This successful implementation of a process...

  11. The role of probabilistic safety assessment and probabilistic safety criteria in nuclear power plant safety

    International Nuclear Information System (INIS)

    1992-01-01

    The purpose of this Safety Report is to provide guidelines on the role of probabilistic safety assessment (PSA) and a range of associated reference points, collectively referred to as probabilistic safety criteria (PSC), in nuclear safety. The application of this Safety Report and the supporting Safety Practice publication should help to ensure that PSA methodology is used appropriately to assess and enhance the safety of nuclear power plants. The guidelines are intended for use by nuclear power plant designers, operators and regulators. While these guidelines have been prepared with nuclear power plants in mind, the principles involved have wide application to other nuclear and non-nuclear facilities. In Section 2 of this Safety Report guidelines are established on the role PSA can play as part of an overall safety assurance programme. Section 3 summarizes guidelines for the conduct of PSAs, and in Section 4 a PSC framework is recommended and guidance is provided for the establishment of PSC values

  12. Quantitative probabilistic functional diffusion mapping in newly diagnosed glioblastoma treated with radiochemotherapy.

    Science.gov (United States)

    Ellingson, Benjamin M; Cloughesy, Timothy F; Lai, Albert; Nghiemphu, Phioanh L; Liau, Linda M; Pope, Whitney B

    2013-03-01

    Functional diffusion mapping (fDM) is a cancer imaging technique that uses voxel-wise changes in apparent diffusion coefficients (ADC) to evaluate response to treatment. Despite promising initial results, uncertainty in image registration remains the largest barrier to widespread clinical application. The current study introduces a probabilistic approach to fDM quantification to overcome some of these limitations. A total of 143 patients with newly diagnosed glioblastoma who were undergoing standard radiochemotherapy were enrolled in this retrospective study. Traditional and probabilistic fDMs were calculated using ADC maps acquired before and after therapy. Probabilistic fDMs were calculated by applying random, finite translational, and rotational perturbations to both pre-and posttherapy ADC maps, then repeating calculation of fDMs reflecting changes after treatment, resulting in probabilistic fDMs showing the voxel-wise probability of fDM classification. Probabilistic fDMs were then compared with traditional fDMs in their ability to predict progression-free survival (PFS) and overall survival (OS). Probabilistic fDMs applied to patients with newly diagnosed glioblastoma treated with radiochemotherapy demonstrated shortened PFS and OS among patients with a large volume of tumor with decreasing ADC evaluated at the posttreatment time with respect to the baseline scans. Alternatively, patients with a large volume of tumor with increasing ADC evaluated at the posttreatment time with respect to baseline scans were more likely to progress later and live longer. Probabilistic fDMs performed better than traditional fDMs at predicting 12-month PFS and 24-month OS with use of receiver-operator characteristic analysis. Univariate log-rank analysis on Kaplan-Meier data also revealed that probabilistic fDMs could better separate patients on the basis of PFS and OS, compared with traditional fDMs. Results suggest that probabilistic fDMs are a more predictive biomarker in
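
    A minimal sketch of the perturbation idea, using assumed noise levels and synthetic ADC maps rather than clinical data: re-apply the voxel-wise classification under many random sub-voxel registration shifts and accumulate a per-voxel classification probability.

      import numpy as np
      from scipy.ndimage import shift

      rng = np.random.default_rng(7)
      pre = rng.normal(1.2, 0.1, (64, 64))      # synthetic baseline ADC map
      post = pre.copy()
      post[20:30, 20:30] -= 0.3                 # region of decreasing ADC

      thr, n_pert = 0.1, 200
      votes = np.zeros_like(pre)
      for _ in range(n_pert):
          # Random sub-voxel translations stand in for registration uncertainty.
          p = shift(post, rng.normal(0.0, 0.5, size=2), order=1, mode="nearest")
          votes += (pre - p) > thr              # voxel classified "ADC decreased"
      prob_map = votes / n_pert                 # probabilistic fDM
      print(f"{int((prob_map > 0.9).sum())} voxels decrease with > 90% probability")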

  13. Fully probabilistic control for stochastic nonlinear control systems with input dependent noise.

    Science.gov (United States)

    Herzallah, Randa

    2015-03-01

    Robust controllers for nonlinear stochastic systems with functional uncertainties can be consistently designed using probabilistic control methods. In this paper a generalised probabilistic controller design for the minimisation of the Kullback-Leibler divergence between the actual joint probability density function (pdf) of the closed loop control system, and an ideal joint pdf is presented emphasising how the uncertainty can be systematically incorporated in the absence of reliable systems models. To achieve this objective all probabilistic models of the system are estimated from process data using mixture density networks (MDNs) where all the parameters of the estimated pdfs are taken to be state and control input dependent. Based on this dependency of the density parameters on the input values, explicit formulations to the construction of optimal generalised probabilistic controllers are obtained through the techniques of dynamic programming and adaptive critic methods. Using the proposed generalised probabilistic controller, the conditional joint pdfs can be made to follow the ideal ones. A simulation example is used to demonstrate the implementation of the algorithm and encouraging results are obtained. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. Application of probabilistic methods to safety R and D and design choices

    International Nuclear Information System (INIS)

    Gavigan, F.X.; Griffith, J.D.

    1977-01-01

    The Liquid Metal Fast Breeder Reactor (LMFBR) safety program is committed to identifying and exploiting areas in which probabilistic methods can be developed and used in making reactor safety R and D choices and optimizing designs of safety systems. Emphasis will be placed on a positive approach of solidifying and expanding our knowledge. This will provide the groundwork for a consensus on FBR risk. The management structure which will be used is based on a mechanistic approach to an LMFBR Core Disruptive Accident (CDA) with risk partitioned into ''Lines of Assurance,'' i.e., independent, phenomenologically-based barriers which will impede or mitigate the progression and consequences of accident sequences. Quantitative determination of the probability of breach of these barriers through the completion of work identified for each Line of Assurance will allow the quantification of the contribution to risk reduction associated with the success of each barrier. This process can lead to better use of resources by channeling R and D in directions which promise the greatest potential for reducing risk and by identifying an orderly approach to the development and demonstration of design features which will keep LMFBR risks at an acceptable level

  15. A Probabilistic Design Methodology for a Turboshaft Engine Overall Performance Analysis

    Directory of Open Access Journals (Sweden)

    Min Chen

    2014-05-01

    In reality, the cumulative effect of the many uncertainties in engine component performance may stack up to affect the engine overall performance. This paper aims to quantify the impact of uncertainty in engine component performance on the overall performance of a turboshaft engine based on the Monte Carlo probabilistic design method. A novel probabilistic model of a turboshaft engine, consisting of a Monte Carlo simulation generator, a traditional nonlinear turboshaft engine model, and a probability statistical model, was implemented to predict this impact. One of the fundamental results shown herein is that uncertainty in component performance has a significant impact on the engine overall performance prediction. This paper also shows that, taking into consideration the uncertainties in component performance, the turbine entry temperature and overall pressure ratio based on the probabilistic design method should increase by 0.76% and 8.33%, respectively, compared with those of the deterministic design method. The comparison shows that the probabilistic approach provides a more credible and reliable way to assign the design space for a target engine overall performance.

  16. Probabilistic methods in exotic option pricing

    NARCIS (Netherlands)

    Anderluh, J.H.M.

    2007-01-01

    The thesis presents three ways of calculating the Parisian option price as an illustration of probabilistic methods in exotic option pricing. Moreover, options on commodities are considered, as well as double-sided barrier options in a compound Poisson framework.

  17. An approximate methods approach to probabilistic structural analysis

    Science.gov (United States)

    Mcclung, R. C.; Millwater, H. R.; Wu, Y.-T.; Thacker, B. H.; Burnside, O. H.

    1989-01-01

    A probabilistic structural analysis method (PSAM) is described which makes an approximate calculation of the structural response of a system, including the associated probabilistic distributions, with minimal computation time and cost, based on a simplified representation of the geometry, loads, and material. The method employs the fast probability integration (FPI) algorithm of Wu and Wirsching. Typical solution strategies are illustrated by formulations for a representative critical component chosen from the Space Shuttle Main Engine (SSME) as part of a major NASA-sponsored program on PSAM. Typical results are presented to demonstrate the role of the methodology in engineering design and analysis.

  18. Decision making by hybrid probabilistic: Possibilistic utility theory

    Directory of Open Access Journals (Sweden)

    Pap Endre

    2009-01-01

    An approach to decision theory based upon non-probabilistic uncertainty is presented. An axiomatization of the hybrid probabilistic-possibilistic mixtures is given, based on a pair of triangular conorm and triangular norm satisfying the restricted distributivity law, and the corresponding non-additive S-measure. This is characterized by the families of operations involved in generalized mixtures, based upon a previous result on the characterization of the pair of continuous t-norm and t-conorm such that the former is restrictedly distributive over the latter. The obtained family of mixtures combines probabilistic and idempotent (possibilistic) mixtures via a threshold.

  19. Review of the Diablo Canyon probabilistic risk assessment

    International Nuclear Information System (INIS)

    Bozoki, G.E.; Fitzpatrick, R.G.; Bohn, M.P.; Sabek, M.G.; Ravindra, M.K.; Johnson, J.J.

    1994-08-01

    This report details the review of the Diablo Canyon Probabilistic Risk Assessment (DCPRA). The study was performed by Brookhaven National Laboratory under contract from the Probabilistic Risk Analysis Branch, Office of Nuclear Reactor Research, USNRC. The DCPRA is a full-scope Level I effort and, although the review touched on all aspects of the PRA, the internal events and seismic events received the vast majority of the review effort. The report includes a number of independent systems analyses, sensitivity studies, and importance analyses, as well as conclusions on the adequacy of the DCPRA for use in the Diablo Canyon Long Term Seismic Program

  20. Advances in probabilistic databases for uncertain information management

    CERN Document Server

    Yan, Li

    2013-01-01

    This book covers a fast-growing topic in great depth and focuses on the technologies and applications of probabilistic data management, aiming to provide a single account of current studies in the field. The objective of the book is to provide state-of-the-art information to researchers, practitioners, and graduate students in information technology and intelligent information processing, while at the same time serving the information technology professional faced with non-traditional applications that make the application of conventional approaches difficult or impossible.

  1. DIM and diagnostic placement for NIF experiments

    International Nuclear Information System (INIS)

    Kalantar, D.

    1999-01-01

    The input that has been provided on the NIF experiment setup sheets has allowed us to review the diagnostic and DIM placement as well as the baseline unconverted light management plan. We have done an iteration to identify common diagnostic lines of sight, and with additional requirements defined by specific experiments, we propose (1) a baseline plan for DIM placement requiring only five DIMs that may be moved between up to seven DIM ports, and (2) a modified baseline unconverted light management plan. We request additional input to identify primary vs. secondary diagnostics for each experiment definition

  2. The Sapir-Whorf Hypothesis and Probabilistic Inference: Evidence from the Domain of Color.

    Science.gov (United States)

    Cibelli, Emily; Xu, Yang; Austerweil, Joseph L; Griffiths, Thomas L; Regier, Terry

    2016-01-01

    The Sapir-Whorf hypothesis holds that our thoughts are shaped by our native language, and that speakers of different languages therefore think differently. This hypothesis is controversial in part because it appears to deny the possibility of a universal groundwork for human cognition, and in part because some findings taken to support it have not reliably replicated. We argue that considering this hypothesis through the lens of probabilistic inference has the potential to resolve both issues, at least with respect to certain prominent findings in the domain of color cognition. We explore a probabilistic model that is grounded in a presumed universal perceptual color space and in language-specific categories over that space. The model predicts that categories will most clearly affect color memory when perceptual information is uncertain. In line with earlier studies, we show that this model accounts for language-consistent biases in color reconstruction from memory in English speakers, modulated by uncertainty. We also show, to our knowledge for the first time, that such a model accounts for influential existing data on cross-language differences in color discrimination from memory, both within and across categories. We suggest that these ideas may help to clarify the debate over the Sapir-Whorf hypothesis.
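
    The category-adjustment prediction can be captured in a few lines of Bayesian cue combination; the hue axis, category prototype, and noise levels below are hypothetical, not the paper's fitted parameters.

      import numpy as np

      # Bayesian reconstruction from memory: the remembered hue is pulled
      # toward the category prototype, more strongly when perceptual
      # uncertainty is high. (Toy one-dimensional hue axis.)
      stim = 170.0                    # presented hue, deg
      cat_mean, cat_sd = 190.0, 15.0  # hypothetical "green" category prior

      for noise_sd in (5.0, 20.0):    # low vs. high perceptual uncertainty
          w = cat_sd**2 / (cat_sd**2 + noise_sd**2)   # weight on the stimulus
          recon = w * stim + (1 - w) * cat_mean       # posterior mean
          print(f"noise {noise_sd:>4}: reconstruction = {recon:.1f} deg "
                f"(bias toward category = {recon - stim:+.1f})")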

  3. Ambient Surveillance by Probabilistic-Possibilistic Perception

    NARCIS (Netherlands)

    Bittermann, M.S.; Ciftcioglu, O.

    2013-01-01

    A method for quantifying ambient surveillance is presented, which is based on probabilistic-possibilistic perception. The human surveillance of a scene through observing camera sensed images on a monitor is modeled in three steps. First immersion of the observer is simulated by modeling perception

  4. Probabilistic Load Flow

    DEFF Research Database (Denmark)

    Chen, Peiyuan; Chen, Zhe; Bak-Jensen, Birgitte

    2008-01-01

    This paper reviews the development of the probabilistic load flow (PLF) techniques. Applications of the PLF techniques in different areas of power system steady-state analysis are also discussed. The purpose of the review is to identify different available PLF techniques and their corresponding...

  5. Probabilistic inversion in priority setting of emerging zoonoses.

    NARCIS (Netherlands)

    Kurowicka, D.; Bucura, C.; Cooke, R.; Havelaar, A.H.

    2010-01-01

    This article presents a methodology for applying probabilistic inversion in combination with expert judgment to a priority setting problem. Experts rank scenarios according to severity. A linear multi-criteria analysis model underlying the expert preferences is posited. Using probabilistic inversion, a

  6. Radioactivity, a pragmatic pillar of probabilistic conceptions

    International Nuclear Information System (INIS)

    Amaldi, E.

    1979-01-01

    The author expresses his opinion that, by looking at the problem of the repudiation of causality in physics from the most general and distant point of view, one can be led to over-estimate the extrinsic influences and overlook intrinsic arguments inherent in two parallel, almost independent developments. The first starts from the kinetic theory of gases and passes through statistical mechanics, Planck's original definition of the quantum, photons conceived as particles, and the relations between emission and absorption of photons by atoms. The other path, also intrinsic to physics, starts with the accidental discovery of radioactive substances, passes through the experimental recognition of their decay properties, and quickly finds its natural settlement in a probabilistic conception which may be accused of being uncritical but which certainly has a sound pragmatic ground, uncorrelated, or at most extremely loosely correlated, to contemporary or pre-existing philosophical lines of thought. (Auth.)

  7. Probabilistic Models for Solar Particle Events

    Science.gov (United States)

    Adams, James H., Jr.; Dietrich, W. F.; Xapsos, M. A.; Welton, A. M.

    2009-01-01

    Probabilistic Models of Solar Particle Events (SPEs) are used in space mission design studies to provide a description of the worst-case radiation environment that the mission must be designed to tolerate. The models determine the worst-case environment using a description of the mission and a user-specified confidence level that the provided environment will not be exceeded. This poster will focus on completing the existing suite of models by developing models for peak flux and event-integrated fluence elemental spectra for the Z>2 elements. It will also discuss methods to take into account uncertainties in the database and the uncertainties resulting from the limited number of solar particle events in the database. These new probabilistic models are based on an extensive survey of SPE measurements of peak and event-integrated elemental differential energy spectra. Attempts are made to fit the measured spectra with eight different published models. The model giving the best fit to each spectrum is chosen and used to represent that spectrum for any energy in the energy range covered by the measurements. The set of all such spectral representations for each element is then used to determine the worst case spectrum as a function of confidence level. The spectral representation that best fits these worst case spectra is found and its dependence on confidence level is parameterized. This procedure creates probabilistic models for the peak and event-integrated spectra.

  8. Proceedings of a NEA workshop on probabilistic structure integrity analysis and its relationship to deterministic analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-07-01

    This workshop was hosted jointly by the Swedish Nuclear Power Inspectorate (SKi) and the Swedish Royal Institute of Technology (KTH). It was sponsored by Principal Working Group 3 (PWG-3) of the NEA CSNI. PWG-3 deals with the integrity of structures and components, and has three sub-groups, dealing with the integrity of metal components and structures, ageing of concrete structures, and the seismic behaviour of structures. The sub-group dealing with metal components has three main areas of activity: non-destructive examination, fracture mechanics, and material degradation. The topic of this workshop is primarily probabilistic fracture mechanics, but probabilistic integrity analysis includes NDE and materials degradation also. Session 1 (5 papers) was devoted to the development of probabilistic models; Session 2 (5 papers) to the random modelling of defects and material properties; Session 3 (8 papers) to the applications of probabilistic modelling to nuclear components; Session 4 is a concluding panel discussion

  9. Proceedings of a NEA workshop on probabilistic structural integrity analysis and its relationship to deterministic analysis

    International Nuclear Information System (INIS)

    1996-01-01

    This workshop was hosted jointly by the Swedish Nuclear Power Inspectorate (SKi) and the Swedish Royal Institute of Technology (KTH). It was sponsored by the Principal Working Group 3 (PWG-3) of the NEA CSNI. PWG-3 deals with the integrity of structures and components, and has three sub-groups, dealing with the integrity of metal components and structures, ageing of concrete structures, and the seismic behaviour of structures. The sub-group dealing with metal components has three main areas of activity: non-destructive examination; fracture mechanics; and material degradation. The topic of this workshop is primarily probabilistic fracture mechanics, but probabilistic integrity analysis also includes NDE and materials degradation. Session 1 (5 papers) was devoted to the development of probabilistic models; Session 2 (5 papers) to the random modelling of defects and material properties; Session 3 (8 papers) to the applications of probabilistic modelling to nuclear components; Session 4 was a concluding panel discussion

  10. Solving of some Problems with On-Line Mode Measurement of Partial Discharges

    Directory of Open Access Journals (Sweden)

    Karel Zalis

    2004-01-01

    Full Text Available This paper deals with problems arising in the transition from off-line diagnostic methods to on-line ones. Based on experience with commercial partial discharge measuring equipment, a new digital system for the evaluation of partial discharge measurement, including software and hardware facilities, has been developed at the Czech Technical University in Prague. Two expert systems work in this complex evaluating system: a rule-based expert system performing an amplitude analysis of partial discharge impulses for determining the damage of the insulation system, and a neural network which is used for a phase analysis of partial discharge impulses to determine the kind of partial discharge activity. The problem of eliminating disturbances is also discussed.

  11. A Probabilistic Approach for Robustness Evaluation of Timber Structures

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Sørensen, John Dalsgaard

    A probabilistic based robustness analysis has been performed for a glulam frame structure supporting the roof over the main court in a Norwegian sports centre. The robustness analysis is based on the framework for robustness analysis introduced in the Danish Code of Practice for the Safety of Structures and a probabilistic modelling of the timber material proposed in the Probabilistic Model Code (PMC) of the Joint Committee on Structural Safety (JCSS). Due to the framework in the Danish Code, the timber structure has to be evaluated with respect to the following criteria, where at least one shall be fulfilled. ... With respect to criteria a) and b), the timber frame structure has one column with a reliability index a bit lower than an assumed target level. By removing three columns one by one, no significant extensive failure of the entire structure or significant parts of it is obtained. Therefore the structure can be considered ...

  12. Probabilistic safety assessment for research reactors

    International Nuclear Information System (INIS)

    1986-12-01

    Increasing interest in using Probabilistic Safety Assessment (PSA) methods for research reactor safety is being observed in many countries throughout the world. This is mainly because of the great capability of this approach for achieving safe and reliable operation of research reactors. There is also a need to assist developing countries to apply Probabilistic Safety Assessment to existing nuclear facilities, which are simpler and therefore less complicated to analyse than a large nuclear power plant. It may be important, therefore, to develop PSA for research reactors. This might also help to better understand the safety characteristics of the reactor and to base any backfitting on a cost-benefit analysis which would ensure that only necessary changes are made. This document touches on all the key aspects of PSA but places greater emphasis on so-called systems analysis aspects rather than on in-plant or ex-plant consequences

  13. Probabilistic deletion of copies of linearly independent quantum states

    International Nuclear Information System (INIS)

    Feng Jian; Gao Yunfeng; Wang Jisuo; Zhan Mingsheng

    2002-01-01

    We show that each of two copies of the nonorthogonal states randomly selected from a certain set S can be probabilistically deleted by a general unitary-reduction operation if and only if the states are linearly independent. We derive a tight bound on the best possible deleting efficiencies. These results for 2→1 probabilistic deleting are also generalized to the case of N→M deleting (N, M positive integers and N>M)

  14. Probabilistic Flood Defence Assessment Tools

    Directory of Open Access Journals (Sweden)

    Slomp Robert

    2016-01-01

    ... institutions managing the flood defences, and not just by a small number of experts in probabilistic assessment. Therefore, data management and use of software are main issues that have been covered in courses and training in 2016 and 2017. All in all, this is the largest change in the assessment of Dutch flood defences since 1996. In 1996 probabilistic techniques were first introduced to determine hydraulic boundary conditions (water levels and waves: wave height, wave period and direction) for different return periods. To simplify the process, the assessment continues to consist of a three-step approach, moving from simple decision rules, to the methods for semi-probabilistic assessment, and finally to a fully probabilistic analysis to compare the strength of flood defences with the hydraulic loads. The formal assessment results are thus mainly based on the fully probabilistic analysis and the ultimate limit state of the strength of a flood defence. For complex flood defences, additional models and software were developed. The current Hydra software suite (for policy analysis, formal flood defence assessment and design) will be replaced by the model Ringtoets. New stand-alone software has been developed for revetments, geotechnical analysis and slope stability of the foreshore. Design software and policy analysis software, including the Delta model, will be updated in 2018. A fully probabilistic method results in more precise assessments and more transparency in the process of assessment and reconstruction of flood defences. This is of increasing importance, as large-scale infrastructural projects in a highly urbanized environment are increasingly subject to political and societal pressure to add additional features. For this reason, it is of increasing importance to be able to determine which new feature really adds to flood protection, to quantify how much it adds to the level of flood protection, and to evaluate if it is really worthwhile. Please note: The Netherlands ...
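
    A toy sketch of the three-step approach described above, with a simple decision rule, a semi-probabilistic check, and a fully probabilistic comparison. The margins, factors and scalar load/strength inputs are illustrative assumptions only; the real assessments use detailed load and strength models (Hydra, Ringtoets), not single numbers:

```python
from dataclasses import dataclass

@dataclass
class Section:
    load: float           # characteristic hydraulic load (illustrative scalar)
    strength: float       # characteristic strength (illustrative scalar)
    pf_full: float        # failure probability from a fully probabilistic run
    pf_target: float      # statutory target failure probability

def assess(s: Section) -> str:
    """Three-step approach: simple rule, semi-probabilistic check with an
    assumed partial safety factor, then fully probabilistic comparison."""
    if s.strength > 2.0 * s.load:        # step 1: simple decision rule
        return "pass (simple rule)"
    if s.strength > 1.2 * s.load:        # step 2: semi-probabilistic
        return "pass (semi-probabilistic)"
    ok = s.pf_full <= s.pf_target        # step 3: fully probabilistic
    return ("pass" if ok else "fail") + " (fully probabilistic)"

print(assess(Section(load=1.0, strength=1.5, pf_full=2e-5, pf_target=1e-4)))
```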

  15. Probabilistic safety assessment

    International Nuclear Information System (INIS)

    Hoertner, H.; Schuetz, B.

    1982-09-01

    For the purpose of assessing the applicability and informativeness of risk-analysis methods in licensing procedures under atomic law, the choice of instruments for probabilistic analysis, the problems and experience gained in their application, and the discussion of safety goals with respect to such instruments are of paramount significance. Naturally, such a complex field can only be dealt with step by step, making contributions on specific problems. The report at hand shows the essentials of a 'stocktaking' of system reliability studies in the licensing procedure under atomic law and of an American report (NUREG-0739) on 'Quantitative Safety Goals'. (orig.) [de

  16. Probabilistic aspects of risk analyses for hazardous facilities

    International Nuclear Information System (INIS)

    Morici, A.; Valeri, A.; Zaffiro, C.

    1989-01-01

    The work described in the paper discusses the aspects of risk analysis concerned with the use of the probabilistic methodology, in order to see how this approach may affect the risk management of industrial hazardous facilities. To this purpose, reference is made to the Probabilistic Risk Assessment (PRA) of nuclear power plants. The paper points out that even though the public aversion towards nuclear risks is still far from being removed, the probabilistic approach may provide sound support to the decision making and authorization process for any industrial activity implying risk for the environment and public health. It is the opinion of the authors that probabilistic techniques have been developed to a great level of sophistication in the nuclear industry, which has provided much more experience in this field than others. For some particular areas of nuclear applications, such as plant reliability and plant response to accidents, these techniques have reached a sufficient level of maturity, and some results have been usefully taken as a measure of the safety level of the plant itself. The use of some limited safety goals is regarded as a relevant item of the nuclear licensing process. The paper claims that it is now time that these methods be applied with equal success to other hazardous facilities, and makes some comparative considerations on the differences between these plants and nuclear power plants in order to understand the effect of these differences on the PRA results and on the use one intends to make of them. (author)

  17. Standardized approach for developing probabilistic exposure factor distributions

    Energy Technology Data Exchange (ETDEWEB)

    Maddalena, Randy L.; McKone, Thomas E.; Sohn, Michael D.

    2003-03-01

    The effectiveness of a probabilistic risk assessment (PRA) depends critically on the quality of input information that is available to the risk assessor and specifically on the probabilistic exposure factor distributions that are developed and used in the exposure and risk models. Deriving probabilistic distributions for model inputs can be time consuming and subjective. The absence of a standard approach for developing these distributions can result in PRAs that are inconsistent and difficult to review by regulatory agencies. We present an approach that reduces subjectivity in the distribution development process without limiting the flexibility needed to prepare relevant PRAs. The approach requires two steps. First, we analyze data pooled at a population scale to (1) identify the most robust demographic variables within the population for a given exposure factor, (2) partition the population data into subsets based on these variables, and (3) construct archetypal distributions for each subpopulation. Second, we sample from these archetypal distributions according to site- or scenario-specific conditions to simulate exposure factor values and use these values to construct the scenario-specific input distribution. It is envisaged that the archetypal distributions from step 1 will be generally applicable so risk assessors will not have to repeatedly collect and analyze raw data for each new assessment. We demonstrate the approach for two commonly used exposure factors--body weight (BW) and exposure duration (ED)--using data for the U.S. population. For these factors we provide a first set of subpopulation based archetypal distributions along with methodology for using these distributions to construct relevant scenario-specific probabilistic exposure factor distributions.
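
    A condensed sketch of the two-step approach described above, with simulated stand-ins for the pooled population data (the group names, lognormal choice and weights are illustrative assumptions; the paper derives its archetypes from U.S. survey data):

```python
import numpy as np

rng = np.random.default_rng(1)

# Step 1 (illustrative): population body-weight data partitioned by a robust
# demographic variable (here, age group), with an archetypal lognormal fitted
# to each subset. Real inputs are pooled survey data; these are simulated.
population = {
    "child": rng.lognormal(np.log(25.0), 0.25, 500),
    "adult": rng.lognormal(np.log(75.0), 0.20, 500),
}
archetypes = {g: (np.log(x).mean(), np.log(x).std()) for g, x in population.items()}

# Step 2: sample from the archetypal distributions according to the
# scenario-specific subpopulation mix to build the input distribution.
def scenario_distribution(weights: dict, n: int = 10_000) -> np.ndarray:
    groups = list(weights)
    picks = rng.choice(groups, size=n, p=[weights[g] for g in groups])
    mu = np.array([archetypes[g][0] for g in picks])
    sigma = np.array([archetypes[g][1] for g in picks])
    return rng.lognormal(mu, sigma)

bw = scenario_distribution({"child": 0.3, "adult": 0.7})
print(bw.mean(), np.percentile(bw, [5, 50, 95]))
```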

  18. SNS Diagnostics Tools for Data Acquisition and Display

    CERN Document Server

    Sundaram, Madhan; Long, Cary D

    2005-01-01

    The Spallation Neutron Source (SNS) accelerator systems will deliver a 1.0 GeV, 1.4 MW proton beam to a liquid mercury target for neutron scattering research. The accelerator complex consists of a 1.0 GeV linear accelerator, an accumulator ring and associated transport lines. The SNS diagnostics platform is PC-based and will run Windows for its OS and LabVIEW as its programming language. The diagnostics platform as well as other control systems and operator consoles use the Channel Access (CA) protocol of the Experimental Physics and Industrial Control System (EPICS) to communicate. This paper describes the tools created to evaluate the diagnostic instrument using our standard programming environment, LabVIEW. The tools are based on the LabVIEW Channel Access library and can run on Windows, Linux, and Mac OS X. The data-acquisition tool uses drag and drop to select process variables organized by instrument, accelerator component, or beam parameters. The data can be viewed on-line and logged to disk for later ...

  19. Application of probabilistic risk assessment to advanced liquid metal reactor designs

    International Nuclear Information System (INIS)

    Carroll, W.P.; Temme, M.I.

    1987-01-01

    The United States Department of Energy (US DOE) has been active in the development and application of probabilistic risk assessment methods within its liquid metal breeder reactor development program for the past eleven years. These methods have been applied to comparative risk evaluations, the selection of design features for reactor concepts, the selection and emphasis of research and development programs, and regulatory discussions. The application of probabilistic methods to reactors which are in the conceptual design stage presents unique database, modeling, and timing challenges, and excellent opportunities to improve the final design. We provide here the background and insights on the experience which the US DOE liquid metal breeder reactor program has had in its application of probabilistic methods to the Clinch River Breeder Reactor Plant project, the Conceptual Design Stage of the Large Development Plant, and updates on this design. Plans for future applications of probabilistic risk assessment methods are also discussed. The US DOE is embarking on an innovative design program for liquid metal reactors. (author)

  20. The Importance of Conditional Probability in Diagnostic Reasoning and Clinical Decision Making: A Primer for the Eye Care Practitioner.

    Science.gov (United States)

    Sanfilippo, Paul G; Hewitt, Alex W; Mackey, David A

    2017-04-01

    To outline and detail the importance of conditional probability in clinical decision making and discuss the various diagnostic measures eye care practitioners should be aware of in order to improve the scope of their clinical practice. We conducted a review of the importance of conditional probability in diagnostic testing for the eye care practitioner. Eye care practitioners use diagnostic tests on a daily basis to assist in clinical decision making and optimizing patient care and management. These tests provide probabilistic information that can enable the clinician to increase (or decrease) their level of certainty about the presence of a particular condition. While an understanding of the characteristics of diagnostic tests is essential to facilitate proper interpretation of test results and disease risk, many practitioners either confuse or misinterpret these measures. In the interests of their patients, practitioners should be aware of the basic concepts associated with diagnostic testing and the simple mathematical rule that underpins them. Importantly, the practitioner needs to recognize that the prevalence of a disease in the population greatly determines the clinical value of a diagnostic test.
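
    The "simple mathematical rule" referred to here is Bayes' theorem. A short illustration of how prevalence drives the post-test probabilities; the sensitivity, specificity and prevalence values are made up for the example:

```python
def post_test_probabilities(sens: float, spec: float, prev: float) -> tuple:
    """Positive and negative predictive values via Bayes' theorem."""
    ppv = sens * prev / (sens * prev + (1 - spec) * (1 - prev))
    npv = spec * (1 - prev) / (spec * (1 - prev) + (1 - sens) * prev)
    return ppv, npv

# The same test looks very different in low- and high-prevalence settings.
for prev in (0.01, 0.20):
    ppv, npv = post_test_probabilities(sens=0.90, spec=0.95, prev=prev)
    print(f"prevalence={prev:.2f}  PPV={ppv:.2f}  NPV={npv:.2f}")
```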

  1. Probabilistic cloning of three symmetric states

    International Nuclear Information System (INIS)

    Jimenez, O.; Bergou, J.; Delgado, A.

    2010-01-01

    We study the probabilistic cloning of three symmetric states. These states are defined by a single complex quantity, the inner product among them. We show that three different probabilistic cloning machines are necessary to optimally clone all possible families of three symmetric states. We also show that the optimal cloning probability of generating M copies out of one original can be cast as the quotient between the success probability of unambiguously discriminating one and M copies of symmetric states.
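
    In symbols (notation assumed here, not taken from the paper): writing P^UD_N for the optimal success probability of unambiguously discriminating the three symmetric states when N copies are available, the quoted result reads

```latex
P^{\mathrm{clone}}_{1 \to M} = \frac{P^{\mathrm{UD}}_{1}}{P^{\mathrm{UD}}_{M}} .
```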

  2. A probabilistic methodology for the design of radiological confinement of tokamak reactors

    International Nuclear Information System (INIS)

    Golinescu, Ruxandra P.; Morosan, Florinel; Kazimi, Mujid S.

    1997-01-01

    A methodology using probabilistic risk assessment techniques is proposed for evaluating the design of multiple confinement barriers for a fusion plant within the context of a limited allowable risk. The methodology was applied to the reference design of the International Thermonuclear Experimental Reactor (ITER). Accident sequence models were developed to determine the probability of radioactive releases from each confinement barrier. The current ITER design requirement, which sets environmental radioactive release limits for individual event sequences grouped in categories by frequency, is extended to derive a limit on the overall plant risk. This avoids detailed accounting for event uncertainties in both frequency and consequence. Thus, an analytical form for a limit line is derived as a complementary cumulative frequency of permissible radioactive releases to the environment. The line can be derived using risk aversion of the designer's own choice. By comparing the releases from each confinement barrier against this limit line, a decision can be made about the number of barriers required to comply with the design requirements. A decision model using multi-attribute utility function theory was constructed to help the designer in choosing the type of tokamak building while considering preferences for attributes such as construction cost, project completion time, technical feasibility and public attitude. Sensitivity analysis on some of the relevant parameters in the model was performed.
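
    One plausible analytical form for such a limit line, written here as an assumption for illustration rather than the paper's derived expression: a complementary cumulative frequency of releases that falls off as a power of the release magnitude, with the designer's risk aversion encoded in the exponent.

```latex
% Illustrative form only; the paper derives its own expression.
F(\ge R) = F_0 \left( \frac{R}{R_0} \right)^{-a}, \qquad a \ge 1,
```

    where F(≥R) is the permissible annual frequency of sequences releasing at least R, (R_0, F_0) anchors the line, and a larger exponent a penalizes large releases more strongly.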

  3. Recent developments of the NESSUS probabilistic structural analysis computer program

    Science.gov (United States)

    Millwater, H.; Wu, Y.-T.; Torng, T.; Thacker, B.; Riha, D.; Leung, C. P.

    1992-01-01

    The NESSUS probabilistic structural analysis computer program combines state-of-the-art probabilistic algorithms with general purpose structural analysis methods to compute the probabilistic response and the reliability of engineering structures. Uncertainty in loading, material properties, geometry, boundary conditions and initial conditions can be simulated. The structural analysis methods include nonlinear finite element and boundary element methods. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. The scope of the code has recently been expanded to include probabilistic life and fatigue prediction of structures in terms of component and system reliability and risk analysis of structures considering cost of failure. The code is currently being extended to structural reliability considering progressive crack propagation. Several examples are presented to demonstrate the new capabilities.
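
    For orientation, the basic quantity such codes compute is a probability of failure for a limit state. The crude Monte Carlo sketch below is only a baseline under assumed distributions; NESSUS itself couples the limit state to finite element or boundary element response and uses the advanced mean value method or adaptive importance sampling rather than brute-force sampling:

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative limit state g = R - S (strength minus load), with assumed
# normal distributions standing in for uncertain material and loading inputs.
def probability_of_failure(n: int = 1_000_000) -> float:
    R = rng.normal(10.0, 1.0, n)   # assumed strength distribution
    S = rng.normal(6.0, 1.5, n)    # assumed load distribution
    return float(np.mean(R - S < 0.0))

print(f"P_f ≈ {probability_of_failure():.3e}")
```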

  4. Evaluation of Nonparametric Probabilistic Forecasts of Wind Power

    DEFF Research Database (Denmark)

    Pinson, Pierre; Møller, Jan Kloppenborg; Nielsen, Henrik Aalborg

    Predictions of wind power production for horizons up to 48-72 hours ahead comprise a highly valuable input to the methods for the daily management or trading of wind generation. Today, users of wind power predictions are not only provided with point predictions, which are estimates of the most likely outcome for each look-ahead time, but also with uncertainty estimates given by probabilistic forecasts. In order to avoid assumptions on the shape of predictive distributions, these probabilistic predictions are produced from nonparametric methods, and then take the form of a single or a set ...

  5. Probabilistic Mobility Models for Mobile and Wireless Networks

    DEFF Research Database (Denmark)

    Song, Lei; Godskesen, Jens Christian

    2010-01-01

    In this paper we present a probabilistic broadcast calculus for mobile and wireless networks whose connections are unreliable. In our calculus, broadcasted messages can be lost with a certain probability, and due to mobility the connection probabilities may change. If a network broadcasts a message from a location, it will evolve to a network distribution depending on whether nodes at other locations receive the message or not. Mobility of locations is not arbitrary but guarded by a probabilistic mobility function (PMF), and we also define the notion of a weak bisimulation given a PMF ...
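
    A toy operational reading of these semantics, to make the two probabilistic ingredients concrete; the node names, probabilities and drift rule are illustrative only and do not reproduce the paper's formal calculus:

```python
import random

random.seed(4)

# Current connection probabilities: P(message from "l" reaches "m" / "n").
conn = {("l", "m"): 0.9, ("l", "n"): 0.4}

def broadcast(src: str, nodes: list) -> set:
    """Each listener receives the message independently with the current
    connection probability; the result is one draw from the distribution."""
    return {d for d in nodes if random.random() < conn[(src, d)]}

def mobility_step() -> None:
    """PMF-guarded mobility: connection probabilities drift within [0, 1]."""
    for k in conn:
        conn[k] = min(1.0, max(0.0, conn[k] + random.uniform(-0.2, 0.2)))

print(broadcast("l", ["m", "n"]))
mobility_step()
print(broadcast("l", ["m", "n"]))
```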

  6. Foundations of the Formal Sciences VI: Probabilistic reasoning and reasoning with probabilities

    NARCIS (Netherlands)

    Löwe, B.; Pacuit, E.; Romeijn, J.W.

    2009-01-01

    Probabilistic methods are increasingly becoming an important tool in a variety of disciplines including computer science, mathematics, artificial intelligence, epistemology, game and decision theory and linguistics. In addition to the discussion on applications of probabilistic methods there is an ...

  7. When to conduct probabilistic linkage vs. deterministic linkage? A simulation study.

    Science.gov (United States)

    Zhu, Ying; Matsuyama, Yutaka; Ohashi, Yasuo; Setoguchi, Soko

    2015-08-01

    When unique identifiers are unavailable, successful record linkage depends greatly on data quality and the types of variables available. While probabilistic linkage theoretically captures more true matches than deterministic linkage by allowing imperfection in identifiers, studies have shown inconclusive results, likely due to variations in data quality, implementation of linkage methodology and validation method. This simulation study aimed to understand the data characteristics that affect the performance of probabilistic vs. deterministic linkage. We created ninety-six scenarios that represent real-life situations using non-unique identifiers. We systematically introduced a range of discriminative power, rates of missing values and error, and file sizes to increase linkage patterns and difficulties. We assessed the performance difference of the linkage methods using standard validity measures and computation time. Across scenarios, deterministic linkage showed an advantage in PPV while probabilistic linkage showed an advantage in sensitivity. Probabilistic linkage uniformly outperformed deterministic linkage, as the former generated linkages with a better trade-off between sensitivity and PPV regardless of data quality. However, with low rates of missing values and error in the data, deterministic linkage did not perform significantly worse. The implementation of deterministic linkage in SAS took less than 1 min, and probabilistic linkage took 2 min to 2 h depending on file size. Our simulation study demonstrated that the intrinsic rate of missing values and error in the linkage variables is key to choosing between linkage methods. In general, probabilistic linkage was the better choice, but for exceptionally good quality data (<5% error), deterministic linkage was the more resource-efficient choice. Copyright © 2015 Elsevier Inc. All rights reserved.
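
    A minimal sketch of the contrast between the two methods on a record with a missing identifier. The Fellegi–Sunter-style m- and u-probabilities, threshold and field names are assumed for illustration (real implementations estimate them from the data, for example via EM); the point is that missing values break a deterministic rule but merely contribute no evidence to a probabilistic score:

```python
import math

# Toy records keyed by non-unique identifiers; None marks a missing value.
a = {"last": "smith", "dob": "1970-01-02", "zip": "10001"}
b = {"last": "smith", "dob": None,        "zip": "10001"}

# Assumed (m, u) probabilities: P(agree | match) and P(agree | non-match).
weights = {"last": (0.95, 0.05), "dob": (0.98, 0.001), "zip": (0.90, 0.10)}

def deterministic(a, b, fields) -> bool:
    """Exact agreement on every field; any missing value breaks the link."""
    return all(a[f] is not None and a[f] == b[f] for f in fields)

def probabilistic(a, b, weights) -> float:
    """Sum of log-likelihood ratios; missing fields contribute no evidence."""
    score = 0.0
    for f, (m, u) in weights.items():
        if a[f] is None or b[f] is None:
            continue
        score += math.log2(m / u) if a[f] == b[f] else math.log2((1 - m) / (1 - u))
    return score

print(deterministic(a, b, weights))        # False: missing dob kills the match
print(probabilistic(a, b, weights) > 5.0)  # True under an assumed threshold
```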

  8. Probabilistic Meteorological Characterization for Turbine Loads

    DEFF Research Database (Denmark)

    Kelly, Mark C.; Larsen, Gunner Chr.; Dimitrov, Nikolay Krasimirov

    2014-01-01

    Beyond the existing, limited IEC prescription to describe fatigue loads on wind turbines, we look towards probabilistic characterization of the loads via analogous characterization of the atmospheric flow, particularly for today's "taller" turbines with rotors well above the atmospheric surface...

  9. A Probabilistic Framework for Curve Evolution

    DEFF Research Database (Denmark)

    Dahl, Vedrana Andersen

    2017-01-01

    ... approach include the ability to handle textured images, simple generalization to multiple regions, and efficiency in computation. We test our probabilistic framework in combination with parametric (snakes) and geometric (level-sets) curves. The experimental results on composed and natural images demonstrate ...

  10. Spectroscopic diagnostics of NIF ICF implosions using line ratios of Kr dopant in the ignition capsule

    Science.gov (United States)

    Dasgupta, Arati; Ouart, Nicholas; Giuiani, John; Clark, Robert; Schneider, Marilyn; Scott, Howard; Chen, Hui; Ma, Tammy

    2017-10-01

    X-ray spectroscopy is used on the NIF to diagnose the plasma conditions in the ignition target in indirect-drive ICF implosions. A platform is being developed at NIF where small traces of krypton are used as a dopant to the fuel gas for spectroscopic diagnostics using krypton line emissions. The fraction of krypton dopant was varied in the experiments and was selected so as not to perturb the implosion. Our goal is to use X-ray spectroscopy of dopant line ratios produced by the hot core that can provide a precise measurement of electron temperature. Simulations of the krypton spectra using a 1 in 10^4 atomic fraction of krypton in a direct-drive exploding pusher with a range of electron temperatures and densities show discrepancies when different atomic models are used. We use our non-LTE atomic model with a detailed fine-structure level atomic structure and collisional-radiative rates to investigate the krypton spectra at the same conditions. Synthetic spectra are generated with a detailed multi-frequency radiation transport scheme from the emission regions of interest to analyze the experimental data with 0.02% Kr concentration and to compare and contrast with the existing simulations at LLNL. Work supported by DOE/NNSA; part of this work was also done under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract No. DE-AC52-07NA27344.
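
    To show why a line ratio measures electron temperature at all, here is a deliberately simplified inversion assuming an LTE Boltzmann population ratio; the work described above instead uses a non-LTE collisional-radiative model with radiation transport, and every statistical weight, rate and energy below is a placeholder, not a Kr value:

```python
import numpy as np
from scipy.optimize import brentq

def line_ratio(te_kev, g_a=3.0, g_b=1.0, a_a=1.0e13, a_b=5.0e12,
               e_a_kev=13.0, e_b_kev=13.5):
    """Intensity ratio of two lines for an assumed Boltzmann population."""
    return (g_a * a_a / (g_b * a_b)) * np.exp(-(e_a_kev - e_b_kev) / te_kev)

def infer_te(measured_ratio: float) -> float:
    """Invert the monotonic ratio-temperature relation by root finding."""
    return brentq(lambda te: line_ratio(te) - measured_ratio, 0.1, 50.0)

print(infer_te(line_ratio(4.0)))  # recovers the assumed 4 keV
```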

  11. A temporally and spatially resolved electron density diagnostic method for the edge plasma based on Stark broadening

    Energy Technology Data Exchange (ETDEWEB)

    Zafar, A., E-mail: zafara@ornl.gov [Department of Nuclear Engineering, North Carolina State University, Raleigh, North Carolina 27695 (United States); Oak Ridge National Laboratory, Oak Ridge, Tennessee 37830 (United States); Martin, E. H.; Isler, R. C.; Caughman, J. B. O. [Oak Ridge National Laboratory, Oak Ridge, Tennessee 37830 (United States); Shannon, S. C. [Department of Nuclear Engineering, North Carolina State University, Raleigh, North Carolina 27695 (United States)

    2016-11-15

    An electron density diagnostic (≥10^10 cm^-3) capable of high temporal (ms) and spatial (mm) resolution is currently under development at Oak Ridge National Laboratory. The diagnostic is based on measuring the Stark broadened, Doppler-free spectral line profile of the n = 6–2 hydrogen Balmer series transition. The profile is then fit to a fully quantum mechanical model including the appropriate electric and magnetic field operators. The quasi-static approach used to calculate the Doppler-free spectral line profile is outlined here and the results from the model are presented for H-δ spectra for electron densities of 10^10–10^13 cm^-3. The profile shows complex behavior due to the interaction between the magnetic substates of the atom.

  12. MATILDA Version-2: Rough Earth TIALD Model for Laser Probabilistic Risk Assessment in Hilly Terrain - Part II

    Science.gov (United States)

    2017-07-28

    risk assessment for “unsafe” scenarios. Recently, attention in the DoD has turned to Probabilistic Risk Assessment (PRA) models [5,6] as an ...
    (Figure 34, caption recovered from extraction residue: the magenta-coloured line represents the portion of the C-RX(U) circle, corresponding to the CRA undershoot boundary, that would contribute to the Tertiary Precaution Surface; undershoot-related laser firing restrictions within the green-coloured C-RX(U) can be ignored.)

  13. Probabilistic quantum cloning of a subset of linearly dependent states

    Science.gov (United States)

    Rui, Pinshu; Zhang, Wen; Liao, Yanlin; Zhang, Ziyun

    2018-02-01

    It is well known that a quantum state, secretly chosen from a certain set, can be probabilistically cloned with positive cloning efficiencies if and only if all the states in the set are linearly independent. In this paper, we focus on probabilistic quantum cloning of a subset of linearly dependent states. We show that a linearly independent subset of linearly dependent quantum states {|Ψ1⟩, |Ψ2⟩, …, |Ψn⟩} can be probabilistically cloned if and only if any state in the subset cannot be expressed as a linear superposition of the other states in the set {|Ψ1⟩, |Ψ2⟩, …, |Ψn⟩}. The optimal cloning efficiencies are also investigated.
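
    The stated criterion is a rank condition, so it can be checked numerically. A small sketch with assumed example states (the function name and the four states are illustrative, not from the paper):

```python
import numpy as np

def cloneable(states: np.ndarray, subset: list) -> bool:
    """Check the criterion above: every state in `subset` (column indices)
    must lie outside the span of the other n-1 states of the full set."""
    n = states.shape[1]
    for i in subset:
        others = states[:, [j for j in range(n) if j != i]]
        stacked = np.column_stack([others, states[:, i]])
        if np.linalg.matrix_rank(stacked) == np.linalg.matrix_rank(others):
            return False   # |psi_i> is a superposition of the other states
    return True

# Four linearly dependent states in C^3; the fourth is (|1> + |2>)/sqrt(2).
psi = np.array([[1.0, 0.0, 0.0, 0.0],
                [0.0, 1.0, 0.0, 1.0],
                [0.0, 0.0, 1.0, 1.0]])
psi[:, 3] /= np.sqrt(2.0)

print(cloneable(psi, [0]))  # True: |psi_0> is outside the span of the rest
print(cloneable(psi, [1]))  # False: |psi_1> = sqrt(2)|psi_3> - |psi_2>
```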

  14. International Conference on Plasma Diagnostics. Slides, papers and posters of Plasma Diagnostics 2010

    International Nuclear Information System (INIS)

    Hartfuss, H.J.; Bonhomme, G.; Grisolia, C.; Hirsch, M.; Klos, Z.; Mazouffre, S.; Musielok, J.; Ratynskaya, S.; Sadowski, M.; Van de Sanden, R.; Sentis, M.; Stroth, U.; Tereshin, V.; Tichy, M.; Unterberg, B.; Weisen, H.; Zoletnik, S.

    2011-01-01

    Plasma Diagnostics 2010 is an International Conference on Diagnostic Methods involved in Research and Applications of Plasmas, originating from the combination of the 5. German-Polish Conference on Plasma Diagnostics for Fusion and Applications and the 7. French-Polish Seminar on Thermal Plasma in Space and Laboratory. The Scientific Committee of 'Plasma 2007' decided to concentrate the attention of future conferences more on diagnostic development and diagnostic interpretation in the fields of high and low temperature plasmas and plasma applications. It is aimed at involving all European activities in these fields. The Scientific Program will cover the fields from low temperature laboratory to fusion plasmas of various configurations, as well as dusty and astrophysical plasmas and industrial plasma applications

  15. A general diagnostic model applied to language testing data.

    Science.gov (United States)

    von Davier, Matthias

    2008-11-01

    Probabilistic models with one or more latent variables are designed to report on a corresponding number of skills or cognitive attributes. Multidimensional skill profiles offer additional information beyond what a single test score can provide, if the reported skills can be identified and distinguished reliably. Many recent approaches to skill profile models are limited to dichotomous data and have made use of computationally intensive estimation methods such as Markov chain Monte Carlo, since standard maximum likelihood (ML) estimation techniques were deemed infeasible. This paper presents a general diagnostic model (GDM) that can be estimated with standard ML techniques and applies to polytomous response variables as well as to skills with two or more proficiency levels. The paper uses one member of a larger class of diagnostic models, a compensatory diagnostic model for dichotomous and partial credit data. Many well-known models, such as univariate and multivariate versions of the Rasch model and the two-parameter logistic item response theory model, the generalized partial credit model, as well as a variety of skill profile models, are special cases of this GDM. In addition to an introduction to this model, the paper presents a parameter recovery study using simulated data and an application to real data from the field test for TOEFL Internet-based testing.
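
    A sketch of the compensatory GDM for dichotomous data in notation assumed here (the paper's own symbols may differ): item i has difficulty β_i, Q-matrix entry q_ik linking it to skill k, slope γ_ik, and the examinee has skill profile a = (a_1, …, a_K).

```latex
P(X_i = 1 \mid \mathbf{a})
  = \frac{\exp\bigl(\beta_i + \sum_{k=1}^{K} q_{ik}\,\gamma_{ik}\,a_k\bigr)}
         {1 + \exp\bigl(\beta_i + \sum_{k=1}^{K} q_{ik}\,\gamma_{ik}\,a_k\bigr)}
```

    With K = 1 and a continuous skill variable this reduces to the two-parameter logistic IRT model, consistent with the special cases listed above.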

  16. Probabilistic liver atlas construction.

    Science.gov (United States)

    Dura, Esther; Domingo, Juan; Ayala, Guillermo; Marti-Bonmati, Luis; Goceri, E

    2017-01-13

    Anatomical atlases are 3D volumes or shapes representing an organ or structure of the human body. They contain either the prototypical shape of the object of interest together with other shapes representing its statistical variations (statistical atlas) or a probability map of belonging to the object (probabilistic atlas). Probabilistic atlases are mostly built with simple estimations only involving the data at each spatial location. A new method for probabilistic atlas construction that uses a generalized linear model is proposed. This method aims to improve the estimation of the probability of being covered by the liver. Furthermore, all methods to build an atlas involve previous coregistration of the sample of shapes available. The influence of the geometrical transformation adopted for registration on the quality of the final atlas has not been sufficiently investigated. The ability of an atlas to adapt to a new case is one of the most important quality criteria that should be taken into account. The presented experiments show that some methods for atlas construction are severely affected by the previous coregistration step. We show the good performance of the new approach. Furthermore, results suggest that extremely flexible registration methods are not always beneficial, since they can reduce the variability of the atlas and hence its ability to give sensible values of probability when used as an aid in segmentation of new cases.
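
    To contrast the two estimation styles mentioned above, here is a 1-D toy: the voxelwise mean only uses the local data, while a GLM (logistic regression on smooth functions of position) lets neighbouring voxels share information. The basis, data generator and shapes are illustrative assumptions, not the paper's model:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)

n_subjects, n_voxels = 40, 200
x = np.linspace(0.0, 1.0, n_voxels)
true_p = np.exp(-((x - 0.5) ** 2) / 0.02)       # liver-like blob in 1-D
masks = (rng.random((n_subjects, n_voxels)) < true_p).astype(int)

# Simple per-voxel estimate (only local data): the voxelwise mean.
simple_atlas = masks.mean(axis=0)

# GLM alternative: fit occupancy against an assumed smooth basis of position.
X = np.column_stack([x, x ** 2])
X_rep = np.repeat(X, n_subjects, axis=0)         # one row per (voxel, subject)
y = masks.T.reshape(-1)                          # matching voxel-major labels
glm_atlas = LogisticRegression().fit(X_rep, y).predict_proba(X)[:, 1]

print(np.abs(simple_atlas - true_p).mean(), np.abs(glm_atlas - true_p).mean())
```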

  17. Development of a computerized system for performance monitoring and diagnostics in nuclear power plants

    International Nuclear Information System (INIS)

    Chou, G.H.; Chao, H.J.

    1995-01-01

    An on-line computerized system for thermal performance monitoring and diagnostics has been developed at the Institute of Nuclear Energy Research (INER). It was the product of the ChinShan plant performance Monitoring, Analysis and Diagnostics Expert System (CS-MADES) project sponsored by the Taiwan Power Company (TPC). The system can carry out turbine performance monitoring and analysis during normal operation, and yields diagnostic results on component degradation after identifying the causes of missing generation. Three subsystems were created to support the whole system framework: the Test Data Processing Subsystem (TDPS), the On-line Monitoring and Analysis Subsystem (OMAS), and the Thermal Performance Diagnostics Expert System (TPDES). Some visible benefits have already been gained through the prototype system installed at the Chinshan nuclear power station

  18. Probabilistic approaches to life prediction of nuclear plant structural components

    International Nuclear Information System (INIS)

    Villain, B.; Pitner, P.; Procaccia, H.

    1996-01-01

    In the last decade there has been an increasing interest at EDF in developing and applying probabilistic methods for a variety of purposes. In the field of structural integrity and reliability they are used to evaluate the effect of deterioration due to aging mechanisms, mainly on major passive structural components such as steam generators, pressure vessels and piping in nuclear plants. Because there can be numerous uncertainties involved in an assessment of the performance of these structural components, probabilistic methods provide an attractive alternative or supplement to more conventional deterministic methods. The benefits of a probabilistic approach are the clear treatment of uncertainty and the possibility to perform sensitivity studies from which it is possible to identify and quantify the effect of key factors and mitigative actions. They thus provide information to support effective decisions to optimize In-Service Inspection planning and maintenance strategies and for realistic lifetime prediction or reassessment. The purpose of the paper is to discuss and illustrate the methods available at EDF for probabilistic component life prediction. This includes a presentation of software tools in classical, Bayesian and structural reliability, and an application on two case studies (steam generator tube bundle, reactor pressure vessel). (authors)

  19. Probabilistic approaches to life prediction of nuclear plant structural components

    International Nuclear Information System (INIS)

    Villain, B.; Pitner, P.; Procaccia, H.

    1996-01-01

    In the last decade there has been an increasing interest at EDF in developing and applying probabilistic methods for a variety of purposes. In the field of structural integrity and reliability they are used to evaluate the effect of deterioration due to aging mechanisms, mainly on major passive structural components such as steam generators, pressure vessels and piping in nuclear plants. Because there can be numerous uncertainties involved in an assessment of the performance of these structural components, probabilistic methods provide an attractive alternative or supplement to more conventional deterministic methods. The benefits of a probabilistic approach are the clear treatment of uncertainty and the possibility to perform sensitivity studies from which it is possible to identify and quantify the effect of key factors and mitigative actions. They thus provide information to support effective decisions to optimize In-Service Inspection planning and maintenance strategies and for realistic lifetime prediction or reassessment. The purpose of the paper is to discuss and illustrate the methods available at EDF for probabilistic component life prediction. This includes a presentation of software tools in classical, Bayesian and structural reliability, and an application on two case studies (steam generator tube bundle, reactor pressure vessel)

  20. Probabilistic Design of Coastal Flood Defences in Vietnam

    NARCIS (Netherlands)

    Mai Van, C.

    2010-01-01

    This study further develops the method of probabilistic design and addresses a knowledge gap in its application regarding safety and reliability, risk assessment and risk evaluation in the field of flood defences. The thesis discusses: - a generic probabilistic design framework for assessing flood ...