WorldWideScience

Sample records for generated probability curves

  1. Sensitivity of the probability of failure to probability of detection curve regions

    International Nuclear Information System (INIS)

    Garza, J.; Millwater, H.

    2016-01-01

    Non-destructive inspection (NDI) techniques have been shown to play a vital role in fracture control plans, structural health monitoring, and ensuring availability and reliability of piping, pressure vessels, mechanical and aerospace equipment. Probabilistic fatigue simulations are often used in order to determine the efficacy of an inspection procedure with the NDI method modeled as a probability of detection (POD) curve. These simulations can be used to determine the most advantageous NDI method for a given application. As an aid to this process, a first order sensitivity method of the probability-of-failure (POF) with respect to regions of the POD curve (lower tail, middle region, right tail) is developed and presented here. The sensitivity method computes the partial derivative of the POF with respect to a change in each region of a POD or multiple POD curves. The sensitivities are computed at no cost by reusing the samples from an existing Monte Carlo (MC) analysis. A numerical example is presented considering single and multiple inspections. - Highlights: • Sensitivities of probability-of-failure to a region of probability-of-detection curve. • The sensitivities are computed with negligible cost. • Sensitivities identify the important region of a POD curve. • Sensitivities can be used as a guide to selecting the optimal POD curve.
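
    As a rough illustration of the sample-reuse idea described in this record, the sketch below computes a probability-of-failure estimate and first-order sensitivities to an additive shift of the POD curve in three regions by reweighting the same Monte Carlo samples. The POD form, the crack-growth model and all numerical values are hypothetical and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def pod(a, a50=1.0, beta=4.0):
    # Hypothetical log-logistic POD curve (not the authors' exact form).
    return 1.0 / (1.0 + (a50 / np.maximum(a, 1e-12)) ** beta)

# Hypothetical Monte Carlo fatigue samples: crack size at the inspection and
# whether the crack, if missed, would grow past its critical size by end of life.
n = 200_000
a_insp = rng.lognormal(mean=-0.2, sigma=0.5, size=n)
a_crit = rng.lognormal(mean=0.3, sigma=0.3, size=n)
fails_if_missed = a_insp * rng.lognormal(0.4, 0.2, size=n) > a_crit

# POF with one inspection: a sample fails only if its crack is missed and then grows.
pof = np.mean((1.0 - pod(a_insp)) * fails_if_missed)
print(f"POF = {pof:.4e}")

# First-order sensitivity of the POF to an additive shift of the POD inside a region,
# dPOF/d(shift) = -E[1{a in region} * 1{fails if missed}], reusing the same samples.
regions = {
    "lower tail": a_insp < 0.5,
    "middle":     (a_insp >= 0.5) & (a_insp < 1.5),
    "right tail": a_insp >= 1.5,
}
for name, mask in regions.items():
    print(f"dPOF/d(POD shift, {name:10s}) = {-np.mean(mask & fails_if_missed):+.4e}")
```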

  2. Sensitivity of probability-of-failure estimates with respect to probability of detection curve parameters

    Energy Technology Data Exchange (ETDEWEB)

    Garza, J. [University of Texas at San Antonio, Mechanical Engineering, 1 UTSA circle, EB 3.04.50, San Antonio, TX 78249 (United States); Millwater, H., E-mail: harry.millwater@utsa.edu [University of Texas at San Antonio, Mechanical Engineering, 1 UTSA circle, EB 3.04.50, San Antonio, TX 78249 (United States)

    2012-04-15

    A methodology has been developed and demonstrated that can be used to compute the sensitivity of the probability-of-failure (POF) with respect to the parameters of inspection processes that are simulated using probability of detection (POD) curves. The formulation is such that the probabilistic sensitivities can be obtained at negligible cost using sampling methods by reusing the samples used to compute the POF. As a result, the methodology can be implemented for negligible cost in a post-processing non-intrusive manner thereby facilitating implementation with existing or commercial codes. The formulation is generic and not limited to any specific random variables, fracture mechanics formulation, or any specific POD curve as long as the POD is modeled parametrically. Sensitivity estimates for the cases of different POD curves at multiple inspections, and the same POD curves at multiple inspections have been derived. Several numerical examples are presented and show excellent agreement with finite difference estimates with significant computational savings. - Highlights: ► Sensitivity of the probability-of-failure with respect to the probability-of-detection curve. ► The sensitivities are computed with negligible cost using Monte Carlo sampling. ► The change in the POF due to a change in the POD curve parameters can be easily estimated.
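
    A companion sketch for the parameter sensitivities described here: the partial derivative of the POF with respect to a POD parameter is estimated from the same Monte Carlo samples and checked against a finite difference, mirroring the comparison mentioned in the abstract. The POD form, the toy failure model and the numerical values are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)

def pod(a, a50, beta):
    # Hypothetical log-logistic POD curve with parameters a50 and beta.
    return 1.0 / (1.0 + (a50 / a) ** beta)

def dpod_da50(a, a50, beta):
    # Analytic partial derivative of the POD with respect to a50.
    p = pod(a, a50, beta)
    return -p * (1.0 - p) * beta / a50

# Reused Monte Carlo samples (illustrative): crack size at inspection and a toy
# indicator of whether the sample would fail if the crack were missed.
n = 200_000
a = rng.lognormal(-0.2, 0.5, n)
fails_if_missed = rng.random(n) < 0.05 * np.minimum(a, 1.0)

a50, beta = 1.0, 4.0
pof = np.mean((1.0 - pod(a, a50, beta)) * fails_if_missed)

# Sampling-based sensitivity: dPOF/da50 = -E[dPOD/da50(a) * 1{fails if missed}]
sens_a50 = -np.mean(dpod_da50(a, a50, beta) * fails_if_missed)

# Finite-difference check on the same samples (the two numbers should agree closely)
h = 1e-4
pof_h = np.mean((1.0 - pod(a, a50 + h, beta)) * fails_if_missed)
print(sens_a50, (pof_h - pof) / h)
```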

  3. Sensitivity of probability-of-failure estimates with respect to probability of detection curve parameters

    International Nuclear Information System (INIS)

    Garza, J.; Millwater, H.

    2012-01-01

    A methodology has been developed and demonstrated that can be used to compute the sensitivity of the probability-of-failure (POF) with respect to the parameters of inspection processes that are simulated using probability of detection (POD) curves. The formulation is such that the probabilistic sensitivities can be obtained at negligible cost using sampling methods by reusing the samples used to compute the POF. As a result, the methodology can be implemented for negligible cost in a post-processing non-intrusive manner thereby facilitating implementation with existing or commercial codes. The formulation is generic and not limited to any specific random variables, fracture mechanics formulation, or any specific POD curve as long as the POD is modeled parametrically. Sensitivity estimates for the cases of different POD curves at multiple inspections, and the same POD curves at multiple inspections have been derived. Several numerical examples are presented and show excellent agreement with finite difference estimates with significant computational savings. - Highlights: ► Sensitivity of the probability-of-failure with respect to the probability-of-detection curve. ► The sensitivities are computed with negligible cost using Monte Carlo sampling. ► The change in the POF due to a change in the POD curve parameters can be easily estimated.

  4. Optimal sample size for probability of detection curves

    International Nuclear Information System (INIS)

    Annis, Charles; Gandossi, Luca; Martin, Oliver

    2013-01-01

    Highlights: • We investigate sample size requirements to develop probability of detection curves. • We develop simulations to determine effective inspection target sizes, number and distribution. • We summarize these findings and provide guidelines for the NDE practitioner. -- Abstract: The use of probability of detection curves to quantify the reliability of non-destructive examination (NDE) systems is common in the aeronautical industry, but relatively less so in the nuclear industry, at least in European countries. Due to the nature of the components being inspected, sample sizes tend to be much lower. This makes the manufacture of test pieces with representative flaws, in numbers sufficient to draw statistical conclusions on the reliability of the NDT system under investigation, quite costly. The European Network for Inspection and Qualification (ENIQ) has developed an inspection qualification methodology, referred to as the ENIQ Methodology. It has become widely used in many European countries and provides assurance on the reliability of NDE systems, but only qualitatively. The need to quantify the output of inspection qualification has become more important as structural reliability modelling and quantitative risk-informed in-service inspection methodologies become more widely used. A measure of the NDE reliability is necessary to quantify risk reduction after inspection, and probability of detection (POD) curves provide such a metric. The Joint Research Centre, Petten, The Netherlands supported ENIQ by investigating the question of the sample size required to determine a reliable POD curve. As mentioned earlier, manufacturing of test pieces with defects that are typically found in nuclear power plants (NPPs) is usually quite expensive. Thus there is a tendency to reduce sample sizes, which in turn increases the uncertainty associated with the resulting POD curve. The main question in conjunction with POD curves is the appropriate sample size.
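
    To make the sample-size question concrete, the sketch below repeatedly simulates a hit/miss POD study of a given size, fits a log-normal POD curve by maximum likelihood, and reports how widely the estimated a90 (the flaw size detected with 90% probability) scatters. The "true" POD parameters, the flaw-size range and the repetition counts are invented and do not reproduce the ENIQ/JRC analysis.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(2)
true_a50, true_sigma = 1.0, 0.4          # hypothetical "true" POD parameters

def pod(a, a50, sigma):
    # Log-normal (probit in ln a) POD model.
    return norm.cdf((np.log(a) - np.log(a50)) / sigma)

def a90_scatter(n_flaws, n_rep=100):
    """Simulate n_rep hit/miss POD studies with n_flaws targets each and return
    the 5th/95th percentiles of the estimated a90."""
    a90s = []
    for _ in range(n_rep):
        a = rng.uniform(0.2, 3.0, n_flaws)                 # flaw sizes in test pieces
        hit = rng.random(n_flaws) < pod(a, true_a50, true_sigma)
        def nll(p):
            q = np.clip(pod(a, np.exp(p[0]), abs(p[1]) + 1e-6), 1e-9, 1 - 1e-9)
            return -np.sum(hit * np.log(q) + (1 - hit) * np.log(1 - q))
        res = minimize(nll, x0=[0.0, 0.5], method="Nelder-Mead")
        a50_hat, sigma_hat = np.exp(res.x[0]), abs(res.x[1])
        a90s.append(np.exp(np.log(a50_hat) + norm.ppf(0.9) * sigma_hat))
    return np.percentile(a90s, [5, 95])

for n in (30, 60, 120):
    lo, hi = a90_scatter(n)
    print(f"n = {n:3d} flaws: 90% of estimated a90 values fall in [{lo:.2f}, {hi:.2f}]")
```

    Shrinking the number of flawed targets widens the interval, which is the increase in POD-curve uncertainty the abstract refers to.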

  5. Calculation of magnetization curves and probability distribution for monoclinic and uniaxial systems

    International Nuclear Information System (INIS)

    Sobh, Hala A.; Aly, Samy H.; Yehia, Sherif

    2013-01-01

    We present the application of a simple classical statistical mechanics-based model to selected monoclinic and hexagonal model systems. In this model, we treat the magnetization as a classical vector whose angular orientation is dictated by the laws of equilibrium classical statistical mechanics. We calculate, for these anisotropic systems, the magnetization curves, energy landscapes and probability distributions for different sets of relevant parameters and for magnetic fields of different strengths and directions. Our results demonstrate a correlation between the most probable orientation of the magnetization vector, the system's parameters, and the external magnetic field. -- Highlights: ► We calculate magnetization curves and probability angular distribution of the magnetization. ► The magnetization curves are consistent with probability results for the studied systems. ► Monoclinic and hexagonal systems behave differently due to their different anisotropies.

  6. Three-generation neutrino oscillations in curved spacetime

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Yu-Hao, E-mail: yhzhang1994@gmail.com; Li, Xue-Qian, E-mail: lixq@nankai.edu.cn

    2016-10-15

    The three-generation MSW effect in curved spacetime is studied, and a brief discussion of the gravitational correction to the neutrino self-energy is given. The modified mixing parameters and corresponding conversion probabilities of neutrinos after traveling through celestial objects of constant density are obtained. The method to distinguish between the normal hierarchy and the inverted hierarchy is discussed in this framework. Due to the gravitational redshift of energy, in some extreme situations the resonance energy of neutrinos might be shifted noticeably, and the gravitational effect on the self-energy of the neutrino becomes significant in the vicinity of spacetime singularities.

  7. On a framework for generating PoD curves assisted by numerical simulations

    Energy Technology Data Exchange (ETDEWEB)

    Subair, S. Mohamed, E-mail: prajagopal@iitm.ac.in; Agrawal, Shweta, E-mail: prajagopal@iitm.ac.in; Balasubramaniam, Krishnan, E-mail: prajagopal@iitm.ac.in; Rajagopal, Prabhu, E-mail: prajagopal@iitm.ac.in [Indian Institute of Technology Madras, Department of Mechanical Engineering, Chennai, T.N. (India); Kumar, Anish; Rao, Purnachandra B.; Tamanna, Jayakumar [Indira Gandhi Centre for Atomic Research, Metallurgy and Materials Group, Kalpakkam, T.N. (India)

    2015-03-31

    The Probability of Detection (PoD) curve method has emerged as an important tool for the assessment of the performance of NDE techniques, a topic of particular interest to the nuclear industry where inspection qualification is very important. The conventional experimental means of generating PoD curves, though, can be expensive, requiring large data sets (covering defects and test conditions), and equipment and operator time. Several methods of achieving faster estimates for PoD curves using physics-based modelling have been developed to address this problem. Numerical modelling techniques are also attractive, especially given the ever-increasing computational power available to scientists today. Here we develop procedures for obtaining PoD curves, assisted by numerical simulation and based on Bayesian statistics. Numerical simulations are performed using Finite Element analysis for factors that are assumed to be independent, random and normally distributed. PoD curves so generated are compared with experiments on austenitic stainless steel (SS) plates with artificially created notches. We examine issues affecting the PoD curve generation process including codes, standards, distribution of defect parameters and the choice of the noise threshold. We also study the assumption of normal distribution for signal response parameters and consider strategies for dealing with data that may be too complex or sparse to justify this assumption. These topics are addressed and illustrated through the example case of generation of PoD curves for pulse-echo ultrasonic inspection of vertical surface-breaking cracks in SS plates.

  8. Image scaling curve generation

    NARCIS (Netherlands)

    2012-01-01

    The present invention relates to a method of generating an image scaling curve, where local saliency is detected in a received image. The detected local saliency is then accumulated in the first direction. A final scaling curve is derived from the detected local saliency and the image is then scaled in accordance with the derived scaling curve.

  9. Image scaling curve generation.

    NARCIS (Netherlands)

    2011-01-01

    The present invention relates to a method of generating an image scaling curve, where local saliency is detected in a received image. The detected local saliency is then accumulated in the first direction. A final scaling curve is derived from the detected local saliency and the image is then scaled in accordance with the derived scaling curve.

  10. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models.

  11. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications.

  12. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models.

  13. Optimal Sample Size for Probability of Detection Curves

    International Nuclear Information System (INIS)

    Annis, Charles; Gandossi, Luca; Martin, Oliver

    2012-01-01

    The use of Probability of Detection (POD) curves to quantify NDT reliability is common in the aeronautical industry, but relatively less so in the nuclear industry. The European Network for Inspection Qualification's (ENIQ) Inspection Qualification Methodology is based on the concept of Technical Justification, a document assembling all the evidence to assure that the NDT system in focus is indeed capable of finding the flaws for which it was designed. This methodology has become widely used in many countries, but the assurance it provides is usually of a qualitative nature. The need to quantify the output of inspection qualification has become more important, especially as structural reliability modelling and quantitative risk-informed in-service inspection methodologies become more widely used. To credit the inspections in structural reliability evaluations, a measure of the NDT reliability is necessary. A POD curve provides such a metric. In 2010, ENIQ developed a technical report on POD curves, reviewing the statistical models used to quantify inspection reliability. Further work was subsequently carried out to investigate the issue of optimal sample size for deriving a POD curve, so that adequate guidance could be given to the practitioners of inspection reliability. Manufacturing of test pieces with cracks that are representative of real defects found in nuclear power plants (NPP) can be very expensive. Thus there is a tendency to reduce sample sizes and in turn reduce the conservatism associated with the POD curve derived. Not much guidance on the correct sample size can be found in the published literature, where often qualitative statements are given with no further justification. The aim of this paper is to summarise the findings of such work. (author)

  14. Technical report. The application of probability-generating functions to linear-quadratic radiation survival curves.

    Science.gov (United States)

    Kendal, W S

    2000-04-01

    To illustrate how probability-generating functions (PGFs) can be employed to derive a simple probabilistic model for clonogenic survival after exposure to ionizing irradiation. Both repairable and irreparable radiation damage to DNA were assumed to occur by independent (Poisson) processes, at intensities proportional to the irradiation dose. Also, repairable damage was assumed to be either repaired or further (lethally) injured according to a third (Bernoulli) process, with the probability of lethal conversion being directly proportional to dose. Using the algebra of PGFs, these three processes were combined to yield a composite PGF that described the distribution of lethal DNA lesions in irradiated cells. The composite PGF characterized a Poisson distribution with mean αD + βD², where D was the dose and α and β were radiobiological constants. This distribution yielded the conventional linear-quadratic survival equation. To test the composite model, the derived distribution was used to predict the frequencies of multiple chromosomal aberrations in irradiated human lymphocytes. The predictions agreed well with observation. This probabilistic model was consistent with single-hit mechanisms, but it was not consistent with binary misrepair mechanisms. A stochastic model for radiation survival has been constructed from elementary PGFs that exactly yields the linear-quadratic relationship. This approach can be used to investigate other simple probabilistic survival models.
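
    A compact sketch of the PGF algebra the abstract describes, written with generic proportionality constants (this reconstruction follows the abstract's stated assumptions, not the paper's exact notation): irreparable lesions are Poisson with mean proportional to dose D, repairable lesions are Poisson with mean proportional to D, and each repairable lesion is lethally converted by a Bernoulli process whose probability is also proportional to D.

```latex
% PGF of a Poisson(\lambda) count: G(s) = e^{\lambda(s-1)};
% Bernoulli(p) thinning of a count with PGF G_N: G(s) = G_N(1-p+ps).
\begin{align*}
G_{\mathrm{irreparable}}(s) &= \exp\{aD(s-1)\},\\
G_{\mathrm{repairable\to lethal}}(s) &= \exp\{cD\,[(1-pD+pDs)-1]\} = \exp\{cpD^{2}(s-1)\},\\
G_{\mathrm{lethal}}(s) &= G_{\mathrm{irreparable}}(s)\,G_{\mathrm{repairable\to lethal}}(s)
  = \exp\{(\alpha D+\beta D^{2})(s-1)\}, \qquad \alpha=a,\ \beta=cp,\\
S(D) &= \Pr(\text{no lethal lesion}) = G_{\mathrm{lethal}}(0) = \exp\{-(\alpha D+\beta D^{2})\}.
\end{align*}
```

    The last line is the conventional linear-quadratic survival equation referred to in the abstract.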

  15. Algorithm for Automatic Generation of Curved and Compound Twills

    Institute of Scientific and Technical Information of China (English)

    WANG Mei-zhen; WANG Fu-mei; WANG Shan-yuan

    2005-01-01

    A new algorithm using matrix left-shift functions for the rapid generation of curved and compound twills is introduced in this paper. A matrix model for the generation of regular, curved and compound twill structures is established, and its computational simulation is elaborated. Examples of applying the algorithm to the simulation and automatic generation of curved and compound twills in fabric CAD are given.
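
    A minimal sketch of the matrix left-shift idea as read from the abstract (an illustrative interpretation, not the paper's code): a base interlacing row is cyclically shifted row by row; a constant shift yields a regular twill, while a varying shift sequence bends the twill line into a curved twill.

```python
import numpy as np

def twill_base(up=2, down=1):
    # One repeat of an (up/down) twill expressed as a 0/1 interlacing row.
    return np.array([1] * up + [0] * down)

def curved_twill(n_picks, shifts):
    """Build a weave matrix by cyclically left-shifting the base row.
    A constant shift gives a regular twill; a varying shift sequence bends the
    twill line, giving a curved (or, by combining rows, compound) twill."""
    base = twill_base()
    offset, rows = 0, []
    for i in range(n_picks):
        offset = (offset + shifts[i % len(shifts)]) % len(base)
        rows.append(np.roll(base, -offset))
    return np.vstack(rows)

print(curved_twill(8, shifts=[1, 1, 2, 2, 1, 1, 0, 0]))
```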

  16. Considerations of "Combined Probability of Injury" in the next-generation USA frontal NCAP.

    Science.gov (United States)

    Laituri, Tony R; Henry, Scott; Sullivan, Kaye; Nutt, Marvin

    2010-08-01

    The numerical basis for assigning star ratings in the next-generation USA New Car Assessment Program (NCAP) for frontal impacts was assessed. That basis, the Combined Probability of Injury, or CPI, is the probability of an occupant sustaining an injury to any of the specified body regions. For an NCAP test, a CPI value is computed by (a) using risk curves to convert body-region responses from a test dummy into body-region risks and (b) using a theoretical, overarching CPI equation to convert those separate body-region risks into a single CPI value. Though the general concept of applying a CPI equation to assign star ratings has existed since 1994, there will be numerous changes to the 2011 frontal NCAP: there will be two additional body regions (n = 4 vs. 2), the injury probabilities will be evaluated for lower-severity (more likely) injury levels, and some of the occupant responses will change. These changes could yield more disperse CPIs that could yield more disperse ratings. However, the reasons for this increased dispersion should be consistent with real-world findings. Related assessments were the topic of this two-part study, focused on drivers. In Part 1, the CPI equation was assessed without applying risk curves. Specifically, field injury probabilities for the four body regions were used as inputs to the CPI equation, and the resulting equation-produced CPIs were compared with the field CPIs. In Part 2, subject to analyses of test dummy responses from recent NCAP tests, the effect of risk curve choice on CPIs was assessed. Specifically, dispersion statistics were compared for CPIs based on various underlying risk curves applied to data from 2001-2005 model year vehicles (n = 183). From Part 1, the theoretical CPI equation for four body regions demonstrated acceptable fidelity when provided field injury rates (R² = 0.92), with the equation-based CPIs being approximately 12 percent lower than those of ideal correlation. From Part 2, the 2011 NCAP protocol

  17. The S-curve for forecasting waste generation in construction projects.

    Science.gov (United States)

    Lu, Weisheng; Peng, Yi; Chen, Xi; Skitmore, Martin; Zhang, Xiaoling

    2016-10-01

    Forecasting construction waste generation is the yardstick of any effort by policy-makers, researchers, practitioners and the like to manage construction and demolition (C&D) waste. This paper develops and tests an S-curve model to indicate cumulative waste generation as a project progresses. Using 37,148 disposal records generated from 138 building projects in Hong Kong in four consecutive years from January 2011 to June 2015, a wide range of potential S-curve models are examined, and as a result, the formula that best fits the historical data set is found. The S-curve model is then further linked to project characteristics using artificial neural networks (ANNs) so that it can be used to forecast waste generation in future construction projects. It was found that, among the S-curve models, the cumulative logistic distribution best fits the historical data. Meanwhile, contract sum, location, public-private nature, and duration can be used to forecast construction waste generation. The study provides contractors not only with an S-curve model to forecast overall waste generation before a project commences, but also with a detailed baseline to benchmark and manage waste during the course of construction. The major contribution of this paper is to the body of knowledge in the field of construction waste generation forecasting. By examining it with an S-curve model, the study elevates construction waste management to a level equivalent to project cost management, where the S-curve model has long been accepted as a standard tool. Copyright © 2016 Elsevier Ltd. All rights reserved.
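
    The kind of cumulative logistic S-curve fit described can be sketched as follows. The single-parameter logistic form and the data points are invented for illustration; they do not reproduce the paper's fitted model or its ANN extension linking the curve to project characteristics.

```python
import numpy as np
from scipy.optimize import curve_fit

def s_curve(t, k):
    """Cumulative logistic S-curve: fraction of total waste generated by
    normalised project time t in [0, 1]; k controls the steepness."""
    raw = 1.0 / (1.0 + np.exp(-k * (t - 0.5)))
    lo, hi = 1.0 / (1.0 + np.exp(0.5 * k)), 1.0 / (1.0 + np.exp(-0.5 * k))
    return (raw - lo) / (hi - lo)          # rescaled so the curve runs 0 -> 1

# Hypothetical cumulative waste records for one project: (fraction of contract
# duration, fraction of total waste). The real study fits 37,148 disposal records.
t_obs = np.array([0.10, 0.25, 0.40, 0.55, 0.70, 0.85, 1.00])
w_obs = np.array([0.02, 0.10, 0.30, 0.55, 0.78, 0.93, 1.00])

(k_hat,), _ = curve_fit(s_curve, t_obs, w_obs, p0=[8.0])
print(f"fitted steepness k = {k_hat:.2f}")
print("predicted waste fraction at mid-project:", round(float(s_curve(0.5, k_hat)), 3))
```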

  18. ON THE TOPOLOGY OF MECHANISMS DESIGNED FOR CURVES GENERATION

    Directory of Open Access Journals (Sweden)

    MEREUTA Elena

    2008-07-01

    The paper presents some mechanisms used for generating simple or complex curves. The mechanisms are shown in different positions, and for some special curves demonstrations of their generation are given.

  19. Probability- and curve-based fractal reconstruction on 2D DEM terrain profile

    International Nuclear Information System (INIS)

    Lai, F.-J.; Huang, Y.M.

    2009-01-01

    Data compression and reconstruction play important roles in information science and engineering. As part of this, image compression and reconstruction, which mainly deal with reducing image data sets for storage or transmission and restoring them with the least loss, remain topics deserving a great deal of attention. In this paper we propose a new scheme, compared with the well-known Improved Douglas-Peucker (IDP) method, to extract characteristic or feature points of a two-dimensional digital elevation model (2D DEM) terrain profile in order to compress the data set. For reconstruction using fractal interpolation, we propose a probability-based method to speed up the fractal interpolation execution to a rate as high as three or even nine times that of the regular approach. In addition, a curve-based method is proposed in the study to determine the vertical scaling factor, which strongly affects the generation of the interpolated data points, so as to significantly improve the reconstruction performance. Finally, an evaluation is made to show the advantage of employing the proposed new method to extract characteristic points in association with our novel fractal interpolation scheme.

  20. Curvature Entropy for Curved Profile Generation

    Directory of Open Access Journals (Sweden)

    Koichiro Sato

    2012-03-01

    In a curved surface design, the overall shape features that emerge from combinations of shape elements are important. However, controlling the features of the overall shape in curved profiles is difficult using conventional microscopic shape information such as dimension. Herein two types of macroscopic shape information, curvature entropy and quadrature curvature entropy, quantitatively represent the features of the overall shape. The curvature entropy is calculated by the curvature distribution, and represents the complexity of a shape (one of the overall shape features). The quadrature curvature entropy is an improvement of the curvature entropy by introducing a Markov process to evaluate the continuity of a curvature and to approximate human cognition of the shape. Additionally, a shape generation method using a genetic algorithm as a calculator and the entropy as a shape generation index is presented. Finally, the applicability of the proposed method is demonstrated using the side view of an automobile as a design example.

  1. Bandwidth increasing mechanism by introducing a curve fixture to the cantilever generator

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Weiqun, E-mail: weiqunliu@home.swjtu.edu.cn; Liu, Congzhi; Ren, Bingyu; Zhu, Qiao; Hu, Guangdi [School of Mechanical Engineering, Southwest Jiaotong University, 610031 Chengdu (China); Yang, Weiqing [School of Materials Science and Engineering, Southwest Jiaotong University, 610031 Chengdu (China)

    2016-07-25

    A nonlinear wideband generator architecture by clamping the cantilever beam generator with a curve fixture is proposed. Devices with different nonlinear stiffness can be obtained by properly choosing the fixture curve according to the design requirements. Three available generator types are presented and discussed for polynomial curves. Experimental investigations show that the proposed mechanism effectively extends the operation bandwidth with good power performance. Especially, the simplicity and easy feasibility allow the mechanism to be widely applied for vibration generators in different scales and environments.

  2. Research of Cubic Bezier Curve NC Interpolation Signal Generator

    Directory of Open Access Journals (Sweden)

    Shijun Ji

    2014-08-01

    Interpolation technology is the core of the computer numerical control (CNC) system, and the precision and stability of the interpolation algorithm directly affect the machining precision and speed of the CNC system. Most existing numerical control interpolation technology can only achieve circular-arc, linear or parabolic interpolation. For the numerical control (NC) machining of parts with complicated surfaces, a mathematical model must be established and the curved-line and curved-surface outlines of the parts generated; the generated outlines must then be discretized into a large number of straight-line or arc segments for processing, which creates complex programs and large amounts of code and inevitably introduces approximation error. All these factors affect the machining accuracy, surface roughness and machining efficiency. The stepless interpolation of a cubic Bezier curve controlled by an analog signal is studied in this paper: the tool motion trajectory of the Bezier curve can be planned directly in the CNC system by adjusting control points, and these data are then fed to the control motor, which completes the precise feeding of the Bezier curve. This method extends the trajectory-control capability of CNC from simple linear and circular arcs to complex engineered curves, and it provides a new and economical way to machine curved-surface parts with high quality and efficiency.

  3. Automatic generation and analysis of solar cell IV curves

    Science.gov (United States)

    Kraft, Steven M.; Jones, Jason C.

    2014-06-03

    A photovoltaic system includes multiple strings of solar panels and a device presenting a DC load to the strings of solar panels. Output currents of the strings of solar panels may be sensed and provided to a computer that generates current-voltage (IV) curves of the strings of solar panels. Output voltages of the string of solar panels may be sensed at the string or at the device presenting the DC load. The DC load may be varied. Output currents of the strings of solar panels responsive to the variation of the DC load are sensed to generate IV curves of the strings of solar panels. IV curves may be compared and analyzed to evaluate performance of and detect problems with a string of solar panels.
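
    A toy sketch of how an I-V curve can be traced by sweeping the operating point of a string. The single-diode parameters below are invented for illustration; the system in this record varies a DC load and measures currents and voltages rather than evaluating a model.

```python
import numpy as np

def string_current(v, i_ph=8.5, i_0=1e-9, n_cells=600, vt=0.025):
    # Single-diode approximation of a series string (illustrative parameters only).
    return i_ph - i_0 * (np.exp(v / (n_cells * vt)) - 1.0)

# Emulate varying the DC load: sweep string-voltage operating points, record (V, I).
v = np.linspace(0.0, 340.0, 200)
i = np.maximum(string_current(v), 0.0)   # clip beyond open circuit
p = v * i
print(f"short-circuit current ~ {i[0]:.1f} A, "
      f"max power ~ {p.max():.0f} W at {v[p.argmax()]:.0f} V")
# Comparing such curves across strings (reduced current plateau, shifted knee,
# lower fill factor) is the kind of analysis used to flag an under-performing string.
```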

  4. Statistics about elliptic curves over finite prime fields

    OpenAIRE

    Gekeler, Ernst-Ulrich

    2006-01-01

    We derive formulas for the probabilities of various properties (cyclicity, squarefreeness, generation by random points) of the point groups of randomly chosen elliptic curves over random prime fields.

  5. Generating prior probabilities for classifiers of brain tumours using belief networks

    Directory of Open Access Journals (Sweden)

    Arvanitis Theodoros N

    2007-09-01

    Background: Numerous methods for classifying brain tumours based on magnetic resonance spectra and imaging have been presented in the last 15 years. Generally, these methods use supervised machine learning to develop a classifier from a database of cases for which the diagnosis is already known. However, little has been published on developing classifiers based on mixed modalities, e.g. combining imaging information with spectroscopy. In this work a method of generating probabilities of tumour class from anatomical location is presented. Methods: The method of "belief networks" is introduced as a means of generating probabilities that a tumour is any given type. The belief networks are constructed using a database of paediatric tumour cases consisting of data collected over five decades; the problems associated with using this data are discussed. To verify the usefulness of the networks, an application of the method is presented in which prior probabilities were generated and combined with a classification of tumours based solely on MRS data. Results: Belief networks were constructed from a database of over 1300 cases. These can be used to generate a probability that a tumour is any given type. Networks are presented for astrocytoma grades I and II, astrocytoma grades III and IV, ependymoma, pineoblastoma, primitive neuroectodermal tumour (PNET), germinoma, medulloblastoma, craniopharyngioma and a group representing rare tumours, "other". Using the network to generate prior probabilities for classification improves the accuracy when compared with generating prior probabilities based on class prevalence. Conclusion: Bayesian belief networks are a simple way of using discrete clinical information to generate probabilities usable in classification. The belief network method can be robust to incomplete datasets. Inclusion of a priori knowledge is an effective way of improving classification of brain tumours by non-invasive methods.
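
    A much-reduced sketch of the idea of turning anatomical location into prior class probabilities and combining them with a spectroscopy-based classifier. The counts, class names and likelihood values are invented, and the real method uses belief networks rather than the single conditional table used here.

```python
# Hypothetical counts of paediatric tumour types by anatomical location
# (invented numbers; the study builds belief networks from >1300 cases).
counts = {
    "posterior fossa": {"medulloblastoma": 120, "ependymoma": 40, "astrocytoma I-II": 60},
    "supratentorial":  {"medulloblastoma": 5,   "ependymoma": 15, "astrocytoma I-II": 90},
}

def location_prior(location, alpha=1.0):
    # Prior P(class | location) from counts, with Laplace smoothing alpha.
    c = counts[location]
    total = sum(c.values()) + alpha * len(c)
    return {k: (v + alpha) / total for k, v in c.items()}

def posterior(location, mrs_likelihood):
    # Combine the location-based prior with class likelihoods from an MRS classifier.
    prior = location_prior(location)
    unnorm = {k: prior[k] * mrs_likelihood.get(k, 1e-9) for k in prior}
    z = sum(unnorm.values())
    return {k: round(v / z, 3) for k, v in unnorm.items()}

# e.g. an MRS classifier that weakly favours ependymoma:
print(posterior("posterior fossa",
                {"medulloblastoma": 0.30, "ependymoma": 0.45, "astrocytoma I-II": 0.25}))
```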

  6. A generative Bezier curve model for surf-zone tracking in coastal image sequences

    CSIR Research Space (South Africa)

    Burke, Michael G

    2017-09-01

    This work introduces a generative Bezier curve model suitable for surf-zone curve tracking in coastal image sequences. The model combines an adaptive curve parametrised by control points governed by local random walks with a global sinusoidal motion...

  7. The strategy curve. A method for representing and interpreting generator bidding strategies

    International Nuclear Information System (INIS)

    Lucas, N.; Taylor, P.

    1995-01-01

    The pool is the novel trading arrangement at the heart of the privatized electricity market in England and Wales. This central role in the new system makes it crucial that it is seen to function efficiently. Unfortunately, it is governed by a set of complex rules, which leads to a lack of transparency, and this makes monitoring of its operation difficult. This paper seeks to provide a method for illuminating one aspect of the pool, that of generator bidding behaviour. We introduce the concept of a strategy curve, which is a concise device for representing generator bidding strategies. This curve has the appealing characteristic of directly revealing any deviation in the bid price of a genset from the costs of generating electricity. After a brief discussion about what constitutes price and cost in this context we present a number of strategy curves for different days and provide some interpretation of their form, based in part on our earlier work with game theory. (author)

  8. PLANAR MECHANISMS USED FOR GENERATING CURVE LINE TRANSLATION MOTION

    Directory of Open Access Journals (Sweden)

    Ovidiu ANTONESCU

    2015-05-01

    The curve line translation motion can be generated, in the particular form of circular translation, through mono-mobile mechanisms with articulated links of simple parallelogram type (with a fixed side) or through a transmission with a toothed belt and a fixed wheel. Also, the circular translation can be generated through planar mechanisms with two cylindrical gears and a fixed central wheel. It is mentioned that the two cylindrical gearings of the Fergusson mechanisms are both exterior and interior.

  9. A formalism to generate probability distributions for performance-assessment modeling

    International Nuclear Information System (INIS)

    Kaplan, P.G.

    1990-01-01

    A formalism is presented for generating probability distributions of parameters used in performance-assessment modeling. The formalism is used when data are either sparse or nonexistent. The appropriate distribution is a function of the known or estimated constraints and is chosen to maximize a quantity known as Shannon's informational entropy. The formalism is applied to a parameter used in performance-assessment modeling. The functional form of the model that defines the parameter, data from the actual field site, and natural analog data are analyzed to estimate the constraints. A beta probability distribution of the example parameter is generated after finding four constraints. As an example of how the formalism is applied to the site characterization studies of Yucca Mountain, the distribution is generated for an input parameter in a performance-assessment model currently used to estimate compliance with disposal of high-level radioactive waste in geologic repositories, 10 CFR 60.113(a)(2), commonly known as the ground water travel time criterion. 8 refs., 2 figs
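
    A standard way of writing the formalism, sketched here with generic constraint functions rather than the paper's four site-specific constraints: maximizing Shannon's entropy subject to normalization and expectation constraints gives an exponential-family density, and with logarithmic constraints on a variable bounded on [0, 1] that density is a beta distribution.

```latex
\max_{p}\; -\int p(x)\ln p(x)\,dx
\quad\text{s.t.}\quad \int p(x)\,dx = 1, \qquad \int g_i(x)\,p(x)\,dx = c_i
\;\;\Longrightarrow\;\;
p(x) \,\propto\, \exp\Big\{-\sum_i \lambda_i\, g_i(x)\Big\}.
```

    In particular, with x restricted to [0, 1] and constraints on E[ln x] and E[ln(1 - x)], the solution is proportional to a power of x times a power of (1 - x), i.e. a beta density, which is consistent with the beta distribution obtained in the example of the abstract.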

  10. Steam generator tubes rupture probability estimation - study of the axially cracked tube case

    International Nuclear Information System (INIS)

    Mavko, B.; Cizelj, L.; Roussel, G.

    1992-01-01

    The objective of the present study is to estimate the probability of a steam generator tube rupture due to the unstable propagation of axial through-wall cracks during a hypothetical accident. For this purpose the probabilistic fracture mechanics model was developed taking into account statistical distributions of influencing parameters. A numerical example considering a typical steam generator seriously affected by axial stress corrosion cracking in the roll transition area, is presented; it indicates the change of rupture probability with different assumptions focusing mostly on tubesheet reinforcing factor, crack propagation rate and crack detection probability. 8 refs., 4 figs., 4 tabs

  11. Interaction between daily load demand curve and management of hydro-thermal generation system

    International Nuclear Information System (INIS)

    Granelli, G.; Montagna, M.; Pasini, G.; Innorta, M.; Marannino, P.

    1993-01-01

    The influence that the behaviour of the daily load demand curve has on the management of a hydro-thermal generation system is considered. The aim of this paper is to show the improvements that can be achieved by suitable load management techniques capable of flattening the load demand curve. The analysis is carried out by using a hydro-thermal scheduling program and a thermal unit dynamic dispatch procedure. The possibility of properly re-committing the available thermal units is also taken into account. The economical and technical convenience of shutting down less economical thermal units operating near the lower generation limits is verified. Finally, some considerations are made about the possible use of the thermal generation incremental costs as a tool for planning the end users' kWh prices, even in the short term. The results refer to a system with characteristics similar to those of the Italian one. In determining the daily load demand curves, the characteristics of load demand in Italy as well as in other European countries are taken into account

  12. Updated greenhouse gas and criteria air pollutant emission factors and their probability distribution functions for electricity generating units

    International Nuclear Information System (INIS)

    Cai, H.; Wang, M.; Elgowainy, A.; Han, J.

    2012-01-01

    Greenhouse gas (CO2, CH4 and N2O, hereinafter GHG) and criteria air pollutant (CO, NOx, VOC, PM10, PM2.5 and SOx, hereinafter CAP) emission factors for various types of power plants burning various fuels with different technologies are important upstream parameters for estimating life-cycle emissions associated with alternative vehicle/fuel systems in the transportation sector, especially electric vehicles. The emission factors are typically expressed in grams of GHG or CAP per kWh of electricity generated by a specific power generation technology. This document describes our approach for updating and expanding GHG and CAP emission factors in the GREET (Greenhouse Gases, Regulated Emissions, and Energy Use in Transportation) model developed at Argonne National Laboratory (see Wang 1999 and the GREET website at http://greet.es.anl.gov/main) for various power generation technologies. These GHG and CAP emissions are used to estimate the impact of electricity use by stationary and transportation applications on their fuel-cycle emissions. The electricity generation mixes and the fuel shares attributable to various combustion technologies at the national, regional and state levels are also updated in this document. The energy conversion efficiencies of electric generating units (EGUs) by fuel type and combustion technology are calculated on the basis of the lower heating values of each fuel, to be consistent with the basis used in GREET for transportation fuels. On the basis of the updated GHG and CAP emission factors and energy efficiencies of EGUs, the probability distribution functions (PDFs), which describe the relative likelihood that the emission factors and energy efficiencies, treated as random variables, take on given values, are updated using best-fit statistical curves to characterize the uncertainties associated with GHG and CAP emissions in life-cycle modeling with GREET.

  13. Implications of Cognitive Load for Hypothesis Generation and Probability Judgment.

    Directory of Open Access Journals (Sweden)

    Amber M Sprenger

    2011-06-01

    We tested the predictions of HyGene (Thomas, Dougherty, Sprenger, & Harbison, 2008) that both divided attention at encoding and judgment should affect the degree to which participants’ probability judgments violate the principle of additivity. In two experiments, we showed that divided attention during judgment leads to an increase in subadditivity, suggesting that the comparison process for probability judgments is capacity limited. Contrary to the predictions of HyGene, a third experiment revealed that divided attention during encoding leads to an increase in later probability judgment made under full attention. The effect of divided attention at encoding on judgment was completely mediated by the number of hypotheses participants generated, indicating that limitations in both encoding and recall can cascade into biases in judgments.

  14. Implications of Cognitive Load for Hypothesis Generation and Probability Judgment

    Science.gov (United States)

    Sprenger, Amber M.; Dougherty, Michael R.; Atkins, Sharona M.; Franco-Watkins, Ana M.; Thomas, Rick P.; Lange, Nicholas; Abbs, Brandon

    2011-01-01

    We tested the predictions of HyGene (Thomas et al., 2008) that both divided attention at encoding and judgment should affect the degree to which participants’ probability judgments violate the principle of additivity. In two experiments, we showed that divided attention during judgment leads to an increase in subadditivity, suggesting that the comparison process for probability judgments is capacity limited. Contrary to the predictions of HyGene, a third experiment revealed that divided attention during encoding leads to an increase in later probability judgment made under full attention. The effect of divided attention during encoding on judgment was completely mediated by the number of hypotheses participants generated, indicating that limitations in both encoding and recall can cascade into biases in judgments. PMID:21734897

  15. Impact of distributed generation in the probability density of voltage sags; Impacto da geracao distribuida na densidade de probabilidade de afundamentos de tensao

    Energy Technology Data Exchange (ETDEWEB)

    Ramos, Alessandro Candido Lopes [CELG - Companhia Energetica de Goias, Goiania, GO (Brazil). Generation and Transmission. System' s Operation Center], E-mail: alessandro.clr@celg.com.br; Batista, Adalberto Jose [Universidade Federal de Goias (UFG), Goiania, GO (Brazil)], E-mail: batista@eee.ufg.br; Leborgne, Roberto Chouhy [Universidade Federal do Rio Grande do Sul (UFRS), Porto Alegre, RS (Brazil)], E-mail: rcl@ece.ufrgs.br; Emiliano, Pedro Henrique Mota, E-mail: ph@phph.com.br

    2009-07-01

    This article presents the impact of distributed generation (DG) in studies of voltage sags caused by faults in the electrical system. Short circuits to ground were simulated in 62 lines of 230, 138, 69 and 13.8 kV that are part of the electrical system of the city of Goiania, Goias state. For each fault position, the voltage was monitored at the 380 V bus of an industrial consumer sensitive to such sags. Different levels of DG were then inserted near the consumer, and the short-circuit simulations, with monitoring of the 380 V bus, were performed again. A study using Monte Carlo stochastic simulation (SMC) was carried out to obtain, for each level of DG, the sag probability curves and the probability density of sags by voltage class. From these curves, the average number of sags in each class to which the consumer bus may be subjected annually was obtained. The simulations were performed using the Simultaneous Fault Analysis Program (ANAFAS). In order to overcome the intrinsic limitations of this program's simulation methods and to allow data entry via windows, a computational tool was developed in the Java language. Data processing was done using the MATLAB software.

  16. Generation of large-scale PV scenarios using aggregated power curves

    DEFF Research Database (Denmark)

    Nuño Martinez, Edgar; Cutululis, Nicolaos Antonio

    2017-01-01

    The contribution of solar photovoltaic (PV) power to the generation is becoming more relevant in modern power systems. Therefore, there is a need to model the variability of large-scale PV generation accurately. This paper presents a novel methodology to generate regional PV scenarios based on aggregated power curves rather than traditional physical PV conversion models. Our approach is based on hourly mesoscale reanalysis irradiation data and power measurements and does not require additional variables such as ambient temperature or wind speed. It was used to simulate the PV generation on the German system between 2012 and 2015, showing high levels of correlation with actual measurements (93.02–97.60%) and small deviations from the expected capacity factors (0.02–1.80%). Therefore, we are confident about the ability of the proposed model to accurately generate realistic large-scale PV scenarios.

  17. Computing exact bundle compliance control charts via probability generating functions.

    Science.gov (United States)

    Chen, Binchao; Matis, Timothy; Benneyan, James

    2016-06-01

    Compliance with evidence-based practices, individually and in 'bundles', remains an important focus of healthcare quality improvement for many clinical conditions. The exact probability distribution of composite bundle compliance measures used to develop corresponding control charts and other statistical tests is based on a fairly large convolution whose direct calculation can be computationally prohibitive. Various series expansions and other approximation approaches have been proposed, each with computational and accuracy tradeoffs, especially in the tails. This same probability distribution also arises in other important healthcare applications, such as for risk-adjusted outcomes and bed demand prediction, with the same computational difficulties. As an alternative, we use probability generating functions to rapidly obtain exact results and illustrate the improved accuracy and detection over other methods. Numerical testing across a wide range of applications demonstrates the computational efficiency and accuracy of this approach.
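
    The PGF construction for a composite compliance count can be sketched directly: the number of compliant elements is a sum of independent, non-identical Bernoulli indicators (a Poisson-binomial variable), and multiplying the element PGFs together as polynomials gives its exact distribution. The bundle size and the probabilities below are invented for illustration and are not the paper's data.

```python
import numpy as np

def bundle_pgf_distribution(p):
    """Exact distribution of the number of compliant bundle elements.
    Element i is compliant with probability p[i]; the PGF of the sum is the
    product of the element PGFs (1 - p_i + p_i*s). Multiplying them out as
    polynomials gives coefficients equal to P(X = k) for k = 0..len(p)."""
    coeffs = np.array([1.0])                     # PGF of an empty sum
    for pi in p:
        coeffs = np.convolve(coeffs, [1.0 - pi, pi])
    return coeffs

# Hypothetical 5-element care bundle with unequal compliance probabilities.
p = [0.95, 0.90, 0.85, 0.97, 0.80]
dist = bundle_pgf_distribution(p)
print("exact distribution P(X=k):", np.round(dist, 4))
print("P(fully compliant bundle) =", round(float(dist[-1]), 4))
```

    The same convolution gives the exact tail probabilities needed for control-chart limits, avoiding the series-expansion approximations mentioned in the abstract.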

  18. Numerical generation of boundary-fitted curvilinear coordinate systems for arbitrarily curved surfaces

    International Nuclear Information System (INIS)

    Takagi, T.; Miki, K.; Chen, B.C.J.; Sha, W.T.

    1985-01-01

    A new method is presented for numerically generating boundary-fitted coordinate systems for arbitrarily curved surfaces. The three-dimensional surface has been expressed by functions of two parameters using the geometrical modeling techniques in computer graphics. This leads to new quasi-one- and two-dimensional elliptic partial differential equations for coordinate transformation. Since the equations involve the derivatives of the surface expressions, the grids generated by the equations are distributed on the surface according to its slope and curvature. A computer program GRID-CS based on the method was developed and applied to a surface of the second order, a torus and a surface of a primary containment vessel for a nuclear reactor. These applications confirm that GRID-CS is a convenient and efficient tool for grid generation on arbitrarily curved surfaces.

  19. Estimating probable flaw distributions in PWR steam generator tubes

    International Nuclear Information System (INIS)

    Gorman, J.A.; Turner, A.P.L.

    1997-01-01

    This paper describes methods for estimating the number and size distributions of flaws of various types in PWR steam generator tubes. These estimates are needed when calculating the probable primary to secondary leakage through steam generator tubes under postulated accidents such as severe core accidents and steam line breaks. The paper describes methods for two types of predictions: (1) the numbers of tubes with detectable flaws of various types as a function of time, and (2) the distributions in size of these flaws. Results are provided for hypothetical severely affected, moderately affected and lightly affected units. Discussion is provided regarding uncertainties and assumptions in the data and analyses

  20. Generation, combination and extension of random set approximations to coherent lower and upper probabilities

    International Nuclear Information System (INIS)

    Hall, Jim W.; Lawry, Jonathan

    2004-01-01

    Random set theory provides a convenient mechanism for representing uncertain knowledge including probabilistic and set-based information, and extending it through a function. This paper focuses upon the situation when the available information is in terms of coherent lower and upper probabilities, which are encountered, for example, when a probability distribution is specified by interval parameters. We propose an Iterative Rescaling Method (IRM) for constructing a random set with corresponding belief and plausibility measures that are a close outer approximation to the lower and upper probabilities. The approach is compared with the discrete approximation method of Williamson and Downs (sometimes referred to as the p-box), which generates a closer approximation to lower and upper cumulative probability distributions but in most cases a less accurate approximation to the lower and upper probabilities on the remainder of the power set. Four combination methods are compared by application to example random sets generated using the IRM

  1. A neural network driving curve generation method for the heavy-haul train

    Directory of Open Access Journals (Sweden)

    Youneng Huang

    2016-05-01

    The heavy-haul train has a series of characteristics, such as the locomotive traction properties, the longer length of train, and the nonlinear train pipe pressure during train braking. When the train is running on a continuous long and steep downgrade railway line, the safety of the train is ensured by cycle braking, which puts high demands on the driving skills of the driver. In this article, a driving curve generation method for the heavy-haul train based on a neural network is proposed. First, in order to describe the nonlinear characteristics of train braking, the neural network model is constructed and trained by practical driving data. In the neural network model, various nonlinear neurons are interconnected to work for information processing and transmission. The target value of train braking pressure reduction and release time is achieved by modeling the braking process. The equation of train motion is computed to obtain the driving curve. Finally, in four typical operation scenarios, comparing the curve data generated by the method with corresponding practical data of the Shuohuang heavy-haul railway line, the results show that the method is effective.

  2. The relative impact of sizing errors on steam generator tube failure probability

    International Nuclear Information System (INIS)

    Cizelj, L.; Dvorsek, T.

    1998-01-01

    The Outside Diameter Stress Corrosion Cracking (ODSCC) at tube support plates is currently the major degradation mechanism affecting the steam generator tubes made of Inconel 600. This caused development and licensing of degradation specific maintenance approaches, which addressed two main failure modes of the degraded piping: tube rupture; and excessive leakage through degraded tubes. A methodology aiming at assessing the efficiency of a given set of possible maintenance approaches has already been proposed by the authors. It pointed out better performance of the degradation specific over generic approaches in (1) lower probability of single and multiple steam generator tube rupture (SGTR), (2) lower estimated accidental leak rates and (3) less tubes plugged. A sensitivity analysis was also performed pointing out the relative contributions of uncertain input parameters to the tube rupture probabilities. The dominant contribution was assigned to the uncertainties inherent to the regression models used to correlate the defect size and tube burst pressure. The uncertainties, which can be estimated from the in-service inspections, are further analysed in this paper. The defect growth was found to have significant and to some extent unrealistic impact on the probability of single tube rupture. Since the defect growth estimates were based on the past inspection records they strongly depend on the sizing errors. Therefore, an attempt was made to filter out the sizing errors and to arrive at more realistic estimates of the defect growth. The impact of different assumptions regarding sizing errors on the tube rupture probability was studied using a realistic numerical example. The data used is obtained from a series of inspection results from Krsko NPP with 2 Westinghouse D-4 steam generators. The results obtained are considered useful in safety assessment and maintenance of affected steam generators. (author)

  3. Updated greenhouse gas and criteria air pollutant emission factors and their probability distribution functions for electricity generating units

    Energy Technology Data Exchange (ETDEWEB)

    Cai, H.; Wang, M.; Elgowainy, A.; Han, J. (Energy Systems)

    2012-07-06

    Greenhouse gas (CO2, CH4 and N2O, hereinafter GHG) and criteria air pollutant (CO, NOx, VOC, PM10, PM2.5 and SOx, hereinafter CAP) emission factors for various types of power plants burning various fuels with different technologies are important upstream parameters for estimating life-cycle emissions associated with alternative vehicle/fuel systems in the transportation sector, especially electric vehicles. The emission factors are typically expressed in grams of GHG or CAP per kWh of electricity generated by a specific power generation technology. This document describes our approach for updating and expanding GHG and CAP emission factors in the GREET (Greenhouse Gases, Regulated Emissions, and Energy Use in Transportation) model developed at Argonne National Laboratory (see Wang 1999 and the GREET website at http://greet.es.anl.gov/main) for various power generation technologies. These GHG and CAP emissions are used to estimate the impact of electricity use by stationary and transportation applications on their fuel-cycle emissions. The electricity generation mixes and the fuel shares attributable to various combustion technologies at the national, regional and state levels are also updated in this document. The energy conversion efficiencies of electric generating units (EGUs) by fuel type and combustion technology are calculated on the basis of the lower heating values of each fuel, to be consistent with the basis used in GREET for transportation fuels. On the basis of the updated GHG and CAP emission factors and energy efficiencies of EGUs, the probability distribution functions (PDFs), which describe the relative likelihood that the emission factors and energy efficiencies, treated as random variables, take on given values, are updated using best-fit statistical curves to characterize the uncertainties associated with GHG and CAP emissions in life-cycle modeling with GREET.

  4. Spline Trajectory Algorithm Development: Bezier Curve Control Point Generation for UAVs

    Science.gov (United States)

    Howell, Lauren R.; Allen, B. Danette

    2016-01-01

    A greater need for sophisticated autonomous piloting systems has arisen in direct correlation with the ubiquity of Unmanned Aerial Vehicle (UAV) technology. Whether surveying unknown or unexplored areas of the world, collecting scientific data from regions that humans are typically incapable of entering, locating lost or wanted persons, or delivering emergency supplies, an unmanned vehicle moving in close proximity to people and other vehicles should fly smoothly and predictably. The mathematical application of spline interpolation can play an important role in autopilots' on-board trajectory planning. Spline interpolation allows for the connection of three-dimensional Euclidean space coordinates through a continuous set of smooth curves. This paper explores the motivation, application, and methodology used to compute the spline control points, which shape the curves in such a way that the autopilot trajectory is able to meet vehicle-dynamics limitations. The spline algorithms developed to generate these curves supply autopilots with the information necessary to compute vehicle paths through a set of coordinate waypoints.
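
    A minimal de Casteljau evaluation of one Bezier segment, the building block such trajectory planning relies on. The 3-D control points are invented, and the paper's control-point generation logic (shaping curves to meet vehicle-dynamics limits) is not reproduced here.

```python
import numpy as np

def de_casteljau(control_points, t):
    # Evaluate a Bezier curve at parameter t by repeated linear interpolation.
    pts = np.asarray(control_points, dtype=float)
    while len(pts) > 1:
        pts = (1.0 - t) * pts[:-1] + t * pts[1:]
    return pts[0]

# Hypothetical cubic segment between two waypoints (illustrative control points only).
ctrl = [(0, 0, 100), (40, 10, 110), (80, 60, 120), (120, 80, 120)]
path = np.array([de_casteljau(ctrl, t) for t in np.linspace(0.0, 1.0, 11)])
print(path[:3])   # first few sampled points along the smooth trajectory
```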

  5. Practical Constraint K-Segment Principal Curve Algorithms for Generating Railway GPS Digital Map

    Directory of Open Access Journals (Sweden)

    Dewang Chen

    2013-01-01

    Full Text Available In order to obtain a decent trade-off between the low-cost, low-accuracy Global Positioning System (GPS) receivers and the requirements of high-precision digital maps for modern railways, using the concept of constraint K-segment principal curves (CKPCS) and expert knowledge on railways, we propose three practical CKPCS generation algorithms with reduced computational complexity, and therefore more suitable for engineering applications. The three algorithms are named ALLopt, MPMopt, and DCopt, in which ALLopt exploits global optimization and MPMopt and DCopt apply local optimization with different initial solutions. We compare the three practical algorithms according to their performance on average projection error, stability, and the fitness for simple and complex simulated trajectories with noisy data. It is found that ALLopt only works well for simple curves and small data sets. The other two algorithms work better for complex curves and large data sets. Moreover, MPMopt runs faster than DCopt, but DCopt can work better for some curves with cross points. The three algorithms are also applied in generating GPS digital maps for two railway GPS data sets measured on the Qinghai-Tibet Railway (QTR). Results similar to those for the synthetic data are obtained. Because the trajectory of a railway is relatively simple and straight, we conclude that MPMopt works best, considering both the speed of computation and the quality of the generated CKPCS. MPMopt can be used to obtain some key points to represent a large amount of GPS data. Hence, it can greatly reduce the data storage requirements and increase the positioning speed for real-time digital map applications.

  6. An investigation on vulnerability assessment of steel structures with thin steel shear wall through development of fragility curves

    OpenAIRE

    Mohsen Gerami; Saeed Ghaffari; Amir Mahdi Heidari Tafreshi

    2017-01-01

    Fragility curves play an important role in damage assessment of buildings. The probability of damage to a structure under seismic events can be investigated once the aforementioned curves are generated. In the current research, 360 time history analyses were carried out on structures of 3, 10 and 20 stories, and fragility curves were subsequently developed. The curves are based on two indices: inter-story drift and the equivalent strip axial strain of the shear wall. T...

  7. CURVES AND AESTHETIC SURFACES GENERATED BY THE R-R-RTR MECHANISM

    Directory of Open Access Journals (Sweden)

    Liliana LUCA

    2013-05-01

    Full Text Available We consider a mechanism having two driving elements with revolving movements and an RTR dyad with elements of null length; the aesthetic tracks of a point on a rod are determined for various linear movement laws of the driving elements. The generated curves are revolved around the x and y axes, and aesthetic surfaces result.

  8. Some possible causes and probability of leakages in LMFBR steam generators

    International Nuclear Information System (INIS)

    Bolt, P.R.

    1984-01-01

    Relevant operational experience with steam generators for process and conventional plant and thermal and fast reactors is reviewed. Possible causes of water/steam leakages into sodium/gas are identified and data is given on the conditions necessary for failure, leakage probability and type of leakage path. (author)

  9. Usefulness of antigen-specific IgE probability curves derived from the 3gAllergy assay in diagnosing egg, cow's milk, and wheat allergies

    Directory of Open Access Journals (Sweden)

    Sakura Sato

    2017-04-01

    Conclusions: Measurements of sIgE against egg, milk, and wheat as determined by 3gAllergy may be used as a tool to facilitate the diagnosis of food allergy in subjects with suspected food allergies. However, these probability curves should not be applied interchangeably between different assays.

  10. Next-Generation Intensity-Duration-Frequency Curves for Hydrologic Design in Snow-Dominated Environments

    Science.gov (United States)

    Yan, Hongxiang; Sun, Ning; Wigmosta, Mark; Skaggs, Richard; Hou, Zhangshuan; Leung, Ruby

    2018-02-01

    There is a renewed focus on the design of infrastructure resilient to extreme hydrometeorological events. While precipitation-based intensity-duration-frequency (IDF) curves are commonly used as part of infrastructure design, a large percentage of peak runoff events in snow-dominated regions are caused by snowmelt, particularly during rain-on-snow (ROS) events. In these regions, precipitation-based IDF curves may lead to substantial overestimation/underestimation of design basis events and subsequent overdesign/underdesign of infrastructure. To overcome this deficiency, we proposed next-generation IDF (NG-IDF) curves, which characterize the actual water reaching the land surface. We compared NG-IDF curves to standard precipitation-based IDF curves for estimates of extreme events at 376 Snowpack Telemetry (SNOTEL) stations across the western United States that each had at least 30 years of high-quality records. We found standard precipitation-based IDF curves at 45% of the stations were subject to underdesign, many with significant underestimation of 100 year extreme events, for which the precipitation-based IDF curves can underestimate water potentially available for runoff by as much as 125% due to snowmelt and ROS events. The regions with the greatest potential for underdesign were in the Pacific Northwest, the Sierra Nevada Mountains, and the Middle and Southern Rockies. We also found the potential for overdesign at 20% of the stations, primarily in the Middle Rockies and Arizona mountains. These results demonstrate the need to consider snow processes in the development of IDF curves, and they suggest use of the more robust NG-IDF curves for hydrologic design in snow-dominated environments.

  11. Statistically generated weighted curve fit of residual functions for modal analysis of structures

    Science.gov (United States)

    Bookout, P. S.

    1995-01-01

    A statistically generated weighting function for a second-order polynomial curve fit of residual functions has been developed. The residual flexibility test method, from which a residual function is generated, is a procedure for modal testing large structures in an external constraint-free environment to measure the effects of higher order modes and interface stiffness. This test method is applicable to structures with distinct degree-of-freedom interfaces to other system components. A theoretical residual function in the displacement/force domain has the characteristics of a relatively flat line in the lower frequencies and a slight upward curvature in the higher frequency range. In the test residual function, the above-mentioned characteristics can be seen in the data, but due to the present limitations in the modal parameter evaluation (natural frequencies and mode shapes) of test data, the residual function has regions of ragged data. A second order polynomial curve fit is required to obtain the residual flexibility term. A weighting function of the data is generated by examining the variances between neighboring data points. From a weighted second-order polynomial curve fit, an accurate residual flexibility value can be obtained. The residual flexibility value and free-free modes from testing are used to improve a mathematical model of the structure. The residual flexibility modal test method is applied to a straight beam with a trunnion appendage and a space shuttle payload pallet simulator.
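
    A minimal sketch of a variance-weighted second-order polynomial fit of a residual function is given below; the local-variance weighting window and the synthetic data are assumptions, not the report's exact scheme.

```python
import numpy as np

def weighted_quadratic_fit(freq, residual, half_window=3):
    """Second-order polynomial fit weighted by the inverse local variance of neighboring points,
    so that ragged regions of the residual function contribute less to the fit."""
    residual = np.asarray(residual, dtype=float)
    local_var = np.array([np.var(residual[max(0, i - half_window):i + half_window + 1])
                          for i in range(len(residual))])
    sigma = np.sqrt(local_var) + 1e-12
    coeffs = np.polyfit(freq, residual, deg=2, w=1.0 / sigma)  # weights ~ 1/sigma
    return np.poly1d(coeffs)

# Example with synthetic residual-function data
freq = np.linspace(5.0, 50.0, 100)
residual = 1e-6 + 2e-10 * freq**2 + np.random.default_rng(0).normal(0.0, 5e-8, freq.size)
print(weighted_quadratic_fit(freq, residual))
```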

  12. Curvature Entropy for Curved Profile Generation

    OpenAIRE

    Ujiie, Yoshiki; Kato, Takeo; Sato, Koichiro; Matsuoka, Yoshiyuki

    2012-01-01

    In a curved surface design, the overall shape features that emerge from combinations of shape elements are important. However, controlling the features of the overall shape in curved profiles is difficult using conventional microscopic shape information such as dimension. Herein two types of macroscopic shape information, curvature entropy and quadrature curvature entropy, quantitatively represent the features of the overall shape. The curvature entropy is calculated by the curvature distribu...

  13. A note on families of fragility curves

    International Nuclear Information System (INIS)

    Kaplan, S.; Bier, V.M.; Bley, D.C.

    1989-01-01

    In the quantitative assessment of seismic risk, uncertainty in the fragility of a structural component is usually expressed by putting forth a family of fragility curves, with probability serving as the parameter of the family. Commonly, a lognormal shape is used both for the individual curves and for the expression of uncertainty over the family. A so-called composite single curve can also be drawn and used for purposes of approximation. This composite curve is often regarded as equivalent to the mean curve of the family. The equality seems intuitively reasonable, but according to the authors has never been proven. The paper presented proves this equivalence hypothesis mathematically. Moreover, the authors show that this equivalence hypothesis between fragility curves is itself equivalent to an identity property of the standard normal probability curve. Thus, in the course of proving the fragility curve hypothesis, the authors have also proved a rather obscure, but interesting and perhaps previously unrecognized, property of the standard normal curve
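
    A numerical illustration of the equivalence discussed above is sketched here under assumed lognormal parameters (the median capacity A_m and log-standard deviations beta_R and beta_U are made up): the mean over a lognormal family of fragility curves is compared with the composite curve built from the combined log-standard deviation.

```python
import numpy as np
from scipy.stats import norm

A_m, beta_R, beta_U = 0.87, 0.25, 0.35      # hypothetical median capacity (g) and log-std devs
a = np.linspace(0.05, 3.0, 60)              # peak ground acceleration grid

# Family of fragility curves: epistemic uncertainty shifts the median capacity lognormally
rng = np.random.default_rng(0)
eps_U = np.exp(beta_U * rng.standard_normal(20000))
family_mean = norm.cdf(np.log(a[:, None] / (A_m * eps_U)) / beta_R).mean(axis=1)

# Composite curve: a single lognormal with the combined log-standard deviation
composite = norm.cdf(np.log(a / A_m) / np.hypot(beta_R, beta_U))

print(np.max(np.abs(family_mean - composite)))  # close to zero, illustrating the equivalence
```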

  14. Next-Generation Intensity-Duration-Frequency Curves for Hydrologic Design in Snow-Dominated Environments

    Energy Technology Data Exchange (ETDEWEB)

    Yan, Hongxiang [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland Washington United States; Sun, Ning [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland Washington United States; Wigmosta, Mark [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland Washington United States; Distinguished Faculty Fellow, Department of Civil and Environmental Engineering, University of Washington, Seattle Washington United States; Skaggs, Richard [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland Washington United States; Hou, Zhangshuan [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland Washington United States; Leung, Ruby [Earth and Biological Sciences Directorate, Pacific Northwest National Laboratory, Richland Washington United States

    2018-02-01

    There is a renewed focus on the design of infrastructure resilient to extreme hydrometeorological events. While precipitation-based intensity-duration-frequency (IDF) curves are commonly used as part of infrastructure design, a large percentage of peak runoff events in snow-dominated regions are caused by snowmelt, particularly during rain-on-snow (ROS) events. In these regions, precipitation-based IDF curves may lead to substantial over-/under-estimation of design basis events and subsequent over-/under-design of infrastructure. To overcome this deficiency, we proposed next-generation IDF (NG-IDF) curves, which characterize the actual water reaching the land surface. We compared NG-IDF curves to standard precipitation-based IDF curves for estimates of extreme events at 376 Snowpack Telemetry (SNOTEL) stations across the western United States that each had at least 30 years of high-quality records. We found standard precipitation-based IDF curves at 45% of the stations were subject to under-design, many with significant under-estimation of 100-year extreme events, for which the precipitation-based IDF curves can underestimate water potentially available for runoff by as much as 125% due to snowmelt and ROS events. The regions with the greatest potential for under-design were in the Pacific Northwest, the Sierra Nevada Mountains, and the Middle and Southern Rockies. We also found the potential for over-design at 20% of the stations, primarily in the Middle Rockies and Arizona mountains. These results demonstrate the need to consider snow processes in the development of IDF curves, and they suggest use of the more robust NG-IDF curves for hydrologic design in snow-dominated environments.

  15. A method for the rapid generation of nonsequential light-response curves of chlorophyll fluorescence.

    Science.gov (United States)

    Serôdio, João; Ezequiel, João; Frommlet, Jörg; Laviale, Martin; Lavaud, Johann

    2013-11-01

    Light-response curves (LCs) of chlorophyll fluorescence are widely used in plant physiology. Most commonly, LCs are generated sequentially, exposing the same sample to a sequence of distinct actinic light intensities. These measurements are not independent, as the response to each new light level is affected by the light exposure history experienced during previous steps of the LC, an issue particularly relevant in the case of the popular rapid light curves. In this work, we demonstrate the proof of concept of a new method for the rapid generation of LCs from nonsequential, temporally independent fluorescence measurements. The method is based on the combined use of sample illumination with digitally controlled, spatially separated beams of actinic light and a fluorescence imaging system. It allows the generation of a whole LC, including a large number of actinic light steps and adequate replication, within the time required for a single measurement (and therefore named "single-pulse light curve"). This method is illustrated for the generation of LCs of photosystem II quantum yield, relative electron transport rate, and nonphotochemical quenching on intact plant leaves exhibiting distinct light responses. This approach makes it also possible to easily characterize the integrated dynamic light response of a sample by combining the measurement of LCs (actinic light intensity is varied while measuring time is fixed) with induction/relaxation kinetics (actinic light intensity is fixed and the response is followed over time), describing both how the response to light varies with time and how the response kinetics varies with light intensity.

  16. Solar updraft power generator with radial and curved vanes

    Science.gov (United States)

    Hafizh, Hadyan; Hamsan, Raziff; Zamri, Aidil Azlan Ahmad; Keprawi, Mohamad Fairuz Mohamad; Shirato, Hiromichi

    2018-02-01

    Solar radiation is the largest source of energy available on earth, and the solar updraft power generator (SUPG) is a renewable energy facility capable of harnessing its abundant power. Unlike conventional wind turbines that harness natural wind in the atmosphere and often encounter intermittency or even complete cut-off from airflow, the SUPG creates artificial wind as a result of solar-induced convective flows. However, the SUPG has an inherently low total efficiency due to the conversion of thermal energy into pressure energy. Acknowledging the low efficiency and considering its potential as a renewable energy facility, the current work aims to increase the total efficiency by installing a series of guide walls inside the collector. Two types of guide walls were used, i.e. radial and curved vanes. The results with curved vanes showed that the updraft velocity is higher compared to that without vanes. Improvements of about 18% and 64% were attained in updraft velocity and mechanical power, respectively. Furthermore, it was observed that the role of the radial vane configuration was more to produce a smooth updraft velocity profile than to increase the total efficiency.

  17. Canonical generators of the cohomology of moduli of parabolic bundles on curves

    International Nuclear Information System (INIS)

    Biswas, I.; Raghavendra, N.

    1994-11-01

    We determine generators of the rational cohomology algebras of moduli spaces of parabolic vector bundles on a curve, under some 'primality' conditions on the parabolic datum. These generators are canonical in a precise sense. Our results are new even for usual vector bundles (i.e., vector bundles without parabolic structure) whose rank is greater than 2 and is coprime to the degree; in this case, they are generalizations of a theorem of Newstead on the moduli of vector bundles of rank 2 and odd degree. (author). 11 refs

  18. Fortran code for generating random probability vectors, unitaries, and quantum states

    Directory of Open Access Journals (Sweden)

    Jonas eMaziero

    2016-03-01

    Full Text Available The usefulness of generating random configurations is recognized in many areas of knowledge. Fortran was born for scientific computing and has been one of the main programming languages in this area since then. Several ongoing projects aimed at its improvement indicate that it will keep this status in the decades to come. In this article, we describe Fortran codes produced, or organized, for the generation of the following random objects: numbers, probability vectors, unitary matrices, and quantum state vectors and density matrices. Some matrix functions are also included and may be of independent interest.
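
    The article's codes are in Fortran; purely for illustration, a Python sketch of analogous generators (uniform probability vectors, Haar-random unitaries, and Hilbert-Schmidt random density matrices) is given below.

```python
import numpy as np

rng = np.random.default_rng(42)

def random_probability_vector(d):
    """Uniformly distributed (flat Dirichlet) probability vector of dimension d."""
    x = -np.log(rng.random(d))                      # i.i.d. exponential variates
    return x / x.sum()

def random_unitary(d):
    """Haar-distributed unitary from the QR decomposition of a complex Ginibre matrix."""
    g = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))
    q, r = np.linalg.qr(g)
    return q * (np.diag(r) / np.abs(np.diag(r)))    # phase correction for Haar measure

def random_density_matrix(d):
    """Random mixed state drawn from the Hilbert-Schmidt ensemble."""
    g = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))
    rho = g @ g.conj().T
    return rho / np.trace(rho)
```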

  19. Using Prediction Markets to Generate Probability Density Functions for Climate Change Risk Assessment

    Science.gov (United States)

    Boslough, M.

    2011-12-01

    Climate-related uncertainty is traditionally presented as an error bar, but it is becoming increasingly common to express it in terms of a probability density function (PDF). PDFs are a necessary component of probabilistic risk assessments, for which simple "best estimate" values are insufficient. Many groups have generated PDFs for climate sensitivity using a variety of methods. These PDFs are broadly consistent, but vary significantly in their details. One axiom of the verification and validation community is, "codes don't make predictions, people make predictions." This is a statement of the fact that subject domain experts generate results using assumptions within a range of epistemic uncertainty and interpret them according to their expert opinion. Different experts with different methods will arrive at different PDFs. For effective decision support, a single consensus PDF would be useful. We suggest that market methods can be used to aggregate an ensemble of opinions into a single distribution that expresses the consensus. Prediction markets have been shown to be highly successful at forecasting the outcome of events ranging from elections to box office returns. In prediction markets, traders can take a position on whether some future event will or will not occur. These positions are expressed as contracts that are traded in a double-action market that aggregates price, which can be interpreted as a consensus probability that the event will take place. Since climate sensitivity cannot directly be measured, it cannot be predicted. However, the changes in global mean surface temperature are a direct consequence of climate sensitivity, changes in forcing, and internal variability. Viable prediction markets require an undisputed event outcome on a specific date. Climate-related markets exist on Intrade.com, an online trading exchange. One such contract is titled "Global Temperature Anomaly for Dec 2011 to be greater than 0.65 Degrees C." Settlement is based

  20. Sharp Bounds by Probability-Generating Functions and Variable Drift

    DEFF Research Database (Denmark)

    Doerr, Benjamin; Fouz, Mahmoud; Witt, Carsten

    2011-01-01

    We introduce to the runtime analysis of evolutionary algorithms two powerful techniques: probability-generating functions and variable drift analysis. They are shown to provide a clean framework for proving sharp upper and lower bounds. As an application, we improve the results of Doerr et al. (GECCO 2010) in several respects. First, the upper bound on the expected running time of the most successful quasirandom evolutionary algorithm for the OneMax function is improved from 1.28 n ln n to 0.982 n ln n, which breaks the barrier of n ln n posed by coupon-collector processes. Compared to the classical...

  1. Application of remote sensing and geographical information system for generation of runoff curve number

    Science.gov (United States)

    Meshram, S. Gajbhiye; Sharma, S. K.; Tignath, S.

    2017-07-01

    A watershed is an ideal unit for planning and management of land and water resources (Gajbhiye et al., IEEE international conference on advances in technology and engineering (ICATE), Bombay, vol 1, issue 9, pp 23-25, 2013a; Gajbhiye et al., Appl Water Sci 4(1):51-61, 2014a; Gajbhiye et al., J Geol Soc India (SCI-IF 0.596) 84(2):192-196, 2014b). This study aims to generate the curve number using remote sensing and a geographical information system (GIS) and to assess the effect of slope on curve number values. The study was carried out in the Kanhaiya Nala watershed located in the Satna district of Madhya Pradesh. The soil map, land use/land cover map, and slope map were generated in a GIS environment. The CN parameter values corresponding to various soil, land cover, and land management conditions were selected from the Natural Resources Conservation Service (NRCS) standard table. The curve number (CN) is an index developed by the NRCS to represent the potential for storm water runoff within a drainage area. The CN for a drainage basin is estimated using a combination of land use, soil, and antecedent soil moisture condition (AMC). In the present study, the effect of slope on CN values was determined. The results showed that the unadjusted CN values are higher than the CN values adjusted for slope. Remote sensing and GIS are very reliable techniques for the preparation of most of the input data required by the SCS curve number model.
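
    For reference, the standard NRCS relation that converts a curve number into direct runoff depth is sketched below (this is the textbook SCS-CN equation, not the paper's slope-adjustment procedure); the rainfall value in the example is arbitrary.

```python
def scs_runoff_depth(rainfall_mm, cn):
    """Direct runoff depth (mm) from the SCS-CN method for a given storm rainfall depth (mm)."""
    s = 25400.0 / cn - 254.0          # potential maximum retention (mm)
    ia = 0.2 * s                      # initial abstraction
    if rainfall_mm <= ia:
        return 0.0
    return (rainfall_mm - ia) ** 2 / (rainfall_mm - ia + s)

# Example: 80 mm storm over a drainage area with CN = 75
print(scs_runoff_depth(80.0, 75))
```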

  2. Lagrangian Curves on Spectral Curves of Monopoles

    International Nuclear Information System (INIS)

    Guilfoyle, Brendan; Khalid, Madeeha; Ramon Mari, Jose J.

    2010-01-01

    We study Lagrangian points on smooth holomorphic curves in TP^1 equipped with a natural neutral Kaehler structure, and prove that they must form real curves. By virtue of the identification of TP^1 with the space LE^3 of oriented affine lines in Euclidean 3-space, these Lagrangian curves give rise to ruled surfaces in E^3, which we prove have zero Gauss curvature. Each ruled surface is shown to be the tangent lines to a curve in E^3, called the edge of regression of the ruled surface. We give an alternative characterization of these curves as the points in E^3 where the number of oriented lines in the complex curve Σ that pass through the point is less than the degree of Σ. We then apply these results to the spectral curves of certain monopoles and construct the ruled surfaces and edges of regression generated by the Lagrangian curves.

  3. Accurate potential energy curves, spectroscopic parameters, transition dipole moments, and transition probabilities of 21 low-lying states of the CO+ cation

    Science.gov (United States)

    Xing, Wei; Shi, Deheng; Zhang, Jicai; Sun, Jinfeng; Zhu, Zunlue

    2018-05-01

    This paper calculates the potential energy curves of 21 Λ-S and 42 Ω states, which arise from the first two dissociation asymptotes of the CO+ cation. The calculations are conducted using the complete active space self-consistent field method, which is followed by the valence internally contracted multireference configuration interaction approach with the Davidson correction. To improve the reliability and accuracy of the potential energy curves, core-valence correlation and scalar relativistic corrections, as well as the extrapolation of potential energies to the complete basis set limit are taken into account. The spectroscopic parameters and vibrational levels are determined. The spin-orbit coupling effect on the spectroscopic parameters and vibrational levels is evaluated. To better study the transition probabilities, the transition dipole moments are computed. The Franck-Condon factors and Einstein coefficients of some emissions are calculated. The radiative lifetimes are determined for a number of vibrational levels of several states. The transitions between different Λ-S states are evaluated. Spectroscopic routines for observing these states are proposed. The spectroscopic parameters, vibrational levels, transition dipole moments, and transition probabilities reported in this paper can be considered to be very reliable and can be used as guidelines for detecting these states in an appropriate spectroscopy experiment, especially for the states that were very difficult to observe or were not detected in previous experiments.

  4. Critical Factors for Inducing Curved Somatosensory Saccades

    Directory of Open Access Journals (Sweden)

    Tamami Nakano

    2011-10-01

    Full Text Available We are able to make a saccade toward a tactile stimulus delivered to one hand, but the trajectories of many saccades curved markedly when the arms were crossed (Groh & Sparks, 2006). However, it remains unknown why some saccades curved and others did not. We therefore examined the critical factors for inducing curved somatosensory saccades. Participants made a saccade as soon as possible from a central fixation point toward a tactile stimulus delivered to one of the two hands, and switched between arms-crossed and arms-uncrossed postures every 6 trials. Trajectories were generally straight when the arms were uncrossed, but all participants made curved saccades when the arms were crossed (12–64%). We found that the probability of curved saccades depended critically on the onset latency: the probability was less than 5% when the latency was larger than 250 ms, but increased up to 70–80% when the onset latency was 160 ms. This relationship was shared across participants. The results suggest that a touch in the arms-crossed posture was always mapped to the wrong hand in the initial phase up to 160 ms, and then remapped to the correct hand during the next 100 ms by some fundamental neural mechanisms shared across participants.

  5. An investigation on vulnerability assessment of steel structures with thin steel shear wall through development of fragility curves

    Directory of Open Access Journals (Sweden)

    Mohsen Gerami

    2017-02-01

    Full Text Available Fragility curves play an important role in damage assessment of buildings. The probability of damage to a structure under seismic events can be investigated once the aforementioned curves are generated. In the current research, 360 time history analyses were carried out on structures of 3, 10 and 20 stories, and fragility curves were subsequently developed. The curves are based on two indices: inter-story drift and the equivalent strip axial strain of the shear wall. Time history analysis is carried out in Perform 3D considering 10 far-field and 10 near-field seismograms. Analysis of the low-rise structures revealed that they are more vulnerable at accelerations lower than 0.8 g in near-field earthquakes because of higher-mode effects. Based on the generated fragility curves, it was observed that the mid- and high-rise structures have more acceptable performance and lower damage levels than the low-rise structures under both near- and far-field seismic hazards.

  6. A Probability Analysis of the Generating Cost for APR1000+

    Energy Technology Data Exchange (ETDEWEB)

    Ha, Gag-Hyeon; Kim, Dae-Hun [KHNP CRI, Daejeon (Korea, Republic of)

    2016-10-15

    The nuclear power plant market is expected to grow rapidly in order to address issues of global warming, cutting CO{sub 2} emissions and securing stable electricity supplies. Under these circumstances, the primary goal of the APR1000+ development is to ensure export competitiveness in developing countries in the Middle East and Southeast Asia. To that end, the APR1000+ (1,000 MWe, generation 3.5) will be developed based on the APR+ (1,500 MWe, generation 3.5). Compared with the OPR1000 (Korean Standard Nuclear Power Plant, generation 2.5), the APR1000+ has many design features such as a 60-year design lifetime, a comprehensive site requirement of 0.3 g seismic design, stability improvements, operability improvements and provisions for severe accidents. In this simulation, the generating cost for the APR1000+ preliminary conceptual design, estimated using a probability method, was shown to be 48.37 ~ 74.22 won/kWh (median value 56.51 won/kWh). That of the OPR1000 was 42.08 ~ 61.77 won/kWh (median value 48.63 won/kWh). The APR1000+ thus has a -16.2% cost advantage (i.e., a 16.2% cost disadvantage) over the OPR1000 nuclear power plant. The main reason for this result is the addition of several safety design features.

  7. Surfactant Effect on the Average Flow Generation Near Curved Interface

    Science.gov (United States)

    Klimenko, Lyudmila; Lyubimov, Dmitry

    2018-02-01

    The present work is devoted to average flow generation near a curved interface with a surfactant adsorbed on the surface layer. The investigation was carried out for a liquid drop embedded in a viscous liquid of different density. The liquid flows inside and outside the drop are generated by small-amplitude, high-frequency vibrations. Surfactant exchange between the drop surface and the surrounding liquid is limited by the process of adsorption-desorption. It was assumed that the surfactant is soluble in the surrounding liquid but not in the liquid drop. The surrounding liquid and the liquid in the drop are considered incompressible. The balance of normal and shear viscous stresses at the interface is imposed under the condition that the film thickness of the adsorbed surfactant is negligible. The problem is solved under the assumption that the shape of the drop remains spherically symmetric in the presence of the adsorbed surfactant. The effective boundary conditions for the tangential velocity jump and the shear stress jump describing the above generation have been obtained by the method of matched asymptotic expansions. The conditions under which the drop surface can be considered quasi-solid are determined. It is shown that when the surfactant has a significant effect on the surface tension, the dominant generation mechanism under vibrations is the Schlichting mechanism.

  8. Left passage probability of Schramm-Loewner Evolution

    Science.gov (United States)

    Najafi, M. N.

    2013-06-01

    SLE(κ,ρ⃗) is a variant of Schramm-Loewner Evolution (SLE) which describes curves that are not conformally invariant but are self-similar due to the presence of some other preferred points on the boundary. In this paper we study the left passage probability (LPP) of SLE(κ,ρ⃗) within a field-theoretical framework and find the differential equation governing this probability. This equation is solved numerically for the special case κ = 2 and h_ρ = 0, in which h_ρ is the conformal weight of the boundary condition changing (bcc) operator. This case may be related to the loop-erased random walk (LERW) and the Abelian sandpile model (ASM) with a sink on its boundary. For the curve which starts from ξ_0 and is conditioned by a change of boundary conditions at x_0, we find that this probability depends significantly on the factor x_0 - ξ_0. We also present the perturbative general solution for large x_0. As a prototype, we apply this formalism to SLE(κ, κ-6), which governs curves that start from and end on the real axis.

  9. Transient stability probability evaluation of power system incorporating with wind farm and SMES

    DEFF Research Database (Denmark)

    Fang, Jiakun; Miao, Lu; Wen, Jinyu

    2013-01-01

    Large scale renewable power generation brings great challenges to the power system operation and stabilization. Energy storage is one of the most important technologies to face the challenges. This paper proposes a method for transient stability probability evaluation of power system with wind farm...... and SMES. Firstly, a modified 11-bus test system with both wind farm and SMES has been implemented. The wind farm is represented as a doubly fed induction generator (DFIG). Then a stochastic-based approach to evaluate the probabilistic transient stability index of the power system is presented. Uncertain...... the probability indices. With the proposed method based on Monte-Carlo simulation and bisection method, system stability is "measured". Quantitative relationship of penetration level, SMES coil size and system stability is established. Considering the stability versus coil size to be the production curve...

  10. Directed Design of Experiments for Validating Probability of Detection Capability of NDE Systems (DOEPOD)

    Science.gov (United States)

    Generazio, Edward R.

    2015-01-01

    Directed Design of Experiments for Validating Probability of Detection Capability of NDE Systems (DOEPOD), Manual v.1.2. The capability of an inspection system is established by applying various methodologies to determine the probability of detection (POD). One accepted metric of an adequate inspection system is that there is 95% confidence that the POD is greater than 90% (90/95 POD). Design of experiments for validating probability of detection capability of nondestructive evaluation (NDE) systems (DOEPOD) is a methodology implemented via software to serve as a diagnostic tool providing detailed analysis of POD test data, guidance on establishing data distribution requirements, and resolution of test issues. DOEPOD relies on the direct observation of detection occurrences. The DOEPOD capability has been developed to provide an efficient and accurate methodology that yields observed POD and confidence bounds for both hit/miss and signal-amplitude testing. DOEPOD does not assume prescribed POD logarithmic or similar functions with assumed adequacy over a wide range of flaw sizes and inspection system technologies, so that multi-parameter curve fitting or model optimization approaches to generate a POD curve are not required. DOEPOD applications for supporting inspector qualifications are included.
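
    As a simple illustration of the 90/95 criterion applied to hit/miss data (a generic binomial bound, not the DOEPOD methodology itself), a one-sided Clopper-Pearson lower confidence bound on POD can be computed as follows.

```python
from scipy.stats import beta

def pod_lower_bound(hits, trials, confidence=0.95):
    """One-sided lower Clopper-Pearson confidence bound on the probability of detection."""
    if hits == 0:
        return 0.0
    return beta.ppf(1.0 - confidence, hits, trials - hits + 1)

# Classic example: 29 hits in 29 trials at a given flaw size satisfies 90/95 POD
print(pod_lower_bound(29, 29))          # about 0.902, i.e. > 0.90 with 95% confidence
```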

  11. Simulating Supernova Light Curves

    Energy Technology Data Exchange (ETDEWEB)

    Even, Wesley Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Dolence, Joshua C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-05

    This report discusses supernova light simulations. A brief review of supernovae, basics of supernova light curves, simulation tools used at LANL, and supernova results are included. Further, it happens that many of the same methods used to generate simulated supernova light curves can also be used to model the emission from fireballs generated by explosions in the earth’s atmosphere.

  12. Simulating Supernova Light Curves

    International Nuclear Information System (INIS)

    Even, Wesley Paul; Dolence, Joshua C.

    2016-01-01

    This report discusses supernova light simulations. A brief review of supernovae, basics of supernova light curves, simulation tools used at LANL, and supernova results are included. Further, it happens that many of the same methods used to generate simulated supernova light curves can also be used to model the emission from fireballs generated by explosions in the earth's atmosphere.

  13. Pressure drop-flow rate curves for single-phase steam in Combustion Engineering type steam generator U-tubes during severe accidents

    Energy Technology Data Exchange (ETDEWEB)

    Fynan, Douglas A.; Ahn, Kwang-Il, E-mail: kiahn@kaeri.re.kr

    2016-12-15

    Highlights: • Pressure drop-flow rate curves for superheated steam in U-tubes were generated. • Forward flow of hot steam is favored in the longer and taller U-tubes. • Reverse flow of cold steam is favored in short U-tubes. • Steam generator U-tube bundle geometry and tube diameter are important. • Need for correlation development for natural convention heat transfer coefficient. - Abstract: Characteristic pressure drop-flow rate curves are generated for all row numbers of the OPR1000 steam generators (SGs), representative of Combustion Engineering (CE) type SGs featuring square bend U-tubes. The pressure drop-flow rate curves are applicable to severe accident natural circulations of single-phase superheated steam during high pressure station blackout sequences with failed auxiliary feedwater and dry secondary side which are closely related to the thermally induced steam generator tube rupture event. The pressure drop-flow rate curves which determine the recirculation rate through the SG tubes are dependent on the tube bundle geometry and hydraulic diameter of the tubes. The larger CE type SGs have greater variation of tube length and height as a function of row number with forward flow of steam favored in the longer and taller high row number tubes and reverse flow favored in the short low row number tubes. Friction loss, natural convection heat transfer coefficients, and temperature differentials from the primary to secondary side are dominant parameters affecting the recirculation rate. The need for correlation development for natural convection heat transfer coefficients for external flow over tube bundles currently not modeled in system codes is discussed.

  14. Consistency Results for the ROC Curves of Fused Classifiers

    National Research Council Canada - National Science Library

    Bjerkaas, Kristopher

    2004-01-01

    .... An established performance quantifier is the Receiver Operating Characteristic (ROC) curve, which allows one to view the probability of detection versus the probability of false alarm in one graph...

  15. Statistical inference of the generation probability of T-cell receptors from sequence repertoires.

    Science.gov (United States)

    Murugan, Anand; Mora, Thierry; Walczak, Aleksandra M; Callan, Curtis G

    2012-10-02

    Stochastic rearrangement of germline V-, D-, and J-genes to create variable coding sequence for certain cell surface receptors is at the origin of immune system diversity. This process, known as "VDJ recombination", is implemented via a series of stochastic molecular events involving gene choices and random nucleotide insertions between, and deletions from, genes. We use large sequence repertoires of the variable CDR3 region of human CD4+ T-cell receptor beta chains to infer the statistical properties of these basic biochemical events. Because any given CDR3 sequence can be produced in multiple ways, the probability distribution of hidden recombination events cannot be inferred directly from the observed sequences; we therefore develop a maximum likelihood inference method to achieve this end. To separate the properties of the molecular rearrangement mechanism from the effects of selection, we focus on nonproductive CDR3 sequences in T-cell DNA. We infer the joint distribution of the various generative events that occur when a new T-cell receptor gene is created. We find a rich picture of correlation (and absence thereof), providing insight into the molecular mechanisms involved. The generative event statistics are consistent between individuals, suggesting a universal biochemical process. Our probabilistic model predicts the generation probability of any specific CDR3 sequence by the primitive recombination process, allowing us to quantify the potential diversity of the T-cell repertoire and to understand why some sequences are shared between individuals. We argue that the use of formal statistical inference methods, of the kind presented in this paper, will be essential for quantitative understanding of the generation and evolution of diversity in the adaptive immune system.

  16. Heat transfer and pressure drop characteristics of the tube bank fin heat exchanger with fin punched with flow redistributors and curved triangular vortex generators

    Science.gov (United States)

    Liu, Song; Jin, Hua; Song, KeWei; Wang, LiangChen; Wu, Xiang; Wang, LiangBi

    2017-10-01

    The heat transfer performance of the tube bank fin heat exchanger is limited by the air-side thermal resistance. Thus, enhancing the air-side heat transfer is an effective method to improve the performance of the heat exchanger. A new fin pattern with flow redistributors and curved triangular vortex generators is experimentally studied in this paper. The effects of the flow redistributors located in front of the tube stagnation point and the curved vortex generators located around the tube on the characteristics of heat transfer and pressure drop are discussed in detail. A performance comparison is also carried out between the fins with and without flow redistributors. The experimental results show that the flow redistributors stamped out from the fin in front of the tube stagnation points can decrease the friction factor at the cost of decreasing the heat transfer performance. Whether the combination of the flow redistributors and the curved vortex generators will present a better heat transfer performance depends on the size of the curved vortex generators. As for the studied two sizes of vortex generators, the heat transfer performance is promoted by the flow redistributors for the fin with larger size of vortex generators and the performance is suppressed by the flow redistributors for the fin with smaller vortex generators.

  17. Usefulness of antigen-specific IgE probability curves derived from the 3gAllergy assay in diagnosing egg, cow's milk, and wheat allergies.

    Science.gov (United States)

    Sato, Sakura; Ogura, Kiyotake; Takahashi, Kyohei; Sato, Yasunori; Yanagida, Noriyuki; Ebisawa, Motohiro

    2017-04-01

    Specific IgE (sIgE) antibody detection using the Siemens IMMULITE ® 3gAllergy™ (3gAllergy) assay has not been sufficiently examined for the diagnosis of food allergy. The aim of this study was to evaluate the utility of measuring sIgE levels using the 3gAllergy assay to diagnose allergic reactions to egg, milk, and wheat. This retrospective study was conducted on patients with diagnosed or suspected allergies to egg, milk and wheat. Patients were divided into two groups according to their clinical reactivity to these allergens based on oral food challenge outcomes and/or convincing histories of immediate reaction to causative food(s). The sIgE levels were measured using 3gAllergy and ImmunoCAP. Predicted probability curves were estimated using logistic regression analysis. We analyzed 1561 patients, ages 0-19 y (egg = 436, milk = 499, wheat = 626). The sIgE levels determined using 3gAllergy correlated with those of ImmunoCAP, classifying 355 patients as symptomatic: egg = 149, milk = 123, wheat = 83. 3gAllergy sIgE levels were significantly higher in symptomatic than in asymptomatic patients (P < …) … allergies. However, these probability curves should not be applied interchangeably between different assays.
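
    A minimal sketch of how such predicted probability curves are obtained with logistic regression is shown below; the sIgE titres and challenge outcomes are fabricated for illustration and are not the study's data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical log-transformed sIgE levels (IU/mL) and challenge outcomes (1 = symptomatic)
log_sige = np.log10([0.4, 1.2, 3.5, 7.0, 15.0, 30.0, 60.0, 100.0]).reshape(-1, 1)
outcome = np.array([0, 0, 0, 1, 0, 1, 1, 1])

model = LogisticRegression().fit(log_sige, outcome)

# Predicted probability of a reaction across the assay's measuring range
grid = np.log10(np.logspace(-1, 2.5, 100)).reshape(-1, 1)
probability_curve = model.predict_proba(grid)[:, 1]
```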

  18. Scenario analysis for estimating the learning rate of photovoltaic power generation based on learning curve theory in South Korea

    International Nuclear Information System (INIS)

    Hong, Sungjun; Chung, Yanghon; Woo, Chungwon

    2015-01-01

    South Korea, as the 9th largest energy-consuming country in 2013 and the 7th largest greenhouse gas emitting country in 2011, established ‘Low Carbon Green Growth’ as the national vision in 2008, and is announcing various active energy policies that are set to gain the attention of the world. In this paper, we estimated the decrease in photovoltaic power generation cost in Korea based on learning curve theory. Photovoltaic energy is one of the leading renewable energy sources, and countries all over the world are currently expanding R and D, demonstration and deployment of photovoltaic technology. In order to estimate the learning rate of photovoltaic energy in Korea, both the conventional 1FLC (one-factor learning curve), which considers only the cumulative power generation, and the 2FLC (two-factor learning curve), which also considers R and D investment, were applied. The 1FLC analysis showed that the cost of power generation decreased by 3.1% as the cumulative power generation doubled. The 2FLC analysis showed that the cost decreases by 2.33% every time the cumulative photovoltaic power generation is doubled and by 5.13% every time R and D investment is doubled. Moreover, the effect of R and D investment on photovoltaic technology appeared after around 3 years, and the depreciation rate of R and D investment was around 20%. - Highlights: • We analyze the learning effects of photovoltaic energy technology in Korea. • In order to calculate the learning rate, we use 1FLC (one-factor learning curve) and 2FLC methods, respectively. • 1FLC method considers only the cumulative power generation. • 2FLC method considers both cumulative power generation and knowledge stock. • We analyze a variety of scenarios by time lag and depreciation rate of R and D investment
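
    A sketch of the one- and two-factor learning-curve relations described above follows; the functional form is the standard power law, and back-calculating elasticities from the reported learning rates is an assumption about the paper's parameterization.

```python
import numpy as np

def two_factor_cost(cum_generation, knowledge_stock, c0, a, b):
    """Two-factor learning curve: unit cost falls with cumulative generation and R&D knowledge stock."""
    return c0 * cum_generation ** (-a) * knowledge_stock ** (-b)

def learning_rate(elasticity):
    """Fractional cost reduction per doubling of the explanatory variable."""
    return 1.0 - 2.0 ** (-elasticity)

# Elasticities implied by the reported rates: 2.33% per doubling of generation, 5.13% per doubling of R&D
a = -np.log2(1.0 - 0.0233)
b = -np.log2(1.0 - 0.0513)
print(learning_rate(a), learning_rate(b))   # recovers approximately 0.0233 and 0.0513
```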

  19. On the Generation of Random Ensembles of Qubits and Qutrits Computing Separability Probabilities for Fixed Rank States

    Directory of Open Access Journals (Sweden)

    Khvedelidze Arsen

    2018-01-01

    Full Text Available The generation of random mixed states is discussed, aiming for the computation of probabilistic characteristics of composite finite dimensional quantum systems. In particular, we consider the generation of random Hilbert-Schmidt and Bures ensembles of qubit and qutrit pairs and compute the corresponding probabilities to find a separable state among the states of a fixed rank.
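
    Purely as an illustration of the type of computation described (in Python rather than the authors' code, and for the two-qubit case only), random Hilbert-Schmidt states of a chosen rank can be sampled and tested for separability with the positive-partial-transpose criterion, which is exact for qubit pairs.

```python
import numpy as np

rng = np.random.default_rng(1)

def random_hs_state(dim, rank):
    """Random density matrix of the given rank from the (induced) Hilbert-Schmidt ensemble."""
    g = rng.standard_normal((dim, rank)) + 1j * rng.standard_normal((dim, rank))
    rho = g @ g.conj().T
    return rho / np.trace(rho)

def is_ppt(rho, d_a=2, d_b=2):
    """Peres-Horodecki criterion: the partial transpose is positive semidefinite (exact for 2x2)."""
    r = rho.reshape(d_a, d_b, d_a, d_b).transpose(0, 3, 2, 1).reshape(d_a * d_b, d_a * d_b)
    return np.linalg.eigvalsh(r).min() >= -1e-12

# Estimated separability probability for full-rank two-qubit Hilbert-Schmidt states
samples = (random_hs_state(4, 4) for _ in range(20000))
print(np.mean([is_ppt(s) for s in samples]))
```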

  20. Construction of estimated flow- and load-duration curves for Kentucky using the Water Availability Tool for Environmental Resources (WATER)

    Science.gov (United States)

    Unthank, Michael D.; Newson, Jeremy K.; Williamson, Tanja N.; Nelson, Hugh L.

    2012-01-01

    Flow- and load-duration curves were constructed from the model outputs of the U.S. Geological Survey's Water Availability Tool for Environmental Resources (WATER) application for streams in Kentucky. The WATER application was designed to access multiple geospatial datasets to generate more than 60 years of statistically based streamflow data for Kentucky. The WATER application enables a user to graphically select a site on a stream and generate an estimated hydrograph and flow-duration curve for the watershed upstream of that point. The flow-duration curves are constructed by calculating the exceedance probability of the modeled daily streamflows. User-defined water-quality criteria and (or) sampling results can be loaded into the WATER application to construct load-duration curves that are based on the modeled streamflow results. Estimates of flow and streamflow statistics were derived from TOPographically Based Hydrological MODEL (TOPMODEL) simulations in the WATER application. A modified TOPMODEL code, SDP-TOPMODEL (Sinkhole Drainage Process-TOPMODEL), was used to simulate daily mean discharges over the period of record for 5 karst and 5 non-karst watersheds in Kentucky in order to verify the calibrated model. A statistical evaluation of the model's verification simulations shows that calibration criteria, established by previous WATER application reports, were met, thus ensuring the model's ability to provide acceptably accurate estimates of discharge at gaged and ungaged sites throughout Kentucky. Flow-duration curves are constructed in the WATER application by calculating the exceedance probability of the modeled daily flow values. The flow-duration intervals are expressed as a percentage, with zero corresponding to the highest stream discharge in the streamflow record. Load-duration curves are constructed by applying the loading equation (Load = Flow*Water-quality criterion) at each flow interval.
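
    A minimal sketch of the flow- and load-duration construction described above follows (the plotting position and unit handling are assumptions; the WATER application itself is not reproduced).

```python
import numpy as np

def flow_duration_curve(daily_flows):
    """Exceedance probability (%) versus ranked daily flow; 0% corresponds to the highest flow."""
    flows = np.sort(np.asarray(daily_flows, dtype=float))[::-1]      # descending
    ranks = np.arange(1, flows.size + 1)
    exceedance = 100.0 * ranks / (flows.size + 1)                    # Weibull plotting position
    return exceedance, flows

def load_duration_curve(daily_flows, criterion):
    """Allowable load at each flow interval: Load = Flow * water-quality criterion
    (a unit-conversion factor would be needed for specific flow/concentration units)."""
    exceedance, flows = flow_duration_curve(daily_flows)
    return exceedance, flows * criterion
```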

  1. Prospects for PV: a learning curve analysis

    International Nuclear Information System (INIS)

    Zwaan, Bob van der; Rabi, A.

    2003-01-01

    This article gives an overview of the current state-of-the-art of photovoltaic electricity technology, and addresses its potential for cost reductions over the first few decades of the 21st century. Current PV production cost ranges for single-crystalline silicon, multi-crystalline silicon, amorphous silicon and other thin-film technologies are presented, both in terms of capacity installation and electricity generation. Possible decreases in these costs, as expected according to the learning-curve methodology, are assessed. We also estimate how much PV could gain if external costs (due to environmental and health damage) of energy were internalised, for example by an energy tax. Our conclusions are that (1) mainly due to its high costs, PV electricity is unlikely to play a major role in global energy supply and carbon emissions abatement before 2020, (2) extrapolating learning curves observed in the past, one can expect its costs to decrease significantly over the coming years, so that a considerable PV electricity share world-wide could materialise after 2020, (3) niche-market applications, e.g. using stand-alone systems in remote areas, are crucial for continuing 'the ride along the learning curve', (4) damage costs of conventional (fossil) power sources are considerable, and their internalisation would improve the competitiveness of PV, although probably not enough to close the current cost gap. (author)

  2. A Probability Analysis of the Generating Cost for EU-APR1400 Single Unit

    International Nuclear Information System (INIS)

    Ha, Gak Hyeon; Kim, Sung Hwan

    2014-01-01

    The nuclear power plant market is expected to grow rapidly in order to address issues of global warming, reducing CO2 emissions and securing stable electricity supplies. Under these circumstances, the primary goal of the EU-APR1400 development is to ensure export competitiveness in European countries. To this end, the EU-APR1400 has been developed based on the APR1400 (Advanced Power Reactor, GEN type). The EU-APR1400 adds many advanced design features to its predecessor, as outlined below in Table 1. In this simulation, the generating cost of the EU-APR1400 single unit was determined using the probability cost analysis technique; the generating cost range was shown to be 56.16 ∼ 70.92 won/kWh.

  3. A Probability Analysis of the Generating Cost for EU-APR1400 Single Unit

    Energy Technology Data Exchange (ETDEWEB)

    Ha, Gak Hyeon; Kim, Sung Hwan [KHNP CRI, Seoul (Korea, Republic of)

    2014-10-15

    The nuclear power plant market is expected to grow rapidly in order to address issues of global warming, reducing CO{sub 2} emissions and securing stable electricity supplies. Under these circumstances, the primary goal of the EU-APR1400 development is to ensure export competitiveness in European countries. To this end, the EU-APR1400 has been developed based on the APR1400 (Advanced Power Reactor, GEN type). The EU-APR1400 adds many advanced design features to its predecessor, as outlined below in Table 1. In this simulation, the generating cost of the EU-APR1400 single unit was determined using the probability cost analysis technique; the generating cost range was shown to be 56.16 ∼ 70.92 won/kWh.

  4. Design Curve Generation for 3D SiC Fiber Architecture

    Science.gov (United States)

    Lang, Jerry; Dicarlo, James A.

    2014-01-01

    The design tool provides design curves that offer a simple and quick way to examine multiple factors that can influence the processing and key properties of the preforms and their final SiC-reinforced ceramic composites, without over-committing financial capital to the fabrication of materials. Tool predictions for process and fiber fraction properties have been validated for an HNS 3D preform. The virtualization aspect of the tool will be used to quickly generate solid models with actual fiber paths for finite element evaluation, to predict the mechanical and thermal properties of proposed composites as well as mechanical displacement behavior due to creep and stress relaxation, and to study load-sharing characteristics between constituents for better performance. Tool predictions for the fiber-controlled properties of the SiC/SiC CMC fabricated from the HNS preforms will be evaluated and upgraded based on measurements of these CMCs.

  5. Survival curves for irradiated cells

    International Nuclear Information System (INIS)

    Gibson, D.K.

    1975-01-01

    The subject of the lecture is the probability of survival of biological cells which have been subjected to ionising radiation. The basic mathematical theories of cell survival as a function of radiation dose are developed. A brief comparison with observed survival curves is made. (author)
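
    Two commonly used dose-survival models (given here as generic textbook forms with made-up parameter values, not necessarily the forms developed in the lecture) can be written as follows.

```python
import numpy as np

def lq_survival(dose_gy, alpha=0.15, beta=0.05):
    """Linear-quadratic model: surviving fraction after a single acute dose (Gy)."""
    return np.exp(-(alpha * dose_gy + beta * dose_gy ** 2))

def multitarget_survival(dose_gy, d0=1.5, n=3.0):
    """Multi-target single-hit model with mean lethal dose D0 and extrapolation number n."""
    return 1.0 - (1.0 - np.exp(-np.asarray(dose_gy, dtype=float) / d0)) ** n

doses = np.linspace(0.0, 10.0, 11)
print(lq_survival(doses))
```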

  6. Titration Curves: Fact and Fiction.

    Science.gov (United States)

    Chamberlain, John

    1997-01-01

    Discusses ways in which datalogging equipment can enable titration curves to be measured accurately and how computing power can be used to predict the shape of curves. Highlights include sources of error, use of spreadsheets to generate titration curves, titration of a weak acid with a strong alkali, dibasic acids, weak acid and weak base, and…

  7. Signature Curves Statistics of DNA Supercoils

    OpenAIRE

    Shakiban, Cheri; Lloyd, Peter

    2004-01-01

    In this paper we describe the Euclidean signature curves for two dimensional closed curves in the plane and their generalization to closed space curves. The focus will be on discrete numerical methods for approximating such curves. Further we will apply these numerical methods to plot the signature curves related to three-dimensional simulated DNA supercoils. Our primary focus will be on statistical analysis of the data generated for the signature curves of the supercoils. We will try to esta...

  8. Determination of PV Generator I-V/P-V Characteristic Curves Using a DC-DC Converter Controlled by a Virtual Instrument

    Directory of Open Access Journals (Sweden)

    E. Durán

    2012-01-01

    Full Text Available A versatile measurement system for systematic testing and measurement of the evolution of the I-V characteristic curves of photovoltaic panels or arrays (PV generators is proposed in this paper. The measurement system uses a circuit solution based on DC-DC converters that involves several advantages relative to traditional methods: simple structure, scalability, fast response, and low cost. The measurement of the desired characteristics of PV generators includes high speed of response and high fidelity. The prototype system built is governed by a microcontroller, and experimental results prove the proposed measurement system useful. A virtual instrument (VI was developed for full system control from a computer. The developed system enables monitoring the suitable operation of a PV generator in real time, since it allows comparing its actual curves with those provided by the manufacturer.

  9. Probabilities of Natural Events Occurring at Savannah River Plant

    Energy Technology Data Exchange (ETDEWEB)

    Huang, J.C.

    2001-07-17

    This report documents the comprehensive evaluation of probability models of natural events which are applicable to Savannah River Plant. The probability curves selected for these natural events are recommended to be used by all SRP/SRL safety analysts. This will ensure a consistency in analysis methodology for postulated SAR incidents involving natural phenomena.

  10. Enhanced secondary analysis of survival data: reconstructing the data from published Kaplan-Meier survival curves.

    Science.gov (United States)

    Guyot, Patricia; Ades, A E; Ouwens, Mario J N M; Welton, Nicky J

    2012-02-01

    The statistics usually reported from Randomized Controlled Trials (RCTs) on time-to-event outcomes are the median time to event and the Cox hazard ratio. These do not constitute the sufficient statistics required for meta-analysis or cost-effectiveness analysis, and their use in secondary analyses requires strong assumptions that may not have been adequately tested. In order to enhance the quality of secondary data analyses, we propose a method which derives, from the published Kaplan-Meier survival curves, a close approximation to the original individual patient time-to-event data from which they were generated. We develop an algorithm that maps from digitised curves back to KM data by finding numerical solutions to the inverted KM equations, using, where available, information on the number of events and the numbers at risk. The reproducibility and accuracy of survival probabilities, median survival times and hazard ratios based on reconstructed KM data were assessed by comparing published statistics (survival probabilities, medians and hazard ratios) with statistics based on repeated reconstructions by multiple observers. The validation exercise established that there was no material systematic error and that there was a high degree of reproducibility for all statistics. Accuracy was excellent for survival probabilities and medians; for hazard ratios, reasonable accuracy can only be obtained if at least the numbers at risk or the total number of events are reported. The algorithm is a reliable tool for meta-analysis and cost-effectiveness analyses of RCTs reporting time-to-event data. It is recommended that all RCTs report information on numbers at risk and the total number of events alongside KM curves.
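
    The forward Kaplan-Meier computation that the reconstruction algorithm inverts is sketched below (a generic KM estimator on made-up data, not the authors' inversion code).

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier estimate S(t) from individual patient data
    (times: follow-up time; events: 1 = event observed, 0 = censored)."""
    order = np.argsort(times)
    times, events = np.asarray(times)[order], np.asarray(events)[order]
    at_risk, s, curve = len(times), 1.0, []
    for t, d in zip(times, events):
        if d:                        # the curve steps down only at event times
            s *= 1.0 - 1.0 / at_risk
        curve.append((t, s))
        at_risk -= 1
    return curve

print(kaplan_meier([5, 8, 12, 12, 20, 25], [1, 0, 1, 1, 0, 1]))
```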

  11. Remote sensing used for power curves

    International Nuclear Information System (INIS)

    Wagner, R; Joergensen, H E; Paulsen, U S; Larsen, T J; Antoniou, I; Thesbjerg, L

    2008-01-01

    Power curve measurement for large wind turbines requires taking into account more parameters than only the wind speed at hub height. Based on results from aerodynamic simulations, an equivalent wind speed taking the wind shear into account was defined and found to reduce the power standard deviation in the power curve significantly. Two LiDARs and a SoDAR were used to measure the wind profile in front of a wind turbine; these profiles were used to calculate the equivalent wind speed. The comparison of the power curves obtained with the three instruments to the traditional power curve, obtained using a cup anemometer measurement, confirms the results obtained from the simulations. Using LiDAR profiles reduces the error in power curve measurement when the LiDAR is used as a relative instrument together with a cup anemometer. Results from the SoDAR are less promising, probably because of noisy measurements resulting in distorted profiles.
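
    One common way to define such an equivalent wind speed (a sketch of the general idea in Python, not necessarily the exact definition used by the authors) is to weight the cube of the measured profile by the rotor area swept at each height, so that the kinetic energy flux through the rotor is preserved.

        import numpy as np

        def rotor_equivalent_wind_speed(heights, speeds, hub_height, rotor_diameter):
            """Equivalent wind speed preserving kinetic energy flux over the rotor:
            u_eq = (sum_i u_i**3 * A_i / A) ** (1/3), where A_i is the rotor slice
            area assigned to measurement height i. Sketch of the general idea only."""
            r = rotor_diameter / 2.0
            mid = (np.asarray(heights[:-1]) + np.asarray(heights[1:])) / 2.0
            edges = np.concatenate(([hub_height - r], mid, [hub_height + r]))
            z = np.clip(edges - hub_height, -r, r)
            # Rotor area below a horizontal chord at height z above the hub centre.
            area_below = z * np.sqrt(r**2 - z**2) + r**2 * np.arcsin(z / r) + np.pi * r**2 / 2
            weights = np.diff(area_below) / (np.pi * r**2)
            return float(np.sum(weights * np.asarray(speeds, dtype=float)**3) ** (1.0 / 3.0))

        # Hypothetical LiDAR profile around an 80 m hub with a 90 m rotor
        print(rotor_equivalent_wind_speed([40, 60, 80, 100, 120],
                                          [6.8, 7.5, 8.0, 8.4, 8.7], 80.0, 90.0))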

  12. Linking probabilities of off-lattice self-avoiding polygons and the effects of excluded volume

    International Nuclear Information System (INIS)

    Hirayama, Naomi; Deguchi, Tetsuo; Tsurusaki, Kyoichi

    2009-01-01

    We evaluate numerically the probability of linking, i.e. the probability of a given pair of self-avoiding polygons (SAPs) being entangled and forming a nontrivial link type L. In the simulation we generate pairs of SAPs of N spherical segments of radius r_d such that they have no overlaps among the segments and each of the SAPs has the trivial knot type. We evaluate the probability of a self-avoiding pair of SAPs forming a given link type L for various link types with fixed distance R between the centers of mass of the two SAPs. We define the normalized distance r by r = R/R_g(0_1), where R_g(0_1) denotes the square root of the mean square radius of gyration of a SAP of the trivial knot type 0_1. We introduce formulae expressing the linking probability as a function of normalized distance r, which give good fitting curves with respect to χ² values. We also investigate the dependence of linking probabilities on the excluded-volume parameter r_d and the number of segments, N. Quite interestingly, the graph of linking probability versus normalized distance r shows no N-dependence at a particular value of the excluded-volume parameter, r_d = 0.2.

  13. Investigation of learning and experience curves

    Energy Technology Data Exchange (ETDEWEB)

    Krawiec, F.; Thornton, J.; Edesess, M.

    1980-04-01

    The applicability of learning and experience curves for predicting future costs of solar technologies is assessed, and the major test case is the production economics of heliostats. Alternative methods for estimating cost reductions in systems manufacture are discussed, and procedures for using learning and experience curves to predict costs are outlined. Because adequate production data often do not exist, production histories of analogous products/processes are analyzed and learning and aggregated cost curves for these surrogates estimated. If the surrogate learning curves apply, they can be used to estimate solar technology costs. The steps involved in generating these cost estimates are given. Second-generation glass-steel and inflated-bubble heliostat design concepts, developed by MDAC and GE, respectively, are described; a costing scenario for 25,000 units/yr is detailed; surrogates for cost analysis are chosen; learning and aggregate cost curves are estimated; and aggregate cost curves for the GE and MDAC designs are estimated. However, an approach that combines a neoclassical production function with a learning-by-doing hypothesis is needed to yield a cost relation compatible with the historical learning curve and the traditional cost function of economic theory.
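
    For reference, the single-factor learning (experience) curve used in such analyses takes the form C(x) = C_1 · x^(−b), where x is cumulative production and the learning rate LR = 1 − 2^(−b) is the cost reduction per doubling of output; a minimal sketch (Python, illustrative numbers only, not the report's heliostat estimates) shows how a progress ratio translates into projected unit costs.

        import math

        def unit_cost(cum_units, first_unit_cost, learning_rate):
            """Experience curve C(x) = C1 * x**(-b), with b = -log2(1 - LR): each
            doubling of cumulative production cuts unit cost by the learning rate."""
            b = -math.log2(1.0 - learning_rate)
            return first_unit_cost * cum_units ** (-b)

        # Illustrative only: 20% learning rate, arbitrary first-unit cost
        for x in (1, 1_000, 25_000, 100_000):
            print(x, round(unit_cost(x, first_unit_cost=1000.0, learning_rate=0.20), 1))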

  14. Statistical re-evaluation of the ASME K_IC and K_IR fracture toughness reference curves

    International Nuclear Information System (INIS)

    Wallin, K.

    1999-01-01

    Historically the ASME reference curves have been treated as representing absolute deterministic lower bound curves of fracture toughness. In reality, this is not the case. They represent only deterministic lower bound curves to a specific set of data, which represent a certain probability range. A recently developed statistical lower bound estimation method called the 'master curve', has been proposed as a candidate for a new lower bound reference curve concept. From a regulatory point of view, the master curve is somewhat problematic in that it does not claim to be an absolute deterministic lower bound, but corresponds to a specific theoretical failure probability that can be chosen freely based on application. In order to be able to substitute the old ASME reference curves with lower bound curves based on the master curve concept, the inherent statistical nature (and confidence level) of the ASME reference curves must be revealed. In order to estimate the true inherent level of safety, represented by the reference curves, the original database was re-evaluated with statistical methods and compared to an analysis based on the master curve concept. The analysis reveals that the 5% lower bound master curve has the same inherent degree of safety as originally intended for the K_IC reference curve. Similarly, the 1% lower bound master curve corresponds to the K_IR reference curve. (orig.)
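
    A minimal sketch of the Master curve concept referred to here (Python; the standard ASTM E1921-type formulation is assumed, with T0 the reference temperature): the median toughness follows 30 + 70·exp[0.019(T − T0)] MPa·√m, and a lower-bound curve at cumulative failure probability p follows from the underlying three-parameter Weibull distribution.

        import math

        def master_curve_kjc(T, T0, p=0.05):
            """Fracture toughness (MPa*sqrt(m)) at cumulative failure probability p,
            assuming the standard Master curve form for 1T specimens:
              K0(T)   = 31 + 77 * exp(0.019 * (T - T0))      (Weibull scale)
              K_Jc(p) = 20 + (K0 - 20) * (-ln(1 - p)) ** 0.25
            """
            K0 = 31.0 + 77.0 * math.exp(0.019 * (T - T0))
            return 20.0 + (K0 - 20.0) * (-math.log(1.0 - p)) ** 0.25

        # Illustrative: 5% and 1% lower bounds, 30 degC above a reference temperature T0
        for p in (0.05, 0.01):
            print(p, round(master_curve_kjc(T=-50.0, T0=-80.0, p=p), 1))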

  15. Statistical re-evaluation of the ASME K_IC and K_IR fracture toughness reference curves

    International Nuclear Information System (INIS)

    Wallin, K.; Rintamaa, R.

    1998-01-01

    Historically the ASME reference curves have been treated as representing absolute deterministic lower bound curves of fracture toughness. In reality, this is not the case. They represent only deterministic lower bound curves to a specific set of data, which represent a certain probability range. A recently developed statistical lower bound estimation method called the 'Master curve', has been proposed as a candidate for a new lower bound reference curve concept. From a regulatory point of view, the Master curve is somewhat problematic in that it does not claim to be an absolute deterministic lower bound, but corresponds to a specific theoretical failure probability that can be chosen freely based on application. In order to be able to substitute the old ASME reference curves with lower bound curves based on the master curve concept, the inherent statistical nature (and confidence level) of the ASME reference curves must be revealed. In order to estimate the true inherent level of safety, represented by the reference curves, the original data base was re-evaluated with statistical methods and compared to an analysis based on the master curve concept. The analysis reveals that the 5% lower bound Master curve has the same inherent degree of safety as originally intended for the K_IC reference curve. Similarly, the 1% lower bound Master curve corresponds to the K_IR reference curve. (orig.)

  16. Estimating market probabilities of future interest rate changes

    OpenAIRE

    Hlušek, Martin

    2002-01-01

    The goal of this paper is to estimate the market consensus forecast of future monetary policy development and to quantify the priced-in probability of interest rate changes for different future time horizons. The proposed model uses the current spot money market yield curve and available money market derivative instruments (forward rate agreements, FRAs) and estimates the market probability of interest rate changes up to a 12-month horizon.

  17. Cost development of future technologies for power generation-A study based on experience curves and complementary bottom-up assessments

    International Nuclear Information System (INIS)

    Neij, Lena

    2008-01-01

    Technology foresight studies have become an important tool in identifying realistic ways of reducing the impact of modern energy systems on the climate and the environment. Studies on the future cost development of advanced energy technologies are of special interest. One approach widely adopted for the analysis of future cost is the experience curve approach. The question is, however, how robust this approach is, and which experience curves should be used in energy foresight analysis. This paper presents an analytical framework for the analysis of future cost development of new energy technologies for electricity generation; the analytical framework is based on an assessment of available experience curves, complemented with bottom-up analysis of sources of cost reductions and, for some technologies, judgmental expert assessments of long-term development paths. The results of these three methods agree in most cases, i.e. the cost (price) reductions described by the experience curves match the incremental cost reduction described in the bottom-up analysis and the judgmental expert assessments. For some technologies, the bottom-up analysis confirms large uncertainties in future cost development not captured by the experience curves. Experience curves with a learning rate ranging from 0% to 20% are suggested for the analysis of future cost development

  18. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Basic Notions; Sample Space and Events; Probabilities; Counting Techniques; Independence and Conditional Probability; Independence; Conditioning; The Borel-Cantelli Theorem; Discrete Random Variables; Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables; The Law of Large Numbers; Conditional Expectation; Generating Functions, Branching Processes, Random Walk Revisited; Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk; Markov Chains; Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity; Continuous Random Variables; Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation; Distribution F...

  19. Automated Blazar Light Curves Using Machine Learning

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Spencer James [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]

    2017-07-27

    This presentation describes a problem and methodology pertaining to automated blazar light curves. Studying the optical variability patterns of blazars requires the construction of light curves, and in order to generate the light curves, the data must be filtered before processing to ensure quality.

  20. Tumor control probability after a radiation of animal tumors

    International Nuclear Information System (INIS)

    Urano, Muneyasu; Ando, Koichi; Koike, Sachiko; Nesumi, Naofumi

    1975-01-01

    Tumor control and regrowth probabilities of animal tumors irradiated with a single x-ray dose were determined using a spontaneous C3H mouse mammary carcinoma. The cellular radiation sensitivity of the tumor cells and the tumor control probability of the tumor were examined by the TD_50 and TCD_50 assays, respectively. Tumor growth kinetics were measured by counting the percentage of labelled mitoses and by measuring the growth curve. A mathematical analysis of the tumor control probability was made from these results. The proposed formula accounts for cell population kinetics (a division probability model), cell sensitivity to radiation and the number of tumor cells. (auth.)
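
    The kind of tumor control probability model described here is commonly written in Poisson form; a minimal sketch (Python, with illustrative parameter values that are not those of the study) combines single-dose clonogen survival with the initial clonogen number.

        import math

        def tcp_single_dose(dose, n_clonogens, alpha, beta):
            """Poisson tumor control probability after a single dose:
            TCP = exp(-N * SF), with linear-quadratic survival SF = exp(-a*D - b*D**2).
            Parameters are illustrative only."""
            sf = math.exp(-alpha * dose - beta * dose ** 2)
            return math.exp(-n_clonogens * sf)

        # Dose-response curve for a hypothetical tumor of 1e7 clonogens
        for d in (16, 18, 20, 22, 24):
            print(d, round(tcp_single_dose(d, 1e7, alpha=0.3, beta=0.03), 3))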

  1. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  2. Feasible Path Generation Using Bezier Curves for Car-Like Vehicle

    Science.gov (United States)

    Latip, Nor Badariyah Abdul; Omar, Rosli

    2017-08-01

    When planning a collision-free path for an autonomous vehicle, the main criteria that have to be considered are the shortest distance, low computation time and completeness, i.e. a path can be found if one exists. Besides that, a feasible path for the autonomous vehicle is also crucial to guarantee that the vehicle can reach the target destination considering its kinematic constraints, such as the non-holonomic constraint and the minimum turning radius. In order to address these constraints, Bezier curves are applied. In this paper, Bezier curves are modeled and simulated using Matlab software and the feasibility of the resulting path is analyzed. The Bezier curve is derived from a piece-wise linear pre-planned path. It is found that Bezier curves have the capability of making the planned path feasible and could be embedded in a path planning algorithm for an autonomous vehicle with kinematic constraints. It is concluded that the lengths of the segments of the pre-planned path have to be greater than a nominal value, derived from the vehicle wheelbase, maximum steering angle and maximum speed, to ensure the path for the autonomous car is feasible.
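
    As a concrete illustration of the curve construction discussed (not the authors' Matlab code), a cubic Bezier segment can be evaluated from four control points using the Bernstein form B(t) = Σ C(3,i)·(1−t)^(3−i)·t^i·P_i; the control points below are hypothetical.

        def cubic_bezier(p0, p1, p2, p3, t):
            """Evaluate the Bernstein form of a cubic Bezier curve at t in [0, 1]."""
            u = 1.0 - t
            x = u**3 * p0[0] + 3 * u**2 * t * p1[0] + 3 * u * t**2 * p2[0] + t**3 * p3[0]
            y = u**3 * p0[1] + 3 * u**2 * t * p1[1] + 3 * u * t**2 * p2[1] + t**3 * p3[1]
            return x, y

        # Smooth the corner of a pre-planned path (0,0)->(10,0)->(10,10) near its vertex
        p0, p1, p2, p3 = (6.0, 0.0), (10.0, 0.0), (10.0, 0.0), (10.0, 4.0)
        path = [cubic_bezier(p0, p1, p2, p3, i / 20.0) for i in range(21)]
        print(path[0], path[10], path[20])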

  3. Low-lying electronic states of the OH radical: potential energy curves, dipole moment functions, and transition probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Qin, X.; Zhang, S. D. [Qufu Normal University, Qufu (China)]

    2014-12-15

    The six doublet and the two quartet electronic states (²Σ⁺(2), ²Σ⁻, ²Π(2), ²Δ, ⁴Σ⁻, and ⁴Π) of the OH radical have been studied using the multi-reference configuration interaction (MRCI) method, where the Davidson correction, core-valence interaction and relativistic effect are considered with the large basis sets aug-cc-pv5z, aug-cc-pcv5z, and cc-pv5z-DK, respectively. Potential energy curves (PECs) and dipole moment functions are also calculated for these states for internuclear distances ranging from 0.05 nm to 0.80 nm. All possible vibrational levels and rotational constants for the bound states X²Π and A²Σ⁺ of OH are predicted by numerically solving the radial Schroedinger equation with the Level program, and spectroscopic parameters, which are in good agreement with experimental results, are obtained. Transition dipole moments between the ground state X²Π and other excited states are also computed using MRCI, and the transition probability, lifetime, and Franck-Condon factors for the A²Σ⁺ - X²Π transition are discussed and compared with existing experimental values.

  4. Tourism and solid waste generation in Europe: A panel data assessment of the Environmental Kuznets Curve.

    Science.gov (United States)

    Arbulú, Italo; Lozano, Javier; Rey-Maquieira, Javier

    2015-12-01

    The relationship between tourism growth and municipal solid waste (MSW) generation has been, until now, the subject of little research. This is puzzling since the tourism sector is an important MSW generator and, at the same time, is willing to avoid negative impacts from MSW mismanagement. This paper aims to provide tools for tourism and MSW management by assessing the effects of tourism volume, tourism quality and tourism specialization on MSW generation in the EU. This is done using the Environmental Kuznets Curve (EKC) framework. The study considers panel data for 32 European economies over the 1997-2010 period. Empirical results support the EKC hypothesis for MSW and show that northern countries tend to have lower income elasticity than less developed countries; furthermore, the results confirm a non-linear and significant effect of tourism arrivals, expenditure per tourist and tourism specialization on MSW generation. Copyright © 2015 Elsevier Ltd. All rights reserved.
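
    The EKC specification tested in such panel studies is typically quadratic (or cubic) in income; a minimal sketch (Python with numpy, synthetic data and hypothetical coefficients, not the EU panel results) fits ln(waste) on ln(income) and its square and recovers the implied turning point exp(−β1/2β2).

        import numpy as np

        # Synthetic EKC regression: ln(waste) = b0 + b1*ln(y) + b2*ln(y)**2 + noise.
        rng = np.random.default_rng(0)
        ln_y = rng.uniform(9.0, 11.0, size=400)                  # ln GDP per capita
        ln_w = 1.0 + 2.0 * ln_y - 0.09 * ln_y**2 + rng.normal(0, 0.05, size=400)

        X = np.column_stack([np.ones_like(ln_y), ln_y, ln_y**2])
        b0, b1, b2 = np.linalg.lstsq(X, ln_w, rcond=None)[0]

        # An inverted-U (EKC) requires b1 > 0 and b2 < 0; the turning point is where
        # d ln(w) / d ln(y) = 0, i.e. ln(y*) = -b1 / (2 * b2).
        print(round(b1, 3), round(b2, 3), round(float(np.exp(-b1 / (2.0 * b2))), 0))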

  5. Enhanced secondary analysis of survival data: reconstructing the data from published Kaplan-Meier survival curves

    Directory of Open Access Journals (Sweden)

    Guyot Patricia

    2012-02-01

    Background: The results of Randomized Controlled Trials (RCTs) on time-to-event outcomes that are usually reported are median time to events and Cox Hazard Ratio. These do not constitute the sufficient statistics required for meta-analysis or cost-effectiveness analysis, and their use in secondary analyses requires strong assumptions that may not have been adequately tested. In order to enhance the quality of secondary data analyses, we propose a method which derives from the published Kaplan-Meier survival curves a close approximation to the original individual patient time-to-event data from which they were generated. Methods: We develop an algorithm that maps from digitised curves back to KM data by finding numerical solutions to the inverted KM equations, using where available information on number of events and numbers at risk. The reproducibility and accuracy of survival probabilities, median survival times and hazard ratios based on reconstructed KM data was assessed by comparing published statistics (survival probabilities, medians and hazard ratios) with statistics based on repeated reconstructions by multiple observers. Results: The validation exercise established there was no material systematic error and that there was a high degree of reproducibility for all statistics. Accuracy was excellent for survival probabilities and medians; for hazard ratios, reasonable accuracy can only be obtained if at least numbers at risk or total number of events are reported. Conclusion: The algorithm is a reliable tool for meta-analysis and cost-effectiveness analyses of RCTs reporting time-to-event data. It is recommended that all RCTs should report information on numbers at risk and total number of events alongside KM curves.

  6. FN-curves: preliminary estimation of severe accident risks after Fukushima

    International Nuclear Information System (INIS)

    Vasconcelos, Vanderley de; Soares, Wellington Antonio; Costa, Antonio Carlos Lopes da

    2015-01-01

    Doubts of whether the risks related to severe accidents in nuclear reactors are indeed very low were raised after the nuclear accident at Fukushima Daiichi in 2011. Risk estimation of severe accidents in nuclear power plants involves both probability and consequence assessment of such events. Among the ways to display risks, risk curves are tools that express the frequency of exceeding a certain magnitude of consequence. Societal risk is often represented graphically in an FN-curve, a type of risk curve which displays the probability of having N or more fatalities per year, as a function of N, on a double logarithmic scale. The FN-curve, originally introduced for the assessment of risks in the nuclear industry through the U.S. NRC Reactor Safety Study WASH-1400 (1975), is used in various countries to express and limit risks of hazardous activities. This first study estimated an expected rate of core damage equal to 5x10⁻⁵ per reactor-year and suggested an upper bound of 3x10⁻⁴ per reactor-year. A more recent report issued by the Electric Power Research Institute - EPRI (2008) estimates a figure of the order of 2x10⁻⁵ per reactor-year. The Fukushima nuclear accident apparently implies that the observed core damage frequency is higher than that predicted by these probabilistic safety assessments. Therefore, this paper presents a preliminary analysis of the FN-curves related to severe nuclear reactor accidents, taking into account a combination of available data from past accidents, probability modelling to estimate frequencies, and expert judgments. (author)
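
    A minimal sketch of how an FN-curve is assembled from an event list (Python, synthetic numbers, not the accident data discussed here): for each fatality count N, the curve gives the annual frequency of events with N or more fatalities, plotted on a double logarithmic scale.

        import numpy as np

        # Synthetic event list: fatalities per recorded accident over an observation window.
        fatalities = np.array([1, 1, 2, 3, 5, 8, 15, 40, 120])
        observation_years = 500.0

        # FN-curve: annual frequency F(N) of events causing N or more fatalities.
        for n in np.unique(fatalities):
            f = (fatalities >= n).sum() / observation_years
            print(f"N >= {int(n):4d}: frequency {f:.4f} per year")
        # On log-log axes these points form the FN-curve; a straight line of slope -1
        # is a commonly used risk-aversion criterion against which it is compared.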

  7. FN-curves: preliminary estimation of severe accident risks after Fukushima

    Energy Technology Data Exchange (ETDEWEB)

    Vasconcelos, Vanderley de; Soares, Wellington Antonio; Costa, Antonio Carlos Lopes da, E-mail: vasconv@cdtn.br, E-mail: soaresw@cdtn.br, E-mail: aclc@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2015-07-01

    Doubts of whether the risks related to severe accidents in nuclear reactors are indeed very low were raised after the nuclear accident at Fukushima Daiichi in 2011. Risk estimation of severe accidents in nuclear power plants involves both probability and consequence assessment of such events. Among the ways to display risks, risk curves are tools that express the frequency of exceeding a certain magnitude of consequence. Societal risk is often represented graphically in an FN-curve, a type of risk curve which displays the probability of having N or more fatalities per year, as a function of N, on a double logarithmic scale. The FN-curve, originally introduced for the assessment of risks in the nuclear industry through the U.S. NRC Reactor Safety Study WASH-1400 (1975), is used in various countries to express and limit risks of hazardous activities. This first study estimated an expected rate of core damage equal to 5x10⁻⁵ per reactor-year and suggested an upper bound of 3x10⁻⁴ per reactor-year. A more recent report issued by the Electric Power Research Institute - EPRI (2008) estimates a figure of the order of 2x10⁻⁵ per reactor-year. The Fukushima nuclear accident apparently implies that the observed core damage frequency is higher than that predicted by these probabilistic safety assessments. Therefore, this paper presents a preliminary analysis of the FN-curves related to severe nuclear reactor accidents, taking into account a combination of available data from past accidents, probability modelling to estimate frequencies, and expert judgments. (author)

  8. Wind Turbine Power Curve Design for Optimal Power Generation in Wind Farms Considering Wake Effect

    Directory of Open Access Journals (Sweden)

    Jie Tian

    2017-03-01

    In modern wind farms, maximum power point tracking (MPPT) is widely implemented. Using the MPPT method, each individual wind turbine is controlled by its pitch angle and tip speed ratio to generate the maximum active power. In a wind farm, the upstream wind turbine may cause power loss to its downstream wind turbines due to the wake effect. According to the wake model, the downstream power loss is also determined by the pitch angle and tip speed ratio of the upstream wind turbine. By optimizing the pitch angle and tip speed ratio of each wind turbine, the total active power of the wind farm can be increased. In this paper, the optimal pitch angle and tip speed ratio are selected for each wind turbine by exhaustive search. Considering the estimation error of the wake model, a solution to implement the optimized pitch angle and tip speed ratio is proposed, which is to generate the optimal control curves for each individual wind turbine off-line. In typical wind farms with a regular layout, based on a detailed analysis of the influence of pitch angle and tip speed ratio on the total active power of the wind farm by exhaustive search, the optimization is simplified with reduced computational complexity. By using the optimized control curves, the annual energy production (AEP) is increased by 1.03% compared to using the MPPT method in a case study of a typical eighty-turbine wind farm.

  9. Concise method for evaluating the probability distribution of the marginal cost of power generation

    International Nuclear Information System (INIS)

    Zhang, S.H.; Li, Y.Z.

    2000-01-01

    In the developing electricity market, many questions on electricity pricing and the risk modelling of forward contracts require the evaluation of the expected value and probability distribution of the short-run marginal cost of power generation at any given time. A concise forecasting method is provided, which is consistent with the definitions of marginal costs and the techniques of probabilistic production costing. The method embodies clear physical concepts, so that it can be easily understood theoretically and computationally realised. A numerical example has been used to test the proposed method. (author)

  10. Soil-Structure Interaction Effect on Fragility Curve of 3D Models of Concrete Moment-Resisting Buildings

    Directory of Open Access Journals (Sweden)

    Ali Anvarsamarin

    2018-01-01

    This paper presents the probabilistic generation of collapse fragility curves for evaluating the performance of 3D, reinforced concrete (RC) moment-resisting building models, considering soil-structure interaction (SSI), with a focus on seismic uncertainties. It considers collapse as the loss of lateral load-resisting capacity of the building structures due to severe ground shaking and consequent large interstory drifts intensified by P-Δ effects as well as the strength and stiffness deterioration of their lateral load-carrying systems. The estimation of the collapse performance of structures requires the relation between the intensity measure (IM) and the probability of collapse, which is determined using the generated collapse fragility curves. Considering 6-, 12-, and 18-story, 3D, RC moment-resisting buildings, two scalar IMs are employed to estimate their collapse fragility curves. On the other hand, the effect of the site soil type on the collapse fragility curves was taken into account by considering soil-structure interaction. According to the obtained results, adopting the average spectral acceleration (Saavg) intensity measure is more efficient in capturing the effect of the inherent uncertainties of strong ground motions on the structural response parameters. In addition, considering SSI for soil type D, with shear-wave velocity of 180 m/s to 360 m/s, reduces the median of the intensity measure (IM = Sa(T1)) of the fragility curve in the 6-, 12-, and 18-story buildings by 4.92%, 22.26%, and 23.03%, respectively.
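
    The relation between IM and collapse probability mentioned here is commonly summarized with a lognormal fragility function P(collapse | IM = x) = Φ[(ln x − ln θ)/β]; a minimal sketch (Python with scipy, synthetic multiple-stripe data, not the authors' analyses) fits θ and β by maximum likelihood from collapse counts at several IM levels.

        import numpy as np
        from scipy import stats, optimize

        # Synthetic multiple-stripe data: n ground motions per IM level, k collapses.
        im = np.array([0.2, 0.4, 0.6, 0.8, 1.0, 1.2])          # e.g. Sa(T1) in g
        n = np.array([40, 40, 40, 40, 40, 40])
        k = np.array([0, 3, 10, 22, 31, 37])

        def neg_log_like(params):
            ln_theta, beta = params
            p = stats.norm.cdf((np.log(im) - ln_theta) / beta)  # lognormal fragility
            p = np.clip(p, 1e-9, 1 - 1e-9)
            return -np.sum(stats.binom.logpmf(k, n, p))

        res = optimize.minimize(neg_log_like, x0=[np.log(0.7), 0.4], method="Nelder-Mead")
        ln_theta_hat, beta_hat = res.x
        print(f"median collapse capacity ~ {np.exp(ln_theta_hat):.2f} g, dispersion ~ {beta_hat:.2f}")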

  11. An experimental study of the surface elevation probability distribution and statistics of wind-generated waves

    Science.gov (United States)

    Huang, N. E.; Long, S. R.

    1980-01-01

    Laboratory experiments were performed to measure the surface elevation probability density function and associated statistical properties for a wind-generated wave field. The laboratory data along with some limited field data were compared. The statistical properties of the surface elevation were processed for comparison with the results derived from the Longuet-Higgins (1963) theory. It is found that, even for the highly non-Gaussian cases, the distribution function proposed by Longuet-Higgins still gives good approximations.

  12. Utilization of curve offsets in additive manufacturing

    Science.gov (United States)

    Haseltalab, Vahid; Yaman, Ulas; Dolen, Melik

    2018-05-01

    Curve offsets are utilized in different fields of engineering and science. Additive manufacturing, which has lately become an explicit requirement in the manufacturing industry, utilizes curve offsets widely. One of the uses of offsetting is scaling, which is required if there is shrinkage after fabrication or if the surface quality of the resulting part is unacceptable, making some post-processing indispensable. The major application of curve offsets in additive manufacturing processes, however, is the generation of head trajectories. In a point-wise AM process, a correct tool-path in each layer can reduce costs considerably and increase the surface quality of the fabricated parts. In this study, different curve offset generation algorithms are analyzed to show their capabilities and disadvantages through some test cases, and improvements on their drawbacks are suggested.
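
    A minimal sketch of the basic operation behind such algorithms (Python, illustrative only): offsetting a closed polyline by displacing each vertex along an averaged edge normal; production offset algorithms must additionally trim self-intersections and handle sharp corners, which is where the methods compared in the study differ.

        import math

        def offset_polygon(points, distance):
            """Naive offset of a closed, counter-clockwise polygon: move each vertex
            along the averaged outward normal of its two adjacent edges. Robust
            offsetting must also resolve self-intersections at concave features."""
            n = len(points)
            result = []
            for i in range(n):
                x0, y0 = points[i - 1]
                x1, y1 = points[i]
                x2, y2 = points[(i + 1) % n]
                nx1, ny1 = y1 - y0, -(x1 - x0)      # outward normal of incoming edge (CCW)
                nx2, ny2 = y2 - y1, -(x2 - x1)      # outward normal of outgoing edge (CCW)
                nx, ny = nx1 + nx2, ny1 + ny2
                length = math.hypot(nx, ny) or 1.0
                result.append((x1 + distance * nx / length, y1 + distance * ny / length))
            return result

        square = [(0, 0), (10, 0), (10, 10), (0, 10)]
        print(offset_polygon(square, 1.0))          # roughly the square grown by one unit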

  13. Power forward curves: a managerial perspective

    International Nuclear Information System (INIS)

    Nagarajan, Shankar

    1999-01-01

    This chapter concentrates on managerial applications of power forward curves and examines the determinants of electricity prices, such as transmission constraints, the inability of electricity to be stored in a conventional way, its seasonality and weather dependence, the generation stack, and swing risk. The electricity forward curve, classical arbitrage, constructing a forward curve, volatilities, and electricity forward curve models such as the jump-diffusion model, the mean-reverting heteroscedastic volatility model, and an econometric model of forward prices are examined. A managerial perspective on the applications of the forward curve is presented, covering plant valuation, capital budgeting, performance measurement, product pricing and structuring, asset optimisation, valuation of transmission options, and risk management

  14. On the magnetization process and the associated probability in anisotropic cubic crystals

    Energy Technology Data Exchange (ETDEWEB)

    Khedr, D.M., E-mail: doaamohammed88@gmail.com [Department of Basic Science, Modern Academy of Engineering and Technology at Maadi, Cairo (Egypt); Aly, Samy H.; Shabara, Reham M. [Department of Physics, Faculty of Science at Damietta, University of Damietta, Damietta (Egypt); Yehia, Sherif [Department of Physics, Faculty of Science at Helwan, University of Helwan, Helwan (Egypt)

    2017-05-15

    We present a theoretical method to calculate specific magnetic properties, e.g. magnetization curves, magnetic susceptibility and probability landscapes along the [100], [110] and [111] crystallographic directions of a crystal of cubic symmetry. The probability landscape displays the evolution of the most probable angular orientation of the magnetization vector for selected temperatures and magnetic fields. Our method is based on the premises of classical statistical mechanics. The energy density used in the partition function is the sum of the magnetic anisotropy and Zeeman energies; no other energies, e.g. elastic or magnetoelastic terms, are considered in the present work. Model cubic systems of diverse anisotropies are analyzed first, and subsequently material magnetic systems of cubic symmetry, namely iron, nickel and Co_x Fe_(100−x) compounds, are discussed. We highlight a correlation between the magnetization curves and the associated probability landscapes. In addition, the easiest axes of magnetization are determined using energy considerations and compared with the results of the present method.
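
    A minimal sketch of the classical statistical approach described (Python with numpy; simplified to magnetization directions confined to the (001) plane, with illustrative iron-like constants): the energy density combines first-order cubic anisotropy with the Zeeman term, and Boltzmann weighting over orientations yields both the probability landscape and the magnetization curve.

        import numpy as np

        K1 = 4.8e4                       # J/m^3, first cubic anisotropy constant (iron-like)
        MS = 1.7e6                       # A/m, saturation magnetization
        MU0 = 4e-7 * np.pi
        V = (8e-9) ** 3                  # m^3, particle volume used as the statistical unit
        KT = 1.38e-23 * 300.0            # J, thermal energy at 300 K

        theta = np.linspace(0.0, 2.0 * np.pi, 720, endpoint=False)
        theta_h = np.deg2rad(45.0)       # field applied along the in-plane [110] direction

        def landscape_and_m(h_field):
            """Boltzmann probability over in-plane orientations and reduced magnetization
            along the field, for energy density K1*cos^2(t)*sin^2(t) - mu0*Ms*H*cos(t - t_H)."""
            energy = K1 * (np.cos(theta) * np.sin(theta)) ** 2 \
                     - MU0 * MS * h_field * np.cos(theta - theta_h)
            w = np.exp(-(energy - energy.min()) * V / KT)
            prob = w / w.sum()                                   # probability landscape
            return prob, float(np.sum(prob * np.cos(theta - theta_h)))

        for h in (0.0, 1e3, 1e4, 1e5):                           # applied field, A/m
            print(h, round(landscape_and_m(h)[1], 3))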

  15. J-resistance curves for Inconel 690 and Incoloy 800 nuclear steam generators tubes at room temperature and at 300 °C

    Energy Technology Data Exchange (ETDEWEB)

    Bergant, Marcos A., E-mail: marcos.bergant@cab.cnea.gov.ar [Gerencia CAREM, Centro Atómico Bariloche (CNEA), Av. Bustillo 9500, San Carlos de Bariloche 8400 (Argentina); Yawny, Alejandro A., E-mail: yawny@cab.cnea.gov.ar [División Física de Metales, Centro Atómico Bariloche (CNEA) / CONICET, Av. Bustillo 9500, San Carlos de Bariloche 8400 (Argentina); Perez Ipiña, Juan E., E-mail: juan.perezipina@fain.uncoma.edu.ar [Grupo Mecánica de Fractura, Universidad Nacional del Comahue / CONICET, Buenos Aires 1400, Neuquén 8300 (Argentina)

    2017-04-01

    The structural integrity of steam generator tubes is a relevant issue concerning nuclear plant safety. In the present work, J-resistance curves of Inconel 690 and Incoloy 800 nuclear steam generator tubes with circumferential and longitudinal through-wall cracks were obtained at room temperature and 300 °C using recently developed non-standard specimen geometries. It was found that Incoloy 800 tubes exhibited higher J-resistance curves than Inconel 690 for both crack orientations. For both materials, circumferential cracks resulted in higher fracture resistance than longitudinal cracks, indicating a certain degree of texture anisotropy introduced by the tube fabrication process. From a practical point of view, temperature effects were found to be negligible in all cases. The results obtained in the present work provide a general framework for further application to structural integrity assessments of cracked tubes in a variety of nuclear steam generator designs. - Highlights: •Non-standard fracture specimens were obtained from nuclear steam generator tubes. •Specimens with circumferential and longitudinal through-wall cracks were used. •Inconel 690 and Incoloy 800 steam generator tubes were tested at 24 and 300 °C. •Fracture toughness for circumferential cracks was higher than for longitudinal cracks. •Incoloy 800 showed higher fracture toughness than Inconel 690 steam generator tubes.

  16. A probability evaluation method of early deterioration condition for the critical components of wind turbine generator systems

    DEFF Research Database (Denmark)

    Hu, Y.; Li, H.; Liao, X

    2016-01-01

    This study determines the early deterioration condition of critical components for a wind turbine generator system (WTGS). Due to the uncertain nature of the fluctuation and intermittence of wind, early deterioration condition evaluation poses a challenge to traditional vibration-based monitoring; a probability evaluation method of early deterioration condition for critical components based only on temperature characteristic parameters is therefore proposed. First, the dynamic threshold of the deterioration degree function was proposed by analyzing the operational data between temperature and rotor speed. Second, a probability evaluation method of early deterioration condition was presented. Finally, two cases showed the validity of the proposed probability evaluation method in detecting the early deterioration condition and in tracking its further development for the critical components.

  17. Radioactivity release vs probability for a steam generator tube rupture accident

    International Nuclear Information System (INIS)

    Buslik, A.J.; Hall, R.E.

    1978-01-01

    A calculation of the probability of obtaining various radioactivity releases from a steam generator tube rupture (SGTR) is presented. The only radioactive isotopes considered are Iodine-131 and Xe-133. The particular accident path considered consists of a double-ended guillotine SGTR followed by loss of offsite power (LOSP). If there is no loss of offsite power, and no system fault other than the SGTR, it is judged that the consequences will be minimal, since the amount of iodine released through the condenser air ejector is expected to be quite small; this is a consequence of the fact that the concentration of iodine in the vapor released from the condenser air ejector is very small compared to that dissolved in the condensate water. In addition, in some plants the condenser air ejector flow is automatically diverted to containment on a high-activity alarm. The analysis presented here is for a typical Westinghouse PWR such as described in RESAR-3S

  18. Determination of the 121Te gamma emission probabilities associated with the production process of radiopharmaceutical NaI[123I]

    International Nuclear Information System (INIS)

    Araujo, M.T.F.; Lopes, R.T.; Poledna, R.; Delgado, J.U.; Almeida, M.C.M. de; Silva, R.L.

    2015-01-01

    123I is widely used in radiodiagnostic procedures in nuclear medicine. According to the Pharmacopoeia, care should be taken during its production process, since radionuclidic impurities may be generated. 121Te is an impurity that arises during 123I production, and determining its gamma emission probabilities (Pγ) is important in order to obtain more information about its decay. Activities were also obtained by absolute standardization using the sum-peak method, and these values were compared to those from the efficiency curve method. (author)

  19. Intersection numbers of spectral curves

    CERN Document Server

    Eynard, B.

    2011-01-01

    We compute the symplectic invariants of an arbitrary spectral curve with only one branchpoint in terms of integrals of characteristic classes in the moduli space of curves. Our formula associates to any spectral curve a characteristic class, which is determined by the Laplace transform of the spectral curve. This is a hint to the key role of the Laplace transform in mirror symmetry. When the spectral curve is y = √x, the formula gives the Kontsevich-Witten intersection numbers; when the spectral curve is chosen to be the Lambert function exp(x) = y exp(-y), the formula gives the ELSV formula for Hurwitz numbers; and when one chooses the mirror of C^3 with framing f, i.e. exp(-x) = exp(-yf)(1 - exp(-y)), the formula gives the Marino-Vafa formula, i.e. the generating function of Gromov-Witten invariants of C^3. In some sense this formula generalizes the ELSV, Marino-Vafa, and Mumford formulas.

  20. Probability of a steam generator tube rupture due to the presence of axial through wall cracks

    International Nuclear Information System (INIS)

    Mavko, B.; Cizelj, L.

    1991-01-01

    Using the Leak-Before-Break (LBB) approach to define tube plugging criteria a possibility to operate with through wall crack(s) in steam generator tubes may be considered. This fact may imply an increase in tube rupture probability. Improved examination techniques (in addition to the 100% tube examination) have been developed and introduced to counterbalance the associated risk. However no estimates of the amount of total increase or decrease of risk due to the introduction of LBB have been made. A scheme to predict this change of risk is proposed in the paper, based on probabilistic fracture mechanics analysis of axial cracks combined with available data of steam generator tube nondestructive examination reliability. (author)

  1. Surface growth kinematics via local curve evolution

    KAUST Repository

    Moulton, Derek E.

    2012-11-18

    A mathematical framework is developed to model the kinematics of surface growth for objects that can be generated by evolving a curve in space, such as seashells and horns. Growth is dictated by a growth velocity vector field defined at every point on a generating curve. A local orthonormal basis is attached to each point of the generating curve and the velocity field is given in terms of the local coordinate directions, leading to a fully local and elegant mathematical structure. Several examples of increasing complexity are provided, and we demonstrate how biologically relevant structures such as logarithmic shells and horns emerge as analytical solutions of the kinematics equations with a small number of parameters that can be linked to the underlying growth process. Direct access to cell tracks and local orientation enables connections to be made to the underlying growth process. © 2012 Springer-Verlag Berlin Heidelberg.

  2. Fitting fatigue test data with a novel S-N curve using frequentist and Bayesian inference

    NARCIS (Netherlands)

    Leonetti, D.; Maljaars, J.; Snijder, H.H.B.

    2017-01-01

    In design against fatigue, a lower bound stress range vs. endurance curve (S-N curve) is employed to characterize the fatigue resistance of plain material and structural details. With respect to the inherent variability of the fatigue life, the S-N curve is related to a certain probability of survival.
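
    A minimal sketch of the frequentist side of such an analysis (Python with numpy, synthetic data): fitting a Basquin-type relation log N = A − m·log S by least squares and deriving a lower-bound design curve from the scatter of log-life; the paper's novel S-N model and its Bayesian treatment are not reproduced here.

        import numpy as np

        # Synthetic fatigue data: stress range (MPa) and cycles to failure.
        rng = np.random.default_rng(1)
        S = np.array([200, 180, 160, 140, 120, 100, 90, 80], dtype=float)
        logN = 12.5 - 3.0 * np.log10(S) + rng.normal(0.0, 0.15, size=S.size)

        # Basquin fit: log10(N) = A - m * log10(S), by ordinary least squares.
        X = np.column_stack([np.ones_like(S), -np.log10(S)])
        (A, m), *_ = np.linalg.lstsq(X, logN, rcond=None)
        sigma = (logN - X @ np.array([A, m])).std(ddof=2)

        # Characteristic curve at mean minus two standard deviations of log-life,
        # i.e. roughly a 97.7% probability of survival under a normal scatter model.
        S_grid = np.array([100.0, 150.0, 200.0])
        print(round(A, 2), round(m, 2), (10 ** (A - m * np.log10(S_grid) - 2.0 * sigma)).astype(int))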

  3. Application of dissociation curve analysis to radiation hybrid panel marker scoring: generation of a map of river buffalo (B. bubalis) chromosome 20

    Directory of Open Access Journals (Sweden)

    Schäffer Alejandro A

    2008-11-01

    Background: Fluorescence of dyes bound to double-stranded PCR products has been utilized extensively in various real-time quantitative PCR applications, including post-amplification dissociation curve analysis, or differentiation of amplicon length or sequence composition. Despite the current era of whole-genome sequencing, mapping tools such as radiation hybrid DNA panels remain useful aids for sequence assembly, focused resequencing efforts, and for building physical maps of species that have not yet been sequenced. For placement of specific, individual genes or markers on a map, low-throughput methods remain commonplace. Typically, PCR amplification of DNA from each panel cell line is followed by gel electrophoresis and scoring of each clone for the presence or absence of PCR product. To improve sensitivity and efficiency of radiation hybrid panel analysis in comparison to gel-based methods, we adapted fluorescence-based real-time PCR and dissociation curve analysis for use as a novel scoring method. Results: As proof of principle for this dissociation curve method, we generated new maps of river buffalo (Bubalus bubalis) chromosome 20 by both dissociation curve analysis and conventional marker scoring. We also obtained sequence data to augment dissociation curve results. Few genes have been previously mapped to buffalo chromosome 20, and sequence detail is limited, so 65 markers were screened from the orthologous chromosome of domestic cattle. Thirty bovine markers (46%) were suitable as cross-species markers for dissociation curve analysis in the buffalo radiation hybrid panel under a standard protocol, compared to 25 markers suitable for conventional typing. Computational analysis placed 27 markers on a chromosome map generated by the new method, while the gel-based approach produced only 20 mapped markers. Among 19 markers common to both maps, the marker order on the map was maintained perfectly. Conclusion: Dissociation curve

  4. Strange Curves, Counting Rabbits, & Other Mathematical Explorations

    CERN Document Server

    Ball, Keith

    2011-01-01

    How does mathematics enable us to send pictures from space back to Earth? Where does the bell-shaped curve come from? Why do you need only 23 people in a room for a 50/50 chance of two of them sharing the same birthday? In Strange Curves, Counting Rabbits, and Other Mathematical Explorations, Keith Ball highlights how ideas, mostly from pure math, can answer these questions and many more. Drawing on areas of mathematics from probability theory, number theory, and geometry, he explores a wide range of concepts, some more light-hearted, others central to the development of the field and used daily.

  5. MICA: Multiple interval-based curve alignment

    Science.gov (United States)

    Mann, Martin; Kahle, Hans-Peter; Beck, Matthias; Bender, Bela Johannes; Spiecker, Heinrich; Backofen, Rolf

    2018-01-01

    MICA enables the automatic synchronization of discrete data curves. To this end, characteristic points of the curves' shapes are identified. These landmarks are used within a heuristic curve registration approach to align profile pairs by mapping similar characteristics onto each other. In combination with a progressive alignment scheme, this enables the computation of multiple curve alignments. Multiple curve alignments are needed to derive meaningful representative consensus data of measured time or data series. MICA was already successfully applied to generate representative profiles of tree growth data based on intra-annual wood density profiles or cell formation data. The MICA package provides a command-line and graphical user interface. The R interface enables the direct embedding of multiple curve alignment computation into larger analyses pipelines. Source code, binaries and documentation are freely available at https://github.com/BackofenLab/MICA

  6. Expected utility versus expected regret theory versions of decision curve analysis do generate different results when treatment effects are taken into account.

    Science.gov (United States)

    Hozo, Iztok; Tsalatsanis, Athanasios; Djulbegovic, Benjamin

    2018-02-01

    Decision curve analysis (DCA) is a widely used method for evaluating diagnostic tests and predictive models. It was developed based on expected utility theory (EUT) and has been reformulated using expected regret theory (ERG). Under certain circumstances, these 2 formulations yield different results. Here we describe these situations and explain the variation. We compare the derivations of the EUT- and ERG-based formulations of DCA for a typical medical decision problem: "treat none," "treat all," or "use model" to guide treatment. We illustrate the differences between the 2 formulations when applied to the following clinical question: at which probability of death should we refer a terminally ill patient to hospice? Both DCA formulations yielded identical but mirrored results when treatment effects are ignored; they generated significantly different results otherwise. Treatment effect has a significant effect on the results derived by EUT DCA and less so on ERG DCA. The elicitation of specific values for disutilities affected the results even more significantly in the context of EUT DCA, whereas no such elicitation was required within the ERG framework. EUT and ERG DCA generate different results when treatment effects are taken into account. The magnitude of the difference depends on the effect of treatment and the disutilities associated with disease and treatment effects. This is important to realize, as current practice guidelines are uniformly based on EUT; the same recommendations can differ significantly if they are derived within the ERG framework. © 2016 The Authors. Journal of Evaluation in Clinical Practice Published by John Wiley & Sons Ltd.
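
    For context, the EUT-based formulation of DCA reduces, when treatment effects are ignored, to the familiar net-benefit calculation NB(pt) = TP/n − (FP/n)·pt/(1 − pt), compared against "treat all" and "treat none"; a minimal sketch (Python with numpy, synthetic data) computes these three curves. The regret-based variant and the treatment-effect adjustment discussed in the paper are not reproduced here.

        import numpy as np

        # Synthetic cohort: model-predicted probabilities and (calibrated) outcomes.
        rng = np.random.default_rng(2)
        n = 1000
        p_model = rng.beta(2, 5, size=n)
        outcome = rng.random(n) < p_model

        def net_benefit(threshold):
            treat = p_model >= threshold
            tp = np.sum(treat & outcome) / n
            fp = np.sum(treat & ~outcome) / n
            return tp - fp * threshold / (1.0 - threshold)

        prevalence = outcome.mean()
        for pt in (0.1, 0.2, 0.3, 0.4):
            nb_all = prevalence - (1 - prevalence) * pt / (1 - pt)       # treat all
            print(pt, round(net_benefit(pt), 3), round(nb_all, 3), 0.0)  # model, all, none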

  7. Probability concepts in quality risk management.

    Science.gov (United States)

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although "risk" generally describes a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management methodologies and respective tools focus on managing severity but are relatively silent on the in-depth meaning and uses of "probability." Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, marketing to product discontinuation. The probability concept is typically applied by risk managers as a combination of data-based (frequency-based) measures of probability and a subjective "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples, from the most general scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management.

  8. Estimation of Typhoon Wind Hazard Curves for Nuclear Sites

    Energy Technology Data Exchange (ETDEWEB)

    Choun, Young-Sun; Kim, Min-Kyu [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)]

    2016-10-15

    The intensity of typhoons that can influence the Korean Peninsula is on an increasing trend owing to the rapid change of climate of the Northwest Pacific Ocean. Therefore, nuclear facilities should be prepared against future super-typhoons. Currently, the U.S. Nuclear Regulatory Commission requires that a new NPP be designed to endure design-basis hurricane wind speeds corresponding to an annual exceedance frequency of 10⁻⁷ (return period of 10 million years). A typical technique for estimating typhoon wind speeds is to sample the key parameters of typhoon wind models from distribution functions obtained by fitting statistical distributions to the observation data. Thus, the estimated wind speeds for long return periods include an unavoidable uncertainty owing to the limited observation period. This study estimates the typhoon wind speeds for nuclear sites using a Monte Carlo simulation, and derives wind hazard curves using a logic-tree framework to reduce the epistemic uncertainty. Typhoon wind speeds were estimated for different return periods through a Monte Carlo simulation using the typhoon observation data, and the wind hazard curves were derived using a logic-tree framework for three nuclear sites. The hazard curves for the simulated and probable maximum winds were obtained. The mean hazard curves for the simulated and probable maximum winds can be used for the design and risk assessment of an NPP.
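
    A minimal sketch of the general Monte Carlo idea (Python with numpy, placeholder parameters, not the site-specific typhoon model of the study): simulate many years of annual maximum wind speeds, then read the hazard curve as the annual exceedance frequency of each wind-speed level.

        import numpy as np

        # Illustrative wind hazard: annual maximum wind speed drawn from a Gumbel
        # distribution whose location/scale are placeholders, not site values.
        rng = np.random.default_rng(3)
        annual_max = rng.gumbel(28.0, 6.0, size=1_000_000)       # m/s, one value per year

        # Hazard curve: annual exceedance probability of wind speed v; the return
        # period is its reciprocal.
        for v in (40.0, 60.0, 80.0, 100.0):
            p = float(np.mean(annual_max > v))
            rp = 1.0 / p if p > 0 else float("inf")
            print(f"v = {v:5.1f} m/s  annual exceedance ~ {p:.2e}  return period ~ {rp:,.0f} yr")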

  9. Evolvement simulation of the probability of neutron-initiating persistent fission chain

    International Nuclear Information System (INIS)

    Wang Zhe; Hong Zhenying

    2014-01-01

    Background: The probability of a neutron initiating a persistent fission chain, which has to be calculated in analyses of criticality safety, reactor start-up, burst waiting time on a pulse reactor, bursting time on a pulse reactor, etc., is an inherent parameter of a multiplying assembly. Purpose: We aim to derive the time-dependent integro-differential equation for this probability in relative velocity space according to probability conservation, and to develop the deterministic code Dynamic Segment Number Probability (DSNP) based on the multi-group S_N method. Methods: The reliable convergence of the dynamic calculation was analyzed and numerical simulation of the evolvement process of the dynamic probability for varying concentration was performed under different initial conditions. Results: On Highly Enriched Uranium (HEU) bare spheres, when the time is long enough, the results of the dynamic calculation approach those of the static calculation. The largest difference in these results between DSNP and the Partisn code is less than 2%. On the Baker model, over the range of about 1 μs after the first criticality, the largest difference between the dynamic and static calculations is about 300%. For a supercritical system, the finite fission chains decrease and the persistent fission chains increase as the reactivity increases, and the dynamic evolvement curve of the initiation probability is close to the static curve within a difference of 5% when k_eff is more than 1.2. The cumulative probability curve also indicates that the difference in the integral results between the dynamic and static calculations decreases from 35% to 5% as k_eff increases. This demonstrates that the ability to initiate a self-sustaining fission chain reaction approaches stabilization, while the former 35% difference shows that the dynamic results near the first criticality differ importantly from the static ones. The DSNP code agrees well with the Partisn code. Conclusions: There are large numbers of

  10. CSI 2264: CHARACTERIZING YOUNG STARS IN NGC 2264 WITH STOCHASTICALLY VARYING LIGHT CURVES

    Energy Technology Data Exchange (ETDEWEB)

    Stauffer, John; Rebull, Luisa; Carey, Sean [Spitzer Science Center, California Institute of Technology, Pasadena, CA 91125 (United States); Cody, Ann Marie [NASA Ames Research Center, Kepler Science Office, Mountain View, CA 94035 (United States); Hillenbrand, Lynne A.; Carpenter, John [Astronomy Department, California Institute of Technology, Pasadena, CA 91125 (United States); Turner, Neal J. [Jet Propulsion Laboratory, California Institute of Technology, Pasadena, CA 91109 (United States); Terebey, Susan [Department of Physics and Astronomy, 5151 State University Drive, California State University at Los Angeles, Los Angeles, CA 90032 (United States); Morales-Calderón, Maria [Centro de Astrobiología, Dpto. de Astrofísica, INTA-CSIC, P.O. BOX 78, E-28691, ESAC Campus, Villanueva de la Cañada, Madrid (Spain); Alencar, Silvia H. P.; McGinnis, Pauline; Sousa, Alana [Departamento de Física—ICEx—UFMG, Av. Antônio Carlos, 6627, 30270-901, Belo Horizonte, MG (Brazil); Bouvier, Jerome; Venuti, Laura [Université de Grenoble, Institut de Planétologie et d’Astrophysique de Grenoble (IPAG), F-38000 Grenoble (France); Hartmann, Lee; Calvet, Nuria [Department of Astronomy, University of Michigan, 500 Church Street, Ann Arbor, MI:48105 (United States); Micela, Giusi; Flaccomio, Ettore [INAF—Osservatorio Astronomico di Palermo, Piazza del Parlamento 1, I-90134, Palermo (Italy); Song, Inseok [Department of Physics and Astronomy, The University of Georgia, Athens, GA 30602-2451 (United States); Gutermuth, Rob, E-mail: stauffer@ipac.caltech.edu [Department of Astronomy, University of Massachusetts, Amherst, MA 01003 (United States); and others

    2016-03-15

    We provide CoRoT and Spitzer light curves and other supporting data for 17 classical T Tauri stars in NGC 2264 whose CoRoT light curves exemplify the “stochastic” light curve class as defined in 2014 by Cody et al. The most probable physical mechanism to explain the optical variability within this light curve class is time-dependent mass accretion onto the stellar photosphere, producing transient hot spots. Where we have appropriate spectral data, we show that the veiling variability in these stars is consistent in both amplitude and timescale with the optical light curve morphology. The veiling variability is also well-correlated with the strength of the He i 6678 Å emission line, predicted by models to arise in accretion shocks on or near the stellar photosphere. Stars with accretion burst light curve morphology also have variable mass accretion. The stochastic and accretion burst light curves can both be explained by a simple model of randomly occurring flux bursts, with the stochastic light curve class having a higher frequency of lower amplitude events. Members of the stochastic light curve class have only moderate mass accretion rates. Their Hα profiles usually have blueshifted absorption features, probably originating in a disk wind. The lack of periodic signatures in the light curves suggests that little of the variability is due to long-lived hot spots rotating into or out of our line of sight; instead, the primary driver of the observed photometric variability is likely to be instabilities in the inner disk that lead to variable mass accretion.
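
    A minimal sketch of the kind of random-burst model invoked here (Python with numpy, arbitrary units and parameters): a synthetic light curve is built as a constant photospheric level plus accretion bursts occurring at Poisson-distributed times with random amplitudes and exponential decay; the "stochastic" and "accretion burst" classes then differ mainly in event rate and amplitude.

        import numpy as np

        rng = np.random.default_rng(4)

        def burst_light_curve(days=40.0, dt=0.01, rate_per_day=3.0,
                              amp_mean=0.05, decay_days=0.3, baseline=1.0):
            """Constant baseline plus randomly occurring flux bursts with exponential decay."""
            t = np.arange(0.0, days, dt)
            flux = np.full_like(t, baseline)
            onsets = rng.uniform(0.0, days, size=rng.poisson(rate_per_day * days))
            amps = rng.exponential(amp_mean, size=onsets.size)
            for t0, a in zip(onsets, amps):
                mask = t >= t0
                flux[mask] += a * np.exp(-(t[mask] - t0) / decay_days)
            return t, flux

        _, stochastic_like = burst_light_curve()                          # frequent, small bursts
        _, burst_like = burst_light_curve(rate_per_day=0.4, amp_mean=0.3) # rare, large bursts
        print(stochastic_like.std() / stochastic_like.mean(), burst_like.std() / burst_like.mean())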

  11. Probability of defect detection of Posiva's electron beam weld

    International Nuclear Information System (INIS)

    Kanzler, D.; Mueller, C.; Pitkaenen, J.

    2013-12-01

    The report 'Probability of Defect Detection of Posiva's electron beam weld' describes POD curves for four NDT methods: radiographic testing, ultrasonic testing, eddy current testing and visual testing. The POD curves are based on artificial defects in reference blocks. The results are devoted to the demonstration of the suitability of the methods for EB weld testing. The report describes the methodology and procedure applied by BAM, and creates a link from the assessment of reliability and inspection performance to the risk assessment process of the canister final disposal project. It thereby confirms the basic quality of the NDT methods and their capability to describe the quality of the EB weld. The probability of detection curves are determined based on the MIL-1823 standard and its reliability guidelines. The MIL-1823 standard was developed for the determination of the integrity of gas turbine engines for the US military. In the POD process, the key parameter determined for defect detectability is the a90/95 magnitude, i.e. the size measure a of the defect for which the lower 95% confidence band crosses the 90% POD level. In this way it can be confirmed that defects with a size of a90/95 will be detected with 90% probability; if the experiment were repeated, 5% of cases might fall outside this confidence limit. (orig.)
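
    A minimal sketch of how such a POD curve and the 90% detectable size are typically obtained (Python with scipy, synthetic hit/miss data; a log-normal POD model, one common choice in MIL-HDBK-1823-type analyses, is assumed). The 95% confidence bound that turns a90 into a90/95 requires likelihood-based confidence limits and is not reproduced here.

        import numpy as np
        from scipy import stats, optimize

        # Synthetic hit/miss inspection data: defect sizes (mm) and detection outcomes.
        rng = np.random.default_rng(5)
        size = rng.uniform(0.2, 6.0, size=300)
        hit = rng.random(size.size) < stats.norm.cdf((np.log(size) - np.log(1.5)) / 0.4)

        def neg_log_like(params):
            mu, sigma = params
            p = np.clip(stats.norm.cdf((np.log(size) - mu) / sigma), 1e-9, 1 - 1e-9)
            return -np.sum(np.where(hit, np.log(p), np.log(1.0 - p)))

        res = optimize.minimize(neg_log_like, x0=[0.0, 1.0], method="Nelder-Mead")
        mu_hat, sigma_hat = res.x

        # POD(a) = Phi((ln a - mu) / sigma); the 90% detectable size follows directly.
        a90 = float(np.exp(mu_hat + stats.norm.ppf(0.90) * sigma_hat))
        print(f"a90 ~ {a90:.2f} mm (a90/95 additionally needs the 95% confidence bound)")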

  12. Statistical re-evaluation of the ASME K{sub IC} and K{sub IR} fracture toughness reference curves

    Energy Technology Data Exchange (ETDEWEB)

    Wallin, K.; Rintamaa, R. [Valtion Teknillinen Tutkimuskeskus, Espoo (Finland)

    1998-11-01

    Historically the ASME reference curves have been treated as representing absolute deterministic lower bound curves of fracture toughness. In reality, this is not the case. They represent only deterministic lower bound curves to a specific set of data, which represent a certain probability range. A recently developed statistical lower bound estimation method called the `Master curve` has been proposed as a candidate for a new lower bound reference curve concept. From a regulatory point of view, the Master curve is somewhat problematic in that it does not claim to be an absolute deterministic lower bound, but corresponds to a specific theoretical failure probability that can be chosen freely based on the application. In order to be able to substitute the old ASME reference curves with lower bound curves based on the Master curve concept, the inherent statistical nature (and confidence level) of the ASME reference curves must be revealed. In order to estimate the true inherent level of safety represented by the reference curves, the original database was re-evaluated with statistical methods and compared to an analysis based on the Master curve concept. The analysis reveals that the 5% lower bound Master curve has the same inherent degree of safety as originally intended for the K{sub IC}-reference curve. Similarly, the 1% lower bound Master curve corresponds to the K{sub IR}-reference curve. (orig.)
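
    As a rough illustration of what such probability-indexed lower bounds look like, the sketch below evaluates the Master curve in its standard three-parameter Weibull form (K{sub min} = 20 MPa·m{sup 1/2}, as standardized in ASTM E1921) at the median, 5% and 1% cumulative failure probabilities. The reference temperature T{sub 0} is an assumed value for illustration only, and specimen-size adjustment and data censoring are ignored.

```python
import numpy as np

K_MIN = 20.0  # MPa*sqrt(m), fixed threshold toughness in the Master curve method

def k0(T, T0):
    """Weibull scale parameter of the Master curve (1T size, MPa*sqrt(m))."""
    return 31.0 + 77.0 * np.exp(0.019 * (T - T0))

def kjc_bound(T, T0, p):
    """K_Jc with cumulative failure probability p at temperature T (deg C)."""
    return K_MIN + (k0(T, T0) - K_MIN) * (-np.log(1.0 - p)) ** 0.25

T0 = -50.0                        # assumed reference temperature, illustration only
T = np.arange(-150.0, 51.0, 25.0)
for frac, label in [(0.5, "median"), (0.05, "5 % bound"), (0.01, "1 % bound")]:
    print(label, np.round(kjc_bound(T, T0, frac), 1))
```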

  13. Escape and transmission probabilities in cylindrical geometry

    International Nuclear Information System (INIS)

    Bjerke, M.A.

    1980-01-01

    An improved technique for the generation of escape and transmission probabilities in cylindrical geometry was applied to the existing resonance cross section processing code ROLAIDS. The algorithm of Hwang and Toppel [ANL-FRA-TM-118] (with modifications) was employed. The probabilities generated were found to be as accurate as those given by the method previously applied in ROLAIDS, while requiring much less computer core storage and CPU time.
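
    The quantity being generated can be illustrated with a much cruder approximation than the Hwang-Toppel algorithm: the Wigner rational approximation to the first-flight escape probability of a homogeneous infinite cylinder, which needs only the mean chord length. This is a sketch of the concept, not of the ROLAIDS method.

```python
def escape_prob_rational(sigma_t, radius):
    """Wigner rational approximation to the first-flight escape probability
    for an infinitely long homogeneous cylinder.

    sigma_t : total macroscopic cross section (1/cm)
    radius  : cylinder radius (cm)

    The mean chord length is l_bar = 4V/S = 2R for an infinite cylinder.
    """
    l_bar = 2.0 * radius
    return 1.0 / (1.0 + sigma_t * l_bar)

# The complementary first-flight collision probability is P_c = 1 - P_esc.
for sig in (0.1, 0.5, 1.0, 5.0):
    print(sig, round(escape_prob_rational(sig, radius=0.5), 4))
```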

  14. Influence of pavement condition on horizontal curve safety.

    Science.gov (United States)

    Buddhavarapu, Prasad; Banerjee, Ambarish; Prozzi, Jorge A

    2013-03-01

    Crash statistics suggest that horizontal curves are the most vulnerable sites for crash occurrence. These crashes are often severe and many involve at least some level of injury due to the nature of the collisions. Ensuring the desired pavement surface condition is one potentially effective strategy to reduce the occurrence of severe accidents on horizontal curves. This study sought to develop crash injury severity models by integrating crash and pavement surface condition databases. It focuses on developing a causal relationship between pavement condition indices and severity level of crashes occurring on two-lane horizontal curves in Texas. In addition, it examines the suitability of the existing Skid Index for safety maintenance of two-lane curves. Significant correlation is evident between pavement condition and crash injury severity on two-lane undivided horizontal curves in Texas. Probability of a crash becoming fatal is appreciably sensitive to certain pavement indices. Data suggested that road facilities providing a smoother and more comfortable ride are vulnerable to severe crashes on horizontal curves. In addition, the study found that longitudinal skid measurement barely correlates with injury severity of crashes occurring on curved portions. The study recommends exploring the option of incorporating lateral friction measurement into Pavement Management System (PMS) databases specifically at curved road segments. Copyright © 2012 Elsevier Ltd. All rights reserved.

  15. Synthesization of the Ar VIII 3s-3p beam-foil decay curve

    International Nuclear Information System (INIS)

    Lindgaard, A.; Veje, E.

    1981-01-01

    The beam-foil decay curve for the 3s-3p transition in Ar VIII has been simulated from experimentally determined relative initial level populations and transition probabilities calculated in the numerical Coulomb approximation. Good agreement is observed between simulated and measured decay curves. A discussion of the simulation is given. (Auth.)

  16. Bayesian Inference of Nonstationary Precipitation Intensity-Duration-Frequency Curves for Infrastructure Design

    Science.gov (United States)

    2016-03-01

    This note presents a means by which to develop local precipitation Intensity-Duration-Frequency (IDF) curves using historical rainfall time series data. The rainfall associated with each IDF curve is subsequently used to force a calibrated and validated precipitation-runoff model in support of probability-based, risk-informed hydrologic design.

  17. Probability function of breaking-limited surface elevation. [wind generated waves of ocean

    Science.gov (United States)

    Tung, C. C.; Huang, N. E.; Yuan, Y.; Long, S. R.

    1989-01-01

    The effect of wave breaking on the probability function of surface elevation is examined. The surface elevation limited by wave breaking zeta sub b(t) is first related to the original wave elevation zeta(t) and its second derivative. An approximate, second-order, nonlinear, non-Gaussian model for zeta(t) of arbitrary but moderate bandwidth is presented, and an expression for the probability density function of zeta sub b(t) is derived. The results show clearly that the effect of wave breaking on the probability density function of surface elevation is to introduce a secondary hump on the positive side of the probability density function, a phenomenon also observed in wind wave tank experiments.

  18. Determination of the {sup 121}Te gamma emission probabilities associated with the production process of radiopharmaceutical NaI[{sup 123}I]

    Energy Technology Data Exchange (ETDEWEB)

    Araujo, M.T.F.; Lopes, R.T., E-mail: maraujo@con.ufrj.br, E-mail: miriamtaina@hotmail.com [Coordenacao dos Cursos de Pos-Graduacao em Engenharia (LIN/PEN/COPPE/UFRJ), RJ (Brazil). Programa de Engenharia Nuclear. Lab. de Instrumentacao Nuclear; Poledna, R.; Delgado, J.U.; Almeida, M.C.M. de; Silva, R.L. [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ/LNMRI), Rio de Janeiro, RJ (Brazil). Lab. Nacional de Metrologia das Radiacoes Ionizantes

    2015-07-01

    {sup 123}I is widely used in radiodiagnostic procedures in nuclear medicine. According to the Pharmacopoeia, care should be taken during its production process, since radionuclidic impurities may be generated. {sup 121}Te is an impurity that arises during {sup 123}I production, and determining its gamma emission probabilities (Pγ) is important in order to obtain more information about its decay. Activities were also obtained by absolute standardization using the sum-peak method, and these values were compared to those from the efficiency curve method. (author)

  19. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular, a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  20. 134Cs emission probabilities determination by gamma spectrometry

    Science.gov (United States)

    de Almeida, M. C. M.; Poledna, R.; Delgado, J. U.; Silva, R. L.; Araujo, M. T. F.; da Silva, C. J.

    2018-03-01

    The National Laboratory for Ionizing Radiation Metrology (LNMRI/IRD/CNEN) of Rio de Janeiro performed primary and secondary standardization of different radionuclides, reaching satisfactory uncertainties. A solution of the 134Cs radionuclide was purchased from a commercial supplier for the determination of the emission probabilities of some of its energies. 134Cs is a beta-gamma emitter with a half-life of 754 days. This radionuclide is used as a standard in environmental, water and food control and is also important for germanium detector calibration. The gamma emission probabilities (Pγ) were determined mainly for some energies of 134Cs by the efficiency curve method, and the absolute Pγ uncertainties obtained were below 1% (k=1).
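
    In its simplest form, the efficiency curve method reduces to dividing the net full-energy-peak counts by the product of live time, source activity and the interpolated full-energy-peak efficiency. The sketch below illustrates that relation; the numbers are made up, and real evaluations also include decay, pile-up and coincidence-summing corrections together with a full uncertainty budget.

```python
def pgamma(net_counts, live_time_s, activity_bq, eff):
    """Gamma emission probability from the efficiency-curve method.

    net_counts  : background-subtracted full-energy peak area
    live_time_s : spectrum live time (s)
    activity_bq : source activity at counting time (Bq)
    eff         : full-energy-peak efficiency at the line energy,
                  interpolated from the detector calibration curve

    Decay, pile-up and coincidence-summing corrections are omitted here.
    """
    return net_counts / (live_time_s * activity_bq * eff)

# Illustrative (made-up) numbers for a single gamma line:
print(round(pgamma(net_counts=5.4e4, live_time_s=3600.0,
                   activity_bq=5.0e3, eff=3.5e-3), 3))
```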

  1. The shape of the melting curve and phase transitions in the liquid state

    International Nuclear Information System (INIS)

    Yahel, Eyal

    2014-01-01

    The phase diagram of elemental liquids has been found to be surprisingly rich, including variations in the melting curve and transitions in the liquid phase. The effect of these transitions on the shape of the melting curve is reviewed and analyzed. First-order phase transitions intersecting the melting curve imply piecewise continuous melting curves, with solid-solid transitions generating upward kinks or minima and liquid-liquid transitions generating downward kinks or maxima

  2. A proposal of the diagnosis-dynamic characteristic (DDC) model describing the relation between search time and confidence levels for a dichotomous judgment, and its application to ROC curve generation

    Science.gov (United States)

    Matsumoto, Toru; Fukuda, Nobuo; Furukawa, Akira; Suwa, Koji; Wada, Shinichi; Matsumoto, Mitsuomi; Sone, Shusuke

    2006-03-01

    When physicians inspect an image, they build up a certain degree of confidence that the image is abnormal, p(t), or normal, n(t) [n(t) = 1 - p(t)]. After an infinite inspection time, they reach the equilibrium confidence levels p* = p(∞) and n* = n(∞). There are psychological conflicts between the decisions of normal and abnormal. We assume that the decision of "normal" is distracted by the decision of "abnormal" by a factor of k(1 + ap), and in the inverse direction by a factor of k(1 + bn), where k (> 0) is a parameter related to image quality and the skill of the physicians, and a and b are unknown constants. After infinite inspection time the conflict reaches equilibrium, which satisfies the equation k(1 + ap*)n* = k(1 + bn*)p*. Here we define a parameter c = (2p* - 1)/[p*(1 - p*)]. We assume that the change in the confidence level with time (dp/dt) is proportional to [k(1 + ap)n - k(1 + bn)p], i.e. k[-cp² + (c - 2)p + 1]. Solving the differential equation, we derived the equations t(p) and p(t), which depend on the parameters k, c and S. S (0-1) is an arbitrarily selected value related to the probability of "abnormal" before the image inspection (S = p(0)). Image reading studies were executed for CT images. ROC curves were generated both by the traditional 4-step score-based method and from the confidence levels p estimated from the equation t(p) of the DDC model using the observed judgment times. It was concluded that ROC curves could be generated by measuring the time for a dichotomous judgment, without subjective scores of diagnostic confidence, by applying the DDC model.
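
    A short numerical sketch may help make the model concrete. Under the abstract's assumptions, the confidence p(t) obeys dp/dt = k[-cp² + (c - 2)p + 1] with p(0) = S, so an observed judgment time can be mapped back to a confidence level, which is what feeds ROC generation in place of subjective scores. The k, p* and S values below are arbitrary illustrative choices, not values from the reading study.

```python
import numpy as np
from scipy.integrate import solve_ivp

# DDC-model sketch: dp/dt = k * (-c*p**2 + (c - 2)*p + 1), p(0) = S.
k, p_star, S = 0.8, 0.75, 0.5                            # illustrative assumptions
c = (2.0 * p_star - 1.0) / (p_star * (1.0 - p_star))      # from the equilibrium condition

def dpdt(t, p):
    return k * (-c * p ** 2 + (c - 2.0) * p + 1.0)

t_grid = np.linspace(0.0, 10.0, 401)
sol = solve_ivp(dpdt, (0.0, 10.0), [S], t_eval=t_grid, rtol=1e-8)
p_of_t = sol.y[0]

def confidence_at(judgment_time):
    """Map an observed dichotomous-judgment time to the model confidence p(t);
    these p values, rather than subjective scores, would feed ROC generation."""
    return float(np.interp(judgment_time, t_grid, p_of_t))

print(confidence_at(1.0), confidence_at(5.0), p_of_t[-1])  # tends toward p* = 0.75
```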

  3. Spherical images and inextensible curved folding

    Science.gov (United States)

    Seffen, Keith A.

    2018-02-01

    In their study, Duncan and Duncan [Proc. R. Soc. London A 383, 191 (1982), 10.1098/rspa.1982.0126] calculate the shape of an inextensible surface folded in two about a general curve. They find the analytical relationships between pairs of generators linked across the fold curve, the shape of the original path, and the fold angle variation along it. They present two special cases of generator layouts for which the fold angle is uniform or the folded curve remains planar, for simplifying practical folding in sheet-metal processes. We verify their special cases by a graphical treatment according to a method of Gauss. We replace the fold curve by a piecewise linear path, which connects vertices of intersecting pairs of hinge lines. Inspired by the d-cone analysis by Farmer and Calladine [Int. J. Mech. Sci. 47, 509 (2005), 10.1016/j.ijmecsci.2005.02.013], we construct the spherical images for developable folding of successive vertices: the operating conditions of the special cases in Duncan and Duncan are then revealed straightforwardly by the geometric relationships between the images. Our approach may be used to synthesize folding patterns for novel deployable and shape-changing surfaces without need of complex calculation.

  4. Converting dose distributions into tumour control probability

    International Nuclear Information System (INIS)

    Nahum, A.E.

    1996-01-01

    The endpoints in radiotherapy that are truly of relevance are not dose distributions but the probability of local control, sometimes known as the Tumour Control Probability (TCP) and the Probability of Normal Tissue Complications (NTCP). A model for the estimation of TCP based on simple radiobiological considerations is described. It is shown that incorporation of inter-patient heterogeneity into the radiosensitivity parameter a through s{sub a} can result in a clinically realistic slope for the dose-response curve. The model is applied to inhomogeneous target dose distributions in order to demonstrate the relationship between dose uniformity and s{sub a}. The consequences of varying clonogenic density are also explored. Finally the model is applied to the target-volume DVHs for patients in a clinical trial of conformal pelvic radiotherapy; the effect of dose inhomogeneities on distributions of TCP are shown as well as the potential benefits of customizing the target dose according to normal-tissue DVHs. (author). 37 refs, 9 figs
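
    A minimal sketch of a Poisson TCP model with inter-patient spread in radiosensitivity is given below, in the spirit of the kind of model the abstract describes. The clonogen density, the mean and spread of the radiosensitivity parameter and the two-bin dose-volume histogram are all assumed illustrative values, not the paper's, and fractionation effects are ignored.

```python
import numpy as np

def tcp_population(dose_gy, vol_cc, alpha_mean=0.3, sigma_alpha=0.07,
                   clonogen_density=1.0e7, n_samples=2000, seed=1):
    """Poisson TCP averaged over a normal distribution of radiosensitivity alpha.

    dose_gy, vol_cc : DVH as matched arrays of total dose and absolute bin volume
    alpha_mean, sigma_alpha : assumed population mean and spread of alpha (1/Gy)
    clonogen_density : assumed clonogens per cm^3
    """
    rng = np.random.default_rng(seed)
    alphas = np.clip(rng.normal(alpha_mean, sigma_alpha, n_samples), 1e-3, None)
    # surviving clonogens per DVH bin for each sampled alpha
    surviving = clonogen_density * vol_cc[None, :] * np.exp(
        -alphas[:, None] * dose_gy[None, :])
    tcp_each = np.exp(-surviving.sum(axis=1))   # Poisson probability of zero survivors
    return tcp_each.mean()

# Illustrative two-bin "DVH": 90 cc at 64 Gy plus a 10 cc cold spot at 56 Gy
print(round(tcp_population(np.array([64.0, 56.0]), np.array([90.0, 10.0])), 3))
```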

  5. Converting dose distributions into tumour control probability

    Energy Technology Data Exchange (ETDEWEB)

    Nahum, A E [The Royal Marsden Hospital, London (United Kingdom). Joint Dept. of Physics

    1996-08-01

    The endpoints in radiotherapy that are truly of relevance are not dose distributions but the probability of local control, sometimes known as the Tumour Control Probability (TCP) and the Probability of Normal Tissue Complications (NTCP). A model for the estimation of TCP based on simple radiobiological considerations is described. It is shown that incorporation of inter-patient heterogeneity into the radiosensitivity parameter a through s{sub a} can result in a clinically realistic slope for the dose-response curve. The model is applied to inhomogeneous target dose distributions in order to demonstrate the relationship between dose uniformity and s{sub a}. The consequences of varying clonogenic density are also explored. Finally the model is applied to the target-volume DVHs for patients in a clinical trial of conformal pelvic radiotherapy; the effect of dose inhomogeneities on distributions of TCP are shown as well as the potential benefits of customizing the target dose according to normal-tissue DVHs. (author). 37 refs, 9 figs.

  6. Failure probability of PWR reactor coolant loop piping

    International Nuclear Information System (INIS)

    Lo, T.; Woo, H.H.; Holman, G.S.; Chou, C.K.

    1984-02-01

    This paper describes the results of assessments performed on the PWR coolant loop piping of Westinghouse and Combustion Engineering plants. For direct double-ended guillotine break (DEGB), consideration was given to crack existence probability, initial crack size distribution, hydrostatic proof test, preservice inspection, leak detection probability, crack growth characteristics, and failure criteria based on the net section stress failure and tearing modulus stability concept. For indirect DEGB, fragilities of major component supports were estimated. The system level fragility was then calculated based on the Boolean expression involving these fragilities. Indirect DEGB due to seismic effects was calculated by convolving the system level fragility and the seismic hazard curve. The results indicate that the probability of occurrence of both direct and indirect DEGB is extremely small, thus, postulation of DEGB in design should be eliminated and replaced by more realistic criteria
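
    For the indirect-DEGB part, the abstract describes convolving a system-level fragility with the seismic hazard curve. A generic sketch of that convolution is shown below; the lognormal fragility parameters and the power-law hazard fit are assumptions for illustration only, not the plant-specific values of the study.

```python
import numpy as np
from scipy.stats import norm

# Lognormal fragility: median capacity A_m (g) and composite log-standard deviation beta.
A_m, beta = 1.2, 0.45                        # assumed system-level fragility

def fragility(a):
    return norm.cdf(np.log(a / A_m) / beta)

def hazard(a):
    """Annual frequency of exceeding peak ground acceleration a (g);
    a simple power-law fit is assumed here for illustration."""
    return 1.0e-3 * a ** (-2.5)

# Annual failure frequency = integral of F(a) * |dH/da| da
a = np.linspace(0.05, 3.0, 2000)
dH_da = np.gradient(hazard(a), a)            # negative, since H decreases with a
annual_failure_freq = np.trapz(fragility(a) * (-dH_da), a)
print(f"annual frequency of seismically induced failure ~ {annual_failure_freq:.2e}")
```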

  7. Probability elements of the mathematical theory

    CERN Document Server

    Heathcote, C R

    2000-01-01

    Designed for students studying mathematical statistics and probability after completing a course in calculus and real variables, this text deals with basic notions of probability spaces, random variables, distribution functions and generating functions, as well as joint distributions and the convergence properties of sequences of random variables. Includes worked examples and over 250 exercises with solutions.

  8. Constructing forward price curves in electricity markets

    DEFF Research Database (Denmark)

    Fleten, S.-E.; Lemming, Jørgen Kjærgaard

    2003-01-01

    We present and analyze a method for constructing approximated high-resolution forward price curves in electricity markets. Because a limited number of forward or futures contracts are traded in the market, only a limited picture of the theoretical continuous forward price curve is available to the analyst. Our method combines the information contained in observed bid and ask prices with information from the forecasts generated by bottom-up models. As an example, we use information concerning the shape of the seasonal variation from a bottom-up model to improve the forward price curve quoted on the Nordic power exchange.

  9. Transition Dipole Moments and Transition Probabilities of the CN Radical

    Science.gov (United States)

    Yin, Yuan; Shi, Deheng; Sun, Jinfeng; Zhu, Zunlue

    2018-04-01

    This paper studies the transition probabilities of electric dipole transitions between 10 low-lying states of the CN radical. These states are X2Σ+, A2Π, B2Σ+, a4Σ+, b4Π, 14Σ‑, 24Π, 14Δ, 16Σ+, and 16Π. The potential energy curves are calculated using the CASSCF method, which is followed by the icMRCI approach with the Davidson correction. The transition dipole moments between different states are calculated. To improve the accuracy of potential energy curves, core–valence correlation and scalar relativistic corrections, as well as the extrapolation of potential energies to the complete basis set limit are included. The Franck–Condon factors and Einstein coefficients of emissions are calculated. The radiative lifetimes are determined for the vibrational levels of the A2Π, B2Σ+, b4Π, 14Σ‑, 24Π, 14Δ, and 16Π states. According to the transition probabilities and radiative lifetimes, some guidelines for detecting these states spectroscopically are proposed. The spin–orbit coupling effect on the spectroscopic and vibrational properties is evaluated. The splitting energy in the A2Π state is determined to be 50.99 cm‑1, which compares well with the experimental ones. The potential energy curves, transition dipole moments, spectroscopic parameters, and transition probabilities reported in this paper can be considered to be very reliable. The results obtained here can be used as guidelines for detecting these transitions, in particular those that have not been measured in previous experiments or have not been observed in the Sun, comets, stellar atmospheres, dark interstellar clouds, and diffuse interstellar clouds.

  10. Global parameter optimization for maximizing radioisotope detection probabilities at fixed false alarm rates

    Energy Technology Data Exchange (ETDEWEB)

    Portnoy, David, E-mail: david.portnoy@jhuapl.edu [Johns Hopkins University Applied Physics Laboratory, 11100 Johns Hopkins Road, Laurel, MD 20723 (United States); Feuerbach, Robert; Heimberg, Jennifer [Johns Hopkins University Applied Physics Laboratory, 11100 Johns Hopkins Road, Laurel, MD 20723 (United States)

    2011-10-01

    Today there is a tremendous amount of interest in systems that can detect radiological or nuclear threats. Many of these systems operate in extremely high throughput situations where delays caused by false alarms can have a significant negative impact. Thus, calculating the tradeoff between detection rates and false alarm rates is critical for their successful operation. Receiver operating characteristic (ROC) curves have long been used to depict this tradeoff. The methodology was first developed in the field of signal detection. In recent years it has been used increasingly in machine learning and data mining applications. It follows that this methodology could be applied to radiological/nuclear threat detection systems. However many of these systems do not fit into the classic principles of statistical detection theory because they tend to lack tractable likelihood functions and have many parameters, which, in general, do not have a one-to-one correspondence with the detection classes. This work proposes a strategy to overcome these problems by empirically finding parameter values that maximize the probability of detection for a selected number of probabilities of false alarm. To find these parameter values a statistical global optimization technique that seeks to estimate portions of a ROC curve is proposed. The optimization combines elements of simulated annealing with elements of genetic algorithms. Genetic algorithms were chosen because they can reduce the risk of getting stuck in local minima. However classic genetic algorithms operate on arrays of Booleans values or bit strings, so simulated annealing is employed to perform mutation in the genetic algorithm. The presented initial results were generated using an isotope identification algorithm developed at Johns Hopkins University Applied Physics Laboratory. The algorithm has 12 parameters: 4 real-valued and 8 Boolean. A simulated dataset was used for the optimization study; the 'threat' set of
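
    A stripped-down sketch of the kind of search the abstract describes is given below: a single-candidate annealing loop that mutates a mixed real/Boolean parameter vector (temperature-scaled Gaussian noise for the real parameters, occasional bit flips for the Boolean ones) and keeps candidates that raise the empirical probability of detection while staying within a false-alarm budget. It is not the Johns Hopkins APL algorithm or its genetic-algorithm machinery; the evaluation callback and all constants are placeholders.

```python
import math
import random

def optimize_roc_point(real0, bool0, eval_rates, pfa_budget,
                       n_iter=2000, t0=1.0, t_final=0.01, seed=0):
    """Maximize empirical POD subject to PFA <= pfa_budget.

    eval_rates(reals, bools) -> (pod, pfa) is assumed to score the
    identification algorithm on labeled threat/background spectra.
    """
    rng = random.Random(seed)
    best_r, best_b = list(real0), list(bool0)
    best_pod, best_pfa = eval_rates(best_r, best_b)
    for i in range(n_iter):
        temp = t0 * (t_final / t0) ** (i / n_iter)        # geometric cooling schedule
        cand_r = [x + rng.gauss(0.0, temp) for x in best_r]
        cand_b = [(not b) if rng.random() < 0.1 * temp else b for b in best_b]
        pod, pfa = eval_rates(cand_r, cand_b)
        if pfa > pfa_budget:
            continue                                      # violates the false-alarm budget
        # accept improvements always, worse feasible candidates with annealing probability
        if pod >= best_pod or rng.random() < math.exp((pod - best_pod) / max(temp, 1e-9)):
            best_r, best_b, best_pod, best_pfa = cand_r, cand_b, pod, pfa
    return best_r, best_b, best_pod, best_pfa
```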

  11. Global parameter optimization for maximizing radioisotope detection probabilities at fixed false alarm rates

    International Nuclear Information System (INIS)

    Portnoy, David; Feuerbach, Robert; Heimberg, Jennifer

    2011-01-01

    Today there is a tremendous amount of interest in systems that can detect radiological or nuclear threats. Many of these systems operate in extremely high throughput situations where delays caused by false alarms can have a significant negative impact. Thus, calculating the tradeoff between detection rates and false alarm rates is critical for their successful operation. Receiver operating characteristic (ROC) curves have long been used to depict this tradeoff. The methodology was first developed in the field of signal detection. In recent years it has been used increasingly in machine learning and data mining applications. It follows that this methodology could be applied to radiological/nuclear threat detection systems. However many of these systems do not fit into the classic principles of statistical detection theory because they tend to lack tractable likelihood functions and have many parameters, which, in general, do not have a one-to-one correspondence with the detection classes. This work proposes a strategy to overcome these problems by empirically finding parameter values that maximize the probability of detection for a selected number of probabilities of false alarm. To find these parameter values a statistical global optimization technique that seeks to estimate portions of a ROC curve is proposed. The optimization combines elements of simulated annealing with elements of genetic algorithms. Genetic algorithms were chosen because they can reduce the risk of getting stuck in local minima. However classic genetic algorithms operate on arrays of Booleans values or bit strings, so simulated annealing is employed to perform mutation in the genetic algorithm. The presented initial results were generated using an isotope identification algorithm developed at Johns Hopkins University Applied Physics Laboratory. The algorithm has 12 parameters: 4 real-valued and 8 Boolean. A simulated dataset was used for the optimization study; the 'threat' set of spectra

  12. Global parameter optimization for maximizing radioisotope detection probabilities at fixed false alarm rates

    Science.gov (United States)

    Portnoy, David; Feuerbach, Robert; Heimberg, Jennifer

    2011-10-01

    Today there is a tremendous amount of interest in systems that can detect radiological or nuclear threats. Many of these systems operate in extremely high throughput situations where delays caused by false alarms can have a significant negative impact. Thus, calculating the tradeoff between detection rates and false alarm rates is critical for their successful operation. Receiver operating characteristic (ROC) curves have long been used to depict this tradeoff. The methodology was first developed in the field of signal detection. In recent years it has been used increasingly in machine learning and data mining applications. It follows that this methodology could be applied to radiological/nuclear threat detection systems. However many of these systems do not fit into the classic principles of statistical detection theory because they tend to lack tractable likelihood functions and have many parameters, which, in general, do not have a one-to-one correspondence with the detection classes. This work proposes a strategy to overcome these problems by empirically finding parameter values that maximize the probability of detection for a selected number of probabilities of false alarm. To find these parameter values a statistical global optimization technique that seeks to estimate portions of a ROC curve is proposed. The optimization combines elements of simulated annealing with elements of genetic algorithms. Genetic algorithms were chosen because they can reduce the risk of getting stuck in local minima. However classic genetic algorithms operate on arrays of Booleans values or bit strings, so simulated annealing is employed to perform mutation in the genetic algorithm. The presented initial results were generated using an isotope identification algorithm developed at Johns Hopkins University Applied Physics Laboratory. The algorithm has 12 parameters: 4 real-valued and 8 Boolean. A simulated dataset was used for the optimization study; the "threat" set of spectra

  13. Experiments with conjugate gradient algorithms for homotopy curve tracking

    Science.gov (United States)

    Irani, Kashmira M.; Ribbens, Calvin J.; Watson, Layne T.; Kamat, Manohar P.; Walker, Homer F.

    1991-01-01

    There are algorithms for finding zeros or fixed points of nonlinear systems of equations that are globally convergent for almost all starting points, i.e., with probability one. The essence of all such algorithms is the construction of an appropriate homotopy map and then tracking some smooth curve in the zero set of this homotopy map. HOMPACK is a mathematical software package implementing globally convergent homotopy algorithms with three different techniques for tracking a homotopy zero curve, and has separate routines for dense and sparse Jacobian matrices. The HOMPACK algorithms for sparse Jacobian matrices use a preconditioned conjugate gradient algorithm for the computation of the kernel of the homotopy Jacobian matrix, a required linear algebra step for homotopy curve tracking. Here, variants of the conjugate gradient algorithm are implemented in the context of homotopy curve tracking and compared with Craig's preconditioned conjugate gradient method used in HOMPACK. The test problems used include actual large scale, sparse structural mechanics problems.

  14. GHG emissions, GDP growth and the Kyoto Protocol: A revisit of Environmental Kuznets Curve hypothesis

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Wei Ming; Lee, Grace W.M. [Graduate Institute of Environmental Engineering, National Taiwan University, 71, Chou-Shan Road, Taipei 106 (China); Wu, Chih Cheng [Energy and Air Pollution Control Section, New Materials R and D Department, China Steel Corporation, 1, Chung-Kang Road, Siaogang District, Kaohsiung 81233 (China)

    2008-01-15

    The Kyoto Protocol attempts through political negotiations to guide participating industrialized countries' greenhouse gas (GHG) emissions from a positive growing trend, to reach a peak point (or turning point), and then be reduced to a negative growth. That means the relationship between decreasing GHG emissions and economic growth may be described by an inverted-U curve (or called a bell-shaped curve), which is consistent with the concept of the Environmental Kuznets Curve (EKC) hypothesis. This research observed that the economic development and GHG emissions in Economies in Transition (EITs) exhibit a hockey-stick curve trend (or called quasi-L-shape curve), that also generates a lot of 'hot air' which is significant to the implementation of the Kyoto Protocol. In addition, through the analysis of single-country time series data and GDP data, this research demonstrated that statistical data for most of the Annex II countries do not possess evidence that supports the EKC hypothesis for GHG emissions. The results from this study also indicated that the 38 industrialized countries are unable to meet their targets under the Kyoto Protocol within the specified time period, which are probably caused by the econometric method's inability to predict accurately the extents and development of innovative technologies and Clean Development Mechanism (CDM) projects. If the international community truly wants to reduce the GHG emissions, the effectiveness of the existing international framework for emissions reduction needs to be reconsidered seriously, and the global cooperation mechanism also needs to be greatly enhanced. (author)

  15. GHG emissions, GDP growth and the Kyoto Protocol: A revisit of Environmental Kuznets Curve hypothesis

    Energy Technology Data Exchange (ETDEWEB)

    Huang Weiming [Graduate Institute of Environmental Engineering, National Taiwan University, 71, Chou-Shan Road, Taipei 106, Taiwan (China); Lee, Grace W.M. [Graduate Institute of Environmental Engineering, National Taiwan University, 71, Chou-Shan Road, Taipei 106, Taiwan (China)], E-mail: gracelee@ntu.edu.tw; Wu Chihcheng [Energy and Air Pollution Control Section, New Materials R and D Department, China Steel Corporation, 1, Chung-Kang Road, Siaogang District, Kaohsiung 81233, Taiwan (China)

    2008-01-15

    The Kyoto Protocol attempts through political negotiations to guide participating industrialized countries' greenhouse gas (GHG) emissions from a positive growing trend, to reach a peak point (or turning point), and then be reduced to a negative growth. That means the relationship between decreasing GHG emissions and economic growth may be described by an inverted-U curve (or called a bell-shaped curve), which is consistent with the concept of the Environmental Kuznets Curve (EKC) hypothesis. This research observed that the economic development and GHG emissions in Economies in Transition (EITs) exhibit a hockey-stick curve trend (or called quasi-L-shape curve), that also generates a lot of 'hot air' which is significant to the implementation of the Kyoto Protocol. In addition, through the analysis of single-country time series data and GDP data, this research demonstrated that statistical data for most of the Annex II countries do not possess evidence that supports the EKC hypothesis for GHG emissions. The results from this study also indicated that the 38 industrialized countries are unable to meet their targets under the Kyoto Protocol within the specified time period, which are probably caused by the econometric method's inability to predict accurately the extents and development of innovative technologies and Clean Development Mechanism (CDM) projects. If the international community truly wants to reduce the GHG emissions, the effectiveness of the existing international framework for emissions reduction needs to be reconsidered seriously, and the global cooperation mechanism also needs to be greatly enhanced.

  16. GHG emissions, GDP growth and the Kyoto Protocol: A revisit of Environmental Kuznets Curve hypothesis

    International Nuclear Information System (INIS)

    Huang Weiming; Lee, Grace W.M.; Wu Chihcheng

    2008-01-01

    The Kyoto Protocol attempts through political negotiations to guide participating industrialized countries' greenhouse gas (GHG) emissions from a positive growing trend, to reach a peak point (or turning point), and then be reduced to a negative growth. That means the relationship between decreasing GHG emissions and economic growth may be described by an inverted-U curve (or called a bell-shaped curve), which is consistent with the concept of the Environmental Kuznets Curve (EKC) hypothesis. This research observed that the economic development and GHG emissions in Economies in Transition (EITs) exhibit a hockey-stick curve trend (or called quasi-L-shape curve), that also generates a lot of 'hot air' which is significant to the implementation of the Kyoto Protocol. In addition, through the analysis of single-country time series data and GDP data, this research demonstrated that statistical data for most of the Annex II countries do not possess evidence that supports the EKC hypothesis for GHG emissions. The results from this study also indicated that the 38 industrialized countries are unable to meet their targets under the Kyoto Protocol within the specified time period, which are probably caused by the econometric method's inability to predict accurately the extents and development of innovative technologies and Clean Development Mechanism (CDM) projects. If the international community truly wants to reduce the GHG emissions, the effectiveness of the existing international framework for emissions reduction needs to be reconsidered seriously, and the global cooperation mechanism also needs to be greatly enhanced

  17. Projection of curves on B-spline surfaces using quadratic reparameterization

    KAUST Repository

    Yang, Yijun

    2010-09-01

    Curves on surfaces play an important role in computer aided geometric design. In this paper, we present a hyperbola approximation method based on the quadratic reparameterization of Bézier surfaces, which generates reasonable low degree curves lying completely on the surfaces by using iso-parameter curves of the reparameterized surfaces. The Hausdorff distance between the projected curve and the original curve is controlled under the user-specified distance tolerance. The projected curve is T-G 1 continuous, where T is the user-specified angle tolerance. Examples are given to show the performance of our algorithm. © 2010 Elsevier Inc. All rights reserved.

  18. Constructing forward price curves in electricity markets

    International Nuclear Information System (INIS)

    Fleten, Stein-Erik; Lemming, Jacob

    2003-01-01

    We present and analyze a method for constructing approximated high-resolution forward price curves in electricity markets. Because a limited number of forward or futures contracts are traded in the market, only a limited picture of the theoretical continuous forward price curve is available to the analyst. Our method combines the information contained in observed bid and ask prices with information from the forecasts generated by bottom-up models. As an example, we use information concerning the shape of the seasonal variation from a bottom-up model to improve the forward price curve quoted on the Nordic power exchange

  19. Electro-Mechanical Resonance Curves

    Science.gov (United States)

    Greenslade, Thomas B., Jr.

    2018-01-01

    Recently I have been investigating the frequency response of galvanometers. These are direct-current devices used to measure small currents. By using a low-frequency function generator to supply the alternating-current signal and a stopwatch smartphone app to measure the period, I was able to take data to allow a resonance curve to be drawn. This…

  20. Projection of curves on B-spline surfaces using quadratic reparameterization

    KAUST Repository

    Yang, Yijun; Zeng, Wei; Zhang, Hui; Yong, Junhai; Paul, Jean Claude

    2010-01-01

    Curves on surfaces play an important role in computer aided geometric design. In this paper, we present a hyperbola approximation method based on the quadratic reparameterization of Bézier surfaces, which generates reasonable low degree curves lying

  1. Limitations of acceptability curves for presenting uncertainty in cost-effectiveness analysis

    NARCIS (Netherlands)

    Groot Koerkamp, Bas; Hunink, M. G. Myriam; Stijnen, Theo; Hammitt, James K.; Kuntz, Karen M.; Weinstein, Milton C.

    2007-01-01

    Clinical journals increasingly illustrate uncertainty about the cost and effect of health care interventions using cost-effectiveness acceptability curves (CEACs). CEACs present the probability that each competing alternative is optimal for a range of values of the cost-effectiveness threshold. The

  2. Semantic and associative factors in probability learning with words.

    Science.gov (United States)

    Schipper, L M; Hanson, B L; Taylor, G; Thorpe, J A

    1973-09-01

    Using a probability-learning technique with a single word as the cue and with the probability of a given event following this word fixed at .80, it was found that (1) neither high nor low associates of the original word and (2) neither synonyms nor antonyms showed differential learning curves subsequent to original learning when the probability of the following event was shifted to .20. In a second study, when feedback in the form of knowledge of results was withheld, there was a clear-cut similarity of predictions to the originally trained word and to synonyms of both high and low association value, and a dissimilarity of these words to a set of antonyms of both high and low association value. Two additional studies confirmed the importance of the semantic dimension as compared with association value as traditionally measured.

  3. Energy dependence of contrast-detail-dose and object-detectability-dose curves for CT scanners

    International Nuclear Information System (INIS)

    Wagner, L.K.; Cohen, G.

    1982-01-01

    The energy dependence of contrast-detail-dose (CdD) and object-detectability-dose (OdD) curves for computed tomographic scanners is investigated. The effects of changes in beam energy on perceptibility are shown to be due to changes in signal-to-noise ratio resulting from changes in contrast and photon statistics. Energy-dependence analysis of OdD curves is shown to depend on the atomic composition of the phantom used to generate the curves, while such an analysis of CdD curves is independent of the atomic composition of the phantom. It is also shown that any OdD curve can be generated from CdD curves and that use of this fact rectifies any potential energy-dependent interpretation of CdD curves

  4. Probabilistic evaluation of design S-N curve and reliability assessment of ASME code-based evaluation

    International Nuclear Information System (INIS)

    Zhao Yongxiang

    1999-01-01

    A probabilistic approach for evaluating the design S-N curve and a reliability assessment of the ASME code-based evaluation are presented on the basis of Langer S-N model-based P-S-N curves. The P-S-N curves are estimated by a so-called general maximum likelihood method. This method can deal with virtual stress amplitude-crack initiation life data, which have the character of double random variables. Investigation of a set of virtual stress amplitude-crack initiation life (S-N) data for a 1Cr18Ni9Ti austenitic stainless steel welded joint reveals that the P-S-N curves give a good prediction of the scatter of the S-N data. The probabilistic evaluation of the design S-N curve with 0.9999 survival probability takes various uncertainties, besides the scatter of the S-N data, into account to an appropriate extent. The ASME code-based evaluation with a reduction factor of 20 on the mean life is much more conservative than that with a reduction factor of 2 on the stress amplitude. The latter evaluation is equivalent to a survival probability of 0.999522 at a virtual stress amplitude of 666.61 MPa and to 0.9999999995 at 2092.18 MPa. This means that the evaluation may be non-conservative at low loading levels and, in contrast, too conservative at high loading levels. The cause is that the reduction factors are constants and cannot take into account the general observation that the scatter of the N data increases as the loading level decreases. This indicates that it is necessary to apply the probabilistic approach to the evaluation of the design S-N curve
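
    The contrast between a survival-probability design curve and fixed reduction factors can be illustrated with a deliberately simplified sketch: assume a Basquin-type mean curve with lognormal life scatter (not the Langer model or the paper's general maximum likelihood fit) and compare the 0.9999-survival life with the "mean life divided by 20" and "life at twice the stress amplitude" constructions. All constants are invented for illustration.

```python
import numpy as np
from scipy.stats import norm

# Illustrative Basquin-type mean curve: log10(N) = A - m * log10(S),
# with lognormal scatter in life (sigma_log). Constants are assumptions.
A, m, sigma_log = 12.0, 3.0, 0.25

def log_mean_life(S):
    return A - m * np.log10(S)

def design_life(S, survival=0.9999):
    """Life with the given survival probability at stress amplitude S (MPa)."""
    z = norm.ppf(1.0 - survival)                  # negative quantile
    return 10.0 ** (log_mean_life(S) + z * sigma_log)

S = np.array([300.0, 500.0, 800.0])               # MPa, virtual stress amplitude
mean_life = 10.0 ** log_mean_life(S)
print("P = 0.9999 curve  :", design_life(S).round(0))
print("mean life / 20    :", (mean_life / 20.0).round(0))
print("life at 2x stress :", (10.0 ** log_mean_life(2.0 * S)).round(0))
```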

  5. Section curve reconstruction and mean-camber curve extraction of a point-sampled blade surface.

    Directory of Open Access Journals (Sweden)

    Wen-long Li

    The blade is one of the most critical parts of an aviation engine, and a small change in the blade geometry may significantly affect the dynamics performance of the aviation engine. Rapid advancements in 3D scanning techniques have enabled the inspection of the blade shape using a dense and accurate point cloud. This paper proposes a new method for achieving two common tasks in blade inspection: section curve reconstruction and mean-camber curve extraction from a point-cloud representation. Mathematical morphology is expanded and applied to restrain the effect of measuring defects and to generate an ordered sequence of 2D measured points in the section plane. Then, the energy and distance are minimized to iteratively smooth the measured points, approximate the section curve and extract the mean-camber curve. In addition, a turbine blade is machined and scanned to observe the curvature variation, energy variation and approximation error, which demonstrates the feasibility of the proposed method. The proposed method is simple to implement and can be applied in aviation casting-blade finish inspection, large forging-blade allowance inspection and visual-guided robot grinding localization.

  6. Modular forms and special cycles on Shimura curves (AM-161)

    CERN Document Server

    Kudla, Stephen S; Yang, Tonghai

    2006-01-01

    Modular Forms and Special Cycles on Shimura Curves is a thorough study of the generating functions constructed from special cycles, both divisors and zero-cycles, on the arithmetic surface M attached to a Shimura curve M over the field of rational numbers. These generating functions are shown to be the q-expansions of modular forms and Siegel modular forms of genus two respectively, valued in the Gillet-Soulé arithmetic Chow groups of M. The two types of generating functions are related via an arithmetic inner product formula. In addition, an analogue of the classical Siegel-Weil

  7. Evaluation of nuclear power plant component failure probability and core damage probability using simplified PSA model

    International Nuclear Information System (INIS)

    Shimada, Yoshio

    2000-01-01

    It is anticipated that changes in the frequency of surveillance tests, preventive maintenance or parts replacement of safety-related components may change component failure probabilities and hence the core damage probability, and that the change differs depending on the initiating event frequency and the component type. This study assessed the change of core damage probability using a simplified PSA model capable of calculating core damage probability in a short time period, developed by the US NRC to process accident sequence precursors, when various component failure probabilities are changed between 0 and 1, or when Japanese or American initiating event frequency data are used. As a result of the analysis: (1) It was clarified that the frequency of surveillance tests, preventive maintenance or parts replacement of motor driven pumps (high pressure injection pumps, residual heat removal pumps, auxiliary feedwater pumps) should be changed carefully, since the change in core damage probability is large when the base failure probability changes in the increasing direction. (2) The core damage probability is insensitive to changes in surveillance test frequency for motor operated valves and the turbine driven auxiliary feedwater pump, since the change in core damage probability is small even when their failure probabilities change by about one order of magnitude. (3) The change in core damage probability is small when Japanese failure probability data are applied to the emergency diesel generator, even if the failure probability changes by one order of magnitude from the base value. On the other hand, when American failure probability data are applied, the increase in core damage probability is large, even if the failure probability changes in the increasing direction. Therefore, when Japanese failure probability data are applied, the core damage probability is insensitive to changes in surveillance test frequency, etc. (author)

  8. ArcCN-Runoff: An ArcGIS tool for generating curve number and runoff maps

    Science.gov (United States)

    Zhan, X.; Huang, M.-L.

    2004-01-01

    The development and the application of the ArcCN-Runoff tool, an extension of ESRI ArcGIS software, are reported. This tool can be applied to determine curve numbers and to calculate runoff or infiltration for a rainfall event in a watershed. Implementation of GIS techniques such as dissolving, intersecting, and a curve-number reference table improves efficiency. Technical processing time may be reduced from days, if not weeks, to hours for producing spatially varied curve number and runoff maps. An application example for a watershed in Lyon County and Osage County, Kansas, USA, is presented. © 2004 Elsevier Ltd. All rights reserved.
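
    The curve-number runoff calculation that the tool automates is itself compact. Below is a minimal sketch of the standard SCS curve-number equation with the usual initial abstraction of 0.2S, using a made-up rainfall depth and composite curve number; the tool's GIS steps (dissolving, intersecting, table lookup) are not reproduced.

```python
def scs_runoff(precip_in, cn):
    """SCS curve-number runoff depth (inches), with initial abstraction Ia = 0.2*S."""
    s = 1000.0 / cn - 10.0          # potential maximum retention (inches)
    ia = 0.2 * s
    if precip_in <= ia:
        return 0.0
    return (precip_in - ia) ** 2 / (precip_in + 0.8 * s)

# Example: 3 inches of rain on a watershed polygon with composite CN = 78
print(round(scs_runoff(3.0, 78), 2))   # runoff depth in inches
```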

  9. Input-profile-based software failure probability quantification for safety signal generation systems

    International Nuclear Information System (INIS)

    Kang, Hyun Gook; Lim, Ho Gon; Lee, Ho Jung; Kim, Man Cheol; Jang, Seung Cheol

    2009-01-01

    Approaches for software failure probability estimation are mainly based on the results of testing. Test cases represent the inputs which are encountered in actual use. The test inputs for a safety-critical application such as a reactor protection system (RPS) of a nuclear power plant are the inputs which cause the activation of a protective action such as a reactor trip. A digital system treats inputs from instrumentation sensors as discrete digital values by using an analog-to-digital converter. The input profile must be determined in consideration of these characteristics for effective software failure probability quantification. Another important characteristic of software testing is that we do not have to repeat the test for the same input value, since the software response is deterministic for each specific digital input. With these considerations, we propose an effective software testing method for quantifying the failure probability. As an example application, the input profile of the digital RPS is developed based on typical plant data. The proposed method is expected to provide a simple but realistic means to quantify the software failure probability based on the input profile and system dynamics.
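
    As a deliberately simplified illustration of how test results bound a failure probability (not the paper's input-profile method itself), the classic zero-failure relation below gives an upper confidence bound on the per-demand failure probability after n profile-weighted tests with no observed failure.

```python
def failure_prob_upper_bound(n_tests, confidence=0.95, failures=0):
    """Upper confidence bound on per-demand failure probability after testing
    n_tests inputs drawn from the operational input profile.

    For the failure-free case this reduces to p <= 1 - (1 - confidence)**(1/n).
    """
    if failures == 0:
        return 1.0 - (1.0 - confidence) ** (1.0 / n_tests)
    raise NotImplementedError("this sketch covers only the failure-free case")

# Example: 30,000 distinct profile-weighted test inputs, no failures observed
print(f"{failure_prob_upper_bound(30_000):.2e}")   # ~1e-4 per demand at 95 % confidence
```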

  10. Calibration curves for biological dosimetry

    International Nuclear Information System (INIS)

    Guerrero C., C.; Brena V., M., E-mail: cgc@nuclear.inin.mx

    2004-01-01

    The information generated by investigations in different laboratories around the world, including the ININ, which establish that certain classes of chromosomal aberrations increase as a function of dose and radiation type, has resulted in calibration curves that are applied in the technique known as biological dosimetry. This work presents a summary of the work carried out in the laboratory, including the calibration curves for cobalt-60 gamma radiation and 250 kVp X-rays, examples of presumed exposures to ionizing radiation resolved by means of aberration analysis and the corresponding dose estimates obtained from the equations of the respective curves, and finally a comparison between the dose estimates for the people affected by the Ciudad Juarez accident carried out by the Oak Ridge group (USA) and those obtained in this laboratory. (Author)
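
    Dose estimation from such a calibration curve amounts to inverting a linear-quadratic dose-response, Y = c + αD + βD², for the observed aberration yield. The sketch below shows that inversion with invented coefficients; they are not the ININ fitted values.

```python
import math

def dose_from_yield(y, c=0.001, alpha=0.03, beta=0.06):
    """Invert a linear-quadratic calibration curve Y = c + alpha*D + beta*D**2
    to estimate absorbed dose D (Gy) from an observed aberration yield Y
    (e.g., dicentrics per cell). Coefficients are illustrative assumptions.
    """
    disc = alpha ** 2 + 4.0 * beta * (y - c)
    if disc < 0:
        raise ValueError("yield below background for these coefficients")
    return (-alpha + math.sqrt(disc)) / (2.0 * beta)

# Example: 48 dicentrics scored in 500 cells -> yield of 0.096 per cell
print(round(dose_from_yield(48 / 500), 2), "Gy")
```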

  11. ROC-ing along: Evaluation and interpretation of receiver operating characteristic curves.

    Science.gov (United States)

    Carter, Jane V; Pan, Jianmin; Rai, Shesh N; Galandiuk, Susan

    2016-06-01

    It is vital for clinicians to understand and interpret correctly medical statistics as used in clinical studies. In this review, we address current issues and focus on delivering a simple, yet comprehensive, explanation of common research methodology involving receiver operating characteristic (ROC) curves. ROC curves are used most commonly in medicine as a means of evaluating diagnostic tests. Sample data from a plasma test for the diagnosis of colorectal cancer were used to generate a prediction model. These are actual, unpublished data that have been used to describe the calculation of sensitivity, specificity, positive predictive and negative predictive values, and accuracy. The ROC curves were generated to determine the accuracy of this plasma test. These curves are generated by plotting the sensitivity (true-positive rate) on the y axis and 1 - specificity (false-positive rate) on the x axis. Curves that approach closest to the coordinate (x = 0, y = 1) are more highly predictive, whereas ROC curves that lie close to the line of equality indicate that the result is no better than that obtained by chance. The optimum sensitivity and specificity can be determined from the graph as the point where the minimum distance line crosses the ROC curve. This point corresponds to the Youden index (J), a function of sensitivity and specificity used commonly to rate diagnostic tests. The area under the curve is used to quantify the overall ability of a test to discriminate between 2 outcomes. By following these simple guidelines, interpretation of ROC curves will be less difficult and they can then be interpreted more reliably when writing, reviewing, or analyzing scientific papers. Copyright © 2016 Elsevier Inc. All rights reserved.
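
    The quantities discussed above are straightforward to compute directly. The sketch below builds ROC points from a made-up score/label sample (loosely echoing the plasma-test setting, but not the paper's data), picks the threshold that maximizes the Youden index J = sensitivity + specificity - 1, and integrates the area under the curve with the trapezoidal rule.

```python
import numpy as np

def roc_points(scores, labels):
    """Sensitivity and 1 - specificity at every threshold, sorted for plotting.
    scores: continuous test values; labels: 1 = disease present, 0 = absent."""
    scores, labels = np.asarray(scores, float), np.asarray(labels, int)
    thresholds = np.unique(scores)
    sens, fpr = [], []
    for t in thresholds:
        pred = scores >= t
        tp, fn = np.sum(pred & (labels == 1)), np.sum(~pred & (labels == 1))
        fp, tn = np.sum(pred & (labels == 0)), np.sum(~pred & (labels == 0))
        sens.append(tp / (tp + fn))
        fpr.append(fp / (fp + tn))
    order = np.lexsort((sens, fpr))          # sort by FPR, then sensitivity
    return np.array(fpr)[order], np.array(sens)[order], thresholds[order]

scores = [0.2, 0.4, 0.45, 0.5, 0.6, 0.7, 0.8, 0.85, 0.9, 0.95]   # made-up test values
labels = [0,   0,   0,    1,   0,   1,   1,   0,    1,   1]
fpr, sens, thr = roc_points(scores, labels)
youden = sens - fpr                      # Youden index J at each threshold
auc = np.trapz(sens, fpr)                # area under the curve (trapezoidal rule)
print("best threshold:", thr[np.argmax(youden)],
      " J =", round(youden.max(), 2), " AUC =", round(auc, 2))
```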

  12. Comparative analysis of costs for the generation of electrical energy in Brazil using the screening curve

    International Nuclear Information System (INIS)

    Barros, Thaisa C. de; Castrillo, Lazara S.; Xavier, Aline G.; Goncalves, Gabriela de L.; Melo, Julio Angelo

    2017-01-01

    To meet the demand for electricity consumption, it is fundamental to prepare an efficient energy plan that guarantees a secure supply of energy, so that the price of the kilowatt-hour for the consumer respects the tariff modality and the socio-environmental impact is as small as possible. In recent years, alternative sources of energy have been gaining considerable space in the Brazilian generating park. Among the many options for energy supply, choosing the most feasible involves the use of techniques that compare all the costs involved in generating electricity from the sources available in the Brazilian energy matrix. The objective of this work is to show a quick, practical and objective tool that performs this comparison and assists in decision making. The comparison is based on the cost of energy, and the tool for applying it is the screening curve, widely used by the International Atomic Energy Agency (IAEA). In this analysis, the following parameters are considered: operating capacity, toxic gas emission rate, fuel consumption and cost, fixed and variable operation and maintenance costs, investment values for each source, construction time and useful life. It is worth noting that the method does not consider aspects such as inflation, forced outages of the plant and other more specific factors. With this work, it was possible to examine the costs of the generation technologies available in Brazil and, through the data obtained, the economic viability of the generating parks was discussed through simulations in different scenarios, comparing the sources among themselves. (author)
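
    A screening-curve comparison reduces to plotting, for each technology, the total annual cost per kW of capacity as a function of capacity factor: annualized investment plus fixed O&M, plus fuel and variable O&M scaled by the energy produced. The sketch below does exactly that with invented cost figures (not Brazilian-matrix or IAEA data) and reports the cheapest option at a few capacity factors.

```python
def annualized_capital(overnight_cost, rate, lifetime_yr):
    """Capital recovery factor times overnight cost ($/kW-yr)."""
    crf = rate * (1 + rate) ** lifetime_yr / ((1 + rate) ** lifetime_yr - 1)
    return overnight_cost * crf

def screening_curve(tech, cf):
    """Total annual cost in $/kW-yr of one technology at capacity factor cf (0-1)."""
    fixed = annualized_capital(tech["capex_per_kw"], tech["rate"],
                               tech["life_yr"]) + tech["fom_per_kw_yr"]
    mwh_per_kw_yr = cf * 8760.0 / 1000.0
    return fixed + (tech["fuel_per_mwh"] + tech["vom_per_mwh"]) * mwh_per_kw_yr

techs = {   # illustrative numbers only
    "hydro":   dict(capex_per_kw=2500, rate=0.08, life_yr=50, fom_per_kw_yr=40,
                    fuel_per_mwh=0, vom_per_mwh=2),
    "gas CC":  dict(capex_per_kw=900, rate=0.08, life_yr=30, fom_per_kw_yr=25,
                    fuel_per_mwh=45, vom_per_mwh=4),
    "nuclear": dict(capex_per_kw=5000, rate=0.08, life_yr=60, fom_per_kw_yr=100,
                    fuel_per_mwh=9, vom_per_mwh=2),
}
for cf in (0.2, 0.5, 0.9):
    costs = {name: round(screening_curve(t, cf), 1) for name, t in techs.items()}
    print(cf, "cheapest:", min(costs, key=costs.get), costs)
```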

  13. Comparative analysis of costs for the generation of electrical energy in Brazil using the screening curve

    Energy Technology Data Exchange (ETDEWEB)

    Barros, Thaisa C. de; Castrillo, Lazara S.; Xavier, Aline G.; Goncalves, Gabriela de L.; Melo, Julio Angelo, E-mail: barros.camara@gmail.com, E-mail: lazaracastrillo@hotmail.com, E-mail: alinegxavier@gmail.com, E-mail: gabilimag_@hotmail.com, E-mail: angelo_mecanic@hotmail.com [Escola Politecnica de Pernambuco (UPE), Recife, PE (Brazil). Departamento de Engenharia Mecanica Industrial

    2017-11-01

    To meet the demand for electricity consumption, it is fundamental to prepare an efficient energy plan that guarantees a secure supply of energy, so that the price of the kilowatt-hour for the consumer respects the tariff modality and the socio-environmental impact is as small as possible. In recent years, alternative sources of energy have been gaining considerable space in the Brazilian generating park. Among the many options for energy supply, choosing the most feasible involves the use of techniques that compare all the costs involved in generating electricity from the sources available in the Brazilian energy matrix. The objective of this work is to show a quick, practical and objective tool that performs this comparison and assists in decision making. The comparison is based on the cost of energy, and the tool for applying it is the screening curve, widely used by the International Atomic Energy Agency (IAEA). In this analysis, the following parameters are considered: operating capacity, toxic gas emission rate, fuel consumption and cost, fixed and variable operation and maintenance costs, investment values for each source, construction time and useful life. It is worth noting that the method does not consider aspects such as inflation, forced outages of the plant and other more specific factors. With this work, it was possible to examine the costs of the generation technologies available in Brazil and, through the data obtained, the economic viability of the generating parks was discussed through simulations in different scenarios, comparing the sources among themselves. (author)

  14. Antisideslip and Antirollover Safety Speed Controller Design for Vehicle on Curved Road

    Directory of Open Access Journals (Sweden)

    Guo Lie

    2014-01-01

    When drivers are not aware of a forthcoming curved road and fail to regulate their speed accordingly, sideslip or rollover may occur with high probability. The antisideslip and antirollover control of a vehicle on a curved road in automatic highway systems is studied. First, a safety speed warning system is set before entering the curved road. Speed adhesion control is adopted to shorten the braking distance while decelerating and to guarantee the safety speed. The velocity controller for decelerating on the straight path and the posture controller for driving on the curved road are designed, respectively, utilizing integral backstepping technology. Simulation results demonstrate that this control system is characterized by quick and precise tracking and global stability. Consequently, it is able to avoid dangerous operating conditions, such as sideslip and rollover, and to guarantee safety and directional stability when driving on a curved road.

  15. {sup 134}Cs emission probabilities determination by gamma spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Almeida, M.C.M. de, E-mail: candida@cnen.gov.br [Comissão Nacional de Energia Nuclear (DINOR/CNEN), Riode Janeiro, RJ (Brazil); Poledna, R.; Delgado, J.U.; Silva, R.L.; Araujo, M.T.; Silva, C.J. da [Instituto de Radioproteção e Dosimetria (LNMRI/IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2017-07-01

    The National Laboratory for Ionizing Radiation Metrology (LNMRI/IRD/CNEN) of Rio de Janeiro performed primary and secondary standardization of different radionuclides, reaching satisfactory uncertainties. A solution of the {sup 134}Cs radionuclide was purchased from a commercial supplier for the determination of the emission probabilities of some of its energies. {sup 134}Cs is a beta-gamma emitter with a half-life of 754 days. This radionuclide is used as a standard in environmental, water and food control. It is also important for germanium detector calibration. The gamma emission probabilities (Pγ) were determined mainly for some energies of {sup 134}Cs by the efficiency curve method, and the absolute Pγ uncertainties obtained were below 1% (k=1). (author)

  16. Principal Curves on Riemannian Manifolds.

    Science.gov (United States)

    Hauberg, Soren

    2016-09-01

    Euclidean statistics are often generalized to Riemannian manifolds by replacing straight-line interpolations with geodesic ones. While these Riemannian models are familiar-looking, they are restricted by the inflexibility of geodesics, and they rely on constructions which are optimal only in Euclidean domains. We consider extensions of Principal Component Analysis (PCA) to Riemannian manifolds. Classic Riemannian approaches seek a geodesic curve passing through the mean that optimizes a criterion of interest. The requirements that the solution both be geodesic and pass through the mean tend to imply that the methods only work well when the manifold is mostly flat within the support of the generating distribution. We argue that instead of generalizing linear Euclidean models, it is more fruitful to generalize non-linear Euclidean models. Specifically, we extend the classic Principal Curves from Hastie & Stuetzle to data residing on a complete Riemannian manifold. We show that for elliptical distributions in the tangent space of spaces of constant curvature, the standard principal geodesic is a principal curve. The proposed model is simple to compute and avoids many of the pitfalls of traditional geodesic approaches. We empirically demonstrate the effectiveness of the Riemannian principal curves on several manifolds and datasets.

  17. Pricing Energy and Ancillary Services in a Day-Ahead Market for a Price-Taker Hydro Generating Company Using a Risk-Constrained Approach

    Directory of Open Access Journals (Sweden)

    Perica Ilak

    2014-04-01

    Full Text Available This paper analyzes a price-taker hydro generating company which participates simultaneously in day-ahead energy and ancillary services markets. An approach for deriving marginal cost curves for energy and ancillary services is proposed, taking into consideration price uncertainty and opportunity cost of water, which can later be used to determine hourly bid curves. The proposed approach combines an hourly conditional value-at-risk, probability of occurrence of automatic generation control states and an opportunity cost of water to determine energy and ancillary services marginal cost curves. The proposed approach is in a linear constraint form and is easy to implement in optimization problems. A stochastic model of the hydro-economic river basin is presented, based on the actual Vinodol hydropower system in Croatia, with a complex three-dimensional relationship between the power produced, the discharged water, and the head of associated reservoir.

  18. Learning curves in health professions education.

    Science.gov (United States)

    Pusic, Martin V; Boutis, Kathy; Hatala, Rose; Cook, David A

    2015-08-01

    Learning curves, which graphically show the relationship between learning effort and achievement, are common in published education research but are not often used in day-to-day educational activities. The purpose of this article is to describe the generation and analysis of learning curves and their applicability to health professions education. The authors argue that the time is right for a closer look at using learning curves, given their desirable properties, to inform both self-directed instruction by individuals and education management by instructors. A typical learning curve is made up of a measure of learning (y-axis), a measure of effort (x-axis), and a mathematical linking function. At the individual level, learning curves make manifest a single person's progress towards competence, including his/her rate of learning, the inflection point where learning becomes more effortful, and the remaining distance to mastery attainment. At the group level, overlaid learning curves show the full variation of a group of learners' paths through a given learning domain. Specifically, they make overt the difference between time-based and competency-based approaches to instruction. Additionally, instructors can use learning curve information to more accurately target educational resources to those who most require them. The learning curve approach requires a fine-grained collection of data that will not be possible in all educational settings; however, the increased use of an assessment paradigm that explicitly includes effort and its link to individual achievement could result in increased learner engagement and more effective instructional design.
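
    To make the "mathematical linking function" concrete, the sketch below fits one common choice, a negative exponential, to an invented set of effort/score pairs; the model form and all numbers are illustrative assumptions rather than anything taken from the article.

```python
# Fit a negative-exponential learning curve
#   y = y_max - (y_max - y0) * exp(-rate * effort)
# to simulated learner data (scores out of 100 versus practice cases).
import numpy as np
from scipy.optimize import curve_fit

effort = np.arange(1, 21)                                           # practice cases
score = 90 - 55 * np.exp(-0.25 * effort)                            # underlying curve
score = score + np.random.default_rng(0).normal(0, 3, effort.size)  # plus noise

def learning_curve(x, y_max, y0, rate):
    return y_max - (y_max - y0) * np.exp(-rate * x)

(y_max, y0, rate), _ = curve_fit(learning_curve, effort, score, p0=(100, 30, 0.1))
print(f"asymptote {y_max:.1f}, starting level {y0:.1f}, learning rate {rate:.2f}")
```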

  19. Power Curve Estimation With Multivariate Environmental Factors for Inland and Offshore Wind Farms

    KAUST Repository

    Lee, Giwhyun; Ding, Yu; Genton, Marc G.; Xie, Le

    2015-01-01

    In the wind industry, a power curve refers to the functional relationship between the power output generated by a wind turbine and the wind speed at the time of power generation. Power curves are used in practice for a number of important tasks
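
    For context, the conventional single-variable baseline that multivariate power-curve models extend is the "method of bins": average the measured power within narrow wind-speed bins. The sketch below applies it to synthetic turbine data; the turbine model, cut-in speed and noise level are all assumptions.

```python
# Method-of-bins power curve on synthetic data (0.5 m/s bins).
import numpy as np

rng = np.random.default_rng(1)
wind = rng.uniform(0, 25, 50_000)                      # 10-min mean speeds (m/s)
rated = 2000.0                                         # kW, assumed rated power
power = np.clip(rated * (wind / 12.0) ** 3, 0, rated)  # crude cubic turbine model
power[wind < 3] = 0.0                                  # below assumed cut-in speed
power = power + rng.normal(0, 50, wind.size)           # measurement scatter

bins = np.arange(0, 25.5, 0.5)
idx = np.digitize(wind, bins)
curve = [(bins[i - 1] + 0.25, power[idx == i].mean()) for i in np.unique(idx)]
for v, p in curve[::8]:
    print(f"{v:5.2f} m/s -> {p:7.1f} kW")
```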

  20. Analysis of power curves of Danish and foreign wind turbines

    International Nuclear Information System (INIS)

    Petersen, H.

    1995-12-01

    This report describes an analysis of power curves for a number of wind turbines: 30 Danish and 17 foreign (European) wind turbines. The investigation is limited to wind turbines of 150 kW capacity and greater, and to wind turbines for which a power curve is available. The power curves are transformed into a common, uniform presentation in order to facilitate the subsequent treatment, which is primarily the calculation of the electrical energy produced per year. From the known data of the wind turbine, namely the installed generator power and the rotor area swept by the blades, the specific electrical production is calculated in three terms: yield per square meter of rotor area, yield per kW of generator power, and yield per square meter and per kilowatt of generator power. Based on these findings a number of comparisons are established, such as comparisons of conceptual designs and technical-economical evaluations. (au)

  1. What probabilities tell about quantum systems, with application to entropy and entanglement

    CERN Document Server

    Myers, John M

    2010-01-01

    The use of parameters to describe an experimenter's control over the devices used in an experiment is familiar in quantum physics, for example in connection with Bell inequalities. Parameters are also interesting in a different but related context, as we noticed when we proved a formal separation in quantum mechanics between linear operators and the probabilities that these operators generate. In comparing an experiment against its description by a density operator and detection operators, one compares tallies of experimental outcomes against the probabilities generated by the operators but not directly against the operators. Recognizing that the accessibility of operators to experimental tests is only indirect, via probabilities, motivates us to ask what probabilities tell us about operators, or, put more precisely, “what combinations of a parameterized density operator and parameterized detection operators generate any given set of parametrized probabilities?”

  2. Comparison of two methods to determine fan performance curves using computational fluid dynamics

    Science.gov (United States)

    Onma, Patinya; Chantrasmi, Tonkid

    2018-01-01

    This work investigates a systematic numerical approach that employs Computational Fluid Dynamics (CFD) to obtain performance curves of a backward-curved centrifugal fan. Generating the performance curves requires a number of three-dimensional simulations with varying system loads at a fixed rotational speed. Two methods were used and their results compared to experimental data. The first method incrementally changes the mass flow rate through the inlet boundary condition, while the second method utilizes a series of meshes representing the physical damper blade at various angles. The generated performance curves from both methods are compared with experimental data obtained from a setup in accordance with the AMCA fan performance testing standard.

  3. The micro-optic photovoltaic behavior of solar cell along with microlens curved glass substrate

    International Nuclear Information System (INIS)

    Xie, Jin; Wu, Keke; Cheng, Jian; Li, Ping; Zheng, Jiahua

    2015-01-01

    Highlights: • A microlens array may be micro-ground on a curved photovoltaic glass substrate. • Its micro-optical structure absorbs and scatters inclined light to the solar cell. • It increases conversion efficiency and fill factor under weak and inclined light. • It improves electricity generation by about 4 times in scattered cloudy daylight. • It produces stronger electricity generation on cloudy days than on sunny days. - Abstract: A hybrid of microlens structure and curved surface may produce high value-added micro-optic performance. Hence, a microlens array is proposed on the macro curved glass substrate of a thin film solar cell. The objective is to understand how the micro-optic behavior of the curved microlens array influences indoor power conversion efficiency and outdoor electricity generation. First, the absorptivities of visible light and infrared light were analyzed in connection with the curved microlens sizes; then the microlens curved glass substrate was fabricated by Computer Numerical Control (CNC) micro-grinding with a micro diamond wheel V-tip; finally, its photovoltaic properties and electricity generation were measured. It is shown that the microlens curved surface may strongly absorb and scatter light to the solar cell. It increases the absorptivity of visible light relative to a plane surface, but decreases that of infrared light relative to a microlens surface. When applied to a solar cell, it enhances the power conversion efficiency by 3.4–10.6% under oblique illumination. When applied to a solar device, it increases the electricity generation of daylight by 119–106% relative to a microlens surface and by 260–419% relative to a traditional plane surface, respectively. The surprising finding is that it produces much larger electricity generation during cloudy days than during sunny days, whereas the traditional plane surface does not.

  4. Seismic Fragility Curves of Industrial Buildings by Using Nonlinear Analysis

    Directory of Open Access Journals (Sweden)

    Mohamed Nazri Fadzli

    2017-01-01

    Full Text Available This study presents the steel fragility curves and performance curves of industrial buildings of different geometries. The fragility curves were obtained for different building geometries, and the performance curves were developed based on lateral load, which is affected by the geometry of the building. Three records of far-field ground motion were used for incremental dynamic analysis (IDA), and the design lateral loads were used for pushover analysis (POA). All designs were based on British Standard (BS) 5950; however, Eurocode 8 was preferred for seismic consideration in the analysis because BS 5950 does not specify any seismic provision. The five levels of performance stated by FEMA-273, namely, operational phase, immediate occupancy, damage control, life safety, and collapse prevention (CP), were used as the main guidelines for evaluating structural performance. For POA, Model 2 had the highest base shear, followed by Model 1 and Model 3, even though Model 2 has a smaller structure compared with Model 3. Meanwhile, the fragility curves showed that the probability of reaching or exceeding the CP level of Model 2 is the highest, followed by that of Models 1 and 3.
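
    A common way to summarize IDA results of this kind is a lognormal fragility curve: the probability of reaching or exceeding a limit state versus ground-motion intensity. The sketch below fits such a curve to an invented set of collapse-prevention capacities; the capacities and intensity measure are assumptions, not the study's results.

```python
# Lognormal fragility curve fitted to invented IDA limit-state capacities.
import numpy as np
from scipy.stats import norm

# Spectral accelerations (g) at which the CP limit state was reached in IDA runs
capacities = np.array([0.45, 0.60, 0.52, 0.75, 0.68, 0.58, 0.80, 0.49, 0.63])

theta = np.exp(np.mean(np.log(capacities)))   # median capacity
beta = np.std(np.log(capacities), ddof=1)     # lognormal dispersion

for im in np.linspace(0.2, 1.0, 5):
    p_exceed = norm.cdf(np.log(im / theta) / beta)
    print(f"Sa = {im:.2f} g -> P(>= CP) = {p_exceed:.2f}")
```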

  5. Exact results for survival probability in the multistate Landau-Zener model

    International Nuclear Information System (INIS)

    Volkov, M V; Ostrovsky, V N

    2004-01-01

    An exact formula is derived for survival probability in the multistate Landau-Zener model in the special case where the initially populated state corresponds to the extremal (maximum or minimum) slope of a linear diabatic potential curve. The formula was originally guessed by S Brundobler and V Elzer (1993 J. Phys. A: Math. Gen. 26 1211) based on numerical calculations. It is a simple generalization of the expression for the probability of diabatic passage in the famous two-state Landau-Zener model. Our result is obtained via analysis and summation of the entire perturbation theory series

  6. Multiwavelength light curve parameters of Cepheid variables

    Directory of Open Access Journals (Sweden)

    Bhardwaj Anupam

    2017-01-01

    Full Text Available We present a comparative analysis of theoretical and observed light curves of Cepheid variables using Fourier decomposition. The theoretical light curves at multiple wavelengths are generated using stellar pulsation models for chemical compositions representative of Cepheids in the Galaxy and Magellanic Clouds. The observed light curves at optical (VI), near-infrared (JHKs) and mid-infrared (3.6 & 4.5 μm) bands are compiled from the literature. We discuss the variation of light curve parameters as a function of period, wavelength and metallicity. Theoretical and observed Fourier amplitude parameters decrease with increase in wavelength while the phase parameters increase with wavelength. We find that theoretical amplitude parameters obtained using canonical mass-luminosity levels exhibit a greater offset with respect to observations when compared to non-canonical relations. We also discuss the impact of variation in convective efficiency on the light curve structure of Cepheid variables. The increase in mixing length parameter results in a zero-point offset in bolometric mean magnitudes and reduces the systematic large difference in theoretical amplitudes with respect to observations.

  7. Non-stationary random vibration analysis of a 3D train-bridge system using the probability density evolution method

    Science.gov (United States)

    Yu, Zhi-wu; Mao, Jian-feng; Guo, Feng-qi; Guo, Wei

    2016-03-01

    Rail irregularity is one of the main sources causing train-bridge random vibration. A new random vibration theory for the coupled train-bridge systems is proposed in this paper. First, number theory method (NTM) with 2N-dimensional vectors for the stochastic harmonic function (SHF) of rail irregularity power spectrum density was adopted to determine the representative points of spatial frequencies and phases to generate the random rail irregularity samples, and the non-stationary rail irregularity samples were modulated with the slowly varying function. Second, the probability density evolution method (PDEM) was employed to calculate the random dynamic vibration of the three-dimensional (3D) train-bridge system by a program compiled on the MATLAB® software platform. Eventually, the Newmark-β integration method and double edge difference method of total variation diminishing (TVD) format were adopted to obtain the mean value curve, the standard deviation curve and the time-history probability density information of responses. A case study was presented in which the ICE-3 train travels on a three-span simply-supported high-speed railway bridge with excitation of random rail irregularity. The results showed that compared to the Monte Carlo simulation, the PDEM has higher computational efficiency for the same accuracy, i.e., an improvement by 1-2 orders of magnitude. Additionally, the influences of rail irregularity and train speed on the random vibration of the coupled train-bridge system were discussed.

  8. Considerations for reference pump curves

    International Nuclear Information System (INIS)

    Stockton, N.B.

    1992-01-01

    This paper examines problems associated with inservice testing (IST) of pumps to assess their hydraulic performance using reference pump curves to establish acceptance criteria. Safety-related pumps at nuclear power plants are tested under the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code (the Code), Section XI. The Code requires testing pumps at specific reference points of differential pressure or flow rate that can be readily duplicated during subsequent tests. There are many cases where test conditions cannot be duplicated. For some pumps, such as service water or component cooling pumps, the flow rate at any time depends on plant conditions and the arrangement of multiple independent and constantly changing loads. System conditions cannot be controlled to duplicate a specific reference value. In these cases, utilities frequently request to use pump curves for comparison of test data for acceptance. There is no prescribed method for developing a pump reference curve. The methods vary and may yield substantially different results. Some results are conservative when compared to the Code requirements; some are not. The errors associated with different curve testing techniques should be understood and controlled within reasonable bounds. Manufacturer's pump curves, in general, are not sufficiently accurate to use as reference pump curves for IST. Testing using reference curves generated with polynomial least squares fits over limited ranges of pump operation, cubic spline interpolation, or cubic spline least squares fits can provide a measure of pump hydraulic performance that is at least as accurate as the Code required method. Regardless of the test method, error can be reduced by using more accurate instruments, by correcting for systematic errors, by increasing the number of data points, and by taking repetitive measurements at each data point.
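
    One of the curve-fitting options discussed, a least-squares polynomial reference curve over a limited range of operation, can be illustrated in a few lines; the flow and differential-pressure points below are invented test data, not measurements from any plant.

```python
# Quadratic least-squares reference pump curve and comparison of a later
# IST measurement against the expected value from the curve.
import numpy as np

flow = np.array([1000, 1200, 1400, 1600, 1800, 2000])   # gpm (invented)
dp   = np.array([118., 114., 108., 100.,  90.,  78.])   # psid (invented)

ref_curve = np.poly1d(np.polyfit(flow, dp, deg=2))       # fitted reference curve

test_flow, test_dp = 1500.0, 101.5                       # later test at an
expected = ref_curve(test_flow)                          # unplanned flow rate
deviation = (test_dp - expected) / expected * 100
print(f"expected {expected:.1f} psid, measured {test_dp} psid ({deviation:+.1f}%)")
```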

  9. Improving runoff risk estimates: Formulating runoff as a bivariate process using the SCS curve number method

    Science.gov (United States)

    Shaw, Stephen B.; Walter, M. Todd

    2009-03-01

    The Soil Conservation Service curve number (SCS-CN) method is widely used to predict storm runoff for hydraulic design purposes, such as sizing culverts and detention basins. As traditionally used, the probability of calculated runoff is equated to the probability of the causative rainfall event, an assumption that fails to account for the influence of variations in soil moisture on runoff generation. We propose a modification to the SCS-CN method that explicitly incorporates rainfall return periods and the frequency of different soil moisture states to quantify storm runoff risks. Soil moisture status is assumed to be correlated to stream base flow. Fundamentally, this approach treats runoff as the outcome of a bivariate process instead of dictating a 1:1 relationship between causative rainfall and resulting runoff volumes. Using data from the Fall Creek watershed in western New York and the headwaters of the French Broad River in the mountains of North Carolina, we show that our modified SCS-CN method improves frequency discharge predictions in medium-sized watersheds in the eastern United States in comparison to the traditional application of the method.
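
    The standard single-valued SCS-CN relation that the proposed bivariate treatment builds on is compact enough to state directly; the sketch below evaluates it for one storm depth under several antecedent-moisture curve numbers (illustrative values only), which is exactly the quantity the bivariate formulation would weight by the frequency of each moisture state.

```python
# Classical SCS curve number runoff relation (depths in inches):
#   S = 1000/CN - 10,  Ia = 0.2*S,
#   Q = (P - Ia)^2 / (P - Ia + S)  for P > Ia, else 0.
def scs_runoff(p_in, cn, ia_ratio=0.2):
    s = 1000.0 / cn - 10.0
    ia = ia_ratio * s
    if p_in <= ia:
        return 0.0
    return (p_in - ia) ** 2 / (p_in - ia + s)

storm = 3.0  # inches of rainfall (illustrative)
for cn in (60, 75, 90):  # drier to wetter antecedent conditions
    print(f"CN = {cn}: runoff = {scs_runoff(storm, cn):.2f} in")
```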

  10. Constraint-based Student Modelling in Probability Story Problems with Scaffolding Techniques

    Directory of Open Access Journals (Sweden)

    Nabila Khodeir

    2018-01-01

    Full Text Available Constraint-based student modelling (CBM) is an important technique employed in intelligent tutoring systems to model student knowledge and provide relevant assistance. This paper introduces the Math Story Problem Tutor (MAST), a Web-based intelligent tutoring system for probability story problems, which is able to generate problems of different contexts, types and difficulty levels for self-paced learning. Constraints in MAST are specified at a low level of granularity to allow fine-grained diagnosis of the student error. Furthermore, MAST extends CBM to address errors due to misunderstanding of the narrative story. It can locate and highlight keywords that may have been overlooked or misunderstood, leading to an error. This is achieved by utilizing the role of sentences and keywords that are defined through the Natural Language Generation (NLG) methods deployed in the story problem generation. MAST also integrates CBM with scaffolding questions and feedback to provide various forms of help and guidance to the student. This allows the student to discover and correct any errors in his/her solution. MAST has been preliminarily evaluated empirically and the results show its potential effectiveness in tutoring students, with a decrease in the percentage of violated constraints along the learning curve. Additionally, the students using MAST show a significantly larger improvement from the pre-test to the post-test exam than those relying on the textbook.

  11. IDF-curves for precipitation In Belgium

    International Nuclear Information System (INIS)

    Mohymont, Bernard; Demarde, Gaston R.

    2004-01-01

    The Intensity-Duration-Frequency (IDF) curves for precipitation constitute a relationship between the intensity, the duration and the frequency of rainfall amounts. The intensity of precipitation is expressed in mm/h, the duration or aggregation time is the length of the interval considered, while the frequency stands for the probability of occurrence of the event. IDF curves constitute a classical and useful tool, primarily used to dimension hydraulic structures such as sewer systems, and consequently to assess the risk of inundation. In this presentation, the IDF relation for precipitation is studied for different locations in Belgium. These locations correspond to two long-term, high-quality precipitation networks of the RMIB: (a) the daily precipitation depths of the climatological network (more than 200 stations, 1951-2001 baseline period); (b) the high-frequency 10-minute precipitation depths of the hydrometeorological network (more than 30 stations, 15 to 33 years baseline period). For the station of Uccle, an uninterrupted time series of more than one hundred years of 10-minute rainfall data is available. The proposed technique for assessing the curves is based on maximum annual values of precipitation. A new analytical formula for the IDF curves was developed such that these curves stay valid for aggregation times ranging from 10 minutes to 30 days (when fitted with appropriate data). Moreover, all parameters of this formula have physical dimensions. Finally, adequate spatial interpolation techniques are used to provide nationwide extreme precipitation depths for short- to long-term durations with a given return period. These values are estimated on the grid points of the Belgian ALADIN domain used in the operational weather forecasts at the RMIB. (Author)
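
    The record does not reproduce the new analytical formula, so the sketch below instead fits a classical IDF form, i(d) = a / (d + c)^n, to an invented set of quantiles for a single return period, simply to show how such a curve is estimated from annual-maximum intensities; the data and the functional form are assumptions.

```python
# Fit a classical analytic IDF form i(d) = a / (d + c)**n to invented
# 10-year-return-period intensities (mm/h) at several durations (hours).
import numpy as np
from scipy.optimize import curve_fit

duration = np.array([1/6, 0.5, 1, 2, 6, 12, 24])
intensity = np.array([95., 60., 40., 26., 12., 7.5, 4.5])

def idf(d, a, c, n):
    return a / (d + c) ** n

(a, c, n), _ = curve_fit(idf, duration, intensity, p0=(40.0, 0.1, 0.8),
                         bounds=([1.0, 0.0, 0.1], [500.0, 2.0, 2.0]))
print(f"i(d) ~ {a:.1f} / (d + {c:.2f})^{n:.2f}   (mm/h, d in hours)")
print(f"check: i(1 h) = {idf(1.0, a, c, n):.1f} mm/h")
```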

  12. Estimation of Curve Tracing Time in Supercapacitor based PV Characterization

    Science.gov (United States)

    Basu Pal, Sudipta; Das Bhattacharya, Konika; Mukherjee, Dipankar; Paul, Debkalyan

    2017-08-01

    Smooth and noise-free characterisation of photovoltaic (PV) generators has been revisited with renewed interest in view of large-size PV arrays making inroads into the urban sector of major developing countries. This practice has recently been confronted by the choice of a suitable data acquisition system and also by the lack of a supporting theoretical analysis to justify the accuracy of curve tracing. However, the use of a selected bank of supercapacitors can mitigate these problems to a large extent. Assuming a piecewise linear analysis of the V-I characteristics of a PV generator, an accurate analysis of the curve plotting time has been possible. The analysis has been extended to consider the effect of the equivalent series resistance of the supercapacitor, leading to increased accuracy (90-95%) of curve plotting times.

  13. Matching sampler penetration curves to definitions of respirable fraction

    International Nuclear Information System (INIS)

    Mercer, T.T.

    1977-01-01

    A formal definition of 'respirable fraction' (the probability that a particle of a given size will deposit in the alveolar regions of the lung if inhaled) is useful only if there is a method of sorting out airborne contamination approximately in accordance with the definition. The matching of the definitions adopted by different organizations to the penetration curves of various types of sampler is discussed. (author)

  14. Assigning probability gain for precursors of four large Chinese earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Cao, T.; Aki, K.

    1983-03-10

    We extend the concept of probability gain associated with a precursor (Aki, 1981) to a set of precursors which may be mutually dependent. Making use of a new formula, we derive a criterion for selecting precursors from a given data set in order to calculate the probability gain. The probabilities per unit time immediately before four large Chinese earthquakes are calculated. They are approximately 0.09, 0.09, 0.07 and 0.08 per day for 1975 Haicheng (M = 7.3), 1976 Tangshan (M = 7.8), 1976 Longling (M = 7.6), and Songpan (M = 7.2) earthquakes, respectively. These results are encouraging because they suggest that the investigated precursory phenomena may have included the complete information for earthquake prediction, at least for the above earthquakes. With this method, the step-by-step approach to prediction used in China may be quantified in terms of the probability of earthquake occurrence. The ln P versus t curve (where P is the probability of earthquake occurrence at time t) shows that ln P does not increase with t linearly but more rapidly as the time of earthquake approaches.

  15. Time dependent and asymptotic neutron number probability distribution calculation using discrete Fourier transform

    International Nuclear Information System (INIS)

    Humbert, Ph.

    2005-01-01

    In this paper we consider the probability distribution of neutrons in a multiplying assembly. The problem is studied using a space independent one group neutron point reactor model without delayed neutrons. We recall the generating function methodology and analytical results obtained by G.I. Bell when the c 2 approximation is used and we present numerical solutions in the general case, without this approximation. The neutron source induced distribution is calculated using the single initial neutron distribution which satisfies a master (Kolmogorov backward) equation. This equation is solved using the generating function method. The generating function satisfies a differential equation and the probability distribution is derived by inversion of the generating function. Numerical results are obtained using the same methodology where the generating function is the Fourier transform of the probability distribution. Discrete Fourier transforms are used to calculate the discrete time dependent distributions and continuous Fourier transforms are used to calculate the asymptotic continuous probability distributions. Numerical applications are presented to illustrate the method. (author)
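
    The inversion step described above, recovering a probability distribution from its generating function with a discrete Fourier transform, can be demonstrated on a toy case. The sketch below uses a Poisson generating function as a stand-in for the point-reactor one, so the recovered probabilities can be checked against a known answer; everything here is illustrative.

```python
# Recover p_n from its PGF G(z) by evaluating G at the N-th roots of unity and
# taking a (scaled) forward DFT:  p_n = (1/N) * sum_k G(z_k) * exp(-2*pi*i*k*n/N).
import numpy as np

lam, N = 3.0, 64
k = np.arange(N)
z = np.exp(2j * np.pi * k / N)     # N-th roots of unity
G = np.exp(lam * (z - 1.0))        # Poisson PGF evaluated on the unit circle

p = np.fft.fft(G).real / N         # recovered probabilities p_0 ... p_{N-1}

exact = [np.exp(-lam)]             # exact Poisson probabilities for comparison
for n in range(1, 6):
    exact.append(exact[-1] * lam / n)

print("n  recovered   exact")
for n in range(6):
    print(f"{n}  {p[n]:.6f}   {exact[n]:.6f}")
```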

  16. Theoretical study of melting curves on Ta, Mo, and W at high pressures

    Energy Technology Data Exchange (ETDEWEB)

    Xi Feng [Laboratory for Shock Wave and Detonation Physics Research, Institute of Fluid Physics, P.O. Box 919-102, 621900 Mianyang (China)], E-mail: hawk_0816@yahoo.com.cn; Cai Lingcang [Laboratory for Shock Wave and Detonation Physics Research, Institute of Fluid Physics, P.O. Box 919-102, 621900 Mianyang (China)

    2008-06-01

    The melting curves of tantalum (Ta), molybdenum (Mo), and tungsten (W) are calculated using a dislocation-mediated melting model. The calculated melting curves are in good agreement with shock-wave data, and partially in agreement with wire explosion and piston-cylinder data, but show large discrepancies with diamond-anvil cell (DAC) data. We propose that the melting mechanisms probed by shock-wave and laser-heated DAC techniques are probably different, and that a systematic difference exists between the two melting processes.

  17. The Probable Ages of Asteroid Families

    Science.gov (United States)

    Harris, A. W.

    1993-01-01

    There has been considerable debate recently over the ages of the Hirayama families, and in particular whether some of the families are very young. It is a straightforward task to estimate the characteristic time of a collision between a body of a given diameter, d_o, and another body of diameter greater than or equal to d_1. What is less straightforward is to estimate the critical diameter ratio, d_1/d_o, above which catastrophic disruption occurs, from which one could infer probable ages of the Hirayama families, knowing the diameter of the parent body, d_o. One can gain some insight into the probable value of d_1/d_o, and of the likely ages of existing families, from the plot below. I have computed the characteristic time between collisions in the asteroid belt with a size ratio greater than or equal to d_1/d_o, for 4 sizes of target asteroids, d_o. The solid curves to the lower right are the characteristic times for a single object...

  18. Comparison of the THERP quantitative tables with the human reliability analysis techniques of second generation

    International Nuclear Information System (INIS)

    Alvarenga, Marco Antonio Bayout; Fonseca, Renato Alves

    2009-01-01

    The THERP methodology is classified as a Human Reliability Analysis (HRA) technique of the first generation, and its emergence was an important initial step for the development of HRA techniques in the industry. Because it is a first generation technique, the THERP quantification tables of human errors are based on a taxonomy that does not take human error mechanisms into account. Concerning the three cognitive levels in the Rasmussen framework for cognitive information processing in human beings, THERP deals in most cases with errors that happen at the perceptual-motor level (stimulus-response). At the rule level, this technique can work better by using the time-dependent probability curves of diagnosis errors obtained in nuclear power plant simulators. Nevertheless, this is done without processing any error mechanisms. Another deficiency is the fact that the performance shaping factors are limited in number. Furthermore, the influences (predictable or not) of the operational context, arising from operational deviations from the most probable (in terms of occurrence probabilities) standard scenarios, besides the consequent operational tendencies (operator actions), are not estimated. This work makes a critical analysis of these deficiencies and points out possible solutions in order to modify the THERP tables, seeking a realistic quantification that does not underestimate or overestimate the human error probabilities when applying HRA techniques to nuclear power plants. The critical analysis is accomplished through a qualitative comparison of THERP, an HRA technique of the first generation, with CREAM and ATHEANA, which are HRA techniques of the second generation. (author)

  19. Comparison of the THERP quantitative tables with the human reliability analysis techniques of second generation

    Energy Technology Data Exchange (ETDEWEB)

    Alvarenga, Marco Antonio Bayout; Fonseca, Renato Alves [Comissao Nacional de Energia Nuclear (CNEN), Rio de Janeiro, RJ (Brazil)], e-mail: bayout@cnen.gov.br, e-mail: rfonseca@cnen.gov.br

    2009-07-01

    The THERP methodology is classified as a Human Reliability Analysis (HRA) technique of the first generation, and its emergence was an important initial step for the development of HRA techniques in the industry. Because it is a first generation technique, the THERP quantification tables of human errors are based on a taxonomy that does not take human error mechanisms into account. Concerning the three cognitive levels in the Rasmussen framework for cognitive information processing in human beings, THERP deals in most cases with errors that happen at the perceptual-motor level (stimulus-response). At the rule level, this technique can work better by using the time-dependent probability curves of diagnosis errors obtained in nuclear power plant simulators. Nevertheless, this is done without processing any error mechanisms. Another deficiency is the fact that the performance shaping factors are limited in number. Furthermore, the influences (predictable or not) of the operational context, arising from operational deviations from the most probable (in terms of occurrence probabilities) standard scenarios, besides the consequent operational tendencies (operator actions), are not estimated. This work makes a critical analysis of these deficiencies and points out possible solutions in order to modify the THERP tables, seeking a realistic quantification that does not underestimate or overestimate the human error probabilities when applying HRA techniques to nuclear power plants. The critical analysis is accomplished through a qualitative comparison of THERP, an HRA technique of the first generation, with CREAM and ATHEANA, which are HRA techniques of the second generation. (author)

  20. Injury risk curves for the skeletal knee-thigh-hip complex for knee-impact loading.

    Science.gov (United States)

    Rupp, Jonathan D; Flannagan, Carol A C; Kuppa, Shashi M

    2010-01-01

    Injury risk curves for the skeletal knee-thigh-hip (KTH) relate peak force applied to the anterior aspect of the flexed knee, the primary source of KTH injury in frontal motor-vehicle crashes, to the probability of skeletal KTH injury. Previous KTH injury risk curves have been developed from analyses of peak knee-impact force data from studies where knees of whole cadavers were impacted. However, these risk curves either neglect the effects of occupant gender, stature, and mass on KTH fracture force, or account for them using scaling factors derived from dimensional analysis without empirical support. A large amount of experimental data on the knee-impact forces associated with KTH fracture are now available, making it possible to estimate the effects of subject characteristics on skeletal KTH injury risk by statistically analyzing empirical data. Eleven studies were identified in the biomechanical literature in which the flexed knees of whole cadavers were impacted. From these, peak knee-impact force data and the associated subject characteristics were reanalyzed using survival analysis with a lognormal distribution. Results of this analysis indicate that the relationship between peak knee-impact force and the probability of KTH fracture is a function of age, total body mass, and whether the surface that loads the knee is rigid. Comparisons between injury risk curves for the midsize adult male and small adult female crash test dummies defined in previous studies and new risk curves for these sizes of occupants developed in this study suggest that previous injury risk curves generally overestimate the likelihood of KTH fracture at a given peak knee-impact force. Future work should focus on defining the relationships between impact force at the human knee and peak axial compressive forces measured by load cells in the crash test dummy KTH complex so that these new risk curves can be used with ATDs.
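
    A lognormal survival-model risk curve of the kind described has a simple closed form once its coefficients are known: the fracture probability is the normal CDF of (ln F - mu)/sigma, with mu shifted by covariates such as age and body mass. The coefficients in the sketch below are invented placeholders, not the values estimated in the study.

```python
# Illustrative lognormal injury-risk curve with covariate-shifted location.
import numpy as np
from scipy.stats import norm

def p_fracture(force_kN, age=45, mass=75, rigid=True,
               b0=2.3, b_age=-0.004, b_mass=0.003, b_rigid=-0.1, sigma=0.25):
    """P(fracture) = Phi((ln F - mu)/sigma); all coefficients are hypothetical."""
    mu = b0 + b_age * (age - 45) + b_mass * (mass - 75) + (b_rigid if rigid else 0.0)
    return norm.cdf((np.log(force_kN) - mu) / sigma)

for f in (4.0, 7.0, 10.0, 13.0):
    print(f"{f:5.1f} kN -> P(fracture) = {p_fracture(f):.2f}")
```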

  1. Application of probability generating function to the essentials of nondestructive nuclear materials assay system using neutron correlation

    International Nuclear Information System (INIS)

    Hosoma, Takashi

    2017-01-01

    In the previous research (JAEA-Research 2015-009), the essentials of neutron multiplicity counting mathematics were reconsidered, taking into account experience obtained at the Plutonium Conversion Development Facility, and formulae of the multiplicity distribution were algebraically derived up to septuplets using a probability generating function as a strategic move for the future. The principle was reported by K. Böhnel in 1985, but such a high-order expansion was the first of its kind owing to its increasing complexity. In this research, characteristics of the high-order correlations were investigated. It was found that higher-order correlation increases rapidly in response to the increase of leakage multiplication, and crosses and leaves the lower-order correlations behind when the leakage multiplication is > 1.3, a threshold that depends on detector efficiency and counter setting. In addition, fission rates and doubles count rates by fast neutrons and by thermal neutrons in a system where both coexist were algebraically derived, again using a probability generating function. The principle was reported by I. Pázsit and L. Pál in 2012, but such a physical interpretation, i.e. associating their stochastic variables with the fission rate, doubles count rate and leakage multiplication, is the first of its kind. From the Rossi-alpha combined distribution and the measured ratio of each area obtained by Differential Die-Away Self-Interrogation (DDSI) and conventional assay data, it is possible to estimate: the number of induced fissions per unit time by fast neutrons and by thermal neutrons; the number of induced fissions (< 1) by one source neutron; and the individual doubles count rates. During the research, a hypothesis introduced in their report was proved to be true. Provisional calculations were done for UO_2 of 1∼10 kgU containing ∼ 0.009 wt% {sup 244}Cm. (author)

  2. Light Scattering of Rough Orthogonal Anisotropic Surfaces with Secondary Most Probable Slope Distributions

    International Nuclear Information System (INIS)

    Li Hai-Xia; Cheng Chuan-Fu

    2011-01-01

    We study the light scattering of an orthogonal anisotropic rough surface with secondary most probable slope distribution. It is found that the scattered intensity profiles have obvious secondary maxima, and in the direction perpendicular to the plane of incidence, the secondary maxima are oriented in a curve on the observation plane, which is called the orientation curve. By numerical calculation of the scattering wave fields with the height data of the sample, it is validated that the secondary maxima are induced by the side face element, which constitutes the prismoid structure of the anisotropic surface. We derive the equation of the quadratic orientation curve. Experimentally, we construct the system for light scattering measurement using a CCD. The scattered intensity profiles are extracted from the images at different angles of incidence along the orientation curves. The experimental results conform to the theory.

  3. Receiver Operating Characteristic Analysis for Classification Based on Various Prior Probabilities of Groups with an Application to Breath Analysis

    Science.gov (United States)

    Cimermanová, K.

    2009-01-01

    In this paper we illustrate the influence of prior probabilities of diseases on diagnostic reasoning. For various prior probabilities of classified groups characterized by volatile organic compounds of breath profile, smokers and non-smokers, we constructed the ROC curve and the Youden index with related asymptotic pointwise confidence intervals.
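
    To make the construction concrete, the sketch below computes an empirical ROC curve and Youden index for a simulated breath marker and then shows where the prior probability (prevalence) of the classified group enters, namely when the chosen cut-off is converted into a predictive value; the marker distributions and prevalences are assumptions.

```python
# Empirical ROC curve, Youden index and prevalence-dependent predictive value.
import numpy as np

rng = np.random.default_rng(2)
non_smokers = rng.normal(0.0, 1.0, 200)   # simulated marker values, group 0
smokers     = rng.normal(1.2, 1.0, 200)   # simulated marker values, group 1

thresholds = np.linspace(-4, 5, 200)
tpr = np.array([(smokers >= t).mean() for t in thresholds])      # sensitivity
fpr = np.array([(non_smokers >= t).mean() for t in thresholds])  # 1 - specificity

youden = tpr - fpr
i_best = int(np.argmax(youden))
fpr_inc, tpr_inc = fpr[::-1], tpr[::-1]                          # FPR increasing
auc = np.sum(np.diff(fpr_inc) * (tpr_inc[1:] + tpr_inc[:-1]) / 2)  # trapezoid rule
print(f"AUC = {auc:.3f}, Youden index = {youden[i_best]:.3f} "
      f"at cut-off {thresholds[i_best]:.2f}")

sens, spec = tpr[i_best], 1 - fpr[i_best]
for prevalence in (0.1, 0.3, 0.5):        # prior probability of the target group
    ppv = sens * prevalence / (sens * prevalence + (1 - spec) * (1 - prevalence))
    print(f"prevalence {prevalence:.0%}: positive predictive value = {ppv:.2f}")
```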

  4. Incorporating Nonstationarity into IDF Curves across CONUS from Station Records and Implications

    Science.gov (United States)

    Wang, K.; Lettenmaier, D. P.

    2017-12-01

    Intensity-duration-frequency (IDF) curves are widely used for the engineering design of storm-affected structures. Current practice is that IDF curves are based on observed precipitation extremes fit to a stationary probability distribution (e.g., the extreme value family). However, there is increasing evidence of nonstationarity in station records. We apply the Mann-Kendall trend test to over 1000 stations across the CONUS at a 0.05 significance level, and find that about 30% of stations tested have significant nonstationarity for at least one duration (1-, 2-, 3-, 6-, 12-, 24-, and 48-hours). We fit the stations to a GEV distribution with time-varying location and scale parameters using a Bayesian methodology and compare the fit of stationary versus nonstationary GEV distributions to observed precipitation extremes. Within our fitted nonstationary GEV distributions, we compare distributions with a time-varying location parameter versus distributions with both time-varying location and scale parameters. For distributions with two time-varying parameters, we pay particular attention to instances where location and scale trends have opposing directions. Finally, we use the mathematical framework based on the work of Koutsoyiannis to generate IDF curves based on the fitted GEV distributions and discuss the implications that using time-varying parameters may have on simple scaling relationships. We apply the above methods to evaluate how frequency statistics based on a stationary assumption compare to those that incorporate nonstationarity for both short- and long-term projects. Overall, we find that neglecting nonstationarity can lead to under- or over-estimates (depending on the trend for the given duration and region) of important statistics such as the design storm.
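
    As a rough sketch of what "a GEV distribution with a time-varying location parameter" means in practice, the code below maximizes the likelihood of a GEV whose location is linear in time, using synthetic annual maxima with a known trend; the data, parameterization and optimizer settings are all assumptions and not the Bayesian procedure used in the study.

```python
# Maximum-likelihood fit of a nonstationary GEV with mu(t) = mu0 + mu1*t.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import genextreme

rng = np.random.default_rng(3)
years = np.arange(60)
true_loc = 30 + 0.08 * years                 # synthetic upward trend in extremes
data = genextreme.rvs(c=-0.1, loc=true_loc, scale=5.0, random_state=rng)

def nll(params):
    mu0, mu1, log_scale, shape = params
    loc = mu0 + mu1 * years
    return -genextreme.logpdf(data, c=shape, loc=loc,
                              scale=np.exp(log_scale)).sum()

res = minimize(nll, x0=(data.mean(), 0.0, np.log(data.std()), -0.1),
               method="Nelder-Mead")
mu0, mu1, log_scale, shape = res.x
print(f"estimated trend in location: {mu1:.3f} per year (true value 0.08)")
```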

  5. Variability of the Wind Turbine Power Curve

    Directory of Open Access Journals (Sweden)

    Mahesh M. Bandi

    2016-09-01

    Full Text Available Wind turbine power curves are calibrated by turbine manufacturers under requirements stipulated by the International Electrotechnical Commission to provide a functional mapping between the mean wind speed v̄ and the mean turbine power output P̄. Wind plant operators employ these power curves to estimate or forecast wind power generation under given wind conditions. However, it is general knowledge that wide variability exists in these mean calibration values. We first analyse how the standard deviation in wind speed σ_v affects the mean P̄ and the standard deviation σ_P of wind power. We find that the magnitude of wind power fluctuations scales as the square of the mean wind speed. Using data from three planetary locations, we find that the wind speed standard deviation σ_v systematically varies with mean wind speed v̄, and in some instances follows a scaling of the form σ_v = C × v̄^α, C being a constant and α a fractional power. We show that, when applicable, this scaling form provides a minimal parameter description of the power curve in terms of v̄ alone. Wind data from different locations establish that (in instances when this scaling exists) the exponent α varies with location, owing to the influence of local environmental conditions on wind speed variability. Since manufacturer-calibrated power curves cannot account for variability influenced by local conditions, this variability translates to forecast uncertainty in power generation. We close with a proposal for operators to perform post-installation recalibration of their turbine power curves to account for the influence of local environmental factors on wind speed variability in order to reduce the uncertainty of wind power forecasts. Understanding the relationship between wind's speed and its variability is likely to lead to lower costs for the integration of wind power into the electric grid.

  6. Finite-size scaling of survival probability in branching processes

    OpenAIRE

    Garcia-Millan, Rosalba; Font-Clos, Francesc; Corral, Alvaro

    2014-01-01

    Branching processes pervade many models in statistical physics. We investigate the survival probability of a Galton-Watson branching process after a finite number of generations. We reveal the finite-size scaling law of the survival probability for a given branching process ruled by a probability distribution of the number of offspring per element whose standard deviation is finite, obtaining the exact scaling function as well as the critical exponents. Our findings prove the universal behavi...
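
    The finite-generation survival probability in question is easy to compute numerically by iterating the offspring generating function, which also makes the critical scaling visible. The sketch below does this for a Poisson offspring law (an assumption chosen for convenience, not the distribution analyzed in the paper).

```python
# Survival probability of a Galton-Watson process after n generations:
#   P_survive(n) = 1 - F_n(0), with F_n the n-fold iterate of the offspring PGF.
import numpy as np

def survival_probability(mean_offspring, n_generations):
    extinct = 0.0                                            # F_0(0) = 0
    for _ in range(n_generations):
        extinct = np.exp(mean_offspring * (extinct - 1.0))   # Poisson PGF iterate
    return 1.0 - extinct

for n in (10, 100, 1000):
    p = survival_probability(1.0, n)                         # critical case m = 1
    print(f"P_survive({n}) = {p:.4f}   (asymptotic 2/(sigma^2 n) = {2.0 / n:.4f})")
```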

  7. 3D flyable curves for an autonomous aircraft

    Science.gov (United States)

    Bestaoui, Yasmina

    2012-11-01

    The process of conducting a mission for an autonomous aircraft includes determining the set of waypoints (flight planning) and the path for the aircraft to fly (path planning). The autonomous aircraft is an under-actuated system, having fewer control inputs than degrees of freedom, and has two nonholonomic (non-integrable) kinematic constraints. Consequently, the set of feasible trajectories will be restricted and the problem of trajectory generation becomes more complicated than a simple interpolation. Care must be taken in the selection of the basic primitives to respect the kinematic and dynamic limitations. The topic of this paper is trajectory generation using parametric curves. The problem can be formulated as follows: to lead the autonomous aircraft from an initial configuration q_i to a final configuration q_f in the absence of obstacles, find a trajectory q(t) for 0 ≤ t ≤ T. The trajectory can be broken down into a geometric path q(s), s being the curvilinear abscissa and s=s(t) a temporal function. In 2D the curves fall into two categories: • Curves whose coordinates have closed-form expressions, for example B-splines, quintic polynomials or polar splines. • Curves whose curvature is a function of their arc length, for example clothoids, cubic spirals, quintic or intrinsic splines. Some 3D solutions will be presented in this paper and their effectiveness discussed towards the problem in hand.

  8. Irregular conformal block, spectral curve and flow equations

    International Nuclear Information System (INIS)

    Choi, Sang Kwan; Rim, Chaiho; Zhang, Hong

    2016-01-01

    Irregular conformal block is motivated by the Argyres-Douglas type of N=2 super conformal gauge theory. We investigate the classical/NS limit of irregular conformal block using the spectral curve on a Riemann surface with irregular punctures, which is equivalent to the loop equation of irregular matrix model. The spectral curve is reduced to the second order (Virasoro symmetry, SU(2) for the gauge theory) and third order (W_3 symmetry, SU(3)) differential equations of a polynomial with finite degree. The conformal and W symmetry generate the flow equations in the spectral curve and determine the irregular conformal block, hence the partition function of the Argyres-Douglas theory ala AGT conjecture.

  9. Probability of background to produce a signal-like excess, for all Higgs masses tested.

    CERN Document Server

    ATLAS, collaboration

    2012-01-01

    The probability of background to produce a signal-like excess, for all the Higgs boson masses tested. At almost all masses, the probability (solid curve) is at least a few percent; however, at 126.5 GeV it dips to 3×10^-7, or one chance in three million, the '5-sigma' gold standard normally used for the discovery of a new particle. A Standard Model Higgs boson with that mass would produce a dip to 4.6 sigma.

  10. A powerful test for weak periodic signals with unknown light curve shape in sparse data

    International Nuclear Information System (INIS)

    Jager De, O.C.; Raubenheimer, B.C.; Swanepoel, J.W.H.

    1989-01-01

    A problem with most tests for periodicity is that they are powerful enough to detect only certain kinds of periodic shapes in the case of weak signals. This causes a selection effect with the identification of weak periodic signals. A new test for uniformity called the H-test is derived for which the probability distribution is an exponential function. This test is shown to have a very good power against most light curve shapes encountered in X- and γ-ray Astronomy and therefore makes the detection of sources with a larger variety of shapes possible. The use of the H-test is suggested if no a priori information about the light curve shape is available. It is also shown how the probability distribution of the test statistics changes when a periodicity search is conducted using very small steps in the period or frequency range. The flux sensitivity for various light curve shapes is also derived for a few tests and this flux is on average a minimum for the H-test
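
    A compact form of the H-test statistic, built from the cumulative Rayleigh-type statistic of the first m harmonics, is sketched below together with the commonly quoted exponential tail approximation for its chance probability; the simulated phase data and the use of that approximation are assumptions made for illustration.

```python
# H-test sketch: H = max_{1<=m<=20} (Z_m^2 - 4m + 4), with the rough tail
# probability approximation P(>H) ~ exp(-0.4*H).
import numpy as np

def h_test(phases, max_harmonics=20):
    n = len(phases)
    z2, h = 0.0, -np.inf
    for m in range(1, max_harmonics + 1):
        c = np.cos(2 * np.pi * m * phases).sum()
        s = np.sin(2 * np.pi * m * phases).sum()
        z2 += (2.0 / n) * (c * c + s * s)      # cumulative Z_m^2
        h = max(h, z2 - 4 * m + 4)
    return h

rng = np.random.default_rng(4)
phases = np.concatenate([rng.random(950),                       # background
                         rng.normal(0.5, 0.05, 50) % 1.0])      # weak pulsed signal
h = h_test(phases)
print(f"H = {h:.2f}, approximate chance probability ~ {np.exp(-0.4 * h):.2e}")
```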

  11. Generalized Probability-Probability Plots

    NARCIS (Netherlands)

    Mushkudiani, N.A.; Einmahl, J.H.J.

    2004-01-01

    We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real valued data. These plots, which are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P

  12. Bragg Curve, Biological Bragg Curve and Biological Issues in Space Radiation Protection with Shielding

    Science.gov (United States)

    Honglu, Wu; Cucinotta, F.A.; Durante, M.; Lin, Z.; Rusek, A.

    2006-01-01

    The space environment consists of a varying field of radiation particles including high-energy ions, with spacecraft shielding material providing the major protection to astronauts from harmful exposure. Unlike low-LET gamma or X-rays, the presence of shielding does not always reduce the radiation risks for energetic charged particle exposure. Since the dose delivered by the charged particle increases sharply as the particle approaches the end of its range, a position known as the Bragg peak, the Bragg curve does not necessarily represent the biological damage along the particle traversal since biological effects are influenced by the track structure of both primary and secondary particles. Therefore, the biological Bragg curve is dependent on the energy and the type of the primary particle, and may vary for different biological endpoints. To achieve a Bragg curve distribution, we exposed cells to energetic heavy ions with the beam geometry parallel to a monolayer of fibroblasts. Qualitative analyses of gamma-H2AX fluorescence, a known marker of DSBs, indicated increased clustering of DNA damage before the Bragg peak, enhanced homogenous distribution at the peak, and provided visual evidence of high linear energy transfer (LET) particle traversal of cells beyond the Bragg peak. A quantitative biological response curve generated for micronuclei (MN) induction across the Bragg curve did not reveal an increased yield of MN at the location of the Bragg peak. However, the ratio of mono-to bi-nucleated cells, which indicates inhibition in cell progression, increased at the Bragg peak location. These results, along with other biological concerns, show that space radiation protection with shielding can be a complicated issue.

  13. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    Full Text Available We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  14. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming paradigm that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  15. Computer-aided design of curved surfaces with automatic model generation

    Science.gov (United States)

    Staley, S. M.; Jerard, R. B.; White, P. R.

    1980-01-01

    The design and visualization of three-dimensional objects with curved surfaces have always been difficult. The paper given below describes a computer system which facilitates both the design and visualization of such surfaces. The system enhances the design of these surfaces by virtue of various interactive techniques coupled with the application of B-Spline theory. Visualization is facilitated by including a specially built model-making machine which produces three-dimensional foam models. Thus, the system permits the designer to produce an inexpensive model of the object which is suitable for evaluation and presentation.

  16. [Application of decision curve on evaluation of MRI predictive model for early assessing pathological complete response to neoadjuvant therapy in breast cancer].

    Science.gov (United States)

    He, Y J; Li, X T; Fan, Z Q; Li, Y L; Cao, K; Sun, Y S; Ouyang, T

    2018-01-23

    Objective: To construct a dynamic enhanced MR-based predictive model for early assessment of pathological complete response (pCR) to neoadjuvant therapy in breast cancer, and to evaluate the clinical benefit of the model by using decision curves. Methods: From December 2005 to December 2007, 170 patients with breast cancer treated with neoadjuvant therapy were identified and their MR images before neoadjuvant therapy and at the end of the first cycle of neoadjuvant therapy were collected. A logistic regression model was used to detect independent factors for predicting pCR and to construct the predictive model accordingly; then the receiver operating characteristic (ROC) curve and decision curve were used to evaluate the predictive model. Results: ΔArea(max) and Δslope(max) were independent predictive factors for pCR, OR = 0.942 (95% CI: 0.918-0.967) and 0.961 (95% CI: 0.940-0.987), respectively. The area under the ROC curve (AUC) for the constructed model was 0.886 (95% CI: 0.820-0.951). The decision curve showed that, in the range of threshold probabilities above 0.4, the predictive model presented increasing net benefit as the threshold probability increased. Conclusions: The constructed predictive model for pCR is of potential clinical value, with an AUC > 0.85. Meanwhile, decision curve analysis indicates the constructed predictive model has a net benefit of 3 to 8 percent in the likely range of probability thresholds from 80% to 90%.
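
    The net-benefit quantity behind a decision curve is simple to compute once predicted probabilities and outcomes are available: at each threshold probability p_t, net benefit = TP/N - (FP/N)·p_t/(1 - p_t), compared against "treat all" and "treat none". The sketch below evaluates it on simulated predictions; the simulated cohort and model outputs are assumptions, not the study's data.

```python
# Decision-curve net benefit for a simulated pCR prediction model.
import numpy as np

rng = np.random.default_rng(5)
n = 170
y = rng.random(n) < 0.25                                  # true pCR (25% prevalence)
p_hat = np.clip(0.25 + 0.35 * (y - 0.25) + rng.normal(0, 0.15, n), 0.01, 0.99)

def net_benefit(y_true, p_pred, threshold):
    treat = p_pred >= threshold
    tp = np.sum(treat & y_true) / len(y_true)
    fp = np.sum(treat & ~y_true) / len(y_true)
    return tp - fp * threshold / (1 - threshold)

prevalence = y.mean()
for t in (0.2, 0.4, 0.6, 0.8):
    nb_model = net_benefit(y, p_hat, t)
    nb_all = prevalence - (1 - prevalence) * t / (1 - t)  # "treat all" strategy
    print(f"threshold {t:.1f}: model {nb_model:+.3f}, treat-all {nb_all:+.3f}, treat-none +0.000")
```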

  17. Probability theory for 3-layer remote sensing radiative transfer model: univariate case.

    Science.gov (United States)

    Ben-David, Avishai; Davidson, Charles E

    2012-04-23

    A probability model for a 3-layer radiative transfer model (foreground layer, cloud layer, background layer, and an external source at the end of line of sight) has been developed. The 3-layer model is fundamentally important as the primary physical model in passive infrared remote sensing. The probability model is described by the Johnson family of distributions that are used as a fit for theoretically computed moments of the radiative transfer model. From the Johnson family we use the SU distribution that can address a wide range of skewness and kurtosis values (in addition to addressing the first two moments, mean and variance). In the limit, SU can also describe lognormal and normal distributions. With the probability model one can evaluate the potential for detecting a target (vapor cloud layer), the probability of observing thermal contrast, and evaluate performance (receiver operating characteristics curves) in clutter-noise limited scenarios. This is (to our knowledge) the first probability model for the 3-layer remote sensing geometry that treats all parameters as random variables and includes higher-order statistics. © 2012 Optical Society of America

  18. Individual survival curves comparing subjective and observed mortality risks.

    Science.gov (United States)

    Bissonnette, Luc; Hurd, Michael D; Michaud, Pierre-Carl

    2017-12-01

    We compare individual survival curves constructed from objective (actual mortality) and elicited subjective information (probability of survival to a given target age). We develop a methodology to estimate jointly subjective and objective individual survival curves accounting for rounding on subjective reports of perceived survival. We make use of the long follow-up period in the Health and Retirement Study and the high quality of mortality data to estimate individual survival curves that feature both observed and unobserved heterogeneity. This allows us to compare objective and subjective estimates of remaining life expectancy for various groups and compare welfare effects of objective and subjective mortality risk using the life cycle model of consumption. We find that subjective and objective hazards are not the same. The median welfare loss from misperceptions of mortality risk when annuities are not available is 7% of current wealth at age 65 whereas more than 25% of respondents have losses larger than 60% of wealth. When annuities are available and exogenously given, the welfare loss is substantially lower. Copyright © 2017 John Wiley & Sons, Ltd.

  19. Analytical flow duration curves for summer streamflow in Switzerland

    Science.gov (United States)

    Santos, Ana Clara; Portela, Maria Manuela; Rinaldo, Andrea; Schaefli, Bettina

    2018-04-01

    This paper proposes a systematic assessment of the performance of an analytical modeling framework for streamflow probability distributions for a set of 25 Swiss catchments. These catchments show a wide range of hydroclimatic regimes, including namely snow-influenced streamflows. The model parameters are calculated from a spatially averaged gridded daily precipitation data set and from observed daily discharge time series, both in a forward estimation mode (direct parameter calculation from observed data) and in an inverse estimation mode (maximum likelihood estimation). The performance of the linear and the nonlinear model versions is assessed in terms of reproducing observed flow duration curves and their natural variability. Overall, the nonlinear model version outperforms the linear model for all regimes, but the linear model shows a notable performance increase with catchment elevation. More importantly, the obtained results demonstrate that the analytical model performs well for summer discharge for all analyzed streamflow regimes, ranging from rainfall-driven regimes with summer low flow to snow and glacier regimes with summer high flow. These results suggest that the model's encoding of discharge-generating events based on stochastic soil moisture dynamics is more flexible than previously thought. As shown in this paper, the presence of snowmelt or ice melt is accommodated by a relative increase in the discharge-generating frequency, a key parameter of the model. Explicit quantification of this frequency increase as a function of mean catchment meteorological conditions is left for future research.
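
    For reference, the empirical flow duration curves against which such an analytical model is assessed can be computed from a daily discharge series in a few lines; this sketch (Python, an editor's illustration) uses the Weibull plotting position.

```python
import numpy as np

def flow_duration_curve(discharge):
    """Empirical flow duration curve: discharge sorted in decreasing order
    against its exceedance probability (Weibull plotting position)."""
    q = np.sort(np.asarray(discharge, dtype=float))[::-1]
    exceedance = np.arange(1, q.size + 1) / (q.size + 1)
    return exceedance, q

# Example with a synthetic daily series (placeholder for observed discharge).
rng = np.random.default_rng(0)
p, q = flow_duration_curve(rng.lognormal(mean=1.0, sigma=0.8, size=365))
```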

  20. Automatically-generated rectal dose constraints in intensity-modulated radiation therapy for prostate cancer

    Science.gov (United States)

    Hwang, Taejin; Kim, Yong Nam; Kim, Soo Kon; Kang, Sei-Kwon; Cheong, Kwang-Ho; Park, Soah; Yoon, Jai-Woong; Han, Taejin; Kim, Haeyoung; Lee, Meyeon; Kim, Kyoung-Joo; Bae, Hoonsik; Suh, Tae-Suk

    2015-06-01

    The dose constraint during prostate intensity-modulated radiation therapy (IMRT) optimization should be patient-specific for better rectum sparing. The aims of this study are to suggest a novel method for automatically generating a patient-specific dose constraint by using an experience-based dose volume histogram (DVH) of the rectum and to evaluate the potential of such a dose constraint qualitatively. The normal tissue complication probabilities (NTCPs) of the rectum with respect to V_%ratio in our study were divided into three groups, where V_%ratio was defined as the percent ratio of the rectal volume overlapping the planning target volume (PTV) to the rectal volume: (1) the rectal NTCPs in the previous study (clinical data), (2) those statistically generated by using the standard normal distribution (calculated data), and (3) those generated by combining the calculated data and the clinical data (mixed data). In the calculated data, a random number whose mean value was on the fitted curve described in the clinical data and whose standard deviation was 1% was generated by using the 'randn' function in the MATLAB program and was used. For each group, we validated whether the probability density function (PDF) of the rectal NTCP could be automatically generated with the density estimation method by using a Gaussian kernel. The results revealed that the rectal NTCP probability increased in proportion to V_%ratio, that the predictive rectal NTCP was patient-specific, and that the starting point of IMRT optimization for the given patient might be different. The PDF of the rectal NTCP was obtained automatically for each group except that the smoothness of the probability distribution increased with increasing number of data and with increasing window width. We showed that during the prostate IMRT optimization, the patient-specific dose constraints could be automatically generated and that our method could reduce the IMRT optimization time as well as maintain the
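
    A minimal Python analogue (editor's sketch) of the sampling-plus-density-estimation step described above: draw normally distributed NTCP values whose mean lies on a fitted NTCP-versus-V_%ratio curve, with a standard deviation of 1%, and then estimate the PDF with a Gaussian kernel. The fitted_ntcp function below is a hypothetical placeholder standing in for the paper's clinical fit.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

def fitted_ntcp(v_ratio):
    """Hypothetical fitted NTCP-vs-V_%ratio curve (placeholder, illustrative only)."""
    return 0.05 + 0.4 * v_ratio

v_ratio = 0.3                                          # overlap fraction for one patient
samples = fitted_ntcp(v_ratio) + 0.01 * rng.standard_normal(200)   # sd = 1%, randn analogue
pdf = gaussian_kde(samples)                            # Gaussian-kernel density estimate
grid = np.linspace(samples.min(), samples.max(), 100)
density = pdf(grid)                                    # estimated PDF of the rectal NTCP
```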

  1. DETERMINISTIC COMPONENTS IN THE LIGHT CURVE AMPLITUDE OF Y OPH

    International Nuclear Information System (INIS)

    Pop, Alexandru; Turcu, Vlad; Vamos, Calin

    2010-01-01

    About two decades after the discovery of the amplitude decline of the light curve of the classical Cepheid Y Oph, its study is resumed using an increased amount of homogenized data and an extended time base. In our approach, the investigation of different time series concerning the light curve amplitude of Y Oph is not only the reason for the present study, but also a stimulus for developing a coherent methodology for studying long- and short-term variability phenomena in variable stars, taking into account the details of concrete observing conditions: amount of data, data sampling, time base, and individual errors of observational data. The statistical significance of this decreasing trend was estimated by assuming its linearity. We approached the decision-making process by formulating adequate null and alternative hypotheses, and testing the value of the regression line slope for different data sets via Monte Carlo simulations. A variability analysis, through various methods, of the original data and of the residuals obtained after removing the linear trend was performed. We also proposed a new statistical test, based on amplitude spectrum analysis and Monte Carlo simulations, intended to evaluate how detectable a given (linear) trend is in well-defined observing conditions: the trend detection probability. The main conclusion of our study on Y Oph is that, even if the false alarm probability is low enough to consider the decreasing trend to be statistically significant, the available data do not allow us to obtain a reasonably powerful test. We are able to confirm the light curve amplitude decline, and the order of magnitude of its slope, with a better statistical substantiation. According to the obtained values of the trend detection probability, it seems that the trend we are dealing with is marked by a low detectability. Our attempt to find signs of possible variability phenomena at shorter timescales ended by emphasizing the relative constancy of our data

  2. Classification of resistance to passive motion using minimum probability of error criterion.

    Science.gov (United States)

    Chan, H C; Manry, M T; Kondraske, G V

    1987-01-01

    Neurologists diagnose many muscular and nerve disorders by classifying the resistance to passive motion of patients' limbs. Over the past several years, a computer-based instrument has been developed for automated measurement and parameterization of this resistance. In the device, a voluntarily relaxed lower extremity is moved at constant velocity by a motorized driver. The torque exerted on the extremity by the machine is sampled, along with the angle of the extremity. In this paper a computerized technique is described for classifying a patient's condition as 'Normal' or 'Parkinson disease' (rigidity), from the torque versus angle curve for the knee joint. A Legendre polynomial, fit to the curve, is used to calculate a set of eight normally distributed features of the curve. The minimum probability of error approach is used to classify the curve as being from a normal or Parkinson disease patient. Data collected from 44 different subjects were processed, and the results were compared with an independent physician's subjective assessment of rigidity. There is agreement in better than 95% of the cases when all of the features are used.
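
    A sketch of the two steps described above, under assumed Gaussian class statistics (Python, an editor's illustration rather than the authors' implementation): fit an eight-coefficient Legendre polynomial to a torque-angle curve to obtain the feature vector, then apply the minimum-probability-of-error (maximum posterior) rule for two Gaussian classes. All class means, covariances, and priors are assumed inputs.

```python
import numpy as np
from numpy.polynomial import legendre

def curve_features(angle, torque, deg=7):
    """Eight Legendre coefficients (degrees 0..7) fitted to a torque-angle curve."""
    # Rescale the angle to [-1, 1], the natural domain of Legendre polynomials.
    a = 2 * (angle - angle.min()) / (angle.max() - angle.min()) - 1
    return legendre.legfit(a, torque, deg)

def min_error_classify(x, mean0, cov0, prior0, mean1, cov1, prior1):
    """Minimum-probability-of-error rule for two Gaussian classes:
    pick the class with the larger (log) posterior."""
    def log_post(x, m, c, p):
        d = x - m
        return (-0.5 * d @ np.linalg.solve(c, d)
                - 0.5 * np.linalg.slogdet(c)[1] + np.log(p))
    return int(log_post(x, mean1, cov1, prior1) > log_post(x, mean0, cov0, prior0))
```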

  3. Marginal abatement cost curves for Heavy Duty Vehicles. Background report

    Energy Technology Data Exchange (ETDEWEB)

    Schroten, A.; Warringa, G.; Bles, M.

    2012-09-15

    Cost curves were calculated for CO2 abatement technologies for Heavy Duty Vehicles. These curves were elaborated for eight different vehicle categories (six categories of truck and two subcategories), as well as for an 'average' truck and bus. Given that cost curves depend very much on underlying assumptions, the MACH model (Marginal Abatement Costs of Heavy duty vehicles) was developed. This model allows users to enter their own assumptions with respect to parameters like fuel prices and cost and lifetime of individual technologies, with the model then generating new cost curves for the various vehicle categories. This background report contains a description of the model and a summary of the results of several model runs.

  4. A simple method to calculate the influence of dose inhomogeneity and fractionation in normal tissue complication probability evaluation

    International Nuclear Information System (INIS)

    Begnozzi, L.; Gentile, F.P.; Di Nallo, A.M.; Chiatti, L.; Zicari, C.; Consorti, R.; Benassi, M.

    1994-01-01

    Since volumetric dose distributions are available with 3-dimensional radiotherapy treatment planning, they can be used in statistical evaluation of response to radiation. This report presents a method to calculate the influence of dose inhomogeneity and fractionation in normal tissue complication probability evaluation. The mathematical expression for the calculation of normal tissue complication probability has been derived by combining the Lyman model with the histogram reduction method of Kutcher et al. and using the normalized total dose (NTD) instead of the total dose. The fitting of published tolerance data, in case of homogeneous or partial brain irradiation, has been considered. For the same total or partial volume homogeneous irradiation of the brain, curves of normal tissue complication probability have been calculated with fraction sizes of 1.5 Gy and of 3 Gy instead of 2 Gy, to show the influence of fraction size. The influence of dose distribution inhomogeneity and α/β value has also been simulated: considering α/β=1.6 Gy or α/β=4.1 Gy for kidney clinical nephritis, the calculated curves of normal tissue complication probability are shown. Combining NTD calculations and histogram reduction techniques, normal tissue complication probability can be estimated taking into account the most relevant contributing factors, including the volume effect. (orig.)
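
    The combination described above corresponds to the widely used Lyman-Kutcher-Burman formulation. A minimal sketch (Python, an editor's illustration with made-up parameter values, not the paper's fit) converts each DVH dose bin to normalized total dose, reduces the DVH to a generalized EUD, and evaluates the Lyman probit.

```python
import numpy as np
from scipy.stats import norm

def ntd(dose, dose_per_fraction, alpha_beta):
    """Normalized total dose: dose biologically equivalent to 2 Gy fractions."""
    return dose * (alpha_beta + dose_per_fraction) / (alpha_beta + 2.0)

def lkb_ntcp(bin_doses, bin_volumes, td50, m, n):
    """Lyman model with Kutcher et al. DVH reduction (generalized EUD form)."""
    v = np.asarray(bin_volumes, dtype=float)
    v = v / v.sum()                                    # fractional volumes of DVH bins
    geud = np.sum(v * np.asarray(bin_doses) ** (1.0 / n)) ** n
    t = (geud - td50) / (m * td50)
    return norm.cdf(t)

# Example: a two-bin DVH, 1.5 Gy per fraction, alpha/beta = 3 Gy (all values illustrative).
doses = ntd(np.array([45.0, 60.0]), dose_per_fraction=1.5, alpha_beta=3.0)
p = lkb_ntcp(doses, bin_volumes=[0.6, 0.4], td50=65.0, m=0.15, n=0.1)
```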

  5. Experimental and simulated beam-foil decay curves for some transitions in Zn II

    International Nuclear Information System (INIS)

    Hultberg, S.; Liljeby, L.; Mannervik, S.; Veje, E.; Lindgaard, A.

    1980-01-01

    Experimental beam-foil decay curves for the 4s-4p, 4p-4d, 4d-4f, and the 4p-5s transitions in Zn II are compared to decay curves synthesized from transition probabilities calculated in the numerical Coulomb approximation and either measured initial level populations or population models. Good agreement exists between experimental curves and those based on the measured initial level populations for the 5s, 4d, and 4f levels, while certain deviations are noted for the 4p term. None of the applied population models reproduces all experimental curves satisfactorily. In addition, lifetimes are determined experimentally for 7 terms in Zn II, and good agreement with the numerical Coulomb approximation lifetimes is generally found, except for some p terms. Beam-foil excitation-mechanism results for zinc are presented and compared to previous results from light projectiles. (Auth.)

  6. Reflection curves—new computation and rendering techniques

    Directory of Open Access Journals (Sweden)

    Dan-Eugen Ulmet

    2004-05-01

    Reflection curves on surfaces are important tools for free-form surface interrogation. They are essential for industrial 3D CAD/CAM systems and for rendering purposes. In this note, new approaches regarding the computation and rendering of reflection curves on surfaces are introduced. These approaches are designed to take advantage of the graphics libraries of recent releases of commercial systems such as the OpenInventor toolkit (developed by Silicon Graphics) or Matlab (developed by The MathWorks). A new relation between reflection curves and contour curves is derived; this theoretical result is used for a straightforward Matlab implementation of reflection curves. A new type of reflection curves is also generated using the OpenInventor texture and environment mapping implementations. This allows the computation, rendering, and animation of reflection curves at interactive rates, which makes it particularly useful for industrial applications.

  7. Two general models that generate long range correlation

    Science.gov (United States)

    Gan, Xiaocong; Han, Zhangang

    2012-06-01

    In this paper we study two models that generate sequences with LRC (long range correlation). For the IFT (inverse Fourier transform) model, our conclusion is that the low frequency part leads to LRC, while the high frequency part tends to eliminate it. Therefore, a typical method to generate a sequence with LRC is multiplying the spectrum of a white noise sequence by a decaying function. A special case is analyzed: the linear combination of a smooth curve and a white noise sequence, in which the DFA plot consists of two line segments. For the patch model, our conclusion is that long subsequences lead to LRC, while short subsequences tend to eliminate it. Therefore, we can generate a sequence with LRC by using a fat-tailed PDF (probability distribution function) of the length of the subsequences. A special case is also analyzed: if a patch model with long subsequences is mixed with a white noise sequence, the DFA plot will consist of two line segments. We have checked known models and actual data, and found they are all consistent with this study.
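
    A minimal sketch of the IFT recipe stated above (Python, an editor's illustration): take the spectrum of white noise, multiply it by a power-law decaying function of frequency, and invert the transform. The exponent beta is an assumed free parameter controlling the strength of the correlation.

```python
import numpy as np

def lrc_sequence(n, beta, rng=None):
    """Generate a sequence with long range correlation by shaping the spectrum
    of white noise with a power-law decaying function (IFT model)."""
    rng = np.random.default_rng() if rng is None else rng
    white = rng.standard_normal(n)
    spectrum = np.fft.rfft(white)
    freqs = np.fft.rfftfreq(n)
    freqs[0] = freqs[1]                      # avoid division by zero at f = 0
    shaped = spectrum * freqs ** (-beta / 2.0)
    return np.fft.irfft(shaped, n)

# Example: a 4096-point correlated sequence with an assumed exponent of 0.8.
x = lrc_sequence(4096, beta=0.8)
```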

  8. A tool for the calculation of rockfall fragility curves for masonry buildings

    Science.gov (United States)

    Mavrouli, Olga

    2017-04-01

    Masonry buildings are common in mountainous and coastal areas, and they exhibit substantial vulnerability to rockfalls. For big rockfall events or precarious structures the damage is very high and the repair is not cost-effective. Nonetheless, for small or moderate rockfalls, the damage may vary as a function of the characteristics of the impacting rock blocks and of the buildings. The evaluation of the expected damage for masonry buildings, and for different small and moderate rockfall scenarios, is useful for assessing the expected direct loss at constructed areas, and its implications for life safety. A tool for the calculation of fragility curves for masonry buildings which are impacted by rock blocks is presented. The fragility curves provide the probability of exceeding a given damage state (low, moderate and high) for increasing impact energies of the rock blocks on the walls. The damage states are defined according to a damage index equal to the percentage of the damaged area of a wall, as being proportional to the repair cost. Aleatoric and epistemic uncertainties are incorporated with respect to the (i) rock block velocity, (ii) rock block size, (iii) masonry width, and (iv) masonry resistance. The calculation of the fragility curves is performed using a Monte Carlo simulation. Given user-defined data for the average value of these four parameters and their variability, random scenarios are developed, the respective damage index is assessed for each scenario, and the probability of exceedance of each damage state is calculated. For the assessment of the damage index, a database developed from the results of 576 analytical simulations is used. The parameter ranges are: wall width 0.4 - 1.0 m, wall tensile strength 0.1 - 0.6 MPa, rock velocity 1-20 m/s, rock size 1-20 m3. Nonetheless, this tool permits the use of alternative databases, on the condition that they contain data that correlate the damage with the four aforementioned variables. The fragility curves can
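
    A schematic Monte Carlo loop of the kind described above (Python, an editor's sketch): the damage_index surrogate below is a made-up placeholder for the paper's database of 576 analytical simulations, and all distributions, constants, and thresholds are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def damage_index(energy, wall_width, tensile_strength):
    """Hypothetical damage surrogate: fraction of wall area damaged, in [0, 1]."""
    return np.clip(energy / (energy + 5e3 * wall_width * tensile_strength), 0, 1)

def fragility(n=10_000):
    """Probability of exceeding each damage state over random rockfall scenarios."""
    v = rng.uniform(1, 20, n)                    # block velocity [m/s]
    vol = rng.uniform(1, 20, n)                  # block volume [m^3]
    width = rng.normal(0.7, 0.1, n)              # wall width [m]
    strength = rng.normal(0.35, 0.08, n)         # tensile strength [MPa]
    energy = 0.5 * 2700 * vol * v**2 / 1e3       # kinetic energy [kJ], rock density ~2700 kg/m^3
    di = damage_index(energy, width, strength)
    thresholds = {"low": 0.1, "moderate": 0.3, "high": 0.6}
    return {state: np.mean(di >= t) for state, t in thresholds.items()}
```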

  9. Task 4.1: Development of a framework for creating a databank to generate probability density functions for process parameters

    International Nuclear Information System (INIS)

    Burgazzi, Luciano

    2011-01-01

    PSA analysis should be based on the best available data for the types of equipment and systems in the plant. In some cases very limited data may be available for evolutionary designs or new equipment, especially in the case of passive systems. It has been recognized that difficulties arise in addressing the uncertainties related to the physical phenomena and characterizing the parameters relevant to the passive system performance evaluation, owing to the unavailability of a consistent operational and experimental database. This lack of experimental evidence and validated data forces the analyst to resort to expert/engineering judgment to a large extent, thus making the results strongly dependent upon the expert elicitation process. This prompts the need for the development of a framework for constructing a database to generate probability distributions for the parameters influencing the system behaviour. The objective of the task is to develop a consistent framework aimed at creating probability distributions for the parameters relevant to the passive system performance evaluation. In order to achieve this goal, considerable experience and engineering judgement are also required to determine which existing data are most applicable to the new systems or which generic databases or models provide the best information for the system design. Eventually, in the absence of documented specific reliability data, documented expert judgement coming out of a well structured procedure could be used to envisage sound probability distributions for the parameters of interest.

  10. Making Heads or Tails of Probability: An Experiment with Random Generators

    Science.gov (United States)

    Morsanyi, Kinga; Handley, Simon J.; Serpell, Sylvie

    2013-01-01

    Background: The equiprobability bias is a tendency for individuals to think of probabilistic events as "equiprobable" by nature, and to judge outcomes that occur with different probabilities as equally likely. The equiprobability bias has been repeatedly found to be related to formal education in statistics, and it is claimed to be based…

  11. Probability and Statistics The Science of Uncertainty (Revised Edition)

    CERN Document Server

    Tabak, John

    2011-01-01

    Probability and Statistics, Revised Edition deals with the history of probability, describing the modern concept of randomness and examining "pre-probabilistic" ideas of what most people today would characterize as randomness. This revised book documents some historically important early uses of probability to illustrate some very important probabilistic questions. It goes on to explore statistics and the generations of mathematicians and non-mathematicians who began to address problems in statistical analysis, including the statistical structure of data sets as well as the theory of

  12. Efficient pseudorandom generators based on the DDH assumption

    NARCIS (Netherlands)

    Rezaeian Farashahi, R.; Schoenmakers, B.; Sidorenko, A.; Okamoto, T.; Wang, X.

    2007-01-01

    A family of pseudorandom generators based on the decisional Diffie-Hellman assumption is proposed. The new construction is a modified and generalized version of the Dual Elliptic Curve generator proposed by Barker and Kelsey. Although the original Dual Elliptic Curve generator is shown to be

  13. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  14. Supervised detection of anomalous light curves in massive astronomical catalogs

    International Nuclear Information System (INIS)

    Nun, Isadora; Pichara, Karim; Protopapas, Pavlos; Kim, Dae-Won

    2014-01-01

    The development of synoptic sky surveys has led to a massive amount of data for which resources needed for analysis are beyond human capabilities. In order to process this information and to extract all possible knowledge, machine learning techniques become necessary. Here we present a new methodology to automatically discover unknown variable objects in large astronomical catalogs. With the aim of taking full advantage of all information we have about known objects, our method is based on a supervised algorithm. In particular, we train a random forest classifier using known variability classes of objects and obtain votes for each of the objects in the training set. We then model this voting distribution with a Bayesian network and obtain the joint voting distribution among the training objects. Consequently, an unknown object is considered as an outlier insofar it has a low joint probability. By leaving out one of the classes on the training set, we perform a validity test and show that when the random forest classifier attempts to classify unknown light curves (the class left out), it votes with an unusual distribution among the classes. This rare voting is detected by the Bayesian network and expressed as a low joint probability. Our method is suitable for exploring massive data sets given that the training process is performed offline. We tested our algorithm on 20 million light curves from the MACHO catalog and generated a list of anomalous candidates. After analysis, we divided the candidates into two main classes of outliers: artifacts and intrinsic outliers. Artifacts were principally due to air mass variation, seasonal variation, bad calibration, or instrumental errors and were consequently removed from our outlier list and added to the training set. After retraining, we selected about 4000 objects, which we passed to a post-analysis stage by performing a cross-match with all publicly available catalogs. Within these candidates we identified certain known

  15. Supervised Detection of Anomalous Light Curves in Massive Astronomical Catalogs

    Science.gov (United States)

    Nun, Isadora; Pichara, Karim; Protopapas, Pavlos; Kim, Dae-Won

    2014-09-01

    The development of synoptic sky surveys has led to a massive amount of data for which resources needed for analysis are beyond human capabilities. In order to process this information and to extract all possible knowledge, machine learning techniques become necessary. Here we present a new methodology to automatically discover unknown variable objects in large astronomical catalogs. With the aim of taking full advantage of all information we have about known objects, our method is based on a supervised algorithm. In particular, we train a random forest classifier using known variability classes of objects and obtain votes for each of the objects in the training set. We then model this voting distribution with a Bayesian network and obtain the joint voting distribution among the training objects. Consequently, an unknown object is considered as an outlier insofar it has a low joint probability. By leaving out one of the classes on the training set, we perform a validity test and show that when the random forest classifier attempts to classify unknown light curves (the class left out), it votes with an unusual distribution among the classes. This rare voting is detected by the Bayesian network and expressed as a low joint probability. Our method is suitable for exploring massive data sets given that the training process is performed offline. We tested our algorithm on 20 million light curves from the MACHO catalog and generated a list of anomalous candidates. After analysis, we divided the candidates into two main classes of outliers: artifacts and intrinsic outliers. Artifacts were principally due to air mass variation, seasonal variation, bad calibration, or instrumental errors and were consequently removed from our outlier list and added to the training set. After retraining, we selected about 4000 objects, which we passed to a post-analysis stage by performing a cross-match with all publicly available catalogs. Within these candidates we identified certain known

  16. Numerical modeling of the effect of surface topology on the saturated pool nucleate boiling curve

    International Nuclear Information System (INIS)

    Unal, C.; Pasamehmetoglu, K.O.

    1993-01-01

    A numerical study of saturated pool nucleate boiling with an emphasis on the effect of surface topography is presented. The numerical model consisted of solving the three-dimensional transient heat conduction equation within the heater subjected to nucleate boiling over its upper surface. The surface topography model considered the distribution of the cavity and cavity angles based on exponential and normal probability functions. Parametric results showed that the saturated nucleate boiling curve shifted left and became steeper with an increase in the mean cavity radius. The boiling curve was found to be sensitive to the selection of how many cavities were selected for each octagonal cell. A small variation in the statistical parameters, especially cavity radii for smooth surfaces, resulted in noticeable differences in wall superheat for a given heat flux. This result indicated that while the heat transfer coefficient increased with cavity radii, the cavity radii or height alone was not sufficient to characterize the boiling curve. It also suggested that statistical experimental data should consider large samples to characterize the surface topology. The boiling curve shifted to the right when the cavity angle was obtained using a normal distribution. This effect became less important when the number of cavities for each cell was increasing because the probability of the potential cavity with a larger radius in each cell was increased. When the contact angle of the fluid decreased for a given mean cavity radii, the boiling curve shifted to the right. This shift was more pronounced at smaller mean cavity radii and decreased with increasing mean cavity radii

  17. Sequential Probability Ratio Tests: Conservative and Robust

    NARCIS (Netherlands)

    Kleijnen, J.P.C.; Shi, Wen

    2017-01-01

    In practice, most computers generate simulation outputs sequentially, so it is attractive to analyze these outputs through sequential statistical methods such as sequential probability ratio tests (SPRTs). We investigate several SPRTs for choosing between two hypothesized values for the mean output
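
    As a concrete illustration of an SPRT for two hypothesized mean values (Python, an editor's sketch, not the paper's procedure), the classical Wald test for Gaussian simulation output with known standard deviation accumulates the log likelihood ratio until it crosses one of two boundaries.

```python
import numpy as np

def sprt_gaussian(xs, mu0, mu1, sigma, alpha=0.05, beta=0.05):
    """Wald's sequential probability ratio test for H0: mean = mu0 vs H1: mean = mu1,
    Gaussian observations with known sigma. Returns the decision and sample count."""
    a = np.log(beta / (1 - alpha))       # lower boundary: accept H0
    b = np.log((1 - beta) / alpha)       # upper boundary: accept H1
    llr = 0.0
    for i, x in enumerate(xs, start=1):
        llr += (mu1 - mu0) * (x - (mu0 + mu1) / 2.0) / sigma**2
        if llr <= a:
            return "accept H0", i
        if llr >= b:
            return "accept H1", i
    return "undecided", len(xs)

# Example on simulated output whose true mean lies between the two hypotheses.
rng = np.random.default_rng(0)
decision, n_used = sprt_gaussian(rng.normal(0.3, 1.0, 500), mu0=0.0, mu1=0.5, sigma=1.0)
```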

  18. Rational Multi-curve Models with Counterparty-risk Valuation Adjustments

    DEFF Research Database (Denmark)

    Crépey, Stéphane; Macrina, Andrea; Nguyen, Tuyet Mai

    2016-01-01

    We develop a multi-curve term structure set-up in which the modelling ingredients are expressed by rational functionals of Markov processes. We calibrate to London Interbank Offered Rate swaptions data and show that a rational two-factor log-normal multi-curve model is sufficient to match market data with accuracy. We elucidate the relationship between the models developed and calibrated under a risk-neutral measure Q and their consistent equivalence class under the real-world probability measure P. The consistent P-pricing models are applied to compute the risk exposures which may be required to comply with regulatory obligations. In order to compute counterparty-risk valuation adjustments, such as credit valuation adjustment, we show how default intensity processes with rational form can be derived. We flesh out our study by applying the results to a basis swap contract.

  19. Optimal fractionation for the radiotherapy of tumour cells possessing wide-shouldered survival curves

    International Nuclear Information System (INIS)

    Wheldon, T.E.

    1979-01-01

    A recent publication (Zeitz, L., and McDonald, J.M., 1978, Br. J. Radiol., vol. 51, 637) has considered the use of in vitro survival curves in the evaluation of different treatment schedules. Several studies of oxygenated melanoma cells have demonstrated a wider than average shoulder width for the survival curves. It is possible that hypoxia reduces the width of this shoulder. Theoretical cell survival probabilities were calculated for each of the four treatment schedules considered by Zeitz and McDonald. The calculations were based on hypothetical survival curves for anoxic melanoma cells with the shoulder either fully retained or completely abolished. No allowance was made for either re-population or re-oxygenation. The advantage of small doses per fraction was demonstrated for both types of survival curve. Re-oxygenation during therapy could therefore mean that a non-uniform treatment schedule is the appropriate choice for this type of tumour. (U.K.)

  20. Characteristic length of the knotting probability revisited

    International Nuclear Information System (INIS)

    Uehara, Erica; Deguchi, Tetsuo

    2015-01-01

    We present a self-avoiding polygon (SAP) model for circular DNA in which the radius of impermeable cylindrical segments corresponds to the screening length of double-stranded DNA surrounded by counter ions. For the model we evaluate the probability for a generated SAP with N segments having a given knot K through simulation. We call it the knotting probability of a knot K with N segments for the SAP model. We show that when N is large the most significant factor in the knotting probability is given by the exponentially decaying part exp(−N/N_K), where the estimates of parameter N_K are consistent with the same value for all the different knots we investigated. We thus call it the characteristic length of the knotting probability. We give formulae expressing the characteristic length as a function of the cylindrical radius r_ex, i.e. the screening length of double-stranded DNA. (paper)

  1. UNSUPERVISED TRANSIENT LIGHT CURVE ANALYSIS VIA HIERARCHICAL BAYESIAN INFERENCE

    International Nuclear Information System (INIS)

    Sanders, N. E.; Soderberg, A. M.; Betancourt, M.

    2015-01-01

    Historically, light curve studies of supernovae (SNe) and other transient classes have focused on individual objects with copious and high signal-to-noise observations. In the nascent era of wide field transient searches, objects with detailed observations are decreasing as a fraction of the overall known SN population, and this strategy sacrifices the majority of the information contained in the data about the underlying population of transients. A population level modeling approach, simultaneously fitting all available observations of objects in a transient sub-class of interest, fully mines the data to infer the properties of the population and avoids certain systematic biases. We present a novel hierarchical Bayesian statistical model for population level modeling of transient light curves, and discuss its implementation using an efficient Hamiltonian Monte Carlo technique. As a test case, we apply this model to the Type IIP SN sample from the Pan-STARRS1 Medium Deep Survey, consisting of 18,837 photometric observations of 76 SNe, corresponding to a joint posterior distribution with 9176 parameters under our model. Our hierarchical model fits provide improved constraints on light curve parameters relevant to the physical properties of their progenitor stars relative to modeling individual light curves alone. Moreover, we directly evaluate the probability for occurrence rates of unseen light curve characteristics from the model hyperparameters, addressing observational biases in survey methodology. We view this modeling framework as an unsupervised machine learning technique with the ability to maximize scientific returns from data to be collected by future wide field transient searches like LSST

  2. UNSUPERVISED TRANSIENT LIGHT CURVE ANALYSIS VIA HIERARCHICAL BAYESIAN INFERENCE

    Energy Technology Data Exchange (ETDEWEB)

    Sanders, N. E.; Soderberg, A. M. [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Betancourt, M., E-mail: nsanders@cfa.harvard.edu [Department of Statistics, University of Warwick, Coventry CV4 7AL (United Kingdom)

    2015-02-10

    Historically, light curve studies of supernovae (SNe) and other transient classes have focused on individual objects with copious and high signal-to-noise observations. In the nascent era of wide field transient searches, objects with detailed observations are decreasing as a fraction of the overall known SN population, and this strategy sacrifices the majority of the information contained in the data about the underlying population of transients. A population level modeling approach, simultaneously fitting all available observations of objects in a transient sub-class of interest, fully mines the data to infer the properties of the population and avoids certain systematic biases. We present a novel hierarchical Bayesian statistical model for population level modeling of transient light curves, and discuss its implementation using an efficient Hamiltonian Monte Carlo technique. As a test case, we apply this model to the Type IIP SN sample from the Pan-STARRS1 Medium Deep Survey, consisting of 18,837 photometric observations of 76 SNe, corresponding to a joint posterior distribution with 9176 parameters under our model. Our hierarchical model fits provide improved constraints on light curve parameters relevant to the physical properties of their progenitor stars relative to modeling individual light curves alone. Moreover, we directly evaluate the probability for occurrence rates of unseen light curve characteristics from the model hyperparameters, addressing observational biases in survey methodology. We view this modeling framework as an unsupervised machine learning technique with the ability to maximize scientific returns from data to be collected by future wide field transient searches like LSST.

  3. Curved Folded Plate Timber Structures

    OpenAIRE

    Buri, Hans Ulrich; Stotz, Ivo; Weinand, Yves

    2011-01-01

    This work investigates the development of a Curved Origami Prototype made with timber panels. In the last fifteen years the timber industry has developed new, large size, timber panels. The composition and dimensions of these panels and the possibility of milling them with Computer Numerical Controlled machines show great potential for folded plate structures. To generate the form of these structures we were inspired by Origami, the Japanese art of paper folding. Common paper tessellations are c...

  4. Discontinuity of the annuity curves. III. Two types of vital variability in Drosophila melanogaster.

    Science.gov (United States)

    Bychkovskaia, I B; Mylnikov, S V; Mozhaev, G A

    2016-01-01

    We confirm the five-phase structure of Drosophila annuity curves established earlier. The annuity curves were composed of a stable five-phase component and a variable one. The variable component was due to differences in phase durations. Both the stable and the variable components were apparent for 60 generations. A stochastic component was described as well. The viability variance, which characterizes the «reaction norm», was likewise apparent in all generations. Thus, both types of variability seem to be inherited.

  5. Probability of misclassifying biological elements in surface waters.

    Science.gov (United States)

    Loga, Małgorzata; Wierzchołowska-Dziedzic, Anna

    2017-11-24

    Measurement uncertainties are inherent to assessment of biological indices of water bodies. The effect of these uncertainties on the probability of misclassification of ecological status is the subject of this paper. Four Monte-Carlo (M-C) models were applied to simulate the occurrence of random errors in the measurements of metrics corresponding to four biological elements of surface waters: macrophytes, phytoplankton, phytobenthos, and benthic macroinvertebrates. Long series of error-prone measurement values of these metrics, generated by M-C models, were used to identify cases in which values of any of the four biological indices lay outside of the "true" water body class, i.e., outside the class assigned from the actual physical measurements. Fraction of such cases in the M-C generated series was used to estimate the probability of misclassification. The method is particularly useful for estimating the probability of misclassification of the ecological status of surface water bodies in the case of short sequences of measurements of biological indices. The results of the Monte-Carlo simulations show a relatively high sensitivity of this probability to measurement errors of the river macrophyte index (MIR) and high robustness to measurement errors of the benthic macroinvertebrate index (MMI). The proposed method of using Monte-Carlo models to estimate the probability of misclassification has significant potential for assessing the uncertainty of water body status reported to the EC by the EU member countries according to WFD. The method can be readily applied also in risk assessment of water management decisions before adopting the status dependent corrective actions.
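
    The core Monte Carlo step described above can be sketched as follows (Python, an editor's illustration): perturb an error-free index value with measurement noise and count how often the perturbed value falls outside the class assigned to the error-free value. The class boundaries and error magnitude are assumed inputs, not values from the paper.

```python
import numpy as np

def misclassification_probability(true_value, sd, class_boundaries, n_sim=100_000, rng=None):
    """Probability that an error-perturbed biological index falls outside the class
    assigned to the error-free value (Monte Carlo estimate)."""
    rng = np.random.default_rng() if rng is None else rng
    true_class = np.digitize(true_value, class_boundaries)
    noisy = true_value + sd * rng.standard_normal(n_sim)
    noisy_class = np.digitize(noisy, class_boundaries)
    return np.mean(noisy_class != true_class)

# Example: an index of 0.62 near a class boundary, with 5% measurement error.
p_mis = misclassification_probability(0.62, 0.05, [0.2, 0.4, 0.6, 0.8])
```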

  6. On the universality of knot probability ratios

    Energy Technology Data Exchange (ETDEWEB)

    Janse van Rensburg, E J [Department of Mathematics and Statistics, York University, Toronto, Ontario M3J 1P3 (Canada); Rechnitzer, A, E-mail: rensburg@yorku.ca, E-mail: andrewr@math.ubc.ca [Department of Mathematics, University of British Columbia, 1984 Mathematics Road, Vancouver, BC V6T 1Z2 (Canada)

    2011-04-22

    Let p_n denote the number of self-avoiding polygons of length n on a regular three-dimensional lattice, and let p_n(K) be the number which have knot type K. The probability that a random polygon of length n has knot type K is p_n(K)/p_n and is known to decay exponentially with length (Sumners and Whittington 1988 J. Phys. A: Math. Gen. 21 1689-94, Pippenger 1989 Discrete Appl. Math. 25 273-8). Little is known rigorously about the asymptotics of p_n(K), but there is substantial numerical evidence. It is believed that the entropic exponent, α, is universal, while the exponential growth rate is independent of the knot type but varies with the lattice. The amplitude, C_K, depends on both the lattice and the knot type. The above asymptotic form implies that the relative probability of a random polygon of length n having prime knot type K over prime knot type L is given by the ratio p_n(K)/p_n(L). In the thermodynamic limit this probability ratio becomes an amplitude ratio; it should be universal and depend only on the knot types K and L. In this communication we examine the universality of these probability ratios for polygons in the simple cubic, face-centred cubic and body-centred cubic lattices. Our results support the hypothesis that these are universal quantities. For example, we estimate that a long random polygon is approximately 28 times more likely to be a trefoil than to be a figure-eight, independent of the underlying lattice, giving an estimate of the intrinsic entropy associated with knot types in closed curves. (fast track communication)

  7. Spectroscopic parameters, vibrational levels, transition dipole moments and transition probabilities of the 9 low-lying states of the NCl+ cation

    Science.gov (United States)

    Yin, Yuan; Shi, Deheng; Sun, Jinfeng; Zhu, Zunlue

    2018-03-01

    This work calculates the potential energy curves of 9 Λ-S and 28 Ω states of the NCl+ cation. The technique employed is the complete active space self-consistent field method, which is followed by the internally contracted multireference configuration interaction approach with the Davidson correction. The Λ-S states are X^2Π, 1^2Σ^+, 1^4Π, 1^4Σ^+, 1^4Σ^-, 2^4Π, 1^4Δ, 1^6Σ^+, and 1^6Π, which are yielded from the first two dissociation channels of the NCl+ cation. The Ω states are generated from these Λ-S states. The 1^4Π, 1^4Δ, 1^6Σ^+, and 1^6Π states are inverted with the spin-orbit coupling effect included. The 1^4Σ^+, 1^6Σ^+, and 1^6Π states are very weakly bound, whose well depths are only several hundred cm^-1. One avoided crossing of PECs occurs between the 1^2Σ^+ and 2^2Σ^+ states. To improve the quality of the potential energy curves, core-valence correlation and scalar relativistic corrections are included. The potential energies are extrapolated to the complete basis set limit. The spectroscopic parameters and vibrational levels are calculated. The transition dipole moments are computed. The Franck-Condon factors, Einstein coefficients, and radiative lifetimes of many transitions are determined. The spectroscopic approaches are proposed for observing these states according to the transition probabilities. The spin-orbit coupling effect on the spectroscopic and vibrational properties is evaluated. The spectroscopic parameters, vibrational levels, transition dipole moments, as well as transition probabilities reported in this paper could be considered to be very reliable.

  8. Optimize the Coverage Probability of Prediction Interval for Anomaly Detection of Sensor-Based Monitoring Series

    Directory of Open Access Journals (Sweden)

    Jingyue Pang

    2018-03-01

    Effective anomaly detection of sensing data is essential for identifying potential system failures. Because they require no prior knowledge or accumulated labels, and provide uncertainty presentation, probability prediction methods (e.g., Gaussian process regression (GPR) and relevance vector machine (RVM)) are especially adaptable to perform anomaly detection for sensing series. One key parameter of prediction models is the coverage probability (CP), which controls the judging threshold of the testing sample and is generally set to a default value (e.g., 90% or 95%). There are few criteria to determine the optimal CP for anomaly detection. Therefore, this paper designs a graphic indicator, the receiver operating characteristic curve of prediction interval (ROC-PI), based on the definition of the ROC curve, which can depict the trade-off between the PI width and the PI coverage probability across a series of cut-off points. Furthermore, the Youden index is modified to assess the performance of different CPs; the optimal CP is then derived by minimizing this index with the simulated annealing (SA) algorithm. Experiments conducted on two simulation datasets demonstrate the validity of the proposed method. In particular, an actual case study on sensing series from an on-orbit satellite illustrates its significant performance in practical application.

  9. Analysis of Morphological Features of Benign and Malignant Breast Cell Extracted From FNAC Microscopic Image Using the Pearsonian System of Curves.

    Science.gov (United States)

    Rajbongshi, Nijara; Bora, Kangkana; Nath, Dilip C; Das, Anup K; Mahanta, Lipi B

    2018-01-01

    Cytological changes in terms of shape and size of nuclei are some of the common morphometric features used to study breast cancer, which can be observed by careful screening of fine needle aspiration cytology (FNAC) images. This study attempts to categorize a collection of FNAC microscopic images into benign and malignant classes based on a family of probability distributions, using some morphometric features of cell nuclei. For this study, features namely area, perimeter, eccentricity, compactness, and circularity of cell nuclei were extracted from FNAC images of both benign and malignant samples using an image processing technique. All experiments were performed on a generated FNAC image database containing 564 malignant (cancerous) and 693 benign (noncancerous) cell level images. The five extracted features were reduced to three (area, perimeter, and circularity) based on the mean statistic. Finally, the data were fitted to the generalized Pearsonian system of frequency curves, so that the resulting distribution can be used as a statistical model. The Pearsonian system is a family of distributions in which kappa (κ) is the selection criterion, computed as a function of the first four central moments. For the benign group, kappa (κ) corresponding to area, perimeter, and circularity was -0.00004, 0.0000, and 0.04155, and for the malignant group it was 1016942, 0.01464, and -0.3213, respectively. Thus, the families of distributions related to these features for the benign and malignant groups were different, and therefore, the characterization of their probability curves will also differ.
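
    For reference, Pearson's selection criterion κ is a function of β1 (squared skewness) and β2 (kurtosis), which are themselves functions of the first four central moments. The short sketch below (Python, an editor's illustration using the textbook formula rather than the authors' code) computes κ for a sample of one feature, such as nuclear area.

```python
import numpy as np
from scipy import stats

def pearson_kappa(x):
    """Pearson's kappa criterion from the first four central moments;
    its value selects the member of the Pearson family of frequency curves."""
    b1 = stats.skew(x) ** 2                       # beta_1 = squared skewness
    b2 = stats.kurtosis(x, fisher=False)          # beta_2 = (non-excess) kurtosis
    return b1 * (b2 + 3) ** 2 / (4 * (4 * b2 - 3 * b1) * (2 * b2 - 3 * b1 - 6))

# Example on a synthetic, mildly skewed sample (placeholder for measured nuclear areas).
rng = np.random.default_rng(0)
kappa = pearson_kappa(rng.gamma(shape=5.0, scale=20.0, size=693))
```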

  10. Linear dose response curves in fungi and tradescantia

    International Nuclear Information System (INIS)

    Unrau, P.

    1999-07-01

    heterozygosity (LOH) events occur because Clone 02 repairs both DSB and LCD by recombination. Clone 02 has a linear dose response for high LET radiation. Starting from the same initial yield frequency, wild-types have a sublinear response. The sublinear response reflects a smoothly decreasing probability that 'pinks' are generated as a function of increasing high LET dose for wild-type but not Clone 02. This smoothly decreasing response would be expected for LOH in 'wild-type' humans. It reflects an increasing proportion of DNA damage being repaired by non-recombinational pathways and/or an increasing probability of cell death with increasing dose. Clone 02 at low doses and low dose rates of low LET radiation has a linear dose response, reflecting a 1/16 probability of a lesion leading to LOH, relative to high LET lesions. This differential is held to reflect: microdosimetric differences in energy deposition and, therefore, DNA damage by low and high LET radiations; the effects of lesion clustering after high LET on the probability of generating the end wild-types. While no observations have been made at very low doses and dose rates in wild-types, there is no reason to suppose that the low LET linear non-threshold dose response of Clone 02 is abnormal. The importance of the LOH somatic genetic end-point is that it reflects cancer risk in humans. The linear non-threshold low dose low LET response curves reflect either the probability that recombinational Holliday junctions are occasionally cleaved in a rare orientation to generate LOH, or the probability that low LET lesions include a small proportion of clustered events similar to high LET ionization, or both. Calculations of the Poisson probability that two or more low LET lesions will be induced in the same target suggest that dose rate effects depend upon the coincidence of DNA lesions in the same target, and that the probability of LOH depends upon lesion and repair factors. But the slope of LOH in Clone 02 and all other

  11. Linear dose response curves in fungi and tradescantia

    Energy Technology Data Exchange (ETDEWEB)

    Unrau, P. [Atomic Energy of Canada Ltd., Chalk River, Ontario (Canada)

    1999-07-15

    'pink' loss of heterozygosity (LOH) events occur because Clone 02 repairs both DSB and LCD by recombination. Clone 02 has a linear dose response for high LET radiation. Starting from the same initial yield frequency, wild-types have a sublinear response. The sublinear response reflects a smoothly decreasing probability that 'pinks' are generated as a function of increasing high LET dose for wild-type but not Clone 02. This smoothly decreasing response would be expected for LOH in 'wild-type' humans. It reflects an increasing proportion of DNA damage being repaired by non-recombinational pathways and/or an increasing probability of cell death with increasing dose. Clone 02 at low doses and low dose rates of low LET radiation has a linear dose response, reflecting a 1/16 probability of a lesion leading to LOH, relative to high LET lesions. This differential is held to reflect: microdosimetric differences in energy deposition and, therefore, DNA damage by low and high LET radiations; the effects of lesion clustering after high LET on the probability of generating the end wild-types. While no observations have been made at very low doses and dose rates in wild-types, there is no reason to suppose that the low LET linear non-threshold dose response of Clone 02 is abnormal. The importance of the LOH somatic genetic end-point is that it reflects cancer risk in humans. The linear non-threshold low dose low LET response curves reflect either the probability that recombinational Holliday junctions are occasionally cleaved in a rare orientation to generate LOH, or the probability that low LET lesions include a small proportion of clustered events similar to high LET ionization, or both. Calculations of the Poisson probability that two or more low LET lesions will be induced in the same target suggest that dose rate effects depend upon the coincidence of DNA lesions in the same target, and that the probability of LOH depends upon lesion and repair factors. But the

  12. Validating eddy current array probes for inspecting steam generator tubes

    International Nuclear Information System (INIS)

    Sullivan, S.P.; Cecco, V.S.; Obrutsky, L.S.

    1997-01-01

    A CANDU nuclear reactor was shut down for over one year because steam generator (SG) tubes had failed with outer diameter stress-corrosion cracking (ODSCC) in the U-bend section. Novel, single-pass eddy current transmit-receive probes, denoted as C3, were successful in detecting all significant cracks so that the cracked tubes could be plugged and the unit restarted. Significant numbers of tubes with SCC were removed from a SG in order to validate the results of the new probe. Results from metallurgical examinations were used to obtain probability-of-detection (POD) and sizing accuracy plots to quantify the performance of this new inspection technique. Though effective, the above approach of relying on tubes removed from a reactor is expensive, in terms of both economic and radiation-exposure costs. This led to a search for more affordable methods to validate inspection techniques and procedures. Methods are presented for calculating POD curves based on signal-to-noise studies using field data. Results of eddy current scans of tubes with laboratory-induced ODSCC are presented with associated POD curves. These studies appear promising in predicting realistic POD curves for new inspection technologies. They are being used to qualify an improved eddy current array probe in preparation for field use. (author)

  13. Learning Binomial Probability Concepts with Simulation, Random Numbers and a Spreadsheet

    Science.gov (United States)

    Rochowicz, John A., Jr.

    2005-01-01

    This paper introduces the reader to the concepts of binomial probability and simulation. A spreadsheet is used to illustrate these concepts. Random number generators are great technological tools for demonstrating the concepts of probability. Ideas of approximation, estimation, and mathematical usefulness provide numerous ways of learning…
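
    A spreadsheet-free analogue of the exercise described above (Python, an editor's sketch): estimate a binomial probability by simulation with a random number generator and compare it with the exact value.

```python
import numpy as np
from math import comb

rng = np.random.default_rng(0)
n, p, trials = 10, 0.5, 100_000
successes = rng.binomial(n, p, trials)        # 100,000 simulated sets of 10 coin flips
estimate = np.mean(successes == 7)            # simulated P(exactly 7 heads)
exact = comb(n, 7) * p**7 * (1 - p)**(n - 7)  # exact binomial probability
print(f"simulated {estimate:.4f} vs exact {exact:.4f}")
```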

  14. Studies on the effect of flaw detection probability assumptions on risk reduction at inspection

    Energy Technology Data Exchange (ETDEWEB)

    Simola, K.; Cronvall, O.; Maennistoe, I. (VTT Technical Research Centre of Finland (Finland)); Gunnars, J.; Alverlind, L.; Dillstroem, P. (Inspecta Technology, Stockholm (Sweden)); Gandossi, L. (European Commission Joint Research Centre, Brussels (Belgium))

    2009-12-15

    The aim of the project was to study the effect of POD assumptions on failure probability using structural reliability models. The main interest was to investigate whether it is justifiable to use a simplified POD curve e.g. in risk-informed in-service inspection (RI-ISI) studies. The results of the study indicate that the use of a simplified POD curve could be justifiable in RI-ISI applications. Another aim was to compare various structural reliability calculation approaches for a set of cases. Through benchmarking one can identify differences and similarities between modelling approaches, and provide added confidence on models and identify development needs. Comparing the leakage probabilities calculated by different approaches at the end of plant lifetime (60 years) shows that the results are very similar when inspections are not accounted for. However, when inspections are taken into account the predicted order of magnitude differs. Further studies would be needed to investigate the reasons for the differences. Development needs and plans for the benchmarked structural reliability models are discussed. (author)

  15. Studies on the effect of flaw detection probability assumptions on risk reduction at inspection

    International Nuclear Information System (INIS)

    Simola, K.; Cronvall, O.; Maennistoe, I.; Gunnars, J.; Alverlind, L.; Dillstroem, P.; Gandossi, L.

    2009-12-01

    The aim of the project was to study the effect of POD assumptions on failure probability using structural reliability models. The main interest was to investigate whether it is justifiable to use a simplified POD curve e.g. in risk-informed in-service inspection (RI-ISI) studies. The results of the study indicate that the use of a simplified POD curve could be justifiable in RI-ISI applications. Another aim was to compare various structural reliability calculation approaches for a set of cases. Through benchmarking one can identify differences and similarities between modelling approaches, and provide added confidence on models and identify development needs. Comparing the leakage probabilities calculated by different approaches at the end of plant lifetime (60 years) shows that the results are very similar when inspections are not accounted for. However, when inspections are taken into account the predicted order of magnitude differs. Further studies would be needed to investigate the reasons for the differences. Development needs and plans for the benchmarked structural reliability models are discussed. (author)

  16. Probability and containment of turbine missiles

    International Nuclear Information System (INIS)

    Yeh, G.C.K.

    1976-01-01

    With the trend toward ever larger power generating plants with large high-speed turbines, an important plant design consideration is the potential for and consequences of mechanical failure of turbine rotors. Such rotor failure could result in high-velocity disc fragments (turbine missiles) perforating the turbine casing and jeopardizing vital plant systems. The designer must first estimate the probability of any turbine missile damaging any safety-related plant component for his turbine and his plant arrangement. If the probability is not low enough to be acceptable to the regulatory agency, he must design a shield to contain the postulated turbine missiles. Alternatively, the shield could be designed to retard (to reduce the velocity of) the missiles such that they would not damage any vital plant system. In this paper, some of the presently available references that can be used to evaluate the probability, containment and retardation of turbine missiles are reviewed; various alternative methods are compared; and subjects for future research are recommended. (Auth.)

  17. Calculation of cranial nerve complication probability for acoustic neuroma radiosurgery

    International Nuclear Information System (INIS)

    Meeks, Sanford L.; Buatti, John M.; Foote, Kelly D.; Friedman, William A.; Bova, Francis J.

    2000-01-01

    Purpose: Estimations of complications from stereotactic radiosurgery usually rely simply on dose-volume or dose-diameter isoeffect curves. Due to the sparse clinical data available, these curves have typically not considered the target location in the brain, target histology, or treatment plan conformality as parameters in the calculation. In this study, a predictive model was generated to estimate the probability of cranial neuropathies as a result of acoustic schwannoma radiosurgery. Methods and Materials: The dose-volume histogram reduction scheme was used to calculate the normal tissue complication probability (NTCP) from brainstem dose-volume histograms. The model's fitting parameters were optimized to provide the best fit to the observed complication data for acoustic neuroma patients treated with stereotactic radiosurgery at the University of Florida. The calculation was then applied to the remainder of the patients in the database. Results: The best fit to our clinical data was obtained using n = 0.04, m = 0.15, and α/β = 2.1 Gy^-1. Although the fitting parameter m is relatively consistent with ranges found in the literature, both the volume parameter, n, and α/β are much smaller than the values quoted in the literature. The fit to our clinical data indicates that brainstem, or possibly a specific portion of the brainstem, is more radiosensitive than the parameters in the literature indicate, and that there is very little volume effect; in other words, irradiation of a small fraction of the brainstem yields NTCPs that are nearly as high as those calculated for entire volume irradiation. These new fitting parameters are specific to acoustic neuroma radiosurgery, and the small volume effect that we observe may be an artifact of the fixed relationship of acoustic tumors to specific regions of the brainstem. Applying the model to our patient database, we calculate an average NTCP of 7.2% for patients who had no

  18. Semiclassical scalar propagators in curved backgrounds: Formalism and ambiguities

    International Nuclear Information System (INIS)

    Grain, J.; Barrau, A.

    2007-01-01

    The phenomenology of quantum systems in curved space-times is among the most fascinating fields of physics, allowing--often at the gedankenexperiment level--constraints on tentative theories of quantum gravity. Determining the dynamics of fields in curved backgrounds remains, however, a complicated task because of the highly intricate partial differential equations involved, especially when the space metric exhibits no symmetry. In this article, we provide--in a pedagogical way--a general formalism to determine this dynamics at the semiclassical order. To this purpose, a generic expression for the semiclassical propagator is computed and the equation of motion for the probability four-current is derived. Those results underline a direct analogy between the computation of the propagator in general relativistic quantum mechanics and the computation of the propagator for stationary systems in nonrelativistic quantum mechanics. A possible application of this formalism to curvature-induced quantum interferences is also discussed

  19. Probability

    CERN Document Server

    Shiryaev, A N

    1996-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author included new material on the probability of large deviations, and on the central limit theorem for sums of dependent random variables.

  20. Better P-curves: Making P-curve analysis more robust to errors, fraud, and ambitious P-hacking, a Reply to Ulrich and Miller (2015).

    Science.gov (United States)

    Simonsohn, Uri; Simmons, Joseph P; Nelson, Leif D

    2015-12-01

    When studies examine true effects, they generate right-skewed p-curves, distributions of statistically significant results with more low (.01s) than high (.04s) p values. What else can cause a right-skewed p-curve? First, we consider the possibility that researchers report only the smallest significant p value (as conjectured by Ulrich & Miller, 2015), concluding that it is a very uncommon problem. We then consider more common problems, including (a) p-curvers selecting the wrong p values, (b) fake data, (c) honest errors, and (d) ambitiously p-hacked (beyond p < .05) results. We evaluate the impact of these common problems on the validity of p-curve analysis, and provide practical solutions that substantially increase its robustness. (c) 2015 APA, all rights reserved.
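    A toy simulation (not the authors' code) of the claim in the first sentence: when a true effect exists, the significant p values pile up near zero, giving a right-skewed p-curve.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    sig_p = []
    for _ in range(5000):
        a = rng.normal(0.0, 1.0, 20)        # control group
        b = rng.normal(0.5, 1.0, 20)        # treatment group with a true effect
        p = stats.ttest_ind(a, b).pvalue
        if p < 0.05:
            sig_p.append(p)

    counts = np.histogram(sig_p, bins=[0, .01, .02, .03, .04, .05])[0]
    print("share of significant results per .01 bin:", np.round(counts / len(sig_p), 2))
    # Expect a decreasing sequence: more p values below .01 than between .04 and .05.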

  1. The influence of initial beliefs on judgments of probability.

    Science.gov (United States)

    Yu, Erica C; Lagnado, David A

    2012-01-01

    This study aims to investigate whether experimentally induced prior beliefs affect processing of evidence including the updating of beliefs under uncertainty about the unknown probabilities of outcomes and the structural, outcome-generating nature of the environment. Participants played a gambling task in the form of computer-simulated slot machines and were given information about the slot machines' possible outcomes without their associated probabilities. One group was induced with a prior belief about the outcome space that matched the space of actual outcomes to be sampled; the other group was induced with a skewed prior belief that included the actual outcomes and also fictional higher outcomes. In reality, however, all participants sampled evidence from the same underlying outcome distribution, regardless of priors given. Before and during sampling, participants expressed their beliefs about the outcome distribution (values and probabilities). Evaluation of those subjective probability distributions suggests that all participants' judgments converged toward the observed outcome distribution. However, despite observing no supporting evidence for fictional outcomes, a significant proportion of participants in the skewed priors condition expected them in the future. A probe of the participants' understanding of the underlying outcome-generating processes indicated that participants' judgments were based on the information given in the induced priors and consequently, a significant proportion of participants in the skewed condition believed the slot machines were not games of chance while participants in the control condition believed the machines generated outcomes at random. Beyond Bayesian or heuristic belief updating, priors not only contribute to belief revision but also affect one's deeper understanding of the environment.

  2. Impaired mismatch negativity (MMN) generation in schizophrenia as a function of stimulus deviance, probability, and interstimulus/interdeviant interval.

    Science.gov (United States)

    Javitt, D C; Grochowski, S; Shelley, A M; Ritter, W

    1998-03-01

    Schizophrenia is a severe mental disorder associated with disturbances in perception and cognition. Event-related potentials (ERP) provide a mechanism for evaluating potential mechanisms underlying neurophysiological dysfunction in schizophrenia. Mismatch negativity (MMN) is a short-duration auditory cognitive ERP component that indexes operation of the auditory sensory ('echoic') memory system. Prior studies have demonstrated impaired MMN generation in schizophrenia along with deficits in auditory sensory memory performance. MMN is elicited in an auditory oddball paradigm in which a sequence of repetitive standard tones is interrupted infrequently by a physically deviant ('oddball') stimulus. The present study evaluates MMN generation as a function of deviant stimulus probability, interstimulus interval, interdeviant interval and the degree of pitch separation between the standard and deviant stimuli. The major findings of the present study are first, that MMN amplitude is decreased in schizophrenia across a broad range of stimulus conditions, and second, that the degree of deficit in schizophrenia is largest under conditions when MMN is normally largest. The pattern of deficit observed in schizophrenia differs from the pattern observed in other conditions associated with MMN dysfunction, including Alzheimer's disease, stroke, and alcohol intoxication.

  3. Dealing with Non-stationarity in Intensity-Frequency-Duration Curve

    Science.gov (United States)

    Rengaraju, S.; Rajendran, V.; C T, D.

    2017-12-01

    Extremes like floods and droughts are becoming more frequent and more severe in recent times, a change generally attributed to climate change. One of the main concerns is whether the present infrastructure, such as dams and storm water drainage networks, which was designed under the so-called 'stationary' assumption, is capable of withstanding the expected severe extremes. The stationary assumption holds that extremes do not change with respect to time. However, recent studies have shown that climate change has altered climate extremes both temporally and spatially. Traditionally, the observed non-stationarity in extreme precipitation is incorporated in the extreme value distributions in terms of changing parameters. Nevertheless, this raises the question of which parameter needs to change, i.e. location, scale, or shape, since one or more of these parameters may vary at a given location. Hence, this study aims to detect the changing parameters, to reduce the complexity involved in the development of non-stationary IDF curves, and to provide the uncertainty bounds of the estimated return levels using a Bayesian Differential Evolutionary Monte Carlo (DE-MC) algorithm. Firstly, the extreme precipitation series is extracted using Peak Over Threshold. Then, the time-varying parameter(s) is(are) detected for the extracted series using Generalized Additive Models for Location Scale and Shape (GAMLSS). Then, the IDF curve is constructed using a Generalized Pareto Distribution, incorporating non-stationarity only if the parameter(s) is(are) changing with respect to time; otherwise the IDF curve follows the stationary assumption. Finally, the posterior probability intervals of the estimated return levels are computed through the Bayesian DE-MC approach and the non-stationary IDF curve is compared with the stationary IDF curve. The results of this study emphasize that the time-varying parameters also change spatially and the IDF curves should incorporate non
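    A hedged sketch of the core non-stationary step described above: fitting a Generalized Pareto Distribution to peaks-over-threshold exceedances while letting the scale parameter drift in time. The log-linear trend form and all numbers are assumptions for illustration, not the study's settings.

    import numpy as np
    from scipy.optimize import minimize

    def neg_log_lik(params, y, t):
        a, b, xi = params                      # log-scale intercept/slope, GPD shape
        sigma = np.exp(a + b * t)              # time-varying scale parameter
        if abs(xi) < 1e-8:
            return np.sum(np.log(sigma) + y / sigma)   # exponential limit of the GPD
        z = 1.0 + xi * y / sigma
        if np.any(z <= 0):
            return np.inf
        return np.sum(np.log(sigma) + (1.0 + 1.0 / xi) * np.log(z))

    # Synthetic threshold exceedances whose scale slowly grows with time.
    rng = np.random.default_rng(1)
    t = np.linspace(0.0, 1.0, 300)
    y = rng.exponential(scale=np.exp(0.2 + 0.8 * t))

    fit = minimize(neg_log_lik, x0=[0.0, 0.0, 0.1], args=(y, t), method="Nelder-Mead")
    a_hat, b_hat, xi_hat = fit.x
    print("trend in log-scale:", round(b_hat, 2), " shape:", round(xi_hat, 2))
    # A trend credibly different from zero is the signal that a stationary IDF curve
    # would be inadequate for this series.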

  4. Two-slit experiment: quantum and classical probabilities

    International Nuclear Information System (INIS)

    Khrennikov, Andrei

    2015-01-01

    Inter-relation between quantum and classical probability models is one of the most fundamental problems of quantum foundations. Nowadays this problem also plays an important role in quantum technologies, in quantum cryptography and the theory of quantum random generators. In this letter, we compare the viewpoint of Richard Feynman that the behavior of quantum particles cannot be described by classical probability theory with the viewpoint that quantum–classical inter-relation is more complicated (cf., in particular, the tomographic model of quantum mechanics developed in detail by Vladimir Man'ko). As a basic example, we consider the two-slit experiment, which played a crucial role in quantum foundational debates at the beginning of quantum mechanics (QM). In particular, its analysis led Niels Bohr to the formulation of the principle of complementarity. First, we demonstrate that in complete accordance with Feynman's viewpoint, the probabilities for the two-slit experiment have a non-Kolmogorovian structure, since they violate one of the basic laws of classical probability theory, the law of total probability (the heart of the Bayesian analysis). However, then we show that these probabilities can be embedded in a natural way into the classical (Kolmogorov, 1933) probability model. To do this, one has to take into account the randomness of selection of different experimental contexts, the joint consideration of which led Feynman to a conclusion about the non-classicality of quantum probability. We compare this embedding of non-Kolmogorovian quantum probabilities into the Kolmogorov model with well-known embeddings of non-Euclidean geometries into Euclidean space (e.g., the Poincaré disk model for the Lobachevsky plane). (paper)

  5. Exact probability distribution function for the volatility of cumulative production

    Science.gov (United States)

    Zadourian, Rubina; Klümper, Andreas

    2018-04-01

    In this paper we study the volatility and its probability distribution function for the cumulative production based on the experience curve hypothesis. This work presents a generalization of the study of volatility in Lafond et al. (2017), which addressed the effects of normally distributed noise in the production process. Due to its wide applicability in industrial and technological activities we present here the mathematical foundation for an arbitrary distribution function of the process, which we expect will pave the way for future research on forecasting of the production process.

  6. Effect of the interaction among traps on the shape of thermoluminescence glow curves

    International Nuclear Information System (INIS)

    Marcazzo, J.; Santiago, M.; Spano, F.; Lester, M.; Ortega, F.; Molina, P.; Caselli, E.

    2007-01-01

    The effect of the interaction among traps on the structure of thermoluminescence glow curves has been investigated by generating numerically simulated glow curves for a wide range of trap parameters. The results reported in this paper provide useful insights which assist in the analysis of experimental glow curves. The most important result shows that it is incorrect to assume beforehand that each peak is related to a specific trapping state. The validity of the quasiequilibrium approximation is briefly discussed

  7. Particles and Dirac-type operators on curved spaces

    International Nuclear Information System (INIS)

    Visinescu, Mihai

    2003-01-01

    We review the geodesic motion of pseudo-classical particles in curved spaces. Investigating the generalized Killing equations for spinning spaces, we express the constants of motion in terms of Killing-Yano tensors. Passing from the spinning spaces to the Dirac equation in curved backgrounds we point out the role of the Killing-Yano tensors in the construction of the Dirac-type operators. The general results are applied to the case of the four-dimensional Euclidean Taub-Newman-Unti-Tamburino space. From the covariantly constant Killing-Yano tensors of this space we construct three new Dirac-type operators which are equivalent with the standard Dirac operator. Finally the Runge-Lenz operator for the Dirac equation in this background is expressed in terms of the fourth Killing-Yano tensor which is not covariantly constant. As a rule the covariantly constant Killing-Yano tensors realize certain square roots of the metric tensor. Such a Killing-Yano tensor produces simultaneously a Dirac-type operator and the generator of a one-parameter Lie group connecting this operator with the standard Dirac one. On the other hand, the not covariantly constant Killing-Yano tensors are important in generating hidden symmetries. The presence of not covariantly constant Killing-Yano tensors implies the existence of non-standard supersymmetries in point particle theories on curved background. (author)

  8. A study on the effect of flaw detection probability assumptions on risk reduction achieved by non-destructive inspection

    International Nuclear Information System (INIS)

    Cronvall, O.; Simola, K.; Männistö, I.; Gunnars, J.; Alverlind, L.; Dillström, P.; Gandossi, L.

    2012-01-01

    Leakages and ruptures of piping components lead to reduction or loss of the pressure retaining capability of the system, and thus contribute to the overall risk associated with nuclear power plants. In-service inspection (ISI) aims at verifying that defects are not present in components of the pressure boundary or, if defects are present, ensuring that these are detected before they affect the safe operation of the plant. Reliability estimates of piping are needed e.g., in probabilistic safety assessment (PSA) studies, risk-informed ISI (RI-ISI) applications, and other structural reliability assessments. Probabilistic fracture mechanics models can account for ISI reliability, but a quantitative estimate for the latter is needed. This is normally expressed in terms of probability of detection (POD) curves, which correlate the probability of detecting a flaw with flaw size. A detailed POD curve is often difficult (or practically impossible) to obtain. If sufficient risk reduction can be shown by using simplified (but reasonably conservative) POD estimates, more complex PODs are not needed. This paper summarises the results of a study on the effect of piping inspection reliability assumptions on failure probability using structural reliability models. The main interest was to investigate whether it is justifiable to use a simplified POD curve. Further, the study compared various structural reliability calculation approaches for a set of analysis cases. The results indicate that the use of a simplified POD could be justifiable in RI-ISI applications.
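    A minimal Monte Carlo sketch of how a POD curve enters such a failure-probability estimate: detected flaws are assumed repaired, undetected ones keep growing. The growth law, the POD shapes and all numbers below are illustrative placeholders, not the study's models.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(2)
    N = 200_000
    a0 = rng.lognormal(np.log(1.0), 0.6, N)          # initial flaw depth, mm
    growth = rng.lognormal(np.log(0.15), 0.3, N)     # growth rate, mm per year
    a_crit = 15.0                                    # depth treated as leak/failure

    def pod_detailed(a):                             # smooth lognormal-shaped POD
        return norm.cdf((np.log(a) - np.log(3.0)) / 0.5)

    def pod_simplified(a):                           # conservative step POD
        return np.where(a > 5.0, 0.8, 0.0)

    def leak_probability(pod, t_inspect=20.0, t_end=60.0):
        detected = rng.random(N) < pod(a0 + growth * t_inspect)   # repaired if found
        failed = (~detected) & (a0 + growth * t_end > a_crit)
        return failed.mean()

    print("no inspection :", ((a0 + growth * 60.0) > a_crit).mean())
    print("detailed POD  :", leak_probability(pod_detailed))
    print("simplified POD:", leak_probability(pod_simplified))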

  9. Semi-classical scalar propagators in curved backgrounds: formalism and ambiguities

    Energy Technology Data Exchange (ETDEWEB)

    Grain, J. [Laboratory for Subatomic Physics and Cosmology, Grenoble Universites, CNRS, IN2P3, 53, avenue de Martyrs, 38026 Grenoble cedex (France); AstroParticle and Cosmology, Universite Paris 7, CNRS, IN2P3, 10, rue Alice Domon et Leonie Duquet, 75205 Paris cedex 13 (France)]; Barrau, A. [Laboratory for Subatomic Physics and Cosmology, Grenoble Universites, CNRS, IN2P3, 53, avenue de Martyrs, 38026 Grenoble cedex (France)]

    2007-05-15

    The phenomenology of quantum systems in curved space-times is among the most fascinating fields of physics, allowing - often at the Gedanken experiment level - constraints on tentative theories of quantum gravity. Determining the dynamics of fields in curved backgrounds remains however a complicated task because of the highly intricate partial differential equations involved, especially when the space metric exhibits no symmetry. In this article, we provide - in a pedagogical way - a general formalism to determine this dynamics at the semi-classical order. To this purpose, a generic expression for the semi-classical propagator is computed and the equation of motion for the probability four-current is derived. Those results underline a direct analogy between the computation of the propagator in general relativistic quantum mechanics and the computation of the propagator for stationary systems in non-relativistic quantum mechanics. (authors)

  10. Semi-classical scalar propagators in curved backgrounds: formalism and ambiguities

    International Nuclear Information System (INIS)

    Grain, J.; Barrau, A.

    2007-05-01

    The phenomenology of quantum systems in curved space-times is among the most fascinating fields of physics, allowing - often at the Gedanken experiment level - constraints on tentative theories of quantum gravity. Determining the dynamics of fields in curved backgrounds remains however a complicated task because of the highly intricate partial differential equations involved, especially when the space metric exhibits no symmetry. In this article, we provide - in a pedagogical way - a general formalism to determine this dynamics at the semi-classical order. To this purpose, a generic expression for the semi-classical propagator is computed and the equation of motion for the probability four-current is derived. Those results underline a direct analogy between the computation of the propagator in general relativistic quantum mechanics and the computation of the propagator for stationary systems in non-relativistic quantum mechanics. (authors)

  11. FUZZY ACCEPTANCE SAMPLING AND CHARACTERISTIC CURVES

    Directory of Open Access Journals (Sweden)

    Ebru Turanoğlu

    2012-02-01

    Full Text Available Acceptance sampling is primarily used for the inspection of incoming or outgoing lots. Acceptance sampling refers to the application of specific sampling plans to a designated lot or sequence of lots. The parameters of acceptance sampling plans are sample sizes and acceptance numbers. In some cases, it may not be possible to define acceptance sampling parameters as crisp values. These parameters can be expressed by linguistic variables. The fuzzy set theory can be successfully used to cope with the vagueness in these linguistic expressions for acceptance sampling. In this paper, the main distributions of acceptance sampling plans are handled with fuzzy parameters and their acceptance probability functions are derived. Then the characteristic curves of acceptance sampling are examined under fuzziness. Illustrative examples are given.
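    For reference, the crisp operating characteristic (OC) curve behind the fuzzy treatment discussed above is just a binomial acceptance probability; letting the acceptance number range over an interval gives a simple band, a rough stand-in for the fuzzy-parameter curves rather than the paper's exact formulation.

    import numpy as np
    from scipy.stats import binom

    p = np.linspace(0.0, 0.15, 31)          # incoming lot fraction defective
    n = 80                                  # sample size (illustrative)
    accept_c2 = binom.cdf(2, n, p)          # accept if at most c = 2 defectives found
    accept_c3 = binom.cdf(3, n, p)          # upper edge if c is "about 2 to 3"

    for pi, lo, hi in zip(p[::6], accept_c2[::6], accept_c3[::6]):
        print(f"p = {pi:.2f}  P(accept) in [{lo:.3f}, {hi:.3f}]")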

  12. Knot probability of polygons subjected to a force: a Monte Carlo study

    International Nuclear Information System (INIS)

    Rensburg, E J Janse van; Orlandini, E; Tesi, M C; Whittington, S G

    2008-01-01

    We use Monte Carlo methods to study the knot probability of lattice polygons on the cubic lattice in the presence of an external force f. The force is coupled to the span of the polygons along a lattice direction, say the z-direction. If the force is negative polygons are squeezed (the compressive regime), while positive forces tend to stretch the polygons along the z-direction (the tensile regime). For sufficiently large positive forces we verify that the Pincus scaling law in the force-extension curve holds. At a fixed number of edges n the knot probability is a decreasing function of the force. For a fixed force the knot probability approaches unity as 1 − exp(−α₀(f)n + o(n)), where α₀(f) is positive and a decreasing function of f. We also examine the average of the absolute value of the writhe and we verify the square root growth law (known for f = 0) for all values of f

  13. Type Ia Supernova Light Curve Inference: Hierarchical Models for Nearby SN Ia in the Optical and Near Infrared

    Science.gov (United States)

    Mandel, Kaisey; Kirshner, R. P.; Narayan, G.; Wood-Vasey, W. M.; Friedman, A. S.; Hicken, M.

    2010-01-01

    I have constructed a comprehensive statistical model for Type Ia supernova light curves spanning optical through near infrared data simultaneously. The near infrared light curves are found to be excellent standard candles (σ(M_H) = 0.11 ± 0.03 mag) that are less vulnerable to systematic error from dust extinction, a major confounding factor for cosmological studies. A hierarchical statistical framework incorporates coherently multiple sources of randomness and uncertainty, including photometric error, intrinsic supernova light curve variations and correlations, dust extinction and reddening, peculiar velocity dispersion and distances, for probabilistic inference with Type Ia SN light curves. Inferences are drawn from the full probability density over individual supernovae and the SN Ia and dust populations, conditioned on a dataset of SN Ia light curves and redshifts. To compute probabilistic inferences with hierarchical models, I have developed BayeSN, a Markov Chain Monte Carlo algorithm based on Gibbs sampling. This code explores and samples the global probability density of parameters describing individual supernovae and the population. I have applied this hierarchical model to optical and near infrared data of over 100 nearby Type Ia SN from PAIRITEL, the CfA3 sample, and the literature. Using this statistical model, I find that SN with optical and NIR data have a smaller residual scatter in the Hubble diagram than SN with only optical data. The continued study of Type Ia SN in the near infrared will be important for improving their utility as precise and accurate cosmological distance indicators.
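    Not BayeSN itself, but a toy Gibbs sampler for the simplest ingredient of such a hierarchy: noisy peak magnitudes partially pooled toward a population mean with unknown spread. The priors and all numbers are assumptions for illustration.

    import numpy as np

    rng = np.random.default_rng(3)
    sigma_phot = 0.10                                   # photometric error (assumed)
    M_true = rng.normal(-18.3, 0.15, 40)                # latent absolute magnitudes
    y = M_true + rng.normal(0.0, sigma_phot, 40)        # observed peak magnitudes

    mu, tau2 = y.mean(), 0.05
    draws = []
    for _ in range(4000):
        # M_i | rest : normal-normal conjugate update
        v = 1.0 / (1.0 / sigma_phot**2 + 1.0 / tau2)
        M = rng.normal(v * (y / sigma_phot**2 + mu / tau2), np.sqrt(v))
        # mu | rest : flat prior on the population mean
        mu = rng.normal(M.mean(), np.sqrt(tau2 / M.size))
        # tau^2 | rest : weak inverse-gamma prior on the population variance
        b = 0.01 + 0.5 * np.sum((M - mu) ** 2)
        tau2 = 1.0 / rng.gamma(1.0 + M.size / 2.0, 1.0 / b)
        draws.append((mu, np.sqrt(tau2)))

    mu_s, tau_s = np.array(draws[1000:]).T
    print("posterior mean population magnitude:", round(mu_s.mean(), 3))
    print("posterior mean intrinsic scatter   :", round(tau_s.mean(), 3))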

  14. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  15. Stochastic geometry of critical curves, Schramm-Loewner evolutions and conformal field theory

    International Nuclear Information System (INIS)

    Gruzberg, Ilya A

    2006-01-01

    Conformally invariant curves that appear at critical points in two-dimensional statistical mechanics systems and their fractal geometry have received a lot of attention in recent years. On the one hand, Schramm (2000 Israel J. Math. 118 221 (Preprint math.PR/9904022)) has invented a new rigorous as well as practical calculational approach to critical curves, based on a beautiful unification of conformal maps and stochastic processes, and by now known as Schramm-Loewner evolution (SLE). On the other hand, Duplantier (2000 Phys. Rev. Lett. 84 1363; Fractal Geometry and Applications: A Jubilee of Benoît Mandelbrot: Part 2 (Proc. Symp. Pure Math. vol 72) (Providence, RI: American Mathematical Society) p 365 (Preprint math-ph/0303034)) has applied boundary quantum gravity methods to calculate exact multifractal exponents associated with critical curves. In the first part of this paper, I provide a pedagogical introduction to SLE. I present mathematical facts from the theory of conformal maps and stochastic processes related to SLE. Then I review basic properties of SLE and provide practical derivation of various interesting quantities related to critical curves, including fractal dimensions and crossing probabilities. The second part of the paper is devoted to a way of describing critical curves using boundary conformal field theory (CFT) in the so-called Coulomb gas formalism. This description provides an alternative (to quantum gravity) way of obtaining the multifractal spectrum of critical curves using only traditional methods of CFT based on free bosonic fields

  16. The Use of Statistically Based Rolling Supply Curves for Electricity Market Analysis: A Preliminary Look

    Energy Technology Data Exchange (ETDEWEB)

    Jenkin, Thomas J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Larson, Andrew [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Ruth, Mark F [National Renewable Energy Laboratory (NREL), Golden, CO (United States); King, Ben [U.S. Department of Energy; Spitsen, Paul [U.S. Department of Energy

    2018-03-27

    In light of the changing electricity resource mixes across the United States, an important question in electricity modeling is how additions and retirements of generation, including additions in variable renewable energy (VRE) generation could impact markets by changing hourly wholesale energy prices. Instead of using resource-intensive production cost models (PCMs) or building and using simple generator supply curves, this analysis uses a 'top-down' approach based on regression analysis of hourly historical energy and load data to estimate the impact of supply changes on wholesale electricity prices, provided the changes are not so substantial that they fundamentally alter the market and dispatch-order driven behavior of non-retiring units. The rolling supply curve (RSC) method used in this report estimates the shape of the supply curve that fits historical hourly price and load data for given time intervals, such as two-weeks, and then repeats this on a rolling basis through the year. These supply curves can then be modified on an hourly basis to reflect the impact of generation retirements or additions, including VRE and then reapplied to the same load data to estimate the change in hourly electricity price. The choice of duration over which these RSCs are estimated has a significant impact on goodness of fit. For example, in PJM in 2015, moving from fitting one curve per year to 26 rolling two-week supply curves improves the standard error of the regression from 16 dollars/MWh to 6 dollars/MWh and the R-squared of the estimate from 0.48 to 0.76. We illustrate the potential use and value of the RSC method by estimating wholesale price effects under various generator retirement and addition scenarios, and we discuss potential limits of the technique, some of which are inherent. The ability to do this type of analysis is important to a wide range of market participants and other stakeholders, and it may have a role in complementing use of or providing
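    A sketch of the rolling-supply-curve idea on synthetic data: fit a simple price-versus-load curve inside each two-week window (the log-linear form here is an assumption, not the report's specification), then shift the hourly load by new zero-marginal-cost supply and re-price.

    import numpy as np

    rng = np.random.default_rng(4)
    hours = 24 * 364
    load = 60 + 20 * np.sin(2 * np.pi * np.arange(hours) / 24) + rng.normal(0, 3, hours)
    price = np.clip(np.exp(1.0 + 0.035 * load) + rng.normal(0, 3, hours), 1.0, None)

    window = 24 * 14                        # two-week rolling windows
    vre_addition = 5.0                      # GW of added zero-marginal-cost generation
    new_price = np.empty(hours)
    for start in range(0, hours, window):
        sl = slice(start, start + window)
        b1, b0 = np.polyfit(load[sl], np.log(price[sl]), 1)   # log(price) = b0 + b1*load
        new_price[sl] = np.exp(b0 + b1 * (load[sl] - vre_addition))

    print(f"average price before: {price.mean():.2f} $/MWh")
    print(f"average price after : {new_price.mean():.2f} $/MWh")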

  17. Comparison of ASME pressure–temperature limits on the fracture probability for a pressurized water reactor pressure vessel

    International Nuclear Information System (INIS)

    Chou, Hsoung-Wei; Huang, Chin-Cheng

    2017-01-01

    Highlights: • P-T limits based on ASME K_Ia curve, K_IC curve and RI method are presented. • Probabilistic and deterministic methods are used to evaluate P-T limits on RPV. • The feasibility of substituting P-T curves with more operational flexibility is demonstrated. • Warm-prestressing effect is critical in determining the fracture probability. - Abstract: The ASME Code Section XI-Appendix G defines the normal reactor startup (heat-up) and shut-down (cool-down) operation limits according to the fracture toughness requirement of reactor pressure vessel (RPV) materials. This paper investigates the effects of different pressure-temperature limit operations on structural integrity of a Taiwan domestic pressurized water reactor (PWR) pressure vessel. Three kinds of pressure-temperature limits based on different fracture toughness requirements – the K_Ia fracture toughness curve of ASME Section XI-Appendix G before 1998 editions, the K_IC fracture toughness curve of ASME Section XI-Appendix G after 2001 editions, and the risk-informed revision method supplemented in ASME Section XI-Appendix G after 2013 editions, respectively, are established as the loading conditions. A series of probabilistic fracture mechanics analyses for the RPV are conducted employing ORNL’s FAVOR code considering various radiation embrittlement levels under these pressure-temperature limit conditions. It is found that the pressure-temperature operation limits which provide more operational flexibility may lead to higher fracture risks to the RPV. The cladding-induced shallow surface breaking flaws are the most critical and dominate the fracture probability of the RPV under pressure-temperature limit transients. The present study provides a risk-informed reference for the operation safety and regulation viewpoint of PWRs in Taiwan.

  18. Risk estimation using probability machines

    Science.gov (United States)

    2014-01-01

    Background Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306

  19. Construction of long-term isochronous stress-strain curves by a modeling of short-term creep curves for a Grade 9Cr-1Mo steel

    International Nuclear Information System (INIS)

    Kim, Woo-Gon; Yin, Song-Nan; Koo, Gyeong-Hoi

    2009-01-01

    This study dealt with the construction of long-term isochronous stress-strain curves (ISSC) by a modeling of short-term creep curves for a Grade 9Cr-1Mo steel (G91), which is a candidate material for structural applications in next generation nuclear reactors as well as in fusion reactors. To do this, the tensile material data used in the inelastic constitutive equations were obtained by tensile tests at 550 °C. Creep curves were obtained by a series of creep tests at stress levels from 300 MPa to 220 MPa at an identical controlled temperature of 550 °C. On the basis of these experimental data, the creep curves were characterized by Garofalo's creep model. The three parameters P1, P2 and P3 in Garofalo's model were properly optimized by a nonlinear least square fitting (NLSF) analysis. The stress dependency of the three parameters was found to be a linear relationship. However, the P3 parameter, representing the steady state creep rate, exhibited a two-slope behavior with different stress exponents at a transient stress of about 250 MPa. The long-term creep curves of the G91 steel were modeled by Garofalo's model with only a few short-term creep data. Using the modeled creep curves, the long-term isochronous curves up to 10⁵ hours were successfully constructed. (author)
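    A hedged sketch of the fitting-and-extrapolation step: a Garofalo-type creep curve, here taken as epsilon(t) = e0 + P1*(1 - exp(-P2*t)) + P3*t with the P1-P3 naming borrowed from the abstract (the authors' exact form is not quoted), is fitted to short-term data and then evaluated at long times.

    import numpy as np
    from scipy.optimize import curve_fit

    def garofalo(t, e0, p1, p2, p3):
        return e0 + p1 * (1.0 - np.exp(-p2 * t)) + p3 * t

    # Synthetic "short-term" creep data up to 2000 h at a single stress level.
    rng = np.random.default_rng(5)
    t_short = np.linspace(0.0, 2000.0, 60)
    eps = garofalo(t_short, 1e-3, 4e-3, 5e-3, 2e-6) + rng.normal(0, 5e-5, t_short.size)

    popt, _ = curve_fit(garofalo, t_short, eps, p0=[1e-3, 1e-3, 1e-3, 1e-6])
    for t_long in (1e3, 1e4, 1e5):
        print(f"strain at {t_long:8.0f} h: {garofalo(t_long, *popt):.4f}")
    # Repeating the fit at several stress levels and reading off the strain at a fixed
    # time gives the points of one isochronous stress-strain curve per time.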

  20. Digital dice computational solutions to practical probability problems

    CERN Document Server

    Nahin, Paul J

    2013-01-01

    Some probability problems are so difficult that they stump the smartest mathematicians. But even the hardest of these problems can often be solved with a computer and a Monte Carlo simulation, in which a random-number generator simulates a physical process, such as a million rolls of a pair of dice. This is what Digital Dice is all about: how to get numerical answers to difficult probability problems without having to solve complicated mathematical equations. Popular-math writer Paul Nahin challenges readers to solve twenty-one difficult but fun problems, from determining the

  1. The probability of traffic accidents associated with the transport of radioactive wastes

    International Nuclear Information System (INIS)

    James, I.A.

    1986-01-01

    This report evaluates the probability of a container impact during transit between generating and disposal sites. Probabilities per route mile are combined with the characteristics of the transport systems described in previous reports, to allow a comparison of different disposal options to be made. (author)

  2. Proposed Spontaneous Generation of Magnetic Fields by Curved Layers of a Chiral Superconductor

    Science.gov (United States)

    Kvorning, T.; Hansson, T. H.; Quelle, A.; Smith, C. Morais

    2018-05-01

    We demonstrate that two-dimensional chiral superconductors on curved surfaces spontaneously develop magnetic flux. This geometric Meissner effect provides an unequivocal signature of chiral superconductivity, which could be observed in layered materials under stress. We also employ the effect to explain some puzzling questions related to the location of zero-energy Majorana modes.

  3. Model-assisted probability of detection of flaws in aluminum blocks using polynomial chaos expansions

    Science.gov (United States)

    Du, Xiaosong; Leifsson, Leifur; Grandin, Robert; Meeker, William; Roberts, Ronald; Song, Jiming

    2018-04-01

    Probability of detection (POD) is widely used for measuring reliability of nondestructive testing (NDT) systems. Typically, POD is determined experimentally, while it can be enhanced by utilizing physics-based computational models in combination with model-assisted POD (MAPOD) methods. With the development of advanced physics-based methods, such as ultrasonic NDT testing, the empirical information, needed for POD methods, can be reduced. However, performing accurate numerical simulations can be prohibitively time-consuming, especially as part of stochastic analysis. In this work, stochastic surrogate models for computational physics-based measurement simulations are developed for cost savings of MAPOD methods while simultaneously ensuring sufficient accuracy. The stochastic surrogate is used to propagate the random input variables through the physics-based simulation model to obtain the joint probability distribution of the output. The POD curves are then generated based on those results. Here, the stochastic surrogates are constructed using non-intrusive polynomial chaos (NIPC) expansions. In particular, the NIPC methods used are the quadrature, ordinary least-squares (OLS), and least-angle regression sparse (LARS) techniques. The proposed approach is demonstrated on the ultrasonic testing simulation of a flat bottom hole flaw in an aluminum block. The results show that the stochastic surrogates have at least two orders of magnitude faster convergence on the statistics than direct Monte Carlo sampling (MCS). Moreover, the evaluation of the stochastic surrogate models is over three orders of magnitude faster than the underlying simulation model for this case, which is the UTSim2 model.
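    A scalar toy version of the ordinary-least-squares NIPC step described above: expand a stand-in "simulation" response in Hermite polynomials of a standard-normal nuisance input, then reuse the cheap surrogate for the Monte Carlo sweep that yields a POD curve. The response model and detection threshold are invented.

    import numpy as np
    from numpy.polynomial.hermite_e import hermevander

    def expensive_model(flaw_size, xi):
        """Stand-in for the physics-based UT simulation (xi = normalized nuisance)."""
        return flaw_size * np.exp(0.3 * xi) + 0.2

    train_xi = np.random.default_rng(6).normal(size=40)     # a few "expensive" runs
    V_train = hermevander(train_xi, 4)                      # He_0..He_4 design matrix

    mc_xi = np.random.default_rng(7).normal(size=100_000)   # cheap Monte Carlo sweep
    V_mc = hermevander(mc_xi, 4)
    threshold = 1.0                                         # detection threshold

    for a in np.linspace(0.5, 2.0, 4):
        coeffs, *_ = np.linalg.lstsq(V_train, expensive_model(a, train_xi), rcond=None)
        pod = np.mean(V_mc @ coeffs > threshold)            # POD at this flaw size
        print(f"flaw size {a:.2f} -> POD {pod:.3f}")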

  4. EVEREST: Pixel Level Decorrelation of K2 Light Curves

    Science.gov (United States)

    Luger, Rodrigo; Agol, Eric; Kruse, Ethan; Barnes, Rory; Becker, Andrew; Foreman-Mackey, Daniel; Deming, Drake

    2016-10-01

    We present EPIC Variability Extraction and Removal for Exoplanet Science Targets (EVEREST), an open-source pipeline for removing instrumental noise from K2 light curves. EVEREST employs a variant of pixel level decorrelation to remove systematics introduced by the spacecraft’s pointing error and a Gaussian process to capture astrophysical variability. We apply EVEREST to all K2 targets in campaigns 0-7, yielding light curves with precision comparable to that of the original Kepler mission for stars brighter than Kp ≈ 13, and within a factor of two of the Kepler precision for fainter targets. We perform cross-validation and transit injection and recovery tests to validate the pipeline, and compare our light curves to the other de-trended light curves available for download at the MAST High Level Science Products archive. We find that EVEREST achieves the highest average precision of any of these pipelines for unsaturated K2 stars. The improved precision of these light curves will aid in exoplanet detection and characterization, investigations of stellar variability, asteroseismology, and other photometric studies. The EVEREST pipeline can also easily be applied to future surveys, such as the TESS mission, to correct for instrumental systematics and enable the detection of low signal-to-noise transiting exoplanets. The EVEREST light curves and the source code used to generate them are freely available online.

  5. A Probability Distribution over Latent Causes, in the Orbitofrontal Cortex.

    Science.gov (United States)

    Chan, Stephanie C Y; Niv, Yael; Norman, Kenneth A

    2016-07-27

    The orbitofrontal cortex (OFC) has been implicated in both the representation of "state," in studies of reinforcement learning and decision making, and also in the representation of "schemas," in studies of episodic memory. Both of these cognitive constructs require a similar inference about the underlying situation or "latent cause" that generates our observations at any given time. The statistically optimal solution to this inference problem is to use Bayes' rule to compute a posterior probability distribution over latent causes. To test whether such a posterior probability distribution is represented in the OFC, we tasked human participants with inferring a probability distribution over four possible latent causes, based on their observations. Using fMRI pattern similarity analyses, we found that BOLD activity in the OFC is best explained as representing the (log-transformed) posterior distribution over latent causes. Furthermore, this pattern explained OFC activity better than other task-relevant alternatives, such as the most probable latent cause, the most recent observation, or the uncertainty over latent causes. Our world is governed by hidden (latent) causes that we cannot observe, but which generate the observations we see. A range of high-level cognitive processes require inference of a probability distribution (or "belief distribution") over the possible latent causes that might be generating our current observations. This is true for reinforcement learning and decision making (where the latent cause comprises the true "state" of the task), and for episodic memory (where memories are believed to be organized by the inferred situation or "schema"). Using fMRI, we show that this belief distribution over latent causes is encoded in patterns of brain activity in the orbitofrontal cortex, an area that has been separately implicated in the representations of both states and schemas. Copyright © 2016 the authors 0270-6474/16/367817-12$15.00/0.
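    The inference described above reduces to an iterated Bayes' rule update over a small set of latent causes; a minimal worked example with an invented likelihood table follows.

    import numpy as np

    # P(observation | cause): rows = 4 latent causes, columns = 3 possible cues.
    likelihood = np.array([
        [0.7, 0.2, 0.1],
        [0.1, 0.8, 0.1],
        [0.3, 0.3, 0.4],
        [0.1, 0.1, 0.8],
    ])
    posterior = np.full(4, 0.25)              # uniform prior over the four causes

    for obs in [0, 0, 2, 0]:                  # a sequence of observed cues
        posterior = posterior * likelihood[:, obs]
        posterior /= posterior.sum()          # Bayes' rule normalisation
        print(np.round(posterior, 3))
    # The fMRI analysis asks whether OFC activity tracks the (log-transformed)
    # version of this evolving belief distribution.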

  6. Consistent cost curves for identification of optimal energy savings across industry and residential sectors

    DEFF Research Database (Denmark)

    Klinge Jacobsen, Henrik; Baldini, Mattia

    A number of issues arise with constructing and applying the cost curves in modelling: • Cost curves do not have the same cost interpretation across economic subsectors and end-use technologies (investment cost for equipment varies – including/excluding installation – adaptation costs – indirect production costs) • The time issue of when the costs are incurred and savings (difference in discount rates, both private and social) • The issue of marginal investment in a case of replacement anyway versus a full investment in the energy saving technology • Implementation costs (and the probability of investment) differ across sectors • Cost saving options are not additive, meaning that the marginal energy savings from one option depend on what other options are implemented. We address the importance of these issues and illustrate with Danish cases how large the difference in savings cost curves can be if different methodologies are used. For example...

  7. Probability matching and strategy availability.

    Science.gov (United States)

    Koehler, Derek J; James, Greta

    2010-09-01

    Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought to their attention, more participants subsequently engage in maximizing. Third, matchers are more likely than maximizers to base decisions in other tasks on their initial intuitions, suggesting that they are more inclined to use a choice strategy that comes to mind quickly. These results indicate that a substantial subset of probability matchers are victims of "underthinking" rather than "overthinking": They fail to engage in sufficient deliberation to generate a superior alternative to the matching strategy that comes so readily to mind.

  8. Comparison Algorithm Kernels on Support Vector Machine (SVM To Compare The Trend Curves with Curves Online Forex Trading

    Directory of Open Access Journals (Sweden)

    Irfan Abbas

    2017-01-01

    Full Text Available At present, forex traders generally still rely on exchange-rate figures obtained from different sources. They therefore only know the rate prevailing at a given moment, which makes it difficult to analyze or predict future exchange rate movements. Forex traders usually use indicators to help them analyze and predict future values; an indicator is a decision-making tool. Forex trading is the trading of one country's currency against another country's currency. Trading takes place globally between the financial centers of the world, with the world's major banks handling the major transactions. Forex trading offers a potentially profitable type of investment with small capital and high profit, since relatively small capital can earn multiplied profits. This is due to the leverage present in forex trading systems, by which the invested capital is multiplied if the predicted buy/sell decision is accurate; however, forex trading carries a high level of risk, and losses can only be avoided by knowing the right time to trade (buy or sell). Traders who invest in the foreign exchange market are expected to have the ability to analyze circumstances and situations when predicting differences in currency exchange rates. Forex price movements that form patterns (curves moving up and down) greatly assist traders in making decisions, and the movement of the curve is used as an indicator in the decision to buy or sell. This study compares kernel types for the Support Vector Machine (SVM) in predicting the movement of the curve in live-time forex trading using GBPUSD data on the 1H timeframe. From the results and discussion it can be concluded that the Dot, Multiquadric and Neural kernels are not appropriate for the non-linear forex data when following the pattern of the trend curves, because the curves they generate are linear (straight), and the kernel type giving the closest curve
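    One compact way to run the kind of kernel comparison described above is scikit-learn's SVR on a lagged price series; the GBPUSD data are not reproduced here, so a synthetic series and an arbitrary kernel list stand in.

    import numpy as np
    from sklearn.svm import SVR
    from sklearn.metrics import mean_squared_error

    rng = np.random.default_rng(8)
    t = np.arange(1200)
    price = 1.30 + 0.02 * np.sin(t / 50.0) + np.cumsum(rng.normal(0, 5e-4, t.size))

    lags = 5                                   # predict the next close from 5 lags
    X = np.column_stack([price[i:len(price) - lags + i] for i in range(lags)])
    y = price[lags:]
    X_train, X_test, y_train, y_test = X[:900], X[900:], y[:900], y[900:]

    for kernel in ["linear", "rbf", "poly", "sigmoid"]:
        model = SVR(kernel=kernel, C=10.0, epsilon=1e-4).fit(X_train, y_train)
        mse = mean_squared_error(y_test, model.predict(X_test))
        print(f"{kernel:8s} test MSE: {mse:.2e}")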

  9. INVESTIGATION OF CURVES SET BY CUBIC DISTRIBUTION OF CURVATURE

    Directory of Open Access Journals (Sweden)

    S. A. Ustenko

    2014-03-01

    Full Text Available Purpose. Further development of the geometric modeling of curvilinear contours of different objects based on a specified cubic curvature distribution and set values of curvature at the boundary points. Methodology. We investigate a flat section of a curvilinear contour generated under the condition that a cubic curvature distribution is prescribed. The curve begins and ends at given points, where the tangent slope angles and curvatures are also specified. The curvature equation of this curve was obtained as a function of the section length and the coefficient c of the cubic curvature distribution, and the resulting equation was analysed. The conditions under which inflection points of the curve appear were also investigated. One should find an interval of parameter change (depending on the input data and the section length) that places the inflection point of the curvature graph outside the borders of the curve section. The dependence of the tangent slope angle to the curve at an arbitrary point was determined, and recommendations were given for solving the system of integral equations that yields the length of the curve section and the coefficient c of the cubic curvature distribution. Findings. As a result of the investigation it is found that the absence of inflection points of the curvature on the considered section can be taken as the criterion for selecting curves. Analysis of the influence of the parameter c on the graph of the tangent slope angle to the curve showed that, regardless of its value, the same rate of increase of the tangent slope angle is provided. Originality. The approach to geometric modeling of curves based on a cubic curvature distribution with given values at the boundary points is improved by eliminating inflection points from the considered section of curvilinear contours. Practical value. Curves obtained using the proposed method can be used for geometric modeling of curvilinear

  10. Nonlinear Dynamic of Curved Railway Tracks in Three-Dimensional Space

    Science.gov (United States)

    Liu, X.; Ngamkhanong, C.; Kaewunruen, S.

    2017-12-01

    On curved tracks, high-pitch noise pollution can often be a considerable concern for rail asset owners, commuters, and people living or working along the rail corridor. Inevitably, the wheel/rail interface can create a traveling source of sound and vibration, which spreads over a long distance of the rail network. The sound and vibration can take various forms and spectra. The undesirable sound and vibration on curves, often called ‘noise’, includes flanging and squealing noises. This paper focuses on the squeal noise phenomena on curved tracks located in urban environments. It highlights the effect of curve radii on lateral track dynamics. It is important to note that rail freight curve noises, especially curve squeals, can be observed almost everywhere and on every type of track structure. The most pressing noise appears on sharper curved tracks where excessive lateral wheel/rail dynamics resonate with falling friction states, generating a tonal noise problem, the so-called ‘squeal’. Many researchers have carried out measurements and simulations to understand the actual root causes of the squeal noise. Most researchers believe that wheel resonance over falling friction is the main cause, whilst a few others think that dynamic mode coupling of wheel and rail may also cause the squeal. Therefore, this paper is devoted to a systems-thinking approach and dynamic assessment for resolving railway curve noise problems. Simulations of railway tracks with different curve radii are carried out to develop state-of-the-art understanding of lateral track dynamics, including rail dynamics, cant dynamics, gauge dynamics and overall track responses.

  11. Measurement of D-T neutron penetration probability spectra for iron ball shell systems

    International Nuclear Information System (INIS)

    Duan Shaojie

    1998-06-01

    The D-T neutron penetration probability spectra were measured for iron ball shell systems for the series of samples used in the experiments, and the penetration curves are presented. Because the detector was close to the samples, the measured results were approximately corrected and then compared with those in the literature; the former are shown to be compatible with the latter within the range of the experimental error.

  12. Probability of failure prediction for step-stress fatigue under sine or random stress

    Science.gov (United States)

    Lambert, R. G.

    1979-01-01

    A previously proposed cumulative fatigue damage law is extended to predict the probability of failure or fatigue life for structural materials with S-N fatigue curves represented as a scatterband of failure points. The proposed law applies to structures subjected to sinusoidal or random stresses and includes the effect of initial crack (i.e., flaw) sizes. The corrected cycle ratio damage function is shown to have physical significance.

  13. An empirical probability model of detecting species at low densities.

    Science.gov (United States)

    Delaney, David G; Leung, Brian

    2010-06-01

    False negatives, not detecting things that are actually present, are an important but understudied problem. False negatives are the result of our inability to perfectly detect species, especially those at low density such as endangered species or newly arriving introduced species. They reduce our ability to interpret presence-absence survey data and make sound management decisions (e.g., rapid response). To reduce the probability of false negatives, we need to compare the efficacy and sensitivity of different sampling approaches and quantify an unbiased estimate of the probability of detection. We conducted field experiments in the intertidal zone of New England and New York to test the sensitivity of two sampling approaches (quadrat vs. total area search, TAS), given different target characteristics (mobile vs. sessile). Using logistic regression we built detection curves for each sampling approach that related the sampling intensity and the density of targets to the probability of detection. The TAS approach reduced the probability of false negatives and detected targets faster than the quadrat approach. Mobility of targets increased the time to detection but did not affect detection success. Finally, we interpreted two years of presence-absence data on the distribution of the Asian shore crab (Hemigrapsus sanguineus) in New England and New York, using our probability model for false negatives. The type of experimental approach in this paper can help to reduce false negatives and increase our ability to detect species at low densities by refining sampling approaches, which can guide conservation strategies and management decisions in various areas of ecology such as conservation biology and invasion ecology.
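    A sketch of the kind of logistic-regression detection curve described above, on synthetic survey data: detection (0/1) is modelled as a function of search effort and target density, and the fitted curve gives the probability of detection (and hence of a false negative) at any effort. Variables and coefficients are invented.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(9)
    n = 2000
    effort = rng.uniform(1, 60, n)                    # minutes searched per plot
    density = rng.uniform(0.1, 5.0, n)                # targets per square metre
    logit = -3.0 + 0.06 * effort + 0.9 * density      # "true" detection process
    detected = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

    model = LogisticRegression().fit(np.column_stack([effort, density]), detected)

    # Detection curve versus effort at a fixed low density of 0.5 targets per m^2.
    grid = np.column_stack([np.linspace(1, 60, 5), np.full(5, 0.5)])
    for e, p in zip(grid[:, 0], model.predict_proba(grid)[:, 1]):
        print(f"effort {e:5.1f} min -> P(detect) = {p:.2f}  (false negative: {1 - p:.2f})")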

  14. Rejecting probability summation for radial frequency patterns, not so Quick!

    Science.gov (United States)

    Baldwin, Alex S; Schmidtmann, Gunnar; Kingdom, Frederick A A; Hess, Robert F

    2016-05-01

    Radial frequency (RF) patterns are used to assess how the visual system processes shape. They are thought to be detected globally. This is supported by studies that have found summation for RF patterns to be greater than what is possible if the parts were being independently detected and performance only then improved with an increasing number of cycles by probability summation between them. However, the model of probability summation employed in these previous studies was based on High Threshold Theory (HTT), rather than Signal Detection Theory (SDT). We conducted rating scale experiments to investigate the receiver operating characteristics. We find these are of the curved form predicted by SDT, rather than the straight lines predicted by HTT. This means that to test probability summation we must use a model based on SDT. We conducted a set of summation experiments finding that thresholds decrease as the number of modulated cycles increases at approximately the same rate as previously found. As this could be consistent with either additive or probability summation, we performed maximum-likelihood fitting of a set of summation models (Matlab code provided in our Supplementary material) and assessed the fits using cross validation. We find we are not able to distinguish whether the responses to the parts of an RF pattern are combined by additive or probability summation, because the predictions are too similar. We present similar results for summation between separate RF patterns, suggesting that the summation process there may be the same as that within a single RF. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Reliability of the emergency diesel generator

    Energy Technology Data Exchange (ETDEWEB)

    Verstegen, C.; Kotthoff, K. [Gesellschaft fuer Reaktorsicherheit - GRS mbH, Schwertnergasse 1, D-5000 Koeln 1, Cologne (Germany)

    1986-02-15

    The paper deals with a statistical investigation of the availability of diesel generators which has been performed recently. The investigation is based on the operating experience of a total of 40 diesel generators in 10 German NPPs. Both the unavailability of the diesel generators due to failures and that due to maintenance and repair have been considered. The probability of diesel failure during start and short-time operation amounts to about 8 x 10⁻³ per demand. The probability of common mode failures is approximately one order of magnitude smaller. The influence of various parameters on the failure probability is discussed; a statistically significant dependence could not be identified. In addition, the investigation shows that the unavailability of the diesel generators due to maintenance and repair is of about the same order of magnitude as the probability of diesel failures. (authors)

  16. Contributions to the R-curve behaviour of ceramic materials

    International Nuclear Information System (INIS)

    Fett, T.

    1994-12-01

    Several ceramic materials show an increase in crack growth resistance with increasing crack extension. In the case of coarse-grained alumina in particular, this ''R-curve effect'' is caused by crack-face interactions in the wake of the advancing crack. Similar effects occur for whisker-reinforced ceramics. Due to the crack-face interactions, so-called ''bridging stresses'' are generated which transfer forces between the two crack surfaces. A second reason for an increase of crack-growth resistance is stress-induced phase transformation in zirconia ceramics, with the tetragonal phase changing to the monoclinic phase. These transformations affect the stress field in the surroundings of crack tips. The transformation generates a crack-tip transformation zone and, due to the stress balance, also residual stresses in the whole crack region, which result in a residual stress intensity factor. This additional stress intensity factor is also a reason for the R-curve behaviour. In this report both effects are outlined in detail. (orig.) [de]

  17. Refinement of a Method for Identifying Probable Archaeological Sites from Remotely Sensed Data

    Science.gov (United States)

    Tilton, James C.; Comer, Douglas C.; Priebe, Carey E.; Sussman, Daniel; Chen, Li

    2012-01-01

    To facilitate locating archaeological sites before they are compromised or destroyed, we are developing approaches for generating maps of probable archaeological sites, through detecting subtle anomalies in vegetative cover, soil chemistry, and soil moisture by analyzing remotely sensed data from multiple sources. We previously reported some success in this effort with a statistical analysis of slope, radar, and Ikonos data (including tasseled cap and NDVI transforms) with Student's t-test. We report here on new developments in our work, performing an analysis of 8-band multispectral Worldview-2 data. The Worldview-2 analysis begins by computing medians and median absolute deviations for the pixels in various annuli around each site of interest on the 28 band difference ratios. We then use principal components analysis followed by linear discriminant analysis to train a classifier which assigns a posterior probability that a location is an archaeological site. We tested the procedure using leave-one-out cross validation with a second leave-one-out step to choose parameters on a 9,859x23,000 subset of the WorldView-2 data over the western portion of Ft. Irwin, CA, USA. We used 100 known non-sites and trained one classifier for lithic sites (n=33) and one classifier for habitation sites (n=16). We then analyzed convex combinations of scores from the Archaeological Predictive Model (APM) and our scores. We found that the combined scores had a higher area under the ROC curve than either individual method, indicating that including WorldView-2 data in analysis improved the predictive power of the provided APM.
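    A sketch of the classification chain described above on synthetic features: PCA for dimension reduction, LDA for a posterior site probability, and a convex combination with an external predictive-model (APM) score evaluated by ROC AUC. The data are invented, and the AUC here is resubstitution rather than the double leave-one-out validation used by the authors.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(10)
    n_sites, n_back, n_feat = 33, 100, 28              # counts echo the lithic-site case
    X = np.vstack([rng.normal(0.4, 1.0, (n_sites, n_feat)),   # sites (shifted features)
                   rng.normal(0.0, 1.0, (n_back, n_feat))])   # known non-sites
    y = np.r_[np.ones(n_sites), np.zeros(n_back)]
    apm = np.clip(0.5 * y + rng.uniform(0, 0.7, y.size), 0, 1)  # stand-in APM scores

    pca = PCA(n_components=10).fit(X)
    lda = LinearDiscriminantAnalysis().fit(pca.transform(X), y)
    p_site = lda.predict_proba(pca.transform(X))[:, 1]          # posterior site probability

    for w in (0.0, 0.5, 1.0):                           # convex combinations of the scores
        combined = w * p_site + (1 - w) * apm
        print(f"weight on image score {w:.1f}: AUC = {roc_auc_score(y, combined):.3f}")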

  18. Logging and Agricultural Residue Supply Curves for the Pacific Northwest

    Energy Technology Data Exchange (ETDEWEB)

    Kerstetter, James D.; Lyons, John Kim

    2001-01-01

    This report quantified the volume of logging residues at the county level for current timber harvests. The cost of recovering logging residues was determined for skidding, yarding, loading, chipping, and transporting the residues. Supply curves were developed for ten candidate conversion sites in the Pacific Northwest region. Agricultural field residues were also quantified at the county level using five-year average crop yields. Agronomic constraints were applied to arrive at the volumes available for energy use. Collection and transportation costs were determined, and supply curves were generated for thirteen candidate conversion sites.

  19. Toward a generalized probability theory: conditional probabilities

    International Nuclear Information System (INIS)

    Cassinelli, G.

    1979-01-01

    The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)

  20. Flow characteristics of curved ducts

    Directory of Open Access Journals (Sweden)

    Rudolf P.

    2007-10-01

    Full Text Available Curved channels are very often present in real hydraulic systems, e.g. curved diffusers of hydraulic turbines, S-shaped bulb turbines, fittings, etc. Curvature brings a change of velocity profile, generation of vortices and production of hydraulic losses. Flow simulations using CFD techniques were performed to understand these phenomena. Cases ranging from a single elbow to coupled elbows in U, S and spatial right-angle configurations with circular cross-section were modeled for Re = 60000. Spatial development of the flow was studied, and it was deduced that minor losses are connected with the transformation of pressure energy into kinetic energy and vice versa. This transformation is a dissipative process and is reflected in the amount of energy irreversibly lost. The smallest loss coefficient is associated with flow in U-shaped elbows, the largest with flow in S-shaped elbows. Finally, the extent of the flow domain influenced by the presence of curvature was examined. This is important for the proper placement of manometers and flowmeters during experimental tests. Simulations were verified against experimental results presented in the literature.

  1. TYPE Ia SUPERNOVA LIGHT-CURVE INFERENCE: HIERARCHICAL BAYESIAN ANALYSIS IN THE NEAR-INFRARED

    International Nuclear Information System (INIS)

    Mandel, Kaisey S.; Friedman, Andrew S.; Kirshner, Robert P.; Wood-Vasey, W. Michael

    2009-01-01

    We present a comprehensive statistical analysis of the properties of Type Ia supernova (SN Ia) light curves in the near-infrared using recent data from Peters Automated InfraRed Imaging TELescope and the literature. We construct a hierarchical Bayesian framework, incorporating several uncertainties including photometric error, peculiar velocities, dust extinction, and intrinsic variations, for principled and coherent statistical inference. SN Ia light-curve inferences are drawn from the global posterior probability of parameters describing both individual supernovae and the population conditioned on the entire SN Ia NIR data set. The logical structure of the hierarchical model is represented by a directed acyclic graph. Fully Bayesian analysis of the model and data is enabled by an efficient Markov Chain Monte Carlo algorithm exploiting the conditional probabilistic structure using Gibbs sampling. We apply this framework to the JHK_s SN Ia light-curve data. A new light-curve model captures the observed J-band light-curve shape variations. The marginal intrinsic variances in peak absolute magnitudes are σ(M_J) = 0.17 ± 0.03, σ(M_H) = 0.11 ± 0.03, and σ(M_Ks) = 0.19 ± 0.04. We describe the first quantitative evidence for correlations between the NIR absolute magnitudes and J-band light-curve shapes, and demonstrate their utility for distance estimation. The average residual in the Hubble diagram for the training set SNe at cz > 2000 km s^-1 is 0.10 mag. The new application of bootstrap cross-validation to SN Ia light-curve inference tests the sensitivity of the statistical model fit to the finite sample and estimates the prediction error at 0.15 mag. These results demonstrate that SN Ia NIR light curves are as effective as corrected optical light curves, and, because they are less vulnerable to dust absorption, they have great potential as precise and accurate cosmological distance indicators.

  2. Complex curve of the two-matrix model and its tau-function

    International Nuclear Information System (INIS)

    Kazakov, Vladimir A; Marshakov, Andrei

    2003-01-01

    We study the Hermitian and normal two-matrix models in planar approximation for an arbitrary number of eigenvalue supports. Its planar graph interpretation is given. The study reveals a general structure of the underlying analytic complex curve, different from the hyperelliptic curve of the one-matrix model. The matrix model quantities are expressed through the periods of meromorphic generating differential on this curve and the partition function of the multiple support solution, as a function of filling numbers and coefficients of the matrix potential, is shown to be a quasiclassical tau-function. The relation to N = 1 supersymmetric Yang-Mills theories is discussed. A general class of solvable multi-matrix models with tree-like interactions is considered

  3. Developing Novel Reservoir Rule Curves Using Seasonal Inflow Projections

    Science.gov (United States)

    Tseng, Hsin-yi; Tung, Ching-pin

    2015-04-01

    Due to significant seasonal rainfall variations, reservoirs and their flexible operational rules are indispensable to Taiwan. Furthermore, with the intensifying impacts of climate change on extreme climate, the frequency of droughts in Taiwan has been increasing in recent years. Drought is a creeping phenomenon; its slow onset makes it difficult to detect at an early stage and causes delays in making the best decisions for allocating water. For these reasons, novel reservoir rule curves using projected seasonal streamflow are proposed in this study, which can potentially reduce the adverse effects of drought. This study is dedicated to establishing new rule curves which consider both the current available storage and the anticipated monthly inflows with a lead time of two months, in order to reduce the risk of water shortage. The monthly inflows are projected based on the seasonal climate forecasts from the Central Weather Bureau (CWB), and a weather generation model is used to produce daily weather data for the hydrological component of the GWLF model. To incorporate future monthly inflow projections into rule curves, this study designs a decision flow index which is a linear combination of the current available storage and the inflow projections with a lead time of two months. By optimizing the coefficients of the decision flow index, the shape of the rule curves and the percentage of water supplied in each zone, the best rule curves for decreasing water shortage risk and impacts can be developed. The Shimen Reservoir in northern Taiwan is used as a case study to demonstrate the proposed method. Existing rule curves (M5 curves) of the Shimen Reservoir are compared with two cases of new rule curves, including hindcast simulations and historic seasonal forecasts. The results show the new rule curves can decrease the total water shortage ratio and, in addition, can allocate shortage amounts to preceding months to avoid extreme shortage events. Even though some uncertainties in

  4. An algorithm for unified analysis on the thermoluminescence glow curve

    International Nuclear Information System (INIS)

    Chung, K.S.; Park, C.Y.; Lee, J.I.; Kim, J.L.

    2014-01-01

    An algorithm was developed to integrally handle excitation by radiation, relaxation, and luminescence by thermal or optical stimulation in thermoluminescence (TL) and optically stimulated luminescence (OSL) processes. This algorithm reflects the mutual interaction between traps through a conduction band. Electrons and holes are created by radiation in the beginning; the electrons move to the traps through the conduction band, while the holes move to the recombination center through a valence band. The fraction of electrons allocated to each trap differs with the recombination probability, and these values are also relevant to the luminescence process. Accordingly, the glow curve can be interpreted by taking the rate of electron–hole pairs created by ionizing radiation as the unique initial condition. This method differs from the conventional method of interpreting the measured glow curve with the initial electron concentration allocated to each trap at the end of irradiation. A program was written in C# with Visual Studio to implement the developed algorithm. To verify the algorithm, it was applied to LiF:Mg,Cu,Si. The TL glow curve was deconvoluted with a model of five traps, one deep trap and one recombination center (RC). - Highlights: • TL glow curve deconvolution employing an interacting-trap model. • Simulation of both the irradiation and TL readout stages for various dose levels. • Application to the identification of the TL kinetics of LiF:Mg,Cu,Si TLD
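
    For orientation only, the sketch below computes a single first-order (Randall-Wilkins) glow peak, the non-interacting building block that conventional glow curve deconvolution sums over traps; the authors' interacting-trap algorithm couples traps through the conduction band and is not reproduced here. The trap parameters are illustrative.

```python
import numpy as np
from scipy.integrate import cumulative_trapezoid

kB = 8.617e-5                          # Boltzmann constant, eV/K
E, s, n0, beta = 1.2, 1e12, 1e5, 1.0   # activation energy (eV), frequency factor (1/s),
                                       # initial trapped charge, heating rate (K/s)

T = np.linspace(300.0, 600.0, 2000)    # temperature ramp (K)
p = s * np.exp(-E / (kB * T))          # escape probability per unit time
# First-order kinetics: I(T) = n0 * p(T) * exp(-(1/beta) * integral of p dT)
integral = cumulative_trapezoid(p, T, initial=0.0) / beta
intensity = n0 * p * np.exp(-integral)
print("peak at T =", T[np.argmax(intensity)], "K")
```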

  5. Clinical diagnostic accuracy of acute colonic diverticulitis in patients admitted with acute abdominal pain, a receiver operating characteristic curve analysis.

    Science.gov (United States)

    Jamal Talabani, A; Endreseth, B H; Lydersen, S; Edna, T-H

    2017-01-01

    The study investigated the capability of clinical findings, temperature, C-reactive protein (CRP), and white blood cell (WBC) count to discern patients with acute colonic diverticulitis from all other patients admitted with acute abdominal pain. The probability of acute diverticulitis was assessed by the examining doctor, using a scale from 0 (zero probability) to 10 (100 % probability). Receiver operating characteristic (ROC) curves were used to assess the clinical diagnostic accuracy of acute colonic diverticulitis in patients admitted with acute abdominal pain. Of 833 patients admitted with acute abdominal pain, 95 had acute colonic diverticulitis. ROC curve analysis gave an area under the ROC curve (AUC) of 0.95 (CI 0.92 to 0.97) for all patients. A separate analysis showed an AUC of 0.83 (CI 0.80 to 0.86) for CRP alone. White blood cell count and temperature were almost useless for discriminating acute colonic diverticulitis from other causes of acute abdominal pain, with AUC = 0.59 (CI 0.53 to 0.65) for white blood cell count and AUC = 0.57 (CI 0.50 to 0.63) for temperature. This prospective study demonstrates that standard clinical evaluation by non-specialist doctors based on history, physical examination, and initial blood tests on admission provides a high degree of diagnostic precision in patients with acute colonic diverticulitis.
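
    A minimal sketch of the ROC analysis described above, treating the clinician's 0-10 probability score as the classifier output; the scores and diagnoses below are synthetic, not the study data.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(1)
# Synthetic data: 1 = confirmed acute colonic diverticulitis, 0 = other cause
diagnosis = rng.binomial(1, 95 / 833, size=833)
# Clinician's 0-10 score, loosely higher for true diverticulitis cases
score = np.clip(rng.normal(3 + 5 * diagnosis, 2), 0, 10)

auc = roc_auc_score(diagnosis, score)
fpr, tpr, thresholds = roc_curve(diagnosis, score)
print(f"AUC = {auc:.2f} over {len(thresholds)} score thresholds")
```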

  6. Probabilistic Rainfall Intensity-Duration-Frequency Curves for the October 2015 Flooding in South Carolina

    Science.gov (United States)

    Phillips, R.; Samadi, S. Z.; Meadows, M.

    2017-12-01

    The potential for the intensity of extreme rainfall to increase with climate change nonstationarity has emerged as a prevailing issue for the design of engineering infrastructure, underscoring the need to better characterize the statistical assumptions underlying hydrological frequency analysis. The focus of this study is on developing probabilistic rainfall intensity-duration-frequency (IDF) curves for the major catchments in South Carolina (SC), where the October 02-05, 2015 floods caused infrastructure damage and several lives to be lost. Probability distributions including the Weibull, the generalized extreme value (GEV), the generalized Pareto (GP), the Gumbel, the Fréchet, the normal, and the log-normal were fitted to the short-duration (i.e., 24-hr) intense rainfall. The analysis suggests that the GEV probability distribution provided the most adequate fit to the rainfall records. Rainfall frequency analysis indicated return periods above 500 years for urban drainage systems, with a maximum return period of approximately 2,744 years, whereas rainfall magnitude was much lower in rural catchments. Further, the return levels (i.e., for 2, 20, 50, 100, 500, and 1000 years) computed by the Monte Carlo method were consistently higher than the NOAA design IDF curves. Given the potential increase in the magnitude of intense rainfall, current IDF curves can substantially underestimate the frequency of extremes, indicating the susceptibility of the storm drainage and flood control structures in SC that were designed under assumptions of a stationary climate.
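
    A minimal sketch of the distribution-fitting and return-level step, assuming annual-maximum 24-hr rainfall depths are available as an array; the data here are synthetic and the comparison with NOAA design values is omitted.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
annual_max_24hr = rng.gumbel(loc=100.0, scale=35.0, size=60)  # synthetic, mm

# Fit a GEV distribution to the annual-maximum series
shape, loc, scale = stats.genextreme.fit(annual_max_24hr)

# Return level for return period T: the (1 - 1/T) quantile of the fitted GEV
for T in (2, 20, 50, 100, 500, 1000):
    level = stats.genextreme.ppf(1.0 - 1.0 / T, shape, loc=loc, scale=scale)
    print(f"{T:5d}-yr 24-hr rainfall: {level:6.1f} mm")
```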

  7. Statistics and probability with applications for engineers and scientists

    CERN Document Server

    Gupta, Bhisham C

    2013-01-01

    Introducing the tools of statistics and probability from the ground up An understanding of statistical tools is essential for engineers and scientists who often need to deal with data analysis over the course of their work. Statistics and Probability with Applications for Engineers and Scientists walks readers through a wide range of popular statistical techniques, explaining step-by-step how to generate, analyze, and interpret data for diverse applications in engineering and the natural sciences. Unique among books of this kind, Statistics and Prob

  8. Fatal defect in computerized glow curve deconvolution of thermoluminescence

    International Nuclear Information System (INIS)

    Sakurai, T.

    2001-01-01

    The method of computerized glow curve deconvolution (CGCD) is a powerful tool in the study of thermoluminescence (TL). In a system where multiple trapping levels have a probability of retrapping, the electrons trapped at one level can transfer from this level to another through retrapping via the conduction band during TL readout. However, at present, the method of CGCD takes no account of such electron transitions between trapping levels; this is a fatal defect. It is shown by computer simulation that CGCD using general-order kinetics thus cannot yield the correct trap parameters. (author)

  9. Finite-size scaling of survival probability in branching processes.

    Science.gov (United States)

    Garcia-Millan, Rosalba; Font-Clos, Francesc; Corral, Álvaro

    2015-04-01

    Branching processes pervade many models in statistical physics. We investigate the survival probability of a Galton-Watson branching process after a finite number of generations. We derive analytically the existence of finite-size scaling for the survival probability as a function of the control parameter and the maximum number of generations, obtaining the critical exponents as well as the exact scaling function, which is G(y) = 2y e^y/(e^y - 1), with y the rescaled distance to the critical point. Our findings are valid for any branching process of the Galton-Watson type, independently of the distribution of the number of offspring, provided its variance is finite. This proves the universal behavior of the finite-size effects in branching processes, including the universality of the metric factors. The direct relation to mean-field percolation is also discussed.
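
    A minimal sketch of estimating the survival probability of a Galton-Watson process after a finite number of generations by direct simulation; the Poisson offspring law and parameter values are illustrative choices, not those of the paper.

```python
import numpy as np

def survival_probability(mean_offspring, generations, trials=20000, seed=3):
    """Fraction of Galton-Watson trees (Poisson offspring) still alive after
    `generations` generations, starting from a single ancestor."""
    rng = np.random.default_rng(seed)
    alive = 0
    for _ in range(trials):
        population = 1
        for _ in range(generations):
            if population == 0:
                break
            # The sum of `population` i.i.d. Poisson(m) offspring counts is Poisson(m * population)
            population = rng.poisson(mean_offspring * population)
        alive += population > 0
    return alive / trials

# Near the critical point (mean offspring number = 1) survival decays with generations
for m in (0.95, 1.0, 1.05):
    print(m, survival_probability(m, generations=50))
```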

  10. Stenting for curved lesions using a novel curved balloon: Preliminary experimental study.

    Science.gov (United States)

    Tomita, Hideshi; Higaki, Takashi; Kobayashi, Toshiki; Fujii, Takanari; Fujimoto, Kazuto

    2015-08-01

    Stenting may be a compelling approach to dilating curved lesions in congenital heart diseases. However, balloon-expandable stents, which are commonly used for congenital heart diseases, are usually deployed in a straight orientation. In this study, we evaluated the effect of stenting with a novel curved balloon considered to provide better conformability to the curved-angled lesion. In vitro experiments: A Palmaz Genesis(®) stent (Johnson & Johnson, Cordis Co, Bridgewater, NJ, USA) mounted on the Goku(®) curve (Tokai Medical Co. Nagoya, Japan) was dilated in vitro to observe directly the behavior of the stent and balloon assembly during expansion. Animal experiment: A short Express(®) Vascular SD (Boston Scientific Co, Marlborough, MA, USA) stent and a long Express(®) Vascular LD stent (Boston Scientific) mounted on the curved balloon were deployed in the curved vessel of a pig to observe the effect of stenting in vivo. In vitro experiments: Although the stent was dilated in a curved fashion, stent and balloon assembly also rotated conjointly during expansion of its curved portion. In the primary stenting of the short stent, the stent was dilated with rotation of the curved portion. The excised stent conformed to the curved vessel. As the long stent could not be negotiated across the mid-portion with the balloon in expansion when it started curving, the mid-portion of the stent failed to expand fully. Furthermore, the balloon, which became entangled with the stent strut, could not be retrieved even after complete deflation. This novel curved balloon catheter might be used for implantation of the short stent in a curved lesion; however, it should not be used for primary stenting of the long stent. Post-dilation to conform the stent to the angled vessel would be safer than primary stenting irrespective of stent length. Copyright © 2014 Japanese College of Cardiology. Published by Elsevier Ltd. All rights reserved.

  11. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko, [No Value

    2004-01-01

    Using a simple relation between the Dirac delta-function and the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  12. Curve Boxplot: Generalization of Boxplot for Ensembles of Curves.

    Science.gov (United States)

    Mirzargar, Mahsa; Whitaker, Ross T; Kirby, Robert M

    2014-12-01

    In simulation science, computational scientists often study the behavior of their simulations by repeated solutions with variations in parameters and/or boundary values or initial conditions. Through such simulation ensembles, one can try to understand or quantify the variability or uncertainty in a solution as a function of the various inputs or model assumptions. In response to a growing interest in simulation ensembles, the visualization community has developed a suite of methods for allowing users to observe and understand the properties of these ensembles in an efficient and effective manner. An important aspect of visualizing simulations is the analysis of derived features, often represented as points, surfaces, or curves. In this paper, we present a novel, nonparametric method for summarizing ensembles of 2D and 3D curves. We propose an extension of a method from descriptive statistics, data depth, to curves. We also demonstrate a set of rendering and visualization strategies for showing rank statistics of an ensemble of curves, which is a generalization of traditional whisker plots or boxplots to multidimensional curves. Results are presented for applications in neuroimaging, hurricane forecasting and fluid dynamics.

  13. Classical black holes: the nonlinear dynamics of curved spacetime.

    Science.gov (United States)

    Thorne, Kip S

    2012-08-03

    Numerical simulations have revealed two types of physical structures, made from curved spacetime, that are attached to black holes: tendexes, which stretch or squeeze anything they encounter, and vortexes, which twist adjacent inertial frames relative to each other. When black holes collide, their tendexes and vortexes interact and oscillate (a form of nonlinear dynamics of curved spacetime). These oscillations generate gravitational waves, which can give kicks up to 4000 kilometers per second to the merged black hole. The gravitational waves encode details of the spacetime dynamics and will soon be observed and studied by the Laser Interferometer Gravitational Wave Observatory and its international partners.

  14. JUMPING THE CURVE

    Directory of Open Access Journals (Sweden)

    René Pellissier

    2012-01-01

    Full Text Available This paper explores the notion of jumping the curve, following from Handy's S-curve onto a new curve with new rules, policies and procedures. It claims that the curve does not generally lie in wait but has to be invented by leadership. The focus of this paper is the identification (mathematically and inferentially) of that point in time, known as the cusp in catastrophe theory, when it is time to change - pro-actively, pre-actively or reactively. These three scenarios are addressed separately and discussed in terms of the relevance of each.

  15. Thermoeconomic diagnosis and entropy generation paradox

    DEFF Research Database (Denmark)

    Sigthorsson, Oskar; Ommen, Torben Schmidt; Elmegaard, Brian

    2017-01-01

    In the entropy generation paradox, the entropy generation number, as a function of heat exchanger effectiveness, counter-intuitively approaches zero in two limits symmetrically from a single maximum. In thermoeconomic diagnosis, namely in the characteristic curve method, the exergy destruction...... to the entropy generation paradox, as a decreased heat exchanger effectiveness (as in the case of an operation anomaly in the component) can counter-intuitively result in decreased exergy destruction rate of the component. Therefore, along with an improper selection of independent variables, the heat exchanger...... increases in case of an operation anomaly in a component. The normalised exergy destruction rate as the dependent variable therefore resolves the relation of the characteristic curve method with the entropy generation paradox....

  16. Going Beyond, Going Further: The Preparation of Acid-Base Titration Curves.

    Science.gov (United States)

    McClendon, Michael

    1984-01-01

    Background information, list of materials needed, and procedures used are provided for a simple technique for generating mechanically plotted acid-base titration curves. The method is suitable for second-year high school chemistry students. (JN)

  17. Effects of LWR coolant environments on fatigue design curves of carbon and low-alloy steels

    International Nuclear Information System (INIS)

    Chopra, O.K.; Shack, W.J.

    1998-03-01

    The ASME Boiler and Pressure Vessel Code provides rules for the construction of nuclear power plant components. Figures I-9.1 through I-9.6 of Appendix I to Section III of the code specify fatigue design curves for structural materials. While effects of reactor coolant environments are not explicitly addressed by the design curves, test data indicate that the Code fatigue curves may not always be adequate in coolant environments. This report summarizes work performed by Argonne National Laboratory on fatigue of carbon and low-alloy steels in light water reactor (LWR) environments. The existing fatigue S-N data have been evaluated to establish the effects of various material and loading variables such as steel type, dissolved oxygen level, strain range, strain rate, temperature, orientation, and sulfur content on the fatigue life of these steels. Statistical models have been developed for estimating the fatigue S-N curves as a function of material, loading, and environmental variables. The results have been used to estimate the probability of fatigue cracking of reactor components. The different methods for incorporating the effects of LWR coolant environments on the ASME Code fatigue design curves are presented

  18. Assessing neural activity related to decision-making through flexible odds ratio curves and their derivatives.

    Science.gov (United States)

    Roca-Pardiñas, Javier; Cadarso-Suárez, Carmen; Pardo-Vazquez, Jose L; Leboran, Victor; Molenberghs, Geert; Faes, Christel; Acuña, Carlos

    2011-06-30

    It is well established that neural activity is stochastically modulated over time. Therefore, direct comparisons across experimental conditions and determination of change points or maximum firing rates are not straightforward. This study sought to compare temporal firing probability curves that may vary across groups defined by different experimental conditions. Odds-ratio (OR) curves were used as a measure of comparison, and the main goal was to provide a global test to detect significant differences of such curves through the study of their derivatives. An algorithm is proposed that enables ORs based on generalized additive models, including factor-by-curve-type interactions to be flexibly estimated. Bootstrap methods were used to draw inferences from the derivatives curves, and binning techniques were applied to speed up computation in the estimation and testing processes. A simulation study was conducted to assess the validity of these bootstrap-based tests. This methodology was applied to study premotor ventral cortex neural activity associated with decision-making. The proposed statistical procedures proved very useful in revealing the neural activity correlates of decision-making in a visual discrimination task. Copyright © 2011 John Wiley & Sons, Ltd.

  19. The Effect of Velocity Correlation on the Spatial Evolution of Breakthrough Curves in Heterogeneous Media

    Science.gov (United States)

    Massoudieh, A.; Dentz, M.; Le Borgne, T.

    2017-12-01

    In heterogeneous media, the velocity distribution and the spatial correlation structure of velocity for solute particles determine the breakthrough curves and how they evolve as one moves away from the solute source. The ability to predict such evolution can help relate the spatio-statistical hydraulic properties of the media to the transport behavior and travel time distributions. While commonly used non-local transport models such as anomalous dispersion and the classical continuous time random walk (CTRW) can reproduce breakthrough curves successfully by adjusting the model parameter values, they lack the ability to relate model parameters to the spatio-statistical properties of the media. This in turn limits the transferability of these models. In the research to be presented, we express the concentration or flux of solutes as a distribution over their velocity. We then derive an integrodifferential equation that governs the evolution of the particle distribution over velocity at given times and locations for a particle ensemble, based on a presumed velocity correlation structure and an ergodic cross-sectional velocity distribution. In this way, the spatial evolution of breakthrough curves away from the source is predicted based on the cross-sectional velocity distribution and the connectivity, which is expressed by the velocity transition probability density. The transition probability is specified via a copula function that can help construct a joint distribution with a given correlation and given marginal velocities. Using this approach, we analyze the breakthrough curves depending on the velocity distribution and correlation properties. The model shows how the solute transport behavior evolves from ballistic transport at small spatial scales to Fickian dispersion at large length scales relative to the velocity correlation length.

  20. COVAL, Compound Probability Distribution for Function of Probability Distribution

    International Nuclear Information System (INIS)

    Astolfi, M.; Elbaz, J.

    1979-01-01

    1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions
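
    A minimal sketch of the generic task COVAL addresses, i.e. obtaining the distribution of a function of random variables, here via Monte Carlo sampling rather than COVAL's numerical transformation of distributions; the load/strength model and its parameters are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200_000

# Illustrative structural reliability example: margin = strength - load
load = rng.lognormal(mean=np.log(50.0), sigma=0.25, size=n)   # random load
strength = rng.normal(loc=120.0, scale=15.0, size=n)          # random strength
margin = strength - load                                      # function of the variables

# Empirical distribution of the function, and a failure probability estimate
quantiles = np.percentile(margin, [1, 5, 50, 95, 99])
print("margin quantiles:", np.round(quantiles, 1))
print("P(margin < 0) ~", np.mean(margin < 0.0))
```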

  1. Dual Smarandache Curves of a Timelike Curve lying on Unit dual Lorentzian Sphere

    OpenAIRE

    Kahraman, Tanju; Hüseyin Ugurlu, Hasan

    2016-01-01

    In this paper, we give the Darboux approximation for dual Smarandache curves of a timelike curve on the unit dual Lorentzian sphere. Firstly, we define the four types of dual Smarandache curves of a timelike curve lying on the unit dual Lorentzian sphere.

  2. Can Low-Resolution Airborne Laser Scanning Data Be Used to Model Stream Rating Curves?

    Directory of Open Access Journals (Sweden)

    Steve W. Lyon

    2015-03-01

    Full Text Available This pilot study explores the potential of using low-resolution (0.2 points/m2) airborne laser scanning (ALS)-derived elevation data to model stream rating curves. Rating curves, which allow the functional translation of stream water depth into discharge, making them integral to water resource monitoring efforts, were modeled using a physics-based approach that captures basic geometric measurements to establish flow resistance due to implicit channel roughness. We tested synthetically thinned high-resolution (more than 2 points/m2) ALS data as a proxy for low-resolution data at a point density equivalent to that obtained within most national-scale ALS strategies. Our results show that the errors incurred due to the effect of low-resolution versus high-resolution ALS data were less than those due to flow measurement and empirical rating curve fitting uncertainties. As such, although there likely are scale and technical limitations to consider, it is theoretically possible to generate rating curves in a river network from ALS data of the resolution anticipated within national-scale ALS schemes (at least for rivers with relatively simple geometries). This is promising, since generating rating curves from ALS scans would greatly enhance our ability to monitor streamflow by simplifying the overall effort required.

  3. Can low-resolution airborne laser scanning data be used to model stream rating curves?

    Science.gov (United States)

    Lyon, Steve; Nathanson, Marcus; Lam, Norris; Dahlke, Helen; Rutzinger, Martin; Kean, Jason W.; Laudon, Hjalmar

    2015-01-01

    This pilot study explores the potential of using low-resolution (0.2 points/m2) airborne laser scanning (ALS)-derived elevation data to model stream rating curves. Rating curves, which allow the functional translation of stream water depth into discharge, making them integral to water resource monitoring efforts, were modeled using a physics-based approach that captures basic geometric measurements to establish flow resistance due to implicit channel roughness. We tested synthetically thinned high-resolution (more than 2 points/m2) ALS data as a proxy for low-resolution data at a point density equivalent to that obtained within most national-scale ALS strategies. Our results show that the errors incurred due to the effect of low-resolution versus high-resolution ALS data were less than those due to flow measurement and empirical rating curve fitting uncertainties. As such, although there likely are scale and technical limitations to consider, it is theoretically possible to generate rating curves in a river network from ALS data of the resolution anticipated within national-scale ALS schemes (at least for rivers with relatively simple geometries). This is promising, since generating rating curves from ALS scans would greatly enhance our ability to monitor streamflow by simplifying the overall effort required.

  4. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.

  5. A dose-response curve for biodosimetry from a 6 MV electron linear accelerator

    Energy Technology Data Exchange (ETDEWEB)

    Lemos-Pinto, M.M.P.; Cadena, M.; Santos, N.; Fernandes, T.S.; Borges, E.; Amaral, A., E-mail: marcelazoo@yahoo.com.br [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Departamento de Energia Nuclear

    2015-10-15

    Biological dosimetry (biodosimetry) is based on the investigation of radiation-induced biological effects (biomarkers), mainly dicentric chromosomes, in order to correlate them with radiation dose. To interpret the dicentric score in terms of absorbed dose, a calibration curve is needed. Each curve should be constructed with respect to basic physical parameters, such as the type of ionizing radiation characterized by low or high linear energy transfer (LET) and dose rate. This study was designed to obtain dose calibration curves by scoring of dicentric chromosomes in peripheral blood lymphocytes irradiated in vitro with a 6 MV electron linear accelerator (Mevatron M, Siemens, USA). Two software programs, CABAS (Chromosomal Aberration Calculation Software) and Dose Estimate, were used to generate the curve. The two software programs are discussed; the results obtained were compared with each other and with other published low LET radiation curves. Both software programs resulted in identical linear and quadratic terms for the curve presented here, which was in good agreement with published curves for similar radiation quality and dose rates. (author)

  6. A dose-response curve for biodosimetry from a 6 MV electron linear accelerator.

    Science.gov (United States)

    Lemos-Pinto, M M P; Cadena, M; Santos, N; Fernandes, T S; Borges, E; Amaral, A

    2015-10-01

    Biological dosimetry (biodosimetry) is based on the investigation of radiation-induced biological effects (biomarkers), mainly dicentric chromosomes, in order to correlate them with radiation dose. To interpret the dicentric score in terms of absorbed dose, a calibration curve is needed. Each curve should be constructed with respect to basic physical parameters, such as the type of ionizing radiation characterized by low or high linear energy transfer (LET) and dose rate. This study was designed to obtain dose calibration curves by scoring of dicentric chromosomes in peripheral blood lymphocytes irradiated in vitro with a 6 MV electron linear accelerator (Mevatron M, Siemens, USA). Two software programs, CABAS (Chromosomal Aberration Calculation Software) and Dose Estimate, were used to generate the curve. The two software programs are discussed; the results obtained were compared with each other and with other published low LET radiation curves. Both software programs resulted in identical linear and quadratic terms for the curve presented here, which was in good agreement with published curves for similar radiation quality and dose rates.
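
    A minimal sketch of fitting the linear-quadratic dicentric yield curve Y(D) = c + αD + βD² that such calibration software produces, here with weighted least squares on synthetic scoring data rather than with CABAS or Dose Estimate.

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic calibration points: dose (Gy), dicentrics scored, cells scored
dose = np.array([0.0, 0.25, 0.5, 1.0, 2.0, 3.0, 4.0])
dicentrics = np.array([2, 6, 14, 40, 130, 270, 450])
cells = np.array([5000, 4000, 3000, 2000, 1500, 1000, 800])
yield_per_cell = dicentrics / cells

def lq(D, c, alpha, beta):
    """Linear-quadratic dose response for dicentric yield."""
    return c + alpha * D + beta * D**2

# Weight by the approximate Poisson error on the yield
sigma = np.sqrt(np.maximum(dicentrics, 1)) / cells
params, cov = curve_fit(lq, dose, yield_per_cell, sigma=sigma, absolute_sigma=True)
print("c, alpha, beta =", np.round(params, 4))
```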

  7. Cholinergic neuromodulation changes phase response curve shape and type in cortical pyramidal neurons.

    Directory of Open Access Journals (Sweden)

    Klaus M Stiefel

    Full Text Available Spike generation in cortical neurons depends on the interplay between diverse intrinsic conductances. The phase response curve (PRC) is a measure of the spike time shift caused by perturbations of the membrane potential as a function of the phase of the spike cycle of a neuron. Near the rheobase, purely positive (type I) phase-response curves are associated with an onset of repetitive firing through a saddle-node bifurcation, whereas biphasic (type II) phase-response curves point towards a transition based on a Hopf-Andronov bifurcation. In recordings from layer 2/3 pyramidal neurons in cortical slices, cholinergic action, consistent with down-regulation of slow voltage-dependent potassium currents such as the M-current, switched the PRC from type II to type I. This is the first report showing that cholinergic neuromodulation may cause a qualitative switch in the PRC type, implying a change in the fundamental dynamical mechanism of spike generation.

  8. ECM using Edwards curves

    DEFF Research Database (Denmark)

    Bernstein, Daniel J.; Birkner, Peter; Lange, Tanja

    2013-01-01

    -arithmetic level are as follows: (1) use Edwards curves instead of Montgomery curves; (2) use extended Edwards coordinates; (3) use signed-sliding-window addition-subtraction chains; (4) batch primes to increase the window size; (5) choose curves with small parameters and base points; (6) choose curves with large...

  9. Uncertainty estimation with bias-correction for flow series based on rating curve

    Science.gov (United States)

    Shao, Quanxi; Lerat, Julien; Podger, Geoff; Dutta, Dushmanta

    2014-03-01

    Streamflow discharge constitutes one of the fundamental data required to perform water balance studies and develop hydrological models. A rating curve, designed based on a series of concurrent stage and discharge measurements at a gauging location, provides a way to generate complete discharge time series of reasonable quality if sufficient measurement points are available. However, the associated uncertainty is frequently not available, even though it has a significant impact on hydrological modelling. In this paper, we identify the discrepancy of the hydrographers' rating curves used to derive the historical discharge data series and propose a modification by bias correction which, like the traditional rating curve, takes the form of a power function. In order to obtain the uncertainty estimation, we propose a further Box-Cox transformation on both sides to bring the regression residuals as close to the normal distribution as possible, so that a proper uncertainty can be attached to the whole discharge series in the ensemble generation. We demonstrate the proposed method by applying it to the gauging stations in the Flinders and Gilbert rivers in north-west Queensland, Australia.
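
    A minimal sketch of fitting a power-law rating curve Q = a(h − h0)^b and generating an uncertainty ensemble from the residuals; the log-transform below is a simplified stand-in for the paper's both-side Box-Cox transformation, and the stage-discharge data are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(5)

# Synthetic gaugings: stage h (m) and measured discharge Q (m^3/s)
h = np.sort(rng.uniform(0.4, 3.0, size=40))
Q = 12.0 * (h - 0.2) ** 1.7 * np.exp(rng.normal(0.0, 0.08, size=h.size))

def rating(h, a, h0, b):
    """Power-law rating curve Q = a * (h - h0)**b (clipped to avoid negative depth)."""
    return a * np.clip(h - h0, 1e-6, None) ** b

popt, _ = curve_fit(rating, h, Q, p0=(10.0, 0.1, 1.5))

# Log residuals as a crude stand-in for a both-side Box-Cox transformation
resid = np.log(Q) - np.log(rating(h, *popt))
ensemble = rating(h, *popt)[None, :] * np.exp(
    rng.normal(0.0, resid.std(ddof=1), size=(500, h.size)))
band = np.percentile(ensemble[:, -1], 95) - np.percentile(ensemble[:, -1], 5)
print("a, h0, b =", np.round(popt, 3), "| 90% band width at top stage:", round(band, 2))
```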

  10. Theory of overdispersion in counting statistics caused by fluctuating probabilities

    International Nuclear Information System (INIS)

    Semkow, Thomas M.

    1999-01-01

    It is shown that the random Lexis fluctuations of probabilities such as probability of decay or detection cause the counting statistics to be overdispersed with respect to the classical binomial, Poisson, or Gaussian distributions. The generating and the distribution functions for the overdispersed counting statistics are derived. Applications to radioactive decay with detection and more complex experiments are given, as well as distinguishing between the source and background, in the presence of overdispersion. Monte-Carlo verifications are provided
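
    A minimal sketch illustrating the mechanism described above: when the decay or detection probability itself fluctuates between repetitions, the resulting counts are overdispersed relative to a fixed-probability binomial. The beta model for the fluctuating probability is an illustrative choice, not the paper's Lexis formulation.

```python
import numpy as np

rng = np.random.default_rng(6)
n_trials, reps = 10_000, 5_000
p_mean = 0.02

# Fixed probability: plain binomial counts
fixed = rng.binomial(n_trials, p_mean, size=reps)

# Fluctuating probability: p drawn afresh for each repetition (beta law, same mean)
p = rng.beta(a=20.0, b=20.0 * (1 - p_mean) / p_mean, size=reps)
fluct = rng.binomial(n_trials, p)

for name, x in (("fixed p", fixed), ("fluctuating p", fluct)):
    print(f"{name:14s} mean={x.mean():7.1f}  variance/mean={x.var(ddof=1)/x.mean():5.2f}")
# The variance-to-mean ratio rises above the binomial value only when p fluctuates.
```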

  11. Statistical validation of normal tissue complication probability models.

    Science.gov (United States)

    Xu, Cheng-Jian; van der Schaaf, Arjen; Van't Veld, Aart A; Langendijk, Johannes A; Schilstra, Cornelis

    2012-09-01

    To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use. Copyright © 2012 Elsevier Inc. All rights reserved.

  12. Statistical Validation of Normal Tissue Complication Probability Models

    Energy Technology Data Exchange (ETDEWEB)

    Xu Chengjian, E-mail: c.j.xu@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schaaf, Arjen van der; Veld, Aart A. van' t; Langendijk, Johannes A. [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Schilstra, Cornelis [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Radiotherapy Institute Friesland, Leeuwarden (Netherlands)

    2012-09-01

    Purpose: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. Methods and Materials: A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Results: Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Conclusion: Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use.
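
    A minimal sketch of the validation workflow described in the two records above: an L1-penalized (LASSO-type) logistic NTCP model assessed by cross-validated ROC AUC and a permutation test (the repeated double cross-validation is simplified here to a single cross-validation loop); the dose-volume features and xerostomia outcomes are synthetic placeholders.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score, permutation_test_score, StratifiedKFold

rng = np.random.default_rng(7)
X = rng.normal(size=(150, 30))            # synthetic dose-volume / clinical features
y = rng.binomial(1, 1 / (1 + np.exp(-(0.8 * X[:, 0] + 0.6 * X[:, 1]))))  # synthetic outcome

lasso_ntcp = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)

auc = cross_val_score(lasso_ntcp, X, y, cv=cv, scoring="roc_auc")
score, perm_scores, p_value = permutation_test_score(
    lasso_ntcp, X, y, cv=cv, scoring="roc_auc", n_permutations=200, random_state=0)
print(f"CV AUC = {auc.mean():.2f} +/- {auc.std():.2f}, permutation p = {p_value:.3f}")
```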

  13. The application of numerical debris flow modelling for the generation of physical vulnerability curves

    Czech Academy of Sciences Publication Activity Database

    Quan Luna, B.; Blahůt, Jan; van Westen, C.J.; Sterlacchini, S.; van Asch, T.W.J.; Akbas, S.O.

    2011-01-01

    Roč. 11, č. 7 (2011), s. 2047-2060 ISSN 1561-8633 Institutional research plan: CEZ:AV0Z30460519 Keywords : debris flow modelling * hazard * vulnerability curves Subject RIV: DB - Geology ; Mineralogy Impact factor: 1.983, year: 2011 http://www.nat-hazards-earth-syst-sci.net/11/2047/2011/

  14. Tube structural integrity evaluation of Palo Verde Unit 1 steam generators for axial upper-bundle cracking

    International Nuclear Information System (INIS)

    Woodman, B.W.; Begley, J.A.; Brown, S.D.; Sweeney, K.; Radspinner, M.; Melton, M.

    1995-01-01

    The analysis of the issue of upper bundle axial ODSCC as it applies to steam generator tube structural integrity in Unit 1 at the Palo Verde Nuclear Generating Station is presented in this study. Based on past inspection results for Units 2 and 3 at Palo Verde, the detection of secondary side stress corrosion cracks in the upper bundle region of Unit 1 may occur at some future date. The following discussion provides a description and analysis of the probability of axial ODSCC in Unit 1 leading to the exceedance of Regulatory Guide 1.121 structural limits. The probabilities of structural limit exceedance are estimated as a function of run time using a conservative approach. The chosen approach models the historical development of cracks, crack growth, detection of cracks and subsequent removal from service, and the initiation and growth of new cracks during a given cycle of operation. Past performance of all Palo Verde Units as well as the historical performance of other steam generators was considered in the development of cracking statistics for application to Unit 1. Data in the literature and Unit 2 pulled-tube examination results were used to construct probability of detection curves for the detection of axial IGSCC/IGA using an MRPC (multi-frequency rotating pancake coil) eddy current probe. Crack growth rates were estimated from Unit 2 eddy current inspection data combined with pulled-tube examination results and data in the literature. A Monte Carlo probabilistic model is developed to provide an overall assessment of the risk of Regulatory Guide exceedance during plant operation.

  15. DECIPHERING THERMAL PHASE CURVES OF DRY, TIDALLY LOCKED TERRESTRIAL PLANETS

    Energy Technology Data Exchange (ETDEWEB)

    Koll, Daniel D. B.; Abbot, Dorian S., E-mail: dkoll@uchicago.edu [Department of the Geophysical Sciences, University of Chicago, Chicago, IL 60637 (United States)

    2015-03-20

    Next-generation space telescopes will allow us to characterize terrestrial exoplanets. To do so effectively it will be crucial to make use of all available data. We investigate which atmospheric properties can, and cannot, be inferred from the broadband thermal phase curve of a dry and tidally locked terrestrial planet. First, we use dimensional analysis to show that phase curves are controlled by six nondimensional parameters. Second, we use an idealized general circulation model to explore the relative sensitivity of phase curves to these parameters. We find that the feature of phase curves most sensitive to atmospheric parameters is the peak-to-trough amplitude. Moreover, except for hot and rapidly rotating planets, the phase amplitude is primarily sensitive to only two nondimensional parameters: (1) the ratio of dynamical to radiative timescales and (2) the longwave optical depth at the surface. As an application of this technique, we show how phase curve measurements can be combined with transit or emission spectroscopy to yield a new constraint for the surface pressure and atmospheric mass of terrestrial planets. We estimate that a single broadband phase curve, measured over half an orbit with the James Webb Space Telescope, could meaningfully constrain the atmospheric mass of a nearby super-Earth. Such constraints will be important for studying the atmospheric evolution of terrestrial exoplanets as well as characterizing the surface conditions on potentially habitable planets.

  16. Comparing of the yield curve of the pediatric X-ray equipment using thermoluminescent dosimeters and cylindrical ionization chamber

    International Nuclear Information System (INIS)

    Filipov, Danielle; Schelin, Hugo R.; Tilly Junior, Joao G.

    2014-01-01

    The yield curve of radiographic equipment should be determined once a year, or whenever the unit is serviced. Besides being a requirement of ANVISA, this test makes it possible to determine the incident air kerma (INAK) at a given point in the center of the beam. Based on these concepts, the main objective of this work is the comparison of the yield curves of a pediatric X-ray apparatus obtained with two different detectors: a cylindrical ionization chamber and thermoluminescent dosimeters of type LiF:Mg,Cu,P, as per IAEA protocol RLA/9/057. The equation of the yield curve (generated by each detector) was then used to determine the INAK of 10 pediatric examinations performed on this equipment. After both detectors were calibrated, they were placed side by side relative to the focus of the tube to determine the yield curve of the equipment. Finally, using the curves generated by the two detectors, the INAK values of the 10 examinations were calculated (from the kVp, mAs and focus-patient distance of each exam), yielding differences of at most 5%. In conclusion, it can be said that lithium fluoride TLDs doped with Mg, Cu and P and cylindrical ionization chambers may both be used satisfactorily to determine the yield curve, whether for quality control or for dosimetry
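
    A minimal sketch of the INAK step under a common convention in which the tube output Y(kVp), the air kerma per mAs at a reference distance, is fitted as a power function of kVp and then scaled by the mAs and the inverse square of the focus-patient distance; the fitted coefficients and exam parameters below are placeholders, not the study's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

# Placeholder yield-curve measurements: output in microGy/mAs at 1 m vs. kVp
kvp_meas = np.array([50, 60, 70, 80, 90])
output_meas = np.array([18.0, 27.0, 38.0, 51.0, 66.0])   # hypothetical values

def yield_curve(kvp, a, n):
    """Tube output as a power function of kVp (common empirical form)."""
    return a * kvp**n

(a, n), _ = curve_fit(yield_curve, kvp_meas, output_meas, p0=(0.01, 2.0))

def inak(kvp, mas, focus_patient_m, ref_dist_m=1.0):
    """Incident air kerma for one exposure, scaled by mAs and the inverse-square law."""
    return yield_curve(kvp, a, n) * mas * (ref_dist_m / focus_patient_m) ** 2

print("INAK (uGy):", round(inak(kvp=66, mas=4.0, focus_patient_m=0.9), 1))
```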

  17. Parameter Deduction and Accuracy Analysis of Track Beam Curves in Straddle-type Monorail Systems

    Directory of Open Access Journals (Sweden)

    Xiaobo Zhao

    2015-12-01

    Full Text Available The accuracy of the bottom curve of a PC track beam is strongly related to the production quality of the entire beam. Many factors may affect the parameters of the bottom curve, such as the superelevation of the curve and the deformation of the PC track beam. At present, no effective method has been developed to determine the bottom curve of a PC track beam; therefore, a new technique is presented in this paper to deduce the parameters of such a curve and to control the accuracy of the computation results. First, the domain of the bottom curve of a PC track beam is assumed to be a spindle plane. Then, the corresponding supposed top curve domain is determined based on a geometrical relationship that is the opposite of that identified by the conventional method. Second, several optimal points are selected from the supposed top curve domain according to the dichotomy algorithm; the supposed top curve is thus generated by connecting these points. Finally, a rigorous criterion based on the fractal dimension is established to assess the accuracy of the assumed top curve deduced in the previous step. If this supposed curve coincides completely with the known top curve, then the assumed bottom curve corresponding to the assumed top curve is considered to be the real bottom curve. This technique of determining the bottom curve of a PC track beam is thus proven to be efficient and accurate.

  18. METHOD OF FOREST FIRES PROBABILITY ASSESSMENT WITH POISSON LAW

    Directory of Open Access Journals (Sweden)

    A. S. Plotnikova

    2016-01-01

    Full Text Available The article describes a method for forest fire burn probability estimation based on the Poisson distribution. The λ parameter is taken to be the mean daily number of fires detected for each Forest Fire Danger Index class within a specific period of time; thus, λ was calculated separately for the spring, summer and autumn seasons. Multi-annual daily Forest Fire Danger Index values, together with EO-derived hot spot maps, were the input data for the statistical analysis. The major result of the study is the generation of a database of forest fire burn probability. Results were validated against EO daily data on forest fires detected over Irkutsk oblast in 2013. The daily weighted average probability was shown to be linked with the daily number of detected forest fires. Meanwhile, a number of fires were found to have developed when the estimated probability was low. A possible explanation of this phenomenon is provided.
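
    A minimal sketch of the Poisson step: with λ the mean daily number of detected fires for a given Forest Fire Danger Index class and season, the probability of at least one fire on a day is 1 − exp(−λ). The λ values below are placeholders, not those derived in the study.

```python
import math

# Placeholder mean daily fire counts per Forest Fire Danger Index class (one season)
lambda_by_ffdi_class = {1: 0.01, 2: 0.05, 3: 0.15, 4: 0.40, 5: 0.90}

def burn_probability(lam: float, days: int = 1) -> float:
    """Poisson probability of at least one fire over `days` days with daily mean `lam`."""
    return 1.0 - math.exp(-lam * days)

for ffdi_class, lam in lambda_by_ffdi_class.items():
    print(f"class {ffdi_class}: P(fire in a day) = {burn_probability(lam):.3f}")
```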

  19. Technological change in energy systems. Learning curves, logistic curves and input-output coefficients

    International Nuclear Information System (INIS)

    Pan, Haoran; Koehler, Jonathan

    2007-01-01

    Learning curves have recently been widely adopted in climate-economy models to incorporate endogenous change of energy technologies, replacing the conventional assumption of an autonomous energy efficiency improvement. However, there has been little consideration of the credibility of the learning curve. The current trend that many important energy and climate change policy analyses rely on the learning curve means that it is of great importance to critically examine the basis for learning curves. Here, we analyse the use of learning curves in energy technology, usually implemented as a simple power function. We find that the learning curve cannot separate the effects of price and technological change, cannot reflect continuous and qualitative change of both conventional and emerging energy technologies, cannot help to determine the time paths of technological investment, and misses the central role of R and D activity in driving technological change. We argue that a logistic curve of improving performance modified to include R and D activity as a driving variable can better describe the cost reductions in energy technologies. Furthermore, we demonstrate that the top-down Leontief technology can incorporate the bottom-up technologies that improve along either the learning curve or the logistic curve, through changing input-output coefficients. An application to UK wind power illustrates that the logistic curve fits the observed data better and implies greater potential for cost reduction than the learning curve does. (author)
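
    A minimal sketch contrasting the two functional forms discussed above on a unit-cost series: the classic learning curve (a power function of cumulative capacity) versus a logistic curve of improving performance (the paper's version additionally includes R and D activity as a driving variable, omitted here). The cost data are synthetic, not the UK wind power series.

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic experience data: cumulative installed capacity (MW) and unit cost
capacity = np.linspace(100, 5000, 30)
cost = 2000 * (capacity / capacity[0]) ** -0.32 + np.random.default_rng(8).normal(0, 30, 30)

def learning_curve(x, c0, b):
    """Power-law learning curve: cost falls as cumulative capacity grows."""
    return c0 * (x / x[0]) ** -b

def logistic_performance(x, lo, hi, k, x0):
    """Logistic improvement: cost declines from `hi` toward a floor `lo`."""
    return lo + (hi - lo) / (1.0 + np.exp(k * (x - x0)))

p_learn, _ = curve_fit(learning_curve, capacity, cost, p0=(2000.0, 0.3))
p_logis, _ = curve_fit(logistic_performance, capacity, cost,
                       p0=(500.0, 2000.0, 0.002, 1500.0), maxfev=20000)

for name, model, p in (("learning", learning_curve, p_learn),
                       ("logistic", logistic_performance, p_logis)):
    rss = np.sum((cost - model(capacity, *p)) ** 2)
    print(f"{name:8s} fit RSS = {rss:,.0f}")
```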

  20. Shape optimization of self-avoiding curves

    Science.gov (United States)

    Walker, Shawn W.

    2016-04-01

    This paper presents a softened notion of proximity (or self-avoidance) for curves. We then derive a sensitivity result, based on shape differential calculus, for the proximity. This is combined with a gradient-based optimization approach to compute three-dimensional, parameterized curves that minimize the sum of an elastic (bending) energy and a proximity energy that maintains self-avoidance by a penalization technique. Minimizers are computed by a sequential-quadratic-programming (SQP) method where the bending energy and proximity energy are approximated by a finite element method. We then apply this method to two problems. First, we simulate adsorbed polymer strands that are constrained to be bound to a surface and be (locally) inextensible. This is a basic model of semi-flexible polymers adsorbed onto a surface (a current topic in material science). Several examples of minimizing curve shapes on a variety of surfaces are shown. An advantage of the method is that it can be much faster than using molecular dynamics for simulating polymer strands on surfaces. Second, we apply our proximity penalization to the computation of ideal knots. We present a heuristic scheme, utilizing the SQP method above, for minimizing rope-length and apply it in the case of the trefoil knot. Applications of this method could be for generating good initial guesses to a more accurate (but expensive) knot-tightening algorithm.

  1. Calibration curve to establish the exposure dose at Co60 gamma radiation

    International Nuclear Information System (INIS)

    Guerrero C, C.; Brena V, M.

    2000-01-01

    Biological dosimetry is an adequate method for dose determination in cases of overexposure to ionizing radiation, or when there is doubt about the dose obtained by physical methods. It is based on the analysis of aberrations produced in the chromosomes. The behavior of chromosome lesions is of the dose-response type, and curves have been generated in different laboratories. The curve for gamma radiation produced in the laboratory of the National Institute of Nuclear Research (ININ) is presented here. (Author)

  2. Investigating Theoretical PV Energy Generation Patterns with Their Relation to the Power Load Curve in Poland

    Directory of Open Access Journals (Sweden)

    Jakub Jurasz

    2016-01-01

    Full Text Available The Polish energy sector has (almost from its origin) been dominated by fossil-fuel-fired power. This situation results from an abundance of relatively cheap coal (hard coal and lignite). Brown coal, due to its nature, is the cheapest energy source in Poland. However, hard coal, which fuels 60% of Polish power plants, is rising in price and is exposed to competition from coal imported from neighboring countries. Forced by European Union (EU) regulations, Poland is struggling to achieve its goal of covering 15% of energy consumption from renewable energy sources (RES) by 2020. Over the year 2015, RES covered 11.3% of gross energy consumption, but this generation was dominated by solid biomass (over 80%). The aim of this paper was to answer the following research questions: what is the relation of irradiation values to the power load on a yearly and daily basis, and how should photovoltaics (PV) be integrated into the Polish power system? The analysis conducted allowed us to state that there exists a negative correlation between power demand and irradiation values on a yearly basis, but this is likely to change in the future. Secondly, on average, daily values of irradiation tend to follow the power load curve over the first hours of the day.

  3. Using Light Curves to Characterize Size and Shape of Pseudo-Debris

    Science.gov (United States)

    Rodriquez, Heather M.; Abercromby, Kira J.; Jarvis, Kandy S.; Barker, Edwin

    2006-01-01

    Photometric measurements were collected for a new study aimed at estimating orbital debris sizes based on object brightness. To obtain a size from optical measurements, the current practice is to assume an albedo and use a normalized magnitude to calculate the optical size. However, assuming a single albedo value may not be valid for all objects or orbit types; material type and orientation can mask an object's true optical cross section. This experiment used a CCD camera to record data, a 300 W xenon, ozone-free collimated light source to simulate solar illumination, and a robotic arm with five degrees of freedom to move the piece of simulated debris through various orientations. The pseudo-debris pieces used in this experiment originate from the European Space Operations Centre's ESOC2 ground test explosion of a mock satellite. A uniformly illuminated white ping-pong ball was used as a zero-magnitude reference. Each debris piece was then moved through specific orientations and rotations to generate a light curve. This paper discusses the results of five different object-based light curves as measured through an x-rotation. Intensity measurements, from which each light curve was generated, were recorded in five-degree increments from zero to 180 degrees. Comparing light curves of differently shaped and sized pieces against their characteristic length establishes the start of a database from which an optical size estimation model will be derived in the future.

  4. Integrating spatial, temporal, and size probabilities for the annual landslide hazard maps in the Shihmen watershed, Taiwan

    Directory of Open Access Journals (Sweden)

    C. Y. Wu

    2013-09-01

    Full Text Available Landslide spatial, temporal, and size probabilities were used to perform a landslide hazard assessment in this study. Eleven intrinsic geomorphological factors and two extrinsic rainfall factors were evaluated as landslide susceptibility factors, based on success rate curves, landslide ratio plots, frequency distributions of landslide and non-landslide groups, and probability–probability plots. Data on landslides caused by Typhoon Aere in the Shihmen watershed were selected to train the susceptibility model. The landslide area probability, based on the power law relationship between the landslide area and a noncumulative number, was analyzed using the Pearson type 5 probability density function. The exceedance probabilities of rainfall with various recurrence intervals, including 2, 5, 10, 20, 50, 100 and 200 yr, were used to determine the temporal probabilities of the events. The study was conducted in the Shihmen watershed, which has an area of 760 km² and is one of the main water sources for northern Taiwan. The validation result of Typhoon Krosa demonstrated that this landslide hazard model could be used to predict the landslide probabilities. The results suggested that integration of spatial, area, and exceedance probabilities to estimate the annual probability of each slope unit is feasible. The advantage of this annual landslide probability model lies in its ability to estimate the annual landslide risk, instead of a scenario-based risk.
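
    The core combination step described above can be illustrated with a short sketch: if the spatial, temporal, and size components are treated as independent, the hazard probability of a slope unit for one rainfall scenario is simply their product. This is a simplified illustration with hypothetical values, not the authors' exact integration.

        def landslide_hazard_probability(p_spatial, p_temporal, p_size):
            """Hazard probability of one slope unit for a single rainfall scenario:
            spatial probability (susceptibility) x temporal probability (annual
            exceedance of the triggering rainfall) x size (area) probability.
            Independence of the three components is assumed for this sketch."""
            return p_spatial * p_temporal * p_size

        # Hypothetical values: a susceptible unit, a 10-yr rainfall (annual exceedance 0.1),
        # and a 0.25 probability that the landslide area exceeds the damaging threshold.
        print(landslide_hazard_probability(p_spatial=0.6, p_temporal=0.1, p_size=0.25))  # 0.015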

  5. Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces

    International Nuclear Information System (INIS)

    Vourdas, A.

    2014-01-01

    The orthocomplemented modular lattice of subspaces L[H(d)], of a quantum system with d-dimensional Hilbert space H(d), is considered. A generalized additivity relation which holds for Kolmogorov probabilities is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H₁, H₂), which quantifies deviations from Kolmogorov probability theory, is introduced, and it is shown to be intimately related to the commutator of the projectors P(H₁), P(H₂) onto the subspaces H₁, H₂. As an application, it is shown that the proof of the inequalities of Clauser, Horne, Shimony, and Holt for a system of two spin-1/2 particles is valid for Kolmogorov probabilities, but it is not valid for Dempster-Shafer probabilities. The violation of these inequalities in experiments supports the interpretation of quantum probabilities as Dempster-Shafer probabilities

  6. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  7. The novel composite mechanism of ammonium molybdophosphate loaded on silica matrix and its ion exchange breakthrough curves for cesium

    Energy Technology Data Exchange (ETDEWEB)

    Deng, Hao [State Key Laboratory Cultivation Base for Nonmetal Composites and Functional Materials, Southwest University of Science and Technology, Mianyang 621010 (China); Li, Yuxiang, E-mail: superfigure@163.com [State Key Laboratory Cultivation Base for Nonmetal Composites and Functional Materials, Southwest University of Science and Technology, Mianyang 621010 (China); School of Materials Science and Engineering, Southwest University of Science and Technology, Mianyang 621010 (China); National Defense Key Discipline Laboratory for Nuclear Wastes and Environmental Safety, Southwest University of Science and Technology, Mianyang 621010 (China); Wu, Lang [State Key Laboratory Cultivation Base for Nonmetal Composites and Functional Materials, Southwest University of Science and Technology, Mianyang 621010 (China); Ma, Xue [School of Materials Science and Engineering, Southwest University of Science and Technology, Mianyang 621010 (China)

    2017-02-15

    Highlights: • The granular composites were fabricated by a sequential annealing mechanism. • The method controls the porous characteristics and stabilizes the structure of the materials. • The breakthrough curve of Cs⁺ follows the Thomas model with a high removal rate. • It is probable that SM-AMP20 can be regenerated for Cs⁺ recovery using an eluent of 2–3 mol/L NH₄NO₃. - Abstract: Long-lived ¹³⁷Cs (half-life over 30 years), a byproduct of spent fuel fission processes, comprises the majority of the high-level, heat-generating waste in downstream processing. This study reports a novel sequential annealing mechanism with a cross-linked network of polyvinyl alcohol, fabricating a composite of ammonium molybdophosphate loaded on a silica matrix (SM-AMP20, 20 wt% AMP) as an excellent granular ion exchanger for the removal of Cs⁺. When the matrix is sequentially annealed, well-dispersed SM-AMP20 particles form and anchor themselves firmly, which controls the porous characteristics and stabilizes the structure. The material crystallizes in the complex cubic space group Pn-3m with the cell parameters of crystalline AMP. The breakthrough curve of Cs⁺ on SM-AMP20 follows the Thomas model with a high removal rate of 88.23% (∼10 mg/L of Cs⁺) and a breakthrough time as high as 26 h (flow rate Q ≈ 2.5 mL/min and bed height Z ≈ 11 cm) at neutral pH. We also report that the sorbent can efficiently remove Cs⁺ ions from complex solutions containing different competitive cations (Na⁺, Al³⁺, Fe³⁺, and Ni²⁺, respectively) in large excess. Furthermore, this study shows that there is a probability for SM-AMP20 to recycle cesium using an eluent of 2–3 mol/L NH₄NO₃ solution.
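
    For readers unfamiliar with the Thomas model mentioned above, a minimal sketch of its breakthrough-curve form follows. The rate constant, sorption capacity, and bed mass below are assumed for illustration and are not the fitted values from this study; only C0 and Q loosely follow the reported conditions.

        import numpy as np

        def thomas_breakthrough(t_min, k_th, q0, m, c0, flow_q):
            """Thomas model breakthrough curve:
            C_t/C_0 = 1 / (1 + exp(k_th*q0*m/Q - k_th*C0*t)).
            k_th: rate constant (L/(mg*min)), q0: sorption capacity (mg/g),
            m: sorbent mass (g), c0: feed concentration (mg/L), flow_q: flow rate (L/min)."""
            t = np.asarray(t_min, dtype=float)
            return 1.0 / (1.0 + np.exp(k_th * q0 * m / flow_q - k_th * c0 * t))

        # Illustrative (assumed) parameters; C0 ~ 10 mg/L and Q ~ 2.5 mL/min follow the abstract.
        t = np.linspace(0, 4000, 9)  # minutes
        print(thomas_breakthrough(t, k_th=2e-4, q0=10.0, m=5.0, c0=10.0, flow_q=0.0025))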

  8. Moving beyond probabilities – Strength of knowledge characterisations applied to security

    International Nuclear Information System (INIS)

    Askeland, Tore; Flage, Roger; Aven, Terje

    2017-01-01

    Many security experts avoid the concept of probability when assessing risk and vulnerabilities. Their main argument is that meaningful probabilities cannot be determined and they are consequently not useful for decision-making and security management. However, to give priority to some measures and not others, the likelihood dimension needs to be addressed in some way; the question is how. One approach receiving attention recently is to add strength of knowledge judgements to the probabilities and probability intervals generated. The judgements provide a qualitative labelling of how strong the knowledge supporting the probability assignments is. Criteria for such labelling have been developed, but not for a security setting. The purpose of this paper is to develop such criteria specific to security applications and, using some examples, to demonstrate their suitability. - Highlights: • The concept of probability is often avoided in security risk assessments. • We argue that the likelihood/probability dimension needs to be somehow addressed. • Probabilities should be supplemented by qualitative strength-of-knowledge scores. • Such criteria specific to security applications are developed. • Two examples are used to demonstrate the suitability of the suggested criteria.

  9. Towards freeform curved blazed gratings using diamond machining

    Science.gov (United States)

    Bourgenot, C.; Robertson, D. J.; Stelter, D.; Eikenberry, S.

    2016-07-01

    Concave blazed gratings greatly simplify the architecture of spectrographs by reducing the number of optical components. The production of these gratings using diamond machining offers practically no limits on the design of the grating substrate shape, with the possibility of making large-sag freeform surfaces, unlike the alternative and traditional method of holography and ion etching. In this paper, we report on the technological challenges and progress in the making of these curved blazed gratings using an ultra-high-precision five-axis Moore-Nanotech machine. We describe their implementation in an integral field unit prototype called IGIS (Integrated Grating Imaging Spectrograph), where freeform curved gratings are used as pupil mirrors. The goal is to develop the technologies for the production of the next generation of low-cost, compact, high-performance integral field unit spectrometers.

  10. Induced gravity in quantum theory in a curved space

    International Nuclear Information System (INIS)

    Etim, E.

    1983-01-01

    The reason for interest in the unorthodox view of first-order (in the scalar curvature R(x)) gravity as a matter-induced quantum effect is really to find an argument not to quantise it. According to this view, quantum gravity should be constructed with an action which is, at least, quadratic in the scalar curvature R(x). Such a theory would not contain a dimensional parameter, like Newton's constant, and would probably be renormalisable. This lecture is intended to acquaint the non-expert with the phenomenon of induction of the scalar curvature term in the matter Lagrangian in a curved space, in both relativistic and non-relativistic quantum theories

  11. Computerised curve deconvolution of TL/OSL curves using a popular spreadsheet program.

    Science.gov (United States)

    Afouxenidis, D; Polymeris, G S; Tsirliganis, N C; Kitis, G

    2012-05-01

    This paper exploits the possibility of using commercial software for thermoluminescence and optically stimulated luminescence curve deconvolution analysis. The widely used software package Microsoft Excel, with its Solver utility, has been used to perform deconvolution analysis on both experimental and reference glow curves resulting from the GLOw Curve ANalysis INtercomparison project. The simple interface of this programme, combined with the powerful Solver utility, allows complex stimulated luminescence curves to be resolved into their components and the associated luminescence parameters to be evaluated.

  12. Computerised curve deconvolution of TL/OSL curves using a popular spreadsheet program

    International Nuclear Information System (INIS)

    Afouxenidis, D.; Polymeris, G. S.; Tsirliganis, N. C.; Kitis, G.

    2012-01-01

    This paper exploits the possibility of using commercial software for thermoluminescence and optically stimulated luminescence curve deconvolution analysis. The widely used software package Microsoft Excel, with its Solver utility, has been used to perform deconvolution analysis on both experimental and reference glow curves resulting from the Glow Curve Analysis Intercomparison project. The simple interface of this programme, combined with the powerful Solver utility, allows complex stimulated luminescence curves to be resolved into their components and the associated luminescence parameters to be evaluated. (authors)

  13. Contributions to quantum probability

    International Nuclear Information System (INIS)

    Fritz, Tobias

    2010-01-01

    distribution of a quantum-mechanical von Neumann measurement with postselection, given that the scalar product between the initial and the final state is known as well as the success probability of the postselection. An intermediate von Neumann measurement can enhance transition probabilities between states such that the error probability shrinks by a factor of up to 2. Chapter 4: A presentation of the category of stochastic matrices. This chapter gives generators and relations for the strict monoidal category of probabilistic maps on finite cardinals (i.e., stochastic matrices). Chapter 5: Convex Spaces: Definition and Examples. We try to promote convex spaces as an abstract concept of convexity which was introduced by Stone as ''barycentric calculus''. A convex space is a set where one can take convex combinations in a consistent way. By identifying the corresponding Lawvere theory as the category from chapter 4 and using the results obtained there, we give a different proof of a result of Swirszcz which shows that convex spaces can be identified with algebras of a finitary version of the Giry monad. After giving an extensive list of examples of convex sets as they appear throughout mathematics and theoretical physics, we note that there also exist convex spaces that cannot be embedded into a vector space: semilattices are a class of examples of purely combinatorial type. In an information-theoretic interpretation, convex subsets of vector spaces are probabilistic, while semilattices are possibilistic. Convex spaces unify these two concepts. (orig.)

  14. The Effect of High Frequency Pulse on the Discharge Probability in Micro EDM

    Science.gov (United States)

    Liu, Y.; Qu, Y.; Zhang, W.; Ma, F.; Sha, Z.; Wang, Y.; Rolfe, B.; Zhang, S.

    2017-12-01

    High frequency pulses improve the machining efficiency of micro electric discharge machining (micro EDM), but they also change the micro EDM process. This paper focuses on the influence of the skin effect under high frequency pulses on energy distribution and transmission in micro EDM, and on this basis the rules governing the discharge probability over the electrode end face are analysed. Starting from the electrical discharge process under high frequency pulse conditions in micro EDM, COMSOL Multiphysics software is used to establish an energy transmission model of the micro electrode. The discharge energy distribution and transmission within the tool electrode under different pulse frequencies, electrical currents, and permeabilities are studied in order to obtain the distribution of current density and electric field intensity on the electrode end face as the electrical parameters change. The electric field intensity distribution is taken as the parameter governing the discharge probability on the electrode end face. Finally, MATLAB is used to fit the curve and obtain the discharge probability distribution over the electrode end face.
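
    The skin effect invoked above is governed by the classical skin depth, the depth at which the current density drops to 1/e of its surface value. A quick sketch follows; the electrode resistivity and pulse frequency are illustrative assumptions, not values from the paper.

        import math

        def skin_depth(frequency_hz, resistivity_ohm_m, mu_r=1.0):
            """Classical skin depth: delta = sqrt(rho / (pi * f * mu0 * mu_r)).
            At high pulse frequencies delta shrinks, so the discharge energy
            concentrates near the surface of the tool electrode."""
            mu0 = 4.0e-7 * math.pi  # vacuum permeability (H/m)
            return math.sqrt(resistivity_ohm_m / (math.pi * frequency_hz * mu0 * mu_r))

        # Copper-like electrode (rho ~ 1.7e-8 ohm*m) at an assumed 1 MHz pulse frequency
        print(f"skin depth ~ {skin_depth(1e6, 1.7e-8) * 1e6:.0f} um")  # roughly 65 um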

  15. Experimental Method for Plotting S-N Curve with a Small Number of Specimens

    Directory of Open Access Journals (Sweden)

    Strzelecki Przemysław

    2016-12-01

    Full Text Available The study presents two approaches to plotting an S-N curve based on experimental results. The first approach is commonly used by researchers and presented in detail in many studies and standard documents. The model uses a linear regression whose parameters are estimated by the least squares method. A staircase method is used for the unlimited fatigue life criterion. The second model combines an S-N curve defined as a straight line with a description of the random occurrence of the fatigue limit. A maximum likelihood method is used to estimate the S-N curve parameters. Fatigue data for C45+C steel obtained in a torsional bending test were used to compare the estimated S-N curves. For pseudo-random numbers generated by the Mersenne Twister algorithm, the S-N curve estimated from 10 experimental results using the second model predicts the fatigue life within a scatter band of factor 3. The result is a good approximation, especially considering the time required to plot the S-N curve.
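
    The first approach mentioned above, a least-squares straight line through log-transformed fatigue data, can be sketched in a few lines. The stress and life values below are invented for illustration and are not the C45+C data from the study.

        import numpy as np

        def fit_sn_line(stress_amplitude, cycles_to_failure):
            """Least-squares fit of a straight-line S-N curve, log10(N) = a + b*log10(S),
            i.e. the linear-regression approach described in the abstract."""
            x = np.log10(stress_amplitude)
            y = np.log10(cycles_to_failure)
            b, a = np.polyfit(x, y, 1)  # slope, intercept
            return a, b

        # Hypothetical torsional-bending results: stress amplitude (MPa) vs. cycles to failure
        S = np.array([300.0, 280.0, 260.0, 240.0, 220.0])
        N = np.array([8.0e4, 1.5e5, 3.2e5, 7.5e5, 1.9e6])
        a, b = fit_sn_line(S, N)
        print(f"log10(N) = {a:.2f} + ({b:.2f}) * log10(S)")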

  16. Estimation of post-test probabilities by residents: Bayesian reasoning versus heuristics?

    Science.gov (United States)

    Hall, Stacey; Phang, Sen Han; Schaefer, Jeffrey P; Ghali, William; Wright, Bruce; McLaughlin, Kevin

    2014-08-01

    Although the process of diagnosing invariably begins with a heuristic, we encourage our learners to support their diagnoses by analytical cognitive processes, such as Bayesian reasoning, in an attempt to mitigate the effects of heuristics on diagnosing. There are, however, limited data on the use and impact of Bayesian reasoning on the accuracy of disease probability estimates. In this study our objective was to explore whether Internal Medicine residents use a Bayesian process to estimate disease probabilities by comparing their disease probability estimates to literature-derived Bayesian post-test probabilities. We gave 35 Internal Medicine residents four clinical vignettes in the form of a referral letter and asked them to estimate the post-test probability of the target condition in each case. We then compared these to literature-derived probabilities. For each vignette the estimated probability was significantly different from the literature-derived probability. For the two cases with low literature-derived probability our participants significantly overestimated the probability of these target conditions being the correct diagnosis, whereas for the two cases with high literature-derived probability the estimated probability was significantly lower than the calculated value. Our results suggest that residents generate inaccurate post-test probability estimates. Possible explanations for this include ineffective application of Bayesian reasoning, attribute substitution whereby a complex cognitive task is replaced by an easier one (e.g., a heuristic), or systematic rater bias, such as central tendency bias. Further studies are needed to identify the reasons for inaccuracy of disease probability estimates and to explore ways of improving accuracy.
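
    The literature-derived benchmark used in this kind of comparison is simply Bayes' theorem in odds form. A minimal sketch; the pre-test probability and likelihood ratio below are illustrative, not values from the vignettes.

        def post_test_probability(pretest_prob, likelihood_ratio):
            """Post-test probability via Bayes' theorem in odds form:
            post-test odds = pre-test odds * likelihood ratio."""
            pretest_odds = pretest_prob / (1.0 - pretest_prob)
            posttest_odds = pretest_odds * likelihood_ratio
            return posttest_odds / (1.0 + posttest_odds)

        # Example: pre-test probability 0.20 and a positive test with likelihood ratio 6
        print(round(post_test_probability(0.20, 6.0), 2))  # 0.60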

  17. Approaches to Evaluating Probability of Collision Uncertainty

    Science.gov (United States)

    Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    While the two-dimensional probability of collision (Pc) calculation has served as the main input to conjunction analysis risk assessment for over a decade, it has done this mostly as a point estimate, with relatively little effort made to produce confidence intervals on the Pc value based on the uncertainties in the inputs. The present effort seeks to try to carry these uncertainties through the calculation in order to generate a probability density of Pc results rather than a single average value. Methods for assessing uncertainty in the primary and secondary objects' physical sizes and state estimate covariances, as well as a resampling approach to reveal the natural variability in the calculation, are presented; and an initial proposal for operationally-useful display and interpretation of these data for a particular conjunction is given.

  18. W-curve alignments for HIV-1 genomic comparisons.

    Directory of Open Access Journals (Sweden)

    Douglas J Cork

    2010-06-01

    Full Text Available The W-curve was originally developed as a graphical visualization technique for viewing DNA and RNA sequences. Its ability to render features of DNA also makes it suitable for computational studies. Its main advantage in this area is utilizing a single-pass algorithm for comparing the sequences. Avoiding recursion during sequence alignments offers advantages for speed and in-process resources. The graphical technique also allows for multiple models of comparison to be used depending on the nucleotide patterns embedded in similar whole genomic sequences. The W-curve approach allows us to compare large numbers of samples quickly. We are currently tuning the algorithm to accommodate quirks specific to HIV-1 genomic sequences so that it can be used to aid in diagnostic and vaccine efforts. Tracking the molecular evolution of the virus has been greatly hampered by gap associated problems predominantly embedded within the envelope gene of the virus. Gaps and hypermutation of the virus slow conventional string based alignments of the whole genome. This paper describes the W-curve algorithm itself, and how we have adapted it for comparison of similar HIV-1 genomes. A treebuilding method is developed with the W-curve that utilizes a novel Cylindrical Coordinate distance method and gap analysis method. HIV-1 C2-V5 env sequence regions from a Mother/Infant cohort study are used in the comparison. The output distance matrix and neighbor results produced by the W-curve are functionally equivalent to those from Clustal for C2-V5 sequences in the mother/infant pairs infected with CRF01_AE. Significant potential exists for utilizing this method in place of conventional string based alignment of HIV-1 genomes, such as Clustal X. With W-curve heuristic alignment, it may be possible to obtain clinically useful results in a short time, short enough to affect clinical choices for acute treatment. A description of the W-curve generation process, including a comparison

  19. W-curve alignments for HIV-1 genomic comparisons.

    Science.gov (United States)

    Cork, Douglas J; Lembark, Steven; Tovanabutra, Sodsai; Robb, Merlin L; Kim, Jerome H

    2010-06-01

    The W-curve was originally developed as a graphical visualization technique for viewing DNA and RNA sequences. Its ability to render features of DNA also makes it suitable for computational studies. Its main advantage in this area is utilizing a single-pass algorithm for comparing the sequences. Avoiding recursion during sequence alignments offers advantages for speed and in-process resources. The graphical technique also allows for multiple models of comparison to be used depending on the nucleotide patterns embedded in similar whole genomic sequences. The W-curve approach allows us to compare large numbers of samples quickly. We are currently tuning the algorithm to accommodate quirks specific to HIV-1 genomic sequences so that it can be used to aid in diagnostic and vaccine efforts. Tracking the molecular evolution of the virus has been greatly hampered by gap associated problems predominantly embedded within the envelope gene of the virus. Gaps and hypermutation of the virus slow conventional string based alignments of the whole genome. This paper describes the W-curve algorithm itself, and how we have adapted it for comparison of similar HIV-1 genomes. A treebuilding method is developed with the W-curve that utilizes a novel Cylindrical Coordinate distance method and gap analysis method. HIV-1 C2-V5 env sequence regions from a Mother/Infant cohort study are used in the comparison. The output distance matrix and neighbor results produced by the W-curve are functionally equivalent to those from Clustal for C2-V5 sequences in the mother/infant pairs infected with CRF01_AE. Significant potential exists for utilizing this method in place of conventional string based alignment of HIV-1 genomes, such as Clustal X. With W-curve heuristic alignment, it may be possible to obtain clinically useful results in a short time, short enough to affect clinical choices for acute treatment. A description of the W-curve generation process, including a comparison technique of

  20. Hydra-Ring: a computational framework to combine failure probabilities

    Science.gov (United States)

    Diermanse, Ferdinand; Roscoe, Kathryn; IJmker, Janneke; Mens, Marjolein; Bouwer, Laurens

    2013-04-01

    This presentation discusses the development of a new computational framework for the safety assessment of flood defence systems: Hydra-Ring. Hydra-Ring computes the failure probability of a flood defence system, which is composed of a number of elements (e.g., dike segments, dune segments or hydraulic structures), taking all relevant uncertainties explicitly into account. This is a major step forward in comparison with the current Dutch practice, in which the safety assessment is done separately for each individual flood defence section. The main advantage of the new approach is that it results in a more balanced prioritization of required mitigating measures ('more value for money'). Failure of the flood defence system occurs if any element within the system fails. Hydra-Ring thus computes and combines failure probabilities over the following dimensions: - Failure mechanisms: a flood defence system can fail due to different failure mechanisms. - Time periods: failure probabilities are first computed for relatively small time scales (…). Besides the assessment of flood defence systems, Hydra-Ring can also be used to derive fragility curves, to assess the efficiency of flood mitigating measures, and to quantify the impact of climate change and land subsidence on flood risk. Hydra-Ring is being developed in the context of the Dutch situation. However, the computational concept is generic and the model is set up in such a way that it can be applied to other areas as well. The presentation will focus on the model concept and probabilistic computation techniques.
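
    The statement that the system fails if any element fails corresponds to a series-system combination of failure probabilities. A simplified sketch under an independence assumption follows; Hydra-Ring itself accounts for dependence between elements, mechanisms, and time periods, which this toy calculation does not.

        import numpy as np

        def series_system_failure_probability(element_probs):
            """Failure probability of a series system (fails if any element fails),
            assuming independent elements: P_sys = 1 - prod(1 - p_i)."""
            p = np.asarray(element_probs, dtype=float)
            return 1.0 - np.prod(1.0 - p)

        # Hypothetical annual failure probabilities of three dike segments and a sluice
        probs = [1e-4, 5e-5, 2e-4, 1e-3]
        print(f"system       : {series_system_failure_probability(probs):.2e}")
        print(f"simple bounds: [{max(probs):.2e}, {sum(probs):.2e}]")  # max <= P_sys <= sum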

  1. Effect of β on Seismic Vulnerability Curve for RC Bridge Based on Double Damage Criterion

    International Nuclear Information System (INIS)

    Feng Qinghai; Yuan Wancheng

    2010-01-01

    In the analysis of seismic vulnerability curves based on a double damage criterion, both the randomness of the structural parameters and the randomness of the seismic input should be considered. Firstly, the distribution characteristics of structural capacity and seismic demand are obtained based on IDA and PUSHOVER analyses; secondly, the vulnerability of the bridge is obtained based on ANN and MC, and a vulnerability curve for this bridge and seismic input is drawn. Finally, the analysis of a continuous bridge is presented as an example, and a parametric analysis of the effect of β is carried out. This reflects the overall bridge vulnerability from the point of view of total probability, and in order to reduce the discreteness, large values of β are suggested.
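
    A generic lognormal fragility curve makes the role of β (the logarithmic dispersion) easy to see: larger β flattens the curve. This is a common textbook form, not the authors' specific double-damage-criterion model, and the numbers are illustrative.

        from math import erf, log, sqrt

        def fragility(im, theta, beta):
            """Lognormal fragility curve P(damage | IM) = Phi(ln(IM / theta) / beta),
            with median capacity theta and logarithmic standard deviation beta."""
            z = log(im / theta) / beta
            return 0.5 * (1.0 + erf(z / sqrt(2.0)))

        # Probability of damage at IM = 0.4 g for a median capacity of 0.5 g
        for beta in (0.3, 0.5, 0.7):
            print(f"beta = {beta}: {fragility(im=0.4, theta=0.5, beta=beta):.3f}")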

  2. Eddy current testing device for metallic tubes at least locally curved

    International Nuclear Information System (INIS)

    Pigeon, Marcel; Vienot, Claude.

    1975-01-01

    Steam generators, condensers and heat exchangers generally consist of metallic tube bundles, the tubes having a complex geometry. The invention concerns an Eddy current testing device for metallic tubes at least locally curved, operating by translation of a probe inside the tubes [fr

  3. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  4. STADIC: a computer code for combining probability distributions

    International Nuclear Information System (INIS)

    Cairns, J.J.; Fleming, K.N.

    1977-03-01

    The STADIC computer code uses a Monte Carlo simulation technique for combining probability distributions. The specific function for combining the input distributions is defined by the user by introducing the appropriate FORTRAN statements into the appropriate subroutine. The code generates a Monte Carlo sampling from each of the input distributions and combines these according to the user-supplied function to provide, in essence, a random sampling of the combined distribution. When the desired number of samples is obtained, the output routine calculates the mean, standard deviation, and confidence limits for the resultant distribution. This method of combining probability distributions is particularly useful in cases where analytical approaches are either too difficult or undefined
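
    The sampling loop described above is easy to express outside FORTRAN as well. A minimal Python sketch of the same idea; the input distributions and the combination function here are hypothetical, not STADIC's.

        import numpy as np

        rng = np.random.default_rng(0)

        def combine_distributions(user_function, samplers, n_samples=100_000, ci=0.90):
            """Monte Carlo combination of probability distributions: draw samples from
            each input distribution, apply the user-supplied combination function, and
            summarise the resulting distribution (mean, std, confidence limits)."""
            samples = [draw(n_samples) for draw in samplers]
            combined = user_function(*samples)
            lo, hi = np.percentile(combined, [50 * (1 - ci), 50 * (1 + ci)])
            return combined.mean(), combined.std(ddof=1), (lo, hi)

        # Hypothetical inputs: a lognormal failure rate times a normally distributed exposure time
        samplers = [lambda n: rng.lognormal(mean=-2.0, sigma=0.5, size=n),
                    lambda n: rng.normal(loc=8.0, scale=1.0, size=n)]
        mean, std, (lo, hi) = combine_distributions(lambda rate, hours: rate * hours, samplers)
        print(f"mean = {mean:.3f}, std = {std:.3f}, 90% interval = ({lo:.3f}, {hi:.3f})")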

  5. Transition probabilities and dissociation energies of MnH and MnD molecules

    International Nuclear Information System (INIS)

    Nagarajan, K.; Rajamanickam, N.

    1997-01-01

    The Franck-Condon factors (vibrational transition probabilities) and r-centroids have been evaluated by the more reliable numerical integration procedure for the bands of the A-X system of the MnH and MnD molecules, using a suitable potential. By fitting the Hulburt-Hirschfelder function to the experimental potential curve using the correlation coefficient, the dissociation energies for the electronic ground states of MnH and MnD have been estimated as D₀⁰ = 251 ± 5 kJ·mol⁻¹ and D₀⁰ = 312 ± 6 kJ·mol⁻¹, respectively. (authors)

  6. Statistical models based on conditional probability distributions

    International Nuclear Information System (INIS)

    Narayanan, R.S.

    1991-10-01

    We present a formulation of statistical mechanics models based on conditional probability distributions rather than a Hamiltonian. We show that it is possible to realize critical phenomena through this procedure. Closely linked with this formulation is a Monte Carlo algorithm in which a generated configuration is guaranteed to be statistically independent from any other configuration for all values of the parameters, in particular near the critical point. (orig.)

  7. On the Inclusion of Short-distance Bystander Effects into a Logistic Tumor Control Probability Model.

    Science.gov (United States)

    Tempel, David G; Brodin, N Patrik; Tomé, Wolfgang A

    2018-01-01

    Currently, interactions between voxels are neglected in the tumor control probability (TCP) models used in biologically-driven intensity-modulated radiotherapy treatment planning. However, experimental data suggests that this may not always be justified when bystander effects are important. We propose a model inspired by the Ising model, a short-range interaction model, to investigate if and when it is important to include voxel to voxel interactions in biologically-driven treatment planning. This Ising-like model for TCP is derived by first showing that the logistic model of tumor control is mathematically equivalent to a non-interacting Ising model. Using this correspondence, the parameters of the logistic model are mapped to the parameters of an Ising-like model and bystander interactions are introduced as a short-range interaction as is the case for the Ising model. As an example, we apply the model to study the effect of bystander interactions in the case of radiation therapy for prostate cancer. The model shows that it is adequate to neglect bystander interactions for dose distributions that completely cover the treatment target and yield TCP estimates that lie in the shoulder of the dose response curve. However, for dose distributions that yield TCP estimates that lie on the steep part of the dose response curve or for inhomogeneous dose distributions having significant hot and/or cold regions, bystander effects may be important. Furthermore, the proposed model highlights a previously unexplored and potentially fruitful connection between the fields of statistical mechanics and tumor control probability/normal tissue complication probability modeling.
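
    As a reference point, the non-interacting baseline that such an Ising-like model reduces to is an ordinary logistic dose-response. One common parameterisation is sketched below; the D50 and γ50 values are illustrative, not those used in the paper.

        import numpy as np

        def logistic_tcp(dose, d50, gamma50):
            """Voxel-independent logistic tumor control probability,
            TCP(D) = 1 / (1 + (D50 / D)**(4 * gamma50)),
            where D50 gives 50% control and gamma50 is the normalised slope at D50."""
            d = np.asarray(dose, dtype=float)
            return 1.0 / (1.0 + (d50 / d) ** (4.0 * gamma50))

        doses = np.array([60.0, 70.0, 78.0, 86.0])  # Gy, illustrative dose levels
        print(logistic_tcp(doses, d50=70.0, gamma50=2.0))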

  8. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data

    Science.gov (United States)

    Gloger, Oliver; Tönnies, Klaus; Mensel, Birger; Völzke, Henry

    2015-11-01

    In epidemiological studies as well as in clinical practice the amount of produced medical image data strongly increased in the last decade. In this context organ segmentation in MR volume data gained increasing attention for medical applications. Especially in large-scale population-based studies organ volumetry is highly relevant requiring exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automatized methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-stepped probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are refined subsequently by using several, extended segmentation strategies. We present a three class-based support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high quality subject-specific parenchyma probability maps. Several refinement strategies including a final shape-based 3D level set segmentation technique are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from parenchymal volume, which is important to analyze renal functions. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches.

  9. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data

    International Nuclear Information System (INIS)

    Gloger, Oliver; Völzke, Henry; Tönnies, Klaus; Mensel, Birger

    2015-01-01

    In epidemiological studies as well as in clinical practice the amount of produced medical image data strongly increased in the last decade. In this context organ segmentation in MR volume data gained increasing attention for medical applications. Especially in large-scale population-based studies organ volumetry is highly relevant requiring exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automatized methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-stepped probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are refined subsequently by using several, extended segmentation strategies. We present a three class-based support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high quality subject-specific parenchyma probability maps. Several refinement strategies including a final shape-based 3D level set segmentation technique are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from parenchymal volume, which is important to analyze renal functions. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches. (paper)

  10. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
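
    The article's specific problems are not reproduced here, but one classic example of this phenomenon is the probability that a random permutation has no fixed point, which tends to 1/e ≈ 0.3679 as n grows. A quick simulation:

        import random

        def derangement_probability(n, trials=200_000):
            """Estimate the probability that a random permutation of n items has no
            fixed point (a derangement); for large n this approaches 1/e ~ 0.3679."""
            hits = 0
            items = list(range(n))
            for _ in range(trials):
                perm = items[:]
                random.shuffle(perm)
                if all(perm[i] != i for i in items):
                    hits += 1
            return hits / trials

        print(derangement_probability(10))  # close to 0.3679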

  11. Extension of Ko Straight-Beam Displacement Theory to Deformed Shape Predictions of Slender Curved Structures

    Science.gov (United States)

    Ko, William L.; Fleischer, Van Tran

    2011-01-01

    The Ko displacement theory originally developed for shape predictions of straight beams is extended to shape predictions of curved beams. The surface strains needed for shape predictions were analytically generated from finite-element nodal stress outputs. With the aid of finite-element displacement outputs, mathematical functional forms for curvature-effect correction terms are established and incorporated into straight-beam deflection equations for shape predictions of both cantilever and two-point supported curved beams. The newly established deflection equations for cantilever curved beams could provide quite accurate shape predictions for different cantilever curved beams, including the quarter-circle cantilever beam. Furthermore, the newly formulated deflection equations for two-point supported curved beams could provide accurate shape predictions for a range of two-point supported curved beams, including the full-circular ring. Accuracy of the newly developed curved-beam deflection equations is validated through shape prediction analysis of curved beams embedded in the windward shallow spherical shell of a generic crew exploration vehicle. A single-point collocation method for optimization of shape predictions is discussed in detail

  12. Bearing Diagnostics of Hydro Power Plants Using Wavelet Packet Transform and a Hidden Markov Model with Orbit Curves

    Directory of Open Access Journals (Sweden)

    Gabriel Pino

    2018-01-01

    Full Text Available The contribution of a medium-sized hydro power plant to the power grid can be either at base load or at peak load. When the latter is the most common operation mode, it increases the start and stop frequency, intensifying the hydro turbine components’ degradation, such as the guide bearings. This happens due to more frequent operation in transient states, which means being outside the service point of the machines’ nominal condition, consisting of speed, flow, and gross head. Such transient state operation increases the runner bearings’ mechanical vibration. The readings are acquired during the runner start-ups and filtered by a DC component mean value and a wavelet packet transform. The filtered series are used to estimate the relationship between the maximum orbit curve displacement and the accumulated operating hours. The estimated equation associated with the ISO 7919-5 vibration standards establishes the sojourn times of the degradation states, sufficient to obtain the transition probability distribution. Thereafter, a triangular probability function is used to determine the observation probability distribution in each state. Both matrices are inputs required by a hidden Markov model aiming to simulate the equipment deterioration process, given a sequence of maximum orbit curve displacements.

  13. Approximation by planar elastic curves

    DEFF Research Database (Denmark)

    Brander, David; Gravesen, Jens; Nørbjerg, Toke Bjerge

    2016-01-01

    We give an algorithm for approximating a given plane curve segment by a planar elastic curve. The method depends on an analytic representation of the space of elastic curve segments, together with a geometric method for obtaining a good initial guess for the approximating curve. A gradient-driven optimization is then used to find the approximating elastic curve.

  14. Selection of risk reduction portfolios under interval-valued probabilities

    International Nuclear Information System (INIS)

    Toppila, Antti; Salo, Ahti

    2017-01-01

    A central problem in risk management is that of identifying the optimal combination (or portfolio) of improvements that enhance the reliability of the system most through reducing failure event probabilities, subject to the availability of resources. This optimal portfolio can be sensitive with regard to epistemic uncertainties about the failure events' probabilities. In this paper, we develop an optimization model to support the allocation of resources to improvements that mitigate risks in coherent systems in which interval-valued probabilities defined by lower and upper bounds are employed to capture epistemic uncertainties. Decision recommendations are based on portfolio dominance: a resource allocation portfolio is dominated if there exists another portfolio that improves system reliability (i) at least as much for all feasible failure probabilities and (ii) strictly more for some feasible probabilities. Based on non-dominated portfolios, recommendations about improvements to implement are derived by inspecting in how many non-dominated portfolios a given improvement is contained. We present an exact method for computing the non-dominated portfolios. We also present an approximate method that simplifies the reliability function using total order interactions so that larger problem instances can be solved with reasonable computational effort. - Highlights: • Reliability allocation under epistemic uncertainty about probabilities. • Comparison of alternatives using dominance. • Computational methods for generating the non-dominated alternatives. • Deriving decision recommendations that are robust with respect to epistemic uncertainty.

  15. ECONOMIC GROWTH AND AIR POLLUTION IN THE CZECH REPUBLIC: DECOUPLING CURVES

    Directory of Open Access Journals (Sweden)

    Petr Šauer

    2012-07-01

    Full Text Available The decoupling curve, together with the Environmental Kuznets Curve, has been recognized as one of the important indicators showing relations between economic growth and environmental degradation/pollution. Many studies, both theoretical and empirical, have been published on it. Our paper brings models which investigate relations between economic growth per capita and selected indicators of air pollution in the Czech Republic. The analysis tried to go back before the year 1990, despite the difficulties of dealing with the different macroeconomic indicators published during the socialist period and those introduced after the transition to a market economy. The results might be somewhat surprising for those dealing only with data generated after the year 1990: it is possible to discover the turning points for some of the airborne pollutants already in the 1980s.

  16. Bragg Curve Spectroscopy

    International Nuclear Information System (INIS)

    Gruhn, C.R.

    1981-05-01

    An alternative utilization is presented for the gaseous ionization chamber in the detection of energetic heavy ions, which is called Bragg Curve Spectroscopy (BCS). Conceptually, BCS involves using the maximum data available from the Bragg curve of the stopping heavy ion (HI) for purposes of identifying the particle and measuring its energy. A detector has been designed that measures the Bragg curve with high precision. From the Bragg curve the range from the length of the track, the total energy from the integral of the specific ionization over the track, the dE/dx from the specific ionization at the beginning of the track, and the Bragg peak from the maximum of the specific ionization of the HI are determined. This last signal measures the atomic number, Z, of the HI unambiguously

  17. A residual life prediction model based on the generalized σ-N curved surface

    OpenAIRE

    Zongwen AN; Xuezong BAI; Jianxiong GAO

    2016-01-01

    In order to investigate the change rule of the residual life of a structure under random repeated loads, firstly, starting from the statistical meaning of random repeated loads, the joint probability density function of the maximum stress and minimum stress is derived based on the characteristics of order statistics (the maximum order statistic and the minimum order statistic); then, based on the equation of the generalized σ-N curved surface, considering the influence of the number of load cycles on fatigue life, a relation...

  18. AtomicJ: An open source software for analysis of force curves

    Science.gov (United States)

    Hermanowicz, Paweł; Sarna, Michał; Burda, Kvetoslava; Gabryś, Halina

    2014-06-01

    We present an open source Java application for analysis of force curves and images recorded with the Atomic Force Microscope. AtomicJ supports a wide range of contact mechanics models and implements procedures that reduce the influence of deviations from the contact model. It generates maps of mechanical properties, including maps of Young's modulus, adhesion force, and sample height. It can also calculate stacks, which reveal how sample's response to deformation changes with indentation depth. AtomicJ analyzes force curves concurrently on multiple threads, which allows for high speed of analysis. It runs on all popular operating systems, including Windows, Linux, and Macintosh.

  19. AtomicJ: An open source software for analysis of force curves

    International Nuclear Information System (INIS)

    Hermanowicz, Paweł; Gabryś, Halina; Sarna, Michał; Burda, Kvetoslava

    2014-01-01

    We present an open source Java application for analysis of force curves and images recorded with the Atomic Force Microscope. AtomicJ supports a wide range of contact mechanics models and implements procedures that reduce the influence of deviations from the contact model. It generates maps of mechanical properties, including maps of Young's modulus, adhesion force, and sample height. It can also calculate stacks, which reveal how sample's response to deformation changes with indentation depth. AtomicJ analyzes force curves concurrently on multiple threads, which allows for high speed of analysis. It runs on all popular operating systems, including Windows, Linux, and Macintosh

  20. Sediment Curve Uncertainty Estimation Using GLUE and Bootstrap Methods

    Directory of Open Access Journals (Sweden)

    aboalhasan fathabadi

    2017-02-01

    Full Text Available Introduction: In order to implement watershed practices to decrease soil erosion effects, the sediment output of the watershed needs to be estimated. The sediment rating curve is the most conventional tool used to estimate sediment. Owing to sampling errors and short records, there are uncertainties in estimating sediment using the rating curve. In this research, bootstrap and Generalized Likelihood Uncertainty Estimation (GLUE) resampling techniques were used to calculate suspended sediment loads from sediment rating curves. Materials and Methods: The total drainage area of the Sefidrood watershed is about 560000 km². In this study, uncertainty in suspended sediment rating curves was estimated at four stations, including Motorkhane, Miyane Tonel Shomare 7, Stor and Glinak, constructed on the Ayghdamosh, Ghrangho, GHezelOzan and Shahrod rivers, respectively. Data were randomly divided into a training data set (80 percent) and a test set (20 percent) by Latin hypercube random sampling. Different suspended sediment rating curve equations were fitted to log-transformed values of sediment concentration and discharge, and the best-fit models were selected based on the lowest root mean square error (RMSE) and the highest coefficient of correlation (R²). In the GLUE methodology, different parameter sets were sampled randomly from the prior probability distribution. For each station, using the sampled parameter sets and the selected suspended sediment rating curve equation, suspended sediment concentration values were estimated several times (100000 to 400000 times). With respect to the likelihood function and a certain subjective threshold, parameter sets were divided into behavioral and non-behavioral parameter sets. Finally, using the behavioral parameter sets, the 95% confidence intervals for suspended sediment concentration due to parameter uncertainty were estimated. In the bootstrap methodology, the observed suspended sediment and discharge vectors were resampled with replacement B (set to
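
    The bootstrap side of the method can be sketched compactly: resample the (discharge, concentration) pairs with replacement, refit the power-law rating curve each time, and read confidence limits off the resulting parameter distribution. The data below are synthetic, not the Sefidrood records.

        import numpy as np

        rng = np.random.default_rng(42)

        def bootstrap_rating_curve(q, c, n_boot=2000):
            """Bootstrap a log-log sediment rating curve C = a * Q**b by resampling the
            observation pairs with replacement and refitting; returns (a, b) replicates."""
            n = len(q)
            params = []
            for _ in range(n_boot):
                idx = rng.integers(0, n, size=n)
                b, log_a = np.polyfit(np.log(q[idx]), np.log(c[idx]), 1)
                params.append((np.exp(log_a), b))
            return np.array(params)

        # Synthetic discharge (m3/s) and suspended sediment concentration (mg/L)
        q = np.array([5.0, 12.0, 20.0, 35.0, 60.0, 90.0, 140.0, 210.0])
        c = 4.0 * q ** 1.3 * rng.lognormal(0.0, 0.2, size=q.size)
        params = bootstrap_rating_curve(q, c)
        print("a, 95% interval:", np.percentile(params[:, 0], [2.5, 97.5]))
        print("b, 95% interval:", np.percentile(params[:, 1], [2.5, 97.5]))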

  1. Learning Curve? Which One?

    Directory of Open Access Journals (Sweden)

    Paulo Prochno

    2004-07-01

    Full Text Available Learning curves have been studied for a long time. These studies provided strong support to the hypothesis that, as organizations produce more of a product, unit costs of production decrease at a decreasing rate (see Argote, 1999 for a comprehensive review of learning curve studies). But the organizational mechanisms that lead to these results are still underexplored. We know some drivers of learning curves (ADLER; CLARK, 1991; LAPRE et al., 2000), but we still lack a more detailed view of the organizational processes behind those curves. Through an ethnographic study, I bring a comprehensive account of the first year of operations of a new automotive plant, describing what was taking place in the assembly area during the most relevant shifts of the learning curve. The emphasis is then on how learning occurs in that setting. My analysis suggests that the overall learning curve is in fact the result of an integration process that puts together several individual ongoing learning curves in different areas throughout the organization. In the end, I propose a model to understand the evolution of these learning processes and their supporting organizational mechanisms.

  2. The thermal curve of nuclear matter

    International Nuclear Information System (INIS)

    Ma, Y.G.; Peter, J.; Siwek, A.; Bocage, F.; Bougault, R.; Brou, R.; Colin, J.; Cussol, D.; Durand, D.; Genouin-Duhamel, E.; Gulminelli, F.; Lecolley, J.F.; Lefort, T.; Le Neindre, N.; Lopez, O.; Louvel, M.; Nguyen, A.D.; Steckmeyer, J.C.; Tamain, B.; Vient, E.

    1997-01-01

    Earlier measurements of the nuclear matter thermal curve for the liquid to gas phase transition presented two limitations: only one temperature measuring method was available, and the mass number of the formed nuclei decreased from 190 to 50 as the excitation energy increased. To avoid these limitations, experiments with the multidetector INDRA at GANIL were carried out. Among the quasi-projectiles issued from the ³⁶Ar collisions at 52, 74 and 95 A.MeV on ⁵⁸Ni, nuclei of close masses were selected. The excitation energy was determined by calorimetry of the charged products emitted by the quasi-projectiles, while the temperature was measured by three different methods. Very different apparent temperatures were obtained for the same excitation energy per nucleon. Only one curve displays a slope variation, but there is no indication of a plateau. With the quasi-projectiles obtained from the collisions of ¹²⁹Xe at 50 MeV/u on a ¹¹⁹Sn target, behaviors similar to those of ³⁶Ar were observed in the covered domain of excitation energy. To solve this puzzle and recover the initial temperatures of interest, the only means was a theoretical simulation in which one follows the de-excitation of the nuclei formed at different excitation energies and looks for the thermal curve able to reproduce the observed temperatures. Two extreme possibilities were taken into account concerning the de-excitation process: either a sequential process, established at E*/A ≤ 3 MeV/u, or a sudden multifragmentation into several hot fragments, most probably at E*/A ≥ 10 MeV/u. In both cases it was possible to reproduce the whole set of experimental results concerning the ³⁶Ar projectile. The initial temperature increases steadily as a function of excitation energy, showing no plateau or singular points. The results indicate that, being a system without external pressure, the nucleus does not necessarily display a temperature plateau in its passage from the liquid phase to the gas phase. Discussions on

  3. Contractibility of curves

    Directory of Open Access Journals (Sweden)

    Janusz Charatonik

    1991-11-01

    Full Text Available Results concerning the contractibility of curves (equivalently: of dendroids) are collected and discussed in the paper. Interrelations between various conditions which are either sufficient or necessary for a curve to be contractible are studied.

  4. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    finite set can occur as the outcome distribution of a quantum-mechanical von Neumann measurement with postselection, given that the scalar product between the initial and the final state is known as well as the success probability of the postselection. An intermediate von Neumann measurement can enhance transition probabilities between states such that the error probability shrinks by a factor of up to 2. Chapter 4: A presentation of the category of stochastic matrices. This chapter gives generators and relations for the strict monoidal category of probabilistic maps on finite cardinals (i.e., stochastic matrices). Chapter 5: Convex Spaces: Definition and Examples. We try to promote convex spaces as an abstract concept of convexity which was introduced by Stone as ''barycentric calculus''. A convex space is a set where one can take convex combinations in a consistent way. By identifying the corresponding Lawvere theory as the category from chapter 4 and using the results obtained there, we give a different proof of a result of Swirszcz which shows that convex spaces can be identified with algebras of a finitary version of the Giry monad. After giving an extensive list of examples of convex sets as they appear throughout mathematics and theoretical physics, we note that there also exist convex spaces that cannot be embedded into a vector space: semilattices are a class of examples of purely combinatorial type. In an information-theoretic interpretation, convex subsets of vector spaces are probabilistic, while semilattices are possibilistic. Convex spaces unify these two concepts. (orig.)

  5. Experimental study of curved guide tubes for pellet injection

    International Nuclear Information System (INIS)

    Combs, S.K.; Baylor, L.R.; Foust, C.R.; Gouge, M.J.; Jernigan, T.C.; Milora, S.L.

    1997-01-01

    The use of curved guide tubes for transporting frozen hydrogen pellets offers great flexibility for pellet injection into plasma devices. While this technique has been previously employed, an increased interest in its applicability has been generated with the recent ASDEX Upgrade experimental data for magnetic high-field side (HFS) pellet injection. In these innovative experiments, the pellet penetration appeared to be significantly deeper than for the standard magnetic low-field side injection scheme, along with corresponding greater fueling efficiencies. Thus, some of the major experimental fusion devices are planning experiments with HFS pellet injection. Because of the complex geometries of experimental fusion devices, installations with multiple curved guide tube sections will be required for HFS pellet injection. To more thoroughly understand and document the capability of curved guide tubes, an experimental study is under way at the Oak Ridge National Laboratory (ORNL). In particular, configurations and pellet parameters applicable for the DIII-D tokamak and the International Thermonuclear Experimental Reactor (ITER) were simulated in laboratory experiments. Initial test results with nominal 2.7- and 10-mm-diam deuterium pellets are presented and discussed

  6. Roc curves for continuous data

    CERN Document Server

    Krzanowski, Wojtek J

    2009-01-01

    Since ROC curves have become ubiquitous in many application areas, the various advances have been scattered across disparate articles and texts. ROC Curves for Continuous Data is the first book solely devoted to the subject, bringing together all the relevant material to provide a clear understanding of how to analyze ROC curves. The fundamental theory of ROC curves: the book first discusses the relationship between the ROC curve and numerous performance measures and then extends the theory into practice by describing how ROC curves are estimated. Further building on the theory, the authors prese
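
    By way of illustration, an empirical ROC curve for a continuous marker of the kind the book analyses can be constructed in a few lines; the scores below are simulated, and the area under the curve is obtained by trapezoidal integration.

        import numpy as np

        def empirical_roc(scores, labels):
            """Empirical ROC curve: sweep the decision threshold over the observed
            scores and record (false positive rate, true positive rate) pairs."""
            scores = np.asarray(scores, dtype=float)
            labels = np.asarray(labels, dtype=bool)
            thresholds = np.sort(np.unique(scores))[::-1]
            tpr = np.array([np.mean(scores[labels] >= t) for t in thresholds])
            fpr = np.array([np.mean(scores[~labels] >= t) for t in thresholds])
            return fpr, tpr

        rng = np.random.default_rng(1)
        labels = rng.random(200) < 0.5
        scores = rng.normal(loc=labels.astype(float), scale=1.0)  # "diseased" scores shifted up
        fpr, tpr = empirical_roc(scores, labels)
        order = np.argsort(fpr)
        print(f"AUC ~ {np.trapz(tpr[order], fpr[order]):.2f}")  # binormal case, expect ~0.76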

  7. Foundations of probability

    International Nuclear Information System (INIS)

    Fraassen, B.C. van

    1979-01-01

    The interpretation of probabilities in physical theories is considered, whether quantum or classical. The following points are discussed: 1) the functions P(μ, Q), in terms of which states and propositions can be represented, are classical (Kolmogoroff) probabilities, formally speaking; 2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and 3) testing of the theory typically takes the form of confronting the expectation values of observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)

  8. Generative adversarial networks for brain lesion detection

    Science.gov (United States)

    Alex, Varghese; Safwan, K. P. Mohammed; Chennamsetty, Sai Saketh; Krishnamurthi, Ganapathy

    2017-02-01

    Manual segmentation of brain lesions from Magnetic Resonance Images (MRI) is cumbersome and introduces errors due to inter-rater variability. This paper introduces a semi-supervised technique for detection of brain lesions from MRI using Generative Adversarial Networks (GANs). A GAN comprises a Generator network and a Discriminator network which are trained simultaneously with the objective of one bettering the other. The networks were trained using non-lesion patches (n=13,000) from 4 different MR sequences. The network was trained on the BraTS dataset, and patches were extracted from regions excluding the tumor region. The Generator network generates data by modeling the underlying probability distribution of the training data, (PData). The Discriminator learns the posterior probability P(Label | Data) by classifying training data and generated data as "Real" or "Fake" respectively. The Generator, upon learning the joint distribution, produces images/patches such that the performance of the Discriminator on them is random, i.e. P(Label | Data = GeneratedData) = 0.5. During testing, the Discriminator assigns posterior probability values close to 0.5 for patches from non-lesion regions, while patches centered on a lesion arise from a different distribution (PLesion) and hence are assigned a lower posterior probability value by the Discriminator. On the test set (n=14), the proposed technique achieves a whole tumor dice score of 0.69, sensitivity of 91% and specificity of 59%. Additionally, the generator network was capable of generating non-lesion patches from various MR sequences.

  9. Probability sampling design in ethnobotanical surveys of medicinal plants

    Directory of Open Access Journals (Sweden)

    Mariano Martinez Espinosa

    2012-07-01

    Non-probability sampling designs can be used in ethnobotanical surveys of medicinal plants. However, these methods do not allow statistical inferences to be made from the data generated. The aim of this paper is to present a probability sampling design that is applicable in ethnobotanical studies of medicinal plants. The sampling design employed in the research titled "Ethnobotanical knowledge of medicinal plants used by traditional communities of Nossa Senhora Aparecida do Chumbo district (NSACD), Poconé, Mato Grosso, Brazil" was used as a case study. Probability sampling methods (simple random and stratified sampling) were used in this study. In order to determine the sample size, the following data were considered: population size (N) of 1179 families; confidence coefficient, 95%; sampling error (d), 0.05; and a proportion (p) of 0.5. The application of this sampling method resulted in a sample size (n) of at least 290 families in the district. The present study concludes that probability sampling methods necessarily have to be employed in ethnobotanical studies of medicinal plants, particularly where statistical inferences have to be made using the data obtained. This can be achieved by applying different existing probability sampling methods, or better still, a combination of such methods.
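
    For readers who want to reproduce the sample-size figure quoted above, a minimal sketch of the standard finite-population formula for estimating a proportion is shown below; the use of z = 1.96 for the 95% confidence coefficient is an assumption, since the record does not spell out the exact formula employed:

```python
import math

def finite_population_sample_size(N, p=0.5, d=0.05, z=1.96):
    """Sample size for estimating a proportion p in a population of size N
    with margin of error d, at the confidence level implied by z."""
    n0 = z**2 * p * (1 - p) / d**2            # infinite-population sample size
    return math.ceil(n0 * N / (n0 + N - 1))   # finite-population correction

print(finite_population_sample_size(N=1179))  # -> 290, matching the study
```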

  10. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned probability zero.

  11. Atlas of stress-strain curves

    CERN Document Server

    2002-01-01

    The Atlas of Stress-Strain Curves, Second Edition is substantially bigger in page dimensions, number of pages, and total number of curves than the previous edition. It contains over 1,400 curves, almost three times as many as in the 1987 edition. The curves are normalized in appearance to aid making comparisons among materials. All diagrams include metric (SI) units, and many also include U.S. customary units. All curves are captioned in a consistent format with valuable information including (as available) standard designation, the primary source of the curve, mechanical properties (including hardening exponent and strength coefficient), condition of sample, strain rate, test temperature, and alloy composition. Curve types include monotonic and cyclic stress-strain, isochronous stress-strain, and tangent modulus. Curves are logically arranged and indexed for fast retrieval of information. The book also includes an introduction that provides background information on methods of stress-strain determination, on...

  12. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introduction...

  13. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary Background Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433

  14. Power Curve Estimation With Multivariate Environmental Factors for Inland and Offshore Wind Farms

    KAUST Repository

    Lee, Giwhyun

    2015-04-22

    In the wind industry, a power curve refers to the functional relationship between the power output generated by a wind turbine and the wind speed at the time of power generation. Power curves are used in practice for a number of important tasks including predicting wind power production and assessing a turbine’s energy production efficiency. Nevertheless, actual wind power data indicate that the power output is affected by more than just wind speed. Several other environmental factors, such as wind direction, air density, humidity, turbulence intensity, and wind shears, have potential impact. Yet, in industry practice, as well as in the literature, current power curve models primarily consider wind speed and, sometimes, wind speed and direction. We propose an additive multivariate kernel method that can include the aforementioned environmental factors as a new power curve model. Our model provides, conditional on a given environmental condition, both the point estimation and density estimation of power output. It is able to capture the nonlinear relationships between environmental factors and the wind power output, as well as the high-order interaction effects among some of the environmental factors. Using operational data associated with four turbines in an inland wind farm and two turbines in an offshore wind farm, we demonstrate the improvement achieved by our kernel method.

  15. Probability Density Components Analysis: A New Approach to Treatment and Classification of SAR Images

    Directory of Open Access Journals (Sweden)

    Osmar Abílio de Carvalho Júnior

    2014-04-01

    Speckle noise (salt and pepper) is inherent to synthetic aperture radar (SAR), causing a typical noise-like granular aspect that complicates image classification. In SAR image analysis, spatial information might be of particular benefit for denoising and for mapping classes characterized by a statistical distribution of pixel intensities from a complex and heterogeneous spectral response. This paper proposes Probability Density Components Analysis (PDCA), a new alternative that combines filtering and the frequency histogram to improve the classification procedure for single-channel synthetic aperture radar (SAR) images. This method was tested on L-band SAR data from the Advanced Land Observation System (ALOS) Phased-Array Synthetic-Aperture Radar (PALSAR) sensor. The study area is located in the Brazilian Amazon rainforest, northern Rondônia State (municipality of Candeias do Jamari), containing forest and land-use patterns. The proposed algorithm uses a moving window over the image, estimating the probability density curve in different image components. Therefore, a single input image generates an output with multiple components. Initially, the multi-component output should be treated by noise-reduction methods, such as maximum noise fraction (MNF) or noise-adjusted principal components (NAPC). Both methods enable noise reduction as well as the ordering of the multi-component data in terms of image quality. In this paper, the NAPC applied to the multi-component data provided large reductions in noise levels, and the color composites based on the first NAPC enhanced the classification of different surface features. In the spectral classification, the Spectral Correlation Mapper and Minimum Distance classifiers were used. The results obtained were similar to the visual interpretation of optical images from TM-Landsat and Google Maps.

  16. Tornado-Shaped Curves

    Science.gov (United States)

    Martínez, Sol Sáez; de la Rosa, Félix Martínez; Rojas, Sergio

    2017-01-01

    In Advanced Calculus, our students wonder if it is possible to graphically represent a tornado by means of a three-dimensional curve. In this paper, we show it is possible by providing the parametric equations of such tornado-shaped curves.
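
    The record does not reproduce the authors' parametric equations, but a tornado-like space curve can be sketched as a conical spiral whose radius grows with height; the particular parametrization below is only an illustration:

```python
import numpy as np
import matplotlib.pyplot as plt

t = np.linspace(0, 20 * np.pi, 4000)       # parameter along the curve
r = 0.05 + 0.15 * t                        # radius widening with height
x, y, z = r * np.cos(t), r * np.sin(t), t  # conical spiral: narrow base, wide top

ax = plt.figure().add_subplot(projection="3d")
ax.plot(x, y, z, linewidth=0.7)
ax.set_title("Tornado-shaped parametric curve (illustrative)")
plt.show()
```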

  17. Discriminating Among Probability Weighting Functions Using Adaptive Design Optimization

    Science.gov (United States)

    Cavagnaro, Daniel R.; Pitt, Mark A.; Gonzalez, Richard; Myung, Jay I.

    2014-01-01

    Probability weighting functions relate objective probabilities and their subjective weights, and play a central role in modeling choices under risk within cumulative prospect theory. While several different parametric forms have been proposed, their qualitative similarities make it challenging to discriminate among them empirically. In this paper, we use both simulation and choice experiments to investigate the extent to which different parametric forms of the probability weighting function can be discriminated using adaptive design optimization, a computer-based methodology that identifies and exploits model differences for the purpose of model discrimination. The simulation experiments show that the correct (data-generating) form can be conclusively discriminated from its competitors. The results of an empirical experiment reveal heterogeneity between participants in terms of the functional form, with two models (Prelec-2, Linear in Log Odds) emerging as the most common best-fitting models. The findings shed light on assumptions underlying these models. PMID:24453406
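
    For reference, the two forms identified as the most common best-fitting models have standard closed-form expressions; the sketch below assumes the conventional parameterizations of Prelec-2 and Linear in Log Odds rather than anything specific to this paper:

```python
import numpy as np

def prelec2(p, gamma, delta):
    """Prelec (1998) two-parameter weighting function: w(p) = exp(-delta * (-ln p)**gamma)."""
    return np.exp(-delta * (-np.log(p)) ** gamma)

def linear_in_log_odds(p, gamma, delta):
    """Gonzalez & Wu (1999) form: w(p) = delta*p**gamma / (delta*p**gamma + (1-p)**gamma)."""
    return delta * p**gamma / (delta * p**gamma + (1 - p) ** gamma)

p = np.linspace(0.01, 0.99, 99)
print(prelec2(p, gamma=0.7, delta=1.0)[:3])
print(linear_in_log_odds(p, gamma=0.6, delta=0.8)[:3])
```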

  18. A versatile curve-fit model for linear to deeply concave rank abundance curves

    NARCIS (Netherlands)

    Neuteboom, J.H.; Struik, P.C.

    2005-01-01

    A new, flexible curve-fit model for linear to concave rank abundance curves was conceptualized and validated using observational data. The model links the geometric-series model and the log-series model and can also fit deeply concave rank abundance curves. The model is based, in an unconventional way, ...

  19. Low-spin electromagnetic transition probabilities in {sup 102,104}Cd

    Energy Technology Data Exchange (ETDEWEB)

    Jolie, J.; Dewald, A.; Fransen, C.; Linnemann, A.; Melon, B.; Moeller, O. [Inst. fuer Kernphysik, Univ. zu Koeln (Germany); Boelaert, N. [Inst. fuer Kernphysik, Univ. zu Koeln (Germany); Dept. of Subatomic and Radiation Physics, Gent Univ. (Belgium); Smirnova, N.; Heyde, K. [Dept. of Subatomic and Radiation Physics, Gent Univ. (Belgium)

    2007-07-01

    Lifetimes of low-lying states in {sup 102,104}Cd were determined by using the recoil distance Doppler shift technique with a plunger device and a Ge array consisting of five HP Ge detectors and one Euroball cluster detector. The experiments were carried out at the Cologne FN Tandem accelerator using the {sup 92,94}Mo({sup 12}C,2n){sup 102,104}Cd reactions. The differential decay curve method in coincidence mode was employed to derive the lifetime of the first excited 2{sup +} state in both nuclei and the first excited 4{sup +} state in {sup 104}Cd. The corresponding E2 transition probabilities agree well with large scale shell-model calculations. (orig.)

  20. CONFIGURATION GENERATOR MODEL

    International Nuclear Information System (INIS)

    Alsaed, A.

    2004-01-01

    "The Disposal Criticality Analysis Methodology Topical Report" prescribes an approach to the methodology for performing postclosure criticality analyses within the monitored geologic repository at Yucca Mountain, Nevada. An essential component of the methodology is the "Configuration Generator Model for In-Package Criticality" that provides a tool to evaluate the probabilities of degraded configurations achieving a critical state. The configuration generator model is a risk-informed, performance-based process for evaluating the criticality potential of degraded configurations in the monitored geologic repository. The method uses event tree methods to define configuration classes derived from criticality scenarios and to identify configuration class characteristics (parameters, ranges, etc.). The probabilities of achieving the various configuration classes are derived in part from probability density functions for degradation parameters. The NRC has issued the "Safety Evaluation Report for Disposal Criticality Analysis Methodology Topical Report, Revision 0". That report contained 28 open items that required resolution through additional documentation. Of the 28 open items, numbers 5, 6, 9, 10, 18, and 19 were concerned with a previously proposed software approach to the configuration generator methodology and, in particular, the k_eff regression analysis associated with the methodology. However, the use of a k_eff regression analysis is not part of the current configuration generator methodology and, thus, the referenced open items are no longer considered applicable and will not be further addressed.

  1. Demand curves for hypothetical cocaine in cocaine-dependent individuals.

    Science.gov (United States)

    Bruner, Natalie R; Johnson, Matthew W

    2014-03-01

    Drug purchasing tasks have been successfully used to examine demand for hypothetical consumption of abused drugs including heroin, nicotine, and alcohol. In these tasks, drug users make hypothetical choices whether to buy drugs, and if so, at what quantity, at various potential prices. These tasks allow for behavioral economic assessment of that drug's intensity of demand (preferred level of consumption at extremely low prices) and demand elasticity (sensitivity of consumption to price), among other metrics. However, a purchasing task for cocaine in cocaine-dependent individuals has not been investigated. This study examined a novel Cocaine Purchasing Task and the relation between resulting demand metrics and self-reported cocaine use data. Participants completed a questionnaire assessing hypothetical purchases of cocaine units at prices ranging from $0.01 to $1,000. Demand curves were generated from responses on the Cocaine Purchasing Task. Correlations compared metrics from the demand curve to measures of real-world cocaine use. Group and individual data were well modeled by a demand curve function. The validity of the Cocaine Purchasing Task was supported by a significant correlation between the demand curve metrics of demand intensity and O_max (determined from Cocaine Purchasing Task data) and self-reported measures of cocaine use. Partial correlations revealed that after controlling for demand intensity, demand elasticity and the related measure, P_max, were significantly correlated with real-world cocaine use. Results indicate that the Cocaine Purchasing Task produces orderly demand curve data, and that these data relate to real-world measures of cocaine use.
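
    Purchase-task data of this kind are commonly summarized with the exponential demand equation of Hursh and Silberberg (2008); the record does not state which demand function the authors fit, so the sketch below (with made-up consumption data and a fixed k) is purely illustrative:

```python
import numpy as np
from scipy.optimize import curve_fit

def exponential_demand(price, q0, alpha, k=2.0):
    """log10 consumption as a function of price (Hursh & Silberberg, 2008),
    with the range parameter k held fixed at an illustrative value."""
    return np.log10(q0) + k * (np.exp(-alpha * q0 * price) - 1)

prices = np.array([0.01, 0.1, 1, 5, 10, 50, 100, 500, 1000])
consumption = np.array([20, 19, 17, 12, 8, 3, 1.5, 0.4, 0.1])  # toy purchase data

popt, _ = curve_fit(exponential_demand, prices, np.log10(consumption),
                    p0=[20, 0.001], maxfev=10000)
q0_hat, alpha_hat = popt
print(f"demand intensity Q0 ~ {q0_hat:.1f}, elasticity parameter alpha ~ {alpha_hat:.4f}")
```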

  2. Forecasting the Stock Market with Linguistic Rules Generated from the Minimize Entropy Principle and the Cumulative Probability Distribution Approaches

    Directory of Open Access Journals (Sweden)

    Chung-Ho Su

    2010-12-01

    To forecast a complex and non-linear system, such as a stock market, advanced artificial intelligence algorithms, like neural networks (NNs) and genetic algorithms (GAs), have been proposed as new approaches. However, for the average stock investor, two major disadvantages are argued against these advanced algorithms: (1) the rules generated by NNs and GAs are difficult to apply in investment decisions; and (2) the time complexity of the algorithms to produce forecasting outcomes is very high. Therefore, to provide understandable rules for investors and to reduce the time complexity of forecasting algorithms, this paper proposes a novel model for the forecasting process, which combines two granulating methods (the minimize entropy principle approach and the cumulative probability distribution approach) and a rough set algorithm. The model verification demonstrates that the proposed model surpasses the three listed conventional fuzzy time-series models and a multiple regression model (MLR) in forecast accuracy.

  3. Thermo-Electro-Mechanical Analysis of a Curved Functionally Graded Piezoelectric Actuator with Sandwich Structure

    Directory of Open Access Journals (Sweden)

    Liying Jiang

    2011-12-01

    In this work, the problem of a curved functionally graded piezoelectric (FGP) actuator with sandwich structure under electrical and thermal loads is investigated. The middle layer in the sandwich structure is functionally graded with the piezoelectric coefficient g31 varying continuously along the radial direction of the curved actuator. Based on the theory of linear piezoelectricity, analytical solutions are obtained by using Airy stress function to examine the effects of material gradient and heat conduction on the performance of the curved actuator. It is found that the material gradient and thermal load have significant influence on the electroelastic fields and the mechanical response of the curved FGP actuator. Without the sacrifice of actuation deflection, smaller internal stresses are generated by using the sandwich actuator with functionally graded piezoelectric layer instead of the conventional bimorph actuator. This work is very helpful for the design and application of curved piezoelectric actuators under thermal environment.

  4. Thermo-Electro-Mechanical Analysis of a Curved Functionally Graded Piezoelectric Actuator with Sandwich Structure.

    Science.gov (United States)

    Yan, Zhi; Zaman, Mostafa; Jiang, Liying

    2011-12-12

    In this work, the problem of a curved functionally graded piezoelectric (FGP) actuator with sandwich structure under electrical and thermal loads is investigated. The middle layer in the sandwich structure is functionally graded with the piezoelectric coefficient g31 varying continuously along the radial direction of the curved actuator. Based on the theory of linear piezoelectricity, analytical solutions are obtained by using Airy stress function to examine the effects of material gradient and heat conduction on the performance of the curved actuator. It is found that the material gradient and thermal load have significant influence on the electroelastic fields and the mechanical response of the curved FGP actuator. Without the sacrifice of actuation deflection, smaller internal stresses are generated by using the sandwich actuator with functionally graded piezoelectric layer instead of the conventional bimorph actuator. This work is very helpful for the design and application of curved piezoelectric actuators under thermal environment.

  5. In-Vehicle Dynamic Curve-Speed Warnings at High-Risk Rural Curves

    Science.gov (United States)

    2018-03-01

    Lane-departure crashes at horizontal curves represent a significant portion of fatal crashes on rural Minnesota roads. Because of this, solutions are needed to aid drivers in identifying upcoming curves and inform them of a safe speed at which they s...

  6. Selected papers on analysis, probability, and statistics

    CERN Document Server

    Nomizu, Katsumi

    1994-01-01

    This book presents papers that originally appeared in the Japanese journal Sugaku. The papers fall into the general area of mathematical analysis as it pertains to probability and statistics, dynamical systems, differential equations and analytic function theory. Among the topics discussed are: stochastic differential equations, spectra of the Laplacian and Schrödinger operators, nonlinear partial differential equations which generate dissipative dynamical systems, fractal analysis on self-similar sets and the global structure of analytic functions.

  7. Modeling Stochastic Complexity in Complex Adaptive Systems: Non-Kolmogorov Probability and the Process Algebra Approach.

    Science.gov (United States)

    Sulis, William H

    2017-10-01

    Walter Freeman III pioneered the application of nonlinear dynamical systems theories and methodologies in his work on mesoscopic brain dynamics. Sadly, mainstream psychology and psychiatry still cling to linear, correlation-based data analysis techniques, which threaten to subvert the process of experimentation and theory building. In order to progress, it is necessary to develop tools capable of managing the stochastic complexity of complex biopsychosocial systems, which includes multilevel feedback relationships, nonlinear interactions, chaotic dynamics and adaptability. In addition, however, these systems exhibit intrinsic randomness, non-Gaussian probability distributions, non-stationarity, contextuality, and non-Kolmogorov probabilities, as well as the absence of mean and/or variance and conditional probabilities. These properties and their implications for statistical analysis are discussed. An alternative approach, the Process Algebra approach, is described. It is a generative model, capable of generating non-Kolmogorov probabilities. It has proven useful in addressing fundamental problems in quantum mechanics and in the modeling of developing psychosocial systems.

  8. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  9. Extension of the master sintering curve for constant heating rate modeling

    Science.gov (United States)

    McCoy, Tammy Michelle

    density of the pellets based on the green density and the theoretical density of each of the compositions. The Master Sintering Curve (MSC) model is then utilized to generate data that can be used to predict the final density of the respective powder over a range of heating rates. The Elton Master Sintering Curve Extension (EMSCE) is developed to extend the functionality of the MSC tool. The parameters generated from the original MSC are used in tandem with the solution to the closed integral, theta ≡ (1/c) ∫_{T0}^{T} (1/T) exp(−Q/(RT)) dT, over a set range of temperatures. The EMSCE is used to generate a set of sintering curves having both constant heating rate and isothermal hold portions. The EMSCE extends the usefulness of the MSC by allowing this generation of a complete sintering schedule rather than just being able to predict the final relative density of a given material. The EMSCE is verified by generating a set of curves having both constant heating rate and an isothermal hold for the heat-treatment. The modeled curves are verified experimentally and a comparison of the model and experimental results are given for a selected composition. Porosity within the final product can hinder the product from sintering to full density. It is shown that some of the compositions studied did not sinter to full density because of the presence of large porosity that could not be eliminated in a reasonable amount of time. A statistical analysis of the volume fraction of porosity is completed to show the significance of the presence in the final product. The reason this is relevant to the MSC is that the model does not take into account the presence of porosity and assumes that the samples sinter to full density. When this does not happen, the model actually under-predicts the final density of the material.
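
    A small sketch of how the work-of-sintering integral above can be evaluated numerically for a constant heating rate; the activation energy Q, heating rate c, and starting temperature below are illustrative values, not the alloy-specific parameters from the thesis:

```python
import numpy as np
from scipy.integrate import quad

R = 8.314          # J/(mol K), gas constant
Q = 300e3          # J/mol, illustrative apparent activation energy
c = 10.0 / 60.0    # K/s, constant heating rate (10 K/min)
T0 = 300.0         # K, starting temperature

def theta(T):
    """Master-sintering-curve work integral: (1/c) * int_{T0}^{T} (1/T') exp(-Q/(R T')) dT'."""
    integrand = lambda Tp: np.exp(-Q / (R * Tp)) / Tp
    value, _ = quad(integrand, T0, T)
    return value / c

for T in (1200.0, 1400.0, 1600.0):
    print(f"T = {T:.0f} K  ->  log10(theta) = {np.log10(theta(T)):.2f}")
```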

  10. Thermoelectric coolers as power generators

    International Nuclear Information System (INIS)

    Burke, E.J.; Buist, R.J.

    1984-01-01

    There are many applications where thermoelectric (TE) coolers can be used effectively as power generators. The literature available on this subject is scarce and very limited in scope. This paper describes the configuration, capability, limitations and performance of TE coolers to be used as power generators. Also presented are performance curves enabling the user to design the optimum TE module for any given power generation application

  11. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber–Shiu functions and dependence.

  12. Calibration curves for biological dosimetry; Curvas de calibracion para dosimetria biologica

    Energy Technology Data Exchange (ETDEWEB)

    Guerrero C, C.; Brena V, M. [ININ, A.P. 18-1027, 11801 Mexico D.F. (Mexico)]. E-mail cgc@nuclear.inin.mx

    2004-07-01

    The information generated by investigations in different laboratories around the world, including ININ, establishes that the frequency of certain classes of chromosomal lesions increases as a function of dose and radiation type. This has resulted in calibration curves that are applied in the technique known as biological dosimetry. This work presents a summary of the work carried out in the laboratory, including the calibration curves for gamma radiation of {sup 60}Cobalt and X-rays of 250 kVp, examples of presumed exposure to ionizing radiation resolved by means of aberration analysis and the corresponding dose estimates obtained through the equations of the respective curves, and finally a comparison between the dose calculations for the people affected by the Ciudad Juarez accident carried out by the Oak Ridge group (USA) and those obtained in this laboratory. (Author)
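
    Cytogenetic calibration curves of this kind are typically of linear-quadratic form, Y = c + alpha*D + beta*D^2, and dose estimation amounts to inverting that curve for an observed aberration yield. A minimal sketch with illustrative coefficients (not the laboratory's fitted values):

```python
import numpy as np

# Illustrative linear-quadratic calibration curve for aberration yield per cell:
# Y(D) = c + alpha*D + beta*D**2   (D in Gy)
c, alpha, beta = 0.001, 0.03, 0.06

def estimate_dose(yield_observed):
    """Invert the calibration curve: solve beta*D^2 + alpha*D + (c - Y) = 0 for D >= 0."""
    disc = alpha**2 - 4 * beta * (c - yield_observed)
    return (-alpha + np.sqrt(disc)) / (2 * beta)

observed_yield = 45 / 500  # e.g. 45 aberrations scored in 500 cells
print(f"Estimated dose: {estimate_dose(observed_yield):.2f} Gy")
```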

  13. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  14. Satellite altimetry based rating curves throughout the entire Amazon basin

    Science.gov (United States)

    Paris, A.; Calmant, S.; Paiva, R. C.; Collischonn, W.; Silva, J. S.; Bonnet, M.; Seyler, F.

    2013-05-01

    The Amazonian basin is the largest hydrological basin in the world. In recent years, the basin has experienced an unusual succession of extreme droughts and floods, whose origin is still a matter of debate. Yet the amount of data available is poor, over both time and space scales, due to factors like the basin's size, access difficulty and so on. One of the major obstacles is obtaining discharge series distributed over the entire basin. Satellite altimetry can be used to improve our knowledge of the hydrological stream flow conditions in the basin through rating curves. Rating curves are mathematical relationships between stage and discharge at a given place. The common way to determine the parameters of the relationship is to compute a non-linear regression between the discharge and stage series. In this study, the discharge data were obtained by simulation throughout the entire basin using the MGB-IPH model with TRMM Merge input rainfall data and assimilation of gage data, run from 1998 to 2010. The stage dataset is made of ~800 altimetry series at ENVISAT and JASON-2 virtual stations. Altimetry series span between 2002 and 2010. In the present work we present the benefits of using stochastic methods instead of probabilistic ones to determine a dataset of rating curve parameters that is consistent throughout the entire Amazon basin. The rating curve parameters have been computed using a parameter optimization technique based on a Markov Chain Monte Carlo sampler and a Bayesian inference scheme. This technique provides an estimate of the best parameters for the rating curve, as well as their posterior probability distribution, allowing the determination of a credibility interval for the rating curve. Also included in the rating curve determination is the error on the discharge estimates from the MGB-IPH model. These MGB-IPH errors come from either errors in the discharge derived from the gage readings or errors in the satellite rainfall estimates. The present ...
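
    A compact sketch of the kind of stage-discharge fit described above, using a power-law rating curve Q = a (h - h0)^b and a plain Metropolis sampler; the toy data, flat priors, and lognormal error model are illustrative assumptions, not the study's actual configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stage (m) and "observed" discharge (m3/s) around a known rating curve
h = np.linspace(2.0, 10.0, 40)
q_obs = 15.0 * (h - 1.0) ** 1.7 * np.exp(rng.normal(0, 0.05, h.size))

def log_post(theta):
    """Log-posterior for theta = (a, b, h0, sigma) with flat priors on valid ranges."""
    a, b, h0, sigma = theta
    if a <= 0 or b <= 0 or sigma <= 0 or h0 >= h.min():
        return -np.inf
    resid = np.log(q_obs) - np.log(a * (h - h0) ** b)   # lognormal errors
    return -h.size * np.log(sigma) - 0.5 * np.sum(resid**2) / sigma**2

theta = np.array([10.0, 1.5, 0.5, 0.1])                  # starting point
samples, lp = [], log_post(theta)
for _ in range(20000):                                    # Metropolis random walk
    prop = theta + rng.normal(0, [0.5, 0.02, 0.05, 0.005])
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)

post = np.array(samples[5000:])                           # discard burn-in
print("posterior means (a, b, h0, sigma):", post.mean(axis=0).round(3))
```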

  15. Scale-invariant transition probabilities in free word association trajectories

    Directory of Open Access Journals (Sweden)

    Martin Elias Costa

    2009-09-01

    Free-word association has been used as a vehicle to understand the organization of human thoughts. The original studies relied mainly on qualitative assertions, yielding the widely intuitive notion that trajectories of word associations are structured, yet considerably more random than organized linguistic text. Here we set out to determine a precise characterization of this space, generating a large number of word association trajectories in a web-implemented game. We embedded the trajectories in the graph of word co-occurrences from a linguistic corpus. To constrain possible transport models we measured the memory loss and the cycling probability. These two measures could not be reconciled by a bounded diffusive model, since the cycling probability was very high (16% of order-2 cycles), implying a majority of short-range associations, whereas the memory loss was very rapid (converging to the asymptotic value in ∼7 steps), which, in turn, forced a high fraction of long-range associations. We show that memory loss and cycling probabilities of free word association trajectories can be simultaneously accounted for by a model in which transitions are determined by a scale-invariant probability distribution.

  16. PHOTOMETRIC REDSHIFTS AND QUASAR PROBABILITIES FROM A SINGLE, DATA-DRIVEN GENERATIVE MODEL

    International Nuclear Information System (INIS)

    Bovy, Jo; Hogg, David W.; Weaver, Benjamin A.; Myers, Adam D.; Hennawi, Joseph F.; McMahon, Richard G.; Schiminovich, David; Sheldon, Erin S.; Brinkmann, Jon; Schneider, Donald P.

    2012-01-01

    We describe a technique for simultaneously classifying and estimating the redshift of quasars. It can separate quasars from stars in arbitrary redshift ranges, estimate full posterior distribution functions for the redshift, and naturally incorporate flux uncertainties, missing data, and multi-wavelength photometry. We build models of quasars in flux-redshift space by applying the extreme deconvolution technique to estimate the underlying density. By integrating this density over redshift, one can obtain quasar flux densities in different redshift ranges. This approach allows for efficient, consistent, and fast classification and photometric redshift estimation. This is achieved by combining the speed obtained by choosing simple analytical forms as the basis of our density model with the flexibility of non-parametric models through the use of many simple components with many parameters. We show that this technique is competitive with the best photometric quasar classification techniques—which are limited to fixed, broad redshift ranges and high signal-to-noise ratio data—and with the best photometric redshift techniques when applied to broadband optical data. We demonstrate that the inclusion of UV and NIR data significantly improves photometric quasar-star separation and essentially resolves all of the redshift degeneracies for quasars inherent to the ugriz filter system, even when included data have a low signal-to-noise ratio. For quasars spectroscopically confirmed by the SDSS 84% and 97% of the objects with Galaxy Evolution Explorer UV and UKIDSS NIR data have photometric redshifts within 0.1 and 0.3, respectively, of the spectroscopic redshift; this amounts to about a factor of three improvement over ugriz-only photometric redshifts. Our code to calculate quasar probabilities and redshift probability distributions is publicly available.

  17. Comparision of the different probability distributions for earthquake hazard assessment in the North Anatolian Fault Zone

    Energy Technology Data Exchange (ETDEWEB)

    Yilmaz, Şeyda, E-mail: seydayilmaz@ktu.edu.tr; Bayrak, Erdem, E-mail: erdmbyrk@gmail.com [Karadeniz Technical University, Trabzon (Turkey); Bayrak, Yusuf, E-mail: bayrak@ktu.edu.tr [Ağrı İbrahim Çeçen University, Ağrı (Turkey)

    2016-04-18

    In this study we examined and compared three different probability distributions for determining the most suitable model for the probabilistic assessment of earthquake hazards. We analyzed a reliable homogeneous earthquake catalogue for the period 1900-2015 for magnitude M ≥ 6.0 and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N, 30°-40° E) using three distributions, namely the Weibull distribution, the Frechet distribution and the three-parameter Weibull distribution. The suitability of the distribution parameters was evaluated with the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probability and the conditional probabilities of occurrence of earthquakes for different elapsed times using these three distributions. We used Easyfit and Matlab software to calculate the distribution parameters and plotted the conditional probability curves. We concluded that the Weibull distribution was more suitable than the other distributions in this region.
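
    A brief sketch of fitting a Weibull distribution and checking it with the Kolmogorov-Smirnov test in SciPy, followed by a conditional probability of occurrence for a given elapsed time; the synthetic inter-event times below stand in for the earthquake catalogue used in the paper:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic inter-event times (years) standing in for the earthquake catalogue
data = stats.weibull_min.rvs(c=1.4, scale=12.0, size=80, random_state=rng)

# Fit a two-parameter Weibull (location fixed at 0) and test the fit
shape, loc, scale = stats.weibull_min.fit(data, floc=0)
ks_stat, p_value = stats.kstest(data, "weibull_min", args=(shape, loc, scale))
print(f"shape={shape:.2f}, scale={scale:.2f}, KS statistic={ks_stat:.3f}, p={p_value:.3f}")

# Conditional probability of an event within the next tau years, given t years elapsed
t, tau = 10.0, 5.0
S = stats.weibull_min.sf
cond_prob = (S(t, shape, loc, scale) - S(t + tau, shape, loc, scale)) / S(t, shape, loc, scale)
print(f"P(event within {tau} yr | {t} yr elapsed) = {cond_prob:.3f}")
```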

  18. On Generating Optimal Signal Probabilities for Random Tests: A Genetic Approach

    Directory of Open Access Journals (Sweden)

    M. Srinivas

    1996-01-01

    Genetic Algorithms are robust search and optimization techniques. A Genetic Algorithm based approach for determining the optimal input distributions for generating random test vectors is proposed in the paper. A cost function based on the COP testability measure for determining the efficacy of the input distributions is discussed. A brief overview of Genetic Algorithms (GAs) and the specific details of our implementation are described. Experimental results based on ISCAS-85 benchmark circuits are presented. The performance of our GA-based approach is compared with previous results. While the GA generates more efficient input distributions than the previous methods, which are based on gradient descent search, the overheads of the GA in computing the input distributions are larger.

  19. The curve shortening problem

    CERN Document Server

    Chou, Kai-Seng

    2001-01-01

    Although research in curve shortening flow has been very active for nearly 20 years, the results of those efforts have remained scattered throughout the literature. For the first time, The Curve Shortening Problem collects and illuminates those results in a comprehensive, rigorous, and self-contained account of the fundamental results. The authors present a complete treatment of the Gage-Hamilton theorem, a clear, detailed exposition of Grayson's convexity theorem, a systematic discussion of invariant solutions, applications to the existence of simple closed geodesics on a surface, and a new almost-convexity theorem for the generalized curve shortening problem. Many questions regarding curve shortening remain outstanding. With its careful exposition and complete guide to the literature, The Curve Shortening Problem provides not only an outstanding starting point for graduate students and new investigations, but a superb reference that presents intriguing new results for those already active in the field.

  20. On Bäcklund transformation and vortex filament equation for null Cartan curve in Minkowski 3-space

    Energy Technology Data Exchange (ETDEWEB)

    Grbović, Milica, E-mail: milica.grbovic@kg.ac.rs; Nešović, Emilija, E-mail: nesovickg@sbb.rs [University of Kragujevac, Faculty of Science, Department of Mathematics and Informatics (Serbia)

    2016-12-15

    In this paper we introduce the Bäcklund transformation of a null Cartan curve in Minkowski 3-space as a transformation which maps a null Cartan helix to another null Cartan helix, congruent to the given one. We also give sufficient conditions for a transformation between two null Cartan curves in Minkowski 3-space such that these curves have equal constant torsions. By using the Da Rios vortex filament equation, based on the localized induction approximation, we derive the vortex filament equation for a null Cartan curve and obtain the evolution equation for its torsion. As an application, we show that Cartan's frame vectors generate new solutions of the Da Rios vortex filament equation.

  1. Probability of coincidental similarity among the orbits of small bodies - I. Pairing

    Science.gov (United States)

    Jopek, Tadeusz Jan; Bronikowska, Małgorzata

    2017-09-01

    The probability of coincidental clustering among orbits of comets, asteroids and meteoroids depends on many factors, such as the size of the orbital sample searched for clusters or the size of the identified group; it is different for groups of 2, 3, 4, … members. The probability of coincidental clustering is assessed by numerical simulation; therefore, it also depends on the method used for generating the synthetic orbits. We have tested the impact of some of these factors. For a given size of the orbital sample we have assessed the probability of random pairing among several orbital populations of different sizes. We have found how these probabilities vary with the size of the orbital samples. Finally, keeping the size of the orbital sample fixed, we have shown that the probability of random pairing can be significantly different for orbital samples obtained by different observation techniques. Also, for the user's convenience, we have obtained several formulae which, for a given size of the orbital sample, can be used to calculate the similarity threshold corresponding to a small value of the probability of coincidental similarity between two orbits.

  2. A Bayesian hierarchical model for demand curve analysis.

    Science.gov (United States)

    Ho, Yen-Yi; Nhu Vo, Tien; Chu, Haitao; Luo, Xianghua; Le, Chap T

    2018-07-01

    Drug self-administration experiments are a frequently used approach to assessing the abuse liability and reinforcing property of a compound. It has been used to assess the abuse liabilities of various substances such as psychomotor stimulants and hallucinogens, food, nicotine, and alcohol. The demand curve generated from a self-administration study describes how demand of a drug or non-drug reinforcer varies as a function of price. With the approval of the 2009 Family Smoking Prevention and Tobacco Control Act, demand curve analysis provides crucial evidence to inform the US Food and Drug Administration's policy on tobacco regulation, because it produces several important quantitative measurements to assess the reinforcing strength of nicotine. The conventional approach popularly used to analyze demand curve data is individual-specific non-linear least squares regression. The non-linear least squares approach sets out to minimize the residual sum of squares for each subject in the dataset; however, this one-subject-at-a-time approach does not allow for the estimation of between- and within-subject variability in a unified model framework. In this paper, we review the existing approaches to analyze demand curve data, non-linear least squares regression and mixed effects regression, and propose a new Bayesian hierarchical model. We conduct simulation analyses to compare the performance of these three approaches and illustrate the proposed approaches in a case study of nicotine self-administration in rats. We present simulation results and discuss the benefits of using the proposed approaches.

  3. Does my high blood pressure improve your survival? Overall and subgroup learning curves in health.

    Science.gov (United States)

    Van Gestel, Raf; Müller, Tobias; Bosmans, Johan

    2017-09-01

    Learning curves in health are of interest for a wide range of medical disciplines, healthcare providers, and policy makers. In this paper, we distinguish between three types of learning when identifying overall learning curves: economies of scale, learning from cumulative experience, and human capital depreciation. In addition, we approach the question of how treating more patients with specific characteristics predicts provider performance. To soften collinearity problems, we explore the use of least absolute shrinkage and selection operator regression as a variable selection method and Theil-Goldberger mixed estimation to augment the available information. We use data from the Belgian Transcatheter Aorta Valve Implantation (TAVI) registry, containing information on the first 860 TAVI procedures in Belgium. We find that treating an additional TAVI patient is associated with an increase in the probability of 2-year survival by about 0.16%-points. For adverse events like renal failure and stroke, we find that an extra day between procedures is associated with an increase in the probability for these events by 0.12%-points and 0.07%-points, respectively. Furthermore, we find evidence for positive learning effects from physicians' experience with defibrillation, treating patients with hypertension, and the use of certain types of replacement valves during the TAVI procedure. Copyright © 2017 John Wiley & Sons, Ltd.

  4. Study on seismic behaviour of integral concrete bridges with different skew angles through fragility curves

    Directory of Open Access Journals (Sweden)

    Mahmoud Reza ُُShiravand

    2017-12-01

    Bridges are key elements in urban transportation systems and should be designed to sustain earthquake-induced damage and remain usable after an earthquake. Extensive damage during recent earthquakes has highlighted the importance of seismic assessment and damage estimation of bridges. Skewness is one of the primary parameters affecting the seismic behavior of bridges. Skew bridges are defined as bridges with skewed piers and abutments. In these bridges, the piers have some degree of skewness due to construction restrictions, such as those caused by crossing a waterway, railway line or road. This paper aims to investigate the seismic behavior of skew concrete bridges using damage criteria and to estimate the probability of pier damage with fragility curves. To this end, three types of concrete bridges with two, three and four spans and varying skew angles of 0°, 10°, 20° and 30° are modeled with finite element software. Seismic responses of the bridge piers under 10 earthquake ground motion records are calculated using incremental dynamic analysis. Damage criteria proposed by Mackie and Stojadinovic are then used to define damage limits of the bridge piers in four damage states (slight, moderate, extensive and complete), and bridge fragility curves are developed. The results show that increasing the skew angle increases the probability of damage occurrence, particularly in the extensive and complete damage states.
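
    Fragility curves of this kind are commonly expressed as lognormal cumulative distribution functions of a ground-motion intensity measure. A small illustrative sketch follows; the median and dispersion values are made up, not those computed in the study:

```python
import numpy as np
from scipy.stats import norm

# P(damage state >= ds | IM) = Phi( ln(IM / theta) / beta )
# theta: median intensity causing the damage state, beta: lognormal dispersion
damage_states = {            # illustrative parameters per damage state
    "slight":    (0.25, 0.55),
    "moderate":  (0.45, 0.55),
    "extensive": (0.80, 0.60),
    "complete":  (1.20, 0.65),
}

def fragility(im, theta, beta):
    return norm.cdf(np.log(im / theta) / beta)

im = 0.6  # e.g. spectral acceleration in g
for state, (theta, beta) in damage_states.items():
    print(f"P({state:9s} | IM={im} g) = {fragility(im, theta, beta):.2f}")
```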

  5. Quantum electrodynamics in curved space-time

    International Nuclear Information System (INIS)

    Buchbinder, I.L.; Gitman, D.M.; Fradkin, E.S.

    1981-01-01

    The Lagrangian of quantum electrodynamics in curved space-time is constructed and the interaction picture, taking the external gravitational field into account exactly, is introduced. The transformation from the Heisenberg picture to the interaction picture is carried out in a manifestly covariant way. The properties of free spinor and electromagnetic quantum fields are discussed, and conditions under which the initial and final creation and annihilation operators are connected by a unitary transformation are indicated. Feynman's rules for quantum processes are derived on the basis of a generalized normal product of operators. The derivation of reduction formulas is indicated and the suitable Green's functions are introduced. A generating functional for these Green's functions is defined and a system of functional equations for them is obtained. The representation of different generating functionals by means of functional integrals is introduced. Some consequences of the S-matrix unitarity condition are considered, which lead to a generalization of the optical theorem.

  6. On the average capacity and bit error probability of wireless communication systems

    KAUST Repository

    Yilmaz, Ferkan; Alouini, Mohamed-Slim

    2011-01-01

    The average binary error probabilities and the average capacity of wireless communication systems over generalized fading channels have been analyzed separately in the past. This paper introduces a novel moment generating function ...

  7. Using inferred probabilities to measure the accuracy of imprecise forecasts

    Directory of Open Access Journals (Sweden)

    Paul Lehner

    2012-11-01

    Research on forecasting is effectively limited to forecasts that are expressed with clarity; which is to say that the forecasted event must be sufficiently well-defined so that it can be clearly resolved whether or not the event occurred, and forecast certainties are expressed as quantitative probabilities. When forecasts are expressed with clarity, then quantitative measures (scoring rules, calibration, discrimination, etc.) can be used to measure forecast accuracy, which in turn can be used to measure the comparative accuracy of different forecasting methods. Unfortunately, most real-world forecasts are not expressed clearly. This lack of clarity extends both to the description of the forecast event and to the use of vague language to express forecast certainty. It is thus difficult to assess the accuracy of most real-world forecasts, and consequently the accuracy of the methods used to generate real-world forecasts. This paper addresses this deficiency by presenting an approach to measuring the accuracy of imprecise real-world forecasts using the same quantitative metrics routinely used to measure the accuracy of well-defined forecasts. To demonstrate applicability, the Inferred Probability Method is applied to measure the accuracy of forecasts in fourteen documents examining complex political domains. Key words: inferred probability, imputed probability, judgment-based forecasting, forecast accuracy, imprecise forecasts, political forecasting, verbal probability, probability calibration.

  8. Estimation of functional failure probability of passive systems based on subset simulation method

    International Nuclear Information System (INIS)

    Wang Dongqing; Wang Baosheng; Zhang Jianmin; Jiang Jing

    2012-01-01

    In order to address the problem of multi-dimensional epistemic uncertainties and the small functional failure probability of passive systems, an innovative reliability analysis algorithm called subset simulation based on Markov chain Monte Carlo was presented. The method is founded on the idea that a small failure probability can be expressed as a product of larger conditional failure probabilities by introducing a proper choice of intermediate failure events. Markov chain Monte Carlo simulation was implemented to efficiently generate conditional samples for estimating the conditional failure probabilities. Taking the AP1000 passive residual heat removal system as an example, the uncertainties related to the model of a passive system and the numerical values of its input parameters were considered in this paper. The probability of functional failure was then estimated with the subset simulation method. The numerical results demonstrate that the subset simulation method has high computing efficiency and excellent computing accuracy compared with traditional probability analysis methods. (authors)
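
    A compact sketch of subset simulation for a small failure probability, using a toy limit state g(x) = x1 + x2 with standard normal inputs and failure defined as g > 6 (exact probability about 1.1e-5); the modified Metropolis details follow the usual textbook scheme, not necessarily the specific implementation of the paper:

```python
import numpy as np

rng = np.random.default_rng(2)

def g(x):
    """Toy performance function; failure when g(x) exceeds the threshold."""
    return x[..., 0] + x[..., 1]

def subset_simulation(threshold=6.0, n=2000, p0=0.1, dim=2):
    x = rng.standard_normal((n, dim))
    prob = 1.0
    for _ in range(20):                                   # conditional levels
        y = g(x)
        b = np.quantile(y, 1 - p0)                        # intermediate threshold
        if b >= threshold:                                # final level reached
            return prob * np.mean(y >= threshold)
        prob *= p0
        seeds = x[y >= b]                                 # samples already in F_i
        steps = int(np.ceil(n / len(seeds)))
        new = []
        for s in seeds:                                   # modified Metropolis walks
            cur = s.copy()
            for _ in range(steps):
                cand = cur + rng.normal(0, 1, dim)
                ratio = np.exp(0.5 * (cur**2 - cand**2))  # component-wise N(0,1) ratio
                keep = rng.uniform(size=dim) < np.minimum(1, ratio)
                prop = np.where(keep, cand, cur)
                if g(prop[None, :])[0] >= b:              # stay inside F_i
                    cur = prop
                new.append(cur.copy())
        x = np.array(new)[:n]
    return prob

print(f"estimated P_f ~ {subset_simulation():.2e}  (exact ~ 1.08e-05)")
```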

  9. Feature curve extraction from point clouds via developable strip intersection

    Directory of Open Access Journals (Sweden)

    Kai Wah Lee

    2016-04-01

    In this paper, we study the problem of computing smooth feature curves from CAD-type point cloud models. The proposed method reconstructs feature curves from the intersections of developable strip pairs which approximate the regions along both sides of the features. The generation of developable surfaces is based on a linear approximation of the given point cloud through a variational shape approximation approach. A line segment sequencing algorithm is proposed for collecting feature line segments into different feature sequences as well as sequential groups of data points. A developable surface approximation procedure is employed to refine the incident approximation planes of data points into developable strips. Some experimental results are included to demonstrate the performance of the proposed method.

  10. Control Surface Fault Diagnosis with Specified Detection Probability - Real Event Experiences

    DEFF Research Database (Denmark)

    Hansen, Søren; Blanke, Mogens

    2013-01-01

    desired levels of false alarms and detection probabilities. Self-tuning residual generators are employed for diagnosis and are combined with statistical change detection to form a setup for robust fault diagnosis. On-line estimation of test statistics is used to obtain a detection threshold and a desired false alarm probability. A data-based method is used to determine the validity of the methods proposed. Verification is achieved using real data and shows that the presented diagnosis method is efficient and could have avoided incidents where faults led to loss of aircraft.

  11. An investigation of the ignition probability and data analysis for the detection of relevant parameters of mechanically generated steel sparks in explosive gas/air-mixtures; Untersuchungen zur Zuendwahrscheinlichkeit und Datenanalyse zur Erfassung der Einflussgroessen mechanisch erzeugter Stahl-Schlagfunktion in explosionsfaehigen Brenngas/Luft-Gemischen

    Energy Technology Data Exchange (ETDEWEB)

    Grunewald, Thomas; Finke, Robert; Graetz, Rainer

    2010-07-01

    Mechanically generated sparks are a potential source of ignition in highly combustible areas. A multiplicity of mechanical and reaction-kinetic influences causes a complex interaction of parameters, and little is known about their effect on the ignition probability. The ignition probability of mechanically generated sparks from an unalloyed steel/unalloyed steel material combination, with kinetic impact energies between 3 and 277 Nm, could be determined within statistically tolerable limits. In addition, the explosiveness of non-oxidized particles at increased temperatures in excess stoichiometric mixtures was proven. A unique correlation between impact energy and ignition probability, as well as a correlation between impact energy and the number of separated particles, could be determined. Also, a principal component analysis considering the interaction of individual particles could not find a specific combination of measurable particle characteristics that correlates with a distinct increase of the ignition probability.

  12. The gene dosage effect of the rad52 mutation on X-ray survival curves of tetraploid yeast strains

    International Nuclear Information System (INIS)

    Ho, K.S.Y.

    1975-01-01

    The mutation rad52 in the yeast Saccharomyces cerevisiae confers sensitivity to X-rays. The gene dosage effect of this mutation on X-ray survival curves of tetraploid yeast strains is shown. With increasing number of rad52 alleles, both a decrease in the survival for a given dose and a decrease in the survival curve shoulder width are observed. The generation of such a family of survival curves using three different mathematical models is discussed

  13. Markov transition probability-based network from time series for characterizing experimental two-phase flow

    International Nuclear Information System (INIS)

    Gao Zhong-Ke; Hu Li-Dan; Jin Ning-De

    2013-01-01

    We generate a directed weighted complex network by a method based on Markov transition probability to represent an experimental two-phase flow. We first systematically carry out gas-liquid two-phase flow experiments for measuring the time series of flow signals. Then we construct directed weighted complex networks from various time series in terms of a network generation method based on Markov transition probability. We find that the generated network inherits the main features of the time series in the network structure. In particular, the networks from time series with different dynamics exhibit distinct topological properties. Finally, we construct two-phase flow directed weighted networks from experimental signals and associate the dynamic behavior of gas-liquid two-phase flow with the topological statistics of the generated networks. The results suggest that the topological statistics of two-phase flow networks allow quantitative characterization of the dynamic flow behavior in the transitions among different gas-liquid flow patterns. (general)
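
    A minimal sketch of the construction described: discretize the signal into amplitude states, count one-step transitions, and use the row-normalized transition probabilities as weights of a directed network. The binning choice and the toy signal below are illustrative assumptions:

```python
import numpy as np

def markov_transition_network(series, n_states=8):
    """Directed weighted adjacency matrix of one-step Markov transition
    probabilities between amplitude bins of a time series."""
    edges = np.linspace(series.min(), series.max(), n_states + 1)
    states = np.clip(np.digitize(series, edges) - 1, 0, n_states - 1)
    counts = np.zeros((n_states, n_states))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1                      # directed edge a -> b
    row_sums = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)

# Toy "flow" signal standing in for a conductance measurement
t = np.linspace(0, 50, 5000)
signal = np.sin(t) + 0.3 * np.random.default_rng(3).standard_normal(t.size)
W = markov_transition_network(signal)
print(W.round(2))
```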

  14. Part 5: Receiver Operating Characteristic Curve and Area under the Curve

    Directory of Open Access Journals (Sweden)

    Saeed Safari

    2016-04-01

    Multiple diagnostic tools are used by emergency physicians every day. In addition, new tools are evaluated to obtain more accurate methods and to reduce the time or cost of conventional ones. In the previous parts of this educational series, we described the diagnostic performance characteristics of diagnostic tests, including sensitivity, specificity, positive and negative predictive values, and likelihood ratios. The receiver operating characteristic (ROC) curve is a graphical presentation of screening characteristics. The ROC curve is used to determine the best cutoff point and to compare two or more tests or observers by measuring the area under the curve (AUC). In this part of our educational series, we explain the ROC curve and two methods to determine the best cutoff value.
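
    A short sketch of computing an ROC curve, its AUC, and a best cutoff via the Youden index (one common method for choosing a cutoff) with scikit-learn; the simulated test scores below are only for illustration:

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(4)
# Simulated test results: diseased patients score higher on average
y_true = np.concatenate([np.ones(200), np.zeros(300)])
scores = np.concatenate([rng.normal(2.0, 1.0, 200), rng.normal(0.0, 1.0, 300)])

fpr, tpr, thresholds = roc_curve(y_true, scores)
auc = roc_auc_score(y_true, scores)

youden = tpr - fpr                        # Youden index J = sensitivity + specificity - 1
best = np.argmax(youden)
print(f"AUC = {auc:.3f}")
print(f"Best cutoff (Youden) = {thresholds[best]:.2f}, "
      f"sensitivity = {tpr[best]:.2f}, specificity = {1 - fpr[best]:.2f}")
```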

  15. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields a higher probability of detection than ranking by classical probability, provided a given probability of ...

  16. Emergency diesel generator reliability analysis high flux isotope reactor

    International Nuclear Information System (INIS)

    Merryman, L.; Christie, B.

    1993-01-01

    A program to apply some of the techniques of reliability engineering to the High Flux Isotope Reactor (HFIR) was started on August 8, 1992. Part of the program was to track the conditional probabilities of the emergency diesel generators responding to a valid demand. This was done to determine if the performance of the emergency diesel generators (which are more than 25 years old) has deteriorated. The conditional probabilities of the diesel generators were computed and trended for the period from May 1990 to December 1992. The calculations indicate that the performance of the emergency diesel generators has not deteriorated in recent years, i.e., the conditional probabilities of the emergency diesel generators have been fairly stable over the last few years. This information will be one factor that may be considered in the decision to replace the emergency diesel generators.

  17. Estimating the concordance probability in a survival analysis with a discrete number of risk groups.

    Science.gov (United States)

    Heller, Glenn; Mo, Qianxing

    2016-04-01

    A clinical risk classification system is an important component of a treatment decision algorithm. A measure used to assess the strength of a risk classification system is discrimination, and when the outcome is survival time, the most commonly applied global measure of discrimination is the concordance probability. The concordance probability represents the pairwise probability of lower patient risk given longer survival time. The c-index and the concordance probability estimate have been used to estimate the concordance probability when patient-specific risk scores are continuous. In the current paper, the concordance probability estimate and an inverse probability censoring weighted c-index are modified to account for discrete risk scores. Simulations are generated to assess the finite sample properties of the concordance probability estimate and the weighted c-index. An application of these measures of discriminatory power to a metastatic prostate cancer risk classification system is examined.
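
    As a rough illustration of the concordance idea, the sketch below computes a Harrell-type c-index for right-censored data with a small number of discrete risk groups. It is the naive pairwise estimator on invented data, not the modified concordance probability estimate or the inverse probability censoring weighted c-index developed in the paper.

```python
import numpy as np

def c_index(time, event, risk):
    """Harrell-type concordance for right-censored survival data.

    time  : observed follow-up times
    event : 1 if the event was observed, 0 if censored
    risk  : discrete risk-group labels (higher = assumed worse prognosis)
    """
    time, event, risk = map(np.asarray, (time, event, risk))
    concordant, comparable = 0.0, 0.0
    n = len(time)
    for i in range(n):
        for j in range(n):
            # a pair is usable only if the earlier time corresponds to an observed event
            if event[i] == 1 and time[i] < time[j]:
                comparable += 1
                if risk[i] > risk[j]:
                    concordant += 1
                elif risk[i] == risk[j]:   # ties in risk count as one half
                    concordant += 0.5
    return concordant / comparable

# toy example with three discrete risk groups (0 = low, 2 = high)
t = [5, 8, 12, 3, 9, 15]
e = [1, 1, 0, 1, 0, 1]
g = [2, 1, 0, 2, 1, 0]
print(c_index(t, e, g))
```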

  18. Evaluation of PCR and high-resolution melt curve analysis for differentiation of Salmonella isolates.

    Science.gov (United States)

    Saeidabadi, Mohammad Sadegh; Nili, Hassan; Dadras, Habibollah; Sharifiyazdi, Hassan; Connolly, Joanne; Valcanis, Mary; Raidal, Shane; Ghorashi, Seyed Ali

    2017-06-01

    Consumption of poultry products contaminated with Salmonella is one of the major causes of foodborne diseases worldwide and therefore detection and differentiation of Salmonella spp. in poultry is important. In this study, oligonucleotide primers were designed from hemD gene and a PCR followed by high-resolution melt (HRM) curve analysis was developed for rapid differentiation of Salmonella isolates. Amplicons of 228 bp were generated from 16 different Salmonella reference strains and from 65 clinical field isolates mainly from poultry farms. HRM curve analysis of the amplicons differentiated Salmonella isolates and analysis of the nucleotide sequence of the amplicons from selected isolates revealed that each melting curve profile was related to a unique DNA sequence. The relationship between reference strains and tested specimens was also evaluated using a mathematical model without visual interpretation of HRM curves. In addition, the potential of the PCR-HRM curve analysis was evaluated for genotyping of additional Salmonella isolates from different avian species. The findings indicate that PCR followed by HRM curve analysis provides a rapid and robust technique for genotyping of Salmonella isolates to determine the serovar/serotype.

  19. THE FIRST SYSTEMATIC STUDY OF TYPE Ibc SUPERNOVA MULTI-BAND LIGHT CURVES

    International Nuclear Information System (INIS)

    Drout, Maria R.; Soderberg, Alicia M.; Gal-Yam, Avishay; Arcavi, Iair; Green, Yoav; Cenko, S. Bradley; Fox, Derek B.; Leonard, Douglas C.; Sand, David J.; Moon, Dae-Sik

    2011-01-01

    We present detailed optical photometry for 25 Type Ibc supernovae (SNe Ibc) within d ≈ 150 Mpc obtained with the robotic Palomar 60 inch telescope in 2004-2007. This study represents the first uniform, systematic, and statistical sample of multi-band SNe Ibc light curves available to date. We correct the light curves for host galaxy extinction using a new technique based on the photometric color evolution, namely, we show that the (V – R) color of extinction-corrected SNe Ibc at Δt ≈ 10 days after V-band maximum is tightly distributed, ⟨(V – R)_V10⟩ = 0.26 ± 0.06 mag. Using this technique, we find that SNe Ibc typically suffer from significant host galaxy extinction, ⟨E(B – V)⟩ ≈ 0.4 mag. A comparison of the extinction-corrected light curves for helium-rich (Type Ib) and helium-poor (Type Ic) SNe reveals that they are statistically indistinguishable, both in luminosity and decline rate. We report peak absolute magnitudes of ⟨M_R⟩ = –17.9 ± 0.9 mag and ⟨M_R⟩ = –18.3 ± 0.6 mag for SNe Ib and Ic, respectively. Focusing on the broad-lined (BL) SNe Ic, we find that they are more luminous than the normal SNe Ibc sample, ⟨M_R⟩ = –19.0 ± 1.1 mag, with a probability of only 1.6% that they are drawn from the same population of explosions. By comparing the peak absolute magnitudes of SNe Ic-BL with those inferred for local engine-driven explosions (GRB-SN 1998bw, XRF-SN 2006aj, and SN 2009bb) we find a 25% probability that relativistic SNe are drawn from the overall SNe Ic-BL population. Finally, we fit analytic models to the light curves to derive typical ^56Ni masses of M_Ni ≈ 0.2 and 0.5 M_☉ for SNe Ibc and SNe Ic-BL, respectively. With reasonable assumptions for the photospheric velocities, we further extract kinetic energy and ejecta mass values of M_ej ≈ 2 M_☉ and E_K ≈ 10^51 erg for SNe Ibc, while for SNe Ic-BL we find higher values, M_ej ≈ 5 M_☉ and E_K ≈ 10^52 erg. We discuss the implications for the progenitors of SNe Ibc.

  20. Method of construction spatial transition curve

    Directory of Open Access Journals (Sweden)

    S.V. Didanov

    2013-04-01

    Full Text Available Purpose. The movement of rail transport (rolling stock speed, traffic safety, etc.) is largely dependent on the quality of the track. A special role is played by the transition curve, which ensures a smooth transition from a linear to a circular section of the road. The article deals with modeling a spatial transition curve based on a parabolic distribution of curvature and torsion, continuing the authors' research on the spatial modeling of curved contours. Methodology. The spatial transition curve is constructed using numerical methods for solving nonlinear integral equations, where the initial data are the coordinates of the starting and ending points of the future curve, together with the inclination of the tangent and the deviation of the curve from the tangent plane at these points. The system solved by the numerical method consists of the partial derivatives of the equations with respect to the unknown parameters of the law of change of torsion and the length of the transition curve. Findings. The parametric equations of the spatial transition curve are obtained by finding the unknown coefficients of the parabolic distribution of curvature and torsion, as well as the spatial length of the transition curve. Originality. A method for constructing the spatial transition curve is devised, and software for the geometric modeling of spatial transition curves of railway track with specified deviations of the curve from the tangent plane is based on it. Practical value. The resulting curve can be applied in any sector of the economy where it is necessary to ensure a smooth transition from a linear to a circular section of a curved spatial bypass. Examples are the transition curve in the construction of a railway line, road, pipe, profile, a flat section of the working blades of a turbine or compressor, a ship, plane, car, etc.

  1. Probability distribution for the Gaussian curvature of the zero level surface of a random function

    Science.gov (United States)

    Hannay, J. H.

    2018-04-01

    A rather natural construction for a smooth random surface in space is the level surface of value zero, or ‘nodal’ surface f(x,y,z)  =  0, of a (real) random function f; the interface between positive and negative regions of the function. A physically significant local attribute at a point of a curved surface is its Gaussian curvature (the product of its principal curvatures) because, when integrated over the surface it gives the Euler characteristic. Here the probability distribution for the Gaussian curvature at a random point on the nodal surface f  =  0 is calculated for a statistically homogeneous (‘stationary’) and isotropic zero mean Gaussian random function f. Capitalizing on the isotropy, a ‘fixer’ device for axes supplies the probability distribution directly as a multiple integral. Its evaluation yields an explicit algebraic function with a simple average. Indeed, this average Gaussian curvature has long been known. For a non-zero level surface instead of the nodal one, the probability distribution is not fully tractable, but is supplied as an integral expression.

  2. Photoelectic BV Light Curves of Algol and the Interpretations of the Light Curves

    Directory of Open Access Journals (Sweden)

    Ho-Il Kim

    1985-06-01

    Full Text Available Standardized B and V photoelectric light curves of Algol were made from observations obtained during 1982-84 with the 40-cm and 61-cm reflectors of Yonsei University Observatory. These light curves show asymmetry between the ascending and descending shoulders. The ascending shoulder is 0.02 mag brighter than the descending shoulder in the V light curve and 0.03 mag in the B light curve. These asymmetric light curves are interpreted as the result of an inhomogeneous energy distribution on the surface of one star of the eclipsing pair rather than of a gaseous stream flowing from the K0IV to the B8V star. The 180-year periodicity, the so-called great inequality, is most likely the result of the mechanism proposed by Kim et al. (1983), namely that abrupt and discrete mass losses of the cooler component may be the cause of this orbital change. The amount of mass loss deduced from these discrete period changes turned out to be of the order of 10^(-6) - 10^(-5) Msolar.

  3. A Journey Between Two Curves

    Directory of Open Access Journals (Sweden)

    Sergey A. Cherkis

    2007-03-01

    Full Text Available A typical solution of an integrable system is described in terms of a holomorphic curve and a line bundle over it. The curve provides the action variables while the time evolution is a linear flow on the curve's Jacobian. Even though the system of Nahm equations is closely related to the Hitchin system, the curves appearing in these two cases have very different nature. The former can be described in terms of some classical scattering problem while the latter provides a solution to some Seiberg-Witten gauge theory. This note identifies the setup in which one can formulate the question of relating the two curves.

  4. Calculating the Probability of Returning a Loan with Binary Probability Models

    Directory of Open Access Journals (Sweden)

    Julian Vasilev

    2014-12-01

    Full Text Available The purpose of this article is to give a new approach to calculating the probability of returning a loan. Many factors affect the value of this probability. In this article, several influencing factors are identified by using statistical and econometric models. The main approach is concerned with applying probit and logit models in loan management institutions. A new aspect of credit risk analysis is given. Calculating the probability of returning a loan is a difficult task. We assume that specific data fields concerning the contract (month of signing, year of signing, given sum) and data fields concerning the borrower of the loan (month of birth, year of birth (age), gender, region where he/she lives) may be independent variables in a binary logistic model with the dependent variable "the probability of returning a loan". It is proved that the month of signing a contract, the year of signing a contract, the gender and the age of the loan owner do not affect the probability of returning a loan. It is proved that the probability of returning a loan depends on the sum of the contract, the remoteness of the loan owner and the month of birth. The probability of returning a loan increases with the increase of the given sum, decreases with the proximity of the customer, increases for people born in the beginning of the year and decreases for people born at the end of the year.
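
    A minimal sketch of the kind of binary logit model described above, using statsmodels; the column names and simulated values are hypothetical stand-ins for the contract and borrower fields listed in the abstract, not the authors' dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# hypothetical data; the field names mirror the predictors discussed in the abstract
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "returned":    rng.binomial(1, 0.7, 500),        # 1 = loan repaid
    "sum_given":   rng.uniform(1e3, 2e4, 500),
    "remoteness":  rng.uniform(0, 300, 500),         # distance of the borrower, km
    "birth_month": rng.integers(1, 13, 500),
})

X = sm.add_constant(df[["sum_given", "remoteness", "birth_month"]].astype(float))
model = sm.Logit(df["returned"], X).fit(disp=0)
print(model.summary())

# predicted probability of returning a loan for one new applicant
# order of values: constant, sum_given, remoteness, birth_month
print(model.predict([[1.0, 8000.0, 120.0, 2.0]]))
```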

  5. The estimation of probable maximum precipitation: the case of Catalonia.

    Science.gov (United States)

    Casas, M Carmen; Rodríguez, Raül; Nieto, Raquel; Redaño, Angel

    2008-12-01

    A brief overview of the different techniques used to estimate the probable maximum precipitation (PMP) is presented. As a particular case, the 1-day PMP over Catalonia has been calculated and mapped with a high spatial resolution. For this purpose, the annual maximum daily rainfall series from 145 pluviometric stations of the Instituto Nacional de Meteorología (Spanish Weather Service) in Catalonia have been analyzed. In order to obtain values of PMP, an enveloping frequency factor curve based on the actual rainfall data of stations in the region has been developed. This enveloping curve has been used to estimate 1-day PMP values of all the 145 stations. Applying the Cressman method, the spatial analysis of these values has been achieved. Monthly precipitation climatological data, obtained from the application of Geographic Information Systems techniques, have been used as the initial field for the analysis. The 1-day PMP at 1 km(2) spatial resolution over Catalonia has been objectively determined, varying from 200 to 550 mm. Structures with wavelength longer than approximately 35 km can be identified and, despite their general concordance, the obtained 1-day PMP spatial distribution shows remarkable differences compared to the annual mean precipitation arrangement over Catalonia.
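
    The enveloping frequency factor approach is of the general Hershfield statistical form PMP = mean + k_m · standard deviation of the annual maximum daily rainfall series. The sketch below illustrates that form only; the value of k_m and the rainfall series are assumptions, not the enveloping curve derived for Catalonia.

```python
import numpy as np

def pmp_statistical(annual_max_daily, k_m):
    """Hershfield-type statistical PMP estimate for one station:
    PMP = mean + k_m * standard deviation of the annual maximum daily series."""
    x = np.asarray(annual_max_daily, dtype=float)
    return x.mean() + k_m * x.std(ddof=1)

# hypothetical 30-year series of annual maximum daily rainfall (mm) at one station
rng = np.random.default_rng(1)
series = rng.gamma(shape=6.0, scale=12.0, size=30)
print(pmp_statistical(series, k_m=15.0))   # k_m is an assumed enveloping frequency factor
```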

  6. EXPLORING THE VARIABLE SKY WITH LINEAR. III. CLASSIFICATION OF PERIODIC LIGHT CURVES

    Energy Technology Data Exchange (ETDEWEB)

    Palaversa, Lovro; Eyer, Laurent; Rimoldini, Lorenzo [Observatoire Astronomique de l' Université de Genève, 51 chemin des Maillettes, CH-1290 Sauverny (Switzerland); Ivezić, Željko; Loebman, Sarah; Hunt-Walker, Nicholas; VanderPlas, Jacob; Westman, David; Becker, Andrew C. [Department of Astronomy, University of Washington, P.O. Box 351580, Seattle, WA 98195-1580 (United States); Ruždjak, Domagoj; Sudar, Davor; Božić, Hrvoje [Hvar Observatory, Faculty of Geodesy, Kačićeva 26, 10000 Zagreb (Croatia); Galin, Mario [Faculty of Geodesy, Kačićeva 26, 10000 Zagreb (Croatia); Kroflin, Andrea; Mesarić, Martina; Munk, Petra; Vrbanec, Dijana [Department of Physics, Faculty of Science, University of Zagreb, Bijenička cesta 32, 10000 Zagreb (Croatia); Sesar, Branimir [Division of Physics, Mathematics, and Astronomy, Caltech, Pasadena, CA 91125 (United States); Stuart, J. Scott [Lincoln Laboratory, Massachusetts Institute of Technology, 244 Wood Street, Lexington, MA 02420-9108 (United States); Srdoč, Gregor, E-mail: lovro.palaversa@unige.ch [Saršoni 90, 51216 Viškovo (Croatia); and others

    2013-10-01

    We describe the construction of a highly reliable sample of ∼7000 optically faint periodic variable stars with light curves obtained by the asteroid survey LINEAR across 10,000 deg² of the northern sky. The majority of these variables have not been cataloged yet. The sample flux limit is several magnitudes fainter than most other wide-angle surveys; the photometric errors range from ∼0.03 mag at r = 15 to ∼0.20 mag at r = 18. Light curves include on average 250 data points, collected over about a decade. Using Sloan Digital Sky Survey (SDSS) based photometric recalibration of the LINEAR data for about 25 million objects, we selected ∼200,000 most probable candidate variables with r < 17 and visually confirmed and classified ∼7000 periodic variables using phased light curves. The reliability and uniformity of visual classification across eight human classifiers was calibrated and tested using a catalog of variable stars from the SDSS Stripe 82 region and verified using an unsupervised machine learning approach. The resulting sample of periodic LINEAR variables is dominated by 3900 RR Lyrae stars and 2700 eclipsing binary stars of all subtypes and includes small fractions of relatively rare populations such as asymptotic giant branch stars and SX Phoenicis stars. We discuss the distribution of these mostly uncataloged variables in various diagrams constructed with optical-to-infrared SDSS, Two Micron All Sky Survey, and Wide-field Infrared Survey Explorer photometry, and with LINEAR light-curve features. We find that the combination of light-curve features and colors enables classification schemes much more powerful than when colors or light curves are each used separately. An interesting side result is a robust and precise quantitative description of a strong correlation between the light-curve period and color/spectral type for close and contact eclipsing binary stars (β Lyrae and W UMa): as the color-based spectral type varies from K4 to F5, the

  7. Bond yield curve construction

    Directory of Open Access Journals (Sweden)

    Kožul Nataša

    2014-01-01

    Full Text Available In the broadest sense, the yield curve indicates the market's view of the evolution of interest rates over time. However, given that the cost of borrowing is closely linked to creditworthiness (ability to repay), different yield curves will apply to different currencies, market sectors, or even individual issuers. As government borrowing is indicative of interest rate levels available to other market players in a particular country, and considering that bond issuance still remains the dominant form of sovereign debt, this paper describes yield curve construction using bonds. The relationship between zero-coupon yield, par yield and yield to maturity is given, and their usage in determining curve discount factors is described. Their usage in deriving forward rates and pricing related derivative instruments is also discussed.
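
    As a small worked example of how par yields determine curve discount factors, the sketch below bootstraps discount factors from a hypothetical par curve of bonds paying annual coupons; real curve construction additionally handles day counts, settlement conventions, and interpolation between market maturities.

```python
def bootstrap_discount_factors(par_yields):
    """par_yields[t] is the par yield (as a decimal) of a bond maturing in t+1 years
    with annual coupons and principal 100. A par bond prices at 100, so
        1 = y_T * sum_{i<=T} DF_i + DF_T,
    which is solved for DF_T once DF_1 .. DF_{T-1} are known."""
    dfs = []
    for y in par_yields:
        df_t = (1.0 - y * sum(dfs)) / (1.0 + y)
        dfs.append(df_t)
    return dfs

# hypothetical upward-sloping par curve for maturities of 1 to 5 years
print(bootstrap_discount_factors([0.020, 0.024, 0.027, 0.029, 0.030]))
```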

  8. APPROXIMATION OF PROBABILITY DISTRIBUTIONS IN QUEUEING MODELS

    Directory of Open Access Journals (Sweden)

    T. I. Aliev

    2013-03-01

    Full Text Available For probability distributions with a coefficient of variation not equal to unity, mathematical dependences for approximating distributions on the basis of the first two moments are derived by making use of multi-exponential distributions. It is proposed to approximate distributions with a coefficient of variation less than unity by using the hypoexponential distribution, which makes it possible to generate random variables with a coefficient of variation taking any value in the range (0; 1), as opposed to the Erlang distribution, which admits only discrete values of the coefficient of variation.
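
    A minimal illustration of the two-moment idea: a two-phase hypoexponential variable (a sum of independent exponentials with different rates) has mean Σ 1/λ_i and variance Σ 1/λ_i², so its coefficient of variation lies strictly between 1/√2 and 1. The rates below are arbitrary and serve only to check the moments by simulation.

```python
import numpy as np

rates = [2.0, 5.0]                               # arbitrary phase rates lambda_1, lambda_2
mean = sum(1.0 / r for r in rates)               # E[X]   = sum of 1/lambda_i
var = sum(1.0 / r ** 2 for r in rates)           # Var[X] = sum of 1/lambda_i^2
print("analytic mean, CV:", mean, np.sqrt(var) / mean)

# Monte Carlo check: a hypoexponential sample is a sum of independent exponential phases
rng = np.random.default_rng(0)
samples = sum(rng.exponential(1.0 / r, size=100_000) for r in rates)
print("simulated mean, CV:", samples.mean(), samples.std() / samples.mean())
```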

  9. On the Possibility of Assigning Probabilities to Singular Cases, or: Probability Is Subjective Too!

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2009-06-01

    Full Text Available Both Ludwig von Mises and Richard von Mises claimed that numerical probability could not be legitimately applied to singular cases. This paper challenges this aspect of the von Mises brothers’ theory of probability. It is argued that their denial that numerical probability could be applied to singular cases was based solely upon Richard von Mises’ exceptionally restrictive definition of probability. This paper challenges Richard von Mises’ definition of probability by arguing that the definition of probability necessarily depends upon whether the world is governed by time-invariant causal laws. It is argued that if the world is governed by time-invariant causal laws, a subjective definition of probability must be adopted. It is further argued that both the nature of human action and the relative frequency method for calculating numerical probabilities both presuppose that the world is indeed governed by time-invariant causal laws. It is finally argued that the subjective definition of probability undercuts the von Mises claim that numerical probability cannot legitimately be applied to singular, non-replicable cases.

  10. Trajectory Optimization of Spray Painting Robot for Complex Curved Surface Based on Exponential Mean Bézier Method

    Directory of Open Access Journals (Sweden)

    Wei Chen

    2017-01-01

    Full Text Available Automated tool trajectory planning for spray painting robots is still a challenging problem, especially for a large complex curved surface. This paper presents a new method of trajectory optimization for spray painting robot based on exponential mean Bézier method. The definition and the three theorems of exponential mean Bézier curves are discussed. Then a spatial painting path generation method based on exponential mean Bézier curves is developed. A new simple algorithm for trajectory optimization on complex curved surfaces is introduced. A golden section method is adopted to calculate the values. The experimental results illustrate that the exponential mean Bézier curves enhanced flexibility of the path planning, and the trajectory optimization algorithm achieved satisfactory performance. This method can also be extended to other applications.

  11. Stability of skyrmions on curved surfaces in the presence of a magnetic field

    Energy Technology Data Exchange (ETDEWEB)

    Carvalho-Santos, V.L., E-mail: vagson.carvalho@usach.cl [Instituto Federal de Educação, Ciência e Tecnologia Baiano - Campus Senhor do Bonfim, Km 04 Estrada da Igara, 48970-000 Senhor do Bonfim, Bahia (Brazil); Departamento de Física, Universidad de Santiago de Chile and CEDENNA, Avda. Ecuador 3493, Santiago (Chile); Elias, R.G.; Altbir, D. [Departamento de Física, Universidad de Santiago de Chile and CEDENNA, Avda. Ecuador 3493, Santiago (Chile); Fonseca, J.M. [Universidade Federal de Viçosa, Departamento de Física, Avenida Peter Henry Rolfs s/n, 36570-000 Viçosa, MG (Brazil)

    2015-10-01

    We study the stability and energetics associated to skyrmions appearing as excitations on curved surfaces. Using a continuum model we show that the presence of cylindrically radial and azimuthal fields destabilize the skyrmions that appear in the absence of an external field. Weak fields generate fractional skyrmions while strong magnetic fields yield stable 2π-skyrmions, which have their widths diminished by the magnetic field strength. Under azimuthal fields vortex appears as stable state on the curved surface. - Highlights: • Stability of skyrmions on curved surfaces in the presence of a magnetic field. • Weak fields can destabilize skyrmions. • Strong magnetic fields yield the appearing of 2π-skyrmions. • The width of skyrmions is determined by the curvature and magnetic field strength. • Under azimuthal fields vortex appears as stable states.

  12. Modeling of a Robust Confidence Band for the Power Curve of a Wind Turbine.

    Science.gov (United States)

    Hernandez, Wilmar; Méndez, Alfredo; Maldonado-Correa, Jorge L; Balleteros, Francisco

    2016-12-07

    Having an accurate model of the power curve of a wind turbine allows us to better monitor its operation and plan storage capacity. Since wind speed and direction are of a highly stochastic nature, the forecasting of the power generated by the wind turbine is of the same nature as well. In this paper, a method for obtaining a robust confidence band containing the power curve of a wind turbine under test conditions is presented. Here, the confidence band is bounded by two curves which are estimated using parametric statistical inference techniques. However, the observations that are used for carrying out the statistical analysis are obtained by using the binning method, and in each bin, the outliers are eliminated by using a censorship process based on robust statistical techniques. Then, the observations that are not outliers are divided into observation sets. Finally, both the power curve of the wind turbine and the two curves that define the robust confidence band are estimated using each of the previously mentioned observation sets.
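
    A minimal sketch of the binning step with a simple robust censorship rule (discarding points outside median ± k·MAD within each wind-speed bin). The bin width, the MAD rule, and the synthetic data are assumptions for illustration; the paper's parametric inference for the confidence band itself is not reproduced.

```python
import numpy as np

def binned_power_curve(wind_speed, power, bin_width=0.5, k=3.0):
    """Bin observations by wind speed and, inside each bin, discard outliers
    outside median +/- k * MAD before averaging (a simple robust censorship rule)."""
    wind_speed, power = np.asarray(wind_speed), np.asarray(power)
    edges = np.arange(wind_speed.min(), wind_speed.max() + bin_width, bin_width)
    centers, means = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (wind_speed >= lo) & (wind_speed < hi)
        if not mask.any():
            continue
        p = power[mask]
        med = np.median(p)
        mad = np.median(np.abs(p - med)) + 1e-12
        keep = np.abs(p - med) <= k * mad
        centers.append(0.5 * (lo + hi))
        means.append(p[keep].mean())
    return np.array(centers), np.array(means)

# synthetic usage
rng = np.random.default_rng(0)
ws = rng.uniform(3.0, 20.0, 5000)
pw = np.clip((ws - 3.0) ** 3, 0.0, 2000.0) + rng.normal(0.0, 50.0, ws.size)
centers, means = binned_power_curve(ws, pw)
print(centers[:5], means[:5])
```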

  13. Modeling the probability distribution of peak discharge for infiltrating hillslopes

    Science.gov (United States)

    Baiamonte, Giorgio; Singh, Vijay P.

    2017-07-01

    Hillslope response plays a fundamental role in the prediction of peak discharge at the basin outlet. The peak discharge for the critical duration of rainfall and its probability distribution are needed for designing urban infrastructure facilities. This study derives the probability distribution, denoted as the GABS model, by coupling three models: (1) the Green-Ampt model for computing infiltration, (2) the kinematic wave model for computing the discharge hydrograph from the hillslope, and (3) the intensity-duration-frequency (IDF) model for computing design rainfall intensity. The Hortonian mechanism for runoff generation is employed for computing the surface runoff hydrograph. Since the antecedent soil moisture condition (ASMC) significantly affects the rate of infiltration, its effect on the probability distribution of peak discharge is investigated. Application to a watershed in Sicily, Italy, shows that with the increase of probability, the expected effect of ASMC to increase the maximum discharge diminishes. Only for low values of probability is the critical duration of rainfall influenced by ASMC, whereas its effect on the peak discharge seems to be small for any probability. For a set of parameters, the derived probability distribution of peak discharge seems to be well fitted by the gamma distribution. Finally, an application to a small watershed, with the aim of testing the possibility of arranging in advance the rational runoff coefficient tables to be used for the rational method, and a comparison between peak discharges obtained by the GABS model and those measured in an experimental flume for a loamy-sand soil were carried out.

  14. Birth weight curves tailored to maternal world region.

    Science.gov (United States)

    Ray, Joel G; Sgro, Michael; Mamdani, Muhammad M; Glazier, Richard H; Bocking, Alan; Hilliard, Robert; Urquia, Marcelo L

    2012-02-01

    Newborns of certain immigrant mothers are smaller at birth than those of domestically born mothers. Contemporary, population-derived percentile curves for these newborns are lacking, as are estimates of their risk of being misclassified as too small or too large using conventional rather than tailored birth weight curves. We completed a population-based study of 766 688 singleton live births in Ontario from 2002 to 2007. Smoothed birth weight percentile curves were generated for males and females, categorized by maternal world region of birth: Canada (63.5%), Europe/Western nations (7.6%), Africa/Caribbean (4.9%), Middle East/North Africa (3.4%), Latin America (3.4%), East Asia/Pacific (8.1%), and South Asia (9.2%). We determined the likelihood of misclassifying an infant as small for gestational age (≤ 10th percentile for weight) or as large for gestational age (≥ 90th percentile for weight) on a Canadian-born maternal curve versus one specific to maternal world region of origin. Significantly lower birth weights were seen at gestation-specific 10th, 50th, and 90th percentiles among term infants born to mothers from each world region, with the exception of Europe/Western nations, compared with those for infants of Canadian-born mothers. For example, for South Asian babies born at 40 weeks' gestation, the absolute difference at the 10th percentile was 198 g (95% CI 183 to 212) for males and 170 g (95% CI 161 to 179) for females. Controlling for maternal age and parity, South Asian males had an odds ratio of 2.60 (95% CI 2.53 to 2.68) of being misclassified as small for gestational age, equivalent to approximately 116 in 1000 newborns; for South Asian females the OR was 2.41 (95% CI 2.34 to 2.48), equivalent to approximately 106 per 1000 newborns. Large for gestational age would be missed in approximately 61 per 1000 male and 57 per 1000 female South Asian newborns if conventional rather than ethnicity-specific birth weight curves were used. Birth weight curves

  15. Generation of response functions of a NaI detector by using an interpolation technique

    International Nuclear Information System (INIS)

    Tominaga, Shoji

    1983-01-01

    A computer method is developed for generating response functions of a NaI detector to monoenergetic γ-rays. The method is based on an interpolation between response curves measured with a detector. The computer programs are constructed for Heath's response spectral library. The principle of the basic mathematics used for interpolation, which was reported previously by the author et al., is that response curves can be decomposed into a linear combination of intrinsic-component patterns, and thereby the interpolation of curves is reduced to a simple interpolation of the weighting coefficients needed to combine the component patterns. This technique has the advantages of data compression, reduction in computation time, and stability of the solution, in comparison with the usual functional fitting method. A processing method based on segmentation of a spectrum is devised to generate useful and precise response curves. The spectral curve obtained for each γ-ray source is divided into regions defined by the physical processes, such as the photopeak area, the Compton continuum area, the backscatter peak area, and so on. Each segment curve is then processed separately for interpolation. Lastly, the curves estimated for the respective areas are connected on a single channel scale. The generation programs are explained briefly. It is shown that the generated curve represents the overall shape of a response spectrum, including not only its photopeak but also the corresponding Compton area, with sufficient accuracy. (author)
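
    A minimal sketch of the decomposition-and-interpolation idea on a hypothetical small library of measured response spectra: an SVD supplies intrinsic component patterns, the per-spectrum weighting coefficients are interpolated in energy, and the response at an unmeasured energy is recombined from the patterns. This is an illustration only, not Heath's library or the author's program.

```python
import numpy as np

# hypothetical library: rows are measured response spectra at known gamma-ray energies
energies = np.array([300.0, 600.0, 900.0, 1200.0])      # keV
rng = np.random.default_rng(0)
library = rng.random((4, 256))                           # 4 spectra, 256 channels

# decompose the library into a few intrinsic component patterns
U, s, Vt = np.linalg.svd(library, full_matrices=False)
n_comp = 3
patterns = Vt[:n_comp]                     # component patterns (one per row, over channels)
weights = U[:, :n_comp] * s[:n_comp]       # weighting coefficients for each measured spectrum

# interpolate the weighting coefficients to an unmeasured energy,
# then recombine them with the patterns to synthesize the response curve
e_new = 750.0
w_new = np.array([np.interp(e_new, energies, weights[:, j]) for j in range(n_comp)])
response_new = w_new @ patterns
print(response_new.shape)
```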

  16. Parametric curve evaluation of a phototransistor used as detector in stereotactic radiosurgery X-ray beam

    International Nuclear Information System (INIS)

    Lima, Daniela Pontes A.; Santos, Luiz Antonio P.; Santos, Walter M.; Silva Junior, Eronides F. da

    2005-01-01

    Phototransistors have been widely used as detectors for low energy X-rays. However, when they are used in high energy X-ray fields like those generated by linear accelerators (linacs), there is a certain loss of sensitivity to the ionizing radiation. This damage is cumulative and irreversible. Thus, a correction factor must be applied to the device response, which is proportional to the integrated dose. However, it is possible to estimate the correction factor by using the V x I parametric curve of the device. The aim of this work was to develop studies to evaluate and correlate the parametric response curve of a phototransistor with its loss of sensitivity after irradiation. An Agilent 4155C semiconductor parameter analyzer was used to trace the parametric curve. X-rays were generated by a 14 MV Primus-Siemens linear accelerator. The results demonstrated that there is a correlation between the integrated dose applied to the phototransistor and the parametric response of the device. Studies are under way to determine how such behavior can provide information for the dosimetric planning in stereotactic radiosurgery. (author)

  17. Curve Digitizer – A software for multiple curves digitizing

    Directory of Open Access Journals (Sweden)

    Florentin ŞPERLEA

    2010-06-01

    Full Text Available The Curve Digitizer is software that extracts data from an image file representing a graphic and returns them as pairs of numbers which can then be used for further analysis and applications. Numbers can be read on a computer screen, stored in files, or copied on paper. The final result is a data set that can be used with other tools such as MS EXCEL. Curve Digitizer provides a useful tool for any researcher or engineer interested in quantifying the data displayed graphically. The image file can be obtained by scanning a document.

  18. An adaptive-binning method for generating constant-uncertainty/constant-significance light curves with Fermi-LAT data

    International Nuclear Information System (INIS)

    Lott, B.; Escande, L.; Larsson, S.; Ballet, J.

    2012-01-01

    Here, we present a method enabling the creation of constant-uncertainty/constant-significance light curves with the data of the Fermi-Large Area Telescope (LAT). The adaptive-binning method enables more information to be encapsulated within the light curve than with the fixed-binning method. Although primarily developed for blazar studies, it can be applied to any sources. Furthermore, this method allows the starting and ending times of each interval to be calculated in a simple and quick way during a first step. The reported mean flux and spectral index (assuming the spectrum is a power-law distribution) in the interval are calculated via the standard LAT analysis during a second step. The absence of major caveats associated with this method has been established with Monte Carlo simulations. We present the performance of this method in determining duty cycles as well as power-density spectra relative to the traditional fixed-binning method.
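
    A minimal sketch of the adaptive-binning idea applied to an already-measured flux series: consecutive points are accumulated until the relative uncertainty of the bin-averaged flux drops below a target. The actual LAT procedure estimates bin edges from the photon data and a likelihood analysis, so this is only a conceptual stand-in with invented numbers.

```python
import numpy as np

def adaptive_bins(t, flux, flux_err, target_rel_err=0.15):
    """Group consecutive measurements until the relative uncertainty of the
    bin-averaged flux falls below target_rel_err. Returns the bin edge times."""
    edges, start = [t[0]], 0
    for i in range(1, len(t) + 1):
        f = np.asarray(flux[start:i], dtype=float)
        e = np.asarray(flux_err[start:i], dtype=float)
        mean = f.mean()
        err = np.sqrt((e ** 2).sum()) / len(f)       # uncertainty of the bin mean
        if mean > 0 and err / mean <= target_rel_err:
            edges.append(t[i - 1])
            start = i
    return edges

# invented example: 100 equally spaced points, each with 20% relative uncertainty
t = np.arange(100.0)
flux = np.full(100, 1.0e-7)
print(adaptive_bins(t, flux, 0.2 * flux, target_rel_err=0.1))
```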

  19. CAMAC modular programmable function generator

    Energy Technology Data Exchange (ETDEWEB)

    Turner, G.W.; Suehiro, S.; Hendricks, R.W.

    1980-12-01

    A CAMAC modular programmable function generator has been developed. The device contains a 1024 word by 12-bit memory, a 12-bit digital-to-analog converter with a 600 ns settling time, an 18-bit programmable frequency register, and two programmable trigger output registers. The trigger registers can produce programmed output logic transitions at various (binary) points in the output function curve, and are used to synchronize various other data acquisition devices with the function curve.

  20. CAMAC modular programmable function generator

    International Nuclear Information System (INIS)

    Turner, G.W.; Suehiro, S.; Hendricks, R.W.

    1980-12-01

    A CAMAC modular programmable function generator has been developed. The device contains a 1024 word by 12-bit memory, a 12-bit digital-to-analog converter with a 600 ns settling time, an 18-bit programmable frequency register, and two programmable trigger output registers. The trigger registers can produce programmed output logic transitions at various (binary) points in the output function curve, and are used to synchronize various other data acquisition devices with the function curve

  1. Probability of Detection (POD) as a statistical model for the validation of qualitative methods.

    Science.gov (United States)

    Wehling, Paul; LaBudde, Robert A; Brunelle, Sharon L; Nelson, Maria T

    2011-01-01

    A statistical model is presented for use in validation of qualitative methods. This model, termed Probability of Detection (POD), harmonizes the statistical concepts and parameters between quantitative and qualitative method validation. POD characterizes method response with respect to concentration as a continuous variable. The POD model provides a tool for graphical representation of response curves for qualitative methods. In addition, the model allows comparisons between candidate and reference methods, and provides calculations of repeatability, reproducibility, and laboratory effects from collaborative study data. Single laboratory study and collaborative study examples are given.

  2. Evidence for Truncated Exponential Probability Distribution of Earthquake Slip

    KAUST Repository

    Thingbaijam, Kiran Kumar; Mai, Paul Martin

    2016-01-01

    Earthquake ruptures comprise spatially varying slip on the fault surface, where slip represents the displacement discontinuity between the two sides of the rupture plane. In this study, we analyze the probability distribution of coseismic slip, which provides important information to better understand earthquake source physics. Although the probability distribution of slip is crucial for generating realistic rupture scenarios for simulation-based seismic and tsunami-hazard analysis, the statistical properties of earthquake slip have received limited attention so far. Here, we use the online database of earthquake source models (SRCMOD) to show that the probability distribution of slip follows the truncated exponential law. This law agrees with rupture-specific physical constraints limiting the maximum possible slip on the fault, similar to physical constraints on maximum earthquake magnitudes. We show the parameters of the best-fitting truncated exponential distribution scale with average coseismic slip. This scaling property reflects the control of the underlying stress distribution and fault strength on the rupture dimensions, which determines the average slip. Thus, the scale-dependent behavior of slip heterogeneity is captured by the probability distribution of slip. We conclude that the truncated exponential law accurately quantifies coseismic slip distribution and therefore allows for more realistic modeling of rupture scenarios. © 2016, Seismological Society of America. All rights reserved.

  3. Evidence for Truncated Exponential Probability Distribution of Earthquake Slip

    KAUST Repository

    Thingbaijam, Kiran K. S.

    2016-07-13

    Earthquake ruptures comprise spatially varying slip on the fault surface, where slip represents the displacement discontinuity between the two sides of the rupture plane. In this study, we analyze the probability distribution of coseismic slip, which provides important information to better understand earthquake source physics. Although the probability distribution of slip is crucial for generating realistic rupture scenarios for simulation-based seismic and tsunami-hazard analysis, the statistical properties of earthquake slip have received limited attention so far. Here, we use the online database of earthquake source models (SRCMOD) to show that the probability distribution of slip follows the truncated exponential law. This law agrees with rupture-specific physical constraints limiting the maximum possible slip on the fault, similar to physical constraints on maximum earthquake magnitudes. We show the parameters of the best-fitting truncated exponential distribution scale with average coseismic slip. This scaling property reflects the control of the underlying stress distribution and fault strength on the rupture dimensions, which determines the average slip. Thus, the scale-dependent behavior of slip heterogeneity is captured by the probability distribution of slip. We conclude that the truncated exponential law accurately quantifies coseismic slip distribution and therefore allows for more realistic modeling of rupture scenarios. © 2016, Seismological Society of America. All rights reserved.

  4. Bi-Hamiltonian operators, integrable flows of curves using moving frames and geometric map equations

    International Nuclear Information System (INIS)

    Anco, Stephen C

    2006-01-01

    Moving frames of various kinds are used to derive bi-Hamiltonian operators and associated hierarchies of multi-component soliton equations from group-invariant flows of non-stretching curves in constant-curvature manifolds and Lie-group manifolds. The hierarchy in the constant-curvature case consists of a vector mKdV equation coming from a parallel frame, a vector potential mKdV equation coming from a covariantly constant frame, and higher order counterparts generated by an underlying vector mKdV recursion operator. In the Lie-group case, the hierarchy comprises a group-invariant analogue of the vector NLS equation coming from a left-invariant frame, along with higher order counterparts generated by a recursion operator that is like a square root of the mKdV one. The corresponding respective curve flows are found to be given by geometric nonlinear PDEs, specifically mKdV and group-invariant analogues of Schroedinger maps. In all cases the hierarchies also contain variants of vector sine-Gordon equations arising from the kernel of the respective recursion operators. The geometric PDEs that describe the corresponding curve flows are shown to be wave maps

  5. Bi-Hamiltonian operators, integrable flows of curves using moving frames and geometric map equations

    Energy Technology Data Exchange (ETDEWEB)

    Anco, Stephen C [Department of Mathematics, Brock University, St Catharines, ON (Canada)

    2006-03-03

    Moving frames of various kinds are used to derive bi-Hamiltonian operators and associated hierarchies of multi-component soliton equations from group-invariant flows of non-stretching curves in constant-curvature manifolds and Lie-group manifolds. The hierarchy in the constant-curvature case consists of a vector mKdV equation coming from a parallel frame, a vector potential mKdV equation coming from a covariantly constant frame, and higher order counterparts generated by an underlying vector mKdV recursion operator. In the Lie-group case, the hierarchy comprises a group-invariant analogue of the vector NLS equation coming from a left-invariant frame, along with higher order counterparts generated by a recursion operator that is like a square root of the mKdV one. The corresponding respective curve flows are found to be given by geometric nonlinear PDEs, specifically mKdV and group-invariant analogues of Schroedinger maps. In all cases the hierarchies also contain variants of vector sine-Gordon equations arising from the kernel of the respective recursion operators. The geometric PDEs that describe the corresponding curve flows are shown to be wave maps.

  6. Calculation and comparison with experimental data of cascade curves for liquid xenon

    International Nuclear Information System (INIS)

    Strugal'skij, Z.S.; Yablonskij, Z.

    1975-01-01

    Cascade curves calculated by different methods are compared with the experimental data for showers caused by gamma-quanta with the energies from 40 to 2000 MeV in liquid xenon. The minimum energy of shower electrons (cut-off energy) taken into account by the experiment amounts to 3.1±1.2 MeV, whereas the calculated cascade curves are given for the energies ranging from 40 to 4000 MeV at the cut-off energies 2.3; 3.5; 4.7 MeV. The depth of the shower development is reckoned from the point of generation of gamma-quanta which create showers. Cascade curves are calculated by the moment method with consideration for three moments. The following physical processes are taken into consideration: generation of electron-positron pairs; Compton effect; bremsstrahlung; ionization losses. The dependences of the mean number of particles on the depth of the shower development are obtained from measurements of photographs taken with a xenon bubble chamber. Presented are similar dependences calculated by the moment and Monte-Carlo methods. From the data analysis it follows that the calculation provides correct position of the shower development maximum, but different methods of calculation for small and low depths of shower development yield drastically different results. The Monte-Carlo method provides better agreement with the experimental data.

  7. A residual life prediction model based on the generalized σ -N curved surface

    Directory of Open Access Journals (Sweden)

    Zongwen AN

    2016-06-01

    Full Text Available In order to investigate the change rule of the residual life of a structure under random repeated load, firstly, starting from the statistical meaning of random repeated load, the joint probability density function of maximum stress and minimum stress is derived based on the characteristics of order statistics (the maximum order statistic and the minimum order statistic); then, based on the equation of the generalized σ-N curved surface and considering the influence of the number of load cycles on fatigue life, a relationship among minimum stress, maximum stress and residual life, that is the σmin(n)-σmax(n)-Nr(n) curved surface model, is established; finally, the validity of the proposed model is demonstrated by a practical case. The result shows that the proposed model can reflect the influence of maximum stress and minimum stress on the residual life of a structure under random repeated load, which can provide a theoretical basis for life prediction and reliability assessment of structures.
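
    For reference, under an i.i.d. assumption on the n stress samples with density f and distribution function F, the joint density of the minimum and maximum order statistics that the abstract starts from has the standard textbook form (the paper's specific load model may add further structure):

    $$ f_{X_{(1)},X_{(n)}}(u,v) \;=\; n(n-1)\, f(u)\, f(v)\, \bigl[F(v)-F(u)\bigr]^{\,n-2}, \qquad u < v. $$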

  8. Predicting the costs of photovoltaic solar modules in 2020 using experience curve models

    International Nuclear Information System (INIS)

    La Tour, Arnaud de; Glachant, Matthieu; Ménière, Yann

    2013-01-01

    Except in a few locations, photovoltaic generated electricity remains considerably more expensive than conventional sources. It is however expected that innovation and learning-by-doing will lead to drastic cuts in production cost in the near future. The goal of this paper is to predict the cost of PV modules out to 2020 using experience curve models, and to draw implications about the cost of PV electricity. Using annual data on photovoltaic module prices, cumulative production, R and D knowledge stock and input prices for silicon and silver over the period 1990–2011, we identify an experience curve model which minimizes the difference between predicted and actual module prices. This model predicts a 67% decrease of module price from 2011 to 2020. This rate implies that the cost of PV generated electricity will reach that of conventional electricity by 2020 in the sunniest countries with annual solar irradiation of 2000 kWh/year or more, such as California, Italy, and Spain. - Highlights: • We predict the cost of PV modules out to 2020 using experience curve models. • The model predicts a 67% decrease of module price from 2011 to 2020. • We draw implications about the cost of PV electricity
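
    A minimal sketch of a one-factor experience curve fitted in log-log space, with the learning rate recovered from the elasticity. The price and cumulative-production figures are invented, and the paper's preferred specification also includes the R&D knowledge stock and silicon/silver prices, which are omitted here.

```python
import numpy as np

# hypothetical series: module price (USD/W) against cumulative production (MW)
price = np.array([6.0, 4.8, 4.1, 3.3, 2.6, 2.0, 1.4, 1.0])
cum_prod = np.array([200., 400., 900., 2000., 4500., 10000., 24000., 60000.])

# one-factor experience curve: price = a * cum_prod**(-b), linear in log-log space
slope, intercept = np.polyfit(np.log(cum_prod), np.log(price), 1)
b = -slope
learning_rate = 1.0 - 2.0 ** (-b)   # fractional cost reduction per doubling of cumulative production
print(f"elasticity b = {b:.3f}, learning rate = {learning_rate:.1%}")

# extrapolated price at an assumed future cumulative production level
print(np.exp(intercept) * 400000.0 ** slope)
```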

  9. Estimating reaction rate constants: comparison between traditional curve fitting and curve resolution

    NARCIS (Netherlands)

    Bijlsma, S.; Boelens, H. F. M.; Hoefsloot, H. C. J.; Smilde, A. K.

    2000-01-01

    A traditional curve fitting (TCF) algorithm is compared with a classical curve resolution (CCR) approach for estimating reaction rate constants from spectral data obtained in time of a chemical reaction. In the TCF algorithm, reaction rate constants are estimated from the absorbance versus time data

  10. A catalog of special plane curves

    CERN Document Server

    Lawrence, J Dennis

    2014-01-01

    Among the largest, finest collections available-illustrated not only once for each curve, but also for various values of any parameters present. Covers general properties of curves and types of derived curves. Curves illustrated by a CalComp digital incremental plotter. 12 illustrations.

  11. Association of total-mixed-ration chemical composition with milk, fat, and protein yield lactation curves at the individual level

    NARCIS (Netherlands)

    Caccamo, M.; Veerkamp, R.F.; Licitra, G.; Petriglieri, R.; Terra, La F.; Pozzebon, A.; Ferguson, J.D.

    2012-01-01

    The objective of this study was to examine the effect of the chemical composition of a total mixed ration (TMR) tested quarterly from March 2006 through December 2008 for milk, fat, and protein yield curves for 27 herds in Ragusa, Sicily. Before this study, standard yield curves were generated on

  12. Efficient Multiphoton Generation in Waveguide Quantum Electrodynamics

    Science.gov (United States)

    González-Tudela, A.; Paulisch, V.; Kimble, H. J.; Cirac, J. I.

    2017-05-01

    Engineering quantum states of light is at the basis of many quantum technologies such as quantum cryptography, teleportation, or metrology, among others. Though single photons can be generated in many scenarios, the efficient and reliable generation of complex single-mode multiphoton states is still a long-standing goal in the field, as current methods either suffer from low fidelities or small probabilities. Here we discuss several protocols which harness the strong and long-range atomic interactions induced by waveguide QED to efficiently load excitations in a collection of atoms, which can then be triggered to produce the desired multiphoton state. In order to boost the success probability and fidelity of each excitation process, atoms are used both to generate the excitations in the rest and to herald the successful generation. Furthermore, to overcome the exponential scaling of the probability of success with the number of excitations, we design a protocol to merge excitations that are present in different internal atomic levels with a polynomial scaling.

  13. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  14. Elliptic curves for applications (Tutorial)

    NARCIS (Netherlands)

    Lange, T.; Bernstein, D.J.; Chatterjee, S.

    2011-01-01

    More than 25 years ago, elliptic curves over finite fields were suggested as a group in which the Discrete Logarithm Problem (DLP) can be hard. Since then many researchers have scrutinized the security of the DLP on elliptic curves with the result that for suitably chosen curves only exponential

  15. Differential geometry and topology of curves

    CERN Document Server

    Animov, Yu

    2001-01-01

    Differential geometry is an actively developing area of modern mathematics. This volume presents a classical approach to the general topics of the geometry of curves, including the theory of curves in n-dimensional Euclidean space. The author investigates problems for special classes of curves and gives the working method used to obtain the conditions for closed polygonal curves. The proof of the Bakel-Werner theorem in conditions of boundedness for curves with periodic curvature and torsion is also presented. This volume also highlights the contributions made by great geometers, past and present, to differential geometry and the topology of curves.

  16. Models of genus one curves

    OpenAIRE

    Sadek, Mohammad

    2010-01-01

    In this thesis we give insight into the minimisation problem of genus one curves defined by equations other than Weierstrass equations. We are interested in genus one curves given as double covers of P1, plane cubics, or complete intersections of two quadrics in P3. By minimising such a curve we mean making the invariants associated to its defining equations as small as possible using a suitable change of coordinates. We study the non-uniqueness of minimisations of the genus one curves des...

  17. The crime kuznets curve

    OpenAIRE

    Buonanno, Paolo; Fergusson, Leopoldo; Vargas, Juan Fernando

    2014-01-01

    We document the existence of a Crime Kuznets Curve in US states since the 1970s. As income levels have risen, crime has followed an inverted U-shaped pattern, first increasing and then dropping. The Crime Kuznets Curve is not explained by income inequality. In fact, we show that during the sample period inequality has risen monotonically with income, ruling out the traditional Kuznets Curve. Our finding is robust to adding a large set of controls that are used in the literature to explain the...

  18. Energization of electrons in a plasma beam entering a curved magnetic field

    International Nuclear Information System (INIS)

    Brenning, N.; Lindberg, L.; Eriksson, A.

    1980-09-01

    Earlier experiments have indicated that suprathermal electrons appear when a collisionless plasma flowing along a magnetic field enters a region where the magnetic field is curved. In the present investigation newly developed methods of He-spectroscopy based on the absolute intensities of the He I 3889 A and He II 4686 A lines are utilized to study the electron temperature and to estimate the population of non-thermal electrons. The density of helium added for the diagnostic purpose is so low that the flow is not disturbed. It is found that the intrusion of the plasma into a curved or transverse field gives rise to a slight increase (15-20%) in the electron temperature and a remarkable increase in the fraction of non-thermal (>100 eV) electrons from below 1% to as much as 20-25% of the total electron population. There are also indications that the energization of electrons is particularly efficient on that side of the plasma beam which becomes polarized to a positive potential when entering the curved field. The experiments are confined to the case of weak magnetic field, i.e. only the electrons are magnetically confined. New details of the electric field and potential structure are presented and discussed. Electric field components parallel to the magnetic field are likely to energize the electrons, probably through the run-away phenomenon. (Auth.)

  19. A direct method to solve optimal knots of B-spline curves: An application for non-uniform B-spline curves fitting.

    Directory of Open Access Journals (Sweden)

    Van Than Dung

    Full Text Available B-spline functions are widely used in many industrial applications such as computer graphic representations, computer aided design, computer aided manufacturing, computer numerical control, etc. Recently, there exist some demands, e.g. in the reverse engineering (RE) area, to employ B-spline curves for non-trivial cases that include curves with discontinuous points, cusps or turning points from the sampled data. The most challenging task in these cases is the identification of the number of knots and their respective locations in non-uniform space at the most efficient computational cost. This paper presents a new strategy for fitting any form of curve by B-spline functions via a local algorithm. A new two-step method for fast knot calculation is proposed. In the first step, the data are split using a bisecting method with a predetermined allowable error to obtain coarse knots. Secondly, the knots are optimized, for both locations and continuity levels, by employing a non-linear least squares technique. The B-spline function is, therefore, obtained by solving the ordinary least squares problem. The performance of the proposed method is validated by using various numerical experimental data, with and without simulated noise, which were generated by a B-spline function and deterministic parametric functions. This paper also discusses the benchmarking of the proposed method against existing methods in the literature. The proposed method is shown to be able to reconstruct B-spline functions from sampled data within acceptable tolerance. It is also shown that the proposed method can be applied for fitting any type of curve ranging from smooth ones to discontinuous ones. In addition, the method does not require excessive computational cost, which allows it to be used in automatic reverse engineering applications.
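
    A minimal sketch of least-squares B-spline fitting with a fixed, hand-chosen set of interior knots using scipy. The paper's actual contribution, the bisection-based knot search and the subsequent optimization of knot locations and continuity levels, is not reproduced here, and the data and knots are assumptions.

```python
import numpy as np
from scipy.interpolate import LSQUnivariateSpline

# synthetic noisy data sampled from a curve with a sharp feature near x = 0.6
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 400)
y = np.sin(2 * np.pi * x) + 0.5 * (x > 0.6) + 0.02 * rng.standard_normal(x.size)

# interior knots chosen by hand; the paper instead finds them by bisecting the data
# to a tolerance and then optimizing their locations and continuity levels
interior_knots = np.array([0.2, 0.4, 0.55, 0.6, 0.65, 0.8])

spline = LSQUnivariateSpline(x, y, interior_knots, k=3)   # cubic least-squares B-spline
print("residual sum of squares:", spline.get_residual())
```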

  20. Development of probabilistic fatigue curve for asphalt concrete based on viscoelastic continuum damage mechanics

    Directory of Open Access Journals (Sweden)

    Himanshu Sharma

    2016-07-01

    Full Text Available Due to its roots in a fundamental thermodynamic framework, the continuum damage approach is popular for modeling asphalt concrete behavior. Currently used continuum damage models use mixture-averaged values for model parameters and assume a deterministic damage process. On the other hand, significant scatter is found in fatigue data generated even under extremely controlled laboratory testing conditions. Thus, currently used continuum damage models fail to account for the scatter observed in fatigue data. This paper illustrates a novel approach for probabilistic fatigue life prediction based on the viscoelastic continuum damage approach. Several specimens were tested for their viscoelastic properties and damage properties under a uniaxial mode of loading. The data thus generated were analyzed using viscoelastic continuum damage mechanics principles to predict fatigue life. Weibull (2-parameter and 3-parameter) and lognormal distributions were fitted to the fatigue lives predicted using the viscoelastic continuum damage approach. It was observed that fatigue damage could be best described using the Weibull distribution when compared to the lognormal distribution. Due to its flexibility, the 3-parameter Weibull distribution was found to fit better than the 2-parameter Weibull distribution. Further, significant differences were found between the probabilistic fatigue curves developed in this research and the traditional deterministic fatigue curve. The proposed methodology combines the advantages of continuum damage mechanics as well as probabilistic approaches. These probabilistic fatigue curves can be conveniently used for reliability based pavement design. Keywords: Probabilistic fatigue curve, Continuum damage mechanics, Weibull distribution, Lognormal distribution
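
    A minimal sketch of fitting 2-parameter Weibull, 3-parameter Weibull and lognormal distributions to a set of fatigue lives with scipy and comparing their log-likelihoods; the lives below are invented, not the values predicted by the viscoelastic continuum damage analysis in the study.

```python
import numpy as np
from scipy import stats

# hypothetical fatigue lives (cycles to failure) from repeated tests
lives = np.array([1.2e5, 1.8e5, 2.3e5, 2.9e5, 3.4e5, 4.1e5, 5.0e5, 6.2e5])

params_w2 = stats.weibull_min.fit(lives, floc=0)   # 2-parameter Weibull (location fixed at zero)
params_w3 = stats.weibull_min.fit(lives)           # 3-parameter Weibull (threshold life estimated)
params_ln = stats.lognorm.fit(lives, floc=0)       # lognormal for comparison

for name, dist, params in [("Weibull 2-p", stats.weibull_min, params_w2),
                           ("Weibull 3-p", stats.weibull_min, params_w3),
                           ("Lognormal",   stats.lognorm,     params_ln)]:
    print(f"{name}: log-likelihood = {np.sum(dist.logpdf(lives, *params)):.2f}")
```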

  1. Failure probability analysis on mercury target vessel

    International Nuclear Information System (INIS)

    Ishikura, Syuichi; Futakawa, Masatoshi; Kogawa, Hiroyuki; Sato, Hiroshi; Haga, Katsuhiro; Ikeda, Yujiro

    2005-03-01

    Failure probability analysis was carried out to estimate the lifetime of the mercury target which will be installed in the JSNS (Japan Spallation Neutron Source) in J-PARC (Japan Proton Accelerator Research Complex). The lifetime was estimated taking the loading conditions and material degradation into account. The loads considered on the target vessel were the static stresses due to thermal expansion and the static pre-pressure of the He gas and mercury, and the dynamic stresses due to the thermal-shock pressure waves generated repeatedly at 25 Hz. Materials used in the target vessel will be degraded by fatigue, neutron and proton irradiation, mercury immersion, pitting damage, etc. The imposed stresses were evaluated through static and dynamic structural analyses. The material degradation was deduced from published experimental data. As a result, it was quantitatively confirmed that the failure probability of the safety hull over the design lifetime is very low, about 10⁻¹¹, meaning that it is highly unlikely to fail during the design lifetime. On the other hand, the beam window of the mercury vessel, which is subjected to high-pressure waves, exhibits a failure probability of 12%. It was concluded, therefore, that mercury leaking from a failed beam window would be adequately retained in the space between the safety hull and the mercury vessel, with mercury-leakage sensors used to detect the leak. (author)

  2. Image Watermarking Scheme for Specifying False Positive Probability and Bit-pattern Embedding

    Science.gov (United States)

    Sayama, Kohei; Nakamoto, Masayoshi; Muneyasu, Mitsuji; Ohno, Shuichi

    This paper treats discrete wavelet transform (DWT)-based image watermarking, considering the false positive probability and bit-pattern embedding. We propose an iterative algorithm for embedding watermark signals, which are K sets of pseudo-random numbers generated from a secret key. In the detection, K correlations between the watermarked DWT coefficients and the watermark signals are computed using the secret key. L of the correlations are used to judge the presence of the watermark at a specified false positive probability, and the other K-L correlations correspond to the bit-pattern signal. In the experiments, we show detection results with the specified false positive probability and bit-pattern recovery, as well as the robustness of the proposed method against JPEG compression, scaling down and cropping.
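
    The idea of choosing a correlation threshold from a specified false positive probability can be sketched roughly as follows. The Gaussian approximation of the correlation statistic under the no-watermark hypothesis, the bipolar pseudo-random signals and all names are illustrative assumptions, not the scheme proposed in the paper.

        import numpy as np
        from scipy.stats import norm

        def detect_watermark(coeffs, key, k=8, p_fa=1e-6):
            """Correlate DWT coefficients with k key-generated pseudo-random signals
            and flag each correlation that exceeds a threshold chosen so that the
            per-correlation false positive probability is p_fa."""
            rng = np.random.default_rng(key)
            n = coeffs.size
            w = rng.choice([-1.0, 1.0], size=(k, n))   # pseudo-random watermark signals
            corr = w @ coeffs / n                      # k correlation values
            # Under the no-watermark hypothesis each correlation is approximately
            # N(0, var(coeffs)/n); the threshold follows from the desired p_fa.
            sigma = np.sqrt(np.var(coeffs) / n)
            threshold = norm.isf(p_fa) * sigma
            return corr > threshold, corr, threshold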

  3. ROBUST DECLINE CURVE ANALYSIS

    Directory of Open Access Journals (Sweden)

    Sutawanir Darwis

    2012-05-01

    Full Text Available Empirical decline curve analysis of oil production data gives reasonable answers in hyperbolic type-curve situations; however, the methodology has limitations in fitting real historical production data in the presence of unusual observations caused by treatments applied to the well to increase production capacity. The development of robust least squares offers new possibilities for better fitting production data with decline curve analysis by down-weighting the unusual observations. This paper proposes a robust least squares fitting (lmRobMM) approach to estimate the decline rate of daily production data and compares the results with reservoir simulation results. As a case study, we use the oil production data of the TBA Field, West Java. The results demonstrate that the approach is suitable for decline curve fitting and offers new insight into decline curve analysis in the presence of unusual observations.
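
    A minimal sketch of the same idea, down-weighting outliers while fitting an Arps hyperbolic decline model, is shown below. SciPy's soft_l1 loss is used here as an illustrative stand-in for the MM-type robust estimator (lmRobMM) used in the paper, and the starting values and bounds are assumptions.

        import numpy as np
        from scipy.optimize import least_squares

        def arps_hyperbolic(t, qi, di, b):
            """Arps hyperbolic decline: q(t) = qi / (1 + b*di*t)**(1/b)."""
            return qi / (1.0 + b * di * t) ** (1.0 / b)

        def fit_decline(t, q):
            def residuals(p):
                qi, di, b = p
                return arps_hyperbolic(t, qi, di, b) - q
            # The soft_l1 loss down-weights unusual observations (e.g. rates recorded
            # just after a well treatment), which is the robust-fitting idea above.
            res = least_squares(residuals, x0=[q[0], 0.01, 0.5],
                                bounds=([0.0, 1e-6, 1e-3], [np.inf, 1.0, 2.0]),
                                loss="soft_l1", f_scale=np.std(q))
            return res.x  # fitted (qi, di, b)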

  4. The J-resistance curve Leak-before-Break test program on material for the Darlington Nuclear Generating Station

    International Nuclear Information System (INIS)

    Mukherjee, B.

    1988-01-01

    The Darlington Leak-Before-Break (DLBB) approach has been developed for large diameter (21, 22, 24 inch) SA106B heat transport (HT) piping and SA105 fittings as a design alternative to pipe whip restraints and in recognition of the questionable benefits of providing such restraints. Ontario Hydro's DLBB approach is based on the elastic-plastic fracture mechanics method. In this test program, J-resistance curves were determined from actual pipe heats that were used in the construction of the Darlington heat transport systems (Units 1 and 2). Test blocks were prepared using four different welding procedures for nuclear Class I piping. The test program was designed to take into account the effect of various factors, such as test temperature, crack plane orientation and welding effects, which influence fracture properties. A total of 91 tests were conducted. An acceptable lower-bound J-resistance curve for the piping steels was obtained by machining maximum-thickness specimens from the pipes and by testing side-grooved compact tension specimens. Test results showed that all pipes, welds and heat-affected-zone materials within the scope of the DLBB program exhibited upper-shelf toughness behaviour. All specimens showed high crack initiation toughness J_Ic, a rising J-resistance curve, and stable and ductile crack extension. Toughness of the product forms depended on the direction of crack extension (circumferential versus axial crack orientation). Toughness of the DLBB welds and parent materials at 250°C was lower than that at 20°C. (author)

  5. Seismic fragility curves of bridge piers accounting for ground motions in Korea

    Science.gov (United States)

    Nguyen, Duy-Duan; Lee, Tae-Hyung

    2018-04-01

    Korea is located in a slight-to-moderate seismic zone. Nevertheless, several studies have pointed out that the peak earthquake magnitude in the region can reach approximately 6.5. Accordingly, a seismic vulnerability evaluation of existing structures accounting for ground motions in Korea is important. The purpose of this paper is to develop seismic fragility curves for the piers of a steel box girder bridge, with and without base isolators, based on a set of ground motions recorded in Korea. A finite element simulation platform, OpenSees, is utilized to perform nonlinear time history analyses of the bridges. A series of damage states is defined based on a damage index expressed in terms of the column displacement ductility ratio. The fragility curves based on Korean motions were then compared with fragility curves generated using worldwide earthquakes to assess the effect of the two ground motion groups on the seismic fragility curves of the bridge piers. The results reveal that both non-isolated and base-isolated bridge piers are less vulnerable under the Korean ground motions than under the worldwide earthquakes.
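
    A common way to turn such analysis results into a fragility curve is to fit a lognormal cumulative distribution function to the binary exceedance outcomes by maximum likelihood; a minimal sketch under that assumption is given below. The intensity-measure values, thresholds and function names are illustrative, not the authors' data or procedure.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import norm

        def fit_lognormal_fragility(im, exceeded):
            """Fit P(damage state exceeded | IM) = Phi((ln(im) - ln(theta)) / beta)
            to binary outcomes from nonlinear time-history analyses."""
            def neg_loglik(params):
                ln_theta, beta = params
                p = norm.cdf((np.log(im) - ln_theta) / beta)
                p = np.clip(p, 1e-12, 1 - 1e-12)
                return -np.sum(exceeded * np.log(p) + (1 - exceeded) * np.log(1 - p))
            res = minimize(neg_loglik, x0=[np.log(np.median(im)), 0.5],
                           bounds=[(None, None), (1e-3, None)])
            theta, beta = np.exp(res.x[0]), res.x[1]
            return theta, beta   # median capacity and lognormal dispersion

        # Hypothetical example: PGA values (g) and whether the ductility demand
        # exceeded the damage-state threshold in each analysis.
        pga = np.array([0.05, 0.1, 0.2, 0.3, 0.4, 0.6, 0.8, 1.0])
        exceeded = np.array([0, 0, 0, 1, 0, 1, 1, 1])
        print(fit_lognormal_fragility(pga, exceeded))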

  6. NEW CONCEPTS AND TEST METHODS OF CURVE PROFILE AREA DENSITY IN SURFACE: ESTIMATION OF AREAL DENSITY ON CURVED SPATIAL SURFACE

    OpenAIRE

    Hong Shen

    2011-01-01

    The concepts of curve profile, curve intercept, curve intercept density, curve profile area density, intersection density in containing intersection (or intersection density relied on intersection reference), curve profile intersection density in surface (or curve intercept intersection density relied on intersection of containing curve), and curve profile area density in surface (AS) were defined. AS expressed the amount of curve profile area of Y phase in the unit containing surface area, S...

  7. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  8. Fabricating small-scale, curved, polymeric structures with convex and concave menisci through interfacial free energy equilibrium.

    Science.gov (United States)

    Cheng, Chao-Min; Matsuura, Koji; Wang, I-Jan; Kuroda, Yuka; LeDuc, Philip R; Naruse, Keiji

    2009-11-21

    Polymeric curved structures are widely used in imaging systems including optical fibers and microfluidic channels. Here, we demonstrate that small-scale, poly(dimethylsiloxane) (PDMS)-based curved structures can be fabricated by controlling the interfacial free energy equilibrium. The resulting structures have a smooth, symmetric, curved surface, and may be convex or concave in form depending on the surface tension balance. Their curvatures are controlled by the surface characteristics (i.e., hydrophobicity and hydrophilicity) of the molds and the semi-liquid PDMS. In addition, these structures are shown to be biocompatible for cell culture. Our system provides a simple, efficient and economical method for generating integratable optical components without costly fabrication facilities.

  9. M-curves and symmetric products

    Indian Academy of Sciences (India)

    Indranil Biswas

    2017-08-03

    The number of connected components of the real part of a real algebraic curve X is bounded above by g + 1, where g is the genus of X [11]. Curves which have exactly the maximum number (i.e., genus +1) of components of the real part are called M-curves. Classifying real algebraic curves up to homeomorphism is straightforward; however, classifying even planar non-singular real ...
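
    For context, the classical bound the abstract refers to is Harnack's inequality; a statement is reproduced below for the reader's convenience (it is added here and is not part of the record itself).

        % Harnack's inequality: for a smooth real projective curve X of genus g,
        % the number of connected components of the real locus satisfies
        \[
          \#\,\pi_0\bigl(X(\mathbb{R})\bigr) \;\le\; g + 1,
        \]
        % and curves attaining this bound are exactly the M-curves of the abstract.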

  10. Use of regionalisation approach to develop fire frequency curves for Victoria, Australia

    Science.gov (United States)

    Khastagir, Anirban; Jayasuriya, Niranjali; Bhuyian, Muhammed A.

    2017-11-01

    It is important to perform fire frequency analysis in order to obtain fire frequency curves (FFCs) based on fire intensity in different parts of Victoria. In this paper, FFCs were derived based on the forest fire danger index (FFDI), a measure related to fire initiation, spreading speed and containment difficulty. The mean temperature (T), relative humidity (RH) and areal extent of open water (LC2) during the summer months (Dec-Feb) were identified as the most important parameters for assessing the risk of bushfire occurrence. Based on these parameters, the Andrews curve equation was applied to 40 selected meteorological stations to identify homogeneous stations forming unique clusters. A methodology using peak FFDI from cluster-averaged FFDIs was developed, applying the Log Pearson Type III (LPIII) distribution to generate the FFCs. A total of nine homogeneous clusters across Victoria were identified, and their FFCs were subsequently developed in order to estimate the regionalised fire occurrence characteristics.
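
    A minimal sketch of the frequency-analysis step, fitting a Log Pearson Type III distribution to annual-maximum FFDI values for one cluster and reading off FFDI magnitudes at selected recurrence intervals, is given below. The data values and the use of scipy.stats.pearson3 are illustrative assumptions, not the study's data or software.

        import numpy as np
        from scipy import stats

        # Hypothetical annual-maximum FFDI values for one homogeneous cluster.
        annual_max_ffdi = np.array([38., 52., 61., 45., 74., 58., 49., 67., 83., 55.,
                                    71., 63., 47., 90., 66.])

        # Log Pearson Type III: fit a Pearson Type III distribution to log10 of the maxima.
        log_ffdi = np.log10(annual_max_ffdi)
        skew, loc, scale = stats.pearson3.fit(log_ffdi)

        # Fire frequency curve: FFDI magnitude versus average recurrence interval (years).
        return_periods = np.array([2, 5, 10, 20, 50, 100])
        quantiles = stats.pearson3.ppf(1 - 1 / return_periods, skew, loc=loc, scale=scale)
        ffdi_curve = 10 ** quantiles
        for T, f in zip(return_periods, ffdi_curve):
            print(f"{T:>3}-year FFDI: {f:.0f}")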

  11. Inverse Diffusion Curves Using Shape Optimization.

    Science.gov (United States)

    Zhao, Shuang; Durand, Fredo; Zheng, Changxi

    2018-07-01

    The inverse diffusion curve problem focuses on the automatic creation of diffusion curve images that resemble user-provided color fields. This problem is challenging since the 1D curves have a nonlinear and global impact on the resulting color fields via a partial differential equation (PDE). We introduce a new approach, complementary to previous methods, that optimizes curve geometry. In particular, we propose a novel iterative algorithm based on the theory of shape derivatives. The resulting diffusion curves are clean and well-shaped, and the final image closely approximates the input. Our method provides a user-controlled parameter to regularize curve complexity, and generalizes to handle input color fields represented in a variety of formats.

  12. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  13. Evaluation of Oconee steam-generator debris. Final report

    International Nuclear Information System (INIS)

    Rigdon, M.A.; Rubright, M.M.; Sarver, L.W.

    1981-10-01

    Pieces of debris were observed near damaged tubes at the 14th support plate elevation in the Oconee 1-B steam generator. A project was initiated to evaluate the physical and chemical nature of the debris, to identify its source, and to determine its role in tube damage at this elevation. Various laboratory techniques were used to characterize several debris and mill scale samples. Data from these samples were then compared with each other and with literature data. It was concluded that seven of eight debris samples were probably formed in the steam generator. Six of these samples were probably formed by high temperature aqueous corrosion early in the life of the steam generator. The seventh sample was probably formed by the deposition and spalling of magnetite on the Inconel steam generator tubes. None of the debris samples resembled any of the mill scale samples

  14. Retrograde curves of solidus and solubility

    International Nuclear Information System (INIS)

    Vasil'ev, M.V.

    1979-01-01

    The investigation was concerned with constitutional diagrams of the eutectic type with a "retrograde solidus" and a "retrograde solubility curve", which must be considered as diagrams with a degenerate monotectic transformation. The solidus and the solubility curve form a retrograde curve with a common retrograde point representing the solubility maximum. The two branches of the retrograde curve can be described with the aid of two similar equations. Corresponding equations are presented for the Cd-Zn system, and the possibility of predicting the course of the solubility curve is shown.

  15. [Customized and non-customized French intrauterine growth curves. II - Comparison with existing curves and benefits of customization].

    Science.gov (United States)

    Ego, A; Prunet, C; Blondel, B; Kaminski, M; Goffinet, F; Zeitlin, J

    2016-02-01

    Our aim is to compare the new French EPOPé intrauterine growth curves, developed in response to the 2013 guidelines of the French College of Obstetricians and Gynecologists, with the reference curves currently used in France, and to evaluate the consequences of their adjustment for fetal sex and maternal characteristics. Eight intrauterine and birthweight curves used in France were compared to the EPOPé curves using data from the 2010 French Perinatal Survey. The influence of adjustment on the rate of small-for-gestational-age (SGA) births and on the characteristics of these births was analysed. Due to their birthweight values and distribution, the selected intrauterine curves are less suitable for births in France than the new curves. Birthweight curves led to low rates of SGA births, from 4.3 to 8.5%, compared to 10.0% with the EPOPé curves. The adjustment for maternal and fetal characteristics avoids the over-representation of girls among SGA births and reclassifies 4% of births. Among births reclassified as SGA, the frequency of medical and obstetrical risk factors for growth restriction, smoking (≥10 cigarettes/day), and neonatal transfer is higher than among non-SGA births (P<0.01). The EPOPé curves are more suitable for French births than the currently used curves, and their adjustment improves the identification of mothers and babies at risk of growth restriction and poor perinatal outcomes. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  16. CONFIRMATION OF HOT JUPITER KEPLER-41b VIA PHASE CURVE ANALYSIS

    International Nuclear Information System (INIS)

    Quintana, Elisa V.; Rowe, Jason F.; Caldwell, Douglas A.; Christiansen, Jessie L.; Jenkins, Jon M.; Morris, Robert L.; Smith, Jeffrey C.; Thompson, Susan E.; Barclay, Thomas; Howell, Steve B.; Borucki, William J.; Sanderfer, Dwight T.; Still, Martin; Ciardi, David R.; Demory, Brice-Olivier; Klaus, Todd C.; Fulton, Benjamin J.; Shporer, Avi

    2013-01-01

    We present high precision photometry of Kepler-41, a giant planet in a 1.86 day orbit around a G6V star that was recently confirmed through radial velocity measurements. We have developed a new method to confirm giant planets solely from the photometric light curve, and we apply this method herein to Kepler-41 to establish the validity of this technique. We generate a full-phase photometric model by including the primary and secondary transits, ellipsoidal variations, Doppler beaming, and reflected/emitted light from the planet. Third-light contamination scenarios that can mimic a planetary transit signal are simulated by injecting a full range of dilution values into the model, and we re-fit each diluted light curve model to the light curve. The resulting constraints on the maximum occultation depth and stellar density, combined with stellar evolution models, rule out stellar blends and provide a measurement of the planet's mass, size, and temperature. We expect that about two dozen Kepler giant planets can be confirmed via this method.
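
    The out-of-eclipse components of such a phase-curve model are often approximated by simple sinusoids in orbital phase; the toy sketch below illustrates only that idea. The functional forms and amplitudes are generic illustrative assumptions and omit the transit and occultation models that the paper's full fit includes.

        import numpy as np

        def phase_curve(phase, a_refl, a_beam, a_ellip, f0=1.0):
            """Toy out-of-eclipse phase-curve model (phase = 0 at mid-transit):
            reflected/emitted light peaks at secondary eclipse (phase 0.5),
            Doppler beaming varies once per orbit, and ellipsoidal variations
            vary twice per orbit with minima at the conjunctions."""
            refl = -a_refl * np.cos(2 * np.pi * phase)
            beam = a_beam * np.sin(2 * np.pi * phase)
            ellip = -a_ellip * np.cos(4 * np.pi * phase)
            return f0 + refl + beam + ellip

        phase = np.linspace(0.0, 1.0, 500)
        flux = phase_curve(phase, a_refl=40e-6, a_beam=5e-6, a_ellip=10e-6)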

  17. Validation of Ulchin Units 1, 2 CONTEMPT Model Prior to the Production of EQ Envelope Curve

    International Nuclear Information System (INIS)

    Hwang, Su Hyun; Kim, Min Ki; Hong, Soon Joon; Lee, Byung Chul; Suh, Jeong Kwan; Lee, Jae Yong; Song, Dong Soo

    2010-01-01

    The Ulchin Units 1, 2 will be refurbished with RSG (Replacement of Steam Generator) and PU (Power Uprate). The current EQ (Environmental Qualification) envelope curve should be modified according to the RSG and PU. The containment P/T (Pressure/Temperature) analysis in the Ulchin Units 1, 2 FSAR was done using the EDF computer program PAREO6. PAREO6 uses the same assumptions as the US NRC CONTEMPT program, and the results given by both programs are in good agreement. It is used to determine pressure and temperature variations in a PWR containment subsequent to a reactor coolant or secondary system pipe break. However, PAREO6 is not available for the production of the EQ envelope curve, so the CONTEMPT code should be used instead. It is therefore essential to validate the CONTEMPT OSG (Original Steam Generator) model prior to the production of the EQ envelope curve considering the RSG and PU. This study has been performed to validate the CONTEMPT model of Ulchin Units 1, 2 by comparing the CONTEMPT results with the PAREO6 results in the Ulchin Units 1, 2 FSAR.

  18. Validation of Ulchin Units 1, 2 CONTEMPT Model Prior to the Production of EQ Envelope Curve

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Su Hyun; Kim, Min Ki; Hong, Soon Joon; Lee, Byung Chul [FNC Technology Co., SNU, Seoul (Korea, Republic of); Suh, Jeong Kwan; Lee, Jae Yong; Song, Dong Soo [KEPCO Research Institute, Daejeon (Korea, Republic of)

    2010-10-15

    The Ulchin Units 1, 2 will be refurbished with RSG (Replacement of Steam Generator) and PU (Power Uprate). The current EQ (Environmental Qualification) envelope curve should be modified according to the RSG and PU. The containment P/T (Pressure/Temperature) analysis in the Ulchin Units 1, 2 FSAR was done using the EDF computer program PAREO6. PAREO6 uses the same assumptions as the US NRC CONTEMPT program, and the results given by both programs are in good agreement. It is used to determine pressure and temperature variations in a PWR containment subsequent to a reactor coolant or secondary system pipe break. However, PAREO6 is not available for the production of the EQ envelope curve, so the CONTEMPT code should be used instead. It is therefore essential to validate the CONTEMPT OSG (Original Steam Generator) model prior to the production of the EQ envelope curve considering the RSG and PU. This study has been performed to validate the CONTEMPT model of Ulchin Units 1, 2 by comparing the CONTEMPT results with the PAREO6 results in the Ulchin Units 1, 2 FSAR.

  19. Extended analysis of cooling curves

    International Nuclear Information System (INIS)

    Djurdjevic, M.B.; Kierkus, W.T.; Liliac, R.E.; Sokolowski, J.H.

    2002-01-01

    Thermal Analysis (TA) is the measurement of changes in a physical property of a material that is heated through a phase transformation temperature range. The temperature changes in the material are recorded as a function of the heating or cooling time in such a manner that allows for the detection of phase transformations. In order to increase accuracy, characteristic points on the cooling curve have been identified using the first derivative curve plotted versus time. In this paper, an alternative approach to the analysis of the cooling curve has been proposed. The first derivative curve has been plotted versus temperature and all characteristic points have been identified with the same accuracy achieved using the traditional method. The new cooling curve analysis also enables the Dendrite Coherency Point (DCP) to be detected using only one thermocouple. (author)
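
    The alternative representation described above, plotting the first derivative against temperature instead of time, can be sketched as follows. The synthetic cooling record and helper names are illustrative assumptions, not data from the paper.

        import numpy as np

        def first_derivative_vs_temperature(time, temperature):
            """Compute the cooling rate dT/dt and pair it with temperature, so that
            characteristic points associated with phase transformations can be read
            off a dT/dt-versus-T plot instead of the traditional dT/dt-versus-time plot."""
            dTdt = np.gradient(temperature, time)
            order = np.argsort(temperature)      # order samples by temperature for plotting
            return temperature[order], dTdt[order]

        # Hypothetical cooling record from a single thermocouple.
        t = np.linspace(0.0, 300.0, 3000)        # seconds
        T = 700.0 - 0.8 * t + 8.0 * np.exp(-((t - 120.0) / 15.0) ** 2)  # toy thermal arrest
        T_sorted, rate = first_derivative_vs_temperature(t, T)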

  20. Probability of identification: a statistical model for the validation of qualitative botanical identification methods.

    Science.gov (United States)

    LaBudde, Robert A; Harnly, James M

    2012-01-01

    A qualitative botanical identification method (BIM) is an analytical procedure that returns a binary result (1 = Identified, 0 = Not Identified). A BIM may be used by a buyer, manufacturer, or regulator to determine whether a botanical material being tested is the same as the target (desired) material, or whether it contains excessive nontarget (undesirable) material. The report describes the development and validation of studies for a BIM based on the proportion of replicates identified, or probability of identification (POI), as the basic observed statistic. The statistical procedures proposed for data analysis follow closely those of the probability of detection, and harmonize the statistical concepts and parameters between quantitative and qualitative method validation. Use of POI statistics also harmonizes statistical concepts for botanical, microbiological, toxin, and other analyte identification methods that produce binary results. The POI statistical model provides a tool for graphical representation of response curves for qualitative methods, reporting of descriptive statistics, and application of performance requirements. Single collaborator and multicollaborative study examples are given.
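
    A minimal sketch of reporting the POI as an observed proportion with a confidence interval, one point of a response curve per concentration level, is given below. The Wilson score interval and the example numbers are illustrative choices and are not prescribed by the report.

        import numpy as np
        from scipy.stats import norm

        def poi_with_wilson_ci(identified, n, conf=0.95):
            """Probability of identification (POI) as the proportion of replicates
            identified, with a Wilson score confidence interval for the binary outcome."""
            p_hat = identified / n
            z = norm.ppf(0.5 + conf / 2)
            denom = 1 + z**2 / n
            centre = (p_hat + z**2 / (2 * n)) / denom
            half = z * np.sqrt(p_hat * (1 - p_hat) / n + z**2 / (4 * n**2)) / denom
            return p_hat, max(0.0, centre - half), min(1.0, centre + half)

        # Hypothetical response curve: POI versus target-material concentration.
        for conc, k in [(0.0, 1), (25.0, 4), (50.0, 9), (100.0, 12)]:
            poi, lo, hi = poi_with_wilson_ci(k, n=12)
            print(f"{conc:>5}% target: POI = {poi:.2f}  (95% CI {lo:.2f}-{hi:.2f})")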