WorldWideScience

Sample records for probability approach applied

  1. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  2. Conditional Independence in Applied Probability.

    Science.gov (United States)

    Pfeiffer, Paul E.

    This material assumes the user has the background provided by a good undergraduate course in applied probability. It is felt that introductory courses in calculus, linear algebra, and perhaps some differential equations should provide the requisite experience and proficiency with mathematical concepts, notation, and argument. The document is…

  3. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t

  4. Probability of causation approach

    International Nuclear Information System (INIS)

    Jose, D.E.

    1988-01-01

    Probability of causation (PC) is sometimes viewed as a great improvement by those persons who are not happy with the present rulings of courts in radiation cases. The author does not share that hope and expects that PC will not play a significant role in these issues for at least the next decade. If it is ever adopted in a legislative compensation scheme, it will be used in a way that is unlikely to please most scientists. Consequently, PC is a false hope for radiation scientists, and its best contribution may well lie in some of the spin-off effects, such as an influence on medical practice

  5. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. Key features: a good and solid introduction to probability theory and stochastic processes; logical organization, with the writing presented in a clear manner; a comprehensive choice of topics within the area of probability; ample homework problems organized into chapter sections.

  6. Applied probability models with optimization applications

    CERN Document Server

    Ross, Sheldon M

    1992-01-01

    Concise advanced-level introduction to stochastic processes that frequently arise in applied probability. Largely self-contained text covers Poisson process, renewal theory, Markov chains, inventory theory, Brownian motion and continuous time optimization models, much more. Problems and references at chapter ends. "Excellent introduction." - Journal of the American Statistical Association. Bibliography. 1970 edition.

  7. Applied probability and stochastic processes. 2. ed.

    Energy Technology Data Exchange (ETDEWEB)

    Feldman, Richard M. [Texas A and M Univ., College Station, TX (United States). Industrial and Systems Engineering Dept.; Valdez-Flores, Ciriaco [Sielken and Associates Consulting, Inc., Bryan, TX (United States)

    2010-07-01

    This book presents applied probability and stochastic processes in an elementary but mathematically precise manner, with numerous examples and exercises to illustrate the range of engineering and science applications of the concepts. The book is designed to give the reader an intuitive understanding of probabilistic reasoning, in addition to an understanding of mathematical concepts and principles. The initial chapters present a summary of probability and statistics and then Poisson processes, Markov chains, Markov processes and queuing processes are introduced. Advanced topics include simulation, inventory theory, replacement theory, Markov decision theory, and the use of matrix geometric procedures in the analysis of queues. Included in the second edition are appendices at the end of several chapters giving suggestions for the use of Excel in solving the problems of the chapter. Also new in this edition are an introductory chapter on statistics and a chapter on Poisson processes that includes some techniques used in risk assessment. The old chapter on queues has been expanded and broken into two new chapters: one for simple queuing processes and one for queuing networks. Support is provided through the web site http://apsp.tamu.edu where students will have the answers to odd numbered problems and instructors will have access to full solutions and Excel files for homework. (orig.)

  8. Analytic methods in applied probability in memory of Fridrikh Karpelevich

    CERN Document Server

    Suhov, Yu M

    2002-01-01

    This volume is dedicated to F. I. Karpelevich, an outstanding Russian mathematician who made important contributions to applied probability theory. The book contains original papers focusing on several areas of applied probability and its uses in modern industrial processes, telecommunications, computing, mathematical economics, and finance. It opens with a review of Karpelevich's contributions to applied probability theory and includes a bibliography of his works. Other articles discuss queueing network theory, in particular, in heavy traffic approximation (fluid models). The book is suitable

  9. Approaches to Evaluating Probability of Collision Uncertainty

    Science.gov (United States)

    Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    While the two-dimensional probability of collision (Pc) calculation has served as the main input to conjunction analysis risk assessment for over a decade, it has done this mostly as a point estimate, with relatively little effort made to produce confidence intervals on the Pc value based on the uncertainties in the inputs. The present effort seeks to carry these uncertainties through the calculation in order to generate a probability density of Pc results rather than a single average value. Methods for assessing uncertainty in the primary and secondary objects' physical sizes and state estimate covariances, as well as a resampling approach to reveal the natural variability in the calculation, are presented; and an initial proposal for operationally-useful display and interpretation of these data for a particular conjunction is given.
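
    As a rough illustration of the resampling idea described in this record, the sketch below (Python) draws the hard-body radius and a covariance scale factor from assumed distributions and recomputes a simple Monte Carlo estimate of the 2-D Pc each time, yielding a distribution of Pc values instead of a point estimate. All numbers and distributions are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def pc_2d(miss, cov, hbr, n=100_000):
    """Monte Carlo estimate of the 2-D collision probability:
    fraction of relative-position samples falling inside the
    combined hard-body radius (hbr) around the origin."""
    xy = rng.multivariate_normal(miss, cov, size=n)
    return np.mean(np.hypot(xy[:, 0], xy[:, 1]) < hbr)

# Nominal conjunction-plane inputs (illustrative numbers only).
miss = np.array([120.0, 40.0])           # m, projected miss distance
cov_nom = np.array([[4.0e3, 1.0e3],
                    [1.0e3, 2.0e3]])      # m^2, combined position covariance
hbr_nom = 20.0                            # m, combined hard-body radius

# Resample the uncertain inputs to obtain a density of Pc values
# instead of a single point estimate.
pcs = []
for _ in range(100):
    hbr = rng.normal(hbr_nom, 3.0)                # size uncertainty
    scale = rng.lognormal(mean=0.0, sigma=0.3)    # covariance realism factor
    pcs.append(pc_2d(miss, scale * cov_nom, max(hbr, 1.0)))

pcs = np.array(pcs)
print(f"median Pc = {np.median(pcs):.2e}, "
      f"95% interval = [{np.percentile(pcs, 2.5):.2e}, {np.percentile(pcs, 97.5):.2e}]")
```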

  10. Elements of probability and statistics an introduction to probability with De Finetti’s approach and to Bayesian statistics

    CERN Document Server

    Biagini, Francesca

    2016-01-01

    This book provides an introduction to elementary probability and to Bayesian statistics using de Finetti's subjectivist approach. One of the features of this approach is that it does not require the introduction of a sample space – a non-intrinsic concept that makes the treatment of elementary probability unnecessarily complicated – but introduces as fundamental the concept of random numbers directly related to their interpretation in applications. Events become a particular case of random numbers, and probability a particular case of expectation when it is applied to events. The subjective evaluation of expectation and of conditional expectation is based on an economic choice of an acceptable bet or penalty. The properties of expectation and conditional expectation are derived by applying a coherence criterion that the evaluation has to follow. The book is suitable for all introductory courses in probability and statistics for students in Mathematics, Informatics, Engineering, and Physics.

  11. Path probability of stochastic motion: A functional approach

    Science.gov (United States)

    Hattori, Masayuki; Abe, Sumiyoshi

    2016-06-01

    The path probability of a particle undergoing stochastic motion is studied by use of a functional technique, and the general formula is derived for the path probability distribution functional. The probability of finding paths inside a tube/band, the center of which is stipulated by a given path, is analytically evaluated in a way analogous to continuous measurements in quantum mechanics. Then, the formalism developed here is applied to the stochastic dynamics of stock price in finance.

  12. Dropping Probability Reduction in OBS Networks: A Simple Approach

    KAUST Repository

    Elrasad, Amr; Rabia, Sherif; Mahmoud, Mohamed; Aly, Moustafa H.; Shihada, Basem

    2016-01-01

    by being adaptable to different offset-time and burst length distributions. We observed that applying a limited range of wavelength conversion, burst blocking probability is reduced by several orders of magnitudes and yields a better burst delivery ratio

  13. The Probability Approach to English If-Conditional Sentences

    Science.gov (United States)

    Wu, Mei

    2012-01-01

    Users of the Probability Approach choose the right one from four basic types of conditional sentences--factual, predictive, hypothetical and counterfactual conditionals, by judging how likely (i.e. the probability) the event in the result-clause will take place when the condition in the if-clause is met. Thirty-three students from the experimental…

  14. Probability

    CERN Document Server

    Shiryaev, A N

    1996-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author has included new material on the probability of large deviations, and on the central limit theorem for sums of dependent random variables

  15. Entanglement probabilities of polymers: a white noise functional approach

    International Nuclear Information System (INIS)

    Bernido, Christopher C; Carpio-Bernido, M Victoria

    2003-01-01

    The entanglement probabilities for a highly flexible polymer to wind n times around a straight polymer are evaluated using white noise analysis. To introduce the white noise functional approach, the one-dimensional random walk problem is taken as an example. The polymer entanglement scenario, viewed as a random walk on a plane, is then treated and the entanglement probabilities are obtained for a magnetic flux confined along the straight polymer, and a case where an entangled polymer is subjected to the potential V = ḟ(s)θ. In the absence of the magnetic flux and the potential V, the entanglement probabilities reduce to a result obtained by Wiegel

  16. Triangulation based inclusion probabilities: a design-unbiased sampling approach

    OpenAIRE

    Fehrmann, Lutz; Gregoire, Timothy; Kleinn, Christoph

    2011-01-01

    A probabilistic sampling approach for design-unbiased estimation of area-related quantitative characteristics of spatially dispersed population units is proposed. The developed field protocol includes a fixed number of 3 units per sampling location and is based on partial triangulations over their natural neighbors to derive the individual inclusion probabilities. The performance of the proposed design is tested in comparison to fixed area sample plots in a simulation with two forest stands. ...

  17. Elements of a function analytic approach to probability.

    Energy Technology Data Exchange (ETDEWEB)

    Ghanem, Roger Georges (University of Southern California, Los Angeles, CA); Red-Horse, John Robert

    2008-02-01

    We first provide a detailed motivation for using probability theory as a mathematical context in which to analyze engineering and scientific systems that possess uncertainties. We then present introductory notes on the function analytic approach to probabilistic analysis, emphasizing the connections to various classical deterministic mathematical analysis elements. Lastly, we describe how to use the approach as a means to augment deterministic analysis methods in a particular Hilbert space context, and thus enable a rigorous framework for commingling deterministic and probabilistic analysis tools in an application setting.

  18. A hydroclimatological approach to predicting regional landslide probability using Landlab

    Science.gov (United States)

    Strauch, Ronda; Istanbulluoglu, Erkan; Nudurupati, Sai Siddhartha; Bandaragoda, Christina; Gasparini, Nicole M.; Tucker, Gregory E.

    2018-02-01

    We develop a hydroclimatological approach to the modeling of regional shallow landslide initiation that integrates spatial and temporal dimensions of parameter uncertainty to estimate an annual probability of landslide initiation based on Monte Carlo simulations. The physically based model couples the infinite-slope stability model with a steady-state subsurface flow representation and operates in a digital elevation model. Spatially distributed gridded data for soil properties and vegetation classification are used for parameter estimation of probability distributions that characterize model input uncertainty. Hydrologic forcing to the model is through annual maximum daily recharge to subsurface flow obtained from a macroscale hydrologic model. We demonstrate the model in a steep mountainous region in northern Washington, USA, over 2700 km2. The influence of soil depth on the probability of landslide initiation is investigated through comparisons among model output produced using three different soil depth scenarios reflecting the uncertainty of soil depth and its potential long-term variability. We found elevation-dependent patterns in probability of landslide initiation that showed the stabilizing effects of forests at low elevations, an increased landslide probability with forest decline at mid-elevations (1400 to 2400 m), and soil limitation and steep topographic controls at high alpine elevations and in post-glacial landscapes. These dominant controls manifest themselves in a bimodal distribution of spatial annual landslide probability. Model testing with limited observations revealed similarly moderate model confidence for the three hazard maps, suggesting suitable use as relative hazard products. The model is available as a component in Landlab, an open-source, Python-based landscape earth systems modeling environment, and is designed to be easily reproduced utilizing HydroShare cyberinfrastructure.
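
    The following sketch illustrates the general Monte Carlo idea behind such a calculation: sample soil and wetness parameters from assumed probability distributions, evaluate a common infinite-slope factor of safety, and report the fraction of samples below 1. The parameter ranges and the factor-of-safety form are assumptions for illustration, not the Landlab component's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(42)

def annual_failure_probability(slope_deg, n=10_000):
    """Monte Carlo probability that the infinite-slope factor of safety
    drops below 1 for one grid cell, with parameter uncertainty expressed
    as probability distributions (illustrative distributions only)."""
    theta = np.radians(slope_deg)
    g, rho_w = 9.81, 1000.0
    rho_s = rng.uniform(1600.0, 2000.0, n)           # soil bulk density, kg/m^3
    phi = np.radians(rng.uniform(28.0, 40.0, n))     # friction angle
    coh = rng.uniform(2e3, 15e3, n)                  # total cohesion, Pa
    h = rng.triangular(0.3, 1.0, 3.0, n)             # soil depth, m
    w = np.clip(rng.lognormal(-0.7, 0.6, n), 0, 1)   # relative wetness from recharge

    fs = (coh / (rho_s * g * h * np.sin(theta) * np.cos(theta))
          + (1.0 - w * rho_w / rho_s) * np.tan(phi) / np.tan(theta))
    return np.mean(fs < 1.0)

for slope in (20, 30, 40):
    print(f"{slope} deg slope: P(FS < 1) ~ {annual_failure_probability(slope):.3f}")
```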

  19. A hydroclimatological approach to predicting regional landslide probability using Landlab

    Directory of Open Access Journals (Sweden)

    R. Strauch

    2018-02-01

    Full Text Available We develop a hydroclimatological approach to the modeling of regional shallow landslide initiation that integrates spatial and temporal dimensions of parameter uncertainty to estimate an annual probability of landslide initiation based on Monte Carlo simulations. The physically based model couples the infinite-slope stability model with a steady-state subsurface flow representation and operates in a digital elevation model. Spatially distributed gridded data for soil properties and vegetation classification are used for parameter estimation of probability distributions that characterize model input uncertainty. Hydrologic forcing to the model is through annual maximum daily recharge to subsurface flow obtained from a macroscale hydrologic model. We demonstrate the model in a steep mountainous region in northern Washington, USA, over 2700 km2. The influence of soil depth on the probability of landslide initiation is investigated through comparisons among model output produced using three different soil depth scenarios reflecting the uncertainty of soil depth and its potential long-term variability. We found elevation-dependent patterns in probability of landslide initiation that showed the stabilizing effects of forests at low elevations, an increased landslide probability with forest decline at mid-elevations (1400 to 2400 m), and soil limitation and steep topographic controls at high alpine elevations and in post-glacial landscapes. These dominant controls manifest themselves in a bimodal distribution of spatial annual landslide probability. Model testing with limited observations revealed similarly moderate model confidence for the three hazard maps, suggesting suitable use as relative hazard products. The model is available as a component in Landlab, an open-source, Python-based landscape earth systems modeling environment, and is designed to be easily reproduced utilizing HydroShare cyberinfrastructure.

  20. Dropping Probability Reduction in OBS Networks: A Simple Approach

    KAUST Repository

    Elrasad, Amr

    2016-08-01

    In this paper, we propose and derive a slotted-time model for analyzing the burst blocking probability in Optical Burst Switched (OBS) networks. We evaluated the immediate and delayed signaling reservation schemes. The proposed model compares the performance of both just-in-time (JIT) and just-enough-time (JET) signaling protocols associated with void/non-void filling link scheduling schemes. It also considers no-conversion and limited-range wavelength conversion scenarios. Our model is distinguished by being adaptable to different offset-time and burst length distributions. We observed that by applying a limited range of wavelength conversion, the burst blocking probability is reduced by several orders of magnitude and yields a better burst delivery ratio compared with full wavelength conversion.

  1. Improving information extraction using a probability-based approach

    DEFF Research Database (Denmark)

    Kim, S.; Ahmed, Saeema; Wallace, K.

    2007-01-01

    Information plays a crucial role during the entire life-cycle of a product. It has been shown that engineers frequently consult colleagues to obtain the information they require to solve problems. However, the industrial world is now more transient and key personnel move to other companies...... or retire. It is becoming essential to retrieve vital information from archived product documents, if it is available. There is, therefore, great interest in ways of extracting relevant and sharable information from documents. A keyword-based search is commonly used, but studies have shown...... the recall, while maintaining the high precision, a learning approach that makes identification decisions based on a probability model, rather than simply looking up the presence of the pre-defined variations, looks promising. This paper presents the results of developing such a probability-based entity...

  2. Analysis of femtosecond pump-probe photoelectron-photoion coincidence measurements applying Bayesian probability theory

    Science.gov (United States)

    Rumetshofer, M.; Heim, P.; Thaler, B.; Ernst, W. E.; Koch, M.; von der Linden, W.

    2018-06-01

    Ultrafast dynamical processes in photoexcited molecules can be observed with pump-probe measurements, in which information about the dynamics is obtained from the transient signal associated with the excited state. Background signals provoked by pump and/or probe pulses alone often obscure these excited-state signals. Simple subtraction of pump-only and/or probe-only measurements from the pump-probe measurement, as commonly applied, results in a degradation of the signal-to-noise ratio and, in the case of coincidence detection, the danger of overrated background subtraction. Coincidence measurements additionally suffer from false coincidences, requiring long data-acquisition times to keep erroneous signals at an acceptable level. Here we present a probabilistic approach based on Bayesian probability theory that overcomes these problems. For a pump-probe experiment with photoelectron-photoion coincidence detection, we reconstruct the interesting excited-state spectrum from pump-probe and pump-only measurements. This approach allows us to treat background and false coincidences consistently and on the same footing. We demonstrate that the Bayesian formalism has the following advantages over simple signal subtraction: (i) the signal-to-noise ratio is significantly increased, (ii) the pump-only contribution is not overestimated, (iii) false coincidences are excluded, (iv) prior knowledge, such as positivity, is consistently incorporated, (v) confidence intervals are provided for the reconstructed spectrum, and (vi) it is applicable to any experimental situation and noise statistics. Most importantly, by accounting for false coincidences, the Bayesian approach allows us to run experiments at higher ionization rates, resulting in a significant reduction of data acquisition times. The probabilistic approach is thoroughly scrutinized by challenging mock data. The application to pump-probe coincidence measurements on acetone molecules enables quantitative interpretations
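
    A minimal sketch of the kind of Bayesian background treatment described here: with Poisson likelihoods for a pump-only measurement (background) and a pump-probe measurement (signal plus background), a grid posterior for the signal rate stays non-negative and comes with credible intervals, unlike naive subtraction. The counts and flat priors are illustrative assumptions; the paper's full formalism (including false coincidences) is not reproduced.

```python
import numpy as np
from scipy.stats import poisson

# Illustrative counts in one photoelectron-energy bin (made-up numbers).
n_pp = 38   # pump-probe measurement: excited-state signal + background
n_p  = 25   # pump-only measurement: background alone

# Grid posterior with flat positive priors:
#   n_p ~ Poisson(b),  n_pp ~ Poisson(s + b),  with s, b >= 0.
s_grid = np.linspace(0, 80, 401)
b_grid = np.linspace(0, 80, 401)
S, B = np.meshgrid(s_grid, b_grid, indexing="ij")

log_post = poisson.logpmf(n_p, B) + poisson.logpmf(n_pp, S + B)
post = np.exp(log_post - log_post.max())
post /= post.sum()

# Marginal posterior for the signal rate s (never negative, unlike the
# naive subtraction n_pp - n_p when the counts fluctuate).
post_s = post.sum(axis=1)
mean_s = np.sum(s_grid * post_s)
ci = s_grid[np.searchsorted(np.cumsum(post_s), [0.025, 0.975])]
print(f"signal rate ~ {mean_s:.1f}, 95% credible interval {ci}")
```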

  3. Continuation of probability density functions using a generalized Lyapunov approach

    Energy Technology Data Exchange (ETDEWEB)

    Baars, S., E-mail: s.baars@rug.nl [Johann Bernoulli Institute for Mathematics and Computer Science, University of Groningen, P.O. Box 407, 9700 AK Groningen (Netherlands); Viebahn, J.P., E-mail: viebahn@cwi.nl [Centrum Wiskunde & Informatica (CWI), P.O. Box 94079, 1090 GB, Amsterdam (Netherlands); Mulder, T.E., E-mail: t.e.mulder@uu.nl [Institute for Marine and Atmospheric research Utrecht, Department of Physics and Astronomy, Utrecht University, Princetonplein 5, 3584 CC Utrecht (Netherlands); Kuehn, C., E-mail: ckuehn@ma.tum.de [Technical University of Munich, Faculty of Mathematics, Boltzmannstr. 3, 85748 Garching bei München (Germany); Wubs, F.W., E-mail: f.w.wubs@rug.nl [Johann Bernoulli Institute for Mathematics and Computer Science, University of Groningen, P.O. Box 407, 9700 AK Groningen (Netherlands); Dijkstra, H.A., E-mail: h.a.dijkstra@uu.nl [Institute for Marine and Atmospheric research Utrecht, Department of Physics and Astronomy, Utrecht University, Princetonplein 5, 3584 CC Utrecht (Netherlands); School of Chemical and Biomolecular Engineering, Cornell University, Ithaca, NY (United States)

    2017-05-01

    Techniques from numerical bifurcation theory are very useful to study transitions between steady fluid flow patterns and the instabilities involved. Here, we provide computational methodology to use parameter continuation in determining probability density functions of systems of stochastic partial differential equations near fixed points, under a small noise approximation. Key innovation is the efficient solution of a generalized Lyapunov equation using an iterative method involving low-rank approximations. We apply and illustrate the capabilities of the method using a problem in physical oceanography, i.e. the occurrence of multiple steady states of the Atlantic Ocean circulation.

  4. Collective fluctuations in magnetized plasma: Transition probability approach

    International Nuclear Information System (INIS)

    Sosenko, P.P.

    1997-01-01

    Statistical plasma electrodynamics is elaborated with special emphasis on the transition probability approach and quasi-particles, and on modern applications to magnetized plasmas. Fluctuation spectra in the magnetized plasma are calculated in the range of low frequencies (with respect to the cyclotron one), and the conditions for the transition from incoherent to collective fluctuations are established. The role of finite-Larmor-radius effects and particle polarization drift in such a transition is explained. The ion collective features in fluctuation spectra are studied. 63 refs., 30 figs

  5. Moving beyond probabilities – Strength of knowledge characterisations applied to security

    International Nuclear Information System (INIS)

    Askeland, Tore; Flage, Roger; Aven, Terje

    2017-01-01

    Many security experts avoid the concept of probability when assessing risk and vulnerabilities. Their main argument is that meaningful probabilities cannot be determined and they are consequently not useful for decision-making and security management. However, to give priority to some measures and not others, the likelihood dimension needs to be addressed in some way; the question is how. One approach receiving attention recently is to add strength of knowledge judgements to the probabilities and probability intervals generated. The judgements provide a qualitative labelling of how strong the knowledge supporting the probability assignments is. Criteria for such labelling have been developed, but not for a security setting. The purpose of this paper is to develop such criteria specific to security applications and, using some examples, to demonstrate their suitability. - Highlights: • The concept of probability is often avoided in security risk assessments. • We argue that the likelihood/probability dimension needs to be somehow addressed. • Probabilities should be supplemented by qualitative strength-of-knowledge scores. • Such criteria specific to security applications are developed. • Two examples are used to demonstrate the suitability of the suggested criteria.

  6. More efficient integrated safeguards by applying a reasonable detection probability for maintaining low presence probability of undetected nuclear proliferating activities

    International Nuclear Information System (INIS)

    Otsuka, Naoto

    2013-01-01

    Highlights: • A theoretical foundation is presented for more efficient Integrated Safeguards (IS). • The probability of undetected nuclear proliferation activities should be maintained low. • For nations under IS, the probability of starting proliferation activities is very low. • This fact allows the IS detection probability to be decreased by dozens of percentage points. • The cost of IS per nation can be cut down by reducing inspection frequencies etc. - Abstract: A theoretical foundation is presented for implementing more efficiently the present International Atomic Energy Agency (IAEA) integrated safeguards (IS) on the basis of fuzzy evaluation of the probability that the evaluated nation will continue peaceful activities. It is shown that by determining the presence probability of undetected nuclear proliferating activities, nations under IS can be maintained at acceptably low proliferation risk levels even if the detection probability of the current IS is decreased by dozens of percentage points from the present value. This makes it possible to reduce inspection frequency and the number of collected samples, allowing the IAEA to cut costs per nation. This will contribute to further promotion and application of IS to more nations by the IAEA, and more efficient utilization of IAEA resources from the viewpoint of the whole IS framework
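
    A toy calculation of the central argument: if the prior probability that a nation under IS starts proliferation is very small, the probability of undetected proliferating activities remains low even when the detection probability is reduced substantially. The two-state formula and numbers below are illustrative assumptions, not the paper's fuzzy evaluation model.

```python
def undetected_presence_probability(p_start: float, p_detect: float) -> float:
    """Probability that a nation is proliferating AND the safeguards system
    fails to detect it, given a prior probability of starting proliferation
    (p_start) and a detection probability (p_detect). Simple two-state
    sketch, not the paper's fuzzy model."""
    return p_start * (1.0 - p_detect)

# Toy numbers: with a very small prior that an IS nation starts
# proliferating, even a much lower detection probability keeps the
# presence probability of undetected activities low.
for p_detect in (0.90, 0.70, 0.50):
    print(f"p_detect={p_detect:.2f} -> "
          f"P(undetected proliferation)={undetected_presence_probability(0.001, p_detect):.5f}")
```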

  7. A Balanced Approach to Adaptive Probability Density Estimation

    Directory of Open Access Journals (Sweden)

    Julio A. Kovacs

    2017-04-01

    Full Text Available Our development of a Fast (Mutual) Information Matching (FIM) of molecular dynamics time series data led us to the general problem of how to accurately estimate the probability density function of a random variable, especially in cases of very uneven samples. Here, we propose a novel Balanced Adaptive Density Estimation (BADE) method that effectively optimizes the amount of smoothing at each point. To do this, BADE relies on an efficient nearest-neighbor search which results in good scaling for large data sizes. Our tests on simulated data show that BADE exhibits equal or better accuracy than existing methods, and visual tests on univariate and bivariate experimental data show that the results are also aesthetically pleasing. This is due in part to the use of a visual criterion for setting the smoothing level of the density estimate. Our results suggest that BADE offers an attractive new take on the fundamental density estimation problem in statistics. We have applied it to molecular dynamics simulations of membrane pore formation. We also expect BADE to be generally useful for low-dimensional applications in other statistical application domains such as bioinformatics, signal processing and econometrics.
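
    A generic sketch of an adaptive kernel density estimator in which the per-sample bandwidth is set from a k-nearest-neighbour distance, so sparse regions receive more smoothing; this conveys the flavour of the problem BADE addresses but is not the BADE algorithm itself. The data and the choice of k are illustrative.

```python
import numpy as np

def adaptive_kde(x_eval, samples, k=30):
    """Adaptive Gaussian kernel density estimate in 1-D: the bandwidth for
    each sample is taken from the distance to its k-th nearest neighbour,
    so sparse regions get more smoothing than dense ones (a generic
    sample-point estimator, not BADE)."""
    samples = np.asarray(samples)
    # Per-sample bandwidth from k-NN distance (brute force for clarity).
    d = np.abs(samples[:, None] - samples[None, :])
    h = np.sort(d, axis=1)[:, k] + 1e-12
    z = (x_eval[:, None] - samples[None, :]) / h[None, :]
    kern = np.exp(-0.5 * z**2) / (np.sqrt(2 * np.pi) * h[None, :])
    return kern.mean(axis=1)

rng = np.random.default_rng(1)
# Very uneven sample: a dense mode plus a small, narrow satellite mode.
data = np.concatenate([rng.normal(0, 1, 2000), rng.normal(6, 0.3, 50)])
grid = np.linspace(-4, 8, 400)
density = adaptive_kde(grid, data, k=30)
print("density integrates to ~1:", np.trapz(density, grid))
```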

  8. Concepts and Bounded Rationality: An Application of Niestegge's Approach to Conditional Quantum Probabilities

    Science.gov (United States)

    Blutner, Reinhard

    2009-03-01

    Recently, Gerd Niestegge developed a new approach to quantum mechanics via conditional probabilities developing the well-known proposal to consider the Lüders-von Neumann measurement as a non-classical extension of probability conditionalization. I will apply his powerful and rigorous approach to the treatment of concepts using a geometrical model of meaning. In this model, instances are treated as vectors of a Hilbert space H. In the present approach there are at least two possibilities to form categories. The first possibility sees categories as a mixture of its instances (described by a density matrix). In the simplest case we get the classical probability theory including the Bayesian formula. The second possibility sees categories formed by a distinctive prototype which is the superposition of the (weighted) instances. The construction of prototypes can be seen as transferring a mixed quantum state into a pure quantum state, freezing the probabilistic characteristics of the superposed instances into the structure of the formed prototype. Closely related to the idea of forming concepts by prototypes is the existence of interference effects. Such interference effects are typically found in macroscopic quantum systems and I will discuss them in connection with several puzzles of bounded rationality. The present approach nicely generalizes earlier proposals made by authors such as Diederik Aerts, Andrei Khrennikov, Ricardo Franco, and Jerome Busemeyer. Concluding, I will suggest that an active dialogue between cognitive approaches to logic and semantics and the modern approach of quantum information science is mandatory.

  9. A Multidisciplinary Approach for Teaching Statistics and Probability

    Science.gov (United States)

    Rao, C. Radhakrishna

    1971-01-01

    The author presents a syllabus for an introductory (first year after high school) course in statistics and probability and some methods of teaching statistical techniques. The description comes basically from the procedures used at the Indian Statistical Institute, Calcutta. (JG)

  10. Continuation of probability density functions using a generalized Lyapunov approach

    NARCIS (Netherlands)

    Baars, S.; Viebahn, J. P.; Mulder, T. E.; Kuehn, C.; Wubs, F. W.; Dijkstra, H. A.

    2017-01-01

    Techniques from numerical bifurcation theory are very useful to study transitions between steady fluid flow patterns and the instabilities involved. Here, we provide computational methodology to use parameter continuation in determining probability density functions of systems of stochastic partial

  11. High-resolution urban flood modelling - a joint probability approach

    Science.gov (United States)

    Hartnett, Michael; Olbert, Agnieszka; Nash, Stephen

    2017-04-01

    The hydrodynamic modelling of rapid flood events due to extreme climatic events in an urban environment is both a complex and challenging task. The horizontal resolution necessary to resolve the complexity of urban flood dynamics is a critical issue; the presence of obstacles of varying shapes and length scales, gaps between buildings and the complex geometry of the city such as slopes affect flow paths and flood level magnitudes. These small scale processes require a high resolution grid to be modelled accurately (2m or less, Olbert et al., 2015; Hunter et al., 2008; Brown et al., 2007) and, therefore, altimetry data of at least the same resolution. Along with the availability of high-resolution LiDAR data and computational capabilities, as well as state of the art nested modelling approaches, these problems can now be overcome. Flooding and drying, domain definition, frictional resistance and boundary descriptions are all important issues to be addressed when modelling urban flooding. In recent years, the number of urban flood models has increased dramatically, giving good insight into various modelling problems and solutions (Mark et al., 2004; Mason et al., 2007; Fewtrell et al., 2008; Shubert et al., 2008). Despite extensive modelling work conducted for fluvial (e.g. Mignot et al., 2006; Hunter et al., 2008; Yu and Lane, 2006) and coastal mechanisms of flooding (e.g. Gallien et al., 2011; Yang et al., 2012), the number of investigations into combined coastal-fluvial flooding is still very limited (e.g. Orton et al., 2012; Lian et al., 2013). This is surprising given the extent of flood consequences when both mechanisms occur simultaneously, which usually happens when they are driven by one process such as a storm. The reason for that could be the fact that the likelihood of a joint event is much smaller than that of either of the two contributors occurring individually, because for fast moving storms the rainfall-driven fluvial flood arrives usually later than the storm surge

  12. Long-Term Fatigue and Its Probability of Failure Applied to Dental Implants

    Directory of Open Access Journals (Sweden)

    María Prados-Privado

    2016-01-01

    Full Text Available It is well known that dental implants have a high success rate, but even so, there are many factors that can cause dental implant failure. Fatigue is very sensitive to the many variables involved in this phenomenon. This paper takes a close look at fatigue analysis and explains a new method to study fatigue from a probabilistic point of view, based on a cumulative damage model and probabilistic finite elements, with the goal of obtaining the expected life and the probability of failure. Two different dental implants were analysed. The model simulated a load of 178 N applied at angles of 0°, 15°, and 20°, and a force of 489 N at the same angles. The von Mises stress distribution was evaluated and, once the methodology proposed here was applied, the statistics of the fatigue life and the cumulative probability function were obtained. This function allows us to relate each life in cycles to its probability of failure. The cylindrical implant shows worse behaviour under the same loading force than the conical implant analysed here. The methodology employed in the present study provides very accurate results because all possible uncertainties have been taken into account from the beginning.
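
    For illustration only, a probabilistic fatigue-life sketch in the same spirit: sample the coefficients of an assumed Basquin-type S-N law, compute cycles-to-failure at a given von Mises stress amplitude, and read off the probability of failure at a chosen number of cycles. The material constants and the S-N form are assumptions, not the cumulative damage model or implant geometries of the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

def failure_probability_vs_cycles(stress_amplitude_mpa, cycles, n=50_000):
    """Sample an assumed Basquin-type S-N law, S = A * N^(-b), so that
    N_fail = (A / S)^(1/b), and report the fraction of samples that fail
    before `cycles` (illustrative material constants only)."""
    A = rng.normal(1200.0, 120.0, n)     # fatigue strength coefficient, MPa
    b = rng.normal(0.10, 0.01, n)        # fatigue strength exponent
    n_fail = (A / stress_amplitude_mpa) ** (1.0 / b)
    return float(np.mean(n_fail < cycles))

for n_cycles in (1e6, 1e7, 1e8):
    print(f"{n_cycles:.0e} cycles -> P(failure) ~ "
          f"{failure_probability_vs_cycles(200.0, n_cycles):.3f}")
```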

  13. Dynamic probability evaluation of safety levels of earth-rockfill dams using Bayesian approach

    Directory of Open Access Journals (Sweden)

    Zi-wu Fan

    2009-06-01

    Full Text Available In order to accurately predict and control the aging process of dams, new information should be collected continuously to renew the quantitative evaluation of dam safety levels. Owing to the complex structural characteristics of dams, it is quite difficult to predict the time-varying factors affecting their safety levels. It is not feasible to employ dynamic reliability indices to evaluate the actual safety levels of dams. Based on the relevant regulations for dam safety classification in China, a dynamic probability description of dam safety levels was developed. Using the Bayesian approach and effective information mining, as well as real-time information, this study achieved more rational evaluation and prediction of dam safety levels. With the Bayesian expression of discrete stochastic variables, the a priori probabilities of the dam safety levels determined by experts were combined with the likelihood probability of the real-time check information, and the probability information for the evaluation of dam safety levels was renewed. The probability index was then applied to dam rehabilitation decision-making. This method helps reduce the difficulty and uncertainty of the evaluation of dam safety levels and complies with the current safe decision-making regulations for dams in China. It also enhances the application of current risk analysis methods for dam safety levels.
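
    The core updating step described here is a discrete Bayes rule: expert prior probabilities over the dam safety levels are combined with the likelihood of the real-time check information. A minimal sketch with made-up numbers:

```python
import numpy as np

# Expert a priori probabilities for discrete dam safety levels
# (illustrative numbers, ordered from "safe" to "unsafe").
levels = ["A", "B", "C"]
prior = np.array([0.70, 0.25, 0.05])

# Assumed likelihood of the real-time check observation
# (e.g. an anomalous seepage reading) under each safety level.
likelihood = np.array([0.10, 0.40, 0.80])

# Bayes' rule for discrete stochastic variables:
# P(level | observation) is proportional to P(observation | level) * P(level).
posterior = prior * likelihood
posterior /= posterior.sum()

for lvl, p0, p1 in zip(levels, prior, posterior):
    print(f"level {lvl}: prior {p0:.2f} -> posterior {p1:.2f}")
```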

  14. Unequal Probability Marking Approach to Enhance Security of Traceback Scheme in Tree-Based WSNs.

    Science.gov (United States)

    Huang, Changqin; Ma, Ming; Liu, Xiao; Liu, Anfeng; Zuo, Zhengbang

    2017-06-17

    Fog (from core to edge) computing is a newly emerging computing platform, which utilizes a large number of network devices at the edge of a network to provide ubiquitous computing, thus having great development potential. However, the issue of security poses an important challenge for fog computing. In particular, the Internet of Things (IoT) that constitutes the fog computing platform is crucial for preserving the security of a huge number of wireless sensors, which are vulnerable to attack. In this paper, a new unequal probability marking approach is proposed to enhance the security performance of logging and migration traceback (LM) schemes in tree-based wireless sensor networks (WSNs). The main contribution of this paper is to overcome the deficiency of the LM scheme that has a higher network lifetime and large storage space. In the unequal probability marking logging and migration (UPLM) scheme of this paper, different marking probabilities are adopted for different nodes according to their distances to the sink. A large marking probability is assigned to nodes in remote areas (areas at a long distance from the sink), while a small marking probability is applied to nodes in nearby area (areas at a short distance from the sink). This reduces the consumption of storage and energy in addition to enhancing the security performance, lifetime, and storage capacity. Marking information will be migrated to nodes at a longer distance from the sink for increasing the amount of stored marking information, thus enhancing the security performance in the process of migration. The experimental simulation shows that for general tree-based WSNs, the UPLM scheme proposed in this paper can store 1.12-1.28 times the amount of stored marking information that the equal probability marking approach achieves, and has 1.15-1.26 times the storage utilization efficiency compared with other schemes.
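
    A small sketch of the unequal-marking idea: assign a marking probability that increases with a node's hop distance from the sink, so remote nodes (with spare energy and storage) mark more often than near-sink nodes. The linear assignment and the parameter values are assumptions; the UPLM scheme defines its own rule.

```python
def marking_probability(hops_to_sink: int, max_hops: int,
                        p_near: float = 0.05, p_far: float = 0.6) -> float:
    """Unequal marking probability that grows with distance from the sink:
    nodes far from the sink mark packets with high probability, nodes near
    the sink with low probability. Linear interpolation is an assumed rule
    for illustration."""
    frac = min(max(hops_to_sink / max_hops, 0.0), 1.0)
    return p_near + (p_far - p_near) * frac

# Example: a 10-hop routing tree.
for hops in range(1, 11):
    print(hops, round(marking_probability(hops, 10), 3))
```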

  15. Daniel Courgeau: Probability and social science: methodological relationships between the two approaches [Review of: Probability and social science: methodological relationships between the two approaches]

    NARCIS (Netherlands)

    Willekens, F.J.C.

    2013-01-01

    Throughout history, humans engaged in games in which randomness plays a role. In the 17th century, scientists started to approach chance scientifically and to develop a theory of probability. Courgeau describes how the relationship between probability theory and social sciences emerged and evolved

  16. Enhancing Ignition Probability and Fusion Yield in NIF Indirect Drive Targets with Applied Magnetic Fields

    Science.gov (United States)

    Perkins, L. John; Logan, B. Grant; Ho, Darwin; Zimmerman, George; Rhodes, Mark; Blackfield, Donald; Hawkins, Steven

    2017-10-01

    Imposed magnetic fields of tens of Tesla that increase to greater than 10 kT (100 MGauss) under capsule compression may relax conditions for ignition and propagating burn in indirect-drive ICF targets. This may allow attainment of ignition, or at least significant fusion energy yields, in presently-performing ICF targets on the National Ignition Facility that today are sub-marginal for thermonuclear burn through adverse hydrodynamic conditions at stagnation. Results of detailed 2D radiation-hydrodynamic-burn simulations applied to NIF capsule implosions with low-mode shape perturbations and residual kinetic energy loss indicate that such compressed fields may increase the probability for ignition through range reduction of fusion alpha particles, suppression of electron heat conduction and stabilization of higher-mode RT instabilities. Optimum initial applied fields are around 50 T. Off-line testing has been performed of a hohlraum coil and pulsed power supply that could be integrated on NIF; axial fields of 58 T were obtained. Given that the full plasma structure at capsule stagnation may be governed by 3-D resistive MHD, the formation of closed magnetic field lines might further augment ignition prospects. Experiments are now required to assess the potential of applied magnetic fields to NIF ICF ignition and burn. Work performed under the auspices of U.S. DOE by LLNL under Contract DE-AC52-07NA27344.

  17. Probability of ignition - a better approach than ignition margin

    International Nuclear Information System (INIS)

    Ho, S.K.; Perkins, L.J.

    1989-01-01

    The use of a figure of merit - the probability of ignition - is proposed for the characterization of the ignition performance of projected ignition tokamaks. Monte Carlo and analytic models have been developed to compute the uncertainty distribution function for ignition of a given tokamak design, in terms of the uncertainties inherent in the tokamak physics database. A sample analysis with this method indicates that the risks of not achieving ignition may be unacceptably high unless the accepted margins for ignition are increased. (author). Letter-to-the-editor. 12 refs, 2 figs, 2 tabs
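
    A toy version of the proposed figure of merit: sample multiplicative physics uncertainties from assumed distributions representing database scatter and count the fraction of samples in which the design still ignites. The distributions and the notion of "achieved margin" below are illustrative assumptions, not the authors' Monte Carlo physics model.

```python
import numpy as np

rng = np.random.default_rng(7)

def ignition_probability(margin_design: float, n: int = 100_000) -> float:
    """Toy Monte Carlo figure of merit: sample multiplicative physics
    uncertainties (confinement scatter, loss-term scatter) and count the
    fraction of samples in which the achieved margin still exceeds 1.
    Illustrative distributions only."""
    confinement = rng.lognormal(mean=0.0, sigma=0.25, size=n)   # confinement scatter
    loss = rng.lognormal(mean=0.0, sigma=0.15, size=n)          # loss-term scatter
    achieved = margin_design * confinement / loss
    return float(np.mean(achieved > 1.0))

for m in (1.0, 1.3, 1.6, 2.0):
    print(f"design ignition margin {m:.1f} -> P(ignition) ~ {ignition_probability(m):.2f}")
```

    The point of the figure of merit is visible even in this toy: a design margin well above 1 may still leave a non-trivial probability of not igniting once the database uncertainties are propagated.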

  18. Theoretical analysis of integral neutron transport equation using collision probability method with quadratic flux approach

    International Nuclear Information System (INIS)

    Shafii, Mohammad Ali; Meidianti, Rahma; Wildian,; Fitriyani, Dian; Tongkukut, Seni H. J.; Arkundato, Artoto

    2014-01-01

    A theoretical analysis of the integral neutron transport equation using the collision probability (CP) method with a quadratic flux approach has been carried out. In general, the solution of the neutron transport equation using the CP method is performed with the flat flux approach. In this research, the CP method is implemented in a cylindrical nuclear fuel cell with the spatial mesh treated using a non-flat flux approach. This means that the neutron flux at any point in the nuclear fuel cell is considered to differ from point to point, following the distribution pattern of a quadratic flux. The result is presented here in the form of the quadratic flux, which gives a better understanding of the real conditions in the cell calculation and serves as a starting point for computational calculation

  19. Theoretical analysis of integral neutron transport equation using collision probability method with quadratic flux approach

    Energy Technology Data Exchange (ETDEWEB)

    Shafii, Mohammad Ali, E-mail: mashafii@fmipa.unand.ac.id; Meidianti, Rahma, E-mail: mashafii@fmipa.unand.ac.id; Wildian,, E-mail: mashafii@fmipa.unand.ac.id; Fitriyani, Dian, E-mail: mashafii@fmipa.unand.ac.id [Department of Physics, Andalas University Padang West Sumatera Indonesia (Indonesia); Tongkukut, Seni H. J. [Department of Physics, Sam Ratulangi University Manado North Sulawesi Indonesia (Indonesia); Arkundato, Artoto [Department of Physics, Jember University Jember East Java Indonesia (Indonesia)

    2014-09-30

    A theoretical analysis of the integral neutron transport equation using the collision probability (CP) method with a quadratic flux approach has been carried out. In general, the solution of the neutron transport equation using the CP method is performed with the flat flux approach. In this research, the CP method is implemented in a cylindrical nuclear fuel cell with the spatial mesh treated using a non-flat flux approach. This means that the neutron flux at any point in the nuclear fuel cell is considered to differ from point to point, following the distribution pattern of a quadratic flux. The result is presented here in the form of the quadratic flux, which gives a better understanding of the real conditions in the cell calculation and serves as a starting point for computational calculation.

  20. Predicting footbridge vibrations using a probability-based approach

    DEFF Research Database (Denmark)

    Pedersen, Lars; Frier, Christian

    2017-01-01

    Vibrations in footbridges may be problematic as excessive vibrations may occur as a result of actions of pedestrians. Design-stage predictions of levels of footbridge vibration to the action of a pedestrian are useful and have been employed for many years based on a deterministic approach to mode...

  1. Probability of cavitation for single ultrasound pulses applied to tissues and tissue-mimicking materials.

    Science.gov (United States)

    Maxwell, Adam D; Cain, Charles A; Hall, Timothy L; Fowlkes, J Brian; Xu, Zhen

    2013-03-01

    In this study, the negative pressure values at which inertial cavitation consistently occurs in response to a single, two-cycle, focused ultrasound pulse were measured in several media relevant to cavitation-based ultrasound therapy. The pulse was focused into a chamber containing one of the media, which included liquids, tissue-mimicking materials, and ex vivo canine tissue. Focal waveforms were measured by two separate techniques using a fiber-optic hydrophone. Inertial cavitation was identified by high-speed photography in optically transparent media and an acoustic passive cavitation detector. The probability of cavitation (P(cav)) for a single pulse as a function of peak negative pressure (p(-)) followed a sigmoid curve, with the probability approaching one when the pressure amplitude was sufficient. The statistical threshold (defined as P(cav) = 0.5) was between p(-) = 26 and 30 MPa in all samples with high water content but varied between p(-) = 13.7 and >36 MPa in other media. A model for radial cavitation bubble dynamics was employed to evaluate the behavior of cavitation nuclei at these pressure levels. A single bubble nucleus with an inertial cavitation threshold of p(-) = 28.2 megapascals was estimated to have a 2.5 nm radius in distilled water. These data may be valuable for cavitation-based ultrasound therapy to predict the likelihood of cavitation at various pressure levels and dimensions of cavitation-induced lesions in tissue. Copyright © 2013 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.
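
    The reported sigmoid dependence of cavitation probability on peak negative pressure can be summarised by fitting a logistic curve whose midpoint is the statistical threshold (P(cav) = 0.5). The data points below are invented for illustration; only the functional form follows the description above.

```python
import numpy as np
from scipy.optimize import curve_fit

def p_cav(p_neg, p_thresh, width):
    """Sigmoid probability of inertial cavitation versus peak negative
    pressure; p_thresh is the 50% point (the 'statistical threshold')."""
    return 1.0 / (1.0 + np.exp(-(p_neg - p_thresh) / width))

# Illustrative measurements: peak negative pressure (MPa) and the observed
# cavitation fraction over repeated single pulses (made-up data).
p_meas = np.array([20, 22, 24, 26, 28, 30, 32, 34], dtype=float)
frac = np.array([0.00, 0.05, 0.15, 0.40, 0.65, 0.90, 0.97, 1.00])

popt, _ = curve_fit(p_cav, p_meas, frac, p0=[27.0, 1.5])
print(f"fitted threshold P(cav)=0.5 at about {popt[0]:.1f} MPa, width {popt[1]:.2f} MPa")
```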

  2. Stochastic Analysis and Applied Probability(3.3.1): Topics in the Theory and Applications of Stochastic Analysis

    Science.gov (United States)

    2015-08-13

    Critical Catalyst Reactant Branching Processes with Controlled Immigration, Annals of Applied Probability (03 2012), Amarjit Budhiraja, Rami Atar ... Markus Fischer. Large Deviation Properties of Weakly Interacting Processes via Weak Convergence Methods, Annals of Probability (10 2010), Rami Atar ... Dimensional Forward-Backward Stochastic Differential Equations and the KPZ Equation, Electron. J. Probab., 19 (2014), no. 40, 121. [2] R. Atar and A

  3. Sundanese ancient manuscripts search engine using probability approach

    Science.gov (United States)

    Suryani, Mira; Hadi, Setiawan; Paulus, Erick; Nurma Yulita, Intan; Supriatna, Asep K.

    2017-10-01

    Today, Information and Communication Technology (ICT) has become a regular part of every aspect of life, including the cultural and heritage aspect. Sundanese ancient manuscripts, as Sundanese heritage, are in damaged condition, and so is the information they contain. Therefore, in order to preserve the information in Sundanese ancient manuscripts and make them easier to search, a search engine has been developed. The search engine must have good computing ability. In order to obtain the best computation in the developed search engine, three types of probabilistic approaches were compared in this study: the Bayesian Networks Model, Divergence from Randomness with the PL2 distribution, and DFR-PL2F as a derivative form of DFR-PL2. The three probabilistic approaches are supported by an index of the documents and three different weighting methods: term occurrence, term frequency, and TF-IDF. The experiment involved 12 Sundanese ancient manuscripts containing 474 distinct terms. The developed search engine was tested with 50 random queries for three types of query. The experimental results showed that for both single and multiple queries, the best searching performance was given by the combination of the PL2F approach and the TF-IDF weighting method. The performance was evaluated using an average response time of about 0.08 seconds and a Mean Average Precision (MAP) of about 0.33.
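
    As a small illustration of one of the weighting methods compared (TF-IDF), the sketch below ranks a few toy documents for a query; the snippets and tokenisation are assumptions, and the PL2F retrieval model that performed best is not reproduced here.

```python
import math
from collections import Counter

def tf_idf_scores(query_terms, documents):
    """Rank documents for a query with plain TF-IDF weighting, one of the
    three weighting methods compared in the paper (the probabilistic PL2F
    model itself is not reproduced here)."""
    n_docs = len(documents)
    doc_freq = Counter()
    for doc in documents:
        doc_freq.update(set(doc.lower().split()))

    scores = []
    for doc in documents:
        tokens = doc.lower().split()
        tf = Counter(tokens)
        score = 0.0
        for term in query_terms:
            if doc_freq[term] == 0:
                continue  # term absent from the whole collection
            idf = math.log(n_docs / doc_freq[term])
            score += (tf[term] / len(tokens)) * idf
        scores.append(score)
    return scores

# Toy document snippets (illustrative only, not the indexed manuscripts).
manuscripts = [
    "carita parahyangan kingdom sunda chronicle",
    "sanghyang siksakandang karesian moral teaching",
    "bujangga manik journey sunda pilgrimage",
]
print(tf_idf_scores(["sunda", "kingdom"], manuscripts))
```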

  4. Reliability analysis of reactor systems by applying probability method; Analiza pouzdanosti reaktorskih sistema primenom metoda verovatnoce

    Energy Technology Data Exchange (ETDEWEB)

    Milivojevic, S [Institute of Nuclear Sciences Boris Kidric, Vinca, Beograd (Serbia and Montenegro)

    1974-12-15

    The probability method chosen for analysing reactor system reliability is considered realistic since it is based on verified experimental data; in fact, it is a statistical method. The probability method developed takes into account the probability distribution of permitted levels of relevant parameters and their particular influence on the reliability of the system as a whole. The proposed method is rather general and was used for the problem of thermal safety analysis of a reactor system. This analysis makes it possible to examine the basic properties of the system under different operating conditions; expressed in the form of probabilities, the results show the reliability of the system as a whole as well as the reliability of each component.

  5. Measuring market share of petrol stations using conditional probability approach

    Science.gov (United States)

    Sharif, Shamshuritawati; Lwee, Xue Yin

    2017-05-01

    Oil and gas production has been a strength of Malaysia's growth over past decades and is one of the most strategic economic branches in the world. Although the oil industry is essential for the economic growth of a country, only a few such undertakings are successfully established, because it is a very risky business. Therefore, a dealer must have some information in hand before setting up a new business plan. Understanding the current business situation is an important strategy for avoiding risky ventures. In this study, the aim is to deliver a very simple but essential way to identify market share based on customers' choice factors. The approach is presented so that non-statisticians can easily use it to help their business performance. This study shows that the most important factors differ from one station to another. The results show that the customers' choice factors for BHPetrol, Caltex, PETRON, PETRONAS and SHELL are site location, service quality, service quality, size of the petrol station, and brand image, respectively.

  6. Arsenic Accumulation in Rice and Probable Mitigation Approaches: A Review

    Directory of Open Access Journals (Sweden)

    Anindita Mitra

    2017-10-01

    Full Text Available According to recent reports, millions of people across the globe are suffering from arsenic (As) toxicity. Arsenic is present in different oxidative states in the environment and enters the food chain through soil and water. In the agricultural field, irrigation with arsenic-contaminated water leaves a higher level of arsenic contamination in the top soil, which may affect the quality of crop production. A major crop like rice (Oryza sativa L.) requires a considerable amount of water to complete its lifecycle. Rice plants potentially accumulate arsenic, particularly inorganic arsenic (iAs), from the field in different body parts including grains. Different transporters have been reported to assist the accumulation of arsenic in plant cells; for example, arsenate (AsV) is absorbed with the help of phosphate transporters, and arsenite (AsIII) through nodulin 26-like intrinsic protein (NIP) in the silicon transport pathway and plasma membrane intrinsic protein aquaporins. Researchers and practitioners are trying their best to mitigate the problem of As contamination in rice. However, the solution strategies vary considerably with factors such as cultural practices, soil, water, and environmental/economic conditions. Contemporary work on rice explaining arsenic uptake, transport, and metabolism processes at the rhizosphere may help to formulate better plans. Common agronomic practices like rainwater harvesting for crop irrigation, the use of natural components that help in arsenic methylation, and biotechnological approaches may help to reduce arsenic uptake by food crops. This review encompasses the research advances and practical agronomic strategies on arsenic contamination in the rice crop.

  7. Applying discursive approaches to health psychology.

    Science.gov (United States)

    Seymour-Smith, Sarah

    2015-04-01

    The aim of this paper is to outline the contribution of two strands of discursive research, glossed as 'macro' and 'micro,' to the field of health psychology. A further goal is to highlight some contemporary debates in methodology associated with the use of interview data versus more naturalistic data in qualitative health research. Discursive approaches provide a way of analyzing talk as a social practice that considers how descriptions are put together and what actions they achieve. A selection of recent examples of discursive research from one applied area of health psychology, studies of diet and obesity, is drawn upon in order to illustrate the specifics of both strands. 'Macro' discourse work in psychology incorporates a Foucauldian focus on the way that discourses regulate subjectivities, whereas the concept of interpretative repertoires affords more agency to the individual: both are useful for identifying the cultural context of talk. Both 'macro' and 'micro' strands focus on accountability to varying degrees. 'Micro' Discursive Psychology, however, pays closer attention to the sequential organization of constructions and focuses on naturalistic settings that allow for the inclusion of an analysis of the health professional. Diets are typically depicted as an individual responsibility in mainstream health psychology, but discursive research highlights how discourses are collectively produced and bound up with social practices. (c) 2015 APA, all rights reserved.

  8. Probabilistic Approach to Conditional Probability of Release of Hazardous Materials from Railroad Tank Cars during Accidents

    Science.gov (United States)

    2009-10-13

    This paper describes a probabilistic approach to estimate the conditional probability of release of hazardous materials from railroad tank cars during train accidents. Monte Carlo methods are used in developing a probabilistic model to simulate head ...

  9. Optimal selection for BRCA1 and BRCA2 mutation testing using a combination of ' easy to apply ' probability models

    NARCIS (Netherlands)

    Bodmer, D.; Ligtenberg, M. J. L.; van der Hout, A. H.; Gloudemans, S.; Ansink, K.; Oosterwijk, J. C.; Hoogerbrugge, N.

    2006-01-01

    To establish an efficient, reliable and easy to apply risk assessment tool to select families with breast and/or ovarian cancer patients for BRCA mutation testing, using available probability models. In a retrospective study of 263 families with breast and/or ovarian cancer patients, the utility of

  10. The distributed failure probability approach to dependent failure analysis, and its application

    International Nuclear Information System (INIS)

    Hughes, R.P.

    1989-01-01

    The Distributed Failure Probability (DFP) approach to the problem of dependent failures in systems is presented. The basis of the approach is that the failure probability of a component is a variable. The source of this variability is the change in the 'environment' of the component, where the term 'environment' is used to mean not only obvious environmental factors such as temperature etc., but also such factors as the quality of maintenance and manufacture. The failure probability is distributed among these various 'environments' giving rise to the Distributed Failure Probability method. Within the framework which this method represents, modelling assumptions can be made, based both on engineering judgment and on the data directly. As such, this DFP approach provides a soundly based and scrutable technique by which dependent failures can be quantitatively assessed. (orig.)
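
    To make the mechanism concrete, the following worked relation (an illustration added here, not taken from the paper; the notation is assumed) shows how distributing a component's failure probability over environments itself generates dependence between two nominally independent, identical components that share the same environment:

```latex
% p takes the value p_i in environment i, which occurs with weight w_i.
% Two identical components sharing the environment then both fail with probability
P(\text{both fail}) \;=\; \sum_i w_i\, p_i^{2}
\;\;\ge\;\; \Bigl(\sum_i w_i\, p_i\Bigr)^{2} \;=\; \bar{p}^{\,2},
% with equality only if p is identical in every environment: the spread of the
% failure probability across environments is itself a source of dependent failure.
```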

  11. Promoting Active Learning When Teaching Introductory Statistics and Probability Using a Portfolio Curriculum Approach

    Science.gov (United States)

    Adair, Desmond; Jaeger, Martin; Price, Owen M.

    2018-01-01

    The use of a portfolio curriculum approach, when teaching a university introductory statistics and probability course to engineering students, is developed and evaluated. The portfolio curriculum approach, so called, as the students need to keep extensive records both as hard copies and digitally of reading materials, interactions with faculty,…

  12. MOBILE CLOUD COMPUTING APPLIED TO HEALTHCARE APPROACH

    OpenAIRE

    Omar AlSheikSalem

    2016-01-01

    In the past few years it has become clear that mobile cloud computing has been established by integrating mobile computing and cloud computing, gaining in both storage space and processing speed. Integrating healthcare applications and services is one of the vast data approaches that can be adapted to mobile cloud computing. This work proposes a framework for global healthcare computing that combines mobile computing and cloud computing. This approach makes it possible to integrate all of ...

  13. A semi-mechanistic approach to calculate the probability of fuel defects

    International Nuclear Information System (INIS)

    Tayal, M.; Millen, E.; Sejnoha, R.

    1992-10-01

    In this paper the authors describe the status of a semi-mechanistic approach to the calculation of the probability of fuel defects. This approach expresses the defect probability in terms of fundamental parameters such as local stresses, local strains, and fission product concentration. The calculations of defect probability continue to reflect the influences of the conventional parameters like power ramp, burnup and CANLUB. In addition, the new approach provides a mechanism to account for the impacts of additional factors involving detailed fuel design and reactor operation, for example pellet density, pellet shape and size, sheath diameter and thickness, pellet/sheath clearance, and coolant temperature and pressure. The approach has been validated against a previous empirical correlation. An illustrative example shows how the defect thresholds are influenced by changes in the internal design of the element and in the coolant pressure. (Author) (7 figs., tab., 12 refs.)

  14. Modeling Stochastic Complexity in Complex Adaptive Systems: Non-Kolmogorov Probability and the Process Algebra Approach.

    Science.gov (United States)

    Sulis, William H

    2017-10-01

    Walter Freeman III pioneered the application of nonlinear dynamical systems theories and methodologies in his work on mesoscopic brain dynamics. Sadly, mainstream psychology and psychiatry still cling to linear correlation-based data analysis techniques, which threaten to subvert the process of experimentation and theory building. In order to progress, it is necessary to develop tools capable of managing the stochastic complexity of complex biopsychosocial systems, which includes multilevel feedback relationships, nonlinear interactions, chaotic dynamics and adaptability. In addition, however, these systems exhibit intrinsic randomness, non-Gaussian probability distributions, non-stationarity, contextuality, and non-Kolmogorov probabilities, as well as the absence of mean and/or variance and conditional probabilities. These properties and their implications for statistical analysis are discussed. An alternative approach, the Process Algebra approach, is described. It is a generative model, capable of generating non-Kolmogorov probabilities. It has proven useful in addressing fundamental problems in quantum mechanics and in the modeling of developing psychosocial systems.

  15. Identifying functional reorganization of spelling networks: an individual peak probability comparison approach

    Science.gov (United States)

    Purcell, Jeremy J.; Rapp, Brenda

    2013-01-01

    Previous research has shown that damage to the neural substrates of orthographic processing can lead to functional reorganization during reading (Tsapkini et al., 2011); in this research we ask if the same is true for spelling. To examine the functional reorganization of spelling networks we present a novel three-stage Individual Peak Probability Comparison (IPPC) analysis approach for comparing the activation patterns obtained during fMRI of spelling in a single brain-damaged individual with dysgraphia to those obtained in a set of non-impaired control participants. The first analysis stage characterizes the convergence in activations across non-impaired control participants by applying a technique typically used for characterizing activations across studies: Activation Likelihood Estimate (ALE) (Turkeltaub et al., 2002). This method was used to identify locations that have a high likelihood of yielding activation peaks in the non-impaired participants. The second stage provides a characterization of the degree to which the brain-damaged individual's activations correspond to the group pattern identified in Stage 1. This involves performing a Mahalanobis distance statistics analysis (Tsapkini et al., 2011) that compares each of a control group's peak activation locations to the nearest peak generated by the brain-damaged individual. The third stage evaluates the extent to which the brain-damaged individual's peaks are atypical relative to the range of individual variation among the control participants. This IPPC analysis allows for a quantifiable, statistically sound method for comparing an individual's activation pattern to the patterns observed in a control group and, thus, provides a valuable tool for identifying functional reorganization in a brain-damaged individual with impaired spelling. Furthermore, this approach can be applied more generally to compare any individual's activation pattern with that of a set of other individuals. PMID:24399981

  16. Probability approaching method (PAM) and its application on fuel management optimization

    International Nuclear Information System (INIS)

    Liu, Z.; Hu, Y.; Shi, G.

    2004-01-01

    For the multi-cycle reloading optimization problem, a new solution scheme is presented. The multi-cycle problem is de-coupled into a number of relatively independent mono-cycle problems; this non-linear programming problem with complex constraints is then solved by an advanced new algorithm, the probability approaching method (PAM), which is based on probability theory. Results on a simplified core model show the effectiveness of this new multi-cycle optimization scheme. (authors)

  17. Probability of extreme interference levels computed from reliability approaches: application to transmission lines with uncertain parameters

    International Nuclear Information System (INIS)

    Larbi, M.; Besnier, P.; Pecqueux, B.

    2014-01-01

    This paper deals with the risk analysis of an EMC fault using a statistical approach. It is based on reliability methods from probabilistic engineering mechanics. A computation of the probability of failure (i.e. the probability of exceeding a threshold) of a current induced by crosstalk is established by taking into account uncertainties on the input parameters influencing interference levels in the context of transmission lines. The study has allowed us to evaluate the probability of failure of the induced current by using reliability methods at a relatively low computational cost compared to Monte Carlo simulation. (authors)
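
    As a point of comparison for the reliability methods mentioned above, the brute-force Monte Carlo estimate of such an exceedance probability can be sketched as follows (an illustration added here; the coupling law and the parameter values are hypothetical stand-ins, not the paper's transmission-line model):

```python
# Monte Carlo estimate of P(induced current > threshold) under uncertain inputs.
# The "coupling law" below is a toy placeholder used only to illustrate the workflow.
import numpy as np

rng = np.random.default_rng(7)
n = 100_000
height = rng.normal(1.5e-2, 1e-3, n)    # hypothetical wire height (m)
spacing = rng.normal(2.0e-2, 2e-3, n)   # hypothetical wire separation (m)
source = rng.normal(1.0, 0.1, n)        # hypothetical source amplitude (A)

induced = source * 0.05 * height / spacing   # toy crosstalk coupling law (A)
threshold = 0.045
print("P(exceed threshold) ~", np.mean(induced > threshold))
```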

  18. Robust functional statistics applied to Probability Density Function shape screening of sEMG data.

    Science.gov (United States)

    Boudaoud, S; Rix, H; Al Harrach, M; Marin, F

    2014-01-01

    Recent studies pointed out possible shape modifications of the Probability Density Function (PDF) of surface electromyographical (sEMG) data according to several contexts like fatigue and muscle force increase. Following this idea, criteria have been proposed to monitor these shape modifications, mainly using High Order Statistics (HOS) parameters like skewness and kurtosis. In experimental conditions, these parameters are confronted with small sample sizes in the estimation process. This small sample size induces errors in the estimated HOS parameters, hindering real-time and precise sEMG PDF shape monitoring. Recently, a functional formalism, the Core Shape Model (CSM), has been used to analyse shape modifications of PDF curves. In this work, taking inspiration from the CSM method, robust functional statistics are proposed to emulate both skewness and kurtosis behaviors. These functional statistics combine both kernel density estimation and PDF shape distances to evaluate shape modifications even in the presence of small sample sizes. Then, the proposed statistics are tested, using Monte Carlo simulations, on both normal and log-normal PDFs that mimic observed sEMG PDF shape behavior during muscle contraction. According to the obtained results, the functional statistics seem to be more robust than HOS parameters to the small-sample-size effect and more accurate in sEMG PDF shape screening applications.

  19. Half-life measurements and photon emission probabilities of frequently applied radioisotopes

    International Nuclear Information System (INIS)

    Schoetzig, U.; Schrader, H.

    1998-09-01

    It is among the duties of the PTB 'Radioactivity' department to determine the activity of radioactive sources and to publish their decay data, also called 'standards', so that users of such sources can calibrate their equipment, e.g. photon detectors, accordingly. Further data required for proper calibration are the photon emission probabilities per decay, P(E), at the relevant photon energies E. The emission rate R(E) is derived from the activity A by the relation R(E) = A x P(E), and the decay half-lives, T_1/2, together with the standards are used for determining the time of measurement. The calibration quality is essentially determined by these parameters and the uncertainties involved. The PTB 'Radioactivity' department therefore publishes recommended decay data elaborated and used by the experts at PTB. The tabulated data are either measured at PTB or critically selected from data compilations of other publication sources. The tabulated decay data presented here are intended to serve as a source of reference for laboratory work and should be used in combination with the comprehensive data collections available (see the bibliography of this document: 86BRFI, 91TECD, 96FI, Nuclear Data Sheets, e.g. 98ND84). (orig./CB)

  20. Applying a gaming approach to IP strategy.

    Science.gov (United States)

    Gasnier, Arnaud; Vandamme, Luc

    2010-02-01

    Adopting an appropriate IP strategy is an important but complex area, particularly in the pharmaceutical and biotechnology sectors, in which aspects such as regulatory submissions, high competitive activity, and public health and safety information requirements limit the amount of information that can be protected effectively through secrecy. As a result, and considering the existing time limits for patent protection, decisions on how to approach IP in these sectors must be made with knowledge of the options and consequences of IP positioning. Because of the specialized nature of IP, it is necessary to impart knowledge regarding the options and impact of IP to decision-makers, whether at the level of inventors, marketers or strategic business managers. This feature review provides some insight on IP strategy, with a focus on the use of a new 'gaming' approach for transferring the skills and understanding needed to make informed IP-related decisions; the game Patentopolis is discussed as an example of such an approach. Patentopolis involves interactive activities with IP-related business decisions, including the exploitation and enforcement of IP rights, and can be used to gain knowledge on the impact of adopting different IP strategies.

  1. Applied Regression Modeling A Business Approach

    CERN Document Server

    Pardoe, Iain

    2012-01-01

    An applied and concise treatment of statistical regression techniques for business students and professionals who have little or no background in calculus. Regression analysis is an invaluable statistical methodology in business settings and is vital to model the relationship between a response variable and one or more predictor variables, as well as the prediction of a response value given values of the predictors. In view of the inherent uncertainty of business processes, such as the volatility of consumer spending and the presence of market uncertainty, business professionals use regression a

  2. Estimating Probability of Default on Peer to Peer Market – Survival Analysis Approach

    Directory of Open Access Journals (Sweden)

    Đurović Andrija

    2017-05-01

    Full Text Available Arguably a cornerstone of credit risk modelling is the probability of default. This article aims to search for evidence of a relationship between loan characteristics and the probability of default on the peer-to-peer (P2P) market. In line with that, two loan characteristics are analysed: (1) loan term length and (2) loan purpose. The analysis is conducted using a survival analysis approach within the vintage framework. Firstly, the 12-month through-the-cycle probability of default is used to compare the riskiness of the analysed loan characteristics. Secondly, the log-rank test is employed in order to compare the complete survival period of the cohorts. The findings of the paper suggest that there is clear evidence of a relationship between the analysed loan characteristics and the probability of default. Longer term loans are riskier than shorter term ones, and the least risky loans are those used for credit card payoff.
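
    For illustration, a minimal sketch of the survival-analysis idea described above (not the authors' code; the cohorts and figures below are hypothetical): the 12-month probability of default is one minus the Kaplan-Meier survival estimate at 12 months, computed separately per cohort.

```python
# Kaplan-Meier estimate of the 12-month probability of default for two toy loan cohorts.
import numpy as np

def kaplan_meier_pd(durations, events, horizon):
    """Return P(default by `horizon`) = 1 - S(horizon) from right-censored data.
    durations: months observed on book; events: 1 if defaulted, 0 if censored."""
    durations = np.asarray(durations, dtype=float)
    events = np.asarray(events, dtype=int)
    survival = 1.0
    for t in np.unique(durations[events == 1]):
        if t > horizon:
            break
        at_risk = np.sum(durations >= t)
        defaults = np.sum((durations == t) & (events == 1))
        survival *= 1.0 - defaults / at_risk
    return 1.0 - survival

# Hypothetical cohorts: (months on book, default flag) for short-term vs long-term loans.
short_term = ([3, 7, 12, 12, 12, 9, 12, 12], [1, 1, 0, 0, 0, 1, 0, 0])
long_term  = ([2, 5, 6, 8, 10, 12, 12, 12], [1, 1, 1, 1, 1, 0, 0, 0])

print("12m PD, short term:", round(kaplan_meier_pd(*short_term, horizon=12), 3))
print("12m PD, long term: ", round(kaplan_meier_pd(*long_term, horizon=12), 3))
```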

  3. Critical Applied Linguistics: An Evaluative Interdisciplinary Approach in Criticism and Evaluation of Applied Linguistics’ Disciplines

    OpenAIRE

    H. Davari

    2015-01-01

    The emergence of some significant critical approaches and directions in the field of applied linguistics from the mid-1980s onwards has met with various positive and opposite reactions. On the basis of their strength and significance, such approaches and directions have challenged some of the mainstream approaches’ claims, principles and assumptions. Among them, critical applied linguistics can be highlighted as a new approach, developed by the Australian applied linguist, Alastair Pennycook....

  4. Teaching Probability to Pre-Service Teachers with Argumentation Based Science Learning Approach

    Science.gov (United States)

    Can, Ömer Sinan; Isleyen, Tevfik

    2016-01-01

    The aim of this study is to explore the effects of the argumentation based science learning (ABSL) approach on the teaching probability to pre-service teachers. The sample of the study included 41 students studying at the Department of Elementary School Mathematics Education in a public university during the 2014-2015 academic years. The study is…

  5. The Classicist and the Frequentist Approach to Probability within a "TinkerPlots2" Combinatorial Problem

    Science.gov (United States)

    Prodromou, Theodosia

    2012-01-01

    This article seeks to address a pedagogical theory of introducing the classicist and the frequentist approach to probability, by investigating important elements in 9th grade students' learning process while working with a "TinkerPlots2" combinatorial problem. Results from this research study indicate that, after the students had seen…

  6. CGC/saturation approach for soft interactions at high energy: survival probability of central exclusive production

    Energy Technology Data Exchange (ETDEWEB)

    Gotsman, E.; Maor, U. [Tel Aviv University, Department of Particle Physics, Raymond and Beverly Sackler Faculty of Exact Science, School of Physics and Astronomy, Tel Aviv (Israel); Levin, E. [Tel Aviv University, Department of Particle Physics, Raymond and Beverly Sackler Faculty of Exact Science, School of Physics and Astronomy, Tel Aviv (Israel); Universidad Tecnica Federico Santa Maria, Departemento de Fisica, Centro Cientifico-Tecnologico de Valparaiso, Valparaiso (Chile)

    2016-04-15

    We estimate the value of the survival probability for central exclusive production in a model which is based on the CGC/saturation approach. Hard and soft processes are described in the same framework. At LHC energies, we obtain a small value for the survival probability. The source of the small value is the impact parameter dependence of the hard amplitude. Our model has successfully described a large body of soft data: elastic, inelastic and diffractive cross sections, inclusive production and rapidity correlations, as well as the t-dependence of deep inelastic diffractive production of vector mesons. (orig.)

  7. Calculation of the tunneling time using the extended probability of the quantum histories approach

    International Nuclear Information System (INIS)

    Rewrujirek, Jiravatt; Hutem, Artit; Boonchui, Sutee

    2014-01-01

    The dwell time of quantum tunneling has been derived by Steinberg (1995) [7] as a function of the relation between transmission and reflection times τ_t and τ_r, weighted by the transmissivity and the reflectivity. In this paper, we reexamine the dwell time using the extended probability approach. The dwell time is calculated as the weighted average of three mutually exclusive events. We consider also the scattering process due to a resonance potential in the long-time limit. The results show that the dwell time can be expressed as the weighted sum of transmission, reflection and internal probabilities.
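
    The transmissivity/reflectivity-weighted relation referred to above can be written as follows (a standard rendering with assumed notation, not copied from the paper):

```latex
% |T|^2 and |R|^2 denote the transmissivity and reflectivity of the barrier.
\tau_{d} \;=\; |T|^{2}\,\tau_{t} \;+\; |R|^{2}\,\tau_{r}
```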

  8. A probability tomography approach to the analysis of potential field data in the Campi Flegrei caldera (Italy)

    Energy Technology Data Exchange (ETDEWEB)

    Iuliano, T.; Patella, D. [Naples Univ. Federico 2., Naples (Italy). Dipartimento di Scienze Fisiche; Mauriello, P. [Consiglio Nazionale delle Ricerche, Istituto per le Tecnologie Applicate ai Beni Culturali, Rome (Italy)

    2001-04-01

    The results of the application of the 3D probability tomography imaging approach to the study of the Campi Flegrei (CF) caldera are presented and discussed. The tomography approach has been applied to gravity, magnetic and ground deformation data already available in the literature. The analysis of the 3D tomographic images is preceded by a brief qualitative interpretation of the original survey maps and by an outline of the probability tomography approach for each geophysical prospecting method. The results derived from the 3D tomographic images are the high occurrence probabilities of both gravity and ground deformation source centres in the CF caldera under the town of Pozzuoli. A Bouguer negative anomaly source centre is highlighted in the depth range 1.6-2 km b.s.l., whereas a positive ground deformation point source, responsible for the bradyseismic crisis of 1982-1984, is estimated at a mean depth of 3-4 km b.s.l. These inferences, combined with the results of a previous analysis of magnetotelluric, dipolar geoelectrical and self-potential data, corroborate the hypothesis that the bradyseismic events in the CF area may be explained by hot fluids vertical advection and subsequent lateral diffusion within a trapped reservoir overlying a magma chamber.

  9. A probability tomography approach to the analysis of potential field data in the Campi Flegrei caldera (Italy

    Directory of Open Access Journals (Sweden)

    D. Patella

    2001-06-01

    Full Text Available The results of the application of the 3D probability tomography imaging approach to the study of the Campi Flegrei (CF) caldera are presented and discussed. The tomography approach has been applied to gravity, magnetic and ground deformation data already available in literature. The analysis of the 3D tomographic images is preceded by a brief qualitative interpretation of the original survey maps and by an outline of the probability tomography approach for each geophysical prospecting method. The results derived from the 3D tomographic images are the high occurrence probabilities of both gravity and ground deformation source centres in the CF caldera under the town of Pozzuoli. A Bouguer negative anomaly source centre is highlighted in the depth range 1.6-2 km b.s.l., whereas a positive ground deformation point source, responsible for the bradyseismic crisis of 1982-1984, is estimated at a mean depth of 3-4 km b.s.l. These inferences, combined with the results of a previous analysis of magnetotelluric, dipolar geoelectrical and self-potential data, corroborate the hypothesis that the bradyseismic events in the CF area may be explained by hot fluids vertical advection and subsequent lateral diffusion within a trapped reservoir overlying a magma chamber.

  10. Risk Profile Indicators and Spanish Banks’ Probability of Default from a Regulatory Approach

    Directory of Open Access Journals (Sweden)

    Pilar Gómez-Fernández-Aguado

    2018-04-01

    Full Text Available This paper analyses the relationships between the traditional bank risk profile indicators and a new measure of banks' probability of default that considers the Basel regulatory framework. First, based on the SYstemic Model of Bank Originated Losses (SYMBOL), we calculated the individual probabilities of default (PD) of a representative sample of Spanish credit institutions during the period 2008–2016. Then, panel data regressions were estimated to explore the influence of the risk indicators on the PD. Our findings on the Spanish banking system could be important to regulatory and supervisory authorities. First, the PD based on the SYMBOL model could be used to analyse bank risk from a regulatory approach. Second, the results might be useful for designing new regulations focused on the key factors that affect the banks' probability of default. Third, our findings reveal that the emphasis on regulation and supervision should differ by type of entity.

  11. Assessment of Optical Coherence Tomography Color Probability Codes in Myopic Glaucoma Eyes After Applying a Myopic Normative Database.

    Science.gov (United States)

    Seol, Bo Ram; Kim, Dong Myung; Park, Ki Ho; Jeoung, Jin Wook

    2017-11-01

    To evaluate the optical coherence tomography (OCT) color probability codes based on a myopic normative database and to investigate whether the implementation of the myopic normative database can improve the OCT diagnostic ability in myopic glaucoma. Comparative validity study. In this study, 305 eyes (154 myopic healthy eyes and 151 myopic glaucoma eyes) were included. A myopic normative database was obtained based on myopic healthy eyes. We evaluated the agreement between OCT color probability codes after applying the built-in and myopic normative databases, respectively. Another 120 eyes (60 myopic healthy eyes and 60 myopic glaucoma eyes) were included and the diagnostic performance of OCT color codes using a myopic normative database was investigated. The mean weighted kappa (Kw) coefficients for quadrant retinal nerve fiber layer (RNFL) thickness, clock-hour RNFL thickness, and ganglion cell-inner plexiform layer (GCIPL) thickness were 0.636, 0.627, and 0.564, respectively. The myopic normative database showed a higher specificity than did the built-in normative database in quadrant RNFL thickness, clock-hour RNFL thickness, and GCIPL thickness (P database in quadrant RNFL thickness, clock-hour RNFL thickness, and GCIPL thickness (P = .011, P = .004, P database. The implementation of a myopic normative database is needed to allow more precise interpretation of OCT color probability codes when used in myopic eyes. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. Forecasting the Stock Market with Linguistic Rules Generated from the Minimize Entropy Principle and the Cumulative Probability Distribution Approaches

    Directory of Open Access Journals (Sweden)

    Chung-Ho Su

    2010-12-01

    Full Text Available To forecast a complex and non-linear system, such as a stock market, advanced artificial intelligence algorithms, like neural networks (NNs) and genetic algorithms (GAs), have been proposed as new approaches. However, for the average stock investor, two major disadvantages are argued against these advanced algorithms: (1) the rules generated by NNs and GAs are difficult to apply in investment decisions; and (2) the time complexity of the algorithms to produce forecasting outcomes is very high. Therefore, to provide understandable rules for investors and to reduce the time complexity of forecasting algorithms, this paper proposes a novel model for the forecasting process, which combines two granulating methods (the minimize entropy principle approach and the cumulative probability distribution approach) and a rough set algorithm. The model verification demonstrates that the proposed model surpasses the three listed conventional fuzzy time-series models and a multiple regression model (MLR) in forecast accuracy.
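
    As an illustration of the cumulative-probability-distribution granulation step (a sketch under the assumption that it amounts to quantile-based partitioning of the universe of discourse; the series and the interval count are hypothetical):

```python
# Partition a price series into intervals holding equal cumulative probability, then
# label each observation with its interval so linguistic rules can be generated from it.
import numpy as np

prices = np.array([102.5, 101.0, 103.2, 104.8, 104.1, 106.0, 105.2, 107.4, 108.0, 107.1])
n_intervals = 4  # hypothetical granularity

# Cut points at equal steps of the empirical cumulative distribution (quantiles).
cuts = np.quantile(prices, np.linspace(0, 1, n_intervals + 1))
labels = np.clip(np.searchsorted(cuts, prices, side="right") - 1, 0, n_intervals - 1)

for lo, hi in zip(cuts[:-1], cuts[1:]):
    print(f"interval [{lo:.1f}, {hi:.1f}]")
print("linguistic labels:", labels)
```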

  13. Re-assessment of road accident data-analysis policy : applying theory from involuntary, high-consequence, low-probability events like nuclear power plant meltdowns to voluntary, low-consequence, high-probability events like traffic accidents

    Science.gov (United States)

    2002-02-01

    This report examines the literature on involuntary, high-consequence, low-probability (IHL) events like nuclear power plant meltdowns to determine what can be applied to the problem of voluntary, low-consequence high-probability (VLH) events like tra...

  14. Critical Applied Linguistics: An Evaluative Interdisciplinary Approach in Criticism and Evaluation of Applied Linguistics’ Disciplines

    Directory of Open Access Journals (Sweden)

    H. Davari

    2015-11-01

    Full Text Available The emergence of some significant critical approaches and directions in the field of applied linguistics from the mid-1980s onwards has met with various positive and opposite reactions. On the basis of their strength and significance, such approaches and directions have challenged some of the mainstream approaches’ claims, principles and assumptions. Among them, critical applied linguistics can be highlighted as a new approach, developed by the Australian applied linguist, Alastair Pennycook. The aspects, domains and concerns of this new approach were introduced in his book in 2001. Due to the undeniable importance of this approach, as well as partial negligence regarding it in Iranian academic setting, this paper first intends to introduce this approach, as an approach that evaluates various disciplines of applied linguistics through its own specific principles and interests. Then, in order to show its step-by-step application in the evaluation of different disciplines of applied linguistics, with a glance at its significance and appropriateness in Iranian society, two domains, namely English language education and language policy and planning, are introduced and evaluated in order to provide readers with a visible and practical picture of its interdisciplinary nature and evaluative functions. The findings indicate the efficacy of applying this interdisciplinary framework in any language-in-education policy and planning in accordance with the political, social and cultural context of the target society.

  15. An analytic approach to probability tables for the unresolved resonance region

    Science.gov (United States)

    Brown, David; Kawano, Toshihiko

    2017-09-01

    The Unresolved Resonance Region (URR) connects the fast neutron region with the Resolved Resonance Region (RRR). The URR is problematic since resonances are not resolvable experimentally yet the fluctuations in the neutron cross sections play a discernible and technologically important role: the URR in a typical nucleus is in the 100 keV - 2 MeV window where the typical fission spectrum peaks. The URR also represents the transition between R-matrix theory used to describe isolated resonances and Hauser-Feshbach theory which accurately describes the average cross sections. In practice, only average or systematic features of the resonances in the URR are known and are tabulated in evaluations in a nuclear data library such as ENDF/B-VII.1. Codes such as AMPX and NJOY can compute the probability distribution of the cross section in the URR under some assumptions using Monte Carlo realizations of sets of resonances. These probability distributions are stored in the so-called PURR tables. In our work, we begin to develop a scheme for computing the covariance of the cross section probability distribution analytically. Our approach offers the possibility of defining the limits of applicability of Hauser-Feshbach theory and suggests a way to calculate PURR tables directly from systematics for nuclei whose RRR is unknown, provided one makes appropriate assumptions about the shape of the cross section probability distribution.

  16. An analytic approach to probability tables for the unresolved resonance region

    Directory of Open Access Journals (Sweden)

    Brown David

    2017-01-01

    Full Text Available The Unresolved Resonance Region (URR) connects the fast neutron region with the Resolved Resonance Region (RRR). The URR is problematic since resonances are not resolvable experimentally yet the fluctuations in the neutron cross sections play a discernible and technologically important role: the URR in a typical nucleus is in the 100 keV – 2 MeV window where the typical fission spectrum peaks. The URR also represents the transition between R-matrix theory used to describe isolated resonances and Hauser-Feshbach theory which accurately describes the average cross sections. In practice, only average or systematic features of the resonances in the URR are known and are tabulated in evaluations in a nuclear data library such as ENDF/B-VII.1. Codes such as AMPX and NJOY can compute the probability distribution of the cross section in the URR under some assumptions using Monte Carlo realizations of sets of resonances. These probability distributions are stored in the so-called PURR tables. In our work, we begin to develop a scheme for computing the covariance of the cross section probability distribution analytically. Our approach offers the possibility of defining the limits of applicability of Hauser-Feshbach theory and suggests a way to calculate PURR tables directly from systematics for nuclei whose RRR is unknown, provided one makes appropriate assumptions about the shape of the cross section probability distribution.
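
    A toy sketch of the Monte Carlo idea behind such tables follows (an illustration only: a drastically simplified cross-section model with assumed average parameters, not the AMPX/NJOY algorithms):

```python
# Realize resonance ladders from average parameters (Wigner spacings, Porter-Thomas
# widths) and tabulate the resulting probability distribution of a toy cross section.
import numpy as np

rng = np.random.default_rng(0)
D_avg, G_n_avg, G_g = 100.0, 0.5, 0.05   # assumed average spacing and widths (eV)
E0, n_res, n_samples = 5000.0, 60, 2000   # evaluation energy (eV), ladder size, samples

def one_realization():
    spacings = D_avg * np.sqrt(-4.0 / np.pi * np.log(rng.uniform(size=n_res)))  # Wigner
    energies = E0 - 0.5 * n_res * D_avg + np.cumsum(spacings)
    gamma_n = G_n_avg * rng.chisquare(1, size=n_res)          # Porter-Thomas widths
    gamma_t = gamma_n + G_g
    # Toy single-level Breit-Wigner-like shape evaluated at E0 (constants dropped).
    return np.sum(gamma_n * gamma_t / ((E0 - energies) ** 2 + (gamma_t / 2) ** 2))

xs = np.array([one_realization() for _ in range(n_samples)])
probs, edges = np.histogram(xs, bins=20, density=True)   # the "probability table"
print("mean cross section (arb. units):", round(xs.mean(), 3))
print("5th/50th/95th percentiles:", np.percentile(xs, [5, 50, 95]).round(3))
```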

  17. Applying lessons from the ecohealth approach to make food ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Applying lessons from the ecohealth approach to make food systems healthier ... the biennial Ecohealth Congress of the International Association for Ecology and ... intersectoral policies that address the notable increase in obesity, diabetes, ...

  18. A novel approach to estimate the eruptive potential and probability in open conduit volcanoes.

    Science.gov (United States)

    De Gregorio, Sofia; Camarda, Marco

    2016-07-26

    In open conduit volcanoes, volatile-rich magma continuously enters the feeding system; nevertheless, the eruptive activity occurs intermittently. From a practical perspective, the continuous steady input of magma into the feeding system is not able to produce eruptive events alone; rather, surpluses of magma input are required to trigger the eruptive activity. The greater the amount of surplus magma within the feeding system, the higher the eruptive probability. Despite this observation, eruptive potential evaluations are commonly based on the regular magma supply, and in eruptive probability evaluations any magma input generally has the same weight. Conversely, herein we present a novel approach based on the quantification of the surplus of magma progressively intruded into the feeding system. To quantify the surplus of magma, we suggest processing time series of measurable parameters linked to the magma supply. We successfully performed a practical application on Mt Etna using the soil CO2 flux recorded over ten years.

  19. Flood probability quantification for road infrastructure: Data-driven spatial-statistical approach and case study applications.

    Science.gov (United States)

    Kalantari, Zahra; Cavalli, Marco; Cantone, Carolina; Crema, Stefano; Destouni, Georgia

    2017-03-01

    Climate-driven increase in the frequency of extreme hydrological events is expected to impose greater strain on the built environment and major transport infrastructure, such as roads and railways. This study develops a data-driven spatial-statistical approach to quantifying and mapping the probability of flooding at critical road-stream intersection locations, where water flow and sediment transport may accumulate and cause serious road damage. The approach is based on novel integration of key watershed and road characteristics, including also measures of sediment connectivity. The approach is concretely applied to and quantified for two specific study case examples in southwest Sweden, with documented road flooding effects of recorded extreme rainfall. The novel contributions of this study in combining a sediment connectivity account with that of soil type, land use, spatial precipitation-runoff variability and road drainage in catchments, and in extending the connectivity measure use for different types of catchments, improve the accuracy of model results for road flood probability. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. Calculation of the uncertainty in complication probability for various dose-response models, applied to the parotid gland

    International Nuclear Information System (INIS)

    Schilstra, C.; Meertens, H.

    2001-01-01

    Purpose: Usually, models that predict normal tissue complication probability (NTCP) are fitted to clinical data with the maximum likelihood (ML) method. This method inevitably causes a loss of information contained in the data. In this study, an alternative method is investigated that calculates the parameter probability distribution (PD), and, thus, conserves all information. The PD method also allows the calculation of the uncertainty in the NTCP, which is an (often-neglected) prerequisite for the intercomparison of both treatment plans and NTCP models. The PD and ML methods are applied to parotid gland data, and the results are compared. Methods and Materials: The drop in salivary flow due to radiotherapy was measured in 25 parotid glands of 15 patients. Together with the parotid gland dose-volume histograms (DVH), this enabled the calculation of the parameter PDs for three different NTCP models (Lyman, relative seriality, and critical volume). From these PDs, the NTCP and its uncertainty could be calculated for arbitrary parotid gland DVHs. ML parameters and resulting NTCP values were calculated also. Results: All models fitted equally well. The parameter PDs turned out to have nonnormal shapes and long tails. The NTCP predictions of the ML and PD method usually differed considerably, depending on the NTCP model and the nature of irradiation. NTCP curves and ML parameters suggested a highly parallel organization of the parotid gland. Conclusions: Considering the substantial differences between the NTCP predictions of the ML and PD method, the use of the PD method is preferred, because this is the only method that takes all information contained in the clinical data into account. Furthermore, the PD method gives a true measure of the uncertainty in the NTCP.
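
    For reference, the Lyman model named above is commonly written in the following form (the standard parameterization, not reproduced from this paper; D is the dose to the partial volume v, and TD50, m and n are the fitted parameters):

```latex
\mathrm{NTCP} \;=\; \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{t} e^{-x^{2}/2}\,\mathrm{d}x,
\qquad
t \;=\; \frac{D - TD_{50}(v)}{m\,TD_{50}(v)},
\qquad
TD_{50}(v) \;=\; TD_{50}(1)\, v^{-n}
```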

  1. Improved Membership Probability for Moving Groups: Bayesian and Machine Learning Approaches

    Science.gov (United States)

    Lee, Jinhee; Song, Inseok

    2018-01-01

    Gravitationally unbound loose stellar associations (i.e., young nearby moving groups: moving groups hereafter) have been intensively explored because they are important in planet and disk formation studies, exoplanet imaging, and age calibration. Among the many efforts devoted to the search for moving group members, a Bayesian approach (e.g., using the code BANYAN) has become popular recently because of the many advantages it offers. However, the resultant membership probability needs to be carefully adopted because of its sensitive dependence on input models. In this study, we have developed an improved membership calculation tool focusing on the beta-Pic moving group. We made three improvements for building models used in BANYAN II: (1) updating a list of accepted members by re-assessing memberships in terms of position, motion, and age, (2) investigating member distribution functions in XYZ, and (3) exploring field star distribution functions in XYZUVW. Our improved tool can change membership probability up to 70%. Membership probability is critical and must be better defined. For example, our code identifies only one third of the candidate members in SIMBAD that are believed to be kinematically associated with the beta-Pic moving group. Additionally, we performed cluster analysis of young nearby stars using an unsupervised machine learning approach. As more moving groups and their members are identified, the complexity and ambiguity in moving group configuration has been increased. To clarify this issue, we analyzed ~4,000 X-ray bright young stellar candidates. Here, we present the preliminary results. By re-identifying moving groups with the least human intervention, we expect to understand the composition of the solar neighborhood. Moreover, better defined moving group membership will help us understand star formation and evolution in relatively low density environments; especially for the low-mass stars which will be identified in the coming Gaia release.
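
    A minimal sketch of the Bayesian membership idea (not the BANYAN code; the Gaussian parameters, the prior and the measurement below are hypothetical placeholders):

```python
# Posterior membership probability of a star in a moving group versus the field,
# with multivariate Gaussian models in a UVW-like velocity space.
import numpy as np
from scipy.stats import multivariate_normal

# Hypothetical model parameters (km/s); real work fits these to known members/field stars.
group_mean, group_cov = np.array([-10.9, -16.0, -9.0]), np.diag([1.5, 1.5, 1.5]) ** 2
field_mean, field_cov = np.array([-15.0, -20.0, -5.0]), np.diag([25.0, 20.0, 15.0]) ** 2
prior_group = 0.01   # prior probability that a random nearby star belongs to the group

star_uvw = np.array([-11.5, -15.2, -8.7])   # hypothetical measured velocity

like_group = multivariate_normal.pdf(star_uvw, group_mean, group_cov)
like_field = multivariate_normal.pdf(star_uvw, field_mean, field_cov)
posterior = (prior_group * like_group) / (prior_group * like_group
                                          + (1 - prior_group) * like_field)
print(f"membership probability: {posterior:.3f}")
```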

  2. An extended risk assessment approach for chemical plants applied to a study related to pipe ruptures

    International Nuclear Information System (INIS)

    Milazzo, Maria Francesca; Aven, Terje

    2012-01-01

    Risk assessments and Quantitative Risk Assessment (QRA) in particular have been used in the chemical industry for many years to support decision-making on the choice of arrangements and measures associated with chemical processes, transportation and storage of dangerous substances. The assessments have been founded on a risk perspective seeing risk as a function of frequency of events (probability) and associated consequences. In this paper we point to the need for extending this approach to place a stronger emphasis on uncertainties. A recently developed risk framework designed to better reflect such uncertainties is presented and applied to a chemical plant and specifically the analysis of accidental events related to the rupture of pipes. Two different ways of implementing the framework are presented, one based on the introduction of probability models and one without. The differences between the standard approach and the extended approaches are discussed from a theoretical point of view as well as from a practical risk analyst perspective.

  3. An operational-oriented approach to the assessment of low probability seismic ground motions for critical infrastructures

    Science.gov (United States)

    Garcia-Fernandez, Mariano; Assatourians, Karen; Jimenez, Maria-Jose

    2018-01-01

    Extreme natural hazard events have the potential to cause significant disruption to critical infrastructure (CI) networks. Among them, earthquakes represent a major threat as sudden-onset events with limited, if any, capability of forecast, and high damage potential. In recent years, the increased exposure of interdependent systems has heightened concern, motivating the need for a framework for the management of these increased hazards. The seismic performance level and resilience of existing non-nuclear CIs can be analyzed by identifying the ground motion input values leading to failure of selected key elements. Main interest focuses on the ground motions exceeding the original design values, which should correspond to low probability occurrence. A seismic hazard methodology has been specifically developed to consider low-probability ground motions affecting elongated CI networks. The approach is based on Monte Carlo simulation, which allows for building long-duration synthetic earthquake catalogs to derive low-probability amplitudes. This approach does not affect the mean hazard values and allows obtaining a representation of maximum amplitudes that follow a general extreme-value distribution. This facilitates the analysis of the occurrence of extremes, i.e., very low probability of exceedance from unlikely combinations, for the development of, e.g., stress tests, among other applications. Following this methodology, extreme ground-motion scenarios have been developed for selected combinations of modeling inputs including seismic activity models (source model and magnitude-recurrence relationship), ground motion prediction equations (GMPE), hazard levels, and fractiles of extreme ground motion. The different results provide an overview of the effects of different hazard modeling inputs on the generated extreme motion hazard scenarios. This approach to seismic hazard is at the core of the risk analysis procedure developed and applied to European CI transport

  4. Sensitivity analysis approaches applied to systems biology models.

    Science.gov (United States)

    Zi, Z

    2011-11-01

    With the rising application of systems biology, sensitivity analysis methods have been widely applied to study the biological systems, including metabolic networks, signalling pathways and genetic circuits. Sensitivity analysis can provide valuable insights about how robust the biological responses are with respect to the changes of biological parameters and which model inputs are the key factors that affect the model outputs. In addition, sensitivity analysis is valuable for guiding experimental analysis, model reduction and parameter estimation. Local and global sensitivity analysis approaches are the two types of sensitivity analysis that are commonly applied in systems biology. Local sensitivity analysis is a classic method that studies the impact of small perturbations on the model outputs. On the other hand, global sensitivity analysis approaches have been applied to understand how the model outputs are affected by large variations of the model input parameters. In this review, the author introduces the basic concepts of sensitivity analysis approaches applied to systems biology models. Moreover, the author discusses the advantages and disadvantages of different sensitivity analysis methods, how to choose a proper sensitivity analysis approach, the available sensitivity analysis tools for systems biology models and the caveats in the interpretation of sensitivity analysis results.

  5. A Bayesian joint probability modeling approach for seasonal forecasting of streamflows at multiple sites

    Science.gov (United States)

    Wang, Q. J.; Robertson, D. E.; Chiew, F. H. S.

    2009-05-01

    Seasonal forecasting of streamflows can be highly valuable for water resources management. In this paper, a Bayesian joint probability (BJP) modeling approach for seasonal forecasting of streamflows at multiple sites is presented. A Box-Cox transformed multivariate normal distribution is proposed to model the joint distribution of future streamflows and their predictors such as antecedent streamflows and El Niño-Southern Oscillation indices and other climate indicators. Bayesian inference of model parameters and uncertainties is implemented using Markov chain Monte Carlo sampling, leading to joint probabilistic forecasts of streamflows at multiple sites. The model provides a parametric structure for quantifying relationships between variables, including intersite correlations. The Box-Cox transformed multivariate normal distribution has considerable flexibility for modeling a wide range of predictors and predictands. The Bayesian inference formulated allows the use of data that contain nonconcurrent and missing records. The model flexibility and data-handling ability means that the BJP modeling approach is potentially of wide practical application. The paper also presents a number of statistical measures and graphical methods for verification of probabilistic forecasts of continuous variables. Results for streamflows at three river gauges in the Murrumbidgee River catchment in southeast Australia show that the BJP modeling approach has good forecast quality and that the fitted model is consistent with observed data.
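
    A minimal sketch of the Box-Cox/multivariate-normal idea for one site and one predictor (an illustration only: point estimates of the transformation and of the Gaussian parameters stand in for the full Bayesian inference described in the paper, and the data are synthetic):

```python
import numpy as np
from scipy import stats
from scipy.special import inv_boxcox

rng = np.random.default_rng(1)
antecedent = rng.gamma(4.0, 50.0, size=200)                   # hypothetical predictor
future = 0.6 * antecedent + rng.gamma(2.0, 30.0, size=200)    # hypothetical predictand

# Box-Cox transform each margin, then fit a bivariate normal to the transformed data.
x, lam_x = stats.boxcox(antecedent)
y, lam_y = stats.boxcox(future)
mu = np.array([x.mean(), y.mean()])
cov = np.cov(np.vstack([x, y]))

# Conditional (forecast) distribution of the transformed predictand given a new predictor.
x_new = stats.boxcox(np.array([250.0]), lam_x)[0]
cond_mean = mu[1] + cov[1, 0] / cov[0, 0] * (x_new - mu[0])
cond_var = cov[1, 1] - cov[1, 0] ** 2 / cov[0, 0]

# Back-transform Monte Carlo draws to obtain a probabilistic forecast in original units.
draws = rng.normal(cond_mean, np.sqrt(cond_var), size=5000)
forecast = inv_boxcox(draws, lam_y)
print("median and 90% interval:", np.percentile(forecast, [50, 5, 95]).round(1))
```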

  6. Estimating the Probability of Wind Ramping Events: A Data-driven Approach

    OpenAIRE

    Wang, Cheng; Wei, Wei; Wang, Jianhui; Qiu, Feng

    2016-01-01

    This letter proposes a data-driven method for estimating the probability of wind ramping events without exploiting the exact probability distribution function (PDF) of wind power. Actual wind data validates the proposed method.
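
    A minimal sketch of the distribution-free estimate implied above (the series and the ramp threshold are synthetic and purely illustrative):

```python
# Empirical ramp probability: the frequency of hour-to-hour changes exceeding a threshold,
# computed directly from the data with no assumed PDF for wind power.
import numpy as np

rng = np.random.default_rng(3)
wind_power = np.clip(50 + np.cumsum(rng.normal(0, 5, 1000)), 0, 100)  # synthetic series (MW)
ramp_threshold = 8.0   # MW per hour, hypothetical

ramps = np.abs(np.diff(wind_power)) > ramp_threshold
print("estimated ramp probability per hour:", ramps.mean())
```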

  7. Probability Density Components Analysis: A New Approach to Treatment and Classification of SAR Images

    Directory of Open Access Journals (Sweden)

    Osmar Abílio de Carvalho Júnior

    2014-04-01

    Full Text Available Speckle noise (salt and pepper) is inherent to synthetic aperture radar (SAR), causing a characteristic noise-like granular aspect that complicates image classification. In SAR image analysis, spatial information might be of particular benefit for denoising and for mapping classes characterized by a statistical distribution of pixel intensities from a complex and heterogeneous spectral response. This paper proposes the Probability Density Components Analysis (PDCA), a new alternative that combines filtering and frequency histograms to improve the classification procedure for single-channel synthetic aperture radar (SAR) images. This method was tested on L-band SAR data from the Advanced Land Observation System (ALOS) Phased-Array Synthetic-Aperture Radar (PALSAR) sensor. The study area is located in the Brazilian Amazon rainforest, northern Rondônia State (municipality of Candeias do Jamari), containing forest and land use patterns. The proposed algorithm uses a moving window over the image, estimating the probability density curve in different image components. Therefore, a single input image generates an output with multiple components. Initially the multi-components should be treated by noise-reduction methods, such as maximum noise fraction (MNF) or noise-adjusted principal components (NAPCs). Both methods enable reducing noise as well as the ordering of multi-component data in terms of image quality. In this paper, the NAPC applied to the multi-components provided large reductions in the noise levels, and the color composites considering the first NAPC enhance the classification of different surface features. In the spectral classification, the Spectral Correlation Mapper and Minimum Distance were used. The results obtained were similar to the visual interpretation of optical images from TM-Landsat and Google Maps.

  8. Faster exact Markovian probability functions for motif occurrences: a DFA-only approach.

    Science.gov (United States)

    Ribeca, Paolo; Raineri, Emanuele

    2008-12-15

    The computation of the statistical properties of motif occurrences has an obviously relevant application: patterns that are significantly over- or under-represented in genomes or proteins are interesting candidates for biological roles. However, the problem is computationally hard; as a result, virtually all the existing motif finders use fast but approximate scoring functions, in spite of the fact that they have been shown to produce systematically incorrect results. A few interesting exact approaches are known, but they are very slow and hence not practical in the case of realistic sequences. We give an exact solution, solely based on deterministic finite-state automata (DFA), to the problem of finding the whole relevant part of the probability distribution function of a simple-word motif in a homogeneous (biological) sequence. Out of that, the z-value can always be computed, while the P-value can be obtained either when it is not too extreme with respect to the number of floating-point digits available in the implementation, or when the number of pattern occurrences is moderately low. In particular, the time complexity of the algorithms for Markov models of moderate order (0 manage to obtain an algorithm which is both easily interpretable and efficient. This approach can be used for exact statistical studies of very long genomes and protein sequences, as we illustrate with some examples on the scale of the human genome.
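
    To make the flavour of the DFA-based computation concrete, the sketch below builds a KMP-style automaton for a single word and propagates exact (state, count) probabilities under an order-0 (i.i.d.) background model; it is an illustration added here and is far simpler and slower than the algorithms in the paper, which handle higher-order Markov models:

```python
import numpy as np

def kmp_automaton(pattern, alphabet):
    """Automaton whose state is the length of the longest prefix of `pattern`
    matching a suffix of the text read so far (overlapping occurrences allowed)."""
    m = len(pattern)
    fail = [0] * (m + 1)
    k = 0
    for i in range(1, m):
        while k and pattern[i] != pattern[k]:
            k = fail[k]
        if pattern[i] == pattern[k]:
            k += 1
        fail[i + 1] = k
    delta = {}
    for q in range(m + 1):
        for c in alphabet:
            k = q if q < m else fail[m]
            while k and pattern[k] != c:
                k = fail[k]
            delta[(q, c)] = k + 1 if pattern[k] == c else 0
    return delta, m

def occurrence_distribution(pattern, letter_probs, n):
    """Exact P(number of occurrences = k) in an i.i.d. sequence of length n."""
    delta, m = kmp_automaton(pattern, list(letter_probs))
    dp = np.zeros((m + 1, n + 1))      # dp[state, occurrences so far]
    dp[0, 0] = 1.0
    for _ in range(n):
        new = np.zeros_like(dp)
        for q in range(m + 1):
            for c, p in letter_probs.items():
                q2 = delta[(q, c)]
                if q2 == m:            # the pattern is completed on this step
                    new[q2, 1:] += p * dp[q, :-1]
                else:
                    new[q2, :] += p * dp[q, :]
        dp = new
    return dp.sum(axis=0)

dist = occurrence_distribution("ATA", {"A": 0.3, "C": 0.2, "G": 0.2, "T": 0.3}, n=50)
print("P(no occurrence):", round(dist[0], 4), " P(at least 3):", round(dist[3:].sum(), 4))
```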

  9. Effect of velocity variation on secondary-ion-emission probability: Quantum stationary approach

    International Nuclear Information System (INIS)

    Goldberg, E.C.; Ferron, J.; Passeggi, M.C.G.

    1989-01-01

    The ion-velocity dependence of the ionization probability for an atom ejected from a surface is examined by using a quantum approach in which the coupled motion between electrons and the outgoing nucleus is followed along the whole trajectory by solving the stationary Schroedinger equation. We choose a very-small-cluster-model system in which the motion of the atom is restricted to one dimension, and with energy potential curves corresponding to the involved channels varying appreciably with the atom position. We found an exponential dependence on the inverse of the asymptotic ion velocity for high emission energies, and a smoother behavior with slight oscillations at low energies. These results are compared with those obtained within a dynamical-trajectory approximation using either a constant velocity equal to the asymptotic ionic value, or expressions for the velocity derived from the eikonal approximation and from the classical limit of the current vector. Both approaches give similar results provided the velocity is allowed to adjust self-consistently to potential energies and transition-amplitude variations. Strong oscillations are observed in the low-emission-energy range either if the transitions are neglected, or a constant velocity along the whole path is assumed for the ejected particle

  10. Dangerous "spin": the probability myth of evidence-based prescribing - a Merleau-Pontyian approach.

    Science.gov (United States)

    Morstyn, Ron

    2011-08-01

    The aim of this study was to examine logical positivist statistical probability statements used to support and justify "evidence-based" prescribing rules in psychiatry when viewed from the major philosophical theories of probability, and to propose "phenomenological probability" based on Maurice Merleau-Ponty's philosophy of "phenomenological positivism" as a better clinical and ethical basis for psychiatric prescribing. The logical positivist statistical probability statements which are currently used to support "evidence-based" prescribing rules in psychiatry have little clinical or ethical justification when subjected to critical analysis from any of the major theories of probability and represent dangerous "spin" because they necessarily exclude the individual, intersubjective and ambiguous meaning of mental illness. A concept of "phenomenological probability" founded on Merleau-Ponty's philosophy of "phenomenological positivism" overcomes the clinically destructive "objectivist" and "subjectivist" consequences of logical positivist statistical probability and allows psychopharmacological treatments to be appropriately integrated into psychiatric treatment.

  11. Modern Approaches to the Computation of the Probability of Target Detection in Cluttered Environments

    Science.gov (United States)

    Meitzler, Thomas J.

    The field of computer vision interacts with fields such as psychology, vision research, machine vision, psychophysics, mathematics, physics, and computer science. The focus of this thesis is new algorithms and methods for the computation of the probability of detection (Pd) of a target in a cluttered scene. The scene can be either a natural visual scene such as one sees with the naked eye (visual), or a scene displayed on a monitor with the help of infrared sensors. The relative clutter and the temperature difference between the target and background (DeltaT) are defined and then used to calculate a relative signal-to-clutter ratio (SCR) from which the Pd is calculated for a target in a cluttered scene. It is shown how this definition can include many previous definitions of clutter and (DeltaT). Next, fuzzy and neural-fuzzy techniques are used to calculate the Pd and it is shown how these methods can give results that have a good correlation with experiment. The experimental design for actually measuring the Pd of a target by observers is described. Finally, wavelets are applied to the calculation of clutter and it is shown how this new definition of clutter based on wavelets can be used to compute the Pd of a target.

  12. A PROBABILITY BASED APPROACH FOR THE ALLOCATION OF PLAYER DRAFT SELECTIONS IN AUSTRALIAN RULES FOOTBALL

    Directory of Open Access Journals (Sweden)

    Anthony Bedford

    2006-12-01

    Full Text Available Australian Rules Football, governed by the Australian Football League (AFL), is the most popular winter sport played in Australia. Like North American team-based leagues such as the NFL, NBA and NHL, the AFL uses a draft system for rookie players to join a team's list. The existing method of allocating draft selections in the AFL is simply based on the reverse order of each team's finishing position for that season, with teams winning less than or equal to 5 regular season matches obtaining an additional early round priority draft pick. Much criticism has been levelled at the existing system since it rewards losing teams and does not encourage poorly performing teams to win matches once their season is effectively over. We propose a probability-based system that allocates a score based on teams that win 'unimportant' matches (akin to Carl Morris' definition of importance). We base the calculation of 'unimportance' on the likelihood of a team making the final eight following each round of the season. We then investigate a variety of approaches based on the 'unimportance' measure to derive a score for 'unimportant' and unlikely wins. We explore derivatives of this system, compare past draft picks with those obtained under our system, and discuss the attractiveness of teams knowing the draft reward for winning each match in a season.

  13. The hybrid thermography approach applied to architectural structures

    Science.gov (United States)

    Sfarra, S.; Ambrosini, D.; Paoletti, D.; Nardi, I.; Pasqualoni, G.

    2017-07-01

    This work contains an overview of the infrared thermography (IRT) method and its applications to the investigation of architectural structures. In this method, the passive approach is usually used in civil engineering, since it provides a panoramic view of the thermal anomalies to be interpreted, also thanks to the use of photographs focused on the region of interest (ROI). The active approach is more suitable for laboratory or indoor inspections, as well as for objects of small size. The external stress to be applied is thermal, coming from non-natural apparatus such as lamps or hot/cold air jets. In addition, the latter permits obtaining quantitative information related to defects not detectable to the naked eye. Very recently, the hybrid thermography (HIRT) approach has been brought to the attention of the scientific community. It can be applied when the radiation coming from the sun arrives directly (i.e., possibly without the shadow-cast effect) on a surface exposed to the air. A large number of thermograms must be collected and a post-processing analysis is subsequently applied via advanced algorithms. An appraisal of the defect depth can then be obtained through the calculation of the combined thermal diffusivity of the materials above the defect. The approach is validated herein by working, in a first stage, on a mosaic sample having known defects and, in a second stage, on a church built in L'Aquila (Italy) and covered with a particular masonry structure called apparecchio aquilano. The results obtained appear promising.

  14. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Lévy processes, Gerber–Shiu functions and dependence.
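    As a companion to the topics listed above, the following is a minimal Monte Carlo sketch of a finite-horizon ruin probability in the classical compound Poisson (Cramér-Lundberg) model; the parameter values and exponential claim sizes are illustrative assumptions, not taken from the book.

        import random

        def ruin_probability(u=10.0, c=1.2, lam=1.0, claim_mean=1.0, horizon=100.0, n_sim=20000):
            """Estimate P(reserve drops below 0 before `horizon`) for initial capital u,
            premium rate c, Poisson(lam) claim arrivals and exponential claim sizes."""
            ruins = 0
            for _ in range(n_sim):
                t, claims = 0.0, 0.0
                while True:
                    t += random.expovariate(lam)                    # time of the next claim
                    if t > horizon:
                        break                                       # survived the horizon
                    claims += random.expovariate(1.0 / claim_mean)  # exponential claim size
                    # between claims the reserve only grows, so it suffices to check ruin
                    # at claim instants
                    if u + c * t - claims < 0:
                        ruins += 1
                        break
            return ruins / n_sim

        print(ruin_probability())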

  15. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  16. Applying a Problem Based Learning Approach to Land Management Education

    DEFF Research Database (Denmark)

    Enemark, Stig

    Land management covers a wide range of activities associated with the management of land and natural resources that are required to fulfil political objectives and achieve sustainable development. This paper presents an overall understanding of the land management paradigm and the benefits of good land governance to society. A land administration system provides a country with the infrastructure to implement land-related policies and land management strategies. By applying this land management profile to surveying education, this paper suggests that there is a need to move away from an exclusive engineering focus toward adopting an interdisciplinary and problem-based approach to ensure that academic programmes can cope with the wide range of land administration functions and challenges. An interdisciplinary approach to surveying education calls for the need to address issues and problems in a real...

  17. Setting research priorities by applying the combined approach matrix.

    Science.gov (United States)

    Ghaffar, Abdul

    2009-04-01

    Priority setting in health research is a dynamic process. Different organizations and institutes have been working in the field of research priority setting for many years. In 1999 the Global Forum for Health Research presented a research priority setting tool called the Combined Approach Matrix or CAM. Since its development, the CAM has been successfully applied to set research priorities for diseases, conditions and programmes at global, regional and national levels. This paper briefly explains the CAM methodology and how it could be applied in different settings, giving examples and describing challenges encountered in the process of setting research priorities and providing recommendations for further work in this field. The construct and design of the CAM is explained along with the different steps needed, including planning and organization of a priority-setting exercise and how it could be applied in different settings. The application of the CAM is described using three examples. The first concerns setting research priorities for a global programme, the second describes application at the country level and the third concerns setting research priorities for diseases. Effective application of the CAM in different and diverse environments proves its utility as a tool for setting research priorities. Potential challenges encountered in the process of research priority setting are discussed and some recommendations for further work in this field are provided.

  18. Applying a Modified Triad Approach to Investigate Wastewater lines

    International Nuclear Information System (INIS)

    Pawlowicz, R.; Urizar, L.; Blanchard, S.; Jacobsen, K.; Scholfield, J.

    2006-01-01

    Approximately 20 miles of wastewater lines are below grade at an active military Base. This piping network feeds or fed domestic or industrial wastewater treatment plants on the Base. Past wastewater line investigations indicated potential contaminant releases to soil and groundwater. Further environmental assessment was recommended to characterize the lines because of possible releases. A Remedial Investigation (RI) using random sampling or use of sampling points spaced at predetermined distances along the entire length of the wastewater lines, however, would be inefficient and cost prohibitive. To accomplish RI goals efficiently and within budget, a modified Triad approach was used to design a defensible sampling and analysis plan and perform the investigation. The RI task was successfully executed and resulted in a reduced fieldwork schedule, and sampling and analytical costs. Results indicated that no major releases occurred at the biased sampling points. It was reasonably extrapolated that since releases did not occur at the most likely locations, then the entire length of a particular wastewater line segment was unlikely to have contaminated soil or groundwater and was recommended for no further action. A determination of no further action was recommended for the majority of the waste lines after completing the investigation. The modified Triad approach was successful and a similar approach could be applied to investigate wastewater lines on other United States Department of Defense or Department of Energy facilities. (authors)

  19. Probabilistic Design Analysis (PDA) Approach to Determine the Probability of Cross-System Failures for a Space Launch Vehicle

    Science.gov (United States)

    Shih, Ann T.; Lo, Yunnhon; Ward, Natalie C.

    2010-01-01

    Quantifying the probability of significant launch vehicle failure scenarios for a given design, while still in the design process, is critical to mission success and to the safety of the astronauts. Probabilistic risk assessment (PRA) is chosen from many system safety and reliability tools to verify the loss of mission (LOM) and loss of crew (LOC) requirements set by the NASA Program Office. To support the integrated vehicle PRA, probabilistic design analysis (PDA) models are developed by using vehicle design and operation data to better quantify failure probabilities and to better understand the characteristics of a failure and its outcome. This PDA approach uses a physics-based model to describe the system behavior and response for a given failure scenario. Each driving parameter in the model is treated as a random variable with a distribution function. Monte Carlo simulation is used to perform probabilistic calculations to statistically obtain the failure probability. Sensitivity analyses are performed to show how input parameters affect the predicted failure probability, providing insight for potential design improvements to mitigate the risk. The paper discusses the application of the PDA approach in determining the probability of failure for two scenarios from the NASA Ares I project.
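    The Monte Carlo step described above can be illustrated with a toy limit-state example; the load/capacity model, the normal distributions, and the numbers below are stand-ins for the physics-based vehicle models and are not taken from the paper.

        import numpy as np

        rng = np.random.default_rng(0)

        def limit_state(load, capacity):
            # failure occurs when demand exceeds capacity (limit state < 0)
            return capacity - load

        def failure_probability(n=200_000, load_mean=100.0, load_cov=0.15,
                                cap_mean=150.0, cap_cov=0.10):
            load = rng.normal(load_mean, load_cov * load_mean, n)      # random demand
            capacity = rng.normal(cap_mean, cap_cov * cap_mean, n)     # random capacity
            return float(np.mean(limit_state(load, capacity) < 0.0))

        base = failure_probability()
        # crude sensitivity check: widen the load scatter by 10% and re-estimate
        perturbed = failure_probability(load_cov=0.15 * 1.1)
        print(base, perturbed)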

  20. Applying Probability Theory for the Quality Assessment of a Wildfire Spread Prediction Framework Based on Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    Andrés Cencerrado

    2013-01-01

    This work presents a framework for assessing how the constraints existing at the time of attending an ongoing forest fire affect simulation results, both in terms of the quality (accuracy) obtained and the time needed to make a decision. In the wildfire spread simulation and prediction area, it is essential to properly exploit the computational power offered by new computing advances. For this purpose, we rely on a two-stage prediction process to enhance the quality of traditional predictions, taking advantage of parallel computing. This strategy is based on an adjustment stage which is carried out by a well-known evolutionary technique: Genetic Algorithms. The core of this framework is evaluated according to probability theory principles. Thus, a thorough statistical study is presented and oriented towards the characterization of such an adjustment technique in order to help the operation managers deal with the two aspects previously mentioned: time and quality. The experimental work in this paper is based on a region in Spain which is one of the most prone to forest fires: El Cap de Creus.

  1. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory and Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to consider that, in general, there is no analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators.

  2. Integration of geophysical datasets by a conjoint probability tomography approach: application to Italian active volcanic areas

    Directory of Open Access Journals (Sweden)

    D. Patella

    2008-06-01

    We expand the theory of probability tomography to the integration of different geophysical datasets. The aim of the new method is to improve the information quality using a conjoint occurrence probability function addressed to highlight the existence of common sources of anomalies. The new method is tested on gravity, magnetic and self-potential datasets collected in the volcanic area of Mt. Vesuvius (Naples), and on gravity and dipole geoelectrical datasets collected in the volcanic area of Mt. Etna (Sicily). The application demonstrates that, from a probabilistic point of view, the integrated analysis can delineate the signature of some important volcanic targets better than the analysis of the tomographic image of each dataset considered separately.

  3. Measuring survival time: a probability-based approach useful in healthcare decision-making.

    Science.gov (United States)

    2011-01-01

    In some clinical situations, the choice between treatment options takes into account their impact on patient survival time. Due to practical constraints (such as loss to follow-up), survival time is usually estimated using a probability calculation based on data obtained in clinical studies or trials. The two techniques most commonly used to estimate survival times are the Kaplan-Meier method and the actuarial method. Despite their limitations, they provide useful information when choosing between treatment options.
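    For reference, a compact implementation of the Kaplan-Meier method, the first of the two techniques mentioned above, is sketched below with made-up follow-up data (each subject contributes a time and an indicator of whether the event was observed or the subject was censored).

        def kaplan_meier(times, events):
            """times: follow-up times; events: 1 if the event occurred, 0 if censored.
            Returns a list of (time, estimated survival probability)."""
            data = sorted(zip(times, events))
            at_risk = len(data)
            survival, curve = 1.0, []
            i = 0
            while i < len(data):
                t = data[i][0]
                deaths = sum(1 for tt, e in data if tt == t and e == 1)
                removed = sum(1 for tt, e in data if tt == t)   # events + censorings at time t
                if deaths:
                    survival *= (1.0 - deaths / at_risk)
                    curve.append((t, survival))
                at_risk -= removed
                i += removed
            return curve

        # illustrative data: times in months, 1 = event observed, 0 = censored
        print(kaplan_meier([2, 3, 3, 5, 8, 8, 12], [1, 1, 0, 1, 0, 1, 0]))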

  4. Estimating the Probabilities of Default for Callable Bonds: A Duffie-Singleton Approach

    OpenAIRE

    David Wang

    2005-01-01

    This paper presents a model for estimating the default risks implicit in the prices of callable corporate bonds. The model considers three essential ingredients in the pricing of callable corporate bonds: stochastic interest rate, default risk, and call provision. The stochastic interest rate is modeled as a square-root diffusion process. The default risk is modeled as a constant spread, with the magnitude of this spread impacting the probability of a Poisson process governing the arrival of ...

  5. A least squares approach to estimating the probability distribution of unobserved data in multiphoton microscopy

    Science.gov (United States)

    Salama, Paul

    2008-02-01

    Multi-photon microscopy has provided biologists with unprecedented opportunities for high resolution imaging deep into tissues. Unfortunately deep tissue multi-photon microscopy images are in general noisy since they are acquired at low photon counts. To aid in the analysis and segmentation of such images it is sometimes necessary to initially enhance the acquired images. One way to enhance an image is to find the maximum a posteriori (MAP) estimate of each pixel comprising an image, which is achieved by finding a constrained least squares estimate of the unknown distribution. In arriving at the distribution it is assumed that the noise is Poisson distributed, the true but unknown pixel values assume a probability mass function over a finite set of non-negative values, and since the observed data also assumes finite values because of low photon counts, the sum of the probabilities of the observed pixel values (obtained from the histogram of the acquired pixel values) is less than one. Experimental results demonstrate that it is possible to closely estimate the unknown probability mass function with these assumptions.
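    A rough numerical sketch of the constrained least-squares idea is given below: a Poisson observation model links a candidate set of true intensities to the observed counts, and the unknown probability mass function is recovered subject to non-negativity and a sum-at-most-one constraint. The tiny problem size, the synthetic histogram, and the SLSQP solver are assumptions made for illustration, not the paper's setup.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import poisson

        true_values = np.arange(0, 11)           # candidate true pixel intensities
        observed = np.arange(0, 16)              # observed (low) photon counts
        A = poisson.pmf(observed[:, None], true_values[None, :])   # A[i, j] = P(obs i | true j)

        # synthetic observed histogram (in practice this comes from the acquired image)
        p_true = np.zeros(len(true_values)); p_true[[2, 7]] = [0.6, 0.35]
        h = A @ p_true

        res = minimize(
            lambda p: np.sum((A @ p - h) ** 2),                     # least-squares misfit
            x0=np.full(len(true_values), 0.05),
            bounds=[(0.0, 1.0)] * len(true_values),                 # non-negativity
            constraints=[{"type": "ineq", "fun": lambda p: 1.0 - p.sum()}],  # sum(p) <= 1
            method="SLSQP",
        )
        print(np.round(res.x, 3))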

  6. Views on Montessori Approach by Teachers Serving at Schools Applying the Montessori Approach

    Science.gov (United States)

    Atli, Sibel; Korkmaz, A. Merve; Tastepe, Taskin; Koksal Akyol, Aysel

    2016-01-01

    Problem Statement: Further studies on Montessori teachers are required on the grounds that the Montessori approach, which has been applied throughout the world, holds an important place in the field of alternative education. Yet it is novel for Turkey, and there are only a limited number of studies on Montessori teachers in Turkey. Purpose of…

  7. Proximity approach to study fusion probabilities in heavy-ion collisions

    International Nuclear Information System (INIS)

    Raj Kumari

    2013-01-01

    The fusion cross-sections at sub-barrier energies are found to be enhanced compared to the predictions of the barrier penetration model. The aim is to test Bass 80, Aage Winther (AW) 95, Denisov DP, Proximity 2010 and the Skyrme Energy Density Formalism (SEDF) at energies above as well as below the barrier height. For the present systematic study, the fusion probabilities for the reactions 28Si + 24,26Mg, 30Si + 24Mg and 28,30Si + 58,62Ni have been calculated.

  8. Impulsive synchronization of Markovian jumping randomly coupled neural networks with partly unknown transition probabilities via multiple integral approach.

    Science.gov (United States)

    Chandrasekar, A; Rakkiyappan, R; Cao, Jinde

    2015-10-01

    This paper studies the impulsive synchronization of Markovian jumping randomly coupled neural networks with partly unknown transition probabilities via multiple integral approach. The array of neural networks are coupled in a random fashion which is governed by Bernoulli random variable. The aim of this paper is to obtain the synchronization criteria, which is suitable for both exactly known and partly unknown transition probabilities such that the coupled neural network is synchronized with mixed time-delay. The considered impulsive effects can be synchronized at partly unknown transition probabilities. Besides, a multiple integral approach is also proposed to strengthen the Markovian jumping randomly coupled neural networks with partly unknown transition probabilities. By making use of Kronecker product and some useful integral inequalities, a novel Lyapunov-Krasovskii functional was designed for handling the coupled neural network with mixed delay and then impulsive synchronization criteria are solvable in a set of linear matrix inequalities. Finally, numerical examples are presented to illustrate the effectiveness and advantages of the theoretical results. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. Constructing probable explanations of nonconformity : a data-aware and history-based approach

    NARCIS (Netherlands)

    Alizadeh, M.; de Leoni, M.; Zannone, N.

    2016-01-01

    Auditing the execution of business processes is becoming a critical issue for organizations. Conformance checking has been proposed as a viable approach to analyze process executions with respect to a process model. In particular, alignments provide a robust approach to conformance checking in that

  10. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  11. Probability theory

    CERN Document Server

    Dorogovtsev, A Ya; Skorokhod, A V; Silvestrov, D S; Skorokhod, A V

    1997-01-01

    This book of problems is intended for students in pure and applied mathematics. There are problems in traditional areas of probability theory and problems in the theory of stochastic processes, which has wide applications in the theory of automatic control, queuing and reliability theories, and in many other modern science and engineering fields. Answers to most of the problems are given, and the book provides hints and solutions for more complicated problems.

  12. Classical analogues of a quantum system in spatial and temporal domains: A probability amplitude approach

    Directory of Open Access Journals (Sweden)

    Pradipta Panchadhyayee

    2016-12-01

    We have simulated similar features of well-known classical phenomena in the quantum domain under the formalism of the probability amplitude method. The identical pattern of interference fringes of a Fabry–Perot interferometer (especially in reflection mode) is obtained through the power-broadened spectral line shape of the population distribution in the excited state, with careful delineation of a coherently driven two-level atomic model. In a unit wavelength domain, such a pattern can be substantially modified by controlling the typical spatial field arrangement in one and two dimensions, which is found to be complementary to the findings of recent research on atom localization in the sub-wavelength domain. The spatial dependence of the temporal dynamics has also been studied at a particular condition, which is equivalent to what could be obtained under Raman–Nath diffraction controlled by a spatial phase.

  13. Human errors evaluation for muster in emergency situations applying human error probability index (HEPI, in the oil company warehouse in Hamadan City

    Directory of Open Access Journals (Sweden)

    2012-12-01

    Introduction: An emergency situation is one of the factors influencing human error. The aim of this research was to evaluate human error in an emergency situation of fire and explosion at the oil company warehouse in Hamadan city by applying the human error probability index (HEPI). Material and Method: First, the scenario of the fire and explosion emergency at the oil company warehouse was designed, and then a maneuver against it was performed. The scaled muster questionnaire for the maneuver was completed in the next stage. Collected data were analyzed to calculate the probability of success for the 18 actions required in an emergency situation, from the starting point of the muster until the final action of reaching a temporary safe shelter. Result: The results showed that the highest probability of error occurrence was related to making the workplace safe (evaluation phase), with 32.4%, and the lowest probability of error occurrence was in detecting the alarm (awareness phase), with 1.8%. The highest severity of error was in the evaluation phase and the lowest severity of error was in the awareness and recovery phases. The maximum risk level was related to evaluating exit routes, selecting one route and choosing an alternative exit route, and the minimum risk level was related to the four evaluation phases. Conclusion: To reduce the risk in the exit phases of an emergency situation, the following actions are recommended based on the findings of this study: periodic evaluation of the exit phases and modification of them if necessary, conducting more maneuvers, and analyzing the results along with providing sufficient feedback to the employees.

  14. Calculating the Prior Probability Distribution for a Causal Network Using Maximum Entropy: Alternative Approaches

    Directory of Open Access Journals (Sweden)

    Michael J. Markham

    2011-07-01

    Some problems occurring in Expert Systems can be resolved by employing a causal (Bayesian) network, and methodologies exist for this purpose. These require data in a specific form and make assumptions about the independence relationships involved. Methodologies using Maximum Entropy (ME) are free from these conditions and have the potential to be used in a wider context, including systems consisting of given sets of linear and independence constraints, subject to consistency and convergence. ME can also be used to validate results from the causal network methodologies. Three ME methods for determining the prior probability distribution of causal network systems are considered. The first method is Sequential Maximum Entropy, in which the computation of a progression of local distributions leads to the overall distribution. This is followed by a development of the Method of Tribus. The development takes the form of an algorithm that includes the handling of explicit independence constraints. These fall into two groups: those relating to parents of vertices, and those deduced from triangulation of the remaining graph. The third method involves a variation in the part of that algorithm which handles independence constraints. Evidence is presented that this adaptation only requires the linear constraints and the parental independence constraints to emulate the second method in a substantial class of examples.
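    The flavour of a maximum-entropy computation under linear constraints can be conveyed with a small numerical sketch: the joint distribution over three binary variables with the largest entropy is found subject to a few marginal constraints. The constraint values and the use of a generic SLSQP solver are illustrative choices, not the sequential or Tribus-style algorithms developed in the paper.

        import numpy as np
        from itertools import product
        from scipy.optimize import minimize

        states = np.array(list(product([0, 1], repeat=3)))   # joint states of (A, B, C)
        n = len(states)

        def neg_entropy(p):
            p = np.clip(p, 1e-12, 1.0)
            return np.sum(p * np.log(p))      # minimize negative entropy = maximize entropy

        constraints = [
            {"type": "eq", "fun": lambda p: p.sum() - 1.0},                            # normalization
            {"type": "eq", "fun": lambda p: p @ states[:, 0] - 0.3},                   # P(A=1) = 0.3
            {"type": "eq", "fun": lambda p: p @ states[:, 1] - 0.6},                   # P(B=1) = 0.6
            {"type": "eq", "fun": lambda p: p @ (states[:, 0] * states[:, 2]) - 0.1},  # P(A=1, C=1) = 0.1
        ]

        res = minimize(neg_entropy, x0=np.full(n, 1.0 / n),
                       bounds=[(0.0, 1.0)] * n, constraints=constraints, method="SLSQP")
        print(np.round(res.x, 4))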

  15. Towards a capability approach to careers: Applying Amartya Sen's thinking

    OpenAIRE

    Robertson, Peter.

    2015-01-01

    Amartya Sen’s capability approach characterizes an individual’s well-being in terms of what they are able to be, and what they are able to do. This framework for thinking has many commonalities with the core ideas in career guidance. Sen’s approach is abstract and not in itself a complete or explanatory theory, but a case can be made that the capability approach has something to offer career theory when combined with a life-career developmental approach. It may also suggest ways of working th...

  16. An approach to controlling radiation exposures of probabilities less than one

    International Nuclear Information System (INIS)

    Ahmed, J.U.; Gonzalez, A.J.

    1988-01-01

    IAEA efforts to develop guidelines for a unified approach for the application of radiation protection principles to radiation exposures assumed to occur with certainty and exposures which are not certain to occur are discussed. A useful criterion is that of a limit on individual risk. A simple approach would be to define separate limits for normal and accident situations. For waste disposal, ICRP has suggested a risk limit for accident situations to be of the same magnitude as that for normal operation. The IAEA is considering a risk limit of 10⁻⁵ in a year for consistency with general safety standards of dose limitation. A source-related upper bound is needed which has to be apportioned from the risk limit in order to take into account the presence of other sources.

  17. On Generating Optimal Signal Probabilities for Random Tests: A Genetic Approach

    Directory of Open Access Journals (Sweden)

    M. Srinivas

    1996-01-01

    Genetic Algorithms are robust search and optimization techniques. A Genetic Algorithm based approach for determining the optimal input distributions for generating random test vectors is proposed in the paper. A cost function based on the COP testability measure for determining the efficacy of the input distributions is discussed. A brief overview of Genetic Algorithms (GAs) and the specific details of our implementation are described. Experimental results based on ISCAS-85 benchmark circuits are presented. The performance of our GA-based approach is compared with previous results. While the GA generates more efficient input distributions than the previous methods, which are based on gradient descent search, the overheads of the GA in computing the input distributions are larger.
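    A bare-bones genetic algorithm over input signal probabilities is sketched below to illustrate the kind of search described; the fitness function is a stand-in that rewards exercising a toy fault on a 3-input AND gate, not the COP-based cost function used in the paper.

        import random

        def fitness(probs):
            # probability that a random vector drives all three AND inputs to 1
            p = 1.0
            for q in probs:
                p *= q
            return p

        def mutate(probs, rate=0.2):
            return [min(1.0, max(0.0, q + random.gauss(0, 0.1))) if random.random() < rate else q
                    for q in probs]

        def genetic_search(n_inputs=3, pop_size=20, generations=50):
            pop = [[random.random() for _ in range(n_inputs)] for _ in range(pop_size)]
            for _ in range(generations):
                pop.sort(key=fitness, reverse=True)
                parents = pop[: pop_size // 2]               # keep the fitter half
                children = []
                for _ in range(pop_size - len(parents)):
                    a, b = random.sample(parents, 2)
                    cut = random.randrange(1, n_inputs)
                    children.append(mutate(a[:cut] + b[cut:]))   # one-point crossover + mutation
                pop = parents + children
            return max(pop, key=fitness)

        print(genetic_search())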

  18. A novel multi-model probability battery state of charge estimation approach for electric vehicles using H-infinity algorithm

    International Nuclear Information System (INIS)

    Lin, Cheng; Mu, Hao; Xiong, Rui; Shen, Weixiang

    2016-01-01

    Highlights:
    • A novel multi-model probability battery SOC fusion estimation approach was proposed.
    • The linear matrix inequality-based H∞ technique is employed to estimate the SOC.
    • The Bayes theorem has been employed to realize the optimal weight for the fusion.
    • The robustness of the proposed approach is verified by different batteries.
    • The results show that the proposed method can promote global estimation accuracy.
    Abstract: Due to the strong nonlinearity and complex time-variant property of batteries, the existing state of charge (SOC) estimation approaches based on a single equivalent circuit model (ECM) cannot provide an accurate SOC for the entire discharging period. This paper aims to present a novel SOC estimation approach based on a multiple-ECM fusion method for improving the practical application performance. In the proposed approach, three battery ECMs, namely the Thevenin model, the double polarization model and the 3rd-order RC model, are selected to describe the dynamic voltage of lithium-ion batteries, and the genetic algorithm is then used to determine the model parameters. The linear matrix inequality-based H-infinity technique is employed to estimate the SOC from the three models, and the Bayes theorem-based probability method is employed to determine the optimal weights for synthesizing the SOCs estimated from the three models. Two types of lithium-ion batteries are used to verify the feasibility and robustness of the proposed approach. The results indicate that the proposed approach can improve the accuracy and reliability of the SOC estimation against uncertain battery materials and inaccurate initial states.
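    The Bayes-theorem weighting step can be illustrated with a small sketch: each model supplies an SOC estimate and a predicted terminal voltage, and a model's weight grows when its voltage prediction matches the measurement. The numbers, the Gaussian residual likelihood, and the noise level below are assumptions for demonstration, not the paper's tuned values.

        import numpy as np

        def bayes_fusion(soc_estimates, v_predicted, v_measured, prior, sigma=0.01):
            # Gaussian likelihood of the terminal-voltage residual for each candidate model
            likelihood = np.exp(-0.5 * ((v_measured - np.asarray(v_predicted)) / sigma) ** 2)
            posterior = prior * likelihood
            posterior /= posterior.sum()                  # Bayes theorem (normalization)
            fused_soc = float(np.dot(posterior, soc_estimates))
            return fused_soc, posterior                   # posterior becomes the next step's prior

        # Thevenin, double-polarization and 3rd-order RC model outputs at one time step (assumed)
        soc, weights = bayes_fusion(
            soc_estimates=[0.62, 0.60, 0.63],
            v_predicted=[3.71, 3.74, 3.72],
            v_measured=3.72,
            prior=np.array([1 / 3, 1 / 3, 1 / 3]),
        )
        print(soc, weights)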

  19. Modular Modelling and Simulation Approach - Applied to Refrigeration Systems

    DEFF Research Database (Denmark)

    Sørensen, Kresten Kjær; Stoustrup, Jakob

    2008-01-01

    This paper presents an approach to modelling and simulation of the thermal dynamics of a refrigeration system, specifically a reefer container. A modular approach is used and the objective is to increase the speed and flexibility of the developed simulation environment. The refrigeration system...

  20. Probability tales

    CERN Document Server

    Grinstead, Charles M; Snell, J Laurie

    2011-01-01

    This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.

  1. Applying Digital Sensor Technology: A Problem-Solving Approach

    Science.gov (United States)

    Seedhouse, Paul; Knight, Dawn

    2016-01-01

    There is currently an explosion in the number and range of new devices coming onto the technology market that use digital sensor technology to track aspects of human behaviour. In this article, we present and exemplify a three-stage model for the application of digital sensor technology in applied linguistics that we have developed, namely,…

  2. An Applied Project-Driven Approach to Undergraduate Research Experiences

    Science.gov (United States)

    Karls, Michael A.

    2017-01-01

    In this paper I will outline the process I have developed for conducting applied mathematics research with undergraduates and give some examples of the projects we have worked on. Several of these projects have led to refereed publications that could be used to illustrate topics taught in the undergraduate curriculum.

  3. Risk factors of delay proportional probability in diphtheria-tetanus-pertussis vaccination of Iranian children; Life table approach analysis

    Directory of Open Access Journals (Sweden)

    Mohsen Mokhtari

    2015-01-01

    Despite the success of the expanded program on immunization in increasing vaccination coverage among the children of the world, the timeliness and schedule of vaccination remain among the challenges in public health. This study aimed to identify the factors related to delayed diphtheria-tetanus-pertussis (DTP) vaccination using a life table approach. A historical cohort study was conducted in the poor areas of five large Iranian cities. In total, 3610 children aged 24-47 months who had a documented vaccination card were enrolled. The time of vaccination for the third dose of the DTP vaccine was calculated. Life table survival was used to calculate the proportional probability of vaccination at each time. The Wilcoxon test was used to compare the proportional probability of delayed vaccination based on the studied factors. The overall median delay time for DTP3 was 38.52 days. The Wilcoxon test showed that city, nationality, education level of parents, birth order and living in rural areas are related to a high probability of delayed DTP3 vaccination (P < 0.05). Being far from the capital and a high concentration of immigrants on the city borders with a low socioeconomic class lead to a prolonged delay in DTP vaccination time. Special attention to these areas is needed to increase the level of parental knowledge and to facilitate access to health care services.

  4. Overview of the structured assessment approach and documentation of algorithms to compute the probability of adversary detection

    International Nuclear Information System (INIS)

    Rice, T.R.; Derby, S.L.

    1978-01-01

    The Structured Assessment Approach was applied to material control and accounting systems at facilities that process Special Nuclear Material. Four groups of analytical techniques were developed for four general adversary types. Probabilistic algorithms were developed and compared with existing algorithms. 20 figures.

  5. A Multiobjective Approach Applied to the Protein Structure Prediction Problem

    Science.gov (United States)

    2002-03-07

  6. Applied approach slab settlement research, design/construction : final report.

    Science.gov (United States)

    2013-08-01

    Approach embankment settlement is a pervasive problem in Oklahoma and many other states. The bump and/or abrupt slope change poses a danger to traffic and can cause increased dynamic loads on the bridge. Frequent and costly maintenance may be needed ...

  7. Tennis: Applied Examples of a Game-Based Teaching Approach

    Science.gov (United States)

    Crespo, Miguel; Reid, Machar M.; Miley, Dave

    2004-01-01

    In this article, the authors reveal that tennis has been increasingly taught with a tactical model or game-based approach, which emphasizes learning through practice in match-like drills and actual play, rather than in practicing strokes for exact technical execution. Its goal is to facilitate the player's understanding of the tactical, physical…

  8. Adapting machine learning techniques to censored time-to-event health record data: A general-purpose approach using inverse probability of censoring weighting.

    Science.gov (United States)

    Vock, David M; Wolfson, Julian; Bandyopadhyay, Sunayan; Adomavicius, Gediminas; Johnson, Paul E; Vazquez-Benitez, Gabriela; O'Connor, Patrick J

    2016-06-01

    Models for predicting the probability of experiencing various health outcomes or adverse events over a certain time frame (e.g., having a heart attack in the next 5 years) based on individual patient characteristics are important tools for managing patient care. Electronic health data (EHD) are appealing sources of training data because they provide access to large amounts of rich individual-level data from present-day patient populations. However, because EHD are derived by extracting information from administrative and clinical databases, some fraction of subjects will not be under observation for the entire time frame over which one wants to make predictions; this loss to follow-up is often due to disenrollment from the health system. For subjects without complete follow-up, whether or not they experienced the adverse event is unknown, and in statistical terms the event time is said to be right-censored. Most machine learning approaches to the problem have been relatively ad hoc; for example, common approaches for handling observations in which the event status is unknown include (1) discarding those observations, (2) treating them as non-events, (3) splitting those observations into two observations: one where the event occurs and one where the event does not. In this paper, we present a general-purpose approach to account for right-censored outcomes using inverse probability of censoring weighting (IPCW). We illustrate how IPCW can easily be incorporated into a number of existing machine learning algorithms used to mine big health care data including Bayesian networks, k-nearest neighbors, decision trees, and generalized additive models. We then show that our approach leads to better calibrated predictions than the three ad hoc approaches when applied to predicting the 5-year risk of experiencing a cardiovascular adverse event, using EHD from a large U.S. Midwestern healthcare system. Copyright © 2016 Elsevier Inc. All rights reserved.
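    A condensed sketch of the IPCW construction for a fixed prediction horizon is given below: the censoring survival curve is estimated by Kaplan-Meier (treating censoring as the "event"), subjects whose status at the horizon is known are kept, and each is weighted by the inverse of the estimated probability of remaining uncensored. The simplified weighting at min(T, tau) and the helper names are assumptions of this sketch, not the paper's exact estimator.

        import numpy as np

        def censoring_survival(times, events):
            """Kaplan-Meier estimate of G(t) = P(censoring time > t)."""
            order = np.argsort(times)
            t_sorted, cens = np.asarray(times)[order], 1 - np.asarray(events)[order]
            at_risk, G, curve = len(t_sorted), 1.0, {}
            for t, c in zip(t_sorted, cens):
                if c:                               # a censoring "event"
                    G *= 1.0 - 1.0 / at_risk
                curve[t] = G
                at_risk -= 1
            # G is non-increasing, so the value at the largest k <= t is the minimum over k <= t
            return lambda t: min((v for k, v in curve.items() if k <= t), default=1.0)

        def ipcw_dataset(X, times, events, tau):
            G = censoring_survival(times, events)
            rows, labels, weights = [], [], []
            for x, t, e in zip(X, times, events):
                if t >= tau or e == 1:              # status at the horizon tau is known
                    labels.append(1 if (e == 1 and t < tau) else 0)
                    weights.append(1.0 / max(G(min(t, tau)), 1e-8))
                    rows.append(x)
            return np.array(rows), np.array(labels), np.array(weights)

        # toy usage: features, follow-up times (years) and event indicators
        X = np.array([[62, 1], [70, 0], [55, 1], [68, 1]])
        times = np.array([6.0, 2.1, 5.0, 1.0]); events = np.array([0, 0, 1, 1])
        print(ipcw_dataset(X, times, events, tau=5.0))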

  9. Major accident prevention through applying safety knowledge management approach.

    Science.gov (United States)

    Kalatpour, Omid

    2016-01-01

    Many scattered resources of knowledge are available for chemical accident prevention purposes. The common approach to managing process safety, including using databases and referring to the available knowledge, has some drawbacks. The main goal of this article was to devise a newly emerged knowledge base (KB) for the chemical accident prevention domain. The scattered sources of safety knowledge were identified and scanned. Then, the collected knowledge was formalized through a computerized program. The Protégé software was used to formalize and represent the stored safety knowledge. The domain knowledge was retrieved, as well as related data and information. This optimized approach improved the safety and health knowledge management (KM) process and resolved some typical problems in the KM process. Upgrading traditional resources of safety databases into KBs can improve the interaction between users and the knowledge repository.

  10. A Multi-Criterion Evolutionary Approach Applied to Phylogenetic Reconstruction

    OpenAIRE

    Cancino, W.; Delbem, A.C.B.

    2010-01-01

    In this paper, we proposed an MOEA approach, called PhyloMOEA which solves the phylogenetic inference problem using maximum parsimony and maximum likelihood criteria. The PhyloMOEA's development was motivated by several studies in the literature (Huelsenbeck, 1995; Jin & Nei, 1990; Kuhner & Felsenstein, 1994; Tateno et al., 1994), which point out that various phylogenetic inference methods lead to inconsistent solutions. Techniques using parsimony and likelihood criteria yield to different tr...

  11. A new kinetic biphasic approach applied to biodiesel process intensification

    Energy Technology Data Exchange (ETDEWEB)

    Russo, V.; Tesser, R.; Di Serio, M.; Santacesaria, E. [Naples Univ. (Italy). Dept. of Chemistry

    2012-07-01

    Many different papers have been published on the kinetics of the transesterification of vegetable oil with methanol in the presence of alkaline catalysts to produce biodiesel. All the proposed approaches are based on the assumption of a pseudo-monophasic system. The consequence of these approaches is that some experimental aspects cannot be described. For the reaction performed in batch conditions, for example, the monophasic approach is not able to reproduce the different plateaus obtained by using different amounts of catalyst, or the induction time observed at low stirring rates. Moreover, it has been observed when operating in continuous reactors that micromixing has a dramatic effect on the reaction rate. In this regard, we have recently observed that it is possible to obtain a complete conversion to biodiesel in less than 10 seconds of reaction time. This observation is confirmed also by other authors using different types of reactors, such as static mixers, micro-reactors, oscillatory flow reactors, cavitational reactors, microwave reactors or centrifugal contactors. In this work we show that a recently proposed biphasic kinetic approach is able to describe all the aforementioned aspects that cannot be described by the monophasic kinetic model. In particular, we show that the biphasic kinetic model can describe both the induction time observed in batch reactors at low stirring rates and the very high conversions obtainable in a micro-channel reactor. The adopted biphasic kinetic model is based on a reliable reaction mechanism that is validated by the experimental evidence reported in this work. (orig.)

  12. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko, [No Value

    2004-01-01

    Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  13. The applying stakeholder approach to strategic management of territories development

    Directory of Open Access Journals (Sweden)

    Ilshat Azamatovich Tazhitdinov

    2013-06-01

    In the paper, aspects of the strategic management of socioeconomic development of territories in terms of the stakeholder approach are discussed. The author's interpretation of the concept of a sub-region stakeholder is proposed, and a classification into stakeholders internal and external to the territorial socioeconomic system of the sub-regional level is offered. The types of interests and types of resources of stakeholders in the sub-region are identified; correlating interests and resources makes it possible to determine groups (alliances) of stakeholders that ensure a balance of interests depending on the specific objectives of the association. A conceptual stakeholder agent model for the management of strategic territorial development within the hierarchical system of «region — sub-region — municipal formation» is proposed. All stakeholders are considered as influence agents directing their own resources to provide a comprehensive approach to managing territorial development. The interaction between all the influence agents of the «region — sub-region — municipal formation» system is provided vertically and horizontally through the initialization of the development and implementation of strategic documents of the sub-region. Vertical interaction occurs between stakeholders such as government and municipal authorities acting in a guiding role, while horizontal interaction between the remaining stakeholders takes the form of partnership. Within the proposed model, concurrent engineering is implemented, which is a form of inter-municipal strategic cooperation of local governments of municipalities for the formation and analysis of a set of alternatives for project activities in the sub-region in order to choose the best options. The proposed approach was tested in the development of the medium-term comprehensive program of socioeconomic development of the Zauralye and the sub-regions of the North-East of the Republic of Bashkortostan (2011–2015).

  14. Modeling in applied sciences a kinetic theory approach

    CERN Document Server

    Pulvirenti, Mario

    2000-01-01

    Modeling complex biological, chemical, and physical systems, in the context of spatially heterogeneous mediums, is a challenging task for scientists and engineers using traditional methods of analysis. Modeling in Applied Sciences is a comprehensive survey of modeling large systems using kinetic equations, and in particular the Boltzmann equation and its generalizations. An interdisciplinary group of leading authorities carefully develop the foundations of kinetic models and discuss the connections and interactions between model theories, qualitative and computational analysis and real-world applications. This book provides a thoroughly accessible and lucid overview of the different aspects, models, computations, and methodology for the kinetic-theory modeling process. Topics and Features: * Integrated modeling perspective utilized in all chapters * Fluid dynamics of reacting gases * Self-contained introduction to kinetic models * Becker–Döring equations * Nonlinear kinetic models with chemical reactions * Kinet...

  15. Cortical complexity in bipolar disorder applying a spherical harmonics approach.

    Science.gov (United States)

    Nenadic, Igor; Yotter, Rachel A; Dietzek, Maren; Langbein, Kerstin; Sauer, Heinrich; Gaser, Christian

    2017-05-30

    Recent studies using surface-based morphometry of structural magnetic resonance imaging data have suggested that some changes in bipolar disorder (BP) might be neurodevelopmental in origin. We applied a novel analysis of cortical complexity based on fractal dimensions in high-resolution structural MRI scans of 18 bipolar disorder patients and 26 healthy controls. Our region-of-interest based analysis revealed increases in fractal dimensions (in patients relative to controls) in left lateral orbitofrontal cortex and right precuneus, and decreases in right caudal middle frontal, entorhinal cortex, and right pars orbitalis, and left fusiform and posterior cingulate cortices. While our analysis is preliminary, it suggests that early neurodevelopmental pathologies might contribute to bipolar disorder, possibly through genetic mechanisms. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.

  16. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned probability zero.

  17. Undiscovered resource evaluation: Towards applying a systematic approach to uranium

    International Nuclear Information System (INIS)

    Fairclough, M.; Katona, L.

    2014-01-01

    Evaluations of potential mineral resource supply range from spatial to aspatial, and everything in between, across a range of scales. They also range from qualitative to quantitative, with similar hybrid examples across the spectrum. These can comprise detailed deposit-specific reserve and resource calculations, target generative processes, and estimates of potential endowments in a broad geographic or geological area. All are estimates until the ore has been discovered and extracted. Contemporary national or provincial scale evaluations of mineral potential are relatively advanced and some include uranium, such as those for South Australia undertaken by the State Geological Survey. These play an important role in land-use planning as well as attracting exploration investment, and range from data- to knowledge-driven approaches. Studies have been undertaken for the Mt Painter region, as well as for adjacent basins. The process of estimating large-scale potential mineral endowments is critical for national and international planning purposes but is a relatively recent and less common undertaking. In many cases, except at a general level, the data and knowledge for a relatively immature terrain are lacking, requiring assessment by analogy with other areas. Commencing in the 1980s, the United States Geological Survey, and subsequently the Geological Survey of Canada, evaluated a range of commodities ranging from copper to hydrocarbons with a view to security of supply. They developed innovative approaches to reduce, as far as practical, the uncertainty and maximise the reproducibility of the calculations in information-poor regions. Yet the approach to uranium was relatively ad hoc and incomplete (such as the US Department of Energy NURE project). Other historic attempts, such as the IAEA-NEA International Uranium Resource Evaluation Project (IUREP) in the 1970s, were mainly qualitative. While there is still no systematic global evaluation of undiscovered uranium resources

  18. An Inverse Kinematic Approach Using Groebner Basis Theory Applied to Gait Cycle Analysis

    Science.gov (United States)

    2013-03-01

    Thesis AFIT-ENP-13-M-02, "An Inverse Kinematic Approach Using Groebner Basis Theory Applied to Gait Cycle Analysis", by Anum Barki, BS, Air Force Institute of Technology; approved by Dr. Ronald F. Tuttle (Chairman) and Dr. Kimberly Kendricks.

  19. Delayed Monocular SLAM Approach Applied to Unmanned Aerial Vehicles.

    Science.gov (United States)

    Munguia, Rodrigo; Urzua, Sarquis; Grau, Antoni

    2016-01-01

    In recent years, many researchers have addressed the issue of making Unmanned Aerial Vehicles (UAVs) more and more autonomous. In this context, the state estimation of the vehicle position is a fundamental necessity for any application involving autonomy. However, the problem of position estimation could not be solved in some scenarios, even when a GPS signal is available, for instance, an application requiring performing precision manoeuvres in a complex environment. Therefore, some additional sensory information should be integrated into the system in order to improve accuracy and robustness. In this work, a novel vision-based simultaneous localization and mapping (SLAM) method with application to unmanned aerial vehicles is proposed. One of the contributions of this work is to design and develop a novel technique for estimating features depth which is based on a stochastic technique of triangulation. In the proposed method the camera is mounted over a servo-controlled gimbal that counteracts the changes in attitude of the quadcopter. Due to the above assumption, the overall problem is simplified and it is focused on the position estimation of the aerial vehicle. Also, the tracking process of visual features is made easier due to the stabilized video. Another contribution of this work is to demonstrate that the integration of very noisy GPS measurements into the system for an initial short period of time is enough to initialize the metric scale. The performance of this proposed method is validated by means of experiments with real data carried out in unstructured outdoor environments. A comparative study shows that, when compared with related methods, the proposed approach performs better in terms of accuracy and computational time.

  20. Delayed Monocular SLAM Approach Applied to Unmanned Aerial Vehicles.

    Directory of Open Access Journals (Sweden)

    Rodrigo Munguia

    In recent years, many researchers have addressed the issue of making Unmanned Aerial Vehicles (UAVs) more and more autonomous. In this context, the state estimation of the vehicle position is a fundamental necessity for any application involving autonomy. However, the problem of position estimation could not be solved in some scenarios, even when a GPS signal is available, for instance, an application requiring performing precision manoeuvres in a complex environment. Therefore, some additional sensory information should be integrated into the system in order to improve accuracy and robustness. In this work, a novel vision-based simultaneous localization and mapping (SLAM) method with application to unmanned aerial vehicles is proposed. One of the contributions of this work is to design and develop a novel technique for estimating features depth which is based on a stochastic technique of triangulation. In the proposed method the camera is mounted over a servo-controlled gimbal that counteracts the changes in attitude of the quadcopter. Due to the above assumption, the overall problem is simplified and it is focused on the position estimation of the aerial vehicle. Also, the tracking process of visual features is made easier due to the stabilized video. Another contribution of this work is to demonstrate that the integration of very noisy GPS measurements into the system for an initial short period of time is enough to initialize the metric scale. The performance of this proposed method is validated by means of experiments with real data carried out in unstructured outdoor environments. A comparative study shows that, when compared with related methods, the proposed approach performs better in terms of accuracy and computational time.

  1. Mouse genetic approaches applied to the normal tissue radiation response

    International Nuclear Information System (INIS)

    Haston, Christina K.

    2012-01-01

    The varying responses of inbred mouse models to radiation exposure present a unique opportunity to dissect the genetic basis of radiation sensitivity and tissue injury. Such studies are complementary to human association studies as they permit both the analysis of clinical features of disease, and of specific variants associated with its presentation, in a controlled environment. Herein I review how animal models are studied to identify specific genetic variants influencing predisposition to radiation-induced traits. Among these radiation-induced responses are documented strain differences in repair of DNA damage and in extent of tissue injury (in the lung, skin, and intestine), which form the basis for genetic investigations. For example, radiation-induced DNA damage is consistently greater in tissues from BALB/cJ mice than in C57BL/6J mice, suggesting there may be an inherent DNA damage level per strain. Regarding tissue injury, strain-specific inflammatory and fibrotic phenotypes have been documented principally for C57BL/6, C3H and A/J mice, but a correlation among responses such that knowledge of the radiation injury in one tissue informs of the response in another is not evident. Strategies to identify genetic differences contributing to a trait based on inbred strain differences, which include linkage analysis and the evaluation of recombinant congenic (RC) strains, are presented, with a focus on the lung response to irradiation, which is the only radiation-induced tissue injury mapped to date. Such approaches are needed to reveal genetic differences in susceptibility to radiation injury, and also to provide a context for the effects of specific genetic variation uncovered in anticipated clinical association studies. In summary, mouse models can be studied to uncover heritable variation predisposing to specific radiation responses, and such variations may point to pathways of importance to phenotype development in the clinic.

  2. Conserving relativistic many-body approach: Equation of state, spectral function, and occupation probabilities of nuclear matter

    International Nuclear Information System (INIS)

    de Jong, F.; Malfliet, R.

    1991-01-01

    Starting from a relativistic Lagrangian we derive a "conserving" approximation for the description of nuclear matter. We show this to be a nontrivial extension over the relativistic Dirac-Brueckner scheme. The saturation point of the equation of state calculated agrees very well with the empirical saturation point. The conserving character of the approach is tested by means of the Hugenholtz–van Hove theorem. We find the theorem fulfilled very well around saturation. A new value for the compression modulus is derived, K = 310 MeV. Also we calculate the occupation probabilities at normal nuclear matter densities by means of the spectral function. The average depletion κ of the Fermi sea is found to be κ ∼ 0.11.

  3. Applying a Bayesian Approach to Identification of Orthotropic Elastic Constants from Full Field Displacement Measurements

    Directory of Open Access Journals (Sweden)

    Le Riche R.

    2010-06-01

    dimensionality. POD is based on projecting the full field images on a modal basis, constructed from sample simulations, which can account for the variations of the full field as the elastic constants and other parameters of interest are varied. The fidelity of the decomposition depends on the number of basis vectors used. Typically even complex fields can be accurately represented with no more than a few dozen modes, and for our problem we showed that only four or five modes are sufficient [5]. To further reduce the computational cost of the Bayesian approach we use response surface approximations of the POD coefficients of the fields. We show that 3rd degree polynomial response surface approximations provide satisfying accuracy. The combination of POD decomposition and response surface methodology allows us to bring down the computational time of the Bayesian identification to a few days. The proposed approach is applied to Moiré interferometry full field displacement measurements from a traction experiment on a plate with a hole. The laminate with a layup of [45, -45, 0]s is made out of a Toray® T800/3631 graphite/epoxy prepreg. The measured displacement maps are provided in Figure 1. The mean values of the identified properties' joint probability density function are in agreement with previous identifications carried out on the same material. Furthermore, the probability density function also provides the coefficient of variation with which the properties are identified as well as the correlations between the various properties. We find that while the longitudinal Young’s modulus is identified with good accuracy (low standard deviation), the Poisson’s ratio is identified with much higher uncertainty. Several of the properties are also found to be correlated. The identified uncertainty structure of the elastic constants (i.e. variance-covariance matrix) has potential benefits to reliability analyses, by allowing a more accurate description of the input uncertainty. An

  4. Applying a Bayesian Approach to Identification of Orthotropic Elastic Constants from Full Field Displacement Measurements

    Science.gov (United States)

    Gogu, C.; Yin, W.; Haftka, R.; Ifju, P.; Molimard, J.; Le Riche, R.; Vautrin, A.

    2010-06-01

    based on projecting the full field images on a modal basis, constructed from sample simulations, which can account for the variations of the full field as the elastic constants and other parameters of interest are varied. The fidelity of the decomposition depends on the number of basis vectors used. Typically even complex fields can be accurately represented with no more than a few dozen modes, and for our problem we showed that only four or five modes are sufficient [5]. To further reduce the computational cost of the Bayesian approach we use response surface approximations of the POD coefficients of the fields. We show that 3rd degree polynomial response surface approximations provide satisfactory accuracy. The combination of POD decomposition and response surface methodology makes it possible to bring the computational time of the Bayesian identification down to a few days. The proposed approach is applied to Moiré interferometry full field displacement measurements from a traction experiment on a plate with a hole. The laminate, with a layup of [45,-45,0]s, is made out of a Toray® T800/3631 graphite/epoxy prepreg. The measured displacement maps are provided in Figure 1. The mean values of the identified properties' joint probability density function are in agreement with previous identifications carried out on the same material. Furthermore, the probability density function also provides the coefficient of variation with which the properties are identified, as well as the correlations between the various properties. We find that while the longitudinal Young's modulus is identified with good accuracy (low standard deviation), the Poisson's ratio is identified with much higher uncertainty. Several of the properties are also found to be correlated. The identified uncertainty structure of the elastic constants (i.e. variance-covariance matrix) has potential benefits to reliability analyses, by allowing a more accurate description of the input uncertainty. An additional

  5. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
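    A quick way to see the effect described above is to simulate it: set a control threshold from parameters estimated on a finite sample and compare the nominal failure probability with the expected failure frequency under the true distribution. The snippet below is such an illustration under invented assumptions (log-normal losses, a 1% nominal level, 30 observations); it is not the paper's exact calculation.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
mu_true, sigma_true = 0.0, 1.0          # true (unknown) parameters of log(loss)
n, nominal_eps, n_rep = 30, 0.01, 20_000

exceed = np.empty(n_rep)
for r in range(n_rep):
    logs = rng.normal(mu_true, sigma_true, n)                      # observed log-losses
    mu_hat, sigma_hat = logs.mean(), logs.std(ddof=1)              # estimated parameters
    threshold = mu_hat + sigma_hat * norm.ppf(1 - nominal_eps)     # control set from estimates
    exceed[r] = norm.sf(threshold, loc=mu_true, scale=sigma_true)  # true exceedance probability

print(f"nominal: {nominal_eps:.4f}   expected under parameter error: {exceed.mean():.4f}")
```

    On average the realized failure frequency exceeds the nominal 1%, which is the phenomenon the paper quantifies exactly for location-scale families.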

  6. A Unified Moment-Based Approach for the Evaluation of the Outage Probability with Noise and Interference

    KAUST Repository

    Rached, Nadhir Ben

    2016-12-05

    In this paper, we develop a novel moment-based approach for the evaluation of the outage probability (OP) in a generalized fading environment with interference and noise. Our method is based on the derivation of a power series expansion of the OP of the signal-to-interference-plus-noise ratio (SINR). It does not necessitate stringent requirements, the only major ones being the existence of a power series expansion of the cumulative distribution function of the desired user power and the knowledge of the cross-moments of the interferers' powers. The newly derived formula is shown to be applicable to most practical fading models of the desired user under some assumptions on the parameters of the powers' distributions. A further advantage of our method is that it is applicable irrespective of the nature of the fading models of the interfering powers, the only requirement being the perfect knowledge of their cross-moments. In order to illustrate the wide scope of applicability of our technique, we present a convergence study of the provided formula for the Generalized Gamma and the Rice fading models. Moreover, we show that our analysis has direct bearing on recent multi-channel applications using selection diversity techniques. Finally, we assess by simulations the accuracy of the proposed formula for various fading environments.

  7. An Alternative Version of Conditional Probabilities and Bayes' Rule: An Application of Probability Logic

    Science.gov (United States)

    Satake, Eiki; Amato, Philip P.

    2008-01-01

    This paper presents an alternative version of formulas of conditional probabilities and Bayes' rule that demonstrate how the truth table of elementary mathematical logic applies to the derivations of the conditional probabilities of various complex, compound statements. This new approach is used to calculate the prior and posterior probabilities…
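    Whatever the derivation route, the quantity these formulas deliver is the posterior probability obtained from a prior and two conditional likelihoods. A generic numeric illustration of that update follows; the values are invented and it does not reproduce the paper's truth-table derivation.

```python
# Posterior P(H | E) from a prior P(H), a likelihood P(E | H), and P(E | not H).
p_h = 0.01          # prior probability of hypothesis H
p_e_h = 0.95        # P(E | H)
p_e_not_h = 0.10    # P(E | not H)

p_e = p_e_h * p_h + p_e_not_h * (1 - p_h)   # total probability of the evidence
posterior = p_e_h * p_h / p_e               # Bayes' rule
print(round(posterior, 4))                  # ~= 0.0876
```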

  8. Probable Inference and Quantum Mechanics

    International Nuclear Information System (INIS)

    Grandy, W. T. Jr.

    2009-01-01

    In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.

  9. SLIM-MAUD: an approach to assessing human error probabilities using structured expert judgment. Volume II. Detailed analysis of the technical issues

    International Nuclear Information System (INIS)

    Embrey, D.E.; Humphreys, P.; Rosa, E.A.; Kirwan, B.; Rea, K.

    1984-07-01

    This two-volume report presents the procedures and analyses performed in developing an approach for structuring expert judgments to estimate human error probabilities. Volume I presents an overview of work performed in developing the approach: SLIM-MAUD (Success Likelihood Index Methodology, implemented through the use of an interactive computer program called MAUD-Multi-Attribute Utility Decomposition). Volume II provides a more detailed analysis of the technical issues underlying the approach

  10. Bonding in Heavier Group 14 Zero-Valent Complexes-A Combined Maximum Probability Domain and Valence Bond Theory Approach.

    Science.gov (United States)

    Turek, Jan; Braïda, Benoît; De Proft, Frank

    2017-10-17

    The bonding in heavier Group 14 zero-valent complexes of a general formula L₂E (E=Si-Pb; L=phosphine, N-heterocyclic and acyclic carbene, cyclic tetrylene and carbon monoxide) is probed by combining valence bond (VB) theory and maximum probability domain (MPD) approaches. All studied complexes are initially evaluated on the basis of the structural parameters and the shape of frontier orbitals revealing a bent structural motif and the presence of two lone pairs at the central E atom. For the VB calculations three resonance structures are suggested, representing the "ylidone", "ylidene" and "bent allene" structures, respectively. The influence of both ligands and central atoms on the bonding situation is clearly expressed in different weights of the resonance structures for the particular complexes. In general, the bonding in the studied E⁰ compounds, the tetrylones, is best described as a resonating combination of "ylidone" and "ylidene" structures with a minor contribution of the "bent allene" structure. Moreover, the VB calculations allow for a straightforward assessment of the π-backbonding (E→L) stabilization energy. The validity of the suggested resonance model is further confirmed by the complementary MPD calculations focusing on the E lone pair region as well as the E-L bonding region. Likewise, the MPD method reveals a strong influence of the σ-donating and π-accepting properties of the ligand. In particular, either one single domain or two symmetrical domains are found in the lone pair region of the central atom, supporting the predominance of either the "ylidene" or "ylidone" structures having one or two lone pairs at the central atom, respectively. Furthermore, the calculated average populations in the lone pair MPDs correlate very well with the natural bond orbital (NBO) populations, and can be related to the average number of electrons that is backdonated to the ligands. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Challenges and Limitations of Applying an Emotion-driven Design Approach on Elderly Users

    DEFF Research Database (Denmark)

    Andersen, Casper L.; Gudmundsson, Hjalte P.; Achiche, Sofiane

    2011-01-01

    a competitive advantage for companies. In this paper, challenges of applying an emotion-driven design approach to elderly people, in order to identify their user needs towards walking frames, are discussed. The discussion is based on the experiences and results obtained from the case study … related to the participants' age and cognitive abilities. The challenges encountered are discussed, and guidelines on what should be taken into account to facilitate an emotion-driven design approach for elderly people are proposed.

  12. Cluster Validity Classification Approaches Based on Geometric Probability and Application in the Classification of Remotely Sensed Images

    Directory of Open Access Journals (Sweden)

    LI Jian-Wei

    2014-08-01

    On the basis of the cluster validity function based on geometric probability in literature [1, 2], we propose a cluster analysis method based on geometric probability to process large amounts of data in a rectangular area. The basic idea is top-down stepwise refinement: first into categories, then into subcategories. At every clustering level, the cluster validity function based on geometric probability is used first to determine the clusters and the gathering direction, and then the cluster centers and borders are determined. Through TM remote sensing image classification examples, the method is compared with the supervised and unsupervised classification in ERDAS and with the cluster analysis method based on geometric probability in a two-dimensional square proposed in literature [2]. Results show that the proposed method can significantly improve the classification accuracy.

  13. Responses of mink to auditory stimuli: Prerequisites for applying the ‘cognitive bias’ approach

    DEFF Research Database (Denmark)

    Svendsen, Pernille Maj; Malmkvist, Jens; Halekoh, Ulrich

    2012-01-01

    The aim of the study was to determine and validate prerequisites for applying a cognitive (judgement) bias approach to assessing welfare in farmed mink (Neovison vison). We investigated discrimination ability and associative learning ability using auditory cues. The mink (n = 15 females) were … farmed mink in a judgement bias approach would thus appear to be feasible. However, several specific issues are to be considered in order to successfully adapt a cognitive bias approach to mink, and these are discussed.

  14. Toward a generalized probability theory: conditional probabilities

    International Nuclear Information System (INIS)

    Cassinelli, G.

    1979-01-01

    The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)

  15. Generalized Probability-Probability Plots

    NARCIS (Netherlands)

    Mushkudiani, N.A.; Einmahl, J.H.J.

    2004-01-01

    We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real-valued data. These plots, which are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P
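    For contrast with the generalized construction mentioned above, a classical one-sample P-P plot simply compares the empirical distribution function with the hypothesised one at the observed points. The sketch below shows that baseline construction; it is an illustration on synthetic data, not the paper's interval-indexed version.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import norm

rng = np.random.default_rng(7)
x = np.sort(rng.normal(0.3, 1.1, 200))            # data whose fit to N(0, 1) we examine
empirical = np.arange(1, x.size + 1) / x.size     # empirical CDF at the order statistics
theoretical = norm.cdf(x)                         # hypothesised CDF at the same points

plt.plot(theoretical, empirical, ".", markersize=3)
plt.plot([0, 1], [0, 1], "k--")                   # line of perfect agreement
plt.xlabel("theoretical probability")
plt.ylabel("empirical probability")
plt.show()
```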

  16. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  17. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  18. Diagnostics of enterprise bankruptcy occurrence probability in an anti-crisis management: modern approaches and classification of models

    Directory of Open Access Journals (Sweden)

    I.V. Zhalinska

    2015-09-01

    Diagnostics of enterprise bankruptcy occurrence probability is defined as an important tool for ensuring the viability of an organization under conditions of an unpredictable, dynamic environment. The paper aims to define the basic features of models for diagnostics of bankruptcy occurrence probability and to classify them. The article substantiates the objective increase of crisis probability in modern enterprises, an increase that leads to the need to improve the efficiency of anti-crisis activities. The system of anti-crisis management is based on the subsystem of diagnostics of bankruptcy occurrence probability, and this subsystem is the main one for further measures to prevent and overcome the crisis. A classification of existing models of enterprise bankruptcy occurrence probability has been suggested, based on the methodical and methodological principles of the models. The following main groups of models are determined: models using financial ratios, aggregates and scores; discriminant analysis models; strategic analysis methods; informal models; artificial intelligence systems; and combinations of these models. The classification made it possible to identify the analytical capabilities of each of the suggested groups of models.

  19. A Probably Minor Role for Land-Applied Goat Manure in the Transmission of Coxiella burnetii to Humans in the 2007-2010 Dutch Q Fever Outbreak

    NARCIS (Netherlands)

    Brom, Van den R.; Roest, H.I.J.; Bruin, de Arnout; Dercksen, D.; Santman-Berends, I.M.G.A.; Hoek, van der Wim; Dinkla, A.; Vellema, Jelmer; Vellema, P.

    2015-01-01

    In 2007, Q fever started to become a major public health problem in the Netherlands, with small ruminants as most probable source. In order to reduce environmental contamination, control measures for manure were implemented because of the assumption that manure was highly contaminated with Coxiella

  20. A probably minor role for land-applied goat manure in the transmission of Coxiella burnetii to humans in the 2007–2010 Dutch Q fever outbreak

    NARCIS (Netherlands)

    Van den Brom, R.; Roest, H.J.; De Bruin, A.; Dercksen, D.; Santman-Berends, I.; Van der Hoek, W.; Dinkla, A.; Vellema, J.; Vellema, P.

    2015-01-01

    In 2007, Q fever started to become a major public health problem in the Netherlands, with small ruminants as most probable source. In order to reduce environmental contamination, control measures for manure were implemented because of the assumption that manure was highly contaminated with Coxiella

  1. A Generic Simulation Approach for the Fast and Accurate Estimation of the Outage Probability of Single Hop and Multihop FSO Links Subject to Generalized Pointing Errors

    KAUST Repository

    Ben Issaid, Chaouki; Park, Kihong; Alouini, Mohamed-Slim

    2017-01-01

    When assessing the performance of the free space optical (FSO) communication systems, the outage probability encountered is generally very small, and thereby the use of naive Monte Carlo simulations becomes prohibitively expensive. To estimate these rare event probabilities, we propose in this work an importance sampling approach which is based on the exponential twisting technique to offer fast and accurate results. In fact, we consider a variety of turbulence regimes, and we investigate the outage probability of FSO communication systems, under a generalized pointing error model based on the Beckmann distribution, for both single and multihop scenarios. Selected numerical simulations are presented to show the accuracy and the efficiency of our approach compared to naive Monte Carlo.

  2. A Generic Simulation Approach for the Fast and Accurate Estimation of the Outage Probability of Single Hop and Multihop FSO Links Subject to Generalized Pointing Errors

    KAUST Repository

    Ben Issaid, Chaouki

    2017-07-28

    When assessing the performance of the free space optical (FSO) communication systems, the outage probability encountered is generally very small, and thereby the use of naive Monte Carlo simulations becomes prohibitively expensive. To estimate these rare event probabilities, we propose in this work an importance sampling approach which is based on the exponential twisting technique to offer fast and accurate results. In fact, we consider a variety of turbulence regimes, and we investigate the outage probability of FSO communication systems, under a generalized pointing error model based on the Beckmann distribution, for both single and multihop scenarios. Selected numerical simulations are presented to show the accuracy and the efficiency of our approach compared to naive Monte Carlo.
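    The core of the method described in these two records is importance sampling with an exponentially twisted proposal. The toy example below applies the same idea to a plain Gaussian tail probability; it is only an illustration of the variance-reduction mechanism, not the paper's FSO setting with Beckmann pointing errors and turbulence. Sampling from a shifted density and reweighting by the likelihood ratio makes a rare event estimable with a modest sample size.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
a, n = 5.0, 100_000                     # rare threshold and sample size

# Exponential twisting of N(0, 1) by theta shifts its mean to theta; choosing theta = a
# centres the proposal on the rare region.
theta = a
x = rng.normal(theta, 1.0, n)                   # samples from the twisted density
weights = np.exp(-theta * x + theta**2 / 2.0)   # likelihood ratio dP / dP_theta
estimate = np.mean((x > a) * weights)

print(estimate, norm.sf(a))                     # IS estimate vs exact tail probability
```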

  3. An approach for estimating the breach probabilities of moraine-dammed lakes in the Chinese Himalayas using remote-sensing data

    Directory of Open Access Journals (Sweden)

    X. Wang

    2012-10-01

    To make first-order estimates of the probability of moraine-dammed lake outburst flood (MDLOF) and prioritize the probabilities of breaching posed by potentially dangerous moraine-dammed lakes (PDMDLs) in the Chinese Himalayas, an objective approach is presented. We first select five indicators to identify PDMDLs according to four predesigned criteria. The climatic background was regarded as the climatic precondition of the moraine-dam failure, and under different climatic preconditions, we distinguish the trigger mechanisms of MDLOFs and subdivide them into 17 possible breach modes, with each mode having three or four components; we combined the precondition, modes and components to construct a decision-making tree of moraine-dam failure. Conversion guidelines were established so as to quantify the probabilities of components of a breach mode employing the historic performance method combined with expert knowledge and experience. The region of the Chinese Himalayas was chosen as a study area where there have been frequent MDLOFs in recent decades. The results show that the breaching probabilities (P) of 142 PDMDLs range from 0.037 to 0.345, and they can be further categorized as 43 lakes with very high breach probabilities (P ≥ 0.24), 47 lakes with high breach probabilities (0.18 ≤ P < 0.24), 24 lakes with mid-level breach probabilities (0.12 ≤ P < 0.18), 24 lakes with low breach probabilities (0.06 ≤ P < 0.12), and four lakes with very low breach probabilities (P < 0.06).

  4. What subject matter questions motivate the use of machine learning approaches compared to statistical models for probability prediction?

    Science.gov (United States)

    Binder, Harald

    2014-07-01

    This is a discussion of the following papers: "Probability estimation with machine learning methods for dichotomous and multicategory outcome: Theory" by Jochen Kruppa, Yufeng Liu, Gérard Biau, Michael Kohler, Inke R. König, James D. Malley, and Andreas Ziegler; and "Probability estimation with machine learning methods for dichotomous and multicategory outcome: Applications" by Jochen Kruppa, Yufeng Liu, Hans-Christian Diener, Theresa Holste, Christian Weimar, Inke R. König, and Andreas Ziegler. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Probabilistic Approach to Provide Scenarios of Earthquake-Induced Slope Failures (PARSIFAL) Applied to the Alcoy Basin (South Spain)

    Directory of Open Access Journals (Sweden)

    Salvatore Martino

    2018-02-01

    The PARSIFAL (Probabilistic Approach to pRovide Scenarios of earthquake-Induced slope FAiLures) approach was applied in the basin of Alcoy (Alicante, South Spain) to provide a comprehensive scenario of earthquake-induced landslides. The basin of Alcoy is well known for several historical landslides, mainly represented by earth-slides, that involve urban settlement as well as infrastructures (i.e., roads, bridges). PARSIFAL overcomes several limits existing in other approaches, allowing the concomitant analysis of: (i) first-time landslides (due to both rock-slope failures and shallow earth-slides) and reactivations of existing landslides; (ii) slope stability analyses of different failure mechanisms; (iii) comprehensive mapping of earthquake-induced landslide scenarios in terms of exceedance probability of critical threshold values of co-seismic displacements. Geotechnical data were used to constrain the slope stability analysis, while specific field surveys were carried out to measure jointing and strength conditions of rock masses and to inventory already existing landslides. GIS-based susceptibility analyses were performed to assess the proneness to shallow earth-slides as well as to verify kinematic compatibility to planar or wedge rock-slides and to topples. The application of PARSIFAL to the Alcoy basin: (i) confirms the suitability of the approach at a municipality scale, (ii) highlights the main role of saturation in conditioning slope instabilities in this case study, (iii) demonstrates the reliability of the obtained results with respect to the historical data.

  6. Next-Generation Mitogenomics: A Comparison of Approaches Applied to Caecilian Amphibian Phylogeny

    OpenAIRE

    Maddock, Simon T.; Briscoe, Andrew G.; Wilkinson, Mark; Waeschenbach, Andrea; San Mauro, Diego; Day, Julia J.; Littlewood, D. Tim J.; Foster, Peter G.; Nussbaum, Ronald A.; Gower, David J.

    2016-01-01

    Mitochondrial genome (mitogenome) sequences are being generated with increasing speed due to the advances of next-generation sequencing (NGS) technology and associated analytical tools. However, detailed comparisons to explore the utility of alternative NGS approaches applied to the same taxa have not been undertaken. We compared a ‘traditional’ Sanger sequencing method with two NGS approaches (shotgun sequencing and non-indexed, multiplex amplicon sequencing) on four different sequencing pla...

  7. Multimodal Approach for Automatic Emotion Recognition Applied to the Tension Levels Study in TV Newscasts

    Directory of Open Access Journals (Sweden)

    Moisés Henrique Ramos Pereira

    2015-12-01

    This article addresses a multimodal approach to automatic emotion recognition in participants of TV newscasts (presenters, reporters, commentators and others), able to assist the study of tension levels in narratives of events in this television genre. The methodology applies state-of-the-art computational methods to process and analyze facial expressions as well as speech signals. The proposed approach contributes to the semiodiscoursive study of TV newscasts and their enunciative praxis, assisting, for example, the identification of the communication strategy of these programs. To evaluate its effectiveness, the proposed approach was applied to a video of a report broadcast on a Brazilian TV newscast of great popularity in the state of Minas Gerais. The experimental results are promising for the recognition of emotions from the facial expressions of telejournalists and are in accordance with the distribution of audiovisual indicators extracted over the TV newscast, demonstrating the potential of the approach to support TV journalistic discourse analysis.

  8. Usefulness of an equal-probability assumption for out-of-equilibrium states: A master equation approach

    KAUST Repository

    Nogawa, Tomoaki; Ito, Nobuyasu; Watanabe, Hiroshi

    2012-01-01

    We examine the effectiveness of assuming an equal probability for states far from equilibrium. For this aim, we propose a method to construct a master equation for extensive variables describing nonstationary nonequilibrium dynamics. The key point of the method is the assumption that transient states are equivalent to the equilibrium state that has the same extensive variables, i.e., an equal probability holds for microscopic states in nonequilibrium. We demonstrate an application of this method to the critical relaxation of the two-dimensional Potts model by Monte Carlo simulations. While the one-variable description, which is adequate for equilibrium, yields relaxation dynamics that are very fast, the redundant two-variable description well reproduces the true dynamics quantitatively. These results suggest that some class of the nonequilibrium state can be described with a small extension of degrees of freedom, which may lead to an alternative way to understand nonequilibrium phenomena. © 2012 American Physical Society.

  9. Usefulness of an equal-probability assumption for out-of-equilibrium states: A master equation approach

    KAUST Repository

    Nogawa, Tomoaki

    2012-10-18

    We examine the effectiveness of assuming an equal probability for states far from equilibrium. For this aim, we propose a method to construct a master equation for extensive variables describing nonstationary nonequilibrium dynamics. The key point of the method is the assumption that transient states are equivalent to the equilibrium state that has the same extensive variables, i.e., an equal probability holds for microscopic states in nonequilibrium. We demonstrate an application of this method to the critical relaxation of the two-dimensional Potts model by Monte Carlo simulations. While the one-variable description, which is adequate for equilibrium, yields relaxation dynamics that are very fast, the redundant two-variable description well reproduces the true dynamics quantitatively. These results suggest that some class of the nonequilibrium state can be described with a small extension of degrees of freedom, which may lead to an alternative way to understand nonequilibrium phenomena. © 2012 American Physical Society.

  10. Applying a new ensemble approach to estimating stock status of marine fisheries around the world

    DEFF Research Database (Denmark)

    Rosenberg, Andrew A.; Kleisner, Kristin M.; Afflerbach, Jamie

    2018-01-01

    The exploitation status of marine fisheries stocks worldwide is of critical importance for food security, ecosystem conservation, and fishery sustainability. Applying a suite of data-limited methods to global catch data, combined through an ensemble modeling approach, we provide quantitative esti...

  11. Geothermal potential assessment for a low carbon strategy : A new systematic approach applied in southern Italy

    NARCIS (Netherlands)

    Trumpy, E.; Botteghi, S.; Caiozzi, F.; Donato, A.; Gola, G.; Montanari, D.; Pluymaekers, M. P D; Santilano, A.; van Wees, J. D.; Manzella, A.

    2016-01-01

    In this study a new approach to geothermal potential assessment was set up and applied in four regions in southern Italy. Our procedure, VIGORThermoGIS, relies on the volume method of assessment and uses a 3D model of the subsurface to integrate thermal, geological and petro-physical data. The

  12. Geothermal potential assessment for a low carbon strategy: A new systematic approach applied in southern Italy

    NARCIS (Netherlands)

    Trumpy, E.; Botteghi, S.; Caiozzi, F.; Donato, A.; Gola, G.; Montanari, D.; Pluymaekers, M.P.D.; Santilano, A.; Wees, J.D. van; Manzella, A.

    2016-01-01

    In this study a new approach to geothermal potential assessment was set up and applied in four regions in southern Italy. Our procedure, VIGORThermoGIS, relies on the volume method of assessment and uses a 3D model of the subsurface to integrate thermal, geological and petro-physical data. The

  13. Blended Risk Approach in Applying PSA Models to Risk-Based Regulations

    International Nuclear Information System (INIS)

    Dimitrijevic, V. B.; Chapman, J. R.

    1996-01-01

    In this paper, the authors will discuss a modern approach in applying PSA models in risk-based regulation. The Blended Risk Approach is a combination of traditional and probabilistic processes. It is receiving increased attention in different industries in the U. S. and abroad. The use of the deterministic regulations and standards provides a proven and well understood basis on which to assess and communicate the impact of change to plant design and operation. Incorporation of traditional values into risk evaluation is working very well in the blended approach. This approach is very application specific. It includes multiple risk attributes, qualitative risk analysis, and basic deterministic principles. In blending deterministic and probabilistic principles, this approach ensures that the objectives of the traditional defense-in-depth concept are not compromised and the design basis of the plant is explicitly considered. (author)

  14. A suggested approach toward measuring sorption and applying sorption data to repository performance assessment

    International Nuclear Information System (INIS)

    Rundberg, R.S.

    1992-01-01

    The prediction of radionuclide migration for the purpose of assessing the safety of a nuclear waste repository will be based on a collective knowledge of the hydrologic and geochemical properties of the surrounding rock and groundwater. This knowledge, along with assumptions about the interactions of radionuclides with groundwater and minerals, forms the scientific basis for a model capable of accurately predicting the repository's performance. Because the interaction of radionuclides in geochemical systems is known to be complicated, several fundamental and empirical approaches to measuring the interaction between radionuclides and the geologic barrier have been developed. The approaches applied to the measurement of sorption involve the use of pure minerals, intact, or crushed rock in dynamic and static experiments. Each approach has its advantages and disadvantages. There is no single best method for providing sorption data for performance assessment models that can be applied without invoking information derived from multiple experiments. 53 refs., 12 figs

  15. A whole-of-curriculum approach to improving nursing students' applied numeracy skills.

    Science.gov (United States)

    van de Mortel, Thea F; Whitehair, Leeann P; Irwin, Pauletta M

    2014-03-01

    Nursing students often perform poorly on numeracy tests. Whilst one-off interventions have been trialled with limited success, a whole-of-curriculum approach may provide a better means of improving applied numeracy skills. The objective of the study is to assess the efficacy of a whole-of-curriculum approach in improving nursing students' applied numeracy skills. Two cycles of assessment, implementation and evaluation of strategies were conducted following a high fail rate in the final applied numeracy examination in a Bachelor of Nursing (BN) programme. Strategies included an early diagnostic assessment followed by referral to remediation, setting the pass mark at 100% for each of six applied numeracy examinations across the programme, and employing a specialist mathematics teacher to provide consistent numeracy teaching. The setting of the study is one Australian university. 1035 second- and third-year nursing students enrolled in four clinical nursing courses (CNC III, CNC IV, CNC V and CNC VI) were included. Data on the percentage of students who obtained 100% in their applied numeracy examination in up to two attempts were collected from CNCs III, IV, V and VI between 2008 and 2011. A four-by-two χ² contingency table was used to determine whether the differences in the proportion of students achieving 100% across two examination attempts in each CNC were significantly different between 2008 and 2011. The percentage of students who obtained 100% correct answers on the applied numeracy examinations was significantly higher in 2011 than in 2008 in CNC III (χ² = 272, df = 3, p < 0.001), IV (χ² = 94.7, df = 3, p < 0.001) and VI (χ² = 76.3, df = 3, p < 0.001). A whole-of-curriculum approach to developing applied numeracy skills in BN students resulted in a substantial improvement in these skills over four years. Copyright © 2013 Elsevier Ltd. All rights reserved.
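    The year-on-year comparison above rests on a χ² test of independence over a four-by-two contingency table. The snippet below runs that test on invented counts (not the study's data), purely to make the mechanics concrete.

```python
from scipy.stats import chi2_contingency

# Rows: outcome categories; columns: 2008 and 2011 cohorts (illustrative counts only).
observed = [[120, 180],   # 100% on the first attempt
            [ 60, 140],   # 100% on the second attempt
            [ 40,  20],   # below 100% after two attempts
            [ 30,  10]]   # did not complete both attempts

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.1f}, df = {dof}, p = {p:.3g}")
```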

  16. Uncharted territory: A complex systems approach as an emerging paradigm in applied linguistics

    Directory of Open Access Journals (Sweden)

    Weideman, Albert J

    2009-12-01

    Developing a theory of applied linguistics is a top priority for the discipline today. The emergence of a new paradigm - a complex systems approach - in applied linguistics presents us with a unique opportunity to give prominence to the development of a foundational framework for this design discipline. Far from being a mere philosophical exercise, such a framework will find application in the training and induction of new entrants into the discipline within the developing context of South Africa, as well as internationally.

  17. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  18. A generalised chemical precipitation modelling approach in wastewater treatment applied to calcite

    DEFF Research Database (Denmark)

    Mbamba, Christian Kazadi; Batstone, Damien J.; Flores Alsina, Xavier

    2015-01-01

    , the present study aims to identify a broadly applicable precipitation modelling approach. The study uses two experimental platforms applied to calcite precipitating from synthetic aqueous solutions to identify and validate the model approach. Firstly, dynamic pH titration tests are performed to define … an Arrhenius-style correction of k_cryst. The influence of magnesium (a common and representative added impurity) on k_cryst was found to be significant, but was considered an optional correction because of its lesser influence compared to that of temperature. Other variables such as ionic strength and pH were

  19. Evaluating the effect of corridors and landscape heterogeneity on dispersal probability: a comparison of three spatially explicit modelling approaches

    DEFF Research Database (Denmark)

    Jepsen, J. U.; Baveco, J. M.; Topping, C. J.

    2004-01-01

    preferences of the modeller, rather than by a critical evaluation of model performance. We present a comparison of three common spatial simulation approaches (patch-based incidence-function model (IFM), individual-based movement model (IBMM), individual-based population model including detailed behaviour...

  20. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  1. Quantum computing and probability

    International Nuclear Information System (INIS)

    Ferry, David K

    2009-01-01

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)

  2. COVAL, Compound Probability Distribution for Function of Probability Distribution

    International Nuclear Information System (INIS)

    Astolfi, M.; Elbaz, J.

    1979-01-01

    1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions
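    The problem COVAL addresses, the distribution of a function of random variables, is also the textbook target of plain Monte Carlo. The fragment below is that brute-force analogue for a structural margin under a random load; it is an illustration only, not the COVAL algorithm, which works by numerical transformation of the input distributions.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000
resistance = rng.lognormal(mean=np.log(300.0), sigma=0.10, size=n)   # random strength (e.g. MPa)
load = rng.normal(loc=200.0, scale=30.0, size=n)                      # random load (e.g. MPa)

margin = resistance - load                 # function of the two random variables
failure_prob = np.mean(margin < 0.0)       # P(margin < 0), the reliability quantity of interest
print(failure_prob)
```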

  3. Interference statistics and capacity analysis for uplink transmission in two-tier small cell networks: A geometric probability approach

    KAUST Repository

    Tabassum, Hina

    2014-07-01

    This paper presents a novel framework to derive the statistics of the interference considering dedicated and shared spectrum access for uplink transmission in two-tier small cell networks such as the macrocell-femtocell networks. The framework exploits the distance distributions from geometric probability theory to characterize the uplink interference while considering a traditional grid-model set-up for macrocells along with the randomly deployed femtocells. The derived expressions capture the impact of path-loss, composite shadowing and fading, uniform and non-uniform traffic loads, spatial distribution of femtocells, and partial and full spectral reuse among femtocells. Considering dedicated spectrum access, first, we derive the statistics of co-tier interference incurred at both femtocell and macrocell base stations (BSs) from a single interferer by approximating the generalized-K composite fading distribution with the tractable Gamma distribution. We then derive the distribution of the number of interferers considering partial spectral reuse and the moment generating function (MGF) of the cumulative interference for both partial and full spectral reuse scenarios. Next, we derive the statistics of the cross-tier interference at both femtocell and macrocell BSs considering shared spectrum access. Finally, we utilize the derived expressions to analyze the capacity in both dedicated and shared spectrum access scenarios. The derived expressions are validated by the Monte Carlo simulations. Numerical results are generated to assess the feasibility of shared and dedicated spectrum access in femtocells under varying traffic load and spectral reuse scenarios. © 2014 IEEE.
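    One standard way to replace a composite shadowing/fading power with a tractable Gamma distribution is to match its first two moments; whether the paper uses exactly this matching is not stated here, so the snippet below should be read as an assumption-laden illustration of that approximation step, with invented parameters.

```python
import numpy as np

rng = np.random.default_rng(8)
# Composite interferer power: Gamma small-scale fading multiplied by log-normal shadowing
# (parameters are illustrative only).
power = rng.gamma(shape=2.0, scale=0.5, size=100_000) * rng.lognormal(0.0, 0.69, size=100_000)

m, v = power.mean(), power.var()
k_match = m**2 / v          # shape of the moment-matched Gamma
theta_match = v / m         # scale of the moment-matched Gamma
print(k_match, theta_match)
```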

  4. A statistical approach to estimating effects of performance shaping factors on human error probabilities of soft controls

    International Nuclear Information System (INIS)

    Kim, Yochan; Park, Jinkyun; Jung, Wondea; Jang, Inseok; Hyun Seong, Poong

    2015-01-01

    Despite recent efforts toward data collection for supporting human reliability analysis, there remains a lack of empirical basis in determining the effects of performance shaping factors (PSFs) on human error probabilities (HEPs). To enhance the empirical basis regarding the effects of the PSFs, a statistical methodology using a logistic regression and stepwise variable selection was proposed, and the effects of the PSF on HEPs related with the soft controls were estimated through the methodology. For this estimation, more than 600 human error opportunities related to soft controls in a computerized control room were obtained through laboratory experiments. From the eight PSF surrogates and combinations of these variables, the procedure quality, practice level, and the operation type were identified as significant factors for screen switch and mode conversion errors. The contributions of these significant factors to HEPs were also estimated in terms of a multiplicative form. The usefulness and limitation of the experimental data and the techniques employed are discussed herein, and we believe that the logistic regression and stepwise variable selection methods will provide a way to estimate the effects of PSFs on HEPs in an objective manner. - Highlights: • It is necessary to develop an empirical basis for the effects of the PSFs on the HEPs. • A statistical method using a logistic regression and variable selection was proposed. • The effects of PSFs on the HEPs of soft controls were empirically investigated. • The significant factors were identified and their effects were estimated
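    The statistical machinery described above, logistic regression with stepwise variable selection and multiplicative PSF effects read off the fitted coefficients, can be sketched in a few lines. The example below uses synthetic data and a crude forward selection on p-values; the PSF coding, effect sizes, and 0.05 entry criterion are assumptions for illustration, not the study's settings.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 600
X = np.column_stack([
    rng.integers(0, 2, n),   # procedure quality (hypothetical 0/1 coding)
    rng.integers(0, 2, n),   # practice level
    rng.integers(0, 2, n),   # operation type
])
logit_p = -3.0 + 1.2 * X[:, 0] + 0.8 * X[:, 1] + 0.5 * X[:, 2]
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))     # simulated error / no-error outcomes

# Forward selection: repeatedly add the predictor with the smallest p-value below 0.05.
selected, remaining = [], list(range(X.shape[1]))
while remaining:
    pvals = {j: sm.Logit(y, sm.add_constant(X[:, selected + [j]])).fit(disp=0).pvalues[-1]
             for j in remaining}
    best = min(pvals, key=pvals.get)
    if pvals[best] >= 0.05:
        break
    selected.append(best)
    remaining.remove(best)

final = sm.Logit(y, sm.add_constant(X[:, selected])).fit(disp=0)
print("selected PSFs:", selected)
print("multiplicative effects on the odds of error:", np.exp(final.params))
```

    Exponentiating the fitted coefficients gives the multiplicative contribution of each retained PSF to the odds of error, which mirrors the multiplicative form reported in the record.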

  5. On finding the C in CBT: the challenges of applying gambling-related cognitive approaches to video-gaming.

    Science.gov (United States)

    Delfabbro, Paul; King, Daniel

    2015-03-01

    Many similarities have been drawn between the activities of gambling and video-gaming. Both are repetitive activities with intermittent reinforcement, decision-making opportunities, and elements of risk-taking. As a result, it might be tempting to believe that cognitive strategies that are used to treat problem gambling might also be applied to problematic video gaming. In this paper, we argue that many cognitive approaches to gambling that typically involve a focus on erroneous beliefs about probabilities and randomness are not readily applicable to video gaming. Instead, we encourage a focus on other clusters of cognitions that relate to: (a) the salience and over-valuing of gaming rewards, experiences, and identities, (b) maladaptive and inflexible rules about behaviour, (c) the use of video-gaming to maintain self-esteem, and (d) video-gaming for social status and recognition. This theoretical discussion is advanced as a starting point for the development of more refined cognitive treatment approaches for problematic video gaming.

  6. Classification by a neural network approach applied to non destructive testing

    International Nuclear Information System (INIS)

    Lefevre, M.; Preteux, F.; Lavayssiere, B.

    1995-01-01

    Radiography is used by EDF for pipe inspection in nuclear power plants in order to detect defects. The radiographs obtained are then digitized following a well-defined protocol. The aim of EDF is to develop a non-destructive testing system for recognizing defects. In this paper, we describe the procedure for recognizing areas with defects. We first present the digitization protocol, characterize the poor quality of the images under study, and propose a procedure to enhance defects. We then examine the problem raised by the choice of good features for classification. After having proved that statistical or standard textural features such as homogeneity, entropy or contrast are not relevant, we develop a geometrical-statistical approach based on the cooperation between a study of signal correlations and an analysis of regional extrema. The principle consists of analysing and comparing, for areas with and without defects, the evolution of conditional probability matrices for increasing neighborhood sizes, the shape of variograms, and the location of regional minima. We demonstrate that the anisotropy and surface of the series of 'comet tails' associated with probability matrices, the variogram slope and statistical indices, and the location of regional extrema are features able to discriminate areas with defects from areas without any. The classification is then performed by a neural network, whose structure, properties and learning mechanisms are detailed. Finally we discuss the results. (authors). 21 refs., 5 figs.

  7. A measurement error approach to assess the association between dietary diversity, nutrient intake, and mean probability of adequacy.

    Science.gov (United States)

    Joseph, Maria L; Carriquiry, Alicia

    2010-11-01

    Collection of dietary intake information requires time-consuming and expensive methods, making it inaccessible to many resource-poor countries. Quantifying the association between simple measures of usual dietary diversity and usual nutrient intake/adequacy would allow inferences to be made about the adequacy of micronutrient intake at the population level for a fraction of the cost. In this study, we used secondary data from a dietary intake study carried out in Bangladesh to assess the association between 3 food group diversity indicators (FGI) and calcium intake; and the association between these same 3 FGI and a composite measure of nutrient adequacy, mean probability of adequacy (MPA). By implementing Fuller's error-in-the-equation measurement error model (EEM) and simple linear regression (SLR) models, we assessed these associations while accounting for the error in the observed quantities. Significant associations were detected between usual FGI and usual calcium intakes, when the more complex EEM was used. The SLR model detected significant associations between FGI and MPA as well as for variations of these measures, including the best linear unbiased predictor. Through simulation, we support the use of the EEM. In contrast to the EEM, the SLR model does not account for the possible correlation between the measurement errors in the response and predictor. The EEM performs best when the model variables are not complex functions of other variables observed with error (e.g. MPA). When observation days are limited and poor estimates of the within-person variances are obtained, the SLR model tends to be more appropriate.

  8. Using extreme value theory approaches to forecast the probability of outbreak of highly pathogenic influenza in Zhejiang, China.

    Directory of Open Access Journals (Sweden)

    Jiangpeng Chen

    Influenza is a contagious disease with high transmissibility that spreads around the world with considerable morbidity and mortality, presenting an enormous burden on worldwide public health. Few mathematical models can be used because influenza incidence data are generally not normally distributed. We developed a mathematical model using Extreme Value Theory (EVT) to forecast the probability of outbreak of highly pathogenic influenza. The incidence data of highly pathogenic influenza in Zhejiang province from April 2009 to November 2013 were retrieved from the website of the Health and Family Planning Commission of Zhejiang Province. The MATLAB "VIEM" toolbox was used to analyze the data and for modelling. In the present work, we used the Peak Over Threshold (POT) model, assuming the frequency to be a Poisson process and the intensity to be Pareto distributed, to characterize the temporal variability of the long-term extreme incidence of highly pathogenic influenza in Zhejiang, China. The skewness and kurtosis of the incidence of highly pathogenic influenza in Zhejiang between April 2009 and November 2013 were 4.49 and 21.12, which indicated a "fat tail" distribution. A QQ plot and a mean excess plot were used to further validate the features of the distribution. After determining the threshold, we modeled the extremes and estimated the shape parameter and scale parameter by the maximum likelihood method. The results showed that months in which the incidence of highly pathogenic influenza is about 4462/2286/1311/487 are predicted to occur once every five/three/two/one years, respectively. Despite its simplicity, the present study successfully offers a sound modeling strategy and a methodological avenue to implement forecasting of an epidemic in the midst of its course.
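    The peaks-over-threshold machinery referred to above (a Poisson exceedance frequency combined with a generalized Pareto distribution for the excesses) can be reproduced in outline with standard tools. The sketch below uses synthetic monthly counts, not the Zhejiang data, and an arbitrary 80th-percentile threshold; it only illustrates how fitted POT parameters translate into return levels of the kind quoted in the record.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(5)
monthly_cases = rng.lognormal(mean=5.0, sigma=1.0, size=56)   # hypothetical monthly incidence

threshold = np.quantile(monthly_cases, 0.8)
excesses = monthly_cases[monthly_cases > threshold] - threshold
rate = excesses.size / monthly_cases.size                     # exceedances per month (Poisson rate)

shape, _, scale = genpareto.fit(excesses, floc=0.0)           # GPD fit to the excesses

def return_level(months):
    """Incidence exceeded on average once every `months` months."""
    p_excess = 1.0 / (months * rate)                          # required tail probability of an excess
    return threshold + genpareto.ppf(1.0 - p_excess, shape, loc=0.0, scale=scale)

for m in (12, 24, 36, 60):
    print(m, round(float(return_level(m)), 1))
```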

  9. An approach of optimal sensitivity applied in the tertiary loop of the automatic generation control

    Energy Technology Data Exchange (ETDEWEB)

    Belati, Edmarcio A. [CIMATEC - SENAI, Salvador, BA (Brazil); Alves, Dilson A. [Electrical Engineering Department, FEIS, UNESP - Sao Paulo State University (Brazil); da Costa, Geraldo R.M. [Electrical Engineering Department, EESC, USP - Sao Paulo University (Brazil)

    2008-09-15

    This paper proposes an approach of optimal sensitivity applied in the tertiary loop of the automatic generation control. The approach is based on the theorem of non-linear perturbation. From an optimal operation point obtained by an optimal power flow, a new optimal operation point is directly determined after a perturbation, i.e., without the necessity of an iterative process. This new optimal operation point satisfies the constraints of the problem for small perturbations in the loads. The participation factors and the voltage set points of the automatic voltage regulators (AVR) of the generators are determined by the technique of optimal sensitivity, considering the effects of active power loss minimization and the network constraints. The participation factors and voltage set points of the generators are supplied directly to a computational program for dynamic simulation of the automatic generation control, referred to as the power sensitivity mode. Test results are presented to show the good performance of this approach. (author)

  10. Effects produced by oscillations applied to nonlinear dynamic systems: a general approach and examples

    DEFF Research Database (Denmark)

    Blekhman, I. I.; Sorokin, V. S.

    2016-01-01

    A general approach to study effects produced by oscillations applied to nonlinear dynamic systems is developed. It implies a transition from initial governing equations of motion to much more simple equations describing only the main slow component of motions (the vibro-transformed dynamics equations). The approach is named as the oscillatory strobodynamics, since motions are perceived as under a stroboscopic light. The vibro-transformed dynamics equations comprise terms that capture the averaged effect of oscillations. The method of direct separation of motions appears to be an efficient … e.g., the requirement for the involved nonlinearities to be weak. The approach is illustrated by several relevant examples from various fields of science, e.g., mechanics, physics, chemistry and biophysics.

  11. DETERMINING TYPE Ia SUPERNOVA HOST GALAXY EXTINCTION PROBABILITIES AND A STATISTICAL APPROACH TO ESTIMATING THE ABSORPTION-TO-REDDENING RATIO R_V

    Energy Technology Data Exchange (ETDEWEB)

    Cikota, Aleksandar [European Southern Observatory, Karl-Schwarzschild-Strasse 2, D-85748 Garching b. München (Germany); Deustua, Susana [Space Telescope Science Institute, 3700 San Martin Drive, Baltimore, MD 21218 (United States); Marleau, Francine, E-mail: acikota@eso.org [Institute for Astro- and Particle Physics, University of Innsbruck, Technikerstrasse 25/8, A-6020 Innsbruck (Austria)

    2016-03-10

    We investigate limits on the extinction values of Type Ia supernovae (SNe Ia) to statistically determine the most probable color excess, E(B – V), with galactocentric distance, and use these statistics to determine the absorption-to-reddening ratio, R_V, for dust in the host galaxies. We determined pixel-based dust mass surface density maps for 59 galaxies from the Key Insight on Nearby Galaxies: a Far-infrared Survey with Herschel (KINGFISH). We use SN Ia spectral templates to develop a Monte Carlo simulation of color excess E(B – V) with R_V = 3.1 and investigate the color excess probabilities E(B – V) with projected radial galaxy center distance. Additionally, we tested our model using observed spectra of SN 1989B, SN 2002bo, and SN 2006X, which occurred in three KINGFISH galaxies. Finally, we determined the most probable reddening for Sa–Sap, Sab–Sbp, Sbc–Scp, Scd–Sdm, S0, and irregular galaxy classes as a function of R/R_25. We find that the largest expected reddening probabilities are in Sab–Sb and Sbc–Sc galaxies, while S0 and irregular galaxies are very dust poor. We present a new approach for determining the absorption-to-reddening ratio R_V using color excess probability functions and find values of R_V = 2.71 ± 1.58 for 21 SNe Ia observed in Sab–Sbp galaxies, and R_V = 1.70 ± 0.38 for 34 SNe Ia observed in Sbc–Scp galaxies.

  12. Addressing dependability by applying an approach for model-based risk assessment

    International Nuclear Information System (INIS)

    Gran, Bjorn Axel; Fredriksen, Rune; Thunem, Atoosa P.-J.

    2007-01-01

    This paper describes how an approach for model-based risk assessment (MBRA) can be applied for addressing different dependability factors in a critical application. Dependability factors, such as availability, reliability, safety and security, are important when assessing the dependability degree of total systems involving digital instrumentation and control (I and C) sub-systems. In order to identify risk sources, their roles with regard to intentional system aspects, such as system functions, component behaviours and intercommunications, must be clarified. Traditional risk assessment is based on fault or risk models of the system. In contrast to this, MBRA utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tried out within the telemedicine and e-commerce areas, and provided, through a series of seven trials, a sound basis for risk assessments. In this paper the results from the CORAS project are presented, and it is discussed how the approach for applying MBRA meets the needs of a risk-informed Man-Technology-Organization (MTO) model, and how the methodology can be applied as part of a trust case development.

  13. A new approach for structural health monitoring by applying anomaly detection on strain sensor data

    Science.gov (United States)

    Trichias, Konstantinos; Pijpers, Richard; Meeuwissen, Erik

    2014-03-01

    Structural Health Monitoring (SHM) systems help to monitor critical infrastructures (bridges, tunnels, etc.) remotely and provide up-to-date information about their physical condition. In addition, they help to predict the structure's life and required maintenance in a cost-efficient way. Typically, inspection data gives insight into the structural health. The global structural behavior, and predominantly the structural loading, is generally measured with vibration and strain sensors. Acoustic emission sensors are increasingly used for measuring global crack activity near critical locations. In this paper, we present a procedure for local structural health monitoring by applying Anomaly Detection (AD) on strain sensor data for sensors that are applied in the expected crack path. Sensor data is analyzed by automatic anomaly detection in order to find crack activity at an early stage. This approach targets the monitoring of critical structural locations, such as welds, near which strain sensors can be applied during construction, and/or locations with limited inspection possibilities during structural operation. We investigate several anomaly detection techniques to detect changes in statistical properties, indicating structural degradation. The most effective one is a novel polynomial fitting technique, which tracks slow changes in sensor data. Our approach has been tested on a representative test structure (bridge deck) in a lab environment, under constant and variable amplitude fatigue loading. In both cases, the evolving cracks at the monitored locations were successfully detected, autonomously, by our AD monitoring tool.
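
    The record does not spell out the detector, so the Python sketch below only illustrates one plausible polynomial-tracking scheme of the kind described: fit a low-order polynomial to successive windows of strain samples and flag windows whose fitted trend drifts away from the running baseline. All names, window sizes and thresholds are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def polynomial_anomaly_scores(strain, window=200, degree=2):
    """Slide a window over a strain signal, fit a polynomial to each window,
    and score each window by how far its fitted end value drifts from the
    median of previous fits. Purely illustrative; thresholds are assumptions."""
    scores, baseline = [], []
    t = np.arange(window)
    for start in range(0, len(strain) - window, window):
        seg = strain[start:start + window]
        coeffs = np.polyfit(t, seg, degree)         # low-order trend of the window
        trend_end = np.polyval(coeffs, window - 1)  # fitted value at window end
        score = abs(trend_end - np.median(baseline)) if baseline else 0.0
        scores.append(score)
        baseline.append(trend_end)
    return np.array(scores)

# Example with synthetic data: a slow drift (simulated crack growth) begins halfway.
rng = np.random.default_rng(0)
signal = rng.normal(0.0, 1.0, 10_000)
signal[5_000:] += np.linspace(0, 5, 5_000)          # slowly growing strain offset
scores = polynomial_anomaly_scores(signal)
print("suspect windows:", np.where(scores > 3 * np.median(scores[1:]))[0])
```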

  14. Addressing dependability by applying an approach for model-based risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjorn Axel [Institutt for energiteknikk, OECD Halden Reactor Project, NO-1751 Halden (Norway)]. E-mail: bjorn.axel.gran@hrp.no; Fredriksen, Rune [Institutt for energiteknikk, OECD Halden Reactor Project, NO-1751 Halden (Norway)]. E-mail: rune.fredriksen@hrp.no; Thunem, Atoosa P.-J. [Institutt for energiteknikk, OECD Halden Reactor Project, NO-1751 Halden (Norway)]. E-mail: atoosa.p-j.thunem@hrp.no

    2007-11-15

    This paper describes how an approach for model-based risk assessment (MBRA) can be applied for addressing different dependability factors in a critical application. Dependability factors, such as availability, reliability, safety and security, are important when assessing the dependability degree of total systems involving digital instrumentation and control (I and C) sub-systems. In order to identify risk sources, their roles with regard to intentional system aspects, such as system functions, component behaviours and intercommunications, must be clarified. Traditional risk assessment is based on fault or risk models of the system. In contrast to this, MBRA utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tried out within the telemedicine and e-commerce areas, and provided, through a series of seven trials, a sound basis for risk assessments. In this paper the results from the CORAS project are presented, and it is discussed how the approach for applying MBRA meets the needs of a risk-informed Man-Technology-Organization (MTO) model, and how the methodology can be applied as part of a trust case development.

  15. A Regional Guidebook for Applying the Hydrogeomorphic Approach to Assessing Wetland Functions of Depressional Wetlands in Peninsular Florida

    National Research Council Canada - National Science Library

    Noble, Chris

    2004-01-01

    The Hydrogeomorphic (HGM) Approach is a method for developing functional indices and the protocols used to apply these indices to the assessment of wetland functions at a site-specific scale. The HGM Approach was initially...

  16. Solution of the neutron point kinetics equations with temperature feedback effects applying the polynomial approach method

    Energy Technology Data Exchange (ETDEWEB)

    Tumelero, Fernanda, E-mail: fernanda.tumelero@yahoo.com.br [Universidade Federal do Rio Grande do Sul (UFRGS), Porto Alegre, RS (Brazil). Programa de Pos-Graduacao em Engenharia Mecanica; Petersen, Claudio Z.; Goncalves, Glenio A.; Lazzari, Luana, E-mail: claudiopeteren@yahoo.com.br, E-mail: gleniogoncalves@yahoo.com.br, E-mail: luana-lazzari@hotmail.com [Universidade Federal de Pelotas (DME/UFPEL), Capao do Leao, RS (Brazil). Instituto de Fisica e Matematica

    2015-07-01

    In this work, we present a solution of the Neutron Point Kinetics Equations with temperature feedback effects applying the Polynomial Approach Method. For the solution, we consider one and six groups of delayed neutron precursors with temperature feedback effects and constant reactivity. The main idea is to expand the neutron density, delayed neutron precursors and temperature as a power series, considering the reactivity as an arbitrary function of time in a relatively short time interval around an ordinary point. In the first interval one applies the initial conditions of the problem, and analytical continuation is used to determine the solutions of the next intervals. With the application of the Polynomial Approach Method it is possible to overcome the stiffness problem of the equations. In this way, one varies the time step size of the Polynomial Approach Method and performs an analysis of the precision and computational time. Moreover, we compare the method with different orders of approximation (linear, quadratic and cubic) of the power series. The neutron density and temperature obtained by numerical simulations with the linear approximation are compared with results in the literature. (author)
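
    As an illustration of the power-series idea (not the authors' code), the sketch below advances the point kinetics equations with one delayed group and constant reactivity by building Taylor coefficients recursively on each sub-interval and continuing analytically from interval to interval; temperature feedback is omitted to keep the example short, and all parameter values are illustrative.

```python
import numpy as np

# One-group point kinetics, constant reactivity, no temperature feedback.
# dn/dt = ((rho - beta)/Lambda) * n + lam * C
# dC/dt = (beta/Lambda) * n - lam * C
rho, beta, Lambda, lam = 0.003, 0.0065, 1.0e-4, 0.08  # illustrative values

def taylor_step(n0, c0, h, order=8):
    """Advance (n, C) by one step h using recursively generated Taylor coefficients."""
    a, b = (rho - beta) / Lambda, beta / Lambda
    n_k, c_k = [n0], [c0]
    for k in range(order):
        # the (k+1)-th coefficient follows from differentiating the series term by term
        n_k.append((a * n_k[k] + lam * c_k[k]) / (k + 1))
        c_k.append((b * n_k[k] - lam * c_k[k]) / (k + 1))
    n = sum(coef * h**k for k, coef in enumerate(n_k))
    c = sum(coef * h**k for k, coef in enumerate(c_k))
    return n, c

# Analytic continuation: the end values of one interval seed the next one.
n, c = 1.0, (beta / (Lambda * lam)) * 1.0   # equilibrium precursor concentration
for _ in range(1000):
    n, c = taylor_step(n, c, h=1.0e-3)
print(f"neutron density after 1 s: {n:.4f}")
```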

  17. Solution of the neutron point kinetics equations with temperature feedback effects applying the polynomial approach method

    International Nuclear Information System (INIS)

    Tumelero, Fernanda; Petersen, Claudio Z.; Goncalves, Glenio A.; Lazzari, Luana

    2015-01-01

    In this work, we present a solution of the Neutron Point Kinetics Equations with temperature feedback effects applying the Polynomial Approach Method. For the solution, we consider one and six groups of delayed neutron precursors with temperature feedback effects and constant reactivity. The main idea is to expand the neutron density, delayed neutron precursors and temperature as a power series, considering the reactivity as an arbitrary function of time in a relatively short time interval around an ordinary point. In the first interval one applies the initial conditions of the problem, and analytical continuation is used to determine the solutions of the next intervals. With the application of the Polynomial Approach Method it is possible to overcome the stiffness problem of the equations. In this way, one varies the time step size of the Polynomial Approach Method and performs an analysis of the precision and computational time. Moreover, we compare the method with different orders of approximation (linear, quadratic and cubic) of the power series. The neutron density and temperature obtained by numerical simulations with the linear approximation are compared with results in the literature. (author)

  18. Applying Adverse Outcome Pathways (AOPs) to support Integrated Approaches to Testing and Assessment (IATA).

    Science.gov (United States)

    Tollefsen, Knut Erik; Scholz, Stefan; Cronin, Mark T; Edwards, Stephen W; de Knecht, Joop; Crofton, Kevin; Garcia-Reyero, Natalia; Hartung, Thomas; Worth, Andrew; Patlewicz, Grace

    2014-12-01

    Chemical regulation is challenged by the large number of chemicals requiring assessment for potential human health and environmental impacts. Current approaches are too resource intensive in terms of time, money and animal use to evaluate all chemicals under development or already on the market. The need for timely and robust decision making demands that regulatory toxicity testing becomes more cost-effective and efficient. One way to realize this goal is by being more strategic in directing testing resources; focusing on chemicals of highest concern, limiting testing to the most probable hazards, or targeting the most vulnerable species. Hypothesis driven Integrated Approaches to Testing and Assessment (IATA) have been proposed as practical solutions to such strategic testing. In parallel, the development of the Adverse Outcome Pathway (AOP) framework, which provides information on the causal links between a molecular initiating event (MIE), intermediate key events (KEs) and an adverse outcome (AO) of regulatory concern, offers the biological context to facilitate development of IATA for regulatory decision making. This manuscript summarizes discussions at the Workshop entitled "Advancing AOPs for Integrated Toxicology and Regulatory Applications" with particular focus on the role AOPs play in informing the development of IATA for different regulatory purposes. Copyright © 2014 Elsevier Inc. All rights reserved.

  19. A multicriteria decision making approach applied to improving maintenance policies in healthcare organizations.

    Science.gov (United States)

    Carnero, María Carmen; Gómez, Andrés

    2016-04-23

    Healthcare organizations have far greater maintenance needs for their medical equipment than other organizations, as much of it is used directly with patients. However, the literature on asset management in healthcare organizations is very limited. The aim of this research is to provide more rational application of maintenance policies, leading to an increase in quality of care. This article describes a multicriteria decision-making approach which integrates Markov chains with the multicriteria Measuring Attractiveness by a Categorical Based Evaluation Technique (MACBETH), to facilitate the best choice of combination of maintenance policies by using the judgements of a multi-disciplinary decision group. The proposed approach takes into account the level of acceptance that a given alternative would have among professionals. It also takes into account criteria related to cost, quality of care and impact of care cover. This multicriteria approach is applied to four dialysis subsystems: patients infected with hepatitis C, infected with hepatitis B, acute and chronic; in all cases, the maintenance strategy obtained consists of applying corrective and preventive maintenance plus two reserve machines. The added value in decision-making practices from this research comes from: (i) integrating the use of Markov chains to obtain the alternatives to be assessed by a multicriteria methodology; (ii) proposing the use of MACBETH to make rational decisions on asset management in healthcare organizations; (iii) applying the multicriteria approach to select a set or combination of maintenance policies in four dialysis subsystems of a health care organization. In the multicriteria decision-making approach proposed, economic criteria have been used, related to the quality of care which is desired for patients (availability), and the acceptance that each alternative would have considering the maintenance and healthcare resources which exist in the organization, with the inclusion of a

  20. Applying the health action process approach (HAPA) to the choice of health products: An exploratory study

    DEFF Research Database (Denmark)

    Krutulyte, Rasa; Grunert, Klaus G.; Scholderer, Joachim

    This paper presents the results of a qualitative pilot study that aimed at uncovering Danish consumers' motives for choosing health food. Schwarzer's (1992) health action process approach (HAPA) was applied to understand the process by which people choose health products. The research focused...... on the role of behavioural intention predictors such as risk perception, outcome expectations and self-efficacy. The model has proved to be a useful framework for understanding consumers choosing health food and is relevant to the further study of dietary choice issues....

  1. Logic-probabilistic approach to the determination of the risk-sustainable strategy of the region and enterprises

    Directory of Open Access Journals (Sweden)

    Yu. M. Sokolinskaya

    2017-01-01

    Full Text Available The article describes a logical-probabilistic approach to the definition of crisis situations that threaten economic security. The onset of risk events of any kind entails two types of damage: direct economic and social damage, and indirect damage, which can be much more significant. The main strategic goal of ensuring economic security is the stable and most efficient functioning of the enterprise at present and ensuring its high potential for future development. The lack of a unified management mechanism due to systemic threats imposed on the economy as a whole has led to an exacerbation of the situation, primarily in the basic sectors (infrastructure, health, education, law enforcement). This, in turn, undermines the basis for the development and national security of the regions and the country as a whole, contributing to a decrease in the standard of living and quality of life of the population. Methods and techniques of risk and safety management require the use of analysis covering all types of possible threats, such as engineering, economic and social factors. They must also take into account not only the immediate but also the remote consequences of the decisions made. The choice of a strategy for responding to a risk situation relies on the results of a comprehensive risk assessment, additional analysis of the technological and economic potential of the Voronezh region and industrial enterprise, the projected external environment, the current legal and regulatory framework (taxes, inflation, increase in the number of competitors), marketing and other studies. The organization of a risk management system based on an integrated assessment of risks allows tracking and timely signaling of undesirable events in business activities.

  2. Improving the efficiency of a chemotherapy day unit: applying a business approach to oncology.

    Science.gov (United States)

    van Lent, Wineke A M; Goedbloed, N; van Harten, W H

    2009-03-01

    To improve the efficiency of a hospital-based chemotherapy day unit (CDU). The CDU was benchmarked with two other CDUs to identify their attainable performance levels for efficiency, and causes for differences. Furthermore, an in-depth analysis using a business approach, called lean thinking, was performed. An integrated set of interventions was implemented, among them a new planning system. The results were evaluated using pre- and post-measurements. We observed 24% growth of treatments and bed utilisation, a 12% increase of staff member productivity and an 81% reduction of overtime. The used method improved process design and led to increased efficiency and a more timely delivery of care. Thus, the business approaches, which were adapted for healthcare, were successfully applied. The method may serve as an example for other oncology settings with problems concerning waiting times, patient flow or lack of beds.

  3. Applying a synthetic approach to the resilience of Finnish reindeer herding as a changing livelihood

    Directory of Open Access Journals (Sweden)

    Simo Sarkki

    2016-12-01

    Full Text Available Reindeer herding is an emblematic livelihood for Northern Finland, culturally important for local people and valuable in tourism marketing. We examine the livelihood resilience of Finnish reindeer herding by narrowing the focus of general resilience on social-ecological systems (SESs) to a specific livelihood while also acknowledging wider contexts in which reindeer herding is embedded. The questions for specified resilience can be combined with the applied DPSIR approach (Drivers; Pressures: resilience to what; State: resilience of what; Impacts: resilience for whom; Responses: resilience by whom and how). This paper is based on a synthesis of the authors' extensive anthropological fieldwork on reindeer herding and other land uses in Northern Finland. Our objective is to synthesize various opportunities and challenges that underpin the resilience of reindeer herding as a viable livelihood. The DPSIR approach, applied here as a three-step procedure, helps focus the analysis on different components of SES and their dynamic interactions. First, various land use-related DPSIR factors and their relations (synergies and trade-offs) to reindeer herding are mapped. Second, detailed DPSIR factors underpinning the resilience of reindeer herding are identified. Third, examples of interrelations between DPSIR factors are explored, revealing the key dynamics between Pressures, State, Impacts, and Responses related to the livelihood resilience of reindeer herding. In the Discussion section, we recommend that future applications of the DPSIR approach in examining livelihood resilience should (1) address cumulative pressures, (2) consider the state dimension as more tuned toward the social side of SES, (3) assess both the negative and positive impacts of environmental change on the examined livelihood by a combination of science-led top-down and participatory bottom-up approaches, and (4) examine and propose governance solutions as well as local adaptations by

  4. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin

  5. Applying a social network analysis (SNA) approach to understanding radiologists' performance in reading mammograms

    Science.gov (United States)

    Tavakoli Taba, Seyedamir; Hossain, Liaquat; Heard, Robert; Brennan, Patrick; Lee, Warwick; Lewis, Sarah

    2017-03-01

    Rationale and objectives: Observer performance has been widely studied through examining the characteristics of individuals. Applying a systems perspective to understanding the system's output requires a study of the interactions between observers. This research explains a mixed methods approach to applying a social network analysis (SNA), together with a more traditional approach of examining personal/individual characteristics, in understanding observer performance in mammography. Materials and Methods: Using social networks theories and measures in order to understand observer performance, we designed a social networks survey instrument for collecting personal and network data about observers involved in mammography performance studies. We present the results of a study by our group where 31 Australian breast radiologists originally reviewed 60 mammographic cases (comprising 20 abnormal and 40 normal cases) and then completed an online questionnaire about their social networks and personal characteristics. A jackknife free response operating characteristic (JAFROC) method was used to measure the performance of the radiologists. JAFROC was tested against various personal and network measures to verify the theoretical model. Results: The results from this study suggest a strong association between social networks and observer performance for Australian radiologists. Network factors accounted for 48% of variance in observer performance, in comparison to 15.5% for the personal characteristics for this study group. Conclusion: This study suggests a strong new direction for research into improving observer performance. Future studies in observer performance should consider social networks' influence as part of their research paradigm, with equal or greater vigour than traditional constructs of personal characteristics.

  6. Applying theory-driven approaches to understanding and modifying clinicians' behavior: what do we know?

    Science.gov (United States)

    Perkins, Matthew B; Jensen, Peter S; Jaccard, James; Gollwitzer, Peter; Oettingen, Gabriele; Pappadopulos, Elizabeth; Hoagwood, Kimberly E

    2007-03-01

    Despite major recent research advances, large gaps exist between accepted mental health knowledge and clinicians' real-world practices. Although hundreds of studies have successfully utilized basic behavioral science theories to understand, predict, and change patients' health behaviors, the extent to which these theories, most notably the theory of reasoned action (TRA) and its extension, the theory of planned behavior (TPB), have been applied to understand and change clinician behavior is unclear. This article reviews the application of theory-driven approaches to understanding and changing clinician behaviors. MEDLINE and PsycINFO databases were searched, along with bibliographies, textbooks on health behavior or public health, and references from experts, to find article titles that describe theory-driven approaches (TRA or TPB) to understanding and modifying health professionals' behavior. A total of 19 articles that detailed 20 studies described the use of TRA or TPB and clinicians' behavior. Eight articles describe the use of TRA or TPB with physicians, four relate to nurses, three relate to pharmacists, and two relate to health workers. Only two articles applied TRA or TPB to mental health clinicians. The body of work shows that different constructs of TRA or TPB predict intentions and behavior among different groups of clinicians and for different behaviors and guidelines. The number of studies on this topic is extremely limited, but they offer a rationale and a direction for future research as well as a theoretical basis for increasing the specificity and efficiency of clinician-targeted interventions.

  7. Matrix-exponential distributions in applied probability

    CERN Document Server

    Bladt, Mogens

    2017-01-01

    This book contains an in-depth treatment of matrix-exponential (ME) distributions and their sub-class of phase-type (PH) distributions. Loosely speaking, an ME distribution is obtained through replacing the intensity parameter in an exponential distribution by a matrix. The ME distributions can also be identified as the class of non-negative distributions with rational Laplace transforms. If the matrix has the structure of a sub-intensity matrix for a Markov jump process we obtain a PH distribution which allows for nice probabilistic interpretations facilitating the derivation of exact solutions and closed form formulas. The full potential of ME and PH unfolds in their use in stochastic modelling. Several chapters on generic applications, like renewal theory, random walks and regenerative processes, are included together with some specific examples from queueing theory and insurance risk. We emphasize our intention towards applications by including an extensive treatment on statistical methods for PH distribu...

  8. Probabilistic approaches applied to damage and embrittlement of structural materials in nuclear power plants

    International Nuclear Information System (INIS)

    Vincent, L.

    2012-01-01

    The present study deals with the long-term mechanical behaviour and damage of structural materials in nuclear power plants. An experimental approach is first followed to study the thermal fatigue of austenitic stainless steels, with a focus on the effects of mean stress and bi-axiality. Furthermore, the measurement of displacement fields by Digital Image Correlation techniques has been successfully used to detect early crack initiation during high cycle fatigue tests. A probabilistic model based on the shielding zones surrounding existing cracks is proposed to describe the development of crack networks. A more numerical approach is then followed to study the embrittlement consequences of the irradiation hardening of the bainitic steel of which nuclear pressure vessels are made. A crystalline plasticity law, developed in agreement with lower scale results (Dislocation Dynamics), is introduced in a Finite Element code in order to run simulations on aggregates and obtain the distributions of the maximum principal stress inside a Representative Volume Element. These distributions are then used to improve the classical Local Approach to Fracture, which estimates the probability for a microstructural defect to be loaded up to a critical level. (author) [fr
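
    For context on the Local Approach to Fracture mentioned above, the following is a hedged sketch of the classical Beremin-type weakest-link formulation; the study's own variant, fed by crystal-plasticity stress distributions, is not reproduced here.

```latex
% Hedged sketch of a classical weakest-link (Beremin-type) local approach to
% fracture; the work above couples a variant of it to crystal-plasticity stress fields.
\begin{align*}
  \sigma_{w} &= \left( \int_{V_{p}} \sigma_{I}^{\,m}\,\frac{\mathrm{d}V}{V_{0}} \right)^{1/m},\\
  P_{f} &= 1 - \exp\!\left[-\left(\frac{\sigma_{w}}{\sigma_{u}}\right)^{m}\right],
\end{align*}
% with \sigma_I the maximum principal stress over the plastically deformed volume
% V_p, V_0 a reference volume, and m, \sigma_u material parameters. The simulated
% distributions of \sigma_I inside the Representative Volume Element feed the integral.
```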

  9. The Intensive Dysphagia Rehabilitation Approach Applied to Patients With Neurogenic Dysphagia: A Case Series Design Study.

    Science.gov (United States)

    Malandraki, Georgia A; Rajappa, Akila; Kantarcigil, Cagla; Wagner, Elise; Ivey, Chandra; Youse, Kathleen

    2016-04-01

    To examine the effects of the Intensive Dysphagia Rehabilitation approach on physiological and functional swallowing outcomes in adults with neurogenic dysphagia. Intervention study; before-after trial with 4-week follow-up through an online survey. Outpatient university clinics. A consecutive sample of subjects (N=10) recruited from outpatient university clinics. All subjects were diagnosed with adult-onset neurologic injury or disease. Dysphagia diagnosis was confirmed through clinical and endoscopic swallowing evaluations. No subjects withdrew from the study. Participants completed the 4-week Intensive Dysphagia Rehabilitation protocol, including 2 oropharyngeal exercise regimens, a targeted swallowing routine using salient stimuli, and caregiver participation. Treatment included hourly sessions twice per week and home practice for approximately 45 min/d. Outcome measures assessed pre- and posttreatment included airway safety using an 8-point Penetration Aspiration Scale, lingual isometric pressures, self-reported swallowing-related quality of life (QOL), and level of oral intake. Also, patients were monitored for adverse dysphagia-related effects. QOL and adverse effects were also assessed at the 4-week follow-up (online survey). The Intensive Dysphagia Rehabilitation approach was effective in improving maximum and mean Penetration Aspiration Scale scores. The approach was safe and improved physiological and some functional swallowing outcomes in our sample; however, further investigation is needed before it can be widely applied. Copyright © 2016 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  10. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew...... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving...
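
    The two-step structure described in the record is commonly written as a product of a geometric term and a causation factor; the following is a hedged sketch in generic notation, not taken from the report itself.

```latex
% Hedged sketch of the two-step structure described above (generic form).
\begin{equation*}
  N_{\mathrm{coll}} \;=\; P_{c}\, N_{G},
\end{equation*}
% N_G: number of geometric collision candidates, obtained by integrating the
% traffic densities, speeds and geometry of the crossing or merging ship streams;
% P_c: causation probability that the navigators fail to resolve an evolving
% critical situation, derived in the second step of the analysis.
```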

  11. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments...... that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still...

  12. The mapping approach in the path integral formalism applied to curve-crossing systems

    International Nuclear Information System (INIS)

    Novikov, Alexey; Kleinekathoefer, Ulrich; Schreiber, Michael

    2004-01-01

    The path integral formalism in a combined phase-space and coherent-state representation is applied to the problem of curve-crossing dynamics. The system of interest is described by two coupled one-dimensional harmonic potential energy surfaces interacting with a heat bath consisting of harmonic oscillators. The mapping approach is used to rewrite the Lagrangian function of the electronic part of the system. Using the Feynman-Vernon influence-functional method the bath is eliminated whereas the non-Gaussian part of the path integral is treated using the generating functional for the electronic trajectories. The dynamics of a Gaussian wave packet is analyzed along a one-dimensional reaction coordinate within a perturbative treatment for a small coordinate shift between the potential energy surfaces

  13. Software engineering techniques applied to agricultural systems an object-oriented and UML approach

    CERN Document Server

    Papajorgji, Petraq J

    2014-01-01

    Software Engineering Techniques Applied to Agricultural Systems presents cutting-edge software engineering techniques for designing and implementing better agricultural software systems based on the object-oriented paradigm and the Unified Modeling Language (UML). The focus is on the presentation of  rigorous step-by-step approaches for modeling flexible agricultural and environmental systems, starting with a conceptual diagram representing elements of the system and their relationships. Furthermore, diagrams such as sequential and collaboration diagrams are used to explain the dynamic and static aspects of the software system.    This second edition includes: a new chapter on Object Constraint Language (OCL), a new section dedicated to the Model-VIEW-Controller (MVC) design pattern, new chapters presenting details of two MDA-based tools – the Virtual Enterprise and Olivia Nova, and a new chapter with exercises on conceptual modeling.  It may be highly useful to undergraduate and graduate students as t...

  14. Pointing and the Evolution of Language: An Applied Evolutionary Epistemological Approach

    Directory of Open Access Journals (Sweden)

    Nathalie Gontier

    2013-07-01

    Full Text Available Numerous evolutionary linguists have indicated that human pointing behaviour might be associated with the evolution of language. At an ontogenetic level, and in normal individuals, pointing develops spontaneously and the onset of human pointing precedes as well as facilitates phases in speech and language development. Phylogenetically, pointing behaviour might have preceded and facilitated the evolutionary origin of both gestural and vocal language. In contrast to wild non-human primates, captive and human-reared nonhuman primates also demonstrate pointing behaviour. In this article, we analyse the debates on pointing and the role it might have played in language evolution from a meta-level. From within an Applied Evolutionary Epistemological approach, we examine how exactly we can determine whether pointing has been a unit, a level or a mechanism in language evolution.

  15. Positive Mathematical Programming Approaches – Recent Developments in Literature and Applied Modelling

    Directory of Open Access Journals (Sweden)

    Thomas Heckelei

    2012-05-01

    Full Text Available This paper reviews and discusses the more recent literature and application of Positive Mathematical Programming in the context of agricultural supply models. Specifically, advances in the empirical foundation of parameter specifications as well as the economic rationalisation of PMP models – both criticized in earlier reviews – are investigated. Moreover, the paper provides an overview on a larger set of models with regular/repeated policy application that apply variants of PMP. Results show that most applications today avoid arbitrary parameter specifications and rely on exogenous information on supply responses to calibrate model parameters. However, only few approaches use multiple observations to estimate parameters, which is likely due to the still considerable technical challenges associated with it. Equally, we found only limited reflection on the behavioral or technological assumptions that could rationalise the PMP model structure while still keeping the model’s advantages.

  16. Applying a learning design methodology in the flipped classroom approach – empowering teachers to reflect

    DEFF Research Database (Denmark)

    Triantafyllou, Evangelia; Kofoed, Lise; Purwins, Hendrik

    2016-01-01

    One of the recent developments in teaching that heavily relies on current technology is the “flipped classroom” approach. In a flipped classroom the traditional lecture and homework sessions are inverted. Students are provided with online material in order to gain necessary knowledge before class......, while class time is devoted to clarifications and application of this knowledge. The hypothesis is that there could be deep and creative discussions when teacher and students physically meet. This paper discusses how the learning design methodology can be applied to represent, share and guide educators...... and values of different stakeholders (i.e. institutions, educators, learners, and external agents), which influence the design and success of flipped classrooms. Moreover, it looks at the teaching cycle from a flipped instruction model perspective and adjusts it to cater for the reflection loops educators...

  17. The systems approach for applying artificial intelligence to space station automation (Invited Paper)

    Science.gov (United States)

    Grose, Vernon L.

    1985-12-01

    The progress of technology is marked by fragmentation -- dividing research and development into ever narrower fields of specialization. Ultimately, specialists know everything about nothing. And hope for integrating those slender slivers of specialty into a whole fades. Without an integrated, all-encompassing perspective, technology becomes applied in a lopsided and often inefficient manner. A decisionary model, developed and applied for NASA's Chief Engineer toward establishment of commercial space operations, can be adapted to the identification, evaluation, and selection of optimum application of artificial intelligence for space station automation -- restoring wholeness to a situation that is otherwise chaotic due to increasing subdivision of effort. Issues such as functional assignments for space station task, domain, and symptom modules can be resolved in a manner understood by all parties rather than just the person with assigned responsibility -- and ranked by overall significance to mission accomplishment. Ranking is based on the three basic parameters of cost, performance, and schedule. This approach has successfully integrated many diverse specialties in situations like worldwide terrorism control, coal mining safety, medical malpractice risk, grain elevator explosion prevention, offshore drilling hazards, and criminal justice resource allocation -- all of which would have otherwise been subject to "squeaky wheel" emphasis and support of decision-makers.

  18. Theoretical modeling of electroosmotic flow in soft microchannels: A variational approach applied to the rectangular geometry

    Science.gov (United States)

    Sadeghi, Arman

    2018-03-01

    Modeling of fluid flow in polyelectrolyte layer (PEL)-grafted microchannels is challenging due to their two-layer nature. Hence, the pertinent studies are limited only to circular and slit geometries for which matching the solutions for inside and outside the PEL is simple. In this paper, a simple variational-based approach is presented for the modeling of fully developed electroosmotic flow in PEL-grafted microchannels by which the whole fluidic area is considered as a single porous medium of variable properties. The model is capable of being applied to microchannels of a complex cross-sectional area. As an application of the method, it is applied to a rectangular microchannel of uniform PEL properties. It is shown that modeling a rectangular channel as a slit may lead to considerable overestimation of the mean velocity especially when both the PEL and electric double layer (EDL) are thick. It is also demonstrated that the mean velocity is an increasing function of the fixed charge density and PEL thickness and a decreasing function of the EDL thickness and PEL friction coefficient. The influence of the PEL thickness on the mean velocity, however, vanishes when both the PEL thickness and friction coefficient are sufficiently high.

  19. Next-Generation Mitogenomics: A Comparison of Approaches Applied to Caecilian Amphibian Phylogeny.

    Directory of Open Access Journals (Sweden)

    Simon T Maddock

    Full Text Available Mitochondrial genome (mitogenome) sequences are being generated with increasing speed due to the advances of next-generation sequencing (NGS) technology and associated analytical tools. However, detailed comparisons to explore the utility of alternative NGS approaches applied to the same taxa have not been undertaken. We compared a 'traditional' Sanger sequencing method with two NGS approaches (shotgun sequencing and non-indexed, multiplex amplicon sequencing) on four different sequencing platforms (Illumina's HiSeq and MiSeq, Roche's 454 GS FLX, and Life Technologies' Ion Torrent) to produce seven (near-) complete mitogenomes from six species that form a small radiation of caecilian amphibians from the Seychelles. The fastest, most accurate method of obtaining mitogenome sequences that we tested was direct sequencing of genomic DNA (shotgun sequencing) using the MiSeq platform. Bayesian inference and maximum likelihood analyses using seven different partitioning strategies were unable to resolve compellingly all phylogenetic relationships among the Seychelles caecilian species, indicating the need for additional data in this case.

  20. Next-Generation Mitogenomics: A Comparison of Approaches Applied to Caecilian Amphibian Phylogeny.

    Science.gov (United States)

    Maddock, Simon T; Briscoe, Andrew G; Wilkinson, Mark; Waeschenbach, Andrea; San Mauro, Diego; Day, Julia J; Littlewood, D Tim J; Foster, Peter G; Nussbaum, Ronald A; Gower, David J

    2016-01-01

    Mitochondrial genome (mitogenome) sequences are being generated with increasing speed due to the advances of next-generation sequencing (NGS) technology and associated analytical tools. However, detailed comparisons to explore the utility of alternative NGS approaches applied to the same taxa have not been undertaken. We compared a 'traditional' Sanger sequencing method with two NGS approaches (shotgun sequencing and non-indexed, multiplex amplicon sequencing) on four different sequencing platforms (Illumina's HiSeq and MiSeq, Roche's 454 GS FLX, and Life Technologies' Ion Torrent) to produce seven (near-) complete mitogenomes from six species that form a small radiation of caecilian amphibians from the Seychelles. The fastest, most accurate method of obtaining mitogenome sequences that we tested was direct sequencing of genomic DNA (shotgun sequencing) using the MiSeq platform. Bayesian inference and maximum likelihood analyses using seven different partitioning strategies were unable to resolve compellingly all phylogenetic relationships among the Seychelles caecilian species, indicating the need for additional data in this case.

  1. Fatigue damage approach applied to Li-ion batteries ageing characterization

    Energy Technology Data Exchange (ETDEWEB)

    Dudézert, C. [Renault, Technocentre, Guyancourt (France); Université Paris Sud/Université Paris-Saclay, ICMMO (UMR CNRS 8182), Orsay (France); CEA/LITEN, Grenoble (France); Reynier, Y. [CEA/LITEN, Grenoble (France); Duffault, J.-M. [Université Paris Sud/Université Paris-Saclay, ICMMO (UMR CNRS 8182), Orsay (France); Franger, S., E-mail: sylvain.franger@u-psud.fr [Université Paris Sud/Université Paris-Saclay, ICMMO (UMR CNRS 8182), Orsay (France)

    2016-11-15

    Reliability of energy storage devices is one of the foremost concerns in electric vehicle (EV) development. Battery ageing, i.e. the degradation of battery energy and power, depends mainly on time, on the environmental conditions and on the in-use solicitations endured by the storage system. In the case of EVs, the strong dependence of battery use on the car performance, the driving cycles, and the weather conditions makes battery life prediction an intricate issue. Mechanical physicists have developed a quick and exhaustive methodology to diagnose the reliability of complex structures enduring complex loads. This “fatigue” approach expresses the performance fading due to a complex load through the evolution corresponding to basic ones. Thus, a state of health variable named “damage” binds the load history and ageing. The battery ageing study described here consists in applying this mechanical approach to electrochemical systems by connecting the ageing factors with the evolution of the battery characteristics. In that way, a specific “fatigue” test protocol has been established. This experimental confrontation has led to distinguishing calendar from cycling ageing mechanisms.
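
    To make the "damage" idea concrete, the following is a hedged sketch of the canonical linear damage-accumulation rule from fatigue analysis; the electrochemical damage law actually developed in the paper may take a different form.

```latex
% Hedged sketch of the canonical fatigue bookkeeping that a "damage" variable
% generalizes (Palmgren-Miner linear accumulation); the battery-specific law may differ.
\begin{equation*}
  D \;=\; \sum_{i} \frac{n_{i}}{N_{i}}, \qquad \text{end of life as } D \to 1,
\end{equation*}
% n_i: cycles endured under load level i; N_i: cycles to failure if level i acted
% alone. For a battery, "load level" maps to current rate, depth of discharge and
% temperature, and "failure" to a chosen capacity- or power-fade threshold.
```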

  2. Fatigue damage approach applied to Li-ion batteries ageing characterization

    International Nuclear Information System (INIS)

    Dudézert, C.; Reynier, Y.; Duffault, J.-M.; Franger, S.

    2016-01-01

    Reliability of energy storage devices is one of the foremost concerns in electric vehicle (EV) development. Battery ageing, i.e. the degradation of battery energy and power, depends mainly on time, on the environmental conditions and on the in-use solicitations endured by the storage system. In the case of EVs, the strong dependence of battery use on the car performance, the driving cycles, and the weather conditions makes battery life prediction an intricate issue. Mechanical physicists have developed a quick and exhaustive methodology to diagnose the reliability of complex structures enduring complex loads. This “fatigue” approach expresses the performance fading due to a complex load through the evolution corresponding to basic ones. Thus, a state of health variable named “damage” binds the load history and ageing. The battery ageing study described here consists in applying this mechanical approach to electrochemical systems by connecting the ageing factors with the evolution of the battery characteristics. In that way, a specific “fatigue” test protocol has been established. This experimental confrontation has led to distinguishing calendar from cycling ageing mechanisms.

  3. Calculating the Probability of Returning a Loan with Binary Probability Models

    Directory of Open Access Journals (Sweden)

    Julian Vasilev

    2014-12-01

    Full Text Available The purpose of this article is to give a new approach to calculating the probability of returning a loan. Many factors affect the value of this probability. In this article, some influencing factors are examined by using statistical and econometric models. The main approach is concerned with applying probit and logit models in loan management institutions. A new aspect of credit risk analysis is given. Calculating the probability of returning a loan is a difficult task. We assume that specific data fields concerning the contract (month of signing, year of signing, given sum) and data fields concerning the borrower of the loan (month of birth, year of birth (age), gender, region where he/she lives) may be independent variables in a binary logistic model with the dependent variable “the probability of returning a loan”. It is shown that the month of signing a contract, the year of signing a contract, the gender and the age of the loan owner do not affect the probability of returning a loan. It is shown that the probability of returning a loan depends on the sum of the contract, the remoteness of the loan owner and the month of birth. The probability of returning a loan increases with the size of the given sum, decreases with the proximity of the customer, increases for people born at the beginning of the year and decreases for people born at the end of the year.
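
    As an illustration of the binary probability models discussed above, the Python sketch below fits a logit model on synthetic data whose fields loosely mimic those named in the record (contract sum, remoteness of the borrower, month of birth); the data, coefficients and column names are invented for illustration only and are not the article's dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic stand-in for the loan records described in the article.
rng = np.random.default_rng(42)
n = 2_000
df = pd.DataFrame({
    "sum_given":      rng.uniform(500, 20_000, n),   # contract sum
    "remoteness_km":  rng.uniform(0, 300, n),        # distance of borrower from lender
    "month_of_birth": rng.integers(1, 13, n),
})
# Assumed relationship loosely mirroring the article's findings: larger sums and
# more remote borrowers repay more often, later birth months less often.
logit_true = (-1.0 + 0.00008 * df.sum_given
              + 0.004 * df.remoteness_km
              - 0.05 * df.month_of_birth)
df["repaid"] = rng.random(n) < 1 / (1 + np.exp(-logit_true))

X = sm.add_constant(df[["sum_given", "remoteness_km", "month_of_birth"]])
model = sm.Logit(df["repaid"].astype(float), X).fit(disp=False)
print(model.summary())                    # sign of each coefficient = direction of effect
print("P(repaid) for the first record:", model.predict(X.iloc[[0]]).item())
```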

  4. A brief introduction to probability.

    Science.gov (United States)

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: back in the 1600s, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution", which is a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, which is the most relevant distribution applied in statistical analysis.
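
    As a small worked example of a probability distribution in practice, the sketch below computes a few normal-distribution probabilities; the mean and standard deviation are illustrative values, not taken from the paper.

```python
from scipy.stats import norm

# A probability distribution assigns probabilities to outcomes. For a continuous
# variable that is normally distributed with mean 120 and SD 15 (illustrative
# values, e.g. a lab measurement), probabilities come from the CDF:
mu, sigma = 120, 15
print("P(X <= 140)      =", norm.cdf(140, loc=mu, scale=sigma))
print("P(100 < X < 140) =", norm.cdf(140, mu, sigma) - norm.cdf(100, mu, sigma))
print("P(X > 150)       =", norm.sf(150, loc=mu, scale=sigma))  # survival function = 1 - CDF
```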

  5. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  6. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming paradigm that has been shown to be effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  7. Blood transfusion determines postoperative morbidity in pediatric cardiac surgery applying a comprehensive blood-sparing approach.

    Science.gov (United States)

    Redlin, Matthias; Kukucka, Marian; Boettcher, Wolfgang; Schoenfeld, Helge; Huebler, Michael; Kuppe, Hermann; Habazettl, Helmut

    2013-09-01

    Recently we suggested a comprehensive blood-sparing approach in pediatric cardiac surgery that resulted in no transfusion in 71 infants (25%), postoperative transfusion only in 68 (24%), and intraoperative transfusion in 149 (52%). We analyzed the effects of transfusion on postoperative morbidity and mortality in the same cohort of patients. The effect of transfusion on the length of mechanical ventilation and intensive care unit stay was assessed using Kaplan-Meier curves. To assess whether transfusion independently determined the length of mechanical ventilation and length of intensive care unit stay, a multivariate model was applied. Additionally, in the subgroup of transfused infants, the effect of the applied volume of packed red blood cells was assessed. The median length of mechanical ventilation was 11 hours (interquartile range, 9-18 hours), 33 hours (interquartile range, 18-80 hours), and 93 hours (interquartile range, 34-161 hours) in the no transfusion, postoperative transfusion only, and intraoperative transfusion groups, respectively. The corresponding lengths of intensive care unit stay had interquartile ranges of 1-2 days, 2-5 days (median, 3.5 days), and 3-9 days (median, 8 days; P < .00001). The multivariate hazard ratio for early extubation was 0.24 (95% confidence interval, 0.16-0.35) and 0.37 (95% confidence interval, 0.25-0.55) for the intraoperative transfusion and postoperative transfusion only groups, respectively (P < .00001). In addition, the cardiopulmonary time, body weight, need for reoperation, and hemoglobin during cardiopulmonary bypass affected the length of mechanical ventilation. Similar results were obtained for the length of intensive care unit stay. In the subgroup of transfused infants, the volume of packed red blood cells also independently affected both the length of mechanical ventilation and the length of intensive care unit stay. The incidence and volume of blood transfusion markedly affect postoperative morbidity in pediatric cardiac surgery. These
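
    As an illustration of the Kaplan-Meier analysis mentioned above (not the study's actual code or data), the Python sketch below estimates median time to extubation for three synthetic groups, assuming the lifelines package is available; all numbers are invented.

```python
import numpy as np
from lifelines import KaplanMeierFitter

# Synthetic stand-in for the study data: hours of mechanical ventilation per
# infant, with every infant eventually extubated (no censoring). Values invented.
rng = np.random.default_rng(1)
groups = {
    "no transfusion":             rng.exponential(15, 70),
    "postoperative transfusion":  rng.exponential(40, 68),
    "intraoperative transfusion": rng.exponential(95, 149),
}
for name, hours in groups.items():
    kmf = KaplanMeierFitter()
    kmf.fit(durations=hours, event_observed=np.ones_like(hours), label=name)
    print(f"{name}: median time to extubation ~= {kmf.median_survival_time_:.0f} h")
```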

  8. An approach to applying quality assurance to nuclear fuel waste disposal

    International Nuclear Information System (INIS)

    Cooper, R.B.; Abel, R.

    1996-12-01

    An approach to developing and applying a quality assurance program for a nuclear fuel waste disposal facility is described. The proposed program would be based on N286-series standards used for quality assurance programs in nuclear power plants, and would cover all aspects of work across all stages of the project, from initial feasibility studies to final closure of the vault. A quality assurance manual describing the overall quality assurance program and its elements would be prepared at the outset. Planning requirements of the quality assurance program would be addressed in a comprehensive plan for the project. Like the QA manual, this plan would be prepared at the outset of the project and updated at each stage. Particular attention would be given to incorporating the observational approach in procedures for underground engineering, where the ability to adapt designs and mining techniques to changing ground conditions would be essential. Quality verification requirements would be addressed through design reviews, peer reviews, inspections and surveillance, equipment calibration and laboratory analysis checks, and testing programs. Regular audits and program reviews would help to assess the state of implementation, degree of conformance to standards, and effectiveness of the quality assurance program. Audits would be particularly useful in assessing the quality systems of contractors and suppliers, and in verifying the completion of work at the end of stages. Since a nuclear fuel waste disposal project would span a period of about 90 years, a key function of the quality assurance program would be to ensure the continuity of knowledge and the transfer of experience from one stage to another. This would be achieved by maintaining a records management system throughout the life of the project, by ensuring that work procedures were documented and kept current with new technologies and practices, and by instituting training programs that made use of experience gained

  9. Novel approach of fragment-based lead discovery applied to renin inhibitors.

    Science.gov (United States)

    Tawada, Michiko; Suzuki, Shinkichi; Imaeda, Yasuhiro; Oki, Hideyuki; Snell, Gyorgy; Behnke, Craig A; Kondo, Mitsuyo; Tarui, Naoki; Tanaka, Toshimasa; Kuroita, Takanobu; Tomimoto, Masaki

    2016-11-15

    A novel approach was conducted for fragment-based lead discovery and applied to renin inhibitors. The biochemical screening of a fragment library against renin provided the hit fragment which showed a characteristic interaction pattern with the target protein. The hit fragment bound only to the S1, S3, and S3 SP (S3 subpocket) sites without any interactions with the catalytic aspartate residues (Asp32 and Asp215 (pepsin numbering)). Prior to making chemical modifications to the hit fragment, we first identified its essential binding sites by utilizing the hit fragment's substructures. Second, we created a new and smaller scaffold, which better occupied the identified essential S3 and S3 SP sites, by utilizing library synthesis with high-throughput chemistry. We then revisited the S1 site and efficiently explored a good building block attaching to the scaffold with library synthesis. In the library syntheses, the binding modes of each pivotal compound were determined and confirmed by X-ray crystallography and the library was strategically designed by structure-based computational approach not only to obtain a more active compound but also to obtain informative Structure Activity Relationship (SAR). As a result, we obtained a lead compound offering synthetic accessibility as well as the improved in vitro ADMET profiles. The fragments and compounds possessing a characteristic interaction pattern provided new structural insights into renin's active site and the potential to create a new generation of renin inhibitors. In addition, we demonstrated our FBDD strategy integrating highly sensitive biochemical assay, X-ray crystallography, and high-throughput synthesis and in silico library design aimed at fragment morphing at the initial stage was effective to elucidate a pocket profile and a promising lead compound. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. An Effective Risk Minimization Strategy Applied to an Outdoor Music Festival: A Multi-Agency Approach.

    Science.gov (United States)

    Luther, Matt; Gardiner, Fergus; Lenson, Shane; Caldicott, David; Harris, Ryan; Sabet, Ryan; Malloy, Mark; Perkins, Jo

    2018-04-01

    Specific Event Identifiers a. Event type: Outdoor music festival. b. Event onset date: December 3, 2016. c. Location of event: Regatta Point, Commonwealth Park. d. Geographical coordinates: Canberra, Australian Capital Territory (ACT), Australia (-35.289002, 149.131957, 600m). e. Dates and times of observation in latitude, longitude, and elevation: December 3, 2016, 11:00-23:00. f. Response type: Event medical support. Introduction: Young adult patrons are vulnerable to risk-taking behavior, including drug taking, at outdoor music festivals. Therefore, the aim of this field report is to discuss the on-site medical response during a music festival, and subsequently highlight observed strategies aimed at minimizing substance abuse harm. The observed outdoor music festival was held in Canberra (Australian Capital Territory [ACT], Australia) during the early summer of 2016, with an attendance of 23,008 patrons. First aid and on-site medical treatment data were gained from the relevant treatment area and service. The integrated first aid service provided support to 292 patients. Final analysis consisted of 286 patients' records, with 119 (41.6%) males and 167 (58.4%) females. Results from this report indicated that drug intoxication was an observed event issue, with 15 (5.1%) treated on site and 13 emergency department (ED) presentations, primarily related to trauma or medical conditions requiring further diagnostics. This report details an important public health need, which could be met by providing a coordinated approach, including a robust on-site medical service, accepting intrinsic risk-taking behavior. This may include on-site drug-checking, providing reliable information on drug content with associated education. Luther M, Gardiner F, Lenson S, Caldicott D, Harris R, Sabet R, Malloy M, Perkins J. An effective risk minimization strategy applied to an outdoor music festival: a multi-agency approach. Prehosp Disaster Med. 2018;33(2):220-224.

  11. A Monte Carlo approach applied to ultrasonic non-destructive testing

    Science.gov (United States)

    Mosca, I.; Bilgili, F.; Meier, T.; Sigloch, K.

    2012-04-01

    Non-destructive testing based on ultrasound allows us to detect, characterize and size discrete flaws in geotechnical and architectural structures and materials. This information is needed to determine whether such flaws can be tolerated in future service. In typical ultrasonic experiments, only the first-arriving P-wave is interpreted, and the remainder of the recorded waveform is neglected. Our work aims at understanding surface waves, which are strong signals in the later wave train, with the ultimate goal of full waveform tomography. At present, even the structural estimation of layered media is still challenging because material properties of the samples can vary widely, and good initial models for inversion do not often exist. The aim of the present study is to combine non-destructive testing with a theoretical data analysis and hence to contribute to conservation strategies of archaeological and architectural structures. We analyze ultrasonic waveforms measured at the surface of a variety of samples, and define the behaviour of surface waves in structures of increasing complexity. The tremendous potential of ultrasonic surface waves becomes an advantage only if numerical forward modelling tools are available to describe the waveforms accurately. We compute synthetic full seismograms as well as group and phase velocities for the data. We invert them for the elastic properties of the sample via a global search of the parameter space, using the Neighbourhood Algorithm. Such a Monte Carlo approach allows us to perform a complete uncertainty and resolution analysis, but the computational cost is high and increases quickly with the number of model parameters. Therefore it is practical only for defining the seismic properties of media with a limited number of degrees of freedom, such as layered structures. We have applied this approach to both synthetic layered structures and real samples. The former contributed to benchmark the propagation of ultrasonic surface
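
    As an illustration of the global-search idea described in this record, the following minimal sketch (not the authors' code) performs a plain Monte Carlo exploration of a layered-model parameter space. It assumes a user-supplied forward function forward(model, freqs) that predicts group velocities for a candidate model; the study itself used the Neighbourhood Algorithm and full synthetic seismograms.

```python
import numpy as np

def misfit(model, freqs, observed, forward):
    """L2 misfit between observed and predicted group velocities."""
    predicted = forward(model, freqs)
    return np.sum((observed - predicted) ** 2)

def monte_carlo_search(bounds, freqs, observed, forward, n_samples=10_000, seed=0):
    """Uniform random search of the parameter space; a simplified stand-in
    for the Neighbourhood Algorithm used in the study. `bounds` is a list
    of (low, high) pairs, one per model parameter."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    best_model, best_cost = None, np.inf
    models, costs = [], []
    for _ in range(n_samples):
        m = rng.uniform(lo, hi)
        c = misfit(m, freqs, observed, forward)
        models.append(m)
        costs.append(c)
        if c < best_cost:
            best_model, best_cost = m, c
    # The whole ensemble (models, costs) supports the uncertainty and
    # resolution analysis mentioned in the abstract, not just the best model.
    return best_model, np.array(models), np.array(costs)
```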

  12. VIPAR, a quantitative approach to 3D histopathology applied to lymphatic malformations.

    Science.gov (United States)

    Hägerling, René; Drees, Dominik; Scherzinger, Aaron; Dierkes, Cathrin; Martin-Almedina, Silvia; Butz, Stefan; Gordon, Kristiana; Schäfers, Michael; Hinrichs, Klaus; Ostergaard, Pia; Vestweber, Dietmar; Goerge, Tobias; Mansour, Sahar; Jiang, Xiaoyi; Mortimer, Peter S; Kiefer, Friedemann

    2017-08-17

    Lack of investigatory and diagnostic tools has been a major contributing factor to the failure to mechanistically understand lymphedema and other lymphatic disorders in order to develop effective drug and surgical therapies. One difficulty has been understanding the true changes in lymph vessel pathology from standard 2D tissue sections. VIPAR (volume information-based histopathological analysis by 3D reconstruction and data extraction), a light-sheet microscopy-based approach for the analysis of tissue biopsies, is based on digital reconstruction and visualization of microscopic image stacks. VIPAR allows semiautomated segmentation of the vasculature and subsequent nonbiased extraction of characteristic vessel shape and connectivity parameters. We applied VIPAR to analyze biopsies from healthy, lymphedematous, and lymphangiomatous skin. Digital 3D reconstruction provided a directly visually interpretable, comprehensive representation of the lymphatic and blood vessels in the analyzed tissue volumes. The most conspicuous features were disrupted lymphatic vessels in lymphedematous skin and a hyperplasia (4.36-fold lymphatic vessel volume increase) in the lymphangiomatous skin. Both abnormalities were detected by the connectivity analysis based on extracted vessel shape and structure data. The quantitative evaluation of extracted data revealed a significant reduction of lymphatic segment length (51.3% and 54.2%) and straightness (89.2% and 83.7%) for lymphedematous and lymphangiomatous skin, respectively. Blood vessel length was significantly increased in the lymphangiomatous sample (239.3%). VIPAR is a volume-based tissue reconstruction, data extraction, and analysis approach that successfully distinguished healthy from lymphedematous and lymphangiomatous skin. Its application is not limited to the vascular systems or skin. Max Planck Society, DFG (SFB 656), and Cells-in-Motion Cluster of Excellence EXC 1003.
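
    The segment length and straightness figures quoted above suggest simple per-segment metrics computed on extracted vessel centrelines. The sketch below is only an illustration of how such metrics could be computed; the exact definitions used by VIPAR are not given in this record, so straightness is assumed here to be end-to-end distance divided by path length.

```python
import numpy as np

def segment_metrics(points):
    """Length and straightness of a vessel centreline given as an (N, 3)
    array of ordered 3D points. Straightness is taken here as end-to-end
    distance divided by path length (an assumption; VIPAR may use a
    different definition)."""
    points = np.asarray(points, dtype=float)
    steps = np.diff(points, axis=0)
    path_length = np.linalg.norm(steps, axis=1).sum()
    end_to_end = np.linalg.norm(points[-1] - points[0])
    straightness = end_to_end / path_length if path_length > 0 else 1.0
    return path_length, straightness

# Example: a gently curved vessel segment (hypothetical coordinates)
pts = [[0, 0, 0], [1, 0.2, 0], [2, 0.1, 0], [3, 0, 0]]
length, straightness = segment_metrics(pts)
```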

  13. A novel bi-level meta-analysis approach: applied to biological pathway analysis.

    Science.gov (United States)

    Nguyen, Tin; Tagett, Rebecca; Donato, Michele; Mitrea, Cristina; Draghici, Sorin

    2016-02-01

    The accumulation of high-throughput data in public repositories creates a pressing need for integrative analysis of multiple datasets from independent experiments. However, study heterogeneity, study bias, outliers and the lack of power of available methods present a real challenge in integrating genomic data. One practical drawback of many P-value-based meta-analysis methods, including Fisher's, Stouffer's, minP and maxP, is that they are sensitive to outliers. Another drawback is that, because they perform just one statistical test for each individual experiment, they may not fully exploit the potentially large number of samples within each study. We propose a novel bi-level meta-analysis approach that employs the additive method and the Central Limit Theorem within each individual experiment and also across multiple experiments. We prove that the bi-level framework is robust against bias, less sensitive to outliers than other methods, and more sensitive to small changes in signal. For comparative analysis, we demonstrate that the intra-experiment analysis has more power than the equivalent statistical test performed on a single large experiment. For pathway analysis, we compare the proposed framework versus classical meta-analysis approaches (Fisher's, Stouffer's and the additive method) as well as against a dedicated pathway meta-analysis package (MetaPath), using 1252 samples from 21 datasets related to three human diseases, acute myeloid leukemia (9 datasets), type II diabetes (5 datasets) and Alzheimer's disease (7 datasets). Our framework outperforms its competitors in correctly identifying pathways relevant to the phenotypes. The framework is sufficiently general to be applied to any type of statistical meta-analysis. The R scripts are available on demand from the authors. sorin@wayne.edu Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e
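
    A minimal sketch of the combination rule named above (the additive method with a Central Limit Theorem approximation), applied first within each study and then across studies. This only illustrates the principle under an independence assumption; it is not the authors' full pathway-analysis pipeline.

```python
import numpy as np
from scipy.stats import norm

def combine_pvalues_additive(pvalues):
    """Additive combination of independent p-values: under the null the sum
    of n Uniform(0,1) p-values has mean n/2 and variance n/12, so the
    Central Limit Theorem gives a normal approximation for the sum."""
    p = np.asarray(pvalues, dtype=float)
    n = p.size
    z = (p.sum() - n / 2.0) / np.sqrt(n / 12.0)
    return norm.cdf(z)  # small when the individual p-values are small

def bilevel_pvalue(per_study_pvalues):
    """Bi-level sketch: combine within each study, then across studies."""
    intra = [combine_pvalues_additive(p) for p in per_study_pvalues]
    return combine_pvalues_additive(intra)

# e.g. three studies, each contributing several (hypothetical) p-values
studies = [[0.01, 0.04, 0.20], [0.03, 0.08], [0.50, 0.02, 0.10, 0.07]]
print(bilevel_pvalue(studies))
```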

  14. A strategy to apply a graded approach to a new research reactor I and C design

    International Nuclear Information System (INIS)

    Suh, Yong Suk; Park, Jae Kwan; Kim, Taek Kyu; Bae, Sang Hoon; Baang, Dane; Kim, Young Ki

    2012-01-01

    A project for the development of a new research reactor (NRR) was launched by KAERI in 2012. It has two purposes: 1) providing a facility for radioisotope production, neutron transmutation doping, and semiconductor wafer doping, and 2) obtaining a standard model for exporting a research reactor (RR). The instrumentation and control (I and C) design should reveal an appropriate architecture for the NRR export. The adoption of a graded approach (GA) was taken into account to design the I and C and architecture. Although the GA for RRs is currently under development by the IAEA, it has been recommended and applied in many areas of nuclear facilities. The Canadian Nuclear Safety Commission allows for the use of a GA for RRs to meet the safety requirements. Germany applied the GA to a decommissioning project. It categorized the level of complexity of the decommissioning project using the GA. In the case of 10 C.F.R. Part 830 830.7, a contractor must use a GA to implement the requirements of the part, document the basis of the GA used, and submit that document to U.S. DOE. It mentions that a challenge is the inconsistent application of GA on DOE programs. RG 1.176 states that graded quality assurance brings benefits of resource allocation based on the safety significance of the items. The U.S. NRC also applied the GA to decommissioning small facilities. The NASA published a handbook for risk informed decision making that is conducted using a GA. ISATR67.04.09 2005 supplements ANSI/ISA.S67.04.01. 2000 and ISA RP67.04.02 2000 in determining the setpoint using a GA. The GA is defined as a risk informed approach that, without compromising safety, allows safety requirements to be implemented in such a way that the level of design, analysis, and documentation are commensurate with the potential risks of the reactor. The IAEA is developing a GA through DS351 and has recommended applying it to a reactor design according to power and hazarding level. Owing to the wide range of RR

  15. A strategy to apply a graded approach to a new research reactor I and C design

    Energy Technology Data Exchange (ETDEWEB)

    Suh, Yong Suk; Park, Jae Kwan; Kim, Taek Kyu; Bae, Sang Hoon; Baang, Dane; Kim, Young Ki [KAERI, Daejeon (Korea, Republic of)

    2012-10-15

    A project for the development of a new research reactor (NRR) was launched by KAERI in 2012. It has two purposes: 1) providing a facility for radioisotope production, neutron transmutation doping, and semiconductor wafer doping, and 2) obtaining a standard model for exporting a research reactor (RR). The instrumentation and control (I and C) design should reveal an appropriate architecture for the NRR export. The adoption of a graded approach (GA) was taken into account to design the I and C and architecture. Although the GA for RRs is currently under development by the IAEA, it has been recommended and applied in many areas of nuclear facilities. The Canadian Nuclear Safety Commission allows for the use of a GA for RRs to meet the safety requirements. Germany applied the GA to a decommissioning project. It categorized the level of complexity of the decommissioning project using the GA. In the case of 10 C.F.R. Part 830 830.7, a contractor must use a GA to implement the requirements of the part, document the basis of the GA used, and submit that document to U.S. DOE. It mentions that a challenge is the inconsistent application of GA on DOE programs. RG 1.176 states that graded quality assurance brings benefits of resource allocation based on the safety significance of the items. The U.S. NRC also applied the GA to decommissioning small facilities. The NASA published a handbook for risk informed decision making that is conducted using a GA. ISATR67.04.09 2005 supplements ANSI/ISA.S67.04.01. 2000 and ISA RP67.04.02 2000 in determining the setpoint using a GA. The GA is defined as a risk informed approach that, without compromising safety, allows safety requirements to be implemented in such a way that the level of design, analysis, and documentation are commensurate with the potential risks of the reactor. The IAEA is developing a GA through DS351 and has recommended applying it to a reactor design according to power and hazarding level. Owing to the wide range of RR

  16. Lactic Acid Bacteria Selection for Biopreservation as a Part of Hurdle Technology Approach Applied on Seafood

    Directory of Open Access Journals (Sweden)

    Norman Wiernasz

    2017-05-01

    Full Text Available As fragile food commodities, microbial and organoleptic qualities of fishery and seafood can quickly deteriorate. In this context, microbial quality and security improvement during the whole food processing chain (from catch to plate), using hurdle technology, a combination of mild preserving technologies such as biopreservation, modified atmosphere packaging, and superchilling, are of great interest. As natural flora and antimicrobial metabolites producers, lactic acid bacteria (LAB) are commonly studied for food biopreservation. Thirty-five LAB known to possess interesting antimicrobial activity were selected for their potential application as bioprotective agents as a part of hurdle technology applied to fishery products. The selection approach was based on seven criteria including antimicrobial activity, alteration potential, tolerance to chitosan coating and superchilling process, cross inhibition, biogenic amines production (histamine, tyramine), and antibiotics resistance. Antimicrobial activity was assessed against six common spoiling bacteria in fishery products (Shewanella baltica, Photobacterium phosphoreum, Brochothrix thermosphacta, Lactobacillus sakei, Hafnia alvei, Serratia proteamaculans) and one pathogenic bacterium (Listeria monocytogenes) in co-culture inhibitory assays miniaturized in 96-well microtiter plates. Antimicrobial activity and spoilage evaluation, both performed in cod and salmon juice, highlighted the existence of sensory signatures and inhibition profiles, which seem to be species related. Finally, six LAB with no unusual antibiotics resistance profile nor histamine production ability were selected as bioprotective agents for further in situ inhibitory assays in cod and salmon based products, alone or in combination with other hurdles (chitosan, modified atmosphere packaging, and superchilling).

  17. Extraction of thermal Green's function using diffuse fields: a passive approach applied to thermography

    Science.gov (United States)

    Capriotti, Margherita; Sternini, Simone; Lanza di Scalea, Francesco; Mariani, Stefano

    2016-04-01

    In the field of non-destructive evaluation, defect detection and visualization can be performed exploiting different techniques relying either on an active or a passive approach. In the following paper the passive technique is investigated due to its numerous advantages and its application to thermography is explored. In previous works, it has been shown that it is possible to reconstruct the Green's function between any pair of points of a sensing grid by using noise originated from diffuse fields in acoustic environments. The extraction of the Green's function can be achieved by cross-correlating these random recorded waves. Averaging, filtering and length of the measured signals play an important role in this process. This concept is here applied in an NDE perspective utilizing thermal fluctuations present on structural materials. Temperature variations interacting with thermal properties of the specimen allow for the characterization of the material and its health condition. The exploitation of the thermographic image resolution as a dense grid of sensors constitutes the basic idea underlying passive thermography. Particular attention will be placed on the creation of a proper diffuse thermal field, studying the number, placement and excitation signal of heat sources. Results from numerical simulations will be presented to assess the capabilities and performances of the passive thermal technique devoted to defect detection and imaging of structural components.
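
    A hedged sketch of the core cross-correlation step described above: two diffuse-field recordings (for example, the time histories of two pixels in a thermographic image stack) are split into segments, cross-correlated, and averaged. The filtering and segment-length choices that the abstract notes are important are omitted here.

```python
import numpy as np

def empirical_greens_function(rec_a, rec_b, n_segments=50):
    """Estimate the empirical Green's function between two sensing points
    by averaging cross-correlations of segments of two diffuse-field
    recordings (1-D arrays of equal length)."""
    rec_a = np.asarray(rec_a, dtype=float)
    rec_b = np.asarray(rec_b, dtype=float)
    seg_len = len(rec_a) // n_segments
    acc = np.zeros(2 * seg_len - 1)
    for k in range(n_segments):
        a = rec_a[k * seg_len:(k + 1) * seg_len]
        b = rec_b[k * seg_len:(k + 1) * seg_len]
        a = a - a.mean()
        b = b - b.mean()
        acc += np.correlate(a, b, mode="full")
    # Lags run from -(seg_len - 1) to +(seg_len - 1) samples.
    return acc / n_segments
```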

  18. Applied tagmemics: A heuristic approach to the use of graphic aids in technical writing

    Science.gov (United States)

    Brownlee, P. P.; Kirtz, M. K.

    1981-01-01

    In technical report writing, two needs which must be met if reports are to be usable by an audience are the language needs and the technical needs of that particular audience. A heuristic analysis helps to decide the most suitable format for information; that is, whether the information should be presented verbally or visually. The report writing process should be seen as an organic whole which can be divided and subdivided according to the writer's purpose, but which always functions as a totality. The tagmemic heuristic, because it itself follows a process of deconstructing and reconstructing information, lends itself to being a useful approach to the teaching of technical writing. By applying the abstract questions this heuristic asks to specific parts of the report, the language and technical needs of the audience are analyzed by examining the viability of the solution within the givens of the corporate structure, and by deciding which graphic or verbal format will best suit the writer's purpose. By following such a method, answers which are both specific and thorough in their range of application are found.

  19. An explorative chemometric approach applied to hyperspectral images for the study of illuminated manuscripts

    Science.gov (United States)

    Catelli, Emilio; Randeberg, Lise Lyngsnes; Alsberg, Bjørn Kåre; Gebremariam, Kidane Fanta; Bracci, Silvano

    2017-04-01

    Hyperspectral imaging (HSI) is a fast non-invasive imaging technology recently applied in the field of art conservation. With the help of chemometrics, important information about the spectral properties and spatial distribution of pigments can be extracted from HSI data. With the intent of expanding the applications of chemometrics to the interpretation of hyperspectral images of historical documents, and, at the same time, to study the colorants and their spatial distribution on ancient illuminated manuscripts, an explorative chemometric approach is here presented. The method makes use of chemometric tools for spectral de-noising (minimum noise fraction (MNF)) and image analysis (multivariate image analysis (MIA) and iterative key set factor analysis (IKSFA)/spectral angle mapper (SAM)) which have given an efficient separation, classification and mapping of colorants from visible-near-infrared (VNIR) hyperspectral images of an ancient illuminated fragment. The identification of colorants was achieved by extracting and interpreting the VNIR spectra as well as by using a portable X-ray fluorescence (XRF) spectrometer.
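
    Of the chemometric tools listed above, the spectral angle mapper (SAM) step has a compact standard form. The sketch below shows only that step, assuming reference spectra have already been selected; the MNF de-noising, MIA, and IKSFA stages are not reproduced.

```python
import numpy as np

def spectral_angle_map(cube, references):
    """Spectral Angle Mapper: assign each pixel of a hyperspectral cube
    (rows, cols, bands) to the reference spectrum (n_refs, bands) with the
    smallest spectral angle."""
    h, w, b = cube.shape
    pixels = cube.reshape(-1, b).astype(float)
    refs = np.asarray(references, dtype=float)
    # Cosine of the angle between every pixel and every reference spectrum.
    num = pixels @ refs.T
    denom = (np.linalg.norm(pixels, axis=1, keepdims=True) *
             np.linalg.norm(refs, axis=1)[None, :])
    angles = np.arccos(np.clip(num / denom, -1.0, 1.0))
    # Index of the best-matching reference, reshaped back to image form.
    return angles.argmin(axis=1).reshape(h, w)
```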

  20. A novel approach to enhance food safety: industry-academia-government partnership for applied research.

    Science.gov (United States)

    Osterholm, Michael T; Ostrowsky, Julie; Farrar, Jeff A; Gravani, Robert B; Tauxe, Robert V; Buchanan, Robert L; Hedberg, Craig W

    2009-07-01

    An independent collaborative approach was developed for stimulating research on high-priority food safety issues. The Fresh Express Produce Safety Research Initiative was launched in 2007 with $2 million in unrestricted funds from industry and independent direction and oversight from a scientific advisory panel consisting of nationally recognized food safety experts from academia and government agencies. The program had two main objectives: (i) to fund rigorous, innovative, and multidisciplinary research addressing the safety of lettuce, spinach, and other leafy greens and (ii) to share research findings as widely and quickly as possible to support the development of advanced safeguards within the fresh-cut produce industry. Sixty-five proposals were submitted in response to a publicly announced request for proposals and were competitively evaluated. Nine research projects were funded to examine underlying factors involved in Escherichia coli O157:H7 contamination of lettuce, spinach, and other leafy greens and potential strategies for preventing the spread of foodborne pathogens. Results of the studies, published in the Journal of Food Protection, help to identify promising directions for future research into potential sources and entry points of contamination and specific factors associated with harvesting, processing, transporting, and storing produce that allow contaminants to persist and proliferate. The program provides a model for leveraging the strengths of industry, academia, and government to address high-priority issues quickly and directly through applied research. This model can be productively extended to other pathogens and other leafy and nonleafy produce.

  1. Distribution function approach to redshift space distortions. Part V: perturbation theory applied to dark matter halos

    Energy Technology Data Exchange (ETDEWEB)

    Vlah, Zvonimir; Seljak, Uroš [Institute for Theoretical Physics, University of Zürich, Zürich (Switzerland); Okumura, Teppei [Institute for the Early Universe, Ewha Womans University, Seoul, S. Korea (Korea, Republic of); Desjacques, Vincent, E-mail: zvlah@physik.uzh.ch, E-mail: seljak@physik.uzh.ch, E-mail: teppei@ewha.ac.kr, E-mail: Vincent.Desjacques@unige.ch [Département de Physique Théorique and Center for Astroparticle Physics (CAP) Université de Genéve, Genéve (Switzerland)

    2013-10-01

    Numerical simulations show that redshift space distortions (RSD) introduce strong scale dependence in the power spectra of halos, with ten percent deviations relative to linear theory predictions even on relatively large scales (k < 0.1h/Mpc) and even in the absence of satellites (which induce Fingers-of-God, FoG, effects). If unmodeled these effects prevent one from extracting cosmological information from RSD surveys. In this paper we use Eulerian perturbation theory (PT) and Eulerian halo biasing model and apply it to the distribution function approach to RSD, in which RSD is decomposed into several correlators of density weighted velocity moments. We model each of these correlators using PT and compare the results to simulations over a wide range of halo masses and redshifts. We find that with an introduction of a physically motivated halo biasing, and using dark matter power spectra from simulations, we can reproduce the simulation results at a percent level on scales up to k ∼ 0.15h/Mpc at z = 0, without the need to have free FoG parameters in the model.

  2. Applying the reasoned action approach to understanding health protection and health risk behaviors.

    Science.gov (United States)

    Conner, Mark; McEachan, Rosemary; Lawton, Rebecca; Gardner, Peter

    2017-12-01

    The Reasoned Action Approach (RAA) developed out of the Theory of Reasoned Action and Theory of Planned Behavior but has not yet been widely applied to understanding health behaviors. The present research employed the RAA in a prospective design to test predictions of intention and action for groups of protection and risk behaviors separately in the same sample. To test the RAA for health protection and risk behaviors. Measures of RAA components plus past behavior were taken in relation to eight protection and six risk behaviors in 385 adults. Self-reported behavior was assessed one month later. Multi-level modelling showed instrumental attitude, experiential attitude, descriptive norms, capacity and past behavior were significant positive predictors of intentions to engage in protection or risk behaviors. Injunctive norms were only significant predictors of intention in protection behaviors. Autonomy was a significant positive predictor of intentions in protection behaviors and a negative predictor in risk behaviors (the latter relationship became non-significant when controlling for past behavior). Multi-level modelling showed that intention, capacity, and past behavior were significant positive predictors of action for both protection and risk behaviors. Experiential attitude and descriptive norm were additional significant positive predictors of risk behaviors. The RAA has utility in predicting both protection and risk health behaviors although the power of predictors may vary across these types of health behavior. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Stakeholder Theory As an Ethical Approach to Effective Management: applying the theory to multiple contexts

    Directory of Open Access Journals (Sweden)

    Jeffrey S. Harrison

    2015-09-01

    Full Text Available Objective – This article provides a brief overview of stakeholder theory, clears up some widely held misconceptions, explains the importance of examining stakeholder theory from a variety of international perspectives and how this type of research will advance management theory, and introduces the other articles in the special issue. Design/methodology/approach – Some of the foundational ideas of stakeholder theory are discussed, leading to arguments about the importance of the theory to management research, especially in an international context. Findings – Stakeholder theory is found to be a particularly useful perspective for addressing some of the important issues in business from an international perspective. It offers an opportunity to reinterpret a variety of concepts, models and phenomena across many different disciplines. Practical implications – The concepts explored in this article may be applied in many contexts, domestically and internationally, and across business disciplines as diverse as economics, public administration, finance, philosophy, marketing, law, and management. Originality/value – Research on stakeholder theory in an international context is both lacking and sorely needed. This article and the others in this special issue aim to help fill that void.

  4. multi-scale data assimilation approaches and error characterisation applied to the inverse modelling of atmospheric constituent emission fields

    International Nuclear Information System (INIS)

    Koohkan, Mohammad Reza

    2012-01-01

    Data assimilation in geophysical sciences aims at optimally estimating the state of the system or some parameters of the system's physical model. To do so, data assimilation needs three types of information: observations and background information, a physical/numerical model, and some statistical description that prescribes uncertainties to each component of the system. In my dissertation, new methodologies of data assimilation are used in atmospheric chemistry and physics: the joint use of a 4D-Var with a sub-grid statistical model to consistently account for representativeness errors, accounting for multiple scales in the BLUE estimation principle, and a better estimation of prior errors using objective estimation of hyper-parameters. These three approaches will be specifically applied to inverse modelling problems focusing on the emission fields of tracers or pollutants. First, in order to estimate the emission inventories of carbon monoxide over France, in-situ stations which are impacted by the representativeness errors are used. A sub-grid model is introduced and coupled with a 4D-Var to reduce the representativeness error. Indeed, the results of inverse modelling showed that the 4D-Var routine was not fit to handle the representativeness issues. The coupled data assimilation system led to a much better representation of the CO concentration variability, with a significant improvement of statistical indicators, and more consistent estimation of the CO emission inventory. Second, the evaluation of the potential of the IMS (International Monitoring System) radionuclide network is performed for the inversion of an accidental source. In order to assess the performance of the global network, a multi-scale adaptive grid is optimised using a criterion based on degrees of freedom for the signal (DFS). The results show that several specific regions remain poorly observed by the IMS network. Finally, the inversion of the surface fluxes of Volatile Organic Compounds
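
    The BLUE estimation principle mentioned above has a compact closed form. The following is a minimal sketch of a single BLUE analysis step under standard linear-Gaussian assumptions; it is not the dissertation's 4D-Var system with the sub-grid statistical model.

```python
import numpy as np

def blue_update(xb, B, y, H, R):
    """Best Linear Unbiased Estimate (BLUE) analysis step:
        xa = xb + K (y - H xb),   K = B H^T (H B H^T + R)^(-1)
    xb: background state (n,), B: background error covariance (n, n),
    y: observations (m,), H: linear observation operator (m, n),
    R: observation error covariance (m, m)."""
    innovation = y - H @ xb
    S = H @ B @ H.T + R
    K = B @ H.T @ np.linalg.inv(S)           # Kalman gain
    xa = xb + K @ innovation                 # analysis state
    A = (np.eye(len(xb)) - K @ H) @ B        # analysis error covariance
    return xa, A
```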

  5. Incorporation of various uncertainties in dependent failure-probability estimation

    International Nuclear Information System (INIS)

    Samanta, P.K.; Mitra, S.P.

    1982-01-01

    This paper describes an approach that allows the incorporation of various types of uncertainties in the estimation of dependent failure (common mode failure) probability. The types of uncertainties considered are attributable to data, modeling and coupling. The method developed is applied to a class of dependent failures, i.e., multiple human failures during testing, maintenance and calibration. Estimation of these failures is critical as they have been shown to be significant contributors to core melt probability in pressurized water reactors
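
    As a hedged illustration of propagating data, modeling, and coupling uncertainties into a dependent-failure probability, the sketch below uses a generic beta-factor common-cause model with hypothetical distributions. It is not the specific method of the paper, which addresses multiple human failures during testing, maintenance, and calibration.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Data uncertainty: component failure probability (hypothetical lognormal).
q = rng.lognormal(mean=np.log(1e-3), sigma=0.5, size=n)

# Coupling uncertainty: beta factor of a beta-factor common-cause model
# (hypothetical Beta prior with mean 0.1).
beta = rng.beta(a=2.0, b=18.0, size=n)

# Modeling uncertainty: multiplicative error factor (hypothetical).
model_err = rng.lognormal(mean=0.0, sigma=0.3, size=n)

# Monte Carlo samples of the dependent (common-cause) failure probability.
p_ccf = q * beta * model_err
print(np.mean(p_ccf), np.percentile(p_ccf, [5, 50, 95]))
```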

  6. An approach for evaluating the integrity of fuel applied in Innovative Nuclear Energy Systems

    International Nuclear Information System (INIS)

    Nakae, Nobuo; Ozawa, Takayuki; Ohta, Hirokazu; Ogata, Takanari; Sekimoto, Hiroshi

    2014-01-01

    One of the important issues in the study of Innovative Nuclear Energy Systems is evaluating the integrity of fuel applied in Innovative Nuclear Energy Systems. An approach for evaluating the integrity of the fuel is discussed here based on the procedure currently used in the integrity evaluation of fast reactor fuel. The fuel failure modes determining fuel life time were reviewed and fuel integrity was analyzed and compared with the failure criteria. Metal and nitride fuels with austenitic and ferritic stainless steel (SS) cladding tubes were examined in this study. For the purpose of representative irradiation behavior analyses of the fuel for Innovative Nuclear Energy Systems, the correlations of the cladding characteristics were modeled based on well-known characteristics of austenitic modified 316 SS (PNC316), ferritic–martensitic steel (PNC–FMS) and oxide dispersion strengthened steel (PNC–ODS). The analysis showed that the fuel lifetime is limited by channel fracture which is a nonductile type (brittle) failure associated with a high level of irradiation-induced swelling in the case of austenitic steel cladding. In case of ferritic steel, on the other hand, the fuel lifetime is controlled by cladding creep rupture. The lifetime evaluated here is limited to 200 GW d/t, which is lower than the target burnup value of 500 GW d/t. One of the possible measures to extend the lifetime may be reducing the fuel smeared density and ventilating fission gas in the plenum for metal fuel and by reducing the maximum cladding temperature from 650 to 600 °C for both metal and nitride fuel

  7. Hybrid sequencing approach applied to human fecal metagenomic clone libraries revealed clones with potential biotechnological applications.

    Science.gov (United States)

    Džunková, Mária; D'Auria, Giuseppe; Pérez-Villarroya, David; Moya, Andrés

    2012-01-01

    Natural environments represent an incredible source of microbial genetic diversity. Discovery of novel biomolecules involves biotechnological methods that often require the design and implementation of biochemical assays to screen clone libraries. However, when an assay is applied to thousands of clones, one may eventually end up with very few positive clones which, in most of the cases, have to be "domesticated" for downstream characterization and application, and this makes screening both laborious and expensive. The negative clones, which are not considered by the selected assay, may also have biotechnological potential; however, unfortunately they would remain unexplored. Knowledge of the clone sequences provides important clues about potential biotechnological application of the clones in the library; however, the sequencing of clones one-by-one would be very time-consuming and expensive. In this study, we characterized the first metagenomic clone library from the feces of a healthy human volunteer, using a method based on 454 pyrosequencing coupled with a clone-by-clone Sanger end-sequencing. Instead of whole individual clone sequencing, we sequenced 358 clones in a pool. The medium-large insert (7-15 kb) cloning strategy allowed us to assemble these clones correctly, and to assign the clone ends to maintain the link between the position of a living clone in the library and the annotated contig from the 454 assembly. Finally, we found several open reading frames (ORFs) with previously described potential medical application. The proposed approach allows planning ad-hoc biochemical assays for the clones of interest, and the appropriate sub-cloning strategy for gene expression in suitable vectors/hosts.
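
    A toy sketch of the end-sequence-to-contig linking step described above, using exact substring matching as a stand-in for proper read alignment; all sequences and identifiers are hypothetical.

```python
def assign_clone_ends(clone_ends, contigs):
    """Link each clone (via its Sanger end-read sequences) to the assembled
    contig(s) containing those ends, using exact substring matching as a
    simplification of real read alignment.
    clone_ends: {clone_id: (left_end_seq, right_end_seq)}
    contigs:    {contig_id: contig_seq}"""
    assignments = {}
    for clone_id, ends in clone_ends.items():
        hits = set()
        for end_seq in ends:
            for contig_id, contig_seq in contigs.items():
                if end_seq and end_seq in contig_seq:
                    hits.add(contig_id)
        assignments[clone_id] = sorted(hits)
    return assignments

# Toy example with hypothetical sequences.
contigs = {"contig_1": "ATGCGTACGTTAGC", "contig_2": "CCGGATATCGGAAT"}
clone_ends = {"clone_A": ("ATGCGT", "TTAGC"), "clone_B": ("ATATCG", "")}
print(assign_clone_ends(clone_ends, contigs))
```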

  8. Hybrid sequencing approach applied to human fecal metagenomic clone libraries revealed clones with potential biotechnological applications.

    Directory of Open Access Journals (Sweden)

    Mária Džunková

    Full Text Available Natural environments represent an incredible source of microbial genetic diversity. Discovery of novel biomolecules involves biotechnological methods that often require the design and implementation of biochemical assays to screen clone libraries. However, when an assay is applied to thousands of clones, one may eventually end up with very few positive clones which, in most of the cases, have to be "domesticated" for downstream characterization and application, and this makes screening both laborious and expensive. The negative clones, which are not considered by the selected assay, may also have biotechnological potential; however, unfortunately they would remain unexplored. Knowledge of the clone sequences provides important clues about potential biotechnological application of the clones in the library; however, the sequencing of clones one-by-one would be very time-consuming and expensive. In this study, we characterized the first metagenomic clone library from the feces of a healthy human volunteer, using a method based on 454 pyrosequencing coupled with a clone-by-clone Sanger end-sequencing. Instead of whole individual clone sequencing, we sequenced 358 clones in a pool. The medium-large insert (7-15 kb) cloning strategy allowed us to assemble these clones correctly, and to assign the clone ends to maintain the link between the position of a living clone in the library and the annotated contig from the 454 assembly. Finally, we found several open reading frames (ORFs) with previously described potential medical application. The proposed approach allows planning ad-hoc biochemical assays for the clones of interest, and the appropriate sub-cloning strategy for gene expression in suitable vectors/hosts.

  9. An approach for evaluating the integrity of fuel applied in Innovative Nuclear Energy Systems

    Energy Technology Data Exchange (ETDEWEB)

    Nakae, Nobuo, E-mail: nakae-nobuo@jnes.go.jp [Center for Research into Innovative Nuclear Energy System, Tokyo Institute of Technology, 2-12-1-N1-19, Ookayama, Meguro-ku, Tokyo 152-8550 (Japan); Ozawa, Takayuki [Advanced Nuclear System Research and Development Directorate, Japan Atomic Energy Agency, 4-33, Muramatsu, Tokai-mura, Ibaraki-ken 319-1194 (Japan); Ohta, Hirokazu; Ogata, Takanari [Nuclear Technology Research Laboratory, Central Research Institute of Electric Power Industry, 2-11-1, Iwado Kita, Komae-shi, Tokyo 201-8511 (Japan); Sekimoto, Hiroshi [Center for Research into Innovative Nuclear Energy System, Tokyo Institute of Technology, 2-12-1-N1-19, Ookayama, Meguro-ku, Tokyo 152-8550 (Japan)

    2014-03-15

    One of the important issues in the study of Innovative Nuclear Energy Systems is evaluating the integrity of fuel applied in Innovative Nuclear Energy Systems. An approach for evaluating the integrity of the fuel is discussed here based on the procedure currently used in the integrity evaluation of fast reactor fuel. The fuel failure modes determining fuel life time were reviewed and fuel integrity was analyzed and compared with the failure criteria. Metal and nitride fuels with austenitic and ferritic stainless steel (SS) cladding tubes were examined in this study. For the purpose of representative irradiation behavior analyses of the fuel for Innovative Nuclear Energy Systems, the correlations of the cladding characteristics were modeled based on well-known characteristics of austenitic modified 316 SS (PNC316), ferritic–martensitic steel (PNC–FMS) and oxide dispersion strengthened steel (PNC–ODS). The analysis showed that the fuel lifetime is limited by channel fracture which is a nonductile type (brittle) failure associated with a high level of irradiation-induced swelling in the case of austenitic steel cladding. In case of ferritic steel, on the other hand, the fuel lifetime is controlled by cladding creep rupture. The lifetime evaluated here is limited to 200 GW d/t, which is lower than the target burnup value of 500 GW d/t. One of the possible measures to extend the lifetime may be reducing the fuel smeared density and ventilating fission gas in the plenum for metal fuel and by reducing the maximum cladding temperature from 650 to 600 °C for both metal and nitride fuel.

  10. Features of applying systems approach for evaluating the reliability of cryogenic systems for special purposes

    Directory of Open Access Journals (Sweden)

    E. D. Chertov

    2016-01-01

    Full Text Available Summary. The analysis of cryogenic installations confirms an objective trend: the number of tasks solved by special-purpose systems keeps growing. One of the most important directions in the development of cryogenics is the creation of installations for producing air separation products, namely oxygen and nitrogen. Modern aviation complexes require these gases in large quantities, in both the gaseous and the liquid state. The onboard gas systems used in aircraft of the Russian Federation are subdivided into the oxygen system, the air (nitrogen) system, the neutral gas system, and the fire-protection system. The technological schemes of ADI are largely determined by the compressed-air pressure or, more generally, by the refrigerating cycle. For most ADI, the working body of the refrigerating cycle is the separated air itself, i.e. the technological and refrigerating cycles of the installation are integrated. By this principle, installations are differentiated into low-pressure; medium- and high-pressure; with an expander (detander); and with preliminary chilling. There is also a small number of ADI types in which the refrigerating and technological cycles are separated; these are installations with external chilling. To monitor the technical condition of the BRV hardware in real time and to estimate reliability indicators, the use of multi-agent technologies is proposed. The multi-agent approach is the most suitable basis for a decision-support system (SPPR) for reliability assessment because it allows: redistributing information processing across the elements of the system, which improves overall performance; solving the problem of accumulating, storing, and reusing knowledge, which significantly increases the efficiency of reliability assessment tasks; and considerably reducing human intervention in the functioning of the system, which saves the time of the person making the decision (PMD) and does not require special skills in working with the system.

  11. Applying the health action process approach to bicycle helmet use and evaluating a social marketing campaign.

    Science.gov (United States)

    Karl, Florian M; Smith, Jennifer; Piedt, Shannon; Turcotte, Kate; Pike, Ian

    2017-08-05

    Bicycle injuries are of concern in Canada. Since helmet use was mandated in 1996 in the province of British Columbia, Canada, use has increased and head injuries have decreased. Despite the law, many cyclists do not wear a helmet. Health action process approach (HAPA) model explains intention and behaviour with self-efficacy, risk perception, outcome expectancies and planning constructs. The present study examines the impact of a social marketing campaign on HAPA constructs in the context of bicycle helmet use. A questionnaire was administered to identify factors determining helmet use. Intention to obey the law, and perceived risk of being caught if not obeying the law were included as additional constructs. Path analysis was used to extract the strongest influences on intention and behaviour. The social marketing campaign was evaluated through t-test comparisons after propensity score matching and generalised linear modelling (GLM) were applied to adjust for the same covariates. 400 cyclists aged 25-54 years completed the questionnaire. Self-efficacy and Intention were most predictive of intention to wear a helmet, which, moderated by planning, strongly predicted behaviour. Perceived risk and outcome expectancies had no significant impact on intention. GLM showed that exposure to the campaign was significantly associated with higher values in self-efficacy, intention and bicycle helmet use. Self-efficacy and planning are important points of action for promoting helmet use. Social marketing campaigns that remind people of appropriate preventive action have an impact on behaviour. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
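
    A hedged sketch of the campaign-evaluation step described above: estimate propensity scores for campaign exposure, match each exposed respondent to the nearest unexposed respondent (1:1, with replacement), and compare the outcome between the matched groups. Covariates, the matching details, and the GLM adjustment used in the study are simplified.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from scipy.stats import ttest_ind

def matched_campaign_effect(X, exposed, outcome):
    """Propensity-score matching followed by a t-test.
    X: covariates (n, p); exposed: 0/1 campaign exposure indicator;
    outcome: e.g. a helmet-use or intention score."""
    X = np.asarray(X, dtype=float)
    exposed = np.asarray(exposed).astype(bool)
    outcome = np.asarray(outcome, dtype=float)

    # Propensity score: probability of exposure given covariates.
    ps = LogisticRegression(max_iter=1000).fit(X, exposed).predict_proba(X)[:, 1]

    treated_idx = np.where(exposed)[0]
    control_idx = np.where(~exposed)[0]
    matched_controls = []
    for i in treated_idx:
        # Nearest-neighbour match on the propensity score (with replacement).
        j = control_idx[np.argmin(np.abs(ps[control_idx] - ps[i]))]
        matched_controls.append(j)

    return ttest_ind(outcome[treated_idx], outcome[matched_controls])
```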

  12. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  13. Applying the competence-based approach to management in the aerospace industry

    OpenAIRE

    Arpentieva Mariam; Duvalina Olga; Braitseva Svetlana; Gorelova Irina; Rozhnova Anna

    2018-01-01

    Problems of management in aerospace manufacturing are similar to those observed in other sectors, the main one being the flattening of strategic management. The main reason lies in the attitude towards the human resources of the organization. The aerospace industry employs 250 thousand people, who need an individual approach. Such an individual approach can be offered by a competence-based approach to management. The purpose of the study is to prove the benefits of the competency approach to human resource ...

  14. New approach for validating the segmentation of 3D data applied to individual fibre extraction

    DEFF Research Database (Denmark)

    Emerson, Monica Jane; Dahl, Anders Bjorholm; Dahl, Vedrana Andersen

    2017-01-01

    We present two approaches for validating the segmentation of 3D data. The first approach consists of comparing the amount of estimated material to a value provided by the manufacturer. The second approach consists of comparing the segmented results to those obtained from imaging modalities

  15. Probability mapping of contaminants

    Energy Technology Data Exchange (ETDEWEB)

    Rautman, C.A.; Kaplan, P.G. [Sandia National Labs., Albuquerque, NM (United States); McGraw, M.A. [Univ. of California, Berkeley, CA (United States); Istok, J.D. [Oregon State Univ., Corvallis, OR (United States); Sigda, J.M. [New Mexico Inst. of Mining and Technology, Socorro, NM (United States)

    1994-04-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds).
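
    The post-processing step described above, which turns an ensemble of equally likely simulations into probability-of-exceedance maps, reduces to a per-cell indicator average. A minimal sketch, with a synthetic ensemble standing in for the conditional geostatistical simulations:

```python
import numpy as np

def exceedance_probability_map(simulations, threshold):
    """Given an ensemble of equally likely simulated contaminant fields
    (n_realizations, ny, nx), return the per-cell probability of exceeding
    a clean-up or personnel-hazard threshold."""
    sims = np.asarray(simulations, dtype=float)
    return (sims > threshold).mean(axis=0)

# 200 hypothetical realizations on a 50 x 80 grid (the real realizations
# would come from conditional geostatistical simulation of the site data).
sims = np.random.default_rng(1).lognormal(mean=2.0, sigma=1.0, size=(200, 50, 80))
p_map = exceedance_probability_map(sims, threshold=30.0)  # values in [0, 1]
```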

  16. Probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.; Kaplan, P.G.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.

    1994-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds)

  17. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2011-01-01

    A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d

  18. Method developments approaches in supercritical fluid chromatography applied to the analysis of cosmetics.

    Science.gov (United States)

    Lesellier, E; Mith, D; Dubrulle, I

    2015-12-04

    Analyses of complex samples of cosmetics, such as creams or lotions, are generally achieved by HPLC. These analyses are often multistep gradients, due to the presence of compounds with a large range of polarity. For instance, the bioactive compounds may be polar, while the matrix contains lipid components that are rather non-polar, thus cosmetic formulations are usually oil-water emulsions. Supercritical fluid chromatography (SFC) uses mobile phases composed of carbon dioxide and organic co-solvents, allowing for good solubility of both the active compounds and the matrix excipients. Moreover, the classical and well-known properties of these mobile phases yield fast analyses and ensure rapid method development. However, due to the large number of stationary phases available for SFC and to the varied additional parameters acting both on retention and separation factors (co-solvent nature and percentage, temperature, backpressure, flow rate, column dimensions and particle size), a simplified approach can be followed to ensure a fast method development. First, suited stationary phases should be carefully selected for an initial screening, and then the other operating parameters can be limited to the co-solvent nature and percentage, maintaining the oven temperature and back-pressure constant. To describe simple method development guidelines in SFC, three sample applications are discussed in this paper: UV-filters (sunscreens) in sunscreen cream, glyceryl caprylate in eye liner and caffeine in eye serum. Firstly, five stationary phases (ACQUITY UPC(2)) are screened with isocratic elution conditions (10% methanol in carbon dioxide). Complementary of the stationary phases is assessed based on our spider diagram classification which compares a large number of stationary phases based on five molecular interactions. Secondly, the one or two best stationary phases are retained for further optimization of mobile phase composition, with isocratic elution conditions or, when

  19. Introduction to probability and stochastic processes with applications

    CERN Document Server

    Castañ, Blanco; Arunachalam, Viswanathan; Dharmaraja, Selvamuthu

    2012-01-01

    An easily accessible, real-world approach to probability and stochastic processes Introduction to Probability and Stochastic Processes with Applications presents a clear, easy-to-understand treatment of probability and stochastic processes, providing readers with a solid foundation they can build upon throughout their careers. With an emphasis on applications in engineering, applied sciences, business and finance, statistics, mathematics, and operations research, the book features numerous real-world examples that illustrate how random phenomena occur in nature and how to use probabilistic t

  20. Applying the archetype approach to the database of a biobank information management system.

    Science.gov (United States)

    Späth, Melanie Bettina; Grimson, Jane

    2011-03-01

    The purpose of this study is to investigate the feasibility of applying the openEHR archetype approach to modelling the data in the database of an existing proprietary biobank information management system. A biobank information management system stores the clinical/phenotypic data of the sample donor and sample related information. The clinical/phenotypic data is potentially sourced from the donor's electronic health record (EHR). The study evaluates the reuse of openEHR archetypes that have been developed for the creation of an interoperable EHR in the context of biobanking, and proposes a new set of archetypes specifically for biobanks. The ultimate goal of the research is the development of an interoperable electronic biomedical research record (eBMRR) to support biomedical knowledge discovery. The database of the prostate cancer biobank of the Irish Prostate Cancer Research Consortium (PCRC), which supports the identification of novel biomarkers for prostate cancer, was taken as the basis for the modelling effort. First the database schema of the biobank was analyzed and reorganized into archetype-friendly concepts. Then, archetype repositories were searched for matching archetypes. Some existing archetypes were reused without change, some were modified or specialized, and new archetypes were developed where needed. The fields of the biobank database schema were then mapped to the elements in the archetypes. Finally, the archetypes were arranged into templates specifically to meet the requirements of the PCRC biobank. A set of 47 archetypes was found to cover all the concepts used in the biobank. Of these, 29 (62%) were reused without change, 6 were modified and/or extended, 1 was specialized, and 11 were newly defined. These archetypes were arranged into 8 templates specifically required for this biobank. A number of issues were encountered in this research. Some arose from the immaturity of the archetype approach, such as immature modelling support tools

  1. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.
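
    A small numerical illustration consistent with the definition above: if evidence E rules out negative values of y, but model M is a Gaussian with the hypothetical parameters below, then the probability M assigns to the impossible region is the leakage.

```python
from scipy.stats import norm

# Suppose evidence E says the response y is non-negative (e.g. a mass),
# but model M is a Gaussian fitted to the data with these hypothetical
# parameters. The probability M assigns to the impossible region y < 0
# is the probability leakage.
mu, sigma = 2.0, 1.5
leakage = norm.cdf(0.0, loc=mu, scale=sigma)
print(f"P_M(y < 0) = {leakage:.3f}")  # > 0, so model M leaks probability
```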

  2. Applying the competence-based approach to management in the aerospace industry

    Directory of Open Access Journals (Sweden)

    Arpentieva Mariam

    2018-01-01

    Full Text Available Problems of management in aerospace manufacturing are similar to those observed in other sectors, the main one being the flattening of strategic management. The main reason lies in the attitude towards the human resources of the organization. The aerospace industry employs 250 thousand people, who need an individual approach. Such an individual approach can be offered by a competence-based approach to management. The purpose of the study is to prove the benefits of the competency approach to human resource management in the context of the strategic management of an aerospace organization. To achieve this goal, the method of comparative analysis is used. The article compares two approaches to personnel management. The transition to competence-based human resource management means (a) a different understanding of the object of management; (b) involvement of the «knowledge – skills – abilities» of the employee in all functions of human resource management; and (c) a change in the approach to the strategic management of the aerospace industry.

  3. Frontolateral Approach Applied to Sellar Region Lesions: A Retrospective Study in 79 Patients

    Directory of Open Access Journals (Sweden)

    Hao-Cheng Liu

    2016-01-01

    Conclusions: FLA was an effective approach in the treatment of sellar region lesions with good preservation of visual function. FLA classification enabled tailored craniotomies for each patient according to the anatomic site of tumor invasion. This study found that FLA had similar outcomes to other surgical approaches of sellar region lesions.

  4. An Optimisation Approach Applied to Design the Hydraulic Power Supply for a Forklift Truck

    DEFF Research Database (Denmark)

    Pedersen, Henrik Clemmensen; Andersen, Torben Ole; Hansen, Michael Rygaard

    2004-01-01

    -level optimisation approach, and is in the current paper exemplified through the design of the hydraulic power supply for a forklift truck. The paper first describes the prerequisites for the method and then explains the different steps in the approach to design the hydraulic system. Finally the results...

  5. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
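
    One standard example of this phenomenon (stated here as general background, not necessarily one of the three problems treated in the article) is the matching problem: the probability that a random permutation of n items leaves no item in its original position is

    \[
    P_n(\text{no match}) \;=\; \sum_{k=0}^{n} \frac{(-1)^{k}}{k!} \;\longrightarrow\; e^{-1} \approx 0.368 \qquad (n \to \infty).
    \]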

  6. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  7. Invariant probabilities of transition functions

    CERN Document Server

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  8. Contributions to quantum probability

    International Nuclear Information System (INIS)

    Fritz, Tobias

    2010-01-01

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels ''possible to occur'' or ''impossible to occur'' to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome

  9. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels ''possible to occur'' or ''impossible to occur'' to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a

  10. Upgrading of the Ukrainian NPPs and the ''2+2'' approach applied for the licensing of the major modifications

    International Nuclear Information System (INIS)

    Gorbatchev, A.; Goetsch, D.; Redko, V.; Madonna, A.

    2003-01-01

    Many of the planned upgrading measures of Ukrainian VVER plants and of the unique Armenian power plant (Medzanor) are financed by the European Union (EU) through the TACIS program. The ''2+2'' approach implies a deep collaboration between Ukrainian or Armenian regulatory authorities, local operating organizations and EU organizations. This approach allows: - a smooth adaptation of western technologies to VVERs, - a comprehensive checking of Ukrainian, Armenian and western regulatory requirements, and - the transfer of know-how to the Ukrainian and Armenian organizations. This report presents the principles applied for the ''2+2'' approach as well as a summary of the main recommendations given in the framework of the licensing process

  11. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application

  12. Comparison of Science-Technology-Society Approach and Textbook Oriented Instruction on Students' Abilities to Apply Science Concepts

    Science.gov (United States)

    Kapici, Hasan Ozgur; Akcay, Hakan; Yager, Robert E.

    2017-01-01

    It is important for students to learn concepts and to use them for solving problems and in further learning. In this respect, the purpose of this study is to investigate students' abilities to apply science concepts that they have learned through a Science-Technology-Society based approach or textbook oriented instruction. The current study is based on…

  13. A Structured Approach to Teaching Applied Problem Solving through Technology Assessment.

    Science.gov (United States)

    Fischbach, Fritz A.; Sell, Nancy J.

    1986-01-01

    Describes an approach to problem solving based on real-world problems. Discusses problem analysis and definitions, preparation of briefing documents, solution finding techniques (brainstorming and synectics), solution evaluation and judgment, and implementation. (JM)

  14. Pipe failure probability - the Thomas paper revisited

    International Nuclear Information System (INIS)

    Lydell, B.O.Y.

    2000-01-01

    Almost twenty years ago, in Volume 2 of Reliability Engineering (the predecessor of Reliability Engineering and System Safety), a paper by H. M. Thomas of Rolls Royce and Associates Ltd. presented a generalized approach to the estimation of piping and vessel failure probability. The 'Thomas-approach' used insights from actual failure statistics to calculate the probability of leakage and conditional probability of rupture given leakage. It was intended for practitioners without access to data on the service experience with piping and piping system components. This article revisits the Thomas paper by drawing on insights from development of a new database on piping failures in commercial nuclear power plants worldwide (SKI-PIPE). Partially sponsored by the Swedish Nuclear Power Inspectorate (SKI), the R and D leading up to this note was performed during 1994-1999. Motivated by data requirements of reliability analysis and probabilistic safety assessment (PSA), the new database supports statistical analysis of piping failure data. Against the background of this database development program, the article reviews the applicability of the 'Thomas approach' in applied risk and reliability analysis. It addresses the question whether a new and expanded database on the service experience with piping systems would alter the original piping reliability correlation as suggested by H. M. Thomas
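
    For orientation, the decomposition described above can be written schematically as

    \[
    P(\text{rupture}) \;=\; P(\text{leak}) \times P(\text{rupture} \mid \text{leak}),
    \]

    where the leak probability is derived from service-experience failure statistics and the conditional rupture probability is a correlation fitted to the same data. This is only the structure of the approach as summarized in the abstract; the actual correlations and their parameters are given in the Thomas paper and in the SKI-PIPE work.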

  15. Applying Data Mining Techniques to Improve Information Security in the Cloud: A Single Cache System Approach

    OpenAIRE

    Amany AlShawi

    2016-01-01

    Presently, the popularity of cloud computing is gradually increasing day by day. The purpose of this research was to enhance the security of the cloud using techniques such as data mining with specific reference to the single cache system. From the findings of the research, it was observed that the security in the cloud could be enhanced with the single cache system. For future purposes, an Apriori algorithm can be applied to the single cache system. This can be applied by all cloud providers...

  16. From biological anthropology to applied public health: epidemiological approaches to the study of infectious disease.

    Science.gov (United States)

    Albalak, Rachel

    2009-01-01

    This article describes two large, multisite infectious disease programs: the Tuberculosis Epidemiologic Studies Consortium (TBESC) and the Emerging Infections Programs (EIPs). The links between biological anthropology and applied public health are highlighted using these programs as examples. Funded by the Centers for Disease Control and Prevention (CDC), the TBESC and EIPs conduct applied public health research to strengthen infectious disease prevention and control efforts in the United States. They involve collaborations among CDC, public health departments, and academic and clinical institutions. Their unique role in national infectious disease work, including their links to anthropology, shared elements, key differences, strengths and challenges, is discussed.

  17. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary. Background: Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives: The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods: Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results: Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions: Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
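
    The sample code referenced above is provided in R; as a rough Python analogue of the same idea (an illustration, not the authors' implementation), a regression forest fitted to a 0/1 response returns individual probability estimates directly:

        import numpy as np
        from sklearn.datasets import load_breast_cancer
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import train_test_split

        # Binary response coded 0/1: regression-forest predictions are then
        # direct estimates of P(y = 1 | x), the "probability machine" idea.
        X, y = load_breast_cancer(return_X_y=True)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

        rf = RandomForestRegressor(n_estimators=500, min_samples_leaf=5, random_state=0)
        rf.fit(X_tr, y_tr)
        probs = rf.predict(X_te)   # individual probability estimates in [0, 1]
        print(np.round(probs[:5], 3))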

  18. Applying Rawlsian Approaches to Resolve Ethical Issues : Inventory and Setting of a Research Agenda

    NARCIS (Netherlands)

    Doorn, N.

    2009-01-01

    Insights from social science are increasingly used in the field of applied ethics. However, recent insights have shown that the empirical branch of business ethics lacks thorough theoretical grounding. This article discusses the use of the Rawlsian methods of wide reflective equilibrium and

  19. Fluid Intelligence as a Predictor of Learning: A Longitudinal Multilevel Approach Applied to Math

    Science.gov (United States)

    Primi, Ricardo; Ferrao, Maria Eugenia; Almeida, Leandro S.

    2010-01-01

    The association between fluid intelligence and inter-individual differences was investigated using multilevel growth curve modeling applied to data measuring intra-individual improvement on math achievement tests. A sample of 166 students (88 boys and 78 girls), ranging in age from 11 to 14 (M = 12.3, SD = 0.64), was tested. These individuals took…

  20. Understanding the Conceptual Development Phase of Applied Theory-Building Research: A Grounded Approach

    Science.gov (United States)

    Storberg-Walker, Julia

    2007-01-01

    This article presents a provisional grounded theory of conceptual development for applied theory-building research. The theory described here extends the understanding of the components of conceptual development and provides generalized relations among the components. The conceptual development phase of theory-building research has been widely…

  1. A Transfer Learning Approach for Applying Matrix Factorization to Small ITS Datasets

    Science.gov (United States)

    Voß, Lydia; Schatten, Carlotta; Mazziotti, Claudia; Schmidt-Thieme, Lars

    2015-01-01

    Machine Learning methods for Performance Prediction in Intelligent Tutoring Systems (ITS) have proven their efficacy; specific methods, e.g. Matrix Factorization (MF), however suffer from the lack of available information about new tasks or new students. In this paper we show how this problem could be solved by applying Transfer Learning (TL),…
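
    As background, the sketch below shows plain matrix factorization for performance prediction on a tiny hypothetical student-by-task matrix; the transfer-learning extension proposed in the paper is not reproduced here.

        import numpy as np

        rng = np.random.default_rng(0)
        # Hypothetical performance matrix: 1 = solved, 0 = failed, nan = unseen.
        R = np.array([[1, 0, 1, np.nan],
                      [1, np.nan, 1, 0],
                      [np.nan, 0, 1, 1]], dtype=float)

        n_students, n_tasks, k = R.shape[0], R.shape[1], 2
        P = rng.normal(0, 0.1, (n_students, k))   # student latent factors
        Q = rng.normal(0, 0.1, (n_tasks, k))      # task latent factors

        lr, reg = 0.05, 0.02
        for _ in range(500):                      # plain SGD over observed entries
            for i in range(n_students):
                for j in range(n_tasks):
                    if np.isnan(R[i, j]):
                        continue
                    err = R[i, j] - P[i] @ Q[j]
                    P[i] += lr * (err * Q[j] - reg * P[i])
                    Q[j] += lr * (err * P[i] - reg * Q[j])

        print(np.round(P @ Q.T, 2))               # predictions, incl. unseen cells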

  2. How mass spectrometric approaches applied to bacterial identification have revolutionized the study of human gut microbiota.

    Science.gov (United States)

    Grégory, Dubourg; Chaudet, Hervé; Lagier, Jean-Christophe; Raoult, Didier

    2018-03-01

    Describing the human gut microbiota is one of the most exciting challenges of the 21st century. Currently, high-throughput sequencing methods are considered the gold standard for this purpose; however, they suffer from several drawbacks, including their inability to detect minority populations. The advent of mass-spectrometric (MS) approaches to identify cultured bacteria in clinical microbiology enabled the creation of the culturomics approach, which aims to establish a comprehensive repertoire of cultured prokaryotes from human specimens using extensive culture conditions. Areas covered: This review first underlines how mass spectrometric approaches have revolutionized clinical microbiology. It then highlights the contribution of MS-based methods to culturomics studies, paying particular attention to the extension of the human gut microbiota repertoire through the discovery of new bacterial species. Expert commentary: MS-based approaches have enabled cultivation methods to be resuscitated to study the human gut microbiota and thus to fill in the blanks left by high-throughput sequencing methods in terms of culturing minority populations. Continued efforts to recover new taxa using culture methods, combined with their rapid implementation in genomic databases, would allow for an exhaustive analysis of the gut microbiota through the use of a comprehensive approach.

  3. Nonlinear approaches in engineering applications applied mechanics, vibration control, and numerical analysis

    CERN Document Server

    Jazar, Reza

    2015-01-01

    This book focuses on the latest applications of nonlinear approaches in different disciplines of engineering. For each selected topic, detailed concept development, derivations, and relevant knowledge are provided for the convenience of the readers. The topics range from dynamic systems and control to optimal approaches in nonlinear dynamics. The volume includes invited chapters from world class experts in the field. The selected topics are of great interest in the fields of engineering and physics and this book is ideal for engineers and researchers working in a broad range of practical topics and approaches. This book also: ·         Explores the most up-to-date applications and underlying principles of nonlinear approaches to problems in engineering and physics, including sections on analytic nonlinearity and practical nonlinearity ·         Enlightens readers to the conceptual significance of nonlinear approaches with examples of applications in scientific and engineering problems from v...

  4. Chaotic artificial immune approach applied to economic dispatch of electric energy using thermal units

    International Nuclear Information System (INIS)

    Santos Coelho, Leandro dos; Mariani, Viviana Cocco

    2009-01-01

    The economic dispatch problem (EDP) is an optimization problem useful in power systems operation. The objective of the EDP of electric power generation, whose characteristics are complex and highly non-linear, is to schedule the committed generating unit outputs so as to meet the required load demand at minimum operating cost while satisfying system constraints. Recently, as an alternative to the conventional mathematical approaches, modern heuristic optimization techniques have been given much attention by many researchers due to their ability to find an almost global optimal solution in EDPs. As a special mechanism to avoid being trapped in local minima, the ergodicity property of chaotic sequences has been used as an optimization technique in EDPs. Based on chaos theory, this paper discusses the design and validation of an optimization procedure based on a chaotic artificial immune network approach that uses Zaslavsky's map. The optimization approach based on the chaotic artificial immune network is validated for a test system consisting of 13 thermal units whose incremental fuel cost function takes into account the valve-point loading effects. Simulation results and comparisons show that the chaotic artificial immune network approach is competitive in performance with other optimization approaches presented in the literature and is also an attractive tool to be used in applications in the power systems field.
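
    As a hedged sketch of two ingredients named in the abstract, the snippet below evaluates a standard valve-point fuel-cost term and generates a chaotic sequence (a logistic map is used here in place of the Zaslavsky map employed by the authors) that could drive candidate perturbations; the unit coefficients are hypothetical, not those of the 13-unit test system.

        import numpy as np

        def fuel_cost(p, a, b, c, e, f, p_min):
            """Fuel cost of one thermal unit with valve-point loading effects."""
            return a + b * p + c * p**2 + abs(e * np.sin(f * (p_min - p)))

        def chaotic_sequence(n, x0=0.7, r=4.0):
            """Logistic-map sequence in (0, 1), standing in for the Zaslavsky map."""
            xs, x = [], x0
            for _ in range(n):
                x = r * x * (1.0 - x)
                xs.append(x)
            return np.array(xs)

        # Hypothetical unit: perturb a feasible output with chaos-scaled steps.
        p_min, p_max, p = 100.0, 300.0, 220.0
        for step in chaotic_sequence(5):
            candidate = np.clip(p + (step - 0.5) * 20.0, p_min, p_max)
            cost = fuel_cost(candidate, 550.0, 8.1, 0.00028, 300.0, 0.035, p_min)
            print(round(float(candidate), 2), round(float(cost), 2))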

  5. The flux-coordinate independent approach applied to X-point geometries

    International Nuclear Information System (INIS)

    Hariri, F.; Hill, P.; Ottaviani, M.; Sarazin, Y.

    2014-01-01

    A Flux-Coordinate Independent (FCI) approach for anisotropic systems, not based on magnetic flux coordinates, has been introduced in Hariri and Ottaviani [Comput. Phys. Commun. 184, 2419 (2013)]. In this paper, we show that the approach can tackle magnetic configurations including X-points. Using the code FENICIA, an equilibrium with a magnetic island has been used to show the robustness of the FCI approach to cases in which a magnetic separatrix is present in the system, either by design or as a consequence of instabilities. Numerical results are in good agreement with the analytic solutions of the sound-wave propagation problem. Conservation properties are verified. Finally, the critical gain of the FCI approach in situations including the magnetic separatrix with an X-point is demonstrated by a fast convergence of the code with the numerical resolution in the direction of symmetry. The results highlighted in this paper show that the FCI approach can efficiently deal with X-point geometries

  6. Perceptual-cognitive expertise in sport: some considerations when applying the expert performance approach.

    Science.gov (United States)

    Williams, A Mark; Ericsson, K Anders

    2005-06-01

    The number of researchers studying perceptual-cognitive expertise in sport is increasing. The intention in this paper is to review the currently accepted framework for studying expert performance and to consider implications for undertaking research work in the area of perceptual-cognitive expertise in sport. The expert performance approach presents a descriptive and inductive approach for the systematic study of expert performance. The nature of expert performance is initially captured in the laboratory using representative tasks that identify reliably superior performance. Process-tracing measures are employed to determine the mechanisms that mediate expert performance on the task. Finally, the specific types of activities that lead to the acquisition and development of these mediating mechanisms are identified. General principles and mechanisms may be discovered and then validated by more traditional experimental designs. The relevance of this approach to the study of perceptual-cognitive expertise in sport is discussed and suggestions for future work highlighted.

  7. Characterization of remarkable floods in France, a transdisciplinary approach applied on generalized floods of January 1910

    Science.gov (United States)

    Boudou, Martin; Lang, Michel; Vinet, Freddy; Coeur, Denis

    2014-05-01

    …emphasize one flood typology or one flood dynamic (for example, flash floods are often over-represented compared with slow-dynamic floods in existing databases). Thus, the selected criteria have to give a general overview of flooding risk in France by integrating all typologies: storm surges, torrential floods, rising groundwater levels resulting in floods, etc. The methodology developed for the evaluation grid is inspired by several scientific works related to historical hydrology (Bradzil, 2006; Benito et al., 2004) or to the classification of extreme floods (Kundzewics et al., 2013; Garnier E., 2005). The referenced information is mainly drawn from investigations carried out for the PFRA (archives, local data), from internet databases on flooding disasters, and from a complementary bibliography (including scientists such as Maurice Pardé, a geographer who extensively documented French floods during the 20th century). The proposed classification relies on three main axes, each associated with a set of criteria, each criterion carrying a score (from 0.5 to 4 points) that contributes to a final remarkability score. • The flood intensity characterizes the flood's hazard level. It is composed of the submersion duration, important for valorizing floods with slow dynamics such as groundwater flooding, the return period of the event's peak discharge, and the presence of factors that significantly increase the hazard level (dyke breaks, log jams, sediment transport…). • The flood severity focuses on economic damages, social and political repercussions, media coverage of the event, the number of fatalities, or possible flood warning failures. Analyzing the flood consequences is essential in order to evaluate the vulnerability of society at the date of the disaster. • The spatial extension of the flood contributes complementary information to the first two axes. The evaluation grid was tested and applied on the sample of 176 remarkable events. Around twenty events (from 1856 to 2010) come out with a high remarkability rate

  8. Applied anatomy of a new approach of endoscopic technique in thyroid gland surgery.

    Science.gov (United States)

    Liu, Hong; Xie, Yong-jun; Xu, Yi-quan; Li, Chao; Liu, Xing-guo

    2012-10-01

    To explore the feasibility and safety of the transtracheal-assisted sublingual approach to totally endoscopic thyroidectomy by studying the anatomical approach and adjacent structures. A total of 5 embalmed adult cadavers from Chengdu Medical College were dissected layer by layer in the cervical region, pharyngeal region, and mandible region, according to the transtracheal-assisted sublingual approach, so that the anatomical approach and planes could be verified. A total of 15 embalmed adult cadavers were dissected using the arterial vascular casting technique, imaging scanning technique, and thin-layer cryotomy. The vessels and anatomical structures of the thyroid surgical region were then analyzed qualitatively and quantitatively. A three-dimensional visualization of the laryngeal arteries was reconstructed with Autodesk 3ds Max 2010(32). The transtracheal-assisted sublingual approach to totally endoscopic thyroidectomy was simulated on 5 embalmed adult cadavers. The sublingual observation access was located in the middle of the sublingual region. The geniohyoid muscle, mylohyoid seam, and submental triangle were divided in turn in the midline to reach the plane under the platysma muscle. The superficial cervical fascia, anterior body of the hyoid bone, and infrahyoid muscles were passed in sequence to reach the thyroid gland surgical region. The transtracheal operational access passed from the cavitas oris propria, isthmus faucium, subepiglottic region, laryngeal pharynx, and intermediate laryngeal cavity, proceeding from the top down to reach the pars cervicalis tracheae, where a sagittal incision was made in the anterior wall of the cartilagines tracheales to reach the ascertained surgical region. The transtracheal-assisted sublingual approach to totally endoscopic thyroidectomy is anatomically feasible and safe and can be useful in thyroid gland surgery.

  9. An approach using quantum ant colony optimization applied to the problem of nuclear reactors reload

    International Nuclear Information System (INIS)

    Silva, Marcio H.; Lima, Alan M.M. de; Schirru, Roberto; Medeiros, J.A.C.C.

    2009-01-01

    The basic concept behind the nuclear reactor fuel reloading problem is to find a configuration of new and used fuel elements that keeps the plant working at full power for the longest possible duration, within the safety restrictions. The main restriction is the power peaking factor, which is the limit value for the preservation of the fuel assembly. The QACO-Alfa algorithm is a modified version of Quantum Ant Colony Optimization (QACO) proposed by Wang et al., which uses a new actualization method and a pseudo-evaporation step. We examined the behaviour of QACO-Alfa coupled to the reactor physics code RECNOD when applied to this problem. Although QACO was developed for continuous functions, the binary model used in this work allows it to be applied to discrete problems, such as the one mentioned above. (author)

  10. Applying Data Mining Techniques to Improve Information Security in the Cloud: A Single Cache System Approach

    Directory of Open Access Journals (Sweden)

    Amany AlShawi

    2016-01-01

    Full Text Available Presently, the popularity of cloud computing is gradually increasing day by day. The purpose of this research was to enhance the security of the cloud using techniques such as data mining with specific reference to the single cache system. From the findings of the research, it was observed that the security in the cloud could be enhanced with the single cache system. For future purposes, an Apriori algorithm can be applied to the single cache system. This can be applied by all cloud providers, vendors, data distributors, and others. Further, data objects entered into the single cache system can be extended into 12 components. Database and SPSS modelers can be used to implement the same.

  11. A semantic web approach applied to integrative bioinformatics experimentation: a biological use case with genomics data.

    NARCIS (Netherlands)

    Post, L.J.G.; Roos, M.; Marshall, M.S.; van Driel, R.; Breit, T.M.

    2007-01-01

    The numerous public data resources make integrative bioinformatics experimentation increasingly important in life sciences research. However, it is severely hampered by the way the data and information are made available. The semantic web approach enhances data exchange and integration by providing

  12. Applying adaptive management in resource use in South African National Parks: A case study approach

    Directory of Open Access Journals (Sweden)

    Kelly Scheepers

    2011-05-01

    Conservation implications: There is no blueprint for the development of sustainable resource use systems and resource use is often addressed according to multiple approaches in national parks. However, the SANParks resource use policy provides a necessary set of guiding principles for resource use management across the national park system that allows for monitoring progress.

  13. Using Narratives to Develop Standards for Leaders: Applying an Innovative Approach in Western Australia

    Science.gov (United States)

    Wildy, Helen; Pepper, Coral

    2005-01-01

    Dissatisfaction with long lists of duties as substitutes for standards led to the innovative application of narratives as an alternative approach to the generation and use of standards for school leaders. This paper describes research conducted over nearly a decade in collaboration with the state education authority in Western Australia,…

  14. A single grain approach applied to modelling recrystallization kinetics in a single-phase metal

    NARCIS (Netherlands)

    Chen, S.P.; Zwaag, van der S.

    2004-01-01

    A comprehensive model for the recrystallization kinetics is proposed which incorporates both microstructure and the textural components in the deformed state. The model is based on the single-grain approach proposed previously. The influence of the as-deformed grain orientation, which affects the

  15. Creating patient value in glaucoma care : applying quality costing and care delivery value chain approaches

    NARCIS (Netherlands)

    D.F. de Korne (Dirk); J.C.A. Sol (Kees); T. Custers (Thomas); E. van Sprundel (Esther); B.M. van Ineveld (Martin); H.G. Lemij (Hans); N.S. Klazinga (Niek)

    2009-01-01

    Purpose: The purpose of this paper is to explore in a specific hospital care process the applicability in practice of the theories of quality costing and value chains. Design/methodology/approach: In a retrospective case study an in-depth evaluation of the use of a quality cost model

  16. A systematic approach for fine-tuning of fuzzy controllers applied to WWTPs

    DEFF Research Database (Denmark)

    Ruano, M.V.; Ribes, J.; Sin, Gürkan

    2010-01-01

    A systematic approach for fine-tuning fuzzy controllers has been developed and evaluated for an aeration control system implemented in a WWTP. The challenge with the application of fuzzy controllers to WWTPs is simply that they contain many parameters, which need to be adjusted for different WWTP ...
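
    As a minimal sketch of why such controllers carry many tunable parameters (membership-function breakpoints, rule weights, output gains), the toy dissolved-oxygen rule base below is written in plain Python with entirely hypothetical numbers; it is not the controller from the paper.

        def tri(x, a, b, c):
            """Triangular membership function with breakpoints a <= b <= c."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

        def aeration_change(do_error):
            """Toy Mamdani-style rules: DO below setpoint -> more air, above -> less."""
            low  = tri(do_error, -2.0, -1.0, 0.0)
            ok   = tri(do_error, -0.5,  0.0, 0.5)
            high = tri(do_error,  0.0,  1.0, 2.0)
            # Weighted-average defuzzification over singleton outputs (airflow change).
            num = low * 20.0 + ok * 0.0 + high * -20.0
            den = low + ok + high
            return num / den if den else 0.0

        for e in (-1.2, -0.2, 0.0, 0.8):
            print(e, round(aeration_change(e), 2))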

  17. Advancing early detection of autism spectrum disorder by applying an integrated two-stage screening approach

    NARCIS (Netherlands)

    Oosterling, Iris J.; Wensing, Michel; Swinkels, Sophie H.; van der Gaag, Rutger Jan; Visser, Janne C.; Woudenberg, Tim; Minderaa, Ruud; Steenhuis, Mark-Peter; Buitelaar, Jan K.

    Background: Few field trials exist on the impact of implementing guidelines for the early detection of autism spectrum disorders (ASD). The aims of the present study were to develop and evaluate a clinically relevant integrated early detection programme based on the two-stage screening approach of

  18. Improving the efficiency of a chemotherapy day unit: Applying a business approach to oncology

    NARCIS (Netherlands)

    van Lent, W.A.M.; Goedbloed, N.; van Harten, Willem H.

    2009-01-01

    Aim: To improve the efficiency of a hospital-based chemotherapy day unit (CDU). - Methods: The CDU was benchmarked with two other CDUs to identify their attainable performance levels for efficiency, and causes for differences. Furthermore, an in-depth analysis using a business approach, called lean

  19. A clustering approach applied to time-lapse ERT interpretation - Case study of Lascaux cave

    Science.gov (United States)

    Xu, Shan; Sirieix, Colette; Riss, Joëlle; Malaurent, Philippe

    2017-09-01

    The Lascaux cave, located in southwest France, is one of the most important prehistoric caves in the world displaying Paleolithic paintings. This study aims to characterize the structure of the weathered epikarst setting located above the cave using Time-Lapse Electrical Resistivity Tomography (ERT) combined with local hydrogeological and climatic environmental data. Twenty ERT profiles were acquired over two years, recording the seasonal and spatial variations of the electrical resistivity of the hydraulic upstream area of the Lascaux cave. The 20 interpreted resistivity models were merged into a single synthetic model using a multidimensional statistical method (Hierarchical Agglomerative Clustering). The individual blocks of the synthetic model associated with a similar resistivity variability were gathered into 7 clusters. We combined the temporal variations of resistivity with climatic and hydrogeological data to propose a geo-electrical model that relates to a conceptual geological model, and we provide a geological interpretation for each cluster in terms of epikarst features. The superficial clusters (no 1 & 2) are linked to effective rainfall and trees, probably a fractured limestone. Two other clusters (no 6 & 7) are linked to detrital formations (sand and clay, respectively). Cluster 3 may correspond to a marly limestone that forms a non-permeable horizon. Finally, the electrical behavior of the last two clusters (no 4 & 5) is correlated with the variation of flow rate; they may form a preferential feed zone for the flow into the cave.
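
    A hedged sketch of the statistical step described above (hierarchical agglomerative clustering of model blocks by their resistivity time series) is given below, on synthetic values rather than the Lascaux measurements:

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster

        rng = np.random.default_rng(0)
        # Each row is one model block; columns are 20 time-lapse resistivity values
        # (log-resistivity is used so that variations are comparable between blocks).
        blocks = np.vstack([
            rng.normal(2.0, 0.05, (30, 20)),   # stable, conductive behaviour
            rng.normal(3.0, 0.30, (30, 20)),   # seasonally variable, resistive
        ])

        Z = linkage(blocks, method="ward")
        labels = fcluster(Z, t=2, criterion="maxclust")   # gather blocks into 2 clusters
        print(np.bincount(labels)[1:])                    # number of blocks per cluster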

  20. Excluding joint probabilities from quantum theory

    Science.gov (United States)

    Allahverdyan, Armen E.; Danageozian, Arshag

    2018-03-01

    Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after Born's probability for a single observable. Instead, various definitions have been suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and are consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables, in Hilbert spaces with dimension larger than two. If measurement contexts are included in the definition, joint probabilities are not excluded anymore, but they are still constrained by imprecise probabilities.

  1. Parametric and non-parametric approach for sensory RATA (Rate-All-That-Apply) method of ledre profile attributes

    Science.gov (United States)

    Hastuti, S.; Harijono; Murtini, E. S.; Fibrianto, K.

    2018-03-01

    The current study is aimed at investigating the use of parametric and non-parametric approaches for the sensory RATA (Rate-All-That-Apply) method. Ledre, a unique local food product of Bojonegoro, was used as the point of interest, with 319 panelists involved in the study. The results showed that ledre is characterized by an easily crushed texture, a sticky mouthfeel, a stingy sensation, and ease of swallowing. It also has a strong banana flavour and a brown colour. Compared to eggroll and semprong, ledre shows more variance in terms of taste as well as roll length. As the RATA questionnaire is designed to collect categorical data, a non-parametric approach is the common statistical procedure. However, similar results were also obtained with the parametric approach, despite the non-normally distributed data. Thus, it is suggested that the parametric approach can be applicable for consumer studies with a large number of respondents, even though the data may not satisfy the assumptions of ANOVA (Analysis of Variance).
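
    A hedged sketch of the comparison described above, using made-up attribute ratings rather than the ledre data: one-way ANOVA as the parametric route and Kruskal-Wallis as the non-parametric counterpart.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        # Hypothetical 0-5 intensity ratings of "banana flavour" for three products,
        # each rated by an independent group of panelists.
        ledre    = rng.integers(3, 6, 100)
        eggroll  = rng.integers(1, 4, 100)
        semprong = rng.integers(0, 3, 100)

        f_stat, p_anova   = stats.f_oneway(ledre, eggroll, semprong)   # parametric
        h_stat, p_kruskal = stats.kruskal(ledre, eggroll, semprong)    # non-parametric
        print(f"ANOVA:          F = {f_stat:.1f}, p = {p_anova:.3g}")
        print(f"Kruskal-Wallis: H = {h_stat:.1f}, p = {p_kruskal:.3g}")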

  2. A human rights-consistent approach to multidimensional welfare measurement applied to sub-Saharan Africa

    DEFF Research Database (Denmark)

    Arndt, Channing; Mahrt, Kristi; Hussain, Azhar

    2017-01-01

    The rights-based approach to development targets progress towards the realization of 30 articles set forth in the Universal Declaration of Human Rights. Progress is frequently measured using the multidimensional poverty index. While elegant and useful, the multidimensional poverty index is in reality inconsistent with the Universal Declaration of Human Rights principles of indivisibility, inalienability, and equality. We show that a first-order dominance methodology maintains consistency with basic principles, discuss the properties of the multidimensional poverty index and first...

  3. A Hybrid Approach to the Valuation of RFID/MEMS technology applied to ordnance inventory

    OpenAIRE

    Doerr, Kenneth H.; Gates, William R.; Mutty, John E.

    2006-01-01

    We report on an analysis of the costs and benefits of fielding Radio Frequency Identification / MicroElectroMechanical System (RFID/MEMS) technology for the management of ordnance inventory. A factorial model of these benefits is proposed. Our valuation approach combines a multi-criteria tool for the valuation of qualitative factors with a Monte Carlo simulation of anticipated financial factors. In a sample survey, qualitative factors are shown to account for over half of the anticipated bene...
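
    A minimal sketch of the quantitative half of such a hybrid valuation (hypothetical cost and benefit distributions, not the study's figures): Monte Carlo sampling of the uncertain financial factors yields a distribution of net benefit.

        import numpy as np

        rng = np.random.default_rng(42)
        n = 10_000
        # Hypothetical annual figures in $M: tag/reader costs vs. handling savings.
        cost    = rng.triangular(1.0, 1.5, 2.5, n)
        savings = rng.normal(2.0, 0.6, n)

        net_benefit = savings - cost
        print(f"mean net benefit: {net_benefit.mean():.2f} $M")
        print(f"P(net benefit > 0): {(net_benefit > 0).mean():.2f}")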

  4. A Guttman-Based Approach to Identifying Cumulativeness Applied to Chimpanzee Culture

    OpenAIRE

    Graber, RB; de Cock, DR; Burton, ML

    2012-01-01

    Human culture appears to build on itself-that is, to be to some extent cumulative. Whether this property is shared by culture in the common chimpanzee is controversial. The question previously has been approached, qualitatively (and inconclusively), by debating whether any chimpanzee culture traits have resulted from individuals building on one another's work ("ratcheting"). The fact that the chimpanzees at different sites have distinctive repertoires of traits affords a different avenue of a...

  5. In vitro approach to studying cutaneous metabolism and disposition of topically applied xenobiotics

    International Nuclear Information System (INIS)

    Kao, J.; Hall, J.; Shugart, L.R.; Holland, J.M.

    1984-01-01

    The extent to which cutaneous metabolism may be involved in the penetration and fate of topically applied xenobiotics was examined using metabolically viable and structurally intact mouse skin in organ culture. Evidence that skin penetration of certain chemicals is coupled to cutaneous metabolism was based upon observations utilizing [14C]benzo[a]pyrene (BP). As judged by the recovery of radioactivity in the culture medium 24 hr after in vitro topical application of [14C]BP to the skin from both control and 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD)-induced C3H mice, skin penetration of BP was higher in the induced tissue. All classes of metabolites of BP were found in the culture medium; water-soluble metabolites predominated and negligible amounts of unmetabolized BP were found. As shown by enzymatic hydrolysis of the medium, TCDD induction resulted in shifting the cutaneous metabolism of BP toward the synthesis of more water-soluble conjugates. Differences in the degree of covalent binding of BP, via diol epoxide intermediates, to epidermal DNA from control and induced tissues were observed. These differences may reflect a change in the pathways of metabolism as a consequence of TCDD induction. These results indicated that topically applied BP is metabolized by the skin during its passage through the tissue, and that the degree of percutaneous penetration and disposition of BP was dependent upon the metabolic status of the tissue. This suggests that cutaneous metabolism may play an important role in the translocation and subsequent physiological disposition of topically applied BP. 33 references, 5 figures, 2 tables

  6. Overuse tendinosis, not tendinitis part 2: applying the new approach to patellar tendinopathy.

    Science.gov (United States)

    Cook, J L; Khan, K M; Maffulli, N; Purdam, C

    2000-06-01

    Patellar tendinopathy causes substantial morbidity in both professional and recreational athletes. The condition is most common in athletes of jumping sports such as basketball and volleyball, but it also occurs in soccer, track, and tennis athletes. The disorder arises most often from collagen breakdown rather than inflammation, a tendinosis rather than a tendinitis. Physicians must address the degenerative pathology underlying patellar tendinopathy because regimens that seek to minimize (nonexistent) inflammation would appear illogical. Suggestions for applying the 'tendinosis paradigm' to patellar tendinopathy management include conservative measures such as load reduction, strengthening exercises, and massage. Surgery should be considered only after a long-term and appropriate conservative regimen has failed.

  7. Optical waveguiding and applied photonics technological aspects, experimental issue approaches and measurements

    CERN Document Server

    Massaro, Alessandro

    2012-01-01

    Optoelectronics--technology based on applications of light, such as micro/nano quantum electronics, photonic devices, and lasers for measurements and detection--has become an important field of research. Many applications and physical problems concerning optoelectronics are analyzed in Optical Waveguiding and Applied Photonics. The book is organized in order to explain how to implement innovative sensors starting from basic physical principles. Applications such as cavity resonance, filtering, tactile sensors, robotic sensors, oil spill detection, small antennas and experimental setups using lasers are a

  8. A risk analysis approach applied to field surveillance in utility meters in legal metrology

    Science.gov (United States)

    Rodrigues Filho, B. A.; Nonato, N. S.; Carvalho, A. D.

    2018-03-01

    Field surveillance represents the level of control in metrological supervision responsible for checking the conformity of measuring instruments in service. Utility meters represent the majority of measuring instruments produced by notified bodies due to self-verification in Brazil. They play a major role in the economy, since electricity, gas and water are the main inputs to industries in their production processes. To optimize the resources allocated to control these devices, the present study applied a risk analysis in order to identify, among the 11 manufacturers notified for self-verification, the instruments that demand field surveillance.

  9. New approach to K-electron-capture probabilities to the 437 and 384 keV levels in the decay of ¹³³Ba

    Energy Technology Data Exchange (ETDEWEB)

    Singh, K; Sahota, H S [Punjabi Univ., Patiala (India). Dept. of Physics

    1983-12-01

    The K-electron-capture probabilities to the 437 and 384 keV levels in the decay of ¹³³Ba have been determined from a measurement of gamma-ray intensities in conjunction with an analysis of the K x-ray-gamma-ray sum peaks. The results are independent of fluorescence yield and detector efficiency.

  10. Geometric and Dosimetric Approach to Determine Probability of Late Cardiac Mortality in Left Tangential Breast Irradiation: Comparison Between Wedged Beams and Field-in-Field Technique

    International Nuclear Information System (INIS)

    Pili, Giorgio; Grimaldi, Luca; Fidanza, Christian; Florio, Elena T.; Petruzzelli, Maria F.; D'Errico, Maria P.; De Tommaso, Cristina; Tramacere, Francesco; Musaio, Francesca; Castagna, Roberta; Francavilla, Maria C.; Gianicolo, Emilio A.L.; Portaluri, Maurizio

    2011-01-01

    Purpose: To evaluate the probability of late cardiac mortality resulting from left breast irradiation planned with tangential fields and to compare this probability between the wedged beam and field-in-field (FIF) techniques and to investigate whether some geometric/dosimetric indicators can be determined to estimate the cardiac mortality probability before treatment begins. Methods and Materials: For 30 patients, differential dose-volume histograms were calculated for the wedged beam and FIF plans, and the corresponding cardiac mortality probabilities were determined using the relative seriality model. As a comparative index of the dose distribution uniformity, the planning target volume (PTV) percentages involved in 97-103% of the prescribed dose were determined for the two techniques. Three geometric parameters were measured for each patient: the maximal length, which indicates how much the heart contours were displaced toward the PTV; the angle subtended at the center of the computed tomography slice by the PTV contour; and the thorax width/thickness ratio. Results: Evaluating the differential dose-volume histograms showed that the gain in uniformity between the two techniques was about 1.5. With the FIF technique, the mean dose sparing for the heart, the left anterior descending coronary artery, and the lung was 15% (2.5 Gy vs. 2.2 Gy), 21% (11.3 Gy vs. 9.0 Gy), and 42% (8.0 Gy vs. 4.6 Gy) respectively, compared with the wedged beam technique. Also, the cardiac mortality probability decreased by 40% (from 0.9% to 0.5%). Three geometric parameters, the maximal length, angle subtended at the center of the computed tomography slice by the PTV contour, and thorax width/thickness ratio, were the determining factors (p = .06 for FIF, and p = .10 for wedged beam) for evaluating the cardiac mortality probability. Conclusion: The FIF technique seemed to yield a lower cardiac mortality probability than the conventional wedged beam technique. However, although our study
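
    For context, the relative seriality model named above combines sub-volume response probabilities roughly as follows (stated from the general literature, not from the paper): with s the relative seriality parameter, v_i the fractional sub-volume receiving dose D_i, and P(D_i) the response probability of that sub-volume at uniform dose,

    \[
    P_{\mathrm{organ}} \;=\; \Bigl[\, 1 - \prod_{i} \bigl(1 - P(D_i)^{\,s}\bigr)^{\,v_i} \Bigr]^{1/s}.
    \]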

  11. The GRADE approach for assessing new technologies as applied to apheresis devices in ulcerative colitis

    Directory of Open Access Journals (Sweden)

    Cabriada-Nuño Jose

    2010-06-01

    Full Text Available Abstract. Background: In the last few years, a new non-pharmacological treatment, termed apheresis, has been developed to lessen the burden of ulcerative colitis (UC). Several methods can be used to establish treatment recommendations, but over the last decade an informal collaboration group of guideline developers, methodologists, and clinicians has developed a more sensible and transparent approach known as the Grading of Recommendations, Assessment, Development and Evaluation (GRADE). GRADE has mainly been used in clinical practice guidelines and systematic reviews. The aim of the present study is to describe the use of this approach in the development of recommendations for a new health technology, and to analyse the strengths, weaknesses, opportunities, and threats found when doing so. Methods: A systematic review of the use of apheresis for UC treatment was performed in June 2004 and updated in May 2008. Two related clinical questions were selected, the outcomes of interest defined, and the quality of the evidence assessed. Finally, the overall quality of each question was taken into account to formulate recommendations following the GRADE approach. To evaluate this experience, a SWOT (strengths, weaknesses, opportunities and threats) analysis was performed to enable a comparison with our previous experience with the SIGN (Scottish Intercollegiate Guidelines Network) method. Results: Application of the GRADE approach allowed recommendations to be formulated and the method to be clarified and made more explicit and transparent. Two weak recommendations were proposed to answer the formulated questions. Some challenges, such as the limited number of studies found for the new technology and the difficulties encountered when searching for the results for the selected outcomes, none of which are specific to GRADE, were identified. GRADE was considered to be a more time-consuming method, although it has the advantage of taking into account patient

  12. Recruiting the next generation: applying a values-based approach to recruitment.

    Science.gov (United States)

    Ritchie, Georgina; Ashworth, Lisa; Bades, Annette

    2018-05-02

    The qualified district nurse (DN) role demands high levels of leadership. Attracting the right candidates to apply for the Specialist Practice Qualification District Nursing (SPQDN) education programme is essential to ensure fitness to practice on qualification. Anecdotal evidence suggested that the traditional panel interview discouraged candidates from applying and a need to improve the quality of the overall interview process was identified by the authors. The University of Central Lancashire in partnership with Lancashire Care NHS Foundation Trust adopted the National Values Based Recruitment (VBR) Framework to select candidates to gain entry onto the SPQDN course. This involved using 'selection centres' of varying activities including a multiple mini interview, written exercise, group discussion, and portfolio review with scores attached to each centre. The ultimate aim of utilising VBR was to align personal and profession values to both the nursing profession and the Trust whilst allowing a fairer assessment process. An evaluation of the VBR recruitment process demonstrated 100% pass rate for the course and 100% satisfaction with the interview process reported by all 16 candidates over three academic years. Interviewer feedback showed deeper insight into the candidates' skills and values aligned with the core values and skills required by future District Nurse leaders within the Trust.

  13. Applying a radiomics approach to predict prognosis of lung cancer patients

    Science.gov (United States)

    Emaminejad, Nastaran; Yan, Shiju; Wang, Yunzhi; Qian, Wei; Guan, Yubao; Zheng, Bin

    2016-03-01

    Radiomics is an emerging technology to decode tumor phenotype based on quantitative analysis of image features computed from radiographic images. In this study, we applied the Radiomics concept to investigate the association among the CT image features of lung tumors, which were either quantitatively computed or subjectively rated by radiologists, and two genomic biomarkers, namely protein expression of the excision repair cross-complementing 1 (ERCC1) gene and a regulatory subunit of ribonucleotide reductase (RRM1), in predicting disease-free survival (DFS) of lung cancer patients after surgery. An image dataset involving 94 patients was used. Among them, 20 had cancer recurrence within 3 years, while 74 patients remained DFS. After tumor segmentation, 35 image features were computed from CT images. Using the Weka data mining software package, we selected 10 non-redundant image features. Applying a SMOTE algorithm to generate synthetic data to balance case numbers in the two DFS ("yes" and "no") groups and a leave-one-case-out training/testing method, we optimized and compared a number of machine learning classifiers using (1) quantitative image (QI) features, (2) subjectively rated (SR) features, and (3) genomic biomarkers (GB). Data analyses showed relatively low correlation among the QI, SR and GB prediction results (with Pearson correlation coefficients below 0.5). Among them, using QI yielded the highest performance.
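
    A hedged sketch of the class balancing and leave-one-case-out evaluation described above, on synthetic features rather than the CT data (imblearn's SMOTE and a support vector classifier stand in for the study's tools):

        import numpy as np
        from imblearn.over_sampling import SMOTE
        from sklearn.model_selection import LeaveOneOut
        from sklearn.svm import SVC

        rng = np.random.default_rng(3)
        X = rng.normal(size=(94, 10))        # 10 selected image features per patient
        y = np.array([1] * 20 + [0] * 74)    # 20 recurrences, 74 disease-free

        correct = 0
        for train_idx, test_idx in LeaveOneOut().split(X):
            # Oversample the minority class in the training fold only.
            X_bal, y_bal = SMOTE(random_state=0).fit_resample(X[train_idx], y[train_idx])
            clf = SVC().fit(X_bal, y_bal)
            correct += int(clf.predict(X[test_idx])[0] == y[test_idx][0])
        print(f"leave-one-case-out accuracy: {correct / len(y):.2f}")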

  14. Essays on environmental policy analysis: Computable general equilibrium approaches applied to Sweden

    International Nuclear Information System (INIS)

    Hill, M.

    2001-01-01

    This thesis consists of three essays within the field of applied environmental economics, with the common basic aim of analyzing effects of Swedish environmental policy. Starting out from Swedish environmental goals, the thesis assesses a range of policy-related questions. The objective is to quantify policy outcomes by constructing and applying numerical models especially designed for environmental policy analysis. Static and dynamic multi-sectoral computable general equilibrium models are developed in order to analyze the following issues. The costs and benefits of a domestic carbon dioxide (CO2) tax reform. Special attention is given to how these costs and benefits depend on the structure of the tax system and, furthermore, how they depend on policy-induced changes in 'secondary' pollutants. The effects of allowing for emission permit trading through time when the long-term domestic environmental goal is specified in CO2 stock terms. The effects on long-term projected economic growth and welfare that are due to damages from emission flow and accumulation of 'local' pollutants (nitrogen oxides and sulfur dioxide), as well as the outcome of environmental policy when costs and benefits are considered in an integrated environmental-economic framework

  15. Skill-Based Approach Applied to Gifted Students, its Potential in Latin America

    Directory of Open Access Journals (Sweden)

    Andrew Alexi Almazán-Anaya

    2015-09-01

    Full Text Available This paper presents, as a reflective essay, the current educational situation of gifted students (with more intelligence than the average) in Latin America and the possibility of using skill-based education within differentiated programs (intended for gifted individuals), a sector where scarce scientific studies have been done and a consensus on an ideal educative model has not yet been reached. Currently these students, in general, lack specialized educational assistance intended to identify and develop their cognitive abilities, so it is estimated that a high percentage (95%) of this population goes undetected in the traditional education system. Although there are differentiated education models, they are rarely applied. A student-centered education program is a solution proposed to apply this pedagogical model and cover such a population. The characteristics of this program that support differentiated instruction for gifted individuals, compatible with experiences in the US, Europe and Latin America, are analyzed. Finally, this paper concludes with an analysis of possible research areas that, if explored in the future, would help us to find answers about the feasibility of, and the relation between, skill-based programs and differentiated education for gifted students.

  16. A Systematic Approach to Applying Lean Techniques to Optimize an Office Process at the Y-12 National Security Complex

    Energy Technology Data Exchange (ETDEWEB)

    Credille, Jennifer [Y-12 National Security Complex, Oak Ridge, TN (United States); Univ. of Tennessee, Knoxville, TN (United States); Owens, Elizabeth [Y-12 National Security Complex, Oak Ridge, TN (United States); Univ. of Tennessee, Knoxville, TN (United States)

    2017-10-11

    This capstone offers an introduction of Lean concepts into an office activity to demonstrate the versatility of Lean. Traditionally, Lean has been associated with process improvements applied in an industrial setting. However, this paper demonstrates that implementing Lean concepts within an office activity can also result in significant process improvements. Lean first emerged with the conception of the Toyota Production System, an innovative concept designed to improve productivity in the automotive industry by eliminating waste and variation. Lean has also been applied to office environments; however, the limited literature reveals that most Lean applications within an office are restricted to one or two techniques. Our capstone confronts these restrictions by introducing a systematic approach that utilizes multiple Lean concepts. The approach incorporates system analysis, system reliability, system requirements, and system feasibility. The methodical Lean outline provides tools for a successful outcome, ensures the process is thoroughly dissected, and can be applied to any process in any work environment.

  17. Model uncertainty: Probabilities for models?

    International Nuclear Information System (INIS)

    Winkler, R.L.

    1994-01-01

    Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work that is underway to address these questions looks very promising.
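
    The basic idea of weighting models by probabilities, rather than selecting a single one, can be illustrated with a simple Bayesian model averaging sketch. This is a generic illustration with made-up numbers, not the specific probabilistic formulation the paper advocates.

```python
# Generic Bayesian model averaging sketch: instead of picking one model,
# predictions are weighted by each model's posterior probability.
# Purely illustrative; not the specific formulation discussed in the paper.
import numpy as np

def posterior_model_probs(log_likelihoods, priors):
    """Combine prior model probabilities with observed-data log-likelihoods."""
    log_post = np.log(priors) + np.array(log_likelihoods)
    log_post -= log_post.max()              # numerical stability
    post = np.exp(log_post)
    return post / post.sum()

# Three candidate models with prior weights and data log-likelihoods (made up).
priors = np.array([0.5, 0.3, 0.2])
log_liks = [-12.1, -10.4, -15.8]
weights = posterior_model_probs(log_liks, priors)

# Model-averaged prediction of some quantity of interest (e.g. a risk estimate).
model_predictions = np.array([1.0e-4, 2.5e-4, 0.7e-4])
print("posterior model probabilities:", weights)
print("model-averaged risk estimate:", weights @ model_predictions)
```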

  18. New research at Paisley Caves: applying new integrated analytical approaches to understanding stratigraphy, taphonomy, and site formation processes

    OpenAIRE

    Shillito, Lisa-Marie; Blong, John C; Jenkins, Dennis L; Stafford Jr, Thomas W; Whelton, Helen; McDonough, Katelyn; Bull, Ian

    2018-01-01

    Paisley Caves in Oregon has become well known due to its early dates and to evidence of human presence in the form of coprolites found to contain ancient human DNA. Questions remain over whether the coprolites themselves are human, or whether the DNA is mobile in the sediments. This brief introduces new research applying an integrated analytical approach combining sediment micromorphology and lipid biomarker analysis, which aims to resolve these problems.

  19. Comprehensive Approach to Verification and Validation of CFD Simulations Applied to Backward Facing Step-Application of CFD Uncertainty Analysis

    Science.gov (United States)

    Groves, Curtis E.; LLie, Marcel; Shallhorn, Paul A.

    2012-01-01

    There are inherent uncertainties and errors associated with using Computational Fluid Dynamics (CFD) to predict the flow field, and there is no standard method for evaluating uncertainty in the CFD community. This paper describes an approach to validate the uncertainty in using CFD. The method uses state-of-the-art uncertainty analysis, applying different turbulence models, and draws conclusions on which models provide the least uncertainty and which models most accurately predict the flow over a backward facing step.

  20. The research of approaches of applying the results of big data analysis in higher education

    Science.gov (United States)

    Kochetkov, O. T.; Prokhorov, I. V.

    2017-01-01

    This article briefly discusses approaches to the use of Big Data in the educational process of higher educational institutions. It gives a brief description of the nature of Big Data and its distribution in the education industry, and offers new ways to use Big Data as part of the educational process. The article also describes a method for analysing relevant search requests using Yandex.Wordstat (for laboratory work on data processing) and Google Trends (for an up-to-date picture of interest and preferences in a higher education institution).

  1. Multidisciplinary approach of early breast cancer: The biology applied to radiation oncology

    International Nuclear Information System (INIS)

    Bourgier, Céline; Ozsahin, Mahmut; Azria, David

    2010-01-01

    Early breast cancer treatment is based on a multimodality approach with the application of clinical and histological prognostic factors to determine locoregional and systemic treatments. The entire scientific community is strongly involved in the management of this disease: radiologists for screening and early diagnosis, gynecologists, surgical oncologists and radiation oncologists for locoregional treatment, pathologists and biologists for personalized characterization, genetic counselors for BRCA mutation history and medical oncologists for systemic therapies. Recently, new biological tools have established various prognostic subsets of breast cancer and developed predictive markers for miscellaneous treatments. The aim of this article is to highlight the contribution of biological tools in the locoregional management of early breast cancer

  2. Current Methods Applied to Biomaterials - Characterization Approaches, Safety Assessment and Biological International Standards.

    Science.gov (United States)

    Oliveira, Justine P R; Ortiz, H Ivan Melendez; Bucio, Emilio; Alves, Patricia Terra; Lima, Mayara Ingrid Sousa; Goulart, Luiz Ricardo; Mathor, Monica B; Varca, Gustavo H C; Lugao, Ademar B

    2018-04-10

    Safety and biocompatibility assessment of biomaterials are themes of constant concern as advanced materials enter the market and products manufactured by new techniques emerge. Within this context, this review provides an up-to-date approach to current methods for the characterization and safety assessment of biomaterials and biomedical devices from a physical-chemical to a biological perspective, including a description of the alternative methods in accordance with current and established international standards.

  3. Chemical, spectroscopic, and ab initio modelling approach to interfacial reactivity applied to anion retention by siderite

    International Nuclear Information System (INIS)

    Badaut, V.

    2010-07-01

    Among the many radionuclides contained in high-level nuclear waste, 79Se was identified as a potential threat to the safety of long-term underground storage. However, siderite (FeCO3) is known to form upon corrosion of the waste container, and the impact of this mineral on the fate of selenium was not accounted for. In this work, the interactions between selenium oxyanions (selenate and selenite) and siderite were investigated. To this end, both experimental characterizations (solution chemistry, X-ray Absorption Spectroscopy, XAS) and theoretical studies (ab initio modelling using Density Functional Theory, DFT) were performed. Selenite and selenate (≤ 10⁻³ M) retention experiments with siderite suspensions (75 g/L) at neutral pH in a reducing glovebox (5% H2) showed that selenite is quantitatively immobilized by siderite after 48 h of reaction time, whereas selenate is only partly immobilized after 10 days. In the selenite case, XAS showed that immobilized selenium is initially present as Se(IV), probably sorbed on the siderite surface. After 10 days of reaction, selenite ions are quantitatively reduced and form poorly crystalline elementary selenium. Selenite retention and reduction kinetics are therefore distinct. On the other hand, the fraction of immobilized selenate retained in the solid fraction does not appear to be significantly reduced over the probed timescale (10 days). For a better understanding of the reduction mechanism of selenite ions by siderite, the properties of bulk and perfect surfaces of siderite were modelled using DFT. We suggest that the properties of the valence electrons can be correctly described only if the symmetry of the fundamental-state electronic density is lower than the experimental crystallographic symmetry. We then show that the retention of simple molecules such as O2 or H2O on siderite and magnesite (10-14) perfect surfaces (the perfect cleavage plane, whose surface energy is the lowest according to DFT) can be modelled with

  4. Method to integrate clinical guidelines into the electronic health record (EHR) by applying the archetypes approach.

    Science.gov (United States)

    Garcia, Diego; Moro, Claudia Maria Cabral; Cicogna, Paulo Eduardo; Carvalho, Deborah Ribeiro

    2013-01-01

    Clinical guidelines are documents that assist healthcare professionals, facilitating and standardizing diagnosis, management, and treatment in specific areas. Computerized guidelines implemented as decision support systems (DSS) attempt to increase the performance of tasks and facilitate the use of guidelines. Most DSS are not integrated into the electronic health record (EHR), requiring some degree of rework, especially related to data collection. This study's objective was to present a method for integrating clinical guidelines into the EHR. The study first developed a way to identify the data and rules contained in the guidelines, and then incorporated the rules into an archetype-based EHR. The proposed method was tested on anemia treatment in the Chronic Kidney Disease Guideline. The phases of the method are: data and rule identification; archetype elaboration; rule definition and inclusion in an inference engine; and DSS-EHR integration and validation. The main feature of the proposed method is that it is generic and can be applied to any type of guideline.
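
    To make the "rules in an inference engine" step concrete, the sketch below encodes a couple of simple if-then rules against structured lab data. The field names and threshold values are purely hypothetical illustrations, not the actual rules of the Chronic Kidney Disease guideline or of the authors' archetypes.

```python
# Hypothetical sketch of encoding guideline rules against archetype-like EHR data.
# Thresholds and field names are illustrative only, not the actual CKD anemia guideline.
from dataclasses import dataclass

@dataclass
class LabResult:
    hemoglobin_g_dl: float
    ferritin_ng_ml: float
    tsat_percent: float

def anemia_recommendations(labs: LabResult) -> list[str]:
    """Apply simple if-then rules of the kind a DSS inference engine would hold."""
    advice = []
    if labs.hemoglobin_g_dl < 10.0:                           # illustrative cut-off
        advice.append("Evaluate for ESA therapy per guideline.")
    if labs.ferritin_ng_ml < 100 or labs.tsat_percent < 20:   # illustrative cut-offs
        advice.append("Consider iron supplementation before ESA.")
    return advice or ["No anemia-specific action triggered."]

print(anemia_recommendations(LabResult(hemoglobin_g_dl=9.2, ferritin_ng_ml=80, tsat_percent=18)))
```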

  5. Transforming Collaborative Process Models into Interface Process Models by Applying an MDA Approach

    Science.gov (United States)

    Lazarte, Ivanna M.; Chiotti, Omar; Villarreal, Pablo D.

    Collaborative business models among enterprises require defining collaborative business processes. Enterprises implement B2B collaborations to execute these processes. In B2B collaborations the integration and interoperability of processes and systems of the enterprises are required to support the execution of collaborative processes. From a collaborative process model, which describes the global view of the enterprise interactions, each enterprise must define the interface process that represents the role it performs in the collaborative process in order to implement the process in a Business Process Management System. Hence, in this work we propose a method for the automatic generation of the interface process model of each enterprise from a collaborative process model. This method is based on a Model-Driven Architecture to transform collaborative process models into interface process models. By applying this method, interface processes are guaranteed to be interoperable and defined according to a collaborative process.

  6. A systematic approach applied in design of a micro heat exchanger

    DEFF Research Database (Denmark)

    Omidvarnia, Farzaneh; Hansen, Hans Nørgaard; Sarhadi, Ali

    2016-01-01

    The number of products benefiting from micro components in the market is increasing, and consequently, the demand for well-matched tools, equipment and systems with micro features is increasing as well. During the design process of micro products, a number of issues appear which are inherent due to the down-scaling or to physical phenomena dominating in the micro range but negligible at the macro scale. In fact, some aspects of design for micro manufacturing are considerably different compared to the design procedure taken at the macro level. Identifying the differences between design ... from the design process of the micro heat exchanger are added to the RTC unit and can be applied as guidelines in the design process of any other micro heat exchanger. In other words, the current study can provide a useful guideline in design for manufacturing of micro products.

  7. Applying a Systems Approach to Monitoring and Assessing Climate Change Mitigation Potential in Mexico's Forest Sector

    Science.gov (United States)

    Olguin-Alvarez, M. I.; Wayson, C.; Fellows, M.; Birdsey, R.; Smyth, C.; Magnan, M.; Dugan, A.; Mascorro, V.; Alanís, A.; Serrano, E.; Kurz, W. A.

    2017-12-01

    Since 2012, the Mexican government through its National Forestry Commission, with support from the Commission for Environmental Cooperation, the Forest Services of Canada and USA, the SilvaCarbon Program and research institutes in Mexico, has made important progress towards the use of carbon dynamics models ("gain-loss" approach) for greenhouse gas (GHG) emissions monitoring and projections into the future. Here we assess the biophysical mitigation potential of policy alternatives identified by the Mexican Government (e.g. net zero deforestation rate, sustainable forest management) based on a systems approach that models carbon dynamics in forest ecosystems, harvested wood products and substitution benefits in two contrasting states of Mexico. We provide key messages and results derived from the use of the Carbon Budget Model of the Canadian Forest Sector and a harvested wood products model, parameterized with input data from Mexico's National Forest Monitoring System (e.g. forest inventories, remote sensing, disturbance data). The ultimate goal of this tri-national effort is to develop data and tools for carbon assessment in strategic landscapes in North America, emphasizing the need to include multiple sectors and types of collaborators (scientific and policy-maker communities) to design more comprehensive portfolios for climate change mitigation in accordance with the Paris Agreement of the United Nations Framework Convention on Climate Change (e.g. Mid-Century Strategy, NDC goals).

  8. Safety, reliability, risk management and human factors: an integrated engineering approach applied to nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    Vasconcelos, Vanderley de; Silva, Eliane Magalhaes Pereira da; Costa, Antonio Carlos Lopes da; Reis, Sergio Carneiro dos [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)], e-mail: vasconv@cdtn.br, e-mail: silvaem@cdtn.br, e-mail: aclc@cdtn.br, e-mail: reissc@cdtn.br

    2009-07-01

    Nuclear energy has an important engineering legacy to share with conventional industry. Much of the development of the tools related to safety, reliability, risk management, and human factors is associated with nuclear plant processes, mainly because of public concern about nuclear power generation. Despite the close association between these subjects, there are some important differences in approach. The reliability engineering approach uses several techniques to minimize the component failures that cause the failure of complex systems. These techniques include, for instance, redundancy, diversity, standby sparing, safety factors, and reliability-centered maintenance. System safety, on the other hand, is primarily concerned with hazard management, that is, the identification, evaluation and control of hazards. Rather than just looking at failure rates or engineering strengths, system safety examines the interactions among system components. The events that cause accidents may be complex combinations of component failures, faulty maintenance, design errors, human actions, or actuation of instrumentation and control. System safety therefore deals with a broader spectrum of risk management, including ergonomics, legal requirements, quality control, public acceptance, political considerations, and many other non-technical influences. Addressing these subjects individually can compromise the completeness of the analysis and the measures associated with both risk reduction and the improvement of safety and reliability. By analyzing together the engineering systems and controls of a nuclear facility, their management systems and operational procedures, and the human factors engineering, many benefits can be realized. This paper proposes an integration of these issues based on the application of systems theory. (author)

  9. Applying attachment theory to effective practice with hard-to-reach youth: the AMBIT approach.

    Science.gov (United States)

    Bevington, Dickon; Fuggle, Peter; Fonagy, Peter

    2015-01-01

    Adolescent Mentalization-Based Integrative Treatment (AMBIT) is a developing approach to working with "hard-to-reach" youth burdened with multiple co-occurring morbidities. This article reviews the core features of AMBIT, exploring applications of attachment theory to understand what makes young people "hard to reach," and provide routes toward increased security in their attachment to a worker. Using the theory of the pedagogical stance and epistemic ("pertaining to knowledge") trust, we show how it is the therapeutic worker's accurate mentalizing of the adolescent that creates conditions for new learning, including the establishment of alternative (more secure) internal working models of helping relationships. This justifies an individual keyworker model focused on maintaining a mentalizing stance toward the adolescent, but simultaneously emphasizing the critical need for such keyworkers to remain well connected to their wider team, avoiding activation of their own attachment behaviors. We consider the role of AMBIT in developing a shared team culture (shared experiences, shared language, shared meanings), toward creating systemic contexts supportive of such relationships. We describe how team training may enhance the team's ability to serve as a secure base for keyworkers, and describe an innovative approach to treatment manualization, using a wiki format as one way of supporting this process.

  10. Multiobjective scatter search approach with new combination scheme applied to solve environmental/economic dispatch problem

    International Nuclear Information System (INIS)

    Athayde Costa e Silva, Marsil de; Klein, Carlos Eduardo; Mariani, Viviana Cocco; Santos Coelho, Leandro dos

    2013-01-01

    The environmental/economic dispatch (EED) is an important daily optimization task in the operation of many power systems. It involves the simultaneous optimization of fuel cost and emission objectives, which are conflicting. The EED problem can be formulated as a large-scale, highly constrained, nonlinear multiobjective optimization problem. In recent years, many metaheuristic optimization approaches have been reported in the literature to solve the multiobjective EED. Among metaheuristics, scatter search approaches have recently received increasing attention because of their potential to effectively explore a wide range of complex optimization problems. This paper proposes an improved scatter search (ISS) to deal with multiobjective EED problems, based on the concepts of Pareto dominance and crowding distance and a new scheme for the combination method. We have considered the standard IEEE (Institute of Electrical and Electronics Engineers) 30-bus system with 6 generators, and the results obtained by the proposed ISS algorithm are compared with other recently reported results in the literature. Simulation results demonstrate that the proposed ISS algorithm is a capable candidate for solving multiobjective EED problems. - Highlights: ► Economic dispatch. ► We solve the environmental/economic power dispatch problem with scatter search. ► Multiobjective scatter search can effectively improve the global search ability
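
    Two of the building blocks named above, Pareto dominance and crowding distance, can be sketched as follows for minimization over the (fuel cost, emission) objective pair. This is a generic illustration of those concepts, not the authors' improved scatter search or its new combination scheme.

```python
# Generic building blocks of the multiobjective machinery mentioned above:
# Pareto dominance and crowding distance for minimization of (fuel cost, emission).
import numpy as np

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all <=, at least one <)."""
    a, b = np.asarray(a), np.asarray(b)
    return bool(np.all(a <= b) and np.any(a < b))

def crowding_distance(front):
    """Crowding distance of each point in a non-dominated front (rows = solutions)."""
    front = np.asarray(front, dtype=float)
    n, m = front.shape
    dist = np.zeros(n)
    for j in range(m):
        order = np.argsort(front[:, j])
        dist[order[0]] = dist[order[-1]] = np.inf        # keep boundary points
        span = front[order[-1], j] - front[order[0], j] or 1.0
        for k in range(1, n - 1):
            dist[order[k]] += (front[order[k + 1], j] - front[order[k - 1], j]) / span
    return dist

# Toy (cost, emission) values for illustration.
front = [(605.0, 0.22), (620.0, 0.20), (640.0, 0.19)]
print("crowding distances:", crowding_distance(front))
print("(600, 0.21) dominates (620, 0.22)?", dominates((600.0, 0.21), (620.0, 0.22)))
```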

  11. Safety, reliability, risk management and human factors: an integrated engineering approach applied to nuclear facilities

    International Nuclear Information System (INIS)

    Vasconcelos, Vanderley de; Silva, Eliane Magalhaes Pereira da; Costa, Antonio Carlos Lopes da; Reis, Sergio Carneiro dos

    2009-01-01

    Nuclear energy has an important engineering legacy to share with conventional industry. Much of the development of the tools related to safety, reliability, risk management, and human factors is associated with nuclear plant processes, mainly because of public concern about nuclear power generation. Despite the close association between these subjects, there are some important differences in approach. The reliability engineering approach uses several techniques to minimize the component failures that cause the failure of complex systems. These techniques include, for instance, redundancy, diversity, standby sparing, safety factors, and reliability-centered maintenance. System safety, on the other hand, is primarily concerned with hazard management, that is, the identification, evaluation and control of hazards. Rather than just looking at failure rates or engineering strengths, system safety examines the interactions among system components. The events that cause accidents may be complex combinations of component failures, faulty maintenance, design errors, human actions, or actuation of instrumentation and control. System safety therefore deals with a broader spectrum of risk management, including ergonomics, legal requirements, quality control, public acceptance, political considerations, and many other non-technical influences. Addressing these subjects individually can compromise the completeness of the analysis and the measures associated with both risk reduction and the improvement of safety and reliability. By analyzing together the engineering systems and controls of a nuclear facility, their management systems and operational procedures, and the human factors engineering, many benefits can be realized. This paper proposes an integration of these issues based on the application of systems theory. (author)

  12. Applying quantitative structure–activity relationship approaches to nanotoxicology: Current status and future potential

    International Nuclear Information System (INIS)

    Winkler, David A.; Mombelli, Enrico; Pietroiusti, Antonio; Tran, Lang; Worth, Andrew; Fadeel, Bengt; McCall, Maxine J.

    2013-01-01

    The potential (eco)toxicological hazard posed by engineered nanoparticles is a major scientific and societal concern since several industrial sectors (e.g. electronics, biomedicine, and cosmetics) are exploiting the innovative properties of nanostructures resulting in their large-scale production. Many consumer products contain nanomaterials and, given their complex life-cycle, it is essential to anticipate their (eco)toxicological properties in a fast and inexpensive way in order to mitigate adverse effects on human health and the environment. In this context, the application of the structure–toxicity paradigm to nanomaterials represents a promising approach. Indeed, according to this paradigm, it is possible to predict toxicological effects induced by chemicals on the basis of their structural similarity with chemicals for which toxicological endpoints have been previously measured. These structure–toxicity relationships can be quantitative or qualitative in nature and they can predict toxicological effects directly from the physicochemical properties of the entities (e.g. nanoparticles) of interest. Therefore, this approach can aid in prioritizing resources in toxicological investigations while reducing the ethical and monetary costs that are related to animal testing. The purpose of this review is to provide a summary of recent key advances in the field of QSAR modelling of nanomaterial toxicity, to identify the major gaps in research required to accelerate the use of quantitative structure–activity relationship (QSAR) methods, and to provide a roadmap for future research needed to achieve QSAR models useful for regulatory purposes

  13. Applying the Analog Configurability Test Approach in a Wireless Sensor Network Application

    Directory of Open Access Journals (Sweden)

    Agustín Laprovitta

    2014-01-01

    This work addresses the application of the analog configurability test (ACT) approach for an embedded analog configurable circuit (EACC), composed of operational amplifiers and interconnection resources, that is embedded in the MSP430xG461x microcontroller family. This test strategy is particularly useful for in-field applications requiring reliability, safe operation, or fault tolerance. Our test proposal consists of programming a reduced set of available configurations for the EACC and testing its functionality by measuring only a few key parameters. The processor executes an embedded test routine that sequentially programs the selected configurations, sets the test stimulus, acquires data from the internal ADC, and performs the required calculations. The test approach is experimentally evaluated on a real application board based on an embedded system. Our experimental results show very good repeatability, with very low errors. These results show that the ACT proposed here is useful for testing the functionality of the circuit under test in a real application context, using a simple strategy at very low cost.

  14. Developing scenarios to assess future landslide risks: a model-based approach applied to mountainous regions

    Science.gov (United States)

    Vacquie, Laure; Houet, Thomas

    2016-04-01

    In the last century, European mountain landscapes have experienced significant transformations. Natural and anthropogenic changes, climate change, touristic and industrial development, socio-economic interactions, and their implications in terms of LUCC (land use and land cover changes) have directly influenced the spatial organization and vulnerability of mountain landscapes. This study is conducted as part of the SAMCO project funded by the French National Science Agency (ANR). It aims at developing a methodological approach, combining various tools, modelling platforms and methods, to identify regions vulnerable to landslide hazards while accounting for future LUCC. It presents an integrated approach combining participative scenarios and LULC change simulation models to assess the combined effects of LUCC and climate change on landslide risks in the Cauterets valley (French Pyrenees) up to 2100. Through vulnerability and risk mapping, the objective is to gather information to support landscape planning and to implement land use strategies with local stakeholders for risk management. Four contrasting scenarios are developed, exhibiting distinct trajectories of socio-economic development. Prospective scenarios are based on national and international socio-economic contexts relying on existing assessment reports. The methodological approach integrates knowledge from local stakeholders to refine each scenario during its construction and to reinforce its plausibility and relevance by accounting for local specificities, e.g. logging and pastoral activities, touristic development, urban planning, etc. A process-based model, the Forecasting Scenarios for Mountains (ForeSceM) model, developed on the Dinamica Ego modelling platform, is used to spatially allocate future LUCC for each prospective scenario. Concurrently, a spatial decision support tool, the SYLVACCESS model, is used to identify accessible areas for forestry in scenario projecting logging

  15. Applying systems theory to the evaluation of a whole school approach to violence prevention.

    Science.gov (United States)

    Kearney, Sarah; Leung, Loksee; Joyce, Andrew; Ollis, Debbie; Green, Celia

    2016-02-01

    Issue addressed Our Watch led a complex 12-month evaluation of a whole school approach to Respectful Relationships Education (RRE) implemented in 19 schools. RRE is an emerging field aimed at preventing gender-based violence. This paper will illustrate how from an implementation science perspective, the evaluation was a critical element in the change process at both a school and policy level. Methods Using several conceptual approaches from systems science, the evaluation sought to examine how the multiple systems layers - student, teacher, school, community and government - interacted and influenced each other. A distinguishing feature of the evaluation included 'feedback loops'; that is, evaluation data was provided to participants as it became available. Evaluation tools included a combination of standardised surveys (with pre- and post-intervention data provided to schools via individualised reports), reflection tools, regular reflection interviews and summative focus groups. Results Data was shared during implementation with project staff, department staff and schools to support continuous improvement at these multiple systems levels. In complex settings, implementation can vary according to context; and the impact of evaluation processes, tools and findings differed across the schools. Interviews and focus groups conducted at the end of the project illustrated which of these methods were instrumental in motivating change and engaging stakeholders at both a school and departmental level and why. Conclusion The evaluation methods were a critical component of the pilot's approach, helping to shape implementation through data feedback loops and reflective practice for ongoing, responsive and continuous improvement. Future health promotion research on complex interventions needs to examine how the evaluation itself is influencing implementation. So what? The pilot has demonstrated that the evaluation, including feedback loops to inform project activity, were an

  16. Local approach of cleavage fracture applied to a vessel with subclad flaw. A benchmark on computational simulation

    International Nuclear Information System (INIS)

    Moinereau, D.; Brochard, J.; Guichard, D.; Bhandari, S.; Sherry, A.; France, C.

    1996-10-01

    A benchmark on the computational simulation of a cladded vessel with a 6.2 mm sub-clad flaw submitted to a thermal transient has been conducted. Two-dimensional elastic and elastic-plastic finite element computations of the vessel have been performed by the different partners with their respective finite element codes ASTER (EDF), CASTEM 2000 (CEA), SYSTUS (Framatome) and ABAQUS (AEA Technology). The main results have been compared: temperature field in the vessel, crack opening, opening stress at crack tips, stress intensity factor in cladding and base metal, Weibull stress σw and probability of failure in base metal, and void growth rate R/R0 in cladding. This comparison shows an excellent agreement on the main results, in particular on results obtained with the local approach. (K.A.)
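
    The Weibull stress and failure probability compared above follow the Beremin-type local approach, in which the maximum principal stress over the plastically strained volume is integrated into a Weibull stress and then mapped to a cleavage failure probability. The sketch below illustrates those two formulas with assumed Weibull parameters and toy element data, not the benchmark partners' actual values.

```python
# Minimal sketch of the Beremin-type local approach: the Weibull stress is an
# integral of maximum principal stress over the plastic zone and maps to a
# cleavage failure probability. Parameters (m, sigma_u, V0) are assumed values.
import numpy as np

def weibull_stress(sigma_I, volumes, m, V0):
    """sigma_w = [ sum(sigma_I^m * dV) / V0 ]^(1/m) over plastically strained cells."""
    return (np.sum(sigma_I**m * volumes) / V0) ** (1.0 / m)

def failure_probability(sigma_w, sigma_u, m):
    """P_f = 1 - exp[-(sigma_w / sigma_u)^m]."""
    return 1.0 - np.exp(-((sigma_w / sigma_u) ** m))

# Toy data: principal stresses (MPa) and volumes (mm^3) of elements in the process zone.
sigma_I = np.array([1450.0, 1520.0, 1610.0, 1580.0])
dV = np.array([0.8, 1.1, 0.6, 0.9])
m, sigma_u, V0 = 22.0, 2600.0, 0.125   # assumed Beremin parameters, not the benchmark's

sw = weibull_stress(sigma_I, dV, m, V0)
print(f"Weibull stress = {sw:.0f} MPa, P_f = {failure_probability(sw, sigma_u, m):.3e}")
```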

  17. Contemporary approaches to control system specification and design applied to KAON

    International Nuclear Information System (INIS)

    Ludgate, G.A.; Osberg, E.A.; Dohan, D.A.

    1991-05-01

    Large data acquisition and control systems have evolved from early centralized computer systems to become multi-processor, distributed systems. While the complexity of these systems has increased our ability to reliably manage their construction has not kept pace. Structured analysis and real-time structured analysis have been used successfully to specify systems but, from a project management viewpoint, both lead to different classes of problems during implementation and maintenance. The KAON Factory central control system study employed a uniform approach to requirements analysis and architectural design. The methodology was based on well established object-oriented principles and was free of the problems inherent in the older methodologies. The methodology is presently being used to implement two systems at TRIUMF. (Author) 12 refs

  18. A Cointegrated Regime-Switching Model Approach with Jumps Applied to Natural Gas Futures Prices

    Directory of Open Access Journals (Sweden)

    Daniel Leonhardt

    2017-09-01

    Energy commodities and their futures naturally show cointegrated price movements. However, there is empirical evidence that the prices of futures with different maturities might have, e.g., different jump behaviours in different market situations. Observing commodity futures over time, there is also evidence for different states of the underlying volatility of the futures. In this paper, we therefore allow for cointegration of the term structure within a multi-factor model, which includes seasonality as well as joint and individual jumps in the price processes of futures with different maturities. The seasonality in this model is realized via a deterministic function, and the jumps are represented with thinned-out compound Poisson processes. The model also includes a regime-switching approach that is modelled through a Markov chain and extends the class of geometric models. We show how the model can be calibrated to empirical data and give some practical applications.
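
    A simplified simulation of the ingredients named above (a Markov chain switching the volatility regime, a deterministic seasonal term, and compound Poisson jumps) is sketched below. All parameter values are arbitrary, and this is not the paper's full cointegrated multi-factor model or its calibration.

```python
# Simplified simulation of the ingredients named above: a two-state Markov chain
# switching the volatility regime, a deterministic seasonal term, and Poisson jumps.
# Illustrative only; not the cointegrated multi-factor model of the paper.
import numpy as np

rng = np.random.default_rng(42)
n_days = 500
P = np.array([[0.98, 0.02],       # regime transition matrix (calm <-> volatile)
              [0.05, 0.95]])
sigma = np.array([0.01, 0.04])    # regime-dependent daily volatility
jump_intensity, jump_scale = 0.02, 0.10

log_price = np.empty(n_days)
log_price[0], state = np.log(3.0), 0
for t in range(1, n_days):
    state = rng.choice(2, p=P[state])                      # Markov regime switch
    season = 0.0005 * np.sin(2 * np.pi * t / 365.0)        # deterministic seasonality
    jump = rng.normal(0, jump_scale) if rng.random() < jump_intensity else 0.0
    diffusion = rng.normal(0, sigma[state])
    log_price[t] = log_price[t - 1] + season + diffusion + jump

price = np.exp(log_price)
print("simulated price range:", price.min(), price.max())
```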

  19. The DPSIR approach applied to marine eutrophication in LCIA as a learning tool

    DEFF Research Database (Denmark)

    Cosme, Nuno Miguel Dias; Olsen, Stig Irving

    ... eutrophication. The goal is to promote an educational example of environmental impacts assessment through science-based tools to predict the impacts, communicate knowledge and support decisions. The example builds on the (D) high demand for fixation of reactive nitrogen that supports several socio... Impacts... assessment and response design ultimately benefit from spatial differentiation in the results. ...: environmentally sustainable, technologically feasible, economically viable, socially desirable, legally permissible, and administratively achievable. Specific LCIA indicators may provide preliminary information to support a precautionary approach to act earlier on D-P and contribute to sustainability. DPSIR based on LCIA seems a useful tool to improve communication and learning, as it bridges science and management while promoting the basic elements of sustainable development in a practical educational...

  20. Fundamental parameters approach applied to focal construct geometry for X-ray diffraction

    International Nuclear Information System (INIS)

    Rogers, K.; Evans, P.; Prokopiou, D.; Dicken, A.; Godber, S.; Rogers, J.

    2012-01-01

    A novel geometry for the acquisition of powder X-ray diffraction data, referred to as focal construct geometry (FCG), is presented. Diffraction data obtained by FCG have been shown to possess significantly enhanced intensity due to the hollow tube beam arrangement utilized. In contrast to conventional diffraction, the detector is translated to collect images along a primary axis and record the location of Bragg maxima. These high-intensity condensation foci are unique to FCG and appear due to the convergence of Debye cones at single points on the primary axis. This work focuses on a two-dimensional, fundamental parameters approach to simulate experimental data and subsequently aid with interpretation. This convolution method is shown to favorably reproduce the experimental diffractograms and can also accommodate preferred orientation effects in some circumstances.

  1. A Blockchain Approach Applied to a Teledermatology Platform in the Sardinian Region (Italy)

    Directory of Open Access Journals (Sweden)

    Katiuscia Mannaro

    2018-02-01

    The use of teledermatology in primary care has been shown to be reliable, offering the possibility of improving access to dermatological care by using telecommunication technologies to connect several medical centers and enable the exchange of information about skin conditions over long distances. This paper describes the main points of a teledermatology project that we have implemented to promote and facilitate the diagnosis of skin diseases and improve the quality of care for rural and remote areas. Moreover, we present a blockchain-based approach which aims to add new functionalities to an innovative teledermatology platform that we developed and tested in the Sardinian Region (Italy). These functionalities include giving the patient complete access to his/her medical records while maintaining security. Finally, the advantages that this new decentralized system can provide for patients and specialists are presented.

  2. The Video Interaction Guidance approach applied to teaching communication skills in dentistry.

    Science.gov (United States)

    Quinn, S; Herron, D; Menzies, R; Scott, L; Black, R; Zhou, Y; Waller, A; Humphris, G; Freeman, R

    2016-05-01

    To examine dentists' views of a novel video review technique to improve communication skills in complex clinical situations. Dentists (n = 3) participated in a video review known as Video Interaction Guidance to encourage more attuned interactions with their patients (n = 4). Part of this process is to identify where dentists and patients reacted positively and effectively. Each dentist was presented with short segments of video footage taken during an appointment with a patient with intellectual disabilities and communication difficulties. Having observed their interactions with patients, dentists were asked to reflect on their communication strategies with the assistance of a trained VIG specialist. Dentists reflected that their VIG session had been insightful and considered the review process as beneficial to communication skills training in dentistry. They believed that this technique could significantly improve the way dentists interact and communicate with patients. The VIG sessions increased their awareness of the communication strategies they use with their patients and were perceived as neither uncomfortable nor threatening. The VIG session was beneficial in this exploratory investigation because the dentists could identify when their interactions were most effective. Awareness of their non-verbal communication strategies and the need to adopt these behaviours frequently were identified as key benefits of this training approach. One dentist suggested that the video review method was supportive because it was undertaken by a behavioural scientist rather than a professional counterpart. Some evidence supports the VIG approach in this specialist area of communication skills and dental training.

  3. A chaotic quantum-behaved particle swarm approach applied to optimization of heat exchangers

    International Nuclear Information System (INIS)

    Mariani, Viviana Cocco; Klassen Duck, Anderson Rodrigo; Guerra, Fabio Alessandro; Santos Coelho, Leandro dos; Rao, Ravipudi Venkata

    2012-01-01

    The particle swarm optimization (PSO) method is a population-based optimization technique from the field of swarm intelligence, in which each solution, called a "particle", flies around in a multidimensional problem search space. During the flight, every particle adjusts its position according to its own experience as well as the experience of neighboring particles, using the best position encountered by itself and by its neighbors. In this paper, a new quantum particle swarm optimization (QPSO) approach combined with Zaslavskii chaotic map sequences (QPSOZ) is applied to shell-and-tube heat exchanger optimization, based on minimization from an economic viewpoint. The results obtained for two case studies using the proposed QPSOZ approach are compared with those obtained by genetic algorithm, PSO and classical QPSO, showing the best performance for QPSOZ. The two case studies also verify the capability of the proposed method, showing that significant cost reductions are feasible with respect to traditionally designed exchangers. Referring to the literature test cases, reductions of capital investment of up to 20% and 6% for the first and second cases, respectively, were obtained. The annual pumping cost decreased markedly, by 72% and 75%, with an overall decrease of total cost of up to 30% and 27% for cases 1 and 2, respectively, showing the improvement potential of the proposed method, QPSOZ. - Highlights: ► Shell and tube heat exchanger is minimized from economic view point. ► A new quantum particle swarm optimization (QPSO) combined with Zaslavskii chaotic map sequences (QPSOZ) is proposed. ► Reduction of capital investment up to 20% and 6% for the first and second cases was obtained. ► Annual pumping cost decreased 72% and 75%, with an overall decrease of total cost up to 30% and 27% using QPSOZ.
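
    A compact sketch of a quantum-behaved PSO update driven by a chaotic sequence is given below. For brevity, a logistic map stands in for the Zaslavskii map used in the paper, and the sphere function replaces the heat exchanger cost model; this is an illustration of the update mechanics, not the authors' QPSOZ implementation.

```python
# Compact sketch of a quantum-behaved PSO update driven by a chaotic sequence.
# A logistic map replaces the Zaslavskii map of the paper, and the sphere function
# replaces the heat exchanger cost model: illustrative only.
import numpy as np

def chaotic(u):                        # logistic map stand-in for the Zaslavskii map
    return 4.0 * u * (1.0 - u)

def qpso(obj, dim=4, n_particles=20, iters=200, beta=0.75, seed=1):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))
    pbest, pbest_f = x.copy(), np.array([obj(p) for p in x])
    u = rng.uniform(0.1, 0.9, (n_particles, dim))            # chaotic state
    for _ in range(iters):
        g = pbest[np.argmin(pbest_f)]                         # global best
        mbest = pbest.mean(axis=0)                            # mean best position
        u = chaotic(u)
        phi = rng.uniform(size=(n_particles, dim))
        attractor = phi * pbest + (1 - phi) * g               # local attractors
        sign = np.where(rng.random((n_particles, dim)) < 0.5, 1.0, -1.0)
        x = attractor + sign * beta * np.abs(mbest - x) * np.log(1.0 / np.clip(u, 1e-12, None))
        f = np.array([obj(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
    return pbest[np.argmin(pbest_f)], pbest_f.min()

best_x, best_f = qpso(lambda v: np.sum(v**2))                 # sphere test function
print("best objective found:", best_f)
```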

  4. The possibilities of applying a risk-oriented approach to the NPP reliability and safety enhancement problem

    Science.gov (United States)

    Komarov, Yu. A.

    2014-10-01

    An analysis and some generalizations of approaches to risk assessments are presented. Interconnection between different interpretations of the "risk" notion is shown, and the possibility of applying the fuzzy set theory to risk assessments is demonstrated. A generalized formulation of the risk assessment notion is proposed in applying risk-oriented approaches to the problem of enhancing reliability and safety in nuclear power engineering. The solution of problems using the developed risk-oriented approaches aimed at achieving more reliable and safe operation of NPPs is described. The results of studies aimed at determining the need (advisability) to modernize/replace NPP elements and systems are presented together with the results obtained from elaborating the methodical principles of introducing the repair concept based on the equipment technical state. The possibility of reducing the scope of tests and altering the NPP systems maintenance strategy is substantiated using the risk-oriented approach. A probabilistic model for estimating the validity of boric acid concentration measurements is developed.

  5. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  6. Nonnegative Tensor Factorization Approach Applied to Fission Chamber’s Output Signals Blind Source Separation

    Science.gov (United States)

    Laassiri, M.; Hamzaoui, E.-M.; Cherkaoui El Moursli, R.

    2018-02-01

    Inside nuclear reactors, gamma-rays emitted from nuclei together with the neutrons introduce unwanted backgrounds in neutron spectra. For this reason, powerful extraction methods are needed to extract the useful neutron signal from the recorded mixture and thus obtain a clearer neutron flux spectrum. Several techniques have been developed to discriminate between neutrons and gamma-rays in a mixed radiation field. Most of these techniques tackle the problem using analogue discrimination methods; others propose the use of organic scintillators to achieve the discrimination task. Recently, systems based on digital signal processors have become commercially available to replace the analog systems. As an alternative to these systems, we aim in this work to verify the feasibility of using Nonnegative Tensor Factorization (NTF) to blindly extract the neutron component from mixture signals recorded at the output of a fission chamber (WL-7657), which has been simulated through Geant4 linked to Garfield++ using a 252Cf neutron source. To achieve our objective of obtaining the best possible neutron-gamma discrimination, we have applied two different NTF algorithms, which have been found to be the best methods for analysing this kind of nuclear data.
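
    The core idea, recovering nonnegative source components from nonnegative mixtures, can be illustrated with a simplified two-way (matrix) analogue of the tensor factorization, using scikit-learn's NMF on synthetic pulse shapes. The data here are synthetic exponentials, not Geant4/Garfield++ outputs, and this is not the NTF algorithms actually compared in the paper.

```python
# Simplified matrix analogue of the nonnegative factorization idea: decompose
# synthetic nonnegative mixtures of a "neutron-like" and a "gamma-like" pulse shape.
# Synthetic data only, not Geant4/Garfield++ output; sklearn NMF, not the paper's NTF.
import numpy as np
from sklearn.decomposition import NMF

t = np.linspace(0, 1, 200)
neutron = np.exp(-t / 0.30)            # slow-decay component (neutron-like)
gamma = np.exp(-t / 0.05)              # fast-decay component (gamma-like)

rng = np.random.default_rng(0)
mixing = rng.uniform(0.1, 1.0, (100, 2))                 # 100 recorded pulses
V = mixing @ np.vstack([neutron, gamma])                 # observed nonnegative mixtures
V += rng.uniform(0, 0.01, V.shape)                       # small nonnegative noise

model = NMF(n_components=2, init="nndsvda", max_iter=1000, random_state=0)
W = model.fit_transform(V)             # per-pulse weights of each recovered component
H = model.components_                  # recovered component shapes

# Identify the slower-decaying recovered component as the neutron-like one.
slow = np.argmax(H[:, -20:].mean(axis=1) / H[:, :20].mean(axis=1))
print("recovered neutron-like component index:", slow)
```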

  7. Benefits of Applying Hierarchical Models to the Empirical Green's Function Approach

    Science.gov (United States)

    Denolle, M.; Van Houtte, C.

    2017-12-01

    Stress drops calculated from source spectral studies currently show larger variability than what is implied by empirical ground motion models. One of the potential origins of the inflated variability is the simplified model-fitting techniques used in most source spectral studies. This study improves upon these existing methods, and shows that the fitting method may explain some of the discrepancy. In particular, Bayesian hierarchical modelling is shown to be a method that can reduce bias, better quantify uncertainties and allow additional effects to be resolved. The method is applied to the Mw7.1 Kumamoto, Japan earthquake, and other global, moderate-magnitude, strike-slip earthquakes between Mw5 and Mw7.5. It is shown that the variation of the corner frequency, fc, and the falloff rate, n, across the focal sphere can be reliably retrieved without overfitting the data. Additionally, it is shown that methods commonly used to calculate corner frequencies can give substantial biases. In particular, if fc were calculated for the Kumamoto earthquake using a model with a falloff rate fixed at 2 instead of the best fit 1.6, the obtained fc would be as large as twice its realistic value. The reliable retrieval of the falloff rate allows deeper examination of this parameter for a suite of global, strike-slip earthquakes, and its scaling with magnitude. The earthquake sequences considered in this study are from Japan, New Zealand, Haiti and California.
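
    The spectral model whose corner frequency (fc) and falloff rate (n) are discussed above can be written as Omega(f) = Omega0 / (1 + (f/fc)^n). The sketch below fits it by plain least squares on synthetic data to show how fixing n biases fc; the study itself embeds this model in a Bayesian hierarchical framework, which is not reproduced here.

```python
# Plain least-squares sketch of the source spectral model Omega0 / (1 + (f/fc)**n)
# on synthetic data; the study fits this model within a Bayesian hierarchy instead.
import numpy as np
from scipy.optimize import curve_fit

def source_spectrum(f, omega0, fc, n):
    return omega0 / (1.0 + (f / fc) ** n)

rng = np.random.default_rng(3)
f = np.logspace(-1, 1, 60)                                # 0.1 - 10 Hz
true = dict(omega0=1.0, fc=0.8, n=1.6)
obs = source_spectrum(f, **true) * np.exp(rng.normal(0, 0.1, f.size))   # lognormal noise

popt, _ = curve_fit(source_spectrum, f, obs, p0=[1.0, 1.0, 2.0])
print("estimated omega0, fc, n:", popt)

# Fixing n = 2 (a common assumption) instead of fitting it biases fc, as noted above.
popt_fixed, _ = curve_fit(lambda f, o, fc: source_spectrum(f, o, fc, 2.0), f, obs, p0=[1.0, 1.0])
print("fc with n fixed at 2:", popt_fixed[1])
```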

  8. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  9. Old and new approaches to the interpretation of acid-base metabolism, starting from historical data applied to diabetic acidosis.

    Science.gov (United States)

    Mioni, Roberto; Marega, Alessandra; Lo Cicero, Marco; Montanaro, Domenico

    2016-11-01

    The approach to acid-base chemistry in medicine includes several methods. Currently, the two most popular procedures are derived from Stewart's studies and from the bicarbonate/BE-based classical formulation. Another method, unfortunately little known, follows the Kildeberg theory applied to acid-base titration. By using the data produced by Dana Atchley in 1933, regarding electrolytes and blood gas analysis applied to diabetes, we compared the three aforementioned methods, in order to highlight their strengths and their weaknesses. The results obtained, by reprocessing the data of Atchley, have shown that Kildeberg's approach, unlike the other two methods, is consistent, rational and complete for describing the organ-physiological behavior of the hydrogen ion turnover in human organism. In contrast, the data obtained using the Stewart approach and the bicarbonate-based classical formulation are misleading and fail to specify which organs or systems are involved in causing or maintaining the diabetic acidosis. Stewart's approach, despite being considered 'quantitative', does not propose in any way the concept of 'an amount of acid' and becomes even more confusing, because it is not clear how to distinguish between 'strong' and 'weak' ions. As for Stewart's approach, the classical method makes no distinction between hydrogen ions managed by the intermediate metabolism and hydroxyl ions handled by the kidney, but, at least, it is based on the concept of titration (base-excess) and indirectly defines the concept of 'an amount of acid'. In conclusion, only Kildeberg's approach offers a complete understanding of the causes and remedies against any type of acid-base disturbance.

  10. Equivalent electrical network model approach applied to a double acting low temperature differential Stirling engine

    International Nuclear Information System (INIS)

    Formosa, Fabien; Badel, Adrien; Lottin, Jacques

    2014-01-01

    Highlights: • An equivalent electrical network model of a Stirling engine is proposed. • The model is applied to a membrane-based low temperature differential double acting Stirling engine. • The operating conditions (self-startup and steady state behavior) are defined. • An experimental engine is presented and tested. • The model is validated against experimental results. - Abstract: This work presents a network model to simulate the periodic behavior of a double acting free piston type Stirling engine. Each component of the engine is considered independently and its equivalent electrical circuit derived. When assembled into a global electrical network, a global model of the engine is established. Its steady-state behavior can be obtained from the analysis of the transfer function for one phase, from the piston to the expansion chamber. It is then possible to simulate the dynamic (steady state stroke and operating frequency) as well as the thermodynamic performance (output power and efficiency) for a given mean pressure and given heat source and heat sink temperatures. The motion amplitude, in particular, is determined by the spring-mass properties of the moving parts and by the main nonlinear effects, which are taken into account in the model. The thermodynamic features of the model have been validated against the classical isothermal Schmidt analysis for a given stroke. A three-phase low temperature differential double acting free membrane architecture has been built and tested. The experimental results are compared with the model and a satisfactory agreement is obtained. The stroke and operating frequency are predicted with less than 2% error, whereas the output power discrepancy is about 30%. Finally, some optimization routes are suggested to improve the design and maximize the performance, aiming at waste heat recovery applications

  11. Capturing ecology in modeling approaches applied to environmental risk assessment of endocrine active chemicals in fish.

    Science.gov (United States)

    Mintram, Kate S; Brown, A Ross; Maynard, Samuel K; Thorbek, Pernille; Tyler, Charles R

    2018-02-01

    Endocrine active chemicals (EACs) are widespread in freshwater environments and both laboratory and field based studies have shown reproductive effects in fish at environmentally relevant exposures. Environmental risk assessment (ERA) seeks to protect wildlife populations and prospective assessments rely on extrapolation from individual-level effects established for laboratory fish species to populations of wild fish using arbitrary safety factors. Population susceptibility to chemical effects, however, depends on exposure risk, physiological susceptibility, and population resilience, each of which can differ widely between fish species. Population models have significant potential to address these shortfalls and to include individual variability relating to life-history traits, demographic and density-dependent vital rates, and behaviors which arise from inter-organism and organism-environment interactions. Confidence in population models has recently resulted in the EU Commission stating that results derived from reliable models may be considered when assessing the relevance of adverse effects of EACs at the population level. This review critically assesses the potential risks posed by EACs for fish populations, considers the ecological factors influencing these risks and explores the benefits and challenges of applying population modeling (including individual-based modeling) in ERA for EACs in fish. We conclude that population modeling offers a way forward for incorporating greater environmental relevance in assessing the risks of EACs for fishes and for identifying key risk factors through sensitivity analysis. Individual-based models (IBMs) allow for the incorporation of physiological and behavioral endpoints relevant to EAC exposure effects, thus capturing both direct and indirect population-level effects.
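
    A toy individual-based model of the kind advocated above is sketched below: each individual has a survival and fecundity draw, exposure to an endocrine active chemical scales fecundity down, and density dependence caps recruitment. All parameter values and the exposure effect are arbitrary illustrations, not calibrated to any fish species or chemical.

```python
# Toy individual-based model: individual survival and fecundity, an exposure-driven
# fecundity reduction, and density-dependent recruitment. Arbitrary parameters only.
import numpy as np

rng = np.random.default_rng(7)

def simulate(years=20, n0=200, survival=0.6, mean_eggs=20,
             recruit_prob=0.05, capacity=1000, exposure_effect=0.0):
    n = n0
    history = [n]
    for _ in range(years):
        adults = rng.random(n) < survival                        # individual survival
        fecundity = rng.poisson(mean_eggs * (1 - exposure_effect), adults.sum())
        # density-dependent recruitment keeps the population near carrying capacity
        recruits = rng.binomial(fecundity.sum(), recruit_prob * max(0, 1 - n / capacity))
        n = int(adults.sum() + recruits)
        history.append(n)
    return history

print("unexposed final size:", simulate(exposure_effect=0.0)[-1])
print("30% fecundity reduction final size:", simulate(exposure_effect=0.3)[-1])
```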

  12. Bioremediation Approaches in a Laboratory Activity for the Industrial Biotechnology and Applied Microbiology (IBAM) Course

    Directory of Open Access Journals (Sweden)

    L. Raiger Iustman

    2013-03-01

    Industrial Biotechnology and Applied Microbiology is an optional 128-hour course for Chemistry and Biology students at the Faculty of Sciences, University of Buenos Aires, Argentina. The course is usually attended by 25 students, working in teams of two. The curriculum, with 8 lab exercises, includes an oil bioremediation practical giving an insight into bioremediation processes: the influence of pollutants on the autochthonous microbiota, isolation of biodegraders, and biosurfactant production for understanding bioavailability. The experimental steps are: (A) evaluation of microbial tolerance to pollutants by constructing pristine-soil microcosms contaminated with diesel or xylene, and (B) isolation of degraders and analysis of biosurfactant production. To check microbial tolerance, microcosms are incubated for one week at 25-28 °C. Samples are collected at 0 h, 4 h and then every 48 h for CFU/g soil testing. An initial decrease of total CFU/g related to toxicity is noticed. At the end of the experiment, a recovery of the CFU number is observed, evidencing enrichment in biodegraders. Some colonies from the CFU counting plates are streaked on M9 agar with diesel as the sole carbon source. After a week, isolates are inoculated into M9 broth supplemented with diesel to induce biosurfactant production. Surface tension and emulsification index are measured in culture supernatants to visualize the tensioactive effect of bacterial products. Besides the improvement in good microbiological practices, the students show enthusiasm in different aspects, depending on their own interests. While biology students explore and learn new concepts on solubility, emulsions and bioavailability, chemistry students show curiosity about bacterial behavior and the manipulation of microorganisms for environmental benefits.

  13. Does the interpersonal model apply across eating disorder diagnostic groups? A structural equation modeling approach.

    Science.gov (United States)

    Ivanova, Iryna V; Tasca, Giorgio A; Proulx, Geneviève; Bissada, Hany

    2015-11-01

    The interpersonal model has been validated with binge-eating disorder (BED), but it is not yet known whether the model applies across a range of eating disorders (ED). The goal of this study was to investigate the validity of the interpersonal model in anorexia nervosa (restricting type, ANR, and binge-eating/purge type, ANBP), bulimia nervosa (BN), BED, and eating disorder not otherwise specified (EDNOS). Data from a cross-sectional sample of 1459 treatment-seeking women diagnosed with ANR, ANBP, BN, BED and EDNOS were examined for indirect effects of interpersonal problems on ED psychopathology mediated through negative affect. Findings from structural equation modeling demonstrated the mediating role of negative affect in four of the five diagnostic groups. There were significant, medium to large (.239 to .558) indirect effects in the ANR, BN, BED and EDNOS groups, but not in the ANBP group. The results of the first reverse model, with interpersonal problems as a mediator between negative affect and ED psychopathology, were nonsignificant, suggesting the specificity of the hypothesized paths. However, in the second reverse model ED psychopathology was related to interpersonal problems indirectly through negative affect. This is the first study to find support for the interpersonal model of ED in a clinical sample of women with diverse ED diagnoses, though there may be a reciprocal relationship between ED psychopathology and relationship problems through negative affect. Negative affect partially explains the relationship between interpersonal problems and ED psychopathology in women diagnosed with ANR, BN, BED and EDNOS. Interpersonal psychotherapies for ED may address the underlying interpersonal-affective difficulties, thereby reducing ED psychopathology.
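
    The indirect effect tested in such models is the product of the path from interpersonal problems to negative affect (a) and the path from negative affect to ED psychopathology controlling for interpersonal problems (b). A regression-based sketch with a bootstrap confidence interval on synthetic data is shown below; the study itself used full structural equation modeling, which this does not reproduce.

```python
# Regression-based sketch of the indirect (mediated) effect a*b discussed above,
# with a bootstrap confidence interval. Synthetic data; the study used full SEM.
import numpy as np

rng = np.random.default_rng(11)
n = 500
interpersonal = rng.normal(size=n)
negative_affect = 0.5 * interpersonal + rng.normal(scale=0.8, size=n)        # path a
ed_pathology = 0.6 * negative_affect + 0.1 * interpersonal + rng.normal(scale=0.8, size=n)

def slope(x, y):
    return np.polyfit(x, y, 1)[0]

def indirect(idx):
    a = slope(interpersonal[idx], negative_affect[idx])
    # partial b: regress outcome on mediator and predictor, take the mediator coefficient
    X = np.column_stack([negative_affect[idx], interpersonal[idx], np.ones(len(idx))])
    b = np.linalg.lstsq(X, ed_pathology[idx], rcond=None)[0][0]
    return a * b

boot = [indirect(rng.integers(0, n, n)) for _ in range(2000)]
print("indirect effect a*b:", indirect(np.arange(n)))
print("95% bootstrap CI:", np.percentile(boot, [2.5, 97.5]))
```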

  14. Applying of an Ontology based Modeling Approach to Cultural Heritage Systems

    Directory of Open Access Journals (Sweden)

    POPOVICI, D.-M.

    2011-08-01

    Full Text Available Any virtual environment (VE) built in a classical way is dedicated to a very specific domain. Its modification, or even its adaptation to another domain, requires expensive human intervention measured in time and money; in effect, the product, that is the VE, returns to the first phases of the development process. In a previous work we proposed an approach that combines domain ontologies and conceptual modeling to construct more accurate VEs. Our method is based on the description of the domain knowledge in a standard format and the assisted creation of the VE using these pieces of knowledge. This permits the explanation, within the virtual reality (VR) simulation, of the semantics of the whole context and of each object. This knowledge may then be transferred to the public users. In this paper we demonstrate the effectiveness of our method on the construction of a VE that simulates the organization of a Greek-Roman colony situated on the Black Sea coast and the economic and social activities of its people.

  15. A NURBS-based finite element model applied to geometrically nonlinear elastodynamics using a corotational approach

    KAUST Repository

    Espath, L. F R; Braun, Alexandre Luis; Awruch, Armando Miguel; Dalcin, Lisandro

    2015-01-01

    A numerical model to deal with nonlinear elastodynamics involving large rotations within a finite element framework based on NURBS (Non-Uniform Rational B-Spline) basis functions is presented. A comprehensive kinematical description using a corotational approach and an orthogonal tensor given by the exact polar decomposition is adopted. The state equation is written in terms of corotational variables according to the hypoelastic theory, relating the Jaumann derivative of the Cauchy stress to the Eulerian strain rate. The generalized-α (Gα) method and the Generalized Energy-Momentum Method with an additional parameter (GEMM+ξ) are employed in order to obtain a stable and controllable dissipative time-stepping scheme with algorithmic conservative properties for nonlinear dynamic analyses. The main contribution is to show that the energy-momentum conservation properties and numerical stability may be improved once a NURBS-based FEM is used for the spatial discretization. It is also shown that high continuity can postpone numerical instability when GEMM+ξ with a consistent mass matrix is employed; likewise, increasing the continuity class yields a decrease in numerical dissipation. A parametric study is carried out in order to show the stability and energy budget in terms of several properties such as continuity class, spectral radius, and lumped as well as consistent mass matrices.
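
    The time-stepping idea can be illustrated outside the NURBS setting. The following is a minimal sketch of the Chung–Hulbert generalized-α scheme applied to a linear single-degree-of-freedom oscillator, not the paper's corotational NURBS formulation; the spectral radius rho_inf controls the numerical dissipation, mirroring the parametric study described above, and all parameter values are illustrative.

    import numpy as np

    def generalized_alpha_sdof(m, c, k, force, d0, v0, dt, n_steps, rho_inf=0.9):
        """Chung-Hulbert generalized-alpha scheme for m*d'' + c*d' + k*d = force(t).

        rho_inf is the spectral radius at infinite frequency: 1.0 gives no
        numerical dissipation, smaller values damp the high-frequency modes.
        """
        # Algorithmic parameters (Chung & Hulbert, 1993)
        am = (2.0 * rho_inf - 1.0) / (rho_inf + 1.0)
        af = rho_inf / (rho_inf + 1.0)
        gamma = 0.5 - am + af
        beta = 0.25 * (1.0 - am + af) ** 2

        d, v = d0, v0
        a = (force(0.0) - c * v0 - k * d0) / m          # consistent initial acceleration
        history = [(0.0, d, v, a)]

        for n in range(n_steps):
            t_new = (n + 1) * dt
            t_mid = (1.0 - af) * t_new + af * (n * dt)   # t_{n+1-alpha_f}
            f_mid = force(t_mid)

            # Newmark predictors (terms not involving the unknown acceleration)
            d_pred = d + dt * v + dt**2 * (0.5 - beta) * a
            v_pred = v + dt * (1.0 - gamma) * a

            # Solve the scalar balance m*a_{n+1-am} + c*v_{n+1-af} + k*d_{n+1-af} = f_mid
            lhs = m * (1.0 - am) + c * (1.0 - af) * gamma * dt + k * (1.0 - af) * beta * dt**2
            rhs = (f_mid - m * am * a
                   - c * ((1.0 - af) * v_pred + af * v)
                   - k * ((1.0 - af) * d_pred + af * d))
            a_new = rhs / lhs

            d = d_pred + beta * dt**2 * a_new
            v = v_pred + gamma * dt * a_new
            a = a_new
            history.append((t_new, d, v, a))
        return np.array(history)

    # Illustrative use: free vibration of a unit oscillator, 200 steps of 0.05 s.
    trace = generalized_alpha_sdof(m=1.0, c=0.0, k=4.0, force=lambda t: 0.0,
                                   d0=1.0, v0=0.0, dt=0.05, n_steps=200)

    For rho_inf = 1 the scheme is non-dissipative; decreasing rho_inf introduces controllable high-frequency damping, which is the property the abstract's parametric study explores.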

  16. Personalized Medicine Applied to Forensic Sciences: New Advances and Perspectives for a Tailored Forensic Approach.

    Science.gov (United States)

    Santurro, Alessandro; Vullo, Anna Maria; Borro, Marina; Gentile, Giovanna; La Russa, Raffaele; Simmaco, Maurizio; Frati, Paola; Fineschi, Vittorio

    2017-01-01

    Personalized medicine (PM), included in P5 medicine (Personalized, Predictive, Preventive, Participative and Precision medicine), is an innovative approach to the patient, emerging from the need to tailor care to the profile of each individual. PM also promises to have a dramatic impact on forensic sciences and the justice system in ways we are only beginning to understand. The application of omics (genomics, transcriptomics, epigenetics/imprintomics, proteomics and metabolomics) is ever more fundamental in the so-called "molecular autopsy". Emerging fields of interest in forensic pathology are represented by the diagnosis and detection of conditions predisposing to fatal thromboembolic and hypertensive events, the determination of genetic variants related to sudden death, such as congenital long QT syndromes, the demonstration of lesion vitality, and the identification of biological matrices and species diagnosis of a forensic trace on crime scenes without destruction of the DNA. The aim of this paper is to describe the state of the art in the application of personalized medicine to forensic sciences, to understand the possibilities of integrating these procedures into routine investigation alongside classical post-mortem studies, and to underline the importance of these new updates in the medical examiner's armamentarium in determining the cause of death or contributing factors to death. Copyright © Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  17. A NURBS-based finite element model applied to geometrically nonlinear elastodynamics using a corotational approach

    KAUST Repository

    Espath, L. F R

    2015-02-03

    A numerical model to deal with nonlinear elastodynamics involving large rotations within a finite element framework based on NURBS (Non-Uniform Rational B-Spline) basis functions is presented. A comprehensive kinematical description using a corotational approach and an orthogonal tensor given by the exact polar decomposition is adopted. The state equation is written in terms of corotational variables according to the hypoelastic theory, relating the Jaumann derivative of the Cauchy stress to the Eulerian strain rate. The generalized-α (Gα) method and the Generalized Energy-Momentum Method with an additional parameter (GEMM+ξ) are employed in order to obtain a stable and controllable dissipative time-stepping scheme with algorithmic conservative properties for nonlinear dynamic analyses. The main contribution is to show that the energy-momentum conservation properties and numerical stability may be improved once a NURBS-based FEM is used for the spatial discretization. It is also shown that high continuity can postpone numerical instability when GEMM+ξ with a consistent mass matrix is employed; likewise, increasing the continuity class yields a decrease in numerical dissipation. A parametric study is carried out in order to show the stability and energy budget in terms of several properties such as continuity class, spectral radius, and lumped as well as consistent mass matrices.

  18. An applied artificial intelligence approach towards assessing building performance simulation tools

    Energy Technology Data Exchange (ETDEWEB)

    Yezioro, Abraham [Faculty of Architecture and Town Planning, Technion IIT (Israel); Dong, Bing [Center for Building Performance and Diagnostics, School of Architecture, Carnegie Mellon University (United States); Leite, Fernanda [Department of Civil and Environmental Engineering, Carnegie Mellon University (United States)

    2008-07-01

    With the development of modern computer technology, a large number of building energy simulation tools are available on the market. When choosing which simulation tool to use in a project, the user must consider the tool's accuracy and reliability, given the building information they have at hand, which will serve as input for the tool. This paper presents an approach for assessing building performance simulation results against actual measurements, using artificial neural networks (ANN) for predicting building energy performance. Training and testing of the ANN were carried out with energy consumption data acquired over 1 week in the case building, called the Solar House. The predicted results show a good fit with the mathematical model, with a mean absolute error of 0.9%. Moreover, four building simulation tools were selected in this study in order to compare their results with the ANN-predicted energy consumption: Energy-10, the Green Building Studio web tool, eQuest and EnergyPlus. The results showed that the more detailed simulation tools have the best simulation performance in terms of heating and cooling electricity consumption, within 3% mean absolute error. (author)
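
    As a rough illustration of the kind of ANN surrogate described above (not the authors' network, tools or data), the sketch below trains a small multilayer perceptron on hypothetical hourly weather and occupancy features to predict energy consumption and reports the mean absolute error; all inputs are synthetic and the library calls are standard scikit-learn.

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import mean_absolute_error

    rng = np.random.default_rng(0)

    # Hypothetical one-week dataset: outdoor temperature, solar gain, occupancy.
    n_hours = 7 * 24
    hour = np.arange(n_hours)
    X = np.column_stack([
        10 + 8 * np.sin(2 * np.pi * hour / 24),                      # temperature [degC]
        np.clip(np.sin(2 * np.pi * (hour - 6) / 24), 0, None),       # solar gain proxy
        ((hour % 24 > 8) & (hour % 24 < 18)).astype(float),          # occupancy flag
    ])
    # Synthetic "measured" consumption with noise, standing in for monitored data.
    y = 5 + 0.4 * X[:, 0] + 3 * X[:, 1] + 2 * X[:, 2] + rng.normal(0, 0.3, n_hours)

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

    ann = make_pipeline(StandardScaler(),
                        MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0))
    ann.fit(X_train, y_train)

    mae = mean_absolute_error(y_test, ann.predict(X_test))
    print(f"mean absolute error on held-out hours: {mae:.2f}")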

  19. Maximization of regional probabilities using Optimal Surface Graphs

    DEFF Research Database (Denmark)

    Arias Lorza, Andres M.; Van Engelen, Arna; Petersen, Jens

    2018-01-01

    Purpose: We present a segmentation method that maximizes regional probabilities enclosed by coupled surfaces using an Optimal Surface Graph (OSG) cut approach. This OSG cut determines the globally optimal solution given a graph constructed around an initial surface. While most methods for vessel wall segmentation use only edge information, we show that maximizing regional probabilities using an OSG improves the segmentation results. We applied this to automatically segment the vessel wall of the carotid artery in magnetic resonance images. Methods: First, voxel-wise regional probability maps were obtained using a Support Vector Machine classifier trained on local image features. Then, the OSG segments the regions that maximize the regional probabilities, considering smoothness and topological constraints. Results: The method was evaluated on 49 carotid arteries from 30 subjects…
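
    The first stage of the pipeline, voxel-wise regional probability maps from a classifier, can be sketched as follows. This is a toy stand-in, not the paper's features, data or the Optimal Surface Graph step (which requires a dedicated graph-cut solver); the "MR slice", the labels and the two image features are all invented for illustration.

    import numpy as np
    from scipy.ndimage import gaussian_filter
    from sklearn.svm import SVC

    rng = np.random.default_rng(1)

    # Toy 2D "MR slice": bright ring (vessel wall) on a noisy background.
    size = 64
    yy, xx = np.mgrid[:size, :size]
    radius = np.hypot(xx - size / 2, yy - size / 2)
    wall_mask = (radius > 12) & (radius < 18)                    # ground-truth wall labels
    image = np.where(wall_mask, 1.0, 0.2) + rng.normal(0, 0.1, (size, size))

    # Simple per-voxel features: raw intensity and locally smoothed intensity.
    features = np.column_stack([image.ravel(), gaussian_filter(image, 2).ravel()])
    labels = wall_mask.ravel().astype(int)

    # Probabilistic SVM classifier (Platt scaling via probability=True).
    svm = SVC(kernel="rbf", probability=True, random_state=0).fit(features, labels)
    prob_map = svm.predict_proba(features)[:, 1].reshape(size, size)

    # prob_map would then feed the Optimal Surface Graph, whose cut maximizes the
    # enclosed regional probabilities under smoothness and topology constraints.
    print("wall probability at a wall voxel:", round(prob_map[size // 2, size // 2 + 15], 2))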

  20. Increasing oral absorption of polar neuraminidase inhibitors: a prodrug transporter approach applied to oseltamivir analogue.

    Science.gov (United States)

    Gupta, Deepak; Varghese Gupta, Sheeba; Dahan, Arik; Tsume, Yasuhiro; Hilfinger, John; Lee, Kyung-Dall; Amidon, Gordon L

    2013-02-04

    Poor oral absorption is one of the limiting factors in utilizing the full potential of polar antiviral agents. The neuraminidase target site requires a polar chemical structure for high-affinity binding, thus limiting the oral efficacy of many high-affinity ligands. The aim of this study was to overcome this poor oral absorption barrier by utilizing a prodrug approach to target the apical brush border peptide transporter 1 (PEPT1). Guanidine oseltamivir carboxylate (GOCarb) is a highly active polar antiviral agent with insufficient oral bioavailability (4%) to be an effective therapeutic agent. In this report we utilize a carrier-mediated targeted prodrug approach to improve the oral absorption of GOCarb. Acyloxy(alkyl) ester-based, amino acid-linked prodrugs were synthesized and evaluated as potential substrates of mucosal transporters, e.g., PEPT1. Prodrugs were also evaluated for their chemical and enzymatic stability. PEPT1 transport studies included [3H]Gly-Sar uptake inhibition in Caco-2 cells and cellular uptake experiments using HeLa cells overexpressing PEPT1. The intestinal membrane permeabilities of the selected prodrugs and the parent drug were then evaluated for epithelial cell transport across Caco-2 monolayers, and in the in situ rat intestinal jejunal perfusion model. Prodrugs exhibited a pH-dependent stability with higher stability at acidic pH. Significant inhibition of uptake (IC50 30-fold increase in affinity compared to GOCarb. The l-valyl prodrug exhibited significant enhancement of uptake in PEPT1/HeLa cells and compared favorably with the well-absorbed valacyclovir. Transepithelial permeability across Caco-2 monolayers showed that these amino acid prodrugs have a 2–5-fold increase in permeability compared to the parent drug and that the l-valyl prodrug (Papp = 1.7 × 10⁻⁶ cm/s) has the potential to be rapidly transported across the epithelial cell apical membrane. Significantly, only the parent drug (GOCarb) appeared in the basolateral

  1. Advanced In-Service Inspection Approaches Applied to the Phenix Fast Breeder Reactor

    International Nuclear Information System (INIS)

    Guidez, J.; Martin, L.; Dupraz, R.

    2006-01-01

    The safety upgrading of the Phenix plant undertaken between 1994 and 1997 involved a vast inspection programme of the reactor, the external storage drum and the secondary sodium circuits in order to meet the requirements of the defence-in-depth safety approach. The three lines of defence were analysed for every safety-related component: demonstration of the quality of design and construction, appropriate in-service inspection, and controlling the consequences of an accident. The in-service reactor block inspection programme consisted of inspecting the core support structures and the high-temperature elements. Although limited consideration had been given to inspection constraints during the design stage of the reactor in the 1960s, as compared to more recent reactor projects such as the European Fast Reactor (EFR), all the core support line elements could be inspected. The three following main operations are described: ultrasonic inspection of the upper hangers of the main vessel, using small transducers able to withstand temperatures of 130 deg. C; inspection of the conical shell supporting the core diagrid, where a specific ultrasonic method and a special implementation technique were used to inspect the under-sodium structure welds, located up to several metres away from the scan surface; and remote inspection of the hot pool structures, particularly the core cover plug, after partial sodium drainage of the reactor vessel. Other inspections are also summarized: inspection of the secondary sodium circuit piping, intermediate heat exchangers, primary sodium pumps, steam generator units and the external storage drum. The pool-type reactor concept, developed in France since the 1960s, presents several favourable safety and operational features. The feedback from the Phenix plant also shows real potential for in-service inspection. The design of future Generation IV sodium fast reactors will benefit from the experience acquired from the Phenix plant. (authors)

  2. School Food Environment Promotion Program: Applying the Socio-ecological Approach

    Directory of Open Access Journals (Sweden)

    Fatemeh Bakhtari Aghdam

    2018-01-01

    Full Text Available Background Although healthy nutrition recommendations have been offered in recent decades, research shows an increasing rate of unhealthy junk food consumption among primary school children. The aim of this study was to investigate the effects of a health promotion intervention on school food buffets and on changes in the nutritional behaviors of the students. Materials and Methods In this quasi-experimental study, eight schools in Tabriz city, Iran, agreed to participate. The schools were randomly selected and divided into an intervention and a control group, and a pretest was given to both groups. A four-week intervention program based on the socio-ecological model was conducted in the eight randomly selected schools of the city. A checklist was designed for the assessment of food items available at the schools' buffets, and a 60-item semi-quantitative food frequency questionnaire (FFQ) was used to assess the rate of food consumption and energy intake. Data were analyzed using the Wilcoxon, Mann–Whitney U and Chi-square tests. Results The findings revealed a reduction in the intervention group, from before to after the intervention, in the range of junk food consumption, except for sweets consumption. The number of junk foods provided in the school buffets was reduced in the intervention group. After the intervention, significant decreases were found in the intervention group in the intake of energy, fat and saturated fatty acids compared to the control group (p = 0.00). Conclusion In order to design effective school food environment promotion programs, school healthcare providers should consider multifaceted approaches.

  3. "Teamwork in hospitals": a quasi-experimental study protocol applying a human factors approach.

    Science.gov (United States)

    Ballangrud, Randi; Husebø, Sissel Eikeland; Aase, Karina; Aaberg, Oddveig Reiersdal; Vifladt, Anne; Berg, Geir Vegard; Hall-Lord, Marie Louise

    2017-01-01

    Effective teamwork and sufficient communication are critical to patient safety in today's specialized and complex healthcare services. Team training is important for improved efficiency in inter-professional teamwork within hospitals; however, the scientific rigor of studies must be strengthened, and more research is required to compare studies across samples, settings and countries. The aims of the study are to translate and validate teamwork questionnaires and investigate healthcare personnel's perception of teamwork in hospitals (Part 1), and further to explore the impact of an inter-professional teamwork intervention in a surgical ward on structure, process and outcome (Part 2). To address the aims, a descriptive and explorative design (Part 1) and a quasi-experimental interventional design (Part 2) will be applied. The study will be carried out in five different hospitals (A-E) in three hospital trusts in Norway. Frontline healthcare personnel in Hospitals A and B, from both acute and non-acute departments, will be invited to respond to three Norwegian-translated teamwork questionnaires (Part 1). An inter-professional teamwork intervention in line with the TeamSTEPPS-recommended Model of Change will be implemented in a surgical ward at Hospital C. All physicians, registered nurses and assistant nurses in the intervention ward and two control wards (Hospitals D and E) will be invited to report their perceptions of teamwork, team decision making, safety culture and attitude towards teamwork before the intervention and after six and 12 months. Adult patients admitted to the intervention surgical unit will be invited to report their perception of quality of care during their hospital stay before the intervention and after six and 12 months. Moreover, anonymous patient registry data from local registers and data from patients' medical records will be collected (Part 2). This study will help to understand the impact of an inter-professional teamwork intervention on structure, process and outcome.

  4. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit

  5. Establishment probability in newly founded populations

    Directory of Open Access Journals (Sweden)

    Gusset Markus

    2012-06-01

    Full Text Available Abstract Background Establishment success in newly founded populations relies on reaching the established phase, which is defined by characteristic fluctuations of the population’s state variables. Stochastic population models can be used to quantify the establishment probability of newly founded populations; however, so far no simple but robust method for doing so existed. To determine a critical initial number of individuals that need to be released to reach the established phase, we used a novel application of the “Wissel plot”, where –ln(1 – P0(t)) is plotted against time t. This plot is based on the equation P0(t) = 1 – c1·e^(–ω1·t), which relates the probability of extinction by time t, P0(t), to two constants: c1 describes the probability of a newly founded population to reach the established phase, whereas ω1 describes the population’s probability of extinction per short time interval once established. Results For illustration, we applied the method to a previously developed stochastic population model of the endangered African wild dog (Lycaon pictus). A newly founded population reaches the established phase if the intercept of the (extrapolated) linear part of the “Wissel plot” with the y-axis, which is –ln(c1), is negative. For wild dogs in our model, this is the case if a critical initial number of four packs, consisting of eight individuals each, are released. Conclusions The method we present to quantify the establishment probability of newly founded populations is generic and inferences thus are transferable to other systems across the field of conservation biology. In contrast to other methods, our approach disaggregates the components of a population’s viability by distinguishing establishment from persistence.
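
    A minimal numerical sketch of the "Wissel plot" idea, using made-up extinction probabilities rather than the wild dog model: –ln(1 – P0(t)) is fitted linearly against t, the slope estimates ω1 and the y-intercept estimates –ln(c1), which is the quantity whose sign the abstract's establishment criterion examines.

    import numpy as np

    # Hypothetical extinction probabilities P0(t) from a stochastic population
    # model (e.g. summarized over many simulation runs), at yearly intervals.
    t = np.arange(1, 21, dtype=float)                  # years
    p0 = 1.0 - 0.85 * np.exp(-0.05 * t)                # synthetic data: c1=0.85, omega1=0.05

    # "Wissel plot": -ln(1 - P0(t)) against t becomes linear once the
    # population dynamics have settled into the established phase.
    y = -np.log(1.0 - p0)

    # Fit the linear part; in practice only the late-time, visually linear
    # portion of the curve would be used.
    slope, intercept = np.polyfit(t, y, 1)

    omega1 = slope              # extinction probability per short time interval, once established
    c1 = np.exp(-intercept)     # probability of reaching the established phase

    print(f"omega1 = {omega1:.3f} per year, c1 = {c1:.3f}, intercept = {intercept:.3f}")
    # Per the abstract's criterion, the sign of the intercept (-ln c1) for a given
    # initial release size indicates whether the established phase is reached.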

  6. Quantifying aquifer properties and freshwater resource in coastal barriers: a hydrogeophysical approach applied at Sasihithlu (Karnataka state, India)

    Science.gov (United States)

    Vouillamoz, J.-M.; Hoareau, J.; Grammare, M.; Caron, D.; Nandagiri, L.; Legchenko, A.

    2012-11-01

    Many human communities living in coastal areas in Africa and Asia rely on thin freshwater lenses for their domestic supply. Population growth together with changes in rainfall patterns and sea level will probably impact these vulnerable groundwater resources. Spatial knowledge of the aquifer properties and creation of a groundwater model are required for achieving a sustainable management of the resource. This paper presents a ready-to-use methodology for estimating the key aquifer properties and the freshwater resource based on the joint use of two non-invasive geophysical tools together with common hydrological measurements. We applied the proposed methodology in an unconfined aquifer of a coastal sandy barrier in South-Western India. We jointly used magnetic resonance and transient electromagnetic soundings and we monitored rainfall, groundwater level and groundwater electrical conductivity. The combined interpretation of geophysical and hydrological results allowed estimating the aquifer properties and mapping the freshwater lens. Depending on the location and season, we estimate the freshwater reserve to range between 400 and 700 L m−2 of surface area (± 50%). We also estimate the recharge using time-lapse geophysical measurements with hydrological monitoring. After a rainy event, close to 100% of the rain reaches the water table, but the net recharge at the end of the monsoon is less than 10% of the rain. Thus, we conclude that a change in rainfall patterns will probably not impact the groundwater resource since most of the rain water recharging the aquifer is flowing towards the sea and the river. However, a change in sea level will impact both the groundwater reserve and the net recharge.

  7. Quantifying aquifer properties and freshwater resource in coastal barriers: a hydrogeophysical approach applied at Sasihithlu (Karnataka state, India

    Directory of Open Access Journals (Sweden)

    J.-M. Vouillamoz

    2012-11-01

    Full Text Available Many human communities living in coastal areas in Africa and Asia rely on thin freshwater lenses for their domestic supply. Population growth together with change in rainfall patterns and sea level will probably impact these vulnerable groundwater resources. Spatial knowledge of the aquifer properties and creation of a groundwater model are required for achieving a sustainable management of the resource. This paper presents a ready-to-use methodology for estimating the key aquifer properties and the freshwater resource based on the joint use of two non-invasive geophysical tools together with common hydrological measurements.

    We applied the proposed methodology in an unconfined aquifer of a coastal sandy barrier in South-Western India. We jointly used magnetic resonance and transient electromagnetic soundings and we monitored rainfall, groundwater level and groundwater electrical conductivity. The combined interpretation of geophysical and hydrological results allowed estimating the aquifer properties and mapping the freshwater lens. Depending on the location and season, we estimate the freshwater reserve to range between 400 and 700 L m−2 of surface area (± 50%). We also estimate the recharge using time-lapse geophysical measurements with hydrological monitoring. After a rainy event, close to 100% of the rain reaches the water table, but the net recharge at the end of the monsoon is less than 10% of the rain. Thus, we conclude that a change in rainfall patterns will probably not impact the groundwater resource since most of the rain water recharging the aquifer is flowing towards the sea and the river. However, a change in sea level will impact both the groundwater reserve and the net recharge.

  8. Estimating the short run effects of South Africa's Employment Tax Incentive on youth employment probabilities using a difference-in-differences approach

    OpenAIRE

    Vimal Ranchhod; Arden Finn

    2014-01-01

    What effect did the introduction of the Employment Tax Incentive (ETI) have on youth employment probabilities in South Africa in the short run? The ETI came into effect on the 1st of January 2014. Its purpose is to stimulate youth employment levels and ease the challenges that many youth experience in finding their first jobs. Under the ETI, firms that employ youth are eligible to claim a deduction from their taxes due, for the portion of their wage bill that is paid to certain groups of yout...
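
    A stylized sketch of the difference-in-differences estimator referred to above, using entirely hypothetical data rather than the authors' labour-force surveys: age-eligible youth form the treated group, observations after 1 January 2014 form the post period, and the coefficient on the treated × post interaction in a linear probability model estimates the short-run effect on employment probability. The variable names and the assumed +2 percentage point effect are illustrative; the library calls are standard statsmodels.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(42)
    n = 20_000

    # Hypothetical repeated cross-sections around the ETI start date.
    df = pd.DataFrame({
        "treated": rng.integers(0, 2, n),      # 1 = age-eligible youth
        "post": rng.integers(0, 2, n),         # 1 = observed after 1 Jan 2014
    })
    # Synthetic employment outcomes with a small assumed treatment effect (+2 pp).
    base = 0.30 + 0.05 * df["treated"] + 0.01 * df["post"] + 0.02 * df["treated"] * df["post"]
    df["employed"] = (rng.random(n) < base).astype(int)

    # Linear probability model: the interaction coefficient is the DiD estimate.
    did = smf.ols("employed ~ treated * post", data=df).fit(cov_type="HC1")
    print("DiD estimate:", did.params["treated:post"], "s.e.:", did.bse["treated:post"])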

  9. Economic and ecological impacts of bioenergy crop production—a modeling approach applied in Southwestern Germany

    Directory of Open Access Journals (Sweden)

    Hans-Georg Schwarz-v. Raumer

    2017-03-01

    Full Text Available This paper considers scenarios of cultivating energy crops in the German Federal State of Baden-Württemberg to identify potentials and limitations of sustainable bioenergy production. Trade-offs are analyzed among income and production structure in agriculture, bioenergy crop production, greenhouse gas emissions, and the interests of soil, water and species habitat protection. An integrated modelling approach (IMA) was implemented, coupling ecological and economic models in a model chain. IMA combines the Economic Farm Emission Model (EFEM; key input: parameter sets on farm production activities), the Environmental Policy Integrated Climate model (EPIC; key input: parameter sets on environmental cropping effects) and GIS geo-processing models. EFEM is a supply model that maximizes total gross margins at farm level with simultaneous calculation of greenhouse gas emissions from agricultural production. Calculations by EPIC result in estimates of soil erosion by water, nitrate leaching, soil organic carbon and greenhouse gas emissions from soil. GIS routines provide land suitability analyses, scenario settings concerning nature conservation and habitat models for target species, and help to produce spatially explicit results. The model chain is used to calculate scenarios representing different intensities of energy crop cultivation. To design scenarios that are detailed and in step with practice, comprehensive data research as well as fact and effect analyses were carried out. The scenarios indicate that, not in general but for specific farm types, the energy crop share increases sharply if not restricted and leads to an increase in income. Where this happens, it leads to a significant increase in soil erosion by water, nitrate leaching and greenhouse gas emissions. It has to be expected that an extension of nature conservation leads to an intensification of the remaining grassland and of the arable land that were not part of the nature conservation measures

  10. Modern software approaches applied to a Hydrological model: the GEOtop Open-Source Software Project

    Science.gov (United States)

    Cozzini, Stefano; Endrizzi, Stefano; Cordano, Emanuele; Bertoldi, Giacomo; Dall'Amico, Matteo

    2017-04-01

    The GEOtop hydrological scientific package is an integrated hydrological model that simulates the heat and water budgets at and below the soil surface. It describes the three-dimensional water flow in the soil and the energy exchange with the atmosphere, considering the radiative and turbulent fluxes. Furthermore, it reproduces the highly non-linear interactions between the water and energy balance during soil freezing and thawing, and simulates the temporal evolution of snow cover, soil temperature and moisture. The core components of the package were presented in version 2.0 (Endrizzi et al., 2014), which was released as a free, open-source software project. However, despite the high scientific quality of the project, a modern software engineering approach was still missing. This weakness hindered its scientific potential and its use both as a standalone package and, more importantly, in an integrated way with other hydrological software tools. In this contribution we present our recent software re-engineering efforts to create a robust and stable scientific software package open to the hydrological community, easily usable by researchers and experts, and interoperable with other packages. The activity takes as a starting point version 2.0, scientifically tested and published. This version, together with several test cases based on recently published or available GEOtop applications (Cordano and Rigon, 2013, WRR; Kollet et al., 2016, WRR), provides the baseline code and a number of reference results as benchmarks. Comparison and scientific validation can then be performed for each software re-engineering activity performed on the package. To keep track of every single change, the package is published in its own GitHub repository, geotopmodel.github.io/geotop/, under the GPL v3.0 license. A Continuous Integration mechanism by means of Travis-CI has been enabled on the GitHub repository for the master and main development branches. The usage of the CMake configuration tool

  11. Applying radiation approaches to the control of public risks from chemical agents

    International Nuclear Information System (INIS)

    Alexander, R.E.

    1989-01-01

    If a hazardous agent has a threshold, prevention is the obvious measure of success. In the eyes of this author, success is also achievable for a hazardous agent that may have no threshold and that causes its effects in a probabilistic manner. First, the technical people responsible for protection must be given a reasonable, well-defined risk objective by governmental authorities. To the extent that they meet that objective (1) without unnecessarily increasing operational costs, (2) without interfering unnecessarily with operational activities, and (3) without diverting resources away from greater risks, they are successful. Considering these three qualifications, radiation protection for members of the public can hardly be presented as the panacea for other hazardous agents. It would be an error to dismiss the improvement opportunities discussed above as being of academic interest only. Decades of experience with radiation have demonstrated that these problems are both real and significant. In the US the axioms discussed above are accepted as scientific fact for radiation by many policy makers, the news media and the public. For any operation the collective dose is calculated using zero dose as the lower limit of integration, the results are converted to cancer deaths using the risk coefficients, and decisions are made as though these deaths would actually occur without governmental intervention. As a result, billions of dollars and a very large number of highly skilled persons are being expended to protect against radiation doses far smaller than geographical variations in the natural radiation background. These expenditures are demanded by, and required for, well-meaning nontechnical people who have been misled. It is often stated by knowledgeable people that if the degree of protection required for radiation were also to be requested for the other hazards, human progress would come to a halt. If the radiation approaches are to be used in the control of public

  12. Applied Problems and Use of Technology in an Aligned Way in Basic Courses in Probability and Statistics for Engineering Students--A Way to Enhance Understanding and Increase Motivation

    Science.gov (United States)

    Zetterqvist, Lena

    2017-01-01

    Researchers and teachers often recommend motivating exercises and use of mathematics or statistics software for the teaching of basic courses in probability and statistics. Our courses are given to large groups of engineering students at Lund Institute of Technology. We found that the mere existence of real-life data and technology in a course…

  13. A full-spectral Bayesian reconstruction approach based on the material decomposition model applied in dual-energy computed tomography

    International Nuclear Information System (INIS)

    Cai, C.; Rodet, T.; Mohammad-Djafari, A.; Legoupil, S.

    2013-01-01

    necessary to have accurate spectrum information about the source-detector system. When dealing with experimental data, the spectrum can be predicted by a Monte Carlo simulator. For materials between water and bone, separation errors of less than 5% are observed on the estimated decomposition fractions. Conclusions: The proposed approach is a statistical reconstruction approach based on a nonlinear forward model accounting for the full beam polychromaticity and applied directly to the projections without taking the negative log. Compared to approaches based on linear forward models and to beam-hardening artifact (BHA) correction approaches, it has advantages in noise robustness and reconstruction accuracy
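
    To make the nonlinear, polychromatic forward model concrete, here is a schematic sketch with an illustrative three-bin spectrum and rough attenuation values, not the paper's simulated or experimental setup: the expected detector count is an energy-weighted sum of Beer–Lambert attenuations over the two basis materials (water and bone), which is why simply taking the negative log of the projection, as linear models do, misrepresents the physics.

    import numpy as np

    # Illustrative 3-bin source spectrum S(E) and linear attenuation coefficients
    # for the two basis materials at those energies (all values assumed).
    energies_keV = np.array([40.0, 70.0, 100.0])
    spectrum = np.array([0.2, 0.5, 0.3]) * 1e5       # photons per bin, assumed
    mu_water = np.array([0.27, 0.19, 0.17])          # 1/cm, rough illustrative values
    mu_bone = np.array([1.00, 0.45, 0.35])           # 1/cm, rough illustrative values

    def expected_counts(length_water_cm, length_bone_cm):
        """Polychromatic Beer-Lambert model: sum over energy bins of
        S(E) * exp(-(mu_w(E)*L_w + mu_b(E)*L_b))."""
        attenuation = np.exp(-(mu_water * length_water_cm + mu_bone * length_bone_cm))
        return float(np.sum(spectrum * attenuation))

    # A ray crossing 10 cm of water and 2 cm of bone:
    counts = expected_counts(10.0, 2.0)

    # The monochromatic -log transform of this measurement is NOT a linear
    # combination of the two path lengths; that discrepancy is the beam-hardening
    # effect that the full-spectral Bayesian approach accounts for.
    naive_projection = -np.log(counts / np.sum(spectrum))
    print(counts, naive_projection)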

  14. Qualified Presumption of Safety (QPS) is a generic risk assessment approach applied by the European Food Safety Authority (EFSA)

    DEFF Research Database (Denmark)

    Leuschner, R. G. K.; Robinson, T. P.; Hugas, M.

    2010-01-01

    Qualified Presumption of Safety (QPS) is a generic risk assessment approach applied by the European Food Safety Authority (EFSA) to notified biological agents, aiming at simplifying risk assessments across different scientific Panels and Units. The aim of this review is to outline the implementation and value of the QPS assessment for EFSA and to explain its principles, such as the unambiguous identity of a taxonomic unit, the body of knowledge including potential safety concerns, and how these considerations lead to a list of biological agents recommended for QPS, which EFSA keeps updated through…

  15. Foundations of probability

    International Nuclear Information System (INIS)

    Fraassen, B.C. van

    1979-01-01

    The interpretation of probabilities in physical theories is considered, whether quantum or classical. The following points are discussed: 1) the functions P(μ, Q), in terms of which states and propositions can be represented, are classical (Kolmogoroff) probabilities, formally speaking; 2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and 3) testing of the theory typically takes the form of confronting the expectation values of an observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)

  16. The quantum probability calculus

    International Nuclear Information System (INIS)

    Jauch, J.M.

    1976-01-01

    The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)

  17. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications.

  18. Knowledge typology for imprecise probabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.
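
    As one concrete alternative to standard probability mentioned above, the sketch below implements Dempster's rule of combination for two hypothetical expert mass assignments over a tiny frame of discernment; the hypotheses, masses and the subsystem itself are invented for illustration and are not drawn from the Los Alamos elicitation work.

    from itertools import product

    FRAME = frozenset({"works", "fails"})

    def dempster_combine(m1, m2):
        """Combine two basic probability assignments (dicts: frozenset -> mass)."""
        combined, conflict = {}, 0.0
        for (a, wa), (b, wb) in product(m1.items(), m2.items()):
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb
        if conflict >= 1.0:
            raise ValueError("total conflict: the two sources cannot be combined")
        return {s: w / (1.0 - conflict) for s, w in combined.items()}

    # Two hypothetical experts on a subsystem's state; mass placed on the full
    # frame expresses ignorance, which a single precise probability cannot.
    expert1 = {frozenset({"works"}): 0.6, FRAME: 0.4}
    expert2 = {frozenset({"works"}): 0.5, frozenset({"fails"}): 0.2, FRAME: 0.3}

    print(dempster_combine(expert1, expert2))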

  19. Probability of satellite collision

    Science.gov (United States)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.
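
    A toy Monte Carlo sketch of the kind of computation described, not the paper's method, and with all numbers invented: given a Gaussian miss-distance distribution in the encounter plane and a combined hard-body radius, the collision probability per encounter is the probability mass falling inside the hard-body circle, and a mission-duration figure follows from an assumed number of independent encounters.

    import numpy as np

    rng = np.random.default_rng(7)

    def collision_prob_per_encounter(sigma_x_m, sigma_y_m, hard_body_radius_m,
                                     n_samples=1_000_000):
        """Probability that a zero-mean Gaussian miss distance in the encounter
        plane falls within the combined hard-body radius (Monte Carlo estimate)."""
        x = rng.normal(0.0, sigma_x_m, n_samples)
        y = rng.normal(0.0, sigma_y_m, n_samples)
        return float(np.mean(np.hypot(x, y) < hard_body_radius_m))

    # Assumed values: 1 km position uncertainty per axis, 30 m hard-body radius.
    p_single = collision_prob_per_encounter(1000.0, 1000.0, 30.0)

    # If encounters arrive roughly independently at an assumed rate, the
    # probability of at least one collision over the mission is 1 - (1 - p)^N.
    encounters_per_year, mission_years = 200, 5
    p_mission = 1.0 - (1.0 - p_single) ** (encounters_per_year * mission_years)
    print(p_single, p_mission)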

  20. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended…
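
    A small numerical illustration of the CPGF idea under a standard special case (the multinomial logit member of the ARUM family, not the general constructions of the paper): the log of the generating function evaluated at the exponentiated systematic utilities is the log-sum-exp of those utilities, and its gradient reproduces the familiar logit choice probabilities. The utility values are made up.

    import numpy as np

    def logit_choice_probs_via_cpgf(V):
        """For MNL, log G(exp(V)) = logsumexp(V); its gradient with respect to V
        gives the choice probabilities. The analytic gradient (softmax) is checked
        against a finite-difference approximation."""
        V = np.asarray(V, dtype=float)
        log_G = np.log(np.sum(np.exp(V)))          # CPGF evaluated at exp(V)
        probs = np.exp(V - log_G)                  # analytic gradient = softmax(V)

        # Finite-difference check of d(log G)/dV_i
        eps, fd = 1e-6, np.zeros_like(V)
        for i in range(len(V)):
            Vp = V.copy()
            Vp[i] += eps
            fd[i] = (np.log(np.sum(np.exp(Vp))) - log_G) / eps
        assert np.allclose(probs, fd, atol=1e-5)
        return probs

    # Hypothetical systematic utilities of three alternatives.
    print(logit_choice_probs_via_cpgf([1.0, 0.5, -0.2]))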

  1. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introduction…

  2. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  3. Isotopic reconstruction of the weaning process in the archaeological population of Canímar Abajo, Cuba: A Bayesian probability mixing model approach.

    Directory of Open Access Journals (Sweden)

    Yadira Chinique de Armas

    Full Text Available The general lack of well-preserved juvenile skeletal remains from Caribbean archaeological sites has, in the past, prevented evaluations of juvenile dietary changes. Canímar Abajo (Cuba), with a large number of well-preserved juvenile and adult skeletal remains, provided a unique opportunity to fully assess juvenile paleodiets from an ancient Caribbean population. Ages for the start and the end of weaning and possible food sources used for weaning were inferred by combining the results of two Bayesian probability models that help to reduce some of the uncertainties inherent to bone collagen isotope based paleodiet reconstructions. Bone collagen (31 juveniles, 18 adult females) was used for carbon and nitrogen isotope analyses. The isotope results were assessed using two Bayesian probability models: Weaning Ages Reconstruction with Nitrogen isotopes and Stable Isotope Analysis in R. Breast milk seems to have been the most important protein source until two years of age, with some supplementary food such as tropical fruits and root cultigens likely introduced earlier. After the age of two, juvenile diets were likely continuously supplemented by starch-rich foods such as root cultigens and legumes. By the age of three, the model results suggest that the weaning process was completed. Additional indications suggest that animal marine/riverine protein and maize, while part of the Canímar Abajo female diets, were likely not used to supplement juvenile diets. The combined use of both models here provided a more complete assessment of the weaning process for an ancient Caribbean population, indicating not only the start and end ages of weaning but also the relative importance of different food sources for juveniles of different ages.

  4. Crash probability estimation via quantifying driver hazard perception.

    Science.gov (United States)

    Li, Yang; Zheng, Yang; Wang, Jianqiang; Kodaka, Kenji; Li, Keqiang

    2018-07-01

    Crash probability estimation is an important method to predict the potential reduction of crash probability contributed by forward collision avoidance technologies (FCATs). In this study, we propose a practical approach to estimate crash probability, which combines a field operational test and numerical simulations of a typical rear-end crash model. To consider driver hazard perception characteristics, we define a novel hazard perception measure, called driver risk response time, by considering both time-to-collision (TTC) and driver braking response to impending collision risk in a near-crash scenario. Also, we establish a driving database under mixed Chinese traffic conditions based on a CMBS (Collision Mitigation Braking Systems)-equipped vehicle. Applying the crash probability estimation to this database, we estimate the potential decrease in crash probability owing to the use of CMBS. A comparison of the results with CMBS on and off shows a 13.7% reduction of crash probability in a typical rear-end near-crash scenario with a one-second delay of the driver's braking response. These results indicate that CMBS is beneficial for collision prevention, especially in the case of inattentive or older drivers. The proposed crash probability estimation offers a practical way for evaluating the safety benefits in the design and testing of FCATs. Copyright © 2017 Elsevier Ltd. All rights reserved.
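
    A highly simplified sketch of the kind of estimate described, with invented distributions rather than the authors' field-test data: if a near-crash is resolved only when the driver's risk response time is shorter than the available time-to-collision, the crash probability can be approximated by the fraction of sampled scenarios in which the response comes too late, with an optional extra braking delay standing in for the one-second case discussed above.

    import numpy as np

    rng = np.random.default_rng(3)

    def crash_probability(n=100_000, extra_brake_delay_s=0.0):
        """Fraction of simulated near-crash events in which the driver risk
        response time (plus any extra delay) exceeds the available TTC."""
        ttc = rng.lognormal(mean=np.log(2.0), sigma=0.4, size=n)        # assumed TTC [s]
        response = rng.lognormal(mean=np.log(1.2), sigma=0.5, size=n)   # assumed response [s]
        return float(np.mean(response + extra_brake_delay_s > ttc))

    p_baseline = crash_probability()
    p_delayed = crash_probability(extra_brake_delay_s=1.0)
    print(f"baseline: {p_baseline:.3f}, with 1 s extra braking delay: {p_delayed:.3f}")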

  5. Making good theory practical: five lessons for an Applied Social Identity Approach to challenges of organizational, health, and clinical psychology.

    Science.gov (United States)

    Haslam, S Alexander

    2014-03-01

    Social identity research was pioneered as a distinctive theoretical approach to the analysis of intergroup relations but over the last two decades it has increasingly been used to shed light on applied issues. One early application of insights from social identity and self-categorization theories was to the organizational domain (with a particular focus on leadership), but more recently there has been a surge of interest in applications to the realm of health and clinical topics. This article charts the development of this Applied Social Identity Approach, and abstracts five core lessons from the research that has taken this forward. (1) Groups and social identities matter because they have a critical role to play in organizational and health outcomes. (2) Self-categorizations matter because it is people's self-understandings in a given context that shape their psychology and behaviour. (3) The power of groups is unlocked by working with social identities not across or against them. (4) Social identities need to be made to matter in deed not just in word. (5) Psychological intervention is always political because it always involves some form of social identity management. Programmes that seek to incorporate these principles are reviewed and important challenges and opportunities for the future are identified. © 2014 The British Psychological Society.

  6. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  7. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particular…

  8. Probability Measures on Groups IX

    CERN Document Server

    1989-01-01

    The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.

  9. Applying the Ecosystem Approach to Select Priority Areas for Forest Landscape Restoration in the Yungas, Northwestern Argentina

    Science.gov (United States)

    Ianni, Elena; Geneletti, Davide

    2010-11-01

    This paper proposes a method to select forest restoration priority areas consistent with the key principles of the Ecosystem Approach (EA) and the Forest Landscape Restoration (FLR) framework. The methodology is based on the principles shared by the two approaches: acting at ecosystem scale, involving stakeholders, and evaluating alternatives. It proposes the involvement of social actors who have a stake in forest management through multicriteria analysis sessions aimed at identifying the most suitable forest restoration intervention. The method was applied to a study area in the native forests of Northern Argentina (the Yungas). Stakeholders were asked to identify alternative restoration actions, i.e., potential areas for implementing FLR. Ten alternative fincas—estates derived from the Spanish land tenure system—differing in relation to ownership, management, land use, land tenure, and size were evaluated. Twenty criteria were selected and classified into four groups: biophysical, social, economic and political. Finca Ledesma was the closest to the economic, social, environmental and political goals, according to the values and views of the actors involved in the decision. This study represented the first attempt to apply EA principles to forest restoration at landscape scale in the Yungas region. The benefits obtained by the application of the method were twofold: on one hand, researchers and local actors were forced to conceive the Yungas as a complex net of rights rather than as a sum of personal interests. On the other hand, the participatory multicriteria approach provided a structured process for collective decision-making in an area where it had never been implemented.

  10. Evaluation of nuclear power plant component failure probability and core damage probability using simplified PSA model

    International Nuclear Information System (INIS)

    Shimada, Yoshio

    2000-01-01

    It is anticipated that changes in the frequency of surveillance tests, preventive maintenance or parts replacement of safety-related components may change component failure probabilities and, as a result, the core damage probability. It is also anticipated that the change differs depending on the initiating event frequency and the component type. This study assessed the change in core damage probability using a simplified PSA model capable of calculating core damage probability in a short time period, developed by the US NRC to process accident sequence precursors, when the failure probability of various components is varied between 0 and 1 and when Japanese or American initiating event frequency data are used. As a result of the analysis: (1) It was clarified that the frequency of surveillance tests, preventive maintenance or parts replacement of motor-driven pumps (high-pressure injection pumps, residual heat removal pumps, auxiliary feedwater pumps) should be changed carefully, since the change in core damage probability is large when the base failure probability increases. (2) The core damage probability is insensitive to changes in surveillance test frequency for motor-operated valves and the turbine-driven auxiliary feedwater pump, since its change is small when their failure probabilities change by about one order of magnitude. (3) The change in core damage probability is small when Japanese failure probability data are applied to the emergency diesel generator, even if the failure probability changes by one order of magnitude from the base value. On the other hand, when American failure probability data are applied, the increase in core damage probability is large when the failure probability increases. Therefore, when Japanese failure probability data are applied, the core damage probability is insensitive to changes in surveillance test frequency, etc. (author)
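
    The sensitivity being described can be illustrated with a deliberately tiny event-tree/fault-tree fragment, with all frequencies and probabilities invented and no relation to the NRC accident sequence precursor model: sweeping one component's failure probability from 0 to 1 shows how strongly the core damage frequency responds, which is the kind of screening the study performs for pumps, valves and diesel generators.

    import numpy as np

    def core_damage_frequency(p_hpi_pump, p_afw_pump=0.01, p_mov=0.003,
                              ie_freq_per_year=0.1):
        """Toy model: core damage requires failure of both mitigation paths.
        Path A: high-pressure injection (pump OR its motor-operated valve fails).
        Path B: auxiliary feedwater pump fails."""
        p_path_a_fails = 1.0 - (1.0 - p_hpi_pump) * (1.0 - p_mov)
        p_path_b_fails = p_afw_pump
        return ie_freq_per_year * p_path_a_fails * p_path_b_fails

    # Sweep the HPI pump failure probability between 0 and 1, as in the study.
    for p in np.linspace(0.0, 1.0, 6):
        print(f"p(HPI pump fails) = {p:.1f} -> CDF = {core_damage_frequency(p):.2e} /year")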

  11. Applying the institutional review board data repository approach to manage ethical considerations in evaluating and studying medical education.

    Science.gov (United States)

    Thayer, Erin K; Rathkey, Daniel; Miller, Marissa Fuqua; Palmer, Ryan; Mejicano, George C; Pusic, Martin; Kalet, Adina; Gillespie, Colleen; Carney, Patricia A

    2016-01-01

    Medical educators and educational researchers continue to improve their processes for managing medical student and program evaluation data using sound ethical principles. This is becoming even more important as curricular innovations are occurring across undergraduate and graduate medical education. Dissemination of findings from this work is critical, and peer-reviewed journals often require an institutional review board (IRB) determination. IRB data repositories, originally designed for the longitudinal study of biological specimens, can be applied to medical education research. The benefits of such an approach include obtaining expedited review for multiple related studies within a single IRB application and allowing for more flexibility when conducting complex longitudinal studies involving large datasets from multiple data sources and/or institutions. In this paper, we inform educators and educational researchers on our analysis of the use of the IRB data repository approach to manage ethical considerations as part of best practices for amassing, pooling, and sharing data for educational research, evaluation, and improvement purposes. Fostering multi-institutional studies while following sound ethical principles in the study of medical education is needed, and the IRB data repository approach has many benefits, especially for longitudinal assessment of complex multi-site data.

  12. Applying a developmental approach to quality of life assessment in children and adolescents with psychological disorders: challenges and guidelines.

    Science.gov (United States)

    Carona, Carlos; Silva, Neuza; Moreira, Helena

    2015-02-01

    Research on the quality of life (QL) of children/adolescents with psychological disorders has flourished over the last few decades. Given the developmental challenges of QL measurements in pediatric populations, the aim of this study was to ascertain the extent to which a developmental approach to QL assessment has been applied to pedopsychiatric QL research. A systematic literature search was conducted in three electronic databases (PubMed, PsycINFO, SocINDEX) from 1994 to May 2014. Quantitative studies were included if they assessed the self- or proxy-reported QL of children/adolescents with a psychological disorder. Data were extracted for study design, participants, QL instruments and informants, and statistical approach to age-related specificities. The systematic review revealed widespread utilization of developmentally appropriate QL instruments but less frequent use of both self and proxy reports and an inconsistent approach to age group specificities. Methodological guidelines are discussed to improve the developmental validity of QL research for children/adolescents with mental disorders.

  13. Defining Probability in Sex Offender Risk Assessment.

    Science.gov (United States)

    Elwood, Richard W

    2016-12-01

    There is ongoing debate and confusion over using actuarial scales to predict individuals' risk of sexual recidivism. Much of the debate comes from not distinguishing Frequentist from Bayesian definitions of probability. Much of the confusion comes from applying Frequentist probability to individuals' risk. By definition, only Bayesian probability can be applied to the single case. The Bayesian concept of probability resolves most of the confusion and much of the debate in sex offender risk assessment. Although Bayesian probability is well accepted in risk assessment generally, it has not been widely used to assess the risk of sex offenders. I review the two concepts of probability and show how the Bayesian view alone provides a coherent scheme to conceptualize individuals' risk of sexual recidivism.
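
    The Bayesian reading of an individual's risk can be made concrete with a small, hypothetical calculation whose numbers are invented and not taken from any actuarial scale: a base rate of recidivism is treated as the prior, and an instrument's score updates it through its likelihood ratio on the odds scale.

    def posterior_risk(base_rate, likelihood_ratio):
        """Bayesian update on the odds scale:
        posterior odds = prior odds * likelihood ratio."""
        prior_odds = base_rate / (1.0 - base_rate)
        post_odds = prior_odds * likelihood_ratio
        return post_odds / (1.0 + post_odds)

    # Hypothetical numbers: a 10% base rate of sexual recidivism and an actuarial
    # score observed 3 times more often among recidivists than non-recidivists.
    print(f"posterior probability of recidivism: {posterior_risk(0.10, 3.0):.2f}")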

  14. Applying the Innov8 approach for reviewing national health programmes to leave no one behind: lessons learnt from Indonesia

    Science.gov (United States)

    Saint, Victoria; Floranita, Rustini; Koemara Sakti, Gita Maya; Pambudi, Imran; Hermawan, Lukas; Villar, Eugenio; Magar, Veronica

    2018-01-01

    ABSTRACT The World Health Organization’s Innov8 Approach for Reviewing National Health Programmes to Leave No One Behind is an eight-step process that supports the operationalization of the Sustainable Development Goals’ commitment to ‘leave no one behind’. In 2014–2015, Innov8 was adapted and applied in Indonesia to review how the national neonatal and maternal health action plans could become more equity-oriented, rights-based and gender-responsive, and better address critical social determinants of health. The process was led by the Indonesian Ministry of Health, with the support of WHO. It involved a wide range of actors and aligned with/fed into the drafting of the maternal newborn health action plan and the implementation planning of the newborn action plan. Key activities included a sensitization meeting, diagnostic checklist, review workshop and in-country work by the review teams. This ‘methods forum’ article describes this adaptation and application process, the outcomes and lessons learnt. In conjunction with other sources, Innov8 findings and recommendations informed national and sub-national maternal and neonatal action plans and programming to strengthen a ‘leave no one behind’ approach. As follow-up during 2015–2017, components of the Innov8 methodology were integrated into district-level planning processes for maternal and newborn health, and Innov8 helped generate demand for health inequality monitoring and its use in planning. In Indonesia, Innov8 enhanced national capacity for equity-oriented, rights-based and gender-responsive approaches and addressing critical social determinants of health. Adaptation for the national planning context (e.g. decentralized structure) and linking with health inequality monitoring capacity building were important lessons learnt. The pilot of Innov8 in Indonesia suggests that this approach can help operationalize the SDGs’ commitment to leave no one behind, in particular in relation to

  15. Exploiting neurovascular coupling: a Bayesian sequential Monte Carlo approach applied to simulated EEG fNIRS data

    Science.gov (United States)

    Croce, Pierpaolo; Zappasodi, Filippo; Merla, Arcangelo; Chiarelli, Antonio Maria

    2017-08-01

    Objective. Electrical and hemodynamic brain activity are linked through the neurovascular coupling process and they can be simultaneously measured through integration of electroencephalography (EEG) and functional near-infrared spectroscopy (fNIRS). Thanks to the lack of electro-optical interference, the two procedures can be easily combined and, whereas EEG provides electrophysiological information, fNIRS can provide measurements of two hemodynamic variables, such as oxygenated and deoxygenated hemoglobin. A Bayesian sequential Monte Carlo approach (particle filter, PF) was applied to simulated recordings of electrical and neurovascular mediated hemodynamic activity, and the advantages of a unified framework were shown. Approach. Multiple neural activities and hemodynamic responses were simulated in the primary motor cortex of a subject brain. EEG and fNIRS recordings were obtained by means of forward models of volume conduction and light propagation through the head. A state space model of combined EEG and fNIRS data was built and its dynamic evolution was estimated through a Bayesian sequential Monte Carlo approach (PF). Main results. We showed the feasibility of the procedure and the improvements in both electrical and hemodynamic brain activity reconstruction when using the PF on combined EEG and fNIRS measurements. Significance. The investigated procedure allows one to combine the information provided by the two methodologies, and, by taking advantage of a physical model of the coupling between electrical and hemodynamic response, to obtain a better estimate of brain activity evolution. Despite the high computational demand, application of such an approach to in vivo recordings could fully exploit the advantages of this combined brain imaging technology.
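
    For readers unfamiliar with the estimator named in the abstract, the following is a minimal bootstrap particle filter sketch on a toy one-dimensional state-space model. The transition and observation models, noise levels, and particle count are illustrative assumptions and are unrelated to the paper's EEG/fNIRS forward models.

    ```python
    # Minimal bootstrap particle filter sketch for a toy one-dimensional state-space
    # model (not the EEG/fNIRS neurovascular model of the paper).
    import numpy as np

    rng = np.random.default_rng(0)

    def particle_filter(observations, n_particles=500, proc_std=0.5, obs_std=1.0):
        particles = rng.normal(0.0, 1.0, n_particles)      # initial particle cloud
        estimates = []
        for y in observations:
            # 1. Propagate particles through the (assumed) state transition model.
            particles = 0.9 * particles + rng.normal(0.0, proc_std, n_particles)
            # 2. Weight particles by the likelihood of the current observation.
            weights = np.exp(-0.5 * ((y - particles) / obs_std) ** 2)
            weights /= weights.sum()
            # 3. Posterior mean estimate, then multinomial resampling.
            estimates.append(np.sum(weights * particles))
            idx = rng.choice(n_particles, size=n_particles, p=weights)
            particles = particles[idx]
        return np.array(estimates)

    # Example: filter a noisy sine-like signal.
    true_state = np.sin(np.linspace(0, 4 * np.pi, 100))
    obs = true_state + rng.normal(0, 1.0, 100)
    print(particle_filter(obs)[:5])
    ```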

  16. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  17. The concept of probability

    International Nuclear Information System (INIS)

    Bitsakis, E.I.; Nicolaides, C.A.

    1989-01-01

    The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our everyday life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs

  18. Information operator approach applied to the retrieval of vertical distributions of atmospheric constituents from ground-based FTIR measurements

    Science.gov (United States)

    Senten, Cindy; de Mazière, Martine; Vanhaelewyn, Gauthier; Vigouroux, Corinne; Delmas, Robert

    2010-05-01

    The retrieval of information about the vertical distribution of an atmospheric absorber from high spectral resolution ground-based Fourier Transform infrared (FTIR) solar absorption spectra is an important issue in remote sensing. A frequently used technique at present is the optimal estimation method. This work introduces the application of an alternative method, namely the information operator approach (Doicu et al., 2007; Hoogen et al., 1999), for extracting the available information from such FTIR measurements. This approach has been implemented within the well-known retrieval code SFIT2, by adapting the optimal estimation method such as to take into account only the significant contributions to the solution. In particular, we demonstrate the feasibility of the method when applied to ground-based FTIR spectra taken at the southern (sub)tropical site Ile de La Réunion (21° S, 55° E) in 2007. A thorough comparison has been made between the retrieval results obtained with the original optimal estimation method and the ones obtained with the information operator approach, regarding profile and column stability, information content and corresponding full error budget evaluation. This has been done for the target species ozone (O3), methane (CH4), nitrous oxide (N2O), and carbon monoxide (CO). It is shown that the information operator approach performs well and is capable of achieving the same accuracy as optimal estimation, with a gain of stability and with the additional advantage of being less sensitive to the choice of a priori information as well as to the actual signal-to-noise ratio.

    Keywords: ground-based FTIR, solar absorption spectra, greenhouse gases, information operator approach.

    References: Doicu, A., Hilgers, S., von Bargen, A., Rozanov, A., Eichmann, K.-U., von Savigny, C., and Burrows, J.P.: Information operator approach and iterative regularization methods for atmospheric remote sensing, J. Quant. Spectrosc. Radiat. Transfer, 103, 340-350, 2007.
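
    As a rough schematic of the idea of retaining only the significant contributions, the sketch below contrasts a linear optimal estimation retrieval with a truncation of the eigenvalue decomposition of the information matrix. It is a toy linear example under assumed matrices, not the SFIT2 implementation or the exact formulation of Doicu et al. (2007).

    ```python
    # Schematic linear retrieval sketch contrasting optimal estimation with an
    # information-operator-style eigenvalue truncation. The forward model K, the
    # covariances, and the threshold are illustrative assumptions.
    import numpy as np

    def oem_retrieval(K, y, x_a, S_a, S_e):
        """Standard optimal estimation: x = x_a + G (y - K x_a)."""
        G = S_a @ K.T @ np.linalg.inv(K @ S_a @ K.T + S_e)
        return x_a + G @ (y - K @ x_a)

    def info_operator_retrieval(K, y, x_a, S_a, S_e, threshold=1.0):
        """Keep only eigenvectors of the information matrix whose eigenvalues
        exceed `threshold`, i.e. only the significant contributions."""
        L = np.linalg.cholesky(S_a)                       # S_a = L L^T
        P = L.T @ K.T @ np.linalg.inv(S_e) @ K @ L        # information matrix
        lam, V = np.linalg.eigh(P)
        keep = lam > threshold
        Vk, lk = V[:, keep], lam[keep]
        # Gain matrix restricted to the retained eigenspace of P.
        G = L @ Vk @ np.diag(1.0 / (1.0 + lk)) @ Vk.T @ L.T @ K.T @ np.linalg.inv(S_e)
        return x_a + G @ (y - K @ x_a)

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        K = rng.normal(size=(8, 4))            # toy forward model: 8 measurements, 4 levels
        x_true = rng.normal(size=4)
        S_a, S_e = np.eye(4), 0.01 * np.eye(8)
        y = K @ x_true + rng.multivariate_normal(np.zeros(8), S_e)
        x_a = np.zeros(4)
        print(oem_retrieval(K, y, x_a, S_a, S_e))
        print(info_operator_retrieval(K, y, x_a, S_a, S_e))
    ```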

  19. A nonlinear adaptive backstepping approach applied to a three phase PWM AC-DC converter feeding induction heating

    Science.gov (United States)

    Hadri-Hamida, A.; Allag, A.; Hammoudi, M. Y.; Mimoune, S. M.; Zerouali, S.; Ayad, M. Y.; Becherif, M.; Miliani, E.; Miraoui, A.

    2009-04-01

    This paper presents a new control strategy for a three-phase PWM converter, which consists of applying an adaptive nonlinear control. The input-output feedback linearization approach relies on exact cancellation of the nonlinearity; this technique is therefore not effective when system parameters vary. First, a nonlinear system model is derived, with the input current and the output voltage as state variables, using the power balance of the input and output. The nonlinear adaptive backstepping control can then compensate for the nonlinearities of the nominal system and for the uncertainties. Simulation results are obtained using Matlab/Simulink. These results show how the adaptive backstepping law updates the system parameters and provides an efficient control design for both tracking and regulation in order to improve the power factor.
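
    To make the backstepping recursion concrete, the sketch below applies standard (non-adaptive) backstepping to a double integrator tracking a sine reference. The plant, gains, and reference are illustrative and are not the converter model or the adaptive update law of the paper.

    ```python
    # Minimal backstepping tracking sketch for a double integrator (a generic
    # illustration of the backstepping recursion, not the PWM AC-DC converter).
    import numpy as np

    k1, k2 = 4.0, 4.0
    dt, T = 1e-3, 5.0
    t = np.arange(0.0, T, dt)
    ref, dref, ddref = np.sin(t), np.cos(t), -np.sin(t)   # reference and its derivatives

    x1, x2 = 0.0, 0.0
    err = []
    for i in range(len(t)):
        z1 = x1 - ref[i]
        alpha = -k1 * z1 + dref[i]          # virtual control for the first subsystem
        z2 = x2 - alpha
        dalpha = -k1 * (x2 - dref[i]) + ddref[i]
        u = dalpha - z1 - k2 * z2           # backstepping control law
        # Euler integration of the double integrator x1' = x2, x2' = u
        x1 += dt * x2
        x2 += dt * u
        err.append(z1)

    print(f"final tracking error: {err[-1]:.2e}")
    ```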

  20. Evaluating and reducing the effect of data corruption when applying bag of words approaches to medical records.

    Science.gov (United States)

    Ruch, P; Baud, R; Geissbühler, A

    2002-12-04

    Unlike journal corpora, which are supposed to be carefully reviewed before being published, documents in a patient record are often corrupted by misspelled words and by conventional graphies or abbreviations. After a survey of the domain, the paper focuses on evaluating the effect of such corruption on an information retrieval (IR) engine. The IR system uses a classical bag-of-words approach, with stems as representation items and term frequency-inverse document frequency (tf-idf) as the weighting scheme; we pay special attention to the normalization factor. First results show that even low corruption levels (3%) affect retrieval effectiveness (by 4-7%), whereas higher corruption levels can degrade retrieval effectiveness by 25%. Then, we show that the use of an improved automatic spelling correction system, applied to the corrupted collection, can almost restore the retrieval effectiveness of the engine.
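
    A minimal sketch of the bag-of-words representation with tf-idf weighting follows, using made-up clinical snippets. The tokenization, weighting variant, and normalization here are simplified assumptions rather than the engine described in the paper.

    ```python
    # Minimal bag-of-words / tf-idf sketch (illustrative; not the paper's exact
    # weighting or normalization).
    import math
    from collections import Counter

    docs = [
        "patient admitted with chest pain and dyspnea",
        "chest x ray shows no acute disease",
        "patient discharged with follow up for chest pain",
    ]

    tokenized = [d.split() for d in docs]
    df = Counter(term for doc in tokenized for term in set(doc))   # document frequency
    n_docs = len(docs)

    def tfidf(doc_tokens):
        tf = Counter(doc_tokens)
        return {t: (tf[t] / len(doc_tokens)) * math.log(n_docs / df[t]) for t in tf}

    for i, doc in enumerate(tokenized):
        print(i, sorted(tfidf(doc).items(), key=lambda kv: -kv[1])[:3])
    ```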

  1. Methodology to characterize a residential building stock using a bottom-up approach: a case study applied to Belgium

    Directory of Open Access Journals (Sweden)

    Samuel Gendebien

    2014-06-01

    Full Text Available In the last ten years, the development and implementation of measures to mitigate climate change have become of major importance. In Europe, the residential sector accounts for 27% of the final energy consumption [1], and therefore contributes significantly to CO2 emissions. Roadmaps towards energy-efficient buildings have been proposed [2]. In such a context, the detailed characterization of residential building stocks in terms of age, type of construction, insulation level, energy vector, and of evolution prospects appears to be a useful contribution to the assessment of the impact of implementation of energy policies. In this work, a methodology to develop a tree-structure characterizing a residential building stock is presented in the frame of a bottom-up approach that aims to model and simulate domestic energy use. The methodology is applied to the Belgian case for the current situation and up to 2030 horizon. The potential applications of the developed tool are outlined.

  2. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  3. Concepts of probability theory

    CERN Document Server

    Pfeiffer, Paul E

    1979-01-01

    Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, more. For advanced undergraduates students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.

  4. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are included especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...

  5. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  6. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  7. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  8. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  9. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    Full Text Available By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  10. A Mixed-Methods Analysis in Assessing Students' Professional Development by Applying an Assessment for Learning Approach.

    Science.gov (United States)

    Peeters, Michael J; Vaidya, Varun A

    2016-06-25

    Objective. To describe an approach for assessing the Accreditation Council for Pharmacy Education's (ACPE) doctor of pharmacy (PharmD) Standard 4.4, which focuses on students' professional development. Methods. This investigation used mixed methods with triangulation of qualitative and quantitative data to assess professional development. Qualitative data came from an electronic developmental portfolio of professionalism and ethics, completed by PharmD students during their didactic studies. Quantitative confirmation came from the Defining Issues Test (DIT)-an assessment of pharmacists' professional development. Results. Qualitatively, students' development reflections described growth through this course series. Quantitatively, the 2015 PharmD class's DIT N2-scores illustrated positive development overall; the lower 50% had a large initial improvement compared to the upper 50%. Subsequently, the 2016 PharmD class confirmed these average initial improvements of students and also showed further substantial development among students thereafter. Conclusion. Applying an assessment for learning approach, triangulation of qualitative and quantitative assessments confirmed that PharmD students developed professionally during this course series.

  11. SYMPOSIUM REPORT: An Evidence-Based Approach to IBS and CIC: Applying New Advances to Daily Practice

    Science.gov (United States)

    Chey, William D.

    2017-01-01

    Many nonpharmacologic and pharmacologic therapies are available to manage irritable bowel syndrome (IBS) and chronic idiopathic constipation (CIC). The American College of Gastroenterology (ACG) regularly publishes reviews on IBS and CIC therapies. The most recent of these reviews was published by the ACG Task Force on the Management of Functional Bowel Disorders in 2014. The key objective of this review was to evaluate the efficacy of therapies for IBS or CIC compared with placebo or no treatment in randomized controlled trials. Evidence-based approaches to managing diarrhea-predominant IBS include dietary measures, such as a diet low in gluten and fermentable oligo-, di-, and monosaccharides and polyols (FODMAPs); loperamide; antispasmodics; peppermint oil; probiotics; tricyclic antidepressants; alosetron; eluxadoline, and rifaximin. Evidence-based approaches to managing constipation-predominant IBS and CIC include fiber, stimulant laxatives, polyethylene glycol, selective serotonin reuptake inhibitors, lubiprostone, and guanylate cyclase agonists. With the growing evidence base for IBS and CIC therapies, it has become increasingly important for clinicians to assess the quality of evidence and understand how to apply it to the care of individual patients. PMID:28729815

  12. A lumped parameter method of characteristics approach and multigroup kernels applied to the subgroup self-shielding calculation in MPACT

    Directory of Open Access Journals (Sweden)

    Shane Stimpson

    2017-09-01

    Full Text Available An essential component of the neutron transport solver is the resonance self-shielding calculation used to determine equivalence cross sections. The neutron transport code, MPACT, is currently using the subgroup self-shielding method, in which the method of characteristics (MOC) is used to solve purely absorbing fixed-source problems. Recent efforts incorporating multigroup kernels to the MOC solvers in MPACT have reduced runtime by roughly 2×. Applying the same concepts for self-shielding and developing a novel lumped parameter approach to MOC, substantial improvements have also been made to the self-shielding computational efficiency without sacrificing any accuracy. These new multigroup and lumped parameter capabilities have been demonstrated on two test cases: (1) a single lattice with quarter symmetry known as VERA (Virtual Environment for Reactor Applications) Progression Problem 2a and (2) a two-dimensional quarter-core slice known as Problem 5a-2D. From these cases, self-shielding computational time was reduced by roughly 3–4×, with a corresponding 15–20% increase in overall memory burden. An azimuthal angle sensitivity study also shows that only half as many angles are needed, yielding an additional speedup of 2×. In total, the improvements yield roughly a 7–8× speedup. Given these performance benefits, these approaches have been adopted as the default in MPACT.

  13. A lumped parameter method of characteristics approach and multigroup kernels applied to the subgroup self-shielding calculation in MPACT

    International Nuclear Information System (INIS)

    Stimpson, Shane G.; Liu, Yuxuan; Collins, Benjamin S.; Clarno, Kevin T.

    2017-01-01

    An essential component of the neutron transport solver is the resonance self-shielding calculation used to determine equivalence cross sections. The neutron transport code, MPACT, is currently using the subgroup self-shielding method, in which the method of characteristics (MOC) is used to solve purely absorbing fixed-source problems. Recent efforts incorporating multigroup kernels to the MOC solvers in MPACT have reduced runtime by roughly 2×. Applying the same concepts for self-shielding and developing a novel lumped parameter approach to MOC, substantial improvements have also been made to the self-shielding computational efficiency without sacrificing any accuracy. These new multigroup and lumped parameter capabilities have been demonstrated on two test cases: (1) a single lattice with quarter symmetry known as VERA (Virtual Environment for Reactor Applications) Progression Problem 2a and (2) a two-dimensional quarter-core slice known as Problem 5a-2D. From these cases, self-shielding computational time was reduced by roughly 3–4×, with a corresponding 15–20% increase in overall memory burden. An azimuthal angle sensitivity study also shows that only half as many angles are needed, yielding an additional speedup of 2×. In total, the improvements yield roughly a 7–8× speedup. Furthermore given these performance benefits, these approaches have been adopted as the default in MPACT.

  14. Using inferred probabilities to measure the accuracy of imprecise forecasts

    Directory of Open Access Journals (Sweden)

    Paul Lehner

    2012-11-01

    Full Text Available Research on forecasting is effectively limited to forecasts that are expressed with clarity, which is to say that the forecasted event must be sufficiently well-defined so that it can be clearly resolved whether or not the event occurred and forecast certainties are expressed as quantitative probabilities. When forecasts are expressed with clarity, then quantitative measures (scoring rules, calibration, discrimination, etc.) can be used to measure forecast accuracy, which in turn can be used to measure the comparative accuracy of different forecasting methods. Unfortunately, most real world forecasts are not expressed clearly. This lack of clarity extends both to the description of the forecast event and to the use of vague language to express forecast certainty. It is thus difficult to assess the accuracy of most real world forecasts, and consequently the accuracy of the methods used to generate real world forecasts. This paper addresses this deficiency by presenting an approach to measuring the accuracy of imprecise real world forecasts using the same quantitative metrics routinely used to measure the accuracy of well-defined forecasts. To demonstrate applicability, the Inferred Probability Method is applied to measure the accuracy of forecasts in fourteen documents examining complex political domains.

    Key words: inferred probability, imputed probability, judgment-based forecasting, forecast accuracy, imprecise forecasts, political forecasting, verbal probability, probability calibration.
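
    Once probabilities have been inferred, accuracy can be scored with the usual quantitative metrics mentioned in the abstract. The sketch below computes a Brier score and a crude calibration table on invented forecasts; it illustrates the scoring step only and is not the Inferred Probability Method itself.

    ```python
    # Minimal sketch of scoring a set of probabilities with the Brier score and a
    # crude calibration table (forecasts and outcomes are invented for illustration).
    import numpy as np

    probs = np.array([0.9, 0.7, 0.2, 0.6, 0.1, 0.8])   # inferred probabilities
    outcomes = np.array([1, 1, 0, 0, 0, 1])            # 1 = event occurred

    brier = np.mean((probs - outcomes) ** 2)
    print(f"Brier score: {brier:.3f}")                 # lower is better

    # Calibration: compare mean forecast with observed frequency per probability bin.
    bins = np.digitize(probs, [0.0, 0.5, 1.01])
    for b in np.unique(bins):
        m = bins == b
        print(f"bin {b}: mean forecast {probs[m].mean():.2f}, observed {outcomes[m].mean():.2f}")
    ```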

  15. Escape and transmission probabilities in cylindrical geometry

    International Nuclear Information System (INIS)

    Bjerke, M.A.

    1980-01-01

    An improved technique for the generation of escape and transmission probabilities in cylindrical geometry was applied to the existing resonance cross section processing code ROLAIDS. The algorithm of Hwang and Toppel [ANL-FRA-TM-118] (with modifications) was employed. The probabilities generated were found to be as accurate as those given by the method previously applied in ROLAIDS, while requiring much less computer core storage and CPU time.

  16. Applying a learning design methodology in the flipped classroom approach – empowering teachers to reflect and design for learning

    Directory of Open Access Journals (Sweden)

    Evangelia Triantafyllou

    2016-05-01

    Full Text Available One of the recent developments in teaching that heavily relies on current technology is the “flipped classroom” approach. In a flipped classroom the traditional lecture and homework sessions are inverted. Students are provided with online material in order to gain necessary knowledge before class, while class time is devoted to clarifications and application of this knowledge. The hypothesis is that there could be deep and creative discussions when teacher and students physically meet. This paper discusses how the learning design methodology can be applied to represent, share and guide educators through flipped classroom designs. In order to discuss the opportunities arising by this approach, the different components of the Learning Design – Conceptual Map (LD-CM are presented and examined in the context of the flipped classroom. It is shown that viewing the flipped classroom through the lens of learning design can promote the use of theories and methods to evaluate its effect on the achievement of learning objectives, and that it may draw attention to the employment of methods to gather learner responses. Moreover, a learning design approach can enforce the detailed description of activities, tools and resources used in specific flipped classroom models, and it can make educators more aware of the decisions that have to be taken and people who have to be involved when designing a flipped classroom. By using the LD-CM, this paper also draws attention to the importance of characteristics and values of different stakeholders (i.e. institutions, educators, learners, and external agents, which influence the design and success of flipped classrooms. Moreover, it looks at the teaching cycle from a flipped instruction model perspective and adjusts it to cater for the reflection loops educators are involved when designing, implementing and re-designing a flipped classroom. Finally, it highlights the effect of learning design on the guidance

  17. Patient perception of nursing service quality; an applied model of Donabedian's structure-process-outcome approach theory.

    Science.gov (United States)

    Kobayashi, Hideyuki; Takemura, Yukie; Kanda, Katsuya

    2011-09-01

    Nursing is a labour-intensive field, and an extensive amount of latent information exists to aid in evaluating the quality of nursing service, with patients' experiences, the primary focus of such evaluations. To effect further improvement in nursing as well as medical care, Donabedian's structure-process-outcome approach has been applied. To classify and confirm patients' specific experiences with regard to nursing service based on Donabedian's structure-process-outcomes model for improving the quality of nursing care. Items were compiled from existing scales and assigned to structure, process or outcomes in Donabedian's model through discussion among expert nurses and pilot data collection. With regard to comfort, surroundings were classified as structure (e.g. accessibility to nurses, disturbance); with regard to patient-practitioner interaction, patient participation was classified as a process (e.g. expertise and skill, patient decision-making); and with regard to changes in patients, satisfaction was classified as an outcome (e.g. information support, overall satisfaction). Patient inquiry was carried out using the finalized questionnaire at general wards in Japanese hospitals in 2005-2006. Reliability and validity were tested using psychometric methods. Data from 1,810 patients (mean age: 59.7 years; mean length of stay: 23.7 days) were analysed. Internal consistency reliability was supported (α = 0.69-0.96), with factor analysis items of structure aggregated to one factor and overall satisfaction under outcome aggregated to one. The remaining items of outcome and process were distributed together in two factors. Inter-scale correlation (r = 0.442-0.807) supported the construct validity of each structure-process-outcome approach. All structure items were represented as negative-worded examples, as they dealt with basic conditions under Japanese universal health care system, and were regarded as representative related to concepts of dissatisfaction and no

  18. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  19. A comparison of a modified sequential oral sensory approach to an applied behavior-analytic approach in the treatment of food selectivity in children with autism spectrum disorder.

    Science.gov (United States)

    Peterson, Kathryn M; Piazza, Cathleen C; Volkert, Valerie M

    2016-09-01

    Treatments of pediatric feeding disorders based on applied behavior analysis (ABA) have the most empirical support in the research literature (Volkert & Piazza, 2012); however, professionals often recommend, and caregivers often use, treatments that have limited empirical support. In the current investigation, we compared a modified sequential oral sensory approach (M-SOS; Benson, Parke, Gannon, & Muñoz, 2013) to an ABA approach for the treatment of the food selectivity of 6 children with autism. We randomly assigned 3 children to ABA and 3 children to M-SOS and compared the effects of treatment in a multiple baseline design across novel, healthy target foods. We used a multielement design to assess treatment generalization. Consumption of target foods increased for children who received ABA, but not for children who received M-SOS. We subsequently implemented ABA with the children for whom M-SOS was not effective and observed a potential treatment generalization effect during ABA when M-SOS preceded ABA. © 2016 Society for the Experimental Analysis of Behavior.

  20. Probable existence of a Gondwana transcontinental rift system in western India: Implications in hydrocarbon exploration in Kutch and Saurashtra offshore: A GIS-based approach

    Science.gov (United States)

    Mazumder, S.; Tep, Blecy; Pangtey, K. K. S.; Das, K. K.; Mitra, D. S.

    2017-08-01

    The Gondwanaland assembly rifted dominantly during Late Carboniferous-Early Permian forming several intracratonic rift basins. These rifts were subsequently filled with a thick sequence of continental clastic sediments with minor marine intercalations in early phase. In western part of India, these sediments are recorded in enclaves of Bikaner-Nagaur and Jaisalmer basins in Rajasthan. Facies correlatives of these sediments are observed in a number of basins that were earlier thought to be associated with the western part of India. The present work is a GIS based approach to reconnect those basins to their position during rifting and reconstruct the tectono-sedimentary environment at that time range. The study indicates a rift system spanning from Arabian plate in the north and extending to southern part of Africa that passes through Indus basin, western part of India and Madagascar, and existed from Late Carboniferous to Early Jurassic. Extensions related to the opening of Neo-Tethys led to the formation of a number of cross trends in the rift systems that acted as barriers to marine transgressions from the north as well as disrupted the earlier continuous longitudinal drainage systems. The axis of this rift system is envisaged to pass through present day offshore Kutch and Saurashtra and implies a thick deposit of Late Carboniferous to Early Jurassic sediments in these areas. Based on analogy with other basins associated with this rift system, these sediments may be targeted for hydrocarbon exploration.

  1. Irreversibility and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. The retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information', since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)
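
    The parallel between conditional probabilities and conditional entropies can be made explicit with the standard textbook definitions (generic notation, not Stuart's own formalism):

    ```latex
    % Standard textbook definitions (generic notation, not Stuart's formalism):
    % the conditional entropy averages the joint distribution against the log of
    % the conditional probability, mirroring how conditioning works for probabilities.
    \[
      P(Y = y \mid X = x) = \frac{P(X = x,\, Y = y)}{P(X = x)}, \qquad
      H(Y \mid X) = -\sum_{x,y} P(X = x,\, Y = y)\, \log P(Y = y \mid X = x).
    \]
    ```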

  2. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the al­ most non-stop polling which seems to mold public policy more and more­ these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more re­ moved level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embrac­ ing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  3. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  4. Joint probability distributions and fluctuation theorems

    International Nuclear Information System (INIS)

    García-García, Reinaldo; Kolton, Alejandro B; Domínguez, Daniel; Lecomte, Vivien

    2012-01-01

    We derive various exact results for Markovian systems that spontaneously relax to a non-equilibrium steady state by using joint probability distribution symmetries of different entropy production decompositions. The analytical approach is applied to diverse problems such as the description of the fluctuations induced by experimental errors, for unveiling symmetries of correlation functions appearing in fluctuation–dissipation relations recently generalized to non-equilibrium steady states, and also for mapping averages between different trajectory-based dynamical ensembles. Many known fluctuation theorems arise as special instances of our approach for particular twofold decompositions of the total entropy production. As a complement, we also briefly review and synthesize the variety of fluctuation theorems applying to stochastic dynamics of both continuous systems described by a Langevin dynamics and discrete systems obeying a Markov dynamics, emphasizing how these results emerge from distinct symmetries of the dynamical entropy of the trajectory followed by the system. For Langevin dynamics, we embed the 'dual dynamics' with a physical meaning, and for Markov systems we show how the fluctuation theorems translate into symmetries of modified evolution operators
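
    For orientation, a generic detailed fluctuation theorem for the total entropy production has the schematic form below (entropy in units of k_B); the paper's results concern decompositions and symmetries that go beyond this textbook statement.

    ```latex
    % Schematic detailed fluctuation theorem for the total entropy production
    % \Delta S_tot over a trajectory (entropy in units of k_B). Generic textbook
    % statement, not the particular decompositions derived in the paper.
    \[
      \frac{P(\Delta S_{\mathrm{tot}} = A)}{P(\Delta S_{\mathrm{tot}} = -A)} = e^{A},
      \qquad \text{and hence} \qquad
      \left\langle e^{-\Delta S_{\mathrm{tot}}} \right\rangle = 1 .
    \]
    ```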

  5. Probability concepts in quality risk management.

    Science.gov (United States)

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk is generally a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management tools are relatively silent on the meaning and uses of "probability." The probability concept is typically applied by risk managers as a combination of frequency-based calculation and a "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples from the most general, scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management. Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although "risk" generally describes a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management methodologies and respective tools focus on managing severity but are relatively silent on the in-depth meaning and uses of "probability." Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, marketing to product discontinuation. A probability concept is typically applied by risk managers as a combination of data-based measures of probability and a subjective "degree of belief" meaning of probability. Probability as

  6. Probability: A Matter of Life and Death

    Science.gov (United States)

    Hassani, Mehdi; Kippen, Rebecca; Mills, Terence

    2016-01-01

    Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…
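
    A toy example of the computation a life table encodes is sketched below, with invented death probabilities q_x; real life tables extend over the full age range and follow standard actuarial conventions.

    ```python
    # Toy life-table sketch: from hypothetical death probabilities q_x, build the
    # survivorship column l_x and a (curtate, truncated) life expectancy. The q_x
    # values are made up for illustration and are not from any real population.
    qx = {0: 0.005, 1: 0.0005, 2: 0.0004, 3: 0.0004, 4: 0.0003}   # P(die between x and x+1)

    l = 100000.0            # radix: initial cohort size
    lx = {}
    for age, q in sorted(qx.items()):
        lx[age] = l
        l *= (1.0 - q)

    # Curtate life expectancy at age 0, limited to the ages in the toy table.
    e0 = sum(lx[a] for a in lx if a > 0) / lx[0]
    print({a: round(v) for a, v in lx.items()}, f"e0 (partial) = {e0:.2f}")
    ```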

  7. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability provided the same data used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...

  8. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models.
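
    A concrete special case is the multinomial logit, for which the generating function and its gradient take the familiar log-sum-exp/softmax form (generic notation, not necessarily the paper's):

    ```latex
    % Concrete special case (multinomial logit), illustrating the claim that choice
    % probabilities are obtained from the gradient of the generating function.
    % Notation is generic and not necessarily that of the paper.
    \[
      G(v_1, \dots, v_J) = \log \sum_{j=1}^{J} e^{v_j}, \qquad
      P(i \mid v) = \frac{\partial G}{\partial v_i} = \frac{e^{v_i}}{\sum_{j=1}^{J} e^{v_j}}.
    \]
    ```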

  9. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Contents: Basic Notions; Sample Space and Events; Probabilities; Counting Techniques; Independence and Conditional Probability; Independence; Conditioning; The Borel-Cantelli Theorem; Discrete Random Variables; Random Variables and Vectors; Expected Value; Variance and Other Moments. Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables. The Law of Large Numbers; Conditional Expectation; Generating Functions. Branching Processes. Random Walk Revisited; Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk; Markov Chains; Definitions and Examples. Probability Distributions of Markov Chains; The First Step Analysis. Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity; Continuous Random Variables; Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case. Simulation; Distribution F...

  10. New diagnostic and therapeutic approach to thyroid-associated orbitopathy based on applied kinesiology and homeopathic therapy.

    Science.gov (United States)

    Moncayo, Roy; Moncayo, Helga; Ulmer, Hanno; Kainz, Hartmann

    2004-08-01

    To investigate pathogenetic mechanisms related to the lacrimal and lymphatic glands in patients with thyroid-associated orbitopathy (TAO), and the potential of applied kinesiology diagnosis and homeopathic therapeutic measures. Prospective. Thyroid outpatient unit and a specialized center for complementary medicine (WOMED, Innsbruck; R.M. and H.M.). Thirty-two (32) patients with TAO, 23 with a long-standing disease, and 9 showing discrete initial changes. All patients were euthyroid at the time of the investigation. Clinical investigation was done, using applied kinesiology methods. Departing from normal reacting muscles, both target organs as well as therapeutic measures were tested. Affected organs will produce a therapy localization (TL) that turns a normal muscle tone weak. Using the same approach, specific counteracting therapies (i.e., tonsillitis nosode and lymph mobilizing agents) were tested. Change of lid swelling, of ocular movement discomfort, ocular lock, tonsil reactivity and Traditional Chinese Medicine criteria including tenderness of San Yin Jiao (SP6) and tongue diagnosis were recorded in a graded fashion. Positive TL reactions were found in the submandibular tonsillar structures, the pharyngeal tonsils, the San Yin Jiao point, the lacrimal gland, and with the functional ocular lock test. Both Lymphdiaral (Pascoe, Giessen, Germany) and the homeopathic preparation chronic tonsillitis nosode at a C3 potency (Spagyra, Grödig, Austria) counteracted these changes. Both agents were used therapeutically over 3-6 months, after which all relevant parameters showed improvement. Our study demonstrates the involvement of lymphatic structures and flow in the pathogenesis of TAO. The tenderness of the San Yin Jiao point correlates to the above mentioned changes and should be included in the clinical evaluation of these patients.

  11. Whole brain approaches for identification of microstructural abnormalities in individual patients: comparison of techniques applied to mild traumatic brain injury.

    Directory of Open Access Journals (Sweden)

    Namhee Kim

    Full Text Available Group-wise analyses of DTI in mTBI have demonstrated evidence of traumatic axonal injury (TAI), associated with adverse clinical outcomes. Although mTBI is likely to have a unique spatial pattern in each patient, group analyses implicitly assume that location of injury will be the same across patients. The purpose of this study was to optimize and validate a procedure for analysis of DTI images acquired in individual patients, which could detect inter-individual differences and be applied in the clinical setting, where patients must be assessed as individuals. After informed consent and in compliance with HIPAA, 34 mTBI patients and 42 normal subjects underwent 3.0 Tesla DTI. Four voxelwise assessment methods (standard Z-score, "one vs. many" t-test, Family-Wise Error Rate control using pseudo t-distribution, EZ-MAP) for use in individual patients were applied to each patient's fractional anisotropy (FA) maps and tested for their ability to discriminate patients from controls. Receiver Operating Characteristic (ROC) analyses were used to define optimal thresholds (voxel-level significance and spatial extent) for reliable and robust detection of mTBI pathology. ROC analyses showed that EZ-MAP (specificity 71%, sensitivity 71%), the "one vs. many" t-test and the standard Z-score (sensitivity 65%, specificity 76% for both methods) resulted in a significant area under the curve (AUC) score for discriminating mTBI patients from controls in terms of the total number of abnormal white matter voxels detected, while the FWER test was not significant. EZ-MAP is demonstrated to be robust to assumptions of Gaussian behavior and may serve as an alternative to methods that require strict Gaussian assumptions. EZ-MAP provides a robust approach for delineation of regional abnormal anisotropy in individual mTBI patients.

  12. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  13. Applying an Archetype-Based Approach to Electroencephalography/Event-Related Potential Experiments in the EEGBase Resource.

    Science.gov (United States)

    Papež, Václav; Mouček, Roman

    2017-01-01

    The purpose of this study is to investigate the feasibility of applying openEHR (an archetype-based approach for electronic health records representation) to modeling data stored in EEGBase, a portal for experimental electroencephalography/event-related potential (EEG/ERP) data management. The study evaluates re-usage of existing openEHR archetypes and proposes a set of new archetypes together with the openEHR templates covering the domain. The main goals of the study are to (i) link existing EEGBase data/metadata and openEHR archetype structures and (ii) propose a new openEHR archetype set describing the EEG/ERP domain since this set of archetypes currently does not exist in public repositories. The main methodology is based on the determination of the concepts obtained from EEGBase experimental data and metadata that are expressible structurally by the openEHR reference model and semantically by openEHR archetypes. In addition, templates as the third openEHR resource allow us to define constraints over archetypes. Clinical Knowledge Manager (CKM), a public openEHR archetype repository, was searched for the archetypes matching the determined concepts. According to the search results, the archetypes already existing in CKM were applied and the archetypes not existing in the CKM were newly developed. openEHR archetypes support linkage to external terminologies. To increase semantic interoperability of the new archetypes, binding with the existing odML electrophysiological terminology was assured. Further, to increase structural interoperability, also other current solutions besides EEGBase were considered during the development phase. Finally, a set of templates using the selected archetypes was created to meet EEGBase requirements. A set of eleven archetypes that encompassed the domain of experimental EEG/ERP measurements were identified. Of these, six were reused without changes, one was extended, and four were newly created. All archetypes were arranged in the

  14. Quantifying Vulnerability to Extreme Heat in Time Series Analyses: A Novel Approach Applied to Neighborhood Social Disparities under Climate Change.

    Science.gov (United States)

    Benmarhnia, Tarik; Grenier, Patrick; Brand, Allan; Fournier, Michel; Deguen, Séverine; Smargiassi, Audrey

    2015-09-22

    We propose a novel approach to examine vulnerability in the relationship between heat and years of life lost and apply to neighborhood social disparities in Montreal and Paris. We used historical data from the summers of 1990 through 2007 for Montreal and from 2004 through 2009 for Paris to estimate daily years of life lost social disparities (DYLLD), summarizing social inequalities across groups. We used Generalized Linear Models to separately estimate relative risks (RR) for DYLLD in association with daily mean temperatures in both cities. We used 30 climate scenarios of daily mean temperature to estimate future temperature distributions (2021-2050). We performed random effect meta-analyses to assess the impact of climate change by climate scenario for each city and compared the impact of climate change for the two cities using a meta-regression analysis. We show that an increase in ambient temperature leads to an increase in social disparities in daily years of life lost. The impact of climate change on DYLLD attributable to temperature was of 2.06 (95% CI: 1.90, 2.25) in Montreal and 1.77 (95% CI: 1.61, 1.94) in Paris. The city explained a difference of 0.31 (95% CI: 0.14, 0.49) on the impact of climate change. We propose a new analytical approach for estimating vulnerability in the relationship between heat and health. Our results suggest that in Paris and Montreal, health disparities related to heat impacts exist today and will increase in the future.

  15. Quantitative assessment of key parameters in qualitative vulnerability methods applied in karst systems based on an integrated numerical modelling approach

    Science.gov (United States)

    Doummar, Joanna; Kassem, Assaad

    2017-04-01

    In the framework of a three-year PEER (USAID/NSF) funded project, flow in a Karst system in Lebanon (Assal) dominated by snow and semi arid conditions was simulated and successfully calibrated using an integrated numerical model (MIKE-She 2016) based on high resolution input data and detailed catchment characterization. Point source infiltration and fast flow pathways were simulated by a bypass function and a high conductive lens respectively. The approach consisted of identifying all the factors used in qualitative vulnerability methods (COP, EPIK, PI, DRASTIC, GOD) applied in karst systems and to assess their influence on recharge signals in the different hydrological karst compartments (Atmosphere, Unsaturated zone and Saturated zone) based on the integrated numerical model. These parameters are usually attributed different weights according to their estimated impact on Groundwater vulnerability. The aim of this work is to quantify the importance of each of these parameters and outline parameters that are not accounted for in standard methods, but that might play a role in the vulnerability of a system. The spatial distribution of the detailed evapotranspiration, infiltration, and recharge signals from atmosphere to unsaturated zone to saturated zone was compared and contrasted among different surface settings and under varying flow conditions (e.g., in varying slopes, land cover, precipitation intensity, and soil properties as well point source infiltration). Furthermore a sensitivity analysis of individual or coupled major parameters allows quantifying their impact on recharge and indirectly on vulnerability. The preliminary analysis yields a new methodology that accounts for most of the factors influencing vulnerability while refining the weights attributed to each one of them, based on a quantitative approach.

  16. Large deviations and idempotent probability

    CERN Document Server

    Puhalskii, Anatolii

    2001-01-01

    In the view of many probabilists, author Anatolii Puhalskii''s research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results.Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence.The approach to establishing large deviation convergence uses novel com...

  17. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  18. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrödinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships, and of quantum theory, for our understanding of the nature of thinking and knowledge in general.

  19. Transition probabilities for atoms

    International Nuclear Information System (INIS)

    Kim, Y.K.

    1980-01-01

    Current status of advanced theoretical methods for transition probabilities for atoms and ions is discussed. An experiment on the f values of the resonance transitions of the Kr and Xe isoelectronic sequences is suggested as a test for the theoretical methods

  20. Applying a Markov approach as a Lean Thinking analysis of waste elimination in a Rice Production Process

    Directory of Open Access Journals (Sweden)

    Eldon Glen Caldwell Marin

    2015-01-01

    The Markov Chains Model was proposed to analyze stochastic events when recursive cycles occur; for example, when rework in a continuous-flow production line affects the overall performance. Typically, the analysis of rework and scrap is done from a wasted-material cost perspective and not from the perspective of wasted capacity that reduces throughput and economic value added (EVA). Also, few cases of this application can be found in agro-industrial production in Latin America, given the complexity of the calculations and the need for robust applications. This scientific work presents the results of a quasi-experimental research approach that explains how to apply DOE methods and Markov analysis in a rice production process located in Central America, evaluating the global effects of a single reduction in rework and scrap in one part of the whole line. The results show that in this case it is possible to evaluate benefits from a global throughput and EVA perspective and not only from a cost-savings perspective, finding a relationship between operational indicators and corporate performance. However, it was found that it is necessary to analyze the Markov chain configuration when there are many rework points, and it is also relevant to take into account the effects on takt time and not only scrap costs.
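
    Although the abstract gives no code, the rework structure it describes maps naturally onto an absorbing Markov chain. The sketch below uses entirely hypothetical transition probabilities (process steps as transient states, "finished" and "scrap" as absorbing states); the fundamental matrix gives the expected visits per step, i.e. the capacity consumed by rework, and the overall yield.

        # A minimal sketch (hypothetical transition data) of an absorbing-chain rework analysis.
        import numpy as np

        # Transient-to-transient matrix Q (rows/columns: step1, step2)
        # step1 -> step2 with prob 0.90, reworked at step1 with prob 0.05 (0.05 scrapped)
        # step2 -> finished with prob 0.92, reworked back to step1 with prob 0.06 (0.02 scrapped)
        Q = np.array([[0.05, 0.90],
                      [0.06, 0.00]])
        # Transient-to-absorbing matrix R (columns: finished, scrap)
        R = np.array([[0.00, 0.05],
                      [0.92, 0.02]])

        N = np.linalg.inv(np.eye(2) - Q)   # fundamental matrix: expected visits per step
        B = N @ R                          # absorption probabilities from each starting step

        print("Expected visits per unit started at step 1:", np.round(N[0], 3))
        print("Probability a started unit is finished (yield):", round(B[0, 0], 3))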

  1. The effects of music therapy incorporated with applied behavior analysis verbal behavior approach for children with autism spectrum disorders.

    Science.gov (United States)

    Lim, Hayoung A; Draper, Ellary

    2011-01-01

    This study compared a common form of the Applied Behavior Analysis Verbal Behavior (ABA VB) approach and music incorporated with the ABA VB method as part of developmental speech-language training in the speech production of children with Autism Spectrum Disorders (ASD). The study explored how the perception of musical patterns incorporated in ABA VB operants impacted the production of speech in children with ASD. Participants were 22 children with ASD, age range 3 to 5 years, who were verbal or pre-verbal with presence of immediate echolalia. They were randomly assigned a set of target words for each of the 3 training conditions: (a) music-incorporated ABA VB, (b) speech (ABA VB), and (c) no training. Results showed both music and speech trainings were effective for production of the four ABA verbal operants; however, the difference between music and speech training was not statistically significant. Results also indicated that music-incorporated ABA VB training was most effective in echoic production, and speech training was most effective in tact production. Music can be incorporated into the ABA VB training method, and musical stimuli can be used as successfully as ABA VB speech training to enhance functional verbal production in children with ASD.

  2. Applying the methodology of Design of Experiments to stability studies: a Partial Least Squares approach for evaluation of drug stability.

    Science.gov (United States)

    Jordan, Nika; Zakrajšek, Jure; Bohanec, Simona; Roškar, Robert; Grabnar, Iztok

    2018-05-01

    The aim of the present research is to show that the methodology of Design of Experiments can be applied to stability data evaluation, since stability studies can be seen as multi-factor and multi-level experimental designs. Linear regression analysis is the usual approach for analyzing stability data, but multivariate statistical methods could also be used to assess drug stability during the development phase. Data from a stability study for a pharmaceutical product with hydrochlorothiazide (HCTZ) as an unstable drug substance were used as a case example in this paper. The design space of the stability study was modeled using Umetrics MODDE 10.1 software. We showed that a Partial Least Squares model can be used for a multi-dimensional presentation of all data generated in a stability study and for determination of the relationships among factors that influence drug stability. It might also be used for stability predictions and potentially for optimization of the extent of stability testing needed to determine shelf life and storage conditions, which would be time- and cost-effective for the pharmaceutical industry.
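
    As a rough illustration of the idea (the study itself used Umetrics MODDE 10.1, not the code below), a Partial Least Squares model can be fitted to stability-study factors and responses with scikit-learn; all numbers here are hypothetical.

        # A minimal PLS sketch with entirely hypothetical stability data.
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        # Factors: time (months), temperature (deg C), relative humidity (%)
        X = np.array([[0, 25, 60], [3, 25, 60], [6, 25, 60],
                      [0, 40, 75], [3, 40, 75], [6, 40, 75]])
        # Response: assay of the drug substance (% of label claim), hypothetical values
        y = np.array([100.0, 99.1, 98.3, 100.0, 97.2, 94.5])

        pls = PLSRegression(n_components=2)
        pls.fit(X, y)

        # Predict the assay after 12 months at 30 deg C / 65 % RH
        print(pls.predict(np.array([[12, 30, 65]])))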

  3. Optical technologies applied alongside on-site and remote approaches for climate gas emission quantification at a wastewater treatment plant

    DEFF Research Database (Denmark)

    Samuelsson, Jerker; Delre, Antonio; Tumlin, Susanne

    2018-01-01

    Plant-integrated and on-site gas emissions were quantified from a Swedish wastewater treatment plant by applying several optical analytical techniques and measurement methods. Plant-integrated CH4 emission rates, measured using mobile ground-based remote sensing methods, varied between 28.5 and 33.5 kg CH4 h−1, corresponding to an average emission factor of 5.9% as kg CH4 (kg CH4 production)−1, whereas N2O emissions varied between 4.0 and 6.4 kg h−1, corresponding to an average emission factor of 1.5% as kg N2O-N (kg TN influent)−1. Plant-integrated NH3 emissions were around 0.4 kg h−1. … On-site quantifications were approximately two-thirds of the plant-integrated emission quantifications, which may be explained by the different timeframes of the approaches and by the fact that not all emission sources were identified during the on-site investigation. Off-site gas emission quantifications, using ground-based remote…

  4. On the Possibility of Assigning Probabilities to Singular Cases, or: Probability Is Subjective Too!

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2009-06-01

    Both Ludwig von Mises and Richard von Mises claimed that numerical probability could not be legitimately applied to singular cases. This paper challenges this aspect of the von Mises brothers' theory of probability. It is argued that their denial that numerical probability could be applied to singular cases was based solely upon Richard von Mises' exceptionally restrictive definition of probability. This paper challenges Richard von Mises' definition of probability by arguing that the definition of probability necessarily depends upon whether the world is governed by time-invariant causal laws. It is argued that if the world is governed by time-invariant causal laws, a subjective definition of probability must be adopted. It is further argued that the nature of human action and the relative frequency method for calculating numerical probabilities both presuppose that the world is indeed governed by time-invariant causal laws. It is finally argued that the subjective definition of probability undercuts the von Mises claim that numerical probability cannot legitimately be applied to singular, non-replicable cases.

  5. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  6. Towards a Categorical Account of Conditional Probability

    Directory of Open Access Journals (Sweden)

    Robert Furber

    2015-11-01

    This paper presents a categorical account of conditional probability, covering both the classical and the quantum case. Classical conditional probabilities are expressed as a certain "triangle-fill-in" condition, connecting marginal and joint probabilities, in the Kleisli category of the distribution monad. The conditional probabilities are induced by a map together with a predicate (the condition). The latter is a predicate in the logic of effect modules on this Kleisli category. This same approach can be transferred to the category of C*-algebras (with positive unital maps), whose predicate logic is also expressed in terms of effect modules. Conditional probabilities can again be expressed via a triangle-fill-in property. In the literature, there are several proposals for what quantum conditional probability should be, and there are also extra difficulties not present in the classical case. At this stage, we only describe quantum systems with classical parametrization.

  7. A probability space for quantum models

    Science.gov (United States)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, the choice of constraints, and the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
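
    A minimal sketch of the maximum entropy assignment mentioned above, under assumed constraints: discrete outcomes with a prescribed average lead to a Boltzmann-like distribution p_i proportional to exp(-lam * E_i), with lam solved numerically.

        # Maximum-entropy assignment with a single mean constraint (values assumed for illustration).
        import numpy as np
        from scipy.optimize import brentq

        E = np.array([0.0, 1.0, 2.0, 3.0])   # outcomes (e.g. energy levels), assumed
        E_mean = 1.2                          # constraint: required average, assumed

        def mean_at(lam):
            w = np.exp(-lam * E)
            return (E * w).sum() / w.sum()

        lam = brentq(lambda l: mean_at(l) - E_mean, -50, 50)  # solve for the multiplier
        p = np.exp(-lam * E)
        p /= p.sum()
        print("lambda =", round(lam, 4), "probabilities =", np.round(p, 4))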

  8. Applying the Plan-Do-Study-Act (PDSA) approach to a large pragmatic study involving safety net clinics.

    Science.gov (United States)

    Coury, Jennifer; Schneider, Jennifer L; Rivelli, Jennifer S; Petrik, Amanda F; Seibel, Evelyn; D'Agostini, Brieshon; Taplin, Stephen H; Green, Beverly B; Coronado, Gloria D

    2017-06-19

    The Plan-Do-Study-Act (PDSA) cycle is a commonly used improvement process in health care settings, although its documented use in pragmatic clinical research is rare. A recent pragmatic clinical research study, called the Strategies and Opportunities to STOP Colon Cancer in Priority Populations (STOP CRC), used this process to optimize the research implementation of an automated colon cancer screening outreach program in intervention clinics. We describe the process of using this PDSA approach, the selection of PDSA topics by clinic leaders, and project leaders' reactions to using PDSA in pragmatic research. STOP CRC is a cluster-randomized pragmatic study that aims to test the effectiveness of a direct-mail fecal immunochemical testing (FIT) program involving eight Federally Qualified Health Centers in Oregon and California. We and a practice improvement specialist trained in the PDSA process delivered structured presentations to leaders of these centers; the presentations addressed how to apply the PDSA process to improve implementation of a mailed outreach program offering colorectal cancer screening through FIT tests. Center leaders submitted PDSA plans and delivered reports via webinar at quarterly meetings of the project's advisory board. Project staff conducted one-on-one, 45-min interviews with project leads from each health center to assess the reaction to and value of the PDSA process in supporting the implementation of STOP CRC. Clinic-selected PDSA activities included refining the intervention staffing model, improving outreach materials, and changing workflow steps. Common benefits of using PDSA cycles in pragmatic research were that it provided a structure for staff to focus on improving the program and it allowed staff to test the change they wanted to see. A commonly reported challenge was measuring the success of the PDSA process with the available electronic medical record tools. Understanding how the PDSA process can be applied to pragmatic

  9. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2012-01-01

    This book provides a unique and balanced approach to probability, statistics, and stochastic processes.   Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area.  The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and

  10. Generic Degraded Configuration Probability Analysis for the Codisposal Waste Package

    International Nuclear Information System (INIS)

    S.F.A. Deng; M. Saglam; L.J. Gratton

    2001-01-01

    In accordance with the technical work plan, ''Technical Work Plan For: Department of Energy Spent Nuclear Fuel Work Packages'' (CRWMS M and O 2000c), this Analysis/Model Report (AMR) is developed for the purpose of screening out degraded configurations for U.S. Department of Energy (DOE) spent nuclear fuel (SNF) types. It performs the degraded configuration parameter and probability evaluations of the overall methodology specified in the ''Disposal Criticality Analysis Methodology Topical Report'' (YMP 2000, Section 3) for qualifying configurations. Degradation analyses are performed to assess realizable parameter ranges and physical regimes for configurations. Probability calculations are then performed for configurations characterized by keff in excess of the Critical Limit (CL). The scope of this document is to develop a generic set of screening criteria or models to screen out degraded configurations having potential for exceeding a criticality limit. The developed screening criteria include arguments based on physical/chemical processes and probability calculations, and apply to DOE SNF types when codisposed with the high-level waste (HLW) glass inside a waste package. The degradation takes place inside the waste package and occurs long after repository licensing has expired. The emphasis of this AMR is on degraded configuration screening, and the probability analysis is one of the approaches used for screening. The intended use of the model is to apply the developed screening criteria to each DOE SNF type following completion of the degraded-mode criticality analysis internal to the waste package

  11. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  12. Waste Package Misload Probability

    International Nuclear Information System (INIS)

    Knudsen, J.K.

    2001-01-01

    The objective of this calculation is to determine the probability of occurrence for fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in each event. Using this information, a probability of occurrence is calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a

  13. Probability theory and applications

    CERN Document Server

    Hsu, Elton P

    1999-01-01

    This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.

  14. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory. Some remain the focus of controversy; others have allegedly been solved, but the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies. Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  15. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  16. Haavelmo's Probability Approach and the Cointegrated VAR

    DEFF Research Database (Denmark)

    Juselius, Katarina

    Some key econometric concepts and problems addressed by Trygve Haavelmo and Ragnar Frisch are discussed within the general framework of a cointegrated VAR. The focus is on problems typical of time-series data such as multicollinearity, spurious correlation and regression results, time… (2) the plausibility of the multivariate normality assumption underlying the VAR, (3) cointegration as a solution to the problem of spurious correlation and multicollinearity when data contain deterministic and stochastic trends, (4) the existence of a universe, (5) the association between Frisch's con…

  17. Model uncertainty and probability

    International Nuclear Information System (INIS)

    Parry, G.W.

    1994-01-01

    This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty, as well as a means of describing random processes, has caused some confusion, even though the two uses represent different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example.

  18. Retrocausality and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    Costa de Beauregard has proposed that physical causality be identified with conditional probability. The proposal is shown to be vulnerable on two accounts. The first, though mathematically trivial, seems to be decisive so far as the current formulation of the proposal is concerned. The second lies in a physical inconsistency which seems to have its source in a Copenhagen-like disavowal of realism in quantum mechanics. 6 refs. (Author)

  19. Investigation of natural convection in Miniature Neutron Source Reactor of Isfahan by applying the porous media approach

    Energy Technology Data Exchange (ETDEWEB)

    Abbassi, Yasser, E-mail: y.abbassi@mihanmail.ir [Department of Engineering, University of Shahid Beheshti, Tehran (Iran, Islamic Republic of); Asgarian, Shahla [Department of Chemical Engineering, Isfahan University, Tehran (Iran, Islamic Republic of); Ghahremani, Esmaeel; Abbasi, Mohammad [Department of Engineering, University of Shahid Beheshti, Tehran (Iran, Islamic Republic of)

    2016-12-01

    Highlights: • We carried out a CFD study to investigate transient natural convection in MNSR. • We applied the porous media approach to simplify the complex core of MNSR. • The method has been verified with experimental data. • The temperature difference between the core inlet and outlet has been obtained. • Flow pattern and temperature distribution have been presented. - Abstract: The small and complex core of the Isfahan Miniature Neutron Source Reactor (MNSR), in addition to its large tank, makes a parametric study of natural convection difficult to perform in terms of time and computational resources. In this study, in order to overcome this obstacle, the porous media approximation has been used. This numerical technique includes two steps: (a) calculation of porous media variables such as porosity and pressure drops in the core region, and (b) simulation of natural convection in the reactor tank by treating the core region as a porous medium. The simulation has been carried out with ANSYS FLUENT® Academic Research, Release 16.2. The core porous medium resistance factors have been estimated to be D_ij = 1850 [1/m] and C_ij = 415 [1/m²]. Natural convection simulation with the Boussinesq approximation and with the variable-property assumption has been performed. Experimental data and nuclear codes available in the literature have verified the method. The average temperature difference between the experimental data and the results of this study was less than 0.5 °C and 2.0 °C for the variable-property technique and the Boussinesq approximation, respectively. The temperature distribution and flow pattern in the entire reactor have been obtained. Results show that the temperature difference between core outlet and inlet is about 18 °C and that, in this situation, the flow rate is about 0.004 kg/s. A full parametric study could be the topic of future investigations.

  20. Sensitivity and uncertainty analyses applied to one-dimensional radionuclide transport in a layered fractured rock: Evaluation of the Limit State approach, Iterative Performance Assessment, Phase 2

    International Nuclear Information System (INIS)

    Wu, Y.T.; Gureghian, A.B.; Sagar, B.; Codell, R.B.

    1992-12-01

    The Limit State approach is based on partitioning the parameter space into two parts: one in which the performance measure is smaller than a chosen value (called the limit state), and the other in which it is larger. Through a Taylor expansion at a suitable point, the partitioning surface (called the limit state surface) is approximated as either a linear or quadratic function. The success and efficiency of the limit state method depend upon choosing an optimum point for the Taylor expansion. The point in the parameter space that has the highest probability of producing the value chosen as the limit state is optimal for expansion. When the parameter space is transformed into a standard Gaussian space, the optimal expansion point, known as the Most Probable Point (MPP), has the property that its location on the Limit State surface is closest to the origin. Additionally, the projections onto the parameter axes of the vector from the origin to the MPP are the sensitivity coefficients. Once the MPP is determined and the Limit State surface approximated, formulas (see Equations 4-7 and 4-8) are available for determining the probability of the performance measure being less than the limit state. By choosing a succession of limit states, the entire cumulative distribution of the performance measure can be determined. Methods for determining the MPP and also for improving the estimate of the probability are discussed in this report.
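
    A minimal numerical sketch of the Most Probable Point idea, using a hypothetical linear performance function of two standard-normal variables: the MPP is the point on g(u) = 0 closest to the origin, its distance beta gives P_f approximately Phi(-beta), and its components play the role of the sensitivity coefficients described above.

        # Illustrative MPP search in standard Gaussian space (limit state chosen arbitrarily).
        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import norm

        def g(u):
            # hypothetical performance function; g <= 0 defines the "failure" region
            return 5.0 - (1.5 * u[0] + 1.0 * u[1])

        res = minimize(lambda u: np.dot(u, u),          # squared distance to the origin
                       x0=np.array([1.0, 1.0]),
                       constraints={"type": "eq", "fun": g})

        mpp = res.x
        beta = np.linalg.norm(mpp)                      # reliability index
        alphas = mpp / beta                             # direction cosines (sensitivity measures)
        print("MPP:", np.round(mpp, 3), "beta:", round(beta, 3), "P_f ~", norm.cdf(-beta))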

  1. The Rock Engineering System (RES) applied to landslide susceptibility zonation of the northeastern flank of Etna: methodological approach and results

    Science.gov (United States)

    Apuani, Tiziana; Corazzato, Claudia

    2015-04-01

    instability-related numerical ratings are assigned to classes. An instability index map is then produced by assigning, to each areal elementary cell (in our case a 10 m pixel), the sum of the products of each weighting factor and the normalized parameter rating coming from each input zonation map. This map is then suitably classified into landslide susceptibility classes (expressed as a percentage), enabling discrimination of areas prone to instability. Overall, the study area is characterized by a low propensity to slope instability. Few areas have an instability index of more than 45% of the theoretical maximum imposed by the matrix. These are located in the few steep slopes associated with active faults, and are strongly dependent on the seismic activity. Some other areas correspond to limited outcrops characterized by significantly reduced lithotechnical properties (low shear strength). The produced susceptibility map combines the application of the RES with the parameter zonation, following a methodology that had not previously been applied in active volcanic environments. The comparison of the results with the ground deformation evidence coming from monitoring networks suggests the validity of the approach.

  2. Alternative probability theories for cognitive psychology.

    Science.gov (United States)

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily Boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling. Copyright © 2013 Cognitive Science Society, Inc.

  3. Generalized Probability Functions

    Directory of Open Access Journals (Sweden)

    Alexandre Souto Martinez

    2009-01-01

    From the integration of nonsymmetrical hyperbolas, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by mathematical curiosity, we show that these generalized functions are suitable to generalize some probability density functions (pdfs). A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn the attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, and the generalized Gaussian and Laplace pdfs. Their cumulative functions and moments were also obtained analytically.

  4. Probability in High Dimension

    Science.gov (United States)

    2014-06-30

  5. Estimating the Probability of Traditional Copying, Conditional on Answer-Copying Statistics.

    Science.gov (United States)

    Allen, Jeff; Ghattas, Andrew

    2016-06-01

    Statistics for detecting copying on multiple-choice tests produce p values measuring the probability of a value at least as large as that observed, under the null hypothesis of no copying. The posterior probability of copying is arguably more relevant than the p value, but cannot be derived from Bayes' theorem unless the population probability of copying and probability distribution of the answer-copying statistic under copying are known. In this article, the authors develop an estimator for the posterior probability of copying that is based on estimable quantities and can be used with any answer-copying statistic. The performance of the estimator is evaluated via simulation, and the authors demonstrate how to apply the formula using actual data. Potential uses, generalizability to other types of cheating, and limitations of the approach are discussed.
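
    The Bayes'-theorem relationship the authors start from can be written in a few lines; the numbers below are purely illustrative stand-ins for the quantities their estimator approximates (the population copying rate and the detection probability under copying).

        # Posterior probability of copying from a p value, under assumed inputs.
        prior_copy = 0.02    # assumed population probability of copying
        p_value = 0.001      # P(statistic at least this extreme | no copying)
        power = 0.60         # assumed P(statistic at least this extreme | copying)

        posterior = (prior_copy * power) / (prior_copy * power + (1 - prior_copy) * p_value)
        print(f"Posterior probability of copying: {posterior:.3f}")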

  6. An Overview of Modeling Approaches Applied to Aggregation-Based Fleet Management and Integration of Plug-in Electric Vehicles †

    DEFF Research Database (Denmark)

    You, Shi; Hu, Junjie; Ziras, Charalampos

    2016-01-01

    The design and implementation of management policies for plug-in electric vehicles (PEVs) need to be supported by a holistic understanding of the functional processes, their complex interactions, and their response to various changes. Models developed to represent different functional processes and systems are seen as useful tools to support the related studies for different stakeholders in a tangible way. This paper presents an overview of modeling approaches applied to support aggregation-based management and integration of PEVs from the perspective of fleet operators and grid operators, respectively. We start by explaining a structured modeling approach, i.e., a flexible combination of process models and system models, applied to different management and integration studies. A state-of-the-art overview of modeling approaches applied to represent several key processes, such as charging…

  7. A steady state thermal duct model derived by fin-theory approach and applied on an unglazed solar collector

    Energy Technology Data Exchange (ETDEWEB)

    Stojanovic, B.; Hallberg, D.; Akander, J. [Building Materials Technology, KTH Research School, Centre for Built Environment, University of Gaevle, SE-801 76 Gaevle (Sweden)

    2010-10-15

    This paper presents the thermal modelling of an unglazed solar collector (USC) flat panel, with the aim of producing a detailed yet swift thermal steady-state model. The model is analytical, one-dimensional (1D) and derived by a fin-theory approach. It represents the thermal performance of an arbitrary duct with applied boundary conditions equal to those of a flat panel collector. The derived model is meant to be used for efficient optimisation and design of USC flat panels (or similar applications), as well as for detailed thermal analysis of temperature fields and heat transfer distributions/variations at steady-state conditions, without requiring a large amount of computational power and time. Detailed surface temperatures are necessary for durability studies of the surface coating, and hence for assessing the effect of coating degradation on USC and system performance. The model accuracy and proficiency have been benchmarked against a detailed three-dimensional Finite Difference Model (3D FDM) and two simpler 1D analytical models. Results from the benchmarking test show that the fin-theory model has excellent capabilities for calculating energy performance and fluid temperature profiles, as well as detailed material temperature fields and heat transfer distributions/variations (at steady-state conditions), while still being suitable for component analysis in conjunction with system simulations, as the model is analytical. The accuracy of the model is high in comparison to the 3D FDM (the prime benchmark), as long as the fin-theory assumption prevails (no, or negligible, temperature gradient in the fin perpendicular to the fin length). Comparison with the other models also shows that when the USC duct material has a high thermal conductivity, the cross-sectional material temperature adopts an isothermal state (for the assessed USC duct geometry), which makes the 1D isothermal model valid. When the USC duct material has a low thermal conductivity, the heat transfer
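
    For readers unfamiliar with the fin-theory ingredient, the sketch below evaluates the classic 1D fin temperature profile with an adiabatic tip (not the paper's full duct model); all material and geometric values are assumed.

        # Classic thin rectangular fin with adiabatic tip: theta(x)/theta_b = cosh(m(L-x))/cosh(mL).
        import numpy as np

        h = 15.0    # convective heat transfer coefficient (W/m^2 K), assumed
        k = 200.0   # fin (absorber) thermal conductivity (W/m K), assumed
        t = 0.002   # fin thickness (m), assumed
        L = 0.05    # fin length from duct wall to mid-span (m), assumed

        m = np.sqrt(2.0 * h / (k * t))                        # fin parameter for a thin fin
        x = np.linspace(0.0, L, 6)
        theta_ratio = np.cosh(m * (L - x)) / np.cosh(m * L)   # (T(x)-T_inf)/(T_base-T_inf)
        print(np.round(theta_ratio, 3))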

  8. Uncertainty plus prior equals rational bias: an intuitive Bayesian probability weighting function.

    Science.gov (United States)

    Fennell, John; Baddeley, Roland

    2012-10-01

    Empirical research has shown that when making choices based on probabilistic options, people behave as if they overestimate small probabilities, underestimate large probabilities, and treat positive and negative outcomes differently. These distortions have been modeled using a nonlinear probability weighting function, which is found in several nonexpected utility theories, including rank-dependent models and prospect theory; here, we propose a Bayesian approach to the probability weighting function and, with it, a psychological rationale. In the real world, uncertainty is ubiquitous and, accordingly, the optimal strategy is to combine probability statements with prior information using Bayes' rule. First, we show that any reasonable prior on probabilities leads to 2 of the observed effects; overweighting of low probabilities and underweighting of high probabilities. We then investigate 2 plausible kinds of priors: informative priors based on previous experience and uninformative priors of ignorance. Individually, these priors potentially lead to large problems of bias and inefficiency, respectively; however, when combined using Bayesian model comparison methods, both forms of prior can be applied adaptively, gaining the efficiency of empirical priors and the robustness of ignorance priors. We illustrate this for the simple case of generic good and bad options, using Internet blogs to estimate the relevant priors of inference. Given this combined ignorant/informative prior, the Bayesian probability weighting function is not only robust and efficient but also matches all of the major characteristics of the distortions found in empirical research. PsycINFO Database Record (c) 2012 APA, all rights reserved.
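
    A highly simplified sketch of the Bayesian reading described above: if a stated probability is treated as if it summarized n observations and is combined with a Beta prior, small probabilities are pulled up and large probabilities pulled down (n, a and b below are arbitrary choices, not values from the paper).

        # Posterior-mean "weighting" of a stated probability under a Beta(a, b) prior.
        def weighted(p_stated, n=10, a=1.0, b=1.0):
            """Treat p_stated as the success rate in n hypothetical observations."""
            k = p_stated * n
            return (k + a) / (n + a + b)

        for p in [0.01, 0.1, 0.5, 0.9, 0.99]:
            print(f"stated {p:4.2f} -> effective {weighted(p):.3f}")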

  9. Probable maximum flood control

    International Nuclear Information System (INIS)

    DeGabriele, C.E.; Wu, C.L.

    1991-11-01

    This study proposes preliminary design concepts to protect the waste-handling facilities and all shaft and ramp entries to the underground from the probable maximum flood (PMF) in the current design configuration for the proposed Nevada Nuclear Waste Storage Investigation (NNWSI) repository. Flood protection provisions were furnished by the United States Bureau of Reclamation (USBR) or developed from USBR data. Proposed flood protection provisions include site grading, drainage channels, and diversion dikes. Figures are provided to show these proposed flood protection provisions at each area investigated. These areas are the central surface facilities (including the waste-handling building and waste treatment building), tuff ramp portal, waste ramp portal, men-and-materials shaft, emplacement exhaust shaft, and exploratory shafts facility

  10. A Comprehensive Probability Project for the Upper Division One-Semester Probability Course Using Yahtzee

    Science.gov (United States)

    Wilson, Jason; Lawman, Joshua; Murphy, Rachael; Nelson, Marissa

    2011-01-01

    This article describes a probability project used in an upper division, one-semester probability course with third-semester calculus and linear algebra prerequisites. The student learning outcome focused on developing the skills necessary for approaching project-sized math/stat application problems. These skills include appropriately defining…

  11. Imprecise Probability Methods for Weapons UQ

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-13

    Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.

  12. Comparing linear probability model coefficients across groups

    DEFF Research Database (Denmark)

    Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt

    2015-01-01

    This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more of the following three components: outcome truncation, scale parameters and distributional shape of the predictor variable. These results point to limitations in using linear probability model coefficients for group comparisons. We also provide Monte Carlo simulations and real examples to illustrate these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons.

  13. Dependent Human Error Probability Assessment

    International Nuclear Information System (INIS)

    Simic, Z.; Mikulicic, V.; Vukovic, I.

    2006-01-01

    This paper presents an assessment of the dependence between dynamic operator actions modeled in a Nuclear Power Plant (NPP) PRA and estimates the associated impact on core damage frequency (CDF). This assessment was done to improve the implementation of HEP dependencies within the existing PRA. All of the dynamic operator actions modeled in the NPP PRA are included in this assessment. Determining the level of HEP dependence and the associated influence on CDF are the major steps of this assessment. A decision on how to apply the results, i.e., whether permanent HEP model changes should be made, is based on the resulting relative CDF increase. A CDF increase threshold was selected based on the NPP base CDF value and acceptance guidelines from Regulatory Guide 1.174. HEP dependences resulting in a CDF increase of > 5E-07 would be considered potential candidates for specific incorporation into the baseline model. The approach used to judge the level of dependence between operator actions is based on the dependency level categories and conditional probabilities developed in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications, NUREG/CR-1278. To simplify the process, NUREG/CR-1278 identifies five levels of dependence: ZD (zero dependence), LD (low dependence), MD (moderate dependence), HD (high dependence), and CD (complete dependence). NUREG/CR-1278 also identifies several qualitative factors that could be involved in determining the level of dependence. Based on the NUREG/CR-1278 information, the Time, Function, and Spatial attributes were judged to be the most important considerations when determining the level of dependence between operator actions within an accident sequence. These attributes were used to develop qualitative criteria (rules) that were used to judge the level of dependence (CD, HD, MD, LD, ZD) between the operator actions. After the level of dependence between the various HEPs is judged, quantitative values associated with the
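
    The NUREG/CR-1278 dependence levels come with simple conditional-probability adjustment formulas; the sketch below lists them as commonly cited from the Handbook (the nominal HEP value is illustrative only).

        # Conditional HEP of a dependent action given that the preceding action failed,
        # using the dependence-level formulas as commonly cited from NUREG/CR-1278.
        def conditional_hep(p, level):
            return {
                "ZD": p,                    # zero dependence
                "LD": (1 + 19 * p) / 20,    # low dependence
                "MD": (1 + 6 * p) / 7,      # moderate dependence
                "HD": (1 + p) / 2,          # high dependence
                "CD": 1.0,                  # complete dependence
            }[level]

        p_nominal = 1.0e-3                  # illustrative independent HEP
        for level in ("ZD", "LD", "MD", "HD", "CD"):
            print(level, f"{conditional_hep(p_nominal, level):.3e}")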

  14. Green, Brown, and probability

    CERN Document Server

    Chung, Kai Lai

    1995-01-01

    This volume shows modern probabilistic methods in action: Brownian Motion Process as applied to the electrical phenomena investigated by Green et al., beginning with the Newton-Coulomb potential and ending with solutions by first and last exits of Brownian paths from conductors.

  15. Innovative Method in Improving Communication Issues by Applying Interdisciplinary Approach. Psycholinguistic Perspective to Mitigate Communication Troubles During Cislunar Travel.

    Science.gov (United States)

    Anikushina, V.; Taratukhin, V.; Stutterheim, C. v.; Gushin, V.

    2018-02-01

    A new psycholinguistic view on the crew communication, combined with biochemical and psychological data, contributes to noninvasive methods for stress appraisal and proposes alternative approaches to improve in-group communication and cohesion.

  16. Regional Guidebook for Applying the Hydrogeomorphic Approach to Assessing Wetland Functions of Northwest Gulf of Mexico Tidal Fringe Wetlands

    National Research Council Canada - National Science Library

    Shafer, Deborah

    2002-01-01

    ... in a region. The approach was initially designed to be used in the context of the Clean Water Act Section 404 Regulatory Program permit review sequence to consider alternatives, minimize impacts, assess...

  17. Probably Almost Bayes Decisions

    DEFF Research Database (Denmark)

    Anoulova, S.; Fischer, Paul; Poelt, S.

    1996-01-01

    discriminant functions for this purpose. We analyze this approach for different classes of distribution functions of Boolean features: kth order Bahadur-Lazarsfeld expansions and kth order Chow expansions. In both cases, we obtain upper bounds for the required sample size which are small polynomials

  18. Towards a Capability Approach to Careers: Applying Amartya Sen's Thinking to Career Guidance and Development

    Science.gov (United States)

    Robertson, Peter J.

    2015-01-01

    Amartya Sen's capability approach characterizes an individual's well-being in terms of what they are able to be, and what they are able to do. This framework for thinking has many commonalities with the core ideas in career guidance. Sen's approach is abstract and not in itself a complete or explanatory theory, but a case can be…

  19. Probability and rational choice

    Directory of Open Access Journals (Sweden)

    David Botting

    2014-05-01

    http://dx.doi.org/10.5007/1808-1711.2014v18n1p1 In this paper I will discuss the rationality of reasoning about the future. There are two things that we might like to know about the future: which hypotheses are true and what will happen next. To put it in philosophical language, I aim to show that there are methods by which inferring to a generalization (selecting a hypothesis) and inferring to the next instance (singular predictive inference) can be shown to be normative and the method itself shown to be rational, where this is due in part to being based on evidence (although not in the same way) and in part on a prior rational choice. I will also argue that these two inferences have been confused, being distinct not only conceptually (as nobody disputes) but also in their results (the value given to the probability of the hypothesis not being in general that given to the next instance), and that methods that are adequate for one are not by themselves adequate for the other. A number of debates over method founder on this confusion and do not show what the debaters think they show.

  20. Domestic wells have high probability of pumping septic tank leachate

    Science.gov (United States)

    Bremer, J. E.; Harter, T.

    2012-08-01

    Onsite wastewater treatment systems are common in rural and semi-rural areas around the world; in the US, about 25-30% of households are served by a septic (onsite) wastewater treatment system, and many property owners also operate their own domestic well nearby. Site-specific conditions and local groundwater flow are often ignored when installing septic systems and wells. In areas with small lots (thus high spatial septic system densities), shallow domestic wells are prone to contamination by septic system leachate. Mass balance approaches have been used to determine a maximum septic system density that would prevent contamination of groundwater resources. In this study, a source area model based on detailed groundwater flow and transport modeling is applied for a stochastic analysis of domestic well contamination by septic leachate. Specifically, we determine the probability that a source area overlaps with a septic system drainfield as a function of aquifer properties, septic system density and drainfield size. We show that high spatial septic system density poses a high probability of pumping septic system leachate. The hydraulic conductivity of the aquifer has a strong influence on the intersection probability. We find that mass balance calculations applied on a regional scale underestimate the contamination risk of individual drinking water wells by septic systems. This is particularly relevant for contaminants released at high concentrations, for substances that experience limited attenuation, and those that are harmful even at low concentrations (e.g., pathogens).

  1. Domestic wells have high probability of pumping septic tank leachate

    Directory of Open Access Journals (Sweden)

    J. E. Bremer

    2012-08-01

    Onsite wastewater treatment systems are common in rural and semi-rural areas around the world; in the US, about 25–30% of households are served by a septic (onsite) wastewater treatment system, and many property owners also operate their own domestic well nearby. Site-specific conditions and local groundwater flow are often ignored when installing septic systems and wells. In areas with small lots (thus high spatial septic system densities), shallow domestic wells are prone to contamination by septic system leachate. Mass balance approaches have been used to determine a maximum septic system density that would prevent contamination of groundwater resources. In this study, a source area model based on detailed groundwater flow and transport modeling is applied for a stochastic analysis of domestic well contamination by septic leachate. Specifically, we determine the probability that a source area overlaps with a septic system drainfield as a function of aquifer properties, septic system density and drainfield size. We show that high spatial septic system density poses a high probability of pumping septic system leachate. The hydraulic conductivity of the aquifer has a strong influence on the intersection probability. We find that mass balance calculations applied on a regional scale underestimate the contamination risk of individual drinking water wells by septic systems. This is particularly relevant for contaminants released at high concentrations, for substances that experience limited attenuation, and those that are harmful even at low concentrations (e.g., pathogens).
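
    A minimal stochastic sketch in the spirit of the analysis above, with idealized geometry rather than a groundwater flow and transport model: drainfields are scattered at a given areal density, and we estimate how often at least one falls inside a well's upgradient source area. All dimensions and densities are assumed.

        # Monte Carlo estimate of the probability that a well's source area overlaps a drainfield.
        import numpy as np

        rng = np.random.default_rng(0)

        density = 4 / 1.0e4          # drainfields per m^2 (4 per hectare), assumed
        source_area = 20.0 * 50.0    # upgradient source area of the well: 20 m x 50 m, assumed
        domain = 300.0               # side of the square neighbourhood considered (m), assumed

        n_trials = 10_000
        hits = 0
        for _ in range(n_trials):
            n_fields = rng.poisson(density * domain**2)
            x = rng.uniform(0, domain, n_fields)
            y = rng.uniform(0, domain, n_fields)
            # source area modelled as a 20 m x 50 m rectangle in the domain centre
            inside = (np.abs(x - domain / 2) < 10.0) & (np.abs(y - domain / 2) < 25.0)
            hits += inside.any()

        print("Monte Carlo intersection probability:", hits / n_trials)
        print("Poisson approximation 1 - exp(-rho*A):", 1 - np.exp(-density * source_area))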

  2. Encounter Probability of Individual Wave Height

    DEFF Research Database (Denmark)

    Liu, Z.; Burcharth, H. F.

    1998-01-01

    wave height corresponding to a certain exceedence probability within a structure lifetime (encounter probability), based on the statistical analysis of long-term extreme significant wave height. Then the design individual wave height is calculated as the expected maximum individual wave height associated with the design significant wave height, with the assumption that the individual wave heights follow the Rayleigh distribution. However, the exceedence probability of such a design individual wave height within the structure lifetime is unknown. The paper presents a method for the determination of the design individual wave height corresponding to an exceedence probability within the structure lifetime, given the long-term extreme significant wave height. The method can also be applied for estimation of the number of relatively large waves for fatigue analysis of constructions.
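
    Two of the quantities discussed above can be illustrated with one-line formulas: the encounter probability of a return-period event within a structure lifetime, and the most probable maximum individual wave height under the Rayleigh assumption. All numbers below are illustrative.

        # Encounter probability and most probable maximum individual wave height (Rayleigh assumption).
        import math

        T_return = 100.0      # return period of the design significant wave height (years), assumed
        lifetime = 50.0       # structure lifetime (years), assumed
        encounter_p = 1.0 - (1.0 - 1.0 / T_return) ** lifetime
        print(f"Encounter probability over {lifetime:.0f} years: {encounter_p:.2f}")

        Hs = 6.0              # design significant wave height (m), assumed
        N = 2000              # number of individual waves in the design sea state, assumed
        H_max = Hs * math.sqrt(math.log(N) / 2.0)   # most probable maximum for Rayleigh heights
        print(f"Most probable maximum individual wave height: {H_max:.1f} m")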

  3. Predicting binary choices from probability phrase meanings.

    Science.gov (United States)

    Wallsten, Thomas S; Jang, Yoonhee

    2008-08-01

    The issues of how individuals decide which of two events is more likely and of how they understand probability phrases both involve judging relative likelihoods. In this study, we investigated whether derived scales representing probability phrase meanings could be used within a choice model to predict independently observed binary choices. If they can, this simultaneously provides support for our model and suggests that the phrase meanings are measured meaningfully. The model assumes that, when deciding which of two events is more likely, judges take a single sample from memory regarding each event and respond accordingly. The model predicts choice probabilities by using the scaled meanings of individually selected probability phrases as proxies for confidence distributions associated with sampling from memory. Predictions are sustained for 34 of 41 participants but, nevertheless, are biased slightly low. Sequential sampling models improve the fit. The results have both theoretical and applied implications.

  4. Bayesian optimization for computationally extensive probability distributions.

    Science.gov (United States)

    Tamura, Ryo; Hukushima, Koji

    2018-01-01

    An efficient method for finding a better maximizer of computationally extensive probability distributions is proposed on the basis of a Bayesian optimization technique. A key idea of the proposed method is to use extreme values of acquisition functions by Gaussian processes for the next training phase, which should be located near a local maximum or a global maximum of the probability distribution. Our Bayesian optimization technique is applied to the posterior distribution in the effective physical model estimation, which is a computationally extensive probability distribution. Even when the number of sampling points on the posterior distributions is fixed to be small, the Bayesian optimization provides a better maximizer of the posterior distributions in comparison to those by the random search method, the steepest descent method, or the Monte Carlo method. Furthermore, the Bayesian optimization improves the results efficiently by combining the steepest descent method and thus it is a powerful tool to search for a better maximizer of computationally extensive probability distributions.
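
    A toy sketch of the loop described above, with a simple upper-confidence-bound acquisition standing in for the acquisition used in the paper: a Gaussian process is fitted to the points evaluated so far, and the next sample is taken at the acquisition maximum over a grid. The 1-D target below is a stand-in for a computationally extensive posterior.

        # Bayesian-optimization loop for maximizing an expensive (here, toy) log-posterior.
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        def log_posterior(x):                          # toy stand-in for the expensive target
            return -0.5 * ((x - 1.3) / 0.4) ** 2

        rng = np.random.default_rng(1)
        grid = np.linspace(-3, 3, 601).reshape(-1, 1)
        X = rng.uniform(-3, 3, 3).reshape(-1, 1)       # initial design points
        y = log_posterior(X).ravel()

        gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
        for _ in range(10):
            gp.fit(X, y)
            mu, sd = gp.predict(grid, return_std=True)
            ucb = mu + 2.0 * sd                        # acquisition: upper confidence bound
            x_next = grid[np.argmax(ucb)].reshape(1, 1)
            X = np.vstack([X, x_next])
            y = np.append(y, log_posterior(x_next).ravel())

        print("Best maximizer found:", X[np.argmax(y)][0])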

  5. Applied mathematics

    International Nuclear Information System (INIS)

    Nedelec, J.C.

    1988-01-01

    The 1988 progress report of the Applied Mathematics Center (Polytechnic School, France) is presented. The research fields of the Center are scientific computing, probability and statistics, and video image synthesis. The research topics developed are: the analysis of numerical methods, the mathematical analysis of fundamental models in physics and mechanics, the numerical solution of complex models related to industrial problems, stochastic calculus and Brownian motion, stochastic partial differential equations, the identification of adaptive filtering parameters, discrete element systems, statistics, stochastic control, and image synthesis techniques for education and research programs. The published papers, the conference communications and the theses are listed [fr

  6. Failure probability analysis of optical grid

    Science.gov (United States)

    Zhong, Yaoquan; Guo, Wei; Sun, Weiqiang; Jin, Yaohui; Hu, Weisheng

    2008-11-01

    Optical grid, an integrated computing environment based on optical networks, is expected to be an efficient infrastructure to support advanced data-intensive grid applications. In an optical grid, faults of both computational and network resources are inevitable due to the large scale and high complexity of the system. With optical-network-based distributed computing systems extensively applied to data processing, the application failure probability has become an important indicator of application quality and an important aspect that operators consider. This paper presents a task-based analysis method for the application failure probability in an optical grid. The failure probability of the entire application can then be quantified, and the performance of reducing the application failure probability under different backup strategies can be compared, so that the different requirements of different clients can be satisfied. In an optical grid, when an application described by a DAG (directed acyclic graph) is executed under different backup strategies, the application failure probability and the application completion time differ. This paper proposes a new multi-objective differentiated services algorithm (MDSA). The new application scheduling algorithm can guarantee the required failure probability and improve network resource utilization, realizing a compromise between the network operator and the application submitter. Differentiated services can thus be achieved in an optical grid.
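
    A minimal sketch (hypothetical numbers) of the task-based view: if each task runs on a resource with a known failure probability, and the application fails when any task fails, the application failure probability follows directly; an independent backup replica per task shows how a backup strategy reduces it.

        # Application failure probability from per-task resource failure probabilities.
        import numpy as np

        p_task = np.array([0.02, 0.01, 0.03, 0.015])    # per-task resource failure probabilities, assumed
        p_backup = np.array([0.02, 0.01, 0.03, 0.015])  # failure probabilities of backup resources, assumed

        def app_failure(p_primary, p_secondary=None):
            # with a backup, a task fails only if both its primary and backup resources fail
            p_eff = p_primary if p_secondary is None else p_primary * p_secondary
            return 1.0 - np.prod(1.0 - p_eff)           # any task failing fails the application

        print("No backup:  ", round(app_failure(p_task), 4))
        print("Full backup:", round(app_failure(p_task, p_backup), 6))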

  7. An Exploration of School Communication Approaches for Newly Arrived EAL Students: Applying Three Dimensions of Organisational Communication Theory

    Science.gov (United States)

    Schneider, Claudia; Arnot, Madeleine

    2018-01-01

    This article explores the modes of school communication associated with language and cultural diversity, demonstrating how organisational communication theory can be applied to the analysis of schools' communication responses to the presence of pupils who have English as an additional language (EAL). The article highlights three analytical…

  8. The Relationship of Deep and Surface Study Approaches on Factual and Applied Test-Bank Multiple-Choice Question Performance

    Science.gov (United States)

    Yonker, Julie E.

    2011-01-01

    With the advent of online test banks and large introductory classes, instructors have often turned to textbook publisher-generated multiple-choice question (MCQ) exams in their courses. Multiple-choice questions are often divided into categories of factual or applied, thereby implicating levels of cognitive processing. This investigation examined…

  9. Modern applied U-statistics

    CERN Document Server

    Kowalski, Jeanne

    2008-01-01

    A timely and applied approach to the newly discovered methods and applications of U-statistics. Built on years of collaborative research and academic experience, Modern Applied U-Statistics successfully presents a thorough introduction to the theory of U-statistics using in-depth examples and applications that address contemporary areas of study including biomedical and psychosocial research. Utilizing a "learn by example" approach, this book provides an accessible, yet in-depth, treatment of U-statistics, as well as addresses key concepts in asymptotic theory by integrating translational and cross-disciplinary research. The authors begin with an introduction of the essential and theoretical foundations of U-statistics such as the notion of convergence in probability and distribution, basic convergence results, stochastic Os, inference theory, generalized estimating equations, as well as the definition and asymptotic properties of U-statistics. With an emphasis on nonparametric applications when and where applic...
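
    For readers new to the topic, a minimal sketch of a degree-2 U-statistic: the average of a symmetric kernel over all unordered pairs of observations. With the kernel h(x, y) = (x - y)^2 / 2 it reproduces the unbiased sample variance; the data below are illustrative:

```python
# Minimal sketch of a degree-2 U-statistic: average of a symmetric kernel h
# over all unordered pairs of observations. With h(x, y) = 0.5 * (x - y)**2
# the U-statistic equals the unbiased sample variance. Data are illustrative.

from itertools import combinations

def u_statistic(data, kernel):
    pairs = list(combinations(data, 2))
    return sum(kernel(x, y) for x, y in pairs) / len(pairs)

def variance_kernel(x, y):
    return 0.5 * (x - y) ** 2

sample = [2.1, 3.4, 1.7, 4.0, 2.9]
u = u_statistic(sample, variance_kernel)

# Cross-check against the textbook unbiased variance formula.
mean = sum(sample) / len(sample)
s2 = sum((x - mean) ** 2 for x in sample) / (len(sample) - 1)
print(u, s2)  # the two values agree up to floating-point error
```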

  10. Direct torque control method applied to the WECS based on the PMSG and controlled with backstepping approach

    Science.gov (United States)

    Errami, Youssef; Obbadi, Abdellatif; Sahnoun, Smail; Ouassaid, Mohammed; Maaroufi, Mohamed

    2018-05-01

    This paper proposes a Direct Torque Control (DTC) method for a Wind Power System (WPS) based on a Permanent Magnet Synchronous Generator (PMSG), combined with a Backstepping approach. In this work, a generator-side converter and a grid-side converter with filter are used as the interface between the wind turbine and the grid. The Backstepping approach demonstrates great performance in the control of complicated nonlinear systems such as the WPS. The control method therefore combines DTC, to achieve Maximum Power Point Tracking (MPPT), with the Backstepping approach, to sustain the DC-bus voltage and to regulate the grid-side power factor. In addition, the control strategy is developed in the sense of the Lyapunov stability theorem for the WPS. Simulation results using MATLAB/Simulink validate the effectiveness of the proposed controllers.
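
    The DTC/Backstepping controller itself is not reproduced here, but the MPPT objective can be illustrated with a standard optimum-torque rule, T_ref = k_opt * omega^2; all turbine parameters and the Cp curve below are assumptions for demonstration, not values from the paper:

```python
# Illustrative sketch (not the paper's controller): optimum-torque MPPT for a
# PMSG wind turbine. The torque reference follows T_ref = k_opt * omega**2,
# where k_opt comes from the turbine geometry and the maximum power
# coefficient. Parameter values below are assumptions for demonstration.

import math

RHO = 1.225        # air density, kg/m^3
R = 2.0            # blade radius, m (assumed)
CP_MAX = 0.48      # maximum power coefficient (assumed)
LAMBDA_OPT = 8.1   # optimal tip-speed ratio (assumed)

K_OPT = 0.5 * RHO * math.pi * R**5 * CP_MAX / LAMBDA_OPT**3

def mppt_torque_reference(omega):
    """Generator torque reference that tracks the maximum power point."""
    return K_OPT * omega**2

def captured_power(v_wind, omega):
    """Aerodynamic power for a given wind speed and rotor speed (assumed Cp curve)."""
    lam = omega * R / v_wind
    # crude single-peak Cp model, only for illustration
    cp = CP_MAX * math.exp(-0.5 * ((lam - LAMBDA_OPT) / 2.0) ** 2)
    return 0.5 * RHO * math.pi * R**2 * cp * v_wind**3

for omega in (20.0, 30.0, 40.0):          # rotor speeds, rad/s
    print(omega, round(mppt_torque_reference(omega), 2),
          round(captured_power(10.0, omega), 1))
```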

  11. Tropical Cyclone Wind Probability Forecasting (WINDP).

    Science.gov (United States)

    1981-04-01

    The accuracy of small probabilities (below 10%) is limited by the number of significant digits given; therefore it should be regarded as being...

  12. Concurrency meets probability: theory and practice (abstract)

    NARCIS (Netherlands)

    Katoen, Joost P.

    Treating random phenomena in concurrency theory has a long tradition. Petri nets [18, 10] and process algebras [14] have been extended with probabilities. The same applies to behavioural semantics such as strong and weak (bi)simulation [1], and testing pre-orders [5]. Beautiful connections between

  13. Probability evolution method for exit location distribution

    Science.gov (United States)

    Zhu, Jinjie; Chen, Zhen; Liu, Xianbin

    2018-03-01

    The exit problem in the framework of the large deviation theory has been a hot topic in the past few decades. The most probable escape path in the weak-noise limit has been clarified by the Freidlin-Wentzell action functional. However, noise in real physical systems cannot be arbitrarily small, and noise of finite strength may induce nontrivial phenomena such as noise-induced shift and noise-induced saddle-point avoidance. Traditional Monte Carlo simulation of noise-induced escape takes an exponentially long time as the noise approaches zero, and the majority of that time is wasted on uninteresting wandering around the attractors. In this paper, a new method is proposed to decrease the escape simulation time by an exponentially large factor by introducing a series of interfaces and applying reinjection on them. This method can be used to calculate the exit location distribution. It is verified on two classical examples and compared with theoretical predictions. The results show that the method performs well for weak noise, while it may induce certain deviations for large noise. Finally, some possible ways to improve the method are discussed.
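
    As a point of reference, a brute-force baseline of the kind the proposed interface/reinjection scheme is designed to accelerate: Euler-Maruyama simulation of an overdamped particle in a double-well potential, recording which boundary is hit first. The potential, noise strength, and domain are illustrative assumptions, not taken from the paper:

```python
# Brute-force baseline sketch (not the accelerated interface/reinjection
# scheme from the paper): Euler-Maruyama simulation of an overdamped particle
# in a double-well potential, recording which boundary it first exits.

import math
import random

def drift(x):
    # -dU/dx for U(x) = x**4/4 - x**2/2 (double well with minima at +/-1)
    return -(x**3 - x)

def simulate_exit(x0=-1.0, left=-2.0, right=2.0, eps=0.5, dt=1e-3, max_steps=10**6):
    """Return the exit location ('left' or 'right') or None if no exit occurs."""
    x = x0
    for _ in range(max_steps):
        x += drift(x) * dt + math.sqrt(2 * eps * dt) * random.gauss(0.0, 1.0)
        if x <= left:
            return "left"
        if x >= right:
            return "right"
    return None

counts = {"left": 0, "right": 0, None: 0}
for _ in range(200):
    counts[simulate_exit()] += 1
print(counts)  # empirical exit location distribution over the two boundaries
```

    As the noise strength eps is reduced, the expected escape time grows exponentially, which is exactly why the wandering near the attractor dominates the cost of this baseline.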

  14. Experiences with IR Top N Optimization in a Main Memory DBMS: Applying 'The Database Approach' in New Domains

    NARCIS (Netherlands)

    Read, B.; Blok, H.E.; de Vries, A.P.; Blanken, Henk; Apers, Peter M.G.

    Data abstraction and query processing techniques are usually studied in the domain of administrative applications. We present a case-study in the non-standard domain of (multimedia) information retrieval, mainly intended as a feasibility study in favor of the `database approach' to data management.

  15. (Re)Acting Medicine: Applying Theatre in Order to Develop a Whole-Systems Approach to Understanding the Healing Response

    Science.gov (United States)

    Goldingay, S.; Dieppe, P.; Mangan, M.; Marsden, D.

    2014-01-01

    This critical reflection is based on the belief that creative practitioners should be using their own well-established approaches to trouble dominant paradigms in health and care provision to both form and inform the future of healing provision and well-being creation. It describes work by a transdisciplinary team (drama and medicine) that is…

  16. Advancing Dose-Response Assessment Methods for Environmental Regulatory Impact Analysis: A Bayesian Belief Network Approach Applied to Inorganic Arsenic.

    Science.gov (United States)

    Zabinski, Joseph W; Garcia-Vargas, Gonzalo; Rubio-Andrade, Marisela; Fry, Rebecca C; Gibson, Jacqueline MacDonald

    2016-05-10

    Dose-response functions used in regulatory risk assessment are based on studies of whole organisms and fail to incorporate genetic and metabolomic data. Bayesian belief networks (BBNs) could provide a powerful framework for incorporating such data, but no prior research has examined this possibility. To address this gap, we develop a BBN-based model predicting birthweight at gestational age from arsenic exposure via drinking water and maternal metabolic indicators using a cohort of 200 pregnant women from an arsenic-endemic region of Mexico. We compare BBN predictions to those of prevailing slope-factor and reference-dose approaches. The BBN outperforms prevailing approaches in balancing false-positive and false-negative rates. Whereas the slope-factor approach had 2% sensitivity and 99% specificity and the reference-dose approach had 100% sensitivity and 0% specificity, the BBN's sensitivity and specificity were 71% and 30%, respectively. BBNs offer a promising opportunity to advance health risk assessment by incorporating modern genetic and metabolomic data.
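
    A toy sketch of the BBN idea (not the paper's fitted model): a three-node chain Exposure -> Metabolite -> Low birthweight with made-up conditional probability tables, queried by summing out the intermediate node:

```python
# Toy sketch of a Bayesian belief network for dose-response reasoning.
# All probabilities are illustrative assumptions, not estimates from the study.

P_exposure = {"low": 0.7, "high": 0.3}          # prior over arsenic exposure
P_metabolite = {                                # P(metabolite state | exposure)
    ("normal", "low"): 0.9, ("altered", "low"): 0.1,
    ("normal", "high"): 0.4, ("altered", "high"): 0.6,
}
P_lbw = {                                       # P(low birthweight | metabolite state)
    ("yes", "normal"): 0.05, ("no", "normal"): 0.95,
    ("yes", "altered"): 0.25, ("no", "altered"): 0.75,
}

def p_lbw_given_exposure(exposure):
    """P(low birthweight = yes | exposure) by summing out the metabolite node."""
    return sum(P_metabolite[(m, exposure)] * P_lbw[("yes", m)]
               for m in ("normal", "altered"))

for e in ("low", "high"):
    print(e, round(p_lbw_given_exposure(e), 3))
```

    The appeal of the network structure is that intermediate biological variables (here, the metabolite node) can be conditioned on or marginalized out, which is what lets genetic and metabolomic measurements enter the dose-response calculation.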

  17. Applying DOE's Graded Approach for assessing radiation impacts to non-human biota at the INL

    International Nuclear Information System (INIS)

    Morris, Randall C.

    2006-01-01

    In July 2002, the US Department of Energy (DOE) released a new technical standard entitled A Graded Approach for Evaluating Radiation Doses to Aquatic and Terrestrial Biota. DOE facilities are required to demonstrate annually that routine radioactive releases from their sites are protective of non-human receptors, and sites are encouraged to use the Graded Approach for this purpose. Use of the Graded Approach requires completion of several preliminary steps to evaluate the degree to which the site environmental monitoring program is appropriate for evaluating impacts to non-human biota. We completed these necessary activities at the Idaho National Laboratory (INL) through the following four tasks: (1) develop conceptual models and evaluate exposure pathways; (2) define INL evaluation areas; (3) evaluate sampling locations and media; (4) evaluate data gaps. All of the information developed in the four steps was incorporated, data sources were identified, departures from the Graded Approach were justified, and a step-by-step procedure for biota dose assessment at the INL was specified. Finally, we completed a site-wide biota dose assessment using the 2002 environmental surveillance data and an offsite assessment using soil and surface water data collected since 1996. These assessments demonstrated that the environmental concentrations of radionuclides measured on and near the INL do not present significant risks to populations of non-human biota.

  18. Applying Rock Engineering Systems (RES) approach to Evaluate and Classify the Coal Spontaneous Combustion Potential in Eastern Alborz Coal Mines

    Directory of Open Access Journals (Sweden)

    Amir Saffari

    2013-12-01

    Analysis of the potential for spontaneous combustion in coal layers with analytical and numerical methods has always been considered a difficult task because of the complexity of coal behavior and the number of factors influencing it. Empirical methods, because they account only for certain specific factors, lack accuracy and efficiency across all situations. The Rock Engineering Systems (RES) approach is proposed as a systematic method for analysis and classification in engineering projects. The present study is concerned with employing the RES approach to categorize coal spontaneous combustion in coal regions. Using this approach, the interactions of the parameters affecting coal spontaneous combustion were evaluated on an equal scale. The intrinsic, geological and mining characteristics of coal seams were studied in order to identify the important parameters. Then, the main stages of implementation of the RES method, i.e. interaction matrix formation, matrix coding, and forming the category list, were performed. Next, an index of Coal Spontaneous Combustion Potential (CSCPi) was determined to formulate the mathematical equation. The data related to the intrinsic, geological and mining characteristics, and the special index, were then calculated for each layer in the case study (Pashkalat coal region, Iran). The study thus offers a complete and comprehensive classification of the layers. Finally, using a spontaneous combustion event that occurred in the Pashkalat coal region, an initial validation of this systematic approach in the study area was conducted, which suggested relatively good concordance for the Pashkalat coal region.
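
    An illustrative sketch of the RES bookkeeping (parameter names, interaction codes, and ratings are hypothetical, not the paper's values): code an interaction matrix, derive cause/effect weights from its row and column sums, and combine them with per-seam ratings into an index analogous to the CSCPi:

```python
# Hypothetical sketch of the RES workflow: interaction matrix -> cause/effect
# weights -> weighted index per coal seam. All values are illustrative.

params = ["volatile_matter", "seam_thickness", "pyrite_content"]

# interaction[i][j]: coded influence (0-4) of parameter i on parameter j.
interaction = [
    [0, 2, 1],
    [1, 0, 2],
    [3, 1, 0],
]

cause = [sum(row) for row in interaction]                   # row sums, C_i
effect = [sum(col) for col in zip(*interaction)]            # column sums, E_i
total = sum(cause) + sum(effect)
weights = [(c + e) / total for c, e in zip(cause, effect)]  # a_i, sums to 1

def cscp_index(ratings, max_rating=4):
    """Weighted, normalised sum of per-seam parameter ratings (0..max_rating)."""
    return 100 * sum(w * r / max_rating for w, r in zip(weights, ratings))

seam_ratings = {"seam_A": [3, 2, 4], "seam_B": [1, 1, 2]}
for seam, ratings in seam_ratings.items():
    print(seam, round(cscp_index(ratings), 1))
```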

  19. Visual Thinking Routines: A Mixed Methods Approach Applied to Student Teachers at the American University in Dubai

    Science.gov (United States)

    Gholam, Alain

    2017-01-01

    Visual thinking routines are principles based on several theories, approaches, and strategies. Such routines promote thinking skills, call for collaboration and sharing of ideas, and above all, make thinking and learning visible. Visual thinking routines were implemented in the teaching methodology graduate course at the American University in…

  20. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.
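
    The two scenarios usually at issue can be made concrete by enumerating the equally likely sample space {BB, BG, GB, GG}; a minimal sketch:

```python
# Enumeration of the two standard readings of the "second child" problem,
# assuming boys and girls are equally likely and births are independent.

from fractions import Fraction

families = ["BB", "BG", "GB", "GG"]   # elder child listed first

# Scenario 1: we learn that at least one child is a girl.
cond1 = [f for f in families if "G" in f]
p1 = Fraction(sum(f == "GG" for f in cond1), len(cond1))

# Scenario 2: we learn that the elder child is a girl.
cond2 = [f for f in families if f[0] == "G"]
p2 = Fraction(sum(f == "GG" for f in cond2), len(cond2))

print(p1)  # 1/3
print(p2)  # 1/2
```

    The contrast between 1/3 and 1/2 is entirely a matter of which conditioning event the scenario specifies, which is the point the debate turns on.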