WorldWideScience

Sample records for high open probability

  1. Sill intrusion in volcanic calderas: implications for vent opening probability

    Science.gov (United States)

    Giudicepietro, Flora; Macedonio, Giovanni; Martini, Marcello; D'Auria, Luca

    2017-04-01

    Calderas show peculiar behavior, with remarkable dynamic processes that often do not culminate in eruptions. Observations and studies conducted in recent decades have shown that the most common cause of unrest in calderas is magma intrusion; in particular, the intrusion of sills at shallow depths. Monogenetic cones with large areal dispersion are quite common in calderas, suggesting that susceptibility analysis based on geological features alone is not strictly suitable for estimating the vent opening probability in calderas. In general, the opening of a new eruptive vent can be regarded as a rock failure process. The stress field in the rocks that surround and top the magmatic reservoirs plays an important role in causing the rock failure and creating the path that magma can follow towards the surface. In this conceptual framework, we approach the problem of obtaining clues about the probability of vent opening in volcanic calderas through the study of the stress field produced by the intrusion of magma, in particular by the intrusion of a sill. We simulate the intrusion of a sill free to expand radially, with shape and dimensions that vary with time. The intrusion process is controlled by the elastic response of the rock plate above the sill, which bends because of the intrusion, and by gravity, which drives the magma towards the zones where the thickness of the sill is smaller. We calculated the stress field in the rock plate above the sill. We found that, at the bottom of the rock plate, the maximum intensity of tensile stress is concentrated at the front of the sill and spreads radially with it over time. For this reason, we think that the front of the spreading sill is prone to the opening of eruptive vents. The stress intensity is also relatively high in the central area of the sill, but there the stress at the base of the rock plate is compressive. Under isothermal conditions, the stress soon reaches its maximum value (time interval

  2. Statistical analysis on failure-to-open/close probability of motor-operated valve in sodium system

    International Nuclear Information System (INIS)

    Kurisaka, Kenichi

    1998-08-01

    The objective of this work is to develop basic data for examining the efficiency of preventive maintenance and actuation tests from the standpoint of failure probability. This work consists of a statistical trend analysis of valve failure probability in the failure-to-open/close mode, as a function of time since installation and time since the last open/close action, based on field data of operating and failure experience. Both time-dependent and time-independent terms were considered in the failure probability. The linear aging model was modified and applied to the time-dependent term, giving two contributions with failure rates proportional to time since installation and to time since the last open/close demand, respectively. Because they provide a sufficient statistical population, motor-operated valves (MOVs) in sodium systems were selected for analysis from the CORDS database, which contains operating data and failure data of components in fast reactors and sodium test facilities. From these data, the functional parameters were statistically estimated to quantify the valve failure probability in the failure-to-open/close mode, with consideration of uncertainty. (J.P.N.)

  3. A statistical analysis on failure-to-open/close probability of pneumatic valve in sodium cooling systems

    International Nuclear Information System (INIS)

    Kurisaka, Kenichi

    1999-11-01

    The objective of this study is to develop fundamental data for examining the efficiency of preventive maintenance and surveillance tests from the standpoint of failure probability. In this study, a pneumatic valve in sodium cooling systems was selected as a major standby component. A statistical analysis was made of the trend of the valve failure-to-open/close (FTOC) probability depending on the number of demands ('n'), time since installation ('t') and standby time since the last open/close action ('T'). The analysis is based on the field data of operating and failure experience stored in the Component Reliability Database and Statistical Analysis System for LMFBR's (CORDS). In the analysis, the FTOC probability ('P') was expressed as follows: P = 1 - exp{-C - En - F/n - λT - aT(t - T/2) - AT²/2}. The functional parameters 'C', 'E', 'F', 'λ', 'a' and 'A' were estimated with the maximum likelihood estimation method. As a result, the FTOC probability is well expressed by the failure probability derived from the failure rate under the assumption of a Poisson distribution only when the valve cycle (i.e. the open-close-open cycle) exceeds about 100 days. When the valve cycle is shorter than about 100 days, the FTOC probability can be adequately estimated with the parametric model proposed in this study. The results obtained from this study may make it possible to derive an adequate frequency of surveillance testing for a given target of the FTOC probability. (author)
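
    The parametric model quoted above can be evaluated directly once its six parameters are known. Below is a minimal sketch in Python; the parameter values are illustrative placeholders, not the CORDS maximum-likelihood estimates reported in the study.

```python
import numpy as np

def ftoc_probability(n, t, T, C, E, F, lam, a, A):
    """FTOC probability model from the abstract:
    P = 1 - exp{-C - E*n - F/n - lam*T - a*T*(t - T/2) - A*T**2/2}

    n: number of demands; t: time since installation (days);
    T: standby time since last open/close action (days).
    """
    exponent = -C - E * n - F / n - lam * T - a * T * (t - T / 2) - A * T**2 / 2
    return 1.0 - np.exp(exponent)

# Hypothetical parameter values for illustration only:
params = dict(C=1e-4, E=1e-6, F=5e-4, lam=1e-5, a=1e-9, A=1e-8)
print(ftoc_probability(n=200, t=3650.0, T=50.0, **params))
```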

  4. Targets of DNA-binding proteins in bacterial promoter regions present enhanced probabilities for spontaneous thermal openings

    International Nuclear Information System (INIS)

    Apostolaki, Angeliki; Kalosakas, George

    2011-01-01

    We mapped promoter regions of double-stranded DNA with respect to the probabilities of appearance of relatively large bubble openings caused exclusively by thermal fluctuations at physiological temperatures. We analyzed five well-studied promoter regions of prokaryotic type and found a spatial correlation between the binding sites of transcription factors and the position of peaks in the probability pattern of large thermal openings. Other distinct peaks of the calculated patterns correlate with potential binding sites of DNA-binding proteins. These results suggest that a DNA molecule would more frequently expose the bases that participate in contacts with proteins, which would probably enhance the probability of the latter reaching their targets. They also support using this method as a means of analyzing DNA sequences based on their intrinsic thermal properties.

  5. Spatial vent opening probability map of El Hierro Island (Canary Islands, Spain)

    Science.gov (United States)

    Becerril, Laura; Cappello, Annalisa; Galindo, Inés; Neri, Marco; Del Negro, Ciro

    2013-04-01

    The assessment of the probable spatial distribution of new eruptions is useful for managing and reducing volcanic risk. It can be achieved in different ways, but it becomes especially hard when dealing with volcanic areas that are less studied, poorly monitored and characterized by low-frequency activity, as is El Hierro. Even though it is the youngest of the Canary Islands, before the 2011 eruption in the "Las Calmas Sea", El Hierro had been the least studied volcanic island of the Canaries, with historical attention devoted mostly to La Palma, Tenerife and Lanzarote. We propose a probabilistic method to build the susceptibility map of El Hierro, i.e. the spatial distribution of vent opening for future eruptions, based on the mathematical analysis of volcano-structural data collected mostly on the island and, secondly, on the submerged part of the volcano, up to a distance of ~10-20 km from the coast. The volcano-structural data were collected through new fieldwork measurements, bathymetric information, and analysis of geological maps, orthophotos and aerial photographs. They were divided into different datasets and converted into separate, weighted probability density functions, which were then included in a non-homogeneous Poisson process to produce the volcanic susceptibility map. The probability of future eruptive events on El Hierro is mainly concentrated on the rift zones, extending also beyond the shoreline. The highest probabilities of hosting new eruptions are located on the distal parts of the South and West rifts, with the maximum reached in the south-western area of the West rift. High probabilities are also observed in the Northeast and South rifts, and in the submarine parts of the rifts. This map represents the first effort to deal with volcanic hazard at El Hierro and can be a support tool for decision makers in land planning, emergency plans and civil defence actions.
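
    The core computation described here (converting several volcano-structural datasets into Gaussian-kernel probability density maps and combining them with weights) can be sketched in a few lines of Python. All coordinates, bandwidths and weights below are invented for illustration and are not the El Hierro data.

```python
import numpy as np

def gaussian_kernel_density(grid_x, grid_y, points, bandwidth):
    """Evaluate a 2-D Gaussian kernel density estimate on a regular grid."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    density = np.zeros_like(gx)
    for px, py in points:
        density += np.exp(-((gx - px)**2 + (gy - py)**2) / (2 * bandwidth**2))
    density /= density.sum()  # normalize to a probability map over grid cells
    return density

grid_x = np.linspace(0, 30, 150)   # km, illustrative island extent
grid_y = np.linspace(0, 30, 150)

# Hypothetical datasets: mapped vents and eruptive fissures (km coordinates)
vents = np.random.default_rng(0).uniform(5, 25, size=(40, 2))
fissures = np.random.default_rng(1).uniform(8, 22, size=(25, 2))

# Weighted linear combination of the per-dataset density maps
weights = {"vents": 0.6, "fissures": 0.4}
susceptibility = (weights["vents"] * gaussian_kernel_density(grid_x, grid_y, vents, bandwidth=1.5)
                  + weights["fissures"] * gaussian_kernel_density(grid_x, grid_y, fissures, bandwidth=2.0))
susceptibility /= susceptibility.sum()  # spatial vent-opening probability per cell
```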

  6. Some open problems in noncommutative probability

    International Nuclear Information System (INIS)

    Kruszynski, P.

    1981-01-01

    A generalization of probability measures to non-Boolean structures is discussed. The starting point of the theory is the Gleason theorem about the form of measures on closed subspaces of a Hilbert space. The problems are formulated in terms of probability on lattices of projections in arbitrary von Neumann algebras. (Auth.)

  7. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under- and over-fitting the data as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.
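
    The scoring idea, mapping the sorted sample through a trial CDF and comparing against the order statistics expected of uniform data, can be illustrated in a toy form. This sketch shows only the general principle; it is not the authors' sample-size-invariant scoring function.

```python
import numpy as np
from scipy import stats

def order_statistic_score(sample, trial_cdf):
    """Score a trial CDF: if trial_cdf is correct, u = F(x_(k)) behaves like
    sorted uniform data, whose k-th order statistic follows Beta(k, n+1-k)."""
    u = np.sort(trial_cdf(np.sort(sample)))
    n = len(u)
    k = np.arange(1, n + 1)
    # log-density of each order statistic under its Beta(k, n+1-k) law
    return stats.beta.logpdf(u, k, n + 1 - k).sum()

rng = np.random.default_rng(42)
sample = rng.normal(loc=2.0, scale=1.5, size=500)

good = lambda x: stats.norm.cdf(x, loc=2.0, scale=1.5)   # correct model
bad = lambda x: stats.norm.cdf(x, loc=0.0, scale=1.0)    # mis-specified model
print(order_statistic_score(sample, good) > order_statistic_score(sample, bad))  # True
```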

  8. Assessing future vent opening locations at the Somma-Vesuvio volcanic complex: 2. Probability maps of the caldera for a future Plinian/sub-Plinian event with uncertainty quantification

    Science.gov (United States)

    Tadini, A.; Bevilacqua, A.; Neri, A.; Cioni, R.; Aspinall, W. P.; Bisson, M.; Isaia, R.; Mazzarini, F.; Valentine, G. A.; Vitale, S.; Baxter, P. J.; Bertagnini, A.; Cerminara, M.; de Michieli Vitturi, M.; Di Roberto, A.; Engwell, S.; Esposti Ongaro, T.; Flandoli, F.; Pistolesi, M.

    2017-06-01

    In this study, we combine reconstructions of volcanological data sets and inputs from a structured expert judgment to produce a first long-term probability map for vent opening location for the next Plinian or sub-Plinian eruption of Somma-Vesuvio. In the past, the volcano has exhibited significant spatial variability in vent location; this can exert a significant control on where hazards materialize (particularly pyroclastic density currents). The new vent opening probability mapping has been performed through (i) development of spatial probability density maps with Gaussian kernel functions for different data sets and (ii) weighted linear combination of these spatial density maps. The epistemic uncertainties affecting these data sets were quantified explicitly with expert judgments and implemented following a doubly stochastic approach. Various elicitation pooling metrics and subgroupings of experts and target questions were tested to evaluate the robustness of outcomes. Our findings indicate that (a) Somma-Vesuvio vent opening probabilities are distributed inside the whole caldera, with a peak corresponding to the area of the present crater, but with more than 50% probability that the next vent could open elsewhere within the caldera; (b) there is a mean probability of about 30% that the next vent will open west of the present edifice; (c) there is a mean probability of about 9.5% that the next medium-large eruption will enlarge the present Somma-Vesuvio caldera; and (d) there is a nonnegligible probability (mean value of 6-10%) that the next Plinian or sub-Plinian eruption will have its initial vent opening outside the present Somma-Vesuvio caldera.
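
    The doubly stochastic treatment of epistemic uncertainty, in which the combination weights are themselves drawn from elicited distributions and propagated through the map combination, can be sketched as a Monte Carlo loop. The maps, Dirichlet parameters and sample counts below are illustrative assumptions, not the elicited Somma-Vesuvio values.

```python
import numpy as np

rng = np.random.default_rng(7)

# Stand-in per-dataset spatial probability maps on a small grid (already normalized)
maps = [rng.dirichlet(np.ones(400)).reshape(20, 20) for _ in range(3)]

# Expert-elicited uncertainty on the linear-combination weights, encoded here
# (as an assumption) by a Dirichlet distribution over the weight vector
weight_alpha = np.array([4.0, 2.0, 1.0])

samples = []
for _ in range(2000):
    w = rng.dirichlet(weight_alpha)                   # one plausible weight vector
    combined = sum(wi * m for wi, m in zip(w, maps))  # one doubly stochastic map draw
    samples.append(combined)

samples = np.array(samples)
mean_map = samples.mean(axis=0)                      # mean vent-opening probability map
p05, p95 = np.percentile(samples, [5, 95], axis=0)  # epistemic uncertainty bands
```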

  9. A novel approach to estimate the eruptive potential and probability in open conduit volcanoes.

    Science.gov (United States)

    De Gregorio, Sofia; Camarda, Marco

    2016-07-26

    In open conduit volcanoes, volatile-rich magma continuously enters the feeding system, yet eruptive activity occurs only intermittently. From a practical perspective, the continuous steady input of magma into the feeding system is not able to produce eruptive events alone; rather, surpluses of magma input are required to trigger the eruptive activity. The greater the surplus of magma within the feeding system, the higher the eruptive probability. Despite this observation, eruptive potential evaluations are commonly based on the regular magma supply, and in eruptive probability evaluations any magma input generally has the same weight. Conversely, here we present a novel approach based on the quantification of the surplus of magma progressively intruded into the feeding system. To quantify the surplus of magma, we suggest processing time series of measurable parameters linked to the magma supply. We successfully performed a practical application on Mt Etna using the soil CO2 flux recorded over ten years.
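
    One simple way to realize the proposal, extracting the surplus magma input from a monitored time series, is to accumulate the excess of the signal over its steady baseline. The sketch below assumes a daily soil CO2 flux series and takes the long-term median as the baseline; both choices are illustrative, not the authors' processing.

```python
import numpy as np

rng = np.random.default_rng(3)
days = 3650
co2_flux = rng.gamma(shape=5.0, scale=2.0, size=days)   # synthetic soil CO2 flux
co2_flux[1200:1260] += 25.0                              # an injected "surplus" episode

baseline = np.median(co2_flux)             # proxy for the steady magma supply
excess = np.clip(co2_flux - baseline, 0, None)
surplus = np.cumsum(excess)                # cumulative surplus of magma input (proxy)

# In this scheme, faster accumulation of surplus maps to higher eruptive
# probability; here we simply flag days of anomalously fast recent growth.
window = 90
recent = surplus[window:] - surplus[:-window]
alert = recent > np.percentile(recent, 99)
print(int(alert.sum()), "days flagged")
```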

  10. Burden of high fracture probability worldwide: secular increases 2010-2040.

    Science.gov (United States)

    Odén, A; McCloskey, E V; Kanis, J A; Harvey, N C; Johansson, H

    2015-09-01

    The number of individuals aged 50 years or more at high risk of osteoporotic fracture worldwide in 2010 was estimated at 158 million and is set to double by 2040. The aim of this study was to quantify the number of individuals worldwide aged 50 years or more at high risk of osteoporotic fracture in 2010 and 2040. A threshold of high fracture probability was set at the age-specific 10-year probability of a major fracture (clinical vertebral, forearm, humeral or hip fracture) which was equivalent to that of a woman with a BMI of 24 kg/m² and a prior fragility fracture but no other clinical risk factors. The prevalence of high risk was determined worldwide and by continent using all available country-specific FRAX models, applying the population demography for each country. Twenty-one million men and 137 million women had a fracture probability at or above the threshold in the world for the year 2010. The greatest number of men and women at high risk were from Asia (55%). Worldwide, the number of high-risk individuals is expected to double over the next 40 years. We conclude that individuals with a high probability of osteoporotic fractures comprise a very significant disease burden to society, particularly in Asia, and that this burden is set to increase markedly in the future. These analyses provide a platform for the evaluation of risk assessment and intervention strategies.
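
    At its core, the burden estimate is a sum over demographic strata of population counts multiplied by the fraction above the age-specific probability threshold. A toy version with made-up numbers follows; the real inputs are country-specific FRAX model outputs and demographic projections.

```python
# Toy strata: (age band, population in millions, fraction of that band above the
# age-specific 10-year major-fracture probability threshold). All numbers invented.
strata_2010 = [
    ("50-59", 600.0, 0.02),
    ("60-69", 400.0, 0.06),
    ("70-79", 250.0, 0.15),
    ("80+",   120.0, 0.30),
]

def high_risk_millions(strata):
    return sum(pop * frac for _, pop, frac in strata)

print(f"{high_risk_millions(strata_2010):.0f} million at high risk (toy numbers)")

# Projecting to 2040 means repeating the sum with projected populations;
# in the study this roughly doubles the 2010 figure of 158 million.
```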

  11. Mining of high utility-probability sequential patterns from uncertain databases.

    Directory of Open Access Journals (Sweden)

    Binbin Zhang

    Full Text Available High-utility sequential pattern mining (HUSPM) has become an important issue in the field of data mining. Several HUSPM algorithms have been designed to mine high-utility sequential patterns (HUSPs). They have been applied in several real-life situations, such as consumer behavior analysis and event detection in sensor networks. Nonetheless, most studies on HUSPM have focused on mining HUSPs in precise data. But in real life, uncertainty is an important factor, as data is collected using various types of sensors that are more or less accurate. Hence, data collected in a real-life database can be annotated with existence probabilities. This paper presents a novel pattern mining framework called high utility-probability sequential pattern mining (HUPSPM) for mining high utility-probability sequential patterns (HUPSPs) in uncertain sequence databases. A baseline algorithm with three optional pruning strategies is presented to mine HUPSPs. Moreover, to speed up the mining process, a projection mechanism is designed to create a database projection for each processed sequence, which is smaller than the original database. Thus, the number of unpromising candidates can be greatly reduced, as well as the execution time for mining HUPSPs. Substantial experiments both on real-life and synthetic datasets show that the designed algorithm performs well in terms of runtime, number of candidates, memory usage, and scalability for different minimum utility and minimum probability thresholds.
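
    In one simplified reading of the problem (our assumption, not the paper's exact definitions), each sequence carries an existence probability and each item a utility; a pattern qualifies if both its summed utility and its summed probability over the sequences containing it clear user-given thresholds:

```python
# Each database entry: (sequence of (item, utility) pairs, existence probability)
db = [
    ([("a", 5), ("b", 3), ("c", 2)], 0.9),
    ([("a", 4), ("c", 6)],           0.6),
    ([("b", 2), ("c", 1)],           0.8),
]

def contains(seq, pattern):
    """True if pattern occurs as a subsequence of the item sequence."""
    it = iter(item for item, _ in seq)
    return all(p in it for p in pattern)

def utility_and_probability(pattern, db):
    total_util = total_prob = 0.0
    for seq, seq_prob in db:
        if contains(seq, pattern):
            total_util += sum(u for item, u in seq if item in pattern)
            total_prob += seq_prob
    return total_util, total_prob

min_utility, min_probability = 10.0, 1.2   # user-specified thresholds
u, p = utility_and_probability(("a", "c"), db)
print(u, p, u >= min_utility and p >= min_probability)
```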

  12. High probability of disease in angina pectoris patients

    DEFF Research Database (Denmark)

    Høilund-Carlsen, Poul F.; Johansen, Allan; Vach, Werner

    2007-01-01

    BACKGROUND: According to most current guidelines, stable angina pectoris patients with a high probability of having coronary artery disease can be reliably identified clinically. OBJECTIVES: To examine the reliability of clinical evaluation with or without an at-rest electrocardiogram (ECG...) in patients with a high probability of coronary artery disease. PATIENTS AND METHODS: A prospective series of 357 patients referred for coronary angiography (CA) for suspected stable angina pectoris were examined by a trained physician who judged their type of pain and Canadian Cardiovascular Society grade... on CA. Of the patients who also had an abnormal at-rest ECG, 14% to 21% of men and 42% to 57% of women had normal MPS. Sex-related differences were statistically significant. CONCLUSIONS: Clinical prediction appears to be unreliable. Addition of at-rest ECG data results in some improvement, particularly...

  13. Decomposition of conditional probability for high-order symbolic Markov chains

    Science.gov (United States)

    Melnik, S. S.; Usatenko, O. V.

    2017-07-01

    The main goal of this paper is to develop an estimate for the conditional probability function of random stationary ergodic symbolic sequences with elements belonging to a finite alphabet. We elaborate a decomposition procedure for the conditional probability function of sequences considered to be high-order Markov chains. We represent the conditional probability function as the sum of multilinear memory function monomials of different orders (from zero up to the chain order). This allows us to introduce a family of Markov chain models and to construct artificial sequences via a method of successive iterations, taking into account at each step increasingly high correlations among random elements. At weak correlations, the memory functions are uniquely expressed in terms of the high-order symbolic correlation functions. The proposed method fills the gap between two approaches, namely likelihood estimation and additive Markov chains. The obtained results may have applications for sequential approximation of artificial neural network training.
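
    A minimal illustration of the starting point, estimating the conditional probability function of a high-order symbolic Markov chain by counting contexts, is given below; the paper's memory-function decomposition goes further than this sketch.

```python
from collections import Counter, defaultdict

def conditional_probabilities(sequence, order):
    """Estimate P(next symbol | previous `order` symbols) from one long sample."""
    counts = defaultdict(Counter)
    for i in range(len(sequence) - order):
        context = tuple(sequence[i:i + order])
        counts[context][sequence[i + order]] += 1
    return {ctx: {s: c / sum(ctr.values()) for s, c in ctr.items()}
            for ctx, ctr in counts.items()}

seq = "ababbabaabbbabab" * 50          # toy binary symbolic sequence
probs = conditional_probabilities(seq, order=3)
print(probs[("a", "b", "a")])          # conditional distribution after context 'aba'
```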

  14. A High Five for ChemistryOpen.

    Science.gov (United States)

    Peralta, David; Ortúzar, Natalia

    2016-02-01

    Fabulous at five! When ChemistryOpen was launched in 2011, it was the first society-owned general chemistry journal to publish open-access articles exclusively. Five years down the line, it has featured excellent work in all fields of chemistry, leading to an impressive first full impact factor of 3.25. In this Editorial, read about how ChemistryOpen has grown over the past five years and made its mark as a high-quality open-access journal with impact.

  15. How to Recognize and Avoid Potential, Possible, or Probable Predatory Open-Access Publishers, Standalone, and Hijacked Journals.

    Science.gov (United States)

    Danevska, Lenche; Spiroski, Mirko; Donev, Doncho; Pop-Jordanova, Nada; Polenakovic, Momir

    2016-11-01

    The Internet has enabled an easy method to search through the vast majority of publications and has improved the impact of scholarly journals. However, it can also pose threats to the quality of published articles. New publishers and journals have emerged: so-called potential, possible, or probable predatory open-access publishers and journals, and so-called hijacked journals. Our aim was to increase awareness and warn scholars, especially young researchers, how to recognize these journals and how to avoid submitting their papers to them. We reviewed and critically analyzed the relevant published literature, Internet sources, and the personal experience, thoughts, and observations of the authors. The web blog of Jeffrey Beall, University of Colorado, was consulted extensively. Jeffrey Beall is a Denver academic librarian who regularly maintains two lists: the first of potential, possible, or probable predatory publishers, and the second of potential, possible, or probable predatory standalone journals. Aspects of this topic presented by other authors are discussed as well. Academics should bear in mind how to differentiate between trustworthy, reliable journals and predatory ones, considering publication ethics, the peer-review process, international academic standards, indexing and abstracting, preservation in digital repositories, metrics, sustainability, etc.

  16. Focus in High School Mathematics: Statistics and Probability

    Science.gov (United States)

    National Council of Teachers of Mathematics, 2009

    2009-01-01

    Reasoning about and making sense of statistics and probability are essential to students' future success. This volume belongs to a series that supports National Council of Teachers of Mathematics' (NCTM's) "Focus in High School Mathematics: Reasoning and Sense Making" by providing additional guidance for making reasoning and sense making part of…

  17. E-cigarette openness, curiosity, harm perceptions and advertising exposure among U.S. middle and high school students.

    Science.gov (United States)

    Margolis, Katherine A; Donaldson, Elisabeth A; Portnoy, David B; Robinson, Joelle; Neff, Linda J; Jamal, Ahmed

    2018-07-01

    Understanding the factors associated with youth e-cigarette openness and curiosity is important for assessing the probability of future use. We examined how e-cigarette harm perceptions and advertising exposure are associated with openness and curiosity among tobacco-naive youth. Findings from the 2015 National Youth Tobacco Survey (NYTS) were analyzed. The 2015 NYTS is a nationally representative survey of 17,711 U.S. middle and high school students. We calculated weighted prevalence estimates of never users of tobacco products (cigarettes, cigars/cigarillos/little cigars, waterpipe/hookah, smokeless tobacco, bidis, pipes, dissolvables, e-cigarettes) who were open to or curious about e-cigarette use, by demographics. Weighted regression models examined how e-cigarette harm perceptions and advertising exposure were associated with openness to using e-cigarettes and curiosity about trying e-cigarettes. Among respondents who had never used tobacco products, 23.8% were open to using e-cigarettes and 25.4% were curious. Respondents who perceived that e-cigarettes cause a lot of harm had lower odds of both openness (OR = 0.10, 95% CI = 0.07, 0.15) and curiosity about e-cigarettes (OR = 0.10, 95% CI = 0.07, 0.13) compared to those with lower harm perceptions. Respondents who reported high exposure to e-cigarette advertising in stores had greater odds of being open to e-cigarette use (OR = 1.22, 95% CI = 1.03, 1.44) and highly curious (OR = 1.25, 95% CI = 1.01, 1.53) compared to those not highly exposed. These findings demonstrate that youth exposed to e-cigarette advertising are more likely to be open to and curious about e-cigarette use. They could help public health practitioners better understand the interplay of advertising exposure and harm perceptions with curiosity and openness to e-cigarette use in a rapidly changing marketplace. Published by Elsevier Inc.

  18. Re-assessment of road accident data-analysis policy : applying theory from involuntary, high-consequence, low-probability events like nuclear power plant meltdowns to voluntary, low-consequence, high-probability events like traffic accidents

    Science.gov (United States)

    2002-02-01

    This report examines the literature on involuntary, high-consequence, low-probability (IHL) events like nuclear power plant meltdowns to determine what can be applied to the problem of voluntary, low-consequence, high-probability (VLH) events like traffic accidents.

  19. Excluding joint probabilities from quantum theory

    Science.gov (United States)

    Allahverdyan, Armen E.; Danageozian, Arshag

    2018-03-01

    Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after Born's probability for a single observable. Instead, various definitions have been suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and are consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables in Hilbert spaces of dimension larger than two. If measurement contexts are included in the definition, joint probabilities are not excluded anymore, but they are still constrained by imprecise probabilities.

  20. Domestic wells have high probability of pumping septic tank leachate

    Science.gov (United States)

    Bremer, J. E.; Harter, T.

    2012-08-01

    Onsite wastewater treatment systems are common in rural and semi-rural areas around the world; in the US, about 25-30% of households are served by a septic (onsite) wastewater treatment system, and many property owners also operate their own domestic well nearby. Site-specific conditions and local groundwater flow are often ignored when installing septic systems and wells. In areas with small lots (thus high spatial septic system densities), shallow domestic wells are prone to contamination by septic system leachate. Mass balance approaches have been used to determine a maximum septic system density that would prevent contamination of groundwater resources. In this study, a source area model based on detailed groundwater flow and transport modeling is applied for a stochastic analysis of domestic well contamination by septic leachate. Specifically, we determine the probability that a source area overlaps with a septic system drainfield as a function of aquifer properties, septic system density and drainfield size. We show that high spatial septic system density poses a high probability of pumping septic system leachate. The hydraulic conductivity of the aquifer has a strong influence on the intersection probability. We find that mass balance calculations applied on a regional scale underestimate the contamination risk of individual drinking water wells by septic systems. This is particularly relevant for contaminants released at high concentrations, for substances that experience limited attenuation, and those that are harmful even at low concentrations (e.g., pathogens).
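
    The headline quantity, the probability that a well's source area intersects at least one drainfield as a function of septic-system density, lends itself to a simple Monte Carlo sketch. The geometry below (a rectangular upgradient capture zone and randomly placed drainfields) is a deliberate simplification of the flow-and-transport modeling used in the study.

```python
import numpy as np

rng = np.random.default_rng(11)

def overlap_probability(density_per_km2, trials=5000):
    """P(well source area intersects >= 1 drainfield) on a 1 km x 1 km domain."""
    domain = 1000.0                       # m
    source_area = (400.0, 30.0)           # upgradient capture zone: length x width, m
    drainfield = 15.0                     # drainfield side length, m
    hits = 0
    for _ in range(trials):
        n = rng.poisson(density_per_km2)  # number of septic systems on the km^2
        xy = rng.uniform(0, domain, size=(n, 2))
        # well at domain center; capture zone extends upgradient (+x direction)
        in_x = (xy[:, 0] > 500.0) & (xy[:, 0] < 500.0 + source_area[0] + drainfield)
        in_y = np.abs(xy[:, 1] - 500.0) < (source_area[1] + drainfield) / 2
        hits += bool(np.any(in_x & in_y))
    return hits / trials

for d in (5, 20, 80):                     # septic systems per km^2
    print(d, "systems/km^2 ->", overlap_probability(d))
```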

  1. Domestic wells have high probability of pumping septic tank leachate

    Directory of Open Access Journals (Sweden)

    J. E. Bremer

    2012-08-01

    Full Text Available Onsite wastewater treatment systems are common in rural and semi-rural areas around the world; in the US, about 25–30% of households are served by a septic (onsite) wastewater treatment system, and many property owners also operate their own domestic well nearby. Site-specific conditions and local groundwater flow are often ignored when installing septic systems and wells. In areas with small lots (thus high spatial septic system densities), shallow domestic wells are prone to contamination by septic system leachate. Mass balance approaches have been used to determine a maximum septic system density that would prevent contamination of groundwater resources. In this study, a source area model based on detailed groundwater flow and transport modeling is applied for a stochastic analysis of domestic well contamination by septic leachate. Specifically, we determine the probability that a source area overlaps with a septic system drainfield as a function of aquifer properties, septic system density and drainfield size. We show that high spatial septic system density poses a high probability of pumping septic system leachate. The hydraulic conductivity of the aquifer has a strong influence on the intersection probability. We find that mass balance calculations applied on a regional scale underestimate the contamination risk of individual drinking water wells by septic systems. This is particularly relevant for contaminants released at high concentrations, for substances that experience limited attenuation, and those that are harmful even at low concentrations (e.g., pathogens).

  2. Identifying Changes in the Probability of High Temperature, High Humidity Heat Wave Events

    Science.gov (United States)

    Ballard, T.; Diffenbaugh, N. S.

    2016-12-01

    Understanding how heat waves will respond to climate change is critical for adequate planning and adaptation. While temperature is the primary determinant of heat wave severity, humidity has been shown to play a key role in heat wave intensity with direct links to human health and safety. Here we investigate the individual contributions of temperature and specific humidity to extreme heat wave conditions in recent decades. Using global NCEP-DOE Reanalysis II daily data, we identify regional variability in the joint probability distribution of humidity and temperature. We also identify a statistically significant positive trend in humidity over the eastern U.S. during heat wave events, leading to an increased probability of high humidity, high temperature events. The extent to which we can expect this trend to continue under climate change is complicated due to variability between CMIP5 models, in particular among projections of humidity. However, our results support the notion that heat wave dynamics are characterized by more than high temperatures alone, and understanding and quantifying the various components of the heat wave system is crucial for forecasting future impacts.
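
    A minimal way to quantify an increased probability of high-humidity, high-temperature events is to track the yearly joint exceedance frequency and fit a linear trend. The series below is synthetic; in the study the inputs are NCEP-DOE Reanalysis II daily fields.

```python
import numpy as np

rng = np.random.default_rng(5)
years, days = 35, 92                       # e.g., summer days per year
t = rng.normal(30, 3, size=(years, days))  # daily max temperature, deg C
q = rng.normal(14, 2, size=(years, days)) + np.linspace(0, 1.5, years)[:, None]
# ^ synthetic specific humidity (g/kg) with an imposed upward trend

t_thresh, q_thresh = np.percentile(t, 90), np.percentile(q, 90)
joint_p = ((t > t_thresh) & (q > q_thresh)).mean(axis=1)  # yearly joint probability

slope, intercept = np.polyfit(np.arange(years), joint_p, 1)
print(f"trend in joint exceedance probability: {slope * 10:.4f} per decade")
```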

  3. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions.

    Science.gov (United States)

    Dinov, Ivo D; Siegrist, Kyle; Pearl, Dennis K; Kalinin, Alexandr; Christou, Nicolas

    2016-06-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome, which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the

  4. Increasing Classroom Compliance: Using a High-Probability Command Sequence with Noncompliant Students

    Science.gov (United States)

    Axelrod, Michael I.; Zank, Amber J.

    2012-01-01

    Noncompliance is one of the most problematic behaviors within the school setting. One strategy to increase compliance of noncompliant students is a high-probability command sequence (HPCS; i.e., a set of simple commands in which an individual is likely to comply immediately prior to the delivery of a command that has a lower probability of…

  5. Multidetector computed tomographic pulmonary angiography in patients with a high clinical probability of pulmonary embolism.

    Science.gov (United States)

    Moores, L; Kline, J; Portillo, A K; Resano, S; Vicente, A; Arrieta, P; Corres, J; Tapson, V; Yusen, R D; Jiménez, D

    2016-01-01

    ESSENTIALS: When high probability of pulmonary embolism (PE), sensitivity of computed tomography (CT) is unclear. We investigated the sensitivity of multidetector CT among 134 patients with a high probability of PE. A normal CT alone may not safely exclude PE in patients with a high clinical pretest probability. In patients with no clear alternative diagnosis after CTPA, further testing should be strongly considered. Whether patients with a negative multidetector computed tomographic pulmonary angiography (CTPA) result and a high clinical pretest probability of pulmonary embolism (PE) should be further investigated is controversial. This was a prospective investigation of the sensitivity of multidetector CTPA among patients with a priori clinical assessment of a high probability of PE according to the Wells criteria. Among patients with a negative CTPA result, the diagnosis of PE required at least one of the following conditions: ventilation/perfusion lung scan showing a high probability of PE in a patient with no history of PE, abnormal findings on venous ultrasonography in a patient without previous deep vein thrombosis at that site, or the occurrence of venous thromboembolism (VTE) in a 3-month follow-up period after anticoagulation was withheld because of a negative multidetector CTPA result. We identified 498 patients with a priori clinical assessment of a high probability of PE and a completed CTPA study. CTPA excluded PE in 134 patients; in these patients, the pooled incidence of VTE was 5.2% (seven of 134 patients; 95% confidence interval [CI] 1.5-9.0). Five patients had VTEs that were confirmed by an additional imaging test despite a negative CTPA result (five of 48 patients; 10.4%; 95% CI 1.8-19.1), and two patients had objectively confirmed VTEs that occurred during clinical follow-up of at least 3 months (two of 86 patients; 2.3%; 95% CI 0-5.5). None of the patients had a fatal PE during follow-up. A normal multidetector CTPA result alone may not safely

  6. High But Not Low Probability of Gain Elicits a Positive Feeling Leading to the Framing Effect

    Science.gov (United States)

    Gosling, Corentin J.; Moutier, Sylvain

    2017-01-01

    Human risky decision-making is known to be highly susceptible to profit-motivated responses elicited by the way in which options are framed. In fact, studies investigating the framing effect have shown that the choice between sure and risky options depends on how these options are presented. Interestingly, the probability of gain of the risky option has been highlighted as one of the main factors causing variations in susceptibility to the framing effect. However, while it has been shown that high probabilities of gain of the risky option systematically lead to framing bias, questions remain about the influence of low probabilities of gain. Therefore, the first aim of this paper was to clarify the respective roles of high and low probabilities of gain in the framing effect. Due to the difference between studies using a within- or between-subjects design, we conducted a first study investigating the respective roles of these designs. For both designs, we showed that trials with a high probability of gain led to the framing effect whereas those with a low probability did not. Second, as emotions are known to play a key role in the framing effect, we sought to determine whether they are responsible for such a debiasing effect of the low probability of gain. Our second study thus investigated the relationship between emotion and the framing effect depending on high and low probabilities. Our results revealed that positive emotion was related to risk-seeking in the loss frame, but only for trials with a high probability of gain. Taken together, these results support the interpretation that low probabilities of gain suppress the framing effect because they prevent the positive emotion of gain anticipation. PMID:28232808

  7. High But Not Low Probability of Gain Elicits a Positive Feeling Leading to the Framing Effect.

    Science.gov (United States)

    Gosling, Corentin J; Moutier, Sylvain

    2017-01-01

    Human risky decision-making is known to be highly susceptible to profit-motivated responses elicited by the way in which options are framed. In fact, studies investigating the framing effect have shown that the choice between sure and risky options depends on how these options are presented. Interestingly, the probability of gain of the risky option has been highlighted as one of the main factors causing variations in susceptibility to the framing effect. However, while it has been shown that high probabilities of gain of the risky option systematically lead to framing bias, questions remain about the influence of low probabilities of gain. Therefore, the first aim of this paper was to clarify the respective roles of high and low probabilities of gain in the framing effect. Due to the difference between studies using a within- or between-subjects design, we conducted a first study investigating the respective roles of these designs. For both designs, we showed that trials with a high probability of gain led to the framing effect whereas those with a low probability did not. Second, as emotions are known to play a key role in the framing effect, we sought to determine whether they are responsible for such a debiasing effect of the low probability of gain. Our second study thus investigated the relationship between emotion and the framing effect depending on high and low probabilities. Our results revealed that positive emotion was related to risk-seeking in the loss frame, but only for trials with a high probability of gain. Taken together, these results support the interpretation that low probabilities of gain suppress the framing effect because they prevent the positive emotion of gain anticipation.

  8. Alternative probability theories for cognitive psychology.

    Science.gov (United States)

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily Boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling. Copyright © 2013 Cognitive Science Society, Inc.

  9. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  10. The probability distribution of intergranular stress corrosion cracking life for sensitized 304 stainless steels in high temperature, high purity water

    International Nuclear Information System (INIS)

    Akashi, Masatsune; Kenjyo, Takao; Matsukura, Shinji; Kawamoto, Teruaki

    1984-01-01

    In order to discuss the probability distribution of intergranular stress corrosion cracking life for sensitized 304 stainless steels, a series of creviced bent beam (CBB) and uni-axial constant load tests were carried out in oxygenated high temperature, high purity water. The following conclusions were reached: (1) The initiation process of intergranular stress corrosion cracking can be assumed to be approximated by a Poisson stochastic process, based on the CBB test results. (2) The probability distribution of intergranular stress corrosion cracking life may consequently be approximated by the exponential probability distribution. (3) The experimental data could be fitted to the exponential probability distribution. (author)
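
    Under conclusion (2), Poisson initiation implies exponentially distributed SCC life, so fitting the distribution to observed initiation times reduces to a single parameter. A sketch with synthetic data (not the CBB measurements):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
lives = rng.exponential(scale=800.0, size=60)   # synthetic crack-initiation times, h

# MLE for the exponential distribution: rate = 1 / sample mean
rate = 1.0 / lives.mean()
print(f"estimated rate: {rate:.2e} per hour")

# Goodness of fit vs. the fitted exponential; since the parameter is estimated
# from the same data, the Kolmogorov-Smirnov p-value is only approximate.
print(stats.kstest(lives, "expon", args=(0, 1.0 / rate)))

# Survival (reliability) at 1000 h under the fitted model
print(np.exp(-rate * 1000.0))
```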

  11. Posterior Probability Matching and Human Perceptual Decision Making.

    Directory of Open Access Journals (Sweden)

    Richard F Murray

    2015-06-01

    Full Text Available Probability matching is a classic theory of decision making that was first developed in models of cognition. Posterior probability matching, a variant in which observers match their response probabilities to the posterior probability of each response being correct, is being used increasingly often in models of perception. However, little is known about whether posterior probability matching is consistent with the vast literature on vision and hearing that has developed within signal detection theory. Here we test posterior probability matching models using two tools from detection theory. First, we examine the models' performance in a two-pass experiment, where each block of trials is presented twice, and we measure the proportion of times that the model gives the same response twice to repeated stimuli. We show that at low performance levels, posterior probability matching models give highly inconsistent responses across repeated presentations of identical trials. We find that practised human observers are more consistent across repeated trials than these models predict, and we find some evidence that less practised observers are more consistent as well. Second, we compare the performance of posterior probability matching models on a discrimination task to the performance of a theoretical ideal observer that achieves the best possible performance. We find that posterior probability matching is very inefficient at low-to-moderate performance levels, and that human observers can be more efficient than is ever possible according to posterior probability matching models. These findings support classic signal detection models, and rule out a broad class of posterior probability matching models for expert performance on perceptual tasks that range in complexity from contrast discrimination to symmetry detection. However, our findings leave open the possibility that inexperienced observers may show posterior probability matching behaviour, and our methods
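
    The two-pass argument can be reproduced in a few lines: a posterior-probability-matching observer samples each response from the posterior, so at low performance it often answers the same trial differently on the two passes, while a deterministic maximum-a-posteriori observer is perfectly self-consistent. This toy signal-detection version is our own construction, not the authors' experimental code.

```python
import numpy as np

rng = np.random.default_rng(9)
n_trials, d_prime = 2000, 0.5            # low-performance regime

signal = rng.integers(0, 2, n_trials)    # which alternative is present on each trial
x = rng.normal(signal * d_prime, 1.0)    # one noisy observation per trial

# Posterior P(signal = 1 | x) for equal priors and unit-variance Gaussians
post = 1 / (1 + np.exp(-d_prime * (x - d_prime / 2)))

# Two passes over the SAME stimuli: the matching observer samples its responses
resp1 = rng.random(n_trials) < post
resp2 = rng.random(n_trials) < post
print("matching consistency:", (resp1 == resp2).mean())   # well below 1 at low d'

map_resp = post > 0.5                    # MAP observer: deterministic, consistency = 1
print("accuracy (matching vs MAP):", (resp1 == signal).mean(), (map_resp == signal).mean())
```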

  12. Experimental studies on a new highly porous hydroxyapatite matrix for obliterating open mastoid cavities.

    Science.gov (United States)

    Punke, Christoph; Zehlicke, Thorsten; Boltze, Carsten; Pau, Hans Wilhelm

    2008-09-01

    In an initial preliminary study, the applicability of a new high-porosity hydroxyapatite (HA) ceramic for obliterating large open mastoid cavities was tested and demonstrated in an animal model (bulla of guinea pig). Experimental study. NanoBone, a highly porous matrix consisting of 76% hydroxyapatite and 24% silicon dioxide fabricated by a sol-gel technique, was administered unilaterally into the opened bullae of 30 guinea pigs. In each animal, the opposite bulla was filled with Bio-Oss, a bone substitute consisting of a portion of mineral bovine bone. Histologic evaluations were performed 1, 2, 3, 4, 5, and 12 weeks after the implantation. After an initial phase in which the ceramic granules were surrounded by inflammatory cells (1-2 wk), there were increasing signs of vascularization. Osteoneogenesis and, at the same time, resorption of the HA ceramic were observed after the third week. No major difference in comparison to the bovine bone material could be found. Our results confirm the favorable qualities of the new ceramic reported in the current maxillofacial literature. Conventional HA granules used for mastoid obliteration to date have often shown problems with prolonged inflammatory reactions and, finally, extrusions. In contrast to those ceramics, the new material seems to induce more osteoneogenesis and undergoes early resorption, probably due to its high porosity. Overall, it is similar to the bovine bone substance tested in the opposite ear of each animal. Further clinical studies may reveal whether NanoBone is an adequate material for obliterating open mastoid cavities in patients.

  13. Interpretation of highly visual 'open' advertisements in Dutch magazines

    NARCIS (Netherlands)

    Ketelaar, P.E.; Gisbergen, M.S.; Beentjes, J.

    2012-01-01

    In recent decades magazine advertisers have used an increasing number of highly visual open ads. Open ads do not guide consumers toward a specific interpretation as traditional ads do. An experiment was carried out to establish the effects of openness on interpretation. As expected, openness was

  14. Probability in High Dimension

    Science.gov (United States)

    2014-06-30


  15. High temperature triggers latent variation among individuals: oviposition rate and probability for outbreaks.

    Directory of Open Access Journals (Sweden)

    Christer Björkman

    2011-01-01

    Full Text Available It is anticipated that extreme population events, such as extinctions and outbreaks, will become more frequent as a consequence of climate change. To evaluate the increased probability of such events, it is crucial to understand the mechanisms involved. Variation between individuals in their response to climatic factors is an important consideration, especially if microevolution is expected to change the composition of populations. Here we present data on a willow leaf beetle species showing high variation among individuals in oviposition rate at a high temperature (20 °C). It is particularly noteworthy that not all individuals responded to changes in temperature; individuals laying few eggs at 20 °C continued to do so when transferred to 12 °C, whereas individuals that laid many eggs at 20 °C reduced their oviposition and laid the same number of eggs as the others when transferred to 12 °C. When transferred back to 20 °C, most individuals reverted to their original oviposition rate. Thus, high variation among individuals was only observed at the higher temperature. Using a simple population model and based on regional climate change scenarios, we show that the probability of outbreaks increases if there is a realistic increase in the number of warm summers. The probability of outbreaks also increased with increasing heritability of the ability to respond to increased temperature. If the climate becomes warmer and there is latent variation among individuals in their temperature response, the probability of outbreaks may increase. However, the likelihood of microevolution playing a role may be low. This conclusion is based on the fact that it has been difficult to show that microevolution affects the probability of extinctions. Our results highlight the need for caution when predicting future probabilities of extreme population events.

  16. Maximum Entropy and Probability Kinematics Constrained by Conditionals

    Directory of Open Access Journals (Sweden)

    Stefan Lukits

    2015-03-01

    Full Text Available Two open questions of inductive reasoning are solved: (1) does the principle of maximum entropy (PME) give a solution to the obverse Majerník problem; and (2) is Wagner correct when he claims that Jeffrey's updating principle (JUP) contradicts PME? Majerník shows that PME provides unique and plausible marginal probabilities, given conditional probabilities. The obverse problem posed here is whether PME also provides such conditional probabilities, given certain marginal probabilities. The theorem developed to solve the obverse Majerník problem demonstrates that in the special case introduced by Wagner, PME does not contradict JUP, but elegantly generalizes it and offers a more integrated approach to probability updating.
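
    The special case referred to at the end can be checked numerically: among all distributions with prescribed masses on a partition, the Jeffrey update of a prior minimizes the KL divergence to that prior, which is the relative-entropy form of the PME solution. A small sketch:

```python
import numpy as np

rng = np.random.default_rng(1)
prior = rng.dirichlet(np.ones(6))                     # prior over 6 outcomes
cells = [np.array([0, 1, 2]), np.array([3, 4, 5])]    # a partition {B1, B2}
new_mass = np.array([0.7, 0.3])                       # prescribed posterior cell masses

def kl(q, p):
    return np.sum(q * np.log(q / p))

# Jeffrey update: keep the conditionals within each cell, rescale the cell masses
jeffrey = prior.copy()
for cell, m in zip(cells, new_mass):
    jeffrey[cell] = prior[cell] / prior[cell].sum() * m

# Any competitor with the same cell masses has a larger KL divergence to the prior
for _ in range(5):
    q = prior.copy()
    for cell, m in zip(cells, new_mass):
        q[cell] = rng.dirichlet(np.ones(len(cell))) * m
    print(kl(jeffrey, prior) <= kl(q, prior))          # True every time
```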

  17. The Effect of High Frequency Pulse on the Discharge Probability in Micro EDM

    Science.gov (United States)

    Liu, Y.; Qu, Y.; Zhang, W.; Ma, F.; Sha, Z.; Wang, Y.; Rolfe, B.; Zhang, S.

    2017-12-01

    High-frequency pulses improve the machining efficiency of micro electrical discharge machining (micro EDM), but they also change the micro EDM process. This paper focuses on the influence of the skin effect under high-frequency pulses on energy distribution and transmission in micro EDM and, on that basis, analyses the resulting discharge probability over the electrode end face. Starting from the electrical discharge process under high-frequency pulses in micro EDM, COMSOL Multiphysics software is used to establish an energy transmission model within the micro electrode. The discharge energy distribution and transmission within the tool electrode under different pulse frequencies, electrical currents, and permeability conditions are studied in order to obtain the distribution patterns of current density and electric field intensity on the electrode end face as the electrical parameters change. The electric field intensity distribution is taken as the parameter governing the discharge probability at the electrode end face. Finally, MATLAB is used to fit the curves and obtain the distribution of discharge probability over the electrode end face.
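
    The skin effect invoked here is governed by the standard skin-depth formula δ = sqrt(2/(ωμσ)): at higher pulse frequencies the current crowds into a thinner surface layer of the tool electrode, redistributing current density over the end face. A quick order-of-magnitude check (copper electrode values assumed for illustration):

```python
import math

def skin_depth(freq_hz, conductivity, mu_r=1.0):
    """Skin depth delta = sqrt(2 / (omega * mu * sigma)), in metres."""
    mu0 = 4e-7 * math.pi
    omega = 2 * math.pi * freq_hz
    return math.sqrt(2 / (omega * mu_r * mu0 * conductivity))

sigma_cu = 5.8e7  # S/m, copper
for f in (1e3, 1e5, 1e6, 1e7):   # pulse frequencies, Hz
    print(f"{f:.0e} Hz -> {skin_depth(f, sigma_cu) * 1e6:.1f} um")
```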

  18. Resveratrol enhances airway surface liquid depth in sinonasal epithelium by increasing cystic fibrosis transmembrane conductance regulator open probability.

    Directory of Open Access Journals (Sweden)

    Shaoyan Zhang

    Full Text Available Chronic rhinosinusitis engenders enormous morbidity in the general population, and is often refractory to medical intervention. Compounds that augment mucociliary clearance in airway epithelia represent a novel treatment strategy for diseases of mucus stasis. A dominant fluid and electrolyte secretory pathway in the nasal airways is governed by the cystic fibrosis transmembrane conductance regulator (CFTR). The objectives of the present study were to test resveratrol, a strong potentiator of CFTR channel open probability, in preparation for a clinical trial of mucociliary activators in human sinus disease. Primary sinonasal epithelial cells, immortalized bronchoepithelial cells (wild type and F508del CFTR), and HEK293 cells expressing exogenous human CFTR were investigated by Ussing chamber as well as patch clamp technique under non-phosphorylating conditions. Effects on airway surface liquid depth were measured using confocal laser scanning microscopy. Impact on CFTR gene expression was measured by quantitative reverse transcriptase polymerase chain reaction. Resveratrol is a robust CFTR channel potentiator in numerous mammalian species. The compound also activated temperature-corrected F508del CFTR and enhanced CFTR-dependent chloride secretion in human sinus epithelium ex vivo to an extent comparable to the recently approved CFTR potentiator, ivacaftor. Using inside-out patches from apical membranes of murine cells, resveratrol stimulated an ~8 picosiemens chloride channel consistent with CFTR. This observation was confirmed in HEK293 cells expressing exogenous CFTR. Treatment of sinonasal epithelium resulted in a significant increase in airway surface liquid depth (in µm: 8.08 ± 1.68 vs. 6.11 ± 0.47 for control, p<0.05). There was no increase in CFTR mRNA. Resveratrol is a potent chloride secretagogue from the mucosal surface of sinonasal epithelium, and hydrates airway surface liquid by increasing CFTR channel open probability. The foundation for a

  19. Fast-opening vacuum switches for high-power inductive energy storage

    International Nuclear Information System (INIS)

    Cooperstein, G.

    1988-01-01

    The subject of fast-opening vacuum switches for high-power inductive energy storage is emerging as an exciting new area of plasma science research. This opening switch technology, which generally involves the use of plasmas as the switching medium, is key to the development of inductive energy storage techniques for pulsed power which have a number of advantages over conventional capacitive techniques with regard to cost and size. This paper reviews the state of the art in this area with emphasis on applications to inductive storage pulsed power generators. Discussion focuses on fast-opening vacuum switches capable of operating at high power (≥10¹² W). These include plasma erosion opening switches, ion beam opening switches, plasma filled diodes, reflex diodes, plasma flow switches, and other novel vacuum opening switches

  20. Sudden transition and sudden change from open spin environments

    International Nuclear Information System (INIS)

    Hu, Zheng-Da; Xu, Jing-Bo; Yao, Dao-Xin

    2014-01-01

    We investigate the necessary conditions for the existence of the sudden transition or sudden change phenomenon for appropriate initial states under dephasing. As illustrative examples, we study the behaviors of quantum correlation dynamics of two noninteracting qubits in independent and common open spin environments, respectively. For the independent environments case, we find that the quantum correlation dynamics is closely related to the Loschmidt echo and the dynamics exhibits a sudden transition from classical to quantum correlation decay. It is also shown that the sudden change phenomenon may occur for the common environment case, and stationary quantum discord is found in the high-temperature region of the environment. Finally, we investigate the quantum criticality of the open spin environment by exploring the probability distribution of the Loschmidt echo and the scaling transformation behavior of quantum discord, respectively.
    Highlights:
    • Sudden transition and sudden change from open spin baths are studied.
    • Quantum discord is related to the Loschmidt echo in independent open spin baths.
    • Steady quantum discord is found in a common open spin bath.
    • The probability distribution of the Loschmidt echo is analyzed.
    • The scaling transformation behavior of quantum discord is displayed.

  1. Early diagnosis and research of high myopia with primary open angle glaucoma

    Directory of Open Access Journals (Sweden)

    Yan Guo

    2014-04-01

    Full Text Available People with high myopia are at high risk of developing primary open angle glaucoma, and clinical experience shows that the two conditions are closely related. Understanding the clinical features of high myopia with primary open angle glaucoma and the importance of early diagnosis can therefore raise clinicians' vigilance, improve early diagnosis, and help avoid missed diagnoses and reduce the misdiagnosis rate. In this paper, the clinical features of high myopia with primary open angle glaucoma and the research progress on the main points of early diagnosis are reviewed.

  2. Dynamic Open Inquiry Performances of High-School Biology Students

    Science.gov (United States)

    Zion, Michal; Sadeh, Irit

    2010-01-01

    In examining open inquiry projects among high-school biology students, we found dynamic inquiry performances expressed in two criteria: "changes occurring during inquiry" and "procedural understanding". Characterizing performances in a dynamic open inquiry project can shed light on both the procedural and epistemological…

  3. Computing under-ice discharge: A proof-of-concept using hydroacoustics and the Probability Concept

    Science.gov (United States)

    Fulton, John W.; Henneberg, Mark F.; Mills, Taylor J.; Kohn, Michael S.; Epstein, Brian; Hittle, Elizabeth A.; Damschen, William C.; Laveau, Christopher D.; Lambrecht, Jason M.; Farmer, William H.

    2018-01-01

    …cover. For example, during lower discharges dominated by under-ice and transition (intermittent open-water and under-ice) conditions, the KS metric suggests there is not sufficient information to reject the null hypothesis and implies that the Probability Concept and index-velocity rating represent similar distributions. During high-flow, open-water conditions, the comparisons are less definitive; therefore, it is important that the appropriate analytical method and instrumentation be selected. Six conventional discharge measurements were collected concurrently with Probability Concept-derived discharges, with percent differences of −9.0%, −21%, −8.6%, 17.8%, 3.6%, and −2.3%. This proof-of-concept demonstrates that riverine discharges can be computed using the Probability Concept for a range of hydraulic extremes (variations in discharge, open-water and under-ice conditions) immediately after the siting phase is complete, which typically requires one day. Computing real-time discharges is particularly important at sites where (1) new streamgages are planned, (2) river hydraulics are complex, and (3) shifts in the stage-discharge rating are needed to correct the streamflow record. Use of the Probability Concept does not preclude the need to maintain a stage-area relation. Both the Probability Concept and index-velocity rating offer water-resource managers and decision makers alternatives for computing real-time discharge for open-water and under-ice conditions.
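
    For context, a minimal sketch of the discharge computation that underlies the Probability Concept (Chiu's probabilistic velocity distribution): the cross-sectional mean velocity is tied to the sensed maximum velocity by ū = φ(M)·u_max, with φ(M) = e^M/(e^M − 1) − 1/M, and discharge follows from the stage-area relation as Q = ū·A. The M, u_max, and A values below are illustrative, not taken from the study.

```python
import math

def phi(M: float) -> float:
    """Chiu's mean-to-maximum velocity ratio, phi(M) = e^M/(e^M - 1) - 1/M."""
    return math.exp(M) / (math.exp(M) - 1.0) - 1.0 / M

# Illustrative inputs: site-calibrated entropy parameter M, hydroacoustically
# sensed maximum velocity u_max (m/s), and area A (m^2) from the stage-area relation.
M, u_max, area = 2.1, 1.8, 42.0
u_mean = phi(M) * u_max
print(f"phi(M) = {phi(M):.3f}, u_mean = {u_mean:.2f} m/s, Q = {u_mean * area:.1f} m^3/s")
```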

  4. News from the Library: Publishing Open Access articles beyond High Energy Physics

    CERN Multimedia

    CERN Library

    2012-01-01

    CERN has supported Open Access Publishing for many years, and the Scientific Information Service is working to implement this vision. We have just launched the flagship project SCOAP3 (Sponsoring Consortium for Open Access Publishing in Particle Physics) aimed at converting high-quality journals in High Energy Physics to Open Access for articles published as of 2014. More details here.   In parallel, several win-win arrangements allow experimental and theoretical high-energy physics results from CERN to be published in Open Access in a variety of high-impact journals. More information can be found here. Open Access publishing at CERN goes far beyond High Energy Physics. Indeed, CERN is a key supporter of Open Access in accelerator science, through sponsorship of the APS journal PRSTAB and participation in the JACoW collaboration. Now CERN authors publishing in the field of engineering will also have th...

  5. Probabilistic analysis of Millstone Unit 3 ultimate containment failure probability given high pressure: Chapter 14

    International Nuclear Information System (INIS)

    Bickel, J.H.

    1983-01-01

    The quantification of the containment event trees in the Millstone Unit 3 Probabilistic Safety Study utilizes a conditional probability of failure given high pressure which is based on a new approach. The generation of this conditional probability was based on a weakest-link failure mode model which considered contributions from a number of overlapping failure modes. This overlap effect was due to a number of failure modes whose mean failure pressures were clustered within a 5 psi range and whose uncertainties, due to variance in material strengths and analytical uncertainty, were between 9 and 15 psi. Based on a review of possible probability laws to describe the failure probability of individual structural failure modes, it was determined that a Weibull probability law most adequately described the randomness in the physical process of interest. The resultant conditional probability of failure is found to have a median failure pressure of 132.4 psia. The corresponding 5-95 percentile values are 112 psia and 146.7 psia, respectively. The skewed nature of the conditional probability of failure vs. pressure results in a lower overall containment failure probability for an appreciable number of the severe accident sequences of interest, but also in probabilities which are more rigorously traceable from first principles.
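
    As a quick consistency check on the quoted quantiles, a two-parameter Weibull law x_p = λ(−ln(1−p))^(1/k) can be solved in closed form from the 5th and 95th percentiles; the shape and scale below are derived here for illustration, and the study's actual parametrization may differ (e.g. a shifted Weibull).

```python
import math

x05, x95 = 112.0, 146.7   # 5th/95th percentile failure pressures from the abstract (psia)

# The quantile ratio fixes the shape k; either quantile then gives the scale lam.
k = math.log(math.log(0.05) / math.log(0.95)) / math.log(x95 / x05)
lam = x05 / (-math.log(0.95)) ** (1.0 / k)

median = lam * math.log(2.0) ** (1.0 / k)
print(f"shape k = {k:.1f}, scale = {lam:.1f} psia, implied median = {median:.1f} psia")
# implied median ~133 psia, consistent with the reported 132.4 psia
```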

  6. Long-term survival in laparoscopic vs open resection for colorectal liver metastases: inverse probability of treatment weighting using propensity scores.

    Science.gov (United States)

    Lewin, Joel W; O'Rourke, Nicholas A; Chiow, Adrian K H; Bryant, Richard; Martin, Ian; Nathanson, Leslie K; Cavallucci, David J

    2016-02-01

    This study compares long-term outcomes between intention-to-treat laparoscopic and open approaches to colorectal liver metastases (CLM), using inverse probability of treatment weighting (IPTW) based on propensity scores to control for selection bias. Patients undergoing liver resection for CLM by 5 surgeons at 3 institutions from 2000 to early 2014 were analysed. IPTW based on propensity scores was generated and used to assess the marginal treatment effect of the laparoscopic approach via a weighted Cox proportional hazards model. A total of 298 operations were performed in 256 patients. Seven patients with planned two-stage resections were excluded, leaving 284 operations in 249 patients for analysis. After IPTW, the population was well balanced. With a median follow-up of 36 months, 5-year overall survival (OS) and recurrence-free survival (RFS) for the cohort were 59% and 38%. 146 laparoscopic procedures were performed in 140 patients, with weighted 5-year OS and RFS of 54% and 36%, respectively. In the open group, 138 procedures were performed in 122 patients, with weighted 5-year OS and RFS of 63% and 38%, respectively. There was no significant difference between the two groups in terms of OS or RFS. In the Brisbane experience, after accounting for bias in treatment assignment, long-term survival after LLR for CLM is equivalent to outcomes in open surgery. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.
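
    A minimal sketch of the IPTW step described above, with scikit-learn fitting the propensity model; the covariates are hypothetical stand-ins, not variables from the study.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical covariates; `laparoscopic` is the treatment indicator (1 = laparoscopic).
df = pd.DataFrame({
    "age":             [62, 58, 55, 68, 49, 52],
    "tumour_count":    [1, 3, 2, 1, 4, 2],
    "major_resection": [0, 1, 1, 0, 1, 0],
    "laparoscopic":    [1, 0, 1, 0, 1, 0],
})

# Propensity score: P(laparoscopic = 1 | covariates)
X = df[["age", "tumour_count", "major_resection"]]
ps = LogisticRegression().fit(X, df["laparoscopic"]).predict_proba(X)[:, 1]

# Inverse probability of treatment weights: 1/ps for treated, 1/(1 - ps) for controls.
df["iptw"] = np.where(df["laparoscopic"] == 1, 1.0 / ps, 1.0 / (1.0 - ps))
print(df[["laparoscopic", "iptw"]])

# These weights would then enter a weighted Cox proportional hazards model,
# e.g. lifelines' CoxPHFitter(...).fit(..., weights_col="iptw").
```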

  7. Analytic methods in applied probability in memory of Fridrikh Karpelevich

    CERN Document Server

    Suhov, Yu M

    2002-01-01

    This volume is dedicated to F. I. Karpelevich, an outstanding Russian mathematician who made important contributions to applied probability theory. The book contains original papers focusing on several areas of applied probability and its uses in modern industrial processes, telecommunications, computing, mathematical economics, and finance. It opens with a review of Karpelevich's contributions to applied probability theory and includes a bibliography of his works. Other articles discuss queueing network theory, in particular, in heavy traffic approximation (fluid models). The book is suitable

  8. Spatial probability aids visual stimulus discrimination

    Directory of Open Access Journals (Sweden)

    Michael Druker

    2010-08-01

    Full Text Available We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid for visual stimulus discrimination, and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.

  9. VOLCANIC RISK ASSESSMENT - PROBABILITY AND CONSEQUENCES

    International Nuclear Information System (INIS)

    G.A. Valentine; F.V. Perry; S. Dartevelle

    2005-01-01

    Risk is the product of the probability and consequences of an event. Both of these must be based upon sound science that integrates field data, experiments, and modeling, but must also be useful to decision makers who likely do not understand all aspects of the underlying science. We review a decision framework used in many fields such as performance assessment for hazardous and/or radioactive waste disposal sites that can serve to guide the volcanological community towards integrated risk assessment. In this framework the underlying scientific understanding of processes that affect probability and consequences drive the decision-level results, but in turn these results can drive focused research in areas that cause the greatest level of uncertainty at the decision level. We review two examples of the determination of volcanic event probability: (1) probability of a new volcano forming at the proposed Yucca Mountain radioactive waste repository, and (2) probability that a subsurface repository in Japan would be affected by the nearby formation of a new stratovolcano. We also provide examples of work on consequences of explosive eruptions, within the framework mentioned above. These include field-based studies aimed at providing data for "closure" of wall rock erosion terms in a conduit flow model, predictions of dynamic pressure and other variables related to damage by pyroclastic flow into underground structures, and vulnerability criteria for structures subjected to conditions of explosive eruption. Process models (e.g., multiphase flow) are important for testing the validity or relative importance of possible scenarios in a volcanic risk assessment. We show how time-dependent multiphase modeling of explosive "eruption" of basaltic magma into an open tunnel (drift) at the Yucca Mountain repository provides insight into proposed scenarios that include the development of secondary pathways to the Earth's surface. Addressing volcanic risk within a decision

  10. Innovative and high quality education through Open Education and OER

    OpenAIRE

    Stracke, Christian M.

    2017-01-01

    Online presentation and webinar by Stracke, C. M. (2017, 18 December) on "Innovative and high quality education through Open Education and OER" for the Belt and Road Open Education Learning Week by the Beijing Normal University, China.

  11. Probabilistic Open Set Recognition

    Science.gov (United States)

    Jain, Lalit Prithviraj

    Real-world tasks in computer vision, pattern recognition and machine learning often touch upon the open set recognition problem: multi-class recognition with incomplete knowledge of the world and many unknown inputs. An obvious way to approach such problems is to develop a recognition system that thresholds probabilities to reject unknown classes. Traditional rejection techniques are not about the unknown; they are about the uncertain boundary and rejection around that boundary. Thus traditional techniques only represent the "known unknowns". However, a proper open set recognition algorithm is needed to reduce the risk from the "unknown unknowns". This dissertation examines this concept and finds existing probabilistic multi-class recognition approaches are ineffective for true open set recognition. We hypothesize the cause is due to weak ad hoc assumptions combined with closed-world assumptions made by existing calibration techniques. Intuitively, if we could accurately model just the positive data for any known class without overfitting, we could reject the large set of unknown classes even under this assumption of incomplete class knowledge. For this, we formulate the problem as one of modeling positive training data by invoking statistical extreme value theory (EVT) near the decision boundary of positive data with respect to negative data. We provide a new algorithm called the PI-SVM for estimating the unnormalized posterior probability of class inclusion. This dissertation also introduces a new open set recognition model called Compact Abating Probability (CAP), where the probability of class membership decreases in value (abates) as points move from known data toward open space. We show that CAP models improve open set recognition for multiple algorithms. Leveraging the CAP formulation, we go on to describe the novel Weibull-calibrated SVM (W-SVM) algorithm, which combines the useful properties of statistical EVT for score calibration with one-class and binary

  12. From axiomatics of quantum probability to modelling geological uncertainty and management of intelligent hydrocarbon reservoirs with the theory of open quantum systems

    Science.gov (United States)

    Lozada Aguilar, Miguel Ángel; Khrennikov, Andrei; Oleschko, Klaudia

    2018-04-01

    As was recently shown by the authors, quantum probability theory can be used for the modelling of the process of decision-making (e.g. probabilistic risk analysis) for macroscopic geophysical structures such as hydrocarbon reservoirs. This approach can be considered as a geophysical realization of Hilbert's programme on axiomatization of statistical models in physics (the famous sixth Hilbert problem). In this conceptual paper, we continue development of this approach to decision-making under uncertainty which is generated by complexity, variability, heterogeneity, anisotropy, as well as the restrictions to accessibility of subsurface structures. The belief state of a geological expert about the potential of exploring a hydrocarbon reservoir is continuously updated by outputs of measurements, and selection of mathematical models and scales of numerical simulation. These outputs can be treated as signals from the information environment E. The dynamics of the belief state can be modelled with the aid of the theory of open quantum systems: a quantum state (representing uncertainty in beliefs) is dynamically modified through coupling with E; stabilization to a steady state determines a decision strategy. In this paper, the process of decision-making about hydrocarbon reservoirs (e.g. 'explore or not?'; 'open new well or not?'; 'contaminated by water or not?'; 'double or triple porosity medium?') is modelled by using the Gorini-Kossakowski-Sudarshan-Lindblad equation. In our model, this equation describes the evolution of experts' predictions about a geophysical structure. We proceed with the information approach to quantum theory and the subjective interpretation of quantum probabilities (due to quantum Bayesianism). This article is part of the theme issue 'Hilbert's sixth problem'.

  13. From axiomatics of quantum probability to modelling geological uncertainty and management of intelligent hydrocarbon reservoirs with the theory of open quantum systems.

    Science.gov (United States)

    Lozada Aguilar, Miguel Ángel; Khrennikov, Andrei; Oleschko, Klaudia

    2018-04-28

    As was recently shown by the authors, quantum probability theory can be used for the modelling of the process of decision-making (e.g. probabilistic risk analysis) for macroscopic geophysical structures such as hydrocarbon reservoirs. This approach can be considered as a geophysical realization of Hilbert's programme on axiomatization of statistical models in physics (the famous sixth Hilbert problem). In this conceptual paper, we continue development of this approach to decision-making under uncertainty which is generated by complexity, variability, heterogeneity, anisotropy, as well as the restrictions to accessibility of subsurface structures. The belief state of a geological expert about the potential of exploring a hydrocarbon reservoir is continuously updated by outputs of measurements, and selection of mathematical models and scales of numerical simulation. These outputs can be treated as signals from the information environment E. The dynamics of the belief state can be modelled with the aid of the theory of open quantum systems: a quantum state (representing uncertainty in beliefs) is dynamically modified through coupling with E; stabilization to a steady state determines a decision strategy. In this paper, the process of decision-making about hydrocarbon reservoirs (e.g. 'explore or not?'; 'open new well or not?'; 'contaminated by water or not?'; 'double or triple porosity medium?') is modelled by using the Gorini-Kossakowski-Sudarshan-Lindblad equation. In our model, this equation describes the evolution of experts' predictions about a geophysical structure. We proceed with the information approach to quantum theory and the subjective interpretation of quantum probabilities (due to quantum Bayesianism). This article is part of the theme issue 'Hilbert's sixth problem'. © 2018 The Author(s).

  14. Varicocoelectomy in adolescents: Laparoscopic versus open high ...

    African Journals Online (AJOL)

    Background: Treatment of varicocoele is aimed at eliminating the retrograde reflux of venous blood through the internal spermatic veins. The purpose of this investigation was to compare laparoscopic varicocoelectomy (LV) with open high ligation technique in the adolescent population. Materials and Methods: We ...

  15. Opening the high-energy frontier

    International Nuclear Information System (INIS)

    Quigg, C.

    1988-12-01

    I review the scientific motivation for an experimental assault on the 1-TeV scale, elaborating the idea of technicolor as one interesting possibility for what may be found there. I then summarize some of the discovery possibilities opened by a high-luminosity, multi-TeV proton-proton collider. After a brief resume of the experimental environment anticipated at the SSC, I report on the status of the SSC R&D effort and discuss the work to be carried out over the course of the next year. 37 refs., 10 figs., 1 tab

  16. Jump probabilities in the non-Markovian quantum jump method

    International Nuclear Information System (INIS)

    Haerkoenen, Kari

    2010-01-01

    The dynamics of a non-Markovian open quantum system described by a general time-local master equation is studied. The propagation of the density operator is constructed in terms of two processes: (i) deterministic evolution and (ii) evolution of a probability density functional in the projective Hilbert space. The analysis provides a derivation for the jump probabilities used in the recently developed non-Markovian quantum jump (NMQJ) method (Piilo et al 2008 Phys. Rev. Lett. 100 180402).

  17. Probability modeling of high flow extremes in Yingluoxia watershed, the upper reaches of Heihe River basin

    Science.gov (United States)

    Li, Zhanling; Li, Zhanjie; Li, Chengcheng

    2014-05-01

    Probability modeling of hydrological extremes is one of the major research areas in hydrological science. Most studies of this kind in China concern basins in the humid and semi-humid south and east; for the inland river basins, which occupy about 35% of the country's area, such studies are scarce, partly because of limited data availability and relatively low mean annual flows. The objective of this study is to carry out probability modeling of high flow extremes in the upper reach of the Heihe River basin, the second largest inland river basin in China, using the peak-over-threshold (POT) method and the Generalized Pareto Distribution (GPD); the selection of the threshold and the inherent assumptions of POT series are elaborated in detail. For comparison, other widely used probability distributions including the generalized extreme value (GEV), Lognormal, Log-logistic, and Gamma are employed as well. Maximum likelihood estimation is used for parameter estimation. Daily flow data at Yingluoxia station from 1978 to 2008 are used. Results show that, synthesizing the approaches of the mean excess plot, stability features of model parameters, the return level plot, and the inherent independence assumption of POT series, an optimum threshold of 340 m³/s is finally determined for high flow extremes in the Yingluoxia watershed. The resulting POT series is shown to be stationary and independent based on the Mann-Kendall test, the Pettitt test, and an autocorrelation test. In terms of the Kolmogorov-Smirnov test, the Anderson-Darling test, and several graphical diagnostics such as quantile and cumulative density function plots, the GPD provides the best fit to high flow extremes in the study area. The estimated high flows for long return periods demonstrate that, as the return period increases, the return level estimates become increasingly uncertain. The frequency of high flow extremes exhibits a very slight but not significant decreasing trend from 1978 to 2008.
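
    A minimal sketch of the POT/GPD fitting described here, using scipy on a synthetic daily-flow series; the 340 m³/s threshold is taken from the abstract, everything else is illustrative.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
years = 31
daily_flow = rng.gamma(shape=2.0, scale=60.0, size=years * 365)  # synthetic stand-in (m^3/s)

threshold = 340.0                                 # chosen via mean-excess/stability plots
exceedances = daily_flow[daily_flow > threshold] - threshold

# Maximum likelihood GPD fit to the exceedances (location fixed at zero).
xi, _, sigma = genpareto.fit(exceedances, floc=0.0)

# T-year return level: GPD quantile re-expressed through the annual exceedance rate.
rate = len(exceedances) / years
T = 50.0
level = threshold + genpareto.ppf(1.0 - 1.0 / (T * rate), xi, loc=0.0, scale=sigma)
print(f"xi = {xi:.2f}, sigma = {sigma:.1f}, {T:.0f}-year return level ~ {level:.0f} m^3/s")
```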

  18. Repetitive plasma opening switch for powerful high-voltage pulse generators

    International Nuclear Information System (INIS)

    Dolgachev, G.I.; Zakatov, L.P.; Nitishinskii, M.S.; Ushakov, A.G.

    1998-01-01

    Results are presented of experimental studies of plasma opening switches that serve to sharpen the pulses of inductive microsecond high-voltage pulse generators. It is demonstrated that repetitive plasma opening switches can be used to create super-powerful generators operating in a quasi-continuous regime. An erosion switching mechanism and the problem of magnetic insulation in repetitive switches are considered. Achieving super-high peak power in plasma switches makes it possible to develop new types of high-power generators of electron beams and X radiation. Possible implementations and the efficiency of these generators are discussed

  19. A prototype method for diagnosing high ice water content probability using satellite imager data

    Science.gov (United States)

    Yost, Christopher R.; Bedka, Kristopher M.; Minnis, Patrick; Nguyen, Louis; Strapp, J. Walter; Palikonda, Rabindra; Khlopenkov, Konstantin; Spangenberg, Douglas; Smith, William L., Jr.; Protat, Alain; Delanoe, Julien

    2018-03-01

    Recent studies have found that ingestion of high mass concentrations of ice particles in regions of deep convective storms, with radar reflectivity considered safe for aircraft penetration, can adversely impact aircraft engine performance. Previous aviation industry studies have used the term high ice water content (HIWC) to define such conditions. Three airborne field campaigns were conducted in 2014 and 2015 to better understand how HIWC is distributed in deep convection, both as a function of altitude and proximity to convective updraft regions, and to facilitate development of new methods for detecting HIWC conditions, in addition to many other research and regulatory goals. This paper describes a prototype method for detecting HIWC conditions using geostationary (GEO) satellite imager data coupled with in situ total water content (TWC) observations collected during the flight campaigns. Three satellite-derived parameters were determined to be most useful for determining HIWC probability: (1) the horizontal proximity of the aircraft to the nearest overshooting convective updraft or textured anvil cloud, (2) tropopause-relative infrared brightness temperature, and (3) daytime-only cloud optical depth. Statistical fits between collocated TWC and GEO satellite parameters were used to determine the membership functions for the fuzzy logic derivation of HIWC probability. The products were demonstrated using data from several campaign flights and validated using a subset of the satellite-aircraft collocation database. The daytime HIWC probability was found to agree quite well with TWC time trends and identified extreme TWC events with high probability. Discrimination of HIWC was more challenging at night with IR-only information. The products show the greatest capability for discriminating TWC ≥ 0.5 g m⁻³. Product validation remains challenging due to vertical TWC uncertainties and the typically coarse spatio-temporal resolution of the GEO data.
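
    A minimal sketch of a fuzzy-logic combination of the three satellite parameters named above; the membership breakpoints and weights are invented for illustration and are not the campaign-derived fits.

```python
import numpy as np

def ramp(x, zero_at, one_at):
    """Linear fuzzy membership: 0 at `zero_at`, rising to 1 at `one_at`."""
    return float(np.clip((x - zero_at) / (one_at - zero_at), 0.0, 1.0))

def hiwc_probability(updraft_dist_km, trop_rel_btemp_k, cloud_optical_depth=None):
    # Illustrative memberships: close to an overshooting updraft, colder than the
    # tropopause, and (daytime only) optically thick cloud all favour HIWC.
    m = [ramp(updraft_dist_km, 100.0, 0.0),       # nearer updraft -> higher membership
         ramp(trop_rel_btemp_k, 10.0, -10.0)]     # colder than tropopause -> higher
    w = [0.5, 0.5]
    if cloud_optical_depth is not None:           # optical depth usable in daytime only
        m.append(ramp(cloud_optical_depth, 20.0, 80.0))
        w = [0.4, 0.3, 0.3]
    return float(np.dot(w, m))

print(hiwc_probability(15.0, -5.0, cloud_optical_depth=60.0))  # ~0.77
```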

  20. Probability of fracture and life extension estimate of the high-flux isotope reactor vessel

    International Nuclear Information System (INIS)

    Chang, S.J.

    1998-01-01

    The state of vessel steel embrittlement as a result of neutron irradiation can be measured by the increase in its ductile-brittle transition temperature (DBTT) for fracture, often denoted RT_NDT for carbon steel. This transition temperature can be calibrated by the drop-weight test and, sometimes, by the Charpy impact test. The life extension for the high-flux isotope reactor (HFIR) vessel is calculated by using the method of fracture mechanics, incorporating the effect of the DBTT change. The failure probability of the HFIR vessel, and hence the life of the vessel, is limited by the reactor core melt probability of 10⁻⁴. The operating safety of the reactor is ensured by a periodic hydrostatic pressure test (hydrotest), which is performed in order to determine a safe vessel static pressure. The fracture probability as a result of the hydrostatic pressure test is calculated and is used to determine the life of the vessel; failure to perform the hydrotest imposes a limit on the life of the vessel. Conventional methods of fracture probability calculation, such as those used by the NRC-sponsored PRAISE code and the FAVOR code developed in this Laboratory, are based on Monte Carlo simulation and require heavy computation. An alternative method of fracture probability calculation by direct probability integration is developed in this paper. The present approach offers a simple and expedient way to obtain numerical results without losing any generality. Numerical results on (1) the probability of vessel fracture, (2) the hydrotest time interval, and (3) the hydrotest pressure as a result of the DBTT increase are obtained.
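
    A minimal sketch of what direct probability integration can look like in a fracture problem: with a crack-depth density f_a and a fracture-toughness CDF F_K, the failure probability under a given load is the one-dimensional integral P_f = ∫ F_K(K_applied(a)) f_a(a) da rather than a Monte Carlo tally. The distributions and the stress-intensity model below are illustrative assumptions, not the HFIR inputs.

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.stats import lognorm, norm

# Illustrative inputs: crack depth a (m) ~ lognormal; toughness K_Ic (MPa*sqrt(m)) ~ normal.
crack = lognorm(s=0.5, scale=0.005)
K_Ic = norm(loc=120.0, scale=15.0)

def K_applied(a, stress=200.0):
    """Stress intensity for a shallow surface crack, K = 1.12*sigma*sqrt(pi*a)."""
    return 1.12 * stress * np.sqrt(np.pi * a)

# One-dimensional quadrature replaces Monte Carlo sampling.
a = np.linspace(1e-4, 0.05, 2000)
p_fail = trapezoid(K_Ic.cdf(K_applied(a)) * crack.pdf(a), a)
print(f"fracture probability ~ {p_fail:.2e}")
```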

  1. Winter School on Operator Spaces, Noncommutative Probability and Quantum Groups

    CERN Document Server

    2017-01-01

    Providing an introduction to current research topics in functional analysis and its applications to quantum physics, this book presents three lectures surveying recent progress and open problems.  A special focus is given to the role of symmetry in non-commutative probability, in the theory of quantum groups, and in quantum physics. The first lecture presents the close connection between distributional symmetries and independence properties. The second introduces many structures (graphs, C*-algebras, discrete groups) whose quantum symmetries are much richer than their classical symmetry groups, and describes the associated quantum symmetry groups. The last lecture shows how functional analytic and geometric ideas can be used to detect and to quantify entanglement in high dimensions.  The book will allow graduate students and young researchers to gain a better understanding of free probability, the theory of compact quantum groups, and applications of the theory of Banach spaces to quantum information. The l...

  2. Estimates of annual survival probabilities for adult Florida manatees (Trichechus manatus latirostris)

    Science.gov (United States)

    Langtimm, C.A.; O'Shea, T.J.; Pradel, R.; Beck, C.A.

    1998-01-01

    The population dynamics of large, long-lived mammals are particularly sensitive to changes in adult survival. Understanding factors affecting survival patterns is therefore critical for developing and testing theories of population dynamics and for developing management strategies aimed at preventing declines or extinction in such taxa. Few studies have used modern analytical approaches for analyzing variation and testing hypotheses about survival probabilities in large mammals. This paper reports a detailed analysis of annual adult survival in the Florida manatee (Trichechus manatus latirostris), an endangered marine mammal, based on a mark-recapture approach. Natural and boat-inflicted scars distinctively 'marked' individual manatees that were cataloged in a computer-based photographic system. Photo-documented resightings provided 'recaptures.' Using open population models, annual adult-survival probabilities were estimated for manatees observed in winter in three areas of Florida: Blue Spring, Crystal River, and the Atlantic coast. After using goodness-of-fit tests in Program RELEASE to search for violations of the assumptions of mark-recapture analysis, survival and sighting probabilities were modeled under several different biological hypotheses with Program SURGE. Estimates of mean annual probability of sighting varied from 0.948 for Blue Spring to 0.737 for Crystal River and 0.507 for the Atlantic coast. At Crystal River and Blue Spring, annual survival probabilities were best estimated as constant over the study period at 0.96 (95% CI = 0.951-0.975 and 0.900-0.985, respectively). On the Atlantic coast, where manatees are impacted more by human activities, annual survival probabilities had a significantly lower mean estimate of 0.91 (95% CI = 0.887-0.926) and varied unpredictably over the study period. For each study area, survival did not differ between sexes and was independent of relative adult age. The high constant adult-survival probabilities estimated

  3. A hydroclimatological approach to predicting regional landslide probability using Landlab

    Directory of Open Access Journals (Sweden)

    R. Strauch

    2018-02-01

    Full Text Available We develop a hydroclimatological approach to the modeling of regional shallow landslide initiation that integrates spatial and temporal dimensions of parameter uncertainty to estimate an annual probability of landslide initiation based on Monte Carlo simulations. The physically based model couples the infinite-slope stability model with a steady-state subsurface flow representation and operates in a digital elevation model. Spatially distributed gridded data for soil properties and vegetation classification are used for parameter estimation of probability distributions that characterize model input uncertainty. Hydrologic forcing to the model is through annual maximum daily recharge to subsurface flow obtained from a macroscale hydrologic model. We demonstrate the model in a steep mountainous region in northern Washington, USA, over 2700 km². The influence of soil depth on the probability of landslide initiation is investigated through comparisons among model output produced using three different soil depth scenarios reflecting the uncertainty of soil depth and its potential long-term variability. We found elevation-dependent patterns in probability of landslide initiation that showed the stabilizing effects of forests at low elevations, an increased landslide probability with forest decline at mid-elevations (1400 to 2400 m), and soil limitation and steep topographic controls at high alpine elevations and in post-glacial landscapes. These dominant controls manifest themselves in a bimodal distribution of spatial annual landslide probability. Model testing with limited observations revealed similarly moderate model confidence for the three hazard maps, suggesting suitable use as relative hazard products. The model is available as a component in Landlab, an open-source, Python-based landscape earth systems modeling environment, and is designed to be easily reproduced utilizing HydroShare cyberinfrastructure.

  4. A hydroclimatological approach to predicting regional landslide probability using Landlab

    Science.gov (United States)

    Strauch, Ronda; Istanbulluoglu, Erkan; Nudurupati, Sai Siddhartha; Bandaragoda, Christina; Gasparini, Nicole M.; Tucker, Gregory E.

    2018-02-01

    We develop a hydroclimatological approach to the modeling of regional shallow landslide initiation that integrates spatial and temporal dimensions of parameter uncertainty to estimate an annual probability of landslide initiation based on Monte Carlo simulations. The physically based model couples the infinite-slope stability model with a steady-state subsurface flow representation and operates in a digital elevation model. Spatially distributed gridded data for soil properties and vegetation classification are used for parameter estimation of probability distributions that characterize model input uncertainty. Hydrologic forcing to the model is through annual maximum daily recharge to subsurface flow obtained from a macroscale hydrologic model. We demonstrate the model in a steep mountainous region in northern Washington, USA, over 2700 km². The influence of soil depth on the probability of landslide initiation is investigated through comparisons among model output produced using three different soil depth scenarios reflecting the uncertainty of soil depth and its potential long-term variability. We found elevation-dependent patterns in probability of landslide initiation that showed the stabilizing effects of forests at low elevations, an increased landslide probability with forest decline at mid-elevations (1400 to 2400 m), and soil limitation and steep topographic controls at high alpine elevations and in post-glacial landscapes. These dominant controls manifest themselves in a bimodal distribution of spatial annual landslide probability. Model testing with limited observations revealed similarly moderate model confidence for the three hazard maps, suggesting suitable use as relative hazard products. The model is available as a component in Landlab, an open-source, Python-based landscape earth systems modeling environment, and is designed to be easily reproduced utilizing HydroShare cyberinfrastructure.
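
    A minimal sketch of the Monte Carlo core of such a model (in Landlab this is packaged as the LandslideProbability component): sample uncertain soil, vegetation, and wetness parameters, evaluate the infinite-slope factor of safety, and report the fraction of realizations with FS < 1. All distributions below are invented placeholders, not the gridded inputs of the study.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000                                   # Monte Carlo realizations for one grid cell

# Placeholder parameter distributions:
phi   = np.radians(rng.uniform(28.0, 40.0, n))      # internal friction angle
coh   = rng.uniform(2e3, 10e3, n)                   # combined root + soil cohesion (Pa)
depth = rng.triangular(0.5, 1.5, 3.0, n)            # soil depth (m)
w     = np.clip(rng.lognormal(-0.7, 0.6, n), 0, 1)  # relative wetness from recharge

theta = np.radians(30.0)                     # local slope angle
rho_s, rho_w, g = 1800.0, 1000.0, 9.81       # soil/water density, gravity

# Infinite-slope factor of safety with slope-parallel seepage:
fs = (coh / (rho_s * g * depth * np.sin(theta) * np.cos(theta))
      + (1.0 - w * rho_w / rho_s) * np.tan(phi) / np.tan(theta))

print("probability of failure ~", (fs < 1.0).mean())
```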

  5. Application of a few orthogonal polynomials to the assessment of the fracture failure probability of a spherical tank

    International Nuclear Information System (INIS)

    Cao Tianjie; Zhou Zegong

    1993-01-01

    This paper presents methods to assess the fracture failure probability of a spherical tank. These methods convert the assessment of the fracture failure probability into the calculation of the moments of cracks and a one-dimensional integral. We first derive series formulae to calculate the moments of cracks under crack fatigue growth and the moments of crack opening displacements according to the JWES-2805 code. We then use the first n moments of crack opening displacements and a few orthogonal polynomials to compose the probability density function of the crack opening displacement. Lastly, the fracture failure probability is obtained according to interference theory. An example shows that these methods are simpler, quicker, and more accurate, while avoiding the disadvantages of Edgeworth's series method. (author)
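
    A minimal sketch of the moment-based reconstruction step: on a normalized interval, a density expands in Legendre polynomials with coefficients c_k = (2k+1)/2 · E[P_k(Y)], and each E[P_k(Y)] is a finite combination of the raw moments. The Beta test density below is illustrative; in the paper's setting the moments come from the derived crack-opening-displacement series.

```python
import numpy as np
from numpy.polynomial import legendre as L
from scipy.stats import beta

# Test case: X ~ Beta(2, 5) on [0, 1], mapped to Y = 2X - 1 on [-1, 1].
dist = beta(2, 5)
n_terms = 8

# c_k = (2k+1)/2 * E[P_k(Y)]; the expectations are computed numerically here,
# standing in for moments obtained analytically from series formulae.
coeffs = [
    (2 * k + 1) / 2 * dist.expect(lambda x, k=k: L.legval(2 * x - 1, [0] * k + [1]))
    for k in range(n_terms)
]

# Compare the reconstructed density with the true one at a few points.
ys = np.linspace(-0.9, 0.9, 7)
true_pdf = dist.pdf((ys + 1) / 2) / 2          # change of variables to [-1, 1]
print(np.round(L.legval(ys, coeffs), 3))
print(np.round(true_pdf, 3))
```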

  6. Risk-averse decision-making for civil infrastructure exposed to low-probability, high-consequence events

    International Nuclear Information System (INIS)

    Cha, Eun Jeong; Ellingwood, Bruce R.

    2012-01-01

    Quantitative analysis and assessment of risk to civil infrastructure has two components: probability of a potentially damaging event and consequence of damage, measured in terms of financial or human losses. Decision models that have been utilized during the past three decades take into account the probabilistic component rationally, but address decision-maker attitudes toward consequences and risk only to a limited degree. The application of models reflecting these attitudes to decisions involving low-probability, high-consequence events that may impact civil infrastructure requires a fundamental understanding of risk acceptance attitudes and how they affect individual and group choices. In particular, the phenomenon of risk aversion may be a significant factor in decisions for civil infrastructure exposed to low-probability events with severe consequences, such as earthquakes, hurricanes or floods. This paper utilizes cumulative prospect theory to investigate the role and characteristics of risk-aversion in assurance of structural safety.

  7. Generalized Probability-Probability Plots

    NARCIS (Netherlands)

    Mushkudiani, N.A.; Einmahl, J.H.J.

    2004-01-01

    We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real valued data.These plots, that are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P

  8. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    Full Text Available We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  9. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming paradigm that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  10. Neutron emission probability at high excitation and isospin

    International Nuclear Information System (INIS)

    Aggarwal, Mamta

    2005-01-01

    One-neutron and two-neutron emission probability at different excitations and varying isospin have been studied. Several degrees of freedom like deformation, rotations, temperature, isospin fluctuations and shell structure are incorporated via statistical theory of hot rotating nuclei

  11. High-resolution urban flood modelling - a joint probability approach

    Science.gov (United States)

    Hartnett, Michael; Olbert, Agnieszka; Nash, Stephen

    2017-04-01

    (Divoky et al., 2005). Nevertheless, such events occur and in Ireland alone there are several cases of serious damage due to flooding resulting from a combination of high sea water levels and river flows driven by the same meteorological conditions (e.g. Olbert et al. 2015). A November 2009 fluvial-coastal flooding of Cork City bringing €100m loss was one such incident. This event was used by Olbert et al. (2015) to determine processes controlling urban flooding and is further explored in this study to elaborate on coastal and fluvial flood mechanisms and their roles in controlling water levels. The objective of this research is to develop a methodology to assess combined effect of multiple source flooding on flood probability and severity in urban areas and to establish a set of conditions that dictate urban flooding due to extreme climatic events. These conditions broadly combine physical flood drivers (such as coastal and fluvial processes), their mechanisms and thresholds defining flood severity. The two main physical processes controlling urban flooding: high sea water levels (coastal flooding) and high river flows (fluvial flooding), and their threshold values for which flood is likely to occur, are considered in this study. Contribution of coastal and fluvial drivers to flooding and their impacts are assessed in a two-step process. The first step involves frequency analysis and extreme value statistical modelling of storm surges, tides and river flows and ultimately the application of joint probability method to estimate joint exceedence return periods for combination of surges, tide and river flows. In the second step, a numerical model of Cork Harbour MSN_Flood comprising a cascade of four nested high-resolution models is used to perform simulation of flood inundation under numerous hypothetical coastal and fluvial flood scenarios. The risk of flooding is quantified based on a range of physical aspects such as the extent and depth of inundation (Apel et al

  12. Open high-level data formats and software for gamma-ray astronomy

    Science.gov (United States)

    Deil, Christoph; Boisson, Catherine; Kosack, Karl; Perkins, Jeremy; King, Johannes; Eger, Peter; Mayer, Michael; Wood, Matthew; Zabalza, Victor; Knödlseder, Jürgen; Hassan, Tarek; Mohrmann, Lars; Ziegler, Alexander; Khelifi, Bruno; Dorner, Daniela; Maier, Gernot; Pedaletti, Giovanna; Rosado, Jaime; Contreras, José Luis; Lefaucheur, Julien; Brügge, Kai; Servillat, Mathieu; Terrier, Régis; Walter, Roland; Lombardi, Saverio

    2017-01-01

    In gamma-ray astronomy, a variety of data formats and proprietary software have been traditionally used, often developed for one specific mission or experiment. Especially for ground-based imaging atmospheric Cherenkov telescopes (IACTs), data and software are mostly private to the collaborations operating the telescopes. However, there is a general movement in science towards the use of open data and software. In addition, the next-generation IACT instrument, the Cherenkov Telescope Array (CTA), will be operated as an open observatory. We have created a Github organisation at https://github.com/open-gamma-ray-astro where we are developing high-level data format specifications. A public mailing list was set up at https://lists.nasa.gov/mailman/listinfo/open-gamma-ray-astro and a first face-to-face meeting on the IACT high-level data model and formats took place in April 2016 in Meudon (France). This open multi-mission effort will help to accelerate the development of open data formats and open-source software for gamma-ray astronomy, leading to synergies in the development of analysis codes and eventually better scientific results (reproducible, multi-mission). This write-up presents this effort for the first time, explaining the motivation and context, the available resources and process we use, as well as the status and planned next steps for the data format specifications. We hope that it will stimulate feedback and future contributions from the gamma-ray astronomy community.

  13. Genefer: Programs for Finding Large Probable Generalized Fermat Primes

    Directory of Open Access Journals (Sweden)

    Iain Arthur Bethune

    2015-11-01

    Full Text Available Genefer is a suite of programs for performing Probable Primality (PRP) tests of Generalised Fermat numbers b^(2^n) + 1 (GFNs) using a Fermat test. Optimised implementations are available for modern CPUs using single instruction, multiple data (SIMD) instructions, as well as for GPUs using CUDA or OpenCL. Genefer has been extensively used by PrimeGrid – a volunteer computing project searching for large prime numbers of various kinds, including GFNs. Genefer's architecture separates the high-level logic, such as checkpointing and the user interface, from the architecture-specific performance-critical parts of the implementation, which are suitable for re-use. Genefer is released under the MIT license. Source and binaries are available from www.assembla.com/spaces/genefer.
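
    A minimal sketch of the Fermat PRP test that Genefer performs, in plain Python; Genefer itself implements the modular squarings with FFT-based arithmetic, so this brute-force version is only illustrative.

```python
def gfn_is_prp(b: int, n: int, base: int = 3) -> bool:
    """Fermat probable-primality test for the generalized Fermat number N = b**(2**n) + 1.

    N is declared a probable prime to `base` if base**(N-1) == 1 (mod N);
    Python's pow() performs the modular exponentiation.
    """
    N = b ** (2 ** n) + 1
    return pow(base, N - 1, N) == 1

# Small examples: b = 2 gives the classical Fermat numbers.
print(gfn_is_prp(2, 4))   # True:  F4 = 65537 is prime
print(gfn_is_prp(2, 5))   # False: F5 = 4294967297 = 641 * 6700417
print(gfn_is_prp(10, 2))  # False: 10**4 + 1 = 10001 = 73 * 137
```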

  14. A high open-circuit voltage gallium nitride betavoltaic microbattery

    International Nuclear Information System (INIS)

    Cheng, Zaijun; Chen, Xuyuan; San, Haisheng; Feng, Zhihong; Liu, Bo

    2012-01-01

    A high open-circuit voltage betavoltaic microbattery based on a gallium nitride (GaN) p–i–n homojunction is demonstrated. A low electron concentration in the n-type GaN beta-absorbing layer is achieved by Fe compensation doping. Under irradiation by a planar solid ⁶³Ni source with an activity of 0.5 mCi, the open-circuit voltage of the fabricated microbattery with a 2 × 2 mm² area reaches 1.64 V, which is the record value reported for betavoltaic batteries with a ⁶³Ni source; the short-circuit current was measured as 568 pA and a conversion efficiency of 0.98% was obtained. The experimental results suggest that GaN is a high-potential candidate for developing betavoltaic microbatteries. (paper)

  15. Prognostic value of stress echocardiography in women with high (⩾80%) probability of coronary artery disease

    OpenAIRE

    Davar, J; Roberts, E; Coghlan, J; Evans, T; Lipkin, D

    2001-01-01

    OBJECTIVE—To assess the prognostic significance of stress echocardiography in women with a high probability of coronary artery disease (CAD).
SETTING—Secondary and tertiary cardiology unit at a university teaching hospital.
PARTICIPANTS—A total of 135 women (mean (SD) age 63 (9) years) with pre-test probability of CAD ⩾80% were selected from a database of patients investigated by treadmill or dobutamine stress echocardiography between 1995 and 1998.
MAIN OUTCOME MEASURES—Patients were followe...

  16. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  17. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  18. Open cell conducting foams for high synchrotron radiation accelerators

    Directory of Open Access Journals (Sweden)

    S. Petracca

    2014-08-01

    Full Text Available The possible use of open cell conductive foams in high synchrotron radiation particle accelerators is considered. Available materials and modeling tools are reviewed, potential pros and cons are discussed, and preliminary conclusions are drawn.

  19. Open Access Publishing in High-Energy Physics

    CERN Document Server

    Mele, S

    2007-01-01

    The goal of Open Access (OA) is to grant anyone, anywhere and anytime free access to the results of scientific research. The High- Energy Physics (HEP) community has pioneered OA with its "pre-print culture": the mass mailing, first, and the online posting, later, of preliminary versions of its articles. After almost half a century of widespread dissemination of pre-prints, the time is ripe for the HEP community to explore OA publishing. Among other possible models, a sponsoring consortium appears as the most viable option for a transition of HEP peer-reviewed literature to OA. A Sponsoring Consortium for Open Access Publishing in Particle Physics (SCOAP3) is proposed as a central body which would remunerate publishers for the peer-review service, effectively replacing the "reader-pays" model of traditional subscriptions with an "author-side" funding. Funding to SCOAP3 would come from HEP funding agencies and library consortia through a re-direction of subscriptions. This model is discussed in details togethe...

  20. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

    Bryan eCort

    2013-10-01

    Full Text Available We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.

  1. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process

  2. Learning difficulties of senior high school students based on probability understanding levels

    Science.gov (United States)

    Anggara, B.; Priatna, N.; Juandi, D.

    2018-05-01

    Identifying students' difficulties in learning the concept of probability is important for teachers, both to prepare appropriate learning processes and to overcome obstacles that may arise in subsequent learning. This study revealed the levels of students' understanding of the concept of probability and identified their difficulties as part of identifying the epistemological obstacles to the concept of probability. The study employed a qualitative, descriptive approach involving 55 students of class XII. Data were collected through a diagnostic test of probability learning difficulties, observation, and interviews, and were used to determine the levels of understanding and the learning difficulties experienced by the students. The test results and classroom observations showed that the mean cognitive level was at level 2. The findings indicated that students had appropriate quantitative information about the concept of probability, but that it might be incomplete or incorrectly used. The difficulties found concern constructing sample spaces, events, and mathematical models related to probability problems. Students also had difficulty understanding the principles of events and prerequisite concepts.

  3. Assessing the clinical probability of pulmonary embolism

    International Nuclear Information System (INIS)

    Miniati, M.; Pistolesi, M.

    2001-01-01

    Clinical assessment is a cornerstone of the recently validated diagnostic strategies for pulmonary embolism (PE). Although the diagnostic yield of individual symptoms, signs, and common laboratory tests is limited, the combination of these variables, either by empirical assessment or by a prediction rule, can be used to express a clinical probability of PE. The latter may serve as pretest probability to predict the probability of PE after further objective testing (posterior or post-test probability). Over the last few years, attempts have been made to develop structured prediction models for PE. In a Canadian multicenter prospective study, the clinical probability of PE was rated as low, intermediate, or high according to a model which included assessment of presenting symptoms and signs, risk factors, and presence or absence of an alternative diagnosis at least as likely as PE. Recently, a simple clinical score was developed to stratify outpatients with suspected PE into groups with low, intermediate, or high clinical probability. Logistic regression was used to predict parameters associated with PE. A score ≤ 4 identified patients with low probability of whom 10% had PE. The prevalence of PE in patients with intermediate (score 5-8) and high probability (score ≥ 9) was 38 and 81%, respectively. As opposed to the Canadian model, this clinical score is standardized. The predictor variables identified in the model, however, were derived from a database of emergency ward patients. This model may, therefore, not be valid in assessing the clinical probability of PE in inpatients. In the PISA-PED study, a clinical diagnostic algorithm was developed which rests on the identification of three relevant clinical symptoms and on their association with electrocardiographic and/or radiographic abnormalities specific for PE. Among patients who, according to the model, had been rated as having a high clinical probability, the prevalence of proven PE was 97%, while it was 3

  4. The Research and Implementation of MUSER CLEAN Algorithm Based on OpenCL

    Science.gov (United States)

    Feng, Y.; Chen, K.; Deng, H.; Wang, F.; Mei, Y.; Wei, S. L.; Dai, W.; Yang, Q. P.; Liu, Y. B.; Wu, J. P.

    2017-03-01

    High-performance data processing on a single machine is increasingly needed in the development of astronomical software. However, because machine configurations differ, traditional programming techniques such as multi-threading and CUDA (Compute Unified Device Architecture)+GPU (Graphics Processing Unit) have obvious limitations in portability and seamlessness across operating systems. The OpenCL (Open Computing Language) approach used in the development of the MUSER (MingantU SpEctral Radioheliograph) data processing system is introduced, and the Högbom CLEAN algorithm is re-implemented as a parallel CLEAN algorithm in Python with the PyOpenCL extension package. The experimental results show that the CLEAN algorithm based on OpenCL has approximately the same operating efficiency as the earlier CLEAN algorithm based on CUDA. More importantly, data processing in a CPU-only environment can also achieve high performance, which removes the environmental dependence on CUDA+GPU. Overall, the work improves the adaptability of the system, with emphasis on the performance of MUSER image CLEAN computing. Meanwhile, the realization of OpenCL in MUSER demonstrates its suitability for scientific data processing. In view of OpenCL's high-performance computing features in heterogeneous environments, it will probably become a preferred technology for future high-performance astronomical software development.
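
    For reference, the serial Högbom CLEAN loop that this record describes parallelizing is short. The following NumPy sketch is a re-statement of the classic algorithm under stated assumptions, not the MUSER PyOpenCL code; the peak search and shifted-PSF subtraction that dominate the run time are the natural candidates for OpenCL kernels, and gain, niter, and threshold are illustrative parameters.

    ```python
    import numpy as np

    def hogbom_clean(dirty, psf, gain=0.1, niter=500, threshold=0.0):
        """Serial Hogbom CLEAN (NumPy sketch). The peak search and the
        shifted-PSF subtraction inside the loop are the hot spots that a
        PyOpenCL version would express as per-pixel kernels."""
        res = dirty.copy()
        model = np.zeros_like(dirty)
        cy, cx = psf.shape[0] // 2, psf.shape[1] // 2   # PSF center pixel
        for _ in range(niter):
            y, x = np.unravel_index(np.argmax(np.abs(res)), res.shape)
            peak = res[y, x]
            if abs(peak) <= threshold:
                break
            model[y, x] += gain * peak
            # subtract the PSF, scaled and centered on the peak, clipped to the image
            ylo, xlo = y - cy, x - cx
            iy0, ix0 = max(0, ylo), max(0, xlo)
            py0, px0 = iy0 - ylo, ix0 - xlo
            h = min(res.shape[0] - iy0, psf.shape[0] - py0)
            w = min(res.shape[1] - ix0, psf.shape[1] - px0)
            res[iy0:iy0+h, ix0:ix0+w] -= gain * peak * psf[py0:py0+h, px0:px0+w]
        return model, res

    # toy check: with a delta-function PSF, CLEAN reduces to iterative peak picking
    psf = np.zeros((17, 17)); psf[8, 8] = 1.0
    dirty = np.zeros((64, 64)); dirty[20, 30] = 1.0
    model, residual = hogbom_clean(dirty, psf, gain=0.2, niter=50)
    print(round(model[20, 30], 3), abs(residual).max() < 1e-4)
    ```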

  5. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  6. Economic choices reveal probability distortion in macaque monkeys.

    Science.gov (United States)

    Stauffer, William R; Lak, Armin; Bossaerts, Peter; Schultz, Wolfram

    2015-02-18

    Economic choices are largely determined by two principal elements, reward value (utility) and probability. Although nonlinear utility functions have been acknowledged for centuries, nonlinear probability weighting (probability distortion) was only recently recognized as a ubiquitous aspect of real-world choice behavior. Even when outcome probabilities are known and acknowledged, human decision makers often overweight low probability outcomes and underweight high probability outcomes. Whereas recent studies measured utility functions and their corresponding neural correlates in monkeys, it is not known whether monkeys distort probability in a manner similar to humans. Therefore, we investigated economic choices in macaque monkeys for evidence of probability distortion. We trained two monkeys to predict reward from probabilistic gambles with constant outcome values (0.5 ml or nothing). The probability of winning was conveyed using explicit visual cues (sector stimuli). Choices between the gambles revealed that the monkeys used the explicit probability information to make meaningful decisions. Using these cues, we measured probability distortion from choices between the gambles and safe rewards. Parametric modeling of the choices revealed classic probability weighting functions with inverted-S shape. Therefore, the animals overweighted low probability rewards and underweighted high probability rewards. Empirical investigation of the behavior verified that the choices were best explained by a combination of nonlinear value and nonlinear probability distortion. Together, these results suggest that probability distortion may reflect evolutionarily preserved neuronal processing. Copyright © 2015 Stauffer et al.
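
    The inverted-S weighting function described here is commonly parameterized, in the human-behavior literature, by the one-parameter form of Tversky and Kahneman (1992). The sketch below uses that form purely to illustrate over- and underweighting; the paper fits its own parametric weighting functions to the monkeys' choices, and gamma = 0.61 is a value reported for humans, not for these animals.

    ```python
    import numpy as np

    def w(p, gamma=0.61):
        """One-parameter inverted-S weighting function (Tversky & Kahneman 1992).
        gamma < 1 overweights low probabilities and underweights high ones."""
        return p**gamma / (p**gamma + (1.0 - p)**gamma) ** (1.0 / gamma)

    for p in (0.01, 0.1, 0.5, 0.9, 0.99):
        print(f"p = {p:4.2f}   w(p) = {w(p):.3f}")
    # w(0.01) ~ 0.055 > 0.01 (overweighting); w(0.99) ~ 0.912 < 0.99 (underweighting)
    ```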

  7. Evaluation of nuclear power plant component failure probability and core damage probability using simplified PSA model

    International Nuclear Information System (INIS)

    Shimada, Yoshio

    2000-01-01

    It is anticipated that changes in the frequency of surveillance tests, preventive maintenance, or parts replacement of safety-related components may change component failure probabilities and thereby change the core damage probability, and that the change will differ depending on the initiating event frequency and the component type. This study assessed the change in core damage probability using a simplified PSA model, developed by the US NRC to process accident sequence precursors, which can calculate core damage probability in a short time. Component failure probabilities were varied between 0 and 1, and both Japanese and American initiating event frequency data were used. The analysis showed the following. (1) The frequency of surveillance tests, preventive maintenance, or parts replacement of motor-driven pumps (high pressure injection pumps, residual heat removal pumps, auxiliary feedwater pumps) should be changed carefully, since the change in core damage probability is large when the base failure probability increases. (2) Core damage probability is insensitive to changes in surveillance test frequency for motor-operated valves and turbine-driven auxiliary feedwater pumps, since the change in core damage probability is small even when their failure probabilities change by about one order of magnitude. (3) When Japanese failure probability data are applied to the emergency diesel generator, the change in core damage probability is small even if the failure probability changes by one order of magnitude from the base value; when American failure probability data are applied, however, the increase in core damage probability is large when the failure probability increases. Therefore, when Japanese failure probability data are applied, core damage probability is insensitive to changes in surveillance test frequency and the like. (author)
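
    The kind of sensitivity scan the study describes can be illustrated with a toy cut-set model. This is a minimal sketch with hypothetical initiating-event frequencies and failure probabilities (not the NRC precursor model): core damage frequency is a sum over accident sequences of the initiating event frequency times the product of component failure probabilities, and one component's failure probability is swept toward 1.

    ```python
    # Toy two-sequence cut-set model (all numbers hypothetical):
    # CDF = IE_lofw * P(pump) * P(valve) + IE_loop * P(pump) * P(diesel)
    IE_LOFW, IE_LOOP = 1e-1, 5e-2          # initiating event frequencies [/yr]
    P_VALVE, P_DIESEL = 3e-3, 8e-3         # fixed failure probabilities [/demand]

    def core_damage_freq(p_pump):
        return IE_LOFW * p_pump * P_VALVE + IE_LOOP * p_pump * P_DIESEL

    base = core_damage_freq(1e-3)          # hypothetical base failure probability
    for p in (1e-4, 1e-3, 1e-2, 1e-1, 1.0):
        cdf = core_damage_freq(p)
        print(f"p_pump = {p:7.0e}   CDF = {cdf:.2e}/yr   ({cdf/base:6.1f} x base)")
    ```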

  8. The Best of Two Open Worlds at the National Open University of Nigeria

    Directory of Open Access Journals (Sweden)

    Jane-frances Obiageli Agbu

    2016-05-01

    It will be wise for educational institutions, from primary to tertiary level, globally, to reflect on their position and profile with respect to the new concepts of Open Educational Resources (OER) and Massive Open Online Courses (MOOCs). Responses will of course be diverse, but the potential is so manifest that many institutions will probably consider the benefits to outweigh the barriers. The National Open University of Nigeria (NOUN) has decided to combine its 'classical' openness with the new digital openness by fully embracing the OER approach and converting its complete course base into OER. Step-by-step, NOUN is currently implementing its strategy towards becoming an OER-based Open University with a special niche for MOOCs. During a launch event in December 2015, the first 40 OER-based courses were presented, as well as the first 3 OER-based MOOCs. This paper therefore presents NOUN's OER strategy with insight on lessons learned. To the authors' knowledge, NOUN is the first Open University in the world with such a full-fledged OER (and MOOCs) implementation route.

  9. Quasi-open inflation

    CERN Document Server

    García-Bellido, Juan; Garriga, Jaume; Montes, Xavier

    1998-01-01

    We show that a large class of two-field models of single-bubble open inflation do not lead to infinite open universes, as was previously thought, but to an ensemble of very large but finite inflating 'islands'. The reason is that the quantum tunneling responsible for the nucleation of the bubble does not occur simultaneously along both field directions, and equal-time hypersurfaces in the open universe are not synchronized with equal-density or fixed-field hypersurfaces. The most probable tunneling trajectory corresponds to a zero value of the inflaton field; large values, necessary for the second period of inflation inside the bubble, only arise as localized fluctuations. The interior of each nucleated bubble will contain an infinite number of such inflating regions of comoving size of order γ^(-1), where γ depends on the parameters of the model. Each one of these islands will be a quasi-open universe. Since the volume of the hyperboloid is infinite, inflating islands with all possible values...

  10. Naive Probability: Model-based Estimates of Unique Events

    Science.gov (United States)

    2014-05-04

    [Abstract not recoverable: the record contains only extraction fragments from the report, mixing pieces of its reference list (e.g., Khemlani, S., & Johnson-Laird, P.N. (2012). Theories of the syllogism: A meta-analysis) with sample survey items of the form "What is the probability that space tourism will achieve widespread popularity in the next 50 years?"]

  11. Semi-automated categorization of open-ended questions

    Directory of Open Access Journals (Sweden)

    Matthias Schonlau

    2016-08-01

    Text data from open-ended questions in surveys are difficult to analyze and are frequently ignored. Yet open-ended questions are important because they do not constrain respondents’ answer choices. Where open-ended questions are necessary, sometimes multiple human coders hand-code answers into one of several categories. At the same time, computer scientists have made impressive advances in text mining that may allow automation of such coding. Automated algorithms do not achieve an overall accuracy high enough to entirely replace humans. We categorize open-ended questions soliciting narrative responses using text mining for easy-to-categorize answers and humans for the remainder using expected accuracies to guide the choice of the threshold delineating between “easy” and “hard”. Employing multinomial boosting avoids the common practice of converting machine learning “confidence scores” into pseudo-probabilities. This approach is illustrated with examples from open-ended questions related to respondents’ advice to a patient in a hypothetical dilemma, a follow-up probe related to respondents’ perception of disclosure/privacy risk, and from a question on reasons for quitting smoking from a follow-up survey from the Ontario Smoker’s Helpline. Targeting 80% combined accuracy, we found that 54%-80% of the data could be categorized automatically in research surveys.
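
    A minimal sketch of the human/machine division of labor described here: train a text classifier on the hand-coded answers, auto-code an answer only when the model's predicted class probability clears a threshold, and route everything else to human coders. The tiny dataset, the 0.8 threshold, and the use of logistic regression (a stand-in for the paper's multinomial boosting, which deliberately avoids converting confidence scores into pseudo-probabilities) are all illustrative assumptions.

    ```python
    import numpy as np
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression   # stand-in classifier

    # hypothetical hand-coded training answers (category = reason for quitting)
    texts = ["quit for my kids", "my doctor told me to quit",
             "cigarettes cost too much", "kids asked me to stop",
             "doctor warned me about my lungs", "too expensive now"]
    labels = ["family", "health", "cost", "family", "health", "cost"]

    vec = TfidfVectorizer().fit(texts)
    clf = LogisticRegression(max_iter=1000).fit(vec.transform(texts), labels)

    def route(answer, threshold=0.8):
        """Auto-code only confident predictions; send the rest to human coders."""
        proba = clf.predict_proba(vec.transform([answer]))[0]
        k = int(np.argmax(proba))
        if proba[k] >= threshold:
            return clf.classes_[k], "auto"
        return None, "human"

    print(route("my doctor said to stop smoking"))
    print(route("just felt like it"))
    ```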

  12. High Quality Education and Learning for All through Open Education

    NARCIS (Netherlands)

    Stracke, Christian M.

    2016-01-01

    Keynote at the International Lensky Education Forum 2016, Yakutsk, Republic of Sakha, Russian Federation, by Stracke, C. M. (2016, 16 August): "High Quality Education and Learning for All through Open Education"

  13. Probability

    CERN Document Server

    Shiryaev, A N

    1996-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author has included new material on the probability of large deviations and on the central limit theorem for sums of dependent random variables.

  14. Is high myopia a risk factor for visual field progression or disk hemorrhage in primary open-angle glaucoma?

    Science.gov (United States)

    Nitta, Koji; Sugiyama, Kazuhisa; Wajima, Ryotaro; Tachibana, Gaku

    2017-01-01

    The purpose of this study was to clarify differences between highly myopic and non-myopic primary open-angle glaucoma (POAG) patients, including normal-tension glaucoma patients. A total of 269 POAG patients were divided into two groups: patients with an axial length of ≥26.5 mm (highly myopic group) and patients with an axial length of <26.5 mm (non-myopic group). The cumulative probability of visual field (VF) loss was significantly greater in the highly myopic group (10-year survival rate, 73.7%±6.8%) than in the non-myopic group (10-year survival rate, 46.3%±5.8%; log-rank test, P =0.0142). The occurrence of disk hemorrhage (DH) in the non-myopic group (1.60±3.04) was significantly greater than that in the highly myopic group (0.93±2.13, P =0.0311). The cumulative probability of DH was significantly lower in the highly myopic group (10-year survival rate, 26.4%±5.4%) than in the non-myopic group (10-year survival rate, 47.2%±6.6%, P =0.0413). Highly myopic POAG is considered a combination of myopic optic neuropathy and glaucomatous optic neuropathy (GON). If GON is predominant, there is frequent DH and more progressive VF loss; when myopic optic neuropathy is predominant, there is less DH and less progressive VF loss.

  15. Open Access Publishing in High-Energy Physics: the SCOAP(3) Initiative

    CERN Document Server

    Mele, Salvatore

    2010-01-01

    Scholarly communication in High-Energy Physics (HEP) shows traits very similar to Astronomy and Astrophysics: pervasiveness of Open Access to preprints through community-based services; a culture of openness and sharing among its researchers; a compact number of yearly articles published by a relatively small number of journals which are dear to the community. These aspects have led HEP to spearhead an innovative model for the transition of its scholarly publishing to Open Access. The Sponsoring Consortium for Open Access Publishing in Particle Physics (SCOAP(3)) aims to be a central body to finance peer-review service rather than the purchase of access to information as in the traditional subscription model, with all articles in the discipline eventually available in Open Access. Sustainable funding to SCOAP(3) would come from libraries, library consortia and HEP funding agencies, through a re-direction of funds currently spent for subscriptions to HEP journals. This paper presents the cultural and bibliometric factors at the roots of SCOAP(3) and the current status of this worldwide initiative.

  16. Recent trends in the probability of high out-of-pocket medical expenses in the United States

    Directory of Open Access Journals (Sweden)

    Katherine E Baird

    2016-09-01

    Objective: This article measures the probability that out-of-pocket expenses in the United States exceed a threshold share of income. It calculates this probability separately by individuals’ health condition, income, and elderly status, and estimates changes in these probabilities between 2010 and 2013. Data and Method: This article uses nationally representative household survey data on 344,000 individuals. Logistic regressions estimate the probabilities that out-of-pocket expenses exceed 5% and, alternatively, 10% of income in the two study years. These probabilities are calculated for individuals based on their income, health status, and elderly status. Results: Despite favorable changes in both health policy and the economy, large numbers of Americans continue to be exposed to high out-of-pocket expenditures. For instance, the results indicate that in 2013 over a quarter of nonelderly low-income citizens in poor health spent 10% or more of their income on out-of-pocket expenses, and over 40% of this group spent more than 5%. Moreover, for Americans as a whole, the probability of spending in excess of 5% of income on out-of-pocket costs increased by 1.4 percentage points between 2010 and 2013, with the largest increases occurring among low-income Americans; the probability of Americans spending more than 10% of income grew from 9.3% to 9.6%, with the largest increases also occurring among the poor. Conclusion: The magnitude of the financial burden of out-of-pocket spending and the most recent upward trends in it underscore a need to develop good measures of the degree to which health care policy exposes individuals to financial risk, and to closely monitor the Affordable Care Act’s success in reducing Americans’ exposure to large medical bills.
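
    The estimation strategy described under "Data and Method" amounts to a binary-outcome logistic regression in which the dependent variable flags whether out-of-pocket spending exceeds the threshold share of income. A minimal sketch on synthetic data follows; the covariates, coefficients, and sample are invented for illustration and bear no relation to the article's estimates.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 5_000
    low_income  = rng.integers(0, 2, n)    # hypothetical indicator covariates
    poor_health = rng.integers(0, 2, n)
    elderly     = rng.integers(0, 2, n)

    # synthetic outcome: 1 if out-of-pocket spending exceeds 10% of income
    xb = -2.5 + 1.2 * low_income + 1.0 * poor_health + 0.6 * elderly
    y = (rng.random(n) < 1.0 / (1.0 + np.exp(-xb))).astype(float)

    X = sm.add_constant(np.column_stack([low_income, poor_health, elderly]))
    fit = sm.Logit(y, X).fit(disp=False)
    print(fit.params.round(2))   # recovers roughly the assumed coefficients

    # predicted P(spending > 10% of income): low-income, poor-health, nonelderly
    print(float(fit.predict(np.array([[1.0, 1.0, 1.0, 0.0]]))))
    ```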

  17. Transitional Probabilities Are Prioritized over Stimulus/Pattern Probabilities in Auditory Deviance Detection: Memory Basis for Predictive Sound Processing.

    Science.gov (United States)

    Mittag, Maria; Takegata, Rika; Winkler, István

    2016-09-14

    Representations encoding the probabilities of auditory events do not directly support predictive processing. In contrast, information about the probability with which a given sound follows another (transitional probability) allows predictions of upcoming sounds. We tested whether behavioral and cortical auditory deviance detection (the latter indexed by the mismatch negativity event-related potential) relies on probabilities of sound patterns or on transitional probabilities. We presented healthy adult volunteers with three types of rare tone-triplets among frequent standard triplets of high-low-high (H-L-H) or L-H-L pitch structure: proximity deviant (H-H-H/L-L-L), reversal deviant (L-H-L/H-L-H), and first-tone deviant (L-L-H/H-H-L). If deviance detection was based on pattern probability, reversal and first-tone deviants should be detected with similar latency because both differ from the standard at the first pattern position. If deviance detection was based on transitional probabilities, then reversal deviants should be the most difficult to detect because, unlike the other two deviants, they contain no low-probability pitch transitions. The data clearly showed that both behavioral and cortical auditory deviance detection uses transitional probabilities. Thus, the memory traces underlying cortical deviance detection may provide a link between stimulus probability-based change/novelty detectors operating at lower levels of the auditory system and higher auditory cognitive functions that involve predictive processing. Our research presents the first definite evidence for the auditory system prioritizing transitional probabilities over probabilities of individual sensory events. Forming representations for transitional probabilities paves the way for predictions of upcoming sounds. Several recent theories suggest that predictive processing provides the general basis of human perception, including important auditory functions, such as auditory scene analysis. Our
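
    The distinction the study draws can be made concrete in a few lines of code: for a tone sequence built from frequent H-L-H triplets plus a rare deviant, pattern (triplet) probabilities and transitional probabilities P(next tone | current tone) are computed separately. The toy sequence below is an assumption for illustration, not the experimental stimulus protocol.

    ```python
    from collections import Counter

    seq = "HLH" * 100 + "HHH"   # frequent H-L-H triplets plus one deviant triplet

    # pattern (triplet) probabilities over non-overlapping triplets
    triplets = [seq[i:i + 3] for i in range(0, len(seq) - 2, 3)]
    pattern_p = {t: c / len(triplets) for t, c in Counter(triplets).items()}

    # transitional probabilities P(next tone | current tone)
    pairs = Counter(zip(seq, seq[1:]))
    totals = Counter(seq[:-1])
    trans_p = {f"{a}->{b}": c / totals[a] for (a, b), c in pairs.items()}

    print(pattern_p)   # {'HLH': ~0.99, 'HHH': ~0.01}
    print(trans_p)     # H->L and L->H dominate; H->H is a low-probability transition
    ```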

  18. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma

  19. ATLAS OpenData and OpenKey: using low tech computational tools for students training in High Energy Physics

    CERN Document Server

    Sanchez Pineda, Arturos; The ATLAS collaboration

    2018-01-01

    One of the big challenges in High Energy Physics development is the fact that many potential (and very valuable) students and young researchers live in countries where internet access and computational infrastructure are poor compared to those of institutions already participating. In order to accelerate the process, the ATLAS Open Data project releases useful and meaningful data and tools using standard and easy-to-deploy computational means, such as custom and light Linux Virtual Machines, open source technologies, and web and desktop applications. The ATLAS Open Key, a simple USB pen, allows transporting all those resources around the globe. As simple as it sounds, this approach is helping to train students who are now PhD candidates and to integrate HEP educational programs at the Master level in universities where they did not exist before. The software tools and resources used will be presented, as well as results and stories, ideas, and the next steps of the ATLAS Open Data project.

  20. Discovery of a probable galaxy with a redshift of 3.218

    International Nuclear Information System (INIS)

    Djorgovski, S.; Spinrad, H.; McCarthy, P.; Strauss, M.A.

    1985-01-01

    We report the discovery of a narrow emission line object, probably a galaxy, with a redshift of 3.218. The object is a companion to the quasar PKS 1614+051, which is at a redshift of 3.209. This is the most distant non-QSO, non-gravitationally lensed object presently known, by a large margin. Its properties are consistent with those expected of a high-redshift galaxy. This object has an age of only a few percent of the present age of the universe. The object was discovered with a novel technique, which promises to push studies of distant galaxies to redshifts as high as those of the most distant quasars known, and which may eventually lead to the discovery of primeval galaxies. This discovery opens the way for studies of galaxies beyond z = 3, which should prove invaluable for observational cosmology.

  1. Analytic Neutrino Oscillation Probabilities in Matter: Revisited

    Energy Technology Data Exchange (ETDEWEB)

    Parke, Stephen J. [Fermilab]; Denton, Peter B. [Copenhagen U.]; Minakata, Hisakazu [Madrid, IFT]

    2018-01-02

    We summarize our recent paper on neutrino oscillation probabilities in matter, explaining the importance, relevance and need for simple, highly accurate approximations to the neutrino oscillation probabilities in matter.

  2. SILENE and TDT: A code for collision probability calculations in XY geometries

    International Nuclear Information System (INIS)

    Sanchez, R.; Stankovski, Z.

    1993-01-01

    Collision probability methods are routinely used for cell and assembly multigroup transport calculations in core design tasks. Collision probability methods use a specialized tracking routine to compute neutron trajectories within a given geometric object. These trajectories are then used to generate the appropriate collision matrices in as many groups as required. Traditional tracking routines are based on "global" geometric descriptions (such as regular meshes) and are not able to cope with the geometric detail required in actual core calculations. Therefore, users have to modify their geometry in order to match the geometric model accepted by the tracking routine, thus introducing a modeling error whose evaluation requires the use of a "reference" method. Recently, an effort has been made to develop more flexible tracking routines, either by directly adopting tracking Monte Carlo techniques or by coding of complicated geometries. Among these, the SILENE and TDT package is being developed at the Commissariat à l'Energie Atomique to provide routine as well as reference calculations in arbitrarily shaped XY geometries. This package combines a direct graphical acquisition system (SILENE) together with a node-based collision probability code for XY geometries (TDT)

  3. Striatal activity is modulated by target probability.

    Science.gov (United States)

    Hon, Nicholas

    2017-06-14

    Target probability has well-known neural effects. In the brain, target probability is known to affect frontal activity, with lower probability targets producing more prefrontal activation than those that occur with higher probability. Although the effect of target probability on cortical activity is well specified, its effect on subcortical structures such as the striatum is less well understood. Here, I examined this issue and found that the striatum was highly responsive to target probability. This is consistent with its hypothesized role in the gating of salient information into higher-order task representations. The current data are interpreted in light of the fact that different components of the striatum are sensitive to different types of task-relevant information.

  4. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin

  5. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  6. Probability an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

    Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h

  7. Open Access Publishing in High-Energy Physics: the SCOAP3 Initiative

    Science.gov (United States)

    Mele, S.

    2010-10-01

    Scholarly communication in High-Energy Physics (HEP) shows traits very similar to Astronomy and Astrophysics: pervasiveness of Open Access to preprints through community-based services; a culture of openness and sharing among its researchers; a compact number of yearly articles published by a relatively small number of journals which are dear to the community. These aspects have led HEP to spearhead an innovative model for the transition of its scholarly publishing to Open Access. The Sponsoring Consortium for Open Access Publishing in Particle Physics (SCOAP3) aims to be a central body to finance peer-review service rather than the purchase of access to information as in the traditional subscription model, with all articles in the discipline eventually available in Open Access. Sustainable funding to SCOAP3 would come from libraries, library consortia and HEP funding agencies, through a re-direction of funds currently spent for subscriptions to HEP journals. This paper presents the cultural and bibliometric factors at the roots of SCOAP3 and the current status of this worldwide initiative.

  8. Climate drives inter-annual variability in probability of high severity fire occurrence in the western United States

    Science.gov (United States)

    Keyser, Alisa; Westerling, Anthony LeRoy

    2017-05-01

    A long history of fire suppression in the western United States has significantly changed forest structure and ecological function, leading to increasingly uncharacteristic fires in terms of size and severity. Prior analyses of fire severity in California forests showed that time since last fire and fire weather conditions predicted fire severity very well, while a larger regional analysis showed that topography and climate were important predictors of high severity fire. There has not yet been a large-scale study that incorporates topography, vegetation and fire-year climate to determine regional scale high severity fire occurrence. We developed models to predict the probability of high severity fire occurrence for the western US. We predict high severity fire occurrence with some accuracy, and identify the relative importance of predictor classes in determining the probability of high severity fire. The inclusion of both vegetation and fire-year climate predictors was critical for model skill in identifying fires with high fractional fire severity. The inclusion of fire-year climate variables allows this model to forecast inter-annual variability in areas at future risk of high severity fire, beyond what slower-changing fuel conditions alone can accomplish. This allows for more targeted land management, including resource allocation for fuels reduction treatments to decrease the risk of high severity fire.
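
    A schematic of the modeling approach: a classifier is fit to per-location predictors spanning topography, vegetation/fuels, and fire-year climate, then used both to predict the probability of high severity fire and to report the relative importance of each predictor class. Everything below (the data, the stylized response rule, the predictor names) is synthetic and illustrative, not the study's model.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(7)
    names = ["slope", "elevation", "fuel_load", "temp_anomaly", "drought_index"]
    X = rng.random((2_000, len(names)))    # synthetic per-location predictors

    # stylized rule: dense fuels plus a hot, dry fire year drive high severity
    p = 1.0 / (1.0 + np.exp(-(4*X[:, 2] + 3*X[:, 3] + 2*X[:, 4] - 5)))
    y = rng.random(2_000) < p

    model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
    print(dict(zip(names, model.feature_importances_.round(2))))
    print(model.predict_proba(X[:3])[:, 1])   # per-location P(high severity)
    ```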

  9. An analysis of the annual probability of failure of the waste hoist brake system at the Waste Isolation Pilot Plant (WIPP)

    Energy Technology Data Exchange (ETDEWEB)

    Greenfield, M.A. [Univ. of California, Los Angeles, CA (United States)]; Sargent, T.J.

    1995-11-01

    The Environmental Evaluation Group (EEG) previously analyzed the probability of a catastrophic accident in the waste hoist of the Waste Isolation Pilot Plant (WIPP) and published the results in Greenfield (1990; EEG-44) and Greenfield and Sargent (1993; EEG-53). The most significant safety element in the waste hoist is the hydraulic brake system, whose possible failure was identified in these studies as the most important contributor in accident scenarios. Westinghouse Electric Corporation, Waste Isolation Division has calculated the probability of an accident involving the brake system based on studies utilizing extensive fault tree analyses. This analysis conducted for the U.S. Department of Energy (DOE) used point estimates to describe the probability of failure and includes failure rates for the various components comprising the brake system. An additional controlling factor in the DOE calculations is the mode of operation of the brake system. This factor enters for the following reason. The basic failure rate per annum of any individual element is called the Event Probability (EP), and is expressed as the probability of failure per annum. The EP in turn is the product of two factors. One is the "reported" failure rate, usually expressed as the probability of failure per hour, and the other is the expected number of hours that the element is in use, called the "mission time". In many instances the "mission time" will be the number of operating hours of the brake system per annum. However, since the operation of the waste hoist system includes regular "reoperational check" tests, the "mission time" for standby components is reduced in accordance with the specifics of the operational time table.
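
    The EP definition above is a one-line product, which makes the effect of "mission time" easy to see numerically. The failure rate and hours below are hypothetical, chosen only to show how regular reoperational checks shrink a standby component's mission time and hence its EP.

    ```python
    # EP = "reported" failure rate [per hour] x "mission time" [hours per annum]
    failure_rate = 3.0e-6         # hypothetical failure rate, per operating hour
    hours_continuous = 8760.0     # component in service the whole year
    hours_standby = 52 * 2.0      # standby component exercised ~2 h/week by checks

    print(f"EP, continuous service: {failure_rate * hours_continuous:.1e} per annum")
    print(f"EP, tested standby:     {failure_rate * hours_standby:.1e} per annum")
    # regular reoperational checks shrink the mission time, and with it the EP
    ```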

  10. An analysis of the annual probability of failure of the waste hoist brake system at the Waste Isolation Pilot Plant (WIPP)

    International Nuclear Information System (INIS)

    Greenfield, M.A.; Sargent, T.J.

    1995-11-01

    The Environmental Evaluation Group (EEG) previously analyzed the probability of a catastrophic accident in the waste hoist of the Waste Isolation Pilot Plant (WIPP) and published the results in Greenfield (1990; EEG-44) and Greenfield and Sargent (1993; EEG-53). The most significant safety element in the waste hoist is the hydraulic brake system, whose possible failure was identified in these studies as the most important contributor in accident scenarios. Westinghouse Electric Corporation, Waste Isolation Division has calculated the probability of an accident involving the brake system based on studies utilizing extensive fault tree analyses. This analysis conducted for the U.S. Department of Energy (DOE) used point estimates to describe the probability of failure and includes failure rates for the various components comprising the brake system. An additional controlling factor in the DOE calculations is the mode of operation of the brake system. This factor enters for the following reason. The basic failure rate per annum of any individual element is called the Event Probability (EP), and is expressed as the probability of failure per annum. The EP in turn is the product of two factors. One is the "reported" failure rate, usually expressed as the probability of failure per hour, and the other is the expected number of hours that the element is in use, called the "mission time". In many instances the "mission time" will be the number of operating hours of the brake system per annum. However, since the operation of the waste hoist system includes regular "reoperational check" tests, the "mission time" for standby components is reduced in accordance with the specifics of the operational time table.

  11. The Everett-Wheeler interpretation and the open future

    International Nuclear Information System (INIS)

    Sudbery, Anthony

    2011-01-01

    I discuss the meaning of probability in the Everett-Wheeler interpretation of quantum mechanics, together with the problem of defining histories. To resolve these, I propose an understanding of probability arising from a form of temporal logic: the probability of a future-tense proposition is identified with its truth value in a many-valued and context-dependent logic. In short, probability is degree of truth. These ideas relate to traditional naive ideas of time and chance. Indeed, I argue that Everettian quantum mechanics is the only form of scientific theory that truly incorporates the perception that the future is open.

  12. Early age stress-crack opening relationships for high performance concrete

    DEFF Research Database (Denmark)

    Østergaard, Lennart; Lange, David A.; Stang, Henrik

    2004-01-01

    Stress–crack opening relationships for concrete in early age have been determined for two high performance concrete mixes with water to cementitious materials ratios of 0.307 and 0.48. The wedge splitting test setup was used experimentally, and the cracked nonlinear hinge model based on the fictitious crack model was applied for the interpretation of the results. A newly developed inverse analysis algorithm was utilized for the extraction of the stress–crack opening relationships. Experiments were conducted at 8, 10, 13, 17, 22, 28, 48, 168 h (7 days) and 672 h (28 days). At the same ages...

  13. Demonstration of a High Open-Circuit Voltage GaN Betavoltaic Microbattery

    International Nuclear Information System (INIS)

    Cheng Zai-Jun; San Hai-Sheng; Chen Xu-Yuan; Liu Bo; Feng Zhi-Hong

    2011-01-01

    A high open-circuit voltage betavoltaic microbattery based on a GaN p-i-n diode is demonstrated. Under the irradiation of a 4×4 mm² planar solid 63Ni source with an activity of 2 mCi, the open-circuit voltage Voc of the fabricated single 2×2 mm² cell reaches as high as 1.62 V, and the short-circuit current density Jsc is measured to be 16 nA/cm². The microbattery has a fill factor of 55%, and the energy conversion efficiency of beta radiation into electricity reaches 1.13%. The results suggest that GaN is a highly promising candidate for long-life betavoltaic microbatteries used as power supplies for microelectromechanical system devices. (cross-disciplinary physics and related areas of science and technology)
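
    The abstract's figures combine into an areal power density via the standard photovoltaic relation P_max = FF × Voc × Jsc. A quick check, assuming the quoted values apply at the maximum power point in the usual way:

    ```python
    # Figures quoted in the abstract
    voc = 1.62        # open-circuit voltage [V]
    jsc = 16e-9       # short-circuit current density [A/cm^2]
    ff  = 0.55        # fill factor
    area = 0.2 * 0.2  # 2 x 2 mm cell [cm^2]

    p_density = ff * voc * jsc            # maximum areal power density [W/cm^2]
    print(f"P_max ~ {p_density * 1e9:.1f} nW/cm^2")          # ~14.3 nW/cm^2
    print(f"cell output ~ {p_density * area * 1e9:.2f} nW")  # ~0.57 nW
    ```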

  14. ON THE ORIGIN OF HIGH-ALTITUDE OPEN CLUSTERS IN THE MILKY WAY

    Energy Technology Data Exchange (ETDEWEB)

    Martinez-Medina, L. A.; Pichardo, B.; Moreno, E.; Peimbert, A. [Instituto de Astronomía, Universidad Nacional Autónoma de México, A.P. 70-264, 04510, México, D.F., México (Mexico)]; Velazquez, H., E-mail: lamartinez@astro.unam.mx [Instituto de Astronomía, Universidad Nacional Autónoma de México, Apartado Postal 877, 22860 Ensenada, B.C., México (Mexico)]

    2016-01-20

    We present a dynamical study of the effect of the bar and spiral arms on the simulated orbits of open clusters in the Galaxy. Specifically, this work is devoted to the puzzling presence of high-altitude open clusters in the Galaxy. For this purpose we employ a very detailed observationally motivated potential model for the Milky Way and a careful set of initial conditions representing the newly born open clusters in the thin disk. We find that the spiral arms are able to raise an important percentage of open clusters (about one-sixth of the total employed in our simulations, depending on the structural parameters of the arms) above the Galactic plane to heights beyond 200 pc, producing a bulge-shaped structure toward the center of the Galaxy. Contrary to what was expected, the spiral arms produce a much greater vertical effect on the clusters than the bar, both in quantity and height; this is due to the sharper concentration of the mass on the spiral arms, when compared to the bar. When a bar and spiral arms are included, spiral arms are still capable of raising an important percentage of the simulated open clusters through chaotic diffusion (as tested from classification analysis of the resultant high-z orbits), but the bar seems to restrain them, diminishing the elevation above the plane by a factor of about two.

  15. Probability concepts in quality risk management.

    Science.gov (United States)

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk is generally a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management tools are relatively silent on the meaning and uses of "probability." The probability concept is typically applied by risk managers as a combination of frequency-based calculation and a "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples from the most general, scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management. Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although "risk" generally describes a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management methodologies and respective tools focus on managing severity but are relatively silent on the in-depth meaning and uses of "probability." Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, marketing to product discontinuation. A probability concept is typically applied by risk managers as a combination of data-based measures of probability and a subjective "degree of belief" meaning of probability. Probability as

  16. BOOK REVIEW: OPENING SCIENCE, THE EVOLVING GUIDE ...

    Science.gov (United States)

    The way we get our funding, collaborate, do our research, and get the word out has evolved over hundreds of years, but we can imagine a more open science world, largely facilitated by the internet. The movement towards this more open way of doing and presenting science is coming, and it is not taking hundreds of years. If you are interested in these trends, and would like to find out more about where this is all headed and what it means to you, consider downloading Opening Science, edited by Sönke Bartling and Sascha Friesike, subtitled The Evolving Guide on How the Internet is Changing Research, Collaboration, and Scholarly Publishing. In 26 chapters by various authors from a range of disciplines, the book explores the developing world of open science, starting from the first scientific revolution and bringing us to the next scientific revolution, sometimes referred to as “Science 2.0”. Some of the articles deal with the impact of the changing landscape of how science is done, looking at the impact of open science on Academia, or journal publishing, or medical research. Many of the articles look at the uses, pitfalls, and impact of specific tools, like microblogging (think Twitter), social networking, and reference management. There is lots of discussion and definition of terms you might use or misuse like “altmetrics” and “impact factor”. Science will probably never be completely open, and Twitter will probably never replace the journal article,

  17. Approximation of rejective sampling inclusion probabilities and application to high order correlations

    NARCIS (Netherlands)

    Boistard, H.; Lopuhää, H.P.; Ruiz-Gazen, A.

    2012-01-01

    This paper is devoted to rejective sampling. We provide an expansion of joint inclusion probabilities of any order in terms of the inclusion probabilities of order one, extending previous results by Hájek (1964) and Hájek (1981) and making the remainder term more precise. Following Hájek (1981), the

  18. Toward a generalized probability theory: conditional probabilities

    International Nuclear Information System (INIS)

    Cassinelli, G.

    1979-01-01

    The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)

  19. A 'new' Cromer-related high frequency antigen probably antithetical to WES.

    Science.gov (United States)

    Daniels, G L; Green, C A; Darr, F W; Anderson, H; Sistonen, P

    1987-01-01

    An antibody to a high frequency antigen, made in a WES+ Black antenatal patient (Wash.), failed to react with the red cells of a presumed WES+ homozygote and is, therefore, probably antithetical to anti-WES. Like anti-WES, it reacted with papain, ficin, trypsin or neuraminidase treated cells but not with alpha-chymotrypsin or pronase treated cells and was specifically inhibited by concentrated serum. It also reacted more strongly in titration with WES- cells than with WES+ cells. The antibody is Cromer-related as it failed to react with Inab phenotype (IFC-) cells and reacted only weakly with Dr(a-) cells. Wash. cells and those of the other possible WES+ homozygote are Cr(a+) Tc(a+b-c-) Dr(a+) IFC+ but reacted only very weakly with anti-Esa.

  20. Maladaptively high and low openness: the case for experiential permeability.

    Science.gov (United States)

    Piedmont, Ralph L; Sherman, Martin F; Sherman, Nancy C

    2012-12-01

    The domain of Openness within the Five-Factor Model (FFM) has received inconsistent support as a source for maladaptive personality functioning, at least when the latter is confined to the disorders of personality included within the American Psychiatric Association's (APA) Diagnostic and Statistical Manual of Mental Disorders (DSM-IV-TR; APA, ). However, an advantage of the FFM relative to the DSM-IV-TR is that the former was developed to provide a reasonably comprehensive description of general personality structure. Rather than suggest that the FFM is inadequate because the DSM-IV-TR lacks much representation of Openness, it might be just as reasonable to suggest that the DSM-IV-TR is inadequate because it lacks an adequate representation of maladaptive variants of both high and low Openness. This article discusses the development and validation of a measure of these maladaptive variants, the Experiential Permeability Inventory. © 2012 The Authors. Journal of Personality © 2012, Wiley Periodicals, Inc.

  1. Advanced RESTART method for the estimation of the probability of failure of highly reliable hybrid dynamic systems

    International Nuclear Information System (INIS)

    Turati, Pietro; Pedroni, Nicola; Zio, Enrico

    2016-01-01

    The efficient estimation of system reliability characteristics is of paramount importance for many engineering applications. Real-world system reliability modeling calls for the capability of treating systems that are: i) dynamic, ii) complex, iii) hybrid and iv) highly reliable. Advanced Monte Carlo (MC) methods offer a way to solve these types of problems, which would otherwise be intractable given the potentially high computational costs. In this paper, the REpetitive Simulation Trials After Reaching Thresholds (RESTART) method is employed, extending it to hybrid systems for the first time (to the authors’ knowledge). The estimation accuracy and precision of RESTART depend strongly on the choice of the Importance Function (IF) indicating how close the system is to failure: in this respect, new IFs are proposed here to improve the performance of RESTART for the analysis of hybrid systems. The resulting overall simulation approach is applied to estimate the probability of failure of the control system of a liquid hold-up tank and of a pump-valve subsystem subject to degradation induced by fatigue. The results are compared to those obtained by standard MC simulation and by RESTART with classical IFs available in the literature. The comparison shows the improvement in performance obtained by our approach. - Highlights: • We consider the issue of estimating small failure probabilities in dynamic systems. • We employ the RESTART method to estimate the failure probabilities. • New Importance Functions (IFs) are introduced to increase the method performance. • We adopt two dynamic, hybrid, highly reliable systems as case studies. • A comparison with literature IFs proves the effectiveness of the new IFs.
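
    RESTART belongs to the family of splitting methods: trajectories that cross a threshold of the importance function are cloned, so a rare failure is reached through many partial successes instead of one very lucky run. The sketch below is a generic fixed-splitting estimator on a toy problem with a known answer (a symmetric random walk from level 1 hits level L before 0 with probability 1/L); it illustrates the idea of importance levels but is not the paper's RESTART implementation, and all parameters are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def reaches_next_level(start, lo=0):
        """Symmetric +/-1 random walk from `start` until it hits start+1
        (importance level crossed) or `lo` (failure of this trajectory)."""
        x, target = start, start + 1
        while lo < x < target:
            x += rng.choice((-1, 1))
        return x == target

    def splitting_estimate(L=8, n0=200, R=2):
        """Fixed-splitting estimate of P(walk from 1 hits L before 0) = 1/L.
        Levels 1..L play the role of importance function thresholds."""
        particles = n0                      # walkers alive at the current level
        for level in range(1, L):
            survivors = sum(reaches_next_level(level) for _ in range(particles))
            if survivors == 0:
                return 0.0
            particles = survivors * R       # clone each survivor R times
        # undo the R-fold cloning applied after each of the (L - 1) crossings
        return particles / (n0 * R ** (L - 1))

    print(splitting_estimate())             # fluctuates around 1/8 = 0.125
    ```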

  2. Measures, Probability and Holography in Cosmology

    Science.gov (United States)

    Phillips, Daniel

    This dissertation compiles four research projects on predicting values for cosmological parameters and models of the universe on the broadest scale. The first examines the Causal Entropic Principle (CEP) in inhomogeneous cosmologies. The CEP aims to predict the unexpectedly small value of the cosmological constant Λ using a weighting by entropy increase on causal diamonds. The original work assumed a purely isotropic and homogeneous cosmology. But even the level of inhomogeneity observed in our universe forces reconsideration of certain arguments about entropy production. In particular, we must consider an ensemble of causal diamonds associated with each background cosmology, and we can no longer immediately discard entropy production in the far future of the universe. Depending on our choices for a probability measure and our treatment of black hole evaporation, the prediction for Λ may be left intact or dramatically altered. The second, related project extends the CEP to universes with curvature. We have found that curvature values larger than ρ_k = 40ρ_m are disfavored by more than 99.99%, with a peak value at ρ_Λ = 7.9 × 10^-123 and ρ_k = 4.3ρ_m for open universes. For universes that allow only positive curvature or both positive and negative curvature, we find a correlation between curvature and dark energy that leads to an extended region of preferred values. Our universe is found to be disfavored to an extent depending on the priors on curvature. We also provide a comparison to previous anthropic constraints on open universes and discuss future directions for this work. The third project examines how cosmologists should formulate basic questions of probability. We argue using simple models that all successful practical uses of probabilities originate in quantum fluctuations in the microscopic physical world around us, often propagated to macroscopic scales. Thus we claim there is no physically verified fully classical theory of probability. We

  3. Ethoscopes: An open platform for high-throughput ethomics.

    Directory of Open Access Journals (Sweden)

    Quentin Geissmann

    2017-10-01

    Here, we present the use of ethoscopes, which are machines for high-throughput analysis of behavior in Drosophila and other animals. Ethoscopes provide a software and hardware solution that is reproducible and easily scalable. They perform, in real-time, tracking and profiling of behavior by using a supervised machine learning algorithm, are able to deliver behaviorally triggered stimuli to flies in a feedback-loop mode, and are highly customizable and open source. Ethoscopes can be built easily by using 3D printing technology and rely on Raspberry Pi microcomputers and Arduino boards to provide affordable and flexible hardware. All software and construction specifications are available at http://lab.gilest.ro/ethoscope.

  4. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko

    2004-01-01

    Using a simple relation connecting the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  5. Which Type of Inquiry Project Do High School Biology Students Prefer: Open or Guided?

    Science.gov (United States)

    Sadeh, Irit; Zion, Michal

    2012-10-01

    In teaching inquiry to high school students, educators differ on which method of teaching inquiry is more effective: Guided or open inquiry? This paper examines the influence of these two different inquiry learning approaches on the attitudes of Israeli high school biology students toward their inquiry project. The results showed significant differences between the two groups: Open inquiry students were more satisfied and felt they gained benefits from implementing the project to a greater extent than guided inquiry students. On the other hand, regarding documentation throughout the project, guided inquiry students believed that they conducted more documentation, as compared to their open inquiry peers. No significant differences were found regarding 'the investment of time', but significant differences were found in the time invested and difficulties which arose concerning the different stages of the inquiry process: Open inquiry students believed they spent more time in the first stages of the project, while guided inquiry students believed they spent more time in writing the final paper. In addition, other differences were found: Open inquiry students felt more involved in their project, and felt a greater sense of cooperation with others, in comparison to guided inquiry students. These findings may help teachers who hesitate to teach open inquiry to implement this method of inquiry; or at least provide their students with the opportunity to be more involved in inquiry projects, and ultimately provide their students with more autonomy, high-order thinking, and a deeper understanding in performing science.

  6. Emergence and stability of intermediate open vesicles in disk-to-vesicle transitions.

    Science.gov (United States)

    Li, Jianfeng; Zhang, Hongdong; Qiu, Feng; Shi, An-Chang

    2013-07-01

    The transition between two basic structures, a disk and an enclosed vesicle, of a finite membrane is studied by examining the minimum energy path (MEP) connecting these two states. The MEP is constructed using the string method applied to continuum elastic membrane models. The results reveal that, besides the commonly observed disk and vesicle, open vesicles (bowl-shaped vesicles or vesicles with a pore) can become stable or metastable shapes. The emergence, stability, and probability distribution of these open vesicles are analyzed. It is demonstrated that open vesicles can be stabilized by higher-order elastic energies. The estimated probability distribution of the different structures is in good agreement with available experiments.
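
    The minimum energy path machinery used here is generic: the string method evolves a chain of images downhill in the energy landscape while reparametrizing them to stay evenly spaced, and the converged string passes through the saddle point whose height sets the barrier. The sketch below applies it to a simple two-dimensional double-well stand-in (not the elastic membrane energy of the paper); the potential, step size, and image count are illustrative.

    ```python
    import numpy as np

    def V(p):                     # illustrative 2D double-well energy
        x, y = p[..., 0], p[..., 1]
        return (x**2 - 1.0)**2 + 5.0 * y**2

    def gradV(p):
        x, y = p[..., 0], p[..., 1]
        return np.stack([4.0 * x * (x**2 - 1.0), 10.0 * y], axis=-1)

    def string_method(n_images=20, n_iter=2000, dt=1e-3):
        # initial string: straight line between basins near (-1, 0) and (1, 0)
        s = np.linspace([-1.0, 0.2], [1.0, 0.2], n_images)
        for _ in range(n_iter):
            s = s - dt * gradV(s)          # relax every image downhill
            # reparametrize to equal arc length so images stay spread out
            d = np.r_[0.0, np.cumsum(np.linalg.norm(np.diff(s, axis=0), axis=1))]
            u = np.linspace(0.0, d[-1], n_images)
            s = np.stack([np.interp(u, d, s[:, k]) for k in range(2)], axis=1)
        return s

    path = string_method()
    print(round(float(V(path).max()), 3))   # barrier along the MEP; ~1.0 here
    ```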

  7. The development of an open architecture control system for CBN high speed grinding

    OpenAIRE

    Silva, E. Jannone da; Biffi, M.; Oliveira, J. F. G. de

    2004-01-01

    The aim of this project is the development of an open architecture control (OAC) system to be applied to the high speed grinding process using CBN tools. Among other features, the system will allow a new monitoring and controlling strategy, through the adoption of an open architecture CNC combined with multiple sensors, a PC and third-party software. The OAC system will be implemented in a high speed CBN grinding machine, which is being developed in a partnership between the University of São Paul...

  8. Development of risk assessment simulation tool for optimal control of a low probability-high consequence disaster

    International Nuclear Information System (INIS)

    Yotsumoto, Hiroki; Yoshida, Kikuo; Genchi, Hiroshi

    2011-01-01

    In order to control a low probability-high consequence disaster, which causes huge social and economic damage, it is necessary to develop a simultaneous risk assessment simulation tool based on a scheme of disaster risk that includes the diverse effects of the primary disaster and secondary damages. We propose the scheme of this risk simulation tool. (author)

  9. COVAL, Compound Probability Distribution for Function of Probability Distribution

    International Nuclear Information System (INIS)

    Astolfi, M.; Elbaz, J.

    1979-01-01

    1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions

  10. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events y, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.

  11. Intelligent tutorial system for teaching of probability and statistics at high school in Mexico

    Directory of Open Access Journals (Sweden)

    Fernando Gudino Penaloza, Miguel Gonzalez Mendoza, Neil Hernandez Gress, Jaime Mora Vargas

    2009-12-01

    This paper describes the implementation of an intelligent tutoring system dedicated to teaching probability and statistics at the preparatory school (or high school) level in Mexico. The system was first used as a desktop application and then adapted to a mobile environment for the implementation of mobile learning, or m-learning. The system complies with the idea of being adaptable to the needs of each student and is able to adapt to three different teaching models that meet the criteria of three student profiles.

  12. Opening of K+ channels by capacitive stimulation from silicon chip

    Science.gov (United States)

    Ulbrich, M. H.; Fromherz, P.

    2005-10-01

    The development of stable neuroelectronic systems requires a stimulation of nerve cells from semiconductor devices without electrochemical effects at the electrolyte/solid interface and without damage to the cell membrane. The interaction must rely on a reversible opening of voltage-gated ion channels by capacitive coupling. In a proof-of-principle experiment, we demonstrate that Kv1.3 potassium channels expressed in HEK293 cells can be opened from an electrolyte/oxide/silicon (EOS) capacitor. A sufficient strength of electrical coupling is achieved by insulating the silicon with a thin film of TiO2 to achieve a high capacitance and by removing NaCl from the electrolyte to enhance the resistance of the cell-chip contact. When a decaying voltage ramp is applied to the EOS capacitor, an outward current through the attached cell membrane is observed that is specific for Kv1.3 channels. An open probability of up to fifty percent is estimated by comparison with a numerical simulation of the cell-chip contact.

  13. Causality between trade openness and energy consumption: What causes what in high, middle and low income countries

    International Nuclear Information System (INIS)

    Shahbaz, Muhammad; Nasreen, Samia; Ling, Chong Hui; Sbia, Rashid

    2014-01-01

    This paper explores the relationship between trade openness and energy consumption using data from 91 high, middle and low income countries. The study covers the period 1980–2010. We applied panel cointegration to examine the long-run relationship between the variables. The direction of the causal relationship between trade openness and energy consumption is investigated by applying homogenous non-causality, homogenous causality and heterogeneous causality tests. Our variables are integrated at I(1), as confirmed by time series and panel unit root tests, and cointegration is found between trade openness and energy consumption. The relationship between trade openness and energy consumption is inverted U-shaped in high income countries but U-shaped in middle and low income countries. The homogenous and non-homogenous causality analysis reveals bidirectional causality between trade openness and energy consumption. Following the heterogeneous causality findings, this paper opens up new insights for policy makers to design comprehensive economic, trade and energy policies for sustainable long-run economic growth. - Highlights: • Trade openness and energy consumption are cointegrated in the long run. • A feedback effect exists between trade openness and energy consumption. • An inverted U-shaped relationship is found between both variables in high income countries

  14. Influence of the Probability Level on the Framing Effect

    Directory of Open Access Journals (Sweden)

    Kaja Damnjanovic

    2016-11-01

    Full Text Available Research on the framing effect in risky choice has mostly used tasks that examine the effect of only one probability or risk level on the choice between non-risky and risky options. The present research examined the framing effect as a function of the probability level of the risky option's outcome in three decision-making domains: health, money and human lives. It confirmed that the decision-making domain moderates the framing effect. In the monetary domain, the general risk aversion registered in earlier research was confirmed. At high probability levels the framing effect is registered in both frames, while no framing effect is registered at lower probability levels. In the domain of decisions about human lives, the framing effect is registered at medium-high and medium-low probability levels. In the domain of decisions about health, the framing effect is registered across almost the entire probability range, and this domain differs from the former two. The results show that the attitude to risk is not identical at different probability levels, that the dynamics of the attitude to risk influences the framing effect, and that the framing-effect pattern differs across decision-making domains. In other words, the linguistic manipulation representing the frame affects the change in preference order only when the possibility of gain (expressed as a probability) is estimated as sufficiently high.

  15. Estimating the empirical probability of submarine landslide occurrence

    Science.gov (United States)

    Geist, Eric L.; Parsons, Thomas E.; Mosher, David C.; Shipp, Craig; Moscardelli, Lorena; Chaytor, Jason D.; Baxter, Christopher D. P.; Lee, Homa J.; Urgeles, Roger

    2010-01-01

    The empirical probability for the occurrence of submarine landslides at a given location can be estimated from age dates of past landslides. In this study, tools developed to estimate earthquake probability from paleoseismic horizons are adapted to estimate submarine landslide probability. In both types of estimates, one has to account for the uncertainty associated with age-dating individual events as well as the open time intervals before and after the observed sequence of landslides. For observed sequences of submarine landslides, we typically only have the age date of the youngest event and possibly of a seismic horizon that lies below the oldest event in a landslide sequence. We use an empirical Bayes analysis based on the Poisson-Gamma conjugate prior model specifically applied to the landslide probability problem. This model assumes that landslide events as imaged in geophysical data are independent and occur in time according to a Poisson distribution characterized by a rate parameter λ. With this method, we are able to estimate the most likely value of λ and, importantly, the range of uncertainty in this estimate. Examples considered include landslide sequences observed in the Santa Barbara Channel, California, and in Port Valdez, Alaska. We confirm that given the uncertainties of age dating that landslide complexes can be treated as single events by performing statistical test of age dates representing the main failure episode of the Holocene Storegga landslide complex.
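
    A sketch of the Poisson-Gamma conjugate update described in the record follows. The prior parameters, event count, and record length are hypothetical placeholders, not values from the cited case studies.

    ```python
    # Empirical-Bayes-style Poisson-Gamma update for a landslide rate lambda.
    from scipy.stats import gamma

    a0, b0 = 1.0, 1.0        # Gamma(shape, rate) prior on lambda (hypothetical)
    n_events = 4             # hypothetical number of dated landslides
    T = 10_000.0             # years spanned by the dated sequence (hypothetical)

    a_post, b_post = a0 + n_events, b0 + T      # conjugate posterior
    post = gamma(a=a_post, scale=1.0 / b_post)  # scipy uses scale = 1/rate
    lo, hi = post.ppf([0.025, 0.975])
    print(f"lambda ~ {post.mean():.2e} per yr (95% interval {lo:.2e} to {hi:.2e})")

    # Posterior predictive probability of at least one event in the next t
    # years, integrating over lambda: 1 - (b/(b+t))**a (negative-binomial zero term).
    t = 100.0
    p_one = 1.0 - (b_post / (b_post + t)) ** a_post
    print(f"P(>=1 event in next {t:.0f} yr) = {p_one:.3f}")
    ```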

  16. Search for "polarized" instantons in the vacuum

    International Nuclear Information System (INIS)

    Kuchiev, M.Y.

    1996-01-01

    The new phase of a gauge theory in which the instantons are "polarized," i.e., have the preferred orientation, is discussed. A class of gauge theories with the specific condensates of the scalar fields is considered. In these models there exists an interaction between instantons resulting from one-fermion loop corrections. The interaction makes the identical orientation of instantons the most probable, permitting one to expect the system to undergo a phase transition into the state with polarized instantons. The existence of this phase is confirmed in the mean-field approximation in which there is a first-order phase transition separating the "polarized" phase from the usual nonpolarized one. The considered phase can be important for the description of gravity in the framework of the gauge field theory. copyright 1996 The American Physical Society

  17. Probability and containment of turbine missiles

    International Nuclear Information System (INIS)

    Yeh, G.C.K.

    1976-01-01

    With the trend toward ever larger power generating plants with large high-speed turbines, an important plant design consideration is the potential for and consequences of mechanical failure of turbine rotors. Such rotor failure could result in high-velocity disc fragments (turbine missiles) perforating the turbine casing and jeopardizing vital plant systems. The designer must first estimate the probability of any turbine missile damaging any safety-related plant component for his turbine and his plant arrangement. If the probability is not low enough to be acceptable to the regulatory agency, he must design a shield to contain the postulated turbine missiles. Alternatively, the shield could be designed to retard (to reduce the velocity of) the missiles such that they would not damage any vital plant system. In this paper, some of the presently available references that can be used to evaluate the probability, containment and retardation of turbine missiles are reviewed; various alternative methods are compared; and subjects for future research are recommended. (Auth.)

  18. The Post-Embargo Open Access Citation Advantage: It Exists (Probably), It's Modest (Usually), and the Rich Get Richer (of Course).

    Science.gov (United States)

    Ottaviani, Jim

    2016-01-01

    Many studies show that open access (OA) articles - articles from scholarly journals made freely available to readers without requiring subscription fees - are downloaded, and presumably read, more often than closed access/subscription-only articles. Assertions that OA articles are also cited more often generate more controversy. Confounding factors (authors may self-select only the best articles to make OA; absence of an appropriate control group of non-OA articles with which to compare citation figures; conflation of pre-publication vs. published/publisher versions of articles, etc.) make demonstrating a real citation difference difficult. This study addresses those factors and shows that an open access citation advantage as high as 19% exists, even when articles are embargoed during some or all of their prime citation years. Not surprisingly, better (defined as above median) articles gain more when made OA.

  19. Hydralazine-induced vasodilation involves opening of high conductance Ca2+-activated K+ channels

    DEFF Research Database (Denmark)

    Bang, Lone; Nielsen-Kudsk, J E; Gruhn, N

    1998-01-01

    The purpose of this study was to investigate whether high conductance Ca2+-activated K+ channels (BK(Ca)) are mediating the vasodilator action of hydralazine. In isolated porcine coronary arteries, hydralazine (1-300 microM), like the K+ channel opener levcromakalim, preferentially relaxed......M) suppressed this response by 82% (P opening of BK(Ca) takes part in the mechanism whereby...

  20. Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces

    International Nuclear Information System (INIS)

    Vourdas, A.

    2014-01-01

    The orthocomplemented modular lattice of subspaces L[H(d)], of a quantum system with d-dimensional Hilbert space H(d), is considered. A generalized additivity relation which holds for Kolmogorov probabilities is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H₁, H₂), which quantifies deviations from Kolmogorov probability theory, is introduced, and it is shown to be intimately related to the commutator of the projectors P(H₁), P(H₂) onto the subspaces H₁, H₂. As an application, it is shown that the proof of the inequalities of Clauser, Horne, Shimony, and Holt for a system of two spin-1/2 particles is valid for Kolmogorov probabilities, but it is not valid for Dempster-Shafer probabilities. The violation of these inequalities in experiments supports the interpretation of quantum probabilities as Dempster-Shafer probabilities

  1. Open-field behavior of house mice selectively bred for high voluntary wheel-running.

    Science.gov (United States)

    Bronikowski, A M; Carter, P A; Swallow, J G; Girard, I A; Rhodes, J S; Garland, T

    2001-05-01

    Open-field behavioral assays are commonly used to test both locomotor activity and emotionality in rodents. We performed open-field tests on house mice (Mus domesticus) from four replicate lines genetically selected for high voluntary wheel-running for 22 generations and from four replicate random-bred control lines. Individual mice were recorded by video camera for 3 min in a 1-m² open-field arena on 2 consecutive days. Mice from selected lines showed no statistical differences from control mice with respect to distance traveled, defecation, time spent in the interior, or average distance from the center of the arena during the trial. Thus, we found little evidence that open-field behavior, as traditionally defined, is genetically correlated with wheel-running behavior. This result is a useful converse test of classical studies that report no increased wheel-running in mice selected for increased open-field activity. However, mice from selected lines turned less in their travel paths than did control-line mice, and females from selected lines had slower travel times (longer latencies) to reach the wall. We discuss these results in the context of the historical open-field test and newly defined measures of open-field activity.

  2. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  3. Failure probability analysis of optical grid

    Science.gov (United States)

    Zhong, Yaoquan; Guo, Wei; Sun, Weiqiang; Jin, Yaohui; Hu, Weisheng

    2008-11-01

    Optical grid, the integrated computing environment based on optical networks, is expected to be an efficient infrastructure to support advanced data-intensive grid applications. In an optical grid, faults of both computational and network resources are inevitable due to the large scale and high complexity of the system. As optical-network-based distributed computing systems become widely applied to data processing, the application failure probability has become an important indicator of application quality and an important aspect that operators consider. This paper presents a task-based method for analyzing the application failure probability in an optical grid. The failure probability of the entire application can then be quantified, and the performance of different backup strategies in reducing the application failure probability can be compared, so that the different requirements of different clients can be satisfied. In an optical grid, when an application based on a DAG (directed acyclic graph) is executed under different backup strategies, the application failure probability and the application completion time differ. This paper proposes a new multi-objective differentiated services algorithm (MDSA). The new application scheduling algorithm can guarantee the required failure probability and improve network resource utilization, realizing a compromise between the network operator and the application submitter. Differentiated services can thus be achieved in an optical grid.
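
    A hedged sketch of the task-based failure-probability idea (not the paper's MDSA algorithm) follows, assuming independent task failures so that the application fails if any task fails; all per-task probabilities are hypothetical.

    ```python
    # If an application fails whenever any of its DAG tasks fails, and task
    # failures are independent, the application failure probability is
    # 1 - prod(1 - p_i). A backup copy of task i roughly squares p_i.
    import numpy as np

    p = np.array([0.01, 0.02, 0.005, 0.01])    # hypothetical per-task failure probs

    def app_failure(p_tasks):
        return 1.0 - np.prod(1.0 - p_tasks)

    print("no backup       :", app_failure(p))
    p_backup = p.copy()
    p_backup[1] = p[1] ** 2                    # replicate the riskiest task
    print("task 2 backed up:", app_failure(p_backup))
    ```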

  4. High School Teachers' Openness to Adopting New Practices: The Role of Personal Resources and Organizational Climate.

    Science.gov (United States)

    Johnson, Stacy R; Pas, Elise T; Loh, Deanna; Debnam, Katrina J; Bradshaw, Catherine P

    2017-03-01

    Although evidence-based practices for students' social, emotional, and behavioral health are readily available, their adoption and quality implementation in schools are of increasing concern. Teachers are vital to implementation; yet, there is limited research on teachers' openness to adopting new practices, which may be essential to successful program adoption and implementation. The current study explored how perceptions of principal support, teacher affiliation, teacher efficacy, and burnout relate to teachers' openness to new practices. Data came from 2,133 teachers across 51 high schools. Structural equation modeling assessed how organizational climate (i.e., principal support and teacher affiliation) related to teachers' openness directly and indirectly via teacher resources (i.e., efficacy and burnout). Teachers with more favorable perceptions of both principal support and teacher affiliation reported greater efficacy, and, in turn, more openness; however, burnout was not significantly associated with openness. Post hoc analyses indicated that among teachers with high levels of burnout, only principal support related to greater efficacy, and in turn, higher openness. Implications for promoting teachers' openness to new program adoption are discussed.

  5. Robust estimation of the expected survival probabilities from high-dimensional Cox models with biomarker-by-treatment interactions in randomized clinical trials

    Directory of Open Access Journals (Sweden)

    Nils Ternès

    2017-05-01

    Full Text Available Abstract Background Thanks to advances in genomics and targeted treatments, more and more prediction models based on biomarkers are being developed to predict potential benefit from treatments in a randomized clinical trial. Although the methodological framework for the development and validation of prediction models in a high-dimensional setting is increasingly well established, no clear guidance exists yet on how to estimate expected survival probabilities in a penalized model with biomarker-by-treatment interactions. Methods Based on a parsimonious biomarker selection in a penalized high-dimensional Cox model (lasso or adaptive lasso), we propose a unified framework to: estimate internally the predictive accuracy metrics of the developed model (using double cross-validation); estimate the individual survival probabilities at a given timepoint; construct confidence intervals thereof (analytical or bootstrap); and visualize them graphically (pointwise or smoothed with spline). We compared these strategies through a simulation study covering scenarios with or without biomarker effects. We applied the strategies to a large randomized phase III clinical trial that evaluated the effect of adding trastuzumab to chemotherapy in 1574 early breast cancer patients, for which the expression of 462 genes was measured. Results In our simulations, penalized regression models using the adaptive lasso estimated the survival probability of new patients with low bias and standard error; bootstrapped confidence intervals had empirical coverage probability close to the nominal level across very different scenarios. The double cross-validation performed on the training data set closely mimicked the predictive accuracy of the selected models in external validation data. We also propose a useful visual representation of the expected survival probabilities using splines. In the breast cancer trial, the adaptive lasso penalty selected a prediction model with 4
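
    A minimal numpy sketch of the quantity being estimated, an individual survival probability from a Cox model via S(t|x) = S0(t)^exp(x·β), follows. The baseline survival and coefficients are hypothetical, and the record's lasso selection and double cross-validation steps are not reproduced.

    ```python
    # Individual survival probability from a (already fitted) Cox model.
    import numpy as np

    beta = np.array([0.8, -0.5])   # selected biomarker/interaction effects (assumed)
    S0_t = 0.70                    # baseline survival at the chosen timepoint (assumed)
    x_new = np.array([1.2, 0.4])   # new patient's (centered) covariates

    S_t = S0_t ** np.exp(x_new @ beta)   # S(t | x) = S0(t) ** exp(x . beta)
    print(f"estimated survival probability at t: {S_t:.3f}")
    ```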

  6. Operational Earthquake Forecasting and Decision-Making in a Low-Probability Environment

    Science.gov (United States)

    Jordan, T. H.; the International Commission on Earthquake ForecastingCivil Protection

    2011-12-01

    Operational earthquake forecasting (OEF) is the dissemination of authoritative information about the time dependence of seismic hazards to help communities prepare for potentially destructive earthquakes. Most previous work on the public utility of OEF has anticipated that forecasts would deliver high probabilities of large earthquakes; i.e., deterministic predictions with low error rates (false alarms and failures-to-predict) would be possible. This expectation has not been realized. An alternative to deterministic prediction is probabilistic forecasting based on empirical statistical models of aftershock triggering and seismic clustering. During periods of high seismic activity, short-term earthquake forecasts can attain prospective probability gains in excess of 100 relative to long-term forecasts. The utility of such information is by no means clear, however, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing OEF in this sort of "low-probability environment." The need to move more quickly has been underscored by recent seismic crises, such as the 2009 L'Aquila earthquake sequence, in which an anxious public was confused by informal and inaccurate earthquake predictions. After the L'Aquila earthquake, the Italian Department of Civil Protection appointed an International Commission on Earthquake Forecasting (ICEF), which I chaired, to recommend guidelines for OEF utilization. Our report (Ann. Geophys., 54, 4, 2011; doi: 10.4401/ag-5350) concludes: (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and timely, and need to convey epistemic uncertainties. (b) Earthquake probabilities should be based on operationally qualified, regularly updated forecasting systems. (c) All operational models should be evaluated

  7. Open tube guideway for high speed air cushioned vehicles

    Science.gov (United States)

    Goering, R. S. (Inventor)

    1974-01-01

    This invention is a tubular shaped guideway for high-speed air-cushioned supported vehicles. The tubular guideway is split and separated such that the sides of the guideway are open. The upper portion of the tubular guideway is supported above the lower portion by truss-like structural members. The lower portion of the tubular guideway may be supported by the terrain over which the vehicle travels, on pedestals or some similar structure.

  8. Canopy cover negatively affects arboreal ant species richness in a tropical open habitat

    Directory of Open Access Journals (Sweden)

    A. C. M. Queiroz

    Full Text Available Abstract We tested the hypothesis of a negative relationship between vegetation characteristics and ant species richness in a Brazilian open vegetation habitat called candeial. We set up arboreal pitfalls to sample arboreal ants and measured the following environmental variables, which were used as surrogates of environmental heterogeneity: tree richness, tree density, tree height, circumference at the base of the plants, and canopy cover. Only canopy cover had a negative effect on arboreal ant species richness. Vegetation characteristics and plant species composition are probably homogeneous in candeial, which explains the lack of relationship between the other environmental variables and ant richness. Open vegetation habitats harbor a large number of opportunistic and generalist species, besides specialist ants from habitats with high temperatures. An increase in canopy cover decreases sunlight incidence and may cause local microclimatic differences, which negatively affect the species richness of specialist ants from open areas. Canopy cover regulates the richness of arboreal ants in open areas, since only a few ant species are able to colonize sites with dense vegetation; most species are present in sites with high temperature and luminosity. Within open vegetation habitats the relationship between vegetation characteristics and species richness seems to be the opposite of that in closed vegetation areas, such as forests.

  9. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  10. Does human body odor represent a significant and rewarding social signal to individuals high in social openness?

    Science.gov (United States)

    Lübke, Katrin T; Croy, Ilona; Hoenen, Matthias; Gerber, Johannes; Pause, Bettina M; Hummel, Thomas

    2014-01-01

    Across a wide variety of domains, experts differ from novices in their response to stimuli linked to their respective field of expertise. It is currently unknown whether similar patterns can be observed with regard to social expertise. The current study therefore focuses on social openness, a central social skill necessary to initiate social contact. Human body odors were used as social cues, as they inherently signal the presence of another human being. Using functional MRI, hemodynamic brain responses to body odors of women reporting a high (n = 14) or a low (n = 12) level of social openness were compared. Greater activation within the inferior frontal gyrus and the caudate nucleus was observed in highly socially open individuals compared to individuals low in social openness. With the inferior frontal gyrus being a crucial part of the human mirror neuron system, and the caudate nucleus being implicated in social reward, it is discussed whether human body odor might constitute a more significant and rewarding social signal to individuals high in social openness than to individuals low in social openness.

  11. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.

  12. Determining probabilities of geologic events and processes

    International Nuclear Information System (INIS)

    Hunter, R.L.; Mann, C.J.; Cranwell, R.M.

    1985-01-01

    The Environmental Protection Agency has recently published a probabilistic standard for releases of high-level radioactive waste from a mined geologic repository. The standard sets limits for contaminant releases with more than one chance in 100 of occurring within 10,000 years, and less strict limits for releases of lower probability. The standard offers no methods for determining probabilities of geologic events and processes, and no consensus exists in the waste-management community on how to do this. Sandia National Laboratories is developing a general method for determining probabilities of a given set of geologic events and processes. In addition, we will develop a repeatable method for dealing with events and processes whose probability cannot be determined. 22 refs., 4 figs

  13. Trending in Probability of Collision Measurements

    Science.gov (United States)

    Vallejo, J. J.; Hejduk, M. D.; Stamey, J. D.

    2015-01-01

    A simple model is proposed to predict the behavior of probabilities of collision (P_c) for conjunction events. The model attempts to predict the location and magnitude of the peak P_c value for an event by assuming the progression of P_c values can be modeled to first order by a downward-opening parabola. To incorporate prior information from a large database of past conjunctions, the Bayes paradigm is utilized; and the operating characteristics of the model are established through a large simulation study. Though the model is simple, it performs well in predicting the temporal location of the peak P_c and thus shows promise as a decision aid in operational conjunction assessment risk analysis.
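
    A sketch of the record's first-order model follows: fit a downward-opening parabola to past P_c values (here in log10, a choice made only for this illustration) and read off the predicted peak. The series is hypothetical and the Bayesian prior-updating step is omitted.

    ```python
    # Fit a parabola to reported Pc values and predict the peak.
    import numpy as np

    days_to_tca = np.array([-5.0, -4.0, -3.0, -2.5, -2.0])   # report epochs
    log_pc = np.log10([1e-7, 5e-6, 4e-5, 9e-5, 1.2e-4])      # hypothetical Pc series

    a, b, c = np.polyfit(days_to_tca, log_pc, 2)   # a < 0: downward-opening
    t_peak = -b / (2.0 * a)                        # vertex of the parabola
    pc_peak = 10.0 ** np.polyval([a, b, c], t_peak)
    print(f"predicted peak Pc ~ {pc_peak:.2e} at {t_peak:.2f} days to TCA")
    ```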

  14. Serial follow up V/P scanning in assessment of treatment response in high probability scans for pulmonary embolism

    Energy Technology Data Exchange (ETDEWEB)

    Moustafa, H; Elhaddad, SH; Wagih, SH; Ziada, G; Samy, A; Saber, R [Department of nuclear medicine and radiology, faculty of medicine, Cairo university, Cairo (Egypt)]

    1995-10-01

    138 patients were shown by V/P scan to have different probabilities of a pulmonary embolic event. Serial follow-up scanning after 3 days, 2 weeks, 1 month and 3 months was done under anticoagulant therapy. Of the remaining 10 patients, 6 died with pulmonary embolism documented by post-mortem study, and follow-up was lost in 4 patients. Complete response, with disappearance of all perfusion defects after 2 weeks, was detected in 37 patients (49.3%); partial improvement of lesions after 3 months was elicited in 32%. The overall incidence of response was 81.3%; the response was complete in the low probability group (100%), 84.2% in the intermediate group and 79.3% in the high probability group, with partial response in 45.3%. New lesions were evident in 18.7% of this series. We conclude that serial follow-up V/P scanning is mandatory for evaluation of the response to anticoagulant therapy, especially in the first 3 months. 2 figs., 3 tabs.

  15. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    Full Text Available The article deals with the probability analysis for a vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Assuming a Gaussian distribution, the author estimates the probability that the relative displacement of the isolated mass remains below the vibration criterion. The problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. From this probability distribution, the chance of exceeding the vibration criterion for a vibration isolation system is evaluated. Optimal system parameters - damping and natural frequency - are derived so that the probability of exceeding vibration criteria VC-E and VC-D is less than 0.04.
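
    A sketch of the underlying Gaussian exceedance calculation, with hypothetical numbers (zero-mean displacement with 0.25 um rms against a symmetric 1 um criterion):

    ```python
    # Two-sided probability that a Gaussian displacement exceeds a criterion.
    from scipy.stats import norm

    sigma = 0.25      # rms relative displacement of the isolated mass, um (assumed)
    limit = 1.0       # vibration criterion, um (assumed)

    p_exceed = 2.0 * norm.sf(limit / sigma)
    print(f"P(|displacement| > {limit} um) = {p_exceed:.2e}")
    ```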

  16. Measurements of atomic transition probabilities in highly ionized atoms by fast ion beams

    International Nuclear Information System (INIS)

    Martinson, I.; Curtis, L.J.; Lindgaerd, A.

    1977-01-01

    A summary is given of the beam-foil method by which level lifetimes and transition probabilities can be determined in atoms and ions. Results are presented for systems of particular interest for fusion research, such as the Li, Be, Na, Mg, Cu and Zn isoelectronic sequences. The available experimental material is compared to theoretical transition probabilities. (author)

  17. OpenMSI: A High-Performance Web-Based Platform for Mass Spectrometry Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Rubel, Oliver; Greiner, Annette; Cholia, Shreyas; Louie, Katherine; Bethel, E. Wes; Northen, Trent R.; Bowen, Benjamin P.

    2013-10-02

    Mass spectrometry imaging (MSI) enables researchers to probe endogenous molecules directly within the architecture of the biological matrix. Unfortunately, efficient access, management, and analysis of the data generated by MSI approaches remain major challenges to this rapidly developing field. Despite the availability of numerous dedicated file formats and software packages, it is a widely held viewpoint that the biggest challenge is simply opening, sharing, and analyzing a file without loss of information. Here we present OpenMSI, a software framework and platform that addresses these challenges via an advanced, high-performance, extensible file format and Web API for remote data access (http://openmsi.nersc.gov). The OpenMSI file format supports storage of raw MSI data, metadata, and derived analyses in a single, self-describing format based on HDF5 and is supported by a large range of analysis software (e.g., Matlab and R) and programming languages (e.g., C++, Fortran, and Python). Careful optimization of the storage layout of MSI data sets using chunking, compression, and data replication accelerates common, selective data access operations while minimizing data storage requirements; these optimizations are critical enablers of rapid data I/O. The OpenMSI file format has been shown to provide a >2000-fold improvement for image access operations, enabling spectrum and image retrieval in less than 0.3 s across the Internet even for 50 GB MSI data sets. To make remote high-performance compute resources accessible for analysis and to facilitate data sharing and collaboration, we describe an easy-to-use yet powerful Web API, enabling fast and convenient access to MSI data, metadata, and derived analysis results stored remotely to facilitate high-performance data analysis and enable implementation of Web based data sharing, visualization, and analysis.

  18. Oil spill contamination probability in the southeastern Levantine basin.

    Science.gov (United States)

    Goldman, Ron; Biton, Eli; Brokovich, Eran; Kark, Salit; Levin, Noam

    2015-02-15

    Recent gas discoveries in the eastern Mediterranean Sea led to multiple operations with substantial economic interest, and with them there is a risk of oil spills and their potential environmental impacts. To examine the potential spatial distribution of this threat, we created seasonal maps of the probability of oil spill pollution reaching an area in the Israeli coastal and exclusive economic zones, given knowledge of its initial sources. We performed simulations of virtual oil spills using realistic atmospheric and oceanic conditions. The resulting maps show dominance of the alongshore northerly current, which causes the high probability areas to be stretched parallel to the coast, increasing contamination probability downstream of source points. The seasonal westerly wind forcing determines how wide the high probability areas are, and may also restrict these to a small coastal region near source points. Seasonal variability in probability distribution, oil state, and pollution time is also discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. High voltage, high power operation of the plasma erosion opening switch

    International Nuclear Information System (INIS)

    Neri, J.M.; Boller, J.R.; Ottinger, P.F.; Weber, B.V.; Young, F.C.

    1987-01-01

    A Plasma Erosion Opening Switch (PEOS) is used as the opening switch for a vacuum inductive storage system driven by a 1.8-MV, 1.6-TW pulsed power generator. A 135-nH vacuum inductor is current charged to ∼750 kA in 50 ns through the closed PEOS which then opens in <10 ns into an inverse ion diode load. Electrical diagnostics and nuclear activations from ions accelerated in the diode yield a peak load voltage (4.25 MV) and peak load power (2.8 TW) that are 2.4 and 1.8 times greater than ideal matched load values for the same generator pulse
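
    A back-of-envelope check of the inductive-storage figures quoted in the record:

    ```python
    # Energy stored in the vacuum inductor and the load current implied by
    # the quoted peak voltage and power (simple consistency arithmetic).
    L = 135e-9            # vacuum inductor, H
    I = 750e3             # charge current, A
    E = 0.5 * L * I**2
    print(f"stored magnetic energy: {E/1e3:.0f} kJ")              # ~38 kJ
    print(f"implied peak load current: {2.8e12/4.25e6/1e3:.0f} kA")  # ~660 kA
    ```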

  20. Foundations of probability

    International Nuclear Information System (INIS)

    Fraassen, B.C. van

    1979-01-01

    The interpretation of probabilities in physical theories is considered, whether quantum or classical. The following points are discussed: 1) the functions P(μ, Q), in terms of which states and propositions can be represented, are classical (Kolmogoroff) probabilities, formally speaking; 2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and 3) testing of the theory typically takes the form of confronting the expectation values of observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)

  1. Does human body odor represent a significant and rewarding social signal to individuals high in social openness?

    Directory of Open Access Journals (Sweden)

    Katrin T Lübke

    Full Text Available Across a wide variety of domains, experts differ from novices in their response to stimuli linked to their respective field of expertise. It is currently unknown whether similar patterns can be observed with regard to social expertise. The current study therefore focuses on social openness, a central social skill necessary to initiate social contact. Human body odors were used as social cues, as they inherently signal the presence of another human being. Using functional MRI, hemodynamic brain responses to body odors of women reporting a high (n = 14) or a low (n = 12) level of social openness were compared. Greater activation within the inferior frontal gyrus and the caudate nucleus was observed in highly socially open individuals compared to individuals low in social openness. With the inferior frontal gyrus being a crucial part of the human mirror neuron system, and the caudate nucleus being implicated in social reward, it is discussed whether human body odor might constitute a more significant and rewarding social signal to individuals high in social openness than to individuals low in social openness.

  2. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned probability zero.

  3. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introduction.

  4. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  5. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary Background Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
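
    A minimal sketch of the paper's point, that a nonparametric regression machine applied to a 0/1 response yields individual probability estimates, follows. It uses scikit-learn's regression forest on synthetic data (the paper's own examples rely on R packages), and all parameters below are illustrative choices.

    ```python
    # A regression forest fit to a binary response acts as a probability machine.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(1)
    X = rng.normal(size=(2000, 3))
    true_p = 1.0 / (1.0 + np.exp(-X[:, 0] + 0.5 * X[:, 1]))   # known truth
    y = rng.binomial(1, true_p)                                # binary response

    rf = RandomForestRegressor(n_estimators=500, min_samples_leaf=25, random_state=0)
    rf.fit(X, y)
    p_hat = rf.predict(X[:5])      # individual probability estimates, not labels
    print(np.round(p_hat, 2), np.round(true_p[:5], 2))
    ```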

  6. A narrow open tubular column for high efficiency liquid chromatographic separation.

    Science.gov (United States)

    Chen, Huang; Yang, Yu; Qiao, Zhenzhen; Xiang, Piliang; Ren, Jiangtao; Meng, Yunzhu; Zhang, Kaiqi; Juan Lu, Joann; Liu, Shaorong

    2018-04-30

    We report a great feature of open tubular liquid chromatography when it is run using an extremely narrow (e.g., 2 μm inner diameter) open tubular column: more than 10 million plates per meter can be achieved in less than 10 min and under an elution pressure of ca. 20 bar. The column is coated with octadecylsilane and both isocratic and gradient separations are performed. We reveal a focusing effect that may be used to interpret the efficiency enhancement. We also demonstrate the feasibility of using this technique for separating complex peptide samples. This high-resolution and fast separation technique is promising and can lead to a powerful tool for trace sample analysis.

  7. The opening of a high care hostel for problem drinkers.

    Science.gov (United States)

    Bretherton, H

    1992-12-01

    This paper gives a personal and practice-based account by one of the Team Leaders of the opening of a high-care hostel for problem drinkers in North London. The hostel, Rugby House, was set up to provide detoxification and assessment facilities for thirteen residents. It was part of the Rugby House Project, an alcohol agency in the voluntary sector. The paper explores the processes involved in setting up a new project; how the new paid employees turn a committee's vision into practice; how a group of individuals become a team; the importance of clarity about boundaries and underlying values and assumptions; the need for openness about negative as well as positive feelings; and the recognition that some of the experiences of staff will resonate with those of the residents for whom giving up drinking is a major life change.

  8. Hierarchical Decompositions for the Computation of High-Dimensional Multivariate Normal Probabilities

    KAUST Repository

    Genton, Marc G.

    2017-09-07

    We present a hierarchical decomposition scheme for computing the n-dimensional integral of multivariate normal probabilities that appear frequently in statistics. The scheme exploits the fact that the formally dense covariance matrix can be approximated by a matrix with a hierarchical low rank structure. It allows the reduction of the computational complexity per Monte Carlo sample from O(n²) to O(mn + kn log(n/m)), where k is the numerical rank of off-diagonal matrix blocks and m is the size of small diagonal blocks in the matrix that are not well-approximated by low rank factorizations and treated as dense submatrices. This hierarchical decomposition leads to substantial efficiencies in multivariate normal probability computations and allows integrations in thousands of dimensions to be practical on modern workstations.
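
    For orientation, a plain Monte Carlo baseline for the integral being accelerated follows; the hierarchical low-rank scheme itself is not reproduced, and the covariance matrix and integration box below are hypothetical.

    ```python
    # Brute-force Monte Carlo estimate of P(a <= X <= b) for X ~ N(0, Sigma).
    import numpy as np

    rng = np.random.default_rng(0)
    n = 50                                      # dimension (the paper targets thousands)
    Sigma = 0.3 * np.ones((n, n)) + 0.7 * np.eye(n)   # equicorrelated, pos. definite
    a, b = -np.inf * np.ones(n), np.ones(n)     # upper-orthant-style box

    samples = rng.multivariate_normal(np.zeros(n), Sigma, size=100_000)
    inside = np.all((samples >= a) & (samples <= b), axis=1)
    p, se = inside.mean(), inside.std(ddof=1) / np.sqrt(len(inside))
    print(f"P ~ {p:.4f} +/- {se:.4f}")
    ```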

  9. Hierarchical Decompositions for the Computation of High-Dimensional Multivariate Normal Probabilities

    KAUST Repository

    Genton, Marc G.; Keyes, David E.; Turkiyyah, George

    2017-01-01

    We present a hierarchical decomposition scheme for computing the n-dimensional integral of multivariate normal probabilities that appear frequently in statistics. The scheme exploits the fact that the formally dense covariance matrix can be approximated by a matrix with a hierarchical low rank structure. It allows the reduction of the computational complexity per Monte Carlo sample from O(n²) to O(mn + kn log(n/m)), where k is the numerical rank of off-diagonal matrix blocks and m is the size of small diagonal blocks in the matrix that are not well-approximated by low rank factorizations and treated as dense submatrices. This hierarchical decomposition leads to substantial efficiencies in multivariate normal probability computations and allows integrations in thousands of dimensions to be practical on modern workstations.

  10. The effect of lower anterior high pull headgear on treatment of moderate open bite in adults

    Directory of Open Access Journals (Sweden)

    Rahman Showkatbakhsh

    2012-01-01

    Full Text Available Background and Aims: Various methods are used for the treatment of open bite. The objective of this study was to investigate the effects of the Lower Anterior High Pull Headgear (LAHPH) appliance in Class I subjects with moderate open bite and a high lower lip line. Materials and Methods: The study group was composed of 10 subjects with a mean age of 15.8±2.5 years and a moderate open bite of 3.05±0.07 mm. All the patients rejected orthognathic surgery. The treatment included extraction of the upper and lower second premolars followed by leveling, banding, bonding, posterior space closure, and anterior retraction. After these procedures, the open bite was reduced to 2.04±1.17 mm. Afterwards, LAHPH was applied for 18 hours per day for 8±2 months. The LAHPH appliance was composed of a High Pull Headgear and two hooks mounted on its inner bow. Two elastics (1.8, light, Dentaurum) connected the upper hooks on the inner bow to the lower hooks on the mandibular canines vertically. The forces produced by the prescribed elastics were 10 and 60 g during mouth closing and opening, respectively. A paired t-test was used to evaluate pre- and post-treatment outcomes. Results: The pre- and post-treatment cephalometric evaluations showed that the LAHPH effectively reduced the open bite of the patients to 0.15±1.7 mm (P<0.001). Conclusion: This appliance can be used as an acceptable method for closing the open bite in Class I subjects.

  11. OPEN AIR DEMOLITION OF FACILITIES HIGHLY CONTAMINATED WITH PLUTONIUM

    International Nuclear Information System (INIS)

    LLOYD, E.R.

    2007-01-01

    The demolition of highly contaminated plutonium buildings usually is a long and expensive process that involves decontaminating the building to near free-release standards and then using conventional methods to remove the structure. It doesn't, however, have to be that way. Fluor has torn down buildings highly contaminated with plutonium without excessive decontamination. By removing the select source term and fixing the remaining contamination on the walls, ceilings, floors, and equipment surfaces, open-air demolition is not only feasible, but it can be done cheaper, better (safer), and faster. Open-air demolition techniques were used to demolish two highly contaminated buildings to slab-on-grade. These facilities on the Department of Energy's Hanford Site were located in, or very near, compounds of operating nuclear facilities that housed hundreds of people working on a daily basis. To keep the facilities operating and the personnel safe, the projects had to be creative in demolishing the structures. Several key techniques were used to control contamination and keep it within the confines of the demolition area: spraying fixatives before demolition; applying fixative and misting with a fine spray of water as the buildings were being taken down; and demolishing the buildings in a controlled and methodical manner. In addition, detailed air-dispersion modeling was done to establish necessary building and meteorological conditions and to confirm the adequacy of the proposed methods. Both demolition projects were accomplished without any spread of contamination outside the modest buffer areas established for contamination control. Furthermore, personnel exposure to radiological and physical hazards was significantly reduced by using heavy equipment rather than "hands on" techniques.

  12. Ammonia losses and nitrogen partitioning at a southern High Plains open lot dairy

    Science.gov (United States)

    Todd, Richard W.; Cole, N. Andy; Hagevoort, G. Robert; Casey, Kenneth D.; Auvermann, Brent W.

    2015-06-01

    Animal agriculture is a significant source of ammonia (NH3). Cattle excrete most ingested nitrogen (N); most urinary N is converted to NH3, volatilized and lost to the atmosphere. Open lot dairies on the southern High Plains are a growing industry and face environmental challenges as well as reporting requirements for NH3 emissions. We quantified NH3 emissions from the open lot and wastewater lagoons of a commercial New Mexico dairy during a nine-day summer campaign. The 3500-cow dairy consisted of open lot, manure-surfaced corrals (22.5 ha area). Lactating cows comprised 80% of the herd. A flush system using recycled wastewater intermittently removed manure from feeding alleys to three lagoons (1.8 ha area). Open path lasers measured atmospheric NH3 concentration, sonic anemometers characterized turbulence, and inverse dispersion analysis was used to quantify emissions. Ammonia fluxes (15-min) averaged 56 and 37 μg m⁻² s⁻¹ at the open lot and lagoons, respectively. Ammonia emission rate averaged 1061 kg d⁻¹ at the open lot and 59 kg d⁻¹ at the lagoons; 95% of NH3 was emitted from the open lot. The per capita emission rate of NH3 was 304 g cow⁻¹ d⁻¹ from the open lot (41% of N intake) and 17 g cow⁻¹ d⁻¹ from lagoons (2% of N intake). Daily N input at the dairy was 2139 kg d⁻¹, with 43, 36, 19 and 2% of the N partitioned to NH3 emission, manure/lagoons, milk, and cows, respectively.
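
    An arithmetic check of the partitioning reported in the record:

    ```python
    # Consistency arithmetic for the record's emission and N-partitioning figures.
    herd = 3500                      # cows
    open_lot_nh3 = 1061.0            # kg NH3 per day
    lagoon_nh3 = 59.0                # kg NH3 per day

    per_capita = open_lot_nh3 * 1e3 / herd          # g per cow per day
    print(f"open-lot per-capita NH3: {per_capita:.0f} g/cow/d")  # ~303, vs. reported 304

    n_in_nh3 = (open_lot_nh3 + lagoon_nh3) * (14.0 / 17.0)       # N mass within NH3
    print(f"share of 2139 kg/d N intake lost as NH3-N: {n_in_nh3 / 2139:.0%}")  # ~43%
    ```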

  13. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  14. High-resolution spectroscopic observations of binary stars and yellow stragglers in three open clusters: NGC 2360, NGC 3680, and NGC 5822

    Energy Technology Data Exchange (ETDEWEB)

    Sales Silva, J. V.; Peña Suárez, V. J.; Katime Santrich, O. J.; Pereira, C. B.; Drake, N. A.; Roig, F., E-mail: joaovictor@on.br, E-mail: jearim@on.br, E-mail: osantrich@on.br, E-mail: claudio@on.br, E-mail: drake@on.br, E-mail: froig@on.br [Observatório Nacional/MCT, Rua Gen. José Cristino, 77, 20921-400 Rio de Janeiro (Brazil)

    2014-11-01

    Binary stars in open clusters are very useful targets in constraining the nucleosynthesis process. The luminosities of the stars are known because the distances of the clusters are also known, so chemical peculiarities can be linked directly to the evolutionary status of a star. In addition, binary stars offer the opportunity to verify a relationship between them and the straggler population in both globular and open clusters. We carried out a detailed spectroscopic analysis to derive the atmospheric parameters for 16 red giants in binary systems and the chemical composition of 11 of them in the open clusters NGC 2360, NGC 3680, and NGC 5822. We obtained abundances of C, N, O, Na, Mg, Al, Ca, Si, Ti, Ni, Cr, Y, Zr, La, Ce, and Nd. The atmospheric parameters of the studied stars and their chemical abundances were determined using high-resolution optical spectroscopy. We employ the local thermodynamic equilibrium model atmospheres of Kurucz and the spectral analysis code MOOG. The abundances of the light elements were derived using the spectral synthesis technique. We found that the stars NGC 2360-92 and 96, NGC 3680-34, and NGC 5822-4 and 312 are yellow straggler stars. We show that the spectra of NGC 5822-4 and 312 present evidence of contamination by an A-type star as a secondary star. For the other yellow stragglers, evidence of contamination is given by the broad wings of the Hα. Detection of yellow straggler stars is important because the observed number can be compared with the number predicted by simulations of binary stellar evolution in open clusters. We also found that the other binary stars are not s-process enriched, which may suggest that in these binaries the secondary star is probably a faint main-sequence object. The lack of any s-process enrichment is very useful in setting constraints for the number of white dwarfs in the open cluster, a subject that is related to the birthrate of these kinds of stars in open clusters and also to the age of a

  15. Open access for REF2020

    Directory of Open Access Journals (Sweden)

    Simon Kerridge

    2014-03-01

    Full Text Available Open access (OA) may have been the 'big thing' in 2013, but the OA juggernaut is still rolling and plans are now afoot for the requirements for the 'next REF' (which from now on we will refer to as REF2020). In 2013, on behalf of the four UK Funding Councils, the Higher Education Funding Council for England (HEFCE) undertook a two-stage consultation exercise on open access requirements for articles submitted to REF2020. There are a number of nuances and caveats to the current proposals. This article will reflect on what the probable rules might be, and their implications for research managers, administrators and institutional repository managers alike.

  16. Dissipative tunneling through a potential barrier in the Lindblad theory of open quantum systems

    International Nuclear Information System (INIS)

    Isar, A.

    2000-01-01

    In the Lindblad theory for open quantum systems, an analytical expression for the tunneling probability through an inverted parabola is obtained. This probability depends on the environment coefficients and increases with the dissipation and the temperature of the thermal bath. (author)
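
    For orientation only, the textbook non-dissipative transmission probability through an inverted parabolic barrier (the Kemble formula) can be evaluated as below; the Lindblad corrections from the environment coefficients and bath temperature discussed in the record are not included.

    ```python
    # Kemble formula for a parabolic barrier (no dissipation):
    # T(E) = 1 / (1 + exp(-2*pi*(E - V0) / (hbar*omega))).
    import numpy as np

    def parabolic_barrier_T(E, V0=1.0, hbar_omega=0.1):   # units are illustrative
        return 1.0 / (1.0 + np.exp(-2.0 * np.pi * (E - V0) / hbar_omega))

    for E in (0.9, 1.0, 1.1):
        print(f"E = {E:.1f}: T = {parabolic_barrier_T(E):.3f}")   # T(V0) = 0.5
    ```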

  17. Assessment of local variability by high-throughput e-beam metrology for prediction of patterning defect probabilities

    Science.gov (United States)

    Wang, Fuming; Hunsche, Stefan; Anunciado, Roy; Corradi, Antonio; Tien, Hung Yu; Tang, Peng; Wei, Junwei; Wang, Yongjun; Fang, Wei; Wong, Patrick; van Oosten, Anton; van Ingen Schenau, Koen; Slachter, Bram

    2018-03-01

    We present an experimental study of pattern variability and defectivity, based on a large data set with more than 112 million SEM measurements from an HMI high-throughput e-beam tool. The test case is a 10nm node SRAM via array patterned with a DUV immersion LELE process, where we see a variation in mean size and litho sensitivities between different unique via patterns that leads to seemingly qualitative differences in defectivity. The large available data volume enables further analysis to reliably distinguish global and local CDU variations, including a breakdown into local systematics and stochastics. A closer inspection of the tail end of the distributions and estimation of defect probabilities concludes that there is a common defect mechanism and defect threshold despite the observed differences of specific pattern characteristics. We expect that the analysis methodology can be applied for defect probability modeling as well as general process qualification in the future.
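
    A hedged sketch of turning a measured CD distribution into a defect probability, by fitting a Gaussian and taking the tail mass beyond a failure threshold, follows. All numbers are hypothetical, and the record's separation of global and local systematics precedes this step.

    ```python
    # Defect probability as the tail of a fitted CD distribution.
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(2)
    cd = rng.normal(20.0, 1.5, 100_000)   # via CDs, nm (stand-in for SEM data)
    threshold = 13.0                      # below this the via fails to open (assumed)

    mu, sd = cd.mean(), cd.std(ddof=1)
    p_defect = norm.cdf(threshold, mu, sd)
    print(f"estimated defect probability: {p_defect:.2e}")
    ```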

  18. Fingerprints of exceptional points in the survival probability of resonances in atomic spectra

    International Nuclear Information System (INIS)

    Cartarius, Holger; Moiseyev, Nimrod

    2011-01-01

    The unique time signature of the survival probability exactly at the exceptional point parameters is studied here for the hydrogen atom in strong static magnetic and electric fields. We show that indeed the survival probability S(t) = |⟨Ψ(0)|Ψ(t)⟩|² decays exactly as |1 − at|²e^(−Γ_EP t/ℏ), where Γ_EP is associated with the decay rate at the exceptional point and a is a complex constant depending solely on the initial wave packet that populates exclusively the two almost degenerate states of the non-Hermitian Hamiltonian. This may open the possibility for a first experimental detection of exceptional points in a quantum system.
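
    A direct evaluation of the quoted decay law with hypothetical a and Γ_EP, in units where ℏ = 1:

    ```python
    # Evaluate S(t) = |1 - a*t|^2 * exp(-Gamma_EP * t) for illustrative parameters.
    import numpy as np

    a = 0.4 + 0.3j          # complex constant set by the initial wave packet (assumed)
    gamma_ep = 1.0          # decay rate at the exceptional point (assumed)

    t = np.linspace(0.0, 5.0, 6)
    S = np.abs(1.0 - a * t) ** 2 * np.exp(-gamma_ep * t)
    print(np.round(S, 4))   # S(0) = 1, then non-exponential modulation of the decay
    ```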

  19. Fingerprints of exceptional points in the survival probability of resonances in atomic spectra

    Science.gov (United States)

    Cartarius, Holger; Moiseyev, Nimrod

    2011-07-01

    The unique time signature of the survival probability exactly at the exceptional point parameters is studied here for the hydrogen atom in strong static magnetic and electric fields. We show that indeed the survival probability S(t) = |⟨Ψ(0)|Ψ(t)⟩|² decays exactly as |1 − at|² e^(−Γ_EP t/ℏ), where Γ_EP is associated with the decay rate at the exceptional point and a is a complex constant depending solely on the initial wave packet that populates exclusively the two almost degenerate states of the non-Hermitian Hamiltonian. This may open the possibility for a first experimental detection of exceptional points in a quantum system.
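    The decay law above is easy to evaluate numerically; a minimal sketch with placeholder parameter values, not taken from the paper:

    ```python
    import numpy as np

    # Survival probability at an exceptional point, S(t) = |1 - a*t|^2 * exp(-Gamma_EP * t / hbar),
    # following the abstract above; 'a' and Gamma_EP here are hypothetical values.
    hbar = 1.0                 # natural units
    a = 0.3 + 0.1j             # complex constant fixed by the initial wave packet
    gamma_ep = 0.5             # decay rate associated with the exceptional point

    t = np.linspace(0.0, 20.0, 201)
    S = np.abs(1.0 - a * t) ** 2 * np.exp(-gamma_ep * t / hbar)
    # The polynomial prefactor |1 - a*t|^2 makes the decay non-exponential at
    # early times -- the fingerprint distinguishing an exceptional point from
    # an ordinary resonance with purely exponential decay.
    ```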

  20. OpenTopography: Enabling Online Access to High-Resolution Lidar Topography Data and Processing Tools

    Science.gov (United States)

    Crosby, Christopher; Nandigam, Viswanath; Baru, Chaitan; Arrowsmith, J. Ramon

    2013-04-01

    High-resolution topography data acquired with lidar (light detection and ranging) technology are revolutionizing the way we study the Earth's surface and overlying vegetation. These data, collected from airborne, tripod, or mobile-mounted scanners, have emerged as a fundamental tool for research on topics ranging from earthquake hazards to hillslope processes. Lidar data provide a digital representation of the Earth's surface at a resolution sufficient to appropriately capture the processes that contribute to landscape evolution. The U.S. National Science Foundation-funded OpenTopography Facility (http://www.opentopography.org) is a web-based system designed to democratize access to earth science-oriented lidar topography data. OpenTopography provides free, online access to lidar data in a number of forms, including the raw point cloud and associated geospatial-processing tools for customized analysis. The point cloud data are co-located with on-demand processing tools that generate digital elevation models, derived products, and visualizations, allowing users to quickly access data in a format appropriate for their scientific application. The OpenTopography system is built using a service-oriented architecture (SOA) that leverages cyberinfrastructure resources at the San Diego Supercomputer Center at the University of California San Diego to allow users, regardless of expertise level, to access these massive lidar datasets and derived products for use in research and teaching. OpenTopography hosts over 500 billion lidar returns covering 85,000 km2. These data are all in the public domain and are provided by a variety of partners under joint agreements and memoranda of understanding with OpenTopography. Partners include national facilities such as the NSF-funded National Center for Airborne Lidar Mapping (NCALM), as well as non-governmental organizations and local, state, and federal agencies. OpenTopography has become a hub for high-resolution topography

  1. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber–Shiu functions and dependence.
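    A minimal Monte Carlo sketch of the classical model treated in the book, the finite-horizon ruin probability in the compound Poisson (Cramér-Lundberg) model; all parameters are invented for illustration:

    ```python
    import random

    def ruin_probability(u, c, lam, mean_claim, horizon, n_paths=20000):
        """Finite-horizon ruin probability for R(t) = u + c*t - claims,
        with Poisson(lam) arrivals and exponential claim sizes (illustrative)."""
        ruins = 0
        for _ in range(n_paths):
            t, reserve = 0.0, u
            while True:
                dt = random.expovariate(lam)       # time to next claim
                t += dt
                if t > horizon:
                    break
                reserve += c * dt                  # premiums earned since last claim
                reserve -= random.expovariate(1.0 / mean_claim)  # claim payout
                if reserve < 0:
                    ruins += 1
                    break
        return ruins / n_paths

    # Initial reserve 10, premium rate 1.2, claim rate 1, mean claim 1
    # (20% safety loading) -- all hypothetical:
    print(ruin_probability(10.0, 1.2, 1.0, 1.0, horizon=100.0))
    ```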

  2. Does probability of occurrence relate to population dynamics?

    Science.gov (United States)

    Thuiller, Wilfried; Münkemüller, Tamara; Schiffers, Katja H; Georges, Damien; Dullinger, Stefan; Eckhart, Vincent M; Edwards, Thomas C; Gravel, Dominique; Kunstler, Georges; Merow, Cory; Moore, Kara; Piedallu, Christian; Vissault, Steve; Zimmermann, Niklaus E; Zurell, Damaris; Schurr, Frank M

    2014-12-01

    Hutchinson defined species' realized niche as the set of environmental conditions in which populations can persist in the presence of competitors. In terms of demography, the realized niche corresponds to the environments where the intrinsic growth rate (r) of populations is positive. Observed species occurrences should reflect the realized niche when additional processes like dispersal and local extinction lags do not have overwhelming effects. Despite the foundational nature of these ideas, quantitative assessments of the relationship between range-wide demographic performance and occurrence probability have not been made. This assessment is needed both to improve our conceptual understanding of species' niches and ranges and to develop reliable mechanistic models of species geographic distributions that incorporate demography and species interactions. The objective of this study is to analyse how demographic parameters (intrinsic growth rate r and carrying capacity K) and population density (N) relate to occurrence probability (P_occ). We hypothesized that these relationships vary with species' competitive ability. Demographic parameters, density, and occurrence probability were estimated for 108 tree species from four temperate forest inventory surveys (Québec, Western US, France and Switzerland). We used published information of shade tolerance as indicators of light competition strategy, assuming that high tolerance denotes high competitive capacity in stable forest environments. Interestingly, relationships between demographic parameters and occurrence probability did not vary substantially across degrees of shade tolerance and regions. Although they were influenced by the uncertainty in the estimation of the demographic parameters, we found that r was generally negatively correlated with P_occ, while N, and for most regions K, was generally positively correlated with P_occ. Thus, in temperate forest trees the regions of highest occurrence

  3. Policy on synthetic biology: deliberation, probability, and the precautionary paradox.

    Science.gov (United States)

    Wareham, Christopher; Nardini, Cecilia

    2015-02-01

    Synthetic biology is a cutting-edge area of research that holds the promise of unprecedented health benefits. However, in tandem with these large prospective benefits, synthetic biology projects entail a risk of catastrophic consequences whose severity may exceed that of most ordinary human undertakings. This is due to the peculiar nature of synthetic biology as a 'threshold technology' which opens doors to opportunities and applications that are essentially unpredictable. Fears about these potentially unstoppable consequences have led to declarations from civil society groups calling for the use of a precautionary principle to regulate the field. Moreover, the principle is prevalent in law and international agreements. Despite widespread political recognition of a need for caution, the precautionary principle has been extensively criticized as a guide for regulatory policy. We examine a central objection to the principle: that its application entails crippling inaction and incoherence, since whatever action one takes there is always a chance that some highly improbable cataclysm will occur. In response to this difficulty, which we call the 'precautionary paradox,' we outline a deliberative means for arriving at a threshold of probability below which potential dangers can be disregarded. In addition, we describe a Bayesian mechanism with which to assign probabilities to harmful outcomes. We argue that these steps resolve the paradox. The rehabilitated PP can thus provide a viable policy option to confront the uncharted waters of synthetic biology research. © 2013 John Wiley & Sons Ltd.

  4. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking as well as social and psychological phenomena.

  5. OpenMM 4: A Reusable, Extensible, Hardware Independent Library for High Performance Molecular Simulation.

    Science.gov (United States)

    Eastman, Peter; Friedrichs, Mark S; Chodera, John D; Radmer, Randall J; Bruns, Christopher M; Ku, Joy P; Beauchamp, Kyle A; Lane, Thomas J; Wang, Lee-Ping; Shukla, Diwakar; Tye, Tony; Houston, Mike; Stich, Timo; Klein, Christoph; Shirts, Michael R; Pande, Vijay S

    2013-01-08

    OpenMM is a software toolkit for performing molecular simulations on a range of high performance computing architectures. It is based on a layered architecture: the lower layers function as a reusable library that can be invoked by any application, while the upper layers form a complete environment for running molecular simulations. The library API hides all hardware-specific dependencies and optimizations from the users and developers of simulation programs: they can be run without modification on any hardware on which the API has been implemented. The current implementations of OpenMM include support for graphics processing units using the OpenCL and CUDA frameworks. In addition, OpenMM was designed to be extensible, so new hardware architectures can be accommodated and new functionality (e.g., energy terms and integrators) can be easily added.
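    A minimal usage sketch of the library's layered Python API; note it is written against the modern 'openmm' namespace, which differs from the namespace of the OpenMM 4 release described above, and 'input.pdb' is a placeholder file:

    ```python
    import openmm
    from openmm import app, unit

    pdb = app.PDBFile('input.pdb')                       # placeholder structure
    forcefield = app.ForceField('amber14-all.xml', 'amber14/tip3p.xml')
    system = forcefield.createSystem(pdb.topology,
                                     nonbondedMethod=app.PME,
                                     nonbondedCutoff=1.0 * unit.nanometer,
                                     constraints=app.HBonds)
    integrator = openmm.LangevinIntegrator(300 * unit.kelvin,
                                           1.0 / unit.picosecond,
                                           0.002 * unit.picoseconds)
    # Hardware independence in practice: the library transparently selects a
    # CUDA, OpenCL or CPU platform; the script needs no modification.
    simulation = app.Simulation(pdb.topology, system, integrator)
    simulation.context.setPositions(pdb.positions)
    simulation.minimizeEnergy()
    simulation.step(1000)    # 2 ps of Langevin dynamics
    ```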

  6. Probability shapes perceptual precision: A study in orientation estimation.

    Science.gov (United States)

    Jabar, Syaheed B; Anderson, Britt

    2015-12-01

    Probability is known to affect perceptual estimations, but an understanding of mechanisms is lacking. Moving beyond binary classification tasks, we had naive participants report the orientation of briefly viewed gratings where we systematically manipulated contingent probability. Participants rapidly developed faster and more precise estimations for high-probability tilts. The shapes of their error distributions, as indexed by a kurtosis measure, also showed a distortion from Gaussian. This kurtosis metric was robust, capturing probability effects that were graded, contextual, and varying as a function of stimulus orientation. Our data can be understood as a probability-induced reduction in the variability or "shape" of estimation errors, as would be expected if probability affects the perceptual representations. As probability manipulations are an implicit component of many endogenous cuing paradigms, changes at the perceptual level could account for changes in performance that might have traditionally been ascribed to "attention." (c) 2015 APA, all rights reserved.
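    The kurtosis index used above is straightforward to compute; a minimal sketch with simulated errors, not the study's data:

    ```python
    import numpy as np
    from scipy.stats import kurtosis

    rng = np.random.default_rng(0)
    errors_low_p = rng.normal(0.0, 8.0, size=2000)    # Gaussian-like errors
    errors_high_p = rng.laplace(0.0, 4.0, size=2000)  # sharper peak, heavier tails

    # Fisher (excess) kurtosis: 0 for a Gaussian, > 0 when probability
    # sharpens the error distribution away from a Gaussian shape.
    print(kurtosis(errors_low_p), kurtosis(errors_high_p))
    ```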

  7. Wavelet Entropy-Based Traction Inverter Open Switch Fault Diagnosis in High-Speed Railways

    Directory of Open Access Journals (Sweden)

    Keting Hu

    2016-03-01

    Full Text Available In this paper, a diagnosis plan is proposed to settle the detection and isolation problem of open switch faults in the traction inverters of high-speed railway traction systems. Five entropy forms are discussed and compared with the traditional fault detection methods, namely, discrete wavelet transform and discrete wavelet packet transform. The traditional fault detection methods cannot efficiently detect the open switch faults in traction inverters because of the low resolution or the sudden change of the current. The performances of Wavelet Packet Energy Shannon Entropy (WPESE), Wavelet Packet Energy Tsallis Entropy (WPETE) with different non-extensive parameters, Wavelet Packet Energy Shannon Entropy with a specific sub-band (WPESE3,6), Empirical Mode Decomposition Shannon Entropy (EMDESE), and Empirical Mode Decomposition Tsallis Entropy (EMDETE) with non-extensive parameters in detecting the open switch fault are evaluated by the evaluation parameter. Comparison experiments are carried out to select the best entropy form for the traction inverter open switch fault detection. In addition, the DC component is adopted to isolate the failed Insulated Gate Bipolar Transistor (IGBT). The simulation experiments show that the proposed plan can diagnose single and simultaneous open switch faults correctly and in a timely manner.
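    A sketch of a WPESE-style index of the kind described above; the wavelet choice, decomposition level and test signals are illustrative assumptions, not the paper's setup:

    ```python
    import numpy as np
    import pywt

    def wavelet_packet_shannon_entropy(signal, wavelet='db4', level=3):
        """Decompose the signal into wavelet-packet sub-bands, normalize the
        sub-band energies, and return the Shannon entropy of that energy
        distribution; an open-switch fault redistributes sub-band energy
        and therefore shifts this entropy."""
        wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
        energies = np.array([np.sum(np.asarray(node.data) ** 2)
                             for node in wp.get_level(level, order='natural')])
        p = energies / energies.sum()
        p = p[p > 0]                       # avoid log(0)
        return -np.sum(p * np.log2(p))

    # Healthy 50 Hz phase current vs. a crudely 'faulted' half-wave:
    t = np.linspace(0.0, 0.1, 1024)
    healthy = np.sin(2 * np.pi * 50 * t)
    faulty = np.clip(healthy, 0.0, None)
    print(wavelet_packet_shannon_entropy(healthy),
          wavelet_packet_shannon_entropy(faulty))
    ```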

  8. Trial type probability modulates the cost of antisaccades

    Science.gov (United States)

    Chiau, Hui-Yan; Tseng, Philip; Su, Jia-Han; Tzeng, Ovid J. L.; Hung, Daisy L.; Muggleton, Neil G.

    2011-01-01

    The antisaccade task, where eye movements are made away from a target, has been used to investigate the flexibility of cognitive control of behavior. Antisaccades usually have longer saccade latencies than prosaccades, the so-called antisaccade cost. Recent studies have shown that this antisaccade cost can be modulated by event probability. This may mean that the antisaccade cost can be reduced, or even reversed, if the probability of surrounding events favors the execution of antisaccades. The probabilities of prosaccades and antisaccades were systematically manipulated by changing the proportion of a certain type of trial in an interleaved pro/antisaccades task. We aimed to disentangle the intertwined relationship between trial type probabilities and the antisaccade cost with the ultimate goal of elucidating how probabilities of trial types modulate human flexible behaviors, as well as the characteristics of such modulation effects. To this end, we examined whether implicit trial type probability can influence saccade latencies and also manipulated the difficulty of cue discriminability to see how effects of trial type probability would change when the demand on visual perceptual analysis was high or low. A mixed-effects model was applied to the analysis to dissect the factors contributing to the modulation effects of trial type probabilities. Our results suggest that the trial type probability is one robust determinant of antisaccade cost. These findings highlight the importance of implicit probability in the flexibility of cognitive control of behavior. PMID:21543748

  9. Internal Medicine residents use heuristics to estimate disease probability.

    Science.gov (United States)

    Phang, Sen Han; Ravani, Pietro; Schaefer, Jeffrey; Wright, Bruce; McLaughlin, Kevin

    2015-01-01

    Training in Bayesian reasoning may have limited impact on accuracy of probability estimates. In this study, our goal was to explore whether residents previously exposed to Bayesian reasoning use heuristics rather than Bayesian reasoning to estimate disease probabilities. We predicted that if residents use heuristics then post-test probability estimates would be increased by non-discriminating clinical features or a high anchor for a target condition. We randomized 55 Internal Medicine residents to different versions of four clinical vignettes and asked them to estimate probabilities of target conditions. We manipulated the clinical data for each vignette to be consistent with either 1) using a representativeness heuristic, by adding non-discriminating prototypical clinical features of the target condition, or 2) using an anchoring-with-adjustment heuristic, by providing a high or low anchor for the target condition. When presented with additional non-discriminating data the odds of diagnosing the target condition were increased (odds ratio (OR) 2.83, 95% confidence interval [1.30, 6.15], p = 0.009). Similarly, the odds of diagnosing the target condition were increased when a high anchor preceded the vignette (OR 2.04, [1.09, 3.81], p = 0.025). Our findings suggest that despite previous exposure to the use of Bayesian reasoning, residents use heuristics, such as the representativeness heuristic and anchoring with adjustment, to estimate probabilities. Potential reasons for attribute substitution include the relative cognitive ease of heuristics compared with Bayesian reasoning, or the possibility that residents in clinical practice use gist traces rather than precise probability estimates when diagnosing.
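    For reference, the normative Bayesian calculation against which such heuristic estimates are compared is the odds form of Bayes' theorem; a minimal sketch with hypothetical numbers:

    ```python
    def post_test_probability(pre_test_prob, likelihood_ratio):
        """Post-test odds = pre-test odds x likelihood ratio."""
        pre_odds = pre_test_prob / (1.0 - pre_test_prob)
        post_odds = pre_odds * likelihood_ratio
        return post_odds / (1.0 + post_odds)

    print(post_test_probability(0.20, 4.0))  # discriminating finding: 0.20 -> 0.50
    # A non-discriminating feature has LR = 1 and should leave the estimate
    # unchanged -- exactly the case where heuristic users inflate it instead:
    print(post_test_probability(0.20, 1.0))  # 0.20
    ```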

  10. Internal Medicine residents use heuristics to estimate disease probability

    OpenAIRE

    Phang, Sen Han; Ravani, Pietro; Schaefer, Jeffrey; Wright, Bruce; McLaughlin, Kevin

    2015-01-01

    Background: Training in Bayesian reasoning may have limited impact on accuracy of probability estimates. In this study, our goal was to explore whether residents previously exposed to Bayesian reasoning use heuristics rather than Bayesian reasoning to estimate disease probabilities. We predicted that if residents use heuristics then post-test probability estimates would be increased by non-discriminating clinical features or a high anchor for a target condition. Method: We randomized 55 In...

  11. Semantic and associative factors in probability learning with words.

    Science.gov (United States)

    Schipper, L M; Hanson, B L; Taylor, G; Thorpe, J A

    1973-09-01

    Using a probability-learning technique with a single word as the cue and with the probability of a given event following this word fixed at .80, it was found that neither high nor low associates to the original word, nor synonyms or antonyms, showed differential learning curves subsequent to original learning when the probability of the following event was shifted to .20. In a second study, when feedback in the form of knowledge of results was withheld, there was a clear-cut similarity of predictions to the originally trained word and to synonyms of both high and low association value, and a dissimilarity of these words to a set of antonyms of both high and low association value. Two additional studies confirmed the importance of the semantic dimension as compared with association value as traditionally measured.

  12. Open ISEmeter: An open hardware high-impedance interface for potentiometric detection

    International Nuclear Information System (INIS)

    Salvador, C.; Carbajo, J.; Mozo, J. D.; Mesa, M. S.; Durán, E.; Alvarez, J. L.

    2016-01-01

    In this work, a new open hardware interface based on Arduino to read electromotive force (emf) from potentiometric detectors is presented. The interface has been fully designed with the open code philosophy and all documentation will be accessible on the web. The paper describes a comprehensive project including the electronic design, the firmware loaded on the Arduino, and the Java-coded graphical user interface to load data in a computer (PC or Mac) for processing. The prototype was tested by measuring the calibration curve of a detector. As detection element, an active poly(vinyl chloride)-based membrane was used, doped with cetyltrimethylammonium dodecylsulphate (CTA⁺-DS⁻). The experimental measures of emf indicate Nernstian behaviour with the CTA⁺ content of the test solutions, as described in the literature, proving the validity of the developed prototype. A comparative analysis of performance was made by using the same chemical detector but changing the measurement instrumentation.

  13. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...

  14. Addendum to ‘Understanding risks in the light of uncertainty: low-probability, high-impact coastal events in cities’

    Science.gov (United States)

    Galarraga, Ibon; Sainz de Murieta, Elisa; Markandya, Anil; María Abadie, Luis

    2018-02-01

    This addendum adds to the analysis presented in ‘Understanding risks in the light of uncertainty: low-probability, high-impact coastal events in cities’ Abadie et al (2017 Environ. Res. Lett. 12 014017). We propose to use the framework developed earlier to enhance communication and understanding of risks, with the aim of bridging the gap between the highly technical risk-management discussion and the public risk-aversion debate. We also propose that the framework could be used for stress-testing resilience.

  15. The link between organisational citizenship behaviours and open innovation: A case of Malaysian high-tech sector

    Directory of Open Access Journals (Sweden)

    M. Muzamil Naqshbandi

    2016-12-01

    Full Text Available We examine the role of organisational citizenship behaviours (OCBs) in two types of open innovation—inbound and outbound. Data were collected using the questionnaire survey technique from middle and top managers working in high-tech industries in Malaysia. Results show that OCBs positively predict both inbound and outbound open innovation. A closer look reveals that OCBs relate positively to outbound open innovation in aggregate and in isolation. However, OCBs relate to inbound open innovation in aggregate only. The implications of these results are discussed and limitations of the study are highlighted.

  16. Calculating the Probability of Returning a Loan with Binary Probability Models

    Directory of Open Access Journals (Sweden)

    Julian Vasilev

    2014-12-01

    Full Text Available The purpose of this article is to give a new approach in calculating the probability of returning a loan. A lot of factors affect the value of the probability. In this article, by using statistical and econometric models, some influencing factors are examined. The main approach is concerned with applying probit and logit models in loan management institutions. A new aspect of the credit risk analysis is given. Calculating the probability of returning a loan is a difficult task. We assume that specific data fields concerning the contract (month of signing, year of signing, given sum) and data fields concerning the borrower of the loan (month of birth, year of birth (age), gender, region where he/she lives) may be independent variables in a binary logistic model with the dependent variable “the probability of returning a loan”. It is shown that the month of signing a contract, the year of signing a contract, the gender and the age of the loan owner do not affect the probability of returning a loan. It is shown that the probability of returning a loan depends on the sum of the contract, the remoteness of the loan owner and the month of birth. The probability of returning a loan increases with the increase of the given sum, decreases with the proximity of the customer, increases for people born in the beginning of the year and decreases for people born at the end of the year.
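    A minimal sketch of the binary logit described above; the data are synthetic placeholders whose signs merely mimic the reported findings:

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 500
    loan_sum = rng.uniform(1_000, 20_000, n)      # given sum
    remoteness = rng.uniform(0, 300, n)           # distance of borrower, km
    birth_month = rng.integers(1, 13, n)          # 1 = January ... 12 = December

    # Synthetic outcome: repayment probability rises with sum and remoteness,
    # falls toward year-end birth months (illustrative coefficients only).
    eta = -1.0 + 1e-4 * loan_sum + 4e-3 * remoteness - 0.05 * birth_month
    returned = rng.binomial(1, 1.0 / (1.0 + np.exp(-eta)))

    X = sm.add_constant(np.column_stack([loan_sum, remoteness, birth_month]))
    model = sm.Logit(returned, X).fit(disp=0)
    print(model.params)          # fitted coefficients
    print(model.predict(X[:5]))  # predicted repayment probabilities
    ```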

  17. OpenSHMEM-UCX : Evaluation of UCX for implementing OpenSHMEM Programming Model

    Energy Technology Data Exchange (ETDEWEB)

    Baker, Matthew B [ORNL; Gorentla Venkata, Manjunath [ORNL; Aderholdt, William Ferrol [ORNL; Shamis, Pavel [ARM Research

    2016-01-01

    The OpenSHMEM reference implementation was developed towards the goal of an open source and high-performing OpenSHMEM implementation. To achieve portability and performance across various networks, the OpenSHMEM reference implementation uses GASNet and UCCS for network operations. Recently, new network layers have emerged with the promise of providing high performance, scalability, and portability for HPC applications. In this paper, we implement the OpenSHMEM reference implementation to use the UCX framework for network operations. Then, we evaluate its performance and scalability on Cray XK systems to understand UCX's suitability for developing the OpenSHMEM programming model. Further, we develop a benchmark called SHOMS for evaluating the OpenSHMEM implementation. Our experimental results show that OpenSHMEM-UCX outperforms the vendor-supplied OpenSHMEM implementation in most cases on the Cray XK system, by up to 40% with respect to message rate and up to 70% for the execution of application kernels.

  18. Application of plasma erosion opening switches to high power accelerators for pulse compression and power multiplication

    International Nuclear Information System (INIS)

    Meyer, R.A.; Boller, J.R.; Commisso, R.J.

    1983-01-01

    A new vacuum opening switch called a plasma erosion opening switch is described. A model of its operation is presented and the energy efficiency of such a switch is discussed. Recent high power experiments on the Gamble II accelerator are described and compared to previous experiments

  19. Slow relaxation in weakly open rational polygons.

    Science.gov (United States)

    Kokshenev, Valery B; Vicentini, Eduardo

    2003-07-01

    The interplay between the regular (piecewise-linear) and irregular (vertex-angle) boundary effects in nonintegrable rational polygonal billiards (of m equal sides) is discussed. Decay dynamics in polygons (of perimeter P_m and small opening Δ) is analyzed through the late-time survival probability S_m(t) ∼ t^(−δ). Two distinct slow relaxation channels are established. The primary universal channel exhibits relaxation of regular sliding orbits, with δ = 1. The secondary channel is given by δ > 1 and becomes open when m > P_m/Δ. It originates from vertex order-disorder dual effects and is due to relaxation of chaoticlike excitations.

  20. Developing a probability-based model of aquifer vulnerability in an agricultural region

    Science.gov (United States)

    Chen, Shih-Kai; Jang, Cheng-Shin; Peng, Yi-Huei

    2013-04-01

    Hydrogeological settings of aquifers strongly influence regional groundwater movement and pollution processes. Establishing a map of aquifer vulnerability is critical for planning a scheme of groundwater quality protection. This study developed a novel probability-based DRASTIC model of aquifer vulnerability in the Choushui River alluvial fan, Taiwan, using indicator kriging, and determined various risk categories of contamination potential based on estimated vulnerability indexes. Categories and ratings of the six parameters in the probability-based DRASTIC model were probabilistically characterized according to two parameter classification methods: selecting the rating with maximum estimation probability and calculating an expected value. Moreover, the probability-based estimation and assessment gave an excellent insight into propagating the uncertainty of parameters due to limited observation data. To examine the prediction capacity of the developed probability-based DRASTIC model, the medium, high, and very high risk categories of contamination potential were compared with observed nitrate-N concentrations exceeding 0.5 mg/L, which indicate anthropogenic groundwater pollution. The results reveal that the developed probability-based DRASTIC model is capable of predicting high nitrate-N groundwater pollution and characterizing the parameter uncertainty via the probability estimation processes.
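    The two parameter classification methods mentioned above reduce to a simple per-cell computation; a minimal sketch in which the ratings, probabilities and weight are invented:

    ```python
    import numpy as np

    # Indicator kriging gives each rating class of a DRASTIC-type parameter an
    # estimation probability at a grid cell; the chosen rating then enters the
    # weighted vulnerability index.
    ratings = np.array([1, 3, 5, 8, 10])                 # rating scale
    prob = np.array([0.05, 0.15, 0.40, 0.30, 0.10])      # kriged class probabilities

    expected_rating = (ratings * prob).sum()             # expected-value method
    map_rating = ratings[prob.argmax()]                  # maximum-probability method

    weight = 5                                           # parameter weight in the index
    print(expected_rating * weight, map_rating * weight)
    ```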

  1. Open ISEmeter: An open hardware high-impedance interface for potentiometric detection

    Energy Technology Data Exchange (ETDEWEB)

    Salvador, C.; Carbajo, J.; Mozo, J. D., E-mail: jdaniel.mozo@diq.uhu.es [Applied Electrochemistry Laboratory, Faculty of Experimental Sciences, University of Huelva, Av. 3 de Marzo s/n., 21007 Huelva (Spain); Mesa, M. S.; Durán, E. [Department of Electronics Engineering, Computers and Automatic, ETSI, University of Huelva, Campus de La Rabida, 21810 Huelva (Spain); Alvarez, J. L. [Department of Information Technologies, ETSI, University of Huelva, Campus de La Rabida, 21810 Huelva (Spain)

    2016-05-15

    In this work, a new open hardware interface based on Arduino to read electromotive force (emf) from potentiometric detectors is presented. The interface has been fully designed with the open code philosophy and all documentation will be accessible on the web. The paper describes a comprehensive project including the electronic design, the firmware loaded on the Arduino, and the Java-coded graphical user interface to load data in a computer (PC or Mac) for processing. The prototype was tested by measuring the calibration curve of a detector. As detection element, an active poly(vinyl chloride)-based membrane was used, doped with cetyltrimethylammonium dodecylsulphate (CTA⁺-DS⁻). The experimental measures of emf indicate Nernstian behaviour with the CTA⁺ content of the test solutions, as described in the literature, proving the validity of the developed prototype. A comparative analysis of performance was made by using the same chemical detector but changing the measurement instrumentation.
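    The Nernstian check described in both records amounts to a linear fit of emf against the logarithm of concentration; a minimal sketch in which the readings are made-up placeholders, not the paper's data:

    ```python
    import numpy as np

    conc = np.array([1e-5, 1e-4, 1e-3, 1e-2])      # CTA+ concentration, mol/L
    emf = np.array([152.0, 210.5, 269.8, 328.6])   # interface readings, mV

    slope, intercept = np.polyfit(np.log10(conc), emf, 1)
    # Nernstian behaviour for a monovalent cation: ~59.2 mV per decade at 25 C.
    print(f"slope = {slope:.1f} mV/decade")
    ```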

  2. Highly enhanced avalanche probability using sinusoidally-gated silicon avalanche photodiode

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, Shingo; Namekata, Naoto, E-mail: nnao@phys.cst.nihon-u.ac.jp; Inoue, Shuichiro [Institute of Quantum Science, Nihon University, 1-8-14 Kanda-Surugadai, Chiyoda-ku, Tokyo 101-8308 (Japan); Tsujino, Kenji [Tokyo Women's Medical University, 8-1 Kawada-cho, Shinjuku-ku, Tokyo 162-8666 (Japan)

    2014-01-27

    We report on visible light single photon detection using a sinusoidally-gated silicon avalanche photodiode. Detection efficiency of 70.6% was achieved at a wavelength of 520 nm when an electrically cooled silicon avalanche photodiode with a quantum efficiency of 72.4% was used, which implies that a photo-excited single charge carrier in a silicon avalanche photodiode can trigger a detectable avalanche (charge) signal with a probability of 97.6%.

  3. On the Possibility of Assigning Probabilities to Singular Cases, or: Probability Is Subjective Too!

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2009-06-01

    Full Text Available Both Ludwig von Mises and Richard von Mises claimed that numerical probability could not be legitimately applied to singular cases. This paper challenges this aspect of the von Mises brothers' theory of probability. It is argued that their denial that numerical probability could be applied to singular cases was based solely upon Richard von Mises' exceptionally restrictive definition of probability. This paper challenges Richard von Mises' definition of probability by arguing that the definition of probability necessarily depends upon whether the world is governed by time-invariant causal laws. It is argued that if the world is governed by time-invariant causal laws, a subjective definition of probability must be adopted. It is further argued that the nature of human action and the relative frequency method for calculating numerical probabilities both presuppose that the world is indeed governed by time-invariant causal laws. It is finally argued that the subjective definition of probability undercuts the von Mises claim that numerical probability cannot legitimately be applied to singular, non-replicable cases.

  4. A sustainable business model for Open-Access journal publishing a proposed plan for High-Energy Physics

    CERN Document Server

    Vigen, Jens

    2007-01-01

    The High Energy Physics community over the last 15 years has achieved so-called full green Open Access through the wide dissemination of preprints via arXiv, a central subject repository managed by Cornell University. However, green Open Access does not alleviate the economic difficulties of libraries as they are still expected to offer access to versions of record of the peer-reviewed literature. For this reason the particle physics community is now addressing the issue of gold Open Access by converting a set of the existing core journals to Open Access. A Working Party has been established to bring together funding agencies, laboratories and libraries into a single consortium, called SCOAP3 (Sponsoring Consortium for Open Access Publishing in Particle Physics). This consortium will engage with publishers to build a sustainable model for Open Access publishing. In this model, subscription fees from multiple institutions are replaced by contracts with publishers of Open Access journals, where the SCOAP3 conso...

  5. Long‐term time‐dependent probabilities for the third Uniform California Earthquake Rupture Forecast (UCERF3)

    Science.gov (United States)

    Field, Edward; Biasi, Glenn P.; Bird, Peter; Dawson, Timothy E.; Felzer, Karen R.; Jackson, David A.; Johnson, Kaj M.; Jordan, Thomas H.; Madden, Christopher; Michael, Andrew J.; Milner, Kevin; Page, Morgan T.; Parsons, Thomas E.; Powers, Peter; Shaw, Bruce E.; Thatcher, Wayne R.; Weldon, Ray J.; Zeng, Yuehua

    2015-01-01

    The 2014 Working Group on California Earthquake Probabilities (WGCEP 2014) presents time-dependent earthquake probabilities for the third Uniform California Earthquake Rupture Forecast (UCERF3). Building on the UCERF3 time-independent model, published previously, renewal models are utilized to represent elastic-rebound-implied probabilities. A new methodology has been developed that solves applicability issues in the previous approach for un-segmented models. The new methodology also supports magnitude-dependent aperiodicity and accounts for the historic open interval on faults that lack a date-of-last-event constraint. Epistemic uncertainties are represented with a logic tree, producing 5,760 different forecasts. Results for a variety of evaluation metrics are presented, including logic-tree sensitivity analyses and comparisons to the previous model (UCERF2). For 30-year M≥6.7 probabilities, the most significant changes from UCERF2 are a threefold increase on the Calaveras fault and a threefold decrease on the San Jacinto fault. Such changes are due mostly to differences in the time-independent models (e.g., fault slip rates), with relaxation of segmentation and inclusion of multi-fault ruptures being particularly influential. In fact, some UCERF2 faults were simply too long to produce M 6.7 sized events given the segmentation assumptions in that study. Probability model differences are also influential, with the implied gains (relative to a Poisson model) being generally higher in UCERF3. Accounting for the historic open interval is one reason. Another is an effective 27% increase in the total elastic-rebound-model weight. The exact factors influencing differences between UCERF2 and UCERF3, as well as the relative importance of logic-tree branches, vary throughout the region, and depend on the evaluation metric of interest. For example, M≥6.7 probabilities may not be a good proxy for other hazard or loss measures. This sensitivity, coupled with the
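    The renewal calculation underlying such time-dependent probabilities can be sketched as follows; this is not the WGCEP code, a Brownian Passage Time distribution is assumed, and the fault parameters are hypothetical:

    ```python
    from scipy.stats import invgauss

    def conditional_probability(mean_ri, aperiodicity, open_interval, horizon):
        """P(rupture in (T, T+dT] | no rupture during the open interval T) for
        a Brownian Passage Time (inverse Gaussian) renewal model."""
        # scipy's inverse Gaussian parameterization: mu = alpha^2, scale = mean / alpha^2
        dist = invgauss(mu=aperiodicity**2, scale=mean_ri / aperiodicity**2)
        T, dT = open_interval, horizon
        return (dist.cdf(T + dT) - dist.cdf(T)) / (1.0 - dist.cdf(T))

    # Hypothetical fault: 200-yr mean recurrence, aperiodicity 0.5,
    # 150 yr since the last event, 30-yr forecast window:
    print(conditional_probability(200.0, 0.5, 150.0, 30.0))
    ```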

  6. Long-term outcome of high-energy open Lisfranc injuries: a retrospective study.

    Science.gov (United States)

    Nithyananth, Manasseh; Boopalan, Palapattu R J V C; Titus, Vijay T K; Sundararaj, Gabriel D; Lee, Vernon N

    2011-03-01

    The outcome of open Lisfranc injuries has been reported infrequently. Should these injuries be managed as closed injuries and is their outcome different? We undertook a retrospective study of high-energy, open Lisfranc injuries treated between 1999 and 2005. The types of dislocation, the associated injuries to the same foot, the radiologic and functional outcome, and the complications were studied. There were 22 patients. Five patients died. One had amputation. Of the remaining 16 patients, 13 men were followed up at a mean of 56 months (range, 29-88 months). The average age was 36 years (range, 7-55 years). According to the modified Hardcastle classification, type B2 injury was the commonest. Ten patients had additional forefoot or midfoot injury. All patients were treated with debridement, open reduction, and multiple Kirschner (K) wire fixation. All injuries were Gustilo Anderson type IIIa or IIIb. Nine patients had split skin graft for soft tissue cover. Mean time taken for wound healing was 16 days (range, 10-30 days). Ten patients (77%) had fracture comminution. Eight patients had anatomic reduction, whereas five had nonanatomic reduction. Ten of 13 (77%) patients had at least one spontaneous tarsometatarsal joint fusion. The mean American Orthopaedic Foot and Ankle Society score was 82 (range, 59-100). Nonanatomic reduction, osteomyelitis, deformity of toes, planus foot, and mild discomfort on prolonged walking were the unfavorable outcomes present. In open Lisfranc injuries, multiple K wire fixation should be considered especially in the presence of comminution and soft tissue loss. Although anatomic reduction is always not obtained, the treatment principles should include adequate debridement, maintaining alignment with multiple K wires, and obtaining early soft tissue cover. There is a high incidence of fusion across tarsometatarsal joints. Copyright © 2011 by Lippincott Williams & Wilkins

  7. Skin damage probabilities using fixation materials in high-energy photon beams

    International Nuclear Information System (INIS)

    Carl, J.; Vestergaard, A.

    2000-01-01

    Patient fixation, such as thermoplastic masks, carbon-fibre support plates and polystyrene bead vacuum cradles, is used to reproduce patient positioning in radiotherapy. Consequently, low-density materials may be introduced in high-energy photon beams. The aim of this study was to measure the increase in skin dose when low-density materials are present and to calculate the radiobiological consequences in terms of probabilities of early and late skin damage. An experimental thin-windowed plane-parallel ion chamber was used. Skin doses were measured using various overlaying low-density fixation materials. A fixed geometry of a 10 x 10 cm field, an SSD of 100 cm and photon energies of 4, 6 and 10 MV on Varian Clinac 2100C accelerators was used for all measurements. Radiobiological consequences of introducing these materials into the high-energy photon beams were evaluated in terms of early and late damage of the skin, based on the measured surface doses and the LQ model. The experimental ion chamber gave results consistent with other studies. A relationship between skin dose and material thickness in mg/cm² was established and used to calculate skin doses in scenarios assuming radiotherapy treatment with opposed fields. Conventional radiotherapy may apply mid-point doses up to 60-66 Gy in daily 2-Gy fractions with opposed fields. Using thermoplastic fixation and high-energy photons as low as 4 MV does increase the dose to the skin considerably. However, using thermoplastic materials with thickness less than 100 mg/cm², skin doses are comparable with those produced by variation in source-to-skin distance, field size or blocking trays within clinical treatment set-ups. The use of polystyrene cradles and carbon-fibre materials with thickness less than 100 mg/cm² should be avoided at 4 MV at doses above 54-60 Gy. (author)
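    The LQ-model step referred to above can be sketched numerically; alpha, beta and the per-fraction skin doses below are illustrative assumptions, not the paper's measurements:

    ```python
    import math

    def surviving_fraction(dose_per_fraction, n_fractions, alpha=0.1, beta=0.02):
        """LQ survival after n fractions of size d: exp(-n*(alpha*d + beta*d^2));
        lower survival of skin cells maps to higher damage probability."""
        d = dose_per_fraction
        return math.exp(-n_fractions * (alpha * d + beta * d * d))

    # 30 x 2 Gy mid-point fractions; a thermoplastic mask raising the skin
    # dose per fraction from, say, 0.6 Gy to 1.2 Gy (hypothetical values):
    print(surviving_fraction(0.6, 30), surviving_fraction(1.2, 30))
    ```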

  8. Changes in patellofemoral alignment do not cause clinical impact after open-wedge high tibial osteotomy.

    Science.gov (United States)

    Lee, Yong Seuk; Lee, Sang Bok; Oh, Won Seok; Kwon, Yong Eok; Lee, Beom Koo

    2016-01-01

    The objectives of this study were (1) to evaluate the clinical and radiologic outcomes of open-wedge high tibial osteotomy focusing on patellofemoral alignment and (2) to search for correlations between variables and patellofemoral malalignment. A total of 46 knees (46 patients) from 32 females and 14 males who underwent open-wedge high tibial osteotomy were included in this retrospective case series. Outcomes were evaluated using clinical scales and radiologic parameters at the last follow-up. Pre-operative and final follow-up values were compared for the outcome analysis. For the focused analysis of the patellofemoral joint, correlation analyses between patellofemoral variables and pre- and post-operative weight-bearing line (WBL), clinical score, posterior slope, Blackburne-Peel ratio, lateral patellar tilt, lateral patellar shift, and congruence angle were performed. The minimum follow-up period was 2 years and the median follow-up period was 44 months (range 24-88 months). The percentage of the weight-bearing line was shifted from 17.2 ± 11.1 to 56.7 ± 12.7%, and the change was statistically significant. Regarding patellofemoral malalignment, the pre-operative weight-bearing line showed an association with the change in lateral patellar tilt and lateral patellar shift (correlation coefficient: 0.3). After open-wedge high tibial osteotomy, clinical results showed improvement compared to pre-operative values. The patellar tilt and lateral patellar shift were not changed; however, descent of the patella was observed. Therefore, mild patellofemoral problems should not be a contraindication to open-wedge high tibial osteotomy. Case series, Level IV.

  9. Early weight bearing versus delayed weight bearing in medial opening wedge high tibial osteotomy: a randomized controlled trial.

    Science.gov (United States)

    Lansdaal, Joris Radboud; Mouton, Tanguy; Wascher, Daniel Charles; Demey, Guillaume; Lustig, Sebastien; Neyret, Philippe; Servien, Elvire

    2017-12-01

    The need for a period of non-weight bearing after medial opening wedge high tibial osteotomy remains controversial. It was hypothesized that immediate weight bearing after medial opening wedge high tibial osteotomy would result in no difference in functional scores at one year compared to delayed weight bearing. Fifty patients, median age 54 years (range 40-65), with medial compartment osteoarthritis underwent a medial opening wedge high tibial osteotomy utilizing a locking plate without bone grafting. Patients were randomized into an Immediate or a Delayed (2 months) weight bearing group. All patients were assessed at one-year follow-up and the two groups compared. The primary outcome measure was the IKS score. Secondary outcome measures included the IKDC score, the VAS pain score and the rate of complications. The functional scores significantly improved in both groups. The IKS score increased from 142 ± 31 to 171 ± 26 in the Immediate group. Immediate weight bearing after medial opening wedge high tibial osteotomy had no effect on functional scores at 1-year follow-up and did not significantly increase the complication rate. It appears to be safe and can allow some patients a quicker return to activities of daily living and a decreased convalescence period. Level II.

  10. Y(4143) is probably a molecular partner of Y(3930)

    International Nuclear Information System (INIS)

    Liu Xiang; Zhu Shilin

    2009-01-01

    After discussing the various possible interpretations of the Y(4143) signal observed by the CDF collaboration in the J/ψφ mode, we tend to conclude that Y(4143) is probably a D_s*D̄_s* molecular state with J^PC = 0^++ or 2^++, while Y(3930) is its D*D̄* molecular partner, as predicted in our previous work [X. Liu, Z. G. Luo, Y. R. Liu, and Shi-Lin Zhu, Eur. Phys. J. C 61, 411 (2009)]. Both the hidden-charm and open-charm two-body decays occur through the rescattering of the vector components within the molecular states, while the three- and four-body open-charm decay modes are forbidden kinematically. Hence, their widths are naturally narrow. The CDF, BABAR and Belle collaborations may have discovered heavy molecular states already. We urge experimentalists to measure their quantum numbers and explore their radiative decay modes in the future.

  11. Does probability of occurrence relate to population dynamics?

    Science.gov (United States)

    Thuiller, Wilfried; Münkemüller, Tamara; Schiffers, Katja H.; Georges, Damien; Dullinger, Stefan; Eckhart, Vincent M.; Edwards, Thomas C.; Gravel, Dominique; Kunstler, Georges; Merow, Cory; Moore, Kara; Piedallu, Christian; Vissault, Steve; Zimmermann, Niklaus E.; Zurell, Damaris; Schurr, Frank M.

    2014-01-01

    Hutchinson defined species' realized niche as the set of environmental conditions in which populations can persist in the presence of competitors. In terms of demography, the realized niche corresponds to the environments where the intrinsic growth rate (r) of populations is positive. Observed species occurrences should reflect the realized niche when additional processes like dispersal and local extinction lags do not have overwhelming effects. Despite the foundational nature of these ideas, quantitative assessments of the relationship between range-wide demographic performance and occurrence probability have not been made. This assessment is needed both to improve our conceptual understanding of species' niches and ranges and to develop reliable mechanistic models of species geographic distributions that incorporate demography and species interactions. The objective of this study is to analyse how demographic parameters (intrinsic growth rate r and carrying capacity K) and population density (N) relate to occurrence probability (P_occ). We hypothesized that these relationships vary with species' competitive ability. Demographic parameters, density, and occurrence probability were estimated for 108 tree species from four temperate forest inventory surveys (Québec, western USA, France and Switzerland). We used published information of shade tolerance as indicators of light competition strategy, assuming that high tolerance denotes high competitive capacity in stable forest environments. Interestingly, relationships between demographic parameters and occurrence probability did not vary substantially across degrees of shade tolerance and regions. Although they were influenced by the uncertainty in the estimation of the demographic parameters, we found that r was generally negatively correlated with P_occ, while N, and for most regions K, was generally positively correlated with P_occ. Thus, in temperate forest trees the regions of highest occurrence

  12. The "synergistic" action of mixed irradiation with high-LET and low-LET radiation

    International Nuclear Information System (INIS)

    Suzuki, Shozo

    1994-01-01

    The combined modalities of various agents such as radiation, chemicals and physical agents are often used, and exposure to mixtures of agents sometimes occurs in nature. However, it is not clear whether these combined effects are synergistic, partly because the definition of the term "synergism" is confusing, as pointed out by Streffer and Mueller. It is, of course, desirable that the definition should be simple and widely applicable to all agents. Yet the underlying mechanisms of the effects of different agents are probably different, and the mechanisms of combined effects are different and more complicated than those of a single agent. It is therefore important to define synergism taking each underlying mechanism into consideration. From this viewpoint, the definitions of synergism which have been used to date are examined with respect to the effect of a mixture of different types of radiation on cells, and they are shown to be inappropriate and misleading. This is probably attributable to simply treating the resulting phenomena (cell survival in most cases) without adequately taking into consideration the knowledge of underlying biological mechanisms in defining the synergism that may occur with irradiation. This commentary discusses the inappropriateness of current definitions and proposes a new definition in terms of biological mechanisms as a counterproposal. 16 refs., 6 figs

  13. Effects of high frequency fluctuations on DNS of turbulent open-channel flow with high Pr passive scalar transport

    International Nuclear Information System (INIS)

    Yamamoto, Yoshinobu; Kunugi, Tomoaki; Serizawa, Akimi

    2002-01-01

    In this study, the effects of high-frequency fluctuations on DNS of turbulent open-channel flow with high-Pr passive scalar transport were investigated. The results show that, although significant differences in the energy spectra of the temperature field arise in the high-wave-number region, where the contribution of the velocity components is insignificant, no large differences are caused in the mean and statistical behavior of the temperature component. However, if buoyancy were considered, these high-frequency temperature fluctuations could greatly change the mean and statistical behavior, owing to the difference in accuracy and resolution in the high-wave-number region. (author)

  14. Highly reversible open framework nanoscale electrodes for divalent ion batteries.

    Science.gov (United States)

    Wang, Richard Y; Wessells, Colin D; Huggins, Robert A; Cui, Yi

    2013-01-01

    The reversible insertion of monovalent ions such as lithium into electrode materials has enabled the development of rechargeable batteries with high energy density. Reversible insertion of divalent ions such as magnesium would allow the creation of new battery chemistries that are potentially safer and cheaper than lithium-based batteries. Here we report that nanomaterials in the Prussian Blue family of open framework materials, such as nickel hexacyanoferrate, allow for the reversible insertion of aqueous alkaline earth divalent ions, including Mg²⁺, Ca²⁺, Sr²⁺, and Ba²⁺. We show unprecedented long cycle life and high rate performance for divalent ion insertion. Our results represent a step forward and pave the way for future development in divalent batteries.

  15. OpenSubspace

    DEFF Research Database (Denmark)

    Müller, Emmanuel; Assent, Ira; Günnemann, Stephan

    2009-01-01

    Subspace clustering and projected clustering are recent research areas for clustering in high dimensional spaces. As the field is rather young, there is a lack of comparative studies on the advantages and disadvantages of the different algorithms. Part of the underlying problem is the lack of available open source implementations that could be used by researchers to understand, compare, and extend subspace and projected clustering algorithms. In this paper, we discuss the requirements for open source evaluation software. We propose OpenSubspace, an open source framework that meets these requirements. OpenSubspace integrates state-of-the-art performance measures and visualization techniques to foster research in subspace and projected clustering.

  16. Poor concordance of spiral CT (SCT) and high probability ventilation-perfusion (V/Q) studies in the diagnosis of pulmonary embolism (PE)

    International Nuclear Information System (INIS)

    Roman, M.R.; Angelides, S.; Chen, N.

    2000-01-01

    Full text: Despite its limitations, V/Q scintigraphy remains the favoured non-invasive technique for the diagnosis of pulmonary embolism (PE). PE is present in 85-90% and 30-40% of high and intermediate probability V/Q studies respectively. The value of spiral CT (SCT), a newer imaging modality, has yet to be determined. The aims of this study were to determine the frequency of positive SCT for PE in high and intermediate probability V/Q studies performed within 24 hr of each other. 15 patients (6M, 9F, mean age 70.2) with a high probability study were included. Six (40%) SCT were reported as positive (four with emboli present in the main pulmonary arteries), seven as negative, one as equivocal and one was technically sub-optimal. Pulmonary angiography was not performed in any patient. In all seven negative studies, the SCT was performed before the V/Q study. Of these, two studies were revised to positive once the result of the V/Q study was known, while three others had resolving mismatched V/Q defects on follow-up studies (performed 5-14 days later); two of these three also had a positive duplex scan of the lower limbs. One other was most likely due to chronic thromboembolic disease. Only three patients had a V/Q scan prior to the SCT; all were positive for PE on both imaging modalities. Of 26 patients (11M, 15F, mean age 68.5) with an intermediate probability V/Q study, SCT was positive in only two (8%). Thus the low detection rate of PE by SCT in this albeit small series raises doubts as to its role in the diagnosis of PE. Copyright (2000) The Australian and New Zealand Society of Nuclear Medicine Inc

  17. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  18. The external costs of low probability-high consequence events: Ex ante damages and lay risks

    International Nuclear Information System (INIS)

    Krupnick, A.J.; Markandya, A.; Nickell, E.

    1994-01-01

    This paper provides an analytical basis for characterizing key differences between two perspectives on how to estimate the expected damages of low probability - high consequence events. One perspective is the conventional method used in the U.S.-EC fuel cycle reports [e.g., ORNL/RFF (1994a,b)]. This paper articulates another perspective, using economic theory. The paper makes a strong case for considering this approach as an alternative, or at least as a complement, to the conventional approach. This alternative approach is an important area for future research. Interest has been growing worldwide in embedding the external costs of productive activities, particularly the fuel cycles resulting in electricity generation, into prices. In any attempt to internalize these costs, one must take into account explicitly the remote but real possibilities of accidents and the wide gap between lay perceptions and expert assessments of such risks. In our fuel cycle analyses, we estimate damages and benefits by simply monetizing expected consequences, based on pollution dispersion models, exposure-response functions, and valuation functions. For accidents, such as mining and transportation accidents, natural gas pipeline accidents, and oil barge accidents, we use historical data to estimate the rates of these accidents. For extremely severe accidents--such as severe nuclear reactor accidents and catastrophic oil tanker spills--events are extremely rare and they do not offer a sufficient sample size to estimate their probabilities based on past occurrences. In those cases the conventional approach is to rely on expert judgments about both the probability of the consequences and their magnitude. As an example of standard practice, which we term here an expert expected damage (EED) approach to estimating damages, consider how evacuation costs are estimated in the nuclear fuel cycle report.
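    The contrast between the two perspectives can be made concrete in a toy calculation; all probabilities, damages and the weighting function below are invented for illustration:

    ```python
    # Expert expected damage (EED): monetized expected consequences,
    # i.e. the sum over scenarios of probability x damage.
    scenarios = [
        (1e-1, 1e5),    # routine accident
        (1e-4, 1e9),    # severe accident
        (1e-7, 1e12),   # catastrophic low-probability / high-consequence event
    ]
    eed = sum(p * d for p, d in scenarios)

    # A crude ex-ante alternative: overweight rare outcomes with a subjective
    # probability-weighting function w(p) (placeholder functional form only).
    def risk_weight(p):
        return p ** 0.8     # w(p) > p for small p: lay overweighting of rare events

    ex_ante = sum(risk_weight(p) * d for p, d in scenarios)
    print(f"EED: {eed:,.0f} $/yr   ex-ante weighted: {ex_ante:,.0f} $/yr")
    ```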

  19. The external costs of low probability-high consequence events: Ex ante damages and lay risks

    Energy Technology Data Exchange (ETDEWEB)

    Krupnick, A J; Markandya, A; Nickell, E

    1994-07-01

    This paper provides an analytical basis for characterizing key differences between two perspectives on how to estimate the expected damages of low probability - high consequence events. One perspective is the conventional method used in the U.S.-EC fuel cycle reports [e.g., ORNL/RFF (1994a,b)]. This paper articulates another perspective, using economic theory. The paper makes a strong case for considering this approach as an alternative, or at least as a complement, to the conventional approach. This alternative approach is an important area for future research. Interest has been growing worldwide in embedding the external costs of productive activities, particularly the fuel cycles resulting in electricity generation, into prices. In any attempt to internalize these costs, one must take into account explicitly the remote but real possibilities of accidents and the wide gap between lay perceptions and expert assessments of such risks. In our fuel cycle analyses, we estimate damages and benefits by simply monetizing expected consequences, based on pollution dispersion models, exposure-response functions, and valuation functions. For accidents, such as mining and transportation accidents, natural gas pipeline accidents, and oil barge accidents, we use historical data to estimate the rates of these accidents. For extremely severe accidents--such as severe nuclear reactor accidents and catastrophic oil tanker spills--events are extremely rare and they do not offer a sufficient sample size to estimate their probabilities based on past occurrences. In those cases the conventional approach is to rely on expert judgments about both the probability of the consequences and their magnitude. As an example of standard practice, which we term here an expert expected damage (EED) approach to estimating damages, consider how evacuation costs are estimated in the nuclear fuel cycle report.

  20. The eGo grid model: An open source approach towards a model of German high and extra-high voltage power grids

    Science.gov (United States)

    Mueller, Ulf Philipp; Wienholt, Lukas; Kleinhans, David; Cussmann, Ilka; Bunke, Wolf-Dieter; Pleßmann, Guido; Wendiggensen, Jochen

    2018-02-01

    There are several power grid modelling approaches suitable for simulations in the field of power grid planning. The restrictive policies of grid operators, regulators and research institutes concerning their original data and models lead to an increased interest in open source approaches of grid models based on open data. By including all voltage levels between 60 kV (high voltage) and 380 kV (extra high voltage), we dissolve the common distinction between transmission and distribution grid in energy system models and utilize a single, integrated model instead. An open data set, primarily for Germany, which can be used for non-linear, linear and linear-optimal power flow methods, was developed. This data set consists of an electrically parameterised grid topology as well as allocated generation and demand characteristics for present and future scenarios at high spatial and temporal resolution. The usability of the grid model was demonstrated by the performance of exemplary power flow optimizations. Based on a marginal cost driven power plant dispatch, being subject to grid restrictions, congested power lines were identified. Continuous validation of the model is necessary in order to reliably model storage and grid expansion in progressing research.
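
    As a minimal illustration of the linear ("DC") power flow methods such a data set supports, the following sketch solves the nodal equations B θ = P on a toy three-bus network with numpy; the topology, susceptances and injections are invented for the example and are unrelated to the eGo data set.

```python
import numpy as np

# Toy 3-bus network: lines (from, to, susceptance in p.u.)
lines = [(0, 1, 10.0), (1, 2, 8.0), (0, 2, 5.0)]
n_bus = 3

# Build the nodal susceptance matrix B
B = np.zeros((n_bus, n_bus))
for i, j, b in lines:
    B[i, i] += b
    B[j, j] += b
    B[i, j] -= b
    B[j, i] -= b

# Net injections in p.u. (generation positive, load negative); must sum to 0
P = np.array([1.5, -0.5, -1.0])

# Fix bus 0 as the slack (theta = 0) and solve the reduced system
theta = np.zeros(n_bus)
theta[1:] = np.linalg.solve(B[1:, 1:], P[1:])

# Line flows follow from the angle differences
for i, j, b in lines:
    print(f"flow {i}->{j}: {b * (theta[i] - theta[j]):+.3f} p.u.")
```

    Comparing such flows against line ratings is the basic step behind identifying congested lines in a cost-driven dispatch.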

  1. Open magnetic fields in active regions

    Science.gov (United States)

    Svestka, Z.; Solodyna, C. V.; Howard, R.; Levine, R. H.

    1977-01-01

    Soft X-ray images and magnetograms of several active regions and coronal holes are examined which support the interpretation that some of the dark X-ray gaps seen between interconnecting loops and inner cores of active regions are foot points of open field lines inside the active regions. Characteristics of the investigated dark gaps are summarized. All the active regions with dark X-ray gaps at the proper place and with the correct polarity predicted by global potential extrapolation of photospheric magnetic fields are shown to be old active regions, indicating that field opening is accomplished only in a late phase of active-region development. It is noted that some of the observed dark gaps probably have nothing in common with open fields, but are either due to the decreased temperature in low-lying portions of interconnecting loops or are the roots of higher and less dense or cooler loops.

  2. A Hall-current model of electron loss after POS opening into high-impedance loads

    International Nuclear Information System (INIS)

    Greenly, J.B.

    1989-01-01

    The author discusses how a self-consistent relativistic model of laminar Hall (E x B) electron flow across a POS plasma allows a loss mechanism after opening even in a strongly magnetically-insulated line, downstream of the remaining POS plasma. Opening is assumed to occur at the cathode, either by erosion or push-back. The loss results only when a large voltage appears after opening into a high impedance load. Then the difference in potential between the plasma, which is near anode potential, and the cathode results in an axial component of E at the load end of the plasma, which supports an E x B drift of electrons across the gap. The analytic model predicts that this loss should increase with higher voltage after opening, and could be eliminated only by removing the plasma from the gap, or eliminating cathode electron emission (both difficult), or by confining this downstream electron flow with an applied magnetic field.
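
    The loss mechanism described above rests on the standard drift of charged particles in crossed fields; as a reminder (a textbook relation, not a formula quoted from the paper):

```latex
\mathbf{v}_{E\times B} = \frac{\mathbf{E}\times\mathbf{B}}{B^{2}}
```

    An axial component of E at the load end of the plasma, crossed with the magnetic field of the line, therefore yields a drift that carries electrons across the gap, independent of the particle charge and mass.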

  3. Exploring Differences between Self-Regulated Learning Strategies of High and Low Achievers in Open Distance Learning

    Science.gov (United States)

    Geduld, Bernadette

    2016-01-01

    Open distance students differ in their preparedness for higher education studies. Students who are less self-regulated risk failure and drop out in the challenging milieu of open distance learning. In this study, the differences between the application of self-regulated learning strategies by low and high achievers were explored. A multi-method…

  4. High-resolution Spectroscopic Observations of Binary Stars and Yellow Stragglers in Three Open Clusters : NGC 2360, NGC 3680, and NGC 5822

    Science.gov (United States)

    Sales Silva, J. V.; Peña Suárez, V. J.; Katime Santrich, O. J.; Pereira, C. B.; Drake, N. A.; Roig, F.

    2014-11-01

    Binary stars in open clusters are very useful targets in constraining the nucleosynthesis process. The luminosities of the stars are known because the distances of the clusters are also known, so chemical peculiarities can be linked directly to the evolutionary status of a star. In addition, binary stars offer the opportunity to verify a relationship between them and the straggler population in both globular and open clusters. We carried out a detailed spectroscopic analysis to derive the atmospheric parameters for 16 red giants in binary systems and the chemical composition of 11 of them in the open clusters NGC 2360, NGC 3680, and NGC 5822. We obtained abundances of C, N, O, Na, Mg, Al, Ca, Si, Ti, Ni, Cr, Y, Zr, La, Ce, and Nd. The atmospheric parameters of the studied stars and their chemical abundances were determined using high-resolution optical spectroscopy. We employ the local thermodynamic equilibrium model atmospheres of Kurucz and the spectral analysis code MOOG. The abundances of the light elements were derived using the spectral synthesis technique. We found that the stars NGC 2360-92 and 96, NGC 3680-34, and NGC 5822-4 and 312 are yellow straggler stars. We show that the spectra of NGC 5822-4 and 312 present evidence of contamination by an A-type star as a secondary star. For the other yellow stragglers, evidence of contamination is given by the broad wings of the Hα. Detection of yellow straggler stars is important because the observed number can be compared with the number predicted by simulations of binary stellar evolution in open clusters. We also found that the other binary stars are not s-process enriched, which may suggest that in these binaries the secondary star is probably a faint main-sequence object. The lack of any s-process enrichment is very useful in setting constraints for the number of white dwarfs in the open cluster, a subject that is related to the birthrate of these kinds of stars in open clusters and also to the age of a

  5. Which Type of Inquiry Project Do High School Biology Students Prefer: Open or Guided?

    Science.gov (United States)

    Sadeh, Irit; Zion, Michal

    2012-01-01

    In teaching inquiry to high school students, educators differ on which method of teaching inquiry is more effective: Guided or open inquiry? This paper examines the influence of these two different inquiry learning approaches on the attitudes of Israeli high school biology students toward their inquiry project. The results showed significant…

  6. An analytical calculation of neighbourhood order probabilities for high dimensional Poissonian processes and mean field models

    International Nuclear Information System (INIS)

    Tercariol, Cesar Augusto Sangaletti; Kiipper, Felipe de Moura; Martinez, Alexandre Souto

    2007-01-01

    Consider that the coordinates of N points are randomly generated along the edges of a d-dimensional hypercube (random point problem). The probability P^{(d,N)}_{m,n} that an arbitrary point is the mth nearest neighbour to its own nth nearest neighbour (Cox probabilities) plays an important role in spatial statistics. Also, it has been useful in the description of physical processes in disordered media. Here we propose a simpler derivation of Cox probabilities, where we stress the role played by the system dimensionality d. In the limit d → ∞, the distances between pairs of points become independent (random link model) and closed analytical forms for the neighbourhood probabilities are obtained both for the thermodynamic limit and for finite-size systems. Breaking the distance symmetry constraint drives us to the random map model, for which the Cox probabilities are obtained for two cases: whether a point is its own nearest neighbour or not.
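
    These neighbourhood probabilities are easy to check by direct simulation: scatter N points uniformly in the d-dimensional cube, pick a point, find its nth nearest neighbour, and record which neighbour rank the original point holds for that neighbour. The sketch below (a Monte Carlo illustration under these assumptions, not the authors' analytical derivation) estimates P^{(d,N)}_{m,n}:

```python
import numpy as np

rng = np.random.default_rng(0)

def estimate_cox_probability(d, N, m, n, trials=20_000):
    """Estimate P_{m,n}: the probability that a point is the m-th nearest
    neighbour of its own n-th nearest neighbour (ranks start at 1)."""
    hits = 0
    for _ in range(trials):
        pts = rng.random((N, d))
        # Distances from point 0 to all others
        dist0 = np.linalg.norm(pts - pts[0], axis=1)
        dist0[0] = np.inf
        j = int(np.argsort(dist0)[n - 1])        # n-th nearest neighbour of 0
        distj = np.linalg.norm(pts - pts[j], axis=1)
        distj[j] = np.inf
        rank_of_0 = int(np.where(np.argsort(distj) == 0)[0][0]) + 1
        hits += (rank_of_0 == m)
    return hits / trials

# e.g. the classic "mutual nearest neighbour" probability in d = 2
print(estimate_cox_probability(d=2, N=50, m=1, n=1))
```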

  7. Outlook for the use of microsecond plasma opening switches to generate high-power nanosecond current pulses

    International Nuclear Information System (INIS)

    Dolgachev, G.I.; Maslennikov, D.D.; Ushakov, A.G.

    2006-01-01

    The paper deals with the phenomenon of current breaking in the conducting plasma volume of plasma opening switches with a nanosecond time of energy initiation, and with their application in high-power generators. The conditions were determined for ensuring megavolt voltages in the erosion mode, making use of an externally applied magnetic field to provide magnetic insulation of the gap of the plasma opening switch. The peculiar features of applying plasma opening switches at voltages of 5-6 MV to produce X-ray and gamma-radiation pulses were studied.

  8. A sustainable business model for Open-Access journal publishing: a proposed plan for High-Energy Physics

    Directory of Open Access Journals (Sweden)

    Jens Vigen

    2008-01-01

    The High Energy Physics community over the last 15 years has achieved so-called full green Open Access through the wide dissemination of preprints via arXiv, a central subject repository managed by Cornell University. However, green Open Access does not alleviate the economic difficulties of libraries as they are still expected to offer access to versions of record of the peer-reviewed literature. For this reason the particle physics community is now addressing the issue of gold Open Access by converting a set of the existing core journals to Open Access. A Working Party has been established to bring together funding agencies, laboratories and libraries into a single consortium, called SCOAP3 (Sponsoring Consortium for Open Access Publishing in Particle Physics). This consortium will engage with publishers to build a sustainable model for Open Access publishing. In this model, subscription fees from multiple institutions are replaced by contracts with publishers of Open Access journals, where the SCOAP3 consortium is a single financial partner.

  9. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  10. Conditional probability of intense rainfall producing high ground concentrations from radioactive plumes

    International Nuclear Information System (INIS)

    Wayland, J.R.

    1977-03-01

    The overlap of the expanding plume of radioactive material from a hypothetical nuclear accident with rainstorms over dense population areas is considered. The conditional probability of the occurrence of hot spots from intense cellular rainfall is presented

  11. Slope stability probability classification, Waikato Coal Measures, New Zealand

    Energy Technology Data Exchange (ETDEWEB)

    Lindsay, P.; Gillard, G.R.; Moore, T.A. [CRL Energy, PO Box 29-415, Christchurch (New Zealand); Campbell, R.N.; Fergusson, D.A. [Solid Energy North, Private Bag 502, Huntly (New Zealand)

    2001-01-01

    Ferm classified lithological units have been identified and described in the Waikato Coal Measures in open pits in the Waikato coal region. These lithological units have been classified geotechnically by mechanical tests and discontinuity measurements. Using these measurements slope stability probability classifications (SSPC) have been quantified based on an adaptation of Hack's [Slope Stability Probability Classification, ITC Delft Publication, Enschede, Netherlands, vol. 43, 1998, 273 pp.] SSPC system, which places less influence on rock quality designation and unconfined compressive strength than previous slope/rock mass rating systems. The Hack weathering susceptibility rating has been modified by using chemical index of alteration values determined from XRF major element analyses. Slaking is an important parameter in slope stability in the Waikato Coal Measures lithologies and hence, a non-subjective method of assessing slaking in relation to the chemical index of alteration has been introduced. Another major component of this adapted SSPC system is the inclusion of rock moisture content effects on slope stability. The main modifications of Hack's SSPC system are the introduction of rock intact strength derived from the modified Mohr-Coulomb failure criterion, which has been adapted for varying moisture content, weathering state and confining pressure. It is suggested that the subjectivity in assessing intact rock strength within broad bands in the initial SSPC system is a major weakness of the initial system. Initial results indicate a close relationship between rock mass strength values, calculated from rock mass friction angles and rock mass cohesion values derived from two established rock mass classification methods (modified Hoek-Brown failure criteria and MRMR) and the adapted SSPC system. The advantage of the modified SSPC system is that slope stability probabilities based on discontinuity-independent and discontinuity-dependent data and a

  12. On possibility of agreement of quantum mechanics with classical probability theory

    International Nuclear Information System (INIS)

    Slavnov, D.A.

    2006-01-01

    The paper describes a scheme for constructing quantum mechanics in which the quantum system is assumed to be a pattern of open classical subsystems. This makes it possible to use both formal classical logic and classical probability theory in quantum mechanics. On the other hand, within this approach one achieves a complete reconstruction of the standard mathematical tools of quantum mechanics while specifying their limits of application. The problem of quantum state reduction is scrutinized.

  13. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  14. Helium-air exchange flows through partitioned opening and two-opening

    International Nuclear Information System (INIS)

    Kang, T. I.

    1997-01-01

    This paper describes experimental investigations of helium-air exchange flows through a partitioned opening and a two-opening system. Such exchange flows may occur following a rupture accident of a standpipe in a high-temperature engineering test reactor. A test vessel with the two types of small opening on top of a test cylinder is used for the experiments. An estimation method based on mass increment is developed to measure the exchange flow rate. The upward flow of helium and the downward flow of air in the partitioned opening system interact outside the entrance and exit of the opening. Therefore, an experiment with the two-opening system is made to investigate the effect of this fluid interaction in the partitioned opening system. Comparison of the exchange flow rates between the two types of opening system demonstrates that the exchange flow rate of the two-opening system is larger than that of the partitioned opening system because of the absence of the fluid interaction effect. (author)

  15. Image Harvest: an open-source platform for high-throughput plant image processing and analysis

    Science.gov (United States)

    Knecht, Avi C.; Campbell, Malachy T.; Caprez, Adam; Swanson, David R.; Walia, Harkamal

    2016-01-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable for processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. PMID:27141917

  16. The open-circuit voltage in microcrystalline silicon solar cells of different degrees of crystallinity

    International Nuclear Information System (INIS)

    Nath, Madhumita; Roca i Cabarrocas, P.; Johnson, E.V.; Abramov, A.; Chatterjee, P.

    2008-01-01

    We have used a detailed electrical-optical computer model (ASDMP) in conjunction with the experimental characterization of microcrystalline silicon thin-film solar cells of different degrees of crystallinity (but having identical P- and N-layers) to understand the observed decrease of the open-circuit voltage with increasing crystalline fraction. In order to model all aspects of the experimental current density-voltage and quantum efficiency characteristics of cells having low (∼ 75%) and high (over 90%) crystalline fraction, we had to assume both a higher mobility gap defect density and a lower band gap for the more crystallized material. The former fact is widely known to bring down the open-circuit voltage. Our calculations also reveal that the proximity of the quasi-Fermi levels to the energy bands in the cell based on highly crystallized (and assumed to have a lower band gap) microcrystalline silicon results in higher free and trapped carrier densities in this device. The trapped hole population is particularly high at and close to the P/I interface on account of the higher inherent defect density in this region and the fact that the hole quasi-Fermi level is close to the valence band edge here. This fact results in a strong interface field, a collapse of the field in the volume, and hence a lower open-circuit voltage. Thus a combination of higher mobility gap defects and a lower band gap is probably the reason for the lower open-circuit voltage in cells based on highly crystallized microcrystalline silicon.

  17. Slope stability probability classification, Waikato Coal Measures, New Zealand

    Energy Technology Data Exchange (ETDEWEB)

    Lindsay, P.; Campbell, R.; Fergusson, D.A.; Ferm, J.C.; Gillard, G.R.; Moore, T.A. [CRL Energy Ltd., Christchurch (New Zealand)

    1999-07-01

    Ferm classified lithological units have been identified and described in the Waikato Coal Measures in open pits in the Waikato coal region. These lithological units have been classified geotechnically with mechanical tests and discontinuity measurements. Using these measurements, slope stability probability classifications (SSPC) have been quantified based on an adaptation of Hack's SSPC system, which places less influence on rock quality designation and unconfined compressive strength than previous rock mass rating systems. An attempt has been made to modify the Hack weathering susceptibility rating by using chemical index of alteration values from XRF major element analysis. Another major component of this adapted SSPC system is the inclusion of rock moisture content effects on slope stability. The paper explains the systematic initial approach of using the adapted SSPC system to classify slope stability in the Waikato open pit coal mines. The XRF major element results obtained for lithologies in the Waikato coal region may be a useful mine management tool to quantify stratigraphic thickness and palaeoweathering from wash drill cuttings. 14 refs., 7 figs., 3 tabs.
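
    The chemical index of alteration invoked in both Waikato studies is computed from XRF major-element oxides; its standard definition (stated here as background, with molecular proportions and CaO* denoting silicate-bound calcium only) is:

```latex
\mathrm{CIA} = 100 \times \frac{\mathrm{Al_2O_3}}{\mathrm{Al_2O_3} + \mathrm{CaO^{*}} + \mathrm{Na_2O} + \mathrm{K_2O}}
```

    Fresh feldspar-rich rock sits near 50 on this scale, while intensely weathered, clay-rich material approaches 100, which is what makes the index a non-subjective proxy for weathering susceptibility and slaking behaviour.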

  18. A brief introduction to probability.

    Science.gov (United States)

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: back in the 1600s, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution" that is a function providing the probabilities of occurrence of different possible outcomes of a categorical or continuous variable. Specific attention will be focused on normal distribution that is the most relevant distribution applied to statistical analysis.
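
    The notion of a probability distribution reviewed here is easy to make concrete in code; a small sketch (illustrative, not from the paper) evaluates one discrete and one continuous distribution with scipy:

```python
from scipy.stats import binom, norm

# Discrete example: probability of exactly 7 heads in 10 fair coin tosses
print(binom.pmf(7, n=10, p=0.5))      # ~0.117

# Continuous example: probability that a standard normal variable
# falls within one standard deviation of the mean
print(norm.cdf(1) - norm.cdf(-1))     # ~0.683
```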

  19. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...
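
    The simplest instance of this construction is the multinomial logit model: taking the log-sum-exp of the systematic utilities as CPGF, its gradient is exactly the softmax vector of choice probabilities. A minimal numpy check of that property (an illustration, not code from the paper):

```python
import numpy as np

def cpgf(v):
    """Log-sum-exp: the choice-probability generating function of MNL."""
    return np.log(np.sum(np.exp(v)))

def choice_probabilities(v):
    """Gradient of the CPGF = softmax = logit choice probabilities."""
    e = np.exp(v - v.max())          # shift for numerical stability
    return e / e.sum()

v = np.array([1.0, 0.5, -0.2])       # systematic utilities of 3 alternatives
p = choice_probabilities(v)
print(p, p.sum())                    # probabilities summing to 1

# Numerical check that p is indeed the gradient of the CPGF
eps = 1e-6
grad = np.array([(cpgf(v + eps * np.eye(3)[k]) - cpgf(v)) / eps
                 for k in range(3)])
print(np.allclose(grad, p, atol=1e-4))
```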

  20. Probability of satellite collision

    Science.gov (United States)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.

  1. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  2. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  3. The results of high tibial open wedge osteotomy in patients with varus deformity

    Directory of Open Access Journals (Sweden)

    Mahmood Jabalameli

    2013-07-01

    Background: High tibial open wedge osteotomy is one of the most important modalities for the treatment of varus deformity, correcting the deformity and improving the signs and symptoms of patients with primary degenerative osteoarthritis. The aim of this study was to investigate the results of high tibial open wedge osteotomy in patients with varus deformities. Methods: This retrospective study was conducted on twenty-nine patients (36 knees) who underwent proximal tibial osteotomy at Shafa Yahyaian University Hospital from 2004 to 2010. Inclusion criteria were: age less than 60 years, high physical activity, varus deformity and involvement of the medial compartment of the knee. Patients with obesity, smoking, patellofemoral pain, lateral compartment lesions, a deformity of more than 20 degrees, extension limitation or range of motion less than 90 degrees were excluded. Clinical and radiologic characteristics were measured before and after the operation. Results: Fourteen patients were female. All patients were younger than 50 years, with a mean (±SD) of 27.64 (±10.88). The mean (±SD) follow-up time was 4.33 (±1.7). All the patients were satisfied with the results of the operation. Tenderness and pain decreased in all of them. Autologous bone graft was used in all patients; in 15 cases (42.5%) casting, and in the rest a T-buttress plate, was used for fixation. In both the primary and double varus groups the International Knee Documentation Committee (IKDC) and modified Larson indices improved after the operation, but there was no significant difference between the two groups. Conclusion: High tibial open wedge osteotomy can give satisfactory results for the clinical signs and symptoms of patients with primary medial compartment degenerative osteoarthritis. This procedure may also correct the deformity and improve the radiologic parameters of the patients.

  4. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.
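
    The experiment-first approach described here is easy to mirror in code: estimate a probability by running the experiment many times and comparing the empirical frequency with the formula-based answer. A small illustrative sketch:

```python
import random

# Empirical probability that two dice sum to 7, vs the exact value 6/36
trials = 100_000
hits = sum(random.randint(1, 6) + random.randint(1, 6) == 7
           for _ in range(trials))
print(f"empirical: {hits / trials:.4f}   exact: {6 / 36:.4f}")
```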

  5. Subsequent investigation and management of patients with intermediate-category and -probability ventilation-perfusion scintigraphy

    International Nuclear Information System (INIS)

    Walsh, G.; Jones, D.N.

    2000-01-01

    The authors wished to determine the proportion of patients with intermediate-category and intermediate-probability ventilation-perfusion scintigraphy (IVQS) who proceed to further imaging for investigation of thromboembolism, to identify the defining clinical parameters and to determine the proportion of patients who have a definite imaging diagnosis of thromboembolism prior to discharge from hospital on anticoagulation therapy. One hundred and twelve VQS studies performed at the Flinders Medical Centre over a 9-month period were reported as having intermediate category and probability for pulmonary embolism. Medical case notes were available for review in 99 of these patients and from these the pretest clinical probability, subsequent patient progress and treatment were recorded. Eight cases were excluded because they were already receiving anticoagulation therapy. In the remaining 91 patients the pretest clinical probability was considered to be low in 25; intermediate in 30; and high in 36 cases. In total, 51.6% (n = 47) of these patients (8% (n = 2) with low, 66% (n = 20) with intermediate, and 69.4% (n = 25) with high pretest probability) proceeded to CT pulmonary angiography (CTPA) and/or lower limb duplex Doppler ultrasound (DUS) evaluation. Of the patients with IVQS results, 30.7% (n = 28) were evaluated with CTPA. No patient with a low, all patients with a high and 46% of patients with an intermediate pretest probability initially received anticoagulation therapy. This was discontinued in three patients with high and in 12 patients with intermediate clinical probability prior to discharge from hospital. Overall, 40% of patients discharged on anticoagulation therapy (including 39% of those with a high pretest probability) had a positive imaging diagnosis of thromboembolism. The results suggest that, although the majority of patients with intermediate-to-high pretest probability and IVQS proceed to further imaging investigation, CTPA is relatively underused in

  6. Probability of misclassifying biological elements in surface waters.

    Science.gov (United States)

    Loga, Małgorzata; Wierzchołowska-Dziedzic, Anna

    2017-11-24

    Measurement uncertainties are inherent to assessment of biological indices of water bodies. The effect of these uncertainties on the probability of misclassification of ecological status is the subject of this paper. Four Monte-Carlo (M-C) models were applied to simulate the occurrence of random errors in the measurements of metrics corresponding to four biological elements of surface waters: macrophytes, phytoplankton, phytobenthos, and benthic macroinvertebrates. Long series of error-prone measurement values of these metrics, generated by M-C models, were used to identify cases in which values of any of the four biological indices lay outside of the "true" water body class, i.e., outside the class assigned from the actual physical measurements. Fraction of such cases in the M-C generated series was used to estimate the probability of misclassification. The method is particularly useful for estimating the probability of misclassification of the ecological status of surface water bodies in the case of short sequences of measurements of biological indices. The results of the Monte-Carlo simulations show a relatively high sensitivity of this probability to measurement errors of the river macrophyte index (MIR) and high robustness to measurement errors of the benthic macroinvertebrate index (MMI). The proposed method of using Monte-Carlo models to estimate the probability of misclassification has significant potential for assessing the uncertainty of water body status reported to the EC by the EU member countries according to WFD. The method can be readily applied also in risk assessment of water management decisions before adopting the status dependent corrective actions.
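
    The Monte-Carlo procedure sketched in this abstract amounts to: perturb the measured index with random error, reclassify, and count how often the class changes. A toy version with invented class boundaries and error level (not the paper's actual MIR/MMI parameters):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical class boundaries of a 0-1 biological index (5 status classes)
bounds = [0.2, 0.4, 0.6, 0.8]

def classify(x):
    return sum(x > b for b in bounds)   # class 0..4

def misclassification_probability(true_value, sigma, n_measurements=4,
                                  n_sim=100_000):
    """Fraction of M-C runs in which the mean of error-prone measurements
    falls outside the class of the true index value."""
    true_class = classify(true_value)
    means = true_value + rng.normal(0, sigma,
                                    (n_sim, n_measurements)).mean(axis=1)
    return np.mean([classify(m) != true_class for m in means])

# Index close to a class boundary -> high misclassification risk
print(misclassification_probability(0.58, sigma=0.05))
# Index in the middle of a class -> low risk
print(misclassification_probability(0.50, sigma=0.05))
```

    The contrast between the two printed values illustrates why short measurement sequences near a class boundary dominate the reported misclassification risk.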

  7. The Influence of Phonotactic Probability on Word Recognition in Toddlers

    Science.gov (United States)

    MacRoy-Higgins, Michelle; Shafer, Valerie L.; Schwartz, Richard G.; Marton, Klara

    2014-01-01

    This study examined the influence of phonotactic probability on word recognition in English-speaking toddlers. Typically developing toddlers completed a preferential looking paradigm using familiar words, which consisted of either high or low phonotactic probability sound sequences. The participants' looking behavior was recorded in response to…

  8. Open Access @ DTU

    DEFF Research Database (Denmark)

    Ekstrøm, Jeannette

    Open Access is high on the agenda in Denmark and internationally. Denmark has announced a national strategy for Open Access that aims to achieve Open Access to 80% of peer-reviewed research articles in 2017 and 100% in 2022. All public Danish funders, as well as H2020, require that all peer-reviewed articles that are an outcome of their funding be Open Access. Uploading your full texts (your final author manuscript after review) to DTU Orbit is a fundamental part of providing Open Access to your research. We are here to answer all your questions with regards to Open Access and related topics such as copyright, DTU Orbit, Open Access journals, APCs, Vouchers etc.

  9. Deriving Animal Behaviour from High-Frequency GPS: Tracking Cows in Open and Forested Habitat.

    Science.gov (United States)

    de Weerd, Nelleke; van Langevelde, Frank; van Oeveren, Herman; Nolet, Bart A; Kölzsch, Andrea; Prins, Herbert H T; de Boer, W Fred

    2015-01-01

    The increasing spatiotemporal accuracy of Global Navigation Satellite Systems (GNSS) tracking systems opens the possibility to infer animal behaviour from tracking data. We studied the relationship between high-frequency GNSS data and behaviour, aimed at developing an easily interpretable classification method to infer behaviour from location data. Behavioural observations were carried out during tracking of cows (Bos taurus) fitted with high-frequency GPS (Global Positioning System) receivers. Data were obtained in an open field and forested area, and movement metrics were calculated for 1 min, 12 s and 2 s intervals. We observed four behaviour types (Foraging, Lying, Standing and Walking). We subsequently used Classification and Regression Trees to classify the simultaneously obtained GPS data as these behaviour types, based on distances and turning angles between fixes. GPS data with a 1 min interval from the open field was classified correctly for more than 70% of the samples. Data from the 12 s and 2 s interval could not be classified successfully, emphasizing that the interval should be long enough for the behaviour to be defined by its characteristic movement metrics. Data obtained in the forested area were classified with a lower accuracy (57%) than the data from the open field, due to a larger positional error of GPS locations and differences in behavioural performance influenced by the habitat type. This demonstrates the importance of understanding the relationship between behaviour and movement metrics, derived from GNSS fixes at different frequencies and in different habitats, in order to successfully infer behaviour. When spatially accurate location data can be obtained, behaviour can be inferred from high-frequency GNSS fixes by calculating simple movement metrics and using easily interpretable decision trees. This allows for the combined study of animal behaviour and habitat use based on location data, and might make it possible to detect deviations
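
    The classification step described here, mapping movement metrics from consecutive GPS fixes to behaviour classes with a classification tree, can be sketched as follows; the synthetic step-length and turning-angle data stand in for labelled observations and are not the study's data:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(2)

def synth(behaviour, n):
    """Invented step-length (m) and turning-angle (deg) distributions
    per behaviour, standing in for labelled 1-min GPS intervals."""
    step_mu = {"Lying": 0.1, "Standing": 0.3, "Foraging": 2.0, "Walking": 15.0}
    turn_sd = {"Lying": 90, "Standing": 90, "Foraging": 70, "Walking": 25}
    steps = rng.exponential(step_mu[behaviour], n)
    turns = np.abs(rng.normal(0, turn_sd[behaviour], n)) % 180
    return np.column_stack([steps, turns]), [behaviour] * n

X, y = zip(*[synth(b, 500) for b in ["Lying", "Standing", "Foraging", "Walking"]])
X, y = np.vstack(X), np.concatenate(y)

# A shallow tree keeps the classifier easy to interpret, matching the
# paper's stated aim of an easily interpretable method
clf = DecisionTreeClassifier(max_depth=3).fit(X, y)
print(clf.score(X, y))                 # training accuracy
print(clf.predict([[12.0, 10.0]]))     # long steps, small turns -> Walking
```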

  11. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  12. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  13. Google Classroom and Open Clusters: An Authentic Science Research Project for High School Students

    Science.gov (United States)

    Johnson, Chelen H.; Linahan, Marcella; Cuba, Allison Frances; Dickmann, Samantha Rose; Hogan, Eleanor B.; Karos, Demetra N.; Kozikowski, Kendall G.; Kozikowski, Lauren Paige; Nelson, Samantha Brooks; O'Hara, Kevin Thomas; Ropinski, Brandi Lucia; Scarpa, Gabriella; Garmany, Catharine D.

    2016-01-01

    STEM education is about offering unique opportunities to our students. For the past three years, students from two high schools (Breck School in Minneapolis, MN, and Carmel Catholic High School in Mundelein, IL) have collaborated on authentic astronomy research projects. This past year they surveyed archival data of open clusters to determine whether a clear turnoff point could be unequivocally identified. The age of and distance to each open cluster were calculated. Additionally, students requested time on several telescopes to obtain original data to compare to the archival data. Students from each school worked in collaborative teams, sharing and verifying results through regular online hangouts and chats. Work papers were stored in a shared drive and on a student-designed Google site to facilitate dissemination of documents between the two schools.
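
    For the distance estimates mentioned above, the standard tool is the distance modulus relating apparent magnitude m, absolute magnitude M and distance d (stated here as general background rather than as the students' specific procedure):

```latex
m - M = 5\log_{10}\!\left(\frac{d}{10\,\mathrm{pc}}\right)
\quad\Longrightarrow\quad
d = 10^{\,(m - M + 5)/5}\ \mathrm{pc}
```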

  14. Knotting probabilities after a local strand passage in unknotted self-avoiding polygons

    International Nuclear Information System (INIS)

    Szafron, M L; Soteros, C E

    2011-01-01

    We investigate, both theoretically and numerically, the knotting probabilities after a local strand passage is performed in an unknotted self-avoiding polygon (SAP) on the simple cubic lattice. In the polygons studied, it is assumed that two polygon segments have already been brought close together for the purpose of performing a strand passage. This restricts the polygons considered to those that contain a specific pattern called Θ at a fixed location; an unknotted polygon containing Θ is called a Θ-SAP. It is proved that the number of n-edge Θ-SAPs grows exponentially (with n) at the same rate as the total number of n-edge unknotted SAPs (those with no prespecified strand passage structure). Furthermore, it is proved that the same holds for subsets of n-edge Θ-SAPs that yield a specific after-strand-passage knot-type. Thus, the probability of a given after-strand-passage knot-type does not grow (or decay) exponentially with n. Instead, it is conjectured that these after-strand-passage knot probabilities approach, as n goes to infinity, knot-type dependent amplitude ratios lying strictly between 0 and 1. This conjecture is supported by numerical evidence from Monte Carlo data generated using a composite (aka multiple) Markov chain Monte Carlo BFACF algorithm developed to study Θ-SAPs. A new maximum likelihood method is used to estimate the critical exponents relevant to this conjecture. We also obtain strong numerical evidence that the after-strand-passage knotting probability depends on the local structure around the strand-passage site. If the local structure and the crossing sign at the strand-passage site are considered, then we observe that the more 'compact' the local structure, the less likely the after-strand-passage polygon is to be knotted. This trend for compactness versus knotting probability is consistent with results obtained for other strand-passage models; however, we are the first to note the influence of the crossing-sign information. We

  15. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  16. High pressure does not counterbalance the advantages of open techniques over closed techniques during heated intraperitoneal chemotherapy with oxaliplatin.

    Science.gov (United States)

    Facy, Olivier; Combier, Christophe; Poussier, Matthieu; Magnin, Guy; Ladoire, Sylvain; Ghiringhelli, François; Chauffert, B; Rat, Patrick; Ortega-Deballon, Pablo

    2015-01-01

    Heated intraperitoneal chemotherapy (HIPEC) treats residual microscopic disease after cytoreductive surgery. In experimental models, the open HIPEC technique has shown a higher and more homogeneous concentration of platinum in the peritoneum than achieved using the closed technique. A 25-cm H2O pressure enhances the penetration of oxaliplatin. Because pressure is easier to set up with the closed technique, high pressure may counterbalance the drawbacks of this technique versus open HIPEC, and a higher pressure may induce a higher penetration. Because higher concentration does not mean deeper penetration, a study of tissues beneath the peritoneum is required. Finally, achieving a deeper penetration (and a higher concentration) raises the question of the passage of drugs through the surgical glove and the surgeon's safety. Four groups of pigs underwent HIPEC with oxaliplatin (150 mg/L) for 30 minutes: open at isobaric pressure, open at 25 cm H2O, and closed at 25 and 40 cm H2O. Systemic absorption and a peritoneal mapping of the platinum concentration were analyzed, as were the concentrations in the retroperitoneal tissue and the surgical gloves. Blood concentrations were higher in the open groups. In the parietal surfaces, the concentrations were not different between the isobaric and the closed groups (47.08, 56.39, and 48.57 mg/kg, respectively), but were higher in the open high-pressure group (85.93 mg/kg). In the visceral surfaces, they were lower in the closed groups (3.2 and 3.05 mg/kg) than in the open groups (7.03 and 9.56 mg/kg). Platinum concentrations were similar in the deep retroperitoneal tissue when compared between isobaric and high-pressure procedures. No platinum was detected on the internal aspect of the gloves. The use of high pressure during HIPEC does not counterbalance the drawbacks of closed techniques. The tissue concentration of oxaliplatin achieved with the open techniques is higher, even if high pressure is applied during a closed technique.

  17. Scalable High Performance Message Passing over InfiniBand for Open MPI

    Energy Technology Data Exchange (ETDEWEB)

    Friedley, A; Hoefler, T; Leininger, M L; Lumsdaine, A

    2007-10-24

    InfiniBand (IB) is a popular network technology for modern high-performance computing systems. MPI implementations traditionally support IB using a reliable, connection-oriented (RC) transport. However, per-process resource usage that grows linearly with the number of processes makes this approach prohibitive for large-scale systems. IB provides an alternative in the form of a connectionless unreliable datagram transport (UD), which allows for near-constant resource usage and initialization overhead as the process count increases. This paper describes a UD-based implementation for IB in Open MPI as a scalable alternative to existing RC-based schemes. We use the software reliability capabilities of Open MPI to provide the guaranteed delivery semantics required by MPI. Results show that UD not only requires fewer resources at scale, but also allows for shorter MPI startup times. A connectionless model also improves performance for applications that tend to send small messages to many different processes.

  18. Open Access Publishing in High-Energy Physics the SCOAP$^{3}$ model

    CERN Document Server

    Mele, S

    2009-01-01

    The Open Access (OA) movement is gaining an increasing momentum: its goal is to grant anyone, anywhere and anytime free access to the results of publicly funded scientific research. The High- Energy Physics (HEP) community has pioneered OA for decades, through its widespread “pre-print culture”. After almost half a century of worldwide dissemination of pre-prints, in paper first and electronically later, OA journals are becoming the natural evolution of scholarly communication in HEP. Among other OA business models, the one based on a sponsoring consortium appears as the most viable option for a transition of the HEP peer-reviewed literature to OA. The Sponsoring Consortium for Open Access Publishing in Particle Physics (SCOAP3) is proposed as a central body to remunerate publishers for their peer-review service, effectively replacing the “reader-pays” model of traditional subscriptions with an “author-side” funding, without any direct financial burden on individual authors and research groups. Su...

  19. CellProfiler and KNIME: open source tools for high content screening.

    Science.gov (United States)

    Stöter, Martin; Niederlein, Antje; Barsacchi, Rico; Meyenhofer, Felix; Brandl, Holger; Bickle, Marc

    2013-01-01

    High content screening (HCS) has established itself in the world of the pharmaceutical industry as an essential tool for drug discovery and drug development. HCS is currently starting to enter the academic world and might become a widely used technology. Given the diversity of problems tackled in academic research, HCS could experience some profound changes in the future, mainly with more imaging modalities and smart microscopes being developed. One of the limitations in the establishment of HCS in academia is flexibility and cost. Flexibility is important to be able to adapt the HCS setup to accommodate the multiple different assays typical of academia. Many cost factors cannot be avoided, but the costs of the software packages necessary to analyze large datasets can be reduced by using Open Source software. We present and discuss the Open Source software CellProfiler for image analysis and KNIME for data analysis and data mining that provide software solutions which increase flexibility and keep costs low.

  20. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...

  1. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory and Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think that, in most cases, there is no analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators.
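
    For sums of heavy-tailed variables, the classical "single big jump" asymptotic says P(S_n > x) ~ n P(X_1 > x) as x grows. The sketch below compares that approximation with a crude Monte Carlo estimate for Pareto summands; it illustrates the flavour of the estimators discussed, not the dissertation's own constructions:

```python
import numpy as np

rng = np.random.default_rng(3)

alpha, n = 1.5, 5            # Pareto tail index, number of summands
x = 50.0                     # tail threshold

# Crude Monte Carlo estimate of P(X1 + ... + Xn > x);
# numpy's pareto() is shifted, so +1 gives classical Pareto with x_m = 1
samples = rng.pareto(alpha, size=(1_000_000, n)) + 1.0
p_mc = np.mean(samples.sum(axis=1) > x)

# Single-big-jump asymptotic: n * P(X1 > x), with P(X1 > x) = x**(-alpha)
p_asym = n * x ** (-alpha)

print(f"Monte Carlo : {p_mc:.5f}")
print(f"Asymptotic  : {p_asym:.5f}")
```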

  2. High-efficiency one-dimensional atom localization via two parallel standing-wave fields

    International Nuclear Information System (INIS)

    Wang, Zhiping; Wu, Xuqiang; Lu, Liang; Yu, Benli

    2014-01-01

    We present a new scheme of high-efficiency one-dimensional (1D) atom localization via measurement of upper state population or the probe absorption in a four-level N-type atomic system. By applying two classical standing-wave fields, the localization peak position and number, as well as the conditional position probability, can be easily controlled by the system parameters, and the sub-half-wavelength atom localization is also observed. More importantly, there is 100% detecting probability of the atom in the subwavelength domain when the corresponding conditions are satisfied. The proposed scheme may open up a promising way to achieve high-precision and high-efficiency 1D atom localization. (paper)

  3. Scale-invariant transition probabilities in free word association trajectories

    Directory of Open Access Journals (Sweden)

    Martin Elias Costa

    2009-09-01

    Free-word association has been used as a vehicle to understand the organization of human thoughts. The original studies relied mainly on qualitative assertions, yielding the widely intuitive notion that trajectories of word associations are structured, yet considerably more random than organized linguistic text. Here we set out to determine a precise characterization of this space, generating a large number of word association trajectories in a web-implemented game. We embedded the trajectories in the graph of word co-occurrences from a linguistic corpus. To constrain possible transport models we measured the memory loss and the cycling probability. These two measures could not be reconciled by a bounded diffusive model, since the cycling probability was very high (16% of order-2 cycles), implying a majority of short-range associations, whereas the memory loss was very rapid (converging to the asymptotic value in ∼7 steps), which, in turn, forced a high fraction of long-range associations. We show that memory loss and cycling probabilities of free word association trajectories can be simultaneously accounted for by a model in which transitions are determined by a scale-invariant probability distribution.

  4. Exact joint density-current probability function for the asymmetric exclusion process.

    Science.gov (United States)

    Depken, Martin; Stinchcombe, Robin

    2004-07-23

    We study the asymmetric simple exclusion process with open boundaries and derive the exact form of the joint probability function for the occupation number and the current through the system. We further consider the thermodynamic limit, showing that the resulting distribution is non-Gaussian and that the density fluctuations have a discontinuity at the continuous phase transition, while the current fluctuations are continuous. The derivations are performed by using the standard operator algebraic approach and by the introduction of new operators satisfying a modified version of the original algebra. Copyright 2004 The American Physical Society
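
    A quick way to build intuition for the joint density-current statistics is to simulate the totally asymmetric version (TASEP) with open boundaries and record, per window of sweeps, the mean occupation and the particle flux across a bond. The rates and sizes below are invented for the illustration, and this random-sequential simulation is unrelated to the paper's exact operator-algebra derivation:

```python
import numpy as np

rng = np.random.default_rng(4)

L, alpha, beta = 50, 0.3, 0.7   # sites, entry rate, exit rate
site = np.zeros(L, dtype=int)

def sweep(site):
    """One random-sequential update sweep; returns the number of particles
    moved across the middle bond (contribution to the current)."""
    crossed = 0
    for _ in range(L + 1):
        k = rng.integers(-1, L)              # -1 codes the entry move
        if k == -1:
            if site[0] == 0 and rng.random() < alpha:
                site[0] = 1
        elif k == L - 1:
            if site[-1] == 1 and rng.random() < beta:
                site[-1] = 0
        elif site[k] == 1 and site[k + 1] == 0:
            site[k], site[k + 1] = 0, 1
            if k == L // 2:
                crossed += 1
    return crossed

for _ in range(2000):                        # burn-in to the steady state
    sweep(site)

samples = []                                 # joint (density, current) samples
for _ in range(500):
    J = sum(sweep(site) for _ in range(50))
    samples.append((site.mean(), J / 50))

rho, j = np.mean(samples, axis=0)
print(f"mean density ~ {rho:.3f}, mean current ~ {j:.3f}")
```

    With alpha < beta and alpha < 1/2 as above, the low-density phase prediction rho = alpha and J = alpha(1 - alpha) gives a quick sanity check of the sampled joint statistics.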

  5. Automated measurement of spatial preference in the open field test with transmitted lighting.

    Science.gov (United States)

    Kulikov, Alexander V; Tikhonova, Maria A; Kulikov, Victor A

    2008-05-30

    A new modification of the open field test was designed to improve automation of the test. The main innovations were: (1) transmitted lighting and (2) estimation of the probability of finding pixels associated with an animal in a selected region of the arena as an objective index of spatial preference. Transmitted (inverted) lighting significantly improved the contrast between an animal and the arena and made it possible to track white animals as effectively as colored ones. Probability as a measure of preference for a selected region was mathematically proved and experimentally verified. A good correlation between probability and classic indices of spatial preference (number of region entries and time spent therein) was shown. The algorithm for calculating the probability of finding pixels associated with an animal in the selected region was implemented in the EthoStudio software. Significant interstrain differences in locomotion and in central zone preference (an index of anxiety) were shown using the inverted lighting and the EthoStudio software in mice of six inbred strains. The effects of arena shape (circle or square) and of the presence of a novel object in the center of the arena on open field behavior in mice were studied.
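
    The preference index described here reduces to the fraction of animal-associated pixels that fall inside a chosen region of the arena, accumulated over frames. A toy numpy version with an invented binary mask (not the EthoStudio implementation):

```python
import numpy as np

rng = np.random.default_rng(5)

H = W = 200                         # arena image size in pixels
cy, cx, r = H // 2, W // 2, 60      # circular "central zone"

yy, xx = np.mgrid[0:H, 0:W]
central = (yy - cy) ** 2 + (xx - cx) ** 2 <= r ** 2

def preference(frames_mask, region):
    """Probability that an animal-associated pixel lies in the region:
    animal pixels inside region / all animal pixels, over all frames."""
    in_region = sum(int((m & region).sum()) for m in frames_mask)
    total = sum(int(m.sum()) for m in frames_mask)
    return in_region / total

# Fake "animal" masks: a 10x10 blob at a random position in each frame
frames = []
for _ in range(100):
    m = np.zeros((H, W), dtype=bool)
    y, x = rng.integers(0, H - 10), rng.integers(0, W - 10)
    m[y:y + 10, x:x + 10] = True
    frames.append(m)

print(f"central-zone preference ~ {preference(frames, central):.3f}")
# A value near the zone's area fraction (~0.283) indicates no preference
```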

  6. Unrelated Hematopoietic Stem Cell Donor Matching Probability and Search Algorithm

    Directory of Open Access Journals (Sweden)

    J.-M. Tiercy

    2012-01-01

    In transplantation of hematopoietic stem cells (HSCs) from unrelated donors, a high HLA compatibility level decreases the risk of acute graft-versus-host disease and mortality. The diversity of the HLA system at the allelic and haplotypic level and the heterogeneity of the HLA typing data of registered donors render the search process a complex task. This paper summarizes our experience with a search algorithm that includes, at the start of the search, a probability estimate (high/intermediate/low) of identifying an HLA-A, -B, -C, -DRB1, -DQB1-compatible donor (a 10/10 match). Based on searches performed in 2002–2011, about 30% of patients have a high, 30% an intermediate, and 40% a low probability search. Search success rate and duration are presented and discussed in light of the experience of other centers. Overall, a 9-10/10 matched HSC donor can now be identified for 60–80% of patients of European descent. For high probability searches, donors can be selected on the basis of DPB1 matching with an estimated success rate of >40%. For low probability searches there is no consensus on which HLA incompatibilities are more permissive, although HLA-DQB1 mismatches are generally considered acceptable. Models for the discrimination of more detrimental mismatches based on specific amino acid residues rather than specific HLA alleles are presented.

  7. Jihadist Foreign Fighter Phenomenon in Western Europe: A Low-Probability, High-Impact Threat

    Directory of Open Access Journals (Sweden)

    Edwin Bakker

    2015-11-01

    The phenomenon of foreign fighters in Syria and Iraq is making headlines. Their involvement in the atrocities committed by terrorist groups such as the so-called "Islamic State" and Jabhat al-Nusra has caused grave concern and public outcry in the foreign fighters' European countries of origin. While much has been written about these foreign fighters and the possible threat they pose, the impact of this phenomenon on Western European societies has yet to be documented. This Research Paper explores four particular areas where this impact is most visible: (a) violent incidents associated with (returned) foreign fighters, (b) official and political responses linked to these incidents, (c) public opinion, and (d) anti-Islam reactions linked to these incidents. The authors conclude that the phenomenon of jihadist foreign fighters in European societies should be primarily regarded as a social and political threat, not a physical one. They consider the phenomenon of European jihadist foreign fighters a "low-probability, high-impact" threat.

  8. MANTA--an open-source, high density electrophysiology recording suite for MATLAB.

    Science.gov (United States)

    Englitz, B; David, S V; Sorenson, M D; Shamma, S A

    2013-01-01

    The distributed nature of nervous systems makes it necessary to record from a large number of sites in order to decipher the neural code, whether single cell, local field potential (LFP), micro-electrocorticogram (μECoG), electroencephalographic (EEG), magnetoencephalographic (MEG) or in vitro micro-electrode array (MEA) data are considered. High channel-count recordings also optimize the yield of a preparation and the efficiency of time invested by the researcher. Currently, data acquisition (DAQ) systems with high channel counts (>100) can be purchased from a limited number of companies at considerable prices. These systems are typically closed-source and thus prohibit custom extensions or improvements by end users. We have developed MANTA, an open-source MATLAB-based DAQ system, as an alternative to existing options. MANTA combines high channel counts (up to 1440 channels/PC), usage of analog or digital headstages, and low per-channel cost, and has been in use for over 1 year, recording reliably from 128 channels. It offers a growing list of features, including integrated spike sorting, PSTH and CSD display, and fully customizable electrode array geometry (including 3D arrays), some of which are not available in commercial systems. MANTA runs on a typical PC and communicates via TCP/IP and can thus be easily integrated with existing stimulus generation/control systems in a lab at a fraction of the cost of commercial systems. With modern neuroscience developing rapidly, MANTA provides a flexible platform that can be rapidly adapted to the needs of new analyses and questions. Being open-source, the development of MANTA can outpace commercial solutions in functionality, while maintaining a low price point.

  9. Quantum processes: probability fluxes, transition probabilities in unit time and vacuum vibrations

    International Nuclear Information System (INIS)

    Oleinik, V.P.; Arepjev, Ju D.

    1989-01-01

    Transition probabilities in unit time and probability fluxes are compared in studying elementary quantum processes - the decay of a bound state under the action of time-varying and constant electric fields. It is shown that the difference between these quantities may be considerable, and so the use of transition probabilities W instead of probability fluxes Π in calculating particle fluxes may lead to serious errors. The quantity W represents the rate of change with time of the population of the energy levels, relating partly to the real states and partly to the virtual ones, and it cannot be directly measured in experiment. The vacuum background is shown to be continuously distorted when a perturbation acts on a system. Because of this, the viewpoint of an observer on the physical properties of real particles continuously varies with time. This fact is not taken into consideration in the conventional theory of quantum transitions based on the notion of probability amplitude. As a result, the probability amplitudes lose their physical meaning. All the physical information on the quantum dynamics of a system is contained in the mean values of physical quantities. The existence of considerable differences between the quantities W and Π permits one, in principle, to choose the correct theory of quantum transitions on the basis of experimental data. (author)

  10. Tuned by experience: How orientation probability modulates early perceptual processing.

    Science.gov (United States)

    Jabar, Syaheed B; Filipowicz, Alex; Anderson, Britt

    2017-09-01

    Probable stimuli are more often and more quickly detected. While stimulus probability is known to affect decision-making, it can also be explained as a perceptual phenomenon. Using spatial gratings, we have previously shown that probable orientations are also more precisely estimated, even while participants remained naive to the manipulation. We conducted an electrophysiological study to investigate the effect that probability has on perception and visual-evoked potentials. In line with previous studies on oddballs and stimulus prevalence, low-probability orientations were associated with a greater late positive 'P300' component which might be related to either surprise or decision-making. However, the early 'C1' component, thought to reflect V1 processing, was dampened for high-probability orientations while later P1 and N1 components were unaffected. Exploratory analyses revealed a participant-level correlation between C1 and P300 amplitudes, suggesting a link between perceptual processing and decision-making. We discuss how these probability effects could be indicative of sharpening of neurons preferring the probable orientations, due either to perceptual learning, or to feature-based attention. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Image Harvest: an open-source platform for high-throughput plant image processing and analysis.

    Science.gov (United States)

    Knecht, Avi C; Campbell, Malachy T; Caprez, Adam; Swanson, David R; Walia, Harkamal

    2016-05-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable to processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. © The Author 2016. Published by Oxford University Press on behalf of the Society for Experimental Biology.
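
    The parallel-processing idea - farm each image out to a worker and reduce the results to a table of digital traits - can be sketched in a few lines. The trait (pixel count of a pre-segmented plant mask) and the in-memory stand-in images are hypothetical, not Image Harvest's actual API.

        from multiprocessing import Pool
        import numpy as np

        def plant_area(mask):
            # Hypothetical digital trait: number of "plant" pixels in a binary mask.
            return int(mask.sum())

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            masks = [rng.random((64, 64)) > 0.7 for _ in range(8)]  # stand-in images
            with Pool() as pool:                 # parallel, as on a computing grid
                print(pool.map(plant_area, masks))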

  12. Advancing a holistic approach to openness

    DEFF Research Database (Denmark)

    Søndergaard, Helle Alsted; Araújo, Ana Luiza Lara de

    Open innovation has emerged as a new and interesting research area, and with this paper we wish to contribute to the research on open innovation by proposing a more holistic approach to openness that includes the internal sphere of openness. We use data from 170 Danish SMEs in the high-tech and m…

  13. Cost-effectiveness of laparoscopic versus open distal pancreatectomy for pancreatic cancer.

    Science.gov (United States)

    Gurusamy, Kurinchi Selvan; Riviere, Deniece; van Laarhoven, C J H; Besselink, Marc; Abu-Hilal, Mohammed; Davidson, Brian R; Morris, Steve

    2017-01-01

    A recent Cochrane review compared laparoscopic versus open distal pancreatectomy for people with cancers of the body and tail of the pancreas and found that laparoscopic distal pancreatectomy may reduce the length of hospital stay. We compared the cost-effectiveness of laparoscopic distal pancreatectomy versus open distal pancreatectomy for pancreatic cancer. Model-based cost-utility analysis estimating mean costs and quality-adjusted life years (QALYs) per patient from the perspective of the UK National Health Service. A decision tree model was constructed using probabilities, outcomes and cost data from published sources. A time horizon of 5 years was used. One-way and probabilistic sensitivity analyses were undertaken. The probabilistic sensitivity analysis showed that the incremental net monetary benefit was positive (£3,708.58; 95% confidence interval (CI) -£9,473.62 to £16,115.69), but the 95% CI includes zero, indicating significant uncertainty about the cost-effectiveness of laparoscopic distal pancreatectomy versus open distal pancreatectomy. The probability that laparoscopic distal pancreatectomy was cost-effective compared to open distal pancreatectomy for pancreatic cancer was between 70% and 80% at the willingness-to-pay thresholds generally used in England (£20,000 to £30,000 per QALY gained). Results were sensitive to the survival proportions and the operating time. There is considerable uncertainty about whether laparoscopic distal pancreatectomy is cost-effective compared to open distal pancreatectomy for pancreatic cancer in the NHS setting.
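
    The decision rule behind these figures can be made concrete. With willingness-to-pay threshold λ, the incremental net monetary benefit is INMB = λ·ΔQALY − ΔCost, and the probabilistic sensitivity analysis reports the fraction of Monte Carlo draws with INMB > 0. The distributions below are placeholders, not the paper's model inputs.

        import numpy as np

        rng = np.random.default_rng(2)
        n, wtp = 10_000, 20_000                    # draws; pounds per QALY gained
        d_qaly = rng.normal(0.3, 0.2, n)           # assumed incremental QALYs
        d_cost = rng.normal(2_000, 3_000, n)       # assumed incremental cost, pounds
        inmb = wtp * d_qaly - d_cost
        print("mean INMB:", round(inmb.mean(), 2))
        print("P(cost-effective):", (inmb > 0).mean())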

  14. Developing a Model and Applications for Probabilities of Student Success: A Case Study of Predictive Analytics

    Science.gov (United States)

    Calvert, Carol Elaine

    2014-01-01

    This case study relates to distance learning students on open access courses. It demonstrates the use of predictive analytics to generate a model of the probabilities of success and retention at different points, or milestones, in a student journey. A core set of explanatory variables has been established and their varying relative importance at…

  15. Long term monitoring of window opening behaviour in Danish dwellings

    DEFF Research Database (Denmark)

    Andersen, Rune Vinther; Toftum, Jørn; Olesen, Bjarne W.

    2009-01-01

    ABSTRACT: During the first eight months of 2008, measurements of occupant behaviour and eight environmental variables were carried out in 15 dwellings. Logistic regression was applied to infer the probability of a window being open as a function of the outdoor temperature. The results were compared…
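
    A minimal sketch of the inference step described - a logistic regression of window state on outdoor temperature - using synthetic stand-ins for the monitored data:

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(3)
        t_out = rng.uniform(-5, 25, 500).reshape(-1, 1)        # outdoor temperature, deg C
        p_true = 1 / (1 + np.exp(-(0.25 * t_out[:, 0] - 2)))   # synthetic behaviour
        window_open = rng.random(500) < p_true

        model = LogisticRegression().fit(t_out, window_open)
        print("P(window open) at 20 deg C:", model.predict_proba([[20.0]])[0, 1])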

  16. K-forbidden transition probabilities

    International Nuclear Information System (INIS)

    Saitoh, T.R.; Sletten, G.; Bark, R.A.; Hagemann, G.B.; Herskind, B.; Saitoh-Hashimoto, N.; Tsukuba Univ., Ibaraki

    2000-01-01

    Reduced hindrance factors of K-forbidden transitions are compiled for nuclei with A ≈ 180 where γ-vibrational states are observed. Correlations between these reduced hindrance factors and Coriolis forces, statistical level mixing and γ-softness have been studied. It is demonstrated that the K-forbidden transition probabilities are related to γ-softness. The decay of the high-K bandheads has been studied by means of two-state mixing, which would be induced by the γ-softness, with the use of a number of K-forbidden transitions compiled in the present work, where high-K bandheads are depopulated by both E2 and ΔI=1 transitions. The validity of the two-state mixing scheme has been examined by using the proposed identity of the B(M1)/B(E2) ratios of transitions depopulating high-K bandheads and levels of low-K bands. A breakdown of the identity might indicate that other levels mediate transitions between high- and low-K states. (orig.)

  17. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes' theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  18. Deriving animal behaviour from high-frequency GPS: tracking cows in open and forested habitat

    NARCIS (Netherlands)

    de Weerd, N.; van Langevelde, F.; van Oeveren, H.; Nolet, Bart A.; Kölzsch, Andrea; Prins, H.H.T.; De Boer, W.F.

    2015-01-01

    The increasing spatiotemporal accuracy of Global Navigation Satellite Systems (GNSS) tracking systems opens the possibility to infer animal behaviour from tracking data. We studied the relationship between high-frequency GNSS data and behaviour, aimed at developing an easily interpretable…

  19. Deriving animal behaviour from high-frequency GPS: tracking cows in open and forested habitat

    NARCIS (Netherlands)

    Weerd, de N.; Langevelde, van F.; Oeveren, van H.; Nolet, B.A.; Kölzsch, A.; Prins, H.H.T.; Boer, de W.F.

    2015-01-01

    The increasing spatiotemporal accuracy of Global Navigation Satellite Systems (GNSS) tracking systems opens the possibility to infer animal behaviour from tracking data. We studied the relationship between high-frequency GNSS data and behaviour, aimed at developing an easily interpretable…

  20. Open data innovation capabilities : Towards a framework of how to innovate with open data

    NARCIS (Netherlands)

    Eckartz, S.; Broek, T. van den; Ooms, M.

    2016-01-01

    Innovation based on open data lags behind the high expectations of policy makers. Hence, open data researchers have investigated the barriers of open data publication and adoption. This paper contributes to this literature by taking a capabilities perspective on how successful open data re-users…

  1. Human Error Probability Assessment During Maintenance Activities of Marine Systems

    Directory of Open Access Journals (Sweden)

    Rabiul Islam

    2018-03-01

    Background: Maintenance operations on board ships are highly demanding. Maintenance operations are intensive activities requiring high man–machine interaction in challenging and evolving conditions. The evolving conditions include weather, workplace temperature, ship motion, noise and vibration, and workload and stress. For example, extreme weather conditions affect seafarers' performance, increasing the chances of error and, consequently, can cause injuries or fatalities to personnel. An effective human error probability model is required to better manage maintenance on board ships. The developed model would assist in developing and maintaining effective risk management protocols. Thus, the objective of this study is to develop a human error probability model considering various internal and external factors affecting seafarers' performance. Methods: The human error probability model is developed using probability theory applied to a Bayesian network. The model is tested using data gathered through a questionnaire survey of >200 experienced seafarers with >5 years of experience. The model developed in this study is used to estimate the reliability of human performance on particular maintenance activities. Results: The developed methodology is tested on the maintenance of a marine engine's cooling water pump for the engine department and an anchor windlass for the deck department. In the considered case studies, human error probabilities are estimated in various scenarios and the results are compared between the scenarios and the different seafarer categories. The results of the case studies for both departments are also compared. Conclusion: The developed model is effective in assessing human error probabilities. These probabilities would be dynamically updated as and when new information is available on changes in either internal (i.e., training, experience, and fatigue) or external (i.e., environmental and operational) conditions.
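
    In spirit, the model chains conditional probabilities of error on internal and external performance-shaping factors. A toy version with two binary factors is sketched below; every number is invented for illustration and has no relation to the survey data.

        # Toy Bayesian-network-style calculation: P(error) marginalized over
        # two performance-shaping factors. All probabilities are illustrative.
        p_fatigued = 0.3
        p_bad_weather = 0.2
        p_error = {  # P(error | fatigued, bad_weather)
            (False, False): 0.01,
            (False, True): 0.05,
            (True, False): 0.04,
            (True, True): 0.15,
        }

        total = 0.0
        for fatigued in (False, True):
            for bad_weather in (False, True):
                pf = p_fatigued if fatigued else 1 - p_fatigued
                pw = p_bad_weather if bad_weather else 1 - p_bad_weather
                total += pf * pw * p_error[(fatigued, bad_weather)]

        print(f"marginal human error probability: {total:.4f}")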

  2. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications.
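
    The canonical example is the multinomial logit: G(u) = log Σ_j exp(u_j) is a CPGF, and its gradient is the softmax vector of choice probabilities. The numerical check below is our illustration, not the paper's code.

        import numpy as np

        def G(u):
            return np.log(np.exp(u).sum())        # logit CPGF

        u = np.array([1.0, 0.5, -0.2])
        probs = np.exp(u) / np.exp(u).sum()       # softmax choice probabilities

        eps = 1e-6                                # central-difference gradient of G
        grad = np.array([(G(u + eps * e) - G(u - eps * e)) / (2 * eps)
                         for e in np.eye(3)])
        print(np.allclose(grad, probs))           # True: gradient of CPGF = probabilities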

  3. Exploring Infiniband Hardware Virtualization in OpenNebula towards Efficient High-Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Pais Pitta de Lacerda Ruivo, Tiago [IIT, Chicago; Bernabeu Altayo, Gerard [Fermilab; Garzoglio, Gabriele [Fermilab; Timm, Steven [Fermilab; Kim, Hyun-Woo [Fermilab; Noh, Seo-Young [KISTI, Daejeon; Raicu, Ioan [IIT, Chicago

    2014-11-11

    It has been widely accepted that software virtualization has a big negative impact on high-performance computing (HPC) application performance. This work explores the potential use of Infiniband hardware virtualization in an OpenNebula cloud towards the efficient support of MPI-based workloads. We have implemented, deployed, and tested an Infiniband network on the FermiCloud private Infrastructure-as-a-Service (IaaS) cloud. To avoid software virtualization and thereby minimize the virtualization overhead, we employed a technique called Single Root Input/Output Virtualization (SR-IOV). Our solution spanned modifications to the Linux hypervisor as well as the OpenNebula manager. We evaluated the performance of the hardware virtualization on up to 56 virtual machines connected by up to 8 DDR Infiniband network links, with micro-benchmarks (latency and bandwidth) as well as with an MPI-intensive application (the HPL Linpack benchmark).

  4. Prediction and probability in sciences

    International Nuclear Information System (INIS)

    Klein, E.; Sacquin, Y.

    1998-01-01

    This book reports the 7 presentations made at the third meeting 'physics and fundamental questions' whose theme was probability and prediction. The concept of probability that was invented to apprehend random phenomena has become an important branch of mathematics and its application range spreads from radioactivity to species evolution via cosmology or the management of very weak risks. The notion of probability is the basis of quantum mechanics and then is bound to the very nature of matter. The 7 topics are: - radioactivity and probability, - statistical and quantum fluctuations, - quantum mechanics as a generalized probability theory, - probability and the irrational efficiency of mathematics, - can we foresee the future of the universe?, - chance, eventuality and necessity in biology, - how to manage weak risks? (A.C.)

  5. Entanglement probabilities of polymers: a white noise functional approach

    International Nuclear Information System (INIS)

    Bernido, Christopher C; Carpio-Bernido, M Victoria

    2003-01-01

    The entanglement probabilities for a highly flexible polymer to wind n times around a straight polymer are evaluated using white noise analysis. To introduce the white noise functional approach, the one-dimensional random walk problem is taken as an example. The polymer entanglement scenario, viewed as a random walk on a plane, is then treated and the entanglement probabilities are obtained for a magnetic flux confined along the straight polymer, and for a case where an entangled polymer is subjected to the potential V = ḟ(s)θ. In the absence of the magnetic flux and the potential V, the entanglement probabilities reduce to a result obtained by Wiegel.
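
    The winding probabilities have a simple numerical counterpart: simulate planar random walks around the origin (the straight polymer) and histogram the net winding number. Walk length, step law, and starting point below are illustrative assumptions, not the paper's analytical setup.

        import numpy as np

        rng = np.random.default_rng(4)
        n_walks, n_steps = 2_000, 1_000
        counts = {}

        for _ in range(n_walks):
            steps = rng.normal(0, 1, (n_steps, 2))
            path = np.cumsum(steps, axis=0) + np.array([5.0, 0.0])  # start off-axis
            theta = np.unwrap(np.arctan2(path[:, 1], path[:, 0]))   # continuous angle
            n = int(np.round((theta[-1] - theta[0]) / (2 * np.pi))) # net winding number
            counts[n] = counts.get(n, 0) + 1

        for n in sorted(counts):
            print(f"P(wind {n:+d} times) ~ {counts[n] / n_walks:.3f}")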

  6. Falcon: a highly flexible open-source software for closed-loop neuroscience.

    Science.gov (United States)

    Ciliberti, Davide; Kloosterman, Fabian

    2017-08-01

    Closed-loop experiments provide unique insights into brain dynamics and function. To facilitate a wide range of closed-loop experiments, we created an open-source software platform that enables high-performance real-time processing of streaming experimental data. We wrote Falcon, a multi-threaded C++ software package in which the user can load and execute an arbitrary processing graph. Each node of a Falcon graph is mapped to a single thread and nodes communicate with each other through thread-safe buffers. The framework allows for easy implementation of new processing nodes and data types. Falcon was tested both on a 32-core and a 4-core workstation. Streaming data was read from either a commercial acquisition system (Neuralynx) or the open-source Open Ephys hardware, while closed-loop TTL pulses were generated with a USB module for digital output. We characterized the round-trip latency of our Falcon-based closed-loop system, as well as the specific latency contribution of the software architecture, by testing processing graphs with up to 32 parallel pipelines and eight serial stages. We finally deployed Falcon in a task of real-time detection of population bursts recorded live from the hippocampus of a freely moving rat. On Neuralynx hardware, round-trip latency was well below 1 ms and stable for at least 1 h, while on Open Ephys hardware latencies were below 15 ms. The latency contribution of the software was below 0.5 ms. Round-trip and software latencies were similar on both 32- and 4-core workstations. Falcon was used successfully to detect population bursts online with ~40 ms average latency. Falcon is a novel open-source software package for closed-loop neuroscience. It has sub-millisecond intrinsic latency and gives the experimenter direct control of CPU resources. We envisage Falcon to be a useful tool for the neuroscientific community for implementing a wide variety of closed-loop experiments, including those requiring use of complex data structures and real…
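
    Falcon itself is C++, but the node-per-thread, thread-safe-buffer architecture described above can be illustrated in a few lines of Python (a conceptual sketch only, not Falcon's API):

        import queue, threading

        # Each "node" runs in its own thread and talks to the next node through
        # a thread-safe buffer, mirroring the processing-graph idea.
        buf = queue.Queue(maxsize=64)

        def acquire():                       # source node: fake data stream
            for sample in range(1_000):
                buf.put(sample)
            buf.put(None)                    # end-of-stream marker

        def detect():                        # sink node: trivial "event" detector
            while (s := buf.get()) is not None:
                if s % 100 == 0:
                    print("event at sample", s)

        workers = [threading.Thread(target=acquire), threading.Thread(target=detect)]
        for t in workers:
            t.start()
        for t in workers:
            t.join()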

  7. Falcon: a highly flexible open-source software for closed-loop neuroscience

    Science.gov (United States)

    Ciliberti, Davide; Kloosterman, Fabian

    2017-08-01

    Objective. Closed-loop experiments provide unique insights into brain dynamics and function. To facilitate a wide range of closed-loop experiments, we created an open-source software platform that enables high-performance real-time processing of streaming experimental data. Approach. We wrote Falcon, a multi-threaded C++ software package in which the user can load and execute an arbitrary processing graph. Each node of a Falcon graph is mapped to a single thread and nodes communicate with each other through thread-safe buffers. The framework allows for easy implementation of new processing nodes and data types. Falcon was tested both on a 32-core and a 4-core workstation. Streaming data was read from either a commercial acquisition system (Neuralynx) or the open-source Open Ephys hardware, while closed-loop TTL pulses were generated with a USB module for digital output. We characterized the round-trip latency of our Falcon-based closed-loop system, as well as the specific latency contribution of the software architecture, by testing processing graphs with up to 32 parallel pipelines and eight serial stages. We finally deployed Falcon in a task of real-time detection of population bursts recorded live from the hippocampus of a freely moving rat. Main results. On Neuralynx hardware, round-trip latency was well below 1 ms and stable for at least 1 h, while on Open Ephys hardware latencies were below 15 ms. The latency contribution of the software was below 0.5 ms. Round-trip and software latencies were similar on both 32- and 4-core workstations. Falcon was used successfully to detect population bursts online with ~40 ms average latency. Significance. Falcon is a novel open-source software package for closed-loop neuroscience. It has sub-millisecond intrinsic latency and gives the experimenter direct control of CPU resources. We envisage Falcon to be a useful tool for the neuroscientific community for implementing a wide variety of closed-loop experiments, including those…

  8. THE UNIQUE Na:O ABUNDANCE DISTRIBUTION IN NGC 6791: THE FIRST OPEN(?) CLUSTER WITH MULTIPLE POPULATIONS

    International Nuclear Information System (INIS)

    Geisler, D.; Villanova, S.; Cummings, J.; Carraro, G.; Pilachowski, C.; Johnson, C. I.; Bresolin, F.

    2012-01-01

    Almost all globular clusters investigated exhibit a spread in their light element abundances, the most studied being an Na:O anticorrelation. In contrast, open clusters show a homogeneous composition and are still regarded as Simple Stellar Populations. The most probable reason for this difference is that globulars had an initial mass high enough to retain primordial gas and ejecta from the first stellar generation and thus formed a second generation with a distinct composition, an initial mass exceeding that of open clusters. NGC 6791 is a massive open cluster and warrants a detailed search for chemical inhomogeneities. We collected high-resolution, high signal-to-noise spectra of 21 members covering a wide range of evolutionary status and measured their Na, O, and Fe content. We found [Fe/H] = +0.42 ± 0.01, in good agreement with previous values, and no evidence for a spread. However, the Na:O distribution is completely unprecedented. It becomes the first open cluster to show intrinsic abundance variations that cannot be explained by mixing, and thus the first discovered to host multiple populations. It is also the first star cluster to exhibit two subpopulations in the Na:O diagram with one being chemically homogeneous while the second has an intrinsic spread that follows the anticorrelation so far displayed only by globular clusters. NGC 6791 is unique in many aspects, displaying certain characteristics typical of open clusters, others more reminiscent of globulars, and yet others, in particular its Na:O behavior investigated here, that are totally unprecedented. It clearly had a complex and fascinating history.

  9. The Unique Na:O Abundance Distribution in NGC 6791: The First Open(?) Cluster with Multiple Populations

    Science.gov (United States)

    Geisler, D.; Villanova, S.; Carraro, G.; Pilachowski, C.; Cummings, J.; Johnson, C. I.; Bresolin, F.

    2012-09-01

    Almost all globular clusters investigated exhibit a spread in their light element abundances, the most studied being an Na:O anticorrelation. In contrast, open clusters show a homogeneous composition and are still regarded as Simple Stellar Populations. The most probable reason for this difference is that globulars had an initial mass high enough to retain primordial gas and ejecta from the first stellar generation and thus formed a second generation with a distinct composition, an initial mass exceeding that of open clusters. NGC 6791 is a massive open cluster and warrants a detailed search for chemical inhomogeneities. We collected high-resolution, high signal-to-noise spectra of 21 members covering a wide range of evolutionary status and measured their Na, O, and Fe content. We found [Fe/H] = +0.42 ± 0.01, in good agreement with previous values, and no evidence for a spread. However, the Na:O distribution is completely unprecedented. It becomes the first open cluster to show intrinsic abundance variations that cannot be explained by mixing, and thus the first discovered to host multiple populations. It is also the first star cluster to exhibit two subpopulations in the Na:O diagram with one being chemically homogeneous while the second has an intrinsic spread that follows the anticorrelation so far displayed only by globular clusters. NGC 6791 is unique in many aspects, displaying certain characteristics typical of open clusters, others more reminiscent of globulars, and yet others, in particular its Na:O behavior investigated here, that are totally unprecedented. It clearly had a complex and fascinating history.

  10. The quantum probability calculus

    International Nuclear Information System (INIS)

    Jauch, J.M.

    1976-01-01

    The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)

  11. Highly Active N,O Zinc Guanidine Catalysts for the Ring-Opening Polymerization of Lactide.

    Science.gov (United States)

    Schäfer, Pascal M; Fuchs, Martin; Ohligschläger, Andreas; Rittinghaus, Ruth; McKeown, Paul; Akin, Enver; Schmidt, Maximilian; Hoffmann, Alexander; Liauw, Marcel A; Jones, Matthew D; Herres-Pawlis, Sonja

    2017-09-22

    New zinc guanidine complexes with N,O donor functionalities were prepared, characterized by X-ray crystallography, and examined for their catalytic activity in the solvent-free ring-opening polymerization (ROP) of technical-grade rac-lactide at 150 °C. All complexes showed high activity. The fastest complex, [ZnCl2(DMEGasme)] (C1), produced colorless poly(lactide) (PLA) after 90 min with a conversion of 52% and high molar masses (Mw = 69,100; polydispersity = 1.4). The complexes were tested with different monomer-to-initiator ratios to determine the rate constant kp. Furthermore, a polymerization with the most active complex C1 was monitored by in situ Raman spectroscopy. Overall, conversions of up to 90% can be obtained. End-group analysis was performed to clarify the mechanism. All four complexes combine robustness against impurities in the lactide with high polymerization rates, and they represent the fastest robust lactide ROP catalysts to date, opening new avenues to a sustainable ROP catalyst family for industrial use. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. High-uniformity centimeter-wide Si etching method for MEMS devices with large opening elements

    Science.gov (United States)

    Okamoto, Yuki; Tohyama, Yukiya; Inagaki, Shunsuke; Takiguchi, Mikio; Ono, Tomoki; Lebrasseur, Eric; Mita, Yoshio

    2018-04-01

    We propose a compensated mesh pattern filling method to achieve highly uniform deep wafer etching (hundreds of microns) with a large-area opening (over a centimeter wide). The mesh opening diameter is gradually changed between the center and the edge of a large etching area. With such a design, the etching depth variation with sidewall distance (known as the local loading effect) inversely compensates for the over-centimeter-scale etching depth variation, known as the global or within-die (chip)-scale loading effect. A single DRIE run with test-structure patterns provides a micro-electromechanical systems (MEMS) designer with the etched depth dependence on the mesh opening size as well as on the distance from the chip edge, and the designer then only has to set the opening size so as to obtain a uniform etching depth over the entire chip. This method is useful when process optimization cannot be performed, such as when using standard conditions at a foundry service or in short turn-around-time prototyping. As a demonstration, a large MEMS mirror that needed over 1 cm² of backside etching was successfully fabricated using the DRIE conditions as provided.

  13. Open3DQSAR

    DEFF Research Database (Denmark)

    Tosco, Paolo; Balle, Thomas

    2011-01-01

    Open3DQSAR is a freely available open-source program aimed at chemometric analysis of molecular interaction fields. MIFs can be imported from different sources (GRID, CoMFA/CoMSIA, quantum-mechanical electrostatic potential or electron density grids) or generated by Open3DQSAR itself. Much focus has been put on automation through the implementation of a scriptable interface, as well as on high computational performance achieved by algorithm parallelization. Flexibility and interoperability with existing molecular modeling software make Open3DQSAR a powerful tool in pharmacophore assessment…

  14. ESTIMATING LONG GRB JET OPENING ANGLES AND REST-FRAME ENERGETICS

    Energy Technology Data Exchange (ETDEWEB)

    Goldstein, Adam [Space Science Office, VP62, NASA/Marshall Space Flight Center, Huntsville, AL 35812 (United States); Connaughton, Valerie [Science and Technology Institute, Universities Space Research Association, Huntsville, AL 35805 (United States); Briggs, Michael S.; Burns, Eric, E-mail: adam.m.goldstein@nasa.gov [Center for Space Plasma and Aeronomic Research, University of Alabama in Huntsville, 320 Sparkman Drive, Huntsville, AL 35899 (United States)

    2016-02-10

    We present a method to estimate the jet opening angles of long duration gamma-ray bursts (GRBs) using the prompt gamma-ray energetics and an inversion of the Ghirlanda relation, which is a correlation between the time-integrated peak energy of the GRB prompt spectrum and the collimation-corrected energy in gamma-rays. The derived jet opening angles using this method and detailed assumptions match well with the corresponding inferred jet opening angles obtained when a break in the afterglow is observed. Furthermore, using a model of the predicted long GRB redshift probability distribution observable by the Fermi Gamma-ray Burst Monitor (GBM), we estimate the probability distributions for the jet opening angle and rest-frame energetics for a large sample of GBM GRBs for which the redshifts have not been observed. Previous studies have only used a handful of GRBs to estimate these properties due to the paucity of observed afterglow jet breaks, spectroscopic redshifts, and comprehensive prompt gamma-ray observations, and we potentially expand the number of GRBs that can be used in this analysis by more than an order of magnitude. In this analysis, we also present an inferred distribution of jet breaks which indicates that a large fraction of jet breaks are not observable with current instrumentation and observing strategies. We present simple parameterizations for the jet angle, energetics, and jet break distributions so that they may be used in future studies.
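
    In outline, the inversion works as follows: a Ghirlanda-type relation ties the rest-frame peak energy E_p to the collimation-corrected energy E_γ = (1 − cos θ_j)·E_iso, so measured values of E_p and E_iso can be solved for θ_j. The normalization and slope below are placeholders, not the calibration used in the paper.

        import numpy as np

        # Placeholder Ghirlanda-type relation: E_p = A * (E_gamma / 1e50 erg)**k
        A_keV, k = 300.0, 0.7        # assumed calibration, for illustration only

        def jet_angle_deg(E_p_keV, E_iso_erg):
            E_gamma = 1e50 * (E_p_keV / A_keV) ** (1.0 / k)   # invert the relation
            cos_theta = 1.0 - E_gamma / E_iso_erg             # collimation correction
            return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

        print(jet_angle_deg(E_p_keV=500.0, E_iso_erg=1e53))   # opening angle, degrees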

  15. Linear positivity and virtual probability

    International Nuclear Information System (INIS)

    Hartle, James B.

    2004-01-01

    We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. The objective of any quantum theory of a closed system, most generally the universe, is the prediction of probabilities for the individual members of sets of alternative coarse-grained histories of the system. Quantum interference between members of a set of alternative histories is an obstacle to assigning probabilities that are consistent with the rules of probability theory. A quantum theory of closed systems therefore requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to include values outside the range of 0-1 is described. Alternatives with such virtual probabilities cannot be measured or recorded, but can be used in the intermediate steps of calculations of real probabilities. Extended probabilities give a simple and general way of formulating quantum theory. The various decoherence conditions are compared in terms of their utility for characterizing classicality and the role they might play in further generalizations of quantum mechanics

  16. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more - these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl…

  17. Probable Inference and Quantum Mechanics

    International Nuclear Information System (INIS)

    Grandy, W. T. Jr.

    2009-01-01

    In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.

  18. Psychophysics of the probability weighting function

    Science.gov (United States)

    Takahashi, Taiki

    2011-03-01

    A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics have widely utilized probability weighting functions, the psychophysical foundations of these functions have remained unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(-(-ln p)^α) (0 < α < 1; w(1/e) = 1/e, w(1) = 1), which has been studied extensively in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
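
    Prelec's one-parameter weighting function is straightforward to evaluate; the snippet below also shows its fixed point at p = 1/e, which holds for any α in (0, 1). The value of α is chosen arbitrarily for illustration.

        import numpy as np

        def prelec_w(p, alpha=0.65):
            # w(p) = exp(-(-ln p)**alpha), 0 < alpha < 1 (alpha chosen arbitrarily)
            return np.exp(-((-np.log(p)) ** alpha))

        p = np.array([0.01, 1 / np.e, 0.5, 0.99])
        print(prelec_w(p))        # note w(1/e) = 1/e regardless of alpha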

  19. EMISSIONS OF ORGANIC AIR TOXICS FROM OPEN ...

    Science.gov (United States)

    A detailed literature search was performed to collect and collate available data on emissions of toxic organic substances into the air from open burning sources. Availability of data varied according to the source and the class of air toxics of interest. Volatile organic compound (VOC) and polycyclic aromatic hydrocarbon (PAH) data were available for many of the sources. Data on semivolatile organic compounds (SVOCs) that are not PAHs were available for several sources. Carbonyl and polychlorinated dibenzo-p-dioxin and dibenzofuran (PCDD/F) data were available for only a few sources. There were several sources for which no emissions data were available at all. Several observations were made, including: (1) biomass open burning sources typically emitted fewer VOCs than open burning sources with anthropogenic fuels on a mass emitted per mass burned basis, particularly where polymers were concerned; (2) biomass open burning sources typically emitted fewer SVOCs and PAHs than anthropogenic sources on a mass emitted per mass burned basis; burning pools of crude oil and diesel fuel produced significant amounts of PAHs relative to other types of open burning, and PAH emissions were highest when combustion of polymers was taking place; and (3) based on very limited data, biomass open burning sources typically produced higher levels of carbonyls than anthropogenic sources on a mass emitted per mass burned basis, probably due to oxygenated structures r…

  20. Three-phase multilevel inverter configuration for open-winding high power application

    DEFF Research Database (Denmark)

    Sanjeevikumar, Padmanaban; Blaabjerg, Frede; Wheeler, Patrick William

    2015-01-01

    This paper explores a new dual open-winding three-phase multilevel inverter configuration suitable for high-power medium-voltage applications. The modular structure comprises a standard three-phase voltage source inverter (VSI) along with one additional bi-directional semiconductor device (MOSFET…) for implementation purposes. The proposed dual-inverter configuration generates multilevel outputs with benefits including reduced THD and dv/dt in comparison to other dual-inverter topologies. A complete model of the multilevel AC drive is developed with simple MSCFM modulation in Matlab/PLECS numerical software…

  1. Ruin probability with claims modeled by a stationary ergodic stable process

    NARCIS (Netherlands)

    Mikosch, T.; Samorodnitsky, G.

    2000-01-01

    For a random walk with negative drift we study the exceedance probability (ruin probability) of a high threshold. The steps of this walk (claim sizes) constitute a stationary ergodic stable process. We study how ruin occurs in this situation and evaluate the asymptotic behavior of the ruin probability.
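
    A crude Monte Carlo counterpart of the setup: heavy-tailed claims, negative drift, and the probability of ever exceeding a high threshold. I.i.d. Pareto steps stand in for the stationary ergodic stable process, so this only illustrates the quantity under study, not the paper's dependence structure.

        import numpy as np

        rng = np.random.default_rng(5)
        n_paths, n_steps = 10_000, 500
        tail, premium, u = 1.5, 3.0, 50.0   # Pareto tail index, drift, threshold (assumed)

        claims = rng.pareto(tail, (n_paths, n_steps))   # heavy-tailed claim sizes, mean 2
        walk = np.cumsum(claims - premium, axis=1)      # random walk with negative drift
        ruin = (walk.max(axis=1) > u).mean()            # fraction of paths ever above u
        print("estimated ruin probability:", ruin)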

  2. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  3. College Students' Openness toward Autism Spectrum Disorders: Improving Peer Acceptance

    Science.gov (United States)

    Nevill, Rose E. A.; White, Susan W.

    2011-01-01

    One probable consequence of rising rates of autism spectrum disorder diagnosis in individuals without co-occurring intellectual disability is that more young adults with diagnoses or traits of ASD will attend college and require appropriate supports. This study sought to explore college students' openness to peers who demonstrate…

  4. OpenAPC. Open access publication costs as open data

    OpenAIRE

    Tullney, Marco

    2015-01-01

    Presentation slides for the talk "OpenAPC. Open-Access-Publikationskosten als Open Data" (OpenAPC: open access publication costs as open data) in the session "Designing an APC market adequate for scholarship: principles, funding approaches, and management" at the Open Access Days 2015 in Zurich (https://www.open-access.net/community/open-access-tage/open-access-tage-2015-zuerich/programm/#c1974)

  5. Finite Element Analysis of Dam-Reservoir Interaction Using High-Order Doubly Asymptotic Open Boundary

    Directory of Open Access Journals (Sweden)

    Yichao Gao

    2011-01-01

    The dam-reservoir system is divided into the near field, modeled by the finite element method, and the far field, modeled by a high-order doubly asymptotic open boundary (DAOB). Direct and partitioned coupled methods are developed for the analysis of the dam-reservoir system. In the direct coupled method, a symmetric monolithic governing equation is formulated by incorporating the DAOB with the finite element equation and solved using standard time-integration methods. In contrast, the near-field finite element equation and the far-field DAOB condition are separately solved in the partitioned coupled method, and coupling is achieved by applying the interaction force on the truncated boundary. To improve its numerical stability and accuracy, an iteration strategy is employed to obtain the solution of each step. Both coupled methods are implemented on the open-source finite element code OpenSees. Numerical examples are employed to demonstrate the performance of these two proposed methods.

  6. Thermocleavable Materials for Polymer Solar Cells with High Open Circuit Voltage-A Comparative Study

    DEFF Research Database (Denmark)

    Tromholt, Thomas; Gevorgyan, Suren; Jørgensen, Mikkel

    2009-01-01

    The search for polymer solar cells giving a high open circuit voltage was conducted through a comparative study of four types of bulk-heterojunction solar cells employing different photoactive layers. As electron donors, the thermocleavable polymer poly-(3-(2-methylhexyloxycarbonyl)dithiophene) (P3MHOCT) and unsubstituted polythiophene (PT) were used, the latter of which results from thermocleaving the former at 310 °C. As a reference, P3HT solar cells were built in parallel. As electron acceptors, either PCBM or bis[60]PCBM were used. In excess of 300 solar cells were produced under conditions as identical as possible, varying only the material combination of the photoactive layer. It was observed that on replacing PCBM with bis[60]PCBM, the open circuit voltage on average increased by 100 mV for P3MHOCT and 200 mV for PT solar cells. Open circuit voltages approaching 1 V were observed for the PT:bis[60]PCBM cells.

  7. Mobile Measurements of Methane Using High-Speed Open-Path Technology

    Science.gov (United States)

    Burba, G. G.; Anderson, T.; Ediger, K.; von Fischer, J.; Gioli, B.; Ham, J. M.; Hupp, J. R.; Kohnert, K.; Levy, P. E.; Polidori, A.; Pikelnaya, O.; Price, E.; Sachs, T.; Serafimovich, A.; Zondlo, M. A.; Zulueta, R. C.

    2016-12-01

    Methane plays a critical role in the radiation balance, chemistry of the atmosphere, and air quality. The major anthropogenic sources of CH4 include oil and gas development sites, natural gas distribution networks, landfill emissions, and agricultural production. The majority of oil and gas and urban CH4 emissions occur via variable-rate point sources or diffuse spots in topographically challenging terrain (e.g., street tunnels, elevated locations at water treatment plants, vents, etc.). Locating and measuring such CH4 emissions is challenging with traditional micrometeorological techniques, and requires the development of novel approaches. Landfill CH4 emissions, traditionally assessed at monthly or longer intervals, are subject to large uncertainties because of the snapshot nature of the measurements and the barometric pumping phenomenon. The majority of agricultural and natural CH4 production occurs in areas with little infrastructure or easily available grid power (e.g., rice fields, arctic and boreal wetlands, tropical mangroves, etc.). A lightweight, high-speed, high-resolution, open-path technology was recently developed for eddy covariance measurements of CH4 flux, with power consumption 30-150 times below other available technologies. It was designed to run on solar panels or a small generator and to be placed in the middle of the methane-producing ecosystem without the need for grid power. Lately, this instrumentation has been utilized increasingly frequently outside of its traditional use on stationary flux towers. These novel approaches include measurements from various moving platforms, such as cars, aircraft, and ships. Projects have included mapping of concentrations and vertical profiles, leak detection and quantification, mobile emission detection from natural gas-powered cars, soil CH4 flux surveys, etc. This presentation will describe key projects utilizing the novel lightweight low-power high-resolution open-path technology, and will highlight…

  8. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    Foreword; Preface; Sets, Events, and Probability; The Algebra of Sets; The Bernoulli Sample Space; The Algebra of Multisets; The Concept of Probability; Properties of Probability Measures; Independent Events; The Bernoulli Process; The R Language; Finite Processes; The Basic Models; Counting Rules; Computing Factorials; The Second Rule of Counting; Computing Probabilities; Discrete Random Variables; The Bernoulli Process: Tossing a Coin; The Bernoulli Process: Random Walk; Independence and Joint Distributions; Expectations; The Inclusion-Exclusion Principle; General Random Variables

  9. Estimation of post-test probabilities by residents: Bayesian reasoning versus heuristics?

    Science.gov (United States)

    Hall, Stacey; Phang, Sen Han; Schaefer, Jeffrey P; Ghali, William; Wright, Bruce; McLaughlin, Kevin

    2014-08-01

    Although the process of diagnosing invariably begins with a heuristic, we encourage our learners to support their diagnoses by analytical cognitive processes, such as Bayesian reasoning, in an attempt to mitigate the effects of heuristics on diagnosing. There are, however, limited data on the use and impact of Bayesian reasoning on the accuracy of disease probability estimates. In this study our objective was to explore whether Internal Medicine residents use a Bayesian process to estimate disease probabilities, by comparing their disease probability estimates to literature-derived Bayesian post-test probabilities. We gave 35 Internal Medicine residents four clinical vignettes in the form of a referral letter and asked them to estimate the post-test probability of the target condition in each case. We then compared these to literature-derived probabilities. For each vignette the estimated probability was significantly different from the literature-derived probability. For the two cases with low literature-derived probability our participants significantly overestimated the probability of these target conditions being the correct diagnosis, whereas for the two cases with high literature-derived probability the estimated probability was significantly lower than the calculated value. Our results suggest that residents generate inaccurate post-test probability estimates. Possible explanations for this include ineffective application of Bayesian reasoning, attribute substitution whereby a complex cognitive task is replaced by an easier one (e.g., a heuristic), or systematic rater bias, such as central tendency bias. Further studies are needed to identify the reasons for the inaccuracy of disease probability estimates and to explore ways of improving accuracy.
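
    The Bayesian computation the estimates were compared against is a one-liner in odds form: post-test odds = pre-test odds × likelihood ratio. The numbers below are generic placeholders, not the vignettes' values.

        def post_test_probability(pretest_p, lr):
            """Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio."""
            odds = pretest_p / (1 - pretest_p) * lr
            return odds / (1 + odds)

        # e.g. 20% pre-test probability and a positive test with LR+ = 8 (placeholders)
        print(post_test_probability(0.20, 8))   # ~0.67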

  10. The Role of Transforms in Gulf of Mexico Opening

    Science.gov (United States)

    Lundin, E.; Doré, A. G.

    2017-12-01

    The curious pie-shaped Gulf of Mexico (GoM) may be considered a high-angle back-arc basin to the Pacific Ocean. Opening was strongly facilitated by transforms, including a terminal transform on its Pacific side. GoM also formed synchronously with the nearby Central Atlantic when Gondwanaland pulled away from Laurasia in the Jurassic. Notably, GoM's oceanic crust never connected with that of the Atlantic, and the isolated nature of this small ocean led to periodically confined conditions that influenced the petroleum system. Of particular importance is the deposition of Callovian-age salt and Tithonian-age source rocks. The central part of GoM is generally accepted as underlain by oceanic crust, but the position of the continent-ocean boundaries (COB) is debated, as well as the nature of intervening crust. We favor an interpretation of the COBs marked by the regional scale, large-amplitude Houston, Florida, and Campeche magnetic anomalies, in turn probably reflecting seaward dipping reflectors of magma-rich margins. GoM's unusual shape may indirectly represent utilization of pre-existing transforms during the break-up of Pangea. Transforms represent long, linear weaknesses where the crust and lithosphere are already broken. Transforms seem to have governed the break-up of several oceanic segments in the North Atlantic and Arctic. The Suwannee suture of the Rheic Ocean is a pronounced magnetic anomaly that crosses Georgia-Florida and becomes aligned with the Houston magnetic anomaly, which here is interpreted as the northern COB to GoM. The Suwannee suture is oriented at a high angle to the rest of the Rheic suture along the Appalachians and probably experienced lateral motion during the transpressional closure of the Rheic Ocean. This transform arguably represents a weak element in the Ouachita-Marathon orogen that allowed the Yucatan microcontinent to be easily plucked from the North American margin during the dispersal of Pangea, forming the GoM in the process. This

  11. Probability of collective excited state decay

    International Nuclear Information System (INIS)

    Manykin, Eh.A.; Ozhovan, M.I.; Poluehktov, P.P.

    1987-01-01

    Decay mechanisms of a condensed excited state formed from highly excited (Rydberg) atoms are considered, i.e., the stability of the so-called Rydberg substance is analyzed. It is shown that Auger recombination and radiative transitions are the basic processes. The corresponding probabilities are calculated and compared. It is ascertained that the ''Rydberg substance'' possesses a macroscopic lifetime (several seconds) and is, in a sense, metastable

  12. The mediating effect of psychosocial factors on suicidal probability among adolescents.

    Science.gov (United States)

    Hur, Ji-Won; Kim, Won-Joong; Kim, Yong-Ku

    2011-01-01

    Suicidal probability is an overall tendency comprising negative self-evaluation, hopelessness, suicidal ideation, and hostility. The purpose of this study was to examine the role of psychosocial variables in the suicidal probability of adolescents, especially their mediating role. This study investigated the mediating effects of psychosocial factors such as depression, anxiety, self-esteem, stress, and social support on suicidal probability among 1,586 adolescents attending middle and high schools in the Kyunggi Province area of South Korea. The relationship between depression/anxiety and suicidal probability was mediated by both social resources and self-esteem. Furthermore, the influence of social resources was mediated by interpersonal and achievement stress as well as self-esteem. This study suggests that suicidal probability in adolescents has various relationships, including mediating relations, with several psychosocial factors. Interventions on suicidal probability in adolescents should focus on social factors as well as clinical symptoms.

  13. Open Veterinary Journal

    African Journals Online (AJOL)

    Open Veterinary Journal is a peer reviewed international open access online and printed journal that publishes high-quality original research articles, reviews, short communications and case reports dedicated to all aspects of veterinary sciences and its related subjects. Other websites associated with this journal: ...

  14. Determination of the failure probability in the weld region of the AP-600 vessel for a transient condition

    International Nuclear Information System (INIS)

    Wahyono, I.P.

    1997-01-01

    Failure probability in the weld region of the AP-600 vessel was determined for a transient-condition scenario. The transient considered is an increase in heat removal from the primary cooling system due to sudden opening of safety valves or steam relief valves on the secondary cooling system or the steam generator. Temperature and pressure in the vessel were taken as the basis for the deterministic calculation of the stress intensity factor. The film coefficient of convective heat transfer is calculated as a function of transient time and water parameters. Pressure, material temperature, flaw depth, and transient time are the variables of the stress intensity factor. The failure probability was evaluated by using this information together with the flaw and probability distributions of Octavia II and Marshall. The failure probability is calculated by probabilistic fracture mechanics simulation applied to the weld region. Failure of the vessel is assumed to be failure of the weld material containing one crack whose applied stress intensity factor exceeds the critical stress intensity factor. The VISA II code (Vessel Integrity Simulation Analysis II) was used for the deterministic calculation and the simulation. The failure probability of the material is 1E-5 for the Octavia II distribution and 4E-6 for the Marshall distribution for each postulated transient event. Failure occurred at 1.7 minutes into the transient, at a pressure of 12.53 ksi
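
    A probabilistic fracture mechanics calculation of this kind can be sketched as a Monte Carlo loop over flaw-depth and toughness distributions. The following is a toy illustration, not the VISA II code; every distribution and constant is an assumption chosen only to make the mechanics visible:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N = 1_000_000

    flaw_depth = rng.exponential(scale=6.0, size=N)     # mm, Marshall-style flaw-depth model (assumed)
    K_Ic = rng.normal(loc=100.0, scale=10.0, size=N)    # MPa*sqrt(m), toughness scatter (assumed)
    stress = 195.0                                      # MPa, stress at the transient time (assumed)

    # K_I = Y * stress * sqrt(pi * a), with a in metres and Y ~ 1.1 for a surface flaw (assumed)
    K_I = 1.1 * stress * np.sqrt(np.pi * flaw_depth / 1000.0)

    # Failure: applied stress intensity factor exceeds the critical one
    print(f"simulated failure probability: {np.mean(K_I > K_Ic):.1e}")  # on the order of 1e-5 here
    ```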

  15. Open Standards, Open Source, and Open Innovation: Harnessing the Benefits of Openness

    Science.gov (United States)

    Committee for Economic Development, 2006

    2006-01-01

    Digitization of information and the Internet have profoundly expanded the capacity for openness. This report details the benefits of openness in three areas--open standards, open-source software, and open innovation--and examines the major issues in the debate over whether openness should be encouraged or not. The report explains each of these…

  16. Open-Source Colorimeter

    OpenAIRE

    Anzalone, Gerald C.; Glover, Alexandra G.; Pearce, Joshua M.

    2013-01-01

    The high cost of what have historically been sophisticated research-related sensors and tools has limited their adoption to a relatively small group of well-funded researchers. This paper provides a methodology for applying an open-source approach to design and development of a colorimeter. A 3-D printable, open-source colorimeter utilizing only open-source hardware and software solutions and readily available discrete components is discussed and its performance compared to a commercial porta...

  17. A simplistic analytical unit cell based model for the effective thermal conductivity of high porosity open-cell metal foams

    International Nuclear Information System (INIS)

    Yang, X H; Kuang, J J; Lu, T J; Han, F S; Kim, T

    2013-01-01

    We present a simplistic yet accurate analytical model for the effective thermal conductivity of high porosity open-cell metal foams saturated with a low-conductivity fluid (air). The model is derived analytically based on a realistic representative unit cell (a tetrakaidecahedron) under the assumption of one-dimensional heat conduction along highly tortuous, conducting ligaments in the high porosity range (ε ⩾ 0.9). Good agreement with existing experimental data suggests that heat conduction along highly conducting and tortuous ligaments predominantly defines the effective thermal conductivity of open-cell metal foams, with negligible parallel conduction through the fluid phase. (paper)

  18. Value of semi-open corridors for simultaneously connecting open and wooded habitats: a case study with ground beetles.

    Science.gov (United States)

    Eggers, Britta; Matern, Andrea; Drees, Claudia; Eggers, Jan; Härdtle, Werner; Assmann, Thorsten

    2010-02-01

    To counteract habitat fragmentation, the connectivity of a landscape should be enhanced. Corridors are thought to facilitate movement between disconnected patches of habitat, and linear strips of habitat connecting isolated patches are a popular type of corridor. On the other hand, the creation of new corridors can lead to fragmentation of the surrounding habitat. For example, heathland corridors connect patches of heathland and alternatively hedgerows connect patches of woodland. Nevertheless, these corridors themselves also break up previously connected patches of their surrounding habitat and in so doing fragment another type of habitat (heathland corridors fragment woodlands and woodland strips or hedgerows fragment heathlands). To overcome this challenge we propose the use of semi-open habitats (a mixture of heathland and woodland vegetation) as conservation corridors to enable dispersal of both stenotopic heathland and woodland species. We used two semi-open corridors with a mosaic of heathland and woody vegetation to investigate the efficiency of semi-open corridors for species dispersal and to assess whether these corridors might be a suitable approach for nature conservation. We conducted a mark-recapture study on three stenotopic flightless carabid beetles of heathlands and woodlands and took an inventory of all the carabid species in two semi-open corridors. Both methodological approaches showed simultaneous immigration of woodland and heathland species in the semi-open corridor. Detrended correspondence analysis showed a clear separation of the given habitats and affirmed that semi-open corridors are a good strategy for connecting woodlands and heathlands. The best means of creating and preserving semi-open corridors is probably through extensive grazing.

  19. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and the crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew...... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving...

  20. Irreversibility and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. The retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information' since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)

  1. Open charm production at high energies and the quark Reggeization hypothesis

    International Nuclear Information System (INIS)

    Kniehl, B.A.; Shipilova, A.V.

    2008-12-01

    We study open charm production at high energies in the framework of the quasi-multi-Regge-kinematics approach applying the quark-Reggeization hypothesis implemented with Reggeon-Reggeon-particle and Reggeon-particle-particle effective vertices. Adopting the Kimber-Martin-Ryskin unintegrated quark and gluon distribution functions of the proton and photon, we thus nicely describe the proton structure function F2,c measured at DESY HERA as well as the transverse-momentum distributions of D mesons created by photoproduction at HERA and by hadroproduction at the Fermilab Tevatron. (orig.)

  2. Early deprivation increases high-leaning behavior, a novel anxiety-like behavior, in the open field test in rats.

    Science.gov (United States)

    Kuniishi, Hiroshi; Ichisaka, Satoshi; Yamamoto, Miki; Ikubo, Natsuko; Matsuda, Sae; Futora, Eri; Harada, Riho; Ishihara, Kohei; Hata, Yoshio

    2017-10-01

    The open field test is one of the most popular ethological tests to assess anxiety-like behavior in rodents. In the present study, we examined the effect of early deprivation (ED), a model of early life stress, on anxiety-like behavior in rats. In ED animals, we failed to find significant changes in the time spent in the center or thigmotaxis area of the open field, the common indexes of anxiety-like behavior. However, we found a significant increase in high-leaning behavior in which animals lean against the wall standing on their hindlimbs while touching the wall with their forepaws at a high position. The high-leaning behavior was decreased by treatment with an anxiolytic, diazepam, and it was increased under intense illumination as observed in the center activity. In addition, we compared the high-leaning behavior and center activity under various illumination intensities and found that the high-leaning behavior is more sensitive to illumination intensity than the center activity in the particular illumination range. These results suggest that the high-leaning behavior is a novel anxiety-like behavior in the open field test that can complement the center activity to assess the anxiety state of rats. Copyright © 2017 Elsevier Ireland Ltd and Japan Neuroscience Society. All rights reserved.

  3. Production of 147Eu for gamma-ray emission probability measurement

    International Nuclear Information System (INIS)

    Katoh, Keiji; Marnada, Nada; Miyahara, Hiroshi

    2002-01-01

    Gamma-ray emission probability is one of the most important decay parameters of a radionuclide, and many researchers are working to improve its accuracy. The certainties of γ-ray emission probabilities for neutron-rich nuclides are being improved little by little, but those for proton-rich nuclides are still insufficient. Europium-147, which decays by electron capture or β+-particle emission, is a proton-rich nuclide, and the γ-ray emission probabilities evaluated by Mateosian and Peker have large uncertainties; they referred to only one report concerning γ-ray emission probabilities. Our final purpose is to determine precise γ-ray emission probabilities of 147Eu from disintegration rates and γ-ray intensities by using a 4πβ-γ coincidence apparatus. Impurity nuclides strongly affect the determination of the disintegration rate; therefore, a highly pure 147Eu source is required. This short note describes the most suitable energy for 147Eu production through the 147Sm(p,n) reaction. (author)

  4. Probability intervals for the top event unavailability of fault trees

    International Nuclear Information System (INIS)

    Lee, Y.T.; Apostolakis, G.E.

    1976-06-01

    The evaluation of probabilities of rare events is of major importance in the quantitative assessment of the risk from large technological systems. In particular, for nuclear power plants the complexity of the systems, their high reliability and the lack of significant statistical records have led to the extensive use of logic diagrams in the estimation of low probabilities. The estimation of probability intervals for the probability of existence of the top event of a fault tree is examined. Given the uncertainties of the primary input data, a method is described for the evaluation of the first four moments of the top event occurrence probability. These moments are then used to estimate confidence bounds by several approaches which are based on standard inequalities (e.g., Tchebycheff, Cantelli, etc.) or on empirical distributions (the Johnson family). Several examples indicate that the Johnson family of distributions yields results which are in good agreement with those produced by Monte Carlo simulation
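
    The weakest member of that family of bounds, the one-sided Chebyshev (Cantelli) inequality, already illustrates the idea with only the first two moments (the paper goes further, using four moments and the Johnson family). A sketch with invented numbers:

    ```python
    def cantelli_upper_bound(mean: float, var: float, q: float) -> float:
        """P(Q >= q) <= var / (var + (q - mean)^2) for q > mean."""
        assert q > mean
        return var / (var + (q - mean) ** 2)

    # Hypothetical top-event unavailability with mean 1e-4 and std 5e-5:
    mu, sigma = 1e-4, 5e-5
    q95 = mu + sigma * (0.95 / 0.05) ** 0.5   # Cantelli-guaranteed 95% upper bound
    print(q95, cantelli_upper_bound(mu, sigma**2, q95))  # the bound evaluates to 0.05
    ```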

  5. Risk Preferences, Probability Weighting, and Strategy Tradeoffs in Wildfire Management.

    Science.gov (United States)

    Hand, Michael S; Wibbenmeyer, Matthew J; Calkin, David E; Thompson, Matthew P

    2015-10-01

    Wildfires present a complex applied risk management environment, but relatively little attention has been paid to behavioral and cognitive responses to risk among public agency wildfire managers. This study investigates responses to risk, including probability weighting and risk aversion, in a wildfire management context using a survey-based experiment administered to federal wildfire managers. Respondents were presented with a multiattribute lottery-choice experiment where each lottery is defined by three outcome attributes: expenditures for fire suppression, damage to private property, and exposure of firefighters to the risk of aviation-related fatalities. Respondents choose one of two strategies, each of which includes "good" (low cost/low damage) and "bad" (high cost/high damage) outcomes that occur with varying probabilities. The choice task also incorporates an information framing experiment to test whether information about fatality risk to firefighters alters managers' responses to risk. Results suggest that managers exhibit risk aversion and nonlinear probability weighting, which can result in choices that do not minimize expected expenditures, property damage, or firefighter exposure. Information framing tends to result in choices that reduce the risk of aviation fatalities, but exacerbates nonlinear probability weighting. © 2015 Society for Risk Analysis.

  6. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Discrete ProbabilityThe Cast of Characters Properties of Probability Simulation Random SamplingConditional ProbabilityIndependenceDiscrete DistributionsDiscrete Random Variables, Distributions, and ExpectationsBernoulli and Binomial Random VariablesGeometric and Negative Binomial Random Variables Poisson DistributionJoint, Marginal, and Conditional Distributions More on ExpectationContinuous ProbabilityFrom the Finite to the (Very) Infinite Continuous Random Variables and DistributionsContinuous ExpectationContinuous DistributionsThe Normal Distribution Bivariate Normal DistributionNew Random Variables from OldOrder Statistics Gamma DistributionsChi-Square, Student's t, and F-DistributionsTransformations of Normal Random VariablesAsymptotic TheoryStrong and Weak Laws of Large Numbers Central Limit TheoremStochastic Processes and ApplicationsMarkov ChainsPoisson Processes QueuesBrownian MotionFinancial MathematicsAppendixIntroduction to Mathematica Glossary of Mathematica Commands for Probability Short Answers...

  7. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decisionmaker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
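
    The central effect, that thresholds fitted from finite samples fail more often than their nominal rate, is easy to reproduce by simulation. A sketch assuming log-normal losses and an invented sample size (a numerical check in the spirit of the paper, not its exact calculation):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n, trials = 30, 100_000   # sample size and Monte Carlo repetitions (assumed)

    failures = 0
    for _ in range(trials):
        log_s = np.log(rng.lognormal(mean=0.0, sigma=1.0, size=n))
        mu_hat, sigma_hat = log_s.mean(), log_s.std(ddof=1)
        threshold = np.exp(mu_hat + 2.326 * sigma_hat)   # fitted 99% quantile (z_0.99 ~ 2.326)
        failures += rng.lognormal(0.0, 1.0) > threshold  # does the next loss exceed it?

    print(failures / trials)   # ~0.015 here, noticeably above the nominal 0.01
    ```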

  8. Pro OpenSSH

    CERN Document Server

    Stahnke, Michael

    2006-01-01

    SSH, an acronym for Secure Shell, is for users and administrators wishing to establish secure communication between disparate networks. 'Pro OpenSSH', authored by two Fortune 100 system administrators, provides readers with a highly practical reference for configuring and deploying OpenSSH in their own environment.

  9. Habituating to handling: factors affecting preorbital gland opening in red deer calves.

    Science.gov (United States)

    Ceacero, F; Landete-Castillejos, T; Bartošová, J; García, A J; Bartoš, L; Komárková, M; Gallego, L

    2014-09-01

    The preorbital gland plays not only an olfactory role in cervids but also a visual one. Opening this gland is an easy way for the calf to communicate with the mother, indicating hunger/satiety, stress, pain, fear, or excitement. This information can also be useful for farm operators to assess how fast the calves habituate to handling routines and to detect those calves that do not habituate and may suffer chronic stress in the future. Thirty-one calves were subjected to 2 consecutive experiments to clarify whether preorbital gland opening is related to habituation to handling in red deer calves (Cervus elaphus). Calves were born in 3 different paddocks, handled as newborns (Exp. 1), and then subjected to the same routine handling but with different periodicity: every 1, 2, or 3 wk (Exp. 2). In Exp. 1, preorbital gland opening was recorded in newborns during an initial handling (including weighing, ear tagging, and sex determination). Preorbital gland opening occurred in 93% of calves during this procedure and was not affected by sex, time since birth, or birth weight. Experiment 2 consisted of measuring preorbital opening during the same routine handling (weighing, blood sampling, and rump touching to assess body condition) when calves were 1, 3, and 5 mo old. Binary logistic regression showed that gland opening was associated with habituation to handling, since at 1 and 3 mo the probability of opening the gland decreased with the number of handlings that a calf had experienced before (P = 0.008 and P = 0.028, respectively). However, there were no further changes in preorbital gland opening rate in the 5-mo-old calves (P = 0.182). The significant influence of the number of previous handlings on the probability of opening the preorbital gland was confirmed through a generalized linear model with repeated measures (P = 0.007). Preorbital gland opening decreased across the phases of the study. Nevertheless, we found a significant trend in individuals to keep similar

  10. Open Science Training Handbook

    OpenAIRE

    Sonja Bezjak; April Clyburne-Sherin; Philipp Conzett; Pedro Fernandes; Edit Görögh; Kerstin Helbig; Bianca Kramer; Ignasi Labastida; Kyle Niemeyer; Fotis Psomopoulos; Tony Ross-Hellauer; René Schneider; Jon Tennant; Ellen Verbakel; Helene Brinken

    2018-01-01

    For a readable version of the book, please visit https://book.fosteropenscience.eu A group of fourteen authors came together in February 2018 at the TIB (German National Library of Science and Technology) in Hannover to create an open, living handbook on Open Science training. High-quality training is fundamental when aiming at a cultural change towards the implementation of Open Science principles. Teaching resources provide great support for Open Science instructors and trainers. The ...

  11. The estimated lifetime probability of acquiring human papillomavirus in the United States.

    Science.gov (United States)

    Chesson, Harrell W; Dunne, Eileen F; Hariri, Susan; Markowitz, Lauri E

    2014-11-01

    Estimates of the lifetime probability of acquiring human papillomavirus (HPV) can help to quantify HPV incidence, illustrate how common HPV infection is, and highlight the importance of HPV vaccination. We developed a simple model, based primarily on the distribution of lifetime numbers of sex partners across the population and the per-partnership probability of acquiring HPV, to estimate the lifetime probability of acquiring HPV in the United States in the time frame before HPV vaccine availability. We estimated the average lifetime probability of acquiring HPV among those with at least 1 opposite sex partner to be 84.6% (range, 53.6%-95.0%) for women and 91.3% (range, 69.5%-97.7%) for men. Under base case assumptions, more than 80% of women and men acquire HPV by age 45 years. Our results are consistent with estimates in the existing literature suggesting a high lifetime probability of HPV acquisition and are supported by cohort studies showing high cumulative HPV incidence over a relatively short period, such as 3 to 5 years.
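
    A model of the kind described, combining a lifetime partner-number distribution with a per-partnership acquisition probability, fits in a few lines. All numbers below are invented placeholders, not the paper's inputs:

    ```python
    import numpy as np

    partners = np.array([0, 1, 2, 4, 8, 15])                 # partner counts (assumed)
    share = np.array([0.05, 0.20, 0.25, 0.25, 0.15, 0.10])   # population shares (assumed)
    p = 0.5                                                   # per-partnership probability (assumed)

    not_acquired = np.sum(share * (1.0 - p) ** partners)
    overall = 1.0 - not_acquired
    among_exposed = 1.0 - (not_acquired - share[0]) / (1.0 - share[0])  # condition on >= 1 partner
    print(overall, among_exposed)   # ~0.77 and ~0.81 with these made-up inputs
    ```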

  12. Facing the fight for the control. Weight displacements through energy market opening

    International Nuclear Information System (INIS)

    Frey, D. C.

    1999-01-01

    On February 19, 1999 the European power markets opened within the framework of the new directives of the European Union. According to a new study by Andersen Consulting, the greatest advantage from the opening of the energy markets in Europe will probably be earned by the natural-gas industry. Up to the year 2015, 30 to 40% of the electrical power in Europe will be produced from natural gas. Distributors of power and gas are moving towards offering new services [de

  13. Strong lensing probability in TeVeS (tensor-vector-scalar) theory

    Science.gov (United States)

    Chen, Da-Ming

    2008-01-01

    We recalculate the strong lensing probability as a function of the image separation in TeVeS (tensor-vector-scalar) cosmology, which is a relativistic version of MOND (MOdified Newtonian Dynamics). The lens is modeled by the Hernquist profile. We assume an open cosmology with Ωb = 0.04 and ΩΛ = 0.5 and three different kinds of interpolating functions. Two different galaxy stellar mass functions (GSMF) are adopted: PHJ (Panter, Heavens and Jimenez 2004 Mon. Not. R. Astron. Soc. 355 764) determined from SDSS data release 1 and Fontana (Fontana et al 2006 Astron. Astrophys. 459 745) from GOODS-MUSIC catalog. We compare our results with both the predicted probabilities for lenses from singular isothermal sphere galaxy halos in LCDM (Lambda cold dark matter) with a Schechter-fit velocity function, and the observational results for the well defined combined sample of the Cosmic Lens All-Sky Survey (CLASS) and Jodrell Bank/Very Large Array Astrometric Survey (JVAS). It turns out that the interpolating function μ(x) = x/(1+x) combined with Fontana GSMF matches the results from CLASS/JVAS quite well.

  14. On the probability of occurrence of rogue waves

    Directory of Open Access Journals (Sweden)

    E. M. Bitner-Gregersen

    2012-03-01

    A number of extreme and rogue wave studies have been conducted theoretically, numerically, experimentally and based on field data in recent years, which have significantly advanced our knowledge of ocean waves. So far, however, consensus on the probability of occurrence of rogue waves has not been achieved. The present investigation addresses this topic from the perspective of design needs. The probability of occurrence of extreme and rogue wave crests in deep water is discussed based on higher-order time simulations, experiments and hindcast data. Focus is given to the occurrence of rogue waves in high sea states.
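
    A useful benchmark for such comparisons is linear narrow-band (Rayleigh) theory, which gives a closed-form crest exceedance probability; the nonlinear effects examined in studies like this one raise occurrence rates above it. A sketch:

    ```python
    import math

    def rayleigh_crest_exceedance(crest_over_hs: float) -> float:
        """P(crest > c) per wave for a linear, narrow-banded sea: exp(-8 (c/Hs)^2)."""
        return math.exp(-8.0 * crest_over_hs ** 2)

    # A common rogue-crest criterion: crest height above 1.25 * significant wave height Hs
    print(f"{rayleigh_crest_exceedance(1.25):.1e}")   # ~3.7e-6 per wave under linear theory
    ```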

  15. Thermal performance of an open thermosyphon using nanofluid for evacuated tubular high temperature air solar collector

    International Nuclear Information System (INIS)

    Liu, Zhen-Hua; Hu, Ren-Lin; Lu, Lin; Zhao, Feng; Xiao, Hong-shen

    2013-01-01

    Highlights: • A novel solar air collector with simplified CPC and open thermosyphon is designed and tested. • Simplified CPC has a much lower cost at the expense of slight efficiency loss. • Nanofluid effectively improves thermal performance of the above solar air collector. • Solar air collector with open thermosyphon is better than that with concentric tube. - Abstract: A novel evacuated tubular solar air collector integrated with simplified CPC (compound parabolic concentrator) and special open thermosyphon using water based CuO nanofluid as the working fluid is designed to provide air with high and moderate temperature. The experimental system has two linked panels and each panel includes an evacuated tube, a simplified CPC and an open thermosyphon. Outdoor experimental study has been carried out to investigate the actual solar collecting performance of the designed system. Experimental results show that air outlet temperature and system collecting efficiency of the solar air collector using nanofluid as the open thermosyphon's working fluid are both higher than that using water. Its maximum air outlet temperature exceeds 170 °C at the air volume rate of 7.6 m3/h in winter, even though the experimental system consists of only two collecting panels. The solar collecting performance of the solar collector integrated with open thermosyphon is also compared with that integrated with common concentric tube. Experimental results show that the solar collector integrated with open thermosyphon has a much better collecting performance

  16. INVESTIGATION OF INFLUENCE OF ENCODING FUNCTION COMPLEXITY ON DISTRIBUTION OF ERROR MASKING PROBABILITY

    Directory of Open Access Journals (Sweden)

    A. B. Levina

    2016-03-01

    Error detection codes are mechanisms that enable robust delivery of data over unreliable communication channels and devices. Unreliable channels and devices are error-prone objects, and error detection codes allow detecting such errors. There are two classes of error detecting codes: classical codes and security-oriented codes. Classical codes detect a high percentage of errors; however, they have a high probability of missing an error caused by algebraic manipulation. In contrast, security-oriented codes are codes with a small Hamming distance and high protection against algebraic manipulation. The probability of error masking is a fundamental parameter of security-oriented codes. A detailed study of this parameter allows analyzing the behavior of the error-correcting code in the case of error injection into the encoding device. In turn, the complexity of the encoding function plays an important role in security-oriented codes. Encoding functions with low computational complexity and a low probability of masking are the best protection of the encoding device against malicious acts. This paper investigates the influence of encoding function complexity on the error masking probability distribution. It is shown that a more complex encoding function reduces the maximum of the error masking probability. It is also shown that increasing the function complexity changes the error masking probability distribution. In particular, increasing the computational complexity decreases the difference between the maximum and the average value of the error masking probability. Our results have shown that functions with greater complexity have smoothed maxima of the error masking probability, which significantly complicates the analysis of the error-correcting code by an attacker. As a result, in the case of a complex encoding function, the probability of algebraic manipulation is reduced. The paper discusses an approach how to measure the error masking
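
    The masking probability Q(e) of an error pattern e can be computed by brute force for small codes, which makes the weakness of classical linear codes concrete. A toy example with a [4,3] even-parity code (my example, not one from the paper):

    ```python
    from itertools import product

    # Even-parity code: every 3-bit message extended with a parity bit.
    code = {bits + (sum(bits) % 2,) for bits in product((0, 1), repeat=3)}

    def masking_probability(e):
        """Fraction of codewords x such that x XOR e is again a codeword."""
        return sum(tuple(a ^ b for a, b in zip(x, e)) in code for x in code) / len(code)

    worst = max(masking_probability(e) for e in product((0, 1), repeat=4) if any(e))
    print(worst)   # 1.0: every even-weight error is masked with certainty by a linear code;
                   # security-oriented codes are designed to keep max Q(e) strictly below 1
    ```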

  17. A first course in probability

    CERN Document Server

    Ross, Sheldon

    2014-01-01

    A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.

  18. Lectures on probability and statistics

    International Nuclear Information System (INIS)

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another

  19. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments...... that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still...

  20. Risk and protective factors of dissocial behavior in a probability sample.

    Science.gov (United States)

    Moral de la Rubia, José; Ortiz Morales, Humberto

    2012-07-01

    The aims of this study were to identify risk and protective factors for dissocial behavior, keeping in mind that self-reports of dissocial behavior are biased by impression management. A probability sample of adolescents living in two neighborhoods with high rates of gangs and offenses (112 males and 86 females) was collected. The 27-item Dissocial Behavior Scale (ECODI27; Pacheco & Moral, 2010), Balanced Inventory of Desirable Responding, version 6 (BIDR-6; Paulhus, 1991), Sensation Seeking Scale, form V (SSS-V; Zuckerman, Eysenck, & Eysenck, 1978), Parent-Adolescent Communication Scale (PACS; Barnes & Olson, 1982), 30-item Rathus Assertiveness Schedule (RAS; Rathus, 1973), Interpersonal Reactivity Index (IRI; Davis, 1983) and a social relationship questionnaire (SRQ) were applied. Binary logistic regression was used for the data analysis. A third of the participants showed dissocial behavior. Belonging to a gang in school (schooled adolescents) or to a gang outside school and work (total sample) and disinhibition were risk factors; being female, perspective taking and open communication with the father were protective factors. School-leaving was a differential aspect. We insist on the need for intervention on these variables.

  1. A Multidisciplinary Approach for Teaching Statistics and Probability

    Science.gov (United States)

    Rao, C. Radhakrishna

    1971-01-01

    The author presents a syllabus for an introductory (first year after high school) course in statistics and probability and some methods of teaching statistical techniques. The description comes basically from the procedures used at the Indian Statistical Institute, Calcutta. (JG)

  2. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...

  3. Market-implied risk-neutral probabilities, actual probabilities, credit risk and news

    Directory of Open Access Journals (Sweden)

    Shashidhar Murthy

    2011-09-01

    Motivated by the credit crisis, this paper investigates links between risk-neutral probabilities of default implied by markets (e.g., from yield spreads) and their actual counterparts (e.g., from ratings). It discusses differences between the two and clarifies the underlying economic intuition using simple representations of credit risk pricing. Observed large differences across bonds in the ratio of the two probabilities are shown to imply that apparently safer securities can be more sensitive to news.
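
    The gap discussed here can be made concrete with the standard back-of-the-envelope relation between spreads and risk-neutral default probabilities, q ≈ s/(1 − R). All inputs below are invented:

    ```python
    import math

    spread = 0.02      # yield spread over the riskless rate (assumed)
    recovery = 0.4     # recovery rate (assumed)
    actual_pd = 0.005  # historical one-year default rate for the rating class (assumed)

    risk_neutral_pd = 1 - math.exp(-spread / (1 - recovery))  # one-year horizon
    print(risk_neutral_pd, risk_neutral_pd / actual_pd)       # ratio ~6.6 with these inputs
    ```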

  4. Uncertainty of soil erosion modelling using open source high resolution and aggregated DEMs

    Directory of Open Access Journals (Sweden)

    Arun Mondal

    2017-05-01

    A Digital Elevation Model (DEM) is one of the important inputs for soil erosion assessment. Notable uncertainties are observed in this study while using three high-resolution open source DEMs. The Revised Universal Soil Loss Equation (RUSLE) model has been applied to analyze the uncertainty of soil erosion assessment using open source DEMs (SRTM, ASTER and CARTOSAT) and their increasing grid spacings (pixel sizes). The study area is a part of the Narmada river basin in Madhya Pradesh state, located in the central part of India and covering 20,558 km2. The native resolution of the DEMs is 30 m, and grid spacings of 90, 150, 210, 270 and 330 m are considered in this study. The vertical accuracy of the DEMs has been assessed using the actual heights of sample points taken from a planimetric-survey-based map (toposheet). The elevations of the DEMs were converted to the same vertical datum, from WGS 84 to MSL (Mean Sea Level), before the accuracy assessment and modelling. Results indicate that the accuracy of the SRTM DEM, with RMSEs of 13.31, 14.51, and 18.19 m at 30, 150 and 330 m resolution respectively, is better than that of the ASTER and CARTOSAT DEMs. As the grid spacing of the DEMs increases, the accuracy of the elevation and of the calculated soil erosion decreases. This study presents the potential uncertainty introduced by open source high-resolution DEMs into the accuracy of soil erosion assessment models. The research provides an analysis of errors in selecting DEMs, using the original and increased grid spacings, for soil erosion modelling.
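
    The RUSLE estimate itself is a cell-wise product of factor grids, A = R · K · LS · C · P, and the DEM enters through the LS factor. A sketch with random stand-in arrays in place of the real factor grids:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    shape = (100, 100)                    # toy raster; real grids come from rainfall, soil,
                                          # DEM and land-cover data

    R = rng.uniform(400, 600, shape)      # rainfall erosivity (stand-in values)
    K = rng.uniform(0.1, 0.4, shape)      # soil erodibility
    LS = rng.uniform(0.5, 5.0, shape)     # slope length/steepness: the DEM-dependent factor
    C = rng.uniform(0.01, 0.5, shape)     # cover management
    P = rng.uniform(0.5, 1.0, shape)      # support practice

    A = R * K * LS * C * P                # soil loss per cell
    print(f"mean soil loss: {A.mean():.1f}")
    # Re-deriving LS from each DEM and grid spacing and comparing the resulting A grids
    # reproduces the kind of uncertainty analysis described above.
    ```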

  5. Dopaminergic Drug Effects on Probability Weighting during Risky Decision Making.

    Science.gov (United States)

    Ojala, Karita E; Janssen, Lieneke K; Hashemi, Mahur M; Timmer, Monique H M; Geurts, Dirk E M; Ter Huurne, Niels P; Cools, Roshan; Sescousse, Guillaume

    2018-01-01

    Dopamine has been associated with risky decision-making, as well as with pathological gambling, a behavioral addiction characterized by excessive risk-taking behavior. However, the specific mechanisms through which dopamine might act to foster risk-taking and pathological gambling remain elusive. Here we test the hypothesis that this might be achieved, in part, via modulation of subjective probability weighting during decision making. Human healthy controls (n = 21) and pathological gamblers (n = 16) played a decision-making task involving choices between sure monetary options and risky gambles both in the gain and loss domains. Each participant played the task twice, either under placebo or the dopamine D2/D3 receptor antagonist sulpiride, in a double-blind counterbalanced design. A prospect theory modelling approach was used to estimate subjective probability weighting and sensitivity to monetary outcomes. Consistent with prospect theory, we found that participants presented a distortion in the subjective weighting of probabilities, i.e., they overweighted low probabilities and underweighted moderate to high probabilities, both in the gain and loss domains. Compared with placebo, sulpiride attenuated this distortion in the gain domain. Across drugs, the groups did not differ in their probability weighting, although gamblers consistently underweighted losing probabilities in the placebo condition. Overall, our results reveal that dopamine D2/D3 receptor antagonism modulates the subjective weighting of probabilities in the gain domain, in the direction of more objective, economically rational decision making.

  6. Dopaminergic Drug Effects on Probability Weighting during Risky Decision Making

    Science.gov (United States)

    Timmer, Monique H. M.; ter Huurne, Niels P.

    2018-01-01

    Dopamine has been associated with risky decision-making, as well as with pathological gambling, a behavioral addiction characterized by excessive risk-taking behavior. However, the specific mechanisms through which dopamine might act to foster risk-taking and pathological gambling remain elusive. Here we test the hypothesis that this might be achieved, in part, via modulation of subjective probability weighting during decision making. Human healthy controls (n = 21) and pathological gamblers (n = 16) played a decision-making task involving choices between sure monetary options and risky gambles both in the gain and loss domains. Each participant played the task twice, either under placebo or the dopamine D2/D3 receptor antagonist sulpiride, in a double-blind counterbalanced design. A prospect theory modelling approach was used to estimate subjective probability weighting and sensitivity to monetary outcomes. Consistent with prospect theory, we found that participants presented a distortion in the subjective weighting of probabilities, i.e., they overweighted low probabilities and underweighted moderate to high probabilities, both in the gain and loss domains. Compared with placebo, sulpiride attenuated this distortion in the gain domain. Across drugs, the groups did not differ in their probability weighting, although gamblers consistently underweighted losing probabilities in the placebo condition. Overall, our results reveal that dopamine D2/D3 receptor antagonism modulates the subjective weighting of probabilities in the gain domain, in the direction of more objective, economically rational decision making. PMID:29632870
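
    The probability-weighting distortion described in this study is commonly modelled with the one-parameter Tversky-Kahneman function; the abstract states only that "a prospect theory modelling approach" was used, so this exact form is an assumption. A sketch:

    ```python
    import numpy as np

    def tk_weight(p, gamma):
        """Tversky-Kahneman weighting: w(p) = p^g / (p^g + (1-p)^g)^(1/g)."""
        return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

    p = np.array([0.01, 0.1, 0.5, 0.9, 0.99])
    print(tk_weight(p, 0.61))   # gamma < 1: low p overweighted, high p underweighted
    print(tk_weight(p, 1.00))   # gamma = 1: objective, linear weighting
    # In these terms, the reported sulpiride effect amounts to gamma moving closer to 1
    # in the gain domain.
    ```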

  7. A probability space for quantum models

    Science.gov (United States)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, the choice of constraints and the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
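
    The maximum-entropy assignment can be illustrated with a loaded-die example: constraining the mean produces a Gibbs-form distribution, which is precisely the link to the classical ensembles mentioned above. A sketch with an invented constraint value:

    ```python
    import numpy as np
    from scipy.optimize import brentq

    x = np.arange(1, 7)                   # outcomes of a die
    target_mean = 4.5                     # prescribed average (assumed constraint)

    def mean_given(lam):
        w = np.exp(-lam * x)
        return np.sum(x * w) / np.sum(w)

    lam = brentq(lambda l: mean_given(l) - target_mean, -5.0, 5.0)
    p = np.exp(-lam * x)
    p /= p.sum()                          # maximum-entropy pmf p_k ~ exp(-lam * x_k)
    print(p, p @ x)                       # the mean constraint is met exactly
    ```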

  8. Probability and statistics in particle physics

    International Nuclear Information System (INIS)

    Frodesen, A.G.; Skjeggestad, O.

    1979-01-01

    Probability theory is introduced at an elementary level and given a simple and detailed exposition. The material on statistics has been organised with an eye to the experimental physicist's practical needs, which are likely to be statistical methods for estimation or decision-making. The book is intended for graduate students and research workers in experimental high energy and elementary particle physics, and numerous examples from these fields are presented. (JIW)

  9. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...

  10. Numerical experiments on the probability of seepage into underground openings in heterogeneous fractured rock

    International Nuclear Information System (INIS)

    Birkholzer, J.; Li, G.; Tsang, C.F.; Tsang, Y.

    1998-01-01

    An important issue for the performance of underground nuclear waste repositories is the rate of seepage into the waste emplacement drifts. A prediction of this rate is particularly complicated for the potential repository site at Yucca Mountain, Nevada, because it is located in thick, unsaturated, fractured tuff formations. Underground openings in unsaturated media might act as capillary barriers, diverting water around them. In the present work, the authors study the potential rate of seepage into drifts as a function of the percolation flux at Yucca Mountain, based on a stochastic model of the fractured rock mass in the drift vicinity. A variety of flow scenarios are considered, assuming present-day and possible future climate conditions. They show that the heterogeneity of the flow domain is a key factor controlling seepage rates, since it causes channelized flow and local ponding in the unsaturated flow field

  11. openBEB: open biological experiment browser for correlative measurements.

    Science.gov (United States)

    Ramakrishnan, Chandrasekhar; Bieri, Andrej; Sauter, Nora; Roizard, Sophie; Ringler, Philippe; Müller, Shirley A; Goldie, Kenneth N; Enimanev, Kaloyan; Stahlberg, Henning; Rinn, Bernd; Braun, Thomas

    2014-03-26

    New experimental methods must be developed to study interaction networks in systems biology. To reduce biological noise, individual subjects, such as single cells, should be analyzed using high throughput approaches. The measurement of several correlative physical properties would further improve data consistency. Accordingly, a considerable quantity of data must be acquired, correlated, catalogued and stored in a database for subsequent analysis. We have developed openBEB (open Biological Experiment Browser), a software framework for data acquisition, coordination, annotation and synchronization with database solutions such as openBIS. OpenBEB consists of two main parts: A core program and a plug-in manager. Whereas the data-type independent core of openBEB maintains a local container of raw-data and metadata and provides annotation and data management tools, all data-specific tasks are performed by plug-ins. The open architecture of openBEB enables the fast integration of plug-ins, e.g., for data acquisition or visualization. A macro-interpreter allows the automation and coordination of the different modules. An update and deployment mechanism keeps the core program, the plug-ins and the metadata definition files in sync with a central repository. The versatility, the simple deployment and update mechanism, and the scalability in terms of module integration offered by openBEB make this software interesting for a large scientific community. OpenBEB targets three types of researcher, ideally working closely together: (i) Engineers and scientists developing new methods and instruments, e.g., for systems-biology, (ii) scientists performing biological experiments, (iii) theoreticians and mathematicians analyzing data. The design of openBEB enables the rapid development of plug-ins, which will inherently benefit from the "house keeping" abilities of the core program. We report the use of openBEB to combine live cell microscopy, microfluidic control and visual

  12. A microcomputer program for energy assessment and aggregation using the triangular probability distribution

    Science.gov (United States)

    Crovelli, R.A.; Balay, R.H.

    1991-01-01

    A general risk-analysis method was developed for petroleum-resource assessment and other applications. The triangular probability distribution is used as a model with an analytic aggregation methodology based on probability theory rather than Monte-Carlo simulation. Among the advantages of the analytic method are its computational speed and flexibility, and the saving of time and cost on a microcomputer. The input into the model consists of a set of components (e.g. geologic provinces) and, for each component, three potential resource estimates: minimum, most likely (mode), and maximum. Assuming a triangular probability distribution, the mean, standard deviation, and seven fractiles (F100, F95, F75, F50, F25, F5, and F0) are computed for each component, where for example, the probability of more than F95 is equal to 0.95. The components are aggregated by combining the means, standard deviations, and respective fractiles under three possible situations (1) perfect positive correlation, (2) complete independence, and (3) any degree of dependence between these two polar situations. A package of computer programs named the TRIAGG system was written in the Turbo Pascal 4.0 language for performing the analytic probabilistic methodology. The system consists of a program for processing triangular probability distribution assessments and aggregations, and a separate aggregation routine for aggregating aggregations. The user's documentation and program diskette of the TRIAGG system are available from USGS Open File Services. TRIAGG requires an IBM-PC/XT/AT compatible microcomputer with 256 kbyte of main memory, MS-DOS 3.1 or later, either two diskette drives or a fixed disk, and a 132 column printer. A graphics adapter and color display are optional. © 1991.
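
    The analytic core of such a system reduces to closed-form triangular-distribution moments plus standard aggregation rules. A sketch of that arithmetic with invented assessments (this is not the TRIAGG code):

    ```python
    import math

    def triangular_moments(low, mode, high):
        """Mean and standard deviation of a triangular distribution."""
        mean = (low + mode + high) / 3.0
        var = (low**2 + mode**2 + high**2 - low*mode - low*high - mode*high) / 18.0
        return mean, math.sqrt(var)

    # Two hypothetical provinces assessed as (minimum, most likely, maximum):
    stats = [triangular_moments(*p) for p in [(0.0, 2.0, 10.0), (1.0, 3.0, 8.0)]]

    agg_mean = sum(m for m, _ in stats)
    sd_independent = math.hypot(*(s for _, s in stats))   # variances add
    sd_perfect_corr = sum(s for _, s in stats)            # standard deviations add
    print(agg_mean, sd_independent, sd_perfect_corr)
    ```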

  13. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  14. An Upper Bound on High Speed Satellite Collision Probability When Only One Object has Position Uncertainty Information

    Science.gov (United States)

    Frisbee, Joseph H., Jr.

    2015-01-01

    Upper bounds on high speed satellite collision probability, Pc, have been investigated. Previous methods assume that an individual position error covariance matrix is available for each object, the two matrices being combined into a single relative position error covariance matrix. Components of the combined error covariance are then varied to obtain a maximum Pc. If error covariance information for only one of the two objects is available, either some default shape has been used or nothing could be done. An alternative is presented that uses the known covariance information along with a critical value of the missing covariance to obtain an approximate but potentially useful Pc upper bound.
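
    The short-encounter Pc that such bounds are built on is a two-dimensional Gaussian integral of the relative-position error over the combined hard-body circle. A sketch with an invented covariance and miss vector:

    ```python
    import numpy as np
    from scipy import integrate

    sx, sy = 100.0, 300.0   # m, principal std devs of the relative position error (assumed)
    mx, my = 200.0, 500.0   # m, predicted miss vector in the encounter plane (assumed)
    R = 20.0                # m, combined hard-body radius (assumed)

    def pdf(y, x):          # scipy's dblquad passes (y, x)
        return np.exp(-0.5*((x - mx)/sx)**2 - 0.5*((y - my)/sy)**2) / (2*np.pi*sx*sy)

    pc, _ = integrate.dblquad(pdf, -R, R,
                              lambda x: -np.sqrt(R*R - x*x),
                              lambda x: np.sqrt(R*R - x*x))
    print(f"Pc = {pc:.1e}")
    # The paper's setting: with one covariance unknown, sweep the missing covariance
    # and keep the maximum Pc as a conservative upper bound.
    ```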

  15. Negative quasi-probability as a resource for quantum computation

    International Nuclear Information System (INIS)

    Veitch, Victor; Ferrie, Christopher; Emerson, Joseph; Gross, David

    2012-01-01

    A central problem in quantum information is to determine the minimal physical resources that are required for quantum computational speed-up and, in particular, for fault-tolerant quantum computation. We establish a remarkable connection between the potential for quantum speed-up and the onset of negative values in a distinguished quasi-probability representation, a discrete analogue of the Wigner function for quantum systems of odd dimension. This connection allows us to resolve an open question on the existence of bound states for magic state distillation: we prove that there exist mixed states outside the convex hull of stabilizer states that cannot be distilled to non-stabilizer target states using stabilizer operations. We also provide an efficient simulation protocol for Clifford circuits that extends to a large class of mixed states, including bound universal states. (paper)

  16. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability...
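
    The simple benchmark that the integral-equation method refines is the Poisson (independent-crossings) approximation built on Rice's up-crossing rate. A sketch with assumed process parameters:

    ```python
    import math

    sigma_x, sigma_v = 1.0, 6.0   # std devs of the process and its derivative (assumed)
    b, T = 3.0, 60.0              # barrier level and duration (assumed)

    # Rice's mean up-crossing rate for a stationary Gaussian process
    nu_plus = (sigma_v / (2 * math.pi * sigma_x)) * math.exp(-b**2 / (2 * sigma_x**2))

    print(1 - math.exp(-nu_plus * T))   # Poisson estimate of the first-passage probability
    ```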

  17. High-resolution elastic recoil detection utilizing Bayesian probability theory

    International Nuclear Information System (INIS)

    Neumaier, P.; Dollinger, G.; Bergmaier, A.; Genchev, I.; Goergens, L.; Fischer, R.; Ronning, C.; Hofsaess, H.

    2001-01-01

    Elastic recoil detection (ERD) analysis is improved in view of depth resolution and the reliability of the measured spectra. Good statistics even at low ion fluences is obtained utilizing a large solid angle of 5 msr at the Munich Q3D magnetic spectrograph and using a 40 MeV 197Au beam. In this way the elemental depth profiles are not essentially altered during analysis even if distributions with area densities below 1x10^14 atoms/cm^2 are measured. As the energy spread due to the angular acceptance is fully eliminated by ion-optical and numerical corrections, an accurate and reliable apparatus function is derived. It allows deconvolution of the measured spectra using the adaptive kernel method, a maximum entropy concept in the framework of Bayesian probability theory. In addition, the uncertainty of the reconstructed spectra is quantified. The concepts are demonstrated on 13C depth profiles measured on ultra-thin films of tetrahedral amorphous carbon (ta-C). Depth scales of those profiles are given with an accuracy of 1.4x10^15 atoms/cm^2

  18. CGC/saturation approach for soft interactions at high energy: survival probability of central exclusive production

    Energy Technology Data Exchange (ETDEWEB)

    Gotsman, E.; Maor, U. [Tel Aviv University, Department of Particle Physics, Raymond and Beverly Sackler Faculty of Exact Science, School of Physics and Astronomy, Tel Aviv (Israel); Levin, E. [Tel Aviv University, Department of Particle Physics, Raymond and Beverly Sackler Faculty of Exact Science, School of Physics and Astronomy, Tel Aviv (Israel); Universidad Tecnica Federico Santa Maria, Departemento de Fisica, Centro Cientifico-Tecnologico de Valparaiso, Valparaiso (Chile)

    2016-04-15

    We estimate the value of the survival probability for central exclusive production in a model which is based on the CGC/saturation approach. Hard and soft processes are described in the same framework. At LHC energies, we obtain a small value for the survival probability. The source of the small value is the impact parameter dependence of the hard amplitude. Our model has successfully described a large body of soft data: elastic, inelastic and diffractive cross sections, inclusive production and rapidity correlations, as well as the t-dependence of deep inelastic diffractive production of vector mesons. (orig.)

  19. An experimental and simulation study of novel channel designs for open-cathode high-temperature polymer electrolyte membrane fuel cells

    DEFF Research Database (Denmark)

    Thomas, Sobi; Bates, Alex; Park, Sam

    2016-01-01

    A minimum balance of plant (BOP) is desired for an open-cathode high temperature polymer electrolyte membrane (HTPEM) fuel cell to ensure low parasitic losses and a compact design. The advantage of an open-cathode system is the elimination of the coolant plate and incorporation of a blower for ox...

  20. DNA Open states and DNA hydratation

    International Nuclear Information System (INIS)

    Lema-Larre, B. de; Martin-Landrove, M

    1995-01-01

    It is a well-known fact that proton exchange occurs between natural DNA filaments or synthetic polynucleotides and the solvent (1-2). The existence of DNA open states, that is, states in which the interior of the DNA molecule is exposed to the external environment, has been demonstrated by means of proton-deuterium exchange (3). In this work, experiments were carried out measuring the dispersion of the transverse relaxation rate (4) as a function of the pulse rate in a Carr-Purcell-Meiboom-Gill (CPMG) pulse sequence, to determine changes in the hydration layer of the DNA molecule. The experiments were carried out under different experimental conditions, such as temperature or exposure to electromagnetic fields, in order to vary the probability that open states occur. Several theoretical models, including some related to DNA nonlinear dynamics, were used to fit the experimental results

  1. Impact probabilities of meteoroid streams with artificial satellites: An assessment

    International Nuclear Information System (INIS)

    Foschini, L.; Cevolani, G.

    1997-01-01

    Impact probabilities of artificial satellites with meteoroid streams were calculated using data collected with the CNR forward scatter (FS) bistatic radar over the Bologna-Lecce baseline (about 700 km). Results show that impact probabilities are 2 times higher than previously calculated values. Nevertheless, although catastrophic impacts are still rare even under meteor storm conditions, it is expected that high meteoroid fluxes can erode satellite surfaces and weaken their external structures

  2. Spacetime quantum probabilities II: Relativized descriptions and Popperian propensities

    Science.gov (United States)

    Mugur-Schächter, M.

    1992-02-01

    In the first part of this work(1) we explicated the spacetime structure of the probabilistic organization of quantum mechanics. We showed that each quantum mechanical state, in consequence of the spacetime characteristics of the epistemic operations by which the observer produces the state to be studied and of the processes of qualification of these, brings in a tree-like spacetime structure, a “quantum mechanical probability tree,” that transgresses the theory of probabilities as it now stands. In this second part we develop the general implications of these results. Starting from the lowest level of cognitive action and creating an appropriate symbolism, we construct a “relativizing epistemic syntax,” a “general method of relativized conceptualization” in which, systematically, each description is explicitly referred to the epistemic operations by which the observer produces the entity to be described and obtains qualifications of it. The method generates a typology of increasingly complex relativized descriptions in which the question of realism admits of a particularly clear pronouncement. Inside this typology, the epistemic processes that lie, universally, at the basis of any conceptualization reveal a tree-like spacetime structure. It appears in particular that the spacetime structure of the relativized representation of a probabilistic description, which transgresses the present-day theory of probabilities, is the general mould of which the quantum mechanical probability trees are only particular realizations. This entails a clear definition of the descriptional status of quantum mechanics, while the recognition of the universal cognitive content of the quantum mechanical formalism opens up vistas toward mathematical developments of the relativizing epistemic syntax. The relativized representation of a probabilistic description leads with inner necessity to a “morphic” interpretation of probabilities that can be regarded as a formalized and

  3. Five-Phase Five-Level Open-Winding/Star-Winding Inverter Drive for Low-Voltage/High-Current Applications

    DEFF Research Database (Denmark)

    Padmanaban, Sanjeevi Kumar; Blaabjerg, Frede; Wheeler, Patrick

    2016-01-01

    This paper proposes a five-phase five-level open-/star-winding multilevel AC converter suitable for low-voltage/high-current applications. The modular converter consists of a classical two-level five-phase voltage source inverter (VSI) with slight reconfiguration to serve as a multilevel converter for open-/star-winding loads. In detail, each phase of the VSI is built with one additional bi-directional switch (MOSFET/IGBT), and all five legs link to the neutral through two capacitors. The structure allows multilevel generation up to a five-level output with greater potential for fault tolerability under...

  4. Invariant probabilities of transition functions

    CERN Document Server

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  5. Nickel oxide film with open macropores fabricated by surfactant-assisted anodic deposition for high capacitance supercapacitors.

    Science.gov (United States)

    Wu, Mao-Sung; Wang, Min-Jyle

    2010-10-07

    Nickel oxide film with open macropores prepared by anodic deposition in the presence of surfactant shows a very high capacitance of 1110 F g(-1) at a scan rate of 10 mV s(-1), and the capacitance value reduces to 950 F g(-1) at a high scan rate of 200 mV s(-1).

  6. Probability & Statistics: Modular Learning Exercises. Teacher Edition

    Science.gov (United States)

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The modules also introduce students to real world math concepts and problems that property and casualty actuaries come across in their work. They are designed to be used by teachers and…

  7. Probability & Statistics: Modular Learning Exercises. Student Edition

    Science.gov (United States)

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The materials are centered on the fictional town of Happy Shores, a coastal community which is at risk for hurricanes. Actuaries at an insurance company figure out the risks and…

  8. The concept of probability

    International Nuclear Information System (INIS)

    Bitsakis, E.I.; Nicolaides, C.A.

    1989-01-01

    The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our everyday life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs

  9. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models.
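    As a concrete instance of the gradient property stated above, the multinomial logit model corresponds to the log-sum-exp CPGF:

```latex
% Multinomial logit as a CPGF instance: G is log-sum-exp, and its
% gradient returns the familiar logit choice probabilities.
G(u_1,\dots,u_J) = \log\sum_{j=1}^{J} e^{u_j},
\qquad
P_i = \frac{\partial G}{\partial u_i} = \frac{e^{u_i}}{\sum_{j=1}^{J} e^{u_j}}.
```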

  10. Multistate and multihypothesis discrimination with open quantum systems

    Science.gov (United States)

    Kiilerich, Alexander Holm; Mølmer, Klaus

    2018-05-01

    We show how an upper bound for the ability to discriminate any number N of candidates for the Hamiltonian governing the evolution of an open quantum system may be calculated by numerically efficient means. Our method applies an effective master-equation analysis to evaluate the pairwise overlaps between candidate full states of the system and its environment pertaining to the Hamiltonians. These overlaps are then used to construct an N -dimensional representation of the states. The optimal positive-operator valued measure (POVM) and the corresponding probability of assigning a false hypothesis may subsequently be evaluated by phrasing optimal discrimination of multiple nonorthogonal quantum states as a semidefinite programming problem. We provide three realistic examples of multihypothesis testing with open quantum systems.

  11. The high-risk HPV infection and urinary system tumor

    Directory of Open Access Journals (Sweden)

    Yang Wenyan

    2018-04-01

    HPV is classified into high-risk and low-risk types depending on its probability of leading to tumorigenesis. Many studies have shown that HPV infection, especially infection by the high-risk types, is frequently associated with prostate cancer, bladder cancer, penile cancer, testicular cancer, and other urinary system tumors. However, previous studies differed in the sexual openness and racial genetic susceptibility of the study populations, in sample size, and in experimental methods. Hence, the correlation between high-risk HPV infection and urinary system tumors remains controversial. The early open reading frame of the HPV genome is composed of E1–E7, among which E6 and E7 are the key transforming proteins. The combination of these proteins with oncogenes and anti-oncogenes may be one of the mechanisms leading to tumorigenesis.

  12. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  13. The probability-outcome correspondence principle: a dispositional view of the interpretation of probability statements

    NARCIS (Netherlands)

    Keren, G.; Teigen, K.H.

    2001-01-01

    This article presents a framework for lay people's internal representations of probabilities, which supposedly reflect the strength of underlying dispositions, or propensities, associated with the predicted event. From this framework, we derive the probability-outcome correspondence principle, which

  14. Poisson Processes in Free Probability

    OpenAIRE

    An, Guimei; Gao, Mingchu

    2015-01-01

    We prove a multidimensional Poisson limit theorem in free probability, and define joint free Poisson distributions in a non-commutative probability space. We define (compound) free Poisson processes explicitly, similarly to the definitions of (compound) Poisson processes in classical probability. We prove that the sum of finitely many freely independent compound free Poisson processes is a compound free Poisson process. We give a step-by-step procedure for constructing a (compound) free Poisso...

  15. Solving probability reasoning based on DNA strand displacement and probability modules.

    Science.gov (United States)

    Zhang, Qiang; Wang, Xiaobiao; Wang, Xiaojun; Zhou, Changjun

    2017-12-01

    In computational biology, DNA strand displacement technology is used to simulate the computation process and has shown strong computing ability. Most researchers use it to solve logic problems, but it is only rarely used in probabilistic reasoning. To process probabilistic reasoning, a conditional probability derivation model and a total probability model based on DNA strand displacement were established in this paper. The models were assessed through the game "read your mind." They have been shown to enable the application of probabilistic reasoning in genetic diagnosis. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. The role of grazing in nutrient-rich areas of the open sea

    International Nuclear Information System (INIS)

    Frost, B.W.

    1991-01-01

    No single factor accounts fully for the persistently low phytoplankton stocks in the nutrient-rich areas of the open sea. However, grazing plays the necessary role of consuming phytoplankton produced in excess of losses due to physical processes and sinking. Without grazing, even if the specific growth rate of the phytoplankton is less than optimal for the prevailing light and temperature conditions, as might be so under limitation by a trace nutrient such as Fe, the phytoplankton stock would still accumulate with attendant depletion of nutrients. Observations during spring and summer in the open subarctic Pacific argue against limitation of phytoplankton growth to the point where phytoplankton stock could not increase in the absence of grazing. An ecosystem process model of the phytoplankton-grazer interaction suggests that two processes - grazing control of phytoplankton stock and preferential utilization of NH4 by the phytoplankton - are sufficient to explain the continuously low phytoplankton stock and high concentrations of macronutrients. However, the grazing control may be exerted on a phytoplankton assemblage structured by Fe limitation. In particular, the intrinsic growth rates of potentially fast-growing diatoms seem to be depressed in the open subarctic Pacific. These conditions probably apply to two other nutrient-rich areas of the open sea, the Pacific equatorial upwelling region and the subantarctic circumpolar ocean, although in the latter region light limitation of phytoplankton growth may be more severe and silica limitation may influence the specific composition of the phytoplankton assemblage

  17. Sensitivity of the probability of failure to probability of detection curve regions

    International Nuclear Information System (INIS)

    Garza, J.; Millwater, H.

    2016-01-01

    Non-destructive inspection (NDI) techniques have been shown to play a vital role in fracture control plans, structural health monitoring, and ensuring the availability and reliability of piping, pressure vessels, and mechanical and aerospace equipment. Probabilistic fatigue simulations are often used to determine the efficacy of an inspection procedure, with the NDI method modeled as a probability of detection (POD) curve. These simulations can be used to determine the most advantageous NDI method for a given application. As an aid to this process, a first-order method for the sensitivity of the probability-of-failure (POF) with respect to regions of the POD curve (lower tail, middle region, right tail) is developed and presented here. The sensitivity method computes the partial derivative of the POF with respect to a change in each region of a POD curve, or of multiple POD curves. The sensitivities are computed at no cost by reusing the samples from an existing Monte Carlo (MC) analysis. A numerical example is presented considering single and multiple inspections. - Highlights: • Sensitivities of probability-of-failure to a region of probability-of-detection curve. • The sensitivities are computed with negligible cost. • Sensitivities identify the important region of a POD curve. • Sensitivities can be used as a guide to selecting the optimal POD curve.
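    A minimal sketch of the sample-reuse idea follows, assuming a toy crack-size model and a log-logistic POD curve (all distributions and parameters below are hypothetical). Because the same Monte Carlo draws are reused for the perturbed POD, the sensitivity estimate costs no extra sampling.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Toy setting: a part fails if its crack exceeds a critical size AND the
# inspection missed it. All distributions/parameters here are hypothetical.
a = rng.lognormal(mean=0.0, sigma=0.5, size=N)   # crack sizes (mm)
a_crit = 2.5

def pod(a, a50=1.0, beta=4.0):
    """Log-logistic POD curve, a common parametric form."""
    return 1.0 / (1.0 + (a50 / a)**beta)

u = rng.uniform(size=N)          # one detection draw per sample, held fixed

def pof(pod_values):
    detected = u < pod_values
    return np.mean((a > a_crit) & ~detected)

base = pof(pod(a))

# Sensitivity to a small bump of the POD right tail (a > 2 mm), reusing
# the SAME samples (common random numbers).
eps = 1e-3
bumped = np.clip(pod(a) + eps * (a > 2.0), 0.0, 1.0)
print("POF:", base, " dPOF/d(right tail):", (pof(bumped) - base) / eps)
```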

  18. OpenCL programming guide

    CERN Document Server

    Munshi, Aaftab; Mattson, Timothy G; Fung, James; Ginsburg, Dan

    2011-01-01

    Using the new OpenCL (Open Computing Language) standard, you can write applications that access all available programming resources: CPUs, GPUs, and other processors such as DSPs and the Cell/B.E. processor. Already implemented by Apple, AMD, Intel, IBM, NVIDIA, and other leaders, OpenCL has outstanding potential for PCs, servers, handheld/embedded devices, high performance computing, and even cloud systems. This is the first comprehensive, authoritative, and practical guide to OpenCL 1.1 specifically for working developers and software architects. Written by five leading OpenCL authorities, OpenCL Programming Guide covers the entire specification. It reviews key use cases, shows how OpenCL can express a wide range of parallel algorithms, and offers complete reference material on both the API and OpenCL C programming language. Through complete case studies and downloadable code examples, the authors show how to write complex parallel programs that decompose workloads across many different devices. They...

  19. Quantifying probabilities of eruptions at Mount Etna (Sicily, Italy).

    Science.gov (United States)

    Brancato, Alfonso

    2010-05-01

    One of the major goals of modern volcanology is to set up sound risk-based decision-making in land-use planning and emergency management. Volcanic hazard must be managed with reliable quantitative estimates of long- and short-term eruption forecasting, and the large number of observables involved in a volcanic process suggests that a probabilistic approach is a suitable forecasting tool. The aim of this work is to quantify a probabilistic estimate of the vent location for a suitable lava flow hazard assessment at Mt. Etna volcano, through the application of the code named BET (Marzocchi et al., 2004, 2008). The BET_EF model is based on the event tree philosophy assessed by Newhall and Hoblitt (2002), further developing the concepts of vent location, epistemic uncertainties, and a fuzzy approach for monitoring measurements. A Bayesian event tree is a specialized branching graphical representation of events in which individual branches are alternative steps from a general prior event, evolving into increasingly specific subsequent states. The event tree thus attempts to display graphically all relevant possible outcomes of volcanic unrest in progressively higher levels of detail. The procedure is set to estimate an a priori probability distribution based upon theoretical knowledge, to update it using past data, and to modify it further using current monitoring data. For long-term forecasting, an a priori model dealing with the present tectonic and volcanic structure of Mt. Etna is considered. The model is mainly based on past vent locations and fracture location datasets (the XX century eruptive history of the volcano). Considering the variation of this information through time and its relationship with the structural setting of the volcano, we are also able to define an a posteriori probability map for the next vent opening. For short-term vent opening hazard assessment, monitoring has the leading role, primarily
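    The prior-plus-past-data step of such an event-tree scheme can be illustrated with a conjugate Bayesian update; the sector weights and vent counts below are hypothetical, not Etna data.

```python
import numpy as np

# Divide the edifice into sectors, put a Dirichlet prior on vent-opening
# location, and update it with past vent counts. All numbers hypothetical.
prior_weight = np.array([0.4, 0.3, 0.2, 0.1])  # structural a priori model
strength = 10.0                                # equivalent prior sample size
alpha = strength * prior_weight                # Dirichlet parameters

past_vents = np.array([12, 3, 2, 1])           # vents per sector, past record
posterior = (alpha + past_vents) / (alpha.sum() + past_vents.sum())
print("posterior vent-opening probabilities:", posterior.round(3))
```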

  20. Recommendations for the tuning of rare event probability estimators

    International Nuclear Information System (INIS)

    Balesdent, Mathieu; Morio, Jérôme; Marzat, Julien

    2015-01-01

    Accurately estimating rare event probabilities is a challenging issue in improving the reliability of complex systems. Several powerful methods such as importance sampling, importance splitting or extreme value theory have been proposed in order to reduce the computational cost and to improve the accuracy of extreme probability estimation. However, the performance of these methods is highly correlated with the choice of tuning parameters, which are very difficult to determine. In order to highlight recommended tunings for such methods, an empirical campaign of automatic tuning on a set of representative test cases is conducted for splitting methods. This provides a reduced set of tuning parameters that may lead to reliable estimation of rare event probabilities for various problems. The relevance of the obtained results is assessed on a series of real-world aerospace problems
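    A small example of why tuning matters for such estimators: an importance-sampling estimate of a Gaussian tail probability of order 1e-9, where the tilt parameter mu plays the role of the tuning parameter (the choices here are illustrative, and the paper itself focuses on splitting methods).

```python
import numpy as np

rng = np.random.default_rng(42)
t, n = 6.0, 100_000            # target: P(X > 6), X ~ N(0,1), about 1e-9

# Crude Monte Carlo would need ~1e11 samples; tilt the proposal instead.
mu = t                          # the tuning parameter: exponential tilt
x = rng.normal(mu, 1.0, n)      # sample from N(mu, 1)
w = np.exp(-mu * x + 0.5 * mu**2)   # likelihood ratio phi(x) / phi_mu(x)
print(np.mean((x > t) * w))     # compare with the exact value ~9.87e-10
```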

  1. Truth, possibility and probability new logical foundations of probability and statistical inference

    CERN Document Server

    Chuaqui, R

    1991-01-01

    Anyone involved in the philosophy of science is naturally drawn into the study of the foundations of probability. Different interpretations of probability, based on competing philosophical ideas, lead to different statistical techniques, and frequently to mutually contradictory consequences. This unique book presents a new interpretation of probability, rooted in the traditional interpretation that was current in the 17th and 18th centuries. Mathematical models are constructed based on this interpretation, and statistical inference and decision theory are applied, including some examples in artificial intelligence, solving the main foundational problems. Nonstandard analysis is extensively developed for the construction of the models and in some of the proofs. Many nonstandard theorems are proved, some of them new, in particular, a representation theorem that asserts that any stochastic process can be approximated by a process defined over a space with equiprobable outcomes.

  2. Kepler Planet Reliability Metrics: Astrophysical Positional Probabilities for Data Release 25

    Science.gov (United States)

    Bryson, Stephen T.; Morton, Timothy D.

    2017-01-01

    This document is very similar to KSCI-19092-003, Planet Reliability Metrics: Astrophysical Positional Probabilities, which describes the previous release of the astrophysical positional probabilities for Data Release 24. The important changes for Data Release 25 are: 1. The computation of the astrophysical positional probabilities uses the Data Release 25 processed pixel data for all Kepler Objects of Interest. 2. Computed probabilities now have associated uncertainties, whose computation is described in §4.1.3. 3. The scene modeling described in §4.1.2 uses background stars detected via ground-based high-resolution imaging, described in §5.1, that are not in the Kepler Input Catalog or UKIRT catalog. These newly detected stars are presented in Appendix B. Otherwise the text describing the algorithms and examples is largely unchanged from KSCI-19092-003.

  3. Logic, probability, and human reasoning.

    Science.gov (United States)

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

    Motivated by the increasing need to analyze complex, uncertain multidimensional data, this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how the distributions can be subsequently used in pre-aggregation. Since the probability distributions can become large, we show how to achieve good time and space efficiency by approximating the distributions. We present the results of several experiments that demonstrate the effectiveness of our methods. The work is motivated by a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions.

  5. Probability judgments under ambiguity and conflict.

    Science.gov (United States)

    Smithson, Michael

    2015-01-01

    Whether conflict and ambiguity are distinct kinds of uncertainty remains an open question, as does their joint impact on judgments of overall uncertainty. This paper reviews recent advances in our understanding of human judgment and decision making when both ambiguity and conflict are present, and presents two types of testable models of judgments under conflict and ambiguity. The first type concerns estimate-pooling to arrive at "best" probability estimates. The second type is models of subjective assessments of conflict and ambiguity. These models are developed for dealing with both described and experienced information. A framework for testing these models in the described-information setting is presented, including a reanalysis of a multi-nation data-set to test best-estimate models, and a study of participants' assessments of conflict, ambiguity, and overall uncertainty reported by Smithson (2013). A framework for research in the experienced-information setting is then developed, that differs substantially from extant paradigms in the literature. This framework yields new models of "best" estimates and perceived conflict. The paper concludes with specific suggestions for future research on judgment and decision making under conflict and ambiguity.

  6. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Basic Notions: Sample Space and Events; Probabilities; Counting Techniques. Independence and Conditional Probability: Independence; Conditioning; The Borel-Cantelli Theorem. Discrete Random Variables: Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables; The Law of Large Numbers; Conditional Expectation. Generating Functions; Branching Processes; Random Walk Revisited: Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk. Markov Chains: Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity. Continuous Random Variables: Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation; Distribution F...

  7. High School Coaches' Experiences With Openly Lesbian, Gay, and Bisexual Athletes.

    Science.gov (United States)

    Halbrook, Meghan K; Watson, Jack C; Voelker, Dana K

    2018-01-17

    Despite reports that there has been a positive trend in perception and treatment of lesbian, gay, and bisexual (LGB) individuals in recent years (Griffin, 2012; Loftus, 2001), sport, in general, is still an uncertain, and sometimes even hostile, environment for LGB athletes (Anderson, 2005; Waldron & Krane, 2005). To gain more information on coach understanding and perceptions of the team environment, 10 high school head coaches in the United States were interviewed to explore their experiences coaching openly LGB athletes. Qualitative analyses revealed four primary themes associated with coach experiences: team environment dogmas and observations, fundamental beliefs contributing to perceptions of LGB athletes, types and timing of sexual orientation disclosure, and differential LGB athlete characteristics. Future research should examine these primary themes in more detail through interviews with LGB athletes, as well as high school coaches in more traditionally masculine sports, such as football, men's basketball, and wrestling.

  8. Open3DQSAR

    DEFF Research Database (Denmark)

    Tosco, Paolo; Balle, Thomas

    2011-01-01

    Open3DQSAR is an open-source tool aimed at pharmacophore exploration by high-throughput chemometric analysis of molecular interaction fields (MIFs). Open3DQSAR can generate steric potential, electron density and MM/QM electrostatic potential fields; furthermore, it can import GRIDKONT binary files produced by GRID and CoMFA/CoMSIA fields (exported from SYBYL with the aid of a small SPL script). Subsequently, Open3DQSAR performs fast, automated PLS chemometric analysis of MIFs, allowing many 3D-QSAR models to be quickly generated and their predictivity challenged using different training sets. Features include: integration with OpenBabel, PyMOL, gnuplot; multi-threaded computation of MIFs (both MM and QM), with support for MMFF94 and GAFF force-fields and automated assignment of atom types to the imported molecular structures; comprehensive output, including SDF molecular databases, 3D maps and many different plots...

  9. Are voluntary wheel running and open-field behavior correlated in mice? Different answers from comparative and artificial selection approaches.

    Science.gov (United States)

    Careau, Vincent; Bininda-Emonds, Olaf R P; Ordonez, Genesis; Garland, Theodore

    2012-09-01

    Voluntary wheel running and open-field behavior are probably the two most widely used measures of locomotion in laboratory rodents. We tested whether these two behaviors are correlated in mice using two approaches: the phylogenetic comparative method using inbred strains of mice and an ongoing artificial selection experiment on voluntary wheel running. After taking into account the measurement error and phylogenetic relationships among inbred strains, we obtained a significant positive correlation between distance run on wheels and distance moved in the open-field for both sexes. Thigmotaxis was negatively correlated with distance run on wheels in females but not in males. By contrast, mice from four replicate lines bred for high wheel running did not differ in either distance covered or thigmotaxis in the open field as compared with mice from four non-selected control lines. Overall, results obtained in the selection experiment were generally opposite to those observed among inbred strains. Possible reasons for this discrepancy are discussed.

  10. NPV risk simulation of an open pit gold mine project under the O'Hara cost model by using GAs

    Institute of Scientific and Technical Information of China (English)

    Franco-Sepulveda Giovanni; Campuzano Carlos; Pineda Cindy

    2017-01-01

    This paper analyzes an open pit gold mine project based on the O'Hara cost model. Hypothetical data are proposed based on different authors who have studied open pit gold projects, and variations are proposed according to the probability distributions associated with key variables affecting the NPV, such as production level, ore grade, and ore price, so as to perform what-if analysis for a gold open pit mine project treating 3000 metric tons of ore per day. Two scenarios were analyzed to simulate the NPV: one where only low-certainty data are available, and another where the available information is of high certainty. Results are reported from genetic algorithm metaheuristic simulations, which essentially combine Monte Carlo simulations provided by the Palisade @RISK software, the O'Hara cost model, net smelter return, and financial analysis tools offered by Excel, in order to determine to which variables of the project the NPV is most sensitive.
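    A minimal sketch of the Monte Carlo side of such an analysis follows, with hypothetical distributions standing in for the paper's data; the O'Hara cost relations and the genetic-algorithm layer are omitted.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 50_000

# Hypothetical distributions for the key uncertain variables.
grade = rng.normal(1.2, 0.2, n).clip(min=0.3)     # ore grade, g/t
price = rng.lognormal(np.log(40.0), 0.25, n)      # gold price, $/g
recov = rng.triangular(0.80, 0.88, 0.93, n)       # metallurgical recovery
opex  = rng.normal(25.0, 3.0, n)                  # operating cost, $/t ore
tpd, days, years, r, capex = 3000, 350, 10, 0.10, 120e6

annual_cash = tpd * days * (grade * recov * price - opex)
annuity = sum(1.0 / (1.0 + r)**k for k in range(1, years + 1))
npv = -capex + annual_cash * annuity
print(f"P(NPV < 0) = {np.mean(npv < 0):.1%}, "
      f"median NPV = {np.median(npv)/1e6:.0f} M$")
```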

  11. Strong lensing probability in TeVeS (tensor–vector–scalar) theory

    International Nuclear Information System (INIS)

    Chen Daming

    2008-01-01

    We recalculate the strong lensing probability as a function of the image separation in TeVeS (tensor–vector–scalar) cosmology, which is a relativistic version of MOND (MOdified Newtonian Dynamics). The lens is modeled by the Hernquist profile. We assume an open cosmology with Ω_b = 0.04 and Ω_Λ = 0.5 and three different kinds of interpolating functions. Two different galaxy stellar mass functions (GSMF) are adopted: PHJ (Panter, Heavens and Jimenez 2004 Mon. Not. R. Astron. Soc. 355 764), determined from SDSS data release 1, and Fontana (Fontana et al 2006 Astron. Astrophys. 459 745), from the GOODS-MUSIC catalog. We compare our results with both the predicted probabilities for lenses from singular isothermal sphere galaxy halos in LCDM (Lambda cold dark matter) with a Schechter-fit velocity function, and the observational results for the well defined combined sample of the Cosmic Lens All-Sky Survey (CLASS) and Jodrell Bank/Very Large Array Astrometric Survey (JVAS). It turns out that the interpolating function μ(x) = x/(1+x) combined with the Fontana GSMF matches the results from CLASS/JVAS quite well

  12. Probability tales

    CERN Document Server

    Grinstead, Charles M; Snell, J Laurie

    2011-01-01

    This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.

  13. Proposal of a Python interface to OpenMI, as the base for open source hydrological framework

    Directory of Open Access Journals (Sweden)

    Robert Szczepanek

    2012-03-01

    Hydrologists need a simple yet powerful open source framework for developing and testing mathematical models. Such a framework should ensure long-term interoperability and high scalability. This can be done by implementing existing, already tested standards. At the moment two interesting options exist: the Open Modelling Interface (OpenMI) and the Object Modeling System (OMS). OpenMI was developed within the Fifth European Framework Programme for integrated watershed management, described in the Water Framework Directive. OpenMI interfaces are available for the C# and Java programming languages. The OpenMI Association is now in the process of agreement with the Open Geospatial Consortium (OGC), so the spatial standards existing in OpenMI 2.0 should be better implemented in the future. The OMS project is a pure Java, object-oriented modeling framework coordinated by the U.S. Department of Agriculture. A big advantage of OMS compared to OpenMI is its simplicity of implementation. On the other hand, OpenMI seems to be more powerful and better suited for hydrological models. Finally, the OpenMI model was selected as the base interface for the proposed open source hydrological framework. The existing hydrological libraries and models usually focus on just one GIS package (HydroFOSS – GRASS) or one operating system (HydroDesktop – Microsoft Windows). The new hydrological framework should break those limitations. To make the implementation of hydrological models as easy as possible, the framework should be based on a simple, high-level computer language. Low- and mid-level languages, like Java (SEXTANTE) or C (GRASS, SAGA), were excluded as too complicated for a regular hydrologist. Among popular high-level languages, Python seems to be a good choice. The leading GIS desktop applications, GRASS and QGIS, use Python as a second native language, providing a well documented API. This way, a Python-based hydrological library could be easily integrated with any GIS package supporting
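    To make the interface idea concrete, here is a minimal sketch of what an OpenMI-style linkable component could look like in Python. The class and method names only mirror the flavour of the OpenMI 2.0 interface (initialize/update/get_values); they are illustrative, not the official binding.

```python
# Names below only mirror the flavour of the OpenMI 2.0 interface
# (ILinkableComponent); they are illustrative, not the official binding.
class LinkableComponent:
    def initialize(self, config): ...
    def set_values(self, item, value): ...
    def update(self): ...                 # advance the model one time step
    def get_values(self, item): ...       # exchange item, e.g. "discharge"
    def finalize(self): ...

class LinearReservoir(LinkableComponent):
    """Toy rainfall-runoff model exposing discharge to other components."""
    def initialize(self, config):
        self.k, self.storage, self.discharge = config["k"], 0.0, 0.0
    def set_values(self, item, value):
        if item == "precipitation":
            self.storage += value
    def update(self):
        self.discharge = self.storage / self.k
        self.storage -= self.discharge
    def get_values(self, item):
        return self.discharge

model = LinearReservoir()
model.initialize({"k": 5.0})
for rain in [10.0, 0.0, 0.0, 5.0]:
    model.set_values("precipitation", rain)
    model.update()
    print(round(model.get_values("discharge"), 3))
```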

  14. Fluctuation relation for heat exchange in Markovian open quantum systems

    Science.gov (United States)

    Ramezani, M.; Golshani, M.; Rezakhani, A. T.

    2018-04-01

    A fluctuation relation for the heat exchange of an open quantum system under a thermalizing Markovian dynamics is derived. We show that the probability that the system absorbs an amount of heat from its bath, in a given time interval, divided by the probability of the reverse process (releasing the same amount of heat to the bath) is given by an exponential factor which depends on the amount of heat and the difference between the temperatures of the system and the bath. Interestingly, this relation is akin to the standard form of the fluctuation relation (for forward-backward dynamics). We also argue that the probability of the violation of the second law of thermodynamics in the form of the Clausius statement (i.e., net heat transfer from a cold system to its hot bath) drops exponentially with both the amount of heat and the temperature differences of the baths.
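    Schematically, with β_S and β_B the inverse temperatures of system and bath, the stated relation takes the familiar exchange-fluctuation form (exact conditions and conventions as in the paper):

```latex
% Schematic exchange fluctuation relation; Q > 0 is heat absorbed by the
% system, and beta_S, beta_B are the inverse temperatures of system and bath.
\frac{P(+Q)}{P(-Q)} = e^{(\beta_S - \beta_B)\,Q}
```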

  15. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  16. Probability analysis of MCO over-pressurization during staging

    International Nuclear Information System (INIS)

    Pajunen, A.L.

    1997-01-01

    The purpose of this calculation is to determine the probability of Multi-Canister Overpacks (MCOs) over-pressurizing during staging at the Canister Storage Building (CSB). Pressurization of an MCO during staging is dependent upon changes to the MCO gas temperature and the build-up of reaction products during the staging period. These effects are predominantly limited by the amount of water that remains in the MCO following cold vacuum drying that is available for reaction during staging conditions. Because of the potential for increased pressure within an MCO, provisions for a filtered pressure relief valve and rupture disk have been incorporated into the MCO design. This calculation provides an estimate of the frequency that an MCO will contain enough water to pressurize beyond the limits of these design features. The results of this calculation will be used in support of further safety analyses and operational planning efforts. Under the bounding steady state CSB condition assumed for this analysis, an MCO must contain less than 1.6 kg (3.7 lbm) of water available for reaction to preclude actuation of the pressure relief valve at 100 psid. To preclude actuation of the MCO rupture disk at 150 psid, an MCO must contain less than 2.5 kg (5.5 lbm) of water available for reaction. These limits are based on the assumption that hydrogen generated by uranium-water reactions is the sole source of gas produced within the MCO and that hydrates in fuel particulate are the primary source of water available for reactions during staging conditions. The results of this analysis conclude that the probability of the hydrate water content of an MCO exceeding 1.6 kg is 0.08 and the probability that it will exceed 2.5 kg is 0.01. This implies that approximately 32 of 400 staged MCOs may experience pressurization to the point where the pressure relief valve actuates. In the event that an MCO pressure relief valve fails to open, the probability is 1 in 100 that the MCO would experience

  17. Calculating the albedo characteristics by the method of transmission probabilities

    International Nuclear Information System (INIS)

    Lukhvich, A.A.; Rakhno, I.L.; Rubin, I.E.

    1983-01-01

    The possibility of using the method of transmission probabilities for calculating the albedo characteristics of homogeneous and heterogeneous zones is studied. The transmission probabilities method is a numerical method for solving the transport equation in its integral form. All calculations were conducted in a one-group approximation for planes and rods with different optical thicknesses and capture-to-scattering ratios. The calculations for plane and cylindrical geometries have shown that the numerical method of transmission probabilities can be used for calculating the albedo characteristics of homogeneous and heterogeneous zones with high accuracy. In this case the computing time is minimal, even for cylindrical geometry, if interpolation is used to calculate the characteristics for neutrons on their first path

  18. Laparoscopic Versus Open Resection for Colorectal Liver Metastases: The OSLO-COMET Randomized Controlled Trial.

    Science.gov (United States)

    Fretland, Åsmund Avdem; Dagenborg, Vegar Johansen; Bjørnelv, Gudrun Maria Waaler; Kazaryan, Airazat M; Kristiansen, Ronny; Fagerland, Morten Wang; Hausken, John; Tønnessen, Tor Inge; Abildgaard, Andreas; Barkhatov, Leonid; Yaqub, Sheraz; Røsok, Bård I; Bjørnbeth, Bjørn Atle; Andersen, Marit Helen; Flatmark, Kjersti; Aas, Eline; Edwin, Bjørn

    2018-02-01

    To perform the first randomized controlled trial to compare laparoscopic and open liver resection. Laparoscopic liver resection is increasingly used for the surgical treatment of liver tumors. However, high-level evidence to conclude that laparoscopic liver resection is superior to open liver resection is lacking. Explanatory, assessor-blinded, single center, randomized superiority trial recruiting patients from Oslo University Hospital, Oslo, Norway from February 2012 to January 2016. A total of 280 patients with resectable liver metastases from colorectal cancer were randomly assigned to undergo laparoscopic (n = 133) or open (n = 147) parenchyma-sparing liver resection. The primary outcome was postoperative complications within 30 days (Accordion grade 2 or higher). Secondary outcomes included cost-effectiveness, postoperative hospital stay, blood loss, operation time, and resection margins. The postoperative complication rate was 19% in the laparoscopic-surgery group and 31% in the open-surgery group (a difference of 12 percentage points [95% confidence interval 1.67-21.8; P = 0.021]). The postoperative hospital stay was shorter for laparoscopic surgery (53 vs 96 hours, P < 0.001), whereas there were no differences in blood loss, operation time, and resection margins. Mortality at 90 days did not differ significantly between the laparoscopic group (0 patients) and the open group (1 patient). In a 4-month perspective, the costs were equal, whereas patients in the laparoscopic-surgery group gained 0.011 quality-adjusted life years compared to patients in the open-surgery group (P = 0.001). In patients undergoing parenchyma-sparing liver resection for colorectal metastases, laparoscopic surgery was associated with significantly fewer postoperative complications compared to open surgery. Laparoscopic resection was cost-effective compared to open resection with a 67% probability. The rate of free resection margins was the same in both groups. Our results support the continued

  19. Impact of controlling the sum of error probability in the sequential probability ratio test

    Directory of Open Access Journals (Sweden)

    Bijoy Kumarr Pradhan

    2013-05-01

    A generalized modified method is proposed to control the sum of error probabilities in the sequential probability ratio test, minimizing the weighted average of the two average sample numbers under a simple null hypothesis and a simple alternative hypothesis, with the restriction that the sum of the error probabilities is a pre-assigned constant, in order to find the optimal sample size. Finally, a comparison is made with the optimal sample size found from the fixed sample size procedure. The results are applied to the cases where the random variate follows a normal law as well as a Bernoullian law.
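    For reference, the classical Wald SPRT that such modifications start from, with the usual approximate thresholds A = (1 - β)/α and B = β/(1 - α); the Bernoulli setting below is illustrative.

```python
import numpy as np

def sprt_bernoulli(samples, p0=0.5, p1=0.7, alpha=0.05, beta=0.05):
    """Classical Wald SPRT for H0: p = p0 vs H1: p = p1, with the usual
    approximate thresholds A = (1-beta)/alpha and B = beta/(1-alpha)."""
    upper = np.log((1.0 - beta) / alpha)
    lower = np.log(beta / (1.0 - alpha))
    llr = 0.0
    for n, x in enumerate(samples, 1):
        llr += x * np.log(p1 / p0) + (1 - x) * np.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "accept H1", n
        if llr <= lower:
            return "accept H0", n
    return "undecided", len(samples)

rng = np.random.default_rng(3)
print(sprt_bernoulli(rng.binomial(1, 0.7, 200)))
```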

  20. Tailoring single-photon and multiphoton probabilities of a single-photon on-demand source

    International Nuclear Information System (INIS)

    Migdall, A.L.; Branning, D.; Castelletto, S.

    2002-01-01

    As typically implemented, single-photon sources cannot be made to produce single photons with high probability, while simultaneously suppressing the probability of yielding two or more photons. Because of this, single-photon sources cannot really produce single photons on demand. We describe a multiplexed system that allows the probabilities of producing one and more photons to be adjusted independently, enabling a much better approximation of a source of single photons on demand
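    The effect described, raising the single-photon probability without raising the multiphoton probability, can be illustrated with an idealized, lossless model of multiplexed heralded sources with thermal pair statistics; this is a sketch under simplifying assumptions, not the paper's apparatus model.

```python
def pn_thermal(n, mu):
    """Pair-number distribution of a heralded source with thermal statistics."""
    return mu**n / (1.0 + mu)**(n + 1)

def multiplexed(mu, stages):
    """Idealised lossless multiplexing: run `stages` heralded sources in
    parallel and switch the first heralded output to the exit port."""
    p0, p1 = pn_thermal(0, mu), pn_thermal(1, mu)
    p_fire = 1.0 - p0**stages                       # at least one herald
    return (p_fire * p1 / (1.0 - p0),               # P(exactly one photon)
            p_fire * (1.0 - p0 - p1) / (1.0 - p0))  # P(two or more)

# A single source needs large mu to get P(1) up, at the price of
# multiphoton events; multiplexing recovers P(1) at small mu instead.
for mu, n in [(1.0, 1), (0.1, 1), (0.1, 16)]:
    p_one, p_multi = multiplexed(mu, n)
    print(f"mu={mu:<4} stages={n:<3} P(1)={p_one:.3f}  P(>=2)={p_multi:.3f}")
```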

  1. Open3DGRID

    DEFF Research Database (Denmark)

    Tosco, Paolo; Balle, Thomas

    2011-01-01

    Open3DGRID can import CoMFA/CoMSIA fields (exported from SYBYL with the aid of a small SPL script). High computational performance is attained through implementation of parallelized algorithms for MIF generation. Most prominent features in Open3DGRID include: seamless integration with OpenBabel, PyMOL, GAUSSIAN, FIREFLY, GAMESS; visualization of results in PyMOL, MOE, Maestro and SYBYL; a user-friendly interface to all major QM packages (e.g. GAUSSIAN, FIREFLY, GAMESS-US, TURBOMOLE, MOLDEN), allowing calculation of QM electron density and electrostatic potential 3D maps from within Open3DGRID; a user-friendly interface to Molecular Discovery...

  2. Fabrication of nickel hydroxide electrodes with open-ended hexagonal nanotube arrays for high capacitance supercapacitors.

    Science.gov (United States)

    Wu, Mao-Sung; Huang, Kuo-Chih

    2011-11-28

    A nickel hydroxide electrode with open-ended hexagonal nanotube arrays, prepared by hydrolysis of nickel chloride in the presence of hexagonal ZnO nanorods, shows a very high capacitance of 1328 F g(-1) at a discharge current density of 1 A g(-1) due to the significantly improved ion transport.

  3. Upgrading Probability via Fractions of Events

    Directory of Open Access Journals (Sweden)

    Frič Roman

    2016-08-01

    The influence of “Grundbegriffe” by A. N. Kolmogorov (published in 1933) on education in the area of probability and its impact on research in stochastics cannot be overestimated. We would like to point out three aspects of the classical probability theory “calling for” an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures) and observables (dual maps to random variables) have very different “mathematical nature”. Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events, elementary category theory, and covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0, 1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0, 1]-valued functions), the minimal domain of probability containing “fractions” of classical random events, and we upgrade the notions of probability measure and random variable.

  4. Open Source, Open Access, Open Review, Open Data. Initiativen zu mehr Offenheit in der digitalen Welt

    OpenAIRE

    Herb, Ulrich

    2011-01-01

    The article discusses the principles of openness, open access and open availability of information, based on the examples of open access to scientific information, open government data, open geographical data and open source software.

  5. MANTA – An Open-Source, High Density Electrophysiology Recording Suite for MATLAB

    Directory of Open Access Journals (Sweden)

    Bernhard eEnglitz

    2013-05-01

    The distributed nature of nervous systems makes it necessary to record from a large number of sites in order to break the neural code, whether single cell, local field potential (LFP), micro-electrocorticogram (μECoG), electroencephalographic (EEG), magnetoencephalographic (MEG) or in vitro micro-electrode array (MEA) data are considered. High channel-count recordings also optimize the yield of a preparation and the efficiency of time invested by the researcher. Currently, data acquisition (DAQ) systems with high channel counts (>100) can be purchased from a limited number of companies at considerable prices. These systems are typically closed-source and thus prohibit custom extensions or improvements by end users. We have developed MANTA, an open-source MATLAB-based DAQ system, as an alternative to existing options. MANTA combines high channel counts (up to 1440 channels/PC), usage of analog or digital headstages, low per-channel cost (<$90/channel), feature-rich display & filtering, a user-friendly interface, and a modular design permitting easy addition of new features. MANTA is licensed under the GPL and free of charge. The system has been tested by daily use in multiple setups for >1 year, recording reliably from 128 channels. It offers a growing list of features, including integrated spike sorting, PSTH and CSD display and fully customizable electrode array geometry (including 3D arrays), some of which are not available in commercial systems. MANTA runs on a typical PC and communicates via TCP/IP and can thus be easily integrated with existing stimulus generation/control systems in a lab at a fraction of the cost of commercial systems. With modern neuroscience developing rapidly, MANTA provides a flexible platform that can be rapidly adapted to the needs of new analyses and questions. Being open-source, the development of MANTA can outpace commercial solutions in functionality, while maintaining a low price-point.

  6. A Comprehensive Probability Project for the Upper Division One-Semester Probability Course Using Yahtzee

    Science.gov (United States)

    Wilson, Jason; Lawman, Joshua; Murphy, Rachael; Nelson, Marissa

    2011-01-01

    This article describes a probability project used in an upper division, one-semester probability course with third-semester calculus and linear algebra prerequisites. The student learning outcome focused on developing the skills necessary for approaching project-sized math/stat application problems. These skills include appropriately defining…

  7. Dynamic encoding of speech sequence probability in human temporal cortex.

    Science.gov (United States)

    Leonard, Matthew K; Bouchard, Kristofer E; Tang, Claire; Chang, Edward F

    2015-05-06

    Sensory processing involves identification of stimulus features, but also integration with the surrounding sensory and cognitive context. Previous work in animals and humans has shown fine-scale sensitivity to context in the form of learned knowledge about the statistics of the sensory environment, including relative probabilities of discrete units in a stream of sequential auditory input. These statistics are a defining characteristic of one of the most important sequential signals humans encounter: speech. For speech, extensive exposure to a language tunes listeners to the statistics of sound sequences. To address how speech sequence statistics are neurally encoded, we used high-resolution direct cortical recordings from human lateral superior temporal cortex as subjects listened to words and nonwords with varying transition probabilities between sound segments. In addition to their sensitivity to acoustic features (including contextual features, such as coarticulation), we found that neural responses dynamically encoded the language-level probability of both preceding and upcoming speech sounds. Transition probability first negatively modulated neural responses, followed by positive modulation of neural responses, consistent with coordinated predictive and retrospective recognition processes, respectively. Furthermore, transition probability encoding was different for real English words compared with nonwords, providing evidence for online interactions with high-order linguistic knowledge. These results demonstrate that sensory processing of deeply learned stimuli involves integrating physical stimulus features with their contextual sequential structure. Despite not being consciously aware of phoneme sequence statistics, listeners use this information to process spoken input and to link low-level acoustic representations with linguistic information about word identity and meaning. Copyright © 2015 the authors 0270-6474/15/357203-12$15.00/0.
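    The stimulus statistic probed here is the ordinary bigram transition probability between segments. A toy maximum-likelihood computation (the mini-corpus is invented for illustration):

```python
from collections import Counter

# Toy maximum-likelihood bigram transition probabilities over
# phoneme-like segments; the mini-corpus is invented for illustration.
words = ["k ae t", "k ae p", "k ae n", "t ae n", "t ae p"]
bigrams, firsts = Counter(), Counter()
for w in words:
    seg = w.split()
    firsts.update(seg[:-1])              # count each left context
    bigrams.update(zip(seg, seg[1:]))    # count each adjacent pair

p = {(a, b): c / firsts[a] for (a, b), c in bigrams.items()}
print(p[("k", "ae")])   # P(ae | k) = 1.0 in this toy corpus
print(p[("ae", "n")])   # P(n | ae) = 0.4
```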

  8. [Investigation of a new highly porous hydroxyapatite matrix for obliterating open mastoid cavities - application in guinea pigs bulla].

    Science.gov (United States)

    Punke, C; Zehlicke, T; Boltze, C; Pau, H W

    2009-04-01

    Many different techniques for obliterating open mastoid cavities have been described. The results after the application of alloplastic materials like hydroxyapatite and tricalcium phosphate were poor due to long-lasting resorption, and extrusion of those materials has been described. We investigated the applicability of a new high-porosity ceramic for obliterating large open mastoid cavities and tested it in an animal model (bulla of the guinea pig). A highly porous matrix (NanoBone), a bone inductor fabricated by a sol-gel technique, was administered unilaterally into the opened bullae of 30 guinea pigs. In each animal the opposite bulla was filled with Bio-Oss, a bone substitute consisting of a portion of mineral bovine bone. Histological evaluations were performed 1, 2, 3, 4, 5 and 12 weeks after implantation. After an initial phase with an inflammatory reaction creating loose granulation tissue, we observed the formation of trabecular bone within the fourth week in both groups. From the fifth week on we found osteoclasts on the surface of NanoBone and Bio-Oss with consecutive degradation of both materials. In our animal-model study we found beneficial properties of the bone inductors NanoBone and Bio-Oss for obliterating open mastoid cavities.

  9. Laptop theft: a case study on effectiveness of security mechanisms in open organizations

    NARCIS (Netherlands)

    Dimkov, T.; Pieters, Wolter; Hartel, Pieter H.

    Organizations rely on physical, technical and procedural mechanisms to protect their physical assets. Of all physical assets, laptops are probably the most troublesome to protect, since laptops are easy to remove and conceal. Organizations open to the public, such as hospitals and universities,

  10. TERRA REF: Advancing phenomics with high resolution, open access sensor and genomics data

    Science.gov (United States)

    LeBauer, D.; Kooper, R.; Burnette, M.; Willis, C.

    2017-12-01

    Automated plant measurement has the potential to improve understanding of genetic and environmental controls on plant traits (phenotypes). The application of sensors and software in the automation of high throughput phenotyping reflects a fundamental shift from labor intensive hand measurements to drone, tractor, and robot mounted sensing platforms. These tools are expected to speed the rate of crop improvement by enabling plant breeders to more accurately select plants with improved yields, resource use efficiency, and stress tolerance. However, there are many challenges facing high throughput phenomics: sensors and platforms are expensive, currently there are few standard methods of data collection and storage, and the analysis of large data sets requires high performance computers and automated, reproducible computing pipelines. To overcome these obstacles and advance the science of high throughput phenomics, the TERRA Phenotyping Reference Platform (TERRA-REF) team is developing an open-access database of high resolution sensor data. TERRA REF is an integrated field and greenhouse phenotyping system that includes: a reference field scanner with fifteen sensors that can generate terabytes of data each day at mm resolution; UAV, tractor, and fixed field sensing platforms; and an automated controlled-environment scanner. These platforms will enable investigation of diverse sensing modalities, and the investigation of traits under controlled and field environments. It is the goal of TERRA REF to lower the barrier to entry for academic and industry researchers by providing high-resolution data, open source software, and online computing resources. Our project is unique in that all data will be made fully public in November 2018, and is already available to early adopters through the beta-user program. We will describe the datasets and how to use them as well as the databases and computing pipeline and how these can be reused and remixed in other phenomics pipelines.

  11. Fragility estimation for seismically isolated nuclear structures by high confidence low probability of failure values and bi-linear regression

    International Nuclear Information System (INIS)

    Carausu, A.

    1996-01-01

    A method for the fragility estimation of seismically isolated nuclear power plant structures is proposed. The relationship between the ground motion intensity parameter (e.g. peak ground velocity or peak ground acceleration) and the response of isolated structures is expressed in terms of a bi-linear regression line, whose coefficients are estimated by the least-squares method from available data on seismic input and structural response. The notion of the high confidence low probability of failure (HCLPF) value is also used for deriving compound fragility curves for coupled subsystems. (orig.)

  12. On estimating the fracture probability of nuclear graphite components

    International Nuclear Information System (INIS)

    Srinivasan, Makuteswara

    2008-01-01

    The properties of nuclear grade graphites exhibit anisotropy and could vary considerably within a manufactured block. Graphite strength is affected by the direction of alignment of the constituent coke particles, which is dictated by the forming method, coke particle size, and the size, shape, and orientation distribution of pores in the structure. In this paper, a Weibull failure probability analysis for components is presented using the American Society for Testing and Materials strength specification for nuclear grade graphites for core components in advanced high-temperature gas-cooled reactors. The risk of rupture (probability of fracture) and survival probability (reliability) of large graphite blocks are calculated for varying and discrete values of service tensile stresses. The limitations in these calculations are discussed from considerations of actual reactor environmental conditions that could potentially degrade the specification properties because of damage due to complex interactions between irradiation, temperature, stress, and variability in reactor operation.
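
    A minimal sketch of the Weibull reliability calculation described above, in Python. The characteristic strength and Weibull modulus used here are illustrative placeholders, not the ASTM specification values.

```python
import numpy as np

def weibull_survival(stress, s0, m):
    """Two-parameter Weibull reliability: P(survival) = exp(-(stress/s0)**m).

    stress : applied tensile stress (MPa)
    s0     : characteristic strength (MPa), illustrative value here
    m      : Weibull modulus, illustrative value here
    """
    return np.exp(-(np.asarray(stress) / s0) ** m)

# Placeholder parameters, NOT specification values.
stresses = np.array([5.0, 10.0, 15.0, 20.0])
reliability = weibull_survival(stresses, s0=25.0, m=10.0)
for s, r in zip(stresses, reliability):
    print(f"stress = {s:5.1f} MPa : P(fracture) = {1 - r:.3e}")
```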

  13. About the Bernoulli’s Probability Formula Application for Analyzing the Results of the Unified State Examination

    Directory of Open Access Journals (Sweden)

    L. M. Nuriyeva

    2013-01-01

    Full Text Available The paper looks at applying probability theory and mathematical statistics to analyzing the outcomes of the unified state examination (USE). The research is aimed at investigating the impact of the closed questions that make up the greater part of the USE on test results. The methodology is based on so-called Bernoulli trials. The research findings demonstrate a higher probability of incidentally correct answers to closed questions compared with open ones. The author concludes that the considerable number of closed questions in the test can distort the final result, which tends to be inflated. The proposed method of statistical analysis can provide an explanation for anomalies in USE results, evaluate the quality of examination materials and the scoring system, and give a quantified assessment of social implications.
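
    The core of the argument is the binomial (Bernoulli-trial) model of guessing. A small Python sketch with hypothetical test parameters: 20 questions, 4 answer options per closed question, and an assumed 1% lucky-guess rate for open questions.

```python
from math import comb

def prob_at_least(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical numbers: 20 closed questions with 4 options each (guess rate
# 0.25) versus 20 open questions with an assumed 1% lucky-guess rate.
print(prob_at_least(5, 20, 0.25))  # ~0.585: guessing 5+ closed items is likely
print(prob_at_least(5, 20, 0.01))  # ~1e-6 : nearly impossible on open items
```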

  14. Development of probabilistic thinking-oriented learning tools for probability materials at junior high school students

    Science.gov (United States)

    Sari, Dwi Ivayana; Hermanto, Didik

    2017-08-01

    This is developmental research on probabilistic thinking-oriented learning tools for probability material at the ninth-grade level, aimed at producing a good set of such tools. The subjects were the IX-A students of MTs Model Bangkalan. The study used the 4-D development model, modified into three stages: define, design and develop. The teaching and learning tools consist of a lesson plan, a students' worksheet, teaching media and a students' achievement test. The research instruments were a learning-tools validation sheet, a teachers' activity sheet, a students' activity sheet, a students' response questionnaire and the students' achievement test. The results from these instruments were analyzed descriptively to answer the research objectives. The outcome was a set of learning tools oriented to probabilistic thinking in probability for ninth-grade students that proved valid. After the tools were revised based on validation, the classroom experiment showed that the teacher's classroom management was effective, students' activities were good, students' responses to the learning tools were positive, and the achievement test met the validity, sensitivity and reliability criteria. In summary, these teaching and learning tools can be used by teachers to teach probability and develop students' probabilistic thinking.

  15. Openly Published Environmental Sensing (OPEnS) | Advancing Open-Source Research, Instrumentation, and Dissemination

    Science.gov (United States)

    Udell, C.; Selker, J. S.

    2017-12-01

    The increasing availability and functionality of Open-Source software and hardware along with 3D printing, low-cost electronics, and proliferation of open-access resources for learning rapid prototyping are contributing to fundamental transformations and new technologies in environmental sensing. These tools invite reevaluation of time-tested methodologies and devices toward more efficient, reusable, and inexpensive alternatives. Building upon Open-Source design facilitates community engagement and invites a Do-It-Together (DIT) collaborative framework for research where solutions to complex problems may be crowd-sourced. However, barriers persist that prevent researchers from taking advantage of the capabilities afforded by open-source software, hardware, and rapid prototyping. Some of these include: requisite technical skillsets, knowledge of equipment capabilities, identifying inexpensive sources for materials, money, space, and time. A university MAKER space staffed by engineering students to assist researchers is one proposed solution to overcome many of these obstacles. This presentation investigates the unique capabilities the USDA-funded Openly Published Environmental Sensing (OPEnS) Lab affords researchers, within Oregon State and internationally, and the unique functions these types of initiatives support at the intersection of MAKER spaces, Open-Source academic research, and open-access dissemination.

  16. Is probability of frequency too narrow?

    International Nuclear Information System (INIS)

    Martz, H.F.

    1993-01-01

    Modern methods of statistical data analysis, such as empirical and hierarchical Bayesian methods, should find increasing use in future Probabilistic Risk Assessment (PRA) applications. In addition, there will be a more formalized use of expert judgment in future PRAs. These methods require an extension of the probabilistic framework of PRA, in particular, the popular notion of probability of frequency, to consideration of frequency of frequency, frequency of probability, and probability of probability. The genesis, interpretation, and examples of these three extended notions are discussed

  17. An Alternative Version of Conditional Probabilities and Bayes' Rule: An Application of Probability Logic

    Science.gov (United States)

    Satake, Eiki; Amato, Philip P.

    2008-01-01

    This paper presents an alternative version of formulas of conditional probabilities and Bayes' rule that demonstrate how the truth table of elementary mathematical logic applies to the derivations of the conditional probabilities of various complex, compound statements. This new approach is used to calculate the prior and posterior probabilities…
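
    The article's truth-table derivations are not reproduced in the abstract; as a generic illustration of the prior-to-posterior computation it builds on, here is a plain Bayes' rule calculation in Python with hypothetical numbers.

```python
def bayes_posterior(prior, sensitivity, false_positive_rate):
    """P(H | E) from P(H), P(E | H) and P(E | not H) via Bayes' rule."""
    p_e = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_e

# Hypothetical numbers: a rare condition with a fairly accurate test.
print(bayes_posterior(prior=0.01, sensitivity=0.95, false_positive_rate=0.05))
# ~0.161: even after a positive result, the hypothesis remains improbable.
```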

  18. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal-probability-plot-based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
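
    The paper's intervals are calibrated so that all points are covered jointly with probability 1-α. The Python sketch below shows only the simpler pointwise construction, using the Beta distribution of uniform order statistics; it conveys the idea but does not achieve simultaneous coverage.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 50
sample = np.sort(rng.normal(size=n))

# Plotting positions and pointwise 95% bands for the order statistics: the
# i-th order statistic of n uniforms is Beta(i, n - i + 1), mapped through
# the standard normal quantile function.
i = np.arange(1, n + 1)
theoretical = stats.norm.ppf((i - 0.5) / n)
lo = stats.norm.ppf(stats.beta.ppf(0.025, i, n - i + 1))
hi = stats.norm.ppf(stats.beta.ppf(0.975, i, n - i + 1))

# Standardize the sample so it is comparable to the standard normal bands.
z = (sample - sample.mean()) / sample.std(ddof=1)
outside = np.sum((z < lo) | (z > hi))
r = np.corrcoef(theoretical, z)[0, 1]
print(f"plot correlation: {r:.3f}; points outside pointwise bands: {outside} of {n}")
```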

  19. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application

  20. Open source system OpenVPN in a function of Virtual Private Network

    Science.gov (United States)

    Skendzic, A.; Kovacic, B.

    2017-05-01

    Virtual Private Networks (VPNs) can establish a high level of security in network communication. VPN technology enables highly secure networking over distributed or public network infrastructure, applying different security and management rules inside networks. It can be set up over different communication channels, such as the Internet or separate ISP communication infrastructure. A VPN creates a secure communication channel over a public network between two endpoints (computers). OpenVPN is an open-source software product under the GNU General Public License (GPL) that can be used to establish VPN communication between two computers inside a business local network over public communication infrastructure. It uses special security protocols and 256-bit encryption and is capable of traversing network address translators (NATs) and firewalls. It allows computers to authenticate each other using a pre-shared secret key, certificates, or a username and password. This work reviews VPN technology with special emphasis on OpenVPN, and also gives a comparison and the financial benefits of using open-source VPN software in a business environment.

  1. High explosive driven plasma opening switches

    International Nuclear Information System (INIS)

    Greene, A.E.; Bowers, R.L.; Brownell, J.H.; Goforth, J.H.; Oliphant, T.A.; Weiss, D.L.

    1983-01-01

    A joint theoretical and experimental effort is underway to understand and improve upon the performance of high explosive driven plasma opening switches such as those first described by Pavlovskii et al. We have modeled these switches in both planar and cylindrical geometry using a one dimensional Lagrangian MHD code. This one-dimensional analysis is now essentially complete. It has shown that simple, one-dimensional, compression of the current-carrying channel can explain the observed resistance increases during the time of flight of the HE detonation products. Our calculations imply that ionization plays an important role as an energy sink and the performance of these switches might be improved by a judicious choice of gases. We also predict improved performance by lowering the pressure in the plasma channel. The bulk of our experimental effort to date has been with planar switches. We have worked with current densities of 0.25 to 0.4 MA/cm and have observed resistance increases of 40 to 60 mΩ. Significant resistance increases are observed later than the time of flight of the HE detonation products. We suggest that these resistance increases are due to mixing between the hot plasma and the relatively cooler detonation products. Such mixing is not included in the 1-D, Lagrangian code. We are presently beginning a computational effort with a 2-D Eulerian code. The status of this effort is discussed. Experimentally we have designed an apparatus that will permit us to test the role of different gases and pressures. This system is also in a planar geometry, but the plasma channel is doughnut shaped, permitting us to avoid edge effects associated with the planar rectangular geometry. The first experiments with this design are quite encouraging and the status of this effort is also discussed

  2. High-severity fire: evaluating its key drivers and mapping its probability across western US forests

    Science.gov (United States)

    Parks, Sean A.; Holsinger, Lisa M.; Panunto, Matthew H.; Jolly, W. Matt; Dobrowski, Solomon Z.; Dillon, Gregory K.

    2018-04-01

    Wildland fire is a critical process in forests of the western United States (US). Variation in fire behavior, which is heavily influenced by fuel loading, terrain, weather, and vegetation type, leads to heterogeneity in fire severity across landscapes. The relative influence of these factors in driving fire severity, however, is poorly understood. Here, we explore the drivers of high-severity fire for forested ecoregions in the western US over the period 2002–2015. Fire severity was quantified using a satellite-inferred index of severity, the relativized burn ratio. For each ecoregion, we used boosted regression trees to model high-severity fire as a function of live fuel, topography, climate, and fire weather. We found that live fuel, on average, was the most important factor driving high-severity fire among ecoregions (average relative influence = 53.1%) and was the most important factor in 14 of 19 ecoregions. Fire weather was the second most important factor among ecoregions (average relative influence = 22.9%) and was the most important factor in five ecoregions. Climate (13.7%) and topography (10.3%) were less influential. We also predicted the probability of high-severity fire, were a fire to occur, using recent (2016) satellite imagery to characterize live fuel for a subset of ecoregions in which the model skill was deemed acceptable (n = 13). These ‘wall-to-wall’ gridded ecoregional maps provide relevant and up-to-date information for scientists and managers who are tasked with managing fuel and wildland fire. Lastly, we provide an example of the predicted likelihood of high-severity fire under moderate and extreme fire weather before and after fuel reduction treatments, thereby demonstrating how our framework and model predictions can potentially serve as a performance metric for land management agencies tasked with reducing hazardous fuel across large landscapes.
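
    A sketch of the boosted-regression-tree workflow on synthetic data. The predictors and their effect sizes are invented stand-ins for the study's variables, and scikit-learn's feature importances stand in for the BRT "relative influence" metric.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
n = 2000
# Synthetic stand-ins for the paper's predictors (illustrative only).
X = np.column_stack([
    rng.uniform(0, 1, n),    # live fuel (e.g., a vegetation index)
    rng.uniform(0, 40, n),   # fire-weather index
    rng.uniform(0, 1, n),    # climate (scaled water deficit)
    rng.uniform(0, 45, n),   # topography (slope, degrees)
])
# Make high severity depend mostly on fuel and weather, as the study found.
logit = 3 * X[:, 0] + 0.08 * X[:, 1] + 0.5 * X[:, 2] + 0.01 * X[:, 3] - 3
y = rng.uniform(size=n) < 1 / (1 + np.exp(-logit))

model = GradientBoostingClassifier(n_estimators=300, learning_rate=0.05,
                                   max_depth=3)
model.fit(X, y)

for name, imp in zip(["live fuel", "fire weather", "climate", "topography"],
                     model.feature_importances_):
    print(f"{name:12s}: {imp:.2f}")
```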

  3. A high-resolution open biomass burning emission inventory based on statistical data and MODIS observations in mainland China

    Science.gov (United States)

    Xu, Y.; Fan, M.; Huang, Z.; Zheng, J.; Chen, L.

    2017-12-01

    Open biomass burning, which has adverse effects on air quality and human health, is an important source of gases and particulate matter (PM) in China. Current emission estimates for open biomass burning are generally based on a single source (either statistical data or satellite-derived data) and thus carry large uncertainty due to the limitations of the data. In this study, to quantify open biomass burning for the base year 2015, we established a new estimation method for open biomass burning activity levels by combining bottom-up statistical data and top-down MODIS observations, and considered three sub-category sources that use different activity data. For open crop residue burning, the "best estimate" of activity data was obtained by averaging the statistical data from China statistical yearbooks and satellite observations from the MODIS burned area product MCD64A1, weighted by their uncertainties. For forest and grassland fires, activity levels were represented by the combination of statistical data and the MODIS active fire product MCD14ML. Using fire radiative power (FRP), which is considered a better indicator of active fire level, as the spatial allocation surrogate, coarse gridded emissions were reallocated into 3 km × 3 km grids to obtain a high-resolution emission inventory. Our results show that emissions of CO, NOx, SO2, NH3, VOCs, PM2.5, PM10, BC and OC in mainland China were 6607, 427, 84, 79, 1262, 1198, 1222, 159 and 686 Gg/yr, respectively. Among all provinces of China, Henan, Shandong and Heilongjiang were the top three contributors to the total emissions. The developed high-resolution open biomass burning emission inventory can support air quality modeling and policy-making for pollution control.
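
    One standard way to average two estimates "weighted by their uncertainties", as described above, is inverse-variance weighting; whether the authors used exactly this scheme is not stated in the abstract. A Python sketch with hypothetical numbers:

```python
import numpy as np

def uncertainty_weighted_mean(estimates, sigmas):
    """Combine independent estimates by inverse-variance weighting."""
    w = 1.0 / np.asarray(sigmas, dtype=float) ** 2
    estimates = np.asarray(estimates, dtype=float)
    best = np.sum(w * estimates) / np.sum(w)
    sigma_best = np.sqrt(1.0 / np.sum(w))
    return best, sigma_best

# Hypothetical activity levels (Gg of crop residue burned):
# statistical yearbooks vs. the MCD64A1 burned-area product.
best, sigma = uncertainty_weighted_mean([120.0, 90.0], [30.0, 20.0])
print(f"combined activity estimate: {best:.1f} +/- {sigma:.1f}")
```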

  4. Passage and survival probabilities of juvenile Chinook salmon at Cougar Dam, Oregon, 2012

    Science.gov (United States)

    Beeman, John W.; Evans, Scott D.; Haner, Philip V.; Hansel, Hal C.; Hansen, Amy C.; Smith, Collin D.; Sprando, Jamie M.

    2014-01-01

    This report describes studies of juvenile-salmon dam passage and apparent survival at Cougar Dam, Oregon, during two operating conditions in 2012. Cougar Dam is a 158-meter tall rock-fill dam used primarily for flood control, and passes water through a temperature control tower to either a powerhouse penstock or to a regulating outlet (RO). The temperature control tower has moveable weir gates to enable water of different elevations and temperatures to be drawn through the dam to control water temperatures downstream. A series of studies of downstream dam passage of juvenile salmonids were begun after the National Oceanic and Atmospheric Administration determined that Cougar Dam was impacting the viability of anadromous fish stocks. The primary objectives of the studies described in this report were to estimate the route-specific fish passage probabilities at the dam and to estimate the survival probabilities of fish passing through the RO. The first set of dam operating conditions, studied in November, consisted of (1) a mean reservoir elevation of 1,589 feet, (2) water entering the temperature control tower through the weir gates, (3) most water routed through the turbines during the day and through the RO during the night, and (4) mean RO gate openings of 1.2 feet during the day and 3.2 feet during the night. The second set of dam operating conditions, studied in December, consisted of (1) a mean reservoir elevation of 1,507 ft, (2) water entering the temperature control tower through the RO bypass, (3) all water passing through the RO, and (4) mean RO gate openings of 7.3 feet during the day and 7.5 feet during the night. The studies were based on juvenile Chinook salmon (Oncorhynchus tshawytscha) surgically implanted with radio transmitters and passive integrated transponder (PIT) tags. Inferences about general dam passage percentage and timing of volitional migrants were based on surface-acclimated fish released in the reservoir. Dam passage and apparent

  5. Open principle for large high-resolution solar telescopes

    NARCIS (Netherlands)

    Hammerschlag, R.H.; Bettonvil, F.C.M.; Jägers, A.P.L.; Sliepen, G.

    2009-01-01

    Vacuum solar telescopes solve the problem of image deterioration inside the telescope due to refractive index fluctuations of the air heated by the solar light. However, such telescopes have a practical diameter limit somewhat over 1 m. The Dutch Open Telescope (DOT) was the pioneering demonstrator

  6. Synthesis of benzamides by microwave assisted ring opening of less reactive dimethylaminobenzylidene oxazolone

    Directory of Open Access Journals (Sweden)

    Saurabh C. Khadse

    2017-02-01

    Full Text Available This paper presents the synthesis of some benzamide compounds (B1-B10) by microwave-assisted ring opening of 4-(4-dimethylaminobenzylidene)-2-phenyl-5-oxazolone (AZ4). By conventional synthesis involving heating, it was found difficult to obtain ring-opened products, probably due to the poor tendency of the carbonyl carbon (C5) of AZ4 to undergo nucleophilic attack by mono- or disubstituted anilines. Microwave-assisted reactions were easy to perform, reduced the reaction time and produced good yields.

  7. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  8. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    Full Text Available By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  9. Pre-Service Teachers' Conceptions of Probability

    Science.gov (United States)

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  10. Properties of polycrystalline indium oxide in open air and in vacuum

    International Nuclear Information System (INIS)

    Solov'eva, A.E.; Zhdanov, V.A.; Markov, V.L.; Shvangiradze, R.R.

    1982-01-01

    The properties of polycrystalline indium oxide as a function of annealing temperature in open air and in vacuum are investigated. It is established that indium oxide begins to change its chemical composition during annealing in open air from 1200 deg C, and in vacuum from 800 deg C. During annealing of the samples in open air in the temperature range of 1200-1450 deg C, the indium oxide lattice probably loses only oxygen; this process is accompanied by changes in the samples' color, electrophysical properties, lattice parameter and density. In vacuum the cation sublattice is disturbed beginning from 900 deg C, which is accompanied by destruction of the color centers. The X-ray density and the activation energy of the reduction, accounting for the formation of the color centers, are calculated on the basis of the X-ray data and the deviation from stoichiometry of the indium oxide as a function of the annealing temperature in open air

  11. Midcourse Guidance Law Based on High Target Acquisition Probability Considering Angular Constraint and Line-of-Sight Angle Rate Control

    Directory of Open Access Journals (Sweden)

    Xiao Liu

    2016-01-01

    Full Text Available Random disturbance factors lead to variation of the target acquisition point during long-distance flight. To achieve a high target acquisition probability and improve impact precision, missiles should be guided to an appropriate target acquisition position with certain attitude angles and line-of-sight (LOS) angle rate. This paper presents a new midcourse guidance law considering the influences of random disturbances, detection distance restraint, and target acquisition probability with Monte Carlo simulation. Detailed analyses of the impact points on the ground and the random distribution of the target acquisition position in 3D space are given to obtain the appropriate attitude angles and the end position for the midcourse guidance. Then, a new-formulation biased proportional navigation (BPN) guidance law with angular constraint and LOS angle rate control is derived to ensure tracking ability when attacking a maneuvering target. Numerical simulations demonstrate that, compared with the proportional navigation guidance (PNG) law and the near-optimal spatial midcourse guidance (NSMG) law, the BPN guidance law shows satisfactory performance and can meet both the midcourse terminal angular constraint and the LOS angle rate requirement.

  12. Nuclear data uncertainties: I, Basic concepts of probability

    Energy Technology Data Exchange (ETDEWEB)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.

  13. Nuclear data uncertainties: I, Basic concepts of probability

    International Nuclear Information System (INIS)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs

  14. Qubit-qutrit separability-probability ratios

    International Nuclear Information System (INIS)

    Slater, Paul B.

    2005-01-01

    Paralleling our recent computationally intensive (quasi-Monte Carlo) work for the case N=4 (e-print quant-ph/0308037), we undertake the task for N=6 of computing to high numerical accuracy the formulas of Sommers and Zyczkowski (e-print quant-ph/0304041) for the (N²-1)-dimensional volume and (N²-2)-dimensional hyperarea of the (separable and nonseparable) N×N density matrices, based on the Bures (minimal monotone) metric--and also their analogous formulas (e-print quant-ph/0302197) for the (nonmonotone) flat Hilbert-Schmidt metric. With the same 7 × 10⁹ well-distributed ('low-discrepancy') sample points, we estimate the unknown volumes and hyperareas based on five additional (monotone) metrics of interest, including the Kubo-Mori and Wigner-Yanase. Further, we estimate all of these seven volume and seven hyperarea (unknown) quantities when restricted to the separable density matrices. The ratios of separable volumes (hyperareas) to separable plus nonseparable volumes (hyperareas) yield estimates of the separability probabilities of generically rank-6 (rank-5) density matrices. The (rank-6) separability probabilities obtained based on the 35-dimensional volumes appear to be--independently of the metric (each of the seven inducing Haar measure) employed--twice as large as those (rank-5 ones) based on the 34-dimensional hyperareas. (An additional estimate--33.9982--of the ratio of the rank-6 Hilbert-Schmidt separability probability to the rank-4 one is quite clearly close to integral too.) The doubling relationship also appears to hold for the N=4 case for the Hilbert-Schmidt metric, but not the others. We fit simple exact formulas to our estimates of the Hilbert-Schmidt separable volumes and hyperareas in both the N=4 and N=6 cases

  15. Beyond the drip-line: a high-resolution open-air Holocene hunter-gatherer sequence from highland Lesotho

    CSIR Research Space (South Africa)

    Mitchell, P

    2011-03-01

    Full Text Available Beyond the drip-line: a high-resolution open-air Holocene hunter-gatherer sequence from highland Lesotho. Peter Mitchell, Ina Plug, Geoff Bailey, Ruth Charles, Amanda Esterhuysen, Julia Lee Thorp, Adrian Parker & Stephan Woodborne. The activities...

  16. Open access, open education resources and open data in Uganda ...

    African Journals Online (AJOL)

    As a follow-up to OpenCon 2014, International Federation of Medical Students' Associations (IFMSA) students organized a 3-day workshop on Open Access, Open Education Resources and Open Data in Kampala from 15-18 December 2014. One of the aims of the workshop was to engage the Open Access movement in ...

  17. The probability and severity of decompression sickness

    Science.gov (United States)

    Hada, Ethan A.; Vann, Richard D.; Denoble, Petar J.

    2017-01-01

    Decompression sickness (DCS), which is caused by inert gas bubbles in tissues, is an injury of concern for scuba divers, compressed air workers, astronauts, and aviators. Case reports for 3322 air and N2-O2 dives, resulting in 190 DCS events, were retrospectively analyzed and the outcomes were scored as (1) serious neurological, (2) cardiopulmonary, (3) mild neurological, (4) pain, (5) lymphatic or skin, and (6) constitutional or nonspecific manifestations. Following standard U.S. Navy medical definitions, the data were grouped into mild (Type I: manifestations 4-6) and serious (Type II: manifestations 1-3). Additionally, we considered an alternative grouping of mild (Type A: manifestations 3-6) and serious (Type B: manifestations 1 and 2). The current U.S. Navy guidance allows for a 2% probability of mild DCS and a 0.1% probability of serious DCS. We developed a hierarchical trinomial (3-state) probabilistic DCS model that simultaneously predicts the probability of mild and serious DCS given a dive exposure. Both the Type I/II and Type A/B discriminations of mild and serious DCS resulted in a highly significant model fit. Under the same 2% limit on the probability of 'mild' DCS, the model resulted in a longer allowable bottom time. However, for the 0.1% serious DCS limit, we found a vastly decreased allowable bottom dive time for all dive depths. If the Type A/B scoring was assigned to outcome severity, the no-decompression limits (NDL) for air dives were still controlled by the acceptable serious DCS risk limit rather than the acceptable mild DCS risk limit. However, in this case, longer NDL limits were allowed than with the Type I/II scoring. The trinomial model's mild and serious probabilities agree reasonably well with the current air NDL only with the Type A/B scoring and when 0.2% risk of serious DCS is allowed. PMID:28296928

  18. A novel carbon gun for use with plasma opening switches

    International Nuclear Information System (INIS)

    Stevenson, P.; Gregory, K.; Cliffe, R.J.; Smith, I.R.

    2001-01-01

    The carbon gun is probably the most common plasma source used in plasma opening switches. Nevertheless, it either produces a contaminated plasma, as the flashover surface erodes, or requires regular treatment with graphite paint. The novel form of the plasma gun described in this paper overcomes the disadvantages of existing designs and produces a cleaner plasma. Experimental results illustrate the performance of a prototype system. (author)

  19. Probability of detection of clinical seizures using heart rate changes.

    Science.gov (United States)

    Osorio, Ivan; Manly, B F J

    2015-08-01

    Heart rate-based seizure detection is a viable complement or alternative to ECoG/EEG. This study investigates the role of various biological factors on the probability of clinical seizure detection using heart rate. Regression models were applied to 266 clinical seizures recorded from 72 subjects to investigate if factors such as age, gender, years with epilepsy, etiology, seizure site origin, seizure class, and data collection centers, among others, shape the probability of EKG-based seizure detection. Clinical seizure detection probability based on heart rate changes is significantly dependent on such factors. The probability of detecting clinical seizures (>0.8 in the majority of subjects) using heart rate is highest for complex partial seizures, increases with a patient's years with epilepsy, is lower for females than for males and is unrelated to the side of hemisphere origin. Clinical seizure detection probability using heart rate is multi-factorially dependent and sufficiently high (>0.8) in most cases to be clinically useful. Knowledge of the role that these factors play in shaping said probability will enhance its applicability and usefulness. Heart rate is a reliable and practical signal for extra-cerebral detection of clinical seizures originating from or spreading to central autonomic network structures. Copyright © 2015 British Epilepsy Association. Published by Elsevier Ltd. All rights reserved.

  20. Facilitating the openEHR approach - organizational structures for defining high-quality archetypes.

    Science.gov (United States)

    Kohl, Christian Dominik; Garde, Sebastian; Knaup, Petra

    2008-01-01

    Using openEHR archetypes to establish an electronic patient record promises rapid development and system interoperability by using or adopting existing archetypes. However, internationally accepted, high quality archetypes which enable a comprehensive semantic interoperability require adequate development and maintenance processes. Therefore, structures have to be created involving different health professions. In the following we present a model which facilitates and governs distributed but cooperative development and adoption of archetypes by different professionals including peer reviews. Our model consists of a hierarchical structure of professional committees and descriptions of the archetype development process considering these different committees.

  1. Plasma erosion opening switch in the double-pulse operation mode of a high-current electron accelerator

    International Nuclear Information System (INIS)

    Isakov, I.F.; Lopatin, V.S.; Remnev, G.E.

    1987-01-01

    This paper reports the results of investigations of the operation of a fast current opening switch, with a plasma density of 10¹³-10¹⁶ produced either by dielectric surface flashover or by explosive emission of graphite. A series of two pulses was applied to two diodes in parallel. The first pulse produced plasma in the first diode which closed that diode gap by the arrival time of the second pulse. The first, shorted, diode then acted as an erosion switch for the second pulse. A factor of 2.5-3 power multiplication was obtained under optimum conditions. The opening-switch resistance during the magnetic insulation phase, neglecting the electron losses between the switch and the generating diode, exceeded 100 Ω. The duration of the rapid opening phase was less than 5 ns under optimum conditions. This method of plasma production does not require external plasma sources, and permits a wide variation of plasma density, which in turn allows high inductor currents and stored energies

  2. A high-order doubly asymptotic open boundary for scalar waves in semi-infinite layered systems

    International Nuclear Information System (INIS)

    Prempramote, S; Song, Ch; Birk, C

    2010-01-01

    Wave propagation in semi-infinite layered systems is of interest in earthquake engineering, acoustics, electromagnetism, etc. The numerical modelling of this problem is particularly challenging as evanescent waves exist below the cut-off frequency. Most of the high-order transmitting boundaries are unable to model the evanescent waves. As a result, spurious reflection occurs at late time. In this paper, a high-order doubly asymptotic open boundary is developed for scalar waves propagating in semi-infinite layered systems. It is derived from the equation of dynamic stiffness matrix obtained in the scaled boundary finite-element method in the frequency domain. A continued-fraction solution of the dynamic stiffness matrix is determined recursively by satisfying the scaled boundary finite-element equation at both high- and low-frequency limits. In the time domain, the continued-fraction solution permits the force-displacement relationship to be formulated as a system of first-order ordinary differential equations. Standard time-step schemes in structural dynamics can be directly applied to evaluate the response history. Examples of a semi-infinite homogeneous layer and a semi-infinite two-layered system are investigated herein. The displacement results obtained from the open boundary converge rapidly as the order of continued fractions increases. Accurate results are obtained at early time and late time.

  3. The growth of finfish in global open-ocean aquaculture under climate change.

    Science.gov (United States)

    Klinger, Dane H; Levin, Simon A; Watson, James R

    2017-10-11

    Aquaculture production is projected to expand from land-based operations to the open ocean as demand for seafood grows and competition increases for inputs to land-based aquaculture, such as freshwater and suitable land. In contrast to land-based production, open-ocean aquaculture is constrained by oceanographic factors, such as current speeds and seawater temperature, which are dynamic in time and space, and cannot easily be controlled. As such, the potential for offshore aquaculture to increase seafood production is tied to the physical state of the oceans. We employ a novel spatial model to estimate the potential of open-ocean finfish aquaculture globally, given physical, biological and technological constraints. Finfish growth potential for three common aquaculture species representing different thermal guilds-Atlantic salmon ( Salmo salar ), gilthead seabream ( Sparus aurata ) and cobia ( Rachycentron canadum )-is compared across species and regions and with climate change, based on outputs of a high-resolution global climate model. Globally, there are ample areas that are physically suitable for fish growth and potential expansion of the nascent aquaculture industry. The effects of climate change are heterogeneous across species and regions, but areas with existing aquaculture industries are likely to see increases in growth rates. In areas where climate change results in reduced growth rates, adaptation measures, such as selective breeding, can probably offset potential production losses. © 2017 The Author(s).

  4. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...

  5. A Probability-based Evolutionary Algorithm with Mutations to Learn Bayesian Networks

    Directory of Open Access Journals (Sweden)

    Sho Fukuda

    2014-12-01

    Full Text Available Bayesian networks are regarded as one of the essential tools to analyze causal relationships between events from data. Learning the structure of highly reliable Bayesian networks from data as quickly as possible is an important problem that several studies have tried to solve. In recent years, probability-based evolutionary algorithms have been proposed as a new, efficient approach to learning Bayesian networks. In this paper, we target one of the probability-based evolutionary algorithms, PBIL (Probability-Based Incremental Learning), and propose a new mutation operator. Through performance evaluation, we found that the proposed mutation operator performs well in learning Bayesian networks.
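
    A minimal PBIL loop with a probability-vector mutation step, in Python. This is the textbook PBIL scheme on a toy OneMax objective, not the authors' specific operator or their network-learning encoding.

```python
import numpy as np

def pbil(fitness, n_bits, pop_size=50, lr=0.1, mut_prob=0.02, mut_shift=0.05,
         generations=200, seed=0):
    """Minimal PBIL loop with a probability-vector mutation step.

    The probability vector p encodes, per bit, the chance of sampling a 1.
    Each generation: sample a population, move p toward the best individual,
    then randomly perturb ("mutate") entries of p.
    """
    rng = np.random.default_rng(seed)
    p = np.full(n_bits, 0.5)
    for _ in range(generations):
        pop = rng.random((pop_size, n_bits)) < p
        best = pop[np.argmax([fitness(ind) for ind in pop])]
        p = (1 - lr) * p + lr * best                # learning step
        mutate = rng.random(n_bits) < mut_prob      # mutation step
        p[mutate] = (1 - mut_shift) * p[mutate] + mut_shift * rng.random(mutate.sum())
        p = np.clip(p, 0.01, 0.99)                  # keep exploration alive
    return p

# Toy objective: maximize the number of ones (OneMax).
print(np.round(pbil(fitness=np.sum, n_bits=20), 2))
```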

  6. Uncertainty plus prior equals rational bias: an intuitive Bayesian probability weighting function.

    Science.gov (United States)

    Fennell, John; Baddeley, Roland

    2012-10-01

    Empirical research has shown that when making choices based on probabilistic options, people behave as if they overestimate small probabilities, underestimate large probabilities, and treat positive and negative outcomes differently. These distortions have been modeled using a nonlinear probability weighting function, which is found in several nonexpected utility theories, including rank-dependent models and prospect theory; here, we propose a Bayesian approach to the probability weighting function and, with it, a psychological rationale. In the real world, uncertainty is ubiquitous and, accordingly, the optimal strategy is to combine probability statements with prior information using Bayes' rule. First, we show that any reasonable prior on probabilities leads to 2 of the observed effects; overweighting of low probabilities and underweighting of high probabilities. We then investigate 2 plausible kinds of priors: informative priors based on previous experience and uninformative priors of ignorance. Individually, these priors potentially lead to large problems of bias and inefficiency, respectively; however, when combined using Bayesian model comparison methods, both forms of prior can be applied adaptively, gaining the efficiency of empirical priors and the robustness of ignorance priors. We illustrate this for the simple case of generic good and bad options, using Internet blogs to estimate the relevant priors of inference. Given this combined ignorant/informative prior, the Bayesian probability weighting function is not only robust and efficient but also matches all of the major characteristics of the distortions found in empirical research. PsycINFO Database Record (c) 2012 APA, all rights reserved.
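
    One simple way to see how a prior produces the inverse-S shape is to treat a stated probability as if it were data combined with a Beta prior. The Python sketch below is a simplified illustration in this spirit, not the paper's exact model; the pseudo-sample size n is a free assumption.

```python
import numpy as np

def bayesian_weight(p, n=10, a=1.0, b=1.0):
    """Posterior mean after treating a stated probability p as n Bernoulli
    observations combined with a Beta(a, b) prior (a = b = 1 is uniform).

    Small p is pulled up and large p pulled down, reproducing the
    inverse-S shape of empirical probability weighting functions.
    """
    p = np.asarray(p, dtype=float)
    return (a + n * p) / (a + b + n)

for p in [0.01, 0.1, 0.5, 0.9, 0.99]:
    print(f"stated p = {p:4.2f} -> weighted w(p) = {bayesian_weight(p):.3f}")
```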

  7. The OpenMC Monte Carlo particle transport code

    International Nuclear Information System (INIS)

    Romano, Paul K.; Forget, Benoit

    2013-01-01

    Highlights: ► An open source Monte Carlo particle transport code, OpenMC, has been developed. ► Solid geometry and continuous-energy physics allow high-fidelity simulations. ► Development has focused on high performance and modern I/O techniques. ► OpenMC is capable of scaling up to hundreds of thousands of processors. ► Results on a variety of benchmark problems agree with MCNP5. -- Abstract: A new Monte Carlo code called OpenMC is currently under development at the Massachusetts Institute of Technology as a tool for simulation on high-performance computing platforms. Given that many legacy codes do not scale well on existing and future parallel computer architectures, OpenMC has been developed from scratch with a focus on high performance scalable algorithms as well as modern software design practices. The present work describes the methods used in the OpenMC code and demonstrates the performance and accuracy of the code on a variety of problems.
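
    A minimal eigenvalue calculation using OpenMC's Python API, assuming a recent version of the openmc package and installed cross-section data; the sphere geometry and composition are arbitrary illustration values.

```python
import openmc

# Material: a simple enriched-uranium sphere (illustrative composition).
fuel = openmc.Material(name="fuel")
fuel.add_nuclide("U235", 0.05)
fuel.add_nuclide("U238", 0.95)
fuel.set_density("g/cm3", 10.0)

# Geometry: one cell bounded by a sphere with a vacuum boundary.
sphere = openmc.Sphere(r=10.0, boundary_type="vacuum")
cell = openmc.Cell(fill=fuel, region=-sphere)
geometry = openmc.Geometry(openmc.Universe(cells=[cell]))

# Settings: a small criticality (eigenvalue) run.
settings = openmc.Settings()
settings.batches = 50
settings.inactive = 10
settings.particles = 1000

openmc.Materials([fuel]).export_to_xml()
geometry.export_to_xml()
settings.export_to_xml()
openmc.run()  # requires an OpenMC installation with cross-section data
```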

  8. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t

  9. Estimation of functional failure probability of passive systems based on subset simulation method

    International Nuclear Information System (INIS)

    Wang Dongqing; Wang Baosheng; Zhang Jianmin; Jiang Jing

    2012-01-01

    In order to solve the problem of multi-dimensional epistemic uncertainties and the small functional failure probability of passive systems, an innovative reliability analysis algorithm, subset simulation based on Markov chain Monte Carlo, was presented. The method is founded on the idea that a small failure probability can be expressed as a product of larger conditional failure probabilities by introducing a proper choice of intermediate failure events. Markov chain Monte Carlo simulation was implemented to efficiently generate conditional samples for estimating the conditional failure probabilities. Taking the AP1000 passive residual heat removal system as an example, the uncertainties related to the model of a passive system and the numerical values of its input parameters were considered, and the probability of functional failure was estimated with the subset simulation method. The numerical results demonstrate that the subset simulation method has high computing efficiency and excellent computing accuracy compared with traditional probability analysis methods. (authors)
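
    A compact Python sketch of subset simulation for a small failure probability, using adaptive intermediate thresholds and a simple Metropolis random walk for the conditional sampling (a simplified variant of the modified Metropolis algorithm usually used). The AP1000 application is replaced by a toy limit-state function.

```python
import numpy as np

def subset_simulation(g, dim, n_samples=1000, p0=0.1, seed=0):
    """Estimate P(g(x) <= 0) for x ~ N(0, I) via subset simulation."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal((n_samples, dim))
    gx = np.apply_along_axis(g, 1, x)
    p_f, n_seeds = 1.0, int(p0 * n_samples)
    for _ in range(20):                       # cap on the number of levels
        idx = np.argsort(gx)
        threshold = gx[idx[n_seeds - 1]]      # adaptive intermediate level
        if threshold <= 0:                    # final level reached
            return p_f * np.mean(gx <= 0)
        p_f *= p0
        seeds, seed_g = x[idx[:n_seeds]], gx[idx[:n_seeds]]
        chains, xs, gs = n_samples // n_seeds, [], []
        for s, gval in zip(seeds, seed_g):
            cur, cur_g = s, gval
            for _ in range(chains):
                cand = cur + 0.5 * rng.standard_normal(dim)
                # Metropolis accept by the standard-normal density ratio,
                # then enforce the current intermediate failure event.
                if rng.random() < np.exp(0.5 * (cur @ cur - cand @ cand)):
                    cg = g(cand)
                    if cg <= threshold:
                        cur, cur_g = cand, cg
                xs.append(cur.copy())
                gs.append(cur_g)
        x, gx = np.array(xs), np.array(gs)
    return p_f * np.mean(gx <= 0)

# Toy limit state: failure when the sum of 10 standard normals exceeds 12.
g = lambda x: 12.0 - np.sum(x)
print(subset_simulation(g, dim=10))  # exact: 1 - Phi(12/sqrt(10)) ~ 7.4e-5
```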

  10. Determining open cluster membership. A Bayesian framework for quantitative member classification

    Science.gov (United States)

    Stott, Jonathan J.

    2018-01-01

    Aims: My goal is to develop a quantitative algorithm for assessing open cluster membership probabilities. The algorithm is designed to work with single-epoch observations. In its simplest form, only one set of program images and one set of reference images are required. Methods: The algorithm is based on a two-stage joint astrometric and photometric assessment of cluster membership probabilities. The probabilities were computed within a Bayesian framework using any available prior information. Where possible, the algorithm emphasizes simplicity over mathematical sophistication. Results: The algorithm was implemented and tested against three observational fields using published survey data. M 67 and NGC 654 were selected as cluster examples while a third, cluster-free, field was used for the final test data set. The algorithm shows good quantitative agreement with the existing surveys and has a false-positive rate significantly lower than the astrometric or photometric methods used individually.
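
    The Bayesian membership assessment can be illustrated in one dimension: given Gaussian cluster and field models for a proper motion, the posterior membership probability follows directly from Bayes' rule. All numbers below are invented for illustration and do not come from the paper.

```python
import numpy as np
from scipy import stats

def membership_probability(pm, pm_err, mu_cl, sd_cl, mu_fld, sd_fld,
                           prior_cl=0.3):
    """Posterior probability that a star is a cluster member, from one
    astrometric dimension (proper motion) with Gaussian cluster/field models.

    P(member | pm) = P(pm | member) P(member) /
                     [P(pm | member) P(member) + P(pm | field) P(field)]
    """
    s_cl = np.hypot(sd_cl, pm_err)     # convolve model width with errors
    s_fld = np.hypot(sd_fld, pm_err)
    like_cl = stats.norm.pdf(pm, mu_cl, s_cl)
    like_fld = stats.norm.pdf(pm, mu_fld, s_fld)
    num = like_cl * prior_cl
    return num / (num + like_fld * (1 - prior_cl))

# Illustrative values (mas/yr): a star close to the cluster's mean motion.
print(membership_probability(pm=-9.8, pm_err=0.3, mu_cl=-10.0, sd_cl=0.5,
                             mu_fld=-2.0, sd_fld=8.0))  # ~0.90
```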

  11. Defining Probability in Sex Offender Risk Assessment.

    Science.gov (United States)

    Elwood, Richard W

    2016-12-01

    There is ongoing debate and confusion over using actuarial scales to predict individuals' risk of sexual recidivism. Much of the debate comes from not distinguishing Frequentist from Bayesian definitions of probability. Much of the confusion comes from applying Frequentist probability to individuals' risk. By definition, only Bayesian probability can be applied to the single case. The Bayesian concept of probability resolves most of the confusion and much of the debate in sex offender risk assessment. Although Bayesian probability is well accepted in risk assessment generally, it has not been widely used to assess the risk of sex offenders. I review the two concepts of probability and show how the Bayesian view alone provides a coherent scheme to conceptualize individuals' risk of sexual recidivism.

  12. UT Biomedical Informatics Lab (BMIL) probability wheel

    Science.gov (United States)

    Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B.; Sun, Clement; Fan, Kaili; Reece, Gregory P.; Kim, Min Soon; Markey, Mia K.

    A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant", about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.

  13. Subspace Learning via Local Probability Distribution for Hyperspectral Image Classification

    Directory of Open Access Journals (Sweden)

    Huiwu Luo

    2015-01-01

    Full Text Available The computational processing of hyperspectral images (HSI) is extremely complex, not only due to the high-dimensional information, but also due to the highly correlated data structure. The need for effective processing and analysis of HSI has met many difficulties. Dimensionality reduction has been shown to be a powerful tool for high-dimensional data analysis, and local Fisher's linear discriminant analysis (LFDA) is an effective method for HSI processing. In this paper, a novel approach, called PD-LFDA, is proposed to overcome the weakness of LFDA. PD-LFDA emphasizes the probability distribution (PD) in LFDA, where the maximum distance is replaced with local variance for the construction of the weight matrix and the class prior probability is applied to compute the affinity matrix. The proposed approach increases the discriminant ability of the transformed features in low-dimensional space. Experimental results on the Indian Pines 1992 data indicate that the proposed approach significantly outperforms the traditional alternatives.

  14. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  15. Quantum computing and probability

    International Nuclear Information System (INIS)

    Ferry, David K

    2009-01-01

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)

  16. Development of accident sequence precursors methodologies for core damage Probabilities in NPPs

    International Nuclear Information System (INIS)

    Munoz, R.; Minguez, E.; Melendez, E.; Sanchez-Perea, M.; Izquierdo, J.M.

    1998-01-01

    Several licensing programs have focused on evaluating the importance of operating events that occurred in NPPs. Some have addressed the dynamic aspects of the sequence of events involved, reproducing the incidents, while others are based on PSA applications to incident analysis. A method that combines the two approaches above to determine risk derives from the Integrated Safety Assessment (ISA) methodology. The dynamics of the event is followed by transient simulation in tree form, building a Setpoint or Deterministic Dynamic Event Tree (DDET). When a setpoint is reached, the actuation of a protection is triggered, and the tree is opened in branches corresponding to different functioning states; the engineering simulator then follows each branch with the new states. One of these states is the nominal one, which in the PSA is associated with the success criterion of the system. The probability of each sequence is calculated in parallel with the dynamics. The coupled simulation should be performed by the following tools: 1. A Scheduler that drives the simulation of the different sequences and opens branches upon demand. It is the sole generator of processes while constructing the tree and distributes the computation across a distributed computational environment. 2. The Plant Simulator, which models the plant systems and the operator actions throughout a sequence. It receives the state of the equipment in each sequence, must provide information about setpoint crossings to the Scheduler, and receives decision flags to continue or stop each sequence and to send new conditions to other plant simulators. 3. The Probability Calculator, linked only to the Scheduler, which handles the fault trees associated with each event tree header and performs their Boolean product. (Author)
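
    A toy sketch of the Scheduler idea described above: each branch is simulated until a setpoint crossing, at which point the tree opens one branch per functioning state, with the branch probability accumulated in parallel with the dynamics. Everything here (the state, dynamics, setpoint and branching probabilities) is invented for illustration and is not the ISA implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Branch:
    time: float
    state: dict
    prob: float
    history: list = field(default_factory=list)

def schedule_ddet(simulate, headers, t_end, dt=1.0):
    """Toy scheduler for a deterministic dynamic event tree (DDET).

    simulate : advances a plant state by dt and reports a crossed setpoint
    headers  : setpoint -> list of (branch_label, probability) outcomes
    """
    open_branches = [Branch(0.0, {"level": 1.0}, 1.0)]
    finished = []
    while open_branches:
        b = open_branches.pop()
        while b.time < t_end:
            crossed = simulate(b.state, dt)
            b.time += dt
            if crossed in headers:
                # open one branch per functioning state of the demanded system
                for label, p in headers[crossed]:
                    open_branches.append(Branch(
                        b.time, dict(b.state, mode=label),
                        b.prob * p, b.history + [(b.time, label)]))
                break
        else:
            finished.append(b)
    return finished

# Illustrative dynamics: a level decays until a low-level setpoint trips.
def simulate(state, dt):
    rate = 0.02 if state.get("mode") == "injection-on" else 0.1
    state["level"] -= rate * dt
    if state["level"] <= 0.5 and not state.get("tripped"):
        state["tripped"] = True
        return "low-level"
    return None

headers = {"low-level": [("injection-on", 0.99), ("injection-fails", 0.01)]}
for b in schedule_ddet(simulate, headers, t_end=10.0):
    print(f"p = {b.prob:.2f}  history = {b.history}  final = {b.state}")
```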

  17. [Biometric bases: basic concepts of probability calculation].

    Science.gov (United States)

    Dinya, E

    1998-04-26

    The author gives an outline of the basic concepts of probability theory. The basics of event algebra, the definition of probability, the classical probability model and the random variable are presented.

  18. Pre-aggregation for Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of probabilistic OLAP queries that operate on aggregate values that are probability distributions… and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate… multidimensional data analysis that is considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions)…
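
    As an illustration of what aggregation over probability distributions involves (a generic sketch, not the paper's algorithm): the SUM aggregate of two independent uncertain cell values is the convolution of their probability mass functions, and the pre-aggregated result can itself be aggregated further up the dimension hierarchy.

        def aggregate_sum(pmf_a, pmf_b):
            """pmf maps value -> probability; returns the pmf of the sum."""
            out = {}
            for va, pa in pmf_a.items():
                for vb, pb in pmf_b.items():
                    out[va + vb] = out.get(va + vb, 0.0) + pa * pb
            return out

        cell_a = {10: 0.5, 20: 0.5}    # uncertain measure in cell A (illustrative)
        cell_b = {5: 0.2, 15: 0.8}     # uncertain measure in cell B
        print(aggregate_sum(cell_a, cell_b))   # {15: 0.1, 25: 0.5, 35: 0.4}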

  19. Contributions to quantum probability

    International Nuclear Information System (INIS)

    Fritz, Tobias

    2010-01-01

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome

  20. Probability inequalities for decomposition integrals

    Czech Academy of Sciences Publication Activity Database

    Agahi, H.; Mesiar, Radko

    2017-01-01

    Roč. 315, č. 1 (2017), s. 240-248 ISSN 0377-0427 Institutional support: RVO:67985556 Keywords : Decomposition integral * Superdecomposition integral * Probability inequalities Subject RIV: BA - General Mathematics OBOR OECD: Statistics and probability Impact factor: 1.357, year: 2016 http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf

  1. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    International Nuclear Information System (INIS)

    Helton, J.C.

    1996-03-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, L_st, p_st) for stochastic uncertainty, a probability space (S_su, L_su, p_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, L_st, p_st) and (S_su, L_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems

  2. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    International Nuclear Information System (INIS)

    Helton, J.C.

    1996-01-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, L_st, P_st) for stochastic uncertainty, a probability space (S_su, L_su, P_su) for subjective uncertainty and a function (i.e., a random variable) defined on the product space associated with (S_st, L_st, P_st) and (S_su, L_su, P_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the US Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems

  3. The Open Access Dilemma

    Science.gov (United States)

    Pratt, Timothy

    2017-01-01

    Community colleges, with their commitment to open access, admit millions of students each year who are unprepared for college-level work, even though they have earned a high-school diploma. For decades the schools had a built-in base of students attracted to their open doors and relative affordability. But enrollment at public two-year colleges has…

  4. Application of escape probability to line transfer in laser-produced plasmas

    International Nuclear Information System (INIS)

    Lee, Y.T.; London, R.A.; Zimmerman, G.B.; Haglestein, P.L.

    1989-01-01

    In this paper the authors apply the escape probability method to treat the transfer of optically thick lines in laser-produced plasmas in plane-parallel geometry. They investigate the effect of self-absorption on the ionization balance and ion level populations. In addition, they calculate this effect on the laser gains in an exploding foil target heated by an optical laser. Due to the large ion streaming motion in laser-produced plasmas, absorption of an emitted photon occurs only over the length in which the Doppler shift is equal to the line width. They find that the escape probability calculated with the Doppler shift is larger than the escape probability for a static plasma. Therefore, the ion streaming motion contributes significantly to the line transfer process in laser-produced plasmas. As examples, they have applied the escape probability method to calculate the transfer of optically thick lines in both ablating slab and exploding foil targets under irradiation by a high-power optical laser
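
    The Doppler-shift effect described above can be illustrated with the Sobolev form of the escape probability, a standard approximation for moving media (this sketch is generic, not the authors' code): in an expanding plasma a photon interacts only where the flow shift stays within the line width, giving beta = (1 - exp(-tau))/tau, which greatly exceeds a static-slab estimate at large optical depth tau.

        import numpy as np

        tau = np.array([0.1, 1.0, 10.0, 100.0])
        beta_moving = (1.0 - np.exp(-tau)) / tau   # Sobolev escape probability
        beta_static = np.exp(-tau)                 # crude static-slab estimate
        for t, bm, bs in zip(tau, beta_moving, beta_static):
            print(f"tau={t:6.1f}  moving={bm:.3e}  static={bs:.3e}")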

  5. Numeracy moderates the influence of task-irrelevant affect on probability weighting.

    Science.gov (United States)

    Traczyk, Jakub; Fulawka, Kamil

    2016-06-01

    Statistical numeracy, defined as the ability to understand and process statistical and probability information, plays a significant role in superior decision making. However, recent research has demonstrated that statistical numeracy goes beyond simple comprehension of numbers and mathematical operations. In contrast to previous studies that focused on emotions integral to risky prospects, we hypothesized that highly numerate individuals would exhibit more linear probability weighting because they would be less biased by incidental and decision-irrelevant affect. Participants were instructed to make a series of insurance decisions preceded by negative (i.e., fear-inducing) or neutral stimuli. We found that incidental negative affect increased the curvature of the probability weighting function (PWF). Interestingly, this effect was significant only for less numerate individuals, while probability weighting in more numerate people was not altered by decision-irrelevant affect. We propose two candidate mechanisms for the observed effect. Copyright © 2016 Elsevier B.V. All rights reserved.
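
    The abstract does not give the functional form of the PWF; a common one-parameter form (Tversky and Kahneman, 1992) illustrates what "curvature" means here: gamma = 1 is linear weighting, while smaller gamma overweights small probabilities and underweights large ones.

        import numpy as np

        def pwf(p, gamma):
            return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

        p = np.linspace(0.01, 0.99, 5)
        print("p:           ", np.round(p, 2))
        print("near-linear: ", np.round(pwf(p, 0.9), 3))   # highly numerate (illustrative)
        print("more curved: ", np.round(pwf(p, 0.6), 3))   # affect-biased (illustrative)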

  6. Probable high prevalence of limb-girdle muscular dystrophy type 2D in Taiwan.

    Science.gov (United States)

    Liang, Wen-Chen; Chou, Po-Ching; Hung, Chia-Cheng; Su, Yi-Ning; Kan, Tsu-Min; Chen, Wan-Zi; Hayashi, Yukiko K; Nishino, Ichizo; Jong, Yuh-Jyh

    2016-03-15

    Limb-girdle muscular dystrophy type 2D (LGMD2D), an autosomal-recessive inherited LGMD, is caused by mutations in SGCA. SGCA encodes alpha-sarcoglycan (SG), which forms a heterotetramer with other SGs in the sarcolemma and comprises part of the dystrophin-glycoprotein complex. The frequency of LGMD2D varies among ethnic backgrounds, and so far only a few patients have been reported in Asia. We identified five patients with a novel homozygous mutation of c.101G>T (p.Arg34Leu) in SGCA from a large aboriginal family ethnically comprising two tribes in Taiwan. Patient 3 is the maternal uncle of patients 1 and 2. All their parents, heterozygous for c.101G>T, denied consanguineous marriages although they were from the same tribe. The heterozygous parents of patients 4 and 5 were from two different tribes, originally residing in different geographic regions of Taiwan. Haplotype analysis showed that all five patients shared the same mutation-associated haplotype, indicating a probable founder effect and consanguinity. The results suggest that the carrier rate of c.101G>T in SGCA may be high in Taiwan, especially in the aboriginal population regardless of tribe. It is important to investigate the prevalence of LGMD2D in Taiwan for early diagnosis and treatment. Copyright © 2016. Published by Elsevier B.V.

  7. Void probability scaling in hadron nucleus interactions

    International Nuclear Information System (INIS)

    Ghosh, Dipak; Deb, Argha; Bhattacharyya, Swarnapratim; Ghosh, Jayita; Bandyopadhyay, Prabhat; Das, Rupa; Mukherjee, Sima

    2002-01-01

    Hegyi, while investigating the rapidity gap probability (which measures the chance of finding no particle in the pseudo-rapidity interval Δη), found that the scaling behavior of the rapidity gap probability corresponds closely to the scaling of the void probability in galaxy correlation studies. The main aim of this paper is to study the scaling behavior of the rapidity gap probability
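
    A toy check of the void-probability idea (illustrative only, not the paper's analysis): for a Poisson multiplicity with density rho per unit pseudo-rapidity, the probability that a window of width d_eta is empty is exp(-rho*d_eta), which a simple simulation reproduces.

        import numpy as np

        rng = np.random.default_rng(0)
        rho, d_eta, n_events = 3.0, 0.5, 100_000
        counts = rng.poisson(rho * d_eta, size=n_events)   # particles per window
        print("simulated P0:  ", (counts == 0).mean())
        print("exp(-rho*dEta):", np.exp(-rho * d_eta))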

  8. Improving information extraction using a probability-based approach

    DEFF Research Database (Denmark)

    Kim, S.; Ahmed, Saeema; Wallace, K.

    2007-01-01

    Information plays a crucial role during the entire life-cycle of a product. It has been shown that engineers frequently consult colleagues to obtain the information they require to solve problems. However, the industrial world is now more transient and key personnel move to other companies… or retire. It is becoming essential to retrieve vital information from archived product documents, if it is available. There is, therefore, great interest in ways of extracting relevant and sharable information from documents. A keyword-based search is commonly used, but studies have shown… the recall, while maintaining the high precision, a learning approach that makes identification decisions based on a probability model, rather than simply looking up the presence of the pre-defined variations, looks promising. This paper presents the results of developing such a probability-based entity…

  9. Failure probability analysis on mercury target vessel

    International Nuclear Information System (INIS)

    Ishikura, Syuichi; Futakawa, Masatoshi; Kogawa, Hiroyuki; Sato, Hiroshi; Haga, Katsuhiro; Ikeda, Yujiro

    2005-03-01

    Failure probability analysis was carried out to estimate the lifetime of the mercury target which will be installed in the JSNS (Japan Spallation Neutron Source) in J-PARC (Japan Proton Accelerator Research Complex). The lifetime was estimated taking loading conditions and materials degradation into account. The loads imposed on the target vessel were the static stresses due to thermal expansion and the static pre-pressures of the He gas and mercury, and the dynamic stresses due to the thermally shocked pressure waves generated repeatedly at 25 Hz. Materials used in the target vessel will be degraded by fatigue, neutron and proton irradiation, mercury immersion, pitting damage, etc. The imposed stresses were evaluated through static and dynamic structural analyses. The material degradations were deduced from published experimental data. As a result, it was quantitatively confirmed that the failure probability over the design lifetime is very low, 10^-11, in the safety hull, meaning that it will hardly fail during the design lifetime. On the other hand, the beam window of the mercury vessel, subjected to high-pressure waves, exhibits a failure probability of 12%. It was concluded, therefore, that mercury leaking from a failed area at the beam window can be adequately contained in the space between the safety hull and the mercury vessel by using mercury-leakage sensors. (author)

  10. Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem

    Directory of Open Access Journals (Sweden)

    Juliana Bueno-Soler

    2016-09-01

    Full Text Available This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs). We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions, employing appropriate notions of conditional probability and paraconsistent updating via a version of Bayes' theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.
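
    For reference, the classical conditionalization rule that the paraconsistent version generalizes is the usual Bayes theorem (the LFI-based form itself is not reproduced here); in LaTeX notation:

        P(A \mid B) \;=\; \frac{P(B \mid A)\, P(A)}{P(B)}, \qquad P(B) > 0 .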

  11. Soybean P34 Probable Thiol Protease Probably Has Proteolytic Activity on Oleosins.

    Science.gov (United States)

    Zhao, Luping; Kong, Xiangzhen; Zhang, Caimeng; Hua, Yufei; Chen, Yeming

    2017-07-19

    P34 probable thiol protease (P34) and Gly m Bd 30K (30K) are strongly implicated as the protease acting on the 24 kDa oleosin of soybean oil bodies. In this study, 9-day-germinated soybeans were used to separate bioprocessed P34 (P32) from bioprocessed 30K (28K). Interestingly, P32 existed as a dimer, whereas 28K existed as a monomer; a P32-rich sample had proteolytic activity and high cleavage-site specificity (Lys-Thr of the 24 kDa oleosin), whereas a 28K-rich sample showed low proteolytic activity; the P32-rich sample contained one thiol protease. After mixing with purified oil bodies, all P32 dimers dissociated and bound to 24 kDa oleosins to form P32-24 kDa oleosin complexes. Upon incubation, the 24 kDa oleosin was preferentially hydrolyzed, and two hydrolyzed products (HPs; 17 and 7 kDa) were confirmed. After most of the 24 kDa oleosin was hydrolyzed, some P32 existed as dimer, and the rest as P32-17 kDa HP complexes. It was suggested that P32 was the protease.

  12. Measurement of Plutonium-240 Angular Momentum Dependent Fission Probabilities Using the Alpha-Alpha' Reaction

    Science.gov (United States)

    Koglin, Johnathon

    Accurate nuclear reaction data from a few keV to tens of MeV and across the table of nuclides are essential to a number of applications of nuclear physics, including national security, nuclear forensics, nuclear astrophysics, and nuclear energy. Precise determination of (n, f) and neutron capture cross sections for reactions in high-flux environments is particularly important for a proper understanding of nuclear reactor performance and stellar nucleosynthesis. In these extreme environments, reactions on short-lived and otherwise difficult-to-produce isotopes play a significant role in system evolution and provide insights into the types of nuclear processes taking place; a detailed understanding of these processes is necessary to properly determine cross sections far from stability. Indirect methods are often attempted to measure cross sections on isotopes that are difficult to separate in a laboratory setting. Using the surrogate approach, the same compound nucleus as in the reaction of interest is created through a "surrogate" reaction on a different isotope and the resulting decay is measured. This result is combined with appropriate reaction theory for compound nucleus population, from which the desired cross sections can be inferred. This method has shown promise, but the theoretical framework often lacks the necessary experimental data to constrain models. In this work, dual arrays of silicon telescope particle identification detectors and photovoltaic (solar) cell fission fragment detectors have been used to measure the fission probability of the 240Pu(alpha, alpha'f) reaction - a surrogate for 239Pu(n, f) - using 35.9(2) MeV alpha particles at eleven scattering angles from 40° to 140° in 10° intervals and at nuclear excitation energies up to 16 MeV. Within experimental uncertainty, the maximum fission probability was observed at the neutron separation energy for each alpha scattering angle. Fission probabilities were separated into five 500 keV bins from 5.5 MeV to

  13. Identification of High-Variation Fields based on Open Satellite Imagery

    DEFF Research Database (Denmark)

    Jeppesen, Jacob Høxbroe; Jacobsen, Rune Hylsberg; Nyholm Jørgensen, Rasmus

    2017-01-01

    This paper proposes a simple method for categorizing fields on a regional level, with respect to intra-field variations. It aims to identify fields where the potential benefits of applying precision agricultural practices are highest from an economic and environmental perspective. The categorization is based on vegetation indices derived from Sentinel-2 satellite imagery. A case study on 7678 winter wheat fields is presented, which employs open data and open source software to analyze the satellite imagery. Furthermore, the method can be automated to deliver categorizations at every update…
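
    A minimal sketch of this kind of vegetation-index categorization (band layout and thresholds are assumptions, not the paper's pipeline): compute NDVI from Sentinel-2 red (B4) and near-infrared (B8) reflectances, then rank fields by the coefficient of variation of NDVI within each field.

        import numpy as np

        def ndvi(red, nir):
            return (nir - red) / (nir + red + 1e-9)

        def intra_field_variation(red_pixels, nir_pixels):
            v = ndvi(np.asarray(red_pixels), np.asarray(nir_pixels))
            return v.std() / abs(v.mean())         # coefficient of variation of NDVI

        rng = np.random.default_rng(1)             # synthetic per-field pixel samples
        uniform = intra_field_variation(rng.normal(0.05, 0.002, 500),
                                        rng.normal(0.40, 0.010, 500))
        patchy = intra_field_variation(rng.normal(0.05, 0.020, 500),
                                       rng.normal(0.40, 0.100, 500))
        print(f"uniform field CV: {uniform:.3f}   patchy field CV: {patchy:.3f}")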

  14. Exploring non-signalling polytopes with negative probability

    International Nuclear Information System (INIS)

    Oas, G; Barros, J Acacio de; Carvalhaes, C

    2014-01-01

    Bipartite and tripartite EPR–Bell type systems are examined via joint quasi-probability distributions where probabilities are permitted to be negative. It is shown that such distributions exist only when the no-signalling condition is satisfied. A characteristic measure, the probability mass, is introduced and, via its minimization, limits the number of quasi-distributions describing a given marginal probability distribution. The minimized probability mass is shown to be an alternative way to characterize non-local systems. Non-signalling polytopes for two to eight settings in the bipartite scenario are examined and compared to prior work. Examining perfect cloning of non-local systems within the tripartite scenario suggests defining two categories of signalling. It is seen that many properties of non-local systems can be efficiently described by quasi-probability theory. (paper)

  15. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. * Good and solid introduction to probability theory and stochastic processes * Logically organized; writing is presented in a clear manner * Choice of topics is comprehensive within the area of probability * Ample homework problems are organized into chapter sections

  16. The enigma of probability and physics

    International Nuclear Information System (INIS)

    Mayants, L.

    1984-01-01

    This volume contains a coherent exposition of the elements of two unique sciences: probabilistics (the science of probability) and probabilistic physics (the application of probabilistics to physics). Proceeding from a key methodological principle, it starts with the disclosure of the true content of probability and the interrelation between probability theory and experimental statistics. This makes it possible to introduce a proper order in all the sciences dealing with probability and, by conceiving the real content of statistical mechanics and quantum mechanics in particular, to construct both as two interconnected domains of probabilistic physics. Consistent theories of the kinetics of physical transformations, decay processes, and intramolecular rearrangements are also outlined. The interrelation between the electromagnetic field, photons, and the theoretically discovered subatomic particle 'emon' is considered. Numerous internal imperfections of conventional probability theory, statistical physics, and quantum physics are exposed and removed - quantum physics no longer needs special interpretation. EPR, Bohm, and Bell paradoxes are easily resolved, among others. (Auth.)

  17. Targeting the probability versus cost of feared outcomes in public speaking anxiety.

    Science.gov (United States)

    Nelson, Elizabeth A; Deacon, Brett J; Lickel, James J; Sy, Jennifer T

    2010-04-01

    Cognitive-behavioral theory suggests that social phobia is maintained, in part, by overestimates of the probability and cost of negative social events. Indeed, empirically supported cognitive-behavioral treatments directly target these cognitive biases through the use of in vivo exposure or behavioral experiments. While cognitive-behavioral theories and treatment protocols emphasize the importance of targeting probability and cost biases in the reduction of social anxiety, few studies have examined specific techniques for reducing probability and cost bias, and thus the relative efficacy of exposure to the probability versus cost of negative social events is unknown. In the present study, 37 undergraduates with high public speaking anxiety were randomly assigned to a single-session intervention designed to reduce either the perceived probability or the perceived cost of negative outcomes associated with public speaking. Compared to participants in the probability treatment condition, those in the cost treatment condition demonstrated significantly greater improvement on measures of public speaking anxiety and cost estimates for negative social events. The superior efficacy of the cost treatment condition was mediated by greater treatment-related changes in social cost estimates. The clinical implications of these findings are discussed. Published by Elsevier Ltd.

  18. Organizing Open Innovation for Sustainability

    NARCIS (Netherlands)

    Ingenbleek, P.T.M.; Backus, G.B.C.

    2015-01-01

    Literature on open innovation has thus far predominantly focused on high technology contexts. Once an industry reaches the limits of a closed innovation model, open innovation may, however, also promise opportunities for sustainable development in a low-tech environment. Because in low-tech

  19. F.Y. Edgeworth’s Treatise on Probabilities

    OpenAIRE

    Alberto Baccini

    2007-01-01

    Probability theory has a central role in Edgeworth’s thought; this paper examines the philosophical foundation of the theory. Starting from a frequentist position, Edgeworth introduced some innovations on the definition of primitive probabilities. He distinguished between primitive probabilities based on experience of statistical evidence, and primitive a priori probabilities based on a more general and less precise kind of experience, inherited by the human race through evolution. Given prim...

  20. Constructing diagnostic likelihood: clinical decisions using subjective versus statistical probability.

    Science.gov (United States)

    Kinnear, John; Jackson, Ruth

    2017-07-01

    Although physicians are highly trained in the application of evidence-based medicine and are assumed to make rational decisions, there is evidence that their decision making is prone to biases. One of the biases shown to affect the accuracy of judgements is representativeness and base-rate neglect, where the saliency of a person's features leads to overestimation of their likelihood of belonging to a group. This results in the substitution of 'subjective' probability for statistical probability. This study examines clinicians' propensity to make estimations of subjective probability when presented with clinical information that is considered typical of a medical condition. The strength of the representativeness bias is tested by presenting choices in textual and graphic form. Understanding of statistical probability is also tested by omitting all clinical information. For the questions that included clinical information, 46.7% and 45.5% of clinicians, respectively, made judgements of statistical probability. Where the question omitted clinical information, 79.9% of clinicians made a judgement consistent with statistical probability. There was a statistically significant difference in responses to the questions with and without representativeness information (χ2 (1, n=254)=54.45, p<0.001), consistent with clinicians substituting subjective for statistical probability when salient clinical information was present. One of the causes of this representativeness bias may be the way clinical medicine is taught, where stereotypic presentations are emphasised in diagnostic decision making. Published by the BMJ Publishing Group Limited.
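
    A worked base-rate example of the distinction at stake (numbers are illustrative, not from the study): even when the clinical features are highly typical of a condition, a low base rate implies a modest statistical probability.

        prevalence = 0.01                 # base rate of the condition
        p_typical_given_disease = 0.90    # features are very "representative"
        p_typical_given_healthy = 0.10

        # Bayes' theorem: P(disease | typical) = P(typical | disease) P(disease) / P(typical)
        p_typical = (p_typical_given_disease * prevalence
                     + p_typical_given_healthy * (1 - prevalence))
        posterior = p_typical_given_disease * prevalence / p_typical
        print(f"P(disease | typical features) = {posterior:.3f}")   # ~0.083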

  1. Some open questions in 'wave chaos'

    International Nuclear Information System (INIS)

    Nonnenmacher, Stéphane

    2008-01-01

    The subject area referred to as 'wave chaos', 'quantum chaos' or 'quantum chaology' has been investigated mostly by the theoretical physics community over the last 30 years. The questions it raises have more recently also attracted the attention of mathematicians and mathematical physicists, due to connections with number theory, graph theory, Riemannian, hyperbolic or complex geometry, classical dynamical systems, probability, etc. After giving a rough account of 'what is quantum chaos?', I intend to list some pending questions, some of them having been raised a long time ago, some others more recent. The choice of problems (and of references) is of course partial and personal. (open problem)

  2. Open life science research, open software and the open century

    Directory of Open Access Journals (Sweden)

    Youhua Chen

    2015-05-01

    Full Text Available In the age of knowledge explosion and mass scientific information, I highlight the importance of conducting open science in life and medical research through the extensive use of open software and documents. The purpose of conducting open science is to improve the limited repeatability of research in the life sciences. I outline the essential steps for conducting open life science and the necessary standards for creating, reusing and reproducing open materials. Different Creative Commons licenses are presented and compared in terms of their usage scope and restrictions. In conclusion, I argue that open materials should be widely adopted in life and medical research.

  3. Symmetry-Breaking Charge Transfer in a Zinc Chlorodipyrrin Acceptor for High Open Circuit Voltage Organic Photovoltaics

    KAUST Repository

    Bartynski, Andrew N.

    2015-04-29

    © 2015 American Chemical Society. Low open-circuit voltages significantly limit the power conversion efficiency of organic photovoltaic devices. Typical strategies to enhance the open-circuit voltage involve tuning the HOMO and LUMO positions of the donor (D) and acceptor (A), respectively, to increase the interfacial energy gap, or tailoring the donor or acceptor structure at the D/A interface. Here, we present an alternative approach to improve the open-circuit voltage through the use of a zinc chlorodipyrrin, ZCl [bis(dodecachloro-5-mesityldipyrrinato)zinc], as an acceptor, which undergoes symmetry-breaking charge transfer (CT) at the donor/acceptor interface. DBP/ZCl cells exhibit open-circuit voltages of 1.33 V compared to 0.88 V for analogous tetraphenyldibenzoperiflanthene (DBP)/C60-based devices. Charge transfer state energies measured by Fourier-transform photocurrent spectroscopy and electroluminescence show that C60 forms a CT state of 1.45 ± 0.05 eV in a DBP/C60-based organic photovoltaic device, while ZCl as acceptor gives a CT state energy of 1.70 ± 0.05 eV in the corresponding device structure. In the ZCl device this results in an energetic loss between E_CT and qV_OC of 0.37 eV, substantially less than the 0.6 eV typically observed for organic systems and equal to the recombination losses seen in high-efficiency Si and GaAs devices. The substantial increase in open-circuit voltage and reduction in recombination losses for devices utilizing ZCl demonstrate the great promise of symmetry-breaking charge transfer in organic photovoltaic devices.

  4. Open Source Software Projects Needing Security Investments

    Science.gov (United States)

    2015-06-19

    modtls, BouncyCastle, gpg, otr, axolotl. 7. Static analyzers: Clang, Frama-C. 8. Nginx. 9. OpenVPN . It was noted that the funding model may be similar...to OpenSSL, where consulting funds the company. It was also noted that OpenVPN needs to correctly use OpenSSL in order to be secure, so focusing on...Dovecot 4. Other high-impact network services: OpenSSH, OpenVPN , BIND, ISC DHCP, University of Delaware NTPD 5. Core infrastructure data parsers

  5. Trading on extinction: An open-access deterrence model for the South African abalone fishery

    Directory of Open Access Journals (Sweden)

    Douglas J. Crookes

    2016-03-01

    Full Text Available South African rhinoceros (e.g. Diceros bicornis) and abalone (Haliotis midae) have in common that they are both harvested under open-access conditions, are high-value commodities and are traded illegally. The difference is that a legal market for abalone already exists. An open-access deterrence model was developed for South African abalone, using Table Mountain National Park as a case study. It was found that illegal poaching spiked following the closure of the recreational fishery. The resource custodian's objective is to maximise returns from confiscations. This study showed that a legal trade results in a trading-on-extinction resource trap, with a race for profits, an increase in the probability of detection after a poaching event and the depletion of populations. In contrast with HS Gordon's seminal article (J Polit Econ 1954;62:124-142), profit maximisation does not automatically improve the sustainability of the resource. Under certain conditions (e.g. a legal trade with costly enforcement), profit maximisation may actually deplete abalone populations. The article also has implications for rhino populations, as a legal trade is currently proposed.
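
    A minimal open-access sketch in the spirit of the Gordon bioeconomic model cited above (parameters illustrative; this is not the paper's deterrence model): the stock grows logistically, effort chases rent, and higher prices push the open-access equilibrium stock c/(p*q) toward depletion.

        def open_access_stock(p, c, r=0.5, K=1.0, q=0.5, eta=0.5,
                              dt=0.05, steps=4000):
            X, E = 0.5, 0.1                                 # initial stock and effort
            for _ in range(steps):
                harvest = q * E * X
                X += dt * (r * X * (1 - X / K) - harvest)   # logistic stock dynamics
                E += dt * eta * (p * harvest - c * E)       # effort follows rent
                X, E = max(X, 1e-9), max(E, 0.0)
            return X

        print("p = 4:  long-run stock", round(open_access_stock(p=4.0, c=1.0), 3))   # ~c/(p q) = 0.5
        print("p = 20: long-run stock", round(open_access_stock(p=20.0, c=1.0), 3))  # ~0.1, depleted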

  6. kspectrum: an open-source code for high-resolution molecular absorption spectra production

    International Nuclear Information System (INIS)

    Eymet, V.; Coustet, C.; Piaud, B.

    2016-01-01

    We present kspectrum, a scientific code that produces high-resolution synthetic absorption spectra from public molecular transition parameter databases. This code was originally required by the atmospheric and astrophysics communities, and its evolution is now driven by new scientific projects among the user community. Since it was designed without any optimization specific to a particular application field, its use could also be extended to other domains. kspectrum produces spectral data that can subsequently be used either for high-resolution radiative transfer simulations or for producing statistical spectral model parameters using additional tools. This is an open project that aims at providing an up-to-date tool that takes advantage of modern computational hardware and recent parallelization libraries. It is currently provided by Méso-Star (http://www.meso-star.com) under the CeCILL license, and benefits from regular updates and improvements. (paper)
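
    Line-by-line codes of this kind build each transition's contribution from a Voigt line shape; a generic implementation (not kspectrum's actual kernel) uses the Faddeeva function available in SciPy.

        import numpy as np
        from scipy.special import wofz

        def voigt(nu, nu0, gamma_l, sigma_g):
            """Voigt profile at wavenumber nu, centred on nu0
            (Lorentz HWHM gamma_l, Gaussian width sigma_g)."""
            z = ((nu - nu0) + 1j * gamma_l) / (sigma_g * np.sqrt(2.0))
            return wofz(z).real / (sigma_g * np.sqrt(2.0 * np.pi))

        nu = np.linspace(-5.0, 5.0, 11)
        print(voigt(nu, 0.0, gamma_l=0.1, sigma_g=0.5))   # normalized to unit area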

  7. Decreased Serum Lipids in Patients with Probable Alzheimer´s Disease

    Directory of Open Access Journals (Sweden)

    Orhan Lepara

    2009-08-01

    Full Text Available Alzheimer's disease (AD) is a multifactorial disease, but its aetiology and pathophysiology are still not fully understood. Epidemiologic studies examining the association between lipids and dementia have reported conflicting results. High total cholesterol has been associated with both an increased and a decreased risk of AD and/or vascular dementia (VAD), whereas other studies found no association. The aim of this study was to investigate serum lipid concentrations in patients with probable AD, as well as a possible correlation between serum lipid concentrations and cognitive impairment. Our cross-sectional study included 30 patients with probable AD and 30 age- and sex-matched control subjects. Probable AD was clinically diagnosed by NINCDS-ADRDA criteria. Serum total cholesterol (TC), high-density lipoprotein cholesterol (HDL-C) and triglyceride (TG) levels were determined at the initial assessment using standard enzymatic colorimetric techniques. Low-density lipoprotein cholesterol (LDL-C) and very low density lipoprotein cholesterol (VLDL-C) levels were calculated. Subjects with probable AD had significantly lower serum TG (p<0.01), TC (p<0.05), LDL-C (p<0.05) and VLDL-C (p<0.01) compared to the control group. We did not observe a significant difference in HDL-C level between patients with probable AD and control subjects. A negative, although not significant, correlation between TG, TC and VLDL-C and MMSE in patients with AD was observed. In the control group of subjects there was a negative correlation between TC and MMSE, but it was not statistically significant (r = -0.28). Further studies are required to explore the possibility for serum lipids to serve as diagnostic and therapeutic markers of AD.

  8. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  9. Compensating for geographic variation in detection probability with water depth improves abundance estimates of coastal marine megafauna.

    Science.gov (United States)

    Hagihara, Rie; Jones, Rhondda E; Sobtzick, Susan; Cleguer, Christophe; Garrigue, Claire; Marsh, Helene

    2018-01-01

    The probability of an aquatic animal being available for detection is typically less than one, and accounting for the probability of detection is important for obtaining robust estimates of population abundance and determining its status and trends. The dugong (Dugong dugon) is a bottom-feeding marine mammal and a seagrass community specialist. We hypothesized that the probability of a dugong being available for detection depends on water depth and that dugongs spend more time underwater in deep-water seagrass habitats than in shallow-water seagrass habitats. We tested this hypothesis by quantifying the depth use of 28 wild dugongs fitted with GPS satellite transmitters and time-depth recorders (TDRs) at three sites with distinct seagrass depth distributions: 1) open waters supporting extensive seagrass meadows to 40 m deep (Torres Strait, 6 dugongs, 2015); 2) a protected bay (average water depth 6.8 m) with extensive shallow seagrass beds (Moreton Bay, 13 dugongs, 2011 and 2012); and 3) a mixture of lagoon, coral and seagrass habitats to 60 m deep (New Caledonia, 9 dugongs, 2013). The fitted instruments were used to measure the times the dugongs spent in the experimentally determined detection zones under various environmental conditions. The estimated probability of detection was applied to aerial survey data previously collected at each location. In general, dugongs were least available for detection in Torres Strait, and the population estimates increased 6-7 fold using depth-specific availability correction factors compared with earlier estimates that assumed homogeneous detection probability across water depth and location. Detection probabilities were higher in Moreton Bay and New Caledonia than in Torres Strait because the water transparency in these two locations was much greater than in Torres Strait and the effect of correcting for depth-specific detection probability was much smaller. The methodology has application to visual surveys of coastal megafauna including surveys using Unmanned
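
    The availability correction works roughly as follows (a sketch with invented numbers, not the paper's correction factors): each sighting is inflated by the inverse of the product of the availability and perception probabilities, so assigning deep-water sightings a low availability raises the estimate substantially.

        def abundance_estimate(counts_by_depth, p_available, p_perceived=0.8):
            # Horvitz-Thompson-style inflation of counts by detection probability
            return sum(n / (p_available[d] * p_perceived)
                       for d, n in counts_by_depth.items())

        counts = {"shallow": 40, "deep": 15}
        p_uniform = {"shallow": 0.4, "deep": 0.4}     # homogeneous availability
        p_by_depth = {"shallow": 0.5, "deep": 0.1}    # deep water: rarely visible
        print(round(abundance_estimate(counts, p_uniform)))    # 172
        print(round(abundance_estimate(counts, p_by_depth)))   # 288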

  10. OpenCL-Based Linear Algebra Libraries for High-Performance Computing, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Despite its promise, OpenCL adoption is slow, owing to a lack of libraries and tools. Vendors have shown few signs of plans to provide OpenCL libraries, and were...

  11. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  12. Weak openness and almost openness

    Directory of Open Access Journals (Sweden)

    David A. Rose

    1984-01-01

    Full Text Available Weak openness and almost openness for arbitrary functions between topological spaces are defined as duals to the weak continuity of Levine and the almost continuity of Husain respectively. Independence of these two openness conditions is noted and comparison is made between these and the almost openness of Singal and Singal. Some results dual to those known for weak continuity and almost continuity are obtained. Nearly almost openness is defined and used to obtain an improved link from weak continuity to almost continuity.

  13. Modelling detection probabilities to evaluate management and control tools for an invasive species

    Science.gov (United States)

    Christy, M.T.; Yackel Adams, A.A.; Rodda, G.H.; Savidge, J.A.; Tyrrell, C.L.

    2010-01-01

    For most ecologists, detection probability (p) is a nuisance variable that must be modelled to estimate the state variable of interest (i.e. survival, abundance, or occupancy). However, in the realm of invasive species control, the rate of detection and removal is the rate-limiting step for management of this pervasive environmental problem. For strategic planning of an eradication (removal of every individual), one must identify the least likely individual to be removed, and determine the probability of removing it. To evaluate visual searching as a control tool for populations of the invasive brown treesnake Boiga irregularis, we designed a mark-recapture study to evaluate detection probability as a function of time, gender, size, body condition, recent detection history, residency status, searcher team and environmental covariates. We evaluated these factors using 654 captures resulting from visual detections of 117 snakes residing in a 5-ha semi-forested enclosure on Guam, fenced to prevent immigration and emigration of snakes but not their prey. Visual detection probability was low overall (0.07 per occasion) but reached 0.18 under optimal circumstances. Our results supported sex-specific differences in detectability that were a quadratic function of size, with both small and large females having lower detection probabilities than males of those sizes. There was strong evidence for individual periodic changes in detectability of a few days' duration, roughly doubling detection probability (comparing peak to non-elevated detections). Snakes in poor body condition had estimated mean detection probabilities greater than snakes with high body condition. Search teams with high average detection rates exhibited detection probabilities about twice those of search teams with low average detection rates. Surveys conducted with bright moonlight and strong wind gusts exhibited moderately decreased probabilities of detecting snakes. Synthesis and applications. By

  14. Exact calculation of loop formation probability identifies folding motifs in RNA secondary structures

    Science.gov (United States)

    Sloma, Michael F.; Mathews, David H.

    2016-01-01

    RNA secondary structure prediction is widely used to analyze RNA sequences. In an RNA partition function calculation, free energy nearest neighbor parameters are used in a dynamic programming algorithm to estimate statistical properties of the secondary structure ensemble. Previously, partition functions have largely been used to estimate the probability that a given pair of nucleotides form a base pair, the conditional stacking probability, the accessibility to binding of a continuous stretch of nucleotides, or a representative sample of RNA structures. Here it is demonstrated that an RNA partition function can also be used to calculate the exact probability of formation of hairpin loops, internal loops, bulge loops, or multibranch loops at a given position. This calculation can also be used to estimate the probability of formation of specific helices. Benchmarking on a set of RNA sequences with known secondary structures indicated that loops that were calculated to be more probable were more likely to be present in the known structure than less probable loops. Furthermore, highly probable loops are more likely to be in the known structure than the set of loops predicted in the lowest free energy structures. PMID:27852924

  15. Measurement of probability distributions for internal stresses in dislocated crystals

    Energy Technology Data Exchange (ETDEWEB)

    Wilkinson, Angus J.; Tarleton, Edmund; Vilalta-Clemente, Arantxa; Collins, David M. [Department of Materials, University of Oxford, Parks Road, Oxford OX1 3PH (United Kingdom); Jiang, Jun; Britton, T. Benjamin [Department of Materials, Imperial College London, Royal School of Mines, Exhibition Road, London SW7 2AZ (United Kingdom)

    2014-11-03

    Here, we analyse residual stress distributions obtained from various crystal systems using high-resolution electron backscatter diffraction (EBSD) measurements. Histograms showing stress probability distributions exhibit tails extending to very high stress levels. We demonstrate that these extreme stress values are consistent with the functional form that should be expected for dislocated crystals. Analysis initially developed by Groma and co-workers for X-ray line profile analysis, based on the so-called "restricted second moment of the probability distribution", can be used to estimate the total dislocation density. The generality of the results is illustrated by application to three quite different systems, namely, face-centred cubic Cu deformed in uniaxial tension, a body-centred cubic steel deformed to larger strain by cold rolling, and hexagonal InAlN layers grown on misfitting sapphire and silicon carbide substrates.
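
    Schematically (after Groma and co-workers; prefactors depending on the dislocation geometry are omitted, so treat this as a sketch), the dislocation-induced tail and the restricted second moment behave, in LaTeX notation, as

        P(\sigma) \sim \frac{C\,\rho}{|\sigma|^{3}} \quad (|\sigma| \to \infty),
        \qquad
        \langle\sigma^{2}\rangle_{v} = \int_{-v}^{v} \sigma^{2}\,P(\sigma)\,\mathrm{d}\sigma \;\propto\; \rho \ln v ,

    so the total dislocation density rho can be read off the slope of the restricted second moment plotted against ln v.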

  16. Improving Design with Open Innovation

    DEFF Research Database (Denmark)

    Christiansen, John K.; Gasparin, Marta; Varnes, Claus

    2013-01-01

    …of the best practices for open innovation. However, projects were not equally successful. The outcome seems to be highly influenced by the type of collaborative arrangements used and their application. In particular, the analysis indicates that organizational factors seem to be indicative but not sufficient… in open innovation need both a broad knowledge of the various potential elements of an open innovation effort and a flexible attitude toward their application.

  17. Rate and time to develop first central line-associated bloodstream infections when comparing open and closed infusion containers in a Brazilian Hospital

    Directory of Open Access Journals (Sweden)

    Margarete Vilins

    Full Text Available The objective of the study was to determine the effect of switching from an open (glass or semi-rigid plastic) infusion container to a closed, fully collapsible plastic infusion container (Viaflex®) on the rate and time to onset of central line-associated bloodstream infections (CLABSI). An open-label, prospective cohort, active healthcare-associated infection surveillance, sequential study was conducted in three intensive care units in Brazil. The CLABSI rate using open infusion containers was compared to the rate using a closed infusion container. The probability of acquiring CLABSI was assessed over time and compared between the open and closed infusion container periods; three-day intervals were examined. A total of 1125 adult ICU patients were enrolled. The CLABSI rate was significantly higher during the open compared with the closed infusion container period (6.5 versus 3.2 CLABSI/1000 CL-days; RR=0.49, 95% CI=0.26-0.95, p=0.031). During the closed infusion container period, the probability of acquiring a CLABSI remained relatively constant over the time of central line use (0.8% Days 2-4 to 0.7% Days 11-13) but increased in the open infusion container period (1.5% Days 2-4 to 2.3% Days 11-13). Combined across all time intervals, the chance of a patient acquiring a CLABSI was significantly lower (by 55%) in the closed infusion container period (Cox proportional hazard ratio 0.45, p=0.019). CLABSIs can be reduced with the use of full barrier precautions, education, and performance feedback. Our results show that switching from an open to a closed infusion container may further reduce the CLABSI rate as well as delay the onset of CLABSIs. Closed infusion containers significantly reduced the CLABSI rate and the probability of acquiring CLABSI.
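
    For clarity, the headline rates are infections per 1000 central-line days; the counts below are illustrative values chosen to reproduce the reported rates, not the study's raw data.

        def clabsi_rate(infections, line_days):
            return 1000.0 * infections / line_days

        open_rate = clabsi_rate(infections=26, line_days=4000)     # 6.5 (counts assumed)
        closed_rate = clabsi_rate(infections=13, line_days=4062)   # ~3.2 (counts assumed)
        print(f"open: {open_rate:.1f}  closed: {closed_rate:.1f} per 1000 CL-days")
        print(f"rate ratio: {closed_rate / open_rate:.2f}")        # ~0.49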

  18. Failure Probability Estimation of Wind Turbines by Enhanced Monte Carlo

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Nielsen, Søren R.K.; Naess, Arvid

    2012-01-01

    This paper discusses the estimation of the failure probability of wind turbines required by codes of practice for designing them. Standard Monte Carlo (SMC) simulation may conceptually be used for this purpose as an alternative to the popular Peaks-Over-Threshold (POT) method. However, estimation of very low failure probabilities with SMC simulations leads to unacceptably high computational costs. In this study, an Enhanced Monte Carlo (EMC) method is proposed that overcomes this obstacle. The method has advantages over both POT and SMC in terms of its low computational cost and accuracy… is controlled by the pitch controller. This provides a fair framework for comparison of the behavior and failure events of the wind turbine, with emphasis on the effect of the pitch controller. The Enhanced Monte Carlo method is then applied to the model and the failure probabilities of the model are estimated…
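
    The core idea of enhanced Monte Carlo can be sketched as tail extrapolation (a toy version under assumed tail shape, not the paper's implementation): estimate exceedance probabilities at relaxed thresholds where plain sampling is cheap, fit a parametric tail, and extrapolate to the design level.

        import numpy as np

        rng = np.random.default_rng(2)
        response = rng.standard_normal(200_000)     # stand-in for simulated turbine response

        thresholds = np.linspace(1.5, 3.5, 9)       # relaxed levels, cheap to estimate
        p_hat = np.array([(response > t).mean() for t in thresholds])

        # assumed Gaussian-like tail log p = a - b t^2; fit a and b by least squares
        A = np.vstack([np.ones_like(thresholds), thresholds**2]).T
        coef, *_ = np.linalg.lstsq(A, np.log(p_hat), rcond=None)
        a, b = coef[0], -coef[1]

        t_design = 5.0                              # far beyond what plain SMC can resolve
        print("extrapolated failure probability:", np.exp(a - b * t_design**2))
        # exact tail for this toy model: scipy.stats.norm.sf(5.0) ~ 2.9e-7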

  19. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a

  20. Using Playing Cards to Differentiate Probability Interpretations

    Science.gov (United States)

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.