WorldWideScience

Sample records for high breaching probability

  1. Detonation probabilities of high explosives

    Energy Technology Data Exchange (ETDEWEB)

    Eisenhawer, S.W.; Bott, T.F.; Bement, T.R.

    1995-07-01

    The probability of a high explosive violent reaction (HEVR) following various events is an extremely important aspect of estimating accident-sequence frequency for nuclear weapons dismantlement. In this paper, we describe the development of response curves for insults to PBX 9404, a conventional high-performance explosive used in US weapons. The insults during dismantlement include drops of high explosive (HE), strikes of tools and components on HE, and abrasion of the explosive. In the case of drops, we combine available test data on HEVRs and the results of flooring certification tests to estimate the HEVR probability. For other insults, it was necessary to use expert opinion. We describe the expert solicitation process and the methods used to consolidate the responses. The HEVR probabilities obtained from both approaches are compared.
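The abstract does not give the estimation details; as a minimal illustrative sketch (all numbers hypothetical, not from the paper), a reaction probability can be estimated from n insult trials with k observed reactions, with a Jeffreys-prior Beta posterior avoiding an implausible zero estimate when no reactions were observed:

```python
def hevr_point_estimate(k, n):
    """Raw frequency estimate of reaction probability: k reactions in n trials."""
    return k / n

def hevr_posterior_mean(k, n):
    """Posterior mean under a Jeffreys Beta(0.5, 0.5) prior -- useful when
    k = 0, where the raw frequency would suggest a zero probability."""
    return (k + 0.5) / (n + 1.0)

# Hypothetical drop-test outcome: 0 reactions in 40 drops
print(hevr_point_estimate(0, 40))   # 0.0
print(hevr_posterior_mean(0, 40))
```

The posterior mean gives a small but nonzero probability, which is the more defensible input to an accident-sequence frequency model.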

  2. Data Breach Preparation

    Energy Technology Data Exchange (ETDEWEB)

    Belangia, David Warren [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]

    2015-03-13

    The Home Depot data breach is the second largest data breach on record. It has affected, or will affect, up to 56 million debit and credit cards. A trusted vendor account, coupled with a previously unknown variant of malware that allowed the establishment of a foothold, was the entry point into the Home Depot network. Once inside the perimeter, privilege escalation provided an avenue to obtain the desired information. Home Depot did, however, learn some lessons from Target: it communicated better than Target had, procured insurance, and instituted as secure an environment as possible. There are specific measures an institution should undertake to prepare for a data breach, and everyone can learn from this one. Publicly available information about the Home Depot data breach provides insight into the attack, an old malware variant with a new twist. While the malware was modified so as to be unrecognizable by detection tools, it probably should have been detected. There are also concerns with Home Depot's insurance and the insurance provider's apparent failure to fully reimburse Home Depot for its losses. The effect on shareholders and Home Depot's stock price was short lived. This story is still evolving but provides interesting lessons learned concerning how an organization should prepare for its inevitable breach.

  3. Intermittent ephemeral river-breaching

    Science.gov (United States)

    Reniers, A. J.; MacMahan, J. H.; Gallagher, E. L.; Shanks, A.; Morgan, S.; Jarvis, M.; Thornton, E. B.; Brown, J.; Fujimura, A.

    2012-12-01

    In the summer of 2011 we performed a field experiment at Carmel River State Beach, CA, at a time when the intermittent natural breaching of the ephemeral Carmel River occurred due to an unusually rainy period prior to the experiment associated with El Niño. At this time the river would fill the lagoon over a period of a number of days, after which a breach would occur. This allowed us to document a number of breaches with unique pre- and post-breach topographic surveys and accompanying ocean and lagoon water elevations, as well as extremely high flow velocities (4 m/s) in the river mouth during the breaching event. The topographic surveys were obtained with a GPS-equipped backpack carried by a surveyor on foot and show the evolution of the river breaching, with a gradually widening and deepening river channel that cuts through the pre-existing beach and berm. The beach face is steep, with an average slope of 1:10 and significant reflection of the incident waves (MacMahan et al., 2012). The wave directions are generally shore-normal as the waves refract over the deep canyon located offshore of the beach. The tide is mixed semi-diurnal with a range on the order of one meter. Breaching typically occurred during the low-low tide. Grain size is highly variable along the beach, with layers of alternating fine and coarse material that could clearly be observed as the river exit channel cut through the beach. Large rocky outcroppings buried under the beach sand are also present along certain stretches of the beach, controlling the depth of the breaching channel. The changes in the water level measured within the lagoon and on the ocean side allow for an estimate of the volume flux associated with the breach as a function of morphology, tidal elevation and wave conditions, as well as an assessment of the conditions and mechanisms of breach closure, which occurred on a time scale of O(0.5 days).
Exploratory model simulations will be presented at the
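The stage-based flux estimate described in this abstract can be sketched as a storage balance: outflow through the breach approximately equals the lagoon surface area times the rate of stage fall. This simplification assumes a known, constant lagoon area and neglects river inflow; all numbers below are hypothetical:

```python
import numpy as np

def breach_flux(t, h, area_m2):
    """Breach outflow from lagoon storage change: Q(t) ~ -A * dh/dt.
    Assumes a constant lagoon surface area and negligible river inflow."""
    dhdt = np.gradient(h, t)   # stage change rate, m/s
    return -area_m2 * dhdt     # positive = outflow through the breach

# Hypothetical stage record during a breach drainage event
t = np.array([0.0, 600.0, 1200.0, 1800.0])   # time, s
h = np.array([1.20, 1.05, 0.80, 0.60])       # lagoon stage, m
print(breach_flux(t, h, area_m2=2.0e5))      # discharge series, m^3/s
```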

  4. 7th High Dimensional Probability Meeting

    CERN Document Server

    Mason, David; Reynaud-Bouret, Patricia; Rosinski, Jan

    2016-01-01

    This volume collects selected papers from the 7th High Dimensional Probability meeting held at the Institut d'Études Scientifiques de Cargèse (IESC) in Corsica, France. High Dimensional Probability (HDP) is an area of mathematics that includes the study of probability distributions and limit theorems in infinite-dimensional spaces such as Hilbert spaces and Banach spaces. The most remarkable feature of this area is that it has resulted in the creation of powerful new tools and perspectives, whose range of application has led to interactions with other subfields of mathematics, statistics, and computer science. These include random matrices, nonparametric statistics, empirical processes, statistical learning theory, concentration of measure phenomena, strong and weak approximations, functional estimation, combinatorial optimization, and random graphs. The contributions in this volume show that HDP theory continues to thrive and develop new tools, methods, techniques and perspectives to analyze random phenome...

  5. Flow hydrodynamics in embankment breach

    Institute of Scientific and Technical Information of China (English)

    ZHAO Gensheng; VISSER Paul J; REN Yankai; UIJTTEWAAL Wim S J

    2015-01-01

    Breaching flow occurs during breach development of embankments, dikes, earthen dams, landslide barriers, etc., and plays an important role as the driving force of breach erosion. According to previous research, the breaching process can be classified into an initiation phase, a breach-widening phase and a breach-deepening phase. Based on this classification, the breaching flow can be treated as a special compound weir flow when the breach channel is in a relatively equilibrium condition. Five physical flow models were designed, assuming rectangular and trapezoidal breach-channel cross sections, to study the breaching flow characteristics. The distributions of water level and velocity were measured and analysed for breaching flows under overtopping and emerged conditions. Two helicoidal flows formed above the breach channel slopes, and a triangular hydraulic jump formed downstream of the breach channel, in both the overtopping and emerged conditions. The hydraulic energy loss was calculated from the velocity and water-level distributions upstream and downstream of the model. The results of the physical breach-flow model offer insight into the embankment breaching process and can contribute to the validation and verification of numerical breach models.
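The "special compound weir flow" idealization can be illustrated with the standard broad-crested weir relation for a rectangular breach channel. This is a textbook formula, not the paper's calibrated model, and the discharge coefficient below is an assumed value:

```python
from math import sqrt

G = 9.81  # gravitational acceleration, m/s^2

def broad_crested_weir_q(b, H, cd=0.85):
    """Discharge (m^3/s) over a rectangular broad-crested weir of width b (m)
    under upstream head H (m) above the crest; cd is an assumed coefficient."""
    # Q = cd * (2/3) * sqrt(2g/3) * b * H^(3/2)
    return cd * (2.0 / 3.0) * sqrt(2.0 * G / 3.0) * b * H ** 1.5

# Hypothetical equilibrium breach channel: 4 m wide, 0.6 m upstream head
print(broad_crested_weir_q(b=4.0, H=0.6))
```

A trapezoidal breach section would add a side-slope term to the rating, which is why the study tested both cross-section shapes.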

  6. Medical data breaches

    DEFF Research Database (Denmark)

    Kierkegaard, Patrick

    2012-01-01

    The EU and the United States have implemented data breach notification rules that cover the health sectors. Nevertheless, data breach incidents involving medical data continue to rise, especially in the US and the UK. The HITECH Act, Pub. L. 111-5 Title XIII, is the first federal health breach notification law in the US to be characterized by less government intrusion, while the revised EU Privacy Directive, 2009/136/EC, calls for tougher privacy protection for data held by electronic communication providers. While the EU law sets a global de facto standard, the law remains toothless without strong...

  7. Data breaches. Final rule.

    Science.gov (United States)

    2008-04-11

    This document adopts, without change, the interim final rule that was published in the Federal Register on June 22, 2007, addressing data breaches of sensitive personal information that is processed or maintained by the Department of Veterans Affairs (VA). This final rule implements certain provisions of the Veterans Benefits, Health Care, and Information Technology Act of 2006. The regulations prescribe the mechanisms for taking action in response to a data breach of sensitive personal information.

  8. Using High-Probability Foods to Increase the Acceptance of Low-Probability Foods

    Science.gov (United States)

    Meier, Aimee E.; Fryling, Mitch J.; Wallace, Michele D.

    2012-01-01

    Studies have evaluated a range of interventions to treat food selectivity in children with autism and related developmental disabilities. The high-probability instructional sequence is one intervention with variable results in this area. We evaluated the effectiveness of a high-probability sequence using 3 presentations of a preferred food on…

  10. probably

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    [Examples] 1. He can probably tell us the truth. 2. Will it rain this afternoon? Probably. [Explanation] Used as an adverb meaning "probably, perhaps," it indicates a strong likelihood, usually expressing a positive inference or judgment based on the present situation.

  11. Applying Mechanistic Dam Breach Models to Historic Levee Breaches

    Directory of Open Access Journals (Sweden)

    Risher Paul

    2016-01-01

    Hurricane Katrina elevated levee risk in the US national consciousness, motivating agencies to assess and improve their levee risk assessment methodology. Accurate computation of the flood flow magnitude and timing associated with a levee breach remains one of the most difficult and uncertain components of levee risk analysis. Contemporary methods are largely empirical and approximate, introducing substantial uncertainty into the damage and life loss models. Levee breach progressions are often extrapolated to the final width and breach formation time based on limited experience with past breaches or using regression equations developed from a limited database of dam failures. Physically based embankment erosion models could improve levee breach modeling. However, while several mechanistic embankment breach models are available, they were developed for dams, and several aspects of the levee breach problem are distinct, departing from dam breach assumptions. This study applies three embankment breach models developed for dam breach analysis (DL Breach, HR BREACH, and WinDAM C) to historic levee breaches with observed (or inferred) breach rates, assessing the limitations and applicability of each model to the levee breach problem.

  12. Signal probability effects on high-workload vigilance tasks.

    Science.gov (United States)

    Matthews, G

    1996-09-01

    Signal probability is an important influence on vigilance. Typically, higher signal probability is associated with higher hit rate, lower response criterion, and lower response:signal ratio. However, signal probability effects on demanding, high-workload vigilance tasks have not been investigated. It is believed that attentional resources become depleted during performance of such tasks, leading to perceptual sensitivity decrements. Forty subjects performed high- (.35) and low- (.10) probability versions of a demanding vigilance task. Results differed in two important respects from those previously obtained with less demanding tasks. First, the decrement in perceptual sensitivity over time was greater for the high-probability task. Second, there were no effects of signal probability on response criterion. Subjective workload was higher for the high-probability task. Implications of the data for resource-depletion and expectancy theories of vigilance are discussed.
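The perceptual sensitivity and response criterion discussed in this abstract are the standard equal-variance signal detection indices, computable from hit and false-alarm rates. The rates below are illustrative, not the study's data:

```python
from statistics import NormalDist

z = NormalDist().inv_cdf  # inverse standard normal CDF

def sdt_indices(hit_rate, fa_rate):
    """Equal-variance signal detection theory: perceptual sensitivity d'
    and response criterion c from hit and false-alarm rates."""
    d_prime = z(hit_rate) - z(fa_rate)
    c = -0.5 * (z(hit_rate) + z(fa_rate))
    return d_prime, c

# Illustrative rates (not the study's data)
print(sdt_indices(0.80, 0.10))  # e.g. a high-signal-probability condition
print(sdt_indices(0.60, 0.05))  # e.g. a low-signal-probability condition
```

A drop in d' over the watch with an unchanged c is the signature pattern the resource-depletion account predicts.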

  13. Breach to Nowhere

    Science.gov (United States)

    Schaffhauser, Dian

    2009-01-01

    Will that data breach be the end of a chief security officer (CSO)? Managing information security in higher education requires more than just technical expertise, especially when the heat is cranked up. This article takes a look at how two CSOs deal with hack attacks at their universities. When Purdue University Chief Information Security Officer…

  15. Flood hydrology and dam-breach hydraulic analyses of five reservoirs in Colorado

    Science.gov (United States)

    Stevens, Michael R.; Hoogestraat, Galen K.

    2013-01-01

    The U.S. Department of Agriculture Forest Service has identified hazard concerns for areas downstream from five Colorado dams on Forest Service land. In 2009, the U.S. Geological Survey, in cooperation with the Forest Service, initiated a flood hydrology analysis to estimate the areal extent of potential downstream flood inundation and the hazard to downstream life, property, and infrastructure if a dam breach occurs. Readily available information was used for dam-breach assessments of five small Colorado reservoirs (Balman Reservoir, Crystal Lake, Manitou Park Lake, McGinnis Lake, and Million Reservoir), each impounded by an earthen dam; no new data were collected for hydraulic modeling. For each reservoir, two dam-breach scenarios were modeled: (1) the dam is overtopped but does not fail (break), and (2) the dam is overtopped and dam-break occurs. The dam-breach scenarios were modeled in response to the 100-year recurrence, 500-year recurrence, and probable maximum precipitation 24-hour-duration rainstorms to predict downstream flooding. For each dam-breach and storm scenario, a flood-inundation map was constructed to estimate the extent of flooding in areas of concern downstream from each dam. Simulation results of the dam-break scenarios were used to determine the hazard classification of the dam structure (high, significant, or low), which is primarily based on the potential for loss of life and property damage resulting from the predicted downstream flooding.

  16. Breached cylinder incident at the Portsmouth gaseous diffusion plant

    Energy Technology Data Exchange (ETDEWEB)

    Boelens, R.A. [Martin Marietta Energy Systems, Inc., Piketon, OH (United States)]

    1991-12-31

    On June 16, 1990, during an inspection of valves on partially depleted product storage cylinders, a 14-ton partially depleted product cylinder was discovered breached. The cylinder had been placed in long-term storage in 1977 on the top row of Portsmouth's (two rows high) storage area. The breach was observed when an inspector noticed a pile of green material alongside the cylinder. The breach was estimated to be approximately 8 inches wide and 16 inches long, and ran under the first stiffening ring of the cylinder. During the continuing inspection of the storage area, a second 14-ton product cylinder was discovered breached. This cylinder had been stacked on the bottom row of the storage area in 1986. This breach was also located adjacent to a stiffening ring. This paper will discuss the contributing factors in the breaching of the cylinders, the immediate response, subsequent actions in support of the investigation, and corrective actions.

  17. 38 CFR 75.113 - Data breach.

    Science.gov (United States)

    2010-07-01

    § 75.113 Data breach. Consistent with the definition of data breach in § 75.112 of this subpart, a data breach occurs under this subpart if there is a loss or theft...

  18. Antecedents of Psychological Contract Breach: The Role of Job Demands, Job Resources, and Affect

    OpenAIRE

    Tim Vantilborgh; Jemima Bidee; Roland Pepermans; Yannick Griep; Joeri Hofmans

    2016-01-01

    While it has been shown that psychological contract breach leads to detrimental outcomes, relatively little is known about factors leading to perceptions of breach. We examine if job demands and resources predict breach perceptions. We argue that perceiving high demands elicits negative affect, while perceiving high resources stimulates positive affect. Positive and negative affect, in turn, influence the likelihood that psychological contract breaches are perceived. We conducted two experien...

  19. Seasonal breaching of coastal barriers

    NARCIS (Netherlands)

    Tuan, Thieu Quang

    2007-01-01

    Natural or unintended breaching can be catastrophic, causing loss of human lives and damage to infrastructure, buildings and natural habitats. Quantitative understanding of coastal barrier breaching is therefore of great importance to vulnerability assessment of protection works as well as to

  20. Dam-breach analysis and flood-inundation mapping for selected dams in Oklahoma City, Oklahoma, and near Atoka, Oklahoma

    Science.gov (United States)

    Shivers, Molly J.; Smith, S. Jerrod; Grout, Trevor S.; Lewis, Jason M.

    2015-01-01

    Dams provide beneficial functions such as flood control, recreation, and storage of water supplies, but they also entail risk; dam breaches and resultant floods can cause substantial property damage and loss of life. The State of Oklahoma requires each owner of a high-hazard dam, which the Federal Emergency Management Agency defines as dams for which failure or improper operation probably will cause loss of human life, to develop an emergency action plan specific to that dam. Components of an emergency action plan are to simulate a flood resulting from a possible dam breach and map the resulting downstream flood-inundation areas. The resulting flood-inundation maps can provide valuable information to city officials, emergency managers, and local residents for planning an emergency response if a dam breach occurs.

  1. Recommendations for a Barrier Island Breach Management Plan for Fire Island National Seashore, including the Otis Pike High Dune Wilderness Area, Long Island, New York

    Science.gov (United States)

    Williams, S. Jeffress; Foley, Mary K.

    2007-01-01

    -control stabilization of the headlands, such as the Montauk Point headlands, and deepening of navigation channels by dredging through the tidal inlets and in the bays. Indirect impacts that have a bearing on decisions to deal with breaching are: high-risk development of the barrier islands and low-lying areas of the mainland vulnerable to flooding, and the dredging of nearshore sand shoals for beach nourishment. The NPS strives to employ a coastal management framework for decision making that is based on assessment of the physical and ecological properties of the shoreline as well as human welfare and property. In order to protect developed areas of Fire Island and the mainland from loss of life, flooding, and other economic and physical damage, the NPS will likely need to consider allowing artificial closure of some breaches within the FIIS under certain circumstances. The decision by the NPS to allow breaches to evolve naturally and possibly close, or to close breaches artificially, is based on four criteria: 1. Volumes of sediment transported landward and exchange of water and nutrients;

  2. Bounding the Probability of Error for High Precision Recognition

    CERN Document Server

    Kae, Andrew; Learned-Miller, Erik

    2009-01-01

    We consider models for which it is important, early in processing, to estimate some variables with high precision, but perhaps at relatively low rates of recall. If some variables can be identified with near certainty, then they can be conditioned upon, allowing further inference to be done efficiently. Specifically, we consider optical character recognition (OCR) systems that can be bootstrapped by identifying a subset of correctly translated document words with very high precision. This "clean set" is subsequently used as document-specific training data. While many current OCR systems produce measures of confidence for the identity of each letter or word, thresholding these confidence values, even at very high values, still produces some errors. We introduce a novel technique for identifying a set of correct words with very high precision. Rather than estimating posterior probabilities, we bound the probability that any given word is incorrect under very general assumptions, using an approximate worst case ...

  3. Domestic wells have high probability of pumping septic tank leachate

    Directory of Open Access Journals (Sweden)

    J. E. Bremer

    2012-08-01

    Onsite wastewater treatment systems are common in rural and semi-rural areas around the world; in the US, about 25–30% of households are served by a septic (onsite wastewater treatment) system, and many property owners also operate their own domestic well nearby. Site-specific conditions and local groundwater flow are often ignored when installing septic systems and wells. In areas with small lots (and thus high spatial septic system densities), shallow domestic wells are prone to contamination by septic system leachate. Mass balance approaches have been used to determine a maximum septic system density that would prevent contamination of groundwater resources. In this study, a source area model based on detailed groundwater flow and transport modeling is applied for a stochastic analysis of domestic well contamination by septic leachate. Specifically, we determine the probability that a source area overlaps with a septic system drainfield as a function of aquifer properties, septic system density and drainfield size. We show that high spatial septic system density poses a high probability of pumping septic system leachate. The hydraulic conductivity of the aquifer has a strong influence on the intersection probability. We find that mass balance calculations applied on a regional scale underestimate the contamination risk of individual drinking water wells by septic systems. This is particularly relevant for contaminants released at high concentrations, for substances that experience limited attenuation, and for those that are harmful even at low concentrations (e.g., pathogens).
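A drastically simplified version of the overlap question has a closed form: if drainfields are idealized as a spatial Poisson process, the chance that a well's source (capture) area contains at least one drainfield is 1 − exp(−λ), with λ the expected number of drainfields in that area. This ignores the flow-based source-area geometry central to the paper, so it is only a back-of-the-envelope check; the numbers are hypothetical:

```python
from math import exp

def overlap_probability(density_per_km2, source_area_m2):
    """Probability that a well's source (capture) area contains at least one
    septic drainfield, idealizing drainfields as a spatial Poisson process.
    This ignores groundwater flow geometry -- a back-of-the-envelope check."""
    expected = density_per_km2 * 1e-6 * source_area_m2  # mean count in area
    return 1.0 - exp(-expected)

# Hypothetical: 40 systems/km^2 and a 10,000 m^2 capture zone
print(overlap_probability(40, 10_000))
```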

  4. Comparing overflow and wave-overtopping induced breach initiation mechanisms in an embankment breach experiment

    Directory of Open Access Journals (Sweden)

    van Damme Myron

    2016-01-01

    As part of the SAFElevee project, Delft University of Technology collaborated with Flanders Hydraulics Research and Infram B.V. in the preparation and execution of a full-scale embankment breach experiment in November 2015. This breach experiment was performed on a 3.5 m high embankment with a sand core and clay outer layer, situated along the tidal river Scheldt in Belgium near Schellebelle. During the experiment a wave overtopping simulator and an overflow simulator were used to initiate a breach. Both simulators were placed near the top of the waterside slope. The use of the simulators facilitated comparison between the effects of continuous overflow and the effects of intermittent wave overtopping. This paper presents the data collected during the experiment, describes the development of hypotheses on the failure processes using the latest insights, and comments on the failure initiation process of a grass-covered flood embankment with a clay outer layer and a sandy core.

  5. High probability of disease in angina pectoris patients

    DEFF Research Database (Denmark)

    Høilund-Carlsen, Poul F.; Johansen, Allan; Vach, Werner

    2007-01-01

    BACKGROUND: According to most current guidelines, stable angina pectoris patients with a high probability of having coronary artery disease can be reliably identified clinically. OBJECTIVES: To examine the reliability of clinical evaluation with or without an at-rest electrocardiogram (ECG) in patients with a high probability of coronary artery disease. PATIENTS AND METHODS: A prospective series of 357 patients referred for coronary angiography (CA) for suspected stable angina pectoris were examined by a trained physician who judged their type of pain and Canadian Cardiovascular Society grade... on CA. Of the patients who also had an abnormal at-rest ECG, 14% to 21% of men and 42% to 57% of women had normal MPS. Sex-related differences were statistically significant. CONCLUSIONS: Clinical prediction appears to be unreliable. The addition of at-rest ECG data results in some improvement, particularly...

  6. Management of Classes with Breaches of Discipline

    Institute of Scientific and Technical Information of China (English)

    陈海军

    2009-01-01

    Because only children are often pampered and spoiled by their parents, contemporary students easily breach class discipline. Class management is of greater concern than ever because breaches of class discipline have a great impact on teaching. The author clarifies the necessity of studying the management of classes with breaches of discipline, enumerates the phenomena of such breaches, analyzes their causes in both teachers and students, and proposes several measures to effectively prevent breaches of class discipline.

  7. An extreme breaching of a barrier spit: insights on large breach formation and its impact on barrier dynamics

    Science.gov (United States)

    Iulian Zăinescu, Florin; Vespremeanu-Stroe, Alfred; Tătui, Florin

    2017-04-01

    In this study, we document a case of exceptionally large natural breaching of a sandy spit (Sacalin barrier, Danube delta) using lidar data and satellite imagery, annual (and seasonal) surveys of topography and bathymetry on successive cross-barrier profiles, and hourly datasets of wind and waves. The breach morphology and dynamics were monitored and described from inception to closure, together with the impact on the adjoining features (upper shoreface, back-barrier lagoon, downdrift coast) and on the local sediment budgets. Breaching was first observed over a beach length of 0.5 km in April 2012 and two years later had reached 3.5 km (May 2014). The barrier then translated to a recovery stage dominated by continuous back-barrier deposition through subaqueous cross-breach sediment transport. Soon, the barrier widening triggered a negative feedback which limited the back-barrier sediment transfer. As a result, back-barrier deposition decreased while barrier aggradation through overwash became more frequent. The event was found to be a natural experiment which switched the barrier's decadal evolution from low to high cross-shore transport over the barrier. Although previously considered constant, the cross-shore transport recorded during the large breach's lifespan is an order of magnitude larger than in the non-breach period. 3 × 10^6 m^3 of sediment were deposited in three years, which is equivalent to the modelled longshore transport in the region. Nevertheless, the sediment circuits are more complex, involving exchanges with the upper shoreface, as indicated by the extensive erosion down to -4 m. In the absence of tides, the Sacalin breach closed naturally in 3 years and provides a valuable contribution on how breaches may evolve, as only limited data have been reported internationally until now.
The very high deposition rate of sediment in the breach is a testimony of the high sediment volumes supplied by the longshore transport and the high

  8. Large earthquakes create vertical permeability by breaching aquitards

    Science.gov (United States)

    Wang, Chi-Yuen; Liao, Xin; Wang, Lee-Ping; Wang, Chung-Ho; Manga, Michael

    2016-08-01

    Hydrologic responses to earthquakes and their mechanisms have been widely studied. Some responses have been attributed to increases in the vertical permeability. However, basic questions remain: How do increases in the vertical permeability occur? How frequently do they occur? Is there a quantitative measure for detecting the occurrence of aquitard breaching? We try to answer these questions by examining data from a dense network of ~50 monitoring stations of clustered wells in a sedimentary basin near the epicenter of the 1999 M7.6 Chi-Chi earthquake in western Taiwan. While most stations show evidence that confined aquifers remained confined after the earthquake, about 10% of the stations show evidence of coseismic breaching of aquitards, creating vertical permeability as high as that of aquifers. The water levels in wells without evidence of coseismic breaching of aquitards show tidal responses similar to that of a confined aquifer before and after the earthquake. Those wells with evidence of coseismic breaching of aquitards, on the other hand, show distinctly different postseismic tidal response. Furthermore, the postseismic tidal response of different aquifers became strikingly similar, suggesting that the aquifers became hydraulically connected and the connection was maintained many months thereafter. Breaching of aquitards by large earthquakes has significant implications for a number of societal issues such as the safety of water resources, the security of underground waste repositories, and the production of oil and gas. The method demonstrated here may be used for detecting the occurrence of aquitard breaching by large earthquakes in other seismically active areas.
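The tidal-response comparison described here rests on fitting tidal constituents to well water levels. A single-constituent (M2) least-squares sketch on synthetic data looks like the following; it is a simplification of the multi-constituent analyses actually used in such studies:

```python
import numpy as np

M2_PERIOD_H = 12.4206  # principal lunar semidiurnal tide, hours

def fit_tidal_response(t_hours, level):
    """Least-squares fit of a single M2 harmonic to well water levels;
    returns (amplitude, phase in radians). Changes in amplitude and phase
    before vs. after an earthquake are the kind of signal used to infer
    aquitard breaching (single-constituent sketch)."""
    w = 2.0 * np.pi / M2_PERIOD_H
    A = np.column_stack([np.cos(w * t_hours), np.sin(w * t_hours),
                         np.ones_like(t_hours)])
    (a, b, _), *_ = np.linalg.lstsq(A, level, rcond=None)
    return np.hypot(a, b), np.arctan2(b, a)

t = np.arange(0, 240, 1.0)  # 10 days of hourly data
synthetic = 0.03 * np.cos(2 * np.pi / M2_PERIOD_H * t - 0.8) + 5.0
amp, phase = fit_tidal_response(t, synthetic)
print(amp, phase)
```

Applied to real records, a pre/post shift in the fitted phase toward that of a neighboring aquifer would be the signature of new hydraulic connection.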

  9. Douglas County Dam Breach Inundation Areas

    Data.gov (United States)

    Kansas Data Access and Support Center — Dam breach analysis provides a prediction of the extent and timing of flooding from a catastrophic breach of the dams. These results are sufficient for developing...

  10. An Information Theory of Willful Breach

    National Research Council Canada - National Science Library

    Oren Bar-Gill; Omri Ben-Shahar

    2009-01-01

    Should willful breach be sanctioned more severely than inadvertent breach? Strikingly, there is sharp disagreement on this matter within American legal doctrine, in legal theory, and in comparative law...

  11. Numerical methods for high-dimensional probability density function equations

    Science.gov (United States)

    Cho, H.; Venturi, D.; Karniadakis, G. E.

    2016-01-01

    In this paper we address the problem of computing the numerical solution to kinetic partial differential equations involving many phase variables. These types of equations arise naturally in many different areas of mathematical physics, e.g., in particle systems (Liouville and Boltzmann equations), stochastic dynamical systems (Fokker-Planck and Dostupov-Pugachev equations), random wave theory (Malakhov-Saichev equations) and coarse-grained stochastic systems (Mori-Zwanzig equations). We propose three different classes of new algorithms addressing high-dimensionality: The first one is based on separated series expansions resulting in a sequence of low-dimensional problems that can be solved recursively and in parallel by using alternating direction methods. The second class of algorithms relies on truncation of interaction in low-orders that resembles the Bogoliubov-Born-Green-Kirkwood-Yvon (BBGKY) framework of kinetic gas theory and it yields a hierarchy of coupled probability density function equations. The third class of algorithms is based on high-dimensional model representations, e.g., the ANOVA method and probabilistic collocation methods. A common feature of all these approaches is that they are reducible to the problem of computing the solution to high-dimensional equations via a sequence of low-dimensional problems. The effectiveness of the new algorithms is demonstrated in numerical examples involving nonlinear stochastic dynamical systems and partial differential equations, with up to 120 variables.
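None of the three algorithm classes is simple enough to sketch in a few lines, but the underlying object, a PDF transported by deterministic dynamics, can be illustrated with a sampling-based baseline for a one-variable Liouville problem. This Monte Carlo approach is a point of comparison, not one of the paper's methods; the dynamics are hypothetical:

```python
import numpy as np

# The Liouville equation transports an initial PDF along trajectories of a
# deterministic system. Sample-and-propagate baseline for dx/dt = -x**3:
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=100_000)  # draws from the initial density

dt, steps = 0.01, 200                   # integrate to t = 2
for _ in range(steps):
    x += dt * (-x ** 3)                 # forward Euler on every sample

# Empirical density of x at t = 2 (the Liouville solution, approximately)
hist, edges = np.histogram(x, bins=60, density=True)
print(float(np.var(x)))                 # variance shrinks as mass contracts
```

Grid-based solvers beat this baseline in accuracy per sample in one dimension; the paper's contribution is making the grid-based route tractable when there are many phase variables.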

  12. Numerical methods for high-dimensional probability density function equations

    Energy Technology Data Exchange (ETDEWEB)

    Cho, H. [Department of Mathematics, University of Maryland College Park, College Park, MD 20742 (United States); Venturi, D. [Department of Applied Mathematics and Statistics, University of California Santa Cruz, Santa Cruz, CA 95064 (United States); Karniadakis, G.E., E-mail: gk@dam.brown.edu [Division of Applied Mathematics, Brown University, Providence, RI 02912 (United States)

    2016-01-15

In this paper we address the problem of computing the numerical solution to kinetic partial differential equations involving many phase variables. These types of equations arise naturally in many different areas of mathematical physics, e.g., in particle systems (Liouville and Boltzmann equations), stochastic dynamical systems (Fokker–Planck and Dostupov–Pugachev equations), random wave theory (Malakhov–Saichev equations) and coarse-grained stochastic systems (Mori–Zwanzig equations). We propose three different classes of new algorithms addressing high dimensionality: the first is based on separated series expansions resulting in a sequence of low-dimensional problems that can be solved recursively and in parallel by using alternating direction methods. The second class of algorithms relies on a low-order truncation of interactions, resembling the Bogoliubov–Born–Green–Kirkwood–Yvon (BBGKY) framework of kinetic gas theory, and yields a hierarchy of coupled probability density function equations. The third class of algorithms is based on high-dimensional model representations, e.g., the ANOVA method and probabilistic collocation methods. A common feature of all these approaches is that they are reducible to the problem of computing the solution to high-dimensional equations via a sequence of low-dimensional problems. The effectiveness of the new algorithms is demonstrated in numerical examples involving nonlinear stochastic dynamical systems and partial differential equations, with up to 120 variables.

  13. 13 CFR 115.69 - Imminent Breach.

    Science.gov (United States)

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Imminent Breach. 115.69 Section... Surety Bond (PSB) Guarantees § 115.69 Imminent Breach. (a) No prior approval requirement. SBA will... an Imminent Breach of the terms of a Contract covered by an SBA guaranteed bond. The PSB Surety...

  14. Dam-breach analysis and flood-inundation mapping for Lakes Ellsworth and Lawtonka near Lawton, Oklahoma

    Science.gov (United States)

    Rendon, Samuel H.; Ashworth, Chad E.; Smith, S. Jerrod

    2012-01-01

    Dams provide beneficial functions such as flood control, recreation, and reliable water supplies, but they also entail risk: dam breaches and resultant floods can cause substantial property damage and loss of life. The State of Oklahoma requires each owner of a high-hazard dam, which the Federal Emergency Management Agency defines as dams for which failure or misoperation probably will cause loss of human life, to develop an emergency action plan specific to that dam. Components of an emergency action plan are to simulate a flood resulting from a possible dam breach and map the resulting downstream flood-inundation areas. The resulting flood-inundation maps can provide valuable information to city officials, emergency managers, and local residents for planning the emergency response if a dam breach occurs. Accurate topographic data are vital for developing flood-inundation maps. This report presents results of a cooperative study by the city of Lawton, Oklahoma, and the U.S. Geological Survey (USGS) to model dam-breach scenarios at Lakes Ellsworth and Lawtonka near Lawton and to map the potential flood-inundation areas of such dam breaches. To assist the city of Lawton with completion of the emergency action plans for Lakes Ellsworth and Lawtonka Dams, the USGS collected light detection and ranging (lidar) data that were used to develop a high-resolution digital elevation model and a 1-foot contour elevation map for the flood plains downstream from Lakes Ellsworth and Lawtonka. This digital elevation model and field measurements, streamflow-gaging station data (USGS streamflow-gaging station 07311000, East Cache Creek near Walters, Okla.), and hydraulic values were used as inputs for the dynamic (unsteady-flow) model, Hydrologic Engineering Center's River Analysis System (HEC-RAS). The modeled flood elevations were exported to a geographic information system to produce flood-inundation maps. Water-surface profiles were developed for a 75-percent probable maximum
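The study routes full unsteady breach hydrographs through HEC-RAS; a much cruder first-order check on peak breach outflow is the textbook broad-crested weir relation Q = C·b·H^1.5 (SI units, C ≈ 1.7 m^0.5/s). The sketch below is that simplification only, with a hypothetical breach width and head, not the study's model:

```python
# First-order peak dam-breach discharge from the broad-crested weir relation
# Q = C * b * H**1.5 (SI units, C ~ 1.7 m^0.5/s). A textbook simplification,
# not the unsteady HEC-RAS routing used in the report; width and head below
# are hypothetical.

def breach_peak_discharge(breach_width_m, head_m, weir_coeff=1.7):
    """Peak outflow (m^3/s) through a rectangular breach acting as a weir."""
    return weir_coeff * breach_width_m * head_m ** 1.5

q = breach_peak_discharge(breach_width_m=40.0, head_m=10.0)
print(round(q))  # 2150 m^3/s for a 40 m wide breach under 10 m of head
```

Such an estimate is useful only as a sanity check on the peak of a simulated hydrograph; the timing and attenuation of the flood wave still require dynamic routing.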

  15. High-resolution urban flood modelling - a joint probability approach

    Science.gov (United States)

    Hartnett, Michael; Olbert, Agnieszka; Nash, Stephen

    2017-04-01

(Divoky et al., 2005). Nevertheless, such events occur, and in Ireland alone there are several cases of serious damage due to flooding resulting from a combination of high sea water levels and river flows driven by the same meteorological conditions (e.g. Olbert et al. 2015). The November 2009 fluvial-coastal flooding of Cork City, which brought €100m in losses, was one such incident. This event was used by Olbert et al. (2015) to determine the processes controlling urban flooding and is further explored in this study to elaborate on coastal and fluvial flood mechanisms and their roles in controlling water levels. The objective of this research is to develop a methodology to assess the combined effect of multiple-source flooding on flood probability and severity in urban areas and to establish a set of conditions that dictate urban flooding due to extreme climatic events. These conditions broadly combine physical flood drivers (such as coastal and fluvial processes), their mechanisms, and the thresholds defining flood severity. The two main physical processes controlling urban flooding, high sea water levels (coastal flooding) and high river flows (fluvial flooding), and their threshold values above which flooding is likely to occur, are considered in this study. The contributions of coastal and fluvial drivers to flooding and their impacts are assessed in a two-step process. The first step involves frequency analysis and extreme value statistical modelling of storm surges, tides and river flows, and ultimately the application of the joint probability method to estimate joint exceedance return periods for combinations of surge, tide and river flows. In the second step, a numerical model of Cork Harbour, MSN_Flood, comprising a cascade of four nested high-resolution models, is used to simulate flood inundation under numerous hypothetical coastal and fluvial flood scenarios.
The risk of flooding is quantified based on a range of physical aspects such as the extent and depth of inundation (Apel et al
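The core of the joint probability step is that surge and river flow share meteorological drivers, so their extremes co-occur more often than independence would predict. A minimal synthetic illustration (all numbers ours, not the study's data) compares the empirical joint exceedance of two correlated variables with the independent-case product of marginals:

```python
import random, math

# Sketch of the joint-probability idea: estimate how often high sea level and
# high river flow are exceeded *together*; a joint return period follows as
# 1 / (events-per-year * p_joint). Correlated synthetic data, illustrative only.

random.seed(1)
rho = 0.6            # surge and flow share the same meteorological drivers
n = 200_000          # many synthetic "events" for a stable tail estimate
surge, flow = [], []
for _ in range(n):
    z1 = random.gauss(0, 1)
    z2 = rho * z1 + math.sqrt(1 - rho ** 2) * random.gauss(0, 1)
    surge.append(z1)
    flow.append(z2)

s_thr = f_thr = 2.0  # marginal ~2.3% exceedance thresholds
p_s = sum(s > s_thr for s in surge) / n
p_joint = sum(s > s_thr and f > f_thr for s, f in zip(surge, flow)) / n
print(p_joint > p_s * p_s)  # True: dependence makes compound floods likelier
```

In practice the marginals would come from extreme-value fits to observed surges, tides and flows, with the dependence modelled explicitly rather than assumed Gaussian.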

  16. Evaluation of a hydrological model based on Bidirectional Reach (BReach)

    Science.gov (United States)

    Van Eerdenbrugh, Katrien; Van Hoey, Stijn; Verhoest, Niko E. C.

    2016-04-01

    Evaluation and discrimination of model structures is crucial to ensure an appropriate use of hydrological models. When evaluating model results by aggregating their quality in (a subset of) individual observations, overall results of this analysis sometimes conceal important detailed information about model structural deficiencies. Analyzing model results within their local (time) context can uncover this detailed information. In this research, a methodology called Bidirectional Reach (BReach) is proposed to evaluate and analyze results of a hydrological model by assessing the maximum left and right reach in each observation point that is used for model evaluation. These maximum reaches express the capability of the model to describe a subset of the evaluation data both in the direction of the previous (left) and of the following data (right). This capability is evaluated on two levels. First, on the level of individual observations, the combination of a parameter set and an observation is classified as non-acceptable if the deviation between the accompanying model result and the measurement exceeds observational uncertainty. Second, the behavior in a sequence of observations is evaluated by means of a tolerance degree. This tolerance degree expresses the condition for satisfactory model behavior in a data series and is defined by the percentage of observations within this series that can have non-acceptable model results. Based on both criteria, the maximum left and right reaches of a model in an observation represent the data points in the direction of the previous respectively the following observations beyond which none of the sampled parameter sets both are satisfactory and result in an acceptable deviation. After assessing these reaches for a variety of tolerance degrees, results can be plotted in a combined BReach plot that show temporal changes in the behavior of model results. 
The methodology is applied to a Probability Distributed Model (PDM) of the river
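The reach computation described above can be sketched for a single parameter set: from a chosen observation, the window is grown left and right for as long as the fraction of non-acceptable points stays within the tolerance degree. This is our simplified illustration of the idea; the actual BReach method aggregates these reaches over many sampled parameter sets:

```python
# Minimal single-parameter-set sketch of the BReach reach computation.
# acceptable[i] is True when the model result at observation i deviates from
# the measurement by less than the observational uncertainty.

def reaches(acceptable, i, tolerance):
    """Return (left, right) indices of the maximal reach around observation i."""
    n = len(acceptable)
    left = i
    for j in range(i - 1, -1, -1):          # grow the window leftwards
        window = acceptable[j:i + 1]
        if window.count(False) / len(window) > tolerance:
            break
        left = j
    right = i
    for j in range(i + 1, n):               # grow the window rightwards
        window = acceptable[i:j + 1]
        if window.count(False) / len(window) > tolerance:
            break
        right = j
    return left, right

acc = [True, True, False, True, True, True, False, False, True]
print(reaches(acc, 4, tolerance=0.25))  # (3, 5): a strict tolerance stops early
```

Repeating this for a range of tolerance degrees, and taking the extremes over all satisfactory parameter sets, yields the combined BReach plot described in the abstract.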

  17. An Evaluation of the High-Probability Instruction Sequence with and without Programmed Reinforcement for Compliance with High-Probability Instructions

    Science.gov (United States)

    Zuluaga, Carlos A.; Normand, Matthew P.

    2008-01-01

    We assessed the effects of reinforcement and no reinforcement for compliance to high-probability (high-p) instructions on compliance to low-probability (low-p) instructions using a reversal design. For both participants, compliance with the low-p instruction increased only when compliance with high-p instructions was followed by reinforcement.…

  18. The extreme risk of personal data breaches and the erosion of privacy

    Science.gov (United States)

    Wheatley, Spencer; Maillart, Thomas; Sornette, Didier

    2016-01-01

Personal data breaches from organisations, enabling mass identity fraud, constitute an extreme risk. This risk worsens daily as an ever-growing amount of personal data are stored by organisations and on-line, and the attack surface surrounding this data becomes larger and harder to secure. Further, breached information is distributed and accumulates in the hands of cyber criminals, thus driving a cumulative erosion of privacy. Statistical modeling of breach data from 2000 through 2015 provides insights into this risk: a current maximum breach size of about 200 million is detected, and is expected to grow by fifty percent over the next five years. The breach sizes are found to be well modeled by an extremely heavy-tailed truncated Pareto distribution, with tail exponent parameter decreasing linearly from 0.57 in 2007 to 0.37 in 2015. With this current model, given that a breach contains more than fifty thousand items, there is a ten percent probability of exceeding ten million. A size effect is unearthed where both the frequency and severity of breaches scale with organisation size like s^0.6. Projections indicate that the total amount of breached information is expected to double from two to four billion items within the next five years, eclipsing the population of users of the Internet. This massive and uncontrolled dissemination of personal identities raises fundamental concerns about privacy.
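The quoted tail exponent supports a back-of-envelope check. The sketch below (our own illustration, not the paper's code) uses a plain, untruncated Pareto tail, for which the conditional exceedance probability has a closed form; the study's truncated fit caps the largest breaches and therefore gives the somewhat smaller ~10% figure:

```python
# Conditional exceedance under a plain Pareto tail:
# P(X > x | X > x0) = (x0 / x) ** alpha, with alpha = 0.37 from the abstract.
# The truncated Pareto used in the study shrinks this toward the reported 10%.

def pareto_conditional_exceedance(x, x0, alpha):
    """P(breach size > x), given the breach already exceeds x0."""
    return (x0 / x) ** alpha

p = pareto_conditional_exceedance(x=10_000_000, x0=50_000, alpha=0.37)
print(round(p, 2))  # 0.14: same order of magnitude as the truncated-fit ~10%
```

Such heavy tails (alpha well below 1) mean the expected breach size is dominated by the largest events, which is why the truncation point matters so much for projections.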

  19. A cooled avalanche photodiode with high photon detection probability

    Science.gov (United States)

    Robinson, D. L.; Metscher, B. D.

    1986-01-01

    An avalanche photodiode has been operated as a photon-counting detector with 2 to 3 times the sensitivity of currently-available photomultiplier tubes. APD (avalanche photodiodes) detection probabilities that exceed 27% and approach 50% have been measured at an optimum operating temperature which minimizes noise. The sources of noise and their dependence on operating temperature and bias voltage are discussed.

  20. Experiences of using UAVs for monitoring levee breaches

    Science.gov (United States)

    Brauneck, J.; Pohl, R.; Juepner, R.

    2016-11-01

During floods, technical protection facilities are subjected to high loads and can fail, as several examples have shown in the past. During the major 2002 and 2013 floods in the catchment area of the Elbe River (Germany), breaching levees caused large inundations in the hinterland. In such situations the emergency forces need comprehensive and reliable real-time information about the situation, especially the breach enlargement and discharge, the spatial and temporal development of the inundation, and the damages. Following impressive recent progress, unmanned aerial vehicles (UAVs), also called remotely piloted aircraft systems (RPAS), are highly capable of quickly collecting and transmitting precise information from inaccessible areas to the task force. Using the example of the Breitenhagen levee failure near the Saale-Elbe junction in Germany in June 2013, the processing steps needed to go from visual UAV-flight information to a hydronumeric model are explained. Modelling of the breach was implemented using photogrammetric ranging methods such as structure from motion and dense image matching. These methods utilize conventional digital multiple-view images or videos recorded by either a moving aerial platform or terrestrial photography and allow the construction of 3D point clouds, digital surface models and orthophotos. At Breitenhagen, a UAV recorded the beginning of the levee failure. Due to the dynamic character of the breach and the moving aerial platform, four different surface models show valid data with extrapolated breach widths of 9 to 40 meters. From these calculations the flow rate through the breach was determined. In addition, the procedure was tested in a physical model, whose results are also presented.

  1. Probability based high temperature engineering creep and structural fire resistance

    CERN Document Server

    Razdolsky, Leo

    2017-01-01

This volume on structural fire resistance is for aerospace, structural, and fire prevention engineers, architects, and educators. It bridges the gap between prescriptive- and performance-based methods and simplifies very complex and comprehensive computer analyses to the point that the structural fire resistance and high-temperature creep deformations have a simple, approximate analytical expression that can be used in structural analysis and design. The book emphasizes methods of the theory of engineering creep (stress-strain diagrams) and mathematical operations quite distinct from those of solid mechanics absent high-temperature creep deformations, in particular the classical theory of elasticity and structural engineering. Dr. Razdolsky’s previous books focused on methods of computing the ultimate structural design load for different fire scenarios. The current work is devoted to computing the estimated ultimate resistance of the structure taking into account the effect of high temperatur...

  2. The probability of occurrence of high-loss windstorms

    Science.gov (United States)

    Massey, Neil

    2016-04-01

Windstorms are one of the largest meteorological risks to life and property in Europe. High-loss windstorms, in terms of insured losses, are a result of not only the windspeed of the storm but also the position and track of the storm. The two highest-loss storms on record, Daria (1990) and Lothar (1999), caused so much damage because they tracked across highly populated areas of Europe. Although the frequency and intensity of high-loss windstorms in the observed record is known, there are not enough samples, due to the short observed record, to truly know the distribution of the frequency and intensity of windstorms over Europe and, by extension, the distribution of losses which could occur if the atmosphere had been in a different state due to its internal variability. Risk and loss modelling exercises carried out by and for the reinsurance industry have typically stochastically perturbed the historical record of high-loss windstorms to produce distributions of potential windstorms with greater sample sizes than the observations. This poster presents a new method of generating many samples of potential windstorms and analyses the frequency of occurrence, intensity and potential losses of these windstorms. The large-ensemble regional climate modelling project weather@home is used to generate many regional climate model representations (800 per year) of the weather over Europe between 1985 and 2010. The regional climate model is driven at the boundaries by a free-running global climate model, and so each ensemble member represents a potential state of the atmosphere, rather than an observed state. The winter storm season of October to March is analysed by applying an objective cyclone identification and tracking algorithm to each ensemble member. From the resulting tracks, the windspeed within a 1000 km radius of the cyclone centre is extracted and the maximum windspeed over a 72-hour period is derived as the storm windspeed footprint. This

  3. What caused the breach? An examination of use of information technology and health data breaches.

    Science.gov (United States)

    Wikina, Suanu Bliss

    2014-01-01

Data breaches arising from theft, loss, unauthorized access/disclosure, improper disclosure, or hacking incidents involving personal health information continue to increase every year. As of September 2013, reported breaches affecting individuals had reached close to 27 million since 2009, when compilation of records on breaches began. These breaches, which involved 674 covered entities and 153 business associates, involved computer systems and networks, desktop computers, laptops, paper, e-mail, electronic health records, and removable/portable devices (CDs, USBs, x-ray films, backup tapes, etc.). Even with the increased use of health information technology by health institutions and allied businesses, theft and loss (not hacking) constitute the major types of data breaches encountered. Removable/portable devices, desktop computers, and laptops were the top sources or locations of the breached information, while the top six states in terms of the number of reported breaches (Virginia, Illinois, California, Florida, New York, and Tennessee) accounted for nearly 75 percent of the total individual breaches, 33 percent of breaches in covered entities, and about 30 percent of the total breaches involving business associates.

  4. Data security breaches and privacy in Europe

    CERN Document Server

    Wong, Rebecca

    2013-01-01

Data Security Breaches and Privacy in Europe aims to consider data protection and cybersecurity issues; more specifically, it aims to provide a fruitful discussion on data security breaches. A detailed analysis of the European data protection framework is provided. In particular, the Data Protection Directive 95/46/EC, the Directive on Privacy and Electronic Communications, and the proposed changes under the Data Protection Regulation (data breach notifications) and their implications are considered. This is followed by an examination of the Directive on Attacks against information systems a

  5. Efficient Estimation of first Passage Probability of high-Dimensional Nonlinear Systems

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Nielsen, Søren R.K.; Bucher, Christian

    2011-01-01

An efficient method for estimating low first passage probabilities of high-dimensional nonlinear systems based on asymptotic estimation of low probabilities is presented. The method does not require any a priori knowledge of the system, i.e. it is a black-box method, and has very low requirements […], the failure probabilities of three well-known nonlinear systems are estimated. Next, a reduced degree-of-freedom model of a wind turbine is developed and is exposed to a turbulent wind field. The model incorporates very high dimensions and strong nonlinearities simultaneously. The failure probability
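For context, the quantity being estimated can be obtained, at much greater cost, by crude Monte Carlo: simulate the stochastic system many times and count trajectories that cross the failure barrier within the time horizon. The sketch below does this for a noise-driven Duffing oscillator; all parameters are illustrative, and the paper's efficient black-box estimator is not shown:

```python
import random, math

# Crude Monte Carlo first-passage estimate for a Duffing oscillator under
# white noise (Euler-Maruyama). Failure = |displacement| exceeding a barrier
# within [0, T]. Illustrative baseline only; efficient asymptotic methods
# target the low-probability regime where this approach becomes infeasible.

random.seed(0)

def first_passage(barrier=2.5, T=10.0, dt=0.01, trials=2000):
    hits = 0
    for _ in range(trials):
        x, v = 0.0, 0.0
        steps = int(T / dt)
        for _ in range(steps):
            # dx = v dt ;  dv = (-0.2 v - x - 0.5 x^3) dt + sqrt(dt) dW
            a = -0.2 * v - x - 0.5 * x ** 3
            v += a * dt + math.sqrt(dt) * random.gauss(0, 1)
            x += v * dt
            if abs(x) > barrier:
                hits += 1
                break
    return hits / trials

p = first_passage()
print(p)  # strictly between 0 and 1; magnitude depends on barrier and noise
```

Crude Monte Carlo needs on the order of 100/p samples for a usable estimate, which is exactly why asymptotic low-probability estimators matter for rare failures.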

  6. Recent research on the high-probability instructional sequence: A brief review.

    Science.gov (United States)

    Lipschultz, Joshua; Wilder, David A

    2017-04-01

    The high-probability (high-p) instructional sequence consists of the delivery of a series of high-probability instructions immediately before delivery of a low-probability or target instruction. It is commonly used to increase compliance in a variety of populations. Recent research has described variations of the high-p instructional sequence and examined the conditions under which the sequence is most effective. This manuscript reviews the most recent research on the sequence and identifies directions for future research. Recommendations for practitioners regarding the use of the high-p instructional sequence are also provided. © 2017 Society for the Experimental Analysis of Behavior.

  7. Corporate governance: remedying and ratifying directors' breaches

    OpenAIRE

    Worthington, Sarah

    2000-01-01

The extent to which a company may relax the scope and content of directors' duties, whether it can exonerate directors who default on their duties, and whether it can ratify the actions of defaulting directors and determine the remedy for breach of duty.

  8. How to Survive a Data Breach

    CERN Document Server

    Mitchell, Stewart

    2009-01-01

This is the downloadable version of this new pocket guide, which provides essential support for organisations that would like to have a tried and tested procedure in place for dealing with data breaches.

  9. High-Probability Neurotransmitter Release Sites Represent an Energy-Efficient Design.

    Science.gov (United States)

    Lu, Zhongmin; Chouhan, Amit K; Borycz, Jolanta A; Lu, Zhiyuan; Rossano, Adam J; Brain, Keith L; Zhou, You; Meinertzhagen, Ian A; Macleod, Gregory T

    2016-10-10

    Nerve terminals contain multiple sites specialized for the release of neurotransmitters. Release usually occurs with low probability, a design thought to confer many advantages. High-probability release sites are not uncommon, but their advantages are not well understood. Here, we test the hypothesis that high-probability release sites represent an energy-efficient design. We examined release site probabilities and energy efficiency at the terminals of two glutamatergic motor neurons synapsing on the same muscle fiber in Drosophila larvae. Through electrophysiological and ultrastructural measurements, we calculated release site probabilities to differ considerably between terminals (0.33 versus 0.11). We estimated the energy required to release and recycle glutamate from the same measurements. The energy required to remove calcium and sodium ions subsequent to nerve excitation was estimated through microfluorimetric and morphological measurements. We calculated energy efficiency as the number of glutamate molecules released per ATP molecule hydrolyzed, and high-probability release site terminals were found to be more efficient (0.13 versus 0.06). Our analytical model indicates that energy efficiency is optimal (∼0.15) at high release site probabilities (∼0.76). As limitations in energy supply constrain neural function, high-probability release sites might ameliorate such constraints by demanding less energy. Energy efficiency can be viewed as one aspect of nerve terminal function, in balance with others, because high-efficiency terminals depress significantly during episodic bursts of activity.

  10. On the Importance of Default Breach Remedies

    OpenAIRE

    Sloof, Randolph; Oosterbeek, Hessel; Sonnemans, Joep

    2006-01-01

    Theory predicts that default breach remedies are immaterial whenever contracting costs are negligible. Some experimental studies, however, suggest that in practice default rules do matter, as they may affect parties' preferences over contract terms. This paper presents results from an experiment designed to address the importance of default breach remedies for actual contract outcomes. We find that default rules do have an influence. The reason for this is not that contract proposals and/or r...

  11. SWOT analysis of breach models for common dike failure mechanisms

    NARCIS (Netherlands)

    Peeters, P.; Van Hoestenberghe, T.; Vincke, L.; Visser, P.J.

    2011-01-01

    The use of breach models includes two tasks: predicting breach characteristics and estimating flow through the breach. Strengths and weaknesses as well as opportunities and threats of different simplified and detailed physically-based breach models are listed following theoretical and practical crit

  12. 16 CFR 318.3 - Breach notification requirement.

    Science.gov (United States)

    2010-01-01

    ... 16 Commercial Practices 1 2010-01-01 2010-01-01 false Breach notification requirement. 318.3... HEALTH BREACH NOTIFICATION RULE § 318.3 Breach notification requirement. (a) In general. In accordance... breach of security of unsecured PHR identifiable health information that is in a personal health...


  14. A Pilot Study of Naturally Occurring High-Probability Request Sequences in Hostage Negotiations

    Science.gov (United States)

    Hughes, James

    2009-01-01

    In the current study, the audiotapes from three hostage-taking situations were analyzed. Hostage negotiator requests to the hostage taker were characterized as either high or low probability. The results suggested that hostage-taker compliance to a hostage negotiator's low-probability request was more likely when a series of complied-with…

  15. High But Not Low Probability of Gain Elicits a Positive Feeling Leading to the Framing Effect.

    Science.gov (United States)

    Gosling, Corentin J; Moutier, Sylvain

    2017-01-01

    Human risky decision-making is known to be highly susceptible to profit-motivated responses elicited by the way in which options are framed. In fact, studies investigating the framing effect have shown that the choice between sure and risky options depends on how these options are presented. Interestingly, the probability of gain of the risky option has been highlighted as one of the main factors causing variations in susceptibility to the framing effect. However, while it has been shown that high probabilities of gain of the risky option systematically lead to framing bias, questions remain about the influence of low probabilities of gain. Therefore, the first aim of this paper was to clarify the respective roles of high and low probabilities of gain in the framing effect. Due to the difference between studies using a within- or between-subjects design, we conducted a first study investigating the respective roles of these designs. For both designs, we showed that trials with a high probability of gain led to the framing effect whereas those with a low probability did not. Second, as emotions are known to play a key role in the framing effect, we sought to determine whether they are responsible for such a debiasing effect of the low probability of gain. Our second study thus investigated the relationship between emotion and the framing effect depending on high and low probabilities. Our results revealed that positive emotion was related to risk-seeking in the loss frame, but only for trials with a high probability of gain. Taken together, these results support the interpretation that low probabilities of gain suppress the framing effect because they prevent the positive emotion of gain anticipation.


  17. Numerical modelling of Glacial Lake Outburst Floods using physically based dam-breach models

    Science.gov (United States)

    Westoby, M. J.; Brasington, J.; Glasser, N. F.; Hambrey, M. J.; Reynolds, J. M.; Hassan, M. A. A. M.

    2014-06-01

    The rapid development and instability of moraine-dammed proglacial lakes is increasing the potential for the occurrence of catastrophic Glacial Lake Outburst Floods (GLOFs) in high-mountain regions. Advanced, physically-based numerical dam-breach models represent an improvement over existing methods for the derivation of breach outflow hydrographs. However, significant uncertainty surrounds the initial parameterisation of such models, and remains largely unexplored. We use a unique combination of numerical dam-breach and two-dimensional hydrodynamic modelling, employed with a Generalised Likelihood Uncertainty Estimation (GLUE) framework to quantify the degree of equifinality in dam-breach model output for the reconstruction of the failure of Dig Tsho, Nepal. Monte Carlo analysis was used to sample the model parameter space, and morphological descriptors of the moraine breach were used to evaluate model performance. Equifinal breach morphologies were produced by parameter ensembles associated with differing breach initiation mechanisms, including overtopping waves and mechanical failure of the dam face. The material roughness coefficient was discovered to exert a dominant influence over model performance. Percentile breach hydrographs derived from cumulative distribution function hydrograph data under- or overestimated total hydrograph volume and were deemed to be inappropriate for input to hydrodynamic modelling. Our results support the use of a Total Variation Diminishing solver for outburst flood modelling, which was found to be largely free of numerical instability and flow oscillation. Routing of scenario-specific optimal breach hydrographs revealed prominent differences in the timing and extent of inundation. A GLUE-based method for constructing likelihood-weighted maps of GLOF inundation extent, flow depth, and hazard is presented, and represents an effective tool for communicating uncertainty and equifinality in GLOF hazard assessment. However, future
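The GLUE step used here to handle equifinality has a simple skeleton: sample parameter sets, score each against observations, discard non-behavioural sets, and weight predictions by rescaled likelihood. The sketch below is that skeleton with a synthetic stand-in model and data, not the dam-breach model of the study:

```python
import random

# Sketch of Generalised Likelihood Uncertainty Estimation (GLUE): Monte Carlo
# parameter sampling, an informal likelihood score, a behavioural threshold,
# and likelihood-weighted prediction. Model, observation, and threshold are
# synthetic placeholders.

random.seed(42)

def model(param, x):
    return param * x  # stand-in for the physically based dam-breach model

obs_x, obs_y = 2.0, 6.0  # one synthetic "observed" breach descriptor

samples = [random.uniform(0.0, 6.0) for _ in range(5000)]
scored = []
for p in samples:
    err = abs(model(p, obs_x) - obs_y)
    likelihood = max(0.0, 1.0 - err / 4.0)  # zero outside the behavioural range
    if likelihood > 0.0:                    # keep behavioural sets only
        scored.append((p, likelihood))

total = sum(w for _, w in scored)
weights = [(p, w / total) for p, w in scored]      # GLUE-rescaled weights
estimate = sum(p * w for p, w in weights)          # likelihood-weighted value
print(round(estimate, 1))  # close to the "true" parameter value of 3.0
```

In the study, the same weights are propagated through the hydrodynamic model to produce likelihood-weighted maps of inundation extent, depth and hazard.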

  18. Outage Probability of General Ad Hoc Networks in the High-Reliability Regime

    CERN Document Server

    Giacomelli, Riccardo; Haenggi, Martin

    2010-01-01

    Outage probabilities in wireless networks depend on various factors: the node distribution, the MAC scheme, and the models for path loss, fading and transmission success. In prior work on outage characterization for networks with randomly placed nodes, most of the emphasis was put on networks whose nodes are Poisson distributed and where ALOHA is used as the MAC protocol. In this paper we provide a general framework for the analysis of outage probabilities in the high-reliability regime. The outage probability characterization is based on two parameters: the intrinsic spatial contention $\\gamma$ of the network, introduced in [1], and the coordination level achieved by the MAC as measured by the interference scaling exponent $\\kappa$ introduced in this paper. We study outage probabilities under the signal-to-interference ratio (SIR) model, Rayleigh fading, and power-law path loss, and explain how the two parameters depend on the network model. The main result is that the outage probability approaches $\\gamma\\e...
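
The high-reliability regime described above (outage probability scaling roughly linearly with interferer density in a Poisson network with Rayleigh fading and power-law path loss) can be checked with a small Monte Carlo sketch; the geometry, densities and SIR threshold are assumptions for illustration, not the paper's framework:

```python
import math, random

random.seed(1)

def poisson(lam):
    """Poisson sample via Knuth's method (fine for small lam)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def outage_prob(density, theta=1.0, alpha=4.0, link_dist=1.0,
                radius=15.0, trials=20000):
    """Outage probability of one link in a Poisson field of interferers
    with Rayleigh fading and power-law path loss (SIR threshold theta)."""
    area = math.pi * radius ** 2
    outages = 0
    for _ in range(trials):
        # Desired signal: unit Tx power, Rayleigh fading -> Exp(1) power.
        s = random.expovariate(1.0) * link_dist ** (-alpha)
        interference = 0.0
        for _ in range(poisson(density * area)):
            r = radius * math.sqrt(random.random())   # uniform in a disk
            interference += random.expovariate(1.0) * max(r, 0.01) ** (-alpha)
        if interference > 0.0 and s / interference < theta:
            outages += 1
    return outages / trials

# In the sparse (high-reliability) regime, doubling the density should
# roughly double the outage probability.
p1 = outage_prob(0.005)
p2 = outage_prob(0.01)
print(f"p_out(0.005) = {p1:.4f}, p_out(0.01) = {p2:.4f}, ratio {p2 / p1:.2f}")
```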

  19. Antecedents of Psychological Contract Breach: The Role of Job Demands, Job Resources, and Affect.

    Directory of Open Access Journals (Sweden)

    Tim Vantilborgh

    Full Text Available While it has been shown that psychological contract breach leads to detrimental outcomes, relatively little is known about factors leading to perceptions of breach. We examine if job demands and resources predict breach perceptions. We argue that perceiving high demands elicits negative affect, while perceiving high resources stimulates positive affect. Positive and negative affect, in turn, influence the likelihood that psychological contract breaches are perceived. We conducted two experience sampling studies to test our hypotheses: the first using daily surveys in a sample of volunteers, the second using weekly surveys in samples of volunteers and paid employees. Our results confirm that job demands and resources are associated with negative and positive affect respectively. Mediation analyses revealed that people who experienced high job resources were less likely to report psychological contract breach, because they experienced high levels of positive affect. The mediating role of negative affect was more complex, as it increased the likelihood to perceive psychological contract breach, but only in the short-term.

  20. Antecedents of Psychological Contract Breach: The Role of Job Demands, Job Resources, and Affect.

    Science.gov (United States)

    Vantilborgh, Tim; Bidee, Jemima; Pepermans, Roland; Griep, Yannick; Hofmans, Joeri

    2016-01-01

    While it has been shown that psychological contract breach leads to detrimental outcomes, relatively little is known about factors leading to perceptions of breach. We examine if job demands and resources predict breach perceptions. We argue that perceiving high demands elicits negative affect, while perceiving high resources stimulates positive affect. Positive and negative affect, in turn, influence the likelihood that psychological contract breaches are perceived. We conducted two experience sampling studies to test our hypotheses: the first using daily surveys in a sample of volunteers, the second using weekly surveys in samples of volunteers and paid employees. Our results confirm that job demands and resources are associated with negative and positive affect respectively. Mediation analyses revealed that people who experienced high job resources were less likely to report psychological contract breach, because they experienced high levels of positive affect. The mediating role of negative affect was more complex, as it increased the likelihood to perceive psychological contract breach, but only in the short-term.

  1. Further evaluation of the high-probability instructional sequence with and without programmed reinforcement.

    Science.gov (United States)

    Wilder, David A; Majdalany, Lina; Sturkie, Latasha; Smeltz, Lindsay

    2015-09-01

    In 2 experiments, we examined the effects of programmed reinforcement for compliance with high-probability (high-p) instructions on compliance with low-probability (low-p) instructions. In Experiment 1, we compared the high-p sequence with and without programmed reinforcement (i.e., edible items) for compliance with high-p instructions. Results showed that the high-p sequence increased compliance with low-p instructions only when compliance with high-p instructions was followed by reinforcement. In Experiment 2, we examined the role of reinforcer quality by delivering a lower quality reinforcer (praise) for compliance with high-p instructions. Results of Experiment 2 showed that the high-p sequence with lower quality reinforcement did not improve compliance with low-p instructions; the addition of a higher quality reinforcer (i.e., edible items) contingent on compliance with high-p instructions did increase compliance with low-p instructions.

  2. Controls on the breach geometry and flood hydrograph during overtopping of non-cohesive earthen dams

    Science.gov (United States)

    Walder, Joseph S.; Iverson, Richard M.; Godt, Jonathan W.; Logan, Matthew; Solovitz, Stephen A.

    2015-01-01

    Overtopping failure of non-cohesive earthen dams was investigated in 13 large-scale experiments with dams built of compacted, damp, fine-grained sand. Breaching was initiated by cutting a notch across the dam crest and allowing water escaping from a finite upstream reservoir to form its own channel. The channel developed a stepped profile, and upstream migration of the steps, which coalesced into a headcut, led to the establishment of hydraulic control (critical flow) at the channel head, or breach crest, an arcuate erosional feature that functions hydraulically as a weir. Novel photogrammetric methods, along with underwater videography, revealed that the retreating headcut maintained a slope near the angle of friction of the sand, while the cross section at the breach crest maintained a geometrically similar shape through time. That cross-sectional shape was nearly unaffected by slope failures, contrary to the assumption in many models of dam breaching. Flood hydrographs were quite reproducible--for sets of dams ranging in height from 0.55 m to 0.98 m--when the time datum was chosen as the time that the migrating headcut intersected the breach crest. Peak discharge increased almost linearly as a function of initial dam height. Early-time variability between flood hydrographs for nominally identical dams is probably a reflection of subtle experiment-to-experiment differences in groundwater hydrology and the interaction between surface water and groundwater.

  3. On the importance of default breach remedies

    NARCIS (Netherlands)

    Sloof, R.; Oosterbeek, H.; Sonnemans, J.

    2007-01-01

    Theory predicts that default breach remedies are immaterial whenever contracting costs are negligible. Some experimental studies, however, suggest that in practice default rules do matter, as they may affect parties' preferences over contract terms. This paper presents results from an experiment.

  4. 7 CFR 3431.21 - Breach.

    Science.gov (United States)

    2010-01-01

    ... Regulations of the Department of Agriculture (Continued) COOPERATIVE STATE RESEARCH, EDUCATION, AND EXTENSION SERVICE, DEPARTMENT OF AGRICULTURE VETERINARY MEDICINE LOAN REPAYMENT PROGRAM Administration of the Veterinary Medicine Loan Repayment Program § 3431.21 Breach. (a) General. If a program participant fails to...

  6. Peculiarities of high-overtone transition probabilities in carbon monoxide revealed by high-precision calculation

    Energy Technology Data Exchange (ETDEWEB)

    Medvedev, Emile S., E-mail: esmedved@orc.ru [The Institute of Problems of Chemical Physics, Russian Academy of Sciences, Prospect Akademika Semenova 1, 142432 Chernogolovka (Russian Federation); Meshkov, Vladimir V.; Stolyarov, Andrey V. [Department of Chemistry, Lomonosov Moscow State University, Leninskie gory 1/3, 119991 Moscow (Russian Federation); Gordon, Iouli E. [Atomic and Molecular Physics Division, Harvard-Smithsonian Center for Astrophysics, 60 Garden St, Cambridge, Massachusetts 02138 (United States)

    2015-10-21

    In the recent work devoted to the calculation of the rovibrational line list of the CO molecule [G. Li et al., Astrophys. J., Suppl. Ser. 216, 15 (2015)], rigorous validation of the calculated parameters including intensities was carried out. In particular, the Normal Intensity Distribution Law (NIDL) [E. S. Medvedev, J. Chem. Phys. 137, 174307 (2012)] was employed for the validation purposes, and it was found that, in the original CO line list calculated for large changes of the vibrational quantum number up to Δn = 41, intensities with Δn > 11 were unphysical. Therefore, very high overtone transitions were removed from the published list in Li et al. Here, we show how this type of validation is carried out and prove that the quadruple precision is indispensably required to predict the reliable intensities using the conventional 32-bit computers. Based on these calculations, the NIDL is shown to hold up for the 0 → n transitions till the dissociation limit around n = 83, covering 45 orders of magnitude in the intensity. The low-intensity 0 → n transition predicted in the work of Medvedev [Determination of a new molecular constant for diatomic systems. Normal intensity distribution law for overtone spectra of diatomic and polyatomic molecules and anomalies in overtone absorption spectra of diatomic molecules, Institute of Chemical Physics, Russian Academy of Sciences, Chernogolovka, 1984] at n = 5 is confirmed, and two additional “abnormal” intensities are found at n = 14 and 23. Criteria for the appearance of such “anomalies” are formulated. The results could be useful to revise the high-overtone molecular transition probabilities provided in spectroscopic databases.
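
The precision issue the authors describe can be illustrated with a much simpler cancelling sum: evaluating exp(-x) by its alternating Taylor series, whose terms span many orders of magnitude just as the overtone transition-moment integrands do. This is an analogy, not the actual CO calculation; it shows why double precision fails when the result is dozens of orders of magnitude below the largest intermediate terms:

```python
from decimal import Decimal, getcontext
import math

def exp_neg_series(x, terms=200, use_decimal=False):
    """Evaluate exp(-x) by its alternating Taylor series, a severely
    cancelling sum analogous to high-overtone transition-moment
    integrals."""
    if use_decimal:
        getcontext().prec = 50          # roughly quadruple-precision digits
        s, t = Decimal(1), Decimal(1)
        for k in range(1, terms):
            t *= Decimal(-x) / k
            s += t
        return float(s)
    s, t = 1.0, 1.0
    for k in range(1, terms):
        t *= -x / k
        s += t
    return s

x = 40.0
exact = math.exp(-x)                    # about 4.25e-18
double_val = exp_neg_series(x)          # float64: cancellation ruins it
quad_val = exp_neg_series(x, use_decimal=True)

print(f"exact     {exact:.3e}")
print(f"float64   {double_val:.3e}")    # wrong by many orders of magnitude
print(f"50-digit  {quad_val:.3e}")
```

The intermediate terms peak near 1.5e16, so the float64 sum carries an absolute rounding error of order one, swamping the true 1e-18 answer; 50 working digits recover it easily.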

  7. Efficient Estimation of first Passage Probability of high-Dimensional Nonlinear Systems

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Nielsen, Søren R.K.; Bucher, Christian

    2011-01-01

    ... on the system memory. Consequently, high-dimensional problems can be handled, and nonlinearities in the model neither bring any difficulty in applying it nor lead to considerable reduction of its efficiency. These characteristics suggest that the method is a powerful candidate for complicated problems. First, the failure probabilities of three well-known nonlinear systems are estimated. Next, a reduced degree-of-freedom model of a wind turbine is developed and exposed to a turbulent wind field. The model incorporates very high dimensions and strong nonlinearities simultaneously. The failure probability ...

  8. 14 CFR 1274.936 - Breach of safety or security.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 5 2010-01-01 2010-01-01 false Breach of safety or security. 1274.936... AGREEMENTS WITH COMMERCIAL FIRMS Other Provisions and Special Conditions § 1274.936 Breach of safety or security. Breach of Safety or Security July 2002 Safety is the freedom from those conditions that can...

  9. 24 CFR 982.453 - Owner breach of contract.

    Science.gov (United States)

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Owner breach of contract. 982.453... Contract and Owner Responsibility § 982.453 Owner breach of contract. (a) Any of the following actions by the owner (including a principal or other interested party) is a breach of the HAP contract by...

  10. Using Analogies To Produce Long Term Conceptual Change: Overcoming High School Mathematics Students' Probability Misconceptions.

    Science.gov (United States)

    Fast, Gerald R.

    The existence of probability misconceptions at all levels has been well documented. Furthermore, these misconceptions have been shown to be widespread and highly resistant to change. Previous research has shown considerable success in overcoming misconceptions in the short term by basing the knowledge reconstruction process on problems which draw…

  11. Modelling the regional variability of the probability of high trihalomethane occurrence in municipal drinking water.

    Science.gov (United States)

    Cool, Geneviève; Lebel, Alexandre; Sadiq, Rehan; Rodriguez, Manuel J

    2015-12-01

    The regional variability of the probability of occurrence of high total trihalomethane (TTHM) levels was assessed using multilevel logistic regression models that incorporate environmental and infrastructure characteristics. The models were structured in a three-level hierarchical configuration: samples (first level), drinking water utilities (DWUs, second level) and natural regions, an ecological hierarchical division from the Quebec ecological framework of reference (third level). They considered six independent variables: precipitation, temperature, source type, season, treatment type and pH. The average probability of TTHM concentrations exceeding the targeted threshold was 18.1%. The probability was influenced by season, treatment type, precipitation and temperature. The variance at all levels was significant, showing that the probability of TTHM concentrations exceeding the threshold is most likely to be similar if located within the same DWU and within the same natural region. However, most of the variance initially attributed to natural regions was explained by treatment types and clarified by spatial aggregation on treatment types. Nevertheless, even after controlling for treatment type, there was still significant regional variability of the probability of TTHM concentrations exceeding the threshold. Regional variability was particularly important for DWUs using chlorination alone since they lack the appropriate treatment required to reduce the amount of natural organic matter (NOM) in source water prior to disinfection. Results presented herein could be of interest to authorities in identifying regions with specific needs regarding drinking water quality and for epidemiological studies identifying geographical variations in population exposure to disinfection by-products (DBPs).
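
A single-level, three-covariate logistic model is a minimal sketch of how such exceedance probabilities are computed (the paper's model is multilevel with six covariates; the coefficients here are invented for illustration only):

```python
import math

def p_exceed(temp_c, precip_mm, chlorination_only, coef=None):
    """Toy logistic model of P(TTHM > threshold). Coefficients are
    made-up placeholders, not the fitted multilevel model."""
    if coef is None:
        coef = {"intercept": -2.0, "temp": 0.08, "precip": 0.01, "chlor": 1.2}
    z = (coef["intercept"] + coef["temp"] * temp_c
         + coef["precip"] * precip_mm + coef["chlor"] * chlorination_only)
    return 1.0 / (1.0 + math.exp(-z))     # inverse-logit link

# Warm, wet conditions at a chlorination-only utility raise the probability.
p_summer = p_exceed(temp_c=22, precip_mm=90, chlorination_only=1)
p_winter = p_exceed(temp_c=4, precip_mm=30, chlorination_only=0)
print(round(p_summer, 3), round(p_winter, 3))
```

In the multilevel version, the intercept (and possibly slopes) would additionally carry random effects for the DWU and the natural region, which is what lets the model separate regional from utility-level variance.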

  12. Breaching barriers to collaboration in public spaces

    DEFF Research Database (Denmark)

    Heinemann, Trine; Mitchell, Robb

    2014-01-01

    Technology provoking disparate individuals to collaborate or share experiences in the public space faces a difficult barrier, namely the ordinary social order of urban places. We employed the notion of the breaching experiment to explore how this barrier might be overcome. We analyse responses to ... of life in public spaces. Arising from this, we argue for the importance of qualities such as availability, facilitation, perspicuous settings, and perspicuous participants to encourage and support co-located strangers to collaborate and share experiences.

  13. Mass Transfer Model for a Breached Waste Package

    Energy Technology Data Exchange (ETDEWEB)

    C. Hsu; J. McClure

    2004-07-26

    The degradation of waste packages, which are used for the disposal of spent nuclear fuel in the repository, can result in configurations that may increase the probability of criticality. A mass transfer model is developed for a breached waste package to account for the entrainment of insoluble particles. In combination with radionuclide decay, soluble advection, and colloidal transport, a complete mass balance of nuclides in the waste package becomes available. The entrainment equations are derived from dimensionless parameters such as the drag coefficient and Reynolds number, based on the assumption that insoluble particles are subjected to buoyant force, gravitational force, and drag force only. Particle size distributions are utilized to calculate entrainment concentration, along with geochemistry model abstraction to calculate soluble concentration, and colloid model abstraction to calculate colloid concentration and radionuclide sorption. Results are compared with the base-case geochemistry model, which considers only soluble advection loss.
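
The force balance underlying the entrainment criterion (buoyancy and gravity against viscous drag) can be sketched with the Stokes settling velocity of a small sphere; the particle and fluid properties below are illustrative, not the report's parameters:

```python
import math

def settling_velocity(d, rho_p=11000.0, rho_f=1000.0, mu=1.0e-3, g=9.81):
    """Stokes terminal settling velocity of a sphere, from the balance of
    gravity, buoyancy and viscous drag; valid at low particle Reynolds
    number. Values model a dense oxide particle in water (illustrative)."""
    return (rho_p - rho_f) * g * d ** 2 / (18.0 * mu)

def particle_reynolds(v, d, rho_f=1000.0, mu=1.0e-3):
    return rho_f * v * d / mu

def entrained(d, flow_velocity):
    """Particle leaves the breached package if advection beats settling."""
    return flow_velocity > settling_velocity(d)

d = 1.0e-6                                  # 1 micron particle
v_s = settling_velocity(d)
print(f"settling velocity {v_s:.2e} m/s, Re {particle_reynolds(v_s, d):.1e}")
print("entrained in 1e-5 m/s flow:", entrained(d, 1.0e-5))
print("entrained in 1e-7 m/s flow:", entrained(d, 1.0e-7))
```

The tiny Reynolds number confirms the Stokes-drag regime; for larger particles a drag coefficient correlation in Reynolds number would replace the closed-form expression.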

  14. Combat Simulation Using Breach Computer Language

    Science.gov (United States)

    1979-09-01

    Armament Division, FC5SCWSL; and Mr. John Tobak, Scientific and Engineering Application Division, Management Information Systems Directorate. The...what was then the US Army Electronic Command, Fort Monmouth. At this point, BREACH became known as BREWS, which stands for Battlefield Related...November 1976 Goldberg, S. et al, "ASARS Battle Model," Book 1, Volume 1, Executive Summary, SA Group Technical Report TR 9073, Fort

  15. AFIP-6 Breach Assessment Report

    Energy Technology Data Exchange (ETDEWEB)

    Dan Wachs; Adam Robinson; Pavel Medvedev

    2011-02-01

    Analysis of the AFIP-6 experiment is summarized in this report in order to determine the cause of gaseous fission product release observed during irradiation. During the irradiation, a series of small fission product releases were observed. In order to limit the potential for primary coolant contamination, the operating cycle was terminated and the AFIP-6 experiment was removed for examination. Both in-canal and post-irradiation examination revealed the presence of an unusually thick oxide layer and discrete surface blisters on the fuel plates. These blisters were the likely cause of fission product release. Subsequent detailed thermal hydraulic analysis of the experiment indicated that the combination of the high operating power and test vehicle configuration led to high nominal operating temperatures for the fuel plates. This elevated temperature led to accelerated surface corrosion and eventually spallation of the fuel plate cladding. The thermal insulating nature of this corrosion layer led to significantly elevated fuel meat temperatures that induced blistering. Analysis was performed to validate a corrosion rate model and criteria for onset of spallation type surface corrosion were determined. The corrosion rate model will be used to estimate the oxide thickness anticipated for experiments in the future. The margin to the spallation threshold will then be used to project the experiment performance.
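
A generic parabolic-Arrhenius oxide-growth law is one common way to express the kind of corrosion rate model described here; the functional form is a standard corrosion assumption and the coefficients below are placeholders, not the validated AFIP-6 model:

```python
import math

def oxide_thickness_um(temp_c, days, k0=1.2e9, q_over_r=8600.0):
    """Parabolic oxide growth X = sqrt(2*k*t) with an Arrhenius rate
    constant k = k0 * exp(-Q/(R*T)) in um^2/day. Placeholder constants,
    for illustration of the temperature sensitivity only."""
    T = temp_c + 273.15
    k = k0 * math.exp(-q_over_r / T)
    return math.sqrt(2.0 * k * days)

# Hotter plates grow oxide much faster, shrinking the margin to spallation.
x_cool = oxide_thickness_um(100, 50)
x_hot = oxide_thickness_um(160, 50)
print(f"{x_cool:.1f} um vs {x_hot:.1f} um after 50 days")
```

Comparing the predicted thickness against a spallation-onset criterion is how such a model would be used to project the margin for future experiments.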

  16. All-Pairs Shortest Paths in $O(n^2)$ time with high probability

    CERN Document Server

    Peres, Yuval; Sudakov, Benny; Zwick, Uri

    2011-01-01

    We present an all-pairs shortest path algorithm whose running time on a complete directed graph on $n$ vertices whose edge weights are chosen independently and uniformly at random from $[0,1]$ is $O(n^2)$, in expectation and with high probability. This resolves a long standing open problem. The algorithm is a variant of the dynamic all-pairs shortest paths algorithm of Demetrescu and Italiano. The analysis relies on a proof that the number of \\emph{locally shortest paths} in such randomly weighted graphs is $O(n^2)$, in expectation and with high probability. We also present a dynamic version of the algorithm that recomputes all shortest paths after a random edge update in $O(\\log^{2}n)$ expected time.
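
For context, the straightforward baseline on this random model (Dijkstra from every source, O(n^2 log n) on a complete graph) can be sketched as below; it also illustrates the empirical fact the analysis exploits, that shortest-path distances under Uniform[0,1] weights are far shorter than typical direct edges:

```python
import heapq, random

random.seed(3)

def random_complete_digraph(n):
    """Complete digraph on n vertices with i.i.d. Uniform[0,1] edge
    weights, the random model analysed in the paper."""
    return [[random.random() if i != j else 0.0 for j in range(n)]
            for i in range(n)]

def dijkstra(w, src):
    n = len(w)
    dist = [float("inf")] * n
    dist[src] = 0.0
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist[u]:
            continue                      # stale heap entry
        for v in range(n):
            if v != u and d + w[u][v] < dist[v]:
                dist[v] = d + w[u][v]
                heapq.heappush(pq, (dist[v], v))
    return dist

n = 100
w = random_complete_digraph(n)
# Baseline APSP; the paper's algorithm removes the log factor in
# expectation by maintaining only the (few) locally shortest paths.
apsp = [dijkstra(w, s) for s in range(n)]

pairs = [apsp[i][j] for i in range(n) for j in range(n) if i != j]
avg = sum(pairs) / len(pairs)
print(f"average shortest-path distance {avg:.3f} (direct edges average 0.5)")
```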

  17. A high detection probability method for Gm-APD photon counting laser radar

    Science.gov (United States)

    Zhang, Zi-jing; Zhao, Yuan; Zhang, Yong; Wu, Long; Su, Jian-zhong

    2013-08-01

    Since the Geiger-mode Avalanche Photodiode (GmAPD) was applied in laser radar systems, system performance has been enhanced by the ultra-high sensitivity of the GmAPD, which can respond to even a single photon. However, background noise causes the ultra-sensitive GmAPD to produce false alarms, which severely degrade the detection performance of GmAPD-based laser radar and constitute an urgent problem to be solved. To address this problem, a few-times-accumulated two-GmAPD strategy is proposed in this paper. Finally, an experimental measurement is made under the background noise of a sunny day. The results show that the few-times-accumulated two-GmAPD strategy can improve the detection probability, reduce the false alarm probability, and obtain a clear 3D image of the target.
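
The accumulated two-detector idea can be sketched as a toy model: require both GmAPDs to fire in the same frame, accumulate over several frames, and threshold the coincidence count. The firing probabilities, frame count and threshold below are illustrative assumptions, not the paper's measured values:

```python
import random

random.seed(11)

def fires(p):
    """One detector firing in the range gate with probability p."""
    return random.random() < p

def detect(p_signal, p_noise, frames=20, coincidence_thresh=6):
    """Declare a target if BOTH detectors fire together in at least
    `coincidence_thresh` of `frames` accumulated frames."""
    count = 0
    for _ in range(frames):
        p = 1 - (1 - p_signal) * (1 - p_noise)    # signal OR noise triggers
        if fires(p) and fires(p):                 # two independent GmAPDs
            count += 1
    return count >= coincidence_thresh

def rate(p_signal, p_noise, trials=4000):
    return sum(detect(p_signal, p_noise) for _ in range(trials)) / trials

p_d = rate(p_signal=0.5, p_noise=0.1)    # target present
p_fa = rate(p_signal=0.0, p_noise=0.1)   # noise only
print(f"detection {p_d:.3f}, false alarm {p_fa:.4f}")
```

Because uncorrelated noise rarely triggers both detectors in the same frame repeatedly, the coincidence-plus-accumulation scheme suppresses false alarms far more than it suppresses real detections.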

  18. HIV-1 Nef breaches placental barrier in rat model.

    Science.gov (United States)

    Singh, Poonam; Agnihotri, Saurabh Kumar; Tewari, Mahesh Chandra; Kumar, Sadan; Sachdev, Monika; Tripathi, Raj Kamal

    2012-01-01

    The vertical transmission of HIV-1 from mother to fetus is known, but the molecular mechanism regulating this transmission is not fully characterized. The fetus is highly protected by the placenta, which does not permit microbial pathogens to cross the placental barrier. In the present study, a rat model was established to observe the effect of the HIV-1 protein Nef on the placental barrier. Evans blue dye was used to assay the permeability of the placental barrier, and fourteen-day-pregnant Sprague Dawley rats were injected intravenously with 2% Evans blue dye along with various concentrations of recombinant Nef. After an hour, the animals were sacrificed and dye migration was observed through the assimilation of peripheral blood into the fetus. Interestingly, traces of recombinant Nef protein were detected in the embryo as well as in the amniotic fluid and amniotic membrane, along with the placenta and uterus. Our study indicates that recombinant HIV-1 Nef protein breaches the placental barrier and allows the migration of Evans blue dye to the growing fetus. Further, the concentration of Nef protein in blood is directly proportional to the intensity of dye migration and to the amount of Nef protein detected in the uterus, placenta, amniotic membrane, amniotic fluid and embryo. Based on this study, it can be concluded that the HIV-1 Nef protein has a direct effect on breaching of the placental barrier in the model established here. Our observations will be helpful for understanding the molecular mechanisms underlying this breach of the placental barrier by Nef in humans and may help to identify specific Nef inhibitors.

  19. Intelligent tutorial system for teaching of probability and statistics at high school in Mexico

    Directory of Open Access Journals (Sweden)

    Fernando Gudino Penaloza, Miguel Gonzalez Mendoza, Neil Hernandez Gress, Jaime Mora Vargas

    2009-12-01

    Full Text Available This paper describes the implementation of an intelligent tutoring system dedicated to teaching probability and statistics at the preparatory school (or high school) in Mexico. The system was first deployed as a desktop application and then adapted to a mobile environment for the implementation of mobile learning, or m-learning. The system complies with the idea of being adaptable to the needs of each student and is able to adapt to three different teaching models that meet the criteria of three student profiles.

  20. High-frequency cranial electrostimulation (CES) in patients with probable Alzheimer's disease.

    Science.gov (United States)

    Scherder, Erik J A; van Tol, M J; Swaab, D F

    2006-07-01

    In a previous study, low-frequency cranial electrostimulation did not improve cognition and (affective) behavior in patients with probable Alzheimer's disease. In the present study, 21 Alzheimer's disease patients, divided into an experimental (n = 11) and a control group (n = 10), were treated for 30 mins/day, 5 days/wk, for 6 wks with high-frequency cranial electrostimulation. Similar to the previous study, no improvements on cognition and (affective) behavior were found.

  1. Efficient evaluation of small failure probability in high-dimensional groundwater contaminant transport modeling via a two-stage Monte Carlo method: FAILURE PROBABILITY

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Jiangjiang [Zhejiang Provincial Key Laboratory of Agricultural Resources and Environment, Institute of Soil and Water Resources and Environmental Science, College of Environmental and Resource Sciences, Zhejiang University, Hangzhou China; Li, Weixuan [Pacific Northwest National Laboratory, Richland Washington USA; Lin, Guang [Department of Mathematics and School of Mechanical Engineering, Purdue University, West Lafayette Indiana USA; Zeng, Lingzao [Zhejiang Provincial Key Laboratory of Agricultural Resources and Environment, Institute of Soil and Water Resources and Environmental Science, College of Environmental and Resource Sciences, Zhejiang University, Hangzhou China; Wu, Laosheng [Department of Environmental Sciences, University of California, Riverside California USA

    2017-03-01

    In decision-making for groundwater management and contamination remediation, it is important to accurately evaluate the probability of the occurrence of a failure event. For small failure probability analysis, a large number of model evaluations are needed in the Monte Carlo (MC) simulation, which is impractical for CPU-demanding models. One approach to alleviate the computational cost caused by the model evaluations is to construct a computationally inexpensive surrogate model instead. However, using a surrogate approximation can cause an extra error in the failure probability analysis. Moreover, constructing accurate surrogates is challenging for high-dimensional models, i.e., models containing many uncertain input parameters. To address these issues, we propose an efficient two-stage MC approach for small failure probability analysis in high-dimensional groundwater contaminant transport modeling. In the first stage, a low-dimensional representation of the original high-dimensional model is sought with Karhunen–Loève expansion and sliced inverse regression jointly, which allows for the easy construction of a surrogate with polynomial chaos expansion. Then a surrogate-based MC simulation is implemented. In the second stage, the small number of samples that are close to the failure boundary are re-evaluated with the original model, which corrects the bias introduced by the surrogate approximation. The proposed approach is tested with a numerical case study and is shown to be 100 times faster than the traditional MC approach in achieving the same level of estimation accuracy.
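
The two-stage idea (screen all samples with a cheap surrogate, then re-evaluate only those near the failure boundary with the expensive model) can be sketched with one-dimensional stand-in models; the functions, threshold and band width below are invented for illustration and are far simpler than the paper's KLE/SIR/PCE surrogate:

```python
import math, random

random.seed(5)

def true_model(x):
    """Expensive model stand-in: failure when the response exceeds a limit."""
    return math.sin(3 * x) + 0.5 * x

def surrogate(x):
    """Cheap approximation of the true model, with bounded error < 0.05."""
    return math.sin(3 * x) + 0.5 * x + 0.05 * math.cos(7 * x)

LIMIT = 1.2                                    # failure threshold
samples = [random.uniform(0, 2) for _ in range(50000)]

# Stage 1: screen everything with the surrogate; only samples whose
# surrogate response lies near the failure boundary are ambiguous.
band = 0.1
sure_fail = [x for x in samples if surrogate(x) > LIMIT + band]
ambiguous = [x for x in samples if abs(surrogate(x) - LIMIT) <= band]

# Stage 2: re-evaluate only the ambiguous samples with the true model,
# correcting the bias introduced by the surrogate.
refined_fail = sum(1 for x in ambiguous if true_model(x) > LIMIT)
p_fail = (len(sure_fail) + refined_fail) / len(samples)

p_exact = sum(1 for x in samples if true_model(x) > LIMIT) / len(samples)
print(f"two-stage {p_fail:.4f} vs full MC {p_exact:.4f}; "
      f"only {len(ambiguous)} of {len(samples)} runs needed the true model")
```

Because the surrogate's error is smaller than the screening band, the two-stage estimate matches the full Monte Carlo estimate while evaluating the expensive model on only a small fraction of the samples.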

  2. Probability modeling of high flow extremes in Yingluoxia watershed, the upper reaches of Heihe River basin

    Science.gov (United States)

    Li, Zhanling; Li, Zhanjie; Li, Chengcheng

    2014-05-01

    Probability modeling of hydrological extremes is one of the major research areas in hydrological science. Most basins in the humid and semi-humid south and east of China have been considered in probability modeling of high flow extremes, while for the inland river basins, which occupy about 35% of the country's area, there are few such studies, partly due to limited data availability and relatively low mean annual flows. The objective of this study is to carry out probability modeling of high flow extremes in the upper reach of the Heihe River basin, the second largest inland river basin in China, using the peak over threshold (POT) method and the Generalized Pareto Distribution (GPD), in which the selection of the threshold and the inherent assumptions of the POT series are elaborated in detail. For comparison, other widely used probability distributions, including the generalized extreme value (GEV), Lognormal, Log-logistic and Gamma distributions, are employed as well. Maximum likelihood estimation is used for parameter estimation. Daily flow data at Yingluoxia station from 1978 to 2008 are used. Results show that, synthesizing the approaches of the mean excess plot, stability features of model parameters, the return level plot and the inherent independence assumption of the POT series, an optimum threshold of 340 m3/s is finally determined for high flow extremes in the Yingluoxia watershed. The resulting POT series proves to be stationary and independent based on the Mann-Kendall test, the Pettitt test and an autocorrelation test. In terms of the Kolmogorov-Smirnov test, the Anderson-Darling test and several graphical diagnostics such as quantile and cumulative density function plots, the GPD provides the best fit to high flow extremes in the study area. The estimated high flows for long return periods demonstrate that, as the return period increases, the return level estimates become more uncertain. The frequency of high flow extremes exhibits a very slight but not significant decreasing trend from 1978 to 2008.
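
A minimal POT/GPD sketch of the kind of analysis described above, using a synthetic record and a method-of-moments fit (the paper uses maximum likelihood on the real Yingluoxia series; the lognormal record below is a toy stand-in):

```python
import math, random

random.seed(9)

# Synthetic 31-year 'daily flow' record (m3/s); illustrative only.
years = 31
flows = [math.exp(random.gauss(4.0, 1.0)) for _ in range(years * 365)]

u = 340.0                                   # threshold, as selected in the paper
exc = [x - u for x in flows if x > u]       # peak-over-threshold exceedances
m = sum(exc) / len(exc)
s2 = sum((y - m) ** 2 for y in exc) / (len(exc) - 1)

# Method-of-moments GPD fit: mean = sigma/(1-xi), var = sigma^2/((1-xi)^2(1-2xi))
xi = 0.5 * (1.0 - m * m / s2)               # shape parameter
sigma = m * (1.0 - xi)                      # scale parameter

lam = len(exc) / years                      # mean exceedances per year

def return_level(T_years):
    n = lam * T_years
    if abs(xi) < 1e-9:                      # GPD degenerates to exponential
        return u + sigma * math.log(n)
    return u + sigma / xi * (n ** xi - 1.0)

for T in (10, 50, 100):
    print(f"{T:>4}-yr return level ~ {return_level(T):.0f} m3/s")
```

The widening gap between successive return levels is the growing extrapolation uncertainty the abstract refers to.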

  3. Numerical modelling of glacial lake outburst floods using physically based dam-breach models

    Science.gov (United States)

    Westoby, M. J.; Brasington, J.; Glasser, N. F.; Hambrey, M. J.; Reynolds, J. M.; Hassan, M. A. A. M.; Lowe, A.

    2015-03-01

    The instability of moraine-dammed proglacial lakes creates the potential for catastrophic glacial lake outburst floods (GLOFs) in high-mountain regions. In this research, we use a unique combination of numerical dam-breach and two-dimensional hydrodynamic modelling, employed within a generalised likelihood uncertainty estimation (GLUE) framework, to quantify predictive uncertainty in model outputs associated with a reconstruction of the Dig Tsho failure in Nepal. Monte Carlo analysis was used to sample the model parameter space, and morphological descriptors of the moraine breach were used to evaluate model performance. Multiple breach scenarios were produced by differing parameter ensembles associated with a range of breach initiation mechanisms, including overtopping waves and mechanical failure of the dam face. The material roughness coefficient was found to exert a dominant influence over model performance. The downstream routing of scenario-specific breach hydrographs revealed significant differences in the timing and extent of inundation. A GLUE-based methodology for constructing probabilistic maps of inundation extent, flow depth, and hazard is presented and provides a useful tool for communicating uncertainty in GLOF hazard assessment.

  4. Overtopping breaching of river levees constructed with cohesive sediments

    Science.gov (United States)

    Wei, Hongyan; Yu, Minghui; Wang, Dangwei; Li, Yitian

    2016-07-01

    Experiments were conducted in a bend flume to study the overtopping breaching process and the corresponding overflow rates of river levees constructed with cohesive sediments. The river and land regions were separated by the constructed levee in the bend flume. Results showed that the levee breaching process can be subdivided into a slope erosion stage, a headcut retreat stage and a breach widening stage. Mechanisms such as flow shear erosion, impinging jet erosion, side slope erosion and cantilever collapse were discovered in the breaching process. The erosion characteristics were determined by both flow and soil properties. Finally, a depth-averaged 2-D flow model was used to simulate the levee breaching flow rates, which is well expressed by the broad-crested weir flow formula. The deduced discharge coefficient was smaller than that of common broad-crested rectangular weirs because of the shape and roughness of the breach.
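
The broad-crested weir relation used to express the breach outflow, Q = Cd * (2/3)^(3/2) * sqrt(g) * b * H^(3/2), can be sketched directly; the discharge coefficient below is an illustrative value, set below that of a regular weir to mimic the reduction the experiments attribute to breach shape and roughness:

```python
import math

def breach_discharge(width_m, head_m, cd=0.85, g=9.81):
    """Broad-crested weir discharge through a rectangular breach of width
    `width_m` under upstream head `head_m`. Cd < 1 reflects the irregular
    shape and roughness of a real breach (illustrative value)."""
    return cd * (2.0 / 3.0) * math.sqrt(2.0 * g / 3.0) * width_m * head_m ** 1.5

q = breach_discharge(width_m=2.0, head_m=0.5)
print(f"breach outflow ~ {q:.2f} m3/s")
```

Coupling this relation to a widening breach (growing `width_m`) and a draining reservoir (falling `head_m`) is the usual way such experiments reconstruct the full breach hydrograph.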

  5. REDUCTION IN PROBABILITY OF TRAFFIC CONGESTION ON HIGH-CLASS ROAD USING RAMP ACCESS CONTROL

    Directory of Open Access Journals (Sweden)

    R. Yu. Lagerev

    2016-01-01

    Full Text Available Merging junctions on high-class roads are considered bottlenecks in the network, and the quality of their operation determines the probability of traffic congestion forming. Investigations of congestion in the merging zones of ramp and freeway traffic flows have demonstrated that a queuing ramp flow produces a so-called "turbulence" effect, caused by vehicles changing lanes and reducing speed on the main road. With a high queuing flow on the main road, this "turbulence" component can produce a shock wave in the main traffic flow. It has been shown that the ramp traffic flow has a greater impact on congestion probability than the main road flow. The paper establishes that some vehicles travelling along a highway simultaneously occupy two lanes in merging zones and thereby reduce the capacity of the road section; this feature must be taken into account in zones where the "turbulence" effect forms. The paper presents the main approaches, methodology, principles and stages required for access control on high-class roads, aimed at improving their operation, including road traffic safety. It proposes a methodology for evaluating and optimizing ramp control so as to minimize transport queue length at adjoining ramps and thereby reduce the probability of traffic congestion.

  6. The high order dispersion analysis based on first-passage-time probability in financial markets

    Science.gov (United States)

    Liu, Chenggong; Shang, Pengjian; Feng, Guochen

    2017-04-01

    The study of first-passage-time (FPT) events in financial time series has attracted broad research interest recently, as it can provide a reference for risk management and investment. In this paper, a new measure, high order dispersion (HOD), is developed based on FPT probability to explore financial time series. The tick-by-tick data of three Chinese stock markets and three American stock markets are investigated. We classify the financial markets successfully by analyzing the scaling properties of the FPT probabilities of the six stock markets and employing the HOD method to compare the differences between FPT decay curves. Applying the HOD method, it can be concluded that long-range correlation, a fat-tailed broad probability density function, and its coupling with nonlinearity mainly lead to the multifractality of financial time series. Furthermore, we use the fluctuation function of multifractal detrended fluctuation analysis (MF-DFA) to distinguish markets and obtain results consistent with the HOD method, whereas the HOD method is capable of differentiating stock markets effectively within the same region. We believe that such explorations are relevant for a better understanding of financial market mechanisms.
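    The basic FPT quantity that the HOD measure builds on can be sketched as follows; the random-walk data and threshold level are toy stand-ins for tick-by-tick prices, and the function name is illustrative:

```python
import numpy as np

def first_passage_times(series, level):
    """For each starting index t, the number of steps until the excursion
    |x[t+k] - x[t]| first reaches `level`; starts that never reach it
    within the record (censored starts) are dropped."""
    x = np.asarray(series, dtype=float)
    out = []
    for t in range(len(x) - 1):
        excursion = np.abs(x[t + 1:] - x[t])
        hits = np.nonzero(excursion >= level)[0]
        if hits.size:
            out.append(hits[0] + 1)
    return np.array(out)

# Toy example on a random walk; real studies use tick-by-tick prices.
rng = np.random.default_rng(42)
walk = np.cumsum(rng.standard_normal(5000))
fpt = first_passage_times(walk, level=3.0)

# Empirical FPT probability distribution, the object whose decay curve
# and scaling properties are compared across markets:
counts = np.bincount(fpt)
probs = counts / counts.sum()
```

    HOD itself then compares higher-order dispersion of such FPT distributions across markets; that step is omitted here.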

  7. Estimation of Discharge from Breached Earthfill Levee with Elapsed Time

    Science.gov (United States)

    Kim, Sooyoung; Yang, Jiro; Song, Chang Geun; Lee, Seung Oh

    2014-05-01

    Levee freeboard deficits have occurred due to abnormally peaked flood events, so the risk of overtopping of earthfill levees has increased remarkably. When overflow over a levee begins, the breach gap grows rapidly in the initial stage, and as the breach width extends, the discharge through the breached section increases nonlinearly. Moreover, if the levee runs through multiple cities, the resulting damage is difficult to predict. However, research on the breach mechanism has focused on the breach shape at the equilibrium state, and study of the development of levee breaches is not yet sufficient to predict the damage or to select countermeasures. In this study, a formula for breach discharge was developed from hydraulic experimental results. All experiments were conducted with a movable levee with a crown width of 0.3 m, a height of 0.3 m, and a landside slope of 2:1 (H:V). Breaching was induced by lateral overflow for main-channel Froude numbers from 0.15 to 0.35 in increments of 0.05. Based on dimensional analysis with significant parameters such as main channel depth, breach width, and discharge coefficient, the temporal variation of each parameter was estimated from 25 experimental cases. Finally, a formula for predicting breach flow due to overtopping failure of a levee was presented as a function of elapsed time for each Froude number, after combining all significant parameters. When the Froude number was less than 0.3, the breach discharge increased with the Froude number, while it decreased for Froude numbers exceeding 0.3; that is, the maximum breach discharge occurred at a Froude number of 0.3. This can be explained by flow diversion caused by the collision of the breach flow with the downstream side of the breached section, which reduces the discharge into the landside at Froude numbers above 0.3. As future work, when the

  8. Statistical Surrogate Models for Estimating Probability of High-Consequence Climate Change

    Science.gov (United States)

    Field, R.; Constantine, P.; Boslough, M.

    2011-12-01

    We have posed the climate change problem in a framework similar to that used in safety engineering, by acknowledging that probabilistic risk assessments focused on low-probability, high-consequence climate events are perhaps more appropriate than studies focused simply on best estimates. Properly exploring the tails of the distribution requires extensive sampling, which is not possible with existing coupled atmospheric models due to the high computational cost of each simulation. We have developed specialized statistical surrogate models (SSMs) that can be used to make predictions about the tails of the associated probability distributions. An SSM differs from a deterministic surrogate model in that it represents each climate variable of interest as a space/time random field, that is, a random variable for every fixed location in the atmosphere at all times. The SSM can be calibrated to available spatial and temporal data from existing climate databases, or to a collection of outputs from general circulation models. Because of its reduced size and complexity, realizing a large number of independent model outputs from an SSM becomes computationally straightforward, so that quantifying the risk associated with low-probability, high-consequence climate events becomes feasible. A Bayesian framework was also developed to provide quantitative measures of confidence, via Bayesian credible intervals, to assess these risks. To illustrate the use of the SSM, we considered two collections of NCAR CCSM 3.0 output data. The first collection corresponds to average December surface temperature for years 1990-1999, based on a collection of 8 different model runs obtained from the Program for Climate Model Diagnosis and Intercomparison (PCMDI). We calibrated the surrogate model to the available model data and made various point predictions. We also analyzed average precipitation rate in June, July, and August over a 54-year period assuming a cyclic Y2K ocean model.
We
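    The core computation, estimating a low-probability exceedance from many cheap surrogate samples and attaching a Bayesian credible interval, can be sketched as follows. The Gaussian samples and the threshold are stand-ins for surrogate-model output, and the Beta posterior assumes a uniform prior, which is an assumption of this sketch rather than the paper's stated choice:

```python
import numpy as np

rng = np.random.default_rng(7)

# Stand-in for surrogate-model output: many cheap realisations of a
# climate variable (e.g. a surface temperature anomaly).
samples = rng.normal(loc=0.0, scale=1.0, size=100_000)

threshold = 3.0                       # "high-consequence" exceedance level
k = int(np.sum(samples > threshold))  # number of exceedances
n = samples.size

# Point estimate of the tail probability...
p_hat = k / n

# ...and a 95% Bayesian credible interval from the Beta(k+1, n-k+1)
# posterior (uniform prior on the exceedance probability), obtained
# here by sampling the posterior directly.
post = rng.beta(k + 1, n - k + 1, size=50_000)
lo, hi = np.percentile(post, [2.5, 97.5])
print(p_hat, lo, hi)
```

    With an expensive coupled model, collecting 100,000 realisations is infeasible; the surrogate makes the sample size, and hence the width of the credible interval, a matter of cheap computation.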

  9. Reconstructing phylogenies from noisy quartets in polynomial time with a high success probability

    Directory of Open Access Journals (Sweden)

    Wu Gang

    2008-01-01

    Full Text Available Abstract Background In recent years, quartet-based phylogeny reconstruction methods have received considerable attention in the computational biology community. Traditionally, the accuracy of a phylogeny reconstruction method is measured by simulations on synthetic datasets with known "true" phylogenies, while little theoretical analysis has been done. In this paper, we present a new model-based approach to measuring the accuracy of a quartet-based phylogeny reconstruction method. Under this model, we propose three efficient algorithms to reconstruct the "true" phylogeny with a high success probability. Results The first algorithm can reconstruct the "true" phylogeny from the input quartet topology set without quartet errors in O(n^2) time by querying at most (n - 4) log(n - 1) quartet topologies, where n is the number of taxa. When the input quartet topology set contains errors, the second algorithm can reconstruct the "true" phylogeny with a probability of approximately 1 - p in O(n^4 log n) time, where p is the probability of a quartet topology being in error. This probability is improved by the third algorithm to approximately 1 / (1 + q^2 + (1/2)q^4 + (1/16)q^5), where q = p / (1 - p), with
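    The third algorithm's success probability, as reconstructed above, is simple to evaluate numerically; the function name is illustrative:

```python
def third_algorithm_success(p):
    """Approximate success probability of the third algorithm,
    1 / (1 + q^2 + q^4/2 + q^5/16) with q = p / (1 - p),
    where p is the per-quartet error probability."""
    q = p / (1.0 - p)
    return 1.0 / (1.0 + q**2 + q**4 / 2.0 + q**5 / 16.0)

# Small per-quartet error rates leave a high overall success probability:
for p in (0.01, 0.05, 0.1):
    print(p, third_algorithm_success(p))
```

    Even a 10% per-quartet error rate leaves the approximate success probability near 0.99, a substantial improvement over the second algorithm's 1 - p bound.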

  10. Cheating in OSCEs: The Impact of Simulated Security Breaches on OSCE Performance.

    Science.gov (United States)

    Gotzmann, Andrea; De Champlain, André; Homayra, Fahmida; Fotheringham, Alexa; de Vries, Ingrid; Forgie, Melissa; Pugh, Debra

    2017-01-01

    Construct: Valid score interpretation is important for constructs in performance assessments such as objective structured clinical examinations (OSCEs). An OSCE is a type of performance assessment in which a series of standardized patients interact with the student or candidate, who is scored by either the standardized patient or a physician examiner. In high-stakes examinations, test security is an important issue. Students accessing unauthorized test materials can create an unfair advantage and lead to examination scores that do not reflect students' true ability level. The purpose of this study was to assess the impact of various simulated security breaches on OSCE scores. Seventy-six 3rd-year medical students participated in an 8-station OSCE and were randomized to either a control group or to 1 of 2 experimental conditions simulating test security breaches: station topic (i.e., providing a list of station topics prior to the examination) or egregious security breach (i.e., providing detailed content information prior to the examination). Overall total scores were compared across the 3 groups using a one-way between-subjects analysis of variance, and a repeated-measures analysis of variance was used to compare the checklist, rating scale, and oral question subscores across the three conditions. Overall total scores were highest for the egregious security breach condition (81.8%), followed by the station topic condition (73.6%), and they were lowest for the control group (67.4%). This trend was also found with checklist subscores only (79.1%, 64.9%, and 60.3%, respectively, for the security breach, station topic, and control conditions). Rating scale subscores were higher for both the station topic and egregious security breach conditions compared to the control group (82.6%, 83.1%, and 77.6%, respectively).
Oral question subscores were significantly higher for the egregious security breach condition (88.8%) followed by the station topic condition (64.3%), and they were

  11. An indicator of probable semicircular canal dehiscence: ocular vestibular evoked myogenic potentials to high frequencies.

    Science.gov (United States)

    Manzari, Leonardo; Burgess, Ann M; McGarvie, Leigh A; Curthoys, Ian S

    2013-07-01

    The n10 component of the ocular vestibular evoked myogenic potential (oVEMP) to sound and vibration stimuli is a crossed response that has enhanced amplitude and decreased threshold in patients with CT-verified superior semicircular canal dehiscence (SSCD). However, demonstrating enhanced VEMP amplitude and reduced VEMP thresholds requires multiple trials and can be very time-consuming and tiring for patients, so a specific indicator of probable SCD that is fast and not tiring would be preferred. Here we report a one-trial indicator: the oVEMP n10 in response to a very high frequency stimulus (4000 Hz), delivered as either air-conducted sound (ACS) or bone-conducted vibration (BCV), is such a fast indicator of probable SCD. In 22 healthy subjects, oVEMP n10 at 4000 Hz was not detectable; however, in all 22 CT-verified SSCD patients tested, oVEMP n10 responses to 4000 Hz were clearly present for either ACS or BCV stimuli.

  12. An experimental comparison of reliance levels under alternative breach remedies

    NARCIS (Netherlands)

    Leuven, E.; Oosterbeek, H.; Sloof, R.; Sonnemans, J.H.

    2003-01-01

    Breach remedies serve an important role in protecting relationship-specific investments. Theory predicts that some common remedies protect too well and induce overinvestment because of complete insurance against potential separation, and the possibility to prevent breach by increasing the damage

  13. Just in Time Research: Data Breaches in Higher Education

    Science.gov (United States)

    Grama, Joanna

    2014-01-01

    This "Just in Time" research is in response to recent discussions on the EDUCAUSE Higher Education Information Security Council (HEISC) discussion list about data breaches in higher education. Using data from the Privacy Rights Clearinghouse, this research analyzes data breaches attributed to higher education. The results from this…

  14. Do Data Breach Disclosure Laws Reduce Identity Theft?

    Science.gov (United States)

    Romanosky, Sasha; Telang, Rahul; Acquisti, Alessandro

    2011-01-01

    In the United States, identity theft resulted in corporate and consumer losses of $56 billion dollars in 2005, with up to 35 percent of known identity thefts caused by corporate data breaches. Many states have responded by adopting data breach disclosure laws that require firms to notify consumers if their personal information has been lost or…

  16. A 'new' Cromer-related high frequency antigen probably antithetical to WES.

    Science.gov (United States)

    Daniels, G L; Green, C A; Darr, F W; Anderson, H; Sistonen, P

    1987-01-01

    An antibody to a high frequency antigen, made in a WES+ Black antenatal patient (Wash.), failed to react with the red cells of a presumed WES+ homozygote and is, therefore, probably antithetical to anti-WES. Like anti-WES, it reacted with papain, ficin, trypsin or neuraminidase treated cells but not with alpha-chymotrypsin or pronase treated cells and was specifically inhibited by concentrated serum. It also reacted more strongly in titration with WES- cells than with WES+ cells. The antibody is Cromer-related as it failed to react with Inab phenotype (IFC-) cells and reacted only weakly with Dr(a-) cells. Wash. cells and those of the other possible WES+ homozygote are Cr(a+) Tc(a+b-c-) Dr(a+) IFC+ but reacted only very weakly with anti-Esa.

  17. Simulation of dam breach development for emergency treatment of the Tangjiashan Quake Lake in China

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The Tangjiashan Quake Lake is the largest quake lake triggered by the 5.12 Wenchuan Earthquake that happened on May 12, 2008 in China, posing high risk of catastrophic flash flood hazards to downstream human life and properties. A physics-based numerical simulation approach is proposed for real-time prediction of dam breach development of the Tangjiashan Quake Lake in the case of emergency treatment. Bed erosion and lateral development of the dam breach are represented through accounting for the underlying physics, including selective sediment transport and gravitational collapse. A conceptualized breach erosion model that involves few parameters enables quick calibration based on the monitored hydrological data in emergency analysis, where full geotechnical information about the barrier dam is not available. The process of dam breach development is found to be nonlinear in cascades due to the combined effects of headcutting and bank collapse. The agreement between the simulation results and the observed data shows the applicability of the present approach for emergency analysis of quake lakes. Limitations will arise in situations where the soil composition of the barrier dam is significantly inhomogeneous. Incorporation of circular arc failure for cohesive soil and lateral seepage in bank slopes will also enhance its applicability to complex situations.

  18. Morphologic evolution of the wilderness area breach at Fire Island, New York—2012–15

    Science.gov (United States)

    Hapke, Cheryl J.; Nelson, Timothy R.; Henderson, Rachel E.; Brenner, Owen T.; Miselis, Jennifer L.

    2017-09-18

    IntroductionHurricane Sandy, which made landfall on October 29, 2012, near Atlantic City, New Jersey, had a significant impact on the coastal system along the south shore of Long Island, New York. A record significant wave height of 9.6 meters (m) was measured at wave buoy 44025, approximately 48 kilometers offshore of Fire Island, New York. Surge and runup during the storm resulted in extensive beach and dune erosion and breaching of the Fire Island barrier island system at two locations, including a breach that formed within the Otis Pike Fire Island High Dune Wilderness area on the eastern side of Fire Island.The U.S. Geological Survey (USGS) has a long history of conducting morphologic change and processes research at Fire Island. One of the primary objectives of the current research effort is to understand the morphologic evolution of the barrier system on a variety of time scales (from storm scale to decade(s) to century). A number of studies that support the project objectives have been published. Prior to Hurricane Sandy, however, little information was available on specific storm-driven change in this region. The USGS received Hurricane Sandy supplemental funding (project GS2–2B: Linking Coastal Processes and Vulnerability, Fire Island, New York, Regional Study) to enhance existing research efforts at Fire Island. The existing research was greatly expanded to include inner continental shelf mapping and investigations of processes of inner shelf sediment transport; beach and dune response and recovery; and observation, analysis, and modeling of the newly formed breach in the Otis Pike High Dune Wilderness area, herein referred to as the wilderness breach. The breach formed at the site of Old Inlet, which was open from 1763 to 1825. The location of the initial island breaching does not directly correspond with topographic lows of the dunes, but instead the breach formed in the location of a cross-island boardwalk that was destroyed during Hurricane Sandy

  19. Fast and accurate probability density estimation in large high dimensional astronomical datasets

    Science.gov (United States)

    Gupta, Pramod; Connolly, Andrew J.; Gardner, Jeffrey P.

    2015-01-01

    Astronomical surveys will generate measurements of hundreds of attributes (e.g. color, size, shape) on hundreds of millions of sources. Analyzing these large, high dimensional data sets will require efficient algorithms for data analysis. An example of this is probability density estimation, which is at the heart of many classification problems such as the separation of stars and quasars based on their colors. Popular density estimation techniques use binning or kernel density estimation. Kernel density estimation has a small memory footprint but often requires large computational resources. Binning has small computational requirements, but binning is usually implemented with multi-dimensional arrays, which leads to memory requirements that scale exponentially with the number of dimensions. Hence neither technique scales well to large data sets in high dimensions. We present an alternative approach of binning implemented with hash tables (BASH tables). This approach uses the sparseness of data in the high dimensional space to ensure that the memory requirements are small. However, hashing requires some extra computation, so a priori it is not clear whether the reduction in memory requirements will lead to increased computational requirements. Through an implementation of BASH tables in C++ we show that the additional computational requirements of hashing are negligible. Hence this approach has small memory and computational requirements. We apply our density estimation technique to photometric selection of quasars using non-parametric Bayesian classification and show that the accuracy of the classification is the same as that of earlier approaches. Since the BASH table approach is one to three orders of magnitude faster than the earlier approaches, it may be useful in various other applications of density estimation in astrostatistics.
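    The idea of binning with hash tables can be sketched in a few lines of Python (the paper's implementation is in C++; the class name and fixed bin width here are illustrative):

```python
from collections import defaultdict

class SparseHistogram:
    """Binning backed by a hash table: memory scales with the number of
    occupied bins, not with bins_per_dim ** n_dims as dense arrays do."""

    def __init__(self, bin_width):
        self.bin_width = bin_width
        self.counts = defaultdict(int)
        self.total = 0

    def _key(self, point):
        # The bin index tuple is the hash key; empty bins are never stored.
        return tuple(int(x // self.bin_width) for x in point)

    def add(self, point):
        self.counts[self._key(point)] += 1
        self.total += 1

    def density(self, point):
        """Histogram estimate of the probability density at `point`."""
        volume = self.bin_width ** len(point)
        return self.counts.get(self._key(point), 0) / (self.total * volume)

h = SparseHistogram(bin_width=0.5)
for p in [(0.1, 0.2), (0.2, 0.1), (0.7, 0.9), (0.1, 0.4)]:
    h.add(p)
print(len(h.counts))          # number of occupied bins only
print(h.density((0.0, 0.3)))  # density in the bin containing this point
```

    In high dimensions almost all bins of a dense grid are empty, so storing only occupied bins keeps the footprint proportional to the data rather than to the grid size.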

  20. A laser profilometry technique for monitoring fluvial dike breaching in laboratory experiments

    Science.gov (United States)

    Dewals, Benjamin; Rifai, Ismail; Erpicum, Sébastien; Archambeau, Pierre; Violeau, Damien; Pirotton, Michel; El kadi Abderrezzak, Kamal

    2017-04-01

    A challenging aspect for experimental modelling of fluvial dike breaching is the continuous monitoring of the transient breach geometry. In dam breaching cases induced by flow overtopping over the whole breach crest (plane erosion), a side view through a glass wall is sufficient to monitor the breach formation. This approach can be extended for 3D dam breach tests (spatial erosion) if the glass wall is located along the breach centreline. In contrast, using a side view does not apply for monitoring fluvial dike breaching, because the breach is not symmetric in this case. We present a non-intrusive, high resolution technique to record the breach development in experimental models of fluvial dikes by means of a laser profilometry (Rifai et al. 2016). Most methods used for monitoring dam and dike breaching involve the projection of a pattern (fringes, grid) on the dam or dike body and the analysis of its deformation on images recorded during the breaching (e.g., Pickert et al. 2011, Frank and Hager 2014). A major limitation of these methods stems from reflection on the water surface, particularly in the vicinity of the breach where the free surface is irregular and rippled. This issue was addressed by Spinewine et al. (2004), who used a single laser sheet so that reflections on the water surface were strongly limited and did not hamper the accurate processing of each image. We have developed a similar laser profilometry technique tailored for laboratory experiments on fluvial dike breaching. The setup is simple and relatively low cost. It consists of a digital video camera (resolution of 1920 × 1080 pixels at 60 frames per second) and a swiping red diode 30 mW laser that enables the projection of a laser sheet over the dike body. The 2D image coordinates of each deformed laser profile incident on the dike are transformed into 3D object coordinates using the Direct Linear Transformation (DLT) algorithm. All 3D object coordinates computed over a swiping cycle of the
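    The DLT step can be illustrated in its planar special case: treating the laser sheet as a world plane, a 3x3 homography maps image coordinates to coordinates in that plane, and its entries are found from four or more correspondences via SVD. The full setup described above uses a 3D DLT calibration, so this is a simplified sketch with synthetic points:

```python
import numpy as np

def dlt_homography(img_pts, world_pts):
    """Direct Linear Transformation for a plane: estimate the 3x3
    homography H mapping image pixels to laser-plane coordinates from
    >= 4 point correspondences, via the SVD null space of the DLT system."""
    rows = []
    for (u, v), (x, y) in zip(img_pts, world_pts):
        rows.append([u, v, 1, 0, 0, 0, -x * u, -x * v, -x])
        rows.append([0, 0, 0, u, v, 1, -y * u, -y * v, -y])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def apply_h(H, pt):
    """Map an image point through H and dehomogenise."""
    p = H @ np.array([pt[0], pt[1], 1.0])
    return p[:2] / p[2]

# Synthetic calibration: four image corners of a target whose coordinates
# in the laser plane are known (metres).
img = [(0, 0), (100, 0), (100, 80), (0, 80)]
world = [(0.0, 0.0), (0.5, 0.0), (0.5, 0.4), (0.0, 0.4)]
H = dlt_homography(img, world)
print(apply_h(H, (50, 40)))  # the target centre, in plane coordinates
```

    With more than four correspondences the same SVD solve gives a least-squares estimate, which is how calibration noise is usually absorbed in practice.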

  1. Modeling of Breaching Due to Overtopping Flow and Waves Based on Coupled Flow and Sediment Transport

    Directory of Open Access Journals (Sweden)

    Zhiguo He

    2015-08-01

    Full Text Available Breaching of earthen or sandy dams/dunes by overtopping flow and waves is a complicated process with strong, unsteady flow, high sediment transport, and rapid bed changes, in which the interactions between flow and morphology should not be ignored. This study presents a depth-averaged two-dimensional (2D) coupled flow and sediment transport model to investigate the flow and breaching processes with and without waves. Bed change and variable flow density are included in the flow continuity and momentum equations to consider the impacts of sediment transport. The model adopts the non-equilibrium approach for total-load sediment transport and specifies different repose angles to handle non-cohesive embankment slope avalanching. The equations are solved using an explicit finite volume method on a rectangular grid with the improved Godunov-type central upwind scheme and the nonnegative reconstruction of the water depth method to handle mixed-regime flows near the breach. The model has been tested against two sets of experimental data, which show that it simulates the flow characteristics, bed changes, and sediment transport well. It is then applied to analyze flow and morphologic changes by overtopping flow with and without waves. The simulated bed change and breach cross-section shape show a significant difference when waves are considered. Erosion by flow without waves mainly occurs at the breach and is dominated by vertical erosion at the initial stage, followed by lateral erosion. With waves, the flow overtops the entire length of the dune, causing faster erosion along the entire length. Erosion mainly takes place at the upper layer at the initial stage and gradually accelerates as the height of the dune reduces and the flow discharge increases, indicating that the simulated results with waves should be further verified by physical experimental evidence.

  2. Review Article: Lake and breach hazard assessment for moraine-dammed lakes: an example from the Cordillera Blanca (Peru)

    Directory of Open Access Journals (Sweden)

    A. Emmer

    2013-06-01

    Full Text Available Glacial lake outburst floods (GLOFs) and related debris flows represent a significant threat in high mountainous areas across the globe. It is necessary to quantify this threat so as to mitigate their catastrophic effects. Complete GLOF hazard assessment incorporates two phases: the probability of water release from a given glacial lake is estimated through lake and breach hazard assessment, while the endangered areas are identified during downstream hazard assessment. This paper outlines a number of methods of lake and breach hazard assessment, which can be grouped into three categories: qualitative, of which we outline eight; semi-quantitative, of which we outline two; and quantitative, of which we outline three. It is considered that five groups of critical parameters are essential for an accurate regionally focused hazard assessment method for moraine-dammed lakes in the Cordillera Blanca. These comprise the possibility of dynamic slope movements into the lake, the possibility of a flood wave from a lake situated upstream, the possibility of dam rupture following a large earthquake, the size of the dam freeboard (or ratio of dam freeboard), and a distinction between natural dams and those with remedial work. It is shown that none of the summarised methods uses all these criteria, with at most three of the five considered by the outlined methods. A number of these methods were used on six selected moraine-dammed lakes in the Cordillera Blanca: lakes Quitacocha, Checquiacocha, Palcacocha, Llaca, Rajucolta, and Tararhua. The results have been compared and show that each method has certain advantages and disadvantages when used in this region. These methods demonstrate that the most hazardous lake is Lake Palcacocha.
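    As a purely hypothetical illustration (the weights, the freeboard threshold, and the additive form below are assumptions of this sketch, not values from the review), the five groups of critical parameters could be combined into a semi-quantitative score like this:

```python
def lake_hazard_score(slope_movement_possible, upstream_flood_possible,
                      earthquake_rupture_possible, freeboard_ratio,
                      dam_remediated):
    """Hypothetical semi-quantitative score combining the five groups of
    critical parameters named in the review. All weights and the 0.02
    freeboard-ratio threshold are illustrative assumptions."""
    score = 0.0
    score += 0.30 if slope_movement_possible else 0.0    # dynamic slope movements into the lake
    score += 0.20 if upstream_flood_possible else 0.0    # flood wave from an upstream lake
    score += 0.20 if earthquake_rupture_possible else 0.0  # dam rupture after a large earthquake
    score += 0.20 if freeboard_ratio < 0.02 else 0.0     # small dam freeboard (ratio)
    score += 0.10 if not dam_remediated else 0.0         # natural dam, no remedial work
    return score

print(lake_hazard_score(True, False, True, 0.01, False))
```

    The review's point is precisely that existing methods use at most three of these five criteria; a scheme of this shape would cover all five.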

  3. The A to Z of healthcare data breaches.

    Science.gov (United States)

    Kobus, Theodore J

    2012-01-01

    There currently exists a myriad of privacy laws that impact a healthcare entity, including more than 47 notification laws that require notification when a data breach occurs, as well as the breach notification requirements of the Health Information Technology for Economic and Clinical Health Act. Given the plethora of issues a healthcare entity faces, there are certain principles that can be built into an organization's philosophy that will comply with the law and help protect it from reputational harm.

  4. 48 CFR 1852.223-75 - Major breach of safety or security.

    Science.gov (United States)

    2010-10-01

    ... major breach of safety may constitute a breach of contract that entitles the Government to exercise any of its rights and remedies applicable to material parts of this contract, including termination for default. A major breach of safety must be related directly to the work on the contract. A major breach...

  5. Values underlying perceptions of breach of the psychological contract

    Directory of Open Access Journals (Sweden)

    Leon Botha

    2010-03-01

    Full Text Available Orientation: This study identifies the most prominent breaches of the psychological contract and the values underlying the perceptions that violations have occurred. Research purpose: The study identifies the most important breaches and investigates which values underlie employee perceptions of breach of the psychological contract. It also addresses values that lead to employees interpreting incidents as breaches. Motivation for the study: The study draws on the fact that employees make inconsequential contributions to the terms of many formal employment contracts, which may imply that such contracts cannot be viewed as documents between equals. Research design, approach and method: The study identified the most prominent breaches of the psychological contract and the values underlying the perceptions that violations had occurred. Main findings: The data revealed lack of promotion, poor interpersonal relations between colleagues and bad treatment by seniors as the three main breaches of the contract, and social recognition, a world of peace and a sense of accomplishment as the three dominant values that underlie perceptions of contract violation. Practical/managerial implications: The competent and intelligent manner in which lack of promotion is handled and communicated to employees is vital, because it has implications for their willingness to contribute, their career prospects and their intention to stay in the organisation. Contribution/value-add: This research can serve as the basis for the development of survey or research instruments that are appropriate and relevant to the population.

  6. Effects of breach formation parameter uncertainty on inundation risk area and consequence analysis

    Energy Technology Data Exchange (ETDEWEB)

    Skousen, Benjamin Don [Los Alamos National Laboratory; David, Judi [Los Alamos National Laboratory; Mc Pherson, Timothy [Los Alamos National Laboratory; Burian, Steve [UNIV OF UTAH

    2010-01-01

    According to the national inventory of dams (NID), there are approximately 79,500 dams in the United States, with 11,800 of these dams being classified as high-hazard. It has been recommended that each high-hazard dam in the United States have an emergency action plan (EAP), but it has been found that only about 60% of the high-hazard dams have a complete EAP. A major aspect of these plans is inundation risk area identification and associated impacts in the event of dam failure. In order to determine the inundation risk area an estimation of breach discharge must be completed. Most methods used to determine breach discharge, including the NWS-DAMBRK model, require modelers to select size, shape, and time of breach formation. Federal agencies (e.g. Bureau of Reclamation, Federal Energy Regulatory Commission) with oversight of U.S. dams have recommended ranges of values for each of these parameters based on dam type. However, variations in these parameters even within the recommended range have the potential to impose significant transformation on the discharge hydrograph relative to both timing and magnitude of the peak discharge. Therefore, it has also been recommended that sensitivity of these parameters be investigated when performing breach inundation analyses. This paper presents a sensitivity analysis of three breach parameters (average breach width, side slope, and time to failure) on a case study dam located in the United States. The sensitivity analysis employed was based on the 3^3 factorial design, in which three levels (e.g. low, medium, and high) were selected for each of the three parameters, resulting in twenty-seven combinations. The three levels remained within the recommended range of values for each parameter type. With each combination of input parameters, a discharge hydrograph was generated and used as a source condition for inundation analysis using a two-dimensional shallow water equation model.
The resulting simulations were compared to

  7. Effects of breach formation parameter uncertainty on inundation risk area and consequence analysis

    Energy Technology Data Exchange (ETDEWEB)

    Skousen, Benjamin Don [Los Alamos National Laboratory; David, Judi [Los Alamos National Laboratory; Mc Pherson, Timothy [Los Alamos National Laboratory; Burian, Steve [UNIV OF UTAH]

    2010-01-01

    According to the National Inventory of Dams (NID), there are approximately 79,500 dams in the United States, with 11,800 of these classified as high-hazard. It has been recommended that each high-hazard dam in the United States have an emergency action plan (EAP), but only about 60% of high-hazard dams have a complete EAP. A major aspect of these plans is identifying the inundation risk area and the associated impacts in the event of dam failure. To determine the inundation risk area, an estimate of the breach discharge must be made. Most methods used to determine breach discharge, including the NWS-DAMBRK model, require modelers to select the size, shape, and time of breach formation. Federal agencies (e.g. the Bureau of Reclamation and the Federal Energy Regulatory Commission) with oversight of U.S. dams have recommended ranges of values for each of these parameters based on dam type. However, variations in these parameters, even within the recommended ranges, can significantly alter the discharge hydrograph in both the timing and the magnitude of the peak discharge. Therefore, it has also been recommended that the sensitivity of these parameters be investigated when performing breach inundation analyses. This paper presents a sensitivity analysis of three breach parameters (average breach width, side slope, and time to failure) on a case study dam located in the United States. The sensitivity analysis was based on a 3³ factorial design, in which three levels (low, medium, and high) were selected for each of the three parameters, resulting in twenty-seven combinations. The three levels remained within the recommended range of values for each parameter. For each combination of input parameters, a discharge hydrograph was generated and used as a source condition for inundation analysis using a two-dimensional shallow water equation model.
The resulting simulations were compared to
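
A 3³ factorial design like the one described above can be sketched in a few lines. The parameter names and level values below are illustrative assumptions, not the ranges used in the study.

```python
from itertools import product

# Illustrative (low, medium, high) levels for each breach parameter;
# actual recommended ranges depend on dam type and overseeing agency.
levels = {
    "avg_breach_width_m": (20.0, 60.0, 100.0),
    "side_slope_h_per_v": (0.5, 1.0, 1.5),
    "time_to_failure_hr": (0.5, 1.0, 3.0),
}

# Full 3^3 factorial design: every combination of the three levels.
design = [dict(zip(levels, combo)) for combo in product(*levels.values())]

print(len(design))  # 27 combinations, one breach simulation each
```

Each of the 27 parameter sets would then drive one breach-discharge simulation, and the resulting hydrographs can be compared for sensitivity.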

  8. CGC/saturation approach for soft interactions at high energy: survival probability of central exclusive production

    Energy Technology Data Exchange (ETDEWEB)

    Gotsman, E.; Maor, U. [Tel Aviv University, Department of Particle Physics, Raymond and Beverly Sackler Faculty of Exact Science, School of Physics and Astronomy, Tel Aviv (Israel); Levin, E. [Tel Aviv University, Department of Particle Physics, Raymond and Beverly Sackler Faculty of Exact Science, School of Physics and Astronomy, Tel Aviv (Israel); Universidad Tecnica Federico Santa Maria, Departemento de Fisica, Centro Cientifico-Tecnologico de Valparaiso, Valparaiso (Chile)

    2016-04-15

    We estimate the value of the survival probability for central exclusive production in a model which is based on the CGC/saturation approach. Hard and soft processes are described in the same framework. At LHC energies, we obtain a small value for the survival probability. The source of the small value is the impact parameter dependence of the hard amplitude. Our model has successfully described a large body of soft data: elastic, inelastic and diffractive cross sections, inclusive production and rapidity correlations, as well as the t-dependence of deep inelastic diffractive production of vector mesons. (orig.)

  9. High-frequency cranial electrostimulation (CES) in patients with probable Alzheimer's disease

    NARCIS (Netherlands)

    Scherder, EJA; van Tol, MJ; Swaab, DF

    2006-01-01

    In a previous study, low-frequency cranial electrostimulation did not improve cognition and (affective) behavior in patients with probable Alzheimer's disease. In the present study, 21 Alzheimer's disease patients, divided into an experimental (n = 11) and a control group (n = 10), were treated for

  10. High-frequency cranial electrostimulation (CES) in patients with probable Alzheimer's disease.

    NARCIS (Netherlands)

    Scherder, E.J.A.; Tol, M.J. van; Swaab, D.F.

    2006-01-01

    In a previous study, low-frequency cranial electrostimulation did not improve cognition and (affective) behavior in patients with probable Alzheimer's disease. In the present study, 21 Alzheimer's disease patients, divided into an experimental (n = 11) and a control group (n = 10), were treated for

  11. GIS inundation mapping and dam breach analysis of Woolwich Dam using HEC-geoRAS

    Energy Technology Data Exchange (ETDEWEB)

    Mocan, N. [Crozier and Associates Inc., Collingwood, ON (Canada); Joy, D.M. [Guelph Univ., ON (Canada); Rungis, G. [Grand River Conservation Authority, Cambridge, ON (Canada)

    2006-07-01

    A study was conducted to determine the extent of flood inundation given a hypothetical dam breach scenario of the Woolwich Dam located in the Grand River Watershed, 2.5 km north of the Town of Elmira, Ontario. The dam is operated by the Grand River Conservation Authority and was constructed to provide low-flow augmentation to Canagagigue Creek. Advances in the computational capabilities of numerical models, along with the availability of fine-resolution geospatial data, have led to significant improvements in evaluating the catastrophic consequences of the ensuing flood waters when dams fail. The hydraulic models HEC-RAS and HEC-GeoRAS were used in this study along with GIS to produce high-resolution spatial and temporal flood inundation mapping. Given the proximity to the Town of Elmira, the dam is classified as having a high hazard potential. The large size and high hazard potential of the dam suggest that the Inflow Design Flood (IDF) is the Probable Maximum Flood (PMF) event. The outlet structure of the spillway consists of 4 ogee-type concrete spillways equipped with radial gates. A low-level concrete pipe located within the spillway structure provides spillage for maintenance purposes. The full flow capacity of the spillway structure is 297 cubic metres per second at the full supply level of 364.8 metres. In addition to GIS flood inundation maps, this paper included the results of flood hydrographs, water surface profiles and peak flow data. It was concluded that techniques used in this analysis should be considered for use in the development of emergency management planning and dam safety assessments across Canada. 6 refs., 3 tabs., 4 figs.
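
As a rough illustration of the breach-discharge estimate underlying such inundation mapping, a rectangular breach is often approximated as a broad-crested weir, Q = C·b·H^1.5 in SI units. The sketch below is a generic textbook relation with assumed dimensions and coefficient, not the HEC-RAS computation or the spillway rating from the study.

```python
def breach_weir_discharge(width_m: float, head_m: float, c_w: float = 1.7) -> float:
    """Approximate discharge (m^3/s) through a rectangular breach treated as a
    broad-crested weir: Q = c_w * b * H**1.5 (SI units, c_w ~ 1.7)."""
    return c_w * width_m * head_m ** 1.5

# Hypothetical example: a 40 m wide breach with 3 m of head over the crest.
q = breach_weir_discharge(40.0, 3.0)
print(round(q, 1))  # ~353.3 m^3/s
```

Such a single-number estimate is only a first check; full breach models route the reservoir drawdown through a time-varying breach geometry.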

  12. Probability, consequences, and mitigation for lightning strikes of Hanford high level waste tanks

    Energy Technology Data Exchange (ETDEWEB)

    Zach, J.J.

    1996-06-05

    The purpose of this report is to summarize selected lightning issues concerning the Hanford waste tanks. These issues include the probability of a lightning discharge striking the area immediately adjacent to a tank, including a riser; the consequences of significant energy deposition from a lightning strike in a tank; and mitigating actions that have been or are being taken. The major conclusion of this report is that the probability of a lightning strike depositing sufficient energy in a tank to cause an effect on employees or the public is low, but there are insufficient quantitative data on the tanks and waste to prove this. Protection, such as grounding of risers and air terminals on existing light poles, is recommended.

  13. Probability, consequences, and mitigation for lightning strikes to Hanford site high-level waste tanks

    Energy Technology Data Exchange (ETDEWEB)

    Zach, J.J.

    1996-08-01

    The purpose of this report is to summarize selected lightning issues concerning the Hanford waste tanks. These issues include the probability of a lightning discharge striking the area immediately adjacent to a tank, including a riser; the consequences of significant energy deposition from a lightning strike in a tank; and mitigating actions that have been or are being taken. The major conclusion of this report is that the probability of a lightning strike depositing sufficient energy in a tank to cause an effect on employees or the public is low, but there are insufficient quantitative data on the tanks and waste to prove this. Protection, such as grounding of risers and air terminals on existing light poles, is recommended.

  14. Hierarchical Decompositions for the Computation of High-Dimensional Multivariate Normal Probabilities

    KAUST Repository

    Genton, Marc G.

    2017-09-07

    We present a hierarchical decomposition scheme for computing the n-dimensional integral of multivariate normal probabilities that appear frequently in statistics. The scheme exploits the fact that the formally dense covariance matrix can be approximated by a matrix with a hierarchical low rank structure. It allows the reduction of the computational complexity per Monte Carlo sample from O(n^2) to O(mn + kn log(n/m)), where k is the numerical rank of off-diagonal matrix blocks and m is the size of small diagonal blocks in the matrix that are not well-approximated by low rank factorizations and treated as dense submatrices. This hierarchical decomposition leads to substantial efficiencies in multivariate normal probability computations and allows integrations in thousands of dimensions to be practical on modern workstations.
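
The integral in question can be illustrated with a plain Monte Carlo sketch for the bivariate case, checked against the closed-form orthant probability P(X < 0, Y < 0) = 1/4 + arcsin(ρ)/(2π). The hierarchical low-rank covariance approximation that the paper contributes is not reproduced here; the function below is a generic standard-library illustration.

```python
import math
import random

def orthant_prob_mc(rho: float, n: int = 200_000, seed: int = 1) -> float:
    """Monte Carlo estimate of P(X < 0, Y < 0) for standard bivariate
    normals with correlation rho, sampled via a 2x2 Cholesky factor."""
    rng = random.Random(seed)
    a = math.sqrt(1.0 - rho * rho)  # Cholesky: y = rho*z1 + a*z2
    hits = 0
    for _ in range(n):
        z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
        if z1 < 0 and rho * z1 + a * z2 < 0:
            hits += 1
    return hits / n

# Closed form for checking: P = 1/4 + asin(rho) / (2*pi) = 1/3 for rho = 0.5
est = orthant_prob_mc(0.5)
exact = 0.25 + math.asin(0.5) / (2 * math.pi)
print(abs(est - exact) < 0.01)
```

In high dimensions the per-sample cost of applying the covariance factor is what the paper's hierarchical decomposition reduces from O(n^2) to near-linear.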

  15. Development of probabilistic thinking-oriented learning tools for probability materials at junior high school students

    Science.gov (United States)

    Sari, Dwi Ivayana; Hermanto, Didik

    2017-08-01

    This research is a developmental study of probabilistic thinking-oriented learning tools for probability material at the ninth-grade level, aimed at producing a set of high-quality learning tools. The subjects were the IX-A students of MTs Model Bangkalan. The study used the 4-D development model, modified to three stages: define, design and develop. The learning tools consist of a lesson plan, students' worksheets, teaching media and a students' achievement test. The research instruments were a learning-tools validation sheet, observation sheets for teachers' and students' activities, a students' response questionnaire and the achievement test; their results were analyzed descriptively to answer the research objectives. The resulting learning tools, oriented to probabilistic thinking about probability for ninth-grade students, were found to be valid. After revision based on the validation and a classroom experiment, the teachers' classroom management proved effective, the students' activities were good, the students' responses to the learning tools were positive, and the achievement test met the validity, sensitivity and reliability criteria. In summary, these learning tools can be used by teachers to teach probability and to develop students' probabilistic thinking.

  16. An unusual case of coccidiosis in laboratory-reared pheasants resulting from a breach in biosecurity.

    Science.gov (United States)

    Gerhold, R W; Williams, S M; Fuller, A L; McDougald, L R

    2010-09-01

    An outbreak of coccidiosis in laboratory-reared Chinese ring-necked pheasants (Phasianus colchicus) resulted in high morbidity and moderate mortality. The outbreak was associated with a breach in biosecurity caused by the cleaning of a sewer line with a mechanical device, resulting in extensive splattering of fecal material throughout the "clean room" where birds were held prior to use in coccidiosis experiments. Mortality and morbidity in the affected birds were seen exactly 5 days after the incident, after birds had been moved to another room for experimental use, corresponding closely with the known prepatent or preclinical period of Eimeria phasiani and Eimeria colchici. Gross lesions in the affected birds varied from dehydration to intestinal and ventricular hemorrhage. Microscopic examination confirmed a diagnosis of severe intestinal coccidiosis. This report underscores the ease of contamination of experimental birds leading to coccidiosis outbreaks during breaches of management and biosecurity.

  17. Recent trends in the probability of high out-of-pocket medical expenses in the United States

    Directory of Open Access Journals (Sweden)

    Katherine E Baird

    2016-09-01

    Objective: This article measures the probability that out-of-pocket expenses in the United States exceed a threshold share of income. It calculates this probability separately by individuals’ health condition, income, and elderly status and estimates changes occurring in these probabilities between 2010 and 2013. Data and Method: This article uses nationally representative household survey data on 344,000 individuals. Logistic regressions estimate the probabilities that out-of-pocket expenses exceed 5% and alternatively 10% of income in the two study years. These probabilities are calculated for individuals based on their income, health status, and elderly status. Results: Despite favorable changes in both health policy and the economy, large numbers of Americans continue to be exposed to high out-of-pocket expenditures. For instance, the results indicate that in 2013 over a quarter of nonelderly low-income citizens in poor health spent 10% or more of their income on out-of-pocket expenses, and over 40% of this group spent more than 5%. Moreover, for Americans as a whole, the probability of spending in excess of 5% of income on out-of-pocket costs increased by 1.4 percentage points between 2010 and 2013, with the largest increases occurring among low-income Americans; the probability of Americans spending more than 10% of income grew from 9.3% to 9.6%, with the largest increases also occurring among the poor. Conclusion: The magnitude of out-of-pocket’s financial burden and the most recent upward trends in it underscore a need to develop good measures of the degree to which health care policy exposes individuals to financial risk, and to closely monitor the Affordable Care Act’s success in reducing Americans’ exposure to large medical bills.

  18. Recent trends in the probability of high out-of-pocket medical expenses in the United States

    Science.gov (United States)

    Baird, Katherine E

    2016-01-01

    Objective: This article measures the probability that out-of-pocket expenses in the United States exceed a threshold share of income. It calculates this probability separately by individuals’ health condition, income, and elderly status and estimates changes occurring in these probabilities between 2010 and 2013. Data and Method: This article uses nationally representative household survey data on 344,000 individuals. Logistic regressions estimate the probabilities that out-of-pocket expenses exceed 5% and alternatively 10% of income in the two study years. These probabilities are calculated for individuals based on their income, health status, and elderly status. Results: Despite favorable changes in both health policy and the economy, large numbers of Americans continue to be exposed to high out-of-pocket expenditures. For instance, the results indicate that in 2013 over a quarter of nonelderly low-income citizens in poor health spent 10% or more of their income on out-of-pocket expenses, and over 40% of this group spent more than 5%. Moreover, for Americans as a whole, the probability of spending in excess of 5% of income on out-of-pocket costs increased by 1.4 percentage points between 2010 and 2013, with the largest increases occurring among low-income Americans; the probability of Americans spending more than 10% of income grew from 9.3% to 9.6%, with the largest increases also occurring among the poor. Conclusion: The magnitude of out-of-pocket’s financial burden and the most recent upward trends in it underscore a need to develop good measures of the degree to which health care policy exposes individuals to financial risk, and to closely monitor the Affordable Care Act’s success in reducing Americans’ exposure to large medical bills. PMID:27651901

  19. Natural history of human papillomavirus infection in non-vaccinated young males: low clearance probability in high-risk genotypes.

    Science.gov (United States)

    Cai, T; Perletti, G; Meacci, F; Magri, V; Verze, P; Palmieri, A; Mazzoli, S; Santi, R; Nesi, G; Mirone, V; Bartoletti, R

    2016-03-01

    In this study, we aimed to investigate the clearance of type-specific genital human papillomavirus (HPV) infection in heterosexual, non-HPV-vaccinated males whose female partners were positive to HPV DNA tests. All consecutive men attending the same sexually transmitted diseases (STD) centre between January 2005 and December 2006 were considered for this study. All subjects (n = 1009) underwent a urologic visit and microbiological tests on first void, midstream urine and total ejaculate samples. One hundred and five patients were positive for HPV DNA (10.4 %; mean age: 34.8 ± 5.8 years) and consented to clinical examination and molecular diagnostic assays for HPV detection scheduled every 6 months (median surveillance period of 53.2 months). HPV genotypes were classified as high risk, probable high risk and low risk. HPV-positive samples which did not hybridise with any of the type-specific probes were referred to as positive non-genotypeable. At enrollment, the distribution of HPV genotypes was as follows: high-risk HPV (n = 37), probable high-risk HPV (n = 6), low-risk HPV (n = 23) and non-genotypeable HPV (n = 39). A high HPV genotype concordance between stable sexual partners emerged (kappa = 0.92; p probable high-risk HPV cases, 20/23 (86.9 %) low-risk HPV cases and 31/39 (79.5 %) non-genotypeable cases. The high-risk HPV genotypes showed the lowest rate and probability of viral clearance (p < 0.001). In our series, high-risk HPV infections were more likely to persist over time when compared with other HPV genotypes.

  20. Managing breaches of containment and eradication of invasive plant populations.

    Science.gov (United States)

    Fletcher, Cameron S; Westcott, David A; Murphy, Helen T; Grice, Anthony C; Clarkson, John R

    2015-02-01

    Containment can be a viable strategy for managing invasive plants, but it is not always cheaper than eradication. In many cases, converting a failed eradication programme to a containment programme is not economically justified. Despite this, many contemporary invasive plant management strategies invoke containment as a fallback for failed eradication, often without detailing how containment would be implemented. We demonstrate a generalized analysis of the costs of eradication and containment, applicable to any plant invasion for which infestation size, dispersal distance, seed bank lifetime and the economic discount rate are specified. We estimate the costs of adapting eradication and containment in response to six types of breach and calculate under what conditions containment may provide a valid fallback to a breached eradication programme. We provide simple, general formulae and plots that can be applied to any invasion and show that containment will be cheaper than eradication only when the size of the occupied zone exceeds a multiple of the dispersal distance determined by seed bank longevity and the discount rate. Containment becomes proportionally cheaper than eradication for invaders with smaller dispersal distances, longer-lived seed banks, or for larger discount rates. Both containment and eradication programmes are at risk of breach. Containment is less exposed to risk from reproduction in the 'occupied zone' and three types of breach that lead to a larger 'occupied zone', but more exposed to one type of breach that leads to a larger 'buffer zone'. For a well-specified eradication programme, only the three types of breach leading to reproduction in or just outside the buffer zone can justify falling back to containment, and only if the expected costs of eradication and containment were comparable before the breach. Synthesis and applications.
Weed management plans must apply a consistent definition of containment and provide sufficient implementation detail
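
The cost comparison in this abstract rests on standard discounting: an open-ended containment programme has present value c/r at discount rate r, while eradication is a finite cost stream. The cost figures and horizon below are purely hypothetical, chosen only to show how a larger discount rate tilts the comparison toward containment.

```python
def pv_perpetual(annual_cost: float, discount_rate: float) -> float:
    """Present value of a cost paid every year in perpetuity: c / r."""
    return annual_cost / discount_rate

def pv_finite(annual_cost: float, discount_rate: float, years: int) -> float:
    """Present value of a cost paid annually for a fixed number of years."""
    r = discount_rate
    return annual_cost * (1 - (1 + r) ** -years) / r

# Hypothetical programmes: containment costs 50k/yr indefinitely;
# eradication costs 120k/yr until the seed bank is exhausted (8 yr).
for r in (0.05, 0.15):
    cheaper = pv_perpetual(50_000, r) < pv_finite(120_000, r, 8)
    print(r, cheaper)  # higher discount rates favour containment
```

At r = 0.05 eradication's finite stream is cheaper in present-value terms, while at r = 0.15 the heavily discounted perpetuity of containment wins, mirroring the abstract's conclusion that larger discount rates make containment proportionally cheaper.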

  1. Modeling of Breaching Due to Overtopping Flow and Waves Based on Coupled Flow and Sediment Transport

    CERN Document Server

    He, Zhiguo; Zhao, Liang; Wu, Ganfeng; Pähtz, Thomas

    2015-01-01

    Breaching of earthen or sandy dams/dunes by overtopping flow and waves is a complicated process with strong, unsteady flow, high sediment transport, and rapid bed changes in which the interactions between flow and morphology should not be ignored. This study presents a depth-averaged two-dimensional (2D) coupled flow and sediment transport model to investigate the flow and breaching processes with and without waves. Bed change and variable flow density are included in the flow continuity and momentum equations to consider the impacts of sediment transport. The model adopts the non-equilibrium approach for total-load sediment transport and specifies different repose angles to handle non-cohesive embankment slope avalanching. The equations are solved using an explicit finite volume method on a rectangular grid with the improved Godunov-type central upwind scheme and the nonnegative reconstruction of the water depth method to handle mixed-regime flows near the breach. The model has been tested against two sets o...

  2. Climate drives inter-annual variability in probability of high severity fire occurrence in the western United States

    Science.gov (United States)

    Keyser, Alisa; Westerling, Anthony LeRoy

    2017-05-01

    A long history of fire suppression in the western United States has significantly changed forest structure and ecological function, leading to increasingly uncharacteristic fires in terms of size and severity. Prior analyses of fire severity in California forests showed that time since last fire and fire weather conditions predicted fire severity very well, while a larger regional analysis showed that topography and climate were important predictors of high severity fire. There has not yet been a large-scale study that incorporates topography, vegetation and fire-year climate to determine regional scale high severity fire occurrence. We developed models to predict the probability of high severity fire occurrence for the western US. We predict high severity fire occurrence with some accuracy, and identify the relative importance of predictor classes in determining the probability of high severity fire. The inclusion of both vegetation and fire-year climate predictors was critical for model skill in identifying fires with high fractional fire severity. The inclusion of fire-year climate variables allows this model to forecast inter-annual variability in areas at future risk of high severity fire, beyond what slower-changing fuel conditions alone can accomplish. This allows for more targeted land management, including resource allocation for fuels reduction treatments to decrease the risk of high severity fire.

  3. Decrease the Number of Glovebox Glove Breaches and Failures

    Energy Technology Data Exchange (ETDEWEB)

    Hurtle, Jackie C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2013-12-24

    Los Alamos National Laboratory (LANL) is committed to the protection of workers, the public, and the environment while performing work, and uses gloveboxes as engineered controls to protect workers from exposure to hazardous materials during plutonium operations. Glovebox gloves are a weak link in the engineered controls and are a major cause of radiation contamination events, which can result in potential worker exposure and localized contamination, making operational areas off-limits and putting programmatic work on hold. Each day of lost opportunity at Technical Area (TA) 55, Plutonium Facility (PF) 4 is estimated at $1.36 million. Between July 2011 and June 2013, TA-55-PF-4 had 65 glovebox glove breaches and failures, an average of 2.7 per month. Glovebox work follows the five-step safety process promoted at LANL, with a decision diamond interjected for whether or not a glove breach or failure event occurred in the course of performing the work. If no glove breach or failure is detected, there is an additional decision for whether or not contamination is detected. If contamination is detected, the possibility of a glove breach or failure event is revisited.

  4. 50 CFR 38.9 - Breach of the peace.

    Science.gov (United States)

    2010-10-01

    ... NATIONAL WILDLIFE REFUGE SYSTEM MIDWAY ATOLL NATIONAL WILDLIFE REFUGE Prohibitions § 38.9 Breach of the peace. No person on Midway Atoll National Wildlife Refuge will: (a) With intent to cause public... one's conduct is likely to cause affront or alarm. ...

  5. 75 FR 13138 - Grand Ditch Breach Restoration Environmental Impact Statement, Rocky Mountain National Park, CO

    Science.gov (United States)

    2010-03-18

    ... National Park Service Grand Ditch Breach Restoration Environmental Impact Statement, Rocky Mountain... prepare an Environmental Impact Statement for the Grand Ditch Breach Restoration, Rocky Mountain National...), the National Park Service is preparing an Environmental Impact Statement for the Grand Ditch...

  6. Quantum probability

    CERN Document Server

    Gudder, Stanley P

    2014-01-01

    Quantum probability is a subtle blend of quantum mechanics and classical probability theory. Its important ideas can be traced to the pioneering work of Richard Feynman in his path integral formalism.Only recently have the concept and ideas of quantum probability been presented in a rigorous axiomatic framework, and this book provides a coherent and comprehensive exposition of this approach. It gives a unified treatment of operational statistics, generalized measure theory and the path integral formalism that can only be found in scattered research articles.The first two chapters survey the ne

  7. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially...
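
A minimal simulation of the classical compound Poisson (Cramér-Lundberg) model makes the finite-horizon ruin probability concrete; all parameter values below are arbitrary illustrations, not examples from the book.

```python
import random

def ruin_prob(u: float, c: float, lam: float, mean_claim: float,
              horizon: float, n_paths: int = 20_000, seed: int = 7) -> float:
    """Monte Carlo finite-horizon ruin probability for the Cramér-Lundberg
    surplus U(t) = u + c*t - S(t), with Poisson(lam) claim arrivals and
    exponential claim sizes. Ruin occurs iff S(t) - c*t > u at some claim
    instant (the surplus can only reach a new minimum right after a claim)."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(n_paths):
        t, aggregate = 0.0, 0.0
        while True:
            t += rng.expovariate(lam)                       # next claim arrival
            if t > horizon:
                break
            aggregate += rng.expovariate(1.0 / mean_claim)  # claim size
            if aggregate - c * t > u:                       # surplus below zero
                ruined += 1
                break
    return ruined / n_paths

# Ruin is less likely with more initial capital u (same loading c = 1.2):
print(ruin_prob(0.0, 1.2, 1.0, 1.0, 50.0) > ruin_prob(10.0, 1.2, 1.0, 1.0, 50.0))
```

With exponential claims this estimate can be checked against the classical closed form ψ(u) = (1/(1+θ))·exp(−θu/((1+θ)μ)), where θ is the safety loading; the simulation is the fallback when no such formula exists.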

  8. Highly Oxidized Platinum Nanoparticles Prepared through Radio-Frequency Sputtering: Thermal Stability and Reaction Probability towards CO.

    Science.gov (United States)

    Svintsitskiy, Dmitry A; Kibis, Lidiya S; Stadnichenko, Andrey I; Koscheev, Sergei V; Zaikovskii, Vladimir I; Boronin, Andrei I

    2015-10-26

    Platinum-oxide nanoparticles were prepared through the radio-frequency (RF) discharge sputtering of a Pt electrode in an oxygen atmosphere. The structure, particle size, electronic properties, and surface composition of the RF-sputtered particles were studied by using transmission electron microscopy and X-ray photoelectron spectroscopy. The application of the RF discharge method resulted in the formation of highly oxidized Pt(4+) species that were stable under ultrahigh vacuum conditions up to 100 °C, indicating the capability of Pt(4+)-O species to play an important role in oxidation catalysis under real conditions. The thermal stability and reaction probability of Pt(4+) oxide species were analyzed and compared with those of Pt(2+) species. The reaction probability of PtO2 nanoparticles at 90 °C was found to be about ten times higher than that of PtO-like structures.

  9. Battling Data Breaches: For Higher Education Institutions, Data Breach Prevention is More Complex than for Industry and Business

    Science.gov (United States)

    Patton, Madeline

    2015-01-01

    Data breach prevention is a battle, rarely plain and never simple. For higher education institutions, the Sisyphean aspects of the task are more complex than for industry and business. Two-year colleges have payrolls and vendor contracts like those enterprises. They also have public record and student confidentiality requirements. Colleges must…

  10. Application of sediment transport formulae to sand-dike breach erosion

    NARCIS (Netherlands)

    Visser, P.J.

    1994-01-01

    The Technical Advisory Committee on Water Defences in the Netherlands has decided to develop a mathematical model for breach erosion in dunes and dikes, with which it will be possible to predict the growth of the breach and the discharge rate through the breach in case of a dike-burst. An essential

  11. Overtopping breaching of cohesive homogeneous earth dam with different cohesive strength

    Institute of Scientific and Technical Information of China (English)

    ZHANG JianYun; LI Yun; XUAN GuoXiang; WANG XiaoGang; LI Jun

    2009-01-01

    dumping collapse. When the cohesive strength is smaller, the breach process becomes faster, and the peak outflow and the final width and depth of the breach become bigger. The main characteristic of the breach formation is single-level head cutting and shearing collapse.

  12. Experimental study of breach process of landslide dams by overtopping and its initiation mechanisms

    Institute of Scientific and Technical Information of China (English)

    杨阳; 曹叔尤; 杨克君; 李文萍

    2015-01-01

    The present paper studies the physics of the breach erosion process, particularly the breach initiation process, in overtopped landslide dams. Due to the great complexities involved, only homogeneous landslide dams are considered. Flume experiments of dam overtopping are conducted to study the breach growth process, and, in order to reveal the effects of seepage during breach development, the permeability characteristics of the dam materials are also taken into consideration. From the experimental observations, the details of the breach growth are examined, and the whole breach process can be divided into five stages: Stage I, seepage erosion; Stage II, formation of the initial breach; Stage III, erosion toward the head; Stage IV, expansion and incision of the breach; and Stage V, re-equilibration of the river channel through the breach. It is shown that once triggered, the entire breach process continues without stopping, which highlights the significant impact of the early stages on the later deformation of the dam. Evidence shows that the initial breach of the dam is most likely to take place in the downstream slope of the dam, near the upper edge of the seepage face. The experimental results show a “headcut” mechanism of breach initiation.

  13. 25 CFR 163.42 - Obligated service and breach of contract.

    Science.gov (United States)

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false Obligated service and breach of contract. 163.42 Section... breach of contract. (a) Obligated service. (1) Individuals completing forestry education programs with an... request for waiver. (b) Breach of contract. Any individual who has participated in and accepted...

  14. 15 CFR 971.405 - Breach of international peace and security involving armed conflict.

    Science.gov (United States)

    2010-01-01

    ... 15 Commerce and Foreign Trade 3 2010-01-01 2010-01-01 false Breach of international peace and..., Modification/revision; Suspension/revocation § 971.405 Breach of international peace and security involving... expected to lead to a breach of international peace and security involving armed conflict....

  15. 78 FR 32441 - Grand Ditch Breach Restoration, Final Environmental Impact Statement, Rocky Mountain National...

    Science.gov (United States)

    2013-05-30

    ... National Park Service Grand Ditch Breach Restoration, Final Environmental Impact Statement, Rocky Mountain... Availability of the Final Environmental Impact Statement for the Grand Ditch Breach Restoration, Rocky Mountain... Grand Ditch Breach Restoration, Rocky Mountain National Park, Colorado. DATES: The National Park...

  16. 31 CFR 355.6 - What happens if the presenting bank breaches its warranty?

    Science.gov (United States)

    2010-07-01

    ... breaches its warranty? 355.6 Section 355.6 Money and Finance: Treasury Regulations Relating to Money and... GOVERNING FISCAL AGENCY CHECKS § 355.6 What happens if the presenting bank breaches its warranty? If the presenting bank breaches its warranty, the payor Reserve Bank may either return the check to the...

  17. 41 CFR 50-203.1 - Reports of breach or violation.

    Science.gov (United States)

    2010-07-01

    ... 41 Public Contracts and Property Management 1 2010-07-01 2010-07-01 true Reports of breach or... of the Walsh-Healey Public Contracts Act § 50-203.1 Reports of breach or violation. (a) Any employer, employee, labor or trade organization or other interested person or organization may report a breach...

  18. 15 CFR 970.505 - Breach of international peace and security involving armed conflict.

    Science.gov (United States)

    2010-01-01

    ... 15 Commerce and Foreign Trade 3 2010-01-01 2010-01-01 false Breach of international peace and.../revision; Suspension/revocation § 970.505 Breach of international peace and security involving armed... to a breach of international peace and security involving armed conflict....

  19. 32 CFR 507.18 - Processing complaints of alleged breach of policies.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 3 2010-07-01 2010-07-01 true Processing complaints of alleged breach of... Program § 507.18 Processing complaints of alleged breach of policies. The Institute of Heraldry may revoke or suspend the certificate of authority to manufacture if there are breaches of quality...

  20. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  1. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  2. Thunderstorms as probable reason of high background neutron fluxes on L<1.2

    Science.gov (United States)

    Bratolyubova-Tsulukidze, L.; Grachev, E.; Grigoryan, O.; Kunitsyn, V.; Kuzhevskiy, B.; Nechaev, O.; Usanova, M.

    In this paper we analyze the neutron emission observations made in the experiments onboard the MIR orbital station (1991), the ISS (2002) and the Colibri-2002 satellite (2002) at an altitude of 400 km. The helium discharge detectors made it possible to detect neutrons with energies ranging from 0.25 eV to 1.9 MeV. The spatial distribution of high background neutron fluxes has a longitude dependence: these events have been observed in the −20°…60° and 135°…180°…−135° longitude intervals. The most intensive fluxes near the geomagnetic equator were registered in the African region. They are not found to be associated with increases of proton fluxes (Ep > 50 MeV). As a statistical set, the events appear to coincide with the most active region of atmospheric weather. In this paper we assess the possibility that the occurrence of high background neutron fluxes in the African region is connected with lightning discharges. To observe neutron emission at an altitude of 400 km, ~10^10 neutrons are required to be produced by a lightning discharge. These theoretical predictions suggest cloud charge values of about 250-300 Coulomb.

  3. Lexicographic Probability, Conditional Probability, and Nonstandard Probability

    Science.gov (United States)

    2009-11-11

    the following conditions: CP1. µ(U |U) = 1 if U ∈ F′. CP2. µ(V1 ∪ V2 |U) = µ(V1 |U) + µ(V2 |U) if V1 ∩ V2 = ∅, U ∈ F′, and V1, V2 ∈ F. CP3. µ(V |U) = µ(V |X) × µ(X |U) if V ⊆ X ⊆ U, U, X ∈ F′, V ∈ F. Note that it follows from CP1 and CP2 that µ(· |U) is a probability measure on (W, F) (and, in ... CP2 hold. This is easily seen to determine µ. Moreover, µ vacuously satisfies CP3, since there do not exist distinct sets U and X in F′ such that U ...

  4. An Upper Bound on High Speed Satellite Collision Probability When Only One Object has Position Uncertainty Information

    Science.gov (United States)

    Frisbee, Joseph H., Jr.

    2015-01-01

    Upper bounds on high speed satellite collision probability, Pc, have been investigated. Previous methods assume that an individual position error covariance matrix is available for each object, the two matrices being combined into a single, relative position error covariance matrix. Components of the combined error covariance are then varied to obtain a maximum Pc. If error covariance information is available for only one of the two objects, either some default shape must be used or nothing can be done. An alternative is presented that uses the known covariance information along with a critical value of the missing covariance to obtain an approximate but potentially useful Pc upper bound.
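    As a generic illustration of the underlying Pc computation (not the author's bounding method), the short-encounter collision probability is the mass of the Gaussian relative-miss distribution inside the combined hard-body circle, which can be estimated by Monte Carlo. All numeric values below are hypothetical.

```python
import numpy as np

def collision_probability(mean, cov, hard_body_radius, n=200_000, seed=0):
    """Monte Carlo estimate of Pc: the probability that the Gaussian
    relative miss vector in the encounter plane falls inside the
    combined hard-body circle."""
    rng = np.random.default_rng(seed)
    samples = rng.multivariate_normal(mean, cov, size=n)
    hits = np.hypot(samples[:, 0], samples[:, 1]) < hard_body_radius
    return hits.mean()

# Hypothetical encounter: 100 m miss distance, 200 m / 100 m sigmas,
# 20 m combined hard-body radius.
pc = collision_probability(mean=[100.0, 0.0],
                           cov=[[200.0**2, 0.0], [0.0, 100.0**2]],
                           hard_body_radius=20.0)
print(f"Pc = {pc:.2e}")
```

    Varying the components of `cov` while holding the miss distance fixed, and recording the largest resulting Pc, reproduces the maximization step the abstract describes.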

  5. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory and Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think of insurance companies facing losses due to natural disasters, banks seeking protection against huge losses, failures in expensive and sophisticated systems or loss of valuable information in electronic systems. The main difficulty when dealing with this kind of problem is the unavailability of a closed...

  6. Probability theory

    CERN Document Server

    S Varadhan, S R

    2001-01-01

    This volume presents topics in probability theory covered during a first-year graduate course given at the Courant Institute of Mathematical Sciences. The necessary background material in measure theory is developed, including the standard topics, such as extension theorem, construction of measures, integration, product spaces, Radon-Nikodym theorem, and conditional expectation. In the first part of the book, characteristic functions are introduced, followed by the study of weak convergence of probability distributions. Then both the weak and strong limit theorems for sums of independent rando

  7. Probability and volume of potential postwildfire debris flows in the 2012 High Park Burn Area near Fort Collins, Colorado

    Science.gov (United States)

    Verdin, Kristine L.; Dupree, Jean A.; Elliott, John G.

    2012-01-01

    This report presents a preliminary emergency assessment of the debris-flow hazards from drainage basins burned by the 2012 High Park fire near Fort Collins in Larimer County, Colorado. Empirical models derived from statistical evaluation of data collected from recently burned basins throughout the intermountain western United States were used to estimate the probability of debris-flow occurrence and volume of debris flows along the burned area drainage network and to estimate the same for 44 selected drainage basins along State Highway 14 and the perimeter of the burned area. Input data for the models included topographic parameters, soil characteristics, burn severity, and rainfall totals and intensities for a (1) 2-year-recurrence, 1-hour-duration rainfall (25 millimeters); (2) 10-year-recurrence, 1-hour-duration rainfall (43 millimeters); and (3) 25-year-recurrence, 1-hour-duration rainfall (51 millimeters). Estimated debris-flow probabilities along the drainage network and throughout the drainage basins of interest ranged from 1 to 84 percent in response to the 2-year-recurrence, 1-hour-duration rainfall; from 2 to 95 percent in response to the 10-year-recurrence, 1-hour-duration rainfall; and from 3 to 97 percent in response to the 25-year-recurrence, 1-hour-duration rainfall. Basins and drainage networks with the highest probabilities tended to be those on the eastern edge of the burn area where soils have relatively high clay contents and gradients are steep. Estimated debris-flow volumes range from a low of 1,600 cubic meters to a high of greater than 100,000 cubic meters. Estimated debris-flow volumes increase with basin size and distance along the drainage network, but some smaller drainages were also predicted to produce substantial volumes of material. The predicted probabilities and some of the volumes predicted for the modeled storms indicate a potential for substantial debris-flow impacts on structures, roads, bridges, and culverts located both within and
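    Empirical postwildfire debris-flow models of this type typically use a logistic link to turn a linear combination of topographic, burn-severity, soil, and rainfall predictors into a probability. The sketch below shows that link with entirely hypothetical coefficients and predictor values; the published models' actual terms differ.

```python
import math

def debris_flow_probability(logit_terms):
    """Logistic link used by empirical debris-flow occurrence models:
    P = e^x / (1 + e^x), where x is a linear combination of basin
    topography, burn severity, soil, and storm rainfall predictors."""
    x = sum(logit_terms)
    return math.exp(x) / (1.0 + math.exp(x))

# Hypothetical coefficients and predictor values, for illustration only.
x_terms = [-0.7,          # intercept
           0.03 * 30.0,   # percent of basin burned at moderate/high severity
           1.5 * 0.4,     # soil clay-content proxy
           0.02 * 25.0]   # storm rainfall total (mm)
p = debris_flow_probability(x_terms)
print(f"P(debris flow) = {p:.2f}")
```

    Raising the rainfall term (for the 10- and 25-year storms) pushes the logit up, which is why the reported probability ranges widen with storm recurrence interval.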

  8. Development of computer code SAFFRON for evaluating breached pin performance in FBR's

    Energy Technology Data Exchange (ETDEWEB)

    Ukai, Shigeharu; Shikakura, Sakae (Power Reactor and Nuclear Fuel Development Corp., Oarai, Ibaraki (Japan). Oarai Engineering Center); Sano, Yuji; Takita, Masami

    1994-07-01

    In order to evaluate breached pin behavior in FBRs, the breached pin performance analysis code SAFFRON was developed. Based on the results of the run-beyond-cladding-breach test in EBR-II, a collaborative program between PNC and the U.S. DOE, the following behaviors were taken into consideration: fuel-sodium reaction product (FSRP) formation, the resultant fuel expansion, breach extension of the cladding, and release of delayed neutron precursors into the coolant. Using 3-dimensional elastic analysis by the finite element method, the breached pin diameter increase is adequately predicted with a reduced Young's modulus for the breached fuel. The delayed neutron signal response in on-line diagnosis was evaluated in relation to the growth of the FSRP and the enlargement of the breached area. (author).

  9. Tapping Transaction Costs to Forecast Acquisition Cost Breaches

    Science.gov (United States)

    2016-01-01

    management, survival analysis ... Controlling cost growth for a major defense acquisition program (MDAP) has been ... directing and controlling a totally integrated engineering effort of a system or program.” Program management is defined as “the business and ... Defense ARJ, January 2016, Vol. 23 No. 1: 56-76. Tapping Transaction Costs to Forecast Acquisition Cost Breaches, Laura E. Armey and Diana I

  10. Implications of Transaction Costs for Acquisition Program Cost Breaches

    Science.gov (United States)

    2013-06-01

    ... purchasing price for the home (bargaining and decision costs). Last, if the home was purchased using money that was borrowed from a mortgage lender, the ... not significantly affect the ability of an MDAP to operate within its approved financial constraints as measured by cost breaches. B. DATA: The cost ...

  11. Transaction Costs and Cost Breaches in Major Defense Acquisition Programs

    Science.gov (United States)

    2014-02-04

    ... price for the home (bargaining and decision costs). Last, if the home was purchased using money that was borrowed from a mortgage lender, the ... operate within its approved financial constraints as measured by cost breaches. Data Sources: This study used two major data sources to develop the ...

  12. Standard breach remedies, quality thresholds, and cooperative investments

    OpenAIRE

    Stremitzer, Alexander

    2008-01-01

    When investments are non-verifiable, inducing cooperative investments with simple contracts may not be as difficult as previously thought. Indeed, modeling 'expectation damages' close to legal practice, we show that the default remedy of contract law induces the first best. Yet, in order to lower informational requirements of courts, parties may opt for a 'specific performance' regime which grants the breached-against buyer an option to choose 'restitution' if the tender's value falls below s...

  13. Childhood tumours with a high probability of being part of a tumour predisposition syndrome; reason for referral for genetic consultation.

    Science.gov (United States)

    Postema, Floor A M; Hopman, Saskia M J; Aalfs, Cora M; Berger, Lieke P V; Bleeker, Fonnet E; Dommering, Charlotte J; Jongmans, Marjolijn C J; Letteboer, Tom G W; Olderode-Berends, Maran J W; Wagner, Anja; Hennekam, Raoul C; Merks, Johannes H M

    2017-07-01

    Recognising a tumour predisposition syndrome (TPS) in childhood cancer patients is of major clinical relevance. The presence of a TPS may be suggested by the type of tumour in the child. We present an overview of 23 childhood tumours that in themselves should be a reason to refer a child for genetic consultation. We performed a PubMed search to review the incidence of TPSs in children for 85 tumour types listed in the International Classification of Childhood Cancer third edition (ICCC-3). The results were discussed during a national consensus meeting with representative clinical geneticists from all six academic paediatric oncology centres in The Netherlands. A TPS incidence of 5% or more was considered a high probability and therefore in itself a reason for referral to a clinical geneticist. The literature search resulted in data on the incidence of a TPS in 26 tumours. For 23/26 tumour types, a TPS incidence of 5% or higher was reported. In addition, during the consensus meeting the experts agreed that children with any carcinoma should always be referred for clinical genetic consultation as well, as it may point to a TPS. We present an overview of 23 paediatric tumours with a high probability of a TPS; this will facilitate paediatric oncologists to decide which patients should be referred for genetic consultation merely based on type of tumour. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. Probability in High Dimension

    Science.gov (United States)

    2014-06-30

    One theme that will arise repeatedly in the sequel is the connection between concentration and the rate of convergence to equilibrium of Markov ... that is open for the weak convergence topology. Then lim inf_{n→∞} (1/n) log P[(1/n) Σ_{k=1}^n δ_{X_k} ∈ O] ≥ −inf_{ν∈O} D(ν‖µ). Remark 4.33. We have only ...

  15. Temporally remote destabilization of prediction after rare breaches of expectancy.

    Science.gov (United States)

    Kühn, Anne B; Schubotz, Ricarda I

    2012-08-01

    While neural signatures of breaches of expectancy and their immediate effects have been investigated, thus far, temporally more remote effects have been neglected. The present fMRI study explored neural correlates of temporally remote destabilization of prediction following rare breaches of expectancy with a mean delay of 14 s. We hypothesized temporally remote destabilization to be reflected either in an attenuation of areas related to long-term memory or in an increase of lateral fronto-parietal loops related to the encoding of new stimuli. Monitoring a deterministic 24-digit sequence, subjects were asked to indicate occasional sequential omissions by key press. Temporally remote destabilization of prediction was expected to be revealed by contrasting sequential events whose equivalent was omitted in the preceding sequential run n-1 (destabilized events) with sequential events without such history (nondestabilized events). Temporally remote destabilization of prediction was reflected in an attenuation of activity in the dorsal frontomedian cortex (Brodmann Area (BA) 9) bilaterally. Moreover, activation of the left medial BA 9 was enhanced by contrasting nondestabilized events with breaches. The decrease of dorsal frontomedian activation in the case of destabilized events might be interpreted as a top-down modulation on perception causing a less expectation-restricted encoding of the current stimulus and hence enabling the adaptation of expectation and prediction in the long run.

  16. Mainstream body-character breach films and subjectivization.

    Science.gov (United States)

    Meiri, Sandra; Kohen-Raz, Odeya

    2017-02-01

    The authors analyze a unique cinematic corpus - 'body-character breach films' (one character, initially played by a certain actor, occupies the body of another character) - demonstrating Lacan's notion of traversing the fantasy, both on the level of the films' diegesis and that of spectatorship. Breaching the alliance between actors and their characters perturbs the viewer's fantasy of wholeness enabled by this very alliance. Consequently, a change in subject/spectatorial position in relation to the lack in the Other is induced, enhanced through the visualization of various scenarios of unconscious fantasies (mostly incest). These are meant to unsettle the spectator into an awareness of how a conscious fantasy conceals another unconscious fundamental fantasy, thereby encouraging a change in spectatorial position (from 'perverse'/fetishistic to 'neurotic'). Conflating this change with Lacan's notion of traversing the fantasy, the authors contend that mainstream cinema has the capacity to induce a process of subjectivization (assuming responsibility for one's own desire). This process is contingent on four conditions: identification with the protagonist's fantasy to conceal the lack in the Other; dissolution of this fantasy, initiated by the body-character breach; rhetorical strategies (the coding of unconscious scenarios cinematically); and an ethical dimension (encouraging the subject/spectator to follow her/his desire). Copyright © 2016 Institute of Psychoanalysis.

  17. Context-adaptive binary arithmetic coding with precise probability estimation and complexity scalability for high-efficiency video coding

    Science.gov (United States)

    Karwowski, Damian; Domański, Marek

    2016-01-01

    An improved context-based adaptive binary arithmetic coding (CABAC) is presented. The idea for the improvement is to use a more accurate mechanism for estimation of symbol probabilities in the standard CABAC algorithm. The authors' proposal of such a mechanism is based on the context-tree weighting technique. In the framework of a high-efficiency video coding (HEVC) video encoder, the improved CABAC allows 0.7% to 4.5% bitrate saving compared to the original CABAC algorithm. The application of the proposed algorithm marginally affects the complexity of HEVC video encoder, but the complexity of video decoder increases by 32% to 38%. In order to decrease the complexity of video decoding, a new tool has been proposed for the improved CABAC that enables scaling of the decoder complexity. Experiments show that this tool gives 5% to 7.5% reduction of the decoding time while still maintaining high efficiency in the data compression.
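    The core idea the abstract builds on, keeping a per-context adaptive estimate of the probability of the next binary symbol, can be sketched with a simple count-based (Krichevsky-Trofimov) estimator. This is a minimal illustration of context-adaptive probability estimation, not the HEVC CABAC state machine or the authors' context-tree weighting mechanism.

```python
class ContextModel:
    """Minimal context-adaptive binary probability estimator:
    one Krichevsky-Trofimov counter pair per context. A sketch of the
    general idea behind CABAC-style estimation, not the standard's
    actual finite-state estimator."""
    def __init__(self):
        self.counts = {}  # context -> (zeros, ones)

    def prob_one(self, ctx):
        n0, n1 = self.counts.get(ctx, (0, 0))
        return (n1 + 0.5) / (n0 + n1 + 1.0)  # KT estimate of P(bit = 1)

    def update(self, ctx, bit):
        n0, n1 = self.counts.get(ctx, (0, 0))
        self.counts[ctx] = (n0 + (bit == 0), n1 + (bit == 1))

model = ContextModel()
# Feed a biased bit stream under one context; the estimate adapts toward 3/4.
for bit in [1, 1, 0, 1, 1, 1, 0, 1]:
    model.update("ctx0", bit)
print(round(model.prob_one("ctx0"), 3))
```

    A more accurate estimator of this per-context probability is exactly where the reported 0.7% to 4.5% bitrate saving comes from: the arithmetic coder spends fewer bits when its probability model tracks the source more closely.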

  18. Probability of Face Contact for a High-Speed Pressurised Liquid Film Bearing Including a Slip Boundary Condition

    Directory of Open Access Journals (Sweden)

    Nicola Y. Bailey

    2015-06-01

    An initial deterministic mathematical model for the dynamic motion of a simple pressurised liquid film bearing is derived and utilised to evaluate the possibility of bearing contact for thin film operation. For a very thin film bearing the flow incorporates a Navier slip boundary condition as parametrised by a slip length that in general is subject to significant variability and is difficult to determine with precision. This work considers the formulation of a modified Reynolds equation for the pressurised liquid flow in a highly rotating coned bearing. Coupling of the axial motion of the stator is induced by prescribed axial oscillations of the rotor through the liquid film. The bearing gap is obtained from solving a nonlinear second-order non-autonomous ordinary differential equation, via a mapping solver. Variability in the value of the slip length parameter is addressed by considering it as a random variable with prescribed mean and standard deviation. The method of derived distributions is used to exactly quantify the impact of variability in the slip length with a parametric study investigating the effect of both the deterministic and distribution parameters on the probability of contact. Additionally, as the axial rotor oscillations also have a random aspect due to possible varying excitations of the system, the probability of contact is investigated for both random amplitude of the periodic rotor oscillations and random slip length, resulting in a two parameter random input problem. The probability of contact is examined to obtain exact solutions and evaluate a range of bearing configurations.

  19. Favourable ten-year overall survival in a Caucasian population with high probability of hereditary breast cancer

    Directory of Open Access Journals (Sweden)

    Pasini Giuseppe

    2010-03-01

    Abstract. Background: The purpose of our study was to compare differences in the prognosis of breast cancer (BC) patients at high (H) risk or intermediate slightly increased (IS) risk based on family history and those without a family history of BC, and to evaluate whether ten-year overall survival can be considered a good indicator of BRCA1 gene mutation. Methods: We classified 5923 breast cancer patients registered between 1988 and 2006 at the Department of Oncology and Haematology in Modena, Italy, into one of three different risk categories according to the Modena criteria. One thousand eleven patients at H and IS increased risk were tested for BRCA1/2 mutations. Overall survival (OS) and disease-free survival (DFS) were the study end-points. Results: Eighty BRCA1 carriers were identified. A statistically significantly better prognosis was observed for patients belonging to the H risk category with respect to women in the IS and sporadic groups (82% vs. 75% vs. 73%, respectively; p ...) BRCA1 carriers with BRCA-negative and sporadic BC (77% vs. 77% vs. 73%, respectively; p ...) Conclusions: Patients belonging to a population with a high probability of being BRCA1 carriers had a better prognosis than those with sporadic BC. Considering these results, women who previously had BC and had survived ten years could be selected for BRCA1 analysis among family members at high risk of hereditary BC during genetic counselling. Since only 30% of patients with a high probability of having hereditary BC have BRCA1 mutations, selecting women with a long term survival among this population could increase the rate of positive analyses, avoiding the use of expensive tests.

  20. The Lateral Trigger Probability function for the Ultra-High Energy Cosmic Ray Showers detected by the Pierre Auger Observatory

    CERN Document Server

    Abreu, P; Ahn, E J; Albuquerque, I F M; Allard, D; Allekotte, I; Allen, J; Allison, P; Castillo, J Alvarez; Alvarez-Muñiz, J; Ambrosio, M; Aminaei, A; Anchordoqui, L; Andringa, S; Antičić, T; Anzalone, A; Aramo, C; Arganda, E; Arqueros, F; Asorey, H; Assis, P; Aublin, J; Ave, M; Avenier, M; Avila, G; Bäcker, T; Balzer, M; Barber, K B; Barbosa, A F; Bardenet, R; Barroso, S L C; Baughman, B; Bäuml, J; Beatty, J J; Becker, B R; Becker, K H; Bellétoile, A; Bellido, J A; BenZvi, S; Berat, C; Bertou, X; Biermann, P L; Billoir, P; Blanco, F; Blanco, M; Bleve, C; Blümer, H; Boháčová, M; Boncioli, D; Bonifazi, C; Bonino, R; Borodai, N; Brack, J; Brogueira, P; Brown, W C; Bruijn, R; Buchholz, P; Bueno, A; Burton, R E; Caballero-Mora, K S; Caramete, L; Caruso, R; Castellina, A; Catalano, O; Cataldi, G; Cazon, L; Cester, R; Chauvin, J; Cheng, S H; Chiavassa, A; Chinellato, J A; Chou, A; Chudoba, J; Clay, R W; Coluccia, M R; Conceição, R; Contreras, F; Cook, H; Cooper, M J; Coppens, J; Cordier, A; Coutu, S; Covault, C E; Creusot, A; Criss, A; Cronin, J; Curutiu, A; Dagoret-Campagne, S; Dallier, R; Dasso, S; Daumiller, K; Dawson, B R; de Almeida, R M; De Domenico, M; De Donato, C; de Jong, S J; De La Vega, G; Junior, W J M de Mello; Neto, J R T de Mello; De Mitri, I; de Souza, V; de Vries, K D; Decerprit, G; del Peral, L; del Río, M; Deligny, O; Dembinski, H; Dhital, N; Di Giulio, C; Diaz, J C; Castro, M L Díaz; Diep, P N; Dobrigkeit, C; Docters, W; D'Olivo, J C; Dong, P N; Dorofeev, A; Anjos, J C dos; Dova, M T; D'Urso, D; Dutan, I; Ebr, J; Engel, R; Erdmann, M; Escobar, C O; Espadanal, J; Etchegoyen, A; Luis, P Facal San; Tapia, I Fajardo; Falcke, H; Farrar, G; Fauth, A C; Fazzini, N; Ferguson, A P; Ferrero, A; Fick, B; Filevich, A; Filipčič, A; Fliescher, S; Fracchiolla, C E; Fraenkel, E D; Fröhlich, U; Fuchs, B; Gaior, R; Gamarra, R F; Gambetta, S; García, B; Gámez, D García; Garcia-Pinto, D; Gascon, A; Gemmeke, H; Gesterling, K; Ghia, P L; Giaccari, U; Giller, M; Glass, 
H; Gold, M S; Golup, G; Albarracin, F Gomez; Berisso, M Gómez; Gonçalves, P; Gonzalez, D; Gonzalez, J G; Gookin, B; Góra, D; Gorgi, A; Gouffon, P; Gozzini, S R; Grashorn, E; Grebe, S; Griffith, N; Grigat, M; Grillo, A F; Guardincerri, Y; Guarino, F; Guedes, G P; Guzman, A; Hague, J D; Hansen, P; Harari, D; Harmsma, S; Harton, J L; Haungs, A; Hebbeker, T; Heck, D; Herve, A E; Hojvat, C; Hollon, N; Holmes, V C; Homola, P; Hörandel, J R; Horneffer, A; Hrabovský, M; Huege, T; Insolia, A; Ionita, F; Italiano, A; Jarne, C; Jiraskova, S; Josebachuili, M; Kadija, K; Kampert, K H; Karhan, P; Kasper, P; Kégl, B; Keilhauer, B; Keivani, A; Kelley, J L; Kemp, E; Kieckhafer, R M; Klages, H O; Kleifges, M; Kleinfeller, J; Knapp, J; Koang, D -H; Kotera, K; Krohm, N; Krömer, O; Kruppke-Hansen, D; Kuehn, F; Kuempel, D; Kulbartz, J K; Kunka, N; La Rosa, G; Lachaud, C; Lautridou, P; Leão, M S A B; Lebrun, D; Lebrun, P; de Oliveira, M A Leigui; Lemiere, A; Letessier-Selvon, A; Lhenry-Yvon, I; Link, K; López, R; Agüera, A Lopez; Louedec, K; Bahilo, J Lozano; Lu, L; Lucero, A; Ludwig, M; Lyberis, H; Maccarone, M C; Macolino, C; Maldera, S; Mandat, D; Mantsch, P; Mariazzi, A G; Marin, J; Marin, V; Maris, I C; Falcon, H R Marquez; Marsella, G; Martello, D; Martin, L; Martinez, H; Bravo, O Martínez; Mathes, H J; Matthews, J; Matthews, J A J; Matthiae, G; Maurizio, D; Mazur, P O; Medina-Tanco, G; Melissas, M; Melo, D; Menichetti, E; Menshikov, A; Mertsch, P; Meurer, C; Mićanović, S; Micheletti, M I; Miller, W; Miramonti, L; Molina-Bueno, L; Mollerach, S; Monasor, M; Ragaigne, D Monnier; Montanet, F; Morales, B; Morello, C; Moreno, E; Moreno, J C; Morris, C; Mostafá, M; Moura, C A; Mueller, S; Muller, M A; Müller, G; Münchmeyer, M; Mussa, R; ‡, G Navarra; Navarro, J L; Navas, S; Necesal, P; Nellen, L; Nelles, A; Neuser, J; Nhung, P T; Niemietz, L; Nierstenhoefer, N; Nitz, D; Nosek, D; Nožka, L; Nyklicek, M; Oehlschläger, J; Olinto, A; Oliva, P; Olmos-Gilbaja, V M; Ortiz, M; Pacheco, N; 
Selmi-Dei, D Pakk; Palatka, M; Pallotta, J; Palmieri, N; Parente, G; Parizot, E; Parra, A; Parsons, R D; Pastor, S; Paul, T; Pech, M; Pękala, J; Pelayo, R; Pepe, I M; Perrone, L; Pesce, R; Petermann, E; Petrera, S; Petrinca, P; Petrolini, A; Petrov, Y; Petrovic, J; Pfendner, C; Phan, N; Piegaia, R; Pierog, T; Pieroni, P; Pimenta, M; Pirronello, V; Platino, M; Ponce, V H; Pontz, M; Privitera, P; Prouza, M; Quel, E J; Querchfeld, S; Rautenberg, J; Ravel, O; Ravignani, D; Revenu, B; Ridky, J; Riggi, S; Risse, M; Ristori, P; Rivera, H; Rizi, V; Roberts, J; Robledo, C; de Carvalho, W Rodrigues; Rodriguez, G; Martino, J Rodriguez; Rojo, J Rodriguez; Rodriguez-Cabo, I; Rodríguez-Frías, M D; Ros, G; Rosado, J; Rossler, T; Roth, M; Rouillé-d'Orfeuil, B; Roulet, E; Rovero, A C; Rühle, C; Salamida, F; Salazar, H; Salina, G; Sánchez, F; Santo, C E; Santos, E; Santos, E M; Sarazin, F; Sarkar, B; Sarkar, S; Sato, R; Scharf, N; Scherini, V; Schieler, H; Schiffer, P; Schmidt, A; Schmidt, F; Scholten, O; Schoorlemmer, H; Schovancova, J; Schovánek, P; Schröder, F; Schulte, S; Schuster, D; Sciutto, S J; Scuderi, M; Segreto, A; Settimo, M; Shadkam, A; Shellard, R C; Sidelnik, I; Sigl, G; Lopez, H H Silva; Śmiałkowski, A; Šmída, R; Snow, G R; Sommers, P; Sorokin, J; Spinka, H; Squartini, R; Stanic, S; Stapleton, J; Stasielak, J; Stephan, M; Strazzeri, E; Stutz, A; Suarez, F; Suomijärvi, T; Supanitsky, A D; Šuša, T; Sutherland, M S; Swain, J; Szadkowski, Z; Szuba, M; Tamashiro, A; Tapia, A; Tartare, M; Taşcău, O; Ruiz, C G Tavera; Tcaciuc, R; Tegolo, D; Thao, N T; Thomas, D; Tiffenberg, J; Timmermans, C; Tiwari, D K; Tkaczyk, W; Peixoto, C J Todero; Tomé, B; Tonachini, A; Travnicek, P; Tridapalli, D B; Tristram, G; Trovato, E; Tueros, M; Ulrich, R; Unger, M; Urban, M; Galicia, J F Valdés; Valiño, I; Valore, L; Berg, A M van den; Varela, E; Cárdenas, B Vargas; Vázquez, J R; Vázquez, R A; Veberič, D; Verzi, V; Vicha, J; Videla, M; Villaseñor, L; Wahlberg, H; Wahrlich, P; Wainberg, O; 
Warner, D; Watson, A A; Weber, M; Weidenhaupt, K; Weindl, A; Westerhoff, S; Whelan, B J; Wieczorek, G; Wiencke, L; Wilczyńska, B; Wilczyński, H; Will, M; Williams, C; Winchen, T; Winnick, M G; Wommer, M; Wundheiler, B; Yamamoto, T; Yapici, T; Younk, P; Yuan, G; Yushkov, A; Zamorano, B; Zas, E; Zavrtanik, D; Zavrtanik, M; Zaw, I; Zepeda, A; Silva, M Zimbres; Ziolkowski, M

    2011-01-01

    In this paper we introduce the concept of Lateral Trigger Probability (LTP) function, i.e., the probability for an extensive air shower (EAS) to trigger an individual detector of a ground based array as a function of distance to the shower axis, taking into account energy, mass and direction of the primary cosmic ray. We apply this concept to the surface array of the Pierre Auger Observatory consisting of a 1.5 km spaced grid of about 1600 water Cherenkov stations. Using Monte Carlo simulations of ultra-high energy showers the LTP functions are derived for energies in the range between 10^{17} and 10^{19} eV and zenith angles up to 65 degs. A parametrization combining a step function with an exponential is found to reproduce them very well in the considered range of energies and zenith angles. The LTP functions can also be obtained from data using events simultaneously observed by the fluorescence and the surface detector of the Pierre Auger Observatory (hybrid events). We validate the Monte-Carlo results sho...
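    A parametrization "combining a step function with an exponential," as the abstract describes, can be sketched as follows; the break-point radius and decay scale used here are hypothetical placeholders, not the fitted Auger values.

```python
import math

def lateral_trigger_probability(r, r0=800.0, scale=250.0):
    """Sketch of an LTP parametrization combining a step function with
    an exponential: unit trigger probability inside radius r0 (meters),
    exponential fall-off beyond it. r0 and scale are hypothetical and
    would in practice depend on primary energy, mass and zenith angle."""
    return 1.0 if r < r0 else math.exp(-(r - r0) / scale)

# Close to the shower axis a station always triggers; far away,
# the trigger probability decays exponentially with distance.
print(lateral_trigger_probability(100.0), lateral_trigger_probability(1050.0))
```

    In the paper's setting the parameters of such a curve are fit per energy, mass and zenith-angle bin from the Monte Carlo simulations (and cross-checked with hybrid events).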

  1. Understanding risks in the light of uncertainty: low-probability, high-impact coastal events in cities

    Science.gov (United States)

    Abadie, Luis Maria; Galarraga, Ibon; Sainz de Murieta, Elisa

    2017-01-01

    A quantification of present and future mean annual losses due to extreme coastal events can be crucial for adequate decision making on adaptation to climate change in coastal areas around the globe. However, this approach is limited when uncertainty needs to be accounted for. In this paper, we assess coastal flood risk from sea-level rise and extreme events in 120 major cities around the world using an alternative stochastic approach that accounts for uncertainty. Probability distributions of future relative (local) sea-level rise have been used for each city, under three IPCC emission scenarios, RCP 2.6, 4.5 and 8.5. The approach allows a continuous stochastic function to be built to assess yearly evolution of damages from 2030 to 2100. Additionally, we present two risk measures that put low-probability, high-damage events in the spotlight: the Value at Risk (VaR) and the Expected Shortfall (ES), which enable the damages to be estimated when a certain risk level is exceeded. This level of acceptable risk can be defined involving different stakeholders to guide progressive adaptation strategies. The method presented here is new in the field of economics of adaptation and offers a much broader picture of the challenges related to dealing with climate impacts. Furthermore, it can be applied not only to assess adaptation needs but also to put adaptation into a timeframe in each city.
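The two tail-risk measures named in the abstract can be computed empirically from any sampled damage distribution. The sketch below uses a hypothetical lognormal damage distribution purely for illustration; the paper's city-specific distributions are not reproduced here.

```python
import random
import statistics

def var_es(damages, alpha=0.95):
    """Empirical Value at Risk and Expected Shortfall at confidence level alpha."""
    s = sorted(damages)
    idx = int(alpha * len(s))
    var = s[idx]                 # damage level exceeded with probability 1 - alpha
    es = statistics.mean(s[idx:])  # mean damage given that the VaR level is exceeded
    return var, es

random.seed(1)
# Hypothetical annual flood damages (heavy right tail), illustrative only
damages = [random.lognormvariate(2.0, 1.0) for _ in range(100_000)]
var95, es95 = var_es(damages, 0.95)
```

By construction ES is at least as large as VaR at the same level, since it averages only the losses beyond the VaR threshold; this is why ES is preferred for highlighting low-probability, high-damage events.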

  2. Probability of cell hits in selected organs and tissues by high-LET particles at the ISS orbit

    Science.gov (United States)

    Yasuda, H.; Komiyama, T.; Fujitaka, K.; Badhwar, G. D. (Principal Investigator)

    2002-01-01

    The fluence of high-LET particles (HLP) with LET (infinity, H2O) greater than 15 keV/micrometer in selected organs and tissues was measured with plastic nuclear track detectors using a life-size human phantom on the 9th Shuttle-Mir Mission (STS-91). The planar-track fluence of HLP during the 9.8-day mission ranged from 1.9 x 10^3 n cm^-2 (bladder) to 5.1 x 10^3 n cm^-2 (brain), a factor of 2.7. Based on these data, the probability of HLP hits to a mature cell of each organ or tissue was roughly estimated for a 90-day ISS mission. In the calculation, all cells were assumed to be spheres with a geometric cross-sectional area of 500 micrometers^2, and the cell-hit frequency from isotropic space radiation was assumed to follow a Poisson distribution. As a result, the probability of one or more hits to a single cell by HLP over 90 days ranged from 17% to 38%; that of two or more hits was estimated at 1.3-8.2%. © 2002 COSPAR. Published by Elsevier Science Ltd. All rights reserved.
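The Poisson hit model described in the abstract can be sketched directly: the expected number of hits per cell is fluence times cell cross-section, and P(>=k hits) follows from the Poisson distribution. The scaling below is a naive linear extrapolation of the quoted planar-track fluences from 9.8 to 90 days; the paper's published 17-38% range includes corrections beyond this sketch, so the numbers here are illustrative only.

```python
import math

def hit_probabilities(fluence_cm2, area_um2=500.0):
    """P(>=1 hit) and P(>=2 hits) to a cell under Poisson-distributed hits."""
    area_cm2 = area_um2 * 1e-8          # 1 um^2 = 1e-8 cm^2
    lam = fluence_cm2 * area_cm2        # expected hits per cell
    p0 = math.exp(-lam)                 # P(0 hits)
    p1 = lam * math.exp(-lam)           # P(exactly 1 hit)
    return 1.0 - p0, 1.0 - p0 - p1

# Naively scale the measured 9.8-day fluences to a 90-day stay
for organ, f_mission in [("bladder", 1.9e3), ("brain", 5.1e3)]:
    f90 = f_mission * 90.0 / 9.8
    p_ge1, p_ge2 = hit_probabilities(f90)
```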

  3. The Lateral Trigger Probability function for the ultra-high energy cosmic ray showers detected by the Pierre Auger Observatory

    Energy Technology Data Exchange (ETDEWEB)

    Abreu, P.; /Lisbon, IST /Lisbon, LIFEP; Aglietta, M.; /INFN, Turin /Turin Observ. /Turin U.; Ahn, E.J.; /Fermilab; Albuquerque, I.F.M.; /Sao Paulo U.; Allard, D.; /APC, Paris; Allekotte, I.; /Centro Atomico Bariloche /Balseiro Inst., San Carlos de Bariloche; Allen, J.; /New York U.; Allison, P.; /Ohio State U.; Alvarez Castillo, J.; /Mexico U.; Alvarez-Muniz, J.; /Santiago de Compostela U.; Ambrosio, M.; /INFN, Naples /Naples U. /Nijmegen U., IMAPP

    2011-01-01

    In this paper we introduce the concept of Lateral Trigger Probability (LTP) function, i.e., the probability for an Extensive Air Shower (EAS) to trigger an individual detector of a ground based array as a function of distance to the shower axis, taking into account energy, mass and direction of the primary cosmic ray. We apply this concept to the surface array of the Pierre Auger Observatory consisting of a 1.5 km spaced grid of about 1600 water Cherenkov stations. Using Monte Carlo simulations of ultra-high energy showers the LTP functions are derived for energies in the range between 10^{17} and 10^{19} eV and zenith angles up to 65°. A parametrization combining a step function with an exponential is found to reproduce them very well in the considered range of energies and zenith angles. The LTP functions can also be obtained from data using events simultaneously observed by the fluorescence and the surface detector of the Pierre Auger Observatory (hybrid events). We validate the Monte Carlo results showing how LTP functions from data are in good agreement with simulations.

  4. The Lateral Trigger Probability function for the Ultra-High Energy Cosmic Ray showers detected by the Pierre Auger Observatory

    Science.gov (United States)

    Pierre Auger Collaboration; Abreu, P.; Aglietta, M.; Ahn, E. J.; Albuquerque, I. F. M.; Allard, D.; Allekotte, I.; Allen, J.; Allison, P.; Alvarez Castillo, J.; Alvarez-Muñiz, J.; Ambrosio, M.; Aminaei, A.; Anchordoqui, L.; Andringa, S.; Antičić, T.; Anzalone, A.; Aramo, C.; Arganda, E.; Arqueros, F.; Asorey, H.; Assis, P.; Aublin, J.; Ave, M.; Avenier, M.; Avila, G.; Bäcker, T.; Balzer, M.; Barber, K. B.; Barbosa, A. F.; Bardenet, R.; Barroso, S. L. C.; Baughman, B.; Bäuml, J.; Beatty, J. J.; Becker, B. R.; Becker, K. H.; Bellétoile, A.; Bellido, J. A.; Benzvi, S.; Berat, C.; Bertou, X.; Biermann, P. L.; Billoir, P.; Blanco, F.; Blanco, M.; Bleve, C.; Blümer, H.; Boháčová, M.; Boncioli, D.; Bonifazi, C.; Bonino, R.; Borodai, N.; Brack, J.; Brogueira, P.; Brown, W. C.; Bruijn, R.; Buchholz, P.; Bueno, A.; Burton, R. E.; Caballero-Mora, K. S.; Caramete, L.; Caruso, R.; Castellina, A.; Catalano, O.; Cataldi, G.; Cazon, L.; Cester, R.; Chauvin, J.; Cheng, S. H.; Chiavassa, A.; Chinellato, J. A.; Chou, A.; Chudoba, J.; Clay, R. W.; Coluccia, M. R.; Conceição, R.; Contreras, F.; Cook, H.; Cooper, M. J.; Coppens, J.; Cordier, A.; Cotti, U.; Coutu, S.; Covault, C. E.; Creusot, A.; Criss, A.; Cronin, J.; Curutiu, A.; Dagoret-Campagne, S.; Dallier, R.; Dasso, S.; Daumiller, K.; Dawson, B. R.; de Almeida, R. M.; de Domenico, M.; de Donato, C.; de Jong, S. J.; de La Vega, G.; de Mello Junior, W. J. M.; de Mello Neto, J. R. T.; de Mitri, I.; de Souza, V.; de Vries, K. D.; Decerprit, G.; Del Peral, L.; Deligny, O.; Dembinski, H.; Dhital, N.; di Giulio, C.; Diaz, J. C.; Díaz Castro, M. L.; Diep, P. N.; Dobrigkeit, C.; Docters, W.; D'Olivo, J. C.; Dong, P. N.; Dorofeev, A.; Dos Anjos, J. C.; Dova, M. T.; D'Urso, D.; Dutan, I.; Ebr, J.; Engel, R.; Erdmann, M.; Escobar, C. O.; Etchegoyen, A.; Facal San Luis, P.; Fajardo Tapia, I.; Falcke, H.; Farrar, G.; Fauth, A. C.; Fazzini, N.; Ferguson, A. P.; Ferrero, A.; Fick, B.; Filevich, A.; Filipčič, A.; Fliescher, S.; Fracchiolla, C. 
E.; Fraenkel, E. D.; Fröhlich, U.; Fuchs, B.; Gaior, R.; Gamarra, R. F.; Gambetta, S.; García, B.; García Gámez, D.; Garcia-Pinto, D.; Gascon, A.; Gemmeke, H.; Gesterling, K.; Ghia, P. L.; Giaccari, U.; Giller, M.; Glass, H.; Gold, M. S.; Golup, G.; Gomez Albarracin, F.; Gómez Berisso, M.; Gonçalves, P.; Gonzalez, D.; Gonzalez, J. G.; Gookin, B.; Góra, D.; Gorgi, A.; Gouffon, P.; Gozzini, S. R.; Grashorn, E.; Grebe, S.; Griffith, N.; Grigat, M.; Grillo, A. F.; Guardincerri, Y.; Guarino, F.; Guedes, G. P.; Guzman, A.; Hague, J. D.; Hansen, P.; Harari, D.; Harmsma, S.; Harton, J. L.; Haungs, A.; Hebbeker, T.; Heck, D.; Herve, A. E.; Hojvat, C.; Hollon, N.; Holmes, V. C.; Homola, P.; Hörandel, J. R.; Horneffer, A.; Hrabovský, M.; Huege, T.; Insolia, A.; Ionita, F.; Italiano, A.; Jarne, C.; Jiraskova, S.; Kadija, K.; Kampert, K. H.; Karhan, P.; Kasper, P.; Kégl, B.; Keilhauer, B.; Keivani, A.; Kelley, J. L.; Kemp, E.; Kieckhafer, R. M.; Klages, H. O.; Kleifges, M.; Kleinfeller, J.; Knapp, J.; Koang, D.-H.; Kotera, K.; Krohm, N.; Krömer, O.; Kruppke-Hansen, D.; Kuehn, F.; Kuempel, D.; Kulbartz, J. K.; Kunka, N.; La Rosa, G.; Lachaud, C.; Lautridou, P.; Leão, M. S. A. B.; Lebrun, D.; Lebrun, P.; Leigui de Oliveira, M. A.; Lemiere, A.; Letessier-Selvon, A.; Lhenry-Yvon, I.; Link, K.; López, R.; Lopez Agüera, A.; Louedec, K.; Lozano Bahilo, J.; Lucero, A.; Ludwig, M.; Lyberis, H.; Maccarone, M. C.; Macolino, C.; Maldera, S.; Mandat, D.; Mantsch, P.; Mariazzi, A. G.; Marin, J.; Marin, V.; Maris, I. C.; Marquez Falcon, H. R.; Marsella, G.; Martello, D.; Martin, L.; Martinez, H.; Martínez Bravo, O.; Mathes, H. J.; Matthews, J.; Matthews, J. A. J.; Matthiae, G.; Maurizio, D.; Mazur, P. O.; Medina-Tanco, G.; Melissas, M.; Melo, D.; Menichetti, E.; Menshikov, A.; Mertsch, P.; Meurer, C.; Mićanović, S.; Micheletti, M. I.; Miller, W.; Miramonti, L.; Mollerach, S.; Monasor, M.; Monnier Ragaigne, D.; Montanet, F.; Morales, B.; Morello, C.; Moreno, E.; Moreno, J. 
C.; Morris, C.; Mostafá, M.; Moura, C. A.; Mueller, S.; Muller, M. A.; Müller, G.; Münchmeyer, M.; Mussa, R.; Navarra, G.; Navarro, J. L.; Navas, S.; Necesal, P.; Nellen, L.; Nelles, A.; Nhung, P. T.; Niemietz, L.; Nierstenhoefer, N.; Nitz, D.; Nosek, D.; Nožka, L.; Nyklicek, M.; Oehlschläger, J.; Olinto, A.; Oliva, P.; Olmos-Gilbaja, V. M.; Ortiz, M.; Pacheco, N.; Pakk Selmi-Dei, D.; Palatka, M.; Pallotta, J.; Palmieri, N.; Parente, G.; Parizot, E.; Parra, A.; Parsons, R. D.; Pastor, S.; Paul, T.; Pech, M.; Pȩkala, J.; Pelayo, R.; Pepe, I. M.; Perrone, L.; Pesce, R.; Petermann, E.; Petrera, S.; Petrinca, P.; Petrolini, A.; Petrov, Y.; Petrovic, J.; Pfendner, C.; Phan, N.; Piegaia, R.; Pierog, T.; Pieroni, P.; Pimenta, M.; Pirronello, V.; Platino, M.; Ponce, V. H.; Pontz, M.; Privitera, P.; Prouza, M.; Quel, E. J.; Querchfeld, S.; Rautenberg, J.; Ravel, O.; Ravignani, D.; Revenu, B.; Ridky, J.; Riggi, S.; Risse, M.; Ristori, P.; Rivera, H.; Rizi, V.; Roberts, J.; Robledo, C.; Rodrigues de Carvalho, W.; Rodriguez, G.; Rodriguez Martino, J.; Rodriguez Rojo, J.; Rodriguez-Cabo, I.; Rodríguez-Frías, M. D.; Ros, G.; Rosado, J.; Rossler, T.; Roth, M.; Rouillé-D'Orfeuil, B.; Roulet, E.; Rovero, A. C.; Rühle, C.; Salamida, F.; Salazar, H.; Salina, G.; Sánchez, F.; Santander, M.; Santo, C. E.; Santos, E.; Santos, E. M.; Sarazin, F.; Sarkar, B.; Sarkar, S.; Sato, R.; Scharf, N.; Scherini, V.; Schieler, H.; Schiffer, P.; Schmidt, A.; Schmidt, F.; Schmidt, T.; Scholten, O.; Schoorlemmer, H.; Schovancova, J.; Schovánek, P.; Schröder, F.; Schulte, S.; Schuster, D.; Sciutto, S. J.; Scuderi, M.; Segreto, A.; Settimo, M.; Shadkam, A.; Shellard, R. C.; Sidelnik, I.; Sigl, G.; Silva Lopez, H. H.; Śmiałkowski, A.; Šmída, R.; Snow, G. R.; Sommers, P.; Sorokin, J.; Spinka, H.; Squartini, R.; Stapleton, J.; Stasielak, J.; Stephan, M.; Strazzeri, E.; Stutz, A.; Suarez, F.; Suomijärvi, T.; Supanitsky, A. D.; Šuša, T.; Sutherland, M. 
S.; Swain, J.; Szadkowski, Z.; Szuba, M.; Tamashiro, A.; Tapia, A.; Tartare, M.; Taşcău, O.; Tavera Ruiz, C. G.; Tcaciuc, R.; Tegolo, D.; Thao, N. T.; Thomas, D.; Tiffenberg, J.; Timmermans, C.; Tiwari, D. K.; Tkaczyk, W.; Todero Peixoto, C. J.; Tomé, B.; Tonachini, A.; Travnicek, P.; Tridapalli, D. B.; Tristram, G.; Trovato, E.; Tueros, M.; Ulrich, R.; Unger, M.; Urban, M.; Valdés Galicia, J. F.; Valiño, I.; Valore, L.; van den Berg, A. M.; Varela, E.; Vargas Cárdenas, B.; Vázquez, J. R.; Vázquez, R. A.; Veberič, D.; Verzi, V.; Vicha, J.; Videla, M.; Villaseñor, L.; Wahlberg, H.; Wahrlich, P.; Wainberg, O.; Warner, D.; Watson, A. A.; Weber, M.; Weidenhaupt, K.; Weindl, A.; Westerhoff, S.; Whelan, B. J.; Wieczorek, G.; Wiencke, L.; Wilczyńska, B.; Wilczyński, H.; Will, M.; Williams, C.; Winchen, T.; Winders, L.; Winnick, M. G.; Wommer, M.; Wundheiler, B.; Yamamoto, T.; Yapici, T.; Younk, P.; Yuan, G.; Yushkov, A.; Zamorano, B.; Zas, E.; Zavrtanik, D.; Zavrtanik, M.; Zaw, I.; Zepeda, A.; Ziolkowski, M.

    2011-12-01

    In this paper we introduce the concept of Lateral Trigger Probability (LTP) function, i.e., the probability for an Extensive Air Shower (EAS) to trigger an individual detector of a ground based array as a function of distance to the shower axis, taking into account energy, mass and direction of the primary cosmic ray. We apply this concept to the surface array of the Pierre Auger Observatory consisting of a 1.5 km spaced grid of about 1600 water Cherenkov stations. Using Monte Carlo simulations of ultra-high energy showers the LTP functions are derived for energies in the range between 10^{17} and 10^{19} eV and zenith angles up to 65°. A parametrization combining a step function with an exponential is found to reproduce them very well in the considered range of energies and zenith angles. The LTP functions can also be obtained from data using events simultaneously observed by the fluorescence and the surface detector of the Pierre Auger Observatory (hybrid events). We validate the Monte Carlo results showing how LTP functions from data are in good agreement with simulations.
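A "step function combined with an exponential", as the abstract describes, can be sketched as follows. The functional form and the parameter values (plateau radius `r0`, decay length `decay`) are illustrative assumptions, not the fitted Auger parametrization.

```python
import math

def ltp(r_m, r0=800.0, decay=250.0):
    """Illustrative lateral trigger probability: unit plateau near the
    shower axis (step) joined to an exponential fall-off beyond r0.
    All distances in metres; parameters are hypothetical."""
    if r_m <= r0:
        return 1.0
    return math.exp(-(r_m - r0) / decay)

# Trigger probability drops from 1 at the plateau edge toward 0 far from the axis
probs = [ltp(r) for r in (200.0, 800.0, 1500.0, 3000.0)]
```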

  5. Midcourse Guidance Law Based on High Target Acquisition Probability Considering Angular Constraint and Line-of-Sight Angle Rate Control

    Directory of Open Access Journals (Sweden)

    Xiao Liu

    2016-01-01

    Random disturbance factors lead to variation of the target acquisition point during long-distance flight. To achieve a high target acquisition probability and improve impact precision, missiles should be guided to an appropriate target acquisition position with certain attitude angles and line-of-sight (LOS) angle rate. This paper presents a new midcourse guidance law that considers the influences of random disturbances, the detection distance constraint, and the target acquisition probability, evaluated with Monte Carlo simulation. Detailed analyses of the impact points on the ground and of the random distribution of the target acquisition position in 3D space are given to obtain the appropriate attitude angles and the end position for midcourse guidance. Then, a new biased proportional navigation (BPN) guidance law with an angular constraint and LOS angle rate control is derived to ensure tracking ability when attacking a maneuvering target. Numerical simulations demonstrate that, compared with the proportional navigation guidance (PNG) law and the near-optimal spatial midcourse guidance (NSMG) law, the BPN guidance law performs satisfactorily and meets both the midcourse terminal angular constraint and the LOS angle rate requirement.
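The core of any biased-PN law is the command a = N * Vc * lambda_dot + bias, where the bias term shapes the trajectory toward the angular constraint. The minimal planar engagement below (stationary target, zero bias, hypothetical geometry and gains) only illustrates that the PN part drives the LOS rate to zero and intercepts; it is not the paper's guidance law.

```python
import math

def bpn_accel(N, vc, los_rate, bias=0.0):
    """Biased proportional navigation command: a = N * Vc * lambda_dot + bias.
    With bias = 0 this reduces to classic PN; the bias schedule (not
    modeled here) is what enforces an impact-angle constraint."""
    return N * vc * los_rate + bias

# Minimal planar engagement: constant-speed missile, stationary target
mx, my, speed, heading = 0.0, 0.0, 300.0, 0.0        # m, m/s, rad
tx, ty = 10_000.0, 2_000.0                            # hypothetical target
dt, N = 0.01, 4.0
prev_los = math.atan2(ty - my, tx - mx)
miss_distance = math.hypot(tx - mx, ty - my)
for _ in range(10_000):
    miss_distance = math.hypot(tx - mx, ty - my)
    if miss_distance < 10.0:
        break
    los = math.atan2(ty - my, tx - mx)
    los_rate = (los - prev_los) / dt                  # finite-difference LOS rate
    prev_los = los
    vc = speed * math.cos(heading - los)              # closing speed
    heading += bpn_accel(N, vc, los_rate) / speed * dt  # lateral accel turns velocity
    mx += speed * math.cos(heading) * dt
    my += speed * math.sin(heading) * dt
```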

  6. Overtopping breaching of cohesive homogeneous earth dam with different cohesive strength

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    Considering the range of clay contents of Chinese earth dams, the world's largest prototype tests were carried out to study the effect of the cohesive strength of the fill of a cohesive homogeneous earth dam on breach formation. Three breach mechanisms were identified: source-tracing erosion of the dam body in the form of a "multilevel headcut", "two-helix flow" erosion of the dam crest, and collapse of the breach sidewalls due to instability. It can be concluded that the cohesive strength of the fill has a great effect on breach formation. When the cohesive strength is higher, the breach process is slower, and the peak outflow and the final width and depth of the breach are smaller; the breach then forms mainly by head cutting and dumping collapse. When the cohesive strength is lower, the breach process is faster, and the peak outflow and the final width and depth of the breach are larger; the breach then forms mainly by single-level head cutting and shearing collapse.

  7. Using extreme value theory approaches to forecast the probability of outbreak of highly pathogenic influenza in Zhejiang, China.

    Directory of Open Access Journals (Sweden)

    Jiangpeng Chen

    Influenza is a contagious disease of high transmissibility that spreads around the world with considerable morbidity and mortality, and it presents an enormous burden on worldwide public health. Few mathematical models can be used because influenza incidence data are generally not normally distributed. We developed a mathematical model using Extreme Value Theory (EVT) to forecast the probability of outbreak of highly pathogenic influenza. The incidence data of highly pathogenic influenza in Zhejiang province from April 2009 to November 2013 were retrieved from the website of the Health and Family Planning Commission of Zhejiang Province. The MATLAB "VIEM" toolbox was used for data analysis and modelling. In the present work, we used the Peak Over Threshold (POT) model, assuming the frequency to be a Poisson process and the intensity to be Pareto distributed, to characterize the temporal variability of the long-term extreme incidence of highly pathogenic influenza in Zhejiang, China. The skewness and kurtosis of the incidence of highly pathogenic influenza in Zhejiang between April 2009 and November 2013 were 4.49 and 21.12, indicating a "fat-tailed" distribution. A QQ plot and a mean excess plot were used to further validate the features of the distribution. After determining the threshold, we modeled the extremes and estimated the shape and scale parameters by the maximum likelihood method. The results showed that months in which the incidence of highly pathogenic influenza is about 4462/2286/1311/487 are predicted to occur once every five/three/two/one years, respectively. Despite its simplicity, the present study offers a sound modeling strategy and a methodological avenue to forecast an epidemic in the midst of its course.
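Once a POT model's shape and scale parameters are estimated, return levels like those quoted in the abstract follow from the standard generalized Pareto quantile formula. The parameter values below are hypothetical placeholders, not the fitted Zhejiang values.

```python
import math

def gpd_return_level(u, sigma, xi, zeta_u, m):
    """Level exceeded on average once every m observations under a
    Peak Over Threshold model: threshold u, GPD scale sigma and shape xi,
    and exceedance rate zeta_u = P(X > u)."""
    if abs(xi) < 1e-9:                              # exponential-tail limit
        return u + sigma * math.log(m * zeta_u)
    return u + (sigma / xi) * ((m * zeta_u) ** xi - 1.0)

# Hypothetical monthly incidence model: a heavy tail (xi > 0) makes the
# return level grow quickly with the return period
lvl_1yr = gpd_return_level(u=400.0, sigma=300.0, xi=0.4, zeta_u=0.3, m=12)
lvl_5yr = gpd_return_level(u=400.0, sigma=300.0, xi=0.4, zeta_u=0.3, m=60)
```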

  8. Psychological contract breaches, organizational commitment, and innovation-related behaviors: a latent growth modeling approach.

    Science.gov (United States)

    Ng, Thomas W H; Feldman, Daniel C; Lam, Simon S K

    2010-07-01

    This study examined the relationships among psychological contract breaches, organizational commitment, and innovation-related behaviors (generating, spreading, implementing innovative ideas at work) over a 6-month period. Results indicate that the effects of psychological contract breaches on employees are not static. Specifically, perceptions of psychological contract breaches strengthened over time and were associated with decreased levels of affective commitment over time. Further, increased perceptions of psychological contract breaches were associated with decreases in innovation-related behaviors. We also found evidence that organizational commitment mediates the relationship between psychological contract breaches and innovation-related behaviors. These results highlight the importance of examining the nomological network of psychological contract breaches from a change perspective.

  9. Security breaches: tips for assessing and limiting your risks.

    Science.gov (United States)

    Coons, Leeanne R

    2011-01-01

    As part of their compliance planning, medical practices should undergo a risk assessment to determine any vulnerability within the practice relative to security breaches. Practices should also implement safeguards to limit their risks. Such safeguards include facility access controls, information and electronic media management, use of business associate agreements, and education and enforcement. Implementation of specific policies and procedures to address security incidents is another critical step that medical practices should take as part of their security incident prevention plan. Medical practices should not only develop policies and procedures to prevent, detect, contain, and correct security violations, but should make sure that such policies and procedures are actually implemented in their everyday operations.

  10. RIM-binding protein links synaptic homeostasis to the stabilization and replenishment of high release probability vesicles.

    Science.gov (United States)

    Müller, Martin; Genç, Özgür; Davis, Graeme W

    2015-03-04

    Here we define activities of RIM-binding protein (RBP) that are essential for baseline neurotransmission and presynaptic homeostatic plasticity. At baseline, rbp mutants have a ∼10-fold decrease in the apparent Ca(2+) sensitivity of release that we attribute to (1) impaired presynaptic Ca(2+) influx, (2) looser coupling of vesicles to Ca(2+) influx, and (3) limited access to the readily releasable vesicle pool (RRP). During homeostatic plasticity, RBP is necessary for the potentiation of Ca(2+) influx and the expansion of the RRP. Remarkably, rbp mutants also reveal a rate-limiting stage required for the replenishment of high release probability (p) vesicles following vesicle depletion. This rate slows ∼4-fold at baseline and nearly 7-fold during homeostatic signaling in rbp. These effects are independent of altered Ca(2+) influx and RRP size. We propose that RBP stabilizes synaptic efficacy and homeostatic plasticity through coordinated control of presynaptic Ca(2+) influx and the dynamics of a high-p vesicle pool.

  11. The Lateral Trigger Probability function for the Ultra-High Energy Cosmic Ray showers detected by the Pierre Auger Observatory

    NARCIS (Netherlands)

    Abreu, P.; Aglietta, M.; Ahn, E. J.; Albuquerque, I. F. M.; Allard, D.; Allekotte, I.; Allen, J.; Allison, P.; Alvarez Castillo, J.; Alvarez-Muniz, J.; Ambrosio, M.; Aminaei, A.; Anchordoqui, L.; Andringa, S.; Anticic, T.; Anzalone, A.; Aramo, C.; Arganda, E.; Arqueros, F.; Asorey, H.; Assis, P.; Aublin, J.; Ave, M.; Avenier, M.; Avila, G.; Baecker, T.; Balzer, M.; Barber, K. B.; Barbosa, A. F.; Bardenet, R.; Barroso, S. L. C.; Baughman, B.; Baeuml, J.; Beatty, J. J.; Becker, B. R.; Becker, K. H.; Belletoile, A.; Bellido, J. A.; BenZvi, S.; Berat, C.; Bertou, X.; Biermann, P. L.; Billoir, P.; Blanco, F.; Blanco, M.; Bleve, C.; Bluemer, H.; Bohacova, M.; Boncioli, D.; Bonifazi, C.; Bonino, R.; Borodai, N.; Brack, J.; Brogueira, P.; Brown, W. C.; Bruijn, R.; Buchholz, P.; Bueno, A.; Burton, R. E.; Caballero-Mora, K. S.; Caramete, L.; Caruso, R.; Castellina, A.; Catalano, O.; Cataldi, G.; Cazon, L.; Cester, R.; Chauvin, J.; Cheng, S. H.; Chiavassa, A.; Chinellato, J. A.; Chou, A.; Chudoba, J.; Clay, R. W.; Coluccia, M. R.; Conceicao, R.; Contreras, F.; Cook, H.; Cooper, M. J.; Coppens, J.; Cordier, A.; Cotti, U.; Coutu, S.; Covault, C. E.; Creusot, A.; Criss, A.; Cronin, J.; Curutiu, A.; Dagoret-Campagne, S.; Dallier, R.; Dasso, S.; Daumiller, K.; Dawson, B. R.; de Almeida, R. M.; De Domenico, M.; De Donato, C.; de Jong, S. J.; De La Vega, G.; de Mello Junior, W. J. M.; de Mello Neto, J. R. T.; De Mitri, I.; de Souza, V.; de Vries, K. D.; Decerprit, G.; del Peral, L.; Deligny, O.; Dembinski, H.; Dhital, N.; Di Giulio, C.; Diaz, J. C.; Diaz Castro, M. L.; Diep, P. N.; Dobrigkeit, C.; Docters, W.; D'Olivo, J. C.; Dong, P. N.; Dorofeev, A.; dos Anjos, J. C.; Dova, M. T.; D'Urso, D.; Dutan, I.; Ebr, T. J.; Engel, R.; Erdmann, M.; Escobar, C. O.; Etchegoyen, A.; San Luis, P. Facal; Fajardo Tapia, I.; Falcke, H.; Farrar, G.; Fauth, A. C.; Fazzini, N.; Ferguson, A. P.; Ferrero, A.; Fick, B.; Filevich, A.; Filipcic, A.; Fliescher, S.; Fracchiolla, C. E.; Fraenkel, E. 
D.; Froehlich, U.; Fuchs, B.; Gaior, R.; Gamarra, R. F.; Gambetta, S.; Garcia, B.; Garcia Gamez, D.; Garcia-Pinto, D.; Gascon, A.; Gemmeke, H.; Gesterling, K.; Ghia, P. L.; Giaccari, U.; Giller, M.; Glass, H.; Cold, M. S.; Golup, G.; Gomez Albarracin, F.; Gomez Berisso, M.; Goncalves, P.; Gonzalez, D.; Gonzalez, J. G.; Gookin, B.; Gora, D.; Gorgi, A.; Gouffon, P.; Gozzini, S. R.; Grashorn, E.; Grebe, S.; Griffith, N.; Grigat, M.; Grillo, A. F.; Guardincerri, Y.; Guarino, F.; Guedes, G. P.; Guzman, A.; Hague, J. D.; Hansen, P.; Harari, D.; Harmsma, S.; Harton, J. L.; Haungs, A.; Hebbeker, T.; Heck, D.; Herve, A. E.; Hojvat, C.; Hollon, N.; Holmes, V. C.; Homola, P.; Hoerandel, J. R.; Horneffer, A.; Hrabovsky, M.; Huege, T.; Insolia, A.; Ionita, F.; Italiano, A.; Jarne, C.; Jiraskova, S.; Kadija, K.; Kampert, K. H.; Karhan, P.; Kasper, P.; Kegl, B.; Keilhauer, B.; Keivani, A.; Kelley, J. L.; Kemp, E.; Kieckhafer, R. M.; Klages, H. O.; Kleifges, M.; Kleinfeller, J.; Knapp, J.; Koang, D. -H.; Kotera, K.; Krohm, N.; Kroemer, O.; Kruppke-Hansen, D.; Kuehn, F.; Kuempel, D.; Kulbartz, J. K.; Kunka, N.; La Rosa, G.; Lachaud, C.; Lautridou, P.; Leao, M. S. A. B.; Lebrun, D.; Lebrun, P.; Leigui de Oliveira, M. A.; Lemiere, A.; Letessier-Selvon, A.; Lhenry-Yvon, I.; Link, K.; Lopez, R.; Lopez Agueera, A.; Louedec, K.; Lozano Bahilo, J.; Lucero, A.; Ludwig, M.; Lyberis, H.; Maccarone, M. C.; Macolino, C.; Maldera, S.; Mandat, D.; Mantsch, P.; Mariazzi, A. G.; Marini, J.; Marin, V.; Maris, I. C.; Marquez Falcon, H. R.; Marsella, G.; Martello, D.; Martinez, H.; Martinez Bravo, O.; Mathes, H. J.; Matthews, J.; Matthews, J. A. J.; Matthiae, G.; Maurizio, D.; Mazur, P. O.; Medina-Tanco, G.; Melissas, M.; Melo, D.; Menichetti, E.; Menshikov, A.; Mertsch, P.; Meurer, C.; Mitanovic, S.; Micheletti, M. I.; Miller, W.; Miramonti, L.; Mollerach, S.; Monasor, M.; Ragaigne, D. Monnier; Montanet, F.; Morales, B.; Morello, C.; Moreno, E.; Moreno, J. C.; Morris, C.; Mostafa, M.; Moura, C. 
A.; Mueller, S.; Muller, M. A.; Mueller, G.; Muenchmeyer, M.; Mussa, R.; Navarra, G.; Navarro, J. L.; Navas, S.; Necesal, P.; Nellen, L.; Nelles, A.; Nhung, P. T.; Niemietz, L.; Nierstenhoefer, N.; Nitz, D.; Nosek, D.; Nazka, L.; Nyklicek, M.; Oehischlaeger, J.; Olinto, A.; Oliva, P.; Olmos-Gilbaja, V. M.; Ortiz, M.; Pacheco, N.; Pakk Selmi-Dei, D.; Palatka, M.; Pallotta, J.; Palmieri, N.; Parente, G.; Parizot, E.; Parra, A.; Parsons, R. D.; Pastor, S.; Paul, T.; Pech, M.; Pekala, J.; Pelayo, R.; Pepe, I. M.; Perrone, L.; Pesce, R.; Petermann, E.; Petrera, S.; Petrinca, P.; Petrolini, A.; Petrov, Y.; Petrovic, J.; Pfendner, C.; Phan, N.; Piegaia, R.; Pierog, T.; Pieroni, P.; Pimenta, M.; Pirronello, V.; Platino, M.; Ponce, V. H.; Pontz, M.; Privitera, P.; Prouza, M.; Quel, E. J.; Querchfeld, S.; Rautenberg, J.; Ravel, O.; Ravignani, D.; Revenu, B.; Ridky, J.; Riggi, S.; Risse, M.; Ristori, P.; Rivera, H.; Rizi, V.; Roberts, J.; Robledo, C.; Rodrigues de Carvalho, W.; Rodriguez, G.; Rodriguez Martino, J.; Rodriguez Rojo, J.; Rodriguez-Cabo, I.; Rodriguez-Frias, M. D.; Ros, G.; Rosado, J.; Rossier, T.; Roth, M.; Rouille-d'Orfeuil, B.; Roulet, E.; Rovero, A. C.; Ruehle, C.; Salamida, F.; Salazar, H.; Salina, G.; Sanchez, F.; Santander, M.; Santo, C. E.; Santos, E.; Santos, E. M.; Sarazin, F.; Sarkar, B.; Sarkar, S.; Sato, R.; Scharf, N.; Scherini, V.; Schieler, H.; Schiffer, P.; Schmidt, A.; Schmidt, F.; Schmidt, T.; Scholten, O.; Schoorlemmer, H.; Schovancova, J.; Schovaneky, P.; Schroeder, F.; Schulte, S.; Schuster, D.; Scilltto, S. J.; Scuderi, M.; Segreto, A.; Settimo, M.; Shadkam, A.; Shellard, R. C.; Sidelnik, I.; Sigl, G.; Silva Lopez, H. H.; Smialkowski, A.; Smida, R.; Snow, G. R.; Sommers, P.; Sorokin, J.; Spinka, H.; Squartini, R.; Stapleton, J.; Stasielak, J.; Stephan, M.; Strazzeri, E.; Stutz, A.; Suarez, F.; Suomijarvi, T.; Supanitsky, A. D.; Susa, T.; Sutherland, M. 
S.; Swain, J.; Szadkowski, Z.; Szuba, M.; Tamashiro, A.; Tapia, A.; Tartare, M.; Tascau, O.; Tavera Ruiz, C. G.; Tcaciuc, R.; Tegolo, D.; Thao, N. T.; Thomas, D.; Tiffenberg, J.; Timmermans, C.; Tiwari, D. K.; Tkaczyk, W.; Todero Peixoto, C. J.; Tome, B.; Tonachini, A.; Travnicek, P.; Tridapalli, D. B.; Tristram, G.; Trovato, E.; Tueros, M.; Ulrich, R.; Unger, M.; Urban, M.; Valdes Galicia, J. F.; Valino, I.; Valore, L.; van den Berg, A. M.; Varela, E.; Vargas Cardenas, B.; Vazquez, J. R.; Vazquez, R. A.; Veberic, D.; Verzi, V.; Vicha, J.; Videla, M.; Villasenor, L.; Wahlberg, H.; Wahrlich, P.; Wainberg, O.; Warner, D.; Watson, A. A.; Weber, M.; Weidenhaupt, K.; Weindl, A.; Westerhoff, S.; Whelan, B. J.; Wieczorek, G.; Wiencke, L.; Wilczynska, B.; Wilczynski, H.; Will, M.; Williams, C.; Winchen, T.; Winders, L.; Winnick, M. G.; Wommer, M.; Wundheiler, B.; Yamamoto, T.; Yapici, T.; Younk, P.; Yuan, G.; Yushkov, A.; Zamorano, B.; Zas, E.; Zavrtanik, D.; Zavrtanik, M.; Zaw, I.; Zepeda, A.; Ziolkowski, M.; Martin, L.

    2011-01-01

    In this paper we introduce the concept of Lateral Trigger Probability (LTP) function, i.e., the probability for an Extensive Air Shower (EAS) to trigger an individual detector of a ground based array as a function of distance to the shower axis, taking into account energy, mass and direction of the primary cosmic ray.

  12. Barrier island breach evolution: Alongshore transport and bay-ocean pressure gradient interactions

    Science.gov (United States)

    Safak, Ilgar; Warner, John C.; List, Jeffrey H.

    2016-12-01

    Physical processes controlling repeated openings and closures of a barrier island breach between a bay and the open ocean are studied using aerial photographs and atmospheric and hydrodynamic observations. The breach site is located on Pea Island along the Outer Banks, separating Pamlico Sound from the Atlantic Ocean. Wind direction was a major control on the pressure gradients between the bay and the ocean to drive flows that initiate or maintain the breach opening. Alongshore sediment flux was found to be a major contributor to breach closure. During the analysis period from 2011 to 2016, three hurricanes had major impacts on the breach. First, Hurricane Irene opened the breach with wind-driven flow from bay to ocean in August 2011. Hurricane Sandy in October 2012 quadrupled the channel width from pressure gradient flows due to water levels that were first higher on the ocean side and then higher on the bay side. The breach closed sometime in Spring 2013, most likely due to an event associated with strong alongshore sediment flux but minimal ocean-bay pressure gradients. Then, in July 2014, Hurricane Arthur briefly opened the breach again from the bay side, in a similar fashion to Irene. In summary, opening and closure of breaches are shown to follow a dynamic and episodic balance between along-channel pressure gradient driven flows and alongshore sediment fluxes.

  13. SECURITY BREACH IN TRADING SYSTEM-COUNTERMEASURE USING IPTRACEBACK

    Directory of Open Access Journals (Sweden)

    M. P. Rajakumar

    2014-01-01

    Recently, the economic sector has frequently faced security breaches that have a heavy impact on the financial soundness of companies, particularly on the stock prices of firms; the utmost consequence is that the whole business comes to a standstill. From estimates in the financial sector, it has been inferred that the losses incurred from virus and worm attacks have the greatest impact on the prosperity of a business entity. Security strategies therefore revolve around the act of the security breach, aiming to eliminate the financial losses of a company entirely, or at least to minimize them. If the operating system of the stock market or financial sector is corrupted, the whole system must be reformatted and a new operating system reinstalled; antivirus software tends to be useless in such cases. In this study, viruses and worms are countered using the IPTraceback technique, and network security is ensured. The effective integration of spectral analysis, worm detection, and IPTraceback alerts the user dynamically and kills the source that distributes the virus. Real-time traffic data are used to evaluate the performance.

  14. The anterior insular cortex represents breaches of taste identity expectation.

    Science.gov (United States)

    Veldhuizen, Maria G; Douglas, Danielle; Aschenbrenner, Katja; Gitelman, Darren R; Small, Dana M

    2011-10-12

    Despite the importance of breaches of taste identity expectation for survival, its neural correlate is unknown. We used fMRI in 16 women to examine brain response to expected and unexpected receipt of sweet taste and tasteless/odorless solutions. During expected trials (70%), subjects heard "sweet" or "tasteless" and received the liquid indicated by the cue. During unexpected trials (30%), subjects heard sweet but received tasteless or they heard tasteless but received sweet. After delivery, subjects indicated stimulus identity by pressing a button. Reaction time was faster and more accurate after valid cuing, indicating that the cues altered expectancy as intended. Tasting unexpected versus expected stimuli resulted in greater deactivation in fusiform gyri, possibly reflecting greater suppression of visual object regions when orienting to, and identifying, an unexpected taste. Significantly greater activation to unexpected versus expected stimuli occurred in areas related to taste (thalamus, anterior insula), reward [ventral striatum (VS), orbitofrontal cortex], and attention [anterior cingulate cortex, inferior frontal gyrus, intraparietal sulcus (IPS)]. We also observed an interaction between stimulus and expectation in the anterior insula (primary taste cortex). Here response was greater for unexpected versus expected sweet compared with unexpected versus expected tasteless, indicating that this region is preferentially sensitive to breaches of taste expectation. Connectivity analyses confirmed that expectation enhanced network interactions, with IPS and VS influencing insular responses. We conclude that unexpected oral stimulation results in suppression of visual cortex and upregulation of sensory, attention, and reward regions to support orientation, identification, and learning about salient stimuli.

  15. The FERRUM project: Experimental lifetimes and transition probabilities from highly excited even 4d levels in Fe ii

    CERN Document Server

    Hartman, H; Engström, L; Lundberg, H

    2015-01-01

We report lifetime measurements of the 6 levels in the 3d6(5D)4d e6G term in Fe ii at an energy of 10.4 eV, and f-values for 14 transitions from the investigated levels. The lifetimes were measured using time-resolved laser-induced fluorescence on ions in a laser-produced plasma. The high excitation energy, and the fact that the levels have the same parity as the low-lying states directly populated in the plasma, necessitated the use of a two-photon excitation scheme. The probability for this process is greatly enhanced by the presence of the 3d6(5D)4p z6F levels at roughly half the energy difference. The f-values are obtained by combining the experimental lifetimes with branching fractions derived using relative intensities from a hollow cathode discharge lamp recorded with a Fourier transform spectrometer. The data are important for benchmarking atomic calculations of astrophysically important quantities and useful for spectroscopy of hot stars.
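
A measured upper-level lifetime and branching fractions combine into transition probabilities (A-values), and each A-value converts to an oscillator strength through the standard relation f_lu = 1.4992e-16 λ²(Å) (g_u/g_l) A_ul. A minimal sketch of that arithmetic in Python; the lifetime, branching fractions, wavelength, and statistical weights below are illustrative placeholders, not the paper's measured values:

```python
# Sketch: from a level lifetime and branching fractions to A-values
# and f-values. All numbers are hypothetical, for illustration only.

def a_values(lifetime_s, branching_fractions):
    """A_ul = BF_ul / tau for each decay channel of the upper level."""
    assert abs(sum(branching_fractions) - 1.0) < 1e-6
    return [bf / lifetime_s for bf in branching_fractions]

def f_value(a_ul, wavelength_angstrom, g_upper, g_lower):
    """Standard conversion: f_lu = 1.4992e-16 * lambda^2 * (g_u/g_l) * A_ul."""
    return 1.4992e-16 * wavelength_angstrom**2 * (g_upper / g_lower) * a_ul

# Hypothetical example: a 3 ns lifetime split over two decay channels.
tau = 3.0e-9
bfs = [0.7, 0.3]
As = a_values(tau, bfs)                       # A-values in s^-1
f = f_value(As[0], 5000.0, g_upper=8, g_lower=6)
```

The same two steps, applied per transition, are how experimental f-values are usually assembled from lifetime plus branching-fraction measurements.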

  16. The FERRUM project: Experimental lifetimes and transition probabilities from highly excited even 4d levels in Fe ii

    Science.gov (United States)

    Hartman, H.; Nilsson, H.; Engström, L.; Lundberg, H.

    2015-12-01

We report lifetime measurements of the 6 levels in the 3d6(5D)4d e6G term in Fe ii at an energy of 10.4 eV, and f-values for 14 transitions from the investigated levels. The lifetimes were measured using time-resolved laser-induced fluorescence on ions in a laser-produced plasma. The high excitation energy, and the fact that the levels have the same parity as the low-lying states directly populated in the plasma, necessitated the use of a two-photon excitation scheme. The probability for this process is greatly enhanced by the presence of the 3d6(5D)4p z6F levels at roughly half the energy difference. The f-values are obtained by combining the experimental lifetimes with branching fractions derived using relative intensities from a hollow cathode discharge lamp recorded with a Fourier transform spectrometer. The data are important for benchmarking atomic calculations of astrophysically important quantities and useful for spectroscopy of hot stars.

  17. 36 new, high-probability, damped Lyα absorbers at redshift 0.42 < z < 0.70

    Science.gov (United States)

    Turnshek, David A.; Monier, Eric M.; Rao, Sandhya M.; Hamilton, Timothy S.; Sardane, Gendith M.; Held, Ryan

    2015-05-01

Quasar damped Lyα (DLA) absorption-line systems at low redshift fall in the UV and are rarely found in blind UV spectroscopic surveys. Therefore, it has been difficult to compile a moderate-sized sample of UV DLAs in any narrow cosmic time interval. However, DLAs are easy to identify in low-resolution spectra because they have large absorption rest equivalent widths. We have performed an efficient strong-Mg II-selected survey for UV DLAs at redshifts z = [0.42, 0.70] using Hubble Space Telescope's low-resolution ACS-HRC-PR200L prism. This redshift interval covers ~1.8 Gyr in cosmic time, i.e. t ≈ [7.2, 9.0] Gyr after the big bang. A total of 96 strong Mg II absorption-line systems identified in Sloan Digital Sky Survey spectra were successfully observed with the prism at the predicted UV wavelengths of Lyα absorption. We found that 35 of the 96 systems had a significant probability of being DLAs. One additional observed system could be a very high N_{H I} DLA (N_{H I} ~ 2×10^{22} atoms cm^{-2} or possibly higher), but since very high N_{H I} systems are extremely rare, it would be unusual for this system to be a DLA given the size of our sample. Here we present information on our prism sample, including our best estimates of N_{H I} and errors for the 36 systems fitted with DLA profiles. This list is valuable for future follow-up studies of low-redshift DLAs in a small redshift interval, although such work would clearly benefit from improved UV spectroscopy to more accurately determine their neutral hydrogen column densities.

  18. Percentage of probability of nonpoint-source nitrate contamination of recently recharged ground water in the High Plains aquifer

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This raster data set represents the percentage of probability of nonpoint-source nitrate contamination (greater than the proposed background concentration of 4...

  19. An Evaluation of a High-Probability Instructional Sequence to Increase Acceptance of Food and Decrease Inappropriate Behavior in Children with Pediatric Feeding Disorders

    Science.gov (United States)

    Patel, Meeta R.; Reed, Gregory K.; Piazza, Cathleen C.; Bachmeyer, Melainie H.; Layer, Stacy A.; Pabico, Ryan S.

    2006-01-01

    We evaluated the effects of escape extinction with and without a high-probability (high-p) instructional sequence on food acceptance and inappropriate behavior for children diagnosed with feeding problems. The high-p sequence consisted of three presentations of a response that was similar topographically (i.e., presentations of an empty nuk[R],…

  1. Modelling dune erosion, overwash and breaching at Fire Island (NY) during hurricane Sandy

    NARCIS (Netherlands)

    De Vet, P.L.M.; McCall, R.T.; Den Bieman, J.P.; Stive, M.J.F.; Van Ormondt, M.

    2015-01-01

In 2012, Hurricane Sandy caused a breach at Fire Island (NY, USA), near Pelican Island. This paper aims at modelling dune erosion, overwash and breaching processes that occurred during the hurricane event at this stretch of coast with the numerical model XBeach. By using the default settings, the ero

  2. A longitudinal study of age-related differences in reactions to psychological contract breach

    NARCIS (Netherlands)

    Bal, Matthijs; Lange, Annet de; Jansen, Paul; Velde, Mandy van der

    2013-01-01

    The current paper investigated age‐related differences in the relations of psychological contract breach with work outcomes over time. Based on affective events theory, we expected job satisfaction to mediate the longitudinal relationship of contract breach with changes in job performance. Moreover,

  3. 48 CFR 52.233-4 - Applicable Law for Breach of Contract Claim.

    Science.gov (United States)

    2010-10-01

    ... Provisions and Clauses 52.233-4 Applicable Law for Breach of Contract Claim. As prescribed in 33.215(b), insert the following clause: Applicable Law for Breach of Contract Claim (OCT 2004) United States law... 48 Federal Acquisition Regulations System 2 2010-10-01 2010-10-01 false Applicable Law for...

  4. Psychological Contract Breach and Job Attitudes: A Meta-Analysis of Age as a Moderator

    Science.gov (United States)

    Bal, P. Matthijs; De Lange, Annet H.; Jansen, Paul G. W.; Van Der Velde, Mandy E. G.

    2008-01-01

    The aim of this study was to examine the influence of age in the relation between psychological contract breach and the development of job attitudes. Based on affective events, social exchange, and lifespan theory, we hypothesized that (1) psychological contract breach would be related negatively to job attitudes, and (2) that age would moderate…

  5. Psychological contract breach and job attitudes: A meta-analysis of age as a moderator

    NARCIS (Netherlands)

    Bal, P.M.; Lange, A.H. de; Jansen, P.G.W.; Velde, M.E.G. van der

    2008-01-01

    The aim of this study was to examine the influence of age in the relation between psychological contract breach and the development of job attitudes. Based on affective events, social exchange, and lifespan theory, we hypothesized that (1) psychological contract breach would be related negatively to

  6. Breaches of health information: are electronic records different from paper records?

    Science.gov (United States)

    Sade, Robert M

    2010-01-01

    Breaches of electronic medical records constitute a type of healthcare error, but should be considered separately from other types of errors because the national focus on the security of electronic data justifies special treatment of medical information breaches. Guidelines for protecting electronic medical records should be applied equally to paper medical records.

  7. 5 CFR 2634.702 - Breaches by trust fiduciaries and interested parties.

    Science.gov (United States)

    2010-01-01

    ... 5 Administrative Personnel 3 2010-01-01 2010-01-01 false Breaches by trust fiduciaries and interested parties. 2634.702 Section 2634.702 Administrative Personnel OFFICE OF GOVERNMENT ETHICS GOVERNMENT... § 2634.702 Breaches by trust fiduciaries and interested parties. (a) The Attorney General may bring...

  8. An Examination of the Explicit Costs of Sensitive Information Security Breaches

    Science.gov (United States)

    Toe, Cleophas Adeodat

    2013-01-01

    Data security breaches are categorized as loss of information that is entrusted in an organization by its customers, partners, shareholders, and stakeholders. Data breaches are significant risk factors for companies that store, process, and transmit sensitive personal information. Sensitive information is defined as confidential or proprietary…

  9. Inversion Method for Early Detection of ARES-1 Case Breach Failure

    Science.gov (United States)

    Mackey, Ryan M.; Kulikov, Igor K.; Bajwa, Anupa; Berg, Peter; Smelyanskiy, Vadim

    2010-01-01

A document describes research into the problem of detecting case breach formation at an early stage of a rocket flight. An inversion algorithm for case breach allocation is proposed and analyzed. It is shown how the case breach can be located at an early stage of its development by using the rocket sensor data and the output data from the control block of the rocket navigation system. The results are simulated with MATLAB/Simulink software. The efficiency of an inversion algorithm for case breach location is discussed. The research was devoted to the analysis of the ARES-1 flight during the first 120 seconds after launch and early prediction of case breach failure. During this time, the rocket is propelled by its first-stage Solid Rocket Booster (SRB). If a breach appears in the SRB case, the gases escaping through it will produce a (side) thrust directed perpendicular to the rocket axis. The side thrust creates a torque influencing the rocket attitude. The ARES-1 control system will compensate for the side thrust until it reaches some critical value, after which the flight will be uncontrollable. The objective of this work was to obtain the start time of case breach development and its location using the rocket inertial navigation sensors and GNC data. The algorithm was effective for the detection and location of a breach in an SRB field joint at an early stage of its development.
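
The side-thrust and torque reasoning above reduces to simple arithmetic: gas escaping a hole of area A at chamber pressure p produces a roughly p·A-scale force, which acts about the vehicle center of gravity through a lever arm. A hedged sketch, not the paper's algorithm; the pressure, hole size, lever arm, thrust coefficient, and control-authority limit are all invented for illustration:

```python
import math

# Illustrative estimate of side thrust and torque from a hypothetical
# case breach, checked against an assumed control-authority limit.

def side_thrust(chamber_pressure_pa, hole_diameter_m, thrust_coeff=1.0):
    """Side thrust ~ Cf * p * A for gas escaping through a circular hole."""
    area = math.pi * (hole_diameter_m / 2.0) ** 2
    return thrust_coeff * chamber_pressure_pa * area

def breach_torque(thrust_n, lever_arm_m):
    """Torque about the vehicle CG from a thrust perpendicular to the axis."""
    return thrust_n * lever_arm_m

# Hypothetical numbers: 6 MPa chamber pressure, 5 cm hole, 10 m lever arm.
F = side_thrust(6.0e6, 0.05)          # ~1.2e4 N
T = breach_torque(F, 10.0)            # ~1.2e5 N*m
controllable = T < 2.0e5              # assumed TVC authority limit in N*m
```

As the breach erodes and the hole grows, F and T grow with hole area, which is why early detection, before T exceeds the control authority, matters.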

  10. Pro-active data breach detection: examining accuracy and applicability on personal information detected

    CSIR Research Space (South Africa)

    Botha, J

    2016-03-01

breaches but does not provide a clear indication of the level of personal information available on the internet since only reported incidents are taken into account. The possibility of pro-active automated breach detection has previously been discussed as a...

  13. Advances in one-dimensional numerical breach modeling of sand barriers

    NARCIS (Netherlands)

    Tuan, T.Q.; Verhagen, H.J.; Visser, P.J.

    2006-01-01

    A hydrodynamic numerical model is formulated to describe the breach erosion process of sandy barriers. The breach flow is based on the system of unsteady shallow water equations, which is solved using a robust upwind numerical approach in conjunction with the Finite Volume Method (FVM). The hydrauli

  14. Estimating the probability of identity in a random dog population using 15 highly polymorphic canine STR markers.

    Science.gov (United States)

    Eichmann, Cordula; Berger, Burkhard; Steinlechner, Martin; Parson, Walther

    2005-06-30

Dog DNA-profiling is becoming an important supplementary technology for the investigation of accident and crime, as dogs are intensely integrated in human social life. We investigated 15 highly polymorphic canine STR markers and two sex-related markers of 131 randomly selected dogs from the area around Innsbruck, Tyrol, Austria, which were co-amplified in three PCR multiplex reactions (ZUBECA6, FH2132, FH2087Ua, ZUBECA4, WILMSTF, PEZ15, PEZ6, FH2611, FH2087Ub, FH2054, PEZ12, PEZ2, FH2010, FH2079 and VWF.X). Linkage testing for our set of markers suggested no evidence for linkage between the loci. Heterozygosity (HET), polymorphism information content (PIC) and the probability of identity (P((ID)theoretical), P((ID)unbiased), P((ID)sib)) were calculated for each marker. The HET((exp))-values of the 15 markers lie between 0.6 (VWF.X) and 0.9 (ZUBECA6), P((ID)sib)-values were found to range between 0.49 (VWF.X) and 0.28 (ZUBECA6). Moreover, the P((ID)sib) was computed for sets of loci by sequentially adding single loci to estimate the information content and the usefulness of the selected marker sets for the identification of dogs. The estimated P((ID)sib) value of all 15 markers amounted to 8.5 x 10(-8). The presented estimations turned out to be a helpful approach for a reasonable choice of markers for the individualisation of dogs.
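
The "sequentially adding single loci" step rests on the fact that, for unlinked loci, the multi-locus probability of identity is the product of the per-locus values. A short sketch of that combination; the per-locus P(ID)sib values below are hypothetical, chosen only to span the 0.28-0.49 range the record reports:

```python
from functools import reduce

# Sketch: combining per-locus probabilities of identity across
# independent (unlinked) loci. Values are illustrative, not the paper's.

def combined_p_id(per_locus_p_id):
    """Independent loci: overall P(ID) is the product of per-locus values."""
    return reduce(lambda acc, p: acc * p, per_locus_p_id, 1.0)

def cumulative_p_id(per_locus_p_id):
    """P(ID) after adding each locus in turn, most informative first."""
    out, acc = [], 1.0
    for p in sorted(per_locus_p_id):   # smaller P(ID) = more informative
        acc *= p
        out.append(acc)
    return out

p_sib = [0.28, 0.35, 0.40, 0.45, 0.49]  # hypothetical per-locus P(ID)sib
overall = combined_p_id(p_sib)
```

Each added locus multiplies the match probability down, which is why 15 loci with per-locus values near 0.3-0.5 can reach an overall P(ID)sib of order 10^-8.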

  15. Artificial intelligence for predicting recurrence-free probability of non-invasive high-grade urothelial bladder cell carcinoma.

    Science.gov (United States)

    Cai, Tommaso; Conti, Gloria; Nesi, Gabriella; Lorenzini, Matteo; Mondaini, Nicola; Bartoletti, Riccardo

    2007-10-01

The objective of our study was to define a neural network for predicting recurrence and progression-free probability in patients affected by recurrent pTaG3 urothelial bladder cancer to use in everyday clinical practice. Among all patients who had undergone transurethral resection for bladder tumors, 143 were finally selected and enrolled. Four follow-ups for recurrence, progression or survival were performed at 6, 9, 12 and 108 months. The data were analyzed by using the commercially available software program NeuralWorks Predict. These data were compared with univariate and multivariate analysis results. The use of Artificial Neural Networks (ANN) in recurrent pTaG3 patients showed a sensitivity of 81.67% and specificity of 95.87% in predicting recurrence-free status after transurethral resection of bladder tumor at 12 months follow-up. Statistical and ANN analyses allowed selection of the number of lesions (multiple, HR=3.31, p=0.008) and the previous recurrence rate (≥2/year, HR=3.14, p=0.003) as the most influential variables affecting the output decision in predicting the natural history of recurrent pTaG3 urothelial bladder cancer. ANN applications also included selection of the previous adjuvant therapy. We demonstrated the feasibility and reliability of ANN applications in everyday clinical practice, reporting good recurrence-prediction performance. The study identified a single subgroup of pTaG3 patients with multiple lesions, a ≥2/year recurrence rate and without any response to previous Bacille Calmette-Guérin adjuvant therapy, that seem to be at high risk of recurrence.
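
The sensitivity and specificity figures quoted for the network are simple ratios over a confusion matrix. A minimal sketch of that arithmetic; the confusion counts below are hypothetical, chosen only so the ratios land near the reported 81.67% and 95.87%:

```python
# Sketch: sensitivity/specificity from confusion-matrix counts.
# Counts are invented for illustration, not the study's data.

def sensitivity(tp, fn):
    """True-positive rate: TP / (TP + FN)."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True-negative rate: TN / (TN + FP)."""
    return tn / (tn + fp)

# Hypothetical confusion counts for a recurrence vs. recurrence-free split.
tp, fn, tn, fp = 49, 11, 93, 4
sens = sensitivity(tp, fn)   # 49/60
spec = specificity(tn, fp)   # 93/97
```

High specificity with lower sensitivity, as here, means the model rarely flags a patient who stays recurrence-free but misses a larger share of true recurrences.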

  16. Disappearance of breach rhythm heralding recurrent tumor progression in a patient with astrocytoma.

    Science.gov (United States)

    Kampf, Christina; Grossmann, Annette; Benecke, Reiner; Rösche, Johannes

    2013-07-01

    The breach rhythm is sometimes considered the consequence of reduced resistance between the cortex and the scalp electrode in the region of a skull defect. On the other hand, the electroencephalographic (EEG) changes after craniotomy were attributed to an activation of EEG activity by meningocortical adhesions with admixed gliosis. We report changes of the breach rhythm in a patient with astrocytoma, which give further evidence that the breach rhythm is not merely the result of physical changes in the area of a skull defect. In our patient, the breach rhythm was no longer detectable before a new tumor progression took place, showed up again, and at the end changed into localized slowing before the deterioration of the patient's general medical condition. This case suggests that in patients with brain tumors, the loss or attenuation in frequency of an established breach rhythm might be considered as an indication of a new tumor progression.

  17. A model for release of fission products from a breached fuel plate under wet storage

    Energy Technology Data Exchange (ETDEWEB)

    Terremoto, L.A.A.; Seerban, R.S.; Zeituni, C.A.; Silva, J.E.R. da; Silva, A.T. e; Castanheira, M.; Lucki, G.; Damy, M. de A.; Teodoro, C.A. [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)]. E-mail: laaterre@ipen.br

    2007-07-01

    MTR fuel elements burned-up inside the core of nuclear research reactors are stored worldwide mainly under the water of storage pools. When cladding breach is present in one or more fuel plates of such elements, radioactive fission products are released into the storage pool water. This work proposes a model to describe the release mechanism considering the diffusion of nuclides of a radioactive fission product either through a postulated small cylindrical breach or directly from a large circular hole in the cladding. In each case, an analytical expression is obtained for the activity released into the water as a function of the total storage time of a breached fuel plate. Regarding sipping tests already performed at the IEA-R1 research reactor on breached MTR fuel elements, the proposed model correlates successfully the specific activity of {sup 137}Cs, measured as a function of time, with the evaluated size of the cladding breach. (author)

  18. On Breaching Enterprise Data Privacy Through Adversarial Information Fusion

    CERN Document Server

    Ganta, Srivatsava Ranjit

    2008-01-01

    Data privacy is one of the key challenges faced by enterprises today. Anonymization techniques address this problem by sanitizing sensitive data such that individual privacy is preserved while allowing enterprises to maintain and share sensitive data. However, existing work on this problem make inherent assumptions about the data that are impractical in day-to-day enterprise data management scenarios. Further, application of existing anonymization schemes on enterprise data could lead to adversarial attacks in which an intruder could use information fusion techniques to inflict a privacy breach. In this paper, we shed light on the shortcomings of current anonymization schemes in the context of enterprise data. We define and experimentally demonstrate Web-based Information- Fusion Attack on anonymized enterprise data. We formulate the problem of Fusion Resilient Enterprise Data Anonymization and propose a prototype solution to address this problem.
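
One concrete anonymity property that sanitization schemes of the kind discussed above target is k-anonymity: every combination of quasi-identifier values must occur at least k times, so no record can be singled out by those attributes alone. A minimal checker as a sketch; the records, column names, and generalized values below are made up for illustration:

```python
from collections import Counter

# Sketch: k-anonymity check over quasi-identifier columns.
# Table contents are invented for illustration.

def is_k_anonymous(rows, quasi_identifiers, k):
    """True if every quasi-identifier combination occurs at least k times."""
    counts = Counter(tuple(row[q] for q in quasi_identifiers) for row in rows)
    return all(c >= k for c in counts.values())

records = [
    {"zip": "130**", "age": "20-29", "diagnosis": "flu"},
    {"zip": "130**", "age": "20-29", "diagnosis": "cold"},
    {"zip": "148**", "age": "30-39", "diagnosis": "flu"},
]
ok2 = is_k_anonymous(records, ["zip", "age"], 2)  # the lone 148** row fails
```

The fusion attacks the paper describes work precisely because a property like this is checked against one release in isolation: joining the release with external web data can shrink an apparently large group back to an individual.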

  19. Investigation of breached depleted UF{sub 6} cylinders

    Energy Technology Data Exchange (ETDEWEB)

    Barber, E.J.; Butler, T.R.; DeVan, J.H.; Googin, J.M.; Taylor, M.S.; Dyer, R.H.; Russell, J.R.

    1991-09-01

    In June 1990, during a three-site inspection of cylinders being used for long-term storage of solid depleted UF{sub 6}, two 14-ton steel cylinders at Portsmouth, Ohio, were discovered with holes in the barrel section of the cylinders. Both holes, concealed by UF{sub 4} reaction products identical in color to the cylinder coating, were similarly located near the front stiffening ring. The UF{sub 4} appeared to have self-sealed the holes, thus containing nearly all of the uranium contents. Martin Marietta Energy Systems, Inc., Vice President K.W. Sommerfeld immediately formed an investigation team to: (1) identify the most likely cause of failure for the two breached cylinders, (2) determine the impact of these incidents on the three-site inventory, and (3) provide recommendations and preventive measures. This document discusses the results of this investigation.

  1. MINIMIZING GLOVEBOX GLOVE BREACHES, PART IV: CONTROL CHARTS

    Energy Technology Data Exchange (ETDEWEB)

    COURNOYER, MICHAEL E. [Los Alamos National Laboratory; LEE, MICHELLE B. [Los Alamos National Laboratory; SCHREIBER, STEPHEN B. [Los Alamos National Laboratory

    2007-02-05

At the Los Alamos National Laboratory (LANL) Plutonium Facility, plutonium isotopes and other actinides are handled in a glovebox environment. The spread of radiological contamination, and excursions of contaminants into the worker's breathing zone, are minimized and/or prevented through the use of glovebox technology. Within the glovebox configuration, the glovebox gloves are the most vulnerable part of this engineering control. Recognizing this vulnerability, the Glovebox Glove Integrity Program (GGIP) was developed to minimize and/or prevent unplanned openings in the glovebox environment, i.e., glove failures and breaches. In addition, LANL implemented the Lean Six Sigma (LSS) program, which incorporates the practices of Lean Manufacturing and Six Sigma technologies and tools to effectively improve administrative and engineering controls and work processes. One tool used in LSS is the control chart, which is an effective way to characterize data collected from unplanned openings in the glovebox environment. The benefit management receives from using this tool is two-fold. First, control charts signal the absence or presence of systematic variations that result in process instability, in relation to glovebox glove breaches and failures. Second, these graphical representations of process variation determine whether an improved process is under control. Further, control charts are used to identify statistically significant variations (trends) that can be used in decision making to improve processes. This paper discusses performance indicators assessed by the use of control charts, provides examples of control charts, and shows how managers use the results to make decisions. This effort contributes to the LANL Continuous Improvement Program by improving the efficiency, cost effectiveness, and formality of glovebox operations.
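
The control-chart idea described above can be sketched in a few lines: for an individuals (XmR) chart, sigma is estimated from the average moving range (MR-bar/1.128), and any point beyond mean ± 3 sigma signals special-cause variation worth investigating. The monthly breach counts below are invented, and this is a generic XmR chart, not LANL's specific indicator set:

```python
import statistics

# Sketch: individuals (XmR) control chart over monthly glove-breach
# counts. Counts are hypothetical, for illustration only.

def xmr_limits(samples):
    """Control limits using the average moving range (MR-bar / 1.128)."""
    mean = statistics.fmean(samples)
    mr_bar = statistics.fmean(abs(b - a) for a, b in zip(samples, samples[1:]))
    sigma = mr_bar / 1.128  # d2 constant for moving ranges of size 2
    return mean - 3 * sigma, mean, mean + 3 * sigma

def out_of_control(samples):
    """Indices of points beyond the 3-sigma limits."""
    lcl, _, ucl = xmr_limits(samples)
    return [i for i, x in enumerate(samples) if x < lcl or x > ucl]

breaches_per_month = [2, 3, 1, 2, 4, 2, 3, 15, 2, 1]  # hypothetical counts
flagged = out_of_control(breaches_per_month)
```

A flagged month tells management the variation is not routine noise, which is exactly the decision signal the paper attributes to these charts.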

  2. Identification of consistency in rating curve data: Bidirectional Reach (BReach)

    Science.gov (United States)

    Van Eerdenbrugh, Katrien; Van Hoey, Stijn; Verhoest, Niko E. C.

    2016-04-01

Before calculating rating curve discharges, it is crucial to identify possible interruptions in data consistency. In this research, a methodology to perform this preliminary analysis is developed and validated. This methodology, called Bidirectional Reach (BReach), evaluates, in each data point, the results of a rating curve model with randomly sampled parameter sets. The combination of a parameter set and a data point is classified as non-acceptable if the deviation between the accompanying model result and the measurement exceeds observational uncertainty. Moreover, a tolerance degree that defines satisfactory behavior of a sequence of model results is chosen. This tolerance degree equals the percentage of observations that are allowed to have non-acceptable model results. Subsequently, the results of the classification are used to assess the maximum left and right reach for each data point of a chronologically sorted time series. The maximum left and right reach of a gauging point represent the data points, in the direction of the previous and the following observations respectively, beyond which none of the sampled parameter sets is both satisfactory and results in an acceptable deviation. This analysis is repeated for a variety of tolerance degrees. Plotting the results of this analysis for all data points and all tolerance degrees in a combined BReach plot enables the detection of changes in data consistency. Moreover, if consistent periods are detected, the limits of these periods can be derived. The methodology is validated with various synthetic stage-discharge data sets and proves to be a robust technique to investigate the temporal consistency of rating curve data. It provides satisfying results despite low data availability, large errors in the estimated observational uncertainty, and a rating curve model that is known to cover only a limited part of the observations.
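
A condensed sketch of the reach computation as I read the abstract (not the authors' code): a parameter set is acceptable at a point if the model-measurement deviation is within observational uncertainty, a sequence is satisfactory if at most a tolerance fraction of its points are non-acceptable, and the maximum right reach of point i is the furthest later index reachable by at least one sampled parameter set. All data below are invented:

```python
# Sketch of the BReach idea: acceptability classification plus
# maximum right reach. Observations and model results are made up.

def acceptable(model, obs, uncertainty):
    """A parameter set is acceptable at a point if |model - obs| <= uncertainty."""
    return abs(model - obs) <= uncertainty

def max_right_reach(ok_matrix, i, tolerance):
    """ok_matrix[s][t]: parameter set s acceptable at time t. Returns the
    furthest index j >= i reachable with <= tolerance fraction of failures."""
    best = i
    for ok in ok_matrix:                      # each sampled parameter set
        fails = 0
        for j in range(i, len(ok)):
            fails += 0 if ok[j] else 1
            if fails / (j - i + 1) > tolerance:
                break
            best = max(best, j)
    return best

# Hypothetical stage-discharge residual check for two parameter sets.
obs = [1.0, 1.1, 2.0, 1.2]
models = [[1.0, 1.05, 1.5, 1.15], [0.9, 1.2, 2.1, 1.3]]
ok_matrix = [[acceptable(m, o, 0.15) for m, o in zip(mod, obs)] for mod in models]
reach = max_right_reach(ok_matrix, 0, 0.0)
```

The maximum left reach is the mirror image toward earlier observations; a sharp drop in both reaches at some date is the signature of a consistency break.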

  3. CGC/saturation approach for soft interactions at high energy: survival probability of the central exclusive production

    CERN Document Server

    Gotsman, E; Maor, U

    2015-01-01

We estimate the value of the survival probability for central exclusive production in a model based on the CGC/saturation approach. Hard and soft processes are described in the same framework. At LHC energies, we obtain a small value for the survival probability ($\leq 1\%$). The source of the small value is the impact parameter dependence of the hard amplitude. Our model has successfully described a large body of soft data: elastic, inelastic and diffractive cross sections, inclusive production and rapidity correlations, as well as the $t$-dependence of deep inelastic diffractive production of vector mesons.

  4. Computational Complexities and Breaches in Authentication Frameworks of Broadband Wireless Access

    CERN Document Server

    Hashmi, Raheel Maqsood; Jabeen, Memoona; Alimgeer, Khurram S; Khan, Shahid A

    2009-01-01

Secure access of communication networks has become an increasingly important area of consideration for the communication service providers of the present day. Broadband Wireless Access (BWA) networks are proving to be an efficient and cost-effective solution for the provisioning of high-rate wireless traffic links in static and mobile domains. The secure access of these networks is necessary to ensure their superior operation and revenue efficacy. Although the authentication process is key to secure access in BWA networks, the breaches present in it limit the networks' performance. In this paper, the vulnerabilities in the authentication frameworks of BWA networks have been unveiled. Moreover, this paper also describes the limitations of these protocols, and of the solutions proposed for them, due to the computational complexities and overheads involved. The possible attacks on the privacy and performance of BWA networks have been discussed and explained in detail.

  5. Analytical solution of a multidimensional Langevin equation at high friction limits and probability passing over a two-dimensional saddle

    Institute of Scientific and Technical Information of China (English)

    XING Yong-Zhong

    2009-01-01

The analytical solution of a multidimensional Langevin equation at the overdamping limit is obtained and the probability of particles passing over a two-dimensional saddle point is discussed. These results may open a path for further study of fusion in the synthesis of superheavy elements.
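
The overdamping (high-friction) limit invoked above amounts to dropping the inertial term of the Langevin equation. A one-dimensional sketch of that standard reduction (not the paper's multidimensional derivation), with $\gamma$ the friction coefficient, $V$ the potential, and $\xi$ the fluctuating force:

```latex
m\,\ddot{x} = -\gamma\,\dot{x} - \frac{\partial V}{\partial x} + \xi(t)
\;\xrightarrow{\;m\ddot{x}\,\ll\,\gamma\dot{x}\;}\;
\gamma\,\dot{x} = -\frac{\partial V}{\partial x} + \xi(t),
\qquad
\langle \xi(t)\,\xi(t') \rangle = 2\gamma k_{B} T\,\delta(t-t')
```

The multidimensional case replaces $\gamma$ by a friction tensor and $V$ by a potential surface whose two-dimensional saddle governs the passing probability.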

  6. Breach of autoreactive B cell tolerance by post-translationally modified proteins.

    Science.gov (United States)

    Dekkers, Jacqueline S; Verheul, Marije K; Stoop, Jeroen N; Liu, Bisheng; Ioan-Facsinay, Andreea; van Veelen, Peter A; de Ru, Arnoud H; Janssen, George M C; Hegen, Martin; Rapecki, Steve; Huizinga, Tom W J; Trouw, Leendert A; Toes, René E M

    2017-08-01

Over 50% of patients with rheumatoid arthritis (RA) harbour a variety of anti-modified protein antibodies (AMPA) against different post-translationally modified (PTM) proteins, including anti-carbamylated protein (anti-CarP) antibodies. At present, it is unknown how AMPA are generated and how autoreactive B cell responses against PTM proteins are induced. Here we studied whether PTM foreign antigens can breach B cell tolerance towards PTM self-proteins. Serum reactivity towards five carbamylated proteins was determined for 160 patients with RA and 40 healthy individuals. Antibody cross-reactivity was studied by inhibition experiments. Mass spectrometry was performed to identify carbamylated self-proteins in human rheumatic joint tissue. Mice were immunised with carbamylated or non-modified (auto)antigens and analysed for autoantibody responses. We show that anti-CarP antibodies in RA are highly cross-reactive towards multiple carbamylated proteins, including modified self-proteins and modified non-self-proteins. Studies in mice show that anti-CarP antibody responses recognising carbamylated self-proteins are induced by immunisation with carbamylated self-proteins and by immunisation with carbamylated proteins of non-self-origin. Similar to the data observed with sera from patients with RA, the murine anti-CarP antibody response was, both at the monoclonal level and the polyclonal level, highly cross-reactive towards multiple carbamylated proteins, including carbamylated self-proteins. Self-reactive AMPA responses can be induced by exposure to foreign proteins containing PTM. These data show how autoreactive B cell responses against PTM self-proteins can be induced by exposure to PTM foreign proteins and provide new insights on the breach of autoreactive B cell tolerance.

  7. 25 CFR 224.87 - What are the obligations of a tribe if it discovers a violation or breach?

    Science.gov (United States)

    2010-04-01

    ... violation or breach? 224.87 Section 224.87 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR... DETERMINATION ACT Implementation of Tribal Energy Resource Agreements Violation Or Breach § 224.87 What are the obligations of a tribe if it discovers a violation or breach? As soon as practicable after discovering...

  8. 29 CFR 37.102 - What happens if a grant applicant or recipient breaches a Conciliation Agreement?

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 1 2010-07-01 2010-07-01 true What happens if a grant applicant or recipient breaches a... Procedures § 37.102 What happens if a grant applicant or recipient breaches a Conciliation Agreement? When it... Notification of Breach of Conciliation Agreement....

  9. Levee Breach Experiment by Overflow at the Full Scale Experimental Channel

    Science.gov (United States)

    Shimada, T.; Yokoyama, H.

    2010-12-01

    The increased occurrence of storm disasters caused by typhoons and local downpours in recent years has raised concerns over the possibility of large-scale floods resulting from river overflow. Levee breaches cause particularly severe damage, and in Japan more than 80% of such failures in the past have been attributed to overflow. Studies on levee breach by overflow have been conducted from various viewpoints using diverse methods; however, the three-dimensional mechanism of levee breach by overflow has not been clarified in past studies. Elucidating this mechanism is very important for disaster prevention as well as for the future progress of studies on levee breach by overflow. Levees (crown width: 3 m; height: 3 m; length: 80 m) were built in the Chiyoda Experimental Channel (a full-scale experimental channel in Hokkaido, Japan; width 30 m, length 1,300 m, maximum discharge 170 t/s) in 2010, and a three-dimensional experiment on levee breach by overflow was conducted. The findings of the experiment are as follows: after the beginning of overflow, levee breach widening did not begin until most of the levee section had collapsed. It was also found that, even if overflow occurred, extremely serious damage (e.g., a sudden increase in levee breach width and overflow discharge) was unlikely unless the majority of the levee section collapsed.

  10. [Anthropozoonoses and other inter-specific infections: breaches in the "species barrier"].

    Science.gov (United States)

    Parodi, A L

    2008-06-01

    Infectious outbreaks in human populations frequently occurred during the last decades. Most of these epidemics were emerging or re-emerging diseases and about 75% of them were caused by animal pathogens mainly from wild animal species. Thus the "species barrier" dogma seems to be frequently in the wrong. Many factors are probably involved in this failure. Among them, genetic mutations mainly in viruses which are more and more frequently discovered. On the other hand, anthropic factors are the major causes of this species barrier breaches. Closer contacts with domestic and/or wild animals, dramatic growth of human populations, poverty, numerous immunodepressed people, international travels and trading are facilitating or determining factors of the inter-specific transmission of pathogens. International mobilization is mandatory to efficiently anticipate and control the emergence of such zoonoses. According to their unpredictable emergence and their rapid potential worldwide spreading, their control requires an international cooperation both in a permanent epidemiological surveillance and a rapid efficient riposte. Human and veterinary medical institutions have to closely cooperate in this issue.

  11. A study of the energy dependence of the mean, truncated mean, and most probable energy deposition of high-energy muons in sampling calorimeters

    Energy Technology Data Exchange (ETDEWEB)

    Auchincloss, P.S.; De Barbaro, P.; Bodek, A.; Budd, H.; Pillai, M.; Qun, F.; Sakumoto, W.K.; Merritt, F.S.; Oreglia, M.J.; Schumm, B.; Bolton, T.; Arroyo, C.; Bachmann, K.T.; Bazarko, A.O.; Blair, R.E.; Foudas, C.; King, B.J.; Lefmann, W.C.; Leung, W.C.; Mishra, S.R.; Oltman, E.; Quintas, P.Z.; Rabinowitz, S.A.; Sciulli, F.; Seligman, W.G.; Shaevitz, M.H.; Bernstein, R.H.; Borcherding, F.; Fisk, H.E.; Lamm, M.; Marsh, W.; Merritt, K.W.B.; Schellman, H.; Yovanovitch, D.; Kinnel, T.S.; Sandler, P.; Smith, W.H. (Dept. of Physics and Astronomy, Univ. of Rochester, NY (United States) Dept. of Physics, Univ. of Chicago, IL (United States) Dept. of Physics, Columbia Univ. New York, NY (United States) Fermilab, Batavia, IL (United States) Dept. of Physics, Univ. of Wisconsin, Madison, WI (United States))

    1994-04-11

    We have extracted the momentum dependence of the mean, the truncated mean and the most probable value of the energy deposited in a segmented, iron-scintillator, hadron calorimeter by high-energy muons. Data were drawn from a sample of momentum-analyzed, high-energy muons produced in charged-current neutrino interactions. The truncated mean energy deposition of high-energy muons traversing 20 calorimeter segments increases by approximately 16% per 100 GeV/c increase in muon momentum over the range 25-125 GeV/c; the most probable energy deposition increases by approximately 7%. These results are important for experiments at high-energy colliders (e.g., Tevatron, SSC and LHC) which use the dE/dx of high-energy muons to calibrate the response of electromagnetic and hadron calorimeters with tower geometry. The data are in qualitative agreement with GEANT3 (v3.15/308a) simulations. (orig.)

  12. Estimating the per-contact probability of infection by highly pathogenic avian influenza (H7N7) virus during the 2003 epidemic in the Netherlands.

    NARCIS (Netherlands)

    Ssematimba, A.; Elbers, A.R.W.; Hagenaars, T.H.J.; Jong, de M.C.M.

    2012-01-01

    Estimates of the per-contact probability of transmission between farms of Highly Pathogenic Avian Influenza virus of H7N7 subtype during the 2003 epidemic in the Netherlands are important for the design of better control and biosecurity strategies. We used standardized data collected during the epidemic…
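The abstract above concerns estimating a per-contact transmission probability from outbreak contact data. As an illustrative sketch only (the counts below are hypothetical and not from the study), the standard binomial estimate with a Wald interval can be computed as:

```python
import math

def per_contact_estimate(infections, contacts):
    """Binomial MLE of a per-contact transmission probability,
    with a 95% Wald confidence interval (illustrative only)."""
    p_hat = infections / contacts
    se = math.sqrt(p_hat * (1 - p_hat) / contacts)
    return p_hat, (max(0.0, p_hat - 1.96 * se), min(1.0, p_hat + 1.96 * se))

# Hypothetical counts: 3 transmissions observed in 20 dangerous contacts.
p, (lo, hi) = per_contact_estimate(3, 20)
```

Real analyses of this kind typically use more elaborate likelihoods that account for exposure time and contact type, but the binomial estimate above captures the basic idea.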

  13. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming paradigm that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  14. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, including...

  15. A guide to California's breaches. First year of state reporting requirement reveals common privacy violations.

    Science.gov (United States)

    Dimick, Chris

    2010-04-01

    Effective January 1, 2009, California healthcare providers were required to report every breach of patient information to the state. They have sent a flood of mishaps and a steady stream of malicious acts.

  16. BREACHING THE SEXUAL BOUNDARIES IN THE DOCTOR–PATIENT RELATIONSHIP: SHOULD ENGLISH LAW RECOGNISE FIDUCIARY DUTIES?

    Science.gov (United States)

    Ost, Suzanne

    2016-01-01

    In this article, I argue that sexual exploitation in the doctor–patient relationship would be dealt with more appropriately by the law in England and Wales on the basis of a breach of fiduciary duty. Three different types of sexual boundary breaches are discussed, and the particular focus is on breaches where the patient's consent is obtained through inducement. I contend that current avenues of redress do not clearly catch this behaviour and, moreover, they fail to capture the essence of the wrong committed by the doctor—the knowing breach of trust for self-gain—and the calculated way in which consent is induced. Finally, I demonstrate that the fiduciary approach is compatible with the contemporary pro-patient autonomy model of the doctor–patient relationship. PMID:26846652

  18. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2), and that i...

  19. Direct and Indirect Effects of Psychological Contract Breach on Academicians’ Turnover Intention in Turkey

    OpenAIRE

    Ozan BUYUKYILMAZ; Cakmak, Ahmet F.

    2013-01-01

    This study aims to investigate the assumed direct and indirect relationships between psychological contract breach and turnover intention through psychological contract violation and perceived organizational support. Data for the sample was collected from 570 academicians from a variety of universities in Turkey. Hierarchical regression analyses were conducted to test the hypotheses. The results show that psychological contract breach was positively related to turnover intention and psycholog...

  20. The older the better?: age-related differences in emotion regulation after psychological contract breach

    OpenAIRE

    Bal, P. Matthijs; Smit, Priscilla

    2012-01-01

    Purpose: The aim of this paper was to investigate the role of emotion regulation and age in reactions of positive and negative affect to psychological contract breach. We expected that, in the context of contract breach, reappraisal emotion regulation mitigates the negative relation with affect. Moreover, based on lifespan theory, suppression emotion regulation was expected to be important for younger workers, because older workers have learned how to express themselves appropriately at th...

  1. Legal Effect of Breach of Warranty in Construction Insurance in Malaysia

    Directory of Open Access Journals (Sweden)

    Arazi Idrus

    2011-12-01

    Full Text Available This study analyses the legal effect of breach of warranty in construction insurance contracts in Malaysia in light of current developments in English insurance law. The required data and information were collected from Malaysian and English court decisions dealing with breach of warranties in English marine insurance law, from the online Malayan Law Journal published on the LexisNexis online database, and from published textbooks on insurance warranties. This study should help to offer judicial guidance to courts in Peninsular Malaysia on how to resolve the legal dilemma associated with breach of warranty in Malaysian insurance law. It was found that breach of a continuing warranty results in the contract of insurance remaining in existence, with the risk treated as having incepted at the outset but automatically coming to an end as of the date of the breach. Moreover, the insurer is discharged from any future liability, although any liabilities of the insurer before the date of the breach are unaffected.

  2. Epithelial cell extrusion leads to breaches in the intestinal epithelium.

    Science.gov (United States)

    Liu, Julia J; Davis, Elisabeth M; Wine, Eytan; Lou, Yuefei; Rudzinski, Jan K; Alipour, Misagh; Boulanger, Pierre; Thiesen, Aducio L; Sergi, Consolato; Fedorak, Richard N; Muruve, Daniel; Madsen, Karen L; Irvin, Randall T

    2013-04-01

    Two distinct forms of intestinal epithelial cell (IEC) extrusion have been described: one with preserved epithelial integrity and one that introduces breaches in the epithelial lining. In this study, we sought to determine the mechanism underlying the IEC extrusion that alters the permeability of the gut epithelium. IEC extrusions in polarized T84 monolayers were induced with nigericin. Epithelial permeability was assessed with transepithelial electrical resistance and the movement of latex microspheres and green fluorescent protein-transfected Escherichia coli across the monolayer. In vivo, IEC extrusion was modulated in wild-type and colitic (interleukin-10 knock-out) mouse models with caspase-1 activation and inhibition. Luminal aspirates and mucosal biopsies from control patients and patients with inflammatory bowel disease were analyzed for caspase-1 and caspase-3&7 activation. Caspase-1-induced IEC extrusion in T84 monolayers resulted in dose-dependent and time-dependent barrier dysfunction, reversible with caspase-1 inhibition. Moreover, movement of microspheres and microbes across the treated epithelial monolayers was observed. Increased caspase-1-mediated IEC extrusion in interleukin-10 knock-out mice corresponded to enhanced permeation of dextran and microspheres and translocation of E. coli compared with wild type. Caspase-1 inhibition in interleukin-10 knock-out mice resulted in a time-dependent reduction in cell extrusion and normalization of permeability to microspheres. Increased IEC extrusion in wild-type mice was induced with caspase-1 activation. In human luminal aspirates, the ratio of positively stained caspase-1 to caspase-3&7 cells was 1:1 in control patients and 2:1 in patients with inflammatory bowel disease; these observations were confirmed by cytochemical analysis of mucosal biopsies. IEC extrusion mediated by caspase-1 activation contributes to altered intestinal permeability in vitro and in vivo.

  3. Filament breaches during air-gap spinning

    Institute of Scientific and Technical Information of China (English)

    B.Wirth; M.Wamecke; B.Sclimenk; G.Seide; T.Gries; 湛烂瑜

    2011-01-01

    Solution spinning processes are used if polymers decompose below their melting range. In contrast to the melt spinning process, solvents like dimethylformamide (DMF) lead to rising costs; the main factors are purchase, handling, disposal, and greater effort for employee protection. The air-gap spinning method belongs to the solution spinning processes and allows a higher production speed, which partly compensates for the disadvantages named before. The air gap offers many advantages: process gases can be changed, and the different possible fluid flows give the process high variability. Because the method combines dry- and wet-spinning, filament breaches caused by both processes occur, and this combination lowers the stability of the entire process. Current research at the Institut für Textiltechnik of RWTH Aachen University (ITA) concentrates on spinning experiments for this process. It was possible to identify typical characteristics of filament breaches, which are based on different mechanisms. These kinds of breaches are described by mechanisms mostly known from the literature, and strategies are identified that help to avoid them.

  4. Experimental study on buoyancy-driven exchange flows through breaches of a tokamak vacuum vessel in a fusion reactor under the loss-of-vacuum-event conditions

    Energy Technology Data Exchange (ETDEWEB)

    Takase, Kazuyuki; Tomoaki, Kunugi; Ogawa, Masurou; Seki, Yasushi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan)

    1997-02-01

    As part of thermofluid safety studies for the International Thermonuclear Experimental Reactor, buoyancy-driven exchange flow behavior through breaches of a vacuum vessel (VV) has been investigated quantitatively by using a preliminary loss-of-vacuum-event (LOVA) apparatus that simulated the tokamak VV of a fusion reactor with a small-scale model. To carry out the present experiments under atmospheric pressure, helium gas and air were used as the working fluids. The inside of the VV was initially filled with helium gas and the outside was atmosphere. Breaches of the VV under LOVA conditions were simulated by opening six simulated breaches set at different positions on the VV. When the buoyancy-driven exchange flow through a breach occurred, helium gas flowed out of the VV through the breach and air flowed into the VV from the outside. The exchange rate between helium gas and air in the VV was calculated from the measured change in the weight of the VV over time after the start of the experiment. Experimental parameters were breach position, breach number, breach length, breach size, and breach combination. The present study clarifies that the relation between the exchange rate and the breach position of the VV depended on the magnitude of the potential energy from the ground level to the breach position, and that the exchange rate decreased as the breach length increased and as the breach size decreased.

  5. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.
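The definition above can be illustrated with a toy example (not from the paper): take model M to be a normal distribution for a quantity that evidence E says cannot be negative; the leakage is then the probability mass M assigns to the impossible region below zero.

```python
import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    """Normal CDF computed via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# Model M: y ~ Normal(mu=2, sigma=1).  Evidence E: y < 0 is impossible
# (e.g. y is a count or a physical magnitude).  The probability leakage
# is the mass M places on y < 0.
leakage = normal_cdf(0.0, mu=2.0, sigma=1.0)  # about 0.023
```

A regression model with normal errors for a strictly positive response exhibits exactly this kind of leakage, which matches the paper's remark that regression models often evince it.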

  6. Is CT angiography of the pulmonary arteries indicated in patients with high clinical probability of pulmonary embolism?

    Science.gov (United States)

    Martínez Montesinos, L; Plasencia Martínez, J M; García Santos, J M

    2017-06-30

    When a diagnostic test confirms clinical suspicion, the indicated treatment can be administered. A problem arises when the diagnostic test does not confirm the initially suspected diagnosis; when the suspicion is grounded in clinically validated predictive rules and is high, the problem is even worse. This situation arises in up to 40% of patients with high suspicion for acute pulmonary embolism, raising the question of whether CT angiography of the pulmonary arteries should be done systematically. This paper reviews the literature about this issue and lays out the best evidence about the relevant recommendations for patients with high clinical suspicion of acute pulmonary embolism and negative findings on CT angiography. It also explains the probabilistic concepts derived from Bayes' theorem that can be useful for ascertaining the most appropriate approach in these patients. Copyright © 2017 SERAM. Publicado por Elsevier España, S.L.U. All rights reserved.
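The Bayesian reasoning this review discusses can be sketched numerically. Assuming illustrative test characteristics (the prior, sensitivity, and specificity below are hypothetical placeholders, not values endorsed by the review), the post-test probability after a negative CT angiogram follows from the negative likelihood ratio:

```python
def post_test_probability(prior, sensitivity, specificity):
    """Post-test probability after a NEGATIVE test, using the
    negative likelihood ratio LR- = (1 - sensitivity) / specificity."""
    lr_neg = (1.0 - sensitivity) / specificity
    prior_odds = prior / (1.0 - prior)
    post_odds = prior_odds * lr_neg
    return post_odds / (1.0 + post_odds)

# Hypothetical: high clinical probability 0.60, CT angiography
# sensitivity 0.83, specificity 0.96.
p = post_test_probability(0.60, 0.83, 0.96)  # roughly 0.21
```

The point the abstract makes falls out directly: with a high pre-test probability, even a negative result can leave a post-test probability too high to dismiss without further testing.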

  7. Increasing Food Acceptance in the School Setting for Children with Autism Spectrum Disorder Using High Probability Requests Sequences

    Science.gov (United States)

    Congdon, Marissa

    2013-01-01

    Behavioral feeding difficulties occur at a high rate in children with autism spectrum disorders (ASD) and can have a serious impact on their overall health and development. Although there are a number of studies demonstrating effective strategies for addressing behavioral feeding difficulties in children with ASD, the majority of them have been…

  9. Local Sea Level Changes: Assessing and Accounting for the Risk Associated With the Low-Probability, High-Risk Tail of the Risk Spectrum

    Science.gov (United States)

    Plag, H. P.

    2014-12-01

    Stakeholders in the coastal zone, particularly on urban coasts, are turning to science for information on future Local Sea Level (LSL) rise. Many scientists and scientific committees respond to this request with a range of plausible trajectories (RPT) defined by a number of possible trajectories, each corresponding to a certain scenario. Often these assessments take as a starting point the small number of global sea level trajectories provided by the IPCC. This approach is inherently deterministic. The resulting RPT, which can be quite large, is considered to reflect "uncertainty in LSL projections." Non-scientists often use the RPT to select a preferred and much narrower sub-RPT for which they plan, or they use the "large uncertainty" to justify not taking any measures. In response to societal needs, science focuses on reducing the uncertainties through improved deterministic models. This approach has a number of problems: (1) the complexity of LSL as the outcome of many local, regional, and global earth system processes, including anthropogenic processes, renders a deterministic approach to prediction invalid; (2) most assessments of the RPT account for an incomplete set of relevant earth system processes and, for each process, make assumptions that (often arbitrarily) constrain the contribution from that process; (3) LSL is an inherently probabilistic variable with a broad probability density function (PDF), and this PDF depends in a complex way on the PDFs of the many contributing processes. In particular, the contribution from the large ice sheets has a PDF with low-probability, high-impact tails that are generally neglected in deterministic LSL projections and in the sub-RPT used for coastal planning. A fully probabilistic assessment of the risk associated with LSL rise indicates that the standard deterministic assessment neglects not only most of the low-probability, high-risk tail of the PDF but also medium-probability, high-risk parts.
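The tail-risk argument in this abstract can be made concrete with a toy calculation: for a convex damage function, expected damage under a PDF with a low-probability, high-impact tail exceeds the damage computed from a single deterministic "best estimate". A minimal Monte Carlo sketch (all distributions and numbers below are hypothetical, not from the study):

```python
import random

def damage(x):
    # Convex damage: losses grow faster than linearly with LSL rise.
    return max(0.0, x) ** 2

def expected_damage(n_draws, rng):
    """Monte Carlo expected damage for a hypothetical LSL-rise PDF
    with a low-probability, high-impact ice-sheet tail."""
    total = 0.0
    for _ in range(n_draws):
        if rng.random() < 0.95:
            x = rng.gauss(0.5, 0.1)   # central outcome, metres of rise
        else:
            x = rng.gauss(1.5, 0.3)   # 5% ice-sheet tail
        total += damage(x)
    return total / n_draws

rng = random.Random(0)
mc = expected_damage(100_000, rng)
central_only = damage(0.55)  # deterministic "best estimate" of 0.55 m
```

With these made-up numbers the probabilistic expectation exceeds the deterministic figure, illustrating why planning on a single mid-range trajectory understates risk.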

  10. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
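The 1/e phenomenon the abstract refers to can be checked empirically with one of its classic instances, the derangement (hat-check) problem: the probability that a random permutation has no fixed point converges to 1/e as n grows. A small simulation (illustrative, not taken from the article):

```python
import random

def derangement_frequency(n_items, trials, rng):
    """Fraction of random permutations of n_items with no fixed point.
    For moderate n this is very close to 1/e ~ 0.3679."""
    hits = 0
    for _ in range(trials):
        perm = list(range(n_items))
        rng.shuffle(perm)
        if all(perm[i] != i for i in range(n_items)):
            hits += 1
    return hits / trials

rng = random.Random(42)
freq = derangement_frequency(10, 100_000, rng)  # close to 1/e
```

Already at n = 10 the exact derangement probability agrees with 1/e to about seven decimal places, which is why the simulated frequency lands so near 0.368.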

  11. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  13. Hydrology and Sedimentology of a Series of Dam-Breach Paleolakes at Idaeus Fossae, Mars

    Science.gov (United States)

    Salese, F.; Di Achille, G.; Ori, G. G.

    2014-12-01

    We report on the identification and geological study of a nearly 300-km-long valley system located westward of Idaeus Fossae, in Tempe Terra, Mars. The valley apparently originates from a subsided area surrounding the ejecta of a relatively fresh crater and, after about 25 km from its source area, enters a series of dam-breach paleolakes. The lake chain consists of six open basins (with associated fan-shaped sedimentary deposits) and covers an area of about 2500 sq. km over an E-W stretch of about 100 km. These lakes are interconnected and were likely coeval; they drain eastward into a main 20-km-diameter crater-lake forming a complex, multilobate deltaic deposit whose front lies at about 1800-1820 m below the martian datum. The deltaic deposit is about 8 km long and morphologically resembles the Jezero delta, showing a well-developed distributary pattern with evidence of channel switching on the delta plain. The floor of the crater-lake is not incised by the main valley; however, a breach area is present along the eastern crater rim and consists of two spillover channels at about the same elevation as the crater inlet (-1820 m). These latter channels connect the crater lake to the eastward portion of the valley, which continues towards Idaeus Fossae with a more than 180-km-long complex pattern of anabranching channels. We used high-resolution imagery and topography (HRSC, and CTX and HiRISE stereo pairs) to derive a geological-geomorphological map of the area and to understand its evolution. The extension and morphology of the observed fluvio-lacustrine features suggest relatively long-term (>10³ yrs) formation timescales, as also supported by the presence of the main fan delta in the central open basin. The overall water source for the 300-km-long fluvial system is unclear, though the occurrence of many rampart craters and the relationships between their ejecta and the channels suggest that subsurface volatiles might have also played an important role.

  14. Atmospheric ionization by high-fluence, hard spectrum solar proton events and their probable appearance in the ice core archive

    CERN Document Server

    Melott, Adrian L; Laird, Claude M; Neuenswander, Ben; Atri, Dimitra

    2016-01-01

    Solar energetic particles ionize the atmosphere, leading to production of nitrogen oxides. It has been suggested that some such events are visible as layers of nitrate in ice cores, yielding archives of energetic, high fluence solar proton events (SPEs). There has been controversy, due to slowness of transport for these species down from the upper stratosphere; past numerical simulations based on an analytic calculation have shown very little ionization below the mid stratosphere. These simulations suffer from deficiencies: they consider only soft SPEs and narrow energy ranges; spectral fits are poorly chosen; with few exceptions secondary particles in air showers are ignored. Using improved simulations that follow development of the proton-induced air shower, we find consistency with recent experiments showing substantial excess ionization down to 5 km. We compute nitrate available from the 23 February 1956 SPE, which had a high fluence, hard spectrum, and well-resolved associated nitrate peak in a Greenland...

  15. A probability model: Tritium release into the coolant of a light water tritium production reactor

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, D N

    1992-04-01

    This report presents a probability model of the total amount of tritium that will be released from a core of tritium target rods into the coolant of a light water reactor during a tritium production cycle. The model relates the total tritium released from a core to the release characteristics of an individual target rod within the core. The model captures total tritium release from two sources: release via target rod breach and release via permeation through the target rod. Specifically, under conservative assumptions about the breach characteristics of a target rod, total tritium released from a core is modeled as a function of the probability of a target breach and the mean and standard deviation of the permeation reduction factor (PRF) of an individual target rod. Two dominant facts emerge from the analysis in this report. First, total tritium release cannot be controlled and minimized solely through the PRF characteristics of a target rod; tritium release via breach must be abated if acceptable tritium production is to be achieved. Second, PRF values have a saturation point to their effectiveness. Specifically, in the presence of any realistic level of PRF variability, increasing PRF values above approximately 1000 contributes little to minimizing total tritium release.
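The structure of such a model can be illustrated with a Monte Carlo sketch. This is not the report's actual model; all parameter values, the unit rod inventory, and the lognormal choice for the PRF distribution are hypothetical:

```python
import math
import random

def simulate_core_release(n_rods, p_breach, prf_mean, prf_sd, rng,
                          rod_inventory=1.0):
    """Toy Monte Carlo of total tritium release from a core: a breached
    rod releases its whole inventory, an intact rod releases
    inventory / PRF, with PRF lognormally distributed around
    (prf_mean, prf_sd).  All parameters are hypothetical."""
    # Convert the desired mean/sd into lognormal parameters.
    sigma2 = math.log(1.0 + (prf_sd / prf_mean) ** 2)
    mu = math.log(prf_mean) - sigma2 / 2.0
    total = 0.0
    for _ in range(n_rods):
        if rng.random() < p_breach:
            total += rod_inventory                 # release via breach
        else:
            prf = rng.lognormvariate(mu, math.sqrt(sigma2))
            total += rod_inventory / prf           # release via permeation
    return total

rng = random.Random(1)
no_breach = simulate_core_release(1000, 0.0, 1000.0, 500.0, rng)
with_breach = simulate_core_release(1000, 0.01, 1000.0, 500.0, rng)
```

Even this toy version reproduces the report's two qualitative conclusions: a small breach probability dominates total release regardless of the PRF, and once the PRF is large the permeation term is already negligible, so raising it further buys little.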

  16. Caudate nucleus signals for breaches of expectation in a movement observation paradigm.

    Science.gov (United States)

    Schiffer, Anne-Marike; Schubotz, Ricarda I

    2011-01-01

    The striatum has been established as a carrier of reward-related prediction errors. This prediction error signal concerns the difference between how much reward was predicted and how much reward is gained. However, it remains to be established whether general breaches of expectation, i.e., perceptual prediction errors, are also implemented in the striatum. The current study used functional magnetic resonance imaging (fMRI) to investigate the role of caudate nucleus in breaches of expectation. Importantly, breaches were not related to the occurrence or absence of reward. Preceding the fMRI study, participants were trained to produce a sequence of whole-body movements according to auditory cues. In the fMRI session, they watched movies of a dancer producing the same sequences either according to the cue (88%) or not (12%). Caudate nucleus was activated for the prediction-violating movements. This activation was flanked by activity in posterior superior temporal sulcus, the temporo-parietal junction and adjacent angular gyrus, a network that may convey the deviating movement to caudate nucleus, while frontal areas may reflect adaptive adjustments of the current prediction. Alternative interpretations of caudate activity relating either to the saliency of breaches of expectation or to behavioral adaptation could be excluded by two control contrasts. The results foster the notion that neurons in the caudate nucleus code for a breach in expectation, and point toward a distributed network involved in detecting, signaling and adjusting behavior and expectations toward violated prediction.

  17. Whip Rule Breaches in a Major Australian Racing Jurisdiction: Welfare and Regulatory Implications

    Science.gov (United States)

    Hood, Jennifer; McDonald, Carolyn; Wilson, Bethany; McManus, Phil; McGreevy, Paul

    2017-01-01

    Simple Summary An evidence-based analysis of whip rule breaches in horse racing is needed to address community expectations that racehorses are treated humanely. The study provides the first peer-reviewed characterisation of whip rule breaches and their regulatory outcomes in horseracing, and considers the relationship between rules affecting racing integrity and the welfare of racehorses in a major Australian racing jurisdiction. Abstract Whip use in horseracing is increasingly being questioned on ethical, animal welfare, social sustainability, and legal grounds. Despite this, there is weak evidence for whip use and its regulation by Stewards in Australia. To help address this, we characterised whip rule breaches recorded by Stewards using Stewards Reports and Race Diaries from 2013 and 2016 in New South Wales (NSW) and the Australian Capital Territory (ACT). There were more recorded breaches at Metropolitan (M) than Country (C) or Provincial (P) locations, and by riders of horses that finished first, second, or third than by riders of horses that finished in other positions. The most commonly recorded breaches were forehand whip use on more than five occasions before the 100-metre (m) mark (44%), and whip use that raises the jockey’s arm above shoulder height (24%). It is recommended that racing compliance data be analysed annually to inform the evidence-base for policy, education, and regulatory change, and ensure the welfare of racehorses and racing integrity.

  18. Development of breached pin performance analysis code SAFFRON (System of Analyzing Failed Fuel under Reactor Operation by Numerical method)

    Energy Technology Data Exchange (ETDEWEB)

    Ukai, Shigeharu [Power Reactor and Nuclear Fuel Development Corp., Oarai, Ibaraki (Japan). Oarai Engineering Center

    1995-03-01

    On the assumption of fuel pin failure, the breached pin performance analysis code SAFFRON was developed to evaluate fuel pin behavior in relation to the delayed neutron signal response during operation beyond cladding failure. The following characteristic behaviors of a breached fuel pin are modeled with a 3-dimensional finite element method: pellet swelling by fuel-sodium reaction, fuel temperature change, and the resultant cladding breach extension and release of delayed neutron precursors into the coolant. In particular, a practical numerical algorithm was originally developed for the finite element method in order to solve the 3-dimensional non-linear contact problem between the pellet, swollen by the fuel-sodium reaction, and the breached cladding. (author).

  19. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko, [No Value

    2004-01-01

    Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  20. Agreeing Probability Measures for Comparative Probability Structures

    NARCIS (Netherlands)

    P.P. Wakker (Peter)

    1981-01-01

    textabstractIt is proved that fine and tight comparative probability structures (where the set of events is assumed to be an algebra, not necessarily a σ-algebra) have agreeing probability measures. Although this was often claimed in the literature, none of the proofs the author encountered is valid

  1. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  2. Understanding Students' Beliefs about Probability.

    Science.gov (United States)

    Konold, Clifford

    The concept of probability is not an easy concept for high school and college students to understand. This paper identifies and analyzes the students' alternative frameworks from the viewpoint of constructivism. There are various interpretations of probability through mathematical history: classical, frequentist, and subjectivist interpretation.…

  3. Whole-patient measure of safety: using administrative data to assess the probability of highly undesirable events during hospitalization.

    Science.gov (United States)

    Perla, Rocco J; Hohmann, Samuel F; Annis, Karen

    2013-01-01

    Hospitals often have limited ability to obtain primary clinical data from electronic health records to use in assessing quality and safety. We outline a new model that uses administrative data to gauge the safety of care at the hospital level. The model is based on a set of highly undesirable events (HUEs) defined using administrative data and can be customized to address the priorities and needs of different users. Patients with HUEs were identified using discharge abstracts from July 1, 2008 through June 30, 2010. Diagnoses were classified as HUEs based on the associated present-on-admission status. The 2-year study population comprised more than 6.5 million discharges from 161 hospitals. The proportion of hospitalizations including at least one HUE during the 24-month study period varied greatly among hospitals, with a mean of 7.74% (SD 2.3%) and a range of 13.32% (max, 15.31%; min, 1.99%). The whole-patient measure of safety provides a global measure to use in assessing hospitals with the patient's entire care experience in mind. As administrative and clinical datasets become more consistent, it becomes possible to use administrative data to compare the rates of HUEs across organizations and to identify opportunities for improvement.
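The flagging logic described, a hospitalization counts toward the measure when at least one highly undesirable event (HUE) diagnosis was not present on admission, can be sketched as follows. The record layout and the HUE code set here are invented for illustration; the actual model's definitions come from the study's administrative-data specification:

```python
# Hypothetical sketch: flag discharges containing at least one HUE,
# i.e. a diagnosis from an HUE code set that was NOT present on admission.
HUE_CODES = {"E876.5", "998.2", "507.0"}  # illustrative codes only

def has_hue(discharge):
    """discharge: dict with 'diagnoses' = list of (code, present_on_admission)."""
    return any(code in HUE_CODES and not poa
               for code, poa in discharge["diagnoses"])

def hue_rate(discharges):
    """Proportion of hospitalizations with at least one HUE."""
    return sum(has_hue(d) for d in discharges) / len(discharges)
```

The present-on-admission test is what separates harm acquired during the hospitalization from conditions the patient arrived with.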

  4. The Transition of Benthic Nutrient Sources after Planned Levee Breaches Adjacent to Upper Klamath and Agency Lakes, Oregon

    Science.gov (United States)

    Kuwabara, James S.; Topping, Brent R.; Carter, James L.; Parcheso, Francis; Cameron, Jason M.; Asbill, Jessica R.; Fend, Steven V.; Duff, John H.; Engelstad, Anita C.

    2010-01-01

    Four sampling trips were coordinated after planned levee breaches that hydrologically reconnected both Upper Klamath Lake and Agency Lake, Oregon, to adjacent wetlands. Sets of nonmetallic pore-water profilers were deployed during these trips in November 2007, June 2008, May 2009, and July 2009. Deployments temporally spanned the annual cyanophyte bloom of Aphanizomenon flos-aquae (AFA) and spatially involved three lake and four wetland sites. Profilers, typically deployed in triplicate at each lake or wetland site, provided high-resolution (centimeter-scale) estimates of the vertical concentration gradients for diffusive-flux determinations. Estimates based on molecular diffusion may underestimate benthic flux because solute transport across the sediment-water interface can be enhanced by processes including bioturbation, bioirrigation and groundwater advection. Water-column and benthic samples were also collected to help interpret spatial and temporal trends in diffusive-flux estimates. Data from these samples complement taxonomic and geochemical analyses of bottom-sediments taken from Upper Klamath Lake (UKL) in prior studies. This ongoing study provides information necessary for developing process-interdependent solute-transport models for the watershed (that is, models integrating physical, geochemical, and biological processes) and supports efforts to evaluate remediation or load-allocation strategies. To augment studies funded by the U.S. Bureau of Reclamation (USBR), the Department of the Interior supported an additional full deployment of pore-water profilers in November 2007 and July 2009, immediately following the levee breaches and after the crash of the annual summer AFA bloom. As observed consistently since 2006, benthic flux of 0.2-micron filtered, soluble reactive phosphorus (that is, biologically available phosphorus, primarily as orthophosphate; SRP) was consistently positive (that is, out of the sediment into the overlying water column) and

  5. Psychological contract types as moderator in the breach-violation and violation-burnout relationships.

    Science.gov (United States)

    Jamil, Amber; Raja, Usman; Darr, Wendy

    2013-01-01

    This research examined the relationships between perceived psychological contract breach, felt violation, and burnout in a sample (n = 361) of employees from various organizations in Pakistan. The moderating role of contract types in these relationships was also tested. Findings supported a positive association between perceived psychological contract breach and felt violation and both were positively related to burnout. Transactional and relational contracts moderated the felt violation-burnout relationship. Scores on relational contract type tended to be higher than for transactional contract type showing some contextual influence.

  6. Probability and Relative Frequency

    Science.gov (United States)

    Drieschner, Michael

    2016-01-01

    The concept of probability seems to have been inexplicable since its invention in the seventeenth century. In its use in science, probability is closely related with relative frequency. So the task seems to be interpreting that relation. In this paper, we start with predicted relative frequency and show that its structure is the same as that of probability. I propose to call that the `prediction interpretation' of probability. The consequences of that definition are discussed. The "ladder"-structure of the probability calculus is analyzed. The expectation of the relative frequency is shown to be equal to the predicted relative frequency. Probability is shown to be the most general empirically testable prediction.
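The paper's central claim, that the expectation of the relative frequency equals the predicted relative frequency (i.e., the probability), can be illustrated numerically. This simulation is our own illustration, not material from the paper:

```python
import random

def relative_frequency(p, n_trials, seed=0):
    """Empirical relative frequency of an event of probability p
    over n_trials independent trials."""
    rng = random.Random(seed)
    hits = sum(rng.random() < p for _ in range(n_trials))
    return hits / n_trials
```

By the law of large numbers, the observed relative frequency concentrates around p as the number of trials grows, which is the empirical testability of predictions that the abstract appeals to.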

  7. Elements of probability theory

    CERN Document Server

    Rumshiskii, L Z

    1965-01-01

    Elements of Probability Theory presents the methods of the theory of probability. This book is divided into seven chapters that discuss the general rule for the multiplication of probabilities, the fundamental properties of the subject matter, and the classical definition of probability. The introductory chapters deal with the functions of random variables; continuous random variables; numerical characteristics of probability distributions; center of the probability distribution of a random variable; definition of the law of large numbers; stability of the sample mean and the method of moments

  8. Evaluating probability forecasts

    CERN Document Server

    Lai, Tze Leung; Shen, David Bo; 10.1214/11-AOS902

    2012-01-01

    Probability forecasts of events are routinely used in climate predictions, in forecasting default probabilities on bank loans or in estimating the probability of a patient's positive response to treatment. Scoring rules have long been used to assess the efficacy of the forecast probabilities after observing the occurrence, or nonoccurrence, of the predicted events. We develop herein a statistical theory for scoring rules and propose an alternative approach to the evaluation of probability forecasts. This approach uses loss functions relating the predicted to the actual probabilities of the events and applies martingale theory to exploit the temporal structure between the forecast and the subsequent occurrence or nonoccurrence of the event.
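A classic example of the scoring rules mentioned is the Brier score, the mean squared difference between forecast probabilities and observed binary outcomes. The abstract's loss-function and martingale approach is more general, but the Brier score illustrates the basic evaluation idea:

```python
def brier_score(forecasts, outcomes):
    """Mean squared difference between forecast probabilities and
    observed binary outcomes (1 = event occurred); lower is better."""
    assert len(forecasts) == len(outcomes)
    return sum((p - y) ** 2 for p, y in zip(forecasts, outcomes)) / len(forecasts)
```

A perfectly confident and correct forecaster scores 0, while always hedging at 0.5 scores 0.25 regardless of the outcomes.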

  9. Retrospective Analysis of Allegations About the Breach of Discipline in Ankara Chamber of Dentist Between 2005-2009

    Directory of Open Access Journals (Sweden)

    Zehtiye Füsun Yaşar

    2014-04-01

    Full Text Available PURPOSE: The allegations about breaches of the disciplinary code in the Ankara Chamber of Dentists, the topics of the allegations, and the characteristics of the people who allegedly breached the code are examined. METHOD: 198 files charging 116 dentists were examined in this study. The nature of the allegation; the age, sex, and professional position of the dentist; and the sanctions imposed on the dentists when the allegations of breach were substantiated are studied. FINDINGS: 45 (38.8%) of the dentists who were subject to questioning were women and 71 (61.2%) were men. It was observed that 76 of the dentists (65.5%) worked at polyclinics, and 63 (54.3%) of the dentists working at polyclinics were responsible managers. Of the 198 breach allegations, 163 (82.3%) were proven to be true, and various sanctions were imposed on the dentists. When the distribution of the breaches was examined, it was found that most often the dentists did not comply with the prohibition on advertising (42.9%) or worked under the minimum wage (22.7%). CONCLUSION: It was observed that the majority of the dentists questioned under allegations of breach did not have enough information about the laws, codes, and regulations concerning their profession, and some dentists breached the same regulation again. Given the obtained data, it is suggested that dentists should be educated about the rules and regulations by people from professional associations. Key Words: Dentist; law; regulation; breach

  10. The Relationship between Psychological Contract Breach and Organizational Commitment: Exchange Imbalance as a Moderator of the Mediating Role of Violation

    Science.gov (United States)

    Cassar, Vincent; Briner, Rob B.

    2011-01-01

    This study tested the mediating role of violation in the relationship between breach and both affective and continuance commitment and the extent to which this mediating role is moderated by exchange imbalance amongst a sample of 103 sales personnel. Results suggest that violation mediated the relationship between breach and commitment. Also,…

  11. 29 CFR 37.105 - Whom must the Director notify if enforcement action under a Notification of Breach of...

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 1 2010-07-01 2010-07-01 true Whom must the Director notify if enforcement action under a Notification of Breach of Conciliation Agreement is commenced? 37.105 Section 37.105 Labor Office of the... action under a Notification of Breach of Conciliation Agreement is commenced? In such circumstances,...

  12. Case Study of the Chaq-Chaq Dam Failure: Parameter Estimation and Evaluation of Dam Breach Prediction Models

    Directory of Open Access Journals (Sweden)

    Dr. Kawa Zedan Abdulrahman

    2014-05-01

    Full Text Available On 4 February 2006, at about 10:00 pm, the Chaq-Chaq dam failed due to overtopping. A fall of 131.2 mm of rain over a 24-hour period was recorded at the Sulaimani meteorological gage station, located about 7.5 km south-east of the dam. As a result, the reservoir level rose, the dam was overtopped, and it finally breached near the spillway at the right abutment. Fortunately, no loss of human life or destruction of important structures was reported due to the dam failure. The aim of this paper is to estimate the flood hydrograph passing through the Chaq-Chaq dam breach, using the measured breach geometry as input to the unsteady-flow option of HEC-RAS 4.1.0 and calibrating the breach formation time to reproduce the measured maximum water surface at the Chaq-Chaq Bridge (1.36 km downstream of the dam axis). In addition, recent breach prediction models were evaluated to check their accuracy in predicting the breach geometry, breach formation time, and peak breach discharge.
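Breach prediction models of the kind evaluated in such case studies are typically empirical regressions fitted to historical dam failures. As one widely cited example (not necessarily among those the authors tested), the Froehlich (1995) relation estimates peak breach outflow from reservoir volume and water depth:

```python
def froehlich_peak_discharge(v_w, h_w):
    """Froehlich (1995) peak-discharge regression (SI units):
    v_w: reservoir volume above the breach invert at failure [m^3]
    h_w: height of water above the breach invert [m]
    returns the estimated peak breach discharge [m^3/s]."""
    return 0.607 * v_w ** 0.295 * h_w ** 1.24
```

Such regressions carry large scatter, which is why case studies like this one, with measured breach geometry and a calibrated formation time, are valuable for checking their accuracy.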

  14. Psychological contract breach in the anticipatory stage of change : Employee responses and the moderating role of supervisory informational justice

    NARCIS (Netherlands)

    De Ruiter, M.; Schaveling, J.; Schalk, R.; Gelder, van D.

    2016-01-01

    This study examined the impact of two types of psychological contract breach (organizational policies and social atmosphere breach) on resistance to change and engagement in the anticipatory phase of change and assessed whether supervisory informational justice mitigated the negative effects of

  15. What Are Probability Surveys?

    Science.gov (United States)

    The National Aquatic Resource Surveys (NARS) use probability-survey designs to assess the condition of the nation’s waters. In probability surveys (also known as sample-surveys or statistical surveys), sampling sites are selected randomly.

  16. Introduction to probability

    CERN Document Server

    Roussas, George G

    2006-01-01

    Roussas's Introduction to Probability features exceptionally clear explanations of the mathematics of probability theory and explores its diverse applications through numerous interesting and motivational examples. It provides a thorough introduction to the subject for professionals and advanced students taking their first course in probability. The content is based on the introductory chapters of Roussas's book, An Intoduction to Probability and Statistical Inference, with additional chapters and revisions. Written by a well-respected author known for great exposition an

  17. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  18. Breach modelling by overflow with TELEMAC 2D: Comparison with large-scale experiments

    Science.gov (United States)

    An erosion law has been implemented in TELEMAC 2D to represent the surface erosion process in order to model breach formation in a levee. We focus on a homogeneous earth-fill levee to simplify this first implementation. The first part of this study reveals the ability of this method to represent simu...

  19. Disclosure of past crimes: an analysis of mental health professionals' attitudes towards breaching confidentiality.

    Science.gov (United States)

    Wangmo, Tenzin; Handtke, Violet; Elger, Bernice Simone

    2014-09-01

    Ensuring confidentiality is the cornerstone of trust within the doctor-patient relationship. However, health care providers have an obligation to serve not only their patient's interests but also those of potential victims and society, resulting in circumstances where confidentiality must be breached. This article describes the attitudes of mental health professionals (MHPs) when patients disclose past crimes unknown to the justice system. Twenty-four MHPs working in Swiss prisons were interviewed. They shared their experiences concerning confidentiality practices and attitudes towards breaching confidentiality in prison. Qualitative analysis revealed that MHPs study different factors before deciding whether a past crime should be disclosed, including: (1) the type of therapy the prisoner-patient was seeking (i.e., whether it was court-ordered or voluntary), (2) the type of crime that is revealed (e.g., a serious crime, a crime of a similar nature to the original crime, or a minor crime), and (3) the danger posed by the prisoner-patient. Based on this study's findings, risk assessment of dangerousness was one of the most important factors determining disclosures of past crimes, taking into consideration both the type of therapy and the crime involved. Attitudes of MHPs varied with regard to confidentiality rules and when to breach confidentiality, and there was thus a lack of consensus as to when and whether past crimes should be reported. Hence, legal and ethical requirements concerning confidentiality breaches must be made clear and known to physicians in order to guide them with difficult cases.

  20. Remedies for Breach Under the United Nations Convention on Contracts for International Sale of Goods (CISG)

    DEFF Research Database (Denmark)

    Lookofsky, Joseph

    2011-01-01

    For every breach of a binding contract, there must be some remedy. The gap-filling remedial structure of the 1980 Vienna Sales Convention (CISG) reflects the fact that all significant forms of remedial relief may be said to fall within three basic courses of action which modern legal systems make...

  1. Developing a broader approach to management of infection control breaches in health care settings.

    Science.gov (United States)

    Patel, Priti R; Srinivasan, Arjun; Perz, Joseph F

    2008-12-01

    Our experiences with health departments and health care facilities suggest that questions surrounding instrument reprocessing errors and other infection control breaches are becoming increasingly common. We describe an approach to management of these incidents that focuses on risk of bloodborne pathogen transmission and the role of public health and other stakeholders to inform patient notification and testing decisions.

  2. Consequence analysis of a liner breach due to steam under the liner

    Energy Technology Data Exchange (ETDEWEB)

    HIMES, D.A.

    1999-06-01

    Radiological and toxicological consequences are estimated for a steam release from tank C-106 associated with a breach of the tank liner due to formation of steam under the liner after dry-out of the sludge layer in the tank. The consequences are shown to be well below the most restrictive risk guidelines.

  3. 47 CFR 64.2011 - Notification of customer proprietary network information security breaches.

    Science.gov (United States)

    2010-10-01

    ... Proprietary Network Information § 64.2011 Notification of customer proprietary network information security... 47 Telecommunication 3 2010-10-01 2010-10-01 false Notification of customer proprietary network information security breaches. 64.2011 Section 64.2011 Telecommunication FEDERAL COMMUNICATIONS...

  4. Liability to disgorge profits upon breach of contract or a delict

    NARCIS (Netherlands)

    Schrage, E.J.H.

    2013-01-01

    Remedies regarding contract and tort are, generally speaking, concerned with the incidence of liability for loss or damage suffered, whereas the claim in unjust enrichment is said to require that the enrichment has occurred at the expense of the creditor. Consequently claims for breach of contract a

  5. Dependent Probability Spaces

    Science.gov (United States)

    Edwards, William F.; Shiflett, Ray C.; Shultz, Harris

    2008-01-01

    The mathematical model used to describe independence between two events in probability has a non-intuitive consequence called dependent spaces. The paper begins with a very brief history of the development of probability, then defines dependent spaces, and reviews what is known about finite spaces with uniform probability. The study of finite…

  7. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  8. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    2013-01-01

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned prob

  9. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process

  10. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  11. Dynamical Simulation of Probabilities

    Science.gov (United States)

    Zak, Michail

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without using any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities, and their link to quantum computations, are discussed. Special attention is focused on coupled stochastic processes, defined in terms of conditional probabilities, for which a joint probability does not exist. Simulations of quantum probabilities are also discussed.

  12. Philosophy and probability

    CERN Document Server

    Childers, Timothy

    2013-01-01

    Probability is increasingly important for our understanding of the world. What is probability? How do we model it, and how do we use it? Timothy Childers presents a lively introduction to the foundations of probability and to philosophical issues it raises. He keeps technicalities to a minimum, and assumes no prior knowledge of the subject. He explains the main interpretations of probability-frequentist, propensity, classical, Bayesian, and objective Bayesian-and uses stimulatingexamples to bring the subject to life. All students of philosophy will benefit from an understanding of probability,

  13. A Brief Analysis of Probability Problems in Mathematical Games in the Senior High School Entrance Examination

    Institute of Scientific and Technical Information of China (English)

    杨春霞; 李锋

    2014-01-01

    Probability problems that take games as their setting often appear in the mathematics test of the senior high school entrance examination. These problems are novel and interesting: they not only greatly mobilize students' learning initiative, but also effectively examine students' ability to apply mathematical knowledge to analyze and solve problems.

  14. Process of Levee Breach by Overflow at the Full Scale Chiyoda Experimental Channel

    Science.gov (United States)

    Shimada, T.; Yokoyama, H.

    2011-12-01

    The increased occurrence of storm disasters caused by typhoons and local downpours in recent years has given rise to concerns over the possibility of large-scale floods resulting from river overflow. Levee breaches cause particularly severe damage, and in Japan more than 80% of such failures in the past have been attributed to overflow. Previous studies on overflow-induced levee breaches have not revealed the mechanisms of these failures on a full-scale 3D basis (i.e., side overflow taking the river flow on the riverside land into consideration). Clarifying these mechanisms is important for disaster prevention and for advancing future studies on overflow-induced failure. Levees (crown width of 3 m in 2010 and 6 m in 2011, height of 3 m, length of 80 m) were built in the Chiyoda Experimental Channel (a full-scale experimental channel; width 30 m, length 1,300 m, maximum discharge 170 t/s) in Hokkaido, Japan, and a three-dimensional experiment on levee breach by overflow was conducted. The findings of the experiment are as follows. After the beginning of overflow, levee breach widening did not begin until after most of the levee section had collapsed; with a levee crown width of 6 m, that time becomes longer. It was also considered that, even if overflow occurred, extremely serious damage (e.g., a sudden increase in levee breach width and overflow discharge) was unlikely unless the majority of the levee section collapsed.

  15. Call Admission Control with Optimal Blocking Probability in High-Speed Networks

    Institute of Scientific and Technical Information of China (English)

    赵尔敦; 石冰心; 郭喻茹; 黄川

    2001-01-01

    A call admission control scheme with optimal blocking probability in a high-speed network is given. In a multi-class call environment, the acceptance region with minimum call blocking probability is obtained. Numerical results show that the maximum call number decreases with the stay-time of the call and increases with the load of the call.
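The abstract does not state which blocking model it uses, but blocking probability in single-class loss systems of this kind is classically computed with the Erlang B formula. As background, here is its numerically stable recurrence (our own illustration, not the paper's multi-class scheme):

```python
def erlang_b(offered_load, n_channels):
    """Blocking probability of an M/M/n/n loss system (Erlang B),
    via the stable recurrence:
        B(0) = 1
        B(n) = a*B(n-1) / (n + a*B(n-1)),  a = offered load in erlangs."""
    b = 1.0
    for n in range(1, n_channels + 1):
        b = offered_load * b / (n + offered_load * b)
    return b
```

The recurrence avoids the large factorials in the closed-form expression, so it remains accurate even for hundreds of channels; multi-class admission control generalizes this by choosing which call classes to accept so that the aggregate blocking is minimized.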

  16. 25 CFR 224.88 - What must the Director do after receiving notice of a violation or breach from the tribe?

    Science.gov (United States)

    2010-04-01

    ... violation or breach from the tribe? 224.88 Section 224.88 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF... DEVELOPMENT AND SELF DETERMINATION ACT Implementation of Tribal Energy Resource Agreements Violation Or Breach § 224.88 What must the Director do after receiving notice of a violation or breach from the tribe?...

  17. A study to assess and measure the breaches in the child rights

    Directory of Open Access Journals (Sweden)

    Pooja Chaudhary

    2014-10-01

    Full Text Available Background: Children signify eternal optimism in the human being and provide potential for development. Every nation, whether developed or developing, links its future with the status of the child. An understanding of the physical, social and behavioral factors is conducive to assessing the current situation and vulnerability of any child. To ensure the utmost level of health and safety of a child, we need to have an insight into violations of child rights. We conducted this study to assess and measure the breaches in child rights. Materials and Methods: A cross-sectional study was carried out among 200 children between 10 and 18 years of age in one of the field practice areas of the Community Medicine department, B. J. Medical College, Ahmedabad. Results: More than 90% of the children were born in hospital settings, fully vaccinated and living with their parents. Overall, 60% of the children reported having had some kind of illness in the past one year, and the majority of them approached a healthcare facility for treatment; 26% had low birth weight (LBW) and 68% had a body mass index (BMI) below 18.5. About 93% of the children were enrolled in either a government or a private school, and of these, 54% complained of the burden of either homework or tuition; 6.5% of the children were school dropouts, all of them girls, and 4% were child laborers. Conclusion: The study indicated positive findings in the areas of child survival and development and, to some extent, in the domain of the right to participation, though environmental conditions and recreational opportunities were poor. The prevalence of malnutrition and child labor was also high. The study results highlight the need for adequate services in these areas.

  18. Early degassing of lunar urKREEP by crust-breaching impact(s)

    Science.gov (United States)

    Barnes, Jessica J.; Tartèse, Romain; Anand, Mahesh; McCubbin, Francis M.; Neal, Clive R.; Franchi, Ian A.

    2016-08-01

    Current models for the Moon's formation have yet to fully account for the thermal evolution of the Moon in the presence of H2O and other volatiles. Of particular importance is chlorine, since most lunar samples are characterised by unique heavy δ37Cl values, significantly deviating from those of other planetary materials, including Earth, for which δ37Cl values cluster around ∼0‰. In order to unravel the cause(s) of the Moon's unique chlorine isotope signature, we performed a comprehensive study of high-precision in situ Cl isotope measurements of apatite from a suite of Apollo samples with a range of geochemical characteristics and petrologic types. The Cl-isotopic compositions measured in lunar apatite in the studied samples display a wide range of δ37Cl values (reaching a maximum value of +36‰), which are positively correlated with the amount of potassium (K), Rare Earth Element (REE) and phosphorus (P) (KREEP) component in each sample. Using these new data, integrated with existing H-isotope data obtained for the same samples, we are able to place these findings in the context of the canonical lunar magma ocean (LMO) model. The results are consistent with the urKREEP reservoir being characterised by a δ37Cl ∼+30‰. Such a heavy Cl isotope signature requires metal-chloride degassing from a Cl-enriched urKREEP LMO residue, a process likely to have been triggered by at least one large crust-breaching impact event that facilitated the transport and exposure of urKREEP liquid to the lunar surface.

  19. How the mountain pine beetle (Dendroctonus ponderosae) breached the Canadian Rocky Mountains.

    Science.gov (United States)

    Janes, Jasmine K; Li, Yisu; Keeling, Christopher I; Yuen, Macaire M S; Boone, Celia K; Cooke, Janice E K; Bohlmann, Joerg; Huber, Dezene P W; Murray, Brent W; Coltman, David W; Sperling, Felix A H

    2014-07-01

    The mountain pine beetle (MPB; Dendroctonus ponderosae Hopkins), a major pine forest pest native to western North America, has extended its range north and eastward during an ongoing outbreak. Determining how the MPB has expanded its range to breach putative barriers, whether physical (nonforested prairie and high elevation of the Rocky Mountains) or climatic (extreme continental climate where temperatures can be below -40 °C), may contribute to our general understanding of range changes as well as management of the current epidemic. Here, we use a panel of 1,536 single nucleotide polymorphisms (SNPs) to assess population genetic structure, connectivity, and signals of selection within this MPB range expansion. Biallelic SNPs in MPB from southwestern Canada revealed higher genetic differentiation and lower genetic connectivity than in the northern part of its range. A total of 208 unique SNPs were identified using different outlier detection tests, of which 32 returned annotations for products with putative functions in cholesterol synthesis, actin filament contraction, and membrane transport. We suggest that MPB has been able to spread beyond its previous range by adjusting its cellular and metabolic functions, with genome scale differentiation enabling populations to better withstand cooler climates and facilitate longer dispersal distances. Our study is the first to assess landscape-wide selective adaptation in an insect. We have shown that interrogation of genomic resources can identify shifts in genetic diversity and putative adaptive signals in this forest pest species. © The Author 2014. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.

  20. Simulating dam-breach flood scenarios of the Tangjiashan landslide dam induced by the Wenchuan Earthquake

    Directory of Open Access Journals (Sweden)

    X. Fan

    2012-10-01

    Full Text Available Floods from failures of landslide dams can pose a hazard to people and property downstream; such hazards have to be rapidly assessed and mitigated in order to reduce the potential risk. The Tangjiashan landslide dam induced by the Mw = 7.9 2008 Wenchuan earthquake had impounded the largest lake in the earthquake-affected area, with an estimated volume of 3 × 10^8 m^3, and the potential catastrophic dam breach posed a serious threat to more than 2.5 million people in downstream towns and Mianyang city, located 85 km downstream. Chinese authorities had to evacuate parts of the city until the Tangjiashan landslide dam was artificially breached by a spillway and the lake was drained. We propose an integrated approach to simulate the dam-breach floods for a number of possible scenarios, to evaluate the severity of the threat to Mianyang city. Firstly, the physically-based BREACH model was applied to predict the flood hydrographs at the dam location, which were calibrated with observational data of the flood resulting from the artificial breaching. The output hydrographs from this model were input into the 1-D–2-D SOBEK hydrodynamic model to simulate the spatial variations in flood parameters. The simulated flood hydrograph, peak discharge and peak arrival time at the downstream towns fit the observations. This approach is thus capable of providing reliable predictions for the decision makers to determine the mitigation plans. The sensitivity analysis of the BREACH model input parameters reveals that the average grain size, the unit weight and porosity of the dam materials are the most sensitive parameters. The variability of the dam material properties causes a large uncertainty in the estimation of the peak flood discharge and peak arrival time, but has little influence on the flood inundation area and flood depth downstream. The effect of cascading breaches of smaller dams downstream of the Tangjiashan dam was
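
The level-pool idea behind such dam-breach simulations can be sketched with a much-simplified parametric model (this is not the BREACH model itself): a rectangular breach widens at a prescribed rate and discharges over a broad-crested weir while a mass balance draws the lake down. All parameter values below are hypothetical, chosen only to resemble a large landslide-dammed lake:

```python
def breach_hydrograph(volume_m3, area_m2, widen_rate_ms, final_width_m,
                      cd=1.7, dt=10.0, t_max=48 * 3600):
    """Toy level-pool dam-breach routing: a rectangular breach widens
    linearly and discharges over a broad-crested weir, Q = cd*b*h**1.5
    (SI units); the lake is treated as prismatic."""
    h = volume_m3 / area_m2              # initial depth above breach invert
    width, t, hydrograph = 0.0, 0.0, []
    while h > 0.01 and t < t_max:
        width = min(final_width_m, width + widen_rate_ms * dt)
        q = cd * width * h ** 1.5        # breach outflow (m^3/s)
        h = max(h - q * dt / area_m2, 0.0)   # lake drawdown by mass balance
        t += dt
        hydrograph.append((t, q))
    return hydrograph

# Hypothetical inputs: 3e8 m^3 lake, 1e7 m^2 surface area,
# breach widening to 50 m over roughly two hours.
hydro = breach_hydrograph(3e8, 1e7, 0.007, 50.0)
```

The resulting hydrograph rises while the breach widens, peaks, and recedes as the lake drains, which is the qualitative shape such models are calibrated to reproduce.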

  1. Investigation of breached depleted UF{sub 6} cylinders

    Energy Technology Data Exchange (ETDEWEB)

    DeVan, J.H. [Martin Marietta Energy Systems, Inc., Oak Ridge, TN (United States)

    1991-12-31

    In June 1990, during a three-site inspection of cylinders being used for long-term storage of solid depleted UF{sub 6}, two 14-ton cylinders at Portsmouth, Ohio, were discovered with holes in the barrel section of the cylinders. An investigation team was immediately formed to determine the cause of the failures and their impact on future storage procedures and to recommend corrective actions. Subsequent investigation showed that the failures most probably resulted from mechanical damage that occurred at the time that the cylinders had been placed in the storage yard. In both cylinders evidence pointed to the impact of a lifting lug of an adjacent cylinder near the front stiffening ring, where deflection of the cylinder could occur only by tearing the cylinder. The impacts appear to have punctured the cylinders and thereby set up corrosion processes that greatly extended the openings in the wall and obliterated the original crack. Fortunately, the reaction products formed by this process were relatively protective and prevented any large-scale loss of uranium. The main factors that precipitated the failures were inadequate spacing between cylinders and deviations in the orientations of lifting lugs from their intended horizontal position. After reviewing the causes and effects of the failures, the team's principal recommendation for remedial action concerned improved cylinder handling and inspection procedures. Design modifications and supplementary mechanical tests were also recommended to improve the cylinder containment integrity during the stacking operation.

  2. Probability and radical behaviorism

    Science.gov (United States)

    Espinosa, James M.

    1992-01-01

    The concept of probability appears to be very important in the radical behaviorism of Skinner. Yet, it seems that this probability has not been accurately defined and is still ambiguous. I give a strict, relative frequency interpretation of probability and its applicability to the data from the science of behavior as supplied by cumulative records. Two examples of stochastic processes are given that may model the data from cumulative records that result under conditions of continuous reinforcement and extinction, respectively. PMID:22478114
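
The kind of stochastic process that can mimic a cumulative record is easy to sketch (a generic Bernoulli sketch, not the paper's specific processes): a constant response probability produces the steady slope typical of continuous reinforcement, while a decaying probability produces the negatively accelerated curve typical of extinction. The rate functions below are illustrative assumptions:

```python
import math
import random

def cumulative_record(n_steps, response_prob, seed=0):
    """Running total of Bernoulli responses; response_prob(t) gives the
    probability of a response at time step t."""
    rng = random.Random(seed)
    total, record = 0, []
    for t in range(n_steps):
        total += rng.random() < response_prob(t)   # True counts as 1
        record.append(total)
    return record

crf = cumulative_record(1000, lambda t: 0.5)                       # steady responding
ext = cumulative_record(1000, lambda t: 0.5 * math.exp(-t / 200))  # responding dies out
```

Plotting `crf` gives a roughly straight cumulative curve, while `ext` flattens out, mirroring the two conditions named in the abstract.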

  3. Probability and radical behaviorism

    OpenAIRE

    Espinosa, James M.

    1992-01-01

    The concept of probability appears to be very important in the radical behaviorism of Skinner. Yet, it seems that this probability has not been accurately defined and is still ambiguous. I give a strict, relative frequency interpretation of probability and its applicability to the data from the science of behavior as supplied by cumulative records. Two examples of stochastic processes are given that may model the data from cumulative records that result under conditions of continuous reinforc...

  4. The probabilities of unique events.

    Directory of Open Access Journals (Sweden)

    Sangeet S Khemlani

    Full Text Available Many theorists argue that the probabilities of unique events, even real possibilities such as President Obama's re-election, are meaningless. As a consequence, psychologists have seldom investigated them. We propose a new theory (implemented in a computer program) in which such estimates depend on an intuitive non-numerical system capable only of simple procedures, and a deliberative system that maps intuitions into numbers. The theory predicts that estimates of the probabilities of conjunctions should often tend to split the difference between the probabilities of the two conjuncts. We report two experiments showing that individuals commit such violations of the probability calculus, and corroborating other predictions of the theory, e.g., individuals err in the same way even when they make non-numerical verbal estimates, such as that an event is highly improbable.
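
The predicted violation is easy to state: a midpoint ("split the difference") estimate of p(A and B) exceeds the upper bound min(p(A), p(B)) that the probability calculus imposes whenever the two conjuncts differ. A minimal sketch with illustrative numbers:

```python
def midpoint_estimate(p_a, p_b):
    """The intuitive 'split the difference' estimate of p(A and B)
    predicted by the theory."""
    return (p_a + p_b) / 2.0

p_a, p_b = 0.9, 0.2              # hypothetical conjunct probabilities
estimate = midpoint_estimate(p_a, p_b)   # 0.55
bound = min(p_a, p_b)            # probability calculus: p(A and B) <= min(p(A), p(B))
violates = estimate > bound      # True: the midpoint breaks the conjunction bound
```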

  5. PROBABILITY AND STATISTICS.

    Science.gov (United States)

    Descriptors: statistical analysis, probability, reports, information theory, differential equations, statistical processes, stochastic processes, multivariate analysis, distribution theory, decision theory, measure theory, optimization.

  6. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory. Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  7. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY. Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio

  8. Low-Probability High-Consequence (LPHC) Failure Events in Geologic Carbon Sequestration Pipelines and Wells: Framework for LPHC Risk Assessment Incorporating Spatial Variability of Risk

    Energy Technology Data Exchange (ETDEWEB)

    Oldenburg, Curtis M. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Budnitz, Robert J. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-08-31

    If Carbon dioxide Capture and Storage (CCS) is to be effective in mitigating climate change, it will need to be carried out on a very large scale. This will involve many thousands of miles of dedicated high-pressure pipelines in order to transport many millions of tonnes of CO2 annually, with the CO2 delivered to many thousands of wells that will inject the CO2 underground. The new CCS infrastructure could rival in size the current U.S. upstream natural gas pipeline and well infrastructure. This new infrastructure entails hazards for life, health, animals, the environment, and natural resources. Pipelines are known to rupture due to corrosion, from external forces such as impacts by vehicles or digging equipment, by defects in construction, or from the failure of valves and seals. Similarly, wells are vulnerable to catastrophic failure due to corrosion, cement degradation, or operational mistakes. While most accidents involving pipelines and wells will be minor, there is the inevitable possibility of accidents with very high consequences, especially to public health. The most important consequence of concern is CO2 release to the environment in concentrations sufficient to cause death by asphyxiation to nearby populations. Such accidents are thought to be very unlikely, but of course they cannot be excluded, even if major engineering effort is devoted (as it will be) to keeping their probability low and their consequences minimized. This project has developed a step-by-step probabilistic methodology for analyzing the risks of these rare but high-consequence accidents. A key difference between risks for pipelines and wells is that the former are spatially distributed along the pipe whereas the latter are confined to the vicinity of the well. Otherwise, the methodology we develop for risk assessment of pipeline and well failures is similar and provides an analysis both of the annual probabilities of
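
The spatial point made here — pipeline risk accumulates along the line while well risk is point-like — can be sketched with a homogeneous Poisson failure model. The rates below are invented for illustration only, not project estimates:

```python
import math

def prob_at_least_one(rate_per_mile_year, miles, years=1.0):
    """P(at least one failure) under a Poisson model with a constant
    failure rate per mile-year along the pipeline: 1 - exp(-lambda*L*T)."""
    return 1.0 - math.exp(-rate_per_mile_year * miles * years)

rate = 1e-4                                   # assumed ruptures per mile-year
p_long = prob_at_least_one(rate, 1000.0)      # 1000-mile pipeline: risk accumulates
p_short = prob_at_least_one(rate, 1.0)        # ~point-like asset at the same rate
```

With these toy numbers the 1000-mile line has roughly a 9.5% annual chance of at least one rupture somewhere along its length, versus about 0.01% for the point-like case — the spatial-distribution effect the methodology has to capture.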

  9. Accuracy of dual-source CT coronary angiography: first experience in a high pre-test probability population without heart rate control

    Energy Technology Data Exchange (ETDEWEB)

    Scheffel, Hans; Alkadhi, Hatem; Desbiolles, Lotus; Frauenfelder, Thomas; Schertler, Thomas; Husmann, Lars; Marincek, Borut; Leschka, Sebastian [University Hospital Zurich, Institute of Diagnostic Radiology, Zurich (Switzerland); Plass, Andre; Vachenauer, Robert; Grunenfelder, Juerg; Genoni, Michele [Clinic for Cardiovascular Surgery, Zurich (Switzerland); Gaemperli, Oliver; Schepis, Tiziano [University Hospital Zurich, Cardiovascular Center, Zurich (Switzerland); Kaufmann, Philipp A. [University Hospital Zurich, Cardiovascular Center, Zurich (Switzerland); University of Zurich, Center for Integrative Human Physiology, Zurich (Switzerland)

    2006-12-15

    The aim of this study was to assess the diagnostic accuracy of dual-source computed tomography (DSCT) for evaluation of coronary artery disease (CAD) in a population with extensive coronary calcifications without heart rate control. Thirty patients (24 male, 6 female, mean age 63.1{+-}11.3 years) with a high pre-test probability of CAD underwent DSCT coronary angiography and invasive coronary angiography (ICA) within 14{+-}9 days. No beta-blockers were administered prior to the scan. Two readers independently assessed image quality of all coronary segments with a diameter {>=}1.5 mm using a four-point score (1: excellent to 4: not assessable) and qualitatively assessed significant stenoses as narrowing of the luminal diameter >50%. Causes of false-positive (FP) and false-negative (FN) ratings were assigned to calcifications or motion artifacts. ICA was considered the standard of reference. Mean body mass index was 28.3{+-}3.9 kg/m{sup 2} (range 22.4-36.3 kg/m{sup 2}), mean heart rate during CT was 70.3{+-}14.2 bpm (range 47-102 bpm), and mean Agatston score was 821{+-}904 (range 0-3,110). Image quality was diagnostic (scores 1-3) in 98.6% (414/420) of segments (mean image quality score 1.68{+-}0.75); six segments in three patients were considered not assessable (1.4%). DSCT correctly identified 54 of 56 significant coronary stenoses. Severe calcifications accounted for false ratings in nine segments (eight FP/one FN) and motion artifacts in two segments (one FP/one FN). Overall sensitivity, specificity, positive and negative predictive value for evaluating CAD were 96.4, 97.5, 85.7, and 99.4%, respectively. First experience indicates that DSCT coronary angiography provides high diagnostic accuracy for assessment of CAD in a high pre-test probability population with extensive coronary calcifications and without heart rate control. (orig.)
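
The reported accuracy figures can be reproduced from the per-segment counts in the abstract: 54 true positives, 9 false-positive and 2 false-negative segment ratings (eight FP/one FN from calcifications plus one FP/one FN from motion), and, by subtraction from the 414 assessable segments, 349 true negatives:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, positive and negative predictive value
    from a 2x2 confusion table."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return sensitivity, specificity, ppv, npv

# 414 assessable segments: 414 - 54 - 9 - 2 = 349 true negatives.
sens, spec, ppv, npv = diagnostic_metrics(tp=54, fp=9, fn=2, tn=349)
# Matches the reported 96.4%, 97.5%, 85.7% and 99.4%.
```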

  10. Probability state modeling theory.

    Science.gov (United States)

    Bagwell, C Bruce; Hunsberger, Benjamin C; Herbert, Donald J; Munson, Mark E; Hill, Beth L; Bray, Chris M; Preffer, Frederic I

    2015-07-01

    As the technology of cytometry matures, there is mounting pressure to address two major issues with data analyses. The first issue is to develop new analysis methods for high-dimensional data that can directly reveal and quantify important characteristics associated with complex cellular biology. The other issue is to replace subjective and inaccurate gating with automated methods that objectively define subpopulations and account for population overlap due to measurement uncertainty. Probability state modeling (PSM) is a technique that addresses both of these issues. The theory and important algorithms associated with PSM are presented along with simple examples and general strategies for autonomous analyses. PSM is leveraged to better understand B-cell ontogeny in bone marrow in a companion Cytometry Part B manuscript. Three short relevant videos are available in the online supporting information for both of these papers. PSM avoids the dimensionality barrier normally associated with high-dimensionality modeling by using broadened quantile functions instead of frequency functions to represent the modulation of cellular epitopes as cells differentiate. Since modeling programs ultimately minimize or maximize one or more objective functions, they are particularly amenable to automation and, therefore, represent a viable alternative to subjective and inaccurate gating approaches.

  11. A Predictive Model for Selecting Patients with HCV Genotype 3 Chronic Infection with a High Probability of Sustained Virological Response to Peginterferon Alfa-2a/Ribavirin.

    Directory of Open Access Journals (Sweden)

    Tarik Asselah

    Full Text Available Access to direct-acting antiviral agents (DAAs) is restricted in some settings; thus, the European Association for the Study of the Liver recommends dual peginterferon/ribavirin (PegIFN/RBV) therapy wherever DAAs are unavailable. HCV genotype (GT) 3 infection is now the most difficult genotype to eradicate, and PegIFN/RBV remains an effective option. The goal of this study was to devise a simple predictive score to identify GT3 patients with a high probability of achieving a sustained virologic response (SVR) with PegIFN alfa-2a/RBV therapy. Relationships between baseline characteristics and SVR were explored by multiple logistic regression models and used to develop a simple scoring system to predict SVR using data from 1239 treatment-naive GT3 patients who received PegIFN alfa-2a/RBV for 24 weeks in two large observational cohort studies. The score was validated using a database of 473 patients. Points were assigned for six factors (age, bodyweight, cirrhosis status, ALT level, platelet count and HCV RNA), for example: age ≤40 years: 2 points, >40 but ≤55 years: 1 point; platelet count >200: 2 points, ≥100 but <200: 1 point; HCV RNA <400,000 IU/mL: 1 point. The points are summed to arrive at a score ranging from 0‒10, where higher scores indicate higher chances of SVR; 141, 123, 203, 249, 232, and 218 patients had total scores of 0‒4, 5, 6, 7, 8, and 9‒10, respectively, among whom SVR rates were 45%, 62%, 72%, 76%, 84%, and 89%. Among 622 patients who had scores of 6‒10 and HCV RNA <50 IU/mL by treatment week 4, the SVR rate was 86% (532/622). A simple baseline scoring system involving age, bodyweight, cirrhosis status, ALT level, platelet count and HCV RNA level can be used to identify treatment-naive Caucasian patients with HCV GT3 infection with a high probability of SVR with PegIFN alfa-2a/RBV therapy.

  12. On Quantum Conditional Probability

    Directory of Open Access Journals (Sweden)

    Isabel Guerra Bobo

    2013-02-01

    Full Text Available We argue that quantum theory does not allow for a generalization of the notion of classical conditional probability by showing that the probability defined by the Lüders rule, standardly interpreted in the literature as the quantum-mechanical conditionalization rule, cannot be interpreted as such.
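
A small numerical illustration (not from the paper) of why Lüders conditionalization resists a classical reading: for non-commuting projectors it violates the law of total probability that any classical conditional probability must satisfy. Here the state is the pure qubit state |+⟩ and we condition on the {|0⟩, |1⟩} measurement:

```python
import numpy as np

def luders(rho, P, Q):
    """Lüders rule: p(Q | P) = Tr(P rho P Q) / Tr(rho P), for projectors P, Q."""
    post = P @ rho @ P
    return (np.trace(post @ Q) / np.trace(rho @ P)).real

P0 = np.array([[1.0, 0.0], [0.0, 0.0]])      # projector onto |0>
P1 = np.eye(2) - P0                          # projector onto |1>
Qplus = np.array([[0.5, 0.5], [0.5, 0.5]])   # projector onto |+>
rho = Qplus.copy()                           # pure state |+><+|

direct = np.trace(rho @ Qplus).real          # p(Q) = 1: the state is |+>
total = sum(luders(rho, P, Qplus) * np.trace(rho @ P).real for P in (P0, P1))
# Classically, total probability over {P0, P1} would equal p(Q);
# here total = 0.5 while direct = 1.0.
```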

  13. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...

  14. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  16. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...
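
For the best-known ARUM, the multinomial logit model, the CPGF is the logsumexp function G(u) = log Σᵢ exp(uᵢ), and its gradient indeed returns the choice probabilities. A quick numerical check of that gradient property (the example utilities are arbitrary):

```python
import math

def cpgf(u):
    """Logsumexp: the choice-probability generating function of the
    multinomial logit model (shifted by max(u) for numerical stability)."""
    m = max(u)
    return m + math.log(sum(math.exp(x - m) for x in u))

def choice_probs(u):
    """Multinomial logit (softmax) choice probabilities."""
    m = max(u)
    e = [math.exp(x - m) for x in u]
    s = sum(e)
    return [x / s for x in e]

u = [1.0, 0.5, -0.2]
p = choice_probs(u)
eps = 1e-6
# Finite-difference gradient of the CPGF recovers the choice probabilities:
grad = [(cpgf([x + eps if i == j else x for j, x in enumerate(u)]) - cpgf(u)) / eps
        for i in range(len(u))]
```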

  17. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...

  18. Study on a high-probability interception technique for airborne LPI radar

    Institute of Scientific and Technical Information of China (English)

    邹顺; 胡元奎; 张海黎

    2011-01-01

    The anti-reconnaissance and anti-jamming ability of airborne radars can be greatly improved through flexible beam control, power management, and advanced waveform-design techniques. A reconnaissance method is put forward that comprehensively uses digital-array-based reconnaissance, channelized detection based on polyphase filtering, and signal sorting based on fuzzy clustering; this method promises to intercept airborne LPI radar signals with high probability.
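
A block-FFT energy detector gives the flavor of channelized detection (a crude stand-in for the polyphase filter bank described here; all signal parameters below are invented for the demo):

```python
import numpy as np

def channelized_detect(x, n_channels, threshold_db=10.0):
    """Split x into blocks, FFT each block (a crude channelizer), average
    the per-channel power, and flag channels whose power exceeds the
    median noise floor by threshold_db."""
    n_blocks = len(x) // n_channels
    blocks = x[: n_blocks * n_channels].reshape(n_blocks, n_channels)
    power = (np.abs(np.fft.fft(blocks, axis=1)) ** 2).mean(axis=0)
    floor = np.median(power)
    return np.flatnonzero(10 * np.log10(power / floor) > threshold_db)

rng = np.random.default_rng(1)
n = 64 * 256
t = np.arange(n)
# A low-SNR tone (-3 dB against unit-variance noise) at 10/64 cycles/sample:
x = np.cos(2 * np.pi * (10 / 64) * t) + rng.standard_normal(n)
hits = channelized_detect(x, 64)   # real tone lands in channels 10 and 64-10=54
```

Averaging power across many blocks is what lets a weak, persistent emission climb above the per-channel noise floor, which is the basic route to intercepting low-power LPI waveforms.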

  19. Minimization of outage probability of WiMAX link supported by laser link between a high-altitude platform and a satellite.

    Science.gov (United States)

    Arnon, Shlomi

    2009-07-01

    Various technologies for the implementation of a WiMAX (IEEE802.16) base station on board a high-altitude platform (HAP) are currently being researched. The network configuration under consideration includes a satellite, several HAPs, and subscribers on the ground. The WiMAX base station is positioned on the satellite and connects with the HAP via an analog RF over-laser communication (LC) link. The HAPs house a transparent transponder that converts the optic signal to a WiMAX RF signal and the reverse. The LC system consists of a laser transmitter and an optical receiver that need to be strictly aligned to achieve a line-of-sight link. However, mechanical vibration and electronic noise in the control system challenge the transmitter-receiver alignment and cause pointing errors. The outcome of pointing errors is fading of the received signal, which leads to impaired link performance. In this paper, we derive the value of laser transmitter gain that can minimize the outage probability of the WiMAX link. The results indicate that the optimum value of the laser transmitter gain is not a function of the pointing error statistics.
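
The shape of that optimization can be sketched with the standard pointing-loss model L_p = exp(−G·θ²) and a Rayleigh-distributed pointing-error angle θ (illustrative normalized parameters; this is not the paper's full WiMAX link budget). Outage is the probability that the effective gain G·exp(−G·θ²) drops below a threshold γ, which under this model is exp(−ln(G/γ)/(2σ²G)) for G > γ; a grid search recovers the analytic optimum G* = e·γ:

```python
import math

def outage(gain, threshold, sigma):
    """P(gain * exp(-gain * theta^2) < threshold) for a Rayleigh(sigma)
    pointing error: exp(-ln(G/threshold) / (2*sigma^2*G)) when G > threshold."""
    if gain <= threshold:
        return 1.0
    return math.exp(-math.log(gain / threshold) / (2.0 * sigma ** 2 * gain))

threshold, sigma = 2.0, 0.1          # normalized gain threshold, jitter (rad)
gains = [2.0 + 0.001 * k for k in range(1, 20000)]
best = min(gains, key=lambda g: outage(g, threshold, sigma))
# Analytic optimum for this model: G* = e * threshold (about 5.44 here);
# too little gain gives too weak a signal, too much narrows the beam so
# that pointing jitter causes fades.
```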

  20. BL153 Partially Prevents High-Fat Diet Induced Liver Damage Probably via Inhibition of Lipid Accumulation, Inflammation, and Oxidative Stress

    Directory of Open Access Journals (Sweden)

    Jian Wang

    2014-01-01

    Full Text Available The present study investigated whether a magnolia extract, named BL153, can prevent obesity-induced liver damage, and sought to identify the possible protective mechanism. To this end, obesity was induced in mice by feeding a high-fat diet (HFD, 60% kcal as fat), while age-matched control mice were fed a control diet (10% kcal as fat) for 6 months. Simultaneously, these mice were treated with or without BL153 daily at three dose levels (2.5, 5, and 10 mg/kg) by gavage. HFD feeding significantly increased body weight and liver weight. Administration of BL153 significantly reduced the liver weight without affecting body weight. As a critical step in the development of NAFLD, hepatic fibrosis was induced in the mice fed the HFD, shown by upregulated expression of connective tissue growth factor and transforming growth factor beta 1, both of which were significantly attenuated by BL153 in a dose-dependent manner. A mechanistic study revealed that BL153 significantly suppressed HFD-induced hepatic lipid accumulation and oxidative stress and slightly attenuated liver inflammation. These results suggest that HFD-induced liver fibrosis can be partially prevented by BL153, probably through reduction of hepatic lipid accumulation, inflammation, and oxidative stress.

  1. Evidence of a role for Th17 cells in the breach of immune tolerance in arthritis

    OpenAIRE

    Yu, Xinhua; Ibrahim, Saleh M.

    2011-01-01

    Th17 cells are thought to play a pathogenic role in various autoimmune diseases. Cytokines secreted by Th17 cells, such as IL-17, IL-17F and IL-22, have the capacity to mediate a massive inflammatory response. These proinflammatory cytokines are likely to mediate the pathogenic potential of Th17 cells. Recent evidence suggests a role for Th17 cells in the breach of immune tolerance. This might shed some new light on the pathogenic role of Th17 cells in autoimmunity.

  2. A Model for Understanding the Relationship Between Transaction Costs and Acquisition Cost Breaches

    Science.gov (United States)

    2014-04-30


  3. Digging Out the Root Cause: Nunn-McCurdy Breaches in Major Defense Acquisition Programs

    Science.gov (United States)

    2014-04-30

    Bill Shelton, RAND Corporation; Irv Blickstein, RAND Corporation; Jerry Sollinger, RAND Corporation; Charles Nemfakos, RAND Corporation

  4. Comparative Remedies for Breach of Contract, hrsg. von Nili Cohen und Ewan McKendrick

    OpenAIRE

    Müller-Chen, Markus

    2007-01-01

    Nili Cohen, professor at Tel Aviv University, and Ewan McKendrick, professor at Oxford, have taken on the commendable task of editing a comparative-law volume on one of the core areas of the law of obligations, the consequences of breach of contract. The contributions are the outcome of a conference on "Remedies for Breach of Contract" held at Tel Aviv University in 2002. The contractual remedies are examined from the perspective of the common law,...

  5. Once more unto the breach managing information security in an uncertain world

    CERN Document Server

    Simmons, Andrea C

    2012-01-01

    In Once more unto the Breach, Andrea C Simmons speaks directly to information security managers and provides an insider's view of the role, offering priceless gems from her extensive experience and knowledge. Based on a typical year in the life of an information security manager, the book examines how the general principles can be applied to all situations and discusses the lessons learnt from a real project.

  6. Better than Fuller:a two interests model of remedies for breach of contract

    OpenAIRE

    Campbell, David

    2015-01-01

    The attempt to combine the contractual interests properly so-called with the restitution interest in the Fuller and Purdue three interests model of remedies for breach of contract is ineradicably incoherent. Stimulated by reflection on contemporary restitution doctrine’s understanding of the quasi-contractual remedies of recovery and quantum meruit, this paper argues that the complete elimination from the law of contract of the restitution interest, which incorporates those remedies into the ...

  8. The trajectory of the target probability effect.

    Science.gov (United States)

    Hon, Nicholas; Yap, Melvin J; Jabar, Syaheed B

    2013-05-01

    The effect of target probability on detection times is well-established: Even when detection accuracy is high, lower probability targets are detected more slowly than higher probability ones. Although this target probability effect on detection times has been well-studied, one aspect of it has remained largely unexamined: How the effect develops over the span of an experiment. Here, we investigated this issue with two detection experiments that assessed different target probability ratios. Conventional block segment analysis and linear mixed-effects modeling converged on two key findings. First, we found that the magnitude of the target probability effect increases as one progresses through a block of trials. Second, we found, by examining the trajectories of the low- and high-probability targets, that this increase in effect magnitude was driven by the low-probability targets. Specifically, we found that low-probability targets were detected more slowly as a block of trials progressed. Performance to high-probability targets, on the other hand, was largely invariant across the block. The latter finding is of particular interest because it cannot be reconciled with accounts that propose that the target probability effect is driven by the high-probability targets.

  9. Whip Rule Breaches in a Major Australian Racing Jurisdiction: Welfare and Regulatory Implications.

    Science.gov (United States)

    Hood, Jennifer; McDonald, Carolyn; Wilson, Bethany; McManus, Phil; McGreevy, Paul

    2017-01-16

Whip use in horseracing is increasingly being questioned on ethical, animal welfare, social sustainability, and legal grounds. Despite this, the evidence base for whip use and its regulation by Stewards in Australia is weak. To help address this, we characterised whip rule breaches recorded by Stewards, using Stewards Reports and Race Diaries from 2013 and 2016 in New South Wales (NSW) and the Australian Capital Territory (ACT). There were more recorded breaches at Metropolitan (M) than Country (C) or Provincial (P) locations, and by riders of horses that finished first, second, or third than by riders of horses that finished in other positions. The most commonly recorded breaches were forehand whip use on more than five occasions before the 100-metre (m) mark (44%) and whip use that raises the jockey's arm above shoulder height (24%). It is recommended that racing compliance data be analysed annually to inform the evidence base for policy, education, and regulatory change, and to ensure the welfare of racehorses and racing integrity.

  10. After the data breach: Managing the crisis and mitigating the impact.

    Science.gov (United States)

    Brown, Hart S

    2016-01-01

    Historically, the unauthorised access and theft of information was a tactic used between countries as part of espionage campaigns, during times of conflict as well as for personal and criminal purposes. The consumers of the information were relatively isolated and specific. As information became stored and digitised in larger quantities in the 1980s the ability to access mass amounts of records at one time became possible. The expertise needed to remotely access and exfiltrate the data was not readily available and the number of markets to monetise the data was limited. Over the past ten years, shadow networks have been used by criminals to collaborate on hacking techniques, exchange hacking advice anonymously and commercialise data on the black market. The intersection of these networks along with the unintentional losses of information have resulted in 5,810 data breaches made public since 2005 (comprising some 847,807,830 records) and the velocity of these events is increasing. Organisations must be prepared for a potential breach event to maintain cyber resiliency. Proper management of a breach response can reduce response costs and can serve to mitigate potential reputational losses.

  11. Breaching the skin barrier--insights from molecular simulation of model membranes.

    Science.gov (United States)

    Notman, Rebecca; Anwar, Jamshed

    2013-02-01

    Breaching the skin's barrier function by design is an important strategy for delivering drugs and vaccines to the body. However, while there are many proposed approaches for reversibly breaching the skin barrier, our understanding of the molecular processes involved is still rudimentary. Molecular simulation offers an unprecedented molecular-level resolution with an ability to reproduce molecular and bulk level properties. We review the basis of the molecular simulation methodology and give applications of relevance to the skin lipid barrier, focusing on permeation of molecules and chemical approaches for breaching the lipid barrier by design. The bulk kinetic model based on Fick's Law describing absorption of a drug through skin has been reconciled with statistical mechanical quantities such as the local excess chemical potential and local diffusion coefficient within the membrane structure. Applications of molecular simulation reviewed include investigations of the structure and dynamics of simple models of skin lipids, calculation of the permeability of molecules in simple model membranes, and mechanisms of action of the penetration enhancers, DMSO, ethanol and oleic acid. The studies reviewed illustrate the power and potential of molecular simulation to yield important physical insights, inform and rationalize experimental studies, and to predict structural changes, and kinetic and thermodynamic quantities. Copyright © 2012 Elsevier B.V. All rights reserved.
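The reconciliation of Fick's-Law absorption kinetics with molecular-level quantities described above can be illustrated at the bulk level. The following is a minimal sketch, with illustrative numbers and function names of my own choosing (not taken from the reviewed simulations), of steady-state membrane flux under Fick's first law, with the permeability coefficient decomposed into partition coefficient, diffusion coefficient, and membrane thickness:

```python
# Hedged sketch: steady-state transdermal flux via Fick's first law,
# J = P * (C_donor - C_receiver), with the permeability coefficient
# decomposed as P = K * D / h. All numbers below are illustrative
# assumptions, not values from the reviewed studies.

def permeability(K, D, h):
    """P (cm/s) from partition coefficient K (dimensionless),
    diffusion coefficient D (cm^2/s), and membrane thickness h (cm)."""
    return K * D / h

def steady_state_flux(K, D, h, c_donor, c_receiver=0.0):
    """Flux J (mol/cm^2/s); concentrations in mol/cm^3."""
    return permeability(K, D, h) * (c_donor - c_receiver)

P = permeability(K=0.1, D=1e-7, h=1e-3)               # 1e-5 cm/s
J = steady_state_flux(0.1, 1e-7, 1e-3, c_donor=1e-6)  # 1e-11 mol/cm^2/s
```

In simulation work, K and D would be obtained from the local excess chemical potential and local diffusion coefficient within the membrane, which is precisely the link the review draws.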

  12. Single-Beam Bathymetry Point Data Shapefile of the Hurricane Sandy Breach at Fire Island, New York, June 2013

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This dataset, 20130626_bathy_points.zip, consists of single-beam point data collected in June 2013 during a bathymetry survey of the Wilderness Breach and adjacent...

  13. Single-Beam XYZ Point Bathymetry Data of the Hurricane Sandy Breach at Fire Island, New York, June 2013

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This dataset, 20130626_bathy_xyz.zip, consists of single-beam point data collected in June 2013 during a bathymetry survey of the Wilderness Breach and adjacent...

  15. Coastal bathymetry data collected in June 2014 from Fire Island, New York—The wilderness breach and shoreface

    Science.gov (United States)

    Nelson, Timothy R.; Miselis, Jennifer L.; Hapke, Cheryl J.; Wilson, Kathleen E.; Henderson, Rachel E.; Brenner, Owen T.; Reynolds, Billy J.; Hansen, Mark E.

    2016-08-02

    Scientists from the U.S. Geological Survey St. Petersburg Coastal and Marine Science Center in St. Petersburg, Florida, collected bathymetric data along the upper shoreface and within the wilderness breach at Fire Island, New York, in June 2014. The U.S. Geological Survey is involved in a post-Hurricane Sandy effort to map and monitor the morphologic evolution of the shoreface along Fire Island and model the evolution of the wilderness breach as a part of the Hurricane Sandy Supplemental Project GS2-2B. During this study, bathymetry was collected with single-beam echo sounders and global positioning systems, mounted to personal watercraft, along the Fire Island shoreface and within the wilderness breach. Additional bathymetry was collected using backpack global positioning systems along the flood shoals and shallow channels within the wilderness breach.

  16. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  17. Followup to Columbia Investigation: Reinforced Carbon/Carbon From the Breach Location in the Wing Leading Edge Studied

    Science.gov (United States)

    Jacobson, Nathan S.; Opila, Elizabeth J.; Tallant, David

    2005-01-01

Initial estimates of the temperature and conditions of the breach in the Space Shuttle Columbia's wing focused on analyses of the slag deposits. These deposits are complex mixtures of the reinforced carbon/carbon (RCC) constituents, insulation material, and wing structural materials. Identification of melted/solidified Cerachrome insulation (Thermal Ceramics, Inc., Augusta, GA) indicated that temperatures at the breach had exceeded 1760 °C.

  18. Why won’t talents return home? : a case study of contract breach by graduates of the Program of Training High-Caliber Backbone Personnel from the Ethnic Minorities = ¿Por qué el talento no vuelve a casa? : un estudio de caso sobre la brecha contractual entre graduados de Grupos Minoritarios del Programa para la Formación de Alto Nivel de Personal Backbone

    Directory of Open Access Journals (Sweden)

    Zhiyong Zhu

    2014-01-01

In 2004, the Ministry of Education, National Development and Reform Commission, State Ethnic Affairs Commission, Ministry of Finance, and Ministry of Human Resources and Social Security jointly released a special document, On Training High-Caliber Backbone Personnel from the Ethnic Minorities, as a supporting measure for the human-resources strategy in China's western development. The central government regards it as an important force in developing the western region, especially the areas where ethnic minority groups reside. However, more and more graduates of the Program have breached, or intend to breach, its employment contract. Why do they not abide by the contract and return to their home provinces to work after benefiting from the preferential Program? This case study focused on three graduates who broke the Program's employment contract, traced the origin and development of their thinking and actions, and examined what lay behind the phenomenon. Through in-depth interaction with these ethnic minority students, the research found that the social context in which higher education policies for ethnic minorities operate has changed greatly, and these changes have challenged the implementation of the Program. Furthermore, under the socialist market economy system, the Program's employment contract could not be administered and enforced as the actors involved had expected. Last but not least, the graduates who benefited from the Program have become increasingly individualistic and diversified in their career development, for a variety of micro and macro reasons.

  19. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  20. Concepts of probability theory

    CERN Document Server

    Pfeiffer, Paul E

    1979-01-01

    Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, more. For advanced undergraduates students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.

  1. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton‘s laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world‘s foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their  explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive. 

  2. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability, across psychological aspects of formulating subjective probability statements, abstract measure-theoretical considerations, contributions to theoretical statistics an...

  3. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  4. THE NUCLEAR ENCOUNTER PROBABILITY

    NARCIS (Netherlands)

    SMULDERS, PJM

    1994-01-01

This Letter discusses the nuclear encounter probability as used in ion channeling analysis. A formulation is given, incorporating effects of large beam angles and beam divergence. A critical examination of previous definitions is made.

  5. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  6. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  7. Critical Approach to Methods of Glacier Reconstruction in High Asia and Discussion of the Probability of a Qinghai-Xizang (Tibetan) Inland Ice

    Institute of Scientific and Technical Information of China (English)

    Matthias Kuhle

    2007-01-01

This overview discusses old and new results in the 35-year controversy over past glacier extension in High Asia and attempts to come closer to a solution. H. v. Wissmann's interpretation (1959) of a small-scale glaciation contrasts with M. Kuhle's reconstruction (1974) of a large-scale glaciation, with a Qinghai-Xizang (Tibetan) inland glaciation extending over 2.4 million km² and a Himalaya-Karakorum ice-stream network. Both opinions find support but also contradiction in the international and Chinese literature (Academia Sinica). The solution of this question is of supraregional importance because of the subtropical position of the areas concerned. In the case of large, albedo-intensive ice surfaces, global cooling would be the energetic consequence, and furthermore a breakdown of the summer monsoon. The current and interglacial heat-low above the very effective heating panel of the Qinghai-Xizang (Tibetan) Plateau, which exceeds 4000 m and gives rise to this monsoon circulation, would be replaced by the cold-high of an inland ice sheet. In addition, the plate-tectonically created Pleistocene history of the uplift of High Asia - where it reached beyond the snowline (ELA) - would attain great, perhaps global, paleoclimatic importance. In the case of a heavy superimposed ice load, the question arises of a glacio-isostatic interruption of this primary uplift. The production of the loesses deposited in NE China and their very probable glacial genesis, as well as a eustatic lowering of the sea level by 5 to 7 m in the maximum case of glaciation, are immediately tied up with the question of glaciation discussed here. Not least, the problems of the biotopes of the sanctuary-centres of flora and fauna, i.e., interglacial re-settlement, also depend on it. On the basis of this Quaternary-geomorphological-glaciological connection, future contributions are requested on the past glaciation, the current

  8. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

By using a fluid theory which is an alternative to quantum theory, but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density is related to the time spent on small sections of an orbit, just as the probability density is in some classical contexts.

  9. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  10. Monte Carlo transition probabilities

    OpenAIRE

    Lucy, L. B.

    2001-01-01

    Transition probabilities governing the interaction of energy packets and matter are derived that allow Monte Carlo NLTE transfer codes to be constructed without simplifying the treatment of line formation. These probabilities are such that the Monte Carlo calculation asymptotically recovers the local emissivity of a gas in statistical equilibrium. Numerical experiments with one-point statistical equilibrium problems for Fe II and Hydrogen confirm this asymptotic behaviour. In addition, the re...

  11. EVALUATING DAMAGE ASSESSMENT OF BREACHES ALONG THE EMBANKMENTS OF INDUS RIVER DURING FLOOD 2010 USING REMOTE SENSING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    R. Ahmad

    2013-09-01

Natural disasters cause human suffering and property loss if not managed properly. They cannot be prevented, but their adverse impacts can be reduced through proper planning and disaster mitigation measures. The floods triggered by heavy rains during July 2010 in Pakistan caused swelling of rivers, leading to human, agricultural, livestock and property losses over almost all of the country. The heavy rains in the upper part of the country were attributed to the La Niña effect. Accumulated water in the rivers' floodplains overtopped and breached flood-protection infrastructure. Flood damage, particularly in Sindh province, was caused by breaches in the embankments, and even months after flood recession in the rivers, flood water still affected settled areas in the province. This study evaluates the role of satellite remote sensing, particularly in the assessment of breaches and consequential damages, as well as measures leading to minimizing the effects of floods caused by breaches in flood-protection infrastructure. More than 50 SPOT-5 images were used for this purpose; breached areas were delineated using pre- and post-flood images, and rehabilitation works were subsequently monitored. A total of 136 breaches were delineated, of which 60 were in Punjab and 76 in Sindh province. The study demonstrates the potential of satellite remote sensing for mapping and monitoring natural disasters and devising mitigation strategies.
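The pre- and post-flood delineation step described above can be sketched generically. The following is a hedged toy illustration of change detection on binary water masks (not the study's actual SPOT-5 workflow, and all mask values are invented): newly flooded cells are those wet in the post-event mask but dry in the pre-event mask.

```python
# Hedged sketch: generic change detection on binary water masks.
# A cell counts as breach-related flooding if it is wet (1) after the
# event but was dry (0) before it. Masks here are toy examples.

def breach_flood_cells(pre_mask, post_mask):
    """Return a grid marking cells wet after the event but dry before."""
    return [[int(post and not pre) for pre, post in zip(row_pre, row_post)]
            for row_pre, row_post in zip(pre_mask, post_mask)]

pre  = [[0, 0], [0, 1]]
post = [[1, 0], [1, 1]]
new_flood = breach_flood_cells(pre, post)   # [[1, 0], [1, 0]]
```

In practice the masks would be derived per image (for example by classifying water versus land) before differencing; the differencing itself is this simple.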

  12. Managing and understanding risk perception of surface leaks from CCS sites: risk assessment for emerging technologies and low-probability, high-consequence events

    Science.gov (United States)

    Augustin, C. M.

    2015-12-01

Carbon capture and storage (CCS) has been suggested by the Intergovernmental Panel on Climate Change as a partial solution to the greenhouse gas emissions problem. As CCS has become mainstream, researchers have raised multiple risk assessment issues typical of emerging technologies. In our research, we examine issues occurring when stored carbon dioxide (CO2) migrates to the near-surface or surface. We believe that both the public misperception and the physical reality of the potential environmental, health, and commercial impacts of leak events from such subsurface sites have prevented widespread adoption of CCS. This paper is presented in three parts; the first is an evaluation of the systemic risk of a CCS site CO2 leak, with models indicating the potential likelihood of a leakage event. As the likelihood of a CCS site leak is stochastic and nonlinear, we present several Bayesian simulations for leak events based on research done with other low-probability, high-consequence gaseous pollutant releases. Though we found a large, acute leak to be exceptionally rare, we demonstrate the potential for a localized, chronic leak at a CCS site. To that end, we present the second piece of this paper. Using a combination of spatio-temporal models and reaction-path models, we demonstrate the interplay between leak migration, material interactions, and atmospheric dispersion for leaks of various durations and volumes. These leak-event scenarios have implications for human, environmental, and economic health; they also have a significant impact on implementation support. Public acceptance of CCS is essential for a national low-carbon future, and this is what we address in the final part of this paper. We demonstrate that CCS remains unknown to the general public in the United States. Despite its unknown state, we provide survey findings, analyzed in Slovic and Weber's 2002 framework, that show a high-unknown, high-dread risk perception of leaks from a CCS site. Secondary findings are a
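Bayesian treatment of a rare leak event of the kind described above can be sketched with a simple conjugate model. This is an assumption-laden toy, not the paper's simulations: a Beta prior over the annual leak probability is updated with invented observation counts, and the posterior predictive chance of at least one leak over an operating horizon is estimated by Monte Carlo.

```python
import random

# Hedged sketch (illustrative prior and counts, not the study's model):
# Beta-Binomial updating for a rare annual leak event, plus a Monte
# Carlo estimate of P(at least one leak over an operating horizon).

def posterior_params(alpha, beta, leaks, site_years):
    """Beta(alpha, beta) prior updated with observed leak counts."""
    return alpha + leaks, beta + site_years - leaks

def p_any_leak(alpha, beta, years, n_draws=100_000, seed=1):
    """Posterior predictive P(>=1 leak in `years`) by Monte Carlo."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_draws):
        p = rng.betavariate(alpha, beta)          # draw annual leak prob.
        if rng.random() < 1 - (1 - p) ** years:   # leak within horizon?
            hits += 1
    return hits / n_draws

a, b = posterior_params(alpha=1, beta=99, leaks=0, site_years=50)
risk = p_any_leak(a, b, years=30)
```

Even with zero observed leaks, the 30-year risk remains non-negligible under a diffuse prior, which is characteristic of low-probability, high-consequence settings.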

  13. Explaining agro-industrial contract breaches: the case of Brazilian tomatoes processing industry

    Directory of Open Access Journals (Sweden)

    Decio Zylbersztajn

    2007-12-01

Three hundred small tomato growers located in Brazil's northeastern states supplied a processing industry. In view of the large number of contract hazards and weak enforcement of clauses, managers decided to move to the Midwest, where a reduced number of larger farmers have been contracted. The industry blamed high transaction costs due to the weak public enforcement of property rights, and accused some farmers of selling the product on the market for fresh consumption. Farmers, in turn, blamed the industry for taking advantage of asymmetric information related to quality. This study presents an analysis of the contract architecture and an evaluation of the effects of transaction-cost-related variables on the likelihood of contract breaches. A panel data study with 1,523 observations and limited dependent variable models was formulated to test hypotheses based on transaction cost theory. Results show that opportunism and the absence of court guarantees of property rights precluded the possibility of achieving a stable contract relationship in the region.
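Limited-dependent-variable modelling of breach likelihood, as used above, can be illustrated with a logit sketch. The coefficients and covariate names here are hypothetical, not the paper's estimates:

```python
import math

# Hedged sketch (hypothetical coefficients and covariates, not the
# paper's estimation): a logit model for the probability of contract
# breach as a function of transaction-cost proxies.

def breach_probability(x, beta0=-2.0, betas=(1.5, 0.8)):
    """P(breach) = 1 / (1 + exp(-(b0 + b . x))) for covariate vector x."""
    z = beta0 + sum(b * xi for b, xi in zip(betas, x))
    return 1.0 / (1.0 + math.exp(-z))

# e.g. opportunism proxy = 1.0, weak-enforcement proxy = 0.5
p = breach_probability((1.0, 0.5))
```

With panel data, the same link function would be fitted by maximum likelihood over the 1,523 observations; the sketch only shows how a fitted model maps covariates to a breach probability.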

  14. Communicating Low-Probability High-Consequence Risk, Uncertainty and Expert Confidence: Induced Seismicity of Deep Geothermal Energy and Shale Gas.

    Science.gov (United States)

    Knoblauch, Theresa A K; Stauffacher, Michael; Trutnevyte, Evelina

    2017-08-10

    Subsurface energy activities entail the risk of induced seismicity including low-probability high-consequence (LPHC) events. For designing respective risk communication, the scientific literature lacks empirical evidence of how the public reacts to different written risk communication formats about such LPHC events and to related uncertainty or expert confidence. This study presents findings from an online experiment (N = 590) that empirically tested the public's responses to risk communication about induced seismicity and to different technology frames, namely deep geothermal energy (DGE) and shale gas (between-subject design). Three incrementally different formats of written risk communication were tested: (i) qualitative, (ii) qualitative and quantitative, and (iii) qualitative and quantitative with risk comparison. Respondents found the latter two the easiest to understand, the most exact, and liked them the most. Adding uncertainty and expert confidence statements made the risk communication less clear, less easy to understand and increased concern. Above all, the technology for which risks are communicated and its acceptance mattered strongly: respondents in the shale gas condition found the identical risk communication less trustworthy and more concerning than in the DGE conditions. They also liked the risk communication overall less. For practitioners in DGE or shale gas projects, the study shows that the public would appreciate efforts in describing LPHC risks with numbers and optionally risk comparisons. However, there seems to be a trade-off between aiming for transparency by disclosing uncertainty and limited expert confidence, and thereby decreasing clarity and increasing concern in the view of the public. © 2017 Society for Risk Analysis.

  15. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

The article deals with the probability analysis of a vibration isolation system for high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. External sources of low-frequency vibration may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Assuming a Gaussian distribution, the author estimates the probability that the relative displacement of the isolated mass remains below the vibration criteria. The problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. From this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated. Optimal system parameters, damping and natural frequency, are derived so that the probability of exceeding vibration criteria VC-E and VC-D is less than 0.04.
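The Gaussian exceedance computation at the heart of that analysis can be sketched directly. This is a minimal illustration with assumed parameters, not the paper's data: for a zero-mean Gaussian displacement, the probability of exceeding a vibration criterion follows from the error function.

```python
import math

# Hedged sketch (illustrative parameters, not the article's data):
# probability that a zero-mean Gaussian displacement exceeds a
# vibration criterion, P(|x| > limit) for x ~ N(0, sigma^2).

def p_exceed(sigma, limit):
    """Two-sided exceedance probability via the error function."""
    return 1.0 - math.erf(limit / (sigma * math.sqrt(2.0)))

# A criterion set at 3 sigma is exceeded with probability ~0.0027.
p = p_exceed(sigma=1.0, limit=3.0)
```

Choosing damping and natural frequency then amounts to shaping sigma (the response variance) so that this exceedance probability stays below the target, e.g. 0.04.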

  16. Management Perspectives Pertaining to Root Cause Analyses of Nunn-McCurdy Breaches: Contractor Motivations and Anticipating Breaches, Volume 6

    Science.gov (United States)

    2014-01-01

    Paul Kern, George Muellner, and Eleanor Spector. The COG members spent several days discussing and debating the issues. Their insights were highly...College Park, Md.: University of Maryland, Center for Public Policy and Private Enterprise, May 16, 2012; James Gill, “Incentive Arrangements for Space...Defense Acquisitions, College Park, Md.: Center for Public Policy and Private Enterprise, School of Public Policy, University of Maryland, February

  17. Conditional probability modulates visual search efficiency.

    Science.gov (United States)

    Cort, Bryan; Anderson, Britt

    2013-01-01

    We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability-the likelihood of a particular color given a particular combination of two cues-varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.
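    The split between an absolute probability of 0.5 and conditional probabilities from 0.1 to 0.9 can be made concrete with a toy cue design; the cue labels and the particular conditional values below are hypothetical, not the actual stimuli from the study.

```python
# Hypothetical cue design: four equally likely cue pairs, each with its own
# conditional probability that the target takes (say) the color red.
cond_p_red = {("A", "X"): 0.9, ("A", "Y"): 0.7, ("B", "X"): 0.3, ("B", "Y"): 0.1}

# Averaging over equally likely cue pairs, the absolute (marginal) probability
# of a red target is 0.5, even though no single conditional is near 0.5.
marginal_p_red = sum(cond_p_red.values()) / len(cond_p_red)
```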

  18. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

    Bryan eCort

    2013-10-01

    Full Text Available We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.

  19. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making.

  20. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  2. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science, which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  3. Modeling normal tissue complication probability from repetitive computed tomography scans during fractionated high-dose-rate brachytherapy and external beam radiotherapy of the uterine cervix.

    Science.gov (United States)

    Dale, E; Hellebust, T P; Skjønsberg, A; Høgberg, T; Olsen, D R

    2000-07-01

    To calculate the normal tissue complication probability (NTCP) of late radiation effects on the rectum and bladder from repetitive CT scans during fractionated high-dose-rate brachytherapy (HDRB) and external beam radiotherapy (EBRT) of the uterine cervix, and compare the NTCP with the clinical frequency of late effects. Fourteen patients with cancer of the uterine cervix (Stage IIb-IVa) underwent 3-6 (mean, 4.9) CT scans in treatment position during their course of HDRB using a ring applicator with an Iridium stepping source. The rectal and bladder walls were delineated on the treatment-planning system, such that a constant wall volume independent of organ filling was achieved. Dose-volume histograms (DVH) of the rectal and bladder walls were acquired. A method of summing multiple DVHs accounting for variable dose per fraction was applied to the DVHs of HDRB and EBRT, together with the Lyman-Kutcher NTCP model fitted to clinical dose-volume tolerance data from recent studies. The D(mean) of the DVH from EBRT was close to the D(max) for both the rectum and bladder, confirming that the DVH from EBRT corresponded to homogeneous whole-organ irradiation. The NTCP of the rectum was 19.7% (13.5%, 25.9%) (mean and 95% confidence interval), whereas the clinical frequency of late rectal sequelae (Grade 3-4, RTOG/EORTC) was 13% based on material from 200 patients. For the bladder the NTCP was 61.9% (46.8%, 76.9%), as compared to the clinical frequency of Grade 3-4 late effects of 14%. If only 1 CT scan from HDRB was assumed available, the relative uncertainty (standard deviation or SD) of the NTCP value for an arbitrary patient was 20-30%, whereas 4 CT scans provided an uncertainty of 12-13%. The NTCP for the rectum was almost consistent with the clinical frequency of late effects, whereas the NTCP for the bladder was too high. To obtain reliable (SD of 12-13%) NTCP values, 3-4 CT scans are needed during 5-7 fractions of HDRB treatments.
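    The Lyman NTCP model with the Kutcher-Burman DVH reduction used in the abstract can be sketched as follows; the parameter values (td50, m, n) and the example DVH are invented placeholders, not the organ-specific fits from the study.

```python
import math

def normal_cdf(t):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

def ntcp_lyman(dvh, td50=80.0, m=0.15, n=0.1):
    """Lyman NTCP with Kutcher-Burman effective-volume DVH reduction.

    dvh: list of (dose_in_Gy, fractional_volume) bins.
    td50: whole-organ dose with 50% complication probability (placeholder).
    m: slope parameter; n: volume-effect parameter (placeholders).
    """
    d_max = max(d for d, _ in dvh)
    # reduce the DVH to an effective volume irradiated uniformly at d_max
    v_eff = sum(v * (d / d_max) ** (1.0 / n) for d, v in dvh)
    td50_v = td50 * v_eff ** (-n)        # volume-adjusted tolerance dose
    t = (d_max - td50_v) / (m * td50_v)
    return normal_cdf(t)
```

    By construction, uniform whole-organ irradiation at td50 returns an NTCP of 0.5; partial-organ irradiation raises the tolerance dose and lowers the NTCP.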

  4. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability provided the same data used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...

  5. Probabilities from Envariance

    CERN Document Server

    Zurek, W H

    2004-01-01

    I show how probabilities arise in quantum physics by exploring implications of environment-assisted invariance, or envariance, a recently discovered symmetry exhibited by entangled quantum systems. Envariance of perfectly entangled states can be used to rigorously justify complete ignorance of the observer about the outcome of any measurement on either of the members of the entangled pair. Envariance leads to Born's rule, $p_k \propto |\psi_k|^2$. Probabilities derived in this manner are an objective reflection of the underlying state of the system: they reflect experimentally verifiable symmetries, and not just a subjective "state of knowledge" of the observer. The envariance-based approach is compared with, and found superior to, the key pre-quantum definitions of probability, including the standard definition based on the 'principle of indifference' due to Laplace, and the relative frequency approach advocated by von Mises. Implications of envariance for the interpretation of quantu...
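    The Born rule the abstract derives, $p_k \propto |\psi_k|^2$, is easy to state concretely; this is a plain normalization sketch of the rule itself, not Zurek's envariance derivation, and the example amplitudes are arbitrary.

```python
def born_probabilities(amplitudes):
    """Born rule: outcome probabilities proportional to |psi_k|^2,
    normalized over all outcomes."""
    weights = [abs(a) ** 2 for a in amplitudes]
    total = sum(weights)
    return [w / total for w in weights]

# arbitrary example amplitudes (unnormalized state vector)
p = born_probabilities([1 + 0j, 1j, 0.5 + 0.5j])
```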

  6. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and the crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look-out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving...

  7. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications...
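    The gradient property can be illustrated with the best-known special case: for the multinomial logit RUM the CPGF is the log-sum-exp of the utilities, and its gradient recovers the softmax choice probabilities. This is a small numerical sketch of that one case, not the paper's general construction.

```python
import math

def cpgf(u):
    """CPGF of the multinomial logit RUM: log-sum-exp of the utilities,
    computed with the usual max-shift for numerical stability."""
    m = max(u)
    return m + math.log(sum(math.exp(x - m) for x in u))

def choice_probabilities(u, eps=1e-6):
    """Forward-difference gradient of the CPGF; for the logit CPGF this
    numerically recovers the softmax choice probabilities."""
    g0 = cpgf(u)
    grads = []
    for i in range(len(u)):
        bumped = list(u)
        bumped[i] += eps
        grads.append((cpgf(bumped) - g0) / eps)
    return grads
```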

  8. Negative Probabilities and Contextuality

    CERN Document Server

    de Barros, J Acacio; Oas, Gary

    2015-01-01

    There has been a growing interest, both in physics and psychology, in understanding contextuality in experimentally observed quantities. Different approaches have been proposed to deal with contextual systems, and a promising one is contextuality-by-default, put forth by Dzhafarov and Kujala. The goal of this paper is to present a tutorial on a different approach: negative probabilities. We do so by presenting the overall theory of negative probabilities in a way that is consistent with contextuality-by-default and by examining with this theory some simple examples where contextuality appears, both in physics and psychology.

  9. Estimating tail probabilities

    Energy Technology Data Exchange (ETDEWEB)

    Carr, D.B.; Tolley, H.D.

    1982-12-01

    This paper investigates procedures for univariate nonparametric estimation of tail probabilities. Extrapolated values for tail probabilities beyond the data are also obtained based on the shape of the density in the tail. Several estimators which use exponential weighting are described. These are compared in a Monte Carlo study to nonweighted estimators, to the empirical cdf, to an integrated kernel, to a Fourier series estimate, to a penalized likelihood estimate and a maximum likelihood estimate. Selected weighted estimators are shown to compare favorably to many of these standard estimators for the sampling distributions investigated.
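    The two ingredients the abstract combines, an empirical estimate within the data and an extrapolation beyond it based on tail shape, can be sketched minimally. This is not one of the paper's weighted estimators: it is a plain empirical survival function with an exponential tail fitted to the largest order statistics, and the top-decile cutoff is an arbitrary assumption.

```python
import math

def tail_probability(sample, x):
    """Empirical P(X > x) inside the data range; beyond the largest
    observation, extrapolate an exponential tail fitted (via the mean
    excess) to the top decile of the sample."""
    data = sorted(sample)
    n = len(data)
    if x <= data[-1]:
        return sum(1 for v in data if v > x) / n
    k = max(1, n // 10)                   # top 10% as the tail (assumption)
    u = data[-k]                          # tail threshold
    beta = sum(v - u for v in data[-k:]) / k or 1.0   # mean excess over u
    # exponential tail: P(X > x) ~ (k/n) * exp(-(x - u) / beta)
    return (k / n) * math.exp(-(x - u) / beta)
```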

  10. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  11. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma

  12. Failure Predictions for Graphite Reflector Bricks in the Very High Temperature Reactor with the Prismatic Core Design

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Gyanender, E-mail: sing0550@umn.edu [Department of Mechanical Engineering, University of Minnesota, 111, Church St. SE, Minneapolis, MN 55455 (United States); Fok, Alex [Minnesota Dental Research in Biomaterials and Biomechanics, School of Dentistry, University of Minnesota, 515, Delaware St. SE, Minneapolis, MN 55455 (United States); Department of Mechanical Engineering, University of Minnesota, 111, Church St. SE, Minneapolis, MN 55455 (United States); Mantell, Susan [Department of Mechanical Engineering, University of Minnesota, 111, Church St. SE, Minneapolis, MN 55455 (United States)

    2017-06-15

    Highlights: • Failure probability of VHTR reflector bricks predicted through crack modeling. • Criterion chosen for defining failure strongly affects the predictions. • Breaching of the CRC could be significantly delayed through crack arrest. • Capability to predict crack initiation and propagation demonstrated. - Abstract: Graphite is used in nuclear reactor cores as a neutron moderator, reflector and structural material. The dimensions and physical properties of graphite change when it is exposed to neutron irradiation. The non-uniform changes in the dimensions and physical properties lead to the build-up of stresses over the course of time in the core components. When the stresses reach the critical limit, i.e. the strength of the material, cracking occurs and ultimately the components fail. In this paper, an explicit crack modeling approach to predict the probability of failure of a VHTR prismatic reactor core reflector brick is presented. Firstly, a constitutive model for graphite is constructed and used to predict the stress distribution in the reflector brick under in-reactor conditions of high temperature and irradiation. Fracture simulations are performed as part of a Monte Carlo analysis to predict the probability of failure. Failure probability is determined based on two different criteria for defining the failure time: (A) crack initiation and (B) crack extension to near the control rod channel. A significant difference is found between the failure probabilities based on the two criteria. It is predicted that the reflector bricks will start cracking during the time range of 5–9 years, while breaching of the control rod channels will occur during the period of 11–16 years. The results show that, due to crack arrest, there is a significant delay between crack initiation and breaching of the control rod channel.
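    The Monte Carlo step can be illustrated with a toy strength-versus-stress simulation; Weibull-distributed strength is a common model for brittle materials like graphite, but the scale and modulus values below are invented placeholders, not the material properties or the fracture simulations used in the paper.

```python
import math
import random

def failure_probability(stress_mpa, sigma0=25.0, m=10.0, trials=20000, seed=1):
    """Monte Carlo estimate of P(strength < stress) for a brittle component
    whose strength follows a Weibull(sigma0, m) law.
    sigma0 (scale, MPa) and m (Weibull modulus) are placeholders."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(trials):
        # inverse-transform sample of a Weibull strength
        strength = sigma0 * (-math.log(1.0 - rng.random())) ** (1.0 / m)
        if strength < stress_mpa:
            failures += 1
    return failures / trials
```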

  13. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrodinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general

  14. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  15. Varga: On Probability.

    Science.gov (United States)

    Varga, Tamas

    This booklet resulted from a 1980 visit by the author, a Hungarian mathematics educator, to the Teachers' Center Project at Southern Illinois University at Edwardsville. Included are activities and problems that make probablility concepts accessible to young children. The topics considered are: two probability games; choosing two beads; matching…

  17. On Probability Domains

    Science.gov (United States)

    Frič, Roman; Papčo, Martin

    2010-12-01

    Motivated by IF-probability theory (intuitionistic fuzzy), we study n-component probability domains in which each event represents a body of competing components and the range of a state represents a simplex S_n of n-tuples of possible rewards; the sum of the rewards is a number from [0,1]. For n = 1 we get fuzzy events, for example a bold algebra, and the corresponding fuzzy probability theory can be developed within the category ID of D-posets (equivalently effect algebras) of fuzzy sets and sequentially continuous D-homomorphisms. For n = 2 we get IF-events, i.e., pairs (μ, ν) of fuzzy sets μ, ν ∈ [0,1]^X such that μ(x) + ν(x) ≤ 1 for all x ∈ X, but we order our pairs (events) coordinatewise. Hence the structure of IF-events (where (μ₁, ν₁) ≤ (μ₂, ν₂) whenever μ₁ ≤ μ₂ and ν₂ ≤ ν₁) is different and, consequently, the resulting IF-probability theory models a different principle. The category ID is cogenerated by I = [0,1] (objects of ID are subobjects of powers I^X), has nice properties, and basic probabilistic notions and constructions are categorical. For example, states are morphisms. We introduce the category S_nD cogenerated by S_n = {(x₁, x₂, …, x_n) ∈ I^n : x₁ + x₂ + … + x_n ≤ 1}, carrying the coordinatewise partial order, difference, and sequential convergence, and we show how basic probability notions can be defined within S_nD.

  18. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  20. Electroconvulsive therapy, hypertensive surge, blood-brain barrier breach, and amnesia

    DEFF Research Database (Denmark)

    Andrade, Chittaranjan; Bolwig, Tom G

    2014-01-01

    Preclinical and clinical evidence show that electroconvulsive therapy (ECT)-induced intraictal surge in blood pressure may result in a small, transient breach in the blood-brain barrier, leading to mild cerebral edema and a possible leach of noxious substances from blood into brain tissues...... of blood pressure during electroconvulsive shocks attenuate electroconvulsive shock-induced amnestic changes; however, the evidence suggests that antihypertensive mechanisms may not necessarily be involved. Clinical studies involving pre-ECT administration of antihypertensive medications do not provide...

  1. Responding to moderate breaches in professionalism: an intervention for medical students.

    Science.gov (United States)

    Gill, Anne C; Nelson, Elizabeth A; Mian, Ayesha I; Raphael, Jean L; Rowley, David R; Mcguire, Amy L

    2015-02-01

    Much has been written about how we understand, teach and evaluate professionalism in medical training. Less often described are explicit responses to mild or moderate professionalism concerns in medical students. To address this need, Baylor College of Medicine created a mechanism to assess professionalism competency for medical students and policies to address breaches in professional behavior. This article describes the development of an intervention using a guided reflection model, student responses to the intervention, and how the program evolved into a credible resource for deans and other educational leaders.

  2. Breach of information duties in the B2C e-commerce: adequacy of available remedies

    Directory of Open Access Journals (Sweden)

    Zofia Bednarz

    2016-07-01

    Full Text Available

    B2C e-commerce is characterised by the information asymmetry between the contracting parties. Various information duties are imposed on traders, both at the European and national level, to correct this asymmetry and to ensure proper market functioning. The mandated disclosure is based on the assumption of consumers' rationality. However, developments in behavioural economics challenge this assumption. The utility of mandated disclosure in consumer contracts also depends on the remedies available to consumers in the case of a breach of information duties. Those remedies are often heavily influenced by the national general private law applicable to the contractual relationship between the parties. Nevertheless, since the economics of general contract law differ substantially from the principles of consumer e-commerce, various problems can be associated with the application of general-law remedies to the breach of information duties in B2C contracts. The limited value of the majority of online B2C transactions is incompatible with costly and lengthy court proceedings. Moreover, a breach of information duties will often not produce enough material damage on the side of the consumer to make the remedies available. Different solutions are explored, from ADR, to the duty to advise, to non-legal mechanisms making the information easier for consumers to use through limiting disclosure. Finally, the right of withdrawal is analysed as an example of a specific remedy, adapted to the economics of B2C electronic transactions, where the aims parties pursue through contracts are different than in commercial contracts, and their relationship is marked by inequality of economic power and information asymmetry. However, the legally established cooling-off period is not free from limitations, and only a combination of various measures, including effective

  3. Motive Matters! An exploration of the notion ‘deliberate breach of contract’ and its consequences for the application of remedies

    OpenAIRE

    Kogelenberg, Martijn

    2012-01-01

    This thesis explores the notion of deliberate breach of contract and its potential remedial consequences. In the major jurisdictions in Europe and in the United States the notion of deliberate breach of contract is generally not coherently and officially defined and acknowledged as an independent legal phenomenon. The ultimate added value of this thesis intends to be a first coherent comparative research on deliberate breach of contract and its potential consequences for the core ...

  4. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory.  Some remain the focus of controversy; others have allegedly been solved, however the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies.  Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  5. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, in general, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a

  6. Superpositions of probability distributions

    Science.gov (United States)

    Jizba, Petr; Kleinert, Hagen

    2008-09-01

    Probability distributions which can be obtained from superpositions of Gaussian distributions of different variances v=σ2 play a favored role in quantum theory and financial markets. Such superpositions need not necessarily obey the Chapman-Kolmogorov semigroup relation for Markovian processes because they may introduce memory effects. We derive the general form of the smearing distributions in v which do not destroy the semigroup property. The smearing technique has two immediate applications. It permits simplifying the system of Kramers-Moyal equations for smeared and unsmeared conditional probabilities, and can be conveniently implemented in the path integral calculus. In many cases, the superposition of path integrals can be evaluated much easier than the initial path integral. Three simple examples are presented, and it is shown how the technique is extended to quantum mechanics.
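    The variance-smearing idea can be illustrated numerically: a superposition of zero-mean Gaussians with different variances has heavier tails than a single Gaussian of the same mean variance. The two-component mixture weights below are arbitrary illustrative choices, not a smearing distribution from the paper.

```python
import math

def gauss_pdf(x, v):
    """Density of a zero-mean Gaussian with variance v."""
    return math.exp(-x * x / (2.0 * v)) / math.sqrt(2.0 * math.pi * v)

def smeared_pdf(x, components):
    """Superposition sum_i w_i * N(0, v_i) over (weight, variance) pairs."""
    return sum(w * gauss_pdf(x, v) for w, v in components)

# two variances smeared around a mean variance of 1.0 (illustrative weights)
mix = [(0.5, 0.5), (0.5, 1.5)]
```

    Even this crude mixture already shows the heavy-tail effect: at four standard deviations its density exceeds that of the single unit-variance Gaussian.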

  8. Probability theory and applications

    CERN Document Server

    Hsu, Elton P

    1999-01-01

    This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.

  9. Fractal probability laws.

    Science.gov (United States)

    Eliazar, Iddo; Klafter, Joseph

    2008-06-01

We explore six classes of fractal probability laws defined on the positive half-line: Weibull, Fréchet, Lévy, hyper Pareto, hyper beta, and hyper shot noise. Each of these classes admits a unique statistical power-law structure, and is uniquely associated with a certain operation of renormalization. All six classes turn out to be one-dimensional projections of underlying Poisson processes which, in turn, are the unique fixed points of Poissonian renormalizations. The first three classes correspond to linear Poissonian renormalizations and are intimately related to extreme value theory (Weibull, Fréchet) and to the central limit theorem (Lévy). The other three classes correspond to nonlinear Poissonian renormalizations. Pareto's law--commonly perceived as the "universal fractal probability distribution"--is merely a special case of the hyper Pareto class.

  10. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramér-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  11. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  12. Spatial probability aids visual stimulus discrimination

    Directory of Open Access Journals (Sweden)

    Michael Druker

    2010-08-01

We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid for visual stimulus discrimination, and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.

  13. Searching with Probabilities

    Science.gov (United States)

    1983-07-26

DeGroot, Morris H. Probability and Statistics. Addison-Wesley Publishing Company, Reading, Massachusetts, 1975. [Gillogly 78] Gillogly, J.J. Performance...distribution [DeGroot 75] has just begun. The beta distribution has several features that might make it a more reasonable choice. As with the normal-based...1982. [Cooley 65] Cooley, J.M. and Tukey, J.W. An algorithm for the machine calculation of complex Fourier series. Math. Comp. 19, 1965. [DeGroot 75

  14. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  15. A Combined Atmospheric Rivers and Geopotential Height Analysis for the Detection of High Streamflow Event Probability Occurrence in UK and Germany

    Science.gov (United States)

    Rosario Conticello, Federico; Cioffi, Francesco; Lall, Upmanu; Merz, Bruno

    2017-04-01

The role of atmospheric rivers (ARs) in inducing High Streamflow Events (HSEs) in Europe has been confirmed by numerous studies. Here, we define HSEs as streamflows exceeding the 99th percentile of the daily flowrate time series measured at streamflow gauges. Among the indicators of ARs are the Integrated Water Vapor (IWV) and the Integrated Water Vapor Transport (IVT). For both indicators the literature suggests thresholds for identifying ARs, and local thresholds of these indices are used to assess the occurrence of HSEs in a given region. Recent research on ARs still leaves several open issues: 1) The literature is not unanimous on which of the two indicators is better. 2) The selection of thresholds is based on subjective assessments. 3) The predictability of HSEs at the local scale associated with these indices seems to be weak and to exist only in the winter months. To address these issues, we propose an original methodology: (i) to determine which of the two indicators is more suitable for HSE prediction; (ii) to select IVT and/or IWV (IVT/IWV) local thresholds in a more objective way; (iii) to implement an algorithm able to determine whether an IVT/IWV configuration is inducing HSEs, regardless of the season. In pursuing this goal, besides the IWV and IVT fields, we introduce as a further predictor the geopotential height at 850 hPa (GPH850) field, which implicitly contains information about the pattern of temperature and the direction and intensity of the winds. The introduction of GPH850 should help improve the assessment of the occurrence of HSEs throughout the year. It is also plausible to hypothesize that IVT/IWV local thresholds vary with the GPH850 configuration. In this study, we propose a model to statistically relate these predictors, IVT/IWV and GPH850, to the simultaneous occurrence of HSEs in one or more streamflow gauges in UK and Germany.
Historical data from 57 streamflow gauges
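The HSE definition used above (daily flows above a gauge's 99th percentile) can be sketched as a short threshold filter. The synthetic lognormal series and all parameter values below are illustrative assumptions, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic daily flow series standing in for one gauge record
# (illustration only; the study uses measured data from 57 gauges).
flow = rng.lognormal(mean=2.0, sigma=0.6, size=10_000)

threshold = np.percentile(flow, 99)   # 99th-percentile flowrate
hse_days = flow > threshold           # boolean mask of HSE days

print(f"threshold = {threshold:.2f}")
print(f"HSE frequency = {hse_days.mean():.4f}")  # ~0.01 by construction
```

By construction roughly 1% of days are flagged; on real records the same filter would be applied per gauge before relating HSE occurrence to the IVT/IWV and GPH850 predictors.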

  16. The breach between academic studies and professional intervention in health field

    Directory of Open Access Journals (Sweden)

    Graciela H. Tonón

    2015-08-01

The main objective of this project was to define the gap between academic training and the professional practice of those working in the health field. The project was developed at the Psychology Research Center of the Faculty of Social Sciences of the Universidad de Palermo (Argentina), which served as a training space for students of the university's doctoral program in psychology. It is a qualitative descriptive study in which we applied semi-structured interviews to professionals working in private and public health institutions of the Ciudad Autónoma de Buenos Aires. Interviewees said that their university studies gave theory a more important role than practice, although differences can be observed by decade of graduation and type of university (public or private). They also recognized a breach between academic studies and professional practice, characterized by a lack of opportunities for practice before graduation and limited knowledge of the labor market, alongside the growing number of students in the universities. They recommended giving practice a central role in university studies and regaining recognition for health professionals from the population and governments.

  17. Breaching the security of the Kaiser Permanente Internet patient portal: the organizational foundations of information security.

    Science.gov (United States)

    Collmann, Jeff; Cooper, Ted

    2007-01-01

    This case study describes and analyzes a breach of the confidentiality and integrity of personally identified health information (e.g. appointment details, answers to patients' questions, medical advice) for over 800 Kaiser Permanente (KP) members through KP Online, a web-enabled health care portal. The authors obtained and analyzed multiple types of qualitative data about this incident including interviews with KP staff, incident reports, root cause analyses, and media reports. Reasons at multiple levels account for the breach, including the architecture of the information system, the motivations of individual staff members, and differences among the subcultures of individual groups within as well as technical and social relations across the Kaiser IT program. None of these reasons could be classified, strictly speaking, as "security violations." This case study, thus, suggests that, to protect sensitive patient information, health care organizations should build safe organizational contexts for complex health information systems in addition to complying with good information security practice and regulations such as the Health Insurance Portability and Accountability Act (HIPAA) of 1996.

  18. Improving Ranking Using Quantum Probability

    CERN Document Server

    Melucci, Massimo

    2011-01-01

The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, given the same data for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields a higher probability of detection than ranking by classical probability, for a given probability of false alarm and the same parameter estimation data. As quantum probability has provided more effective detectors than classical probability in domains other than data management, we conjecture that a system implementing subspace-based detectors will be more effective than a system implementing set-based detectors, the effectiveness being calculated as expected recall estimated over the probability of detection and expected fallout estimated over the probability of false alarm.

  19. Coastal bathymetry data collected in May 2015 from Fire Island, New York—Wilderness breach and shoreface

    Science.gov (United States)

    Nelson, Timothy R.; Miselis, Jennifer L.; Hapke, Cheryl J.; Brenner, Owen T.; Henderson, Rachel E.; Reynolds, Billy J.; Wilson, Kathleen E.

    2017-05-12

    Scientists from the U.S. Geological Survey (USGS) St. Petersburg Coastal and Marine Science Center in St. Petersburg, Florida, conducted a bathymetric survey of Fire Island from May 6-20, 2015. The USGS is involved in a post-Hurricane Sandy effort to map and monitor the morphologic evolution of the wilderness breach as a part of the Hurricane Sandy Supplemental Project GS2-2B. During this study, bathymetry data were collected with single-beam echo sounders and Global Positioning Systems, which were mounted to personal watercraft, along the Fire Island shoreface and within the wilderness breach. Additional bathymetry and elevation data were collected using backpack Global Positioning Systems on flood shoals and in shallow channels within the wilderness breach.

  20. Bathymetry data collected in October 2014 from Fire Island, New York—The wilderness breach, shoreface, and bay

    Science.gov (United States)

    Nelson, Timothy R.; Miselis, Jennifer L.; Hapke, Cheryl J.; Brenner, Owen T.; Henderson, Rachel E.; Reynolds, Billy J.; Wilson, Kathleen E.

    2017-03-24

    Scientists from the U.S. Geological Survey St. Petersburg Coastal and Marine Science Center in St. Petersburg, Florida, conducted a bathymetric survey of Fire Island, New York, from October 5 to 10, 2014. The U.S. Geological Survey is involved in a post-Hurricane Sandy effort to map and monitor the morphologic evolution of the wilderness breach, which formed in October 2012 during Hurricane Sandy, as part of the Hurricane Sandy Supplemental Project GS2-2B. During this study, bathymetry data were collected, using single-beam echo sounders and global positioning systems mounted to personal watercraft, along the Fire Island shoreface and within the wilderness breach, Fire Island Inlet, Narrow Bay, and Great South Bay east of Nicoll Bay. Additional bathymetry and elevation data were collected using backpack and wheel-mounted global positioning systems along the subaerial beach (foreshore and backshore), flood shoals, and shallow channels within the wilderness breach and adjacent shoreface.

  1. Probability distributions for multimeric systems.

    Science.gov (United States)

    Albert, Jaroslav; Rooman, Marianne

    2016-01-01

We propose a fast and accurate method of obtaining the equilibrium mono-modal joint probability distributions for multimeric systems. The method requires only two assumptions: that the copy number of all species of molecule may be treated as continuous, and that the probability density functions (pdf) are well approximated by multivariate skew normal distributions (MSND). Starting from the master equation, we convert the problem into a set of equations for the statistical moments which are then expressed in terms of the parameters intrinsic to the MSND. Using an optimization package on Mathematica, we minimize a Euclidean distance function comprising the sum of squared differences between the left- and right-hand sides of these equations. Comparison of results obtained via our method with those rendered by the Gillespie algorithm demonstrates our method to be highly accurate as well as efficient.

  2. Applying Popper's Probability

    CERN Document Server

    Whiting, Alan B

    2014-01-01

    Professor Sir Karl Popper (1902-1994) was one of the most influential philosophers of science of the twentieth century, best known for his doctrine of falsifiability. His axiomatic formulation of probability, however, is unknown to current scientists, though it is championed by several current philosophers of science as superior to the familiar version. Applying his system to problems identified by himself and his supporters, it is shown that it does not have some features he intended and does not solve the problems they have identified.

  3. Probably Almost Bayes Decisions

    DEFF Research Database (Denmark)

    Anoulova, S.; Fischer, Paul; Poelt, S.

    1996-01-01

In this paper, we investigate the problem of classifying objects which are given by feature vectors with Boolean entries. Our aim is to "(efficiently) learn probably almost optimal classifications" from examples. A classical approach in pattern recognition uses empirical estimations of the Bayesian discriminant functions for this purpose. We analyze this approach for different classes of distribution functions of Boolean features: kth order Bahadur-Lazarsfeld expansions and kth order Chow expansions. In both cases, we obtain upper bounds for the required sample size which are small polynomials in the relevant parameters and which match the lower bounds known for these classes. Moreover, the learning algorithms are efficient.

  4. Probability for physicists

    CERN Document Server

    Sirca, Simon

    2016-01-01

    This book is designed as a practical and intuitive introduction to probability, statistics and random quantities for physicists. The book aims at getting to the main points by a clear, hands-on exposition supported by well-illustrated and worked-out examples. A strong focus on applications in physics and other natural sciences is maintained throughout. In addition to basic concepts of random variables, distributions, expected values and statistics, the book discusses the notions of entropy, Markov processes, and fundamentals of random number generation and Monte-Carlo methods.

  5. Generalized Probability Functions

    Directory of Open Access Journals (Sweden)

    Alexandre Souto Martinez

    2009-01-01

From the integration of nonsymmetrical hyperbolas, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by mathematical curiosity, we show that these generalized functions are suitable for generalizing some probability density functions (pdfs). A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn our attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, and the generalized Gaussian and Laplace pdfs. Their cumulative functions and moments are also obtained analytically.
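One common one-parameter deformation matching this description is the Tsallis-type pair ln_q(x) = (x^q − 1)/q and its inverse exp_q(y) = (1 + qy)^(1/q), both reducing to ln and exp as q → 0. This is a sketch under that assumption; the exact form used in the paper may differ:

```python
import math

def gen_log(x, q):
    """One-parameter generalized logarithm; reduces to ln(x) as q -> 0."""
    if q == 0:
        return math.log(x)
    return (x**q - 1.0) / q

def gen_exp(y, q):
    """Inverse of gen_log; reduces to exp(y) as q -> 0."""
    if q == 0:
        return math.exp(y)
    return (1.0 + q * y) ** (1.0 / q)

# Inversion check and the q -> 0 limit
x, q = 2.5, 0.3
assert abs(gen_exp(gen_log(x, q), q) - x) < 1e-12
assert abs(gen_log(x, 1e-8) - math.log(x)) < 1e-6
```

Stretched-exponential and Zipf-Mandelbrot-like pdfs then arise by substituting gen_exp for the ordinary exponential in the familiar forms.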

  6. L-subshell fluorescence yields and Coster-Kronig transition probabilities with a reliable uncertainty budget for selected high- and medium-Z elements

    Science.gov (United States)

    Kolbe, Michael; Hönicke, Philipp; Müller, Matthias; Beckhoff, Burkhard

    2012-10-01

    Photon-in/photon-out experiments at thin specimens have been carried out to determine L-subshell fluorescence yields as well as Coster-Kronig transition probabilities of Au, Pb, Mo, and Pd using radiometrically calibrated instrumentation in the Physikalisch-Technische Bundesanstalt (PTB) laboratory at the electron storage ring BESSY II in Berlin. An advanced approach was developed in order to derive the fluorescence line intensities by means of line sets of each subshell that were corrected for self-absorption and broadened with experimentally determined detector response functions. The respective photoelectric cross sections for each subshell were determined by means of transmission measurements of the same samples without any change in the experimental operating condition. All values derived were compared to those of earlier works. A completely traceable uncertainty budget is provided for the determined values.

  7. Measure, integral and probability

    CERN Document Server

    Capiński, Marek

    2004-01-01

    Measure, Integral and Probability is a gentle introduction that makes measure and integration theory accessible to the average third-year undergraduate student. The ideas are developed at an easy pace in a form that is suitable for self-study, with an emphasis on clear explanations and concrete examples rather than abstract theory. For this second edition, the text has been thoroughly revised and expanded. New features include: · a substantial new chapter, featuring a constructive proof of the Radon-Nikodym theorem, an analysis of the structure of Lebesgue-Stieltjes measures, the Hahn-Jordan decomposition, and a brief introduction to martingales · key aspects of financial modelling, including the Black-Scholes formula, discussed briefly from a measure-theoretical perspective to help the reader understand the underlying mathematical framework. In addition, further exercises and examples are provided to encourage the reader to become directly involved with the material.

  8. Probabilities for Solar Siblings

    Science.gov (United States)

    Valtonen, Mauri; Bajkova, A. T.; Bobylev, V. V.; Mylläri, A.

    2015-02-01

    We have shown previously (Bobylev et al. Astron Lett 37:550-562, 2011) that some of the stars in the solar neighborhood today may have originated in the same star cluster as the Sun, and could thus be called Solar Siblings. In this work we investigate the sensitivity of this result to galactic models and to parameters of these models, and also extend the sample of orbits. There are a number of good candidates for the sibling category, but due to the long period of orbit evolution since the break-up of the birth cluster of the Sun, one can only attach probabilities of membership. We find that up to 10 % (but more likely around 1 %) of the members of the Sun's birth cluster could be still found within 100 pc from the Sun today.

  9. Probabilities for Solar Siblings

    CERN Document Server

    Valtonen, M; Bobylev, V V; Myllari, A

    2015-01-01

    We have shown previously (Bobylev et al 2011) that some of the stars in the Solar neighborhood today may have originated in the same star cluster as the Sun, and could thus be called Solar Siblings. In this work we investigate the sensitivity of this result to Galactic models and to parameters of these models, and also extend the sample of orbits. There are a number of good candidates for the Sibling category, but due to the long period of orbit evolution since the break-up of the birth cluster of the Sun, one can only attach probabilities of membership. We find that up to 10% (but more likely around 1 %) of the members of the Sun's birth cluster could be still found within 100 pc from the Sun today.

  10. Emptiness Formation Probability

    Science.gov (United States)

    Crawford, Nicholas; Ng, Stephen; Starr, Shannon

    2016-08-01

We present rigorous upper and lower bounds on the emptiness formation probability for the ground state of a spin-1/2 Heisenberg XXZ quantum spin system. For a d-dimensional system we find a rate of decay of the order exp(-cL^{d+1}), where L is the sidelength of the box in which we ask for the emptiness formation event to occur. In the d = 1 case this confirms previous predictions made in the integrable systems community, though our bounds do not achieve the precision predicted by Bethe ansatz calculations. On the other hand, our bounds in the case d ≥ 2 are new. The main tools we use are reflection positivity and a rigorous path integral expansion, which is a variation on those previously introduced by Toth, Aizenman-Nachtergaele and Ueltschi.

  11. Learning unbelievable marginal probabilities

    CERN Document Server

    Pitkow, Xaq; Miller, Ken D

    2011-01-01

    Loopy belief propagation performs approximate inference on graphical models with loops. One might hope to compensate for the approximation by adjusting model parameters. Learning algorithms for this purpose have been explored previously, and the claim has been made that every set of locally consistent marginals can arise from belief propagation run on a graphical model. On the contrary, here we show that many probability distributions have marginals that cannot be reached by belief propagation using any set of model parameters or any learning algorithm. We call such marginals `unbelievable.' This problem occurs whenever the Hessian of the Bethe free energy is not positive-definite at the target marginals. All learning algorithms for belief propagation necessarily fail in these cases, producing beliefs or sets of beliefs that may even be worse than the pre-learning approximation. We then show that averaging inaccurate beliefs, each obtained from belief propagation using model parameters perturbed about some le...

  12. State Security Breach Response Laws: State-by-State Summary Table. Using Data to Improve Education: A Legal Reference Guide to Protecting Student Privacy and Data Security

    Science.gov (United States)

    Data Quality Campaign, 2011

    2011-01-01

    Under security breach response laws, businesses--and sometimes state and governmental agencies--are required to inform individuals when the security, confidentiality or integrity of their personal information has been compromised. This resource provides a state-by-state analysis of security breach response laws. [The Data Quality Campaign has…

  13. The Effect of Perceived Privacy Breaches on Continued Technology Use and Individual Psychology: The Construct, Instrument Development, and an Application Using Internet Search Engines

    Science.gov (United States)

    Ahmad, Altaf

    2010-01-01

This dissertation involved the development of a new construct, perceived privacy breach (PPB), to evaluate how a person perceives breaches of privacy: whether they perceive any exchange of information was fair, and how they believe it will impact people whose information has been shared. This instrument assists researchers to…

  14. Market Reactions to Publicly Announced Privacy and Security Breaches Suffered by Companies Listed on the United States Stock Exchanges: A Comparative Empirical Investigation

    Science.gov (United States)

    Coronado, Adolfo S.

    2012-01-01

Using a sample of security and privacy breaches, the present research examines the comparative announcement impact between the two types of events. The first part of the dissertation analyzes the impact of publicly announced security and privacy breaches on abnormal stock returns, changes in firm risk, and abnormal trading volume.…

  15. Motive Matters! An exploration of the notion ‘deliberate breach of contract’ and its consequences for the application of remedies

    NARCIS (Netherlands)

    M. van Kogelenberg (Martijn)

    2012-01-01

    textabstractThis thesis explores the notion of deliberate breach of contract and its potential remedial consequences. In the major jurisdictions in Europe and in the United States the notion of deliberate breach of contract is generally not coherently and officially defined and acknowledged as an in

  16. People's conditional probability judgments follow probability theory (plus noise).

    Science.gov (United States)

    Costello, Fintan; Watts, Paul

    2016-09-01

    A common view in current psychology is that people estimate probabilities using various 'heuristics' or rules of thumb that do not follow the normative rules of probability theory. We present a model where people estimate conditional probabilities such as P(A|B) (the probability of A given that B has occurred) by a process that follows standard frequentist probability theory but is subject to random noise. This model accounts for various results from previous studies of conditional probability judgment. This model predicts that people's conditional probability judgments will agree with a series of fundamental identities in probability theory whose form cancels the effect of noise, while deviating from probability theory in other expressions whose form does not allow such cancellation. Two experiments strongly confirm these predictions, with people's estimates on average agreeing with probability theory for the noise-cancelling identities, but deviating from probability theory (in just the way predicted by the model) for other identities. This new model subsumes an earlier model of unconditional or 'direct' probability judgment which explains a number of systematic biases seen in direct probability judgment (Costello & Watts, 2014). This model may thus provide a fully general account of the mechanisms by which people estimate probabilities.
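A minimal simulation of the model's central prediction can make the noise-cancelling idea concrete. The noise rate d and event probabilities below are illustrative assumptions, not the paper's parameters: per-item read noise biases each individual frequency estimate, yet a combination such as P(A) + P(B) − P(A∧B) − P(A∨B) has expectation zero under the noise, so noisy estimates still satisfy that identity on average.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 200_000, 0.15   # sample size and per-item read-noise rate (assumed)

# True joint events, with some dependence between A and B
A = rng.random(n) < 0.30
B = rng.random(n) < 0.50
A = A | (B & (rng.random(n) < 0.2))

def noisy_mean(truth):
    """Frequency estimate where each stored instance is misread with prob. d."""
    flips = rng.random(truth.size) < d
    return np.mean(truth ^ flips)

pA, pB = noisy_mean(A), noisy_mean(B)
p_and, p_or = noisy_mean(A & B), noisy_mean(A | B)

# Noise-cancelling combination: the d-dependent bias terms cancel exactly
z = pA + pB - p_and - p_or
print(f"z = {z:+.4f}")                                    # close to 0
print(f"bias in P(A&B): {p_and - np.mean(A & B):+.4f}")   # pushed toward d
```

The individual estimate of P(A∧B) is visibly biased, while z sits near zero, mirroring the paper's finding that people agree with probability theory exactly on the identities whose form cancels the noise.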

  17. Hurricane Sandy beach response and recovery at Fire Island, New York: Shoreline, beach profile data, and breach shoreline data: October 2012 to June 2016

    Science.gov (United States)

    Henderson, Rachel E.; Hapke, Cheryl J.; Brenner, Owen T.; Reynolds, Billy J.

    2017-01-01

    Fire Island, New York is the site of a long term coastal morphologic change and processes project conducted by the U.S. Geological Survey (USGS). One of the objectives of the project was to understand the morphologic evolution of the barrier system on a variety of time scales (months–years–decades–centuries). In response to Hurricane Sandy (October 2012), this effort continued with the intention of resolving storm impacts, post-storm beach response, and recovery. The day before Hurricane Sandy made landfall a USGS field team conducted surveys at Fire Island National Seashore (FIIS) to quantify the pre-storm morphologic state of the beach and dunes. The area was re-surveyed after the storm, as soon as access to the island was possible. In order to fully capture the recovery of the barrier system, the USGS Hurricane Sandy Supplemental Fire Island Study was established to include regular surveying in the weeks, months, and years following the storm. As part of the USGS Hurricane Sandy Supplemental Fire Island Study, the beach is monitored periodically to enable better understanding of post-Sandy recovery. The alongshore state of the beach is recorded using a differential global positioning system (DGPS) to collect data around the mean high water (MHW; 0.46 meter North American Vertical Datum of 1988) to derive a shoreline, and the cross-shore response and recovery are measured along a series of 15 profiles (Figure 1). Monitoring continued in the weeks following Hurricane Sandy with additional monthly collection through April 2013, and repeat surveys every 2–3 months thereafter until October 2014. Additional bi-annual surveys have been collected through September 2016. Beginning in October 2014 the USGS also began collecting a shoreline at the Wilderness breach, in the location of Old Inlet, in the Otis Pike High Dunes Wilderness area. The shoreline collected was an approximation of the MHW shoreline. The operator walked along an estimated MHW elevation above
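The MHW-based shoreline derivation described above amounts to selecting survey points near the MHW elevation. A minimal sketch follows; the vertical tolerance, point layout, and column conventions are assumptions for illustration, not the USGS workflow:

```python
import numpy as np

MHW = 0.46  # mean high water, meters NAVD88 (from the abstract)
TOL = 0.10  # vertical tolerance in meters (assumed)

# Synthetic DGPS points: easting, northing, elevation (meters)
rng = np.random.default_rng(2)
pts = np.column_stack([
    rng.uniform(0, 500, 1_000),     # easting
    rng.uniform(0, 50, 1_000),      # northing
    rng.uniform(-1.0, 3.0, 1_000),  # elevation
])

# Keep points whose elevation falls within TOL of MHW
near_mhw = np.abs(pts[:, 2] - MHW) <= TOL
shoreline = pts[near_mhw]
print(f"{shoreline.shape[0]} of {pts.shape[0]} points near MHW")
```

On real survey data the retained points would then be ordered alongshore to form the shoreline trace.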

  18. Savage's Concept of Probability

    Institute of Scientific and Technical Information of China (English)

    熊卫

    2003-01-01

    Starting from personal preference, Savage [3] constructs a foundational theory of probability, moving from qualitative probability to quantitative probability and then to utility. There are profound logical connections between the three steps of Savage's theory; that is, the quantitative concepts properly represent the qualitative ones. Moreover, Savage's definition of subjective probability is in accordance with probability theory, and the theory gives us a rational decision model only if we assume that the weak ...

  19. Probability Theory without Bayes' Rule

    OpenAIRE

    Rodriques, Samuel G.

    2014-01-01

    Within the Kolmogorov theory of probability, Bayes' rule allows one to perform statistical inference by relating conditional probabilities to unconditional probabilities. As we show here, however, there is a continuous set of alternative inference rules that yield the same results, and that may have computational or practical advantages for certain problems. We formulate generalized axioms for probability theory, according to which the reverse conditional probability distribution P(B|A) is no...

  20. LMX, Breach Perceptions, Work-Family Conflict, and Well-Being: A Mediational Model.

    Science.gov (United States)

    Hill, Rachel T; Morganson, Valerie J; Matthews, Russell A; Atkinson, Theresa P

    2016-01-01

    Despite research advances, work-family scholars still lack an understanding of how leadership constructs relate to an employee's ability to effectively manage the work-family interface. In addition, there remains a need to examine the process through which leadership and work-family conflict influence well-being outcomes. Using a sample of 312 workers, a mediated process model grounded in social exchange theory is tested wherein the authors seek to explain how leaders shape employee perceptions, which, in turn, impact organizational fulfillment of expectations (i.e., psychological contract breach), work-family conflict, and well-being. A fully latent structural equation model was used to test study hypotheses, all of which were supported. Building on existing theory, findings suggest that the supervisor plays a critical role as a frontline representative for the organization and that work-family conflict is reduced and well-being enhanced through a process of social exchange between the supervisor and worker.

  1. Human experimentation: historical perspective of breaches of ethics in US health care.

    Science.gov (United States)

    Layman, Elizabeth J

    2009-01-01

    Health care supervisors and managers may participate in ethical discussions and serve on ethics committees in their health care organizations. To aid them in their participation and service, this article expands upon the knowledge of ethics that they obtained in their academic training. The article provides readers with a common language based on frequently cited cases and key documents. The article traces a brief history of human experimentation, describes ethical breaches in the United States, and summarizes key documents guiding current thought on informed and voluntary consent. The article concludes with 3 common misconceptions that health care supervisors and managers will want to avoid in ethical discussions and ethical decision making. Health care supervisors and managers will be prepared to meaningfully contribute to the discussion of ethical issues and to the resolution of ethical problems in their health care organizations.

  2. Consistency assessment of rating curve data in various locations using Bidirectional Reach (BReach)

    Science.gov (United States)

    Van Eerdenbrugh, Katrien; Van Hoey, Stijn; Coxon, Gemma; Freer, Jim; Verhoest, Niko E. C.

    2017-04-01

    When estimating discharges through rating curves, temporal data consistency is a critical issue. In this research, consistency in stage-discharge data is investigated using a methodology called Bidirectional Reach (BReach). This methodology considers a period to be consistent if no consecutive and systematic deviations from a current situation occur that exceed observational uncertainty. Therefore, the capability of a rating curve model to describe a subset of the (chronologically sorted) data is assessed in each observation by indicating the outermost data points for which the model behaves satisfactorily. These points are called the maximum left or right reach, depending on the direction of the investigation. This temporal reach should not be confused with a spatial reach (indicating a part of a river). Changes in these reaches throughout the data series indicate possible changes in data consistency and, if not resolved, could introduce additional errors and biases. In this research, various measurement stations in the UK, New Zealand and Belgium are selected based on their significant historical ratings information and their specific characteristics related to data consistency. For each station, a BReach analysis is performed and subsequently the results are validated against available knowledge about the history and behavior of the site. For all investigated cases, the methodology provides results that appear consistent with this knowledge of historical changes and thus facilitates a reliable assessment of (in)consistent periods in stage-discharge measurements. This assessment is not only useful for the analysis and determination of discharge time series, but also for enhancing applications based on these data (e.g., by informing hydrological and hydraulic model evaluation design about consistent time periods to analyze).
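The reach computation described in the abstract can be sketched as follows. This is an illustrative reconstruction under a deliberately simplified acceptance rule (every residual must stay within a fixed observational tolerance), not the published BReach algorithm; the function names and the tolerance parameter are assumptions.

```python
def right_reach(residuals, tol, i):
    """Outermost index j >= i such that every model residual in
    residuals[i..j] stays within the observational tolerance tol.
    (Simplified rule; BReach tests for consecutive, systematic
    exceedances rather than any single one.)"""
    j = i
    while j + 1 < len(residuals) and abs(residuals[j + 1]) <= tol:
        j += 1
    return j

def left_reach(residuals, tol, i):
    """Mirror image of right_reach: scan towards older observations."""
    j = i
    while j - 1 >= 0 and abs(residuals[j - 1]) <= tol:
        j -= 1
    return j

# Chronologically sorted rating-curve residuals with a consistency break at index 3
res = [0.1, -0.2, 0.1, 1.5, 0.2]
print(right_reach(res, 0.5, 0))  # the reach stops just before the break
print(left_reach(res, 0.5, 4))   # scanning backwards also stops at the break
```

Plotting how these reaches shift along the series is what flags candidate changes in data consistency.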

  3. First estimates of the probability of survival in a small-bodied, high-elevation frog (Boreal Chorus Frog, Pseudacris maculata), or how historical data can be useful

    Science.gov (United States)

    Muths, Erin L.; Scherer, R. D.; Amburgey, S. M.; Matthews, T.; Spencer, A. W.; Corn, P.S.

    2016-01-01

    In an era of shrinking budgets yet increasing demands for conservation, the value of existing (i.e., historical) data is elevated. Lengthy time series on common, or previously common, species are particularly valuable and may be available only through the use of historical information. We provide first estimates of the probability of survival and longevity (0.67–0.79 and 5–7 years, respectively) for a subalpine population of a small-bodied, ostensibly common amphibian, the Boreal Chorus Frog (Pseudacris maculata (Agassiz, 1850)), using historical data and contemporary, hypothesis-driven information-theoretic analyses. We also test a priori hypotheses about the effects of color morph (as suggested by early reports) and of drought (as suggested by recent climate predictions) on survival. Using robust mark–recapture models, we find some support for early hypotheses regarding the effect of color on survival, but we find no effect of drought. The congruence between early findings and our analyses highlights the usefulness of historical information in providing raw data for contemporary analyses and context for conservation and management decisions.

  4. Thirty-four New, High-Probability, Damped Ly-alpha Absorbers at Redshift z=[0.42, 0.70]

    CERN Document Server

    Turnshek, David A; Rao, Sandhya M; Hamilton, Timothy S; Sardane, Gendith M; Held, Ryan

    2014-01-01

    Quasar damped Ly-alpha (DLA) absorption line systems with redshifts z<1.65 are used to trace neutral gas over approximately 70 per cent of the most recent history of the Universe. However, such systems fall in the UV and are rarely found in blind UV spectroscopic surveys. Therefore, it has been difficult to compile a moderate-sized sample of UV DLAs in any narrow cosmic time interval. However, DLAs are easy to identify in low-resolution spectra because they have large absorption rest equivalent widths. We have performed an efficient strong-MgII-selected survey for UV DLAs at redshifts z=[0.42,0.70] using HST's low-resolution ACS-HRC-PR200L prism. This redshift interval covers ~1.8 Gyr in cosmic time, i.e., t~[7.2,9.0] Gyrs after the Big Bang. A total of 95 strong MgII absorption-line systems identified in SDSS spectra were successfully observed with the prism at the predicted UV wavelengths of Ly-alpha absorption. We found that 33 of the 95 systems had a significant probability of being DLAs. One additiona...

  5. Breaching vulnerability of coastal barriers under effects of tropical cyclones: a model study on the Hue lagoon - Vietnam

    NARCIS (Netherlands)

    Tuan, T.Q.; Stive, M.J.F.; Verhagen, H.J.

    2006-01-01

    Under effects of tropical cyclones, the coast is subjected to attack both by surge and wave from the sea and by flooding from the bay. These forces pose a serious breaching threat to natural sea-defence works such as barrier spits, barrier islands, lagoon barriers, etc. on the coast. Unintended

  6. Pupils' Visual Representations in Standard and Problematic Problem Solving in Mathematics: Their Role in the Breach of the Didactical Contract

    Science.gov (United States)

    Deliyianni, Eleni; Monoyiou, Annita; Elia, Iliada; Georgiou, Chryso; Zannettou, Eleni

    2009-01-01

    This study investigated the modes of representations generated by kindergarteners and first graders while solving standard and problematic problems in mathematics. Furthermore, it examined the influence of pupils' visual representations on the breach of the didactical contract rules in problem solving. The sample of the study consisted of 38…

  8. Technical difficulties. Recent health IT security breaches are unlikely to improve the public's perception about the safety of personal data.

    Science.gov (United States)

    Becker, Cinda

    2006-02-20

    Consumers who claimed in recent surveys that they were "more afraid of cyber crimes than physical crimes" may have had reason for caution. A spate of well-publicized information thefts and security breaches at healthcare organizations have eroded trust in technology, says Carol Diamond, left, of the Markle Foundation, and that could have an adverse effect on acceptance of electronic medical records.

  10. "Microscopic evidences of heavy metals distribution and anatomic alterations in breaching-leaves of Cupressus lindleyi growing around mining wastes".

    Science.gov (United States)

    Juan Miguel, Gómez-Bernal; Ofelia, Morton-Bermea; Esther Aurora, Ruiz-Huerta; Maria Aurora, Armienta-Hernández; Dávila Osiel, González

    2014-09-01

    In this article a study of the distribution of heavy metals in Cupressus lindleyi breaching-leaves was done in Taxco, Guerrero. At the same time, heavy metals micro-localization was conducted in the breaching-leaves to understand the structural changes provoked by mining waste on plants. The most abundant contaminants in soils, tailings and different plant organs (roots, stems, and leaves) were Zn, Mn, and Pb. Nevertheless, As was more accumulated in the stem and breaching-leaves. The translocation factor and the bio-concentration factor were less than 1. The structural changes observed were a great accumulation of starch grains and phenolic compounds in the palisade parenchyma, changes in the hypodermis cell wall, and necrotic zones in the palisade parenchyma. The distribution of heavy metals in breaching-leaves tissues was homogeneous for most of the elements. These results showed that C. lindleyi is a species that can be employed in the phytostabilization of zones contaminated with mining waste because it is a native plant that requires few special conditions for its development.

  11. Integrity breaches in a hollow fiber nanofilter - Effects on natural organic matter and virus-like particle removal.

    Science.gov (United States)

    Lidén, Angelica; Lavonen, Elin; Persson, Kenneth M; Larson, Magnus

    2016-11-15

    Ultrafiltration and nanofiltration have become common methods to treat surface water for drinking water purposes. Common aims of a membrane step are removal of natural organic matter (NOM), softening or adding an extra microbiological or chemical barrier. In most cases, the membrane is considered a good disinfection step; commonly the viral removal is at least 4-log. To ensure a working disinfection, reliable integrity tests are required. In the present pilot study with a hollow fiber nanofilter, the membrane achieved a high NOM reduction, and the difference in parameters related to NOM quality before and after treatment proved to be useful indicators of integrity breaches. Changes in total organic carbon (TOC) concentration, UV-absorbance at 254 nm (UVA254) and fluorescence derived parameters in the permeate flow were related to leaking fibers. On average, UVA254 in the permeate was 3 times higher for a membrane with compromised fibers (0.041 cm(-1)) compared to an intact membrane (0.013 cm(-1)), while TOC was less than 2 times as high on average. Thus, this membrane had a higher reduction of UVA254 than TOC and the sensitivity for changes from leakage was higher. Therefore, it is suggested that UVA254 could be used as an indicator for membrane integrity. Additionally, there is a significant (P < 0.01) difference in fluorescence derived parameters between a leaking and an intact fiber, showing that fluorescence also has potential to be applied for online monitoring of membrane processes. During fiber failure, around 2% of the permeate flow passes through one single leaking fiber. The transport depends on the distance between the inflow and the leak, which in most cases are similar and most likely close to the middle of the fiber.

  12. Probability distributions for magnetotellurics

    Energy Technology Data Exchange (ETDEWEB)

    Stodt, John A.

    1982-11-01

    Estimates of the magnetotelluric transfer functions can be viewed as ratios of two complex random variables. It is assumed that the numerator and denominator are governed approximately by a joint complex normal distribution. Under this assumption, probability distributions are obtained for the magnitude, squared magnitude, logarithm of the squared magnitude, and the phase of the estimates. Normal approximations to the distributions are obtained by calculating mean values and variances from error propagation, and the distributions are plotted with their normal approximations for different percentage errors in the numerator and denominator of the estimates, ranging from 10% to 75%. The distribution of the phase is approximated well by a normal distribution for the range of errors considered, while the distribution of the logarithm of the squared magnitude is approximated by a normal distribution for a much larger range of errors than is the distribution of the squared magnitude. The distribution of the squared magnitude is most sensitive to the presence of noise in the denominator of the estimate, in which case the true distribution deviates significantly from normal behavior as the percentage errors exceed 10%. In contrast, the normal approximation to the distribution of the logarithm of the magnitude is useful for errors as large as 75%.
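The setup above can be illustrated with a short Monte Carlo sketch; this is not from the paper, and the 25% error level is an arbitrary choice for the demonstration. A transfer-function estimate is modeled as a ratio of two complex normal variables, from which the phase and the logarithm of the squared magnitude are extracted; by construction both are symmetric about zero, consistent with the near-normal behavior described for moderate error levels.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
err = 0.25  # 25% relative error in both numerator and denominator (illustrative)

# Transfer-function estimate as a ratio of two complex normal random variables
num = 1.0 + err * (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
den = 1.0 + err * (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
z = num / den

phase = np.angle(z)                 # phase of the estimate
log_mag2 = np.log(np.abs(z) ** 2)   # log of the squared magnitude

# Both quantities are symmetric about zero by construction, so their sample
# means should vanish to within Monte Carlo error at this error level
print(abs(np.mean(phase)), abs(np.mean(log_mag2)))
```

Repeating the experiment with larger `err` and comparing histograms against fitted normals reproduces the qualitative finding that the phase and the log squared magnitude tolerate much larger errors than the squared magnitude itself.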

  13. RANDOM VARIABLE WITH FUZZY PROBABILITY

    Institute of Scientific and Technical Information of China (English)

    吕恩琳; 钟佑明

    2003-01-01

    A mathematical description of the second kind of fuzzy random variable, namely the random variable with crisp events and fuzzy probability, was studied. Based on interval probability and using the fuzzy resolution theorem, a feasible condition for a probability fuzzy number set was given; going a step further, the definition and characteristics of the random variable with fuzzy probability (RVFP) and the fuzzy distribution function and fuzzy probability distribution sequence of the RVFP were put forward. The fuzzy probability resolution theorem with the closing operation of fuzzy probability was given and proved. The definition and characteristics of the mathematical expectation and variance of the RVFP were also studied. All mathematical descriptions of the RVFP have the closing operation for fuzzy probability; as a result, the foundation for perfecting fuzzy probability operation methods is laid.

  14. Sedimentology and hydrology of a well-preserved paleoriver system with a series of dam-breach paleolakes at Moa Valles, Mars

    Science.gov (United States)

    Salese, Francesco; Di Achille, Gaetano; Neesemann, Adrian; Ori, Gian Gabriele; Hauber, Ernst

    2016-04-01

    Moa Valles is a well-preserved paleodrainage system that is nearly 300-km-long and carved into ancient highland terrains west of Idaeus Fossae. The paleofluvial system apparently originated from fluidized ejecta blankets, and it consists of a series of dam-breach paleolakes with associated fan-shaped sedimentary deposits. This paleofluvial system shows a rich morphological record of hydrologic activity in the highlands of Mars. Based on crater counting, this activity seems to be Amazonian in age (2.43–1.41 Ga). This work is based on a digital elevation model (DEM) derived from Context camera (CTX) and High Resolution Imaging Science Experiment (HiRISE) stereo images. Our goals are to (a) study the complex channel flow paths draining into Idaeus Fossae after forming a series of dam-breach paleolakes and to (b) investigate the origin and evolution of this valley system with its implications for climate and tectonic control. The first part of the system is characterized by many paleolakes, which are interconnected and drain eastward into Liberta crater, forming a complex and multilobate deltaic deposit exhibiting a well-developed channelized distributary pattern with evidence of switching on the delta plain. A breach area, consisting of three spillover channels, is present in the eastern part of the crater rim. These channels connect the Liberta crater to the eastward portion of the valley system, continuing toward Moa Valles with a complex pattern of anabranching channels that is more than 180-km-long. Our crater counting results and hydrological calculations of infilling and spillover discharges of the Liberta crater-lake suggest that the system is the result of an Early Amazonian water-rich environment that was likely sustained by relatively short fluvial events (<10^2 years), thereby supporting the hypothesis that water-related erosion might have been active on Mars (at least locally) during the Amazonian.
The most important water source for the system could

  15. Probability of mechanical loosening of the femoral component in high flexion total knee arthroplasty can be reduced by rather simple surgical techniques

    NARCIS (Netherlands)

    Groes, S.A.W. van de; Waal Malefijt, M.C. de; Verdonschot, N.J.J.

    2014-01-01

    BACKGROUND: Some follow-up studies of high flexion total knee arthroplasties report disturbingly high incidences of femoral component loosening. Femoral implant fixation is dependent on two interfaces: the cement-implant and the cement-bone interface. The present finite-element model (FEM) is the

  16. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  17. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    FOREWORD PREFACE Sets, Events, and Probability The Algebra of Sets The Bernoulli Sample Space The Algebra of Multisets The Concept of Probability Properties of Probability Measures Independent Events The Bernoulli Process The R Language Finite Processes The Basic Models Counting Rules Computing Factorials The Second Rule of Counting Computing Probabilities Discrete Random Variables The Bernoulli Process: Tossing a Coin The Bernoulli Process: Random Walk Independence and Joint Distributions Expectations The Inclusion-Exclusion Principle General Random Variable

  18. A first course in probability

    CERN Document Server

    Ross, Sheldon

    2014-01-01

    A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.

  19. Conditionals, probability, and belief revision

    NARCIS (Netherlands)

    Voorbraak, F.

    1989-01-01

    A famous result obtained in the mid-seventies by David Lewis shows that a straightforward interpretation of probabilities of conditionals as conditional probabilities runs into serious trouble. In this paper we try to circumvent this trouble by defining extensions of probability functions, called

  20. Probability of mechanical loosening of the femoral component in high flexion total knee arthroplasty can be reduced by rather simple surgical techniques

    NARCIS (Netherlands)

    van de Groes, S.A.W.; De Waal Malefijt, M.C.; Verdonschot, Nicolaas Jacobus Joseph

    2014-01-01

    Background: Some follow-up studies of high flexion total knee arthroplasties report disturbingly high incidences of femoral component loosening. Femoral implant fixation is dependent on two interfaces: the cement–implant and the cement–bone interface. The present finite-element model (FEM) is the first

  1. Are polynuclear superhalogens without halogen atoms probable? A high-level ab initio case study on triple-bridged binuclear anions with cyanide ligands

    Energy Technology Data Exchange (ETDEWEB)

    Yin, Bing, E-mail: rayinyin@gmail.com; Wen, Zhen-Yi [MOE Key Laboratory of Synthetic and Natural Functional Molecule Chemistry, Shaanxi Key Laboratory of Physico-Inorganic Chemistry, College of Chemistry and Materials Science, Northwest University, Xi'an 710069 (China); Institute of Modern Physics, Northwest University, Xi'an 710069 (China); Li, Teng; Li, Jin-Feng; Yu, Yang; Li, Jian-Li [MOE Key Laboratory of Synthetic and Natural Functional Molecule Chemistry, Shaanxi Key Laboratory of Physico-Inorganic Chemistry, College of Chemistry and Materials Science, Northwest University, Xi'an 710069 (China); Jiang, Zhen-Yi [Institute of Modern Physics, Northwest University, Xi'an 710069 (China)

    2014-03-07

    The first theoretical exploration of superhalogen properties of polynuclear structures based on pseudohalogen ligand is reported here via a case study on eight triply-bridged [Mg2(CN)5]− clusters. From our high-level ab initio results, all these clusters are superhalogens due to their high vertical electron detachment energies (VDE), of which the largest value is 8.67 eV at coupled-cluster single double triple (CCSD(T)) level. Although outer valence Green's function results are consistent with CCSD(T) in most cases, it overestimates the VDEs of three anions dramatically by more than 1 eV. Therefore, the combined usage of several theoretical methods is important for the accuracy of purely theoretical prediction of superhalogen properties of new structures. Spatial distribution of the extra electron of high-VDE anions here indicates two features: remarkable aggregation on bridging CN units and non-negligible distribution on every CN unit. These two features lower the potential and kinetic energies of the extra electron respectively and thus lead to high VDE. Besides superhalogen properties, the structures, relative stabilities and thermodynamic stabilities with respect to detachment of CN− were also investigated for these anions. The collection of these results indicates that polynuclear structures based on pseudohalogen ligand are promising candidates for new superhalogens with enhanced properties.

  2. Are polynuclear superhalogens without halogen atoms probable? A high-level ab initio case study on triple-bridged binuclear anions with cyanide ligands

    Science.gov (United States)

    Yin, Bing; Li, Teng; Li, Jin-Feng; Yu, Yang; Li, Jian-Li; Wen, Zhen-Yi; Jiang, Zhen-Yi

    2014-03-01

    The first theoretical exploration of superhalogen properties of polynuclear structures based on pseudohalogen ligand is reported here via a case study on eight triply-bridged [Mg2(CN)5]- clusters. From our high-level ab initio results, all these clusters are superhalogens due to their high vertical electron detachment energies (VDE), of which the largest value is 8.67 eV at coupled-cluster single double triple (CCSD(T)) level. Although outer valence Green's function results are consistent with CCSD(T) in most cases, it overestimates the VDEs of three anions dramatically by more than 1 eV. Therefore, the combined usage of several theoretical methods is important for the accuracy of purely theoretical prediction of superhalogen properties of new structures. Spatial distribution of the extra electron of high-VDE anions here indicates two features: remarkable aggregation on bridging CN units and non-negligible distribution on every CN unit. These two features lower the potential and kinetic energies of the extra electron respectively and thus lead to high VDE. Besides superhalogen properties, the structures, relative stabilities and thermodynamic stabilities with respect to detachment of CN-1 were also investigated for these anions. The collection of these results indicates that polynuclear structures based on pseudohalogen ligand are promising candidates for new superhalogens with enhanced properties.

  3. A game with rules in the making - how the high probability of waiting games in nanomedicine is being mitigated through distributed regulation and responsible innovation

    NARCIS (Netherlands)

    D'Silva, J.J.F.; Robinson, D.K.R.; Shelley Egan, C.

    2012-01-01

    The potential benefits of nanotechnologies in healthcare are widely expected to be enormous and a considerable amount of investment is already pouring into public research in this area. These high expectations of benefits are coupled with uncertainty surrounding the potential risks of the prospectiv

  4. Gd Transition Probabilities and Abundances

    CERN Document Server

    Den Hartog, E A; Sneden, C; Cowan, J J

    2006-01-01

    Radiative lifetimes, accurate to +/- 5%, have been measured for 49 even-parity and 14 odd-parity levels of Gd II using laser-induced fluorescence. The lifetimes are combined with branching fractions measured using Fourier transform spectrometry to determine transition probabilities for 611 lines of Gd II. This work is the largest-scale laboratory study to date of Gd II transition probabilities and the first using a high performance Fourier transform spectrometer. This improved data set has been used to determine a new solar photospheric Gd abundance, log epsilon = 1.11 +/- 0.03. Revised Gd abundances have also been derived for the r-process-rich metal-poor giant stars CS 22892-052, BD+17 3248, and HD 115444. The resulting Gd/Eu abundance ratios are in very good agreement with the solar-system r-process ratio. We have employed the increasingly accurate stellar abundance determinations, resulting in large part from the more precise laboratory atomic data, to predict directly the Solar System r-process elemental...

  5. OPTICAL SPECTROSCOPY OF THE HIGH-MASS γ-RAY BINARY 1FGL J1018.6−5856: A PROBABLE NEUTRON STAR PRIMARY

    Energy Technology Data Exchange (ETDEWEB)

    Strader, Jay; Chomiuk, Laura; Peacock, Mark [Department of Physics and Astronomy, Michigan State University, East Lansing, MI 48824 (United States); Cheung, C. C. [Space Science Division, Naval Research Laboratory, Washington, DC 20375 (United States); Salinas, Ricardo [Gemini Observatory, Casilla 603, La Serena (Chile)

    2015-11-10

    We present medium-resolution optical spectroscopy with the SOAR telescope of the O star secondary of the high-mass γ-ray binary 1FGL J1018.6–5856 to help determine whether the primary is a neutron star or black hole. We find that the secondary has a low radial velocity semi-amplitude of 11–12 km s−1, with consistent values obtained for H and He absorption lines. This low value strongly favors a neutron star primary: while a black hole cannot be excluded if the system is close to face on, such inclinations are disallowed by the observed rotation of the secondary. We also find the high-energy (X-ray and γ-ray) flux maxima occur when the star is behind the compact object along our line of sight, inconsistent with a simple model of anisotropic inverse Compton scattering for the γ-ray photons.

  6. Probability an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

    Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h

  7. Economic choices reveal probability distortion in macaque monkeys.

    Science.gov (United States)

    Stauffer, William R; Lak, Armin; Bossaerts, Peter; Schultz, Wolfram

    2015-02-18

    Economic choices are largely determined by two principal elements, reward value (utility) and probability. Although nonlinear utility functions have been acknowledged for centuries, nonlinear probability weighting (probability distortion) was only recently recognized as a ubiquitous aspect of real-world choice behavior. Even when outcome probabilities are known and acknowledged, human decision makers often overweight low probability outcomes and underweight high probability outcomes. Whereas recent studies measured utility functions and their corresponding neural correlates in monkeys, it is not known whether monkeys distort probability in a manner similar to humans. Therefore, we investigated economic choices in macaque monkeys for evidence of probability distortion. We trained two monkeys to predict reward from probabilistic gambles with constant outcome values (0.5 ml or nothing). The probability of winning was conveyed using explicit visual cues (sector stimuli). Choices between the gambles revealed that the monkeys used the explicit probability information to make meaningful decisions. Using these cues, we measured probability distortion from choices between the gambles and safe rewards. Parametric modeling of the choices revealed classic probability weighting functions with inverted-S shape. Therefore, the animals overweighted low probability rewards and underweighted high probability rewards. Empirical investigation of the behavior verified that the choices were best explained by a combination of nonlinear value and nonlinear probability distortion. Together, these results suggest that probability distortion may reflect evolutionarily preserved neuronal processing. Copyright © 2015 Stauffer et al.
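As a concrete illustration of the inverted-S shape described above, the one-parameter Tversky–Kahneman weighting function is a standard parametric form; the specific function and the γ value here are assumptions for illustration, not the parameterization fitted in the study.

```python
def w(p, gamma=0.61):
    """One-parameter Tversky-Kahneman probability weighting function
    (illustrative choice; gamma=0.61 is a commonly cited human estimate).
    For gamma < 1 it produces the classic inverted-S: small probabilities
    are overweighted, large probabilities underweighted."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

print(w(0.05) > 0.05)  # small probability is overweighted
print(w(0.95) < 0.95)  # large probability is underweighted
```

With gamma = 1 the function reduces to w(p) = p, i.e., no distortion, which is the linear benchmark the monkeys' choices deviated from.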

  8. The Art of Probability Assignment

    CERN Document Server

    Dimitrov, Vesselin I

    2012-01-01

    The problem of assigning probabilities when little is known is analyzed in the case where the quantities of interest are physical observables, i.e., can be measured and their values expressed by numbers. It is pointed out that the assignment of probabilities based on observation is a process of inference, involving the use of Bayes' theorem and the choice of a probability prior. When a lot of data is available, the resulting probabilities are remarkably insensitive to the form of the prior. In the opposite case of scarce data, it is suggested that the probabilities be assigned such that they are the least sensitive to specific variations of the probability prior. In the continuous case this results in a probability assignment rule which calls for minimizing the Fisher information subject to constraints reflecting all available information. In the discrete case, the corresponding quantity to be minimized turns out to be a Rényi distance between the original and the shifted distribution.
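The Rényi distance mentioned for the discrete case can be computed directly; a minimal sketch (the distributions and the order alpha = 2 are illustrative, and both distributions are assumed strictly positive and normalized):

```python
import numpy as np

def renyi_divergence(p, q, alpha=2.0):
    """Renyi divergence D_alpha(p || q) of order alpha != 1 between two
    discrete distributions (assumed strictly positive and normalized)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0))

uniform = [0.25, 0.25, 0.25, 0.25]
shifted = [0.40, 0.30, 0.20, 0.10]
print(renyi_divergence(shifted, uniform))  # positive: shifting away has a cost
print(renyi_divergence(uniform, uniform))  # 0.0 for identical distributions
```

As alpha -> 1 this quantity reduces to the Kullback-Leibler divergence.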

  9. Probability workshop to be better in probability topic

    Science.gov (United States)

    Asmat, Aszila; Ujang, Suriyati; Wahid, Sharifah Norhuda Syed

    2015-02-01

    The purpose of the present study was to examine whether statistics anxiety and attitudes towards probability topic among students in higher education level have an effect on their performance. 62 fourth semester science students were given statistics anxiety questionnaires about their perception towards probability topic. Result indicated that students' performance in probability topic is not related to anxiety level, which means that the higher level in statistics anxiety will not cause lower score in probability topic performance. The study also revealed that motivated students gained from probability workshop ensure that their performance in probability topic shows a positive improvement compared before the workshop. In addition there exists a significance difference in students' performance between genders with better achievement among female students compared to male students. Thus, more initiatives in learning programs with different teaching approaches is needed to provide useful information in improving student learning outcome in higher learning institution.

  10. The estimation of tree posterior probabilities using conditional clade probability distributions.

    Science.gov (United States)

    Larget, Bret

    2013-07-01

    In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample.
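The core calculation can be illustrated with a toy sketch (the nested-tuple tree representation and the four-leaf sample below are invented for illustration): the probability of a tree is taken as the product, over its internal nodes, of the conditional probability of each split given its parent clade, both estimated from a posterior sample of trees.

```python
from collections import Counter

def leaf_set(t):
    """Frozenset of leaf names under a node (leaves are strings)."""
    return frozenset([t]) if isinstance(t, str) else leaf_set(t[0]) | leaf_set(t[1])

def clade_splits(t):
    """Yield (parent clade, frozenset of its two child clades) per internal node."""
    if isinstance(t, str):
        return
    yield leaf_set(t), frozenset({leaf_set(t[0]), leaf_set(t[1])})
    yield from clade_splits(t[0])
    yield from clade_splits(t[1])

def ccd_probability(tree, sample):
    """Estimate P(tree) as a product of conditional clade probabilities.
    Assumes every clade of the queried tree occurs in the sample."""
    clade_counts, split_counts = Counter(), Counter()
    for t in sample:
        for parent, children in clade_splits(t):
            clade_counts[parent] += 1
            split_counts[(parent, children)] += 1
    p = 1.0
    for parent, children in clade_splits(tree):
        p *= split_counts[(parent, children)] / clade_counts[parent]
    return p

sample = [(("a", "b"), ("c", "d"))] * 3 + [(("a", "c"), ("b", "d"))]
print(ccd_probability((("a", "b"), ("c", "d")), sample))  # 0.75
```

For this tiny sample the estimate coincides with the sample relative frequency; the two methods diverge, in the article's sense, on larger trees, where unsampled combinations of sampled clades also receive nonzero probability.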

  11. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes' theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  12. Casting-type calcifications on the mammogram suggest a higher probability of early relapse and death among high-risk breast cancer patients

    Energy Technology Data Exchange (ETDEWEB)

    Palka, Istvan [Dept. of Pathology, Univ. of Szeged, Szeged (Hungary); Ormandi, K atalin [Dept. of Radiology, Univ. of Szeged, Szeged (Hungary); Gaal, Szilvia; Kahan, Zsuzsanna [Dept. of Oncotherapy, Univ. of Szeged, Szeged (Hungary); Boda, Krisztina [Dept. of Medical Informatics, Univ. of Szeged, Szeged (Hungary)

    2007-11-15

    A retrospective analysis of the relation between the presence of casting-type calcifications on the mammogram and the prognosis of breast cancer was performed. The mammographic tumor features and other characteristics (invasive tumor size, histological tumor type, grade, nodal, hormone receptor and HER2 status, presence of lymphovascular invasion) of 55 high-risk breast cancers were studied. After a median follow-up time of 29.1 months, the median relapse-free survival and overall survival times among breast cancer patients with tumors associated with casting calcifications were 26.6 and 29.6 months, respectively. The corresponding parameters among patients with tumors not accompanied by casting calcifications were 54.4 and >58.5 months, respectively. Significant associations were found between the presence of casting calcifications and the risks of relapse (HR = 3.048, 95% CI: 1.116-8.323, p = 0.030) or death (HR = 3.504, 95% CI: 1.074-11.427, p = 0.038). Positive associations were found between casting calcifications and ER/PR negativity (p = 0.015 and p = 0.003, respectively) and HER2 overexpression (p = 0.019). Our findings support the theory that breast tumors associated with casting-type calcifications at mammography comprise a disease entity which exhibits significantly more aggressive behavior and a poorer outcome than do cancers with other mammographic tumor features.

  13. Modelling of HTR Confinement Behaviour during Accidents Involving Breach of the Helium Pressure Boundary

    Directory of Open Access Journals (Sweden)

    Joan Fontanet

    2009-01-01

    Development of HTRs requires a thorough safety study, which includes accident analyses. Confinement building performance is a key element of the system, since the behaviour of aerosols and attached fission products within the building is of utmost relevance to the potential source term to the environment. This paper explores the available simulation capabilities (the ASTEC and CONTAIN codes) and illustrates the performance of a postulated HTR vented confinement under prototypical accident conditions through a scoping study based on two accident sequences characterized by Helium Pressure Boundary breaches: a small and a large break. The results obtained indicate that both codes predict very similar thermal-hydraulic responses of the confinement, both in magnitude and timing. As for aerosol behaviour, both codes predict that most of the inventory entering the confinement is eventually depleted on the walls and only about 1% of the aerosol dust is released to the environment. The cross-comparison of the codes shows that the largest differences are in the intercompartmental flows and the in-compartment gas composition.

  14. The moderating role of overcommitment in the relationship between psychological contract breach and employee mental health.

    Science.gov (United States)

    Reimann, Mareike

    2016-09-30

    This study investigated whether the association between perceived psychological contract breach (PCB) and employee mental health is moderated by the cognitive-motivational pattern of overcommitment (OC). Linking the psychological contract approach to the effort-reward imbalance model, this study examines PCB as an imbalance in employment relationships that acts as a psychosocial stressor in the work environment and is associated with stress reactions that in turn negatively affect mental health. The analyses were based on a sample of 3,667 employees who participated in a longitudinal linked employer-employee survey representative of large organizations (with at least 500 employees who are subject to social security contributions) in Germany. Fixed-effects regression models, including PCB and OC, were estimated for employee mental health, and interaction effects between PCB and OC were assessed. The multivariate fixed-effects regression analyses showed a significant negative association between PCB and employee mental health. The results also confirmed that OC does indeed significantly increase the negative effect of PCB on mental health and that OC itself has a significant and negative effect on mental health. The results suggest that employees characterized by the cognitive-motivational pattern of OC are at an increased risk of developing poor mental health if they experience PCB compared with employees who are not overly committed to their work. The results of this study support the assumption that psychosocial work stressors play an important role in employee mental health.
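The fixed-effects logic of such an analysis can be sketched with synthetic panel data (all variable names, coefficients, and the data-generating process below are invented for illustration; the actual study used survey measures):

```python
import numpy as np

def within_transform(x, ids):
    """Fixed-effects 'within' transformation: demean each column per person,
    removing stable person-level confounders."""
    out = np.asarray(x, dtype=float).copy()
    for i in np.unique(ids):
        mask = ids == i
        out[mask] -= out[mask].mean(axis=0)
    return out

rng = np.random.default_rng(42)
n_people, n_waves = 500, 3
ids = np.repeat(np.arange(n_people), n_waves)
person_effect = np.repeat(rng.normal(0.0, 1.0, n_people), n_waves)

pcb = rng.uniform(0.0, 1.0, ids.size)  # perceived contract breach (invented scale)
oc = rng.uniform(0.0, 1.0, ids.size)   # overcommitment (invented scale)
# Invented data-generating process: PCB harms mental health, more so under high OC.
health = 5.0 + person_effect - 0.8 * pcb - 0.3 * oc - 0.5 * pcb * oc \
         + rng.normal(0.0, 0.2, ids.size)

X = within_transform(np.column_stack([pcb, oc, pcb * oc]), ids)
y = within_transform(health[:, None], ids).ravel()
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coef)  # approximately [-0.8, -0.3, -0.5]
```

Recovering the negative interaction coefficient corresponds to the moderation finding: the PCB slope on mental health is steeper for high-OC observations.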

  15. 'I knew before I was told': Breaches, cues and clues in the diagnostic assemblage.

    Science.gov (United States)

    Locock, Louise; Nettleton, Sarah; Kirkpatrick, Susan; Ryan, Sara; Ziebland, Sue

    2016-04-01

    Diagnosis can be both a 'diagnostic moment' and a process over time. This paper uses secondary analysis of narrative interviews on ovarian cancer, antenatal screening and motor neurone disease to explore how people describe assembling procedural, spatial and interactional evidence before the formal diagnostic moment. We offer the idea of a diagnostic assemblage to capture the ways in which individuals connect to and re-order signs and events that come to be associated with their bodies. Building on the empirical work of Poole and Lyne (2000) in the field of breast cancer diagnosis, we identify how patients describe being alerted to their diagnosis, either through 'clues' they report picking up (often inadvertently) or through 'cues', perceived as more intentional prompts given by a health professional or an organisational process. For patients, these clues frequently represent a breach in the expected order of their encounter with healthcare. Even seemingly mundane episodes or behaviours take on meanings which health professionals may not themselves anticipate. Our findings speak to an emergent body of work demonstrating that experiences of formal healthcare during the lead-up to diagnosis shape patients' expectations, degree of trust in professionals, and even health outcomes. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Identification of temporal consistency in rating curve data: Bidirectional Reach (BReach)

    Science.gov (United States)

    Van Eerdenbrugh, Katrien; Van Hoey, Stijn; Verhoest, Niko E. C.

    2016-08-01

    In this paper, a methodology is developed to identify consistency of rating curve data based on a quality analysis of model results. This methodology, called Bidirectional Reach (BReach), evaluates results of a rating curve model with randomly sampled parameter sets in each observation. The combination of a parameter set and an observation is classified as nonacceptable if the deviation between the accompanying model result and the measurement exceeds observational uncertainty. Based on this classification, conditions for satisfactory behavior of a model in a sequence of observations are defined. Subsequently, a parameter set is evaluated in a data point by assessing the span for which it behaves satisfactorily in the direction of the previous (or following) chronologically sorted observations. This is repeated for all sampled parameter sets, and results are aggregated by indicating the endpoint of the largest span, called the maximum left (right) reach. This temporal reach should not be confused with a spatial reach (indicating a part of a river). The same procedure is followed for each data point and for different definitions of satisfactory behavior. Results of this analysis enable the detection of changes in data consistency. The methodology is validated with observed data and various synthetic stage-discharge data sets and proves to be a robust technique to investigate the temporal consistency of rating curve data. It provides satisfying results despite low data availability, errors in the estimated observational uncertainty, and a rating curve model that is known to cover only a limited part of the observations.
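The reach computation can be sketched in a simplified form, using the strictest definition of satisfactory behavior (no non-acceptable points tolerated inside the span; the full methodology also evaluates definitions that allow a fraction of outliers):

```python
import numpy as np

def max_reaches(acceptable):
    """For each observation, the maximum left/right reach over all sampled
    parameter sets, i.e. how far a single parameter set can extend from that
    point while staying acceptable at every observation in between.

    acceptable[j, i] is True if parameter set j matches observation i within
    observational uncertainty. Returns span lengths; -1 marks observations
    where no sampled parameter set is acceptable."""
    n_sets, n_obs = acceptable.shape
    max_left = np.full(n_obs, -1)
    max_right = np.full(n_obs, -1)
    for j in range(n_sets):
        for i in range(n_obs):
            if not acceptable[j, i]:
                continue
            left = i
            while left > 0 and acceptable[j, left - 1]:
                left -= 1
            right = i
            while right < n_obs - 1 and acceptable[j, right + 1]:
                right += 1
            max_left[i] = max(max_left[i], i - left)
            max_right[i] = max(max_right[i], right - i)
    return max_left, max_right

# One parameter set that fails at observation 3, e.g. after a shift in the rating curve:
acc = np.array([[True, True, True, False, True, True]])
left, right = max_reaches(acc)
print(right)  # the right reach collapses to 0 just before the inconsistency
```

A sudden collapse of the maximum right (or left) reach across a data point is the signature of a change in data consistency that the method looks for.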

  17. Breaching Biological Barriers: Protein Translocation Domains as Tools for Molecular Imaging and Therapy

    Directory of Open Access Journals (Sweden)

    Benjamin L. Franc

    2003-10-01

    The lipid bilayer of a cell presents a significant barrier to the delivery of many molecular imaging reagents into cells at target sites in the body. Protein translocation domains (PTDs) are peptides that breach this barrier. Conjugation of PTDs to imaging agents can facilitate the delivery of these agents through the cell membrane, and in some cases into the cell nucleus, with potential for in vitro and in vivo applications. PTD imaging conjugates have included small molecules, peptides, proteins, DNA, metal chelates, and magnetic nanoparticles. The full potential of PTDs in novel in vivo molecular probes is currently under investigation. Cells have been labeled in culture using magnetic nanoparticles derivatized with a PTD and monitored in vivo to assess trafficking patterns relative to cells expressing a target antigen. In vivo imaging of PTD-mediated gene transfer to cells of the skin has been demonstrated in living animals. Here we review several natural and synthetic PTDs that have evolved in the quest for easier translocation across biological barriers, and the application of these peptide domains to the in vivo delivery of imaging agents.

  18. What do Islamic institutional fatwas say about medical and research confidentiality and breach of confidentiality?

    Science.gov (United States)

    Alahmad, Ghiath; Dierickx, Kris

    2012-08-01

    Protecting confidentiality is an essential value in all human relationships, no less in medical practice and research.(1) Doctor-patient and researcher-participant relationships are built on trust and on the understanding that patients' secrets will not be disclosed.(2) However, this confidentiality can be breached in some situations where it is necessary to meet a strong conflicting duty.(3) Confidentiality, in a general sense, has received much interest in Islamic resources including the Qur'an, Sunnah and juristic writings. However, medical and research confidentiality have not been explored deeply. There are few fatwas about the issue, despite an increased effort by both individuals and Islamic medical organizations to use these institutional fatwas in their research. Infringements on confidentiality make up a significant portion of institutional fatwas, yet they have never been thoroughly investigated. Moreover, the efforts of organizations and authors in this regard still require further exploration, especially on the issue of research confidentiality. In this article, we explore medical and research confidentiality and potential conflicts with this practice as a result of fatwas released by international, regional, and national Islamic Sunni juristic councils. We discuss how these fatwas affect research and publication by Muslim doctors, researchers, and Islamic medical organizations. We argue that more specialized fatwas are needed to clarify Islamic juristic views about medical and research confidentiality, especially the circumstances in which infringements on this confidentiality are justified. © 2012 Blackwell Publishing Ltd.

  19. Vertical deformation of lacustrine shorelines along breached relay ramps, Catlow Valley fault, southeastern Oregon, USA

    Science.gov (United States)

    Hopkins, Michael C.; Dawers, Nancye H.

    2016-04-01

    Vertical deformation of pluvial lacustrine shorelines is attributed to slip along the Catlow Valley fault, a segmented Basin and Range style normal fault in southeastern Oregon, USA. The inner edges of shorelines are mapped along three breached relay ramps along the fault to examine the effect of fault linkage on the distribution of slip. Shoreline inner edges act as paleohorizontal datums so deviations in elevation from horizontal, outside of a 2 m error window, are taken to be indications of fault slip. The sites chosen represent a spectrum of linkage scenarios in that the throw on the linking fault compared to that on the main fault adjacent to the linking fault varies from site to site. Results show that the maturity of the linkage between segments (i.e. larger throw on the linking fault with respect to the main fault) does not control the spatial distribution of shoreline deformation. Patterns of shoreline deformation indicate that the outboard, linking, and/or smaller ramp faults have slipped since the shorelines formed. Observations indicate that displacement has not fully localized on the linking faults following complete linkage between segments.

  20. Breach of belongingness: Newcomer relationship conflict, information, and task-related outcomes during organizational socialization.

    Science.gov (United States)

    Nifadkar, Sushil S; Bauer, Talya N

    2016-01-01

    Previous studies of newcomer socialization have underlined the importance of newcomers' information seeking for their adjustment to the organization, and the conflict literature has consistently reported negative effects of relationship conflict with coworkers. However, to date, no study has examined the consequences of relationship conflict for newcomers' information seeking. In this study, we examined newcomers' reactions when they have relationship conflict with their coworkers and hence cannot obtain necessary information from them. Drawing upon belongingness theory, we propose a model that moves from breach of belongingness to its proximal and distal consequences, to newcomer information seeking, and then to task-related outcomes. In particular, we propose that two paths exist, one coworker-centric and the other supervisor-centric, that may have simultaneous yet contrasting influences on newcomer adjustment. To test our model, we employed a 3-wave data collection design with egocentric and Likert-type multisource surveys among a sample of new software engineers and their supervisors working in India. This study contributes to the field by linking the literatures on relationship conflict and newcomer information seeking and by suggesting that, despite conflict with coworkers, newcomers may succeed in organizations by building relationships with and obtaining information from supervisors.

  1. Breaches of the pial basement membrane are associated with defective dentate gyrus development in mouse models of congenital muscular dystrophies.

    Science.gov (United States)

    Li, Jing; Yu, Miao; Feng, Gang; Hu, Huaiyu; Li, Xiaofeng

    2011-11-07

    A subset of congenital muscular dystrophies (CMDs) has central nervous system manifestations. There are good mouse models for these CMDs, including POMGnT1 knockout, POMT2 knockout and Large(myd) mice, all of which exhibit defects in the dentate gyrus. It is not known how the abnormal dentate gyrus forms during development. In this study, we conducted a detailed morphological examination of the dentate gyrus in adult and newborn POMGnT1 knockout, POMT2 knockout, and Large(myd) mice by immunofluorescence staining and electron microscopic analyses. We observed that the pial basement membrane overlying the dentate gyrus was disrupted and that granule cell precursors migrated ectopically through the breached pial basement membrane. In addition, the knockout dentate gyrus exhibited reactive gliosis in these mouse models. Thus, breaches in the pial basement membrane are associated with defective dentate gyrus development in mouse models of congenital muscular dystrophies.

  2. Can Cross-Listing Mitigate the Impact of an Information Security Breach Announcement on a Firm's Values?

    Science.gov (United States)

    Chen, Yong; Dong, Feng; Chen, Hong; Xu, Li

    2016-08-01

    The increase in globalization in the markets has driven firms to adopt online technologies and to cross-list their stocks. Recent studies have consistently found that announcements of information security breaches (ISBs) are negatively associated with the market values of the announcing firms during the days surrounding the breach announcements. Given the improvement in firms' information environments and the better protection for investors generated by cross-listing, does cross-listing help firms to reduce the negative impacts caused by their announcements of ISBs? This paper conducts an event study of 120 publicly traded firms (of which 25 cross-list and 95 do not) in order to explore the answer. The results indicate that the impact of ISB announcements on a firm's stock price shows no difference between cross-listing firms and non-cross-listing firms. Cross-listing does not mitigate the impact of ISB announcements on a firm's market value.
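The underlying event-study calculation is standard; a minimal market-model sketch with synthetic returns (the window lengths, the seed, and the simulated 3% announcement-day drop are all invented for illustration):

```python
import numpy as np

def car(stock, market, est_end, event_window):
    """Cumulative abnormal return: fit a market model R_stock = a + b * R_mkt
    on the estimation window, then sum (actual - expected) over the event window."""
    beta, alpha = np.polyfit(market[:est_end], stock[:est_end], 1)
    abnormal = stock[event_window] - (alpha + beta * market[event_window])
    return abnormal.sum()

rng = np.random.default_rng(7)
market = rng.normal(0.0, 0.01, 120)                 # daily market returns
stock = 1.2 * market + rng.normal(0.0, 0.002, 120)  # a beta-1.2 stock
stock[100] -= 0.03                                  # simulated breach-announcement drop
print(car(stock, market, est_end=90, event_window=slice(99, 103)))  # close to -0.03
```

Comparing average CARs between the cross-listing and non-cross-listing groups is then a simple two-sample test, which is the comparison the paper reports as showing no difference.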

  3. Obtaining the Probability Vector Current Density in Canonical Quantum Mechanics by Linear Superposition

    CERN Document Server

    Kauffmann, Steven Kenneth

    2013-01-01

    The quantum mechanics status of the probability vector current density has long seemed to be marginal. On one hand no systematic prescription for its construction is provided, and the special examples of it that are obtained for particular types of Hamiltonian operator could conceivably be attributed to happenstance. On the other hand this concept's key physical interpretation as local average particle flux, which flows from the equation of continuity that it is supposed to satisfy in conjunction with the probability scalar density, has been claimed to breach the uncertainty principle. Given the dispiriting impact of that claim, we straightaway point out that the subtle directional nature of the uncertainty principle makes it consistent with the measurement of local average particle flux. We next focus on the fact that the unique closed-form linear-superposition quantization of any classical Hamiltonian function yields in tandem the corresponding unique linear-superposition closed-form divergence of the proba...

  4. Hidden Variables or Positive Probabilities?

    CERN Document Server

    Rothman, T; Rothman, Tony

    2001-01-01

    Despite claims that Bell's inequalities are based on the Einstein locality condition, or equivalent, all derivations make an identical mathematical assumption: that local hidden-variable theories produce a set of positive-definite probabilities for detecting a particle with a given spin orientation. The standard argument is that because quantum mechanics assumes that particles are emitted in a superposition of states the theory cannot produce such a set of probabilities. We examine a paper by Eberhard who claims to show that a generalized Bell inequality, the CHSH inequality, can be derived solely on the basis of the locality condition, without recourse to hidden variables. We point out that he nonetheless assumes a set of positive-definite probabilities, which supports the claim that hidden variables or "locality" is not at issue here, positive-definite probabilities are. We demonstrate that quantum mechanics does predict a set of probabilities that violate the CHSH inequality; however these probabilities ar...

  5. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  7. Probability landscapes for integrative genomics

    Directory of Open Access Journals (Sweden)

    Benecke Arndt

    2008-05-01

    Background: The comprehension of the gene regulatory code in eukaryotes is one of the major challenges of systems biology, and is a requirement for the development of novel therapeutic strategies for multifactorial diseases. Its bi-fold degeneration precludes brute-force and statistical approaches based on the genomic sequence alone. Rather, recursive integration of systematic, whole-genome experimental data with advanced statistical regulatory sequence predictions needs to be developed. Such experimental approaches as well as the prediction tools are only starting to become available, and increasing numbers of genome sequences and empirical sequence annotations are under continual discovery-driven change. Furthermore, given the complexity of the question, a decade(s)-long multi-laboratory effort needs to be envisioned. These constraints need to be considered in the creation of a framework that can pave a road to successful comprehension of the gene regulatory code. Results: We introduce here a concept for such a framework, based entirely on systematic annotation in terms of probability profiles of genomic sequence using any type of relevant experimental and theoretical information, and subsequent cross-correlation analysis in hypothesis-driven model building and testing. Conclusion: Probability landscapes, which include as reference set the probabilistic representation of the genomic sequence, can be used efficiently to discover and analyze correlations amongst initially heterogeneous and un-relatable descriptions and genome-wide measurements. Furthermore, this structure is usable as a support for automatically generating and testing hypotheses for alternative gene regulatory grammars and the evaluation of those through statistical analysis of the high-dimensional correlations between genomic sequence, sequence annotations, and experimental data. Finally, this structure provides a concrete and tangible basis for attempting to formulate a

  8. Quasi 2D hydrodynamic modelling of the flooded hinterland due to dyke breaching on the Elbe River

    Directory of Open Access Journals (Sweden)

    S. Huang

    2007-01-01

    In flood modeling, many combined 1D/2D and fully 2D models are used to simulate the diversion of water from rivers through dyke breaches into the hinterland for extreme flood events. However, these models are too demanding in data requirements and computational resources, which is an important consideration when uncertainty analysis using Monte Carlo techniques is used to complement the modeling exercise. The goal of this paper is to show the development of a quasi-2D modeling approach, which still calculates the dynamic wave in 1D but discretises the computational units in 2D, allowing a better spatial representation of the flow in the hinterland due to dyke breaching without a large additional expenditure on data pre-processing and computational time. A 2D representation of the flow and velocity fields is required to model sediment and micro-pollutant transport. The model DYNHYD (1D hydrodynamics) from the WASP5 modeling package was used as a basis for the simulations. The model was extended to incorporate the quasi-2D approach, and a Monte Carlo analysis was used to conduct a flood sensitivity analysis to determine the sensitivity of parameters and boundary conditions to the resulting water flow. An extreme flood event on the Elbe River, Germany, with a possible dyke breach area was used as a test case. The results show a good similarity with those obtained from another 1D/2D modeling study.

  9. Single-Beam Bathymetry of the Hurricane Sandy Breach at Fire Island, New York, June 2013 (1-Meter Digital Elevation Model)

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This dataset, 20130626_bathy_DEM.zip, contains a 1-meter (m) grid of June 2013 bathymetry of the breach channel, ebb shoal, and adjacent coast of the Fire Island...

  10. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...

  11. Varieties of Belief and Probability

    NARCIS (Netherlands)

    D.J.N. van Eijck (Jan); S. Ghosh; J. Szymanik

    2015-01-01

    For reasoning about uncertain situations, we have probability theory, and we have logics of knowledge and belief. How does elementary probability theory relate to epistemic logic and the logic of belief? The paper focuses on the notion of betting belief, and interprets a language for

  12. Landau-Zener Probability Reviewed

    CERN Document Server

    Valencia, C

    2008-01-01

    We examine the survival probability for neutrino propagation through matter with variable density. We present a new method to calculate the level-crossing probability that differs from Landau's method by a constant factor, which is relevant in the interpretation of the neutrino flux from a supernova explosion.
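For reference, the classic Landau-Zener expression for the level-crossing (diabatic transition) probability, which the paper reports correcting by a constant factor, can be sketched as follows (all numerical values below are illustrative only):

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s

def landau_zener(coupling, sweep_rate):
    """Classic Landau-Zener diabatic transition probability:
    P = exp(-2*pi*|H12|^2 / (hbar * |d(E1 - E2)/dt|)).

    coupling:   off-diagonal element |H12| of the two-level Hamiltonian, in J
    sweep_rate: rate of change of the diabatic energy gap, in J/s"""
    return math.exp(-2.0 * math.pi * coupling**2 / (HBAR * abs(sweep_rate)))

print(landau_zener(1e-28, 1e-20))  # weak coupling / fast sweep: P near 1 (jump)
print(landau_zener(1e-25, 1e-20))  # strong coupling / slow sweep: P near 0 (adiabatic)
```

In the neutrino context, the flavor survival probability through a resonance is then built from this crossing probability together with the mixing angles.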

  13. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits...

  14. A graduate course in probability

    CERN Document Server

    Tucker, Howard G

    2014-01-01

    Suitable for a graduate course in analytic probability, this text requires only a limited background in real analysis. Topics include probability spaces and distributions, stochastic independence, basic limiting operations, strong limit theorems for independent random variables, the central limit theorem, conditional expectation and martingale theory, and an introduction to stochastic processes.

  15. Invariant probabilities of transition functions

    CERN Document Server

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  16. Linear Positivity and Virtual Probability

    CERN Document Server

    Hartle, J B

    2004-01-01

    We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. A quantum theory of closed systems requires two elements: 1) a condition specifying which sets of histories may be assigned probabilities that are consistent with the rules of probability theory, and 2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time-neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to i...

  17. Survival probability and ruin probability of a risk model

    Institute of Scientific and Technical Information of China (English)

    LUO Jian-hua

    2008-01-01

    In this paper, a new risk model is studied in which the rate of premium income is regarded as a random variable, the arrival of insurance policies is a Poisson process, and the claim-occurrence process is a p-thinning process. Integral representations of the survival probability are obtained, and an explicit formula for the survival probability on the infinite interval is derived in the special case of the exponential distribution. The Lundberg inequality and the general formula for the ruin probability are obtained using techniques from martingale theory.

  18. Probability

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    People much given to gambling usually manage to work out rough-and-ready ways of measuring the likelihood of certain situations so as to know which way to bet their money, and how much. If they did not do this, they would quickly lose all their money to those who did.

  19. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
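
The effect described in the abstract can be illustrated numerically. The following is a minimal Monte Carlo sketch, not the article's own calculation: it assumes a normal log-loss with unknown parameters, sets the control threshold at the fitted (1 − p) quantile, and checks the realised exceedance frequency against the nominal level.

```python
import random
import statistics
from statistics import NormalDist

# Sketch (not the article's calculation): a decision maker fits a normal
# model to n observed log-losses and sets the threshold at the fitted
# (1 - p) quantile. Parameter error makes the realised exceedance
# frequency exceed the nominal p.
random.seed(1)
mu, sigma = 0.0, 1.0          # true (but unknown to the decision maker)
p_nom = 0.05                  # nominal required failure probability
n, reps = 25, 5000            # sample size and Monte Carlo repetitions
z = NormalDist().inv_cdf(1 - p_nom)

true_exceed = []
for _ in range(reps):
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    mu_hat = statistics.fmean(sample)
    s_hat = statistics.stdev(sample)
    threshold = mu_hat + z * s_hat                       # plug-in quantile
    # probability, under the true distribution, of exceeding the threshold
    true_exceed.append(1 - NormalDist(mu, sigma).cdf(threshold))

avg = statistics.fmean(true_exceed)
print(f"nominal {p_nom:.3f}, realised {avg:.3f}")        # realised > nominal
```

Approach (1) in the abstract corresponds to lowering `p_nom` until the realised frequency meets the requirement; approach (2) corresponds to replacing the plug-in normal quantile with a predictive (for example, Student-t based) quantile.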

  20. Probability Ranking in Vector Spaces

    CERN Document Server

    Melucci, Massimo

    2011-01-01

    The Probability Ranking Principle states that the document set with the highest values of probability of relevance optimizes information retrieval effectiveness given the probabilities are estimated as accurately as possible. The key point of the principle is the separation of the document set into two subsets with a given level of fallout and with the highest recall. The paper introduces the separation between two vector subspaces and shows that the separation yields a more effective performance than the optimal separation into subsets with the same available evidence, the performance being measured with recall and fallout. The result is proved mathematically and exemplified experimentally.

  1. Holographic probabilities in eternal inflation.

    Science.gov (United States)

    Bousso, Raphael

    2006-11-10

    In the global description of eternal inflation, probabilities for vacua are notoriously ambiguous. The local point of view is preferred by holography and naturally picks out a simple probability measure. It is insensitive to large expansion factors or lifetimes and so resolves a recently noted paradox. Any cosmological measure must be complemented with the probability for observers to emerge in a given vacuum. In lieu of anthropic criteria, I propose to estimate this by the entropy that can be produced in a local patch. This allows for prior-free predictions.

  2. Local Causality, Probability and Explanation

    CERN Document Server

    Healey, Richard A

    2016-01-01

    In papers published in the 25 years following his famous 1964 proof John Bell refined and reformulated his views on locality and causality. Although his formulations of local causality were in terms of probability, he had little to say about that notion. But assumptions about probability are implicit in his arguments and conclusions. Probability does not conform to these assumptions when quantum mechanics is applied to account for the particular correlations Bell argues are locally inexplicable. This account involves no superluminal action and there is even a sense in which it is local, but it is in tension with the requirement that the direct causes and effects of events are nearby.

  3. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application

  4. Diurnal distribution of sunshine probability

    Energy Technology Data Exchange (ETDEWEB)

    Aydinli, S.

    1982-01-01

    The diurnal distribution of the sunshine probability is essential for the predetermination of average irradiances and illuminances by solar radiation on sloping surfaces. Most meteorological stations have only monthly average values of the sunshine duration available. It is, therefore, necessary to compute the diurnal distribution of sunshine probability starting from the average monthly values. It is shown how the symmetric component of the distribution of the sunshine probability, which is a consequence of a 'side-scene effect' of the clouds, can be calculated. The asymmetric components of the sunshine probability, which depend on the location and the seasons, and their influence on the predetermination of the global radiation are investigated and discussed.

  5. Probability representation of classical states

    NARCIS (Netherlands)

    Man'ko, OV; Man'ko; Pilyavets, OV

    2005-01-01

    Probability representation of classical states described by symplectic tomograms is discussed. Tomographic symbols of classical observables which are functions on phase-space are studied. Explicit form of kernel of commutative star-product of the tomographic symbols is obtained.

  6. Introduction to probability and measure

    CERN Document Server

    Parthasarathy, K R

    2005-01-01

    According to a remark attributed to Mark Kac 'Probability Theory is a measure theory with a soul'. This book with its choice of proofs, remarks, examples and exercises has been prepared taking both these aesthetic and practical aspects into account.

  7. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  8. Logic, probability, and human reasoning.

    Science.gov (United States)

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction.

  9. Default probabilities and default correlations

    OpenAIRE

    Erlenmaier, Ulrich; Gersbach, Hans

    2001-01-01

    Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations between loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...

  10. Joint probabilities and quantum cognition

    CERN Document Server

    de Barros, J Acacio

    2012-01-01

    In this paper we discuss the existence of joint probability distributions for quantum-like response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.
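
A textbook illustration (not necessarily the example used in the paper) of contextual ±1-valued random variables with no joint distribution: if all three pairwise expectations are required to equal −1, no joint probability assignment exists, as a brute-force search confirms.

```python
import itertools

# Three +/-1 random variables X, Y, Z with E[XY] = E[YZ] = E[XZ] = -1.
# Each expectation equals -1 only if the corresponding product is -1 on
# every outcome with positive probability, so it suffices to search for
# outcomes satisfying all three product constraints simultaneously.
states = list(itertools.product([-1, 1], repeat=3))
feasible = [s for s in states
            if s[0] * s[1] == -1 and s[1] * s[2] == -1 and s[0] * s[2] == -1]
print(feasible)  # [] -> no joint distribution can produce these correlations
```

The contradiction is elementary: xy = −1 and yz = −1 force xz = +1, so the three constraints are jointly unsatisfiable even though any two of them can be met.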

  11. Three lectures on free probability

    OpenAIRE

    2012-01-01

    These are notes from a three-lecture mini-course on free probability given at MSRI in the Fall of 2010 and repeated a year later at Harvard. The lectures were aimed at mathematicians and mathematical physicists working in combinatorics, probability, and random matrix theory. The first lecture was a staged rediscovery of free independence from first principles, the second dealt with the additive calculus of free random variables, and the third focused on random matrix models.

  12. Updating piping probabilities with survived historical loads

    NARCIS (Netherlands)

    Schweckendiek, T.; Kanning, W.

    2009-01-01

    Piping, also called under-seepage, is an internal erosion mechanism, which can cause the failure of dikes or other flood defence structures. The uncertainty in the resistance of a flood defence against piping is usually large, causing high probabilities of failure for this mechanism. A considerable

  13. Probability & Statistics: Modular Learning Exercises. Student Edition

    Science.gov (United States)

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The materials are centered on the fictional town of Happy Shores, a coastal community which is at risk for hurricanes. Actuaries at an insurance company figure out the risks and…

  14. Probability & Statistics: Modular Learning Exercises. Teacher Edition

    Science.gov (United States)

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The modules also introduce students to real world math concepts and problems that property and casualty actuaries come across in their work. They are designed to be used by teachers and…

  15. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...

  16. Channel Capacity Estimation using Free Probability Theory

    CERN Document Server

    Ryan, Øyvind

    2007-01-01

    In many channel measurement applications, one needs to estimate some characteristics of the channels based on a limited set of measurements. This is mainly due to the highly time varying characteristics of the channel. In this contribution, it will be shown how free probability can be used for channel capacity estimation in MIMO systems. Free probability has already been applied in various application fields such as digital communications, nuclear physics and mathematical finance, and has been shown to be an invaluable tool for describing the asymptotic behaviour of many systems when the dimensions of the system get large (i.e. the number of antennas). In particular, introducing the notion of free deconvolution, we provide hereafter an asymptotically (in the number of antennas) unbiased capacity estimator (w.r.t. the number of observations) for MIMO channels impaired with noise. Another unbiased estimator (for any number of observations) is also constructed by slightly modifying the free probability based est...

  17. Initial Impacts of the Mount Polley Tailings Pond Breach on Adjacent Aquatic Ecosystems

    Science.gov (United States)

    Petticrew, Ellen; Gantner, Nikolaus; Albers, Sam; Owens, Philip

    2015-04-01

    On August 4th 2014, the Mount Polley Tailings pond breach near Likely, B.C., released approximately 24 million cubic metres of tailings material into Polley Lake, Hazeltine Creek and Quesnel Lake. The discharge scoured and eroded a swath of soil and sediment delivering an unknown amount of metals and sediment into this tributary ecosystem of the Fraser River. Subsequent efforts by the mine operator to remediate by pumping tailings water from Polley Lake into Hazeltine Creek, which flows into Quesnel Lake, resulted in additional and continuous release of unknown volumes of contaminated water and sediments into the watershed. Heavy metals (e.g., selenium, copper, or mercury) reported as stored in the tailings pond entered the downstream aquatic environment and have been monitored in the water column of Quesnel Lake since August. These contaminants are likely particle-bound and thus subject to transport over long distances without appreciable degradation, resulting in the potential for chronic exposures and associated toxicological effects in exposed biota. While significant dilution is expected during aquatic transport, and the resulting concentrations in the water will likely be low, concentrations in exposed biota may become of concern over time. Metals such as mercury and selenium undergo bioaccumulation and biomagnification, once incorporated into the food chain/web. Thus, even small concentrations of such contaminants in water can lead to greater concentrations (~100 fold) in top predators. Over time, our predictions are that food web transfer will lead to an increase in concentrations from water (1-2 years)->invertebrates (1-2 yrs) ->fishes (2-5 yrs). Pacific salmon travel great distances in this watershed and may be exposed to contaminated water during their migrations. Resident species will be exposed to the contaminated waters and sediments in the study lakes year round. Little or no background/baseline data for metals in biota from Quesnel Lake exists

  18. Dam-Breach hydrology of the Johnstown flood of 1889-challenging the findings of the 1891 investigation report.

    Science.gov (United States)

    Coleman, Neil M; Kaktins, Uldis; Wojno, Stephanie

    2016-06-01

    In 1891 a report was published by an ASCE committee to investigate the cause of the Johnstown flood of 1889. They concluded that changes made to the dam by the South Fork Fishing and Hunting Club did not cause the disaster because the embankment would have been overflowed and breached even if the changes had not been made. We dispute that conclusion based on hydraulic analyses of the dam as originally built, estimates of the time of concentration and time to peak for the South Fork drainage basin, and reported conditions at the dam and in the watershed. We present a LiDAR-based volume of Lake Conemaugh at the time of dam failure (1.455 × 10⁷ m³) and hydrographs of flood discharge and lake stage decline. Our analytical approach incorporates the complex shape of this dam breach. More than 65 min would have been needed to drain most of the lake, not the 45 min cited by most sources. Peak flood discharges were likely in the range 7200 to 8970 m³ s⁻¹. The original dam design, with a crest ∼0.9 m higher and the added capacity of an auxiliary spillway and five discharge pipes, had a discharge capacity at overtopping more than twice that of the reconstructed dam. A properly rebuilt dam would not have been overtopped and would likely have survived the runoff event, thereby saving thousands of lives. We believe the ASCE report represented the state of the art for 1891. However, the report contains discrepancies and lapses in key observations, and relied on excessive reservoir inflow estimates. The confidence it expressed that dam failure was inevitable was inconsistent with information available to the committee. Hydrodynamic erosion was a likely culprit in the 1862 dam failure that seriously damaged the embankment. The Club's substandard repair of this earlier breach sowed the seeds of its eventual destruction.
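
The volume, drain time, and peak-discharge figures quoted in the abstract are mutually consistent, as a back-of-the-envelope check shows; the triangular-hydrograph factor of 2 below is our simplifying assumption, not the authors' method.

```python
# Back-of-envelope consistency check of the quoted figures. Assumption:
# a roughly triangular outflow hydrograph, for which peak ~ 2 x average.
volume = 1.455e7            # lake volume at failure, m^3 (from the study)
drain_time = 65 * 60        # minimum draining time, s (from the study)

q_avg = volume / drain_time
q_peak = 2 * q_avg          # triangular-hydrograph approximation

print(f"average discharge ~ {q_avg:.0f} m^3/s, "
      f"implied peak ~ {q_peak:.0f} m^3/s")
```

The implied peak of roughly 7500 m³ s⁻¹ falls inside the 7200 to 8970 m³ s⁻¹ range the study reports, so the numbers hang together.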

  19. The Risk of Goods in International Sales. An Approach from the Breach of Contract and Remedies of the Buyer

    Directory of Open Access Journals (Sweden)

    Álvaro Vidal Olivares

    2016-12-01

    This article examines the risk regime of the CISG, with the aim of showing that the regime it incorporates is based on criteria functional to the interests of the parties: for losses occurring before the risk passes to the buyer, the buyer retains recourse to the system of remedies, thereby connecting the allocation of risk with the breach of contract. The study employs the dogmatic method, based on a systematic analysis of the CISG rules, doctrine, and case law.

  20. Cluster Membership Probability: Polarimetric Approach

    CERN Document Server

    Medhi, Biman J

    2013-01-01

    Interstellar polarimetric data of the six open clusters Hogg 15, NGC 6611, NGC 5606, NGC 6231, NGC 5749 and NGC 6250 have been used to estimate the membership probability for the stars within them. For proper-motion member stars, the membership probability estimated using the polarimetric data is in good agreement with the proper-motion cluster membership probability. However, for proper-motion non-member stars, the membership probability estimated by the polarimetric method is in total disagreement with the proper-motion cluster membership probability. The inconsistencies in the determined memberships may be because of the fundamental differences between the two methods of determination: one is based on stellar proper-motion in space and the other is based on selective extinction of the stellar output by the asymmetric aligned dust grains present in the interstellar medium. The results and analysis suggest that the scatter of the Stokes vectors q(%) and u(%) for the proper-motion member stars depends on the ...

  1. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means of judging whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods.

  2. Probability theory a foundational course

    CERN Document Server

    Pakshirajan, R P

    2013-01-01

    This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the groundwork to later claim the existence of stochastic processes with prescribed finite-dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, the Erdos-Kac invariance principle, the functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a textbook for students pursuing graduate programs in Mathematics and/or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.

  3. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  4. Probability on real Lie algebras

    CERN Document Server

    Franz, Uwe

    2016-01-01

    This monograph is a progressive introduction to non-commutativity in probability theory, summarizing and synthesizing recent results about classical and quantum stochastic processes on Lie algebras. In the early chapters, focus is placed on concrete examples of the links between algebraic relations and the moments of probability distributions. The subsequent chapters are more advanced and deal with Wigner densities for non-commutative couples of random variables, non-commutative stochastic processes with independent increments (quantum Lévy processes), and the quantum Malliavin calculus. This book will appeal to advanced undergraduate and graduate students interested in the relations between algebra, probability, and quantum theory. It also addresses a more advanced audience by covering other topics related to non-commutativity in stochastic calculus, Lévy processes, and the Malliavin calculus.

  5. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2011-01-01

    A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d

  6. Innovation and social probable knowledge

    OpenAIRE

    Marco Crocco

    2000-01-01

    In this paper some elements of Keynes's theory of probability are used to understand the process of diffusion of an innovation. Based on work done elsewhere (Crocco 1999, 2000), we argue that this process can be viewed as a process of dealing with the collective uncertainty about how to sort out a technological problem. Expanding the concepts of weight of argument and probable knowledge to deal with this kind of uncertainty, we argue that the concepts of social weight of argument and social prob...

  7. Knowledge typology for imprecise probabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  8. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit

  9. Calculation Model and Simulation of Warship Damage Probability

    Institute of Scientific and Technical Information of China (English)

    TENG Zhao-xin; ZHANG Xu; YANG Shi-xing; ZHU Xiao-ping

    2008-01-01

    The combat efficiency of the mine obstacle is the focus of the present research. Based on the main factors affecting the target-warship damage probability, such as the maneuverability of the mines, the success rate of mine-laying, the hit probability, mine reliability, and action probability, a calculation model of the target-warship mine-encounter probability is put forward under the condition that the route selection of target warships follows a uniform distribution and the course of target warships follows a normal distribution. A damage probability model of maneuverable mines against target warships is then set up, and a simulation demonstrated the model's high practicality.
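
The multiplicative structure of such a model, treating the listed factors as independent event probabilities, can be sketched as follows; every numerical value here is a hypothetical placeholder, not a figure from the paper.

```python
# Hypothetical per-mine event probabilities (placeholders, not the paper's).
p_lay       = 0.95   # mine successfully laid
p_encounter = 0.30   # target warship passes within the mine's actuation zone
p_hit       = 0.60   # maneuvering mine reaches the target
p_reliable  = 0.90   # mine hardware functions
p_action    = 0.85   # mine actuates on the target signature

# Damage probability of a single mine: product of the independent events.
p_damage_one = p_lay * p_encounter * p_hit * p_reliable * p_action

# For an obstacle of n independently acting mines, at least one success:
n = 20
p_damage_field = 1 - (1 - p_damage_one) ** n

print(f"single mine: {p_damage_one:.4f}, field of {n}: {p_damage_field:.4f}")
```

The complement trick in the last step is why even a modest per-mine damage probability can yield a high obstacle-level damage probability as the field grows.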

  10. Comments on quantum probability theory.

    Science.gov (United States)

    Sloman, Steven

    2014-01-01

    Quantum probability theory (QP) is the best formal representation available of the most common form of judgment involving attribute comparison (inside judgment). People are capable, however, of judgments that involve proportions over sets of instances (outside judgment). Here, the theory does not do so well. I discuss the theory both in terms of descriptive adequacy and normative appropriateness.

  11. Exact Probability Distribution versus Entropy

    Directory of Open Access Journals (Sweden)

    Kerstin Andersson

    2014-10-01

    The problem addressed concerns the determination of the average number of successive attempts at guessing a word of a certain length consisting of letters with given probabilities of occurrence. Both first- and second-order approximations to a natural language are considered. The guessing strategy used is guessing words in decreasing order of probability. When word and alphabet sizes are large, approximations are necessary in order to estimate the number of guesses. Several kinds of approximations are discussed, demonstrating moderate requirements regarding both memory and central processing unit (CPU) time. When considering realistic sizes of alphabets and words (100), the number of guesses can be estimated within minutes with reasonable accuracy (a few percent) and may therefore constitute an alternative to, e.g., various entropy expressions. For many probability distributions, the density of the logarithm of probability products is close to a normal distribution. For those cases, it is possible to derive an analytical expression for the average number of guesses. The proportion of guesses needed on average compared to the total number decreases almost exponentially with the word length. The leading term in an asymptotic expansion can be used to estimate the number of guesses for large word lengths. Comparisons with analytical lower bounds and entropy expressions are also provided.
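
For small cases the quantity studied above can be computed exactly by brute force. A sketch under the first-order approximation (letters drawn independently with fixed probabilities; the two-letter alphabet is an illustrative assumption):

```python
from itertools import product
from math import prod

def expected_guesses(letter_probs, word_len):
    """Expected number of guesses when words of length word_len are
    guessed in decreasing order of probability, under the first-order
    approximation (letters i.i.d. with the given probabilities)."""
    word_probs = sorted(
        (prod(p) for p in product(letter_probs, repeat=word_len)),
        reverse=True,
    )
    return sum(rank * p for rank, p in enumerate(word_probs, start=1))

# Toy alphabet {a, b} with P(a)=0.7, P(b)=0.3, words of length 2:
# sorted word probabilities are 0.49, 0.21, 0.21, 0.09, giving
# 1*0.49 + 2*0.21 + 3*0.21 + 4*0.09 = 1.90 expected guesses.
print(expected_guesses([0.7, 0.3], 2))  # ≈ 1.90
```

The enumeration grows exponentially in word length, which is exactly why the paper turns to approximations for realistic alphabet and word sizes.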

  12. Stretching Probability Explorations with Geoboards

    Science.gov (United States)

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  13. Conditional Independence in Applied Probability.

    Science.gov (United States)

    Pfeiffer, Paul E.

    This material assumes the user has the background provided by a good undergraduate course in applied probability. It is felt that introductory courses in calculus, linear algebra, and perhaps some differential equations should provide the requisite experience and proficiency with mathematical concepts, notation, and argument. The document is…

  14. Fuzzy Markov chains: uncertain probabilities

    OpenAIRE

    2002-01-01

    We consider finite Markov chains where there are uncertainties in some of the transition probabilities. These uncertainties are modeled by fuzzy numbers. Using a restricted fuzzy matrix multiplication, we investigate the properties of regular and absorbing fuzzy Markov chains and show that the basic properties of these classical Markov chains generalize to fuzzy Markov chains.
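
The paper's restricted fuzzy matrix multiplication is not reproduced here, but its key constraint, that each row must still sum to one while entries range over their uncertainty sets, can be illustrated at the level of a single alpha-cut (interval) for a two-state chain. The transition values below are illustrative assumptions:

```python
def two_step_bounds(p_lo, p_hi, q=0.5, steps=1000):
    """Bounds on the two-step probability P2(1->1) for a two-state
    chain whose uncertain entry p = P(1->1) ranges over [p_lo, p_hi]
    and whose second row is known, P(2->1) = q. The row constraint
    forces P(1->2) = 1 - p, so we optimize f(p) = p*p + (1 - p)*q
    over the interval by a simple grid search."""
    vals = []
    for i in range(steps + 1):
        p = p_lo + (p_hi - p_lo) * i / steps
        vals.append(p * p + (1 - p) * q)
    return min(vals), max(vals)

# p in [0.6, 0.8], q = 0.5: f is increasing there, so the bounds are
# f(0.6) = 0.56 and f(0.8) = 0.74.
print(two_step_bounds(0.6, 0.8))  # ≈ (0.56, 0.74)
```

Multiplying the interval entries independently would ignore the row constraint and give looser bounds; the restricted product keeps the entries coupled through 1 - p.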

  16. DECOFF Probabilities of Failed Operations

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    A statistical procedure for estimating Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test-case models. The influence of the safety factor is also investigated. The DECOFF statistical method is benchmarked against the standard Alpha...

  17. A Novel Approach to Probability

    CERN Document Server

    Kafri, Oded

    2016-01-01

    When P indistinguishable balls are randomly distributed among L distinguishable boxes, considering the dense system in which P is much greater than L, our natural intuition tells us that the box with the average number of balls has the highest probability and that none of the boxes are empty; in reality, however, the probability of the empty box is always the highest. This fact stands in contradistinction to the sparse system, in which the number of balls is smaller than the number of boxes (e.g., energy distribution in a gas) and the average value has the highest probability. Here we show that when we postulate the requirement that all possible configurations of balls in the boxes have equal probabilities, a realistic "long tail" distribution is obtained. This formalism, when applied to sparse systems, converges to distributions in which the average is preferred. We calculate some of the distributions resulting from this postulate and obtain most of the known distributions in nature, namely, Zipf's law, Benford's law, part...
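
The claim that the empty box is the mode can be checked directly: when all stars-and-bars configurations are equally likely, the occupancy of a fixed box has a closed-form distribution. A small verification sketch (P = 20 balls and L = 5 boxes are chosen for illustration):

```python
from math import comb

def occupancy_pmf(P, L):
    """P indistinguishable balls in L distinguishable boxes, with all
    C(P+L-1, L-1) configurations equally likely: probability that one
    fixed box holds exactly k balls, for k = 0..P. Configurations with
    that box at k distribute the remaining P-k balls among the other
    L-1 boxes, giving C(P-k+L-2, L-2) of them."""
    total = comb(P + L - 1, L - 1)
    return [comb(P - k + L - 2, L - 2) / total for k in range(P + 1)]

pmf = occupancy_pmf(20, 5)
print(pmf.index(max(pmf)))  # → 0: the empty box is the most probable
```

The pmf is strictly decreasing in k, so zero occupancy is the mode even though the average occupancy is P/L = 4, matching the "long tail" behavior described above.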

  18. Probability representations of fuzzy systems

    Institute of Scientific and Technical Information of China (English)

    LI Hongxing

    2006-01-01

    In this paper, the probability significance of fuzzy systems is revealed. It is pointed out that the COG (center of gravity) method, a defuzzification technique commonly used in fuzzy systems, is reasonable and is optimal in the mean-square sense. Based on different fuzzy implication operators, several typical probability distributions, such as the Zadeh distribution, the Mamdani distribution, and the Lukasiewicz distribution, are given. These distributions act as "inner kernels" of fuzzy systems. Furthermore, from some properties of the probability distributions of fuzzy systems, it is also demonstrated that the CRI method, proposed by Zadeh for constructing fuzzy systems, is basically reasonable and effective. In addition, the special role of uniform probability distributions in fuzzy systems is characterized. Finally, the relationship between the CRI method and the triple I method is discussed. In the sense of constructing fuzzy systems, when the three fuzzy implication operators in the triple I method are restricted to the same operator, the CRI method and the triple I method may be related in three basic ways: 1) the two methods are equivalent; 2) the latter is a degeneration of the former; 3) the latter is trivial whereas the former is not. When the three fuzzy implication operators in the triple I method are not restricted to the same operator, the CRI method is a special case of the triple I method; that is, the triple I method is the more comprehensive algorithm. Since the triple I method has a sound logical foundation and embodies an idea of optimized reasoning, it holds broad promise for application.
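
The COG (center-of-gravity) method mentioned above has a direct probabilistic reading: normalizing the membership function yields a distribution whose mean is the defuzzified output, which is the sense in which COG is mean-square optimal. A discrete sketch (the support points and membership grades are illustrative, not from the paper):

```python
def cog_defuzzify(xs, mu):
    """Center-of-gravity defuzzification: the crisp output is the
    membership-weighted mean of the support points, i.e. the mean of
    the probability distribution obtained by normalizing mu to sum
    to one."""
    total = sum(mu)
    return sum(x * m for x, m in zip(xs, mu)) / total

xs = [0.0, 1.0, 2.0, 3.0]   # support points of the fuzzy output set
mu = [0.1, 0.4, 0.4, 0.1]   # membership grades (symmetric about 1.5)
print(cog_defuzzify(xs, mu))  # ≈ 1.5 by symmetry
```

Mean-square optimality here is the familiar fact that the mean of a distribution minimizes the expected squared error among all single-point summaries.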

  19. Time-lapse imagery of the breaching of Marmot Dam, Oregon, and subsequent erosion of sediment by the Sandy River, October 2007 to May 2008

    Science.gov (United States)

    Major, Jon J.; Spicer, Kurt R.; Collins, Rebecca A.

    2010-01-01

    In 2007, Marmot Dam on the Sandy River, Oregon, was removed and a temporary cofferdam standing in its place was breached, allowing the river to flow freely along its entire length. Time-lapse imagery obtained from a network of digital single-lens reflex cameras placed around the lower reach of the sediment-filled reservoir behind the dam details rapid erosion of sediment by the Sandy River after breaching of the cofferdam. Within hours of the breaching, the Sandy River eroded much of the nearly 15-m-thick frontal part of the sediment wedge impounded behind the former concrete dam; within 24-60 hours it eroded approximately 125,000 m3 of sediment impounded in the lower 300-meter reach of the reservoir. The imagery shows that the sediment eroded initially through vertical incision, but that lateral erosion rapidly became an important process.

  20. FIIS_Breach_Shorelines.shp - Fire Island National Seashore Wilderness Breach Shoreline Data Collected from Fire Island, New York, October 2014 to September 2016

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — Hurricane Sandy made U.S. landfall, coincident with astronomical high tides, near Atlantic City, New Jersey, on October 29, 2012. The storm, the largest on...