WorldWideScience

Sample records for high breaching probability

  1. An approach for estimating the breach probabilities of moraine-dammed lakes in the Chinese Himalayas using remote-sensing data

    Directory of Open Access Journals (Sweden)

    X. Wang

    2012-10-01

    An objective approach is presented for making first-order estimates of the probability of moraine-dammed lake outburst floods (MDLOFs) and for prioritizing the breach probabilities posed by potentially dangerous moraine-dammed lakes (PDMDLs) in the Chinese Himalayas. We first select five indicators to identify PDMDLs according to four predesigned criteria. The climatic background is regarded as the precondition of moraine-dam failure; under different climatic preconditions we distinguish the trigger mechanisms of MDLOFs and subdivide them into 17 possible breach modes, each mode having three or four components. We combine the precondition, modes, and components to construct a decision-making tree of moraine-dam failure. Conversion guidelines were established to quantify the probabilities of the components of a breach mode, employing the historic-performance method combined with expert knowledge and experience. The Chinese Himalayas were chosen as the study area because of the frequent MDLOFs there in recent decades. The results show that the breach probabilities (P) of 142 PDMDLs range from 0.037 to 0.345 and can be further categorized as 43 lakes with very high breach probabilities (P ≥ 0.24), 47 lakes with high breach probabilities (0.18 ≤ P < 0.24), 24 lakes with mid-level breach probabilities (0.12 ≤ P < 0.18), 24 lakes with low breach probabilities (0.06 ≤ P < 0.12), and four lakes with very low breach probabilities (P < 0.06).
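
    The category boundaries reported above lend themselves to a direct worked example. The sketch below simply bins a lake's breach probability P into the five classes listed in the abstract; the function name and example values are illustrative, not taken from the paper.

```python
def breach_category(p: float) -> str:
    """Bin a lake's breach probability into the classes used in the abstract."""
    if p >= 0.24:
        return "very high"
    elif p >= 0.18:
        return "high"
    elif p >= 0.12:
        return "mid-level"
    elif p >= 0.06:
        return "low"
    return "very low"

if __name__ == "__main__":
    # Illustrative values spanning the reported range 0.037-0.345.
    for p in (0.345, 0.20, 0.15, 0.08, 0.037):
        print(f"P = {p:.3f} -> {breach_category(p)}")
```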

  2. Data Breach Preparation

    Energy Technology Data Exchange (ETDEWEB)

    Belangia, David Warren [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]

    2015-03-13

    The Home Depot Data Breach is the second largest data breach on record, affecting up to 56 million debit and credit cards. A trusted vendor account, coupled with the use of a previously unknown variant of malware that allowed the establishment of a foothold, was the entry point into the Home Depot network. Once inside the perimeter, privilege escalation provided an avenue to obtain the desired information. Home Depot did, however, learn some lessons from Target: it communicated better than Target did, procured insurance, and instituted as secure an environment as possible. There are specific measures an institution should undertake to prepare for a data breach, and everyone can learn from this breach. Publicly available information about the Home Depot Data Breach provides insight into the attack, an old malware variant with a new twist. Although the malware was modified to be unrecognizable by existing tools, it probably should have been detected. There are also concerns about Home Depot's insurance and the insurance provider's apparent failure to fully reimburse Home Depot for its losses. The effect on shareholders and Home Depot's stock price was short lived. This story is still evolving, but it provides interesting lessons learned about how an organization should prepare for its inevitable breach.

  3. Probabilistic assessment of spent-fuel cladding breach

    International Nuclear Information System (INIS)

    Foadian, H.; Rashid, Y.R.; Seager, K.D.

    1991-01-01

    A methodology for determining the probability of spent-fuel cladding breach due to normal and accident class B cask transport conditions is introduced. This technique uses deterministic stress analysis results as well as probabilistic cladding material properties, initial flaws, and breach criteria. Best estimates are presented for the probability distributions of irradiated Zircaloy properties such as ductility and fracture toughness, and for fuel rod initial conditions such as manufacturing flaws and PCI part-wall cracks. Example analyses are used to illustrate the implementation of this methodology for a BWR (GE 7 x 7) and a PWR (B&W 15 x 15) assembly. The cladding breach probabilities for each assembly are tabulated for regulatory normal and accident transport conditions, including fire.

  4. Probabilistic assessment of spent-fuel cladding breach

    International Nuclear Information System (INIS)

    Foadian, H.; Rashid, Y.R.; Seager, K.D.

    1992-01-01

    In this paper a methodology for determining the probability of spent-fuel cladding breach due to normal and accident class B cask transport conditions is introduced. This technique uses deterministic stress analysis results as well as probabilistic cladding material properties, initial flaws, and breach criteria. Best estimates are presented for the probability distributions of irradiated Zircaloy properties such as ductility and fracture toughness, and for fuel rod initial conditions such as manufacturing flaws and PCI part-wall cracks. Example analyses are used to illustrate the implementation of this methodology for a BWR (GE 7 x 7) and a PWR (B&W 15 x 15) assembly. The cladding breach probabilities for each assembly are tabulated for regulatory normal and accident transport conditions, including fire.
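
    The two records above describe combining deterministic stress results with probabilistic cladding properties and breach criteria. As a rough illustration of that general stress-versus-strength idea only, the following Monte Carlo sketch estimates a breach probability; every distribution and number in it is a hypothetical placeholder, not a value or method detail from the reports.

```python
import math
import random

def cladding_breach_probability(stress_mpa: float, n_trials: int = 100_000, seed: int = 0) -> float:
    """Monte Carlo estimate of P(cladding capacity < stress demand) for one condition."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_trials):
        # Hypothetical lognormal capacity of irradiated cladding (MPa).
        capacity = math.exp(rng.gauss(math.log(600.0), 0.15))
        # Hypothetical knock-down factor for manufacturing flaws / PCI part-wall cracks.
        flaw_factor = min(1.0, rng.gauss(0.9, 0.05))
        if capacity * flaw_factor < stress_mpa:
            failures += 1
    return failures / n_trials

if __name__ == "__main__":
    # Hypothetical deterministic stress results for two transport conditions.
    for condition, stress in (("normal transport", 250.0), ("accident + fire", 450.0)):
        print(f"{condition}: estimated breach probability = {cladding_breach_probability(stress):.4f}")
```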

  5. Tapping Transaction Costs to Forecast Acquisition Cost Breaches

    Science.gov (United States)

    2016-01-01

    experience a cost breach. In our medical example, we could use survival analysis to identify risk factors, such as obesity, that might indicate a greater... exogenous variables on the probability of a dichotomous outcome, such as whether or not a cost breach occurs in any given program year. Logit is...
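
    The snippet above refers to logit modelling of a dichotomous outcome (a cost breach occurring or not). A minimal sketch of that idea follows, with hypothetical coefficients and predictor names that are not from the report.

```python
import math

def cost_breach_probability(x: dict, beta: dict, intercept: float) -> float:
    """Logit model: P(breach) = 1 / (1 + exp(-(b0 + sum_i b_i * x_i)))."""
    z = intercept + sum(beta[name] * value for name, value in x.items())
    return 1.0 / (1.0 + math.exp(-z))

if __name__ == "__main__":
    # Hypothetical fitted coefficients and program-year predictors.
    beta = {"schedule_slip_months": 0.15, "requirements_changes": 0.30}
    program_year = {"schedule_slip_months": 6.0, "requirements_changes": 2.0}
    p = cost_breach_probability(program_year, beta, intercept=-2.0)
    print(f"P(cost breach this program year) = {p:.2f}")
```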

  6. Intermittent ephemeral river-breaching

    Science.gov (United States)

    Reniers, A. J.; MacMahan, J. H.; Gallagher, E. L.; Shanks, A.; Morgan, S.; Jarvis, M.; Thornton, E. B.; Brown, J.; Fujimura, A.

    2012-12-01

    In the summer of 2011 we performed a field experiment at Carmel River State Beach, CA, during a period when intermittent natural breaching of the ephemeral Carmel River was occurring because of an unusually rainy period, associated with El Nino, prior to the experiment. At this time the river would fill the lagoon over a number of days, after which a breach would occur. This allowed us to document a number of breaches with unique pre- and post-breach topographic surveys and accompanying ocean and lagoon water elevations, as well as extremely high flow velocities (4 m/s) in the river mouth during the breaching event. The topographic surveys were obtained with a GPS-equipped backpack carried by a person on foot and show the evolution of the river breaching, with a gradually widening and deepening river channel that cuts through the pre-existing beach and berm. The beach face is steep, with an average slope of 1:10, and significantly reflects the incident waves (MacMahan et al., 2012). The wave directions are generally shore normal as the waves refract over the deep canyon located offshore of the beach. The tide is mixed semi-diurnal with a range on the order of one meter. Breaching typically occurred during the low-low tide. Grain size is highly variable along the beach, with layers of alternating fine and coarse material that could clearly be observed as the river exit channel cut through the beach. Large rocky outcroppings buried under the beach sand are also present along certain stretches of the beach, controlling the depth of the breaching channel. The changes in water level measured within the lagoon and on the ocean side allow for an estimate of the volume flux associated with the breach as a function of morphology, tidal elevation, and wave conditions, as well as an assessment of the conditions and mechanisms of breach closure, which occurred on a time scale of O(0.5 days). Exploratory model simulations will be presented at the

  7. Simulation of Breach Outflow for Earthfill Dam

    International Nuclear Information System (INIS)

    Razad, Azwin Zailti Abdul; Muda, Rahsidi Sabri; Sidek, Lariyah Mohd; Azia, Intan Shafilah Abdul; Mansor, Faezah Hanum; Yalit, Ruzaimei

    2013-01-01

    Dams have been built for many reasons, such as irrigation, hydropower, flood mitigation, and water supply, to support development for the benefit of humans. However, the huge amount of water stored behind a dam can have serious adverse impacts on the downstream community should it be released in an unwanted dam-break event. To minimise the potential loss of lives and property damage, a workable Emergency Response Plan must be developed. As a responsible dam owner and operator, TNB initiated a study on dam breach modelling for the Cameron Highlands Hydroelectric Scheme to simulate a potential dam breach of Jor Dam. Prediction of the dam breach parameters using the empirical equations of Froehlich and MacDonald-Langridge-Monopolis formed the basis of the modelling, coupled with the MIKE 11 software to obtain the breach outflow due to the Probable Maximum Flood (PMF). This paper discusses the model setup, the simulation procedure, and a comparison of the predictions with existing equations.
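
    As a hedged illustration of the empirical breach-parameter step mentioned above, the sketch below evaluates the Froehlich (1995) regression equations as they are commonly cited in the dam-break literature (average breach width and formation time from reservoir volume and breach height). The coefficients should be checked against the original reference before any real use, and the input values are purely illustrative rather than Jor Dam data; the MacDonald-Langridge-Monopolis equations and the MIKE 11 routing step are not reproduced here.

```python
def froehlich_1995(v_w_m3: float, h_b_m: float, overtopping: bool = True):
    """Return (average breach width [m], breach formation time [h])."""
    k_o = 1.4 if overtopping else 1.0       # failure-mode factor
    b_avg = 0.1803 * k_o * v_w_m3 ** 0.32 * h_b_m ** 0.19
    t_f = 0.00254 * v_w_m3 ** 0.53 * h_b_m ** -0.90
    return b_avg, t_f

if __name__ == "__main__":
    # Hypothetical reservoir: 5 million m3 stored above a 30 m high breach.
    b_avg, t_f = froehlich_1995(v_w_m3=5.0e6, h_b_m=30.0)
    print(f"average breach width ~ {b_avg:.0f} m, formation time ~ {t_f:.2f} h")
```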

  8. Dam Break Analysis of Embankment Dams Considering Breach Characteristics

    Directory of Open Access Journals (Sweden)

    Abolfazl Shamsaei

    2004-05-01

    The study of dam breaks requires the definition of various parameters such as the cause of the break, its type, its dimensions, and the duration of breach development. Precise forecasting of the different aspects of the breach is one of the most important factors in analyzing embankment dam failure. The characteristics of the breach and the determination of their vulnerability have the greatest effect on the waves resulting from a dam break. The breach parameters for the Silveh earth dam were determined using a suitable model. For the Silveh dam, a trapezoidal breach with side slope z = 0.01 and an average base width b = 80 m was computed. The duration of breach development is 1.9 hours. Based on these results and the application of the DAM Break software, the consequences of a probable break of the dam were determined. The analysis of the flooding of the city of Piranshahr, located 12 km from the Silveh dam, confirms that within 3 hours the water will reach a level of 1425 meters.

  9. Psychological contract breach among allied health professionals.

    Science.gov (United States)

    Rodwell, John; Gulyas, Andre

    2015-01-01

    Allied health professionals are vital for effective healthcare yet there are continuing shortages of these employees. Building on work with other healthcare professionals, the purpose of this paper is to investigate the influence of psychological contract (PC) breach and types of organisational justice on variables important to retention among allied health professionals: mental health and organisational commitment. The potential effects of justice on the negative outcomes of breach were examined. Multiple regressions analysed data from 113 allied health professionals working in a medium-large Australian healthcare organisation. The main negative impacts on respondents' mental health and commitment were from high PC breach, low procedural and distributive justice and less respectful treatment from organisational representatives. The interaction between procedural justice and breach illustrates that breach may be forgivable if processes are fair. Surprisingly, a betrayal or "aggravated breach effect" may occur after a breach when interpersonal justice is high. Further, negative affectivity was negatively related to respondents' mental health (affective outcomes) but not commitment (work-related attitude). Healthcare organisations should ensure the fairness of decisions and avoid breaking promises within their control. If promises cannot reasonably be kept, transparency of processes behind the breach may allow allied health professionals to understand that the organisation did not purposefully fail to fulfil expectations. This study offers insights into how breach and four types of justice interact to influence employee mental health and work attitudes among allied health professionals.

  10. Dam-breach analysis and flood-inundation mapping for Lakes Ellsworth and Lawtonka near Lawton, Oklahoma

    Science.gov (United States)

    Rendon, Samuel H.; Ashworth, Chad E.; Smith, S. Jerrod

    2012-01-01

    Dams provide beneficial functions such as flood control, recreation, and reliable water supplies, but they also entail risk: dam breaches and resultant floods can cause substantial property damage and loss of life. The State of Oklahoma requires each owner of a high-hazard dam, which the Federal Emergency Management Agency defines as dams for which failure or misoperation probably will cause loss of human life, to develop an emergency action plan specific to that dam. Components of an emergency action plan are to simulate a flood resulting from a possible dam breach and map the resulting downstream flood-inundation areas. The resulting flood-inundation maps can provide valuable information to city officials, emergency managers, and local residents for planning the emergency response if a dam breach occurs. Accurate topographic data are vital for developing flood-inundation maps. This report presents results of a cooperative study by the city of Lawton, Oklahoma, and the U.S. Geological Survey (USGS) to model dam-breach scenarios at Lakes Ellsworth and Lawtonka near Lawton and to map the potential flood-inundation areas of such dam breaches. To assist the city of Lawton with completion of the emergency action plans for Lakes Ellsworth and Lawtonka Dams, the USGS collected light detection and ranging (lidar) data that were used to develop a high-resolution digital elevation model and a 1-foot contour elevation map for the flood plains downstream from Lakes Ellsworth and Lawtonka. This digital elevation model and field measurements, streamflow-gaging station data (USGS streamflow-gaging station 07311000, East Cache Creek near Walters, Okla.), and hydraulic values were used as inputs for the dynamic (unsteady-flow) model, Hydrologic Engineering Center's River Analysis System (HEC-RAS). The modeled flood elevations were exported to a geographic information system to produce flood-inundation maps. Water-surface profiles were developed for a 75-percent probable maximum

  11. The extreme risk of personal data breaches and the erosion of privacy

    Science.gov (United States)

    Wheatley, Spencer; Maillart, Thomas; Sornette, Didier

    2016-01-01

    Personal data breaches from organisations, enabling mass identity fraud, constitute an extreme risk. This risk worsens daily as an ever-growing amount of personal data are stored by organisations and on-line, and the attack surface surrounding this data becomes larger and harder to secure. Further, breached information is distributed and accumulates in the hands of cyber criminals, thus driving a cumulative erosion of privacy. Statistical modeling of breach data from 2000 through 2015 provides insights into this risk: A current maximum breach size of about 200 million is detected, and is expected to grow by fifty percent over the next five years. The breach sizes are found to be well modeled by an extremely heavy-tailed truncated Pareto distribution, with tail exponent parameter decreasing linearly from 0.57 in 2007 to 0.37 in 2015. With this current model, given a breach contains above fifty thousand items, there is a ten percent probability of exceeding ten million. A size effect is unearthed where both the frequency and severity of breaches scale with organisation size like s^0.6. Projections indicate that the total amount of breached information is expected to double from two to four billion items within the next five years, eclipsing the population of users of the Internet. This massive and uncontrolled dissemination of personal identities raises fundamental concerns about privacy.
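
    The conditional-exceedance statement above can be approximated with a plain Pareto tail, for which P(X > x | X > x0) = (x/x0)^(-alpha). The sketch below plugs in the reported tail exponents; because the study actually fits a truncated Pareto, these numbers only bracket the reported ten percent figure rather than reproduce it.

```python
def conditional_exceedance(x: float, x0: float, alpha: float) -> float:
    """P(breach size > x | breach size > x0) under a pure Pareto tail."""
    return (x / x0) ** -alpha

if __name__ == "__main__":
    for year, alpha in ((2007, 0.57), (2015, 0.37)):
        p = conditional_exceedance(x=10e6, x0=50e3, alpha=alpha)
        print(f"{year} fit (alpha = {alpha}): P(>10M | >50k) ~ {p:.2f}")
```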

  12. What Caused the Breach? An Examination of Use of Information Technology and Health Data Breaches

    Science.gov (United States)

    Wikina, Suanu Bliss

    2014-01-01

    Data breaches arising from theft, loss, unauthorized access/disclosure, improper disclosure, or hacking incidents involving personal health information continue to increase every year. As of September 2013, reported breaches affecting individuals reached close to 27 million since 2009, when compilation of records on breaches began. These breaches, which involved 674 covered entities and 153 business associates, involved computer systems and networks, desktop computers, laptops, paper, e-mail, electronic health records, and removable/portable devices (CDs, USBs, x-ray films, backup tapes, etc.). Even with the increased use of health information technology by health institutions and allied businesses, theft and loss (not hacking) constitute the major types of data breaches encountered. Removable/portable devices, desktop computers, and laptops were the top sources or locations of the breached information, while the top six states—Virginia, Illinois, California, Florida, New York, and Tennessee—in terms of the number of reported breaches accounted for nearly 75 percent of the total individual breaches, 33 percent of breaches in covered entities, and about 30 percent of the total breaches involving business associates. PMID:25593574

  13. What caused the breach? An examination of use of information technology and health data breaches.

    Science.gov (United States)

    Wikina, Suanu Bliss

    2014-01-01

    Data breaches arising from theft, loss, unauthorized access/disclosure, improper disclosure, or hacking incidents involving personal health information continue to increase every year. As of September 2013, reported breaches affecting individuals reached close to 27 million since 2009, when compilation of records on breaches began. These breaches, which involved 674 covered entities and 153 business associates, involved computer systems and networks, desktop computers, laptops, paper, e-mail, electronic health records, and removable/portable devices (CDs, USBs, x-ray films, backup tapes, etc.). Even with the increased use of health information technology by health institutions and allied businesses, theft and loss (not hacking) constitute the major types of data breaches encountered. Removable/portable devices, desktop computers, and laptops were the top sources or locations of the breached information, while the top six states-Virginia, Illinois, California, Florida, New York, and Tennessee-in terms of the number of reported breaches accounted for nearly 75 percent of the total individual breaches, 33 percent of breaches in covered entities, and about 30 percent of the total breaches involving business associates.

  14. Dam-breach analysis and flood-inundation mapping for selected dams in Oklahoma City, Oklahoma, and near Atoka, Oklahoma

    Science.gov (United States)

    Shivers, Molly J.; Smith, S. Jerrod; Grout, Trevor S.; Lewis, Jason M.

    2015-01-01

    Dams provide beneficial functions such as flood control, recreation, and storage of water supplies, but they also entail risk; dam breaches and resultant floods can cause substantial property damage and loss of life. The State of Oklahoma requires each owner of a high-hazard dam, which the Federal Emergency Management Agency defines as dams for which failure or improper operation probably will cause loss of human life, to develop an emergency action plan specific to that dam. Components of an emergency action plan are to simulate a flood resulting from a possible dam breach and map the resulting downstream flood-inundation areas. The resulting flood-inundation maps can provide valuable information to city officials, emergency managers, and local residents for planning an emergency response if a dam breach occurs.

  15. 38 CFR 75.113 - Data breach.

    Science.gov (United States)

    2010-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 2 2010-07-01 2010-07-01 false Data breach. 75.113 Section 75.113 Pensions, Bonuses, and Veterans' Relief DEPARTMENT OF VETERANS AFFAIRS (CONTINUED) INFORMATION SECURITY MATTERS Data Breaches § 75.113 Data breach. Consistent with the definition of data breach in § 75.112 of this subpart, a data breach...

  16. Applying Mechanistic Dam Breach Models to Historic Levee Breaches

    OpenAIRE

    Risher Paul; Gibson Stanford

    2016-01-01

    Hurricane Katrina elevated levee risk in the US national consciousness, motivating agencies to assess and improve their levee risk assessment methodology. Accurate computation of the flood flow magnitude and timing associated with a levee breach remains one of the most difficult and uncertain components of levee risk analysis. Contemporary methods are largely empirical and approximate, introducing substantial uncertainty to the damage and life loss models. Levee breach progressions are often ...

  17. Breached fuel pin contamination from Run Beyond Cladding Breach (RBCB) tests in EBR-II

    International Nuclear Information System (INIS)

    Colburn, R.P.; Strain, R.V.; Lambert, J.D.B.; Ukai, S.; Shibahara, I.

    1988-09-01

    Studies indicate there may be a large economic incentive to permit some continued reactor operation with breached fuel pin cladding. A major concern for this type of operation is the potential spread of contamination in the primary coolant system and its impact on plant maintenance. A study of the release and transport of contamination from naturally breached mixed oxide Liquid Metal Reactor (LMR) fuel pins was performed as part of the US Department of Energy/Power Reactor and Nuclear Fuel Development Corporation (DOE/PNC) Run Beyond Cladding Breach (RBCB) Program at EBR-II. The measurements were made using the Breached Fuel Test Facility (BFTF) at EBR-II with replaceable deposition samplers located approximately 1.5 meters from the breached fuel test assemblies. The effluent from the test assemblies containing the breached fuel pins was routed up through the samplers and past dedicated instrumentation in the BFTF before mixing with the main coolant flow stream. This paper discusses the first three contamination tests in this program. 2 refs., 5 figs., 2 tabs

  18. In-reactor cladding breach of EBR-II driver-fuel elements

    International Nuclear Information System (INIS)

    Seidel, B.R.; Einziger, R.E.

    1977-01-01

    Knowledge of performance and minimum useful element lifetime of Mark-II driver-fuel elements is required to maintain a high plant operating capacity factor with maximum fuel utilization. To obtain such knowledge, intentional cladding breach has been obtained in four run-to-cladding-breach Mark-II experimental driver-fuel subassemblies operating under normal conditions in EBR-II. Breach and subsequent fission-product release proved benign to reactor operations. The breaches originated on the outer surface of the cladding in the root of the restrainer dimples and were intergranular. The Weibull distribution of lifetime accurately predicts the observed minimum useful element lifetime of 10 at.% burnup, with breach ensuing shortly thereafter.
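
    The Weibull lifetime statement above can be illustrated with a short sketch evaluating a Weibull breach probability as a function of burnup; the shape and scale parameters used below are hypothetical placeholders, not the values fitted for EBR-II Mark-II elements.

```python
import math

def breach_cdf(burnup_at_pct: float, eta: float = 12.0, beta: float = 8.0) -> float:
    """Weibull CDF: probability of cladding breach at or before the given burnup."""
    return 1.0 - math.exp(-((burnup_at_pct / eta) ** beta))

if __name__ == "__main__":
    for b in (8.0, 10.0, 12.0):   # burnup in at.%
        print(f"burnup {b:4.1f} at.% -> P(breach) ~ {breach_cdf(b):.3f}")
```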

  19. Breached cylinder incident at the Portsmouth gaseous diffusion plant

    Energy Technology Data Exchange (ETDEWEB)

    Boelens, R.A. [Martin Marietta Energy Systems, Inc., Piketon, OH (United States)]

    1991-12-31

    On June 16, 1990, during an inspection of valves on partially depleted product storage cylinders, a 14-ton partially depleted product cylinder was discovered breached. The cylinder had been placed in long-term storage in 1977 on the top row of Portsmouth's (two rows high) storage area. The breach was observed when an inspector noticed a pile of green material alongside the cylinder. The breach was estimated to be approximately 8 inches wide and 16 inches long, and ran under the first stiffening ring of the cylinder. During the continuing inspection of the storage area, a second 14-ton product cylinder was discovered breached. This cylinder had been stacked on the bottom row of the storage area in 1986. This breach was also located adjacent to a stiffening ring. This paper will discuss the contributing factors in the breaching of the cylinders, the immediate response, subsequent actions in support of the investigation, and corrective actions.

  20. A simplified physically-based breach model for a high concrete-faced rockfill dam: A case study

    OpenAIRE

    Qi-ming Zhong; Sheng-shui Chen; Zhao Deng

    2018-01-01

    A simplified physically-based model was developed to simulate the breaching process of the Gouhou concrete-faced rockfill dam (CFRD), which is the only breach case of a high CFRD in the world. Considering the dam height, a hydraulic method was chosen to simulate the initial scour position on the downstream slope, with the steepening of the downstream slope taken into account; a headcut erosion formula was adopted to simulate the backward erosion as well. The moment equilibrium method was util...

  1. Experimental investigation of fluvial dike breaching due to flow overtopping

    Science.gov (United States)

    El Kadi Abderrezzak, K.; Rifai, I.; Erpicum, S.; Archambeau, P.; Violeau, D.; Pirotton, M.; Dewals, B.

    2017-12-01

    breaching. These specific features need to be incorporated in flood risk analyses involving fluvial dike breach and failure. In addition, a well-documented, reliable data set, with a continuous high resolution monitoring of the 3D breach evolution under various flow conditions, has been gathered, which can be used for validating numerical models.

  2. Antecedents of Psychological Contract Breach: The Role of Job Demands, Job Resources, and Affect.

    Directory of Open Access Journals (Sweden)

    Tim Vantilborgh

    While it has been shown that psychological contract breach leads to detrimental outcomes, relatively little is known about factors leading to perceptions of breach. We examine if job demands and resources predict breach perceptions. We argue that perceiving high demands elicits negative affect, while perceiving high resources stimulates positive affect. Positive and negative affect, in turn, influence the likelihood that psychological contract breaches are perceived. We conducted two experience sampling studies to test our hypotheses: the first using daily surveys in a sample of volunteers, the second using weekly surveys in samples of volunteers and paid employees. Our results confirm that job demands and resources are associated with negative and positive affect respectively. Mediation analyses revealed that people who experienced high job resources were less likely to report psychological contract breach, because they experienced high levels of positive affect. The mediating role of negative affect was more complex, as it increased the likelihood to perceive psychological contract breach, but only in the short-term.

  3. Antecedents of Psychological Contract Breach: The Role of Job Demands, Job Resources, and Affect.

    Science.gov (United States)

    Vantilborgh, Tim; Bidee, Jemima; Pepermans, Roland; Griep, Yannick; Hofmans, Joeri

    2016-01-01

    While it has been shown that psychological contract breach leads to detrimental outcomes, relatively little is known about factors leading to perceptions of breach. We examine if job demands and resources predict breach perceptions. We argue that perceiving high demands elicits negative affect, while perceiving high resources stimulates positive affect. Positive and negative affect, in turn, influence the likelihood that psychological contract breaches are perceived. We conducted two experience sampling studies to test our hypotheses: the first using daily surveys in a sample of volunteers, the second using weekly surveys in samples of volunteers and paid employees. Our results confirm that job demands and resources are associated with negative and positive affect respectively. Mediation analyses revealed that people who experienced high job resources were less likely to report psychological contract breach, because they experienced high levels of positive affect. The mediating role of negative affect was more complex, as it increased the likelihood to perceive psychological contract breach, but only in the short-term.

  4. 13 CFR 115.69 - Imminent Breach.

    Science.gov (United States)

    2010-01-01

    ... an Imminent Breach of the terms of a Contract covered by an SBA guaranteed bond. The PSB Surety does... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Imminent Breach. 115.69 Section... Surety Bond (PSB) Guarantees § 115.69 Imminent Breach. (a) No prior approval requirement. SBA will...

  5. Privacy Breach Analysis in Social Networks

    Science.gov (United States)

    Nagle, Frank

    This chapter addresses various aspects of analyzing privacy breaches in social networks. We first review literature that defines three types of privacy breaches in social networks: interactive, active, and passive. We then survey the various network anonymization schemes that have been constructed to address these privacy breaches. After exploring these breaches and anonymization schemes, we evaluate a measure for determining the level of anonymity inherent in a network graph based on its topological structure. Finally, we close by emphasizing the difficulty of anonymizing social network data while maintaining usability for research purposes and offering areas for future work.

  6. Preventing a data breach from becoming a disaster.

    Science.gov (United States)

    Goldberg, Ed

    2013-01-01

    Organisations have traditionally dealt with data breaches by investing in protective measures without a great deal of attention to mitigation of breach consequences and response. Conversely, business continuity (BC) planning has traditionally focused on mitigating disasters, not on preventing them. From a BC planning perspective, organisations need to assume that a data breach is inevitable and plan accordingly. The spate of data breaches in these past few years hit many organisations that were well protected. Those that suffered disastrous consequences as a result of a data breach lacked effective mitigation and response, not protection. The complexity and speed of an effective data breach response require that detailed planning takes place in advance of a breach.

  7. Seasonal breaching of coastal barriers

    NARCIS (Netherlands)

    Tuan, Thieu Quang

    2007-01-01

    Natural or unintended breaching can be catastrophic, causing loss of human lives and damage to infrastructures, buildings and natural habitats. Quantitative understanding of coastal barrier breaching is therefore of great importance to vulnerability assessment of protection works as well as to

  8. Reactions to psychological contract breaches and organizational citizenship behaviours: An experimental manipulation of severity.

    Science.gov (United States)

    Atkinson, Theresa P; Matthews, Russell A; Henderson, Alexandra A; Spitzmueller, Christiane

    2018-01-30

    Grounded in affective events theory, we investigated the effects of experimentally manipulated psychological contract breaches on participants' feelings of violation, subsequent perceptions of psychological contract strength, and organizational citizenship behaviours in a sample of working adults. Results support previous findings that pre-existing relational psychological contract strength interacts with severity of unmet promises or expectations. Specifically, individuals with high relational contracts who experience low severity of unmet promises/expectations have the lowest breach perceptions, whereas individuals with high relational contracts who experience more severe levels of unmet promises/expectations experience the highest level of breach perceptions. Results also support the concept of a breach spiral in that prior perceptions of breach led to an increased likelihood of subsequent perceptions of breach following the experimental manipulation. Furthermore, consistent with affective events theory, results support the argument that a psychological contract breach's effect on specific organizational citizenship behaviours is mediated by feelings of violation and the reassessment of relational contracts. These effects were present even after controlling for the direct effects of the manipulated severity of unmet promises/expectations. Copyright © 2018 John Wiley & Sons, Ltd.

  9. User Compensation as a Data Breach Recovery Action: An Investigation of the Sony PlayStation Network Breach.

    OpenAIRE

    Venkatesh, Viswanath

    2017-01-01

    Drawing on expectation confirmation research, we develop hypotheses regarding the effect of compensation on key customer outcomes following a major data breach and consequent service recovery effort. Data were collected in a longitudinal field study of Sony customers during their data breach in 2011. A total of 144 customers participated in the two-phase data collection that began when the breach was announced and concluded after reparations were made. Using polynomial modeling an...

  10. Data breaches. Final rule.

    Science.gov (United States)

    2008-04-11

    This document adopts, without change, the interim final rule that was published in the Federal Register on June 22, 2007, addressing data breaches of sensitive personal information that is processed or maintained by the Department of Veterans Affairs (VA). This final rule implements certain provisions of the Veterans Benefits, Health Care, and Information Technology Act of 2006. The regulations prescribe the mechanisms for taking action in response to a data breach of sensitive personal information.

  11. Douglas County Dam Breach Inundation Areas

    Data.gov (United States)

    Kansas Data Access and Support Center — Dam breach analysis provides a prediction of the extent and timing of flooding from a catastrophic breach of the dams. These results are sufficient for developing...

  12. Outcomes associated with breach and fulfillment of the psychological contract of safety.

    Science.gov (United States)

    Walker, Arlene

    2013-12-01

    The study investigated the outcomes associated with breach and fulfillment of the psychological contract of safety. The psychological contract of safety is defined as the beliefs of individuals about reciprocal employer and employee safety obligations inferred from implicit or explicit promises. When employees perceive that safety obligations promised by the employer have not been met, a breach of the psychological contract occurs, termed employer breach of obligations. The extent to which employees fulfill their safety obligations to the employer is termed employee fulfillment of obligations. Structural equation modeling was used to test a model of safety that investigated the positive and negative outcomes associated with breach and fulfillment of the psychological contract of safety. Participants were 424 health care workers recruited from two hospitals in the State of Victoria, Australia. Following slight modification of the hypothesized model, a good fitting model resulted. Being injured in the workplace was found to lower perceptions of trust in the employer and increase perceptions of employer breach of safety obligations. Trust in the employer significantly influenced perceived employer breach of safety obligations such that lowered trust resulted in higher perceptions of breach. Perceptions of employer breach significantly impacted employee fulfillment of safety obligations with high perceptions of breach resulting in low employee fulfillment of obligations. Trust and perceptions of breach significantly influenced safety attitudes, but not safety behavior. Fulfillment of employee safety obligations significantly impacted safety behavior, but not safety attitudes. Implications of these findings for safety and psychological contract research are explored. A positive emphasis on social exchange relationships in organizations will have positive outcomes for safety climate and safety behavior. © 2013.

  13. Data breach locations, types, and associated characteristics among US hospitals.

    Science.gov (United States)

    Gabriel, Meghan Hufstader; Noblin, Alice; Rutherford, Ashley; Walden, Amanda; Cortelyou-Ward, Kendall

    2018-02-01

    The objectives of this study were to describe the locations in hospitals where data are breached, the types of breaches that occur most often at hospitals, and hospital characteristics, including health information technology (IT) sophistication and biometric security capabilities, that may be predicting factors of large data breaches that affect 500 or more patients. The Office of Civil Rights breach data from healthcare providers regarding breaches that affected 500 or more individuals from 2009 to 2016 were linked with hospital characteristics from the Health Information Management Systems Society and the American Hospital Association Health IT Supplement databases. Descriptive statistics were used to characterize hospitals with and without breaches, data breach type, and location/mode of data breaches in hospitals. Multivariate logistic regression analysis explored hospital characteristics that were predicting factors of a data breach affecting at least 500 patients, including area characteristics, region, health system membership, size, type, biometric security use, health IT sophistication, and ownership. Of all types of healthcare providers, hospitals accounted for approximately one-third of all data breaches and hospital breaches affected the largest number of individuals. Paper and films were the most frequent location of breached data, occurring in 65 hospitals during the study period, whereas network servers were the least common location but their breaches affected the most patients overall. Adjusted multivariate results showed significant associations among data breach occurrences and some hospital characteristics, including type and size, but not others, including health IT sophistication or biometric use for security. Hospitals should conduct routine audits to allow them to see their vulnerabilities before a breach occurs. Additionally, information security systems should be implemented concurrently with health information technologies. Improving

  14. Reasons for Picture Archiving and Communication System (PACS) data security breaches: Intentional versus non-intentional breaches

    Directory of Open Access Journals (Sweden)

    Tintswalo B. Mahlaola

    2016-10-01

    Objective: The purpose of this article is to explore the nature of and reasons for confidentiality breaches by PACS users in a South African context. Methods: A closed-ended questionnaire was used to collect quantitative data from 115 health professionals employed in a private hospital setting, including its radiology department and a second independent radiology department. The questionnaire sought to explore the attitudes of participants towards confidentiality breaches and the reasons for such behaviour. Results: Breach incidences were expressed as percentage compliance and classified according to the nature and reasons provided by Sarkar's breach classification. Cross tabulations indicated a statistically significant difference (p < 0.00) between the expected and observed confidentiality practices of participants, and also regarding the adequacy of training, system knowledge and policy awareness. Conclusion: Our study supports previous findings that, in the absence of guidelines, most security breaches were non-intentional acts committed due to ignorance. Of concern are incidents in which sensitive information was intentionally shared via social media.

  15. SWOT analysis of breach models for common dike failure mechanisms

    NARCIS (Netherlands)

    Peeters, P.; Van Hoestenberghe, T.; Vincke, L.; Visser, P.J.

    2011-01-01

    The use of breach models includes two tasks: predicting breach characteristics and estimating flow through the breach. Strengths and weaknesses as well as opportunities and threats of different simplified and detailed physically-based breach models are listed following theoretical and practical

  16. Federal Information Security and Data Breach Notification Laws

    Science.gov (United States)

    2009-01-29

    The following report describes information security and data breach notification requirements included in the Privacy Act, the Federal Information...information for unauthorized purposes. Data breach notification laws typically require covered entities to implement a breach notification policy, and...Feinstein), S. 495 (Leahy), and S. 1178 (Inouye)--were reported favorably out of Senate committees. Those bills include information security and data

  17. Teaching Case: Security Breach at Target

    Science.gov (United States)

    Plachkinova, Miloslava; Maurer, Chris

    2018-01-01

    This case study follows the security breach that affected Target at the end of 2013 and resulted in the loss of financial data for over 70 million customers. The case provides an overview of the company and describes the reasons that led to one of the biggest security breaches in history. It offers a discussion on Target's vendor management…

  18. Data security breaches and privacy in Europe

    CERN Document Server

    Wong, Rebecca

    2013-01-01

    Data Security Breaches and Privacy in Europe aims to consider data protection and cybersecurity issues; more specifically, it aims to provide a fruitful discussion on data security breaches. A detailed analysis of the European Data Protection framework will be examined. In particular, the Data Protection Directive 95/46/EC, the Directive on Privacy and Electronic Communications and the proposed changes under the Data Protection Regulation (data breach notifications) and its implications are considered. This is followed by an examination of the Directive on Attacks against information systems a

  19. The characterization and monitoring of metallic fuel breaches in EBR-2

    International Nuclear Information System (INIS)

    Pahl, R.G.; Batte, G.L.; Mikaili, R.; Lambert, J.D.B.; Hofman, G.L.

    1991-01-01

    This paper discusses the characterization and monitoring of metallic fuel breaches, which is now a significant part of the Integral Fast Reactor fuel testing program at Argonne National Laboratory. Irradiation experience with failed metallic fuel now includes natural breaches in the plenum and fuel column regions in lead "endurance" tests as well as fuel column breaches in artificially-defected fuel which have operated for months in the run-beyond-cladding breach (RBCB) mode. Analyses of the fission gas (FG) release-to-birth (R/B) ratios of selected historical breaches have been completed and have proven to be very useful in differentiating between plenum and fuel column breaches.

  20. 25 CFR 163.42 - Obligated service and breach of contract.

    Science.gov (United States)

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false Obligated service and breach of contract. 163.42 Section... breach of contract. (a) Obligated service. (1) Individuals completing forestry education programs with an... request for waiver. (b) Breach of contract. Any individual who has participated in and accepted financial...

  1. Breach to Nowhere

    Science.gov (United States)

    Schaffhauser, Dian

    2009-01-01

    Will that data breach be the end of a chief security officer (CSO)? Managing information security in higher education requires more than just technical expertise, especially when the heat is cranked up. This article takes a look at how two CSOs deal with hack attacks at their universities. When Purdue University Chief Information Security Officer…

  2. A SWOT analysis of hydrodynamic models with respect to simulating breaching

    NARCIS (Netherlands)

    van Damme, M.; Visser, P.J.

    2015-01-01

    Deriving the bed shear stresses from hydrodynamic models in breach models is challenging due to the continuously changing hydraulic head over the breach, in combination with horizontal and vertical flow contractions and the continuously, rapidly changing breach geometry. Three stages can be distinguished

  3. Conscientiousness and reactions to psychological contract breach: a longitudinal field study.

    Science.gov (United States)

    Orvis, Karin A; Dudley, Nicole M; Cortina, Jose M

    2008-09-01

    The authors examined the role of employee conscientiousness as a moderator of the relationships between psychological contract breach and employee behavioral and attitudinal reactions to the breach. They collected data from 106 newly hired employees within the 1st month of employment (Time 1), 3 months later (Time 2), and 8 months after Time 1 (Time 3) to observe the progression through contract development, breach, and reaction. Results suggest that conscientiousness is a significant moderator for 4 of the 5 contract breach-employee reaction relationships examined (turnover intentions, organizational loyalty, job satisfaction, and 1 of 2 facets of job performance). Specifically, employees who were lower in conscientiousness had more negative reactions to perceived breach with respect to turnover intentions, organizational loyalty, and job satisfaction. In contrast, employees who were higher in conscientiousness reduced their job performance to a greater degree in response to contract breach. Future research directions are discussed.

  4. Flood hydrology and dam-breach hydraulic analyses of four reservoirs in the Black Hills, South Dakota

    Science.gov (United States)

    Hoogestraat, Galen K.

    2011-01-01

    Extensive information about the construction of dams or potential downstream hazards in the event of a dam breach is not available for many small reservoirs within the Black Hills National Forest. In 2009, the U.S. Forest Service identified the need for reconnaissance-level dam-breach assessments for four of these reservoirs within the Black Hills National Forest (Iron Creek, Horsethief, Lakota, and Mitchell Lakes) with the potential to flood downstream structures. Flood hydrology and dam-breach hydraulic analyses for the four selected reservoirs were conducted by the U.S. Geological Survey in cooperation with the U.S. Forest Service to estimate the areal extent of downstream inundation. Three high-flow breach scenarios were considered for cases when the dam is in place (overtopped) and when a dam break (failure) occurs: the 100-year recurrence 24-hour precipitation, the 500-year recurrence peak flow, and the probable maximum precipitation. Inundation maps were developed that show the estimated extent of downstream floodwaters from the simulated scenarios. Simulation results were used to determine the hazard classification of a dam break (high, significant, or low), based primarily on the potential for loss of life or property damage resulting from downstream inundation caused by the flood surge. The inflow design floods resulting from the two simulated storm events (100-year 24-hour and probable maximum precipitation) were determined using the U.S. Army Corps of Engineers Hydrologic Engineering Center Hydrologic Modeling System (HEC-HMS). The inflow design flood for the 500-year recurrence peak flow was determined by using regional regression equations developed for streamflow-gaging stations with similar watershed characteristics. The step-backwater hydraulic analysis model, Hydrologic Engineering Center's River Analysis System (HEC-RAS), was used to determine water-surface profiles of in-place and dam-break scenarios for the three inflow design floods that were

  5. A laser profilometry technique for monitoring fluvial dike breaching in laboratory experiments

    Science.gov (United States)

    Dewals, Benjamin; Rifai, Ismail; Erpicum, Sébastien; Archambeau, Pierre; Violeau, Damien; Pirotton, Michel; El kadi Abderrezzak, Kamal

    2017-04-01

    A challenging aspect of experimental modelling of fluvial dike breaching is the continuous monitoring of the transient breach geometry. In dam breaching cases induced by flow overtopping over the whole breach crest (plane erosion), a side view through a glass wall is sufficient to monitor the breach formation. This approach can be extended to 3D dam breach tests (spatial erosion) if the glass wall is located along the breach centreline. In contrast, a side view cannot be used to monitor fluvial dike breaching, because the breach is not symmetric in this case. We present a non-intrusive, high resolution technique to record the breach development in experimental models of fluvial dikes by means of laser profilometry (Rifai et al. 2016). Most methods used for monitoring dam and dike breaching involve the projection of a pattern (fringes, grid) on the dam or dike body and the analysis of its deformation in images recorded during the breaching (e.g., Pickert et al. 2011, Frank and Hager 2014). A major limitation of these methods stems from reflection on the water surface, particularly in the vicinity of the breach where the free surface is irregular and rippled. This issue was addressed by Spinewine et al. (2004), who used a single laser sheet so that reflections on the water surface were strongly limited and did not hamper the accurate processing of each image. We have developed a similar laser profilometry technique tailored for laboratory experiments on fluvial dike breaching. The setup is simple and relatively low cost. It consists of a digital video camera (resolution of 1920 × 1080 pixels at 60 frames per second) and a swiping 30 mW red diode laser that enables the projection of a laser sheet over the dike body. The 2D image coordinates of each deformed laser profile incident on the dike are transformed into 3D object coordinates using the Direct Linear Transformation (DLT) algorithm. All 3D object coordinates computed over a swiping cycle of the
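
    The abstract above maps 2D image coordinates of the laser profile to 3D object coordinates with the Direct Linear Transformation (DLT). The sketch below shows only the standard DLT calibration step, estimating the eleven DLT parameters from known control points by linear least squares using synthetic, made-up data; the rest of the record's pipeline (laser-plane intersection, swiping geometry) is not reproduced.

```python
import numpy as np

def dlt_calibrate(xyz: np.ndarray, uv: np.ndarray) -> np.ndarray:
    """Solve for the 11 DLT parameters L1..L11 (needs >= 6 control points)."""
    rows, rhs = [], []
    for (X, Y, Z), (u, v) in zip(xyz, uv):
        # u = (L1 X + L2 Y + L3 Z + L4) / (L9 X + L10 Y + L11 Z + 1), and similarly for v.
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z])
        rhs.append(u)
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z])
        rhs.append(v)
    L, *_ = np.linalg.lstsq(np.asarray(rows), np.asarray(rhs), rcond=None)
    return L

def dlt_project(L: np.ndarray, xyz: np.ndarray) -> np.ndarray:
    """Apply the DLT mapping: object coordinates -> image coordinates."""
    X, Y, Z = xyz.T
    d = L[8] * X + L[9] * Y + L[10] * Z + 1.0
    u = (L[0] * X + L[1] * Y + L[2] * Z + L[3]) / d
    v = (L[4] * X + L[5] * Y + L[6] * Z + L[7]) / d
    return np.column_stack([u, v])

if __name__ == "__main__":
    # Synthetic "true" parameters and control points, purely for the demo.
    rng = np.random.default_rng(1)
    L_true = np.array([900, 5, -40, 960, -3, -880, 60, 540, 1e-4, 2e-4, -5e-4])
    pts = rng.uniform([0, 0, 0], [2.0, 1.0, 0.5], size=(10, 3))   # control points (m)
    img = dlt_project(L_true, pts)                                # their image coordinates
    L_est = dlt_calibrate(pts, img)
    print("max parameter error:", np.max(np.abs(L_est - L_true)))
```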

  6. Incomplete Information, Renegotiation, and Breach of Contract

    OpenAIRE

    Jihong Lee

    2005-01-01

    Once a contract has been agreed by two agents, the problem of renegotiating breach under two-sided asymmetric information on the agents' outside options is equivalent to the problem of bilateral trade with uncertain gains. Thus, the theorem of Myerson and Satterthwaite (1983) implies the impossibility of efficient renegotiation. We also show that, assuming no renegotiation, the optimal breach mechanism in this setting corresponds to the expectation damage rule.

  7. Influence of Personality on Perception of Psychological Contract Breach

    Directory of Open Access Journals (Sweden)

    Hassan Jafri

    2014-10-01

    The present research aimed to investigate the influence of personality (the Five-Factor Model) on psychological contract breach. Using a random sampling procedure, data were collected from 90 faculty members of colleges of the Royal University of Bhutan. The personality scales of John, Naumann, and Soto (2008) and Robinson and Morrison's (2000) Psychological Contract Breach scale were used in this study. Correlation and regression analyses were carried out to analyze the obtained data. Results revealed that the Extraversion and Neuroticism dimensions of the personality model were positively associated with the perception of breach. Employees who are by nature Agreeable and Conscientious are less likely to perceive breach in their psychological contract. Organizations should consider personality when recruiting employees: if employees are hired with certain personality traits, they may focus on their performance and organizational growth.

  8. Inversion Method for Early Detection of ARES-1 Case Breach Failure

    Science.gov (United States)

    Mackey, Ryan M.; Kulikov, Igor K.; Bajwa, Anupa; Berg, Peter; Smelyanskiy, Vadim

    2010-01-01

    A document describes research into the problem of detecting case breach formation at an early stage of a rocket flight. An inversion algorithm for case breach allocation is proposed and analyzed. It is shown how the case breach can be allocated at an early stage of its development by using the rocket sensor data and the output data from the control block of the rocket navigation system. The results are simulated with MATLAB/Simulink software. The efficiency of an inversion algorithm for case breach location is discussed. The research was devoted to the analysis of the ARES-1 flight during the first 120 seconds after launch and early prediction of case breach failure. During this time, the rocket is propelled by its first-stage Solid Rocket Booster (SRB). If a breach appears in the SRB case, the gases escaping through it will produce a side thrust directed perpendicular to the rocket axis. The side thrust creates a torque influencing the rocket attitude. The ARES-1 control system will compensate for the side thrust until it reaches some critical value, after which the flight will be uncontrollable. The objective of this work was to obtain the start time of case breach development and its location using the rocket inertial navigation sensors and GNC data. The algorithm was effective for the detection and location of a breach in an SRB field joint at an early stage of its development.

  9. Do Data Breach Disclosure Laws Reduce Identity Theft?

    Science.gov (United States)

    Romanosky, Sasha; Telang, Rahul; Acquisti, Alessandro

    2011-01-01

    In the United States, identity theft resulted in corporate and consumer losses of $56 billion in 2005, with up to 35 percent of known identity thefts caused by corporate data breaches. Many states have responded by adopting data breach disclosure laws that require firms to notify consumers if their personal information has been lost or…

  10. Evaluating the effects of dam breach methodologies on Consequence Estimation through Sensitivity Analysis

    Science.gov (United States)

    Kalyanapu, A. J.; Thames, B. A.

    2013-12-01

    Dam breach modeling often includes the application of models that are sophisticated yet computationally intensive in order to compute flood propagation at high temporal and spatial resolutions. This results in a significant need for computational capacity that requires the development of newer flood models using multi-processor and graphics processing techniques. Recently, a comprehensive benchmark exercise, the 12th Benchmark Workshop on Numerical Analysis of Dams, was organized by the International Commission on Large Dams (ICOLD) to evaluate the performance of the various tools used for dam break risk assessment. The ICOLD workshop is focused on estimating the consequences of failure of a hypothetical dam near a hypothetical populated area with complex demographics and economic activity. The current study uses this hypothetical case study and focuses on evaluating the effects of dam breach methodologies on consequence estimation and analysis. The current study uses the ICOLD hypothetical data, including the topography, dam geometric and construction information, and land use/land cover data, along with socio-economic and demographic data. The objective of this study is to evaluate the impacts of using four different dam breach methods on the consequence estimates used in the risk assessments. The four methodologies used are: i) Froehlich (1995), ii) MacDonald and Langridge-Monopolis 1984 (MLM), iii) Von Thun and Gillette 1990 (VTG), and iv) Froehlich (2008). To achieve this objective, three different modeling components were used. First, using HEC-RAS v.4.1, dam breach discharge hydrographs are developed. These hydrographs are then provided as flow inputs into a two-dimensional flood model named Flood2D-GPU, which leverages the computer's graphics card for much improved computational performance. Lastly, outputs from Flood2D-GPU, including inundated areas, depth grids, velocity grids, and flood wave arrival time grids, are input into HEC-FIA, which provides the

  11. Morphologic evolution of the wilderness area breach at Fire Island, New York—2012–15

    Science.gov (United States)

    Hapke, Cheryl J.; Nelson, Timothy R.; Henderson, Rachel E.; Brenner, Owen T.; Miselis, Jennifer L.

    2017-09-18

    IntroductionHurricane Sandy, which made landfall on October 29, 2012, near Atlantic City, New Jersey, had a significant impact on the coastal system along the south shore of Long Island, New York. A record significant wave height of 9.6 meters (m) was measured at wave buoy 44025, approximately 48 kilometers offshore of Fire Island, New York. Surge and runup during the storm resulted in extensive beach and dune erosion and breaching of the Fire Island barrier island system at two locations, including a breach that formed within the Otis Pike Fire Island High Dune Wilderness area on the eastern side of Fire Island.The U.S. Geological Survey (USGS) has a long history of conducting morphologic change and processes research at Fire Island. One of the primary objectives of the current research effort is to understand the morphologic evolution of the barrier system on a variety of time scales (from storm scale to decade(s) to century). A number of studies that support the project objectives have been published. Prior to Hurricane Sandy, however, little information was available on specific storm-driven change in this region. The USGS received Hurricane Sandy supplemental funding (project GS2–2B: Linking Coastal Processes and Vulnerability, Fire Island, New York, Regional Study) to enhance existing research efforts at Fire Island. The existing research was greatly expanded to include inner continental shelf mapping and investigations of processes of inner shelf sediment transport; beach and dune response and recovery; and observation, analysis, and modeling of the newly formed breach in the Otis Pike High Dune Wilderness area, herein referred to as the wilderness breach. The breach formed at the site of Old Inlet, which was open from 1763 to 1825. The location of the initial island breaching does not directly correspond with topographic lows of the dunes, but instead the breach formed in the location of a cross-island boardwalk that was destroyed during Hurricane Sandy

  12. Values underlying perceptions of breach of the psychological contract

    Directory of Open Access Journals (Sweden)

    Leon Botha

    2010-10-01

    Research purpose: The study identifies the most important breaches and investigates which values underlie employee perceptions of breach of the psychological contract. It also addresses values that lead to employees interpreting incidents as breaches. Motivation for the study: The study draws on the observation that employees make inconsequential contributions to the terms of many formal employment contracts, which may imply that such contracts cannot be viewed as documents between equals. Research design, approach and method: The study identifies the most prominent breaches of the psychological contract and the values underlying the perceptions that violations have occurred. Main findings: The data revealed lack of promotion, poor interpersonal relations between colleagues and bad treatment by seniors as three main breaches of the contract, and social recognition, world of peace and sense of accomplishment as three dominant values that underlie perceptions of contract violation. Practical/managerial implications: The competent and intelligent manner in which lack of promotion is handled and communicated to employees is vital because it has implications for their willingness to contribute, their career prospects and their intention to stay in the organisation. Contribution/value-add: This research can serve as the basis for the development of survey or research instruments that are appropriate and relevant to the population.

  13. The Significance of Mandatory Data Breach Warnings to Identity Crime

    OpenAIRE

    Eric Holm; Geraldine Mackenzie

    2015-01-01

    The relationship between data breaches and identity crime has been scarcely explored in current literature. However, there is an important relationship between the misuse of personal identification information and identity crime as the former is in many respects the catalyst for the latter. Data breaches are one of the ways in which this personal identification information is obtained by identity criminals, and thereby any response to data breaches is likely to impact the incidence of identit...

  14. Breach Risk Magnitude: A Quantitative Measure of Database Security.

    Science.gov (United States)

    Yasnoff, William A

    2016-01-01

    A quantitative methodology is described that provides objective evaluation of the potential for health record system breaches. It assumes that breach risk increases with the number of potential records that could be exposed, while it decreases when more authentication steps are required for access. The breach risk magnitude (BRM) is the maximum value for any system user of the common logarithm of the number of accessible database records divided by the number of authentication steps needed to achieve such access. For a one million record relational database, the BRM varies from 5.52 to 6 depending on authentication protocols. For an alternative data architecture designed specifically to increase security by separately storing and encrypting each patient record, the BRM ranges from 1.3 to 2.6. While the BRM only provides a limited quantitative assessment of breach risk, it may be useful to objectively evaluate the security implications of alternative database organization approaches.
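
    Read literally, and consistent with the 5.52 to 6 range quoted above for a one-million-record database, the per-user breach risk magnitude can be computed as the common logarithm of the number of accessible records divided by the number of authentication steps, with the system value taken as the maximum over users. The snippet below is a minimal sketch of that reading, not code from the paper; the user list is invented.

```python
import math

def user_brm(records_accessible, auth_steps):
    """BRM for a single user: log10(accessible records / authentication steps)."""
    return math.log10(records_accessible / auth_steps)

def system_brm(users):
    """System BRM as described in the abstract: the maximum over all users.
    `users` is an iterable of (records_accessible, auth_steps) pairs."""
    return max(user_brm(r, s) for r, s in users)

# One-million-record relational database: three authentication steps give
# ~5.52, a single step gives 6.0, matching the range quoted in the abstract.
print(system_brm([(1_000_000, 3), (1_000_000, 1)]))
```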

  15. 2003 International High-Level Radioactive Waste Management Conference Breached Drip Shield Test and Validation of a TSPA Sub-Model

    International Nuclear Information System (INIS)

    Walton, Z.P.; Kam, J.T.

    2002-01-01

    The Engineered Barrier System (EBS) represents the system of human engineered barriers in the isolation of high-level radioactive waste in the proposed repository at Yucca Mountain. It is designed to complement and enhance the natural barriers to isolate and prevent the transport of radionuclides into the surrounding environment. The transport mechanism most frequently postulated for radionuclides is liquid water flux that has penetrated the EBS through corrosion breaches in the drip shield and waste packages (WP). A water flux-splitting model is used to predict flow through WP and drip shield breaches and is documented in the ''EBS Radionuclide Transport Abstraction''. A future revision of the ''EBS Radionuclide Transport Abstraction'' will be one component of the total system performance assessment--license application (TSPA-LA) for the Yucca Mountain repository. The flux-splitting model is conservative based on the following assumptions: (1) Drip impact occurs without a loss of water mass. (2) Dripping flux falls exactly at the crown of the drip shield as opposed to different locations on the curved surface, which will affect splashing and flow patterns. (3) The flux passing through a drip shield patch is proportional to the ratio of the length of the penetration in the axial direction to the total axial length of the drip shield. In this assumption, all fluid that drips and flows from the drip shield crown toward a penetration will be collected if the axial locations of the source and patch coincide. (4) The potential for evaporation is ignored. Because of these conservatisms, the current version of the flux-splitting model is incapable of accounting for water that has been splashed from the impact location, the deviation of water paths (rivulets) from the axis of impact, and water loss due to evaporation. This paper will present the results of a series of breached drip shield tests used to collect empirical data for the initial validation and further
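
    Assumption (3) above amounts to a simple proportionality that is easy to state in code. The sketch below only illustrates that single assumption; it is not the EBS Radionuclide Transport Abstraction itself, and the flux value and lengths are invented.

```python
def flux_through_patch(total_drip_flux, patch_axial_length, shield_axial_length):
    """Assumption (3): the flux collected by a breach patch is proportional to
    the patch's axial length relative to the drip shield's total axial length
    (no splashing losses, rivulet deviation, or evaporation)."""
    return total_drip_flux * (patch_axial_length / shield_axial_length)

# 10 L/yr dripping onto the crown, a 0.3 m patch on a 5 m long drip shield
print(flux_through_patch(10.0, 0.3, 5.0))   # -> 0.6 L/yr reaches the patch
```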

  16. An Analysis of Data Breach Notifications as Negative News

    Science.gov (United States)

    Veltsos, Jennifer R.

    2012-01-01

    Forty-six states require organizations to notify users when personally identifiable information has been exposed or when the organization's data security measures have been breached. This article describes a qualitative document analysis of 13 data breach notification templates from state and federal agencies. The results confirm much of the…

  17. Just in Time Research: Data Breaches in Higher Education

    Science.gov (United States)

    Grama, Joanna

    2014-01-01

    This "Just in Time" research is in response to recent discussions on the EDUCAUSE Higher Education Information Security Council (HEISC) discussion list about data breaches in higher education. Using data from the Privacy Rights Clearinghouse, this research analyzes data breaches attributed to higher education. The results from this…

  18. Uncertainties and constraints on breaching and their implications for flood loss estimation.

    Science.gov (United States)

    Muir Wood, Robert; Bateman, William

    2005-06-15

    Around the coasts of the southern North Sea, flood risk is mediated everywhere by the performance of natural and man-made flood defences. Under the conditions of extreme surge with tide water levels, the performance of the defences determines the extent of inland flooding. Sensitivity tests reveal the enormous increase in the volume of water that can pass through a defence once breaching is initiated, with a 1m reduction in sill elevation doubling the loss. Empirical observations of defence performance in major storm surges around the North Sea reveal some of the principal controls on breaching. For the same defence type, the maximum size and depth of a breach is a function of the integral of the hydraulic gradient across the defence, which is in turn determined by the elevation of the floodplain and the degree to which water can continue to flow inland away from the breach. The most extensive and lowest floodplains thereby "generate" the largest breaches. For surges that approach the crest height, the weaker the protection of the defence, the greater the number of breaches. Defence reinforcement reduces both the number and size of the breaches.
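
    The sensitivity described above (roughly a doubling of inflow for a 1 m reduction in sill elevation) can be illustrated with a toy calculation that integrates a broad-crested-weir flow over a synthetic surge hydrograph for two sill elevations. This is not the authors' model: the weir coefficient, surge shape, duration, and breach width are all assumed for illustration.

```python
import numpy as np

def overflow_volume(z_sill, surge_peak=3.0, duration_h=6.0, breach_width=50.0):
    """Volume (m^3) passing a breach of fixed width during a half-sine surge,
    using a broad-crested-weir approximation Q ~ 1.7 * b * H**1.5 (SI)."""
    t = np.linspace(0.0, duration_h * 3600.0, 1000)        # time (s)
    surge = surge_peak * np.sin(np.pi * t / t[-1])          # water level (m)
    head = np.clip(surge - z_sill, 0.0, None)               # head over sill (m)
    q = 1.7 * breach_width * head**1.5                      # discharge (m^3/s)
    return float(np.sum(q) * (t[1] - t[0]))                 # crude integration

for z_sill in (2.0, 1.0):   # lowering the sill elevation by 1 m
    print(f"sill at {z_sill} m: {overflow_volume(z_sill):.2e} m^3")
```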

  19. Lattice Boltzmann Study on Seawall-Break Flows under the Influence of Breach and Buildings

    Science.gov (United States)

    Mei, Qiu-Ying; Zhang, Wen-Huan; Wang, Yi-Hang; Chen, Wen-Wen

    2017-10-01

    In the process of storm surge, the seawater often overflows and even destroys the seawall. The buildings near the shore are usually inundated by the seawater through the breach. However, at present, there is little study focusing on the effects of buildings and breach on the seawall-break flows. In this paper, the lattice Boltzmann (LB) model with nine velocities in two dimensions (D2Q9) for the shallow water equations is adopted to simulate the seawall-break flows. The flow patterns and water depth distributions for the seawall-break flows under various densities, layouts and shapes of buildings and different breach discharges, sizes and locations are investigated. It is found that when buildings with a high enough density are perpendicular to the main flow direction, an obvious backwater phenomenon appears near buildings while this phenomenon does not occur when buildings with the same density are parallel to the main flow direction. Moreover, it is observed that the occurrence of backwater phenomenon is independent of the building shape. As to the effects of breach on the seawall-break flows, it is found that only when the breach discharge is large enough or the breach size is small enough, the effects of asymmetric distribution of buildings on the seawall-break flows become important. The breach location only changes the flow pattern in the upstream area of the first building that seawater meets, but has little impact on the global water depth distribution. Supported by the National Natural Science Foundation of China under Grant No. 11502124, the Natural Science Foundation of Zhejiang Province under Grant No. LQ16A020001, the Scientific Research Fund of Zhejiang Provincial Education Department under Grant No. Y201533808, the Natural Science Foundation of Ningbo under Grant No. 2016A610075, and is sponsored by K.C. Wong Magna Fund in Ningbo University.
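
    For readers unfamiliar with the method, the skeleton below shows the stream-and-collide structure of a single-relaxation-time (BGK) D2Q9 lattice Boltzmann update. It uses the standard incompressible-flow equilibrium rather than the shallow-water equilibrium employed in the paper, so it only illustrates the numerical machinery; the grid size, relaxation time, and initial condition are arbitrary.

```python
import numpy as np

# D2Q9 lattice: nine discrete velocities and their weights.
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9] * 4 + [1/36] * 4)

def equilibrium(rho, ux, uy):
    """Classical D2Q9 BGK equilibrium; the shallow-water variant used in the
    paper replaces this with an equilibrium written in the water depth h."""
    cu = 3.0 * (c[:, 0, None, None] * ux + c[:, 1, None, None] * uy)
    usq = 1.5 * (ux**2 + uy**2)
    return w[:, None, None] * rho * (1.0 + cu + 0.5 * cu**2 - usq)

def step(f, tau=0.8):
    """One collide-and-stream update on a periodic grid."""
    rho = f.sum(axis=0)
    ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
    uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
    f = f - (f - equilibrium(rho, ux, uy)) / tau            # BGK collision
    for i in range(9):                                      # streaming
        f[i] = np.roll(f[i], shift=(c[i, 0], c[i, 1]), axis=(0, 1))
    return f

nx, ny = 64, 64
rho0 = np.ones((nx, ny)); rho0[28:36, 28:36] += 0.01        # small density bump
f = equilibrium(rho0, np.zeros((nx, ny)), np.zeros((nx, ny)))
for _ in range(10):
    f = step(f)
print(f.sum())   # total mass is conserved by the update
```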

  20. Run-beyond-clad-breach oxide testing in EBR-2

    International Nuclear Information System (INIS)

    Lambert, J.D.B.; Bottcher, J.H.; Strain, R.V.; Gross, K.C.; Lee, M.J.; Webb, J.P.; Colburn, R.P.; Ukai, S.; Nomura, S.; Odo, T.; Shikakura, S.

    1990-01-01

    Fourteen tests sponsored by the US and Japan were used to study reliability of breached LMR oxide fuel pins during continued operation in EBR-II for a range of conditions and parameters. The fuel-sodium reaction product governed pin behavior. It extended primary breaches by swelling and promoted secondary failures, yet it inhibited loss of fuel and fission products and enhanced release of delayed neutrons used in monitoring breach condition. Fission gas and cesium, the main contaminants from failures, could be adequately controlled. This positive EBR-II experience suggested that limited operation with failed fuel may be feasible in commercial LMR's. 16 refs., 14 figs., 4 tabs

  1. Tortious Interference with Contract versus "Efficient" Breach: Theory and Empirical Evidence.

    OpenAIRE

    McChesney, Fred S

    1999-01-01

    Tortious interference is bothersome, normatively and positively, to scholars espousing the economic model of "efficient breach" of contract because it penalizes third-party inducements to breach. Scholars nonetheless find innovative second-best arguments to justify the coexistence of tortious interference with "efficient" breach. This article shows normatively why tortious interference would be part of a first-best legal system. Tortious interference provides property protection to contract r...

  2. 48 CFR 52.233-4 - Applicable Law for Breach of Contract Claim.

    Science.gov (United States)

    2010-10-01

    ... Provisions and Clauses 52.233-4 Applicable Law for Breach of Contract Claim. As prescribed in 33.215(b), insert the following clause: Applicable Law for Breach of Contract Claim (OCT 2004) United States law...

  3. Psychological contract breaches, organizational commitment, and innovation-related behaviors: a latent growth modeling approach.

    Science.gov (United States)

    Ng, Thomas W H; Feldman, Daniel C; Lam, Simon S K

    2010-07-01

    This study examined the relationships among psychological contract breaches, organizational commitment, and innovation-related behaviors (generating, spreading, implementing innovative ideas at work) over a 6-month period. Results indicate that the effects of psychological contract breaches on employees are not static. Specifically, perceptions of psychological contract breaches strengthened over time and were associated with decreased levels of affective commitment over time. Further, increased perceptions of psychological contract breaches were associated with decreases in innovation-related behaviors. We also found evidence that organizational commitment mediates the relationship between psychological contract breaches and innovation-related behaviors. These results highlight the importance of examining the nomological network of psychological contract breaches from a change perspective.

  4. Mass Transfer Model for a Breached Waste Package

    International Nuclear Information System (INIS)

    Hsu, C.; McClure, J.

    2004-01-01

    The degradation of waste packages, which are used for the disposal of spent nuclear fuel in the repository, can result in configurations that may increase the probability of criticality. A mass transfer model is developed for a breached waste package to account for the entrainment of insoluble particles. In combination with radionuclide decay, soluble advection, and colloidal transport, a complete mass balance of nuclides in the waste package becomes available. The entrainment equations are derived from dimensionless parameters such as the drag coefficient and Reynolds number, and are based on the assumption that insoluble particles are subject to buoyant, gravitational, and drag forces only. Particle size distributions are used to calculate the entrainment concentration, along with a geochemistry model abstraction to calculate the soluble concentration and a colloid model abstraction to calculate the colloid concentration and radionuclide sorption. Results are compared with the base-case geochemistry model, which considers only soluble advection loss
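
    The force balance mentioned above (buoyancy and gravity against a drag force expressed through a Reynolds-number-dependent drag coefficient) is the same balance used in classic terminal-velocity calculations. The sketch below iterates that balance with the Schiller-Naumann drag correlation; it is illustrative only, is not the entrainment formulation of the report, and the particle properties are assumed.

```python
import math

def settling_velocity(d_p, rho_p, rho_f=1000.0, mu=1.0e-3, g=9.81):
    """Terminal velocity (m/s) of a sphere from the balance of gravity,
    buoyancy, and drag, iterated because the drag coefficient C_d depends on
    the particle Reynolds number (Schiller-Naumann correlation)."""
    v = 1.0e-4                                    # initial guess (m/s)
    for _ in range(100):
        re = max(rho_f * v * d_p / mu, 1.0e-12)   # particle Reynolds number
        cd = 24.0 / re * (1.0 + 0.15 * re**0.687) if re < 1000.0 else 0.44
        v_new = math.sqrt(4.0 * g * d_p * (rho_p - rho_f) / (3.0 * cd * rho_f))
        if abs(v_new - v) < 1.0e-10:
            break
        v = v_new
    return v

# Hypothetical 10-micron particle with a UO2-like density settling in water
print(f"{settling_velocity(10.0e-6, 10960.0):.2e} m/s")
```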

  5. Making Sociology Relevant: The Assignment and Application of Breaching Experiments

    Science.gov (United States)

    Rafalovich, Adam

    2006-01-01

    Breaching experiments involve the conscious exhibition of "unexpected" behavior, an observation of the types of social reactions such behavioral violations engender, and an analysis of the social structure that makes these social reactions possible. The conscious violation of norms can be highly fruitful for sociology students, providing insights…

  6. The Cryptographic Implications of the LinkedIn Data Breach

    OpenAIRE

    Gune, Aditya

    2017-01-01

    Data security and personal privacy are difficult to maintain in the Internet age. In 2012, professional networking site LinkedIn suffered a breach, compromising the login of over 100 million accounts. The passwords were cracked and sold online, exposing the authentication credentials of millions of users. This manuscript dissects the cryptographic failures implicated in the breach, and explores more secure methods of storing passwords.
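
    As one concrete example of the "more secure methods of storing passwords" the manuscript alludes to, the snippet below shows salted, iterated PBKDF2 hashing with constant-time verification, in contrast to the unsalted SHA-1 hashes reportedly exposed in the 2012 breach. This is a generic illustration using the Python standard library, not code or a recommendation from the manuscript; the iteration count is simply a typical present-day choice.

```python
import hashlib
import hmac
import os

def hash_password(password: str, iterations: int = 600_000):
    """Return (salt, iterations, digest) using salted PBKDF2-HMAC-SHA256."""
    salt = os.urandom(16)                          # unique random salt per user
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, iterations, digest

def verify_password(password: str, salt: bytes, iterations: int, digest: bytes):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison

salt, iters, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, iters, digest))  # True
print(verify_password("wrong guess", salt, iters, digest))                   # False
```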

  7. A Simple model for breach formation by overtopping

    Energy Technology Data Exchange (ETDEWEB)

    Trousseau, P. [Hydro-Quebec, Montreal, PQ (Canada); Kahawita, R. [Ecole Polytechnique, Montreal, PQ (Canada)

    2006-07-01

    Failures in earth or rockfill dams are often caused by overtopping of the crest, leading to initiation and uncontrolled progression of a breach. Overtopping may occur because of large inflows into the reservoir caused by excessive rainfall or by the failure of an upstream dam that causes a large volume of water to suddenly arrive at the downstream reservoir thus rapidly exceeding the storage and spillway evacuation capacity. Breach formation in a rockfill or earthfill dike due to overtopping of the crest is a complex process as it involves interaction between the hydraulics of the flow and the erosion characteristics of the fill material. This paper presented a description and validation of a simple parametric model for breach formation due to overtopping. A study was conducted to model, as closely as possible, the physical processes involved within the restriction of the simplified analysis. The objective of the study was to predict the relevant timescales for the phenomenon leading to a prediction of the outflow hydrograph. The model has been validated on the Oros dam failure in Brazil as well as on embankment tests conducted at Rosvatn, Norway. It was concluded that the major impediment to the development of breach erosion models for use as predictive tools is in the characterization of the erosion behaviour. 19 refs., 2 tabs., 9 figs.
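
    A generic parametric breach-by-overtopping calculation, in the spirit of (but not reproducing) the model described above, can be sketched as follows: the breach widens and deepens linearly over an assumed failure time, outflow follows a broad-crested-weir law, and the reservoir is drawn down by continuity. All parameter values below are invented for illustration.

```python
import numpy as np

def parametric_breach_hydrograph(area_res=2.0e6, h0=20.0, b_final=60.0,
                                 t_fail=3600.0, dt=5.0):
    """Outflow hydrograph for a breach that grows linearly over t_fail seconds.
    area_res : reservoir surface area (m^2), h0 : initial level above the toe (m),
    b_final : final breach width (m)."""
    t, h, times, flows = 0.0, h0, [], []
    while h > 0.05 and t < 10.0 * t_fail:
        frac = min(t / t_fail, 1.0)                  # breach growth fraction
        b = frac * b_final                           # current breach width (m)
        z_invert = h0 * (1.0 - frac)                 # current breach invert (m)
        head = max(h - z_invert, 0.0)
        q = 1.7 * b * head**1.5                      # broad-crested weir (m^3/s)
        h = max(h - q * dt / area_res, 0.0)          # reservoir continuity
        times.append(t); flows.append(q)
        t += dt
    return np.array(times), np.array(flows)

t, q = parametric_breach_hydrograph()
print(f"peak outflow ~ {q.max():.0f} m^3/s at t ~ {t[q.argmax()] / 60:.0f} min")
```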

  8. Decrease the Number of Glovebox Glove Breaches and Failures

    Energy Technology Data Exchange (ETDEWEB)

    Hurtle, Jackie C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2013-12-24

    Los Alamos National Laboratory (LANL) is committed to the protection of the workers, public, and environment while performing work and uses gloveboxes as engineered controls to protect workers from exposure to hazardous materials while performing plutonium operations. Glovebox gloves are a weak link in the engineered controls and are a major cause of radiation contamination events which can result in potential worker exposure and localized contamination making operational areas off-limits and putting programmatic work on hold. Each day of lost opportunity at Technical Area (TA) 55, Plutonium Facility (PF) 4 is estimated at $1.36 million. Between July 2011 and June 2013, TA-55-PF-4 had 65 glovebox glove breaches and failures with an average of 2.7 per month. The glovebox work follows the five step safety process promoted at LANL with a decision diamond interjected for whether or not a glove breach or failure event occurred in the course of performing glovebox work. In the event that no glove breach or failure is detected, there is an additional decision for whether or not contamination is detected. In the event that contamination is detected, the possibility for a glove breach or failure event is revisited.

  9. "Financial Emergency" and the Faculty Furlough: A Breach of Contract.

    Science.gov (United States)

    Richards, Mary Sanders

    1984-01-01

    The power of the university to breach faculty contracts in order to meet its temporary cash-flow problems and the rights of faculty when this breach occurs are discussed. To avoid litigation, a university must have established internal guidelines which can be incorporated into an employment contract. (MLW)

  10. Using Vignette Methodology to research the process of breach comparatively

    NARCIS (Netherlands)

    Boone, M.M.; Beyens, K.; Maguire, N.; Laurinavicius, A.; Persson, A.

    2015-01-01

    Comparative research related to any aspect of the process of breach in either the pre-trial, sentencing or release phases is relatively rare. Comparative studies of decision making in the specific context of breach process are particularly lacking. One reason for the dearth of research in this area

  11. Reasons for Picture Archiving and Communication System (PACS) data security breaches: Intentional versus non-intentional breaches

    Directory of Open Access Journals (Sweden)

    Tintswalo Brenda Mahlaola

    2016-12-01

    Conclusion: Our study supports previous findings that, in the absence of guidelines, most security breaches were non-intentional acts committed due to ignorance. Of concern are incidents in which sensitive information was intentionally shared via social media.

  12. How to Survive a Data Breach

    CERN Document Server

    Mitchell, Stewart

    2009-01-01

    This is the downloadable version of this new pocket guide, which provides essential support for organisations that would like to have a tried and tested procedure in place for dealing with data breaches.

  13. Sodium erosion of boron carbide from breached absorber pins

    International Nuclear Information System (INIS)

    Basmajian, J.A.; Baker, D.E.

    1981-03-01

    The purpose of the irradiation experiment was to provide an engineering demonstration of the irradiation behavior of breached boron carbide absorber pins. By building defects into the cladding of prototypic absorber pins, and performing the irradiation under typical FFTF operating conditions, a qualitative assessment of the consequences of a breach was achieved. Additionally, a direct comparison of pin behavior with that of the ex-reactor test could be made

  14. The A to Z of healthcare data breaches.

    Science.gov (United States)

    Kobus, Theodore J

    2012-01-01

    There currently exists a myriad of privacy laws that impact a healthcare entity, including more than 47 notification laws that require notification when a data breach occurs, as well as the breach notification requirements of the Health Information Technology for Economic and Clinical Health Act. Given the plethora of issues a healthcare entity faces, there are certain principles that can be built into an organization's philosophy that will comply with the law and help protect it from reputational harm. © 2012 American Society for Healthcare Risk Management of the American Hospital Association.

  15. Identifying psychological contract breaches to guide improvements in faculty recruitment, retention, and development.

    Science.gov (United States)

    Peirce, Gretchen L; Desselle, Shane P; Draugalis, JoLaine R; Spies, Alan R; Davis, Tamra S; Bolino, Mark

    2012-08-10

    To identify pharmacy faculty members' perceptions of psychological contract breaches that can be used to guide improvements in faculty recruitment, retention, and development. A list of psychological contract breaches was developed using a Delphi procedure involving a panel of experts assembled through purposive sampling. The Delphi consisted of 4 rounds, the first of which elicited examples of psychological contract breaches in an open-ended format. The ensuing 3 rounds consisted of a survey and anonymous feedback on aggregated group responses. Usable responses were obtained from 11 of 12 faculty members who completed the Delphi procedure. The final list of psychological contract breaches included 27 items, after modifications based on participant feedback in subsequent rounds. The psychological contract breach items generated in this study provide guidance for colleges and schools of pharmacy regarding important aspects of faculty recruitment, retention, and development.

  16. Bouncing back from psychological contract breach: How commitment recovers over time

    NARCIS (Netherlands)

    Solinger, O.N.; Hofmans, J.; Bal, P.M.; Jansen, P.G.W.

    2016-01-01

    The post-violation model of the psychological contract outlines four ways in which a psychological contract may be resolved after breach (i.e., psychological contract thriving, reactivation, impairment, and dissolution). To explore the implications of this model for post-breach restoration of

  17. An Examination of the Explicit Costs of Sensitive Information Security Breaches

    Science.gov (United States)

    Toe, Cleophas Adeodat

    2013-01-01

    Data security breaches are categorized as loss of information that is entrusted in an organization by its customers, partners, shareholders, and stakeholders. Data breaches are significant risk factors for companies that store, process, and transmit sensitive personal information. Sensitive information is defined as confidential or proprietary…

  18. 78 FR 5565 - Modifications to the HIPAA Privacy, Security, Enforcement, and Breach Notification Rules Under...

    Science.gov (United States)

    2013-01-25

    ... RIN 0945-AA03 Modifications to the HIPAA Privacy, Security, Enforcement, and Breach Notification Rules... HIPAA Privacy, Security, Breach Notification, and Enforcement Rules (the HIPAA Rules) to improve their... [cost-table residue: Notices of Privacy Practices, 700,000 covered entities, total cost $55.9 million; Breach...]

  19. I Am So Tired… How Fatigue May Exacerbate Stress Reactions to Psychological Contract Breach.

    Science.gov (United States)

    Achnak, Safâa; Griep, Yannick; Vantilborgh, Tim

    2018-01-01

    Previous research showed that perceptions of psychological contract (PC) breach have undesirable individual and organizational consequences. Surprisingly, the PC literature has paid little to no attention to the relationship between PC breach perceptions and stress. A better understanding of how PC breach may elicit stress seems crucial, given that stress plays a key role in employees' physical and mental well-being. Based on Conservation of Resources Theory, we suggest that PC breach perceptions represent a perceived loss of valued resources, subsequently leading employees to experience higher stress levels resulting from emerging negative emotions. Moreover, we suggest that this mediated relationship is moderated by initial levels of fatigue, due to fatigue lowering the personal resources necessary to cope with breach events. To test our hypotheses, we analyzed the multilevel data we obtained from two experience sampling designs (Study 1: 51 Belgian employees; Study 2: 53 US employees). Note that the unit of analysis is "observations" rather than "respondents," resulting in an effective sample size of 730 (Study 1) and 374 (Study 2) observations. In both studies, we found evidence for the mediating role of negative emotions in the PC breach-stress relationship. In the second study, we also found evidence for the moderating role of fatigue in the mediated PC breach-stress relationship. Implications for research and practice are discussed.

  20. An inverse method to estimate the flow through a levee breach

    Science.gov (United States)

    D'Oria, Marco; Mignosa, Paolo; Tanda, Maria Giovanna

    2015-08-01

    We propose a procedure to estimate the flow through a levee breach based on water levels recorded in river stations downstream and/or upstream of the failure site. The inverse problem is solved using a Bayesian approach and requires the execution of several forward unsteady flow simulations. For this purpose, we have used the well-known 1-D HEC-RAS model, but any unsteady flow model could be adopted in the same way. The procedure has been tested using four synthetic examples. Levee breaches with different characteristics (free flow, flow with tailwater effects, etc.) have been simulated to collect the synthetic level data used at a later stage in the inverse procedure. The method was able to accurately reproduce the flow through the breach in all cases. The practicability of the procedure was then confirmed applying it to the inundation of the Polesine Region (Northern Italy) which occurred in 1951 and was caused by three contiguous and almost simultaneous breaches on the left embankment of the Po River.
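
    The overall structure of such an inverse procedure (parameterize the breach hydrograph, run a forward unsteady-flow model, and score candidate parameters against the recorded levels) can be illustrated with a toy example. In the sketch below a linear-reservoir routing with an arbitrary rating curve stands in for HEC-RAS, and a brute-force grid over a Gaussian likelihood stands in for the paper's full Bayesian treatment; the hydrograph shape, parameter values, and noise level are all invented.

```python
import numpy as np

def breach_hydrograph(t, q_peak, t_peak, t_end):
    """Triangular breach outflow parameterized by peak flow and timing (s)."""
    rising = np.clip(t / t_peak, 0.0, 1.0) * q_peak
    falling = np.clip((t_end - t) / (t_end - t_peak), 0.0, 1.0) * q_peak
    return np.where(t <= t_peak, rising, falling) * (t <= t_end)

def forward_level(t, q_breach, k=3.0e5, h_base=1.0):
    """Toy forward model standing in for HEC-RAS: route the breach outflow
    through a linear reservoir and convert it to a downstream stage."""
    dt = t[1] - t[0]
    storage, levels = 0.0, []
    for q_in in q_breach:
        q_out = storage / k                            # linear-reservoir outflow
        storage += (q_in - q_out) * dt
        levels.append(h_base + (q_out / 150.0)**0.6)   # arbitrary rating curve
    return np.array(levels)

# Synthetic "observed" levels generated with a known peak flow plus noise.
t = np.arange(0.0, 48.0 * 3600.0, 900.0)
h_obs = forward_level(t, breach_hydrograph(t, 800.0, 6 * 3600.0, 30 * 3600.0))
h_obs = h_obs + np.random.default_rng(0).normal(0.0, 0.02, h_obs.size)

# Brute-force "Bayesian" grid: flat prior, Gaussian likelihood on the levels.
q_grid = np.linspace(200.0, 1500.0, 60)
log_post = [-0.5 * np.sum((forward_level(t, breach_hydrograph(t, q, 6 * 3600.0,
            30 * 3600.0)) - h_obs)**2) / 0.02**2 for q in q_grid]
print(f"most probable peak breach flow ~ {q_grid[int(np.argmax(log_post))]:.0f} m^3/s")
```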

  1. Current status of the Run-Beyond-Cladding Breach (RBCB) tests for the Integral Fast Reactor (IFR)

    International Nuclear Information System (INIS)

    Batte, G.L.; Pahl, R.G.; Hofman, G.L.

    1993-01-01

    This paper describes the results from the Integral Fast Reactor (IFR) metallic fuel Run-Beyond-Cladding-Breach (RBCB) experiments conducted in the Experimental Breeder Reactor II (EBR-II). Included in the report are scoping test results and the data collected from the prototypical tests as well as the exam results and discussion from a naturally occurring breach of one of the lead IFR fuel tests. All results showed a characteristic delayed neutron and fission gas release pattern that readily allows for identification and evaluation of cladding breach events. Also, cladding breaches are very small and do not propagate during extensive post breach operation. Loss of fuel from breached cladding was found to be insignificant. The paper will conclude with a brief description of future RBCB experiments planned for irradiation in EBR-II

  2. When employees strike back: investigating mediating mechanisms between psychological contract breach and workplace deviance.

    Science.gov (United States)

    Bordia, Prashant; Restubog, Simon Lloyd D; Tang, Robert L

    2008-09-01

    In this article, psychological contract breach, revenge, and workplace deviance are brought together to identify the cognitive, affective, and motivational underpinnings of workplace deviance. On the basis of S. L. Robinson and R. J. Bennett's (1997) model of workplace deviance, the authors proposed that breach (a cognitive appraisal) and violation (an affective response) initiate revenge seeking. Motivated by revenge, employees then engage in workplace deviance. Three studies tested these ideas. All of the studies supported the hypothesized relationships. In addition, self-control was found to be a moderator of the relationship between revenge cognitions and deviant acts; the relationship was weaker for people high in self-control.

  3. Breached-pin testing in the US

    International Nuclear Information System (INIS)

    Mahagin, D.E.; Lambert, J.D.B.

    1981-04-01

    Experience gained at EBR-II by the late 1970's from a significant number of failures in experimental fuel-pin irradiations forms the basis of a program directed towards the characterization of breached pins. The questions to be answered and the issues raised by further testing are discussed

  4. Probability of expected climate stresses in North America in the next one My

    International Nuclear Information System (INIS)

    Kukla, G.

    1979-01-01

    Climates one million years ahead were predicted upon the assumption that the natural climate variability during the past My will continue. Response of environment and climate in the Basin and Range province of the western USA to global fluctuations was reconstructed; the most remarkable change was the filling of closed basins with large freshwater lakes. Probabilities of permanent ice cover and floods are discussed. It is believed that a site with minimal probability of climate-related breach can be selected

  5. Testing the Differential Effects of Changes in Psychological Contract Breach and Fulfillment

    Science.gov (United States)

    Conway, Neil; Guest, David; Trenberth, Linda

    2011-01-01

    Rousseau (1989 and elsewhere) argued that a defining feature of psychological contract breach was that once a promise had been broken it could not easily be repaired and therefore that the effects of psychological contract breach outweighed those of psychological contract fulfillment. Using two independent longitudinal surveys, this paper…

  6. ''Spinolaminar breach'': an important sign in cervical spinous process fractures

    International Nuclear Information System (INIS)

    Matar, L.D.; Helms, C.A.

    2000-01-01

    Objective. To report the sign of ''spinolaminar breach'' and its likely importance in fractures of the cervical spinous processes.Design. Six cases of spinous process fractures demonstrating disruption of the spinolaminar line or ''spinolaminar breach'' were analyzed. Lateral and anteroposterior radiographs (n=6), CT scans (n=3) and MRI scans (n=1) were reviewed together by the authors, with consensus being reached as to the radiographic findings. Clinical records were also reviewed.Results. The levels of injury were C6 (n=5) and C5 (n=2). Injuries were associated with delayed anterior subluxation (n=4) and neurological deficit (n=2). Five patients were male and one was female with a mean age of 31 years (range 8-59 years). Injuries resulted from motor vehicle accidents (n=4), a motor cycle accident (n=1) and a fall (n=1).Conclusion. ''Spinolaminar breach'', or disruption of the spinolaminar line, indicates a complex spinous process fracture with extension into the lamina and spinal canal. Spinous process fractures with spinolaminar breach may have associated posterior ligamentous injury with potential for delayed instability and neurological deficit. It is important that radiologists and physicians caring for the trauma patient be aware of this sign in order to avoid misdiagnosis as a ''clay shoveler's fracture'', which can lead to adverse outcome. (orig.)

  7. Information Society Services and Mandatory Data Breach Notifications: Introduction to Open Issues in the EU Framework

    OpenAIRE

    Burnik, Jelena

    2012-01-01

    In 2011 Sony suffered an extensive breach in its online game network that led to the theft of account data of 77 million users from all over the world. This was one of the largest internet security break-ins that resulted in a large scale personal data breach. As an answer to numerous incidents of security breaches where personal data have been compromised, an instrument of mandatory data breach notification is currently being implemented in the European Union that follows the approach taken ...

  8. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under and over fitting data as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.
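
    The scaled quantile residual diagnostic mentioned above can be given one plausible (but not necessarily the authors') construction: transform the sorted sample through the estimated CDF, compare each order statistic with its Beta(k, n+1-k) expectation, and scale by that Beta distribution's standard deviation. The sketch below uses invented normal data as a self-check.

```python
import numpy as np
from scipy import stats

def scaled_quantile_residuals(sample, cdf):
    """Scaled residuals of the order statistics of cdf(sample) against their
    Beta(k, n+1-k) means, scaled by the corresponding Beta standard deviation.
    One plausible construction; the paper's exact scaling may differ."""
    u = np.sort(cdf(np.asarray(sample)))
    n = u.size
    k = np.arange(1, n + 1)
    mean = k / (n + 1.0)
    std = np.sqrt(k * (n + 1.0 - k)) / ((n + 1.0) * np.sqrt(n + 2.0))
    return mean, (u - mean) / std

# Data drawn from the same distribution as the CDF should give residuals that
# mostly stay within a few units of zero.
rng = np.random.default_rng(1)
positions, sqr = scaled_quantile_residuals(rng.normal(size=500), stats.norm.cdf)
print(float(np.abs(sqr).max()))
```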

  9. On the importance of default breach remedies

    NARCIS (Netherlands)

    Sloof, R.; Oosterbeek, H.; Sonnemans, J.

    2007-01-01

    Theory predicts that default breach remedies are immaterial whenever contracting costs are negligible. Some experimental studies, however, suggest that in practice default rules do matter, as they may affect parties' preferences over contract terms. This paper presents results from an experiment

  10. Toward a better understanding of psychological contract breach: a study of customer service employees.

    Science.gov (United States)

    Deery, Stephen J; Iverson, Roderick D; Walsh, Janet T

    2006-01-01

    Experiences of psychological contract breach have been associated with a range of negative behavior. However, much of the research has focused on master of business administration alumni and managers and made use of self-reported outcomes. Studying a sample of customer service employees, the research found that psychological contract breach was related to lower organizational trust, which, in turn was associated with perceptions of less cooperative employment relations and higher levels of absenteeism. Furthermore, perceptions of external market pressures moderated the effect of psychological contract breach on absenteeism. The study indicated that psychological contract breach can arise when employees perceive discrepancies between an organization's espoused behavioral standards and its actual behavioral standards, and this can affect discretionary absence. (c) 2006 APA, all rights reserved.

  11. A Longitudinal Study of Age-Related Differences in Reactions to Psychological Contract Breach

    NARCIS (Netherlands)

    Bal, P.M.; Lange, A.H. de; Jansen, P.G.W.; Velde, M.E.G. van der

    2013-01-01

    The current paper investigated age-related differences in the relations of psychological contract breach with work outcomes over time. Based on affective events theory, we expected job satisfaction to mediate the longitudinal relationship of contract breach with changes in job performance. Moreover,

  12. A longitudinal study of age-related differences in reactions to psychological contract breach

    NARCIS (Netherlands)

    Bal, P.M.; de Lange, A.H.; Jansen, P.G.W.; van der Velde, E.G.

    2013-01-01

    The current paper investigated age-related differences in the relations of psychological contract breach with work outcomes over time. Based on affective events theory, we expected job satisfaction to mediate the longitudinal relationship of contract breach with changes in job performance. Moreover,

  13. A longitudinal study of age-related differences in reactions to psychological contract breach

    NARCIS (Netherlands)

    Paul Jansen; Annet de Lange; Matthijs Bal; Mandy van der Velde

    2013-01-01

    The current paper investigated age‐related differences in the relations of psychological contract breach with work outcomes over time. Based on affective events theory, we expected job satisfaction to mediate the longitudinal relationship of contract breach with changes in job performance. Moreover,

  14. Pro-active data breach detection: examining accuracy and applicability on personal information detected

    CSIR Research Space (South Africa)

    Botha, J

    2016-03-01

    Full Text Available breaches but does not provide a clear indication of the level of personal information available on the internet since only reported incidents are taken into account. The possibility of pro-active automated breach detection has previously been discussed as a...

  15. Battling Data Breaches: For Higher Education Institutions, Data Breach Prevention is More Complex than for Industry and Business

    Science.gov (United States)

    Patton, Madeline

    2015-01-01

    Data breach prevention is a battle, rarely plain and never simple. For higher education institutions, the Sisyphean aspects of the task are more complex than for industry and business. Two-year colleges have payrolls and vendor contracts like those enterprises. They also have public record and student confidentiality requirements. Colleges must…

  16. Modelling dune erosion, overwash and breaching at Fire Island (NY) during hurricane Sandy

    NARCIS (Netherlands)

    De Vet, P.L.M.; McCall, R.T.; Den Bieman, J.P.; Stive, M.J.F.; Van Ormondt, M.

    2015-01-01

    In 2012, Hurricane Sandy caused a breach at Fire Island (NY, USA), near Pelican Island. This paper aims at modelling dune erosion, overwash and breaching processes that occured during the hurricane event at this stretch of coast with the numerical model XBeach. By using the default settings, the

  17. The effects of artificial sandbar breaching on the macrophyte communities of an intermittently open estuary

    Science.gov (United States)

    Ribeiro, Jose Pedro N.; Saggio, Ângelo; Lima, Maria Inês Salgueiro

    2013-04-01

    Artificial sandbar opening of intermittently open estuaries is a practice utilised worldwide to improve water quality, fishing, and recreational amenities and to prevent the flooding of adjacent properties. Breaching causes the water level to drop drastically, exposing plants to two water level extremes. With some exceptions, estuarine communities are adversely affected by this practice. Although breaching can happen naturally, artificial breaching is on the rise, and the impact of manipulating water levels on estuarine communities needs to be investigated. In this work, we described the breaching cycles of the Massaguaçu River Estuary and proposed flooding scenarios for the estuary's macrophyte banks based on our data. We calculated the relationship between plant distribution and flooding conditions and used our calculations to predict the estuary community's composition depending on the water level at breaching time. We discovered a strong relationship between plant distribution and flooding conditions, and we predicted that the estuarine community would be markedly different between flooding scenarios. Low frequency flooding scenarios would be related to submerged macrophytes and, as the flooding frequency increases, macrophytes would be replaced by amphibious plants, and eventually by the arboreal stratus. Therefore, we concluded that an increase in artificial breaching cycles would have a detrimental impact on the estuary community.

  18. Do promises matter? An exploration of the role of promises in psychological contract breach.

    Science.gov (United States)

    Montes, Samantha D; Zweig, David

    2009-09-01

    Promises are positioned centrally in the study of psychological contract breach and are argued to distinguish psychological contracts from related constructs, such as employee expectations. However, because the effects of promises and delivered inducements are confounded in most research, the role of promises in perceptions of, and reactions to, breach remains unclear. If promises are not an important determinant of employee perceptions, emotions, and behavioral intentions, this would suggest that the psychological contract breach construct might lack utility. To assess the unique role of promises, the authors manipulated promises and delivered inducements separately in hypothetical scenarios in Studies 1 (558 undergraduates) and 2 (441 employees), and they measured them separately (longitudinally) in Study 3 (383 employees). The authors' results indicate that breach perceptions do not represent a discrepancy between what employees believe they were promised and were given. In fact, breach perceptions can exist in the absence of promises. Further, promises play a negligible role in predicting feelings of violation and behavioral intentions. Contrary to the extant literature, the authors' findings suggest that promises may matter little; employees are concerned primarily with what the organization delivers.

  19. Are Emotions Transmitted From Work to Family? A Crossover Model of Psychological Contract Breach.

    Science.gov (United States)

    Liang, Huai-Liang

    2018-01-01

    Based on affective events theory and the crossover model, this study examines the effect of psychological contract breach on employee dysfunctional behavior and partner family undermining and explores the crossover effect of employee dysfunctional behavior on partner family undermining in work-family issues. This study collected 370 employee-partner dyads (277 male employees, 93 female employees, M age = 43.59 years) from a large manufacturing organization. The results of this study support the conception that employees' psychological contract breach results in frustration in the workplace. In addition, mediation analysis results reveal that psychological contract breach relates to employee dysfunctional behavior in the workplace. The findings show that partners' psychological strain mediates the relationship between employee dysfunctional behavior and partner family undermining. Furthermore, these findings provide investigations for the crossover model to display the value of psychological contract breach in family issues.

  20. Legal Effect of Breach of Warranty in Construction Insurance in Malaysia

    OpenAIRE

    Arazi Idrus; Mahmoud Sodangi; Jamaluddin Yaakob

    2011-01-01

    This study is aimed at analyzing the legal effect of breach of warranty in construction insurance contracts in Malaysia in light of current developments in English insurance law. The required data and information were collected from Malaysian and English court decisions dealing with breach of warranties in English marine insurance law from the online Malayan Law Journal published on the LexisNexis online database and from published textbooks related to insurance warranties. This study...

  1. BREACHING THE SEXUAL BOUNDARIES IN THE DOCTOR–PATIENT RELATIONSHIP: SHOULD ENGLISH LAW RECOGNISE FIDUCIARY DUTIES?

    Science.gov (United States)

    Ost, Suzanne

    2016-01-01

    In this article, I argue that sexual exploitation in the doctor–patient relationship would be dealt with more appropriately by the law in England and Wales on the basis of a breach of fiduciary duty. Three different types of sexual boundary breaches are discussed, and the particular focus is on breaches where the patient's consent is obtained through inducement. I contend that current avenues of redress do not clearly catch this behaviour and, moreover, they fail to capture the essence of the wrong committed by the doctor—the knowing breach of trust for self-gain—and the calculated way in which consent is induced. Finally, I demonstrate that the fiduciary approach is compatible with the contemporary pro-patient autonomy model of the doctor–patient relationship. PMID:26846652

  2. The older, the better! Age-related differences in emotion regulation after psychological contract breach.

    NARCIS (Netherlands)

    Bal, P.M.; Smit, P.

    2012-01-01

    The aim of this paper was to investigate the role of emotion regulation and age in reactions to psychological contract breach in terms of positive and negative affect. The authors expected that, in the context of contract breach, reappraisal emotion regulation would mitigate the negative relation with affect.

  3. Psychological contract breach and job attitudes : A meta-analysis of age as a moderator

    NARCIS (Netherlands)

    Bal, P. Matthijs; De lange, Annet H.; Jansen, Paul G. W.; Van der Velde, Mandy E. G.

    The aim of this study was to examine the influence of age in the relation between psychological contract breach and the development of job attitudes. Based on affective events, social exchange, and lifespan theory, we hypothesized that (1) psychological contract breach would be related negatively to

  4. Joint Precision Approach and Landing System Nunn-McCurdy Breach Root Cause Analysis and Portfolio Assessment Metrics for DOD Weapons Systems. Volume 8

    Science.gov (United States)

    2016-01-01

    [Figure residue: bar charts of ABP breaches, Nunn-McCurdy breaches, quantity changes, and greater-than-5% PAUC and APUC growth (pre-Milestone C) by year (2003, 2004, 2012).]

  5. Measuring information security breach impact and uncertainties under various information sharing scenarios

    OpenAIRE

    Durowoju, Olatunde; Chan, Hing; Wang, Xiaojun

    2013-01-01

    This study draws on information theory and aims to provide simulated evidence using real historical and statistical data to demonstrate how various levels of integration moderate the impact and uncertainties of information security breach on supply chain performance. We find that the supply chain behaves differently under various levels of integration when a security breach occurs. The entropy analysis revealed that the wholesaler experience the most uncertainty under system failure and data ...

  6. Expert system for surveillance and diagnosis of breach fuel elements

    Science.gov (United States)

    Gross, Kenny C.

    1989-01-01

    An apparatus and method are disclosed for surveillance and diagnosis of breached fuel elements in a nuclear reactor. A delayed neutron monitoring system provides output signals indicating the delayed neutron activity and age and the equivalent recoil areas of a breached fuel element. Sensors are used to provide outputs indicating the status of each component of the delayed neutron monitoring system. Detectors also generate output signals indicating the reactor power level and the primary coolant flow rate of the reactor. The outputs from the detectors and sensors are interfaced with an artificial intelligence-based knowledge system which implements predetermined logic and generates output signals indicating the operability of the reactor.

  7. Expert system for surveillance and diagnosis of breach fuel elements

    International Nuclear Information System (INIS)

    Gross, K.C.

    1989-01-01

    An apparatus and method are disclosed for surveillance and diagnosis of breached fuel elements in a nuclear reactor. A delayed neutron monitoring system provides output signals indicating the delayed neutron activity and age and the equivalent recoil areas of a breached fuel element. Sensors are used to provide outputs indicating the status of each component of the delayed neutron monitoring system. Detectors also generate output signals indicating the reactor power level and the primary coolant flow rate of the reactor. The outputs from the detectors and sensors are interfaced with an artificial intelligence-based knowledge system which implements predetermined logic and generates output signals indicating the operability of the reactor

  8. I Am So Tired… How Fatigue May Exacerbate Stress Reactions to Psychological Contract Breach

    Directory of Open Access Journals (Sweden)

    Safâa Achnak

    2018-03-01

    Full Text Available Previous research showed that perceptions of psychological contract (PC) breach have undesirable individual and organizational consequences. Surprisingly, the PC literature has paid little to no attention to the relationship between PC breach perceptions and stress. A better understanding of how PC breach may elicit stress seems crucial, given that stress plays a key role in employees' physical and mental well-being. Based on Conservation of Resources Theory, we suggest that PC breach perceptions represent a perceived loss of valued resources, subsequently leading employees to experience higher stress levels resulting from emerging negative emotions. Moreover, we suggest that this mediated relationship is moderated by initial levels of fatigue, due to fatigue lowering the personal resources necessary to cope with breach events. To test our hypotheses, we analyzed the multilevel data we obtained from two experience sampling designs (Study 1: 51 Belgian employees; Study 2: 53 US employees). Note that the unit of analysis is “observations” rather than “respondents,” resulting in an effective sample size of 730 (Study 1) and 374 (Study 2) observations. In both studies, we found evidence for the mediating role of negative emotions in the PC breach—stress relationship. In the second study, we also found evidence for the moderating role of fatigue in the mediated PC breach—stress relationship. Implications for research and practice are discussed.

  9. I Am So Tired… How Fatigue May Exacerbate Stress Reactions to Psychological Contract Breach

    Science.gov (United States)

    Achnak, Safâa; Griep, Yannick; Vantilborgh, Tim

    2018-01-01

    Previous research showed that perceptions of psychological contract (PC) breach have undesirable individual and organizational consequences. Surprisingly, the PC literature has paid little to no attention to the relationship between PC breach perceptions and stress. A better understanding of how PC breach may elicit stress seems crucial, given that stress plays a key role in employees' physical and mental well-being. Based on Conservation of Resources Theory, we suggest that PC breach perceptions represent a perceived loss of valued resources, subsequently leading employees to experience higher stress levels resulting from emerging negative emotions. Moreover, we suggest that this mediated relationship is moderated by initial levels of fatigue, due to fatigue lowering the personal resources necessary to cope with breach events. To test our hypotheses, we analyzed the multilevel data we obtained from two experience sampling designs (Study 1: 51 Belgian employees; Study 2: 53 US employees). Note that the unit of analysis is “observations” rather than “respondents,” resulting in an effective sample size of 730 (Study 1) and 374 (Study 2) observations. In both studies, we found evidence for the mediating role of negative emotions in the PC breach—stress relationship. In the second study, we also found evidence for the moderating role of fatigue in the mediated PC breach—stress relationship. Implications for research and practice are discussed. PMID:29559935

  10. (PACS) data security breaches: Intentional versus non-intentional ...

    African Journals Online (AJOL)

    Background: The Picture Archiving and Communication System (PACS) has led to an increase in breached health records and violation of patient confidentiality. The South African constitution makes provision for human dignity and privacy, virtues which confidentiality seeks to preserve. Confidentiality thus constitutes a ...

  11. Hydraulics of embankment-dam breaching

    Science.gov (United States)

    Walder, J. S.; Iverson, R. M.; Logan, M.; Godt, J. W.; Solovitz, S.

    2012-12-01

    Constructed or natural earthen dams can pose hazards to downstream communities. Experiments to date on earthen-dam breaching have focused on dam geometries relevant to engineering practice. We have begun experiments with dam geometries more like those of natural dams. Water was impounded behind dams constructed at the downstream end of the USGS debris-flow flume. Dams were made of compacted, well-sorted, moist beach sand (D50=0.21 mm), 3.5 m from toe to toe, but varying in height from 0.5 to 1 m; the lower the dam, the smaller the reservoir volume and the broader the initially flat crest. Breaching was started by cutting a slot 30-40 mm wide and deep in the dam crest after filling the reservoir. Water level and pore pressure within the dam were monitored. Experiments were also recorded by an array of still- and video cameras above the flume and a submerged video camera pointed at the upstream dam face. Photogrammetric software was used to create DEMs from stereo pairs, and particle-image velocimetry was used to compute the surface-velocity field from the motion of tracers scattered on the water surface. As noted by others, breaching involves formation and migration of a knickpoint (or several). Once the knickpoint reaches the upstream dam face, it takes on an arcuate form whose continued migration we determined by measuring the onset of motion of colored markers on the dam face. The arcuate feature, which can be considered the head of the "breach channel", is nearly coincident with the transition from subcritical to supercritical flow; that is, it acts as a weir that hydraulically controls reservoir emptying. Photogenic slope failures farther downstream, although the morphologically dominant process at work, play no role at all in hydraulic control aside from rare instances in which they extend upstream so far as to perturb the weir, where the flow cross section is nearly self-similar through time. The domain downstream of the critical-flow section does influence
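
    The hydraulic control described above is the familiar passage through critical flow at the head of the breach channel. As a small illustration (not the authors' analysis), the sketch below computes the critical depth and unit discharge for a given energy head above the breach invert and checks the Froude number on either side of the control; the head and depths are assumed.

```python
import math

def critical_control(head, g=9.81):
    """Critical-flow (weir-like) control for a rectangular breach section:
    with energy head `head` above the invert and negligible approach velocity,
    critical depth is 2/3 of the head and q = sqrt(g) * h_c**1.5."""
    h_c = (2.0 / 3.0) * head                  # critical depth (m)
    q = math.sqrt(g) * h_c**1.5               # discharge per unit width (m^2/s)
    return h_c, q

def froude(q, h, g=9.81):
    """Froude number for unit discharge q flowing at depth h."""
    return (q / h) / math.sqrt(g * h)

h_c, q = critical_control(0.6)                # 0.6 m of head over the breach
print(round(froude(q, h_c), 2))               # ~1.0 at the control section
print(round(froude(q, 1.5 * h_c), 2))         # <1 upstream  (subcritical)
print(round(froude(q, 0.5 * h_c), 2))         # >1 downstream (supercritical)
```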

  12. A socio-emotional selectivity perspective on age-related differences in reactions to psychological contract breach

    NARCIS (Netherlands)

    Matthijs Bal; Paul Jansen; Annet de Lange; Mandy van der Velde

    2013-01-01

    The current paper investigated age-related differences in the relations of psychological contract breach with work outcomes over time. Based on affective events theory, we expected job satisfaction to mediate the longitudinal relationship of contract breach with changes in job performance. Moreover,

  13. Psychological Contract Breach and Job Attitudes: A Meta-Analysis of Age as a Moderator

    Science.gov (United States)

    Bal, P. Matthijs; De Lange, Annet H.; Jansen, Paul G. W.; Van Der Velde, Mandy E. G.

    2008-01-01

    The aim of this study was to examine the influence of age in the relation between psychological contract breach and the development of job attitudes. Based on affective events, social exchange, and lifespan theory, we hypothesized that (1) psychological contract breach would be related negatively to job attitudes, and (2) that age would moderate…

  14. Quick Look Report for Chemical Reactivity Modeling of Various Multi-Canister Overpack Breaches

    International Nuclear Information System (INIS)

    Bratton, Robert Lawrence

    2002-01-01

    A uranium oxide coating covers the exposed uranium metal, yet uranium hydride can still form under the protective oxide coating over the 40-year interim storage time span. The current treatment process at Hanford does not remove chemically bound water contained in the hydrates or in the waters of hydration. The chemically bound water is the source material for hydrogen production over the 40-year storage time. So, additional uranium hydride creates concerns that breaches of an MCO with the appropriate size openings could result in the onset of bulk uranium oxidation with the potential of a self-sustaining thermal excursion or pyrophoric event. For this analysis, the worst-case scenario appears to be the match head configuration in a vertically standing MCO, where all the reactive surface area is placed on the tips of the fuel elements. This configuration concentrates the heat-producing chemical reaction at the tips of the fuel elements. Because no mechanistic drop analysis has been performed at this time to determine the MCO failure modes, parametric breach configurations were chosen in this analysis to determine the MCOs' external thermal response range. The first breach is a pair of holes that suddenly open in the MCO wall. This thermal excursion is controlled by the "thermal chimney effect" in the 4.27-m (14-ft) tall canisters caused by the multiple holes breach (one high and one low). A second breach where the MCO lid is suddenly removed and exposed to the ambient air environment is evaluated. This thermal excursion is controlled by the countercurrent flow through the top of the MCO. Computer models for these breach configurations were constructed and executed

  15. Review Article: Lake and breach hazard assessment for moraine-dammed lakes: an example from the Cordillera Blanca (Peru)

    Directory of Open Access Journals (Sweden)

    A. Emmer

    2013-06-01

    Full Text Available Glacial lake outburst floods (GLOFs) and related debris flows represent a significant threat in high mountainous areas across the globe. It is necessary to quantify this threat so as to mitigate their catastrophic effects. Complete GLOF hazard assessment incorporates two phases: the probability of water release from a given glacial lake is estimated through lake and breach hazard assessment while the endangered areas are identified during downstream hazard assessment. This paper outlines a number of methods of lake and breach hazard assessment, which can be grouped into three categories: qualitative, of which we outline eight; semi-quantitative, of which we outline two; and quantitative, of which we outline three. It is considered that five groups of critical parameters are essential for an accurate regionally focused hazard assessment method for moraine-dammed lakes in the Cordillera Blanca. These comprise the possibility of dynamic slope movements into the lake, the possibility of a flood wave from a lake situated upstream, the possibility of dam rupture following a large earthquake, the size of the dam freeboard (or ratio of dam freeboard), and a distinction between natural dams and those with remedial work. It is shown that none of the summarised methods uses all of these criteria; at most, three of the five are considered by any of the outlined methods. A number of these methods were used on six selected moraine-dammed lakes in the Cordillera Blanca: lakes Quitacocha, Checquiacocha, Palcacocha, Llaca, Rajucolta, and Tararhua. The results have been compared and show that each method has certain advantages and disadvantages when used in this region. These methods demonstrate that the most hazardous lake is Lake Palcacocha.

  16. The Role of HIPAA Omnibus Rules in Reducing the Frequency of Medical Data Breaches: Insights From an Empirical Study.

    Science.gov (United States)

    Yaraghi, Niam; Gopal, Ram D

    2018-03-01

    Policy Points: Frequent data breaches in the US health care system undermine the privacy of millions of patients every year, a large number of which happen among business associates of the health care providers that continue to gain unprecedented access to patients' data as the US health care system becomes digitally integrated. Implementation of the HIPAA Omnibus Rules in 2013 has led to a significant decrease in the number of privacy breach incidents among business associates. Frequent data breaches in the US health care system undermine the privacy of millions of patients every year. A large number of such breaches happen among business associates of the health care providers that continue to gain unprecedented access to patients' data as the US health care system becomes digitally integrated. The Omnibus Rules of the Health Insurance Portability and Accountability Act (HIPAA), which were enacted in 2013, significantly increased the regulatory oversight and privacy protection requirements of business associates. The objective of this study is to empirically examine the effects of this shift in policy on the frequency of medical privacy breaches among business associates in the US health care system. The findings of this research shed light on how regulatory efforts can protect patients' privacy. Using publicly available data on breach incidents between October 2009 and August 2017 as reported by the Office for Civil Rights (OCR), we conducted an interrupted time-series analysis and a difference-in-differences analysis to examine the immediate and long-term effects of implementation of HIPAA omnibus rules on the frequency of medical privacy breaches. We show that implementation of the omnibus rules led to a significant reduction in the number of breaches among business associates and prevented 180 privacy breaches from happening, which could have affected nearly 18 million Americans. Implementation of HIPAA omnibus rules may have been a successful federal policy

  17. High stakes. HITECH's privacy provisions will make costly security breaches even more painful to bear.

    Science.gov (United States)

    Gamble, Kate Huvane

    2009-07-01

    * The HITECH section of ARRA includes provisions relating to protected health information that could significantly alter the C-suite leader's strategy. * Patients will be entitled to request an accounting of disclosure for up to three years after the date of request. The onus will be on hospital leaders to put in place a process that makes accounting available without disrupting operations or patient care. * Because of the increased risks hospitals now face, it is critical that executives are aware of the new requirements, and are either involved in or have a solid understanding of the organization's breach notification policies.

  18. After the data breach: Managing the crisis and mitigating the impact.

    Science.gov (United States)

    Brown, Hart S

    2016-01-01

    Historically, the unauthorised access and theft of information was a tactic used between countries as part of espionage campaigns, during times of conflict as well as for personal and criminal purposes. The consumers of the information were relatively isolated and specific. As information became stored and digitised in larger quantities in the 1980s, the ability to access mass amounts of records at one time became possible. The expertise needed to remotely access and exfiltrate the data was not readily available and the number of markets to monetise the data was limited. Over the past ten years, shadow networks have been used by criminals to collaborate on hacking techniques, exchange hacking advice anonymously and commercialise data on the black market. The intersection of these networks, along with the unintentional losses of information, has resulted in 5,810 data breaches made public since 2005 (comprising some 847,807,830 records) and the velocity of these events is increasing. Organisations must be prepared for a potential breach event to maintain cyber resiliency. Proper management of a breach response can reduce response costs and can serve to mitigate potential reputational losses.

  19. FFTF criteria for run to cladding breach experiments

    International Nuclear Information System (INIS)

    Van Keuren, J.C.; Heard, F.J.; Stepnewski, D.D.

    1985-12-01

    The review of experiments proposed for irradiation in FFTF resulted in the development of new criteria for run-to-cladding breach experiments. These criteria have allowed irradiation of aggressive experiments without compromising the safety bases for FFTF. This paper, consisting of a set of narrated slides, discusses these criteria and related bases

  20. 50 CFR 38.9 - Breach of the peace.

    Science.gov (United States)

    2010-10-01

    ... 50 Wildlife and Fisheries 6 2010-10-01 2010-10-01 false Breach of the peace. 38.9 Section 38.9... peace. No person on Midway Atoll National Wildlife Refuge will: (a) With intent to cause public..., gestures, or displays, or address abusive language to any person present; or create a hazardous or...

  1. Direct and Indirect Effects of Psychological Contract Breach on Academicians’ Turnover Intention in Turkey

    OpenAIRE

    Buyukyilmaz, Ozan; Cakmak, Ahmet F.

    2013-01-01

    This study aims to investigate the assumed direct and indirect relationships between psychological contract breach and turnover intention through psychological contract violation and perceived organizational support. Data for the sample was collected from 570 academicians from a variety of universities in Turkey. Hierarchical regression analyses were conducted to test the hypotheses. The results show that psychological contract breach was positively related to turnover intention and psycholog...

  2. Pinhole Breaches in Spent Fuel Containers: Some Modeling Considerations

    International Nuclear Information System (INIS)

    Casella, Andrew M.; Loyalka, Sudarsham K.; Hanson, Brady D.

    2006-01-01

    This paper replaces PNNL-SA-48024 and incorporates the ANS reviewer's comments, including the change in the title. Numerical methods to solve the equations for gas diffusion through very small breaches in spent fuel containers are presented and compared with previous literature results

  3. Routes for breaching and protecting genetic privacy

    OpenAIRE

    Erlich, Yaniv; Narayanan, Arvind

    2013-01-01

    We are entering an era of ubiquitous genetic information for research, clinical care and personal curiosity. Sharing these datasets is vital for progress in biomedical research. However, one growing concern is the ability to protect the genetic privacy of the data originators. Here, we present an overview of genetic privacy breaching strategies. We outline the principles of each technique, point to the underlying assumptions, and assess its technological complexity and maturati...

  4. Observation and modeling of the evolution of an ephemeral storm-induced inlet: Pea Island Breach, North Carolina, USA

    Science.gov (United States)

    Velasquez Montoya, Liliana; Sciaudone, Elizabeth J.; Mitasova, Helena; Overton, Margery F.

    2018-03-01

    The Outer Banks of North Carolina is a wave-dominated barrier island system that has experienced the opening and closure of numerous inlets in the last four centuries. The most recent of those inlets formed after the breaching of Pea Island during Hurricane Irene in 2011. The Pea Island Breach experienced a rapid evolution including episodic curvature of the main channel, rotation of the ebb channel, shoaling, widening by Hurricane Sandy in 2012, and finally closing before the summer of 2013. Studying the life cycle of Pea Island Breach contributes to understanding the behavior of ephemeral inlets in breaching-prone regions. This topic has gained relevance due to rising sea levels, a phenomenon that increases the chances of ephemeral inlet formation during extreme events. This study explores the spatiotemporal effects of tides, waves, and storms on flow velocities and morphology of the breach by means of remotely sensed data, geospatial metrics, and a numerical model. The combined use of observations and results from modeling experiments allowed building a conceptual model to explain the life cycle of Pea Island Breach. Wave seasonality dominated the morphological evolution of the inlet by controlling the magnitude and direction of the longshore current that continuously built transient spits at both sides of the breach. Sensitivity analysis to external forcings indicates that ocean waves can modify water levels and velocities in the back barrier. Sound-side storm surge regulates overall growth rate, duration, and decay of peak water levels entering the inlet during extreme events.

  5. 24 CFR 982.453 - Owner breach of contract.

    Science.gov (United States)

    2010-04-01

    ...) If the owner has committed fraud, bribery or any other corrupt or criminal act in connection with any..., bribery or any other corrupt or criminal act in connection with the mortgage or loan. (5) If the owner has... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Owner breach of contract. 982.453...

  6. Electroconvulsive therapy, hypertensive surge, blood-brain barrier breach, and amnesia

    DEFF Research Database (Denmark)

    Andrade, Chittaranjan; Bolwig, Tom G

    2014-01-01

    Preclinical and clinical evidence show that electroconvulsive therapy (ECT)-induced intraictal surge in blood pressure may result in a small, transient breach in the blood-brain barrier, leading to mild cerebral edema and a possible leach of noxious substances from blood into brain tissues. These changes may impair neuronal functioning and contribute to the mechanisms underlying ECT-induced cognitive deficits. Some but not all clinical data on the subject suggest that blood pressure changes during ECT correlate with indices of cognitive impairment. In animal models, pharmacological manipulations ... convincing evidence of benefits. It is concluded that there is insufficient support, at present, for the hypothesis that the hypertensive surge during ECT and the resultant blood-brain barrier breach contribute meaningfully to ECT-induced cognitive deficits. Future research should address the subset ...

  7. Exploring the relationship between ADHD symptoms and prison breaches of discipline amongst youths in four Scottish prisons.

    Science.gov (United States)

    Gordon, V; Williams, D J; Donnelly, P D

    2012-04-01

    To explore the relationship between attention deficit hyperactivity disorder (ADHD) symptoms (inattention, hyperactivity and impulsivity) and violent and non-violent prison breaches of discipline in incarcerated male youths aged 18-21 years. A case-control study of 169 male youth offenders incarcerated in Scottish prisons and classified as 'symptomatic' or 'non-symptomatic' of inattentive and hyperactive/impulsive ADHD symptoms. ADHD symptoms were measured using the Conners' Adult ADHD Rating Scales-Self Report: Long Version, and prison breaches of discipline were gathered from the Scottish Prison Service's Prisoner Records System. Youths who were symptomatic of Diagnostic and Statistical Manual of Mental Disorders, 4th Edition (DSM-IV) ADHD total symptoms had a significantly higher number of prison breaches of discipline than those who were non-symptomatic. Youths who were symptomatic of DSM-IV hyperactive/impulsive symptoms had a significantly higher number of violent and non-violent prison breaches of discipline than those who were non-symptomatic. However, no such significant difference was found between youths who were symptomatic and non-symptomatic of DSM-IV inattentive symptoms. Young male offenders who are symptomatic of ADHD have a higher number of prison breaches of discipline. In particular, symptoms of hyperactivity/impulsivity are associated with breaches of both a violent and non-violent nature. Implications of such symptoms on rehabilitation and recidivism are discussed. Copyright © 2012 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.

  8. Psychological contract types as moderator in the breach-violation and violation-burnout relationships.

    Science.gov (United States)

    Jamil, Amber; Raja, Usman; Darr, Wendy

    2013-01-01

    This research examined the relationships between perceived psychological contract breach, felt violation, and burnout in a sample (n = 361) of employees from various organizations in Pakistan. The moderating role of contract types in these relationships was also tested. Findings supported a positive association between perceived psychological contract breach and felt violation and both were positively related to burnout. Transactional and relational contracts moderated the felt violation-burnout relationship. Scores on relational contract type tended to be higher than for transactional contract type showing some contextual influence.

  9. BLT [Breach, Leach, and Transport]: A source term computer code for low-level waste shallow land burial

    International Nuclear Information System (INIS)

    Suen, C.J.; Sullivan, T.M.

    1990-01-01

    This paper discusses the development of a source term model for low-level waste shallow land burial facilities and separates the problem into four individual compartments. These are water flow, corrosion and subsequent breaching of containers, leaching of the waste forms, and solute transport. For the first and the last compartments, we adopted the existing codes, FEMWATER and FEMWASTE, respectively. We wrote two new modules for the other two compartments in the form of two separate Fortran subroutines -- BREACH and LEACH. They were incorporated into a modified version of the transport code FEMWASTE. The resultant code, which contains all three modules of container breaching, waste form leaching, and solute transport, was renamed BLT (for Breach, Leach, and Transport). This paper summarizes the overall program structure and logistics, and presents two examples from the results of verification and sensitivity tests. 6 refs., 7 figs., 1 tab
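
    As a purely conceptual illustration of the breach-leach coupling described above (not the actual Fortran modules, their interfaces, or their rate laws, all of which are assumed here), a time-stepped source-term loop might look like this:

```python
def breached_fraction(t_years, onset=10.0, rate=0.05):
    """Hypothetical container-breaching model: no release before `onset`,
    then a constant fraction of containers breaches per year (capped at 1)."""
    return 0.0 if t_years < onset else min(1.0, rate * (t_years - onset))

def leach_rate(inventory, k=1e-3):
    """Hypothetical first-order leaching of the exposed waste form."""
    return k * inventory

def source_term(total_inventory, years, dt=1.0):
    """Couple the two toy modules to produce a release-rate series that a
    transport module (FEMWASTE-like) would take as its source term."""
    inventory = total_inventory
    releases = []
    t = 0.0
    while t < years:
        exposed = breached_fraction(t) * inventory  # inventory behind failed containers
        release = leach_rate(exposed) * dt          # amount leached this step
        inventory -= release
        releases.append((t, release))
        t += dt
    return releases

for t, r in source_term(total_inventory=1.0, years=50, dt=10.0):
    print(f"t = {t:4.0f} y, release = {r:.3e} (fraction of initial inventory)")
```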

  10. Psychological contract breach and outcomes: Combining meta-analysis and structural equation models.

    Science.gov (United States)

    Topa Cantisano, Gabriela; Morales Domínguez, J Francisco; Depolo, Marco

    2008-08-01

    In this study, meta-analytic procedures were used to examine the relationships between psychological contract perceived breach and certain outcome variables, such as organizational commitment, job satisfaction and organizational citizenship behaviours (OCB). Our review of the literature generated 41 independent samples in which perceived breach was used as a predictor of these personal and organizational outcomes. A medium effect size (ES) for desirable outcomes (job satisfaction, organizational commitment, organizational trust, OCB and performance) was obtained (r=-.35). For undesirable outcomes (neglect in role duties and intention to leave), ES were also medium (r=.31). When comparing attitudinal (job satisfaction, organizational commitment, organizational trust) and behavioural outcomes (OCB, neglect in role duties and performance), a stronger ES was found for attitudinal (r=-.24) than for behavioural outcomes (r=-.11). Potential moderator variables were examined, and it was found that they explained only a percentage of variability of primary studies. Structural equation analysis of the pooled meta-analytical correlation matrix indicated that the relationships of perceived breach with satisfaction, OCB, intention to leave and performance are fully mediated by organizational trust and commitment. Results are discussed in order to suggest theoretical and empirical implications.
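
    The pooling step behind effect sizes such as r = -.35 is commonly done by weighting Fisher-z transformed correlations by sample size. The sketch below shows that generic procedure; the correlations and sample sizes are invented for illustration, and the study's exact meta-analytic method may differ.

```python
import math

def pool_correlations(studies):
    """Pool (r, n) pairs via Fisher's z transform, weighting each study by
    n - 3 (the inverse variance of z). Returns the back-transformed mean r."""
    num = 0.0
    den = 0.0
    for r, n in studies:
        z = 0.5 * math.log((1 + r) / (1 - r))  # Fisher z transform
        w = n - 3                               # inverse-variance weight
        num += w * z
        den += w
    z_bar = num / den
    return (math.exp(2 * z_bar) - 1) / (math.exp(2 * z_bar) + 1)  # back-transform

# Illustrative (made-up) breach/job-satisfaction correlations from three samples:
print(pool_correlations([(-0.30, 150), (-0.40, 220), (-0.35, 90)]))
```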

  11. ECONOMIC AND LEGAL ASPECTS OF THE PLANNED DAMAGES ACTIONS FOR THE BREACHES OF EC ANTITRUST LAW

    Directory of Open Access Journals (Sweden)

    Elena Isac

    2010-09-01

    Full Text Available This paper investigates the planned damages actions for breaches of EC antitrust law in order to assess their impact on consumer welfare. It first examines the current legal situation and concurs that the European Union needs to regulate damages actions for breaches of EC antitrust law so that a higher number of consumers could be compensated for their losses. This paper then discusses the main legal provisions proposed by the Commission in the Green and in the White paper on damages actions for breaches of EC antitrust law. The analysis of these proposed legal provisions is done using arguments specific to the economic analysis of law. It is demonstrated that most of these proposed legal provisions will enhance consumer welfare but that there are also proposed legal provisions which will damage consumer welfare. The paper concludes that the planned damages actions for breaches of the EC law will be an improvement compared to the current situation. However, the Commission should amend some of the proposed legal provisions in order to help consumers further.

  12. Quick Look Report for Chemical Reactivity Modeling of Various Multi-Canister Overpack Breaches

    Energy Technology Data Exchange (ETDEWEB)

    Bratton, Robert Lawrence

    2002-04-01

    A uranium oxide coating covers the exposed uranium metal, yet uranium hydride can still form under the protective oxide coating over the 40-year interim storage time span. The current treatment process at Hanford does not remove chemically bound water contained in the hydrates or in the waters of hydration. The chemically bound water is the source material for hydrogen production over the 40-year storage time. So, additional uranium hydride creates concerns that breaches of an MCO with the appropriate size openings could result in the onset of bulk uranium oxidation with the potential of a self-sustaining thermal excursion or pyrophoric event. For this analysis, the worst-case scenario appears to be the match head configuration in a vertically standing MCO, where all the reactive surface area is placed on the tips of the fuel elements. This configuration concentrates the heat-producing chemical reaction at the tips of the fuel elements. Because no mechanistic drop analysis has been performed at this time to determine the MCO failure modes, parametric breach configurations were chosen in this analysis to determine the MCOs’ external thermal response range. The first breach is a pair of holes that suddenly open in the MCO wall. This thermal excursion is controlled by the “thermal chimney effect” in the 4.27-m (14-ft) tall canisters caused by the multiple holes breach (one high and one low). A second breach where the MCO lid is suddenly removed and exposed to the ambient air

  13. Breaching barriers to collaboration in public spaces

    DEFF Research Database (Denmark)

    Heinemann, Trine; Mitchell, Robb

    2014-01-01

    Technology provoking disparate individuals to collaborate or share experiences in the public space faces a difficult barrier, namely the ordinary social order of urban places. We employed the notion of the breaching experiment to explore how this barrier might be overcome. We analyse responses...... of life in public spaces. Arising from this, we argue for the importance of qualities such as availability, facilitation, perspicuous settings, and perspicuous participants to encourage and support co-located strangers to collaborate and share experiences....

  14. CONTAINMENT EVALUATION OF BREACHED AL-SNF FOR CASK TRANSPORT

    International Nuclear Information System (INIS)

    Vinson, D. W.; Sindelar, R. L.; Iyer, N. C.

    2005-01-01

    Aluminum-based spent nuclear fuel (Al-SNF) from foreign and domestic research reactors (FRR/DRR) is being shipped to the Savannah River Site. To enter the U.S., the cask with loaded fuel must be certified to comply with the requirements in Title 10 of the U.S. Code of Federal Regulations, Part 71. The requirements include demonstration of containment of the cask with its contents under normal and accident conditions. Al-SNF is subject to corrosion degradation in water storage, and many of the fuel assemblies are "failed" or have through-clad damage. A methodology has been developed with technical bases to show that Al-SNF with cladding breaches can be directly transported in standard casks and maintained within the allowable release rates. The approach to evaluate the limiting allowable leakage rate, L_R, for a cask with breached Al-SNF for comparison to its test leakage rate could be extended to other nuclear material systems. The approach for containment analysis of Al-SNF follows calculations for commercial spent fuel as provided in NUREG/CR-6487 that adopts ANSI N14.5 as a methodology for containment analysis. The material-specific features and characteristics of damaged Al-SNF (fuel materials, fabrication techniques, microstructure, radionuclide inventory, and vapor corrosion rates) that were derived from literature sources and/or developed in laboratory testing are applied to generate the four containment source terms that yield four separate cask cavity activity densities; namely, those from fines; gaseous fission product species; volatile fission product species; and fuel assembly crud. The activity values, A_2, are developed per the guidance of 10CFR71. The analysis is performed parametrically to evaluate the maximum number of breached assemblies and exposed fuel area for a proposed shipment in a cask with a test leakage rate
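
    The core of such a containment evaluation is comparing the cask's test leakage rate with an allowable volumetric leakage rate derived from the releasable activity suspended in the cavity. A minimal sketch of that comparison follows, using the general ANSI N14.5-style relation L = R / C; the release-rate limit, activity, and cavity volume used here are assumptions for illustration, not values from the paper.

```python
def allowable_leak_rate(release_limit_ci_per_s, cavity_activity_ci, cavity_volume_cm3):
    """L = R / C: volumetric leak rate (cm^3/s) that keeps the release rate at
    or below the regulatory limit, given the activity concentration of
    releasable material suspended in the cask cavity gas."""
    concentration = cavity_activity_ci / cavity_volume_cm3  # Ci per cm^3
    return release_limit_ci_per_s / concentration

# Hypothetical inputs: 1e-7 Ci of releasable material mixed into a 5e5 cm^3
# cavity, with an assumed release-rate limit of 1e-12 Ci/s.
L_R = allowable_leak_rate(1e-12, 1e-7, 5e5)
print(f"allowable leakage rate ~ {L_R:.3e} cm^3/s")
```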

  15. Coastal bathymetry data collected in May 2015 from Fire Island, New York—Wilderness breach and shoreface

    Science.gov (United States)

    Nelson, Timothy R.; Miselis, Jennifer L.; Hapke, Cheryl J.; Brenner, Owen T.; Henderson, Rachel E.; Reynolds, Billy J.; Wilson, Kathleen E.

    2017-05-12

    Scientists from the U.S. Geological Survey (USGS) St. Petersburg Coastal and Marine Science Center in St. Petersburg, Florida, conducted a bathymetric survey of Fire Island from May 6-20, 2015. The USGS is involved in a post-Hurricane Sandy effort to map and monitor the morphologic evolution of the wilderness breach as a part of the Hurricane Sandy Supplemental Project GS2-2B. During this study, bathymetry data were collected with single-beam echo sounders and Global Positioning Systems, which were mounted to personal watercraft, along the Fire Island shoreface and within the wilderness breach. Additional bathymetry and elevation data were collected using backpack Global Positioning Systems on flood shoals and in shallow channels within the wilderness breach.

  16. Breach of duty: Power of shareholders to ratify directors fraudulent dealings

    Directory of Open Access Journals (Sweden)

    Anthony O. Nwafor

    2014-07-01

    Full Text Available Company directors owe a duty of loyalty to the company, which prohibits them from fraudulent dealings in the course of conducting the affairs of the company. Although the shareholders could, in the exercise of their voting powers, grant relief to the directors from liabilities arising from a breach of duty that amounts to fraud, the extent and capacity in which the shareholders could exercise such powers are confounded by the courts' elusive attempts at defining fraud. The paper argues that without a definite meaning ascribed to fraud, the power and capacity in which the shareholders could ratify a breach of duty arising from self-dealing and expropriation of corporate opportunities by directors cannot be predetermined, but that each case would be based on the peculiarities of its own facts.

  17. Motive Matters! An exploration of the notion ‘deliberate breach of contract’ and its consequences for the application of remedies

    OpenAIRE

    Kogelenberg, Martijn

    2012-01-01

    This thesis explores the notion of deliberate breach of contract and its potential remedial consequences. In the major jurisdictions in Europe and in the United States the notion of deliberate breach of contract is generally not coherently and officially defined and acknowledged as an independent legal phenomenon. The ultimate added value of this thesis intends to be a first coherent comparative research on deliberate breach of contract and its potential consequences for the core ...

  18. The relationships between perceived organizational support, affective commitment, psychological contract breach, organizational citizenship behaviour and work engagement.

    Science.gov (United States)

    Gupta, Vishal; Agarwal, Upasna A; Khatri, Naresh

    2016-11-01

    This study examines the factors that mediate and moderate the relationships of perceived organizational support with work engagement and organization citizenship behaviour. Specifically, affective commitment is posited to mediate and psychological contract breach to moderate the above relationships. Nurses play a critical role in delivering exemplary health care. For nurses to perform at their best, they need to experience high engagement, which can be achieved by providing them necessary organizational support and proper working environment. Data were collected via a self-reported survey instrument. A questionnaire was administered to a random sample of 750 nurses in nine large hospitals in India during 2013-2014. Four hundred and seventy-five nurses (63%) responded to the survey. Hierarchical multiple regression was used for statistical analysis of the moderated-mediation model. Affective commitment was found to mediate the positive relationships between perceived organizational support and work outcomes (work engagement, organizational citizenship behaviour). The perception of unfulfilled expectations (psychological contract breach) was found to moderate the perceived organizational support-work outcome relationships adversely. The results of this study indicate that perceived organizational support exerts its influence on work-related outcomes and highlight the importance of taking organizational context, such as perceptions of psychological contract breach, into consideration when making sense of the influence of perceived organizational support on affective commitment, work engagement and citizenship behaviours of nurses. © 2016 John Wiley & Sons Ltd.

  19. Consequences of and remedies for breach of natural gas contracts

    International Nuclear Information System (INIS)

    Gretener, N. M.; Evans, A.; Callihoo, M.

    1999-01-01

    A common clause in a gas purchase contract is one that provides for specific damages for the non-performance of an obligation. As a rule, damages will be calculated based on the loss in the value of the bargain plus those losses foreseeably caused by the breach of contract, the rationale being to put the non-breaching party in as good a position as it would have been had the contract been performed. This paper examines the complex issues involved in assessing and measuring damages, the concept of injunctive relief in circumstances where damages will be inadequate or insufficient to prevent injustice, the doctrine of mitigation, the extent of the right of set-off between different contracts, and the impact of bankruptcy and insolvency laws on the exercise of remedies. Four case histories are presented to illustrate the Courts' treatment of gas purchase contracts in the context of bankruptcies and/or insolvencies. 36 refs

  20. Consequences of and remedies for breach of natural gas contracts

    Energy Technology Data Exchange (ETDEWEB)

    Gretener, N. M.; Evans, A.; Callihoo, M. [Bennett Jones Law Group, Calgary, AB (Canada)

    1999-07-01

    A common clause in a gas purchase contract is one that provides for specific damages for the non-performance of an obligation. As a rule, damages will be calculated based on the loss in the value of the bargain plus those losses foreseeably caused by the breach of contract, the rationale being to put the non-breaching party in as good a position as it would have been had the contract been performed. This paper examines the complex issues involved in assessing and measuring damages, the concept of injunctive relief in circumstances where damages will be inadequate or insufficient to prevent injustice, the doctrine of mitigation, the extent of the right of set-off between different contracts, and the impact of bankruptcy and insolvency laws on the exercise of remedies. Four case histories are presented to illustrate the Courts' treatment of gas purchase contracts in the context of bankruptcies and/or insolvencies. 36 refs.

  1. Work engagement, psychological contract breach and job satisfaction

    OpenAIRE

    Rayton, Bruce A.; Yalabik, Zeynep Y.

    2014-01-01

    This study extends both Social Exchange Theory and the Job Demands-Resources model by examining the link between psychological contract breach (PCB) and work engagement, and by integrating job satisfaction into this exchange relationship. We argue that PCB reflects employees' feelings of resource loss, and that these feelings impact work engagement through their impact on job satisfaction. Levels of employee work engagement can therefore be viewed as reciprocation for the exchange content pro...

  2. Establishing breach of the duty of care in the tort of negligence.

    Science.gov (United States)

    Tingle, John

    This article, the third in a series on clinical negligence, looks at the law surrounding breach of the duty of care in negligence. It shows some of the principles that judges and lawyers use in order to decide whether a person has broken his/her duty of care in the tort of negligence. It will be seen that the principles are contained in decided court cases, some of which are quite old but are still relevant today. The focus of this article is on the rule that courts, in deciding the issue of a breach of duty of care, would judge the defendant's conduct by the standard of what the hypothetical, 'reasonable person' would have done in the circumstances of the case.

  3. Psychological contract breach in the anticipatory stage of change : Employee responses and the moderating role of supervisory informational justice

    NARCIS (Netherlands)

    De Ruiter, M.; Schaveling, J.; Schalk, R.; Gelder, van D.

    2016-01-01

    This study examined the impact of two types of psychological contract breach (organizational policies and social atmosphere breach) on resistance to change and engagement in the anticipatory phase of change and assessed whether supervisory informational justice mitigated the negative effects of

  4. Psychological contract breach in the anticipatory stage of change : Employee responses and the moderating role of supervisory informational justice

    NARCIS (Netherlands)

    de Ruiter, M.; Schalk, R.; Schaveling, Jaap; van Gelder, Daniel

    This study examined the impact of two types of psychological contract breach (organizational policies and social atmosphere breach) on resistance to change and engagement in the anticipatory phase of change and assessed whether supervisory informational justice mitigated the negative effects of

  5. Routes for breaching and protecting genetic privacy.

    Science.gov (United States)

    Erlich, Yaniv; Narayanan, Arvind

    2014-06-01

    We are entering an era of ubiquitous genetic information for research, clinical care and personal curiosity. Sharing these data sets is vital for progress in biomedical research. However, a growing concern is the ability to protect the genetic privacy of the data originators. Here, we present an overview of genetic privacy breaching strategies. We outline the principles of each technique, indicate the underlying assumptions, and assess their technological complexity and maturation. We then review potential mitigation methods for privacy-preserving dissemination of sensitive data and highlight different cases that are relevant to genetic applications.

  6. Plugging Effects on Depressurization Time in Dry Storage Containers with Pinhole Breaches

    International Nuclear Information System (INIS)

    Casella, Andrew M.; Loyalka, Sudarshan K.; Hanson, Brady D.

    2006-01-01

    As a continuation of previous work, we now examine the effect that aerosol deposition may have on plugging pinhole breaches in spent fuel containers. A model is developed considering only diffusive settling

  7. Burden of high fracture probability worldwide: secular increases 2010-2040.

    Science.gov (United States)

    Odén, A; McCloskey, E V; Kanis, J A; Harvey, N C; Johansson, H

    2015-09-01

    The number of individuals aged 50 years or more at high risk of osteoporotic fracture worldwide in 2010 was estimated at 158 million and is set to double by 2040. The aim of this study was to quantify the number of individuals worldwide aged 50 years or more at high risk of osteoporotic fracture in 2010 and 2040. A threshold of high fracture probability was set at the age-specific 10-year probability of a major fracture (clinical vertebral, forearm, humeral or hip fracture) which was equivalent to that of a woman with a BMI of 24 kg/m² and a prior fragility fracture but no other clinical risk factors. The prevalence of high risk was determined worldwide and by continent using all available country-specific FRAX models, applying the population demography of each country. Twenty-one million men and 137 million women had a fracture probability at or above the threshold in the world for the year 2010. The greatest number of men and women at high risk were from Asia (55 %). Worldwide, the number of high-risk individuals is expected to double over the next 40 years. We conclude that individuals with high probability of osteoporotic fractures comprise a very significant disease burden to society, particularly in Asia, and that this burden is set to increase markedly in the future. These analyses provide a platform for the evaluation of risk assessment and intervention strategies.
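
    The worldwide counts rest on a simple aggregation: age-specific prevalence of high fracture probability multiplied by the corresponding population. A minimal sketch of that aggregation follows; the age bands, prevalences, and populations are placeholders, not the study's data.

```python
def count_high_risk(prevalence_by_age, population_by_age):
    """Sum prevalence(age band) * population(age band) over all age bands."""
    return sum(prevalence_by_age[band] * population_by_age[band]
               for band in prevalence_by_age)

# Hypothetical age bands with made-up prevalences and populations (in millions):
prevalence = {"50-64": 0.05, "65-79": 0.20, "80+": 0.45}
population = {"50-64": 300.0, "65-79": 150.0, "80+": 40.0}
print(f"~{count_high_risk(prevalence, population):.0f} million at high risk")
```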

  8. GIS inundation mapping and dam breach analysis of Woolwich Dam using HEC-geoRAS

    Energy Technology Data Exchange (ETDEWEB)

    Mocan, N. [Crozier and Associates Inc., Collingwood, ON (Canada); Joy, D.M. [Guelph Univ., ON (Canada); Rungis, G. [Grand River Conservation Authority, Cambridge, ON (Canada)

    2006-07-01

    A study was conducted to determine the extent of flood inundation given a hypothetical dam breach scenario of the Woolwich Dam located in the Grand River Watershed, 2.5 km north of the Town of Elmira, Ontario. The dam is operated by the Grand River Conservation Authority and was constructed to provide low-flow augmentation to Canagagigue Creek. Advances in the computational capabilities of numerical models along with the availability of fine resolution geospatial data have led to significant advances in the evaluation of catastrophic consequences due to the ensuing flood waters when dams fail. The hydraulic models HEC-RAS and HEC-GeoRAS were used in this study along with GIS to produce high resolution spatial and temporal flood inundation mapping. Given the proximity to the Town of Elmira, the dam is classified as having a high hazard potential. The large size and high hazard potential of the dam suggest that the Inflow Design Flood (IDF) is the Probable Maximum Flood (PMF) event. The outlet structure of the spillway consists of 4 ogee-type concrete spillways equipped with radial gates. A low-level concrete pipe located within the spillway structure provides spillage for maintenance purposes. The full flow capacity of the spillway structure is 297 cubic metres per second at the full supply level of 364.8 metres. In addition to GIS flood inundation maps, this paper included the results of flood hydrographs, water surface profiles and peak flow data. It was concluded that techniques used in this analysis should be considered for use in the development of emergency management planning and dam safety assessments across Canada. 6 refs., 3 tabs., 4 figs.
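
    Dam-breach studies of this kind are driven by a breach outflow hydrograph routed out of the reservoir. The sketch below is a much-simplified level-pool routing through a linearly widening rectangular breach, intended only to illustrate the idea; it is not the HEC-RAS computation, and every parameter is invented.

```python
def breach_hydrograph(area_m2, init_depth_m, final_width_m, t_fail_s, dt=10.0):
    """Level-pool routing: the reservoir is treated as a prism of constant
    surface area; the breach widens linearly over t_fail_s and discharges
    as a rectangular weir (SI coefficient ~1.7)."""
    h = init_depth_m
    t = 0.0
    hydrograph = []
    while h > 0.01:
        b = final_width_m * min(1.0, t / t_fail_s)  # current breach width
        q = 1.7 * b * h ** 1.5                      # weir equation, m^3/s
        h = max(0.0, h - q * dt / area_m2)          # draw the pool down
        hydrograph.append((t, q))
        t += dt
    return hydrograph

# Hypothetical reservoir: 5e5 m^2 surface, 8 m deep, 30 m final breach width,
# breach developing over one hour.
peak = max(q for _, q in breach_hydrograph(5e5, 8.0, 30.0, 3600.0))
print(f"peak breach outflow ~ {peak:.0f} m^3/s")
```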

  9. Failure Predictions for Graphite Reflector Bricks in the Very High Temperature Reactor with the Prismatic Core Design

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Gyanender, E-mail: sing0550@umn.edu [Department of Mechanical Engineering, University of Minnesota, 111, Church St. SE, Minneapolis, MN 55455 (United States); Fok, Alex [Minnesota Dental Research in Biomaterials and Biomechanics, School of Dentistry, University of Minnesota, 515, Delaware St. SE, Minneapolis, MN 55455 (United States); Department of Mechanical Engineering, University of Minnesota, 111, Church St. SE, Minneapolis, MN 55455 (United States); Mantell, Susan [Department of Mechanical Engineering, University of Minnesota, 111, Church St. SE, Minneapolis, MN 55455 (United States)

    2017-06-15

    Highlights: • Failure probability of VHTR reflector bricks predicted through crack modeling. • Criterion chosen for defining failure strongly affects the predictions. • Breaching of the CRC could be significantly delayed through crack arrest. • Capability to predict crack initiation and propagation demonstrated. - Abstract: Graphite is used in nuclear reactor cores as a neutron moderator, reflector and structural material. The dimensions and physical properties of graphite change when it is exposed to neutron irradiation. The non-uniform changes in the dimensions and physical properties lead to the build-up of stresses over the course of time in the core components. When the stresses reach the critical limit, i.e. the strength of the material, cracking occurs and ultimately the components fail. In this paper, an explicit crack modeling approach to predict the probability of failure of a VHTR prismatic reactor core reflector brick is presented. Firstly, a constitutive model for graphite is constructed and used to predict the stress distribution in the reflector brick under in-reactor conditions of high temperature and irradiation. Fracture simulations are performed as part of a Monte Carlo analysis to predict the probability of failure. Failure probability is determined based on two different criteria for defining failure time: A) crack initiation and B) crack extension to near the control rod channel. A significant difference is found between the failure probabilities based on the two criteria. It is predicted that the reflector bricks will start cracking during the time range of 5–9 years, while breaching of the control rod channels will occur during the period of 11–16 years. The results show that, due to crack arrest, there is a significant delay between crack initiation and breaching of the control rod channel.
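
    The Monte Carlo step can be illustrated in miniature: sample a brick strength from an assumed distribution, compare it against a time-dependent stress history, and accumulate failures. The sketch below uses a Weibull strength distribution and a made-up stress history; it does not reproduce the explicit crack modeling or crack-arrest behavior described in the paper.

```python
import random

def failure_probability(stress_history_mpa, char_strength_mpa=25.0,
                        weibull_modulus=8.0, n_samples=20000, seed=1):
    """For each Monte Carlo sample, draw a brick strength from a Weibull
    distribution and record the first year its stress exceeds that strength.
    Returns the cumulative failure probability per year."""
    rng = random.Random(seed)
    failures_by_year = [0] * len(stress_history_mpa)
    for _ in range(n_samples):
        strength = char_strength_mpa * rng.weibullvariate(1.0, weibull_modulus)
        for year, stress in enumerate(stress_history_mpa):
            if stress > strength:
                # mark this sample as failed for this and all later years
                for later in range(year, len(stress_history_mpa)):
                    failures_by_year[later] += 1
                break
    return [f / n_samples for f in failures_by_year]

# Hypothetical irradiation-induced stress build-up (MPa) over 15 years:
stresses = [2 * (y + 1) for y in range(15)]
for year, p in enumerate(failure_probability(stresses), start=1):
    print(f"year {year:2d}: cumulative failure probability = {p:.3f}")
```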

  10. Rapid reservoir erosion, hyperconcentrated flow, and downstream deposition triggered by breaching of 38 m tall Condit Dam, White Salmon River, Washington

    Science.gov (United States)

    Wilcox, Andrew C.; O'Connor, James E.; Major, Jon J.

    2014-01-01

    Condit Dam on the White Salmon River, Washington, a 38 m high dam impounding a large volume (1.8 million m3) of fine-grained sediment (60% sand, 35% silt and clay, and 5% gravel), was rapidly breached in October 2011. This unique dam decommissioning produced dramatic upstream and downstream geomorphic responses in the hours and weeks following breaching. Blasting a 5 m wide hole into the base of the dam resulted in rapid reservoir drawdown, abruptly releasing ~1.6 million m3 of reservoir water, exposing reservoir sediment to erosion, and triggering mass failures of the thickly accumulated reservoir sediment. Within 90 min of breaching, the reservoir's water and ~10% of its sediment had evacuated. At a gauging station 2.3 km downstream, flow increased briefly by 400 m3 s−1 during passage of the initial pulse of released reservoir water, followed by a highly concentrated flow phase—up to 32% sediment by volume—as landslide-generated slurries from the reservoir moved downstream. This hyperconcentrated flow, analogous to those following volcanic eruptions or large landslides, draped the downstream river with predominantly fine sand. During the ensuing weeks, suspended-sediment concentration declined and sand and gravel bed load derived from continued reservoir erosion aggraded the channel by >1 m at the gauging station, after which the river incised back to near its initial elevation at this site. Within 15 weeks after breaching, over 1 million m3 of suspended load is estimated to have passed the gauging station, consistent with estimates that >60% of the reservoir's sediment had eroded. This dam removal highlights the influence of interactions among reservoir erosion processes, sediment composition, and style of decommissioning on rate of reservoir erosion and consequent downstream behavior of released sediment.

  11. SECURITY BREACH IN TRADING SYSTEM-COUNTERMEASURE USING IPTRACEBACK

    OpenAIRE

    M. P. Rajakumar; V. Shanthi

    2014-01-01

    The economic sector has recently been facing frequent security breaches that have a heavy impact on the financial soundness of companies, particularly on firms' stock prices. The ultimate consequence is that the whole business comes to a standstill. Estimates from the financial sector suggest that losses incurred from virus and worm attacks have the greatest impact on the prosperity of a business entity. Thus, security strategies attempt to revolve around th...

  12. Mining of high utility-probability sequential patterns from uncertain databases.

    Directory of Open Access Journals (Sweden)

    Binbin Zhang

    Full Text Available High-utility sequential pattern mining (HUSPM) has become an important issue in the field of data mining. Several HUSPM algorithms have been designed to mine high-utility sequential patterns (HUSPs). They have been applied in several real-life situations such as consumer behavior analysis and event detection in sensor networks. Nonetheless, most studies on HUSPM have focused on mining HUSPs in precise data. But in real life, uncertainty is an important factor as data is collected using various types of sensors that are more or less accurate. Hence, data collected in a real-life database can be annotated with existing probabilities. This paper presents a novel pattern mining framework called high utility-probability sequential pattern mining (HUPSPM) for mining high utility-probability sequential patterns (HUPSPs) in uncertain sequence databases. A baseline algorithm with three optional pruning strategies is presented to mine HUPSPs. Moreover, to speed up the mining process, a projection mechanism is designed to create a database projection for each processed sequence, which is smaller than the original database. Thus, the number of unpromising candidates can be greatly reduced, as well as the execution time for mining HUPSPs. Substantial experiments both on real-life and synthetic datasets show that the designed algorithm performs well in terms of runtime, number of candidates, memory usage, and scalability for different minimum utility and minimum probability thresholds.
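
    At its simplest, the acceptance test behind HUPSPM keeps a candidate sequential pattern only if both its utility and its probability in the uncertain database reach user-given thresholds. The toy sketch below shows such a filtering step; it is a simplification, not the authors' algorithm with its pruning strategies and projection mechanism, and the data structures and thresholds are invented.

```python
def is_high_utility_probability(pattern_occurrences, min_utility, min_probability):
    """pattern_occurrences: list of (utility, existence_probability) pairs,
    one per sequence in which the candidate pattern occurs. The pattern is
    kept if its summed utility and summed probability both reach the
    user-specified thresholds (a simplified acceptance test)."""
    total_utility = sum(u for u, _ in pattern_occurrences)
    total_probability = sum(p for _, p in pattern_occurrences)
    return total_utility >= min_utility and total_probability >= min_probability

# Toy uncertain sequence database: the pattern <a, b> occurs in three
# sequences with these (utility, probability) contributions.
occurrences = [(12.0, 0.9), (7.0, 0.6), (15.0, 0.8)]
print(is_high_utility_probability(occurrences, min_utility=30.0, min_probability=2.0))
```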

  13. Liability to disgorge profits upon breach of contract or a delict

    NARCIS (Netherlands)

    Schrage, E.J.H.

    2013-01-01

    Remedies regarding contract and tort are, generally speaking, concerned with the incidence of liability for loss or damage suffered, whereas the claim in unjust enrichment is said to require that the enrichment has occurred at the expense of the creditor. Consequently claims for breach of contract

  14. High probability of disease in angina pectoris patients

    DEFF Research Database (Denmark)

    Høilund-Carlsen, Poul F.; Johansen, Allan; Vach, Werner

    2007-01-01

    BACKGROUND: According to most current guidelines, stable angina pectoris patients with a high probability of having coronary artery disease can be reliably identified clinically. OBJECTIVES: To examine the reliability of clinical evaluation with or without an at-rest electrocardiogram (ECG......) in patients with a high probability of coronary artery disease. PATIENTS AND METHODS: A prospective series of 357 patients referred for coronary angiography (CA) for suspected stable angina pectoris were examined by a trained physician who judged their type of pain and Canadian Cardiovascular Society grade...... on CA. Of the patients who had also an abnormal at-rest ECG, 14% to 21% of men and 42% to 57% of women had normal MPS. Sex-related differences were statistically significant. CONCLUSIONS: Clinical prediction appears to be unreliable. Addition of at-rest ECG data results in some improvement, particularly...

  15. Run-Beyond-Cladding-Breach (RBCB) test results for the Integral Fast Reactor (IFR) metallic fuels program

    International Nuclear Information System (INIS)

    Batte, G.L.; Hoffman, G.L.

    1990-01-01

    In 1984 Argonne National Laboratory (ANL) began an aggressive program of research and development based on the concept of a closed system for fast-reactor power generation and on-site fuel reprocessing, exclusively designed around the use of metallic fuel. This is the Integral Fast Reactor (IFR). Although the Experimental Breeder Reactor-II (EBR-II) has used metallic fuel since its creation 25 years ago, in 1985 ANL began a study of the characteristics and behavior of an advanced-design metallic fuel based on uranium-zirconium (U-Zr) and uranium-plutonium-zirconium (U-Pu-Zr) alloys. During the past five years several areas were addressed concerning the performance of this fuel system. In all instances of testing, the metallic fuel has demonstrated its ability to perform reliably to high burnups under varying design conditions. This paper will present one area of testing which concerns the fuel system's performance under breach conditions. It is the purpose of this paper to document the observed post-breach behavior of this advanced-design metallic fuel. 2 figs., 1 tab

  16. Social comparison and perceived breach of psychological contract: their effects on burnout in a multigroup analysis.

    Science.gov (United States)

    Cantisano, Gabriela Topa; Domínguez, J Francisco Morales; García, J Luis Caeiro

    2007-05-01

    This study focuses on the mediator role of social comparison in the relationship between perceived breach of psychological contract and burnout. A previous model showing the hypothesized effects of perceived breach on burnout, both direct and mediated, is proposed. The final model reached an optimal fit to the data and was confirmed through multigroup analysis using a sample of Spanish teachers (N = 401) belonging to preprimary, primary, and secondary schools. Multigroup analyses showed that the model fit all groups adequately.

  17. Decomposition of conditional probability for high-order symbolic Markov chains

    Science.gov (United States)

    Melnik, S. S.; Usatenko, O. V.

    2017-07-01

    The main goal of this paper is to develop an estimate for the conditional probability function of random stationary ergodic symbolic sequences with elements belonging to a finite alphabet. We elaborate on a decomposition procedure for the conditional probability function of sequences considered to be high-order Markov chains. We represent the conditional probability function as the sum of multilinear memory function monomials of different orders (from zero up to the chain order). This allows us to introduce a family of Markov chain models and to construct artificial sequences via a method of successive iterations, taking into account at each step increasingly high correlations among random elements. At weak correlations, the memory functions are uniquely expressed in terms of the high-order symbolic correlation functions. The proposed method fills the gap between two approaches, namely the likelihood estimation and the additive Markov chains. The obtained results may have applications for sequential approximation of artificial neural network training.
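
    Before any decomposition into memory functions, the conditional probability function of a high-order symbolic chain can be estimated empirically by counting context-symbol occurrences. The sketch below shows that baseline estimate on a toy binary sequence; it does not implement the multilinear memory-function decomposition proposed in the paper.

```python
from collections import Counter, defaultdict

def conditional_probabilities(sequence, order):
    """Estimate P(next symbol | previous `order` symbols) by counting how
    often each (context, symbol) pair appears in the sequence."""
    context_counts = Counter()
    pair_counts = defaultdict(Counter)
    for i in range(len(sequence) - order):
        context = tuple(sequence[i:i + order])
        nxt = sequence[i + order]
        context_counts[context] += 1
        pair_counts[context][nxt] += 1
    return {ctx: {s: c / context_counts[ctx] for s, c in nxts.items()}
            for ctx, nxts in pair_counts.items()}

# Toy binary sequence and a second-order (two-symbol context) estimate:
seq = "0110100110010110" * 4
probs = conditional_probabilities(seq, order=2)
print(probs[("0", "1")])  # distribution of the symbol following the context "01"
```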

  18. Motive Matters! An exploration of the notion ‘deliberate breach of contract’ and its consequences for the application of remedies

    NARCIS (Netherlands)

    M. van Kogelenberg (Martijn)

    2012-01-01

    This thesis explores the notion of deliberate breach of contract and its potential remedial consequences. In the major jurisdictions in Europe and in the United States the notion of deliberate breach of contract is generally not coherently and officially defined and acknowledged as an

  19. When it comes to securing patient health information from breaches, your best medicine is a dose of prevention: A cybersecurity risk assessment checklist.

    Science.gov (United States)

    Blanke, Sandra J; McGrady, Elizabeth

    2016-07-01

    Health care stakeholders are concerned about the growing risk of protecting sensitive patient health information from breaches. The Federal Emergency Management Agency (FEMA) has identified cyber attacks as an emerging concern, and regulations such as the Health Insurance Portability and Accountability Act (HIPAA) and the Health Information Technology for Economic and Clinical Health Act (HITECH) have increased security requirements and are enforcing compliance through stiff financial penalties. The purpose of this study is to describe health care breaches of protected information, analyze the hazards and vulnerabilities of reported breach cases, and prescribe best practices of managing risk through security controls and countermeasures. Prescriptive findings were used to construct a checklist tool to assess and monitor common risks. This research uses a case methodology to describe specific examples of the 3 major types of cyber breach hazards: portable device, insider, and physical breaches. We utilize a risk management framework to prescribe preventative actions that organizations can take to assess, analyze, and mitigate these risks. The health care sector has the largest number of reported breaches, with 3 major types: portable device, insider, and physical breaches. Analysis of actual cases indicates security gaps requiring prescriptive fixes based on "best practices." Our research culminates in a 25-item checklist that organizations can use to assess existing practices and identify security gaps requiring improvement. © 2016 American Society for Healthcare Risk Management of the American Hospital Association.

  20. Coaches in the Courtroom: Recovery in Actions for Breach of Employment Contracts.

    Science.gov (United States)

    Graves, Judson

    1986-01-01

    The rapid hiring and firing of college athletic coaches, the litigation brought in breach of employment contracts, and the special problems presented by coaching contracts have raised hard legal questions about proper methods of contract enforcement and recovery of damages. (MSE)

  1. 47 CFR 64.2011 - Notification of customer proprietary network information security breaches.

    Science.gov (United States)

    2010-10-01

    ... information security breaches. 64.2011 Section 64.2011 Telecommunication FEDERAL COMMUNICATIONS COMMISSION... Proprietary Network Information § 64.2011 Notification of customer proprietary network information security... criminal investigation or national security, such agency may direct the carrier not to so disclose or...

  2. First HIV legal precedent in Kyrgyzstan: breach of medical privacy.

    Science.gov (United States)

    Iriskulbekov, Erik; Balybaeva, Asylgul

    2007-12-01

    A recent court case of a breach of the privacy rights of a person living with HIV/AIDS in Kyrgyzstan is the first of its kind in Central Asia, write Erik Iriskulbekov and Asylgul Balybaeva. ADILET, the NGO that brought the case to court, is one of only a few NGOs in Central Asia that provide legal assistance related to HIV and AIDS.

  3. The Relationship between Psychological Contract Breach and Organizational Commitment: Exchange Imbalance as a Moderator of the Mediating Role of Violation

    Science.gov (United States)

    Cassar, Vincent; Briner, Rob B.

    2011-01-01

    This study tested the mediating role of violation in the relationship between breach and both affective and continuance commitment and the extent to which this mediating role is moderated by exchange imbalance amongst a sample of 103 sales personnel. Results suggest that violation mediated the relationship between breach and commitment. Also,…

  4. Focus in High School Mathematics: Statistics and Probability

    Science.gov (United States)

    National Council of Teachers of Mathematics, 2009

    2009-01-01

    Reasoning about and making sense of statistics and probability are essential to students' future success. This volume belongs to a series that supports National Council of Teachers of Mathematics' (NCTM's) "Focus in High School Mathematics: Reasoning and Sense Making" by providing additional guidance for making reasoning and sense making part of…

  5. Crevasse Splays Versus Avulsions: A Recipe for Land Building With Levee Breaches

    Science.gov (United States)

    Nienhuis, Jaap H.; Törnqvist, Torbjörn E.; Esposito, Christopher R.

    2018-05-01

    Natural-levee breaches can not only initiate an avulsion but also, under the right circumstances, lead to crevasse splay formation and overbank sedimentation. The formative conditions for crevasse splays are not well understood, yet such river sediment diversions form an integral part of billion-dollar coastal restoration projects. Here we use Delft3D to investigate the influence of vegetation and soil consolidation on the evolution of a natural-levee breach. Model simulations show that crevasse splays heal because floodplain aggradation reduces the water surface slope, decreasing water discharge into the flood basin. Easily erodible and unvegetated floodplains increase the likelihood for channel avulsions. Denser vegetation and less potential for soil consolidation result in small crevasse splays that are not only efficient sediment traps but also short-lived. Successful crevasse splays that generate the largest land area gain for the imported sediment require a delicate balance between water and sediment discharge, vegetation root strength, and soil consolidation.
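
    The healing mechanism described above (floodplain aggradation flattens the water-surface slope, which throttles flow through the levee breach) can be illustrated with a back-of-the-envelope calculation. The sketch below is not the study's Delft3D model; it simply applies Manning's equation to a hypothetical crevasse channel with assumed geometry, roughness and water levels.

```python
# Hedged illustration (not the study's Delft3D model): Manning's equation for a
# hypothetical crevasse channel, showing how basin filling (floodplain aggradation)
# flattens the water-surface slope and throttles the diverted discharge.

def manning_discharge(area_m2, hyd_radius_m, slope, n=0.03):
    """Steady uniform-flow discharge (m^3/s) from Manning's equation."""
    return (1.0 / n) * area_m2 * hyd_radius_m ** (2.0 / 3.0) * slope ** 0.5

# Assumed crevasse-channel geometry: 50 m wide, 2 m deep, roughness n = 0.03.
width, depth = 50.0, 2.0
area = width * depth
hyd_radius = area / (width + 2.0 * depth)

river_stage = 5.0       # m above datum (assumed)
reach_length = 5000.0   # m from breach to flood basin (assumed)

for basin_level in (0.0, 2.0, 4.0, 4.8):   # basin water level rises as it aggrades
    slope = max(river_stage - basin_level, 0.0) / reach_length
    q = manning_discharge(area, hyd_radius, slope)
    print(f"basin level {basin_level:3.1f} m -> slope {slope:.1e} -> Q ~ {q:6.1f} m^3/s")
```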

  6. Breaching confidentiality: medical mandatory reporting laws in Iran.

    Science.gov (United States)

    Milanifar, Alireza; Larijani, Bagher; Paykarzadeh, Parvaneh; Ashtari, Golanna; Mehdi Akhondi, Mohammad

    2014-01-01

    Medical ethics is a realm where four important subjects, namely philosophy, medicine, theology and law, intersect. Cooperation between physicians and philosophers in this area can greatly improve the formation of the relevant ethical rules. In addition to respecting the autonomy of the patient, the physician's obligation is to ensure that a medical intervention benefits the patient and that harm is minimal. There is an obvious conflict between the duty of confidentiality and the duty of mandatory reporting. Professional confidentiality is one of the basic components of building a lasting physician-patient relationship, and it remains a subject of discussion. The legal obligation of confidentiality is not absolute. In the physician-patient relationship, keeping the patient's secrets and maintaining confidentiality is a legal and ethical duty, and disclosure of such secrets is governed mainly by specific statutes. Thus, there are a number of situations in which breach of confidentiality is permitted in different legal systems. One such situation is mandatory medical reporting to the relevant authority, which is recognized in many countries' legal systems. Commonly covered situations include notification of births and deaths, infectious diseases, child abuse, sports-related events, medical errors, drug side effects and dangerous pregnancies. In this paper, we examine and discuss mandatory medical reporting and its ethical and legal aspects in the judicial and legal system of Iran and a few other countries. Finally, we suggest enacting a Medical Mandatory Reporting Law in Iran.

  7. Paternalistic breaches of confidentiality in prison: mental health professionals' attitudes and justifications.

    Science.gov (United States)

    Elger, Bernice Simone; Handtke, Violet; Wangmo, Tenzin

    2015-06-01

    This manuscript presents mental health practitioners' (MHPs) practice, attitudes and justifications for breaching confidentiality when imprisoned patients disclose suicidal thoughts or abuse by others. 24 MHPs working in Swiss prisons shared their experiences regarding confidentiality practices. The data were analysed qualitatively and MHPs' attitudes and courses of action were identified. Analysis revealed paternalistic breaches of confidentiality. When patients reported suicidal thoughts and abuse, MHPs believed that forgoing confidentiality was necessary to protect patients, providing several justifications for it. Patients were informed that such information would be transmitted without their consent to medical and non-medical prison personnel. With reference to suicide risk, MHPs resorted to measures intended to reduce suicide attempts, such as transfer to hospital or internal changes in living arrangements, which would require provision of certain information to prison guards. In cases of abuse, some MHPs convinced patients to accept intervention or sometimes overrode competent patients' refusals to report. Also in the case of abuse, provision of limited information to other prison personnel was seen as an acceptable way to protect patients from further harm. Breaches of confidentiality, whether limited or full, remain unethical when used for competent patients based solely on paternalistic justifications. Institutionalising ethical and legal procedures to address suicidal and abuse situations would be helpful. Education and training to help both medical and prison personnel respond to such situations in an appropriate manner that ensures confidentiality and protects patients from suicide and abuse are necessary. Published by the BMJ Publishing Group Limited.

  8. One after the other : Effects of sequence patterns of breaches and overfulfilled obligations

    NARCIS (Netherlands)

    de Jong, Jeroen; Rigotti, Thomas; Mulder, J.

    2017-01-01

    To date, the study of psychological contracts has primarily centred on the question how retrospective evaluations of the psychological contract impact employee attitudes and behaviours, and/or focus on individual coping processes in explaining responses to breached or overfulfilled obligations. In

  9. Transient response and radiation dose estimates for breaches to a spent fuel processing facility

    Energy Technology Data Exchange (ETDEWEB)

    Solbrig, Charles W., E-mail: soltechco@aol.com; Pope, Chad; Andrus, Jason

    2014-08-15

    Highlights: • We model doses received from a nuclear fuel facility from boundary leaks due to an earthquake. • The supplemental exhaust system (SES) starts after breach causing air to be sucked into the cell. • Exposed metal fuel burns increasing pressure and release of radioactive contamination. • Facility releases are small and much less than the limits showing costly refits are unnecessary. • The method presented can be used in other nuclear fuel processing facilities. - Abstract: This paper describes the analysis of the design basis accident for Idaho National Laboratory Fuel Conditioning Facility (FCF). The facility is used to process spent metallic nuclear fuel. This analysis involves a model of the transient behavior of the FCF inert atmosphere hot cell following an earthquake initiated breach of pipes passing through the cell boundary. Such breaches allow the introduction of air and subsequent burning of pyrophoric metals. The model predicts the pressure, temperature, volumetric releases, cell heat transfer, metal fuel combustion, heat generation rates, radiological releases and other quantities. The results show that releases from the cell are minimal and satisfactory for safety. This analysis method should be useful in other facilities that have potential for damage from an earthquake and could eliminate the need to back fit facilities with earthquake proof boundaries or lessen the cost of new facilities.

  10. Transient response and radiation dose estimates for breaches to a spent fuel processing facility

    International Nuclear Information System (INIS)

    Solbrig, Charles W.; Pope, Chad; Andrus, Jason

    2014-01-01

    Highlights: • We model doses received from a nuclear fuel facility from boundary leaks due to an earthquake. • The supplemental exhaust system (SES) starts after breach causing air to be sucked into the cell. • Exposed metal fuel burns increasing pressure and release of radioactive contamination. • Facility releases are small and much less than the limits showing costly refits are unnecessary. • The method presented can be used in other nuclear fuel processing facilities. - Abstract: This paper describes the analysis of the design basis accident for Idaho National Laboratory Fuel Conditioning Facility (FCF). The facility is used to process spent metallic nuclear fuel. This analysis involves a model of the transient behavior of the FCF inert atmosphere hot cell following an earthquake initiated breach of pipes passing through the cell boundary. Such breaches allow the introduction of air and subsequent burning of pyrophoric metals. The model predicts the pressure, temperature, volumetric releases, cell heat transfer, metal fuel combustion, heat generation rates, radiological releases and other quantities. The results show that releases from the cell are minimal and satisfactory for safety. This analysis method should be useful in other facilities that have potential for damage from an earthquake and could eliminate the need to back fit facilities with earthquake proof boundaries or lessen the cost of new facilities

  11. Re-assessment of road accident data-analysis policy : applying theory from involuntary, high-consequence, low-probability events like nuclear power plant meltdowns to voluntary, low-consequence, high-probability events like traffic accidents

    Science.gov (United States)

    2002-02-01

    This report examines the literature on involuntary, high-consequence, low-probability (IHL) events like nuclear power plant meltdowns to determine what can be applied to the problem of voluntary, low-consequence high-probability (VLH) events like tra...

  12. MINIMIZING GLOVEBOX GLOVE BREACHES, PART IV: CONTROL CHARTS

    International Nuclear Information System (INIS)

    Cournoyer, Michael E.; Lee, Michelle B.; Schreiber, Stephen B.

    2007-01-01

    At the Los Alamos National Laboratory (LANL) Plutonium Facility, plutonium isotopes and other actinides are handled in a glovebox environment. The spread of radiological contamination, and excursions of contaminants into the worker's breathing zone, are minimized and/or prevented through the use of glovebox technology. Within the glovebox configuration, the glovebox gloves are the most vulnerable part of this engineering control. Recognizing this vulnerability, the Glovebox Glove Integrity Program (GGIP) was developed to minimize and/or prevent unplanned openings in the glovebox environment, i.e., glove failures and breaches. In addition, LANL implemented the 'Lean Six Sigma (LSS)' program that incorporates the practices of Lean Manufacturing and Six Sigma technologies and tools to effectively improve administrative and engineering controls and work processes. One tool used in LSS is the control chart, which is an effective way to characterize data collected from unplanned openings in the glovebox environment. The benefit management receives from using this tool is two-fold. First, control charts signal the absence or presence of systematic variations that result in process instability, in relation to glovebox glove breaches and failures. Second, these graphical representations of process variation determine whether an improved process is under control. Further, control charts are used to identify statistically significant variations (trends) that can be used in decision making to improve processes. This paper discusses performance indicators assessed by the use of control charts, provides examples of control charts, and shows how managers use the results to make decisions. This effort contributes to the LANL Continuous Improvement Program by improving the efficiency, cost effectiveness, and formality of glovebox operations.
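
    As a concrete illustration of the control-chart idea, the sketch below builds a c-chart (count chart) for hypothetical monthly glove-breach counts and flags months that fall outside the three-sigma control limits. The data and limits are assumptions for illustration, not LANL figures.

```python
# Minimal c-chart sketch for monthly counts of glovebox glove breaches/failures.
# The counts are hypothetical, not LANL data; limits are the usual 3-sigma bounds.
import math

monthly_breaches = [4, 3, 5, 2, 6, 4, 3, 7, 2, 4, 11, 3]   # assumed counts

c_bar = sum(monthly_breaches) / len(monthly_breaches)       # centre line
ucl = c_bar + 3.0 * math.sqrt(c_bar)                        # upper control limit
lcl = max(c_bar - 3.0 * math.sqrt(c_bar), 0.0)              # lower control limit (>= 0)

print(f"centre line = {c_bar:.2f}, LCL = {lcl:.2f}, UCL = {ucl:.2f}")
for month, count in enumerate(monthly_breaches, start=1):
    status = "OUT OF CONTROL" if (count > ucl or count < lcl) else "in control"
    print(f"month {month:2d}: {count:2d} breaches -> {status}")
```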

  13. Domestic wells have high probability of pumping septic tank leachate

    Science.gov (United States)

    Bremer, J. E.; Harter, T.

    2012-08-01

    Onsite wastewater treatment systems are common in rural and semi-rural areas around the world; in the US, about 25-30% of households are served by a septic (onsite) wastewater treatment system, and many property owners also operate their own domestic well nearby. Site-specific conditions and local groundwater flow are often ignored when installing septic systems and wells. In areas with small lots (thus high spatial septic system densities), shallow domestic wells are prone to contamination by septic system leachate. Mass balance approaches have been used to determine a maximum septic system density that would prevent contamination of groundwater resources. In this study, a source area model based on detailed groundwater flow and transport modeling is applied for a stochastic analysis of domestic well contamination by septic leachate. Specifically, we determine the probability that a source area overlaps with a septic system drainfield as a function of aquifer properties, septic system density and drainfield size. We show that high spatial septic system density poses a high probability of pumping septic system leachate. The hydraulic conductivity of the aquifer has a strong influence on the intersection probability. We find that mass balance calculations applied on a regional scale underestimate the contamination risk of individual drinking water wells by septic systems. This is particularly relevant for contaminants released at high concentrations, for substances that experience limited attenuation, and those that are harmful even at low concentrations (e.g., pathogens).
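
    The core stochastic question (how likely is a well's source area to overlap at least one drainfield at a given septic-system density?) can be sketched with a simple Poisson and Monte Carlo calculation. The sketch below is a hedged toy version with assumed densities and source-area size; it is not the authors' groundwater flow and transport model, which accounts for aquifer properties and local flow.

```python
# Toy sketch of the overlap question (not the authors' flow-and-transport model):
# for randomly placed drainfields at areal density rho, the probability that a
# well's source area of size A contains at least one is about 1 - exp(-rho * A).
# A tiny Monte Carlo with a square source area checks the approximation.
import math
import random

random.seed(0)

def p_overlap_analytic(rho_per_km2, area_km2):
    return 1.0 - math.exp(-rho_per_km2 * area_km2)

def p_overlap_mc(rho_per_km2, area_km2, domain_km=4.0, trials=2000):
    n_fields = int(rho_per_km2 * domain_km ** 2)
    half = math.sqrt(area_km2) / 2.0          # square source area at domain centre
    cx = cy = domain_km / 2.0
    hits = 0
    for _ in range(trials):
        for _ in range(n_fields):
            x, y = random.uniform(0, domain_km), random.uniform(0, domain_km)
            if abs(x - cx) <= half and abs(y - cy) <= half:
                hits += 1
                break
    return hits / trials

source_area = 0.05                             # km^2, assumed well source area
for rho in (2, 10, 40):                        # drainfields per km^2 (lot-size proxy)
    print(f"rho = {rho:2d}/km^2: analytic {p_overlap_analytic(rho, source_area):.2f}, "
          f"Monte Carlo {p_overlap_mc(rho, source_area):.2f}")
```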

  14. Domestic wells have high probability of pumping septic tank leachate

    Directory of Open Access Journals (Sweden)

    J. E. Bremer

    2012-08-01

    Full Text Available Onsite wastewater treatment systems are common in rural and semi-rural areas around the world; in the US, about 25–30% of households are served by a septic (onsite) wastewater treatment system, and many property owners also operate their own domestic well nearby. Site-specific conditions and local groundwater flow are often ignored when installing septic systems and wells. In areas with small lots (thus high spatial septic system densities), shallow domestic wells are prone to contamination by septic system leachate. Mass balance approaches have been used to determine a maximum septic system density that would prevent contamination of groundwater resources. In this study, a source area model based on detailed groundwater flow and transport modeling is applied for a stochastic analysis of domestic well contamination by septic leachate. Specifically, we determine the probability that a source area overlaps with a septic system drainfield as a function of aquifer properties, septic system density and drainfield size. We show that high spatial septic system density poses a high probability of pumping septic system leachate. The hydraulic conductivity of the aquifer has a strong influence on the intersection probability. We find that mass balance calculations applied on a regional scale underestimate the contamination risk of individual drinking water wells by septic systems. This is particularly relevant for contaminants released at high concentrations, for substances that experience limited attenuation, and those that are harmful even at low concentrations (e.g., pathogens).

  15. Perceived Control and Psychological Contract Breach as Explanations of the Relationships Between Job Insecurity, Job Strain and Coping Reactions: Towards a Theoretical Integration.

    Science.gov (United States)

    Vander Elst, Tinne; De Cuyper, Nele; Baillien, Elfi; Niesen, Wendy; De Witte, Hans

    2016-04-01

    This study aims to further knowledge on the mechanisms through which job insecurity is related to negative outcomes. Based on appraisal theory, two explanations-perceived control and psychological contract breach-were theoretically integrated in a comprehensive model and simultaneously examined as mediators of the job insecurity-outcome relationship. Different categories of outcomes were considered, namely work-related (i.e. vigour and need for recovery) and general strain (i.e. mental and physical health complaints), as well as psychological (i.e. job satisfaction and organizational commitment) and behavioural coping reactions (i.e. self-rated performance and innovative work behaviour). The hypotheses were tested using data of a heterogeneous sample of 2413 Flemish employees by means of both single and multiple mediator structural equation modelling analyses (bootstrapping method). Particularly, psychological contract breach accounted for the relationship between job insecurity and strain. Both perceived control and psychological contract breach mediated the relationships between job insecurity and psychological coping reactions, although the indirect effects were larger for psychological contract breach. Finally, perceived control was more important than psychological contract breach in mediating the relationships between job insecurity and behavioural coping reactions. This study meets previous calls for a theoretical integration regarding mediators of the job insecurity-outcome relationship. Copyright © 2014 John Wiley & Sons, Ltd.

  16. Identifying Changes in the Probability of High Temperature, High Humidity Heat Wave Events

    Science.gov (United States)

    Ballard, T.; Diffenbaugh, N. S.

    2016-12-01

    Understanding how heat waves will respond to climate change is critical for adequate planning and adaptation. While temperature is the primary determinant of heat wave severity, humidity has been shown to play a key role in heat wave intensity with direct links to human health and safety. Here we investigate the individual contributions of temperature and specific humidity to extreme heat wave conditions in recent decades. Using global NCEP-DOE Reanalysis II daily data, we identify regional variability in the joint probability distribution of humidity and temperature. We also identify a statistically significant positive trend in humidity over the eastern U.S. during heat wave events, leading to an increased probability of high humidity, high temperature events. The extent to which we can expect this trend to continue under climate change is complicated due to variability between CMIP5 models, in particular among projections of humidity. However, our results support the notion that heat wave dynamics are characterized by more than high temperatures alone, and understanding and quantifying the various components of the heat wave system is crucial for forecasting future impacts.
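
    The joint-probability idea can be illustrated with a small calculation on synthetic data: estimate how often daily temperature and specific humidity both exceed their 95th percentiles, and compare that with what independence would predict. All numbers below are synthetic assumptions, not the NCEP-DOE reanalysis values used in the study.

```python
# Synthetic-data sketch of the joint-exceedance idea: how often do daily temperature
# and specific humidity both exceed their 95th percentiles, compared with the
# 0.05 * 0.05 expected under independence? Numbers are invented, not reanalysis data.
import random

random.seed(42)

def percentile(values, q):
    ordered = sorted(values)
    return ordered[min(int(q * len(ordered)), len(ordered) - 1)]

n_days = 10_000
temp = [random.gauss(0.0, 1.0) for _ in range(n_days)]
humid = [0.6 * t + 0.8 * random.gauss(0.0, 1.0) for t in temp]   # humidity tracks temp

t95, q95 = percentile(temp, 0.95), percentile(humid, 0.95)
joint = sum(1 for t, q in zip(temp, humid) if t > t95 and q > q95) / n_days
print(f"P(T > p95 and q > p95) = {joint:.3f}  vs  {0.05 * 0.05:.4f} if independent")
```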

  17. Psychological contract breach and work performance: Is social exchange a buffer or an intensifier?

    NARCIS (Netherlands)

    Bal, P.M.; Chiaburu, D.S.; Jansen, P.G.W.

    2010-01-01

    Purpose: The aim of this paper is to investigate how social exchanges modify the relationship between psychological contract breach and work performance. It aims to present two concurrent hypotheses, based on theoretical interaction effects of social exchanges (conceptualized as social exchange

  18. Pinhole Breaches in Spent Fuel Containers: Improvements to Conservative Models of Aerosol Release and Plugging

    International Nuclear Information System (INIS)

    Casella, Andrew M.; Loyalka, Sudarsham K.; Hanson, Brady D.

    2007-01-01

    By taking the differential forms of transport and equations of state, the equations describing aerosol transport in pinhole breaches can be solved directly using continuous models. The results are compared with discrete models.

  19. A guide to California's breaches. First year of state reporting requirement reveals common privacy violations.

    Science.gov (United States)

    Dimick, Chris

    2010-04-01

    Effective January 1, 2009, California healthcare providers were required to report every breach of patient information to the state. They have sent a flood of mishaps and a steady stream of malicious acts.

  20. A Depth-Averaged 2-D Simulation for Coastal Barrier Breaching Processes

    Science.gov (United States)

    2011-05-01

    The model includes bed change and variable flow density in the flow continuity and momentum equations and adopts the HLL approximate Riemann solver to handle the mixed-regime flows near the breach. Breach flow is estimated using the Keulegan equation or the Bernoulli equation, and the breach morphological change is determined using simplified sediment transport models.
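
    To make the simplified breach-flow closure concrete, the sketch below applies a Bernoulli-type (submerged broad-crested weir) estimate of discharge through a rectangular breach for an assumed storm scenario. The geometry, coefficient and water levels are illustrative assumptions, not values from the report, and the sketch ignores the morphological feedback the full model resolves.

```python
# Hedged sketch of a simplified breach-flow closure (a Bernoulli/weir-type estimate),
# not the report's depth-averaged 2-D model: discharge through a rectangular breach
# driven by the ocean-bay head difference. All geometry and levels are assumptions.
import math

G = 9.81  # gravitational acceleration, m/s^2

def breach_discharge(width_m, ocean_m, bay_m, crest_m, cd=0.6):
    """Submerged-weir style estimate: Q = Cd * A * sqrt(2 * g * head)."""
    head = ocean_m - bay_m
    depth_over_crest = max(ocean_m - crest_m, 0.0)
    if head <= 0.0 or depth_over_crest == 0.0:
        return 0.0
    return cd * width_m * depth_over_crest * math.sqrt(2.0 * G * head)

# Assumed storm scenario: 80 m wide breach, crest at +0.5 m, ocean surge at +2.0 m.
for bay_level in (0.5, 1.0, 1.5, 1.9):
    q = breach_discharge(width_m=80.0, ocean_m=2.0, bay_m=bay_level, crest_m=0.5)
    print(f"bay level {bay_level:.1f} m -> breach discharge ~ {q:7.1f} m^3/s")
```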

  1. Breach of information duties in the B2C e-commerce: adequacy of available remedies

    Directory of Open Access Journals (Sweden)

    Zofia Bednarz

    2016-07-01

    Full Text Available

    B2C e-commerce is characterised by information asymmetry between the contracting parties. Various information duties are imposed on traders, both at the European and national levels, to correct this asymmetry and to ensure proper market functioning. The mandated disclosure is based on the assumption of consumers' rationality; however, developments in behavioural economics challenge this assumption. The utility of mandated disclosure in consumer contracts also depends on the remedies available to consumers in the event of a breach of information duties. Those remedies are often heavily influenced by the national general private law applicable to the contractual relationship between the parties. Nevertheless, since the economics of general contract law differ importantly from the principles of consumer e-commerce, various problems can be associated with the application of general law remedies to breaches of information duties in B2C contracts. The limited value of the majority of online B2C transactions is incompatible with costly and lengthy court proceedings. Moreover, a breach of information duties will often not produce enough material damage on the side of the consumer to make the remedies available. Different solutions are explored, from ADR, to the duty to advise, to non-legal mechanisms that make information easier for consumers to use through limiting disclosure. Finally, the right of withdrawal is analysed as an example of a specific remedy adapted to the economics of B2C electronic transactions, where the aims parties pursue through contracts differ from those in commercial contracts and their relationship is marked by inequality of economic power and information asymmetry. However, the legally established cooling-off period is not free from limitations, and only a combination of various measures, including effective

  2. Quantitative and Qualitative Job Insecurity and Idea Generation: The Mediating Role of Psychological Contract Breach

    Directory of Open Access Journals (Sweden)

    Wendy Niesen

    2018-03-01

    Full Text Available This study investigates how quantitative and qualitative job insecurity relate to idea generation, a dimension of innovative work behaviour. We hypothesise that both types of job insecurity relate negatively to this type of innovative behaviour, and expect a stronger association between quantitative job insecurity and idea generation. Moreover, we argue that psychological contract breach mediates (‘explains’) these negative relationships. The hypotheses were tested in a sample of 1420 supervisors from a large Belgian organisation, using hierarchical regression analyses, bootstrapping analyses, and relative weight analysis. The results showed that both types of job insecurity are negatively associated with idea generation. Contrary to our expectations, the relationship between both forms of job insecurity and idea generation was equally strong. Psychological contract breach was found to mediate these relationships.

  3. Recommendations for a barrier island breach management plan for Fire Island National Seashore, including the Otis Pike High Dune Wilderness Area, Long Island, New York

    Science.gov (United States)

    Williams, S. Jeffress; Foley, Mary K.

    2007-01-01

    -control stabilization of the headlands such as the Montauk Point headlands, and deepening of navigation channels by dredging through the tidal inlets and in the bays. Indirect impacts that have a bearing on decisions to deal with breaching are: high-risk development of the barrier islands and low-lying areas of the mainland vulnerable to flooding, and the dredging of nearshore sand shoals for beach nourishment. The NPS strives to employ a coastal management framework for decision making that is based on assessment of the physical and ecological properties of the shoreline as well as human welfare and property. In order to protect developed areas of Fire Island and the mainland from loss of life, flooding, and other economic and physical damage, the NPS will likely need to consider allowing artificial closure of some breaches within the FIIS under certain circumstances. The decision by the NPS to allow breaches to evolve naturally and possibly close or to allow artificially closing breaches is based on four criteria: 1. Volumes of sediment transported landward and exchange of water and nutrients;

  4. No harm done? Assessing risk of harm under the federal breach notification rule.

    Science.gov (United States)

    Dimick, Chris

    2010-08-01

    Provisions within the HITECH Act require that covered entities notify individuals if their protected health information is breached. However, the current regulation allows an exemption if the risk of harm is slight. Assessing risk can be subjective, and privacy officers have been working to create methods to conduct and document their analyses.

  5. Increasing Classroom Compliance: Using a High-Probability Command Sequence with Noncompliant Students

    Science.gov (United States)

    Axelrod, Michael I.; Zank, Amber J.

    2012-01-01

    Noncompliance is one of the most problematic behaviors within the school setting. One strategy to increase compliance of noncompliant students is a high-probability command sequence (HPCS; i.e., a set of simple commands in which an individual is likely to comply immediately prior to the delivery of a command that has a lower probability of…

  6. Market Reactions to Publicly Announced Privacy and Security Breaches Suffered by Companies Listed on the United States Stock Exchanges: A Comparative Empirical Investigation

    Science.gov (United States)

    Coronado, Adolfo S.

    2012-01-01

    Using a sample of security and privacy breaches, the present research examines the comparative announcement impact between the two types of events. The first part of the dissertation analyzes the impact of publicly announced security and privacy breaches; abnormal stock returns, the change in firm risk, and abnormal trading volume are measured.…

  7. Multidetector computed tomographic pulmonary angiography in patients with a high clinical probability of pulmonary embolism.

    Science.gov (United States)

    Moores, L; Kline, J; Portillo, A K; Resano, S; Vicente, A; Arrieta, P; Corres, J; Tapson, V; Yusen, R D; Jiménez, D

    2016-01-01

    ESSENTIALS: When the clinical probability of pulmonary embolism (PE) is high, the sensitivity of computed tomography (CT) is unclear. We investigated the sensitivity of multidetector CT among 134 patients with a high probability of PE. A normal CT alone may not safely exclude PE in patients with a high clinical pretest probability. In patients with no clear alternative diagnosis after CTPA, further testing should be strongly considered. Whether patients with a negative multidetector computed tomographic pulmonary angiography (CTPA) result and a high clinical pretest probability of pulmonary embolism (PE) should be further investigated is controversial. This was a prospective investigation of the sensitivity of multidetector CTPA among patients with a priori clinical assessment of a high probability of PE according to the Wells criteria. Among patients with a negative CTPA result, the diagnosis of PE required at least one of the following conditions: ventilation/perfusion lung scan showing a high probability of PE in a patient with no history of PE, abnormal findings on venous ultrasonography in a patient without previous deep vein thrombosis at that site, or the occurrence of venous thromboembolism (VTE) in a 3-month follow-up period after anticoagulation was withheld because of a negative multidetector CTPA result. We identified 498 patients with a priori clinical assessment of a high probability of PE and a completed CTPA study. CTPA excluded PE in 134 patients; in these patients, the pooled incidence of VTE was 5.2% (seven of 134 patients; 95% confidence interval [CI] 1.5-9.0). Five patients had VTEs that were confirmed by an additional imaging test despite a negative CTPA result (five of 48 patients; 10.4%; 95% CI 1.8-19.1), and two patients had objectively confirmed VTEs that occurred during clinical follow-up of at least 3 months (two of 86 patients; 2.3%; 95% CI 0-5.5). None of the patients had a fatal PE during follow-up. A normal multidetector CTPA result alone may not safely
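
    The pooled incidence figures quoted above can be reproduced with a normal (Wald) approximation for a binomial proportion, which appears to be how the reported 95% confidence intervals were obtained; the sketch below is a quick check of that arithmetic rather than part of the study.

```python
# Quick check of the pooled incidence figures quoted above using a normal (Wald)
# approximation for a binomial proportion; lower bounds are floored at zero.
import math

def wald_ci(events, n, z=1.96):
    p = events / n
    half = z * math.sqrt(p * (1.0 - p) / n)
    return p, max(p - half, 0.0), p + half

cases = [("overall VTE", 7, 134),
         ("additional imaging", 5, 48),
         ("3-month follow-up", 2, 86)]
for label, events, n in cases:
    p, lo, hi = wald_ci(events, n)
    print(f"{label:20s}: {events}/{n} = {100 * p:4.1f}% (95% CI {100 * lo:.1f}-{100 * hi:.1f})")
```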

  8. The Changing Contours of the Psychological Contract: Unpacking Context and Circumstances of Breach

    Science.gov (United States)

    Pate, Judy

    2006-01-01

    Purpose: The purpose of this paper is to propose a processual framework of psychological contract breach, which maps holistically the interactions among concepts drawn from the trust and justice literature. However, the price of a holistic picture is frequently a lack of depth of analysis of any single variable, and consequently the second part of…

  9. Medical negligence based on bad faith, breach of contract, or mental anguish.

    Science.gov (United States)

    Ficarra, B J

    1980-01-01

    Financial recovery owing to breach of contract is restricted to the pecuniary amount lost because of failure to perform on the stipulated contract. With the acquisition of newer knowledge, attorneys are now utilizing the weapon of contractual failure as applied to medical negligence. The impetus to this new weapon for the plaintiff has accrued because of the favorable verdicts rendered from positive decisions based upon bad faith.

  10. Once more unto the breach managing information security in an uncertain world

    CERN Document Server

    Simmons, Andrea C

    2012-01-01

    In Once more unto the Breach, Andrea C Simmons speaks directly to information security managers and provides an insider's view of the role, offering priceless gems from her extensive experience and knowledge. Based on a typical year in the life of an information security manager, the book examines how the general principles can be applied to all situations and discusses the lessons learnt from a real project.

  11. High But Not Low Probability of Gain Elicits a Positive Feeling Leading to the Framing Effect

    Science.gov (United States)

    Gosling, Corentin J.; Moutier, Sylvain

    2017-01-01

    Human risky decision-making is known to be highly susceptible to profit-motivated responses elicited by the way in which options are framed. In fact, studies investigating the framing effect have shown that the choice between sure and risky options depends on how these options are presented. Interestingly, the probability of gain of the risky option has been highlighted as one of the main factors causing variations in susceptibility to the framing effect. However, while it has been shown that high probabilities of gain of the risky option systematically lead to framing bias, questions remain about the influence of low probabilities of gain. Therefore, the first aim of this paper was to clarify the respective roles of high and low probabilities of gain in the framing effect. Due to the difference between studies using a within- or between-subjects design, we conducted a first study investigating the respective roles of these designs. For both designs, we showed that trials with a high probability of gain led to the framing effect whereas those with a low probability did not. Second, as emotions are known to play a key role in the framing effect, we sought to determine whether they are responsible for such a debiasing effect of the low probability of gain. Our second study thus investigated the relationship between emotion and the framing effect depending on high and low probabilities. Our results revealed that positive emotion was related to risk-seeking in the loss frame, but only for trials with a high probability of gain. Taken together, these results support the interpretation that low probabilities of gain suppress the framing effect because they prevent the positive emotion of gain anticipation. PMID:28232808

  12. High But Not Low Probability of Gain Elicits a Positive Feeling Leading to the Framing Effect.

    Science.gov (United States)

    Gosling, Corentin J; Moutier, Sylvain

    2017-01-01

    Human risky decision-making is known to be highly susceptible to profit-motivated responses elicited by the way in which options are framed. In fact, studies investigating the framing effect have shown that the choice between sure and risky options depends on how these options are presented. Interestingly, the probability of gain of the risky option has been highlighted as one of the main factors causing variations in susceptibility to the framing effect. However, while it has been shown that high probabilities of gain of the risky option systematically lead to framing bias, questions remain about the influence of low probabilities of gain. Therefore, the first aim of this paper was to clarify the respective roles of high and low probabilities of gain in the framing effect. Due to the difference between studies using a within- or between-subjects design, we conducted a first study investigating the respective roles of these designs. For both designs, we showed that trials with a high probability of gain led to the framing effect whereas those with a low probability did not. Second, as emotions are known to play a key role in the framing effect, we sought to determine whether they are responsible for such a debiasing effect of the low probability of gain. Our second study thus investigated the relationship between emotion and the framing effect depending on high and low probabilities. Our results revealed that positive emotion was related to risk-seeking in the loss frame, but only for trials with a high probability of gain. Taken together, these results support the interpretation that low probabilities of gain suppress the framing effect because they prevent the positive emotion of gain anticipation.

  13. Use of an influence diagram and fuzzy probability for evaluating accident management in a boiling water reactor

    International Nuclear Information System (INIS)

    Yu, D.; Kastenberg, W.E.; Okrent, D.

    1994-01-01

    A new approach is presented for evaluating the uncertainties inherent in severe accident management strategies. First, this analysis considers accident management as a decision problem (i.e., applying a strategy versus doing nothing) and uses an influence diagram. To evaluate imprecise node probabilities in the influence diagram, the analysis introduces the concept of a fuzzy probability. When fuzzy logic is applied, fuzzy probabilities are easily propagated to obtain results. In addition, the results obtained provide not only information similar to the classical approach, which uses point-estimate values, but also additional information regarding the impact of using imprecise input data. As an illustrative example, the proposed methodology is applied to the evaluation of the drywell flooding strategy for a long-term station blackout sequence at the Peach Bottom nuclear power plant. The results show that the drywell flooding strategy is beneficial for preventing reactor vessel breach. It is also effective for reducing the probability of containment failure for both liner melt-through and late overpressurization. Even though uncertainty exists in the results, flooding is preferred to do nothing when evaluated in terms of two risk measures: early and late fatalities
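
    A generic illustration of how fuzzy probabilities can be propagated is sketched below: two triangular fuzzy probabilities are combined with interval arithmetic at a few alpha-cuts to bound a product such as P(vessel breach). The node names and numbers are invented for illustration and are not taken from the Peach Bottom analysis.

```python
# Generic illustration of fuzzy-probability propagation (invented numbers, not the
# Peach Bottom analysis): triangular fuzzy probabilities combined with interval
# arithmetic at a few alpha-cuts to bound a product of two node probabilities.

def tri_alpha_cut(low, mode, high, alpha):
    """Interval [lo, hi] of a triangular fuzzy number at membership level alpha."""
    return (low + alpha * (mode - low), high - alpha * (high - mode))

def interval_mul(a, b):
    products = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(products), max(products))

# Assumed fuzzy inputs: P(power not recovered) and P(vessel breach | no power).
p_no_power = (0.2, 0.3, 0.5)
p_breach_given_no_power = (0.4, 0.6, 0.7)

for alpha in (0.0, 0.5, 1.0):
    a = tri_alpha_cut(*p_no_power, alpha)
    b = tri_alpha_cut(*p_breach_given_no_power, alpha)
    lo, hi = interval_mul(a, b)
    print(f"alpha = {alpha:.1f}: P(vessel breach) in [{lo:.3f}, {hi:.3f}]")
```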

  14. Estimates of mean consequences and confidence bounds on the mean associated with low-probability seismic events in total system performance assessments

    International Nuclear Information System (INIS)

    Pensado, Osvaldo; Mancillas, James

    2007-01-01

    An approach is described to estimate mean consequences and confidence bounds on the mean of seismic events with low probability of breaching components of the engineered barrier system. The approach is aimed at complementing total system performance assessment models used to understand consequences of scenarios leading to radionuclide releases in geologic nuclear waste repository systems. The objective is to develop an efficient approach to estimate mean consequences associated with seismic events of low probability, employing data from a performance assessment model with a modest number of Monte Carlo realizations. The derived equations and formulas were tested with results from a specific performance assessment model. The derived equations appear to be one method to estimate mean consequences without having to use a large number of realizations. (authors)
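
    The general bookkeeping behind such an estimate can be sketched as follows: with a modest number of realizations of the conditional consequence given the seismic event, the unconditional mean is the event probability times the conditional sample mean, and a confidence bound on the mean follows from the sample standard error. The numbers below are assumptions for illustration, not the derivation or data in the paper.

```python
# Hedged sketch of the bookkeeping (not the paper's derivation): the unconditional
# mean consequence of a rare seismic breach is p_event * E[C | event], and a bound
# on the mean follows from the standard error of the conditional sample mean.
import math
import random

random.seed(7)

p_event = 1.0e-4                          # assumed probability of the seismic event
# Assumed conditional consequences from a modest number of Monte Carlo realizations:
conditional_c = [random.lognormvariate(0.0, 1.0) for _ in range(200)]

n = len(conditional_c)
mean_c = sum(conditional_c) / n
var_c = sum((c - mean_c) ** 2 for c in conditional_c) / (n - 1)
sem = math.sqrt(var_c / n)                # standard error of the conditional mean

print(f"conditional mean consequence = {mean_c:.3f} +/- {1.96 * sem:.3f} (95%)")
print(f"unconditional mean ~ {p_event * mean_c:.3e}, "
      f"95% upper bound ~ {p_event * (mean_c + 1.96 * sem):.3e}")
```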

  15. The probability distribution of intergranular stress corrosion cracking life for sensitized 304 stainless steels in high temperature, high purity water

    International Nuclear Information System (INIS)

    Akashi, Masatsune; Kenjyo, Takao; Matsukura, Shinji; Kawamoto, Teruaki

    1984-01-01

    In order to discuss the probability distribution of intergranular stress corrosion cracking life for sensitized 304 stainless steels, a series of creviced bent beam (CBB) and uni-axial constant load tests were carried out in oxygenated high-temperature, high-purity water. The following conclusions were obtained: (1) The initiation process of intergranular stress corrosion cracking has been assumed to be approximated by the Poisson stochastic process, based on the CBB test results. (2) The probability distribution of intergranular stress corrosion cracking life may consequently be approximated by the exponential probability distribution. (3) The experimental data could be fitted to the exponential probability distribution. (author)
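
    The distributional argument (Poisson initiation implies exponentially distributed cracking life) can be illustrated with simulated lifetimes and a maximum-likelihood fit of the exponential rate, as in the hedged sketch below; the rate and sample size are assumed, not the paper's test data.

```python
# Illustration of the distributional argument: Poisson-process initiation implies an
# exponential life distribution, whose rate is estimated by 1 / (mean observed life).
# The rate and sample size below are assumptions, not the paper's test data.
import math
import random

random.seed(3)

true_rate = 1.0 / 500.0                                        # assumed mean life 500 h
lives_h = [random.expovariate(true_rate) for _ in range(40)]   # simulated SCC lives

lam_hat = 1.0 / (sum(lives_h) / len(lives_h))                  # exponential MLE
print(f"estimated rate = {lam_hat:.2e} /h  (true rate {true_rate:.2e} /h)")
for t in (100.0, 500.0, 1000.0):
    print(f"P(SCC life <= {t:6.0f} h) = {1.0 - math.exp(-lam_hat * t):.3f}")
```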

  16. Use of an influence diagram and fuzzy probability for evaluating accident management in a BWR

    International Nuclear Information System (INIS)

    Yu, Donghan; Okrent, D.; Kastenberg, W.E.

    1993-01-01

    This paper develops a new approach for evaluating severe accident management strategies. First, this approach considers accident management as a decision problem (i.e., "applying a strategy" vs. "do nothing") and uses influence diagrams. This approach introduces the concept of a "fuzzy probability" in the evaluation of an influence diagram. When fuzzy logic is applied, fuzzy probabilities in an influence diagram can be easily propagated to obtain results. In addition, the results obtained provide not only information similar to the classical approach using point-estimate values, but also additional information regarding the impact from imprecise input data. The proposed methodology is applied to the evaluation of the drywell flooding strategy for a long-term station blackout sequence in the Peach Bottom nuclear power plant. The results show that the drywell flooding strategy seems to be beneficial for preventing reactor vessel breach. It is also effective for reducing the probability of the containment failure for both liner melt-through and late overpressurization. Even though there exists uncertainty in the results, "flooding" is preferred to "do nothing" when evaluated in terms of expected consequences, i.e., early and late fatalities

  17. The Effect of Perceived Privacy Breaches on Continued Technology Use and Individual Psychology: The Construct, Instrument Development, and an Application Using Internet Search Engines

    Science.gov (United States)

    Ahmad, Altaf

    2010-01-01

    This dissertation involved the development of a new construct, perceived privacy breach (PPB), to evaluate how a person perceives breaches of privacy in terms of whether they perceive any exchange of information was fair or not and how they believe it will impact people whose information has been shared. This instrument assists researchers to…

  18. Corrosion of breached UF6 storage cylinders

    International Nuclear Information System (INIS)

    Barber, E.J.; Taylor, M.S.; DeVan, J.H.

    1993-01-01

    This paper describes the corrosion processes that occurred following the mechanical failure of two steel 14-ton storage cylinders containing depleted UF 6 . The failures both were traced to small mechanical tears that occurred during stacking of the cylinders. Although subsequent corrosion processes greatly extended the openings in the wall, the reaction products formed were quite protective and prevented any significant environmental insult or loss of uranium. The relative sizes of the two holes correlated with the relative exposure times that had elapsed from the time of stacking. From the sizes and geometries of the two holes, together with analyses of the reaction products, it was possible to determine the chemical reactions that controlled the corrosion process and to develop a scenario for predicting the rate of hydrolysis of UF 6 , the loss rate of HF, and chemical attack of a breached UF 6 storage cylinder

  19. Semantic-less Breach Detection of Polymorphic Malware in Federated Cloud

    Directory of Open Access Journals (Sweden)

    Yahav Biran

    2017-06-01

    Full Text Available Cloud computing is one of the largest emerging utility services that is expected to grow enormously over the next decade. Many organizations are moving into hybrid cloud/hosted computing models. Relying on a single cloud service provider introduces cost and environmental challenges. Also, a multi-cloud solution implemented by the cloud tenant is suboptimal, as it requires expensive adaptation costs. Cloud Federation is a useful structure for aggregating cloud-based services under a single umbrella to share resources and responsibilities for the benefit of the member cloud service providers. An efficient security model is crucial for successful cloud business. However, with the advent of large-scale and multi-tenant environments, the traditional perimeter boundaries along with traditional security practices are changing. Defining and securing asset and enclave boundaries is more challenging, and system perimeter boundaries are more susceptible to breach. This paper describes security best practices for Cloud Federation. The paper also describes a tool and technique for detecting anomalous behavior in resource usage across the federation participants. This is a particularly serious issue because of the possibility of an attacker potentially gaining access to more than one CSP federation member. Specifically, this technique is developed for Cloud Federations since they have to deal with heterogeneous multi-platform environments with a diverse mixture of data and security log schema, and it has to do this in real time. A semantic-less breach detection system that implements a self-learning approach was prototyped and achieved up to an 87% true-positive rate with a 93% true-negative rate.

  20. Probability in High Dimension

    Science.gov (United States)

    2014-06-30

    [Abstract text not recoverable; the record contains only fragmented bibliographic references to Saint-Flour lecture notes on probability theory, statistics and Gaussian processes (Lecture Notes in Mathematics, Springer).]

  1. High temperature triggers latent variation among individuals: oviposition rate and probability for outbreaks.

    Directory of Open Access Journals (Sweden)

    Christer Björkman

    2011-01-01

    Full Text Available It is anticipated that extreme population events, such as extinctions and outbreaks, will become more frequent as a consequence of climate change. To evaluate the increased probability of such events, it is crucial to understand the mechanisms involved. Variation between individuals in their response to climatic factors is an important consideration, especially if microevolution is expected to change the composition of populations.Here we present data of a willow leaf beetle species, showing high variation among individuals in oviposition rate at a high temperature (20 °C). It is particularly noteworthy that not all individuals responded to changes in temperature; individuals laying few eggs at 20 °C continued to do so when transferred to 12 °C, whereas individuals that laid many eggs at 20 °C reduced their oviposition and laid the same number of eggs as the others when transferred to 12 °C. When transferred back to 20 °C most individuals reverted to their original oviposition rate. Thus, high variation among individuals was only observed at the higher temperature. Using a simple population model and based on regional climate change scenarios we show that the probability of outbreaks increases if there is a realistic increase in the number of warm summers. The probability of outbreaks also increased with increasing heritability of the ability to respond to increased temperature.If climate becomes warmer and there is latent variation among individuals in their temperature response, the probability for outbreaks may increase. However, the likelihood for microevolution to play a role may be low. This conclusion is based on the fact that it has been difficult to show that microevolution affect the probability for extinctions. Our results highlight the urge for cautiousness when predicting the future concerning probabilities for extreme population events.
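
    A toy version of the kind of population model and climate scenario analysis described above is sketched below: warm summers occur with a given probability and boost growth, and the outbreak probability is estimated by Monte Carlo. All parameters are invented for illustration and are not the paper's values.

```python
# Toy outbreak model with invented parameters (not the paper's): warm summers occur
# with probability p_warm and boost population growth; the outbreak probability is
# the Monte Carlo chance that the population exceeds a threshold within 20 years.
import random

random.seed(11)

def outbreak_probability(p_warm, years=20, trials=5000,
                         r_cool=0.9, r_warm=1.6, threshold=100.0):
    outbreaks = 0
    for _ in range(trials):
        n = 1.0                                       # relative population size
        for _ in range(years):
            growth = r_warm if random.random() < p_warm else r_cool
            n *= growth * random.uniform(0.8, 1.2)    # environmental noise
            if n >= threshold:
                outbreaks += 1
                break
    return outbreaks / trials

for p_warm in (0.1, 0.3, 0.5):
    print(f"P(warm summer) = {p_warm:.1f} -> "
          f"P(outbreak within 20 yr) ~ {outbreak_probability(p_warm):.2f}")
```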

  2. Security breaches: tips for assessing and limiting your risks.

    Science.gov (United States)

    Coons, Leeanne R

    2011-01-01

    As part of their compliance planning, medical practices should undergo a risk assessment to determine any vulnerability within the practice relative to security breaches. Practices should also implement safeguards to limit their risks. Such safeguards include facility access controls, information and electronic media management, use of business associate agreements, and education and enforcement. Implementation of specific policies and procedures to address security incidents is another critical step that medical practices should take as part of their security incident prevention plan. Medical practices should not only develop policies and procedures to prevent, detect, contain, and correct security violations, but should make sure that such policies and procedures are actually implemented in their everyday operations.

  3. Law society breaches competition rules over financial regulation training for conveyancers

    OpenAIRE

    Johnson, D.

    2017-01-01

    The article considers the impact of a competition law ruling against the Law Society of England and Wales. The Law Society was found to have breached UK competition law rules in relation to its provision of anti-money laundering and mortgage fraud training courses to law firms. The Law Society made it a condition of membership of its Quality Conveyancing Scheme that all law firm members must only receive this training from the Law Society. A competing provider of legal training course...

  4. Breach of Personal Security through Applicative use of Online Social Networks

    Directory of Open Access Journals (Sweden)

    Bojan Nikolovski

    2013-11-01

    Full Text Available Throughout this article there is an attempt to indicate the threats of potential to breach of personal security through applicative use of internet as well as applicative use of online social networks. In addition to many other ways of privacy protection applicative users of social network’s sites must take into considerations the risk of distributing private data. Through a series of actions and settings users can customize the security settings with the ultimate goal of reducing the risk of attack on their privacy.

  5. Thermofluid experiments for Fusion Reactor Safety. Visualization of exchange flows through breaches of a vacuum vessel in a fusion reactor under the LOVA condition

    International Nuclear Information System (INIS)

    Fujii, Sadao; Shibazaki, Hiroaki; Takase, Kazuyuki; Kunugi, Tomoaki.

    1997-01-01

    Exchange flow rates through breaches of a vacuum vessel in a fusion reactor under LOVA (Loss of VAcuum event) conditions were measured quantitatively by using a preliminary LOVA apparatus, and exchange flow patterns over the breach were visualized qualitatively with smoke. Velocity distributions in the exchange flows were predicted from the observed flow patterns by using the correlation method in the flow visualization procedures. Mean velocities calculated from the predicted velocity distributions at the outside of the breach were in good agreement with the LOVA experimental results when the exchange flow velocities were low. It was found that the present flow visualization and image processing system might be a useful procedure for evaluating the exchange flow rates. (author)

  6. State Security Breach Response Laws: State-by-State Summary Table. Using Data to Improve Education: A Legal Reference Guide to Protecting Student Privacy and Data Security

    Science.gov (United States)

    Data Quality Campaign, 2011

    2011-01-01

    Under security breach response laws, businesses--and sometimes state and governmental agencies--are required to inform individuals when the security, confidentiality or integrity of their personal information has been compromised. This resource provides a state-by-state analysis of security breach response laws. [The Data Quality Campaign has…

  7. The Effect of High Frequency Pulse on the Discharge Probability in Micro EDM

    Science.gov (United States)

    Liu, Y.; Qu, Y.; Zhang, W.; Ma, F.; Sha, Z.; Wang, Y.; Rolfe, B.; Zhang, S.

    2017-12-01

    High frequency pulses improve the machining efficiency of micro electric discharge machining (micro EDM), while also changing the micro EDM process. This paper focuses on the influence of the skin effect under high frequency pulses on energy distribution and transmission in micro EDM, and on that basis analyses the rules governing the discharge probability across the electrode end face. Starting from the electrical discharge process under high frequency pulse conditions in micro EDM, COMSOL Multiphysics software is used to establish an energy transmission model in the micro electrode. The discharge energy distribution and transmission within the tool electrode under different pulse frequencies, electrical currents, and permeabilities are studied in order to obtain the distribution patterns of current density and electric field intensity on the electrode end face as the electrical parameters change. The electric field intensity distribution is taken as the parameter governing the discharge probability at the electrode end. Finally, MATLAB is used to fit the curves and obtain the distribution of discharge probability over the electrode end face.
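
    The reason pulse frequency matters here is the skin effect: the skin depth shrinks with frequency, concentrating current near the rim of the electrode end face. The sketch below evaluates the standard skin-depth formula for an assumed copper electrode; it is not the paper's COMSOL model.

```python
# Back-of-the-envelope skin-depth calculation for an assumed copper electrode (not
# the paper's COMSOL inputs): delta = 1 / sqrt(pi * f * mu * sigma) shrinks as the
# pulse frequency rises, so current crowds toward the rim of the electrode end face.
import math

MU0 = 4.0e-7 * math.pi       # vacuum permeability, H/m
SIGMA_CU = 5.8e7             # copper conductivity, S/m (assumed electrode material)

def skin_depth_m(freq_hz, sigma=SIGMA_CU, mu_r=1.0):
    return 1.0 / math.sqrt(math.pi * freq_hz * mu_r * MU0 * sigma)

for f in (1e5, 1e6, 1e7, 1e8):            # pulse-frequency scale, Hz
    print(f"f = {f:8.0e} Hz -> skin depth ~ {skin_depth_m(f) * 1e6:7.1f} um")
```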

  8. Can Cross-Listing Mitigate the Impact of an Information Security Breach Announcement on a Firm's Values?

    Science.gov (United States)

    Chen, Yong; Dong, Feng; Chen, Hong; Xu, Li

    2016-08-01

    The increase in globalization in the markets has driven firms to adopt online technologies and to cross-list their stocks. Recent studies have consistently found that the announcements of information security breaches (ISBs) are negatively associated with the market values of the announcing firms during the days surrounding the breach announcements. Given the improvement in firms’ information environments and the better protection for investors generated by cross-listing, does cross-listing help firms to reduce the negative impacts caused by their announcements of ISBs? This paper conducts an event study of 120 publicly traded firms (among which 25 cross-list and 95 do not), in order to explore the answer. The results indicate that the impact of ISB announcements on a firm's stock prices shows no difference between cross-listing firms and non-cross-listing firms. Cross-listing does not mitigate the impact of ISBs announcement on a firm's market value.
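
    For readers unfamiliar with the event-study machinery referenced here, the sketch below shows the standard market-model calculation: estimate alpha and beta over an estimation window, then cumulate abnormal returns over the event window. The returns are synthetic and the window lengths are assumptions, not the study's sample.

```python
# Generic market-model event-study sketch with synthetic returns (not the study's
# sample): fit alpha and beta over a 120-day estimation window, then cumulate
# abnormal returns over a 5-day window around an assumed breach announcement.
import random

random.seed(5)

market = [random.gauss(0.0005, 0.01) for _ in range(125)]
firm = [0.0002 + 1.1 * m + random.gauss(0.0, 0.008) for m in market]
firm[121] -= 0.03                      # assumed price drop after the announcement

est_m, est_f = market[:120], firm[:120]
mean_m, mean_f = sum(est_m) / 120, sum(est_f) / 120
beta = (sum((m - mean_m) * (f - mean_f) for m, f in zip(est_m, est_f))
        / sum((m - mean_m) ** 2 for m in est_m))
alpha = mean_f - beta * mean_m

abnormal = [f - (alpha + beta * m) for m, f in zip(market[120:], firm[120:])]
print(f"alpha = {alpha:.5f}, beta = {beta:.2f}")
print("event-window abnormal returns:", [f"{a:+.3%}" for a in abnormal])
print(f"cumulative abnormal return (CAR) = {sum(abnormal):+.2%}")
```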

  9. Educational Malpractice: Breach of Statutory Duty and Affirmative Acts of Negligence by a School District.

    Science.gov (United States)

    Beckham, Joseph

    1979-01-01

    A cause of action for educational malpractice may well receive initial judicial recognition through successfully harmonizing allegations of breach of a statutory duty of care and acts of negligence of a type and magnitude that would distinguish a student-plaintiff's injuries from others for whose benefit the statutory duty was created. (Author)

  10. Quasi 2D hydrodynamic modelling of the flooded hinterland due to dyke breaching on the Elbe River

    Directory of Open Access Journals (Sweden)

    S. Huang

    2007-01-01

    Full Text Available In flood modeling, many combined 1D/2D and fully 2D models are used to simulate the diversion of water from rivers through dyke breaches into the hinterland during extreme flood events. However, these models are too demanding in data requirements and computational resources, which is an important consideration when uncertainty analysis using Monte Carlo techniques is used to complement the modeling exercise. The goal of this paper is to show the development of a quasi-2D modeling approach, which still calculates the dynamic wave in 1D but discretises the computational units in 2D, allowing a better spatial representation of the flow in the hinterland due to dyke breaching without a large additional expenditure on data pre-processing and computational time. A 2D representation of the flow and velocity fields is required to model sediment and micro-pollutant transport. The model DYNHYD (1D hydrodynamics) from the WASP5 modeling package was used as a basis for the simulations. The model was extended to incorporate the quasi-2D approach, and a Monte Carlo analysis was used to conduct a flood sensitivity analysis to determine the sensitivity of the resulting water flow to parameters and boundary conditions. An extreme flood event on the Elbe River, Germany, with a possible dyke breach area was used as a test case. The results show good agreement with those obtained from another 1D/2D modeling study.

  11. "I Can Only Work So Hard Before I Burn Out." A Time Sensitive Conceptual Integration of Ideological Psychological Contract Breach, Work Effort, and Burnout.

    Science.gov (United States)

    Jones, Samantha K; Griep, Yannick

    2018-01-01

    Employees often draw meaning from personal experiences and contributions in their work, particularly when engaging in organizational activities that align with their personal identity or values. However, recent empirical findings have demonstrated how meaningful work can also have a negative effect on employee's well-being as employees feel so invested in their work, they push themselves beyond their limits resulting in strain and susceptibility to burnout. We develop a framework to understand this "double edged" role of meaningful work by drawing from ideological psychological contracts (iPCs), which are characterized by employees and their employer who are working to contribute to a shared ideology or set of values. Limited iPC research has demonstrated employees may actually work harder in response to an iPC breach. In light of these counterintuitive findings, we propose the following conceptual model to theoretically connect our understanding of iPCs, perceptions of breach, increases in work effort, and the potential "dark side" of repeated occurrences of iPC breach. We argue that time plays a central role in the unfolding process of employees' reactions to iPC breach over time. Further, we propose how perceptions of iPC breach relate to strain and, eventually, burnout. This model contributes to our understanding of the role of time in iPC development and maintenance, expands our exploration of ideology in the PC literature, and provides a framework to understanding why certain occupations are more susceptible to instances of strain and burnout. This framework has the potential to guide future employment interventions in ideology-infused organizations to help mitigate negative employee outcomes.

  12. The Relationship between Psychological Contract Breach and Employee Deviance: The Moderating Role of Hostile Attributional Style

    Science.gov (United States)

    Chiu, Su-Fen; Peng, Jei-Chen

    2008-01-01

    This study investigated the main effects and the interaction effects of psychological contract breach and hostile attributional style on employee deviance (i.e., interpersonal deviance and organizational deviance). Data were collected from 233 employees and their supervisors in eight electronic companies in Taiwan. Results demonstrate that…

  13. Suitable Penalty for Breach of Contract: AFROTC Cadets. A Research Report Submitted to the Faculty.

    Science.gov (United States)

    Reese, Robert D.

    A legislative history of financial incentives in the Reserve Officer Training Corps gives perspective to an analysis of present law and policy concerning breach of contract for Air Force ROTC cadets. The changed environment, criticisms of the present law and policy, and the example of three other Western nations with all volunteer militaries are…

  14. Operator decision aid for breached fuel operation in liquid metal cooled nuclear reactors

    International Nuclear Information System (INIS)

    Gross, K.C.; Hawkins, R.E.; Nickless, W.K.

    1991-01-01

    The purpose of this paper is to report the development of an expert system that provides continuous assessment of the safety significance and technical specification conformance of Delayed Neutron (DN) signals during breached fuel operation. The completed expert system has been parallelized on an innovative distributed-memory network-computing system that enables the computationally intensive kernel of the expert system to run in parallel on a group of low-cost Unix workstations. 1 ref

  15. Pressure pulses generated by gas released from a breached fuel element

    International Nuclear Information System (INIS)

    Wu, T.S.

    1979-01-01

    In experimental measurements of liquid pressure pulses generated by rapid release of gas from breached fuel elements in a nuclear reactor, different peak pressures were observed at locations equidistant from the origin of the release. Using the model of a submerged spherical bubble with a nonstationary center, this analysis predicts not only that the peak pressure would be higher at a point in front of the advancing bubble than that at a point the same distance behind the bubble origin, but also that the pressure pulse in front of the bubble reaches its peak later than the pulse behind the origin

  16. Lessons learned from a privacy breach at an academic health science centre.

    Science.gov (United States)

    Malonda, Jacqueline; Campbell, Janice; Crivianu-Gaita, Daniela; Freedman, Melvin H; Stevens, Polly; Laxer, Ronald M

    2009-01-01

    In 2007, the Hospital for Sick Children experienced a serious privacy breach when a laptop computer containing the personal health information of approximately 3,000 patients and research subjects was stolen from a physician-researcher's vehicle. This incident was reported to the information and privacy commissioner of Ontario (IPC). The IPC issued an order that required the hospital to examine and revise its policies, practices and research protocols related to the protection of personal health information and to educate staff on privacy-related matters.

  17. Investigating the Role of Psychological Contract Breach on Career Success: Convergent Evidence from Two Longitudinal Studies

    Science.gov (United States)

    Restubog, Simon Lloyd D.; Bordia, Prashant; Bordia, Sarbari

    2011-01-01

    The current study extends past research by examining leader-member exchange as a mediator of the relationship between employee reports of psychological contract breach and career success. In addition, we tested a competing perspective in which we proposed that performance mediators (i.e., in-role performance and organizational citizenship…

  18. Remedies for Breach Under the United Nations Convention on Contracts for International Sale of Goods (CISG)

    DEFF Research Database (Denmark)

    Lookofsky, Joseph

    2011-01-01

    For every breach of a binding contract, there must be some remedy. The gap-filling remedial structure of the 1980 Vienna Sales Convention (CISG) reflects the fact that all significant forms of remedial relief may be said to fall within three basic courses of action which modern legal systems make...

  19. Severity, probability and risk of accidents during maritime transport of radioactive material. Final report of a co-ordinated research project 1995-1999

    International Nuclear Information System (INIS)

    2001-07-01

    The primary purpose of this CRP was to provide a co-ordinated international effort to assemble and evaluate relevant data using sound technical judgement concerning the effects that fires, explosions or breaches of hulls of ships might have on the integrity of radioactive material packages. The probability and expected consequences of such events could thereby be assessed. If it were shown that the proportion of maritime accidents with severity in excess of the IAEA regulatory requirements was expected to be higher than that for land transport, then pertinent proposals could be submitted to the forthcoming Revision Panels to amend the IAEA Regulations for Safe Transport of Radioactive Material and their supporting documents. Four main areas of research were included in the CRP. These consisted of studying the probability of ship accidents; fire; collision; and radiological consequences

  20. Severity, probability and risk of accidents during maritime transport of radioactive material. Final report of a co-ordinated research project 1995-1999

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-07-01

    The primary purpose of this CRP was to provide a co-ordinated international effort to assemble and evaluate relevant data using sound technical judgement concerning the effects that fires, explosions or breaches of hulls of ships might have on the integrity of radioactive material packages. The probability and expected consequences of such events could thereby be assessed. If it were shown that the proportion of maritime accidents with severity in excess of the IAEA regulatory requirements was expected to be higher than that for land transport, then pertinent proposals could be submitted to the forthcoming Revision Panels to amend the IAEA Regulations for Safe Transport of Radioactive Material and their supporting documents. Four main areas of research were included in the CRP. These consisted of studying the probability of ship accidents; fire; collision; and radiological consequences.

  1. Who breaches the four-hour emergency department wait time target? A retrospective analysis of 374,000 emergency department attendances between 2008 and 2013 at a type 1 emergency department in England.

    Science.gov (United States)

    Bobrovitz, Niklas; Lasserson, Daniel S; Briggs, Adam D M

    2017-11-02

    The four-hour target is a key hospital emergency department performance indicator in England and one that drives the physical and organisational design of the ED. Some studies have identified time of presentation as a key factor affecting waiting times, but few have investigated other determinants of breaching the four-hour target. Our objective was therefore to describe patterns of emergency department breaches of the four-hour wait time target and to identify the patients at highest risk of breaching. This was a retrospective cohort study of a large type 1 emergency department at an NHS teaching hospital in Oxford, England. We analysed anonymised individual-level patient data for 378,873 emergency department attendances, representing all attendances between April 2008 and April 2013, and examined the patient characteristics and presentation circumstances associated with the highest likelihood of breaching the four-hour wait time target. We used 374,459 complete cases for analysis. In total, 8.3% of all patients breached the four-hour wait time target. The main determinants of breaching were the hour of arrival to the ED, the day of the week, patient age, ED referral source, and the types of investigations patients received. Patients most likely to breach the target were older, presented at night, presented on Monday, received multiple types of investigation in the emergency department, and were not self-referred. We therefore identified several determinants of breaching the four-hour wait time target, including patient age, ED referral source, the types of investigations patients received, and the hour, day, and month of arrival to the ED. Efforts to reduce the number of breaches could explore late-evening/overnight staffing, access to diagnostic tests, rapid discharge facilities, and early assessment and input on diagnostic and management strategies from a senior practitioner.
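
    One plausible way to quantify how presentation circumstances relate to breach risk is a logistic regression on attendance-level records. The sketch below uses scikit-learn on synthetic data with hypothetical column names; it is not the analysis performed in the study.

```python
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import OneHotEncoder

# Synthetic stand-in for attendance-level data; column names are hypothetical.
rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "age": rng.integers(1, 95, n),
    "arrival_hour": rng.integers(0, 24, n),
    "weekday": rng.integers(0, 7, n),
    "n_investigations": rng.poisson(1.5, n),
    "self_referred": rng.integers(0, 2, n),
})
# Synthetic breach outcome loosely mimicking the reported risk factors.
logit = (-3.0 + 0.015 * df["age"] + 0.3 * (df["arrival_hour"] >= 22)
         + 0.2 * (df["weekday"] == 0) + 0.4 * df["n_investigations"]
         - 0.3 * df["self_referred"])
df["breach"] = rng.random(n) < 1 / (1 + np.exp(-logit))

# One-hot encode the cyclic/categorical predictors, pass the rest through.
pre = ColumnTransformer(
    [("cat", OneHotEncoder(drop="first"), ["arrival_hour", "weekday"])],
    remainder="passthrough",
)
model = make_pipeline(pre, LogisticRegression(max_iter=1000))
model.fit(df.drop(columns="breach"), df["breach"])
print("predicted breach probability, first 5 attendances:",
      model.predict_proba(df.drop(columns="breach"))[:5, 1].round(3))
```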

  2. Breaching the security of the Kaiser Permanente Internet patient portal: the organizational foundations of information security.

    Science.gov (United States)

    Collmann, Jeff; Cooper, Ted

    2007-01-01

    This case study describes and analyzes a breach of the confidentiality and integrity of personally identified health information (e.g. appointment details, answers to patients' questions, medical advice) for over 800 Kaiser Permanente (KP) members through KP Online, a web-enabled health care portal. The authors obtained and analyzed multiple types of qualitative data about this incident including interviews with KP staff, incident reports, root cause analyses, and media reports. Reasons at multiple levels account for the breach, including the architecture of the information system, the motivations of individual staff members, and differences among the subcultures of individual groups within as well as technical and social relations across the Kaiser IT program. None of these reasons could be classified, strictly speaking, as "security violations." This case study, thus, suggests that, to protect sensitive patient information, health care organizations should build safe organizational contexts for complex health information systems in addition to complying with good information security practice and regulations such as the Health Insurance Portability and Accountability Act (HIPAA) of 1996.

  3. Probabilistic analysis of Millstone Unit 3 ultimate containment failure probability given high pressure: Chapter 14

    International Nuclear Information System (INIS)

    Bickel, J.H.

    1983-01-01

    The quantification of the containment event trees in the Millstone Unit 3 Probabilistic Safety Study utilizes a conditional probability of failure given high pressure that is based on a new approach. This conditional probability was generated with a weakest-link failure-mode model that considered contributions from a number of overlapping failure modes. The overlap arises because several failure modes have mean failure pressures clustered within a 5 psi range, with uncertainties of between 9 and 15 psi stemming from variances in material strengths and from analytical uncertainty. Based on a review of possible probability laws for describing the failure probability of individual structural failure modes, it was determined that a Weibull probability law most adequately described the randomness in the physical process of interest. The resulting conditional probability of failure has a median failure pressure of 132.4 psia, with corresponding 5th and 95th percentile values of 112 psia and 146.7 psia, respectively. The skewed shape of the conditional failure probability versus pressure yields a lower overall containment failure probability for an appreciable number of the severe accident sequences of interest, as well as probabilities that are more rigorously traceable from first principles
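
    The quoted quantiles can be roughly reproduced by a two-parameter Weibull law whose shape and scale are solved in closed form from two of the quantiles. This is an illustrative back-of-the-envelope check, not the study's actual fitting procedure, and the resulting parameters are assumptions.

```python
import math

# Reported quantiles of the conditional failure pressure (psia)
p05, p50, p95 = 112.0, 132.4, 146.7

# For a two-parameter Weibull, the q-quantile is  p_q = scale * (-ln(1-q))**(1/shape).
# Solving with the 5th and 50th percentiles gives closed-form shape and scale.
shape = math.log(math.log(1 - 0.5) / math.log(1 - 0.05)) / math.log(p50 / p05)
scale = p50 / (-math.log(1 - 0.5)) ** (1.0 / shape)

def failure_probability(pressure_psia):
    """Conditional probability of containment failure at a given pressure."""
    return 1.0 - math.exp(-((pressure_psia / scale) ** shape))

print(f"fitted shape k = {shape:.1f}, scale = {scale:.1f} psia")
print(f"implied 95th percentile = {scale * (-math.log(0.05)) ** (1 / shape):.1f} psia "
      f"(reported: {p95} psia)")
print(f"P(failure | 120 psia) = {failure_probability(120.0):.3f}")
```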

  4. Psychological contract breach and work performance:Is social exchange a buffer or an intensifier?

    OpenAIRE

    Bal, P. Matthijs; Chiaburu, Dan S.; Jansen, Paul G. W.

    2010-01-01

    Purpose: The aim of this paper is to investigate how social exchanges modify the relationship between psychological contract breach and work performance. It aims to present two concurrent hypotheses, based on theoretical interaction effects of social exchanges (conceptualized as social exchange relationships, POS, and trust). Design/methodology/approach: Data were collected from a sample of 266 employees in a service sector company in the USA. Regression analysis was used to explore the moder...

  5. “I Can Only Work So Hard Before I Burn Out.” A Time Sensitive Conceptual Integration of Ideological Psychological Contract Breach, Work Effort, and Burnout

    Science.gov (United States)

    Jones, Samantha K.; Griep, Yannick

    2018-01-01

    Employees often draw meaning from personal experiences and contributions in their work, particularly when engaging in organizational activities that align with their personal identity or values. However, recent empirical findings have demonstrated how meaningful work can also have a negative effect on employees’ well-being: employees feel so invested in their work that they push themselves beyond their limits, resulting in strain and susceptibility to burnout. We develop a framework to understand this “double edged” role of meaningful work by drawing from ideological psychological contracts (iPCs), which are characterized by an employee and employer working to contribute to a shared ideology or set of values. The limited iPC research to date has demonstrated that employees may actually work harder in response to an iPC breach. In light of these counterintuitive findings, we propose the following conceptual model to theoretically connect our understanding of iPCs, perceptions of breach, increases in work effort, and the potential “dark side” of repeated occurrences of iPC breach. We argue that time plays a central role in the unfolding process of employees’ reactions to iPC breach. Further, we propose how perceptions of iPC breach relate to strain and, eventually, burnout. This model contributes to our understanding of the role of time in iPC development and maintenance, expands our exploration of ideology in the PC literature, and provides a framework for understanding why certain occupations are more susceptible to instances of strain and burnout. This framework has the potential to guide future employment interventions in ideology-infused organizations to help mitigate negative employee outcomes. PMID:29479334

  6. “I Can Only Work So Hard Before I Burn Out.” A Time Sensitive Conceptual Integration of Ideological Psychological Contract Breach, Work Effort, and Burnout

    Directory of Open Access Journals (Sweden)

    Samantha K. Jones

    2018-02-01

    Full Text Available Employees often draw meaning from personal experiences and contributions in their work, particularly when engaging in organizational activities that align with their personal identity or values. However, recent empirical findings have demonstrated how meaningful work can also have a negative effect on employees’ well-being: employees feel so invested in their work that they push themselves beyond their limits, resulting in strain and susceptibility to burnout. We develop a framework to understand this “double edged” role of meaningful work by drawing from ideological psychological contracts (iPCs), which are characterized by an employee and employer working to contribute to a shared ideology or set of values. The limited iPC research to date has demonstrated that employees may actually work harder in response to an iPC breach. In light of these counterintuitive findings, we propose the following conceptual model to theoretically connect our understanding of iPCs, perceptions of breach, increases in work effort, and the potential “dark side” of repeated occurrences of iPC breach. We argue that time plays a central role in the unfolding process of employees’ reactions to iPC breach. Further, we propose how perceptions of iPC breach relate to strain and, eventually, burnout. This model contributes to our understanding of the role of time in iPC development and maintenance, expands our exploration of ideology in the PC literature, and provides a framework for understanding why certain occupations are more susceptible to instances of strain and burnout. This framework has the potential to guide future employment interventions in ideology-infused organizations to help mitigate negative employee outcomes.

  7. Spatial probability aids visual stimulus discrimination

    Directory of Open Access Journals (Sweden)

    Michael Druker

    2010-08-01

    Full Text Available We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid for visual stimulus discrimination, and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.

  8. Fuel-sodium reaction product formation in breached mixed-oxide fuel

    International Nuclear Information System (INIS)

    Bottcher, J.H.; Lambert, J.D.B.; Strain, R.V.; Ukai, S.; Shibahara, S.

    1988-01-01

    The run-beyond-cladding-breach (RBCB) operation of mixed-oxide LMR fuel pins has been studied for six years in the Experimental Breeder Reactor-II (EBR-II) as part of a joint program between the US Department of Energy and the Power Reactor and Nuclear Fuel Development Corporation of Japan. The formation of fuel-sodium reaction product (FSRP), Na₃MO₄, where M = U(1-y)Pu(y), in the outer fuel regions is the major phenomenon governing RBCB behavior. It increases fuel volume, decreases fuel stoichiometry, modifies fission-product distributions, and alters thermal performance of a pin. This paper describes the morphology of Na₃MO₄ observed in 5.84-mm diameter pins covering a variety of conditions and RBCB times up to 150 EFPDs. 8 refs., 1 fig

  9. Establishing breach of the duty of care in the tort of negligence: 2.

    Science.gov (United States)

    Tingle, John

    This article discusses the law surrounding breach of the duty of care in negligence. A mistake or error does not necessarily mean legal fault and negligence. Judges look at risks and benefits in determining what would have been the appropriate standard of care to be exercised in the circumstances and may decide that the defendant's conduct was reasonable. There are a number of interrelated factors which judges have to balance and these can be categorized as foreseeability of harm, magnitude of risk, burden of taking precautions, utility of the defendant's conduct and common practice.

  10. Development of laser-based techniques for identification of breached nuclear fuel elements in storage. Progress report, 1 October 1978-1 September 1979

    International Nuclear Information System (INIS)

    Anderson, R.J.; Esherick, P.

    1980-06-01

    From many possible laser-based techniques of atomic and molecular spectroscopy we have selected multiphoton ionization spectroscopy (MIS) as that one technique which best conforms to the constraints imposed by the problem of identifying breached nuclear fuel assemblies in storage. We describe experiments utilizing MIS to specifically and sensitively detect xenon (Xe) and nitric oxide (NO). We therefore experimentally demonstrate the applicability of this technique to the detection of volatile fission products leaking from breached nuclear fuel assemblies stored in water cooled basins. Even for the non-ideal circumstances of these preliminary experiments, we estimate a detection limit of 10¹⁰ atoms/cm³ for xenon, roughly 1% of the atmospheric content of xenon at sea level

  11. A method and programme (BREACH) for predicting the flow distribution in water cooled reactor cores

    International Nuclear Information System (INIS)

    Randles, J.; Roberts, H.A.

    1961-03-01

    The method presented here of evaluating the flow rate in individual reactor channels may be applied to any type of water cooled reactor in which boiling occurs. The flow distribution is calculated with the aid of a MERCURY autocode programme, BREACH, which is described in detail. This programme computes the steady state longitudinal void distribution and pressure drop in a single channel on the basis of the homogeneous model of two phase flow. (author)

  12. A method and programme (BREACH) for predicting the flow distribution in water cooled reactor cores

    Energy Technology Data Exchange (ETDEWEB)

    Randles, J; Roberts, H A [Technical Assessments and Services Division, Atomic Energy Establishment, Winfrith, Dorchester, Dorset (United Kingdom)

    1961-03-15

    The method presented here of evaluating the flow rate in individual reactor channels may be applied to any type of water cooled reactor in which boiling occurs. The flow distribution is calculated with the aid of a MERCURY autocode programme, BREACH, which is described in detail. This programme computes the steady state longitudinal void distribution and pressure drop in a single channel on the basis of the homogeneous model of two phase flow. (author)
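
    The homogeneous two-phase flow model referred to in these records treats the mixture as a single fluid with equal phase velocities. A minimal sketch of the void fraction and mixture density it implies is given below; the saturation properties and operating points are illustrative assumptions, not values from the BREACH programme.

```python
def homogeneous_mixture(quality, rho_liquid, rho_vapour):
    """Void fraction and mixture density for the homogeneous two-phase model,
    which assumes equal liquid and vapour velocities (slip ratio = 1)."""
    x = quality
    # Homogeneous void fraction: alpha = 1 / (1 + ((1 - x)/x) * (rho_g / rho_f))
    if x <= 0.0:
        alpha = 0.0
    else:
        alpha = 1.0 / (1.0 + ((1.0 - x) / x) * (rho_vapour / rho_liquid))
    rho_mix = alpha * rho_vapour + (1.0 - alpha) * rho_liquid
    return alpha, rho_mix

# Illustrative saturation densities near 7 MPa (approximate, for demonstration only)
rho_f, rho_g = 740.0, 36.5   # kg/m^3, liquid and vapour

for x in (0.0, 0.01, 0.05, 0.10, 0.20):
    alpha, rho = homogeneous_mixture(x, rho_f, rho_g)
    # Gravitational pressure gradient of the mixture in a vertical channel
    dp_dz_grav = rho * 9.81   # Pa/m
    print(f"x = {x:4.2f}  void fraction = {alpha:5.3f}  "
          f"mixture density = {rho:6.1f} kg/m^3  dp/dz(grav) = {dp_dz_grav:7.0f} Pa/m")
```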

  13. Probability modeling of high flow extremes in Yingluoxia watershed, the upper reaches of Heihe River basin

    Science.gov (United States)

    Li, Zhanling; Li, Zhanjie; Li, Chengcheng

    2014-05-01

    Probability modeling of hydrological extremes is one of the major research areas in hydrological science. Most probability modeling analyses of high flow extremes concern basins in the humid and semi-humid south and east of China, while for the inland river basins that occupy about 35% of the country's area such studies remain limited, partly because of limited data availability and relatively low mean annual flows. The objective of this study is to carry out probability modeling of high flow extremes in the upper reach of the Heihe River basin, the second largest inland river basin in China, using the peak-over-threshold (POT) method and the Generalized Pareto Distribution (GPD); the selection of the threshold and the inherent assumptions of the POT series are elaborated in detail. For comparison, other widely used probability distributions, including the generalized extreme value (GEV), Lognormal, Log-logistic and Gamma, are employed as well. Maximum likelihood estimation is used for parameter estimation. Daily flow data at Yingluoxia station from 1978 to 2008 are used. Results show that, synthesizing the approaches of the mean excess plot, stability features of model parameters, the return level plot and the inherent independence assumption of the POT series, an optimum threshold of 340 m³/s is finally determined for high flow extremes in the Yingluoxia watershed. The resulting POT series proves to be stationary and independent based on the Mann-Kendall test, the Pettitt test and an autocorrelation test. In terms of the Kolmogorov-Smirnov test, the Anderson-Darling test and several graphical diagnostics such as quantile and cumulative density function plots, the GPD provides the best fit to high flow extremes in the study area. The estimated high flows for long return periods demonstrate that, as the return period increases, the return level estimates become more uncertain. The frequency of high flow extremes exhibits a very slight but not significant decreasing trend from 1978 to
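
    A peak-over-threshold analysis of this kind can be sketched with scipy: extract exceedances above the threshold, fit a GPD by maximum likelihood, and convert the fit into return levels. The synthetic flow series below is an assumption for illustration and reuses only the 340 m³/s threshold quoted in the abstract; a real analysis would also decluster dependent peaks.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(42)
threshold = 340.0                       # m^3/s, threshold quoted in the abstract
years = 31                              # 1978-2008
daily_flow = rng.gamma(shape=2.0, scale=60.0, size=years * 365)  # synthetic flows

# Exceedances over the threshold (a real analysis would decluster dependent peaks)
exceedances = daily_flow[daily_flow > threshold] - threshold

# Maximum likelihood fit of the Generalized Pareto Distribution to the exceedances
xi, loc, sigma = genpareto.fit(exceedances, floc=0.0)
rate = len(exceedances) / years          # mean number of exceedances per year

def return_level(T_years):
    """Flow exceeded on average once every T_years (POT/GPD return level)."""
    if abs(xi) < 1e-6:
        return threshold + sigma * np.log(rate * T_years)
    return threshold + (sigma / xi) * ((rate * T_years) ** xi - 1.0)

print(f"GPD fit: shape xi = {xi:.3f}, scale sigma = {sigma:.1f} m^3/s, "
      f"{rate:.1f} exceedances/year")
for T in (10, 50, 100):
    print(f"{T:>3d}-year return level ~ {return_level(T):.0f} m^3/s")
```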

  14. Breaching vulnerability of coastal barriers under effects of tropical cyclones : A model study on the Hue lagoon - Vietnam

    NARCIS (Netherlands)

    Tuan, T.Q.; Stive, M.J.F.; Verhagen, H.J.

    2006-01-01

    Under effects of tropical cyclones, the coast is subjected to attack both by surge and wave from the sea and by flooding from the bay. These forces pose a serious breaching threat to natural sea-defence works such as barrier spits, barrier islands, lagoon barriers, etc. on the coast. Unintended

  15. Risk assessment for the intentional depressurization strategy in PWRs

    International Nuclear Information System (INIS)

    Dingman, S.E.

    1994-03-01

    An accident management strategy has been proposed in which the reactor coolant system is intentionally depressurized during an accident. The aim is to reduce the containment pressurization that would result from high pressure ejection of molten debris at vessel breach. Probabilistic risk assessment (PRA) methods were used to evaluate this strategy for the Surry nuclear power plant. Sensitivity studies were conducted using event trees that were developed for the NUREG-1150 study. It was found that depressurization (intentional or unintentional) had minimal impact on the containment failure probability at vessel breach for Surry because the containment loads assessed for NUREG-1150 were not a great threat to the containment survivability. An updated evaluation of the impact of intentional depressurization on the probability of having a high pressure melt ejection was then made that reflected analyses that have been performed since NUREG-1150 was completed. The updated evaluation confirmed the sensitivity study conclusions that intentional depressurization has minimal impact on the probability of a high pressure melt ejection. The updated evaluation did show a slight benefit from depressurization because depressurization delayed core melting, which led to a higher probability of recovering emergency core coolant injection, thereby arresting the core damage

  16. A prototype method for diagnosing high ice water content probability using satellite imager data

    Science.gov (United States)

    Yost, Christopher R.; Bedka, Kristopher M.; Minnis, Patrick; Nguyen, Louis; Strapp, J. Walter; Palikonda, Rabindra; Khlopenkov, Konstantin; Spangenberg, Douglas; Smith, William L., Jr.; Protat, Alain; Delanoe, Julien

    2018-03-01

    Recent studies have found that ingestion of high mass concentrations of ice particles in regions of deep convective storms, with radar reflectivity considered safe for aircraft penetration, can adversely impact aircraft engine performance. Previous aviation industry studies have used the term high ice water content (HIWC) to define such conditions. Three airborne field campaigns were conducted in 2014 and 2015 to better understand how HIWC is distributed in deep convection, both as a function of altitude and proximity to convective updraft regions, and to facilitate development of new methods for detecting HIWC conditions, in addition to many other research and regulatory goals. This paper describes a prototype method for detecting HIWC conditions using geostationary (GEO) satellite imager data coupled with in situ total water content (TWC) observations collected during the flight campaigns. Three satellite-derived parameters were determined to be most useful for determining HIWC probability: (1) the horizontal proximity of the aircraft to the nearest overshooting convective updraft or textured anvil cloud, (2) tropopause-relative infrared brightness temperature, and (3) daytime-only cloud optical depth. Statistical fits between collocated TWC and GEO satellite parameters were used to determine the membership functions for the fuzzy logic derivation of HIWC probability. The products were demonstrated using data from several campaign flights and validated using a subset of the satellite-aircraft collocation database. The daytime HIWC probability was found to agree quite well with TWC time trends and identified extreme TWC events with high probability. Discrimination of HIWC was more challenging at night with IR-only information. The products show the greatest capability for discriminating TWC ≥ 0.5 g m⁻³. Product validation remains challenging due to vertical TWC uncertainties and the typically coarse spatio-temporal resolution of the GEO data.
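
    The fuzzy-logic combination described above can be sketched as membership functions over the three satellite parameters blended into a single probability. The membership breakpoints and weights below are invented placeholders, not the functions fitted to the field-campaign statistics.

```python
def ramp(value, low, high, descending=False):
    """Piecewise-linear membership function between 0 and 1."""
    if high == low:
        return 0.0
    m = (value - low) / (high - low)
    m = min(max(m, 0.0), 1.0)
    return 1.0 - m if descending else m

def hiwc_probability(distance_to_updraft_km, tropopause_rel_bt_k, cloud_optical_depth=None):
    """Blend memberships of the three GEO parameters into an HIWC probability.
    Breakpoints and weights are hypothetical, for illustration only."""
    # Closer to an overshooting top / textured anvil -> higher membership
    m_dist = ramp(distance_to_updraft_km, 0.0, 100.0, descending=True)
    # Colder than the tropopause (negative tropopause-relative BT) -> higher membership
    m_bt = ramp(tropopause_rel_bt_k, -10.0, 10.0, descending=True)
    members = [(m_dist, 0.4), (m_bt, 0.4)]
    if cloud_optical_depth is not None:          # daytime-only third parameter
        members.append((ramp(cloud_optical_depth, 20.0, 150.0), 0.2))
    total_weight = sum(w for _, w in members)
    return sum(m * w for m, w in members) / total_weight

print("day case  :", round(hiwc_probability(15.0, -4.0, cloud_optical_depth=120.0), 2))
print("night case:", round(hiwc_probability(15.0, -4.0), 2))
```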

  17. Probability of fracture and life extension estimate of the high-flux isotope reactor vessel

    International Nuclear Information System (INIS)

    Chang, S.J.

    1998-01-01

    The state of the vessel steel embrittlement as a result of neutron irradiation can be measured by its increase in ductile-brittle transition temperature (DBTT) for fracture, often denoted by RT_NDT for carbon steel. This transition temperature can be calibrated by the drop-weight test and, sometimes, by the Charpy impact test. The life extension for the high-flux isotope reactor (HFIR) vessel is calculated by using the method of fracture mechanics, incorporating the effect of the DBTT change. The failure probability of the HFIR vessel, and hence the vessel life, is limited by the reactor core melt probability of 10⁻⁴. The operating safety of the reactor is ensured by a periodic hydrostatic pressure test (hydrotest), which is performed in order to determine a safe vessel static pressure. The fracture probability as a result of the hydrostatic pressure test is calculated and is used to determine the life of the vessel; failure to perform the hydrotest imposes a limit on the life of the vessel. Conventional methods of fracture probability calculation, such as those used by the NRC-sponsored PRAISE code and the FAVOR code developed in this Laboratory, are based on Monte Carlo simulation and require heavy computation. An alternative method of fracture probability calculation by direct probability integration is developed in this paper. The present approach offers simple and expedient ways to obtain numerical results without losing any generality. In this paper, numerical results on (1) the probability of vessel fracture, (2) the hydrotest time interval, and (3) the hydrotest pressure as a result of the DBTT increase are obtained
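
    Direct probability integration replaces Monte Carlo sampling with numerical quadrature of the failure probability over the uncertain inputs. The sketch below integrates over an assumed fracture-toughness distribution and flaw-size distribution for a generic vessel; all distributions and geometry factors are chosen purely for illustration and are not HFIR-specific.

```python
import numpy as np
from scipy import stats
from scipy.integrate import trapezoid

# Hypothetical distributions (not HFIR-specific): fracture toughness K_Ic and flaw depth a
kic = stats.norm(loc=80.0, scale=15.0)       # MPa*sqrt(m)
flaw = stats.lognorm(s=0.8, scale=0.005)     # m, median flaw depth 5 mm

def applied_K(pressure_mpa, a):
    """Very simplified applied stress intensity for a shallow surface flaw."""
    hoop_stress = pressure_mpa * 20.0        # hypothetical pressure-to-stress factor
    return 1.12 * hoop_stress * np.sqrt(np.pi * a)

def failure_probability(pressure_mpa, n=2000):
    """P_f = integral over flaw depth of P(K_Ic < K_applied(a)) * f(a) da,
    evaluated by direct quadrature instead of Monte Carlo sampling."""
    a = np.linspace(1e-4, 0.05, n)           # flaw depths, m
    p_fracture_given_a = kic.cdf(applied_K(pressure_mpa, a))
    return trapezoid(p_fracture_given_a * flaw.pdf(a), a)

for p in (5.0, 7.5, 10.0):                   # candidate hydrotest pressures, MPa (illustrative)
    print(f"P(fracture | hydrotest at {p:4.1f} MPa) = {failure_probability(p):.2e}")
```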

  18. Assessment of the potential for high-pressure melt ejection resulting from a Surry station blackout transient

    International Nuclear Information System (INIS)

    Knudson, D.L.; Dobbe, C.A.

    1993-11-01

    Containment integrity could be challenged by direct heating associated with a high pressure melt ejection (HPME) of core materials following reactor vessel breach during certain severe accidents. Intentional reactor coolant system (RCS) depressurization, where operators latch pressurizer relief valves open, has been proposed as an accident management strategy to reduce risks by mitigating the severity of HPME. However, decay heat levels, valve capacities, and other plant-specific characteristics determine whether the required operator action will be effective. Without operator action, natural circulation flows could heat ex-vessel RCS pressure boundaries (surge line and hot leg piping, steam generator tubes, etc.) to the point of failure before vessel breach, providing an alternate mechanism for RCS depressurization and HPME mitigation. This report contains an assessment of the potential for HPME during a Surry station blackout transient without operator action and without recovery. The assessment included a detailed transient analysis using the SCDAP/RELAP5/MOD3 computer code to calculate the plant response with and without hot leg countercurrent natural circulation, with and without reactor coolant pump seal leakage, and with variations on selected core damage progression parameters. RCS depressurization-related probabilities were also evaluated, primarily based on the code results

  19. Geomorphic and stratigraphic evidence for an unusual tsunami or storm a few centuries ago at Anegada, British Virgin Islands

    Science.gov (United States)

    Atwater, Brian F.; ten Brink, Uri S.; Buckley, Mark; Halley, Robert S.; Jaffe, Bruce E.; López-Venegas, Alberto M.; Reinhardt, Eduard G.; Tuttle, Maritia P.; Watt, Steve; Wei, Yong

    2012-01-01

    Waters from the Atlantic Ocean washed southward across parts of Anegada, east-northeast of Puerto Rico, during a singular event a few centuries ago. The overwash, after crossing a fringing coral reef and 1.5 km of shallow subtidal flats, cut dozens of breaches through sandy beach ridges, deposited a sheet of sand and shell capped with lime mud, and created inland fields of cobbles and boulders. Most of the breaches extend tens to hundreds of meters perpendicular to a 2-km stretch of Anegada’s windward shore. Remnants of the breached ridges stand 3 m above modern sea level, and ridges seaward of the breaches rise 2.2–3.0 m high. The overwash probably exceeded those heights when cutting the breaches by overtopping and incision of the beach ridges. Much of the sand-and-shell sheet contains pink bioclastic sand that resembles, in grain size and composition, the sand of the breached ridges. This sand extends as much as 1.5 km to the south of the breached ridges. It tapers southward from a maximum thickness of 40 cm, decreases in estimated mean grain size from medium sand to very fine sand, and contains mud laminae in the south. The sand-and-shell sheet also contains mollusks—cerithid gastropods and the bivalve Anomalocardia—and angular limestone granules and pebbles. The mollusk shells and the lime-mud cap were probably derived from a marine pond that occupied much of Anegada’s interior at the time of overwash. The boulders and cobbles, nearly all composed of limestone, form fields that extend many tens of meters generally southward from limestone outcrops as much as 0.8 km from the nearest shore. Soon after the inferred overwash, the marine pond was replaced by hypersaline ponds that produce microbial mats and evaporite crusts. This environmental change, which has yet to be reversed, required restriction of a former inlet or inlets, the location of which was probably on the island’s south (lee) side. The inferred overwash may have caused restriction

  20. An Examination of the Relational Aspects of Leadership Credibility, Psychological Contract Breach and Violation, and Interactional Justice

    OpenAIRE

    Johnson, Nicole Annette

    2009-01-01

    Especially during times of intense change, managers may negatively impact the quality of employee-manager relationships by breaching or violating psychological contract terms and exhibiting unfair treatment (i.e., interactional injustice) in the workplace. A psychological contract is conceptualized as an employee's perception or individualistic belief about the reciprocal and promissory nature of the employment relationship (Argyris, 1960; Levinson, Price, Munden, Mandl, & Solley, 1966; Rou...

  1. Risk-averse decision-making for civil infrastructure exposed to low-probability, high-consequence events

    International Nuclear Information System (INIS)

    Cha, Eun Jeong; Ellingwood, Bruce R.

    2012-01-01

    Quantitative analysis and assessment of risk to civil infrastructure has two components: probability of a potentially damaging event and consequence of damage, measured in terms of financial or human losses. Decision models that have been utilized during the past three decades take into account the probabilistic component rationally, but address decision-maker attitudes toward consequences and risk only to a limited degree. The application of models reflecting these attitudes to decisions involving low-probability, high-consequence events that may impact civil infrastructure requires a fundamental understanding of risk acceptance attitudes and how they affect individual and group choices. In particular, the phenomenon of risk aversion may be a significant factor in decisions for civil infrastructure exposed to low-probability events with severe consequences, such as earthquakes, hurricanes or floods. This paper utilizes cumulative prospect theory to investigate the role and characteristics of risk-aversion in assurance of structural safety.

  2. Dam-Breach hydrology of the Johnstown flood of 1889-challenging the findings of the 1891 investigation report.

    Science.gov (United States)

    Coleman, Neil M; Kaktins, Uldis; Wojno, Stephanie

    2016-06-01

    In 1891 a report was published by an ASCE committee to investigate the cause of the Johnstown flood of 1889. They concluded that changes made to the dam by the South Fork Fishing and Hunting Club did not cause the disaster because the embankment would have been overflowed and breached if the changes were not made. We dispute that conclusion based on hydraulic analyses of the dam as originally built, estimates of the time of concentration and time to peak for the South Fork drainage basin, and reported conditions at the dam and in the watershed. We present a LiDAR-based volume of Lake Conemaugh at the time of dam failure (1.455 × 10⁷ m³) and hydrographs of flood discharge and lake stage decline. Our analytical approach incorporates the complex shape of this dam breach. More than 65 min would have been needed to drain most of the lake, not the 45 min cited by most sources. Peak flood discharges were likely in the range 7200 to 8970 m³ s⁻¹. The original dam design, with a crest ∼0.9 m higher and the added capacity of an auxiliary spillway and five discharge pipes, had a discharge capacity at overtopping more than twice that of the reconstructed dam. A properly rebuilt dam would not have overtopped and would likely have survived the runoff event, thereby saving thousands of lives. We believe the ASCE report represented state-of-the-art for 1891. However, the report contains discrepancies and lapses in key observations, and relied on excessive reservoir inflow estimates. The confidence they expressed that dam failure was inevitable was inconsistent with information available to the committee. Hydrodynamic erosion was a likely culprit in the 1862 dam failure that seriously damaged the embankment. The Club's substandard repair of this earlier breach sowed the seeds of its eventual destruction.
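
    A first-order check on drain time and peak discharge can be made with level-pool routing: the lake volume quoted in the abstract, an assumed stage-storage relation, and a simplified weir law for the breach are integrated until the lake empties. The breach geometry, weir coefficient, and stage-storage exponent below are assumptions for illustration, not the paper's reconstructed breach hydraulics.

```python
V0 = 1.455e7          # lake volume at failure, m^3 (from the abstract)
H0 = 18.0             # assumed initial head above the breach invert, m
C = 1.7               # broad-crested weir coefficient, m^0.5/s (assumed)
B = 65.0              # assumed effective breach width, m

def head_from_volume(v):
    """Assumed stage-storage relation V(h) = V0 * (h / H0)**1.5, inverted for h."""
    return H0 * (max(v, 0.0) / V0) ** (2.0 / 3.0)

def drain(dt=5.0):
    """Level-pool routing dV/dt = -Q(h) with Q = C * B * h**1.5 (instant full breach)."""
    v, t, q_peak = V0, 0.0, 0.0
    while v > 0.05 * V0:                # stop once ~95% of the lake has drained
        h = head_from_volume(v)
        q = C * B * h ** 1.5
        q_peak = max(q_peak, q)
        v -= q * dt
        t += dt
    return t / 60.0, q_peak

minutes, q_peak = drain()
print(f"time to release ~95% of the lake: {minutes:.0f} min")
print(f"peak breach discharge: {q_peak:.0f} m^3/s")
```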

  3. When I'm 64: Psychological contract breach, work motivation and the moderating roles of future time perspective and regulatory focus

    NARCIS (Netherlands)

    Lange, A.H. de; Bal, P.M.; Heijden, B.I.J.M. van der; Jong, N. de; Schaufeli, W.B.

    2011-01-01

    There is an increasing need for managers to understand what motivates younger versus older workers to continue work within their company. We believe that this two-wave study among 90 Dutch employees is the first to examine: (1) the cross-lagged relationships between breach of psychological contract

  4. When I'm 64 : Psychological contract breach, work motivation and the moderating roles of future time perspective and regulatory focus

    NARCIS (Netherlands)

    de Lange, Annet H.; Bal, P. Matthijs; Van der Heijden, Beatrice I. J. M.; de Jong, Nicole; Schaufeli, Wilmar B.

    2011-01-01

    There is an increasing need for managers to understand what motivates younger versus older workers to continue work within their company. We believe that this two-wave study among 90 Dutch employees is the first to examine: (1) the cross-lagged relationships between breach of psychological contract

  5. Generalized Probability-Probability Plots

    NARCIS (Netherlands)

    Mushkudiani, N.A.; Einmahl, J.H.J.

    2004-01-01

    We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real valued data. These plots, which are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P

  6. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    Full Text Available We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  7. Results of transient overpower events on breached and unbreached fuel pins

    International Nuclear Information System (INIS)

    Strain, R.V.; Tsai, H.C.; Neimark, L.A.; Aratani, K.

    1986-04-01

    The objective of the extended overpower tests on intact pins was to determine the pin cladding breaching thresholds vis-a-vis the Plant Protection System (PPS) trip settings, typically at ∼10 to 15% overpower. These tests emphasize slow operational-type transients in light of earlier work which suggested that irradiated mixed-oxide fuel pins may be particularly vulnerable in the slow ramp-rate regime. An overview of the extended overpower test series was previously reported. More recent results on two of the tests in this series are included in this paper. These two tests, designated TOPI-1A and TOPI-1B, were each conducted on a 19-pin assembly with various pin design, operation and burnup variables. The overpower ramp rates for the TOPI-1A and -1B tests were 0.1%/s and 10%/s, respectively

  8. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming paradigm that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  9. Neutron emission probability at high excitation and isospin

    International Nuclear Information System (INIS)

    Aggarwal, Mamta

    2005-01-01

    One-neutron and two-neutron emission probabilities at different excitations and varying isospin have been studied. Several degrees of freedom like deformation, rotations, temperature, isospin fluctuations and shell structure are incorporated via the statistical theory of hot rotating nuclei

  10. Baby milk companies accused of breaching marketing code.

    Science.gov (United States)

    Wise, J

    1997-01-18

    A consortium of 27 religious and health organizations has released a report entitled "Cracking the Code," which criticizes the bottle-feeding marketing techniques used by Nestle, Gerber, Mead Johnson, Wyeth, and Nutricia. Research for the report was carried out in Thailand, Bangladesh, South Africa, and Poland using a random sample of 800 mothers and 120 health workers in each country. In all 4 sites, women had received information that violated the World Health Organization's 1981 international code of marketing breast milk substitutes. Violations included promoting artificial feeding without recognizing breast feeding as the best source of infant nutrition. The investigation also found that women and health workers in all 4 sites received free samples of artificial milk. The report includes detailed examples of manufacturer representatives making unrequested visits to give product information to mothers, providing incentives to health workers to promote products, and promoting products outside of health care facilities. While the International Association of Infant Food Manufacturers condemned the study as biased, the Nestle company promised to review the allegations contained in the report and to deal with any breaches in the code. The Interagency Group on Breastfeeding Monitoring, which prepared the report, was created in 1994 to provide data to groups supporting a boycott of Nestle for code violations.

  11. High-resolution urban flood modelling - a joint probability approach

    Science.gov (United States)

    Hartnett, Michael; Olbert, Agnieszka; Nash, Stephen

    2017-04-01

    (Divoky et al., 2005). Nevertheless, such events occur and in Ireland alone there are several cases of serious damage due to flooding resulting from a combination of high sea water levels and river flows driven by the same meteorological conditions (e.g. Olbert et al. 2015). A November 2009 fluvial-coastal flooding of Cork City bringing €100m loss was one such incident. This event was used by Olbert et al. (2015) to determine processes controlling urban flooding and is further explored in this study to elaborate on coastal and fluvial flood mechanisms and their roles in controlling water levels. The objective of this research is to develop a methodology to assess combined effect of multiple source flooding on flood probability and severity in urban areas and to establish a set of conditions that dictate urban flooding due to extreme climatic events. These conditions broadly combine physical flood drivers (such as coastal and fluvial processes), their mechanisms and thresholds defining flood severity. The two main physical processes controlling urban flooding: high sea water levels (coastal flooding) and high river flows (fluvial flooding), and their threshold values for which flood is likely to occur, are considered in this study. Contribution of coastal and fluvial drivers to flooding and their impacts are assessed in a two-step process. The first step involves frequency analysis and extreme value statistical modelling of storm surges, tides and river flows and ultimately the application of joint probability method to estimate joint exceedence return periods for combination of surges, tide and river flows. In the second step, a numerical model of Cork Harbour MSN_Flood comprising a cascade of four nested high-resolution models is used to perform simulation of flood inundation under numerous hypothetical coastal and fluvial flood scenarios. The risk of flooding is quantified based on a range of physical aspects such as the extent and depth of inundation (Apel et al
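
    The joint probability step described above amounts to estimating how often high sea levels and high river flows exceed their thresholds together. A minimal empirical sketch on synthetic, correlated surge and flow series is shown below; the marginal distributions, correlation, and thresholds are invented for illustration and do not represent the Cork Harbour data.

```python
import numpy as np

rng = np.random.default_rng(7)
n_days = 40 * 365                     # 40 synthetic years of daily values

# Correlated standard-normal drivers (the same storm can raise both surge and flow)
rho = 0.5
z1 = rng.standard_normal(n_days)
z2 = rho * z1 + np.sqrt(1 - rho ** 2) * rng.standard_normal(n_days)
surge = np.exp(0.4 * z1)              # synthetic surge magnitude (m)
flow = 80.0 * np.exp(0.6 * z2)        # synthetic river flow (m^3/s)

# Marginal thresholds chosen as the 99th percentiles of each driver
surge_thr = np.quantile(surge, 0.99)
flow_thr = np.quantile(flow, 0.99)

p_joint = np.mean((surge > surge_thr) & (flow > flow_thr))
p_indep = 0.01 * 0.01                 # what independence would predict

def return_period_years(daily_prob):
    return np.inf if daily_prob == 0 else 1.0 / (daily_prob * 365)

print(f"P(joint exceedance per day): observed {p_joint:.2e}, independent {p_indep:.2e}")
print(f"joint return period: observed ~{return_period_years(p_joint):.0f} yr, "
      f"independence would suggest ~{return_period_years(p_indep):.0f} yr")
```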

  12. Pupils' Visual Representations in Standard and Problematic Problem Solving in Mathematics: Their Role in the Breach of the Didactical Contract

    Science.gov (United States)

    Deliyianni, Eleni; Monoyiou, Annita; Elia, Iliada; Georgiou, Chryso; Zannettou, Eleni

    2009-01-01

    This study investigated the modes of representations generated by kindergarteners and first graders while solving standard and problematic problems in mathematics. Furthermore, it examined the influence of pupils' visual representations on the breach of the didactical contract rules in problem solving. The sample of the study consisted of 38…

  13. Consistency assessment of rating curve data in various locations using Bidirectional Reach (BReach)

    Science.gov (United States)

    Van Eerdenbrugh, Katrien; Van Hoey, Stijn; Coxon, Gemma; Freer, Jim; Verhoest, Niko E. C.

    2017-10-01

    When estimating discharges through rating curves, temporal data consistency is a critical issue. In this research, consistency in stage-discharge data is investigated using a methodology called Bidirectional Reach (BReach), which departs from a (in operational hydrology) commonly used definition of consistency. A period is considered to be consistent if no consecutive and systematic deviations from a current situation occur that exceed observational uncertainty. Therefore, the capability of a rating curve model to describe a subset of the (chronologically sorted) data is assessed in each observation by indicating the outermost data points for which the rating curve model behaves satisfactorily. These points are called the maximum left or right reach, depending on the direction of the investigation. This temporal reach should not be confused with a spatial reach (indicating a part of a river). Changes in these reaches throughout the data series indicate possible changes in data consistency and if not resolved could introduce additional errors and biases. In this research, various measurement stations in the UK, New Zealand and Belgium are selected based on their significant historical ratings information and their specific characteristics related to data consistency. For each country, regional information is maximally used to estimate observational uncertainty. Based on this uncertainty, a BReach analysis is performed and, subsequently, results are validated against available knowledge about the history and behavior of the site. For all investigated cases, the methodology provides results that appear to be consistent with this knowledge of historical changes and thus facilitates a reliable assessment of (in)consistent periods in stage-discharge measurements. This assessment is not only useful for the analysis and determination of discharge time series, but also to enhance applications based on these data (e.g., by informing hydrological and hydraulic model
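
    A stripped-down caricature of the BReach idea is sketched below: for each gauging in the chronologically sorted record, scan outwards and report the outermost neighbours that a single power-law rating curve can still describe within an assumed observational uncertainty. The synthetic gaugings, rating form, and tolerance are assumptions; the actual methodology uses formal regional uncertainty estimates and model acceptability criteria.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic, chronologically sorted stage-discharge gaugings with a rating
# shift (e.g. after channel works) halfway through the record.
n = 120
stage = rng.uniform(0.5, 3.0, n)
coef = np.where(np.arange(n) < n // 2, 12.0, 15.0)
discharge = coef * stage ** 1.8 * np.exp(rng.normal(0.0, 0.03, n))

TOL = 0.12   # assumed observational uncertainty (relative deviation), illustrative

def describable(lo, hi):
    """True if one power-law rating curve Q = a*h^b fits gaugings lo..hi within TOL."""
    h, q = stage[lo:hi + 1], discharge[lo:hi + 1]
    b, log_a = np.polyfit(np.log(h), np.log(q), 1)
    pred = np.exp(log_a) * h ** b
    return bool(np.all(np.abs(q - pred) / pred <= TOL))

def reach(index):
    """Maximum left and right reach of `index`: outermost gaugings that a single
    rating curve containing `index` still describes within tolerance."""
    right = min(index + 1, n - 1)
    while right + 1 < n and describable(index, right + 1):
        right += 1
    left = max(index - 1, 0)
    while left - 1 >= 0 and describable(left - 1, index):
        left -= 1
    return left, right

for i in (5, 40, 70, 110):
    lo, hi = reach(i)
    print(f"gauging {i:3d}: left reach = {lo:3d}, right reach = {hi:3d}")
```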

  14. Quebras contratuais e dispersão de sentenças Contractual breaches and sentences dispersion

    Directory of Open Access Journals (Sweden)

    Christiane Leles Rezende

    2011-06-01

    Full Text Available The problem that motivated the study behind this article was the breach of contracts by farmers following the sharp rise in soybean prices, and the judicial disputes that resulted. Descriptive and econometric analyses were carried out on 161 appeal decisions of the Goiás Court of Justice (Brazil), together with a quantitative survey of 70 farmers. The study considers the hypothesis that the instability generated by judicial decisions raises transaction costs and affects the decisions of private agents. A large dispersion was found between first-instance and appeal decisions, as well as among the Civil Chambers of the Court of Justice. Economic agents reported that changes in their supply strategies centred on stricter guarantee requirements and a reduction in the number of contracts. The concept of the “social role of the contract” is associated with increased instability; higher transaction costs followed, as did the adoption of economic sanctions by private agents.

  15. Research Program Tests for the U.S. Defense Special Weapons Agency (DSWA) for Breaching of Concrete Panels Set Against a Sandstone Rock Wall

    National Research Council Canada - National Science Library

    Harvey, Kent

    2006-01-01

    ...) Determine the difficulties and nuances of drilling behind wall test panels 3) Test different blast hole sizes, blast hole locations, and blasting sequences in an effort to identify the advantages and disadvantages of different breaching approaches...

  16. CONTAINMENT ANALYSIS METHODOLOGY FOR TRANSPORT OF BREACHED CLAD ALUMINUM SPENT FUEL

    Energy Technology Data Exchange (ETDEWEB)

    Vinson, D.

    2010-07-11

    Aluminum-clad, aluminum-based spent nuclear fuel (Al-SNF) from foreign and domestic research reactors (FRR/DRR) is being shipped to the Savannah River Site and placed in interim storage in a water basin. To enter the United States, a cask with loaded fuel must be certified to comply with the requirements in the Title 10 of the U.S. Code of Federal Regulations, Part 71. The requirements include demonstration of containment of the cask with its contents under normal and accident conditions. Many Al-SNF assemblies have suffered corrosion degradation in storage in poor quality water, and many of the fuel assemblies are 'failed' or have through-clad damage. A methodology was developed to evaluate containment of Al-SNF even with severe cladding breaches for transport in standard casks. The containment analysis methodology for Al-SNF is in accordance with the methodology provided in ANSI N14.5 and adopted by the U. S. Nuclear Regulatory Commission in NUREG/CR-6487 to meet the requirements of 10CFR71. The technical bases for the inputs and assumptions are specific to the attributes and characteristics of Al-SNF received from basin and dry storage systems and its subsequent performance under normal and postulated accident shipping conditions. The results of the calculations for a specific case of a cask loaded with breached fuel show that the fuel can be transported in standard shipping casks and maintained within the allowable release rates under normal and accident conditions. A sensitivity analysis has been conducted to evaluate the effects of modifying assumptions and to assess options for fuel at conditions that are not bounded by the present analysis. These options would include one or more of the following: reduce the fuel loading; increase fuel cooling time; reduce the degree of conservatism in the bounding assumptions; or measure the actual leak rate of the cask system. That is, containment analysis for alternative inputs at fuel-specific conditions and
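
    The containment acceptance check described above compares an allowable leakage rate, derived from regulatory release-rate limits, with the reference or measured leak rate of the cask. A minimal sketch of that arithmetic is given below; the A2 value, releasable activity, and free volume are invented placeholders, and the release-rate limits shown are the commonly cited ANSI N14.5 forms rather than values taken from this analysis.

```python
# Illustrative numbers only; the A2 value, releasable activity, and cask free volume
# are hypothetical and not taken from the Al-SNF containment analysis.
A2_TBq = 0.6                        # A2 of the limiting nuclide mixture (TBq), assumed
releasable_activity_TBq = 2.0e-3    # activity available for release in the cavity, assumed
free_volume_cm3 = 1.0e6             # cask cavity free gas volume (cm^3), assumed

# Release-rate limits as commonly summarized from ANSI N14.5 / NUREG/CR-6487:
#   normal transport:  R_N <= A2 x 1e-6 per hour
#   accident:          R_A <= A2 per week
R_normal_TBq_per_s = A2_TBq * 1e-6 / 3600.0
R_accident_TBq_per_s = A2_TBq / (7 * 24 * 3600.0)

# Activity concentration of the gas that could leak through the containment boundary
concentration_TBq_per_cm3 = releasable_activity_TBq / free_volume_cm3

# Allowable volumetric leakage rates L = R / C (cm^3/s of medium at cask conditions)
L_normal = R_normal_TBq_per_s / concentration_TBq_per_cm3
L_accident = R_accident_TBq_per_s / concentration_TBq_per_cm3

print(f"allowable leak rate, normal conditions  : {L_normal:.2e} cm^3/s")
print(f"allowable leak rate, accident conditions: {L_accident:.2e} cm^3/s")
```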

  17. Prognostic value of stress echocardiography in women with high (⩾80%) probability of coronary artery disease

    OpenAIRE

    Davar, J; Roberts, E; Coghlan, J; Evans, T; Lipkin, D

    2001-01-01

    OBJECTIVE—To assess the prognostic significance of stress echocardiography in women with a high probability of coronary artery disease (CAD).
SETTING—Secondary and tertiary cardiology unit at a university teaching hospital.
PARTICIPANTS—A total of 135 women (mean (SD) age 63 (9) years) with pre-test probability of CAD ⩾80% were selected from a database of patients investigated by treadmill or dobutamine stress echocardiography between 1995 and 1998.
MAIN OUTCOME MEASURES—Patients were followe...

  18. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  19. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  20. Medicare program; offset of Medicare payments to individuals to collect past-due obligations arising from breach of scholarship and loan contracts--HCFA. Final rule.

    Science.gov (United States)

    1992-05-04

    This final rule sets forth the procedures to be followed for collection of past-due amounts owed by individuals who breached contracts under certain scholarship and loan programs. The programs that would be affected are the National Health Service Corps Scholarship, the Physician Shortage Area Scholarship, and the Health Education Assistance Loan. These procedures would apply to those individuals who breached contracts under the scholarship and loan programs and who-- Accept Medicare assignment for services; Are employed by or affiliated with a provider, Health Maintenance Organization, or Competitive Medical Plan that receives Medicare payment for services; or Are members of a group practice that receives Medicare payment for services. This regulation implements section 1892 of the Social Security Act, as added by section 4052 of the Omnibus Budget Reconciliation Act of 1987.

  1. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

    Bryan eCort

    2013-10-01

    Full Text Available We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.
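    The cue-to-colour contingency described above amounts to estimating P(target colour | cue pair) from trial counts. A minimal Python sketch of that calculation follows; it is not from the study, and the trial log, cue labels and colours are hypothetical.

    from collections import Counter
    # Hypothetical trial log: (cue1, cue2, target_colour)
    trials = [
        ("A", "X", "red"), ("A", "X", "red"), ("A", "X", "green"),
        ("B", "Y", "green"), ("B", "Y", "green"), ("B", "Y", "red"),
    ]
    pair_counts = Counter((c1, c2) for c1, c2, _ in trials)
    joint_counts = Counter(trials)
    def conditional_probability(cue_pair, colour):
        """P(target colour | cue pair), estimated from relative frequencies."""
        if pair_counts[cue_pair] == 0:
            return None  # cue combination never observed
        return joint_counts[(cue_pair[0], cue_pair[1], colour)] / pair_counts[cue_pair]
    print(conditional_probability(("A", "X"), "red"))    # 2/3 in this toy log
    print(conditional_probability(("B", "Y"), "green"))  # 2/3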

  2. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process

  3. Learning difficulties of senior high school students based on probability understanding levels

    Science.gov (United States)

    Anggara, B.; Priatna, N.; Juandi, D.

    2018-05-01

    Identifying students' difficulties in learning the concept of probability is important for teachers in preparing appropriate learning processes and in overcoming obstacles that may arise in subsequent lessons. This study identified the levels at which students understand the concept of probability and their difficulties with it, as part of identifying the epistemological obstacles attached to the concept. The study employed a qualitative, descriptive approach involving 55 students of class XII, using a diagnostic test of learning difficulties with the probability concept, observation, and interviews to collect the data. The data were used to determine the students' levels of understanding and the learning difficulties they experienced. From the test results and classroom observation, the mean cognitive level was found to be at level 2, indicating that students had appropriate quantitative information about the probability concept but used it incompletely or incorrectly. The difficulties identified concern constructing sample spaces, events, and mathematical models for probability problems; students also had difficulty understanding the principles of events and the prerequisite concepts.

  4. Assessing the clinical probability of pulmonary embolism

    International Nuclear Information System (INIS)

    Miniati, M.; Pistolesi, M.

    2001-01-01

    Clinical assessment is a cornerstone of the recently validated diagnostic strategies for pulmonary embolism (PE). Although the diagnostic yield of individual symptoms, signs, and common laboratory tests is limited, the combination of these variables, either by empirical assessment or by a prediction rule, can be used to express a clinical probability of PE. The latter may serve as pretest probability to predict the probability of PE after further objective testing (posterior or post-test probability). Over the last few years, attempts have been made to develop structured prediction models for PE. In a Canadian multicenter prospective study, the clinical probability of PE was rated as low, intermediate, or high according to a model which included assessment of presenting symptoms and signs, risk factors, and presence or absence of an alternative diagnosis at least as likely as PE. Recently, a simple clinical score was developed to stratify outpatients with suspected PE into groups with low, intermediate, or high clinical probability. Logistic regression was used to predict parameters associated with PE. A score ≤ 4 identified patients with low probability of whom 10% had PE. The prevalence of PE in patients with intermediate (score 5-8) and high probability (score ≥ 9) was 38 and 81%, respectively. As opposed to the Canadian model, this clinical score is standardized. The predictor variables identified in the model, however, were derived from a database of emergency ward patients. This model may, therefore, not be valid in assessing the clinical probability of PE in inpatients. In the PISA-PED study, a clinical diagnostic algorithm was developed which rests on the identification of three relevant clinical symptoms and on their association with electrocardiographic and/or radiographic abnormalities specific for PE. Among patients who, according to the model, had been rated as having a high clinical probability, the prevalence of proven PE was 97%, while it was 3
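    A minimal sketch of the score-based stratification reported above; the individual score items are not reproduced here, only the published cut-offs, with the reported PE prevalences noted as comments.

    def clinical_probability_category(score: int) -> str:
        """Map a clinical prediction score to a pretest probability category for PE."""
        if score <= 4:
            return "low"           # ~10% prevalence of PE reported
        elif score <= 8:
            return "intermediate"  # ~38% prevalence reported
        else:
            return "high"          # ~81% prevalence reported
    for s in (3, 6, 10):
        print(s, clinical_probability_category(s))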

  5. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  6. Economic choices reveal probability distortion in macaque monkeys.

    Science.gov (United States)

    Stauffer, William R; Lak, Armin; Bossaerts, Peter; Schultz, Wolfram

    2015-02-18

    Economic choices are largely determined by two principal elements, reward value (utility) and probability. Although nonlinear utility functions have been acknowledged for centuries, nonlinear probability weighting (probability distortion) was only recently recognized as a ubiquitous aspect of real-world choice behavior. Even when outcome probabilities are known and acknowledged, human decision makers often overweight low probability outcomes and underweight high probability outcomes. Whereas recent studies measured utility functions and their corresponding neural correlates in monkeys, it is not known whether monkeys distort probability in a manner similar to humans. Therefore, we investigated economic choices in macaque monkeys for evidence of probability distortion. We trained two monkeys to predict reward from probabilistic gambles with constant outcome values (0.5 ml or nothing). The probability of winning was conveyed using explicit visual cues (sector stimuli). Choices between the gambles revealed that the monkeys used the explicit probability information to make meaningful decisions. Using these cues, we measured probability distortion from choices between the gambles and safe rewards. Parametric modeling of the choices revealed classic probability weighting functions with inverted-S shape. Therefore, the animals overweighted low probability rewards and underweighted high probability rewards. Empirical investigation of the behavior verified that the choices were best explained by a combination of nonlinear value and nonlinear probability distortion. Together, these results suggest that probability distortion may reflect evolutionarily preserved neuronal processing. Copyright © 2015 Stauffer et al.
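    The inverted-S weighting described above can be illustrated with the standard one-parameter form of Tversky and Kahneman (1992); the study may have fitted a different parameterization, and gamma = 0.61 below is only the classic human estimate, not the monkeys' fitted value.

    def weight(p: float, gamma: float = 0.61) -> float:
        """Inverted-S weighting: w(p) = p**gamma / (p**gamma + (1 - p)**gamma)**(1/gamma)."""
        num = p ** gamma
        den = (p ** gamma + (1.0 - p) ** gamma) ** (1.0 / gamma)
        return num / den
    for p in (0.05, 0.25, 0.5, 0.75, 0.95):
        print(f"p={p:.2f}  w(p)={weight(p):.3f}")  # low p overweighted, high p underweighted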

  7. Evaluation of nuclear power plant component failure probability and core damage probability using simplified PSA model

    International Nuclear Information System (INIS)

    Shimada, Yoshio

    2000-01-01

    It is anticipated that changing the frequency of surveillance tests, preventive maintenance, or parts replacement of safety-related components may change component failure probabilities and, in turn, the core damage probability, and that the size of the change depends on the initiating event frequency and the component type. This study assessed the change in core damage probability using a simplified PSA model, developed by the US NRC to process accident sequence precursors, that can compute core damage probability in a short time, varying individual component failure probabilities between 0 and 1 and using either Japanese or American initiating event frequency data. The analysis showed the following. (1) The frequency of surveillance tests, preventive maintenance, or parts replacement for motor-driven pumps (high pressure injection pumps, residual heat removal pumps, auxiliary feedwater pumps) should be changed with care, since the core damage probability increases substantially when their base failure probabilities increase. (2) Core damage probability is insensitive to the surveillance test frequency for motor-operated valves and the turbine-driven auxiliary feedwater pump, since it changes little even when their failure probabilities change by about an order of magnitude. (3) When Japanese failure probability data are applied to the emergency diesel generator, the change in core damage probability is small even if the failure probability changes by an order of magnitude; with American failure probability data, however, the increase in core damage probability is large when the failure probability increases. With Japanese failure probability data, therefore, core damage probability is insensitive to changes in surveillance test frequency and similar measures. (author)
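    A toy fault-tree sketch of the kind of sensitivity study described above; this is not the NRC accident sequence precursor model, and the structure and numbers are purely illustrative.

    def core_damage_probability(p_pump: float,
                                p_support: float = 1e-3,
                                initiating_event_freq: float = 1e-2) -> float:
        """Illustrative CDP: initiator frequency times probability that mitigation fails.
        Mitigation is modelled as two redundant pump trains plus one support system."""
        p_both_pumps_fail = p_pump ** 2  # two independent redundant trains
        p_mitigation_fails = 1.0 - (1.0 - p_both_pumps_fail) * (1.0 - p_support)
        return initiating_event_freq * p_mitigation_fails
    for p in (1e-3, 1e-2, 1e-1, 0.5, 1.0):
        print(f"pump failure probability {p:g} -> CDP {core_damage_probability(p):.2e}")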

  8. Bitter (CW6)

    CSIR Research Space (South Africa)

    Estuarine and Coastal

    1981-06-01

    Full Text Available originating from the sea tend to build up the sand bar at the mouth of the Bitter, whilst the river would tend to breach it at times of flow, particularly in the winter months. Sea water probably only overtops the sandbar during exceptionally high tides...

  9. Technical difficulties. Recent health IT security breaches are unlikely to improve the public's perception about the safety of personal data.

    Science.gov (United States)

    Becker, Cinda

    2006-02-20

    Consumers who claimed in recent surveys that they were "more afraid of cyber crimes than physical crimes" may have had reason for caution. A spate of well-publicized information thefts and security breaches at healthcare organizations have eroded trust in technology, says Carol Diamond, left, of the Markle Foundation, and that could have an adverse effect on acceptance of electronic medical records.

  10. Probability

    CERN Document Server

    Shiryaev, A N

    1996-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author has included new material on the probability of large deviations and on the central limit theorem for sums of dependent random variables

  11. The Breach and the Observance : Theatre retranslation as a strategy of artistic differentiation, with special reference to retranslations of Shakespeare's Hamlet (1777-2001)

    NARCIS (Netherlands)

    Mathijssen, J.W.

    2007-01-01

    The subject of "The Breach and the Observance" is retranslation for the theatre. Besides offering a model that incorporates the findings of previous scholarship, it casts new light on the motivation behind retranslation, using the case of translations of Shakespeare's Hamlet on the Dutch stage.

  12. Recent trends in the probability of high out-of-pocket medical expenses in the United States

    Directory of Open Access Journals (Sweden)

    Katherine E Baird

    2016-09-01

    Full Text Available Objective: This article measures the probability that out-of-pocket expenses in the United States exceed a threshold share of income. It calculates this probability separately by individuals’ health condition, income, and elderly status and estimates changes occurring in these probabilities between 2010 and 2013. Data and Method: This article uses nationally representative household survey data on 344,000 individuals. Logistic regressions estimate the probabilities that out-of-pocket expenses exceed 5% and alternatively 10% of income in the two study years. These probabilities are calculated for individuals based on their income, health status, and elderly status. Results: Despite favorable changes in both health policy and the economy, large numbers of Americans continue to be exposed to high out-of-pocket expenditures. For instance, the results indicate that in 2013 over a quarter of nonelderly low-income citizens in poor health spent 10% or more of their income on out-of-pocket expenses, and over 40% of this group spent more than 5%. Moreover, for Americans as a whole, the probability of spending in excess of 5% of income on out-of-pocket costs increased by 1.4 percentage points between 2010 and 2013, with the largest increases occurring among low-income Americans; the probability of Americans spending more than 10% of income grew from 9.3% to 9.6%, with the largest increases also occurring among the poor. Conclusion: The magnitude of out-of-pocket’s financial burden and the most recent upward trends in it underscore a need to develop good measures of the degree to which health care policy exposes individuals to financial risk, and to closely monitor the Affordable Care Act’s success in reducing Americans’ exposure to large medical bills.
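    A sketch of the kind of logistic model described above, using synthetic data and hypothetical covariates only; the actual study used nationally representative survey data and a richer specification.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    rng = np.random.default_rng(0)
    n = 5000
    low_income = rng.integers(0, 2, n)   # 1 = low-income household
    poor_health = rng.integers(0, 2, n)  # 1 = fair/poor self-reported health
    elderly = rng.integers(0, 2, n)      # 1 = aged 65+
    # Synthetic outcome: exceeding 5% of income is made more likely for these groups.
    logit = -2.0 + 1.2 * low_income + 1.0 * poor_health + 0.5 * elderly
    exceeds_5pct = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))
    X = np.column_stack([low_income, poor_health, elderly])
    model = LogisticRegression().fit(X, exceeds_5pct)
    # Predicted probability for a non-elderly, low-income person in poor health.
    print(model.predict_proba([[1, 1, 0]])[0, 1])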

  13. Transitional Probabilities Are Prioritized over Stimulus/Pattern Probabilities in Auditory Deviance Detection: Memory Basis for Predictive Sound Processing.

    Science.gov (United States)

    Mittag, Maria; Takegata, Rika; Winkler, István

    2016-09-14

    Representations encoding the probabilities of auditory events do not directly support predictive processing. In contrast, information about the probability with which a given sound follows another (transitional probability) allows predictions of upcoming sounds. We tested whether behavioral and cortical auditory deviance detection (the latter indexed by the mismatch negativity event-related potential) relies on probabilities of sound patterns or on transitional probabilities. We presented healthy adult volunteers with three types of rare tone-triplets among frequent standard triplets of high-low-high (H-L-H) or L-H-L pitch structure: proximity deviant (H-H-H/L-L-L), reversal deviant (L-H-L/H-L-H), and first-tone deviant (L-L-H/H-H-L). If deviance detection was based on pattern probability, reversal and first-tone deviants should be detected with similar latency because both differ from the standard at the first pattern position. If deviance detection was based on transitional probabilities, then reversal deviants should be the most difficult to detect because, unlike the other two deviants, they contain no low-probability pitch transitions. The data clearly showed that both behavioral and cortical auditory deviance detection uses transitional probabilities. Thus, the memory traces underlying cortical deviance detection may provide a link between stimulus probability-based change/novelty detectors operating at lower levels of the auditory system and higher auditory cognitive functions that involve predictive processing. Our research presents the first definite evidence for the auditory system prioritizing transitional probabilities over probabilities of individual sensory events. Forming representations for transitional probabilities paves the way for predictions of upcoming sounds. Several recent theories suggest that predictive processing provides the general basis of human perception, including important auditory functions, such as auditory scene analysis. Our
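    The contrast between the two statistics can be made concrete with a short sketch: pattern probability counts whole triplets, whereas transitional probability counts one pitch following another. The stimulus stream below is hypothetical and far shorter than the actual experiment.

    from collections import Counter
    from itertools import pairwise  # Python 3.10+
    # Hypothetical stream: frequent H-L-H standards plus one rare reversal deviant.
    triplets = ["HLH"] * 9 + ["LHL"]
    stream = [pitch for t in triplets for pitch in t]
    pattern_counts = Counter(triplets)
    transition_counts = Counter(pairwise(stream))
    origin_counts = Counter(a for a, _ in pairwise(stream))
    print("P(pattern = LHL) =", pattern_counts["LHL"] / len(triplets))
    print("P(L -> H)        =", transition_counts[("L", "H")] / origin_counts["L"])
    print("P(H -> L)        =", transition_counts[("H", "L")] / origin_counts["H"])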

  14. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma

  15. Analytic Neutrino Oscillation Probabilities in Matter: Revisited

    Energy Technology Data Exchange (ETDEWEB)

    Parke, Stephen J. [Fermilab; Denton, Peter B. [Copenhagen U.; Minakata, Hisakazu [Madrid, IFT

    2018-01-02

    We summarize our recent paper on neutrino oscillation probabilities in matter, explaining the importance, relevance and need for simple, highly accurate approximations to the neutrino oscillation probabilities in matter.

  16. Striatal activity is modulated by target probability.

    Science.gov (United States)

    Hon, Nicholas

    2017-06-14

    Target probability has well-known neural effects. In the brain, target probability is known to affect frontal activity, with lower probability targets producing more prefrontal activation than those that occur with higher probability. Although the effect of target probability on cortical activity is well specified, its effect on subcortical structures such as the striatum is less well understood. Here, I examined this issue and found that the striatum was highly responsive to target probability. This is consistent with its hypothesized role in the gating of salient information into higher-order task representations. The current data are interpreted in light of that fact that different components of the striatum are sensitive to different types of task-relevant information.

  17. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin

  18. Breached fuel location in FFTF by delayed neutron monitor triangulation

    International Nuclear Information System (INIS)

    Bunch, W.L.; Tang, E.L.

    1985-10-01

    The Fast Flux Test Facility (FFTF) features a three-loop, sodium-cooled 400 MWt mixed oxide fueled reactor designed for the irradiation testing of fuels and materials for use in liquid metal cooled fast reactors. To establish the ultimate capability of a particular fuel design and thereby generate information that will lead to improvements, many of the fuel irradiations are continued until a loss of cladding integrity (failure) occurs. When the cladding fails, fission gas escapes from the fuel pin and enters the reactor cover gas system. If the cladding failure permits the primary sodium to come in contact with the fuel, recoil fission products can enter the sodium. The presence of recoil fission products in the sodium can be detected by monitoring for the presence of delayed neutrons in the coolant. It is the present philosophy to not operate FFTF when a failure has occurred that permits fission fragments to enter the sodium. Thus, it is important that the identity and location of the fuel assembly that contains the failed cladding be established in order that it might be removed from the core. This report discusses the method used to locate a fuel assembly with breached cladding by triangulation of the delayed neutron monitor signals

  19. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  20. Probability an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

    Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h

  1. Starlings uphold principles of economic rationality for delay and probability of reward.

    Science.gov (United States)

    Monteiro, Tiago; Vasconcelos, Marco; Kacelnik, Alex

    2013-04-07

    Rationality principles are the bedrock of normative theories of decision-making in biology and microeconomics, but whereas in microeconomics, consistent choice underlies the notion of utility; in biology, the assumption of consistent selective pressures justifies modelling decision mechanisms as if they were designed to maximize fitness. In either case, violations of consistency contradict expectations and attract theoretical interest. Reported violations of rationality in non-humans include intransitivity (i.e. circular preferences) and lack of independence of irrelevant alternatives (changes in relative preference between options when embedded in different choice sets), but the extent to which these observations truly represent breaches of rationality is debatable. We tested both principles with starlings (Sturnus vulgaris), training subjects either with five options differing in food delay (exp. 1) or with six options differing in reward probability (exp. 2), before letting them choose repeatedly one option out of several binary and trinary sets of options. The starlings conformed to economic rationality on both tests, showing strong stochastic transitivity and no violation of the independence principle. These results endorse the rational choice and optimality approaches used in behavioural ecology, and highlight the need for functional and mechanistic enquiring when apparent violations of such principles are observed.

  2. Climate drives inter-annual variability in probability of high severity fire occurrence in the western United States

    Science.gov (United States)

    Keyser, Alisa; Westerling, Anthony LeRoy

    2017-05-01

    A long history of fire suppression in the western United States has significantly changed forest structure and ecological function, leading to increasingly uncharacteristic fires in terms of size and severity. Prior analyses of fire severity in California forests showed that time since last fire and fire weather conditions predicted fire severity very well, while a larger regional analysis showed that topography and climate were important predictors of high severity fire. There has not yet been a large-scale study that incorporates topography, vegetation and fire-year climate to determine regional scale high severity fire occurrence. We developed models to predict the probability of high severity fire occurrence for the western US. We predict high severity fire occurrence with some accuracy, and identify the relative importance of predictor classes in determining the probability of high severity fire. The inclusion of both vegetation and fire-year climate predictors was critical for model skill in identifying fires with high fractional fire severity. The inclusion of fire-year climate variables allows this model to forecast inter-annual variability in areas at future risk of high severity fire, beyond what slower-changing fuel conditions alone can accomplish. This allows for more targeted land management, including resource allocation for fuels reduction treatments to decrease the risk of high severity fire.

  3. LMX, Breach Perceptions, Work-Family Conflict, and Well-Being: A Mediational Model.

    Science.gov (United States)

    Hill, Rachel T; Morganson, Valerie J; Matthews, Russell A; Atkinson, Theresa P

    2016-01-01

    Despite research advances, work-family scholars still lack an understanding of how leadership constructs relate to an employee's ability to effectively manage the work-family interface. In addition, there remains a need to examine the process through which leadership and work-family conflict influence well-being outcomes. Using a sample of 312 workers, a mediated process model grounded in social exchange theory is tested wherein the authors seek to explain how leaders shape employee perceptions, which, in turn, impact organizational fulfillment of expectations (i.e., psychological contract breach), work-family conflict, and well-being. A fully latent structural equation model was used to test study hypotheses, all of which were supported. Building on existing theory, findings suggest that the supervisor plays a critical role as a frontline representative for the organization and that work-family conflict is reduced and well-being enhanced through a process of social exchange between the supervisor and worker.

  4. Modifications to the HIPAA Privacy, Security, Enforcement, and Breach Notification rules under the Health Information Technology for Economic and Clinical Health Act and the Genetic Information Nondiscrimination Act; other modifications to the HIPAA rules.

    Science.gov (United States)

    2013-01-25

    The Department of Health and Human Services (HHS or ``the Department'') is issuing this final rule to: Modify the Health Insurance Portability and Accountability Act (HIPAA) Privacy, Security, and Enforcement Rules to implement statutory amendments under the Health Information Technology for Economic and Clinical Health Act (``the HITECH Act'' or ``the Act'') to strengthen the privacy and security protection for individuals' health information; modify the rule for Breach Notification for Unsecured Protected Health Information (Breach Notification Rule) under the HITECH Act to address public comment received on the interim final rule; modify the HIPAA Privacy Rule to strengthen the privacy protections for genetic information by implementing section 105 of Title I of the Genetic Information Nondiscrimination Act of 2008 (GINA); and make certain other modifications to the HIPAA Privacy, Security, Breach Notification, and Enforcement Rules (the HIPAA Rules) to improve their workability and effectiveness and to increase flexibility for and decrease burden on the regulated entities.

  5. Probability concepts in quality risk management.

    Science.gov (United States)

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk is generally a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management tools are relatively silent on the meaning and uses of "probability." The probability concept is typically applied by risk managers as a combination of frequency-based calculation and a "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples from the most general, scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management. Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although "risk" generally describes a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management methodologies and respective tools focus on managing severity but are relatively silent on the in-depth meaning and uses of "probability." Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, marketing to product discontinuation. A probability concept is typically applied by risk managers as a combination of data-based measures of probability and a subjective "degree of belief" meaning of probability. Probability as

  6. Approximation of rejective sampling inclusion probabilities and application to high order correlations

    NARCIS (Netherlands)

    Boistard, H.; Lopuhää, H.P.; Ruiz-Gazen, A.

    2012-01-01

    This paper is devoted to rejective sampling. We provide an expansion of joint inclusion probabilities of any order in terms of the inclusion probabilities of order one, extending previous results by Hájek (1964) and Hájek (1981) and making the remainder term more precise. Following Hájek (1981), the

  7. Toward a generalized probability theory: conditional probabilities

    International Nuclear Information System (INIS)

    Cassinelli, G.

    1979-01-01

    The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)

  8. A 'new' Cromer-related high frequency antigen probably antithetical to WES.

    Science.gov (United States)

    Daniels, G L; Green, C A; Darr, F W; Anderson, H; Sistonen, P

    1987-01-01

    An antibody to a high frequency antigen, made in a WES+ Black antenatal patient (Wash.), failed to react with the red cells of a presumed WES+ homozygote and is, therefore, probably antithetical to anti-WES. Like anti-WES, it reacted with papain, ficin, trypsin or neuraminidase treated cells but not with alpha-chymotrypsin or pronase treated cells and was specifically inhibited by concentrated serum. It also reacted more strongly in titration with WES- cells than with WES+ cells. The antibody is Cromer-related as it failed to react with Inab phenotype (IFC-) cells and reacted only weakly with Dr(a-) cells. Wash. cells and those of the other possible WES+ homozygote are Cr(a+) Tc(a+b-c-) Dr(a+) IFC+ but reacted only very weakly with anti-Esa.

  9. The Suitability of the Remedy of Specific Performance to Breach of a "Player's Contract" with Specific Reference to the Mapoe and Santos cases

    Directory of Open Access Journals (Sweden)

    K Mould

    2011-04-01

    Full Text Available During the 1990s, rugby union formation in the Republic of South Africa developed rapidly from a system of strict amateurism to one of professionalism. Professional participants in the sport received salaries for participation, and rugby became a business like any other. As in all forms of business, rugby had to be regulated more efficiently than had previously been the case. Tighter regulations were instituted by governing bodies, and ultimately labour legislation became applicable to professional rugby. A professional sportsman or woman participating in a team sport is generally considered an employee. This means that the same principles that govern employees in general should also apply to professional sportsmen and women. The exact nature of the "player's contract", a term generally used to describe the contract of employment between a professional sportsman or sportswoman and his or her employer, deserves closer attention. It has been argued with much merit that the "player's contract", while in essence a contract of employment, possesses certain sui generis characteristics. The first aim of this article is to demonstrate how this statement is in fact a substantial one. If it is concluded that the "player's contract" is in fact a sui generis contract of employment, the most suitable remedy in case of breach of contract must be determined. The second aim of this article is to indicate why the remedy of specific performance, which is generally not granted in cases where the defaulting party has to provide services of a personal nature, is the most suitable remedy in case of breach of "player's contracts". To substantiate this statement, recent applicable case law is investigated and discussed, particularly the recent case of Vrystaat Cheetahs (Edms) Beperk v Mapoe. Suggestions are finally offered as to how breach of "player's contracts" should be approached by South African courts in future.

  10. Advanced RESTART method for the estimation of the probability of failure of highly reliable hybrid dynamic systems

    International Nuclear Information System (INIS)

    Turati, Pietro; Pedroni, Nicola; Zio, Enrico

    2016-01-01

    The efficient estimation of system reliability characteristics is of paramount importance for many engineering applications. Real-world system reliability modeling calls for the capability of treating systems that are: i) dynamic, ii) complex, iii) hybrid and iv) highly reliable. Advanced Monte Carlo (MC) methods offer a way to solve these types of problems that remains feasible despite the potentially high computational costs. In this paper, the REpetitive Simulation Trials After Reaching Thresholds (RESTART) method is employed, extending it to hybrid systems for the first time (to the authors' knowledge). The estimation accuracy and precision of RESTART highly depend on the choice of the Importance Function (IF) indicating how close the system is to failure: in this respect, proper IFs are here originally proposed to improve the performance of RESTART for the analysis of hybrid systems. The resulting overall simulation approach is applied to estimate the probability of failure of the control system of a liquid hold-up tank and of a pump-valve subsystem subject to degradation induced by fatigue. The results are compared to those obtained by standard MC simulation and by RESTART with classical IFs available in the literature. The comparison shows the improvement in performance obtained by our approach. - Highlights: • We consider the issue of estimating small failure probabilities in dynamic systems. • We employ the RESTART method to estimate the failure probabilities. • New Importance Functions (IFs) are introduced to increase the method performance. • We adopt two dynamic, hybrid, highly reliable systems as case studies. • A comparison with literature IFs proves the effectiveness of the new IFs.
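    A minimal splitting sketch in the spirit of RESTART, with a single intermediate threshold rather than the full multi-threshold scheme; this is not the authors' hybrid-system implementation. The rare event here is simply a Gaussian random walk exceeding a high level, and the importance function is the current level of the walk. All numerical values are illustrative.

    import random
    N, a, b = 100, 20.0, 35.0    # horizon, intermediate threshold, rare-event level
    n0, n_retrials = 5000, 50    # stage-1 paths and retrials per entrance state
    def step():
        return random.gauss(0.0, 1.0)
    # Stage 1: fraction of paths that ever reach level a; store the entrance states.
    entrance_states = []
    for _ in range(n0):
        x = 0.0
        for k in range(N):
            x += step()
            if x >= a:
                entrance_states.append((k + 1, x))  # (steps used, level at crossing)
                break
    p1 = len(entrance_states) / n0
    # Stage 2: from each entrance state, estimate P(reach b | reached a) by retrials.
    hits, trials = 0, 0
    for k0, x0 in entrance_states:
        for _ in range(n_retrials):
            trials += 1
            x = x0
            for k in range(k0, N):
                x += step()
                if x >= b:
                    hits += 1
                    break
    p2 = hits / trials if trials else 0.0
    print(f"splitting estimate of P(max over {N} steps >= {b}): {p1 * p2:.2e}")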

  11. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko, [No Value

    2004-01-01

    Using a simple relation between the Dirac delta-function and the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  12. Breaching of strike-slip faults and flooding of pull-apart basins to form the southern Gulf of California seaway from 8 to 6 Ma

    Science.gov (United States)

    Umhoefer, P. J.; Skinner, L. A.; Oskin, M. E.; Dorsey, R. J.; Bennett, S. E. K.; Darin, M. H.

    2017-12-01

    Studies from multiple disciplines delineate the development of the oblique-divergent Pacific - North America plate boundary in the southern Gulf of California. Integration of onshore data from the Loreto - Santa Rosalia margin with offshore data from the Pescadero, Farallon, and Guaymas basins provides a detailed geologic history. Our GIS-based paleotectonic maps of the plate boundary from 9 to 6 Ma show that evolution of pull-apart basins led to the episodic northwestward encroachment of the Gulf of California seaway. Because adjacent pull-apart basins commonly have highlands between them, juxtaposition of adjacent basin lows during translation and pull apart lengthening played a critical role in seaway flooding. Microfossils and volcanic units date the earliest marine deposits at 9(?) - 8 Ma at the mouth of the Gulf. By ca. 8 Ma, the seaway had flooded north to the Pescadero basin, while the Loreto fault and the related fault-termination basin was proposed to have formed along strike at the plate margin. East of Loreto basin, a short topographic barrier between the Pescadero and Farallon pull-apart basins suggests that the Farallon basin was either a terrestrial basin, or if breaching occurred, it may contain 8 Ma salt or marine deposits. This early southern seaway formed along a series of pull-apart basins within a narrow belt of transtension structurally similar to the modern Walker Lane in NV and CA. At ca. 7 Ma, a series of marine incursions breached a 75-100 km long transtensional fault barrier between the Farallon and Guaymas basins offshore Bahía Concepción. Repeated breaching events and the isolation of the Guaymas basin in a subtropical setting formed a 2 km-thick salt deposit imaged in offshore seismic data, and thin evaporite deposits in the onshore Santa Rosalia basin. Lengthening of the Guaymas, Yaqui, and Tiburon basins caused breaches of the intervening Guaymas and Tiburón transforms by 6.5-6.3 Ma, forming a permanent 1500 km-long marine seaway

  13. Development of risk assessment simulation tool for optimal control of a low probability-high consequence disaster

    International Nuclear Information System (INIS)

    Yotsumoto, Hiroki; Yoshida, Kikuo; Genchi, Hiroshi

    2011-01-01

    In order to control a low probability-high consequence disaster that causes huge social and economic damage, it is necessary to develop a simultaneous risk assessment simulation tool based on a scheme of disaster risk that includes the diverse effects of the primary disaster and secondary damages. We propose the scheme of this risk simulation tool. (author)

  14. COVAL, Compound Probability Distribution for Function of Probability Distribution

    International Nuclear Information System (INIS)

    Astolfi, M.; Elbaz, J.

    1979-01-01

    1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions
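    The underlying problem, the distribution of a function of random variables, can be sketched with a plain Monte Carlo approximation; COVAL itself performs a numerical transformation of the input distributions, and the strength/load models and parameters below are illustrative only.

    import random
    def sample_margin():
        strength = random.lognormvariate(mu=3.0, sigma=0.1)  # illustrative strength model
        load = random.gauss(mu=15.0, sigma=3.0)              # illustrative random load
        return strength - load
    samples = [sample_margin() for _ in range(200_000)]
    failure_probability = sum(m < 0 for m in samples) / len(samples)
    print(f"estimated P(margin < 0) = {failure_probability:.4f}")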

  15. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.
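    A tiny numeric illustration of the definition, with hypothetical numbers: a normal model for a quantity that the evidence E says is strictly positive still assigns mass to the impossible values.

    from statistics import NormalDist
    model = NormalDist(mu=2.0, sigma=1.5)  # hypothetical fitted prediction for y > 0
    leakage = model.cdf(0.0)               # probability assigned to impossible values y <= 0
    print(f"probability leakage = {leakage:.3f}")  # about 0.091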

  16. Intelligent tutorial system for teaching of probability and statistics at high school in Mexico

    Directory of Open Access Journals (Sweden)

    Fernando Gudino Penaloza, Miguel Gonzalez Mendoza, Neil Hernandez Gress, Jaime Mora Vargas

    2009-12-01

    Full Text Available This paper describes the implementation of an intelligent tutoring system dedicated to teaching probability and statistics at the preparatory school (or high school) in Mexico. The system was first deployed on desktop computers and then adapted to a mobile environment for mobile learning, or m-learning. The system is designed to adapt to the needs of each student and supports three different teaching models that meet the criteria of three student profiles.

  17. Duty to warn of genetic harm in breach of patient confidentiality.

    Science.gov (United States)

    Keeling, Sharon L

    2004-11-01

    Harm caused by the failure of health professionals to warn an at-risk genetic relative of her or his risk is genetic harm. Genetic harm should be approached using the usual principles of negligence. When these principles are applied, it is shown that (a) genetic harm is foreseeable; (b) the salient features of vulnerability, the health professional's knowledge of the risk to the genetic relative and the determinancy of the affected class and individual result in a duty of care being owed to the genetic relative; (c) the standard of care required to fulfil the duty to warn should be the expectations of a reasonable person in the position of the relative; and (d) causation is satisfied as the harm is caused by the failure of intervention of the health professional. Legislation enacted subsequent to the Report of the Commonwealth of Australia, Panel of Eminent Persons (Chair D Ipp), Review of the Law of Negligence Report (2002) and relevant to a duty to warn of genetic harm is considered. The modes of regulation and penalties for breach of any future duty to warn of genetic harm are considered.

  18. Influence of the Probability Level on the Framing Effect

    Directory of Open Access Journals (Sweden)

    Kaja Damnjanovic

    2016-11-01

    Full Text Available Research on the framing effect in risky choice has mostly used tasks that examine the effect of a single probability or risk level on the choice between a non-risky and a risky option. The present research examined the framing effect as a function of the probability level attached to the outcome of the risky option in three decision-making domains: health, money and human lives. It confirmed that the decision-making domain moderates the framing effect. In the monetary domain, the general risk aversion registered in earlier research was confirmed. At high probability levels, the framing effect was registered in both frames, while no framing effect was registered at lower probability levels. In the domain of decisions about human lives, the framing effect was registered at medium-high and medium-low probability levels. In the domain of decisions about health, the framing effect was registered across almost the entire probability range, and this domain differed from the other two. The results show that the attitude to risk is not identical at different probability levels, that the dynamics of the attitude to risk influences the framing effect, and that the framing effect pattern differs across decision-making domains. In other words, the linguistic manipulation that constitutes the frame affects the change in preference order only when the possibility of gain (expressed as a probability) is estimated as sufficiently high.

  19. Demographic characteristics and clinical predictors of patients discharged from university hospital-affiliated pain clinic due to breach in narcotic use contract.

    Science.gov (United States)

    Chakrabortty, Shushovan; Gupta, Deepak; Rustom, David; Berry, Hussein; Rai, Ajit

    2014-01-01

    The current retrospective study aimed to identify the demographic characteristics and clinical predictors (if any) of patients discharged from our pain clinic due to a breach in narcotic use contract (BNUC). Retrospective patient chart review and data audit. University hospital-affiliated pain clinic in the United States. All patient charts in our pain clinic for a 2-year period (2011-2012). The patients with BNUC were distinguished from the patients who had not been discharged from our pain clinic. Pain characteristics, pain management, and substance abuse status were compared for each patient with BNUC between the time of admission and the time of discharge. The factors leading to BNUC discharges varied significantly among the pain physicians within this single pain clinic model, with the variability depending on their years of experience and their proactive interventional pain management. The patients with BNUC in our pain clinic setting were primarily middle-aged, obese, unmarried males with no documented stable occupational history who were receiving only non-interventional pain management. Substance abuse, doctor shopping, and potential diversion were the top three documented reasons for BNUC discharges. In 2011-2012, our pain clinic discharged 1 in 16 patients due to a breach in narcotic use contract.

  20. Probability and containment of turbine missiles

    International Nuclear Information System (INIS)

    Yeh, G.C.K.

    1976-01-01

    With the trend toward ever larger power generating plants with large high-speed turbines, an important plant design consideration is the potential for and consequences of mechanical failure of turbine rotors. Such rotor failure could result in high-velocity disc fragments (turbine missiles) perforating the turbine casing and jeopardizing vital plant systems. The designer must first estimate the probability of any turbine missile damaging any safety-related plant component for his turbine and his plant arrangement. If the probability is not low enough to be acceptable to the regulatory agency, he must design a shield to contain the postulated turbine missiles. Alternatively, the shield could be designed to retard (to reduce the velocity of) the missiles such that they would not damage any vital plant system. In this paper, some of the presently available references that can be used to evaluate the probability, containment and retardation of turbine missiles are reviewed; various alternative methods are compared; and subjects for future research are recommended. (Auth.)

  1. Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces

    International Nuclear Information System (INIS)

    Vourdas, A.

    2014-01-01

    The orthocomplemented modular lattice of subspaces L[H(d)], of a quantum system with d-dimensional Hilbert space H(d), is considered. A generalized additivity relation which holds for Kolmogorov probabilities is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H1, H2), which quantifies deviations from Kolmogorov probability theory, is introduced, and it is shown to be intimately related to the commutator of the projectors P(H1), P(H2) onto the subspaces H1, H2. As an application, it is shown that the proof of the inequalities of Clauser, Horne, Shimony, and Holt for a system of two spin-1/2 particles is valid for Kolmogorov probabilities, but it is not valid for Dempster-Shafer probabilities. The violation of these inequalities in experiments supports the interpretation of quantum probabilities as Dempster-Shafer probabilities

  2. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  3. Failure probability analysis of optical grid

    Science.gov (United States)

    Zhong, Yaoquan; Guo, Wei; Sun, Weiqiang; Jin, Yaohui; Hu, Weisheng

    2008-11-01

    Optical grid, an integrated computing environment based on optical networks, is expected to be an efficient infrastructure for supporting advanced data-intensive grid applications. In an optical grid, faults of both computational and network resources are inevitable due to the large scale and high complexity of the system. As optical-network-based distributed computing systems are applied extensively to data processing, the application failure probability has become an important indicator of application quality and an important consideration for operators. This paper presents a task-based method for analyzing the application failure probability in an optical grid. The failure probability of the entire application can then be quantified, and the reduction in application failure probability achieved by different backup strategies can be compared, so that the different failure-probability requirements of different clients can be satisfied. When an application described by a DAG (directed acyclic graph) is executed under different backup strategies, the application failure probability and the application completion time differ. This paper proposes a new multi-objective differentiated services algorithm (MDSA). The new application scheduling algorithm can guarantee the required failure probability and improve network resource utilization, realizing a compromise between the network operator and the application submitter. Differentiated services can thus be achieved in an optical grid.
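    A sketch of the task-based calculation in its simplest form, illustrative only and not the MDSA algorithm: the application fails if any task in its DAG fails, and assigning a backup resource to a task means the task fails only if both primary and backup fail. All failure probabilities below are hypothetical.

    def task_failure(p_primary: float, p_backup: float | None = None) -> float:
        """Failure probability of one task, assuming independent resource failures."""
        return p_primary if p_backup is None else p_primary * p_backup
    def application_failure(task_probs: list[float]) -> float:
        """The application fails if at least one of its (independent) tasks fails."""
        ok = 1.0
        for p in task_probs:
            ok *= (1.0 - p)
        return 1.0 - ok
    no_backup = [task_failure(0.02) for _ in range(5)]
    with_backup = [task_failure(0.02, 0.02) for _ in range(5)]  # back up every task
    print(f"no backups:      {application_failure(no_backup):.4f}")   # ~0.096
    print(f"all backed up:   {application_failure(with_backup):.6f}") # ~0.002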

  4. The Reluctance of Civil Law Countries in Adopting “the Without Breach of Peace” Standard of UCC Article 9: Evidence from National and International Legal Instruments Governing Secured Transactions

    DEFF Research Database (Denmark)

    Gikay, Asress Adimi; Stanescu, Catalin Gabriel

    2017-01-01

    been shaped by courts on a case-by-case basis. In reforming their secured transactions laws and to enhance access to credit, continental legal systems have shown great reception to Article 9 by adopting the unitary concept and functional approach to security interests, introducing private enforcement ... This article concludes that the alternatives to the "without breach of peace" standard prevailing in continental legal systems undermine the privilege of the secured creditor, pose enforcement problems (such as uncertainty of creditors' rights and possible abuses against consumer debtors), and restrain out ... international legal instruments), this article demonstrates that continental European legal systems are generally apprehensive of the "without breach of peace" standard. Thus, they are reluctant to transplant it into their legislation and try to either modify it or replace it with different legal requirements ...

  5. Robust estimation of the expected survival probabilities from high-dimensional Cox models with biomarker-by-treatment interactions in randomized clinical trials

    Directory of Open Access Journals (Sweden)

    Nils Ternès

    2017-05-01

    Full Text Available Abstract Background Thanks to the advances in genomics and targeted treatments, more and more prediction models based on biomarkers are being developed to predict potential benefit from treatments in a randomized clinical trial. Although the methodological framework for the development and validation of prediction models in a high-dimensional setting is becoming more and more established, no clear guidance exists yet on how to estimate expected survival probabilities in a penalized model with biomarker-by-treatment interactions. Methods Based on a parsimonious biomarker selection in a penalized high-dimensional Cox model (lasso or adaptive lasso), we propose a unified framework to: estimate internally the predictive accuracy metrics of the developed model (using double cross-validation); estimate the individual survival probabilities at a given timepoint; construct confidence intervals thereof (analytical or bootstrap); and visualize them graphically (pointwise or smoothed with splines). We compared these strategies through a simulation study covering scenarios with or without biomarker effects. We applied the strategies to a large randomized phase III clinical trial that evaluated the effect of adding trastuzumab to chemotherapy in 1574 early breast cancer patients, for which the expression of 462 genes was measured. Results In our simulations, penalized regression models using the adaptive lasso estimated the survival probability of new patients with low bias and standard error; bootstrapped confidence intervals had empirical coverage probability close to the nominal level across very different scenarios. The double cross-validation performed on the training data set closely mimicked the predictive accuracy of the selected models in external validation data. We also propose a useful visual representation of the expected survival probabilities using splines. In the breast cancer trial, the adaptive lasso penalty selected a prediction model with 4
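    The final step, turning a fitted Cox model into an individual survival probability S(t | x) = S0(t)**exp(x'beta), can be sketched as follows. The coefficients, baseline survival and covariates are hypothetical, and the penalized (lasso) fitting and double cross-validation steps are not shown.

    import numpy as np
    beta = np.array([0.8, -0.5, 0.3])                    # hypothetical selected biomarker effects
    treatment_interaction = np.array([-0.6, 0.0, 0.2])   # hypothetical biomarker-by-treatment terms
    S0_at_5y = 0.70                                      # hypothetical baseline survival at 5 years
    def survival_probability(x, treated: bool) -> float:
        """S(5y | x, treatment) under a proportional-hazards model with interactions."""
        linear_predictor = x @ beta
        if treated:
            linear_predictor += x @ treatment_interaction
        return S0_at_5y ** np.exp(linear_predictor)
    x_patient = np.array([1.0, 0.2, -0.5])
    print("control:", round(survival_probability(x_patient, treated=False), 3))
    print("treated:", round(survival_probability(x_patient, treated=True), 3))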

  6. The Risk of Goods in International Sales. An Approach from the Breach of Contract and Remedies of the Buyer

    Directory of Open Access Journals (Sweden)

    Álvaro Vidal Olivares

    2016-12-01

    Full Text Available This article refers to the regime risks in the CISG with the aim of showing that the regime that it is incorporated, is based on functional criteria to the interests of the parties to solve the problems that originated in the loss and prior to the transfer of risk to the buyer will have recourse to remedies system and thus achieve the connection with the breach of contract. In its development we used the dogmatic method, from the systematic analysis of the rules of the CISG, doctrine and case law.

  7. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  8. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
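
    One classic example of this phenomenon (chosen here for illustration; it is not necessarily one of the article's three problems) is the probability that a random permutation leaves no item in its original position, which tends to 1/e and is easy to check by simulation:

    ```python
    # Monte Carlo check that a random permutation has no fixed point with
    # probability close to 1/e.
    import math
    import random

    def has_no_fixed_point(n):
        perm = list(range(n))
        random.shuffle(perm)
        return all(perm[i] != i for i in range(n))

    trials, n = 100_000, 20
    estimate = sum(has_no_fixed_point(n) for _ in range(trials)) / trials
    print(estimate, 1 / math.e)  # both close to 0.3679
    ```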

  9. Determining probabilities of geologic events and processes

    International Nuclear Information System (INIS)

    Hunter, R.L.; Mann, C.J.; Cranwell, R.M.

    1985-01-01

    The Environmental Protection Agency has recently published a probabilistic standard for releases of high-level radioactive waste from a mined geologic repository. The standard sets limits for contaminant releases with more than one chance in 100 of occurring within 10,000 years, and less strict limits for releases of lower probability. The standard offers no methods for determining probabilities of geologic events and processes, and no consensus exists in the waste-management community on how to do this. Sandia National Laboratories is developing a general method for determining probabilities of a given set of geologic events and processes. In addition, we will develop a repeatable method for dealing with events and processes whose probability cannot be determined. 22 refs., 4 figs

  10. Serial follow up V/P scanning in assessment of treatment response in high probability scans for pulmonary embolism

    Energy Technology Data Exchange (ETDEWEB)

    Moustafa, H; Elhaddad, SH; Wagih, SH; Ziada, G; Samy, A; Saber, R [Department of nuclear medicine and radiology, faculty of medicine, Cairo university, Cairo, (Egypt)

    1995-10-01

    138 patients were shown by V/P scanning to have different probabilities of a pulmonary embolic event. Serial follow-up scanning was performed after 3 days, 2 weeks, 1 month and 3 months, with anticoagulant therapy. Of the remaining 10 patients, 6 died with P.E. documented by post-mortem study, and loss to follow-up was recorded in 4 patients. Complete response, with disappearance of all perfusion defects after 2 weeks, was detected in 37 patients (49.3%); partial improvement of lesions after 3 months was elicited in 32%. The overall response rate was 81.3%; the response was complete in the low probability group (100%), 84.2% in the intermediate group and 79.3% in the high probability group, with partial response in 45.3%. New lesions were evident in 18.7% of this series. In conclusion, serial follow-up V/P scanning is mandatory for evaluating the response to anticoagulant therapy, especially in the first 3 months. 2 figs., 3 tabs.

  11. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    Full Text Available The article deals with the probability analysis of a vibration isolation system for high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Assuming a Gaussian distribution, the author estimates the probability that the relative displacement of the isolated mass stays below the vibration criteria. The problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. From this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated. Optimal system parameters - damping and natural frequency - are derived so that the probability of exceeding vibration criteria VC-E and VC-D is kept below 0.04.
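
    The underlying calculation, the chance that a zero-mean Gaussian response exceeds a vibration criterion, is straightforward to reproduce; the numbers below are placeholders rather than the article's values.

    ```python
    # Probability that a zero-mean Gaussian relative displacement exceeds a
    # vibration criterion (illustrative numbers, not the article's data).
    from scipy.stats import norm

    sigma = 0.4e-6      # RMS relative displacement of the isolated mass, in metres
    criterion = 1.0e-6  # vibration criterion treated as an amplitude limit, in metres

    p_exceed = 2 * norm.sf(criterion, loc=0.0, scale=sigma)  # two-sided exceedance
    print(f"probability of exceeding the criterion: {p_exceed:.4f}")  # about 0.012
    ```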

  12. Measurements of atomic transition probabilities in highly ionized atoms by fast ion beams

    International Nuclear Information System (INIS)

    Martinson, I.; Curtis, L.J.; Lindgaerd, A.

    1977-01-01

    A summary is given of the beam-foil method by which level lifetimes and transition probabilities can be determined in atoms and ions. Results are presented for systems of particular interest for fusion research, such as the Li, Be, Na, Mg, Cu and Zn isoelectronic sequences. The available experimental material is compared to theoretical transition probabilities. (author)

  13. Oil spill contamination probability in the southeastern Levantine basin.

    Science.gov (United States)

    Goldman, Ron; Biton, Eli; Brokovich, Eran; Kark, Salit; Levin, Noam

    2015-02-15

    Recent gas discoveries in the eastern Mediterranean Sea led to multiple operations with substantial economic interest, and with them there is a risk of oil spills and their potential environmental impacts. To examine the potential spatial distribution of this threat, we created seasonal maps of the probability of oil spill pollution reaching an area in the Israeli coastal and exclusive economic zones, given knowledge of its initial sources. We performed simulations of virtual oil spills using realistic atmospheric and oceanic conditions. The resulting maps show dominance of the alongshore northerly current, which causes the high probability areas to be stretched parallel to the coast, increasing contamination probability downstream of source points. The seasonal westerly wind forcing determines how wide the high probability areas are, and may also restrict these to a small coastal region near source points. Seasonal variability in probability distribution, oil state, and pollution time is also discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. Foundations of probability

    International Nuclear Information System (INIS)

    Fraassen, B.C. van

    1979-01-01

    The interpretation of probabilities in physical theories, whether quantum or classical, is considered. The following points are discussed: (1) the functions P(μ, Q), in terms of which states and propositions can be represented, are classical (Kolmogoroff) probabilities, formally speaking; (2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and (3) testing of the theory typically takes the form of confronting the expectation values of an observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)

  15. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned

  16. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio

  17. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  18. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary Background Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
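
    The sample code referred to in the abstract is in R; as a rough Python stand-in (not the authors' code), the same idea of reading individual class probabilities off a random forest and a nearest-neighbour model looks like this with scikit-learn:

    ```python
    # Illustrative Python stand-in (scikit-learn, not the authors' R packages):
    # individual probability estimates for a binary response from a random forest
    # and from a nearest-neighbour model.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier

    X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_train, y_train)
    knn = KNeighborsClassifier(n_neighbors=25).fit(X_train, y_train)

    # predict_proba returns per-individual probability estimates, not just labels
    print(rf.predict_proba(X_test[:5])[:, 1])
    print(knn.predict_proba(X_test[:5])[:, 1])
    ```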

  19. Hierarchical Decompositions for the Computation of High-Dimensional Multivariate Normal Probabilities

    KAUST Repository

    Genton, Marc G.

    2017-09-07

    We present a hierarchical decomposition scheme for computing the n-dimensional integral of multivariate normal probabilities that appear frequently in statistics. The scheme exploits the fact that the formally dense covariance matrix can be approximated by a matrix with a hierarchical low rank structure. It allows the reduction of the computational complexity per Monte Carlo sample from O(n²) to O(mn + kn log(n/m)), where k is the numerical rank of off-diagonal matrix blocks and m is the size of small diagonal blocks in the matrix that are not well-approximated by low rank factorizations and treated as dense submatrices. This hierarchical decomposition leads to substantial efficiencies in multivariate normal probability computations and allows integrations in thousands of dimensions to be practical on modern workstations.
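
    The hierarchical solver itself is not shown in this record, but the quantity being accelerated, P(X <= b) for a multivariate normal X, can be approximated by plain Monte Carlo in moderate dimensions, which is the baseline the per-sample O(n²) cost refers to; the covariance and bounds below are invented for illustration.

    ```python
    # Plain Monte Carlo baseline for the quantity the paper accelerates:
    # P(X <= b) for X ~ N(0, Sigma). Covariance and bounds are invented.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 10
    idx = np.arange(n)
    Sigma = 0.5 ** np.abs(idx[:, None] - idx[None, :])  # simple decaying correlation
    b = np.full(n, 1.0)

    samples = rng.multivariate_normal(np.zeros(n), Sigma, size=200_000)
    p = np.mean(np.all(samples <= b, axis=1))
    print(f"P(X <= b) is approximately {p:.4f}")
    ```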

  20. Hierarchical Decompositions for the Computation of High-Dimensional Multivariate Normal Probabilities

    KAUST Repository

    Genton, Marc G.; Keyes, David E.; Turkiyyah, George

    2017-01-01

    We present a hierarchical decomposition scheme for computing the n-dimensional integral of multivariate normal probabilities that appear frequently in statistics. The scheme exploits the fact that the formally dense covariance matrix can be approximated by a matrix with a hierarchical low rank structure. It allows the reduction of the computational complexity per Monte Carlo sample from O(n²) to O(mn + kn log(n/m)), where k is the numerical rank of off-diagonal matrix blocks and m is the size of small diagonal blocks in the matrix that are not well-approximated by low rank factorizations and treated as dense submatrices. This hierarchical decomposition leads to substantial efficiencies in multivariate normal probability computations and allows integrations in thousands of dimensions to be practical on modern workstations.

  1. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  2. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions.

    Science.gov (United States)

    Dinov, Ivo D; Siegrist, Kyle; Pearl, Dennis K; Kalinin, Alexandr; Christou, Nicolas

    2016-06-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome , which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the

  3. Assessment of local variability by high-throughput e-beam metrology for prediction of patterning defect probabilities

    Science.gov (United States)

    Wang, Fuming; Hunsche, Stefan; Anunciado, Roy; Corradi, Antonio; Tien, Hung Yu; Tang, Peng; Wei, Junwei; Wang, Yongjun; Fang, Wei; Wong, Patrick; van Oosten, Anton; van Ingen Schenau, Koen; Slachter, Bram

    2018-03-01

    We present an experimental study of pattern variability and defectivity, based on a large data set with more than 112 million SEM measurements from an HMI high-throughput e-beam tool. The test case is a 10nm node SRAM via array patterned with a DUV immersion LELE process, where we see a variation in mean size and litho sensitivities between different unique via patterns that leads to seemingly qualitative differences in defectivity. The large available data volume enables further analysis to reliably distinguish global and local CDU variations, including a breakdown into local systematics and stochastics. A closer inspection of the tail end of the distributions and estimation of defect probabilities concludes that there is a common defect mechanism and defect threshold despite the observed differences in specific pattern characteristics. We expect that the analysis methodology can be applied for defect probability modeling as well as general process qualification in the future.

  4. Bireysel Farklılıklar İle Psikolojik Sözleşme İhlali Arasındaki İlişkide Üstün Desteğinin Aracılık Rolü (The Mediating Role Of Supervisor Support In The Relationship Between Individual Differences And Psychological Contract Breach)

    Directory of Open Access Journals (Sweden)

    Canan Nur KARABEY

    2016-03-01

    Full Text Available The aim of this study is to determine the individual differences that are effective in the formation of psychological contract breach perception and to examine whether perceived supervisor support has a mediating role in the effect of these differences. The psychological contract refers to each party’s evaluation of what will be given to and received from the other in the employee-organization relationship. Psychological contract breach reflects the employee’s assessment that the employer did not fulfil its obligations. It was investigated whether individual differences such as positive affectivity, equity sensitivity and reciprocation wariness have an impact on psychological contract breach, and whether perceived supervisor support has a mediating role in the impact of these variables. A field study based on random sampling was conducted in a firm with 1,500 employees operating in the service industry, and data were gathered through questionnaires from 285 employees. After demonstrating the dimensional structures of the variables through confirmatory factor analysis, path analysis was conducted through structural equation modelling. It was found that equity sensitivity, reciprocation wariness and positive affectivity did not affect psychological contract breach. However, perceived supervisor support was found to have a mediating role in the relationship between positive affectivity and psychological contract breach.

  5. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities......, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially...... updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber–Shiu functions and dependence....
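
    As a concrete companion to the classical model treated in the book (a sketch under standard Cramér-Lundberg assumptions, not code from the book), the finite-horizon ruin probability of a compound Poisson risk process can be estimated by simulation:

    ```python
    # Monte Carlo estimate of the finite-horizon ruin probability in the classical
    # compound Poisson (Cramér-Lundberg) model; parameters are illustrative.
    import numpy as np

    rng = np.random.default_rng(1)

    def ruin_probability(u, c, lam, claim_mean, horizon, n_paths=20_000):
        """u: initial reserve, c: premium rate, lam: claim intensity,
        claim_mean: mean exponential claim size, horizon: time horizon."""
        ruined = 0
        for _ in range(n_paths):
            t, reserve = 0.0, u
            while True:
                dt = rng.exponential(1.0 / lam)        # waiting time to the next claim
                if t + dt > horizon:
                    break                               # no further claims before the horizon
                t += dt
                reserve += c * dt                       # premium income since the last claim
                reserve -= rng.exponential(claim_mean)  # claim payment
                if reserve < 0:
                    ruined += 1
                    break
        return ruined / n_paths

    # 25% safety loading: premium rate c = 1.25 * lam * E[claim size]
    print(ruin_probability(u=10.0, c=1.25, lam=1.0, claim_mean=1.0, horizon=50.0))
    ```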

  6. Does probability of occurrence relate to population dynamics?

    Science.gov (United States)

    Thuiller, Wilfried; Münkemüller, Tamara; Schiffers, Katja H; Georges, Damien; Dullinger, Stefan; Eckhart, Vincent M; Edwards, Thomas C; Gravel, Dominique; Kunstler, Georges; Merow, Cory; Moore, Kara; Piedallu, Christian; Vissault, Steve; Zimmermann, Niklaus E; Zurell, Damaris; Schurr, Frank M

    2014-12-01

    Hutchinson defined species' realized niche as the set of environmental conditions in which populations can persist in the presence of competitors. In terms of demography, the realized niche corresponds to the environments where the intrinsic growth rate (r) of populations is positive. Observed species occurrences should reflect the realized niche when additional processes like dispersal and local extinction lags do not have overwhelming effects. Despite the foundational nature of these ideas, quantitative assessments of the relationship between range-wide demographic performance and occurrence probability have not been made. This assessment is needed both to improve our conceptual understanding of species' niches and ranges and to develop reliable mechanistic models of species geographic distributions that incorporate demography and species interactions. The objective of this study is to analyse how demographic parameters (intrinsic growth rate r and carrying capacity K) and population density (N) relate to occurrence probability (Pocc). We hypothesized that these relationships vary with species' competitive ability. Demographic parameters, density, and occurrence probability were estimated for 108 tree species from four temperate forest inventory surveys (Québec, Western US, France and Switzerland). We used published information of shade tolerance as indicators of light competition strategy, assuming that high tolerance denotes high competitive capacity in stable forest environments. Interestingly, relationships between demographic parameters and occurrence probability did not vary substantially across degrees of shade tolerance and regions. Although they were influenced by the uncertainty in the estimation of the demographic parameters, we found that r was generally negatively correlated with Pocc, while N, and for most regions K, was generally positively correlated with Pocc. Thus, in temperate forest trees the regions of highest occurrence

  7. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  8. Probability shapes perceptual precision: A study in orientation estimation.

    Science.gov (United States)

    Jabar, Syaheed B; Anderson, Britt

    2015-12-01

    Probability is known to affect perceptual estimations, but an understanding of mechanisms is lacking. Moving beyond binary classification tasks, we had naive participants report the orientation of briefly viewed gratings where we systematically manipulated contingent probability. Participants rapidly developed faster and more precise estimations for high-probability tilts. The shapes of their error distributions, as indexed by a kurtosis measure, also showed a distortion from Gaussian. This kurtosis metric was robust, capturing probability effects that were graded, contextual, and varying as a function of stimulus orientation. Our data can be understood as a probability-induced reduction in the variability or "shape" of estimation errors, as would be expected if probability affects the perceptual representations. As probability manipulations are an implicit component of many endogenous cuing paradigms, changes at the perceptual level could account for changes in performance that might have traditionally been ascribed to "attention." (c) 2015 APA, all rights reserved.

  9. Trial type probability modulates the cost of antisaccades

    Science.gov (United States)

    Chiau, Hui-Yan; Tseng, Philip; Su, Jia-Han; Tzeng, Ovid J. L.; Hung, Daisy L.; Muggleton, Neil G.

    2011-01-01

    The antisaccade task, where eye movements are made away from a target, has been used to investigate the flexibility of cognitive control of behavior. Antisaccades usually have longer saccade latencies than prosaccades, the so-called antisaccade cost. Recent studies have shown that this antisaccade cost can be modulated by event probability. This may mean that the antisaccade cost can be reduced, or even reversed, if the probability of surrounding events favors the execution of antisaccades. The probabilities of prosaccades and antisaccades were systematically manipulated by changing the proportion of a certain type of trial in an interleaved pro/antisaccades task. We aimed to disentangle the intertwined relationship between trial type probabilities and the antisaccade cost with the ultimate goal of elucidating how probabilities of trial types modulate human flexible behaviors, as well as the characteristics of such modulation effects. To this end, we examined whether implicit trial type probability can influence saccade latencies and also manipulated the difficulty of cue discriminability to see how effects of trial type probability would change when the demand on visual perceptual analysis was high or low. A mixed-effects model was applied to the analysis to dissect the factors contributing to the modulation effects of trial type probabilities. Our results suggest that the trial type probability is one robust determinant of antisaccade cost. These findings highlight the importance of implicit probability in the flexibility of cognitive control of behavior. PMID:21543748

  10. Time scales of change in chemical and biological parameters after engineered levee breaches adjacent to Upper Klamath and Agency Lakes, Oregon

    Science.gov (United States)

    Kuwabara, James S.; Topping, Brent R.; Carter, James L.; Wood, Tamara M.; Parcheso, Francis; Cameron, Jason M.; Asbill, Jessica R.; Carlson, Rick A.; Fend, Steven V.

    2012-01-01

    Eight sampling trips were coordinated after engineered levee breaches hydrologically reconnected both Upper Klamath Lake and Agency Lake, Oregon, to adjacent wetlands. The reconnection, by a series of explosive blasts, was coordinated by The Nature Conservancy to reclaim wetlands that had for approximately seven decades been leveed for crop production. Sets of nonmetallic porewater profilers (U.S. Patent 8,051,727 B1; November 8, 2011; http://www.uspto.gov/web/patents/patog/week45/OG/html/1372-2/US08051727-20111108.html) were deployed during these trips in November 2007, June 2008, May 2009, July 2009, May 2010, August 2010, June 2011, and July 2011 (table 1). Deployments temporally spanned the annual cyanophyte bloom of Aphanizomenon flos-aquae and spatially involved three lake and four wetland sites. Spatial and temporal variation in solute benthic flux was determined by the field team, using the profilers, over an approximately 4-year period beginning 3 days after the levee breaches. The highest flux to the water column of dissolved organic carbon (DOC) was detected in the newly flooded wetland, contrasting negative or insignificant DOC fluxes at adjacent lake sites. Over the multiyear study, DOC benthic fluxes dissipated in the reconnected wetlands, converging to values similar to those for established wetlands and to the adjacent lake (table 2). In contrast to DOC, benthic sources of soluble reactive phosphorus, ammonium, dissolved iron and manganese from within the reconnected wetlands were consistently elevated (that is, significant in magnitude relative to riverine and established-wetland sources) indicating a multi-year time scale for certain chemical changes after the levee breaches (table 2). Colonization of the reconnected wetlands by aquatic benthic invertebrates during the study trended toward the assemblages in established wetlands, providing further evidence of a multiyear transition of this area to permanent aquatic habitat (table 3). Both the

  11. Internal Medicine residents use heuristics to estimate disease probability.

    Science.gov (United States)

    Phang, Sen Han; Ravani, Pietro; Schaefer, Jeffrey; Wright, Bruce; McLaughlin, Kevin

    2015-01-01

    Training in Bayesian reasoning may have limited impact on accuracy of probability estimates. In this study, our goal was to explore whether residents previously exposed to Bayesian reasoning use heuristics rather than Bayesian reasoning to estimate disease probabilities. We predicted that if residents use heuristics then post-test probability estimates would be increased by non-discriminating clinical features or a high anchor for a target condition. We randomized 55 Internal Medicine residents to different versions of four clinical vignettes and asked them to estimate probabilities of target conditions. We manipulated the clinical data for each vignette to be consistent with either 1) using a representative heuristic, by adding non-discriminating prototypical clinical features of the target condition, or 2) using anchoring with adjustment heuristic, by providing a high or low anchor for the target condition. When presented with additional non-discriminating data the odds of diagnosing the target condition were increased (odds ratio (OR) 2.83, 95% confidence interval [1.30, 6.15], p = 0.009). Similarly, the odds of diagnosing the target condition were increased when a high anchor preceded the vignette (OR 2.04, [1.09, 3.81], p = 0.025). Our findings suggest that despite previous exposure to the use of Bayesian reasoning, residents use heuristics, such as the representative heuristic and anchoring with adjustment, to estimate probabilities. Potential reasons for attribute substitution include the relative cognitive ease of heuristics vs. Bayesian reasoning or perhaps residents in their clinical practice use gist traces rather than precise probability estimates when diagnosing.
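
    For contrast with the heuristic estimates studied here, the Bayesian update that the residents had been trained on is a short calculation: convert the pre-test probability to odds, multiply by the likelihood ratio, and convert back. The numbers below are purely illustrative.

    ```python
    # The Bayesian update the residents were trained on (illustrative numbers):
    # post-test odds = pre-test odds x likelihood ratio.
    def post_test_probability(pre_test_prob, likelihood_ratio):
        pre_odds = pre_test_prob / (1.0 - pre_test_prob)
        post_odds = pre_odds * likelihood_ratio
        return post_odds / (1.0 + post_odds)

    # A non-discriminating feature (LR = 1) should leave the estimate unchanged
    print(post_test_probability(0.20, 1.0))  # 0.20
    # A moderately informative finding (LR = 4) raises it
    print(post_test_probability(0.20, 4.0))  # 0.50
    ```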

  12. Internal Medicine residents use heuristics to estimate disease probability

    OpenAIRE

    Phang, Sen Han; Ravani, Pietro; Schaefer, Jeffrey; Wright, Bruce; McLaughlin, Kevin

    2015-01-01

    Background: Training in Bayesian reasoning may have limited impact on accuracy of probability estimates. In this study, our goal was to explore whether residents previously exposed to Bayesian reasoning use heuristics rather than Bayesian reasoning to estimate disease probabilities. We predicted that if residents use heuristics then post-test probability estimates would be increased by non-discriminating clinical features or a high anchor for a target condition. Method: We randomized 55 In...

  13. Semantic and associative factors in probability learning with words.

    Science.gov (United States)

    Schipper, L M; Hanson, B L; Taylor, G; Thorpe, J A

    1973-09-01

    Using a probability-learning technique with a single word as the cue and with the probability of a given event following this word fixed at .80, it was found (1) that neither high nor low associates to the original word and (2) that neither synonyms nor antonyms showed differential learning curves subsequent to original learning when the probability for the following event was shifted to .20. In a second study when feedback, in the form of knowledge of results, was withheld, there was a clear-cut similarity of predictions to the originally trained word and the synonyms of both high and low association value and a dissimilarity of these words to a set of antonyms of both high and low association value. Two additional studies confirmed the importance of the semantic dimension as compared with association value as traditionally measured.

  14. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability provided the same data used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...

  15. Addendum to ‘Understanding risks in the light of uncertainty: low-probability, high-impact coastal events in cities’

    Science.gov (United States)

    Galarraga, Ibon; Sainz de Murieta, Elisa; Markandya, Anil; María Abadie, Luis

    2018-02-01

    This addendum adds to the analysis presented in ‘Understanding risks in the light of uncertainty: low-probability, high-impact coastal events in cities’ Abadie et al (2017 Environ. Res. Lett. 12 014017). We propose to use the framework developed earlier to enhance communication and understanding of risks, with the aim of bridging the gap between highly technical risk management discussion to the public risk aversion debate. We also propose that the framework could be used for stress-testing resilience.

  16. Psychological contract breach and employee health: The relevance of unmet obligations for mental and physical health

    Directory of Open Access Journals (Sweden)

    Mareike Reimann

    2017-04-01

    Full Text Available This study examines the effects of psychological contract breach (PCB on employee mental and physical health (SF-12 using a sample of 3,870 employees derived from a German longitudinal linked employer-employee study across various industries. Results of multivariate regression models and mediation analysis suggest that PCB affects both the mental and the physical health of employees but is more threatening to employee mental health. In addition, mental health partly mediates the effects of PCB on physical health. Also, the findings of this study show that the relative importance of obligations not met by employers differs according to the specific contents of the psychological contract. In conclusion, the results of this study support the idea that PCB works as a psychosocial stressor at work that represents a crucial risk to employee health.

  17. Calculating the Probability of Returning a Loan with Binary Probability Models

    Directory of Open Access Journals (Sweden)

    Julian Vasilev

    2014-12-01

    Full Text Available The purpose of this article is to give a new approach to calculating the probability of returning a loan. Many factors affect the value of this probability. In this article, several influencing factors are identified using statistical and econometric models. The main approach is concerned with applying probit and logit models in loan management institutions. A new aspect of credit risk analysis is given. Calculating the probability of returning a loan is a difficult task. We assume that specific data fields concerning the contract (month of signing, year of signing, given sum) and data fields concerning the borrower of the loan (month of birth, year of birth (age), gender, region where he/she lives) may be independent variables in a binary logistic model with the dependent variable “the probability of returning a loan”. It is proved that the month of signing a contract, the year of signing a contract, and the gender and age of the loan owner do not affect the probability of returning a loan. It is proved that the probability of returning a loan depends on the sum of the contract, the remoteness of the loan owner and the month of birth. The probability of returning a loan increases with the given sum, decreases with the proximity of the customer, increases for people born at the beginning of the year and decreases for people born at the end of the year.
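
    A minimal sketch of the kind of logit model described, using scikit-learn rather than the authors' tooling; the feature names, coefficients and data are invented for illustration, and the synthetic outcome only loosely follows the signs reported in the abstract.

    ```python
    # Hedged sketch of a logit model for "probability of returning a loan";
    # the data, feature names and coefficients are invented, not the authors'.
    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(42)
    n = 2000
    df = pd.DataFrame({
        "loan_sum": rng.uniform(500, 20000, n),
        "distance_km": rng.uniform(0, 300, n),  # remoteness of the borrower
        "birth_month": rng.integers(1, 13, n),
    })
    # Synthetic outcome loosely following the reported signs: larger sums and
    # greater remoteness raise the return probability, later birth months lower it.
    logit = 0.0001 * df["loan_sum"] + 0.005 * df["distance_km"] - 0.1 * df["birth_month"]
    df["returned"] = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

    features = df[["loan_sum", "distance_km", "birth_month"]]
    model = LogisticRegression(max_iter=1000).fit(features, df["returned"])
    print(model.predict_proba(features.iloc[:3])[:, 1])  # estimated return probabilities
    ```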

  18. Posterior Probability Matching and Human Perceptual Decision Making.

    Directory of Open Access Journals (Sweden)

    Richard F Murray

    2015-06-01

    Full Text Available Probability matching is a classic theory of decision making that was first developed in models of cognition. Posterior probability matching, a variant in which observers match their response probabilities to the posterior probability of each response being correct, is being used increasingly often in models of perception. However, little is known about whether posterior probability matching is consistent with the vast literature on vision and hearing that has developed within signal detection theory. Here we test posterior probability matching models using two tools from detection theory. First, we examine the models' performance in a two-pass experiment, where each block of trials is presented twice, and we measure the proportion of times that the model gives the same response twice to repeated stimuli. We show that at low performance levels, posterior probability matching models give highly inconsistent responses across repeated presentations of identical trials. We find that practised human observers are more consistent across repeated trials than these models predict, and we find some evidence that less practised observers are more consistent as well. Second, we compare the performance of posterior probability matching models on a discrimination task to the performance of a theoretical ideal observer that achieves the best possible performance. We find that posterior probability matching is very inefficient at low-to-moderate performance levels, and that human observers can be more efficient than is ever possible according to posterior probability matching models. These findings support classic signal detection models, and rule out a broad class of posterior probability matching models for expert performance on perceptual tasks that range in complexity from contrast discrimination to symmetry detection. However, our findings leave open the possibility that inexperienced observers may show posterior probability matching behaviour, and our methods
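
    The two-pass consistency argument is easy to reproduce in a toy simulation (not the paper's model): an observer who responds "A" with probability equal to the posterior repeats its own response far less often on identical trials than a deterministic maximum-a-posteriori observer.

    ```python
    # Toy two-pass consistency check (not the paper's model): a posterior-matching
    # observer versus a deterministic maximum-a-posteriori (MAP) observer.
    import numpy as np

    rng = np.random.default_rng(0)
    n_trials = 100_000
    posterior_A = rng.uniform(0.4, 0.6, n_trials)  # near-threshold posteriors per trial

    def matching_responses(posteriors):
        # Respond "A" with probability equal to the posterior (probability matching)
        return rng.random(posteriors.size) < posteriors

    pass1 = matching_responses(posterior_A)
    pass2 = matching_responses(posterior_A)  # the same trials presented a second time
    map_resp = posterior_A > 0.5             # the MAP observer always repeats itself

    print("matching observer agreement:", np.mean(pass1 == pass2))       # about 0.50
    print("MAP observer agreement:     ", np.mean(map_resp == map_resp)) # exactly 1.0
    ```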

  19. Developing a probability-based model of aquifer vulnerability in an agricultural region

    Science.gov (United States)

    Chen, Shih-Kai; Jang, Cheng-Shin; Peng, Yi-Huei

    2013-04-01

    Summary Hydrogeological settings of aquifers strongly influence regional groundwater movement and pollution processes. Establishing a map of aquifer vulnerability is critical for planning a scheme of groundwater quality protection. This study developed a novel probability-based DRASTIC model of aquifer vulnerability in the Choushui River alluvial fan, Taiwan, using indicator kriging, and determined various risk categories of contamination potential based on the estimated vulnerability indexes. Categories and ratings of the six parameters in the probability-based DRASTIC model were probabilistically characterized according to two parameter classification methods: selecting the maximum estimation probability and calculating an expected value. Moreover, the probability-based estimation and assessment gave excellent insight into propagating the uncertainty of parameters arising from limited observation data. To examine the model's capacity to predict pollution, the medium, high, and very high risk categories of contamination potential were compared with observed nitrate-N concentrations exceeding 0.5 mg/L, which indicate anthropogenic groundwater pollution. The results reveal that the developed probability-based DRASTIC model is capable of predicting high nitrate-N groundwater pollution and characterizing the parameter uncertainty via the probability estimation processes.

  20. Highly enhanced avalanche probability using sinusoidally-gated silicon avalanche photodiode

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, Shingo; Namekata, Naoto, E-mail: nnao@phys.cst.nihon-u.ac.jp; Inoue, Shuichiro [Institute of Quantum Science, Nihon University, 1-8-14 Kanda-Surugadai, Chiyoda-ku, Tokyo 101-8308 (Japan); Tsujino, Kenji [Tokyo Women' s Medical University, 8-1 Kawada-cho, Shinjuku-ku, Tokyo 162-8666 (Japan)

    2014-01-27

    We report on visible light single photon detection using a sinusoidally-gated silicon avalanche photodiode. Detection efficiency of 70.6% was achieved at a wavelength of 520 nm when an electrically cooled silicon avalanche photodiode with a quantum efficiency of 72.4% was used, which implies that a photo-excited single charge carrier in a silicon avalanche photodiode can trigger a detectable avalanche (charge) signal with a probability of 97.6%.

  1. On the Possibility of Assigning Probabilities to Singular Cases, or: Probability Is Subjective Too!

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2009-06-01

    Full Text Available Both Ludwig von Mises and Richard von Mises claimed that numerical probability could not be legitimately applied to singular cases. This paper challenges this aspect of the von Mises brothers’ theory of probability. It is argued that their denial that numerical probability could be applied to singular cases was based solely upon Richard von Mises’ exceptionally restrictive definition of probability. This paper challenges Richard von Mises’ definition of probability by arguing that the definition of probability necessarily depends upon whether the world is governed by time-invariant causal laws. It is argued that if the world is governed by time-invariant causal laws, a subjective definition of probability must be adopted. It is further argued that both the nature of human action and the relative frequency method for calculating numerical probabilities presuppose that the world is indeed governed by time-invariant causal laws. It is finally argued that the subjective definition of probability undercuts the von Mises claim that numerical probability cannot legitimately be applied to singular, non-replicable cases.

  2. Skin damage probabilities using fixation materials in high-energy photon beams

    International Nuclear Information System (INIS)

    Carl, J.; Vestergaard, A.

    2000-01-01

    Patient fixation devices, such as thermoplastic masks, carbon-fibre support plates and polystyrene bead vacuum cradles, are used to reproduce patient positioning in radiotherapy. Consequently, low-density materials may be introduced into high-energy photon beams. The aim of this study was to measure the increase in skin dose when low-density materials are present and to calculate the radiobiological consequences in terms of probabilities of early and late skin damage. An experimental thin-windowed plane-parallel ion chamber was used. Skin doses were measured using various overlying low-density fixation materials. A fixed geometry of a 10 x 10 cm field, an SSD = 100 cm and photon energies of 4, 6 and 10 MV on Varian Clinac 2100C accelerators were used for all measurements. The radiobiological consequences of introducing these materials into the high-energy photon beams were evaluated in terms of early and late skin damage based on the measured surface doses and the LQ model. The experimental ion chamber gave results consistent with other studies. A relationship between skin dose and material thickness in mg/cm² was established and used to calculate skin doses in scenarios assuming radiotherapy treatment with opposed fields. Conventional radiotherapy may apply mid-point doses up to 60-66 Gy in daily 2-Gy fractions with opposed fields. Using thermoplastic fixation and high-energy photons as low as 4 MV increases the dose to the skin considerably. However, for thermoplastic materials with a thickness of less than 100 mg/cm², skin doses are comparable with those produced by variation in source-to-skin distance, field size or blocking trays within clinical treatment set-ups. The use of polystyrene cradles and carbon-fibre materials with a thickness of less than 100 mg/cm² should be avoided at 4 MV at doses above 54-60 Gy. (author)

  3. Does probability of occurrence relate to population dynamics?

    Science.gov (United States)

    Thuiller, Wilfried; Münkemüller, Tamara; Schiffers, Katja H.; Georges, Damien; Dullinger, Stefan; Eckhart, Vincent M.; Edwards, Thomas C.; Gravel, Dominique; Kunstler, Georges; Merow, Cory; Moore, Kara; Piedallu, Christian; Vissault, Steve; Zimmermann, Niklaus E.; Zurell, Damaris; Schurr, Frank M.

    2014-01-01

    Hutchinson defined species' realized niche as the set of environmental conditions in which populations can persist in the presence of competitors. In terms of demography, the realized niche corresponds to the environments where the intrinsic growth rate (r) of populations is positive. Observed species occurrences should reflect the realized niche when additional processes like dispersal and local extinction lags do not have overwhelming effects. Despite the foundational nature of these ideas, quantitative assessments of the relationship between range-wide demographic performance and occurrence probability have not been made. This assessment is needed both to improve our conceptual understanding of species' niches and ranges and to develop reliable mechanistic models of species geographic distributions that incorporate demography and species interactions. The objective of this study is to analyse how demographic parameters (intrinsic growth rate r and carrying capacity K) and population density (N) relate to occurrence probability (Pocc). We hypothesized that these relationships vary with species' competitive ability. Demographic parameters, density, and occurrence probability were estimated for 108 tree species from four temperate forest inventory surveys (Québec, western USA, France and Switzerland). We used published information of shade tolerance as indicators of light competition strategy, assuming that high tolerance denotes high competitive capacity in stable forest environments. Interestingly, relationships between demographic parameters and occurrence probability did not vary substantially across degrees of shade tolerance and regions. Although they were influenced by the uncertainty in the estimation of the demographic parameters, we found that r was generally negatively correlated with Pocc, while N, and for most regions K, was generally positively correlated with Pocc. Thus, in temperate forest trees the regions of highest occurrence

  4. Breaching Biological Barriers: Protein Translocation Domains as Tools for Molecular Imaging and Therapy

    Directory of Open Access Journals (Sweden)

    Benjamin L. Franc

    2003-10-01

    Full Text Available The lipid bilayer of a cell presents a significant barrier for the delivery of many molecular imaging reagents into cells at target sites in the body. Protein translocation domains (PTDs are peptides that breach this barrier. Conjugation of PTDs to imaging agents can be utilized to facilitate the delivery of these agents through the cell wall, and in some cases, into the cell nucleus, and have potential for in vitro and in vivo applications. PTD imaging conjugates have included small molecules, peptides, proteins, DNA, metal chelates, and magnetic nanoparticles. The full potential of the use of PTDs in novel in vivo molecular probes is currently under investigation. Cells have been labeled in culture using magnetic nanoparticles derivatized with a PTD and monitored in vivo to assess trafficking patterns relative to cells expressing a target antigen. In vivo imaging of PTD-mediated gene transfer to cells of the skin has been demonstrated in living animals. Here we review several natural and synthetic PTDs that have evolved in the quest for easier translocation across biological barriers and the application of these peptide domains to in vivo delivery of imaging agents.

  5. The physician's breach of the duty to inform the parent of deformities and abnormalities in the foetus: "wrongful life" actions, a new frontier of medical responsibility.

    Science.gov (United States)

    Frati, Paola; Gulino, Matteo; Turillazzi, Emanuela; Zaami, Simona; Fineschi, Vittorio

    2014-07-01

    A recent decision of the Italian Highest Court for the first time legitimized wrongful life suits. The Court stated the following principles: (a) the contract between the mother and the doctor has also protective effects in favour of third parties (father, siblings and the disabled child) who have the right to be compensated; (b) the right to compensation is neither based on the right not to be born nor on the right to be born healthy, but rather it is based on the breach of duty of care which coincides with the child's disabled status; (c) siblings may suffer the reduced availability of their parents; (d) the doctor is held responsible for not providing full information to the mother about the foetal deformity. The Supreme Court once again emphasized the importance of information on the matter of very personal choices, such as termination of pregnancy in case of foetal malformations. In the present case, the gynaecologist breached the duty to inform, especially after the patient requested diagnostic tests designed to highlight any foetal malformations and informed the doctor of the possibility of an eventual subsequent termination of pregnancy if foetal malformations were found.

  6. Modelling of HTR Confinement Behaviour during Accidents Involving Breach of the Helium Pressure Boundary

    Directory of Open Access Journals (Sweden)

    Joan Fontanet

    2009-01-01

    Full Text Available Development of HTRs requires the performance of a thorough safety study, which includes accident analyses. Confinement building performance is a key element of the system, since the behaviour of aerosol and attached fission products within the building is of utmost relevance in terms of the potential source term to the environment. This paper explores the available simulation capabilities (the ASTEC and CONTAIN codes) and illustrates the performance of a postulated HTR vented confinement under prototypical accident conditions through a scoping study based on two accident sequences characterized by Helium Pressure Boundary breaches, a small and a large break. The results obtained indicate that both codes predict very similar thermal-hydraulic responses of the confinement, both in magnitude and timing. As for the aerosol behaviour, both codes predict that most of the inventory coming into the confinement is eventually depleted on the walls and only about 1% of the aerosol dust is released to the environment. The cross-comparison of the codes shows that the largest differences are in the inter-compartmental flows and the in-compartment gas composition.

  7. Poor concordance of spiral CT (SCT) and high probability ventilation-perfusion (V/Q) studies in the diagnosis of pulmonary embolism (PE)

    International Nuclear Information System (INIS)

    Roman, M.R.; Angelides, S.; Chen, N.

    2000-01-01

    Full text: Despite its limitations, V/Q scintigraphy remains the favoured non-invasive technique for the diagnosis of pulmonary embolism (PE). PE is present in 85-90% and 30-40% of high and intermediate probability V/Q studies respectively. The value of spiral CT (SCT), a newer imaging modality, has yet to be determined. The aim of this study was to determine the frequency of positive SCT for PE in high and intermediate probability V/Q studies performed within 24 hr of each other. 15 patients (6M, 9F, mean age 70.2) with a high probability study were included. Six (40%) SCT were reported as positive (four with emboli present in the main pulmonary arteries), seven as negative, one equivocal and one was technically sub-optimal. Pulmonary angiography was not performed in any patient. In all seven negative studies, the SCT was performed before the V/Q study. Of these, two studies were revised to positive once the result of the V/Q study was known, while three others had resolving mismatch V/Q defects on follow-up studies (performed 5-14 days later); two of these three also had a positive duplex scan of the lower limbs. One other was most likely due to chronic thromboembolic disease. Only three patients had a V/Q scan prior to the SCT; all were positive for PE on both imaging modalities. Of 26 patients (11M, 15F, mean age 68.5) with an intermediate probability V/Q study, SCT was positive in only two (8%). Thus the low detection rate of PE by SCT in this albeit small series raises doubts as to its role in the diagnosis of PE. Copyright (2000) The Australian and New Zealand Society of Nuclear Medicine Inc

  8. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton‘s laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world‘s foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their  explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive. 

  9. The external costs of low probability-high consequence events: Ex ante damages and lay risks

    International Nuclear Information System (INIS)

    Krupnick, A.J.; Markandya, A.; Nickell, E.

    1994-01-01

    This paper provides an analytical basis for characterizing key differences between two perspectives on how to estimate the expected damages of low probability - high consequence events. One perspective is the conventional method used in the U.S.-EC fuel cycle reports [e.g., ORNL/RFF (1994a,b)]. This paper articulates another perspective, using economic theory. The paper makes a strong case for considering this approach as an alternative, or at least as a complement, to the conventional approach. This alternative approach is an important area for future research. Interest has been growing worldwide in embedding the external costs of productive activities, particularly the fuel cycles resulting in electricity generation, into prices. In any attempt to internalize these costs, one must take into account explicitly the remote but real possibilities of accidents and the wide gap between lay perceptions and expert assessments of such risks. In our fuel cycle analyses, we estimate damages and benefits by simply monetizing expected consequences, based on pollution dispersion models, exposure-response functions, and valuation functions. For accidents, such as mining and transportation accidents, natural gas pipeline accidents, and oil barge accidents, we use historical data to estimate the rates of these accidents. For extremely severe accidents--such as severe nuclear reactor accidents and catastrophic oil tanker spills--events are extremely rare and they do not offer a sufficient sample size to estimate their probabilities based on past occurrences. In those cases the conventional approach is to rely on expert judgments about both the probability of the consequences and their magnitude. As an example of standard practice, which we term here an expert expected damage (EED) approach to estimating damages, consider how evacuation costs are estimated in the nuclear fuel cycle report

  10. The external costs of low probability-high consequence events: Ex ante damages and lay risks

    Energy Technology Data Exchange (ETDEWEB)

    Krupnick, A J; Markandya, A; Nickell, E

    1994-07-01

    This paper provides an analytical basis for characterizing key differences between two perspectives on how to estimate the expected damages of low probability - high consequence events. One perspective is the conventional method used in the U.S.-EC fuel cycle reports [e.g., ORNL/RFF (1994a,b)]. This paper articulates another perspective, using economic theory. The paper makes a strong case for considering this approach as an alternative, or at least as a complement, to the conventional approach. This alternative approach is an important area for future research. Interest has been growing worldwide in embedding the external costs of productive activities, particularly the fuel cycles resulting in electricity generation, into prices. In any attempt to internalize these costs, one must take into account explicitly the remote but real possibilities of accidents and the wide gap between lay perceptions and expert assessments of such risks. In our fuel cycle analyses, we estimate damages and benefits by simply monetizing expected consequences, based on pollution dispersion models, exposure-response functions, and valuation functions. For accidents, such as mining and transportation accidents, natural gas pipeline accidents, and oil barge accidents, we use historical data to estimate the rates of these accidents. For extremely severe accidents--such as severe nuclear reactor accidents and catastrophic oil tanker spills--events are extremely rare and they do not offer a sufficient sample size to estimate their probabilities based on past occurrences. In those cases the conventional approach is to rely on expert judgments about both the probability of the consequences and their magnitude. As an example of standard practice, which we term here an expert expected damage (EED) approach to estimating damages, consider how evacuation costs are estimated in the nuclear fuel cycle report.
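
    A minimal numerical sketch of the contrast described above, with entirely hypothetical accident probabilities and damages (none of the figures below are taken from the fuel cycle reports): the expert expected damage (EED) approach monetises expert-assessed probabilities times consequences, whereas an ex ante, lay-risk perspective might weight the same consequences by perceived probabilities, typically inflating the contribution of rare, catastrophic events.

      # Hypothetical illustration only -- probabilities and damages are invented.
      # Each scenario is a (probability per reactor-year, monetised damage in dollars) pair.
      EXPERT_SCENARIOS = [(1e-5, 5e9), (1e-3, 2e7)]   # expert-assessed probabilities
      LAY_SCENARIOS    = [(1e-4, 5e9), (2e-3, 2e7)]   # illustrative lay-perceived probabilities

      def expected_damage(scenarios):
          """Expected damage = sum over scenarios of probability * monetised consequence."""
          return sum(p * d for p, d in scenarios)

      print("Expert expected damage (EED):", expected_damage(EXPERT_SCENARIOS))  # 70,000 $ per reactor-year
      print("Lay-probability weighted:    ", expected_damage(LAY_SCENARIOS))     # 540,000 $ per reactor-year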

  11. Inundation risk for embanked rivers

    Directory of Open Access Journals (Sweden)

    W. G. Strupczewski

    2013-08-01

    Full Text Available The Flood Frequency Analysis (FFA) concentrates on the probability distribution of peak flows of flood hydrographs. However, examination of the floods that have haunted and devastated large parts of Poland led us to revise our views on the assessment of flood risk of Polish rivers. It turned out that flooding is caused not only by overflow of the levee crest but also by prolonged exposure of the levee structure to high water, which causes dangerous leaks and breaches that threaten its total destruction. This is because the levees are weakened by long-lasting water pressure; in fact, their failure usually occurs after the culmination has passed the affected location. The probability of inundation is the sum of the probability of the flood peak exceeding the embankment crest and the probability of levee washout. Therefore, in addition to the maximum flow one should also consider the duration of high waters in a river channel. In the paper, a new two-component model of flood dynamics, "Duration of high waters–Discharge Threshold–Probability of non-exceedance" (DqF), with the methodology of its parameter estimation, is proposed as a complement to the classical FFA methods. Such a model can estimate the duration of stages (flows) of an assumed magnitude with a given probability of exceedance. The model combined with the technical evaluation of the probability of levee breaches due to the duration (d) of flow above the alarm stage gives the annual probability of inundation caused by embankment breaching. The results of the theoretical investigation are illustrated by a practical example of the model's implementation for the series of daily flows of the Vistula River at Szczucin. Regardless of promising results, the method of assessing risk due to prolonged exposure of levees to high water is still in its infancy despite its great cognitive potential and practical importance. Therefore, we would like to point out the need for and usefulness of
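
    The combination of the two failure-mode probabilities described above can be illustrated with a short sketch. This is not the DqF model itself: it simply assumes the two annual failure modes (crest overtopping and duration-driven levee washout) are independent, and the probabilities used are invented.

      def annual_inundation_probability(p_overtop: float, p_washout: float) -> float:
          """P(inundation) = P(overtop or washout), assuming independent failure modes."""
          return p_overtop + p_washout - p_overtop * p_washout

      p_overtop = 0.01    # hypothetical probability that the flood peak exceeds the crest
      p_washout = 0.004   # hypothetical probability of washout after prolonged high water
      print(f"Annual inundation probability: {annual_inundation_probability(p_overtop, p_washout):.4f}")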

  12. Characterisation of seasonal flood types according to timescales in mixed probability distributions

    Science.gov (United States)

    Fischer, Svenja; Schumann, Andreas; Schulte, Markus

    2016-08-01

    When flood statistics are based on annual maximum series (AMS), the sample often contains flood peaks which differ in their genesis. If the ratios among event types change over the range of observations, the extrapolation of a probability distribution function (pdf) can be dominated by a majority of events that belong to a certain flood type. If this type is not typical for extraordinarily large extremes, such an extrapolation of the pdf is misleading. To avoid this breach of the assumption of homogeneity, seasonal models were developed that distinguish between winter and summer floods. We show that a distinction between summer and winter floods is not always sufficient if seasonal series include events with different geneses. Here, we differentiate floods by their timescales into groups of long and short events. A statistical method for such a distinction of events is presented. To demonstrate its applicability, timescales for winter and summer floods in a German river basin were estimated. It is shown that summer floods can be separated into two main groups, but in our study region the sample of winter floods consists of at least three different flood types. The pdfs of the two groups of summer floods are combined via a new mixing model. This model takes into account that information about parallel events based only on their maximum values is incomplete, because some of the realisations are overlaid. A statistical method resulting in an amendment of the statistical parameters is proposed. The application in a German case study demonstrates the advantages of the new model, with specific emphasis on flood types.
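
    A minimal sketch of the general idea of mixing the distributions of two flood types. Note that the paper's mixing model additionally corrects for overlaid parallel events, which this classical weighted mixture does not; the Gumbel parameters, the weight, and the discharge value are hypothetical.

      import math

      def gumbel_cdf(x: float, loc: float, scale: float) -> float:
          return math.exp(-math.exp(-(x - loc) / scale))

      def mixed_cdf(x: float, w: float, params_short: tuple, params_long: tuple) -> float:
          """F(x) = w*F_short(x) + (1-w)*F_long(x): each annual maximum is assumed to be
          generated by exactly one flood type, with probability w for the short type."""
          return w * gumbel_cdf(x, *params_short) + (1 - w) * gumbel_cdf(x, *params_long)

      # hypothetical: 70% short (e.g. convective) events, 30% long (e.g. advective) events
      print(f"Non-exceedance probability of 350 m^3/s: {mixed_cdf(350.0, 0.7, (120.0, 40.0), (200.0, 80.0)):.3f}")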

  13. An analytical calculation of neighbourhood order probabilities for high dimensional Poissonian processes and mean field models

    International Nuclear Information System (INIS)

    Tercariol, Cesar Augusto Sangaletti; Kiipper, Felipe de Moura; Martinez, Alexandre Souto

    2007-01-01

    Consider that the coordinates of N points are randomly generated along the edges of a d-dimensional hypercube (random point problem). The probability P^(d,N)_{m,n} that an arbitrary point is the mth nearest neighbour to its own nth nearest neighbour (Cox probabilities) plays an important role in spatial statistics. It has also been useful in the description of physical processes in disordered media. Here we propose a simpler derivation of the Cox probabilities, in which we stress the role played by the system dimensionality d. In the limit d → ∞, the distances between pairs of points become independent (random link model) and closed analytical forms for the neighbourhood probabilities are obtained, both in the thermodynamic limit and for finite-size systems. Breaking the distance symmetry constraint leads us to the random map model, for which the Cox probabilities are obtained for two cases: whether a point is its own nearest neighbour or not

  14. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  15. Conditional probability of intense rainfall producing high ground concentrations from radioactive plumes

    International Nuclear Information System (INIS)

    Wayland, J.R.

    1977-03-01

    The overlap of the expanding plume of radioactive material from a hypothetical nuclear accident with rainstorms over dense population areas is considered. The conditional probability of the occurrence of hot spots from intense cellular rainfall is presented

  16. Control of degradation of spent LWR [light-water reactor] fuel during dry storage in an inert atmosphere

    International Nuclear Information System (INIS)

    Cunningham, M.E.; Simonen, E.P.; Allemann, R.T.; Levy, I.S.; Hazelton, R.F.

    1987-10-01

    Dry storage of Zircaloy-clad spent fuel in inert gas (referred to as inerted dry storage or IDS) is being developed as an alternative to water pool storage of spent fuel. The objectives of the activities described in this report are to identify potential Zircaloy degradation mechanisms and evaluate their applicability to cladding breach during IDS, develop models of the dominant Zircaloy degradation mechanisms, and recommend cladding temperature limits during IDS to control Zircaloy degradation. The principal potential Zircaloy cladding breach mechanisms during IDS have been identified as creep rupture, stress corrosion cracking (SCC), and delayed hydride cracking (DHC). Creep rupture is concluded to be the primary cladding breach mechanism during IDS. Deformation and fracture maps based on creep rupture were developed for Zircaloy. These maps were then used as the basis for developing spent fuel cladding temperature limits that would prevent cladding breach during a 40-year IDS period. The probability of cladding breach for spent fuel stored at the temperature limit is less than 0.5% per spent fuel rod. 52 refs., 7 figs., 1 tab

  17. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  18. A brief introduction to probability.

    Science.gov (United States)

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: back in the 1600s, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution", a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, which is the most relevant distribution applied to statistical analysis.

  19. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...

  20. Probability of satellite collision

    Science.gov (United States)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.

  1. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  2. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  3. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  4. Subsequent investigation and management of patients with intermediate-category and -probability ventilation-perfusion scintigraphy

    International Nuclear Information System (INIS)

    Walsh, G.; Jones, D.N.

    2000-01-01

    The authors wished to determine the proportion of patients with intermediate-category and intermediate-probability ventilation-perfusion scintigraphy (IVQS) who proceed to further imaging for investigation of thromboembolism, to identify the defining clinical parameters, and to determine the proportion of patients who have a definite imaging diagnosis of thromboembolism prior to discharge from hospital on anticoagulation therapy. One hundred and twelve VQS studies performed at the Flinders Medical Centre over a 9-month period were reported as having intermediate category and probability for pulmonary embolism. Medical case notes were available for review in 99 of these patients and from these the pretest clinical probability, subsequent patient progress and treatment were recorded. Eight cases were excluded because they were already receiving anticoagulation therapy. In the remaining 91 patients the pretest clinical probability was considered to be low in 25; intermediate in 30; and high in 36 cases. In total, 51.6% (n = 47) of these patients (8% (n = 2) with low, 66% (n = 20) with intermediate, and 69.4% (n = 25) with high pretest probability) proceeded to CT pulmonary angiography (CTPA) and/or lower limb duplex Doppler ultrasound (DUS) evaluation. Of the patients with IVQS results, 30.7% (n = 28) were evaluated with CTPA. No patient with a low, all patients with a high, and 46% of patients with an intermediate pretest probability initially received anticoagulation therapy. This was discontinued in three patients with high and in 12 patients with intermediate clinical probability prior to discharge from hospital. Overall, 40% of patients discharged on anticoagulation therapy (including 39% of those with a high pretest probability) had a positive imaging diagnosis of thromboembolism. The results suggest that, although the majority of patients with intermediate-to-high pretest probability and IVQS proceed to further imaging investigation, CTPA is relatively underused in

  5. Probability of misclassifying biological elements in surface waters.

    Science.gov (United States)

    Loga, Małgorzata; Wierzchołowska-Dziedzic, Anna

    2017-11-24

    Measurement uncertainties are inherent in the assessment of biological indices of water bodies. The effect of these uncertainties on the probability of misclassification of ecological status is the subject of this paper. Four Monte-Carlo (M-C) models were applied to simulate the occurrence of random errors in the measurements of metrics corresponding to four biological elements of surface waters: macrophytes, phytoplankton, phytobenthos, and benthic macroinvertebrates. Long series of error-prone measurement values of these metrics, generated by the M-C models, were used to identify cases in which the value of any of the four biological indices lay outside the "true" water body class, i.e., outside the class assigned from the actual physical measurements. The fraction of such cases in the M-C-generated series was used to estimate the probability of misclassification. The method is particularly useful for estimating the probability of misclassification of the ecological status of surface water bodies in the case of short sequences of measurements of biological indices. The results of the Monte-Carlo simulations show a relatively high sensitivity of this probability to measurement errors of the river macrophyte index (MIR) and high robustness to measurement errors of the benthic macroinvertebrate index (MMI). The proposed method of using Monte-Carlo models to estimate the probability of misclassification has significant potential for assessing the uncertainty of water body status reported to the EC by the EU member countries according to the WFD. The method can also be readily applied in risk assessment of water management decisions before adopting status-dependent corrective actions.
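
    The Monte-Carlo idea described above can be sketched in a few lines (this is not the authors' calibrated error models: the class boundaries, the Gaussian error model and its standard deviation are all hypothetical): perturb a measured index value with random error, reclassify it, and count how often the class changes.

      import random

      CLASS_BOUNDARIES = [0.2, 0.4, 0.6, 0.8]   # hypothetical class limits on a 0-1 index scale

      def classify(value: float) -> int:
          """Return the class number (0-4) of an index value."""
          return sum(value >= b for b in CLASS_BOUNDARIES)

      def misclassification_probability(measured: float, sigma: float, n_trials: int = 100_000) -> float:
          base_class = classify(measured)
          rng = random.Random(0)
          errors = sum(
              classify(min(1.0, max(0.0, rng.gauss(measured, sigma)))) != base_class
              for _ in range(n_trials)
          )
          return errors / n_trials

      # An index value close to a class boundary has a high misclassification probability.
      print(misclassification_probability(measured=0.43, sigma=0.05))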

  6. The Influence of Phonotactic Probability on Word Recognition in Toddlers

    Science.gov (United States)

    MacRoy-Higgins, Michelle; Shafer, Valerie L.; Schwartz, Richard G.; Marton, Klara

    2014-01-01

    This study examined the influence of phonotactic probability on word recognition in English-speaking toddlers. Typically developing toddlers completed a preferential looking paradigm using familiar words, which consisted of either high or low phonotactic probability sound sequences. The participants' looking behavior was recorded in response to…

  7. Kasei Vallis of Mars: Dating the Interplay of Tectonics and Geomorphology

    Science.gov (United States)

    Wise, D. U.

    1985-01-01

    Crater density age dates on more than 250 small geomorphic surfaces in the Kasei Region of Mars show clusterings indicative of times of peak geomorphic and tectonic activity. Kasei Vallis is part of a 300 km wide channel system breaching a N-S trending ancient basement high (+50,000 crater age) separating the Chryse Basin from the Tharsis Volcanic Province of Mars. The basement high was covered by at least 3 groups of probable volcanic deposits. Major regional fracturing took place at age 4,000 to 5,000 and was immediately followed by deposition of regional volcanics of the Fesenkov Plains (age 3,000 to 4,200). Younger clusterings of dates in the 900 to 1,500 and 500 to 700 range represent only minor modification of the basic tectonic-geomorphic landform. The data suggest that Kasei gap is a structurally controlled breach of a buried ridge by a rather brief episode of fluvial activity.

  8. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  9. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  10. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  11. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...

  12. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of a major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory, Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think...... analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators...

  13. Scale-invariant transition probabilities in free word association trajectories

    Directory of Open Access Journals (Sweden)

    Martin Elias Costa

    2009-09-01

    Full Text Available Free-word association has been used as a vehicle to understand the organization of human thoughts. The original studies relied mainly on qualitative assertions, yielding the widely intuitive notion that trajectories of word associations are structured, yet considerably more random than organized linguistic text. Here we set out to determine a precise characterization of this space, generating a large number of word-association trajectories in a web-implemented game. We embedded the trajectories in the graph of word co-occurrences from a linguistic corpus. To constrain possible transport models we measured the memory loss and the cycling probability. These two measures could not be reconciled by a bounded diffusive model, since the cycling probability was very high (16% of order-2 cycles), implying a majority of short-range associations, whereas the memory loss was very rapid (converging to the asymptotic value in ∼7 steps), which, in turn, forced a high fraction of long-range associations. We show that the memory loss and cycling probabilities of free word association trajectories can be simultaneously accounted for by a model in which transitions are determined by a scale-invariant probability distribution.
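
    A toy sketch of the order-2 cycling measure mentioned above (the word graph, the uniform transition rule, and all numbers are made up; the paper embeds real game trajectories in a corpus-derived co-occurrence graph):

      import random

      # Hypothetical miniature association graph
      GRAPH = {
          "dog": ["cat", "bone", "bark"], "cat": ["dog", "milk", "mouse"],
          "bone": ["dog", "skeleton"], "bark": ["dog", "tree"],
          "milk": ["cat", "cow"], "mouse": ["cat", "cheese"],
          "skeleton": ["bone"], "tree": ["bark", "leaf"],
          "cow": ["milk"], "cheese": ["mouse"], "leaf": ["tree"],
      }

      def order2_cycling_probability(n_trajectories=2000, length=30, seed=0):
          """Fraction of steps t for which the word at step t+2 equals the word at step t."""
          rng = random.Random(seed)
          cycles = steps = 0
          for _ in range(n_trajectories):
              walk = ["dog"]
              for _ in range(length):
                  walk.append(rng.choice(GRAPH[walk[-1]]))
              for t in range(len(walk) - 2):
                  steps += 1
                  cycles += walk[t + 2] == walk[t]
          return cycles / steps

      print(f"Order-2 cycling probability: {order2_cycling_probability():.3f}")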

  14. Unrelated Hematopoietic Stem Cell Donor Matching Probability and Search Algorithm

    Directory of Open Access Journals (Sweden)

    J.-M. Tiercy

    2012-01-01

    Full Text Available In transplantation of hematopoietic stem cells (HSCs) from unrelated donors a high HLA compatibility level decreases the risk of acute graft-versus-host disease and mortality. The diversity of the HLA system at the allelic and haplotypic level and the heterogeneity of the HLA typing data of the registered donors render the search process a complex task. This paper summarizes our experience with a search algorithm that includes, at the start of the search, a probability estimate (high/intermediate/low) of identifying an HLA-A, B, C, DRB1, DQB1-compatible donor (a 10/10 match). Based on 2002–2011 searches, about 30% of patients have a high, 30% an intermediate, and 40% a low probability search. Search success rate and duration are presented and discussed in light of the experience of other centers. Overall a 9-10/10 matched HSC donor can now be identified for 60–80% of patients of European descent. For high probability searches donors can be selected on the basis of DPB1 matching, with an estimated success rate of >40%. For low probability searches there is no consensus on which HLA incompatibilities are more permissive, although HLA-DQB1 mismatches are generally considered acceptable. Models for the discrimination of more detrimental mismatches based on specific amino acid residues rather than specific HLA alleles are presented.

  15. Jihadist Foreign Fighter Phenomenon in Western Europe: A Low-Probability, High-Impact Threat

    Directory of Open Access Journals (Sweden)

    Edwin Bakker

    2015-11-01

    Full Text Available The phenomenon of foreign fighters in Syria and Iraq is making headlines. Their involvement in the atrocities committed by terrorist groups such as the so-called “Islamic State” and Jabhat al-Nusra has caused grave concern and public outcry in the foreign fighters’ European countries of origin. While much has been written about these foreign fighters and the possible threat they pose, the impact of this phenomenon on Western European societies has yet to be documented. This Research Paper explores four particular areas where this impact is most visible: (a) violent incidents associated with (returned) foreign fighters, (b) official and political responses linked to these incidents, (c) public opinion, and (d) anti-Islam reactions linked to these incidents. The authors conclude that the phenomenon of jihadist foreign fighters in European societies should be primarily regarded as a social and political threat, not a physical one. They consider the phenomenon of European jihadist foreign fighters a “low-probability, high-impact” threat.

  16. Quantum processes: probability fluxes, transition probabilities in unit time and vacuum vibrations

    International Nuclear Information System (INIS)

    Oleinik, V.P.; Arepjev, Ju D.

    1989-01-01

    Transition probabilities in unit time and probability fluxes are compared in studying the elementary quantum processes -the decay of a bound state under the action of time-varying and constant electric fields. It is shown that the difference between these quantities may be considerable, and so the use of transition probabilities W instead of probability fluxes Π, in calculating the particle fluxes, may lead to serious errors. The quantity W represents the rate of change with time of the population of the energy levels relating partly to the real states and partly to the virtual ones, and it cannot be directly measured in experiment. The vacuum background is shown to be continuously distorted when a perturbation acts on a system. Because of this the viewpoint of an observer on the physical properties of real particles continuously varies with time. This fact is not taken into consideration in the conventional theory of quantum transitions based on using the notion of probability amplitude. As a result, the probability amplitudes lose their physical meaning. All the physical information on quantum dynamics of a system is contained in the mean values of physical quantities. The existence of considerable differences between the quantities W and Π permits one in principle to make a choice of the correct theory of quantum transitions on the basis of experimental data. (author)

  17. Tuned by experience: How orientation probability modulates early perceptual processing.

    Science.gov (United States)

    Jabar, Syaheed B; Filipowicz, Alex; Anderson, Britt

    2017-09-01

    Probable stimuli are more often and more quickly detected. While stimulus probability is known to affect decision-making, it can also be explained as a perceptual phenomenon. Using spatial gratings, we have previously shown that probable orientations are also more precisely estimated, even while participants remained naive to the manipulation. We conducted an electrophysiological study to investigate the effect that probability has on perception and visual-evoked potentials. In line with previous studies on oddballs and stimulus prevalence, low-probability orientations were associated with a greater late positive 'P300' component which might be related to either surprise or decision-making. However, the early 'C1' component, thought to reflect V1 processing, was dampened for high-probability orientations while later P1 and N1 components were unaffected. Exploratory analyses revealed a participant-level correlation between C1 and P300 amplitudes, suggesting a link between perceptual processing and decision-making. We discuss how these probability effects could be indicative of sharpening of neurons preferring the probable orientations, due either to perceptual learning, or to feature-based attention. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. K-forbidden transition probabilities

    International Nuclear Information System (INIS)

    Saitoh, T.R.; Sletten, G.; Bark, R.A.; Hagemann, G.B.; Herskind, B.; Saitoh-Hashimoto, N.; Tsukuba Univ., Ibaraki

    2000-01-01

    Reduced hindrance factors of K-forbidden transitions are compiled for nuclei with A ≈ 180 where γ-vibrational states are observed. Correlations between these reduced hindrance factors and Coriolis forces, statistical level mixing and γ-softness have been studied. It is demonstrated that the K-forbidden transition probabilities are related to γ-softness. The decay of the high-K bandheads has been studied by means of the two-state mixing, which would be induced by the γ-softness, with the use of a number of K-forbidden transitions compiled in the present work, where high-K bandheads are depopulated by both E2 and ΔI=1 transitions. The validity of the two-state mixing scheme has been examined by using the proposed identity of the B(M1)/B(E2) ratios of transitions depopulating high-K bandheads and levels of low-K bands. A breakdown of the identity might indicate that other levels would mediate transitions between high- and low-K states. (orig.)

  19. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  20. Human Error Probability Assessment During Maintenance Activities of Marine Systems

    Directory of Open Access Journals (Sweden)

    Rabiul Islam

    2018-03-01

    Full Text Available Background: Maintenance operations on-board ships are highly demanding. Maintenance operations are intensive activities requiring high man–machine interactions in challenging and evolving conditions. The evolving conditions are weather conditions, workplace temperature, ship motion, noise and vibration, and workload and stress. For example, extreme weather conditions affect seafarers' performance, increasing the chances of error, and, consequently, can cause injuries or fatalities to personnel. An effective human error probability model is required to better manage maintenance on-board ships. The developed model would assist in developing and maintaining effective risk management protocols. Thus, the objective of this study is to develop a human error probability model considering various internal and external factors affecting seafarers' performance. Methods: The human error probability model is developed using probability theory applied to a Bayesian network. The model is tested using data received through a questionnaire survey of >200 experienced seafarers with >5 years of experience. The model developed in this study is used to find the reliability of human performance on particular maintenance activities. Results: The developed methodology is tested on the maintenance of a marine engine's cooling water pump for the engine department and of an anchor windlass for the deck department. In the considered case studies, human error probabilities are estimated in various scenarios and the results are compared between the scenarios and the different seafarer categories. The results of the case studies for both departments are also compared. Conclusion: The developed model is effective in assessing human error probabilities. These probabilities would get dynamically updated as and when new information is available on changes in either internal (i.e., training, experience, and fatigue) or external (i.e., environmental and operational) conditions
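
    A minimal sketch of how a human error probability can be obtained by marginalising a conditional probability table over performance-shaping factors (this is not the paper's full Bayesian network, and every probability below is a hypothetical placeholder rather than a survey-derived value):

      # Hypothetical prior probabilities of two performance-shaping factors
      P_WEATHER = {"calm": 0.7, "rough": 0.3}
      P_FATIGUE = {"low": 0.6, "high": 0.4}

      # Hypothetical conditional probability table P(error | weather, fatigue)
      P_ERROR_GIVEN = {
          ("calm", "low"): 0.01, ("calm", "high"): 0.05,
          ("rough", "low"): 0.04, ("rough", "high"): 0.15,
      }

      def human_error_probability() -> float:
          """Marginal P(error) = sum over states of P(error | w, f) * P(w) * P(f)."""
          return sum(P_ERROR_GIVEN[(w, f)] * pw * pf
                     for w, pw in P_WEATHER.items()
                     for f, pf in P_FATIGUE.items())

      print(f"Marginal human error probability: {human_error_probability():.4f}")   # 0.0434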

  1. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications....

  2. Prediction and probability in sciences

    International Nuclear Information System (INIS)

    Klein, E.; Sacquin, Y.

    1998-01-01

    This book reports the 7 presentations made at the third meeting 'physics and fundamental questions', whose theme was probability and prediction. The concept of probability, invented to apprehend random phenomena, has become an important branch of mathematics and its range of application spreads from radioactivity to species evolution via cosmology or the management of very weak risks. The notion of probability is the basis of quantum mechanics and thus is bound to the very nature of matter. The 7 topics are: - radioactivity and probability, - statistical and quantum fluctuations, - quantum mechanics as a generalized probability theory, - probability and the irrational efficiency of mathematics, - can we foresee the future of the universe?, - chance, eventuality and necessity in biology, - how to manage weak risks? (A.C.)

  3. Entanglement probabilities of polymers: a white noise functional approach

    International Nuclear Information System (INIS)

    Bernido, Christopher C; Carpio-Bernido, M Victoria

    2003-01-01

    The entanglement probabilities for a highly flexible polymer to wind n times around a straight polymer are evaluated using white noise analysis. To introduce the white noise functional approach, the one-dimensional random walk problem is taken as an example. The polymer entanglement scenario, viewed as a random walk on a plane, is then treated and the entanglement probabilities are obtained for a magnetic flux confined along the straight polymer, and a case where an entangled polymer is subjected to the potential V = f-dot(s)θ. In the absence of the magnetic flux and the potential V, the entanglement probabilities reduce to a result obtained by Wiegel

  4. The probability of containment failure by direct containment heating in Zion. Supplement 1

    International Nuclear Information System (INIS)

    Pilch, M.M.; Allen, M.D.; Stamps, D.W.; Tadios, E.L.; Knudson, D.L.

    1994-12-01

    Supplement 1 of NUREG/CR-6075 brings to closure the DCH issue for the Zion plant. It includes the documentation of the peer review process for NUREG/CR-6075, the assessments of four new splinter scenarios defined in working group meetings, and modeling enhancements recommended by the working groups. In the four new scenarios, consistency of the initial conditions has been implemented by using insights from systems-level codes. SCDAP/RELAP5 was used to analyze three short-term station blackout cases with different leak rates. In all three cases, the hot leg or surge line failed well before the lower head and thus the primary system depressurized to a point where DCH was no longer considered a threat. However, these calculations were continued to lower head failure in order to gain insights that were useful in establishing the initial and boundary conditions. The most useful insights are that the RCS pressure is low at vessel breach, metallic blockages in the core region do not melt and relocate into the lower plenum, and melting of upper plenum steel is correlated with hot leg failure. The SCDAP/RELAP5 output was used as input to CONTAIN to assess the containment conditions at vessel breach. The containment-side conditions predicted by CONTAIN are similar to those originally specified in NUREG/CR-6075

  5. The quantum probability calculus

    International Nuclear Information System (INIS)

    Jauch, J.M.

    1976-01-01

    The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)

  6. Breaching the Devil’s Garden- The 6th New Zealand Brigade in Operation Lightfoot. The Second Battle of El Alamein, 23 October 1942. Appendices

    Science.gov (United States)

    2006-02-01

    Fragmentary OCR of scanned appendix pages. Recoverable content: a citation to Rommel. A Narrative and Pictorial History, by Richard D. Law and Craig W. H. Luther, ISBN 0-912138-20-3, R. James Bender Publishing, San Jose; the closing lines of a German engineer battalion order on mines, signed by the company commander (Hauptmann) and battalion commander (Btl. Kdr.); the appendix heading "Breaching the 'Devil's Garden' - Operation Lightfoot" (p. F-49); and unit listings including the 51st Semi-Motorized Mixed Engineer Battalion (battaglione del genio e di collegamenti) (CPT Alberti, assigned 12/323 as of 22 Aug) and the 15th Semi-

  7. Excluding joint probabilities from quantum theory

    Science.gov (United States)

    Allahverdyan, Armen E.; Danageozian, Arshag

    2018-03-01

    Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after the Born probability for a single observable. Instead, various definitions were suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and are consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables, in Hilbert spaces with dimension larger than two. If measurement contexts are included in the definition, joint probabilities are not excluded anymore, but they are still constrained by imprecise probabilities.

  8. Linear positivity and virtual probability

    International Nuclear Information System (INIS)

    Hartle, James B.

    2004-01-01

    We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. The objective of any quantum theory of a closed system, most generally the universe, is the prediction of probabilities for the individual members of sets of alternative coarse-grained histories of the system. Quantum interference between members of a set of alternative histories is an obstacle to assigning probabilities that are consistent with the rules of probability theory. A quantum theory of closed systems therefore requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to include values outside the range of 0-1 is described. Alternatives with such virtual probabilities cannot be measured or recorded, but can be used in the intermediate steps of calculations of real probabilities. Extended probabilities give a simple and general way of formulating quantum theory. The various decoherence conditions are compared in terms of their utility for characterizing classicality and the role they might play in further generalizations of quantum mechanics

  9. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more - these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  10. An Alluvial Fan at Apollinaris Patera, Mars

    OpenAIRE

    Ghail, RC; Hutchison, JE

    2003-01-01

    Apollinaris Patera, Mars (7°S, 173°E), is an intermediate-sized volcano (~6 km high, 150 km diameter) with a large (200-km long) fan-like deposit on its southern flank. This fan is deeply incised and originates from a single breach in the rim of the summit caldera. New topographic and multispectral image data reveal that this fan is alluvial, implying a long-lived source of (volcaniclastic) sediment and water (probably from a caldera lake).

  11. Probable Inference and Quantum Mechanics

    International Nuclear Information System (INIS)

    Grandy, W. T. Jr.

    2009-01-01

    In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.

  12. Psychophysics of the probability weighting function

    Science.gov (United States)

    Takahashi, Taiki

    2011-03-01

    A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics have widely utilized probability weighting functions, the psychophysical foundations of the probability weighting functions have been unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p) = exp(-(-ln p)^α) (with 0 < α < 1 and fixed points w(0) = 0, w(1/e) = 1/e, w(1) = 1), which has been extensively studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
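
    A short sketch evaluating the reconstructed Prelec weighting function (the α value chosen below is illustrative only):

      import math

      def prelec_weight(p: float, alpha: float = 0.65) -> float:
          """Prelec (1998) one-parameter weighting function w(p) = exp(-(-ln p)**alpha)."""
          if p <= 0.0:
              return 0.0
          return math.exp(-((-math.log(p)) ** alpha))

      # Fixed points w(0)=0, w(1/e)=1/e, w(1)=1; for 0 < alpha < 1 small probabilities
      # are overweighted and large probabilities underweighted (inverse-S shape).
      for p in (0.01, 1 / math.e, 0.5, 0.99):
          print(f"w({p:.3f}) = {prelec_weight(p):.3f}")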

  13. Ruin probability with claims modeled by a stationary ergodic stable process

    NARCIS (Netherlands)

    Mikosch, T.; Samorodnitsky, G.

    2000-01-01

    For a random walk with negative drift we study the exceedance probability (ruin probability) of a high threshold. The steps of this walk (claim sizes) constitute a stationary ergodic stable process. We study how ruin occurs in this situation and evaluate the asymptotic behavior of the ruin
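
    A rough Monte Carlo illustration of the setting (not the paper's method: the paper's claims form a stationary ergodic stable process, whereas the sketch below uses i.i.d. Pareto claims, truncates the horizon, and plain Monte Carlo is in any case a poor estimator in exactly the heavy-tailed regime studied; all parameters are invented):

      import random

      def ruin_probability(u: float, premium: float = 2.5, alpha: float = 1.8,
                           n_steps: int = 1_000, n_paths: int = 2_000, seed: int = 1) -> float:
          """Fraction of finite-horizon paths on which the surplus u + n*premium - claims
          drops below zero. The mean claim is alpha/(alpha-1) = 2.25 < premium, so the
          net claim walk has negative drift and ruin requires a rare large excursion."""
          rng = random.Random(seed)
          ruined = 0
          for _ in range(n_paths):
              surplus = u
              for _ in range(n_steps):
                  surplus += premium - rng.paretovariate(alpha)   # heavy-tailed claim size
                  if surplus < 0:
                      ruined += 1
                      break
          return ruined / n_paths

      for u in (5.0, 20.0, 50.0):
          print(u, ruin_probability(u))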

  14. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  15. Bireysel Farklılıklar İle Psikolojik Sözleşme İhlali Arasındaki İlişkide Üstün Desteğinin Aracılık Rolü (The Mediating Role Of Supervisor Support In The Relationship Between Individual Differences And Psychological Contract Breach)

    OpenAIRE

    Canan Nur KARABEY

    2016-01-01

    The aim of this study is to determine the individual differences that are effective in the formation of psychological contract breach perception and to examine whether perceived supervisor support has a mediating role in the effect of these differences. Psychological contract refers to each party's evaluation regarding what will be presented to and received from the other in the employee-organization relationship. Psychological contract breach reflects employee's assessment that t...

  16. The State of Integrated Air and Missile Defense Held in Laurel, Maryland on July 14, 2011

    Science.gov (United States)

    2011-07-14

    Slide fragments on data breach trends (source: 2011 Data Breach Investigations Report / Verizon Data Breach Study): ... compromised from servers (-22%); 86% were discovered by a third party (+25%); 96% of breaches were avoidable (+/-0). Threat sources listed: foreign espionage, terrorists, state-sponsored attacks. Discussion prompts: What commonalities exist? How do breaches occur? Quoted finding: "Breaching organizations still doesn't typically require highly sophisticated attacks, most victims are a target of opportunity rather

  17. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    Contents: Foreword; Preface; Sets, Events, and Probability; The Algebra of Sets; The Bernoulli Sample Space; The Algebra of Multisets; The Concept of Probability; Properties of Probability Measures; Independent Events; The Bernoulli Process; The R Language; Finite Processes; The Basic Models; Counting Rules; Computing Factorials; The Second Rule of Counting; Computing Probabilities; Discrete Random Variables; The Bernoulli Process: Tossing a Coin; The Bernoulli Process: Random Walk; Independence and Joint Distributions; Expectations; The Inclusion-Exclusion Principle; General Random Variable

  18. Estimation of post-test probabilities by residents: Bayesian reasoning versus heuristics?

    Science.gov (United States)

    Hall, Stacey; Phang, Sen Han; Schaefer, Jeffrey P; Ghali, William; Wright, Bruce; McLaughlin, Kevin

    2014-08-01

    Although the process of diagnosing invariably begins with a heuristic, we encourage our learners to support their diagnoses by analytical cognitive processes, such as Bayesian reasoning, in an attempt to mitigate the effects of heuristics on diagnosing. There are, however, limited data on the use ± impact of Bayesian reasoning on the accuracy of disease probability estimates. In this study our objective was to explore whether Internal Medicine residents use a Bayesian process to estimate disease probabilities by comparing their disease probability estimates to literature-derived Bayesian post-test probabilities. We gave 35 Internal Medicine residents four clinical vignettes in the form of a referral letter and asked them to estimate the post-test probability of the target condition in each case. We then compared these to literature-derived probabilities. For each vignette the estimated probability was significantly different from the literature-derived probability. For the two cases with low literature-derived probability our participants significantly overestimated the probability of these target conditions being the correct diagnosis, whereas for the two cases with high literature-derived probability the estimated probability was significantly lower than the calculated value. Our results suggest that residents generate inaccurate post-test probability estimates. Possible explanations for this include ineffective application of Bayesian reasoning, attribute substitution whereby a complex cognitive task is replaced by an easier one (e.g., a heuristic), or systematic rater bias, such as central tendency bias. Further studies are needed to identify the reasons for inaccuracy of disease probability estimates and to explore ways of improving accuracy.
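
    For comparison, the literature-derived calculation referred to above is a straightforward application of Bayes' theorem via odds; a minimal sketch follows (the pre-test probability and likelihood ratio are illustrative, not those of the study's vignettes):

      def post_test_probability(pre_test_prob: float, likelihood_ratio: float) -> float:
          """Convert a pre-test probability and a test likelihood ratio into a post-test probability."""
          pre_odds = pre_test_prob / (1.0 - pre_test_prob)
          post_odds = pre_odds * likelihood_ratio
          return post_odds / (1.0 + post_odds)

      # e.g. pre-test probability 0.10 and a positive test with LR+ = 8
      print(f"{post_test_probability(0.10, 8.0):.2f}")   # ~0.47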

  19. Probability of collective excited state decay

    International Nuclear Information System (INIS)

    Manykin, Eh.A.; Ozhovan, M.I.; Poluehktov, P.P.

    1987-01-01

    Decay mechanisms of the condensed excited state formed from highly excited (Rydberg) atoms are considered, i.e. the stability of the so-called Rydberg substance is analyzed. It is shown that Auger recombination and radiative transitions are the basic processes. The corresponding probabilities are calculated and compared. It is found that the ''Rydberg substance'' possesses a macroscopic lifetime (several seconds) and is, in this sense, metastable

  20. The mediating effect of psychosocial factors on suicidal probability among adolescents.

    Science.gov (United States)

    Hur, Ji-Won; Kim, Won-Joong; Kim, Yong-Ku

    2011-01-01

    Suicidal probability is an overall tendency encompassing negative self-evaluation, hopelessness, suicidal ideation, and hostility. The purpose of this study was to examine the role of psychosocial variables in the suicidal probability of adolescents, especially their mediating role. This study investigated the mediating effects of psychosocial factors such as depression, anxiety, self-esteem, stress, and social support on suicidal probability among 1,586 adolescents attending middle and high schools in the Kyunggi Province area of South Korea. The relationship between depression and anxiety, on the one hand, and suicidal probability, on the other, was mediated by both social resources and self-esteem. Furthermore, the influence of social resources was mediated by interpersonal and achievement stress as well as self-esteem. This study suggests that suicidal probability in adolescents has various relationships, including mediating relations, with several psychosocial factors. Interventions targeting suicidal probability in adolescents should therefore address social factors as well as clinical symptoms.

  1. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew...... characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look-out etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds...... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of the energy released for crushing of structures giving...

  2. Irreversibility and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. The retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information', since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)

  3. Production of 147Eu for gamma-ray emission probability measurement

    International Nuclear Information System (INIS)

    Katoh, Keiji; Marnada, Nada; Miyahara, Hiroshi

    2002-01-01

    Gamma-ray emission probability is one of the most important decay parameters of a radionuclide, and many researchers are working to improve its accuracy. The γ-ray emission probabilities of neutron-rich nuclides are gradually being refined, but those of proton-rich nuclides remain insufficiently known. Europium-147, which decays by electron capture or β + -particle emission, is a proton-rich nuclide, and the γ-ray emission probabilities evaluated by Mateosian and Peker have large uncertainties; they referred to only one report on γ-ray emission probabilities. Our final purpose is to determine precise γ-ray emission probabilities of 147 Eu from disintegration rates and γ-ray intensities measured with a 4πβ-γ coincidence apparatus. Impurity nuclides strongly affect the determination of the disintegration rate; therefore, a highly pure 147 Eu source is required. This short note describes the most suitable proton energy for 147 Eu production through the 147 Sm(p, n) reaction. (author)

  4. Probability intervals for the top event unavailability of fault trees

    International Nuclear Information System (INIS)

    Lee, Y.T.; Apostolakis, G.E.

    1976-06-01

    The evaluation of probabilities of rare events is of major importance in the quantitative assessment of the risk from large technological systems. In particular, for nuclear power plants the complexity of the systems, their high reliability and the lack of significant statistical records have led to the extensive use of logic diagrams in the estimation of low probabilities. The estimation of probability intervals for the probability of existence of the top event of a fault tree is examined. Given the uncertainties of the primary input data, a method is described for the evaluation of the first four moments of the top event occurrence probability. These moments are then used to estimate confidence bounds by several approaches which are based on standard inequalities (e.g., Tchebycheff, Cantelli, etc.) or on empirical distributions (the Johnson family). Several examples indicate that the Johnson family of distributions yields results which are in good agreement with those produced by Monte Carlo simulation
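    One way to see how the first moments of the top-event probability translate into a probability interval is to propagate the uncertain basic-event probabilities by sampling and then apply a one-sided Chebyshev (Cantelli) bound. The sketch below only illustrates that bounding step, with a made-up two-cut-set fault tree and lognormal input uncertainties; it is not the moment-propagation or Johnson-family fitting used in the report.

        import numpy as np

        rng = np.random.default_rng(1)

        # Hypothetical top event T = (A and B) or C; exact union probability for
        # independent basic events with uncertain (lognormal) probabilities.
        def top_event(pa, pb, pc):
            return pa * pb + pc - pa * pb * pc

        n = 100_000
        pa = rng.lognormal(np.log(1e-3), 0.5, n)
        pb = rng.lognormal(np.log(2e-3), 0.5, n)
        pc = rng.lognormal(np.log(1e-5), 0.7, n)

        q = top_event(pa, pb, pc)
        mean, sigma = q.mean(), q.std()

        # One-sided Chebyshev (Cantelli): P(Q >= mean + k*sigma) <= 1/(1 + k^2); 95% bound -> k = sqrt(19)
        k = np.sqrt(0.95 / 0.05)
        print(f"mean={mean:.2e}  Cantelli 95% bound={mean + k * sigma:.2e}  empirical 95th pct={np.percentile(q, 95):.2e}")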

  5. Risk Preferences, Probability Weighting, and Strategy Tradeoffs in Wildfire Management.

    Science.gov (United States)

    Hand, Michael S; Wibbenmeyer, Matthew J; Calkin, David E; Thompson, Matthew P

    2015-10-01

    Wildfires present a complex applied risk management environment, but relatively little attention has been paid to behavioral and cognitive responses to risk among public agency wildfire managers. This study investigates responses to risk, including probability weighting and risk aversion, in a wildfire management context using a survey-based experiment administered to federal wildfire managers. Respondents were presented with a multiattribute lottery-choice experiment where each lottery is defined by three outcome attributes: expenditures for fire suppression, damage to private property, and exposure of firefighters to the risk of aviation-related fatalities. Respondents choose one of two strategies, each of which includes "good" (low cost/low damage) and "bad" (high cost/high damage) outcomes that occur with varying probabilities. The choice task also incorporates an information framing experiment to test whether information about fatality risk to firefighters alters managers' responses to risk. Results suggest that managers exhibit risk aversion and nonlinear probability weighting, which can result in choices that do not minimize expected expenditures, property damage, or firefighter exposure. Information framing tends to result in choices that reduce the risk of aviation fatalities, but exacerbates nonlinear probability weighting. © 2015 Society for Risk Analysis.
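    The nonlinear probability weighting reported here is commonly modelled with a one-parameter weighting function such as Tversky and Kahneman's; the study's exact specification may differ, and the numbers below are purely illustrative. The sketch shows how a gamma below one inflates a small probability of a bad outcome and can make a strategy look costlier than its expected value.

        import numpy as np

        def tk_weight(p, gamma):
            # Tversky-Kahneman one-parameter probability weighting function
            return p**gamma / (p**gamma + (1.0 - p)**gamma) ** (1.0 / gamma)

        for p in (0.01, 0.10, 0.50, 0.90, 0.99):
            print(f"p={p:.2f}  w(p)={tk_weight(p, 0.61):.3f}")   # small p overweighted, large p underweighted

        # Illustrative two-outcome strategy: 5% chance of a bad outcome costing 10 units, else 1 unit.
        p_bad, gamma = 0.05, 0.61
        expected_cost = p_bad * 10.0 + (1 - p_bad) * 1.0
        weighted_cost = tk_weight(p_bad, gamma) * 10.0 + (1 - tk_weight(p_bad, gamma)) * 1.0
        print(expected_cost, weighted_cost)   # the subjectively weighted cost exceeds the expected cost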

  6. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Discrete ProbabilityThe Cast of Characters Properties of Probability Simulation Random SamplingConditional ProbabilityIndependenceDiscrete DistributionsDiscrete Random Variables, Distributions, and ExpectationsBernoulli and Binomial Random VariablesGeometric and Negative Binomial Random Variables Poisson DistributionJoint, Marginal, and Conditional Distributions More on ExpectationContinuous ProbabilityFrom the Finite to the (Very) Infinite Continuous Random Variables and DistributionsContinuous ExpectationContinuous DistributionsThe Normal Distribution Bivariate Normal DistributionNew Random Variables from OldOrder Statistics Gamma DistributionsChi-Square, Student's t, and F-DistributionsTransformations of Normal Random VariablesAsymptotic TheoryStrong and Weak Laws of Large Numbers Central Limit TheoremStochastic Processes and ApplicationsMarkov ChainsPoisson Processes QueuesBrownian MotionFinancial MathematicsAppendixIntroduction to Mathematica Glossary of Mathematica Commands for Probability Short Answers...

  7. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decisionmaker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
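    The central point - that setting the threshold from estimated parameters yields a realized failure frequency above the nominal target - can be reproduced with a small simulation. The setup below (normal log-losses, n = 30 observations, a 1% nominal failure probability) is an assumption chosen for illustration, not the article's worked example.

        import numpy as np

        rng = np.random.default_rng(0)
        mu_true, sigma_true = 0.0, 1.0
        nominal, n, trials = 0.01, 30, 20_000
        z99 = 2.3263478740408408                 # standard normal 99% quantile

        failures = 0
        for _ in range(trials):
            sample = rng.normal(mu_true, sigma_true, n)              # past log-losses
            threshold = sample.mean() + z99 * sample.std(ddof=1)     # estimated 99% quantile
            failures += rng.normal(mu_true, sigma_true) > threshold  # next period's loss

        print(f"nominal: {nominal:.3f}   realized frequency: {failures / trials:.3f}")
        # With n = 30 the realized frequency is typically around 1.5%, i.e. above the nominal 1%.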

  8. The estimated lifetime probability of acquiring human papillomavirus in the United States.

    Science.gov (United States)

    Chesson, Harrell W; Dunne, Eileen F; Hariri, Susan; Markowitz, Lauri E

    2014-11-01

    Estimates of the lifetime probability of acquiring human papillomavirus (HPV) can help to quantify HPV incidence, illustrate how common HPV infection is, and highlight the importance of HPV vaccination. We developed a simple model, based primarily on the distribution of lifetime numbers of sex partners across the population and the per-partnership probability of acquiring HPV, to estimate the lifetime probability of acquiring HPV in the United States in the time frame before HPV vaccine availability. We estimated the average lifetime probability of acquiring HPV among those with at least 1 opposite sex partner to be 84.6% (range, 53.6%-95.0%) for women and 91.3% (range, 69.5%-97.7%) for men. Under base case assumptions, more than 80% of women and men acquire HPV by age 45 years. Our results are consistent with estimates in the existing literature suggesting a high lifetime probability of HPV acquisition and are supported by cohort studies showing high cumulative HPV incidence over a relatively short period, such as 3 to 5 years.
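    The type of simple model described, combining the distribution of lifetime partner numbers with a per-partnership acquisition probability, reduces to averaging 1 - (1 - p)^k over that distribution. The partner distribution and per-partnership probability below are illustrative placeholders, not the study's inputs.

        import numpy as np

        partners = np.array([1, 2, 3, 5, 10, 20])                     # hypothetical lifetime partner counts
        weights  = np.array([0.25, 0.20, 0.15, 0.20, 0.12, 0.08])     # hypothetical distribution (sums to 1)
        p_per_partner = 0.40                                          # assumed per-partnership probability

        lifetime_prob = np.sum(weights * (1.0 - (1.0 - p_per_partner) ** partners))
        print(f"illustrative lifetime probability of acquiring HPV: {lifetime_prob:.2f}")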

  9. Breach of belongingness: Newcomer relationship conflict, information, and task-related outcomes during organizational socialization.

    Science.gov (United States)

    Nifadkar, Sushil S; Bauer, Talya N

    2016-01-01

    Previous studies of newcomer socialization have underlined the importance of newcomers' information seeking for their adjustment to the organization, and the conflict literature has consistently reported negative effects of relationship conflict with coworkers. However, to date, no study has examined the consequences of relationship conflict on newcomers' information seeking. In this study, we examined newcomers' reactions when they have relationship conflict with their coworkers, and hence cannot obtain necessary information from them. Drawing upon belongingness theory, we propose a model that moves from breach of belongingness to its proximal and distal consequences, to newcomer information seeking, and then to task-related outcomes. In particular, we propose that two paths exist, one coworker-centric and the other supervisor-centric, which may have simultaneous yet contrasting influences on newcomer adjustment. To test our model, we employ a 3-wave data collection research design with egocentric and Likert-type multisource surveys among a sample of new software engineers and their supervisors working in India. This study contributes to the field by linking the literatures on relationship conflict and newcomer information seeking and suggesting that despite conflict with coworkers, newcomers may succeed in organizations by building relationships with and obtaining information from supervisors. (c) 2016 APA, all rights reserved.

  10. On the probability of occurrence of rogue waves

    Directory of Open Access Journals (Sweden)

    E. M. Bitner-Gregersen

    2012-03-01

    Full Text Available A number of extreme and rogue wave studies have been conducted theoretically, numerically, experimentally and based on field data in recent years, and these have significantly advanced our knowledge of ocean waves. So far, however, consensus on the probability of occurrence of rogue waves has not been achieved. The present investigation addresses this topic from the perspective of design needs. The probability of occurrence of extreme and rogue wave crests in deep water is discussed here on the basis of higher-order time simulations, experiments and hindcast data. Focus is given to the occurrence of rogue waves in high sea states.

  11. Medical negligence. An overview of legal theory and neurosurgical practice: causation.

    Science.gov (United States)

    Todd, Nicholas V

    2014-06-01

    This article discusses the principles of the law in relation to legal causation as applied to neurosurgical practice. Causation is a causal link between a breach of duty of care and the final harm. The fundamental "but-for" test for causation will be discussed, together with Chester v Afshar modified causation, prospective and retrospective probabilities of harm, loss of a chance, causation following breach of duty of care by omission, breaking the chain of causation, material contribution and the law in relation to multiple defendants, with neurosurgical examples.

  12. INVESTIGATION OF INFLUENCE OF ENCODING FUNCTION COMPLEXITY ON DISTRIBUTION OF ERROR MASKING PROBABILITY

    Directory of Open Access Journals (Sweden)

    A. B. Levina

    2016-03-01

    Full Text Available Error detection codes are mechanisms that enable robust delivery of data over unreliable, error-prone communication channels and devices; they allow such errors to be detected. There are two classes of error detecting codes: classical codes and security-oriented codes. Classical codes detect a high percentage of errors, but they have a high probability of missing an error introduced by algebraic manipulation. Security-oriented codes, in contrast, are codes with a small Hamming distance and high protection against algebraic manipulation. The probability of error masking is a fundamental parameter of security-oriented codes; a detailed study of this parameter allows the behavior of the code to be analyzed when errors are injected into the encoding device. The complexity of the encoding function also plays an important role in security-oriented codes: encoding functions with low computational complexity and a low masking probability provide the best protection of the encoding device against malicious acts. This paper investigates the influence of encoding function complexity on the error masking probability distribution. It is shown that a more complex encoding function reduces the maximum of the error masking probability, and that increasing the function complexity changes the error masking probability distribution; in particular, it decreases the difference between the maximum and the average value of the error masking probability. Our results show that functions with greater complexity have smoothed maxima of the error masking probability, which significantly complicates the analysis of the code by an attacker. As a result, with a complex encoding function the probability of algebraic manipulation is reduced. The paper also discusses an approach to measuring the error masking
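    For the classical-code case mentioned above, the masking probability has a simple closed form: with a linear [n, k] code, an additive error goes undetected exactly when the error pattern is itself a nonzero codeword, so under uniformly distributed nonzero errors the masking probability is (2^k - 1)/(2^n - 1). The brute-force check below uses a toy even-parity code chosen for illustration, not a code from the paper.

        from itertools import product

        # Hypothetical [4, 3] even-parity code: all 4-bit words of even weight.
        n, k = 4, 3
        code = {w for w in product((0, 1), repeat=n) if sum(w) % 2 == 0}

        def undetected(error):
            # An error is masked if it maps every codeword onto another codeword.
            return all(tuple(c ^ e for c, e in zip(cw, error)) in code for cw in code)

        errors = [e for e in product((0, 1), repeat=n) if any(e)]   # nonzero error patterns
        q = sum(undetected(e) for e in errors) / len(errors)
        print(q, (2**k - 1) / (2**n - 1))   # both 7/15 ≈ 0.467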

  13. A first course in probability

    CERN Document Server

    Ross, Sheldon

    2014-01-01

    A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.

  14. Lectures on probability and statistics

    International Nuclear Information System (INIS)

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another
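    The forward and inverse problems described here can be put side by side in a few lines: the forward calculation assigns a probability to a specified dice outcome from the fair-die model, while the inverse (statistical) calculation infers the unknown probability of a six from observed rolls via the likelihood. The counts used are made up for illustration.

        import math

        # Forward problem: probability of exactly two sixes in five rolls of a fair die.
        p_fair = 1.0 / 6.0
        forward = math.comb(5, 2) * p_fair**2 * (1.0 - p_fair)**3
        print(f"P(two sixes in five rolls | fair die) = {forward:.4f}")

        # Inverse problem: 30 sixes observed in 120 rolls; which P(six) is most likely?
        def log_likelihood(p, sixes=30, rolls=120):
            return sixes * math.log(p) + (rolls - sixes) * math.log(1.0 - p)

        grid = [i / 1000 for i in range(1, 1000)]
        mle = max(grid, key=log_likelihood)
        print(f"maximum-likelihood estimate of P(six) = {mle:.3f}")   # 0.250, well above 1/6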

  15. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments...... that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still...

  16. A Multidisciplinary Approach for Teaching Statistics and Probability

    Science.gov (United States)

    Rao, C. Radhakrishna

    1971-01-01

    The author presents a syllabus for an introductory (first year after high school) course in statistics and probability and some methods of teaching statistical techniques. The description comes basically from the procedures used at the Indian Statistical Institute, Calcutta. (JG)

  17. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...

  18. Market-implied risk-neutral probabilities, actual probabilities, credit risk and news

    Directory of Open Access Journals (Sweden)

    Shashidhar Murthy

    2011-09-01

    Full Text Available Motivated by the credit crisis, this paper investigates links between risk-neutral probabilities of default implied by markets (e.g. from yield spreads) and their actual counterparts (e.g. from ratings). It discusses differences between the two and clarifies the underlying economic intuition using simple representations of credit risk pricing. Observed large differences across bonds in the ratio of the two probabilities are shown to imply that apparently safer securities can be more sensitive to news.

  19. Dopaminergic Drug Effects on Probability Weighting during Risky Decision Making.

    Science.gov (United States)

    Ojala, Karita E; Janssen, Lieneke K; Hashemi, Mahur M; Timmer, Monique H M; Geurts, Dirk E M; Ter Huurne, Niels P; Cools, Roshan; Sescousse, Guillaume

    2018-01-01

    Dopamine has been associated with risky decision-making, as well as with pathological gambling, a behavioral addiction characterized by excessive risk-taking behavior. However, the specific mechanisms through which dopamine might act to foster risk-taking and pathological gambling remain elusive. Here we test the hypothesis that this might be achieved, in part, via modulation of subjective probability weighting during decision making. Human healthy controls (n = 21) and pathological gamblers (n = 16) played a decision-making task involving choices between sure monetary options and risky gambles both in the gain and loss domains. Each participant played the task twice, either under placebo or the dopamine D2/D3 receptor antagonist sulpiride, in a double-blind counterbalanced design. A prospect theory modelling approach was used to estimate subjective probability weighting and sensitivity to monetary outcomes. Consistent with prospect theory, we found that participants presented a distortion in the subjective weighting of probabilities, i.e., they overweighted low probabilities and underweighted moderate to high probabilities, both in the gain and loss domains. Compared with placebo, sulpiride attenuated this distortion in the gain domain. Across drugs, the groups did not differ in their probability weighting, although gamblers consistently underweighted losing probabilities in the placebo condition. Overall, our results reveal that dopamine D2/D3 receptor antagonism modulates the subjective weighting of probabilities in the gain domain, in the direction of more objective, economically rational decision making.

  20. Dopaminergic Drug Effects on Probability Weighting during Risky Decision Making

    Science.gov (United States)

    Timmer, Monique H. M.; ter Huurne, Niels P.

    2018-01-01

    Abstract Dopamine has been associated with risky decision-making, as well as with pathological gambling, a behavioral addiction characterized by excessive risk-taking behavior. However, the specific mechanisms through which dopamine might act to foster risk-taking and pathological gambling remain elusive. Here we test the hypothesis that this might be achieved, in part, via modulation of subjective probability weighting during decision making. Human healthy controls (n = 21) and pathological gamblers (n = 16) played a decision-making task involving choices between sure monetary options and risky gambles both in the gain and loss domains. Each participant played the task twice, either under placebo or the dopamine D2/D3 receptor antagonist sulpiride, in a double-blind counterbalanced design. A prospect theory modelling approach was used to estimate subjective probability weighting and sensitivity to monetary outcomes. Consistent with prospect theory, we found that participants presented a distortion in the subjective weighting of probabilities, i.e., they overweighted low probabilities and underweighted moderate to high probabilities, both in the gain and loss domains. Compared with placebo, sulpiride attenuated this distortion in the gain domain. Across drugs, the groups did not differ in their probability weighting, although gamblers consistently underweighted losing probabilities in the placebo condition. Overall, our results reveal that dopamine D2/D3 receptor antagonism modulates the subjective weighting of probabilities in the gain domain, in the direction of more objective, economically rational decision making. PMID:29632870

  1. A probability space for quantum models

    Science.gov (United States)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions makes it possible to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, the choice of constraints, and the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
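    The maximum entropy assignment under a mean constraint can be sketched concretely: maximizing the entropy of a discrete distribution subject to a fixed average yields Gibbs-form probabilities p_i proportional to exp(-λE_i), with λ fixed by the constraint. The outcome values and target mean below are arbitrary illustrations, not tied to a particular quantum model.

        import numpy as np

        E = np.array([0.0, 1.0, 2.0, 3.0])   # illustrative outcome values ("energies")
        E_target = 1.2                        # illustrative mean constraint, inside (E.min(), E.max())

        def mean_under(lam):
            w = np.exp(-lam * E)
            return float(np.dot(w / w.sum(), E))

        lo, hi = -50.0, 50.0                  # mean_under is decreasing in lam, so bisect
        for _ in range(200):
            mid = 0.5 * (lo + hi)
            lo, hi = (mid, hi) if mean_under(mid) > E_target else (lo, mid)

        lam = 0.5 * (lo + hi)
        p = np.exp(-lam * E) / np.exp(-lam * E).sum()
        print("lambda =", round(lam, 4), "p =", np.round(p, 4), "mean =", round(mean_under(lam), 4))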

  2. Probability and statistics in particle physics

    International Nuclear Information System (INIS)

    Frodesen, A.G.; Skjeggestad, O.

    1979-01-01

    Probability theory is introduced at an elementary level and given a simple and detailed exposition. The material on statistics has been organised with an eye to the experimental physicist's practical needs, which are likely to be statistical methods for estimation or decision-making. The book is intended for graduate students and research workers in experimental high energy and elementary particle physics, and numerous examples from these fields are presented. (JIW)

  3. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...

  4. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  5. An Upper Bound on High Speed Satellite Collision Probability When Only One Object has Position Uncertainty Information

    Science.gov (United States)

    Frisbee, Joseph H., Jr.

    2015-01-01

    Upper bounds on the high speed satellite collision probability, Pc, have been investigated. Previous methods assume that an individual position error covariance matrix is available for each object, the two matrices being combined into a single, relative position error covariance matrix. Components of the combined error covariance are then varied to obtain a maximum Pc. If error covariance information is available for only one of the two objects, either some default shape must be used or nothing can be done. An alternative is presented that uses the known covariance information along with a critical value of the missing covariance to obtain an approximate but potentially useful Pc upper bound.

  6. Effects of the Upper Taum Sauk Reservoir Embankment Breach on the Surface-Water Quality and Sediments of the East Fork Black River and the Black River, Southeastern Missouri - 2006-07

    Science.gov (United States)

    Barr, Miya N.

    2009-01-01

    On December 14, 2005, a 680-foot wide section of the upper reservoir embankment of the Taum Sauk pump-storage hydroelectric powerplant located in Reynolds County, Missouri, suddenly failed. This catastrophic event sent approximately 1.5 billion gallons of water into the Johnson's Shut-Ins State Park and into the East Fork Black River, and deposited enormous quantities of rock, soil, and vegetation in the flooded areas. Water-quality data were collected within and below the impacted area to study and document the changes to the riverine system. Data collection included routine, event-based, and continuous surface-water quality monitoring as well as suspended- and streambed-sediment sampling. Surface water-quality samples were collected and analyzed for a suite of physical and chemical constituents including: turbidity; nutrients; major ions such as calcium, magnesium, and potassium; total suspended solids; total dissolved solids; trace metals such as aluminum, iron, and lead; and suspended-sediment concentrations. Suspended-sediment concentrations were used to calculate daily sediment discharge. A peculiar blue-green coloration on the water surface of the East Fork Black River and Black River was evident downstream from the lower reservoir during the first year of the study. It is possible that this phenomenon was the result of 'rock flour' occurring when the upper reservoir embankment was breached, scouring the mountainside and producing extremely fine sediment particles, or from the alum-based flocculant used to reduce turbidity in the lower reservoir. It also was determined that no long-term effects of the reservoir embankment breach are expected, as the turbidity, the concentrations of trace metals such as total recoverable aluminum, dissolved aluminum, and dissolved iron, and the suspended-sediment concentration all decreased over time. Larger concentrations of these constituents during the beginning of the study also could be a direct result of the alum

  7. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out......-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability...

  8. High-resolution elastic recoil detection utilizing Bayesian probability theory

    International Nuclear Information System (INIS)

    Neumaier, P.; Dollinger, G.; Bergmaier, A.; Genchev, I.; Goergens, L.; Fischer, R.; Ronning, C.; Hofsaess, H.

    2001-01-01

    Elastic recoil detection (ERD) analysis is improved in view of depth resolution and the reliability of the measured spectra. Good statistics at even low ion fluences are obtained by utilizing a large solid angle of 5 msr at the Munich Q3D magnetic spectrograph and using a 40 MeV 197 Au beam. In this way the elemental depth profiles are not essentially altered during analysis, even if distributions with area densities below 1x10 14 atoms/cm 2 are measured. As the energy spread due to the angular acceptance is fully eliminated by ion-optical and numerical corrections, an accurate and reliable apparatus function is derived. It allows the measured spectra to be deconvoluted using the adaptive kernel method, a maximum entropy concept in the framework of Bayesian probability theory. In addition, the uncertainty of the reconstructed spectra is quantified. The concepts are demonstrated on 13 C depth profiles measured on ultra-thin films of tetrahedral amorphous carbon (ta-C). Depth scales of those profiles are given with an accuracy of 1.4x10 15 atoms/cm 2

  9. CGC/saturation approach for soft interactions at high energy: survival probability of central exclusive production

    Energy Technology Data Exchange (ETDEWEB)

    Gotsman, E.; Maor, U. [Tel Aviv University, Department of Particle Physics, Raymond and Beverly Sackler Faculty of Exact Science, School of Physics and Astronomy, Tel Aviv (Israel); Levin, E. [Tel Aviv University, Department of Particle Physics, Raymond and Beverly Sackler Faculty of Exact Science, School of Physics and Astronomy, Tel Aviv (Israel); Universidad Tecnica Federico Santa Maria, Departemento de Fisica, Centro Cientifico-Tecnologico de Valparaiso, Valparaiso (Chile)

    2016-04-15

    We estimate the value of the survival probability for central exclusive production in a model which is based on the CGC/saturation approach. Hard and soft processes are described in the same framework. At LHC energies, we obtain a small value for the survival probability. The source of the small value is the impact parameter dependence of the hard amplitude. Our model has successfully described a large body of soft data: elastic, inelastic and diffractive cross sections, inclusive production and rapidity correlations, as well as the t-dependence of deep inelastic diffractive production of vector mesons. (orig.)

  10. Impact probabilities of meteoroid streams with artificial satellites: An assessment

    International Nuclear Information System (INIS)

    Foschini, L.; Cevolani, G.

    1997-01-01

    Impact probabilities of artificial satellites with meteoroid streams were calculated using data collected with the CNR forward scatter (FS) bistatic radar over the Bologna-Lecce baseline (about 700 km). Results show that these impact probabilities are two times higher than previously calculated values. Nevertheless, although catastrophic impacts are still rare even under meteor storm conditions, it is expected that high meteoroid fluxes can erode satellite surfaces and weaken their external structures
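    Such impact probabilities are commonly expressed with a Poisson model in which the expected number of impacts is the flux multiplied by the exposed area and the exposure time. The sketch below uses placeholder values, not the radar-derived fluxes of this study.

        import math

        def impact_probability(flux_per_m2_per_hour, area_m2, hours):
            # Poisson probability of at least one impact during the exposure
            expected = flux_per_m2_per_hour * area_m2 * hours
            return 1.0 - math.exp(-expected)

        # Placeholder: a 10 m^2 satellite exposed for 6 hours to a storm-level flux of 1e-6 m^-2 h^-1
        print(f"P(at least one impact) = {impact_probability(1e-6, 10.0, 6.0):.2e}")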

  11. Invariant probabilities of transition functions

    CERN Document Server

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  12. Probability & Statistics: Modular Learning Exercises. Teacher Edition

    Science.gov (United States)

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The modules also introduce students to real world math concepts and problems that property and casualty actuaries come across in their work. They are designed to be used by teachers and…

  13. Probability & Statistics: Modular Learning Exercises. Student Edition

    Science.gov (United States)

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The materials are centered on the fictional town of Happy Shores, a coastal community which is at risk for hurricanes. Actuaries at an insurance company figure out the risks and…

  14. The concept of probability

    International Nuclear Information System (INIS)

    Bitsakis, E.I.; Nicolaides, C.A.

    1989-01-01

    The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our everyday life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs

  15. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice...... probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications....... The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models....
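    In the simplest case, the multinomial logit model, the generating function is the log-sum-exp of the systematic utilities and its gradient recovers the familiar softmax choice probabilities. The check below is a minimal numerical illustration of that gradient property, not the paper's general construction.

        import numpy as np

        def cpgf(v):
            # log-sum-exp generating function of the multinomial logit model
            m = np.max(v)
            return m + np.log(np.sum(np.exp(v - m)))

        def choice_probabilities(v):
            e = np.exp(v - np.max(v))
            return e / e.sum()

        v = np.array([1.0, 0.5, -0.2])           # illustrative systematic utilities
        p = choice_probabilities(v)

        eps = 1e-6                                # numerical gradient of the CPGF
        grad = np.array([(cpgf(v + eps * np.eye(3)[i]) - cpgf(v)) / eps for i in range(3)])
        print(np.round(p, 4), np.round(grad, 4))  # the gradient reproduces the choice probabilities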

  16. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  17. The probability outcome correpondence principle : a dispositional view of the interpretation of probability statements

    NARCIS (Netherlands)

    Keren, G.; Teigen, K.H.

    2001-01-01

    This article presents a framework for lay people's internal representations of probabilities, which supposedly reflect the strength of underlying dispositions, or propensities, associated with the predicted event. From this framework, we derive the probability-outcome correspondence principle, which

  18. Poisson Processes in Free Probability

    OpenAIRE

    An, Guimei; Gao, Mingchu

    2015-01-01

    We prove a multidimensional Poisson limit theorem in free probability and define joint free Poisson distributions in a non-commutative probability space. We define (compound) free Poisson processes explicitly, in analogy with the definitions of (compound) Poisson processes in classical probability. We prove that the sum of finitely many freely independent compound free Poisson processes is a compound free Poisson process. We give a step by step procedure for constructing a (compound) free Poisso...

  19. Solving probability reasoning based on DNA strand displacement and probability modules.

    Science.gov (United States)

    Zhang, Qiang; Wang, Xiaobiao; Wang, Xiaojun; Zhou, Changjun

    2017-12-01

    In computational biology, DNA strand displacement technology is used to simulate the computation process and has shown strong computing ability. Most researchers use it to solve logic problems, but it is only rarely used in probabilistic reasoning. To support probabilistic reasoning, a conditional probability derivation model and a total probability model based on DNA strand displacement were established in this paper. The models were assessed through the game "read your mind" and have been shown to enable the application of probabilistic reasoning in genetic diagnosis. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Sensitivity of the probability of failure to probability of detection curve regions

    International Nuclear Information System (INIS)

    Garza, J.; Millwater, H.

    2016-01-01

    Non-destructive inspection (NDI) techniques have been shown to play a vital role in fracture control plans, structural health monitoring, and ensuring availability and reliability of piping, pressure vessels, mechanical and aerospace equipment. Probabilistic fatigue simulations are often used in order to determine the efficacy of an inspection procedure with the NDI method modeled as a probability of detection (POD) curve. These simulations can be used to determine the most advantageous NDI method for a given application. As an aid to this process, a first order sensitivity method of the probability-of-failure (POF) with respect to regions of the POD curve (lower tail, middle region, right tail) is developed and presented here. The sensitivity method computes the partial derivative of the POF with respect to a change in each region of a POD or multiple POD curves. The sensitivities are computed at no cost by reusing the samples from an existing Monte Carlo (MC) analysis. A numerical example is presented considering single and multiple inspections. - Highlights: • Sensitivities of probability-of-failure to a region of probability-of-detection curve. • The sensitivities are computed with negligible cost. • Sensitivities identify the important region of a POD curve. • Sensitivities can be used as a guide to selecting the optimal POD curve.

  1. Recommendations for the tuning of rare event probability estimators

    International Nuclear Information System (INIS)

    Balesdent, Mathieu; Morio, Jérôme; Marzat, Julien

    2015-01-01

    Being able to accurately estimate rare event probabilities is a challenging issue when improving the reliability of complex systems. Several powerful methods such as importance sampling, importance splitting or extreme value theory have been proposed in order to reduce the computational cost and to improve the accuracy of extreme probability estimation. However, the performance of these methods is highly correlated with the choice of tuning parameters, which are very difficult to determine. In order to highlight recommended tunings for such methods, an empirical campaign of automatic tuning on a set of representative test cases is conducted for splitting methods. It provides a reduced set of tuning parameters that may lead to reliable estimation of rare event probabilities for various problems. The relevance of the obtained results is assessed on a series of real-world aerospace problems
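    As a toy illustration of why tuning matters for such estimators, the importance-sampling example below estimates P(X > 5) for a standard normal variable with a shifted-normal proposal; the shift is the tuning parameter, and a poor choice (including shift = 0, i.e. plain Monte Carlo) gives a useless estimate. The setup is chosen for illustration, not one of the paper's test cases.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(42)
        t, n = 5.0, 100_000                     # rare event {X > t}, X ~ N(0, 1)

        def is_estimate(shift):
            y = rng.normal(shift, 1.0, n)
            w = norm.pdf(y) / norm.pdf(y, loc=shift)      # likelihood ratio weights
            vals = (y > t) * w
            est = vals.mean()
            rel_err = vals.std() / (np.sqrt(n) * est) if est > 0 else float("inf")
            return est, rel_err

        exact = norm.sf(t)                      # ~2.87e-7
        for shift in (0.0, 2.0, 5.0, 8.0):      # shift = 0 is plain Monte Carlo
            est, rel_err = is_estimate(shift)
            print(f"shift={shift:3.1f}  estimate={est:.3e}  rel. error={rel_err:.3f}  (exact {exact:.3e})")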

  2. Truth, possibility and probability new logical foundations of probability and statistical inference

    CERN Document Server

    Chuaqui, R

    1991-01-01

    Anyone involved in the philosophy of science is naturally drawn into the study of the foundations of probability. Different interpretations of probability, based on competing philosophical ideas, lead to different statistical techniques, and frequently to mutually contradictory consequences. This unique book presents a new interpretation of probability, rooted in the traditional interpretation that was current in the 17th and 18th centuries. Mathematical models are constructed based on this interpretation, and statistical inference and decision theory are applied, including some examples in artificial intelligence, solving the main foundational problems. Nonstandard analysis is extensively developed for the construction of the models and in some of the proofs. Many nonstandard theorems are proved, some of them new, in particular, a representation theorem that asserts that any stochastic process can be approximated by a process defined over a space with equiprobable outcomes.

  3. Kepler Planet Reliability Metrics: Astrophysical Positional Probabilities for Data Release 25

    Science.gov (United States)

    Bryson, Stephen T.; Morton, Timothy D.

    2017-01-01

    This document is very similar to KSCI-19092-003, Planet Reliability Metrics: Astrophysical Positional Probabilities, which describes the previous release of the astrophysical positional probabilities for Data Release 24. The important changes for Data Release 25 are: 1. The computation of the astrophysical positional probabilities uses the Data Release 25 processed pixel data for all Kepler Objects of Interest. 2. Computed probabilities now have associated uncertainties, whose computation is described in §4.1.3. 3. The scene modeling described in §4.1.2 uses background stars detected via ground-based high-resolution imaging, described in §5.1, that are not in the Kepler Input Catalog or UKIRT catalog. These newly detected stars are presented in Appendix B. Otherwise the text describing the algorithms and examples is largely unchanged from KSCI-19092-003.

  4. Logic, probability, and human reasoning.

    Science.gov (United States)

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

    Motivated by the increasing need to analyze complex, uncertain multidimensional data this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how...... the distributions can be subsequently used in pre-aggregation. Since the probability distributions can become large, we show how to achieve good time and space efficiency by approximating the distributions. We present the results of several experiments that demonstrate the effectiveness of our methods. The work...... is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions....

  6. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Basic NotionsSample Space and EventsProbabilitiesCounting TechniquesIndependence and Conditional ProbabilityIndependenceConditioningThe Borel-Cantelli TheoremDiscrete Random VariablesRandom Variables and VectorsExpected ValueVariance and Other Moments. Inequalities for DeviationsSome Basic DistributionsConvergence of Random Variables. The Law of Large NumbersConditional ExpectationGenerating Functions. Branching Processes. Random Walk RevisitedBranching Processes Generating Functions Branching Processes Revisited More on Random WalkMarkov ChainsDefinitions and Examples. Probability Distributions of Markov ChainsThe First Step Analysis. Passage TimesVariables Defined on a Markov ChainErgodicity and Stationary DistributionsA Classification of States and ErgodicityContinuous Random VariablesContinuous DistributionsSome Basic Distributions Continuous Multivariate Distributions Sums of Independent Random Variables Conditional Distributions and ExpectationsDistributions in the General Case. SimulationDistribution F...

  7. Integrated analysis of DCH in Surry

    International Nuclear Information System (INIS)

    Dingman, S.E.; Harper, F.T.; Pilch, M.M.; Washington, K.E.

    1993-01-01

    An evaluation of the key elements affecting Direct Containment Heating (DCH) was performed for the Surry plant. This involved determining the dominant high pressure core damage sequences, the probability of proceeding to vessel breach at high pressure, the DCH loads, and the containment strength. Each of these factors was evaluated separately, and then the results were combined to give the overall threat from DCH. The maximum containment failure probability by DCH for Surry is 10 -3 when considering four base DCH scenarios and using the two-cell equilibrium (TCE) model. However, higher containment failure probabilities are estimated in sensitivity cases. When the depressurization and containment loads aspects are combined, the containment failure probability (conditional on a station blackout sequence) is less than 10 -2 . CONTAIN calculations were performed to provide insights regarding DCH phenomenological uncertainties and potential conservatisms in the TCE model. The CONTAIN calculations indicated that the TCE calculations were conservative for Surry and that the dominant factors were neglect of heat transfer to surroundings and complete combustion of hydrogen on DCH time scales

  8. Staphylococcus aureus and healthcare-associated infections

    NARCIS (Netherlands)

    Ekkelenkamp, M.B.

    2011-01-01

    Many medical procedures breach or suppress patients’ natural defences, leaving them vulnerable to infections which would not occur in healthy humans: “healthcare-associated infections”. Healthcare-associated infections caused by the bacterium Staphylococcus aureus (S. aureus) are probably the most

  9. Probability tales

    CERN Document Server

    Grinstead, Charles M; Snell, J Laurie

    2011-01-01

    This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.

  10. Estimates of annual survival probabilities for adult Florida manatees (Trichechus manatus latirostris)

    Science.gov (United States)

    Langtimm, C.A.; O'Shea, T.J.; Pradel, R.; Beck, C.A.

    1998-01-01

    The population dynamics of large, long-lived mammals are particularly sensitive to changes in adult survival. Understanding factors affecting survival patterns is therefore critical for developing and testing theories of population dynamics and for developing management strategies aimed at preventing declines or extinction in such taxa. Few studies have used modern analytical approaches for analyzing variation and testing hypotheses about survival probabilities in large mammals. This paper reports a detailed analysis of annual adult survival in the Florida manatee (Trichechus manatus latirostris), an endangered marine mammal, based on a mark-recapture approach. Natural and boat-inflicted scars distinctively 'marked' individual manatees that were cataloged in a computer-based photographic system. Photo-documented resightings provided 'recaptures.' Using open population models, annual adult-survival probabilities were estimated for manatees observed in winter in three areas of Florida: Blue Spring, Crystal River, and the Atlantic coast. After using goodness-of-fit tests in Program RELEASE to search for violations of the assumptions of mark-recapture analysis, survival and sighting probabilities were modeled under several different biological hypotheses with Program SURGE. Estimates of mean annual probability of sighting varied from 0.948 for Blue Spring to 0.737 for Crystal River and 0.507 for the Atlantic coast. At Crystal River and Blue Spring, annual survival probabilities were best estimated as constant over the study period at 0.96 (95% CI = 0.951-0.975 and 0.900-0.985, respectively). On the Atlantic coast, where manatees are impacted more by human activities, annual survival probabilities had a significantly lower mean estimate of 0.91 (95% CI = 0.887-0.926) and varied unpredictably over the study period. For each study area, survival did not differ between sexes and was independent of relative adult age. The high constant adult-survival probabilities estimated
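    The open-population (Cormack-Jolly-Seber type) models used for such estimates are built from the probability of each animal's resighting history given annual survival and sighting probabilities. A minimal sketch of that building block, using roughly the Blue Spring values reported above (survival ~0.96, sighting ~0.95) and valid only for histories that end in a sighting (trailing zeros would need an extra never-seen-again term):

        def history_probability(history: str, phi: float, p: float) -> float:
            # Probability of a resighting history after first capture, e.g. "1011" =
            # marked in year 1, missed in year 2, resighted in years 3 and 4.
            prob = 1.0
            for seen in history[1:]:          # condition on the first capture
                prob *= phi * (p if seen == "1" else 1.0 - p)
            return prob

        print(round(history_probability("1011", phi=0.96, p=0.948), 4))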

  11. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  12. Calculating the albedo characteristics by the method of transmission probabilities

    International Nuclear Information System (INIS)

    Lukhvich, A.A.; Rakhno, I.L.; Rubin, I.E.

    1983-01-01

    The possibility of using the method of transmission probabilities for calculating the albedo characteristics of homogeneous and heterogeneous zones is studied. The transmission probabilities method is a numerical method for solving the transport equation in integral form. All calculations were carried out in a one-group approximation for planes and rods with different optical thicknesses and capture-to-scattering ratios. The calculations for plane and cylindrical geometries have shown that the numerical method of transmission probabilities can be used to calculate the albedo characteristics of homogeneous and heterogeneous zones with high accuracy. In this case the computer time required is minimal, even for cylindrical geometry, if an interpolation calculation of the characteristics is used for the neutrons of the first path

  13. Impact of controlling the sum of error probability in the sequential probability ratio test

    Directory of Open Access Journals (Sweden)

    Bijoy Kumarr Pradhan

    2013-05-01

    A generalized modified method is proposed to control the sum of error probabilities in the sequential probability ratio test. The method minimizes the weighted average of the two average sample numbers under a simple null hypothesis and a simple alternative hypothesis, subject to the restriction that the sum of the error probabilities is a pre-assigned constant, and yields the optimal sample size. This optimal sample size is then compared with the optimal sample size obtained from the fixed-sample-size procedure. The results are applied to the cases in which the random variate follows a normal law as well as a Bernoulli law.
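    The record above describes a modification of the sequential probability ratio test (SPRT). As a point of reference, the sketch below implements the classical Wald SPRT for a Bernoulli parameter with nominal error probabilities α and β; the paper's modification (controlling the sum α + β) is not reproduced, and all numerical values are illustrative assumptions.

```python
import math
import random

def wald_sprt_bernoulli(p0, p1, alpha, beta, sample, max_n=10_000):
    """Classical Wald SPRT for H0: p = p0 vs H1: p = p1 (with p1 > p0).

    `sample` is a callable returning 0/1 observations. Returns the decision
    ("H0", "H1" or "inconclusive") and the number of observations used.
    """
    upper = math.log((1 - beta) / alpha)   # accept H1 when the log-LR exceeds this
    lower = math.log(beta / (1 - alpha))   # accept H0 when the log-LR falls below this
    llr = 0.0
    for n in range(1, max_n + 1):
        x = sample()
        # Log-likelihood-ratio increment for one Bernoulli observation.
        llr += x * math.log(p1 / p0) + (1 - x) * math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "inconclusive", max_n

random.seed(0)
true_p = 0.55
decision, n_used = wald_sprt_bernoulli(0.5, 0.6, alpha=0.05, beta=0.05,
                                       sample=lambda: int(random.random() < true_p))
print(decision, n_used)
```

    The thresholds log((1 - β)/α) and log(β/(1 - α)) are Wald's standard approximate boundaries; the expected number of observations under either hypothesis is typically much smaller than the corresponding fixed sample size, which is the comparison the paper pursues.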

  14. Tailoring single-photon and multiphoton probabilities of a single-photon on-demand source

    International Nuclear Information System (INIS)

    Migdall, A.L.; Branning, D.; Castelletto, S.

    2002-01-01

    As typically implemented, single-photon sources cannot be made to produce single photons with high probability, while simultaneously suppressing the probability of yielding two or more photons. Because of this, single-photon sources cannot really produce single photons on demand. We describe a multiplexed system that allows the probabilities of producing one and more photons to be adjusted independently, enabling a much better approximation of a source of single photons on demand

  15. Upgrading Probability via Fractions of Events

    Directory of Open Access Journals (Sweden)

    Frič Roman

    2016-08-01

    The influence of “Grundbegriffe” by A. N. Kolmogorov (published in 1933) on education in the area of probability and its impact on research in stochastics cannot be overestimated. We would like to point out three aspects of the classical probability theory “calling for” an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures and observables, the dual maps to random variables) have very different “mathematical nature”. Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events and elementary category theory, covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0, 1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0, 1]-valued functions), the minimal domain of probability containing “fractions” of classical random events, and we upgrade the notions of probability measure and random variable.

  16. A Comprehensive Probability Project for the Upper Division One-Semester Probability Course Using Yahtzee

    Science.gov (United States)

    Wilson, Jason; Lawman, Joshua; Murphy, Rachael; Nelson, Marissa

    2011-01-01

    This article describes a probability project used in an upper division, one-semester probability course with third-semester calculus and linear algebra prerequisites. The student learning outcome focused on developing the skills necessary for approaching project-sized math/stat application problems. These skills include appropriately defining…
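    The article does not list its specific problems, but a typical Yahtzee-style calculation can convey the flavor of such a project. The sketch below (an assumed example, not taken from the article) computes the exact probability of rolling five of a kind in a single roll of five fair dice and checks it by Monte Carlo.

```python
import random
from fractions import Fraction

# Exact: 6 favourable outcomes (all ones, all twos, ...) out of 6**5 equally likely rolls.
exact = Fraction(6, 6**5)   # 1/1296

# Monte Carlo check.
random.seed(42)
trials = 200_000
hits = sum(
    len({random.randint(1, 6) for _ in range(5)}) == 1
    for _ in range(trials)
)
print(float(exact), hits / trials)
```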

  17. Dynamic encoding of speech sequence probability in human temporal cortex.

    Science.gov (United States)

    Leonard, Matthew K; Bouchard, Kristofer E; Tang, Claire; Chang, Edward F

    2015-05-06

    Sensory processing involves identification of stimulus features, but also integration with the surrounding sensory and cognitive context. Previous work in animals and humans has shown fine-scale sensitivity to context in the form of learned knowledge about the statistics of the sensory environment, including relative probabilities of discrete units in a stream of sequential auditory input. These statistics are a defining characteristic of one of the most important sequential signals humans encounter: speech. For speech, extensive exposure to a language tunes listeners to the statistics of sound sequences. To address how speech sequence statistics are neurally encoded, we used high-resolution direct cortical recordings from human lateral superior temporal cortex as subjects listened to words and nonwords with varying transition probabilities between sound segments. In addition to their sensitivity to acoustic features (including contextual features, such as coarticulation), we found that neural responses dynamically encoded the language-level probability of both preceding and upcoming speech sounds. Transition probability first negatively modulated neural responses, followed by positive modulation of neural responses, consistent with coordinated predictive and retrospective recognition processes, respectively. Furthermore, transition probability encoding was different for real English words compared with nonwords, providing evidence for online interactions with high-order linguistic knowledge. These results demonstrate that sensory processing of deeply learned stimuli involves integrating physical stimulus features with their contextual sequential structure. Despite not being consciously aware of phoneme sequence statistics, listeners use this information to process spoken input and to link low-level acoustic representations with linguistic information about word identity and meaning. Copyright © 2015 the authors 0270-6474/15/357203-12$15.00/0.

  18. Fragility estimation for seismically isolated nuclear structures by high confidence low probability of failure values and bi-linear regression

    International Nuclear Information System (INIS)

    Carausu, A.

    1996-01-01

    A method for the fragility estimation of seismically isolated nuclear power plant structure is proposed. The relationship between the ground motion intensity parameter (e.g. peak ground velocity or peak ground acceleration) and the response of isolated structures is expressed in terms of a bi-linear regression line, whose coefficients are estimated by the least-square method in terms of available data on seismic input and structural response. The notion of high confidence low probability of failure (HCLPF) value is also used for deriving compound fragility curves for coupled subsystems. (orig.)

  19. On estimating the fracture probability of nuclear graphite components

    International Nuclear Information System (INIS)

    Srinivasan, Makuteswara

    2008-01-01

    The properties of nuclear grade graphites exhibit anisotropy and can vary considerably within a manufactured block. Graphite strength is affected by the direction of alignment of the constituent coke particles, which is dictated by the forming method, coke particle size, and the size, shape, and orientation distribution of pores in the structure. In this paper, a Weibull failure probability analysis for components is presented using the American Society for Testing and Materials strength specification for nuclear grade graphites for core components in advanced high-temperature gas-cooled reactors. The risk of rupture (probability of fracture) and survival probability (reliability) of large graphite blocks are calculated for varying and discrete values of service tensile stresses. The limitations of these calculations are discussed in light of actual reactor environmental conditions that could potentially degrade the specification properties through damage caused by complex interactions between irradiation, temperature, stress, and variability in reactor operation
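    As a rough numerical companion to the Weibull analysis described above, the sketch below evaluates a two-parameter Weibull probability of fracture over a range of service tensile stresses. The characteristic strength and Weibull modulus are placeholder values chosen for illustration, not specification values for nuclear-grade graphite.

```python
import math

def weibull_failure_probability(stress_mpa, sigma0_mpa=20.0, m=10.0):
    """Two-parameter Weibull probability of fracture at a given tensile stress.

    sigma0 (characteristic strength) and m (Weibull modulus) are illustrative
    placeholders, not nuclear-graphite specification values.
    """
    return 1.0 - math.exp(-((stress_mpa / sigma0_mpa) ** m))

for stress in (5, 10, 15, 20, 25):
    pf = weibull_failure_probability(stress)
    reliability = 1.0 - pf
    print(f"{stress:>3} MPa  P_f = {pf:.4f}  reliability = {reliability:.4f}")
```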

  20. Development of probabilistic thinking-oriented learning tools for probability materials at junior high school students

    Science.gov (United States)

    Sari, Dwi Ivayana; Hermanto, Didik

    2017-08-01

    This research is a developmental study of probabilistic thinking-oriented learning tools for probability material at the ninth-grade level, aimed at producing good probabilistic thinking-oriented learning tools. The subjects were IX-A students of MTs Model Bangkalan. The study used the 4-D development model, modified into the stages define, design and develop. The teaching and learning tools consist of a lesson plan, a students' worksheet, learning media and a students' achievement test. The research instruments were a learning-tools validation sheet, a teachers' activities sheet, a students' activities sheet, a students' response questionnaire and the students' achievement test. The results from these instruments were analyzed descriptively to answer the research objectives. The outcome was a set of teaching and learning tools, oriented toward probabilistic thinking about probability for ninth-grade students, that was judged valid. After the tools were revised on the basis of the validation and tried out in class, the teachers' classroom management proved effective, the students' activities were good, the students' responses to the learning tools were positive, and the achievement test met the validity, sensitivity and reliability criteria. In summary, these teaching and learning tools can be used by teachers to teach probability and to develop students' probabilistic thinking.

  1. Is probability of frequency too narrow?

    International Nuclear Information System (INIS)

    Martz, H.F.

    1993-01-01

    Modern methods of statistical data analysis, such as empirical and hierarchical Bayesian methods, should find increasing use in future Probabilistic Risk Assessment (PRA) applications. In addition, there will be a more formalized use of expert judgment in future PRAs. These methods require an extension of the probabilistic framework of PRA, in particular, the popular notion of probability of frequency, to consideration of frequency of frequency, frequency of probability, and probability of probability. The genesis, interpretation, and examples of these three extended notions are discussed

  2. An Alternative Version of Conditional Probabilities and Bayes' Rule: An Application of Probability Logic

    Science.gov (United States)

    Satake, Eiki; Amato, Philip P.

    2008-01-01

    This paper presents an alternative version of formulas of conditional probabilities and Bayes' rule that demonstrate how the truth table of elementary mathematical logic applies to the derivations of the conditional probabilities of various complex, compound statements. This new approach is used to calculate the prior and posterior probabilities…
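    The paper's truth-table formulation is not reproduced here, but the following sketch shows the underlying computation: a posterior probability obtained both directly from a joint distribution over the truth assignments of a hypothesis H and evidence E, and via Bayes' rule. The joint probabilities are illustrative numbers of my own choosing.

```python
from fractions import Fraction

# Joint distribution over the four truth assignments of (H, E): illustrative numbers.
joint = {
    (True, True):   Fraction(9, 100),   # H and E
    (True, False):  Fraction(1, 100),   # H and not E
    (False, True):  Fraction(18, 100),  # not H and E
    (False, False): Fraction(72, 100),  # not H and not E
}

p_e = sum(p for (h, e), p in joint.items() if e)
p_h_and_e = joint[(True, True)]
posterior = p_h_and_e / p_e            # P(H | E) by the definition of conditional probability

# Same result via Bayes' rule: P(H|E) = P(E|H) P(H) / P(E).
p_h = sum(p for (h, e), p in joint.items() if h)
p_e_given_h = p_h_and_e / p_h
print(posterior, p_e_given_h * p_h / p_e)   # both 1/3
```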

  3. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
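    A minimal sketch of the idea, under simplifying assumptions: the bands below are pointwise 95% intervals derived from the Beta distribution of uniform order statistics, mapped to the normal scale with the sample mean and standard deviation. They are not the simultaneous 1-α intervals constructed in the paper, which are wider, but they convey how intervals can replace a subjective judgment of straightness.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = np.sort(rng.normal(loc=10.0, scale=2.0, size=50))
n = x.size
i = np.arange(1, n + 1)

# Plotting positions and pointwise 95% bands: the i-th order statistic of a
# U(0,1) sample follows a Beta(i, n - i + 1) distribution.
p_mid = (i - 0.5) / n
p_lo = stats.beta.ppf(0.025, i, n - i + 1)
p_hi = stats.beta.ppf(0.975, i, n - i + 1)

# Map to the normal scale using the sample mean and standard deviation.
mu, sigma = x.mean(), x.std(ddof=1)
q_lo = mu + sigma * stats.norm.ppf(p_lo)
q_hi = mu + sigma * stats.norm.ppf(p_hi)

inside = np.logical_and(x >= q_lo, x <= q_hi)
print(f"{inside.mean():.2%} of points fall inside their pointwise bands")
```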

  4. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  5. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application

  6. High-severity fire: evaluating its key drivers and mapping its probability across western US forests

    Science.gov (United States)

    Parks, Sean A.; Holsinger, Lisa M.; Panunto, Matthew H.; Jolly, W. Matt; Dobrowski, Solomon Z.; Dillon, Gregory K.

    2018-04-01

    Wildland fire is a critical process in forests of the western United States (US). Variation in fire behavior, which is heavily influenced by fuel loading, terrain, weather, and vegetation type, leads to heterogeneity in fire severity across landscapes. The relative influence of these factors in driving fire severity, however, is poorly understood. Here, we explore the drivers of high-severity fire for forested ecoregions in the western US over the period 2002–2015. Fire severity was quantified using a satellite-inferred index of severity, the relativized burn ratio. For each ecoregion, we used boosted regression trees to model high-severity fire as a function of live fuel, topography, climate, and fire weather. We found that live fuel, on average, was the most important factor driving high-severity fire among ecoregions (average relative influence = 53.1%) and was the most important factor in 14 of 19 ecoregions. Fire weather was the second most important factor among ecoregions (average relative influence = 22.9%) and was the most important factor in five ecoregions. Climate (13.7%) and topography (10.3%) were less influential. We also predicted the probability of high-severity fire, were a fire to occur, using recent (2016) satellite imagery to characterize live fuel for a subset of ecoregions in which the model skill was deemed acceptable (n = 13). These ‘wall-to-wall’ gridded ecoregional maps provide relevant and up-to-date information for scientists and managers who are tasked with managing fuel and wildland fire. Lastly, we provide an example of the predicted likelihood of high-severity fire under moderate and extreme fire weather before and after fuel reduction treatments, thereby demonstrating how our framework and model predictions can potentially serve as a performance metric for land management agencies tasked with reducing hazardous fuel across large landscapes.
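    To make the modeling step concrete, here is a heavily simplified sketch in the same spirit: a gradient-boosted classifier predicting the probability of high-severity fire from four generic predictors. The data are synthetic, the feature names are placeholders loosely echoing the study's factor groups, and scikit-learn's GradientBoostingClassifier stands in for the boosted regression trees actually used.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)
n = 2000

# Placeholder predictors loosely named after the study's factor groups.
live_fuel = rng.uniform(0, 1, n)      # e.g., a satellite-derived vegetation index
fire_weather = rng.uniform(0, 1, n)   # e.g., a fire-weather percentile
climate = rng.uniform(0, 1, n)
topography = rng.uniform(0, 1, n)
X = np.column_stack([live_fuel, fire_weather, climate, topography])

# Synthetic outcome: high severity is more likely with more live fuel and more
# extreme fire weather (an illustrative relationship only).
logit = -2.0 + 3.0 * live_fuel + 1.5 * fire_weather + rng.normal(0, 0.5, n)
y = (rng.uniform(0, 1, n) < 1 / (1 + np.exp(-logit))).astype(int)

model = GradientBoostingClassifier(n_estimators=200, max_depth=3).fit(X, y)
print("relative importances:", model.feature_importances_.round(3))
print("P(high severity) for a wet, mild scenario:",
      model.predict_proba([[0.1, 0.1, 0.5, 0.5]])[0, 1].round(3))
```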

  7. Alternative probability theories for cognitive psychology.

    Science.gov (United States)

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling. Copyright © 2013 Cognitive Science Society, Inc.

  8. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  9. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    By using a fluid theory which is an alternative to quantum theory, but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density is related to the time spent on small sections of an orbit, just as the probability density is in some classical contexts.

  10. Pre-Service Teachers' Conceptions of Probability

    Science.gov (United States)

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  11. Midcourse Guidance Law Based on High Target Acquisition Probability Considering Angular Constraint and Line-of-Sight Angle Rate Control

    Directory of Open Access Journals (Sweden)

    Xiao Liu

    2016-01-01

    Random disturbance factors lead to variation of the target acquisition point during long-distance flight. To achieve a high target acquisition probability and improve impact precision, missiles should be guided to an appropriate target acquisition position with certain attitude angles and line-of-sight (LOS) angle rate. This paper presents a new midcourse guidance law that considers the influences of random disturbances, the detection distance constraint, and the target acquisition probability, evaluated with Monte Carlo simulation. Detailed analyses of the impact points on the ground and of the random distribution of the target acquisition position in 3D space are given to obtain the appropriate attitude angles and the end position for the midcourse guidance. Then, a new biased proportional navigation (BPN) guidance law with angular constraint and LOS angle rate control is derived to ensure tracking ability when attacking a maneuvering target. Numerical simulations demonstrate that, compared with the proportional navigation guidance (PNG) law and the near-optimal spatial midcourse guidance (NSMG) law, the BPN guidance law performs satisfactorily and meets both the midcourse terminal angular constraint and the LOS angle rate requirement.

  12. Nuclear data uncertainties: I, Basic concepts of probability

    Energy Technology Data Exchange (ETDEWEB)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.

  13. Nuclear data uncertainties: I, Basic concepts of probability

    International Nuclear Information System (INIS)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs
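    As a small numerical companion to the topics listed in these reports (not an excerpt from them), the sketch below evaluates a binomial probability, its Poisson approximation, and a two-hypothesis application of Bayes' theorem with illustrative numbers.

```python
from scipy import stats

# Binomial: probability of exactly k successes in n trials with success probability p,
# and its Poisson approximation with mean n*p (appropriate for small p and large n).
n, p, k = 1000, 0.002, 3
binom_pk = stats.binom.pmf(k, n, p)
poisson_pk = stats.poisson.pmf(k, n * p)
print(f"binomial  P(K={k}) = {binom_pk:.5f}")
print(f"poisson   P(K={k}) = {poisson_pk:.5f}")

# Bayes' theorem with two hypotheses (illustrative numbers).
prior_a, prior_b = 0.3, 0.7
like_a, like_b = 0.8, 0.1          # P(data | hypothesis)
evidence = prior_a * like_a + prior_b * like_b
print("posterior P(A | data) =", round(prior_a * like_a / evidence, 4))
```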

  14. Qubit-qutrit separability-probability ratios

    International Nuclear Information System (INIS)

    Slater, Paul B.

    2005-01-01

    Paralleling our recent computationally intensive (quasi-Monte Carlo) work for the case N=4 (e-print quant-ph/0308037), we undertake the task for N=6 of computing to high numerical accuracy the formulas of Sommers and Zyczkowski (e-print quant-ph/0304041) for the (N²−1)-dimensional volume and (N²−2)-dimensional hyperarea of the (separable and nonseparable) N×N density matrices, based on the Bures (minimal monotone) metric, and also their analogous formulas (e-print quant-ph/0302197) for the (nonmonotone) flat Hilbert-Schmidt metric. With the same 7 × 10⁹ well-distributed ('low-discrepancy') sample points, we estimate the unknown volumes and hyperareas based on five additional (monotone) metrics of interest, including the Kubo-Mori and Wigner-Yanase. Further, we estimate all of these seven volume and seven hyperarea (unknown) quantities when restricted to the separable density matrices. The ratios of separable volumes (hyperareas) to separable plus nonseparable volumes (hyperareas) yield estimates of the separability probabilities of generically rank-6 (rank-5) density matrices. The (rank-6) separability probabilities obtained based on the 35-dimensional volumes appear to be, independently of the metric (each of the seven inducing Haar measure) employed, twice as large as those (rank-5 ones) based on the 34-dimensional hyperareas. (An additional estimate, 33.9982, of the ratio of the rank-6 Hilbert-Schmidt separability probability to the rank-4 one is quite clearly close to integral too.) The doubling relationship also appears to hold for the N=4 case for the Hilbert-Schmidt metric, but not the others. We fit simple exact formulas to our estimates of the Hilbert-Schmidt separable volumes and hyperareas in both the N=4 and N=6 cases

  15. The probability and severity of decompression sickness

    Science.gov (United States)

    Hada, Ethan A.; Vann, Richard D.; Denoble, Petar J.

    2017-01-01

    Decompression sickness (DCS), which is caused by inert gas bubbles in tissues, is an injury of concern for scuba divers, compressed air workers, astronauts, and aviators. Case reports for 3322 air and N2-O2 dives, resulting in 190 DCS events, were retrospectively analyzed and the outcomes were scored as (1) serious neurological, (2) cardiopulmonary, (3) mild neurological, (4) pain, (5) lymphatic or skin, and (6) constitutional or nonspecific manifestations. Following standard U.S. Navy medical definitions, the data were grouped into mild—Type I (manifestations 4–6)–and serious–Type II (manifestations 1–3). Additionally, we considered an alternative grouping of mild–Type A (manifestations 3–6)–and serious–Type B (manifestations 1 and 2). The current U.S. Navy guidance allows for a 2% probability of mild DCS and a 0.1% probability of serious DCS. We developed a hierarchical trinomial (3-state) probabilistic DCS model that simultaneously predicts the probability of mild and serious DCS given a dive exposure. Both the Type I/II and Type A/B discriminations of mild and serious DCS resulted in a highly significant (p probability of ‘mild’ DCS resulted in a longer allowable bottom time for the same 2% limit. However, for the 0.1% serious DCS limit, we found a vastly decreased allowable bottom dive time for all dive depths. If the Type A/B scoring was assigned to outcome severity, the no decompression limits (NDL) for air dives were still controlled by the acceptable serious DCS risk limit rather than the acceptable mild DCS risk limit. However, in this case, longer NDL limits were allowed than with the Type I/II scoring. The trinomial model mild and serious probabilities agree reasonably well with the current air NDL only with the Type A/B scoring and when 0.2% risk of serious DCS is allowed. PMID:28296928
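    The hierarchical, exposure-dependent model of the paper is well beyond a snippet, but the trinomial idea itself can be sketched: given hypothetical counts of no-DCS, mild and serious outcomes (the split below is invented, though the totals echo the abstract), estimate the three outcome probabilities, compare them with the quoted 2% and 0.1% limits, and evaluate a multinomial log-likelihood.

```python
from scipy import stats

# Hypothetical outcome counts over a set of dives: (no DCS, mild DCS, serious DCS).
counts = [3132, 180, 10]
total = sum(counts)

# Maximum-likelihood estimates of the trinomial outcome probabilities.
p_none, p_mild, p_serious = (c / total for c in counts)
print(f"P(mild) = {p_mild:.4f}, P(serious) = {p_serious:.4f}")

# Compare against the acceptance limits quoted in the abstract (2% mild, 0.1% serious).
print("mild limit met:   ", p_mild <= 0.02)
print("serious limit met:", p_serious <= 0.001)

# Multinomial log-likelihood of the observed counts under the fitted probabilities.
loglik = stats.multinomial.logpmf(counts, n=total, p=[p_none, p_mild, p_serious])
print("log-likelihood:", round(float(loglik), 2))
```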

  16. Probability of detection of clinical seizures using heart rate changes.

    Science.gov (United States)

    Osorio, Ivan; Manly, B F J

    2015-08-01

    Heart rate-based seizure detection is a viable complement or alternative to ECoG/EEG. This study investigates the role of various biological factors on the probability of clinical seizure detection using heart rate. Regression models were applied to 266 clinical seizures recorded from 72 subjects to investigate if factors such as age, gender, years with epilepsy, etiology, seizure site origin, seizure class, and data collection centers, among others, shape the probability of EKG-based seizure detection. Clinical seizure detection probability based on heart rate changes, is significantly (pprobability of detecting clinical seizures (>0.8 in the majority of subjects) using heart rate is highest for complex partial seizures, increases with a patient's years with epilepsy, is lower for females than for males and is unrelated to the side of hemisphere origin. Clinical seizure detection probability using heart rate is multi-factorially dependent and sufficiently high (>0.8) in most cases to be clinically useful. Knowledge of the role that these factors play in shaping said probability will enhance its applicability and usefulness. Heart rate is a reliable and practical signal for extra-cerebral detection of clinical seizures originating from or spreading to central autonomic network structures. Copyright © 2015 British Epilepsy Association. Published by Elsevier Ltd. All rights reserved.

  17. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...

  18. A Probability-based Evolutionary Algorithm with Mutations to Learn Bayesian Networks

    Directory of Open Access Journals (Sweden)

    Sho Fukuda

    2014-12-01

    Bayesian networks are regarded as one of the essential tools for analyzing causal relationships between events from data. Learning the structure of highly reliable Bayesian networks from data as quickly as possible is an important problem that several studies have tried to address. In recent years, probability-based evolutionary algorithms have been proposed as a new, efficient approach to learning Bayesian networks. In this paper, we focus on one of the probability-based evolutionary algorithms, PBIL (Probability-Based Incremental Learning), and propose a new mutation operator. Through performance evaluation, we found that the proposed mutation operator performs well in learning Bayesian networks
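    For orientation, the sketch below implements standard PBIL with the usual probability-vector mutation on a toy bitstring objective; in structure learning, the bitstring would instead encode candidate network edges and the fitness would be a network score. It does not implement the new mutation operator proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def onemax(bits):
    """Toy fitness: number of ones (a stand-in for a Bayesian-network score)."""
    return bits.sum(axis=1)

def pbil(n_bits=20, pop_size=50, generations=100, lr=0.1, mut_p=0.02, mut_shift=0.05):
    p = np.full(n_bits, 0.5)                     # probability vector over bit positions
    for _ in range(generations):
        pop = (rng.random((pop_size, n_bits)) < p).astype(int)
        best = pop[np.argmax(onemax(pop))]
        p = (1 - lr) * p + lr * best             # pull the vector toward the best sample
        # Standard PBIL mutation: randomly nudge some components of the vector.
        mutate = rng.random(n_bits) < mut_p
        p[mutate] = (1 - mut_shift) * p[mutate] + mut_shift * rng.integers(0, 2, mutate.sum())
        p = p.clip(0.01, 0.99)                   # keep some exploration
    return p

print(pbil().round(2))
```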

  19. Uncertainty plus prior equals rational bias: an intuitive Bayesian probability weighting function.

    Science.gov (United States)

    Fennell, John; Baddeley, Roland

    2012-10-01

    Empirical research has shown that when making choices based on probabilistic options, people behave as if they overestimate small probabilities, underestimate large probabilities, and treat positive and negative outcomes differently. These distortions have been modeled using a nonlinear probability weighting function, which is found in several nonexpected utility theories, including rank-dependent models and prospect theory; here, we propose a Bayesian approach to the probability weighting function and, with it, a psychological rationale. In the real world, uncertainty is ubiquitous and, accordingly, the optimal strategy is to combine probability statements with prior information using Bayes' rule. First, we show that any reasonable prior on probabilities leads to 2 of the observed effects; overweighting of low probabilities and underweighting of high probabilities. We then investigate 2 plausible kinds of priors: informative priors based on previous experience and uninformative priors of ignorance. Individually, these priors potentially lead to large problems of bias and inefficiency, respectively; however, when combined using Bayesian model comparison methods, both forms of prior can be applied adaptively, gaining the efficiency of empirical priors and the robustness of ignorance priors. We illustrate this for the simple case of generic good and bad options, using Internet blogs to estimate the relevant priors of inference. Given this combined ignorant/informative prior, the Bayesian probability weighting function is not only robust and efficient but also matches all of the major characteristics of the distortions found in empirical research. PsycINFO Database Record (c) 2012 APA, all rights reserved.
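    The paper derives its weighting function from Bayesian arguments; as a simpler stand-in that reproduces the same qualitative distortions (overweighting of small probabilities, underweighting of large ones), the sketch below evaluates the one-parameter Tversky-Kahneman weighting function with the commonly cited γ = 0.61.

```python
def tk_weight(p, gamma=0.61):
    """One-parameter Tversky-Kahneman probability weighting function."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

for p in (0.01, 0.1, 0.5, 0.9, 0.99):
    w = tk_weight(p)
    tag = "overweighted" if w > p else "underweighted"
    print(f"p = {p:<5} w(p) = {w:.3f}  ({tag})")
```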

  20. Stochastic models for predicting pitting corrosion damage of HLRW containers

    International Nuclear Information System (INIS)

    Henshall, G.A.

    1991-10-01

    Stochastic models for predicting aqueous pitting corrosion damage of high-level radioactive-waste containers are described. These models could be used to predict the time required for the first pit to penetrate a container and the increase in the number of breaches at later times, both of which would be useful in the repository system performance analysis. Monte Carlo implementations of the stochastic models are described, and predictions of induction time, survival probability and pit depth distributions are presented. These results suggest that the pit nucleation probability decreases with exposure time and that pit growth may be a stochastic process. The advantages and disadvantages of the stochastic approach, methods for modeling the effects of environment, and plans for future work are discussed
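    In the spirit of the stochastic pit models described above (but not reproducing them), here is a toy Monte Carlo: pits nucleate as a Poisson process, each grows at a random rate, and a container is breached when any pit penetrates the wall. Wall thickness, nucleation rate and growth rate are hypothetical round numbers.

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate_first_breach(wall_mm=10.0, years=1000.0,
                          nucleation_rate=0.05,      # pits per year (hypothetical)
                          mean_growth=0.02):         # mm per year (hypothetical)
    """Return the time of first wall penetration, or np.inf if none occurs."""
    n_pits = rng.poisson(nucleation_rate * years)
    if n_pits == 0:
        return np.inf
    birth = rng.uniform(0.0, years, n_pits)           # nucleation times
    rate = rng.exponential(mean_growth, n_pits)       # per-pit growth rates
    penetration = birth + wall_mm / rate              # time each pit reaches the wall
    first = penetration.min()
    return first if first <= years else np.inf

times = np.array([simulate_first_breach() for _ in range(5000)])
horizon = 500.0
survival = np.mean(~(times <= horizon))
print(f"estimated probability of no breach within {horizon:.0f} years: {survival:.3f}")
```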

  1. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t

  2. Estimation of functional failure probability of passive systems based on subset simulation method

    International Nuclear Information System (INIS)

    Wang Dongqing; Wang Baosheng; Zhang Jianmin; Jiang Jing

    2012-01-01

    In order to address the problems of multi-dimensional epistemic uncertainties and the small functional failure probability of passive systems, an innovative reliability analysis algorithm, subset simulation based on Markov chain Monte Carlo, was presented. The method is founded on the idea that a small failure probability can be expressed as a product of larger conditional failure probabilities by introducing a proper choice of intermediate failure events. Markov chain Monte Carlo simulation was implemented to efficiently generate conditional samples for estimating the conditional failure probabilities. Taking the AP1000 passive residual heat removal system as an example, the uncertainties related to the model of a passive system and the numerical values of its input parameters were considered in this paper, and the probability of functional failure was then estimated with the subset simulation method. The numerical results demonstrate that the subset simulation method has high computational efficiency and excellent accuracy compared with traditional probability analysis methods. (authors)
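    The sketch below illustrates only the identity that subset simulation rests on, namely writing a small failure probability as a product of larger conditional probabilities over nested intermediate events, for a toy two-variable limit-state function. Conditional samples are obtained here by brute-force filtering rather than by the Markov chain Monte Carlo stage described in the abstract; the shrinking sample size at each level is precisely why the real method regenerates conditional samples with MCMC.

```python
import numpy as np

rng = np.random.default_rng(3)

def g(x):
    """Toy limit-state function on two standard-normal inputs: failure when g(x) < 0."""
    return 5.0 - x.sum(axis=1)

n = 200_000

# Direct Monte Carlo estimate of the failure probability P(g(X) < 0).
x = rng.standard_normal((n, 2))
p_direct = np.mean(g(x) < 0.0)

# Same probability written as a product of larger conditional probabilities
# over the nested events g < 2.5, g < 1.25, g < 0.
thresholds = [2.5, 1.25, 0.0]
x = rng.standard_normal((n, 2))
p_chain = 1.0
for t in thresholds:
    mask = g(x) < t
    p_chain *= mask.mean()   # estimate of P(current event | previous events)
    x = x[mask]              # keep only samples already inside the current event

print(f"direct MC: {p_direct:.2e}   chained conditional estimate: {p_chain:.2e}")
```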

  3. Enforcing planning regulations in areas of high immigration: a case study of London

    OpenAIRE

    Harris, Neil

    2017-01-01

    This paper explores the interface between immigration and compliance with planning regulations using data from interviews and a focus group with senior planning enforcement officers in London. The data reveal distinctive issues that arise for immigrants' compliance with planning regulations; specific types of residential, commercial and cultural breach that occur with immigration; and operational issues that arise when investigating and resolving planning breaches involving immigrant communities.

  4. It will never happen to us: the likelihood and impact of privacy breaches on health data in Australia.

    Science.gov (United States)

    Williams, Patricia A H; Hossack, Emma

    2013-01-01

    With the recent introduction of the Australian e-health system, health reforms and legislation were passed. Whilst the aim of these health reforms was reasonable and sensible, the implementation was rushed and less than perfect. The Deloitte e-health Strategy [1] which was endorsed by the National Health and Hospital Reform Commission (NHHRC) recommended that based on international experience implementation of shared electronic health records nationally was a ten year journey. In Australia this was condensed into two years. The resultant effect has been that privacy, which is essential for the uptake of technologies to share data in a compliant manner, may be compromised. People trust transparent systems. Where there is a breach in privacy people deserve the respect and right to know about it so that they can mitigate damages and with full disclosure, retain their trust in the system. If this is not evident, the public will refuse to share their information. Hence, whilst the technologies may work, their use may be limited. The consequence of this in Australia would be the continuance of dangerous and inefficient silos of health data.

  5. Defining Probability in Sex Offender Risk Assessment.

    Science.gov (United States)

    Elwood, Richard W

    2016-12-01

    There is ongoing debate and confusion over using actuarial scales to predict individuals' risk of sexual recidivism. Much of the debate comes from not distinguishing Frequentist from Bayesian definitions of probability. Much of the confusion comes from applying Frequentist probability to individuals' risk. By definition, only Bayesian probability can be applied to the single case. The Bayesian concept of probability resolves most of the confusion and much of the debate in sex offender risk assessment. Although Bayesian probability is well accepted in risk assessment generally, it has not been widely used to assess the risk of sex offenders. I review the two concepts of probability and show how the Bayesian view alone provides a coherent scheme to conceptualize individuals' risk of sexual recidivism.
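    As a minimal illustration of the Bayesian notion of probability applied to a single case (not an actuarial instrument, and with invented numbers), the sketch below updates a uniform Beta prior with hypothetical reference-group recidivism data and reads the posterior as a degree of belief about one individual's risk.

```python
from scipy import stats

# Hypothetical reference-group data: 20 recidivists out of 200 offenders
# with a similar actuarial score.
recidivists, group_size = 20, 200

# A Beta(1, 1) (uniform) prior updated with the group data gives a posterior
# for the recidivism probability that can be read as a single-case degree of belief.
posterior = stats.beta(1 + recidivists, 1 + group_size - recidivists)
print("posterior mean:", round(posterior.mean(), 3))
print("95% credible interval:",
      tuple(round(q, 3) for q in posterior.interval(0.95)))
```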

  6. UT Biomedical Informatics Lab (BMIL) probability wheel

    Science.gov (United States)

    Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B.; Sun, Clement; Fan, Kaili; Reece, Gregory P.; Kim, Min Soon; Markey, Mia K.

    A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant", about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.

  7. Subspace Learning via Local Probability Distribution for Hyperspectral Image Classification

    Directory of Open Access Journals (Sweden)

    Huiwu Luo

    2015-01-01

    The computational procedure for hyperspectral images (HSI) is extremely complex, not only because of the high-dimensional information but also because of the highly correlated data structure, and the need for effective processing and analysis of HSI has therefore met many difficulties. It has been shown that dimensionality reduction is a powerful tool for high-dimensional data analysis, and local Fisher's linear discriminant analysis (LFDA) is an effective method for HSI processing. In this paper, a novel approach, called PD-LFDA, is proposed to overcome the weakness of LFDA. PD-LFDA emphasizes the probability distribution (PD) in LFDA, where the maximum distance is replaced with the local variance for the construction of the weight matrix and the class prior probability is applied to compute the affinity matrix. The proposed approach increases the discriminant ability of the transformed features in the low-dimensional space. Experimental results on the Indian Pines 1992 data indicate that the proposed approach significantly outperforms the traditional alternatives.

  8. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  9. Quantum computing and probability

    International Nuclear Information System (INIS)

    Ferry, David K

    2009-01-01

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)

  10. [Biometric bases: basic concepts of probability calculation].

    Science.gov (United States)

    Dinya, E

    1998-04-26

    The author gives an outline of the basic concepts of probability theory. The bases of event algebra, the definition of probability, the classical probability model and the random variable are presented.

  11. Pre-aggregation for Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of {\\em probabilistic} OLAP queries that operate on aggregate values that are probability distributions...... and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate...... multidimensional data analysis that is considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions)....

  12. Contributions to quantum probability

    International Nuclear Information System (INIS)

    Fritz, Tobias

    2010-01-01

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, generally, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome

  13. Probability inequalities for decomposition integrals

    Czech Academy of Sciences Publication Activity Database

    Agahi, H.; Mesiar, Radko

    2017-01-01

    Roč. 315, č. 1 (2017), s. 240-248 ISSN 0377-0427 Institutional support: RVO:67985556 Keywords : Decomposition integral * Superdecomposition integral * Probability inequalities Subject RIV: BA - General Mathematics OBOR OECD: Statistics and probability Impact factor: 1.357, year: 2016 http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf

  14. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    International Nuclear Information System (INIS)

    Helton, J.C.

    1996-03-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, S_st, p_st) for stochastic uncertainty, a probability space (S_su, S_su, p_su) for subjective uncertainty, and a function (i.e., a random variable) defined on the product space associated with (S_st, S_st, p_st) and (S_su, S_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems

  15. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    International Nuclear Information System (INIS)

    Helton, J.C.

    1996-01-01

    A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, L_st, P_st) for stochastic uncertainty, a probability space (S_su, L_su, P_su) for subjective uncertainty, and a function (i.e., a random variable) defined on the product space associated with (S_st, L_st, P_st) and (S_su, L_su, P_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the US Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems

  16. Application of escape probability to line transfer in laser-produced plasmas

    International Nuclear Information System (INIS)

    Lee, Y.T.; London, R.A.; Zimmerman, G.B.; Haglestein, P.L.

    1989-01-01

    In this paper the authors apply the escape probability method to the transfer of optically thick lines in laser-produced plasmas in plane-parallel geometry. They investigate the effect of self-absorption on the ionization balance and ion level populations. In addition, they calculate this effect on the laser gains in an exploding foil target heated by an optical laser. Due to the large ion streaming motion in laser-produced plasmas, absorption of an emitted photon occurs only over the length in which the Doppler shift is equal to the line width. They find that the escape probability calculated with the Doppler shift is larger than the escape probability for a static plasma. Therefore, the ion streaming motion contributes significantly to the line transfer process in laser-produced plasmas. As examples, they have applied the escape probability method to calculate the transfer of optically thick lines in both ablating slab and exploding foil targets under irradiation by a high-power optical laser
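    The cited calculations use their own escape-probability expressions, which are not reproduced here; as a generic illustration of the quantity involved, the sketch below evaluates the standard angle-averaged Sobolev-type escape probability β(τ) = (1 - e^(-τ))/τ, showing how the chance that a line photon escapes falls with optical depth. The Doppler-shift and streaming-motion effects discussed in the abstract are not modeled.

```python
import math

def escape_probability(tau):
    """Sobolev-type angle-averaged escape probability for a line of optical depth tau."""
    if tau < 1e-8:                      # series limit: beta -> 1 as tau -> 0
        return 1.0 - tau / 2.0
    return (1.0 - math.exp(-tau)) / tau

for tau in (0.01, 0.1, 1.0, 10.0, 100.0):
    print(f"tau = {tau:>6}  beta = {escape_probability(tau):.4f}")
```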

  17. Numeracy moderates the influence of task-irrelevant affect on probability weighting.

    Science.gov (United States)

    Traczyk, Jakub; Fulawka, Kamil

    2016-06-01

    Statistical numeracy, defined as the ability to understand and process statistical and probability information, plays a significant role in superior decision making. However, recent research has demonstrated that statistical numeracy goes beyond simple comprehension of numbers and mathematical operations. In contrast to previous studies, which focused on emotions integral to risky prospects, we hypothesized that highly numerate individuals would exhibit more linear probability weighting because they would be less biased by incidental and decision-irrelevant affect. Participants were instructed to make a series of insurance decisions preceded by negative (i.e., fear-inducing) or neutral stimuli. We found that incidental negative affect increased the curvature of the probability weighting function (PWF). Interestingly, this effect was significant only for less numerate individuals, while probability weighting in more numerate people was not altered by decision-irrelevant affect. We propose two candidate mechanisms for the observed effect. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Probable high prevalence of limb-girdle muscular dystrophy type 2D in Taiwan.

    Science.gov (United States)

    Liang, Wen-Chen; Chou, Po-Ching; Hung, Chia-Cheng; Su, Yi-Ning; Kan, Tsu-Min; Chen, Wan-Zi; Hayashi, Yukiko K; Nishino, Ichizo; Jong, Yuh-Jyh

    2016-03-15

    Limb-girdle muscular dystrophy type 2D (LGMD2D), an autosomal-recessive inherited LGMD, is caused by the mutations in SGCA. SGCA encodes alpha-sarcoglycan (SG) that forms a heterotetramer with other SGs in the sarcolemma, and comprises part of the dystrophin-glycoprotein complex. The frequency of LGMD2D is variable among different ethnic backgrounds, and so far only a few patients have been reported in Asia. We identified five patients with a novel homozygous mutation of c.101G>T (p.Arg34Leu) in SGCA from a big aboriginal family ethnically consisting of two tribes in Taiwan. Patient 3 is the maternal uncle of patients 1 and 2. All their parents, heterozygous for c.101G>T, denied consanguineous marriages although they were from the same tribe. The heterozygous parents of patients 4 and 5 were from two different tribes, originally residing in different geographic regions in Taiwan. Haplotype analysis showed that all five patients shared the same mutation-associated haplotype, indicating the probability of a founder effect and consanguinity. The results suggest that the carrier rate of c.101G>T in SGCA may be high in Taiwan, especially in the aboriginal population regardless of the tribes. It is important to investigate the prevalence of LGMD2D in Taiwan for early diagnosis and treatment. Copyright © 2016. Published by Elsevier B.V.

  19. Void probability scaling in hadron nucleus interactions

    International Nuclear Information System (INIS)

    Ghosh, Dipak; Deb, Argha; Bhattacharyya, Swarnapratim; Ghosh, Jayita; Bandyopadhyay, Prabhat; Das, Rupa; Mukherjee, Sima

    2002-01-01

    Heygi, while investigating the rapidity gap probability (which measures the chance of finding no particle in the pseudo-rapidity interval Δη), found that a scaling behavior in the rapidity gap probability has a close correspondence with the scaling of the void probability in galaxy correlation studies. The main aim of this paper is to study the scaling behavior of the rapidity gap probability

  20. Improving information extraction using a probability-based approach

    DEFF Research Database (Denmark)

    Kim, S.; Ahmed, Saeema; Wallace, K.

    2007-01-01

    Information plays a crucial role during the entire life-cycle of a product. It has been shown that engineers frequently consult colleagues to obtain the information they require to solve problems. However, the industrial world is now more transient and key personnel move to other companies...... or retire. It is becoming essential to retrieve vital information from archived product documents, if it is available. There is, therefore, great interest in ways of extracting relevant and sharable information from documents. A keyword-based search is commonly used, but studies have shown...... the recall, while maintaining the high precision, a learning approach that makes identification decisions based on a probability model, rather than simply looking up the presence of the pre-defined variations, looks promising. This paper presents the results of developing such a probability-based entity...