WorldWideScience

Sample records for probable future capabilities

  1. Capabilities of Future Training Support Packages

    National Research Council Canada - National Science Library

    Burnside, Billy

    2004-01-01

    .... This report identifies and analyzes five key capabilities needed in future TSPs: rapid tailoring or modification, reach, simulated operating environment, performance measurement, and pretests/selection criteria...

  2. Array capabilities and future arrays

    International Nuclear Information System (INIS)

    Radford, D.

    1993-01-01

    Early results from the new third-generation instruments GAMMASPHERE and EUROGAM are confirming the expectation that such arrays will have a revolutionary effect on the field of high-spin nuclear structure. When completed, GAMMASPHERE will have a resolving power an order of magnitude greater than that of the best second-generation arrays. When combined with other instruments such as particle-detector arrays and fragment mass analysers, the capabilities of the arrays for the study of more exotic nuclei will be further enhanced. In order to better understand the limitations of these instruments, and to design improved future detector systems, it is important to have some intelligible and reliable calculation for the relative resolving power of different instrument designs. The derivation of such a figure of merit will be briefly presented, and the relative sensitivities of arrays currently proposed or under construction presented. The design of TRIGAM, a new third-generation array proposed for Chalk River, will also be discussed. It is instructive to consider how far arrays of Compton-suppressed Ge detectors could be taken. For example, it will be shown that an idealised 'perfect' third-generation array of 1000 detectors has a sensitivity an order of magnitude higher again than that of GAMMASPHERE. Less conventional options for new arrays will also be explored.

  3. Probably Not: Future Prediction Using Probability and Statistical Inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives. Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: how travel time is affected by congestion, driving speed, and traffic lights; why different gambling ...

  4. Future probabilities of coastal floods in Finland

    Science.gov (United States)

    Pellikka, Havu; Leijala, Ulpu; Johansson, Milla M.; Leinonen, Katri; Kahma, Kimmo K.

    2018-04-01

    Coastal planning requires detailed knowledge of future flooding risks, and effective planning must consider both short-term sea level variations and the long-term trend. We calculate distributions that combine short- and long-term effects to provide estimates of flood probabilities in 2050 and 2100 on the Finnish coast in the Baltic Sea. Our distributions of short-term sea level variations are based on 46 years (1971-2016) of observations from the 13 Finnish tide gauges. The long-term scenarios of mean sea level combine postglacial land uplift, regionally adjusted scenarios of global sea level rise, and the effect of changes in the wind climate. The results predict that flooding risks will clearly increase by 2100 in the Gulf of Finland and the Bothnian Sea, while only a small increase or no change compared to present-day conditions is expected in the Bothnian Bay, where the land uplift is stronger.
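
    The combination described here lends itself to a simple Monte Carlo illustration: sample a long-term mean sea level change and a short-term variation, add them, and count exceedances of a flood threshold. The following Python sketch uses entirely hypothetical distribution families and parameter values (illustrative assumptions, not figures from the study):

      import numpy as np

      rng = np.random.default_rng(42)
      N = 100_000

      # Hypothetical long-term mean sea level change by 2100 (cm),
      # treated as uncertain (values illustrative only)
      msl_2100 = rng.normal(loc=30.0, scale=15.0, size=N)

      # Hypothetical short-term variation (tides, surges): a Gumbel
      # fit to annual maxima (cm), an assumed stand-in for the
      # observed tide-gauge distributions
      short_term = rng.gumbel(loc=80.0, scale=25.0, size=N)

      # Exceedance probability of a flood threshold (cm above the
      # present mean sea level)
      threshold = 170.0
      p_flood = np.mean(msl_2100 + short_term > threshold)
      print(f"P(annual maximum > {threshold:.0f} cm) ~ {p_flood:.3f}")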

  5. Probability Weighting and Loss Aversion in Futures Hedging

    NARCIS (Netherlands)

    Mattos, F.; Garcia, P.; Pennings, J.M.E.

    2008-01-01

    We analyze how the introduction of probability weighting and loss aversion in a futures hedging model affects decision making. Analytical findings indicate that probability weighting alone always affects optimal hedge ratios, while loss and risk aversion only have an impact when probability weighting is also present.

  6. Estimating market probabilities of future interest rate changes

    OpenAIRE

    Hlušek, Martin

    2002-01-01

    The goal of this paper is to estimate the market consensus forecast of future monetary policy development and to quantify the priced-in probability of interest rate changes for different future time horizons. The proposed model uses the current spot money market yield curve and available money market derivative instruments (forward rate agreements, FRAs) and estimates the market probability of interest rate changes up to a 12-month horizon.
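
    In the simplest two-outcome version of this idea, the forward rate is a probability-weighted average of the rate with and without a policy move, so the priced-in probability can be backed out directly. A minimal sketch with hypothetical rates (the paper fits a full model to the spot money market curve and FRAs; this is only the one-meeting special case):

      # Priced-in probability of a 25 bp hike at a single meeting,
      # assuming only two possible outcomes (hypothetical rates)
      r_unchanged = 0.0200   # rate if policy is unchanged
      r_hike = 0.0225        # rate after a 25 bp hike
      r_forward = 0.0218     # FRA-implied rate covering the meeting

      p_hike = (r_forward - r_unchanged) / (r_hike - r_unchanged)
      print(f"priced-in probability of a hike: {p_hike:.0%}")   # 72%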

  7. SuperMAG: Present and Future Capabilities

    Science.gov (United States)

    Hsieh, S. W.; Gjerloev, J. W.; Barnes, R. J.

    2009-12-01

    SuperMAG is a global collaboration that provides ground magnetic field perturbations from a long list of stations in the same coordinate system, at identical time resolution, and with a common baseline removal approach. This unique high-quality dataset provides continuous and nearly global monitoring of the ground magnetic field perturbation. Currently, only archived data are available on the website, and hence it targets basic research without any operational capabilities. The existing SuperMAG software can be easily adapted to ingest real-time or near real-time data and provide a now-casting capability. The SuperDARN program has a long history of providing near real-time maps of the northern hemisphere electrostatic potential, and as SuperMAG and SuperDARN share common software it is relatively easy to adapt these maps for global magnetic perturbations. Magnetometer measurements would be assimilated by the SuperMAG server using a variety of techniques, either by downloading data at regular intervals from remote servers or by real-time streaming connections. The existing SuperMAG analysis software would then process these measurements to provide the final calibrated data set using the SuperMAG coordinate system. The existing plotting software would then be used to produce regularly updated global plots. The talk will focus on current SuperMAG capabilities, illustrating the potential for now-casting and eventually forecasting.

  8. Future southcentral US wildfire probability due to climate change

    Science.gov (United States)

    Stambaugh, Michael C.; Guyette, Richard P.; Stroh, Esther D.; Struckhoff, Matthew A.; Whittier, Joanna B.

    2018-01-01

    Globally, changing fire regimes due to climate is one of the greatest threats to ecosystems and society. In this paper, we present projections of future fire probability for the southcentral USA using downscaled climate projections and the Physical Chemistry Fire Frequency Model (PC2FM). Future fire probability is projected to both increase and decrease across the study region of Oklahoma, New Mexico, and Texas. Among all end-of-century projections, changes in fire probability (CFPs) range from −51% to +240%. The greatest absolute increases in fire probability are shown for areas within the range of approximately 75 to 160 cm mean annual precipitation (MAP), regardless of climate model. Although fire is likely to become more frequent across the southcentral USA, spatial patterns may remain similar unless significant increases in precipitation occur, whereby more extensive areas with increased fire probability are predicted. Perhaps one of the most important results is the illumination of climate changes where the fire probability response (+, −) may deviate (i.e., tipping points). Fire regimes of southcentral US ecosystems occur in a geographic transition zone from reactant- to reaction-limited conditions, potentially making them uniquely responsive to different scenarios of temperature and precipitation changes. Identification and description of these conditions may help anticipate fire regime changes that will affect human health, agriculture, species conservation, and nutrient and water cycling.

  9. Directed Design of Experiments for Validating Probability of Detection Capability of NDE Systems (DOEPOD)

    Science.gov (United States)

    Generazio, Edward R.

    2015-01-01

    Directed Design of Experiments for Validating Probability of Detection Capability of NDE Systems (DOEPOD), Manual v.1.2. The capability of an inspection system is established by applications of various methodologies to determine the probability of detection (POD). One accepted metric of an adequate inspection system is that there is 95% confidence that the POD is greater than 90% (90/95 POD). Design of experiments for validating probability of detection capability of nondestructive evaluation (NDE) systems (DOEPOD) is a methodology that is implemented via software to serve as a diagnostic tool providing detailed analysis of POD test data, guidance on establishing data distribution requirements, and resolution of test issues. DOEPOD relies on the observance of occurrences. The DOEPOD capability has been developed to provide an efficient and accurate methodology that yields observed POD and confidence bounds for both Hit-Miss and signal amplitude testing. DOEPOD does not assume prescribed POD logarithmic or similar functions with assumed adequacy over a wide range of flaw sizes and inspection system technologies, so multi-parameter curve fitting or model optimization approaches to generate a POD curve are not required. DOEPOD applications for supporting inspector qualifications are included.

  10. Key Future Engineering Capabilities for Human Capital Retention

    Science.gov (United States)

    Sivich, Lorrie

    Projected record retirements of Baby Boomer generation engineers have been predicted to result in significant losses of mission-critical knowledge in space, national security, and future scientific ventures vital to high-technology corporations. No comprehensive review or analysis of engineering capabilities has been performed to identify threats related to the specific loss of mission-critical knowledge posed by the increasing retirement of tenured engineers. Archival data from a single diversified Fortune 500 aerospace manufacturing engineering company's engineering career database were analyzed to ascertain whether relationships linking future engineering capabilities, engineering disciplines, and years of engineering experience could be identified to define critical knowledge transfer models. Chi square, logistic, and linear regression analyses were used to map patterns of discipline-specific, mission-critical knowledge using archival data of engineers' perceptions of engineering capabilities, key developmental experiences, and knowledge learned from their engineering careers. The results from the study were used to document key future engineering capabilities. The results were then used to develop a proposed human capital retention plan to address specific key knowledge gaps of younger engineers as veteran engineers retire. The potential for social change from this study involves informing leaders of aerospace engineering corporations on how to build better-quality mentoring or succession plans to fill the void of lost knowledge from retiring engineers. This plan can secure mission-critical knowledge for younger engineers for current and future product development and increased global competitiveness in the technology market.

  11. Nuclear Research and Development Capabilities Needed to Support Future Growth

    Energy Technology Data Exchange (ETDEWEB)

    Wham, Robert M. [ORNL, P.O. Box 2008, Oak Ridge, TN 37831-6154 (United States); Kearns, Paul [Battelle Memorial Institute (United States); Marston, Ted [Marston Consulting (United States)

    2009-06-15

    The energy crisis looming before the United States can be resolved only by an approach that integrates a 'portfolio' of options. Nuclear energy, already an important element in the portfolio, should play an even more significant role in the future as the U.S. strives to attain energy independence and reduce carbon emissions. The DOE Office of Nuclear Energy asked Battelle Memorial Institute to obtain input from the commercial power generation industry on industry's vision for nuclear energy over the next 30-50 years. With this input, Battelle was asked to generate a set of research and development capabilities necessary for DOE to support the anticipated growth in nuclear power generation. This presentation, based on the report generated for the Office of Nuclear Energy, identifies the current and future nuclear research and development capabilities required to make this happen. The capabilities support: (1) continued, safe operation of the current fleet of nuclear plants; (2) the availability of a well qualified and trained workforce; (3) demonstration of the next generation nuclear plants; (4) development of a sustainable fuel cycle; (5) advanced technologies for maximizing resource utilization and minimizing waste; and (6) advanced modeling and simulation for rapid and reliable development and deployment of new nuclear technologies. To ensure that these capabilities are available, a Strategic Nuclear Energy Capability Initiative is proposed to provide the required resources during this critical period of time. (authors)

  12. Binomial Test Method for Determining Probability of Detection Capability for Fracture Critical Applications

    Science.gov (United States)

    Generazio, Edward R.

    2011-01-01

    The capability of an inspection system is established by applications of various methodologies to determine the probability of detection (POD). One accepted metric of an adequate inspection system is that for a minimum flaw size and all greater flaw sizes, there is 0.90 probability of detection with 95% confidence (90/95 POD). Directed design of experiments for probability of detection (DOEPOD) has been developed to provide an efficient and accurate methodology that yields estimates of POD and confidence bounds for both Hit-Miss and signal amplitude testing, where signal amplitudes are reduced to Hit-Miss by using a signal threshold. Directed DOEPOD uses a nonparametric approach for the analysis of inspection data that does not require any assumptions about the particular functional form of a POD function. The DOEPOD procedure identifies, for a given sample set, whether or not the minimum requirement of 0.90 probability of detection with 95% confidence is demonstrated for a minimum flaw size and for all greater flaw sizes (90/95 POD). The DOEPOD procedures are sequentially executed in order to minimize the number of samples needed to demonstrate that there is a 90/95 POD lower confidence bound at a given flaw size and that the POD is monotonic for flaw sizes exceeding that 90/95 POD flaw size. The conservativeness of the DOEPOD methodology results is discussed. Validated guidelines for binomial estimation of POD for fracture critical inspection are established.
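
    The 90/95 criterion can be checked with an exact one-sided binomial (Clopper-Pearson) lower confidence bound on POD; the classic result is that 29 hits in 29 trials at a given flaw size just clears it. A minimal sketch of that generic binomial calculation (not the DOEPOD software itself):

      from scipy.stats import beta

      def pod_lower_bound(hits, trials, confidence=0.95):
          # One-sided Clopper-Pearson lower confidence bound on POD
          if hits == 0:
              return 0.0
          return beta.ppf(1.0 - confidence, hits, trials - hits + 1)

      print(pod_lower_bound(29, 29))  # ~0.902: demonstrates 90/95 POD
      print(pod_lower_bound(28, 29))  # ~0.85:  does not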

  13. Probability

    CERN Document Server

    Shiryaev, A N

    1996-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author has included new material on the probability of large deviations and on the central limit theorem for sums of dependent random variables.

  14. Rationalization and future planning for AECL's research reactor capability

    International Nuclear Information System (INIS)

    Slater, J.B.

    1990-01-01

    AECL's research reactor capability has played a crucial role in the development of Canada's nuclear program. All essential concepts for the CANDU reactors were developed and tested in the NRX and NRU reactors, and in parallel, important contributions to basic physics were made. The technical feasibility of advanced fuel cycles and of the organic-cooled option for CANDU reactors was also demonstrated in the two reactors and the WR-1 reactor. In addition, an important and growing radio-isotope production industry was established and marketed on a world-wide basis. In 1984, however, it was recognized that a review and rationalization of the research reactor capability was required. The commercial success of the CANDU reactor system had reduced the scope and size of the required development program. Limited research and development funding, and competition from other research facilities and programs, required that the scope be reduced to the support base essential to maintaining strategic capability. Currently, AECL is part-way through this rationalization program, and completion should be attained during 1992/93, when the MAPLE reactor is operational and decisions on NRX decommissioning will be made. A companion paper describes some of the unique operational and maintenance problems which have resulted from this program and the solutions which have been developed. Future planning must recognize the age of the NRU reactor (currently 32 years) and the need to plan for eventual replacement. Strategy is being developed, and supporting studies include a full technical assessment of the NRU reactor and the required age-related upgrading program, evaluation of the performance characteristics and costs of potential future replacement reactors, particularly the advanced MAPLE concept, and opportunities for international co-operation in developing mutually supportive research programs.

  15. An Overview of Current and Future Stratospheric Balloon Mission Capabilities

    Science.gov (United States)

    Smith, Michael

    The modern stratospheric balloon has been used for a variety of missions since the late 1940s. Capabilities of these vehicles to carry larger payloads, fly to higher altitudes, and fly for longer periods of time have increased dramatically over this period. In addition to these basic performance metrics, reliability statistics for balloons have reached unprecedented levels in recent years. Balloon technology developed in the United States in the last decade has the potential to open a new era in economical space science using balloons. As always, the advantage of the balloon platform is that missions can be carried out at a fraction of the cost and schedule of orbital missions. A secondary advantage is that instruments can be re-flown numerous times while upgrading sensor and data processing technologies from year to year. New mission capabilities now have the potential for enabling groundbreaking observations using balloons as the primary platform, as opposed to a stepping stone to eventual orbital observatories. The limit of very high altitude balloon missions will be explored with respect to the current state of the art of balloon materials and fabrication. The same technological enablers will also be applied to possibilities for long duration missions at mid latitudes with payloads of several tons. The balloon types and their corresponding mission profiles will be presented in a performance matrix that will be useful for potential scientific users in planning future research programs.

  16. Actual growth and probable future of the worldwide nuclear industry

    International Nuclear Information System (INIS)

    Bupp, I.C.

    1981-01-01

    Worldwide nuclear-power-reactor manufacturing capacity will exceed worldwide demand by a factor of two or more during the 1980s. Only in France and the Soviet bloc countries is it likely that the ambitious nuclear-power programs formulated in the mid-1970s will be implemented. In all other developed countries and in most developing countries, further delays and cancellations of previously announced programs are all but certain. The stalemate over the future of nuclear power is particularly deep in America. Administrative and personnel problems in the Nuclear Regulatory Commission, slow progress on radioactive waste disposal by the Department of Energy, severe financial problems for most electric utilities, and drastic reductions in the rate of electricity demand growth combine to make continuation of the five-year-old moratorium on reactor orders inevitable. Many of the ninety plants under construction may never operate, and some of the seventy in operation may shut down before the end of their economic life. Contrary to widespread belief, further oil price increases may not speed up world-wide reactor sales. It is possible that the world is heading for the worst of all possible outcomes: a large number of small nuclear power programs that do little to meet real energy needs but substantially complicate the problem of nuclear weapons proliferation. 24 references, 4 tables

  17. The ESA River & Lake System: Current Capabilities and Future Potential

    DEFF Research Database (Denmark)

    Smith, Richard G.; Salloway, Mark; Berry, Philippa A. M.

    Measuring the earth's river and lake resources using satellite radar altimetry offers a unique global monitoring capability, which complements the detailed measurements made by the steadily decreasing number of in-situ gauges. To exploit this unique remote monitoring capability, a global pilot...

  18. Cultural Differences in Young Adults' Perceptions of the Probability of Future Family Life Events.

    Science.gov (United States)

    Speirs, Calandra; Huang, Vivian; Konnert, Candace

    2017-09-01

    Most young adults are exposed to family caregiving; however, little is known about their perceptions of their future caregiving activities such as the probability of becoming a caregiver for their parents or providing assistance in relocating to a nursing home. This study examined the perceived probability of these events among 182 young adults and the following predictors of their probability ratings: gender, ethnicity, work or volunteer experience, experiences with caregiving and nursing homes, expectations about these transitions, and filial piety. Results indicated that Asian or South Asian participants rated the probability of being a caregiver as significantly higher than Caucasian participants, and the probability of placing a parent in a nursing home as significantly lower. Filial piety was the strongest predictor of the probability of these life events, and it mediated the relationship between ethnicity and probability ratings. These findings indicate the significant role of filial piety in shaping perceptions of future life events.

  19. The Renovation and Future Capabilities of the Thacher Observatory

    Science.gov (United States)

    O'Neill, Katie; Osuna, Natalie; Edwards, Nick; Klink, Douglas; Swift, Jonathan; Vyhnal, Chris; Meyer, Kurt

    2016-01-01

    The Thacher School is in the process of renovating the campus observatory with a new meter class telescope and full automation capabilities for the purpose of scientific research and education. New equipment on site has provided a preliminary site characterization including seeing and V-band sky brightness measurements. These data, along with commissioning data from the MINERVA project (which uses comparable hardware) are used to estimate the capabilities of the observatory once renovation is complete. Our V-band limiting magnitude is expected to be better than 21.3 for a one minute integration time, and we estimate that milli-magnitude precision photometry will be possible for a V=14.5 point source over approximately 5 min timescales. The quick response, autonomous operation, and multi-band photometric capabilities of the renovated observatory will make it a powerful follow-up science facility for exoplanets, eclipsing binaries, near-Earth objects, stellar variability, and supernovae.

  20. Legitimacy, capability, effectiveness and the future of the NPT

    International Nuclear Information System (INIS)

    Keeley, J.F.

    1987-01-01

    This chapter looks at the relationship between legitimacy and capability in conceptually and politically contestable regions. This issue was highlighted by India's nuclear test of May 1974 and the Osiraq raid of 1981. These illustrated the general problem of the threat to the coherence and legitimacy of the non-proliferation regime. This threat arose from the spread of nuclear technological capabilities. Two developments in the non-proliferation regime that have helped produce the more specific problems of that regime are discussed. These are the spread of nuclear technological capabilities and the development of complex co-operation networks. The prospects for the modification of the NPT in response to these challenges are considered finally. (U.K.)

  1. Alternatives for Future U.S. Space-Launch Capabilities

    Science.gov (United States)

    2006-10-01

    A directive issued on January 14, 2004—called the new Vision for Space Exploration (VSE)—set out goals for future exploration of the solar system using manned spacecraft. Among those goals was a proposal to return humans to the moon no later than 2020. The ultimate goal... Estimates of U.S. launch capacity exclude the Sea Launch system operated by Boeing in partnership with RSC-Energia (based in Moscow) and Kvaerner ASA (based in Oslo)...

  2. Structure life prediction at high temperature: present and future capabilities

    International Nuclear Information System (INIS)

    Chaboche, J.L.

    1987-01-01

    The life prediction techniques for high temperature conditions include several aspects, which are considered successively in this article: crack initiation criteria themselves, defined for the isolated volume element (the tension-compression specimen, for example), including parametric relationships and continuous damage approaches; and the calculation of local stress and strain fields in the structure and their evolution under cyclic plasticity, which poses several difficult problems in obtaining stabilized cyclic solutions. The use of crack initiation criteria or damage rules on the results of the cyclic inelastic analysis, and the prediction of crack growth in the structure, are then considered. Different levels are considered for the predictive tools: the classical approach, future methods presently under development, and intermediate rules which are already in use. Several examples are given for materials and components used either in the nuclear industry or in gas turbine engines. (author)

  3. GaN-on-Silicon - Present capabilities and future directions

    Science.gov (United States)

    Boles, Timothy

    2018-02-01

    Gallium Nitride, in the form of epitaxial HEMT transistors on various substrate materials, is the newest and most promising semiconductor technology for high performance devices in the RF, microwave, and mmW arenas. This is particularly true for GaN-on-Silicon based devices and MMICs, which enable both state-of-the-art high frequency functionality and the ability to scale production into large wafer diameter CMOS foundries. The design and development of GaN-on-Silicon structures and devices will be presented, beginning with the basic material parameters and growth of the required epitaxial construction, and leading to the fundamental operational theory of high frequency, high power HEMTs. In this discussion, comparisons will be made with alternative substrate materials, with emphasis on contrasting the inherent advantages of a silicon based system. The theory of operation of microwave and mmW high power HEMT devices will be presented with special emphasis on fundamental limitations of device performance, including inherent frequency-limiting transit time analysis, required impedance transformations, internal and external parasitic reactance, thermal impedance optimization, and challenges addressed by full integration into monolithic MMICs. Lastly, future directions for implementing GaN-on-Silicon into mainstream CMOS silicon semiconductor technologies will be discussed.

  4. Space Shuttle Launch Probability Analysis: Understanding History so We Can Predict the Future

    Science.gov (United States)

    Cates, Grant R.

    2014-01-01

    The Space Shuttle was launched 135 times, and nearly half of those launches required 2 or more launch attempts. The historical record of 250 Space Shuttle launch countdown attempts provides a wealth of data that is important to analyze for strictly historical purposes as well as for use in predicting future launch vehicle countdown performance. This paper provides a statistical analysis of all Space Shuttle launch attempts, including the empirical probability of launch on any given attempt and the cumulative probability of launch relative to the planned launch date at the start of the initial launch countdown. This information can be used to facilitate launch probability predictions for future launch vehicles such as NASA's Space Shuttle-derived SLS. Understanding the cumulative probability of launch is particularly important for missions to Mars, since the launch opportunities are relatively short in duration and one must wait 2 years before a subsequent attempt can begin.
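
    Using the empirical figures quoted in the abstract (135 launches in 250 attempts), a rough cumulative launch probability can be sketched under the simplifying assumption that attempts are independent (real countdowns are not, since weather and hardware issues persist across attempts):

      # Empirical per-attempt launch probability from the Shuttle record
      p_attempt = 135 / 250   # ~0.54

      # Probability of having launched within k attempts, assuming
      # independent attempts (a simplification)
      for k in range(1, 6):
          p_cum = 1.0 - (1.0 - p_attempt) ** k
          print(f"P(launched within {k} attempts) = {p_cum:.2f}")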

  5. Mediators of the Availability Heuristic in Probability Estimates of Future Events.

    Science.gov (United States)

    Levi, Ariel S.; Pryor, John B.

    Individuals often estimate the probability of future events by the ease with which they can recall or cognitively construct relevant instances. Previous research has not precisely identified the cognitive processes mediating this "availability heuristic." Two potential mediators (imagery of the event, perceived reasons or causes for the…

  6. COMPUTER SUPPORT SYSTEMS FOR ESTIMATING CHEMICAL TOXICITY: PRESENT CAPABILITIES AND FUTURE TRENDS

    Science.gov (United States)

    Computer Support Systems for Estimating Chemical Toxicity: Present Capabilities and Future Trends. A wide variety of computer-based artificial intelligence (AI) and decision support systems currently exist to aid in the assessment of toxicity for environmental chemicals. T...

  7. The Future of Deterrent Capability for Medium-Sized Western Powers in the New Environment

    International Nuclear Information System (INIS)

    Quinlan, Michael

    2001-01-01

    What should be the longer-term future for the nuclear-weapons capabilities of France and the United Kingdom? I plan to tackle the subject in concrete terms. My presentation will be divided into three parts, and, though they are distinct rather than separate, they interact extensively. The first and largest part will relate to strategic context and concept: what aims, justifications and limitations should guide the future, or the absence of a future, for our capabilities? The second part, a good deal briefer, will be the practical content and character of the capabilities: what questions for decision will arise, and in what timescale, about the preservation, improvement or adjustment of the present capabilities? And the third part, still more briefly, will concern the political and institutional framework into which their future should or might be fitted. (author)

  8. Informing future NRT satellite distribution capabilities: Lessons learned from NASA's Land Atmosphere NRT capability for EOS (LANCE)

    Science.gov (United States)

    Davies, D.; Murphy, K. J.; Michael, K.

    2013-12-01

    NASA's Land Atmosphere Near real-time Capability for EOS (Earth Observing System) (LANCE) provides data and imagery from the Terra, Aqua and Aura satellites in less than 3 hours from satellite observation, to meet the needs of the near real-time (NRT) applications community. This article describes the architecture of LANCE and outlines the modifications made to achieve the 3-hour latency requirement, with a view to informing future NRT satellite distribution capabilities. It also describes how latency is determined. LANCE is a distributed system that builds on the existing EOS Data and Information System (EOSDIS) capabilities. To achieve the NRT latency requirement, many components of the EOS satellite operations, ground and science processing systems have been made more efficient without compromising the quality of science data processing. The EOS Data and Operations System (EDOS) processes the NRT stream with higher priority than the science data stream in order to minimize latency. In addition to expediting transfer times, the key difference between the NRT Level 0 products and those for standard science processing is the data used to determine the precise location and tilt of the satellite. Standard products use definitive geo-location (attitude and ephemeris) data provided daily, whereas NRT products use predicted geo-location provided by the instrument Global Positioning System (GPS) or approximation of navigational data (depending on platform). Level 0 data are processed into higher-level products at designated Science Investigator-led Processing Systems (SIPS). The processes used by LANCE have been streamlined and adapted to work with datasets as soon as they are downlinked from satellites or transmitted from ground stations. Level 2 products that require ancillary data have modified production rules that relax the requirements for ancillary data, thereby reducing processing times. Looking to the future, experience gained from LANCE can provide valuable lessons on

  9. Rainfall and net infiltration probabilities for future climate conditions at Yucca Mountain

    International Nuclear Information System (INIS)

    Long, A.; Childs, S.W.

    1993-01-01

    Performance assessment of repository integrity is a task rendered difficult because it requires predicting the future. This challenge has occupied many scientists who realize that the best assessments are required to maximize the probability of successful repository siting and design. As part of a performance assessment effort directed by EPRI, the authors have used probabilistic methods to assess the magnitude and timing of net infiltration at Yucca Mountain. A previously published mathematical model for net infiltration incorporated a probabilistic treatment of climate and surface hydrologic processes and a mathematical model of the infiltration process. In this paper, we present the details of the climatological analysis. The precipitation model is event-based, simulating characteristics of modern rainfall near Yucca Mountain, and then extending the model to most likely values for different degrees of pluvial climates. Next, the precipitation event model is fed into a process-based infiltration model that considers spatial variability in parameters relevant to net infiltration at Yucca Mountain. The model predicts that average annual net infiltration at Yucca Mountain will range from a mean of about 1 mm under present climatic conditions to a mean of at least 2.4 mm under full glacial (pluvial) conditions. Considerable variations about these means are expected to occur from year to year.

  10. Capability and dependency in the Newcastle 85+ cohort study. Projections of future care needs.

    Science.gov (United States)

    Jagger, Carol; Collerton, Joanna C; Davies, Karen; Kingston, Andrew; Robinson, Louise A; Eccles, Martin P; von Zglinicki, Thomas; Martin-Ruiz, Carmen; James, Oliver F W; Kirkwood, Tom B L; Bond, John

    2011-05-04

    Little is known of the capabilities of the oldest old, the fastest growing age group in the population. We aimed to estimate capability and dependency in a cohort of 85 year olds and to project future demand for care. Structured interviews were conducted at age 85 with 841 people born in 1921, living in Newcastle and North Tyneside, UK, and permanently registered with participating general practices. Measures of capability included self-reported activities of daily living (ADL), the timed up and go test (TUG), the standardised mini-mental state examination (SMMSE), and assessment of urinary continence in order to classify interval-need dependency. To project future demand for care, the proportion needing 24-hour care was applied to the 2008 England and Wales population projections of those aged 80 years and over, by gender. Of participants, 62% (522/841) were women, 77% (651/841) lived in standard housing, 13% (106/841) in sheltered housing and 10% (84/841) in a care home. Overall, 20% (165/841) reported no difficulty with any of the ADLs. Men were more capable in performing ADLs and more independent than women. TUG validated self-reported ADLs. When classified by 'interval of need', 41% (332/810) were independent, 39% (317/810) required help less often than daily, 12% (94/810) required help at regular times of the day and 8% (67/810) required 24-hour care. Of care-home residents, 94% (77/82) required daily help or 24-hour care. Future need for 24-hour care for people aged 80 years or over in England and Wales is projected to increase by 82% from 2010 to 2030, with a demand for 630,000 care-home places by 2030. This analysis highlights the diversity of capability and levels of dependency in this cohort. A remarkably high proportion remain independent, particularly men. However, a significant proportion of this population require 24-hour care at home or in care homes. Projections for the next 20 years suggest substantial increases in the number requiring 24-hour care due to

  11. Introduction of an Evaluation Tool to Predict the Probability of Success of Companies: The Innovativeness, Capabilities and Potential Model (ICP)

    Directory of Open Access Journals (Sweden)

    Michael Lewrick

    2009-05-01

    Successful innovation requires management and in this paper a model to help manage the innovation process is presented. This model can be used to audit the management capability to innovate and to monitor how sales increase is related to innovativeness. The model was developed from a study of companies in the high technology cluster around Munich and validated using statistical procedures. The model was found to be effective at predicting the success or otherwise of the innovation strategy pursued by the company. The use of this model and how it can be used to identify areas for improvement are documented in this paper.

  12. Quantifying the Global Fresh Water Budget: Capabilities from Current and Future Satellite Sensors

    Science.gov (United States)

    Hildebrand, Peter; Zaitchik, Benjamin

    2007-01-01

    The global water cycle is complex and its components are difficult to measure, particularly at global scales and with the precision needed for assessing climate impacts. Recent advances in satellite observational capabilities, however, are greatly improving our knowledge of the key terms in the fresh water flux budget. Many components of the global water budget, e.g., precipitation, atmospheric moisture profiles, soil moisture, snow cover, and sea ice, are now routinely measured globally using instruments on satellites such as TRMM, AQUA, TERRA, GRACE, and ICESat, as well as on operational satellites. New techniques, many using data assimilation approaches, are providing pathways toward measuring snow water equivalent, evapotranspiration, ground water, and ice mass, as well as improving the measurement quality for other components of the global water budget. This paper evaluates these current and developing satellite capabilities to observe the global fresh water budget, then looks forward to evaluate the potential for improvements that may result from future space missions as detailed by the US Decadal Survey, and from operational plans. Based on these analyses, and on the goal of improved knowledge of the global fresh water budget under the effects of climate change, we suggest some priorities for the future, based on new approaches that may provide the improved measurements and the analyses needed to understand and observe the potential speed-up of the global water cycle under the effects of climate change.

  13. Department of Defense Energy and Logistics: Implications of Historic and Future Cost, Risk, and Capability Analysis

    Science.gov (United States)

    Tisa, Paul C.

    Every year the DoD spends billions satisfying its large petroleum demand. This spending is highly sensitive to uncontrollable and poorly understood market forces. Additionally, while some stakeholders may not prioritize its monetary cost and risk, energy is fundamentally coupled to other critical factors. Energy, operational capability, and logistics are heavily intertwined and dependent on uncertain security environment and technology futures. These components and their relationships are less understood. Without better characterization, future capabilities may be significantly limited by present-day acquisition decisions. One attempt to demonstrate these costs and risks to decision makers has been through a metric known as the Fully Burdened Cost of Energy (FBCE). FBCE is defined as the commodity price for fuel plus many of these hidden costs. The metric encouraged a valuable conversation and is still required by law. However, most FBCE development stopped before the lessons from that conversation were incorporated. Current implementation is easy to employ but creates little value. Properly characterizing the costs and risks of energy and putting them in a useful tradespace requires a new framework. This research aims to highlight energy's complex role in many aspects of military operations, the critical need to incorporate it in decisions, and a novel framework to do so. It is broken into five parts. The first describes the motivation behind FBCE, the limits of current implementation, and outlines a new framework that aids decisions. Respectively, the second, third, and fourth present a historic analysis of the connections between military capabilities and energy, analyze the recent evolution of this conversation within the DoD, and pull the historic analysis into a revised framework. The final part quantifies the potential impacts of deeply uncertain futures and technological development and introduces an expanded framework that brings capability, energy, and

  14. Relationship between Future Time Orientation and Item Nonresponse on Subjective Probability Questions: A Cross-Cultural Analysis.

    Science.gov (United States)

    Lee, Sunghee; Liu, Mingnan; Hu, Mengyao

    2017-06-01

    Time orientation is an unconscious yet fundamental cognitive process that provides a framework for organizing personal experiences in temporal categories of past, present and future, reflecting the relative emphasis given to these categories. Culture lies central to individuals' time orientation, leading to cultural variations in time orientation. For example, people from future-oriented cultures tend to emphasize the future and store information relevant for the future more than those from present- or past-oriented cultures. For survey questions that ask respondents to report expected probabilities of future events, this may translate into culture-specific question difficulties, manifested through systematically varying "I don't know" item nonresponse rates. This study drew on the time orientation theory and examined culture-specific nonresponse patterns on subjective probability questions using methodologically comparable population-based surveys from multiple countries. The results supported our hypothesis. Item nonresponse rates on these questions varied significantly in the way that future-orientation at the group as well as individual level was associated with lower nonresponse rates. This pattern did not apply to non-probability questions. Our study also suggested potential nonresponse bias. Examining culture-specific constructs, such as time orientation, as a framework for measurement mechanisms may contribute to improving cross-cultural research.

  15. Projection of Korean Probable Maximum Precipitation under Future Climate Change Scenarios

    Directory of Open Access Journals (Sweden)

    Okjeong Lee

    2016-01-01

    According to the IPCC Fifth Assessment Report, air temperature and humidity are expected to gradually increase over current levels. In this study, future PMPs are estimated by using future dew point temperature projection data obtained from RCM data provided by the Korea Meteorological Administration. First, the bias in the future dew point temperature projection data, which are provided on a daily basis, is corrected through a quantile-mapping method. Next, using a scale-invariance technique, 12-hour duration 100-year return period dew point temperatures, which are essential input data for PMP estimation, are estimated from the bias-corrected future dew point temperature data. After estimating future PMPs, it can be shown that PMPs in all future climate change scenarios (AR5 RCP 2.6, RCP 4.5, RCP 6.0, and RCP 8.5) are very likely to increase.
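
    Quantile mapping corrects a model variable by matching its quantiles to the observed distribution over a common historical period. A minimal empirical sketch (illustrative synthetic data; the study's exact implementation is not reproduced here):

      import numpy as np

      def quantile_map(model_hist, obs_hist, model_future):
          # Locate each future value's quantile in the historical model
          # distribution, then read off the observed value at that quantile
          q = np.searchsorted(np.sort(model_hist), model_future) / len(model_hist)
          return np.quantile(obs_hist, np.clip(q, 0.0, 1.0))

      rng = np.random.default_rng(0)
      obs = rng.normal(16.0, 3.0, 1000)          # observed dew point (hypothetical)
      model_hist = rng.normal(14.0, 4.0, 1000)   # biased model, historical period
      model_fut = rng.normal(15.0, 4.0, 1000)    # model projection, future period

      corrected = quantile_map(model_hist, obs, model_fut)
      print(corrected.mean())   # bias-corrected future dew point values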

  16. The NASA MSFC Electrostatic Levitation (ESL) Laboratory: Summary of Capabilities, Recent Upgrades, and Future Work

    Science.gov (United States)

    SanSoucie, Michael P.; Vermilion, David J.; Rogers, Jan R.

    2015-01-01

    The NASA Marshall Space Flight Center (MSFC) electrostatic levitation (ESL) laboratory has a long history of providing materials research and thermophysical property data. A summary of the lab's capabilities, recent upgrades, and ongoing and future work will be provided. The laboratory has recently added two new capabilities to its main levitation chamber: a rapid quench system and an oxygen control system. The rapid quench system allows samples to be dropped into a quench vessel that can be filled with a low melting point material, such as a gallium or indium alloy, thereby allowing rapid quenching of undercooled liquid metals. The oxygen control system consists of an oxygen sensor, an oxygen pump, and a control unit. The sensor is a potentiometric device that determines the difference in oxygen activity between two gas compartments separated by an electrolyte, which is yttria-stabilized zirconia. The pump utilizes coulometric titration to either add or remove oxygen. The system is controlled by a desktop control unit, which can also be accessed via a computer. This system allows the oxygen partial pressure within the vacuum chamber to be measured and controlled, theoretically in the range from 10^-36 to 10^0 bar. The ESL laboratory also has an emissometer, called the High-Temperature Emissivity Measurement System (HiTEMS). This system measures the spectral emissivity of materials from 600 °C to 3,000 °C. The system consists of a vacuum chamber, a black body source, and a Fourier Transform Infrared Spectrometer (FTIR). The system utilizes optics to swap the signal between the sample and the black body. The system was originally designed to measure the hemispherical spectral emissivity of levitated samples, which are typically 2.5 mm spheres. Levitation allows emissivity measurements of molten samples, but more work is required to develop this capability. The system is currently set up to measure the near-normal spectral emissivity of stationary samples, which has been used

  17. Assessing the present and future probability of Hurricane Harvey’s rainfall

    OpenAIRE

    Emanuel, Kerry

    2017-01-01

    Significance: Natural disasters such as the recent Hurricanes Harvey, Irma, and Maria highlight the need for quantitative estimates of the risk of such disasters. Statistically based risk assessment suffers from short records of often poor quality, and in the case of meteorological hazards, from the fact that the underlying climate is changing. This study shows how a recently developed physics-based risk assessment method can be applied to assessing the probabilities of extreme hurricane rainf...

  18. Arctic Observing Network Data Management: Current Capabilities and Their Promise for the Future

    Science.gov (United States)

    Collins, J.; Fetterer, F.; Moore, J. A.

    2008-12-01

    CADIS (the Cooperative Arctic Data and Information Service) serves as the data management, discovery and delivery component of the Arctic Observing Network (AON). As an International Polar Year (IPY) initiative, AON comprises 34 land, atmosphere and ocean observation sites, and will acquire much of the data coming from the interagency Study of Environmental Arctic Change (SEARCH). CADIS is tasked with ensuring that these observational data are managed for long-term use by members of the entire Earth System Science community. Portions of CADIS are either in use by the community or available for testing. We now have an opportunity to evaluate the feedback received from our users, to identify any design shortcomings, and to identify those elements which serve their purpose well and will support future development. This presentation will focus on the nuts-and-bolts of the CADIS development to date, with an eye towards presenting lessons learned and best practices based on our experiences so far. The topics include:

    - How did we assess our users' needs, and how are those contributions reflected in the end product and its capabilities?

    - Why did we develop a CADIS metadata profile, and how does it allow CADIS to support preservation and scientific interoperability?

    - How can we shield the user from metadata complexities (especially those associated with various standards) while still obtaining the metadata needed to support an effective data management system?

    - How can we bridge the gap between the data storage formats considered convenient by researchers in the field, and those which are necessary to provide data interoperability?

    - What challenges have been encountered in our efforts to provide access to federated data (data stored outside of the CADIS system)?

    - What are the data browsing and visualization needs of the AON community, and which tools and technologies are most promising in terms of supporting those needs?

    A live demonstration of the current

  19. Future fire probability modeling with climate change data and physical chemistry

    Science.gov (United States)

    Richard P. Guyette; Frank R. Thompson; Jodi Whittier; Michael C. Stambaugh; Daniel C. Dey

    2014-01-01

    Climate has a primary influence on the occurrence and rate of combustion in ecosystems with carbon-based fuels such as forests and grasslands. Society will be confronted with the effects of climate change on fire in future forests. There are, however, few quantitative appraisals of how climate will affect wildland fire in the United States. We demonstrated a method for...

  20. Enabling unmanned capabilities in the tactical wheeled vehicle fleet of the future

    Science.gov (United States)

    Zych, Noah

    2012-06-01

    From transporting troops and weapons systems to supplying beans, bullets, and Band-Aids to front-line warfighters, tactical wheeled vehicles serve as the materiel backbone anywhere there are boots on the ground. Drawing from the U.S. Army's Tactical Wheeled Vehicle Strategy and the Marine Corps Vision & Strategy 2025 reports, one may conclude that the services have modest expectations for the introduction of large unmanned ground systems into operational roles in the next 15 years. However, the Department of Defense has already invested considerably in the research and development of full-size UGVs, and commanders deployed in both Iraq and Afghanistan have advocated the urgent fielding of early incarnations of this technology, believing it could make a difference on their battlefields today. For military UGVs to evolve from mere tactical advantages into strategic assets with developed doctrine, they must become as trustworthy as a well-trained warfighter in performing their assigned task. Starting with the Marine Corps' ongoing Cargo Unmanned Ground Vehicle program as a baseline, and informed by feedback from previously deployed subject matter experts, this paper examines the gaps which presently exist in UGVs from a mission-capable perspective. It then considers viable near-term technical solutions to meet today's functional requirements, as well as long-term development strategies to enable truly robust performance. With future conflicts expected to be characterized by increasingly complex operational environments and a broad spectrum of rapidly adapting threats, one of the largest challenges for unmanned ground systems will be the ability to exhibit agility in unpredictable circumstances.

  1. Land use planning and wildfire: development policies influence future probability of housing loss

    Science.gov (United States)

    Syphard, Alexandra D.; Massada, Avi Bar; Butsic, Van; Keeley, Jon E.

    2013-01-01

    Increasing numbers of homes are being destroyed by wildfire in the wildland-urban interface. With projections of climate change and housing growth potentially exacerbating the threat of wildfire to homes and property, effective fire-risk reduction alternatives are needed as part of a comprehensive fire management plan. Land use planning represents a shift in traditional thinking from trying to eliminate wildfires, or even increasing resilience to them, toward avoiding exposure to them through the informed placement of new residential structures. For land use planning to be effective, it needs to be based on solid understanding of where and how to locate and arrange new homes. We simulated three scenarios of future residential development and projected landscape-level wildfire risk to residential structures in a rapidly urbanizing, fire-prone region in southern California. We based all future development on an econometric subdivision model, but we varied the emphasis of subdivision decision-making based on three broad and common growth types: infill, expansion, and leapfrog. Simulation results showed that decision-making based on these growth types, when applied locally for subdivision of individual parcels, produced substantial landscape-level differences in pattern, location, and extent of development. These differences in development, in turn, affected the area and proportion of structures at risk from burning in wildfires. Scenarios with lower housing density and larger numbers of small, isolated clusters of development, i.e., resulting from leapfrog development, were generally predicted to have the highest predicted fire risk to the largest proportion of structures in the study area, and infill development was predicted to have the lowest risk. These results suggest that land use planning should be considered an important component to fire risk management and that consistently applied policies based on residential pattern may provide substantial benefits for

  2. Optimism as a prior belief about the probability of future reward.

    Directory of Open Access Journals (Sweden)

    Aistis Stankevicius

    2014-05-01

    Optimists hold positive a priori beliefs about the future. In Bayesian statistical theory, a priori beliefs can be overcome by experience. However, optimistic beliefs can at times appear surprisingly resistant to evidence, suggesting that optimism might also influence how new information is selected and learned. Here, we use a novel Pavlovian conditioning task, embedded in a normative framework, to directly assess how trait optimism, as classically measured using self-report questionnaires, influences choices between visual targets, as learning about their association with reward progresses. We find that trait optimism relates to an a priori belief about the likelihood of rewards, but not losses, in our task. Critically, this positive belief behaves like a probabilistic prior, i.e., its influence diminishes with increasing experience. Contrary to findings in the literature related to unrealistic optimism and self-beliefs, it does not appear to influence the iterative learning process directly.
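
    The central claim, that an optimistic prior behaves like a probabilistic prior whose influence fades with experience, can be illustrated with a Beta-Bernoulli model (a generic sketch, not the authors' analysis code; all numbers are hypothetical):

      import numpy as np

      rng = np.random.default_rng(1)
      true_p = 0.3        # true reward probability of a target
      a, b = 8.0, 2.0     # "optimistic" Beta prior with mean 0.8

      for n in [0, 5, 20, 100, 500]:
          rewards = rng.binomial(1, true_p, size=n).sum()
          post_mean = (a + rewards) / (a + b + n)   # posterior mean belief
          print(f"after {n:3d} trials, posterior mean belief = {post_mean:.2f}")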

  3. Optimism as a Prior Belief about the Probability of Future Reward

    Science.gov (United States)

    Kalra, Aditi; Seriès, Peggy

    2014-01-01

    Optimists hold positive a priori beliefs about the future. In Bayesian statistical theory, a priori beliefs can be overcome by experience. However, optimistic beliefs can at times appear surprisingly resistant to evidence, suggesting that optimism might also influence how new information is selected and learned. Here, we use a novel Pavlovian conditioning task, embedded in a normative framework, to directly assess how trait optimism, as classically measured using self-report questionnaires, influences choices between visual targets as learning about their association with reward progresses. We find that trait optimism relates to an a priori belief about the likelihood of rewards, but not losses, in our task. Critically, this positive belief behaves like a probabilistic prior, i.e., its influence reduces with increasing experience. Contrary to findings in the literature related to unrealistic optimism and self-beliefs, it does not appear to influence the iterative learning process directly. PMID:24853098
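
    The central claim here, that trait optimism behaves like a probabilistic prior whose influence reduces with experience, can be illustrated with a simple Beta-Bernoulli update. A minimal sketch, not the authors' model: the prior strength, true reward rate, and trial counts below are illustrative assumptions.

```python
import numpy as np

# Hypothetical optimistic prior on a target's reward probability:
# Beta(a, b) with a > b encodes "rewards are likely" before any evidence.
a, b = 6.0, 2.0          # assumed prior (mean 0.75)
p_true = 0.4             # assumed true reward probability

rng = np.random.default_rng(0)
for n_trials in (0, 10, 100, 1000):
    rewards = rng.random(n_trials) < p_true
    posterior_mean = (a + rewards.sum()) / (a + b + n_trials)
    print(f"{n_trials:5d} trials -> belief in reward = {posterior_mean:.3f}")

# With no experience, the belief equals the optimistic prior mean (0.75);
# with many trials it converges to the true rate (0.4), i.e. the prior's
# influence diminishes with increasing experience, as the abstract reports.
```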

  4. Stress transferred by the 1995 Mw = 6.9 Kobe, Japan, shock: Effect on aftershocks and future earthquake probabilities

    Science.gov (United States)

    Toda, S.; Stein, R.S.; Reasenberg, P.A.; Dieterich, J.H.; Yoshida, A.

    1998-01-01

    The Kobe earthquake struck at the edge of the densely populated Osaka-Kyoto corridor in southwest Japan. We investigate how the earthquake transferred stress to nearby faults, altering their proximity to failure and thus changing earthquake probabilities. We find that relative to the pre-Kobe seismicity, Kobe aftershocks were concentrated in regions of calculated Coulomb stress increase and less common in regions of stress decrease. We quantify this relationship by forming the spatial correlation between the seismicity rate change and the Coulomb stress change. The correlation is significant for stress changes greater than 0.2-1.0 bars (0.02-0.1 MPa), and the nonlinear dependence of seismicity rate change on stress change is compatible with a state- and rate-dependent formulation for earthquake occurrence. We extend this analysis to future mainshocks by resolving the stress changes on major faults within 100 km of Kobe and calculating the change in probability caused by these stress changes. Transient effects of the stress changes are incorporated by the state-dependent constitutive relation, which amplifies the permanent stress changes during the aftershock period. Earthquake probability framed in this manner is highly time-dependent, much more so than is assumed in current practice. Because the probabilities depend on several poorly known parameters of the major faults, we estimate uncertainties of the probabilities by Monte Carlo simulation. This enables us to include uncertainties on the elapsed time since the last earthquake, the repeat time and its variability, and the period of aftershock decay. We estimate that a calculated 3-bar (0.3-MPa) stress increase on the eastern section of the Arima-Takatsuki Tectonic Line (ATTL) near Kyoto causes a fivefold increase in the 30-year probability of a subsequent large earthquake near Kyoto; a 2-bar (0.2-MPa) stress decrease on the western section of the ATTL results in a reduction in probability by a factor of 140 to
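
    The probability machinery described above, a renewal model whose poorly known fault parameters are sampled by Monte Carlo, with the stress step applied as a permanent clock change, can be sketched compactly. This is a schematic illustration only, assuming a lognormal recurrence model; none of the parameter values below are from the study.

```python
import numpy as np
from scipy.stats import lognorm

rng = np.random.default_rng(1)
n = 100_000

# Illustrative fault parameters (assumed, not the study's values)
mean_repeat = np.maximum(rng.normal(1000.0, 200.0, n), 100.0)  # repeat time [yr]
aperiodicity = 0.5                                             # coeff. of variation
elapsed = np.maximum(rng.normal(400.0, 100.0, n), 0.0)         # quiet time [yr]
d_tau = 3.0                                                    # stress step [bar]
stressing_rate = 0.05                                          # secular rate [bar/yr]
clock_advance = d_tau / stressing_rate                         # permanent shift [yr]

def cond_prob(t_elapsed, horizon, mu, cov):
    """P(event within `horizon` yr | quiet for `t_elapsed` yr), lognormal renewal."""
    sigma = np.sqrt(np.log(1.0 + cov**2))
    scale = mu / np.sqrt(1.0 + cov**2)   # lognormal scale giving mean `mu`
    cdf = lambda t: lognorm.cdf(t, s=sigma, scale=scale)
    return (cdf(t_elapsed + horizon) - cdf(t_elapsed)) / (1.0 - cdf(t_elapsed))

p0 = cond_prob(elapsed, 30.0, mean_repeat, aperiodicity)
p1 = cond_prob(elapsed + clock_advance, 30.0, mean_repeat, aperiodicity)
print(f"30-yr probability: {p0.mean():.3f} -> {p1.mean():.3f} after stress step")
```

    Spread in the sampled parameters translates directly into an uncertainty band on the probability change, which is the point of the Monte Carlo treatment in the abstract.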

  5. NASA Capabilities That Could Impact Terrestrial Smart Grids of the Future

    Science.gov (United States)

    Beach, Raymond F.

    2015-01-01

    Incremental steps to steadily build, test, refine, and qualify capabilities that lead to affordable flight elements and a deep space capability. Potential Deep Space Vehicle power system characteristics: power: 10 kilowatts average; two independent power channels with multi-level cross-strapping; solar array power: 24+ kilowatts; multi-junction arrays; lithium-ion battery storage: 200+ ampere-hours; sized for deep space or low lunar orbit operation; distribution: 120 volts secondary (SAE AS 5698); 2-kilowatt power transfer between vehicles.

  6. Capability and Interface Assessment of Gaming Technologies for Future Multi-Unmanned Air Vehicle Systems

    Science.gov (United States)

    2011-08-01

    Playing Games (MMORPG), which necessitate the management of multiple independent entities with sophisticated capabilities; and finally, arcade-style ... tested platform for simultaneous control of multiple entities. Similarly, the popularity of Massively Multiplayer Online Role Playing Games (MMORPG

  7. Trends in Human-Computer Interaction to Support Future Intelligence Analysis Capabilities

    Science.gov (United States)

    2011-06-01

    Oblong Industries Inc. (Oblong, 2011). In addition to the camera-based gesture interaction (Figure 4), this system offers a management capability ... Display technologies surveyed include the EyeTap, Lumus Eyewear LOE, FogScreen, HP LiM PC, Microvision PEK and SHOWWX pico projectors, head-mounted displays, and the Chinese Holo Screen ... Advanced Analyst

  8. Survey Probability and Factors affecting Farmers Participation in Future and Option Markets Case Study: Cotton product in Gonbad kavos city

    Directory of Open Access Journals (Sweden)

    F. sakhi

    2016-03-01

    .5, respectively. Multinomial Logit model estimation results for the probability of participation in the future and option markets showed that the variables of level of education, farm ownership, cotton acreage, non-farm income, work experience in agriculture, the index of willingness to use new technologies, the index of cotton market risk perception and the risk aversion index are statistically significant. The variables of farm ownership, non-farm income and work experience in agriculture showed negative effects, and the other variables showed positive effects on the probability of participation in these markets. The results are in line with previous studies. Conclusion: The purpose of the current study was to look at the possibility of farmers' participation in the future and option markets, presented as a means to reduce cotton price volatility. The dependent variable for this purpose has four categories: no participation, participation in the future market, participation in the option market, and participation in both future and option markets. A Multinomial Logit regression model was used for data analysis. Results indicated that during the period of 2014-2015, 35% of the sampled cotton growers were unwilling to participate in the future and option markets. Farmers' willingness to participate in the future and option markets was 19% and 21.5%, respectively. Multinomial Logit model estimation results for the probability of participation in the future and option markets showed that the variables of level of education, farm ownership, cotton acreage, non-farm income, work experience in agriculture, the index of willingness to use new technologies, the index of cotton market risk perception and the risk aversion index were statistically significant. The variables of farm ownership, non-farm income and work experience in agriculture showed negative effects, and the other variables positive effects, on the probability of participation in these markets. The results are in line
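
    A multinomial logit of this kind can be estimated with standard tooling. A minimal sketch on synthetic data follows; the covariates, coefficients, and category coding are placeholders, not the survey data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 500

# Synthetic farmer covariates (illustrative only)
X = np.column_stack([
    rng.integers(0, 16, n),        # years of education
    rng.random(n),                 # risk-aversion index
    rng.normal(5, 2, n),           # cotton acreage (ha)
])
X = sm.add_constant(X)

# Outcome: 0 = no participation, 1 = futures only, 2 = options only, 3 = both
B = rng.normal(0, 0.3, size=(4, 4))              # invented true coefficients
logits = X @ B.T
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
y = np.array([rng.choice(4, p=p) for p in probs])

model = sm.MNLogit(y, X).fit(disp=False)
print(model.summary())   # coefficients are log-odds relative to category 0
```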

  9. Advanced simulation capability for environmental management - current status and future applications

    Energy Technology Data Exchange (ETDEWEB)

    Freshley, Mark; Scheibe, Timothy [Pacific Northwest National Laboratory, Richland, Washington (United States); Robinson, Bruce; Moulton, J. David; Dixon, Paul [Los Alamos National Laboratory, Los Alamos, New Mexico (United States); Marble, Justin; Gerdes, Kurt [U.S. Department of Energy, Office of Environmental Management, Washington DC (United States); Stockton, Tom [Neptune and Company, Inc, Los Alamos, New Mexico (United States); Seitz, Roger [Savannah River National Laboratory, Aiken, South Carolina (United States); Black, Paul [Neptune and Company, Inc, Lakewood, Colorado (United States)

    2013-07-01

    The U.S. Department of Energy (US DOE) Office of Environmental Management (EM), Office of Soil and Groundwater (EM-12), is supporting development of the Advanced Simulation Capability for Environmental Management (ASCEM). ASCEM is a state-of-the-art scientific tool and approach currently aimed at understanding and predicting contaminant fate and transport in natural and engineered systems. ASCEM is a modular and open-source high-performance computing tool. It will be used to facilitate integrated approaches to modeling and site characterization, and to provide robust and standardized assessments of performance and risk for EM cleanup and closure activities. The ASCEM project continues to make significant progress in the development of capabilities, with current emphasis on integration of capabilities in FY12. Capability development is occurring for both the Platform and Integrated Toolsets and the High-Performance Computing (HPC) multi-process simulator. The Platform capabilities provide the user interface and tools for end-to-end model development, starting with definition of the conceptual model, management of data for model input, model calibration and uncertainty analysis, and processing of model output, including visualization. The HPC capabilities target increased functionality of process model representations, toolsets for interaction with the Platform, and verification and model confidence testing. The integration of the Platform and HPC capabilities was tested and evaluated for EM applications in a set of demonstrations as part of Site Applications Thrust Area activities in 2012. The current maturity of the ASCEM computational and analysis capabilities has afforded the opportunity for collaborative efforts to develop decision analysis tools to support and optimize radioactive waste disposal. Recent advances in computerized decision analysis frameworks provide the perfect opportunity to bring this capability into ASCEM. This will allow radioactive waste

  10. Capabilities, performance, and future possibilities of high frequency polyphase resonant converters

    International Nuclear Information System (INIS)

    Reass, W.A.; Baca, D.M.; Bradley, J.T. III; Hardek, T.W.; Kwon, S.I.; Lynch, M.T.; Rees, D.E.

    2004-01-01

    High Frequency Polyphase Resonant Power Conditioning (PRPC) techniques developed at Los Alamos National Laboratory (LANL) are now being utilized for the Oak Ridge National Laboratory (ORNL) Spallation Neutron Source (SNS) accelerator klystron RF amplifier power systems. Three different styles of polyphase resonant converter modulators were developed for the SNS application. The various systems operate up to 140 kV, or 11 MW pulses, or up to 1.1 MW average power, all from a DC input of +/- 1.2 kV. Component improvements realized with the SNS effort, coupled with new applied engineering techniques, have resulted in dramatic changes in RF power conditioning topology. As an example, the high-voltage transformers are over 100 times smaller and lighter than equivalent 60 Hz versions. With resonant conversion techniques, load protective networks are not required. A shorted load de-tunes the resonance and little power transfer can occur. This provides for power conditioning systems that are inherently self-protective, with automatic fault 'ride-through' capabilities. By altering the Los Alamos design, higher-power and CW power conditioning systems can be realized without further demands on the individual component voltage or current capabilities. This has led to designs that can accommodate 30 MW long-pulse applications and megawatt-class CW systems with high efficiencies. The same PRPC techniques can also be utilized for lower average power systems (∼250 kW). This permits the use of significantly higher frequency conversion techniques that result in extremely compact systems with short-pulse (10 to 100 μs) capabilities. These lower-power PRPC systems may be suitable for medical linacs and mobile RF systems. This paper will briefly review the performance achieved for the SNS accelerator and examine designs for high-efficiency megawatt-class CW systems and 30 MW peak power applications. The devices and designs for compact higher frequency converters utilized for short pulse

  11. The Sentry Autonomous Underwater Vehicle: Field Trial Results and Future Capabilities

    Science.gov (United States)

    Yoerger, D. R.; Bradley, A. M.; Martin, S. C.; Whitcomb, L. L.

    2006-12-01

    The Sentry autonomous underwater vehicle combines an efficient long-range survey capability with the ability to maneuver at low speeds. These attributes will permit Sentry to perform a variety of conventional and unconventional surveys, including long-range sonar surveys, hydrothermal plume surveys and near-bottom photo surveys. Sentry's streamlined body and fore and aft tilting planes, each possessing an independently controlled thruster, enable efficient operation in both near-bottom and cruising operations. Sentry can be configured in two modes: hover mode, which commands Sentry's control surfaces to be aligned vertically, and forward flight mode, which allows Sentry's control surfaces to actuate between plus or minus 45 degrees. Sentry is equipped for full 6-degree-of-freedom position measurement. Vehicle heading, roll, and pitch are instrumented with a TCM2 PNI heading and attitude sensor. A Systron Donner yaw rate sensor measures heading rate. Depth is instrumented by a Paroscientific depth sensor. A 300 kHz RD Instruments Doppler sonar provides altitude and XYZ velocity measurements. In April 2006, we conducted our first deep-water field trials of Sentry in Bermuda. These trials enabled us to examine a variety of issues, including the control software, vehicle safety systems, launch and recovery procedures, operation at depth, heading and depth controllers over a range of speeds, and power consumption. Sentry employs a control system based upon the Jason 2 control system for low-level control, which has proven effective and reliable over several hundred deep-water dives. The Jason 2 control system, developed jointly at Johns Hopkins University and Woods Hole Oceanographic Institution, was augmented to manage Sentry-specific devices (sensors, actuators, and power storage) and to employ a high-level mission controller that supports autonomous mission scripting and error detection and response. This control suite will also support the Nereus

  12. Future capabilities of CME polarimetric 3D reconstructions with the METIS instrument: A numerical test

    Science.gov (United States)

    Pagano, P.; Bemporad, A.; Mackay, D. H.

    2015-10-01

    fundamental importance for future space weather forecasting. In addition, we find that the column density derived from white-light images is accurate and we propose an improved technique where the combined use of the polarization ratio technique and white-light images minimizes the error in the estimation of column densities. Moreover, by applying the comparison to a set of snapshots of the simulation we can also assess the errors related to the trajectory and the expansion of the CME. Conclusions: Our method allows us to thoroughly test the performance of the polarization ratio technique and allows a determination of the errors associated with it, which means that it can be used to quantify the results from the analysis of the forthcoming METIS observations in white light (total and polarized brightness). Finally, we describe a satellite observing configuration relative to the Earth that can allow the technique to be efficiently used for space weather predictions. A movie attached to Fig. 15 is available in electronic form at http://www.aanda.org

  13. Study for Safeguards Challenges to the Most Probably First Indonesian Future Power Plant of the Pebble Bed Modular Reactor

    International Nuclear Information System (INIS)

    Susilowati, E.

    2015-01-01

    In the near future Indonesia, the fourth most populous country, plans to build a small-size power plant, most probably a Pebble Bed Modular Reactor (PBMR). This first nuclear power plant (NPP) is intended to give society a clear picture of the performance and safety of nuclear power plant operation. The selection of the PBMR rests on several factors, including the combination of the reactor's small size and its fuel type, which allows the use of passive safety systems, resulting in essential advantages in nuclear plant design and less dependence on plant operators for safety. From a safeguards perspective this type of reactor is also quite different from previous light water reactor (LWR) designs. The reactor contains a large number of small fuel elements produced without individual serial numbers, combined with on-line refueling as in the CANDU reactor, posing a new challenge to the safeguards approach for this type of reactor. This paper discusses the set of safeguards measures that the facility operator has to prepare to support successful international verification of nuclear material and the facility, including elements of design relevant to safeguards that need to be settled in consultation with the regulatory body, the supplier or designer, and the Agency/IAEA, such as nuclear material balance areas and key measurement points; possible diversion scenarios and safeguards strategy; and design features relevant to the IAEA equipment that has to be installed at the reactor facility. It is expected that the results of this discussion will support the Agency in establishing the safeguards measures that may be applied to Indonesia's first PBMR power plant during its construction and operation. (author)

  14. Development and evaluation of an ultra-fast ASIC for future PET scanners using TOF-capable MPPC array detectors

    International Nuclear Information System (INIS)

    Ambe, T.; Ikeda, H.; Kataoka, J.; Matsuda, H.; Kato, T.

    2015-01-01

    We developed a front-end ASIC for future PET scanners with Time-Of-Flight (TOF) capability to be coupled with 4×4 Multi-Pixel Photon Counter (MPPC) arrays. The ASIC is designed based on the open-IP project proposed by JAXA and realized in TSMC 0.35 μm CMOS technology. The circuit comprises 16-channel, low-impedance current conveyors for effectively acquiring fast MPPC signals. For precise measurement of the coincidence timing of 511-keV gamma rays, the leading-edge method was used to discriminate the signals. We first tested the time response of the ASIC by illuminating each channel of an MPPC array device 3×3 mm² in size with a picosecond light pulser with a light emission peak of 655 nm and pulse duration of 54 ps (FWHM). We obtained 105 ps (FWHM) on average for each channel in time jitter measurements. Moreover, we compensated for the time lag of each channel with inner delay circuits and succeeded in suppressing a lag of about 700 ps to only 15 ps. This paper reports TOF measurements using back-to-back 511-keV signals, and suggests that the ASIC can be a promising device for future TOF-PET scanners based on the MPPC array. - Highlights: • We developed a newly designed large-area monolithic MPPC array. • We obtained fine gain uniformity, and good energy and time resolutions when coupled to the LYSO scintillator. • We fabricated a gamma-ray camera consisting of the MPPC array and the submillimeter pixelized LYSO and GGAG scintillators. • In the flood images, each crystal of the scintillator matrices was clearly resolved. • Good energy resolutions for 662 keV gamma rays were obtained for each of the LYSO and GGAG scintillator matrices
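
    The practical meaning of these jitter figures for TOF-PET is the localization they buy along the line of response, Δx = c·Δt/2. A quick arithmetic check, assuming the two coincident channels contribute independently so their FWHM values add in quadrature:

```python
import math

C = 2.998e8                        # speed of light [m/s]
fwhm_single = 105e-12              # per-channel time jitter, FWHM [s]

# Two independent detectors in coincidence add in quadrature
crt = math.sqrt(2) * fwhm_single   # coincidence resolving time, ~148 ps
dx = C * crt / 2                   # localization along the line of response

print(f"CRT = {crt * 1e12:.0f} ps (FWHM)")
print(f"dx  = {dx * 100:.1f} cm along the line of response")
```

    Roughly 2 cm of localization per event, which is what makes such timing performance attractive for image signal-to-noise in TOF-PET.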

  15. A Comparative Assessment of the Navy’s Future Naval Capabilities (FNC) Process and Joint Staff Capability Gap Assessment Process as Related to Pacific Commands (PACOM) Integrated Priority List Submission

    Science.gov (United States)

    2013-04-01

    based on personal interviews with Kit Carlan and Ken Bruner of PACOM, and PowerPoint slides dated March 23, 2011, prepared by Kit Carlan. ... Integration Branch, Joint Capability Division, J-8, Joint Staff; Mr. Ken Bruner, Science and Technology Advisor, PACOM; Mr. Kit Carlan, Future

  16. A Comparative Assessment of the Navy’s Future Naval Capabilities (FNC) Process and Joint Staff Capability Gap Assessment Process as Related to Pacific Command’s Integrated Priority List Submission

    Science.gov (United States)

    2012-12-13

    is based on personal interviews with Kit Carlan and Ken Bruner of PACOM, and PowerPoint slides dated March 23, 2011 that were prepared by Kit Carlan. ... Ken Bruner, Science and Technology Advisor, PACOM; Mr. Kit Carlan, Future Capabilities Analyst, J-82, PACOM for key information and helpful

  17. Greenland plays a large role in the gloomy picture painted of probable future sea-level rise

    Science.gov (United States)

    Hanna, Edward

    2012-12-01

    very coarse at 5.625° latitude/longitude resolution. There appears to be a cancelling out of errors in LOVECLIM, where its climate sensitivity seems quite low (in comparison with other models) but the simulated enhanced high-latitude warming (often termed Arctic amplification and evident in observed climate data for the last 30 years) is quite high. It would be good to include precipitation as well as temperature changes when modelling the future response of glaciers, even though the former is likely to be less important. I do not agree that uncertainties in climate sensitivity can be adequately accounted for by varying boundary and initial conditions in ensembles of models, as all of the model simulations may be systematically biased due to some physical effect that is improperly considered, or unrepresented, by all of the models, but this is a widely used technique and probably the best that can be done here. Despite these caveats, Goelzer et al.'s (2012) results will undoubtedly prove useful for the Intergovernmental Panel on Climate Change (IPCC)'s upcoming Fifth Assessment Report due to be released in 2014. The key challenge remains to further improve the individual components of the Earth system model, especially those concerning ice-sheet dynamics. Acknowledgments: EH thanks Ben Brock, Amy Jowett and Andrew Sole for useful editorial suggestions to the text. References: Barletta V R, Sørensen L S and Forsberg R 2012 Variability of mass changes at basin scale for Greenland and Antarctica Cryosp. Discuss. 6 3397-446; Bartholomew I, Nienow P, Sole A, Mair D, Cowton T and King M A 2011 Seasonal variations in Greenland ice sheet motion: inland extent and behaviour at higher elevations Earth Planet. Sci. Lett. 307 271-8; Goelzer H, Huybrechts P, Raper S C B, Loutre M-F, Goosse H and Fichefet T 2012 Millennial total sea-level commitments projected with the Earth system model of intermediate complexity LOVECLIM Environ. Res. Lett. 7 045401; Hanna E, Huybrechts P

  18. Future Availability of Water Supply from Karstic Springs under Probable Climate Change. The case of Aravissos, Central Macedonia, Greece.

    Science.gov (United States)

    Vafeiadis, M.; Spachos, Th.; Zampetoglou, K.; Soupilas, Th.

    2012-04-01

    The test site of Aravissos is located about 70 km west-northwest of Thessaloniki, on the southern slopes of Mount Païko in the northern part of Central Macedonia. The karstic Aravissos springs supply 40% of the total volume needed for the water supply of Thessaloniki, Greece. As the water is of excellent quality, it is fed directly into the distribution network without any previous treatment. The availability of this source is therefore of high importance for the sustainable water supply of this area with almost 1,000,000 inhabitants. The Aravissos water system is developed in a karstic limestone of Late Cretaceous age that covers almost the entire western part of the large anticline of Mount Païko. The climate in this area and in the consumption area, Thessaloniki, is a typical Mediterranean climate with mild, humid winters and hot, dry summers. The total annual number of rainy days is around 110. The production of the Aravissos springs depends mostly on the annual precipitation. As the feeding catchment and the karst aquifer are not well defined, a practical empirical balance model that contains only well-known relevant terms is applied to simulate the operation of the springs under normal water extraction for water supply at present. Estimates of future weather conditions are based on GCM and RCM simulation data and the extension of trend lines of the actual data. The future availability of adequate water quantities from the springs is finally estimated from the balance model and the simulated future climatic data. This study has been realised within the project CC-WaterS, funded by the SEE program of the European Regional Development Fund (http://www.ccwaters.eu/).
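
    A "practical empirical balance model" of this kind can be as simple as a monthly bucket with linear spring discharge. A minimal sketch follows; every coefficient and the synthetic precipitation series are assumptions for illustration, not the Aravissos model.

```python
# Minimal monthly karst water-balance sketch (all parameters hypothetical)
recharge_coeff = 0.35        # fraction of precipitation recharging the aquifer
k = 0.05                     # linear spring-discharge coefficient [1/month]
extraction = 4.0e6           # water-supply withdrawal [m3/month]
storage = 2.0e8              # initial aquifer storage [m3]

# Synthetic Mediterranean precipitation volumes over the catchment [m3/month]
precip = [3e7, 2.5e7, 2e7, 1.5e7, 8e6, 3e6, 1e6, 1e6, 5e6, 1.5e7, 2.5e7, 3e7]

for month, p in enumerate(precip, start=1):
    spring_q = k * storage               # natural spring discharge
    storage += recharge_coeff * p - spring_q - extraction
    storage = max(storage, 0.0)
    print(f"month {month:2d}: discharge {spring_q / 1e6:5.1f} Mm3, "
          f"storage {storage / 1e6:6.1f} Mm3")

# Rerunning with a drier, climate-scenario precipitation series shows
# whether the springs can still sustain the extraction rate.
```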

  19. The Future Today: Creating an All Purpose Battalion to Enhance the Marine Corps’ Capabilities for Tomorrow, Today

    Science.gov (United States)

    2008-01-01

    Maude (Houghton Mifflin, Boston, MA, and New York, NY, 1920), p. 279. Author's notes from personal interview. Philip Kotler, Marketing Management ... adhere to the principle of unity of command and provide the Marine Corps with a credible and confident force in readiness, capable of supporting all ... one command. This simple creation of a new unit, essentially a headquarters element, will adhere to the principle of unity of command and provide the

  20. The Armored Brigade Combat Team (ABCT) in the Future: An Assessment of Capabilities Against the Hybrid Threat in the Future Operational Environment

    Science.gov (United States)

    2013-06-13

    requirements, followed by a tactical case study assessment, and a strengths, weaknesses, opportunities, and threats (SWOT) analysis of the BCTs against a ... strategic and operational deployment. This information further developed the case study SWOT analysis of the BCTs against a hybrid threat. The SWOT ... a SWOT analysis. The next section addressed is the analysis, which comprises the strategic capabilities assessment and the tactical

  1. Potential Applications of Modularity to Enable a Deep Space Habitation Capability for Future Human Exploration Beyond Low-Earth Orbit

    Science.gov (United States)

    Simon, Matthew A.; Toups, Larry; Smitherman, David

    2012-01-01

    Evaluating preliminary concepts of a Deep Space Habitat (DSH) enabling long duration crewed exploration of asteroids, the Moon, and Mars is a technically challenging problem. Sufficient habitat volumes and equipment, necessary to ensure crew health and functionality, increase propellant requirements and decrease launch flexibility to deliver multiple elements on a single launch vehicle; both of which increase overall mission cost. Applying modularity in the design of the habitat structures and subsystems can alleviate these difficulties by spreading the build-up of the overall habitation capability across several smaller parts. This allows for a more flexible habitation approach that accommodates various crew mission durations and levels of functionality. This paper provides a technical analysis of how various modular habitation approaches can impact the parametric design of a DSH with potential benefits in mass, packaging volume, and architectural flexibility. This includes a description of the desired long duration habitation capability, the definition of a baseline model for comparison, a small trade study to investigate alternatives, and commentary on potentially advantageous configurations to enable different levels of habitability. The approaches investigated include modular pressure vessel strategies, modular subsystems, and modular manufacturing approaches to habitat structure. The paper also comments upon the possibility of an integrated habitation strategy using modular components to create all short and long duration habitation elements required in the current exploration architectures.

  2. BECCS capability of dedicated bioenergy crops under a future land-use scenario targeting net negative carbon emissions

    Science.gov (United States)

    Kato, E.; Yamagata, Y.

    2014-12-01

    Bioenergy with Carbon Capture and Storage (BECCS) is a key component of mitigation strategies in future socio-economic scenarios that aim to keep the mean global temperature rise below 2°C above pre-industrial levels, which would require net negative carbon emissions by the end of the 21st century. Because of the additional need for land, developing sustainable low-carbon scenarios requires careful consideration of the land-use implications of deploying large-scale BECCS. We evaluated the feasibility of large-scale BECCS in RCP2.6, a scenario with net negative emissions aiming to meet the 2°C temperature target, with a top-down analysis of required yields and a bottom-up evaluation of BECCS potential using a process-based global crop model. Land-use change carbon emissions related to the land expansion were examined using a global terrestrial biogeochemical cycle model. Our analysis reveals that first-generation bioenergy crops would not meet the BECCS requirement of the RCP2.6 scenario even with high fertilizer and irrigation application. Using second-generation bioenergy crops can marginally fulfill the required BECCS only if a technology of full post-process combustion CO2 capture is deployed with high fertilizer application in crop production. If such an assumed technological improvement does not occur, more than doubling the area for bioenergy production for BECCS around 2050 assumed in RCP2.6 would be required; however, such scenarios implicitly induce large-scale land-use changes that would cancel half of the assumed CO2 sequestration by BECCS. Otherwise a conflict of land use with food production is inevitable.
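
    The "top-down analysis of required yields" amounts to dividing the sequestration target by the carbon captured per hectare. A back-of-envelope sketch with round illustrative numbers, not the paper's figures:

```python
# Back-of-envelope BECCS land requirement (all numbers illustrative)
target_gtc_per_yr = 3.0          # desired net removal [GtC/yr]
yield_t_dm = 15.0                # bioenergy crop yield [t dry matter/ha/yr]
carbon_fraction = 0.45           # carbon content of dry biomass
capture_efficiency = 0.90        # CO2 capture efficiency at the plant

captured_tc_per_ha = yield_t_dm * carbon_fraction * capture_efficiency
area_gha = target_gtc_per_yr * 1e9 / captured_tc_per_ha / 1e9   # [Gha]
print(f"{captured_tc_per_ha:.1f} tC/ha/yr captured -> {area_gha:.2f} Gha needed")

# ~0.49 Gha under these assumptions, roughly a third of current global
# cropland, which is why land-use change emissions can cancel much of
# the intended removal.
```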

  3. Propulsion and Power Generation Capabilities of a Dense Plasma Focus (DPF) Fusion System for Future Military Aerospace Vehicles

    International Nuclear Information System (INIS)

    Knecht, Sean D.; Mead, Franklin B.; Thomas, Robert E.; Miley, George H.; Froning, David

    2006-01-01

    The objective of this study was to perform a parametric evaluation of the performance and interface characteristics of a dense plasma focus (DPF) fusion system in support of a USAF advanced military aerospace vehicle concept study. This vehicle is an aerospace plane that combines clean 'aneutronic' dense plasma focus (DPF) fusion power and propulsion technology, with advanced 'lifting body'-like airframe configurations utilizing air-breathing MHD propulsion and power technology within a reusable single-stage-to-orbit (SSTO) vehicle. The applied approach was to evaluate the fusion system details (geometry, power, T/W, system mass, etc.) of a baseline p-11B DPF propulsion device with Q = 3.0 and thruster efficiency, ηprop = 90% for a range of thrust, Isp and capacitor specific energy values. The baseline details were then kept constant and the values of Q and ηprop were varied to evaluate excess power generation for communication systems, pulsed-train plasmoid weapons, ultrahigh-power lasers, and gravity devices. Thrust values were varied between 100 kN and 1,000 kN with Isp of 1,500 s and 2,000 s, while capacitor specific energy was varied from 1 - 15 kJ/kg. Q was varied from 3.0 to 6.0, resulting in gigawatts of excess power. Thruster efficiency was varied from 0.9 to 1.0, resulting in hundreds of megawatts of excess power. Resulting system masses were on the order of 10's to 100's of metric tons with thrust-to-weight ratios ranging from 2.1 to 44.1, depending on capacitor specific energy. Such a high thrust/high Isp system with a high power generation capability would allow military versatility in sub-orbital space, as early as 2025, and beyond as early as 2050. This paper presents the results that coincide with a total system mass between 15 and 20 metric tons
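
    The propulsion side of such a parametric evaluation follows from the standard jet-power relation P = F·g·Isp/(2η). A sketch of that arithmetic over the thrust and Isp ranges quoted above; the 90% thruster efficiency is taken from the abstract, but the sweep itself is illustrative, not the study's calculation:

```python
G0 = 9.81  # standard gravity [m/s^2]

def jet_power(thrust_n, isp_s, eta=0.9):
    """Electrical power needed by an electric/plasma thruster [W]."""
    return thrust_n * G0 * isp_s / (2.0 * eta)

for thrust in (100e3, 500e3, 1000e3):          # N
    for isp in (1500.0, 2000.0):               # s
        p = jet_power(thrust, isp)
        print(f"F = {thrust / 1e3:6.0f} kN, Isp = {isp:.0f} s "
              f"-> {p / 1e9:5.2f} GW")

# Multi-gigawatt propulsion demand at the upper end of the sweep, which is
# consistent with the gigawatts of excess power the abstract quotes when
# the fusion gain Q is raised above the propulsion requirement.
```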

  4. PROCEEDINGS OF THE 1983 DPF WORKSHOP ON COLLIDER DETECTORS: PRESENT CAPABILITIES AND FUTURE POSSIBILITIES, FEB. 28 - MARCH 4, 1983.

    Energy Technology Data Exchange (ETDEWEB)

    Loken Ed, S.C.; Nemethy Ed, P.

    1983-04-01

    It is useful before beginning our work here to restate briefly the purpose of this workshop in the light of the present circumstances of elementary particle physics in the U.S. The goal of our field is easily stated in a general way: it is to reach higher center-of-mass energies and higher luminosities while employing more sensitive and more versatile event detectors, all in order to probe more deeply into the physics of elementary particles. The obstacles to achieving this goal are equally apparent. Escalating costs of construction and operation of our facilities limit alternatives and force us to make hard choices among those alternatives. The necessity to be highly selective in the choice of facilities, in conjunction with the need for increased manpower concentrations to build accelerators and mount experiments, leads to complex social problems within the science. As the frontier is removed ever further, serious technical difficulties and limitations arise. Finally, competition, much of which is usually healthy, now manifests itself with greater intensity on a regional basis within our country and also on an international scale. In the far (≥20 yr) future, collaboration on physics facilities by two or more of the major economic entities of the world will possibly be forthcoming. In the near future, we are left to bypass or overcome these obstacles on a regional scale as best we can. The choices we face are in part indicated in the list of planned and contemplated accelerators shown in Table I. The facilities indicated with an asterisk pose immediate questions: (1) Do we need them all and what should be their precise properties? (2) How are the ones we choose to be realized? (3) What is the nature of the detectors to exploit those facilities? (4) How do we respond to the challenge of higher luminosity as well as higher energy in those colliders? The decision-making process in this country and elsewhere depends on the answers to these technical questions

  5. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of the classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Levy processes, Gerber–Shiu functions and dependence.

  6. Generalized Probability-Probability Plots

    NARCIS (Netherlands)

    Mushkudiani, N.A.; Einmahl, J.H.J.

    2004-01-01

    We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real-valued data. These plots, which are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P

  7. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.

  8. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  9. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  10. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory and Financial Management, and are subject to intense research nowadays. To understand their relevance one just needs to think ... analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators

  11. Capability Paternalism

    NARCIS (Netherlands)

    Claassen, R.J.G.|info:eu-repo/dai/nl/269266224

    A capability approach prescribes paternalist government actions to the extent that it requires the promotion of specific functionings, instead of the corresponding capabilities. Capability theorists have argued that their theories do not have much of these paternalist implications, since promoting

  12. Probability tales

    CERN Document Server

    Grinstead, Charles M; Snell, J Laurie

    2011-01-01

    This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.

  13. Probability theory

    CERN Document Server

    Dorogovtsev, A Ya; Skorokhod, A V; Silvestrov, D S; Skorokhod, A V

    1997-01-01

    This book of problems is intended for students in pure and applied mathematics. There are problems in traditional areas of probability theory and problems in the theory of stochastic processes, which has wide applications in the theory of automatic control, queuing and reliability theories, and in many other modern science and engineering fields. Answers to most of the problems are given, and the book provides hints and solutions for more complicated problems.

  14. Capability ethics

    OpenAIRE

    Robeyns, Ingrid

    2012-01-01

    The capability approach is one of the most recent additions to the landscape of normative theories in ethics and political philosophy. Yet in its present stage of development, the capability approach is not a full-blown normative theory, in contrast to utilitarianism, deontological theories, virtue ethics, or pragmatism. As I will argue in this chapter, at present the core of the capability approach is an account of value, which together with some other (more minor) normative comm...

  15. Dynamic Capabilities

    DEFF Research Database (Denmark)

    Grünbaum, Niels Nolsøe; Stenger, Marianne

    2013-01-01

    The findings reveal a positive relationship between dynamic capabilities and innovation performance in the case enterprises, as we would expect. It was, however, not possible to establish a positive relationship between innovation performance and profitability. Nor was there any positive relationship between dynamic capabilities and profitability.

  16. Capability ethics

    NARCIS (Netherlands)

    I.A.M. Robeyns (Ingrid)

    2012-01-01

    The capability approach is one of the most recent additions to the landscape of normative theories in ethics and political philosophy. Yet in its present stage of development, the capability approach is not a full-blown normative theory, in contrast to utilitarianism, deontological

  17. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and the crew characteristics and the navigational environment for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a lookout etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds ... probability, i.e. a study of the navigator's role in resolving critical situations, a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of energy released for crushing of structures giving
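
    The two-step structure described, geometric collision candidates scaled by a causation factor reflecting the navigator's ability to resolve critical situations, reduces to a one-line product. A minimal sketch with invented traffic numbers, not the MS Dextra route data:

```python
# Ship-ship collision frequency: N = N_geometric * P_causation (illustrative)
route_crossings_per_yr = 2000.0   # encounters with crossing traffic
geometric_fraction = 0.004        # fraction on blind collision courses
p_causation = 1.6e-4              # probability navigators fail to resolve

n_geometric = route_crossings_per_yr * geometric_fraction
collisions_per_yr = n_geometric * p_causation
print(f"{n_geometric:.1f} collision candidates/yr -> "
      f"{collisions_per_yr:.2e} collisions/yr "
      f"(return period {1 / collisions_per_yr:,.0f} yr)")
```

    The study's refinement is to derive the causation factor from ship, crew, and bridge-layout characteristics rather than adopting a fixed literature value like the one assumed here.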

  18. Assessing future vent opening locations at the Somma-Vesuvio volcanic complex: 2. Probability maps of the caldera for a future Plinian/sub-Plinian event with uncertainty quantification

    Science.gov (United States)

    Tadini, A.; Bevilacqua, A.; Neri, A.; Cioni, R.; Aspinall, W. P.; Bisson, M.; Isaia, R.; Mazzarini, F.; Valentine, G. A.; Vitale, S.; Baxter, P. J.; Bertagnini, A.; Cerminara, M.; de Michieli Vitturi, M.; Di Roberto, A.; Engwell, S.; Esposti Ongaro, T.; Flandoli, F.; Pistolesi, M.

    2017-06-01

    In this study, we combine reconstructions of volcanological data sets and inputs from a structured expert judgment to produce a first long-term probability map for vent opening location for the next Plinian or sub-Plinian eruption of Somma-Vesuvio. In the past, the volcano has exhibited significant spatial variability in vent location; this can exert a significant control on where hazards materialize (particularly of pyroclastic density currents). The new vent opening probability mapping has been performed through (i) development of spatial probability density maps with Gaussian kernel functions for different data sets and (ii) weighted linear combination of these spatial density maps. The epistemic uncertainties affecting these data sets were quantified explicitly with expert judgments and implemented following a doubly stochastic approach. Various elicitation pooling metrics and subgroupings of experts and target questions were tested to evaluate the robustness of outcomes. Our findings indicate that (a) Somma-Vesuvio vent opening probabilities are distributed inside the whole caldera, with a peak corresponding to the area of the present crater, but with more than 50% probability that the next vent could open elsewhere within the caldera; (b) there is a mean probability of about 30% that the next vent will open west of the present edifice; (c) there is a mean probability of about 9.5% that the next medium-large eruption will enlarge the present Somma-Vesuvio caldera, and (d) there is a nonnegligible probability (mean value of 6-10%) that the next Plinian or sub-Plinian eruption will have its initial vent opening outside the present Somma-Vesuvio caldera.
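
    Steps (i) and (ii) of the mapping method, Gaussian kernel density maps combined linearly with elicited weights, can be sketched in a few lines. The vent coordinates and weights below are synthetic placeholders, not the Somma-Vesuvio data sets:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(7)

# Synthetic (x, y) vent locations for two data sets [km], illustrative only
vents_a = rng.normal(loc=(0.0, 0.0), scale=0.8, size=(40, 2)).T
vents_b = rng.normal(loc=(1.5, 0.5), scale=1.2, size=(25, 2)).T

kde_a = gaussian_kde(vents_a)          # Gaussian kernel density, data set A
kde_b = gaussian_kde(vents_b)          # Gaussian kernel density, data set B
w_a, w_b = 0.6, 0.4                    # elicited weights (assumed)

# Evaluate the combined vent-opening density on a regular grid
xs, ys = np.meshgrid(np.linspace(-4, 5, 90), np.linspace(-4, 5, 90))
grid = np.vstack([xs.ravel(), ys.ravel()])
density = (w_a * kde_a(grid) + w_b * kde_b(grid)).reshape(xs.shape)
density /= density.sum()               # normalize into a probability map
print("peak cell probability:", density.max())
```

    The doubly stochastic treatment in the study then repeats this combination with weights and kernel parameters drawn from the expert-elicited uncertainty distributions.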

  19. Hydrological model calibration for flood prediction in current and future climates using probability distributions of observed peak flows and model based rainfall

    Science.gov (United States)

    Haberlandt, Uwe; Wallner, Markus; Radtke, Imke

    2013-04-01

    Derived flood frequency analysis based on continuous hydrological modelling is very demanding regarding the required length and temporal resolution of precipitation input data. Often such flood predictions are obtained using long precipitation time series from stochastic approaches or from regional climate models as input. However, the calibration of the hydrological model is usually done using short time series of observed data. This inconsistent employment of different data types for calibration and application of a hydrological model increases its uncertainty. Here, it is proposed to calibrate a hydrological model directly on probability distributions of observed peak flows using model based rainfall in line with its later application. Two examples are given to illustrate the idea. The first one deals with classical derived flood frequency analysis using input data from an hourly stochastic rainfall model. The second one concerns a climate impact analysis using hourly precipitation from a regional climate model. The results show that: (I) the same type of precipitation input data should be used for calibration and application of the hydrological model, (II) a model calibrated on extreme conditions works quite well for average conditions but not vice versa, (III) the calibration of the hydrological model using regional climate model data works as an implicit bias correction method and (IV) the best performance for flood estimation is usually obtained when model based precipitation and observed probability distribution of peak flows are used for model calibration.
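
    Calibrating on probability distributions of peak flows means the objective function compares flood quantiles rather than time series. A schematic sketch of such an objective; the Gumbel distribution choice, return periods, and synthetic data are assumptions, not the paper's setup:

```python
import numpy as np
from scipy.stats import gumbel_r

def quantile_objective(sim_annual_peaks, obs_annual_peaks,
                       return_periods=(2, 5, 10, 25, 50)):
    """Sum of squared differences between simulated and observed flood
    quantiles (Gumbel fits), for use inside a calibration loop."""
    probs = 1.0 - 1.0 / np.asarray(return_periods, dtype=float)
    q_obs = gumbel_r.ppf(probs, *gumbel_r.fit(obs_annual_peaks))
    q_sim = gumbel_r.ppf(probs, *gumbel_r.fit(sim_annual_peaks))
    return float(np.sum((q_sim - q_obs) ** 2))

# Toy check with synthetic annual maxima [m3/s]
rng = np.random.default_rng(3)
obs = gumbel_r.rvs(loc=100, scale=30, size=40, random_state=rng)
sim = gumbel_r.rvs(loc=110, scale=35, size=200, random_state=rng)
print(f"objective = {quantile_objective(sim, obs):.1f}")
```

    Note that the simulated series can be much longer than the observed one, which is exactly what allows model-based rainfall to be used consistently in both calibration and application.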

  20. Finite element analysis of ageing reinforced and prestressed concrete structures in nuclear plant - An international review of current capabilities and priorities for future developments

    International Nuclear Information System (INIS)

    2002-01-01

    Nuclear plants contain a variety of concrete structures whose structural performance is essential to the safety of the plant. There is a requirement to demonstrate the robustness of these structures during normal operating and extreme accident conditions, throughout their life. During this time, the concrete may degrade due to the effects of ageing. This degradation must be accounted for during the assessment of their performance. Finite Element Analysis (FEA) techniques have tremendous potential for providing valuable insight into the behaviour of these aged concrete structures under a range of different loading conditions. Advanced FEA techniques currently enjoy widespread use within the nuclear industry for the non-linear analysis of concrete. Many practitioners within the nuclear industry are at the forefront of the industrial application of these methods. However, in some areas, the programs that are commercially available lag behind the best information available from research. This document is an international review of current capabilities and priorities for future development relating to non-linear finite element analysis of reinforced and prestressed concrete in the nuclear industry in the various member states. Particular attention is paid to the analysis of degraded or ageing structures. This report: 1. Summarises the needs for FEA of aged concrete nuclear structures; 2. Details the existing capabilities, not just in terms of what the software is capable of, but also in terms of the current practices employed by those in industry; 3. Looks at how engineers, within the nuclear industry, working in this field would like to see methods improved, and identifies the factors that are limiting current practice; 4. Summarises ongoing research that may provide beneficial technological advances; 5. Assigns priorities to the different development requests; 6. Selects those developments that are felt to be of greatest benefit to industry and provides a qualitative

  1. Gossiping Capabilities

    DEFF Research Database (Denmark)

    Mogensen, Martin; Frey, Davide; Guerraoui, Rachid

    Gossip-based protocols are now acknowledged as a sound basis to implement collaborative high-bandwidth content dissemination: content location is disseminated through gossip, the actual contents being subsequently pulled. In this paper, we present HEAP, HEterogeneity Aware gossip Protocol, where nodes dynamically adjust their contribution to gossip dissemination according to their capabilities. Using a continuous, itself gossip-based, approximation of relative capabilities, HEAP dynamically leverages the most capable nodes by (a) increasing their fanouts (while decreasing by the same proportion ... declare a high capability in order to augment their perceived quality without contributing accordingly. We evaluate HEAP in the context of a video streaming application on a 236 PlanetLab nodes testbed. Our results show that HEAP improves the quality of the streaming by 25% over a standard gossip

  2. Futures

    DEFF Research Database (Denmark)

    Pedersen, Michael Haldrup

    2017-01-01

    Currently both design thinking and critical social science experience an increased interest in speculating in alternative future scenarios. This interest is not least related to the challenges that issues of global sustainability present for politics, ethics and design. This paper explores the potentials of speculative thinking in relation to design and social and cultural studies, arguing that both offer valuable insights for creating a speculative space for new emergent criticalities challenging current assumptions of the relations between power and design. It does so by tracing out discussions of 'futurity' and 'futuring' in design as well as social and cultural studies: firstly, by discussing futurist and speculative approaches in design thinking; secondly, by engaging with ideas of scenario thinking and utopianism in current social and cultural studies; and thirdly, by showing how the articulation

  3. Predicting future blood supply and demand in Japan with a Markov model: application to the sex- and age-specific probability of blood donation.

    Science.gov (United States)

    Akita, Tomoyuki; Tanaka, Junko; Ohisa, Masayuki; Sugiyama, Aya; Nishida, Kazuo; Inoue, Shingo; Shirasaka, Takuma

    2016-11-01

    Simulation studies were performed to predict the future supply of and demand for blood donations, and future shortfalls. Using data from all donations in 2006 to 2009, a Markov model was applied to estimate future blood donations until 2050. Based on data concerning the actual use of blood products, the number of blood products needed was estimated from future population projections. We estimated that the number of blood donations increased from 5,020,000 in 2008 to 5,260,000 in 2012, but will decrease to 4,770,000 units by 2025. In particular, the number of donors in their 20s and 30s decreased every year. Moreover, the number of donations required to supply blood products is projected to increase from 5,390,000 units in 2012 to 5,660,000 units in 2025. Thus, the estimated shortfall of blood donations is expected to increase each year, from 140,000 in 2012 to 890,000 in 2025, and then more than double to 1,670,000 in 2050. If current blood donation behaviors continue, a shortfall of blood availability is likely to occur in Japan. Insufficient blood donations are mainly related to a projected reduction in the population of 20 to 30 year olds, a significant group of donors. Thus, it is crucial to recruit and retain new donors and to develop recommendations for proper use of blood products to minimize unnecessary use. This study provides useful information that can be used by governments to help ensure the adequacy of the blood supply through promoting donations and conserving blood resources. © 2016 AABB.
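
    The projection engine behind such estimates is repeated multiplication of a population state vector by a transition matrix. A toy three-state sketch follows; the states, rates, and population sizes are invented for illustration and are far simpler than the sex- and age-specific model of the study:

```python
import numpy as np

# Toy donor-behavior states: 0 = never donated, 1 = active donor, 2 = lapsed
# Row-stochastic annual transition matrix (illustrative rates only)
P = np.array([
    [0.97, 0.03, 0.00],   # never  -> recruited with prob 0.03
    [0.00, 0.80, 0.20],   # active -> lapses with prob 0.20
    [0.00, 0.05, 0.95],   # lapsed -> returns with prob 0.05
])

state = np.array([60e6, 5e6, 10e6])   # assumed population by state

for year in range(2012, 2026):
    state = state @ P                 # advance one year

donations = state[1]                  # assume ~1 donation per active donor/yr
print(f"projected active donors in 2025: {state[1] / 1e6:.2f} million")
print(f"projected donations in 2025:     {donations / 1e6:.2f} million")
```

    The real model additionally ages the population each year, so a shrinking cohort of 20 to 30 year olds propagates directly into the donation shortfall the abstract reports.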

  4. Capability approach

    DEFF Research Database (Denmark)

    Jensen, Niels Rosendal; Kjeldsen, Christian Christrup

    This textbook is the first comprehensive Danish presentation of the Capability Approach developed by Amartya Sen and Martha Nussbaum. The book contains a presentation and discussion of Sen's and Nussbaum's theoretical platform. It includes examples from education and education policy, pedagogy and care.

  5. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming paradigm that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p

  6. ENTREPRENEURIAL CAPABILITIES

    DEFF Research Database (Denmark)

    Rasmussen, Lauge Baungaard; Nielsen, Thorkild

    2003-01-01

    The aim of this article is to analyse entrepreneurship from an action research perspective. What is entrepreneurship about? Which are the fundamental capabilities and processes of entrepreneurship? To answer these questions the article includes a case study of a Danish entrepreneur and his network. Finally, the article discusses how more long-term action research methods could be integrated into the entrepreneurial processes, and the possible impacts of such an implementation.

  7. Prediction and probability in sciences

    International Nuclear Information System (INIS)

    Klein, E.; Sacquin, Y.

    1998-01-01

    This book reports the 7 presentations made at the third meeting 'physics and fundamental questions', whose theme was probability and prediction. The concept of probability, invented to apprehend random phenomena, has become an important branch of mathematics whose applications range from radioactivity to species evolution via cosmology or the management of very weak risks. The notion of probability is the basis of quantum mechanics and is thus bound to the very nature of matter. The 7 topics are: - radioactivity and probability, - statistical and quantum fluctuations, - quantum mechanics as a generalized probability theory, - probability and the irrational efficiency of mathematics, - can we foresee the future of the universe?, - chance, eventuality and necessity in biology, - how to manage weak risks? (A.C.)

  8. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  9. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.

  10. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
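
    One classic example of this phenomenon (chosen here for illustration; it may or may not be among the article's three problems) is that the probability that a random permutation of n items has no fixed point tends to 1/e as n grows. A quick simulation confirms it:

        import random

        def has_no_fixed_point(n):
            p = list(range(n))
            random.shuffle(p)
            return all(p[i] != i for i in range(n))

        trials = 100_000
        # Fraction of shuffled decks of 52 cards with no card in its original position:
        print(sum(has_no_fixed_point(52) for _ in range(trials)) / trials)  # ~0.368, i.e. ~1/e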

  11. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  12. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko, [No Value

    2004-01-01

    Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  13. Campus Capability Plan

    Energy Technology Data Exchange (ETDEWEB)

    Adams, C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Arsenlis, T. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Bailey, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Bergman, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Brase, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Brenner, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Camara, L. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Carlton, H. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Cheng, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Chrzanowski, P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Colson, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); East, D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Farrell, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ferranti, L. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Gursahani, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hansen, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Helms, L. L. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hernandez, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Jeffries, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Larson, D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Lu, K. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); McNabb, D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Mercer, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Skeate, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Sueksdorf, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Zucca, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Le, D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ancria, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Scott, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Leininger, L. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Gagliardi, F. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Gash, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Bronson, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Chung, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hobson, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Meeker, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Sanchez, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Zagar, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Quivey, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Sommer, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Atherton, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-06-06

    Lawrence Livermore National Laboratory Campus Capability Plan for 2018-2028. Lawrence Livermore National Laboratory (LLNL) is one of three national laboratories that are part of the National Nuclear Security Administration. LLNL provides critical expertise to strengthen U.S. security through development and application of world-class science and technology that: Ensures the safety, reliability, and performance of the U.S. nuclear weapons stockpile; Promotes international nuclear safety and nonproliferation; Reduces global danger from weapons of mass destruction; Supports U.S. leadership in science and technology. Essential to the execution and continued advancement of these mission areas are responsive infrastructure capabilities. This report showcases each LLNL capability area and describes the mission, science, and technology efforts enabled by LLNL infrastructure, as well as future infrastructure plans.

  14. Potential performance analysis and future trend prediction of electric vehicle with V2G/V2H/V2B capability

    Directory of Open Access Journals (Sweden)

    Dalong Guo

    2016-03-01

    Due to their intermittent nature, renewable energy sources (RES) have brought new challenges for load balancing and energy dispatching in the Smart Grid. Potentially serving as distributed energy storage, an electric vehicle's (EV) battery can be used to help mitigate the pressure of fluctuation brought by RES and reinforce the stability of power systems. This paper gives a comprehensive review of the current state of EV technology, mainly emphasizing three EV discharging operations: Vehicle to Grid (V2G), Vehicle to Home (V2H), and Vehicle to Building (V2B). When needed, an EV's battery can discharge and send its surplus energy back to the power grid, residential homes, or buildings. Based on our data analysis, we argue that V2G, with the largest transmission power losses, is potentially less efficient than the other two modes. We show that residential users have an incentive to schedule charging, V2G, and V2H according to the real-time price (RTP) and the market sell-back price, as sketched below. In addition, we discuss some challenges and potential risks resulting from EVs' fast growth. Finally, we propose some suggestions for future power systems and argue that incentives or rewards need to be provided to motivate EV owners to behave in the best interests of the overall power system.
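
    A minimal sketch of the price-responsive scheduling incentive just described; the decision rule, prices, and threshold are hypothetical illustrations, not the paper's model or data.

        # Decide each hour whether to charge, sell back (V2G), or offset home load (V2H),
        # given the real-time price (rtp), the sell-back price, and the effective cost
        # of the energy already stored in the battery (all values hypothetical).
        def schedule_hour(rtp, sellback_price, stored_energy_cost):
            if rtp <= stored_energy_cost:
                return "charge"           # grid energy is cheaper than the battery's energy
            if sellback_price >= rtp:
                return "discharge (V2G)"  # selling back beats avoiding the purchase
            return "discharge (V2H)"      # offset home load instead of buying at rtp

        for rtp, sellback in [(0.08, 0.05), (0.24, 0.12), (0.30, 0.35)]:
            print(rtp, sellback, schedule_hour(rtp, sellback, stored_energy_cost=0.10))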

  15. Antarctic Exploration Parallels for Future Human Planetary Exploration: Science Operations Lessons Learned, Planning, and Equipment Capabilities for Long Range, Long Duration Traverses

    Science.gov (United States)

    Hoffman, Stephen J.

    2012-01-01

    The purpose of this workshop can be summed up by the question: are there relevant analogs to planetary exploration (meaning the Moon and Mars) to be found in polar exploration on Earth? The answer in my opinion is yes, or else there would be no reason for this workshop. However, I think some background information would be useful to provide a context for my opinion on this matter. As all of you are probably aware, NASA has been set on a path that, in its current form, will eventually lead to putting human crews on the surface of the Moon and Mars for extended durations (months to years). For the past 50 to 60 years, starting not long after the end of World War II, exploration of the Antarctic has accumulated a significant body of experience that is highly analogous to our anticipated activities on the Moon and Mars. This relevant experience base includes: long-duration (1-year and 2-year) continuous deployments by single crews; establishment of a substantial outpost with a single deployment event to support these crews; long-distance (100 to 1000 kilometer) traverses, with and without intermediate support; equipment and processes that evolved based on lessons learned; and international cooperative missions. This is not a new or original thought; many people within NASA, including the most recent two NASA Administrators, have commented on the recognizable parallels between exploration in the Antarctic and on the Moon or Mars. But given that level of recognition, relatively little has been done, that I am aware of, to encourage these two exploration communities to collaborate in a significant way. [Slide 4] I will return to NASA's plans and the parallels with Antarctic traverses in a moment, but I want to spend a moment to explain the objective of this workshop and the anticipated products. We have two full days set aside for this workshop. This first day will be taken up with a series of presentations prepared by individuals with experience that extends back as far as the

  16. Probability and rational choice

    Directory of Open Access Journals (Sweden)

    David Botting

    2014-05-01

    http://dx.doi.org/10.5007/1808-1711.2014v18n1p1 In this paper I will discuss the rationality of reasoning about the future. There are two things that we might like to know about the future: which hypotheses are true and what will happen next. To put it in philosophical language, I aim to show that there are methods by which inferring to a generalization (selecting a hypothesis) and inferring to the next instance (singular predictive inference) can be shown to be normative and the method itself shown to be rational, where this is due in part to being based on evidence (although not in the same way) and in part on a prior rational choice. I will also argue that these two inferences have been confused, being distinct not only conceptually (as nobody disputes) but also in their results (the value given to the probability of the hypothesis being not in general that given to the next instance), and that methods that are adequate for one are not by themselves adequate for the other. A number of debates over method founder on this confusion and do not show what the debaters think they show.

  17. Toward a generalized probability theory: conditional probabilities

    International Nuclear Information System (INIS)

    Cassinelli, G.

    1979-01-01

    The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)

  18. Identifying 21st Century Capabilities

    Science.gov (United States)

    Stevens, Robert

    2012-01-01

    What are the capabilities necessary to meet 21st century challenges? Much of the literature on 21st century skills focuses on skills necessary to meet those challenges associated with future work in a globalised world. The result is a limited characterisation of those capabilities necessary to address 21st century social, health and particularly…

  19. Psychophysics of the probability weighting function

    Science.gov (United States)

    Takahashi, Taiki

    2011-03-01

    A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics have widely utilized probability weighting functions, the psychophysical foundations of the probability weighting functions have been unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p)=exp(-(-ln p)^α) (0<α<1; w(0)=0, w(1/e)=1/e, w(1)=1), which has been extensively studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
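
    The Prelec function as reconstructed above is easy to evaluate; the sketch below uses an illustrative value of α and checks the fixed point at p = 1/e, which holds for any 0 < α < 1.

        import math

        def prelec_w(p, alpha=0.65):
            # w(p) = exp(-(-ln p)^alpha); alpha here is an illustrative parameter choice
            return math.exp(-((-math.log(p)) ** alpha))

        for p in (0.01, 1 / math.e, 0.5, 0.99):
            print(f"w({p:.3f}) = {prelec_w(p):.3f}")
        # Small probabilities are overweighted (w(0.01) ~ 0.067), large ones underweighted,
        # and w(1/e) = 1/e exactly, as the axioms require.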

  20. Acceptance Probability (P a) Analysis for Process Validation Lifecycle Stages.

    Science.gov (United States)

    Alsmeyer, Daniel; Pazhayattil, Ajay; Chen, Shu; Munaretto, Francesco; Hye, Maksuda; Sanghvi, Pradeep

    2016-04-01

    This paper introduces an innovative statistical approach towards understanding how variation impacts the acceptance criteria of quality attributes. Because of more complex stage-wise acceptance criteria, traditional process capability measures are inadequate for general application in the pharmaceutical industry. The probability of acceptance concept provides a clear measure, derived from the specific acceptance criteria for each quality attribute. In line with the 2011 FDA Guidance, this approach systematically evaluates data and scientifically establishes evidence that a process is capable of consistently delivering quality product. The probability of acceptance provides a direct and readily understandable indication of product risk. As with traditional capability indices, the acceptance probability approach assumes that the underlying data distributions are normal. The computational solutions for dosage uniformity and dissolution acceptance criteria are readily applicable. For dosage uniformity, the expected AV range may be determined using the s_lo and s_hi values along with worst-case estimates of the mean. This approach permits a risk-based assessment of future batch performance of the critical quality attributes. The concept is also readily applicable to sterile/non-sterile liquid dose products. Quality attributes such as deliverable volume and assay per spray have stage-wise acceptance criteria that can be converted into an acceptance probability. Accepted statistical guidelines regard processes with Cpk > 1.33 as performing well within statistical control, and those with Cpk < 1.33 as warranting closer attention. A Cpk > 1.33 is associated with a centered process that will statistically produce fewer than 63 defective units per million. This is equivalent to an acceptance probability of >99.99%.
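
    Under the stated normality assumption, both Cpk and a simple two-sided acceptance probability can be computed directly; the sketch below uses illustrative spec limits and reproduces the ~63 ppm figure for a centered process with Cpk = 1.33.

        import math

        def norm_cdf(x):
            return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

        def cpk(mu, sigma, lsl, usl):
            return min(usl - mu, mu - lsl) / (3.0 * sigma)

        def acceptance_probability(mu, sigma, lsl, usl):
            # P(LSL <= X <= USL) for X ~ N(mu, sigma^2)
            return norm_cdf((usl - mu) / sigma) - norm_cdf((lsl - mu) / sigma)

        # A centered process with Cpk = 1.33 has its spec limits 4 sigma from the mean:
        mu, sigma, lsl, usl = 0.0, 1.0, -4.0, 4.0
        print(cpk(mu, sigma, lsl, usl))                                  # ~1.33
        print((1 - acceptance_probability(mu, sigma, lsl, usl)) * 1e6)   # ~63 ppm defective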

  1. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits...... in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell...

  2. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  3. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned

  4. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  5. Is probability of frequency too narrow?

    International Nuclear Information System (INIS)

    Martz, H.F.

    1993-01-01

    Modern methods of statistical data analysis, such as empirical and hierarchical Bayesian methods, should find increasing use in future Probabilistic Risk Assessment (PRA) applications. In addition, there will be a more formalized use of expert judgment in future PRAs. These methods require an extension of the probabilistic framework of PRA, in particular, the popular notion of probability of frequency, to consideration of frequency of frequency, frequency of probability, and probability of probability. The genesis, interpretation, and examples of these three extended notions are discussed

  6. Bubble Radiation Detection: Current and Future Capability

    International Nuclear Information System (INIS)

    Peurrung, A.J.; Craig, R.A.

    1999-01-01

    Despite a number of noteworthy achievements in other fields, superheated droplet detectors (SDDs) and bubble chambers (BCs) have not been used for nuclear nonproliferation and arms control. This report examines these two radiation-detection technologies in detail and answers the question of how they can be or should be ''adapted'' for use in national security applications. These technologies involve closely related approaches to radiation detection in which an energetic charged particle deposits sufficient energy to initiate the process of bubble nucleation in a superheated fluid. These detectors offer complete gamma-ray insensitivity when used to detect neutrons. They also provide controllable neutron-energy thresholds and excellent position resolution. SDDs are extraordinarily simple and inexpensive. BCs offer the promise of very high efficiency (∼75%). A notable drawback for both technologies is temperature sensitivity. As a result of this problem, the temperature must be controlled whenever high accuracy is required, or harsh environmental conditions are encountered. The primary findings of this work are listed and briefly summarized below: (1) SDDs are ready to function as electronics-free neutron detectors on demand for arms-control applications. The elimination of electronics at the weapon's location greatly eases the negotiability of radiation-detection technologies in general. (2) As a result of their high efficiency and sharp energy threshold, current BCs are almost ready for use in the development of a next-generation active assay system. Development of an instrument based on appropriately safe materials is warranted. (3) Both kinds of bubble detectors are ready for use whenever very high gamma-ray fields must be confronted. Spent fuel MPC&A is a good example where this need presents itself. (4) Both kinds of bubble detectors have the potential to function as low-cost replacements for conventional neutron detectors such as 3He tubes. For SDDs, this requires finding some way to get boron into the detector. For BCs, this requires finding operating conditions permitting a high duty cycle

  7. Foundations of probability

    International Nuclear Information System (INIS)

    Fraassen, B.C. van

    1979-01-01

    The interpretation of probabilities in physical theories is considered, whether quantum or classical. The following points are discussed: 1) the functions P(μ, Q), in terms of which states and propositions can be represented, are classical (Kolmogoroff) probabilities, formally speaking; 2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and 3) testing of the theory typically takes the form of confronting the expectation values of an observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)

  8. A business analytics capability framework

    Directory of Open Access Journals (Sweden)

    Ranko Cosic

    2015-09-01

    Business analytics (BA) capabilities can potentially provide value and lead to better organisational performance. This paper develops a holistic, theoretically-grounded and practically relevant business analytics capability framework (BACF) that specifies, defines and ranks the capabilities that constitute an organisational BA initiative. The BACF was developed in two phases. First, an a priori conceptual framework was developed based on the Resource-Based View theory of the firm and a thematic content analysis of the BA literature. Second, the conceptual framework was further developed and refined using a three-round Delphi study involving 16 BA experts. Changes from the Delphi study resulted in a refined and confirmed framework including detailed capability definitions, together with a ranking of the capabilities based on importance. The BACF will help academic researchers and industry practitioners to better understand the capabilities that constitute an organisational BA initiative and their relative importance. In future work, the capabilities in the BACF will be operationalised to measure their as-is status, thus enabling organisations to identify key areas of strength and weakness and prioritise future capability improvement efforts.

  9. The quantum probability calculus

    International Nuclear Information System (INIS)

    Jauch, J.M.

    1976-01-01

    The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)

  10. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications....
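
    As a standard illustration of the abstract's central claim (the multinomial logit case, stated here for concreteness rather than taken from the paper), the CPGF can be taken to be the log-sum-exp function over the systematic utilities, and its gradient yields the choice probabilities:

        G(u) = \log \sum_{j=1}^{J} e^{u_j}, \qquad
        P_i(u) = \frac{\partial G(u)}{\partial u_i} = \frac{e^{u_i}}{\sum_{j=1}^{J} e^{u_j}}

    so differentiating the CPGF recovers the familiar logit choice probabilities.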

  11. Probability of satellite collision

    Science.gov (United States)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.
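
    One common kinetic-theory style estimate of on-orbit collision probability (shown for orientation only; the report's actual method may differ) treats encounters as a Poisson process driven by the debris flux at the satellite's altitude:

        import math

        # flux: objects per m^2 per year at the orbit; area: collision cross-section in m^2;
        # years: mission duration. All numbers below are illustrative.
        def collision_probability(flux, area, years):
            return 1.0 - math.exp(-flux * area * years)

        print(collision_probability(flux=1e-6, area=100.0, years=10.0))  # ~1e-3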

  12. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice...... probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...

  13. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introductio

  14. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  15. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  16. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particula...

  17. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  18. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  19. The concept of probability

    International Nuclear Information System (INIS)

    Bitsakis, E.I.; Nicolaides, C.A.

    1989-01-01

    The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our everyday life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical, epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs

  20. Evaluation of nuclear power plant component failure probability and core damage probability using simplified PSA model

    International Nuclear Information System (INIS)

    Shimada, Yoshio

    2000-01-01

    It is anticipated that changing the frequency of surveillance tests, preventive maintenance, or parts replacement of safety-related components may change component failure probabilities and, as a result, the core damage probability. The size of that change is also expected to differ depending on the initiating event frequency and the component type. This study assessed the change in core damage probability using a simplified PSA model capable of calculating core damage probability in a short time period, developed by the US NRC to process accident sequence precursors, when the failure probability of various components is varied between 0 and 1 and when Japanese or American initiating event frequency data are used. The analysis showed the following. (1) The frequency of surveillance tests, preventive maintenance, or parts replacement of motor-driven pumps (high pressure injection pumps, residual heat removal pumps, auxiliary feedwater pumps) should be changed carefully, since the change in core damage probability is large when the base failure probability increases. (2) Core damage probability is insensitive to changes in surveillance test frequency for motor-operated valves and the turbine-driven auxiliary feedwater pump, since the change in core damage probability is small even when their failure probability changes by about one order of magnitude. (3) When Japanese failure probability data are applied to the emergency diesel generator, the change in core damage probability is small even if the failure probability changes by one order of magnitude from the base value. On the other hand, when American failure probability data are applied, the increase in core damage probability is large when the failure probability increases. Therefore, when Japanese failure probability data are applied, core damage probability is insensitive to changes in surveillance test frequency and the like. (author)
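
    A toy sketch of the sensitivity study described above: sweep one component's failure probability and recompute core damage probability. The two accident sequences and all numbers are hypothetical, not the NRC accident sequence precursor model itself.

        def core_damage_probability(p_hpi, p_afw, f_loca=1e-3, f_loss_fw=1e-1):
            seq1 = f_loca * p_hpi             # LOCA and high pressure injection fails
            seq2 = f_loss_fw * p_afw * p_hpi  # loss of feedwater, AFW fails, feed-and-bleed fails
            return seq1 + seq2

        base = core_damage_probability(p_hpi=1e-3, p_afw=1e-2)
        for p in (1e-4, 1e-3, 1e-2, 1e-1, 1.0):
            ratio = core_damage_probability(p_hpi=p, p_afw=1e-2) / base
            print(f"p_HPI={p:.0e}  CDP ratio vs. base = {ratio:.1f}")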

  1. Rights, goals, and capabilities

    NARCIS (Netherlands)

    van Hees, M.V.B.P.M

    This article analyses the relationship between rights and capabilities in order to get a better grasp of the kind of consequentialism that the capability theory represents. Capability rights have been defined as rights that have a capability as their object (rights to capabilities). Such a

  2. Determination of stability of epimetamorphic rock slope using Minimax Probability Machine

    Directory of Open Access Journals (Sweden)

    Manoj Kumar

    2016-01-01

    The article employs the Minimax Probability Machine (MPM) for the prediction of the stability status of epimetamorphic rock slope. The MPM gives a worst-case bound on the probability of misclassification of future data points. Bulk density (d), height (H), inclination (β), cohesion (c) and internal friction angle (φ) have been used as inputs to the MPM. This study uses the MPM as a classification technique. Two models, a Linear Minimax Probability Machine (LMPM) and a Kernelized Minimax Probability Machine (KMPM), have been developed. The generalization capability of the developed models has been checked by a case study. The experimental results demonstrate that MPM-based approaches are promising tools for the prediction of the stability status of epimetamorphic rock slope.
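
    A minimal sketch of a linear MPM in the standard Lanckriet et al. (2002) formulation, assuming only class means and covariances are used; this is not the authors' exact implementation, and the feature ordering [d, H, β, c, φ] is only illustrative.

        import numpy as np
        from scipy.optimize import minimize

        def fit_lmpm(X_stable, X_failed):
            mu1, mu0 = X_stable.mean(axis=0), X_failed.mean(axis=0)
            S1 = np.cov(X_stable, rowvar=False)
            S0 = np.cov(X_failed, rowvar=False)

            def obj(a):  # minimize sqrt(a'S1a) + sqrt(a'S0a) subject to a'(mu1 - mu0) = 1
                return np.sqrt(a @ S1 @ a) + np.sqrt(a @ S0 @ a)

            con = {"type": "eq", "fun": lambda a: a @ (mu1 - mu0) - 1.0}
            a0 = np.linalg.pinv(S1 + S0) @ (mu1 - mu0)
            a0 /= a0 @ (mu1 - mu0)                      # feasible starting point
            a = minimize(obj, a0, constraints=[con]).x
            kappa = 1.0 / obj(a)                        # worst-case margin
            alpha = 1.0 / (1.0 + kappa ** 2)            # bound on misclassification probability
            b = a @ mu1 - kappa * np.sqrt(a @ S1 @ a)   # decision threshold
            return a, b, alpha                          # classify x as stable when a @ x >= b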

  3. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  4. Concepts of probability theory

    CERN Document Server

    Pfeiffer, Paul E

    1979-01-01

    Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, more. For advanced undergraduates students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.

  5. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...

  6. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  7. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  8. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  9. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  10. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  11. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  12. Quantum computing and probability

    International Nuclear Information System (INIS)

    Ferry, David K

    2009-01-01

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)

  13. Mobile Test Capabilities

    Data.gov (United States)

    Federal Laboratory Consortium — The Electrical Power Mobile Test capabilities are utilized to conduct electrical power quality testing on aircraft and helicopters. This capability allows that the...

  14. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).
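
    A toy sketch of intermittent, change-point style updating in the spirit of the model described above; the authors' actual algorithm differs, and this only illustrates estimates that step at irregular intervals rather than after every outcome.

        import math, random

        def track(outcomes, threshold=4.0):
            estimate, segment, estimates = 0.5, [], []
            for y in outcomes:
                segment.append(y)
                n, k = len(segment), sum(segment)
                p_new = min(max(k / n, 0.01), 0.99)
                # log-likelihood ratio of the recent segment under p_new vs. the held estimate
                llr = (k * math.log(p_new / estimate)
                       + (n - k) * math.log((1 - p_new) / (1 - estimate)))
                if llr > threshold:           # enough evidence of change: step the estimate
                    estimate, segment = p_new, []
                estimates.append(estimate)
            return estimates

        random.seed(1)
        data = ([random.random() < 0.2 for _ in range(100)]
                + [random.random() < 0.8 for _ in range(100)])
        print(track(data)[::40])  # estimate holds, then steps after the hidden change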

  15. Irreversibility and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. The retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information' since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)

  16. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more - these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  17. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  18. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability provided the same data used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields higher probability of detection than ranking by classical probability provided a given probability of ...

  19. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice...... probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications....... The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models....

  20. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Basic Notions; Sample Space and Events; Probabilities; Counting Techniques; Independence and Conditional Probability; Independence; Conditioning; The Borel-Cantelli Theorem; Discrete Random Variables; Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables; The Law of Large Numbers; Conditional Expectation; Generating Functions; Branching Processes; Random Walk Revisited; Branching Processes; Generating Functions; Branching Processes Revisited; More on Random Walk; Markov Chains; Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity; Continuous Random Variables; Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation; Distribution F...

  1. Electricity distribution within the future residence

    Energy Technology Data Exchange (ETDEWEB)

    Breeze, J.E.

    1981-11-01

    This study examined present residential wiring systems and identified their shortcomings. A list of the desirable attributes for future wiring systems is proposed. The outlook for the application to wiring systems of solid-state electronic devices is assessed. As further background for a proposed new wiring concept, the residential use of energy today and probable future trends are reviewed. Lastly, the concept of a distributed bus is proposed and developed on a conceptual basis for the residential wiring system of the future. The distributed bus concept can lead to the development of a residential wiring system to meet the following requirements: adaptable to meet probable future energy requirements for residences including alternative energy sources and energy storage; flexibility for servicing loads both in respect to location in the residence and to the size of the load; improved economy in the use of materials; capability for development as a designed or engineered system with factory assembled components and wiring harness; capability for expansion through the attachment of legs or auxiliary rings; adaptable to any probable architectural residential development; capability for development to meet the requirements for ease of use and maintenance and with recognition of the growing importance of do-it-yourself repairs and alterations; and adaptable to the full range of solid-state electronics and micro-computer devices and controls including the concept of load control and management through the use of a central control module. 66 refs., 15 figs., 1 tab.

  2. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments...... that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still...

  3. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin

  4. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  5. Capabilities for Strategic Adaptation

    DEFF Research Database (Denmark)

    Distel, Andreas Philipp

    This dissertation explores capabilities that enable firms to strategically adapt to environmental changes and preserve competitiveness over time – often referred to as dynamic capabilities. While dynamic capabilities are a popular research domain, too little is known about what these capabiliti...

  6. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  7. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrödinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general

  8. Transition probabilities for atoms

    International Nuclear Information System (INIS)

    Kim, Y.K.

    1980-01-01

    Current status of advanced theoretical methods for transition probabilities for atoms and ions is discussed. An experiment on the f values of the resonance transitions of the Kr and Xe isoelectronic sequences is suggested as a test for the theoretical methods

  9. Dynamic capabilities, Marketing Capability and Organizational Performance

    Directory of Open Access Journals (Sweden)

    Adriana Roseli Wünsch Takahashi

    2017-01-01

    The goal of the study is to investigate the influence of dynamic capabilities on organizational performance and the role of marketing capability as a mediator in this relationship, in the context of private HEIs in Brazil. As a research method, we carried out a survey of 316 HEIs; data analysis was operationalized with structural equation modeling. The results indicate that dynamic capabilities influence organizational performance only when mediated by marketing capability. Marketing capability plays an important role in the survival, growth and renewal of educational service offerings for private-sector HEIs, and consequently in organizational performance. It is also demonstrated that the mediated relationship is more intense for HEIs with up to 3,000 students; other organizational profile variables, such as the number of courses, constitution, type of institution and type of education, do not significantly alter the results.

  10. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability, so it is natural that negative probability also has different axiomatic frameworks. In previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extended ...

  11. Contributions to quantum probability

    International Nuclear Information System (INIS)

    Fritz, Tobias

    2010-01-01

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, in general, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels 'possible to occur' or 'impossible to occur' to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion of possibilistic vs. probabilistic frameworks. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome ...

  12. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramér-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  13. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that, in general, none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels 'possible to occur' or 'impossible to occur' to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion of possibilistic vs. probabilistic frameworks. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a ...

  14. Waste Package Misload Probability

    International Nuclear Information System (INIS)

    Knudsen, J.K.

    2001-01-01

    The objective of this calculation is to determine the probability of occurrence of fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and of FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants; the categories are based on whether FAs were damaged or misloaded. The next step is to determine the total number of FAs involved in each event. Using this information, a probability of occurrence is calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a.
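
    As a companion to the counting procedure the abstract describes, here is a minimal sketch with invented numbers; the real event counts and movement totals come from the Framatome ANP report cited above.

    ```python
    # Minimal sketch of the categorize-count-divide approach; all numbers are
    # illustrative placeholders, not data from the Framatome ANP report.
    events = [
        {"plant_event": "A", "category": "misload", "assemblies": 2},
        {"plant_event": "B", "category": "damage",  "assemblies": 1},
        {"plant_event": "C", "category": "misload", "assemblies": 1},
    ]
    total_fa_movements = 250_000  # hypothetical industry-wide exposure count

    counts = {}
    for e in events:  # step 1: categorize events and tally assemblies involved
        counts[e["category"]] = counts.get(e["category"], 0) + e["assemblies"]

    for category, n in counts.items():  # step 2: probability per FA movement
        print(f"P({category}) ~ {n}/{total_fa_movements} = {n / total_fa_movements:.2e}")
    ```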

  15. Probability theory and applications

    CERN Document Server

    Hsu, Elton P

    1999-01-01

    This volume, with contributions by leading experts in the field, is a collection of lecture notes from the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduate students and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained, with references, and covers basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.

  16. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory. Some remain the focus of controversy; others have allegedly been solved, but the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies. Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable to both frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are traced directly to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  17. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  18. Spatial probability aids visual stimulus discrimination

    Directory of Open Access Journals (Sweden)

    Michael Druker

    2010-08-01

    We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high-probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high-probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation; one circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high-probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid to visual stimulus discrimination, and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.
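
    As an illustration of the kind of continuous spatial prior used in Experiment 1, the following hypothetical sketch draws target locations from a mixture of an off-center Gaussian hotspot and a uniform background; the display dimensions, hotspot parameters, and mixture weight are invented, not the authors' values.

    ```python
    # Sample target locations from a "hotspot plus background" spatial prior.
    import numpy as np

    rng = np.random.default_rng(1)
    width, height = 1024, 768
    hotspot_xy, hotspot_sd, p_hotspot = np.array([700.0, 300.0]), 60.0, 0.8

    def sample_target():
        if rng.random() < p_hotspot:                 # high-probability hotspot
            return rng.normal(hotspot_xy, hotspot_sd)
        return rng.uniform([0, 0], [width, height])  # uniform background rate

    targets = np.array([sample_target() for _ in range(1000)])
    near = np.linalg.norm(targets - hotspot_xy, axis=1) < 2 * hotspot_sd
    print(f"fraction of targets within 2 SD of the hotspot: {near.mean():.2f}")
    ```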

  19. Model uncertainty and probability

    International Nuclear Information System (INIS)

    Parry, G.W.

    1994-01-01

    This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses are representing different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example

  20. Retrocausality and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    Costa de Beauregard has proposed that physical causality be identified with conditional probability. The proposal is shown to be vulnerable on two accounts. The first, though mathematically trivial, seems to be decisive so far as the current formulation of the proposal is concerned. The second lies in a physical inconsistency which seems to have its source in a Copenhagenlike disavowal of realism in quantum mechanics. 6 refs. (Author)

  1. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which ap­ peared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, de­ manding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  2. Aircraft Capability Management

    Science.gov (United States)

    Mumaw, Randy; Feary, Mike

    2018-01-01

    This presentation presents an overview of work performed at NASA Ames Research Center in 2017. The work concerns the analysis of current aircraft system management displays, and the initial development of an interface for providing information about aircraft system status. The new interface proposes a shift away from current aircraft system alerting interfaces that report the status of physical components, and towards displaying the implications of degradations on mission capability. The proposed interface describes these component failures in terms of operational consequences of aircraft system degradations. The research activity was an effort to examine the utility of different representations of complex systems and operating environments to support real-time decision making of off-nominal situations. A specific focus was to develop representations that provide better integrated information to allow pilots to more easily reason about the operational consequences of the off-nominal situations. The work is also seen as a pathway to autonomy, as information is integrated and understood in a form that automated responses could be developed for the off-nominal situations in the future.

  3. Probability mapping of contaminants

    Energy Technology Data Exchange (ETDEWEB)

    Rautman, C.A.; Kaplan, P.G. [Sandia National Labs., Albuquerque, NM (United States); McGraw, M.A. [Univ. of California, Berkeley, CA (United States); Istok, J.D. [Oregon State Univ., Corvallis, OR (United States); Sigda, J.M. [New Mexico Inst. of Mining and Technology, Socorro, NM (United States)

    1994-04-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds).
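
    The post-processing step described above reduces to a cell-wise frequency count across equally likely simulations. A minimal sketch follows, with synthetic random fields standing in for real conditional geostatistical simulations and an invented threshold.

    ```python
    # Exceedance-probability map: the fraction of simulations in which each cell
    # exceeds a clean-up threshold. The fields below are synthetic stand-ins.
    import numpy as np

    rng = np.random.default_rng(42)
    n_sims, ny, nx = 200, 50, 50
    threshold = 35.0  # hypothetical clean-up level

    # Stand-in for conditional simulations: a common base field plus variability.
    base = rng.normal(30.0, 5.0, size=(ny, nx))
    sims = base + rng.normal(0.0, 8.0, size=(n_sims, ny, nx))

    prob_exceed = (sims > threshold).mean(axis=0)  # cell-wise exceedance probability
    print("max exceedance probability:", prob_exceed.max().round(2))
    print("fraction of parcels with P(exceed) > 0.5:", (prob_exceed > 0.5).mean().round(2))
    ```

    Each cell of `prob_exceed` is directly the decision-relevant quantity: the probability that a bulldozer-blade-sized parcel is above the threshold, given the data.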

  4. Probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.; Kaplan, P.G.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.

    1994-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds)

  5. Probability of causation approach

    International Nuclear Information System (INIS)

    Jose, D.E.

    1988-01-01

    Probability of causation (PC) is sometimes viewed as a great improvement by those persons who are not happy with the present rulings of courts in radiation cases. The author does not share that hope and expects that PC will not play a significant role in these issues for at least the next decade. If it is ever adopted in a legislative compensation scheme, it will be used in a way that is unlikely to please most scientists. Consequently, PC is a false hope for radiation scientists, and its best contribution may well lie in some of the spin-off effects, such as an influence on medical practice

  6. Generalized Probability Functions

    Directory of Open Access Journals (Sweden)

    Alexandre Souto Martinez

    2009-01-01

    From the integration of nonsymmetric hyperbolas, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by mathematical curiosity, we show that these generalized functions are suitable for generalizing some probability density functions (pdfs). A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn our attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, and the generalized Gaussian and Laplace pdfs. Their cumulative functions and moments are also obtained analytically.
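
    The abstract does not reproduce the formulas. One common way to write such a one-parameter family (stated here as an assumption consistent with the description, with deformation parameter $\tilde{q}$) integrates the nonsymmetric hyperbola $1/t^{1-\tilde{q}}$:

    ```latex
    \ln_{\tilde{q}}(x) \;=\; \int_{1}^{x} \frac{dt}{t^{\,1-\tilde{q}}}
                       \;=\; \frac{x^{\tilde{q}} - 1}{\tilde{q}},
    \qquad
    e_{\tilde{q}}(x) \;\equiv\; \ln_{\tilde{q}}^{-1}(x)
                     \;=\; \left(1 + \tilde{q}\,x\right)^{1/\tilde{q}},
    ```

    with the standard logarithm and exponential recovered in the limit $\tilde{q} \to 0$.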

  7. Probability in High Dimension

    Science.gov (United States)

    2014-06-30

    Abstract not available: the indexed text consists of disconnected fragments from the report body (e.g., a remark that the assumption that A is a packing in (F, ‖·‖₁) is too weak) and from its bibliography, including the Saint-Flour lecture notes on the regularity of trajectories of Gaussian random functions and on probability theory and statistics.

  8. Accelerator and Electrodynamics Capability Review

    International Nuclear Information System (INIS)

    Jones, Kevin W.

    2010-01-01

    Los Alamos National Laboratory (LANL) uses capability reviews to assess the science, technology and engineering (STE) quality and institutional integration and to advise Laboratory Management on the current and future health of the STE. Capability reviews address the STE integration that LANL uses to meet mission requirements. The Capability Review Committees serve a dual role of providing assessment of the Laboratory's technical contributions and integration towards its missions and providing advice to Laboratory Management. The assessments and advice are documented in reports prepared by the Capability Review Committees that are delivered to the Director and to the Principal Associate Director for Science, Technology and Engineering (PADSTE). Laboratory Management will use this report for STE assessment and planning. LANL has defined fifteen STE capabilities. Electrodynamics and Accelerators is one of the seven STE capabilities that LANL Management (Director, PADSTE, technical Associate Directors) has identified for review in Fiscal Year (FY) 2010. Accelerators and electrodynamics at LANL comprise a blend of large-scale facilities and innovative small-scale research with a growing focus on national security applications. This review is organized into five topical areas: (1) Free Electron Lasers; (2) Linear Accelerator Science and Technology; (3) Advanced Electromagnetics; (4) Next Generation Accelerator Concepts; and (5) National Security Accelerator Applications. The focus is on innovative technology with an emphasis on applications relevant to Laboratory mission. The role of Laboratory Directed Research and Development (LDRD) in support of accelerators/electrodynamics will be discussed. The review provides an opportunity for interaction with early career staff. Program sponsors and customers will provide their input on the value of the accelerator and electrodynamics capability to the Laboratory mission.

  9. Accelerator and electrodynamics capability review

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Kevin W [Los Alamos National Laboratory

    2010-01-01

    Los Alamos National Laboratory (LANL) uses capability reviews to assess the science, technology and engineering (STE) quality and institutional integration and to advise Laboratory Management on the current and future health of the STE. Capability reviews address the STE integration that LANL uses to meet mission requirements. The Capability Review Committees serve a dual role of providing assessment of the Laboratory's technical contributions and integration towards its missions and providing advice to Laboratory Management. The assessments and advice are documented in reports prepared by the Capability Review Committees that are delivered to the Director and to the Principal Associate Director for Science, Technology and Engineering (PADSTE). Laboratory Management will use this report for STE assessment and planning. LANL has defined fifteen STE capabilities. Electrodynamics and Accelerators is one of the seven STE capabilities that LANL Management (Director, PADSTE, technical Associate Directors) has identified for review in Fiscal Year (FY) 2010. Accelerators and electrodynamics at LANL comprise a blend of large-scale facilities and innovative small-scale research with a growing focus on national security applications. This review is organized into five topical areas: (1) Free Electron Lasers; (2) Linear Accelerator Science and Technology; (3) Advanced Electromagnetics; (4) Next Generation Accelerator Concepts; and (5) National Security Accelerator Applications. The focus is on innovative technology with an emphasis on applications relevant to Laboratory mission. The role of Laboratory Directed Research and Development (LDRD) in support of accelerators/electrodynamics will be discussed. The review provides an opportunity for interaction with early career staff. Program sponsors and customers will provide their input on the value of the accelerator and electrodynamics capability to the Laboratory mission.

  10. Probable maximum flood control

    International Nuclear Information System (INIS)

    DeGabriele, C.E.; Wu, C.L.

    1991-11-01

    This study proposes preliminary design concepts to protect the waste-handling facilities and all shaft and ramp entries to the underground from the probable maximum flood (PMF) in the current design configuration for the proposed Nevada Nuclear Waste Storage Investigation (NNWSI) repository. Flood protection provisions were furnished by the United States Bureau of Reclamation (USBR) or developed from USBR data. Proposed flood protection provisions include site grading, drainage channels, and diversion dikes. Figures are provided to show these proposed flood protection provisions at each area investigated. These areas are the central surface facilities (including the waste-handling building and waste treatment building), tuff ramp portal, waste ramp portal, men-and-materials shaft, emplacement exhaust shaft, and exploratory shafts facility.

  11. Building Service Provider Capabilities

    DEFF Research Database (Denmark)

    Brandl, Kristin; Jaura, Manya; Ørberg Jensen, Peter D.

    2015-01-01

    In this paper we study whether and how the interaction between clients and service providers contributes to the development of capabilities in service provider firms. In situations where such a contribution occurs, we analyze how different types of activities in the production process of the services, such as sequential or reciprocal task activities, influence the development of different types of capabilities. We study five cases of offshore-outsourced knowledge-intensive business services that are distinguished according to their reciprocal or sequential task activities in their production process. We find that clients influence the development of human capital capabilities and management capabilities in reciprocally produced services, while in sequentially produced services clients influence the development of organizational capital capabilities and management capital capabilities.

  12. Box-particle probability hypothesis density filtering

    OpenAIRE

    Schikora, M.; Gning, A.; Mihaylova, L.; Cremers, D.; Koch, W.

    2014-01-01

    This paper develops a novel approach for multitarget tracking, called the box-particle probability hypothesis density filter (box-PHD filter). The approach is able to track multiple targets and estimates the unknown number of targets. Furthermore, it is capable of dealing with three sources of uncertainty: stochastic, set-theoretic, and data association uncertainty. The box-PHD filter reduces the number of particles significantly, which improves the runtime considerably. The small number of box-particles ...

  13. Probability for Weather and Climate

    Science.gov (United States)

    Smith, L. A.

    2013-12-01

    ... decision making versus advancing science, are noted. It is argued that, just as no point forecast is complete without an estimate of its accuracy, no model-based probability forecast is complete without an estimate of its own irrelevance. The same nonlinearities that made the electronic computer so valuable link the selection and assimilation of observations, the formation of ensembles, the evolution of models, the casting of model simulations back into observables, and the presentation of this information to those who use it to take action or to advance science. Timescales of interest exceed the lifetime of a climate model and the career of a climate scientist, disarming the trichotomy that led to swift advances in weather forecasting. Providing credible, informative climate services is a more difficult task. In this context, the value of comparing the forecasts of simulation models not only with each other but also with the performance of simple empirical models, whenever possible, is stressed. The credibility of meteorology is based on its ability to forecast and explain the weather. The credibility of climatology will always be based on flimsier stuff. Solid insights of climate science may be obscured if the severe limits on our ability to see the details of the future, even probabilistically, are not communicated clearly.

  14. Capability Handbook- offline metrology

    DEFF Research Database (Denmark)

    Islam, Aminul; Marhöfer, David Maximilian; Tosello, Guido

    This offline metrological capability handbook has been prepared in relation to HiMicro Task 3.3. The purpose of this document is to assess the metrological capability of the HiMicro partners and to gather the information on all available metrological instruments in one single document. It provides ...

  15. Dynamic Capabilities and Performance

    DEFF Research Database (Denmark)

    Wilden, Ralf; Gudergan, Siegfried P.; Nielsen, Bo Bernhard

    2013-01-01

    ... are contingent on the competitive intensity faced by firms. Our findings demonstrate the performance effects of internal alignment between organizational structure and dynamic capabilities, as well as the external fit of dynamic capabilities with competitive intensity. We outline the advantages of PLS ...

  16. Developing Alliance Capabilities

    DEFF Research Database (Denmark)

    Heimeriks, Koen H.; Duysters, Geert; Vanhaverbeke, Wim

    This paper assesses the differential performance effects of learning mechanisms on the development of alliance capabilities. Prior research has suggested that different capability levels can be identified, at which specific intra-firm learning mechanisms are used to enhance a firm's alliance ...

  17. Telematics Options and Capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Hodge, Cabell [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-09-05

    This presentation describes the data tracking and analytical capabilities of telematics devices. Federal fleet managers can use the systems to keep their drivers safe, maintain a fuel efficient fleet, ease their reporting burden, and save money. The presentation includes an example of how much these capabilities can save fleets.

  18. Assessing the clinical probability of pulmonary embolism

    International Nuclear Information System (INIS)

    Miniati, M.; Pistolesi, M.

    2001-01-01

    ...% in those with low probability. The prevalence of PE in patients with intermediate clinical probability was 41%. These results underscore the importance of incorporating the standardized reading of the electrocardiogram and of the chest radiograph into the clinical evaluation of patients with suspected PE. The interpretation of these laboratory data, however, requires experience. Future research is needed to develop standardized models, of varying degrees of complexity, which may find application in different clinical settings to predict the probability of PE.

  19. COVAL, Compound Probability Distribution for Function of Probability Distribution

    International Nuclear Information System (INIS)

    Astolfi, M.; Elbaz, J.

    1979-01-01

    1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions
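
    COVAL itself performs a numerical transformation of distributions; the sketch below is a Monte Carlo version of the same task for the cited application (reliability of a structure under random loads). The distribution families and parameters are illustrative assumptions only.

    ```python
    # Monte Carlo estimate of the distribution of a function of random variables:
    # here the safety margin (resistance - load) and the failure probability.
    import numpy as np

    rng = np.random.default_rng(7)
    n = 1_000_000
    load = rng.lognormal(mean=np.log(50.0), sigma=0.25, size=n)   # random load
    resistance = rng.normal(loc=100.0, scale=10.0, size=n)        # random strength

    margin = resistance - load      # distribution of a function of the variables
    p_fail = (margin < 0).mean()    # P(load > resistance)
    print(f"P(failure) ~ {p_fail:.2e}")
    print(f"5th percentile of safety margin: {np.percentile(margin, 5):.1f}")
    ```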

  20. FMEF/experimental capabilities

    International Nuclear Information System (INIS)

    Burgess, C.A.; Dronen, V.R.

    1981-01-01

    The Fuels and Materials Examination Facility (FMEF), under construction at the Hanford site north of Richland, Washington, will be one of the most modern facilities offering irradiated fuels and materials examination capabilities and fuel fabrication development technologies. Scheduled for completion in 1984, the FMEF will provide examination capability for fuel assemblies, fuel pins and test pins irradiated in the FFTF. Various functions of the FMEF are described, with emphasis on experimental data-gathering capabilities in the facility's Nondestructive and Destructive examination cell complex

  1. KSC Technical Capabilities Website

    Science.gov (United States)

    Nufer, Brian; Bursian, Henry; Brown, Laurette L.

    2010-01-01

    This document is the website pages that review the technical capabilities that the Kennedy Space Center (KSC) has for partnership opportunities. The purpose of this information is to make prospective customers aware of the capabilities and provide an opportunity to form relationships with the experts at KSC. The technical capabilities fall into these areas: (1) Ground Operations and Processing Services, (2) Design and Analysis Solutions, (3) Command and Control Systems / Services, (4) Materials and Processes, (5) Research and Technology Development and (6) Laboratories, Shops and Test Facilities.

  2. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  3. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    Foreword; Preface; Sets, Events, and Probability; The Algebra of Sets; The Bernoulli Sample Space; The Algebra of Multisets; The Concept of Probability; Properties of Probability Measures; Independent Events; The Bernoulli Process; The R Language; Finite Processes; The Basic Models; Counting Rules; Computing Factorials; The Second Rule of Counting; Computing Probabilities; Discrete Random Variables; The Bernoulli Process: Tossing a Coin; The Bernoulli Process: Random Walk; Independence and Joint Distributions; Expectations; The Inclusion-Exclusion Principle; General Random Variable ...

  4. THE BLACK HOLE FORMATION PROBABILITY

    Energy Technology Data Exchange (ETDEWEB)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D., E-mail: dclausen@tapir.caltech.edu [TAPIR, Walter Burke Institute for Theoretical Physics, California Institute of Technology, Mailcode 350-17, Pasadena, CA 91125 (United States)

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.
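
    To illustrate how a probabilistic prescription P_BH(M_ZAMS) would be consumed by a population synthesis code, the following toy sketch uses a hypothetical logistic form and a Salpeter-like initial mass function; neither is the paper's fitted result.

    ```python
    # Toy population-synthesis draw with a probabilistic (rather than sharp-cut)
    # BH formation prescription. The logistic parameters and IMF are assumptions.
    import numpy as np

    rng = np.random.default_rng(3)

    def p_bh(m_zams, m_half=25.0, width=3.0):
        """Hypothetical probability that a star of given ZAMS mass forms a BH."""
        return 1.0 / (1.0 + np.exp(-(m_zams - m_half) / width))

    # Sample ZAMS masses from a Salpeter-like IMF, dN/dM ~ M^-2.35, on 8-100 Msun.
    alpha, m_lo, m_hi, n = 2.35, 8.0, 100.0, 100_000
    u = rng.random(n)
    masses = (m_lo**(1 - alpha) + u * (m_hi**(1 - alpha) - m_lo**(1 - alpha)))**(1 / (1 - alpha))

    is_bh = rng.random(n) < p_bh(masses)   # probabilistic, not a sharp mass cut
    print(f"BH fraction among massive stars: {is_bh.mean():.2f}")
    print(f"median ZAMS mass of BH progenitors: {np.median(masses[is_bh]):.1f} Msun")
    ```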

  5. THE BLACK HOLE FORMATION PROBABILITY

    International Nuclear Information System (INIS)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-01-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  6. The Black Hole Formation Probability

    Science.gov (United States)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  7. Resources, constraints and capabilities

    NARCIS (Netherlands)

    Dhondt, S.; Oeij, P.R.A.; Schröder, A.

    2018-01-01

    Human and financial resources, as well as organisational capabilities, are needed to overcome the manifold constraints social innovators are facing. To unlock the potential of social innovation for the whole society, new (social) innovation-friendly environments and new governance structures ...

  8. a Capability approach

    African Journals Online (AJOL)

    efforts towards gender equality in education as a means of achieving social justice. ... should mean that a lot of capability approach-oriented commentators are ... processes, their forms of exercising power, and their rules, unwritten cultures, ...

  9. Engineering Capabilities and Partnerships

    Science.gov (United States)

    Poulos, Steve

    2010-01-01

    This slide presentation reviews the engineering capabilities at Johnson Space Center. The presentation also reviews the partnerships that have resulted in successfully designed and developed projects involving commercial and educational institutions.

  10. A first course in probability

    CERN Document Server

    Ross, Sheldon

    2014-01-01

    A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.

  11. Brandishing Cyberattack Capabilities

    Science.gov (United States)

    2013-01-01

    Advertising cyberwar capabilities may be helpful. It may back up a deterrence strategy. It might dissuade other states from conventional mischief or ... to enable the attack. Many of the instruments of the attack remain with the target system, nestled in its log files, or even in the malware itself ... debatable. Even if demonstrated, what worked yesterday may not work today. But difficult does not mean impossible. Advertising cyberwar capabilities ...

  12. CASL Dakota Capabilities Summary

    Energy Technology Data Exchange (ETDEWEB)

    Adams, Brian M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Simmons, Chris [Univ. of Texas, Austin, TX (United States); Williams, Brian J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-10-10

    The Dakota software project serves the mission of Sandia National Laboratories and supports a worldwide user community by delivering state-of-the-art research and robust, usable software for optimization and uncertainty quantification. These capabilities enable advanced exploration and risk-informed prediction with a wide range of computational science and engineering models. Dakota is the verification and validation (V&V) / uncertainty quantification (UQ) software delivery vehicle for CASL, allowing analysts across focus areas to apply these capabilities to myriad nuclear engineering analyses.

  13. Graphical Visualization of Human Exploration Capabilities

    Science.gov (United States)

    Rodgers, Erica M.; Williams-Byrd, Julie; Arney, Dale C.; Simon, Matthew A.; Williams, Phillip A.; Barsoum, Christopher; Cowan, Tyler; Larman, Kevin T.; Hay, Jason; Burg, Alex

    2016-01-01

    NASA's pioneering space strategy will require advanced capabilities to expand the boundaries of human exploration on the Journey to Mars (J2M). The Evolvable Mars Campaign (EMC) architecture serves as a framework to identify critical capabilities that need to be developed and tested in order to enable a range of human exploration destinations and missions. Agency-wide System Maturation Teams (SMT) are responsible for the maturation of these critical exploration capabilities and help formulate, guide and resolve performance gaps associated with the EMC-identified capabilities. Systems Capability Organization Reporting Engine boards (SCOREboards) were developed to integrate the SMT data sets into cohesive human exploration capability stories that can be used to promote dialog and communicate NASA's exploration investments. Each SCOREboard provides a graphical visualization of SMT capability development needs that enable exploration missions, and presents a comprehensive overview of data that outlines a roadmap of system maturation needs critical for the J2M. SCOREboards are generated by a computer program that extracts data from a main repository, sorts the data based on a tiered data reduction structure, and then plots the data according to specified user inputs. The ability to sort and plot varying data categories provides the flexibility to present specific SCOREboard capability roadmaps based on customer requests. This paper presents the development of the SCOREboard computer program and shows multiple complementary, yet different datasets through a unified format designed to facilitate comparison between datasets. Example SCOREboard capability roadmaps are presented followed by a discussion of how the roadmaps are used to: 1) communicate capability developments and readiness of systems for future missions, and 2) influence the definition of NASA's human exploration investment portfolio through capability-driven processes. The paper concludes with a description
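
    As a rough illustration of the extract-sort-render pattern the paper describes, the following hypothetical sketch reduces capability records by tier and prints a text roadmap. The field names, tiers, and records are invented; the real SCOREboard program reads from the SMT data repository and plots graphically.

    ```python
    # Hypothetical extract -> tiered reduction -> sorted "roadmap" rendering.
    from collections import defaultdict

    records = [  # stand-in for rows extracted from the main repository
        {"capability": "In-space propulsion", "tier": 1, "need_by": 2028, "trl": 4},
        {"capability": "EVA suit",            "tier": 1, "need_by": 2026, "trl": 5},
        {"capability": "ISRU demo",           "tier": 2, "need_by": 2030, "trl": 3},
    ]

    by_tier = defaultdict(list)          # tiered data reduction
    for rec in records:
        by_tier[rec["tier"]].append(rec)

    for tier in sorted(by_tier):         # sort, then render a text roadmap
        print(f"Tier {tier}:")
        for rec in sorted(by_tier[tier], key=lambda r: r["need_by"]):
            print(f"  {rec['need_by']}  TRL {rec['trl']}  {rec['capability']}")
    ```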

  14. Space Logistics: Launch Capabilities

    Science.gov (United States)

    Furnas, Randall B.

    1989-01-01

    The current maximum launch capability for the United States is shown. The predicted Earth-to-orbit requirements for the United States are presented. Contrasting the two indicates the strong national need for a major increase in Earth-to-orbit lift capability. Approximate weights for planned payloads are shown. NASA is studying the following options to meet the need for a new heavy-lift capability by the mid-to-late 1990's: (1) Shuttle-C for the near term (including growth versions); and (2) the Advanced Launch System (ALS) for the long term. The current baseline two-engine Shuttle-C has a 15 x 82 ft payload bay and an expected lift capability of 82,000 lb to Low Earth Orbit. Several options are being considered which have expanded-diameter payload bays. A three-engine Shuttle-C with an expected lift of 145,000 lb to LEO is being evaluated as well. The Advanced Launch System (ALS) is a potential joint development between the Air Force and NASA. This program is focused toward long-term launch requirements, specifically beyond the year 2000. The basic approach is to develop a family of vehicles with the same high reliability as the Shuttle system, yet offering a much greater lift capability at a greatly reduced cost (per pound of payload). The ALS unmanned family of vehicles will provide a low-end lift capability equivalent to Titan IV, and a high-end lift capability greater than the Soviet Energia if requirements for such a high-end vehicle are defined. In conclusion, the planning of the next-generation space telescope should not be constrained to the current launch vehicles. New vehicle designs will be driven by the needs of anticipated heavy users.

  15. A brief introduction to probability.

    Science.gov (United States)

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: back in the 1600s, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution", a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, the most relevant distribution applied in statistical analysis.
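
    As a small companion example for the "probability distribution" concept, the following computes the distribution of a discrete variable (a binomial count) from first principles, using only the Python standard library; the coin-toss setting is illustrative.

    ```python
    # A probability distribution assigns a probability to each possible outcome;
    # here, the number of heads in 10 fair coin tosses (a binomial distribution).
    from math import comb

    n, p = 10, 0.5
    pmf = {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}

    print(f"P(X = 5) = {pmf[5]:.3f}")
    print(f"sum of all probabilities = {sum(pmf.values()):.1f}")  # a pmf sums to 1
    ```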

  16. Conceptualizing innovation capabilities: A contingency perspective

    Directory of Open Access Journals (Sweden)

    Tor Helge Aas

    2017-01-01

    Empirical research has confirmed that a positive relationship exists between the implementation of innovation activities and the future performance of organizations. Firms utilize resources and capabilities to develop innovations in the form of new products, services or processes. Some firms prove to be better at reproducing innovation success than others, and the capacity to do so is referred to as innovation capability. However, the term innovation capability is treated ambiguously in the extant literature. There are several different definitions of the concept; the distinction between innovation capabilities and other types of capabilities, such as dynamic capabilities, is not explicitly stated, nor is the relationship between the concept and other resource- and capability-based concepts within strategy theory established. Although innovation is increasingly identified as crucial for a firm's sustainable competitiveness in contemporary volatile and complex markets, the strategy-innovation link is underdeveloped in extant research. To overcome this challenge, this paper raises the following research question: what types of innovation capabilities are required to innovate successfully? Given the status of the extant research, we chose a conceptual research design to answer our research question, and the paper contributes a conceptual framework for discussing what innovation capabilities firms need to reproduce innovation success. Based on careful examination of the current literature on innovation capability specifically, and the strategy-innovation link in general, we suggest that innovation capability must be viewed along two dimensions – innovation novelty and market characteristics. This framework enables the identification of four different contexts for innovation capabilities in a two-by-two matrix. We discuss the types of innovation capabilities necessary within the four different contexts. This novel framework contributes to the ...

  17. Technological Capability's Predictor Variables

    Directory of Open Access Journals (Sweden)

    Fernanda Maciel Reichert

    2011-03-01

    The aim of this study was to identify the factors that influence the configuration of technological capability in companies in sectors of medium-low technological intensity. To achieve this goal, a survey was carried out. Based on the framework developed by Lall (1992), which classifies firms at basic, intermediate and advanced levels of technological capability, it was found that the predominant technological capability is intermediate, held by 83.7% of the respondent companies (plastics companies in Brazil). It is believed that the main contribution of this study is the finding that the dependent variable, "Technological Capability", can be explained at a rate of 65% by six variables: development of new processes; selection of the best equipment supplier; sales of internally developed new technology to third parties; design and manufacture of equipment; study of work methods and inventory control; and improvement of product quality.

  18. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  19. Capabilities for innovation

    DEFF Research Database (Denmark)

    Nielsen, Peter; Nielsen, Rene Nesgaard; Bamberger, Simon Grandjean

    2012-01-01

    Technological developments combined with increasing levels of competition related to the ongoing globalization imply that firms find themselves in dynamic, changing environments that call for dynamic capabilities. This challenges the internal human and organizational resources of firms in general ... The empirical basis is a survey that collected information from 601 firms belonging to the private urban sector in Denmark; the survey was carried out in late 2010. Keywords: dynamic capabilities / innovation / globalization / employee-employer cooperation / Nordic model. Acknowledgment: the GOPA study was financed by grant 20080053113.

  20. Human push capability.

    Science.gov (United States)

    Barnett, Ralph L; Liber, Theodore

    2006-02-22

    Use of unassisted human push capability arises from time to time in the areas of crowd and animal control, the security of locked doors, the integrity of railings, the removal of tree stumps and entrenched vehicles, the manoeuvring of furniture, and athletic pursuits such as US football or wrestling. Depending on the scenario, human push capability involves strength, weight, weight distribution, push angle, footwear/floor friction, and the friction between the upper body and the pushed object. Simple models are used to establish the relationships among these factors (see the sketch below).
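
    The abstract does not reproduce the models. A minimal statics sketch consistent with the factors it lists (weight W, static friction coefficient μ_s between footwear and floor, push angle θ) bounds the unassisted push force by the pusher's traction:

    ```latex
    F_{\max}^{\text{horizontal}} = \mu_s W,
    \qquad
    F\cos\theta \le \mu_s\!\left(W - F\sin\theta\right)
    \;\Longrightarrow\;
    F_{\max}(\theta) = \frac{\mu_s W}{\cos\theta + \mu_s \sin\theta},
    ```

    where θ is the push angle below the horizontal. For example, with μ_s = 0.5 and W = 800 N, a purely horizontal push cannot exceed about 400 N; for typical values, pushing downward on the object reduces this further, because the reaction unloads the pusher's feet.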

  1. The Capability Approach

    OpenAIRE

    Robeyns, Ingrid

    2011-01-01

In its most general description, the capability approach is a flexible and multi-purpose normative framework, rather than a precise theory of well-being, freedom or justice. At its core are two normative claims: first, the claim that the freedom to achieve well-being is of primary moral importance, and second, that freedom to achieve well-being is to be understood in terms of people’s capabilities, that is, their real opportunities to do and be what they have reason to value. Thi...

  2. Sandia QIS Capabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Muller, Richard P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-07-01

    Sandia National Laboratories has developed a broad set of capabilities in quantum information science (QIS), including elements of quantum computing, quantum communications, and quantum sensing. The Sandia QIS program is built atop unique DOE investments at the laboratories, including the MESA microelectronics fabrication facility, the Center for Integrated Nanotechnologies (CINT) facilities (joint with LANL), the Ion Beam Laboratory, and ASC High Performance Computing (HPC) facilities. Sandia has invested $75 M of LDRD funding over 12 years to develop unique, differentiating capabilities that leverage these DOE infrastructure investments.

  3. Building Airport Surface HITL Simulation Capability

    Science.gov (United States)

    Chinn, Fay Cherie

    2016-01-01

FutureFlight Central (FFC) is a high-fidelity, real-time simulator designed to study surface operations and automation. As an air traffic control tower simulator, FFC allows stakeholders such as the FAA, controllers, pilots, airports, and airlines to develop and test advanced surface and terminal-area concepts and automation, including NextGen and beyond automation concepts and tools. These technologies will improve the safety, capacity, and environmental issues facing the National Airspace System. FFC also has extensive video streaming capabilities, which, combined with its 3-D database capability, make the facility ideal for any research needing an immersive virtual and/or video environment. FutureFlight Central allows human-in-the-loop testing, which accommodates human interactions and errors, giving a more complete picture than fast-time simulations. This presentation describes FFC's capabilities and the components necessary to build an airport surface human-in-the-loop simulation capability.

  4. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  5. Technological Innovation Capabilities and Firm Performance

    OpenAIRE

    Richard C.M. Yam; William Lo; Esther P.Y. Tang; Antonio; K.W. Lau

    2010-01-01

Technological innovation capability (TIC) is defined as a comprehensive set of characteristics of a firm that facilitates and supports its technological innovation strategies. An audit to evaluate the TICs of a firm may trigger improvement in its future practices. Such an audit can be used by the firm for self-assessment or by a third party for independent assessment to identify problems in its capability status. This paper attempts to develop such an auditing framework that can...

  6. Poisson Processes in Free Probability

    OpenAIRE

    An, Guimei; Gao, Mingchu

    2015-01-01

We prove a multidimensional Poisson limit theorem in free probability, and define joint free Poisson distributions in a non-commutative probability space. We define (compound) free Poisson processes explicitly, similar to the definitions of (compound) Poisson processes in classical probability. We prove that the sum of finitely many freely independent compound free Poisson processes is again a compound free Poisson process. We give a step-by-step procedure for constructing a (compound) free Poisso...
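
For reference, the limit law in the free Poisson limit theorem is the Marchenko-Pastur (free Poisson) distribution; its standard one-dimensional form, with rate $\lambda > 0$ and jump size $\alpha$, is added here for context (this is textbook material, not quoted from the paper):

$$
\pi_{\lambda,\alpha} =
\begin{cases}
(1-\lambda)\,\delta_0 + \tilde{\nu}, & 0 \le \lambda \le 1,\\
\tilde{\nu}, & \lambda > 1,
\end{cases}
\qquad
d\tilde{\nu}(x) = \frac{\sqrt{4\lambda\alpha^{2} - \big(x - \alpha(1+\lambda)\big)^{2}}}{2\pi\alpha x}\,dx,
$$

supported on $[\alpha(1-\sqrt{\lambda})^{2},\, \alpha(1+\sqrt{\lambda})^{2}]$.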

  7. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...
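
A minimal sketch of a design-weighted conditional probability estimate of the kind such monitoring programs support; the site data and weights below are invented for illustration:

```python
# Sketch: estimate P(poor condition | stressor present) from a probability
# survey, weighting each sampled site by its design weight. Data are made up.
stressor = [1, 1, 0, 1, 0, 0, 1, 1]           # stressor present at site?
poor     = [1, 0, 0, 1, 0, 1, 1, 0]           # site in poor condition?
weight   = [10.0, 12.0, 8.0, 15.0, 9.0, 11.0, 7.0, 14.0]  # design weights

num = sum(w for s, p, w in zip(stressor, poor, weight) if s and p)
den = sum(w for s, w in zip(stressor, weight) if s)
print(f"P(poor | stressor) ~ {num / den:.2f}")
```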

  8. ISOPHOT - Capabilities and performance

    DEFF Research Database (Denmark)

    Lemke, D.; Klaas, U.; Abolins, J.

    1996-01-01

ISOPHOT covers the largest wavelength range on ISO, from 2.5 to 240 μm. Its scientific capabilities include multi-filter and multi-aperture photometry, polarimetry, imaging and spectrophotometry. All modes can optionally include a focal plane chopper. The backbone of the photometric calibration...

  9. Capabilities for Intercultural Dialogue

    Science.gov (United States)

    Crosbie, Veronica

    2014-01-01

    The capabilities approach offers a valuable analytical lens for exploring the challenge and complexity of intercultural dialogue in contemporary settings. The central tenets of the approach, developed by Amartya Sen and Martha Nussbaum, involve a set of humanistic goals including the recognition that development is a process whereby people's…

  10. Capabilities and Special Needs

    DEFF Research Database (Denmark)

    Kjeldsen, Christian Christrup

    into international consideration in relation to the implementation of the UN convention on the rights of persons with disabilities. As for the theoretical basis, the research makes use of the sociological open-ended and relational concepts of Pierre Bourdieu and the normative yardstick of the Capability Approach...

  11. Metrology Measurement Capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Glen E. Gronniger

    2007-10-02

This document contains descriptions of Federal Manufacturing & Technologies (FM&T) Metrology capabilities, traceability flow charts, and the measurement uncertainty of each measurement capability. Metrology provides NIST-traceable precision measurements or equipment calibration for a wide variety of parameters, ranges, and state-of-the-art uncertainties. Metrology laboratories conform to the requirements of the Department of Energy Development and Production Manual Chapter 13.2, ANSI/ISO/IEC 17025:2005, and ANSI/NCSL Z540-1. FM&T Metrology laboratories are accredited by NVLAP for the parameters, ranges, and uncertainties listed in the specific scope of accreditation under NVLAP Lab Code 200108-0. See the Internet at http://ts.nist.gov/Standards/scopes/2001080.pdf. These parameters are summarized. The Honeywell Federal Manufacturing & Technologies (FM&T) Metrology Department has developed measurement technology and calibration capability in four major fields of measurement: (1) Mechanical; (2) Environmental, Gas, Liquid; (3) Electrical (DC, AC, RF/Microwave); and (4) Optical and Radiation. Metrology Engineering provides the expertise to develop measurement capabilities for virtually any type of measurement in the fields listed above. A strong audit function has been developed to provide a means to evaluate the calibration programs of our suppliers and internal calibration organizations. Evaluation includes measurement audits and technical surveys.

  12. The Capability Approach

    NARCIS (Netherlands)

    I.A.M. Robeyns (Ingrid)

    2011-01-01

    textabstract In its most general description, the capability approach is a flexible and multi-purpose normative framework, rather than a precise theory of well-being, freedom or justice. At its core are two normative claims: first, the claim that the freedom to achieve well-being is of primary

  13. Sensor Alerting Capability

    Science.gov (United States)

    Henriksson, Jakob; Bermudez, Luis; Satapathy, Goutam

    2013-04-01

There is a large amount of sensor data generated today by various sensors, from in-situ buoys to mobile underwater gliders. Providing sensor data to users through standardized services, language and data model is the promise of OGC's Sensor Web Enablement (SWE) initiative. As the amount of data grows, it is becoming difficult for data providers, planners and managers to ensure reliability of data and services and to monitor critical data changes. Intelligent Automation Inc. (IAI) is developing a net-centric alerting capability to address these issues. The capability is built on Sensor Observation Services (SOSs), which are used to collect and monitor sensor data. The alerts can be configured at the service level and at the sensor data level. For example, it can alert on irregular data delivery events or on a geo-temporal statistic of sensor data crossing a preset threshold. The capability provides multiple delivery mechanisms and protocols, including traditional techniques such as email and RSS. With this capability decision makers can monitor their assets and data streams, correct failures, or be alerted about an approaching phenomenon.
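
A minimal sketch of the data-level alerting idea, assuming a rolling-mean statistic and a preset threshold; the SOS client and the email/RSS delivery channels are stubbed out as print calls, and all values are illustrative:

```python
# Sketch: fire an alert when a rolling statistic of a sensor stream crosses
# a preset threshold. Real delivery (email, RSS) is replaced by print().
from collections import deque

window = deque(maxlen=12)    # rolling window of recent observations
THRESHOLD = 30.0             # preset, domain-specific alert threshold

def on_observation(value: float) -> None:
    window.append(value)
    mean = sum(window) / len(window)
    if mean > THRESHOLD:
        print(f"ALERT: rolling mean {mean:.1f} exceeds {THRESHOLD}")

for v in [28.0, 29.5, 31.0, 33.2, 35.8]:   # simulated incoming stream
    on_observation(v)
```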

  14. Capitalizing on capabilities.

    Science.gov (United States)

    Ulrich, Dave; Smallwood, Norm

    2004-06-01

    By making the most of organizational capabilities--employees' collective skills and fields of expertise--you can dramatically improve your company's market value. Although there is no magic list of proficiencies that every organization needs in order to succeed, the authors identify 11 intangible assets that well-managed companies tend to have: talent, speed, shared mind-set and coherent brand identity, accountability, collaboration, learning, leadership, customer connectivity, strategic unity, innovation, and efficiency. Such companies typically excel in only three of these capabilities while maintaining industry parity in the other areas. Organizations that fall below the norm in any of the 11 are likely candidates for dysfunction and competitive disadvantage. So you can determine how your company fares in these categories (or others, if the generic list doesn't suit your needs), the authors explain how to conduct a "capabilities audit," describing in particular the experiences and findings of two companies that recently performed such audits. In addition to highlighting which intangible assets are most important given the organization's history and strategy, this exercise will gauge how well your company delivers on its capabilities and will guide you in developing an action plan for improvement. A capabilities audit can work for an entire organization, a business unit, or a region--indeed, for any part of a company that has a strategy to generate financial or customer-related results. It enables executives to assess overall company strengths and weaknesses, senior leaders to define strategy, midlevel managers to execute strategy, and frontline leaders to achieve tactical results. In short, it helps turn intangible assets into concrete strengths.

  15. Recent Developments and Probable Future Scenarios Concerning Seafarer Labour Markets

    DEFF Research Database (Denmark)

    Wagtmann, Maria Anne; Poulsen, René Taudal

    2009-01-01

      During the past 25 years, demand for seafarers has changed a great deal, due to the creation of second registers in Western Europe as well as ship register adjustments in other flag states. Concurrently, supply patterns have shifted, with new supply centres emerging in especially Asia and Easte...

  16. NJOY 99/2001: new capabilities in data processing

    International Nuclear Information System (INIS)

    MacFarlane, Robert E.

    2002-01-01

The NJOY Nuclear Data Processing System is used all over the world to process evaluated nuclear data in the ENDF format into libraries for applications. Over the last few years, a number of new capabilities have been added to the system to provide advanced features for MCNP, MCNPX, and other applications codes. These include probability tables for self-shielding in the unresolved resonance range, capabilities optimized for high-energy libraries (typically to 150 MeV for accelerator applications), options for detailed treatments of incident and outgoing charged particles, and a capability to handle photonuclear reactions. These new features and recent experience using NJOY99 for library production will be discussed, along with possible future work, such as delayed-neutron processing and capabilities to handle the new generation of photo-atomic, electro-atomic, and atomic-relaxation evaluations now becoming available in ENDF format. The latest version of the code, NJOY 2001, uses modern Fortran 90 style, modularization, and memory allocation methods. The Evaluated Nuclear Data Files (ENDF) format has become the standard for representing nuclear data throughout the world, being used in the US ENDF/B libraries, the European JEF libraries, the Japanese JENDL libraries, and many others. At the same time, the NJOY Nuclear Data Processing System, which is used to convert evaluated nuclear data in the ENDF format into data libraries for nuclear applications, has become the method of choice throughout the world. The combination of these modern libraries of evaluated nuclear data and NJOY processing has proved very capable for classical applications in reactor analysis, fusion work, shielding, and criticality safety. However, over the last few years, new applications have appeared that require extended evaluated data and new processing techniques. A good example of this is the interest in accelerator-boosted applications, which has led to the need for data to higher energies, such as 150 Me
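
A minimal sketch of how a probability table is consumed downstream (for example by a Monte Carlo transport code): in the unresolved resonance range, the cross section at a given energy is sampled from discrete bands with band probabilities. The table values below are invented for illustration and are not NJOY input or output:

```python
# Sketch: sample a cross section from a probability table (band, probability).
import random

bands = [
    (0.20, 8.0),    # (band probability, cross section in barns) -- invented
    (0.50, 12.0),
    (0.30, 25.0),
]

def sample_xs() -> float:
    r, cum = random.random(), 0.0
    for prob, xs in bands:
        cum += prob
        if r < cum:
            return xs
    return bands[-1][1]

samples = [sample_xs() for _ in range(100_000)]
print(sum(samples) / len(samples))   # approaches the infinitely-dilute average
```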

  17. Probability inequalities for decomposition integrals

    Czech Academy of Sciences Publication Activity Database

    Agahi, H.; Mesiar, Radko

    2017-01-01

    Roč. 315, č. 1 (2017), s. 240-248 ISSN 0377-0427 Institutional support: RVO:67985556 Keywords : Decomposition integral * Superdecomposition integral * Probability inequalities Subject RIV: BA - General Mathematics OBOR OECD: Statistics and probability Impact factor: 1.357, year: 2016 http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf

  18. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...

  19. Atmospheric Release Advisory Capability

    International Nuclear Information System (INIS)

    Dickerson, M.H.; Gudiksen, P.H.; Sullivan, T.J.

    1983-02-01

The Atmospheric Release Advisory Capability (ARAC) project is a Department of Energy (DOE) sponsored real-time emergency response service available for use by both federal and state agencies in case of a potential or actual atmospheric release of nuclear material. The project, initiated in 1972, is currently evolving from the research and development phase to full operation. Plans are underway to expand the existing capability to continuous operation by 1984 and to establish a National ARAC Center (NARAC) by 1988. This report describes the ARAC system, its utilization during the past two years, and plans for its expansion during the next five to six years. An integral part of this expansion is a crucial effort sponsored by the Defense Nuclear Agency to extend the ARAC service to approximately 45 Department of Defense (DOD) sites throughout the continental US over the next three years

  20. Invariant probabilities of transition functions

    CERN Document Server

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  1. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

Discrete Probability: The Cast of Characters; Properties of Probability; Simulation; Random Sampling; Conditional Probability; Independence. Discrete Distributions: Discrete Random Variables, Distributions, and Expectations; Bernoulli and Binomial Random Variables; Geometric and Negative Binomial Random Variables; Poisson Distribution; Joint, Marginal, and Conditional Distributions; More on Expectation. Continuous Probability: From the Finite to the (Very) Infinite; Continuous Random Variables and Distributions; Continuous Expectation. Continuous Distributions: The Normal Distribution; Bivariate Normal Distribution; New Random Variables from Old; Order Statistics; Gamma Distributions; Chi-Square, Student's t, and F-Distributions; Transformations of Normal Random Variables. Asymptotic Theory: Strong and Weak Laws of Large Numbers; Central Limit Theorem. Stochastic Processes and Applications: Markov Chains; Poisson Processes; Queues; Brownian Motion; Financial Mathematics. Appendix: Introduction to Mathematica; Glossary of Mathematica Commands for Probability; Short Answers...

  2. Group Capability Model

    Science.gov (United States)

    Olejarski, Michael; Appleton, Amy; Deltorchio, Stephen

    2009-01-01

The Group Capability Model (GCM) is a software tool that allows an organization, from first-line management to senior executive, to monitor and track the health (capability) of various groups in performing their contractual obligations. GCM calculates a Group Capability Index (GCI) by comparing actual head counts, certifications, and/or skills within a group. The model can also be used to simulate the effects of employee usage, training, and attrition on the GCI. A universal tool and common method was required due to the high risk of losing skills necessary to complete the Space Shuttle Program and meet the needs of the Constellation Program. During this transition from one space vehicle to another, the uncertainty among the critical skilled workforce is high and attrition has the potential to be unmanageable. GCM allows managers to establish requirements for their group in the form of head counts, certification requirements, or skills requirements. GCM then calculates a Group Capability Index (GCI), where a score of 1 indicates that the group is at the appropriate level; anything less than 1 indicates a potential for improvement. This shows the health of a group, both currently and over time. GCM accepts as input head count, certification needs, critical needs, competency needs, and competency critical needs. In addition, team members are categorized by years of experience, percentage of contribution, ex-members and their skills, availability, function, and in-work requirements. Outputs are several reports, including actual vs. required head count, actual vs. required certificates, GCI change over time (by month), and more. The program stores historical data for summary and historical reporting, which is done via an Excel spreadsheet that is color-coded to show health statistics at a glance. GCM has provided the Shuttle Ground Processing team with a quantifiable, repeatable approach to assessing and managing the skills in their organization. They now have a common
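
A minimal sketch of a GCI-style calculation, assuming equal weighting of the requirement categories; the abstract does not specify GCM's actual formula, so this is an illustration of the idea only:

```python
# Sketch: a Group Capability Index as the mean of capped actual/required
# ratios; 1.0 means the group is at the appropriate level.
def gci(actual: dict, required: dict) -> float:
    ratios = [min(actual[k] / required[k], 1.0)
              for k in required if required[k] > 0]
    return sum(ratios) / len(ratios)

required = {"head_count": 40, "certifications": 25, "critical_skills": 10}
actual   = {"head_count": 36, "certifications": 25, "critical_skills": 7}
print(f"GCI = {gci(actual, required):.2f}")   # < 1 flags room for improvement
```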

  3. Expeditionary Rubber Removal Capability

    Science.gov (United States)

    2006-12-31

[Only report front matter and fragments are recoverable from this record: a table of contents (Discussion; Conclusions; Recommendations; Appendix A, Detailed List of Equipment and Modifications; Appendix B, List of Sources); recommendation fragments that the modified spray unit, or a system with equivalent capabilities, be used and that a pressure sensor or caster wheels be incorporated; and equipment data for the carrier vehicle: weight 4,820 lb (no attachments), top speed 18 mph, optional high-flow hydraulics at 26 gpm, all-wheel steering, cargo max load.]

  4. Atmospheric release advisory capability

    International Nuclear Information System (INIS)

    Sullivan, T.J.

    1981-01-01

The ARAC system (Atmospheric Release Advisory Capability) is described. The system is a collection of people, computers, computer models, topographic data and meteorological input data that together permit a quasi-predictive calculation of where effluent from an accident will migrate through the atmosphere, where it will be deposited on the ground, and what instantaneous and integrated dose an exposed individual would receive

  5. Linear positivity and virtual probability

    International Nuclear Information System (INIS)

    Hartle, James B.

    2004-01-01

    We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. The objective of any quantum theory of a closed system, most generally the universe, is the prediction of probabilities for the individual members of sets of alternative coarse-grained histories of the system. Quantum interference between members of a set of alternative histories is an obstacle to assigning probabilities that are consistent with the rules of probability theory. A quantum theory of closed systems therefore requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to include values outside the range of 0-1 is described. Alternatives with such virtual probabilities cannot be measured or recorded, but can be used in the intermediate steps of calculations of real probabilities. Extended probabilities give a simple and general way of formulating quantum theory. The various decoherence conditions are compared in terms of their utility for characterizing classicality and the role they might play in further generalizations of quantum mechanics
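
As usually stated (following Goldstein and Page), the linear positivity condition assigns to each history $\alpha$, represented by a time-ordered chain of projections $C_\alpha = P^{n}_{\alpha_n}(t_n)\cdots P^{1}_{\alpha_1}(t_1)$, the candidate probability

$$
p(\alpha) = \mathrm{Re}\,\mathrm{Tr}\big[C_\alpha\,\rho\big],
$$

and requires $p(\alpha) \ge 0$ for every history in the set. Medium decoherence demands the stronger condition $\mathrm{Tr}\big[C_\alpha\,\rho\,C_{\alpha'}^{\dagger}\big] \approx 0$ for $\alpha \neq \alpha'$, which is why linearly positive sets of histories need not be decoherent.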

  6. Estimation of probability of failure for damage-tolerant aerospace structures

    Science.gov (United States)

    Halbert, Keith

The majority of aircraft structures are designed to be damage-tolerant such that safe operation can continue in the presence of minor damage. It is necessary to schedule inspections so that minor damage can be found and repaired. It is generally not possible to perform structural inspections prior to every flight. The scheduling is traditionally accomplished through a deterministic set of methods referred to as Damage Tolerance Analysis (DTA). DTA has proven to produce safe aircraft but does not provide estimates of the probability of failure of future flights or the probability of repair at future inspections. Without these estimates maintenance costs cannot be accurately predicted. Also, estimation of failure probabilities is now a regulatory requirement for some aircraft. The set of methods concerned with the probabilistic formulation of this problem is collectively referred to as Probabilistic Damage Tolerance Analysis (PDTA). The goal of PDTA is to control the failure probability while holding maintenance costs to a reasonable level. This work focuses specifically on PDTA for fatigue cracking of metallic aircraft structures. The growth of a crack (or cracks) must be modeled using all available data and engineering knowledge. The length of a crack can be assessed only indirectly through evidence such as non-destructive inspection results, failures or lack of failures, and the observed severity of usage of the structure. The current set of industry PDTA tools is lacking in several ways: they may in some cases yield poor estimates of failure probabilities, they cannot realistically represent the variety of possible failure and maintenance scenarios, and they do not allow for model updates which incorporate observed evidence. A PDTA modeling methodology must be flexible enough to estimate accurately the failure and repair probabilities under a variety of maintenance scenarios, and be capable of incorporating observed evidence as it becomes available. This
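
A minimal Monte Carlo sketch of the kind of PDTA calculation described here, assuming a toy geometric crack-growth law (a stand-in for a Paris-law integration), a logistic probability-of-detection curve, and invented parameter values throughout:

```python
# Sketch: estimate probability of failure under scheduled inspections by
# propagating uncertain initial crack sizes through a simplified growth law.
import math, random

CRIT_A = 25.0          # critical crack length, mm (illustrative)
GROWTH = 1.08          # per-period geometric growth factor (toy growth law)
INSPECT_EVERY = 10     # inspection interval, periods

def pod(a: float) -> float:
    """Probability of detecting a crack of length a (logistic assumption)."""
    return 1.0 / (1.0 + math.exp(-(a - 5.0)))

def one_life(periods: int = 60) -> bool:
    a = random.lognormvariate(-1.0, 0.5)          # uncertain initial crack, mm
    for t in range(1, periods + 1):
        a *= GROWTH
        if a >= CRIT_A:
            return True                           # structural failure
        if t % INSPECT_EVERY == 0 and random.random() < pod(a):
            a = random.lognormvariate(-1.0, 0.5)  # detected -> repaired
    return False

n = 100_000
pf = sum(one_life() for _ in range(n)) / n
print(f"estimated probability of failure: {pf:.4f}")
```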

  7. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

Summary. Background: Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives: The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods: Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results: Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions: Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
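
A minimal sketch of the probability-machine idea: a regression forest fit to 0/1 responses estimates P(Y=1|X) directly. The paper supplies R code; scikit-learn is used here as an equivalent, which is an assumption of convenience, and the data are synthetic:

```python
# Sketch: a regression random forest as a probability machine for binary y.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
X = rng.normal(size=(2000, 3))
p_true = 1 / (1 + np.exp(-(1.5 * X[:, 0] - X[:, 1])))   # true P(Y=1|X)
y = rng.binomial(1, p_true)                              # binary response

rf = RandomForestRegressor(n_estimators=200, min_samples_leaf=25,
                           random_state=0)
rf.fit(X, y)
p_hat = rf.predict(X)       # predictions land in [0, 1]: probability estimates
print(np.round(np.corrcoef(p_true, p_hat)[0, 1], 2))
```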

  8. Probable Inference and Quantum Mechanics

    International Nuclear Information System (INIS)

    Grandy, W. T. Jr.

    2009-01-01

    In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.

  9. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
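
A minimal Monte Carlo sketch of the headline effect for a log-normal loss (worked on the log scale, with invented sample size): a threshold set from estimated parameters to achieve a nominal 1% failure probability is in fact exceeded more often than 1% of the time:

```python
# Sketch: parameter uncertainty inflates the realized failure frequency
# above the nominal target when the threshold uses plug-in estimates.
import numpy as np

rng = np.random.default_rng(2)
mu, sigma, n, target = 0.0, 1.0, 20, 0.01    # true model, small sample, target
trials = 50_000
z99 = 2.3263478740408408                      # 99th percentile of N(0, 1)

failures = 0
for _ in range(trials):
    sample = rng.normal(mu, sigma, n)         # log-losses from the true model
    m, s = sample.mean(), sample.std(ddof=1)
    threshold = m + z99 * s                   # plug-in 99% quantile estimate
    failures += rng.normal(mu, sigma) > threshold   # next-period realization

print(f"realized failure frequency: {failures / trials:.4f} "
      f"(nominal {target})")                  # comes out above 0.01
```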

  10. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  11. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application

  12. The evolution of alliance capabilities

    NARCIS (Netherlands)

    Heimeriks, K.H.; Duysters, G.M.; Vanhaverbeke, W.P.M.

    2004-01-01

    This paper assesses the effectiveness and differential performance effects of learning mechanisms on the evolution of alliance capabilities. Relying on the concept of capability lifecycles, prior research has suggested that different capability levels could be identified in which different

  13. Childcare, Children and Capability

    Science.gov (United States)

    Wright, Hazel R.

    2012-01-01

    Empirical research focused on women studying childcare in an English further education college found the participants strangely contented despite demanding lifestyles. They were intent on integrating their family, work and educational commitments rather than actively seeking future gain, an understanding that led to the development of an original…

  14. Future climate

    International Nuclear Information System (INIS)

    La Croce, A.

    1991-01-01

According to George Woodwell, founder of the Woods Hole Research Center, due to the combustion of fossil fuels, deforestation and accelerated respiration, the net annual addition of carbon, in the form of carbon dioxide, to the 750 billion tonnes already present in the earth's atmosphere is on the order of 3 to 5 billion tonnes. Around the world, scientists investigating the probable effects of this increase on the earth's future climate are now formulating coupled atmosphere-ocean models that take account of the temperature- and salinity-dependent carbon dioxide exchange mechanisms acting between the atmosphere and deep layers of ocean waters

  15. A probability of synthesis of the superheavy element Z = 124

    Energy Technology Data Exchange (ETDEWEB)

    Manjunatha, H.C. [Government College for Women, Department of Physics, Kolar, Karnataka (India); Sridhar, K.N. [Government First Grade College, Department of Physics, Kolar, Karnataka (India)

    2017-10-15

We have studied the fusion cross section, evaporation residue cross section, compound nucleus formation probability (P_CN) and survival probability (P_sur) of different projectile-target combinations to synthesize the superheavy element Z=124, and have thereby identified the most probable projectile-target combination. To synthesize the superheavy element Z=124, the most probable projectile-target combinations are Kr+Ra, Ni+Cm, Se+Th, Ge+U and Zn+Pu. We hope that our predictions may serve as a guide for future experiments on the synthesis of superheavy nuclei with Z = 124. (orig.)
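
For context, studies of this kind typically factor the evaporation residue cross section into a capture term and the two probabilities named in the abstract; a common schematic form (the authors' exact expression may differ) is

$$
\sigma_{\mathrm{ER}}(E) = \sigma_{\mathrm{capture}}(E)\; P_{\mathrm{CN}}(E)\; P_{\mathrm{sur}}(E),
$$

so a favourable projectile-target combination is one that keeps both $P_{\mathrm{CN}}$ (the probability of forming a compound nucleus after capture) and $P_{\mathrm{sur}}$ (the probability that the compound nucleus survives fission during de-excitation) as large as possible.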

  16. Building Server Capabilities

    DEFF Research Database (Denmark)

    Adeyemi, Oluseyi

    2013-01-01

Many western companies have moved part of their operations to China in order to take advantage of cheap resources and/or to gain access to a high-potential market. Depending on motive, offshore facilities usually start either as “sales-only”, selling products exported by headquarters, or as “production-only”, exporting parts and components back to headquarters for sale in the home country. In the course of time, the role of offshore subsidiaries in a company’s operations network tends to change and, with that, the capabilities of the subsidiaries. Focusing on Danish subsidiaries in China, the objective...

  18. Logic, probability, and human reasoning.

    Science.gov (United States)

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  20. Introduction to probability and measure

    CERN Document Server

    Parthasarathy, K R

    2005-01-01

    According to a remark attributed to Mark Kac 'Probability Theory is a measure theory with a soul'. This book with its choice of proofs, remarks, examples and exercises has been prepared taking both these aesthetic and practical aspects into account.

  1. Laboratory microfusion capability study

    International Nuclear Information System (INIS)

    1993-05-01

The purpose of this study is to elucidate the issues involved in developing a Laboratory Microfusion Capability (LMC), which is the major objective of the Inertial Confinement Fusion (ICF) program within the purview of the Department of Energy's Defense Programs. The study was initiated to support a number of DOE management needs: to provide insight for the evolution of the ICF program; to afford guidance to the ICF laboratories in planning their research and development programs; to inform Congress and others of the details and implications of the LMC; to identify criteria for selection of a concept for the Laboratory Microfusion Facility; and to develop a coordinated plan for the realization of an LMC. As originally proposed, the LMC study was divided into two phases. The first phase identifies the purpose and potential utility of the LMC, the regime of its performance parameters, driver-independent design issues and requirements, its development goals and requirements, and associated technical, management, staffing, environmental, and other developmental and operational issues. The second phase addresses driver-dependent issues such as specific design, range of performance capabilities, and cost. The study includes four driver options: the neodymium-glass solid state laser, the krypton fluoride excimer gas laser, the light-ion accelerator, and the heavy-ion induction linear accelerator. The results of the Phase II study are described in the present report

  2. Joint probabilities and quantum cognition

    International Nuclear Information System (INIS)

    Acacio de Barros, J.

    2012-01-01

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  3. Joint probabilities and quantum cognition

    Energy Technology Data Exchange (ETDEWEB)

    Acacio de Barros, J. [Liberal Studies, 1600 Holloway Ave., San Francisco State University, San Francisco, CA 94132 (United States)

    2012-12-18

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  4. Default probabilities and default correlations

    OpenAIRE

    Erlenmaier, Ulrich; Gersbach, Hans

    2001-01-01

Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations between loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...

  5. The Probabilities of Unique Events

    Science.gov (United States)

    2012-08-30

Max Lotstein and Phil Johnson-Laird, Department of Psychology, Princeton University, Princeton, NJ, USA; August 30th 2012. [Only fragments of the abstract survive in this record:] ...social justice and also participated in antinuclear demonstrations. The participants ranked the probability that Linda is a feminist bank teller as... retorted that such a flagrant violation of the probability calculus was a result of a psychological experiment that obscured the rationality of the

  6. Probability Matching, Fast and Slow

    OpenAIRE

    Koehler, Derek J.; James, Greta

    2014-01-01

    A prominent point of contention among researchers regarding the interpretation of probability-matching behavior is whether it represents a cognitively sophisticated, adaptive response to the inherent uncertainty of the tasks or settings in which it is observed, or whether instead it represents a fundamental shortcoming in the heuristics that support and guide human decision making. Put crudely, researchers disagree on whether probability matching is "smart" or "dumb." Here, we consider eviden...

  7. The collision probability modules of WIMS-E

    International Nuclear Information System (INIS)

    Roth, M.J.

    1985-04-01

This report describes how flat-source first-flight collision probabilities are calculated and used in the WIMS-E modular program. It includes a description of the input to the modules W-FLU, W-THES, W-PIP, W-PERS and W-MERGE. Input to other collision probability modules is described in separate reports. WIMS-E is capable of calculating collision probabilities in a wide variety of geometries, some of them quite complicated. It can also use them for a variety of purposes. (author)
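
For context, the flat-source first-flight collision probability from region $i$ to region $j$ is commonly written (up to the code's own normalization conventions) as

$$
P_{ij} = \frac{1}{4\pi V_i}\int_{V_i}\! d\mathbf{r}\int_{V_j}\! d\mathbf{r}'\;
\Sigma_t(\mathbf{r}')\,\frac{e^{-\tau(\mathbf{r},\mathbf{r}')}}{\lvert \mathbf{r}-\mathbf{r}'\rvert^{2}},
$$

that is, the probability that a neutron born uniformly and isotropically in region $i$ suffers its first collision in region $j$, where $\tau(\mathbf{r},\mathbf{r}')$ is the optical path between the two points.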

  8. Using Porterian Activity Analysis to Understand Organizational Capabilities

    DEFF Research Database (Denmark)

    Sheehan, Norman T.; Foss, Nicolai Juul

    2017-01-01

...conceptualized by Porter’s writings on the activity-based view. Porterian activity analysis is becoming more accepted in the strategy literature, but no strategy scholar has explicitly used Porter’s activities, and particularly his concept of drivers, to understand and analyze organizational capabilities. ... Introducing Porterian activities into the discussion of capabilities improves strategy scholars’ understanding of the bases of capability heterogeneity, offers academics future directions for research, and provides managers with guidance to enhance their organizations’ capabilities...

  9. Analytical Chemistry Core Capability Assessment - Preliminary Report

    International Nuclear Information System (INIS)

    Barr, Mary E.; Farish, Thomas J.

    2012-01-01

    useful in defining a roadmap for what future capability needs to look like.

  10. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
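
A minimal simulation sketch of such interval-augmented plots; the paper constructs exact simultaneous intervals, whereas here pointwise envelopes of the order statistics are simulated (these would need widening to achieve exact joint 1-α coverage), and all values are illustrative:

```python
# Sketch: Monte Carlo envelopes for the ordered points of a normal
# probability plot under H0: the sample is N(0, 1).
import numpy as np

rng = np.random.default_rng(3)
n, alpha, reps = 30, 0.05, 20_000

sims = np.sort(rng.normal(size=(reps, n)), axis=1)   # simulated order stats
lo = np.quantile(sims, alpha / 2, axis=0)            # pointwise lower band
hi = np.quantile(sims, 1 - alpha / 2, axis=0)        # pointwise upper band
# (Widening these until joint coverage is 1 - alpha gives the simultaneous
#  version the paper derives analytically.)

x = np.sort(rng.normal(size=n))                      # observed sample, sorted
print("all plotted points within bands:", bool(np.all((x >= lo) & (x <= hi))))
```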

  11. Probability theory a foundational course

    CERN Document Server

    Pakshirajan, R P

    2013-01-01

This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the groundwork to later claim the existence of stochastic processes with prescribed finite-dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, the Erdos-Kac invariance principle, the functional central limit theorem and the functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a textbook for students pursuing graduate programs in Mathematics and/or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.

  12. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

The article deals with the probability analysis of a vibration isolation system for high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Assuming a Gaussian distribution, the author estimates the probability that the relative displacement of the isolated mass remains lower than the vibration criteria. This problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. According to this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated. Optimal system parameters - damping and natural frequency - are derived, such that the probability of exceeding vibration criteria VC-E and VC-D is less than 0.04.
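
A minimal sketch of the underlying probability computation, assuming a zero-mean Gaussian relative displacement with RMS value σ; the criterion and σ values below are illustrative, not the article's:

```python
# Sketch: P(|x| < c) for zero-mean Gaussian displacement x with std sigma,
# via the error function; 1 - P is the chance of exceeding the criterion.
import math

def p_within(criterion_um: float, sigma_um: float) -> float:
    """Probability that |displacement| stays below the criterion."""
    return math.erf(criterion_um / (sigma_um * math.sqrt(2)))

for sigma in (0.08, 0.10, 0.12):       # RMS displacement, micrometres
    p = p_within(0.25, sigma)          # criterion: 0.25 um (illustrative)
    print(f"sigma={sigma:.2f} um: P(exceed) = {1 - p:.4f}")
```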

  13. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  14. Exploration Medical Capability (ExMC) Projects

    Science.gov (United States)

    Wu, Jimmy; Watkins, Sharmila; Baumann, David

    2010-01-01

During missions to the Moon or Mars, the crew will need medical capabilities to diagnose and treat disease as well as to maintain their health. The Exploration Medical Capability Element develops medical technologies, medical informatics, and clinical capabilities for different levels of care during space missions. The work done by team members in this Element involves leading-edge technology, procedure, and pharmacological development. They develop data systems that protect patients' private medical information, aid in the diagnosis of medical conditions, and act as a repository of relevant NASA life sciences experimental studies. To minimize the medical risks to crew health, the physicians and scientists in this Element develop models to quantify the probability of medical events occurring during a mission. They define procedures to treat an ill or injured crew member who does not have access to an emergency room and who must be cared for in a microgravity environment where both liquids and solids behave differently than on Earth. To support the development of these medical capabilities, the Element manages the development of medical technologies that prevent, monitor, diagnose, and treat an ill or injured crewmember. The Exploration Medical Capability Element collaborates with the National Space Biomedical Research Institute (NSBRI), the Department of Defense, other Government-funded agencies, academic institutions, and industry.

  15. Model uncertainty: Probabilities for models?

    International Nuclear Information System (INIS)

    Winkler, R.L.

    1994-01-01

Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly-used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work that is underway to address these questions looks very promising
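
One concrete way to "treat model uncertainty in terms of probabilities", added here for context (the paper's preferred formulation may differ), is Bayesian model averaging:

$$
P(y \mid D) = \sum_{i} P(y \mid M_i, D)\, P(M_i \mid D),
\qquad
P(M_i \mid D) \propto P(D \mid M_i)\, P(M_i),
$$

where the posterior model weights $P(M_i \mid D)$ are exactly the "probabilities assigned to the models" whose interpretation the abstract questions.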

  16. Knowledge typology for imprecise probabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)

    2002-01-01

When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (e.g., Dempster-Shafer theory or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.
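
A minimal sketch of the Dempster-Shafer alternative mentioned above: belief and plausibility computed from a basic mass assignment over subsets of a frame of discernment. The masses below are invented for illustration:

```python
# Sketch: Dempster-Shafer belief and plausibility from a mass assignment.
mass = {
    frozenset({"works"}): 0.5,
    frozenset({"fails"}): 0.2,
    frozenset({"works", "fails"}): 0.3,   # mass on the whole frame = ignorance
}

def belief(A: frozenset) -> float:
    """Bel(A): total mass committed to subsets of A."""
    return sum(m for B, m in mass.items() if B <= A)

def plausibility(A: frozenset) -> float:
    """Pl(A): total mass not contradicting A."""
    return sum(m for B, m in mass.items() if B & A)

A = frozenset({"works"})
print(belief(A), plausibility(A))   # 0.5 0.8 -- an interval, not a point value
```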

  17. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2011-01-01

    A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d

  18. Statistical probability tables CALENDF program

    International Nuclear Information System (INIS)

    Ribon, P.

    1989-01-01

The purposes of the probability tables are to obtain a dense data representation and to calculate integrals by quadratures. They are mainly used in the USA for calculations by Monte Carlo and in the USSR and Europe for self-shielding calculations by the sub-group method. The moment probability tables, in addition to providing a more substantial mathematical basis and calculation methods, are adapted for condensation and mixture calculations, which are the crucial operations for reactor physics specialists. However, their extension is limited by the statistical hypothesis they imply. Efforts are being made to remove this obstacle, at the cost, it must be said, of greater complexity

  19. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit

  20. Probability and containment of turbine missiles

    International Nuclear Information System (INIS)

    Yeh, G.C.K.

    1976-01-01

    With the trend toward ever larger power generating plants with large high-speed turbines, an important plant design consideration is the potential for and consequences of mechanical failure of turbine rotors. Such rotor failure could result in high-velocity disc fragments (turbine missiles) perforating the turbine casing and jeopardizing vital plant systems. The designer must first estimate the probability of any turbine missile damaging any safety-related plant component for his turbine and his plant arrangement. If the probability is not low enough to be acceptable to the regulatory agency, he must design a shield to contain the postulated turbine missiles. Alternatively, the shield could be designed to retard (to reduce the velocity of) the missiles such that they would not damage any vital plant system. In this paper, some of the presently available references that can be used to evaluate the probability, containment and retardation of turbine missiles are reviewed; various alternative methods are compared; and subjects for future research are recommended. (Auth.)
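
For context, the turbine-missile literature this paper reviews commonly factors the annual probability of unacceptable damage as

$$
P_4 = P_1 \times P_2 \times P_3,
$$

with $P_1$ the probability of missile generation (rotor failure and casing perforation), $P_2$ the probability that a generated missile strikes a safety-related component, and $P_3$ the probability that the strike produces unacceptable damage; exact definitions vary among the references reviewed.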

  1. Production capability and supply

    International Nuclear Information System (INIS)

    Klemenic, J.

    1977-01-01

    The strong market for uranium of recent years is about to usher in a new era in domestic uranium production. The spot market price of uranium has remained relatively stable at a little over $40/lb for more than 18 months. Many of the recent contracts for delivery in the early 1980s are calling for prices in the range of $40 to $65 per lb in year-of-delivery dollars. Low-grade, high-cost projects, such as uranium recovery from mill tailings and the reopening of "mined-out" ore bodies, have already been initiated. New underground mines to produce at greater depths, and new surface mines to recover lower grade ores, are being developed or seriously planned. In keeping with this movement to recover uranium from low-grade ore and other high cost materials, the Grand Junction Office has examined, for the first time, the production capability of the domestic industry assuming a $30/lb (or less) "forward cost" resource base. As in the past, keep in mind that the market price needed to stimulate full production of a given resource base may be significantly higher than the estimated forward cost of producing that resource. Results of the $30/lb study are presented

  2. LHC Capabilities for Quarkonia

    CERN Document Server

    Petrushanko, Sergey

    2008-01-01

    The measurement of the charmonium and bottomonium resonances in nucleus-nucleus collisions provides crucial information on high-density QCD matter. First, the suppression of quarkonia production is generally agreed to be one of the most direct probes of quark-gluon plasma formation. The observation of anomalous J/ψ suppression at the CERN-SPS and at RHIC is well established but the clarification of some important remaining questions requires equivalent studies of the Υ family, only possible at the LHC energies. Second, the production of heavy-quarks proceeds mainly via gluon-gluon fusion processes and, as such, is sensitive to saturation of the gluon density at low-x in the nucleus. Measured departures from the expected vacuum quarkonia cross-sections in Pb+Pb collisions at the LHC will thus provide valuable information not only on the thermodynamical state of the produced partonic medium, but also on the initial-state modifications of the nuclear parton distribution functions. The capabilities ...

  3. Mobile systems capability plan

    International Nuclear Information System (INIS)

    1996-09-01

    This plan was prepared to initiate contracting for, and deployment of, the mobile system services. 102,000 cubic meters of retrievable, contact-handled TRU waste are stored at many sites around the country. Also, an estimated 38,000 cubic meters of TRU waste will be generated in the course of waste inventory workoff and continuing DOE operations. All the defense TRU waste is destined for disposal in WIPP near Carlsbad, NM. To ship TRU waste there, sites must first certify that the waste meets WIPP waste acceptance criteria. The waste must be characterized and, if not acceptable, subjected to additional processing, including repackaging. Most sites plan to use existing fixed facilities or open new ones between FY1997 and FY2006 to perform these functions; small-quantity sites lack this capability. An alternative to fixed facilities is the use of mobile systems mounted on trailers or skids and transported to sites. Mobile systems will be used for all characterization and certification at small sites; large sites can also use them. The Carlsbad Area Office plans to pursue a strategy of privatization of mobile system services, since this offers a number of advantages. To indicate the possible magnitude of the costs of deploying mobile systems, preliminary estimates of equipment, maintenance, and operating costs over a 10-year period were prepared, and options for purchase, lease, and privatization through fixed-price contracts were considered

  4. Strength capability while kneeling.

    Science.gov (United States)

    Haslegrave, C M; Tracy, M F; Corlett, E N

    1997-12-01

    Work sometimes has to be carried out kneeling, particularly where jobs are performed in confined spaces as is common for miners, aircraft baggage handlers and maintenance workers. In order to assess the risks in performing forceful tasks under such conditions, data is needed on strength capabilities of kneeling subjects. A study was undertaken to measure isometric strength in single-handed exertions for male subjects and to investigate the effects on this of task layout factors (direction of force exertion, reach distance, height of the workpiece and orientation relative to the subject's sagittal plane). The data has been tabulated to show the degree to which strength may be reduced in different situations and analysis of the task factors showed their influence to be complex with direction of exertion and reach distance having the greatest effect. The results also suggest that exertions are weaker when subjects are kneeling on two knees than when kneeling on one knee, although this needs to be confirmed by direct experimental comparison.

  5. Thermal disadvantage factor calculation by the multiregion collision probability method

    International Nuclear Information System (INIS)

    Ozgener, B.; Ozgener, H.A.

    2004-01-01

    A multi-region collision probability formulation that is capable of applying white boundary condition directly is presented and applied to thermal neutron transport problems. The disadvantage factors computed are compared with their counterparts calculated by S_N methods with both direct and indirect application of white boundary condition. The results of the ABH and collision probability method with indirect application of white boundary condition are also considered and comparisons with benchmark Monte Carlo results are carried out. The studies show that the proposed formulation is capable of calculating thermal disadvantage factor with sufficient accuracy without resorting to the fictitious scattering outer shell approximation associated with the indirect application of the white boundary condition in collision probability solutions

  6. Elastic K-means using posterior probability.

    Science.gov (United States)

    Zheng, Aihua; Jiang, Bo; Li, Yan; Zhang, Xuehan; Ding, Chris

    2017-01-01

    The widely used K-means clustering is a hard clustering algorithm. Here we propose an Elastic K-means clustering model (EKM) using posterior probability, with a soft-assignment capability in which each data point can belong to multiple clusters fractionally, and show the benefit of the proposed Elastic K-means. Furthermore, in many applications, besides vector attribute information, pairwise relations (graph information) are also available. Thus we integrate EKM with Normalized Cut graph clustering into a single clustering formulation. Finally, we provide several matrix inequalities which are useful for matrix formulations of learning models. Based on these results, we prove the correctness and the convergence of the EKM algorithms. Experimental results on six benchmark datasets demonstrate the effectiveness of the proposed EKM and its integrated model.
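
    A generic soft-assignment K-means in the spirit of posterior-probability weighting is sketched below; it is a plain soft K-means, not the authors' EKM formulation or its Normalized Cut extension, and the two-blob data are synthetic.

        import numpy as np

        def soft_kmeans(X, k, beta=2.0, iters=100, seed=0):
            """Soft K-means: each point belongs to every cluster fractionally,
            with responsibilities proportional to exp(-beta * squared distance)."""
            rng = np.random.default_rng(seed)
            centers = X[rng.choice(len(X), size=k, replace=False)]
            for _ in range(iters):
                d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
                r = np.exp(-beta * d2)
                r /= r.sum(axis=1, keepdims=True)        # posterior-like weights
                centers = (r.T @ X) / r.sum(axis=0)[:, None]
            return centers, r

        rng = np.random.default_rng(1)
        X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(4, 1, (50, 2))])
        centers, resp = soft_kmeans(X, k=2)
        print(centers)      # two cluster centers
        print(resp[0])      # fractional memberships of the first point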

  7. Convergence of Transition Probability Matrix in CLV-Markov Models

    Science.gov (United States)

    Permana, D.; Pasaribu, U. S.; Indratno, S. W.; Suprayogi, S.

    2018-04-01

    A transition probability matrix is an arrangement of the transition probabilities from one state to another in a Markov chain model (MCM). One interesting aspect of an MCM is its long-run behavior, which is derived from the n-step transition probability matrix: the convergence of the n-step transition matrix as n tends to infinity. Mathematically, this means finding the limit of the transition matrix raised to the power n as n goes to infinity. The convergence form of the transition probability matrix is of particular interest, as it brings the matrix to its stationary form, which is useful for predicting the probabilities of transitions between states in the future. The method usually used to find the convergence of a transition probability matrix is the limiting-distribution process. In this paper, the convergence is instead obtained with a simple concept from linear algebra: diagonalizing the matrix. This method has a higher level of complexity because the matrix must first be diagonalized, but it has the advantage of yielding a general form for the nth power of the transition probability matrix, which shows the transition matrix before stationarity. Example cases are taken from a CLV model using an MCM, the CLV-Markov model. Several transition probability matrices are examined to find their convergence forms. The result is that the convergence of the transition probability matrix obtained through diagonalization agrees with the convergence obtained with the commonly used limiting-distribution method.
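
    The diagonalization route described in the abstract can be reproduced directly with numpy; the 3-state transition matrix below is invented for illustration, not taken from the CLV data.

        import numpy as np

        # Hypothetical 3-state transition matrix (rows sum to 1).
        P = np.array([[0.7, 0.2, 0.1],
                      [0.3, 0.5, 0.2],
                      [0.2, 0.3, 0.5]])

        # Diagonalize: P = V diag(lam) V^-1, hence P^n = V diag(lam^n) V^-1.
        lam, V = np.linalg.eig(P)
        V_inv = np.linalg.inv(V)

        def P_power(n):
            return (V @ np.diag(lam ** n) @ V_inv).real

        print(P_power(50))   # each row approaches the stationary distribution
        print(P_power(51))   # unchanged to numerical precision: convergence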

  8. Dynamic SEP event probability forecasts

    Science.gov (United States)

    Kahler, S. W.; Ling, A.

    2015-10-01

    The forecasting of solar energetic particle (SEP) event probabilities at Earth has been based primarily on the estimates of magnetic free energy in active regions and on the observations of peak fluxes and fluences of large (≥ M2) solar X-ray flares. These forecasts are typically issued for the next 24 h or with no definite expiration time, which can be deficient for time-critical operations when no SEP event appears following a large X-ray flare. It is therefore important to decrease the event probability forecast with time as a SEP event fails to appear. We use the NOAA listing of major (≥10 pfu) SEP events from 1976 to 2014 to plot the delay times from X-ray peaks to SEP threshold onsets as a function of solar source longitude. An algorithm is derived to decrease the SEP event probabilities with time when no event is observed to reach the 10 pfu threshold. In addition, we use known SEP event size distributions to modify probability forecasts when SEP intensity increases occur below the 10 pfu event threshold. An algorithm to provide a dynamic SEP event forecast, Pd, for both situations of SEP intensities following a large flare is derived.
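
    The way an event probability should decay as time passes without an event can be illustrated with a simple Bayesian update; the initial probability and the exponential delay-time model below are invented stand-ins, not the NOAA-derived algorithm of the paper.

        import math

        def dynamic_forecast(p0, t_hours, median_delay_h=6.0):
            """Posterior probability that an SEP event is still to come, given
            that none has been observed t_hours after the flare. Delay times
            for events that do occur are modeled (illustratively) as
            exponential with the given median."""
            rate = math.log(2) / median_delay_h
            survival = math.exp(-rate * t_hours)   # P(delay > t | event occurs)
            return p0 * survival / (p0 * survival + (1.0 - p0))

        p0 = 0.40   # hypothetical initial forecast issued right after the flare
        for t in (0, 6, 12, 24):
            print(f"t = {t:2d} h -> Pd = {dynamic_forecast(p0, t):.3f}")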

  9. Conditional Independence in Applied Probability.

    Science.gov (United States)

    Pfeiffer, Paul E.

    This material assumes the user has the background provided by a good undergraduate course in applied probability. It is felt that introductory courses in calculus, linear algebra, and perhaps some differential equations should provide the requisite experience and proficiency with mathematical concepts, notation, and argument. The document is…

  10. Stretching Probability Explorations with Geoboards

    Science.gov (United States)

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  11. GPS: Geometry, Probability, and Statistics

    Science.gov (United States)

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  12. Swedish earthquakes and acceleration probabilities

    International Nuclear Information System (INIS)

    Slunga, R.

    1979-03-01

    A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave-propagation computations, the relations between the seismic strength of the earthquake, focal depth, distance and ground acceleration are calculated. We found that most Swedish earthquakes are shallow; the largest earthquake of the area, the 1904 earthquake 100 km south of Oslo, is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10^-5 has been proposed as interesting. This probability gives ground accelerations in the range 5-20 % g for the sites. This acceleration is for a free bedrock site; for consistency, all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we get an epicentral acceleration for this earthquake of 5-15 % g. The results above are based on an analysis of macroseismic data, as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with the existing distant instrumental data. (author)

  13. DECOFF Probabilities of Failed Operations

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    2015-01-01

    A statistical procedure of estimation of Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. Also safety factor influence is investigated. DECOFF statistical method is benchmarked against standard Alpha-factor...

  14. Risk estimation using probability machines

    Science.gov (United States)

    2014-01-01

    Background Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306
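
    A minimal "probability machine" in the sense described can be built with scikit-learn's random forest; the simulated logistic data-generating model below is a toy, not the paper's simulation design.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)
        n = 2000
        X = rng.normal(size=(n, 3))
        # Toy logistic data-generating model with a main effect and an interaction.
        logit = 1.0 * X[:, 0] - 0.5 * X[:, 1] + 0.8 * X[:, 0] * X[:, 2]
        y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

        # The forest estimates P(y=1 | x) nonparametrically: no model-form
        # assumptions, only the predictors need to be specified.
        rf = RandomForestClassifier(n_estimators=500, min_samples_leaf=20,
                                    random_state=0).fit(X, y)
        print(rf.predict_proba(X[:5])[:, 1])   # conditional probability estimates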

  15. Probability and statistics: A reminder

    International Nuclear Information System (INIS)

    Clement, B.

    2013-01-01

    The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from 'data analysis in experimental sciences' given in [1]. (authors)

  16. Nash equilibrium with lower probabilities

    DEFF Research Database (Denmark)

    Groes, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1998-01-01

    We generalize the concept of Nash equilibrium in mixed strategies for strategic form games to allow for ambiguity in the players' expectations. In contrast to other contributions, we model ambiguity by means of so-called lower probability measures or belief functions, which makes it possible...

  17. On probability-possibility transformations

    Science.gov (United States)

    Klir, George J.; Parviz, Behzad

    1992-01-01

    Several probability-possibility transformations are compared in terms of the closeness of preserving second-order properties. The comparison is based on experimental results obtained by computer simulation. Two second-order properties are involved in this study: noninteraction of two distributions and projections of a joint distribution.

  18. Commercially Available Low Probability of Intercept Radars and Non-Cooperative ELINT Receiver Capabilities

    Science.gov (United States)

    2014-09-01

    Recoverable fragments of the source describe the ELT/750 ELINT receiver/processor (3D antenna gain 0 dB, azimuth accuracy 20°/quadrant), the Itata ELINT system developed by Desarrollo de Tecnologia y Sistemas (DTS) Ltd., described as a high-sensitivity electronic-intelligence system, and the Thales Smart-L 3D long-range surveillance radar (https://www.thalesgroup.com/en/worldwide/defence/smart-l-3d-long-range-surveillance-radar).

  19. The new MCNP6 depletion capability

    International Nuclear Information System (INIS)

    Fensin, M. L.; James, M. R.; Hendricks, J. S.; Goorley, J. T.

    2012-01-01

    The first MCNP based in-line Monte Carlo depletion capability was officially released from the Radiation Safety Information and Computational Center as MCNPX 2.6.0. Both the MCNP5 and MCNPX codes have historically provided a successful combinatorial geometry based, continuous energy, Monte Carlo radiation transport solution for advanced reactor modeling and simulation. However, due to separate development pathways, useful simulation capabilities were dispersed between both codes and not unified in a single technology. MCNP6, the next evolution in the MCNP suite of codes, now combines the capability of both simulation tools, as well as providing new advanced technology, in a single radiation transport code. We describe here the new capabilities of the MCNP6 depletion code dating from the official RSICC release MCNPX 2.6.0, reported previously, to the now current state of MCNP6. NEA/OECD benchmark results are also reported. The MCNP6 depletion capability enhancements beyond MCNPX 2.6.0 reported here include: (1) new performance enhancing parallel architecture that implements both shared and distributed memory constructs; (2) enhanced memory management that maximizes calculation fidelity; and (3) improved burnup physics for better nuclide prediction. MCNP6 depletion enables complete, relatively easy-to-use depletion calculations in a single Monte Carlo code. The enhancements described here help provide a powerful capability as well as dictate a path forward for future development to improve the usefulness of the technology. (authors)

  1. Traffic simulation based ship collision probability modeling

    Energy Technology Data Exchange (ETDEWEB)

    Goerlandt, Floris, E-mail: floris.goerlandt@tkk.f [Aalto University, School of Science and Technology, Department of Applied Mechanics, Marine Technology, P.O. Box 15300, FI-00076 AALTO, Espoo (Finland); Kujala, Pentti [Aalto University, School of Science and Technology, Department of Applied Mechanics, Marine Technology, P.O. Box 15300, FI-00076 AALTO, Espoo (Finland)

    2011-01-15

    Maritime traffic poses various risks in terms of human, environmental and economic loss. In a risk analysis of ship collisions, it is important to get a reasonable estimate for the probability of such accidents and the consequences they lead to. In this paper, a method is proposed to assess the probability of vessels colliding with each other. The method is capable of determining the expected number of accidents, the locations where and the time when they are most likely to occur, while providing input for models concerned with the expected consequences. At the basis of the collision detection algorithm lies an extensive time domain micro-simulation of vessel traffic in the given area. The Monte Carlo simulation technique is applied to obtain a meaningful prediction of the relevant factors of the collision events. Data obtained through the Automatic Identification System is analyzed in detail to obtain realistic input data for the traffic simulation: traffic routes, the number of vessels on each route, the ship departure times, main dimensions and sailing speed. The results obtained by the proposed method for the studied case of the Gulf of Finland are presented, showing reasonable agreement with registered accident and near-miss data.
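
    A toy Monte Carlo version of the collision-candidate idea: sample vessels on two crossing routes with random departure times and speeds, and count how often two vessels reach the intersection almost simultaneously. All traffic numbers below are invented for illustration; the paper's model uses AIS-derived routes, traffic volumes, and ship particulars.

        import random

        def crossing_conflicts(n_days=1000, ships_per_day=(30, 20),
                               leg_nm=(25.0, 40.0), speed_kn=(10.0, 18.0),
                               window_h=0.05, seed=1):
            """Expected daily number of collision candidates: pairs of vessels
            on different routes reaching the intersection within window_h
            hours of each other (a crude 'blind navigation' proxy)."""
            rng = random.Random(seed)
            conflicts = 0
            for _ in range(n_days):
                arrivals = []
                for route, n_ships in enumerate(ships_per_day):
                    for _ in range(n_ships):
                        dep = rng.uniform(0.0, 24.0)       # departure hour
                        v = rng.uniform(*speed_kn)         # speed in knots
                        arrivals.append((route, dep + leg_nm[route] / v))
                arrivals.sort(key=lambda a: a[1])
                for (r1, t1), (r2, t2) in zip(arrivals, arrivals[1:]):
                    if r1 != r2 and t2 - t1 < window_h:
                        conflicts += 1
            return conflicts / n_days

        print(f"collision candidates per day: {crossing_conflicts():.3f}")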

  2. Airlift Capabilities for Future U.S. Counterinsurgency Operations

    National Research Council Canada - National Science Library

    Owen, Robert C; Mueller, Karl P

    2007-01-01

    .... Department of Defense (DoD), and elsewhere in the U.S. Government. Although ongoing operations in Afghanistan and Iraq give particular immediacy to the problem, the challenge of combating insurgencies extends well beyond these specific conflicts...

  3. Defining an Approach for Future Close Air Support Capability

    Science.gov (United States)

    2017-01-01


  4. High temperature combustion facility: present capabilities and future prospects

    International Nuclear Information System (INIS)

    Boccio, J.L.; Ginsberg, T.; Ciccarelli, G.

    1995-01-01

    The high-temperature combustion facility constructed and operated by the Department of Advanced Technology of Brookhaven National Laboratory supports and promotes research on hydrogen combustion phenomena in mixtures prototypical of light-water reactor containment atmospheres under potential severe accident conditions. The facility can accommodate combustion research activities encompassing the fields of detonation physics, flame acceleration, and low-speed deflagration in a wide range of combustible gas mixtures at initial temperatures up to 700 K and post-combustion pressures up to 100 atmospheres. Some preliminary test results are presented that provide further evidence that the effect of temperature is to increase the sensitivity of hydrogen-air-steam mixtures to undergo detonation

  5. Sandia Laboratories technical capabilities. Auxiliary capabilities: environmental health information science

    International Nuclear Information System (INIS)

    1975-09-01

    Sandia Laboratories is an engineering laboratory in which research, development, testing, and evaluation capabilities are integrated by program management for the generation of advanced designs. In fulfilling its primary responsibility to ERDA, Sandia Laboratories has acquired extensive research and development capabilities. The purpose of this series of documents is to catalog the many technical capabilities of the Laboratories. After the listing of capabilities, supporting information is provided in the form of highlights, which show applications. This document deals with auxiliary capabilities, in particular, environmental health and information science. (11 figures, 1 table) (RWR)

  6. Sandia Laboratories technical capabilities: testing

    International Nuclear Information System (INIS)

    Lundergan, C.D.

    1975-12-01

    The testing capabilities at Sandia Laboratories are characterized. Selected applications of these capabilities are presented to illustrate the extent to which they can be applied in research and development programs

  7. Sandia Laboratories technical capabilities: electronics

    International Nuclear Information System (INIS)

    Lundergan, C.D.

    1975-12-01

    This report characterizes the electronics capabilities at Sandia Laboratories. Selected applications of these capabilities are presented to illustrate the extent to which they can be applied in research and development programs

  8. OPSAID improvements and capabilities report.

    Energy Technology Data Exchange (ETDEWEB)

    Halbgewachs, Ronald D.; Chavez, Adrian R.

    2011-08-01

    Process Control System (PCS) and Industrial Control System (ICS) security is critical to our national security. But there are a number of technological, economic, and educational impediments to PCS owners implementing effective security on their systems. Sandia National Laboratories has performed the research and development of the OPSAID (Open PCS Security Architecture for Interoperable Design), a project sponsored by the US Department of Energy Office of Electricity Delivery and Energy Reliability (DOE/OE), to address this issue. OPSAID is an open-source architecture for PCS/ICS security that provides a design basis for vendors to build add-on security devices for legacy systems, while providing a path forward for the development of inherently-secure PCS elements in the future. Using standardized hardware, a proof-of-concept prototype system was also developed. This report describes the improvements and capabilities that have been added to OPSAID since an initial report was released. Testing and validation of this architecture has been conducted in another project, Lemnos Interoperable Security Project, sponsored by DOE/OE and managed by the National Energy Technology Laboratory (NETL).

  9. Structural Capability of an Organization toward Innovation Capability

    DEFF Research Database (Denmark)

    Nielsen, Susanne Balslev; Momeni, Mostafa

    2016-01-01

    The scholars in the field of strategic management have developed two major approaches for the attainment of competitive advantage: one based on environmental opportunities, and another based on the internal capabilities of an organization. Some investigations in the last two decades have indicated that advantages relying on the internal capabilities of organizations may determine the competitive position of organizations better than environmental opportunities do. Characteristics of firms show that one of the internal capabilities that leads organizations to the strongest competitive advantage is the innovation capability. The innovation capability is associated with other organizational capabilities, and many organizations have focused on the need to identify innovation capabilities. This research focuses on recognition of the structural aspect

  10. The Capability to Hold Property

    NARCIS (Netherlands)

    Claassen, Rutger

    2015-01-01

    This paper discusses the question of whether a capability theory of justice (such as that of Martha Nussbaum) should accept a basic “capability to hold property.” Answering this question is vital for bridging the gap between abstract capability theories of justice and their institutional

  11. Personality Assessment: A Competency-Capability Perspective.

    Science.gov (United States)

    Kaslow, Nadine J; Finklea, J Tyler; Chan, Ginny

    2018-01-01

    This article begins by reviewing the proficiency of personality assessment in the context of the competencies movement, which has dominated health service psychology in recent years. It examines the value of including a capability framework for advancing this proficiency and enhancing the quality of personality assessments, including Therapeutic Assessment (Finn & Tonsager, 1997), that include a personality assessment component. This hybrid competency-capability framework is used to set the stage for the conduct of personality assessments in a variety of contexts and for the optimal training of personality assessment. Future directions are offered in terms of ways psychologists can strengthen their social contract with the public and offer a broader array of personality assessments in more diverse contexts and by individuals who are both competent and capable.

  12. Production capability: ERDA methods and results

    International Nuclear Information System (INIS)

    Klemenic, J.

    1977-01-01

    Production centers are categorized into four classes, according to the relative certainty of future production. A "forward cost" basis is used to establish both the resource base and to define the acceptable production centers. The first phase of the work is called the "Could" capability. Resources are assigned to existing production centers, or new production centers are postulated based on adequate resources to support a mill for a reasonable economic life. A production schedule is developed for each center. The last step in the "Could" study is to aggregate the capital and operating costs. The final step in the Production Capability study is the rescheduling of the production from the "Could" to produce only sufficient U concentrate to meet the feed requirements of enrichment facilities operated at the announced transaction tails assay plans. The optimized production schedules are called the "Need" production capability. A separate study was also performed of industry production plans. 4 tables, 7 figs

  13. Large deviations and idempotent probability

    CERN Document Server

    Puhalskii, Anatolii

    2001-01-01

    In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...

  14. Probability biases as Bayesian inference

    Directory of Open Access Journals (Sweden)

    Andre; C. R. Martins

    2006-11-01

    In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated with them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real-life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as has been observed.

  15. Probability matching and strategy availability.

    Science.gov (United States)

    Koehler, Derek J; James, Greta

    2010-09-01

    Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought to their attention, more participants subsequently engage in maximizing. Third, matchers are more likely than maximizers to base decisions in other tasks on their initial intuitions, suggesting that they are more inclined to use a choice strategy that comes to mind quickly. These results indicate that a substantial subset of probability matchers are victims of "underthinking" rather than "overthinking": They fail to engage in sufficient deliberation to generate a superior alternative to the matching strategy that comes so readily to mind.

  16. Probability as a Physical Motive

    Directory of Open Access Journals (Sweden)

    Peter Martin

    2007-04-01

    Recent theoretical progress in nonequilibrium thermodynamics, linking the physical principle of Maximum Entropy Production (“MEP”) to the information-theoretical “MaxEnt” principle of scientific inference, together with conjectures from theoretical physics that there may be no fundamental causal laws but only probabilities for physical processes, and from evolutionary theory that biological systems expand “the adjacent possible” as rapidly as possible, all lend credence to the proposition that probability should be recognized as a fundamental physical motive. It is further proposed that spatial order and temporal order are two aspects of the same thing, and that this is the essence of the second law of thermodynamics.

  17. Logic, Probability, and Human Reasoning

    Science.gov (United States)

    2015-01-01

    accordingly suggest a way to integrate probability and deduction. To be rational is to be able to make deductions ... [3–6], and they underlie mathematics, science, and technology [7–10]. Plato claimed that emotions upset reasoning. However, individuals in the grip ... fundamental to human rationality. So, if counterexamples to its principal predictions occur, the theory will at least explain its own refutation

  18. Probability Measures on Groups IX

    CERN Document Server

    1989-01-01

    The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.

  19. Probability matching and strategy availability

    OpenAIRE

    Koehler, Derek J.; James, Greta

    2010-01-01

    Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought...

  20. Financial Capability:New Evidence for Ireland

    OpenAIRE

    Keeney, Mary J.; O’Donnell, Nuala

    2009-01-01

    Recent increases in financial innovation, particularly in the Anglo-Saxon banking culture, have seen considerable growth in the number of financial products available to the general public. Simultaneously, many workers are increasingly assuming responsibility for planning for their future pensions. This, allied to increased life expectancy, necessitates a greater degree of financial capability amongst the general public. This study has empirically examined this issue for the first time in an ...

  1. Capability-based computer systems

    CERN Document Server

    Levy, Henry M

    2014-01-01

    Capability-Based Computer Systems focuses on computer programs and their capabilities. The text first elaborates capability- and object-based system concepts, including capability-based systems, object-based approach, and summary. The book then describes early descriptor architectures and explains the Burroughs B5000, Rice University Computer, and Basic Language Machine. The text also focuses on early capability architectures. Dennis and Van Horn's Supervisor; CAL-TSS System; MIT PDP-1 Timesharing System; and Chicago Magic Number Machine are discussed. The book then describes Plessey System 25

  2. LOGISTIC REGRESSION AS A TOOL FOR DETERMINATION OF THE PROBABILITY OF DEFAULT FOR ENTERPRISES

    Directory of Open Access Journals (Sweden)

    Erika SPUCHLAKOVA

    2017-12-01

    In a rapidly changing world it is necessary to adapt to new conditions; from day to day, approaches can vary. For the proper management of a company it is essential to know its financial situation. Assessment of a company's financial health can be carried out by financial analysis, which provides a number of methods for evaluating it. Analysis indicators are often included in company assessment, in obtaining bank loans, and in securing other financial resources to ensure the functioning of the company. As a company focuses on the future and its planning, it is essential to forecast the future financial situation. According to the results of the prediction of its financial health, the company decides on the extension or limitation of its business. It depends mainly on the capabilities of the company's management how they will use the information obtained from financial analysis in practice. The findings of logistic regression methods were first published in the 1960s, as an alternative to the least squares method. The essence of logistic regression is to determine the relationship between the explained (dependent) variable and the explanatory (independent) variables. The basic principle of this statistical method is based on regression analysis, but unlike linear regression, it can predict the probability that a phenomenon has occurred or not. The aim of this paper is to determine the probability of default of enterprises.
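
    In practice the method reduces to fitting a logit model of default on financial ratios; a minimal sketch with synthetic data follows, in which the ratio names, coefficients, and sample are all invented for illustration.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(42)
        n = 1000
        liquidity = rng.normal(1.5, 0.5, n)   # hypothetical current ratio
        leverage = rng.normal(0.6, 0.2, n)    # hypothetical debt-to-assets ratio
        logit = -2.0 - 1.5 * liquidity + 4.0 * leverage
        default = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

        X = np.column_stack([liquidity, leverage])
        model = LogisticRegression().fit(X, default)
        # Estimated probability of default for a new firm with given ratios:
        print(model.predict_proba([[1.2, 0.8]])[0, 1])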

  3. Transforming organizational capabilities in strategizing

    DEFF Research Database (Denmark)

    Jørgensen, Claus; Friis, Ole Uhrskov; Koch, Christian

    2014-01-01

    Offshored and networked enterprises are becoming an important if not leading organizational form, and this development seriously challenges their organizational capabilities. More specifically, over the last years, SMEs have commenced entering these kinds of arrangements. As the organizational capabilities of SMEs are limited at the outset, even more emphasis is needed on the issues of developing relevant organizational capabilities. This paper aims at investigating how capabilities evolve during an offshoring process of more than 5 years in two Danish SMEs, i.e. not only short- but long-term evolvements within the companies. We develop our framework for understanding organizational capabilities drawing on dynamic capability, relational capability and strategy-as-practice concepts, appreciating the performative aspects of developing new routines. Our two cases are taken from one author's Ph...

  4. [Biometric bases: basic concepts of probability calculation].

    Science.gov (United States)

    Dinya, E

    1998-04-26

    The author gives an outline of the basic concepts of probability theory. The bases of the event algebra, the definition of probability, the classical probability model and the random variable are presented.

  5. Impact of Personnel Capabilities on Organizational Innovation Capability

    DEFF Research Database (Denmark)

    Nielsen, Susanne Balslev; Momeni, Mostafa

    2016-01-01

    This research focuses on the definition of the personnel aspect of innovation capability, and proposes a conceptual model based on the scientific articles of the academic literature on organisations' innovation capability. The paper includes an expert-based validation in three rounds of the Delphi method. For a better appreciation of the relationships dominating the factors of the model, a questionnaire was distributed to Iranian companies in the food industry. This research proposes a direct relationship between Innovation Capability and Personnel Capability...

  6. The United States should forego a damage-limitation capability against China

    Science.gov (United States)

    Glaser, Charles L.

    2017-11-01

    Bottom Lines:
    • THE KEY STRATEGIC NUCLEAR CHOICE. Whether to attempt to preserve its damage-limitation capability against China is the key strategic nuclear choice facing the United States. The answer is much less clear-cut than when the United States faced the Soviet Union during the Cold War.
    • FEASIBILITY OF DAMAGE LIMITATION. Although technology has advanced significantly over the past three decades, future military competition between U.S. and Chinese forces will favor large-scale nuclear retaliation over significant damage limitation.
    • BENEFITS AND RISKS OF A DAMAGE-LIMITATION CAPABILITY. The benefits provided by a modest damage-limitation capability would be small, because the United States can meet its most important regional deterrent requirements without one. In comparison, the risks, which include an increased probability of accidental and unauthorized Chinese attacks, as well as strained U.S.-China relations, would be large.
    • FOREGO DAMAGE LIMITATION. These twin findings, the poor prospects for prevailing in the military competition, and the small benefits and likely overall decrease in U.S. security, call for a U.S. policy that foregoes efforts to preserve or enhance its damage-limitation capability.

  7. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2012-01-01

    This book provides a unique and balanced approach to probability, statistics, and stochastic processes.   Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area.  The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and

  8. Probability, statistics, and computational science.

    Science.gov (United States)

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.

  9. Sensitivity analysis using probability bounding

    International Nuclear Information System (INIS)

    Ferson, Scott; Troy Tucker, W.

    2006-01-01

    Probability bounds analysis (PBA) provides analysts a convenient means to characterize the neighborhood of possible results that would be obtained from plausible alternative inputs in probabilistic calculations. We show the relationship between PBA and the methods of interval analysis and probabilistic uncertainty analysis from which it is jointly derived, and indicate how the method can be used to assess the quality of probabilistic models such as those developed in Monte Carlo simulations for risk analyses. We also illustrate how a sensitivity analysis can be conducted within a PBA by pinching inputs to precise distributions or real values
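
    The pinching idea can be made concrete with plain interval arithmetic standing in for full probability boxes; the toy model, input intervals, and pinched value below are all invented for illustration.

        # Minimal interval arithmetic standing in for probability bounds.
        def i_add(a, b):
            return (a[0] + b[0], a[1] + b[1])

        def i_mul(a, b):
            p = [x * y for x in a for y in b]
            return (min(p), max(p))

        def model(x, y, z):
            # Toy risk model evaluated on intervals: output = x*y + z.
            return i_add(i_mul(x, y), z)

        x, y, z = (0.1, 0.3), (2.0, 4.0), (0.5, 1.0)
        base = model(x, y, z)
        pinched = model(x, (3.0, 3.0), z)   # pinch y to a precise value

        width = lambda iv: iv[1] - iv[0]
        # The shrinkage in output width measures how much of the overall
        # uncertainty is attributable to y.
        print(base, pinched, 1 - width(pinched) / width(base))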

  10. Greek paideia and terms of probability

    Directory of Open Access Journals (Sweden)

    Fernando Leon Parada

    2016-06-01

    This paper addresses three aspects of the conceptual framework for doctoral dissertation research in progress in the field of Mathematics Education, in particular in the subfield of teaching and learning basic concepts of Probability Theory at the college level. It intends to contrast, sustain and elucidate the central statement that the meanings of some of the basic terms used in Probability Theory were not formally defined by any specific theory but relate to primordial ideas developed in Western culture from Ancient Greek myths. The first aspect deals with the notion of uncertainty, with which Greek thinkers described several archaic gods and goddesses of destiny, like the Parcae and the Moirai, often personified in the goddess Tyche (Fortuna for the Romans), as regarded in Werner Jaeger's "Paideia". The second aspect treats the idea of hazard from two different approaches: the first deals with hazard, denoted by Plato with the already demythologized term 'tyche', from the viewpoint of innate knowledge, as Jaeger points out. The second approach deals with hazard from a perspective that could be called "phenomenological", from which Aristotle attempted to articulate uncertainty with a discourse based on the hypothesis of causality. The term 'causal' was opposed both to 'casual' and to 'spontaneous' (as used in the expression "spontaneous generation"), attributing uncertainty to ignorance of the future, thus respecting causal flow. The third aspect treated in the paper refers to definitions and etymologies of some other modern words that have become technical terms in current Probability Theory, confirming the above-mentioned main proposition of this paper.

  11. Probability and uncertainty in nuclear safety decisions

    International Nuclear Information System (INIS)

    Pate-Cornell, M.E.

    1986-01-01

    In this paper, we examine some problems posed by the use of probabilities in Nuclear Safety decisions. We discuss some of the theoretical difficulties due to the collective nature of regulatory decisions, and, in particular, the calibration and the aggregation of risk information (e.g., experts' opinions). We argue that, if one chooses numerical safety goals as a regulatory basis, one can reduce the constraints to an individual safety goal and a cost-benefit criterion. We show the relevance of risk uncertainties in this kind of regulatory framework. We conclude that, whereas expected values of future failure frequencies are adequate to show compliance with economic constraints, the use of a fractile (e.g., 95%) to be specified by the regulatory agency is justified to treat hazard uncertainties for the individual safety goal. (orig.)
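
    The contrast drawn between expected values and fractiles is easy to make concrete for a lognormal uncertainty on a failure frequency; the parameters below are arbitrary illustrations, not values from the paper.

        import math

        # Lognormal uncertainty on an annual failure frequency:
        # ln(f) ~ Normal(mu, sigma^2), with illustrative parameters.
        mu, sigma = math.log(1e-5), 1.5

        mean = math.exp(mu + sigma**2 / 2)    # expected frequency
        p95 = math.exp(mu + 1.645 * sigma)    # 95th percentile (z_0.95 = 1.645)

        print(f"mean = {mean:.2e}, 95th fractile = {p95:.2e}, ratio = {p95/mean:.1f}")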

  12. PROBABLE FORECASTING IN THE COURSE OF INTERPRETING

    Directory of Open Access Journals (Sweden)

    Ye. B. Kagan

    2017-01-01

    Introduction. Translation practice is heuristic in nature and involves the cognitive structures of the interpreter's consciousness. When training translators, special attention is paid to developing their skill of probable forecasting. The aim of the present publication is to understand the process of anticipation from the position of the cognitive model of translation, and to develop exercises aimed at building the prognostic abilities of students and interpreters when working with newspaper articles containing metaphorical headlines. Methodology and research methods. The study is based on the competence approach to the training of student translators and on a complex of interrelated scientific methods, the main one being the psycholinguistic experiment. With the use of quantitative data, the features of the perception of newspaper texts through their metaphorical titles are characterized. Results and scientific novelty. On the basis of the experiment conducted on predicting the content of newspaper articles with metaphorical headlines, it is concluded that the main condition of predictability is expectation. Probable forecasting as a professional competence of a future translator is formed in the process of training activities by integrating the efforts of various departments of a language university. Specific exercises for developing the anticipation of students during the course of translation and interpretation are offered. Practical significance. The results of the study can be used by foreign-language teachers of both language and non-language universities in teaching students of different specialties to translate foreign texts.

  13. Lectures on probability and statistics

    International Nuclear Information System (INIS)

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another

  14. People Capability Maturity Model. SM.

    Science.gov (United States)

    1995-09-01

    Recoverable fragments of the source indicate that a People CMM-based assessment can be tailored so it consumes less time and resources than a traditional software process assessment, and that reported benefits include improved reputation or customer loyalty. (People Capability Maturity Model, CMU/SEI-95-MM-02, Carnegie Mellon University Software Engineering Institute, 1995.)

  15. Capable design or designing capabilities? An exploration of service design as an emerging organizational capability in Telenor

    Directory of Open Access Journals (Sweden)

    Ieva Martinkenaite

    2017-01-01

    This empirical paper examines a process, starting with the managerial decision to make service design an organizational capability, and follows it as it unfolds over time within one organization. Service design has become an established business practice for how firms create new products and services to promote differentiation in an increasingly uncertain business landscape. Implicit in the literature on service design are assumptions about the strategic implications of adopting the prescribed innovation methods and tools. However, little is known about how service design evolves into an organizational capability enabling firms to transform their existing businesses and sustain competitiveness. Through a longitudinal, exploratory case study of service design practices in one of the world's largest telecommunications companies, we explicate the mechanisms through which service design evolves into an organizational capability by exploring the research question: what are the mechanisms through which service design develops into an organizational capability? Our study reveals the effect of an initial introduction of service design tools, identification of boundary-spanning actors and co-alignment of dedicated resources between internal functions, as well as co-creation with customers. Over time, these activities lead to the adoption of service design practices; subsequently these practices spark incremental learning throughout the organization, alter managerial decisions and influence multiple paths for the development of new capabilities. Reporting on this process, we describe how service design practices were disseminated and institutionalized within the organization we observed. This study thus contributes by informing how service design can evolve into an organizational capability, as well as by bridging the emerging literature on service design and design thinking with established strategy theory. Further research will have to

  16. Focus in High School Mathematics: Statistics and Probability

    Science.gov (United States)

    National Council of Teachers of Mathematics, 2009

    2009-01-01

    Reasoning about and making sense of statistics and probability are essential to students' future success. This volume belongs to a series that supports National Council of Teachers of Mathematics' (NCTM's) "Focus in High School Mathematics: Reasoning and Sense Making" by providing additional guidance for making reasoning and sense making part of…

  17. Measures, Probability and Holography in Cosmology

    Science.gov (United States)

    Phillips, Daniel

    This dissertation compiles four research projects on predicting values for cosmological parameters and models of the universe on the broadest scale. The first examines the Causal Entropic Principle (CEP) in inhomogeneous cosmologies. The CEP aims to predict the unexpectedly small value of the cosmological constant Λ using a weighting by entropy increase on causal diamonds. The original work assumed a purely isotropic and homogeneous cosmology, but even the level of inhomogeneity observed in our universe forces reconsideration of certain arguments about entropy production. In particular, we must consider an ensemble of causal diamonds associated with each background cosmology, and we can no longer immediately discard entropy production in the far future of the universe. Depending on our choices for a probability measure and our treatment of black hole evaporation, the prediction for Λ may be left intact or dramatically altered. The second, related project extends the CEP to universes with curvature. We have found that curvature values larger than ρ_k = 40 ρ_m are disfavored by more than 99.99%, with a peak value at ρ_Λ = 7.9 × 10^-123 and ρ_k = 4.3 ρ_m for open universes. For universes that allow only positive curvature, or both positive and negative curvature, we find a correlation between curvature and dark energy that leads to an extended region of preferred values. Our universe is found to be disfavored to an extent depending on the priors on curvature. We also provide a comparison to previous anthropic constraints on open universes and discuss future directions for this work. The third project examines how cosmologists should formulate basic questions of probability. We argue using simple models that all successful practical uses of probabilities originate in quantum fluctuations in the microscopic physical world around us, often propagated to macroscopic scales. Thus we claim there is no physically verified fully classical theory of probability. We

  18. Probability theory a comprehensive course

    CERN Document Server

    Klenke, Achim

    2014-01-01

    This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms.   To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as:   • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...

  19. Technological Dynamics and Social Capability

    DEFF Research Database (Denmark)

    Fagerberg, Jan; Feldman, Maryann; Srholec, Martin

    2014-01-01

    …for the sample as a whole between 1998 and 2008. The results indicate that social capabilities, such as well-developed public knowledge infrastructure, an egalitarian distribution of income, a participatory democracy and prevalence of public safety, condition the growth of technological capabilities. Possible…

  20. Excluding joint probabilities from quantum theory

    Science.gov (United States)

    Allahverdyan, Armen E.; Danageozian, Arshag

    2018-03-01

    Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after Born's probability for a single observable. Instead, various definitions were suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and are consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables, in Hilbert space with dimension larger than two. If measurement contexts are included into the definition, joint probabilities are not excluded anymore, but they are still constrained by imprecise probabilities.

  1. Developing Collaborative Product Development Capabilities

    DEFF Research Database (Denmark)

    Mahnke, Volker; Tran, Yen

    2012-01-01

    Collaborative product development capabilities support a company’s product innovation activities. In the context of the fast fashion sector, this paper examines the development of the product development capabilities (PDC) that align product development capabilities in a dual innovation context, one, slow paced, where the firm is well established and the other, fast paced, which represents a new competitive arena in which the company competes. To understand the process associated with collaborative capability development, we studied three Scandinavian fashion companies pursuing ‘dual innovation strategies’. Our analyses suggest that developing such collaboration capabilities benefits from the search for complementary practices, the combination of learning styles, and the development of weak and strong ties. Results also underscore the crucial importance of co-evolution of multi…

  2. Marketing Capability in Strategy Research

    DEFF Research Database (Denmark)

    Ritter, Thomas; Distel, Andreas Philipp

    Following the call for a demand-side perspective of strategic management (e.g., Priem et al., 2012), a firm’s marketing capability, i.e. its ability to interact with down-stream stakeholders, becomes a pivotal element in explaining a firm’s competitiveness. While marketing capability is recognized in the strategic management literature as an important driver of firm performance, our review of 86 articles reveals a lack of a generally accepted definition of marketing capability, a lack of a common conceptualization as well as differences in the measurement of marketing capability. In order to build a common ground for advancing marketing capability research and thus supporting the demand-side perspective in strategic management, we develop an integrative framework to explain the differences and propose a research agenda for developing the field.

  3. Future food.

    Science.gov (United States)

    Wahlqvist, Mark L

    2016-12-01

    Food systems have changed markedly with human settlement and agriculture, industrialisation, trade, migration and now the digital age. Throughout these transitions, there has been a progressive population explosion and net ecosystem loss and degradation. Climate change now gathers pace, exacerbated by ecological dysfunction. Our health status has been challenged by a developing people-environment mismatch. We have regarded ecological conquest and innovative technology as solutions, but have not understood how ecologically dependent and integrated we are. We are ecological creatures interfaced by our sensoriness, microbiomes, shared regulatory (endocrine) mechanisms, immune system, biorhythms and nutritional pathways. Many of us are 'nature-deprived'. We now suffer what might be termed ecological health disorders (EHD). If there were fewer of us, nature's resilience might cope, but more than 9 billion people by 2050 is probably an intolerable demand on the planet. Future food must increasingly take into account the pressures on ecosystem-dependent food systems, with foods probably less biodiverse, although eating in this way allows optimal health; energy dysequilibrium with less physical activity and foods inappropriately energy dense; and less socially-conducive food habits. 'Personalised Nutrition', with extensive and resource-demanding nutrigenomic, metabolomic and microbiomic data, may provide partial health solutions in clinical settings, but may not be justified for ethical, risk management or sustainability reasons in public health. The globally prevalent multidimensional malnutritional problems of food insecurity, quality and equity require local, regional and global action to prevent further ecosystem degradation as well as to educate, provide sustainable livelihoods and encourage respectful social discourse and practice about the role of food.

  4. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  5. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process

  6. Capabilities and Incapabilities of the Capabilities Approach to Health Justice.

    Science.gov (United States)

    Selgelid, Michael J

    2016-01-01

    The first part of this article critiques Sridhar Venkatapuram's conception of health as a capability. It argues that Venkatapuram relies on the problematic concept of dignity, implies that those who are unhealthy lack lives worthy of dignity (which seems politically incorrect), sets a low bar for health, appeals to metaphysically problematic thresholds, fails to draw clear connections between appealed-to capabilities and health, and downplays the importance/relevance of health functioning. It concludes by questioning whether justice entitlements should pertain to the capability for health versus health achievements, challenging Venkatapuram's claims about the strength of health entitlements, and demonstrating that the capabilities approach is unnecessary to address social determinants of health. © 2016 John Wiley & Sons Ltd.

  7. K-forbidden transition probabilities

    International Nuclear Information System (INIS)

    Saitoh, T.R.; Sletten, G.; Bark, R.A.; Hagemann, G.B.; Herskind, B.; Saitoh-Hashimoto, N.; Tsukuba Univ., Ibaraki

    2000-01-01

    Reduced hindrance factors of K-forbidden transitions are compiled for nuclei with A ≈ 180 where γ-vibrational states are observed. Correlations between these reduced hindrance factors and Coriolis forces, statistical level mixing and γ-softness have been studied. It is demonstrated that the K-forbidden transition probabilities are related to γ-softness. The decay of the high-K bandheads has been studied by means of two-state mixing, which would be induced by the γ-softness, with the use of a number of K-forbidden transitions compiled in the present work, where high-K bandheads are depopulated by both E2 and ΔI=1 transitions. The validity of the two-state mixing scheme has been examined by using the proposed identity of the B(M1)/B(E2) ratios of transitions depopulating high-K bandheads and levels of low-K bands. A breakdown of the identity might indicate that other levels would mediate transitions between high- and low-K states. (orig.)
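
    The reduced hindrance factors compiled in such work follow a standard definition: the hindrance relative to the Weisskopf single-particle estimate, taken to the 1/ν power, where ν = ΔK − λ is the degree of K-forbiddenness. A minimal sketch of that textbook formula (not code from the paper; the numbers are hypothetical):

```python
def reduced_hindrance(t_half_exp, t_half_weisskopf, delta_k, multipolarity):
    """Reduced hindrance f_nu = (T_exp / T_W)**(1/nu), with nu = delta_K - lambda."""
    nu = delta_k - multipolarity  # degree of K-forbiddenness
    if nu <= 0:
        raise ValueError("transition is not K-forbidden")
    hindrance = t_half_exp / t_half_weisskopf  # hindrance relative to Weisskopf estimate
    return hindrance ** (1.0 / nu)

# Hypothetical example: a 1 ms partial half-life against a 10 ps Weisskopf
# estimate for an E2 transition with delta_K = 6 gives nu = 4 and f_nu = 100.
print(reduced_hindrance(1e-3, 1e-11, delta_k=6, multipolarity=2))
```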

  8. Direct probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.

    1993-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. Geostatistical simulation provides powerful tools for investigating contaminant levels, and in particular, for identifying and using the spatial interrelationships among a set of isolated sample values. This additional information can be used to assess the likelihood of encountering contamination at unsampled locations and to evaluate the risk associated with decisions to remediate or not to remediate specific regions within a site. Past operation of the DOE Feed Materials Production Center has contaminated a site near Fernald, Ohio, with natural uranium. Soil geochemical data have been collected as part of the Uranium-in-Soils Integrated Demonstration Project. These data have been used to construct a number of stochastic images of potential contamination for parcels approximately the size of a selective remediation unit. Each such image accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely, statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination. Evaluation of the geostatistical simulations can yield maps representing the expected magnitude of the contamination for various regions and other information that may be important in determining a suitable remediation process or in sizing equipment to accomplish the restoration.
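
    The post-processing step described above amounts to counting threshold exceedances across the ensemble of equally likely realizations. A minimal sketch, assuming a stand-in random field where a real study would use conditional geostatistical simulations honoring the sample data:

```python
import numpy as np

rng = np.random.default_rng(0)
n_real, ny, nx = 500, 20, 20

# Stand-in for conditionally simulated uranium concentrations (hypothetical units).
realizations = rng.lognormal(mean=3.0, sigma=0.8, size=(n_real, ny, nx))

threshold = 35.0  # hypothetical action level
prob_exceed = (realizations > threshold).mean(axis=0)  # per-cell exceedance probability

# Cells whose exceedance probability is high could be flagged for remediation.
print(prob_exceed.mean().round(3), prob_exceed.max().round(3))
```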

  9. The future of Plowshare

    Energy Technology Data Exchange (ETDEWEB)

    Kelly, John S [Division of Peaceful Nuclear Explosives, U.S. Atomic Energy Commission (United States)]

    1970-05-01

    Since the last general symposium on Plowshare in 1964, significant progress has been made 1) in improving our understanding of explosion phenomenology, 2) in developing suitable explosive designs, and 3) in applying the technology to specific applications in the industrial, public works and scientific areas. The papers to be presented at this symposium will discuss in depth the progress that has been made in each of these areas, and to some degree, what still remains to be accomplished, so I will not attempt to go into detail here. However, I would like to take a few minutes to summarize where the technology stands today, where we believe it is going, and most importantly, how we hope to get there. In the excavation area, both Cabriolet and Schooner extended cratering experience in hard rock to higher yields. We also conducted Project Buggy, the first nuclear row-charge experiment. Buggy involved the simultaneous detonation of five 1.1 kiloton nuclear explosives, spaced 150 feet apart at a depth of 135 feet. The explosion created a smooth channel about 865 feet long, 254 feet wide and 70 feet deep. Two very significant contributions from Buggy were information on spacing between the explosives and on lip height. Buggy demonstrated that explosives can probably be spaced somewhat farther apart than previously thought without significantly affecting the smoothness of the channel. This could result in considerable savings in future row-charge excavations. We were also particularly pleased that, as predicted, the height of the lips at the end of the ditch was less than half the height of the lips on the sides - some 14 feet versus 41 feet. This is extremely important for the connecting of ditches. The data obtained from Buggy, Schooner and other experiments have been used to extend and refine our predictive capability.

  10. The future of Plowshare

    International Nuclear Information System (INIS)

    Kelly, John S.

    1970-01-01

    Since the last general symposium on Plowshare in 1964, significant progress has been made 1) in improving our understanding of explosion phenomenology, 2) in developing suitable explosive designs, and 3) in applying the technology to specific applications in the industrial, public works and scientific areas. The papers to be presented at this symposium will discuss in depth the progress that has been made in each of these areas, and to some degree, what still remains to be accomplished, so I will not attempt to go into detail here. However, I would like to take a few minutes to summarize where the technology stands today, where we believe it is going, and most importantly, how we hope to get there. In the excavation area, both Cabriolet and Schooner extended cratering experience in hard rock to higher yields. We also conducted Project Buggy, the first nuclear row-charge experiment. Buggy involved the simultaneous detonation of five 1.1 kiloton nuclear explosives, spaced 150 feet apart at a depth of 135 feet. The explosion created a smooth channel about 865 feet long, 254 feet wide and 70 feet deep. Two very significant contributions from Buggy were information on spacing between the explosives and on lip height. Buggy demonstrated that explosives can probably be spaced somewhat farther apart than previously thought without significantly affecting the smoothness of the channel. This could result in considerable savings in future row-charge excavations. We were also particularly pleased that, as predicted, the height of the lips at the end of the ditch was less than half the height of the lips on the sides - some 14 feet versus 41 feet. This is extremely important for the connecting of ditches. The data obtained from Buggy, Schooner and other experiments have been used to extend and refine our predictive capability

  11. Vulnerability assessment: Determining probabilities of neutralization of adversaries

    International Nuclear Information System (INIS)

    Graves, B.R.

    1987-01-01

    The Security Manager charged with the responsibility of designing Safeguards and Security Systems at Department of Energy facilities must take many factors into consideration. There must be a clear understanding, supported by documented guidance, of the level of threat to be addressed, the nature of the facility to be protected, and the funds available to design, implement, and maintain the Safeguards and Security System. Armed with these prerequisites, the Security Manager may then determine the characteristics of the Safeguards measures and security forces necessary to protect the facility. Security forces selection and training programs may then be established based on realistic facility needs. The next step is to attempt to determine the probability of security forces winning in a confrontation with adversaries. To determine the probability of success, the Security Manager must consider the characteristics of the facility and surrounding area; the characteristics of the security forces and safeguards system at the facility; the response time and capabilities of the augmentation forces; and the characteristics and capabilities of the adversary threat level to be addressed. Obviously, the Safeguards and Security Systems must initially address "worst case" scenarios consistent with stated guidelines. Validation of the assessment of the Safeguards and Security Systems must then be determined by simulation testing of the capabilities of the response forces against the capabilities of the adversary.

  12. Indigenous Technological Innovation : Capability and ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Indigenous Technological Innovation : Capability and Competitiveness in China's ... IDRC and key partners will showcase critical work on adaptation and ...

  13. Earth Science Capability Demonstration Project

    Science.gov (United States)

    Cobleigh, Brent

    2006-01-01

    A viewgraph presentation reviewing the Earth Science Capability Demonstration Project is shown. The contents include: 1) ESCD Project; 2) Available Flight Assets; 3) Ikhana Procurement; 4) GCS Layout; 5) Baseline Predator B Architecture; 6) Ikhana Architecture; 7) UAV Capability Assessment; 8) The Big Picture; 9) NASA/NOAA UAV Demo (5/05 to 9/05); 10) NASA/USFS Western States Fire Mission (8/06); and 11) Suborbital Telepresence.

  14. Recent Investments by NASA's National Force Measurement Technology Capability

    Science.gov (United States)

    Commo, Sean A.; Ponder, Jonathan D.

    2016-01-01

    The National Force Measurement Technology Capability (NFMTC) is a nationwide partnership established in 2008 and sponsored by NASA's Aeronautics Evaluation and Test Capabilities (AETC) project to maintain and further develop force measurement capabilities. The NFMTC focuses on force measurement in wind tunnels and provides operational support in addition to conducting balance research. Based on force measurement capability challenges, strategic investments in research tasks are designed to meet the experimental requirements of current and future aerospace research programs and projects. This paper highlights recent force-measurement investments in several areas, including recapitalizing the strain-gage balance inventory, developing balance best practices, improving calibration and facility capabilities, and researching potential technologies to advance balance capabilities.

  15. Foundations of the theory of probability

    CERN Document Server

    Kolmogorov, AN

    2018-01-01

    This famous little book remains a foundational text for the understanding of probability theory, important both to students beginning a serious study of probability and to historians of modern mathematics. 1956 second edition.

  16. The Probability Distribution for a Biased Spinner

    Science.gov (United States)

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)

  17. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

    Bryan eCort

    2013-10-01

    Full Text Available We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.
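
    The design can be sanity-checked with the law of total probability: assuming the two cue combinations are equally likely (an assumption made here for illustration), conditionals of 0.9 and 0.1 average back to the stated absolute probability of 0.5.

```python
# Law-of-total-probability check for the cue/target-color design described above.
p_cue = {"combo_A": 0.5, "combo_B": 0.5}            # assumed equally likely combinations
p_red_given_cue = {"combo_A": 0.9, "combo_B": 0.1}  # conditional probabilities of a red target

p_red = sum(p_cue[c] * p_red_given_cue[c] for c in p_cue)
print(p_red)  # 0.5, the absolute probability of either color
```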

  18. Analytic Neutrino Oscillation Probabilities in Matter: Revisited

    Energy Technology Data Exchange (ETDEWEB)

    Parke, Stephen J. [Fermilab]; Denton, Peter B. [Copenhagen U.]; Minakata, Hisakazu [Madrid, IFT]

    2018-01-02

    We summarize our recent paper on neutrino oscillation probabilities in matter, explaining the importance, relevance and need for simple, highly accurate approximations to the neutrino oscillation probabilities in matter.

  19. Void probability scaling in hadron nucleus interactions

    International Nuclear Information System (INIS)

    Ghosh, Dipak; Deb, Argha; Bhattacharyya, Swarnapratim; Ghosh, Jayita; Bandyopadhyay, Prabhat; Das, Rupa; Mukherjee, Sima

    2002-01-01

    Hegyi, while investigating the rapidity gap probability (which measures the chance of finding no particle in the pseudo-rapidity interval Δη), found that a scaling behavior in the rapidity gap probability has a close correspondence with the scaling of the void probability in galaxy correlation studies. The main aim of this paper is to study the scaling behavior of the rapidity gap probability.
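
    As a baseline for what such scaling deviates from: under purely independent (Poissonian) particle production, the count in a window Δη is Poisson with mean ⟨n⟩, and the void (rapidity-gap) probability has the closed form P₀ = exp(−⟨n⟩). A sketch with hypothetical densities, not the paper's analysis:

```python
import math

def void_probability(density, d_eta):
    """Poisson baseline: P0 = exp(-<n>) with <n> = density * d_eta."""
    return math.exp(-density * d_eta)

for d_eta in (0.1, 0.5, 1.0):
    print(d_eta, round(void_probability(density=2.0, d_eta=d_eta), 4))
```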

  20. Pre-Service Teachers' Conceptions of Probability

    Science.gov (United States)

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  1. Using Playing Cards to Differentiate Probability Interpretations

    Science.gov (United States)

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.

  2. REDUCTIONS WITHOUT REGRET: DEFINING THE NEEDED CAPABILITIES

    Energy Technology Data Exchange (ETDEWEB)

    Swegle, J.; Tincher, D.

    2013-09-10

    This is the second of three papers (in addition to an introductory summary) aimed at providing a framework for evaluating future reductions or modifications of the U.S. nuclear force, first by considering previous instances in which nuclear-force capabilities were eliminated; second by looking forward into at least the foreseeable future at the features of global and regional deterrence (recognizing that new weapon systems currently projected will have expected lifetimes stretching beyond our ability to predict the future); and third by providing examples of past or possible undesirable outcomes in the shaping of the future nuclear force, as well as some closing thoughts for the future. This paper begins with a discussion of the current nuclear force and the plans and procurement programs for the modernization of that force. Current weapon systems and warheads were conceived and built decades ago, and procurement programs have begun for the modernization or replacement of major elements of the nuclear force: the heavy bomber, the air-launched cruise missile, the ICBMs, and the ballistic-missile submarines. In addition, the Nuclear Weapons Council has approved a new framework for nuclear-warhead life extension, not yet fully fleshed out, that aims to reduce the current number of nuclear explosives from seven to five, the so-called 3+2 vision. This vision includes three interoperable warheads for both ICBMs and SLBMs (thus eliminating one backup weapon) and two warheads for aircraft delivery (one gravity bomb and one cruise-missile warhead, eliminating a second backup gravity bomb). This paper also includes a discussion of the current and near-term nuclear-deterrence mission, both global and regional, and offers some observations on the future of the strategic deterrence mission and the challenges of regional and extended nuclear deterrence.

  3. NASA DOE POD NDE Capabilities Data Book

    Science.gov (United States)

    Generazio, Edward R.

    2015-01-01

    This data book contains the Directed Design of Experiments for Validating Probability of Detection (POD) Capability of NDE Systems (DOEPOD) analyses of the nondestructive inspection data presented in the NTIAC, Nondestructive Evaluation (NDE) Capabilities Data Book, 3rd ed., NTIAC DB-97-02. DOEPOD is designed as a decision support system to validate inspection system, personnel, and protocol demonstrating 0.90 POD with 95% confidence at critical flaw sizes, a90/95. The test methodology used in DOEPOD is based on the field of statistical sequential analysis founded by Abraham Wald. Sequential analysis is a method of statistical inference whose characteristic feature is that the number of observations required by the procedure is not determined in advance of the experiment. The decision to terminate the experiment depends, at each stage, on the results of the observations previously made. A merit of the sequential method, as applied to testing statistical hypotheses, is that test procedures can be constructed which require, on average, a substantially smaller number of observations than equally reliable test procedures based on a predetermined number of observations.
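
    The a90/95 target can be illustrated with the classical fixed-sample binomial argument; DOEPOD's Wald-style sequential tests typically terminate with fewer observations, so this sketch is for intuition, not the DOEPOD algorithm itself:

```python
# If the true POD at a flaw size were only 0.90, the chance of detecting all n
# of n flaws is 0.90**n. Once that chance drops below 0.05, observing n hits in
# n trials demonstrates POD >= 0.90 at 95% confidence (the familiar 29/29 rule).
n = 1
while 0.90 ** n > 0.05:
    n += 1
print(n, round(0.90 ** n, 4))  # 29 0.0471
```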

  4. Probability judgments under ambiguity and conflict.

    Science.gov (United States)

    Smithson, Michael

    2015-01-01

    Whether conflict and ambiguity are distinct kinds of uncertainty remains an open question, as does their joint impact on judgments of overall uncertainty. This paper reviews recent advances in our understanding of human judgment and decision making when both ambiguity and conflict are present, and presents two types of testable models of judgments under conflict and ambiguity. The first type concerns estimate-pooling to arrive at "best" probability estimates. The second type is models of subjective assessments of conflict and ambiguity. These models are developed for dealing with both described and experienced information. A framework for testing these models in the described-information setting is presented, including a reanalysis of a multi-nation data-set to test best-estimate models, and a study of participants' assessments of conflict, ambiguity, and overall uncertainty reported by Smithson (2013). A framework for research in the experienced-information setting is then developed, that differs substantially from extant paradigms in the literature. This framework yields new models of "best" estimates and perceived conflict. The paper concludes with specific suggestions for future research on judgment and decision making under conflict and ambiguity.
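
    The simplest instance of the estimate-pooling addressed by the first model type is a linear opinion pool; the weights below are hypothetical source credibilities, not values from the paper:

```python
import statistics

estimates = [0.2, 0.7, 0.4]  # conflicting probability estimates from three sources
weights = [0.5, 0.25, 0.25]  # assumed credibilities, summing to 1

linear_pool = sum(w * p for w, p in zip(weights, estimates))
print(round(linear_pool, 3), round(statistics.mean(estimates), 3))  # 0.375 vs. 0.433 unweighted
```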

  5. Dependent Human Error Probability Assessment

    International Nuclear Information System (INIS)

    Simic, Z.; Mikulicic, V.; Vukovic, I.

    2006-01-01

    This paper presents an assessment of the dependence between dynamic operator actions modeled in a Nuclear Power Plant (NPP) PRA and an estimate of the associated impact on core damage frequency (CDF). This assessment was done to improve the implementation of HEP dependencies inside the existing PRA. All of the dynamic operator actions modeled in the NPP PRA are included in this assessment. Determining the level of HEP dependence and the associated influence on CDF are the major steps of this assessment. A decision on how to apply the results, i.e., should permanent HEP model changes be made, is based on the resulting relative CDF increase. Some CDF increase was selected as a threshold based on the NPP base CDF value and acceptance guidelines from Regulatory Guide 1.174. HEP dependencies resulting in a CDF increase of > 5E-07 would be considered potential candidates for specific incorporation into the baseline model. The approach used to judge the level of dependence between operator actions is based on dependency level categories and conditional probabilities developed in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications, NUREG/CR-1278. To simplify the process, NUREG/CR-1278 identifies five levels of dependence: ZD (zero dependence), LD (low dependence), MD (moderate dependence), HD (high dependence), and CD (complete dependence). NUREG/CR-1278 also identifies several qualitative factors that could be involved in determining the level of dependence. Based on the NUREG/CR-1278 information, Time, Function, and Spatial attributes were judged to be the most important considerations when determining the level of dependence between operator actions within an accident sequence. These attributes were used to develop qualitative criteria (rules) that were used to judge the level of dependence (CD, HD, MD, LD, ZD) between the operator actions. After the level of dependence between the various HEPs is judged, quantitative values associated with the…
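
    The five NUREG/CR-1278 dependence levels correspond to standard conditional-probability formulas, widely implemented in HRA tools. A sketch of those textbook formulas (not the plant-specific model used in the assessment):

```python
def conditional_hep(p, level):
    """Conditional HEP for a task given failure of the preceding task (THERP)."""
    formulas = {
        "ZD": p,                  # zero dependence
        "LD": (1 + 19 * p) / 20,  # low dependence
        "MD": (1 + 6 * p) / 7,    # moderate dependence
        "HD": (1 + p) / 2,        # high dependence
        "CD": 1.0,                # complete dependence
    }
    return formulas[level]

for level in ("ZD", "LD", "MD", "HD", "CD"):
    print(level, round(conditional_hep(1e-3, level), 4))
```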

  6. Technological Capability and Firm Performance

    Directory of Open Access Journals (Sweden)

    Fernanda Maciel Reichert

    2014-08-01

    Full Text Available This research aims to investigate the relationship between investments in technological capability and economic performance in Brazilian firms. Based on economic development theory and on developed countries' history, it is assumed that this relationship is positive. Through key indicators, 133 Brazilian firms have been analyzed. Given the economic circumstances of an emerging economy, in which the majority of businesses are primarily based on low- and medium-low-technology industries, it is not possible to affirm the existence of a positive relation between technological capability and firm performance. There are other elements that allow firms to achieve such results. Firms in lower-technological-intensity industries performed above average on the economic performance indicators; conversely, they invested below average in technological capability. These findings do not diminish the merit of the firms' and the country's success. They in fact confirm a historical tradition of a country that concentrates its efforts on basic industries.

  7. Functional capability of piping systems

    International Nuclear Information System (INIS)

    Terao, D.; Rodabaugh, E.C.

    1992-11-01

    General Design Criterion I of Appendix A to Part 50 of Title 10 of the Code of Federal Regulations requires, in part, that structures, systems, and components important to safety be designed to withstand the effects of earthquakes without a loss of capability to perform their safety function. One function of a piping system is to convey fluids from one location to another. The functional capability of a piping system might be lost if, for example, the cross-sectional flow area of the pipe were deformed to such an extent that the required flow through the pipe would be restricted. The objective of this report is to examine the present rules in the American Society of Mechanical Engineers Boiler and Pressure Vessel Code, Section III, and potential changes to these rules, to determine if they are adequate for ensuring the functional capability of safety-related piping systems in nuclear power plants.

  8. Upgrading of TREAT experimental capabilities

    International Nuclear Information System (INIS)

    Dickerman, C.E.; Rose, D.; Bhattacharyya, S.K.

    1982-01-01

    The TREAT facility at the Argonne National Laboratory site in the Idaho National Engineering Laboratory is being upgraded to provide capabilities for fast-reactor-safety transient experiments not possible at any other experimental facility. The principal TREAT Upgrade (TU) goal is provision for 37-pin-size experiments on energetics of core-disruptive accidents (CDA) in fast breeder reactor cores with moderate sodium void coefficients. This goal requires a significant enhancement of the capabilities of the TREAT facility, specifically including reactor control, a hardened neutron spectrum incident on the test sample, and an enlarged building. The upgraded facility will retain the capability for small-size experiments of the types currently being performed in TREAT. Reactor building and crane upgrading have been completed. TU schedules call for the components of the upgraded reactor system to be finished in 1984, including the upgraded TREAT fuel and control system, and expanded coverage by the hodoscope fuel-motion diagnostics system.

  9. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t

  10. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability.
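
    A brute-force Monte Carlo estimate makes the first-passage probability concrete; the discretized Ornstein-Uhlenbeck process below is only a stand-in with unit stationary variance, not the structural response model of the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
n_paths, n_steps, dt = 20_000, 2_000, 0.01
barrier = 2.5  # in units of the stationary standard deviation

x = np.zeros(n_paths)
crossed = np.zeros(n_paths, dtype=bool)
for _ in range(n_steps):
    # Euler-Maruyama step of dx = -x dt + sqrt(2) dW (stationary variance 1).
    x += -x * dt + np.sqrt(2 * dt) * rng.standard_normal(n_paths)
    crossed |= x > barrier

print(crossed.mean())  # estimated probability of up-crossing within T = 20
```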

  11. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies' highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma…

  12. Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem

    Directory of Open Access Journals (Sweden)

    Juliana Bueno-Soler

    2016-09-01

    Full Text Available This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs. We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions employing appropriate notions of conditional probability and paraconsistent updating, via a version of Bayes’ theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.
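
    For contrast with the paraconsistent extension, the conditionalization being generalized is ordinary Bayes' theorem; a minimal classical sketch with hypothetical numbers:

```python
def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """P(H|E) = P(E|H) P(H) / P(E), with P(E) via the law of total probability."""
    evidence = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / evidence

# Hypothetical numbers: prior 0.3, P(E|H) = 0.8, P(E|~H) = 0.2.
print(round(bayes_update(0.3, 0.8, 0.2), 3))  # 0.632
```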

  13. U.S. Nuclear Regulatory Commission Extremely Low Probability of Rupture pilot study: xLPR framework model user's guide

    International Nuclear Information System (INIS)

    Kalinich, Donald A.; Sallaberry, Cedric M.; Mattie, Patrick D.

    2010-01-01

    For the U.S. Nuclear Regulatory Commission (NRC) Extremely Low Probability of Rupture (xLPR) pilot study, Sandia National Laboratories (SNL) was tasked to develop and evaluate a probabilistic framework using a commercial software package for Version 1.0 of the xLPR Code. Version 1.0 of the xLPR code is focused on assessing the probability of rupture due to primary water stress corrosion cracking in dissimilar metal welds in pressurizer surge nozzles. Future versions of this framework will expand the capabilities to other cracking mechanisms and other piping systems for both pressurized water reactors and boiling water reactors. The goal of the pilot study project is to plan the xLPR framework transition from Version 1.0 to Version 2.0; hence the initial Version 1.0 framework and code development will be used to define the requirements for Version 2.0. The software documented in this report has been developed and tested solely for this purpose. This framework and demonstration problem will be used to evaluate the commercial software's capabilities and applicability for use in creating the final version of the xLPR framework. This report details the design, system requirements, and the steps necessary to use the commercial-code based xLPR framework developed by SNL.

  14. Nanofabrication principles, capabilities and limits

    CERN Document Server

    Cui, Zheng

    2017-01-01

    This second edition of Nanofabrication is one of the most comprehensive introductions on nanofabrication technologies and processes. A practical guide and reference, this book introduces readers to all of the developed technologies that are capable of making structures below 100nm. The principle of each technology is introduced and illustrated with minimum mathematics involved. Also analyzed are the capabilities of each technology in making sub-100nm structures, and the limits of preventing a technology from going further down the dimensional scale. This book provides readers with a toolkit that will help with any of their nanofabrication challenges.

  15. Judgmental Forecasting of Operational Capabilities

    DEFF Research Database (Denmark)

    Hallin, Carina Antonia; Tveterås, Sigbjørn; Andersen, Torben Juul

    This paper explores a new judgmental forecasting indicator, the Employee Sensed Operational Capabilities (ESOC). The purpose of the ESOC is to establish a practical prediction tool that can provide early signals about changes in financial performance by gauging frontline employees’ sensing of changes in the firm’s operational capabilities. We present the first stage of the development of ESOC by applying a formative measurement approach to test the index in relation to financial performance and against an organizational commitment scale. We use distributed lag models to test whether the ESOC…

  16. Ensuring US National Aeronautics Test Capabilities

    Science.gov (United States)

    Marshall, Timothy J.

    2010-01-01

    …process; and the reductions in wind tunnel testing requirements within the largest consumer of ATP wind tunnel test time, the Aeronautics Research Mission Directorate (ARMD). Retirement of the Space Shuttle Program and recent perturbations of NASA's Constellation Program will exacerbate this downward trend. Therefore, it is crucial that ATP periodically revisit and determine which of its test capabilities are strategically important, which qualify as low-risk redundancies that could be put in an inactive status or closed, and address the challenges associated with both sustainment and improvements to the test capabilities that must remain active. This presentation will provide an overview of the ATP vision, mission, and goals as well as the challenges and opportunities the program is facing both today and in the future. We will discuss the strategy ATP is taking over the next five years to address the National aeronautics test capability challenges and what the program will do to capitalize on its opportunities to ensure a ready, robust and relevant portfolio of National aeronautics test capabilities.

  17. Defining Baconian Probability for Use in Assurance Argumentation

    Science.gov (United States)

    Graydon, Patrick J.

    2016-01-01

    The use of assurance cases (e.g., safety cases) in certification raises questions about confidence in assurance argument claims. Some researchers propose to assess confidence in assurance cases using Baconian induction. That is, a writer or analyst (1) identifies defeaters that might rebut or undermine each proposition in the assurance argument and (2) determines whether each defeater can be dismissed or ignored and why. Some researchers also propose denoting confidence using the counts of defeaters identified and eliminated (which they call Baconian probability) and performing arithmetic on these measures. But Baconian probabilities were first defined as ordinal rankings which cannot be manipulated arithmetically. In this paper, we recount noteworthy definitions of Baconian induction, review proposals to assess confidence in assurance claims using Baconian probability, analyze how these comport with or diverge from the original definition, and make recommendations for future practice.

  18. A Stochastic Model for the Landing Dispersion of Hazard Detection and Avoidance Capable Flight Systems

    Science.gov (United States)

    Witte, L.

    2014-06-01

    To support landing-site assessments for HDA-capable flight systems, and to facilitate trade studies between potential HDA architectures and the probability of safe landing each yields, a stochastic landing dispersion model has been developed.

  19. Satellite-based Tropical Cyclone Monitoring Capabilities

    Science.gov (United States)

    Hawkins, J.; Richardson, K.; Surratt, M.; Yang, S.; Lee, T. F.; Sampson, C. R.; Solbrig, J.; Kuciauskas, A. P.; Miller, S. D.; Kent, J.

    2012-12-01

    Satellite remote sensing capabilities to monitor tropical cyclone (TC) location, structure, and intensity have evolved by utilizing a combination of operational and research and development (R&D) sensors. The microwave imagers from the operational Defense Meteorological Satellite Program [Special Sensor Microwave/Imager (SSM/I) and the Special Sensor Microwave Imager Sounder (SSMIS)] form the "base" for structure observations due to their ability to view through upper-level clouds, modest size swaths and ability to capture most storm structure features. The NASA TRMM microwave imager and precipitation radar continue their 15+ year-long missions in serving the TC warning and research communities. NASA's QuikSCAT satellite, which ceased operation after more than a decade of service, is sorely missed, but India's OceanSat-2 scatterometer is now providing crucial ocean surface wind vectors in addition to the Navy's WindSat ocean surface wind vector retrievals. Another Advanced Scatterometer (ASCAT) onboard EUMETSAT's MetOp-2 satellite is slated for launch soon. Passive microwave imagery has received a much needed boost with the launch of the French/Indian Megha Tropiques imager in September 2011, greatly supplementing the very successful NASA TRMM pathfinder with a larger swath and more frequent temporal sampling. While initial data issues have delayed data utilization, current news indicates these data will be available in 2013. Future NASA Global Precipitation Mission (GPM) sensors starting in 2014 will provide enhanced capabilities. Also, the inclusion of the new microwave sounder data from the NPP ATMS (Oct 2011) will assist in mapping TC convective structures. The National Polar-orbiting Partnership (NPP) program's VIIRS sensor includes a day/night band (DNB) with the capability to view TC cloud structure at night when sufficient lunar illumination exists. Examples highlighting this new capability will be discussed in concert with additional data fusion efforts.

  20. Evaluating late detection capability against diverse insider adversaries

    International Nuclear Information System (INIS)

    Sicherman, A.

    1987-01-01

    This paper describes a model for evaluating the late (after-the-fact) detection capability of material control and accountability (MC&A) systems against insider theft or diversion of special nuclear material. Potential insider cover-up strategies to defeat activities providing detection (e.g., inventories) are addressed by the model in a tractable manner. For each potential adversary and detection activity, two probabilities are assessed and used to fit the model. The model then computes the probability of detection for activities occurring periodically over time. The model provides insight into MC&A effectiveness and helps identify areas for safeguards improvement. 4 refs., 4 tabs
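
    The paper's model is not reproduced here, but the basic bookkeeping for periodic detection activities can be sketched as follows, assuming a per-activity detection probability, a per-activity cover-up (defeat) probability, and independence across periods:

```python
def cumulative_detection(p_detect, p_coverup, n_periods):
    """Probability of at least one detection in n independent periodic activities."""
    p_per_period = p_detect * (1 - p_coverup)
    return 1 - (1 - p_per_period) ** n_periods

for n in (1, 4, 12):  # e.g., one-off, quarterly, and monthly inventories over a year
    print(n, round(cumulative_detection(0.6, 0.5, n), 3))
```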

  1. The probability and the management of human error

    International Nuclear Information System (INIS)

    Duffey, R.B.; Saull, J.W.

    2004-01-01

    Embedded within modern technological systems, human error is the largest, and indeed dominant, contributor to accident cause. The consequences dominate the risk profiles for nuclear power and for many other technologies. We need to quantify the probability of human error for the system as an integral contribution within the overall system failure, as it is generally not separable or predictable for actual events. We also need to provide a means to manage and effectively reduce the failure (error) rate. The fact that humans learn from their mistakes allows a new determination of the dynamic probability and human failure (error) rate in technological systems. The result is consistent with and derived from the available world data for modern technological systems. Comparisons are made to actual data from large technological systems and recent catastrophes. Best estimate values and relationships can be derived for both the human error rate and the probability. We describe the potential for new approaches to the management of human error and safety indicators, based on the principles of error state exclusion and of the systematic effect of learning. A new equation is given for the probability of human error (λ) that combines the influences of early inexperience, learning from experience (ε) and stochastic occurrences, while retaining a finite minimum rate: λ = 5×10⁻⁵ + ((1/ε) − 5×10⁻⁵) exp(−3ε). The future failure rate is entirely determined by the experience: thus the past defines the future.
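
    The quoted equation is straightforward to evaluate; the sketch below tabulates the rate as accumulated experience ε grows, showing the decay from the novice rate 1/ε toward the minimum attainable rate 5×10⁻⁵:

```python
import math

LAMBDA_MIN = 5e-5  # minimum attainable error rate from the equation above

def human_error_rate(eps):
    """lambda(eps) = 5e-5 + (1/eps - 5e-5) * exp(-3 * eps)."""
    return LAMBDA_MIN + (1.0 / eps - LAMBDA_MIN) * math.exp(-3.0 * eps)

for eps in (0.1, 0.5, 1.0, 2.0, 5.0):
    print(eps, f"{human_error_rate(eps):.3e}")
```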

  2. Building server capabilities in China

    DEFF Research Database (Denmark)

    Adeyemi, Oluseyi; Slepniov, Dmitrij; Wæhrens, Brian Vejrum

    2012-01-01

    The purpose of this paper is to further our understanding of multinational companies building server capabilities in China. The paper is based on the cases of two western companies with operations in China. The findings highlight a number of common patterns in the 1) managerial challenges related...

  3. Microfoundations of Routines and Capabilities

    DEFF Research Database (Denmark)

    Felin, Teppo; Foss, Nicolai Juul; Heimeriks, Koen H.

    We discuss the microfoundations of routines and capabilities, including why a microfoundations view is needed and how it may inform work on organizational and competitive heterogeneity. Building on extant research, we identify three primary categories of micro-level components underlying routines...

  4. Microfoundations of Routines and Capabilities

    DEFF Research Database (Denmark)

    Felin, Teppo; Foss, Nicolai Juul; Heimeriks, Koen H.

    2012-01-01

    This article introduces the Special Issue and discusses the microfoundations of routines and capabilities, including why a microfoundations view is needed and how it may inform work on organizational and competitive heterogeneity. Building on extant research, we identify three primary categories ...

  5. Capability and Learning to Choose

    Science.gov (United States)

    Leßmann, Ortrud

    2009-01-01

    The Capability Approach (henceforth CA) is in the first place an approach to the evaluation of individual well-being and social welfare. Many disciplines refer to the CA, first and foremost welfare economics, development studies and political philosophy. Educational theory was not among the first disciplines that took notice of the CA, but has a…

  6. Research for new UAV capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Canavan, G.H.; Leadabrand, R.

    1996-07-01

    This paper discusses research for new Unmanned Aerial Vehicles (UAV) capabilities. Findings indicate that UAV performance could be greatly enhanced by modest research. Improved sensors and communications enhance near term cost effectiveness. Improved engines, platforms, and stealth improve long term effectiveness.

  7. Imperfection detection probability at ultrasonic testing of reactor vessels

    International Nuclear Information System (INIS)

    Kazinczy, F. de; Koernvik, L.Aa.

    1980-02-01

    The report is a lecture given at a symposium organized by the Swedish nuclear power inspectorate in February 1980. Equipment, calibration, and testing procedures are reported. The estimation of defect detection probability for ultrasonic tests and the reliability of literature data are discussed. Practical testing of reactor vessels and welded joints is described. Swedish test procedures are compared with those of other countries. Series of test data for welded joints of the OKG-2 reactor are presented. Recommendations for future testing procedures are made. (GBn)

  8. Development of students learning capabilities and professional capabilities

    DEFF Research Database (Denmark)

    Ringtved, Ulla Lunde; Wahl, Christian; Belle, Gianna

    This paper describes work in progress on a project that aims to develop a tool that, via learning analytics methods, enables students to enhance, document and assess the development of their learning capabilities and professional capabilities in consequence of their self-initiated study activities during their bachelor educations. The tool aims at enhancing the development of students’ capabilities to self-initiate, self-regulate and self-assess their study activities. The tool uses the concept of collective intelligence as a source of motivation and inspiration in self-initiating study activities as well as in self-assessing them. The tool is based on a heutagogical approach to support reflection on the learning potential in these activities. This enhances the educational use of students’ self-initiated learning activities by bringing visibility and evidence to them, and thereby bringing value to the assessment…

  9. Technological capability at the Brazilian official pharmaceutical laboratories

    Directory of Open Access Journals (Sweden)

    José Vitor Bomtempo Martins

    2008-10-01

    Full Text Available This paper studies technological capability in the Brazilian Official Pharmaceutical Laboratories (OPL). The technological capability analysis could contribute to organization strategies and governmental actions in order to improve OPL basic tasks as well as to incorporate new ones, particularly concerning innovation management. Inspired by Figueiredo (2000, 2003a, 2003b) and Figueiredo and Ariffin (2003), a framework was drawn and adapted to pharmaceutical industry characteristics and current sanitary and health legislation. The framework allows mapping of different dimensions of technological capability (installations, processes, products, equipment, organizational capability and knowledge management) and the level attained by the OPL (ordinary or innovating capability). The OPL show good development of ordinary capabilities, particularly in Products and Processes. Concerning the other dimensions, the OPL are quite diverse. In general, innovating capabilities are not much developed. In the short term, a dispersion in the capability-building efforts was identified. Considering their present level and the absorption efforts, good perspectives can be found in Installations, Processes and Organizational Capability. A lower level of effort in Products and Knowledge Management could undermine these capabilities in the future.

  10. Probability concepts in quality risk management.

    Science.gov (United States)

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk is generally a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management tools are relatively silent on the meaning and uses of "probability." The probability concept is typically applied by risk managers as a combination of frequency-based calculation and a "degree of belief" meaning of probability. Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, marketing to product discontinuation. Probability as a concept that is crucial for understanding and managing risk is discussed through examples from the most general, scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management.

  11. Transition probability spaces in loop quantum gravity

    Science.gov (United States)

    Guo, Xiao-Kan

    2018-03-01

    We study the (generalized) transition probability spaces, in the sense of Mielnik and Cantoni, for spacetime quantum states in loop quantum gravity. First, we show that loop quantum gravity admits the structures of transition probability spaces. This is exemplified by first checking such structures in covariant quantum mechanics and then identifying the transition probability spaces in spin foam models via a simplified version of general boundary formulation. The transition probability space thus defined gives a simple way to reconstruct the discrete analog of the Hilbert space of the canonical theory and the relevant quantum logical structures. Second, we show that the transition probability space and in particular the spin foam model are 2-categories. Then we discuss how to realize in spin foam models two proposals by Crane about the mathematical structures of quantum gravity, namely, the quantum topos and causal sites. We conclude that transition probability spaces provide us with an alternative framework to understand various foundational questions of loop quantum gravity.

  12. Towards a Categorical Account of Conditional Probability

    Directory of Open Access Journals (Sweden)

    Robert Furber

    2015-11-01

    Full Text Available This paper presents a categorical account of conditional probability, covering both the classical and the quantum case. Classical conditional probabilities are expressed as a certain "triangle-fill-in" condition, connecting marginal and joint probabilities, in the Kleisli category of the distribution monad. The conditional probabilities are induced by a map together with a predicate (the condition). The latter is a predicate in the logic of effect modules on this Kleisli category. This same approach can be transferred to the category of C*-algebras (with positive unital maps), whose predicate logic is also expressed in terms of effect modules. Conditional probabilities can again be expressed via a triangle-fill-in property. In the literature, there are several proposals for what quantum conditional probability should be, and also there are extra difficulties not present in the classical case. At this stage, we only describe quantum systems with classical parametrization.

  13. UT Biomedical Informatics Lab (BMIL) probability wheel

    Science.gov (United States)

    Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B.; Sun, Clement; Fan, Kaili; Reece, Gregory P.; Kim, Min Soon; Markey, Mia K.

    A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant", about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.

  14. A probability space for quantum models

    Science.gov (United States)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, the choice of constraints, and the probability assignment made by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
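
    As a minimal sketch of the maximum-entropy assignment described above, the snippet below solves for the Lagrange multiplier of a single mean-value constraint over a finite outcome set, yielding Gibbs-form probabilities. The outcome values and target mean are hypothetical illustrations, not taken from the paper.

    import numpy as np
    from scipy.optimize import brentq

    levels = np.array([0.0, 1.0, 2.0, 3.0])   # hypothetical outcome values
    target_mean = 1.2                          # constraint: <x> = 1.2

    def mean_at(lam):
        # The MaxEnt solution under a mean constraint has Gibbs form p_i ~ exp(-lam * x_i)
        w = np.exp(-lam * levels)
        return (w / w.sum()) @ levels

    # Find the multiplier satisfying the constraint, then recover the probabilities
    lam = brentq(lambda l: mean_at(l) - target_mean, -50.0, 50.0)
    w = np.exp(-lam * levels)
    p = w / w.sum()
    print("MaxEnt probabilities:", p, "mean:", p @ levels)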

  15. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study. Key features: a good and solid introduction to probability theory and stochastic processes; logical organization and clear writing; a comprehensive choice of topics within probability; and ample homework problems organized into chapter sections.

  16. Striatal activity is modulated by target probability.

    Science.gov (United States)

    Hon, Nicholas

    2017-06-14

    Target probability has well-known neural effects. In the brain, target probability is known to affect frontal activity, with lower-probability targets producing more prefrontal activation than those that occur with higher probability. Although the effect of target probability on cortical activity is well specified, its effect on subcortical structures such as the striatum is less well understood. Here, I examined this issue and found that the striatum was highly responsive to target probability. This is consistent with its hypothesized role in the gating of salient information into higher-order task representations. The current data are interpreted in light of the fact that different components of the striatum are sensitive to different types of task-relevant information.

  17. Defining Probability in Sex Offender Risk Assessment.

    Science.gov (United States)

    Elwood, Richard W

    2016-12-01

    There is ongoing debate and confusion over using actuarial scales to predict individuals' risk of sexual recidivism. Much of the debate comes from not distinguishing Frequentist from Bayesian definitions of probability. Much of the confusion comes from applying Frequentist probability to individuals' risk. By definition, only Bayesian probability can be applied to the single case. The Bayesian concept of probability resolves most of the confusion and much of the debate in sex offender risk assessment. Although Bayesian probability is well accepted in risk assessment generally, it has not been widely used to assess the risk of sex offenders. I review the two concepts of probability and show how the Bayesian view alone provides a coherent scheme to conceptualize individuals' risk of sexual recidivism.
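
    As a minimal sketch of the Bayesian single-case computation this abstract advocates, the snippet below updates a base rate with a likelihood ratio attached to an actuarial score. Both numbers are hypothetical illustrations, not validated risk figures.

    def posterior_probability(base_rate: float, likelihood_ratio: float) -> float:
        """Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio."""
        prior_odds = base_rate / (1.0 - base_rate)
        posterior_odds = prior_odds * likelihood_ratio
        return posterior_odds / (1.0 + posterior_odds)

    # Hypothetical example: 10% base rate, actuarial score carrying likelihood ratio 3.0
    print(posterior_probability(0.10, 3.0))   # -> 0.25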

  18. Human-Centered Design Capability

    Science.gov (United States)

    Fitts, David J.; Howard, Robert

    2009-01-01

    For NASA, human-centered design (HCD) seeks opportunities to mitigate the challenges of living and working in space in order to enhance human productivity and well-being. Direct design participation during the development stage is difficult; during project formulation, however, an HCD approach can lead to better, more cost-effective products. HCD can also help a program enter the development stage with a clear vision for product acquisition. HCD tools for clarifying design intent are listed. To infuse HCD into the spaceflight lifecycle, the Space and Life Sciences Directorate developed the Habitability Design Center. The Center has collaborated successfully with program and project design teams and with JSC's Engineering Directorate. This presentation discusses HCD capabilities and depicts the Center's design examples and capabilities.

  19. Developing Acquisition IS Integration Capabilities

    DEFF Research Database (Denmark)

    Wynne, Peter J.

    2016-01-01

    An under-researched yet critical challenge of Mergers and Acquisitions (M&A) is what to do with the two organisations' information systems (IS) post-acquisition. Existing theory on this process, commonly referred to as acquisition IS integration, suggests that to integrate the information systems successfully, an acquiring company must leverage two high-level capabilities: diagnosis and integration execution. Through a case study, this paper identifies how a novice acquirer develops these capabilities in anticipation of an acquisition by examining its use of learning processes. The study finds the novice acquirer applies trial-and-error, experimental, and vicarious learning processes, while actively avoiding improvisational learning. The results of the study contribute to the acquisition IS integration literature specifically by exploring it from a new perspective: the learning processes used by novice acquirers.

  20. LOFT Augmented Operator Capability Program

    International Nuclear Information System (INIS)

    Hollenbeck, D.A.; Krantz, E.A.; Hunt, G.L.; Meyer, O.R.

    1980-01-01

    The outline of the LOFT Augmented Operator Capability Program is presented. This program utilizes the LOFT (Loss-of-Fluid Test) reactor facility, located at the Idaho National Engineering Laboratory, and the LOFT operational transient experiment series as a test bed for methods of enhancing the reactor operator's capability for safer operation. The design of an Operational Diagnostics and Display System, backfit to the existing data acquisition computers, is presented. Basic color-graphic displays of the process schematic and trend types are presented. In addition, displays were developed and are presented which represent safety state vector information. A task analysis method was applied to LOFT reactor operating procedures to test its usefulness in defining the operator's information needs and workload.

  1. Computer-assisted sperm analysis (CASA): capabilities and potential developments.

    Science.gov (United States)

    Amann, Rupert P; Waberski, Dagmar

    2014-01-01

    Computer-assisted sperm analysis (CASA) systems have evolved over approximately 40 years, through advances in devices to capture the image from a microscope, huge increases in computational power concurrent with an amazing reduction in the size of computers, new computer languages, and updated/expanded software algorithms. Remarkably, basic concepts for identifying sperm and their motion patterns are little changed. Older and slower systems remain in use. Most major spermatology laboratories and semen processing facilities have a CASA system, but the extent of reliance thereon ranges widely. This review describes capabilities and limitations of present CASA technology used with boar, bull, and stallion sperm, followed by possible future developments. Each marketed system is different. Modern CASA systems can automatically view multiple fields in a shallow specimen chamber to capture strobe-like images of 500 to >2000 sperm, at 50 or 60 frames per second, in clear or complex extenders, and can integrate information for ≥30 frames to provide summary data for each spermatozoon and the population. A few systems evaluate sperm morphology concurrent with motion. CASA cannot accurately predict 'fertility' that will be obtained with a semen sample or subject. However, when carefully validated, current CASA systems provide information important for quality assurance of semen planned for marketing, and for understanding the diversity of sperm responses to changes in the microenvironment in research. The four take-home messages from this review are: (1) animal species, extender or medium, specimen chamber, intensity of illumination, imaging hardware and software, instrument settings, technician, etc., all affect the accuracy and precision of output values; (2) semen production facilities probably do not need a substantially different CASA system, whereas biology laboratories would benefit from systems capable of imaging and tracking sperm in deep chambers for a flexible period of time.

  2. Continuation of probability density functions using a generalized Lyapunov approach

    Energy Technology Data Exchange (ETDEWEB)

    Baars, S., E-mail: s.baars@rug.nl [Johann Bernoulli Institute for Mathematics and Computer Science, University of Groningen, P.O. Box 407, 9700 AK Groningen (Netherlands); Viebahn, J.P., E-mail: viebahn@cwi.nl [Centrum Wiskunde & Informatica (CWI), P.O. Box 94079, 1090 GB, Amsterdam (Netherlands); Mulder, T.E., E-mail: t.e.mulder@uu.nl [Institute for Marine and Atmospheric research Utrecht, Department of Physics and Astronomy, Utrecht University, Princetonplein 5, 3584 CC Utrecht (Netherlands); Kuehn, C., E-mail: ckuehn@ma.tum.de [Technical University of Munich, Faculty of Mathematics, Boltzmannstr. 3, 85748 Garching bei München (Germany); Wubs, F.W., E-mail: f.w.wubs@rug.nl [Johann Bernoulli Institute for Mathematics and Computer Science, University of Groningen, P.O. Box 407, 9700 AK Groningen (Netherlands); Dijkstra, H.A., E-mail: h.a.dijkstra@uu.nl [Institute for Marine and Atmospheric research Utrecht, Department of Physics and Astronomy, Utrecht University, Princetonplein 5, 3584 CC Utrecht (Netherlands); School of Chemical and Biomolecular Engineering, Cornell University, Ithaca, NY (United States)

    2017-05-01

    Techniques from numerical bifurcation theory are very useful to study transitions between steady fluid flow patterns and the instabilities involved. Here, we provide computational methodology to use parameter continuation in determining probability density functions of systems of stochastic partial differential equations near fixed points, under a small noise approximation. Key innovation is the efficient solution of a generalized Lyapunov equation using an iterative method involving low-rank approximations. We apply and illustrate the capabilities of the method using a problem in physical oceanography, i.e. the occurrence of multiple steady states of the Atlantic Ocean circulation.
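
    A minimal sketch of the central linear-algebra step: under the small-noise approximation, the stationary covariance C of the linearized dynamics at a stable fixed point solves the Lyapunov equation A C + C A^T + B B^T = 0. The paper's contribution is an iterative low-rank solver for the large-scale generalized case; the dense solve and the matrices below are illustrative stand-ins.

    import numpy as np
    from scipy.linalg import solve_continuous_lyapunov

    A = np.array([[-1.0, 0.5],
                  [ 0.0, -2.0]])   # hypothetical stable Jacobian at the fixed point
    B = np.array([[0.1],
                  [0.3]])          # hypothetical noise-input matrix

    C = solve_continuous_lyapunov(A, -B @ B.T)   # solves A C + C A^T = -B B^T
    print(C)   # covariance of the Gaussian stationary probability density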

  3. Energy futures

    International Nuclear Information System (INIS)

    Treat, J.E.

    1990-01-01

    In this updated text, fifteen of the futures industry's leading authorities provide broader background in both the theory and practice of energy futures trading. The authors review the history of the futures market and the fundamentals of trading, hedging, and technical analysis, then bring readers up to date on the newest trends in energy futures trading: natural gas futures, options, regulations, and new information services. The appendices outline examples of possible contracts and their construction.

  4. Exploration Medical Capability - Technology Watch

    Science.gov (United States)

    Krihak, Michael; Watkins, Sharmila; Barr, Yael; Barsten, Kristina; Fung, Paul; Baumann, David

    2011-01-01

    The objectives of the Technology Watch process are to identify emerging, high-impact technologies that augment current ExMC development efforts, and to work with academia, industry, and other government agencies to accelerate the development of medical care and research capabilities for the mitigation of potential health issues that could occur during space exploration missions. The establishment of collaborations with these entities is beneficial to technology development, assessment and/or insertion. Such collaborations also further NASA's goal to provide a safe and healthy environment for human exploration. The Tech Watch project addresses requirements and capabilities identified by knowledge and technology gaps that are derived from a discrete set of medical conditions that are most likely to occur on exploration missions. These gaps are addressed through technology readiness level assessments, market surveys, collaborations and distributed innovation opportunities. Ultimately, these gaps need to be closed with respect to exploration missions, which may be achieved through technology development projects. Information management is a key aspect of this process, where Tech Watch related meetings, research articles, collaborations and partnerships are tracked by the HRP's Exploration Medical Capabilities (ExMC) Element. In 2011, ExMC will be introducing the Tech Watch external website and evidence wiki that will provide access to ExMC technology and knowledge gaps, technology needs and requirements documents.

  5. Evolving Capabilities for Virtual Globes

    Science.gov (United States)

    Glennon, A.

    2006-12-01

    Though thin-client spatial visualization software like Google Earth and NASA World Wind enjoy widespread popularity, a common criticism is their general lack of analytical functionality. This concern, however, is rapidly being addressed; standard and advanced geographic information system (GIS) capabilities are being developed for virtual globes, though not centralized into a single implementation or software package. The innovation mostly originates from the user community. Three such capabilities relevant to the earth science, education, and emergency management communities are modeling dynamic spatial phenomena, real-time data collection and visualization, and multi-input collaborative databases. Modeling dynamic spatial phenomena has been facilitated by joining virtual globe geometry definitions, like KML, to relational databases. Real-time data collection uses short scripts to transform user-contributed data into a format usable by virtual globe software. Similarly, collaborative data collection for virtual globes has become possible by dynamically referencing online, multi-person spreadsheets. Examples of these functions include mapping flows within a karst watershed, real-time disaster assessment and visualization, and a collaborative geyser-eruption spatial decision support system. Virtual globe applications will continue to evolve toward further analytical capabilities, richer temporal data handling, and scales from the nano to the intergalactic. This progression opens education and research avenues in all scientific disciplines.

  6. Stochastic and simulation models of maritime intercept operations capabilities

    OpenAIRE

    Sato, Hiroyuki

    2005-01-01

    The research formulates and exercises stochastic and simulation models to assess Maritime Intercept Operations (MIO) capabilities. The models focus on the surveillance operations of Maritime Patrol Aircraft (MPA). The analysis using the models estimates the probability with which a terrorist vessel (Red) is detected, correctly classified, and escorted for intensive investigation and neutralization before it leaves an area of interest (AOI). The difficulty of obtaining adequate int...

  7. Under the veil of neoliberalism: inequality, health, and capabilities

    OpenAIRE

    Kemp, Eagan

    2008-01-01

    The relationship between income inequality and health has received substantial attention in the fields of medical sociology and public health and continues to be debated. In Chile, previous findings indicate that there is an income inequality effect; respondents who live in areas with high inequality experience a greater probability of poor self-reported health. This study examines the Wilkinson income inequality hypothesis in a new way by using it in conjunction with Sen’s capability approac...

  8. Safety related requirements on future nuclear power plants

    International Nuclear Information System (INIS)

    Niehaus, F.

    1991-01-01

    Nuclear power has the potential to contribute significantly to the future energy supply. However, this requires continuous improvements in nuclear safety. Technological advancements and the implementation of safety culture will achieve, for future reactors of the present generation, a core-melt probability of less than 10^-5 per year, and a probability of less than 10^-6 per year for large releases of radioactive materials. There are older reactors which do not comply with present safety thinking. The paper reviews findings of a recent design review of WWER 440/230 plants. Advanced evolutionary designs might be capable of reducing the probability of significant off-site releases to less than 10^-7 per year. For such reactors there are inherent limitations to increasing safety further, due to the human element, the complexity of design, and the capability of the containment function. Therefore, revolutionary designs are being explored with the aim of eliminating the potential for off-site releases. In this context it seems advisable to explore concepts where the ultimate safety barrier is the fuel itself. (orig.)

  9. Futuring for Future Ready Librarians

    Science.gov (United States)

    Figueroa, Miguel A.

    2018-01-01

    Futurists and foresight professionals offer several guiding principles for thinking about the future. These principles can help people to think about the future and become more powerful players in shaping the preferred futures they want for themselves and their communities. The principles also fit in well as strategies to support the Future Ready…

  10. Achieving a Launch on Demand Capability

    Science.gov (United States)

    Greenberg, Joel S.

    2002-01-01

    The ability to place payloads [satellites] into orbit as and when required, often referred to as launch on demand, continues to be an elusive and largely unfulfilled goal. But what is the value of achieving launch on demand [LOD], and what metrics are appropriate? Achievement of a desired level of LOD capability must consider transportation system throughput, the alternative transportation systems that comprise the transportation architecture, transportation demand, reliability and failure recovery characteristics of the alternatives, schedule guarantees, launch delays, payload integration schedules, procurement policies, and other factors. Measures of LOD capability should relate to the objective of the transportation architecture: the placement of payloads into orbit as and when required. Launch on demand capability must be defined in probabilistic terms, such as the probability of not incurring a delay in excess of T when it is determined that it is necessary to place a payload into orbit. Three specific aspects of launch on demand are considered: [1] the ability to recover from adversity [i.e., a launch failure] and to keep up with the steady-state demand for placing satellites into orbit [this has been referred to as operability and resiliency], [2] the ability to respond to the requirement to launch a satellite when the need arises unexpectedly, either because of an unexpected [random] on-orbit satellite failure that requires replacement or because of the sudden recognition of an unanticipated requirement, and [3] the ability to recover from adversity [i.e., a launch failure] during the placement of a constellation into orbit. The objective of this paper is to outline a formal approach for analyzing alternative transportation architectures in terms of their ability to provide a LOD capability. The economic aspect of LOD is developed by establishing a relationship between scheduling and the elimination of on-orbit spares while achieving the desired level of on
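
    A minimal sketch of the probabilistic metric proposed here, P(delay > T), estimated by Monte Carlo for a single hypothetical launcher with a fixed stand-down time after each failure. All parameter values are illustrative assumptions, not figures from the paper.

    import random

    def delay_exceeds(T: float, reliability: float = 0.95,
                      base_delay: float = 30.0, standdown: float = 180.0) -> bool:
        """One sampled campaign: retry after each failure, accumulating delay (days)."""
        delay = base_delay
        while random.random() > reliability:   # launch attempt fails
            delay += standdown                 # wait out the failure-recovery period
        return delay > T

    trials = 100_000
    prob = sum(delay_exceeds(T=120.0) for _ in range(trials)) / trials
    print(f"Estimated P(delay > 120 days) ~ {prob:.4f}")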

  11. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents are developed. This implies that probabilities as well as inherent consequences can be analysed and assessed. The present paper outlines a method for evaluation of the probability of ship...

  12. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents have to be developed. This implies that probabilities as well as inherent consequences have to be analyzed and assessed. The present notes outline a method for evaluation of the probability...

  13. Introducing Disjoint and Independent Events in Probability.

    Science.gov (United States)

    Kelly, I. W.; Zwiers, F. W.

    Two central concepts in probability theory are those of independence and mutually exclusive events. This document is intended to provide suggestions to teachers that can be used to equip students with an intuitive, comprehensive understanding of these basic concepts in probability. The first section of the paper delineates mutually exclusive and…

  14. Selected papers on probability and statistics

    CERN Document Server

    2009-01-01

    This volume contains translations of papers that originally appeared in the Japanese journal Sūgaku. The papers range over a variety of topics in probability theory, statistics, and applications. This volume is suitable for graduate students and research mathematicians interested in probability and statistics.

  15. Collective probabilities algorithm for surface hopping calculations

    International Nuclear Information System (INIS)

    Bastida, Adolfo; Cruz, Carlos; Zuniga, Jose; Requena, Alberto

    2003-01-01

    General equations are derived that the transition probabilities of hopping algorithms in surface hopping calculations must obey to ensure equality between the average quantum and classical populations. These equations are solved for two particular cases. In the first, it is assumed that the probabilities are the same for all trajectories and that the number of hops is kept to a minimum. These assumptions specify the collective probabilities (CP) algorithm, in which the transition probabilities depend on the average populations over all trajectories. In the second case, the probabilities for each trajectory are assumed to be completely independent of the results from the other trajectories. There is then a unique solution of the general equations ensuring that the transition probabilities are equal to the quantum population of the target state, which is referred to as the independent probabilities (IP) algorithm. The fewest switches (FS) algorithm developed by Tully is accordingly understood as an approximate hopping algorithm that takes elements from the accurate CP and IP solutions. A numerical test of all these hopping algorithms is carried out for a one-dimensional two-state problem with two avoided crossings, which shows the accuracy and computational efficiency of the proposed collective probabilities algorithm, the limitations of the FS algorithm, and the similarity between the results offered by the IP algorithm and those obtained with the Ehrenfest method.
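
    A minimal sketch of the collective-probabilities idea for a two-state ensemble: one hopping probability, shared by all trajectories and computed from the average populations, makes the classical populations track prescribed quantum populations with a minimal number of hops. The Rabi-like population profile below is a hypothetical stand-in, not the paper's test problem.

    import numpy as np

    rng = np.random.default_rng(0)
    N, steps, dt, omega = 10_000, 200, 0.01, 3.0
    state = np.zeros(N, dtype=int)             # all trajectories start in state 0

    def pop1(t):                               # prescribed quantum population of state 1
        return np.sin(0.5 * omega * t) ** 2

    for k in range(steps):
        dpop = pop1((k + 1) * dt) - pop1(k * dt)   # required net population flux
        if dpop > 0:                           # minimal hopping: only 0 -> 1 hops
            on0 = state == 0
            p_hop = dpop / max(on0.mean(), 1e-12)  # one probability for all trajectories
            state[on0 & (rng.random(N) < p_hop)] = 1
        elif dpop < 0:                         # only 1 -> 0 hops
            on1 = state == 1
            p_hop = -dpop / max(on1.mean(), 1e-12)
            state[on1 & (rng.random(N) < p_hop)] = 0

    print("classical population:", state.mean(), "quantum:", pop1(steps * dt))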

  16. Examples of Neutrosophic Probability in Physics

    Directory of Open Access Journals (Sweden)

    Fu Yuhua

    2015-01-01

    Full Text Available This paper re-discusses the problems of the so-called "law of nonconservation of parity" and "accelerating expansion of the universe", and presents examples of determining the Neutrosophic Probability of the experiment of Chien-Shiung Wu et al. in 1957, and of determining the Neutrosophic Probability of accelerating expansion of the partial universe.

  17. Eliciting Subjective Probabilities with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation...

  18. Probability Issues in without Replacement Sampling

    Science.gov (United States)

    Joarder, A. H.; Al-Sabah, W. S.

    2007-01-01

    Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…

  19. Some open problems in noncommutative probability

    International Nuclear Information System (INIS)

    Kruszynski, P.

    1981-01-01

    A generalization of probability measures to non-Boolean structures is discussed. The starting point of the theory is the Gleason theorem about the form of measures on closed subspaces of a Hilbert space. The problems are formulated in terms of probability on lattices of projections in arbitrary von Neumann algebras. (Auth.)

  20. Probability: A Matter of Life and Death

    Science.gov (United States)

    Hassani, Mehdi; Kippen, Rebecca; Mills, Terence

    2016-01-01

    Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…

  1. Teaching Probability: A Socio-Constructivist Perspective

    Science.gov (United States)

    Sharma, Sashi

    2015-01-01

    There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in a socio-constructivist perspective, for teaching probability.

  2. Stimulus Probability Effects in Absolute Identification

    Science.gov (United States)

    Kent, Christopher; Lamberts, Koen

    2016-01-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…

  3. 47 CFR 1.1623 - Probability calculation.

    Science.gov (United States)

    2010-10-01

    47 CFR 1.1623 (2010-10-01): Telecommunication, Federal Communications Commission, General Practice and Procedure, Random Selection Procedures for Mass Media Services, General Procedures. § 1.1623 Probability calculation. (a) All calculations shall be...

  4. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular probabilistic Turing machines, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  5. Against All Odds: When Logic Meets Probability

    NARCIS (Netherlands)

    van Benthem, J.; Katoen, J.-P.; Langerak, R.; Rensink, A.

    2017-01-01

    This paper is a light walk along interfaces between logic and probability, triggered by a chance encounter with Ed Brinksma. It is not a research paper, or a literature survey, but a pointer to issues. I discuss both direct combinations of logic and probability and structured ways in which logic can

  6. An introduction to probability and stochastic processes

    CERN Document Server

    Melsa, James L

    2013-01-01

    Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.

  7. The probability of the false vacuum decay

    International Nuclear Information System (INIS)

    Kiselev, V.; Selivanov, K.

    1983-01-01

    The closed expression for the probability of the false vacuum decay in (1+1) dimensions is given. The probability of false vacuum decay is expressed as the product of an exponential quasiclassical factor and a functional determinant of the given form. The method for calculation of this determinant is developed and a complete answer for (1+1) dimensions is given.
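
    As a hedged sketch of the structure the abstract describes (quasiclassical exponential times a functional determinant), the standard quasiclassical form reads, in generic notation not specific to this paper:

    \Gamma/V \;=\; A\, e^{-S_E[\phi_c]}, \qquad
    A \;\propto\; \left|\frac{\det'\left(-\partial^2 + U''(\phi_c)\right)}
                            {\det\left(-\partial^2 + U''(\phi_{\mathrm{fv}})\right)}\right|^{-1/2},

    where \phi_c is the quasiclassical (bounce) solution, S_E its Euclidean action, \phi_{\mathrm{fv}} the false vacuum, and the prime marks omission of zero modes.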

  8. Probability elements of the mathematical theory

    CERN Document Server

    Heathcote, C R

    2000-01-01

    Designed for students studying mathematical statistics and probability after completing a course in calculus and real variables, this text deals with basic notions of probability spaces, random variables, distribution functions and generating functions, as well as joint distributions and the convergence properties of sequences of random variables. Includes worked examples and over 250 exercises with solutions.

  9. The transition probabilities of the reciprocity model

    NARCIS (Netherlands)

    Snijders, T.A.B.

    1999-01-01

    The reciprocity model is a continuous-time Markov chain model used for modeling longitudinal network data. A new explicit expression is derived for its transition probability matrix. This expression can be checked relatively easily. Some properties of the transition probabilities are given, as well

  10. Probability numeracy and health insurance purchase

    NARCIS (Netherlands)

    Dillingh, Rik; Kooreman, Peter; Potters, Jan

    2016-01-01

    This paper provides new field evidence on the role of probability numeracy in health insurance purchase. Our regression results, based on rich survey panel data, indicate that the expenditure on two out of three measures of health insurance first rises with probability numeracy and then falls again.

  11. Design Mechanism as Territorial Strategic Capability

    Directory of Open Access Journals (Sweden)

    Gianita BLEOJU

    2009-01-01

    Full Text Available The current exigencies that a territory must face for optimal positioning in future regional competition require the ability to design an appropriate mechanism that best valorizes the territory's capabilities. Such a construct is vital for sustainable territorial development and presupposes the creation of a specific body of knowledge from the exploitation of distinctive local resources and from unique value creation and allocation. Territorial mechanism design is a typical management decision about the identification, ownership and control of specific strategic capabilities and their combination in a distinctive territorial portfolio. The most difficult responsibility is to allocate the territorial value added, which is a source of conflict among territorial components. Our research covers the basics of two complementary territorial pillars, rural and tourism potential, and shows that the lack of specific design mechanisms explains the currently diminishing value of the Galati-Braila region. The proposed management system, relying upon a territorial control mechanism, will ensure a knowledge-sharing process via collaborative learning, with the final role of providing appropriate territorial attractiveness signals, reinforcing identity as a key factor of territorial attractability. Our paper is fully documented on three years of data analysis from the territorial area of interest, which offers the necessary empirical contrast for our proposed solution.

  12. Systems Engineering for Space Exploration Medical Capabilities

    Science.gov (United States)

    Mindock, Jennifer; Reilly, Jeffrey; Rubin, David; Urbina, Michelle; Hailey, Melinda; Hanson, Andrea; Burba, Tyler; McGuire, Kerry; Cerro, Jeffrey; Middour, Chris

    2017-01-01

    Human exploration missions that reach destinations beyond low Earth orbit, such as Mars, will present significant new challenges to crew health management. For the medical system, lack of consumable resupply, evacuation opportunities, and real-time ground support are key drivers toward greater autonomy. Recognition of the limited mission and vehicle resources available to carry out exploration missions motivates the Exploration Medical Capability (ExMC) Element's approach to enabling the necessary autonomy. The Element's work must integrate with the overall exploration mission and vehicle design efforts to successfully provide exploration medical capabilities. ExMC is applying systems engineering principles and practices to accomplish its goals. This paper discusses the structured and integrative approach that is guiding the medical system technical development. Assumptions for the required levels of care on exploration missions, medical system goals, and a Concept of Operations are early products that capture and clarify stakeholder expectations. Model-Based Systems Engineering techniques are then applied to define medical system behavior and architecture. Interfaces to other flight and ground systems, and within the medical system are identified and defined. Initial requirements and traceability are established, which sets the stage for identification of future technology development needs. An early approach for verification and validation, taking advantage of terrestrial and near-Earth exploration system analogs, is also defined to further guide system planning and development.

  13. Capability Development in an Offshoring Context

    DEFF Research Database (Denmark)

    Jaura, Manya

    Capability development can be defined as deliberate firm-level investment involving a search and learning process aimed at modifying or enhancing existing capabilities. Increasingly, firms are relocating advanced services to offshore locations resulting in the challenge of capability development ...

  14. Unpacking dynamic capability : a design perspective

    NARCIS (Netherlands)

    Mulders, D.E.M.; Romme, A.G.L.; Bøllingtoft, A.; Håkonsson, D.D.; Nielsen, J.F.; Snow, C.C; Ulhøi, J.

    2009-01-01

    This chapter reviews the dynamic capability literature to explore relationships between definition, operationalization, and measurement of dynamic capability. Subsequently, we develop a design-oriented approach toward dynamic capability that distinguishes between design rules, recurrent patterns of

  15. Summary of Sandia Laboratories technical capabilities

    International Nuclear Information System (INIS)

    1977-05-01

    The technical capabilities of Sandia Laboratories are detailed in a series of companion reports. In this summary the use of the capabilities in technical programs is outlined and the capabilities are summarized. 25 figures, 3 tables

  16. The enigma of probability and physics

    International Nuclear Information System (INIS)

    Mayants, L.

    1984-01-01

    This volume contains a coherent exposition of the elements of two unique sciences: probabilistics (the science of probability) and probabilistic physics (the application of probabilistics to physics). Proceeding from a key methodological principle, it starts with the disclosure of the true content of probability and the interrelation between probability theory and experimental statistics. This makes it possible to introduce a proper order in all the sciences dealing with probability and, by conceiving the real content of statistical mechanics and quantum mechanics in particular, to construct both as two interconnected domains of probabilistic physics. Consistent theories of the kinetics of physical transformations, decay processes, and intramolecular rearrangements are also outlined. The interrelation between the electromagnetic field, photons, and the theoretically discovered subatomic particle 'emon' is considered. Numerous internal imperfections of conventional probability theory, statistical physics, and quantum physics are exposed and removed; quantum physics no longer needs special interpretation. The EPR, Bohm, and Bell paradoxes are easily resolved, among others. (Auth.)

  17. Optimizing Probability of Detection Point Estimate Demonstration

    Science.gov (United States)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. These NDE methods are intended to detect real flaws such as cracks and crack-like flaws. A reliably detectable crack size is required for safe-life analysis of fracture-critical parts. The paper provides a discussion on optimizing probability of detection (POD) demonstration experiments using the point estimate method, which is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false (POF) calls, while keeping the flaw sizes in the set as small as possible.
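
    A minimal sketch of the binomial arithmetic behind the zero-miss point-estimate demonstration: with a 29-flaw set and a pass criterion of 29 detections out of 29, the probability of passing (PPD) is p**29 for a true POD of p, so a system with POD below 0.90 passes with probability under 5% (0.9**29 ≈ 0.047). The candidate POD values below are illustrative.

    from scipy.stats import binom

    n = 29                         # flaws of the same size in the demonstration set
    for p in (0.90, 0.95, 0.98):   # candidate true POD values
        ppd = binom.pmf(n, n, p)   # P(all n detected) = p**n
        print(f"true POD = {p:.2f}: probability of passing demonstration = {ppd:.3f}")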

  18. Alternative probability theories for cognitive psychology.

    Science.gov (United States)

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling. Copyright © 2013 Cognitive Science Society, Inc.

  19. NORSOF Military Assistance Capability Development

    Science.gov (United States)

    2016-09-01

    key to successful mentoring and partnering. Norwegian SOF, for example, focused on building friendships, playing sports together, and even engaging...sea route will likely become a vital route for global shipping and arctic tourism in the future. HRO at sea, with both Norwegian and Russian lives

  20. Developing A/E Capabilities

    International Nuclear Information System (INIS)

    Gonzalez, A.; Gurbindo, J.

    1987-01-01

    During the last few years, the methods used by EMPRESARIOS AGRUPADOS and INITEC to perform Architect-Engineering work in Spain for nuclear projects have undergone a process of significant change in project management and engineering approaches. Specific practical examples of management techniques and design practices which represent a good record of results will be discussed. They are identified as areas of special interest in developing A/E capabilities for nuclear projects. Command of these areas should produce major payoffs in local participation and contribute to achieving real nuclear engineering capabilities in the country. (author)

  1. Dynamic capabilities and network benefits

    Directory of Open Access Journals (Sweden)

    Helge Svare

    2017-01-01

    Full Text Available The number of publicly funded initiatives to establish or strengthen networks and clusters, in order to enhance innovation, has been increasing. Returns on such investments vary, and the aim of this study is to explore to what extent the variation in benefits for firms participating in networks or clusters can be explained by their dynamic capabilities (DC). Based on survey data from five Norwegian networks, the results suggest that firms with higher DC are more successful in harvesting the potential benefits of being a member of a network.

  2. PROGRAMS WITH DATA MINING CAPABILITIES

    Directory of Open Access Journals (Sweden)

    Ciobanu Dumitru

    2012-03-01

    Full Text Available The fact that the Internet has become a commodity in the world has created a framework for a new economy. Traditional businesses migrate to this new environment, which offers many features and options at relatively low prices. However, competitiveness is fierce, and successful Internet business is tied to rigorous use of all available information. The information is often hidden in data, and for its retrieval it is necessary to use software capable of applying data mining algorithms and techniques. In this paper we want to review some of the programs with data mining capabilities currently available in this area. We also propose some classifications of this software to assist those who wish to use such software.

  3. Human Capability, Mild Perfectionism and Thickened Educational Praxis

    Science.gov (United States)

    Walker, Melanie

    2008-01-01

    This paper argues for a mild perfectionism in applying Amartya Sen's capability approach for an education transformative of student agency and well-being. Key to the paper is the significance of education as a process of being and becoming in the future, and education's fundamental objective of a positively changed human being. The capability…

  4. Future accelerators (?)

    Energy Technology Data Exchange (ETDEWEB)

    John Womersley

    2003-08-21

    I describe the future accelerator facilities that are currently foreseen for electroweak scale physics, neutrino physics, and nuclear structure. I will explore the physics justification for these machines, and suggest how the case for future accelerators can be made.

  5. Modeling Stochastic Complexity in Complex Adaptive Systems: Non-Kolmogorov Probability and the Process Algebra Approach.

    Science.gov (United States)

    Sulis, William H

    2017-10-01

    Walter Freeman III pioneered the application of nonlinear dynamical systems theories and methodologies in his work on mesoscopic brain dynamics. Sadly, mainstream psychology and psychiatry still cling to linear, correlation-based data analysis techniques, which threaten to subvert the process of experimentation and theory building. In order to progress, it is necessary to develop tools capable of managing the stochastic complexity of complex biopsychosocial systems, which includes multilevel feedback relationships, nonlinear interactions, chaotic dynamics and adaptability. In addition, however, these systems exhibit intrinsic randomness, non-Gaussian probability distributions, non-stationarity, contextuality, and non-Kolmogorov probabilities, as well as the absence of mean and/or variance and conditional probabilities. These properties and their implications for statistical analysis are discussed. An alternative approach, the Process Algebra approach, is described. It is a generative model, capable of generating non-Kolmogorov probabilities. It has proven useful in addressing fundamental problems in quantum mechanics and in the modeling of developing psychosocial systems.

  6. Capability Building and Learning: An Emergent Behavior Approach

    Directory of Open Access Journals (Sweden)

    Andreu Rafael

    2014-12-01

    Full Text Available Economics-based models of firms typically overlook management acts and capability development. We propose a model that analyzes the aggregate behavior of a population of firms resulting from both specific management decisions and learning processes that induce changes in companies' capabilities. Decisions are made under imperfect information and bounded rationality, and managers may sacrifice short-term performance in exchange for qualitative outcomes that affect their firm's future potential. The proposed model provides a structured setting in which these issues (often discussed only informally) can be systematically analyzed through simulation, producing a variety of hard-to-anticipate emergent behaviors. Economic performance is quite sensitive to managers' estimates of their firms' capabilities, and companies willing to sacrifice short-run results for future potential appear to be more stable than the rest. Also, bounded rationality can produce chaotic dynamics reminiscent of real-life situations.

  7. Detection capabilities. Some historical footnotes

    International Nuclear Information System (INIS)

    Currie, L.A.

    2017-01-01

    Part I. Summary of relevant topics from 1923 to the present, including: Currie (Anal Chem 40:586-593, 1968) detection concepts and capabilities; international detection and uncertainty standards; failure of classical 14C dating and the birth of new scientific disciplines; exploratory nuclear data analysis of 85Kr monitors found coincident with the collapse of the Iron Curtain (1989); faulty statistics proved responsible for mistaken assertions that Currie's LC yields excessive false positives; low-level counting and AMS for atmospheric 37Ar and µmolar fossil/biomass carbon in the environment; the erroneous assumption that our low-level background is a Poisson process, linked to ~8% spurious anticoincidence events. Part II. Exact treatment of bivariate Poisson data, solved in the 1930s by Przyborowski and Wilenski, Krakow University, for detecting extreme trace amounts of a malicious contaminant (dodder) in high-purity seed standards. We adapted their treatment to detection capabilities in ultra-low-level nuclear counting. The timing of their work had great historical significance, marking the start of World War II with the invasion of Poland (1939). (author)

  8. WFPC2 Science Capability Report

    Science.gov (United States)

    Brown, David I.

    2001-01-01

    In the following pages, a brief outline of the salient science features of Wide Field/Planetary Camera 2 (WFPC2) that impact the proposal writing process and conceptual planning of observations is presented. At the time of writing, WFPC2, while having been better defined than in the past, is far from being at the stage where science and engineering details are well enough known that concrete observational/operational sequences can be planned with assurance. Conceptual issues are another matter. The thrust of the Science Capability Report at this time is to outline the known performance parameters and capabilities of WFPC2, filling in with specifications when necessary to hold a place for these items as they become known. Also, primary scientific and operational differences between WFPC 1 and 2 are discussed section-by-section, along with issues that remain to be determined and idiosyncrasies when known. Clearly the determination of the latter awaits some form of testing, most likely thermal/vacuum testing. All data in this report should be viewed with a jaundiced eye at this time.

  9. Upgrading Probability via Fractions of Events

    Directory of Open Access Journals (Sweden)

    Frič Roman

    2016-08-01

    Full Text Available The influence of "Grundbegriffe" by A. N. Kolmogorov (published in 1933) on education in the area of probability and its impact on research in stochastics cannot be overestimated. We would like to point out three aspects of the classical probability theory "calling for" an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures and observables, the dual maps to random variables) have very different "mathematical nature". Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events and elementary category theory, covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0, 1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0, 1]-valued functions), the minimal domain of probability containing "fractions" of classical random events, and we upgrade the notions of probability measure and random variable.
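
    A minimal sketch of the Łukasiewicz (truncated) operations underlying such an upgrade; on classical {0, 1} indicator values they reduce to Boolean OR, AND and NOT, while "fractions of events" in [0, 1] become admissible:

    def l_or(a: float, b: float) -> float:    # truncated sum
        return min(1.0, a + b)

    def l_and(a: float, b: float) -> float:   # Łukasiewicz t-norm (truncated product)
        return max(0.0, a + b - 1.0)

    def l_not(a: float) -> float:
        return 1.0 - a

    assert l_or(1, 0) == 1 and l_and(1, 1) == 1 and l_not(0) == 1   # Boolean case
    print(l_or(0.3, 0.5), l_and(0.3, 0.5), l_not(0.3))              # 0.8 0.0 0.7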

  10. Failure probability analysis of optical grid

    Science.gov (United States)

    Zhong, Yaoquan; Guo, Wei; Sun, Weiqiang; Jin, Yaohui; Hu, Weisheng

    2008-11-01

    Optical grid, an integrated computing environment based on optical networks, is expected to be an efficient infrastructure for supporting advanced data-intensive grid applications. In an optical grid, faults of both computational and network resources are inevitable due to the large scale and high complexity of the system. As optical-network-based distributed computing systems are extensively applied to data processing, the application failure probability has become an important indicator of application quality and an important aspect that operators consider. This paper presents a task-based method for analyzing the application failure probability in an optical grid. The failure probability of the entire application can then be quantified, and the performance of different backup strategies in reducing the application failure probability can be compared, so that different clients' requirements on application failure probability can be satisfied. In an optical grid, when an application based on a DAG (directed acyclic graph) is executed under different backup strategies, the application failure probability and the application completion time differ. This paper proposes a new multi-objective differentiated services algorithm (MDSA) that can guarantee the failure probability requirement and improve network resource utilization, realizing a compromise between the network operator and the application submitter. Differentiated services can thus be achieved in an optical grid.
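
    A minimal sketch of a task-based calculation of this kind, under the simplifying assumptions that the application succeeds only if every task succeeds and that a task with k independent backup replicas fails only if all k fail. Task failure probabilities and replica counts below are hypothetical.

    def app_failure_probability(tasks):
        """tasks: list of (per-replica failure probability, number of replicas)."""
        p_success = 1.0
        for p_fail, replicas in tasks:
            p_task_fail = p_fail ** replicas   # all replicas must fail
            p_success *= 1.0 - p_task_fail
        return 1.0 - p_success

    no_backup   = [(0.02, 1), (0.05, 1), (0.01, 1)]
    with_backup = [(0.02, 2), (0.05, 2), (0.01, 1)]
    print(app_failure_probability(no_backup))     # ~0.078
    print(app_failure_probability(with_backup))   # ~0.013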

  11. Estimating the probability of rare events: addressing zero failure data.

    Science.gov (United States)

    Quigley, John; Revie, Matthew

    2011-07-01

    Traditional statistical procedures for estimating the probability of an event result in an estimate of zero when no events are realized. Alternative inferential procedures have been proposed for the situation where zero events have been realized, but often these are ad hoc, relying on selecting methods dependent on the data that have been realized. Such data-dependent inference decisions violate fundamental statistical principles, resulting in estimation procedures whose benefits are difficult to assess. In this article, we propose estimating the probability of an event occurring through minimax inference on the probability that future samples of equal size realize no more events than that in the data on which the inference is based. Although motivated by inference on rare events, the method is not restricted to zero-event data and closely approximates the maximum likelihood estimate (MLE) for nonzero data. The use of the minimax procedure provides a risk-averse inferential procedure where no events are realized. A comparison is made with the MLE, and regions of the underlying probability are identified where this approach is superior. Moreover, a comparison is made with three standard approaches to supporting inference where no event data are realized, which we argue are unduly pessimistic. We show that for situations of zero events the estimator can be simply approximated with 1/(2.5n), where n is the number of trials. © 2011 Society for Risk Analysis.
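
    A minimal sketch comparing point estimates for zero observed events in n trials: the maximum likelihood estimate (zero), the article's approximation 1/(2.5n), and, for context, the classical "rule of three" 95% upper bound 3/n:

    for n in (10, 50, 250):
        approx = 1.0 / (2.5 * n)    # approximation to the minimax estimator
        rule3 = 3.0 / n             # classical 95% upper confidence bound
        print(f"n={n:4d}  MLE=0.0  1/(2.5n)={approx:.4f}  3/n={rule3:.4f}")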

  12. Towards a National Space Weather Predictive Capability

    Science.gov (United States)

    Fox, N. J.; Ryschkewitsch, M. G.; Merkin, V. G.; Stephens, G. K.; Gjerloev, J. W.; Barnes, R. J.; Anderson, B. J.; Paxton, L. J.; Ukhorskiy, A. Y.; Kelly, M. A.; Berger, T. E.; Bonadonna, L. C. M. F.; Hesse, M.; Sharma, S.

    2015-12-01

    National needs in the area of space weather informational and predictive tools are growing rapidly. Adverse conditions in the space environment can cause disruption of satellite operations, communications, navigation, and electric power distribution grids, leading to a variety of socio-economic losses and impacts on our security. Future space exploration and most modern human endeavors will require major advances in physical understanding and improved transition of space research to operations. At present, only a small fraction of the latest research and development results from NASA, NOAA, NSF and DoD investments are being used to improve space weather forecasting and to develop operational tools. The power of modern research and space weather model development needs to be better utilized to enable comprehensive, timely, and accurate operational space weather tools. The mere production of space weather information is not sufficient to address the needs of those who are affected by space weather. A coordinated effort is required to support research-to-applications transition efforts and to develop the tools required by those who rely on this information. In this presentation we will review the space weather system developed for the Van Allen Probes mission, together with other datasets, tools and models that have resulted from research by scientists at JHU/APL. We will look at how these, and results from future missions such as Solar Probe Plus, could be applied to support space weather applications in coordination with other community assets and capabilities.

  13. Uncertainty about probability: a decision analysis perspective

    International Nuclear Information System (INIS)

    Howard, R.A.

    1988-01-01

    The issue of how to think about uncertainty about probability is framed and analyzed from the viewpoint of a decision analyst. The failure of nuclear power plants is used as an example. The key idea is to think of probability as describing a state of information on an uncertain event, and to pose the issue of uncertainty in this quantity as uncertainty about a number that would be definitive: it has the property that you would assign it as the probability if you knew it. Logical consistency requires that the probability to assign to a single occurrence in the absence of further information be the mean of the distribution of this definitive number, not the median as is sometimes suggested. Any decision that must be made without the benefit of further information must also be made using the mean of the definitive number's distribution. With this formulation, we find further that the probability of r occurrences in n exchangeable trials will depend on the first n moments of the definitive number's distribution. In making decisions, the expected value of clairvoyance on the occurrence of the event must be at least as great as that on the definitive number. If one of the events in question occurs, then the increase in probability of another such event is readily computed. This means, in terms of coin tossing, that unless one is absolutely sure of the fairness of a coin, seeing a head must increase the probability of heads, contrary to the usual intuition. A numerical example for nuclear power shows that the failure of one plant of a group with a low probability of failure can significantly increase the probability that must be assigned to failure of a second plant in the group.
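
    A hedged numerical illustration of the coin-tossing point, assuming (our choice, not the paper's) a Beta distribution for the "definitive number":

        from fractions import Fraction

        def prob_head(a: int, b: int) -> Fraction:
            # Mean of Beta(a, b): the probability to assign to a single toss.
            return Fraction(a, a + b)

        def prob_head_after_head(a: int, b: int) -> Fraction:
            # Posterior after one head is Beta(a + 1, b); its mean equals
            # E[p^2]/E[p] and exceeds E[p] whenever the prior has any spread.
            return Fraction(a + 1, a + b + 1)

        print(prob_head(2, 2), "->", prob_head_after_head(2, 2))          # 1/2 -> 3/5
        print(prob_head(200, 200), "->", prob_head_after_head(200, 200))  # nearly unchanged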

  14. Probability an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

    Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h

  15. Dependency models and probability of joint events

    International Nuclear Information System (INIS)

    Oerjasaeter, O.

    1982-08-01

    Probabilistic dependencies between components/systems are discussed with reference to a broad classification of potential failure mechanisms. Further, a generalized time-dependency model, based on conditional probabilities for estimation of the probability of joint events and event sequences is described. The applicability of this model is clarified/demonstrated by various examples. It is concluded that the described model of dependency is a useful tool for solving a variety of practical problems concerning the probability of joint events and event sequences where common cause and time-dependent failure mechanisms are involved. (Auth.)
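
    A minimal sketch of the basic ingredient (the abstract's generalized time-dependency model is richer): computing a joint failure probability through a conditional probability instead of assuming independence. All numbers below are illustrative:

        def p_joint(p_a: float, p_b_given_a: float) -> float:
            # P(A and B) = P(A) * P(B | A); with common-cause dependence,
            # P(B | A) can far exceed the unconditional P(B).
            return p_a * p_b_given_a

        p_a = p_b = 1.0e-3
        print("independence assumed:", p_a * p_b)          # 1e-06
        print("with common cause:   ", p_joint(p_a, 0.1))  # 1e-04, two orders higher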

  16. Handbook of probability theory and applications

    CERN Document Server

    Rudas, Tamas

    2008-01-01

    ""This is a valuable reference guide for readers interested in gaining a basic understanding of probability theory or its applications in problem solving in the other disciplines.""-CHOICEProviding cutting-edge perspectives and real-world insights into the greater utility of probability and its applications, the Handbook of Probability offers an equal balance of theory and direct applications in a non-technical, yet comprehensive, format. Editor Tamás Rudas and the internationally-known contributors present the material in a manner so that researchers of vari

  17. Probabilities on Streams and Reflexive Games

    Directory of Open Access Journals (Sweden)

    Andrew Schumann

    2014-01-01

    Full Text Available Probability measures on streams (e.g. on hypernumbers and p-adic numbers) have been defined. It was shown that these probabilities can be used for simulations of reflexive games. In particular, it can be proved that Aumann's agreement theorem does not hold for these probabilities. Instead of this theorem, there is a statement that is called the reflexion disagreement theorem. Based on this theorem, probabilistic and knowledge conditions can be defined for reflexive games at various reflexion levels, up to the infinite level. (original abstract)

  18. Concept of probability in statistical physics

    CERN Document Server

    Guttmann, Y M

    1999-01-01

    Foundational issues in statistical mechanics and the more general question of how probability is to be understood in the context of physical theories are both areas that have been neglected by philosophers of physics. This book fills an important gap in the literature by providing a most systematic study of how to interpret probabilistic assertions in the context of statistical mechanics. The book explores both subjectivist and objectivist accounts of probability, and takes full measure of work in the foundations of probability theory, in statistical mechanics, and in mathematical theory. It will be of particular interest to philosophers of science, physicists and mathematicians interested in foundational issues, and also to historians of science.

  19. Computation of the Complex Probability Function

    Energy Technology Data Exchange (ETDEWEB)

    Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ledwith, Patrick John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-22

    The complex probability function is important in many areas of physics, and many techniques have been developed in an attempt to compute it for some z quickly and efficiently. Most prominent are the methods that use Gauss-Hermite quadrature, which uses the roots of the nth-degree Hermite polynomial and corresponding weights to approximate the complex probability function. This document serves as an overview and discussion of the use, shortcomings, and potential improvements of the Gauss-Hermite quadrature for the complex probability function.
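
    A sketch of the quadrature idea the report discusses: for Im(z) > 0 the complex probability (Faddeeva) function has the representation w(z) = (i/pi) * Int exp(-t^2)/(z - t) dt, which Gauss-Hermite quadrature turns into a weighted sum over Hermite roots. scipy's wofz is used here only as a reference value:

        import numpy as np
        from scipy.special import wofz  # reference implementation

        def faddeeva_gauss_hermite(z: complex, n: int = 40) -> complex:
            # Nodes are the roots of the n-th Hermite polynomial; weights make
            # sum(w_k * f(t_k)) approximate the integral of exp(-t^2) * f(t).
            nodes, weights = np.polynomial.hermite.hermgauss(n)
            return (1j / np.pi) * np.sum(weights / (z - nodes))

        z = 1.0 + 0.5j
        print(faddeeva_gauss_hermite(z))
        print(wofz(z))  # agreement degrades as z approaches the real axis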

  20. Pre-aggregation for Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of probabilistic OLAP queries that operate on aggregate values that are probability distributions, and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate multidimensional data analysis of the kind considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions).

  1. Comparing linear probability model coefficients across groups

    DEFF Research Database (Denmark)

    Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt

    2015-01-01

    This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more of the following three components: outcome truncation, scale parameters and distributional shape of the predictor variable. These results point to limitations in using linear probability model coefficients for group comparisons. We also provide Monte Carlo simulations and real examples to illustrate these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons.

  2. Modeling experiments using quantum and Kolmogorov probability

    International Nuclear Information System (INIS)

    Hess, Karl

    2008-01-01

    Criteria are presented that permit a straightforward partition of experiments into sets that can be modeled using both quantum probability and the classical probability framework of Kolmogorov. These new criteria concentrate on the operational aspects of the experiments and lead beyond the commonly appreciated partition by relating experiments to commuting and non-commuting quantum operators as well as non-entangled and entangled wavefunctions. In other words the space of experiments that can be understood using classical probability is larger than usually assumed. This knowledge provides advantages for areas such as nanoscience and engineering or quantum computation.

  3. The probability-outcome correspondence principle: a dispositional view of the interpretation of probability statements

    NARCIS (Netherlands)

    Keren, G.; Teigen, K.H.

    2001-01-01

    This article presents a framework for lay people's internal representations of probabilities, which supposedly reflect the strength of underlying dispositions, or propensities, associated with the predicted event. From this framework, we derive the probability-outcome correspondence principle, which

  4. MHTGR inherent heat transfer capability

    International Nuclear Information System (INIS)

    Berkoe, J.M.

    1992-01-01

    This paper reports on the Commercial Modular High Temperature Gas-Cooled Reactor (MHTGR), which achieves improved reactor safety performance and reliability by utilizing a completely passive natural-convection cooling system, called the RCCS, to remove decay heat in the event that all active cooling systems fail to operate. In the highly improbable condition that the RCCS were to become non-functional following a reactor depressurization event, the plant would be forced to rely upon its inherent thermo-physical characteristics to reject decay heat to the surrounding earth and ambient environment. A computational heat transfer model was created to simulate such a scenario. Plant component temperature histories were computed over a period of 20 days into the event. The results clearly demonstrate the capability of the MHTGR to maintain core integrity and provide substantial lead time for taking corrective measures.

  5. Author's capabilities in author indexing

    International Nuclear Information System (INIS)

    Ooi, Shoichi

    1988-01-01

    The purpose of this paper is to assess authors' capability in current author indexing practices, based on journal literature indexing in the 'Journal of Nuclear Science and Technology' of Japan. This journal employs keywords freely assigned by authors rather than taken from the INIS Thesaurus or another vocabulary list. The author examined 413 papers, comparing the keywords assigned by the papers' authors with the descriptors (ATOMINDEX) assigned by an experienced professional indexer. The comparisons showed that the average set of terms assigned by an author included about 70% of all the terms assigned to the same paper by the professional indexer. Authors could eventually contribute substantially, since the most effective point at which to create references to information is at the time of its generation. Consequently, it may be possible to transfer author keywords easily to descriptors in any secondary information system. (author)

  6. Human Inferences about Sequences: A Minimal Transition Probability Model.

    Directory of Open Access Journals (Sweden)

    Florent Meyniel

    2016-12-01

    Full Text Available The brain constantly infers the causes of the inputs it receives and uses these inferences to generate statistical expectations about future observations. Experimental evidence for these expectations and their violations includes explicit reports, sequential effects on reaction times, and mismatch or surprise signals recorded in electrophysiology and functional MRI. Here, we explore the hypothesis that the brain acts as a near-optimal inference device that constantly attempts to infer the time-varying matrix of transition probabilities between the stimuli it receives, even when those stimuli are in fact fully unpredictable. This parsimonious Bayesian model, with a single free parameter, accounts for a broad range of findings on surprise signals, sequential effects and the perception of randomness. Notably, it explains the pervasive asymmetry between repetitions and alternations encountered in those studies. Our analysis suggests that a neural machinery for inferring transition probabilities lies at the core of human sequence knowledge.
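
    A simplified stand-in for the model described (leaky transition counts rather than the paper's exact Bayesian update; the decay constant plays the role of its single free parameter):

        import numpy as np

        def leaky_transition_estimates(seq, decay=0.9, prior=1.0):
            """Track P(next=1 | prev) for a binary sequence with forgetting."""
            counts = np.full((2, 2), prior)   # Laplace-style prior counts
            preds = []
            for prev, nxt in zip(seq[:-1], seq[1:]):
                p = counts[prev] / counts[prev].sum()
                preds.append(p[1])            # predicted P(next == 1)
                counts *= decay               # forget old evidence
                counts[prev, nxt] += 1.0
            return preds

        rng = np.random.default_rng(0)
        seq = rng.integers(0, 2, size=200).tolist()   # fully unpredictable input
        print(leaky_transition_estimates(seq)[-5:])   # estimates hover near 0.5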

  7. Production capability of the US uranium industry

    International Nuclear Information System (INIS)

    deVergie, P.C.; Anderson, J.R.; Miley, J.W.; Frederick, C.J.L.

    1980-01-01

    Demand for U3O8 through the late 1990s could be met at the grades and costs represented by the $30 resources, although for the next 3 or 4 years, production will probably be from the lower cost portions of these resources if prices remain low. However, to meet currently projected uranium requirements beyond the year 2000, there will have to be a transition by the mid-1990s to higher cost and lower grade production in order to include supply from the additional increment of resources available between the $30 and $50 levels. Plans and financial commitments required to accomplish such a transition must be initiated by the mid-1980s, since lead times are increasing for exploration and for mill licensing and construction. Engineering planning and feasibility analyses would have to be carried out on a more advanced time frame than previously required. The importance of the potential resources can easily be seen. In meeting the high-case demand during the years 2005 through 2019, more than 50% of the production would be from resources assigned to the $50 probable potential resource category. By about the year 2006, there will have had to be considerable development of the possible, and perhaps some of the speculative, resources to assure continued production expansion; by 2020, more than 50% of the production would depend on the previous successes in finding and developing such resources. The continuation of the current trend in production curtailment and decreasing exploration will significantly lessen the domestic uranium industry's ability to respond quickly to the projected increases in uranium requirements. The industry's future will be unsettled until it perceives clear indications of demand and price incentives that will justify long-term capital investments.

  8. Modelling the probability of building fires

    Directory of Open Access Journals (Sweden)

    Vojtěch Barták

    2014-12-01

    Full Text Available Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic as a model within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. Probability of fire calculated by means of model parameters and attributes of specific buildings can subsequently be visualized in probability maps.
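
    A hedged sketch of the modelling step: logistic regression mapping building attributes to a fire probability. The feature names and data below are hypothetical placeholders, not the study's actual predictors:

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(42)
        n = 500
        X = np.column_stack([
            rng.integers(1, 12, n),        # storeys (assumed feature)
            rng.uniform(50, 5000, n),      # floor area, m^2 (assumed feature)
            rng.integers(0, 2, n),         # residential (1) vs other (0)
        ])
        # Synthetic labels: larger, taller buildings burn more often here.
        logits = -4 + 0.1 * X[:, 0] + 0.0004 * X[:, 1] + 0.5 * X[:, 2]
        y = rng.random(n) < 1 / (1 + np.exp(-logits))

        model = LogisticRegression().fit(X, y)
        print(model.predict_proba(X[:3])[:, 1])   # per-building fire probability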

  9. Encounter Probability of Individual Wave Height

    DEFF Research Database (Denmark)

    Liu, Z.; Burcharth, H. F.

    1998-01-01

    … wave height corresponding to a certain exceedence probability within a structure lifetime (encounter probability), based on the statistical analysis of long-term extreme significant wave height. Then the design individual wave height is calculated as the expected maximum individual wave height associated with the design significant wave height, with the assumption that the individual wave heights follow the Rayleigh distribution. However, the exceedence probability of such a design individual wave height within the structure lifetime is unknown. The paper presents a method for the determination of the design individual wave height corresponding to an exceedence probability within the structure lifetime, given the long-term extreme significant wave height. The method can also be applied for estimation of the number of relatively large waves for fatigue analysis of constructions.
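
    A sketch of the two textbook ingredients the abstract combines, under standard assumptions (the formulas are the usual ones, not necessarily the paper's exact expressions):

        import math

        def encounter_probability(return_period_years: float, lifetime_years: float) -> float:
            # P(the T-year event is exceeded at least once during the lifetime)
            return 1.0 - (1.0 - 1.0 / return_period_years) ** lifetime_years

        def expected_max_individual_wave(hs: float, n_waves: int) -> float:
            # Rayleigh-distributed individual heights: E[H_max] ~ Hs * sqrt(ln(N)/2)
            return hs * math.sqrt(math.log(n_waves) / 2.0)

        print(encounter_probability(100.0, 50.0))       # ~0.395
        print(expected_max_individual_wave(8.0, 3000))  # metres, illustrative storm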

  10. Predicting binary choices from probability phrase meanings.

    Science.gov (United States)

    Wallsten, Thomas S; Jang, Yoonhee

    2008-08-01

    The issues of how individuals decide which of two events is more likely and of how they understand probability phrases both involve judging relative likelihoods. In this study, we investigated whether derived scales representing probability phrase meanings could be used within a choice model to predict independently observed binary choices. If they can, this simultaneously provides support for our model and suggests that the phrase meanings are measured meaningfully. The model assumes that, when deciding which of two events is more likely, judges take a single sample from memory regarding each event and respond accordingly. The model predicts choice probabilities by using the scaled meanings of individually selected probability phrases as proxies for confidence distributions associated with sampling from memory. Predictions are sustained for 34 of 41 participants but, nevertheless, are biased slightly low. Sequential sampling models improve the fit. The results have both theoretical and applied implications.

  11. Certainties and probabilities of the IPCC

    International Nuclear Information System (INIS)

    2004-01-01

    Based on an analysis of information about climate evolution, simulations of global warming, and the snow-coverage monitoring of Meteo-France, the IPCC presented its certainties and probabilities concerning the greenhouse effect. (A.L.B.)

  12. The probability factor in establishing causation

    International Nuclear Information System (INIS)

    Hebert, J.

    1988-01-01

    This paper discusses the possibilities and limitations of methods using the probability factor in establishing the causal link between bodily injury, whether immediate or delayed, and the nuclear incident presumed to have caused it (NEA) [fr

  13. Bayesian optimization for computationally extensive probability distributions.

    Science.gov (United States)

    Tamura, Ryo; Hukushima, Koji

    2018-01-01

    An efficient method for finding a better maximizer of computationally extensive probability distributions is proposed on the basis of a Bayesian optimization technique. A key idea of the proposed method is to use extreme values of acquisition functions by Gaussian processes for the next training phase, which should be located near a local maximum or a global maximum of the probability distribution. Our Bayesian optimization technique is applied to the posterior distribution in the effective physical model estimation, which is a computationally extensive probability distribution. Even when the number of sampling points on the posterior distributions is fixed to be small, the Bayesian optimization provides a better maximizer of the posterior distributions in comparison to those by the random search method, the steepest descent method, or the Monte Carlo method. Furthermore, the Bayesian optimization improves the results efficiently by combining the steepest descent method and thus it is a powerful tool to search for a better maximizer of computationally extensive probability distributions.
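
    A generic sketch of the approach (not the authors' implementation): Gaussian-process Bayesian optimization with an upper-confidence-bound acquisition, applied to a cheap stand-in log-posterior:

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        def log_posterior(x):                      # stand-in target distribution
            return -0.5 * ((x - 2.0) / 0.7) ** 2   # peak at x = 2

        rng = np.random.default_rng(1)
        X = rng.uniform(-5, 5, 5).reshape(-1, 1)   # few initial sampling points
        y = log_posterior(X).ravel()
        grid = np.linspace(-5, 5, 401).reshape(-1, 1)

        for _ in range(15):
            gp = GaussianProcessRegressor(kernel=RBF(1.0), normalize_y=True).fit(X, y)
            mu, sigma = gp.predict(grid, return_std=True)
            ucb = mu + 2.0 * sigma                 # upper-confidence-bound acquisition
            x_next = grid[np.argmax(ucb)].reshape(1, -1)
            X = np.vstack([X, x_next])
            y = np.append(y, log_posterior(x_next).ravel())

        print("best maximizer found:", X[np.argmax(y)][0])   # close to 2.0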

  14. Characteristic length of the knotting probability revisited

    International Nuclear Information System (INIS)

    Uehara, Erica; Deguchi, Tetsuo

    2015-01-01

    We present a self-avoiding polygon (SAP) model for circular DNA in which the radius of impermeable cylindrical segments corresponds to the screening length of double-stranded DNA surrounded by counter ions. For the model we evaluate the probability for a generated SAP with N segments having a given knot K through simulation. We call it the knotting probability of a knot K with N segments for the SAP model. We show that when N is large the most significant factor in the knotting probability is given by the exponentially decaying part exp(−N/N_K), where the estimates of parameter N_K are consistent with the same value for all the different knots we investigated. We thus call it the characteristic length of the knotting probability. We give formulae expressing the characteristic length as a function of the cylindrical radius r_ex, i.e. the screening length of double-stranded DNA. (paper)
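
    A sketch of how the characteristic length could be extracted in practice, by fitting the dominant exponential decay to knotting-probability data; the numbers below are synthetic, for illustration only:

        import numpy as np
        from scipy.optimize import curve_fit

        def decay(N, C, N_K):
            # Dominant behaviour at large N; the paper's full form also carries
            # a power-law prefactor in front of the exponential.
            return C * np.exp(-N / N_K)

        rng = np.random.default_rng(3)
        N = np.linspace(2e5, 1e6, 9)
        P = decay(N, 0.2, 2.5e5) * (1 + 0.03 * rng.standard_normal(N.size))

        params, _ = curve_fit(decay, N, P, p0=[0.1, 1e5])
        print("estimated N_K:", params[1])   # close to the true 2.5e5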

  15. Probability of Survival Decision Aid (PSDA)

    National Research Council Canada - National Science Library

    Xu, Xiaojiang; Amin, Mitesh; Santee, William R

    2008-01-01

    A Probability of Survival Decision Aid (PSDA) is developed to predict survival time for hypothermia and dehydration during prolonged exposure at sea in both air and water for a wide range of environmental conditions...

  16. Probability and statistics with integrated software routines

    CERN Document Server

    Deep, Ronald

    2005-01-01

    Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability concurrent with and integrated with statistics through interactive, tailored software applications designed to enhance the phenomena of probability and statistics. The software programs make the book unique. The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc., all wrapped up in the scientific method. See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698). Incorporates more than 1,000 engaging problems with answers; includes more than 300 solved examples; uses varied problem solving methods.

  17. Determining probabilities of geologic events and processes

    International Nuclear Information System (INIS)

    Hunter, R.L.; Mann, C.J.; Cranwell, R.M.

    1985-01-01

    The Environmental Protection Agency has recently published a probabilistic standard for releases of high-level radioactive waste from a mined geologic repository. The standard sets limits for contaminant releases with more than one chance in 100 of occurring within 10,000 years, and less strict limits for releases of lower probability. The standard offers no methods for determining probabilities of geologic events and processes, and no consensus exists in the waste-management community on how to do this. Sandia National Laboratories is developing a general method for determining probabilities of a given set of geologic events and processes. In addition, we will develop a repeatable method for dealing with events and processes whose probability cannot be determined. 22 refs., 4 figs

  18. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

    Motivated by the increasing need to analyze complex, uncertain multidimensional data, this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how the distributions can subsequently be used in pre-aggregation. Since the probability distributions can become large, we show how to achieve good time and space efficiency by approximating the distributions. We present the results of several experiments that demonstrate the effectiveness of our methods. The work is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions.

  19. Probability of spent fuel transportation accidents

    International Nuclear Information System (INIS)

    McClure, J.D.

    1981-07-01

    The transported volume of spent fuel, incident/accident experience and accident environment probabilities were reviewed in order to provide an estimate of spent fuel accident probabilities. In particular, the accident review assessed the accident experience for large casks of the type that could transport spent (irradiated) nuclear fuel. This review determined that since 1971, the beginning of official US Department of Transportation record keeping for accidents/incidents, there has been one spent fuel transportation accident. This information, coupled with estimated annual shipping volumes for spent fuel, indicated an estimated annual probability of a spent fuel transport accident of 5 x 10^-7 spent fuel accidents per mile. This is consistent with ordinary truck accident rates. A comparison of accident environments and regulatory test environments suggests that the probability of truck accidents exceeding the regulatory test for impact is approximately 10^-9/mile.
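
    The rate arithmetic in the abstract is simple enough to reproduce; the shipment figures here are made up for illustration:

        import math

        accidents = 1                      # one recorded spent fuel accident
        shipment_miles = 2.0e6             # hypothetical cumulative shipment miles
        rate = accidents / shipment_miles
        print(f"{rate:.1e} accidents per mile")        # order of 5e-7

        # Poisson probability of at least one accident over a planned campaign:
        planned_miles = 1.0e5
        print(1.0 - math.exp(-rate * planned_miles))   # ~0.049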

  20. Sampling, Probability Models and Statistical Reasoning: Statistical Inference

    Indian Academy of Sciences (India)

    Mohan Delampady, V R Padmawar. General Article, Resonance – Journal of Science Education, Volume 1, Issue 5, May 1996, pp. 49-58.

  1. Amartya Sen's Capability Approach and Education

    Science.gov (United States)

    Walker, Melanie

    2005-01-01

    The human capabilities approach developed by the economist Amartya Sen links development, quality of life and freedom. This article explores the key ideas in the capability approach of: capability, functioning, agency, human diversity and public participation in generating valued capabilities. It then considers how these ideas relate specifically…

  2. Imprecise Probability Methods for Weapons UQ

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-13

    Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.

  3. Escape and transmission probabilities in cylindrical geometry

    International Nuclear Information System (INIS)

    Bjerke, M.A.

    1980-01-01

    An improved technique for the generation of escape and transmission probabilities in cylindrical geometry was applied to the existing resonance cross section processing code ROLAIDS. The algorithm of Hwang and Toppel [ANL-FRA-TM-118] (with modifications) was employed. The probabilities generated were found to be as accurate as those given by the method previously applied in ROLAIDS, while requiring much less computer core storage and CPU time.

  4. Probability and statistics for computer science

    CERN Document Server

    Johnson, James L

    2011-01-01

    Comprehensive and thorough development of both probability and statistics for serious computer scientists; goal-oriented: "to present the mathematical analysis underlying probability results". Special emphases on simulation and discrete decision theory. Mathematically-rich, but self-contained text, at a gentle pace. Review of calculus and linear algebra in an appendix. Mathematical interludes (in each chapter) which examine mathematical techniques in the context of probabilistic or statistical importance. Numerous section exercises, summaries, historical notes, and Further Readings for reinforcem

  5. Collision Probabilities for Finite Cylinders and Cuboids

    Energy Technology Data Exchange (ETDEWEB)

    Carlvik, I

    1967-05-15

    Analytical formulae have been derived for the collision probabilities of homogeneous finite cylinders and cuboids. The formula for the finite cylinder contains double integrals, and the formula for the cuboid only single integrals. Collision probabilities have been calculated by means of the formulae and compared with values obtained by other authors. It was found that the calculations using the analytical formulae are much quicker and give higher accuracy than Monte Carlo calculations.

  6. Organizational Economics of Capability and Heterogeneity

    DEFF Research Database (Denmark)

    Argyres, Nicholas S.; Felin, Teppo; Foss, Nicolai Juul

    2012-01-01

    For decades, the literatures on firm capabilities and organizational economics have been at odds with each other, specifically relative to explaining organizational boundaries and heterogeneity. We briefly trace the history of the relationship between the capabilities literature and organizational economics, and we point to the dominance of a "capabilities first" logic in this relationship. We argue that capabilities considerations are inherently intertwined with questions about organizational boundaries and internal organization, and we use this point to respond to the prevalent capabilities first logic. We offer an integrative research agenda that focuses first on the governance of capabilities and then on the capability of governance.

  7. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under- and over-fitting the data, as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.
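
    A hedged sketch of one ingredient described above, not the authors' scoring function: if a trial CDF F matches the data, the values F(x_(k)) behave like uniform order statistics with mean k/(n+1), so large standardized deviations flag a poor fit:

        import numpy as np
        from scipy import stats

        def order_statistic_score(sample, trial_cdf):
            u = np.sort(trial_cdf(np.sort(sample)))
            n = len(u)
            k = np.arange(1, n + 1)
            mean = k / (n + 1)
            var = k * (n + 1 - k) / ((n + 1) ** 2 * (n + 2))  # Beta(k, n+1-k) variance
            return np.max(np.abs(u - mean) / np.sqrt(var))    # worst standardized residual

        data = np.random.default_rng(7).normal(0.0, 1.0, 300)
        print(order_statistic_score(data, stats.norm(0, 1).cdf))  # small: good fit
        print(order_statistic_score(data, stats.norm(1, 1).cdf))  # large: poor fit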

  8. Probability of detection as a function of multiple influencing parameters

    Energy Technology Data Exchange (ETDEWEB)

    Pavlovic, Mato

    2014-10-15

    Non-destructive testing is subject to measurement uncertainties. In safety critical applications the reliability assessment of its capability to detect flaws is therefore necessary. In most applications, the flaw size is the single most important parameter that influences the probability of detection (POD) of the flaw. That is why the POD is typically calculated and expressed as a function of the flaw size. The capability of the inspection system to detect flaws is established by comparing the size of the reliably detected flaw with the size of the flaw that is critical for the structural integrity. Applications where several factors have an important influence on the POD are investigated in this dissertation. To devise a reliable estimation of the NDT system capability, it is necessary to express the POD as a function of all these factors. A multi-parameter POD model is developed. It enables the POD to be calculated and expressed as a function of several influencing parameters. The model was tested on data from the ultrasonic inspection of copper and cast iron components with artificial flaws. Also, a technique to spatially present POD data, called the volume POD, is developed. The fusion of the POD data coming from multiple inspections of the same component with different sensors is performed to reach the overall POD of the inspection system.
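
    A sketch of the multi-parameter idea under our own assumptions: a logistic POD model in which detection depends on flaw size and one additional influencing parameter (an assumed probe frequency; the data are synthetic):

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(5)
        n = 400
        size = rng.uniform(0.5, 10.0, n)        # flaw size, mm
        freq = rng.choice([2.0, 5.0], n)        # probe frequency, MHz (assumed)
        logit = -3.0 + 0.8 * size + 0.4 * freq  # synthetic hit/miss mechanism
        hit = rng.random(n) < 1 / (1 + np.exp(-logit))

        pod = LogisticRegression().fit(np.column_stack([size, freq]), hit)
        # POD as a function of both parameters, e.g. a 3 mm flaw at each frequency:
        print(pod.predict_proba([[3.0, 2.0], [3.0, 5.0]])[:, 1])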

  9. NESTEM-QRAS: A Tool for Estimating Probability of Failure

    Science.gov (United States)

    Patel, Bhogilal M.; Nagpal, Vinod K.; Lalli, Vincent A.; Pai, Shantaram; Rusick, Jeffrey J.

    2002-10-01

    An interface between two NASA GRC specialty codes, NESTEM and QRAS, has been developed. This interface enables users to estimate, in advance, the risk of failure of a component, a subsystem, and/or a system under given operating conditions. This capability would provide a needed input for estimating the success rate for any mission. The NESTEM code, under development for the last 15 years at NASA Glenn Research Center, has the capability of estimating the probability of failure of components under varying loading and environmental conditions. This code performs sensitivity analysis of all the input variables and provides their influence on the response variables in the form of cumulative distribution functions. QRAS, also developed by NASA, assesses the risk of failure of a system or a mission based on the quantitative information provided by NESTEM or other similar codes, and a user-provided fault tree and modes of failure. This paper will briefly describe the capabilities of NESTEM, QRAS and the interface. Also, in this presentation we will describe, step by step, the process the interface uses, with an example.

  10. NESTEM-QRAS: A Tool for Estimating Probability of Failure

    Science.gov (United States)

    Patel, Bhogilal M.; Nagpal, Vinod K.; Lalli, Vincent A.; Pai, Shantaram; Rusick, Jeffrey J.

    2002-01-01

    An interface between two NASA GRC specialty codes, NESTEM and QRAS, has been developed. This interface enables users to estimate, in advance, the risk of failure of a component, a subsystem, and/or a system under given operating conditions. This capability would provide a needed input for estimating the success rate for any mission. The NESTEM code, under development for the last 15 years at NASA Glenn Research Center, has the capability of estimating the probability of failure of components under varying loading and environmental conditions. This code performs sensitivity analysis of all the input variables and provides their influence on the response variables in the form of cumulative distribution functions. QRAS, also developed by NASA, assesses the risk of failure of a system or a mission based on the quantitative information provided by NESTEM or other similar codes, and a user-provided fault tree and modes of failure. This paper will briefly describe the capabilities of NESTEM, QRAS and the interface. Also, in this presentation we will describe, step by step, the process the interface uses, with an example.

  11. Probability of detection as a function of multiple influencing parameters

    International Nuclear Information System (INIS)

    Pavlovic, Mato

    2014-01-01

    Non-destructive testing is subject to measurement uncertainties. In safety critical applications the reliability assessment of its capability to detect flaws is therefore necessary. In most applications, the flaw size is the single most important parameter that influences the probability of detection (POD) of the flaw. That is why the POD is typically calculated and expressed as a function of the flaw size. The capability of the inspection system to detect flaws is established by comparing the size of the reliably detected flaw with the size of the flaw that is critical for the structural integrity. Applications where several factors have an important influence on the POD are investigated in this dissertation. To devise a reliable estimation of the NDT system capability, it is necessary to express the POD as a function of all these factors. A multi-parameter POD model is developed. It enables the POD to be calculated and expressed as a function of several influencing parameters. The model was tested on data from the ultrasonic inspection of copper and cast iron components with artificial flaws. Also, a technique to spatially present POD data, called the volume POD, is developed. The fusion of the POD data coming from multiple inspections of the same component with different sensors is performed to reach the overall POD of the inspection system.

  12. Pattern formation, logistics, and maximum path probability

    Science.gov (United States)

    Kirkaldy, J. S.

    1985-05-01

    The concept of pattern formation, which to current researchers is a synonym for self-organization, carries the connotation of deductive logic together with the process of spontaneous inference. Defining a pattern as an equivalence relation on a set of thermodynamic objects, we establish that a large class of irreversible pattern-forming systems, evolving along idealized quasisteady paths, approaches the stable steady state as a mapping upon the formal deductive imperatives of a propositional function calculus. In the preamble the classical reversible thermodynamics of composite systems is analyzed as an externally manipulated system of space partitioning and classification based on ideal enclosures and diaphragms. The diaphragms have discrete classification capabilities which are designated in relation to conserved quantities by descriptors such as impervious, diathermal, and adiabatic. Differentiability in the continuum thermodynamic calculus is invoked as equivalent to analyticity and consistency in the underlying class or sentential calculus. The seat of inference, however, rests with the thermodynamicist. In the transition to an irreversible pattern-forming system the defined nature of the composite reservoirs remains, but a given diaphragm is replaced by a pattern-forming system which by its nature is a spontaneously evolving volume partitioner and classifier of invariants. The seat of volition or inference for the classification system is thus transferred from the experimenter or theoretician to the diaphragm, and with it the full deductive facility. The equivalence relations or partitions associated with the emerging patterns may thus be associated with theorems of the natural pattern-forming calculus. The entropy function, together with its derivatives, is the vehicle which relates the logistics of reservoirs and diaphragms to the analog logistics of the continuum. Maximum path probability or second-order differentiability of the entropy in isolation are

  13. Probability Distribution for Flowing Interval Spacing

    International Nuclear Information System (INIS)

    S. Kuzio

    2004-01-01

    Fracture spacing is a key hydrologic parameter in analyses of matrix diffusion. Although the individual fractures that transmit flow in the saturated zone (SZ) cannot be identified directly, it is possible to determine the fractured zones that transmit flow from flow meter survey observations. The fractured zones that transmit flow as identified through borehole flow meter surveys have been defined in this report as flowing intervals. The flowing interval spacing is measured between the midpoints of each flowing interval. The determination of flowing interval spacing is important because the flowing interval spacing parameter is a key hydrologic parameter in SZ transport modeling, which impacts the extent of matrix diffusion in the SZ volcanic matrix. The output of this report is input to the ''Saturated Zone Flow and Transport Model Abstraction'' (BSC 2004 [DIRS 170042]). Specifically, the analysis of data and development of a data distribution reported herein is used to develop the uncertainty distribution for the flowing interval spacing parameter for the SZ transport abstraction model. Figure 1-1 shows the relationship of this report to other model reports that also pertain to flow and transport in the SZ. Figure 1-1 also shows the flow of key information among the SZ reports. It should be noted that Figure 1-1 does not contain a complete representation of the data and parameter inputs and outputs of all SZ reports, nor does it show inputs external to this suite of SZ reports. Use of the developed flowing interval spacing probability distribution is subject to the limitations of the assumptions discussed in Sections 5 and 6 of this analysis report. The number of fractures in a flowing interval is not known. Therefore, the flowing intervals are assumed to be composed of one flowing zone in the transport simulations. This analysis may overestimate the flowing interval spacing because the number of fractures that contribute to a flowing interval cannot be

  14. Exact probability distribution function for the volatility of cumulative production

    Science.gov (United States)

    Zadourian, Rubina; Klümper, Andreas

    2018-04-01

    In this paper we study the volatility and its probability distribution function for the cumulative production based on the experience curve hypothesis. This work presents a generalization of the study of volatility in Lafond et al. (2017), which addressed the effects of normally distributed noise in the production process. Due to its wide applicability in industrial and technological activities, we present here the mathematical foundation for an arbitrary distribution function of the process, which we expect will pave the way for future research on forecasting of the production process.

  15. Total Probability of Collision as a Metric for Finite Conjunction Assessment and Collision Risk Management

    Science.gov (United States)

    Frigm, Ryan C.; Hejduk, Matthew D.; Johnson, Lauren C.; Plakalovic, Dragan

    2015-01-01

    On-orbit collision risk is becoming an increasing mission risk to all operational satellites in Earth orbit. Managing this risk can be disruptive to mission and operations, can present challenges for decision-makers, and is time-consuming for all parties involved. With the planned capability improvements in detecting and tracking smaller orbital debris and capacity improvements to routinely predict on-orbit conjunctions, this mission risk will continue to grow in terms of likelihood and effort. It is a very real possibility that the future space environment will not allow collision risk management and mission operations to be conducted in the same manner as they are today. This paper presents the concept of a finite conjunction assessment: one where each discrete conjunction is treated not separately but, rather, as part of a continuous event that must be managed concurrently. The paper also introduces the Total Probability of Collision as an analogous metric for finite conjunction assessment operations and provides several options for its usage in a Concept of Operations.
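
    A minimal sketch of the combination rule such a metric suggests, assuming the individual conjunction events are independent (our assumption, not necessarily the paper's):

        def total_probability_of_collision(pcs):
            # P(at least one collision) = 1 - product of the individual miss probabilities
            total_miss = 1.0
            for pc in pcs:
                total_miss *= (1.0 - pc)
            return 1.0 - total_miss

        # e.g. five concurrent conjunctions in the assessment window:
        print(total_probability_of_collision([1e-4, 5e-5, 2e-4, 1e-5, 3e-4]))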

  16. FACTOR STRUCTURE OF FUNCTIONAL CAPABILITIES OF BODYBUILDERS

    Directory of Open Access Journals (Sweden)

    Predrag Milenović

    2007-05-01

    Full Text Available It is evident that research in the field of kinesiology and sports science on the topic of body-building is very rare, mainly and probably because of its place in the hierarchy of sports. The lack of interest in body-building and its insufficient popularization spring probably, among other things, from its differing interpretations and, according to some people, from its ultimate goals, which many do not consider justified. Others, experts from the field of body-building, starting from the basic principles of its exercise, point out its numerous positive characteristics and sides. Undoubtedly, the characteristics of the functional capabilities of sportspeople are specific to each sport or discipline. In body-building the functional sphere is bounded and defined by the nature of the sport's activity itself, as well as by genetics and by internal and external factors in the very complex training process of a body-builder. The goal of this research was to determine the structure of the functional sphere of a body-builder. It was performed on a sample of 30 selected sportsmen, body-builders, of chronological age between 17 and 19 (±6 months), members of the Sports Club Strength ''Leskovac'', the Weight Lifters' Club ''Dubočica'' and the Body-building Club ''Dubočica'' from Leskovac. All the examinees had been engaged in training processes for a period longer than a year. For the purpose of determining the structure of the functional sphere, factor analysis was applied. Based on the data from the matrix of the factor structure, the isolated factors can be interpreted in the following manner: the first isolated factor in the sphere of applied functional variables is best defined by the variable of pulse under stress (FPUOP) and the variable of maximum oxygen consumption in litres per minute (FOLM). This isolated factor can be defined as a dimension of the oxygen transport system. The second isolated factor in the

  17. Atmospheric Release Advisory Capability (ARAC)

    International Nuclear Information System (INIS)

    Dickerson, M.H.

    1975-01-01

    The chief purpose of the ARAC data acquisition program is to provide site officials, who are responsible for ensuring maximum health protection for endangered site personnel and the public, with estimates of the effects of atmospheric releases of hazardous material as rapidly and accurately as possible. ARAC is in the initial stages of being implemented and is therefore susceptible to change before it reaches its final form. However, the concept of ARAC is fully developed and was successfully demonstrated during a feasibility study conducted in June 1974 as a joint effort between the Savannah River Laboratory (SRL) and the Lawrence Livermore Laboratory (LLL). Additional tests between SRL and LLL are scheduled for December 1975. While the immediate goal is the application of ARAC to assist a limited number of ERDA sites, the system is designed with sufficient flexibility to permit expanding the service to a large number of sites. Success in ARAC application should provide nuclear facilities with a means to better handle the urgent questions concerning the potential accidental hazards from atmospheric releases, in addition to providing the sites with a capability to assess the effects of their normal operations.

  18. Heavy Lift Launch Capability with a New Hydrocarbon Engine

    Science.gov (United States)

    Threet, Grady E., Jr.; Holt, James B.; Philips, Alan D.; Garcia, Jessica A.

    2011-01-01

    The Advanced Concepts Office at NASA's George C. Marshall Space Flight Center was tasked to define the thrust requirement of a new liquid oxygen rich staged combustion cycle hydrocarbon engine that could be utilized in a launch vehicle to meet NASA's future heavy lift needs. Launch vehicle concepts were sized using this engine for different heavy lift payload classes. Engine-out capabilities for one of the heavy lift configurations were also analyzed for the increased reliability that may be desired for high value payloads or crewed missions. The applicability of this engine in vehicle concepts to meet military and commercial class payloads comparable to current ELV capability was also evaluated.

  19. Study on conditional probability of surface rupture: effect of fault dip and width of seismogenic layer

    Science.gov (United States)

    Inoue, N.

    2017-12-01

    The conditional probability of surface rupture is affected by various factors, such as shallow material properties, the process of the earthquake, ground motions, and so on. Toda (2013) pointed out a difference in the conditional probability between strike and reverse faults by considering the fault dip and the width of the seismogenic layer. This study evaluated the conditional probability of surface rupture by the following procedure. Fault geometry was determined from a randomly generated magnitude based on the method of The Headquarters for Earthquake Research Promotion (2017). If the defined fault plane did not saturate the assumed width of the seismogenic layer, the fault plane depth was randomly assigned within the seismogenic layer. Logistic analysis was applied to two data sets: surface displacement calculated by dislocation methods (Wang et al., 2003) from the defined source fault, and the depth of the top of the defined source fault. The conditional probability estimated from surface displacement indicated a higher probability for reverse faults than for strike faults, and this result coincides with previous similar studies (i.e. Kagawa et al., 2004; Kataoka and Kusakabe, 2005). On the contrary, the probability estimated from the depth of the source fault indicated a higher probability for thrust faults than for strike and reverse faults, and this trend is similar to the conditional probabilities from PFDHA results (Youngs et al., 2003; Moss and Ross, 2011). The combined simulated results for thrust and reverse faults also show a low probability. The worldwide compiled reverse fault data include low-dip-angle earthquakes. On the other hand, in the case of Japanese reverse faults, there is a possibility that the conditional probability of reverse faults, with fewer low-dip-angle earthquakes, shows a low probability and resembles that of strike faults (i.e. Takao et al., 2013). In the future, numerical simulation by considering failure condition of surface by the source

  20. Causal inference, probability theory, and graphical insights.

    Science.gov (United States)

    Baker, Stuart G

    2013-11-10

    Causal inference from observational studies is a fundamental topic in biostatistics. The causal graph literature typically views probability theory as insufficient to express causal concepts in observational studies. In contrast, the view here is that probability theory is a desirable and sufficient basis for many topics in causal inference for the following two reasons. First, probability theory is generally more flexible than causal graphs: Besides explaining such causal graph topics as M-bias (adjusting for a collider) and bias amplification and attenuation (when adjusting for instrumental variable), probability theory is also the foundation of the paired availability design for historical controls, which does not fit into a causal graph framework. Second, probability theory is the basis for insightful graphical displays including the BK-Plot for understanding Simpson's paradox with a binary confounder, the BK2-Plot for understanding bias amplification and attenuation in the presence of an unobserved binary confounder, and the PAD-Plot for understanding the principal stratification component of the paired availability design. Published 2013. This article is a US Government work and is in the public domain in the USA.

  1. Future Textiles

    DEFF Research Database (Denmark)

    Hansen, Anne-Louise Degn; Jensen, Hanne Troels Fusvad; Hansen, Martin

    2011-01-01

    The magazine Future Textiles collects the results from the project Future Textiles, which markets the field of intelligent textiles. In the magazine you can read about trends, driving forces and challenges, and get ideas for new products within intelligent textiles. Areas such as sustainability and customization...

  2. Physics at Future Colliders

    CERN Document Server

    Ellis, John R.

    1999-01-01

    After a brief review of the Big Issues in particle physics, we discuss the contributions to resolving them that could be made by various planned and proposed future colliders. These include future runs of LEP and the Fermilab Tevatron collider, B factories, RHIC, the LHC, a linear electron-positron collider, an electron-proton collider in the LEP/LHC tunnel, a muon collider and a future larger hadron collider (FLHC). The Higgs boson and supersymmetry are used as benchmarks for assessing their capabilities. The LHC has great capacities for precision measurements as well as exploration, but also shortcomings where the complementary strengths of a linear electron-positron collider would be invaluable. It is not too soon to study seriously possible subsequent colliders.

  3. On the Possibility of Assigning Probabilities to Singular Cases, or: Probability Is Subjective Too!

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2009-06-01

    Full Text Available Both Ludwig von Mises and Richard von Mises claimed that numerical probability could not be legitimately applied to singular cases. This paper challenges this aspect of the von Mises brothers' theory of probability. It is argued that their denial that numerical probability could be applied to singular cases was based solely upon Richard von Mises' exceptionally restrictive definition of probability. This paper challenges Richard von Mises' definition of probability by arguing that the definition of probability necessarily depends upon whether the world is governed by time-invariant causal laws. It is argued that if the world is governed by time-invariant causal laws, a subjective definition of probability must be adopted. It is further argued that both the nature of human action and the relative frequency method for calculating numerical probabilities presuppose that the world is indeed governed by time-invariant causal laws. It is finally argued that the subjective definition of probability undercuts the von Mises claim that numerical probability cannot legitimately be applied to singular, non-replicable cases.

  4. Extension of TRIGA reactor capabilities

    International Nuclear Information System (INIS)

    Gietzen, A.J.

    1980-01-01

    The first TRIGA reactor went into operation at 10 kW about 22 years ago. Since that time 55 TRIGAs have been put into operation, including steady-state powers up to 14,000 kW and pulsing reactors that pulse to 20,000,000 kW. Five more are under construction and a proposal will soon be submitted for a reactor of 25,000 kW. Along with these increases in power levels (and the corresponding fluxes), the experimental facilities have also been expanded. In addition to the installation of new TRIGA reactors with enhanced capabilities, many of the older reactors have been modified and upgraded. Also, a number of reactors originally fueled with plate fuel were converted to TRIGA fuel to take advantage of the improved technical and safety characteristics, including the ability for pulsed operation. In order to accommodate increased power and performance, the fuel has undergone considerable evolution. Most of the changes have been in the geometry, enrichment and cladding material. However, more recently further development on the UZrH alloy has been carried out to extend the uranium content up to 45% by weight. This increased U content is necessary to allow the use of less than 20% enrichment in the higher powered reactors while maintaining longer core lifetime. The instrumentation and control system has undergone remarkable improvement as the electronics technology has evolved so rapidly in the last two decades. The information display and the circuitry logic have also undergone improvements for enhanced ease of operation and safety. (author)

  5. Futures Brokerages Face uncertain Future

    Institute of Scientific and Technical Information of China (English)

    WANG PEI

    2006-01-01

    2005 was a quiet year for China's futures market. After four new trading products, including cotton, fuel oil and corn, were launched on the market in 2004, the development of the market seemed to stagnate. The trade value of the futures market totaled 13.4 trillion yuan (US$1.67 trillion) in 2005, down 8.5 percent year-on-year. Although the decrease is quite small and the trade value was still the second highest in the market's history, the majority of futures brokerage firms were running in the red. In some areas, up to 80 percent of futures companies made losses.

  6. Uncertainty relation and probability. Numerical illustration

    International Nuclear Information System (INIS)

    Fujikawa, Kazuo; Umetsu, Koichiro

    2011-01-01

    The uncertainty relation and the probability interpretation of quantum mechanics are intrinsically connected, as is evidenced by the evaluation of standard deviations. It is thus natural to ask if one can associate a very small uncertainty product of suitably sampled events with a very small probability. We have shown elsewhere that some examples of the evasion of the uncertainty relation noted in the past are in fact understood in this way. We here numerically illustrate that a very small uncertainty product is realized if one performs a suitable sampling of measured data that occur with a very small probability. We introduce a notion of cyclic measurements. It is also shown that our analysis is consistent with the Landau-Pollak-type uncertainty relation. It is suggested that the present analysis may help reconcile the conflicting views about the 'standard quantum limit' in the detection of gravitational waves. (author)

  7. Scoring Rules for Subjective Probability Distributions

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    The theoretical literature has a rich characterization of scoring rules for eliciting the subjective beliefs that an individual has for continuous events, but under the restrictive assumption of risk neutrality. It is well known that risk aversion can dramatically affect the incentives to correctly report the true subjective probability of a binary event, even under Subjective Expected Utility. To address this, one can "calibrate" inferences about true subjective probabilities from elicited subjective probabilities over binary events, recognizing the incentives that risk averse agents have to distort reports. We characterize the comparable implications of the general case of a risk averse agent when facing a popular scoring rule over continuous events, and find that these concerns do not apply with anything like the same force. For empirically plausible levels of risk aversion, one can…
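
    As a concrete illustration of the distortion discussed above (a minimal numerical sketch, not the authors' calibration procedure), the following assumes a quadratic scoring rule for a binary event and CRRA utility u(x) = x^(1-gamma)/(1-gamma): a risk-neutral agent (gamma = 0) reports truthfully, while a risk-averse agent shades the report toward 1/2.

```python
import numpy as np

def quadratic_score(report, outcome):
    """Quadratic (Brier-type) scoring rule for a binary event; payoffs in [0, 1]."""
    return 1.0 - (outcome - report) ** 2

def optimal_report(p, gamma, grid=np.linspace(0.001, 0.999, 999)):
    """Report maximizing expected CRRA utility when the agent's true
    subjective probability of the event is p (grid search)."""
    u = lambda x: x ** (1.0 - gamma) / (1.0 - gamma) if gamma != 1.0 else np.log(x)
    eu = p * u(quadratic_score(grid, 1)) + (1 - p) * u(quadratic_score(grid, 0))
    return grid[np.argmax(eu)]

p = 0.8
print(optimal_report(p, gamma=0.0))  # risk neutral: reports ~0.8 (truthful)
print(optimal_report(p, gamma=0.5))  # risk averse: report shaded toward 0.5
```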

  8. Comparing coefficients of nested nonlinear probability models

    DEFF Research Database (Denmark)

    Kohler, Ulrich; Karlson, Kristian Bernt; Holm, Anders

    2011-01-01

    In a series of recent articles, Karlson, Holm and Breen have developed a method for comparing the estimated coefficients of two nested nonlinear probability models. This article describes this method and the user-written program khb that implements it. The KHB method is a general decomposition method that is unaffected by the rescaling or attenuation bias that arises in cross-model comparisons in nonlinear models. It recovers the degree to which a control variable, Z, mediates or explains the relationship between X and a latent outcome variable, Y*, underlying the nonlinear probability model.
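
    A minimal sketch of the KHB idea on simulated data, assuming the statsmodels library; the data-generating process and variable names are invented for illustration. The reduced model replaces Z with its residual from a regression on X, so the coefficient on X is measured on the same error scale in both models, and the total/direct/indirect decomposition follows by differencing.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 20_000
x = rng.normal(size=n)
z = 0.5 * x + rng.normal(size=n)        # mediator correlated with x
ystar = x + z + rng.logistic(size=n)    # latent outcome
y = (ystar > 0).astype(int)

full = sm.Logit(y, sm.add_constant(np.column_stack([x, z]))).fit(disp=0)

# KHB reduced model: replace z by its residual from a regression on x, so
# the reduced and full logits share the same error scale and the x
# coefficients are comparable across the two models.
z_resid = z - sm.OLS(z, sm.add_constant(x)).fit().fittedvalues
reduced = sm.Logit(y, sm.add_constant(np.column_stack([x, z_resid]))).fit(disp=0)

total = reduced.params[1]   # total effect of x (comparable scale), ~1.5 here
direct = full.params[1]     # direct effect of x controlling for z, ~1.0 here
print(f"total={total:.3f} direct={direct:.3f} indirect={total - direct:.3f}")
```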

  9. A basic course in probability theory

    CERN Document Server

    Bhattacharya, Rabi

    2016-01-01

    This text develops the necessary background in probability theory underlying diverse treatments of stochastic processes and their wide-ranging applications. In this second edition, the text has been reorganized for didactic purposes, new exercises have been added and basic theory has been expanded. General Markov dependent sequences and their convergence to equilibrium is the subject of an entirely new chapter. The introduction of conditional expectation and conditional probability very early in the text maintains the pedagogic innovation of the first edition; conditional expectation is illustrated in detail in the context of an expanded treatment of martingales, the Markov property, and the strong Markov property. Weak convergence of probabilities on metric spaces and Brownian motion are two topics to highlight. A selection of large deviation and/or concentration inequalities ranging from those of Chebyshev, Cramér–Chernoff, Bahadur–Rao, to Hoeffding have been added, with illustrative comparisons of their…
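
    For a flavour of such comparisons (an illustration, not an excerpt from the book): for the mean of n i.i.d. variables bounded in [0, 1], Chebyshev's inequality gives P(|X̄ − μ| ≥ t) ≤ σ²/(nt²), while Hoeffding's gives 2·exp(−2nt²), and the gap widens rapidly with n.

```python
import numpy as np

# Tail bounds on the deviation of the sample mean of n i.i.d. variables
# in [0, 1] (so variance at most 1/4): Chebyshev vs Hoeffding.
def chebyshev(n, t, var=0.25):
    return min(1.0, var / (n * t**2))

def hoeffding(n, t):
    return min(1.0, 2.0 * np.exp(-2.0 * n * t**2))

for n in (10, 100, 1000):
    print(n, chebyshev(n, 0.1), hoeffding(n, 0.1))
```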

  10. Ignition probabilities for Compact Ignition Tokamak designs

    International Nuclear Information System (INIS)

    Stotler, D.P.; Goldston, R.J.

    1989-09-01

    A global power balance code employing Monte Carlo techniques has been developed to study the ''probability of ignition'' and has been applied to several different configurations of the Compact Ignition Tokamak (CIT). Probability distributions for the critical physics parameters in the code were estimated using existing experimental data, including a statistical evaluation of the uncertainty in extrapolating the energy confinement time. A substantial probability of ignition is predicted for CIT if peaked density profiles can be achieved or if one of the two higher plasma current configurations is employed. In other cases, values of the energy multiplication factor Q of order 10 are generally obtained. The Ignitor-U and ARIES designs are also examined briefly. Comparisons of our empirically based confinement assumptions with two theory-based transport models yield conflicting results. 41 refs., 11 figs.
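
    The following toy Monte Carlo conveys the flavour of the approach described above; the parameter distributions and the D-T triple-product threshold of roughly 3×10²¹ keV·s·m⁻³ are illustrative assumptions, not CIT design values or the authors' code.

```python
import numpy as np

# Propagate assumed uncertainties in plasma parameters through a simple
# 0-D ignition criterion and count the fraction of samples that "ignite".
rng = np.random.default_rng(1)
N = 100_000
n_e   = rng.normal(5e20, 5e19, N)            # electron density [m^-3]
T     = rng.normal(10.0, 1.5, N)             # temperature [keV]
tau_E = rng.lognormal(np.log(0.6), 0.3, N)   # energy confinement time [s]

triple = n_e * T * tau_E                     # Lawson triple product
p_ignition = np.mean(triple >= 3e21)         # illustrative D-T threshold
print(f"estimated ignition probability: {p_ignition:.3f}")
```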

  11. Independent events in elementary probability theory

    Science.gov (United States)

    Csenki, Attila

    2011-07-01

    In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): if the n events E_1, E_2, …, E_n are jointly independent, then any two events A and B built in finitely many steps from two disjoint subsets of E_1, E_2, …, E_n are also independent; the operations 'union', 'intersection' and 'complementation' are permitted only when forming the events A and B. Here we examine this statement from the point of view of elementary probability theory. The approach described here is accessible also to users of probability theory and is believed to be novel.
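
    The statement can be checked exactly on small examples. The sketch below (illustrative; the probabilities are chosen arbitrarily) builds three jointly independent coordinate events on a product space, forms A = E_1 ∪ E_2 and B = complement of E_3 from disjoint subsets, and verifies the product rule with exact rational arithmetic.

```python
from fractions import Fraction
from itertools import product

# Three jointly independent coordinate events E1, E2, E3 on {0,1}^3.
p = [Fraction(1, 2), Fraction(1, 3), Fraction(1, 5)]

def prob(pred):
    """Exact probability of the event {w : pred(w)} under the product measure."""
    total = Fraction(0)
    for w in product([0, 1], repeat=3):
        pw = Fraction(1)
        for i in range(3):
            pw *= p[i] if w[i] else 1 - p[i]
        if pred(w):
            total += pw
    return total

A = lambda w: w[0] == 1 or w[1] == 1    # built from {E1, E2}
B = lambda w: w[2] == 0                 # built from the disjoint subset {E3}
assert prob(lambda w: A(w) and B(w)) == prob(A) * prob(B)
print("P(A)P(B) =", prob(A) * prob(B), "-- A and B are independent")
```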

  12. Pointwise probability reinforcements for robust statistical inference.

    Science.gov (United States)

    Frénay, Benoît; Verleysen, Michel

    2014-02-01

    Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be, with respect to their theoretical probability; they include, e.g., outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR, and a regularisation allows controlling the amount of reinforcement which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method that can be formulated as a likelihood maximisation. Experiments show that PPRs can easily be used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually, since an abnormality degree is obtained for each observation.
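
    A minimal sketch of the reinforcement idea for one special case (a Gaussian density per observation in a no-intercept linear regression, with an L1-type penalty lam on the reinforcements); the paper's formulation is more general, and this is not the authors' code. Maximizing sum_i log(p_i + r_i) - lam * sum_i r_i with r_i >= 0 gives, for a fixed model, r_i = max(0, 1/lam - p_i), and the refit then downweights each point by p_i / (p_i + r_i), so reinforced points (outliers) lose influence.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + 0.1 * rng.normal(size=50)
y[::10] += 3.0                                  # inject a few outliers

sigma, lam = 0.1, 5.0
w = np.sum(x * y) / np.sum(x * x)               # ordinary LS start (biased)
for _ in range(20):
    resid = y - w * x
    p = np.exp(-resid**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)
    r = np.maximum(0.0, 1.0 / lam - p)          # reinforcement = abnormality degree
    weight = p / (p + r)                        # in (0, 1]; ~0 for reinforced points
    w = np.sum(weight * x * y) / np.sum(weight * x * x)   # weighted LS refit
print("slope:", w)   # close to 2: the outliers end up with weight ~0
```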

  13. Uncertainty the soul of modeling, probability & statistics

    CERN Document Server

    Briggs, William

    2016-01-01

    This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, such…

  14. Introduction to probability with statistical applications

    CERN Document Server

    Schay, Géza

    2016-01-01

    Now in its second edition, this textbook serves as an introduction to probability and statistics for non-mathematics majors who do not need the exhaustive detail and mathematical depth provided in more comprehensive treatments of the subject. The presentation covers the mathematical laws of random phenomena, including discrete and continuous random variables, expectation and variance, and common probability distributions such as the binomial, Poisson, and normal distributions. More classical examples such as Montmort's problem, the ballot problem, and Bertrand's paradox are now included, along with applications such as the Maxwell-Boltzmann and Bose-Einstein distributions in physics. Key features in the new edition: * 35 new exercises * Expanded section on the algebra of sets * Expanded chapters on probabilities to include more classical examples * New section on regression * Online instructors' manual containing solutions to all exercises
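
    Montmort's matching problem, mentioned above, is easy to explore numerically; a quick simulation (illustrative, not taken from the book) shows the probability of at least one match approaching 1 − 1/e ≈ 0.632.

```python
import random

def at_least_one_match(n, trials=100_000, seed=0):
    """Monte Carlo estimate of the probability that a random permutation
    of n letters puts at least one letter in its own envelope."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        perm = list(range(n))
        rng.shuffle(perm)
        hits += any(perm[i] == i for i in range(n))
    return hits / trials

print(at_least_one_match(10))   # ~0.632, i.e. close to 1 - 1/e
```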

  15. Python for probability, statistics, and machine learning

    CERN Document Server

    Unpingco, José

    2016-01-01

    This book covers the key ideas that link probability, statistics, and machine learning, illustrated using Python modules in these areas. The entire text, including all the figures and numerical results, is reproducible using the Python codes and their associated Jupyter/IPython notebooks, which are provided as supplementary downloads. The author develops key intuitions in machine learning by working meaningful examples using multiple analytical methods and Python codes, thereby connecting theoretical concepts to concrete implementations. Modern Python modules like Pandas, Sympy, and Scikit-learn are applied to simulate and visualize important machine learning concepts like the bias/variance trade-off, cross-validation, and regularization. Many abstract mathematical ideas, such as convergence in probability theory, are developed and illustrated with numerical examples. This book is suitable for anyone with an undergraduate-level exposure to probability, statistics, or machine learning and with rudimentary knowledge…
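
    In the spirit of the book's examples (though not taken from it), a few lines of scikit-learn illustrate how cross-validation navigates the bias/variance trade-off by selecting a regularization strength.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(60, 1))
y = np.sin(3 * X[:, 0]) + 0.2 * rng.normal(size=60)

# Degree-10 polynomial ridge regression: too little regularization
# overfits (high variance), too much underfits (high bias).
for alpha in (1e-6, 1e-2, 1.0, 100.0):
    model = make_pipeline(PolynomialFeatures(10), Ridge(alpha=alpha))
    score = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(f"alpha={alpha:g}  mean CV R^2 = {score:.3f}")
```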

  16. EARLY HISTORY OF GEOMETRIC PROBABILITY AND STEREOLOGY

    Directory of Open Access Journals (Sweden)

    Magdalena Hykšová

    2012-03-01

    The paper provides an account of the history of geometric probability and stereology from the time of Newton to the early 20th century. It depicts development along two parallel paths: on one hand, the theory of geometric probability was formed with little attention paid to applications other than spatial games of chance; on the other hand, practical rules for estimating area or volume fractions and other characteristics, easily deducible from geometric probability theory, were proposed without knowledge of this branch. Special attention is paid to the paper of J.-É. Barbier published in 1860, which contained the fundamental stereological formulas but remained almost unnoticed both by mathematicians and practitioners.
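
    The classic entry point to geometric probability is Buffon's needle, a problem to which Barbier also contributed; a short simulation (an illustration, not from the paper) recovers π from the crossing probability 2l/(πd).

```python
import numpy as np

# Buffon's needle: a needle of length l dropped on parallel lines spaced
# d >= l apart crosses a line with probability 2*l/(pi*d), so the observed
# crossing frequency yields an estimate of pi.
rng = np.random.default_rng(0)
l, d, N = 1.0, 1.0, 1_000_000
y = rng.uniform(0, d / 2, N)            # distance of needle centre to nearest line
theta = rng.uniform(0, np.pi / 2, N)    # acute angle between needle and lines
crossing_freq = np.mean(y <= (l / 2) * np.sin(theta))
print("pi estimate:", 2 * l / (d * crossing_freq))
```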

  17. Probability analysis of nuclear power plant hazards

    International Nuclear Information System (INIS)

    Kovacs, Z.

    1985-01-01

    Probabilistic risk analysis, used for quantifying the risk of complex technological systems, especially nuclear power plants, is described. Risk is defined as the product of the probability of the occurrence of a dangerous event and the significance of its consequences. The analysis may be divided into two stages: analysis of the power plant up to the point of release of harmful material into the environment (reliability analysis), and analysis of the consequences of this release together with assessment of the risk. The sequence of operations in each stage is characterized. The tasks facing Czechoslovakia in developing probabilistic risk analysis are listed, and the composition of a work team to cope with them is recommended. (J.C.)
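
    The definition translates directly into a computation: for each postulated event sequence, multiply its probability by a measure of its consequences and sum. The sequences and numbers below are invented purely for illustration.

```python
# Risk as probability times consequence, summed over postulated sequences.
# (name, annual probability, consequence measure) -- all values invented.
sequences = [
    ("large LOCA, containment fails", 1e-7, 1.0e6),
    ("large LOCA, containment holds", 1e-5, 1.0e3),
    ("transient without scram",       1e-6, 1.0e5),
]
for name, p, c in sequences:
    print(f"{name:32s} risk = {p * c:.2e}")
total_risk = sum(p * c for _, p, c in sequences)
print(f"{'total':32s} risk = {total_risk:.2e}")
```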

  18. Correlations and Non-Linear Probability Models

    DEFF Research Database (Denmark)

    Breen, Richard; Holm, Anders; Karlson, Kristian Bernt

    2014-01-01

    Although the parameters of logit and probit and other non-linear probability models are often explained and interpreted in relation to the regression coefficients of an underlying linear latent variable model, we argue that they may also be usefully interpreted in terms of the correlations between the dependent variable of the latent variable model and its predictor variables. We show how this correlation can be derived from the parameters of non-linear probability models, develop tests for the statistical significance of the derived correlation, and illustrate its usefulness in two applications. Under certain circumstances, which we explain, the derived correlation provides a way of overcoming the problems inherent in cross-sample comparisons of the parameters of non-linear probability models.
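
    For a one-predictor probit, the derived correlation takes a simple closed form; the sketch below (an illustration under the standard latent-variable assumptions, not the authors' code) recovers it from simulated data using statsmodels. With y* = b·x + e and e ~ N(0, 1), corr(y*, x) = b·sd(x)/sqrt(b²·var(x) + 1), a quantity invariant to the rescaling that makes raw probit coefficients incomparable across models or samples.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
x = rng.normal(size=50_000)
ystar = 0.8 * x + rng.normal(size=50_000)   # latent outcome
y = (ystar > 0).astype(int)                 # observed binary outcome

fit = sm.Probit(y, sm.add_constant(x)).fit(disp=0)
b = fit.params[1]
rho = b * x.std() / np.sqrt(b**2 * x.var() + 1.0)
print(f"b={b:.3f}  implied corr(y*, x)={rho:.3f}")  # ~0.8/sqrt(1.64) = 0.625
```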

  19. Geometric modeling in probability and statistics

    CERN Document Server

    Calin, Ovidiu

    2014-01-01

    This book covers topics of Information Geometry, a field which deals with the differential geometric study of manifolds of probability density functions. This is a field that is increasingly attracting the interest of researchers from many different areas of science, including mathematics, statistics, geometry, computer science, signal processing, physics and neuroscience. It is the authors' hope that the present book will be a valuable reference for researchers and graduate students in one of the aforementioned fields. This textbook is a unified presentation of differential geometry and probability theory, and constitutes a text for a course directed at graduate or advanced undergraduate students interested in applications of differential geometry in probability and statistics. The book contains over 100 proposed exercises meant to help students deepen their understanding, and it is accompanied by software that is able to provide numerical computations of several information geometric objects. The reader...
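
    The first example of such a computation is the Fisher information metric of the normal family N(μ, σ²), the standard opening example of a statistical manifold; the snippet below evaluates it symbolically with sympy (an illustration, not the book's accompanying software).

```python
import sympy as sp

x, mu = sp.symbols("x mu", real=True)
sigma = sp.symbols("sigma", positive=True)
f = sp.exp(-(x - mu) ** 2 / (2 * sigma**2)) / (sigma * sp.sqrt(2 * sp.pi))
logf = sp.log(f)

def g(a, b):
    """Fisher metric component E[d_a log f * d_b log f]."""
    integrand = sp.diff(logf, a) * sp.diff(logf, b) * f
    return sp.simplify(sp.integrate(integrand, (x, -sp.oo, sp.oo)))

# Expected: g_mumu = 1/sigma**2, g_sigsig = 2/sigma**2, off-diagonal 0.
print(g(mu, mu), g(sigma, sigma), g(mu, sigma))
```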

  20. Fixation probability on clique-based graphs

    Science.gov (United States)

    Choi, Jeong-Ok; Yu, Unjong

    2018-02-01

    The fixation probability of a mutant in the evolutionary dynamics of the Moran process is calculated by the Monte Carlo method on a few families of clique-based graphs. It is shown that complete suppression of fixation can be realized with the generalized clique-wheel graph in the limit of small wheel-clique ratio and infinite size. The family of clique-stars is an amplifier, and the clique-arms graph changes from amplifier to suppressor as the fitness of the mutant increases. We demonstrate that the overall structure of a graph can be more important in determining the fixation probability than the degree or the heat heterogeneity. The dependence of the fixation probability on the position of the first mutant is discussed.
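
    On a single clique (a complete graph) the Moran fixation probability has the closed form (1 − 1/r)/(1 − 1/r^N), which makes it a convenient check for the kind of Monte Carlo estimator the paper uses on more elaborate clique-based graphs; the sketch below is illustrative, not the authors' code.

```python
import random

def fixation_probability(N=20, r=1.5, runs=20_000, seed=0):
    """Monte Carlo fixation probability of one mutant with relative fitness r
    in the Moran process on a complete graph of N individuals."""
    rng = random.Random(seed)
    fixed = 0
    for _ in range(runs):
        k = 1                                   # current number of mutants
        while 0 < k < N:
            # birth: pick the reproducer proportionally to fitness
            mutant_born = rng.random() < r * k / (r * k + (N - k))
            # death: the offspring replaces a uniformly random *other* node
            if mutant_born:
                if rng.random() >= (k - 1) / (N - 1):   # victim is a resident
                    k += 1
            else:
                if rng.random() < k / (N - 1):          # victim is a mutant
                    k -= 1
        fixed += (k == N)
    return fixed / runs

N, r = 20, 1.5
exact = (1 - 1 / r) / (1 - 1 / r**N)
print(fixation_probability(N, r), "exact:", round(exact, 4))
```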